A young girl looking at a smartphone in her bedroom.
Two bills that stalled this session would have placed age-verification requirements on social media apps.
There was some appetite among Wisconsin lawmakers on both sides of the aisle this year to join the growing number of states addressing minors’ use of Instagram, Snapchat and Facebook. Today’s teens can’t seem to get enough of these social media platforms, and there is growing concern that heavy use affects developing brains and may increase depression, social isolation, and feelings of negative self-worth.
But none of the proposals passed both chambers before the state Assembly and Senate adjourned in February and March, respectively. Still, jury verdicts in March in New Mexico and California finding Meta, and in the latter state Google, liable for harming children’s mental health could invigorate more states to seek regulation. The verdicts set up a potential future legal fight over Section 230, a federal law holding that online platforms are not liable for the content their users post.
They could also push more states to enact age-verification requirements for social media platforms.
“I think the verdicts provide political tailwind for [state-level] age-verification requirements,” Nikolas Guggenberger, a professor of law at the University of Houston Law Center who studies antitrust and technology law, tells Isthmus in an email.
Unlike such states as California and Florida, Wisconsin has no statutes regulating children’s use of social media. That has prompted some pledges in the gubernatorial race — Democratic candidate and Lt. Gov. Sara Rodriguez in February proposed banning “addictive features” and allowing parents to sue for “damaging content.”
Other candidates for Wisconsin governor are also weighing in on the issue. Republican gubernatorial frontrunner and U.S. Rep. Tom Tiffany, asked whether he supported regulatory legislation from Rep. Joy Goeben, R-Hobart, told Isthmus in a statement that “we need to keep kids safe, hold Big Tech accountable, and empower parents to stay in control of their children’s online activity.”
“As governor, I will support commonsense age verification requirements for app providers and parental consent protections,” Tiffany says.
Goeben is the author of two bills that tackle age verification. One would require app stores, such as the Apple App Store or Google Play Store, to implement age-verification methods and, if a user is determined to be a minor, require parental approval before the user can download apps.
The other would implement similar verification requirements for individual social media apps. This bill would forbid “profile-based, paid commercial marketing” to minors on social media platforms, as well as so-called “addictive features,” such as auto-scrolling and public like or share counts. It would also require that minors’ accounts use the highest-possible privacy settings, unless a parent consents to a lesser level of privacy.
The two bills passed the Assembly with bipartisan support on Feb. 19 but did not get a vote in the Senate.
Another gubernatorial candidate, Democratic Sen. Kelda Roys of Madison, authored a separate bill this past session that would put the onus on social media platforms to use an age verification method identified and approved by the Wisconsin Department of Justice, such as requiring the uploading of a photo ID.
“We've basically conducted a 15- to 20-year experiment on a generation of young people that did not give permission to be guinea pigs,” Roys says in an interview. “The state has a compelling governmental interest in regulating [this], because there are harms.”
Roys’ bill would also prevent social media platforms from gathering or selling data on how minors use such platforms, and prevent the platforms from conducting targeted advertising. In addition, it would only allow content recommendations to come from specific searches on the platform itself, not from data gathered across their internet use.
Attempts to regulate minors’ use of social media apps have been met with legal challenges from companies and trade groups. In separate cases in 2025, federal judges struck down age-verification laws in Louisiana and Texas, ruling that such legislation unlawfully impairs users’ First Amendment rights. Trade associations representing technology and social media companies brought both suits.
“[The law] restricts access to a vast universe of speech by requiring Texans to prove their age before downloading a mobile app or accessing paid content within those apps and requires minors to obtain parental consent,” U.S. District Court Judge Robert Pitman wrote in a Dec. 23 order.
Roys expects that her bill, if enacted, would be challenged in court: “I would have loved to ban advertising altogether on social media sites for minors, but it was very, very clear from looking at the court cases that that was not going to fly, right? If we pass a regulation bill, I don't want it to be struck down.”
Trade groups such as NetChoice, which represents such companies as Nextdoor, Snapchat and Meta, the parent company of Facebook and Instagram, lobby hard against legislation regulating social media platforms.
“This [bill] creates strong incentives for platforms to collect sensitive identifying information about all Wisconsin users — including minors,” testified Amy Bos, vice president of government affairs and commerce for NetChoice, during testimony on Goeben’s bill.
Bos added that the bill posed “at least two independent First Amendment violations”: forcing minors to receive parental consent prior to using a platform and preventing platforms from disseminating content to users without first verifying their age.
The ACLU of Wisconsin also opposed Goeben’s proposals, writing that they raise “concerns about free expression, privacy, and the constitutional rights of both minors and adults” and would compel social media companies to collect more data on users to determine whether they are adults.
The organization also argued that the legislation would impede minors’ free expression and political engagement. The ACLU of Wisconsin is also registered in opposition to Roys’ bill, according to the Wisconsin Ethics Commission.
Some state leaders have urged that federal lawmakers establish a baseline expectation for children’s online safety. In a Feb. 10 open letter, 40 state attorneys general called for the passage of the U.S. Senate’s Kids Online Safety Act, which would require that companies take “reasonable” steps to protect children from such harms as depression and eating disorders. It would also require minors’ accounts to have the highest-possible privacy settings and allow users to opt out of algorithmic content recommendations.
The bill also permits states to enact their own child safety legislation, unlike a separate version currently in the U.S. House.
“Increasing evidence demonstrates that these companies are aware of the adverse mental health consequences imposed on underage users, yet they have chosen to persist in these practices,” the open letter reads. “Accordingly, many of our offices have initiated investigations and filed lawsuits against Meta and TikTok for their role in harming minors.”
Asked why Wisconsin Attorney General Josh Kaul did not sign on to the letter, Wisconsin Department of Justice spokesperson Riley Vetterkind says the office “doesn't comment on potential internal legal analysis.”
Vetterkind noted that Kaul joined a bipartisan group of 42 attorneys general in suing Meta and in filing an amicus brief supporting California’s Protecting Our Kids from Social Media Addiction Act.
