Silicon Valley battles states over new online safety laws for children
Last summer, Ohio enacted a social media law that would require Instagram, Snapchat, TikTok and YouTube to seek parental consent before allowing children under 16 to use their platforms.
But this month, just before the measure took effect, a tech industry group called NetChoice — which represents Google, Meta, Snap, TikTok and others — filed a lawsuit to block it on free speech grounds, and persuaded a federal district court judge to temporarily pause the new rules.
The case is part of a broader litigation campaign by NetChoice to block new state laws protecting youth online — an anti-regulation effort likely to come under scrutiny on Wednesday as the Senate Judiciary Committee questions social media executives about online child sexual exploitation. The NetChoice lawsuits have irked state officials and lawmakers who had sought input from the tech companies while drafting the new measures.
“I think it's cowardly and fraudulent,” Ohio Lieutenant Governor Jon Husted said of the industry lawsuit, noting that he or his staff had met with Google and Meta about the bill last year and had accommodated the companies' concerns. “We tried to be as cooperative as possible — and then at the 11th hour, they filed a lawsuit.”
The social media platforms said some state laws conflict with one another, and that they would like Congress to enact a federal law setting national standards for protecting children online.
NetChoice said the new state laws violate its members' First Amendment rights to freely distribute information, as well as the rights of minors to receive information.
“There's a reason it's such a slam dunk win every time for NetChoice,” said Carl Szabo, the group's vice president. “And that's because it's clearly unconstitutional.”
Amid growing public concern over the mental health of young people, lawmakers and regulators across the United States are mounting a bipartisan effort to rein in popular social media platforms with a wave of new laws, even as tech industry groups work to overturn them.
The first law of its kind, passed last spring in Utah, will require social media companies to verify the age of users and obtain parental consent before allowing minors to set up accounts. Arkansas, Ohio, Louisiana, and Texas later passed similar laws requiring parental consent for social media services.
A landmark new law in California, the Age-Appropriate Design Code Act, will require many popular social media and multiplayer video game apps to turn on the highest privacy settings — and turn off potentially risky features, like messaging systems that allow adult strangers to contact young people — by default for minors.
“The intent is to ensure that any technology product accessed by someone under 18 is safe for children by design and by default,” said California Assemblymember Buffy Wicks, who sponsored the bill.
But the free speech lawsuits by NetChoice have dealt a major blow to these state efforts.
Last year in California and Arkansas, judges in NetChoice cases temporarily blocked the new state laws from taking effect. (The New York Times and the Student Press Law Center filed a joint friend-of-the-court brief in support of NetChoice in the California case last year, arguing that the law could limit the newsworthy content available to students.)
“There's been a lot of pressure put on states to regulate social media, to protect against its harms, and a lot of those concerns are now being incorporated into laws specifically about children,” said Genevieve Lakier, a professor at the University of Chicago Law School. “What you're seeing here is that the First Amendment is still a concern; in many cases these laws have been put on hold.”
State lawmakers and officials said they viewed the tech industry's pushback as a temporary setback, and called their new laws reasonable measures to ensure basic protections for children online. California Attorney General Rob Bonta said the state's new law would regulate platform design and company conduct — not content. The California law, which is set to take effect in July, does not explicitly require social media companies to verify the age of every user.
Mr. Bonta recently appealed the decision blocking the law.
“NetChoice has a strategy of ‘burn everything,' and they're going to challenge every law and regulation to protect children and their privacy in the name of the First Amendment,” he said in a phone interview on Sunday.
On Monday, California lawmakers introduced two children's online privacy and safety bills sponsored by Mr. Bonta.
NetChoice has also filed a lawsuit to try to block a new social media bill in Utah that would require Instagram and TikTok to verify the age of users and get parental permission for minors to have accounts.
Civil rights groups have warned that such legislative efforts could hamper freedom of expression by requiring adults as well as minors to verify their ages with documents such as driver's licenses before establishing and using social media accounts. They say requiring parental consent for social media may also hinder young people from finding support groups or important resources about reproductive health or gender identity.
The Supreme Court has overturned several laws that were intended to protect minors from potentially harmful material, including violent video games and “indecent” online material, on freedom of speech grounds.
Social media companies said they have put in place many protections for young people, and that they would prefer Congress enact federal laws rather than require companies to comply with sometimes contradictory state laws.
Snap recently became the first social media company to support a federal bill called the Kids Online Safety Act, which has some similarities with California's new law.
Snap said in a statement that many of the federal bill's provisions mirrored the company's existing safety measures, such as setting teenagers' accounts to the strictest privacy settings by default. The statement said the bill would also direct government agencies to study technological approaches to age verification.
Meta has called on Congress to pass legislation that would make Apple's and Google's app stores — not social media companies — responsible for verifying a user's age and seeking a parent's permission before allowing anyone under 16 to download an app. Meta recently began running ads on Instagram saying it supports such a federal law.
“We support clear, consistent legislation that makes it simple for parents to manage their teens' online experiences, and that holds all apps used by teens to the same standard,” Meta said in a statement. “We look forward to continuing to work with policymakers to help find more practical solutions.”
But simply requiring parental consent would do little to reduce the potentially harmful effects of social media platforms, the federal judge in the NetChoice case in Ohio has said.
“Preventing minors under the age of 16 from accessing all content on social media websites is a breathtakingly blunt instrument to minimize the harms of social media to children,” Chief Judge Algenon L. Marbley of the U.S. District Court for the Southern District of Ohio wrote in his decision temporarily halting the state's social media law.