Will lawmakers actually act to protect children online? Some say yes.
In the final minutes of a congressional hearing on Wednesday that rebuked tech CEOs for not protecting children online, Senator Richard J. Durbin urged lawmakers to act to protect the youngest users of the Internet.
“No excuses,” he said.
Lawmakers have long made similar statements about holding tech companies accountable — and they have little to show for it. Republicans and Democrats alike have declared at various points that the time has come to regulate tech giants on matters like privacy and antitrust. Yet for years, it ended there: There were no new federal regulations for companies to follow.
The question is whether something will be different this time. And already, there are indicators that the topic of online child safety may gain greater traction legislatively.
At least six legislative proposals pending in Congress target the spread of child sexual abuse material online and would require platforms like Instagram, Snapchat and TikTok to do more to protect minors. These efforts are supported by the emotional stories of children who were victimized online and died by suicide.
The only major federal Internet law passed in recent years, SESTA-FOSTA (the Stop Enabling Sex Traffickers Act and the Allow States and Victims to Fight Online Sex Trafficking Act), which made it easier for victims of sex trafficking to sue websites and online platforms, was approved in 2018, and only after wrenching testimony from one victim's mother.
Online safety experts and lawmakers said child safety is a deeply personal, emotionally charged topic that is an easier political sell than some other matters. At Wednesday's hearing, confronted with stories of children who died after being sexually abused, Meta's chief executive, Mark Zuckerberg, said he was sorry for what the families had suffered.
“Just as with the tobacco industry, it took a series of embarrassing hearings, but Congress finally took action,” said Jim Steyer, president of Common Sense Media, a nonprofit child advocacy group. “The dam finally broke.”
Any legislative progress on online child safety would be a counterpoint to the impasse that has beset Congress over other technology issues in recent years. Time and again, proposals for rules to regulate tech giants like Google and Meta have failed to become law.
For example, in 2018, Congress questioned Mr. Zuckerberg about the leak of Facebook user data to Cambridge Analytica, a voter profiling company. Outcry over the incident led to calls for Congress to pass new rules to protect people's online privacy. But while California and other states eventually approved online privacy laws, Congress did not.
Lawmakers have also attacked Section 230 of the Communications Decency Act, a statute that shields online platforms like Instagram and TikTok from many lawsuits over content posted by their users. But Congress has made no substantive changes to the law apart from the 2018 measure that made it harder for platforms to invoke that shield when accused of knowingly facilitating sex trafficking.
And after companies like Amazon and Apple were accused of being monopolies and abusing their power over smaller rivals, lawmakers proposed a bill to make some of their business practices illegal. A push to get the bill across the finish line failed in 2022.
Senators Amy Klobuchar, Democrat of Minnesota, and Josh Hawley, Republican of Missouri, along with other lawmakers, have blamed the power of tech lobbyists for killing the proposed rules. Others have said tech regulation has not been a priority for congressional leaders, who have focused on measures to subsidize U.S. companies that make critical computer chips and use renewable energy.
The Senate Judiciary Committee, which held Wednesday's hearing, highlighted five child protection bills aimed at tech platforms ahead of the session. The committee passed the bills last year, but none have become law.
The proposals included the STOP CSAM Act (the Strengthening Transparency and Obligations to Protect Children Suffering from Abuse and Mistreatment Act), which would give victims new avenues to report child sexual abuse material to Internet companies, and the REPORT Act (the Revising Existing Procedures On Reporting via Technology Act), which would expand the types of potential crimes online platforms are required to report to the National Center for Missing and Exploited Children.
Other proposals would make it a crime to distribute an intimate image of someone without that person's consent and would prompt law enforcement to coordinate investigations of crimes against children.
A separate proposal passed last year by the Senate Commerce Committee, the Kids Online Safety Act, would create a legal duty for certain online platforms to protect children. Some of the legislative proposals have been criticized by digital rights groups such as the Electronic Frontier Foundation, which say they could encourage platforms to remove legitimate content as companies attempt to comply with the laws.
Ms. Klobuchar, who questioned the tech executives at Wednesday's hearing, said in an interview that the session “felt like a success.” She added, “As someone who has taken on these companies for years, this is the first time I feel hope for movement.”
Others were skeptical. Any proposal will need the support of congressional leaders to pass. Bills passed by the committee last year would need to be reintroduced and go through that process again.
Hany Farid, a professor at the University of California, Berkeley, who helped create the technology used by platforms to detect child sexual abuse material, said he had seen Congress hold hearings on protecting children online before.
“This is one thing we must agree on: that we have a responsibility to protect children,” he said. “If we can't get it right, what hope do we have for anything else?”