US regulators propose new online privacy safeguards for children

The Federal Trade Commission on Wednesday proposed major changes to strengthen a key federal rule protecting children's privacy online, one of the U.S. government's most significant efforts to bolster consumer privacy in more than a decade.

The changes are intended to strengthen the rules underlying the Children's Online Privacy Protection Act of 1998, a law that prohibits online tracking of youth by services such as social media apps, video game platforms, toy retailers and digital advertising networks. Regulators said the move would shift the “burden” of online safety from parents to apps and other digital services, while curbing how platforms can use and monetize children's data.

The proposed changes would require some online services to turn off advertising targeted at children under 13 by default. They would also prevent online services from using personal details, such as a child's cellphone number, to induce young people to stay on their platforms longer. That means online services would no longer be able to use personal data to send push notifications to young children.

The proposed update would also strengthen security requirements for online services that collect children's data and limit how long those services may keep that information. And it would restrict the collection of student data by learning apps and other educational-tech providers by allowing schools to consent to the collection of children's personal details only for educational purposes, not commercial ones.

“Children should be able to play and learn online without being constantly tracked by companies looking to collect and make money from their personal data,” Federal Trade Commission Chairwoman Lina M. Khan said in a statement Wednesday. She added, “By requiring firms to better protect children's data, our proposal places positive obligations on service providers and prevents them from outsourcing their responsibilities to parents.”

COPPA is the central federal law protecting children online in the United States, although members of Congress have since attempted to introduce more comprehensive online safety bills for children and teens.

Under COPPA, online services that are aimed at children, or that know children use their platforms, are required to obtain parental consent before collecting, using or sharing personal details – such as first and last name, address and phone number – from a child under 13.

To comply with the law, popular apps like Instagram and TikTok have terms of service that prevent children under 13 from creating accounts. Social media and video game apps commonly ask new users to provide their date of birth.

Nevertheless, regulators have filed several complaints against big tech companies accusing them of failing to set up effective age-gating systems; showing targeted advertisements to children based on their online behavior without parental permission; enabling strangers to contact children online; or keeping children's data even after parents asked for it to be deleted. Amazon; Microsoft; Google and its YouTube platform; Epic Games, the creator of Fortnite; and the social app now known as TikTok have each paid multimillion-dollar fines to settle charges of violating the law.

Separately, a coalition of 33 state attorneys general filed a joint federal lawsuit in October against Facebook and Instagram's parent company Meta, saying the company violated children's privacy laws. In particular, the states criticized Meta's age-verification system, saying that the company allowed millions of underage users to create accounts without parental consent. Meta said it has spent a decade working to make online experiences safe and age-appropriate for teens and the states' complaint “misrepresents” the company's work.

The FTC proposed stronger children's privacy protections amid growing public concern over the potential mental health and physical safety risks posed to young people by popular online services. Parents, pediatricians and children's groups warn that social media content recommendation systems routinely show young girls inappropriate content promoting self-harm, eating disorders and plastic surgery. And some school officials worry that social media platforms distract students from their work in the classroom.

States have passed more than a dozen laws this year that restrict minors' access to social media networks or pornography sites. Industry trade groups have successfully sued to temporarily block many of those laws.

The FTC initiated a review of the children's privacy rule in 2019, receiving more than 175,000 comments from tech and advertising industry trade groups, video content developers, consumer advocacy groups, and members of Congress. The resulting proposal runs more than 150 pages.

The proposed changes include narrowing an exception that allows online services to collect persistent identification codes from children without parental consent for certain internal functions, such as product improvement, consumer personalization or fraud prevention.

The proposed changes would prevent online operators from using such user-tracking code to maximize the time children spend on their platforms. This means that online services would not be able to use technologies such as sending mobile phone notifications “to induce a child to engage with the site or service, without verifiable parental consent,” according to the proposal.

How online services would comply with the changes is not yet known. Members of the public have 60 days to comment on the proposals, after which the commission will vote.

Initial reactions from industry trade groups were mixed.

The Software and Information Industry Association, whose members include Amazon, Apple, Google and Meta, said it was “grateful” for the FTC's efforts to consider outside input and noted that the group's recommendations were cited in the agency's proposal.

“We are interested in participating in the next phase of the effort and hope the FTC will take a similarly thoughtful approach,” Paul Lekas, the group's head of global public policy, said in an email.

NetChoice, whose members include TikTok, Snap, Amazon, Google and Meta, said in contrast that the agency's proposed changes go too far by setting defaults that parents might not want. The group has sued several states to block new laws that would limit minors' access to online services.

“With this new rule, the FTC is overriding the wishes of parents,” Carl Szabo, the group's general counsel, said in a statement, adding that it would “make it even more difficult for websites to provide essential services to children that are not approved by their parents.”
