Tuesday, February 27, 2024

Law enforcement prepares for flood of child sexual abuse images generated by AI

Law enforcement officials are bracing for an explosion of artificial intelligence-generated content that realistically depicts the sexual exploitation of children, deepening the challenge of identifying victims and responding to such abuse.

The concerns come after Meta, which has been a primary resource for authorities in flagging sexually explicit content, made it harder to track criminals by encrypting its messaging service. That complication underscores the tricky balance technology companies must strike between privacy rights and the safety of children. And the prospect of prosecuting this type of crime raises questions about whether such images are illegal and what kind of recourse victims might have.

Congressional lawmakers have seized on some of those concerns to press for more stringent safety measures, including calling on technology executives to testify Wednesday about protections for children on their platforms. The fake, sexually explicit images of Taylor Swift, possibly generated by AI, that swept social media last week only highlighted the risks of such technology.

Steve Grocki, head of the Justice Department's Child Exploitation and Obscenity Section, said, “Creating sexually explicit images of children through the use of artificial intelligence is a particularly egregious form of online exploitation.”

The ease of AI technology means criminals can create multiple images of children being sexually exploited or abused with the click of a button.

Simply entering a prompt produces realistic images, videos and text in a matter of minutes, yielding new images of real children as well as explicit images of children who do not actually exist. These may include AI-generated content depicting infants and toddlers being abused; sexualized images of child celebrities, according to a recent study from Britain; and ordinary classroom photos altered so that all the children appear naked.

“The frightening thing we have now is that someone can take a picture of a child from social media, from a high school page or from a sporting event, and they can engage in what some people call ‘nudification,’” said Dr. Michael Bourke, former chief psychologist for the US Marshals Service, who has worked on sex crimes involving children for decades. He said the use of AI to alter photos in this way is becoming common.

Experts say the photos are indistinguishable from the real thing, making it difficult to tell real victims from fake ones. “The investigations are way more challenging,” said Lt. Robin Richards, commander of the Los Angeles Police Department's Internet Crimes Against Children Task Force. “It takes time to investigate, and then once we are fully into the investigation, it's AI, and then what do we do with it going forward?”

Law enforcement agencies, understaffed and underfunded, are already struggling to keep pace as rapid advances in technology have allowed depictions of child sexual exploitation to flourish at a startling rate. Images and videos circulate across the Internet, enabled by smartphone cameras, the dark web, social media and messaging applications.

Only a fraction of the material known to be criminal is being investigated. John Pizzuro, head of Raven, a nonprofit that works with lawmakers and businesses to fight the sexual exploitation of children, said that over a recent 90-day period, law enforcement officials had linked nearly 100,000 IP addresses nationwide to child sexual abuse material. (An IP address is a unique sequence of numbers assigned to each computer or smartphone connected to the Internet.) Fewer than 700 of them were being investigated, he said, because of a chronic lack of funding dedicated to fighting these crimes.

Although a 2008 federal law authorized $60 million to assist state and local law enforcement officials in investigating and prosecuting such crimes, Congress has never appropriated that much in any year, said Mr. Pizzuro, a former commander who oversaw online child exploitation cases in New Jersey.

The use of artificial intelligence has complicated other aspects of tracking child sexual abuse. Typically, known content is assigned a string of numbers that amounts to a digital fingerprint, which is used to detect and remove illegal content. If a known image or video is modified, the content appears new and is no longer associated with that digital fingerprint.
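The fingerprinting problem described above can be illustrated with an exact cryptographic hash. This is a simplified stand-in: real matching systems such as Microsoft's PhotoDNA use perceptual hashes designed to tolerate small edits, but the underlying failure mode is the same in spirit. Changing even one byte of a file yields an entirely different fingerprint, so a modified copy no longer matches a database of known fingerprints:

```python
import hashlib

def fingerprint(data: bytes) -> str:
    # Compute a digital fingerprint of the content (SHA-256 here).
    return hashlib.sha256(data).hexdigest()

# A known file and a copy with a single byte appended.
known = b"example media content"
modified = b"example media content!"

# A database of fingerprints of previously identified content.
known_db = {fingerprint(known)}

print(fingerprint(known) in known_db)     # True: exact match found
print(fingerprint(modified) in known_db)  # False: tiny edit, entirely new fingerprint
```

The file names and data here are placeholders; the point is only that exact-match fingerprinting is brittle against any modification, which is why detection pipelines rely on perceptual hashing and why AI-altered content can slip past them.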

Adding to those challenges is the fact that although the law requires tech companies to report when they discover illegal content, it does not require them to actively seek it out.

Approaches vary among tech companies. Meta has been the authorities' best partner when it comes to flagging sexually explicit content involving children.

Of the roughly 32 million total tips submitted in 2022, Meta referred approximately 21 million to the National Center for Missing and Exploited Children, a federally designated clearinghouse for child sexual abuse material.

But the company is encrypting its messaging platform to compete with other secure services that shield users' content, essentially turning off the lights for investigators.

Raven's legal counsel Jennifer Dunton warned of the consequences, saying the decision could drastically reduce the number of crimes tracked by authorities. “Now you have images that no one has ever seen, and now we're not even finding them,” she said.

Britain's security minister Tom Tugendhat said the move would empower child predators around the world.

“Meta’s decision to implement end-to-end encryption without strong security features makes these images available to millions of people without fear of being caught,” Mr. Tugendhat said in a statement.

The social media giant said it would continue to pass on any tips on child sexual abuse material to authorities. “We're focused on finding and reporting this content, while working to prevent abuse in the first place,” said Meta spokesperson Alex Dzidzon.

Even though the number of current cases involving AI-generated child sexual abuse material is small, that number is expected to grow rapidly, raising new and complex questions about whether existing federal and state laws are sufficient to prosecute these crimes.

For one, there is the issue of how to treat entirely AI-generated materials.

In 2002, the Supreme Court overturned a federal ban on computer-generated imagery of child sexual abuse, finding that the law was written so broadly that it could potentially limit even political and artistic works. Alan Wilson, the attorney general of South Carolina, who led a letter to Congress urging lawmakers to act swiftly, said in an interview that he expected that ruling to be tested as cases of AI-generated child sexual abuse material rise.

Several federal laws, including obscenity statutes, can be used to prosecute cases involving online child sexual abuse material. Some states are considering how to criminalize such AI-generated content, including how to hold minors who create such images and videos to account.

For Francesca Mani, a high school student in Westfield, NJ, the lack of legal consequences for creating and sharing such AI-generated images is especially serious.

In October, Francesca, who was 14 at the time, learned that she was one of the girls in her class whose likeness had been manipulated, stripped of her clothes without her consent, to create a fake nude image that was then circulated in an online group chat.

Francesca has gone from upset to angry to empowered, her mother, Dorota Mani, said in a recent interview, adding that they were working with state and federal lawmakers to draft new laws that would make such fake nude images illegal. The incident is still under investigation, although at least one male student was briefly suspended.

This month, Francesca spoke in Washington about her experience and called on Congress to pass a bill that would make sharing such content a federal crime.

“What happened to me at 14 could happen to anyone,” she said. “That's why legislation is so important.”
