
Amid rise of sextortion, computer scientists turn to AI to identify risky apps


Almost weekly, Brian Levine, a computer scientist at the University of Massachusetts Amherst, gets asked the same question by his 14-year-old daughter: Can I download this app?

Mr. Levine responds by scanning hundreds of customer reviews in the App Store for allegations of harassment or child sexual abuse. The manual, haphazard process has made him wonder why there aren’t more resources to help parents make quick decisions about apps.

Over the past two years, Mr. Levine has tried to help parents by building a computational model that evaluates customer reviews of social apps. He and a team of researchers used artificial intelligence to assess the context of reviews containing words such as “child porn” or “pedo.” The result is a searchable website called the App Danger Project, which provides clear guidance on the safety of social networking apps.

The website tallies user reviews that mention sexual predators and provides a safety assessment of apps with negative reviews, listing the reviews that allege sexual abuse. Although the team did not contact reviewers to verify their claims, it read each one and excluded those that did not highlight child-safety concerns.
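The team’s actual pipeline is not published alongside this article, but the flagging step it describes (scan review text for alarm terms, then judge the surrounding context) can be sketched in a few lines of Python. Everything below, including the term list, the review format and the app name, is an illustrative assumption, not the project’s real implementation:

```python
import re

# Hypothetical alarm-term list: the article names "child porn" and "pedo"
# as examples, but the project's real vocabulary and model are not public.
FLAG_TERMS = re.compile(r"child\s*porn|pedo", re.IGNORECASE)

def flag_reviews(reviews):
    """Keep reviews that mention a flagged term. A fuller system would then
    pass these hits to a trained classifier (or a human reader, as the team
    did) to decide whether the context alleges a child-safety harm."""
    return [r for r in reviews if FLAG_TERMS.search(r["text"])]

if __name__ == "__main__":
    sample = [
        {"app": "ExampleChat", "text": "Full of pedos. Reported it, nothing happened."},
        {"app": "ExampleChat", "text": "Fun app, great filters!"},
    ]
    for review in flag_reviews(sample):
        print(review["app"], "->", review["text"])
```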

“There are reviews that talk about the type of dangerous behavior that is occurring, but those reviews are suppressed,” Mr. Levine said. “You can’t find them.”

Predators are increasingly weaponizing apps and online services to collect explicit images. Last year, law enforcement received 7,000 reports of children and teenagers who were coerced into sending nude images and then blackmailed for photographs or money. The FBI declined to say how many of those reports were credible. The incidents, known as sextortion, more than doubled during the pandemic.

Because Apple and Google’s app stores don’t offer keyword searches of reviews, Mr. Levine said, it can be difficult for parents to find warnings of inappropriate sexual conduct. He envisions the App Danger Project, which is free, complementing other services that vet products’ suitability for children, such as Common Sense Media, by identifying apps that aren’t doing enough to police their users. He does not plan to profit from the site but is encouraging donations to the University of Massachusetts to offset its costs.

Mr. Levine and a dozen computer scientists examined the number of reviews warning of child sexual abuse across more than 550 social networking apps distributed by Apple and Google. They found that a fifth of those apps had two or more complaints of child sexual abuse material and that 81 offerings on the App Store and Play Store had seven or more such reviews.
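Those cutoffs (two or more complaints; seven or more reviews) amount to a simple per-app tally of flagged reviews. A minimal sketch, reusing the hypothetical review format assumed in the snippet above:

```python
from collections import Counter

def risky_apps(flagged_reviews, threshold=2):
    """Count keyword-flagged reviews per app and return apps at or above
    a cutoff; the study reports results at cutoffs of 2 and 7 reviews.
    Uses the same assumed list-of-dicts format as the earlier sketch."""
    counts = Counter(r["app"] for r in flagged_reviews)
    return {app: n for app, n in counts.items() if n >= threshold}

# Example: risky_apps(flag_reviews(all_reviews), threshold=7) would list
# apps with seven or more flagged reviews (all names here are hypothetical).
```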

Their investigation builds on previous reports of apps with complaints of unwanted sexual interactions. In 2019, The New York Times detailed how predators treat video games and social media platforms as hunting grounds. A separate report that year by The Washington Post found thousands of complaints across six apps, leading Apple to remove the apps Monkey, ChatLive and Chat for Strangers.

Apple and Google have a financial interest in distributing apps. The tech giants, which take up to 30 percent of app store sales, helped three apps with multiple user reports of sexual abuse generate $30 million in sales last year: Hoop, MeetMe and Whisper, according to the market research firm Sensor Tower.

In more than a dozen criminal cases, the Justice Department has described those apps as tools that were used to ask children for sexual images or meetings: Hoop in Minnesota; MeetMe in California, Kentucky and Iowa; and Whisper in Illinois, Texas and Ohio.

Mr. Levine said Apple and Google should provide parents with more information about the risks posed by some apps and better police those with a track record of abuse.

“We are not saying that every app with reviews that say child predators are on it should be removed, but if they have the technology to check this, why are some of these problematic apps still in the store?” asked Hany Farid, a computer scientist at the University of California, Berkeley, who worked with Mr. Levine on the App Danger Project.

Apple and Google said they regularly scan user reviews of apps with their own computational models and investigate allegations of child sexual abuse. When apps violate their policies, they are removed. Apps carry age ratings to help parents and children, and software allows parents to veto downloads. The companies also offer app developers tools to monitor for child sexual material.

Google spokesman Dan Jackson said the company had examined the apps listed by the App Danger Project and had not found evidence of child sexual abuse material.

“While user reviews play an important role as a signal to initiate further investigation, the allegations in the reviews are not credible enough by themselves,” he added.

Apple also checked the apps listed by the App Danger Project and removed 10 apps that violated distribution rules. It declined to provide a list of those apps or the reasons for taking action.

“Our app review team works 24/7 to carefully review every new app and app update to ensure it meets Apple’s standards,” a spokesperson said in a statement.

The App Danger Project said it had found a significant number of reviews suggesting that Hoop, a social networking app, was unsafe for children; for example, it found that 176 of 32,000 reviews since 2019 included reports of sexual abuse.

“There’s an abundance of sexual predators on here who spam people going by the name ‘Read My Picture’ with links to dating sites,” says one review pulled from the App Store. “It has a picture of a small child and says to go to their site for child porn.”

Hoop, which is under new management, has a new content moderation system to strengthen user safety, said Liath Ariche, Hoop’s chief executive, who noted that the researchers spotlighted how the original founders struggled to cope with bots and malicious users. “The situation has improved considerably,” the chief executive said.

The Meet Group, which owns MeetMe, said it does not tolerate the abuse or exploitation of minors and uses artificial intelligence tools to detect predators and report them to law enforcement. It reports inappropriate or suspicious activity to the authorities, including a 2019 episode in which a man from Raleigh, N.C., solicited child pornography.

Whisper did not respond to requests for comment.

Sean Pierce, who leads the San Jose Police Department’s task force on Internet crimes against children, said some app developers avoid investigating complaints about sextortion to reduce their legal liability. The law says they don’t have to report criminal activity unless they know of it, he said.

“It’s more the apps’ fault than the app store because the apps are doing it,” said Sergeant Pierce, who makes presentations to San Jose schools through a program called the Vigilant Parent Initiative. Part of the challenge, he said, is that many apps connect strangers for anonymous conversations, which makes it difficult for law enforcement to verify.

Apple and Google submit hundreds of reports annually to the U.S. clearinghouse for child sexual abuse but do not specify whether any of those reports are related to apps.

Whisper is among the social media apps that Mr. Levine’s team found had multiple reviews mentioning sexual abuse. After downloading the app, a high school student received a message in 2018 from a stranger who offered to contribute to a school robotics fund-raiser in exchange for a topless photo. After she sent one, the stranger threatened to send it to her family unless she provided more images.

The teenager’s family reported the incident to local law enforcement, according to a report from the Mascoutah Police Department in Illinois, and a local man, Joshua Breckel, was later arrested. He was sentenced to 35 years in prison on charges of extortion and child pornography. Although Whisper was not found responsible, the app was named, along with a half-dozen others, as the primary tool he used to collect images from victims ages 10 to 15.

Chris Hoell, a former federal prosecutor for the Southern District of Illinois who worked on the Breckel case, said the App Danger Project’s comprehensive evaluation of reviews could help parents protect their children from problems on apps like Whisper.

“It’s like an aggressively spreading, treatment-resistant tumor,” said Mr. Hoell, who now has a private practice in St. Louis. “We need more tools.”


