Almost weekly, Brian Levine, a computer scientist at the University of Massachusetts Amherst, is asked the same question by his 14-year-old daughter: Can I download this app?
Mr. Levine responds by scanning hundreds of customer reviews in the App Store for allegations of child molestation or sexual abuse. The manual and arbitrary process has him wondering why more resources aren’t available to help parents make quick decisions about apps.
Over the past two years, Mr. Levine has sought to help parents by designing a computational model that evaluates customer reviews of social apps. Using artificial intelligence to evaluate the context of reviews containing words like “child porn” or “pedo,” he and a team of researchers built a searchable website called the App Danger Project, which provides clear guidance on the safety of social networking apps.
The website collects user reviews about sexual predators, provides safety assessments for apps with such reviews and displays the reviews that mention sexual abuse. Although the team did not follow up with reviewers to confirm their claims, it read each one and excluded those that did not raise concerns about child safety.
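The article does not describe the project's code in detail. As a rough illustration of the kind of keyword triage outlined above (automatically flagging reviews that contain certain terms, then reading each flagged review by hand), here is a minimal hypothetical sketch in Python; the keyword list, data schema and function names are assumptions for illustration, not the team's actual model.

```python
# Hypothetical sketch, not the App Danger Project's published code.
# It mirrors the workflow described above: flag reviews whose text
# contains predator-related terms, then queue them for human review.

import re

# Assumed keyword list, based on the example terms quoted in the article.
FLAG_TERMS = [r"child\s*porn", r"\bpedo\b", r"\bpredator\b", r"sextortion"]
FLAG_RE = re.compile("|".join(FLAG_TERMS), re.IGNORECASE)

def triage_reviews(reviews):
    """Return reviews mentioning flagged terms, for manual confirmation.

    `reviews` is an iterable of dicts like {"app": ..., "text": ...};
    this schema is an assumption for illustration.
    """
    flagged = [r for r in reviews if FLAG_RE.search(r["text"])]
    # The researchers read each flagged review by hand and excluded
    # those that did not raise child-safety concerns; that human
    # step is deliberately not automated here.
    return flagged

if __name__ == "__main__":
    sample = [
        {"app": "ExampleChat", "text": "Fun app, great filters!"},
        {"app": "ExampleChat", "text": "Watch out, predators message kids here"},
    ]
    for review in triage_reviews(sample):
        print(review["app"], "->", review["text"])
```

A simple keyword filter like this over-matches, which is why the manual reading step the team describes is essential before any review is published.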
“There are reviews out there that talk about the type of dangerous behavior that occurs, but those reviews have been drowned out,” said Mr. Levine. “You can’t find them.”
Predators increasingly weaponize apps and online services to collect explicit images. Last year, law enforcement received 7,000 reports of children and teenagers being forced to send nude photos and then blackmailed for photographs or money. The FBI declined to say how many of those reports were credible. The incidents, known as sextortion, more than doubled during the pandemic.
Because Apple’s and Google’s app stores don’t offer keyword searches of reviews, Mr. Levine said, it can be difficult for parents to find warnings about inappropriate sexual behavior. He envisions the App Danger Project, which is free, complementing other services that vet the suitability of products for children, such as Common Sense Media, by identifying apps that don’t do enough to police their users. He does not plan to profit from the site, but is encouraging donations to the University of Massachusetts to offset its costs.
Mr. Levine and a dozen computer scientists examined the number of reviews warning of child sexual abuse across more than 550 social networking apps distributed by Apple and Google. They found that a fifth of these apps had two or more complaints about child sexual abuse material, and that 81 offerings across the App and Play stores had seven or more of these types of reviews.
Their study builds on previous reports of apps with complaints about unwanted sexual interactions. In 2019, The New York Times described how predators treat video games and social media platforms as hunting grounds. A separate report that same year by The Washington Post found thousands of complaints across six apps, leading Apple to remove the apps Monkey, ChatLive and Chat for Strangers.
Apple and Google have a financial interest in distributing apps. The tech giants, which take up to 30 percent of app store sales, helped three apps with multiple user reports of sexual abuse generate $30 million in sales last year: Hoop, MeetMe and Whisper, according to Sensor Tower, a market research firm.
In more than a dozen criminal cases, the Justice Department has described these apps as tools used to solicit children for sexual images or encounters — Hoop in Minnesota; MeetMe in California, Kentucky and Iowa; and Whisper in Illinois, Texas and Ohio.
Mr. Levine said Apple and Google should provide parents with more information about the risks associated with some apps and better police those with a history of abuse.
“We’re not saying that every app with reviews that say there are child predators on it should be removed, but if they have the technology to check this, why are some of these problematic apps still in the stores?” asked Hany Farid, a computer scientist at the University of California, Berkeley, who worked with Mr. Levine on the App Danger Project.
Apple and Google said they regularly scan user reviews of apps with their own computational models and investigate allegations of child sexual abuse. When apps violate their policies, they are removed. Apps carry age ratings to help parents and children, and software allows parents to veto downloads. The companies also offer app developers tools to detect child sexual abuse material.
A Google spokesman said the company had examined the apps listed by the App Danger Project and found no evidence of child sexual abuse material.
“While user reviews play an important role as a signal to trigger further investigation, claims made in reviews are not reliable enough on their own,” he said.
Apple also investigated the apps listed by the App Danger Project and removed 10 that violated its distribution rules. It declined to provide a list of those apps or the reasons it took action.
“Our App Review team works 24/7 to carefully review each new app and app update to ensure it meets Apple’s standards,” a spokesperson said in a statement.
The App Danger Project said it had found a significant number of reviews suggesting that Hoop, a social networking app, was unsafe for children; for example, it found that 176 of 32,000 reviews since 2019 included reports of sexual abuse.
“There is an abundance of sexual predators here who spam people with links to join dating sites, as well as people with the name ‘Read My Picture,'” says a review pulled from the App Store. “It has a picture of a small child and says to go to their site for child porn.”
Hoop, which is under new management, has a new content moderation system to strengthen user security, said Liath Ariche, Hoop’s CEO, adding that the researchers focused on how the original founders struggled to deal with bots and malicious users. “The situation has improved drastically,” the CEO said.
The Meet Group, which owns MeetMe, said it did not condone the abuse or exploitation of minors and used artificial intelligence tools to detect predators and report them to law enforcement. It reports inappropriate or suspicious activity to the authorities, including a 2019 episode in which a man in Raleigh, N.C., solicited child pornography.
Whisper did not respond to requests for comment.
Sgt. Sean Pierce, who heads the San Jose Police Department’s Internet Crimes Against Children task force, said some app developers avoided investigating sextortion complaints to reduce their legal liability. The law says they don’t have to report criminal activity unless they find it, he said.
“It’s more the apps’ fault than the app store’s, because it’s the apps that are doing this,” said Sergeant Pierce, who gives presentations at San Jose schools through a program called the Vigilant Parent Initiative. Part of the challenge, he said, is that many apps connect strangers for anonymous conversations, making it difficult for law enforcement to verify complaints.
Apple and Google submit hundreds of reports annually to the U.S. clearinghouse for child sexual abuse, but do not specify whether any of those reports are related to apps.
Whisper is among the social media apps that Mr. Levine’s team found had multiple reviews mentioning sexual exploitation. After downloading the app, a high school student received a message in 2018 from a stranger who offered to contribute to a school robotics fund-raiser in exchange for a topless photograph. After she sent a photo, the stranger threatened to send it to her family unless she provided more images.
The teenager’s family reported the incident to the local police, according to a report from the Mascoutah Police Department in Illinois, which later arrested a local man, Joshua Breckel. He was sentenced to 35 years in prison for extortion and child pornography. Although Whisper was not found responsible, it was named, along with half a dozen other apps, as one of the primary tools he used to collect images from victims ranging in age from 10 to 15.
Chris Hoell, a former federal prosecutor in the Southern District of Illinois who worked on the Breckel case, said the App Danger Project’s extensive evaluation of reviews could help parents protect their children from problems with apps like Whisper.
“This is like an aggressively spreading, treatment-resistant tumor,” said Mr. Hoell, who now has a private practice in St. Louis. “We need more tools.”