LiveMe deletes 600k accounts after FOX 11 reveals pedophiles use app to sexually exploit kids

WARNING GRAPHIC MATERIAL:

It’s been two months since a FOX 11 investigation first exposed the dangers of the popular livestreaming app LiveMe, and despite the company’s assurances that changes are being made, experts say it’s simply not enough.

So many underage children are using the app that LiveMe tells FOX 11 it has deleted 600,000 accounts belonging to kids under the age of 13 in just the 60 days since FOX 11 first aired its report.
FOX 11’s previous story was seen by millions across the country, and revealed that pedophiles regularly use the app to exploit young children sexually.

The original report can be seen here.

As FOX 11 has reported, LiveMe has been downloaded more than 90 million times. It shares the user’s location, and allows users to search for who is streaming near them. FOX 11 found countless streams of underage girls who were being solicited by pedophiles to perform sexual tasks.

Some of the girls took off their clothes, others danced sexually.

All the while, in many cases, older men wrote perverted comments to them.

FOX 11 found some of the streams were recorded and then posted on porn sites, where customers were directed straight to live streams or web captures of underage girls on LiveMe, with links advertising the girls as 'young jailbait' who were secretly recorded without their knowledge.

After FOX 11’s story aired, new disclaimers began appearing on LiveMe’s videos saying that violent and sexual content would not be tolerated, and that violators would be banned.

The company told FOX 11 they’ve been enlisting the help of vetted volunteers to help report bad conduct on the app, and that they’ve been cracking down on users who lie about their age.

The minimum age required to use the app is 13.

Now, FOX 11 has obtained video showing the moment a young girl is confronted by her mother, and law enforcement, after she took off her clothes and exposed herself at the request of perverts during a LiveMe stream.

The girl had livestreamed herself out to thousands as she exposed herself in a closet with a close friend, at the request of pedophiles.

Chat logs show the pedophiles ordered the girl to pull her top up, and perform a sexual act on her friend. They also told her to take off her pants and show her private parts.

Police were alerted to the livestream by a group called Sheepdog Bloodhound, a group of volunteers from around the world who monitor LiveMe in an effort to report and prevent child exploitation.

“The child had been online with a friend and [was acting] inappropriate[ly], showing herself, showing her body, basically doing what the predators had asked her to do,” said Paul Irwin, the leader of Sheepdog Bloodhound. “In most cases, shall we say 90 percent, the families have no idea what their youth are doing now.”

Irwin told FOX 11 that his group is seeing an uptick in inappropriate behavior on LiveMe during these summer months as kids are now out of school.

The video of the young girl being confronted by her mother and officers is eye-opening.

The girl is streaming live on LiveMe when all of a sudden, her mother comes into the room.
“Are you on your phone, or on your tablet?” she asks.
“Both,” the daughter replies.
“Okay, I want to look at both,” mom says.
“Why?” the daughter replies.
“Because I want to talk to you, and we have company, and I need your passwords,” the mom replies.
The mother brings her daughter and officers into another room.
“Do you know why we’re here?” an officer asks.
“No,” the girl replies.
“Did you make a video yesterday?” the officer says.
“With ****? Yeah,” the girl replies.
“So tell us about the video where you exposed yourself,” the officer says. “What happened? We’ve seen the video, and we heard it. Some people were chatting with you, they were asking you to take your shirt off.”
“**** did that,” the girl says.
The young girl sits in stunned silence, looking like a deer in the headlights.
“You’d better start talking and getting it out because I’m sure there’s more,” the mother says.
The officers then look at the daughter’s phone with her mom’s permission, and realize they’re being streamed out live as they speak.
“So what’s LiveMe, tell me about that?” one asks. “You’re logged in right now huh? So we’re live streaming right now?”

The stream is then cut off.

“It was interesting to watch the video live given that the mother in that situation literally, I felt kind of, kinda threw her daughter under the bus and didn’t take any responsibility,” said Dr. Lisa Strohman, a clinical psychologist who worked with the FBI and is now the director of the Digital Citizen Academy.

“She’s not aware that legally, her daughter was absolutely liable [for] creating and distributing child pornography.”

Dr. Strohman said the video shows parents have to be more involved in their kids’ lives on social media.

“One of the most interesting things I saw on the tape was that the mother was asking what the daughter’s passwords were, so again, this is a classic example of how we want parents to be the ones in charge,” she said.

Dr. Strohman believes LiveMe’s volunteer program isn’t enough, and that the company has to do more to stop bad content from appearing on its platform.

“The problem is, now, they’ve opened it up to allow the general public to come in and do the work that quite frankly, they should be paying for,” she said. “As a parent, there’s absolutely no way I would allow my child on that platform.”

FOX 11 spoke with a young girl who works with Sheepdog Bloodhound to catch predators on LiveMe. She had been using the app since she was 10.

FOX 11 is hiding her identity. With her mother’s permission, she explains how attention and virtual currency make the app so enticing to young girls.

“We go live, and we’re just bored and we get no comments,” the girl said. “But then the predators come in and entice them, and then they’ll give a gift, and that’s when we’re like, ‘Oh, you’re paying attention to me, so I’ll do whatever they want me to do and I want those gifts.’”

LiveMe told FOX 11 they’ve worked aggressively over the last 60 days to clean up bad behavior on their platform.

They say they’ve deleted 600,000 accounts flagged as belonging to users under the age of 13, and plan to delete all remaining under-13 accounts within the next 60 days.

“We’ve also launched the LiveMe Safety Advocate program which has successfully onboarded 1,000 [volunteers],” a LiveMe representative said. “These are individuals who passed a strict screening process within the app, and then successfully cleared a third-party background check. These [volunteers] receive priority reporting privileges which enable LiveMe’s moderation team to respond to issues much faster.”

The company also said it is working on improvements to its AI systems for image recognition and comment management, and that in the past 30 days it has added another 10,000 images, keywords, and phrases to improve detection.

LiveMe also said it is improving moderation and tightening regulation of its featured board to ensure that no broadcasters under the age of 18 appear on a featured feed unless a user is connected with them.
