What You Need To Know About “Momo”

March 6, 2019

It’s been an unsettling week for parents after a series of reports claimed that kids were finding disturbing images in online videos. Word about the “Momo Challenge,” as well as reports of upsetting clips spliced into children’s videos, spread quickly across parents’ groups. But are the dangers as real as they seem? Here’s what you need to know.

The “Momo Challenge” looks horrifying, but its reputation may be overblown.

A Facebook post by mom Kayleigh Govier, who claims her son came across the “Momo Challenge” while watching a video about the game Roblox on YouTube, has gone viral due to the upsetting nature of what her son saw.

Govier’s son supposedly wasn’t the first to come across the challenge. “Momo,” whose frightening visage was taken from an unrelated sculpture by a Japanese special-effects company, is an urban legend. Lore has it that Momo will pop up in videos or on messaging services and dare children to do certain things, including harming themselves or others, and that they must comply or their family will be cursed. After a series of escalating dares, legend says, the “challenge” ends when Momo instructs followers to commit suicide.

With something so horrifying, of course parents are going to warn each other. Even Kim Kardashian posted about the Momo Challenge on her Instagram Stories.

The thing is, the Momo Challenge looks like it’s actually more hoax than horror.

“Contrary to press reports, we’ve not received any recent evidence of videos showing or promoting the Momo challenge on YouTube,” a YouTube spokesperson told GoodHousekeeping.com in an email. “Content of this kind would be in violation of our policies and removed immediately.”

The Parent Zone, which has been keeping track of all things Momo, agrees that there isn’t much evidence that people are engaging in the Momo Challenge. In fact, the site notes that children are more likely to come across the Momo image from reports about the challenge on the news.

“Suicide instructions” found on YouTube Kids

The Washington Post first reported the story of Dr. Free Hess, a pediatrician who came across something unsettling in a video on YouTube Kids about the Nintendo game Splatoon. “Four minutes and forty-five seconds into the video, a man quickly walked onto the screen, held his arm out, and taught the children watching this video how to properly kill themselves,” she writes on her blog. “The man quickly walked in, held his arm out, and tracing his forearm, said, ‘Kids, remember, cut this way for attention, and this way for results,’ and then quickly walked off.”

Sadly, this one is real. And it’s even more unsettling because there’s an impression that the videos on the YouTube Kids app, which is separate from YouTube.com, are safer for kids. But YouTube notes that the app’s content is determined by a mix of human-supervised automated systems, user input, and human review. Something sinister like this, tucked more than four minutes into a video, can slip through until it’s reported.

“We work to ensure the videos in YouTube Kids are family-friendly and take feedback very seriously,” the YouTube spokesperson says. “We appreciate people drawing problematic content to our attention, and make it possible for anyone to flag a video. Flagged videos are manually reviewed 24/7 and any videos that don’t belong in the app are removed. We’ve also been investing in new controls for parents including the ability to hand-pick videos and channels in the app. We are making constant improvements to our systems and recognize there’s more work to do.”

The spokesperson also notes that in the YouTube Kids app, kids with the “search” function turned on have access to millions more videos than users who keep it off. (See how to turn it off here.)

It’s always best to keep tabs on what your kids are watching online

This isn’t the first time disturbing content has found its way into kids’ videos. Previously, a rash of weird, inappropriate videos featuring beloved children’s characters led to a scandal dubbed “Elsagate.” And recently, Bloomberg reported that Disney, Nestlé, and Fortnite maker Epic Games pulled ads from YouTube after vlogger Matt Watson demonstrated how people were using YouTube comments to exploit underage children. (Since Watson posted his video, YouTube has disabled comments on videos featuring minors.)

YouTube can remove flagged content, lock comment sections, update its features and algorithms, and review what it can (and human review takes its own toll: some content moderators for Facebook report symptoms of PTSD). But if people are determined to slip disturbing material through, they’ll find a way around the site’s controls. The best defense is to vet the videos your kids watch yourself: set up a family account, subscribe to trusted channels or save videos you know are free of disturbing content, and limit how much browsing your kids do on their own.
