YouTube is showing children as young as two videos promoting skin bleaching, weight loss, drug culture and firearms, new research into the company’s Kids app has found.
YouTube Kids, an app and website released in 2015, aims to be a safer, curated version of the video-sharing site for children under the age of 13. It tailors content to three age groups – “Older”, “Younger” and “Preschool” – corresponding approximately to ages nine to twelve, four to seven, and under four.
The company ensures the service’s videos are family-friendly through “a mix of automated filters created by our engineering teams, human verification and parental feedback to protect our youngest users online.” But it warns users: “No system is perfect and inappropriate videos can slip through”.
Research by the Tech Transparency Project, a US-based non-profit organization, shows that the system is indeed far from perfect. Using three different accounts, each tuned to one of the app’s age groups, the analysts uncovered numerous videos that shouldn’t have made it through Google’s filters.
A Breaking Bad cooking show, for example, in which the hosts dress up in respirators and joke about the risk of inhaling the fumes, might be light-hearted fun for adults or teens, but was deemed appropriate by YouTube for “younger” kids, as was a Minecraft project recreating the RV “where the crystal meth is made” from the hit series.
Songs, too, sometimes bring mature themes into the kids’ app. Eric Clapton’s Cocaine – sample lyric: “If your thing is gone and you wanna ride on, cocaine” – is available to children aged five and up as part of a guitar lesson series.
Content aimed at gun users also slipped through the net: “younger” kids were shown a ranking of the buttplates that protect shooters from a firearm’s recoil, while “older” kids were offered step-by-step instructions for building a shelf with a hidden compartment for concealing a gun.
Most alarming was content aimed at children that could foster harmful body-image issues. A popular post by an Indian beauty influencer on using skin-bleaching products was available for “older” kids, while even preschoolers were shown a cartoon about the importance of burning calories to lose weight, urging them to “wiggle the wiggle”.
“YouTube Kids caters specifically to young children, even toddlers,” said Katie Paul, director of TTP. “This is a product that, according to YouTube, uses a lot of machine learning to filter out harmful content. It’s designed specifically to be safe for children, and we didn’t expect to find the variety of inappropriate content that we did.”
“The most shocking thing for me personally was seeing so much drug-related content,” she added. “Obviously no drugs are being sold here – but a show like Breaking Bad, which is definitely intended for adults, is being mimicked to push rock candy as though it’s ‘baked meth’, with plenty of drug phrases thrown in.”
In a statement, a YouTube spokesperson said: “We built YouTube Kids to create a safer environment for children to explore their interests and curiosity, while giving parents the tools to customise the experience for their children. We have a higher bar for which videos can be part of the app, and we also give parents the ability to control what content their child can and cannot see. Upon review, we have removed or age-restricted a number of the reported videos from the Kids app.”
Paul argued that the findings showed “that algorithmically curated content should not be marketed to children. This is just another example that even when a company says it is making its best child-protection efforts to curate that content, we still see some of the most vulnerable populations suffering harm.”
The UK and EU are world leaders in regulating services like YouTube Kids, Paul added, but she said the US Congress must do its part.
She said: “Ultimately, we are talking about American companies. And it is the US Congress that should lead the way in developing stricter regulations to ensure these companies do not harm children.”