YouTube and companies that advertise with it are coming under fire after one of the video-sharing site's users showed how the platform can apparently be used to bolster child exploitation.

User MattsWhatItIs posted the video late Sunday night. It quickly went viral, garnering nearly 600,000 views within 14 hours.

In the video, MattsWhatItIs claims to have discovered a "wormhole" leading to "a softcore pedophile ring."

He says that after starting a new YouTube account unconnected to any of his previous browsing history, it takes him only a few moments of searching and surfing to get from the YouTube homepage to one of many seemingly innocuous videos that have nonetheless racked up unusually high view totals and comment counts.

The videos typically feature young girls engaged in activities such as yoga, gymnastics or other day-to-day activities. Comments in many languages along the lines of "beautiful goddess" and "beautiful video Barbie" are common, as are suggestions for what the girls could do in future videos. The vast majority of the comments stop short of being sexually explicit.

The comments often include timestamps linking users' messages to a specific moment in the video. "These guys aren't timestamping this stuff because the little girl made a funny joke," MattsWhatItIs notes in his video.

More concerning to MattsWhatItIs is that YouTube's recommendation engine makes it even easier for users interested in these sorts of videos to find more of them. While he says it took "about five clicks" for his new account to find its first such video, he was soon bombarded with suggestions of similar content, all one click away.

"YouTube's algorithm, through some kind of glitch or error in its programming, is actually facilitating their ability to do this," he says.

The video has attracted thousands of comments and social media posts largely supporting its claims and slamming YouTube for allowing the questionable behaviour to fester. Some users have also contacted prominent companies that advertise on YouTube, alerting them that their commercials could be associated with the videos and comments in question.

What is YouTube doing?

Some of the videos flagged by MattsWhatItIs contain advertisements, meaning the people who uploaded them, or who in some cases appear to have taken them from their creators' pages and reuploaded them, are making money off the inappropriate interest the videos have generated.

A YouTube spokesperson said the platform is "invest[ing] heavily" in efforts to combat child exploitation, including by forming partnerships with non-profit groups.

"Any content, including comments, that endangers minors is abhorrent and we have clear policies prohibiting this on YouTube," the spokesperson said in a statement to CTVNews.ca.

"We enforce these policies aggressively, reporting it to the relevant authorities, removing it from our platform and terminating accounts."

YouTube was criticized in 2017 for monetizing videos which appeared to be harmless cartoons but later morphed into sexual or otherwise potentially disturbing content.

The platform said at the time that it was strengthening its guidelines to crack down on "content on YouTube that attempts to pass as family-friendly, but is clearly not." YouTube also said it was taking an "even more aggressive" stance on "inappropriate sexual or predatory comments," including turning off all comments on videos of children where those sorts of remarks are noticed.

A few of the videos MattsWhatItIs showcased had comments turned off. Most did not, possibly suggesting that they had never been reported by YouTube's users or flagged by its monitoring system.

The videos themselves would not be taken down by YouTube, as the guidelines cover content that is intended to be in some way sexual 鈥 not innocuous videos which prompt sexual interest in a portion of their audience.

YouTube has also said that it removed 7.8 million videos, nearly 1.7 million accounts and more than 224 million comments over a three-month period for violating the platform's guidelines. The vast majority of those videos contained either spam or adult content, with about 10 per cent being flagged for child safety reasons. About 60 per cent of the videos had not been viewed even once.

How to protect your children

YouTube requires its users to sign up for Google accounts, which require them to provide their birthdates. Canadians and Americans under the age of 13 are not allowed to create accounts, meaning they cannot publish videos to YouTube, although there is nothing stopping them from providing fake birthdates.

The platform also provides a page of safety tips for younger users, suggesting that they make use of YouTube's privacy controls and stay away from filming sexually suggestive content. Its advice for parents includes watching how their children use the service and flagging anything that appears to violate the platform's guidelines.

Accounts that repeatedly violate the guidelines are subject to increasing punishments, with YouTube deleting any accounts that break the rules three times within three months. Accounts can also be terminated immediately if YouTube finds that they engaged in "predatory behaviour" or otherwise endangered children.

Privacy and cybersafety expert Claudiu Popa would like to see YouTube and other social media platforms allocate more resources to filtering out inappropriate content and teaching their users about the consequences of using their services.

"There needs to be more investment in filtering; there needs to be more investment in prevention and in education," he told CTVNews.ca Monday.

Popa is the founder of an organization that works with teachers and schools to promote cybersafety. He said parents should not use social media "as a babysitter" and should always be aware of what their children are doing online, without watching their every keystroke.

"You don't necessarily want to be watching everything that they're constantly typing, because they're just going to burrow deeper," he said.

The Canadian Centre for Child Protection maintains a website that offers advice for parents seeking more information about the dangers children may face when using the internet and social media.