YouTube to age-restrict way more videos
After launching the standalone YouTube Kids app for users under the age of 13, YouTube is rolling out more artificial intelligence-powered technology to catch videos that may require age restrictions, meaning more viewers will be asked to sign in to their accounts and verify their age before watching. YouTube already uses machine learning to flag inappropriate content for its Trust and Safety team to review, and it now plans to expand that use of the technology: starting later this year, machine learning will automatically apply age restrictions to inappropriate videos. As part of the same initiative, YouTube plans to make it harder for children to skirt those restrictions. If your child tries to watch a restricted video through an embed on another website, Google will redirect them to YouTube, where they’ll need to sign in to prove they’re over 18. “This will help ensure that, no matter where a video is discovered, it will only be viewable by the appropriate audience,” according to YouTube.
One of the biggest questions facing creators in YouTube’s Partner Program (those who are able to monetize their videos) is whether these moderation measures will affect their earnings. YouTube’s team doesn’t believe so, because most of the videos it expects to receive automatic age restrictions likely also violate the company’s advertiser-friendly guidelines. In other words, those videos would already be running with limited or no ads, according to YouTube.
That doesn’t mean mistakes won’t happen; they will, as countless past incidents of wrongly applied labels, takedowns, and copyright strike controversies have shown. But YouTube is expanding its appeals team to handle those cases as they come in. Another concern creators have is that age-restricted videos won’t appear on the homepage. While age-restricted videos are less likely to appear there, a restriction doesn’t automatically exclude a video from the homepage, according to YouTube.
Today’s announcement is Google’s latest move to make YouTube look more responsible to parents and their children after the platform was marred by controversy.
Also read: Misinformation made YouTube revert to human moderators