YouTube has finally taken a step toward controlling videos that are aimed at kids. The biggest streaming video service in the world has removed 50 user channels and stopped running ads on 3.5 million videos.
Johanna Wright, YouTube Vice President, said, “Across the board, we have scaled up resources to ensure that thousands of people are working around the clock to monitor, review and make the right decisions across our ads and content policies. These latest enforcement changes will take shape over the weeks and months ahead as we work to tackle this evolving challenge.”
Videos on YouTube are watched by every age group. For kids, there are millions of cartoons available, but not all of them show kid-appropriate content. Parents, regulators, advertisers and law enforcement have raised concerns that there are too few limitations on YouTube. They want Google to put restrictions on certain content, including sexual, extremist and unhealthy videos shown on the platform.
Reports from BuzzFeed and the New York Times, along with an online essay by British writer James Bridle, pointed out questionable clips on YouTube. As people began voicing these issues, concern about the content shown on YouTube grew.
YouTube’s Wright cited “a growing trend around content on YouTube that attempts to pass as family-friendly, but is clearly not” as the reason for the new efforts “to remove them from YouTube.”
YouTube takes into account user feedback, expert opinion, and automated computer programs to help decide which content should be removed from the platform.
Moderators are now told to delete videos “featuring minors that may be endangering a child, even if that was not the uploader’s intent.”
Videos that feature famous characters “but containing mature themes or adult humor” will be age-restricted to adult viewers only.
Also, the comments option will be disabled on videos where commenters refer to children in a “sexual or predatory” way.