YouTube has disclosed that it removed 8.3 million videos from its platform between October and December 2017. In just three months, the company removed these videos because they breached its community guidelines.
The figures come from YouTube's first quarterly moderation report. The company has faced considerable pressure and criticism for failing to control abusive and extremist content.
YouTube is a subsidiary of Google, whose parent company is Alphabet. Recently, along with other sites and apps, YouTube has come under pressure from national governments and the EU to remove videos that spread hate and harmful content.
According to YouTube, the report is a first step in dealing with the issue and will “help show the progress we’re making in removing violative content from our platform”.
In a blog post, YouTube announced that more than 8 million videos had been removed in three months. The post said: “The majority of these 8m videos were spam or people attempting to upload adult content and represent a fraction of a percent of YouTube’s total views during this time period.”
Google previously announced that it would hire 10,000 human moderators, though it did not specify whether they would be dedicated to YouTube video content or other purposes. According to YouTube, machine learning has helped it quickly remove videos in “high-risk, low-volume areas like violent extremism”. However, a graph published alongside the data shows that more than a fifth of the videos flagged for extremist content had already been viewed over 100 times.
YouTube said, “Our investment in machine learning to help speed up removals is paying off across high-risk, low-volume areas — like violent extremism — and in high-volume areas, like spam. We’ve also hired full-time specialists with expertise in violent extremism, counter-terrorism, and human rights, and we’ve expanded our regional expert teams.”
Media coordinator and junior editor at Research Snipers RS-NEWS. I studied mass communication, am interested in technology and business, and have three years of experience in the media industry.