Documents leaked from Facebook reveal how the company moderates the millions of posts and videos uploaded every day. According to the documents, Facebook monitors hate speech, posts and videos promoting terrorism, pornography, and images of suicide and self-harm.
Facebook's moderation teams are also facing newer challenges from users, including revenge porn. The moderators reviewing content on the platform typically have just ten seconds to decide whether to delete a flagged post or leave it up. Each week, Facebook handles almost 6.5 million reports of potentially fake accounts.
According to The Guardian, the workers moderating content on the platform often encounter inconsistencies in the policies, particularly around sexual content, where the rules are unclear. Facebook did not clarify this point beyond saying that safety is its prime responsibility and concern. The company added that it wants to create a platform where people can speak freely, but with safety ensured even within that freedom. It described a detailed procedure in which everything is taken into account once a controversial post is made.
Facebook allows users to live-stream attempts at self-harm because the social media giant says it does not want to cause further pain to people already in distress.
Facebook is upgrading its technology to curb harmful content. Software to automatically screen graphic content is under development, but it is still in its early stages.
Facebook recently expanded its staff, hiring 3,000 additional employees for the sole purpose of removing inappropriate videos before they go viral. This shows that Facebook is trying to balance enforcing its policies with giving users the freedom to post what they like.
Media coordinator and junior editor at Research Snipers (RS-NEWS). I studied mass communication, am interested in technology and business, and have three years of experience in the media industry.