While search engine giant Google is very open about its strategies and the measures it takes to combat the circulation of hate speech, fake news and other illegal content online, Facebook has kept its own manual to itself. Now, the secret is out.
The Guardian has got hold of Facebook's internal documents (training manuals, spreadsheets and flowcharts) that guide its moderators in monitoring user behaviour and lay out the procedures to follow if they find content circulating on the social media site that is illegal (such as revenge porn) or shows live violence involving self-harm or harm to innocent victims.
However, Facebook moderators are said to have raised concerns over the guidelines on sexual content, which they find "complex" and "confusing," the Guardian added.
For instance, Facebook policy allows people to live-stream attempts at self-harm, because the company "doesn't want to censor or punish people in distress."
Recently, Facebook moderators received a new memo instructing them to escalate any content related to "13 Reasons Why", a Netflix series about a high school student's suicide, after the company found a spike in teenagers mimicking such acts on its site.
Reacting to the leak, Monika Bickert, Facebook's Head of Global Policy Management, said, "Keeping people on Facebook safe is the most important thing we do. We work hard to make Facebook as safe as possible while enabling free speech. This requires a lot of thought into detailed and often difficult questions, and getting it right is something we take very seriously."
Facebook is also testing new software to intercept graphic content on its site, but it is still in its early stages, Bickert added.
To put a stop to the disturbing trend of suicides broadcast on Facebook Live, the social media giant, with the help of renowned psychiatrists, has developed an AI-based suicide prevention tool.