Facebook releases secret content rules
Facebook has released the long-form rule book that guides what is and isn't acceptable to post on its pages, giving far more detail than ever before on what is permitted on subjects ranging from drug use and sex work to bullying, hate speech and inciting violence.
For years the social media network has had "community standards" for what people can post. But only a relatively brief and general version was publicly available, while it used a far more detailed internal document to decide when individual posts or accounts should be removed.
Now, the company is providing the longer document on its website to clear up confusion and be more open about its operations, Monika Bickert, Facebook's vice president of product policy and counter-terrorism, said.
"You should, when you come to Facebook, understand where we draw these lines and what's OK and what's not OK," Bickert told reporters in a briefing at Facebook's headquarters.
Facebook has faced fierce criticism from governments and rights groups in many countries for failing to do enough to stem hate speech and prevent the service from being used to promote terrorism, stir sectarian violence and broadcast acts including murder and suicide.
At the same time, the company has also been accused of doing the bidding of repressive regimes by aggressively removing content that crosses governments and providing too little information on why certain posts and accounts are removed.
New policies will, for the first time, allow people to appeal a decision to take down an individual piece of content. Previously, only the removal of accounts, Groups and Pages could be appealed.
Facebook is also beginning to provide the specific reason why content is being taken down for a wider variety of situations.
Facebook, the world's largest social network, has become a dominant source of information in many countries around the world. It uses both automated software and an army of moderators that now numbers 7,500 to take down text, pictures and videos that violate its rules. Under pressure from several governments, it has been beefing up its moderator ranks since last year.
Reuters