The 27-page Facebook rule book released Tuesday offers an unprecedented insight into how the company decides what its two billion users may or may not share, and how the social media giant navigates the line between censorship and free speech. In 2016, for example, the company reversed its decision to remove a post containing the Pulitzer-winning "napalm girl" photo, which depicted a nude and burned child in the Vietnam War.
The company has chosen to add 10,000 safety, security, and product and community operations employees to its team before the year is up, and conducts weekly audits of moderation decisions to refine its policies and internal processes.
Despite being the largest social network in the world, Facebook often operates as an almost opaque box. On Monday, the company took another hit when it was sued for defamation by Martin Lewis, a British financial expert who claims his image has been used in 50 fake Facebook adverts to scam millions from vulnerable people.
Bickert said the policies would continue to evolve and acknowledged that mistakes would be made "because our processes involve people, and people are fallible". "This is our way of saying these things are not tolerated," she said. "Report them to us, and we'll remove them."
"Many of us have worked on the issues of expression and safety long before coming to Facebook". The company says it developed them in conjunction with a "couple hundred" of experts and advocacy groups representing the entire world.
Bickert said the document made public today will let people see the rationale behind each policy, giving context for its granular details.
An internal meeting is held every two weeks to review policies and update them when appropriate.
Amid a series of unfolding humanitarian crises, Facebook has been under pressure to improve content moderation around the globe.
Facebook's failure to properly clean its site of abuse and hate speech led the United Nations to blame it for inciting a possible genocide against the Rohingya minority in Myanmar.
Facebook's updated standards now list some exceptions to its ban on depictions of adult nudity, including "acts of protest", "breast-feeding", and "post-mastectomy scarring". According to training slides on hate speech, for example, the statements "You are such a Jew" or "Migrants are so filthy" are allowed, but writing "Irish are the best, but really French sucks" is not.

The company will double its 10,000-person safety, security, and product and community operations teams by the end of this year. Companies like Facebook and YouTube have fueled their rapid expansion by keeping their teams of engineers and designers famously happy and well paid; with billions of users knee-deep in their platforms' content, it only makes sense that the same mindset should apply to the people in control of it.

"I have actually had conversations where I talked about our standards and people said, 'I didn't actually realize you guys have policies,'" said Bickert. "This is not a self-congratulatory exercise. And we want to hear about that, so we can build that into our process."
Facebook also announced plans to develop a more robust process for appealing takedowns that were made in error.
If Facebook removes something you posted for one of those reasons, it will notify you about the action and give you the option to request an additional review. There will be a link to request a review, which will be carried out by a person, "typically within 24 hours", Facebook promises. The site has also been embroiled in repeated disputes over photos of breastfeeding women.