Facebook says its ‘oversight board’ will use outside experts to weigh in on content appeals, as it reveals details of its plan to ensure the review process ‘exercises independent judgment’
- Firm has been soliciting feedback from people at workshops in 88 countries
- Attendees at workshops agreed Facebook employees should not sit on the board
- Some proposed that board should have power to influence site’s content policies
Facebook has released the findings from thousands of consultations with outside experts weighing in on its content review process, as the company works to build an ‘external oversight board’ amid increasing public scrutiny.
The social media giant has been soliciting feedback over the past six months from more than 650 people at workshops in 88 countries on its draft plan for the board.
According to Facebook, this committee will eventually function as an independent court of appeals on content decisions.
Chief Executive Mark Zuckerberg has said decisions about acceptable speech on Facebook’s suite of social networks – used by some 2.4 billion people worldwide – should not rest in the company’s hands alone.
The company will finalize the board’s charter in August, it said.
According to the report, attendees at the workshops broadly agreed that Facebook employees should not sit on the board.
They also said the company should not be able to remove members without cause, and should clarify how it would define ‘cause.’
In a blog post published Thursday morning, Facebook said people ‘want a board that exercises independent judgment — not judgment influenced by Facebook management, governments or third parties.’
‘The board will need a strong foundation for its decision-making,’ the firm added, including ‘a set of higher-order principles — informed by free expression and international human rights law — that it can refer to when prioritizing values like safety and voice, privacy and equality.’
Other popular proposals were that the board should be able to choose its own cases; that board decisions should establish precedent for future cases; and that the board should have the power to influence Facebook’s content policies.
Attendees expressed concerns over the board’s independence, both from state actors and the company itself.
But Facebook says it will draw on suggestions from the public consultations, along with executive search firms, to ensure a fair selection process.
‘We want to make sure that we’re casting a wide net, not just looking to those experts who may already be known to us,’ Facebook said.
‘Facebook will select the first few people and those members will then help select the remaining people.’
Facebook has long faced criticism for doing too little to block hate speech, incitements to violence, bullying and other types of content that violate its ‘community standards.’
It has stepped up enforcement of those standards over the past year, employing more than 30,000 people, many of them low-paid contractors, to monitor content and improve ‘safety and security’ on its platforms.
But the company continues to struggle with high-profile controversies over content posted on its site, such as the livestreaming of a shooting that killed 51 people at two mosques in Christchurch, New Zealand, in March.