Facebook is working on appointing an independent global oversight board, which will have the authority to review the social network's content decisions.
According to Ime Archibong, Facebook's VP of product partnerships, the world's biggest social networking site is putting measures in place to create a global body that will provide strategic direction on the often "challenging" and "contentious" content decisions it has to make.
As part of that process, Facebook will host consultation workshops in various parts of Africa to engage recognised experts, who will help develop guidelines for how content should be moderated and, where necessary, restricted on the platform.
The social media giant, which has 2.2 billion users, has faced backlash over the past two years from government leaders and citizens who accuse it of being used as a tool to spread fake news, propaganda and terrorist content.
A scathing British parliamentary report last month labelled Facebook a "digital gangster" that violated data privacy and competition laws through its improper collection of user data.
Facebook has always maintained it is seeking to implement processes for self-regulation and governance to help regain the trust of the public, politicians and regulatory authorities.
Archibong said in a statement that the new oversight board, as envisioned by Facebook, will consist of about 40 global experts with experience in content, privacy, free expression, human rights, journalism and safety, who will represent various regions across the globe.
"Over the next year, we will design a global body, an oversight board, which will have the authority to review some of our most challenging and contentious content decisions. Every day we grapple with our responsibility to keep our community safe while giving people freedom to express their opinions about the issues that matter the most to them," explained Archibong.
"We take this responsibility seriously and know that we don't have all the answers. We also know we must continue to learn from experts and members of our community, in particular those of you who live and work across Africa."
The board will include regional experts, among them academics and representatives of NGOs and civil society from across the world, who will exercise independent judgement when reviewing Facebook's most difficult and disputed content decisions and hold the social network publicly accountable when its decisions fall short.
As part of its information-gathering and consultation process, Facebook says it will host a workshop in Nairobi over the coming weeks, with participants from across the African continent. During the workshop, Facebook representatives will engage with this group on the hard questions of what content should stay up and what should come down.
"This should in time bring more perspective, accountability and transparency to our content decisions. The board will have the power to overrule or uphold Facebook's content decisions and will be able to recommend changes or additions to policies. Where we need to, we will supplement member expertise through consultation with geographic and cultural experts to help ensure decisions are fully informed," continued Archibong.
Social media regulation
Facebook has been facing calls for regulation from the US Congress and British privacy regulators after reports last year revealed political data firm Cambridge Analytica harvested personal data from millions of Facebook profiles without users' consent, and used the information for political advertising during the 2016 US presidential election.
The Cambridge Analytica scandal affected over 50 million users and prompted several apologies from chief executive Mark Zuckerberg, who promised to take tougher steps to restrict developers' access to user information.
Academics at the University of Oxford in the UK and Stanford University in the US issued a research report consisting of a list of guidelines and recommendations for Facebook, titled: "Glasnost! Nine Ways Facebook Can Make Itself a Better Forum for Free Speech and Democracy".
One of the recommendations of the report, released in January, was that Facebook should hire more culturally diverse content reviewers.
The report notes: "Facebook still has too little content policy capacity to meet legitimate, and sometimes urgent, public interest and human rights concerns in many different countries. Similar problems have been reported in Sri Lanka, Libya and the Philippines, where content that was not just hate speech but dangerous speech was left up for too long, often with disastrous consequences."
The report also suggested Facebook should tighten its community standards wording on hate speech, increase decisional transparency, expand and improve the appeals process, and create an external content policy advisory group.
In 2017, Facebook and Twitter faced pressure in the US and Europe to tackle extremist content on their platforms more effectively.
Facebook responded by saying it was already removing 99% of content related to militant groups Islamic State and al Qaeda before being told to do so.
Lead author of the Glasnost report, professor Timothy Garton Ash, explains that while industry-wide self-regulation should be actively pursued, attaining it will be a "long and complex task".
"In the meantime, there is a great deal that a platform like Facebook can do right now to address widespread public concerns, and to do more to honour its public interest responsibilities as well as international human rights norms.
"Executive decisions made by Facebook have major political, social and cultural consequences around the world. A single small change to the News Feed algorithm, or to content policy, can have an impact that is both faster and wider than that of any single piece of national legislation," asserted Garton Ash.
As South Africa prepares for its general elections on 8 May, Akua Gyekye, Facebook's public policy manager for Africa elections, said in a blog post that Facebook is working to reduce the spread of misinformation, protect election integrity and support civic engagement across Africa.