This article is part of the On Tech newsletter. You can sign up here to receive it on weekdays.
Facebook’s new “Supreme Court” takes on its biggest case: Donald Trump.
The company’s recent decision to freeze Mr. Trump’s account after he incited a mob has been controversial, to say the least. On Thursday, the company asked its independent oversight board to review that decision and settle, once and for all, whether the former president will be allowed to return to Facebook and Instagram, which Facebook owns.
Let me explain what this oversight body will do, and what its benefits and limitations are:
An independent referee is good. Up to a point: In 2019, Facebook outlined plans for a court-like body that would review prominent cases in which the company may have made a mistake in applying its rules against hate speech, incitement to violence or other abuses.
Many people, including Facebook’s chief executive, Mark Zuckerberg, are uncomfortable with the idea that Facebook has unchecked power to silence world leaders and shape online discourse. The oversight board, whose decisions Facebook says are binding, provides a measure of independent accountability for the site’s decisions.
The Trump suspension is by far the largest case for the oversight body, which is made up of outside experts and only recently selected its first cases for review. The verdict will be closely watched and will affect the legitimacy of this new measure of Facebook justice.
(For more information, see this post by Evelyn Douek, a law lecturer and SJD candidate at Harvard Law School who studies online language regulation.)
Is it time to change the policies on world leaders? The oversight board is also being asked to consider a question that goes well beyond Mr. Trump: Should Facebook continue to give world leaders more leeway than the rest of us?
Both Facebook and Twitter allow top officials to post hateful or untrue things that would get the rest of us blocked or our posts deleted. The principle behind this is solid: What world leaders say is a matter of public concern, and the public should be able to see and evaluate their views without a filter.
However, there are real-world tradeoffs when powerful people have a megaphone to say whatever they want.
In Myanmar, military leaders used Facebook to instigate genocide against the mostly Muslim Rohingya minority. In India, a prominent politician threatened the destruction of mosques and called Muslims traitors in his Facebook posts. The Iranian Ayatollah Ali Khamenei has called for the destruction of Israel on Twitter. And on social media sites, Trump and Philippine President Rodrigo Duterte have alluded to shooting their own citizens.
These world leaders can say, and often do say, the same things on television or in press releases, but there journalists usually have the opportunity to provide context and responses.
Greg Bensinger, a member of The New York Times editorial board, recently argued that social media companies’ policies on world leaders are backward. If anything, there should be more rules, not fewer, for global leaders on Facebook and Twitter, he said.
What the oversight board says on this issue could reset a crucial global policy.
What about the other billion people? Every year Facebook makes billions of decisions about people’s posts, but the oversight board will likely consider only dozens of high-profile disputes.
The board will not help the many millions of people, with far less power than Mr. Trump, whose voices have been silenced by a decision Facebook made or failed to make.
This includes companies and people whose Facebook accounts are blocked and who cannot get anyone in the company to pay attention. A teen who is harassed on Facebook and leaves the site has no one to intervene on their behalf. And Rohingya who have been slaughtered in their homes cannot turn to this board.
The board’s decision on Mr. Trump could affect how online forums treat world leaders. But the fact remains that for most Facebook users, the company is the first and the last word on what people may or may not say. And Facebook is rarely held accountable for the consequences.