
The Facebook Oversight Board is on the cusp of deciding whether Donald Trump should be allowed to return to a platform he used to incite racist violence.

While the board ostensibly has the authority to make this decision, Facebook itself will make the final call. From the board’s inception in 2018, we’ve noted that its power is illusory. It provides cover for Facebook, a veneer of accountability, even as the company enables and promotes hate and disinformation.

The board is dysfunctional by design, which is why it did nothing over the past year even as Facebook amplified Trump’s lies about the Covid-19 pandemic. The board’s toothlessness became even more apparent as Facebook allowed Trump to repeat claims of election fraud, which set the stage for the deadly white-supremacist insurrection at the US Capitol on January 6. It was only after the world witnessed Trump’s incitement of this violent raid that the platform giant suspended his Facebook and Instagram accounts.

Facebook’s business model has benefited from the promotion of hate and lies far beyond those spread by Trump. No board decision will change that. If board members truly want to have an impact, they must all resign.

It’s not surprising that the board’s impressive roster of legal scholars and human-rights experts has been unable to rein in Facebook’s toxicity. The company designed the board to be ineffectual. This allegedly independent entity gives off the appearance of autonomy and authority, but it’s powerless to make structural changes to Facebook’s deeply flawed content-moderation process.

Facebook has claimed that the board’s decisions will be binding, but its actions don’t instill a great deal of confidence. It narrowed the initial scope of the board’s review to content removals, and only recently expanded it to content that has been left up (and only Facebook can ask the board to review other issues). In all instances, Facebook controls the entire content-review and appeals process, and a user must exhaust all their options through Facebook before appealing to the board. And the company is very opaque about how it determines what content can or cannot be appealed.

The Oversight Board takes up only a tiny fraction of the appeals funneled through Facebook, with a maximum 90-day adjudication process. It can speak with experts, commission reports, and solicit comments from the public. Ironically, while the board touts transparency as one of its main pillars, the comment process leaves much to be desired: finding its open cases on its website requires a scavenger hunt.

Finally, the board can also make policy recommendations as part of its decisions, but these are merely suggestions that Facebook can take or leave.

It’s clear that a board bankrolled by Facebook will never be allowed to address the hate and disinformation that drives the core of the company’s enragement-engagement business model. The board can’t hold Facebook accountable for its lackluster enforcement of its rules regarding world leaders. It can’t hold Facebook to account for its role in actively amplifying Trump’s rhetoric and undermining the health and safety of its users, especially people of color, women, and others who are most frequently targeted by the hate groups and disinformation mongers who’ve made Facebook their home.

Facebook controls every step of the board’s operations, creating a clever catch-22: Reinstating banned content means Facebook was “wrong,” allowing the board to claim independence. But keeping controversial content off the network could suggest that Facebook is “right” and that the board is beholden to Facebook. Ultimately the board is unable to truly effect the kind of change that would protect users and strengthen our democracy.

That’s why it’s time for the members of the Oversight Board to uphold their professional integrity and step down. By rejecting the premise that Facebook can be governed by a quasi-official entity, its members might regain some trust in their respective fields and, more importantly, remove an obstacle to getting important work done.

Facebook should focus its efforts on what we know works: creating clearer standards, enhancing transparency, and shoring up efforts to equitably enforce its rules. We all know that isn’t enough. We also need public-policy solutions that will upend this hate-for-profit business model. Facebook alone can’t, and won’t, fix itself.


WIRED Opinion publishes articles by outside contributors representing a wide range of viewpoints. Submit an op-ed at opinion@wired.com.

