From Tech Policy Press:

A little more than a year ago, in the first article announcing the release of the Facebook Files, the documents brought out of the company by whistleblower Frances Haugen, the Wall Street Journal’s Jeff Horwitz reported on Cross Check, a Facebook system that “exempted high-profile users from some or all” of the platform’s rules. The program shields millions of high-profile users from normal content moderation enforcement. While the existence of such a program was known, its scale was, and perhaps still is, shocking.

Following the Journal’s reporting and the subsequent public concern, Facebook (now Meta) President of Global Affairs Nick Clegg announced that the company would request a policy advisory opinion from its independent Oversight Board. Fourteen months later, the Oversight Board has completed its review and published its opinion.

To talk more about the opinion, the Cross Check system, and the problem of content moderation more generally, I’m joined by one member of the Oversight Board, Nighat Dad, a lawyer from Pakistan and founder of the Digital Rights Foundation; and one outside observer who answered the board’s call for opinions about the Cross Check system, R Street Institute senior fellow and University of Pennsylvania Annenberg Public Policy Center distinguished research fellow Chris Riley.

https://open.spotify.com/episode/3UW3sYTqaS9XceI9hBZoGJ?si=afc2373c4c414441
