The Institute for Rebooting Social Media (RSM) partnered with the Oversight Board to co-organize “Now and Next: Platform Accountability and Content Governance.” The Oversight Board, an independent body established by Meta in 2020 to review the company’s content moderation decisions across its platforms, joined us at Harvard Law School to discuss its decision-making process, the challenges its members face, and suggestions for how content moderation and governance can be improved in the future.
The two-day event kicked off with a closed-door meeting on September 21, featuring Oversight Board members, staff, and liaisons from Meta, along with leading practitioners and thinkers in the fields of technology and regulation. Attendees gathered at RSM’s headquarters in the Lewis International Law Center to discuss the history and future of the Board in conversations facilitated by RSM Faculty Directors Jonathan Zittrain and James Mickens. Sessions were also dedicated to AI, international regulation, and the impact of social media on elections.
Day two was open to the public in Harvard’s renowned Ames Courtroom, giving attendees the opportunity to question Board members and Meta staff directly about their practices and the Board’s effectiveness in holding social media companies accountable for the content hosted on their platforms. The day was divided into three sessions: “Three Years In: The Oversight Board and Platform Accountability,” “A Conversation on the Future of Online Governance,” and a live mock deliberation by the Board, which gave audience members a glimpse into how the Board selects its cases and what the appeal process looks like.
An overarching point of contention across the event was that, despite the Oversight Board’s stated commitment to upholding free speech, the Board ultimately depends on Meta’s continued existence as a company. The debates also made evident that there is often a disconnect between the Board’s theoretical approach and the actual practice of content moderation, a gap that can leave people around the globe vulnerable to rights violations or harmful content from ill-behaved individuals, groups, and governments alike.
By the end of the two days, it was clear that the Board’s efforts over the last three years have indeed paved a path toward greater platform accountability, but that plenty of obstacles still stand in the way of effective governance. Shining a light on how decision makers and industry leaders reach choices that affect users around the globe is therefore of the utmost importance. The organizers hope this event is just the beginning of an ongoing conversation on how to improve our collective experience across the Internet.
Watch all three public sessions below: