Attention capitalism has generated design processes and product development decisions that prioritize platform growth over all other considerations. To the extent that limits have been placed on these incentives, interventions have primarily taken the form of content moderation. While moderation is important for addressing what we call acute harms, societal-scale harms such as negative effects on mental health and social trust require new forms of institutional transparency and scientific investigation, which together form what we call accountability infrastructure.
This workshop, facilitated by Nathaniel Lubin, will outline a novel framework for addressing these societal-scale harms. Participants will hear from leading experts in technology and psychology, then break into smaller groups to discuss potential metrics for holding platforms accountable for their design decisions.