Law Requiring Social Media Transparency Would Break New Ground

Social media companies should release critical information to the public and to researchers.

Published: April 6, 2022

Social media companies often shield themselves from scrutiny of the harm they cause by refusing to disclose information. When information does inadvertently seep out, it can lead to shattering revelations.

Documents released as part of a leak known as the Facebook Papers, for instance, show that internal Facebook research concluded Instagram worsened teen girls’ body image issues and mental health, potentially affecting millions of users. Thanks to leaked documents and research, we now know that the platform provides a venue for international drug cartels to recruit and for adults to prey on vulnerable children. We also know that Facebook has previously dispensed favors and special treatment to select categories of “elite” users, granting certain people a sweeping exemption from platform enforcement and content moderation actions.

Despite these and other matters of grave public concern, social media platforms maintain near-exclusive control over the spigot of information released to the public and to researchers. Facebook has clamped down on researchers attempting to study political ads on the platform. Platforms’ terms of service commonly forbid researchers from using essential investigative tools such as research accounts and software that automatically downloads, or “scrapes,” public data directly from the platforms. Existing platform-provided tools such as CrowdTangle and Facebook’s Ad Library API, meanwhile, offer tightly circumscribed and spotty information to the public and to researchers. Such tools do not exist at all for most social media platforms.

Algorithm-driven personalization of social media feeds also means that individuals are exposed to different kinds of information and have little basis to compare their online experiences. As a result, the public and policy debate about social media platforms’ impact remains woefully underinformed.

The Platform Accountability and Transparency Act

A bipartisan bill proposed by Sens. Rob Portman (R-OH), Chris Coons (D-DE), and Amy Klobuchar (D-MN), the Platform Accountability and Transparency Act contains provisions that would help lift the curtain on social media companies’ potentially harmful practices. 

Organic Content Disclosure and Advertising Transparency

As currently written, the Platform Accountability and Transparency Act would permit the Federal Trade Commission (FTC) to require disclosure of certain platform data directly to the public or to researchers. For platforms with at least 25 million consistent and unique monthly U.S. users, the bill would permit the FTC to require disclosure of:

  • public content on social media platforms that has been broadly disseminated, according to metrics determined by the FTC, and
  • content “originated or spread by major public accounts” (that is, accounts with at least 25,000 followers or whose content attracts 100,000 unique views per month), as well as associated information such as audience numbers, whether platforms determined such content to have violated any company policies, and whether platforms recommended or promoted such content.

The legislation would also compel disclosure of:

  • “statistically representative samples of public content” on platforms, including “a sampling that is weighted by the number of impressions the content receives,”
  • advertising content, targeting criteria, and associated information,
  • information on platforms’ use of algorithms, and
  • information on platforms’ content moderation policies and decisions.

Information for Researchers

The bill would also compel social media platforms with at least 25 million consistent and unique monthly U.S. users to provide certain data to independent researchers whose project proposals and data requests have been approved by the National Science Foundation, an independent federal agency. Such data sharing would be subject to privacy and cybersecurity protections put in place by the FTC, potentially including data anonymization or the use of secure “white rooms” for reviewing materials.

Liability Immunity

The bill would immunize companies against legal liability for properly releasing data under its provisions. It would also immunize researchers against legal liability for research activities performed pursuant to the bill, including activities that may conflict with platforms’ terms of service.

Companies that do not comply with the bill’s disclosure requirements would face potential FTC enforcement. The bill would further amend Section 230 of the Communications Decency Act to allow qualified researchers to sue social media giants for damages when a platform’s unlawful withholding of data needed for a qualified research project significantly contributes to the harms the researchers allege.

•  •  •

Any social media transparency law, including any final iteration of the Platform Accountability and Transparency Act, should contain minimum standards for data anonymization and other privacy protections, should ensure that the government’s enforcement discretion is not unfettered, and should ensure that important information is provided to the public and researchers in a fair, logical, and consistent way. A law built on such measured, commonsense solutions to social media’s transparency problem would be a win for the public.