
Law Requiring Social Media Transparency Would Break New Ground

Social media companies should release critical information to the public and to researchers.

Published: April 6, 2022

Social media companies often shield themselves from scrutiny of the harm they cause by refusing to disclose information. When information does inadvertently seep out, it can lead to shattering revelations.

Documents released as part of a leak known as the Facebook Papers, for instance, show that internal Facebook research concluded Instagram worsened teen girls’ body image issues and mental health, potentially affecting millions of users. Thanks to leaked documents and research, we now know that the platform provides a venue for international drug cartels to recruit and for adults to prey on vulnerable children. We also know that Facebook has previously dispensed favors and special treatment to select categories of “elite” users, granting certain people a sweeping exemption from platform enforcement and content moderation actions.

Despite these and other matters of grave public concern, social media platforms maintain near-exclusive control over the spigot of information released to the public and to researchers. Facebook has clamped down on researchers attempting to study political ads on the platform. Platforms’ terms of service commonly forbid researchers from using essential investigative tools such as research accounts and software that automatically downloads, or “scrapes,” public data directly from the platforms. Existing platform-provided tools such as CrowdTangle and Facebook’s Ad Library API, meanwhile, offer tightly circumscribed and spotty information to the public and to researchers. Such tools do not exist at all for most social media platforms.

Algorithm-driven personalization of social media feeds also means that individuals are exposed to different kinds of information and have little basis to compare their online experiences. As a result, the public and policy debate about social media platforms’ impact remains woefully underinformed.

The Platform Accountability and Transparency Act

A bipartisan bill proposed by Sens. Rob Portman (R-OH), Chris Coons (D-DE), and Amy Klobuchar (D-MN), the Platform Accountability and Transparency Act contains provisions that would help lift the curtain on social media companies’ potentially harmful practices.

Organic Content Disclosure and Advertising Transparency

As currently written, the Platform Accountability and Transparency Act would permit the Federal Trade Commission to require disclosure of certain platform data directly to the public or to researchers. For platforms with at least 25 million consistent and unique monthly U.S. users, the bill would permit the FTC to require disclosure of:

  • public content on social media platforms that has been broadly disseminated according to metrics determined by the FTC
  • content “originated or spread by major public accounts,” defined as accounts with at least 25,000 followers or whose content attracts 100,000 unique views per month, as well as associated information such as audience numbers, whether platforms determined such content to have violated any company policies, and whether platforms recommended or promoted such content.

The legislation would also compel disclosure of:

  • “statistically representative samples of public content” on platforms, including “a sampling that is weighted by the number of impressions the content receives”
  • advertising content, targeting criteria, and associated information
  • information on platforms’ use of algorithms
  • information on platforms’ content moderation policies and decisions.

Information for Researchers

The bill would also compel social media platforms with at least 25 million consistent and unique monthly U.S. users to give certain data to independent researchers whose project proposals and data requests have been approved by the National Science Foundation, an independent federal agency. Such data sharing would be subject to privacy and cybersecurity protections put in place by the FTC, potentially including data anonymization or the use of secure “white rooms” for review of materials.

Liability Immunity

The bill would immunize companies against legal liability for properly releasing data under the bill. It would also immunize researchers against legal liability for research activities performed pursuant to the bill, including activities that may conflict with platforms’ terms of service.

Companies that do not comply with the bill’s disclosure requirements would face potential FTC enforcement. The bill would further amend Section 230 of the Communications Decency Act to allow qualified researchers to sue social media giants for damages when a platform’s unlawful withholding of data needed for a qualified research project significantly contributes to the researchers’ alleged harms.

•  •  •

Any social media transparency law, including any final iteration of the Platform Accountability and Transparency Act, should contain minimum standards for data anonymization and other privacy protections, should ensure that the government’s enforcement discretion is not unfettered, and should ensure that important information is provided to the public and researchers in a fair, logical, and consistent way. Coupled with measured and commonsense solutions to social media’s transparency problem, such a law would be a win for the public.