
Disclosure, Complaints and Process: Texas H.B. 20 and Similar Bills Contain Provisions That Go Beyond Content Regulation

In what is either one of the more ironic acts in a year full of irony or one of the more expressive power moves of the Texas legislative session, Gov. Greg Abbott announced on one social media platform that people could watch a livestream on another social media platform of him signing into law a bill that will restrict the ability of social media platforms to moderate posts on their platforms. But there is more to this new law that online content providers should be aware of.

Texas H.B. 20 is the latest of a species of bills being proposed in Republican-led states that appears aimed at social media platforms that some politicians claim have silenced conservative viewpoints. The headline-grabbing section of the law prohibits a social media platform from “censoring” a user or a user’s content, or a user’s ability to receive content, based on viewpoint, regardless of whether the viewpoint was expressed on the platform or somewhere else. The term “censor” is defined very broadly, reaching nearly any action that blocks, removes, demonetizes, de-boosts or otherwise restricts almost any form of “perceivable communication.” The law becomes effective on December 2, 2021.

Almost immediately, critics of the bill pointed out the obvious potential First Amendment implications of a law that compels a private entity to host speech that violates its standards. Indeed, the bill’s authors included a three-page coda that amounts to an effort to avoid having the entire Act struck down for violating the U.S. Constitution. Notably, the coda provides that if any portion of the law is ever held unconstitutional, that portion shall be severed and all remaining portions will continue in effect.

A recent and conceptually similar Florida law prohibits, among other things, a social media platform from barring any candidate for office from its website, meaning that anyone who files qualification papers to run for public office may post any legal content they wish on a social media platform, and the platform may not restrict the content or the user. The trend has expanded this year, with bills of similar intent introduced in Kentucky, North Dakota and Oklahoma. Both the Kentucky bill and the Oklahoma bill would create civil liability for a social media platform that censors a user’s religious or political speech. The North Dakota bill would allow a suit for damages (including punitive damages) by any person whose writing, speech or publication is restricted, as well as by any person who “reasonably otherwise would have received the writing, speech or publication.”

Perhaps serving as a bellwether for similar laws, the Florida law has already hit a constitutional snag. In a 31-page ruling, a U.S. district court found that nearly every part of the law likely violates the First Amendment. This result appears to have surprised no one, and the state of Florida has already indicated that it will appeal the district court’s ruling and fully expects the issue to reach the U.S. Supreme Court.

While most coverage of Texas H.B. 20 has centered on its First Amendment implications, the law also contains significant provisions that go beyond content regulation.

While the law only applies to social media platforms that have more than 50 million active U.S. users per month, its provisions cover any user (including a banned or suspended user) who resides in Texas, does business in Texas or “shares or receives content on a social media platform” in Texas.

Under the new law, a covered social media platform will have to take the following measures:

  • Make available on its website a public disclosure regarding the platform’s content management, data management and business practices, including the manner in which it: curates and targets content to users; promotes content, services or products; moderates content; and uses algorithms to promote content.
  • Publish an “acceptable use policy” that clearly informs users about the types of content allowed on the platform and the steps the company will take to ensure compliance with the policy. The platform also must publish a biannual transparency report outlining the actions it has taken to enforce the policy. The transparency report must disclose statistics about the number of instances involving illegal or policy-violating content and how those instances were detected, as well as the number of instances in which the platform took action and what type of action was taken.
  • Implement a complaint system that allows users to report illegal content or activity and track the status of their complaints. The platform must make a good faith effort to investigate any complaint about illegal content or activity within 48 hours of receiving it.
  • Adopt a content removal process that includes notice to the user who posted content removed for violating the acceptable use policy and an appeal process for that user. When a user appeals a removal, the platform has 14 days to review the removed content, make a determination with respect to the acceptable use policy, and notify the user of that determination and any steps taken. (A minimal sketch of the complaint and appeal deadlines follows this list.)
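
For platform operators translating these requirements into their moderation tooling, the complaint and appeal provisions effectively create two tracked deadlines: a 48-hour window to begin a good faith investigation of an illegal-content complaint, and a 14-day window to review and respond to an appeal of removed content. The sketch below is purely illustrative; H.B. 20 does not prescribe any particular system design, and all names and fields in the example are hypothetical.

```python
from dataclasses import dataclass
from datetime import datetime, timedelta
from enum import Enum
from typing import Optional

# Illustrative only: the statute specifies deadlines, not a data model.
COMPLAINT_INVESTIGATION_WINDOW = timedelta(hours=48)  # good faith investigation
APPEAL_REVIEW_WINDOW = timedelta(days=14)             # review and notify the user


class ComplaintStatus(Enum):
    RECEIVED = "received"
    UNDER_INVESTIGATION = "under_investigation"
    RESOLVED = "resolved"


@dataclass
class IllegalContentComplaint:
    """A user report of illegal content or activity, trackable by the user."""
    complaint_id: str
    content_id: str
    received_at: datetime
    status: ComplaintStatus = ComplaintStatus.RECEIVED

    def investigation_due(self) -> datetime:
        # A good faith investigation should begin within 48 hours of receipt.
        return self.received_at + COMPLAINT_INVESTIGATION_WINDOW


@dataclass
class RemovalAppeal:
    """A user's appeal of content removed under the acceptable use policy."""
    appeal_id: str
    content_id: str
    filed_at: datetime
    determination: Optional[str] = None  # e.g. "violates policy" or "reinstated"

    def review_due(self) -> datetime:
        # The platform must review and notify the user within 14 days.
        return self.filed_at + APPEAL_REVIEW_WINDOW
```

Whatever the implementation, the practical point is that both clocks start at intake, so complaint and appeal records need reliable timestamps from the moment they are received.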

Although not as extensive as the provisions of the Texas law, the Florida statute also places some obligations on social media platforms that go beyond censorship of content. For instance, Florida’s law requires a social media platform to:

  • Provide a mechanism that allows a user to request the number of other users who were provided the user’s content.
  • Categorize the algorithms used for content prioritization and “shadow banning,” and allow users to opt out of such algorithmic prioritization.
  • Provide to users an annual notice on the use of algorithms for content prioritization and “shadow banning.”

Because these provisions governing public disclosures, acceptable use policies, complaint processes and content removal processes arguably do not regulate a social media platform’s speech, they are less likely to be struck down in a constitutional challenge. While it remains to be seen whether the comparable provisions of the Florida statute will survive the ongoing court challenge, covered platforms will potentially have to navigate these new operational burdens in Texas come December, regardless of how any challenges to the censorship provisions play out.

