ARTICLE
23 April 2018

Mark Zuckerberg, FOSTA-SESTA, And The Challenges Of Content Moderation

Foley Hoag LLP

In his testimony before Congress last week, Facebook CEO Mark Zuckerberg observed that, on issues ranging from fake news to hate speech, the company "didn't take a broad enough view of our responsibility, and that was a big mistake."

Looking ahead, it remains to be seen what a "broad enough view" means for companies that both host and moderate online content. When the content that you and I see on various websites is determined by a complex ecosystem of content writers, human moderators, algorithms, and artificial intelligence, it will not be easy to get the balance right. We expect companies to provide open platforms and to protect free expression while also taking action to address harmful content in all its various forms. It does seem as if we are at a moment where the normative expectations for online service providers are shifting, such that intermediaries will be expected to take a "broader" view. These expectations present significant operational challenges, especially for companies with a global reach.

In this context, it is notable that on April 11, the same day that Mr. Zuckerberg testified before the U.S. House of Representatives, President Trump signed FOSTA-SESTA into law, thereby enacting controversial amendments to Section 230 of the Communications Decency Act. The amendments removed liability protections for online platforms that knowingly assist, support, or facilitate sex trafficking. Most observers agree that the policy objectives at issue in FOSTA-SESTA are laudable. That said, there are significant questions about how companies that host content will respond to FOSTA-SESTA.

With new concerns about liability, will companies aggressively moderate content to avoid potential litigation? How can such efforts be calibrated to address real risks? Or will companies steer clear of content moderation due to concerns that such activities may support a finding that they "knowingly" facilitated trafficking activity if permitted content is ultimately found to be linked to wrongful acts? How can good faith corporate efforts be rewarded?

The answers to these questions may depend on the size of each company, the resources that any individual company can apply to content moderation efforts, and future case law. More generally, as governments around the world seek to enact new requirements specific to content moderation — with a mix of both commendable and repressive policy objectives — we will likely see events in the United States this month influence both rhetoric and regulation in other countries.


The content of this article is intended to provide a general guide to the subject matter. Specialist advice should be sought about your specific circumstances.


