Court filings show how Amazon Web Services is using Section 230 as a legal sword against Parler

[Editor’s Note: This story originally appeared in journalist Eli Sanders’ “Wild West” newsletter, which covers internet-related legal issues. Subscribe here.]

The social media platform Parler marketed itself as a place where Americans could “speak freely” and avoid the fear of being “deplatformed.” Its investors included conservative donor Rebekah Mercer, who described Parler as “a beacon to all who value their liberty.” But Parler was never a wholly independent island of absolutely free speech.

Before Parler’s August 2018 launch, according to court documents, its leaders signed the nascent site up for Amazon’s cloud-based web hosting services. In practical terms, this meant that a platform that billed itself as “the world’s town square,” and that ended up being known mostly as a gathering spot for right-wing extremists, was both operationally and contractually linked to a major player in cloud computing headquartered in Seattle, one of the country’s most liberal cities.

Over the past several months, as extremist venom and violent talk surged on Parler, the perils and opportunities of this symbiotic relationship appeared to hit home for certain Amazon employees. On Nov. 17, as President Trump was ramping up efforts to delegitimize and overturn his election loss, an Amazon Web Services employee sent an email to a Parler contact. The platform’s recent surge in user growth was “exciting,” the Amazon employee wrote, but “specific examples of potential hate speech and incitement of violence” on Parler were prompting questions at Amazon about Parler’s content-moderation policies.

Those content-moderation policies were exceedingly lax compared to the rules at Facebook and Twitter, and enforcement of Parler’s policies relied on “juries” of users, a system that ultimately produced a backlog of some 26,000 reported Parler violations awaiting crowd-based adjudication.

In that Nov. 17 email, as an example of one piece of concerning speech on Parler, the Amazon employee attached a screenshot of a racist, pro-Trump post that attacked Michelle Obama using vile language. The post linked to a video in which Michelle Obama had criticized Trump while endorsing Joe Biden, and the author of the post asked why Michelle Obama was “telling white people what to do.” In the comments attached to the post, a Parler user had written: “The only good democrat is a dead one. Kill ’em ALL!”

A representative of Parler replied to Amazon that the post, “hateful as it is, would not be deemed a violation of our terms of service.”

That exchange, chronicled in recent court filings, occurred seven weeks before right-wing extremists, some of whom reportedly used Parler to organize, stormed the US Capitol on Jan. 6 in a violent insurrection that left five people dead. Before, during, and after that riot, more than 500 on-the-scene videos were uploaded to Parler showing the chaos and harm from that day. The assault prompted Trump’s second impeachment and has fueled an ongoing maelstrom of recriminations and responses.

Between the email to Parler on Nov. 17 and Amazon’s decision to suspend the platform from its cloud services on Jan. 10, Amazon reported to Parler “more than 100 additional representative pieces of content advocating violence,” according to court filings. Those Parler posts included calls for civil war, rape, the assassination of political leaders, and the murder of police officers. They also urged, or wished for, the violent death of a long list of regular right-wing targets, including tech leaders, Stacey Abrams, Georgia Secretary of State Brad Raffensperger, African-Americans, Jews, and teachers.

Parler executives and many conservatives have cast Parler’s ejection from the Amazon cloud as the most recent example of Big Tech’s power to limit political speech. In truth, no one has a free speech right to use a private business’s digital megaphone. This episode does, however, raise questions about the concentration of tech power and the civic responsibility of tech leaders. Those questions seem likely to be taken up when the new Congress convenes, and among them is this one: Why did it take seven weeks, five deaths, and a historic wound to American democracy for Amazon to suspend Parler for violating Amazon’s policies on incitement to violence?

At present, the closest thing we have to an answer comes from a federal court hearing last week in the still-unfolding case of Parler vs. Amazon. During that hearing, Ambika Doran, an attorney for Amazon, told a federal judge in Seattle that “the events of Jan. 6 changed the way we think about the world. It took what was merely hypothetical and made it chillingly real.”

Amazon, Parler and Section 230

Parler is asking the federal judge, Barbara Rothstein, for an emergency ruling that would effectively reinstate Parler’s access to the Amazon cloud. Judge Rothstein said last week that she’ll rule “as quickly as possible.”

No matter which way Judge Rothstein rules, the filings in Parler vs. Amazon have already added an intriguing new layer to the ongoing debate over Section 230, the foundational law of the internet era that was being targeted for repeal or revision long before the Jan. 6 attack.

Section 230 gives any provider of “an interactive computer service” broad immunity from lawsuits over content moderation. Until now, the law has almost exclusively been debated in the context of major social media platforms, with Facebook and Twitter usually cited as prime examples of the wide, unchecked power this law gives tech giants to disappear, label, or “throttle” digital speech. The situation with Parler involves a smaller player in the social media game and its business relationship with an often invisible giant of cloud computing, and it therefore offers a different angle on the consequences of Section 230.

Experts frequently describe how Section 230 hands tech platforms both a “sword” and a “shield” to use when dealing with problematic content. The sword is immunity from lawsuits for any digital platform that decides to slice out (or block, or label) offensive content. The Section 230 shield is another type of immunity, this one from lawsuits over content that digital platforms decide to leave up—or never even know was posted in the first place. Except in certain circumstances, posted content can be illegal, obscene, or even an incitement, and the platforms will still have Section 230’s shield immunity. Only the user responsible for the content can be brought before a judge.

In branding itself as the place where users can say anything without fear of “deplatforming,” Parler strongly embraced the shield aspect of Section 230 (arguing that what its users said was almost always their problem). At the same time, it largely laid down the sword of content moderation. This made Parler a controversial and, for right-wing extremists, enticing platform.

Until Amazon’s recent actions against Parler, the debate over Section 230 hadn’t much focused on deployment of the content moderation sword by essential components of the online experience that aren’t social media providers — components one might call, in lay terms, the platforms beneath the platforms.

Amazon, with its cloud-based web hosting, is a kind of platform beneath Parler. Through a similar arrangement, Amazon also helps Twitter operate. (This led Parler to allege, in its federal antitrust and business-interference lawsuit, that Amazon’s true motive for booting Parler was to help Twitter. Amazon denies this.)

By running to federal court to complain it’s been wrongly deplatformed by Amazon, Parler now finds itself making the same argument aired by users who migrated to Parler after being deplatformed by Facebook or Twitter.

Amazon cites immunity under statute

In response, Parler is hearing from Amazon’s attorneys the same thing those Facebook or Twitter users would likely hear if they ever told a judge they’d been unlawfully silenced by one of this country’s leading social media platforms: Section 230 means your claims will go nowhere.

In Amazon’s response to Parler’s lawsuit, it takes only one paragraph for Amazon lawyers to explain why Section 230 firmly allows the company to suspend Parler for “objectionable” content:

[Excerpt from Amazon’s reply to Parler’s lawsuit.]

In addition, Amazon cites the terms of service Parler accepted when it signed up for Amazon’s web hosting services. Amazon’s “Acceptable Use Policy,” which Parler agreed to, prohibits content that violates “the rights of others, or that may be harmful to others.”

In a declaration filed in the suit, Parler CEO John Matze complains that Amazon’s actions have effectively made his company a pariah, causing many of Parler’s business partners and vendors to “withdraw potential financing and infrastructural services.” He ticks through a number of those lost services, including services provided by Stripe, Slack, and American Express, and laments the fallout in grim terms.

It’s not clear how this grim take squares with new reports that Parler is back online. But in his court declaration, Matze, who as CEO of Parler should be pretty familiar with Section 230, seems as shocked as anyone that an internet infrastructure provider like Amazon Web Services can, just like social media platforms, deploy Section 230’s “sword.” But unless Parler’s lawyer can somehow convince Judge Rothstein that Amazon Web Services is not, in fact, a provider of “an interactive computer service,” it would seem that Amazon is very likely to be covered by Section 230 immunity for the Parler suspension.

Should Judge Rothstein agree that’s the case, or should she simply reject Parler’s suit without getting to the merits of Amazon’s Section 230 defense, expect a new era of focus on the enablers of violent, harmful, and illegal speech that sit at various unseen layers beneath social media platforms, in what tech geeks call “the stack” of services and infrastructure that make each user’s internet experience possible.

As former Facebook data scientist Roddy Lindsay wrote recently:

It is only a matter of time before support for infrastructure-level content moderation extends even further down the stack—to companies that provide the backbone of the internet and to consumer ISPs. There are already calls for Verizon and Comcast to remove One America News and Newsmax from their TV offerings; blocking their websites is a logical next step.

If one early promise of the internet was that we’d all be wonderfully connected, then one early consequence of the Jan. 10 Parler suspension and the ongoing Parler vs. Amazon lawsuit may be a dawning awareness that for any one piece of internet speech, many different companies can be connected in a chain of alleged complicity. Some of the links in such a chain, like Amazon, wield considerable power when they decide they no longer want to support certain online speech and feel the need to remove themselves from the chain.

Source: https://www.geekwire.com/2021/court-filings-show-amazon-web-services-using-section-230-legal-sword-parler/