The US Supreme Court Tuesday heard oral arguments in Gonzales v. Google, a case that could upend how social media companies handle content distribution. At the heart of the case is a question of whether tech giants like Google can face liability when their algorithm recommends ISIS recruitment videos. That said, the case also has broader civil rights implications because, as one of the amicus briefs filed in the case pointed out, “As society has moved online, so too have discrimination, redlining, voter suppression, and harassment.”
The statute at issue before the court is Title 47, section 230(c) of the US Code, originally enacted as section 230 of the Communications Decency Act. The statute broadly shields online platforms from liability for content posted to the platform by its users. As Justice Elena Kagan described, the issue before the court Tuesday concerned figuring out “how this statute applies–the statute which was a pre-algorithm statute applies in a post-algorithm world.”
In oral arguments, attorney for the plaintiffs Eric Schnapper claimed that YouTube’s behavior fell outside the protection of section 230(c). Because of this, Schnapper argued, YouTube should face liability for allegedly aiding and abetting ISIS in its recruitment efforts. Normally, section 230(c) prevents lawsuits against social media companies over content moderation and distribution decisions. Schnapper argued this case is different because YouTube’s algorithm is “affirmatively recommending or suggesting ISIS content,” which he distinguished from “mere inaction.”
Arguing on behalf of the US, Malcolm Stewart agreed in part and disagreed in part with Schnapper. Stewart did not adopt the plaintiffs’ view of section 230(c). Instead, Stewart urged the court to “distinguish carefully between liability for the content itself, [and] liability for statements about the content.” That said, Stewart conceded that a company’s organizational decisions may still be subject to suit, outside of section 230(c)’s coverage.
Lisa Blatt argued on behalf of Google and stood in opposition to Schnapper. Blatt argued that “[s]ection 230(c)(1)’s 26 words created today’s internet.” Under section 230(c)(1), Blatt argued, websites cannot be treated as the “publisher or speaker of any information provided by another.” Following this reasoning, Blatt said that when websites like Google communicate third-party information–like ISIS recruitment videos–and “the plaintiff’s harm flows from that information,” section 230(c)(1) protects Google from liability.
The justices, particularly Justice Clarence Thomas, seemed skeptical of Schnapper’s arguments. The justices repeatedly raised concerns over the flood of litigation that could occur if the court changed the interpretation of section 230(c). Thomas asked whether YouTube used a different algorithm to recommend ISIS recruitment videos than it used to recommend cooking videos. This captured a question the court repeatedly raised: if YouTube’s algorithm operates neutrally regardless of the content it recommends, how is the plaintiffs’ claim any different from previous claims denied by the court?
In an amicus brief to the court, David Brody, Managing Attorney of the Digital Justice Initiative at the Lawyers’ Committee for Civil Rights Under Law, emphasized that the court has the potential to shape online civil rights law. “If platforms are more concerned with liability, they are likely to turn up the dial and be even less permissive of controversial speech,” Brody said. “The goal is to thread the needle.” Brody argued that the way to do this is to adopt the “consensus test,” a two-part test that lower courts have largely adopted up until this point. The test asks courts to consider: (1) does the section 230(c) claim against a party seek to treat the platform as a publisher; and (2) if so, is the platform materially contributing to the illegality? Brody emphasized:
What’s important for the court to do here is to put a little meat on the bones about what it means for a claim to treat someone as a publisher, or what it means for someone to materially contribute illegality…. Because when these tests are properly applied, they don’t sweep in everything under the sun and give platforms blank check immunity, but they also don’t open the floodgates to litigation.
Through its ruling, the court has the opportunity to clarify the issue and protect against potential civil rights violations, an issue that has grown more complicated in recent years as the internet has expanded beyond the confines originally contemplated by section 230(c).