Whether Google can be held liable for recommending harmful content on YouTube is at the heart of a watershed case that could shape the future of the internet.

Credit: Magdalena Petrova

The US Supreme Court today heard oral arguments from lawyers representing Google, the Department of Justice, and the family of a 23-year-old woman killed in Paris by terrorists in 2015. The case, Gonzalez v. Google, represents a crucial legal landmark in how the US legal system holds large technology platforms like Google responsible for the content they host.

The family of Nohemi Gonzalez argues that Google acted as a recruiting platform for the Islamic State group, which the US State Department designates as a terrorist organization. By recommending Islamic State-related videos on YouTube, the family argues, Google violated US laws against providing aid to terrorist groups. Google, however, has argued that it is legally immune from such suits thanks to Section 230 of the Communications Decency Act, which shields internet-based companies from liability for user-generated content.

The hearing was a contentious one, with the assembled justices peppering the lawyers for each party with questions. Google’s lawyer, Lisa Blatt, argued strenuously that algorithmically generated content recommendations are covered by Section 230, and that the legal immunity the law provides is a fundamental building block of the modern internet. Without Section 230, Blatt said, every content-driven platform on the internet, from Yelp to Zillow to Amazon, would be liable for each and every piece of content it hosts.

Google says eliminating liability protections threatens the internet

“Exposing websites to liability for implicitly recommending third-party content defies the text [of Section 230] and threatens today’s internet,” she said.
The thrust of Google’s defense was echoed and backed up by multiple briefs filed with the Supreme Court by big tech companies including Microsoft, Twitter, and Facebook parent company Meta.

The lawyer for the Gonzalez family, University of Washington law professor Eric Schnapper, argued that recommendations provided by platforms like YouTube are essentially editorial choices: those platforms could have been designed so that they don’t surface or recommend harmful or defamatory content, but they were not. The decision to let YouTube recommend harmful content, therefore, is one that the platform providers made consciously, which means they should be held accountable for its publication.

“In some circumstances, the manner in which third-party content is organized or presented could convey other information from the defendant itself,” he said, underscoring the point that the ability to provide recommendations is not necessarily neutral.

Twitter liability case also goes before Supreme Court

In the Gonzalez case, as well as the closely related matter of Twitter v. Taamneh, which is scheduled for a hearing tomorrow, the stakes are high. Any finding that large tech companies are liable for the content they promote or recommend, even in an automated, algorithmic way, could represent a massive sea change in the way tech giants operate. In the Taamneh case, the family of a Jordanian national killed in a terrorist attack alleges that Twitter wasn’t sufficiently aggressive in prohibiting the Islamic State group from using its platform; it raises a similar “aiding-and-abetting” question to the one in Gonzalez.

Liability for user-generated content could have any number of follow-on effects, from vastly increased oversight and heavier restrictions by internet-based companies to simply invalidating the business model of companies that rely on user-generated content to function.
The justices seemed concerned that any change to Section 230 could generate a wave of new lawsuits against big tech. “Really anytime you have content, you [would] also have these presentational and prioritization choices that can be subject to suit,” said Associate Justice Elena Kagan.

A decision is expected by the time the court’s term ends in June.