US Supreme Court frustrated and wary over legal shield for tech companies
The federal law shields online platforms from lawsuits over what their users post
International New York Times
Representative image. Credit: iStock Photo

In a case with the potential to alter the very structure of the internet, the Supreme Court did not appear ready Tuesday to limit a law that protects social media platforms from lawsuits over their users’ posts.

Over the course of a sprawling argument lasting almost three hours, the justices seemed to view the positions taken by the two sides as too extreme, presenting a choice between exposing search engines and retweets on Twitter to liability on the one hand and protecting algorithms that promote pro-Islamic State group content on the other.

At the same time, they expressed doubts about their own competence to find a middle ground.


“You know, these are not like the nine greatest experts on the internet,” Justice Elena Kagan said of the Supreme Court, to laughter.

Others had practical concerns. Justice Brett Kavanaugh worried that a decision imposing limits on the shield “would really crash the digital economy with all sorts of effects on workers and consumers, retirement plans and what have you.”

Drawing lines in this area, he said, was a job for Congress. “We are not equipped to account for that,” he said.

The federal law at issue in the case, Section 230 of the Communications Decency Act, shields online platforms from lawsuits over what their users post and the platforms’ decisions to take content down. Limiting the sweep of the law could expose the platforms to lawsuits claiming they had steered people to posts and videos that promote extremism, advocate violence, harm reputations and cause emotional distress.

The case comes as developments in cutting-edge artificial intelligence products raise profound new questions about whether old laws — Section 230 was enacted in 1996 — can keep up with rapidly changing technology.

“This was a pre-algorithm statute,” Kagan said, adding that it provided scant guidance “in a post-algorithm world.” Justice Neil Gorsuch, meanwhile, marveled at advances in AI. “Artificial intelligence generates poetry,” he said. “It generates polemics.”

The case was brought by the family of Nohemi Gonzalez, a 23-year-old college student who was killed in a restaurant in Paris during the terrorist attacks in November 2015, which also targeted the Bataclan concert hall. Eric Schnapper, a lawyer for the family, argued that YouTube, a subsidiary of Google, bore responsibility because it had used algorithms to push Islamic State group videos to interested viewers, using information that the company had collected about them.

(Published 22 February 2023, 09:13 IST)