The US Supreme Court on Thursday refused to clear a path for victims of attacks by militant organizations to hold social media companies liable under an anti-terrorism law for failing to stop the groups from using their platforms, handing a victory to Twitter.
The court in a separate case involving Google sidestepped a bid to weaken legal protections for internet companies.
The justices in a 9-0 decision reversed a lower court’s ruling that had revived a lawsuit against Twitter by the American relatives of Nawras Alassaf, a Jordanian man killed in a 2017 attack during a New Year’s celebration at an Istanbul nightclub claimed by the Islamic State militant group.
The case was one of two that the Supreme Court weighed in its current term aimed at holding internet companies accountable for contentious content posted by users – an issue of growing concern for the public and US lawmakers.
The justices on Thursday, in a similar case against Google-owned YouTube, part of Alphabet, sidestepped making a ruling on a bid to narrow a federal law protecting internet companies from lawsuits for content posted by their users – called Section 230 of the Communications Decency Act.
The justices, in a brief and unsigned ruling, returned to a lower court a lawsuit by the family of Nohemi Gonzalez, a 23-year-old college student from California who was fatally shot in an Islamic State attack in Paris in 2015. The lower court had thrown out the lawsuit.
The Istanbul massacre on Jan. 1, 2017, killed Alassaf and 38 others. His relatives accused Twitter of aiding and abetting the Islamic State, which claimed responsibility for the attack, by failing to police the platform for the group’s accounts or posts in violation of a federal law called the Anti-Terrorism Act that permits Americans to recover damages related to “an act of international terrorism.”
Twitter and its backers had said that allowing lawsuits like this could threaten internet companies with liability for providing widely available services to billions of users because some of them may be members of militant groups, even as the platforms regularly enforce policies against terrorism-related content.
The case hinged on whether the family’s claims sufficiently alleged that the company knowingly provided “substantial assistance” to an “act of international terrorism” that would allow the relatives to maintain their suit and seek damages under the anti-terrorism law.
After a judge dismissed the lawsuit, the San Francisco-based 9th US Circuit Court of Appeals in 2021 allowed it to proceed, concluding that Twitter had refused to take “meaningful steps” to prevent Islamic State’s use of the platform.
Conservative Justice Clarence Thomas, who authored the ruling, said the allegations made by the plaintiffs were insufficient because they “point to no act of encouraging, soliciting or advising the commission” of the attack.
“Rather, they essentially portray defendants as bystanders, watching passively as ISIS carried out its nefarious schemes,” Thomas added.
President Joe Biden’s administration supported Twitter, saying the Anti-Terrorism Act imposes liability for assisting a terrorist act and not for “providing generalized aid to a foreign terrorist organization” with no causal link to the act at issue.
In the Twitter case, the 9th Circuit did not consider whether Section 230 barred the family’s lawsuit. Google and Meta’s Facebook, also defendants, did not formally join Twitter’s appeal.
Islamic State called the Istanbul attack revenge for Turkish military involvement in Syria. The main suspect, Abdulkadir Masharipov, an Uzbek national, was later captured by police.
Twitter in court papers has said that it has terminated more than 1.7 million accounts for violating rules against “threatening or promoting terrorism.”
The case against Google involved the scope of a 1996 US law called Section 230 of the Communications Decency Act, which provides safeguards for “interactive computer services” by ensuring they cannot be treated for legal purposes as the “publisher or speaker” of information provided by users.
The family had argued that YouTube provided illegal assistance to the Islamic State, which claimed responsibility for the attack, by recommending the militant group’s content to users.
In their brief ruling on Thursday, the justices wrote that they “decline to address the application of (Section 230) to a complaint that appears to state little, if any, plausible claim for relief.”