Facebook has once again been hit with a lawsuit by someone seeking to hold the company responsible for its users' activities.
This newest case, filed Thursday, was brought by Angela Underwood Jacobs, whose brother -- federal security officer Dave Patrick Underwood -- was fatally shot in Oakland, California, during a 2020 protest over the murder of George Floyd.
Jacobs' complaint, like other lawsuits against the social networking platform, takes aim at Facebook's algorithms -- including ones that make recommendations to users.
“The shooting was not a random act of violence,” Jacobs alleges in papers filed in Alameda County Superior Court in California, against Facebook's parent company, Meta. “It was the culmination of an extremist plot hatched and planned on Facebook by two men who Meta connected through Facebook’s groups infrastructure and its use of algorithms designed and intended to increase user engagement and, correspondingly, Meta’s profits.”
The alleged shooter, Steven Carrillo, is believed to be affiliated with the “boogaloo” extremist movement.
He allegedly conspired with another Facebook user, Robert Alvin Justus, Jr., who, according to the complaint, was led “down a road toward extremism” by the social networking service.
“Carrillo and Justus only met on Facebook because Meta recommended that Justus join groups dedicated to promoting the boogaloo movement,” the complaint states.
Jacobs claims that Facebook acted negligently by “aiding the growth of boogaloo groups,” and that it negligently designed its site “to promote and engage its users in boogaloo-related groups.”
Jacobs isn't the first to sue Facebook over a crime. She isn't even the first to sue Facebook over a death related to the George Floyd protests.
In September 2020, Facebook was sued over violence in Kenosha, Wisconsin, where two people protesting police violence were fatally shot after right-wing militia groups descended on the city.
In that case, five people -- including Hannah Gittings, whose boyfriend, Anthony Huber, was killed -- alleged that “white racist militias” used Facebook “to broadcast and publicize a 'call to arms' for untrained private citizens to travel across state lines to the peaceful protests advocating for racial justice in America with assault rifles, tactical gear, and militia grade equipment.”
The lawsuit was withdrawn without explanation several months later, before a judge made any substantive rulings in the dispute.
In numerous other cases, people have alleged that Facebook's algorithms help facilitate terrorism.
Facebook has some obvious defenses to these kinds of lawsuits. First, Section 230 of the Communications Decency Act immunizes online companies from liability for content posted by users.
What's more, a federal appellate court in New York ruled three years ago that Section 230 applies even when tech companies use algorithms to promote content.
The judges said in their ruling that websites typically recommend third-party content to users -- even if only by prominently featuring it on a homepage, or displaying English-language articles to users who speak English. The use of algorithms, as opposed to human editors, to make those recommendations doesn't make Facebook liable for the posts, the appellate panel ruled.
(A different federal appellate court recently allowed terror victims to proceed with claims that tech companies violated a specific federal law -- the Anti-Terrorism Act -- by aiding and abetting terrorists. Jacobs' lawsuit against Facebook doesn't accuse the company of violating that statute.)
Even apart from Section 230, Facebook has other defenses -- including the argument that, even if Jacobs' allegations are true, its role as a communications platform didn't directly cause her brother's death.
“At the core, you have a causation problem,” Santa Clara University law professor Eric Goldman says.
In fact, a federal appeals court ruled four years ago that family members of victims of a terrorist shooting in Amman, Jordan, couldn't proceed with claims against Twitter because their complaint didn't show a "direct relationship" between anything Twitter did and the shooting.
For its part, Facebook says Jacobs' claims lack a legal basis, and that it takes steps to remove content that violates its policies.
“We’ve banned more than 1,000 militarized social movements from our platform and work closely with experts to address the broader issue of internet radicalization,” a company spokesperson states.