London: This month, the social platform X closed its Brazil operations after one of its executives was threatened with arrest for not taking down certain content. Last year, Changpeng Zhao, the founder of Binance, pleaded guilty to federal money-laundering violations that took place on his cryptocurrency platform. In 2021, Twitter executives in India faced arrest over posts that the government wanted removed from the site.
And on Saturday, Pavel Durov, who founded the online communications tool Telegram, was arrested in France as part of an investigation into the platform’s complicity in crimes including possession and distribution of child sexual abuse imagery.
For years, internet company executives rarely faced personal liability in Western democracies for what took place on their platforms. But as law enforcement agencies, regulators and policymakers ramp up scrutiny of online platforms and exchanges, they are increasingly considering when to hold company leaders directly responsible.
That shift was punctuated by Durov’s arrest over the weekend, which raised questions about whether tech executives like Meta’s Mark Zuckerberg also risk arrest when they next set foot on European soil.
For now, tech executives have little to fear, with cases like Durov’s likely to be outliers, experts said. Historically, companies have been held responsible for a platform’s transgressions, rather than individuals. And legally, the bar is high in the United States and Europe to prosecute individuals for activities at their companies, especially with U.S. laws like Section 230 of the Communications Decency Act, which protects internet platforms from being responsible for harmful speech.
But the threshold for holding executives liable for what takes place on their sites is lowering in specific areas, particularly child safety, said TJ McIntyre, an associate professor at University College Dublin’s School of Law.
Last year, Britain passed an online safety law that can hold tech leaders personally responsible if their company is made aware of content that risks child safety and systematically fails to remove it. Even Section 230 doesn’t apply to some forms of outlawed speech, such as child sexual abuse.
“There’s a 30-year arc here,” McIntyre said. Since the 1990s, he said, tech executives have not typically been held responsible for what users did on their platforms, though that approach is now being questioned by those who want stronger accountability.
Durov, 39, has not been formally charged with any offenses and could remain in the custody of French authorities through Wednesday. While French authorities have provided few specifics, he faces a raft of potential charges related to activities on Telegram, including child sexual abuse material, drug trafficking, fraud, money laundering, abetting criminal transactions and refusing to cooperate with law enforcement.
Durov made himself a target with an anti-authority ethos holding that governments should not restrict what people say and do online except in rare instances, experts said. Unlike Meta, Google and other online platforms that typically comply with government orders, Telegram was also called out by French authorities for failing to cooperate with law enforcement.
After Durov’s arrest, Telegram said that it abided by EU laws and that it was “absurd to claim that a platform or its owner are responsible for abuse of that platform.”
Tech companies are paying close attention to the legal liability that their executives may face. This year, Meta successfully fought to have Zuckerberg, its CEO, removed as a named defendant in a lawsuit brought by New Mexico’s attorney general against the company for child protection failures.
In China, Russia and other authoritarian countries, U.S. tech companies have sometimes pulled out their employees to prevent them from being arrested. The concern is that employees will be used as leverage to force companies to do things like remove content unfavorable to the government.
Previously, only a few notable cases surfaced in which tech executives were seen as potentially liable for activities that took place on their services. In 1998, Felix Somm, a former executive at CompuServe, an online services company, was given a suspended two-year sentence in Germany for complicity in the proliferation of pornography on the internet. He was later acquitted. In 2002, Timothy Koogle, a former CEO of Yahoo, faced charges in France for the sale of Nazi memorabilia on the website. He was also later acquitted.
In 2012, Kim Dotcom, the founder of Megaupload, was arrested by U.S. authorities for copyright infringement related to his website. Ross W. Ulbricht, the creator of the Silk Road online black market, was convicted in the United States for facilitating illicit drug sales in 2015. In 2016, Brazil briefly imprisoned a Facebook executive for failing to turn over WhatsApp messaging data in a drug trafficking investigation.
These instances were capped over the weekend by Durov’s arrest.
One challenge for prosecutors and law enforcement agencies is proving that a tech executive had knowledge of illegal activity on their platform and did not try to curb the harms, said Daphne Keller, a professor of internet law at Stanford University Law School.
That’s difficult to demonstrate, since TikTok, YouTube, Snap and Meta, which owns Facebook and Instagram, have worked to take down and report illegal content to law enforcement officials, so their executives can argue they tried to do the right thing.
“Knowledge is the key issue here,” said Keller, a former lawyer for Google. “It’s the usual trigger for anyone losing immunity.”
Still, the risk of prosecution is needed to force tech companies to act, said Bruce Daisley, who was a vice president at Twitter before Elon Musk bought the site in 2022 and renamed it X.
“That threat of personal sanction is much more effective on executives than the risk of corporate fines,” Daisley wrote recently in The Guardian.