A lawyer’s perspective: Thoughts on the arrest of Telegram founder Pavel Durov


Thoughts on the Durov arrest

By Preston Byrne, Partner at Byrne & Storm LLP

Compiled by: 0xjs, Jinse Finance

On August 24, Pavel Durov, the founder of the popular messaging app Telegram, was arrested when his private jet landed in France.

Early indications are that the arrest stems from Telegram's alleged non-compliance with French requests for content moderation and data disclosure.

Some legal background

It’s no accident that most non-Chinese social media companies with global reach are headquartered in the United States.

The United States (wisely) took policy steps in the late 1990s to minimize the liability of online service operators, most notably by enacting Section 230 of the Communications Decency Act. That law (essentially) means that operators of social media sites are not liable for the torts or criminal conduct of their users. There are, of course, some very narrow exceptions to this rule; for example, child sexual abuse material is subject to mandatory reporting and takedown (see 18 U.S.C. § 2258A), and FOSTA-SESTA prohibits operators from providing services that facilitate sex trafficking or prostitution (see United States v. Lacey et al. (Backpage); 47 U.S.C. § 230(e)(5)).

Beyond that, social media site operators are generally not liable for the tortious or criminal conduct of their users. Nor can they be held liable on an aider/abettor theory if they merely passively host content. (See Twitter v. Taamneh, 598 U.S. ___ (2023): on this side of the Atlantic at least, civil aiding-and-abetting liability requires "knowing and substantial assistance," while federal criminal liability, from which Section 230 provides no shield, requires specific intent to assist in a crime.)

This means that if I use Facebook to organize a drug deal, Facebook (a) has no obligation to scan its service for illegal use and (b) has no obligation to restrict that use. It generally cannot be held civilly liable for my misuse unless it "materially facilitates" that illegal use, i.e., expressly encourages it (see, e.g., Force v. Facebook, 934 F.3d 53 (2d Cir. 2019), in which Facebook was found not civilly liable under JASTA to victims of Hamas attacks after Hamas used Facebook to disseminate propaganda online; see also Taamneh, supra). Nor is it criminally liable (a) under state criminal law, because of Section 230, or (b) under federal criminal law, so long as Facebook did not intentionally and knowingly aid, abet, counsel, or cause the commission of a crime (18 U.S.C. § 2).

Most countries do not have such a permissive system, and France is among them. For example, France's 2020 online hate speech law (the Loi de lutte contre la haine sur Internet) provides that global internet companies can be fined roughly $1.4 million per failure to restrict "hate speech" on their sites (speech that is largely protected in the United States), with fines of up to 4% of global revenue. Similarly, Germany has its own law, the Network Enforcement Act (sometimes called the "Facebook Law" but usually shortened to NetzDG), under which inflammatory political content must be removed or the government may impose fines of up to €50 million.

I'm not a French lawyer, so it's hard to know exactly which legislative provisions are being invoked here; the charging document or arrest warrant will tell us more once it's released. I'm fairly sure this is not about a fine against Telegram Messenger, Inc. under a hate speech regime (such as the EU DSA), because if it were, Durov would not have been dragged off the plane in handcuffs. French media outlet TF1 Info, which reported the news, said the charges could be aiding and abetting, or conspiracy:

The French judiciary argues that the lack of moderation, the lack of cooperation with law enforcement, and the tools Telegram provides (disposable numbers, cryptocurrency, etc.) make it an accomplice to drug trafficking ... and fraud.

More information will be revealed once the arrest warrant is published. For example, if it turns out that Durov actively helped criminal users use the platform (say, a drug dealer wrote to the support channel asking, "I want to sell drugs on your platform, how can I do this?" and Durov replied that he would help), then he would face the same fate in the United States as in France.

However, if the French position is simply that Durov's failure to police his users, or to respond promptly to French document requests, is itself a crime (which I doubt is the case), then this represents a dramatic escalation of the online censorship wars. It would mean that European countries will try to dictate, from within their own borders, what foreign companies can and cannot host on foreign web servers.

If correct, this would be a significant departure from the status quo, in which compliance with U.S. rules generally drives the global compliance strategy of most non-Chinese social media companies, including those that encrypt their services (among them Telegram, WhatsApp, and Signal). In short, these platforms have operated on the assumption that as long as they do not intend for their platforms to be used for crime, they are unlikely to be criminally charged. Apparently, that is no longer the case.

Telegram is not the only social media platform in the world that is used for illegal purposes. It is well known that Facebook's popular encrypted messaging app WhatsApp has been used for years by the Taliban, formerly a non-state terrorist group and now the rulers of Afghanistan. This fact was widely known to NATO generals during the Afghan war and reported in the media, including again in the New York Times last year:

About a month later, when security officer Inkayade was unable to contact his commander during a night operation, he reluctantly purchased a new SIM card, opened a new WhatsApp account, and began the process of recovering his lost phone numbers and rejoining his WhatsApp groups.

Inkayade sat in his police station, a converted shipping container with a handheld walkie-talkie perched on top. He pulled out his phone and began scrolling through his new account. He pointed out all the groups he had joined: one for all the police officers in his precinct, another for former fighters loyal to a single commander, a third he used to communicate with his superiors at headquarters. In all, he said, he was in about 80 WhatsApp groups, more than a dozen of them for official government purposes.

Of course, the Taliban now controls the entire government of Afghanistan, at every level, and Afghanistan is an enemy of the United States, Facebook's home country. If Facebook really wanted to prevent people like this from using its service, the most effective way would not be to play whack-a-mole with individual government employees, as Facebook did; it would be to ban Afghanistan's entire IP range, block all Afghan phone numbers, and disable in-country app downloads, which Facebook did not do. Facebook chose the token measures over the effective ones.
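To make concrete how blunt such a country-level block would be, here is a minimal, purely hypothetical sketch of a sign-up gate that rejects registrations from a blocked IP range or a blocked phone prefix. This is not code that Facebook, WhatsApp, or Telegram is known to run; the IP ranges shown are placeholder test networks, and a real deployment would rely on a maintained GeoIP database rather than a hard-coded list.

```python
import ipaddress

# Hypothetical illustration of a country-level block of the kind described above.
# The IP ranges are RFC 5737 placeholder networks, not real allocations.
BLOCKED_PHONE_PREFIX = "+93"  # Afghanistan's international dialing code
BLOCKED_IP_RANGES = [
    ipaddress.ip_network("203.0.113.0/24"),   # placeholder (TEST-NET-3)
    ipaddress.ip_network("198.51.100.0/24"),  # placeholder (TEST-NET-2)
]

def signup_allowed(ip: str, phone: str) -> bool:
    """Reject sign-ups originating from a blocked IP range or phone prefix."""
    addr = ipaddress.ip_address(ip)
    if any(addr in net for net in BLOCKED_IP_RANGES):
        return False
    if phone.startswith(BLOCKED_PHONE_PREFIX):
        return False
    return True

if __name__ == "__main__":
    print(signup_allowed("203.0.113.7", "+15551234567"))  # False: blocked IP range
    print(signup_allowed("192.0.2.10", "+93700000000"))   # False: blocked phone prefix
    print(signup_allowed("192.0.2.10", "+15551234567"))   # True
```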

Yet Facebook CEO Mark Zuckerberg lives comfortably on an estate in Hawaii, not in exile, and presumably has no arrest warrant out for him, whereas Durov clearly does. I admit that it’s possible (even likely, given that Telegram operates with a team of just 15 engineers and about 100 employees worldwide) that Facebook could respond more quickly to a French judicial request than Telegram did. But when you run a globally accessible encrypted platform, it’s inevitable — I repeat, inevitable, absolutely certain — that criminal activity will occur beyond your vision or control.

If Telegram is accused of violating French law by failing to moderate (as media reports suggest), then apps like Signal (which, by design, cannot respond to law enforcement requests for content data and offers functionality similar to Telegram's) are equally exposed, and no American social media company (or its senior leadership) that offers end-to-end encryption is safe. Do we really think Meredith Whittaker, Signal's president, should go to jail if she decides to travel to France?
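The reason an end-to-end encrypted service cannot produce message content is architectural, not a matter of unwillingness: the operator's servers only ever hold ciphertext. The toy sketch below, which uses a shared symmetric key for brevity and is emphatically not Signal's or Telegram's actual protocol, illustrates the point.

```python
# Toy sketch: why an end-to-end encrypted service cannot hand over message content.
# The operator's "mailbox" only ever stores ciphertext; only clients hold the key.
# Requires the third-party "cryptography" package.
from cryptography.fernet import Fernet

shared_key = Fernet.generate_key()  # known only to the two clients, never the server
alice = Fernet(shared_key)
bob = Fernet(shared_key)

server_mailbox = []  # everything the operator ever sees or stores

# Alice encrypts on her own device before anything reaches the server.
server_mailbox.append(alice.encrypt(b"meet at 9"))

# A content-disclosure demand served on the operator can yield only this:
print(server_mailbox[0])  # opaque ciphertext bytes

# Only a client holding the key can recover the plaintext.
print(bob.decrypt(server_mailbox[0]))  # b'meet at 9'
```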


There are still many open questions. Right now, this does not look good for the future of interactive web services in Europe. American tech entrepreneurs who run services in accordance with American values (in particular, protecting free speech and privacy through strong encryption) should not visit Europe, should not hire in Europe, and should not host infrastructure in Europe until this situation is resolved.

Aiding and abetting in France

Updated on August 26, 2024

Basically, my hunch was correct.

The list of charges is a long one. Most of them relate to the French offense of complicity, which is roughly equivalent to aider/abettor liability in the United States.

What matters here is that in the US, aider/abettor liability requires specific intent to cause a criminal result; that is, the criminal conduct must have been the defendant's purpose. The failure of US social media companies to police their users does not rise to that level, which is why US social media CEOs are not, as a rule, arrested by the US government over the criminal conduct of their users. In particular, the CSAM allegations would only amount to a crime in the US if Durov failed to comply with the US notice-and-reporting regime for such content; the mere presence of criminal content, absent any notice, does not give rise to criminal liability.

The French authorities accuse Durov of complicity in (i.e., aiding and abetting) criminal activity and of providing "cryptology" tools without the required government license; under French law, encryption products must be declared to or authorized by the government before they can be offered in France. The crimes he is accused of assisting include an offense roughly analogous to the US RICO statute (an umbrella charge covering organized criminal activity), as well as money laundering, drug offenses, hacking, and the provision of unlicensed encryption technology.

Absent overwhelming evidence that Durov and Telegram specifically intended for these crimes to be committed or to bring them about (which would be highly unusual conduct for a social media CEO, especially since these crimes are illegal around the world, including in the United States, which has historically been very effective at extraditing criminals), there is no reason similar charges could not be brought against any other social media service provider accessible in France with less-than-perfect moderation practices, especially one that offers end-to-end encryption.

We need to wait for the evidence before drawing any firm conclusions on this point. My guess, however, is that Durov was not "aiding and abetting" as the US understands the term, and that France has decided to use different principles to try to regulate a foreign company whose moderation policies it considers too lax.

To summarize:

For now, if you are based in the United States and you run a social media company or offer encrypted messaging services accessible in France, leave Europe.
