
Pavel Durov and the BlackBerry Ratchet

Why do governments go after the companies and executives behind more weakly encrypted tools?

It’s very hard, this early, to pierce through what’s going on with the French authorities’ arrest of Pavel Durov, the CEO of Telegram — but that doesn’t stop people from having pet theories. Was it retaliation from the US and the FBI for not backdooring Telegram? Was it a favor to Durov so he could hide from Putin? Was it just the grinding wheels of French justice?

I’m sure we’ll understand more details of Durov’s case in the next few days, but motivations — especially those anthropomorphically projected onto entire states — are never really resolved satisfactorily. If you think LLMs lack explainability, try guessing the weights of a million human-shaped neurons in the multi-Leviathan model that is international politics. It’s not that we’ll never have explanations: it’s just that we’ll never be able to point to one as definitive.

Of course, the intractability of large systems never stopped anyone from trivializing those crushed under their lumberings with a pat explanation or two on a blog. (It certainly won’t stop me, who, when I was a columnist, made more or less a career out of it.)

So let me dig out an old theory, which I think may fit the facts here. I think Durov and Telegram are prisoners of the same ratchet that trapped Research In Motion (RIM)’s BlackBerry in the 2000s.

Back in the Before iPhone Times, BlackBerry was a cute range of mobile devices with a little keyboard and screen that offered low-cost messaging in an era when phones were bad at everything that wasn’t “talking to people” (and they weren’t great at that).

We think of mobile phones these days as individually owned devices — intimately so — but BlackBerrys were the stuff of institutional purchasing. In the late 90s and 2000s, companies and governments bought or rented BlackBerrys en masse, and handed out the units to their staff to keep in touch. In the pre-cloud era, these institutions were cautious about ceding a chunk of their internal comms infrastructure to a third party, let alone a Canadian third party, so RIM built reassuring-sounding content privacy into their design. A chunk of the message-relaying work was done by “BlackBerry Enterprise Server”, which was closed-source but sat on-prem. Corporate BlackBerrys could send instant messages directly to one another, via RIM’s systems, but enterprises could flash their users’ devices with a shared key that would make their messages undecipherable by anyone who didn’t have the key, including RIM or the telecoms networks the messages passed over. None of it would really pass muster against modern cryptographic best practice, but it was enough to get a CTO to sigh, say “OK, seems good enough. Sure.”, and sign off on the purchase.

Importantly, though, a lot of this encrypted security was optional, and it protected these comms at the organizational, not individual, level. Companies could turn message privacy on and off. Even when it was turned on, the company itself could decrypt all the messages sent over its network if it needed to. Useful if you’re in a heavily regulated industry, or in government or the military.
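To make the architectural point concrete, here is a minimal sketch in Python of what organization-level shared-key messaging implies. It uses the Python cryptography library’s Fernet recipe purely as a stand-in (this is not RIM’s actual scheme, and the names are made up): the point is simply that whoever provisions the shared key, whether the enterprise IT department, the telco, or the vendor, can read every message sent under it.

    # Illustrative sketch only: organization-level shared-key messaging.
    # Whoever holds the provisioned key (employer, telco, or vendor) can
    # decrypt every message sent under it.
    from cryptography.fernet import Fernet

    # The organization generates one shared key and flashes it onto
    # every handset it hands out.
    org_key = Fernet.generate_key()

    def send(message: str) -> bytes:
        """A handset encrypts with the organization's shared key."""
        return Fernet(org_key).encrypt(message.encode())

    def read(ciphertext: bytes, key: bytes) -> str:
        """Anyone holding the shared key can read the message."""
        return Fernet(key).decrypt(ciphertext).decode()

    wire_blob = send("meet at the usual place")

    # The network in between, without the key, sees only ciphertext...
    print(wire_blob[:20])

    # ...but the key-holder reads it trivially, no warrant to the vendor needed.
    print(read(wire_blob, org_key))

Contrast that with a true end-to-end design, where keys live only on the two endpoints: there, the operator has nothing legible to hand over even if it wants to.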

Now, BlackBerry users loved their little type-y clicky things, and inevitably RIM realized they might have a consumer play on their hands (especially as smartphones began to get popular). They started selling BlackBerry devices direct to individuals via the mobile phone companies. RIM and the telcos played the part of the institutional buyers in this deal — they could turn on the encryption, and had access to the messages, although it was unclear from the outside who played what part. Did the telcos flash their devices with a shared key, or did RIM? Who was in charge of turning the privacy on and off?

All this ambiguity made infosec people leery of RIM’s promises, especially with consumer BlackBerry devices. But in general, people read this all as meaning that consumer BlackBerrys were secure enough. After all, even President Obama had a BlackBerry, so that must mean something?

Apparently so: around 2010, governments started publicly attacking RIM and BlackBerrys as a threat to national security and crime prevention. Law enforcement agencies started complaining about RIM’s non-cooperation. Countries like the UAE and India talked of throwing RIM out entirely. It was the first big government-versus-internet-messaging drama to play out in the press.

At the time, this puzzled me immensely. From the viewpoint of infosec insiders, spooks should have loved RIM! BlackBerrys were actually kind of insecure! If you wanted to get at the messages of individual BlackBerry customers — including, most visibly, drug dealers, who loved their BlackBerrys — you just had to hit up the (certainly domestic) telephone company they were using and get that shared key. Or you could maybe mandate what that key would be. You didn’t need to pressure RIM, or ban it, to do this!

But as I dug into it, I realized what may have been going on. RIM and the telcos had been helping the authorities, to the best of their abilities. They probably did a fair bit of explaining to the authorities how to tap a BlackBerry, and may even have done some of the heavy lifting. When it came to consumer BlackBerrys, RIM didn’t have the hard-and-fast line of a Signal or some other truly end-to-end encrypted tool. They could hand over the messages, and (as they would sometimes protest) often did.

But, crucially, they could not do this in every case. The reasons why they could not were primarily bureaucratic and technical. The drug dealers might have got smart and changed the key on their network, and neither RIM nor the cops had a device to extract the key from. Or the authorities might want info from a corporate BlackBerry, which RIM couldn’t crack with its existing infrastructure. Or a BlackBerry’s shared key might have been set by the phone company, not RIM, so RIM couldn’t directly co-operate and needed to refer the authorities back to the telco — who might have just cluelessly bounced them back to RIM. That kind of shuttlecock-up happens all too often, and it’s easy for the tech company to take the blame.

Ultimately, the problem was that RIM could not 100% state they had no access to BlackBerry data at all. They complied with some requests, but not others. The reasons were generally technical, not political — but they sounded to law enforcement and intelligence community ears like they were political.

Those political actors were not entirely wrong. RIM had made political decisions when designing the privacy of its tools. In particular, they had promised a bunch of customers that they were secure, and let a bunch of other customers think they were secure. RIM’s critics in governments were simply asking — why can’t you move the customers that we’d like to spy on from one bucket to the other?

Declining to do this was an existential commitment for RIM — if they undid those protections once, none of their major military and corporate customers would ever trust them again. They had to fight the ratchet that the governments were placing them in, because if they didn’t, their business would be over. And the more they fought, the angrier their government contacts became, because hey — you’re already doing this for some people. Why aren’t you doing it for this case? Law enforcement saw this as a political problem, so responded to it with political tactics: behind-the-scenes pressure, and when that didn’t work, public threats and sanctions.

Durov and the Ratchet

As with BlackBerry, I think a lot of infosec professionals are again confused as to why Telegram is getting it in the neck from the French government. It’s not even a well-designed tool. And I think the reason is the same: like BlackBerry, because of its opt-in, weakly protective tooling, Telegram can, and does, assist the authorities in some ways, but not others. I don’t mean this in a damning way — if Telegram gets a CSAM report, it takes down a channel. End-to-end encryption is opt-in on Telegram; they really do have access to user information that, say, a Signal or even a WhatsApp doesn’t. There’s no technical reason for it not to have backend features to deal with spam and scams: a backend which — unlike an end-to-end encrypted tool’s — can peer in detail at a lot of user content. The authorities can plainly see that Telegram can technically do more to assist them: a lot more.
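Here is a toy sketch of that asymmetry, with hypothetical names that have nothing to do with Telegram’s real backend: when chats are server-readable by default, the operator can scan them for spam or hand them over, while for the opt-in end-to-end chats it only ever holds ciphertext, so there is nothing it can read or disclose even under compulsion.

    # Toy model of a backend where end-to-end encryption is opt-in.
    # Hypothetical names; an illustration of the asymmetry, not a
    # description of any real service's internals.
    from dataclasses import dataclass

    @dataclass
    class StoredMessage:
        chat_id: str
        body: bytes        # plaintext for default chats, ciphertext for e2e chats
        end_to_end: bool   # did the users opt in to end-to-end encryption?

    SPAM_MARKERS = (b"free crypto", b"click here")

    def can_disclose(msg: StoredMessage) -> bool:
        """The operator can only hand over what it can actually read."""
        return not msg.end_to_end

    def looks_like_spam(msg: StoredMessage) -> bool:
        """Server-side scanning only works on server-readable chats."""
        return can_disclose(msg) and any(m in msg.body.lower() for m in SPAM_MARKERS)

    inbox = [
        StoredMessage("default-chat", b"FREE CRYPTO, click here", end_to_end=False),
        StoredMessage("secret-chat", b"\x8f\x02 opaque ciphertext", end_to_end=True),
    ]

    for msg in inbox:
        print(msg.chat_id, "| disclosable:", can_disclose(msg), "| flagged:", looks_like_spam(msg))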

So why doesn’t Telegram do more to help the French government? As with RIM, Telegram’s excuses will be convoluted and hard for political authorities to parse. Maybe the French requests are for data it doesn’t have — chats whose participants were smart enough to turn on encryption. Maybe it’s that if it provided that service for France, it would have to provide it for everyone. Maybe France wants to see Russian communications. Maybe Telegram just doesn’t have the manpower. But the point here is that Durov is caught in the ratchet — the explanations of what Telegram can and can’t do are a product of contingent history, and the French authorities can’t see why those contingencies can’t be changed.

If it sounds like I’m basically victim-blaming Durov for his lack of commitment to infosec crypto orthodoxy here, I want to be clear: best-practice, ideologically pure end-to-end apps like Signal absolutely face the same ratchet. What I’m mostly trying to understand here is why Telegram and BlackBerry get more publicly targeted. I think the truth behind the amount of pushback they receive is more psychological than cryptographic. Humans who work in politics-adjacent roles get madder at someone who concedes part of the way but refuses to bow further for what seem like political reasons than at someone who has a convincing argument that it is mathematics, not politics, that prevents them from complying further, and who has therefore stayed lower down on the ratchet. Not much madder, but mad enough to more quickly reach for political tools to exact compliance.

Echoing BlackBerry’s woes, I don’t think Telegram’s security compromises are a product of government pressure so much as historical contingencies. But I do think its weaknesses have ironically made it a greater target for the kind of unjust, escalatory, fundamentally ill-conceived actions that we have seen against Durov by the French authorities.

The motivations of government officials are hard to guess, but I do think it is accurate to say they see the world through political, not technical, lenses.








3 Responses to “Pavel Durov and the BlackBerry Ratchet”

  1. wiktors Says:

    I was waiting for a new article and I am not disappointed and the Durov case also interested me. For some time I have been thinking a lot about leaving Facebook and looking for alternatives and of course one of the options is Telegram. I don’t want to write much here except that I’m interested in your blog and I thank you for your work.

  2. Danny O'Brien Says:

    Thank you, wiktor!

  3. wiktors Says:

    With your next post I will add a comment on how I found your site again after about 17 years :) best wishes