(comments)

Original link: https://news.ycombinator.com/item?id=41350530

This article discusses the controversy surrounding the messaging app Telegram and its founder Pavel Durov. Specifically, it raises concerns about the app's data security, in particular its handling of encryption keys and the possibility of law enforcement gaining access to user communications. The article states that Telegram lets users communicate in an apparently encrypted way, but the app stores some information on its servers in plaintext, such as metadata covering sender, recipient, and message timestamps. Critics argue this leaves room for access by the authorities, as well as by Telegram itself. The article further suggests that Telegram's encryption scheme may not be as secure as claimed, pointing to poor implementation choices and vulnerabilities in certain aspects of the app's design, and argues that Durov's background in publicity may have influenced his decisions about the app's security features. Finally, it covers the legal battles facing Telegram and Durov, particularly law-enforcement demands for user data and allegations of censorship, and closes by questioning the reliability of Telegram's claimed data security, suggesting the app may have been deliberately weakened to ease compliance with government demands.

Related articles

Original article


And it only works because a corporation likely would want to offer this to its users as a convenient feature. If they were actively trying to hide this, they could rig the test and keep access to themselves.



It is true that passing the mud puddle test does not guarantee robust end-to-end encryption (there can still be backdoors reserved for company/law enforcement). But failing it definitely guarantees that there is no robust end-to-end encryption.



> If the answer is yes then law enforcement can too.

Is it technically possible for them to see it: yes

Does Telegram let them see it: I don't think so. That seems to be the core issue around Durov being arrested.

They probably should implement E2EE for everything. Then they will have a good excuse not to cooperate, because they simply don't have the data.



> Does Telegram let them see it: I don't think so.

This is exceptionally naive. Even if he was arrested for not sharing with the French, what about for other countries? Was he arrested for not ever sharing or not sharing enough? Even if he, personally, has never shared, that doesn’t say anything about his employees who have the same access to these systems.

Your data is not private with Telegram. You are trusting Telegram. It is a trust-based app, not a cryptographically secure app.

If you trust telegram, that’s your choice, but just because a person says the right words in interviews doesn’t mean your data is safe.



You cannot be sure and yet Telegram often gets mentioned for being the only platform where states do not have easy access to user information or the ability to censor certain messages/content.

So from a broad perspective, they probably behave better than comparable services.

I think Telegram should not be trusted, but I also do not trust the alternatives, that readily share information with states. A special focus for me is that my own jurisdiction does not have access to my social media content. Other countries are secondary at first.



> Telegram often gets mentioned for being the only platform where states do not have easy access to user information or the ability to censor certain messages/content.

By who?

SimpleX especially, or even Signal, are far better.



Following the St. Petersburg attack, the Federal Security Service (FSB), in an event that may ring somewhat familiar to many in the United States and Europe, asked Telegram for encryption keys to decode the dead attacker’s messages. Telegram said it couldn’t give the keys over because it didn’t have them. In response, Russia’s internet and media regulator said the company wasn’t complying with legal requirements. The court-ordered ban on accessing Telegram from within Russia followed shortly thereafter. Telegram did, though, enact a privacy policy in August 2018 where it could hand over terror suspects’ user information (though not encryption keys to their messages) if given a court order.

...

... Pavel Durov, Telegram’s founder, called on Russian authorities on June 4 to lift the ban. He cited ongoing Telegram efforts to significantly improve the removal of extremist propaganda from the platform in ways that don’t violate privacy, such as setting a precedent of handing encryption keys to the FSB.

https://www.atlanticcouncil.org/blogs/new-atlanticist/whats-...



This doesn't make any sense. Either the author of the article is confused, lying, or is drawing conclusions from source material that is untrue.

In the US case, there was a phone where data was encrypted at rest. Though Apple was capable of creating and signing a firmware update that would have made it easier for the FBI to brute force the password, Apple refused to do so.

In the Russian case, the FSB must have already had access to the suspect's phone because if it did not then Telegram would not be in any position to help at all.

So, the FSB must have already had access. And therefore, by having access to the phone they also had complete access to the suspect's chats in plaintext, regardless of whether or not the suspect used Telegram's private chat. There would have been no keys to ask Telegram for copies of.

Alternatively, the FSB might have had access to some other user's chats with the suspect, and wanted Telegram to turn over the suspect's full data. Telegram is 100% able to do that if they want to.

As the specific part of the article you have quoted is definitely bullshit, I suspect the rest of it is bullshit too and that despite what Roskomnadzor states in public, the real fight with Durov was over censorship.



Telegram is the only messaging app that I know of which brought attention to the fact that your messages go through Google/Apple notification APIs, which seems like it would utterly defeat any privacy advantage offered by E2EE



Why? I think Google suggests that you send the payload encrypted through the notification. Google then only knows which app to send the message to, they don't know from whom the message originates (only "a Telegram server") nor what the content is.

Also, you could just send a notification instructing the app to fetch a new message from your server.

From the docs:

Encryption for data messages

The Android Transport Layer (see FCM architecture) uses point-to-point encryption. Depending on your needs, you may decide to add end-to-end encryption to data messages. FCM does not provide an end-to-end solution. However, there are external solutions available such as Capillary or DTLS.

https://firebase.google.com/docs/cloud-messaging/concept-opt...
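
The "just send a wake-up ping" pattern described above can be sketched roughly as follows. This is only an illustration, assuming a store-and-fetch design; `build_push` and `on_push` are hypothetical names, not part of the FCM API:

```python
import json

def build_push(device_token: str, msg_id: str) -> str:
    # The push payload carries no sender and no content -- only an opaque
    # message ID. The notification service learns that *some* message arrived
    # for this app, nothing more.
    return json.dumps({"to": device_token, "data": {"msg_id": msg_id}})

def on_push(payload: str, fetch_and_decrypt):
    # The client wakes up, then retrieves and decrypts the real message over
    # its own TLS connection to the app server.
    msg_id = json.loads(payload)["data"]["msg_id"]
    return fetch_and_decrypt(msg_id)
```

With this design, the notification pipeline never sees message content at all, only delivery timing.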



Assuming an adversarial relationship, what sort of metadata could Google capture simply knowing which app was sending the notifications and who was receiving them?



Schneier mentioned this late in 2023:

https://www.schneier.com/blog/archives/2023/12/spying-throug...

> Wyden’s letter cited a “tip” as the source of the information about the surveillance. His staff did not elaborate on the tip, but a source familiar with the matter confirmed that both foreign and U.S. government agencies have been asking Apple and Google for metadata related to push notifications to, for example, help tie anonymous users of messaging apps to specific Apple or Google accounts.



Aren’t notifications enqueued on the server side, implying sender info is inscrutable? I’m curious what mechanism you’d propose to gather any valuable metadata given a sufficient volume of encrypted notifications.



If the text appears on your screen I'm pretty sure there are ways for Google to capture it. I don't need to know how android's API works, knowing it probably just makes one blind to the big picture. You have to trust your OS/phone maker not to do a MITM.



Yes, but Google cannot be compelled to turn over data they don't actually have on their servers because the users encrypted it before it arrived with keys Google don't control.

Signal could modify the application so a remote flag in the Play Store binaries could be triggered to exfiltrate data as well. But the key distinction is that the normal path of Signal gives them absolutely nothing they can tell anyone other than the bits they've put in the disclosure reports (namely: date and time an account ID used Signal, I believe).



Yes, that likely is the GP's point, but it's not really relevant to the discussion going on in this thread. Certainly Google could "backdoor" its OS in that way, but they have little motivation to do so (and a lot to lose if they were to do so and were found out). Their recent move to make their location history / timeline product an on-device-only feature because they don't want to have to respond to law enforcement requests for user location data would seem to suggest they really would prefer to not have this sort of data.

At any rate, the discussion going on here is about how Durov has been arrested because Telegram refuses to respond to law enforcement requests, when they do have the ability to do so; and if they were to actually implement E2EE by default (and for group chats), Durov would likely not be in trouble, since Telegram would be unable to provide anything when requested.



> Their recent move to make their location history / timeline product an on-device-only feature because they don't want to have to respond to law enforcement requests for user location data would seem to suggest they really would prefer to not have this sort of data.

I suspect that isn’t the motivation. GDPR says that you have to give users choices about data stored like this (including right to be forgotten, how it’s processed and used and so on), and this becomes a technical, legal and commercial nightmare very quickly. The easier route is just to get rid of it if you can.

This saves Google money (it likely wasn’t that useful to sell to advertisers), makes legal compliance a lot easier and de-risks them from very large fines.

I suspect that the EU lawmakers didn’t think about second order effects like making it harder for law enforcement to access this data in scenarios like this.



Most Telegram clients, except the initial mobile apps, were actually open-source projects that were chosen by the company to become "official" ones.

They just don't implement E2EE, since almost no one uses it on Telegram.



This claim is what really makes me skeptical of Telegram's privacy story. Their assertion is completely incorrect. (Source: have implemented end to end encrypted payload delivery over APNs / GCM.)

And if they are so off base on this, they must either be incompetent or liars. Neither of which builds trust.



I’m old enough to remember when Signal first implemented cross-device sync using a Chrome plugin.

I’d rather developers issue cautionary warnings than give a false sense of perfect security



They probably share it with Russian authorities. Just look now: Russia is allowing protests in favour of him (they only allow protests they support), and they arrested a French citizen on fake drug charges right after.



AFAIK this current case has absolutely nothing to do with any form of chat features, it’s about telegram’s public channels that more or less work like reddit/twitter/any other news channels, except it refuses to censor content.



> Does Telegram let them see it: I don't think so. That seems to be the core issue around Durov being arrested

The UAE requires decryption keys as part of their Telco regulations.

If Telegram can operate in the UAE without VPN (and it can), then at the very least the UAE MoI has access.

They (and their shadow firms like G42 and G42's shadow firms) were always a major buyer for offensive capabilities at GITEX.

On that note, NEVER bring your personal phone to DEFCON/Blackhat or GITEX.

Edit: cannot reply below so answering here

Cybersecurity conferences.

DEFCON/Blackhat happen during the same week, so you have a lot of script kiddies who lack common sense trying to pwn random workloads. They almost always get caught (and charged - happens every year), but it's a headache.

GITEX is MENA and Asia's largest cybersecurity conference. You have intelligence agencies from most of the Middle East, Africa, Europe, and Asia attending, plus a lot of corporate espionage because of politically connected MSSPs as well as massive defense tenders.



Sorry, but as someone who's completely out of the loop with these things. What's DEFCON/Blackhat or GITEX about and why shouldn't you bring your personal phone?

I'm genuinely interested.



defcon and blackhat are hacker/computer security conferences started by Jeff Moss (aka DT or Dark Tangent) in 1993 and held at the end of July or early August every year in Las Vegas.... The reason you don't bring your phone is it might get hacked



On a separate note, Zerodium is dead now. They're in the middle of an active fire sale, but the Zero Day market's bottom fell out now that countries are increasingly moving exploit development in-house or to vendors that can do both zero day acquisition AND exploit deployment (which Zerodium cannot do as an American company).

Also, u/reissbaker's answer is correct.



All the encryption stuff is largely a red herring. It's not the technical access to the information that is the issue; it is that people can share and exchange information that the various regimes do not want shared. They want censorship, i.e., control of thought and speech, arresting the information flow.

They know what is being said and that’s what they want to arrest, that information can be sent and received. And by “they” I mean more than just the French. That was just coincidental and pragmatic.

The French state does not operate that quickly on its own, to get an arrest warrant five minutes after he landed and execute on it immediately. That has other fingerprints all over it in my view.



Recent support: the Kremlin yesterday arranged big protests in Moscow demanding his release. The Kremlin yesterday also arrested the nephew of the French ambassador, claiming he was dealing drugs (supposedly carrying a package of heroin marked with the label "for distribution in Russia", as if all drug dealers put their intentions in writing), clearly to try to trade him.



> They probably should implement E2EE for everything

Certainly not, because then Telegram would lose a lot of the functionality that makes it great. One thing that I really enjoy about Telegram is that I can have it open and synced across many independent devices. Telegram also has E2EE as an option on some clients, but those chats can't be synced.



Many people think that Telegram tries to be a Signal or Matrix replacement. I don't think Telegram tries to be any of that. If anything, you can compare it to Discord, except much better.

To enable synced E2EE conversations across many devices you also need to sync private keys, which is a security nightmare.



Do you have some info about Durov being arrested for not letting law enforcement see encrypted messages? The public info says he was arrested for "...lack of moderation, ...[and] failing to take steps to curb criminal uses of Telegram."

I don't see anywhere saying he's been arrested for anything to do with encryption or cooperating with investigations.

eg https://www.bbc.co.uk/news/articles/ckg2kz9kn93o but pretty much all the sources I have read say the same



Well of course, but this is a feature of Telegram. It's the only messaging app where messages are stored in the cloud. This of course has security implications, but it also allows you to have a large number of chats without wasting your device's memory like WhatsApp does, or having to delete old conversations, and allows you to access your chats from any device. By the way, you can also set a password required to log in from another device (two-factor authentication; WhatsApp now has this option too).

To me it's a good tradeoff, of course I wouldn't use Telegram for anything illegal or suspect.



But that's literally the entire point of this article. That is, in this day and age, when people talk about "secure messaging apps" they are usually implying end-to-end encryption, which Telegram most certainly is not for the vast majority of usages.



Many companies in the industry mislead users about encryption and just try to use it as a buzzword to attract customers. Take Apple, for example. Apple cloud backups are not E2E encrypted by default (like Telegram chats), and even if you opt into E2E encryption, the contact list and calendar won't be E2E encrypted anyway [1].

Yet Apple tries to create an image that the iPhone is a "secure" device, but if you use iCloud, they can give your contact list to the government any time they want.

Apple by default doesn't use E2E for cloud backups, and Telegram doesn't use E2E for chats by default. So Telegram has a comparable level of security to that of the leaders of the industry.

[1] https://support.apple.com/en-us/102651



Also, iMessage is very secure...but then all your stuff is backed up on iCloud servers unless you specifically disable it. That includes all your iCloud encryption keys and plaintext messages.

Worse, iPhones immediately start backing up to iCloud when set up for a new user - the only way to keep your network passwords and all manner of other stuff from hitting iCloud servers is to set the phone up with no network connection or even a SIM card installed.

Did I mention there's no longer a SIM slot, so you can't even control that?

And that iPhones by default if they detect a 'weak' wifi network will switch to cellular, so you can't connect the phone to a sandboxed wifi network?

You shouldn't have to put your phone in a faraday cage to keep it from uploading plaintext versions of your private communications and network passwords.



If that is the correct default, then why is Telegram blamed for having non-E2EE chats by default? Maybe they also care about users who could accidentally lose their conversations. When Apple does it, it is good, but when Telegram or TikTok do the same, it is bad and not secure.



because telegram and its users heavily insinuate it's comparable to Signal rather than TikTok.

right on their front page in giant font they declare "private" and "secure" when they're neither. it's telegram's own fault they receive this criticism repeatedly, and they strangely complain every time they're publicly spanked and taken to task. they're heavily insinuating (i call it lying) to their users and then over and over crying because they get called out.

if they don't want to be called out then they should quit insinuating those things; it's dangerous af. they know they're lying though, obviously they won't stop. but omg i wish their users would run fast and run far. it's like watching an abused person who keeps going back to their abusive partner: "oh they mean well"… pffft, no, they really don't.



They are stored encrypted but whether Apple has the key depends on whether you've turned on "Advanced Data Protection" (aka "I don't expect Apple to bail me out when I lose access to all my devices"). The table in this support article details the treatment of various data categories under the two options:

https://support.apple.com/en-us/102651

The default for many categories is that your keys are in iCloud so Apple can recover them for you. With Advanced turned on, the keys are only on your personal devices. A few categories, like the keychain, are always only on your devices.

Specifically, see Note 3: "If you use both iCloud Backup and Messages in iCloud, your backup includes a copy of the Messages in iCloud encryption key to help you recover your data." Under normal protection, Apple has the key to your backups, but with Advanced they don't.



And even "advanced" protection is not advanced enough to protect your calendar and contact list from the government (under the silly excuse that Apple uses standard protocols for those data).



Which is one of the best features. I wouldn’t mind having an option to disable it, but then you also don’t get the advantage of others’ phones finding your device.



Can I enroll my personal iPhone in MDM myself? And if I can have MDM with just my personal phone, do I need to buy some kind of subscription for it from Apple? Or pay some third-party?

I thought MDM was only for enterprise businesses and schools and universities, but I may very well be mistaken about that.



This saved me one time when I was gifted an Apple TV without a remote.

No way to add a WiFi profile, thus no way to use an iPhone as a remote. No ethernet available either.

Configured a WiFi profile, uploaded to the Apple TV and could finalize the setup.

It’s quite a powerful tool for initial setup.



^^^ Highly recommend this. If you are technical enough, a family-managed Apple Configuration is more than enough protection for most situations and from most threat actors.

If your threat actor has the resources to break that, get a CC or a good lawyer on retainer, I guess.



> It's the only messaging app where messages are stored on the cloud.

Besides Slack and Discord and Teams and whatever the heck Google has these days and iMessage and...

I think you mean it's the only messaging app that purports to have a focus on security where messages are stored in the cloud, which is true, but also sus. There's a reason why none of the others are doing it that way, and Telegram isn't really claiming to have solved a technical hurdle that the E2E apps didn't, it's just claiming that you can trust them more than you can trust the major messaging apps.

Maybe you can and maybe you can't, the point is that you can't know that they're actually a safer choice than any of the other cloud providers.



Matrix also keeps your message on the server. Except you can run your own server. And the messages are end to end encrypted. And you can keep a proper backup of the keys.

Granted it can be clunky at times, but the properties are there, and decentralised end-to-end encrypted messaging is quite an incredible thing. (Yes, Matrix nerds, it's not messaging per se, it's really state replication, I know :))



As you alluded to, Matrix has really horrible UX. Telegram is meant to be easy for the many to use: finding content in chats or even globally across public channels for example is intuitive and snappy because their server does the heavy lifting. That's a huge sell for many, myself included.



My Matrix messages are, I presume, not encrypted, because every device I have prompts me to sign this device's keys with the keys of another device (which doesn't exist) and the option to reset the encryption keys and lose access to old messages doesn't work either (it just crashes Element).



>it's just claiming that you can trust them more than you can trust the major messaging apps.

All the cool kids on the block eliminated the need to trust the provider decades ago. PGP: 33 years ago, OTR: 20 years ago, Signal: 14 years ago.



You have to trust the provider with Signal; they are fiercely anti-third-party clients, control the network, and have released versions of the code that are not tracked in the public sources; in extreme cases we're aware of years-old code being in there (MobileCoin, for example).

Signal evangelism needs to halt; you mean the Whisper protocol.



You can’t know what’s running on your client. Reproducible builds aren’t reproducible, open source was not followed (there was code in the client that was not present in the repos).

So, yes, trust is needed.



No serious project wants to collaborate with a bunch of hobbyist projects who may or may not keep their code up-to-date. Years ago, the Matrix ecosystem was a prime example of even basic features like end-to-end encryption being in many cases missing.

Having a single client gives you an insane boost in security agility over decentralized alternatives.

Feel free to strive towards functional decentralized ecosystem that feels as good to use, then switching will be a no-brainer.



I don't completely agree. I am perfectly fine with there being multiple options for various use cases. Signal has its place. So does Telegram for that matter. Even Whatsapp..

That said, what I would love to see ( and likely won't at this point ) is the world where pidgin could exist again, because everyone is using some form of sensible standards that could be used.. right now it is mostly proprietary secret mess of things.

And don't get me started on convincing anyone in group to moving from one ecosystem to another. Fuck, I just want email for chat that is not owned by one org.. Is it really so much to ask ( it is rhetorical, I know the hurdles are there and only some deal with human nature )?



(On Android), if you don't care about the (old) WhatsApp media, just delete it from your phone. It's all just loose files in `/storage/android/data/com.whatsapp` (or thereabouts). The text content of the chats will remain available.



This is such a misrepresentation. Telegram could, at will, feed the cloud-2FA password to a password-hashing function like Argon2 to derive a client-side encryption key. Everything could be backed up to the cloud in an encrypted state only you can access. Do they do that? No.

So it's not so much a trade-off as it is half-assed security design.
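
A minimal sketch of the derivation described above. Since Argon2 (named in the comment) needs a third-party library, stdlib `scrypt` stands in here as a comparable memory-hard KDF, and the cost parameters are illustrative only:

```python
import hashlib
import os

def derive_backup_key(cloud_2fa_password: str, salt: bytes) -> bytes:
    # Memory-hard KDF turns the cloud-2FA password into a 32-byte key that
    # only ever exists client-side; the server would store just the salt.
    return hashlib.scrypt(cloud_2fa_password.encode(), salt=salt,
                          n=2**14, r=8, p=1, dklen=32)

salt = os.urandom(16)                      # public, stored next to the backup
key = derive_backup_key("hunter2", salt)   # would feed an AEAD cipher client-side
```

Everything uploaded would then be encrypted under `key`, which the server cannot reconstruct without the password.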



Telegram currently has very intuitive and snappy search, even in very active groups with years of content. That's because the heavy lifting is done by the server. Think that'd still be possible if there was no way for the server to process the data?



Grep is an inefficient search engine, because it needs to scan through the whole content (and Telegram uses search indexes). Also, grep cannot deal with word forms and inflections (you type "foot" and you also want to find "feet"). Inflections are not very important in English, but you need to deal with them in other languages where a word can have many forms.
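
The scan-versus-index difference can be shown with a toy inverted index; real engines additionally stem tokens with a language-aware analyzer so "feet" can match "foot":

```python
from collections import defaultdict

def build_index(messages):
    # Toy inverted index: token -> set of message IDs. A query becomes one
    # dict lookup instead of a grep-style scan over all message content.
    index = defaultdict(set)
    for msg_id, text in messages.items():
        for token in text.lower().split():
            index[token].add(msg_id)
    return index

msgs = {1: "my foot hurts", 2: "six feet under", 3: "foot note"}
index = build_index(msgs)
```

Note that without stemming, `index["foot"]` finds messages 1 and 3 but misses message 2.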



I'm not trying to claim Telegram uses grep or whatever. My point is that even very active chats on Telegram generate a somewhat small amount of text data, and I don't believe that searching through it requires a massive, complex search engine with a super-fast backend.

I participate in hundreds of chats and the message history doesn't take tens of GBs. And I also know that search in the history of such chats isn't so snappy on an older Android phone.



Not at all. Try searching 500/1000 sources (maximum number of conversations any free/premium user can be part of), each with potentially millions of messages, and providing the results in under a second.



AFAIK Telegram doesn't have any super-advanced search features, nor does it instantly return results for all these years of history. Also, if you search for less common terms it usually takes longer than a second.

And if you just run the client on a device without a lot of this history cached, search won't be anywhere near as fast as you expect. So I'm pretty sure there's no server-side magic there, but instead very good UX.

Also, I can tell for certain that with the right index, grepping tons of JSON can be very effective on any modern device.



> Also, I can tell for certain that with the right index, grepping tons of JSON can be very effective on any modern device.

But to run local search you need to download the conversations to the device first, which might require a lot of (expensive) traffic.



Yeah, try searching anything older than a year; the amazing snappy search grinds to a halt. Meanwhile I'm storing years' worth of stuff on Signal with no issues, and it searches ridiculously fast offline, with no seconds-long pause for buffering.



So interesting. I just did a search for mentions of someone I know in multiple Telegram groups and channels, and got all the results, going back 5 years, instantly. And these groups and channels have millions of messages. All media is also perpetually available (unless deliberately deleted), and take a couple seconds to load. I don't see any other platform having that kind of convenience.



Apple could also use E2E for their cloud backups by default, but they don't (and if you enable E2E, it doesn't apply to contact list and calendar backup anyway). Why do you demand more from Telegram than from Apple or Google?



Cryptography is nightmare magic math that cares about the color of the pencil you write it with.

It's not enough you know how to design a cipher that is actually secure, you need to know how to implement it so that the calculator you run it on consumes exactly the right amount of time, and in some cases power, per operation.

Then you need to know how to use the primitives together, their modes of operation, and then you get to business, designing protocols. And 10% of your code is calling the libraries that handle all that stuff above, 90% is key management.

There's a good amount of misuse-resistant libraries available, but Nikolai was too proud to look into how the experts do this, and he failed even with trivial stuff: he went with SHA-1 instead of SHA-256. He didn't implement proper fingerprints. His protocol wasn't IND-CCA secure. He went with weird AES-IGE instead of AES-GCM, which is best practice. He used weird nonces with finite-field Diffie-Hellman instead of going with more robust stuff like x25519.
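
As one concrete illustration of the fingerprint point, here is a generic full-width SHA-256 key fingerprint, grouped for comparison by eye. This is a sketch of the general technique, not Telegram's or any specific protocol's actual scheme:

```python
import hashlib

def key_fingerprint(public_key: bytes) -> str:
    # Full 256-bit SHA-256 digest of the key material. Truncated SHA-1
    # fingerprints leave far less collision margin for an attacker to beat.
    digest = hashlib.sha256(public_key).hexdigest()
    return " ".join(digest[i:i + 8] for i in range(0, len(digest), 8))
```

Both parties compute this locally over the exchanged public keys and compare out of band.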

One thing you learn early in academia, is that expertise is very narrow. I bet he knows a lot about geometry. Maybe even quite a bit about math in general. But it's clear he doesn't know enough to design cryptographic protocols. The cobbler should have stuck to his last.

EDIT, to add: the real work with cryptographic protocols starts with designing everyday things that seem easy on paper, with cryptographic assurance. Take group management that the server isn't controlling.

For Telegram it's a few boolean flags for admin status and then it's down to writing the code that removes the user from the group and prevents them from fetching group's messages.

For Signal it's a 58 page whitepaper on the design of how that is done properly https://eprint.iacr.org/2019/1416.pdf

This is ultimately what separates the good from the bad, figuring out how to accomplish things with cryptography that first seem almost impossible to do.



> Well of course, but this is a feature of Telegram. It's the only messaging app where messages are stored on the cloud.

Wrong, Matrix does it too, but fully e2ee.

> and allows you to access your chats from any device.

No it doesn't, because it is possible with E2EE as well



Not really. WhatsApp only keeps them temporarily (and E2EE!) until they're delivered to each device. Signal too. Telegram keeps everything for all time. Which is kinda handy too, I have to say.

Of course you can send your backup to Google for WhatsApp and Signal, but that's optional. You can keep it locally too. And it's encrypted too. With WhatsApp you can even choose to keep the key locally only.



WhatsApp? The closed source app that AFAIK has never been externally audited, owned by one of the most privacy-disrespecting corporations in the world? You say I can trust it wholeheartedly as long as I don't upload backups to the cloud?



I'm probably dumb, but why would that be proof?

I upload encrypted backups to a cloud service provider (AWS, Google Cloud). I go to another computer, download them, use a key/password to decrypt them.

Sure, I get it, you're typing in something that decrypts the data in their app. That's true of all apps, including WhatsApp, etc. The only way this could really be secure is if you used a different app to do the encryption, one that you wrote/audited, such that the messaging app never has access to your password/private key. Otherwise, at some point, you're trusting their app to do what they claim.



> > using the password recovery flow

> use a key/password

The previous poster intentionally mentioned the password recovery flow. If you can gain access without your password, then law enforcement can too. If you could only gain access with your password, you could consider your data safe.



> If you could only gain access with your password, you could consider your data safe.

You can't assume the negation.

If you can get access without your password, then you have proven that law enforcement or the hosting company can too.

If you can't get access then you haven't proven anything. They may be securely storing your data end-to-end encrypted. Or they may just have a very strict account recovery process but the data is still on their servers in the clear.
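
The asymmetry above can be written down as a tiny decision rule. This is a hypothetical helper that just encodes the reasoning, not anything from an actual product:

```python
def recovery_verdict(recovered_with_only_password_reset: bool) -> str:
    # Failing the "mud puddle" test is conclusive; passing it proves nothing.
    if recovered_with_only_password_reset:
        return "provider can read your data (and so can law enforcement)"
    return "inconclusive: maybe E2EE, maybe just a strict recovery process"
```

Only the positive branch is evidence; the negative branch is compatible with either design.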



That's it. The article could be just that. You log back in and all your messages are there without you having to provide a secret or allow access to some specific backup? Your data just lives on the server. The only thing preventing anyone from accessing it is the goodwill of the people running the server.



Not true. Secret chats only live on the device where you started them. Regular people may not use them (their problem), but these are common for business-critical chats in my circles.



Indeed and this is the other thing - even if Telegram don't themselves co-operate with law enforcement, it'd be fairly easy for law enforcement to request access to the phone number from the carrier, then use it to sign into the Telegram account in question and access all of the messages.



You can set a password that’s required to authenticate a new device.

Once that’s set, after the SMS code, then (assuming you don’t have access to an existing logged in device because then you are already in…), you can either reset the password via an email confirmation _or_ you can create a new account under that phone number (with no existing history, contacts, etc).

If you set a password and no recovery email, there is no way for them to get access to your contacts or chat history barring getting them from Telegram themselves.



Offhand, this sounds like a terribly insecure workflow but...

Client creates a Public Private key pair used for E2EE.

Client uses the 'account password (raw)' as part of the creation of a symmetric encryption key, and uses that to encrypt and store the SECRET key on the service's cloud.

NewClient signs in, downloads the encrypted SECRETKeyBlob and decodes using the reconstructed symmetric key based on the sign in password. Old messages can then be decoded.

-- The part that's insecure. -- If the password ever changes the SAME SECRET then needs to be stored to the cloud again, encrypted by the new key. Some padding with random data might help with this but this still sounds like a huge security loophole.

-- Worse Insecurity -- A customer's device could be shipped a compromised client which uploads the SECRET keys to requesting third parties upon sign-in. Those third parties could be large corporations or governments.

I do not see how anyone expects to use a mobile device for any serious security domain. At best average consumers can have a reasonable hope that it's safe from crooks who care about the average citizen.
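The wrapping flow described above can be sketched roughly as follows. This is an illustrative toy, not any real app's implementation: the key derivation uses PBKDF2 from Python's standard library, and the "cipher" is a hash-based XOR keystream standing in for a real AEAD like AES-GCM (which the stdlib lacks).

```python
import hashlib
import os
import secrets

def derive_wrapping_key(password: bytes, salt: bytes) -> bytes:
    # Derive a symmetric wrapping key from the sign-in password.
    return hashlib.pbkdf2_hmac("sha256", password, salt, 200_000)

def keystream(key: bytes, nonce: bytes, length: int) -> bytes:
    # Toy keystream (NOT a real cipher; a stand-in for AES-GCM).
    out, counter = b"", 0
    while len(out) < length:
        out += hashlib.sha256(key + nonce + counter.to_bytes(4, "big")).digest()
        counter += 1
    return out[:length]

def wrap_secret(secret_key: bytes, password: bytes) -> dict:
    # Encrypt the E2EE SECRET key under a password-derived key;
    # the resulting blob is what gets stored on the service's cloud.
    salt, nonce = os.urandom(16), os.urandom(16)
    wk = derive_wrapping_key(password, salt)
    ks = keystream(wk, nonce, len(secret_key))
    blob = bytes(a ^ b for a, b in zip(secret_key, ks))
    return {"salt": salt, "nonce": nonce, "blob": blob}

def unwrap_secret(stored: dict, password: bytes) -> bytes:
    # What NewClient does after downloading the encrypted SECRETKeyBlob.
    wk = derive_wrapping_key(password, stored["salt"])
    ks = keystream(wk, stored["nonce"], len(stored["blob"]))
    return bytes(a ^ b for a, b in zip(stored["blob"], ks))

secret = secrets.token_bytes(32)          # the E2EE private key
stored = wrap_secret(secret, b"hunter2")  # uploaded to the cloud
assert unwrap_secret(stored, b"hunter2") == secret
```

Note that this sketch makes the insecurity mentioned above concrete: the server holds `blob`, so a client compromised at sign-in (or a weak password plus offline brute force) exposes the SECRET key.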



> When you regain consciousness you'll be perfectly fine, but won't for the life of you be able to recall your device passwords or keys

You can't use your password as input to the mud puddle test.



Telegram has secure calls and secure e2e private chats. All other chats are backed up to the cloud. So if you intend to use private communication, the answer is "no"; if you don't care, the answer is "yes".



How to do that on initial account creation:

- locally create a recovery key and use it to wrap any other essential keys

- Split that or wrap that with two or more keys.

- N - 1 goes to the cloud to be used as MFA tokens on recovery.

- For the other, derive keys from normalized responses to recovery questions, use Shamir's secret sharing to pick a number of required correct responses and encrypt the Nth key.

You can recover an account without knowing your original password or having your original device.
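The k-of-n step in the scheme above is textbook Shamir secret sharing over a prime field. A minimal sketch with hypothetical parameters (any 3 correct responses out of 5 recovery questions), simplified to raw field elements rather than keys derived from normalized answers:

```python
import secrets

# Shamir's secret sharing over a prime field (toy parameters).
PRIME = 2**127 - 1  # a Mersenne prime, large enough for a 16-byte secret

def split(secret: int, n: int, k: int):
    # Random polynomial of degree k-1 with constant term = secret.
    coeffs = [secret] + [secrets.randbelow(PRIME) for _ in range(k - 1)]
    shares = []
    for x in range(1, n + 1):
        y = 0
        for c in reversed(coeffs):      # Horner evaluation mod PRIME
            y = (y * x + c) % PRIME
        shares.append((x, y))
    return shares

def recover(shares):
    # Lagrange interpolation at x=0 recovers the constant term.
    secret = 0
    for i, (xi, yi) in enumerate(shares):
        num, den = 1, 1
        for j, (xj, _) in enumerate(shares):
            if i != j:
                num = (num * -xj) % PRIME
                den = (den * (xi - xj)) % PRIME
        secret = (secret + yi * num * pow(den, -1, PRIME)) % PRIME
    return secret

key = secrets.randbelow(PRIME)   # the Nth key, guarded by recovery answers
shares = split(key, n=5, k=3)    # one share per recovery question
assert recover(shares[:3]) == key
assert recover(shares[2:5]) == key   # any 3 of the 5 suffice
```

Fewer than k shares reveal nothing about the key, which is what lets the scheme tolerate a few forgotten answers without weakening recovery.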



You already know how Signal is going to come out here, because this is something people complain incessantly about (the inconvenience of not getting transcripts when enrolling new devices).



It's a bit unfortunate there isn't a mechanism to establish a key between your desktop and smartphone clients that would allow message history to be synced over an E2EE connection. It's doable, but perhaps it's an intentional safety feature that one can't export the messages too easily.



I agree with the principle here wholeheartedly. One addendum though: I think this isn't quite the same as the mud puddle test. The idea behind the mud puddle test is that if you've forgotten everything but then manage to recover your data, then someone other than you must have had access. With Signal, they intentionally refuse to sync data as an extra security step even if you have the keys; the software just refuses to do the syncing step. I'm glad they do personally, and I'm not contradicting your point, just adding some notes.

Edit: Actually, yeah that proves your point.



This isn't fully accurate. You can backup your Signal messages on Android with an encrypted file and a key you control. So yes, just installing on a new device isn't going to give you history. I'd prefer they offer a universal structure for that backup file so we could easily switch between Android and iOS and have some way to backup your data at all on iOS (presently if anything goes wrong when setting up a new phone you lose your entire message history).



Also the same with Skype "encryption". The data is "encrypted", but you receive the private key from the server upon sign-on... So, one just needs to change that password temporarily.



Unless you can prove (e.g. using your old device or a recovered signing key) that the new device is yours. In that case, if the service supports it, the new device could automatically ask your contacts to re-send the old messages using the new device's public key.



Why not the "founder locked up" test? If the founder claims secure encryption, yet they are not in jail, that means there's no secure encryption because they negotiated their freedom in exchange for secret backdoors.



Maybe, but it's not a good litmus test. If it's truly secure and the founder can't provide information because they don't have access to it, it's also possible they can't build a case in most countries.



That isn’t applicable here. Telegram isn’t encrypted and yet they refused to comply with subpoenas. Companies whose customer data is encrypted can truthfully say that they have no way to access it for law enforcement. Telegram can’t.

Maybe in the future, creators of encrypted messaging apps will get locked up. I certainly hope not. But this case doesn’t indicate anything one way or another.



> Companies whose customer data is encrypted can truthfully say that they have no way to access it for law enforcement. Telegram can’t.

I dunno man, kinda seems like you ought to either have a right to privacy or not. Surely there's other ways to make a case, without extraordinarily abusable legal strong-arming.

Why should a wealthy person be able to legally afford encrypted communication on a secure device, when 90+% of people can't because they're poor and tech illiterate?

Does our historically unequal society need more information and rights asymmetry between rich and poor? Between privileged and marginalized?



As I said, tech illiterate - or as likely, legally illiterate.

It's unreasonable to expect most people to intuit the distinction you describe.

However, you don't see wealthy people communicating on insecure devices, because they have people to take care of that stuff.



I'm really not sure what you're referring to. You see lots of wealthy people communicate on insecure devices, and it's quite common for law enforcement to demand and obtain the contents of their communications. "Look at these terrible messages we subpoenaed" is a staple of white collar criminal prosecutions.



* White-collar crimes are estimated to make up only 3% of federal prosecutions.

* White-collar crime prosecutions decreased 53.5% from 2011 to 2021.

* Annual losses from white-collar crimes as of 2021 are anywhere from $426 billion to $1.7 trillion. The wide range here is due to the lack of prosecutions.

* There were 4,180 white-collar prosecutions in 2022.

* It’s estimated that up to 90% of white-collar crimes go unreported.

Etc.

- https://www.zippia.com/advice/white-collar-crime-statistics/

***

Responding by edit due to rate limit:

Guys the connection is clear if you think about it.

High-net-worth individuals use encrypted messaging apps more than the general population, without doubt.

They also have far more resources and abilities to fight a subpoena. It's all distinctly unfair and highly misleading to normal people, for very little real reason and with great potential for abuse.



Most prisoners in the US though are state prisoners (i.e., convicted by a state court) not federal prisoners (by a large margin I think). Lots of people are convicted in a state court for example of showing up at a bank branch with fake id and trying to cash a check. I gather that would be considered a white-collar crime?



Yeah, and the only way to get government to learn why e2ee is important is to show them that if law enforcement can get it, then so can hackers/phishers. We need as many politicians' dark secrets hacked and outed as possible. It should be a whistleblower-protected right codified into law to perform such hacks.



I know this is getting off-topic, but all the discussion about encryption misses an important weakness in any crypto system: the human factor.

I found it interesting that countries like Singapore haven’t introduced requirements for backdoors. They are notorious for passing laws for whatever they want as the current government has a super majority and court that tends to side with the government.

Add on top Telegram is used widely in illegal drug transactions in Singapore.

What’s the reason? They just attack the human factor.

They just get invites to Telegram groups, or they bust someone and force them to handover access to their Telegram account. Set up surveillance for the delivery and boom crypto drug ring is taken down. They’ve done it again and again.

One could imagine this same technique could be used for any Telegram group or conversation.



In my opinion, Telegram is more of a social network than a messenger. There are many useful channels and in many countries, it plays an important role in sharing information. If we look at it from this point of view, e2ee does not seem very important.

We should also not forget that, in the time when all social media (Reddit, X, Instagram etc.) close their APIs, Telegram is one of the only networks that still has a free API.



That's the dangerous part. It's a messaging app that took on the function of a social media platform. It did so without robust security features like end-to-end encryption, yet it advertised itself as heavily encrypted. As Green stated in his blog post, users expect that to mean only the recipient can read what you say, i.e. end-to-end encryption.

Telegram would be fine if it advertised itself as a public square of the internet, like Twitter does. Instead, it lures people into false sense of security for DMs and small group chats, which is what Green's post and thus this thread is ultimately about.

Free API doesn't mean anything until they fix what's broken, i.e. provide meaningful security for cases where there's reasonable expectation of it.



> It's a messaging app that took in the function of a social media platform. It did so without robust security features like end-to-end encryption yet it advertised itself as heavily encrypted.

Do you want to say that social networks must implement E2E? Personally I think it is a good idea, but existing social networks and dating apps do not implement it, so Telegram is not obliged to either.

As for promises of security, everybody misleads users. Take Apple. They advertise that cloud backups are encrypted, but what they don't like to mention is that by default they store the encryption keys in the same cloud, and even if the user opts into "advanced" encryption, the contact list and calendar are still not E2E encrypted under a silly excuse (see the table at [1]). If you care about privacy and security you probably should never use iCloud in the first place because it is not fully E2E encrypted. Also note that Apple doesn't even mention E2E in the user interface and instead uses misleading terms like "standard encryption".

This is not fair. Apple doesn't do E2E cloud backups by default and nobody cares, phone companies do not encrypt anything, Cloudflare has disabled Encrypted Client Hello [2], but every time someone mentions Telegram, they are blamed for not having E2E chats by default. It looks like the bar is set differently for Telegram compared to other companies.

[1] https://support.apple.com/en-us/102651

[2] https://developers.cloudflare.com/ssl/edge-certificates/ech/



> It looks like the bar is set different for Telegram compared to other companies.

I too find it disingenuous. Many people here support a monopoly and privacy nightmare like WhatsApp but somehow, a closed-box implementation of E2EE is automatically better than an app with a proven track record of not selling the user data.



> a social media platform. It did so without robust security features like end-to-end encryption

Most social media platforms don't support e2ee.

Some chat apps do support e2ee but also require a god damn phone number to log in (yeah, so does Telegram); this makes the "encryption" useless because authorities can just ask the telco to hand out the login SMS code.



The author of this article makes the point that social media is its key feature, but they still advertise Telegram as an encrypted messenger. So your messages to friends will be on Telegram, they're there for the social network, and they will be unencrypted because they don't support E2EE for group chats and deliberately hide the "secret chats" function.



Most "normal" people use messaging apps and social media DMs interchangeably.

For instance, two days ago my partner wanted to show me a message her friend sent, went to WhatsApp and couldn't find it, then realized said friend had used Instagram DM for that. Most people don't care enough.



The free API is amazing; I have so many little helper bots that help me automate my life. It's easier and more feature-rich than Twilio or Slack. I made my own stock management bot that eats a screener spreadsheet I upload in the chat and tells me if I should sell my stocks.

There is even that freqtrade bot that runs on telegram, even RSS bots. It really is amazing. So easy to use for chat ops.

I don't know what else you would use the API for.



It’s not encrypted by default, and even if it were encrypted, you should never trust any connected device with anything important. That being said, Telegram is hands down the best communication platform right now. It is feature-rich, with features implemented years ago that are only now being added to other platforms. It has normal chatting/video calls, groups, channels, and unlimited storage in theory, all for free. I just hope it doesn’t go downhill after what happened these last days because there’s no proper replacement that fulfills all Telegram features at once.



Signal has probably the worst UX of any messaging app. It also used to require sharing phone numbers to add contacts, which imo is already a privacy violation.

Telegram is fast, responsive, gets frequent updates, has great group chat, tons of animated emojis, works flawlessly on all desktop and mobile platforms, has great support for media, bots, and a great API, allows edits and deleting messages for all users, and I really like the sync despite it not being e2e.



You’re also not stuck with the official client and all of its decisions like with Signal. In addition to the official Qt and Swift/Cocoa Telegram clients, you can find third party clients written in WinUI and GTK as well as a CLI client, which gives users the choice to use the one that fits their wants/needs best.

I use both on desktop for different people and the desktop Signal client doesn’t hold up well in comparison. In some ways it feels more clunky than the iMessage ancestor iChat did 20 years ago.



Telegram consumes up to 50% of battery charge on iOS, with practically zero daily usage, all energy saving settings enabled, and a single followed channel, whether or not I force close the app or reinstall it. I gave up on trying to make it work, merely installing the fucking app ensures my phone is dead in the morning.



I have a group of 15 friends using it and it barely uses 2% of battery while in use. Either you are just spreading misinformation or you should check your phone for custom wires added by the bad guys.

13 people on iOSes, iPhones from 11s to 15 Pro; 2 Androids.



> Signal doesn't require sharing of phone numbers

It does require a phone number to create an account. That's the reason I do not consider it private, because at least in Germany a phone number can only be activated using a personal ID card, to which it remains connected.



> Signal has probably the worst UX of any messaging app

Really? I don't see any real difference between the UX of WhatsApp and Signal for example. And they're really on-par feature wise.

The only things in your list that are not available on Signal are "tons of animated emojis" and "bots". Recently they also introduced usernames to keep your phone number private. And Signal have had all the other things for a few years now, and with actual security.



> allows edits and deleting messages for all users

And it has those little features like masked text and whatnot; feature-wise, Telegram is just the best. I haven't used Signal in a long time, you can't even edit messages there!?



>It also used to require sharing phone numbers to add contacts

It no longer does. It took them a while because you can't just slap on features like that. It's not a string in a database like with Telegram.

Telegram has great UX because you can build things fast and easy when you don't have to give two shits about the security side of things. You can cover that part with a grass-roots marketing department and volunteering shills.



Telegram is great for large groups. It's better to compare Telegram to Reddit than Signal.

Signal is excellent for tiny groups of known participants. I prefer it over anything else for this use case. The group permissions Signal introduced a few years ago are well suited for that purpose. I've recently started running small groups on Signal with about 100 participants who mostly know each other, but not tightly. The recent addition of phone number privacy makes this feasible.

Once you start moving up in scale you really need moderation tools, and Signal doesn't do so well there. When you have thousands of people and it's open to the public you need to moderate or else bad actors will cause your valuable contributors to leave. Basic permissions like having admins who can kick people out and restricting how new members can join only gets you so far.

The issue is that in Signal there is no group as far as the server is concerned: The state of the group exists only on client devices and is updated in a totally asynchronous manner. As a consequence it is more difficult for Signal to provide such features. For example, Signal currently has no means to temporarily mute users, to remove posts from all group members, easy bots to deal with spam, granting specific users special privileges like ability to pin messages, transferable group ownership as opposed to a flat "admin" privilege, etc.

Think about the consequences of Signal's async nature with no server state: What does it mean to kick someone out? An admin sends a group update message that tells other clients to stop including that user in future messages. Try this: Have a group member just delete Signal and then re-register. Send a message to the group. They're still in the group. You get an identity has changed message. These are really only actionable with people who you know... that is, in tiny groups.

And then, the biggest strengths of Signal, which are its end to end encryption and heroic attempts to avoid giving the server metadata, are less valuable in the context of a large public group: Anyone interested in surveilling the group can simply join it, so you have to assume you're being logged anyway. Signal lacks strong identities as a design choice, so in big groups it's harder to know who you're really talking to like you know that "Joe Example, founder of Foo Project" is @Foo1988 on Telegram and @FooOfficial on X and u/0xFooMan on Reddit.



The worst UX you can provide. Clumsy, slow switching between views, search worse than on WhatsApp, stickers like from 2005, no formatting, no bot API (of course there are a few "hacked" implementations, but is that really the way?), a margin- and padding-bloated UI.

No smooth animations - that's what makes Telegram stand out from everything else here, but maybe not everyone is happy when 6-core phones can't deliver more than 60fps in 2024...

That's what I remember, and yes - mostly these are probably easy-to-fix UI/UX features/bugs, but even with the app being open source, they aren't fixed.



This is one of those questions where it's hard to answer but it's obvious once you use it.

What's the difference between a Fiat and a Ferrari? What's the difference between CentOS and Linux Mint? What's the difference between a McDonald's burger and a Michelin-starred one?

I have friends and groups on both platforms. On Signal, I'm basically just sending messages (and only unimportant ones, like when we're meeting; sending media mostly sucks so I generally only have very dry chats on Signal).

Whereas on Telegram, I'm having fun. In fact it's so versatile, that my wife and I use it as a collaborative note-taking system, archiver, cvs, live shopping list, news app (currently browsing hackernews from telegram), etc. We basically have our whole life organised via Telegram. I lose count of all the features I use effortlessly on a daily basis, and only realise it when I find myself on another app. This is despite the fact that both Signal and whatsapp have since tried to copy some of these features, because they do so badly. A simple example that comes to mind: editing messages. It took years for whatsapp to be able to edit a message (I still remember the old asterisk etiquette to indicate you were issuing a correction to a previous message). Now you can, but it's horrible ux; I think you long press and then there's a button next to copy which opens a menu where you find a pencil which means edit, or sth like that. In telegram I don't even remember how you do it, because it's so intuitive that I don't have to.

Perhaps that's why I find the whole "Telegram encryption" discussion baffling, to be honest. For me, it's just one of Telegram's many extra features you can use. You don't have to use it, but it's there if you want to. I don't feel like Telegram has ever tried to mislead its users that its raison d'être is to be a secret platform only useful if you're a terrorist (like the UK government seems to want to portray it recently).

I get the point about "encryption by default", but this doesn't come for free; there are usability sacrifices that come with it, and not everyone cares for it. Insisting that not having encryption by default mars the whole app sounds similar to me saying that not having a particular set of emojis as the default mars the whole app. It feels disingenuous somehow.



> Perhaps that's why I find the whole "Telegram encryption" discussion baffling to be honest. For me, it's just one of Telegram's many extra features you can use. You don't have to use it, but it's there if you want to.

Well, as soon as you create an e2ee chat, most features are gone for that chat. It doesn't even sync across multiple devices. And e2ee is not available for group chats.

It’s more like they implemented it to check a box …



I second the point about the difference. Can’t tell why, but signal and whatsapp feel just awful ui/ux-wise. And that’s not a habit thing, I’ve used whatsapp before telegram (and still it was unideal). Telegram knows UX-fu and how to grow without being the only player on the board.



I think it's mainly Telegram's native feel (and it is native on every platform it supports afaik). It's even in little trivial things like the rubber band effect on Apple's platforms, then in how smooth the loading of missing stuff from the network is, and finally it's in the design: Telegram is slick.

All those little things combined and when you switch from Telegram to Signal or WhatsApp it feels like going a couple of decades back, or something like that.

Honestly I don't know how much I can trust Telegram and its founder Pavel Durov (I probably shouldn't), but in terms of the quality of software it's unmatched.



> What's in Telegram that you don't see in Signal?

The first feature that comes to mind for me is being able to use multiple devices. Signal only allows using it with one phone. If you add a second device, the first one stops working. You can use a computer and a phone, but not multiple phones. Telegram supports this without any issues. I still struggle to understand this limitation.



It’s easy for telegram to support this since it’s not e2ee. When you create a so called private chat on telegram, this chat is also only available on the device you created it on.



>It’s easy for telegram to support this since it’s not e2ee.

E2EE is not important to me. Continuity of chats and lack of friction in accessing them is important to me.

>When you create a so called private chat on telegram, this chat is also only available on the device you created it on.

Signal is able to do this with my phone and my computer. The one-phone limit seems arbitrary.



User base, large groups (I think the max is 200k members), channels, bots to automate work, animated stickers, video messages (not the calls one), and video/voice calls within the group (not sure if Signal has that), file storage and file sharing, multiple devices without worrying about losing messages -and you might mention the security part and that’s ok, I want the accessibility, if I want security I will look somewhere else- among other features. Those are on top of my head.



Anecdotal evidence, so take this with a grain of salt - I work with a bunch of people from Ukraine and almost all of them exclusively use Telegram to keep up with the news and family back home. From talking to them for a while, it's mostly because it's free, has excellent support for sync across multiple devices (including audio, video and other media), has support for proxies to circumvent any kind of blocking, public channels for news updates.

Honestly it would be better if Telegram dropped the facade of having E2EE. It's generally very low on the priority list of most people anyway, as much as it would hurt anyone reading this, but that's the truth. People are not using it for secure messaging, but for a better UX and reliability.

EDIT: Telegram does require a phone number to sign up.



Actually I was wrong. Just checked and Telegram does require a phone number to sign up. I haven't used it myself much, but was relaying the general reasons why regular people use it.



Yep, but you still need to keep that number activated and reachable whenever you need to activate the Telegram app on a device.

I was using Telegram for one single purpose: a group organizing local meetup events for expats. When I switched smartphones I really didn't want to install an app just for one group and would have preferred using Telegram Web to consult it occasionally. Every time I tried logging in on a computer/smartphone it told me to validate the login from Telegram on my original, now wiped-clean, smartphone. I just gave up.



> You need it to register, but afaik it's not shown to anyone in any way.

Then why is a phone number needed to register? If PII is "not shown to anyone in any way" then it should be completely unecessary to provide it to the service. Do not let that particular wool be pulled over your eyes.



Those who need to dissociate from a number have anonymous SIM cards in abundance. They cost around $2-5 a piece when ordered in bulk.

That said, such a high-tech operation is just a geek's fantasy about spies. When you cross the line where it becomes reality, you're either a very big name with a sudden drug/rape history or a subject for waterboarding, which is the most effective cryptanalysis tool ever invented.



While this is a well-trodden stereotype, and it certainly has merit, not all crimes are Snowden-level crimes against the state. Felonies such as embezzlement, fraud and trafficking are often investigated by exposing the digital trail. Law enforcement most definitely do pull those records with a subpoena. It's often one of the first things done (pull all banking and phone records) and is often a key ingredient in a successful conviction.

Yes, burner sims definitely help evade investigations, but they are harder to get nowadays, depending on jurisdiction. For instance you can't pay cash for a SIM in North America. It has to be a credit card or a bank transfer and that's a form of ID.



Not a single person I know who uses Telegram cares about or thinks of it as e2ee. Whether "techie" or "non-techie" (whatever the definition of that is). People use it because it has a nice interface, was one of the first to have good "sticker" message support (yes, a lot of people care about that kind of stuff), and of course because of the good old network effect.

It's only on HN I ever see people set up Telegram as some supposed uber-secure private app for Tor users and then demolish that strawman gleefully.



Do you read other news sites that mention Telegram or is this an N=1 situation?

Today, on the same topic, another tech site which generally gets a lot of things right (but whoever is responsible for writing about Telegram, or maybe their internal KB, is consistently wrong and doesn't care about feedback) wrote that it is an encrypted chats service: https://tweakers.net/nieuws/225750/ceo-en-oprichter-telegram... ("versleutelde-chatdienst" means that for those fact checking at home)



You could also ask whether they think it's private. And if they say yes, ask them what that means. Does it mean only the sender and intended recipients can read the message, or is it fine if the service has someone check the content? Would they agree with the notion "it's OK that the nudes I send to my SO are up for grabs for anyone who hacks Telegram's servers", or do they think Telegram should plug this gaping hole?

Also, people tend to state they have nothing to hide, when they feel they have nothing to fight with. But I can't count the number of times I've seen a stranger next to me on a bus cover their chat the second I sit next to them. Me, a complete random person with no interest in their life is a threat to them.



You may try sitting near a developer in a completely open-plan office and watching what they are doing, and you'll see a 10x performance drop on average, even while there was zero private material on screen at all times. It helps to realize that people do not always behave logically (we carry a lot of group-instinct legacy), so this doesn't always work as a proper argument.



So they take theatrics over logical evaluation of the situation. Cool. Tell them Durov could have locked himself out of their data and spared himself the trip to behind bars.



Durov is in jail because he is not doing moderation of public chat channels, as far as has been shared. It has exactly nothing to do with encryption or privacy, in both directions (that is, it doesn't in the slightest prove that Telegram doesn't share private data with various states; and E2EE of private chats would not have done one iota to keep him out of jail).



You probably don't use Telegram channels much. There are some drug and prostitution related channels you can search for but they disappear rather quickly or are totally empty.

Christo Grozev shared screenshots of a few CSAM channels yesterday, but if you search for them, they do not seem to exist.

Telegram clearly does less pre-moderation than Facebook, but they are smaller and have less computing power, and they do not seem to rely on the masses of Nigerian moderators that work for $5/day as Facebook does.



Why is he in jail anyway? He's certainly not a pedo, drug-dealing terrorist… so there must be another reason. As to what that is, we can only speculate.

My speculation is that he set too high a price for sharing the private data with France or the USA.



For the past few weeks I've been using Telegram to create my own cool stickers, and when talking with people on WhatsApp (eughh) I find myself struggling to find the words my Telegram stickers would have conveyed.



Amplified by journalists and, most frustratingly to me, even by some techies who just can't be bothered to properly examine all the available facts despite having the technical capability to examine them.



I'd guess (not going to test it, but it feels reasonable) that "almost every non-techie" has only a very vague idea of what E2EE even is, so it's not clear where "the worst part" comes from. Pretty sure the best ideas they have about security come from hacker movies, at best.



BS. The vast majority of non-tech users do not, for the simple reason that they couldn't know even if they cared, and they don't care. Even tech users can't be bothered to read the links to the FAQ on the Telegram site.

There is so much misinformation around Telegram that this alone made me trust it more: if a known liar tries to discredit something, it increases the chances of it being good (I'm talking about the comments here on HN).



I am null at cryptography, but the following does not sound too bad as a default, to be honest. And I think it is misleading to focus solely on E2EE and not mention the distributed aspect.

https://telegram.org/faq#q-do-you-process-data-requests

> To protect the data that is not covered by end-to-end encryption, Telegram uses a distributed infrastructure. Cloud chat data is stored in multiple data centers around the globe that are controlled by different legal entities spread across different jurisdictions. The relevant decryption keys are split into parts and are never kept in the same place as the data they protect. As a result, several court orders from different jurisdictions are required to force us to give up any data.

> Thanks to this structure, we can ensure that no single government or block of like-minded countries can intrude on people's privacy and freedom of expression.

> Telegram can be forced to give up data only if an issue is grave and universal enough to pass the scrutiny of several different legal systems around the world.

> To this day, we have disclosed 0 bytes of user data to third parties, including governments.
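The "split into parts" idea in the quoted FAQ can be illustrated with a minimal XOR-based secret split. This is a hypothetical sketch (the function names are mine, and Telegram's actual scheme is not public): all n shares are required to recover the key, so no single holder can decrypt anything on their own.

```python
import secrets

def split_key(key: bytes, n: int) -> list[bytes]:
    """Split a key into n shares; all n are required to reconstruct it."""
    shares = [secrets.token_bytes(len(key)) for _ in range(n - 1)]
    last = key
    for s in shares:
        # XOR the key with each random share; the remainder is the last share.
        last = bytes(a ^ b for a, b in zip(last, s))
    shares.append(last)
    return shares

def combine_key(shares: list[bytes]) -> bytes:
    """XOR all shares together to recover the original key."""
    key = shares[0]
    for s in shares[1:]:
        key = bytes(a ^ b for a, b in zip(key, s))
    return key

key = secrets.token_bytes(32)
shares = split_key(key, 3)  # e.g. one share per jurisdiction
assert combine_key(shares) == key
assert combine_key(shares[:2]) != key  # any strict subset is useless
```

Note this only raises the bar against jurisdictions acting independently; it does nothing if one party (say, the company itself) can be compelled to collect all the shares.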



You can coherently argue that encryption doesn't matter, but you can't reasonably argue that Telegram is a serious encrypted messaging app (it's not an encrypted messaging app at all for group chats), which is the point of the article. The general attitude among practitioners in the field is: if you have to reason about how the operator will handle legal threats, you shouldn't bother reasoning about the messenger at all.



> you can install a reproducible build of Telegram and be sure it's end-to-end encrypting things.

This is incorrect. The construction for group chats in Telegram is not E2E at all. The construction for DMs is considered dubious by many cryptographers.

It does not matter if you can reproduce a non-E2E-encrypted messaging scheme; you must still trust the servers, which you have no visibility into.

Trustworthy E2E is table stakes for that reason. Reproducible builds aren't, because we can evaluate a bunch of different builds collected in the wild and detect differences in implementation; this is the same thing we'd do if reproducible builds were in effect.

There are lots of reasons splitting jurisdictions makes sense, but you wrote a whole bunch of words that fall back on "hope Telegram doesn't change their protections in the face of governmental violence".



It's interesting that, all these years later, cryptographers can still only be dubious; nobody has actually cracked the implementation (or if they have, they haven't publicized it for whatever reason).



The reproducible build of Telegram lets you evaluate the code doing end-to-end encryption. Once you satisfy yourself it's doing this kind of encryption without implementation-level backdoors, then you don't need to worry about servers reading it (except for #5 above).

I didn't claim it encrypted "group chats". I said "things". If you want me to be specific, the "things" are individual 1-1 end-to-end encrypted chats.



Reproducible builds are not required to evaluate the encryption algorithm used in Telegram.

Software auditors use deployed binaries as a matter of course.

They'd do so even if reproducible builds were on offer, because even then the code and the binary aren't guaranteed to be the same, and validating that they are can be more problematic than the normal case of auditing binaries.



> On the question of balancing privacy and security, there are in fact solutions, but you have to get away from the idea of a centralized police force / centralized government, and think in terms of a free market of agencies, that can decrypt limited evidence only with a warrant and only if they provide a good reason. The warrant could be due to an AI at the edge flagging stuff, but the due process must be followed and be transparent to all

What does this mean? How can "we" move away from centralized states to "a free market of agencies"? How can there be a "market" of police forces, even in principle? Who are the customers in this imagined market? Who enforces the laws to keep it a free market?

At first glance, this sounds like libertarian fan fiction, to be honest, but I am curious.



Have you read the article I link to in that point? After you read it, you'll have a better idea, and then if you have a specific point, we can discuss.



I read it now, and I saw nothing at all about a free market of LEAs, either police or intelligence agencies. It only talks about a silly idea of filming every private scene while relying on magic encryption with keys that are stored... somewhere?... and are somehow only accessible when it would be nice for them to be. It's clearly not a serious idea, so it gives me a good idea of how wild the speculation gets about the broader trends.



Well, at least you got the broad strokes.

Since the parent post was flagged, I don't see any point in continuing; no one will really see this conversation.

I do interviews with top people. I build solutions. I give them away for free. I discuss the actual substance. And in the end it's just flagged before a serious conversation can be had. HN is not what it once was.



> if you have to reason about how the operator will handle legal threats, you shouldn't bother reasoning about the messenger at all.

That's true.

You need to run your own platform people. XMPP is plenty simple, plenty powerful, and plenty safe -- and even your metadata is in your control.

Just self host. There's no excuse in 2024.

Wake up people!

Why should the arrest of someone else affect YOU?



"You need to run your own platform people." What problem does this solve?

I'm someone who's been on the business end of a subpoena for a platform I ran, and narcing on my friends under threat of being held in contempt is perhaps the worst feeling I'm doomed to live with.

"XMPP is ..." not the solution I'd recommend, even with something like OMEMO. Is it on by default? Can you force it to be turned on? The answer to both of those is, as it turns out, "no," which makes it less than useful. (This is notwithstanding several other issues OMEMO has.)



> The answer to both of those is, as it turns out, "no"

This is not true; it depends on the client. Conversations has OMEMO enabled by default.



This is like saying we shouldn't use TCP/IP because it's not encrypted. How it actually works is that encryption is enforced by the application - indeed the only place you can reasonably enforce it. See for example the gradual phasing out of HTTP in browsers by various means.

What this means in practice is that you shouldn't focus on whether XMPP (or Matrix, or whatever) protocols are encrypted, but whether the applications enforce it. Just as there are many web browsers to choose from, there are many messaging apps. Use (and recommend) apps that enforce encryption if that's what you want.



I'm not sure I agree, particularly given that there's some incentive for us to get our relatives using these messenger protocols and clients. The Web made it work because everyone came together and gathered consensus (well, modulo some details) that enforcing HTTPS is, ultimately, a good idea given the context.

So far, I'm not seeing that same consensus from the XSF and client vendors. If the capital investment can be made to encourage that same culture, the comparison can perhaps be a little closer.



The consensus comes from the people using the clients, not from the standards bodies. It's the same for HTTPS, where the users (in this case the server admins) decided it would be a good idea to use encryption.

There are even apps like Quicksy which have a more familiar onboarding experience, using the mobile phone number as the username, while still federating with other standards-compliant servers. There is little reason to use walled-garden apps like Signal these days.



Note in particular that the Ethernet connection to xmpp.ru/jabber.ru's server was physically intercepted by German law enforcement (or whatever-you-think-they're-actually-enforcing enforcement), allowing them to obtain fraudulent certificates through Let's Encrypt and snoop on all traffic. This was only noticed when the interceptors forgot to renew the certificate. https://news.ycombinator.com/item?id=37961166


As if it were that simple. Where are you going to host that self-hosted instance? What protections do you have against law enforcement inspections? Against curious or nefarious hackers? How are you going to convince every single person you interact with to use it?

Gung-ho evangelism rarely converts people the way a reasonable take on the subject does.



  > Just self host. There's no excuse in 2024.

I hate to break it to you, but there are plenty of excuses. We live in a bubble on HN.

May I remind you what the average person is like with this recently famous reddit post:

https://archive.is/hM2Sf

If you want self-hosting to happen, with things like Matrix and so on, the hard truth is that it has to be not just easy for someone who can program, but trivial for someone who says "wow, can you hack into " the moment they see you use a terminal.



You're assuming end-to-end encryption doesn't exist, and that the only way to be safe is to have someone close to you self-hosting.

Self-hosting is terrible in that it gives Mike, the unbeknownst-to-everyone creepy tech guy in the group, 100% control over the metadata of the people close to him: who talks to whom, when, etc. It's much better to either get rid of that with a Tor-only p2p architecture (you'll lose offline messaging) or to outsource hosting to an organization that has no interest in your metadata.

The privacy concern Green raised was the confidentiality of messages. There is none for Telegram, so Telegram should have moderated content for illegal material. They made a decision to become a social media platform like Facebook, but they also chose not to cooperate with the law. Durov was asked to stop digging his hole deeper back in 2013, and now he's reaping what he sowed.



Yes: End-to-end encryption is technically quite difficult, but politically and legally feasible (at least currently, at least in most countries).

Simply not cooperating with law enforcement is technically moderately difficult, but politically and legally impossible.

Between a difficult and an impossible option, the rational decision is to pick the difficult one.



Indeed. Even being charitable and assuming that they're not lying (they say elsewhere that they've shared zero bytes with law enforcement, despite this being demonstrably false), in reality if say, they were to arrest the founder in an EU country (France, perhaps), all they need to do is threaten him with twenty years in prison and I'm sure he'll gladly give up the keys from all the different locations they supposedly have.



A possible implementation using existing infrastructure, where at least the client is open: modify the messaging client so that when it receives multiple private connections it routes every incoming message to all connected members. Now if you have, say, 10 users who want encrypted group chats, have one of them run the modded client too, so that any user connecting to a private chat with that client effectively enters a room with the other users. Of course this requires trust between members, and adding another encryption layer on all clients might turn out to be necessary so that you don't need to worry about whether the carrier is telling the truth (all p2p connections encrypted, etc.).
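The relay idea above can be sketched in a few lines. This is a hypothetical illustration (all names are mine; it is not an actual client mod, and it omits the encryption layers): one member's client fans each incoming message out to every other connected member.

```python
class RelayClient:
    """One trusted member runs this; it turns pairwise chats into a group room."""

    def __init__(self):
        self.members = {}  # member name -> inbox (list of (sender, message))

    def join(self, name):
        self.members[name] = []

    def receive(self, sender, message):
        # Forward the incoming message to everyone except the original sender.
        for name, inbox in self.members.items():
            if name != sender:
                inbox.append((sender, message))

relay = RelayClient()
for m in ("alice", "bob", "carol"):
    relay.join(m)
relay.receive("alice", "hi all")
assert relay.members["bob"] == [("alice", "hi all")]
assert relay.members["alice"] == []  # sender doesn't get an echo
```

The catch, as the comment notes, is that the relay member sees every plaintext message unless a second end-to-end layer is added on top.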



This kills forward secrecy.

IIRC Signal just has each group member send each group message to each recipient using the standard pairwise encryption keys. It's the message's headers that let the recipient know it's intended for the group rather than the 1:1 chat.
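That pairwise fan-out can be sketched roughly as follows. This is a simplified illustration under my own assumptions: a toy XOR function stands in for Signal's real pairwise Double Ratchet sessions, so do not use this "cipher" for anything real.

```python
import secrets

def encrypt(key: bytes, plaintext: bytes) -> bytes:
    # Toy XOR stand-in for a real pairwise session cipher. NOT secure.
    return bytes(a ^ b for a, b in zip(plaintext, key))

def decrypt(key: bytes, ciphertext: bytes) -> bytes:
    return encrypt(key, ciphertext)  # XOR is its own inverse

members = ["alice", "bob", "carol"]
# Each pair of members shares its own key, as in Signal's pairwise sessions.
pair_key = {frozenset(p): secrets.token_bytes(64)
            for p in [("alice", "bob"), ("alice", "carol"), ("bob", "carol")]}

def send_group_message(sender: str, text: str) -> dict:
    """Encrypt the same group message separately for every other member."""
    msg = text.encode()
    return {r: encrypt(pair_key[frozenset((sender, r))], msg)
            for r in members if r != sender}

envelopes = send_group_message("alice", "meet at noon")
for recipient, ct in envelopes.items():
    key = pair_key[frozenset(("alice", recipient))]
    assert decrypt(key, ct) == b"meet at noon"
```

The server only ever sees per-pair ciphertexts; nothing in the ciphertext itself distinguishes a group message from a 1:1 message, which is why the (encrypted) headers carry the group routing information.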



This is pretty much what Matrix does, if I understand correctly.

Additionally, the key is regularly rotated to provide some degree of forward secrecy and to avoid encrypting for people who have left the group chat.



The entire platform is a joke. It pretends to have no identifiers and heavily markets queues (a programming technique) as the solution to the privacy problem.

You ask the authors how they solved the problem of server needing to know to which client connection an incoming ciphertext needs to be forwarded, and they'll run to the hills.

They're lying by omission about their security, and misleading about what constitutes a permanent identifier.



That you don't like the design is well known. But that is not a reason to lie.

You understand the design quite well from our past conversations; you simply don't like the fact that we don't recognize a user's IP address as a permanent user identifier at the protocol level. It is indeed a transport identifier, not a protocol-level identifier of the kind all other messaging networks have for users (in addition to transport identifiers).

The message routing protocol has anonymous pairwise identifiers for the connections between users (graph edges), but it has no user identifiers: messaging servers have no concept of a user, and no user accounts.

Also, we recently added a second step in message routing that protects both user IP addresses and transport sessions: https://simplex.chat/blog/20240604-simplex-chat-v5.8-private...

In general, if you want to engage meaningfully in design criticism, I would be happy to, and it would help; but simply spitting out hate online because you don't like something or somebody is not a constructive approach. You undermine your own reputation and you mislead people.

> You ask the authors how they solved the problem of server needing to know to which client connection an incoming ciphertext needs to be forwarded, and they'll run to the hills

This is very precisely documented, and this design was recently audited by Trail of Bits (in July 2024); we are about to publish their report. So either you didn't understand, or you are lying.

> They're lying by omission about their security, and misleading about what constitutes as a permanent identifier.

You would have to substantiate this claim, as otherwise it is slander. We are not lying about anything, by omission or otherwise. You, on the other hand, are lying here.

That you are spiteful for some reason is not a good enough excuse.

Factually, at this point SimpleX Chat is one of the most private and secure messengers, see the comparisons of e2e encryption properties in SimpleX Chat and other messengers: https://simplex.chat/blog/20240314-simplex-chat-v5-6-quantum...



I wonder if this is practically relevant at all.

Given that users can access their messages without any interaction with people at Telegram, automatic aggregation of the cloud data at single endpoints is already in place.

As a consequence, the data can be accessed from a single jurisdiction anyway.



Wouldn't being forced to give up the password and log in be a violation of the Fifth Amendment, at least in the US? I think the rulings are a mixed bag right now, but it seems like it should fall that way at the end of the day.



The problem with this claim is that it's hardly verifiable. Telegram's backend is closed source, and the only thing you can be sure of is that their backend sees every message in plaintext.



Crypto is really hard. You have to trust that whoever implemented the crypto is smart and diligent, and you have to trust that whoever operates the crypto is smart and diligent, and you have to trust both of those parties.

Centralization means that it's very easy to trust that whoever implements and operates the crypto is smart. Do I trust them? I don't know. I trust myself, but I don't think I am independently capable of operating or implementing crypto - if I want to make assertions like "this is end-to-end-encrypted" and ensure those assertions remain true, I will need a several million dollar a year budget, at a minimum. "Decentralized" means you've got tons of endpoints that need securing, and they can share crypto implementations, but the operations are duplicated. Which means it's more expensive, and you're trusting more operators, especially if you want resiliency.

Yes, something like Signal or Whatsapp means you've got a single point of failure, but something like Matrix, you've got many points of failure and depending on how it's configured every point of failure can allow a different party to break the confidentiality of the system.

Decentralization is great for resiliency but it actively works against reliable and confidential message delivery.



It's always very easy to trust as long as you're allowed to be mistaken in your trust. That's literally how people fall for all kinds of things, including wars, advertising, etc. It's much harder to fool all the people all the time, than corrupt some of the people (the ones in charge) all the time:

https://www.npr.org/sections/parallels/2014/04/02/297839429/...

The mistake Moxie makes (and you do as well; you should really click on the links I posted to understand why) is assuming that "no one wants to run a server". In fact, an entire industry of professional hosting companies exists for WordPress, Magento, etc. It's a free market of hosting.

You can't trust the software they're hosting, that's true. Which is why we have things like Subresource Integrity on the Web, IPFS, and many other ways to ensure that the thing you're loading is in fact bit-for-bit the same as the thing that was just audited by 3 different agencies, and battle-tested over time.
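Subresource-Integrity-style verification boils down to comparing a hash of the fetched content against a pinned digest. A minimal sketch of the idea (the helper names are mine, not any browser API):

```python
import base64
import hashlib

def sri_digest(content: bytes) -> str:
    """Compute an SRI-style sha384 digest string for a resource."""
    return "sha384-" + base64.b64encode(hashlib.sha384(content).digest()).decode()

def verify(content: bytes, expected: str) -> bool:
    """Accept the resource only if it matches the pinned digest bit-for-bit."""
    return sri_digest(content) == expected

script = b"console.log('audited build');"
pinned = sri_digest(script)  # published alongside the audit
assert verify(script, pinned)
assert not verify(b"tampered build", pinned)
```

Any host can then serve the bytes; as long as the digest is pinned by the page (or by an audit), a tampered copy is rejected regardless of who hosted it.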

Think UniSwap. I'd rather trust UniSwap with a million dollars than Binance. I know exactly what UniSwap will do, both because it's been audited and because it's been battle-tested with billions of dollars. No amount of "trust me bro" will make me trust Binance to that extent. The key is "Smart contract factories":

https://community.intercoin.app/t/intercoin-smart-contract-s...

In short, when you decouple the infrastructure layer (people running ethereum nodes) from the app layer (the smart contracts) all of a sudden you can have, for the first time in human history, code you can trust. And again, there is a separation of responsibilities: one group of people runs nodes, another group of people writes smart contracts, another group audits them, another makes front-end interfaces on IPFS, etc. etc. And they all can get paid, permissionlessly and trustlessly.

Look at Internet Computer canisters, for instance. Or the TON network smart contracts. There are may examples besides slow clunky blockchains today.



What do web3 and crypto moneys have anything to do with the discussion?

Decentralized protocols have existed for a very long time. Email has existed since the 70s. The telephone is also arguably decentralized and has existed for even longer.



The technology has potential to be decentralized, but telephones were famously considered a "natural monopoly" and ended up centralized under Ma Bell.

Government split Ma Bell into multiple smaller pieces, but they still operated as a cartel and kept prices high. They had centralized telephone switchboard operators etc.

It is only when the authors of decentralized file-sharing networks like Kazaa (built to get around yet another government-enforced centralized regime: intellectual property, the RIAA, the MPAA, etc.) went clean that we got Skype and other Voice-over-IP consumer products. And seemingly overnight, prices dropped to zero and we got packet-switched networks over dumb hubs that anyone can run.

That's the key. We need to relegate these centralized platforms (X, Meta, etc.) to being glorified hubs running nodes and earning some crypto, akin to IPFS nodes earning Filecoin, or BitTorrent nodes earning BTT, etc.

Everything centralized gets enshittified.

Clay Shirky gave a talk about this in 2005: https://www.ted.com/talks/clay_shirky_institutions_vs_collab...

And Cory Doctorow recently: https://doctorow.medium.com/https-pluralistic-net-2024-04-04...



> and earning some crypto

You are not answering my main concern. Again, you sneak crypto into the discussion. Why?

We have decentralized stuff: email, XMPP, Matrix, the fediverse. All of this works without the web3/crypto stuff. These things are not perfect, including their decentralized aspects (sometimes to the point of doubting that decentralization really works well, though I personally think decentralization is a good thing).

I didn't downvote you, but I suspect this is exactly why you are being downvoted, since you asked. Many of us just think crypto and this web3 stuff is bullshit that gets mentioned totally off-topic, without any convincing link to the discussion, every single time.



Because crypto is literally how entities on a decentralized network get paid in an autonomous network. It's not via cash transfers or bank transfers, or via accounts at some central bank.

Look at FileCoin and IPFS, for instance. Once you automate the micropayments and proofs of spacetime, it becomes a cryptocurrency. And then the providers of services can sell it to the next consumers.

Just because you hear the word "crypto" doesn't mean it's automatically off-topic, when it's literally the thing that is inevitably used by decentralized systems to do proper accounting and reward the providers for providing any services. Without it, you'll still be sitting, as you are, with no viable alternatives to Twitter and Facebook.



> Because crypto is literally how entities on a decentralized network get paid in an autonomous network

That doesn't ring true. What is an autonomous network? These things run on the Internet, largely backed by infrastructure funded with traditional money. Moreover, email, Tor nodes, XMPP servers, Matrix homeservers, fediverse hosts... none of those need cryptocoins to fund themselves, and they are indeed largely and for the most part funded with traditional money. Micropayments are also not something decentralization needs.

Decentralization is way more than just about decentralizing money and many of us don't trust crypto coins.



> it's literally the thing that is inevitably used by decentralized systems to do proper accounting and reward the providers for providing any services.

Or just, like, advertising. ActivityPub, Matrix, PeerTube, Nextcloud, and Urbit are all fully decentralized and let any instance host monetize however they want.

Decentralized services, even for-profit ones, are not synonymous with cryptocurrency. Stop spreading misinformation to promote an unrelated topic.



Urbit uses NFTs as IDs, which can be transferred

"Urbit IDs aren’t money, but they are scarce, so each one costs something. This means that when you meet a stranger on the Urbit network, they have some skin in the game and are less likely to be a bot or a spammer." https://urbit.org/overview

Who pays for the hosting of ActivityPub and Matrix instances?

What if one instance abuses other instances too much? How do you prevent it?

What if some spammer abuses Nextcloud? Oh, look at that, Nextcloud and Sia announce "cloud storage in the blockchain": https://nextcloud.com/blog/introducing-cloud-storage-in-the-...

Now we come to your ActivityPub stuff, including PeerTube. The question is, who pays for storage? What are the economics of storage?

I literally go into detail here: https://community.intercoin.app/t/who-pays-for-storage-nfts-...

I met the founders of LBRY / Odysee and other tokens that are actually being used for actual streaming. LBRY is a genuine utility token being used for instance.

You are totally ignoring the part that people need to get paid for storing stuff, and at the same time the payment needs to happen automatically.

Any other examples?



> Who pays for the hosting of ActivityPub and Matrix instances?

And

> What if one instance abuses other instances too much? How do you prevent it?

Simple: they get blocked by other instances.

How do cryptos change anything about these three questions?

> Oh, look at that, Nextcloud and Sia announce "cloud storage in the blockchain"

That just means Sia wrote a Nextcloud integration for their stuff, and Nextcloud somehow decided to showcase it. That doesn't mean Nextcloud has much to do with blockchain. Nextcloud integrates with anything and its dog.

> What if some spammer abuses Nexcloud?

What kind of spam are you imagining, and how do you think crypto coins are going to solve it? You don't use crypto for this; you use good old system administration, and in particular anti-spam systems, which don't use coins.

> You are totally ignoring the part that people need to get paid for storing stuff, and at the same time the payment needs to happen automatically.

We're not.

First, they don't always need to; some people run stuff out of advocacy, for instance.

Second, getting paid with regular money is not an unsolved problem. There are plenty of options, many of which also come with built-in guarantees against fraud. It's literally how the whole world works. Now, I can't say I'm a huge fan of our financial system, but that's a social issue in need of a social solution, not a technical one.

I'm stopping here; it's pretty clear that I won't get a solid, reasonable argument in favor of crypto here. And since your top comment is flagged to death, nobody is reading us anyway.



I could go on for literal days. The blockchain isn't a panacea, and rationally most solutions don't ultimately settle on a bespoke transactional network with audited consensus protocols. It is stupid, overdesigned, and as a sign of its poor fitness it dies over time. I'm not relaying some "evil villain" speech from someone that wants to see decentralized services die, this is a reality check from your peers who also hate centralization. Borderline intervention if it has to be - you've echoed this same sentiment in several threads while apparently ignoring the self-evident failure of protocols that embody your vision.

This was a cutting-edge and untested concept in maybe 2011. You missed the boat by a decade and a half.

> LBRY is a genuine utility token being used for instance.

Yeah, last I heard of their brand was when a member of my graduating class became a neo-nazi for six months and incessantly uploaded videos detailing his hallucinations to the internet. You're in good company, it sounds like.



Ugh.

> I could go on for literal days

I believe there should be a way to have a rational discussion about this, point by point, maybe threaded. This ain't it. But whatever.

I am not married to blockchains; I have literally criticized them. I have said that smart contracts and distributed systems are what we need, whether that's Internet Computer canisters, SAFE Network datachains, IOTA DAGs, hashgraphs, or Intercloud (something I designed). I don't know why people on HN love to repeat this strawman over and over.

Blockchain is a settlement layer. I don't even say it's needed for day-to-day micropayments. I explain how the systems could work, which could use any smart contracts for their settlement layer, I don't care about the underlying technology for those, but the Web2 elements are there: https://qbix.com/ecosystem#DIGITAL-MEDIA-AND-CONTENT

> The last I heard of [LBRY]

Yeah, they got sued by Gary Gensler's SEC even though their coin was one of the few actually being used as a utility token. They were forced to shut down the entire company, and only the network survived. Similarly with Telegram and the SEC. I will wager that you're reflexively on the side of Gary Gensler and the government "because blockchain", but I would have liked to see this innovation grow, not be killed by governments.



> Many people on HN silently downvote anything that has to do with crypto and decentralization.

I primarily downvote them because I haven't seen anything come out of that space that seems like it's remotely capable of actually achieving decentralization (for which I also see a dire need in today's structure of the Internet and the applications running on it).

95% of the time, these things are built as a Potemkin village of technical decentralization backed up by complete administrative centralization, with the path to actual decentralization "very high on our public roadmap available here, we promise!!!"



I respect someone who downvotes and explains why.

I wish the downvote button required at least a private message to the person explaining why they are being downvoted. (Upvotes could have an optional message.)

Otherwise it's the most toxic feature on HN, as it promotes extreme groupthink activism.



A public message seems better. There's zero accountability in private messages - you can just smash your keyboard. You can't leave such a message if it's public.



The problem with this approach is that it relies on governments accepting your legal arguments. You can say "no, these are separate legal entities and each one requires a court order from a different country" all you want, but you also need to get the courts themselves to agree to that fact.



Maybe hijack the key and the message before they get distributed. Or just go after the pieces themselves if they are held by Chinese or Russian authorities. Or just threaten to close the local data center if they do not collect the pieces from elsewhere, and see if they can be convinced to hand over what they have, regardless of where they put it.

We may be null at cryptography, but handing over both the secret and the key to that secret to the very same party is quite a trusting step, even when they say 'I promise I will not peek or let others peek, pinky promise!' (with an 'except if we have to or if we change our mind' in the small print or between the lines).



> The relevant decryption keys are split into parts and are never kept in the same place as the data they protect. As a result, several court orders from different jurisdictions are required to force us to give up any data.

Or the CEO and owner, staring down the barrel of a very long time in prison, obtains the keys from his employees and provides them to the authorities.

Would he do this? To me, it matters little how much I trust someone and believe in their mental fortitude. I could instead rely on mathematical proofs to keep secrets, which have proven to be far better at it than corporations.



In practice it didn't work either: only one government was needed to arrest the guy. And now all they need is a hammer or some pliers; no need for multiple governments to coordinate.



Well, I'm sure France isn't taking Durov to some black site at this point. But since there's no such thing as distributed computation of a single AES block operation, each server must by definition have access to the server's SQL-database key, and that key can be confiscated from whichever node is interacting with the database. Last I heard the servers in the EU were in the Netherlands, so if needed, perhaps the authorities there will handle it after court proceedings.



Clearly the investigating authorities are not buying that argument because, well, it's completely absurd. Both technically and legally, Telegram are in control of those keys, regardless of where they are hosted.



That's Telegram's CEO saying how he and his employees were "persuaded and pressured" by US FBI agents to integrate open-source libraries into Telegram (1). There are a lot of questions to ask, like whether the open-source libraries are indeed compromised, among other things. I take it that this arrest was the final straw to pressure him to give up and hand over some "needed" data, as all the accusations I read are laughable. Instagram is full of human trafficking and minor exploitation, drug dealers, and worse. The same goes for other social media, and I don't see Elon or Zuck getting arrested. I am confident that this arrest is to obtain specific information, and after that he will be released - or spend 20 years in prison if he doesn't comply.

(1) https://youtu.be/1Ut6RouSs0w?t=1082



Or he's trained in the art of lying

"At St. Petersburg State University, Mr. Durov studied linguistics. In lieu of military service, he trained in propaganda, studying Sun Tzu, Genghis Khan and Napoleon, and he learned to make posters aimed at influencing foreign soldiers."

https://www.nytimes.com/2014/12/03/technology/once-celebrate...

You really think the FBI would casually go to Durov and start telling him which libraries to deploy in his software?

This "they're trying to influence me, that means it's working" 5D chess is the most stupid way to assess the security of anything.

There's nothing to backdoor because it's already backdoored:

Code does not lie about what it does, and Telegram clients' code shows plainly that it does not end-to-end encrypt the data it sends to Telegram's servers. That's the backdoor. It's there. Right in front of you. With a big flashing neon light that says backdoor. It's so obvious I can't even write a paper about it, because no journal or conference would accept me stating the fucking obvious.
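The distinction being drawn fits in a few lines of code: in an end-to-end design the client encrypts before upload, so the server only ever stores ciphertext; in Telegram's default cloud chats the server receives the message content itself, and any encryption at rest uses keys the operator controls. A toy sketch (the hash-based stream cipher here is an insecure illustration, not MTProto, and all names are made up):

```python
import hashlib
import secrets

def keystream_encrypt(key: bytes, data: bytes) -> bytes:
    """Toy XOR stream cipher - illustration only, NOT a real cipher
    (no authentication, no nonce handling)."""
    stream, counter = bytearray(), 0
    while len(stream) < len(data):
        stream += hashlib.sha256(key + counter.to_bytes(8, "big")).digest()
        counter += 1
    return bytes(d ^ k for d, k in zip(data, stream))

# End-to-end model: only the two clients ever hold chat_key.
chat_key = secrets.token_bytes(32)
ciphertext = keystream_encrypt(chat_key, b"meet at noon")

# What the server stores in each model:
server_sees_e2ee = ciphertext              # opaque without the clients' key
server_sees_cloud_chat = b"meet at noon"   # readable by the operator, however
                                           # it is later encrypted at rest
```

With XOR, applying the keystream twice decrypts, so `keystream_encrypt(chat_key, ciphertext)` recovers the plaintext - but only for whoever holds `chat_key`, which in the end-to-end model the server never does.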



Splitting stuff between multiple companies doesn't really protect anyone if the boss of all companies is held hostage.

Also

> To this day, we have disclosed 0 bytes of user data to third parties, including governments.

Didn't they conclude an agreement with the Russian government in 2021?



Metadata is indeed an open issue on Matrix. I believe addressing it is on their to-do list.

Many rooms are not encrypted because they are public rooms, where there would be no point in it. Encryption has been the default for quite a while now.



> I believe addressing it is on their to-do list.

I doubt that it's very high on that list, as the problem seems very hard. Very hard as in: do we even know it's possible? "Metadata" includes a lot of stuff, but the originator, the destination and the timing of the messages, and the participants of a room, are all quite difficult to hide in a federated system.

I do believe there is a plan for getting rid of the association of one user in multiple rooms, but that's but a small bit of metadata. I think it is part of the puzzle for supporting changing homeservers.



I was referring to the metadata that are typical complaints about Matrix, like usernames and reactions.

> "Metadata" includes a lot of stuff, but basically the originator, the destination and the timing of the messages

Indeed. AFAIK, sender/recipient correlation cannot actually be protected at the software level, because packet switched networking necessarily reveals it. The common way I'm aware of to mitigate this problem is at the network level, by trying to avoid common routes that would allow monitoring many users' traffic from any one place.

Concretely, that might mean having everyone use Tor (which some folks suggest already) or going fully peer-to-peer (which some messengers do already, and Matrix has been experimenting with).

Signal tries to improve the situation with Sealed Sender, but I'm pretty confident that can't protect against the Signal servers being compromised, nor against network monitoring. When trying to think of how it's useful at all, the only thing that comes to mind is that it might strengthen the Signal Foundation's position when a government demands logs. (And if that is why they implemented it, I suppose they must be keeping logs, at least for a short period.)
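The point that routing metadata survives payload encryption can be made concrete: in a federated design, the servers relaying a message must be able to read its envelope even when the body is opaque. A sketch with made-up field names (real Matrix federation events differ):

```python
import json

# A hypothetical federated message envelope: the body is an opaque blob,
# but the relaying homeservers still need origin, destination and timing
# to route it - and that is exactly the metadata in question.
envelope = {
    "origin": "alice:server-a.example",       # illustrative addresses
    "destination": "bob:server-b.example",
    "sent_at": 1724630400,
    "body": "af3c...opaque ciphertext...",    # unreadable to the servers
}

wire = json.dumps(envelope)  # what travels between homeservers

# Any server (or network observer) on the path learns who talked to whom,
# and when, without ever decrypting the body.
routing_leak = {k: envelope[k] for k in ("origin", "destination", "sent_at")}
```

Hiding that leak requires changing the routing itself (Tor-style onion routing or peer-to-peer transports), not just encrypting the payload.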

Related:

https://www.ndss-symposium.org/ndss-paper/improving-signals-...



I do wonder if this would hold up, though. If Telegram stored each character of your chat in a different country, would a single country not be able to force them to hand over the data, and either fine them or force them to stop operating if they wouldn't share the full chat? It seems like a loophole, but I don't know what the precedent is.
