---

Following the St. Petersburg attack, the Federal Security Service (FSB), in a move that may ring familiar to many in the United States and Europe, asked Telegram for encryption keys to decode the dead attacker's messages. Telegram said it couldn't hand over the keys because it didn't have them. In response, Russia's internet and media regulator said the company wasn't complying with legal requirements, and the court-ordered ban on accessing Telegram from within Russia followed shortly thereafter. Telegram did, though, enact a privacy policy in August 2018 under which it could hand over terror suspects' user information (though not encryption keys to their messages) if given a court order.

Pavel Durov, Telegram's founder, called on Russian authorities on June 4 to lift the ban. He cited ongoing Telegram efforts to significantly improve the removal of extremist propaganda from the platform in ways that don't violate privacy, i.e., without setting a precedent of handing encryption keys to the FSB.

https://www.atlanticcouncil.org/blogs/new-atlanticist/whats-...
---

Why? I think Google suggests that you send the payload encrypted through the notification. Google then only knows which app to deliver the message to; it doesn't know from whom the message originates (only "a Telegram server") nor what the content is.

Also, you could just send a notification instructing the app to fetch the new message from your server. From the docs:

> Encryption for data messages
>
> The Android Transport Layer (see FCM architecture) uses point-to-point encryption. Depending on your needs, you may decide to add end-to-end encryption to data messages. FCM does not provide an end-to-end solution. However, there are external solutions available such as Capillary or DTLS.

https://firebase.google.com/docs/cloud-messaging/concept-opt...
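A minimal sketch of that second approach, the content-free "wake-up" push. The `message`/`token`/`data` envelope shape matches FCM's HTTP v1 API, but the `event`/`sync` key is a hypothetical convention made up for illustration: the push carries no message content at all, only a hint that the client should sync with the app's own server over its own channel.

```python
# Sketch of a content-free "wake-up" push for FCM's HTTP v1 API.
# Only this envelope transits Google; the client then fetches the
# actual message from the app's own server.

def build_wakeup_push(device_token: str) -> dict:
    """Build an FCM data-only message that carries no content."""
    return {
        "message": {
            "token": device_token,
            # Data-only: no "notification" block, so the OS displays
            # nothing and no message content passes through FCM.
            "data": {
                "event": "sync",  # hypothetical key: "go fetch"
            },
        }
    }

push = build_wakeup_push("device-token-123")
assert "notification" not in push["message"]
assert push["message"]["data"] == {"event": "sync"}
```

Google still sees the token and timing (the metadata discussed below), but never the payload.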
---

Assuming an adversarial relationship, what sort of metadata could Google capture simply by knowing which app was sending the notifications and who was receiving them?
---

Schneier mentioned this late in 2023:

https://www.schneier.com/blog/archives/2023/12/spying-throug...

> Wyden's letter cited a "tip" as the source of the information about the surveillance. His staff did not elaborate on the tip, but a source familiar with the matter confirmed that both foreign and U.S. government agencies have been asking Apple and Google for metadata related to push notifications to, for example, help tie anonymous users of messaging apps to specific Apple or Google accounts.
---

I'm old enough to remember when Signal first implemented cross-device sync using a Chrome plugin.

I'd rather developers issue cautionary warnings than give a false sense of perfect security.
---

Sorry, but I'm someone who's completely out of the loop on these things. What are DEFCON/Black Hat and GITEX about, and why shouldn't you bring your personal phone?

I'm genuinely interested.
---

Do you have some info about Durov being arrested for not letting law enforcement see encrypted messages? The public info says he was arrested for "...lack of moderation, ...[and] failing to take steps to curb criminal uses of Telegram."

I don't see anywhere saying he's been arrested for anything to do with encryption or with cooperating with investigations. E.g., https://www.bbc.co.uk/news/articles/ckg2kz9kn93o, but pretty much all the sources I have read say the same.
---

They are stored encrypted, but whether Apple has the key depends on whether you've turned on "Advanced Data Protection" (aka "I don't expect Apple to bail me out when I lose access to all my devices"). The table in this support article details the treatment of various data categories under the two options:

https://support.apple.com/en-us/102651

The default for many categories is that your keys are in iCloud so Apple can recover them for you. With Advanced Data Protection turned on, the keys are only on your personal devices. A few categories, like the keychain, are always only on your devices.

Specifically, see Note 3: "If you use both iCloud Backup and Messages in iCloud, your backup includes a copy of the Messages in iCloud encryption key to help you recover your data." Under standard protection, Apple has the key to your backups, but with Advanced Data Protection it doesn't.
---

And even "advanced" protection is not advanced enough to protect your calendar and contact list from the government (under the silly excuse that Apple uses standard protocols for those data).
---

Which is one of the best features. I wouldn't mind having an option to disable it, but then you also don't get the advantage of others' phones finding your device.
---

Cryptography is nightmare magic math that cares about the color of the pencil you write it with.

It's not enough to know how to design a cipher that is actually secure; you need to know how to implement it so that the calculator you run it on consumes exactly the right amount of time, and in some cases power, per operation. Then you need to know how to use the primitives together, and their modes of operation, and only then do you get to the business of designing protocols. And 10% of your code is calling the libraries that handle all that stuff above; 90% is key management.

There's a good number of misuse-resistant libraries available, but Nikolai was too proud to look into how the experts do this, and he failed even with trivial stuff:

* He went with SHA-1 instead of SHA-256.
* He didn't implement proper fingerprints.
* His protocol wasn't IND-CCA secure.
* He went with weird AES-IGE instead of AES-GCM, which is best practice.
* He used weird nonces with finite-field Diffie-Hellman instead of going with more robust stuff like X25519.

One thing you learn early in academia is that expertise is very narrow. I bet he knows a lot about geometry. Maybe even quite a bit about math in general. But it's clear he doesn't know enough to design cryptographic protocols. The cobbler should have stuck to his last.

EDIT, to add: the real work with cryptographic protocols starts with designing everyday things that seem easy on paper, with cryptographic assurance. Take group management that the server isn't controlling. For Telegram it's a few boolean flags for admin status, and then it's down to writing the code that removes the user from the group and prevents them from fetching the group's messages. For Signal it's a 58-page whitepaper on the design of how that is done properly: https://eprint.iacr.org/2019/1416.pdf

This is ultimately what separates the good from the bad: figuring out how to accomplish things with cryptography that at first seem almost impossible to do.
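Two of those failures (SHA-1 fingerprints, timing leaks) are standard-library territory today. A minimal sketch, assuming a simple truncated-hash fingerprint scheme (the key bytes and the 16-byte truncation are made up for illustration, not MTProto's actual format):

```python
import hashlib
import hmac

def fingerprint(public_key: bytes) -> bytes:
    """Derive a short key fingerprint with SHA-256 (not SHA-1)."""
    return hashlib.sha256(public_key).digest()[:16]

def fingerprints_match(a: bytes, b: bytes) -> bool:
    # hmac.compare_digest runs in time independent of where the
    # inputs first differ, so the comparison leaks no timing info,
    # unlike a == b, which bails out at the first mismatched byte.
    return hmac.compare_digest(a, b)

alice = fingerprint(b"alice-public-key")
assert fingerprints_match(alice, fingerprint(b"alice-public-key"))
assert not fingerprints_match(alice, fingerprint(b"mallory-public-key"))
```

This is the "misuse-resistant" point in miniature: the safe primitive and the safe comparison are both one import away, which is what makes reinventing them so inexcusable.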
---

Not true. Secret chats only live on the device where you started them. Regular people may not use them (their problem), but these are common for business-critical chats in my circles.
---

> When you regain consciousness you'll be perfectly fine, but won't for the life of you be able to recall your device passwords or keys

You can't use your password as input to the mud puddle test.
---

You already know how Signal is going to come out here, because this is something people complain about incessantly (the inconvenience of not getting transcripts when enrolling new devices).
---

Also the same with Skype "encryption". The data is "encrypted", but you receive the private key from the server upon sign-in... so the operator just needs to change your password temporarily.
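A toy model of why that design fails (all names hypothetical; this is not Skype's actual architecture, just the escrow pattern the comment describes): if the server both holds the private key and controls password authentication, a temporary password reset hands the key to whoever runs the server.

```python
import secrets

class Server:
    """Toy service that escrows users' private keys and releases
    them after password authentication."""
    def __init__(self):
        self.passwords = {}     # user -> password
        self.private_keys = {}  # user -> private key bytes

    def register(self, user: str, password: str) -> None:
        self.passwords[user] = password
        self.private_keys[user] = secrets.token_bytes(32)

    def sign_in(self, user: str, password: str) -> bytes:
        if self.passwords[user] != password:
            raise PermissionError("bad password")
        return self.private_keys[user]  # key handed out on sign-in

    def admin_reset_password(self, user: str, new_password: str) -> None:
        # The operator controls the password database...
        self.passwords[user] = new_password

server = Server()
server.register("alice", "correct horse")
key_seen_by_alice = server.sign_in("alice", "correct horse")

# ...so it can temporarily change the password, sign in as Alice,
# and receive her private key, all without Alice's cooperation.
server.admin_reset_password("alice", "temp")
assert server.sign_in("alice", "temp") == key_seen_by_alice
```

This is exactly the mud puddle test in code form: the user's secret survives a total loss of the user's own credentials, so by construction the provider can read the data too.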
---

Maybe, but it's not a good litmus test. If it's truly secure and the founder can't provide information because they don't have access to it, it's also possible they can't build a case in most countries.
---

* White-collar crimes are estimated to make up only 3% of federal prosecutions.
* White-collar crime prosecutions decreased 53.5% from 2011 to 2021.
* Annual losses from white-collar crimes as of 2021 are anywhere from $426 billion to $1.7 trillion. The wide range here is due to the lack of prosecutions.
* There were 4,180 white-collar prosecutions in 2022.
* It's estimated that up to 90% of white-collar crimes go unreported.

Etc. - https://www.zippia.com/advice/white-collar-crime-statistics/

***

Responding by edit due to rate limit: Guys, the connection is clear if you think about it. High-net-worth individuals use encrypted messaging apps more than the general population, without doubt. They also have far more resources and ability to fight a subpoena. It's all distinctly unfair and highly misleading to normal people, for very little real reason and with great potential for abuse.
---

> It's a messaging app that took in the function of a social media platform. It did so without robust security features like end-to-end encryption yet it advertised itself as heavily encrypted.

Do you mean to say that social networks must implement E2E? Personally I think it is a good idea, but existing social networks and dating apps do not implement it, so Telegram is not obliged to do it either.

As for promises of security, everybody misleads users. Take Apple. They advertise that cloud backups are encrypted, but what they don't like to mention is that by default they store the encryption keys in the same cloud, and even if the user opts into "advanced" encryption, the contact list and calendar are still not E2E encrypted, under a silly excuse (see the table at [1]). If you care about privacy and security you probably should never use iCloud in the first place, because it is not fully E2E encrypted. Also note that Apple doesn't even mention E2E in the user interface and instead uses misleading terms like "standard encryption". This is not fair.

Apple doesn't do E2E cloud backups by default and nobody cares, phone companies do not encrypt anything, Cloudflare has disabled Encrypted Client Hello [2], but every time someone mentions Telegram, they are blamed for not having E2E chats by default. It looks like the bar is set differently for Telegram compared to other companies.

[1] https://support.apple.com/en-us/102651
[2] https://developers.cloudflare.com/ssl/edge-certificates/ech/
---

It's easy for Telegram to support this since it's not E2EE. When you create a so-called secret chat on Telegram, that chat is also only available on the device you created it on.
---

Actually, I was wrong. I just checked, and Telegram does require a phone number to sign up. I haven't used it much myself, but was relaying the general reasons why regular people use it.
---

Do you read other news sites that mention Telegram, or is this an N=1 situation?

Today, on the same topic, another tech site which generally gets a lot of things right (but whoever is responsible for writing about Telegram, or maybe their internal KB, is consistently wrong and doesn't care about feedback) wrote that it is an encrypted chat service: https://tweakers.net/nieuws/225750/ceo-en-oprichter-telegram... ("versleutelde chatdienst" is Dutch for "encrypted chat service", for those fact-checking at home)
---

So they take theatrics over logical evaluation of the situation. Cool. Tell them Durov could have locked himself out of their data and spared himself the trip behind bars.
---

Amplified by journalists, and, most frustratingly to me, even by some techies who just can't be bothered to properly examine all the available facts despite having the technical ability to do so.
---

I know nothing about cryptography, but the following doesn't sound too bad as a default, to be honest. And I think it is misleading to focus solely on E2EE and not mention the distributed aspect.

https://telegram.org/faq#q-do-you-process-data-requests

> To protect the data that is not covered by end-to-end encryption, Telegram uses a distributed infrastructure. Cloud chat data is stored in multiple data centers around the globe that are controlled by different legal entities spread across different jurisdictions. The relevant decryption keys are split into parts and are never kept in the same place as the data they protect. As a result, several court orders from different jurisdictions are required to force us to give up any data.

> Thanks to this structure, we can ensure that no single government or block of like-minded countries can intrude on people's privacy and freedom of expression.

> Telegram can be forced to give up data only if an issue is grave and universal enough to pass the scrutiny of several different legal systems around the world.

> To this day, we have disclosed 0 bytes of user data to third parties, including governments.
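The "keys split into parts" idea in that quote can be illustrated with the simplest splitting scheme, XOR-based n-of-n secret sharing (a sketch only; Telegram has not published which scheme it actually uses): each share on its own is uniformly random noise, and every share is required to reconstruct the key.

```python
import secrets
from functools import reduce

def bytes_xor(a: bytes, b: bytes) -> bytes:
    return bytes(x ^ y for x, y in zip(a, b))

def split_key(key: bytes, n: int) -> list[bytes]:
    """Split a key into n shares; all n are needed to rebuild it."""
    shares = [secrets.token_bytes(len(key)) for _ in range(n - 1)]
    # The last share is the key XORed with all the random shares,
    # so XORing everything back together cancels the randomness.
    return shares + [reduce(bytes_xor, shares, key)]

def combine(shares: list[bytes]) -> bytes:
    return reduce(bytes_xor, shares)

key = secrets.token_bytes(32)
shares = split_key(key, 3)          # e.g. one share per jurisdiction
assert combine(shares) == key       # all three together: key recovered
assert combine(shares[:2]) != key   # any strict subset is just noise
```

Note the limitation the thread goes on to discuss: this protects against subsets of share-holders cooperating, not against a single operator who can be coerced into assembling all the shares itself.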
---

Have you read the article I link to in that point? After you read it, you'll have a better idea, and then if you have a specific point, we can discuss.
---

> The answer to both of those is, as it turns out, "no"

This is not true; it depends on the client. Conversations has OMEMO enabled by default.
---

Note in particular that the Ethernet connection to xmpp.ru/jabber.ru's server was physically intercepted by German law enforcement (or whatever-you-think-they're-actually-enforcing enforcement), allowing them to obtain fraudulent certificates through Let's Encrypt and snoop on all traffic. This was only noticed when the enforcers forgot to renew the certificate. https://news.ycombinator.com/item?id=37961166
---

That you don't like the design is well known. But that is not a reason to lie.

You understand the design quite well from our past conversations; you simply don't like the fact that we don't recognise the user's IP address as a permanent user identifier on the protocol level. It is indeed a transport identifier, not a protocol-level identifier of the kind that all other messaging networks have for their users (in addition to transport identifiers). The message routing protocol has anonymous pairwise identifiers for the connections between users (graph edges), but it has no user identifiers: messaging servers have no concept of a user, and no user accounts. Also, we recently added a second step in message routing that protects both user IP addresses and transport sessions: https://simplex.chat/blog/20240604-simplex-chat-v5.8-private...

In general, if you want to engage meaningfully in design criticism, I would be happy to, and it would help. But simply spitting out hate online because you don't like something or somebody is not a constructive approach; you undermine your own reputation and you mislead people.

> You ask the authors how they solved the problem of server needing to know to which client connection an incoming ciphertext needs to be forwarded, and they'll run to the hills

This is very precisely documented, and this design was recently audited by Trail of Bits (in July 2024); we are about to publish their report. So either you didn't understand it, or you are lying.

> They're lying by omission about their security, and misleading about what constitutes as a permanent identifier.

You would have to substantiate this claim, as otherwise it is slander. We are not lying about anything, by omission or otherwise. You, on the other hand, are lying here. That you are spiteful for some reason is not a good enough excuse.

Factually, at this point SimpleX Chat is one of the most private and secure messengers; see the comparison of e2e encryption properties in SimpleX Chat and other messengers: https://simplex.chat/blog/20240314-simplex-chat-v5-6-quantum...
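The "pairwise identifiers, no user accounts" claim above can be illustrated with a toy relay (a sketch of the general idea only, not SimpleX's actual protocol, wire format, or data structures): the server's routing table is keyed by per-connection queue IDs, and nothing in it links two queues belonging to the same person.

```python
import secrets

class RelayServer:
    """Toy message relay that routes by per-connection queue ID only;
    it has no table of users at all."""
    def __init__(self):
        self.queues = {}  # queue_id -> list of pending ciphertexts

    def create_queue(self) -> str:
        # A fresh random ID per contact connection; the server is
        # never told which queues belong to the same person.
        queue_id = secrets.token_hex(16)
        self.queues[queue_id] = []
        return queue_id

    def deliver(self, queue_id: str, ciphertext: bytes) -> None:
        self.queues[queue_id].append(ciphertext)

    def fetch(self, queue_id: str) -> list[bytes]:
        msgs, self.queues[queue_id] = self.queues[queue_id], []
        return msgs

server = RelayServer()
# Alice creates one inbound queue per contact (say Bob and Carol);
# the server sees two unrelated random IDs, never an "alice" account.
q_bob, q_carol = server.create_queue(), server.create_queue()
server.deliver(q_bob, b"ciphertext-from-bob")
assert server.fetch(q_bob) == [b"ciphertext-from-bob"]
assert q_bob != q_carol
```

The dispute in this subthread is about what remains linkable anyway, e.g. the IP address and transport session over which several queues are accessed, which is what the second routing step linked above is meant to address.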
---

The problem with this claim is that it's hardly verifiable. Telegram's backend is closed source, and the only thing you can be sure of is that their backend sees every message in plaintext.
---

It's always very easy to trust as long as you're allowed to be mistaken in your trust. That's literally how people fall for all kinds of things, including wars, advertising, etc. It's much harder to fool all the people all the time than to corrupt some of the people (the ones in charge) all the time:

https://www.npr.org/sections/parallels/2014/04/02/297839429/...

The mistake Moxie makes (and you do as well; you should really click on the links I posted to understand why) is the claim that "no one wants to run a server". In fact, an entire industry of professional "hosting companies" exists for Wordpress, Magento, etc. It's a free market of hosting. You can't trust the software they're hosting, that's true. Which is why we have things like Subresource Integrity on the Web, IPFS, and many other ways to ensure that the thing you're loading is in fact bit-for-bit the same as the thing that was just audited by 3 different agencies and battle-tested over time.

Think UniSwap. I'd rather trust UniSwap with a million dollars than Binance. I know exactly what UniSwap will do, both because it's been audited and because it's been battle-tested with billions of dollars. No amount of "trust me bro" will make me trust Binance to that extent. The key is "smart contract factories": https://community.intercoin.app/t/intercoin-smart-contract-s...

In short, when you decouple the infrastructure layer (people running Ethereum nodes) from the app layer (the smart contracts), all of a sudden you can have, for the first time in human history, code you can trust. And again, there is a separation of responsibilities: one group of people runs nodes, another writes smart contracts, another audits them, another makes front-end interfaces on IPFS, etc. And they all can get paid, permissionlessly and trustlessly. Look at Internet Computer canisters, for instance. Or TON network smart contracts. There are many examples besides today's slow, clunky blockchains.
---

The technology has the potential to be decentralized, but telephones were famously considered a "natural monopoly" and ended up centralized under Ma Bell.

The government split Ma Bell into multiple smaller pieces, but they still operated as a cartel and kept prices high. They had centralized telephone switchboard operators, etc. It was only when the authors of decentralized file-sharing networks like Kazaa (who built them to get around yet another government-enforced centralized regime of intellectual property: RIAA, MPAA, etc.) went clean that we got Skype and other Voice over IP consumer products. And seemingly overnight, the prices dropped to zero and we got packet-switched networks over dumb hubs that anyone can run.

That's the key. We need to relegate these centralized platforms (X, Meta, etc.) to being glorified hubs running nodes and earning some crypto, akin to IPFS nodes earning Filecoin, or BitTorrent nodes earning BTT, etc. Everything centralized gets enshittified.

Clay Shirky gave a talk about this in 2005: https://www.ted.com/talks/clay_shirky_institutions_vs_collab...

And Cory Doctorow recently: https://doctorow.medium.com/https-pluralistic-net-2024-04-04...
---

Urbit uses NFTs as IDs, which can be transferred.

"Urbit IDs aren't money, but they are scarce, so each one costs something. This means that when you meet a stranger on the Urbit network, they have some skin in the game and are less likely to be a bot or a spammer." https://urbit.org/overview

Who pays for the hosting of ActivityPub and Matrix instances? What if one instance abuses other instances too much? How do you prevent it? What if some spammer abuses Nextcloud? Oh, look at that: Nextcloud and Sia announce "cloud storage in the blockchain": https://nextcloud.com/blog/introducing-cloud-storage-in-the-...

Now we come to your ActivityPub stuff, including PeerTube. The question is, who pays for storage? What are the economics of storage? I literally go into detail here: https://community.intercoin.app/t/who-pays-for-storage-nfts-...

I met the founders of LBRY/Odysee and of other tokens that are actually being used for actual streaming. LBRY, for instance, is a genuine utility token in real use. You are totally ignoring the part where people need to get paid for storing stuff, and at the same time the payment needs to happen automatically. Any other examples?
---

Ugh.

> I could go on for literal days

I believe there should be a way to have a rational discussion about this, point by point, maybe threaded. This ain't it. But whatever.

I am not married to blockchains; I have literally criticized blockchains. I have said that smart contracts and distributed systems are what we need, whether that's Internet Computer canisters, SAFE network datachains, IOTA DAGs, Hashgraphs, or Intercloud (something I designed). I don't know why people on HN love to repeat this strawman over and over. Blockchain is a settlement layer; I don't even say it's needed for day-to-day micropayments. I explain how the systems could work, using any smart contracts for their settlement layer; I don't care about the underlying technology for those, but the Web2 elements are there: https://qbix.com/ecosystem#DIGITAL-MEDIA-AND-CONTENT

> The last I heard of [LBRY]

Yeah, they got sued by Gary Gensler's SEC even though their coin was one of the few actually being used as a utility token. They were forced to shut down their entire company, and only the network survived. Similarly with Telegram and the SEC. I will wager that you're reflexively on the side of Gary Gensler and the government "because blockchain". But I would have liked to see this innovation grow, not be killed by governments.
---

A public message seems better. There's zero accountability in private messages; you can just smash your keyboard. You can't leave such a message if it's public.
---

In practice it didn't work either: only one government was needed to arrest the guy. And now all they need is a hammer or some pliers. No need for multiple governments to coordinate.
---

Or he's trained in the art of lying:

"At St. Petersburg State University, Mr. Durov studied linguistics. In lieu of military service, he trained in propaganda, studying Sun Tzu, Genghis Khan and Napoleon, and he learned to make posters aimed at influencing foreign soldiers." https://www.nytimes.com/2014/12/03/technology/once-celebrate...

You really think the FBI would casually go to Durov and start telling him which libraries to deploy in his software? This "they're trying to influence me, that means it's working" 5D chess is the most stupid way to assess the security of anything.

There's nothing to backdoor because it's already backdoored. Code does not lie about what it does, and the Telegram clients' code doesn't hide that it doesn't end-to-end encrypt the data it sends to Telegram's servers. That's the backdoor. It's there, right in front of you, with a big flashing neon light that says "backdoor". It's so obvious I can't even write a paper about it, because no journal or conference would accept me stating the fucking obvious.
---

I was referring to the metadata that people typically complain about with Matrix, like usernames and reactions.

> "Metadata" includes a lot of stuff, but basically the originator, the destination and the timing of the messages

Indeed. AFAIK, sender/recipient correlation cannot actually be protected at the software level, because packet-switched networking necessarily reveals it. The common way I'm aware of to mitigate this problem is at the network level, by trying to avoid common routes that would allow monitoring many users' traffic from any one place. Concretely, that might mean having everyone use Tor (which some folks suggest already) or going fully peer-to-peer (which some messengers do already, and which Matrix has been experimenting with).

Signal tries to improve the situation with Sealed Sender, but I'm pretty confident that can't protect against the Signal servers being compromised, nor against network monitoring. When trying to think of how it's useful at all, the only thing that comes to mind is that it might strengthen the Signal Foundation's position when a government demands logs. (And if that is why they implemented it, I suppose they must be keeping logs, at least for a short period.)

Related: https://www.ndss-symposium.org/ndss-paper/improving-signals-...
And it only works because a corporation would likely want to offer this to its users as a convenient feature. If they were actively trying to hide it, they could rig the test and keep access to themselves.