(comments)

Original link: https://news.ycombinator.com/item?id=40341716

A user criticized the lack of encryption between themselves and businesses that use Telegram, raising concerns that authorities could delete and manipulate data. They acknowledged Telegram's earlier claims of superior security, but admitted it is not their first choice due to its vulnerabilities and association with privacy concerns. They argued that alternative services like Signal offer better security, transparency, and flexibility. While acknowledging Signal's weaknesses, they prefer it for its open-source status, wider acceptance, and stronger focus on privacy. They concluded by emphasizing the importance of end-to-end encryption and transparency in communication tools.

Related articles

Original article


Another thing that wasn't pointed out: Du Rove said "Signal messages have been exploited against them in US courts or media."

This would be the same case for Telegram as well, if someone has your phone. I believe that Signal can have a lock on the client, and the database is encrypted.

The other part that Du Rove conveniently left out: Signal went up against the US courts and won [0]. When subpoenaed to hand over all user information, they gave them all they had: the unix timestamp of when the account was created, and the last date you connected to the Signal service. That was in late 2021. I'm really curious as to what Telegram has told the FSB.

[0]: https://signal.org/bigbrother/cd-california-grand-jury/



Telegram, IIRC, moved its lead developers to Dubai specifically because the FSB was demanding info from them, so you could argue that's an unfounded concern.

The bigger problem with Telegram is that it has insecure encryption settings by default (as opposed to Signal, where encryption is the default; with Telegram you need to manually activate it, and I think it's not possible to enable it for all chats and clients), and to my knowledge, Telegram will outright cooperate with law enforcement agencies and just hand over unencrypted communications. I'd personally argue that's a security dark pattern - make privacy a big selling point, but then don't activate the security by default.



> security dark pattern - make privacy a big selling point, but then don't activate the security by default

Pretty similar to WhatsApp. They boast end-to-end encryption, but business accounts (all of them now) use Facebook's server-side key, so that the business can give several other clients access to answer customers. They still call it end-to-end encryption, and this was actually the last crap the original founder accepted before leaving with lots of money on the table.



Good for you; you probably do not live in a country under digital colonialism, where the government allowed Facebook et al. to push internet providers into taxing the population with absurdly low and expensive data limits, and then "not counting" things like Facebook, WhatsApp, and one music app against them.

In most of the global south, 100% of businesses have a WhatsApp. In those places it has pretty much replaced the telephone, and the green WhatsApp icon on a storefront next to a number is now, for the current young generation, what the black telephone outline was for us.

And if you own a business, or are self-employed, it is even worse: you live by that app.



The lack of encryption between myself and a business is less offensive than replacing an open standard (plain old telephone systems, email) with a proprietary, closed one backed by a single private corporation.


I completely agree with your sentiment, but I will also say this.

As an expat, this feature has enabled me to transact with locals from the convenience of my phone, even though I don't have any local line and I will not bother to get a local SIM card, nor do I want to juggle a US SIM and a local one interchangeably.

It also enables me to be very effective when requesting services on demand, cutting through the on-hold time, disconnected calls, and the needless chitchat.

I have many bad things to say about WA, but making life in a foreign country more difficult is not one of them.



See, that's the problem.

Telephony is an artificial monopoly. There's not a single reason why you cannot call everyone around the globe for mostly free, as whatsapp proves. Yet here we are.



> Telegram iirc moved it's lead developers to Dubai specifically because the FSB was demanding info from them, so you could argue that's an unfounded concern.

I'd argue it's not giving us any certainty. They could've moved away to escape. They could've moved away to a nice FSB-sponsored location while making good publicity. Ideally the tech should be good enough for this issue to not matter.



EDIT: Disregard below. I'm an idiot when it comes to maps. The statement regarding if the developers are still Russian I believe is still relevant.

Considering the state of Saudi Arabia, having it there is marginally better, but still problematic.

And if the developers are still Russian, there's nothing saying they aren't being squeezed unless their families came with them to Dubai.



> because the FSB was demanding info from them

But they gave the FSB the info it asked for: the vk.com website (a Facebook clone which, at that time, held far larger amounts of user data than Telegram). They could have deleted the data, but no, they handed it over to the FSB.



I will point out, in their defence, they handed it over to an organisation that has a habit of assisting people in learning how to fly from windows. This isn't to say Telegram is secure but that it's unlikely they "could have deleted the data" and remained alive.


FSB mostly wanted to prevent people organizing, and that would serve it well. They already had another popular service (odnoklassniki.ru) where to direct people.


You conveniently forgot about the second part of that comment. Durov was forced out of the country and had to sell vk.com for peanuts because of his refusal to cooperate with the government. He is still pissed off at the country at large (not just the government) and refused to add a Russian translation for years, for example, despite it having absolutely nothing to do with Putin.

Since he is Russian in origin, it's okay to throw baseless accusations at him and spout nonsense like "maybe they're FSB agents" or "maybe they hired an FSB agent without knowing it". You see it here everywhere, and HN is one of the better sites in that regard. Well, maybe Signal has hired an NSA agent and doesn't know about it either? How does that sound?



> Durov was forced out of the country and had to cell vk.com for peanuts because of his refusal to cooperate with the government.

It wouldn't be the first time a cover story was ever used.

> Well, maybe Signal has hired an NSA agent and doesn't know about it either? How does that sound?

You should presume they're trying. I, frankly, presume they've succeeded, either in placing an agent or by compromising something, in virtually every prominent messaging platform.



> I'd personally argue that's a security dark pattern - make privacy a big selling point, but then don't activate the security by default

I think it's a great approach instead: secure end-to-end encryption is there and ready to be used.

You can easily activate it, but you aren't burdened by it for the 99% of the time when e2e encryption is not needed.



> You can easily activate it, but you aren't burdened by it for the 99% of the time when e2e encryption is not needed.

So, in those 1% of the cases when you actually need it, you're instantly flagging yourself as doing something fishy? Because if it ever comes down to it, good luck proving otherwise in a court.

That's like the whole point of why it should be on by default. Not because me making dinner plans is something super-secret that needs to be e2e-encrypted, but because those two scenarios need to be indistinguishable from each other for e2e to be effective.



Yes. Additionally you are at bare minimum signalling that the metadata of the encrypted comms is worth further analysis.

For exactly the same reason if you have a paper shredder, you don't only shred confidential material, you shred a bunch of junk as well to make it harder to find which pieces to reconstruct.



> they gave them all they had: the unix timestamp of when the account was created, and the last date you connected to the Signal service.

I'm confused. Signal also has your phone number because they require it, that's the primary privacy criticism against Signal.



The win in this case was that they had to fight for, and got, permission to publish what they provided.

As nice as it would be to not have to provide that information, Signal proved that the only information they have to give is largely useless to law enforcement.



Durov travels freely to and from Russia and several of their employees are still based in Russia. So yeah, the FSB have leverage if they need to use it.


As I stated in a sister comment, Dubai is marginally better, but not significantly better. If it's the same original developers, they could be squeezed through their family.


Same goes for Signal devs, or any devs really. You're only stating the obvious: humans can be forced and coerced given enough motivation and resources.

Singling out Telegram, or Signal, or any other service's devs is not advancing any argument forward.



There is more reason to be concerned about Telegram than most other similar services.

Partly because it’s insecure by default, which makes a large percentage of conversations vulnerable.

And also because the team behind it is very susceptible to pressure from the Russian government, which is especially bad when it comes to these things. Even if some of them are based out of Dubai now, it doesn’t mean that they aren’t still at risk of coercion, either directly or through for example threats against family members who remain in the country.

If you don’t trust Russia, which you shouldn’t, then don’t trust Telegram with anything sensitive.



Can we trust some more than others without trusting anyone completely?

I for one trust that there are more Americans who would say no to the NSA when they have a legal basis for doing so than there are Russians saying no to the FSB.

The state of the rule of law is certainly not great anywhere in the world right now. But it's far worse in some places than in others. The difference still matters to some degree.



> Can we trust some more than others

No, we cannot, so your comment and others come across as picking favorites. As an Eastern European, I'll state that I distrust the USA as much as I distrust Russia, China, and North Korea. All nations with ambitions use every dirty trick known to man.



They don't have to be friends to turn a blind eye.

If Dubai had to pick between letting some nobody foreign national living on their soil get squeezed by a foreign secret police, or pissing off the Russians, what do you think they would do?

(This isn't a knock on Dubai specifically, substitute them for almost any non-NATO country in the world).



Death is death, whether it's brutal or not, your dead corpse will not care.

Still no idea what your argument is. Or that of the grand-parent comment. "Oh look, Russia is poisoning people!" Yeah, we knew that ever since the Cold War. Show me a nation that doesn't persecute people it doesn't like.



On Android and iOS it uses the OS keystore. It really should do so on desktop as well; Windows, Mac, and Linux (through the freedesktop standard) all have APIs for that, so there really isn't much excuse. Desktop Signal has always had terrible security, unfortunately.


On Signal vs Telegram:

Telegram's encryption is off most of the time; they have server-side access to messages. The optional E2E is annoying to use and isn't even available on every platform. For example, tdesktop AFAIK still has no E2E support (and has a very brittle software architecture). You can't register Telegram accounts with the open source client anymore. This should be a non-discussion.

MG is implying that just because other messengers like WhatsApp use Signal's encryption scheme, that doesn't make them any more trustworthy.

Yes, you can verify in a binary whether the stuff is implemented well. But if a vendor has control over the update channel or beta rollout features, it's kinda easy to hide targeted features. Wasn't WhatsApp caught exfiltrating chats in ways that bypass E2E and don't involve the normal channel?

By the way, there is no Signal in F-Droid, but nowadays there is an upstream-accepted third-party implementation, so you can separate the software vendor from the infrastructure vendor. Look at Molly.im.

Better to bring non tech folk to Signal than to other messengers that do the same but less protected.

Matrix? Lol!



Both services are relatively insecure because they require phone authentication. In the EU at least the number can always be traced back to you if you don't buy specific burner phones.

The level of encryption isn't as important anymore at that point. It is less probable you get into problems by using a service that doesn't know your identity.



> but your phone number isn't visible to anyone you chat with.

That's irrelevant - the phone number is known to Signal and can be requested by law enforcement. And, since it's been made pretty much impossible to buy a SIM in the EU without showing identification [0], this will allow law enforcement to link the account to you.

[0] IIRC the Netherlands is the only country left where you can buy SIMs without ID.



> [0] IIRC the Netherlands is the only country left where you can buy SIMs without ID.

As far as I know, in Romania you can still buy and activate a prepaid SIM card without having to show your ID. There was an attempt a few years ago to make it mandatory to tie the phone number to an ID, but it was overruled by the Constitutional Court.



> That's irrelevant - the phone number is known to Signal and can be requested by law enforcement.

So how does this work? Law enforcement asks signal if they have an account for a phone number, signal saying "yes, here's when they created it".

Then what?



> Law enforcement asks signal if they have an account for a phone number, signal saying "yes, here's when they created it".

Law enforcement says that the suspect chatted with some username/told people to contact him by his Signal username, then they go to Signal and request the linked phone number, which is then linked to the ID shown when the card was bought.



This only works as long as the username is active/unchanged. It would probably be better if usernames were never linkable to phone numbers, but if your threat model requires a persistent, non-ephemeral username to remain anonymous when targeted by law enforcement that has access to your telecom records and warrants... that's going to require a pretty high level of opsec.

The UX on usernames in Signal might be non-ideal. It might be helpful to have a toggle that regularly cycles your username if that's important for your threat model.



> That's irrelevant - the phone number is known to Signal and can be requested by law enforcement.

Maybe I'm missing something here, but if usernames are treated as ephemeral, what's the threat model here?



> That hasn't been the case for Signal for some months

Wrong. Because:

> You still require a phone number for sign up for Signal

So, they have your phone number. What is displayed is irrelevant.

If they have your phone number (which they do), they will have to disclose it for any subpoena/NSL, so they do.



You can buy an "anonymous number" on Fragment without using any client and without providing any personal information, and use it as much as you want.

When Signal becomes even remotely as popular as Telegram, it will implement the same protection to fight spammers, because you can't have free, unrestricted registrations and not drown in spam.

Telegram currently makes it as accessible as possible: either use it freely but register with a phone number and the official app, or pay and use it as anonymously as you want.



I just looked at the fragment.com site to see how much such a number costs. The lowest possible bid you can currently make (and that is for an auction with six days to go, so probably not even the final price) is over $100. That is an unacceptable price for basic privacy.


Signal is already extremely popular, their anti-spam by default is that you need to get matched to the user's local contact list or the spam becomes an allow/deny prompt. They also require a confirmed phone number and handle registration throttling.


* enough valid accounts. If the instance gets popular, you're going to need hundreds of them to get past rate limits, according to the announcements some time ago (maybe the rate limits have changed, though).


Signal's definition of "reproducible" meant for quite a while "download this binary docker image and build Signal inside of it". I don't know if that has changed since.

Signal rejects F-Droid for a different reason, though: They only want to distribute through channels where they get download statistics and control update rollouts.
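Setting aside how the build is produced, what "reproducible" ultimately means can be sketched in a few lines (a toy illustration; the byte strings below stand in for real build artifacts, not actual Signal binaries): an independent rebuild from source must be byte-identical to the published binary, so the two digests match.

```python
# Toy sketch: a reproducible build means two independent builds yield
# byte-identical artifacts, so their cryptographic digests agree.
# The byte strings are hypothetical stand-ins for real APK contents.
import hashlib

def digest(blob: bytes) -> str:
    return hashlib.sha256(blob).hexdigest()

local_build = b"apk-bytes-1.2.3"   # what you compiled from source
published   = b"apk-bytes-1.2.3"   # what the vendor shipped

print("REPRODUCIBLE" if digest(local_build) == digest(published)
      else "MISMATCH")  # → REPRODUCIBLE
```

The Docker-image complaint above is about how hard it is to produce that byte-identical artifact independently, not about the comparison itself.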



F-Droid uses a package maintainer-esque process where the maintainers of F-Droid can intervene and prevent an update to an app from reaching users if it's deemed to be malicious or to add anti-features.

It's of particularly high need on mobile since popular apps, even those who were originally FOSS, are sold to scummy publishers who fill it with ads and subscription schemes (oft called anti-features, since removing them could be seen as a feature in and of itself), ruining the original. You can't really trust mobile app devs because the track record is downright awful. Recently that happened with the "Simple" collection of apps, where the Play Store version got filled with junk but the F-Droid maintainer froze the version and marked the apps as outdated since nobody could conceivably want the new versions.

Of course, that sits poorly with developers who (a) don't want to deal with potential third parties in their distribution chain rejecting their updates, or (b) are planning to add anti-features to their apps later down the line. With Signal, I'm going to guess it's mainly (a); the Play Store's checks and balances are much less invasive than the sort of thing an F-Droid maintainer might check for. (As I understand it, Google Play's checks are mostly anti-exploit and keyword scans.)



> where the maintainers of F-Droid can intervene and prevent an update to an app from reaching users if it's deemed to be malicious

That sounds like a feature you want when using FOSS.

Imagine distros hadn't been able to intervene quickly and the malicious xz were still being deployed through their channels just because the authors wanted it.



Oh yeah, it's an absolutely wonderful feature. F-Droid is pretty much the main app store I'd recommend for getting "the basics" if you're ever in the unfortunate position of having to manage the mobile devices of family members. Having a maintainer "on the lookout" gives so much peace of mind. Not suddenly having the gallery app turn into a data collection machine that baits less tech-savvy people into vaguely defined subscriptions is a value that's too good to pass up.

FOSS isn't really the important part for me there; it's nice, but the real value is that F-Droid is pretty much the only app store that reckons with how the relationship between mobile devs and mobile customers should be far more adversarial than on any other platform, given mobile devs' poor track record, and that empowers users to deal with that in a way that restores some degree of trust.

It's a fucking shame there's no equivalent on iOS where you can just say "yeah, what you find here can be trusted" and not have that get polluted a year down the line. Apple used to somewhat police the App Store back in the early 2010s for similar peace of mind, but that's not the case anymore.



> With Signal, I'm going to guess it's mainly (a); the Play Store's checks and balances are much less invasive than the sort of thing an F-Droid maintainer might check for. (As I understand it, Google Play's checks are mostly anti-exploit and keyword scans.)

It might have been (b) as well: Signal did keep their server code proprietary for many months while adding their custom cryptocurrency to it, and added this cryptocurrency for microtransactions into the app as well. There may be more features like this planned, some of which F-Droid might oppose.



> This way F-Droid could potentially insert a backdoor in an update.

Google requires app developers on the Play Store to give Google keys that enable Google to insert backdoors into any release. I can't trust anything on the Play Store for this reason. There is no way to tell which apps have been backdoored by Google for whatever reason (the usual reason is an NSL).



Ah so that's how Telegram got reproducible iOS builds:

> you need a jailbroken (old) iPhone. And at the end you still can’t verify the whole app. Some files stay encrypted

So basically, it works; you just have to bend over backwards to verify that it's truly reproducible.



Telegram were claiming they were more secure even when they had their own home-rolled crypto. Security is not Telegram's strong point and it never was.


Why is MTProto considered "home-rolled" but the Signal Protocol isn't? Both are boutique and written from scratch to fit their respective systems.


Of course. But the history of the Signal protocol and implementation traces back 20 years. It's good enough that Facebook, WhatsApp, and Skype use it for E2EE messages. Telegram's traces back 10 years, the first version was very bad, and both versions have had a lot of scrutiny for weird design decisions.

Crypto schemes which get broken usually follow a pattern of "something smells wrong", "we have weakened it a little bit", "we have weakened it a little bit more", "this is now completely broken", "my god why are you still using MD5, it's 2017".

We're in the "something smells wrong" or "we have weakened it a little bit" phase for MTProto2, depending on how you view it.



> But the history of the Signal protocol and implementation traces back 20 years

Are you sure about that? TextSecure was created closer to 10 years ago than 20. Twenty years ago, we did not have smartphones.

As I remember, TextSecure started with SMS (but that was not the Signal protocol) and added "internet" messages right after WhatsApp got bought (which was about when Telegram was started).

I love the Signal protocol, but I would say it's more 10 years old (like Telegram). Or am I missing something?



Don't forget that Signal is the name of the app, and "Signal Protocol" is the name of the E2EE protocol. The parent was talking about the Signal Protocol.

The fact that Facebook, WhatsApp, etc. use the Signal Protocol kind of shows that it is an accepted standard. But of course there are many reasons to use Signal (the App) instead of those apps, for instance:

- The Signal App is open source. You can check the protocol implementation before you use it. For Facebook, WhatsApp and Skype, you have to trust them (or some audits).

- E2EE is only one part: it ensures that nobody except the recipient can read the content of your messages. But there is a whole story around the metadata. The metadata say who writes to whom, and when. It essentially helps build a social graph. Facebook is very interested in this social graph. It would appear that the Signal Foundation is not. And even if it is not perfect, Signal does a lot to try to minimize the amount of metadata it has access to (and quite obviously Facebook has a huge incentive not to do that).

This said, IMHO it is still a lot better to use WhatsApp than to use Telegram, because at least you benefit from a good E2EE.
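The metadata point above can be made concrete with a toy sketch (all names and the log format are hypothetical; this is not any real server's schema): even when E2EE hides message contents, the sender/recipient/timestamp triples a server handles are enough to reconstruct a social graph.

```python
# Sketch: rebuilding a social graph from delivery metadata alone,
# without reading a single message body. Data is invented.
from collections import defaultdict

# What an E2EE server might still observe: (sender, recipient, unix_time).
metadata_log = [
    ("alice", "bob",   1715000000),
    ("bob",   "alice", 1715000060),
    ("alice", "carol", 1715003600),
    ("bob",   "carol", 1715007200),
]

graph = defaultdict(set)
for sender, recipient, _ts in metadata_log:
    graph[sender].add(recipient)
    graph[recipient].add(sender)

# Alice's contacts fall out of the metadata directly.
print(sorted(graph["alice"]))  # → ['bob', 'carol']
```

This is why Signal's efforts to minimize stored metadata (sealed sender, not logging contact graphs) matter beyond the encryption of message contents.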



It's only the protocol for their E2EE chats. There are two big caveats:

- Facebook and Skype E2EE messages are optional, and people rarely use that option, and

- Those apps collect a huge amount of data outside the contents of the messages.



Telegram had some weird primitives which they said we should trust because they were made by their top team of mathematicians. Signal builds on widely used crypto primitives, even if the protocol is their own (vetted by actual cryptographers, though).


Sort of, but it's heavily peer reviewed and generally regarded as very good.

I really dislike the "hand rolled is bad" meme. Someone rolled all crypto. The questions are "who is doing the rolling," "do they know what they are doing," and "was it peer reviewed or directly and faithfully built from a peer reviewed design?"



> I really dislike the "hand rolled is bad" meme.

Crypto is notoriously easy to get wrong, even if you know a lot about it - and most people do not. Secondly, proving something secure is pretty hard as well. If the crypto isn't a bog-standard algorithm in a well-known and reviewed implementation, assuming it to be insecure is a pretty good rule of thumb.



My take on "don't roll your own" is:

The people who take this advice are people who have respect for the difficulty of things like crypto and should be the ones implementing it, or at least on-ramped into learning how to do so.

The sorts of people who ship bad crypto because they don't bother to learn anything about the field are going to ignore this advice.

So I think as a strategy for fighting bad crypto it's neutral or maybe even net-negative by discouraging the right people from learning crypto and having no effect on overconfident fools.



> should be the ones implementing it

Someone (Bruce Schneier?) said that the best way to get into actually inventing/implementing crypto is to first get handy inventing attacks / hacking into other algorithms and tools.



To add to the idea that crypto is hard, it is not just hard in the same way that, say, making a physics engine is hard. It is hard because there is no telltale sign you did it wrong.

All crypto algorithms, even weak ones, output what looks like random numbers that can be deciphered back into the original plaintext. Just by looking at it, there is no way to differentiate between secure and insecure crypto. Contrast this with a physics engine: it is hard to get right, but at least, if you did it wrong, it tends to be obvious.

Also, like everything security-related, it is adversarial. You may have some of the smartest and most resourceful people on the planet working to break your thing. It is worse even than critical systems: aircraft engine control is critical, and people may die if it goes wrong, so robustness and correctness are crucial, but at least pilots won't go out of their way to break it.
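A toy demonstration of that "no telltale sign" problem (repeating-key XOR stands in for a weak cipher; the key and plaintext are invented): the ciphertext bytes look as opaque as any strong cipher's output, yet identical plaintext blocks encrypt to identical ciphertext blocks, a structural leak an attacker can exploit without ever decrypting anything.

```python
# Weak cipher sketch: repeating-key XOR. Output "looks random" to the
# eye, but leaks block-level structure (cf. the famous AES-ECB penguin).
def xor_encrypt(plaintext: bytes, key: bytes) -> bytes:
    return bytes(b ^ key[i % len(key)] for i, b in enumerate(plaintext))

key = b"\x13\x37\xca\xfe"
block = b"WIRE"                       # 4-byte block, same length as key
ct = xor_encrypt(block + block, key)  # encrypt two identical blocks

print(ct.hex())         # opaque-looking hex, nothing obviously wrong
assert ct[:4] == ct[4:8]  # ...yet equal plaintext blocks leak equality
```

Nothing in the raw output distinguishes this from a secure scheme; only analysis (here, comparing blocks) reveals the weakness.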



It's inherently risky – cryptography is hard and building secure software is hard, so starting it from scratch rather than re-using well-vetted code increases the risk unnecessarily.

It's not inherently broken, but it's sufficiently risky that it may be fair to assume it is broken. History has proven that software that's not known to be secure is typically insecure when it gets to the really hard crypto implementation. I think it's fair therefore to approximate it as "inherently insecure".



It’s insecure if done by your average full-stack developer who barely passed high school math; that’s why the usual mantra. It’s waaay different when done by math experts specialized in this topic, as is the case with Telegram.


Math experts are not necessarily good cryptographers, and authors of MTProto were not renowned cryptographers (unlike with Signal).

Most people in the cryptographic community agree that the Signal protocol is well-designed, is widely believed to be secure, and the authors react openly and swiftly to potential issues. Meanwhile, a lot of the MTProto crypto is just weird (that is, it does not follow standard practices of the field, without strong reasons to do so), and many cryptographers treat it with suspicion.



Because doing proper crypto is VERY hard. You might think you've achieved ultimate security, and one year later someone will defeat it (or cryptanalyze it within practical limits, which boils down to the same thing), because you forgot a detail. Even just implementing a crypto algorithm properly, in a way that doesn't leak information, is very complex, which is why most applications tend to use well-established crypto libraries.


> An alarming number of important people I’ve spoken to remarked that their “private” Signal messages had been exploited against them in US courts or media.

Any sources for this except the private testimony of a Signal competitor talking about his important friends? (ETA: Or is it when the court/media obtains your unlocked phone, in which case Telegram won't protect you either...)



My guess would be that their phone was taken from them, unlocked, and their messages were accessed that way.

I know several large IT orgs that have done this when Legal got involved. Literally using a 2nd phone to take pictures of Signal chats on the phone in question.



There are multiple layers where interception can happen:

1) On-screen keyboard - by default most phones do send what is being typed - a lot of phones also have 3rd party keyboards of doubtful origin preinstalled

2) "Enable backup" scam - on starting an app (like Google Photos or WhatsApp) chances you or your wife accidentally press "ok" on a pop up message

3) Hardware drivers - non open source binary blobs with back doors

4) Operating system - you basically don't know what information is logged and sent back to phone's vendor



>by default most phones do send what is being typed

That's extraordinary if true. Do you have anything to back it up, though? Even Google (!) wasn't brazen enough to log everything typed on Gboard, they implemented federated learning.



I think Signal could stand to gain popularity by either prioritizing overall niceness and polish in their clients (especially on desktop) or by allowing third parties to build clients which prioritize those things.

iMessage, Telegram, and Signal all get usage from me, with the vast majority of that usage weighted heavily on the former two because that’s where most people in my circles are. When comparing user experience between the three, it’s easy to see why.



Allowing third party clients that provide identity verification signatures would be totally excellent, and I would support that.

Signal does this today by verifying phone numbers themselves, so they’d have to continue doing so centrally; “never trust the client” applies to their own client just as much as anyone else’s, and “allow unverified users to initiate contact with strangers” is the spam vector infecting all modern telephony (thus STIR/SHAKEN).

So with that need resolved, the biggest risk of third party clients would be intentionally compromised code within an attractive wrapper — but the only way to defend against that is to not allow third party clients at all.

So.. I guess I no longer support third-party clients, having worked through the timelines of what will occur. Ah well.



- "I don't like where one of their board worked" (find someone high up in the cryptography ecosystem who hasn't been involved in this sort of thing somewhere in their career)

- "I don't like where their funding comes from" (US govt regularly funds secure software because they depend on it for their own operations, see: Tor)

- "An alarming number of people think their chats were leaked". It's easy to state things without sources. Also an alarming number of people think Facebook listens to them through their phones' mic. People are bad at opsec. Not news.

- "No reproducible builds. They closed a GitHub request from the community." Well, except Android is reproducible, and they explicitly state on that closed issue that they don't do feature requests via GitHub and asked the reporter to raise in the proper channel.

- "Telegram is the only service with reproducible builds". Telegram barely has encrypted chats, reproduce all you like, that doesn't make the chats secure. Signal has E2E encryption and verifiable builds for Android, that's a strictly better security position.



> An alarming number of people think their chats were leaked

Easily explained by direct access to the phone or Pegasus (or Pegasus-like) spyware. Both of which Telegram is also vulnerable to.



Durov's exile and distancing from Russia after the VK takeover may be just for show and for selling Telegram as 'the dissident app'. It is popular, easy to use and insecure.


They want to do this because they want more traction for their blockchain: TON, which, IIRC, is the payment method for ads, usernames and "stuff" inside Telegram.

However Du Rove is right about a bunch of things:

- Signal clients suck, especially the desktop one, where they ship (or used to ship) pre-built binaries like their own lib: https://github.com/signalapp/ringrtc

- Also, you can't have Signal without the Google Play Store.

- Signal clients suck in usability. I wish I had the Telegram clients (Android, and the Qt desktop app) instead of this Electron garbage. Telegram clients are super-duper-awesome.

- I would say that removing the phone number requirement is their #1 request, yet they take so much time to address it, especially when they cry about the cost of phone number validation SMS. (BTW, Telegram is implementing a very nice idea of crowd-sourced SMS validation, where they use their users' phone numbers to send the validation SMS.)

- They have a very questionable crypto integration with MobileCoin, which has obscure value: it depends on Intel SGX and is 95% pre-mined.



> you can't have Signal without Google Play Store

You can use Signal without the Play Store. Download the apk from Signal's website and it will use a background connection to receive calls and notifications. The downside is that it's heavier on the battery.



It feels like any platform that allows for one-way initiation of a conversation is bound to increase in spam as the platform grows in usage (phone calls, email, SMS, various social media, various messengers, etc.).

Do any platforms require that both parties add one another? (And/or allow for restricting an account to such a mode)

e.g. if user123 and user789 wish to communicate, then user123 must add/contact user789 AND user789 must add/contact user123. Until both do so, then nothing happens.

It's more work to legitimately establish contact with someone, but that seems to pale in comparison to the effort wasted dealing with spam/scams.

Same thing with verifying identities. In order to actually establish proper contact with someone, you need to communicate with them via some outside means (ideally in person) in order to establish the connection. Requiring both parties to enter/scan some ID/code/whatever seems like it would only facilitate proper verification (though not guarantee it, of course).

I'm sure that I'm missing something, though. I assume I'm just not familiar enough with these platforms and that some/all of them provide such a feature. It's just odd to me that spam sounds like such a problem when it feels like the above solution would be highly effective and simple to include.
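A toy sketch of the mutual-consent rule described above (all names are hypothetical; a real system would also need rate limits, block lists, and some way to discover contacts out of band):

```python
class Directory:
    """Toy contact gate: a message is delivered only after BOTH parties
    have added each other. One-sided adds deliver nothing, so a spammer
    who mass-adds strangers can never reach anyone who didn't opt in."""

    def __init__(self):
        self.added = set()  # directed (from_user, to_user) "I added them" edges

    def add_contact(self, user, other):
        self.added.add((user, other))

    def can_message(self, a, b):
        # both directions must exist before anything flows
        return (a, b) in self.added and (b, a) in self.added

    def send(self, sender, recipient, text):
        if self.can_message(sender, recipient):
            return f"delivered: {text}"
        return "dropped"  # silently dropped: the spammer gets no signal back
```

Until user789 also adds user123, every message from user123 is dropped without notification, which is exactly what starves the one-way spam vector.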



I am in a Signal group which has an invite link discoverable on public internet (it's a local OpenStreetMap group). From time to time, a bot joins and proceeds to spam the group's members one-on-one.


The same happens on the Telegram OSM group. Now, the easy and 99% effective mitigation is to make a "bridge group" where you need to click something to join the real deal, but changing that would invalidate any existing links.


Isn't it an expected issue with popular services, particularly ones with proper e2e encryption?

Things like WhatsApp and iMessage get scam messages too, and the less visibility the operators have for contents of messages the harder it is to proactively filter out spam.



I really like Telegram from a UI/UX standpoint (so much better than Signal), but Pavel Durov is such a sketchy character that it's starting to turn me off. How can he tout security when they still haven't implemented end-to-end encryption by default? Also so many other things if you follow Durov's channel.

(I use Telegram and Signal each for about 45% of my messaging, even though I'm in Europe where WhatsApp is so prevalent.)



There seems to be a concerted effort to discredit Matthew's claims. Even here on HN. I find this suspicious. The Signal protocol has been heavily audited by many different people from many different countries. It's usually found to be sound. The telegram protocol has been found to have issues that are, if not malicious, amateur level mistakes.

Once again, this is not my opinion. This is the result of independent auditors who have no affiliation with either the USA or Russia.

There are positives to the UI of Telegram, there are negatives to the UI of Signal. None of these has much to do with the underlying protocol of either.

Personally I'd rather we all put our collective efforts into something like the protocol suggested by Matrix, but if only given the choice of Telegram or Signal, I'd avoid Telegram like the plague. They are either malicious or amateur. Either one isn't a good choice for security.



> The telegram protocol has been found to have issues that are, if not malicious, amateur level mistakes.

Please provide evidence of such issues. Because at most, the issues with MTProto were at the level of "we are not familiar with this, but it seems ok", which Signal activists seem to inflate into maliciousness.

You do make bear service here.



> The meaning of "bear's service" originally comes from a fable about a man and a bear. The bear wanted to help the man by killing a gnat which sat on his forehead. As a result both the gnat and the man died.

Basically, by being proactive you do more damage than if you had done nothing at all.



Replying to this, as I can't reply to your down-thread reply for some reason.

What if the gnat isn't a gnat? What if the gnat is another man who now knows the communications of the first man? I'm not saying the Bear should kill both, but I'm pointing out that the analogy falls apart when the gnat isn't just a mildly annoying third party.



Ok, apparently I can now reply to this comment... Weird HN delays aside.

I don't care if the people who can decrypt Telegram chats are allied with any one side or another. I believe the idea of "Class enemy" to be abhorrent, and the moral / social threats of "the overall impact" to be negligible when compared to the fact that using compromised communications platforms will inevitably lead to greater problems than the act of calling them out.

This is the equivalent of "You'll keep quiet if you know what's good for you".

If Telegram is broken, certain people need to stop using it. The socio-political climate of the areas most likely to be using Telegram just makes this more urgent. This applies independent of if / how / why it's broken, and who, if anyone, may benefit from this.



Or a Polish one. (I guess the expression will be popular across Eastern Europe)

It's funny to see the basic cultural stuff float to the surface in comments like that. Like when there was a large number of "American" accounts some time ago on Twitter responding to financial news, but putting USD after the numbers... (To be clear, I'm not suggesting anything specific about the author here, just that sometimes you see enough opinions about something with the origin "leaking" through the side channel and wonder how organic it is)



From your own link:

> Recently, in [MV21 ] MTProto 2.0 (the current version) was proven secure in a symbolic model, but assuming ideal building blocks and abstracting away all implementation/primitive details.

Translation: it is secure, except for bugs, if any.



That's a generous translation! They were shown to be double-encrypting, using nonces where they weren't required, and generally making a bunch of mistakes that would be fine if they were writing a student level implementation of a secure messenger protocol, but not one that went on to be tacitly endorsed by a bunch of nation states!

It's like a clunkier version of the backdoor in Dual EC DRBG. When problems like this are found, you can either assume deliberate malice (as in the case of NIST) or accidental incompetence. Either should be immediate grounds for not using the software. This isn't Flappy Bird. This is meant to be secure comms. The "This Is Fine" mentality doesn't cut it.



You can have a secure, verified protocol but an insecure implementation of the protocol (the app). Note, though, that I'm not saying that Signal the app is insecure. However, I do think that Signal could certainly do more to make itself more transparent and to accommodate libre third-party implementations of their protocol.


Eh, split any important message into pieces, put a piece each in Signal, WhatsApp, Telegram, Threema, Line, and then the Americans, Russians, Swiss, and Koreans will each have some parts, but if you're lucky, nobody has all...


Not being in either Telegram/Signal camp I see a lot of tribalism in the comments. It seems that any arguments for/against either one end up in politics.

Like I understand that Telegram is probably not very secure, but seeing what proponents of Signal are saying doesn't really make me trust Signal either.



You mean the BigTech social media platforms that use Signal's protocol for messaging? Wow, I wonder why the people who don't trust BigTech social media don't choose Signal; that is actually insane.

If it is indeed political don't try to bring some kind of technological merit into this, it makes you look really dishonest.



Let me set a few things straight: Telegram is, for the most part, TikTok for people who don't mind putting some effort into reading on a few odd occasions. Saying that I have a lot of Ukrainian friends would be an understatement, and they are the only reason I have Telegram; all of them favor it, which, all things considered, is a grave mistake.

In practice, Telegram is far more closely related to TikTok and Twitter than to a messaging app, and by extension it is heavily used to spread misinformation: Telegram channels are ultimately under the complete control of their admins, who have the ultimate authority, with no way of doing anything about it. Twitter was forced to put some effort into the problem through Community Notes, but that hasn't even made a dent: it literally takes two Google searches to find tens of thousands of bot accounts spreading misinformation. In that regard, Telegram is much worse, since it's an infinite source of cognitive dissonance: people are willfully joining echo chambers, which are openly advertised as such.

I am really glad that Telegram is nowhere near as big in Western countries as it is in Eastern Europe. It pains me to say this, but even to this day, we Eastern Europeans are far more susceptible to propaganda than the Western world, although, for a million and one reasons, it seems to have a huge effect on the Western world as well. In that sense, Telegram is an active contributor.

10/10 times I'll sit firmly behind Signal, despite its many shortcomings: there is no developer integration, and if you want to create a Signal account for your own personal bots or whatever, you can, but only through a hacky repo on GitHub.

Yes, the people behind telegram know all this very well and they don't like the fact that people who are aware of it as well are favoring signal infinitely more than telegram.



And that's good, that's their strength. I use it to read information from all sides of the conflict and decide for myself what's "disinformation" and what's not. A grown person doesn't need a gatekeeper that pushes their own interests and shuts up anyone daring to contradict them.


Oh yeah, "both sides". Sure... Wanna ask the two orphans living at my cousin's where their parents are and who killed them? How many thousands of such examples do you need? I'm sure as hell I can supply you with a sufficient amount, even worse than straight up shooting a child's parents in front of their eyes.


> Wanna ask the two orphans living at my cousin's where their parents are and who killed them?

Applying an emotional argument to shut up discussions against censorship is propaganda 101.



That would be a good argument if it wasn't for the thousands of videos of men, women and children getting raped and killed and russians gloating in the comment sections. Something which telegram is notorious for.


No, it does not.

Let's enumerate the purported problems:

- "Elon Musk said so", which does not matter.

- Signal attachments can be viewed by an attacker with local access to the client. This is not Signal's job to protect against.

- Signal offers an optional `--no-sandbox` flag which only has security implications if enabled on Linux.

- Weaknesses in sealed sender. This is the only one that might be an actual problem (two theoretical and one empirical attack, but the latter comes from an 18-page paper that I have not read). But this does not compromise the integrity of the chats, and is not something Telegram improves on.

Given how the poster described the optional `--no-sandbox` flag as "no sandbox on Linux", it's clear that they don't understand anything they're sharing, and they just want to spread FUD.

---

edit: Per discussion below, I was wrong about the `--no-sandbox` flag: it's enabled by default. I take back my insult; it was I who did not understand the linked issue.

I still stand by Signal > Telegram. The remaining risk is that an attacker could figure out how to abuse Signal to run arbitrary JavaScript, e.g. through a specially crafted message.



You're right. It seems I am eating my words on that item, the `--no-sandbox` flag does seem to be on in most Linux installs. From context and search, it looks necessary for it to work on Debian.

Can confirm with `cat /usr/share/applications/signal-desktop.desktop`.

This still would require a pretty sophisticated attack to take advantage of, but I wouldn't rule it out as an attack surface. (We regularly see iPhone exploits that attack font and image rendering, after all.)

I'll amend my post given this.



It seems like a twitter thread of multiple messages. How can I read the rest of the messages, not just /1? There's no links to them.


Go on, keep defending the overlord you believe have your best interests at heart while the other 57 of us go worry-free, using Matrix or XMPP.


You will eventually revise your opinion once you find your chat logs 20 years later in some randomly occurring IRC logs, because that one guy was using an IRC bridge.

You cannot critique missing guaranteed end to end encryption when effectively matrix cannot guarantee it either.



> You will eventually revise your opinion once you find your chat logs 20 years later in some randomly occuring IRC logs because that one guy was using an IRC bridge.

E2EE cannot prevent the receivers from sharing the message (they are one of the "ends" in "end-to-end encryption", after all). The same thing could happen because one person in the group chat ends up getting some ransomware on his phone; E2EE cannot prevent that.



Given the location of Telegram's servers (Dubai), and the nature of the government (neutral dictatorship) and the lack of encryption, my default assumption would be that not only are they selling access to your data to major governments, they've probably even streamlined the bidding process.


Both Russians and Ukrainians use Telegram, including for confidential messaging with their agents on foreign territory. So that's proof enough for me that it's safe enough.


It's not about ordinary soldiers. It's about special services agents contacting their "partisan" agents, while the other side's special services try to catch them. They're supposed to apply the best security possible in the given circumstances.

If you claim that neither Russian, nor Ukrainian special services are competent, I'd disagree with you.



The part where they make up stories about the other side doing dumb shit in order to boost/maintain their team's morale.

It's especially critical to drip-feed feel-good news when you're losing.



I mean, the Russian Ministry of Defense admits it.

> “It is already clear that the main reason of what took place included the massive use, contrary to the ban, of personal mobile phones in the range of enemy weapons,” the Russian Defense Ministry said in a statement. The cellphone data allowed Ukraine, it said, to “determine the coordinates of the location of military service members to inflict a rocket strike.”



"including confidential messaging with their agents on the foreign territory"

Possible, as many ridiculous things happened around the whole war. (Recently german generals on a video chat were targeted by the russians, wasn't too hard, they did not use any encyption at all)

Sources would be nice though.

But it really would not be a reason for me to trust telegrams security.

Rather a confirmation again, that also secret services can show great incompetence.



> Recently german generals on a video chat were targeted by the russians, wasn't too hard, they did not use any encyption at all

They used Webex. Doesn't Webex use any encryption at all?



I would guess that those two would turn encryption on?

IDK, the whole anti-Signal post really makes me suspicious of Telegram whereas I wasn't really before. Are trying to be the universal honeypot for agencies?



Yep. The magic of "you could turn on encryption" is that nearly all people using it won't.

"Ah, but if you need encryption then you'll..." - well, two things now. Suddenly you're the person who has encryption switched on. And also more likely, someone they talk to will forget to switch it on and just blab everything into cleartext anyway.

The entire importance of Signal's model is that it is always encrypted. It's why Let's Encrypt is also important: to have effective security, you need to be able to hide in the crowd. If encryption usage is rare, then who is using it (or who suddenly starts using it) becomes an extremely valuable datapoint in itself.

(So I'd add: Telegram absolutely sells timeline details of which user accounts change their frequency of encrypted chat usage.)



Addressing only one point, not your main one which I agree with:

> And also more likely, someone they talk to will forget to switch it on and just blab everything into cleartext anyway.

I expect that if you enable a Telegram Secret Chat with Bob, Bob cannot unilaterally un-secret it. I would be very surprised if he could.

Of course Bob can then share the contents with Carol via an un-encrypted channel. But every encrypted channel has that weakness.



Last I used Telegram, creating an E2EE chat with someone added an encrypted chat in addition to the unencrypted chat. This means that if you're not careful about which of the two chats with that person a message is sent to, it's easy to accidentally send unencrypted data.

I'd guess this is possible because Telegram e2ee chats aren't multi-device capable, so it's necessary to be able to use unencrypted chats while using Telegram on something else than the phone with e2e.



Not a huge fan of Signal (phone number requirement [0], crypto push a while ago), but there are worlds between those two, and every time the Telegram CEO makes a post it looks more like a scam than before.

[0]: Yeah, might be changing or has already. Now, after ages.



> phone number requirement. Yeah, might be changing or has already. Now, after ages.

A phone number is still required for registration. As of a few weeks, it's not necessarily communicated to your contacts anymore, which solves a few concerns (but not all).

> crypto push a while ago

I was worried about this, but I use Signal daily and I haven't even noticed anything in the UI about this, it seems like a non event in the end.



In theory somebody could just make a client that takes your message, generates a random string of the same length, XORs your message with it, and sends the XOR via Signal and the random string via Telegram.
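A minimal sketch of that idea (helper names are hypothetical; the actual sending over each messenger is omitted). This is a one-time-pad split: whoever intercepts only one share learns nothing, since each share on its own is uniformly random.

```python
import secrets

def split_message(plaintext: bytes) -> tuple[bytes, bytes]:
    """Split plaintext into two shares; XOR them together to recover it."""
    pad = secrets.token_bytes(len(plaintext))             # the random string
    xored = bytes(p ^ k for p, k in zip(plaintext, pad))  # message XOR pad
    return pad, xored  # e.g. send pad via Telegram, xored via Signal

def recombine(pad: bytes, xored: bytes) -> bytes:
    # XOR is its own inverse: (m XOR k) XOR k == m
    return bytes(p ^ k for p, k in zip(pad, xored))

msg = b"meet at the usual place"
share_a, share_b = split_message(msg)
assert recombine(share_a, share_b) == msg
```

The catch, of course, is that anyone who can read both channels (or compromise either endpoint) recombines the shares trivially, so this only helps against an adversary limited to one service.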


Which big, centralized messenger operator can be more trusted to run the software they say they are running is always a contentious, shifting argument. If you host your own or use a small hoster, these arguments about who might have been compromised or compelled are not relevant. Decentralized and federated protocols such as SimpleX Chat, XMPP, Nextcloud Talk, Matrix, Session, and Delta Chat eliminate this concern.


I've always at least strongly suspected that Telegram is a FSB honeypot.

It's insecure by default so I guess it could be an everyone-honeypot. I'll keep using Signal for my secure messaging thank you very much. Honestly I trust Apple iMessage encryption more than Telegram.



As far as we can tell, they are both insecure: Telegram is closed source, and Signal published their source but basically forces users to use the Google Play version, which lags behind the open-source code and which you can never be 100% sure about, not to mention things like SGX.


What do you mean by SGX? SGX, even if it's fatally flawed, won't be worse than not using SGX. That's the worst case - they added a broken sandbox. Best case - they added a working one.


So since Signal has a board member who worked at a place that some people don't like then the Signal app must be backdoored/compromised/honeypot?

That's one hell of a leap. How far has our requirement for evidence fallen?



Um, yes. When the “place that some people don’t like” is all sorts of CIA-connected NGOs and you’re a member of group defined by its paranoia about privacy, then absolutely this becomes disqualifying.


AFAICT Signal is collateral damage in this disinformation campaign. The original attack seems to be aimed at the CEO of NPR, coming from an assortment of right wing (and some Russian-aligned) voices. She happens to also be on the board of Signal which, through the prism of conspiracy theory, now extends their crusade. Given that Telegram is commonly understood to be aligned with the Russian government, this maps neatly on the US/left vs Russia/right axis through which such people already understand the world.