Reviewing the Cryptography Used by Signal
Author: soatok.blog · February 18, 2025

Last year, I urged furries to stop using Telegram because it doesn’t actually provide them with any of the privacy guarantees they think it gives them. Instead of improving Telegram’s cryptography to be actually secure, the CEO started spreading misleading bullshit about Signal®.

Since then, I’ve been flooded with people asking me about various other encrypted messaging apps and accused by Internet reply-guys of having malicious intentions. Some of the more egregiously stupid accusations were that I was somehow being paid to promote Signal.

Not only am I not being paid to promote Signal, I refuse to ever be paid to promote anything! I’ve been extremely vocal about this.

To be clear, being accused of being a paid shill for recommending Signal isn’t exactly unique to Signal; it also happens with other technologies.

For example: Have you ever wondered why influencers (streamers, vloggers, etc.) always promote “VPN services” instead of Tor (which is free)?

It’s not just “today’s sponsor”, either.

Bad security advice, usually marketed as infosec advice for activists, absolutely loves to recommend specific VPN companies. Interestingly, they frequently make no mention of Tor. When you point this out, the same crowd will try to weasel in FUD about Tor.

Soatok: That they recommend a VPN and not Tor in their first table immediately makes me suspicious.

Archive.org mirror

First Reply: Why? I’ve personally seen more news articles about Tor users getting de-anonymized than I have VPN users. […]

The rhetorical sleight-of-hand here isn’t particularly clever.

  • Tor uses onion-routing to provide anonymity to Internet traffic.
  • VPNs just provide an encrypted tunnel to another ISP, and therefore do not offer anonymity.
  • You can’t deanonymize VPN users because they were never anonymized to begin with!
  • VPN providers that lied about “no logs” never faced any meaningful consequences.

Tor is at least as private as any VPN. If you’re worried about exit nodes, only use Tor to access Onion Services or websites that use HTTPS.
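The layering is what makes the difference. A toy sketch of onion routing’s nested encryption follows — this is NOT real cryptography (a SHA-256-derived XOR keystream stands in for an actual cipher, and the relay keys are made up), only an illustration of why each relay can peel exactly one layer and no single relay sees both endpoints:

```python
# Toy illustration of onion routing's layered encryption.
# NOT real cryptography: a hash-derived XOR keystream stands in for a
# real cipher, and the relay keys below are hypothetical.
import hashlib

def toy_cipher(key: bytes, data: bytes) -> bytes:
    # XOR with a SHA-256-derived keystream; XOR is symmetric, so the
    # same function both adds (wraps) and removes (peels) a layer.
    stream = b""
    counter = 0
    while len(stream) < len(data):
        stream += hashlib.sha256(key + counter.to_bytes(4, "big")).digest()
        counter += 1
    return bytes(a ^ b for a, b in zip(data, stream))

relays = [b"guard-key", b"middle-key", b"exit-key"]  # hypothetical keys
message = b"hello, onion service"

# The client wraps the message once per relay, innermost layer last:
packet = message
for key in reversed(relays):
    packet = toy_cipher(key, packet)

# Each relay peels exactly one layer. Only the exit recovers the
# payload, and no single relay learns both source and destination.
for key in relays:
    packet = toy_cipher(key, packet)

assert packet == message
```

A VPN, by contrast, is a single layer: one party (the provider) always sees both who you are and where you’re going.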

As a security engineer who specializes in applied cryptography, I’m generally not interested in the “Tor vs VPN” debate.

I’m much more interested in the “WireGuard vs OpenVPN” debate (on the side of WireGuard), and what lessons about software security the rest of the industry could learn from WireGuard.

In fact: If someone is promoting a VPN service in 2025 and that service doesn’t use WireGuard as its underlying protocol, they are almost certainly LARPing at security expertise rather than offering valuable advice.
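Part of what makes WireGuard worth learning from is how small its surface area is: a complete client configuration fits in a dozen lines. The keys, addresses, and endpoint below are placeholders, not real values:

```ini
# Minimal WireGuard client config (all values are placeholders).
[Interface]
PrivateKey = <client-private-key>
Address = 10.0.0.2/32

[Peer]
PublicKey = <server-public-key>
Endpoint = vpn.example.com:51820
AllowedIPs = 0.0.0.0/0, ::/0
PersistentKeepalive = 25
```

Compare that with the sprawl of options (ciphers, modes, compression, renegotiation) an OpenVPN deployment asks you to get right.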

But I digress.

Like Tor, Signal doesn’t cost you anything to use. Nobody makes money by telling you to use either of those things.

Despite having no financial incentive for doing so, security and privacy experts (including the EFF’s director of cybersecurity, Eva Galperin) constantly stake their reputation by recommending Tor and Signal.

And therein lies the question: Is Signal’s cryptography actually good? And how can we be sure of that?

To know this, we first need to discuss cryptography audits.

Audits For Normies

Audits are a type of engagement between a vendor and a team of security consultants with specific expertise in the technologies involved.

How an audit works is, loosely:

  • The vendor (or a third party, such as OSTIF) hires the consultants for a timeboxed assessment of the security of the product or service in question.
  • The consultants (ideally with the source code in hand) will then try to find any way to subvert the normal operation of the product/service, especially in a way that’s useful for an attacker.
  • Any findings that result from the consultants’ work are compiled together into an Audit Report, with specific recommendations for remediating the issues they identified.
  • The vendor responds by either fixing each issue, or documenting them as known limitations if a fix is impractical.
  • Optional: The Audit Report is made public.

Regardless of the expertise of the consultants, every audit suffers from the same limitations:

  1. The engagement has a specific timebox, which means that coverage will be finite.
  2. The engagement is performed over a finite number of snapshots of the source code (typically, one commit hash), so each subsequent commit to the codebase erodes the relevance of the audit.
  3. The consultants are human beings, and therefore imperfect.
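The second limitation is easy to quantify: if an audit report records the commit hash it covered, git can count how far the code has drifted since. A sketch, building a throwaway demo repository for illustration (in practice you’d run `git rev-list --count <audited-commit>..HEAD` against the real codebase):

```python
# Sketch: quantifying audit staleness with git. A throwaway repo with
# empty commits stands in for a real codebase; the "audited snapshot"
# commit plays the role of the hash recorded in an audit report.
import subprocess
import tempfile

def git(*args: str, cwd: str) -> str:
    return subprocess.run(
        ["git", "-c", "user.name=demo", "-c", "user.email=demo@example.com",
         *args],
        cwd=cwd, check=True, capture_output=True, text=True,
    ).stdout.strip()

repo = tempfile.mkdtemp()
git("init", "-q", cwd=repo)
git("commit", "-q", "--allow-empty", "-m", "audited snapshot", cwd=repo)
audited = git("rev-parse", "HEAD", cwd=repo)  # hash the auditors saw

# Development continues after the engagement ends:
for msg in ("later change", "another change"):
    git("commit", "-q", "--allow-empty", "-m", msg, cwd=repo)

# Every commit after the audited snapshot erodes the audit's relevance:
drift = int(git("rev-list", "--count", f"{audited}..HEAD", cwd=repo))
print(drift)  # prints 2
```

A four-digit answer to that command is a good hint that “we were audited” no longer says much about the code you’re actually running.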

Furthermore, performing an audit of a product or service without a clear threat model can lead to a lot of disagreement about the relevance or severity of any findings.

This cuts both ways: High-severity issues could actually be nothingburgers to the users of the app, or “informational” findings could be a dealbreaker to your users. Lacking clarity about the security goals and assumptions can hamstring any effort to provide security assurance.

Unfortunately, sometimes you will see encrypted messaging apps proudly proclaim, “We were audited” when facing criticism, except:

  • Their last audit was 5+ years (and/or over 1000 commits) ago.
  • They only have the one public audit report.
  • The company and/or person that did the audit has no other online footprint, including other audits, and only seemed to pop up to opine about this one vendor.
  • The audit report reads more like sales copy than a critical analysis of the product’s security.
  • The timebox for the audit is tiny compared to the quantity and complexity of the software in question.

It isn’t important that the company providing the audit be one of the more recognizable names (e.g., for cryptography: Cure53, Kudelski Security, Least Authority, NCC Group, Trail of Bits, and myriad blockchain / smart contract security firms that sometimes demonstrate real cryptography chops).

Plenty of smaller security consulting teams do excellent work.

What a more recognizable brand gives you, however, is a reasonable amount of quality control: They’re generally better resourced to ensure that the appropriate experts are assigned to each project, so you’re less likely to end up with a useless rubber-stamp Audit Report than you might from a one-person shop in over their head.

It’s probably better to use the list above to inform your heuristics for assessing the credibility of an Audit Report, rather than relying on, “Did the biggest name sign off on it?”
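Those heuristics can even be written down mechanically. The sketch below encodes the red-flag list as a function; every field name and threshold is made up for illustration, and a real assessment is a judgment call, not a score:

```python
# Sketch: the red-flag checklist above as a heuristic. All field names
# and thresholds are invented for illustration; this is not a rigorous
# scoring system, just the checklist made explicit.
from dataclasses import dataclass

@dataclass
class AuditReport:
    years_since_audit: float
    commits_since_audit: int
    public_reports: int        # audit reports the vendor has published
    auditor_track_record: int  # other public audits by the same firm
    reads_like_sales_copy: bool
    timebox_days: int
    codebase_kloc: int         # rough size of audited code, in KLOC

def red_flags(a: AuditReport) -> list[str]:
    flags = []
    if a.years_since_audit >= 5 or a.commits_since_audit > 1000:
        flags.append("audit is stale")
    if a.public_reports <= 1:
        flags.append("only one public audit report")
    if a.auditor_track_record == 0:
        flags.append("auditor has no other online footprint")
    if a.reads_like_sales_copy:
        flags.append("report reads like marketing")
    if a.timebox_days * 10 < a.codebase_kloc:  # arbitrary ratio
        flags.append("timebox tiny relative to codebase")
    return flags
```

An empty list doesn’t mean an audit is good; a long one means its “we were audited” claim deserves skepticism.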

Why Are You Telling Us All This?

Because, like the title says, I’m going to review the cryptography used by Signal.

This series is morally equivalent to the sort of work you’d get from an audit, if it were timeboxed over a weekend rather than several weeks.

Objectives For This Series

I’m writing this series in the same spirit as SwiftOnSecurity’s Decent Security guides:

Everyone can be secure.

It is with those four words this website is founded. Computer, smartphone, and online security does not require a degree or years of experience. All it requires is someone show you the way.

My spin on the Decent Security mission statement is a bit more ambitious, if narrowly scoped:

You can understand applied cryptography.

Although the bar for designing and implementing cryptography must be very high to prevent overconfident developers from making preventable mistakes that hurt users, being able to critique the security of private messaging apps is a skill anyone is capable of learning.

Verifying the security claims made by an encrypted messaging app does not require a Ph.D in mathematics or an encyclopedic knowledge of software vulnerabilities. All it requires is someone teaching the fundamentals, and then indulging in a bit of self-exploration.

Time will tell if I’m successful or not.

Additionally, Key Transparency has recently landed in libsignal (though it has not been consistently implemented in the Signal app yet).

A Note On Timeboxing

I made the header image and wrote this blog post separately, but I performed the entirety of this cryptography review over the course of a single weekend. That was the original timebox I set for myself.

I do feel that, if I had allocated more time to it, I might have been able to explain some of the later components in greater detail.

However, as I was writing this, Twitter decided to ban links to signal.me in an attempt to censor Signal users. Thus, I decided to pull the trigger and stay true to my original timebox.

Dead Twitter logo
Fuck Elon Musk

Contents

  1. Introduction (you are here)
  2. How Soatok Approaches Cryptography Audits
  3. Mapping Signal and Prioritizing Targets
  4. Message and Media Encryption
  5. Forward-Secure Ratcheting Protocols
  6. Miscellaneous Cryptographic Features
  7. Signal’s New Key Transparency Feature
  8. Summary and Findings

Signal, the Signal logo, and all related names, designs, and slogans are registered trademarks of Signal Technology Foundation.

Though my inclusion of their logo is almost certainly Fair Use (as I am writing about Signal, not pretending to represent Signal), their Trademarks page indicates that this text should be included.

Header art: CMYKat, Signal’s logo, magnifying glass emoji from Noto Color Emoji


Source: https://soatok.blog/2025/02/18/reviewing-the-cryptography-used-by-signal/