Don’t Use Session (Signal Fork)
2025-01-15 · soatok.blog

Last year, I outlined the specific requirements that an app needs to have in order for me to consider it a Signal competitor.

Afterwards, I had several people ask me what I think of a Signal fork called Session. My answer then is the same thing I’ll say today:

Don’t use Session.

The main reason I said to avoid Session, all those months ago, was simply due to their decision to remove forward secrecy (an important security property they inherited for free when they forked libsignal).

Lack of forward secrecy puts you in the scope of Key Compromise Impersonation (KCI) attacks, which serious end-to-end encryption apps should prevent if they want to sit at the adults table. This is why I don’t recommend Tox.

And that observation alone should have been enough for anyone to run, screaming, in the other direction from Session. After all, removing important security properties from a cryptographic security protocol is exactly the sort of thing a malicious government would do (especially if the cover story for such a change involves the introduction of swarms and “onion routing”–which computer criminals might think sounds attractive due to their familiarity with the Tor network).

Unfortunately, some people love to dig their heels in about messaging apps. So let’s take a closer look at Session.

I did not disclose this blog post privately to the Session developers before pressing publish.

I do not feel that cryptographic issues always require coordinated disclosure with the software vendor. As Bruce Schneier argues, full disclosure of security vulnerabilities is a “damned good idea”.

I have separated this blog post into two sections: Security Issues and Gripes.

Security Issues

  1. Insufficient Entropy in Ed25519 Keys
  2. In-Band Negotiation for Message Signatures
  3. Using Public Keys as AES-GCM Keys

Insufficient Entropy in Ed25519 Keys

One of the departures of Session from Signal is the use of Ed25519 rather than X25519 for everything.

Ed25519 Keypairs generated from their KeyPairUtilities object only have 128 bits of entropy, rather than the ~253 bits (after clamping) you’d expect from an Ed25519 seed.

fun generate(): KeyPairGenerationResult {
    val seed = sodium.randomBytesBuf(16)
    try {
        return generate(seed)
    } catch (exception: Exception) {
        return generate()
    }
}

fun generate(seed: ByteArray): KeyPairGenerationResult {
    val padding = ByteArray(16) { 0 }
    val ed25519KeyPair = sodium.cryptoSignSeedKeypair(seed + padding)

As an implementation detail, they encode a recovery key as a “mnemonic” (see also: a gripe about their mnemonic decoding).

Does This Matter?

You might think that clearing the highest 127 or so bits of the Ed25519 seed is fine for one of the following reasons:

  1. It’s hashed with SHA512 before clamping.
  2. Ed25519 only offers 128 bits of security.
  3. Some secret third (and possibly unreasonable) argument.

It’s true that Ed25519 targets the 128-bit security level, if you’re focused on the security of the Elliptic Curve Discrete Logarithm Problem. Achieving 128 bits of security in this model requires 256-bit secrets. Having 256-bit secrets makes the multi-user security of the scheme easy to reason about.

When your secret only has 2^{128} possible values, your multi-user security is no longer as secure as Ed25519 expects.

Additionally, you can shove the SHA512 + clamping in your attack script (thus negating the first objection) and find the corresponding secret key in 2^{64} queries if you know the top 128 bits were initialized to 0, using a modified version of Pollard’s rho for discrete logarithms.

This means that Session’s KeyPairUtilities class only provides 64 bits of ECDLP security.
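To make the structure concrete, here's a minimal Java sketch (my own illustrative code, not Session's; class and method names are mine) of the flawed seed construction, fed into the standard Ed25519 secret-scalar derivation (SHA-512 the seed, clamp the first 32 bytes):

```java
import java.security.MessageDigest;
import java.security.SecureRandom;
import java.util.Arrays;

public class WeakSeedDemo {
    // Mirrors the flawed construction above: 16 random bytes, zero-padded to 32.
    // Only 2^128 distinct seeds are possible, instead of 2^256.
    static byte[] weakSeed() {
        byte[] entropy = new byte[16];
        new SecureRandom().nextBytes(entropy);
        return Arrays.copyOf(entropy, 32); // bytes 16..31 are always zero
    }

    // Standard Ed25519 secret-scalar derivation: clamp(SHA-512(seed)[0..31]).
    static byte[] scalarFromSeed(byte[] seed) throws Exception {
        byte[] h = MessageDigest.getInstance("SHA-512").digest(seed);
        byte[] s = Arrays.copyOf(h, 32);
        s[0] &= (byte) 0xF8;  // clear the low 3 bits
        s[31] &= (byte) 0x7F; // clear the top bit
        s[31] |= (byte) 0x40; // set the second-highest bit
        return s;
    }

    public static void main(String[] args) throws Exception {
        byte[] scalar = scalarFromSeed(weakSeed());
        // The scalar looks uniformly random, but it can only take on 2^128
        // values, because only the first 16 seed bytes ever vary.
        System.out.println("derived a 32-byte scalar from a 2^128-sized seed space");
    }
}
```

The SHA-512 step adds no entropy: an attacker simply enumerates 16-byte seeds and replays the same hash-and-clamp pipeline, which is why objection 1 above doesn't hold.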

What does 64 bits of ECDLP Security actually mean?

I provided a technical definition already (ECDLP stands for “Elliptic Curve Discrete Logarithm Problem”), but that’s probably not meaningful to most people outside computer security.

What this means is that a distributed computing effort can find the secret key for a given Ed25519 public key generated from this algorithm in only 2^{64} queries.

For flavor, 2^{64} queries is approximately the attack cost to find a SHA1 collision, which we know is possible and economical.

Based on this attack, the authors projected that a collision attack on SHA-1 may cost between US$75K and US$120K by renting GPU computing time on Amazon EC2 using spot-instances, which is significantly lower than Schneier’s 2012 estimates.

— from the Shattered paper, page 2.

I don’t know if this was mere stupidity or an intentional NOBUS backdoor that only well-resourced adversaries can crack. (I also don’t have hundreds of thousands of dollars lying around to test this myself.)

In-Band Negotiation for Message Signatures

If you thought the previous issue was mitigated by the use of Ed25519 signatures on each message, don’t worry, the Session developers screwed this up too!

// 2. ) Get the message parts
val signature = plaintextWithMetadata.sliceArray(plaintextWithMetadata.size - signatureSize until plaintextWithMetadata.size)
val senderED25519PublicKey = plaintextWithMetadata.sliceArray(plaintextWithMetadata.size - (signatureSize + ed25519PublicKeySize) until plaintextWithMetadata.size - signatureSize)
val plaintext = plaintextWithMetadata.sliceArray(0 until plaintextWithMetadata.size - (signatureSize + ed25519PublicKeySize))
// 3. ) Verify the signature
val verificationData = (plaintext + senderED25519PublicKey + recipientX25519PublicKey)
try {
    val isValid = sodium.cryptoSignVerifyDetached(signature, verificationData, verificationData.size, senderED25519PublicKey)
    if (!isValid) { throw Error.InvalidSignature }
} catch (exception: Exception) {
    Log.d("Loki", "Couldn't verify message signature due to error: $exception.")
    throw Error.InvalidSignature
}

What this code is doing (after decryption):

  1. Grab the public key from the payload.
  2. Grab the signature from the payload.
  3. Verify that the signature on the rest of the payload is valid… for the public key that was included in the payload.

Congratulations, Session, you successfully reduced the utility of Ed25519 to that of a CRC32!
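The failure mode is easy to demonstrate. In this sketch (my own scenario, assuming a JDK 15+ runtime for the built-in Ed25519 provider), a forger signs with a freshly generated keypair, and a verifier that trusts the in-band key happily accepts the result:

```java
import java.nio.charset.StandardCharsets;
import java.security.KeyPair;
import java.security.KeyPairGenerator;
import java.security.PublicKey;
import java.security.Signature;

public class InBandKeyDemo {
    // A verifier that takes the public key from the payload itself will accept
    // a valid signature from ANY keypair, so it tells you nothing about who
    // actually sent the message.
    static boolean verifyWithInBandKey(byte[] message, byte[] sig, PublicKey inBandKey)
            throws Exception {
        Signature verifier = Signature.getInstance("Ed25519");
        verifier.initVerify(inBandKey);
        verifier.update(message);
        return verifier.verify(sig);
    }

    public static void main(String[] args) throws Exception {
        byte[] forged = "totally from your friend".getBytes(StandardCharsets.UTF_8);

        // The attacker generates their own keypair and signs the forged message...
        KeyPair attacker = KeyPairGenerator.getInstance("Ed25519").generateKeyPair();
        Signature signer = Signature.getInstance("Ed25519");
        signer.initSign(attacker.getPrivate());
        signer.update(forged);
        byte[] sig = signer.sign();

        // ...then ships (message, attacker's public key, signature) in one
        // payload. The in-band check passes, because it verifies against the
        // attacker's own key:
        System.out.println(verifyWithInBandKey(forged, sig, attacker.getPublic()));
    }
}
```

The fix is to verify against a key the recipient already trusts (pinned when the conversation was established), never one the message carries along for itself.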

Using Public Keys As AES-GCM Keys

I wasn’t entirely sure whether this belongs in the “gripes” section or not, because it’s so blatantly stupid that there’s basically no way Quarkslab would miss it if it mattered.

When encrypting payloads for onion routing, it uses the X25519 public key… as a symmetric key, for AES-GCM. See, encryptPayloadForDestination().

val result = AESGCM.encrypt(plaintext, x25519PublicKey)
deferred.resolve(result)

Session also does this inside of encryptHop().

val plaintext = encode(previousEncryptionResult.ciphertext, payload)
val result = AESGCM.encrypt(plaintext, x25519PublicKey)

In case you thought, maybe, that this is just a poorly named HPKE wrapper… nope!

 /**
 * Sync. Don't call from the main thread.
 */
internal fun encrypt(plaintext: ByteArray, symmetricKey: ByteArray): ByteArray {
    val iv = Util.getSecretBytes(ivSize)
    synchronized(CIPHER_LOCK) {
        val cipher = Cipher.getInstance("AES/GCM/NoPadding")
        cipher.init(Cipher.ENCRYPT_MODE, SecretKeySpec(symmetricKey, "AES"), GCMParameterSpec(gcmTagSize, iv))
        return ByteUtil.combine(iv, cipher.doFinal(plaintext))
    }
}

This obviously doesn't encrypt the payload such that only the recipient (who holds the secret key corresponding to the public key) can decrypt it. Instead, anyone who knows the public key can decrypt the message.
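To see why, here's a self-contained Java sketch (my own code, mirroring the pattern above) in which an eavesdropper decrypts the "encrypted" payload using nothing but the public key:

```java
import javax.crypto.Cipher;
import javax.crypto.spec.GCMParameterSpec;
import javax.crypto.spec.SecretKeySpec;
import java.security.SecureRandom;
import java.util.Arrays;

public class PublicKeyAsAesKeyDemo {
    // Mirrors the pattern above: a 32-byte X25519 *public* key is fed
    // directly into AES-256-GCM as the symmetric key.
    static byte[] encrypt(byte[] plaintext, byte[] publicKey) throws Exception {
        byte[] iv = new byte[12];
        new SecureRandom().nextBytes(iv);
        Cipher cipher = Cipher.getInstance("AES/GCM/NoPadding");
        cipher.init(Cipher.ENCRYPT_MODE, new SecretKeySpec(publicKey, "AES"),
                new GCMParameterSpec(128, iv));
        byte[] ct = cipher.doFinal(plaintext);
        byte[] out = Arrays.copyOf(iv, 12 + ct.length); // iv || ciphertext+tag
        System.arraycopy(ct, 0, out, 12, ct.length);
        return out;
    }

    // Any eavesdropper who knows the (public!) key can simply undo it.
    static byte[] decryptWithPublicKey(byte[] wire, byte[] publicKey) throws Exception {
        Cipher cipher = Cipher.getInstance("AES/GCM/NoPadding");
        cipher.init(Cipher.DECRYPT_MODE, new SecretKeySpec(publicKey, "AES"),
                new GCMParameterSpec(128, Arrays.copyOf(wire, 12)));
        return cipher.doFinal(Arrays.copyOfRange(wire, 12, wire.length));
    }

    public static void main(String[] args) throws Exception {
        byte[] publicKey = new byte[32]; // public by definition: everyone has it
        new SecureRandom().nextBytes(publicKey);
        byte[] wire = encrypt("hop payload".getBytes(), publicKey);
        System.out.println(new String(decryptWithPublicKey(wire, publicKey)));
    }
}
```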

I wonder if this impacts their onion routing assumptions?

Why should I trust session?

(…)

When using Session, your messages are sent to their destinations through a decentralised onion routing network similar to Tor (with a few key differences) (…)

Session FAQs

Gripes

Some of these aren’t really security issues, but are things I found annoying as a security engineer that specializes in applied cryptography.

  1. Mnemonic Decoding Isn’t Constant-Time
  2. Unsafe Use of SecureRandom on Android

Mnemonic Decoding Isn’t Constant-Time

The way mnemonics are decoded involves the modulo operator, which implicitly uses integer division (which neither Java nor Kotlin nor Swift implement in constant-time).

return wordIndexes.windowed(3, 3) { (w1, w2, w3) ->
    val x = w1 + n * ((n - w1 + w2) % n) + n * n * ((n - w2 + w3) % n)
    if (x % n != w1.toLong()) throw DecodingError.Generic
    val string = "0000000" + x.toString(16)
    swap(string.substring(string.length - 8 until string.length))
}.joinToString(separator = "") { it }

This isn’t a real security problem, but I did find it annoying to see in an app evangelized as “better than Signal” on privacy forums.

Unsafe Use of SecureRandom on Android

The recommended way to get secure random numbers on Android (or any Java or Kotlin software, really) is simply new SecureRandom(). If you're running a service in a high-demand environment, you can take extra care to make a thread-local instance of SecureRandom. But a local RNG serving a single user isn't such an environment.
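The recommended pattern is a one-liner; a sketch:

```java
import java.security.SecureRandom;

public class Rng {
    // Let the platform select its strongest default CSPRNG instead of
    // pinning the legacy SHA1PRNG algorithm by name.
    private static final SecureRandom RNG = new SecureRandom();

    public static byte[] getSecretBytes(int size) {
        byte[] secret = new byte[size];
        RNG.nextBytes(secret);
        return secret;
    }
}
```

Note that no NoSuchAlgorithmException handling is needed, because no named algorithm is requested.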

What does Session do? They use SHA1PRNG, of course.

public static byte[] getSecretBytes(int size) {
  try {
    byte[] secret = new byte[size];
    SecureRandom.getInstance("SHA1PRNG").nextBytes(secret);
    return secret;
  } catch (NoSuchAlgorithmException e) {
    throw new AssertionError(e);
  }
}

And again here.

SecureRandom secureRandom = SecureRandom.getInstance("SHA1PRNG");

Why would anyone care about this?

On modern Android devices, this isn’t a major concern, but the use of SHA1PRNG used to be a source of vulnerabilities in Android apps. (See also: this slide deck.)

Closing Thoughts

There is a lot of Session that is poorly specified in their Whitepaper and that I didn't look at. For example, how group messaging keys are managed.

When I did try to skim that part of the code, I found a component where you can coerce Android clients into running a moderately expensive Argon2 KDF simply by deleting the nonce from the message.

val isArgon2Based = (intermediate["nonce"] == null)
if (isArgon2Based) {
    // Handle old Argon2-based encryption used before HF16

That’s hilarious.

Cryptography nerds should NOT be finding the software that activists trust with their privacy hilarious.

So if you were wondering what my opinion on Session is, now you know: Don’t use Session. Don’t let your friends use Session.


Source: https://soatok.blog/2025/01/14/dont-use-session-signal-fork/