Why do most people use 256-bit encryption instead of 128-bit?

• Isn't 128 bit security enough for most practical applications?

Because it's larger, and thus sounds better.

I think the minimum of 128 bits was decided upon with a large security margin in mind. Maybe many people don't know that, and so think that the difference between 128 and 256 bits can matter for their needs. If it takes 500 years to break a ciphertext with a 128-bit key and 1,000,000,000 years with a 256-bit one, does it matter?

I might be remembering this incorrectly... If you are encrypting a lot of data, you may end up with duplicate subkeys (subkey is probably the wrong term) if there is enough data. This is functionally equivalent to reusing a one-time pad. I believe key size, block size, and cipher mode were what determined how much data was too much data.

more complex === more secure

Advances in quantum computing will reduce effective key size of symmetric key cryptosystems by half in the foreseeable future. `2^(128/2) = 2^64 = ` conceivably brute-forceable on a quantum computer. `2^(256/2) = 2^128 = ` still secure.
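The halving claim above can be sketched in a few lines. A minimal illustration (the function name is my own, not from the thread), assuming Grover's algorithm reduces a brute-force search over 2^n keys to roughly 2^(n/2) quantum operations:

```python
# Grover's algorithm: searching a 2^n keyspace takes ~2^(n/2) quantum
# operations, so the effective key length is halved.
def grover_effective_bits(key_bits: int) -> int:
    return key_bits // 2

for k in (128, 256):
    print(f"{k}-bit key -> ~2^{grover_effective_bits(k)} quantum operations")
# 128 bits -> 2^64 (conceivably brute-forceable), 256 bits -> 2^128 (still secure)
```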

Please leave me some time to answer. 500 years should be enough.

There's a near duplicate post on http://crypto.stackexchange.com/q/20/2373

I wonder about the 500-year estimate: are assumed technological advances in computational speed taken into consideration?

@recursion.ninja do you have chapter and verse you can quote on the effective key size halving?

I assume you're referring to the key size. **Note** that the key size required also depends on the algorithm: For an asymmetric algorithm like RSA you will need 2048 or 4096 bits, 128 bits would be far too weak.

@recursion.ninja That is assuming an elementary quantum operation can be done at the same speed as an elementary classical operation. It is very possible that a quantum computer will never run any faster than the equivalent of 0.7 MHz, in which case breaking a 128-bit key, even with Grover's algorithm reducing it to the effective strength of a 64-bit key, would be infeasible, yet breaking a 4096-bit RSA key would be easy. Could you imagine searching a 2^64 keyspace with an old Intel 4004?

@forest I don't understand your point. Taking only frequency into account, 0.7 MHz is only about 5,700 times slower than a recent 4 GHz CPU. Make it 1,000,000 if you want, accounting for various architecture differences. However, a 64-bit keyspace is 18,446,744,073,709,551,616 times smaller than a 128-bit keyspace. That does not mean it can be broken, but it sure gives an edge to quantum computers.
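The arithmetic behind this comment is easy to check. A quick sketch (the clock rates are the illustrative figures from the thread, not benchmarks of any real quantum computer):

```python
# Compare the clock-speed handicap (~5,700x) against the keyspace
# reduction Grover's algorithm provides (a factor of 2^64).
slow_hz, fast_hz = 0.7e6, 4e9
speed_penalty = fast_hz / slow_hz        # ~5,714x slower per cycle
keyspace_reduction = 2**(128 - 64)       # 2^128 operations -> 2^64 operations
print(f"speed penalty: ~{speed_penalty:,.0f}x")
print(f"keyspace reduction: 2^64 = {keyspace_reduction:,}")
```

Even a factor-of-a-million architectural handicap is dwarfed by a factor of 2^64, which is the point both commenters circle around.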

@youen My point is that a quantum computer may be _significantly_ slower than a modern computer in terms of cycles per second. While 2^64 classical operations (a 64-bit keyspace) for a classical computer is not unrealistically difficult, 2^64 quantum operations (a 128-bit keyspace) for a quantum computer may be far beyond what we will be capable of. Additionally, Grover's algorithm is only sped up by the square root of the number of discrete computers running it in parallel.

@forest agreed, if the (cluster of) quantum computer(s) is slower by a factor of 10^19, then it won't help break a 128-bit key.

And although I agree that a QC still has a huge advantage, I'd say that a 5,700x improvement is many orders of magnitude off. A 4004 is likely hundreds of millions, if not billions of times slower than a modern, multicore, SIMD-capable, heavily-pipelined and caching processor. There are things it can do in a few cycles that would take a 4004 seconds of time. If a QC were as limited as a 4004 (hypothetically), then a 64-bit keyspace would be absolutely out of reach. To be quite honest, I would be surprised if the first generation of cryptanalytic QCs were _not_ more than 2^19 slower.

Yes... AES128 will remain uncrackable until at least 2030, so the only thing one is doing by utilizing AES256 is taxing the CPU unnecessarily. If AES256 is used over AES128 in OpenVPN for instance, it massively throttles throughput with zero additional security benefit.

• Why do people buy red sport cars ? They do not go faster than sport cars of any other colour...

AES comes with three standard key sizes (128, 192 and 256 bits). Many people see this and think that if there are three distinct sizes instead of just one, then there must be some difference, and since the 256-bit version is a bit slower than the 128-bit version (by about 40%), it must be "more secure". So they go for "the most secure" and choose 256-bit keys.
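The roughly 40% slowdown mentioned above falls directly out of the round counts in the AES specification (FIPS 197); a quick check:

```python
# AES round counts per key size, from FIPS 197: the 256-bit variant runs
# 14 rounds instead of 10, i.e. ~40% more work per block.
rounds = {128: 10, 192: 12, 256: 14}
overhead = (rounds[256] - rounds[128]) / rounds[128]
print(f"AES-256: {rounds[256]} rounds vs AES-128: {rounds[128]} -> ~{overhead:.0%} more work")
```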

In reality, the AES has three distinct key sizes because it has been chosen as a US federal algorithm apt at being used in various areas under the control of the US federal government, and that includes the US Army. The US Army has a long-standing Tradition of using cryptography, and that Tradition crystallized into internal regulation with all the flexibility and subtlety that armies around the world constantly demonstrate (just listen to some "military music" and you'll understand what I mean). Unfortunately, this happened quite some time ago, before the invention of the computer, and at that time most encryption systems could be broken, and the more robust were also very hard and slow to use. So the fine military brains came up with the idea that there should be three "security levels", so that the most important secrets were encrypted with the heavy methods that they deserved, but data of lower tactical value could be encrypted with more practical, if weaker, algorithms.

These regulations thus called for three distinct levels. Their designers just assumed that the lower levels were necessarily weak in some way, but weakness was not mandatory. So the NIST decided to formally follow the regulations (ask for three key sizes) but to also do the smart thing (the lowest level had to be unbreakable with foreseeable technology). 128 bits are quite sufficient for security (see this answer for details). Therefore AES accepts 256-bit keys because of bureaucratic lassitude: it was easier to demand something slightly nonsensical (a key size overkill) than to amend military regulations.

Most people don't know or don't care about History, and they just go for big because they feel they deserve it.

Can you provide sources that the three levels are really only to satisfy (old) military regulations? It doesn't make a lot of sense to me, if 128 bits was secure enough they might as well have used 128, 136 and 156 for that matter. The en/decryption time would have been shorter yet still secure according to you.

@Luc: the 128/192/256 bits are for aesthetics: powers of 2 are always better (and 3DES was already using, formally, a 192-bit key -- out of which 24 are ignored, but that's another story). For the source, it was what someone told me directly at that time (I think it was Schneier) so I do not have a written source. As for the decryption time, it is Rijndael-specific; some other candidates offered the same performance for all key lengths.

256-bit keys aren't completely useless. They do provide a defense against the possibility of quantum computers, specifically Grover's algorithm, which effectively halves the key length. We still don't have real quantum computers yet, and nobody knows when we will. The point being, though, that if you wanted to protect your secrets for 50+ years, you might pick double the currently accepted key size. https://en.wikipedia.org/wiki/Post-quantum_cryptography

@SteveSether: this is the occasion to point out that the "128-bit" limit is in no way universal; we use 128-bit keys because this is substantially beyond what can be cracked _with classical computing_. Grover's algorithm theoretically turns the search in a 2^64 effort, but these are 2^64 _quantum_ operations. Whether individual QC operations are similar in cost to individual classical operations remains to be seen. Right now, 2^64 QC operations is way harder than 2^64 classical operations, notably because QC computers do not exist yet.

@ThomasPornin I think the point is that 256 bits provides SOME additional level of security, however marginal. Much remains to be seen. For some people even the theoretical possibility of the message being cracked is enough to justify the 40% cost in additional CPU cycles. This is largely relegated to the NSA, or CIA, etc. I'd make the same recommendations to anyone who claims to need to protect a secret for 50+ years. CPU is cheap.

@ThomasPornin Is there a difference in the probability of finding the key earlier with 128 vs. 256? Mostly ignorant of the whole field but why does the statistical chance of finding a key earlier than max or average time not seem to play into most brute force discussions?

@Dave: there is a mathematical difference in probabilities, but the point is that the difference does not matter in practice: if a probability is so small that you can plan your business or even your life on the idea that it won't happen, then making the probability even lower will not have any practical consequence.
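To make the "mathematical difference that does not matter" concrete, here is a back-of-envelope sketch; the 2^80 attempt count is an arbitrary, very generous assumption of mine for illustration:

```python
# Probability of an attacker finding the key after 2^80 random guesses,
# for a 128-bit vs a 256-bit keyspace. Both are negligible in practice.
from fractions import Fraction

attempts = 2**80
p128 = Fraction(attempts, 2**128)   # = 2^-48, already vanishingly small
p256 = Fraction(attempts, 2**256)   # = 2^-176, smaller still, to no practical effect
print(float(p128), float(p256))
```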

@ThomasPornin, The first paragraph of this answer is inaccurate. There are decisions and messages whose secrecy-violation can affect more than just one life or even the lives of one entire country. The 256 bits is to guard against unknown unknowns and known unknowns, e.g. the possibility that Russian/Chinese/etc crackers have *[cont]...*

*...[cont]* more knowledge than their U.S. counterparts. (And the U.S. itself seems to be like 2 different countries consisting of US-the-citizens vs US-the-cia.) Besides no one will know what will happen in the next 80 years. Messages might not require secrecy for thousands of years, but they might need secrecy for a few hundred.

The NSA has since switched to using 256 keys for AES out of concern for quantum attacks. The answer should be amended to include @SteveSether's analysis : )

The question is why this answer has **128** likes?

• When you are building a security system you need to plan on failure. This is the idea behind a defense in depth strategy.

Cryptographic primitives become weaker over time. Although a 128-bit primitive is plenty, a flaw could be uncovered in the cipher which reduces this level of security. So you need a security margin for when the underlying primitive fails.

For example, MD5 produces a 128-bit hash; however, using a chosen-prefix attack, an attacker can produce a collision with a complexity of only 2^39.

Essentially it's about security margin. The longer the key, the higher the effective security. If there is ever a break in AES that reduces the effective number of operations required to crack it, a bigger key gives you a better chance of staying secure. Besides, with commodity hardware available today, the performance difference between 256-bit AES and 128-bit AES is fairly small. That and, as CodeInChaos mentioned, bigger numbers sound better and more secure.

OK, not bad logic. But can anyone tell exactly how strong a 128-bit cipher is at the current time, assuming no considerable breach is found in it and large quantum computers aren't realized?

Regarding the MD5 story, I have heard that cryptographic hash algorithms are generally considered less reliable than cipher algorithms, because they don't have as strong a security proof as cipher algorithms do. I don't know the details, and sorry for the lack of a reference, but I have read this several times in several places, and so think a difference really exists between hash algorithms and cipher algorithms in this regard. Also, AFAIK, no practically dangerous breach in modern, standard cipher algorithms like AES has been discovered for years now.

@Polynomial Yeah, margin, that is the word I should have used.

The comparison with MD5 is a bit misleading. The collision resistance of MD5 has never been better than 2^64 due to the birthday paradox.
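The 2^64 figure is the generic birthday bound, which can be stated in a couple of lines (the helper function is mine, for illustration):

```python
# Birthday bound: an n-bit hash admits collisions after roughly 2^(n/2)
# inputs, so MD5's 128-bit digest never offered more than ~2^64 collision
# resistance, even before the 2^39 chosen-prefix attack.
import hashlib

def birthday_bound_bits(digest_bits: int) -> int:
    return digest_bits // 2

md5_bits = hashlib.md5(b"").digest_size * 8
print(f"MD5 digest: {md5_bits} bits, generic collision cost ~2^{birthday_bound_bits(md5_bits)}")
```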

@HM I think the opposite is true. Hash functions often need more difficult-to-achieve security properties, such as collision resistance. It is true that hashes are more fragile when it comes to that property, but for preimage resistance, they're generally quite secure. In fact, even something as weak as MD5 (hell, even MD4 or MD2!) could be used to construct an _extremely_ secure stream cipher, far more secure than any of the numerous broken ciphers that litter history (DES, RC4, E0, A5/1, etc).

• I didn't see this mentioned in the answers or comments, so I thought to add it as an answer. Key size does not always correlate directly with the complexity of an algorithm. A common fallacy is to assume that a message encrypted using AES256 is more difficult to crack (an adversary getting any sort of meaningful information given only the ciphertext) than the same information protected using AES128. It makes logical sense that a larger key size introduces greater complexity, but as with any system, implementations are subject to weaknesses.

Assuming you're talking about AES 128 versus AES 256, there is a known weakness in the key expansion function that affects AES256. Fundamentally, the weakness reduces the complexity of AES256 to below that of AES128. There's a similar attack for AES192 as well, though in that case the complexity of AES192 remains greater than that of AES128.

Moral of the story, people don't understand crypto... j/k (I'm not a mathematician). Reality is that people assume "big" with "secure." A big gun is better than having a small gun. Larger key sizes are more secure than smaller key sizes.

In reality, the implementation of crypto is more important than key size alone.

If I remember correctly, that weakness is only relevant when using AES in quite unusual modes, and not in any typical encryption mode where random keys are used. I'm pretty sure that AES-256 is stronger for normal use (CBC, CTR, ...).

`A big gun is better than having a small gun` And detonating the Tsar Bomba in someone's face is more lethal than detonating the Little Boy in their face, but they're dead either way.

@CodesInChaos Also, don't larger keys require more rounds to achieve complete diffusion (if I recall, the number of extra rounds given to AES256 was based on a hunch, not any empirical testing or proofs)? Or is that only an issue if the key itself is not uniform?

@forest I expect that to only matter if you either want to achieve the higher security level expected of a 256-bit key or if the key is badly non-uniform (e.g. the second half is constant). There is also the question of what you mean by diffusion: the number of rounds it takes a small change in the input to diffuse, or the number of rounds it takes a small change in the key to diffuse.

@CodesInChaos Since I referenced the difference in diffusion rate when a larger keysize is used (rather than a larger block size), I meant the number of rounds it takes to achieve full diffusion when a bit of the key changes.

It's confusing that the conclusion of the sources do not support the recommendation against AES256: "While these complexities are much faster than exhaustive search, they are completely non-practical, and do not seem to pose any real threat to the security of AES-based systems", from http://eprint.iacr.org/2009/374

• FWIW, NIST draft on post quantum crypto is recommending 256.

http://csrc.nist.gov/publications/drafts/nistir-8105/nistir_8105_draft.pdf

Though they say it might be "overly conservative" and does not account for "the possibility of more sophisticated quantum attacks", they do indeed say "For symmetric key systems, one simple heuristic is to double the key lengths to compensate for the quadratic speedup achieved by Grover's algorithm." This seems a reasonable approach to take right now, knowing that a lot of advances have been made in quantum computers, and interest in the technology is growing. Messages sent today might be decrypted in a few years if they use cryptographic techniques that are not "quantum-safe".
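The NIST heuristic quoted above is trivially mechanical, which is part of its appeal; a sketch (function name is my own):

```python
# NIST draft heuristic: double symmetric key lengths to compensate for
# Grover's quadratic speedup, preserving the classical security margin.
def post_quantum_key_bits(classical_security_bits: int) -> int:
    return 2 * classical_security_bits

print(post_quantum_key_bits(128))  # 256: AES-256 retains ~128-bit security vs Grover
```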

`the possibility of more sophisticated quantum attacks` Isn't there a proof that Grover's algorithm is the most efficient possible way to traverse a finite keyspace? I.e. the equivalent keyspace can never possibly be reduced to less than 2^(n/2) / √k, where _n_ is the bit length and _k_ is the number of parallel instances of Grover's algorithm. I might be misremembering though.

• Your premise seems wrong to me. I am not aware of any evidence that "most people use 256 bit encryption instead of 128 bit". Indeed, if I had to guess, I suspect the reverse is the case.

I think you are right in that most SSL setups are, by default, set to prefer 128-bit. However, I think the point of the question is "why use a longer key if 128-bit is secure anyway?".

For example both Firefox and Opera prefer AES-256 cipher suites. They even prefer non (EC)DHE suites with AES-256 over (EC)DHE suites with AES-128, which IMO is insane.

• I'm assuming you are talking about symmetric cryptography. The answer is that it is never secure enough (even though I suspect that using 256 bit vs 128 bit keys is a marketing strategy to make the client feel more secure).

And don't forget the rise of quantum computing, which significantly lowers the amount of time needed for a brute-force attack.

Yes, it's symmetric. About quantum computers, Wikipedia says: "Bennett, Bernstein, Brassard, and Vazirani proved in 1996 that a brute-force key search on a quantum computer cannot be faster than roughly 2^(n/2) invocations of the underlying cryptographic algorithm, compared with roughly 2^n in the classical case. Thus in the presence of large quantum computers an n-bit key can provide at least n/2 bits of security. Quantum brute force is easily defeated by doubling the key length."

But it seems to me that for such attacks to become practical, very large, fully-fledged quantum computers are needed, and I don't think that can be realized very soon. Anyway, it seems a good idea to use 256 bits if relatively long-term protection is needed (and changing the keys/key length as needed is not practical), although 256 bits will become 128 bits in the quantum computer era, so why not use 512 bits? But I personally don't remember any cipher supporting 512 bits!

@HM "Quantum brute force is easily defeated by doubling the key length". I don't know much about this so I won't go into detail, but if this is true, there you understand why you use 256 bits instead of 128 :) In any case, as you say, longer keys give you long-term protection, which is probably important enough to justify their use.

It would be easier to just run Shor's algorithm on the public-key cryptography of the key exchange in a lot of cases anyway.

@HM "AES has a fixed block size of 128 bits and a key size of 128, 192, or 256 bits, whereas Rijndael can be specified with block and key sizes in any multiple of 32 bits, with a minimum of 128 bits. The blocksize has a maximum of 256 bits, but the keysize has no theoretical maximum." (from wikipedia)

And what about security tokens? I had already asked about this, but that question was closed! I mean, when we use a random string as a token, it is not fed into any cryptographic primitive (ciphers, hashes, etc.); it is used as-is as a security token. Is a 128-bit token enough? Is quantum computing a concern in this area too?

@HM do you mean something like session IDs?

Probably session ID is a misleading example, because AFAIK session IDs are not generated completely at random (e.g. some client parameters are used in generating them). Also, restricting the question to web scenarios seems like another bad idea with the potential to misdirect the discussion, because remote brute-force attacks are almost always severely limited in speed (especially on the web); local brute-force attacks can run at much higher speeds.

Assume a security token is a secret random string shared between two machines and used for authentication directly. Anyone can try an unlimited number of permutations to impersonate the (trusted) other side, but if the number of permutations is big enough, there is practically no chance for outsiders to succeed.
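A back-of-envelope calculation makes "practically no chance" concrete. The guess rate below is a deliberately generous assumption of mine (far beyond any real online attack):

```python
# Expected time to guess a uniformly random 128-bit token, assuming a
# (hypothetical, very generous) 10^12 guesses per second. On average an
# attacker searches half the space, i.e. 2^127 guesses.
guesses_per_sec = 10**12
seconds = 2**127 / guesses_per_sec
years = seconds / (365.25 * 24 * 3600)
print(f"~{years:.3g} years")
```

Even at a trillion guesses per second, the expected time is on the order of 10^18 years, so a 128-bit random token is ample for direct authentication against online guessing.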

@HM I don't want to give you incorrect answers, so I will refrain from answering. I'm not an expert in quantum computing, I only know that it can be used to break cryptographic keys faster. I will definitely look into the subject more though.

It can easily be secure enough. 2^128 is darn hard but somewhat theoretically plausible, and 2^256 is well out of the range of what we have any idea whatsoever to build a machine for. With a "perfectly efficient" computer (which I'm pretty sure that insofar as we know cannot exist) though, these limits become effectively meaningless. This really comes down to: *theoretically* hard, or merely hard *given what we have any idea how to do*? I suspect most people are more concerned with the latter.
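One way to ground the claim that 2^256 (and even 2^128) is beyond any "perfectly efficient" machine is the Landauer limit, a thermodynamic floor on computation. A sketch under my own assumptions (room temperature, one bit-erasure per key tried):

```python
# Landauer limit: erasing one bit costs at least k_B * T * ln(2) joules.
# Counting through 2^128 states at this floor, at 300 K:
import math

k_B = 1.380649e-23          # Boltzmann constant, J/K
T = 300.0                   # assumed temperature, kelvin
joules_per_op = k_B * T * math.log(2)
energy_2_128 = joules_per_op * 2**128
print(f"{energy_2_128:.3g} J just to enumerate 2^128 states")
```

That works out to roughly 10^18 J, about thirty years of a 1 GW power plant's entire output, merely to count to 2^128 at the theoretical minimum cost per operation; 2^256 multiplies that by another factor of 2^128.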

License under CC-BY-SA with attribution
