
Cryptographic primitives usually assert some security level, given as the number of operations needed to mount an attack. Hash functions, for example, give different security levels for collision attacks, preimage attacks and second preimage attacks. From these, "safe" key sizes are derived for different primitives.

There are many different recommendations for safe key sizes and many different means of estimating future computational capabilities. For example, www.keylength.com aggregates a lot of these recommendations.

What I'm looking for, however, is the number of simple operations that is obviously out of reach for all of humanity for the foreseeable future - or rather, the lowest such value that is still believable.

It is very obvious that 2^256 simple operations is something that will never be reached. It is also very obvious that 2^64 simple operations can be reached as it already has been. Many of the recommendations seem to calculate 2^128 as a number that would be safe for 30 years or more. So the value I am looking for is likely between 2^128 and 2^256. I am guessing 2^160 or 2^192 might be safely out of reach.

But I want concrete arguments that can be easily reasoned about. I'd love to see arguments that are based on simple laws of physics or relations to concrete constants about the universe. For example, Landauer's principle could be used.

Note: the actual simple operations used are not relevant here - they might be operations on a quantum computer, or hash invocations, or whatever.

AviD
Nakedible
  • Of course this depends on how long a single "simple operation" takes, and how many of them you can do in parallel. An application of a SHA-256 hash takes more than a simple addition (one which fits your register size). – Paŭlo Ebermann Aug 11 '11 at 18:25
  • Well, obviously. But I'm looking for a count of operations that will not ever be reached, regardless of how simple it is. So counting values in a loop is good enough, or something even simpler. For example, in Landauer's principle, the unit of calculation is the change of a single bit. – Nakedible Aug 11 '11 at 19:05
  • @Nakedible, When you start talking about [Landauer's principle](http://en.wikipedia.org/wiki/Landauer%27s_principle), you should also talk about [unknown unknowns](https://en.wikipedia.org/wiki/There_are_known_knowns). – Pacerier Apr 12 '16 at 10:43
  • This sounds like a question for Crypto more than here, but this is a pretty interesting read. – Robert Mennell Jul 15 '16 at 17:17

4 Answers


As a starting point, we will consider that each elementary operation implies a minimal expense of energy; Landauer's principle sets that limit at 0.0178 eV, which is 2.85×10^-21 J. On the other hand, the total mass of the Solar System, if converted in its entirety to energy, would yield about 1.8×10^47 J (actually that's what you would get from the mass of the Sun, according to this page, but the Sun takes the lion's share of the total mass of the Solar System). This implies a hard limit of about 6.32×10^67 elementary computations, which is about 2^225.2. (I think this computation was already presented by Schneier in "Applied Cryptography".)
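For those who want to check the arithmetic, here is a minimal sketch (it assumes the Landauer limit at roughly room temperature, about 300 K, and takes the Sun's mass-energy as E = mc²; the constants are standard values, so small rounding differences from the figures above are expected):

```python
import math

# Landauer limit at ~300 K: k_B * T * ln(2) joules per bit erased
k_B = 1.380649e-23           # Boltzmann constant, J/K
T = 300                      # assumed room temperature, K
landauer_j = k_B * T * math.log(2)   # ~2.87e-21 J

# Mass-energy of the Sun: E = m * c^2
m_sun = 1.989e30             # kg
c = 2.998e8                  # m/s
e_sun = m_sun * c ** 2       # ~1.8e47 J

ops = e_sun / landauer_j
print(f"Landauer limit: {landauer_j:.3g} J per operation")
print(f"Solar mass-energy: {e_sun:.3g} J")
print(f"Hard limit: {ops:.3g} operations, about 2^{math.log2(ops):.1f}")
```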

Of course this is a quite extreme scenario and, in particular, we have no idea how we could convert mass to energy -- nuclear fission and fusion convert only a tiny proportion of the available mass to energy.

Let's look at a more mundane perspective. It seems fair to assume that, with existing technology, each elementary operation must somehow imply the switching of at least one logic gate. The switching energy of a single CMOS gate is about C×V^2, where C is the gate load capacitance and V is the voltage at which the gate operates. As of 2011, a very high-end gate will be able to run with a voltage of 0.5 V and a load capacitance of a few femtofarads ("femto" meaning 10^-15). This leads to a minimal energy consumption per operation of no less than, say, 10^-15 J. The current total world energy consumption is around 500 EJ (5×10^20 J) per year (or so says this article). Assuming that the total energy production of the Earth is diverted to a single computation for ten years, we get a limit of 5×10^36, which is close to 2^122.
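The same back-of-the-envelope figure, as a sketch (the 10^-15 J per gate switch, the 500 EJ per year, and the ten-year duration are simply the assumptions stated above):

```python
import math

energy_per_op = 1e-15          # J, assumed cost of one CMOS gate switch
world_energy_per_year = 5e20   # J, roughly 500 EJ of yearly production
years = 10

ops = world_energy_per_year * years / energy_per_op
print(f"{ops:.3g} operations, about 2^{math.log2(ops):.1f}")   # ~5e36, about 2^121.9
```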

Then you have to take into account technological advances. Given the current trends in ecological concerns and peak oil, total energy production should not increase much in the years to come (say no more than a factor of 2 until year 2040 -- already an ecologist's nightmare). On the other hand, there is technological progress in the design of integrated circuits. Moore's law states that you can fit twice as many transistors on a given chip surface every two years. A very optimistic view is that this doubling of the number of transistors can be done at constant energy consumption, which would translate to halving the energy cost of an elementary operation every two years. This would lead to a grand total of 2^138 in year 2040 -- and this is for a single ten-year-long computation which mobilizes all the resources of the entire planet.
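Extending that sketch with the two assumptions above (world energy production doubling once by 2040, and the energy per operation halving every two years under an optimistic reading of Moore's law):

```python
import math

ops_2011 = 5e36              # the ten-year computation at 2011 efficiency (see above)
energy_growth = 2            # assumed factor-of-2 growth in energy production by 2040
years_of_moore = 2040 - 2011 # ~29 years of halving the energy cost every two years
efficiency_gain = 2 ** (years_of_moore / 2)

ops_2040 = ops_2011 * energy_growth * efficiency_gain
print(f"about 2^{math.log2(ops_2040):.1f}")   # ~2^137.4, i.e. roughly 2^138
```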

So the usual wisdom of "128 bits are more than enough for the next few decades" is not off (it all depends on what you would consider to be "safely" out of reach, but my own paranoia level is quite serene with 128 bits "only").

A note on quantum computers: a QC can do quite a lot in a single "operation". The usual presentation is that the QC performs "several computations simultaneously, which we filter out at the end". This assertion is wrong in many particulars, but it still contains a bit of truth: a QC should be able to attack n-bit symmetric cryptography (e.g. symmetric encryption with an n-bit key) in 2^(n/2) elementary quantum operations. Hence the classic trick: to account for quantum computers (if they ever exist), double the key length. Hence AES with a 256-bit key, SHA-512... (the 256-bit key of AES was not designed to protect against hypothetical quantum computers, but that's how 256-bit keys get justified nowadays).
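As a small illustration of that doubling rule (a sketch; it assumes a Grover-style square-root speed-up is the relevant quantum advantage against symmetric keys, which is the standard assumption):

```python
# Effective work factor of an n-bit symmetric key against a classical
# brute-force search (2^n) versus a Grover-style quantum search (2^(n/2)).
for n in (128, 192, 256):
    print(f"{n}-bit key: classical 2^{n}, quantum ~2^{n // 2}")
```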

forest
Thomas Pornin
  • I really only wanted to say, wow. This is a beautiful answer, well done Thomas. :-) – Christoffer Aug 11 '11 at 19:15
  • Great answer! Just the kind of reasoning I was looking for. However, I'm not looking for "safely" out of reach, I'm looking for "not achievable by humanity by any means". Assuming no faster than light travel, limiting resources to our solar system seems reasonable - however, I think a figure far less than 2^225 could be reasoned that would still be reasonably out of reach of all humanity even if the fate of the entire species would depend on it. – Nakedible Aug 11 '11 at 19:26
  • Mightily impressive answer, wish I could upvote more than once. But even assuming hypercomputing is out of reach, don't forget reversible computing--if you can perform 2^31 of your operations reversibly, you might be able to reach that 2^256 operations on a classical computer, using a black hole for negentropy and throwing in the whole solar system piece by piece. – user502 Aug 12 '11 at 15:43
  • I (personally) don't think reversible computing will ever appear, so just like FTL, I don't consider it. – Nakedible Aug 14 '11 at 15:38
  • This answer is great, but I have one reservation with it. Let's say that we restate Moore's law as that the size of each individual transistor halves every two years (that's the same thing as doubling the number of transistors on the die, if the die is the same size). Where does that leave us? [With sub-atomic transistors](https://en.wikipedia.org/wiki/Moore%27s_law#Ultimate_limits_of_the_law), which pretty obviously isn't going to happen unless there's a major breakthrough in physics somewhere. An entirely new type of technology within a few years? Quantum computers aren't near that point. – user Apr 19 '14 at 10:53
  • Could the difference between 128-bit and 256-bit keys also be interpreted as some protection against weaknesses in AES discovered in the future? Would you ever expect an attack on AES to reduce its strength by half, kind of like a quantum computer might? – Jack O'Connor Feb 02 '15 at 22:29
  • That interpretation sure exists, and is even widespread; some people want bigger keys because they assume that key bits are somehow consumed gradually by attacks. This assumption, however, is not supported by facts. In reality, breaks tend to be all-or-nothing. In some specific cases, bigger keys can be weaker (for instance, AES-256 turned out to be substantially weaker than AES-128 against the rather esoteric, and fortunately not applicable in practice, related-key attacks). – Thomas Pornin Feb 03 '15 at 12:08
  • @ThomasPornin sure you can find a flaw in the algorithm, which is another concern entirely. This analysis is about the energy requirements to actually perform a brute force attack, supposing you do have to do it. If it's the algorithm that's broken, then deciding on a "safe" number of bits may not help. Most algorithm failures knock a few bits off the size of the search space an attacker has to search, but some failures are much worse than that. – mc0e Feb 11 '15 at 10:25
  • Mass is not converted to energy, that's a common misconception. Mass is *equivalent* to energy. They are descriptions of the same underlying thing. – isarandi May 18 '15 at 00:49
  • Interesting side note for universal limits: assuming 2.76K (background radiation temperature) instead of room temperature, and assuming there are 10^24 stars in the universe, we'll need at least 312 bits of security; 624 bits to be immune to intergalactic alien quantum computers. – Jay Sullivan Aug 01 '15 at 01:42 (see the sketch after this comment thread)
  • I think there's a problem with the first assumption that must be called out. Landauer's Principle only applies to irreversible operations. – davenpcj Jul 15 '16 at 16:18
  • Adding to @davenpcj's comment above, [If we had a “perfectly efficient” computer and all the energy in the Milky-way available, what number could it count to?](https://physics.stackexchange.com/q/257323/14091) may be relevant here. – user Apr 18 '17 at 11:50
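Out of curiosity, Jay Sullivan's figures in the comment above do check out under the same Landauer-style accounting (a sketch; it assumes 10^24 Sun-like stars fully converted to energy and bit erasure at the 2.76 K background temperature):

```python
import math

k_B = 1.380649e-23          # Boltzmann constant, J/K
T_cmb = 2.76                # assumed cosmic background temperature, K
landauer_j = k_B * T_cmb * math.log(2)   # ~2.6e-23 J per bit erased

e_star = 1.8e47             # J, mass-energy of one Sun-like star (see the answer above)
n_stars = 1e24              # assumed number of stars in the universe

bits = math.log2(n_stars * e_star / landauer_j)
print(f"about 2^{bits:.0f} operations: ~{bits:.0f}-bit keys classically, "
      f"~{2 * round(bits)} bits against a Grover-style quantum search")
```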

> Note: the actual simple operations used are not relevant here - they might be operations on a quantum computer, or hash invocations, or whatever.

Well, a quantum computer is the reason no one can tell you the "amount of simple operations that can be obviously seen as out of reach for all humanity for the foreseeable future". By definition a quantum computer performs the opposite of "actual simple operations"; it allows one to bypass some large portion of "simple operation" space through quantum sleight-of-hand. Once the computer that bypasses portions of that simple operation space exists, then your question about "how big does the space need to be" becomes unpredictably irrelevant.

That's the theory, anyway. We haven't yet reached the point where quantum computers can do what we think they should be able to do. Although I'm comfortable saying that such a quantum computer does and does not exist in a box somewhere.

gowenfawr
  • Quantum algorithms may reduce the amount of simple operations required to achieve a certain *goal* - such as cracking a cryptographic primitive - but I am not interested in that. A quantum operation is still an operation and quantum computers are still just computers. – Nakedible Aug 11 '11 at 19:03
  • I very nearly upvoted this for the final sentence alone. – user May 02 '13 at 15:03
  • @MichaelKjörling: I certainly did. – Kevin Li Apr 19 '14 at 00:19

Although I very much like @thomas-pornin's answer, I think there's a problem with the first assumption that must be called out.

Landauer's Principle only applies to irreversible operations.

Contrary to what some may assume, reversible computing is already achievable. The operations are common in quantum computers and homomorphic encryption systems.

More specifically, a given hash algorithm may use an operation like XOR to mix two bits and destroy the information about which was 1 and which was 0, but a CNOT (wiki) as in a Fredkin Gate is a reversible equivalent which produces the XOR result and a distinguishing control bit. If that data is preserved, the operation can be reversed, freely. If it is instead destroyed leaving only the XOR, Landauer's Principle applies.

As others have indicated, quantum computing is changing things, but that's because it uses CNOT gates instead of XOR with qubits, leaving the control bits around to preserve the quantum superposition state and perform the operation without expending the Landauer cost of destroying it.

Consequently, if the output states are collapsed (that is, bits of the hash are destroyed, at an energy cost, to match the known output), no additional cost is required to compute the input beyond probing the value of each bit.

Minimally, that should require at least the number of bits in the hash, and at least the number of bits in the input data. For a given hash algorithm, computing the digest for a block may require many more XOR/CNOT operations than the data itself; these would all have to be added up.

For a SHA-256 (wiki) of 1 Gigabit, that's 256 bits of the output hash, 1 Gigabit of input, and 16 AND/add/XOR operations on each 32-bit part of the 512-bit chunk, plus 8 more 32-bit adds to fold in the current value; or about 33 Kbits.
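One plausible reading that reproduces the ~33 Kbit figure is 64 compression rounds of 16 thirty-two-bit operations each, plus the 8 final 32-bit additions; the sketch below is just that accounting of bit-operations per 512-bit block under this answer's assumptions, not a precise gate count for SHA-256:

```python
rounds = 64            # SHA-256 compression rounds per 512-bit block
ops_per_round = 16     # assumed 32-bit AND/add/XOR operations per round
word_bits = 32
final_adds = 8         # folding the working variables into the chaining value

bits_per_block = (rounds * ops_per_round + final_adds) * word_bits
print(bits_per_block)  # 33024, i.e. ~33 Kbits per 512-bit block
```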

If you have ~2 Terabits of data storage in qubits, and the ~10^-15 J per bit to set up the problem and interrogate the state, you can reverse that hash.

Well, not quite reverse. You can find the set of all 1 gigabit-sized inputs which produce that output hash, and start spending more per bit to choose one of them.

Depending on the security needs, a collision may be sufficient.

(EDIT) Recently, researchers have published a demonstration of a non-reversible primitive operation requiring less energy than the cited limit, implying an error in the original calculations.

davenpcj

A recent thing to add here, which is probably relevant to the question, is that Landauer's Principle might not actually hold up:

http://phys.org/news/2016-07-refutes-famous-physical.html

They measured the amount of energy dissipated during the operation of an "OR" gate (that is clearly a logically irreversible gate) and showed that the logic operation can be performed with an energy toll as small as 5 percent of the expected limit of k_B·T·ln 2. The conclusion of the Nature Communications article is that there is no fundamental limit and reversible logic is not required to operate computers with zero energy expenditure.

Why did it take so long to discover this? Partly because the experiment had to achieve exceptional sensitivity in order to show that the Landauer limit could be beaten: better than 10^-21 Joule, where 1 Joule is the energy that it takes to raise an apple one meter above the ground. This is a very small amount of energy.
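For context, the limit being probed here is tiny; a minimal sketch of its value at room temperature (assumed to be about 300 K):

```python
import math

k_B = 1.380649e-23   # Boltzmann constant, J/K
T = 300              # assumed room temperature, K
print(f"k_B * T * ln(2) = {k_B * T * math.log(2):.2e} J")   # ~2.87e-21 J
```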

Nakedible
  • 10 to the power of 21 joules doesn't strike me as an exceptionally sensitive experiment, seeing as 10^21 joules is twice the human annual global energy consumption. – Mr47 Aug 11 '17 at 11:56
  • @Mr47 I have a feeling that it was supposed to be 10^-21, which is indeed very small. – forest Mar 07 '18 at 09:48