I've been watching Mr. Robot lately and can't stop wondering why it's so hard to decrypt files encrypted with AES using a 256-bit key.
Let us say the only method to find the key is through brute force.
Can't we set a computer to brute force starting from the first possible key, and another to begin from the last possible key, and perhaps a few computers to try the keys in the middle?
Wouldn't that reduce time dramatically?
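The split itself is trivial to arrange; a minimal Python sketch (purely illustrative — `key_ranges` is a made-up helper, and actually testing the keys is the impossible part) of dividing the keyspace among machines:

```python
# Illustrative only: divide a 2**bits keyspace into contiguous ranges,
# one per machine. Testing the keys, not splitting them, is the hard part.
def key_ranges(bits, n_workers):
    """Yield (start, end) half-open ranges covering [0, 2**bits)."""
    total = 2 ** bits
    step = total // n_workers
    for i in range(n_workers):
        start = i * step
        end = total if i == n_workers - 1 else start + step
        yield start, end

# Even with a million machines, each machine's share of a 256-bit
# keyspace is still about 1.16e71 keys.
first_start, first_end = next(key_ranges(256, 10 ** 6))
print(first_end - first_start)
```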
-
7 This post can help to understand why: https://www.reddit.com/r/theydidthemath/comments/1x50xl/time_and_energy_required_to_bruteforce_a_aes256/ – Jérémie B Mar 05 '16 at 16:38
-
2 Related question on Crypto.SE: http://crypto.stackexchange.com/q/1145/23623 (including canonical bear and physics based answer) – SEJPM Mar 05 '16 at 19:09
-
1 This has been [done before](http://www.distributed.net/History) with low-security algorithms. – Michael Hampton Mar 06 '16 at 00:19
-
3 That's essentially what using a good GPU does. – DJMcMayhem Mar 06 '16 at 00:35
-
10 So instead of eleventy gazillion years, you get another computer and now it only takes fifty-five gazillion years? – user253751 Mar 06 '16 at 10:07
-
1 Been there, done that. Related: [RSA Secret-Key Challenge](https://en.wikipedia.org/wiki/RSA_Secret-Key_Challenge). I had four computers chugging away at that one. Those were algorithms *already considered obsolete*, and distributed.net took *years* for *thousands* of computers to crack them. ;-) – DevSolar Mar 07 '16 at 14:43
-
This is exactly what [distributed.net](https://en.wikipedia.org/wiki/Distributed.net) did. See also [Amount of simple operations that is safely out of reach for all humanity?](http://security.stackexchange.com/q/6141/2138). – user Mar 07 '16 at 15:03
3 Answers
Sure it's possible, but it doesn't really help. The number of possibilities is just too large.
Consider that a 256-bit key has 2^256 possible values. That's 12 × 10^76, or 12 followed by 76 zeroes. If we generously assume that a computer can test a trillion (that's 10^12) possible keys a second, and that we have a trillion computers (where will we get them from?) performing the key search, it will take 12 × 10^76 / (10^12 × 10^12) seconds to search the entire keyspace. That's 12 × 10^52 seconds. As there are only 3,155,760,000 seconds in a century, it will take approximately 4 × 10^43 centuries to try all possible keys. There's a 50-50 chance that you'll find the key in only half that time.
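The arithmetic above is easy to reproduce; a quick sanity check in Python, where arbitrary-precision integers make this painless:

```python
# Reproducing the estimate: 1e12 machines, each testing 1e12 keys/second.
keyspace = 2 ** 256                       # ~1.16e77 possible keys
keys_per_second = 10 ** 12 * 10 ** 12     # total keys tested per second
seconds = keyspace / keys_per_second      # ~1.2e53 seconds
centuries = seconds / 3_155_760_000       # seconds in a century
print(f"{centuries:.2e} centuries")       # on the order of 4e43
```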
That's the way encryption is designed. The number of possibilities is just too large to be cracked in a time frame that is interesting to humans.
-
2 Indeed. This is why I believe the side-channel is nearly always the right approach. – Mark Buffalo Mar 05 '16 at 16:45
-
38 @tjt263 a typical side channel attack is a wrench: https://xkcd.com/538/ – Neil Smithline Mar 05 '16 at 16:54
-
2 @NeilSmithline Wrench-channel? :) There's almost always a way around the best encryption. With regards to communication, if you can compromise either User 1, or User 2, it's all a moot point. If you can physically access the computer of someone using FDE, or change their hardware in transit, you can silently implement a physical keylogger... for example, when they are on vacation / when they're buying hardware online (Supply Chain Interdiction). Another way would be to find a problem with the encryption's implementation. – Mark Buffalo Mar 05 '16 at 16:56
-
@NeilSmithline the wrench is one side channel. another side channel is called money. – emory Mar 05 '16 at 18:38
-
6 @MarkBuffalo please. Side-channels are a well-defined construct in cryptography. You're talking about ways to not bother with crypto at all. A side-channel is formally defined to leak *some* potentially helpful information upon execution of crypto (like [cache] timings and electromagnetic field changes) that if properly analyzed usually will leak secret information. Side-channel attacks by themselves do not alter the way the crypto works. – SEJPM Mar 05 '16 at 19:07
-
1 > *"In cryptography, a side-channel attack is any attack based on information gained from the physical implementation of a cryptosystem, rather than brute force or theoretical weaknesses in the algorithms (compare cryptanalysis)."* I stand corrected; had my terminology mixed up. Thanks! – Mark Buffalo Mar 05 '16 at 19:17
-
@MarkBuffalo an example of a side channel attack: [Acoustic cryptanalysis](https://en.wikipedia.org/wiki/Acoustic_cryptanalysis) and a video about this side channel: https://www.youtube.com/watch?v=DU-HruI7Q30 – Mar 06 '16 at 03:07
-
3 @SEJPM I don't see how rubber-hose cryptanalysis is excluded from that definition. Humans are part of the physical implementation of cryptography, and attacking the human-computer interface seems to perfectly fit into Mark Buffalo's definition. – March Ho Mar 06 '16 at 04:06
-
Using some [theoretical techniques](https://en.wikipedia.org/wiki/Grover%27s_algorithm) you can discard 38 of those zeros. That gets it down to 108,000 centuries, which is still impractically long but at least doable before the sun burns out. – aroth Mar 07 '16 at 05:29
-
1 100k CPU-centuries is just one century on a 100k-CPU supercomputer. Assuming that Moore's law holds for 18-month doubling, that's just a little over 1.5 years in 9 years' time, for a total of 10.5 years. So, 4×10^43 centuries becomes 10.5 years. For a single CPU, the Moore breakeven point is instead 33 years away, where 10,000,000 of our current cpu-years will cost less than 2.4 years, for a total crack time of 35.4 years. – Dewi Morgan Mar 07 '16 at 15:44
-
@Forest Tell that to the transistors: https://upload.wikimedia.org/wikipedia/en/9/9d/Moore%27s_Law_Transistor_Count_1971-2016.png – Dewi Morgan Apr 12 '18 at 19:00
I did a calculation on this one once. Let's assume AES can only be broken using brute force. Clearly we are going to need a counter, which counts from 0 to 2^256 − 1, and on average it will need to count to 2^255. Running this counter takes energy. How much energy does it take?
As it turns out, there is a thermodynamic limit here: Landauer's principle. At a given temperature, there is a minimum amount of energy it takes to set a bit (1 bit of entropy), because if we spent any less, we could actually decrease the entropy of the system, which is thermodynamically impossible. The energy it takes is kT ln 2, where k is Boltzmann's constant (1.38×10^−23 J/K) and T is the temperature in kelvin. Obviously we want to do this as affordably as possible, so let's do the calculations at 3 kelvin, which is roughly the temperature of the background radiation of the universe. We can't get any cooler than that without spending more energy to cool the system than we'd save on flipping the bits! This pins the energy cost of flipping a bit at 2.87×10^−23 J/bit.
Now, how many bit flips do we need? The answer will be a lot, so to keep the energy quantities in human-understandable terms, I'd like to simplify the problem. Rather than solving AES-256, let's pretend we were solving AES-192, which only requires counting to 2^191. So how many bit flips do we need? If we counted in normal binary, we might need to flip multiple bits per increment of the counter. That's annoying to calculate, so let's pretend we could implement this counter with Gray codes, which flip only one bit per increment.
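The one-flip-per-increment property of binary-reflected Gray codes is easy to verify; a small sketch:

```python
def gray(n):
    """Binary-reflected Gray code of n."""
    return n ^ (n >> 1)

# Consecutive Gray codes differ in exactly one bit, so incrementing the
# counter costs exactly one bit flip -- the accounting assumption above.
for n in range(1, 1 << 12):
    diff = gray(n) ^ gray(n - 1)
    assert diff and (diff & (diff - 1)) == 0, "more than one bit flipped"
print("one bit flip per increment, verified for 12-bit counters")
```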
Incrementing a counter 2^191 times, at 2.87×10^−23 J/bit, yields 9×10^34 J. That's a lot of energy. In fact, if I go to one of my favorite Wikipedia pages, Orders of magnitude (energy), we see that the energy emitted by our sun every year is 1.2×10^34 J. That's right. Just running the counter that would be at the core of the AES-breaking process would take the sum total of nearly a decade of the sun's energetic output. All of it.
Now if we revisit the original AES-256 problem, the energy cost goes up by a factor of 2^64. Thus that counter would take 1.6×10^54 J. Again, looking at Orders of magnitude (energy), we find that the total visible mass-energy in the Milky Way galaxy is 4×10^58 J. Thus, if you were to convert 0.004% of the total mass-energy of the galaxy to pure energy (using E = mc^2), you could run a counter which could count from 0 to 2^255.
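These numbers are straightforward to recompute; a quick sketch using the constants from the text:

```python
import math

k_B = 1.38e-23                 # Boltzmann's constant, J/K
T = 3.0                        # cosmic background temperature, K
e_bit = k_B * T * math.log(2)  # Landauer limit per bit flip, ~2.87e-23 J

e_aes192 = e_bit * 2 ** 191    # count to 2^191: ~9e34 J
e_aes256 = e_bit * 2 ** 255    # count to 2^255: ~1.6e54 J

sun_output_per_year = 1.2e34   # J radiated by the sun annually
milky_way_mass_energy = 4e58   # visible mass-energy of the galaxy, J

print(e_aes192 / sun_output_per_year)     # roughly 7.5 years of solar output
print(e_aes256 / milky_way_mass_energy)   # roughly 4e-5 of the galaxy
```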
This is why one never brute forces a modern crypto algorithm. The amounts of energy called for are literally at the level of "heat death of the universe."
-
6 I was hoping that someone would mention even building a counter to enumerate all the values by flipping bits is literally impossible with the amounts of energy we can currently harness. There are some other fun [physical limits to computation on Wikipedia](https://en.wikipedia.org/wiki/Limits_to_computation) if you're interested in this type of thing. – neocpp Mar 05 '16 at 22:15
-
Cort, could I get your opinion on my answer in this thread? Considering your knowledge and interest in both physics and information security, I think this would be a good question for you. – BuvinJ Mar 05 '16 at 23:17
-
1 Of course, there are a lot of galaxies we could convert to mass energy if we really wanted to break a key. – PyRulez Mar 06 '16 at 01:46
-
@PyRulez Of course, if you plan to make jokes like that, I would hope you understand *why* such a concept is laughable. (as a hint, do the same calculations for a 305 bit key, and compare the number to the visible mass of the universe, and then just imagine a 512 bit key) – Cort Ammon Mar 06 '16 at 01:53
-
I am surprised by your result. I thought there wasn't enough energy in the entire universe to count that far. And you are saying we wouldn't even need an entire galaxy. But either way it is of course unfeasible. – kasperd Mar 06 '16 at 12:02
-
That doesn't consider using "reversible" computing, or quantum computers. – JDługosz Mar 06 '16 at 14:49
-
1 @JDługosz BuvinJ actually asked about the quantum computers in another question. As for reversible computers, I think upgrading from technology we have right now to technology that we're not entirely positive can be built within the limits of physical reality is beyond the scope of what the OP was asking for =) It is also entirely possible that reversible computers, while avoiding the energy costs of bit-erasing, still may not avoid the computational time costs mentioned in the winning answer. – Cort Ammon Mar 06 '16 at 17:17
-
Ah! Reversible computing. Or "we can avoid an increase in entropy if we hoard everything" — so now, instead of having a counter to enumerate all the keys, we just go and store EVERY key. In which case, even if we stored each whole 256-bit key on a single proton, there aren't enough protons in the universe to store all the keys. – Aron Mar 06 '16 at 18:30
-
@Aron Bah! Use your imagination. Don't waste your time storing keys on protons, concentrate on how many angels you can fit on the head of a pin. Angels are much better at helping you crack encryption =) Joking aside, I'm willing to handwave around the storage space requirements because we have no idea what sort of clever trickery we'll invent before reversible computers can be a reality. And, at some point of absurdity, we do have to consider rubber hose cryptography: https://xkcd.com/538/ – Cort Ammon Mar 06 '16 at 18:37
-
1 @CortAmmon Sure. But at some point you use too big a rubber hose and then try to sue Apple. – Aron Mar 06 '16 at 23:17
It's not just possible, people have actually done this successfully. But only with very short keys.
distributed.net managed to break a 64-bit RC5 encryption in 2002 by using a distributed network of computers. The computers were mostly consumer-grade PCs owned by volunteers who installed a program to crunch keys in the background. This was part of the RSA Secret-Key Challenge — a contest by RSA Labs to decrypt messages encrypted with keys far shorter than you would use in the real world.
"After over four years of effort, hundreds of thousands of participants, and millions of cpu-hours of work, Distributed.net has brute forced the key to RSA Security's 64 bit encryption challenge"
- Source: slashdot.org
They started another project to win the RSA 72-bit challenge in 2003. 13 years later, they are still calculating and have not even tested 4% of the keyspace.
Keep in mind that these challenges used RC5 with deliberately short keys. The recommended key length for RC5 in the real world is at least 128 bits. Every additional bit doubles the computation time, so these distributed approaches are still many orders of magnitude away from attacking any real-world encryption.
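The exponential scaling is worth making concrete. A naive extrapolation in Python, assuming the same hardware that took roughly four years for 64 bits (the four-year figure is from the distributed.net result above; `crack_years` is just an illustrative helper):

```python
# Naive extrapolation: each extra key bit doubles the work, so a search
# that took base_years at base_bits takes 2**(bits - base_bits) times longer.
def crack_years(bits, base_bits=64, base_years=4):
    return base_years * 2 ** (bits - base_bits)

print(crack_years(72))    # 1024 years for the 72-bit challenge
print(crack_years(128))   # ~7.4e19 years for a real-world 128-bit key
```

Faster hardware only shaves a constant factor off these figures; it never touches the exponent.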
-
1999 - 2003: average processor speed was ~1 GHz. 2003 - 2016: avg processor speed ~3 GHz. So breaking 72-bit will take approximately: (2^(72-64)/3) * 4 years ~ 6144 years. – stackErr Mar 07 '16 at 20:36
-
@stackErr You're possibly ignoring graphics cards, which have gotten much faster, larger, with better optimization and lower latency than a 3-fold jump. Plus, the number of processors available between 2003 and 2016 has jumped considerably as well. I.e., 6144 is definitely an upper bound. – Nate Diamond Mar 08 '16 at 00:40
-
@NateDiamond purposely ignoring graphics cards/FPGAs/ASICs since the majority of the users (I would think) are not using those to brute force this. But yes, multiple processors can add another (quad core) 4x decrease in the calculations. That brings it down to 1536 years (still not feasible in a person's lifetime). – stackErr Mar 08 '16 at 05:06
-
I wouldn't be so sure though. A large number of computers nowadays come with dedicated graphics cards, not to mention the relatively reduced cost of older-generation graphics cards. Plus you're still missing that the number of available computers has increased greatly since 1999-2003. If someone gamified participation such that people could let their cell phones run it overnight, the number of available users would skyrocket relative to the original period. The issue is, nobody has put in the work yet. There's little incentive. A better measure would be to equate the process to the bitcoin mining pools. – Nate Diamond Mar 08 '16 at 16:29