How many simple operations are safely out of reach for all humanity?

As a starting point, we will consider that each elementary operation implies a minimal expense of energy; Landauer's principle sets that limit, at room temperature, at 0.0178 eV, which is 2.85×10^-21 J. On the other hand, the total mass of the Solar System, if converted in its entirety to energy, would yield about 1.8×10^47 J (actually that's what you would get from the mass of the Sun alone, according to this page, but the Sun accounts for the lion's share of the total mass of the Solar System). This implies a hard limit of about 6.32×10^67 elementary computations, which is about 2^225.2. (I think this computation was already presented by Schneier in "Applied Cryptography".)
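For the sake of checking the arithmetic, here is a small Python sketch that reproduces these figures (constants are rounded, and room temperature is assumed):

    import math

    k_B = 1.380649e-23                  # Boltzmann constant, J/K
    T = 298.0                           # assumed room temperature, K
    landauer = k_B * T * math.log(2)    # ~2.85e-21 J ~ 0.0178 eV per erased bit

    solar_energy = 1.8e47               # J, mass-energy of the Sun (dominates the Solar System)
    ops = solar_energy / landauer       # hard limit on elementary operations

    print(f"Landauer limit: {landauer:.2e} J")
    print(f"Hard limit: {ops:.2e} operations, about 2^{math.log2(ops):.1f}")
    # -> Landauer limit: 2.85e-21 J
    # -> Hard limit: 6.31e+67 operations, about 2^225.2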

Of course this is quite an extreme scenario and, in particular, we have no idea how we could convert mass to energy wholesale: nuclear fission and fusion convert only a tiny proportion of the available mass into energy.

Let's look at a more mundane perspective. It seems fair to assume that, with existing technology, each elementary operation must somehow imply the switching of at least one logic gate. The switching energy of a single CMOS gate is about C×V^2, where C is the gate load capacitance and V is the voltage at which the gate operates. As of 2011, a very high-end gate can run with a voltage of 0.5 V and a load capacitance of a few femtofarads ("femto" meaning 10^-15). This leads to a minimal energy consumption per operation of no less than, say, 10^-15 J. The current total world energy consumption is around 500 EJ (5×10^20 J) per year (or so says this article). Assuming that the total energy production of the Earth is diverted to a single computation for ten years, we get a limit of 5×10^36, which is close to 2^122.
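Again, a small Python sketch of this estimate (the 10^-15 J per gate switch and the 500 EJ/year figure are the assumptions stated above):

    import math

    energy_per_op = 1e-15            # J, optimistic CMOS gate switch (a few fF at 0.5 V)
    world_energy_per_year = 5e20     # J, ~500 EJ per year
    years = 10                       # one decade-long computation

    total_ops = world_energy_per_year * years / energy_per_op
    print(f"{total_ops:.1e} operations, about 2^{math.log2(total_ops):.0f}")
    # -> 5.0e+36 operations, about 2^122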

Then you have to take technological advances into account. Given the current trends in ecological concerns and peak oil, total energy production should not increase much in the years to come (say, by no more than a factor of 2 until year 2040 -- already an ecologist's nightmare). On the other hand, there is technological progress in the design of integrated circuits. Moore's law states that you can fit twice as many transistors on a given chip surface every two years. A very optimistic view is that this doubling of the number of transistors can be done at constant energy consumption, which would translate to halving the energy cost of an elementary operation every two years. This would lead to a grand total of 2^138 in year 2040 -- and this is for a single ten-year-long computation which mobilizes all the resources of the entire planet.
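The 2^138 figure is just the previous estimate plus the two doubling effects; a sketch of the bookkeeping, assuming roughly 30 years of Moore-style halving of the energy per operation:

    import math

    bits_today = math.log2(5e36)   # ~122 bits, from the previous estimate
    energy_doubling = 1            # total energy production doubles by 2040: +1 bit
    moore_doublings = 30 / 2       # energy per op halves every 2 years over ~30 years: +15 bits

    print(f"about 2^{bits_today + energy_doubling + moore_doublings:.0f}")
    # -> about 2^138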

So the usual wisdom of "128 bits are more than enough for the next few decades" is not off the mark (it all depends on what you consider to be "safely" out of reach, but my own paranoia level is quite serene with "only" 128 bits).

A note on quantum computers: a QC can do quite a lot in a single "operation". The usual presentation is that the QC performs "several computations simultaneously, which we filter out at the end". This assertion is wrong in many particulars, but it still contains a bit of truth: a QC should be able to attack n-bit symmetric cryptography (e.g. symmetric encryption with an n-bit key) in 2^(n/2) elementary quantum operations. Hence the classic trick: to account for quantum computers (if they ever exist), double the key length. Hence AES with a 256-bit key, SHA-512... (the 256-bit key size of AES was not designed to protect against hypothetical quantum computers, but that's how 256-bit keys get justified nowadays).
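To make the "double the key length" rule concrete, here is a trivial sketch of the Grover-style square-root speedup:

    def quantum_security_bits(key_bits: int) -> float:
        """Grover search over 2^n keys takes about 2^(n/2) quantum operations."""
        return key_bits / 2

    for n in (128, 256):
        print(f"{n}-bit key: ~2^{quantum_security_bits(n):.0f} quantum operations")
    # -> 128-bit key: ~2^64 quantum operations
    # -> 256-bit key: ~2^128 quantum operations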


Note: the actual simple operations used are not relevant here - they might be operations on a quantum computer, or hash invocations, or whatever.

Well, a quantum computer is the reason no one can tell you the "amount of simple operations that can be obviously seen as out of reach for all humanity for the foreseeable future". By definition, a quantum computer performs the opposite of "actual simple operations": it allows one to bypass some large portion of the "simple operation" space through quantum sleight-of-hand. Once a computer that bypasses portions of that simple-operation space exists, your question about "how big does the space need to be" becomes unpredictably irrelevant.

That's the theory, anyway. We haven't reached the point in the future where quantum computers can do what we think they should be able to do, although I'm comfortable saying that such a quantum computer does and does not exist in a box somewhere.


A recent thing to add here, which is probably relevant to the question, is that Landauer's principle might not actually hold up:

http://phys.org/news/2016-07-refutes-famous-physical.html

They measured the amount of energy dissipated during the operation of an "OR" gate (which is clearly a logically irreversible gate) and showed that the logic operation can be performed with an energy toll as small as 5 percent of the expected limit of k_B·T·ln 2. The conclusion of the Nature Communications article is that there is no fundamental limit and that reversible logic is not required to operate computers with zero energy expenditure.

Why did it take so long to discover this? Partly because the experiment had to achieve exceptional sensitivity in order to show that the Landauer limit could be beaten: it had to resolve energies on the order of 10^-21 J or less, where 1 J is roughly the energy it takes to raise an apple one meter above the ground. This is a very small amount of energy.
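For scale, a quick sketch of what 5 percent of k_B·T·ln 2 amounts to at room temperature (the experiment may have run at a different temperature; this is only about orders of magnitude):

    import math

    k_B = 1.380649e-23                  # Boltzmann constant, J/K
    T = 298.0                           # assumed room temperature, K
    landauer = k_B * T * math.log(2)    # ~2.85e-21 J
    print(f"Landauer limit: {landauer:.2e} J, 5% of it: {0.05 * landauer:.2e} J")
    # -> Landauer limit: 2.85e-21 J, 5% of it: 1.43e-22 J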