Related to: Thermodynamics

Information and entropy relation to energy

We can distinguish two types of information in relation to the abstract and physical worlds:

  • Unbound information - possible symbols or states are understood as purely abstract, not tied to any physical system. This is the information used in mathematics.
  • Bound information - possible symbols or states are identified with microstates of some physical system.

Information, in order to be recorded, transmitted or processed, always has some material carrier and has to be mapped onto some signal. Processing such a signal changes the distribution of matter or energy in some thermodynamic system, so any computation on bound information is necessarily accompanied by a change in thermodynamic entropy.

Expressed in thermodynamic units, information bound to the states of a thermodynamic system is negative entropy (negentropy).

When we obtain information (by observation or measurement), entropy decreases. But the observation itself consumes energy, and that expenditure increases entropy. By the second law of thermodynamics, obtaining information always costs more energy than the energy equivalent of the information gained.

A thermodynamic system is a continuous region of space containing a large number of particles interacting with each other. The rest of the physical world is the system's surroundings.

A thermodynamic system can be one of three types:

  • Isolated system, which does not exchange any matter or energy with its surroundings
  • Closed system, which can exchange energy with its surroundings, but not matter
  • Open system, which exchanges both energy and matter with its surroundings.

The first two are only abstractions or temporary states of real systems. We cannot even obtain any information about the state of an isolated system. All computational systems are open: computational devices are material, and that material is mined, formed, assembled and disassembled, so any computation, no matter how abstract and symbolic, is bound to an exchange of matter and energy.

Even when abstract mathematical or symbolic processing is done purely in a human mind, the associated biological processes in the brain require an exchange of energy and matter.

Energy needed for signal modulation

Landauer's principle asserts that there is a minimum possible amount of energy required to erase one bit of information, known as the Landauer limit: E = k_B T ln 2, where k_B is the Boltzmann constant and T is the temperature. For T equal to room temperature, 20 °C (293.15 K), we get a Landauer limit of 0.0175 eV (2.805 zJ) per bit erased.
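The stated values can be checked numerically; a quick sketch using the exact SI values of the constants:

```python
import math

k_B = 1.380649e-23    # Boltzmann constant, J/K (exact in the 2019 SI)
eV = 1.602176634e-19  # joules per electronvolt (exact)

T = 293.15                 # 20 °C expressed in kelvin
E = k_B * T * math.log(2)  # Landauer limit, joules per bit erased

print(f"{E:.4e} J = {E / 1e-21:.3f} zJ = {E / eV:.4f} eV")
```

This reproduces the figures above: about 2.805 zJ, or 0.0175 eV, per erased bit.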

The equation can be derived from Boltzmann's entropy formula S = k_B ln W, where W is the number of states of the system, which for a bit is 2, together with the relation S = E/T. The operation of erasing a single bit therefore increases entropy by at least k_B ln 2, dissipating into the environment a quantity of energy equal to or greater than the Landauer limit.
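The chain of substitutions in this derivation can be written out step by step:

```latex
S = k_B \ln W                              % Boltzmann's entropy formula
\Delta S = k_B \ln 2 - k_B \ln 1
         = k_B \ln 2                       % a bit: two states erased to one
E \ge T \, \Delta S = k_B T \ln 2          % from S = E/T, i.e. E = T S
```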

This puts a fundamental ceiling on the number of computations per joule of energy dissipated. Until recently, this efficiency grew exponentially (doubling roughly every 2 years, the trend known as Koomey's law), so the Landauer limit would be reached around 2048. The recent slowdown has probably stretched the doubling time to about 2.6 years, which means more limited future gains in performance per watt.
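The effect of the slower doubling time can be sketched with simple arithmetic. The 2048 crossover figure comes from the paragraph above; the base year of 2015 is an assumption chosen only for illustration:

```python
# Illustrative arithmetic only: the 2048 projection is taken from the text,
# the base year 2015 is an assumed reference point, not a sourced figure.
base_year = 2015          # assumed start of the projection
limit_year_fast = 2048    # projected crossover with a 2-year doubling time

doublings = (limit_year_fast - base_year) / 2.0   # efficiency doublings needed
limit_year_slow = base_year + doublings * 2.6     # same doublings at 2.6 years each

print(doublings)        # doublings of computations per joule still available
print(limit_year_slow)  # the slowdown pushes the crossover back about a decade
```

Under these assumptions the same number of efficiency doublings would take until roughly 2058 instead of 2048.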