Do Neurons Process Information Efficiently?
Information is measured in bits, whether it is processed as binary digits within a computer or as action potentials within a neuron. However, Shannon's theory of communication (1948) implies that the energy cost of each bit increases disproportionately as the number of bits transmitted per second increases. In the context of Darwin's theory of evolution, this suggests that neurons should have evolved to process as much information as possible for each joule of energy expended. Accordingly, two competing hypotheses are evaluated: 1) Coding Efficiency: neurons transmit as much information per second as possible (bits/s), and 2) Energy Efficiency: neurons transmit as much information per joule as possible (bits/J). The available evidence strongly suggests that, whenever there is a choice between coding efficiency and energy efficiency, neurons choose energy efficiency. Overall, energy efficiency appears to be a dominant factor in determining the form and function of neurons, and may even represent a design principle for information processing within the brain. Some background reading material can be found here (with apologies for the book advertisements): https://tinyurl.com/y8nonkmn
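The disproportionate energy cost per bit can be illustrated with the Shannon-Hartley capacity of a Gaussian channel, C = B log2(1 + P/(N0 B)). Solving for the power P required to sustain a given bit rate shows that the energy per bit grows superlinearly with rate. The sketch below is an illustration under assumed unit bandwidth and noise density, not a model of any specific neuron:

```python
def power_required(rate_bps, bandwidth_hz=1.0, noise_psd=1.0):
    """Power (W) needed to transmit at rate_bps over a Gaussian channel.
    From Shannon-Hartley, C = B * log2(1 + P / (N0 * B)), solved for P."""
    return noise_psd * bandwidth_hz * (2.0 ** (rate_bps / bandwidth_hz) - 1.0)

def energy_per_bit(rate_bps, bandwidth_hz=1.0, noise_psd=1.0):
    """Joules per bit at a given rate: power divided by bit rate."""
    return power_required(rate_bps, bandwidth_hz, noise_psd) / rate_bps

# Doubling the bit rate more than doubles the energy cost of each bit,
# so a slower channel (or several slow channels) is more energy efficient.
for rate in [1, 2, 4, 8]:
    print(f"{rate} bits/s -> {energy_per_bit(rate):.3f} J/bit")
```

Under these assumed parameters, moving from 1 to 8 bits/s raises the cost per bit from 1 J to roughly 32 J, which is the trade-off between coding efficiency (bits/s) and energy efficiency (bits/J) described above.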