Information storage in the brain

Last updated 9 November 2020

The brain probably stores around 10-100TB of data.

Support

According to Forrest Wickman, computational neuroscientists generally believe the brain stores 10-100 terabytes of data.1 He suggests that these estimates are produced by assuming that information is largely stored in synapses, and that each synapse stores around 1 byte. The number of bytes is then simply the number of synapses.

These assumptions are simplistic (as he points out). In particular:

  • synapses may store more or less than one byte of information on average
  • some information may be stored outside of synapses
  • not all synapses appear to store information
  • synapses do not appear to be entirely independent

We estimate that there are 1.8-3.2 x 10¹⁴ synapses in the human brain, which, under the procedure Wickman outlines, suggests that the brain stores around 180-320TB of data. It is unclear from his article whether the variation in the views of computational neuroscientists stems from differing opinions on the assumptions stated above or on the number of synapses in the brain. This makes it hard to adjust our estimate well, so our best guess for now is that the brain can store around 10-100TB of data, based on this being the common view among computational neuroscientists.
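As a rough illustration, the procedure Wickman outlines can be written out explicitly. The sketch below (in Python, with variable names of our own choosing) reproduces both our 180-320TB figure and Wickman's roughly 100TB figure under the simplifying one-byte-per-synapse assumption discussed above.

```python
# Back-of-the-envelope estimate following the procedure Wickman describes:
# assume information is stored in synapses, at roughly one byte per synapse.

TB = 1e12  # bytes per (decimal) terabyte

bytes_per_synapse = 1  # the simplifying assumption discussed above

# Our estimate of the number of synapses in the human brain.
synapses_low, synapses_high = 1.8e14, 3.2e14

print(f"{synapses_low * bytes_per_synapse / TB:.0f}-"
      f"{synapses_high * bytes_per_synapse / TB:.0f} TB")
# -> 180-320 TB

# Wickman's own figures: ~100 billion neurons x ~1,000 synapses each.
wickman_synapses = 100e9 * 1000
print(f"about {wickman_synapses * bytes_per_synapse / TB:.0f} TB")
# -> about 100 TB
```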


  1. “…Most computational neuroscientists tend to estimate human storage capacity somewhere between 10 terabytes and 100 terabytes, though the full spectrum of guesses ranges from 1 terabyte to 2.5 petabytes. (One terabyte is equal to about 1,000 gigabytes or about 1 million megabytes; a petabyte is about 1,000 terabytes.)

    The math behind these estimates is fairly simple. The human brain contains roughly 100 billion neurons. Each of these neurons seems capable of making around 1,000 connections, representing about 1,000 potential synapses, which largely do the work of data storage. Multiply each of these 100 billion neurons by the approximately 1,000 connections it can make, and you get 100 trillion data points, or about 100 terabytes of information.

    Neuroscientists are quick to admit that these calculations are very simplistic. First, this math assumes that each synapse stores about 1 byte of information, but this estimate may be too high or too low…”

    – Wickman 2012



Comments

  1. This becomes less exciting as we move to a decentralized/online storage model. Right, except that we’re not. People are creating their own multimedia content, being sold bloatier operating systems and applications, downloading high-definition movies, and playing games. There is more being stored on the average user’s drive than ever before, and any online storage market growth is: 1) still too insignificant to make a dent in traditional storage needs, 2) still requiring the use of storage somewhere to hold that data, requiring someone to buy that hard drive, and 3) unrealistic with current and near-future broadband speeds, especially in Canada. Distributed storage is growing, but generally within a household or office space among several computers, servers, game consoles, media centers, etc., all devices which use (and demand) bigger and faster storage. The announcement isn’t exciting for other reasons though, namely that a for a couple hundred bucks.

  2. I don’t like these kinds of estimates. The brain does not store information the way a modern random-access computer does. A computer stores information as a series of boolean values, so any unit of storage (in computers, a word, doubleword, or quadword, being 2, 4, or 8 bytes, respectively) is defined as being capable of storing 2^(number of bits) possible values. So one byte, being 8 bits, can exist in one of 2^8, or 256, states. Since boolean values are used, you can say with certainty that an 8-bit value can represent integers between exactly 0 and 255, a 16-bit value can represent integers between exactly 0 and 65535, etc. Saying a synapse can store one byte assumes several things:

    1) That a synapse can exist in 256 possible states. This is a problematic assumption because, unlike a byte, where each possible value has the same weight, a synapse can “prefer” to be in a given state. Synaptic facilitation or augmentation is far easier and faster than synaptic engram formation. Other factors may also be at play that affect the number of states a synapse can be in.

    2) That a synapse stores a single independent unit of information, i.e. that two synapses store exactly twice as much as a single synapse. This comes directly from our experience with computers, but as any computer scientist can tell you, it could not be further from the truth. The brain more likely stores information in the form of relationships, like a vastly more complicated version of a radix tree (a keyed membership data structure). If a single synapse acts like a single node in a radix tree, the “amount” of information it stores depends directly on every other synapse it is related to (see the sketch after this comment).

    3) That every synapse, or even a large fraction of synapses, stores information (an issue you pointed out already). Many synapses are not capable of storing information for long periods, or can’t store information at all.

    4) That synapses are atomic units of storage, with the number of synapses being the only factor determining information storage, discounting factors such as the state of the neuron and the proximity of one synapse to another. This assumes that one neuron with 100,000 synapses can exist in the same number of possible states as ten neurons with 10,000 synapses each. In reality, an action on one synapse will affect other synapses nearby, and collective actions on many synapses can change the behavior of the whole neuron.

    5) That neurons are the only source of information storage. In reality, glial cells heavily influence the excitability of a neuron. Astrocyte support can change a neuron’s behavior and thus “capacity”.

    6) That all information has the same value. When quantifying storage capacity, any individual bit is no different from any other bit. Sector 1234 on a hard drive is no more “important” than sector 4321, and either can be used to store any information you want. In the brain, a given synapse may be able to exist in 256 possible states, but might only encode the information necessary to calibrate its own response so it is not overactive. Such a synapse cannot be used to store yet another digit of pi you are trying to remember. The closest analogy in computer science is the Harvard architecture, where machine instructions and data are kept in separate memories. The brain is like this, but on steroids. Not all data is equal.

    In the end, applying information theory to neurons requires knowing how many bits each individual factor adds to a given neuron. Saying that a single synapse can exist in 256 possible states (or rather, that a postsynaptic neuron can only accurately determine activity with 1/256th granularity) and thus provides a byte of information is so simplified as to be incorrect.

    The only similarity between the brain and a computer is that both are capable of Turing-complete computation.
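To make the radix-tree analogy in point 2 of the comment above concrete, here is a minimal trie sketch in Python (a trie is a close, simpler relative of the radix tree). All class and function names are illustrative; this is a sketch of the general idea, not anything specified by the commenter or the article.

```python
# A minimal trie illustrating the point above: a node has no standalone
# payload. What it "stores" is defined entirely by its relationships to
# the nodes around it, so two nodes do not hold twice the information of
# one. Names are illustrative only.

class TrieNode:
    def __init__(self):
        self.children = {}      # edge label (one character) -> child node
        self.is_member = False  # does a stored key end at this node?

def insert(root, key):
    node = root
    for ch in key:
        node = node.children.setdefault(ch, TrieNode())
    node.is_member = True

def contains(root, key):
    node = root
    for ch in key:
        node = node.children.get(ch)
        if node is None:
            return False
    return node.is_member

root = TrieNode()
for word in ("cat", "car", "card"):
    insert(root, word)

print(contains(root, "card"))  # True
print(contains(root, "ca"))    # False: the same nodes are traversed,
                               # but no stored key ends here, so those
                               # nodes encode nothing on their own.
# "cat", "car", and "card" share the "c" and "a" nodes; what those nodes
# mean depends on every key that passes through them.
```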

