How large does a computer need to be before it is ‘as powerful’ as the human brain?

This is a difficult question, and past attempts to answer it have come with much uncertainty.

We have a new answer! (Longer description here; summary in the rest of this post.) This answer is based on ‘traversed edges per second’ (TEPS), a metric which emphasizes communication within a computer, instead of computing operations (like FLOPS). That is, TEPS measures how fast information can move around.

Communication can be a substantial bottleneck for big computers, slowing them down in spite of their powerful computing capacity. It seems plausible that communication is also a bottleneck for the brain, which is both a big computer, and one that spends lots of resources on communication. This is one reason to measure the brain in terms of TEPS: if communication is a bottleneck, then it is especially important to know when computers will match the brain on communication specifically, not just on easier aspects of being a successful computer.

The TEPS benchmark asks the computer to simulate a graph, and then to search through it. The question is how many edges in the graph the computer can follow per second. We can’t ask the brain to run the TEPS benchmark, but the brain is already a graph of neurons, and we can measure edges being traversed in it (action potentials communicating between neurons). So we can count how many edges are traversed in the brain per second, and compare this to existing computer hardware.
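To make the metric concrete, here is a toy sketch of how edges-traversed-per-second might be measured: run a breadth-first search over a random graph and divide the number of edges followed by the elapsed time. This is an illustrative stand-in, not the official benchmark code; the graph size, seed, and `bfs_teps` helper are all invented for the example.

```python
import random
import time
from collections import deque

def bfs_teps(adj, source):
    """Breadth-first search over an adjacency list, counting every
    edge followed; returns (edges_traversed, elapsed_seconds)."""
    visited = {source}
    queue = deque([source])
    traversed = 0
    start = time.perf_counter()
    while queue:
        node = queue.popleft()
        for neighbor in adj[node]:
            traversed += 1  # each edge followed counts, visited or not
            if neighbor not in visited:
                visited.add(neighbor)
                queue.append(neighbor)
    return traversed, time.perf_counter() - start

# Toy graph: 1,000 nodes with ~10 random undirected edges each.
random.seed(0)
n = 1000
adj = [[] for _ in range(n)]
for _ in range(5 * n):
    a, b = random.randrange(n), random.randrange(n)
    adj[a].append(b)
    adj[b].append(a)

edges, seconds = bfs_teps(adj, source=0)
print(f"toy TEPS ~ {edges / seconds:.2e}")
```

Even a laptop reaches millions of TEPS on a graph this small; the benchmark's difficulty comes from graphs far too large for any one machine's memory, which is what forces communication between machines.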

The brain seems to have around 1.8-3.2 x 10^14 synapses. We’d like to know how often these synapses convey spikes, but this has been too hard to discover. So we use neuron firing frequency as a proxy. We previously calculated that each neuron spikes around 0.1-2 times per second. Together with the number of synapses, this suggests the brain performs at around 0.18-6.4 x 10^14 TEPS. This assumes many things, and is hazy in many ways, some of which are detailed in our longer page on the topic. The estimate could be tightened on many fronts with more work.

The Sequoia supercomputer is currently the best computer in the world on the TEPS benchmark. Its record is 2.3 x 10^13 TEPS. So the human brain seems to be somewhere between as powerful and thirty times as powerful as the best supercomputer, in terms of TEPS.
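The "as powerful to thirty times as powerful" range comes from dividing the brain estimate by Sequoia's record (numbers from above):

```python
brain_low, brain_high = 1.8e13, 6.4e14  # brain TEPS range estimated above
sequoia = 2.3e13                        # Sequoia's TEPS record

# Low end of the brain range is roughly on par with Sequoia;
# high end is roughly 28x Sequoia, i.e. "up to about thirty times".
print(brain_low / sequoia, brain_high / sequoia)
```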

At current prices for TEPS, the brain’s performance should cost roughly $4,700-$170,000/hour. Our previous fairly wild guess was that TEPS prices should improve by a factor of ten every four years. If this is true, it should take seven to fourteen years for a computer which costs $100/hour to be competitive with the human brain. At that point, if having human-level hardware in terms of TEPS were enough to have human-level AI, human-level AI should be replacing well-paid humans.
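The seven-to-fourteen-year figure follows from the tenfold-every-four-years guess: if cost falls by 10x every 4 years, the time to go from the current cost to $100/hour is 4 x log10(cost / 100). A quick check with the figures above:

```python
import math

cost_low, cost_high = 4_700, 170_000  # $/hour for brain-equivalent TEPS today
target = 100                          # $/hour for a well-paid human's wages

def years_until(cost, target=target, factor=10, period=4):
    """Years until `cost` falls to `target`, given a `factor`-fold
    price improvement every `period` years."""
    return period * math.log(cost / target, factor)

# Roughly 6.7 years (low estimate) to 12.9 years (high estimate).
print(years_until(cost_low), years_until(cost_high))
```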

Moravec’s and Kurzweil’s estimates of computation in the brain suggest human-equivalent hardware should cost $100/hour some time in the past and in about four years, respectively, so our TEPS estimate is actually late relative to those. However, they are all pretty close together. Sandberg and Bostrom’s estimates of hardware required to emulate a brain span from around then to around thirty years later, though note that emulating is different from replicating functionally. Altogether, ‘human-level’ hardware seems likely to be upon us soon, if it isn’t already, and the estimate from TEPS points to the near future even more strongly.

(Featured image by MartinGrandjean)