Human memory capacity comparable to entire Internet, study shows
In the past few decades, computing power and memory have increased exponentially.
In 1977, consumers were thrilled with the whopping 64 kilobytes that the Apple II was able to pack in. Imagine the awe those same consumers would experience if you told them that about 40 years later, the public would have access to 8-terabyte hard drives, 125 million times more storage.
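The ratio quoted above checks out with simple arithmetic (assuming decimal units, i.e. 1 KB = 10^3 bytes and 1 TB = 10^12 bytes):

```python
# Sanity check of the storage comparison: Apple II RAM vs. a modern drive.
apple_ii_bytes = 64 * 10**3       # 64 KB of RAM in the 1977 Apple II
modern_drive_bytes = 8 * 10**12   # an 8 TB consumer hard drive

ratio = modern_drive_bytes // apple_ii_bytes
print(f"{ratio:,}")  # 125,000,000 -> 125 million times more storage
```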
A similar revelation shook the science world Wednesday with the release of a study from the Salk Institute for Biological Studies. Researchers found that the memory capacity of the human brain is about one petabyte. For context, that is roughly the size of the entire World Wide Web.
"This is a real bombshell in the field of neuroscience," said Terry Sejnowski, co-author of the study. "We discovered the key to unlocking the design principle for how hippocampal neurons function with low energy but high computation power."
While running a computer simulation of a segment of a rat brain, the Salk researchers discovered something interesting about the hippocampus. Occasionally, they would find that a single axon from one neuron formed two dendritic connections with another neuron. This means the first neuron would be sending two copies of the same message to the other.
Upon closer inspection, they found that in many cases these two connections were very similar in size, differing by as little as 8 percent. Though this seems rather mundane to the layman, the size similarities led to further investigation, which allowed the Salk team to find that there are 26 different sizes of synapse, where it was previously supposed that there were only 3. To put things in computer terms, this equates to storing about 4.7 bits at each synapse.
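The jump from 26 sizes to 4.7 bits is just information theory: a synapse that can take one of 26 distinguishable states encodes log2(26) bits.

```python
import math

# With 26 distinguishable synapse sizes, each synapse can encode
# log2(26) bits of information.
bits_per_synapse = math.log2(26)
print(round(bits_per_synapse, 1))  # 4.7
```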
The new research also brought to light how the human brain manages to be so efficient while running on just 20 watts of power (about as much as a dim lightbulb). The reliability of a firing neuron is actually pretty low: when a neuron fires, there is only a 10 to 20 percent chance that the signal will activate the receiving neuron.
It appears that the brain combats this low batting average by modulating the size, and therefore the strength, of each synapse every 2 to 20 minutes. The stronger the synapse, the more likely it is to stimulate the next neuron. This, coupled with the 26 possible sizes and redundant dendrites, allows the brain to prioritize certain networks at certain times, spending less energy than a computer working on the same task.
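The effect of the redundant connections can be illustrated with a small Monte Carlo sketch. This is a deliberately simplified model, not the study's: it assumes each synaptic contact fires independently with the per-synapse probability quoted in the article, and that the message gets through if any contact fires.

```python
import random

def transmission_rate(p_single, n_contacts, trials=100_000, seed=0):
    """Estimate how often a message gets through when one axon makes
    n_contacts independent synaptic contacts, each succeeding with
    probability p_single. (Independence is a simplifying assumption.)"""
    rng = random.Random(seed)
    successes = sum(
        any(rng.random() < p_single for _ in range(n_contacts))
        for _ in range(trials)
    )
    return successes / trials

# A single 20% contact vs. the redundant two-contact case from the study.
print(transmission_rate(0.2, 1))  # ~0.20
print(transmission_rate(0.2, 2))  # ~0.36, i.e. 1 - 0.8**2
```

Under these assumptions, doubling the contacts raises reliability from 20 percent to about 36 percent without any single synapse needing to fire more often, which is consistent with the article's picture of trading redundancy for energy.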
Researchers believe that these new findings could significantly shape not only the future of neuroscience, but of computing as well.
"This trick of the brain absolutely points to a way to design better computers," says Sejnowski. "Using probabilistic transmission turns out to be as accurate and require much less energy for both computers and brains."