
The brain is often described as “the world’s most sophisticated computer,” which may lead us to ask: what is its capacity? To answer that question, and to determine whether we can analogise the brain as having RAM, we have to consider the various psychological perspectives on the brain and how we understand it to work.

Psychology is made up of five paradigms, or perspectives on the discipline. Over time, the dominant paradigm (i.e. the one that accounts for the largest share of published and accepted research) has changed: it was once behavioural (classical-conditioning-style experiments, the idea that everything about us could be explained through learned behaviours, and so on) but in the mid-20th century it switched over to cognitive. Modern psychology mainly consists of cognitive psychology with some biological psychology as well, but the former has the most sway and has thus popularised its root metaphor: that the human brain is like a “programmed computer”. The problem is that cognitive psychology focusses only on mental processes, using the scientific method in human experiments to determine things about attention, memory, thinking and performance.
Biological psychology also thinks of humans as machines to some extent, but instead focusses
on biological processes which, while complex, are not binary. While both reveal insights into the brain, we can’t listen only to the popularised metaphor and forget that the brain is a complex organ. In a way the brain is like light: we need a number of different perspectives on how it works to fully understand its response in all situations.

Many people have tried to quantify the amount of “storage” in the brain, but most just count
the number of synapses (connections between neurons) and assume each carries 1 bit of
data. By these estimates, the brain has a total memory capacity of about 100 terabytes, but
this is a ridiculously simplistic calculation. When neurotransmitters are released across the
synapses, carrying information from one neuron to another, they can be of a variety of
different chemical compounds (all of which carry different sorts of messages) and can also be
emitted at different “strengths”. This suggests the 100-terabyte figure may be an underestimate: a synapse could carry more than one bit of information. For this reason some estimates go up to 2.5 petabytes (2.5 million gigabytes), described as “the equivalent of 3 million hours of recorded TV shows”. But again, this calculation is flawed and gives us no proper idea of our memory capability. “Episodic” memories, as formed in the hippocampus, are remembered like a piece of the event, but even so they are far from a film reel and would be stored differently (45 minutes of memory is not the same as downloading a new Game of Thrones episode onto your computer). On the other hand, we need to remember that multiple synapses are often
needed to convey one piece of information. If we see each piece of information as a bit of
data, assuming 1 bit per synapse could even be an overestimate. So people broaden the range
of the estimate: anywhere from 10 to 100 terabytes (or possibly up to that 2.5 petabytes).
But that doesn’t help us at all, and we still don’t know how much of the brain is for processing
and/or storage.
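How sensitive these back-of-envelope figures are to their assumptions can be made concrete with a short sketch. The synapse count and bits-per-synapse values below are illustrative round numbers chosen to land near the quoted figures, not measurements:

```python
def capacity_bytes(num_synapses: float, bits_per_synapse: float) -> float:
    """Convert a synapse count and a per-synapse information estimate to bytes."""
    return num_synapses * bits_per_synapse / 8

TB = 1e12  # decimal terabyte
PB = 1e15  # decimal petabyte

# ~1e15 synapses at 1 bit each lands in the ~100-terabyte range
print(f"{capacity_bytes(1e15, 1) / TB:.0f} TB")

# 20 bits/synapse on the same count reproduces the quoted
# 2.5-petabyte figure (an illustrative value, not a measurement)
print(f"{capacity_bytes(1e15, 20) / PB:.1f} PB")
```

Changing either input by an order of magnitude moves the answer by the same factor, which is why published estimates span from tens of terabytes up to petabytes.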

The RAM of a computer is its “on call” memory: all information in RAM can be accessed in the same amount of time regardless of where it sits in memory (unlike files on a disk, which must be located and read). So imagine we define this as short-term memory: the thoughts we have “in the now”, which are on call at any given moment and don’t require any long processing or digging to find. How do we
calculate the capacity of short-term memory? Initially, the psychologist George Miller claimed that the “working memory” of young adults was around 7 (plus or minus 2) “chunks” (elements of any type); however, the type of chunk affected how many could be held simultaneously. We
can keep more digits in our working memory than we can words, but even within the category
of words, those that take longer to say or which are unfamiliar before that moment represent
a greater demand to keep in the working memory. Although other factors affect personal
working memory, it was later proposed that young adults had about 4 chunks of working
memory, with this number decreasing in younger and older individuals.

However, this still isn’t an explanation! Chunks vary in types and sizes, and working memory
can be increased by encoding multiple pieces of information as one chunk: for example,
people who can memorise large numbers of digits often recall them as chunks of multiple
numbers. Not all chunks are born equal: how would we quantify chunks that are in the form of words? Their length, sound and familiarity all affect their storage demand. Additionally, working memory can be used to cue recall of more deeply stored information. If this information is effectively “on call” as if it were working memory, should
we take it into account as part of the brain’s RAM?
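The chunking trick can be illustrated with a toy sketch (the digit string and chunk sizes below are arbitrary examples, not drawn from any memory study):

```python
def chunk(digits: str, size: int) -> list[str]:
    """Group a digit string into fixed-size chunks, as a mnemonist might."""
    return [digits[i:i + size] for i in range(0, len(digits), size)]

number = "149217761066"   # 12 raw digits: far beyond a ~4-chunk limit
print(chunk(number, 1))   # 12 chunks, one per digit
print(chunk(number, 4))   # 3 chunks: '1492', '1776', '1066' -- familiar dates
```

Recalled as three familiar dates, the same 12 digits fit comfortably within a four-chunk working memory, which is why the raw item count is a poor measure of the load.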

The conclusion is that there is no conclusion: the brain is not binary and it is not a computer,
so we cannot analogise in that way. There is sort of an equivalent to RAM (working memory),
but it isn’t quantifiable in terms of megabytes.