How Much Data Can Our Brains Store?


[INTRO ♪]

Computers are cool, and our brain is probably the coolest computer of all. When you’re shopping around for a new computer, one of the things you look for is the storage size of the hard drive: how much data your computer can hold. So it’s fun to pose the same question about ourselves. If our brains were computers, how much data could they store?

It’s not an easy question to answer, in part because any comparison between a computer and our brain is far from exact. They don’t really work the same way, but that doesn’t stop scientists from trying.

In computer science, the smallest unit of data is a bit, short for “binary digit.” A bit can have one of two binary states, 0 or 1, similar to on and off. Eight bits make a byte. By combining bits, you can store more information: with 2 bits, for example, you can store 4 different states (00, 01, 10, or 11).
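To make that counting rule concrete, here is a minimal Python sketch; it just restates the transcript’s own examples (n bits can distinguish 2^n states), nothing more:

```python
# Each added bit doubles the number of distinguishable states: n bits -> 2**n states.
def num_states(bits: int) -> int:
    return 2 ** bits

print(num_states(1))  # 2 states: 0 or 1
print(num_states(2))  # 4 states: 00, 01, 10, 11
print(num_states(8))  # 256 states in a single byte
```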
But not everyone agrees on what the neurological equivalent of a bit would be. Since a bit is the smallest unit of storage in a computer, one possible approach is to compare a single bit to a single synapse in the brain. That’s the area where two neurons meet and exchange information. Synapses are the workhorses of memory, the smallest functional unit, much like a bit. And like a bit, they were originally thought to have two states: on and off. There are approximately 250 trillion synapses in our brains, so if one synapse is one bit, that would be about 30 terabytes of data in our brain, or 30 trillion bytes.
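As a quick sanity check, here is that one-synapse-one-bit estimate as back-of-the-envelope Python; the 250 trillion figure is the rough count quoted in the video, not a precise measurement:

```python
SYNAPSES = 250e12      # ~250 trillion synapses (rough figure from the video)
BITS_PER_SYNAPSE = 1   # the simple "one synapse = one bit" assumption

total_bytes = SYNAPSES * BITS_PER_SYNAPSE / 8  # 8 bits per byte
print(f"{total_bytes / 1e12:.0f} terabytes")   # ~31 TB, i.e. about 30 trillion bytes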
However, it gets trickier. Signals from neurons at the synapse can vary in size. Instead of just being on or off, a single synapse can carry a signal with varying degrees of strength, and these differences in synaptic strength are associated with how strong a memory is. If the synaptic strength is small, you will be less likely to remember something than if the strength is large.

For a while, it was believed that a synaptic signal could only be small, medium, or large. With three possible states, a synapse would be roughly equivalent to 1 to 2 bits, which boosts the overall estimate a little.

But it gets even more complex. A study published in 2015 found that instead of a single synapse having only small, medium, and large signals, there were actually 26 possible states. The researchers measured the synaptic strength of two synapses coming from the same neuron. They predicted that each would have the same strength; instead, there was an 8% difference. Remember, these two synapses originated from the same neuron. When the researchers crunched the numbers, that 8% difference translated into a minimum of 26 different synaptic states. This larger number of possible states increases the range of synaptic strengths a single synapse can exhibit, making a single synapse capable of storing almost 5 bits.

Our brains are thought to have around 86 billion neurons, and a single neuron can have thousands of synapses. That works out to roughly 250 trillion synapses in the brain. If we do some back-of-the-envelope math at 5 bits per synapse, that gets us in the neighborhood of 150 terabytes.
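Putting it all together, here is a hedged Python version of that back-of-the-envelope math. It derives the “almost 5 bits” from the 26 states with a log2, and also works out the DVD comparison mentioned in the closing paragraph; all of the inputs are the transcript’s rough figures:

```python
import math

SYNAPSES = 250e12        # ~250 trillion synapses (same rough count as above)
STATES_PER_SYNAPSE = 26  # distinguishable strength levels from the 2015 study
DVD_BYTES = 4.7e9        # ~4.7 GB per single-layer DVD

bits_per_synapse = math.log2(STATES_PER_SYNAPSE)  # ~4.7, i.e. "almost 5 bits"
total_bytes = SYNAPSES * bits_per_synapse / 8     # 8 bits per byte

print(f"{bits_per_synapse:.1f} bits per synapse")
print(f"{total_bytes / 1e12:.0f} terabytes")      # ~147 TB, "in the neighborhood of 150"
print(f"{total_bytes / DVD_BYTES:,.0f} DVDs")     # roughly 31,000 DVDs
```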
It should be noted that these theoretical estimates are just approximations. We can’t really know for sure what our brain can store; this is more of an educated wild guess. Plus, what our brains do and what computers do are fairly different, so this is more of a thought experiment than an actual claim that your brain is a hard drive. Even so, if you figure a DVD can hold about 4.7 gigs, that’s in the neighborhood of 30 thousand DVDs. So give your brain some credit. It’s probably pretty spacious.

Thanks for asking, and thanks to our patrons on Patreon for making it possible to spend our time posing and answering such wild hypotheticals in the name of science. Patrons get to choose questions for us to answer, so if you’ve got a burning question of your own, consider supporting us at patreon.com/scishow.

[OUTRO ♪]

100 thoughts on “How Much Data Can Our Brains Store?”

  1. Since it's so hard for modern computers to replicate the human brain, will quantum computers be able to fix said issue in the future?

  2. What a way to determine the different "states" a synapse can have, lel. Synapses are not firing single ions; these are all dynamic biological/chemical events.

  3. Biologist: By computer standards, you are a marvel. You have virtually unlimited potential and space for storing information with your brain.
    Me: [focuses on only all the bad memories and everything I ever got wrong] D=

  4. I would say the Brain is more like RAM in a computer than the HDD, because once it is turned off, the data is all gone. It's volatile. You could also make other interesting parallels like damaged RAM and damaged parts of the brain, and/or degradation due to age.

  5. I can’t buy any of this. A SYNAPSE is NOT A REAL THING! It’s a PLACE, it’s a space where something happens only defined by “things” in that space interacting.

  6. Yet exams are coming up, and everything I read my brain’s just like “Ya know, I could store this. But why would I use up VaLuAbLe SpAcE on that”

  7. Just as not all neurons (nor synapses) transmit heat or pain sensation, not all neurons (nor synapses) can store information. The numbers given can most likely be reduced by about 80%, since most likely only the prefrontal cortex is involved in storing information.

    People who can't remember or learn sometimes have a small part of the brain affected.

  8. You'd be better off working out how much data neural network programs can hold within the network and starting from there, since it's not like traditional hard drives; the data isn't stored the same way. In computer science, each neuron has weights and a bias for each connection to the next neuron, so I think you have probably underestimated the volume of data to be stored by a substantial amount.

  9. This video just came out today, and it is already out of date. Brains are analogue and don't have an equivalent to bits (as single points that store a static bit), but they do have an equivalent to some outdated analogue storage. Specifically, brains use what is called delay line memory, but they do so in ways analogue hardware never did (because it was rendered obsolete by silicon transistors): by having multiple delay lines overlap in line and then be separated out with what in analogue would be called a multiplexer (mux/demux). That means the brain processes it more like analogue signals intelligence from the '50s-'70s, and more akin to how old cable and satellite TV signals were broadcast (which was already outdated tech when it was widespread), than like computer science.

    That's how brains physically store data, and the synapses are simply the paths that that data takes; they are just the wires and gates, outputting into the hippocampus. It's speculated that Alzheimer's is a form of degradation of this delay line memory, with the signal going out of sync by either speeding up or slowing down in the delay line.

    The big questions are: how many delay lines can the brain support (it would vary from person to person), how many signals can be on a given line, how long those signals are, how much data that loop accounts for, and whether there's data compression. Counting synapses as bits is a gross oversimplification.

  10. But not all neurons are dedicated to memory. I don't consider it part of my hard drive for my brain to be able to receive a signal coming from down the optic nerve.

  11. Cool topic. I had a feeling it would be this host. I just can't stand to listen to her; like, literally, I cannot keep listening to her.

  12. That estimate also doesn't account for space for a processor. At least some of those neurons must be used to process and transmit info to other parts.

  13. The synaptic strength argument doesn't make sense. If the strength of the synapse determines how well the memory is stored, then each synapse would have an upper limit of 1 bit, since computer memories store reality as it is (i.e. at the strongest synaptic strength). Anything else is just decay.

  14. This doesn't even take dreams into account, which are miraculous to me. While we sleep, our minds are creating people, places, and events that can feel very real, yet may have never actually occurred. It really blows my mind.

  15. How many years could a person live (assuming negligible senescence) before significant memories started being pruned to make room for new information?

    In simpler but inaccurate terms, how many years of memories fit in one brain?

  16. If what she says is true and the brain is pretty spacious, it's no wonder I keep hearing wind whistling when most people walk by; most of them are pretty much idiots, self-serving idiots!
    (And don't worry I know I'm guilty of it periodically my own self, just like everybody is!)

  17. I don't think brains store information in a way that can be quantified in bits and bytes. I'm definitely sure that brains don't store anything in a binary form.

  18. Mathematically, measure the number of nerve cells; transferring information from nerve cell to nerve cell is irrelevant, since the nerves themselves decide what goes where and such.

  19. It would be interesting to research whether those states are indeed totally independent. My gut feeling is that they are not… which would lower the number.

  20. You can't really compare brains with computers. They are far too different.

    Memories just aren't encoded like data. One bit of information is going to need a lot of synapses to be stored, while entire memories are stored much more efficiently than they would be in a computer.
    If you talk about how much a brain can store by storing the strength of synapses, you also need to store which neurons are linked by each one. That makes it a lot more.

  21. What evidence is there that synapses are discrete at all? Do we have anything that suggests they aren't in fact continuous, like an analog signal?

  22. Our brains are all like broken hard drives that only run in data recovery mode. If we're lucky and it was just yesterday that we emptied our recycling bin we might be able to get the majority of that important message back, but chances are we'll misremember it.

  23. If we compare the brain to a hard drive,
    it would have approximately 150 TB of ROM,

    yet I have 1 GB of RAM for such "huge" storage.

  24. All indications are that synapses are analog, not digital. That means the number of states they can have has no real limit; each of the uncountable number of different states can have a different interpretation, i.e. represent a different value. That means a synapse can't be equated to a bit, or a couple of bits, or a few bytes, or any other reasonable number of digital states. It all depends on how the states are interpreted, but in the brain, those interpretations are also analog, so all we can say is that the information capacity of a synapse is unknown, but could be "quite a lot". That's about as far as you can take this kind of comparison.

  25. It is really impossible to distinguish the computing part and the data storage parts when they are one and the same in the brain.

    They should have talked about the cerebellum. It is small but contains more synapses than all the other parts together, yet it has very simple wiring. As a result of that simple wiring it cannot really store memories, but is more useful for fast movement control feedback. (Actually, there are some extra connections which enable the cerebrum to use the cerebellum for other stuff too, but that is very poorly understood.)

  26. And of course it gets even more complicated by the fact that synapses aren't the only way neurons store information! Neurotransmitters from one synapse can float through interstitial fluid to reach nearby synapses, or even nonsynaptic receptors, allowing for wildly complex analog data transmission.

  27. Hmm…the problem I see with this is assuming a synapse can be “on or off.” Our brains work with neurotransmitters, right? Doesn’t that mean that each neurotransmitter is its own state? Considering there are over 100 of them, doesn’t that mean each synapse can hold over 100 bits? If so, 150 TB is still way too low.

  28. I wonder what the equivalent in bits is in our brain for doing tasks like learning how to cook a particular recipe, playing a song on an instrument, memorizing lines in a play (or in a sci-show video), etc.

  29. Our brain is generally a dual-core processor. We have barely any RAM, like no more than 512 MB. We have about a 1 TB hard drive. Our eye GPU is always the latest. RTX ++

  30. Clicked away in 1 second. Lose the stupid septum ring, or become a full cow. All credibility lost. Opinion invalidated. Grow up.

  31. Many portions of our memories contain data that doesn't have any kind of computer analog yet, like touch, taste, smell, or our ability to balance. So I feel like it might be interesting to do this thought experiment from the opposite direction, like estimating how many bytes any given memory would take to store. How do you store the data of a taste? However it could be stored would probably take up a ton of space.

  32. 250 trillion ^ 26 = 27755575615628913510590791702270507812500000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000 bytes

  33. Being that this is related to the research I will be doing, I find this pretty cool. Now, to add to what was said, the brain works in a much more complicated way. Much of the brain's resources are devoted to bodily functions and processing sensory input, leaving less for such things as storing or processing data. Our brains use many tricks for data compression, vision being a good example. It has been argued that much of what's left is used for storing and processing data related to language (vocabulary, grammar, etc.).

  34. What if neurons are actually transistors, like in a CPU, and the brain simply accesses memories stored entirely as electrical signals?

  35. Heh. And we can even take it a step further, though I never read the specific research paper (circa 2012): each memory is stored using a specific synaptic firing pattern, unique to the individual, and uses multiple synapses.

    So, with 250T synapses, each with min. 26 firing states, how many different patterns can be formed? And if we assume that each synapse can fire in each state with another synapse within a specific distance from itself to form a pattern, the number expands.

    I haven't done the math, on account of I'm not that good at numbers that size, but if I recall from the article, the estimate was somewhere in the exabyte range.

  36. If the strength of the signal only affects the power of the memory, not which memory, then it still only represents the same information, not different information; therefore the number of strengths it can potentially take is irrelevant, and 1 synapse is still only equivalent to 1 bit. Right?
