Are We Running Out of Memory?
Human memory is limited, and we may be running out of space.
Posted April 28, 2025 | Reviewed by Davia Sills
Key points
- Memory has a limit, even though human beings don't perceive it.
- Prolonged sleep deprivation blurs the line between memory and reality—that's how a "memory limit" feels.
- Information overload may similarly push memory to the limit, degrading one's ability to remember new things.
- Being selective about information intake may improve the quality of people's memory.
Imagine that you knew how much memory you had left in your brain. Maybe there’s a progress bar in the corner of your eye. Or maybe you can visualize the percentage at will.
In any case, anytime you experience something new, you use up a little more memory, and you can feel it happening. After a difficult class or a long movie, you might notice that you are running low on storage. And if you run out, you can’t memorize anything new.
Such memory awareness might seem like a useful ability, until you realize the caveat: You can’t simply erase memory and free up space. The only way to get more space is to wait. Gradually, if you don’t create more memories, and especially after sleep, old memories fade and leave some room for new ones. But you have no control over that process. That’s how human memory works.
Imagine how paranoid you would get! Open a social media app, and see it pour gigabytes of useless soup into your remaining capacity. Turn on the TV, and watch your brain rapidly fill up with jingles from commercials. I bet that simply having that reminder of limited capacity would make you obsessive about controlling what enters your brain.
But hold on, you might say: we may not be aware of how much storage we have left, but we don’t run out of memory the way a computer does when you try to copy a big file.
As a matter of fact, we do. We just don’t feel it.
We memorize things by making adjustments to the wiring diagram of the brain. Neurons connect to each other through junctions called synapses, and each of those synapses can be made stronger or weaker. The more often a signal passes through a particular synapse, the stronger, or “broader,” that synapse becomes, and the more likely future signals are to flow through the same broadened path. That’s what our memory is: paths carved out in the brain by our experiences and thoughts, like water carving its own riverbed out of bedrock as it flows down the mountain.
For this memory to be useful, to flexibly reflect what is going on, something has to balance out the carving and restore some of the bedrock from which new experiences can carve new memories. Synapses can’t only be broadened; they also need to be narrowed. Otherwise, every channel deepens until the water flows everywhere at once.
This “narrowing” (weakening, or scaling) of synapses mostly occurs during sleep, which is why prolonged sleep deprivation causes an inability to focus, memory deficits, and eventually hallucinations.1 So when we max out on memory capacity, we don’t just stall and refuse to move forward. We lose the ability to distinguish reality from memory, and it all becomes a homogeneous blur.
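To make the carving-and-rescaling picture concrete, here is a minimal toy sketch in Python. It only illustrates the logic above; it is not a biological model, and every name and constant in it is an assumption chosen for clarity. Repeated use pushes synaptic weights toward a ceiling where every path is equally strong and nothing stands out, and a multiplicative “sleep” step scales them back down while preserving their relative pattern.

```python
import random

# Toy model of the strengthen-then-rescale cycle described above.
# All names and constants are illustrative assumptions, not biology.

NUM_SYNAPSES = 10
LEARN_RATE = 0.5   # how much one use "broadens" a synapse
SLEEP_SCALE = 0.8  # multiplicative narrowing applied during sleep
CEILING = 1.0      # synapses saturate; they cannot broaden forever

weights = [0.1] * NUM_SYNAPSES

def experience(active):
    """Hebbian-style strengthening: each used synapse moves toward the ceiling."""
    for i in active:
        weights[i] += LEARN_RATE * (CEILING - weights[i])

def sleep():
    """Homeostatic scaling: every synapse is narrowed proportionally,
    restoring headroom while preserving the relative pattern of strengths."""
    for i in range(NUM_SYNAPSES):
        weights[i] *= SLEEP_SCALE

# A long waking "day" of repeated experience drives all weights toward
# the ceiling, where every path is strong and none stands out.
for _ in range(20):
    experience(random.sample(range(NUM_SYNAPSES), k=4))
print("after a long waking day:", [round(w, 2) for w in weights])

sleep()  # rescaling restores room for new carving
print("after sleep:           ", [round(w, 2) for w in weights])
```

The choice of a multiplicative rather than subtractive step mirrors the scaling idea in the paragraph above: strong paths stay relatively strong after sleep, but the ceiling is no longer crowded, so new experiences have room to carve.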
And isn’t this exactly what is happening to humankind in 2025?
We submerge our brains, on a daily basis, in quantities of information that no biological brain was ever meant to comprehend. Could it be that we are all constantly hovering at 99 percent of memory capacity without even realizing it?
There is ample evidence, for example, that digital multitasking negatively affects focus, sustained attention, working memory, and long-term memory.2 What is especially interesting is that the attention of heavy digital media users is not just worse; it is more spread out. In one experiment, participants were supposed to focus on a task and ignore a distracting stream. But the researchers covertly inserted helpful hints about the main task into that stream, and the people who used digital media the most were the ones who made the best use of them.3
Maybe, then, we have reached the civilizational equivalent of a two-week sleep deprivation. When was the last time you thought about the fire at Notre-Dame Cathedral? That was only six years ago. The war in Ukraine captivated minds for a year but eventually faded from public imagination. When Donald Trump was shot on July 13, 2024, the image of his defiant fist in the air was hailed as the “Photo of the Century.” Where is it now? Mixed in with the soup, somewhere between a travel video from Thailand and a cute baby elephant.
We are losing the ability to distinguish important things from unimportant ones, and what we are experiencing for the first time from what we already know and expect. This is what maxing out on memory looks like.
What’s the solution? We need to recognize that brains do not have infinite capacity and that, in fact, it is very easy to run low on storage. Running low does not abruptly cut off our ability to learn, but it does gradually degrade the quality and strength of the memories we are able to form. So even if having that progress bar in the corner of your eye would be too much, imagining that you have one might occasionally be useful.
At the deepest level, we need to understand that a memory is not an object that we jam into our brain—it is a selection of one neural pathway over another. The more selective you are about what you memorize, the sharper, stronger, and more vivid each memory will be.
References
1. Waters, F., Chiu, V., Atkinson, A., & Blom, J. D. (2018). Severe sleep deprivation causes hallucinations and a gradual progression toward psychosis with increasing time awake. Frontiers in Psychiatry, 9, 303.
2. Ophir, E., Nass, C., & Wagner, A. D. (2009). Cognitive control in media multitaskers. Proceedings of the National Academy of Sciences, 106(37), 15583-15587.
3. Lui, K. F., & Wong, A. C. N. (2012). Does media multitasking always hurt? A positive correlation between multitasking and multisensory integration. Psychonomic Bulletin & Review, 19, 647-653.

