We often joke that our attention spans have shrunk in recent years with the rise of digital technologies and screen-based entertainment, but there is solid science behind the observation. A shortened attention span is, in fact, a side effect of the recent explosion of screen distractions, as neurologist and author Richard E. Cytowic argues in his new book, “Your Stone Age Brain in the Screen Age: Coping with Digital Distraction and Sensory Overload” (MIT Press, 2024).
In his book, Cytowic explains that the human brain has not changed significantly since the Stone Age, leaving us ill-equipped to handle the influence and allure of modern technologies – particularly those propagated by large technology companies. In this excerpt, Cytowic highlights how our brains struggle to keep up with the rapid pace at which modern technology, culture, and society are evolving.
From a technical perspective, the brain has fixed energy limits that dictate how much work it can handle at any given time. Feeling overloaded leads to stress. Stress leads to distraction. Distraction then leads to error. The obvious solutions are to either stop the inflow or alleviate the stress.
Hans Selye, the Hungarian endocrinologist who developed the concept of stress, said that stress “is not what happens to you, but how you react to it.” The characteristic that allows us to successfully manage stress is resilience. Resilience is a welcome quality, because any demands that take you away from homeostasis (the biological tendency of all organisms to maintain a stable internal environment) lead to stress.
Screen-based distractions are ideal candidates for disrupting homeostatic balance. Long before the advent of personal computers and the Internet, Alvin Toffler popularized the term “information overload” in his 1970 bestseller, Future Shock, which advanced the dark idea that humans could become dependent on technology. By 2011, before most people had smartphones, Americans were receiving five times more information in a typical day than they had twenty-five years earlier. And now even today’s digital natives complain about the constant stress technology puts them under.
Visual overload is a more likely problem than auditory overload because eye-brain connections anatomically outnumber ear-brain connections by roughly three to one. Auditory perception mattered more to our early ancestors, but vision gradually took precedence, a shift that invites what-if speculation. Hearing is also inherently sequential: sound unfolds over time, so there is always a delay between when sound waves hit your eardrums and when the brain can make sense of what you are hearing. Vision, by contrast, takes in its input simultaneously; the only delay is the tenth of a second a signal takes to travel from the retina to the primary visual cortex, V1.
For anatomical, physiological, and evolutionary reasons, smartphones easily outcompete conventional phones for our attention. The bottleneck of what I call digital screen input is the capacity of each eye’s lens to pass information to the retina, then to the lateral geniculate nucleus, and from there to V1, the primary visual cortex. The modern dilemma we find ourselves in is all about flux: the flow of radiant energy that bombards our senses from near and far. For eons, the only stream that human sensory receptors had to transform into perception was the sights, sounds, and tastes of the natural world. From then until now, we have been able to detect only a tiny fraction of the total electromagnetic radiation that instruments tell us is objectively present. Cosmic particles, radio waves, and cell phone signals pass through us unnoticed because we lack biological sensors to detect them. But we are acutely sensitive to the manufactured flux that began in the twentieth century and that adds to the underlying natural one.
Our self-created digital overabundance keeps hitting us, and we cannot help noticing it and being distracted by it. A smartphone’s storage is measured in tens of gigabytes and a computer’s hard drive in terabytes (1,000 gigabytes), while data volumes at large are reckoned in petabytes (1,000 terabytes), zettabytes (1,000,000,000,000 gigabytes), and beyond. Yet humans still possess the same physical brains as our Stone Age ancestors. Our biology is, it is true, remarkably adaptive: we inhabit every niche on the planet. But it cannot keep up with the dizzying speed at which modern technology, culture, and society evolve. Attention span looms large in debates about how much screen exposure we can handle, yet no one takes into account the energy cost involved.
A much-cited study by Microsoft Canada claims that attention spans have dropped below eight seconds, less than a goldfish’s, which would explain why our ability to concentrate has gone to hell. But the study has shortcomings, and “attention span” is a colloquial term rather than a scientific one. After all, some of those same Stone Age brains proved capable of composing symphonies, monitoring the data streams of nuclear reactors and space stations, and solving mathematical problems long thought insoluble. There are individual differences in the capacity and ability to cope with stressful events. Gloria Mark of the University of California, Irvine, and her colleagues at Microsoft measured attention spans in everyday settings. In 2004, people spent an average of 150 seconds on one screen before moving to another. By 2012, that figure had fallen to 47 seconds. Other studies have replicated these results. We are bound to be interrupted, Mark says, if not by others, then by ourselves. The energy drain of all this switching is “like a leaking gas tank.” She has found that a simple timer, physical or digital, that prompts people to take periodic breaks helps considerably.
Neuroscience distinguishes among sustained attention, selective attention, and alternating attention. Sustained attention is the ability to concentrate on something for an extended period. Selective attention is the ability to filter out competing distractions and stay on task. Alternating attention is the ability to switch from one task to another and return to where you left off. Given the energy cost of repeatedly shifting attention throughout the day, I fear we have reached the limit of our Stone Age brain. Exceeding it brings foggy thinking, reduced concentration, thought blocking, and memory lapses.

Whether simple hand tools or precision calipers, any tool quickly comes to seem like an extension of oneself. The same applies to smart devices. Two centuries ago, when the first steam locomotives reached the blistering speed of thirty miles per hour, alarmists warned that the human body could not withstand such speeds. Since then, ever-faster cars, communication methods, jet planes, and electronics have spread through culture and been absorbed into daily life. In the past, fewer new technologies appeared each decade, fewer people were alive, and society was far less connected than it is today.
In contrast, the invention, proliferation, and evolution of digital technology have kept the status quo in constant flux. Unlike analog counterparts such as the landline phone or the record player, smart devices repeatedly demand and hold our attention. We have conditioned ourselves to respond to incoming texts and calls the moment they arrive. Certainly, jobs and livelihoods sometimes depend on an immediate response. Yet we pay the price in the energy cost of constantly shifting and refocusing our attention.
This excerpt has been edited for style and length. Reprinted with permission from “Your Stone Age Brain in the Screen Age: Coping with Digital Distraction and Sensory Overload” by Richard E. Cytowic, published by MIT Press. All rights reserved.