The Next Scarcity Is Already Inside Your Head
Key Points
- The average focus span is now 47 seconds, down from 2.5 minutes in 2004.
- Attention is elastic; you can train it back. But most people are letting it erode daily.
- AI can free your mind or hollow it out. The difference is how you use it.
Attention Is the New Oil and We Are Burning Through It
There is a resource you spend every waking hour of your life either investing or squandering, and almost no one is managing it strategically. It is not money. It is not time. It is attention. The raw, finite, irreplaceable capacity of the human mind to focus on one thing at a cost to everything else.
We are living through a moment where that resource is being strip-mined by systems smarter than we are, and most of us have not even noticed the excavation underway.
This matters more than any other economic force right now, and the question of whether human attention is fixed or elastic may be one of the most consequential questions of the next decade.
The Currency No One Is Guarding
In economics, scarcity creates value. When something becomes rare at the exact moment demand for it explodes, prices follow. We saw it with oil in the 20th century, with data in the early 21st, and we are watching it happen now with human attention.
The term "attention economy" was first coined by psychologist and Nobel laureate Herbert Simon in 1971. He argued that as information becomes abundant, the bottleneck shifts from information supply to the human capacity to process it. Information consumes attention, and attention is finite. More than fifty years later, Simon's framing has never been more accurate or more urgent.
An uncomfortable truth: every app on your phone, every platform on the internet, every algorithmic feed was engineered -- not designed, engineered -- to capture and hold your attention for as long as possible. Not because it is good for you. Because your attention, once captured, converts to advertising revenue.
Cal Newport, computer scientist and author of Deep Work, put it plainly: "This is my main concern with large attention economy conglomerates like X and Facebook: it's not that they're worthless, but instead it's the fact that they're engineered to be as addictive as possible."
The Data Is Damning
Gloria Mark, a professor of informatics at the University of California, Irvine, has been measuring human attention in digital environments for nearly two decades. Her 2023 book Attention Span documents a trajectory that should stop you cold. In 2004, the average time a person spent focused on a single screen before switching was two and a half minutes. By 2012 it had fallen to 75 seconds. Her most recent measurements place it at 47 seconds.
Read that again. Forty-seven seconds. That is how long people actually stay focused before their environment pulls them somewhere else.
And the cost of each interruption is staggering. Mark's research also found that once you are pulled away from a task, it takes an average of 25 minutes to return to the same depth of cognitive engagement. Think about that against the backdrop of a workday. A 2023 study by Economist Impact, commissioned by Dropbox, found that in the United States alone, $468 billion is lost annually to workplace distraction, roughly $37,000 per manager per year. Globally, the number exceeds $1.4 trillion.
We are hemorrhaging the most valuable cognitive resource on the planet, and we have industrialized the bleeding.
Fixed Supply or Elastic Capacity?
Here is the question I keep turning over: Is human attention a fixed supply, or is it something we can expand?
The conventional assumption, reinforced by the data above, is that attention is a scarce, depletable resource. It drains across the day. It fragments under pressure. It is limited by the architecture of human working memory. And that framing leads to a fairly grim conclusion: we are in a zero-sum war for a fixed pie, and the algorithms are winning.
But I think this framing is incomplete. And the distinction matters enormously.
Consider what researchers have found when they look at attention across different conditions rather than just measuring its degradation. The Center for Brain, Mind and Society has noted something important: the decline in attention span may say more about our environment than our actual cognitive capacity. When the environmental pressures are removed, that is, when people operate in distraction-free conditions, sleep adequately, and practice sustained focus over time, the underlying hardware performs dramatically better.
In cognitive science, evidence-based interventions such as deliberate practice in sustained focus, Pomodoro-style work blocks, and phone-free intervals produce measurable improvements in attention within two to four weeks. Looked at this way, attention is not purely extractive. It can be renewed and even developed.
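To make the Pomodoro-style work block concrete, here is a minimal sketch of the mechanic. The function name, parameters, and cycle structure are illustrative choices, not taken from any of the studies cited here; the classic rhythm of roughly 25 minutes of focus followed by a 5-minute break is the only part drawn from the technique itself.

```python
import time

def pomodoro(work_min=25, break_min=5, cycles=4, sleep=time.sleep):
    """Run simple focus/break cycles and return a log of what happened.

    The `sleep` parameter exists so the timing can be faked in tests;
    by default it really waits.
    """
    log = []
    for i in range(1, cycles + 1):
        log.append(f"cycle {i}: focus {work_min} min")
        sleep(work_min * 60)   # heads-down, phone-free interval
        log.append(f"cycle {i}: break {break_min} min")
        sleep(break_min * 60)  # deliberate recovery before the next block
    return log
```

The design point is less the timer than the forced alternation: a fixed, protected focus interval followed by a real break, repeated, rather than open-ended work punctuated by reactive interruptions.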
I think the more accurate mental model is: human attention has a natural baseline capacity that is both renewable and trainable, but it operates within a daily budget that is depletable. Sleep, recovery, and intentional practice expand the sustainable range. Chronic distraction and fragmentation shrink it.
Think of it less like a fuel tank and more like a muscle. You can exhaust it. You can atrophy it through disuse. But you can also train it, and the gains are real.
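The muscle analogy can be expressed as a toy model. Everything here is illustrative, the growth and erosion rates especially; the point is only the shape of the dynamic the article describes: a trainable baseline that compounds with practice and shrinks under chronic fragmentation.

```python
def simulate_attention(days, train=True, distraction=0.0, baseline=60.0):
    """Toy model of sustainable deep-focus capacity (minutes per day).

    `train` models deliberate daily practice; `distraction` (0.0 to 1.0)
    models chronic fragmentation. The 1% and 2% rates are invented for
    illustration, not measured values.
    """
    for _ in range(days):
        if train:
            baseline *= 1.01                   # practice compounds the baseline
        baseline *= (1 - 0.02 * distraction)   # fragmentation erodes it
    return round(baseline, 1)
```

Run for four weeks, the model diverges in the direction the research suggests: the trained, protected baseline grows, while the untrained, fragmented one decays, and neither path is static.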
Here is what makes this insight urgent rather than merely interesting: if attention is elastic, then the battle is not just about protecting what you have. It is about whether you are building capacity or letting it erode. And most people, right now, are letting it erode at an accelerating rate while assuming the degradation is simply the cost of modern life.
AI Enters the Picture and Everything Gets Complicated
Now layer artificial intelligence onto this landscape, and the dynamics get genuinely strange.
AI was supposed to solve the cognitive overload problem. And in specific, narrow ways, it does. AI can filter irrelevant information, surface relevant signals, handle routine cognitive tasks, and free up mental bandwidth for higher-order thinking.
But here is the paradox that the research is increasingly documenting: AI does not just reduce your cognitive load. It reshapes the cognitive habits that generate your capacity to think deeply in the first place.
A 2025 study published in the journal Societies found a significant negative correlation between frequent AI tool usage and critical thinking abilities. The mechanism was cognitive offloading: the more people handed thinking tasks to AI, the less they exercised the cognitive muscles that make independent reasoning possible. Strikingly, younger participants showed the sharpest decline.
A separate neural-behavioral study cited in educational research showed reduced neural engagement in users who relied heavily on AI to complete cognitive tasks. The brain, like any system, optimizes for efficiency. If AI is doing the heavy lifting, the brain stops training for it.
|  | AI as a scaffold | AI as a surrogate |
| --- | --- | --- |
| How AI is used | A tool that supports your thinking, surfaces information, and handles low-value processing | A replacement for your own reasoning, analysis, and judgment |
| Outcome | Frees attention for deeper work | Gradually hollows out the cognitive capacity you are trying to protect |
Conciseness Is Not Just a Style Choice
There is a thread connecting all of this that I want to pull on, because I think it is underappreciated: conciseness.
In a world where attention is the scarcest and most valuable resource, demanding more of it than necessary is a form of taking something that does not belong to you. Every unnecessary word in a report, every meeting that should have been an email, every AI-generated summary that buries its insight in five paragraphs: each is a small act of theft against the attention economy.
This is not just a productivity observation. It is a design philosophy and, increasingly, a competitive signal.
The businesses and communicators who understand that attention is a finite and precious currency will design their outputs accordingly. They will say what needs to be said and stop. They will build tools and products that return attention rather than consume it. They will treat cognitive load as a cost to minimize, not a space to fill.
This is why the best AI-native products are not the ones that do more. They are the ones that surface the right thing, faster, and get out of the way. The AI assistant that saves you four minutes of reading is not just more efficient, it is more respectful. And respect, in a world drowning in noise, is a competitive advantage.
Newport captures this as well as anyone and made it the organizing principle of his career: "The skillful management of attention is the sine qua non of the good life and the key to improving virtually every aspect of your experience."
The organizations that internalize this will win. Not because they are kinder, but because they are smarter about the only resource that cannot be manufactured, purchased, or scaled.
The Strategic Implication to Consider
We are entering an economy where the highest-value human activity is the kind of sustained, focused, deep cognitive work that algorithms cannot yet replicate. The ability to think hard about a complex problem without fragmenting, to hold a difficult idea in working memory long enough to actually turn it into something, to reason from first principles rather than pattern-match from training data. These skills are becoming scarcer at exactly the moment they are becoming more valuable.
Newport wrote the framework for this a decade ago: "The ability to perform deep work is becoming increasingly rare at exactly the same time it is becoming increasingly valuable in our economy. As a consequence, the few who cultivate this skill, and then make it the core of their working life, will thrive."
The forces eroding that capacity have intensified dramatically, and the reward for maintaining it has compounded.
The practical question, then, is not whether attention is the currency of the future. It clearly is. The question is whether you are treating your attentional capacity as an asset worth developing, or as a resource you are allowing others to extract.
Look at your phone screen time report. Look at how many times you checked it yesterday. Look at how many hours of your best cognitive energy went to notifications, shallow content, and reactive communication rather than the one or two things that would actually move your life forward.
That is not a moral judgment. It is an asset allocation problem. And right now, for most of us, the allocation is catastrophically wrong.
What I Am Watching
A few things I am tracking closely as this plays out:
The most interesting products of the next five years will not be the AI systems that do the most. They will be the ones that protect and extend human attention rather than compete with it. Think of them as attention-positive technology: tools that give you more cognitive capacity back than they consume. Whoever solves this at scale will own a category.
The skill premium for people who can do genuine deep work will continue to expand. Not because AI is replacing shallow work (though it is), but because the ability to do the kind of thinking AI cannot replicate is becoming rarer even as it becomes more necessary.
And the question of whether attention is elastic (whether it can be trained, protected, and expanded) will quietly become one of the most important questions in human capital and organizational strategy.
