Leaderology Blog

Something Big Is Happening. Leadership Must Keep Pace.

Written by Leaderology | Feb 19, 2026 5:45:57 PM

Matt Shumer recently described this moment in artificial intelligence as similar to “February 2020,” referring to that stretch of time when something significant was already in motion but most organizations hadn’t yet adjusted their thinking. He argues that AI capability is advancing faster than companies are prepared to absorb: timelines are compressing, output is multiplying, and yes, even white-collar jobs, possibly his own, could be replaced by AI agents.

He may be right. 

There’s no denying that AI is changing the pace of work. Decisions are being made faster and tasks that once required coordination across teams can now be handled by a system in minutes. That kind of acceleration is real.

But here’s the part we think deserves equal attention: If AI accelerates work, what happens to the people expected to absorb that acceleration?

As organizations flatten, expectations rise, and visibility increases, leaders will be expected to shoulder more. Because speed doesn’t just change workflow. It increases the G-force on the organization, and in flatter systems, leaders are the ones who have to absorb it.

The middle of the organization is becoming more consequential than ever.

This is the part many organizations are underestimating. Mid-level leaders won’t just be managing execution. They’ll be interpreting strategy, making trade-offs, and representing company principles in real time. They’ll be expected to exercise judgment under pressure (often without multiple layers below them to absorb the impact).

Whether we formally label it that way or not, they will be operating more like executives. Pushing someone into a pressure situation they aren’t prepared to carry won't make them grow faster. It will make the system more fragile.

And that’s not a technology problem. That’s a human one. 

 

Knowledge Doesn't Equal Wisdom

AI can summarize the data, recommend the most efficient path forward, and surface ten strategic options before the meeting even starts. But it can't sit in that meeting when the room goes quiet and raise its hand to say, “Hmm... I’m not sure this is the right move.”

It can't sense when everyone appears aligned but something feels unresolved. It can't decide to slow down because protecting long-term customer trust matters more than hitting a short-term number.

AI can optimize for speed, but it can't anchor to purpose and principles. That requires judgment. And moral or strategic judgment doesn't move at the same pace as technology.

That kind of wisdom takes time.

 

When Speed Increases, So Does The Cost of Silence

Boards want faster answers, markets reward responsiveness, customers expect immediacy, and AI is helping to deliver all of this at great speed. Yet to succeed in this high-velocity environment, standards such as ethics, quality, and culture need to be protected.

But here’s the uncomfortable truth:

You can mandate standards from the top, but to sustain them you need human courage in the room.

It’s no longer just about whether the analysis provided by AI is thorough. It’s about whether anyone feels secure enough to challenge the direction, especially in a climate where roles are shrinking, layoffs are visible, and AI is openly replacing tasks people once built careers around.

Without that psychological safety, pressure doesn’t produce excellence. It produces silence and, at best, mediocrity. People stay quiet when they should speak up, protecting stability instead of protecting the strategy.

And in fast-moving systems, that silence can get expensive very quickly.

 

The Part AI Can't Do

AI responds instantly, but it doesn't manage ego, regulate anxiety, or absorb pressure when revenue dips, competitors move first, or the board asks harder questions.

But leaders do.

Leaders can steady a room in a way that AI can't. Emotional regulation in those moments is a strategic advantage. It’s what allows a team to think clearly instead of reacting defensively. Emotional intelligence, a deep understanding of nuance, context, and complex situations, ethical decision-making, intuition, and “gut feelings” are just some of the innate human skills that are difficult to replicate in AI systems.

However, the most overlooked advantage humans have over AI is the beauty of human error. The ability to be imperfect, to miscalculate, to be imprecise, and to simply be forgetful has led to some superb discoveries and innovations.

So sure, AI can do a lot of things, but can it accidentally discover penicillin? Can it leave a petri dish on the bench a little too long, go on vacation, come back, notice something no one told it to look for, and think, “That’s interesting…”?

That's the beauty AI can't replicate.

 

The Human Work Ahead

We don’t need to argue with Shumer to see that acceleration through AI is real. It is.

The better question is whether we’re strengthening the human system at the same pace we’re upgrading the technology.

If speed threatens clarity, are our leaders rooted enough in purpose and principle to steady the room?

If pressure intensifies, are they willing to take relational risks to create the kind of environment where dissent isn’t just tolerated, but invited?

If performance increases, are we cultivating psychological safety to ensure standards are protected?

And if systems move faster than people naturally do, are we renewing human capacity intentionally or are we assuming our people's resilience will simply stretch to meet the demand?

There is no doubt AI will accelerate knowledge and output.

But, it will not accelerate wisdom.
It will not build courage.
It will not multiply authentic leadership.

That work remains human.