Everything is faster. Nothing feels simpler.
Over the past couple of weeks, I published a fictional series about how a system fails.
At the time, it felt like distance. A way to step outside the usual conversations about AI and talk about failure without pointing at anyone in the room.
I’m less interested in that framing now.
Because when you look at how work is actually unfolding, the issue isn’t fictional. It’s not even particularly subtle. It’s just hard to name clearly when you’re in the thick of it.
Teams are moving faster. Work that used to take weeks now shows up in hours. People walk into meetings with something built—not a concept, not a rough idea, but a working version of what they think the solution should be. And on paper, that’s progress. It’s what we’ve been asking for: less friction, more output, shorter cycles.
But the way we do work hasn’t kept up.
The roles are mostly the same. The decision structures are mostly the same. The expectations about coordination, ownership, and accountability are mostly the same. We’re still operating on the assumption that our foundations, the processes and frameworks on which we’ve built careers and industries, will hold.
We’ve changed how quickly work can happen. We haven’t changed what it takes to make that work cohesive.
And that gap doesn’t just sit there. It creates pressure—and you can feel it in all the places where things should be working, but somehow aren’t.
Customers show up differently now. They don’t just describe problems—they bring prototypes. They’ve already used AI to generate something that looks finished, and they want to know why it can’t move just as quickly through the rest of the organization as it came out of their head over the weekend.
Inside the organization, things are even tighter. Teams are smaller. Fewer people are picking up more stuff. There’s less buffer, less redundancy, and less room for things to work themselves out over time. And people are anxious about the market, their value, their jobs.
So you get more speed coming in and less structure holding things together.
And the instinct is to treat those as separate problems—customer expectations on one side, internal constraints on the other.
They’re not separate. They’re both symptoms of the same thing:
The way work is happening has changed. The way we’re managing that work hasn’t.
If you want to understand where that starts to break down, you don’t have to look far. It shows up in the same places, over and over again—not as dramatic failures, but as small points of friction that don’t quite resolve.
Roles are collapsing. Context isn’t.
There’s a version of this story that sounds entirely positive. AI allows people to operate more independently. You don’t need to wait for design, or analysis, or engineering validation to get something moving. You can move from idea to execution in a single loop.
And people are.
It’s increasingly common to see someone walk into a conversation with something that looks finished—a prototype, a flow, a working version of what they think the solution should be. It’s fast, it’s compelling, and it cuts through a lot of the noise that used to slow things down.
But it also skips something.
That earlier process—the drawn-out process everyone complained about—was doing more than just moving work forward. It forced different perspectives to collide. Design translated intent. Product surfaced constraints. Engineering thought about what happens after launch, not just at launch. It wasn’t efficient. It was connective.
When that collapses into a single loop, the output still shows up. And often it shows up faster and cleaner than before. But the context that used to move through those layers doesn’t automatically come with it. That absence matters more than it seems.
You won’t feel that immediately. You’ll feel it later—when something needs to connect, when something needs to scale, or when something that worked perfectly on its own doesn’t quite hold when it meets the rest of the system.
No one did anything wrong. But the system is carrying less shared understanding than it used to.
Decisions are faster. Alignment is thinner.
AI pushes decisions closer to the work.
You don’t need to wait the same way you used to. You can generate options, run analysis, and move forward without pulling in the same number of people. That’s part of the appeal.
And in many cases, it works. Until it doesn’t.
Because decision-making was never just about choosing an option. It was about building shared context—getting multiple people to see the same problem in the same way before something moved forward. When decisions move faster than that context can travel, something else shifts.
Teams make locally reasonable decisions that don’t quite line up when they meet. Work progresses, but it doesn’t always connect. Speed smooths over everything, and you hear ‘we’ll figure it out when we get to it’ more and more. By the time someone notices, you’re not looking at a single bad decision. You’re looking at a set of decisions that don’t quite fit together, and not enough time, or appetite, to trace where you stepped off the path.
What looks like speed often shows up as fragmentation.
Coordination layers are thinner. The work didn’t go away.
For years, organizations have tried to reduce coordination overhead. Fewer meetings, fewer handoffs, fewer people in the loop. AI helps deliver that. Work moves more directly. Fewer intermediaries. Faster cycles.
But coordination isn’t something you can eliminate. It’s something you can only move.
When you remove formal coordination layers, the work doesn’t disappear. It shows up somewhere else—usually in ways that are harder to see.
It shows up in side conversations. In Slack threads trying to reconnect context. In someone staying late to make sure things line up before a deadline. In a manager stepping in because something feels off, even if they can’t immediately explain why.
From the outside, it looks simpler. From the inside, it’s just less visible.
Reskilling is focused on the wrong thing
Most organizations are responding to this moment by focusing on skills.
What skills do we need? How do we reskill the workforce? How do we measure progress?
That makes sense if the problem is a gap in capability. But the deeper shift isn’t about whether people can produce output. AI can generate output across domains. That’s not the constraint anymore.
The constraint is judgment.
Knowing what to build. Knowing what not to build. Knowing whether something that “works” actually fits into a larger system.
Those aren’t skills you pick up in a course.
They develop through exposure to context, through participation in decisions, through seeing how things connect over time. And right now, many organizations are compressing the very environments where that kind of judgment gets built. So you get more output, faster.
And fewer people who can reliably tell whether that output will hold.
The work that holds everything together is still invisible
Every organization depends on work that doesn’t show up cleanly.
Someone notices that two teams are about to make conflicting decisions and steps in early. Someone translates between groups that aren’t quite aligned. Someone keeps track of how things connect over time, even when no one has formally assigned them that responsibility.
That kind of work doesn’t produce artifacts. It prevents problems.
As output increases, this layer becomes more important—not less. There are more things to connect. More decisions to reconcile. More edges where things can break. But it’s still not tracked. It’s still not measured. It still doesn’t show up in dashboards.
So everything looks stable—because the instability is being absorbed somewhere else. Until it isn’t.
What this means in practice
This is where things start to feel confusing—because none of these changes show up as a single, obvious failure. They show up as moments.
A customer asks why something can’t be implemented immediately, because they’ve already built a version of it themselves. From their perspective, the problem is solved. From yours, nothing has been integrated, validated, or connected to the rest of the system.
A leadership team needs to make a decision quickly. They have more data than ever, more projections, more scenarios—but less shared understanding of how things actually fit together. The decision moves, but it doesn’t feel grounded.
A team delivers continuously, moving faster than it ever has before. But when something breaks, no one can easily trace where it started, who owns it, or how it connects.
These aren’t edge cases. They’re early signals.
The part no one is talking about directly
Culture doesn’t sit above this. It’s shaped by it. Culture is what happens in the middle of everyday work. And everyday work is changing faster than most organizations can keep up with.
Less time to align. Less clarity in roles. More pressure to move. More reliance on individuals to figure things out in real time.
That shifts behavior, whether you intend it to or not.
Toward speed over reflection. Toward autonomy over shared accountability. Toward output over understanding.
In organizations where trust is already fragile, this gets amplified. People don’t trust each other’s decisions. They don’t trust the system. So they don’t trust the outputs either.
AI doesn’t create that dynamic. It removes the buffers that used to hide it.
This is the opportunity
Most organizations are reacting to this as if it’s a tooling problem, or a training problem, or a temporary disruption.
It’s not.
It’s a structural mismatch.
Work is happening faster, more independently, and more visibly than ever. The way we coordinate, make decisions, and hold that work together hasn’t changed to match it.
That’s what people are feeling right now. Not confusion about AI.
Confusion about how to operate inside a way of working that no longer matches reality.
The opportunity isn’t to slow things down. It’s to redesign how work actually holds together under these conditions—intentionally, instead of by accident.
Because that change is coming either way. The only real question is whether you shape it, or end up reacting to it later.