AI Skill Erosion in Product Teams

Aircraft autopilots are some of the most advanced automation systems in the world. Aviation is also one of the most safety-focused industries, and therefore one of the most closely studied when things go wrong. Decades of research on automation and its effects on pilot performance can inform how other automation, like AI, might affect human performance.

That research shows the danger of automation: pilots’ understanding of what the system is doing can slowly diverge from reality, making their responses less likely to be correct when the situation suddenly changes and increasing the likelihood of accidents.

Product teams face the same risk of losing situational awareness — about customers, the business, and the market — increasing the likelihood of failure when conditions shift unexpectedly.

Mode confusion

A term that appears frequently in aviation accident reports is mode confusion.

Mode confusion refers to situations where pilots have an incorrect mental model of what an automation is doing (e.g. what mode the autopilot is in), leading to incorrect expectations, loss of situational awareness, and delayed or inappropriate responses when they do act.

A well-known example is Asiana Airlines Flight 214, a Boeing 777 that came in too low at San Francisco in 2013 and hit the seawall just short of the runway. The captain had selected an autopilot mode that caused the auto-throttle to stop controlling speed. The crew didn’t realize this, and by the time they did, they couldn’t correct in time.

The key insight from this human-factors research is:

Automation doesn’t just affect the tasks humans do. It affects what they know.

This is not a failure of people; it’s a predictable outcome of relying on systems that appear to manage complexity on our behalf.

Product teams now have autopilots

With the rapid adoption of AI tools, product managers now have powerful “autopilots” of their own. With a short prompt, the AI will dutifully:

  • Summarize research

  • Generate strategies

  • Produce business cases

  • Write requirements

These are critical artifacts that spread through the company and affect downstream decisions for months or years, and yet, most companies have no guidelines for when and how to use AI. They simply expect their product managers to use AI correctly to get accurate results — no training!

In aviation, the use of automation is governed by strict training, checklists, protocols, and regular re-certifications. Pilots are explicitly trained to understand not just how automation works, but when to use it and when not to. In product management, any such rules and norms are nascent, at best, and definitely not in widespread use.

Without rules, the consequence for product teams is an increased risk of mode confusion: incorrect mental models that don’t reflect business or customer realities.

The artifact trap

This gap between mental model and reality is caused by what I call the “artifact trap.”

AI can make artifacts that look correct — PRDs, roadmaps, and business cases — because AI is particularly good at producing text that sounds coherent, well-structured, and plausible.

But plausibility is not understanding. When AI is used as the author rather than the assistant, product teams can still ship polished, professional-looking work while their internal models of the world quietly decay.

This is skill erosion, and it’s the silent killer of product companies over time.

Skill erosion

Skill erosion is not abstract or hypothetical. It has a specific path:

  1. Hypotheses are outsourced — Teams start by asking AI to generate answers instead of forming their own questions.

  2. First-hand sense-making is reduced — With answers already provided, teams spend less time wrestling with contradictions and ambiguity. Understanding is no longer constructed; it’s consumed.

  3. Mental models are inaccurately formed — Superficial understanding produces simplistic, incorrect models of the world.

  4. Judgement atrophies — When novel or unexpected things happen, teams make bad decisions based on those simplistic models.

What makes skill erosion particularly insidious is that it goes undetected until it’s too late. Like a flight crew that has been over-delegating to the autopilot, a team reacts in all the wrong ways when an emergency strikes, because its mental models don’t reflect what’s actually happening.

Using AI too early

To be clear, the problem in my mind is not using AI. The problem is using AI too early in product development.

Using AI too early means using it before:

  • You have written down what you know to be true about the situation.

  • You have articulated your understanding of the customer.

  • You have considered your known-unknowns.

  • You have explored how things could fail.

In other words, before deep understanding exists.

Using AI early in product development is like engaging autopilot before you know where you’re going or what conditions you’re flying into. Everything works fine, until it doesn’t.

The antidote: make understanding the goal

Human understanding must come first. AI should accelerate the work only after understanding exists; it cannot create the understanding itself.

One practical way to do this is a lightweight artifact I call the Product Flight Plan, but anything that captures the team’s understanding will work.

In aviation, pilots create a flight plan before takeoff. It's incredibly detailed (destination maps, route waypoints, speed and altitude at every moment, radio codes, weather, and dozens of other details). It's the front-loading of understanding pilots do while still safely on the ground — where things happen much more slowly than in the air — letting them react faster and more accurately when needed.

In product development, the Product Flight Plan serves a similar purpose, giving product teams the understanding they need to make decisions quickly throughout the product development process and after.

It answers these questions:

  1. What’s true (real signals) — Five facts we know to be true, with high confidence, about our situation.

  2. What’s unclear (known unknowns) — Five things we explicitly don’t understand yet.

  3. The user’s inner problem — One or more Jobs-To-Be-Done.

  4. Our feature bet — If we do X, we believe Y will happen because Z.

  5. The fear list — Five ways this project can fail in the real world.

Questions 1 and 2 articulate your understanding of the world.

Question 3 describes your customer.

Question 4 states your idea and its rationale.

Question 5 punctures both human certainty and AI-generated optimism.
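
To make this concrete, here is a minimal sketch of the Flight Plan as a structured artifact a team could keep in version control. This is my illustration, not a prescribed format: the field names and the thin_sections() checks are assumptions, and the point is simply that every section must be written by humans before any AI is involved.

```python
from dataclasses import dataclass

@dataclass
class FlightPlan:
    """A team's Product Flight Plan, written by humans before any AI use."""
    real_signals: list[str]     # 1. What's true: five high-confidence facts
    known_unknowns: list[str]   # 2. What's unclear: five things we don't yet understand
    jobs_to_be_done: list[str]  # 3. The user's inner problem (Jobs-To-Be-Done)
    feature_bet: str            # 4. "If we do X, we believe Y will happen because Z."
    fear_list: list[str]        # 5. Five ways this project can fail in the real world

    def thin_sections(self) -> list[str]:
        """Flag sections too thin to have forced real thinking (hypothetical checks)."""
        problems = []
        if len(self.real_signals) < 5:
            problems.append("Fewer than five real signals: what do we actually know?")
        if len(self.known_unknowns) < 5:
            problems.append("Fewer than five known unknowns: we are probably overconfident.")
        if "because" not in self.feature_bet:
            problems.append("The feature bet has no 'because': the rationale is missing.")
        if len(self.fear_list) < 5:
            problems.append("Fewer than five fears: failure modes are underexplored.")
        return problems
```

A team might review the thin_sections() output together before moving on; an empty list doesn't mean the plan is good, only that no section was skipped.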

Where AI should sit in product development

A healthy product development flow with the Flight Plan would look like this:

  1. Write the Flight Plan (no AI) — Forces deep thinking about the situation.

  2. Stress-test the plan (first use of AI) — AI is a partner, not the author (see the prompt sketch after this list). Ask AI to:

    • Suggest alternatives you didn’t consider.

    • List edge cases.

    • Think up more possible failure modes.

  3. Update the plan — Evaluate the AI’s points with the team’s judgement and adjust your understanding.

  4. Produce artifacts (with AI) — With understanding grounded in reality, AI now becomes a powerful (and safe) accelerator of the work.
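
One way to keep AI in the partner role at step 2 is to hand it the plan and explicitly constrain what it may produce. Continuing the hypothetical FlightPlan sketch above (the prompt wording here is illustrative, and the resulting string goes to whatever model the team already uses):

```python
def stress_test_prompt(plan: FlightPlan) -> str:
    """Build the step-2 prompt: the AI critiques the plan; it does not author artifacts."""
    unknowns = "\n".join(f"- {u}" for u in plan.known_unknowns)
    fears = "\n".join(f"- {f}" for f in plan.fear_list)
    return (
        "You are a critical reviewer, not an author.\n"
        f"Our bet: {plan.feature_bet}\n\n"
        f"What we don't yet understand:\n{unknowns}\n\n"
        f"How we think this could fail:\n{fears}\n\n"
        "Do NOT write requirements, strategy, or a business case. Instead:\n"
        "1. Suggest alternatives we didn't consider.\n"
        "2. List edge cases we've missed.\n"
        "3. Add failure modes missing from our fear list.\n"
    )
```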


There’s a great quote in aviation:

“A pilot may earn their full pay for the year in less than two minutes.” — Ernest Gann

It refers to a pilot’s true value being their skill, judgement, and composure when tested in moments of unexpected stress or danger. A product team’s value can be seen similarly.

During calm periods — stable markets, familiar customers, predictable constraints — almost any team can look competent. But when conditions change, only teams that have retained an accurate model of the world, and therefore an ability to sense and think correctly, can "land the plane" for the business.

There’s no question AI has changed product management forever and accelerated the speed of work. But teams that apply AI indiscriminately will eventually be exposed as having quietly, inadvertently outsourced their understanding, with long-term consequences for the business that outweigh the near-term feeling of speed.

Mark Rabo