When AI Helps—But Also Hurts: The "Google Maps Effect" in Medicine
- Eric Goldman

- Aug 19
- 4 min read
“We wanted a co-pilot. But turns out, letting the autopilot handle too much can dull our flying skills.”
Artificial Intelligence (AI) has promised—and often delivered—a turbo boost in many domains. In the medical world, AI-assisted tools are helping to detect polyps during colonoscopies that human eyes occasionally miss. That sounds like a no-brainer win.
But a brand-new, real-world study, published in The Lancet Gastroenterology & Hepatology, reveals a wrinkle: doctors who use AI too routinely may lose some of their sharpness when the AI is not there. It’s the “Google Maps effect” in medicine—wonderful when it’s on, but when it’s off... well, you may not remember the way.

Let’s unpack why this matters—for doctors, patients, and the future of human-AI collaboration.
The Study at a Glance
In Poland, four endoscopy centers participated in the ACCEPT trial, which integrated AI tools for polyp detection in colonoscopies starting in late 2021. Researchers compared adenoma detection rates (ADR) in two periods:
Before AI adoption: experienced endoscopists performed 795 colonoscopies without AI assistance.
After AI adoption: they performed 648 colonoscopies without AI assistance, alongside AI-assisted ones.
The striking finding: ADR dropped from 28.4% to 22.4%, a 6-percentage-point absolute decline (approximately a 20% relative decrease) in detecting adenomas when doctors were not using AI—even among experienced practitioners.
For context, a 1% improvement in ADR corresponds to a 3% reduction in colorectal cancer risk—so losing 6 points could carry serious public health implications.
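The headline figures are easy to sanity-check. Here is a quick back-of-the-envelope sketch in Python; the ADR values come from the study, and the ×3 multiplier is the rule of thumb quoted above:

```python
# Recompute the article's headline numbers from the raw ADR values.
adr_before = 28.4  # ADR (%) in non-AI colonoscopies before AI adoption
adr_after = 22.4   # ADR (%) in non-AI colonoscopies after AI adoption

absolute_drop = adr_before - adr_after      # in percentage points
relative_drop = absolute_drop / adr_before  # as a fraction of the original rate

# Rule of thumb from the article: each 1-point ADR gain corresponds to
# roughly a 3% reduction in colorectal cancer risk, so a 6-point loss
# implies something on the order of an 18% swing in the other direction.
implied_risk_change = absolute_drop * 3

print(f"Absolute decline: {absolute_drop:.1f} percentage points")
print(f"Relative decline: {relative_drop:.1%}")
print(f"Implied cancer-risk impact: roughly {implied_risk_change:.0f}%")
```

The 6-point absolute drop works out to about a 21% relative decline, consistent with the "approximately 20%" figure reported.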
Why Did This Happen? And Should We Panic?
At first blush, the idea that a short period (mere months) of using AI could weaken doctors’ core diagnostic skills feels jarring. But experts help us interpret the nuance:
The “Google Maps effect”: As co-author Marcin Romańczyk puts it, when you lean on navigation too much, your internal map atrophies—and you may struggle when digital help disappears.
Automation bias: Once automation is introduced, people tend to defer to it, and that deference can quietly reshape habits and attention even when the tool is switched off.
Workload and fatigue: After AI was introduced, overall colonoscopy volume increased. More cases mean more fatigue, which can degrade performance regardless of technology.
Study limitations: This was an observational study, not a randomized trial. It involved a single AI tool and experienced doctors, so the results may not generalize.
So, no, we’re not ready to discard AI tools—but yes, the study is a red flag, reminding us that over-reliance can erode vigilance.
Three Mental Models to Understand This
AI as a Gym: Think of AI as workout equipment. It builds strength when you use it as a tool, but if you let the machine do the exercise for you, your muscles (skills) atrophy.
Co-pilot versus Pilot: AI should act like a co-pilot, not fly the plane solo. Pilots train regularly—even with autopilot installed.
Gardeners, Not Just Harvesters: Doctors must continue cultivating their observational and decision-making “skills garden,” not just harvesting results from AI.
Practical Safeguards: What Should Medical Leadership Do?
Here are evidence-based ideas inspired by the study and broader expert commentary:
1. Introduce ‘AI-Off Days’
Schedule regular shifts where doctors perform procedures without AI assistance—so skills stay sharp and muscle memory is maintained.
2. Track Performance with and without AI
Don’t assume overall improvement means universally better performance. Have performance metrics for both AI-aided and non-AI procedures.
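One way to make such split tracking concrete is to compute ADR separately by assistance mode. This is an illustrative sketch only, using a hypothetical `Procedure` record and toy data, not anything from the study:

```python
from dataclasses import dataclass
from typing import Dict, List, Optional

@dataclass
class Procedure:
    ai_assisted: bool    # was the AI detection tool active for this case?
    adenoma_found: bool  # was at least one adenoma detected?

def adr_by_mode(procedures: List[Procedure]) -> Dict[str, Optional[float]]:
    """Return ADR (%) computed separately for AI-assisted and manual cases."""
    rates: Dict[str, Optional[float]] = {}
    for label, mode in (("ai", True), ("manual", False)):
        subset = [p for p in procedures if p.ai_assisted == mode]
        if subset:
            rates[label] = 100.0 * sum(p.adenoma_found for p in subset) / len(subset)
        else:
            rates[label] = None  # no procedures recorded in this mode
    return rates

# Toy log: 4 AI-assisted cases (2 detections), 5 manual cases (1 detection).
log = ([Procedure(True, True)] * 2 + [Procedure(True, False)] * 2 +
       [Procedure(False, True)] + [Procedure(False, False)] * 4)
print(adr_by_mode(log))  # {'ai': 50.0, 'manual': 20.0}
```

Reporting the two rates side by side, rather than one blended figure, is what would surface a manual-mode decline like the one the study observed.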
3. Blend Training Modes
Ensure surgical training includes “manual mode” simulations—and monitor how performance degrades when AI isn’t available.
4. Monitor Workload and Fatigue
If case volumes spike post-AI implementation, compensate with scheduling adjustments. Fatigue skews everything—even without AI.
5. Improve AI Explainability and Integration
User-centered design and workflow alignment (highlighted in recent HCI-driven studies) keep doctors engaged and focused.
6. Regulate AI Usage and Certification
Professional bodies may consider certification standards that require doctors to demonstrate proficiency in both modes and to maintain continuing education in manual skills.
Looking Forward: What AI in Medicine Should Be
We stand at a crossroads. AI is powerful—but in pockets where lives depend on pattern recognition and split-second judgment, the danger of drift looms large.
AI’s promise is to augment, not replace, human expertise. And that means designing systems that:
Complement human strengths—spotting things the human might miss—but don’t let us offload cognitive effort entirely.
Educate—offer real-time feedback instead of silent overlay assistance.
Adapt—alert when doctors rely too heavily and prompt self-review.
Fall back safely—ensure clinicians remain proficient when systems fail (e.g., outages, cyber threats).
As Dr. Ahmad cautions, we’ve raced toward AI implementation—but we need to navigate this new human-AI ecosystem thoughtfully.
Final Word: The Art of Cautious Optimism
AI in colonoscopy—and medicine overall—isn’t a villain. It’s a partner with potential, but also a mirror—one that shows us where our skills may be slipping.
This study doesn’t dismiss AI; it raises a caution flag. Without thoughtful integration, the cure may inadvertently erode our capacities.
The best AI systems will remind us of what we can do—and when to do it ourselves. After all, AI should be our co-pilot, not our autopilot.
References:
- Study in The Lancet Gastroenterology & Hepatology: non-AI adenoma detection fell from 28.4% to 22.4% after AI adoption (absolute −6 pp; ~20% relative drop).
- Public health impact: a 1% ADR gain corresponds to approximately a 3% reduction in colorectal cancer risk.
- Expert commentary on de-skilling, automation bias, and the need for safeguards.
- Workload as a potential confounder (case volume doubled).
- Call for behavioral research and human-centered AI design.



