Doctors risk being ‘deskilled’ by relying too much on AI

When doctors use AI image recognition technology to spot and remove precancerous growths known as adenomas during colonoscopies, the detection rate is higher. But take the AI away, and their detection rate drops below where it was in the first place.

One recent study shows that using this AI tool results in a 12.5 percent increase in the adenoma detection rate (ADR), a gain that is expected to save lives.

But when doctors lose access to AI assistance, their ability to spot adenomas tends to drop below what it was before they started relying on AI, according to a study published in The Lancet Gastroenterology & Hepatology.

“Continuous exposure to AI might reduce the ADR of standard non-AI assisted colonoscopy, suggesting a negative effect on endoscopist behaviour,” the study concludes.

The analysis, based on data from four endoscopy centers in Poland between September 2021 and March 2022, compares the change in ADR of standard, non-AI assisted colonoscopy before and after endoscopists were exposed to AI in their clinics.

“The ADR of standard colonoscopy decreased significantly from 28.4 percent (226 of 795) before to 22.4 percent (145 of 648) after exposure to AI, corresponding with an absolute difference of minus 6.0 percent,” the study says.
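For readers who want to check the headline numbers, here is a minimal Python sketch that reproduces the ADR arithmetic from the counts quoted above (the variable names are ours, not the study's):

```python
# Reproduce the adenoma detection rate (ADR) arithmetic from the figures quoted in the study.

adenomas_before, colonoscopies_before = 226, 795   # standard colonoscopy, before AI exposure
adenomas_after, colonoscopies_after = 145, 648     # standard colonoscopy, after AI exposure

adr_before = adenomas_before / colonoscopies_before * 100   # ~28.4 percent
adr_after = adenomas_after / colonoscopies_after * 100      # ~22.4 percent

print(f"ADR before AI exposure: {adr_before:.1f}%")
print(f"ADR after AI exposure:  {adr_after:.1f}%")
print(f"Absolute difference:    {adr_before - adr_after:.1f} percentage points")  # ~6.0
```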

The 21 authors of the Lancet paper note that in 2019, the European Society of Gastrointestinal Endoscopy (ESGE) warned about the risk of “deskilling” in its AI guidelines [PDF].

“Possible significant risks with implementation, specifically endoscopist deskilling and over-reliance on artificial intelligence, unrepresentative training datasets, and hacking, need to be considered,” the ESGE said.

The authors say they believe their study is the first to examine the effect of continuous AI exposure on clinical outcomes, and they hope the findings will prompt further research into AI's impact on healthcare.

AI, for all its purported benefits in efficiency, may impose a cost on the people who use it. In June, MIT researchers published a related study that found the use of LLM chatbots to be associated with lower brain activity.

Concern about “deskilling” due to automation dates back decades. As noted in a recent paper from Purdue researchers, psychologist Lisanne Bainbridge’s 1983 work “Ironies of Automation” explored how the automation of industrial processes may expand problems for human system operators rather than solve them.

The Purdue academics argue the situation is similar for designers who come to rely on AI.

“Our findings suggest that while AI-driven automation is perceived as a means of increasing efficiency, excessive delegation may unintentionally hinder skill development,” they conclude.

Princeton University computer scientist Arvind Narayanan recently argued that developer deskilling as a result of AI is a concern. It is not like the fear, expressed years ago and never realized, that compilers would eliminate people’s ability to write machine code.

“On the other hand, if a junior developer relies too much on vibe coding and hence can’t program at all by themselves, in any language, and doesn’t understand the principles of programming, that definitely feels like a problem,” he said. ®
