AI-Driven, Instructor-Free Platform Could Transform Surgical Training

Surgeons in training have traditionally learned by watching an experienced mentor in the operating room, then gradually taking on parts of the procedure themselves. But this “see one, do one, teach one” model depends on having a skilled instructor present, which isn’t always possible. Variability in who’s available, differences in teaching styles, and the pressures of a busy hospital can leave trainees with uneven experiences and gaps in their skills.

To overcome these challenges, researchers at the University of Rochester Medical Center (URMC) have developed an autonomous educational system that pairs realistic, 3D‑printed organs with an augmented‑reality headset and artificial intelligence (AI) to create a closed-loop training environment. The system, Educational System for Instructorless Surgical Training (ESIST), is described in a study in the Journal of Medical Extended Reality.

“This proof‑of‑principle demonstrates that deep learning, paired with extended reality, can autonomously teach and assess a critical surgical maneuver with near-perfect accuracy and high user satisfaction,” said Jonathan Stone, MD, of the Departments of Neurosurgery, Neurology, and Biomedical Engineering and director of Surgical Innovation at URMC. “As AI architectures mature and hardware becomes more ergonomic, such systems are poised to transform both how surgeons learn and how AI augments surgeons in real time.”

Combining 3D‑Printed Organs, Augmented Reality & AI

ESIST evaluated surgeons as they performed one of the steps in a partial nephrectomy: placing a surgical clamp on the kidney’s artery. The simulation involved lifelike organs created through a combination of medical images, 3D printing, and tunable hydrogels, a process developed by Stone in collaboration with the URMC Department of Urology.

As a trainee practices placing the surgical clamp, the headset streams step-by-step guidance right into their field of vision. The researchers created a bespoke convolutional neural network, a type of AI well-suited for processing data with a grid-like structure, such as images and videos. The network watches the laparoscopic camera feed, instantly recognizing whether the clamp is on the correct blood vessel, on a vein, on the ureter, or not placed at all. If the clamp is misplaced, the system pauses and offers a correction; if it’s on target, it encourages the user to move forward. All of this happens without a human proctor in the room.
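In essence, the description above amounts to a four-way frame classifier driving a feedback loop. The sketch below is a hypothetical illustration of that idea in PyTorch, not the URMC model; the network architecture, class labels, and guidance messages are all assumptions made for the example.

```python
# Hypothetical sketch of a frame-classification feedback loop (not ESIST's actual model).
# Assumes PyTorch is installed; architecture, labels, and messages are illustrative only.
import torch
import torch.nn as nn

# The four placement states the article describes the network distinguishing.
CLASSES = ["artery_clamped", "vein_clamped", "ureter_clamped", "not_clamped"]

GUIDANCE = {
    "artery_clamped": "Correct placement. Proceed to the next step.",
    "vein_clamped": "Clamp is on a vein. Release and reposition onto the artery.",
    "ureter_clamped": "Clamp is on the ureter. Release and reposition onto the artery.",
    "not_clamped": "No clamp detected. Place the clamp on the renal artery.",
}


class ClampClassifier(nn.Module):
    """A minimal convolutional classifier over single laparoscopic frames."""

    def __init__(self, num_classes: int = len(CLASSES)):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv2d(3, 16, kernel_size=3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
            nn.Conv2d(16, 32, kernel_size=3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
            nn.AdaptiveAvgPool2d((8, 8)),
        )
        self.classifier = nn.Linear(32 * 8 * 8, num_classes)

    def forward(self, frames: torch.Tensor) -> torch.Tensor:
        # frames: (batch, 3, H, W) RGB frames from the camera feed.
        x = self.features(frames)
        return self.classifier(x.flatten(1))


def advise(model: nn.Module, frame: torch.Tensor) -> str:
    """Classify one frame and return the guidance message to overlay in the headset."""
    model.eval()
    with torch.no_grad():
        logits = model(frame.unsqueeze(0))          # add a batch dimension
        label = CLASSES[int(logits.argmax(dim=1))]  # most likely placement state
    return GUIDANCE[label]


if __name__ == "__main__":
    model = ClampClassifier()              # untrained; for illustration only
    dummy_frame = torch.rand(3, 224, 224)  # stand-in for one camera frame
    print(advise(model, dummy_frame))
```

In a live loop, the same call would run on each incoming frame, with the returned message rendered into the headset overlay until the classifier reports the correct placement.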

When seventeen participants tried ESIST, the AI correctly identified the artery clamp placement 99.9% of the time. Trainees reported that the real-time feedback felt valuable, and 84% of their survey responses rated the system favorably for teaching this critical step. By logging each action and timing how long it took, ESIST also provides objective performance data—highlighting errors and tracking improvements.
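As a rough illustration of the kind of objective record such a system can keep, the following hypothetical sketch timestamps each recognized event and summarizes misplacements and time to completion; the event names and fields are assumptions, not ESIST’s actual log format.

```python
# Hypothetical event log for objective performance data (illustrative only).
import time
from dataclasses import dataclass, field


@dataclass
class AttemptLog:
    """Records classifier decisions with timestamps for one clamping attempt."""
    events: list = field(default_factory=list)

    def record(self, label: str) -> None:
        # Store (monotonic timestamp, recognized placement state).
        self.events.append((time.monotonic(), label))

    def summary(self) -> dict:
        # Count misplacements (clamp on the wrong structure) and total elapsed time.
        misplacements = sum(1 for _, label in self.events
                            if label in ("vein_clamped", "ureter_clamped"))
        elapsed = self.events[-1][0] - self.events[0][0] if len(self.events) > 1 else 0.0
        return {"misplacements": misplacements, "seconds_to_complete": round(elapsed, 1)}


if __name__ == "__main__":
    log = AttemptLog()
    for label in ["not_clamped", "vein_clamped", "artery_clamped"]:
        log.record(label)
    print(log.summary())
```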

“The goal is to move the very early portion of the learning curve outside of the operating room,” said Stone. “This system doesn’t replace the mentor; it prepares the student better before they work with that mentor on patients.”

The Future: Scalable, Real-Time Surgical Guidance

Looking forward, systems like ESIST could reshape surgical education as the same technology grows to cover entire procedures, offering consistent, data-driven instruction at scale. In the future, surgeons might even use similar AI-powered overlays during real operations, receiving alerts about anatomy or technique in real time. As hardware and algorithms continue to improve, autonomous platforms have the potential to standardize training and become indispensable assistants at the operating table.

“This early step was about training, but in the future, it will be about augmenting performance,” said Stone. “The long-term vision is that AI models will be running in the background, processing surgical data, and providing real-time predictive analytics that improve efficiency and reduce complications.”

Additional co-authors include Nelson Stone of the Icahn School of Medicine at Mount Sinai, and Steven Griffith, Kyle Zeller, and Michael Wilson of Viomerse, Inc. Stone is the founder and principal equity holder of Viomerse, and Griffith and Wilson hold equity in the company. The research was funded by the National Institute of Biomedical Imaging and Bioengineering and the National Science Foundation.