Since the dawn of human spaceflight, scientists have carefully studied the effects of space and microgravity on astronauts. After decades of observations and examinations, one truth is certain: space is brutal on the human body. Muscles atrophy, bones lose mass, limbs stretch and, perhaps most puzzling, eyesight can degrade in ways not yet fully understood.
To better understand the vision loss caused by spaceflight, University of California San Diego researchers used U.S. National Science Foundation (NSF) ACCESS allocations on the Expanse system at the San Diego Supercomputer Center (SDSC) to predict who is most at risk of developing eyesight issues before liftoff. The team, led by researchers at the Shiley Eye Institute and Viterbi Family Department of Ophthalmology in collaboration with the Halıcıoğlu Data Science Institute (HDSI) at the UC San Diego School of Computing, Information and Data Science, used artificial intelligence (AI) trained on high-resolution eye scans to predict which individuals are at highest risk.
According to NASA, around 29 percent of crew members who participated in short-duration spaceflights reported degraded distance or near visual acuity. For crew members on long-duration missions, that number spiked to 60 percent. In 2017, scientists first used the name Spaceflight Associated Neuro-ocular Syndrome (SANS) to describe this spaceflight-related vision degradation. While symptoms, including swelling of the optic disc (the point where the optic nerve leaves the eye for the brain), vision shifts and structural distortions, often resolve within weeks to months post-flight, in some cases they persist for years after long-duration missions.
“Our models showed promising accuracy, even when trained on limited data,” said lead author Alex Huang, M.D., Ph.D., a professor of ophthalmology at UC San Diego School of Medicine and HDSI affiliate faculty. “We’re essentially using AI to give doctors a predictive tool for a condition that develops in space, before astronauts even leave Earth.”
Huang added that tools such as the one his team developed can support risk management and, in the future, preventative measures prior to launch.
The AI system was trained using optical coherence tomography (OCT) scans, microscope-like images of the optic nerve, collected before and during spaceflight. The researchers also used data from head-down tilt bedrest studies on Earth. In this procedure, participants lie at a continuous six-degree head-down tilt, 24 hours a day, mimicking the effects of weightlessness by shifting fluids toward the head.
Predicting the Unpredictable — with Help from a Supercomputer
To overcome the challenge of limited astronaut data, the team used deep learning — a form of AI that mimics how the brain processes images. They broke each eye scan into thousands of slices, creating a much larger dataset for training the models. The researchers also used data augmentation and transfer learning, which help the AI generalize from small samples.
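The article does not describe the team's exact architecture or software, but the three ideas named above can be illustrated with a short, hypothetical sketch: slicing each 3D OCT volume into many 2D images, applying data augmentation, and fine-tuning a pretrained network (transfer learning). The PyTorch code below uses placeholder data and invented names such as OCTSliceDataset; it is an assumption-laden illustration, not the researchers' pipeline.

```python
# Minimal sketch (not the team's actual method): slice OCT volumes into 2D
# images, augment them, and fine-tune a pretrained CNN. Data and labels here
# are synthetic placeholders standing in for real eye scans.
import numpy as np
import torch
import torch.nn as nn
from torch.utils.data import Dataset, DataLoader
from torchvision import models, transforms

class OCTSliceDataset(Dataset):
    """Turns each 3D OCT volume (slices x height x width) into many 2D samples."""
    def __init__(self, volumes, labels, transform=None):
        self.samples = []
        for vol, label in zip(volumes, labels):
            for i in range(vol.shape[0]):            # one training sample per slice
                self.samples.append((vol[i], label))
        self.transform = transform

    def __len__(self):
        return len(self.samples)

    def __getitem__(self, idx):
        img, label = self.samples[idx]
        img = np.repeat(img[None, ...], 3, axis=0)   # grayscale -> 3-channel tensor
        img = torch.from_numpy(img).float()
        if self.transform:
            img = self.transform(img)
        return img, label

# Data augmentation: small flips, rotations and crops enlarge the effective dataset.
augment = transforms.Compose([
    transforms.RandomHorizontalFlip(),
    transforms.RandomRotation(degrees=5),
    transforms.RandomResizedCrop(224, scale=(0.9, 1.0), antialias=True),
])

# Transfer learning: start from an ImageNet-pretrained ResNet and retrain only
# the final layer to output a (hypothetical) binary risk label.
model = models.resnet18(weights=models.ResNet18_Weights.DEFAULT)
for p in model.parameters():
    p.requires_grad = False
model.fc = nn.Linear(model.fc.in_features, 2)

def train_one_epoch(model, loader, device="cpu"):
    criterion = nn.CrossEntropyLoss()
    optimizer = torch.optim.Adam(model.fc.parameters(), lr=1e-4)
    model.train().to(device)
    for imgs, labels in loader:
        optimizer.zero_grad()
        loss = criterion(model(imgs.to(device)), labels.to(device))
        loss.backward()
        optimizer.step()

# Example usage with synthetic stand-in volumes (real inputs would be OCT scans).
volumes = [np.random.rand(64, 224, 224).astype(np.float32) for _ in range(4)]
labels = [0, 1, 0, 1]
loader = DataLoader(OCTSliceDataset(volumes, labels, transform=augment),
                    batch_size=16, shuffle=True)
train_one_epoch(model, loader)
```

The key point the sketch captures is the one the researchers describe: even four volumes yield hundreds of training slices, and augmentation plus a pretrained backbone lets a model learn from far fewer subjects than training from scratch would require.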