A new wearable, noninvasive brain-computer interface (BCI) system that uses artificial intelligence has been designed to help people with physical disabilities.
Researchers at the University of California, Los Angeles (UCLA) developed the new BCI, in which an AI acts as a "co-pilot," working alongside users to interpret their intentions and help control a robotic arm or computer cursor.
The system could pave the way for new assistive technologies that improve how people with limited mobility, such as those with paralysis or neurological conditions, handle objects.
“By using artificial intelligence to complement brain-computer interface systems, we’re aiming for much less risky and invasive avenues,” said Jonathan Kao, study leader and an associate professor of electrical and computer engineering at the UCLA Samueli School of Engineering.
“Ultimately, we want to develop AI-BCI systems that offer shared autonomy, allowing people with movement disorders, such as paralysis or ALS, to regain some independence for everyday tasks,” Kao added.
Interpreting the user’s intent
Until now, the most advanced BCI devices have required risky and costly neurosurgery, and the benefits of the technology were often outweighed by how invasive the procedure was.
Wearable BCIs, while safer, have often lacked the reliability needed for practical use.
The new system pairs an electroencephalography (EEG) cap, which records brain activity, with a camera-based AI platform.
The team developed custom algorithms to decode the brain signals picked up by the cap.
The camera-based AI platform then interprets the user's intent in real time, guiding actions such as moving a computer cursor or a robotic arm.
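To make the shared-autonomy idea concrete, here is a minimal Python sketch of how a decoded EEG command might be blended with an AI co-pilot's assistive command in a control loop. The function names, the linear decoder, the nearest-target heuristic, and the blending rule are all illustrative assumptions, not details from the UCLA study.

```python
import numpy as np

# Illustrative shared-autonomy loop (not the study's actual implementation).
# decode_eeg_velocity() and infer_target_from_camera() are hypothetical
# stand-ins for the EEG decoder and the camera-based intent model.

def decode_eeg_velocity(eeg_window: np.ndarray, weights: np.ndarray) -> np.ndarray:
    """Map a window of EEG features (channels x samples) to a 2D velocity."""
    return weights @ eeg_window.mean(axis=1)

def infer_target_from_camera(cursor_pos: np.ndarray, targets: np.ndarray) -> np.ndarray:
    """Guess the intended target as the one nearest the cursor (toy intent model)."""
    dists = np.linalg.norm(targets - cursor_pos, axis=1)
    return targets[np.argmin(dists)]

def shared_autonomy_step(cursor_pos, eeg_window, weights, targets, alpha=0.5, dt=0.05):
    """Blend the user's decoded command with the co-pilot's assistive command."""
    user_vel = decode_eeg_velocity(eeg_window, weights)
    target = infer_target_from_camera(cursor_pos, targets)
    assist_vel = target - cursor_pos                       # vector toward inferred goal
    blended = (1 - alpha) * user_vel + alpha * assist_vel  # alpha sets assistance level
    return cursor_pos + dt * blended
```

In a sketch like this, the `alpha` parameter controls how much autonomy the co-pilot has: at 0 the user steers alone, at 1 the AI drives entirely toward its inferred goal.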
The trials involved four participants: three individuals without motor impairments and one who was paralyzed.
Faster completion of the task
In a test with two tasks — moving a cursor to eight targets and using a robotic arm to move four blocks — all participants finished much faster with AI assistance.
Most notably, the paralyzed participant completed the robotic arm task in roughly six and a half minutes with the AI's help, a task he could not complete on his own.
“Next steps for AI-BCI systems could include the development of more advanced co-pilots that move robotic arms with more speed and precision, and offer a deft touch that adapts to the object the user wants to grasp,” said co-lead author Johannes Lee, a UCLA electrical and computer engineering doctoral candidate advised by Kao.
“And adding larger training data could also help the AI collaborate on more complex tasks and improve EEG decoding itself,” Lee added in the press release.
According to the researchers, the system sets a new standard for noninvasive BCI performance and offers people with paralysis or neurological conditions a path toward regaining independence in daily tasks.
Brain-computer interface technology has been advancing rapidly in recent years.
Recently, an investigational BCI system developed by the University of California, Davis, allowed a patient with ALS to communicate in real time. That technology is designed to enable faster, more natural conversation for people with the neurological disease, which causes a progressive loss of muscle control.
The new findings were published in the journal Nature Machine Intelligence.