Eye-tracking technology from Jabbla and Mo-Vis empowers patients with severe motor impairments to drive a powered wheelchair indoors using only their eyes.
Eye-tracking has already transformed communication for people with severe motor impairments; now it is changing mobility. DriveControl, developed by Belgian AAC company Jabbla in partnership with Mo-Vis, enables users to steer their powered wheelchairs with their eyes.
The technology integrates Jabbla’s Tellus i6 device with a Mo-Vis interface, translating gaze into proportional driving commands. After showcasing the prototype in 2024, the companies released the system across Europe in 2025, positioning it as a tool for independence among patients who can no longer rely on conventional wheelchair controls.
When Eyes Become the Joystick
For patients with advanced neuromuscular conditions, severe cerebral palsy, or high spinal cord injuries, traditional access methods such as joysticks, head controls, or sip-and-puff systems may no longer be feasible. Without a reliable way to control a wheelchair, independence is compromised, and reliance on caregivers increases.
“DriveControl is aimed to empower anyone who is physically disabled and can no longer use a joystick on their wheelchair, and who can use eye gaze and understand the concept of driving around with eye gaze,” Ian Foulger, AAC Manager at Jabbla, explained in an interview with MedicalExpo e-Magazine at RehaCare.
The concept is deliberately simple: the user looks in the direction they want to move, and the wheelchair responds accordingly. The same gaze also controls an on-screen pointer, so the interface works much like a computer screen from which the user selects options such as “Drive”. Once “on the road”, so to speak, there are several ways to stop the wheelchair. Foulger highlighted the built-in safety measures:
“You can close your eyes and nothing will happen. That will stop the wheelchair. Or you can look away. There’s also a button that enables you to stop the device by pressing an external switch.”
For clinicians, these safeguards are essential in training and everyday use, ensuring that users and caregivers can halt the chair instantly if needed.
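For readers curious how such safeguards translate into software, here is a minimal sketch of the three stop conditions Foulger describes: closing the eyes, looking away from the screen, and pressing an external switch. It is an illustration only; the function and field names are hypothetical, not Jabbla’s actual implementation.

```python
from dataclasses import dataclass

@dataclass
class GazeSample:
    """One reading from the eye tracker (hypothetical structure)."""
    on_screen: bool    # tracker sees a valid gaze point on the display
    eyes_closed: bool  # blink or eye closure detected

def should_stop(sample: GazeSample, external_switch_pressed: bool) -> bool:
    """True if any stop condition is met; a single trigger halts the chair."""
    return (external_switch_pressed   # external hardware stop always wins
            or sample.eyes_closed     # closing the eyes stops the chair
            or not sample.on_screen)  # looking away from the screen stops it too
```

The key property, mirrored from Foulger’s description, is that any one trigger alone is sufficient to halt the chair.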
“It’s about giving more autonomy to all of our users,” Foulger added.
That autonomy is central to rehabilitation goals, where independence is often as critical as clinical outcomes.

Inside the Technology: How DriveControl Works
DriveControl is powered by two key components. The first is Jabbla’s Tellus i6, a 14-inch AAC device equipped with a high-performance eye tracker and a rear-facing TiltCam that can be angled to improve situational awareness. The second is the Mo-Vis interface, which translates the device’s digital commands into signals for the wheelchair’s drive system.
Together, they create a unified system that allows users to both communicate and navigate their environment with the same device. Direction and speed are controlled using a “protractor”-style interface: the angle of the user’s gaze sets direction, while the distance from the screen’s center adjusts speed.
“It’s a question of looking where I want to go,” Foulger explained.
Movement begins only after calibration and activation of a physical “magic switch,” preventing unintended eye movements from launching the chair. Foulger clarified how the integration was achieved:
“The wheelchair interface was developed by this company here called Mo-Vis,” Foulger said while seated on the wheelchair, showing us how it functions. “We collaborated and made a special pageset within our software, our communication software. So, the user can switch between driving and communicating.”
For clinicians, this seamless switching reduces complexity, consolidating communication, environmental control, and mobility in one device. The system is intended for indoor use only, given the limitations of infrared-based eye tracking under sunlight, but within controlled environments, it offers both precision and reliability.
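As a rough illustration of the protractor-style mapping described above, the sketch below converts a gaze point into a heading and a proportional speed, gated by the physical enable switch. The names, the dead-zone value, and the coordinate conventions are assumptions for illustration, not the product’s actual code.

```python
import math

def gaze_to_drive(gx: float, gy: float, cx: float, cy: float,
                  max_radius: float, switch_enabled: bool,
                  dead_zone: float = 0.15):
    """Protractor-style mapping: gaze angle -> heading, radial distance -> speed.

    (gx, gy) is the gaze point in screen pixels, (cx, cy) the screen centre,
    and max_radius the centre-to-edge distance used to normalise speed.
    Returns (heading_deg, speed) with speed in [0, 1]; 0 degrees means
    straight ahead. Nothing moves unless the "magic switch" is engaged.
    """
    if not switch_enabled:                       # switch off -> chair stays put
        return 0.0, 0.0

    dx, dy = gx - cx, gy - cy
    radius = math.hypot(dx, dy) / max_radius     # normalised distance from centre
    if radius < dead_zone:                       # resting gaze near the centre
        return 0.0, 0.0                          # is treated as "no command"

    heading = math.degrees(math.atan2(dx, -dy))  # screen y grows downward
    speed = min(radius, 1.0)                     # proportional, capped at full
    return heading, speed
```

Under this scheme a glance just above centre would produce a slow forward crawl, while a gaze near the top edge would command full forward speed, consistent with the proportional behaviour the article describes.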
Eye-Tracking Beyond Mobility: Broader Implications
DriveControl exemplifies the convergence of communication and mobility technologies. Eye-tracking is already established in Augmentative and Alternative Communication (AAC) and environmental control; integrating it into wheelchair driving extends the scope of independence. For rehabilitation specialists, this opens new opportunities.
Training users to manage gaze control can serve therapeutic as well as functional purposes, engaging cognitive attention and visual tracking alongside mobility practice. In hospitals and care facilities, the system could reduce caregiver burden by enabling patients to reposition themselves indoors.
Still, limitations remain: DriveControl is explicitly restricted to indoor use because sunlight interferes with its infrared eye-tracking sensors.
“This is only rated for use indoors,” Foulger emphasized, noting that safety depends on controlled lighting.
As adoption expands, clinicians will look for published data on user performance, safety incidents, and long-term satisfaction. Yet even at this stage, DriveControl represents a meaningful advance in autonomy for patients whose only consistent voluntary movement is their gaze.
“It’s just making people independent,” Foulger said. “That’s what it’s all about.”



Images: Ian Foulger, AAC Manager at Jabbla, uses the eye-tracking technology to move around RehaCare. Photos: ArchiExpo e-Magazine.
Eye-Tracking Technology in the Medical Devices Market
Eye-driven wheelchair control implementations converge on three technical choices: where the tracker is mounted (integrated tablet, wearable glasses, or chair-mounted), the control model (analogue/proportional steering versus discrete command selection), and the safety architecture (mandatory calibration, hardware enable/disable, and immediate stop gestures).
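One way to keep the comparison organised is to treat these as three independent design axes. The sketch below is a hypothetical taxonomy for classification purposes only; the category names follow this article’s framing, not an industry standard.

```python
from enum import Enum

class Mounting(Enum):
    INTEGRATED_TABLET = "integrated tablet"
    WEARABLE_GLASSES = "wearable glasses or headset"
    CHAIR_MOUNTED = "chair-mounted module"

class ControlModel(Enum):
    PROPORTIONAL = "analogue/proportional steering"
    DISCRETE = "discrete command selection"

class SafetyLayer(Enum):
    CALIBRATION = "mandatory per-session calibration"
    HARDWARE_ENABLE = "hardware enable/disable switch"
    STOP_GESTURE = "immediate stop gesture"
```

On these axes, DriveControl classifies as an integrated tablet with proportional steering and all three safety layers, as described below.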
Jabbla’s DriveControl pairs the Tellus i6 tablet (integrated eye tracker and rear TiltCam) with a Mo-Vis actuator interface and uses a “protractor”-style analogue mapping: gaze angle sets heading and radial distance sets speed, enabling smooth, proportional steering. DriveControl enforces per-session calibration and offers multiple emergency-stop mechanisms.
By contrast, Ability Drive implementations commonly present discrete on-screen buttons or pagesets compatible with a range of trackers (e.g., Tobii series). This discrete model simplifies intention recognition and reduces continuous correction demands, at the cost of less fluid turning. Wearable solutions such as MyEcc’s Pupil Prop relocate sensors to glasses or headsets, improving line-of-sight for users who cannot position a tablet but introducing wearable comfort and calibration considerations. Chair-mounted modules like GazeDriver prioritise a minimal or screenless footprint to preserve the user’s external view but typically separate driving from AAC, trading integrated communication/mobility for an unobstructed field.
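To make the contrast with the analogue model concrete, here is a minimal sketch of a discrete, dwell-based command scheme of the kind button/pageset interfaces use. The button names, timings, and command values are illustrative assumptions, not Ability Drive’s actual design.

```python
from typing import Optional, Tuple

# Illustrative command table: each on-screen button maps to one fixed
# (heading_deg, speed) pair instead of a continuous protractor mapping.
COMMANDS = {
    "forward": (0.0, 0.5),
    "left": (-45.0, 0.3),
    "right": (45.0, 0.3),
    "stop": (0.0, 0.0),
}

class DwellSelector:
    """Fire a command only after gaze rests on one button for dwell_s seconds."""

    def __init__(self, dwell_s: float = 1.0):
        self.dwell_s = dwell_s
        self.current: Optional[str] = None  # button the gaze is on now
        self.since: float = 0.0             # timestamp when it landed there

    def update(self, button: Optional[str],
               now: float) -> Optional[Tuple[float, float]]:
        if button != self.current:          # gaze moved: restart the dwell timer
            self.current, self.since = button, now
            return None
        if button is not None and now - self.since >= self.dwell_s:
            self.since = now                # re-arm so the command can repeat
            return COMMANDS.get(button)     # (heading_deg, speed) to execute
        return None
```

Because intention is recognised per button rather than per gaze angle, the tracker only has to resolve which region the user is looking at, which is why this model demands less continuous correction at the cost of less fluid turning.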
Across systems, shared constraints remain: infrared-based gaze sensors degrade in sunlight (hence indoor focus), per-session calibration is required, and safety depends on layered interlocks (software dwell thresholds, physical “magic” enable switches, and stop gestures such as closing the eyes).
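A compact way to picture these layered interlocks is as a chain of independent checks that must all pass before any drive command is forwarded. The sketch below composes the layers named above; the parameter names are illustrative, not drawn from any particular vendor.

```python
def drive_permitted(calibrated: bool, enable_switch: bool,
                    gaze_valid: bool, dwell_met: bool,
                    stop_gesture_active: bool) -> bool:
    """Every safeguard must pass, and any stop gesture vetoes motion."""
    return (calibrated                   # per-session calibration completed
            and enable_switch            # physical "magic" switch engaged
            and gaze_valid               # tracker reports a usable gaze point
            and dwell_met                # software dwell threshold satisfied
            and not stop_gesture_active) # no closed eyes or look-away
```

The design intent is that no single sensor failure or stray glance can start the chair: motion requires every layer to agree, while stopping requires only one.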
Clinically, analogue protractor schemes (DriveControl) excel for nuanced indoor navigation when gaze stability is adequate; discrete button models favour users needing lower cognitive/motor precision; wearables and chair-mounted modules suit particular postural or mounting constraints. Choice should be guided by patient posture, visual stability, cognitive capacity, care environment, and the system’s documented safety and training outcomes.