Engineering Researchers Create New Tools to Improve Digital Accessibility
Two research teams in the University of Guelph's College of Engineering have developed new systems that make digital devices easier to use for people with disabilities or mobility challenges, such as limited arm movement or shaky hands, offering more ways to use technology independently and with less effort.
Head-and-Mouth Tracking for Hands-Free Control
A study led by Dr. Hussein A. Abdullah, Zeyad Ghulam and Greg French introduces a head-and-mouth tracking interface that allows users with little or no arm movement to select and confirm on-screen choices without using their hands. Using an Intel RealSense depth camera and computer vision software, the system detects head movements to navigate and mouth opening to confirm selections.
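The paper describes the pipeline at a high level rather than releasing code, but the core interaction, head pose to steer a pointer and mouth opening to confirm, can be sketched with off-the-shelf tools. The snippet below is a minimal illustration using an ordinary webcam with OpenCV and MediaPipe's face-mesh model instead of the team's RealSense depth pipeline; the landmark choices and the MOUTH_OPEN_THRESHOLD value are assumptions for illustration, not parameters from the study.

```python
# Minimal sketch: head movement steers a "cursor", mouth opening confirms.
# Uses a standard webcam; the study used an Intel RealSense depth camera.
# pip install opencv-python mediapipe
import cv2
import mediapipe as mp

MOUTH_OPEN_THRESHOLD = 0.05  # illustrative value, not from the study

face_mesh = mp.solutions.face_mesh.FaceMesh(max_num_faces=1, refine_landmarks=True)
cap = cv2.VideoCapture(0)

while cap.isOpened():
    ok, frame = cap.read()
    if not ok:
        break
    results = face_mesh.process(cv2.cvtColor(frame, cv2.COLOR_BGR2RGB))
    if results.multi_face_landmarks:
        lm = results.multi_face_landmarks[0].landmark
        # Nose tip (landmark 1) as a proxy for head position: its normalized
        # (x, y) can be mapped to on-screen cursor coordinates.
        cursor = (lm[1].x, lm[1].y)
        # Mouth opening: vertical gap between inner lips (landmarks 13 and 14).
        mouth_gap = abs(lm[13].y - lm[14].y)
        if mouth_gap > MOUTH_OPEN_THRESHOLD:
            print(f"confirm selection at {cursor}")  # the "click" event
    cv2.imshow("head-tracking sketch", frame)
    if cv2.waitKey(1) & 0xFF == ord("q"):
        break

cap.release()
cv2.destroyAllWindows()
```

A depth camera like the one used in the study adds robustness to lighting changes and user distance that a plain RGB webcam sketch like this one lacks.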
The research was inspired by a visit Ghulam and his team made to recovering stroke patients at Hamilton General Hospital, where he observed nonverbal patients who depended heavily on healthcare workers for their basic needs. Seeing how burdensome the feeding process was for patients and caregivers alike motivated him to develop a system that supports greater independence, he says.
In testing with 22 participants, the system achieved high accuracy that improved as users became familiar with the controls, and it earned a 9.7/10 rating from participants for safety, comfort, and ease of use. Participants quickly learned to make smaller, more precise movements, reducing effort and improving efficiency.
"Our goal is to create an intuitive, hands-free interface,” says Abdullah. “This approach is lower-cost and can be more effective than other traditional accessibility tools.”
Wearable System for Easier Touchscreen Use
Dr. Petros Spachos, Wissam Botros and Marc Jayson Baucas set out to improve technology independence for people with Essential Tremor, a condition that causes involuntary shaking in the hands and arms and makes it hard to tap or swipe on a touchscreen.
The team designed a wearable Internet of Things system that uses motion sensors and Bluetooth Low Energy to detect hand shaking and adjust the touchscreen display in real time. As a user’s finger approaches the screen, the system can enlarge and move buttons to match the intended tap location. This makes it easier to select items accurately without the need for bulky or uncomfortable equipment.
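The published paper gives the pipeline at a high level; the detection step can be given a rough shape in a few lines. The sketch below is a simplified, hypothetical version: it estimates tremor intensity from wrist-accelerometer samples by measuring signal power in the 4–12 Hz band typical of essential tremor, then maps that intensity to a touch-target scaling factor. The sampling rate, band edges, and scaling rule are assumptions for illustration, not the team's published parameters.

```python
# Simplified sketch: estimate tremor intensity from accelerometer data and
# derive a touch-target scaling factor. Band edges, sampling rate, and the
# scaling rule are illustrative assumptions, not the study's parameters.
import numpy as np

FS = 100.0                  # assumed sampling rate (Hz) of the wearable's accelerometer
TREMOR_BAND = (4.0, 12.0)   # essential tremor typically falls roughly in this band

def tremor_intensity(accel: np.ndarray) -> float:
    """RMS amplitude of the accelerometer signal within the tremor band."""
    accel = accel - accel.mean()  # remove gravity/DC offset
    spectrum = np.fft.rfft(accel)
    freqs = np.fft.rfftfreq(len(accel), d=1.0 / FS)
    band = (freqs >= TREMOR_BAND[0]) & (freqs <= TREMOR_BAND[1])
    # Parseval-style band power, expressed as an RMS amplitude.
    return float(np.sqrt(2.0 * np.sum(np.abs(spectrum[band]) ** 2) / len(accel) ** 2))

def button_scale(intensity: float, base: float = 1.0,
                 gain: float = 4.0, max_scale: float = 2.5) -> float:
    """Map tremor intensity to a touch-target enlargement factor (hypothetical rule)."""
    return min(base + gain * intensity, max_scale)

# Example: one second of synthetic data with an 8 Hz tremor component.
t = np.arange(0, 1.0, 1.0 / FS)
accel = 0.3 * np.sin(2 * np.pi * 8.0 * t) + 0.05 * np.random.randn(t.size)
scale = button_scale(tremor_intensity(accel))
print(f"enlarge touch targets by {scale:.2f}x")
```

In the team's actual system, a step like this would run continuously on the wearable, with the result sent to the phone over Bluetooth Low Energy so it can enlarge and reposition targets as the finger approaches the screen.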
A Shared Goal: Adapting Technology to the User
This University of Guelph research advances inclusive, user-friendly technologies to make digital interaction easier for everyone. Whether replacing manual controls with head gestures or reshaping the touchscreen, these systems show how assistive design can restore independence and reduce frustration.
“Our aim is to adapt technology to the user’s abilities, instead of making the user adapt to the devices they need to use,” says Spachos, an engineering professor.
Both prototypes are built with low-cost, modular parts and can be adjusted for different environments, from home use to clinics and rehabilitation centres. Future work will test the systems with target users, add more customization options, and make them compatible with a wider range of devices.
Publications
W. Botros, M. J. Baucas and P. Spachos, "Enhancing Mobile User Experience for Individuals with Essential Tremor Through Wearable Devices," 2025 IEEE Medical Measurements & Applications (MeMeA), Chania, Greece, 2025, pp. 1-6, doi: 10.1109/MeMeA65319.2025.11067986.
Z. Ghulam, G. French and H. A. Abdullah, "Head Tracking for Confirming User Selection for Human-Machine Interaction," 2025 International Conference On Rehabilitation Robotics (ICORR), Chicago, IL, USA, 2025, pp. 382-388, doi: 10.1109/ICORR66766.2025.11063037.