Non-invasive human-machine interface for individuals with motor disabilities

During my first semester as a Master's student at Georgia Tech (Spring 2015), I worked on an assistive HCI device, the Multimodal Tongue Drive System (mTDS), which allows individuals with severe movement disabilities (owing to injuries, stroke, ALS, etc.) to use a PC solely with head movement, tongue movement, and speech. I did this work under the guidance of Professor Maysam Ghovanloo and in partnership with the hugely talented Mohammed Nazmus Sahadat.

My contribution to the project was a real-time head-tracking algorithm that allowed users to move a mouse pointer on a PC simply by moving their head. The algorithm relied on data from MEMS accelerometers and gyroscopes embedded in a headset worn by the user, which was designed by Sahadat. Head position was estimated using a Kalman filter that fused the accelerometer and gyroscope data, and the estimated head orientation and displacement were translated into mouse movement direction and velocity using a nonlinear mapping function like those used by the mouse drivers installed on a PC.

It was a fruitful project, leading to papers in the IEEE BioCAS Conference (2015), the IEEE Transactions on Biomedical Circuits and Systems (2017), and the IEEE Transactions on Neural Systems and Rehabilitation Engineering (2018). Using the real-time head-tracking algorithm to control a wheelchair was a subsequent extension by Sahadat. Although the technical details in the manuscripts are cool, I think you will find the demonstration videos below more moving.
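For readers curious what the sensor fusion looks like in practice, here is a minimal sketch of one common formulation, my illustration rather than the project's actual code: a two-state Kalman filter per rotation axis that integrates the gyroscope rate and corrects its drift with the accelerometer's gravity-derived angle. All noise constants below are assumptions chosen for illustration.

```python
# Minimal sketch (not the mTDS code) of gyro/accelerometer fusion with a
# Kalman filter for one head-rotation axis (e.g. pitch).
# State: [angle, gyro_bias]. The gyroscope rate drives the prediction; the
# accelerometer supplies an absolute angle measurement that corrects drift.
import numpy as np

class HeadAngleKalman:
    def __init__(self, dt, q_angle=1e-3, q_bias=3e-4, r_acc=3e-2):
        self.dt = dt
        self.x = np.zeros(2)                      # [angle (rad), gyro bias (rad/s)]
        self.P = np.eye(2)                        # state covariance
        self.F = np.array([[1.0, -dt],            # angle loses bias * dt per step
                           [0.0, 1.0]])
        self.Q = np.diag([q_angle, q_bias]) * dt  # process noise (assumed)
        self.H = np.array([[1.0, 0.0]])           # accelerometer observes angle only
        self.R = np.array([[r_acc]])              # measurement noise (assumed)

    def step(self, gyro_rate, acc_angle):
        # Predict: integrate the bias-corrected gyro rate.
        self.x = self.F @ self.x + np.array([gyro_rate * self.dt, 0.0])
        self.P = self.F @ self.P @ self.F.T + self.Q
        # Update: correct with the accelerometer's gravity-derived angle.
        y = acc_angle - self.H @ self.x            # innovation
        S = self.H @ self.P @ self.H.T + self.R    # innovation covariance
        K = self.P @ self.H.T @ np.linalg.inv(S)   # Kalman gain
        self.x = self.x + K @ y
        self.P = (np.eye(2) - K @ self.H) @ self.P
        return self.x[0]                           # fused angle estimate

# Example: fuse one pair of noisy readings at 100 Hz.
kf = HeadAngleKalman(dt=0.01)
angle = kf.step(gyro_rate=0.05, acc_angle=0.02)
```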
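And here is a sketch of the kind of nonlinear transfer function alluded to above, similar in spirit to the acceleration curves used by PC mouse drivers: a dead zone suppresses tremor and sensor noise near the neutral head pose, and a power-law gain gives fine control for small tilts while still allowing fast traversal for large ones. The dead zone, gain, and exponent are illustrative assumptions, not the published parameters.

```python
# Minimal sketch (assumed constants, not the published mapping) of a
# nonlinear head-angle-to-pointer-velocity transfer function.
import math

def pointer_velocity(angle_deg, dead_zone_deg=2.0, gain=8.0, exponent=1.6,
                     v_max=600.0):
    """Map a head-tilt angle (degrees from neutral) to pointer speed (px/s)."""
    magnitude = abs(angle_deg)
    if magnitude < dead_zone_deg:
        return 0.0                      # ignore small, unintentional motion
    effective = magnitude - dead_zone_deg
    speed = gain * effective ** exponent
    return math.copysign(min(speed, v_max), angle_deg)

# Small tilts move the pointer slowly; larger tilts ramp up superlinearly.
for a in (1.0, 5.0, 10.0, 20.0):
    print(f"{a:5.1f} deg -> {pointer_velocity(a):7.1f} px/s")
```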


Video 1: mTDS wheelchair control demonstration


Video 2: mTDS PC usage demonstration


Video 3: mTDS PC usage and wheelchair control