COVID-19-inspired app to help patients communicate through gaze tracking

Dec 16, 2020
Gaze-tracking communication app

A team of senior computer science students at Florida Polytechnic University is working on the development of an app that uses gaze-tracking software to help improve communication between patients who cannot speak and their care providers.

As the number of intubated COVID-19 patients began increasing across the country this spring, an employee at Lakeland Regional Health whose family member was ill with the disease realized there was a need to make communication easier between these patients and the medical professionals caring for them.

A capstone senior design team of computer science majors at Florida Polytechnic University took on the challenge and is developing an app to allow those unable to speak or write to communicate with their gaze instead.

“I saw this as an opportunity to help a lot of people,” said Savannah Shahab, a senior from Lindenhurst, New York. “While the original idea was to help COVID patients, it’s something a lot of people will be able to use. We realized there are so many more applications where this can be implemented.”

These applications include helping accident victims who need to communicate with paramedics, or patients with throat injuries.

The team of students is using computer vision and specialized gaze-tracking software to allow users to interact with the application. They are being supervised by Dr. Muhammad Abid, an assistant professor of computer science, who wrote the project’s proposal and meets weekly with the students to assess their progress, guide their work, and provide them with helpful resources.

The first phase of the app will include the ability to convey simple yes or no responses. When a user’s gaze lingers on one of the two options, the application will recognize the selection.
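The dwell-based selection described above can be sketched in a few lines. This is a minimal, hypothetical illustration of the general technique, not the team’s actual implementation; the class name, threshold value, and sample format are all assumptions for the sake of the example.

```python
class DwellSelector:
    """Fires a selection once the gaze rests on one target long enough.

    Illustrative sketch: a gaze tracker is assumed to report a stream of
    (timestamp, target) samples, where target is the on-screen option the
    user is currently looking at (or None when looking elsewhere).
    """

    def __init__(self, dwell_seconds=1.5):
        self.dwell_seconds = dwell_seconds  # how long the gaze must linger
        self._current = None                # target currently under the gaze
        self._since = None                  # timestamp when gaze landed on it

    def update(self, timestamp, target):
        """Feed one gaze sample; return the target once dwell completes."""
        if target != self._current:
            # Gaze moved to a new target (or off the options); restart timer.
            self._current = target
            self._since = timestamp
            return None
        if target is not None and timestamp - self._since >= self.dwell_seconds:
            # Dwell threshold reached: report the choice and reset state
            # so the same selection is not emitted repeatedly.
            self._current = None
            self._since = None
            return target
        return None
```

For example, with a 1.5-second threshold, samples at 0.0 s and 1.0 s on “yes” return nothing, while a sample at 1.6 s completes the dwell and returns “yes”.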

Medical professionals will be able to ask about things like allergies, pain levels, and patient preferences. If this first phase is successful, subsequent versions of the app may allow for more than two choices or even use of a full keyboard.

“The bread and butter of this is if a patient is unable to use their voice, their eyes will always be available,” said Gage Roper, from Tampa, Florida. “By looking at something for long enough, the app will recognize that response.”

Although the idea for the project originated with an employee from Lakeland Regional Health, the project is sponsored by Florida Polytechnic University.

“There is a lot of use for this in general and emergency medicine,” said Dr. Matt Bohm, the project sponsor and director of industry engagement and capstone projects at Florida Poly. “Imagine you’re in a car wreck and the air is out of you and you can’t speak, but with this you can quickly communicate to an EMT or other staff that you’re allergic to a medication or give them your emergency contact.”

The capstone team has already mapped out how the app will function and will soon begin the implementation process. They plan to have a working prototype ready by the end of the spring semester. The goal is to make the app available for use on both Android and iOS devices.

“We are ready to start doing the computer vision work and the UI framework for the app and really start piecing it together,” said Robert White, from Miami, Florida.

In preparation for this, they have interviewed nearly a dozen medical professionals, including nurses, emergency medical technicians, and radiologists.

“Everyone had very strong feelings about this app being able to help a lot of people and being able to use it in a wide range of situations,” Shahab said. “We want to help as many people as we possibly can.”


Lydia Guzman
Director of Communications