
Microsoft’s GazeSpeak Brings Accurate Speech Prediction to ALS Sufferers

GazeSpeak uses AI and predictive text to track where an ALS patient is looking on a letter grid, predicting the intended letter more rapidly than previous methods. The app will launch this May.


Microsoft has debuted a new application to help patients suffering from ALS. Called GazeSpeak, the app lets patients communicate with their eyes. The company says the app will be available on iOS before the Conference on Human Factors in Computing Systems.

Amyotrophic lateral sclerosis (ALS), or Lou Gehrig's disease, is known to most people these days through the viral Ice Bucket Challenge. However, the disease has been affecting people for decades; its most famous sufferer is the renowned scientist Stephen Hawking.

ALS afflicts the nervous system, attacking cells and neurons in the brain and spinal cord. In many instances, the disease claims all motor skills, rendering a patient motionless apart from eye movement.

Technology to help ALS sufferers speak already exists. Famously, Hawking uses a machine that voices his words in a robotic tone. However, while Hawking's TV appearances make it seem as though he is speaking in the moment, his speech is prepared in advance; the technology he uses is far slower than it appears.

Microsoft's GazeSpeak aims to make such speech faster. The app runs on a device and predicts what the patient wants to say by tracking eye movement. The alphabet is presented in a grid of four letter groups, shown on a sticker on the back of the device, and the app recognizes eye movements in four directions: up, down, left, and right.

GazeSpeak Accuracy

Of course, accuracy is hugely important. By tracking eye movement, the app registers which letter group the person is looking at and predicts the intended letter. Using artificial intelligence and predictive text, the app generates four candidate words, then selects the most likely one and reads it aloud.

Xiaoyi Zhang, who developed GazeSpeak while he was an intern at Microsoft, explains how this works:

“For example, to say the word ‘task’ they first look down to select the group containing ‘t’, then up to select the group containing ‘a’, and so on.”
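The selection scheme is essentially the disambiguation familiar from old phone keypads: a sequence of group choices can match several words, and a predictive model ranks the candidates. The short Python sketch below illustrates the idea; the four letter groups and the tiny word-frequency list are illustrative assumptions, not Microsoft's actual layout or prediction model.

```python
# Illustrative sketch of grouped-letter word prediction, as GazeSpeak's
# approach is described. Group layout and vocabulary are assumptions.

# Map each gaze direction to a hypothetical group of letters
# (chosen so that 't' is in the "down" group and 'a' in "up",
# matching the 'task' example from the article).
GROUPS = {
    "up":    set("abcdef"),
    "left":  set("ghijklm"),
    "right": set("nopqrs"),
    "down":  set("tuvwxyz"),
}

LETTER_TO_GROUP = {ch: g for g, letters in GROUPS.items() for ch in letters}

# Toy word-frequency table standing in for the app's predictive-text model.
VOCAB = {"task": 120, "tank": 60, "the": 500}

def gaze_sequence(word):
    """The sequence of eye movements needed to spell a word."""
    return [LETTER_TO_GROUP[ch] for ch in word]

def predict(moves):
    """Return words whose group sequence matches the gaze input,
    ranked by frequency, so the most likely can be read aloud."""
    candidates = [w for w in VOCAB if gaze_sequence(w) == list(moves)]
    return sorted(candidates, key=VOCAB.get, reverse=True)

# 'task' and 'tank' need the same four movements; frequency breaks the tie.
print(predict(["down", "up", "right", "left"]))  # ['task', 'tank']
```

Because several words can share one movement sequence, ranking by likelihood is what lets the app offer a small set of options rather than forcing the user to spell every letter unambiguously.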

Against the traditional method, in which a helper watches an ALS sufferer look at letters written on a board, GazeSpeak proved considerably faster: sentences took an average of 78 seconds to complete, compared with 123 seconds for the older method.

Microsoft says the app will arrive this May and will be free to use.

Luke Jones
Luke has been writing about all things tech for more than five years. He is following Microsoft closely to bring you the latest news about Windows, Office, Azure, Skype, HoloLens and all the rest of their products.