A new smartphone app has been developed to help users with amyotrophic lateral sclerosis (ALS) or other motor impairments communicate with their eyes. Aptly named GazeSpeak, the app interprets eye gestures to relay its users' messages.
The GazeSpeak app was created by a team of researchers that includes Harish Kulkarni, an Indian-origin scientist at Microsoft Research.
Interestingly, there have been previous attempts to develop such technology, and after considerable trial and error the group appears to have overcome earlier limitations. The new app is designed to make users' lives significantly more convenient.
As for how the app functions, it decodes eye gestures in real time, interprets them into predicted messages or utterances, and facilitates communication as it goes. Communication is supported through separate user interfaces for speakers and for interpreters. With a tripod or even a selfie stick holding the phone's front-facing camera in place, users can communicate through the app with minimal help from their caregivers.
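To make the decoding idea concrete, here is a minimal sketch of how eye gestures could be turned into predicted words. It assumes a scheme in which letters are split into four groups, each gaze direction selects one group, and a frequency-ordered dictionary disambiguates the resulting sequence; the specific letter grouping, vocabulary, and function names below are illustrative assumptions, not GazeSpeak's actual implementation.

```python
# Hypothetical sketch of gesture-to-text decoding (not GazeSpeak's real code).
# Letters are split into four groups; each gaze direction picks one group,
# and a small frequency-ordered vocabulary disambiguates the sequence.

GROUPS = ["abcdef", "ghijkl", "mnopqr", "stuvwxyz"]  # assumed grouping

def group_of(letter):
    """Return the index of the letter group containing `letter`."""
    for i, group in enumerate(GROUPS):
        if letter in group:
            return i
    raise ValueError(f"unsupported character: {letter!r}")

def encode(word):
    """Map a word to its sequence of group indices."""
    return tuple(group_of(c) for c in word)

def predict(gestures, vocabulary):
    """Return vocabulary words whose group sequence matches the gestures.

    `vocabulary` is assumed to be ordered by word frequency, so earlier
    matches are the more likely predictions.
    """
    target = tuple(gestures)
    return [word for word in vocabulary if encode(word) == target]

vocab = ["hello", "help", "water", "yes", "no"]
# "yes" encodes to groups (3, 0, 3); decoding that gesture sequence
# recovers the word:
print(predict([3, 0, 3], vocab))  # → ['yes']
```

A real system would rank many candidate words with a language model and update predictions after each gesture, but the core mapping from coarse gaze directions to disambiguated text follows this pattern.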
While this sounds exciting, the app is expected to launch in May at the Conference on Human Factors in Computing Systems in Colorado. Researchers say it will be available on the Apple App Store before the conference, and the source code will be made freely available so that developers and testers can help improve it. GazeSpeak will roll out for iOS devices first.
More significantly, the app will come as a boon to many people living with ALS and similar conditions, making it far easier for their loved ones to understand them and hold conversations. GazeSpeak demonstrates how technology can lead to a better life and a better world, with limitless possibilities for bringing people together and helping them communicate beyond words.