Google releases machine learning-based tools; disabled people can now use Android phones with face gestures

Android has introduced new facial gesture features that let physically impaired people control their phones. Google has released machine learning-based tools to help people with speech and physical disabilities use their Android phones.

There are two tools for this purpose: Project Activate and Camera Switches. Both put machine learning and the smartphone's front-facing camera to work detecting face and eye movements. They work with the phone's front camera, and users can activate options by raising their eyebrows, opening their mouth, or tilting their head to either side.

Users can scan the phone screen and select a task with any of these gestures. They can also choose which movement triggers which action, and how much sensitivity is required for a gesture to register. Camera Switches is included in the upcoming Android 12 release. “To make Android more accessible for everyone, we’re launching new tools that make it easier to control your phone and communicate using facial gestures,” Google said.
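Conceptually, the configuration described above amounts to a lookup from gesture to action, gated by a per-gesture sensitivity threshold. The sketch below is a hypothetical illustration of that idea only; it is not Google's implementation, and all names, gestures, and values are assumptions.

```python
# Hypothetical sketch: mapping detected facial gestures to actions,
# with a user-configurable sensitivity threshold per gesture.
# Confidence scores would come from an on-device face-detection model;
# the values here are illustrative.

GESTURE_ACTIONS = {
    "raise_eyebrows": ("select", 0.7),    # (action, sensitivity threshold)
    "open_mouth":     ("scan_next", 0.8),
    "tilt_left":      ("go_back", 0.6),
}

def dispatch(gesture, confidence):
    """Return the mapped action if the gesture's detection confidence
    clears its threshold, otherwise None (no action triggered)."""
    entry = GESTURE_ACTIONS.get(gesture)
    if entry is None:
        return None
    action, threshold = entry
    return action if confidence >= threshold else None
```

Raising the threshold makes a gesture harder to trigger accidentally, which mirrors the sensitivity setting users adjust in the app.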

The other tool is Project Activate, a new Android application that allows people to use those gestures to trigger an action, such as having the phone play a recorded phrase, send a text, or make a call. The free Activate app is currently available in Australia, Britain, Canada, and the United States on the Google Play store.

Google continues to innovate in this space, and with these AI-powered tools it believes it can meaningfully help people with disabilities.