This article was first published on our sister site, Digital World Native.
Developed specifically for Huawei’s Mate 20 Pro phone, the application employs an artificial intelligence-run algorithm to convert facial expressions into short descriptive sounds, thereby communicating the mood of the person on the other end of the line.
The app
Using the phone's rear camera as a "capture interface", the app scans the facial features of the speaker and applies a configuration that takes the relative positions of the most dominant features – such as the eyes, nose and mouth – into account to discern the person's emotional state. The key point to remember is to keep the phone's rear camera aimed at the speaker's face when conversing with a blind person.
The application recognizes seven major mood types – anger, joy, fear, sadness, hatred, surprise and disgust – and translates each into a distinctive sound that conveys the emotion to the blind user, who has had an opportunity to learn the sound associations at an earlier stage.
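To make the idea concrete, here is a minimal sketch of the final step described above: mapping a detected emotion label to a short audio cue. The seven labels come from the article; the sound names, the `sound_for_emotion` function, and the neutral fallback are illustrative assumptions, not details of Huawei's actual implementation.

```python
# Hypothetical emotion-to-sound lookup. The seven emotion labels are the
# ones the article lists; the cue names are placeholders invented here.
EMOTION_SOUNDS = {
    "anger": "low_buzz",
    "joy": "bright_chime",
    "fear": "rapid_tremolo",
    "sadness": "falling_tone",
    "hatred": "harsh_drone",
    "surprise": "rising_sweep",
    "disgust": "dissonant_cluster",
}

def sound_for_emotion(label: str) -> str:
    """Return the audio cue associated with a detected emotion label.

    Falls back to a neutral cue for unrecognized input, since a real
    classifier may emit low-confidence or unknown results.
    """
    return EMOTION_SOUNDS.get(label.lower(), "neutral_tick")
```

In a real app, the returned cue name would be used to trigger a short pre-recorded sound that the blind user has previously learned to associate with that emotion.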
For the rest of the article, click here.