Emotional Intelligence: Emotions drive our minds. Almost every impulsive action is connected to some kind of feeling, and every memorable experience stays with us because there is an emotion attached to it. However, it is very difficult to convey the feeling in your mind through a digital medium.
The text message "I am coming for you" illustrates this problem. The same words can evoke love or fear depending on the situation. But how would your mobile phone know whether you're happy or sad? Rana el Kaliouby addresses this problem of emotional intelligence with her app.
Technological advancements have reduced a lot of manual work, which benefits everyone in the long run. However, they have downsides, including reduced human interaction and tedious work routines. Many people have to work on computers for hours at a stretch, which is tiring. It also drains a person emotionally, because computers can't read people's emotions and respond to them.
Rana el Kaliouby had the same experience while pursuing her Ph.D. at Cambridge. She realized the need for emotional intelligence in the workplace and started working on it. Her argument is that we should develop the emotional quotient of our technologies with the same intensity that we devote to their computational abilities.
Of all the ways of detecting emotions, reading facial expressions is the most practical approach. Rana emphasizes that the human face has 45 action units. These units combine to produce hundreds of facial expressions, and they correlate directly with emotions. Therefore, she devised a way of combining artificial intelligence and emotional intelligence to read facial expressions.
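As a rough illustration of that idea (not Affectiva's actual method), the sketch below maps action-unit activation scores to a coarse emotion label. The AU codes follow the standard Facial Action Coding System, but the thresholds and the rules themselves are illustrative assumptions.

```python
# Minimal sketch: map facial action-unit (FACS) activation scores to a coarse
# emotion label. The rules and thresholds here are illustrative only.

from typing import Dict

def classify_emotion(action_units: Dict[str, float]) -> str:
    """action_units maps AU codes (e.g. 'AU12') to activation scores in [0, 1]."""
    au = lambda code: action_units.get(code, 0.0)

    # AU6 (cheek raiser) + AU12 (lip corner puller) -> genuine smile / joy
    if au("AU6") > 0.5 and au("AU12") > 0.5:
        return "joy"
    # AU4 (brow lowerer) is a common marker of confusion or concentration
    if au("AU4") > 0.5:
        return "confusion"
    # AU1 (inner brow raiser) + AU15 (lip corner depressor) -> sadness
    if au("AU1") > 0.5 and au("AU15") > 0.5:
        return "sadness"
    return "neutral"

print(classify_emotion({"AU6": 0.8, "AU12": 0.9}))  # -> "joy"
```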
Rana and her team trained a machine learning algorithm on a very large dataset consisting of images of people smiling, smirking, crying, and showing anger, frustration, and other emotions. The result was a program with high emotional intelligence: it could recognize facial expressions and relate them to the corresponding emotions.
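A toy version of that training step might look like the sketch below, assuming each face image has already been reduced to a 45-dimensional action-unit feature vector. The synthetic data, labels, and choice of classifier are illustrative stand-ins, not the team's actual pipeline.

```python
# Toy sketch of training an expression classifier on labeled feature vectors.
# The data is random; the real system was trained on millions of labeled images.

import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)
n_samples, n_features = 1000, 45           # one feature per action unit
X = rng.random((n_samples, n_features))    # stand-in for extracted AU scores
y = rng.integers(0, 4, size=n_samples)     # labels: 0=joy, 1=anger, 2=sadness, 3=neutral

X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

# Any multi-class classifier would do; logistic regression keeps the sketch short.
model = LogisticRegression(max_iter=1000)
model.fit(X_train, y_train)

print("held-out accuracy:", model.score(X_test, y_test))  # ~chance on random data
```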
Rana's team trained the model on almost 3 million photos and videos. This made the application robust enough to identify even subtle smiles, drawing on 12 billion data points.
Analysis of this data gave very useful insights. It tells us that women in the USA express their emotions for longer than men do, whereas there was no significant difference between the emotional expressions of men and women in the UK. Another crucial detail is that our facial expressions change while watching videos, looking at images, and writing emails!
This app can be paired with glasses worn by visually impaired people. It can also be added to online learning platforms to adjust the speed of a video when it detects confusion on the viewer's face.
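For the learning-platform idea, a hypothetical control loop could look like the sketch below. `detect_emotion` and `Player` are placeholder names for illustration, not a real emotion-sensing SDK or video-player API.

```python
# Hypothetical sketch: slow the lecture video down when the detector reports
# confusion, and restore normal speed when the viewer looks comfortable again.

class Player:
    """Minimal stand-in for a video player with adjustable playback speed."""
    def __init__(self):
        self.speed = 1.0

    def set_speed(self, speed: float) -> None:
        self.speed = speed


def detect_emotion(frame) -> dict:
    """Placeholder: return per-emotion confidence scores for a webcam frame."""
    return {"confusion": 0.0, "neutral": 1.0}


def adjust_playback(player: Player, frame, slow=0.75, normal=1.0, threshold=0.6) -> None:
    scores = detect_emotion(frame)
    if scores.get("confusion", 0.0) > threshold:
        player.set_speed(slow)     # viewer looks confused: slow down
    else:
        player.set_speed(normal)   # viewer is following along: normal speed


player = Player()
adjust_playback(player, frame=None)
print(player.speed)  # 1.0 with the placeholder detector
```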
Rana claims that every device will eventually use this technology, enhancing its emotional intelligence. Furthermore, she believes its practical benefits outweigh the risk of misuse, which is why ordinary people may find this concept in many gadgets in the near future.
Intelligent machines will play an important role in the future, so everyone should keep an eye on them.