Emotion Tracking Software: Beneficial or Harmful?
Software opens up seemingly limitless possibilities. Technologies such as augmented reality, virtual reality, and wearables are enabling new kinds of applications. One area of recent interest is software that can interpret human emotions. The first question you might ask about such an application is: what is the value of understanding human emotions?
Beyond helping businesses plan, market, and time product decisions, emotion data enables entirely new applications, such as tools that suggest edits to the tone of a document or improve matching on dating sites. Existing applications can also use this data to power new features, such as automatically quarantining hostile user responses so that content can be curated faster.
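As a concrete illustration of the quarantining idea above, an application might hold back responses whose inferred negative-emotion score crosses a threshold until a moderator reviews them. The score format, label names, and threshold below are hypothetical stand-ins; a real system would obtain per-emotion confidences from a trained model or a vendor API.

```python
# Hypothetical sketch: quarantine user responses whose inferred
# negative-emotion score exceeds a threshold, so that moderators
# can review them before publication. All values are illustrative.

QUARANTINE_THRESHOLD = 0.8  # assumed cutoff, tuned per application


def should_quarantine(emotion_scores: dict) -> bool:
    """emotion_scores maps emotion labels (e.g. 'anger', 'joy') to
    confidences in [0, 1], as an emotion-tracking API might return."""
    negative_labels = ("anger", "disgust", "contempt")
    return max(emotion_scores.get(label, 0.0) for label in negative_labels) >= QUARANTINE_THRESHOLD


# Two example responses with (made-up) emotion scores attached.
responses = [
    {"text": "Great product!", "emotions": {"joy": 0.9, "anger": 0.02}},
    {"text": "This is infuriating.", "emotions": {"anger": 0.93, "joy": 0.01}},
]

held = [r["text"] for r in responses if should_quarantine(r["emotions"])]
print(held)  # -> ['This is infuriating.']
```

The threshold is a design choice: set too low, benign responses pile up in the review queue; set too high, hostile content slips through.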
While you may think all of this is new technology, there are already startups that focus solely on building emotion tracking software. Companies such as Sentimine, Emotient, and Affectiva are among the leading players, and they often point out how emotion tracking software can help in important domains such as healthcare. Imagine a patient who is unable to talk; emotion tracking software could help estimate that patient's pain level.
Big data and machine learning are the key underlying technologies driving innovation in this segment. Built on these, face tracking software has become one of the main approaches to deciphering human emotions captured on camera. A number of applications leverage face tracking in this space, but most are still early-stage. Large providers such as Microsoft are also entering the market, which suggests that this segment may have a bright future.
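The camera-based flow described above typically has two stages: locate faces in a frame, then score each face against a set of emotion labels. The sketch below shows only that structure; both stages are placeholder stubs with made-up return values, since real systems rely on trained models for each step.

```python
# Structural sketch of a face-tracking emotion pipeline.
# detect_faces and score_emotions are hypothetical stubs standing in
# for trained models; their outputs here are fixed example values.
from dataclasses import dataclass
from typing import Dict, List


@dataclass
class FaceBox:
    x: int
    y: int
    width: int
    height: int


def detect_faces(frame) -> List[FaceBox]:
    # Stub: a real detector (e.g. a Haar cascade or a trained CNN)
    # would locate face regions in the camera frame.
    return [FaceBox(40, 30, 120, 120)]


def score_emotions(frame, face: FaceBox) -> Dict[str, float]:
    # Stub: a real classifier maps the cropped face region to
    # per-emotion confidences learned from labeled data.
    return {"joy": 0.7, "surprise": 0.2, "anger": 0.1}


def dominant_emotions(frame) -> List[str]:
    """Return the most likely emotion label for each detected face."""
    results = []
    for face in detect_faces(frame):
        scores = score_emotions(frame, face)
        results.append(max(scores, key=scores.get))
    return results


print(dominant_emotions(frame=None))  # -> ['joy'] with the stub values
```

In a deployed system, the same two-stage shape remains; only the stubs are replaced by models trained on large labeled datasets, which is where the big data and machine learning mentioned above come in.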
While we have seen the potential applications, the larger and harder question is whether emotion tracking software will ultimately benefit or harm users. It is understandable that organizations want to use this data for meaningful purposes, but how could it adversely affect users' security and privacy? What if an application tells a user, "you are unhappy"? A message repeated often enough could itself contribute to depression; some research has already linked heavy social media use, such as regular Facebook use, to an increased risk of depression.
As the parties that deal most directly with this evolving software, both vendors and users bear an important responsibility: to use these new applications ethically, realizing their potential for benefit rather than letting them become harmful.