Define User Reactions Using Artificial Emotional Intelligence

Determining a persona’s intrinsic value is a herculean feat at best. By asking key questions about the stakeholders and their users in order to examine needs, wants and desired outcomes, AI systems now provide an unbiased approach that considers:

  • How do we make sure we are building AI-powered products that take human behavior into consideration?
  • How do we make sure our users and stakeholders are represented accurately in this product?

My perception is not of the world, but of my brain's model of the world.

 ― Chris Frith, Making Up the Mind, 2007

One of the key factors to examine is the emotional analysis of the users or stakeholders being tested. AI collects data, such as image metadata, on how a person communicates verbally and non-verbally in order to understand their mood or attitude. The technology, also referred to as emotional analytics, provides insights into how a customer perceives a product, the presentation of a product or their interactions with a customer service representative.
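To make the "verbal" side of that idea concrete, here is a toy sketch of mood analysis over a customer transcript. Real emotion AI relies on trained deep-learning models rather than hand-written word lists; the keywords and labels below are assumptions chosen purely for illustration.

```python
# Toy sketch of verbal emotional analytics: a keyword-based mood guess.
# Real systems use trained models; these word lists are illustrative only.
POSITIVE = {"love", "great", "easy", "helpful", "thanks"}
NEGATIVE = {"frustrated", "broken", "waiting", "confusing", "cancel"}

def guess_mood(transcript: str) -> str:
    """Return a rough mood label for a customer transcript."""
    words = transcript.lower().split()
    score = sum(w in POSITIVE for w in words) - sum(w in NEGATIVE for w in words)
    if score > 0:
        return "positive"
    if score < 0:
        return "negative"
    return "neutral"

print(guess_mood("I have been waiting an hour and the checkout is broken"))
# -> "negative"
```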

Artificial Emotional Intelligence

Emotion AI gathers cues about a user’s emotional state from a variety of sources, including facial expressions, muscle tension, posture, hand and shoulder gestures, speech patterns, heart rate, pupil dilation and body temperature. The technology that supports emotion measurement and analysis includes sensors, cameras, big data and deep-learning analytics engines.
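As a rough sketch of how those separate cues might be combined once measured, the snippet below fuses per-channel scores (face, voice, posture, heart rate) into a single weighted estimate of emotional valence. The channel names, weights and the -1.0 to 1.0 scale are assumptions for illustration, not any vendor’s actual model.

```python
from dataclasses import dataclass

# Illustrative fusion of multimodal emotion cues into one valence score.
# Channel names, weights, and the -1.0..1.0 scale are assumptions.
@dataclass
class EmotionCues:
    facial_expression: float  # -1.0 (negative) .. 1.0 (positive)
    speech_pattern: float
    posture: float
    heart_rate: float

WEIGHTS = {
    "facial_expression": 0.4,
    "speech_pattern": 0.3,
    "posture": 0.2,
    "heart_rate": 0.1,
}

def fuse(cues: EmotionCues) -> float:
    """Weighted average of the individual cue scores."""
    return sum(getattr(cues, name) * w for name, w in WEIGHTS.items())

print(round(fuse(EmotionCues(0.6, 0.2, -0.1, 0.3)), 2))  # -> 0.31
```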

Companies that offer artificial emotional intelligence:

  • Affectiva: analyzes complex and nuanced human emotions and cognitive states from face and voice.
  • Humanyze: allows for more informed company-wide decisions. Utilizes hidden patterns in corporate-owned communication to measure how work gets done.
  • CrowdEmotion: utilizes image metadata to track attention, facial coding to understand engagement, and implicit testing to quantify memorability.
  • Emotient: (now owned by Apple) deploys artificial intelligence to identify and understand emotional reactions.
  • Microsoft Azure: AI tools designed for developers and data scientists to assist in creating data for a wide range of products.

Here is an example of Affectiva in action using their artificial emotional intelligence tools:

APIs:

  • RESTful: (Representational State Transfer) APIs are designed to take advantage of existing protocols. While REST can be used over nearly any protocol, it usually takes advantage of HTTP when used for web APIs (see the sketch after this list).
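As a sketch of what a RESTful emotion-analysis call looks like in practice, the snippet below POSTs an image over HTTP to a hypothetical endpoint and reads back JSON scores. The URL, header names and response shape are placeholders rather than any specific vendor’s documented API; the provider’s documentation defines the real contract.

```python
import requests  # third-party HTTP client: pip install requests

# Hypothetical RESTful emotion-analysis call; the URL, headers, and
# response shape are placeholders, not a specific vendor's API.
ENDPOINT = "https://api.example.com/v1/emotions"
API_KEY = "YOUR_API_KEY"

def analyze_image(path: str) -> dict:
    """POST an image and return the JSON emotion scores."""
    with open(path, "rb") as f:
        response = requests.post(
            ENDPOINT,
            headers={
                "Authorization": f"Bearer {API_KEY}",
                "Content-Type": "application/octet-stream",
            },
            data=f.read(),
            timeout=30,
        )
    response.raise_for_status()
    return response.json()

# scores = analyze_image("stock_photo.jpg")
# print(scores)  # e.g. {"happiness": 0.92, "surprise": 0.05, ...}
```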

Here is a screenshot of a stock image that I ran through Microsoft’s API:
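A call like that typically returns per-emotion confidence scores for each detected face. As a rough sketch, the snippet below reduces such a response to a single dominant-emotion label; the JSON shape is only an approximation of the kind of payload Microsoft’s face-analysis API returns, not its exact schema.

```python
# Approximate shape of a face-analysis response with emotion scores;
# the exact schema belongs to the vendor and may differ.
sample_response = [
    {
        "faceAttributes": {
            "emotion": {
                "anger": 0.0, "contempt": 0.0, "disgust": 0.0, "fear": 0.0,
                "happiness": 0.96, "neutral": 0.03, "sadness": 0.0, "surprise": 0.01,
            }
        }
    }
]

def dominant_emotion(face: dict) -> str:
    """Pick the emotion with the highest confidence score."""
    scores = face["faceAttributes"]["emotion"]
    return max(scores, key=scores.get)

for face in sample_response:
    print(dominant_emotion(face))  # -> "happiness"
```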

Why Does It Matter?

It comes down to an unbiased approach. Artificial Emotional Intelligence allows you to set aside your own cultural, emotional, physical and spiritual persona and replace it with valuable data.

Human beings, viewed as behaving systems, are quite simple. The apparent complexity of our behavior over time is largely a reflection of the complexity of the environment in which we find ourselves.

 ― Herbert A. Simon, The Sciences of the Artificial

Artificial emotional intelligence data focuses our decisions surrounding each component of the product and adds a layer of unbiased, real-world considerations to any conversation.