The world of computing is often described in discrete, black and white terms. Everything is rational, and at its heart a computer program is simply a series of bits that are either on or off, represented by a value of 1 or a value of 0.
In many ways, a computer is a device that has been created to embody logic. It is an electro-mechanical representation of a totally left-sided brain. The last thing that a computer could do is to show, or recognize, emotion - right? How absurd would that be!
And yet we have a term - affective computing - that describes computing that is in some way connected to emotion. It is sometimes also known as emotional artificial intelligence.
Emotion and Technology
Despite the fact that emotion is a fundamental part of our everyday lives, it has largely been ignored by technology over the years. This lack of emotional interaction has, in many ways, made using technology frustrating for humans.
And as humans we are curious beings. Most of us have a voracious thirst for knowledge. We want to know more about our emotions and the effects they have on how we interact with computers, robots, software, and so on.
Dr. Rosalind Picard and the MIT Affective Computing Research Group
There are now researchers who specialize in affective computing - focusing their research on the new technologies that advance basic understanding of emotion and its role in human experience.
One such group is the MIT Affective Computing Research Group. Its researchers have studied areas such as new ways for people to communicate affective-cognitive states, new ways to assess frustration, stress, and mood indirectly, and how computers can be made more emotionally intelligent, along with other similar areas of affective computing.
The phrase Affective Computing itself was coined by Dr. Rosalind W. Picard, Director of the MIT Affective Computing Research Group. She published a book entitled Affective Computing in 1997, and the name stuck, becoming the common term for the field.
In her book, she looked in detail at how she envisioned affective computing would progress, as well as at possible applications and potential concerns. Picard argued that there had to be something like emotional reasoning for there to be any form of true machine intelligence. Her key idea was that it should be possible to create machines that relate to, arise from, or deliberately influence emotion and other affective phenomena. She argued that programmers needed to consider affect when writing software that interacted with people.
Her vision of affective computing has its roots in neurology, medicine, and psychology, and looks at the emotional interaction between people and machines from a biological perspective.
According to Picard's school of thought, emotions (also known as affects) are identifiable states that can be modelled to enable interactions between people and machines that are as human-like as possible.
Other Theories Regarding Affective Computing
Since the late 1990s, many others have expanded on her theoretical ideas, and quite a few of those ideas have become reality over the last 20 years. Her research is considered a cognitivist-inspired design approach.
Of course, as with most areas of scientific research, not all later researchers have agreed with Picard's ideas. This lack of total consensus has itself helped the field advance, as more researchers have tried to expand on and provide proof for their own theories. The best-known alternative is the Affective Interaction Approach, which focuses on making emotional experiences available for people to reflect upon, so that they can in some way modify their reactions.
According to a number of researchers, including a team led by R. J. Davidson in 2002, emotional experiences do not occur solely in our minds; our whole bodies experience them. Hormone levels change in our bloodstream, nerve signals make muscles tense or relax, blood rushes to different parts of the body, and our posture, movements, and facial expressions change. These bodily reactions also feed back into our minds, creating experiences that in turn regulate our thinking, which feeds back into our bodies again.
How Can Computers React Affectively?
The MIT Affective Computing team, along with other researchers in the field, has developed methods and technology that give computers the capacity to read our feelings and react appropriately, in ways that appear surprisingly human.
Computers can be trained to identify patterns in vocal pitch, rhythm, and intensity. They can be programmed to scan a conversation between people and determine whether the participants are angry or frustrated or happy.
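The kind of vocal analysis described above typically starts by extracting simple acoustic features from the speech signal and feeding them to a trained classifier. Below is a minimal sketch of two such features; the function names and frame size are illustrative assumptions, not any particular system's design, and a real classifier would learn from labelled speech rather than use raw features directly.

```python
# Illustrative sketch: two simple acoustic features that affect-detection
# systems commonly extract from speech before classification.
# `signal` is a mono waveform as a list of samples in [-1.0, 1.0].
# The frame size (assumed here) would depend on the sample rate.

def rms_energy(frame):
    """Root-mean-square energy: a rough proxy for vocal intensity."""
    return (sum(s * s for s in frame) / len(frame)) ** 0.5

def zero_crossing_rate(frame):
    """Fraction of adjacent samples that change sign; correlates
    loosely with pitch and voicing."""
    crossings = sum(
        1 for a, b in zip(frame, frame[1:]) if (a >= 0) != (b >= 0)
    )
    return crossings / (len(frame) - 1)

def frame_features(signal, frame_size=400):
    """Split the signal into fixed-size frames and compute
    (energy, zero-crossing rate) per frame."""
    feats = []
    for start in range(0, len(signal) - frame_size + 1, frame_size):
        frame = signal[start:start + frame_size]
        feats.append((rms_energy(frame), zero_crossing_rate(frame)))
    return feats
```

In practice a pipeline like this would add pitch tracking and rhythm measures, then pass the per-frame features to a model trained on speech labelled as angry, frustrated, happy, and so on.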
There has been other software created which can measure sentiment by analyzing the arrangement of our words, by reading our gestures or from facial expressions. A number of the apps created using the Kairos Emotions API perform that last task.
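The simplest form of word-based sentiment measurement is a lexicon lookup with a rule for negation. The sketch below is a toy illustration of that idea, not how any particular product works; the tiny word lists and the single negation rule are placeholder assumptions, where real tools use large lexicons or trained language models.

```python
# A minimal lexicon-based sentiment scorer, for illustration only.
# The word lists and negation rule are stand-ins for the much larger
# resources real sentiment-analysis software relies on.

POSITIVE = {"happy", "great", "love", "good", "pleased"}
NEGATIVE = {"angry", "frustrated", "bad", "hate", "awful"}
NEGATORS = {"not", "never", "no"}

def sentiment_score(text):
    """Return a score > 0 for positive text, < 0 for negative.
    A negator flips the polarity of the next sentiment word."""
    score = 0
    negate = False
    for word in text.lower().split():
        w = word.strip(".,!?")
        if w in NEGATORS:
            negate = True
            continue
        if w in POSITIVE:
            score += -1 if negate else 1
        elif w in NEGATIVE:
            score += 1 if negate else -1
        negate = False
    return score
```

Even this toy version shows why "the arrangement of our words" matters: "not happy" scores negative even though "happy" appears in it.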
One experiment using facial detection software was undertaken at North Carolina State University. Researchers filmed students as they were being tutored and used the software to detect whether students looked bored or frustrated. They were then able to determine whether the students needed more challenging work.
Affective computing researchers aim to widen the range of areas where computers can connect with human emotions. The ultimate goal of artificial intelligence is for a computer to chat with a human, and for the computer to understand the meaning behind what the human is saying.
Of course, some emotions are harder to detect than others. That is why Paul Ekman considered only six (later seven) universal emotions to be made obvious by facial expression. For affective computing to truly fulfil its promise, some way of detecting the other emotions will be needed.
Interestingly, Intel sees affective computing as simply part of a larger technology, which it refers to as Perceptual Computing. Perceptual Computing uses non-traditional ways to interact with computers, whether touchscreens, sensors, cameras, face recognition or gesturing. The days of solely interacting with your computer by entering information with a keyboard and mouse, and viewing data on a monitor or printing it out, are over. As we move towards a more natural usage of our electronic devices, our ways of interacting with them have matured and, well, become more representative of how we would interact with other humans.
A more recent development with computing has been a widening of what is considered to be a computer. Technically, the phrase affective computing also applies to many wearable devices. Just as computers can collect data about your emotional state through facial detection and speech-pattern analysis, devices like Fitbit record physiological symptoms such as your pulse (which can help determine if you're excited or anxious).
A number of Dartmouth researchers have even demonstrated how smartphones can be used to detect stress, loneliness, depression and similar such emotions.
With the recent arrival of the Apple Watch, there is further opportunity to measure a person's health metrics and levels of physical activity. Combine all of the data that can now be collected, and there is quite a large degree of emotional measurement undertaken by our everyday electronic devices.
It is still relatively early days for affective computing. The Kairos Emotions API is one tool available for developers to include emotional analysis in their apps. What new or original use could you have for it? How could you help consumers measure, or indeed understand, their emotions?