Introduction and mood detection
Is Apple Watch setting pulses racing? The latest high-profile wearable's heart rate sensor could provide apps with a clue to mood and emotions. For now, this is basic stuff – Siri won't be reading minds anytime soon – but could future iterations analyse facial expressions and listen to conversations? Probably, and in doing so, smartwatches and wearables could read at least basic levels of happiness or stress in the wearer.
It's not just wearables that will start to take emotional data and sensory technology seriously. One example is Nikon's concept of 'context cameras'. Taking a picture is all about capturing a moment – an emotion – but couldn't a camera use more sensors to capture it more accurately?
By listening to sound, taking the temperature and enhancing certain colours, tones, exposure and contrast levels, a context camera could better recreate the emotion the photographer intended. Nikon's future-gazing report also foresees tiny, always-on devices that continuously capture spontaneous images.
What is emotional data, and how is it gathered?
Emotional data is anything that indicates state of mind. "Emotional data aims to track a user's emotional reactions – such as states of joy, delight, surprise, excitement, fear and sadness – to particular external events," says Diana Marian, Marketing Strategist at Ampersand Mobile.
Some of those emotions have physiological traces, such as an increased pulse or hormone production, as well as facial expressions. Headsets, watches and cameras can discern all of those things, but can they accurately map physiological states onto mental states? Only roughly, thinks Marian, because emotions are complex. "No amount of technology will ever be able to get this bit right all the time," she says. "It can only get to rough approximations."
There are two ways of collecting emotional data: via sentiment analysis software (which looks for linguistic patterns in tweets, status updates and emails) and via biometric data from wearables, which detects 'emotional arousal', such as a quickened pulse.
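To make the first of those routes concrete, here is a deliberately tiny, purely illustrative sketch of lexicon-based sentiment scoring in Python. The word lists and scoring formula are invented for demonstration; real sentiment-analysis software uses much larger weighted lexicons and trained language models.

```python
# Toy lexicon-based sentiment scorer. The word lists and scaling are
# invented for illustration; production tools use trained models and
# lexicons with thousands of weighted entries.

POSITIVE = {"joy", "delight", "love", "excited", "happy", "great"}
NEGATIVE = {"fear", "sad", "sadness", "angry", "outrage", "upset"}

def sentiment_score(text: str) -> float:
    """Return a score in [-1, 1]; positive values suggest positive sentiment."""
    words = [w.strip(".,!?").lower() for w in text.split()]
    hits = sum(w in POSITIVE for w in words) - sum(w in NEGATIVE for w in words)
    # Normalise by length and clamp, so long neutral texts score near zero
    return max(-1.0, min(1.0, 10.0 * hits / max(len(words), 1)))

print(sentiment_score("So excited and happy about the new watch!"))    # positive
print(sentiment_score("This queue fills me with outrage and sadness")) # negative
```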
Can wearables detect mood?
They can try. "They can monitor heart rate, blood pressure, temperature, location and movements," says Scott Byrnes-Fraser, Head of UX design at Adaptive Lab. "Based on that information it would be possible to calculate the most likely emotion being felt at that time."
Most analysts are sceptical of what wearables can achieve. "Without contextual data a heightened heart rate could mean anger, excitement, love, fear or even that the person just climbed a flight of stairs," suggests David Fletcher, Chief Data Officer at MEC. "Emotions are both psychological and biological – wearables could tell us the biological state of someone's body but not how they were interpreting this as an emotion."
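Fletcher's caveat is easy to demonstrate in code. The Python sketch below is deliberately naive, with made-up thresholds: adding a simple motion signal flips the interpretation of the same elevated heart rate, and even then the 'emotional' branch cannot say which emotion is being felt.

```python
# A crude illustration of why biometric data needs context. The
# thresholds and labels are invented and not clinically meaningful.

def guess_state(heart_rate_bpm: int, steps_last_minute: int) -> str:
    """Combine a raw biometric signal with simple activity context."""
    elevated = heart_rate_bpm > 100
    moving = steps_last_minute > 60
    if elevated and moving:
        return "physical exertion (e.g. just climbed a flight of stairs)"
    if elevated:
        return "emotional arousal - but anger, excitement, love or fear?"
    return "calm / baseline"

print(guess_state(125, 90))  # exertion, probably not an emotion at all
print(guess_state(125, 0))   # arousal of some kind; the emotion stays ambiguous
```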
Human emotions aren't easy to interpret. "Mood is a 3D concept and it's a big challenge to fully understand using current wearable technology," says Collette Johnson, Medical Business Development Manager at Plextek Consulting. However, a rough approximation might be enough: being able to put a tersely-written email or message in the context of how its author was feeling could be genuinely useful.
"Emotional data is all the things in between the lines, all the unmeasurable data that makes us humans do what we do," says Marcus Mustafa, Global Head of User Experience at global marketing and technology agency DigitasLBi. "But the 'in-between' is rapidly disappearing … we might become more aware of ourselves, and hopefully more tolerant to others."
If gadgets, wearables and software that analyse our every action struggle to extract more than a rudimentary understanding of the wearer's mood, it's partly because emotions are often poorly understood by the person experiencing them, Fletcher thinks. He suggests that emotions are influenced by what people think they should feel, rather than what they privately feel.
"Sentiment analysis would tell us that millions of people felt outrage at Kanye West headlining Glastonbury, yet only 150,000 people can go and could genuinely feel upset about it," he says. "It was a socially-encouraged feeling rather than a pure somatic emotion."
Measurement and sharing
Why could emotional data be so useful?
Computers are logical. Humans are not. For a 'brain-computer interface' to work, one needs to understand the other's motivations. "People like to think that they make decisions based on logic, but most decisions are based on emotion," says Byrnes-Fraser. "By understanding somebody's emotional state or potential emotional state, a service can use behaviour design techniques to build on those emotions."
What emotional data promises most of all is personalisation, which is what the modern web is all about. "If emotional data was actually able to track and measure people's emotional states, it would be immensely useful, as it would allow marketers to personalise their offerings based on the actual – rather than assumed – likes and dislikes of their customers," says Marian.
However, personalisation isn't just about advertising. "ED is useful as it helps us understand behaviours of people in certain situations for targeted marketing purposes," says Johnson, "but also for health reasons, such as depression and behavioural change that might need intervention."
How emotions are already being measured
Just because our smartphone cameras aren't yet being used en masse to gauge our reaction to, say, an advert, a viral video or a downloaded movie, it doesn't mean that emotional data isn't already being gathered by marketers.
"We've seen an explosion in the market of emotion and facial recognition software that allow brands to pre-validate their message," says Justin Taylor, UK Managing Director at Teads. "Using state-of-the-art facial recognition software and algorithms, developed in partnership with MIT, we can detect facial expressions and head gestures obtained from webcams or mobile cameras."
When you understand how users engage with online video ads, it's easier to find the creative angle that really connects with people, thinks Taylor. "Making viewers smile can make it almost five times more likely to achieve more than 10 million views," he says.
"Our phones could in principle collect data about expressed emotions," says Marian. "This data would not, however, be data about actual emotional states, but about emotional expressions."
Even if it provides only a clue to someone's emotional response, the expression on a person's face is such a giveaway that it's bound to become the primary form of emotional data.
"Vending machines with facial recognition, which recommend what to buy, are perhaps the most commercial example so far," says Mustafa, "but the notion of focusing your thoughts in the present moment is very much in vogue." Headbands like Melon or Muse act like an activity monitor for your brain, and already collect data on cognitive behaviour. "Most wearables have the capacity to measure some types of body data, and if shared, will add to the bigger picture," he says.
Emotional data: would you share it?
Some think that the collection of emotional data could bring better self-awareness, and a deeper understanding of each other's behaviour. Others suspect it will only ever be used for marketing. How emotional data can be collected and used is up for grabs, but it strays into the most personal, private territory of all – people's subconscious reactions to the world around them.
"It's still about trust and when the customer wants to give their data away," says Mustafa, "but there could be massive benefits if people can stay in control of what they share, and with whom." Gaining people's trust will be a vital first step if the age of emotional data is to kick-off.
"We are a generation away from 'Her'," says Mustafa, referencing 2013's movie about a man and his emotionally-aware digital assistant. "You can see that the kids of today are so much more at ease with online influence, but so far Siri is mostly used for a laugh."
However, he thinks that AI is creeping towards a 'human operating system' – and he's not the only one. "If computers could crunch the amount of data needed in order to judge an emotion from a face in milliseconds, then it is a very exciting step for AI," says Fletcher. "I think they can do it, but not as quickly or well as humans can, yet."
Whether the collection and use of emotional data ends up accepted or hated, the coming hype cycle will prove once again how unpredictable, how complex and how critical human emotions and behaviours are. Used or not, emotional data is the missing link.