The Role of Human Emotions in the Future of Transport
The future of mobility is emotional.
Getting from A to B is about to feel very different.
A new era of connected mobility is emerging, transforming journeys into tailored experiences, designed around the individual preferences of each human undertaking them. With this development — especially the automation of driving — we see a series of challenges around issues like trust, understanding, agency and control.
If we are to become comfortable with mobility-services tech, which must not only optimise our transport experience but do so without killing us, we need to take a long, hard look at human-machine interaction.
The Transport Sector is Shifting Gear
Of all the hyped-up industry disruptions I’ve heard of since leaving school, I don’t remember seeing anything quite like what seems to be happening with transport. The automotive industry, dominated by the big vehicle brands over the last century, is facing a perfect storm of drivers for change. Automation of vehicles, ride sharing, and electrification are just three of the main factors.
Sometimes it takes a kick up the arse for an industry to truly change. Automating cars, for instance, won’t just be a convenience; it will save lives. Putting a human behind the wheel will come to seem an insane option. That sounds like a compelling case for change.
The main players in the sector seem to share a common objective for growth. While each of them has its own proprietary name for it, the aim is the same: to move away from selling vehicles, to providing an end-to-end mobility service experience.
As the automotive industry switches gear from ownership of human-driven vehicles to the provision of mobility services (as explained very competently over here), the product will no longer be your car, it will be a seamless flow of services that get you from A to B in a manner that you appreciate.
Trust in the Machine
Once autonomous vehicles become ubiquitous, and road deaths essentially vanish, we may take it for granted that it’s software driving us around. But in this fascinating transition period we are entering, where we learn to let go of the steering wheel, the industry has a massive UX problem on its hands, starting with the question: ‘How do I trust a machine with my life?’
It’s not all going to happen at once. Already there are cars on the road that have some level of self-driving capability, from the common option of cruise control at one end of the spectrum to fully autonomous vehicles at the other. It’s likely that our control of the vehicle will be taken away from us in stages. For instance, specific areas such as city centres or open highways might become autonomous-only before others do.
In every user interaction we have investigated at our company, Sensum — from TV viewing to skydiving — we have tried to uncover the emotional journey that the user experiences. This is primarily done by collecting biometric data like heart rate, skin conductance, facial expression and voice analysis, along with many other physiological and contextual data sources. We apply emotion AI algorithms to those concurrent data streams to tease out the user’s authentic emotional response towards the interaction, moment-by-moment.
The resulting insights can be used to improve the experience of just about any product, service or environment. We are teaching machines not only to measure our unique emotional characteristics but also to respond appropriately at a level that is personalised to the individual. From this kind of innovation, an era of ‘empathic’ technology and media is emerging that has profound implications for almost any product or service in any sector.
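To make the idea of fusing concurrent biometric streams into a moment-by-moment emotional signal more concrete, here is a minimal sketch in Python. It is purely illustrative, not Sensum’s actual algorithm: the sample data, weights and the assumption that both streams are sampled at the same instants are all invented for the example.

```python
from dataclasses import dataclass

@dataclass
class Sample:
    """One time-stamped reading from a single biometric stream."""
    t: float      # seconds since session start
    value: float  # raw sensor reading

def normalise(samples):
    """Rescale a stream to 0..1 so different sensors are comparable."""
    values = [s.value for s in samples]
    lo, hi = min(values), max(values)
    span = (hi - lo) or 1.0
    return [Sample(s.t, (s.value - lo) / span) for s in samples]

def arousal_timeline(heart_rate, skin_conductance, w_hr=0.5, w_sc=0.5):
    """Fuse two concurrent streams into a moment-by-moment arousal score.

    Assumes both streams share the same timestamps; a real system would
    resample, filter noise, and calibrate per individual first.
    """
    hr = normalise(heart_rate)
    sc = normalise(skin_conductance)
    return [(h.t, w_hr * h.value + w_sc * s.value) for h, s in zip(hr, sc)]

# Toy session: heart rate climbs steadily; skin conductance spikes mid-ride.
hr = [Sample(t, 60 + 5 * t) for t in range(5)]
sc = [Sample(t, [0.1, 0.1, 0.9, 0.6, 0.2][t]) for t in range(5)]
for t, score in arousal_timeline(hr, sc):
    print(f"t={t:.0f}s arousal={score:.2f}")
```

The weighted sum stands in for what, in practice, would be a trained model over many more physiological and contextual inputs.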
Sensum connecting the audience with a passenger’s emotional response, riding the new Jaguar XE round a racetrack.
But this current change in the mobility climate might hold a special place in the history of human-machine interaction. This is partly because we don’t just need to understand how to make the journey from source to destination more engaging, we first need to establish a vital relationship of trust between the human and the vehicle. Literally ‘vital’. Life-and-death kinda trust.
Mobility is a new kind of digital product, different from something like a smartphone or a website. When one of them ‘crashes’, it doesn’t kill you.
Safety stats alone won’t convince people to entrust their lives to driverless transport. The whole experience must be designed to make you feel at ease, managing your expectations throughout the journey so you accept that the robots know best. Then, once we’ve unravelled the putting-my-life-in-the-hands-of-an-AI problem, we can look at the more fulfilling aspects of emotionally competent transport.
Transport that Knows You
We have considered many different ways an empathic vehicle might interact with you. Perhaps it detects that you are getting tired at the wheel, and responds by telling you to take a rest, while turning the AC down a couple of degrees and switching the music playlist to something more energetic. Or, if it senses that you just had a stressful day at work, it might switch the air freshener scent to something calming and advise you to slow down.
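At its simplest, that kind of if-this-then-that feedback is just a mapping from detected passenger states to cabin actions. A minimal sketch, with invented state names and actions rather than any real vehicle API:

```python
def respond(state):
    """Map a detected passenger state to a list of in-cabin actions.

    States and actions are illustrative placeholders; a real empathic
    vehicle would draw on far richer context before acting.
    """
    rules = {
        "drowsy": [
            "suggest a rest stop",
            "lower cabin temperature a couple of degrees",
            "switch playlist to something more energetic",
        ],
        "stressed": [
            "release a calming scent",
            "advise slowing down",
        ],
    }
    return rules.get(state, ["no action"])

print(respond("drowsy"))
```

A hard-coded rule table like this is the crudest possible version; the open question is which of these responses people would actually want.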
Riiiight. OK. But would we welcome that kind of feedback, or just find it annoying? Worse, would it be a dangerous distraction?
I’m stuck wondering what kind of empathic feedback will be truly beneficial for us when the dust settles from this tectonic shift in the mobility industry. What will the killer app be in this new paradigm? Our approach to answering this question is to do the same thing we’ve been doing for a while: wiring ourselves up and having a play.
The Car as Mobile Laboratory
Sensum measuring a driver’s engagement and arousal behind the wheel of a performance car.
As R&D environments go, a steel cage on four wheels is quite appealing. On previous projects with Sensum we figured out how to measure biometric data in much more challenging environments, like from the body of an elite mountain-biker. We had to get round a bunch of risks to the rider, then customise the kit to collect a clean signal through a ton of noise. A car, however, can quite easily be converted into a mobile lab.
Already there is an increasing array of sensors being placed into and around our cars that can be used to collect useful data. We can also expect the passengers to stay more or less in the same place for the whole journey. So there is a big opportunity for measuring the biometric changes of the people in the vehicle (heart rate, facial expression, etc.) to understand what aspects of the journey are fun, scary, exciting, boring, comfortable or otherwise.
There are of course environmental challenges, such as background noise, vibration, and above all the need to avoid distracting the driver whenever she may have control of the vehicle. But innovative mobility companies now have the opportunity to connect their customers to the transport experience by infusing empathic technology into every step of the journey.
Empathy is an Evolutionary Step for AI
As with just about any current technological medium, it’s the artificial intelligence behind the tools that transforms them from dumb machines into delightful, addictive experiences. What we’re now seeing is the stitching of human data — physiological, behavioural and emotional — into the AI stack to make every service empathic: to understand not just how we feel in the moment but also how we want our machines to respond to those feelings.
Taking this artificial empathy one step further, consider the interaction between the machines and their environment. We humans evolved emotional responses to stimuli around us so we would be motivated to take appropriate action. When we see, hear, smell, taste or touch something it can cause us to initiate a basic response: fight, flee, feed or f*ck.
Of course that’s a simplification. In modern and social contexts, these emotional responses have become deeply nuanced and complex. But understanding human emotions can illuminate how empathic technology could and should behave. If an automated machine such as a car detects an object or activity in its surrounding environment, such as an approaching vehicle, its decision on how best to act is an ‘emotional’ response. Seen this way, empathy is an essential component of intelligent behaviour.
The next generation of cars, buses, trucks, planes, ships, and all the infrastructure that surrounds them, will not just be imbued with artificial intelligence, they will behave empathically towards us and each other.
Transport environments such as car interiors could become what some in the industry are calling the ‘third living space’. Human-centred design, enhanced by emotion AI insights, could turn an inconvenience like the office commute into a pleasant time to work, play or chill.
At last, the journey would truly be the destination.