Going on an emotional journey with Computer Vision
Affective Computing solutions generally combine a range of sensors to monitor aspects of emotional behaviour. These may include cameras for recording facial expression and posture, microphones for detecting changes in voice and tone, haptic technology to reproduce touch, and adhesive body sensors to track movement or perspiration. The affective computing market is expected to be worth over US$41 billion by 2022[1].
Recently, our blog posts have looked at Affective Computing in the healthcare and marketing sectors. Developers of automated vehicles are also investing heavily in R&D to improve the comfort and responsiveness of their vehicles.
Affective computing in automated vehicles
We’re very familiar with outward-facing cameras on vehicles, but Eyeris and Affectiva are filling car interiors with cameras in order to detect and act upon the actions and emotions of both drivers and passengers. Affective technologies can monitor driver drowsiness, which can be used to trigger alerts; posture and positioning, which can be linked to intelligent seating to make passengers more comfortable; and even mood, which could in the long run help to avoid incidents of road rage and impatience.
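To give a flavour of how camera-based drowsiness monitoring can work, here is a minimal sketch of one widely used approach: computing an eye aspect ratio (EAR) from facial landmarks and raising an alert when the eyes stay closed for too long. The landmark ordering, threshold and frame count below are illustrative assumptions, not values from any of the systems named above.

```python
# Illustrative sketch only: EAR-based drowsiness alerting.
# Assumes six (x, y) eye landmarks per frame, ordered p1..p6 around the eye
# contour, as produced by common facial-landmark detectors.
import math

EAR_THRESHOLD = 0.25      # eyes treated as closed below this ratio (assumed)
CLOSED_FRAMES_LIMIT = 48  # ~2 seconds at 24 fps before alerting (assumed)

def eye_aspect_ratio(eye):
    """Ratio of the eye's vertical openings to its horizontal width."""
    a = math.dist(eye[1], eye[5])  # vertical distance p2-p6
    b = math.dist(eye[2], eye[4])  # vertical distance p3-p5
    c = math.dist(eye[0], eye[3])  # horizontal distance p1-p4
    return (a + b) / (2.0 * c)

def drowsiness_alert(ear_per_frame):
    """True if the EAR stays below threshold for too many consecutive frames."""
    closed = 0
    for ear in ear_per_frame:
        closed = closed + 1 if ear < EAR_THRESHOLD else 0
        if closed >= CLOSED_FRAMES_LIMIT:
            return True
    return False
```

A blink only drops the EAR for a handful of frames, so requiring a sustained run of low values is what separates drowsiness from normal blinking in this kind of scheme.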
Much of the research in this area has been carried out by the Massachusetts Institute of Technology (MIT) Media Lab’s Affective Computing Group, and Kia were the first manufacturer to announce a partnership with the group in time for CES 2019. Their READ (Real-time Emotion Adaptive Driving) concept includes vibrating seats that incorporate a massage function and an alert system. The company’s press release explains that “The technology monitors a driver’s emotional state and tailors the interior environment according to its assessment – potentially altering conditions relating to the human senses within the cabin, creating a more joyful mobility experience.”[2]
There’s still work to be done in this area, however – ever driven away from an emotional event like a relationship breakup and cried or yelled all the way home? What would your car have made of that? Such circumstances must be accounted for and worked around so as not to send your vehicle into disaster recovery mode!
Cerence are aiming to take the concept of automated driving to new heights: enabling drivers to open car windows with a hand gesture, for example, or even to look at a passing restaurant and ask for its rating. Such features combine voice, gesture and gaze recognition to bring added value to cars – you’ll no longer need a passenger for companionship or local knowledge.
Stay informed
The road to self-driving vehicles is still a long and bumpy one but affective AI is bringing a new dimension to our relationship with our cars. Want to know more about computer vision and AI? Sign up to our newsletter for highlights. Active Silicon’s range of camera solutions and advanced software can capture, process and manipulate images and video at high speed and high resolution, making them ideal for a wide range of applications.
[1] https://www.businesswire.com/news/home/20170927005549/en/Global-Affective-Computing-Market-2017-2022—Market
[2] https://www.kiamedia.com/us/en/media/pressreleases/14844/ces-2019-kia-prepares-for-post-autonomous-driving-era-with-ai-based-real-time-emotion-recognition-te