Computer vision with Affective Intelligence benefits healthcare

August 15, 2022

We’ve covered a lot of stories and applications of Artificial Intelligence, and the role of computer vision in this growing field, but what about artificial emotional intelligence? Affective computing is the area of computer science which looks at reading, analyzing and reacting to human emotion, or even replicating emotion in machines. Computer vision is a key component of the technology.

Affective computing solutions generally combine a range of sensors to monitor aspects of emotional behavior. These may include cameras for recording facial expression and posture, microphones for detecting changes in voice and tone, haptic technology to reproduce touch, and adhesive body sensors to track movement or perspiration. Advanced software and AI algorithms then process this information into actionable data.

There has been copious academic research into applying affective computing in many areas, but the technology is still at a relatively nascent stage. Affective computing was pioneered by Rosalind Picard, the American engineer behind the companies Affectiva and Empatica. She published her first book on the subject in 1997 and has been a driving force behind the application of affective computing ever since. So while the concept isn’t new, the gap between the volume of research and the number of real-life applications shows just how complex the technology is. Advancements in AI are now enabling affective computing to come to life; here we take a look at its use in mental well-being and healthcare.
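To make the fusion step concrete, here is a minimal Python sketch of how per-sensor emotion estimates might be combined into a single actionable reading. The sensor names, scores and weights are purely illustrative assumptions, not any particular vendor’s implementation; real systems typically learn this fusion from data rather than using fixed weights.

```python
# A hypothetical sketch of multimodal affect fusion: each sensor contributes
# an arousal/valence estimate, and a weighted average gives one overall score.
from dataclasses import dataclass

@dataclass
class ModalityReading:
    name: str       # which sensor produced the estimate
    arousal: float  # 0 = calm, 1 = highly agitated
    valence: float  # -1 = negative, +1 = positive
    weight: float   # how much this modality is trusted (illustrative)

def fuse(readings: list[ModalityReading]) -> tuple[float, float]:
    """Weighted average of per-sensor arousal/valence estimates."""
    total = sum(r.weight for r in readings)
    arousal = sum(r.arousal * r.weight for r in readings) / total
    valence = sum(r.valence * r.weight for r in readings) / total
    return arousal, valence

readings = [
    ModalityReading("facial_expression", arousal=0.7, valence=-0.4, weight=0.5),
    ModalityReading("voice_tone",        arousal=0.6, valence=-0.2, weight=0.3),
    ModalityReading("skin_conductance",  arousal=0.8, valence=0.0,  weight=0.2),
]
print(fuse(readings))  # (0.69, -0.26) -> elevated arousal, mildly negative mood
```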

Affective healthcare

Picard believes affective computing could play an immensely valuable role in communication with and for autistic patients. Firstly, patients can be given wearables with small cameras that read other people’s expressions and moods, paired with software that translates these into social cues. Additionally, robots programmed with affective AI can be used to measure an autistic patient’s level of engagement or mood during different scenarios or tasks; this information then drives the direction of therapy sessions.
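As a rough illustration of the “expression to social cue” translation such a wearable might perform, here is a hypothetical sketch using OpenCV for face detection. The emotion classifier itself (emotion_model) is assumed to exist and the cue wording is invented; only the mapping step is shown.

```python
# Hypothetical sketch: detect a face, classify its expression (placeholder
# model), and translate the label into a plain-language social cue.
import cv2

SOCIAL_CUES = {
    "happy":    "The person seems pleased - a good moment to keep talking.",
    "confused": "They may not have followed you - consider rephrasing.",
    "bored":    "Their attention is drifting - try changing the topic.",
    "angry":    "They appear upset - it may help to pause and check in.",
}

face_detector = cv2.CascadeClassifier(
    cv2.data.haarcascades + "haarcascade_frontalface_default.xml")

def cue_for_frame(frame, emotion_model):
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    faces = face_detector.detectMultiScale(gray, 1.3, 5)
    if len(faces) == 0:
        return None
    x, y, w, h = faces[0]
    label = emotion_model.predict(gray[y:y + h, x:x + w])  # placeholder model
    return SOCIAL_CUES.get(label)
```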

Monitoring mental and physical well-being

A range of software suppliers have been experimenting with sensors and algorithms to serve various areas of healthcare and improve the lives of patients suffering from stress, depression and even PTSD. One of these, NuraLogix, has developed DeepAffex, an Affective Intelligence platform which uses facial blood-flow imaging to measure physiological indicators of well-being, including heart rate, breathing rate and blood pressure, simply from video of a patient’s face. As well as supporting healthcare, the analytics can be applied to recorded video to see, retrospectively, how stressed a subject may have been during a particular experience such as an interview.
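Heart rate can in principle be recovered from ordinary video because blood flow causes tiny periodic colour changes in the skin, a technique known as remote photoplethysmography (rPPG). The sketch below shows the basic idea using OpenCV and NumPy; the input file name is hypothetical and this is a simplified illustration of the general technique, not DeepAffex’s actual algorithm.

```python
# Minimal rPPG sketch: track the mean green-channel intensity of the face
# region over time, then find the dominant frequency in the heart-rate band.
import cv2
import numpy as np

cap = cv2.VideoCapture("face_video.mp4")   # hypothetical input clip
fps = cap.get(cv2.CAP_PROP_FPS) or 30.0
face_detector = cv2.CascadeClassifier(
    cv2.data.haarcascades + "haarcascade_frontalface_default.xml")

green_means = []  # mean green intensity of the face region, one value per frame
while True:
    ok, frame = cap.read()
    if not ok:
        break
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    faces = face_detector.detectMultiScale(gray, 1.3, 5)
    if len(faces) == 0:
        continue
    x, y, w, h = faces[0]
    roi = frame[y:y + h, x:x + w]
    green_means.append(roi[:, :, 1].mean())  # OpenCV frames are BGR
cap.release()

# Remove the mean, take the FFT, and pick the strongest frequency in the
# typical human heart-rate band (0.7-4 Hz, i.e. roughly 42-240 beats/min).
signal = np.asarray(green_means) - np.mean(green_means)
freqs = np.fft.rfftfreq(len(signal), d=1.0 / fps)
spectrum = np.abs(np.fft.rfft(signal))
band = (freqs >= 0.7) & (freqs <= 4.0)
heart_rate_bpm = freqs[band][np.argmax(spectrum[band])] * 60.0
print(f"Estimated heart rate: {heart_rate_bpm:.0f} bpm")
```

The clip needs to cover at least several seconds of reasonably still, well-lit face footage for the frequency peak to be meaningful; production systems add face tracking, skin segmentation and far more robust signal processing.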

Want to know more about computer vision and AI? Sign up to our newsletter for highlights or contact our experts. Active Silicon’s embedded vision systems are certified to several medical standards and can capture, process and manipulate images and video at high speed and high resolution, making them ideal for a wide range of applications.
