Affectiva is an emotion measurement technology company that grew out of MIT’s Media Lab.

Affectiva

Affectiva, an MIT Media Lab spin-off, is the pioneer in Emotion Artificial Intelligence (Emotion AI), the next frontier of artificial intelligence. Affectiva’s mission is to bring emotional intelligence to the digital world with emotion recognition technology that humanizes how people and systems interact. Affectiva’s patented technology uses computer vision, deep learning and the world’s largest emotion database of 4.8 million faces analyzed in 75 countries. Affectiva’s SDKs and APIs enable developers to add emotion-sensing and analytics to their own apps, games, devices and digital experiences. Affectiva’s technology is used by one third of the Fortune Global 100 and by more than 1,400 brands to gather insight and analytics into consumer emotional engagement. Affectiva’s emotion recognition technology is applied in many different verticals including online education, healthcare, gaming, robotics, media and advertising, market research, automotive retail, human resources, training and coaching, video communication, experiential design, and in wearables and devices.
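To illustrate the kind of integration the SDKs and APIs enable, the sketch below shows a typical frame-by-frame flow: capture video, score the face for expressions, and act on the result. The `EmotionDetector` class and its `analyze` method are placeholders of our own, not Affectiva’s actual API; only the OpenCV capture calls are real library calls.

```python
# Hypothetical sketch of wiring an emotion-analysis engine into an app loop.
# EmotionDetector is a stand-in for an emotion-sensing SDK, NOT Affectiva's API.

import cv2  # OpenCV for webcam capture


class EmotionDetector:
    """Placeholder emotion-analysis engine (illustrative only)."""

    EMOTIONS = ("joy", "anger", "surprise", "sadness", "fear", "disgust", "contempt")

    def analyze(self, frame) -> dict:
        # A real engine would locate the face, extract features and run a
        # trained classifier; here we simply return neutral scores.
        return {emotion: 0.0 for emotion in self.EMOTIONS}


def main():
    detector = EmotionDetector()
    capture = cv2.VideoCapture(0)  # default webcam
    try:
        while True:
            ok, frame = capture.read()
            if not ok:
                break
            scores = detector.analyze(frame)
            dominant = max(scores, key=scores.get)
            print(f"dominant expression: {dominant} ({scores[dominant]:.2f})")
    finally:
        capture.release()


if __name__ == "__main__":
    main()
```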

1- Application in autonomous vehicles:

Affectiva’s emotion recognition tech aims to make autonomous cars less robotic

Emotion-aware cars may be a crucial step towards adopting advanced autonomous driving capabilities, and may make the entire experience more human.

The emotion recognition engine is a set of algorithms that work with a camera focused on 33 points on the driver’s face, gauging expressions to recognize emotions. Cars, in this case, are fitted with a camera aimed at the driver’s face, and Affectiva’s algorithms can be fed directly into and integrated with an autonomous vehicle’s own software. Affectiva uses deep neural networks to “learn” facial patterns, allowing the system to get to know a person and the many ways in which he or she reacts and emotes. The system then identifies the emotions a driver is experiencing behind the wheel.
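The following minimal sketch shows the general shape of such a pipeline: a cropped face image from the driver-facing camera is passed through a small convolutional network that outputs scores over basic emotions. The architecture, input size and emotion count are assumptions for illustration; this is not Affectiva’s model.

```python
# Illustrative sketch (not Affectiva's model): a toy CNN that maps a cropped,
# grayscale face image to scores over basic emotions, mirroring the described
# pipeline of camera -> facial features -> deep neural network -> emotion.

import torch
import torch.nn as nn


class EmotionNet(nn.Module):
    """Toy classifier over 48x48 grayscale face crops; architecture is assumed."""

    def __init__(self, num_emotions: int = 7):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv2d(1, 16, kernel_size=3, padding=1), nn.ReLU(),
            nn.MaxPool2d(2),                                  # 48 -> 24
            nn.Conv2d(16, 32, kernel_size=3, padding=1), nn.ReLU(),
            nn.MaxPool2d(2),                                  # 24 -> 12
        )
        self.classifier = nn.Sequential(
            nn.Flatten(),
            nn.Linear(32 * 12 * 12, 64), nn.ReLU(),
            nn.Linear(64, num_emotions),
        )

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        return self.classifier(self.features(x))


# A driver-facing camera would supply face crops frame by frame; a random
# tensor shaped (batch, channels, height, width) stands in for one here.
model = EmotionNet()
frame_batch = torch.rand(1, 1, 48, 48)
probabilities = torch.softmax(model(frame_batch), dim=1)
print(probabilities)
```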

Once the emotion is read, the software can use the car’s mechanisms to alert distracted drivers and take control of the car (or hand it back) depending on how the driver is reacting at the moment. This is a crucial addition to the field of autonomous driving: over time, the need will arise to ensure that the person in the driver’s seat remains alert to on-road emergencies. The latest advances in driving autonomy have introduced features like adaptive cruise control, lane detection, pedestrian detection, and proximity and speed-limit controls, and with this new addition, cars will also become more responsive, or emotionally aware. via Digit
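A hedged illustration of that hand-off logic is sketched below: inferred driver-state scores are mapped to a vehicle response, either alerting the driver or escalating to (or returning from) autonomous control. The state names, thresholds and actions are assumptions for illustration, not Affectiva’s or any automaker’s actual policy.

```python
# Assumed decision logic mapping driver-state scores to a vehicle response.
# Thresholds and actions are illustrative only.

from enum import Enum, auto


class VehicleAction(Enum):
    NONE = auto()
    AUDIO_ALERT = auto()     # nudge a distracted or drowsy driver
    TAKE_CONTROL = auto()    # hand control to the autonomy stack
    RETURN_CONTROL = auto()  # hand control back to an attentive driver


def choose_action(drowsiness: float, distraction: float, autonomous: bool) -> VehicleAction:
    """Map per-frame driver-state scores (0.0-1.0) to a vehicle response."""
    if not autonomous:
        if drowsiness > 0.8 or distraction > 0.9:
            return VehicleAction.TAKE_CONTROL   # driver clearly unfit: escalate
        if drowsiness > 0.5 or distraction > 0.6:
            return VehicleAction.AUDIO_ALERT
    else:
        if drowsiness < 0.2 and distraction < 0.2:
            return VehicleAction.RETURN_CONTROL  # driver attentive again
    return VehicleAction.NONE


print(choose_action(drowsiness=0.7, distraction=0.3, autonomous=False))
```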

2- Research on behavior:

Affectiva, the leader in artificial emotional intelligence (Emotion AI) technology, today announced that PLOS ONE, a peer-reviewed open-access scientific journal, has published the results of the first large-scale, naturalistic analysis of gender differences in facial expressions, conducted using Affectiva’s software. The study, “A Large-scale Analysis of Sex Differences in Facial Expressions”, examined gender differences in expressing emotions across five countries, including the United States, the United Kingdom, France, and Germany. The lead author of the paper was Dr. Daniel McDuff, at the time head of Affectiva’s science team and now a researcher at Microsoft Research. The data was collected as part of a collaboration with MIT and McDuff’s Ph.D. thesis work.
via press release