Affectiva Launches Automotive AI Software to Track the Emotional State of Passengers

【Summary】Much of the work developing self-driving cars has focused on what the car “sees” outside of the vehicle, such as lidar, cameras, and radar used to help an autonomous vehicle navigate. Now, a new startup is focusing instead on what’s going on inside of the vehicle—the facial expressions and emotional state of the occupants.

Eric Walz    Apr 27, 2018 11:54 AM PT

BOSTON — Much of the work developing self-driving cars has focused on what the car "sees" outside of the vehicle, such as lidar, cameras, and radar used to help an autonomous vehicle navigate. Now a new startup is focusing instead on what's going on inside of the vehicle—the facial expressions and emotional state of the occupants.

Boston-based startup Affectiva has announced the launch of a new software product called "Affectiva Automotive AI" that focuses on the vehicle's occupants. The initial goal is to better monitor driver alertness to help reduce the number of car accidents. Affectiva Automotive AI uses artificial intelligence to measure the "emotional and cognitive states" of vehicle occupants in real time.

In the future, technology that understands the mood and preferences of passengers might enable an autonomous vehicle to automatically make adjustments that improve the riding experience. It's still not clear whether occupants will prefer that to making those adjustments themselves, but companies like Affectiva are developing the capabilities anyway.

"All of our technologies and our devices are becoming conversational and perceptual—even our car," said Rana el Kaliouby co-founder and CEO of Affectiva to Xconomy.

Her Boston-based startup is one of the companies trying to bring that vision to life. Since 2009, the MIT Media Lab spinout has raised $26 million from well-known investors, including Kleiner Perkins Caufield & Byers and Horizons Ventures.

Photo: Rana el Kaliouby, co-founder and CEO of Affectiva

"Affectiva is the only AI company that can deliver people analytics using algorithms that are built with custom-developed deep learning architectures," said Dr. Rana el Kaliouby, CEO and co-founder, Affectiva. "We have built industry-leading Emotion AI, using our database of more than 6 million faces analyzed in 87 countries. We are now training on large amounts of naturalistic driver and passenger data that Affectiva has collected, to ensure our models perform accurately in real-world automotive environments. With a deep understanding of the emotional and cognitive states of people in a vehicle, our technology will not only help save lives and improve the overall transportation experience, but accelerate the commercial use of semi-autonomous and autonomous vehicles."

The company has been developing technologies and products aimed at sensing people's emotions. It initially took a vision-based approach, using webcams and optical sensors to analyze people's facial expressions and non-verbal cues, for applications in areas such as advertising, marketing, and video games.

Last fall, the company expanded into voice analysis with the release of a cloud-based software product that it says can measure certain basic emotions in speech.

Growing Interest in Emotion Detection

Over the past several years, automakers, vehicle suppliers, and automotive tech startups have expressed growing interest in Affectiva's products, el Kaliouby says. In the race to develop autonomous cars, those companies have mainly focused on outward-facing sensing technologies, she says. But now there's an "increased understanding that you also have these occupants, drivers, and co-pilots in the vehicle that we need to understand as well," she adds.

The software analyzes data fed to it by cameras and microphones inside the car. The company says its software can measure facial expressions that indicate joy, anger, and surprise; vocal expressions of anger, laughter, and excitement; and signs of drowsiness, such as yawning, eye closure, and blink rates.
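
Affectiva hasn't published how these measurements are computed, but eye-closure signals like the ones listed above are commonly summarized with a metric called PERCLOS, the percentage of time the eyes are closed over a window. The sketch below is a minimal, hypothetical Python illustration of that idea; the frame rate, thresholds, and the DrowsinessEstimator class are assumptions for illustration, not Affectiva's implementation.

```python
# Hypothetical sketch: flag drowsiness from per-frame eye-openness values
# (0.0 = fully closed, 1.0 = fully open). PERCLOS -- the fraction of a
# time window during which the eyes are closed -- is a widely used proxy
# for drowsiness. All constants here are illustrative assumptions.
from collections import deque

FPS = 30                 # assumed camera frame rate
WINDOW_SECONDS = 60      # PERCLOS is typically computed over about a minute
CLOSED_THRESHOLD = 0.2   # openness below this counts as "closed"
PERCLOS_ALERT = 0.15     # eyes closed >15% of the window -> looks drowsy

class DrowsinessEstimator:
    def __init__(self):
        # Rolling window of booleans: was the eye closed in this frame?
        self.frames = deque(maxlen=FPS * WINDOW_SECONDS)

    def update(self, eye_openness: float) -> bool:
        """Record one frame's measurement; return True if the driver looks drowsy."""
        self.frames.append(eye_openness < CLOSED_THRESHOLD)
        if len(self.frames) < self.frames.maxlen:
            return False  # not enough history to judge yet
        perclos = sum(self.frames) / len(self.frames)
        return perclos > PERCLOS_ALERT
```

Blink rate, the other signal the company mentions, could be tracked in a similar rolling window by counting closed-to-open transitions.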

Affectiva's new product moves the company into the autonomous vehicles sector, an industry receiving a lot of interest and investment. The decision could help the startup differentiate its business from competitors in emotion-sensing technology, which include EMOSpeech and Vokaturi.

Affectiva isn't the only emotion-sensing company going after the automotive market. Silicon Valley-based startup Eyeris is also developing "vision A.I." products for monitoring drivers and improving the rider experience.

The starting point for Affectiva's new product was what the company learned by using its software to analyze the expressions of 6 million faces, mostly while people watched online videos, el Kaliouby says. To adapt its technology for in-vehicle sensing, the company has paid about 200 people so far to install cameras and microphones in their cars to collect data while they drive.

The company has amassed data from about 2,100 hours of driving, which has helped Affectiva train its machine learning algorithms to recognize yawns and track blinking patterns—capabilities its software didn't previously have, she says.

The first application of the technology is to watch for distracted or drowsy drivers—two significant causes of vehicle accidents. And, as more semi-autonomous vehicles hit the road, el Kaliouby envisions that an "in-cabin sensing solution" for driver alertness will become as crucial as seat belts.

"The [semi-autonomous] car is expecting to be driving for the majority of the time, but in some cases, it might encounter scenarios where it does not know what to do and needs to hand control back to the human driver," she says. "It really needs to know if the driver is awake or not, if the driver is paying attention or not. You might be in the middle of watching a movie or reading a book, but all of a sudden you need to have full context and take control again. … That trust back and forth between the driver and a car is really, really important."

If Affectiva's software determines the driver isn't paying attention, it might trigger some kind of alert, el Kaliouby says. In more extreme cases, perhaps the car could be programmed to pull over and call for help, she says.
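
The article doesn't spell out how such an escalation would be implemented, but a tiered policy (alert first, intervene only after sustained inattention) is one plausible shape for it. A minimal sketch, assuming a hypothetical choose_response function and illustrative timing thresholds:

```python
# Hypothetical escalation policy for an inattentive driver, loosely
# following the behavior described above: trigger an alert first, and in
# extreme cases pull over and call for help. States, timings, and the
# attention signal itself are illustrative assumptions, not Affectiva's logic.
from enum import Enum

class Response(Enum):
    NONE = "none"
    AUDIO_ALERT = "audio_alert"         # chime or spoken warning
    PULL_OVER_AND_CALL = "pull_over"    # extreme case: stop and summon help

def choose_response(seconds_inattentive: float) -> Response:
    """Map how long the driver has appeared inattentive to a response."""
    if seconds_inattentive < 2.0:
        return Response.NONE            # brief glances away are normal
    if seconds_inattentive < 10.0:
        return Response.AUDIO_ALERT     # nudge the driver's attention back
    return Response.PULL_OVER_AND_CALL  # sustained inattention: intervene
```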

Some other car companies have already begun introducing cameras to monitor driver alertness, el Kaliouby says. But she claims they're mostly using "rudimentary eye-tracking" technologies designed only to determine if the eyes are on the road or not.

Automotive OEMs recognize the need to deploy in-cabin AI to improve road safety by identifying dangerous driving behavior. As self-driving capabilities advance, it will be important to monitor driver state in real time to assess whether the driver can assume control of a semi-autonomous vehicle. And as autonomous vehicles become commercially available, Affectiva Automotive AI's understanding of the in-cabin environment will shift toward adapting the travel experience to occupant moods and reactions.

Affectiva has already signed up unnamed car companies as paying customers, el Kaliouby says. It also has partnerships with Autoliv, a Swedish supplier of automotive safety products, and Renovo, a California-based developer of software for autonomous vehicle fleets.

Affectiva's technology could also be used to improve the safety of private rides enabled by apps like Uber and Lyft, el Kaliouby says. If a driver gets angry with a rider, or vice versa, the software could be configured to flag those situations.

In addition, the software could become capable of detecting whether passengers are impaired, which could put them "more at risk of being in danger," whether from the driver or other riders, she says.

When asked what the software might do if it detects a dangerous situation in the vehicle, el Kaliouby says that still needs to be worked out. But she says it probably would at least involve sending an alert to the ride-sharing company.

Focus on the Rider Experience

If autonomous vehicles become mainstream, Affectiva intends to shift its in-car sensing technology to focus on improving the riding experience. For example, if the software determines from people's facial expressions or vocal cues that they're becoming nauseous or uncomfortable, it could tell the vehicle's computer to slow the car down, adjust the driving style, or even stop to take a break, el Kaliouby says.
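
As a rough illustration of that feedback loop, the sketch below maps a hypothetical occupant-state label to a driving-style request. The labels and actions are assumptions for illustration only; the article doesn't describe how Affectiva's output would be wired into vehicle controls.

```python
# Illustrative only: map an estimated occupant state to a driving-style
# request for the vehicle's planner. Labels and actions are assumptions.
def adjust_ride(occupant_state: str) -> str:
    actions = {
        "nauseous": "slow_down_and_smooth_maneuvers",
        "uncomfortable": "soften_acceleration_and_braking",
        "distressed": "stop_for_a_break",
    }
    return actions.get(occupant_state, "maintain_current_style")
```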

Affectiva believes the "vehicle is evolving to become the entertainment hub of the future," el Kaliouby says. Her company's in-vehicle software could also be designed to automatically serve up videos, music, or other personalized content based on people's moods.

The bigger picture here is that automakers, especially luxury brands like BMW, are starting to grapple with the prospect of driving getting taken out of the equation for people, el Kaliouby says.

Carmakers have "always been about the driving experience," she says. "That's becoming commoditized. They're trying to figure out what their role is in this emerging ecosystem."

One question about emotion tracking is why passengers would rely on Affectiva's software to sense that they want the car to slow down, change the cabin temperature, or play the latest hit song. Wouldn't it be easier, or more reliable, to tap a button on the dashboard or speak the command to a virtual assistant like Amazon's Alexa? El Kaliouby argues that "it would be so much better if the car had the context of who you are, and it understood your emotional state." She also says Affectiva's software could improve the interactions passengers have with in-vehicle virtual assistants.

"If you're getting annoyed, you want this conversational interface to understand that and react accordingly," she says. "It could acknowledge it's not getting it right and try to do a better job."

Affectiva will demo Affectiva Automotive AI at NVIDIA's GPU Technology Conference in Silicon Valley, where it will also speak on the panel "The Future of Mobility and In-Car People Analytics" on Thursday, March 29.
