1 February, 2023

How Interior Sensing Improves User Experiences: 5 Use Cases

Interior sensing (in-cabin sensing) systems will soon be a seamless part of our lives, enabling engaging mobility experiences that promote comfort, wellness and entertainment. Over the next several years, we expect to see this technology deployed in vehicles on the road.

Here are a few examples of how everyday scenarios could change once these technologies are implemented:

Emotionally Intelligent Virtual Assistance

Isa is driving a vehicle. She engages the virtual assistant to navigate to a new place. The virtual assistant repeatedly gets the address wrong, and Isa becomes increasingly frustrated. By analyzing her facial and vocal expressions, the interior sensing system detects anger and irritation. To mitigate the errors of the virtual assistant, the system does the following:

– Has the in-cabin virtual assistant change its tone of voice

– Engages recovery methods for dealing with the erroneous behavior

Content Recommendations 

Friends are taking a ride, enjoying themselves. The infotainment system identifies the in-cabin occupants, analyzing their age, gender and moods. Based on this information, the system: 

– Tailors content recommendations based on personal profile, preferences and mood

– Curates a playlist and recommends music content based on age and mood 

– Analyzes passengers’ facial expressions in response to previous content recommendations to improve and personalize future ones

Rideshare Customer Experience

Jack is a rideshare driver. He has picked up two passengers and they all engage in a nice conversation. The passengers also feel very comfortable and safe in the car because Jack obeys all the traffic rules and drives smoothly, avoiding sudden accelerations and braking. The interior sensing system recognizes positive emotions and smiling faces of the passengers and the driver. 

– The signals are transmitted to cloud services where the fleet owner analyzes the driver’s behavior and the customer experience over time. 

– Every quarter the fleet owner ranks the drivers, and the ones with the highest ratings receive a monetary bonus to encourage positive and safe driving behaviors. 

– The customers’ positive ride experience is also recorded in the shared driving app so other users can see it.

Stress Identification 

Sara is driving a semi-autonomous vehicle. She sees a stop light ahead. She does not trust that the system will recognize it needs to engage the brakes and becomes anxious. The in-cabin sensors identify expressions of anxiety and irritation, and the system uses this data to build trust between Sara and the vehicle: 

– The vehicle alerts the driver that it is aware of the upcoming stop 

– It adapts by slowing the vehicle sooner and more steadily

Motion Sickness

Elsa is riding in a vehicle when she begins to feel uncomfortable and nauseous due to the driving behavior. Using the in-cabin camera and microphones, the system tracks facial expressions, vocal events, eye closure and eye gaze to identify nausea and discomfort. It then uses the data to adapt the occupant’s transportation experience: 

– Adapting the driving style to be slower and gentler 

– Adjusting the in-cabin air flow 

– Having the navigation system find an alternative, smoother route 

 

Want to learn more about interior sensing? Download our eBook on Interior Sensing: The Next Frontier in Improving Road Safety and the Mobility Experience

Written by Fanny Lyrheden