19 October, 2021

The Bondurant Study: How Biometric Data Can Make Us Better Drivers

What makes a good driver?

We can all improve our driving skills in different ways. But what happens when you put a driver into a situation far beyond what most of us are used to? One way to answer this question is to examine high-performance driving.

To everyday drivers, the speeds considered normal on a racetrack can only be described as extreme. High-performance drivers are constantly put into scenarios where they must make complex decisions quickly and with very little information. Adding even more pressure, these decisions carry huge consequences, whether a loss of lap time or, far worse, a wrecked vehicle and an injured driver.

According to racecar driving instructors, the main bottleneck to going faster while maintaining safety lies in the driver’s ability to effectively perceive events, make snap decisions, and aptly control the vehicle. This not only requires years of training, but focused training that shapes the driver’s cognitive skills in a very specific way.

But how do trainers target the right events and experiences to shape a driver’s cognitive skills, like anticipation and attention, and shorten the delay between perception, decision, and response? And how can researchers effectively measure this learning process as it happens?

In this study, we put Smart Eye’s eye tracking technology and iMotions’ biometric analysis platform to the ultimate test: turning the physiological signals of race car drivers in extreme conditions into powerful insights that help them become better drivers.


The Bondurant study: How does our technology hold up under extreme conditions? 

In September 2020, Smart Eye’s Senior Sales Engineer Aaron Galbraith traveled to Arizona for a pilot data collection with a performance driving instructor while on site at the Bondurant High Performance Driving School.

The Bondurant High Performance Driving School (now renamed the Radford Racing School) specializes in instruction for high-performance driving but also helps everyday drivers enhance their skills. Located in Chandler, Arizona, the racetrack offers multiple challenging driving situations in a desert environment.

 

To get the desired results out of the study, the client had defined a number of requirements:

 

1. Smart Eye’s system had to be able to determine whether the student driver was looking out the windshield, out the driver’s side window, or out the passenger side window.

 

2. The system also had to be able to record and replay where the student driver was looking during difficult turns and maneuvers. This would let the trainer provide feedback on the spot, helping the student understand what they were doing wrong and correct bad driving behavior.

 

3. After the training session, the client wanted to be able to obtain video of the drivers’ faces to include in their report.

 

4. Lastly, the client wanted to be able to tie the vehicle data, including acceleration, braking, steering angle, and more, to other biometric data, like gaze or emotion, to compare how the student was feeling to how the vehicle was being handled in that moment.
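To make requirements 1 and 4 concrete, the sketch below shows one simple way such data could be combined after a session: classifying a horizontal gaze angle into a coarse in-cabin zone, then pairing each vehicle telemetry sample with the nearest-in-time gaze sample. This is a minimal illustration, not Smart Eye’s or iMotions’ actual pipeline; the field names (`t`, `yaw_deg`, `steering_deg`) and the zone boundaries are hypothetical, and a real deployment would use calibrated 3D world-model zones rather than a single yaw threshold.

```python
from bisect import bisect_left

def classify_gaze_zone(yaw_deg):
    """Map a horizontal gaze angle (degrees, 0 = straight ahead) to a
    coarse in-cabin zone. Boundaries here are illustrative only."""
    if yaw_deg < -45:
        return "driver_side_window"
    if yaw_deg > 45:
        return "passenger_side_window"
    return "windshield"

def align_nearest(vehicle_samples, gaze_samples):
    """Pair each vehicle sample with the nearest-in-time gaze sample.

    Both inputs are lists of dicts sorted by timestamp 't' (seconds).
    Field names are hypothetical, chosen for this sketch."""
    gaze_times = [g["t"] for g in gaze_samples]
    paired = []
    for v in vehicle_samples:
        i = bisect_left(gaze_times, v["t"])
        # Consider the two neighbouring gaze samples; keep the closer one.
        candidates = [j for j in (i - 1, i) if 0 <= j < len(gaze_samples)]
        j = min(candidates, key=lambda k: abs(gaze_times[k] - v["t"]))
        zone = classify_gaze_zone(gaze_samples[j]["yaw_deg"])
        paired.append({**v, "gaze_zone": zone})
    return paired

# Example: a driver glancing toward the driver's side window
# just as steering input increases into a corner.
vehicle = [{"t": 0.00, "steering_deg": -3.0},
           {"t": 0.05, "steering_deg": -18.0}]
gaze = [{"t": 0.01, "yaw_deg": -5.0},
        {"t": 0.06, "yaw_deg": -60.0}]
print(align_nearest(vehicle, gaze))
```

With the streams paired this way, a trainer could ask, for instance, whether a large steering input coincided with the driver’s gaze already being in the right zone, which is exactly the kind of perception-to-response comparison the client was after.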

 

 

Collecting data in a race car: Unique challenges and creative solutions 

To provide our client with the most precise eye tracking data possible, we used our most advanced remote eye tracker: Smart Eye Pro.

Smart Eye Pro features the best combined head box, field of view, and gaze accuracy on the market. It is also an incredibly flexible system, known for delivering accurate results no matter the circumstances. But would it hold up in conditions as extreme as the Bondurant High Performance Driving School racetrack? On site, we faced some unique challenges that really put Smart Eye Pro to the test:

 

Mounting the cameras 

Smart Eye Pro uses multiple cameras that can be placed freely in the vehicle’s interior. However, in this study, the training instructors weren’t allowed to make any permanent changes to the vehicle; the cameras could not, for example, be mounted by drilling holes in the dashboard or console area. The solution was double-sided tape, used to stick the cameras to the vehicle’s interior surfaces. While not as foolproof as a solid mount, the double-sided tape held up incredibly well considering the stress the system was under.

In this situation, the flexibility of the system was a great benefit, as the Smart Eye Pro cameras can be positioned freely in almost any environment. The system is also non-intrusive: subjects don’t have to wear any glasses or headgear to be tracked. Because of this, the cameras could be repositioned to find the best possible view of the driver’s head and eyes while preserving a realistic driving experience for the driver.

 

Vehicle vibration 

It’s hard to find a more challenging environment for a system than a race car. Switching between extreme acceleration, deceleration, braking, and cornering, the vehicle is quite literally put through its paces. Given all this vibration and mechanical stress, we were very pleased to see how well Smart Eye Pro’s camera calibration held up.

The Bondurant study showed us that even though Smart Eye Pro is an advanced research-grade system, it isn’t limited to static research environments, like a lab or stationary vehicle simulator.

 

The number of cameras 

How many cameras would be required to capture the driver’s gaze?

One of the benefits of Smart Eye Pro is that it’s scalable: the number of cameras required depends on the specific use case.

For this study, we decided to use Smart Eye’s traditional automotive set-up: three cameras on the dashboard and one next to the center console. This provides very good coverage of the windshield, side mirrors, instrument cluster, and center console. However, since Bondurant was not interested in tracking gaze on the center console region, we could have obtained similar gaze tracking results with just the three dashboard cameras.

 

Dramatic differences in lighting 

Smart Eye Pro can operate in any lighting condition. Since the system relies only on the light produced by its own flashes, external or ambient light is generally not a problem.

But while on the Bondurant track, the difference in lighting between the interior and exterior of the vehicle was so extreme we were forced to get creative.

The original plan was to use an over-the-shoulder scene camera, but because of the lighting contrast, this wasn’t possible. Instead, we repurposed the camera as a forward-facing respondent camera in iMotions, which let us use Affectiva’s facial expression analysis in iMotions to detect the driver’s emotions. But more on that in a little while.

 

Extreme temperatures 

During our visit, the Arizona August sun was pushing temperatures as high as 119 degrees Fahrenheit (48 degrees Celsius). Even so, the double-sided tape mounting solution held up from one day to the next. And since the vehicle cabin was air conditioned, the interior temperature stayed well within the system’s normal operating boundaries.

 

Data analysis: How to actually generate insight from the data gathered in the field

Despite challenging circumstances, we were able to collect very valuable data from the Bondurant racetrack. But how do we analyze this data to gain powerful insights from the training session? 

To find out how drivers’ biometrics can help us understand how signals like attention control and emotional reactions affect driving, keep reading on the Affectiva blog.

 

Are you curious about Smart Eye Pro, the iMotions platform or Affectiva’s Emotion AI? Click on the links to find out more or order a demo of the products:

Smart Eye Pro: https://www.smarteye.se/research-instruments/se-pro/

The iMotions platform: https://imotions.com/platform/

Affectiva’s Emotion AI: https://www.affectiva.com/experience-it/

Written by Fanny Lyrheden