By Patrick Nebout
Director, advanced technologies and HUD product management, Visteon
Immersive audio adds an advanced layer of sound to the driving experience.
All those warning lights, chimes, buzzes and clicks that we experience on our daily commutes are beginning to turn driving into a guessing game. What does that light on the instrument panel mean? Does that sound indicate that I'm straying out of my lane, or that a car is approaching from behind? Why is my steering wheel shaking? Drivers are becoming overloaded with information that often requires immediate interpretation; even worse, they may simply ignore an alert rather than try to decipher it.
What if the auto industry could develop a way for visual, auditory and tactile warnings to complement each other, clearly conveying to the driver both the situation and the action needed? That's a challenge Visteon has set out to resolve.
Visteon's multimodal human-machine interaction (HMI) concept, immersive audio for short, adds sound in a brand-new way to visual systems like head-up displays (HUDs) and video screens. Recognizing that the ear is always active and listens in all directions, Visteon engineers are creating sounds as "objects" that can move around the interior of the vehicle. Sound thus becomes more than just a warning; it works alongside visual signals to help the driver understand exactly what is going on outside the vehicle.
For a multimodal HMI, engineers collaborate with sound designers, like those in the movie industry, to create realistic sounds that seem to change their location in relation to the driver as the vehicle or a nearby object moves. For example:
- If a bicycle is approaching in slow traffic, the driver will hear the simulated sound of bicycle tires and a bicycle chain as the bike gets closer. If the bicycle moves across the front of the car, the driver will hear the bicycle sounds move from the left side of the vehicle toward the right as the bike crosses.
- If the vehicle is approaching a red light, the audio system could simulate a ticking clock in conjunction with an on-screen visual to indicate how soon the light will change to green.
- When driving over a high bridge, the system can produce the sound of the vehicle’s wheels on the bridge and of crosswinds, alerting the driver to potential dangers. As the wind gets stronger, the wind sounds grow louder in the direction they are blowing.
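The moving-sound examples above rest on object-based audio rendering: each alert is a sound "object" with a position, and the renderer derives speaker gains from that position as it changes. As a minimal illustration of the idea (not Visteon's implementation, whose details are not public), a constant-power stereo pan law maps an object's left-to-right azimuth to two channel gains:

```python
import math

def pan_gains(azimuth_deg: float) -> tuple[float, float]:
    """Constant-power stereo pan for a sound object.

    azimuth_deg runs from -90 (fully left) to +90 (fully right).
    Returns (left_gain, right_gain) with L^2 + R^2 == 1, so the
    perceived loudness stays constant as the object moves.
    """
    # Map the azimuth onto a pan angle between 0 (left) and pi/2 (right).
    theta = (azimuth_deg + 90.0) / 180.0 * (math.pi / 2.0)
    return math.cos(theta), math.sin(theta)

# A bicycle crossing the front of the car from left to right:
# its sound object sweeps across the stereo field as the azimuth changes.
for az in (-90, -45, 0, 45, 90):
    left, right = pan_gains(az)
    print(f"azimuth {az:+4d} deg  L={left:.3f}  R={right:.3f}")
```

A production system would pan across many more loudspeakers (and add distance cues such as level and filtering), but the principle is the same: object position in, per-channel gains out.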
In customer research, this object-based sound system was coupled with head-up displays and video screens for a series of tests. Consumers gave very positive feedback on the immersive audio HMI. They felt audio was a good addition to visual assistance support, noting that it should be applied carefully so it doesn't become distracting. More than eight in 10 respondents felt immersive audio was safe, and three-quarters said they would be happy to have this support. The research showed that dynamic (moving) sound significantly improved the informative value of warning signals and that ADAS (advanced driver assistance systems) could benefit from this technology.
Object-based audio opens many new opportunities for driving assistance and creates an audio environment generally found only in concert halls and music studios. In fact, movie theaters will soon be equipped with similar immersive sound.
Visteon plans to work with automakers to help them create their own distinct sounds with a multimodal HMI. Visteon will provide the systems for driving sounds, visuals and haptic feedback, while each vehicle manufacturer creates unique sounds that are governed by an audio electronic control unit in the vehicle.
When autonomous driving arrives, this type of system could be key to ensuring a safe transition from autonomous to manual mode. Sound, along with visuals and haptic feedback, will help prepare the driver to resume control of the wheel. While in autonomous mode, the object-based system will allow occupants to listen to music and other entertainment in a format that seems more like a home theater than a car.
These new ways to combine sensory modes are on the verge of completely changing the driving experience, with sound ideas for safer and more enjoyable travel.
On September 20, 2016, Patrick Nebout gave this presentation, titled "Multimodal HMI for Advanced Driver Assistance," at SAE Convergence.
As director, advanced technologies and HUD product management, Patrick Nebout is responsible for introducing new technologies and driving innovation for the company. He is also global director of the head-up display product line, responsible for its product strategy and related technologies. Patrick has 17 years of experience in the automotive industry, plus four years in aerospace. He earned a master of science in physics from École Centrale Marseille in 1994 and a master's degree in business from HEC Paris a year later.