September 29, 2016

A Sound Idea: Immersive Audio Concept Could Lead to Safer Driving
By Patrick Nebout
Director, advanced technologies and HUD product management, Visteon

Immersive audio integrates an advanced level of sound to the driving experience.

All those warning lights, chimes, buzzes and clicks that we experience on our daily commutes are beginning to turn driving into a guessing game. What does that light on the instrument panel mean? Does that sound indicate that I’m straying out of my lane or that a car is approaching me from behind? Why is my steering wheel shaking? Drivers are becoming overloaded with information that often requires immediate interpretation – and, even worse, that overload may lead them to simply ignore an alert rather than try to figure it out.

What if the auto industry could develop a way for visual, auditory and tactile warnings to complement each other to clearly define for the driver the situation and the action needed? That’s a challenge Visteon has set out to resolve.

Visteon’s multimodal human-machine interaction (HMI) concept – immersive audio for short – adds sound in a brand-new way to visual systems like head-up displays (HUDs) and video screens. Recognizing that the ear is always active and listens in all directions, Visteon engineers are creating sounds as “objects” that can move around the interior of the vehicle. Sound thus becomes more than just a warning; it works alongside visual signals to help the driver understand exactly what is going on outside the vehicle.

For a multimodal HMI, engineers collaborate with sound designers, like those in the movie industry, to create realistic sounds that seem to change their location in relation to the driver as the vehicle or a nearby object moves. For example:

  • If a bicycle is approaching in slow traffic, the driver will hear the simulated sound of bicycle tires and a bicycle chain as the bike gets closer. If the bicycle moves across the front of the car, the driver will hear the bicycle sounds move from the left side of the vehicle toward the right as the bike crosses.
  • If the vehicle is approaching a red light, the audio system could simulate a ticking clock in conjunction with an on-screen visual to indicate how soon the light will change to green.
  • When driving over a high bridge, the system can produce the sound of the vehicle’s wheels on the bridge and of crosswinds, alerting the driver to potential dangers. As the wind gets stronger, the wind sounds grow louder in the direction they are blowing.
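
To make the notion of sound “objects” concrete, here is a minimal sketch of how an object-based engine might translate a moving source – such as the crossing bicycle above – into per-speaker gains. The four-speaker layout, the distance-based panning law and all values are illustrative assumptions, not Visteon’s production algorithm.

    # Minimal sketch: distance-based amplitude panning of a sound "object"
    # across a four-speaker cabin layout. Speaker positions, the panning law
    # and the example values are illustrative assumptions.
    import math

    # Speaker positions in cabin coordinates (metres), origin at the driver's head
    # (x: positive to the right, y: positive toward the front of the car).
    SPEAKERS = {
        "front_left":  (-0.7,  1.0),
        "front_right": ( 0.7,  1.0),
        "rear_left":   (-0.7, -1.0),
        "rear_right":  ( 0.7, -1.0),
    }

    def pan_gains(obj_x, obj_y):
        """Per-speaker gains for a sound object at (obj_x, obj_y).

        Gains fall off with distance to each speaker and are normalised so
        total radiated power stays constant as the object moves."""
        raw = {}
        for name, (sx, sy) in SPEAKERS.items():
            distance = math.hypot(obj_x - sx, obj_y - sy)
            raw[name] = 1.0 / max(distance, 0.1)   # avoid blow-up right at a speaker
        norm = math.sqrt(sum(g * g for g in raw.values()))
        return {name: g / norm for name, g in raw.items()}

    # A bicycle crossing in front of the car: as it sweeps from 2 m on the left
    # to 2 m on the right, the gains migrate from the left speakers to the right.
    for x in (-2.0, 0.0, 2.0):
        print(x, {k: round(v, 2) for k, v in pan_gains(x, 2.0).items()})

Sweeping the object from left to right simply shifts energy from the left speakers to the right ones, which is what lets the listener track the sound’s position.
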
In customer research, this object-based sound system was coupled with head-up displays and video screens for a series of tests. Consumers gave very positive feedback on the immersive audio HMI. They felt audio was a good addition to visual assistance, noting that it should be applied carefully so it doesn’t become distracting. More than eight in 10 respondents felt immersive audio was safe, and three-quarters said they would be happy to have this support. The research showed that dynamic (moving) sound significantly improved the informative value of warning signals and that ADAS (advanced driver assistance systems) could benefit from this technology.

Object-based audio opens many new opportunities for driving assistance and creates an audio environment generally found only in concert halls and music studios. In fact, movie theaters soon will be equipped with similar immersive sound.

Visteon plans to work with automakers to help them create their own distinct sounds with a multimodal HMI. Visteon will provide the systems for driving sounds, visuals and haptic feedback, while each vehicle manufacturer creates unique sounds that are governed by an audio electronic control unit in the vehicle.

When autonomous driving arrives, this type of system could be key to ensuring a safe transition from autonomous to manual mode. Sound, along with visuals and haptic feedback, will help prepare the driver to resume control of the wheel. While in autonomous mode, the object-based system will allow occupants to listen to music and other entertainment in a format that seems more like a home theater than a car.

These new ways to combine sensory modes are on the verge of completely changing the driving experience, with sound ideas for safer and more enjoyable travel.

On September 20, 2016, Patrick Nebout gave a presentation titled “Multimodal HMI for Advanced Driver Assistance” at SAE Convergence.

As director, advanced technologies and HUD product management, Patrick Nebout is responsible for leading the introduction of new technologies and driving innovation for the company. He is also global director for the head-up display product line, responsible for product strategy and the related technologies. Patrick has 17 years’ experience in the automotive industry, plus four years in the aerospace field. He earned a master of science in physics from École Centrale Marseille in 1994 and a master’s in business from HEC Paris a year later.

September 8, 2016

In the Eye of the Beholder: A Biometric Approach to Automatic Luminance Control
By Paul Weindorf, Display Systems Technical Fellow, Visteon

Every driver these days depends on an array of displays on the instrument panel, in the center stack, in the mirror and even through head-up displays (HUDs). Since drivers rely on these displays for critical data, it’s vital that this information be clearly visible, whatever the light levels inside and outside the vehicle may be. However, variations in lighting – direct sunlight, passing or constant shadows – can sometimes make reading the displays challenging.

In some instances, the sun may be shining directly on the display, producing reflections that overwhelm the displayed information. In other cases, the driver may be looking out the front windshield into very bright sunlight and then glance at the instrument cluster without enough time for his or her eyes to adjust to the interior ambient light, again making the display information temporarily hard to see.

Automakers are becoming aware that constantly running displays at their highest luminance levels accelerates image burn-in on modern OLED screens and makes cooling the screens more difficult. Cranking up the brightness also draws a lot of power, impacting battery life, especially in electric vehicles.

Display developers are testing silicon light sensors, with one pointing forward out the windshield and others mounted at the corners of each display to detect the ambient light falling on the screen. These detectors automatically adjust the luminance of the displays, making them brighter or dimmer as lighting conditions require, which extends the life of OLED screens and keeps them cooler.
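
As a rough illustration of what such a control loop does, the sketch below turns sensor readings into a display luminance target. The sensor arrangement follows the description above, but the assumed display reflectance, contrast-ratio target and clamp limits are illustrative only, not a production dimming curve.

    # Minimal sketch: deriving a display luminance target from ambient-light
    # sensors. The ~8 % reflectance, 5:1 contrast target, forward-light bias
    # and clamp limits are illustrative assumptions.
    def target_display_luminance(forward_lux, corner_lux_readings,
                                 contrast_ratio=5.0,
                                 min_cd_m2=2.0, max_cd_m2=1000.0):
        """Pick a luminance that keeps content readable above the light
        reflected off the display face without over-driving the panel."""
        display_lux = max(corner_lux_readings)           # worst-case corner sensor
        reflected_cd_m2 = 0.08 * display_lux / 3.14159   # assumed matte reflectance
        # Hold displayed content at a fixed contrast ratio above reflections,
        # biased upward slightly when the forward view is very bright.
        target = contrast_ratio * reflected_cd_m2 + 0.01 * forward_lux
        return min(max(target, min_cd_m2), max_cd_m2)

    # Bright day, partial shade on the display face:
    print(target_display_luminance(50000.0, [800.0, 1200.0, 900.0, 950.0]))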

Visteon’s dual OLED display features auto luminance – which adjusts display brightness depending on surrounding conditions

More recently, however, Visteon has proposed a different, more accurate method of automatic luminance control: measuring the constantly changing diameter of the driver’s pupils to determine the appropriate brightness levels of displays. This method, called a total biometric automatic luminance control system, replaces silicon sensors with an infrared eye-gaze camera that precisely determines pupil diameter.

When the driver is looking outside on a sunny day, his or her pupils contract; when looking at the dimmer cockpit instruments, the pupils dilate. Using the science of “pupillometry” – first applied in the fields of psychology and medicine – the camera detects where the driver is looking and determines the brightness outside and inside the vehicle. The display system automatically adjusts its luminance based directly on the driver’s eye response to light, rather than on input from sensors.
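
One way to picture the idea is with a classic pupillometry fit. The sketch below uses the Moon–Spencer relation between adapting luminance and pupil diameter and simply inverts it: given a measured pupil diameter, what scene luminance best explains it? The formula itself is published pupillometry; treating it as the only factor in pupil size, and the unit conversion, are simplifying assumptions for illustration.

    # Minimal sketch: estimating scene luminance from pupil diameter by
    # inverting the Moon-Spencer pupillometry fit. Illustrative only; real
    # pupil size also depends on age, accommodation and cognitive load.
    import math

    def pupil_diameter_mm(luminance_cd_m2):
        """Moon-Spencer fit: pupil diameter (mm) vs. adapting field luminance.
        The original fit uses millilamberts; 1 mL is roughly 3.183 cd/m2."""
        mL = max(luminance_cd_m2, 1e-6) / 3.183
        return 4.9 - 3.0 * math.tanh(0.4 * math.log10(mL))

    def estimate_luminance_cd_m2(measured_diameter_mm, lo=0.01, hi=100000.0):
        """Bisect (in log space) for the luminance that explains this pupil size."""
        for _ in range(60):
            mid = math.sqrt(lo * hi)
            if pupil_diameter_mm(mid) > measured_diameter_mm:
                lo = mid   # predicted pupil too wide -> scene is brighter than mid
            else:
                hi = mid
        return math.sqrt(lo * hi)

    # A constricted 3 mm pupil implies a bright scene; a 6 mm pupil a dim cabin.
    for d_mm in (3.0, 6.0):
        print(d_mm, "mm ->", round(estimate_luminance_cd_m2(d_mm), 2), "cd/m2")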

At this year’s national Society for Information Display (SID) conference, our Visteon team discussed how pupillometry can be used to determine luminance value when the driver is looking outside. At the SID Detroit chapter conference later this month, I will propose using pupil-diameter measurements to determine the reflected luminance from the front of the display. The latter is a more difficult problem: when the driver glances from the road to the instrument cluster, the eye adapts to the dimmer light in an exponential fashion over a period of about 10 seconds, so an algorithm is needed to determine what the final luminance value should be once the eyes have completed their adjustment.
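
A minimal sketch of that extrapolation step, assuming a single-exponential adaptation model d(t) = d_final + (d_0 − d_final)·exp(−t/τ) and three equally spaced pupil samples taken just after the glance (the sample values below are hypothetical):

    # Minimal sketch: predicting the settled pupil diameter from three equally
    # spaced samples, assuming single-exponential adaptation. Sample values
    # and timing are hypothetical.
    def predict_settled_diameter(d0, d1, d2):
        """Extrapolate the asymptote of an exponential decay d_k = d_final + A * r**k
        from three equally spaced samples (requires d1 != d0, i.e. the pupil
        is still changing between samples)."""
        r = (d2 - d1) / (d1 - d0)          # per-step decay ratio, exp(-dt/tau)
        amplitude = (d1 - d0) / (r - 1.0)  # A = d0 - d_final
        return d0 - amplitude

    samples = (3.2, 4.1, 4.7)              # mm, taken 0.5 s apart after the glance
    print(round(predict_settled_diameter(*samples), 2), "mm once adaptation completes")

The settled diameter can then be converted to a luminance estimate – for instance with the inversion sketched above – and used to set the display’s final brightness without waiting the full 10 seconds.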

The primary value of potentially using this biometric system instead of silicon detectors is its straightforward accuracy. Silicon sensors are positioned at the corners of the display and, depending on lighting and shadow conditions, may not sense the true reflected luminance. The biometric approach measures what the human eye is actually seeing off the front of the display. When judging what the driver is looking at outside, silicon sensors look forward within a fixed field of view, but the driver may be looking to the left or right. The biometric eye-gaze camera uses the glint on the eyes from the infrared emitter to determine which direction the eyes are pointing and adjusts the display luminance based on what may be a greater or lesser intensity than in the straight-ahead field of view.

Another advantage of biometrics is that it allows designers to remove sensors from the display and avoid the need for a forward-looking sensor, providing a sleeker and more pleasing appearance. Furthermore, eye-gaze cameras are now being placed in cars for other purposes, such as to detect drowsiness, and the same camera can also drive automatic luminance control, at no additional cost. An eye-gaze camera can be used to adjust the luminance of projected HUD displays automatically, as well.

The Visteon team’s next step is to build a physical model of a biometric luminance control system based on these principles and technologies. Ultimately, such technologies will allow displays to adjust to the absolute light levels in and around the vehicle, as well as to the driver’s perception of those levels throughout the journey. This concept promises another eye-popping advancement from Visteon for tomorrow’s cars and trucks.

Paul Weindorf is a display technical fellow for Visteon with more than 35 years of experience in the electronics industry. He currently supports display system activities for production, development and advanced projects. His interest lies in display visibility, and he participated on the SAE J1757 committee. Weindorf graduated from the University of Washington with a bachelor’s degree in electrical engineering.

August 26, 2016

Heads up! Here come the HUDs
By James Farell, Director of Mechanical, Display and Optical Engineering, Visteon

Today’s travelers can feel more like pilots than drivers when they sit behind the wheel. They find themselves in a cockpit with an array of digital gauges and navigation systems, and increasingly they are enjoying the benefits of a head-up display, or HUD.

A HUD consists of a picture generation unit, a series of mirrors, and either a transparent combiner screen or the windshield itself to project information directly in front of the operator’s eyes. The first HUDs evolved from World War II-era reflector sights in fighter aircraft and the technology made its way to automobiles in the 1988 Oldsmobile Cutlass Supreme.

Today’s HUD displays information above the dashboard, such as speed, turn indicators, navigation data and the current radio station. It allows drivers to keep their eyes on the road without having to constantly shift their focus between the road and the instrument panel. HUDs project only the most important information that the driver needs at the time, thereby avoiding unnecessary distractions.

Early HUDs employed a monochrome vacuum fluorescent display that was not customizable. Today’s more advanced HUDs often use TFT (thin-film transistor) LCD (liquid crystal display) screens, like those found in some smartphones and flat-screen TVs, with an LED (light emitting diode) backlight to generate a very bright image.

HUD systems fall into two main classes: combiner and windshield. A combiner HUD uses a separate screen to reflect an image to the driver, while a windshield HUD projects images directly off the windshield. In both categories, a virtual image appears beyond the surface of the reflector, helping the eyes maintain focus on both the data and the roadway.

Head-up displays can be tailored for all markets, reflecting the variety and advancements that have been made with this technology; a short calculation after the list below shows how the field-of-view figures translate into apparent image size.
  • The entry-level HUD, designed for emerging markets, uses a passive TFT LCD or vacuum fluorescent system and a combiner with extremely high-quality optics, but with a relatively narrow field of view. This HUD often uses a mechanical, manual tilting screen, rather than the automatic or motor-driven covers available in higher-level HUDs.
  • The next step up is the low-end HUD, which is considerably brighter and offers a 4.5-by-1.5 degree field of view. With an active-matrix TFT LCD screen for sharper colors, a wider field of view and faster response, it employs simplified kinematics with a combiner that rotates down to lie flat when not in use.
  • The mid-level HUD, for the midrange automotive sector, also has a 4.5-by-1.5 degree field of view but a more complex combiner that completely retracts with a flap that covers it, for a more seamless appearance. It is about 70 percent brighter than the low-end HUD.
  • The high-end HUD is even brighter, with a larger TFT screen that offers a very wide 6-by-2.5 degree field of view. Its complex kinematics system incorporates a two-piece flap for efficient packaging, and the combiner screen can both shift and rotate.
  • The windshield HUD system uses no separate combiner but projects data via virtual images in front of the windshield. Its optics are more complex and its cost is higher than those of the other systems. While the same combiner HUD can be designed into different positions and locations in different types of vehicles, windshield HUDs must be designed for a specific windshield and are not as adaptable.
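
To put the field-of-view figures above in perspective, a short geometric calculation shows how large the virtual image appears. The 2.5-meter virtual-image distance is an assumed, illustrative value; only the angles come from the list above.

    # Minimal sketch: apparent size of a HUD virtual image for a given field of
    # view. The 2.5 m virtual-image distance is an assumption for illustration.
    import math

    def virtual_image_size_m(fov_h_deg, fov_v_deg, image_distance_m):
        """Width and height of a flat virtual image subtending the given angles."""
        width = 2.0 * image_distance_m * math.tan(math.radians(fov_h_deg / 2.0))
        height = 2.0 * image_distance_m * math.tan(math.radians(fov_v_deg / 2.0))
        return width, height

    for name, fov in (("mid-level", (4.5, 1.5)), ("high-end", (6.0, 2.5))):
        w, h = virtual_image_size_m(*fov, image_distance_m=2.5)
        print(f"{name}: {w * 100:.0f} cm x {h * 100:.0f} cm at 2.5 m")

Under that assumption, the high-end HUD’s 6-by-2.5 degree window corresponds to a virtual image roughly 26 cm wide and 11 cm tall.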

Drivers in Asia and Europe, and to a lesser degree in North America, have shown great interest in HUD systems. Sales are growing 30-40 percent annually, and their attraction is expected to increase now that as many as five types of HUDs are available for various levels of vehicles.

The next generation of HUD will offer an augmented reality system with a very wide field of view and an image that can seem to project right onto the roadway. Its information can overlay what the driver sees in the real world – a pedestrian ready to cross the street, a stop sign or an exit ramp, for instance. Augmented reality HUDs are expected to begin appearing in 2021 vehicles.
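
As a simple illustration of the overlay idea, the sketch below checks whether a detected object falls inside an augmented-reality HUD’s field of view. The eye-relative coordinate frame, the assumed 10-by-4 degree field of view and the example pedestrian are illustrative, not product specifications.

    # Minimal sketch: can a detected object be overlaid inside an AR HUD's
    # field of view? The 10-by-4 degree FOV and the coordinates are assumptions.
    import math

    def to_view_angles(x_right_m, y_up_m, z_forward_m):
        """Horizontal and vertical angles (degrees) of a point relative to the
        driver's straight-ahead line of sight, in eye-relative coordinates."""
        azimuth = math.degrees(math.atan2(x_right_m, z_forward_m))
        elevation = math.degrees(math.atan2(y_up_m, z_forward_m))
        return azimuth, elevation

    def inside_fov(azimuth_deg, elevation_deg, fov_h_deg=10.0, fov_v_deg=4.0):
        return (abs(azimuth_deg) <= fov_h_deg / 2.0
                and abs(elevation_deg) <= fov_v_deg / 2.0)

    # A pedestrian 1.5 m to the right of the line of sight, at eye height, 20 m ahead:
    az, el = to_view_angles(1.5, 0.0, 20.0)
    print(round(az, 1), round(el, 1), "overlay possible:", inside_fov(az, el))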

When autonomous driving becomes the norm, occupants may be using HUD systems during automated driving periods for videoconferences, rather than phone calls. HUD technology has virtually no limits on what can be displayed. The task of the auto industry is to ensure that HUDs continue to add to safety by reducing driver distractions while also helping prepare for the day when eyes-off-the-road driving will transition to eyes-on-the-screen activities.

Jim Farell leads Visteon’s technology development for all display products, including head-up displays, center information displays, and optics for displays and instrument clusters. During his 24 years at Visteon and Ford, Jim has led teams delivering a diverse portfolio of electronics products including Visteon’s first commercial infotainment platform and first V2X platform. He has a bachelor’s degree from the GMI Engineering and Management Institute, and a master’s in electrical engineering from Stanford University.