September 8, 2016

In the Eye of the Beholder: A Biometric Approach to Automatic Luminance Control
By Paul Weindorf, Display Systems Technical Fellow, Visteon


Every driver these days depends on an array of displays on the instrument panel, in the center stack, in the mirror and even through head-up displays (HUDs). Since drivers rely on these displays for critical data, it’s vital that this information be clearly visible, whatever the light levels inside and outside the vehicle may be. However, variations in lighting, such as direct sunlight and passing or persistent shadows, can make reading the displays challenging.

In some instances, the sun may shine directly on the display, producing reflections that overwhelm the displayed information. In other cases, the driver may look out the front windshield into very bright sunlight and then glance at the instrument cluster before his or her eyes have had time to adjust to the interior ambient light, again making the display momentarily hard to read.

Automakers are becoming aware that constantly running displays at their highest luminance levels accelerates image burn-in on modern OLED screens and makes cooling the screens more difficult. Cranking up the brightness also draws a lot of power, impacting battery life, especially in electric vehicles.

Display developers are testing silicon light sensors, with one pointing forward out the windshield and others mounted on the corners of each display to detect the ambient light falling on the screen. These detectors automatically adjust the luminance of the displays, making them brighter or dimmer as lighting conditions require, significantly extending the life of OLED screens and keeping them cooler.
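The article doesn’t describe the sensor-to-brightness mapping these systems use, but the general idea can be sketched as follows: because the eye responds roughly logarithmically to light, a common approach is to interpolate the display’s luminance target in log space between its minimum and maximum output. All constants below are illustrative assumptions, not Visteon figures.

```python
import math

def target_luminance(ambient_lux,
                     min_nits=1.0, max_nits=1000.0,
                     min_lux=1.0, max_lux=100000.0):
    """Map an ambient-light sensor reading (lux) to a display
    luminance target (nits) by interpolating in log space, so each
    decade of ambient light produces a proportional brightness step.
    The limits here are placeholder values for illustration."""
    lux = min(max(ambient_lux, min_lux), max_lux)
    frac = (math.log10(lux) - math.log10(min_lux)) / (
        math.log10(max_lux) - math.log10(min_lux))
    return min_nits * (max_nits / min_nits) ** frac
```

In a real system the readings from the corner-mounted sensors would be combined (for example, by taking the maximum) before feeding this mapping, so that a shadow falling on one corner doesn’t dim the whole display.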


Visteon’s dual OLED display features auto luminance – which adjusts display brightness depending on surrounding conditions


More recently, however, Visteon has proposed a different, more accurate method of automatic luminance control: measuring the constantly changing diameter of the driver’s pupils to determine the appropriate brightness levels of displays. This method, called a total biometric automatic luminance control system, replaces silicon sensors with an infrared eye-gaze camera that precisely determines pupil diameter.

When the driver is looking outside on a sunny day, his or her pupils will contract; when looking at the cockpit instruments, the pupils grow larger. Using the science of “pupillometry” – first applied in the fields of psychology and medicine – the camera detects where the driver is looking and determines the brightness outside and inside the vehicle. The display system automatically adjusts its luminance based directly on the driver’s eye response to light, rather than on input from sensors.
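The relationship between adapting luminance and pupil diameter described above has well-known empirical fits in the pupillometry literature. As a minimal sketch, the classic Moon–Spencer (1944) formula can be inverted to recover the luminance the eye is responding to from a measured pupil diameter; the article doesn’t say which model Visteon’s system uses, so this is purely illustrative.

```python
import math

def pupil_diameter_mm(luminance_cd_m2):
    """Moon–Spencer model: pupil diameter (mm) as a function of
    adapting luminance (cd/m^2). Constants are from the published fit."""
    return 4.9 - 3.0 * math.tanh(0.4 * math.log10(luminance_cd_m2))

def luminance_from_pupil(diameter_mm):
    """Invert the model: estimate the adapting luminance (cd/m^2)
    from a pupil diameter measured by the eye-gaze camera.
    Valid for diameters strictly between 1.9 and 7.9 mm."""
    return 10.0 ** (math.atanh((4.9 - diameter_mm) / 3.0) / 0.4)
```

Combined with gaze direction, this lets the system infer the scene luminance the driver is actually adapted to, rather than the luminance a fixed sensor happens to see.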

At this year’s national Society for Information Display (SID) conference, our Visteon team discussed how pupillometry can be used to determine the luminance value when the driver is looking outside. At the SID Detroit chapter conference later this month, I will propose using pupil-diameter measurements to determine the reflected luminance from the front of the display. The latter is the more difficult problem: when the driver glances from the road to the instrument cluster, the eye adapts to the dimmer light exponentially over a roughly 10-second period, so an algorithm must predict the final luminance value the eyes will settle on before the adjustment is complete.
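Visteon’s actual algorithm isn’t disclosed, but one generic way to handle an exponential adaptation curve is to extrapolate its steady state from early samples. If pupil diameter follows d(t) = a + b·exp(−t/τ), then three equally spaced measurements determine the asymptote a in closed form, so the system need not wait the full 10 seconds; this sketch assumes a clean single-exponential decay.

```python
def steady_state(d0, d1, d2):
    """Estimate the settled value of an exponential decay
    d(t) = a + b*exp(-t/tau) from three equally spaced samples.
    For such a sequence, (d0*d2 - d1^2) / (d0 + d2 - 2*d1) = a."""
    denom = d0 + d2 - 2.0 * d1
    if abs(denom) < 1e-12:  # samples are linear: no decay detected
        return d2
    return (d0 * d2 - d1 * d1) / denom
```

The predicted steady-state pupil diameter can then be mapped to a final display luminance target immediately, instead of chasing the eye through its whole adaptation curve. Real pupil data is noisy, so a production system would more likely fit the exponential by least squares over many samples; the closed form above just shows the principle.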

The primary value of potentially using this biometric system instead of silicon detectors is its straightforward accuracy. Silicon sensors are positioned at the corners of the display, and depending on the lighting and shadow conditions, they may not sense the true reflected luminance. The biometric approach measures what the human eye is actually seeing off the front of the display. Likewise, when gauging what the driver is looking at outside, silicon sensors face forward within a fixed field of view, but the driver may be looking to the left or right. The biometric eye-gaze camera uses the glint on the eyes from its infrared emitter to determine which direction the eyes are pointing, and adjusts the display luminance for a scene that may be brighter or dimmer than the straight-ahead field of view.

Another advantage of biometrics is that it allows designers to remove sensors from the display and avoid the need for a forward-looking sensor, providing a sleeker and more pleasing appearance. Furthermore, eye-gaze cameras are now being placed in cars for other purposes, such as to detect drowsiness, and the same camera can also drive automatic luminance control, at no additional cost. An eye-gaze camera can be used to adjust the luminance of projected HUD displays automatically, as well.

The Visteon team’s next step is to build a physical model of a biometric luminance control system based on these principles and technologies. Ultimately, such technologies will allow displays to adjust to the absolute light levels in and around the vehicle, as well as to the driver’s perception of those levels throughout the journey. This concept promises another eye-popping advancement from Visteon for tomorrow’s cars and trucks.


Paul Weindorf is a display technical fellow for Visteon with more than 35 years of experience in the electronics industry. He currently supports display system activities for production, development and advanced projects. His interest lies in the display visibility arena and he participated in the SAE J1757 committee. Weindorf graduated from the University of Washington with a bachelor’s degree in electrical engineering.