May 15, 2013


Artificial Intelligence can unlock the door to an improved user experience, every time you step into the car 
By Shadi Mere

Google’s search engine has become famous for delivering a user experience that is personal, intuitive, simple and indispensable. In fact, for most consumers, the term “Googling” has become synonymous with performing a Web search. You may not realize it, but when you do a Google search, you’re tapping into artificial intelligence (AI), machine learning, cognitive science research and behavioral psychology – supported by a large investment in big data mining.

Google is not alone. An increasing number of companies – particularly those that sell portable consumer devices – realize that if their products don’t incorporate some form of artificial intelligence, consumers will perceive them as outdated. This perception has negative implications for their brand image – and auto manufacturers are not immune.

Leading research is uncovering the mechanism by which a brain forms a memory or learns a skill. The brain relies on clusters of neurons that hold bits of information. As synapses fire to strengthen the connections between these clusters, the frequency of activation and the chemical incentive (e.g., dopamine for pleasure or adrenaline for fear) solidify those connections. Furthermore, we don’t form memories as photographic impressions, but rather as loose recollections that we fill in with imagination, logic and our current emotional state, remembering them a little differently each time.

In artificial intelligence, neural networks are designed to recognize complex patterns, such as an image or even a mood. These networks are structured as layers of connected nodes that apply calculations. The nodes and their layers go through learning and training processes, with iterative inputs of patterns, to categorize information and behaviors. Once the network is fully trained, it can almost instantaneously recognize patterns and profiles in large amounts of complex data, at a speed not possible for humans. Examples include Google (and other Web) searches, spam filtering, speech recognition, robotics, medical diagnostic systems, and many popular recommendation algorithms (such as those employed by YouTube, Netflix and Amazon).
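For readers who want to see the idea in code, here is a minimal sketch of such a layered, trainable network in Python, fitted to a tiny hypothetical pattern; the layer sizes, data and training loop are illustrative assumptions only, not any production system.

```python
import numpy as np

# Hypothetical toy data: four input patterns, each labeled 0 or 1 (an XOR-like pattern).
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
y = np.array([[0], [1], [1], [0]], dtype=float)

rng = np.random.default_rng(1)
W1, b1 = rng.normal(size=(2, 4)), np.zeros(4)   # input layer -> 4 hidden nodes
W2, b2 = rng.normal(size=(4, 1)), np.zeros(1)   # hidden nodes -> 1 output node

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

# Iterative training: show the patterns repeatedly and strengthen or weaken
# the connection weights, loosely analogous to synapses firming up.
for _ in range(20000):
    hidden = sigmoid(X @ W1 + b1)
    out = sigmoid(hidden @ W2 + b2)
    d_out = (y - out) * out * (1 - out)                 # output-layer error signal
    d_hidden = (d_out @ W2.T) * hidden * (1 - hidden)   # error pushed back one layer
    W2 += 0.5 * hidden.T @ d_out;  b2 += 0.5 * d_out.sum(axis=0)
    W1 += 0.5 * X.T @ d_hidden;    b1 += 0.5 * d_hidden.sum(axis=0)

print(np.round(out, 2))   # learned responses, close to the 0/1 target pattern
```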

The HABIT cockpit concept

The Visteon HABIT cockpit concept applies this artificial intelligence approach to the user interface, using realistic 3-D graphics and animation to deliver a futuristic vision of human-machine interaction (HMI) in the car. The demonstration boldly moves the user experience in a novel direction: an AI logic engine with an evolving learning system that factors in short- and long-term memories. This is not an attempt to create a static, procedural cycling of use-case routines; instead it’s an AI learning system that adapts with each new input.
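As a rough illustration of what factoring in short- and long-term memories might look like in software, here is a hypothetical Python sketch; the class, decay rate and weighting are illustrative assumptions, not the actual HABIT implementation.

```python
from collections import defaultdict

class HabitMemory:
    """Hypothetical blend of short- and long-term habit signals.

    Short-term memory decays quickly, so recent trips dominate; long-term
    memory accumulates slowly over the life of the profile.
    """

    def __init__(self, decay=0.8, short_weight=0.6):
        self.short = defaultdict(float)   # fast-decaying recent behavior
        self.long = defaultdict(float)    # slow-growing lifetime behavior
        self.decay = decay
        self.short_weight = short_weight

    def observe(self, action):
        # Each new input updates the model rather than cycling a fixed routine.
        for key in self.short:
            self.short[key] *= self.decay
        self.short[action] += 1.0
        self.long[action] += 1.0

    def predict(self):
        # Blend the two memories to rank the most likely next action.
        total_long = sum(self.long.values()) or 1.0
        def score(a):
            return (self.short_weight * self.short[a]
                    + (1 - self.short_weight) * self.long[a] / total_long)
        return max(self.long, key=score)

memory = HabitMemory()
for action in ["radio_npr", "radio_npr", "call_home", "radio_npr"]:
    memory.observe(action)
print(memory.predict())   # prints "radio_npr"
```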

The goal of HABIT is to deliver an experience that improves each time the driver uses the ever-aware system. After days, weeks and years of owning this system, it is likely that the driver will be disappointed in any other system that does not get to know him or her the way HABIT does. Additionally, the driver will be able to transfer his or her personalized, secured store of knowledge and learning anytime he or she rents a car, shares a car, or upgrades to a new car. Finally, the system will always be relevant, upgrading organically over time to reflect the latest state-of-the-art AI algorithms – and the latest in our habits.
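As one hypothetical way such a portable, secured profile could work, the sketch below exports learned habits as data locked to an owner key (for example, one derived from a phone pairing or cloud account); the function names and key source are assumptions for illustration, not a description of Visteon’s design.

```python
import json, hmac, hashlib

def export_profile(habits: dict, owner_key: bytes) -> dict:
    """Hypothetical export of a learned profile, locked to the owner's key."""
    payload = json.dumps(habits, sort_keys=True)
    tag = hmac.new(owner_key, payload.encode(), hashlib.sha256).hexdigest()
    return {"payload": payload, "tag": tag}

def import_profile(blob: dict, owner_key: bytes) -> dict:
    """Reject the profile if it was not exported with the same owner key."""
    expected = hmac.new(owner_key, blob["payload"].encode(), hashlib.sha256).hexdigest()
    if not hmac.compare_digest(expected, blob["tag"]):
        raise ValueError("profile does not belong to this owner")
    return json.loads(blob["payload"])

key = b"owner-secret-from-phone-pairing"                  # assumed key source
blob = export_profile({"radio_npr": 12, "call_home": 3}, key)
print(import_profile(blob, key))                          # same habits, now in the rental car
```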

Our approach at Visteon starts with the Consumer Experience Model. This model is derived from extensive consumer research and is built around the notion of creating the ideal experience for drivers. As opposed to employing “technology for technology’s sake,” Visteon takes great care to ensure that any new technology is vetted against the Consumer Experience Model before it makes its way into a new product concept. Successful organizations do not shoehorn in technologies while ignoring their shortcomings. This is why our clinical studies are crucial in shaping our concepts. Research during the testing of HABIT showed that consumers liked the concept and rated it highly; however, they did have a few concerns and dislikes. The main issues were “Big Brother” worries and the privacy of their data. Consumers also did not care much for an Avatar “Infotendant”; they preferred to hear the infotendant’s voice but not see the image, which they viewed as distracting.

They also expressed dislike for “forced” habits – in other words, the system performing actions without first asking the consumer. This dynamic was particularly interesting because, in general, people like to think of themselves as unique and unpredictable, with changing tastes in music and habits. Most research contradicts this notion, showing that consumers are generally much more predictable than they like to believe. This creates an interesting paradox between what consumers believe to be true and reality. Perhaps over time, people’s negative perceptions of being predictable will be outweighed by the convenience of an interface that is much easier to operate.


(If you have trouble viewing the video above, you can also watch it on Visteon’s Multimedia page.)

In light of our consumer research findings, the HABIT concept changed to reflect a few important improvements:
  1. Personal data and profiles are secured and can be locked to the person’s phone, cloud account or personal voice identification print. Alternatively, the system can be switched to an agnostic identification mode altogether.
  2. The visual infotendant was eliminated with interactions limited to voice only.
  3. The system user interface (UI) looks similar to a typical UI. Visually it does not behave differently based on habits; it only shows faint “bread crumbs” highlighting a path to the default predicted habit. The UI also prompts the user to accept its recommendations; they are never forced (see the sketch after this list). Alternatively, the AI can be turned off altogether so that the system reverts to a more traditional operating mode.
  4. The approach starts simply and without a learning curve. However, the user can customize the system’s AI settings to a very deep level, satisfying both the passive user and the technophile.
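Following up on item 3, here is a hypothetical sketch of the prompt-not-force behavior: the system surfaces a “bread crumb” hint and a question, but never executes the predicted habit on its own, and the AI layer can be disabled entirely. The function and field names are illustrative assumptions.

```python
def suggest(habit_prediction, confidence, ai_enabled=True, threshold=0.7):
    """Hypothetical prompt-not-force handling of a predicted habit."""
    if not ai_enabled or confidence < threshold:
        return {"action": None, "breadcrumb": None}       # behave like a normal UI
    return {
        "action": None,                                   # nothing is executed yet
        "breadcrumb": habit_prediction["menu_path"],      # faint visual hint only
        "prompt": f"Start {habit_prediction['label']}?",  # the user must confirm
    }

prediction = {"menu_path": ["Media", "Radio", "NPR"], "label": "NPR"}
print(suggest(prediction, confidence=0.85))   # breadcrumb plus a question, no action taken
```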

Moving forward, the ongoing challenge for this technology will be creating a system that learns dynamically but does not cross the line into being intrusive or annoying. The underlying AI is sophisticated, yet it must be simple to perceive while remaining accurate and robust. Ultimately the goal is to create a system that makes us bond with it as it bonds with our habits.



The Visteon HABIT Cockpit Concept

The Future: Multiple Intelligence

If an area of the brain that processes sight is totally damaged, can we see again? And what does that question have to do with the car?

Research indicates that, contrary to long-held beliefs, areas of the brain are not rigid centers for specific tasks; they can evolve to take on sensory skills once thought to exist strictly in specialized regions. Given enough training and stimulation, the work of severely damaged areas that process sight can be delegated to other parts of the brain – like those that process hearing. Similarly, one day your car’s “brain” might compensate for a damaged sensor by having a different sensor (built for a different purpose) quickly learn to process the critical information.

There are also broad implications for how our brains handle multitasking, automated tasks, concentration and distraction. Our brains can adapt to certain tasks, experiences or even a new vehicle (HMI), but they have difficulty with true multitasking. (Instead, we have become very good at “task switching.”) AI, on the other hand, can multitask. For example, in the future one can imagine scenarios in which cars talk to one another and simultaneously apply learning to blind spots, weather conditions, traffic rules, impaired drivers and complex driving patterns – amounts of information that humans simply can’t process quickly enough to make the best decision. Eventually, AI will be able to adapt to perform tasks that (today) we consider uniquely human.
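To make the sensor analogy concrete, here is a minimal, purely hypothetical Python sketch of a car “brain” compensating for a failed sensor by leaning on another sensor’s estimate; the sensor names and fusion weights are illustrative assumptions, not a Visteon design.

```python
def fused_distance(radar_m, camera_m, radar_ok=True, camera_ok=True):
    """Hypothetical fallback fusion of two distance estimates (in meters)."""
    if radar_ok and camera_ok:
        return 0.7 * radar_m + 0.3 * camera_m   # nominal fusion weights (assumed)
    if radar_ok:
        return radar_m                          # camera lost: trust radar alone
    if camera_ok:
        return camera_m                         # radar lost: rely on the retrained camera estimate
    raise RuntimeError("no usable sensor input")

print(fused_distance(42.0, 44.5))                   # both sensors healthy
print(fused_distance(42.0, 44.5, radar_ok=False))   # compensating for a radar failure
```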

At some point, artificial intelligence will be ubiquitous, making our lives much more productive in both expected and unexpected ways. This technology could appear in vehicles as soon as five years from now.

Now, I’d like to hear from you. Are you ready for AI in the car? Which features in the car do you feel would benefit most from this technology?


Shadi Mere is an innovation manager with Visteon’s “Innovation Works” team. He works on advanced innovation, “disruptive” technology, human-machine interaction, creative design management, consumer experience research and high-technology trends. During his 17-year career, Shadi has worked in engineering design, advanced manufacturing, product development and strategy, and program management -- with a focus on bringing promising inventions to life. 
