Description
Enhancing the in-car experience for working and resting passengers in Level 5 autonomous vehicles through an adaptive display on the vehicle's windshield. Users will control the system with haptic feedback knobs.
TLDR:
This research project seeks to understand how occupants of an autonomous vehicle interact with our interactive head-up display (HUD) windshield, which is human-centered and contextually aware. Most head-up displays currently available depict only speed or turn-by-turn navigation, without regard for the specific situation, and are confined to a small area of the windshield near the driver's instrument binnacle. Our research lab uses a transparent OLED HDTV to simulate an augmented-reality windshield. As the industry prepares for autonomous driving, we seek the use cases that emerge in the transition from Level 4 to Level 5 autonomy. This project will challenge the current paradigm and propose new use cases and technologies for an interactive windshield paired with a programmable haptic-knob user interface. The team will work in the GM-HMI Transportation design lab; the targeted vehicle platform is a mid-size sedan (e.g., a Chevy Malibu). The project will also use a transparent OLED TV and two XeelTech programmable haptic rotary knob controllers. The team is encouraged to push the boundaries of what an "interactive car" interior can be and what value it delivers to the driver (rider) and the passenger(s).
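To make the knob-to-windshield interaction concrete, the sketch below shows one way detent (rotation) and press events from a haptic rotary knob might drive selection of contextual HUD panels. It is a minimal illustration only: the class and event names are hypothetical, and it does not use the XeelTech SDK or any real display API.

```python
# Minimal sketch of a knob-driven HUD panel selector.
# All names (KnobEvent, HudController, the panel list) are hypothetical;
# real knob input and windshield rendering are out of scope here.

from dataclasses import dataclass
from typing import List


@dataclass
class KnobEvent:
    kind: str       # "rotate" or "press"
    steps: int = 0  # signed detent count for "rotate" events


class HudController:
    """Maps knob events to a highlighted panel on the windshield display."""

    def __init__(self, panels: List[str]):
        self.panels = panels
        self.index = 0  # currently highlighted panel

    def handle(self, event: KnobEvent) -> str:
        if event.kind == "rotate":
            # Wrap around the panel list as the knob is turned.
            self.index = (self.index + event.steps) % len(self.panels)
            return f"highlight: {self.panels[self.index]}"
        if event.kind == "press":
            # A press would open the highlighted panel on the windshield.
            return f"open: {self.panels[self.index]}"
        return "ignored"


if __name__ == "__main__":
    hud = HudController(["navigation", "media", "work mode", "rest mode"])
    # Simulated event stream standing in for real knob input.
    for ev in [KnobEvent("rotate", 1), KnobEvent("rotate", 2), KnobEvent("press")]:
        print(hud.handle(ev))
```

In a real prototype, the rotate handler could also reprogram the knob's detent profile (for example, stiffer detents when fewer panels are available), which is the kind of interaction the programmable haptic controllers are intended to enable.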