Gaze-controlled HUD

Though virtual touch is useful for existing head-down displays (HDDs), it still requires drivers to take their eyes off the road to interact. I investigated a see-through head-up display (HUD) and explored hands-free interaction in the form of an eye gaze-controlled interface, which lets the user select a target simply by looking at its location on the screen. The proposed interactive HUD can be deployed on the windshield of a vehicle and does not occlude road vision. The gaze-controlled HUD was tested both in a driving simulator and in actual vehicles. Drivers could operate the display as fast as an existing touchscreen, with improved driving performance compared to the existing touchscreen-based HDD.
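The core of "look at the target to select it" is typically a dwell-time mechanism: a target fires once the gaze stays on it for a continuous period. The sketch below illustrates that idea only; the dwell threshold, target radius, and class names are my illustrative assumptions, not the values or code used in the actual HUD.

```python
import math

# Illustrative dwell-time gaze selection (assumed mechanism, not the
# HUD's actual implementation). A target is selected when gaze samples
# stay within its radius for a continuous dwell period.

DWELL_TIME_MS = 1000      # assumed dwell threshold
TARGET_RADIUS_PX = 60     # assumed selectable-icon radius

class DwellSelector:
    def __init__(self, targets):
        self.targets = targets        # {name: (x, y) icon centre}
        self.current = None           # target currently fixated
        self.dwell_start = None       # time the current fixation began

    def update(self, gaze_x, gaze_y, t_ms):
        """Feed one gaze sample; return a target name when selected."""
        hit = None
        for name, (tx, ty) in self.targets.items():
            if math.hypot(gaze_x - tx, gaze_y - ty) <= TARGET_RADIUS_PX:
                hit = name
                break
        if hit != self.current:
            # Gaze moved to a different target (or off all targets):
            # restart the dwell timer.
            self.current, self.dwell_start = hit, t_ms
            return None
        if hit is not None and t_ms - self.dwell_start >= DWELL_TIME_MS:
            self.dwell_start = t_ms   # reset so selection does not repeat
            return hit
        return None
```

Feeding the selector noisy samples near an icon centre triggers a selection once the dwell threshold elapses; glancing away resets the timer, which is what makes dwell selection robust to brief saccades across other icons.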

UI Design of HUD:

A survey of people with driving experience was conducted through an online system, collecting data on user interaction requirements and the functions to be displayed on the dashboard. The key requirements are listed below:

  • Icons should be easy to recognize from users' prior learning experiences, so that users can relate to them.

  • Icons should fit into the car environment alongside the various other existing icons.

  • Icons should not overlap with other functions, which could cause confusion.

The survey collected data from 90 participants on the functionalities required on the dashboard. Before starting and after ending a trip, users prefer to operate seat adjustment, window glass rolling and AC vent controls, and to check proximity sensor alerts and tyre pressure status. While driving, users prefer dashboard controls for navigation, music, phone notifications, distraction indication, speed indication, fuel indication, window glass rolling, battery indication, vent control, warnings and alerts to avoid accidents, headlight adjustment, glass defrosting and reminder alerts.

We selected a set of functionalities to be offered by the HUD and designed four different glyph icons for each function. We then undertook a second survey on the icon designs, asking participants to select the most appropriate and relatable icon for each function. The most popular icon for each function was incorporated into the HUD. The figure below shows a snippet from the survey.

The figure below shows the homepage of the final HUD UI, which incorporates all of the above considerations for icon symbol, icon size and ambient lighting conditions. The icons in the UI were selected based on the survey results. All icons are placed on the screen in an elliptical arrangement that covers the entire screen, spread out enough to provide easy selection feedback while moving the pointer with eye gaze. As the picture shows, under bright lighting conditions the icons are dynamically set to the reverse-contrast condition (dark icons), and vice versa.
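The elliptical layout and the ambient-light contrast switch described above can be sketched in a few lines. Screen dimensions, margins, and the lux threshold below are illustrative assumptions, not the values used in the actual UI.

```python
import math

# Sketch of the elliptical icon layout: n icons evenly spaced on an
# ellipse spanning the screen, plus a reverse-contrast switch driven by
# ambient light. All numeric parameters are illustrative assumptions.

def elliptical_layout(n_icons, width=1280, height=720, margin=80):
    """Return (x, y) centres of n icons evenly spaced on an ellipse."""
    cx, cy = width / 2, height / 2        # screen centre
    rx, ry = cx - margin, cy - margin     # ellipse semi-axes
    positions = []
    for i in range(n_icons):
        theta = 2 * math.pi * i / n_icons
        positions.append((cx + rx * math.cos(theta),
                          cy + ry * math.sin(theta)))
    return positions

def icon_style(ambient_lux, bright_threshold=500):
    """Pick reverse-contrast (dark) icons when the scene is bright."""
    return "dark" if ambient_lux >= bright_threshold else "light"
```

Spacing icons on an ellipse rather than a grid keeps every icon near the screen edge and maximizes the angular separation between targets, which reduces accidental dwell on a neighbouring icon while the gaze travels across the display.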

The figure below illustrates the working of the eye-gaze-controlled HUD integrated with a driving simulator.

I undertook five user studies. The first study compared the new system in a dual-task setting. The second and third studies separately evaluated the eye gaze and finger tracking systems inside a running vehicle, including the effect of on-board vibration. The fourth study evaluated the performance of the multimodal system with respect to the positioning of the sensors and display relative to the driver.

User Studies:

Demo Video:

Summary:

  • Improved driving performance in terms of mean deviation from the lane

  • Selection time and Task Load Index score were significantly lower for the gaze-controlled interface with hotspots

  • Eye gaze tracking can be proposed as a modality for operating the dashboard in a car

Publications:

  • Prabhakar, G., Ramakrishnan, A., Murthy, L. R. D., Sharma, V. K., Madan, M., Deshmukh, S. and Biswas, P. (2019). Interactive Gaze & Finger Controlled HUD for Cars. Journal on Multimodal User Interfaces, Springer, ISSN 1783-8738. DOI: 10.1007/s12193-019-00316-9

  • Prabhakar, G. and Biswas, P. (2018). Eye Gaze Controlled Projected Display in Automotive and Military Aviation Environments. Multimodal Technologies and Interaction, 2(1), ISSN 2414-4088 DOI: 10.3390/mti2010001

  • Biswas, P., Prabhakar, G., Rajesh, J., Pandit, K., & Halder, A. (2017, July). Improving Eye Gaze Controlled Car Dashboard Using Simulated Annealing. In Proceedings of the 31st British Computer Society Human Computer Interaction Conference (p. 39). DOI: 10.14236/ewic/HCI2017.39

  • Aprana, R., Modiksha, M., Prabhakar, G., Deshmukh, S., Biswas, P., (2019, October). Eye Gaze Controlled Head-up Display. In International Conference on ICT for Sustainable Development 2019. DOI: 10.1007/978-981-15-0630-7_46

  • Babu, M. D., Biswas, P., Prabhakar, G., JeevithaShree, D. V., and Murthy, L. R. D. Eye Gaze Tracking in Military Aviation, Indian Air Force AVIAMAT 2018

  • Babu, M. D., JeevithaShree, D. V., Prabhakar, G., and Biswas, P. Using Eye Gaze Tracker to Automatically Estimate Pilots’ Cognitive Load. 50th International Symposium of the Society of Flight Test Engineers (SFTE)

Patent:

  • Prabhakar, G., and Biswas, P. Wearable Laser Pointer for Automotive User Interface, Application No.: 201741035044, PCT International Application No. PCT/IB2018/057680