


Discovery - Desk Research
Literature Insights – Accessible In-Vehicle UI

Current ADAS (Advanced Driver Assistance Systems) and HMI (Human-Machine Interface) studies show that drivers with vision or motor impairments value situational-awareness aids such as lane-keeping and collision alerts more than full autonomy, and that usability improves with customisable alerts drivers can see, hear, and feel (high-contrast palettes, scalable text, adaptive brightness, voice + haptics). Standards such as ISO 15005, SAE J2364, and the NHTSA glance-time limits exist, but because most studies test only one feature at a time and rarely include drivers with vision impairments, the field still has no comprehensive benchmark for end-to-end dashboard accessibility.

Competitive Analysis

We benchmarked Hyundai’s early vision-inclusive UI against three leading EV interfaces—Tesla Model 3, Lucid Air, and Rivian R1S—across layout, voice, customisation, safety, and accessibility features.

Design Guidelines
We created evidence-based design guidelines for the team to follow and keep in mind throughout the design process.


Implications for Design Solutions
Based on the literature review, competitive analysis, and design guidelines, our solution needs to:
  1. Prioritise alerting and multimodal feedback (voice + haptic + high-contrast visuals) rather than autonomous take-over.
  2. Align with existing automotive HMI standards (ISO 15005, SAE J2364) while extending them with vision-specific settings (colour-blind palettes, glare reduction).
  3. Include real low-vision drivers in usability testing to close the validation gap and generate repeatable metrics (e.g., glance duration, task success in night-drive scenarios).

