



Vision Inclusive Interface for Hyundai Vehicles


This capstone project was a collaboration with Hyundai to design an accessible in-vehicle interface for drivers with visual impairments.

Timeline
Jan 2025 - April 2025
Role
UX Researcher
Team
1 UX Researcher
1 Product Manager
3 UX Designers
My Tasks
User Research
Usability Testing
UX Interface Design

Overview



Discovery - Desk Research
Literature Insights – Accessible in-vehicle UI

Current ADAS (Advanced Driver Assistance Systems) and HMI (Human-Machine Interface) studies show that drivers with vision or motor impairments value situational-awareness aids such as lane-keeping and collision alerts more than full autonomy, and that usability improves with customisable alerts drivers can see, hear, and feel (high-contrast palettes, scalable text, adaptive brightness, voice and haptics). While standards such as ISO 15005, SAE J2364, and NHTSA glance-time limits exist, most studies test only one feature at a time and rarely include drivers with vision impairments, so the field still has no comprehensive benchmark for end-to-end dashboard accessibility.

Competitive Analysis

We benchmarked Hyundai’s early vision-inclusive UI against three leading EV interfaces—Tesla Model 3, Lucid Air, and Rivian R1S—across layout, voice, customisation, safety, and accessibility features.

Design Guidelines
We created evidence-based design guidelines for the team to follow throughout the design process.


Implications for Design Solutions
Based on the literature review, competitive analysis, and design guidelines, our solution needs to:
  1. Prioritise alerting and multimodal feedback (voice + haptic + high-contrast visuals) rather than autonomous take-over.
  2. Align with existing automotive HMI standards (ISO 15005, SAE J2364) while extending them with vision-specific settings (colour-blind palettes, glare reduction).
  3. Include real low-vision drivers in usability testing to close the validation gap and generate repeatable metrics (e.g., glance duration, task-success in night-drive scenarios).
Get to Know the Users
User Interviews
We ran early-stage interviews to move from desk insights to real driver voices: validating our initial findings, uncovering true needs, and pointing our team toward a design direction.
Survey
We also distributed surveys to reach more participants and gather more data.

Participants
Interview - 6 individuals
Survey - 15 individuals
Age 18 or older, hold a driver's license, and self-identify with permanent vision impairments (near- or farsightedness, colour-blindness, myopia, astigmatism)


Result Analysis - Affinity Map
We used an affinity map to sort the results into notes and surface common themes in participants' concerns to guide our design direction.


Participants' Key Problems
Drivers must meet legal vision standards, yet many challenges (e.g., glare, colour confusion) remain even with glasses.

Colour & contrast
Drivers find dashboard icons and map details blend together; traffic-light cues are hard to distinguish.

Night driving & weather conditions
Headlight glare, rain, and dim signage make evening driving stressful.

Menu overload
Small text, deep sub-menus, and pop-up messages take drivers' attention away from the road longer than they feel is safe.

Existing and Desired Technology
What drivers already use
Basic lane-keeping alerts and voice commands

What drivers want next
High-contrast or glare-reduced night mode, adjustable text/icon size, voice or haptic confirmation for critical alerts, and AI features that recall seating presets, announce hazards or speed-limit changes, and explain unfamiliar icons.

Brainstorm
Now that we knew the market and the users, we centred our solution on voice interaction, high-contrast text and icons, and proactive alerts for hazards and signal changes to improve accessibility for drivers with vision impairments.
This led us to prototype an AI-assisted navigation map tailored to low-vision users, including low-fidelity screens, user flow, and a sitemap covering the vehicle’s key interface areas.

Low Fidelity
User Flow

Sitemap



Concept Test
After designing the initial prototypes during ideation, we went back to our drivers to compare design ideas and determine the direction for the high-fidelity design.


Design

Style Guide

Main Screens
The main screens are the AI-supported accessible map, smart telltale notifications, and the cluster with driver-assist view.



Usability Test
With the high-fidelity, clickable prototype, we gave target users real-world tasks while we observed and measured 1) task completion, 2) number of steps, 3) time taken, 4) user rating, and open-ended feedback.
The purpose of the usability test was to evaluate the effectiveness of the accessibility interface and identify usability issues in key features for the final iterations.

Participants
8 individuals 
Age 18 or older, hold a driver's license, and self-identify with permanent vision impairments

Key Issues
Telltale Notifications
Users want a consistent icon, shorter alert text, and the ability to tap an alert to open a detailed panel with actionable steps.

Map
Road signs, traffic-light states, countdown seconds, and speed-limit info should appear directly on the map, and the arrival-time information should be consolidated into a single element to save space.

Visual Design
Distance-to-next-turn and ETA need to be more visible, and several elements (AI text, music card, map imagery) lack sufficient contrast.


Final Design

Video Demo
I created a 1-minute-30-second video walkthrough as a step-by-step demo of the interactive dashboard prototype.
Short on time? Skip ahead to the Key Features recap just below.


Key Features
Quick snapshot of the final design: the AI-supported accessible map, driver-assist cluster view, and smart telltale alerts.



Reflection
Impact
Limitation
This was a university project completed in collaboration with a Hyundai representative, so we could not build or test the solution in a production vehicle. Time constraints also kept me from exploring every feature I thought would be helpful, such as condition-specific driving modes (e.g., a red-green colour-blind mode) and dedicated night or bad-weather displays.
What I learned
Researching AI and accessibility revealed just how many disability profiles exist and how challenging truly inclusive design can be. By narrowing scope to vision-related needs, I saw the value of clear constraints and domain knowledge: understanding automotive HMI standards, visual-impairment nuances, and real driving contexts was essential before proposing solutions. 

Now I'm confident that, regardless of the domain, I can carry these learning skills into all future projects:
  1. Ramp up fast: dive into standards, user studies, and expert interviews to understand new territory.
  2. Set clear limits: define a manageable scope (here, vision-related needs) so the work stays achievable and useful.
  3. Translate insight into actionable solutions: turn fresh knowledge into evidence-based guidelines and prototypes.


Contact Me


LinkedIn         📩 alice.yu.sun@gmail.com        📞 (865) 801-7068

©2025 Alice Sun. All rights reserved.