Vision Inclusive Interface for Hyundai Vehicles
This capstone project was a collaboration with Hyundai to design an accessible in-vehicle interface for drivers with visual impairments.
Timeline
Jan 2025 - April 2025
Role
UX Researcher
Team
1 UX Researcher
1 Product Manager
3 UX Designers
My Tasks
User Research
Usability Testing
UX Interface Design
Discovery - Desk Research
Literature Insights – Accessible in-vehicle UI
Current ADAS (Advanced Driver Assistance Systems) and HMI (Human-Machine Interface) studies show that drivers with vision or motor impairments value situational-awareness aids, such as lane-keeping and collision alerts, more than full autonomy, and that usability improves with customisable alerts drivers can see, hear, and feel (high-contrast palettes, scalable text, adaptive brightness, voice + haptics). Standards such as ISO 15005, SAE J2364, and the NHTSA glance-time limits do exist, but because most studies test only one feature at a time and rarely include drivers with vision impairments, the field still has no comprehensive benchmark for end-to-end dashboard accessibility.
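To make the glance-time constraint concrete, here is a minimal sketch of how a prototype could check recorded glance durations against the commonly cited NHTSA guidance (roughly 2-second single glances and about 12 seconds of total eyes-off-road time per task); the function name and thresholds are illustrative assumptions, not part of any production system:

```typescript
// Illustrative check of recorded off-road glance durations (seconds)
// against the commonly cited NHTSA visual-manual guidance:
// no single glance over ~2 s and no more than ~12 s in total per task.
const MAX_SINGLE_GLANCE_S = 2;
const MAX_TOTAL_GLANCE_S = 12;

function meetsGlanceGuidance(glances: number[]): boolean {
  const longest = Math.max(...glances);
  const total = glances.reduce((sum, g) => sum + g, 0);
  return longest <= MAX_SINGLE_GLANCE_S && total <= MAX_TOTAL_GLANCE_S;
}

// Example: a task that needed four short glances at the dashboard.
console.log(meetsGlanceGuidance([1.2, 0.8, 1.5, 1.1])); // true
```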
Competitive Analysis
Design Guideline
Implications for Design Solutions
Based on the literature review, competitive analysis, and design guidelines, our solution needs to:
- Prioritise alerting and multimodal feedback (voice + haptic + high-contrast visuals) rather than autonomous take-over.
- Align with existing automotive HMI standards (ISO 15005, SAE J2364) while extending them with vision-specific settings (colour-blind palettes, glare reduction).
- Include real low-vision drivers in usability testing to close the validation gap and generate repeatable metrics (e.g., glance duration, task-success in night-drive scenarios).
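As a rough sketch of what these implications could look like in an interface model, the TypeScript below captures vision-specific settings and multimodal alert routing; every name and default here is a hypothetical illustration, not Hyundai code:

```typescript
// Hypothetical data model for vision-specific alert settings;
// names and defaults are illustrative, not from any vehicle SDK.
type AlertChannel = "voice" | "haptic" | "visual";

interface AccessibilitySettings {
  palette: "default" | "high-contrast" | "deuteranopia" | "protanopia";
  textScale: number;          // 1.0 = standard text size
  glareReduction: boolean;    // dims and re-tints the display at night
  enabledChannels: AlertChannel[];
}

interface HazardAlert {
  kind: "lane-departure" | "collision-warning" | "signal-change";
  urgency: "info" | "warning" | "critical";
}

// Route one alert to every channel the driver has enabled, so a
// missed visual cue is still caught by voice or haptics.
function dispatchAlert(alert: HazardAlert, settings: AccessibilitySettings): string[] {
  return settings.enabledChannels.map(
    (channel) => `${channel}: ${alert.urgency} ${alert.kind}`
  );
}

const lowVisionDefaults: AccessibilitySettings = {
  palette: "high-contrast",
  textScale: 1.5,
  glareReduction: true,
  enabledChannels: ["voice", "haptic", "visual"],
};

console.log(dispatchAlert({ kind: "lane-departure", urgency: "warning" }, lowVisionDefaults));
```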
Participants
Survey - 15 individuals
Result Analysis - Affinity Map
We used an affinity map to sort the survey results into notes and surface common themes in participants' concerns, which guided our design direction.
Participants Key Problems
Drivers must meet legal vision standards, yet many challenges (e.g., glare, color confusion) remain even with glasses.
Existing and Desired Technology
Now that we knew the market and the users, we centred our solution on voice interaction, high-contrast text and icons, and proactive alerts for hazards and signal changes to improve accessibility for drivers with vision impairments.
This led us to prototype an AI-assisted navigation map tailored to low-vision users, including low-fidelity screens, user flow, and a sitemap covering the vehicle’s key interface areas.
Concept Test
After the initial ideation and prototype, we went back to our drivers to compare the design ideas and determine the direction for the high-fidelity design.
Design
Style Guide
Main Screens
The main designs are the AI-supported accessible map, smart tell-tale notifications, and the cluster with a driver-assistant view.
Usability Test
With the high-fidelity, clickable prototype, we gave target users real-world tasks while we observed and measured 1) task completion, 2) number of steps, 3) time taken, and 4) user rating, alongside open-ended feedback.
The purpose of the usability test was to evaluate the effectiveness of the accessibility interface and to identify usability issues in key features ahead of the final iterations.
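To show how those four measures could be aggregated across participants, here is a small sketch in TypeScript; the data shape and function names are hypothetical illustrations, not our actual study tooling:

```typescript
// Hypothetical shape of one participant's attempt at a test task.
interface TaskAttempt {
  taskId: string;
  completed: boolean;
  steps: number;        // taps/clicks to finish (or abandon) the task
  seconds: number;      // time on task
  rating: number;       // post-task ease rating, 1-5
}

interface TaskSummary {
  taskId: string;
  successRate: number;
  avgSteps: number;
  avgSeconds: number;
  avgRating: number;
}

// Aggregate per-task metrics across all participants.
function summarise(attempts: TaskAttempt[]): TaskSummary[] {
  const byTask = new Map<string, TaskAttempt[]>();
  for (const a of attempts) {
    const group = byTask.get(a.taskId) ?? [];
    group.push(a);
    byTask.set(a.taskId, group);
  }
  const mean = (xs: number[]) => xs.reduce((s, x) => s + x, 0) / xs.length;
  return [...byTask.entries()].map(([taskId, group]) => ({
    taskId,
    successRate: group.filter((a) => a.completed).length / group.length,
    avgSteps: mean(group.map((a) => a.steps)),
    avgSeconds: mean(group.map((a) => a.seconds)),
    avgRating: mean(group.map((a) => a.rating)),
  }));
}
```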
Participants
Key Issues
Final Design
Video Demo
I created a 1-minute-30-second video walkthrough as a step-by-step demo of the interactive dashboard prototype.
Short on time? Skip ahead to the Key Features recap just below.
Key Features
Quick snapshot of the final design: the AI-supported accessible map, driver-assist cluster view, and smart tell-tale alerts.
Reflection
Impact
Limitation
This was a university project completed in collaboration with a Hyundai representative, so we could not build or test the solution in a production vehicle. Time constraints also kept me from exploring every feature I thought would be helpful, for example condition-specific driving modes (such as a red-green colour-blind mode) and dedicated night- or bad-weather displays.
What I Learned
Researching AI and accessibility revealed just how many disability profiles exist and how challenging truly inclusive design can be. By narrowing the scope to vision-related needs, I saw the value of clear constraints and domain knowledge: understanding automotive HMI standards, visual-impairment nuances, and real driving contexts was essential before proposing solutions.
Now I'm confident that, regardless of the domain, I can carry these skills into any future project:
- Ramp up fast: dive into standards, user studies, and expert interviews to understand new territory.
- Set clear limits: define a manageable scope (here, vision-related needs) so the work stays achievable and useful.
- Translate insight into actionable solutions: turn fresh knowledge into evidence-based guidelines and prototypes.