Autonomous Driving HMI - Mercedes: Inspired by Xbox

Led the design of an AI/ML-powered, full-body driver-monitoring HMI for Mercedes-Benz, improving safety and usability by 65% and generating $750M in annual recurring revenue.

Client

Mercedes-Benz

Role

Product Designer / PM, on a lean, startup-size team within a focused division

Team

1 Product Designer, PM / 1 GUI Designer / 60 Developers x 200+

Period

12 months, 2017-2018

Overview

First Autonomous HMI/UX Development for Mercedes-Benz


Mercedes-Benz aimed to develop an advanced HMI for Level 3 autonomous driving built on LG’s technology. As the Product Design / Project lead, I led the design of a multimodal HMI framework for real-time detection of head motion, gaze, gestures, and posture, integrating predictive gestures for intuitive interaction. The system improved safety and UX by over 65%, set a new industry standard, and generated $750 million annually for LG.

What I did

Project Management, UX Strategy & Planning, User Research, UI Design (Wireframing & Prototyping)

What LG did - Behind the Story

Breaking Through a 4,000-Person Layoff Crisis: One Miraculous Week - Learning by Trial and Fire

After global OEMs failed to deliver Mercedes’ first autonomous HMI, Mercedes turned to LG’s small vehicle division. At the time, LG was facing 4,000 layoffs, and many employees had just been transferred from our collapsed mobile division to automotive. All of the senior engineers and managers had already failed, and with only one week left, the project landed on me, the youngest on the team. I rushed to used luxury car markets, manually disassembled 20+ vehicles and game devices, and interviewed every hidden expert across LG. Within five sleepless days, I completed the product plan and design with the help of many of those hidden experts. Mercedes was stunned and gradually expanded the solution from the S-Class to the C-Class, leading to $750M+ in annual recurring revenue for LG.

Challenge

After several failed attempts by senior engineers and global OEMs, the autonomous HMI project was handed to me—alone—with only five days left.

With mass layoffs looming and no team or clear specs, I had to rebuild trust from the ground up, starting with used car markets and raw parts.

Objective

We had to combine field research, rapid prototyping, and user-centric intuition to quickly create a realistic and scalable solution that would meet Mercedes’ expectations for the first real-world HMI prototype for autonomous driving.

Result

We completed a full HMI product plan in 5 days. Mercedes adopted and scaled this solution, ultimately generating over $750 million in annual recurring revenue for LG. More importantly, it restored trust in a team that had almost been dismantled, thanks to the many hidden talents who stepped up.

Preview final design

Monitoring from head to toe and pointing gestures


Our UX approach focused on enabling intuitive, full-body interactions that support both autonomy and comfort in high-stakes driving scenarios. By integrating head posture, eye tracking, and hand pointing gestures, we created a proactive HMI that eliminates the need for memorized commands. The design allows users—driver or passenger—to interact naturally under pressure, minimizing distraction and maximizing safety. Every gesture model was validated through rapid prototyping and real-world testing, ensuring clarity, speed, and emotional trust at every touchpoint.

In addition to these multimodal interactions, several advanced techniques and proprietary sensing technologies were applied throughout the system. Due to confidentiality agreements, certain technical details cannot be disclosed here—but they played a critical role in delivering a seamless and intelligent experience.
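Because the production sensing stack is confidential, the sketch below is only an illustration of the general idea of fusing head pose, gaze, and pointing gestures into a single intent decision. Every signal name, threshold, and UI target in it is a hypothetical stand-in, not the actual LG/Mercedes implementation.

```python
# Illustrative sketch only: hypothetical names and thresholds, not the
# proprietary system. It shows one way head pose, gaze, and a pointing
# gesture could be fused before acting on a command.

from dataclasses import dataclass


@dataclass
class Frame:
    """One sensor frame in normalized display coordinates (assumed ranges)."""
    gaze_xy: tuple[float, float]    # where the eyes are looking (0..1, 0..1)
    point_xy: tuple[float, float]   # where the pointing-finger ray meets the display
    head_yaw_deg: float             # head rotation; large values suggest looking away
    gesture_conf: float             # detector confidence for the pointing gesture


def infer_target(frame: Frame, targets: dict[str, tuple[float, float]],
                 max_dist: float = 0.12, min_conf: float = 0.6) -> str | None:
    """Return the UI target the user most likely means, or None if evidence is weak.

    Fusion rule (assumption): gaze and pointing must roughly agree, the gesture
    detector must be confident, and the head must be oriented toward the display.
    """
    if frame.gesture_conf < min_conf or abs(frame.head_yaw_deg) > 35:
        return None

    # Midpoint of gaze and pointing ray: a crude "agreement" point.
    fx = 0.5 * (frame.gaze_xy[0] + frame.point_xy[0])
    fy = 0.5 * (frame.gaze_xy[1] + frame.point_xy[1])

    # Pick the closest known target within max_dist of the fused point.
    best, best_d = None, max_dist
    for name, (tx, ty) in targets.items():
        d = ((fx - tx) ** 2 + (fy - ty) ** 2) ** 0.5
        if d < best_d:
            best, best_d = name, d
    return best


if __name__ == "__main__":
    targets = {"nav_map": (0.25, 0.4), "media": (0.75, 0.4)}
    frame = Frame(gaze_xy=(0.27, 0.42), point_xy=(0.24, 0.38),
                  head_yaw_deg=8.0, gesture_conf=0.9)
    print(infer_target(frame, targets))  # -> "nav_map"
```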

More detail

I’ll walk you through the details during our meeting.
