AI for Low Vision
A multi-sensory app for visually impaired users that improved time perception by 73% using AI voice, Morse code vibration, and high-contrast visuals.

Summary

Multi-sensory Time and Sun Position Information App

Role: Project Lead | Period: 10 weeks, 2023

After identifying a strong need for time and sun position information among visually impaired users, I developed a mobile and tablet app to address it. Informed by UX research, interviews, and usability testing, I implemented multi-sensory feedback combining AI voice interaction, Morse code vibration, and high-contrast visuals, resulting in a 73% improvement in time perception and easier daily planning.

Tools/Tech: Adobe XD, Figma, After Effects, haptic feedback systems, etc.

Category

UX Research

UX/UI Design

Visual Design


Romey © 2025. All Rights Reserved.