The Project

Over 25 million American adults experienced vision loss in 2016. For many of these individuals, navigating new spaces can be a cumbersome or even dangerous experience, requiring them to feel out their environment by hand or cane.

Our project seeks to harness the spatial mapping power of augmented reality devices like the Magic Leap One and Microsoft HoloLens to help users navigate unfamiliar environments.

By sonifying the 3D mesh these devices generate of the user's surroundings, we will create a soundscape that lets users hear the objects around them. Visually impaired users will be able to hear the presence of obstacles like tables and chairs, or the absence of obstacles that marks a doorway, from up to five meters away.
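As a rough illustration of the idea, one simple way to turn mesh geometry into sound is to map each nearby obstacle's distance to loudness and its direction to stereo panning. The sketch below is hypothetical (the function name, parameters, and five-meter cutoff are our own illustration, not the actual application code):

```python
import math

def obstacle_to_audio(distance_m, azimuth_rad, max_range_m=5.0):
    """Map an obstacle's position to a (gain, pan) pair for a soundscape.

    distance_m: distance from the user to the mesh point, in meters.
    azimuth_rad: horizontal angle relative to the user's gaze
                 (0 = straight ahead, positive = to the right).
    Returns None for obstacles beyond the audible range.
    """
    if distance_m <= 0 or distance_m > max_range_m:
        return None
    # Closer obstacles sound louder: gain falls off linearly to 0 at max range.
    gain = 1.0 - distance_m / max_range_m
    # Pan from -1 (full left) to +1 (full right) based on direction.
    pan = math.sin(azimuth_rad)
    return gain, pan
```

For example, a chair 2.5 meters straight ahead would play at half volume, centered; a wall just outside the five-meter range would be silent. A real implementation would use the headset's spatial audio APIs rather than manual panning, but the mapping idea is the same.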

In combining this capability with machine-vision text and facial recognition in one hands-free headset, we hope to give blind users a new level of independence in unfamiliar spaces.

The Team

We are an interdisciplinary team of Master's students at UC Berkeley's School of Information. Our advisor is UCB faculty member and HCI expert Kimiko Ryokai.


We're working with organizations like VoiceCORP to shape our application around real user needs. We're also partnering with VR @ Berkeley, one of the world's largest undergraduate VR/AR clubs, for additional expertise and access to VR/AR hardware.


AT&T Mixed Reality Hackathon

Our first hands-on experience with the Magic Leap One

XR Accessibility: A Capstone Project

Our initial project focus: generalized XR accessibility

Get in Touch

If you have experiences to share about visual impairment or accessible technology, we want to hear from you! Please feel free to reach out to us at dylan.r.fox@berkeley.edu.