Over 25 million American adults experienced vision loss in 2016. For many of these individuals, navigating new spaces can be a cumbersome or even dangerous experience, requiring them to feel out their environment by hand or by cane.
By making audible the 3D mesh of the user's surroundings that mixed-reality headsets like the Magic Leap One generate, we will create a soundscape that lets users hear the objects around them. Visually impaired users will thus be able to hear the presence of obstacles like tables and chairs, or the absence of obstacles that represents a doorway, from up to five meters away.
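As a rough illustration of the idea (not the project's actual implementation, which would presumably use the headset's spatial-audio APIs), nearby mesh points could be mapped to simple stereo cues, with closer obstacles sounding louder. The coordinate convention, panning scheme, and function name below are all hypothetical:

```python
import math

MAX_RANGE_M = 5.0  # audible range: obstacles up to five meters away


def mesh_to_audio_cues(points):
    """Map 3D mesh points (x, y, z) in the user's head frame to simple
    stereo cues. Assumed convention: x = right, z = forward, meters.

    Returns a list of (pan, gain) pairs: pan in [-1, 1] (left..right),
    gain in (0, 1], louder for closer obstacles. Points beyond
    MAX_RANGE_M are silent and dropped.

    Hypothetical sketch only -- a real system would hand these points
    to a spatial-audio engine rather than hand-roll panning.
    """
    cues = []
    for x, y, z in points:
        dist = math.sqrt(x * x + y * y + z * z)
        if dist == 0 or dist > MAX_RANGE_M:
            continue
        pan = x / dist                   # crude left/right placement
        gain = 1.0 - dist / MAX_RANGE_M  # closer obstacle -> louder cue
        cues.append((pan, gain))
    return cues
```

For example, a chair 2.5 m straight ahead would yield a centered cue at half volume, while a wall 6 m away would produce no sound at all.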
By combining this capability with machine-vision text and facial recognition in one hands-free headset, we hope to give blind users a new level of independence in unfamiliar spaces.
We are an interdisciplinary team of Master's students at UC Berkeley's School of Information. Our advisor is UCB faculty member and HCI expert Kimiko Ryokai.
Our first hands-on experience with the Magic Leap One
Our initial project focus: generalized XR accessibility
If you have experiences to share about visual impairment or accessible technology, we want to hear from you! Please feel free to reach out to us at firstname.lastname@example.org.