Thea is a concept for a system that helps blind and visually impaired individuals navigate their world through artificial intelligence, haptic feedback and next-generation networks. Using the continuous positioning that next-generation networks will enable, Thea identifies the most efficient route and guides blind and low-vision individuals along it accurately. As a system, Thea communicates granular directionality through a set of wearable haptic pads and a voice-activated user interface: it interprets natural speech and responds with non-intrusive audio and tactile feedback. Ultimately, Thea helps those with vision impairments overcome mobility challenges and empowers them to navigate their world with greater confidence.

Completed during an internship at Moment Design / Verizon. Recognized as one of five finalists worldwide for Student Innovation, SXSW Interactive Innovation Awards, 2019. Shortlisted, Interaction Design Awards, 2019.





With a conversational UI, Thea responds to requests like a real person, adapting to voice inputs to provide an unparalleled navigation experience.
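As a rough illustration of what handling a spoken request could look like, here is a minimal sketch of mapping an utterance to a navigation intent. The phrases, intent names and pattern-matching approach are assumptions made for the example only; the Thea concept does not specify how its conversational UI is implemented.

```python
# Illustrative sketch only: a toy mapping from a spoken request to a
# navigation intent. All intents and phrasings here are hypothetical.
import re

INTENT_PATTERNS = {
    "navigate": re.compile(r"\b(take me to|navigate to|guide me to)\s+(?P<place>.+)", re.I),
    "find_friend": re.compile(r"\b(where is|meet up with)\s+(?P<person>.+)", re.I),
    "repeat": re.compile(r"\b(say that again|repeat)\b", re.I),
}

def parse_request(utterance: str) -> dict:
    """Return the first matching intent plus any captured slot values."""
    for intent, pattern in INTENT_PATTERNS.items():
        match = pattern.search(utterance)
        if match:
            return {"intent": intent, **match.groupdict()}
    return {"intent": "unknown"}

if __name__ == "__main__":
    print(parse_request("Thea, take me to Platform 12 at Penn Station"))
    # -> {'intent': 'navigate', 'place': 'Platform 12 at Penn Station'}
```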




Thea uses intuitive haptic feedback. Blind and low-vision individuals rely heavily on sounds and audio cues from their environment, so in high-congestion and noisy areas, Thea switches from audio to haptic feedback. Thea's haptic language orients users and provides specific directional information.
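One way this behavior could work is sketched below: choose between audio and haptic guidance based on ambient noise and congestion, and encode a turn direction as a simple pad pattern. The thresholds, sensor inputs and pad layout are assumptions for illustration, not details drawn from the Thea concept itself.

```python
# Illustrative sketch only: feedback-mode switching and a toy haptic language.
# Thresholds and pad names are hypothetical assumptions.
from dataclasses import dataclass

NOISE_THRESHOLD_DB = 70   # assumed cutoff for "noisy" surroundings
CROWD_THRESHOLD = 0.6     # assumed 0-1 congestion score cutoff

@dataclass
class Environment:
    ambient_noise_db: float   # e.g. measured by the phone's microphone
    crowd_density: float      # 0 (empty) to 1 (packed)

def choose_feedback_mode(env: Environment) -> str:
    """Switch from audio to haptic guidance in loud or congested areas."""
    if env.ambient_noise_db > NOISE_THRESHOLD_DB or env.crowd_density > CROWD_THRESHOLD:
        return "haptic"
    return "audio"

def direction_to_pulses(relative_bearing_deg: float) -> dict:
    """Map a relative heading to pulses on the left/right wearable pads."""
    if relative_bearing_deg < -20:        # turn left
        return {"pad": "left", "pulses": 2}
    if relative_bearing_deg > 20:         # turn right
        return {"pad": "right", "pulses": 2}
    return {"pad": "both", "pulses": 1}   # continue straight

if __name__ == "__main__":
    env = Environment(ambient_noise_db=78.0, crowd_density=0.8)
    print(choose_feedback_mode(env))      # -> "haptic"
    print(direction_to_pulses(-35.0))     # -> {'pad': 'left', 'pulses': 2}
```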

Thea’s wearable pads are made of comfortable and inexpensive materials. The pads can be placed anywhere on the body, giving users the ability to choose where they’d like haptic feedback.




Thanks to next-generation networks, Thea is able to provide users with precise navigational guidance, including indoor navigation in large public spaces such as train stations and shopping malls. For blind and visually impaired individuals, navigation can't stop at the door or, as it so often does, somewhere near the desired destination. Thea guides individuals to the exact point they have in mind, such as a specific train platform, classroom or exhibit in a large museum. When meeting up with friends, Thea can use a friend's location to provide directions to their exact position, even if that friend has moved during the trip.
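To make the moving-friend scenario concrete, here is a small sketch of keeping a destination current as a friend's shared position updates. The distance approximation, the 10 m re-route threshold and the coordinates are all assumptions for the example, not specifics from the Thea concept.

```python
# Illustrative sketch only: re-route toward a friend when they move far enough.
# The threshold and coordinates are hypothetical.
import math

def distance_m(a: tuple, b: tuple) -> float:
    """Approximate distance in metres between two (lat, lon) points."""
    lat1, lon1 = a
    lat2, lon2 = b
    dx = (lon2 - lon1) * 111_320 * math.cos(math.radians((lat1 + lat2) / 2))
    dy = (lat2 - lat1) * 110_540
    return math.hypot(dx, dy)

REROUTE_THRESHOLD_M = 10.0   # assumed: re-route once the friend moves this far

def maybe_reroute(current_target: tuple, friend_position: tuple) -> tuple:
    """Return an updated target if the friend has moved far enough to matter."""
    if distance_m(current_target, friend_position) > REROUTE_THRESHOLD_M:
        return friend_position   # hand the new target to the route planner
    return current_target

if __name__ == "__main__":
    target = (40.7506, -73.9935)       # e.g. an agreed meeting point
    friend_now = (40.7513, -73.9921)   # friend has wandered to another exit
    print(maybe_reroute(target, friend_now))   # -> (40.7513, -73.9921)
```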

What the future holds
Using inputs such as crowd-sourced maps, aggregated location data and city-wide camera footage, Thea could provide the fastest and most accurate navigation assistance the world has ever seen. The new capabilities of next-generation networks will provide the infrastructure needed to power Thea, and allow each of us—including the blind and low-vision community—to go beyond our limits and better connect with the world around us.



Lauren Fox
Last Update—May 17, 2019