Project

Soundsight
Problem
Visually impaired people often rely on canes or guide dogs for navigation. Researchers wanted to explore whether spatial audio generated from depth camera data could provide an additional, intuitive navigation aid — one that could run on a standard smartphone without specialist hardware.
Approach
We built Soundsight in Swift for iOS, using depth data from the iPhone's camera to generate a real-time stream of audio parameters. The distance and position of each obstacle are mapped to pitch, volume, and spatial placement using Core Audio. The interface lets researchers configure audio parameters via gestures and a JSON configuration file, making it adaptable to different experimental setups. Later versions added support for external sensors, including the Structure.io depth sensor and the FLIR One thermal camera.
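The core of the depth-to-audio mapping can be sketched as a pure function from obstacle geometry to synthesis parameters. The names, ranges, and linear mappings below are illustrative assumptions, not Soundsight's actual implementation:

```swift
import Foundation

// Hypothetical sketch of a depth-to-audio mapping (all names and
// ranges are assumptions for illustration, not Soundsight's real code).
struct AudioCue {
    let frequencyHz: Double  // pitch: nearer obstacles sound higher
    let gain: Double         // volume: nearer obstacles sound louder
    let pan: Double          // spatial placement: -1 (left) ... +1 (right)
}

/// Maps an obstacle's depth in metres and its horizontal position in
/// the camera frame (0...1) to audio parameters, clamping depth to a
/// usable range so very near/far readings do not saturate the output.
func cue(forDepth depth: Double, horizontalPosition x: Double,
         minDepth: Double = 0.3, maxDepth: Double = 5.0) -> AudioCue {
    let d = min(max(depth, minDepth), maxDepth)
    let proximity = 1.0 - (d - minDepth) / (maxDepth - minDepth) // 1 = very near
    let frequency = 220.0 + proximity * (880.0 - 220.0)          // A3...A5
    let gain = 0.2 + 0.8 * proximity                              // never fully silent
    let pan = x * 2.0 - 1.0                                       // frame x to stereo pan
    return AudioCue(frequencyHz: frequency, gain: gain, pan: pan)
}
```

In a real pipeline these parameters would drive a Core Audio or AVAudioEngine synthesis graph each frame; keeping the mapping a pure function makes it easy to expose the ranges through a JSON configuration file, as the tool does.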
Outcome
Soundsight is an active research tool for studying the efficacy of audio-based navigation aids for visually impaired people. It demonstrated that consumer iPhone hardware is sufficient for real-time depth-to-audio translation at a quality useful for navigation research.