Using auditory augmented reality to understand visual scenes

Date
2017
Authors
Stone, Scott
University of Lethbridge. Faculty of Arts and Science
Publisher
Lethbridge, Alta. : University of Lethbridge, Department of Neuroscience
Abstract
Locating objects in space is typically thought of as a visual task. However, not everyone has access to visual information; people who are blind, for example, must rely on other senses. The purpose of this thesis was to investigate whether visual events could be converted into spatial auditory cues. A neuromorphic retina was used to capture visual events, and custom software was written to augment the scene with auditory localization cues. The neuromorphic retina is engineered to encode data much as the dorsal visual pathway does. The dorsal visual pathway is associated with fast, nonredundant information encoding and is thought to drive attentional shifting, especially in the presence of visual transients. The intent was to create a device capable of using these visual onsets and transients to generate spatial auditory cues. To achieve this, the device applies the core principles of auditory localization, focusing on the interaural time and level difference cues, which are thought to encode azimuthal location in space. Results demonstrate the usefulness of such a device, although personalization would likely improve the effectiveness of the cues generated. In summary, I have created a device that converts purely visual events into useful auditory cues for localization, thereby granting perception of stimuli that may otherwise have been inaccessible to the user.
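The interaural time difference (ITD) cue mentioned above can be illustrated with a standard spherical-head approximation (Woodworth's formula). This is a minimal sketch for intuition only: the head radius and speed-of-sound values are conventional textbook figures, not parameters taken from the thesis, and the thesis's actual cue-generation software is not reproduced here.

```python
import math

def itd_seconds(azimuth_deg, head_radius_m=0.0875, speed_of_sound=343.0):
    """Approximate ITD for a source at a given azimuth (0 = straight ahead).

    Woodworth's spherical-head model: ITD = (r / c) * (theta + sin(theta)),
    where r is an assumed average head radius and c the speed of sound.
    """
    theta = math.radians(azimuth_deg)
    return (head_radius_m / speed_of_sound) * (theta + math.sin(theta))

# A source directly ahead produces no time difference between the ears;
# a source at 90 degrees produces the maximum ITD, roughly 0.66 ms.
print(itd_seconds(0.0))         # 0.0
print(itd_seconds(90.0) * 1e3)  # ~0.66 (milliseconds)
```

Delaying the signal to the far ear by this amount (and attenuating it to approximate the interaural level difference) is the basic mechanism by which an azimuthal position can be rendered over headphones.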
Keywords
auditory augmented reality, auditory localization, dorsal visual stream, neuromorphic retina, spatial auditory cues, visual impairment