What factors are important in the calibration of mental representations of auditory space? A substantial body of research investigating the audiospatial abilities of people who are blind has shown that visual experience might be an important factor for accurate performance in some audiospatial tasks. Yet, it has also been shown that long-term experience using click-based echolocation might play a similar role, with blind expert echolocators demonstrating auditory localization abilities that are superior to those of people who are blind and who do not use click-based echolocation (Vercillo et al., Neuropsychologia 67: 35–40, 2015). Based on this hypothesis, we might predict that training in click-based echolocation leads to improved performance in auditory localization tasks in people who are blind. Here we investigated this hypothesis in a sample of 12 adults who had been blind from birth. We did not find evidence for an improvement in auditory localization after 10 weeks of echolocation training, despite significant improvement in echolocation ability. It is possible that longer-term experience with click-based echolocation is required for such effects to develop, or that other factors explain the association between echolocation expertise and superior auditory localization. Considering the practical relevance of click-based echolocation for people who are visually impaired, future research should address these questions.
Echolocation is the ability to use reflected sound to obtain information about the spatial environment. It is an active process that requires both the production of an emission and the sensory processing of the resultant sound. Appreciating the general usefulness of echo-acoustic cues for people, in particular those with vision impairments, various devices have been built that exploit the principle of echolocation to obtain and provide information about the environment. Common to all these devices is that they do not require the person to make a sound; instead, the device produces the emission autonomously and feeds the resultant sound back to the user. Here we tested whether echolocation performance in a simple object-detection task was affected by the use of a head-mounted loudspeaker as compared to active clicking. We found that 27 sighted participants new to echolocation generally did better when they used the loudspeaker than when they used mouth clicks, and that two blind participants with experience in echolocation did equally well with mouth clicks and the speaker. Importantly, performance of sighted participants was not statistically different from that of blind experts when they used the speaker. Based on acoustic click data collected from a subset of our participants, those whose mouth clicks were more similar to the speaker clicks, i.e. had higher peak frequencies and sound intensity, did better. We conclude that our results are encouraging for the consideration and development of assistive devices that exploit the principle of echolocation.
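The acoustic comparison described above (peak frequency and intensity of mouth clicks versus speaker clicks) can be sketched with a simple spectral analysis. The function below is a hypothetical illustration of how such features might be extracted from a recorded click, not the authors' actual analysis pipeline.

```python
import numpy as np

def click_features(click, fs):
    """Extract two simple acoustic features of a recorded click:
    peak frequency (Hz of the largest-magnitude FFT bin) and RMS
    intensity. Illustrative sketch only, not the published method."""
    spectrum = np.abs(np.fft.rfft(click))
    freqs = np.fft.rfftfreq(len(click), d=1.0 / fs)
    peak_freq = freqs[np.argmax(spectrum)]
    rms = np.sqrt(np.mean(click ** 2))
    return peak_freq, rms

# Example input: a synthetic 3 ms, 3 kHz tone burst at 44.1 kHz
# (standing in for a recorded mouth click).
fs = 44100
t = np.arange(int(0.003 * fs)) / fs
click = 0.5 * np.sin(2 * np.pi * 3000 * t)
peak_freq, rms = click_features(click, fs)
```

With a 3 ms window, frequency resolution is coarse (~330 Hz per bin), so the recovered peak frequency is only approximate; longer recordings or zero-padding would sharpen it.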
Understanding the factors that determine if a person can successfully learn a novel sensory skill is essential for understanding how the brain adapts to change, and for providing rehabilitative support for people with sensory loss. We report a training study investigating the effects of blindness and age on the learning of a complex auditory skill: click-based echolocation. Blind and sighted participants of various ages (21-79 yrs; median blind: 45 yrs; median sighted: 26 yrs) trained in 20 sessions over the course of 10 weeks in various practical and virtual navigation tasks. Blind participants also took part in a 3-month follow-up survey assessing the effects of the training on their daily life. We found that both sighted and blind people improved considerably on all measures, and in some cases performed comparably to expert echolocators at the end of training. Somewhat surprisingly, sighted people performed better than those who were blind in some cases, although our analyses suggest that this might be better explained by the younger age (or superior binaural hearing) of the sighted group. Importantly, however, neither age nor blindness was a limiting factor in participants' rate of learning (i.e. their difference in performance from the first to the final session) or in their ability to apply their echolocation skills to novel, untrained tasks. Furthermore, in the follow-up survey, all participants who were blind reported improved mobility, and 83% reported better independence and wellbeing. Overall, our results suggest that the ability to learn click-based echolocation is not strongly limited by age or level of vision. This has positive implications for the rehabilitation of people with vision loss or in the early stages of progressive vision loss.
Some people can echolocate by making sonar emissions (e.g., mouth clicks, finger snaps, feet shuffling, humming, cane tapping) and listening to the returning echoes. To date there are no statistics available about how many blind people use echolocation, but anecdotal reports in the literature suggest that perhaps between 20 and 30% of totally blind people may use it, suggesting that echolocation affords broad functional benefits. Consistent with the notion that blind individuals benefit from the use of echolocation, previous research conducted under controlled experimental conditions has shown that echolocation improves blind people's spatial sensing ability. The current study investigated if there is also evidence for functional benefits of echolocation in real life. To address this question, the current study conducted an online survey in which thirty-seven blind people participated. Linear regression analyses of survey data revealed that, while statistically controlling for participants' gender, age, level of visual function, general health, employment status, level of education, Braille skill, and use of other mobility means, people who use echolocation have higher salary, and higher mobility in unfamiliar places, than people who do not use echolocation. The majority of our participants (34 out of 37) use the long cane, and all participants who reported echolocating also reported using the long cane. This suggests that the benefit of echolocation that we found might be conditional upon the long cane being used as well. The investigation was correlational in nature, and thus cannot be used to determine causality. In addition, the sample was small (N = 37), and one should be cautious when generalizing the current results to the population. The data, however, are consistent with the idea that echolocation offers real-life advantages for blind people, and that echolocation may be involved in people's successful adaptation to vision loss.
Echolocation in humans: an overview
Thaler, Lore; Goodale, Melvyn A.
Wiley Interdisciplinary Reviews: Cognitive Science, November/December 2016, Volume 7, Issue 6
Journal Article · Peer reviewed · Open access
Bats and dolphins are known for their ability to use echolocation. They emit bursts of sounds and listen to the echoes that bounce back to detect the objects in their environment. What is not as well known is that some blind people have learned to do the same thing, making mouth clicks, for example, and using the returning echoes from those clicks to sense obstacles and objects of interest in their surroundings. The current review explores some of the research that has examined human echolocation and the changes that have been observed in the brains of echolocation experts. We also discuss potential applications and assistive technology based on echolocation. Blind echolocation experts can sense small differences in the location of objects, differentiate between objects of various sizes and shapes, and even between objects made of different materials, just by listening to the reflected echoes from mouth clicks. It is clear that echolocation may enable some blind people to do things that are otherwise thought to be impossible without vision, potentially providing them with a high degree of independence in their daily lives and demonstrating that echolocation can serve as an effective mobility strategy in the blind. Neuroimaging has shown that the processing of echoes activates brain regions in blind echolocators that would normally support vision in the sighted brain, and that the patterns of these activations are modulated by the information carried by the echoes. This work is shedding new light on just how plastic the human brain is. WIREs Cogn Sci 2016, 7:382–393. doi: 10.1002/wcs.1408
This article is categorized under:
Psychology > Brain Function and Dysfunction
Psychology > Perception and Psychophysics
Neuroscience > Plasticity
The study of echolocation in blind humans is a vibrant area of research in psychology and the neurosciences. It is not only a fascinating subject in its own right, but provides a window into neuroplasticity, affording researchers a fresh paradigm for probing how the brain deals with novel sensory information.
A small number of blind people are adept at echolocating silent objects simply by producing mouth clicks and listening to the returning echoes. Yet the neural architecture underlying this type of aid-free human echolocation has not been investigated. To tackle this question, we recruited echolocation experts, one early- and one late-blind, and measured functional brain activity in each of them while they listened to their own echolocation sounds.
When we compared brain activity for sounds that contained both clicks and the returning echoes with brain activity for control sounds that did not contain the echoes, but were otherwise acoustically matched, we found activity in calcarine cortex in both individuals. Importantly, for the same comparison, we did not observe a difference in activity in auditory cortex. In the early-blind, but not the late-blind participant, we also found that the calcarine activity was greater for echoes reflected from surfaces located in contralateral space. Finally, in both individuals, we found activation in middle temporal and nearby cortical regions when they listened to echoes reflected from moving targets.
These findings suggest that processing of click-echoes recruits brain regions typically devoted to vision rather than audition in both early and late blind echolocation experts.
People use sensory, in particular visual, information to guide actions such as walking around obstacles, grasping or reaching. However, it is presently unclear how malleable the sensorimotor system is. The present study investigated this by measuring how click-based echolocation may be used to avoid obstacles while walking. We tested 7 blind echolocation experts, 14 sighted echolocation beginners, and 10 blind echolocation beginners. For comparison, we also tested 10 sighted participants, who used vision. To maximize the relevance of our research for people with vision impairments, we also included a condition where the long cane was used, and considered obstacles at different elevations. Motion capture and sound data were acquired simultaneously. We found that echolocation experts walked just as fast as sighted participants using vision, and faster than either sighted or blind echolocation beginners. Walking paths of echolocation experts indicated early and smooth adjustments, similar to those shown by sighted people using vision and different from the later and more abrupt adjustments of beginners. Further, for all participants, the use of echolocation significantly decreased collision frequency with obstacles at head, but not ground, level. Further analyses showed that participants who made clicks with higher spectral frequency content walked faster, and that for experts higher clicking rates were associated with faster walking. The results highlight that people can use novel sensory information (here, echolocation) to guide actions, demonstrating the action system's ability to adapt to changes in sensory input. They also highlight that regular use of echolocation enhances sensory-motor coordination for walking in blind people.
Public Significance Statement
Vision loss has negative consequences for people's mobility. The current report demonstrates that echolocation might replace certain visual functionality for adaptive walking. Importantly, the report also highlights that echolocation and the long cane are complementary mobility techniques. The findings have direct relevance for professionals involved in mobility instruction and for people who are blind.
Similar to certain bats and dolphins, some blind humans can use sound echoes to perceive their silent surroundings. By producing an auditory signal (e.g., a tongue click) and listening to the returning echoes, these individuals can obtain information about their environment, such as the size, distance, and density of objects. Past research has also hinted at the possibility that blind individuals may be able to use echolocation to gather information about 2-D surface shape, with definitive results pending. Thus, here we investigated people’s ability to use echolocation to identify the 2-D shape (contour) of objects. We also investigated the role played by head movements—that is, exploratory movements of the head while echolocating—because anecdotal evidence suggests that head movements might be beneficial for shape identification. To this end, we compared the performance of six expert echolocators to that of ten blind nonecholocators and ten blindfolded sighted controls in a shape identification task, with and without head movements. We found that the expert echolocators could use echoes to determine the shapes of the objects with exceptional accuracy when they were allowed to make head movements, but that their performance dropped to chance level when they had to remain still. Neither blind nor blindfolded sighted controls performed above chance, regardless of head movements. Our results show not only that experts can use echolocation to successfully identify 2-D shape, but also that head movements made while echolocating are necessary for the correct identification of 2-D shape.
Here, we report novel empirical results from a psychophysical experiment in which we tested the echolocation abilities of nine blind adult human experts in click-based echolocation. We found that they had better acuity in localizing a target and used lower intensity emissions (i.e., mouth clicks) when a target was placed 45° off to the side compared with when it was placed at 0° (straight ahead). We provide a possible explanation of the behavioral result in terms of binaural-intensity signals, which appear to change more rapidly around 45°. The finding that echolocators have better echo-localization off axis is surprising, because for human source localization (i.e., regular spatial hearing), it is well known that performance is best when targets are straight ahead (0°) and decreases as targets move farther to the side. This may suggest that human echolocation and source hearing rely on different acoustic cues and that human spatial hearing has more facets than previously thought.
Echolocation is the ability to use sound echoes to infer spatial information about the environment. Some blind people have developed extraordinary proficiency in echolocation using mouth clicks. The first step of human biosonar is the transmission (mouth click) and subsequent reception of the resultant sound through the ear. Existing head-related transfer function (HRTF) databases provide descriptions of reception of the resultant sound. For the current report, we collected a large database of click emissions from three blind people expertly trained in echolocation, which allowed us to perform unprecedented analyses. Specifically, the current report provides the first ever description of the spatial distribution (i.e. beam pattern) of human expert echolocation transmissions, as well as spectro-temporal descriptions at a level of detail not available before. Our data show that transmission levels are fairly constant within a 60° cone emanating from the mouth, but levels drop gradually at further angles, more so than for speech. In terms of spectro-temporal features, our data show that emissions are consistently very brief (~3 ms duration) with peak frequencies of 2-4 kHz, but with energy also at 10 kHz. This differs from previous reports of durations of 3-15 ms and peak frequencies of 2-8 kHz, which were based on less detailed measurements. Based on our measurements we propose to model transmissions as a sum of monotones modulated by a decaying exponential, with angular attenuation by a modified cardioid. We provide model parameters for each echolocator. These results are a step towards developing computational models of human biosonar. For example, in bats, spatial and spectro-temporal features of emissions have been used to derive and test model-based hypotheses about behaviour. The data we present here suggest similar research opportunities within the context of human echolocation.
Relatedly, the data are a basis for developing synthetic models of human echolocation that could be virtual (i.e. simulated) or real (i.e. loudspeaker, microphones), and which will help in understanding the link between physical principles and human behaviour.
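The proposed emission model (a sum of monotones under a decaying-exponential envelope, with off-axis attenuation via a modified cardioid) can be sketched as below. All parameter values here are illustrative placeholders chosen to match the qualitative description (~3 ms duration, peak energy at 2-4 kHz, a secondary component near 10 kHz), not the fitted per-echolocator parameters reported in the paper.

```python
import numpy as np

def click_waveform(t, components, decay):
    """Sum of monotones (pure tones) modulated by a shared decaying
    exponential envelope. components: list of (frequency_hz, amplitude)
    pairs; decay: exponential decay constant in 1/s. Parameter values
    passed below are illustrative, not published fits."""
    envelope = np.exp(-decay * t)
    tone_sum = sum(a * np.sin(2 * np.pi * f * t) for f, a in components)
    return envelope * tone_sum

def cardioid_attenuation(angle_rad, k=0.5):
    """Modified-cardioid gain: near-constant close to 0 deg (straight
    ahead), falling off gradually at larger angles. The shape
    parameter k is a hypothetical value."""
    return (1 - k) + k * np.cos(angle_rad)

# Synthesize a ~3 ms click at 44.1 kHz with a 3 kHz main component
# and a weaker 10 kHz component; decay=1500/s brings the envelope
# to about 1% of its initial value by 3 ms.
fs = 44100
t = np.arange(int(0.003 * fs)) / fs
click = click_waveform(t, components=[(3000, 1.0), (10000, 0.3)], decay=1500)

# Relative transmission level 60 deg off the mouth axis vs. on-axis:
gain_60 = cardioid_attenuation(np.radians(60)) / cardioid_attenuation(0.0)
```

A synthetic emission like this could be convolved with simulated room reflections and an HRTF to generate virtual echolocation stimuli, which is the kind of application the passage above envisages.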