A musical beat is a perceived periodic pulse that structures the perception of musical rhythm and serves as a framework for synchronized movement to music. What are the neural mechanisms of musical beat perception, and how did they evolve? One view, which dates back to Darwin and implicitly informs some current models of beat perception, is that the relevant neural mechanisms are relatively general and are widespread among animal species. On the basis of recent neural and cross-species data on musical beat processing, this paper argues for a different view. Here we argue that beat perception is a complex brain function involving temporally precise communication between auditory regions and motor planning regions of the cortex (even in the absence of overt movement). More specifically, we propose that simulation of periodic movement in motor planning regions provides a neural signal that helps the auditory system predict the timing of upcoming beats. This "action simulation for auditory prediction" (ASAP) hypothesis leads to testable predictions. We further suggest that ASAP relies on dorsal auditory pathway connections between auditory regions and motor planning regions via the parietal cortex, and that these connections may be stronger in humans than in non-human primates due to the evolution of vocal learning in our lineage. This suggestion motivates cross-species research to determine which species are capable of human-like beat perception, i.e., beat perception that involves accurate temporal prediction of beat times across a fairly broad range of tempi.
• We tested the assumed superiority of auditory over visual timing.
• Colliding visual stimuli can drive synchronization nearly as accurately as sound.
• Deafness leads to enhancements of visuomotor synchronization.
• The results constrain theories of timing and cross-modal plasticity.
A striking asymmetry in human sensorimotor processing is that humans synchronize movements to rhythmic sound with far greater precision than to temporally equivalent visual stimuli (e.g., to an auditory vs. a flashing visual metronome). Traditionally, this finding is thought to reflect a fundamental difference in auditory vs. visual processing, i.e., superior temporal processing by the auditory system and/or privileged coupling between the auditory and motor systems. It is unclear whether this asymmetry is an inevitable consequence of brain organization or whether it can be modified (or even eliminated) by stimulus characteristics or by experience. With respect to stimulus characteristics, we found that a moving, colliding visual stimulus (a silent image of a bouncing ball with a distinct collision point on the floor) was able to drive synchronization nearly as accurately as sound in hearing participants. To study the role of experience, we compared synchronization to flashing metronomes in hearing and profoundly deaf individuals. Deaf individuals performed better than hearing individuals when synchronizing with visual flashes, suggesting that cross-modal plasticity enhances the ability to synchronize with temporally discrete visual stimuli. Furthermore, when deaf (but not hearing) individuals synchronized with the bouncing ball, their tapping patterns suggest that visual timing may access higher-order beat perception mechanisms for deaf individuals. These results indicate that the auditory advantage in rhythmic synchronization is more experience- and stimulus-dependent than has been previously reported.
Our perceptions are shaped by both extrinsic stimuli and intrinsic interpretation. The perceptual experience of a simple rhythm, for example, depends upon its metrical interpretation (where one hears the beat). Such interpretation can be altered at will, providing a model to study the interaction of endogenous and exogenous influences in the cognitive organization of perception. Using magnetoencephalography (MEG), we measured brain responses evoked by a repeating, rhythmically ambiguous phrase (two tones followed by a rest). In separate trials, listeners were instructed to impose different metrical organizations on the rhythm by mentally placing the downbeat on either the first or the second tone. Since the stimulus was invariant, differences in brain activity between the two conditions should relate to endogenous metrical interpretation. Metrical interpretation influenced early evoked neural responses to tones, specifically in the upper beta range (20–30 Hz): the beta response was stronger (by 64% on average) when a tone was imagined to be the beat than when it was not. A second experiment established that the beta increase closely resembles that due to physical accents, and thus may represent the genesis of a subjective accent. The results demonstrate endogenous modulation of early auditory responses and suggest a unique role for the beta band in linking endogenous and exogenous processing. Given the suggested role of beta in motor processing and long-range intracortical coordination, we hypothesize that the motor system influences the metrical interpretation of sound, even in the absence of overt movement.
There is growing interest in how the brain's motor systems contribute to the perception of musical rhythms. The Action Simulation for Auditory Prediction (ASAP) hypothesis proposes that the dorsal auditory stream supports bidirectional interchange between auditory perception and beat-based prediction in motor planning structures via the parietal cortex (Patel, A. D., & Iversen, J. R. The evolutionary neuroscience of musical beat perception: The Action Simulation for Auditory Prediction (ASAP) hypothesis. Frontiers in Systems Neuroscience, 8:57, 2014). We used a TMS protocol, continuous theta burst stimulation (cTBS), which is known to down-regulate cortical activity for up to 60 min following stimulation, to test for causal contributions to beat-based timing perception. cTBS target areas included the left posterior parietal cortex (lPPC), which is part of the dorsal auditory stream, and the left supplementary motor area (lSMA). We hypothesized that down-regulating lPPC would interfere with accurate beat-based perception by disrupting the dorsal auditory stream, while leaving absolute (interval) timing ability intact. We also predicted that down-regulating lSMA, which is not part of the dorsal auditory stream but has been implicated in internally timed movements, would interfere with accurate beat-based timing perception. We show (n = 25) that cTBS down-regulation of lPPC does interfere with beat-based timing ability, but only with the ability to detect shifts in beat phase, not changes in tempo. Down-regulation of lSMA, in contrast, did not interfere with beat-based timing. As expected, absolute interval timing ability was not affected by down-regulation of either lPPC or lSMA. These results indicate that the dorsal auditory stream plays an essential role in accurate phase perception in beat-based timing. We find no evidence of an essential role of the parietal cortex or SMA in interval timing.
We investigated Bayesian modelling of human whole-body motion capture data recorded during an exploratory real-space navigation task in an "Audiomaze" environment (see the companion paper by Miyakoshi et al. in the same volume) to study the effect of map learning on navigation behaviour. There were three models: a feedback-only model (no map learning), a map-resetting model (single-trial map learning), and a map-updating model (map learning accumulated across three trials). The estimated behavioural variables were step sizes and turning angles. Results showed that the estimated step sizes were consistently more accurate under the map-learning models than under the feedback-only model. The same effect was confirmed for turning-angle estimates, but only for data from the third trial. We interpret these results as Bayesian evidence of human map learning affecting navigation behaviour. Furthermore, separating the participants into groups of egocentric and allocentric navigators revealed an advantage for the map-updating model in estimating step sizes, but only for the allocentric navigators. This interaction indicates that allocentric navigators may take more advantage of map learning than egocentric navigators do. We discuss the relationship of these results to the simultaneous localization and mapping (SLAM) problem.
Splitting the participants into two subgroups, allocentric navigators and egocentric navigators, showed a similar advantage of the map-based models over the feedback-only model in both groups. In the allocentric navigators only, there was a further advantage of the map-updating model over the map-resetting model. The observation that the map-updating model best fit the allocentric navigators' behaviour is consistent with the idea that allocentric navigators may have been better at exploring the maze based on the mental maps they built, an advantage that may be further reinforced by repeated navigation.
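The model-comparison logic behind this approach can be illustrated with a minimal sketch: score competing predictors of a behavioural variable (here, step size) by their Gaussian log-likelihood on observed data. The "feedback-only" and "map-updating" predictors below are hypothetical stand-ins, far simpler than the study's actual Bayesian models, and the data are synthetic.

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic walk: a constant underlying step size plus observation noise
# (purely illustrative; the real data are motion-capture step sizes)
observed = 0.5 + rng.normal(scale=0.05, size=200)

def log_likelihood(predictions, observed, sigma=0.05):
    """Gaussian log-likelihood of observed step sizes given model predictions."""
    resid = observed - predictions
    return float(np.sum(-0.5 * (resid / sigma) ** 2
                        - np.log(sigma * np.sqrt(2 * np.pi))))

# "Feedback-only" stand-in: predict each step from the previous observation
feedback_pred = np.concatenate([[observed[0]], observed[:-1]])

# "Map-updating" stand-in: predict from a running average accumulated so far
map_pred = np.concatenate([[observed[0]],
                           np.cumsum(observed)[:-1] / np.arange(1, 200)])

ll_feedback = log_likelihood(feedback_pred, observed)
ll_map = log_likelihood(map_pred, observed)
# The model with the higher log-likelihood better explains the behaviour;
# here the accumulating predictor wins because it averages out the noise.
```

In this toy setting the accumulating ("map-updating") predictor fits better because its running average suppresses observation noise, mirroring the reported advantage of map-learning models over the feedback-only model.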
Predicting and organizing patterns of events is important for humans to survive in a dynamically changing world. The motor system has been proposed to be actively, and necessarily, engaged in not only the production but also the perception of rhythm, organizing hierarchical timing that influences auditory responses. It is not yet well understood how the motor system interacts with the auditory system to perceive and maintain hierarchical structure in time. This study investigated the dynamic interaction between auditory and motor functional sources during the perception and imagination of musical meters. We pursued this using a novel method combining high-density EEG, EMG, and motion capture with independent component analysis to separate motor and auditory activity during meter imagery while robustly controlling against covert movement. We demonstrate that endogenous brain activity in both auditory and motor functional sources reflects the imagination of binary and ternary meters in the absence of corresponding acoustic cues or overt movement at the meter rate. We found clear evidence for the hypothesized motor-to-auditory information flow at the beat rate in all conditions, suggesting a top-down influence of the motor system on auditory processing of beat-based rhythms and reflecting an auditory-motor system with tight reciprocal informational coupling. These findings align with and extend a set of motor hypotheses from beat perception to hierarchical meter imagination, adding evidence for active engagement of the motor system in auditory processing, and may speak more broadly to the neural mechanisms of temporal processing in other human cognitive functions.
Humans live in a world full of hierarchically structured temporal information, the accurate perception of which is essential for understanding speech and music. Music provides a window into the brain mechanisms of time perception, enabling us to examine how the brain groups musical beats into, for example, a march or a waltz. Using a novel paradigm combining measurement of electrical brain activity with data-driven analysis, this study directly investigates motor-auditory connectivity during meter imagination. The findings highlight the importance of the motor system in the active imagination of meter. This study sheds new light on a fundamental form of perception by demonstrating how auditory-motor interaction may support hierarchical timing processing, which may have clinical implications for speech and motor rehabilitation.
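The notion of directed motor-to-auditory information flow can be illustrated with a toy lag-based directionality analysis on synthetic signals. This is a minimal sketch, not the study's actual connectivity method; the sampling rate, conduction delay, and signal definitions below are invented for illustration.

```python
import numpy as np

rng = np.random.default_rng(1)
fs, beat_hz, n = 250, 2.0, 5000          # sample rate (Hz), beat rate (Hz), samples
t = np.arange(n) / fs

# Toy sources: a "motor" beat-rate oscillation that drives an "auditory"
# signal after a 40 ms delay (all numbers purely illustrative)
delay = int(0.040 * fs)
motor = np.sin(2 * np.pi * beat_hz * t) + 0.3 * rng.normal(size=n)
auditory = 0.8 * np.roll(motor, delay) + 0.3 * rng.normal(size=n)
auditory[:delay] = 0.3 * rng.normal(size=delay)   # clear wrap-around samples

def lagged_corr(x, y, lag):
    """Correlation of x[t] with y[t+lag]; positive lag means x leads y."""
    if lag > 0:
        return float(np.corrcoef(x[:-lag], y[lag:])[0, 1])
    return float(np.corrcoef(x, y)[0, 1])

# Find the lag (up to 100 ms) at which "motor" best predicts "auditory"
lags = range(1, int(0.1 * fs))
best_lag = max(lags, key=lambda k: lagged_corr(motor, auditory, k))
forward = lagged_corr(motor, auditory, best_lag)   # motor leads auditory
reverse = lagged_corr(auditory, motor, best_lag)   # auditory leads motor
# forward > reverse indicates motor-to-auditory directed coupling in this toy
```

An asymmetry of this kind (the motor signal predicting the auditory signal better than the reverse) is the intuition behind directed-connectivity measures, though real EEG analyses use considerably more sophisticated estimators.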
The tendency to move in rhythmic synchrony with a musical beat (e.g., via head bobbing, foot tapping, or dance) is a human universal [1], yet is not commonly observed in other species [2]. Does this ability reflect a brain specialization for music cognition, or does it build on neural circuitry that ordinarily serves other functions? According to the "vocal learning and rhythmic synchronization" hypothesis [3], entrainment to a musical beat relies on the neural circuitry for complex vocal learning, an ability that requires a tight link between auditory and motor circuits in the brain [4,5]. This hypothesis predicts that only vocal learning species (such as humans and some birds, cetaceans, and pinnipeds, but not nonhuman primates) are capable of synchronizing movements to a musical beat. Here we report experimental evidence for synchronization to a beat in a sulphur-crested cockatoo (Cacatua galerita eleonora). By manipulating the tempo of a musical excerpt across a wide range, we show that the animal spontaneously adjusts the tempo of its rhythmic movements to stay synchronized with the beat. These findings indicate that synchronization to a musical beat is not uniquely human and suggest that animal models can provide insights into the neurobiology and evolution of human music [6].
Many aspects of perception are known to be shaped by experience, but others are thought to be innate universal properties of the brain. A specific example comes from rhythm perception, where one of ...the fundamental perceptual operations is the grouping of successive events into higher-level patterns, an operation critical to the perception of language and music. Grouping has long been thought to be governed by innate perceptual principles established a century ago. The current work demonstrates instead that grouping can be strongly dependent on culture. Native English and Japanese speakers were tested for their perception of grouping of simple rhythmic sequences of tones. Members of the two cultures showed different patterns of perceptual grouping, demonstrating that these basic auditory processes are not universal but are shaped by experience. It is suggested that the observed perceptual differences reflect the rhythms of the two languages, and that native language can exert an influence on general auditory perception at a basic level.
Music engagement is a powerful, influential experience that often begins early in life. Music engagement is moderately heritable in adults (~41–69%), but fewer studies have examined genetic influences on childhood music engagement, including its association with language and executive functions. Here we explored genetic and environmental influences on music listening and instrument playing (including singing) in the baseline assessment of the Adolescent Brain Cognitive Development study. Parents reported on their 9–10-year-old children's music experiences (N = 11,876 children; N = 1543 from twin pairs). Both music measures were explained primarily by shared environmental influences. Instrument exposure (but not frequency of instrument engagement) was associated with language skills (r = .27) and executive functions (r = .15–.17), and these associations with instrument engagement were stronger than those for music listening, visual art, or soccer engagement. These findings highlight the role of shared environmental influences between early music experiences, language, and executive function during a formative time in development.
Spontaneous movement to music occurs in every human culture and is a foundation of dance [1]. This response to music is absent in most species (including monkeys), yet it occurs in parrots, perhaps because they (like humans, and unlike monkeys) are vocal learners whose brains contain strong auditory-motor connections, conferring sophisticated audiomotor processing abilities [2,3]. Previous research has shown that parrots can bob their heads or lift their feet in synchrony with a musical beat [2,3], but humans move to music using a wide variety of movements and body parts. Is this also true of parrots? If so, it would constrain theories of how movement to music is controlled by parrot brains. Specifically, as head bobbing is part of parrot courtship displays [4] and foot lifting is part of locomotion, these may be innate movements controlled by central pattern generators that become entrained by auditory rhythms, without the involvement of complex motor planning. This would be unlike humans, in whom movement to music engages cortical networks including frontal and parietal areas [5]. Rich diversity in parrot movement to music would instead suggest a strong contribution of forebrain regions to this behavior, perhaps including motor learning regions abutting the complex vocal-learning 'shell' regions that are unique to parrots among vocal learning birds [6]. Here we report that a sulphur-crested cockatoo (Cacatua galerita eleonora) responds to music with remarkably diverse spontaneous movements employing a variety of body parts, and we suggest why parrots share this response with humans.
Jao Keehn et al. show that a human-raised sulphur-crested cockatoo (a type of parrot) exhibits a remarkably diverse set of spontaneous movements in response to music, and suggest why parrots, perhaps uniquely among non-human animals, share this response with humans.