Conceptual knowledge reflects our multi-modal ‘semantic database’. As such, it brings meaning to all verbal and non-verbal stimuli, is the foundation for verbal and non-verbal expression and provides the basis for computing appropriate semantic generalizations. Multiple disciplines (e.g. philosophy, cognitive science, cognitive neuroscience and behavioural neurology) have striven to answer the questions of how concepts are formed, how they are represented in the brain and how they break down differentially in various neurological patient groups. A long-standing and prominent hypothesis is that concepts are distilled from our multi-modal verbal and non-verbal experience such that sensation in one modality (e.g. the smell of an apple) not only activates the intramodality long-term knowledge, but also reactivates the relevant intermodality information about that item (i.e. all the things you know about and can do with an apple). This multi-modal view of conceptualization fits with contemporary functional neuroimaging studies that observe systematic variation of activation across different modality-specific association regions dependent on the conceptual category or type of information. A second vein of interdisciplinary work argues, however, that even a smorgasbord of multi-modal features is insufficient to build coherent, generalizable concepts. Instead, an additional process or intermediate representation is required. Recent multidisciplinary work, which combines neuropsychology, neuroscience and computational models, offers evidence that conceptualization follows from a combination of modality-specific sources of information plus a transmodal ‘hub’ representational system that is supported primarily by regions within the anterior temporal lobe, bilaterally.
Semantic cognition requires a combination of semantic representations and executive control processes to direct activation in a task- and time-appropriate fashion (Jefferies, E., & Lambon Ralph, M. A. Semantic impairment in stroke aphasia versus semantic dementia: A case-series comparison, 2132–2147, 2006).
We undertook a formal meta-analysis to investigate which regions within the large-scale semantic network are specifically associated with the executive component of semantic cognition. Previous studies have described in detail the role of left ventral pFC in semantic regulation. We examined 53 studies that contrasted semantic tasks with high > low executive requirements to determine whether cortical regions beyond the left pFC show the same response profile to executive semantic demands. Our findings revealed that right pFC, posterior middle temporal gyrus (pMTG) and dorsal angular gyrus (bordering the intraparietal sulcus) were also consistently recruited by executively demanding semantic tasks, demonstrating patterns of activation highly similar to those of left ventral pFC. These regions overlap with the lesions of aphasic patients who exhibit multimodal semantic impairment because of impaired regulatory control (semantic aphasia), providing important convergence between functional neuroimaging and neuropsychological studies of semantic cognition. Activation in dorsal angular gyrus and left ventral pFC was consistent across all types of executive semantic manipulation, regardless of whether the task was receptive or expressive, whereas pMTG activation was observed only when control demands were manipulated within receptive tasks. In a second analysis, we contrasted executively demanding tasks tapping semantics and phonology. Our findings revealed substantial overlap between the two sets of contrasts within left ventral pFC, suggesting that this region underpins domain-general control mechanisms. In contrast, we observed relative specialization for semantic control within pMTG as well as the most ventral aspects of left pFC (BA 47), consistent with our proposal of a distributed network underpinning semantic control.
The anterior temporal lobe (ATL) makes a critical contribution to semantic cognition. However, the functional connectivity of the ATL and the functional network underlying semantic cognition has not been elucidated. In addition, subregions of the ATL have distinct functional properties and thus the potential differential connectivity between these subregions requires investigation. We explored these aims using both resting-state and active semantic task data in humans in combination with a dual-echo gradient echo planar imaging (EPI) paradigm designed to ensure signal throughout the ATL. In the resting-state analysis, the ventral ATL (vATL) and anterior middle temporal gyrus (MTG) were shown to connect to areas responsible for multimodal semantic cognition, including bilateral ATL, inferior frontal gyrus, medial prefrontal cortex, angular gyrus, posterior MTG, and medial temporal lobes. In contrast, the anterior superior temporal gyrus (STG)/superior temporal sulcus was connected to a distinct set of auditory and language-related areas, including bilateral STG, precentral and postcentral gyri, supplementary motor area, supramarginal gyrus, posterior temporal cortex, and inferior and middle frontal gyri. Complementary analyses of functional connectivity during an active semantic task were performed using a psychophysiological interaction (PPI) analysis. The PPI analysis highlighted the same semantic regions, suggesting a core semantic network active during both rest and task states. This supports the necessity for semantic cognition in internal processes occurring during rest. The PPI analysis showed additional connectivity of the vATL to regions of occipital and frontal cortex. These areas strongly overlap with regions found to be sensitive to executively demanding, controlled semantic processing.
Previous studies have shown that semantic cognition depends on subregions of the anterior temporal lobe (ATL). However, the network of regions functionally connected to these subregions has not been demarcated. Here, we show that these ventrolateral anterior temporal subregions form part of a network responsible for semantic processing during both rest and an explicit semantic task. This demonstrates the existence of a core functional network responsible for multimodal semantic cognition regardless of state. Distinct connectivity is identified in the superior ATL, which is connected to auditory and language areas. Understanding the functional connectivity of semantic cognition allows greater understanding of how this complex process may be performed and the role of distinct subregions of the anterior temporal cortex.
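At its core, the resting-state analysis described above reduces to seed-based connectivity: correlate the mean time course of a seed region (e.g. the vATL) with the time course of every other voxel. The sketch below illustrates that computation on synthetic data; all names, array sizes and the toy coupling are invented for illustration, and the real pipeline's preprocessing (dual-echo combination, nuisance regression, spatial normalization) is omitted.

```python
import numpy as np

def seed_connectivity(seed_ts, voxel_ts):
    """Correlate a seed region's mean time course with every voxel.

    seed_ts : (T,) array, mean BOLD signal of the seed region
    voxel_ts: (T, V) array, BOLD signal of V voxels over T volumes
    Returns one Pearson r per voxel.
    """
    seed = (seed_ts - seed_ts.mean()) / seed_ts.std()
    vox = (voxel_ts - voxel_ts.mean(axis=0)) / voxel_ts.std(axis=0)
    # dot product of z-scored series divided by T = Pearson correlation
    return (seed @ vox) / len(seed)

# Toy example: 200 volumes, 3 voxels; voxel 0 is coupled to the seed.
rng = np.random.default_rng(0)
seed = rng.standard_normal(200)
voxels = rng.standard_normal((200, 3))
voxels[:, 0] += 2 * seed  # voxel 0 tracks the seed signal
r = seed_connectivity(seed, voxels)
```

Voxels whose r survives thresholding across subjects would form the seed's functional network; a PPI analysis additionally asks whether this coupling changes between task conditions.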
Traditional neurological models of language were based on a single neural pathway (the dorsal pathway underpinned by the arcuate fasciculus). Contemporary neuroscience indicates that anterior temporal regions and the “ventral” language pathway also make a significant contribution, yet there is no computationally implemented model of the dual pathway, nor any synthesis of normal and aphasic behavior. The “Lichtheim 2” model was implemented by developing a new variety of computational model which reproduces and explains normal and patient data but also incorporates neuroanatomical information into its architecture. By bridging the “mind–brain” gap in this way, the resultant “neurocomputational” model provides a unique opportunity to explore the relationship between lesion location and behavioral deficits, and to provide a platform for simulating functional neuroimaging data.
► A computational implementation of the classic Lichtheim language model
► Incorporation of ventral and dorsal “pathways” into the language model
► A formal simulation of classic and progressive types of aphasia
► Incorporation of neuroanatomical constraints into the computational architecture
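The actual Lichtheim 2 model is a trained recurrent network; as a loose illustration of the dual-pathway idea and of lesion simulation, the sketch below routes an auditory input to a speech output through separable “dorsal” and “ventral” layers and simulates damage by attenuating one route's weights. All layer sizes and weight values are invented, and no training is performed; this shows only the architectural principle, not the published model.

```python
import numpy as np

rng = np.random.default_rng(1)

def layer(n_in, n_out):
    # random untrained weights, purely for illustration
    return rng.standard_normal((n_in, n_out)) * 0.5

W_dorsal  = layer(10, 8)   # auditory -> dorsal (phonological) route
W_ventral = layer(10, 8)   # auditory -> ventral (semantic) route
W_out_d   = layer(8, 10)   # dorsal route -> speech output
W_out_v   = layer(8, 10)   # ventral route -> speech output

def speak(audio, lesion_dorsal=0.0, lesion_ventral=0.0):
    """Propagate an auditory input to speech output. `lesion_*`
    attenuates that route's weights by the given fraction, a crude
    stand-in for focal damage (1.0 = pathway fully severed)."""
    d = np.tanh(audio @ (W_dorsal * (1 - lesion_dorsal)))
    v = np.tanh(audio @ (W_ventral * (1 - lesion_ventral)))
    return np.tanh(d @ W_out_d + v @ W_out_v)

audio = rng.standard_normal(10)
intact = speak(audio)
lesioned = speak(audio, lesion_dorsal=1.0)   # sever the dorsal route
error = np.linalg.norm(intact - lesioned)    # crude deficit measure
```

In the trained model, lesioning different loci along the two routes reproduces the distinct error profiles of conduction, Wernicke-type and semantic-dementia-like impairment.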
Most contemporary theories of semantic memory assume that concepts are formed from the distillation of information arising in distinct sensory and verbal modalities. The neural basis of this distillation or convergence of information was the focus of this study. Specifically, we explored two commonly posed hypotheses: (a) that the human middle temporal gyrus (MTG) provides a crucial semantic interface given the fact that it interposes auditory and visual processing streams and (b) that the anterior temporal region, especially its ventral surface (vATL), provides a critical region for the multimodal integration of information. By utilizing distortion-corrected fMRI and an established semantic association assessment (commonly used in neuropsychological investigations), we compared the activation patterns observed for both the verbal and nonverbal versions of the same task. The results are consistent with the two hypotheses simultaneously: both MTG and vATL are activated in common for word and picture semantic processing. Additional planned ROI analyses show that this result follows from two principal axes of convergence in the temporal lobe: both lateral (toward MTG) and longitudinal (toward the anterior temporal lobe).
The anterior temporal lobes (ATL) have become a key brain region of interest in cognitive neuroscience founded upon neuropsychological investigations of semantic dementia (SD). The purposes of this investigation are to generate a single unified model that captures the known cognitive-behavioural variations in SD and to map these to the patients' distribution of frontotemporal atrophy. Here we show that the degree of generalised semantic impairment is related to the patients' total, bilateral ATL atrophy. Verbal production ability is related to total ATL atrophy as well as to the balance of left > right ATL atrophy. Apathy is found to relate positively to the degree of orbitofrontal atrophy. Disinhibition is related to right ATL and orbitofrontal atrophy, and face recognition to right ATL volumes. Rather than positing mutually exclusive sub-categories, the data-driven model repositions semantics, language, social behaviour and face recognition into a continuous frontotemporal neurocognitive space.
Semantic cognition refers to our ability to use, manipulate and generalize knowledge that is acquired over the lifespan to support innumerable verbal and non-verbal behaviours. This Review summarizes key findings and issues arising from a decade of research into the neurocognitive and neurocomputational underpinnings of this ability, leading to a new framework that we term controlled semantic cognition (CSC). CSC offers solutions to long-standing queries in philosophy and cognitive science, and yields a convergent framework for understanding the neural and computational bases of healthy semantic cognition and its dysfunction in brain disorders.
The roles of the right and left anterior temporal lobes (ATLs) in conceptual knowledge are a source of debate among four conflicting accounts. Possible ATL specializations include: (1) processing of verbal versus non-verbal inputs; (2) the involvement of word retrieval; and (3) the social content of the stimuli. Conversely, the "hub-and-spoke" account holds that both ATLs form a bilateral functionally unified system. Using activation likelihood estimation (ALE) to compare the probability of left and right ATL activation, we analyzed 97 functional neuroimaging studies of conceptual knowledge, organized according to the predictions of the three specialized hypotheses. The primary result was that ATL activation was predominately bilateral and highly overlapping for all stimulus types. Secondary to this bilateral representation, there were subtle gradations both between and within the ATLs. Activations were more likely to be left-lateralized when the input was a written word or when word retrieval was required. These data are best accommodated by a graded version of the hub-and-spoke account, whereby representation of conceptual knowledge is supported through bilateral yet graded connectivity between the ATLs and various modality-specific sensory, motor, and limbic cortices.
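Activation likelihood estimation, the method used in the meta-analysis above, models each reported activation focus as a Gaussian probability blob, caps each study's "modelled activation" map at 1, and combines studies by taking the union of those maps. The 1-D sketch below uses hypothetical coordinates and a fixed kernel width; real ALE operates on 3-D MNI coordinates with sample-size-dependent kernels and permutation-based thresholding, none of which is shown here.

```python
import numpy as np

def ale_map(foci_per_study, grid, sigma=8.0):
    """Simplified 1-D activation likelihood estimation.

    Each study's foci are blurred with a Gaussian kernel and capped
    at 1 to give a modelled-activation (MA) map; the ALE value at
    each grid point is the union across studies:
        ALE = 1 - prod_i (1 - MA_i)
    """
    one_minus = np.ones_like(grid, dtype=float)
    for foci in foci_per_study:
        ma = np.zeros_like(grid, dtype=float)
        for f in foci:
            ma += np.exp(-((grid - f) ** 2) / (2 * sigma ** 2))
        ma = np.clip(ma, 0.0, 1.0)
        one_minus *= 1.0 - ma
    return 1.0 - one_minus

grid = np.arange(-100, 101, 2.0)      # hypothetical 1-D axis in mm
studies = [[-55, -50], [-52], [60]]   # made-up foci from 3 studies
ale = ale_map(studies, grid)
```

High ALE values mark locations where foci converge across independent studies; in the meta-analysis, left- versus right-ATL convergence was compared across stimulus and task types.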
To understand the meanings of words and objects, we need to have knowledge about these items themselves plus executive mechanisms that compute and manipulate semantic information in a ...task-appropriate way. The neural basis for semantic control remains controversial. Neuroimaging studies have focused on the role of the left inferior frontal gyrus (LIFG), whereas neuropsychological research suggests that damage to a widely distributed network elicits impairments of semantic control. There is also debate about the relationship between semantic and executive control more widely. We used TMS in healthy human volunteers to create “virtual lesions” in structures typically damaged in patients with semantic control deficits: LIFG, left posterior middle temporal gyrus (pMTG), and intraparietal sulcus (IPS). The influence of TMS on tasks varying in semantic and nonsemantic control demands was examined for each region within this hypothesized network to gain insights into (i) their functional specialization (i.e., involvement in semantic representation, controlled retrieval, or selection) and (ii) their domain dependence (i.e., semantic or cognitive control). The results revealed that LIFG and pMTG jointly support both the controlled retrieval and selection of semantic knowledge. IPS specifically participates in semantic selection and responds to manipulations of nonsemantic control demands. These observations are consistent with a large-scale semantic control network, as predicted by lesion data, that draws on semantic-specific (LIFG and pMTG) and domain-independent executive components (IPS).
Semantic ambiguity is typically measured by summing the number of senses or dictionary definitions that a word has. Such measures are somewhat subjective and may not adequately capture the full extent of variation in word meaning, particularly for polysemous words that can be used in many different ways, with subtle shifts in meaning. Here, we describe an alternative, computationally derived measure of ambiguity based on the proposal that the meanings of words vary continuously as a function of their contexts. On this view, words that appear in a wide range of contexts on diverse topics are more variable in meaning than those that appear in a restricted set of similar contexts. To quantify this variation, we performed latent semantic analysis on a large text corpus to estimate the semantic similarities of different linguistic contexts. From these estimates, we calculated the degree to which the different contexts associated with a given word vary in their meanings. We term this quantity a word’s semantic diversity (SemD). We suggest that this approach provides an objective way of quantifying the subtle, context-dependent variations in word meaning that are often present in language. We demonstrate that SemD is correlated with other measures of ambiguity and contextual variability, as well as with frequency and imageability. We also show that SemD is a strong predictor of performance in semantic judgments in healthy individuals and in patients with semantic deficits, accounting for unique variance beyond that of other predictors. SemD values for over 30,000 English words are provided as supplementary materials.
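A toy version of the SemD pipeline can be sketched in a few lines: build a term-by-context count matrix, reduce it with SVD as a minimal stand-in for latent semantic analysis, then take the negative log of the mean pairwise cosine similarity among the contexts containing a word. The corpus, the context granularity and the formulation below are invented for illustration; the published measure used long (roughly 1,000-word) contexts from a large corpus and additional corrections.

```python
import numpy as np
from itertools import combinations

# Tiny hypothetical "corpus" of contexts; 'bank' occurs in dissimilar
# contexts (finance vs. rivers), 'river' only in similar ones.
contexts = [
    "the bank approved the loan and the mortgage",
    "the bank raised its interest rate on savings",
    "the river bank was muddy after the flood",
    "fish swam near the river bank in the shallows",
]
vocab = sorted({w for c in contexts for w in c.split()})
idx = {w: i for i, w in enumerate(vocab)}

# term-by-context count matrix, then SVD (LSA stand-in)
counts = np.zeros((len(vocab), len(contexts)))
for j, c in enumerate(contexts):
    for w in c.split():
        counts[idx[w], j] += 1
U, s, Vt = np.linalg.svd(counts, full_matrices=False)
ctx_vecs = (np.diag(s) @ Vt).T  # one latent vector per context

def cos(a, b):
    return a @ b / (np.linalg.norm(a) * np.linalg.norm(b))

def semd(word):
    """Negative log of the mean pairwise cosine similarity among the
    contexts containing `word` (needs at least two such contexts)."""
    js = [j for j, c in enumerate(contexts) if word in c.split()]
    sims = [cos(ctx_vecs[a], ctx_vecs[b]) for a, b in combinations(js, 2)]
    return -np.log(np.mean(sims))
```

Because 'bank' spans two unrelated topics while 'river' keeps to one, `semd("bank")` comes out higher than `semd("river")`, matching the intuition that SemD tracks contextual variability rather than dictionary sense counts.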