Substance-use disorders are a global public health problem that arises from behavioral misallocation between drug use and more adaptive behaviors maintained by nondrug alternatives (e.g., food or money). Preclinical drug self-administration procedures that incorporate a concurrently available nondrug reinforcer (e.g., food) provide translationally relevant and distinct dependent measures of behavioral allocation (i.e., to assess the relative reinforcing efficacy of the drug) and behavioral rate (i.e., to assess motor competence). In particular, preclinical drug versus food ‘choice’ procedures have produced increasingly concordant results with both human laboratory drug self-administration studies and double-blind placebo-controlled clinical trials. Accordingly, here we provide a heuristic framework of substance-use disorders based on a behavioral-centric perspective and recent insights from these preclinical choice procedures.
Substance use disorders represent a global public health issue. This mental health disorder is hypothesized to result from neurobiological changes produced by chronic drug exposure and clinically manifests as inappropriate behavioral allocation toward the procurement and use of the abused substance and away from other behaviors maintained by more adaptive nondrug reinforcers (e.g., social relationships, work). The dynorphin/kappa-opioid receptor (KOR) system is one receptor system altered following chronic exposure to drugs of abuse (e.g., cocaine, opioids, alcohol) in both laboratory animals and humans, implicating the dynorphin/KOR system in the expression, mechanisms, and treatment of substance use disorders. KOR antagonists have reduced drug self-administration in laboratory animals under certain experimental conditions, but not others. Recently, several human laboratory studies and clinical trials have evaluated the effectiveness of KOR antagonists as candidate pharmacotherapies for cocaine or tobacco use disorder to test hypotheses generated from preclinical studies. KOR antagonists failed to significantly alter drug use metrics in humans, a result discordant with some preclinical drug self-administration studies but consistent with other preclinical drug self-administration studies that provided concurrent access to an alternative nondrug reinforcer (e.g., food). The implications of this translational discordance and future directions for examining the therapeutic potential of KOR agonists or antagonists as candidate substance use disorder pharmacotherapies are discussed.
Critical features of human addiction are increasingly being incorporated into complementary animal models, including escalation of drug intake, punished drug seeking and taking, intermittent drug access, choice between drug and non-drug rewards, and assessment of individual differences based on criteria in the fourth edition of the Diagnostic and Statistical Manual of Mental Disorders (DSM-IV). Combined with new technologies, these models advanced our understanding of brain mechanisms of drug self-administration and relapse, but these mechanistic gains have not led to improvements in addiction treatment. This problem is not unique to addiction neuroscience, but it is an increasing source of disappointment and calls to regroup. Here we first summarize behavioural and neurobiological results from the animal models mentioned above. We then propose a reverse translational approach, whose goal is to develop models that mimic successful treatments: opioid agonist maintenance, contingency management and the community-reinforcement approach. These reverse-translated 'treatments' may provide an ecologically relevant platform from which to discover new circuits, test new medications and improve translation.
General anesthetics have been used to ablate consciousness during surgery for more than 150 yr. Despite significant advances in our understanding of their molecular-level pharmacologic effects, comparatively little is known about how anesthetics alter brain dynamics to cause unconsciousness. Consequently, while anesthesia practice is now routine and safe, there are many vagaries that remain unexplained. In this paper, the authors review the evidence that cortical network activity is particularly sensitive to general anesthetics, and suggest that disruption to communication in, and/or among, cortical brain regions is a common mechanism of anesthesia that ultimately produces loss of consciousness. The authors review data from acute brain slices and organotypic cultures showing that anesthetics with differing molecular mechanisms of action share in common the ability to impair neurophysiologic communication. While many questions remain, together, ex vivo and in vivo investigations suggest that a unified understanding of both clinical anesthesia and the neural basis of consciousness is attainable.
Rationale
The adverse consequences of human addictive drug use could result either from addictive drug consumption that produces punishment (e.g., incarceration) or from failure to engage in negatively reinforced behaviors that might compete with drug-maintained behaviors (e.g., contingency management strategies that reset payment amounts for drug-free urine samples).
Objective
The goal of the present study was to establish a discrete-trial cocaine-vs-negative reinforcer (SNR) choice procedure where rats were presented with a simplified model of this conflict: choose negative reinforcement (i.e., escape or avoid foot shock) or choose an intravenous (IV) cocaine infusion followed by an inescapable shock.
Methods
Responding was maintained in male and female rats by IV cocaine infusions (0.32–1.8 mg/kg/inf) and an SNR (0.1–0.7 mA shock) under a discrete-trial concurrent “choice” schedule during daily sessions. Following parametric reinforcer-magnitude and response-requirement experiments, the effects of 12-h extended-access cocaine self-administration and acute diazepam pretreatment (0.32–10 mg/kg, IP) were determined on cocaine-vs-SNR choice.
Results
Negative reinforcement was chosen over all cocaine doses. Lowering shock magnitude or increasing the SNR response requirement failed to promote behavioral reallocation toward cocaine. Extended-access cocaine self-administration sessions resulted in high daily cocaine intakes but failed to significantly increase cocaine choice in all but one of 19 rats. Acute diazepam pretreatment also did not alter choice behavior up to doses that produced behavioral depression.
Conclusions
These results suggest that SNRs may be a source of reinforcement that effectively competes with and mitigates maladaptive addictive drug-maintained behaviors in the general population.
Disruption of cortical connectivity likely contributes to loss of consciousness (LOC) during both sleep and general anesthesia, but the degree of overlap in the underlying mechanisms is unclear. Both sleep and anesthesia comprise states of varying levels of arousal and consciousness, including states of largely maintained conscious experience (sleep: N1, REM; anesthesia: sedated but responsive) as well as states of substantially reduced conscious experience (sleep: N2/N3; anesthesia: unresponsive). Here, we tested the hypotheses that (1) cortical connectivity will exhibit clear changes when transitioning into states of reduced consciousness, and (2) these changes will be similar for arousal states of comparable levels of consciousness during sleep and anesthesia. Using intracranial recordings from five adult neurosurgical patients, we compared resting-state cortical functional connectivity (as measured by the weighted phase lag index, wPLI) in the same subjects across arousal states during natural sleep (wake (WS), N1, N2, N3, REM) and propofol anesthesia (pre-drug wake (WA), sedated/responsive (S), and unresponsive (U)). Analysis of alpha-band connectivity indicated a transition boundary distinguishing states of maintained and reduced conscious experience in both sleep and anesthesia. In wake states WS and WA, alpha-band wPLI within the temporal lobe was dominant. This pattern was largely unchanged in N1, REM, and S. Transitions into states of reduced consciousness N2, N3, and U were characterized by dramatic changes in connectivity, with dominant connections shifting to prefrontal cortex. Secondary analyses indicated similarities in the reorganization of cortical connectivity in sleep and anesthesia. Shifts from temporal to frontal cortical connectivity may reflect impaired sensory processing in states of reduced consciousness.
The data indicate that functional connectivity can serve as a biomarker of arousal state and suggest common mechanisms of LOC in sleep and anesthesia.
• Mechanisms of loss of consciousness are key for basic science and clinical practice.
• Loss of consciousness during sleep and anesthesia shares common mechanisms.
• We studied this using intracranial electrophysiology in neurosurgical patients.
• Loss of consciousness was characterized by similar shifts in brain connectivity.
• The findings will aid improvements in diagnosis of disorders of consciousness.
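The connectivity measure used in the study above is the weighted phase lag index. As an illustrative sketch only (not the authors' analysis pipeline), wPLI between two already band-pass-filtered signals can be estimated from the imaginary part of the cross-spectrum of their analytic signals:

```python
import numpy as np
from scipy.signal import hilbert

def wpli(x, y):
    """Weighted phase lag index between two narrow-band signals.

    wPLI = |mean(Im(Sxy))| / mean(|Im(Sxy)|), where Sxy is the
    cross-spectrum of the analytic (Hilbert-transformed) signals.
    Returns a value in [0, 1]; 0 is returned for purely zero-lag
    (real-valued) coupling, where the denominator vanishes.
    """
    sx = hilbert(x)                     # analytic signal of x
    sy = hilbert(y)                     # analytic signal of y
    im = np.imag(sx * np.conj(sy))      # imaginary part of cross-spectrum
    denom = np.mean(np.abs(im))
    return float(np.abs(np.mean(im)) / denom) if denom > 0 else 0.0

# Toy check: two 10 Hz sinusoids with a consistent 45-degree lag
t = np.linspace(0, 1, 1000, endpoint=False)
x = np.sin(2 * np.pi * 10 * t)
y = np.sin(2 * np.pi * 10 * t - np.pi / 4)
lagged = wpli(x, y)                     # near 1: consistent nonzero lag
```

Because purely real (zero-lag) cross-spectral contributions drop out of the numerator and denominator alike, wPLI discounts instantaneous coupling such as volume conduction, which is one reason it is favored for intracranial and scalp EEG connectivity.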
Substance use disorders are diagnosed as a manifestation of inappropriate behavioral allocation toward abused drugs and away from other behaviors maintained by more adaptive nondrug reinforcers (e.g., money and social relationships). Substance use disorder treatment goals include not only decreasing drug‐maintained behavior but also promoting behavioral reallocation toward these socially adaptive alternative reinforcers. Preclinical drug self‐administration procedures that offer concurrent access to both drug and nondrug reinforcers provide a translationally relevant dependent measure of behavioral allocation that may be useful for candidate medication evaluation. In contrast to other abused drugs, such as heroin or cocaine, preclinical methamphetamine versus food choice procedures have been a more recent development. We hypothesize that preclinical to clinical translatability would be improved by the evaluation of repeated pharmacological treatment effects on methamphetamine self‐administration under a methamphetamine versus food choice procedure. In support of this hypothesis, a literature review suggests strong concordance between preclinical pharmacological treatment effects on methamphetamine versus food choice in nonhuman primates and clinical medication treatment effects on methamphetamine self‐administration in human laboratory studies or methamphetamine abuse metrics in clinical trials. In conclusion, this literature suggests preclinical methamphetamine versus food choice procedures may be useful in developing innovative pharmacotherapies for methamphetamine use disorder.
Delirium is associated with electroencephalogram (EEG) slowing and impairments in connectivity. We hypothesized that delirium would be accompanied by a reduction in the available cortical information (i.e., less information processing occurring), as measured by a surrogate, Lempel-Ziv complexity (LZC), a measure of time-domain complexity. Two ongoing perioperative cohort studies (NCT03124303, NCT02926417) contributed EEG data from 91 patients before and after surgery; 89 participants were used in the analyses. After cleaning and filtering (0.1–50 Hz), the perioperative change in LZC and in LZC normalized (LZCn) to a phase-shuffled distribution were calculated. The primary outcome was the correlation of within-patient paired changes in delirium severity (Delirium Rating Scale-98, DRS) and LZC. Scalp-wide threshold-free cluster enhancement was employed for multiple-comparison correction. LZC negatively correlated with DRS in a scalp-wide manner (peak channel r2 = .199, p < .001). This whole-brain effect remained for LZCn, though the correlations were weaker (peak channel r2 = .076, p = .010). Delirium diagnosis was similarly associated with decreases in LZC (peak channel p < .001). For LZCn, the topological significance was constrained to the midline posterior regions (peak channel p = .006). We found a negative correlation of LZC in the posterior and temporal regions with monocyte chemoattractant protein-1 (peak channel r2 = .264, p < .001, n = 47) but not for LZCn. Complexity of the EEG signal fades proportionately to delirium severity, implying reduced cortical information. Peripheral inflammation, as assessed by monocyte chemoattractant protein-1, does not entirely account for this effect, suggesting that additional pathogenic mechanisms are involved.
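The complexity measure above can be illustrated with a toy implementation. The sketch below uses an LZ78-style phrase-counting variant (a simplification; the cited studies may use the LZ76 algorithm and phase-shuffled rather than permutation surrogates): the signal is binarized at its median, distinct phrases are counted, and normalization against shuffled copies parallels the LZCn measure:

```python
import numpy as np

def binarize(signal):
    """Threshold a 1-D signal at its median (a common choice for EEG LZC)."""
    med = np.median(signal)
    return ''.join('1' if s > med else '0' for s in signal)

def lz_complexity(seq):
    """Count distinct phrases in a left-to-right LZ78-style parse."""
    phrases, w, c = set(), '', 0
    for ch in seq:
        w += ch
        if w not in phrases:       # phrase not seen before: start a new one
            phrases.add(w)
            c += 1
            w = ''
    return c + (1 if w else 0)     # count a trailing incomplete phrase

def normalized_lz(signal, n_shuffles=20, seed=0):
    """Normalize LZC against randomly permuted surrogates (a simple stand-in
    for the phase-shuffled normalization described in the abstract)."""
    rng = np.random.default_rng(seed)
    raw = lz_complexity(binarize(signal))
    surrogate = np.mean([lz_complexity(binarize(rng.permutation(signal)))
                         for _ in range(n_shuffles)])
    return raw / surrogate
```

A regular signal (e.g., a sinusoid) yields a short phrase dictionary and a normalized value well below 1, while shuffling destroys its temporal structure and drives complexity toward that of noise; the delirium finding is that real EEG drifts toward the low-complexity end as severity increases.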
Understanding central auditory processing critically depends on defining underlying auditory cortical networks and their relationship to the rest of the brain. We addressed these questions using resting-state functional connectivity derived from human intracranial electroencephalography. Mapping recording sites into a low-dimensional space where proximity represents functional similarity revealed a hierarchical organization. At a fine scale, a group of auditory cortical regions excluded several higher-order auditory areas and segregated maximally from the prefrontal cortex. At the mesoscale, the proximity of limbic structures to the auditory cortex suggested a limbic stream that parallels the classically described ventral and dorsal auditory processing streams. Identities of global hubs in anterior temporal and cingulate cortex depended on frequency band, consistent with diverse roles in semantic and cognitive processing. At a macroscale, observed hemispheric asymmetries were not specific for speech and language networks. This approach can be applied to multivariate brain data with respect to development, behavior, and disorders.
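The low-dimensional mapping described above, in which proximity reflects functional similarity, can be sketched with classical multidimensional scaling. This is an illustrative stand-in (the study's actual embedding method may differ): connectivity is converted to a distance and embedded via eigendecomposition of the double-centered Gram matrix:

```python
import numpy as np

def classical_mds(dist, k=2):
    """Embed points in k dimensions so that Euclidean distances
    approximate the entries of the supplied distance matrix."""
    n = dist.shape[0]
    J = np.eye(n) - np.ones((n, n)) / n      # centering matrix
    B = -0.5 * J @ (dist ** 2) @ J           # double-centered Gram matrix
    w, v = np.linalg.eigh(B)                 # eigenvalues in ascending order
    top = np.argsort(w)[::-1][:k]            # keep the k largest
    return v[:, top] * np.sqrt(np.maximum(w[top], 0.0))

def connectivity_to_embedding(conn, k=2):
    """Map a functional connectivity matrix (values in [0, 1], e.g. wPLI
    or |correlation|) to coordinates: stronger connectivity -> closer."""
    dist = 1.0 - np.abs(conn)                # simple similarity-to-distance map
    np.fill_diagonal(dist, 0.0)
    return classical_mds(dist, k)
```

For a genuinely Euclidean distance matrix, classical MDS recovers the configuration exactly (up to rotation and reflection); for connectivity-derived distances it gives the best low-rank approximation, which is the sense in which nearby recording sites in the embedded space are functionally similar.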