The computer analogy of the mind has been as widely adopted in contemporary cognitive neuroscience as was the analogy of the brain as a collection of organs in phrenology. Just as the phrenologist would insist that each organ must have its particular function, so contemporary cognitive neuroscience is committed to the notion that each brain region must have its fundamental computation. In After Phrenology, Michael Anderson argues that to achieve a fully post-phrenological science of the brain, we need to reassess this commitment and devise an alternative, neuroscientifically grounded taxonomy of mental function. Anderson contends that the cognitive roles played by each region of the brain are highly various, reflecting different neural partnerships established under different circumstances. He proposes quantifying the functional properties of neural assemblies in terms of their dispositional tendencies rather than their computational or information-processing operations. Exploring larger-scale issues, and drawing on evidence from embodied cognition, Anderson develops a picture of thinking rooted in the exploitation and extension of our early-evolving capacity for iterated interaction with the world. He argues that the multidimensional approach to the brain he describes offers a much better fit for these findings, and a more promising road toward a unified science of minded organisms.
Public transit accounts for 1 percent of US passenger miles traveled but attracts strong public support. Using a simple choice model, we predict that transit riders are likely to be individuals who commute along routes with severe roadway delays. These individuals' choices thus have high marginal impacts on congestion. We test this prediction with data from a strike in 2003 by Los Angeles transit workers. Estimating a regression discontinuity design, we find that average highway delay increases 47 percent when transit service ceases. This implies that the net benefits of transit systems are much larger than previously believed.
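The regression discontinuity logic described in this abstract can be illustrated with a small sketch. The code below simulates daily highway delay with a jump at a strike onset and recovers the jump with local linear fits on each side of the cutoff; all data, parameter values, and function names are invented for illustration, not the paper's actual data or estimator.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical daily data: day 0 is the strike onset (all values illustrative).
days = np.arange(-60, 60)
baseline = 10.0                      # pre-strike mean delay (minutes per trip)
jump = 4.7                           # a 47% jump at the cutoff, echoing the abstract
delay = baseline + 0.01 * days + jump * (days >= 0) + rng.normal(0, 0.5, days.size)

def rd_estimate(x, y, cutoff=0.0, bandwidth=30):
    """Local linear regression discontinuity: fit a line on each side of
    the cutoff within the bandwidth and difference the fitted values."""
    left = (x >= cutoff - bandwidth) & (x < cutoff)
    right = (x >= cutoff) & (x < cutoff + bandwidth)
    fit_l = np.polyfit(x[left], y[left], 1)
    fit_r = np.polyfit(x[right], y[right], 1)
    return np.polyval(fit_r, cutoff) - np.polyval(fit_l, cutoff)

effect = rd_estimate(days, delay)
print(f"Estimated jump in delay at strike onset: {effect:.2f} minutes "
      f"({100 * effect / baseline:.0f}% of baseline)")
```

The discontinuity in the outcome at the cutoff identifies the causal effect of the service interruption, provided nothing else changes discontinuously at that date.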
An emerging class of theories concerning the functional structure of the brain takes the reuse of neural circuitry for various cognitive purposes to be a central organizational principle. According to these theories, it is quite common for neural circuits established for one purpose to be exapted (exploited, recycled, redeployed) during evolution or normal development, and be put to different uses, often without losing their original functions. Neural reuse theories thus differ from the usual understanding of the role of neural plasticity (which is, after all, a kind of reuse) in brain organization along the following lines: According to neural reuse, circuits can continue to acquire new uses after an initial or original function is established; the acquisition of new uses need not involve unusual circumstances such as injury or loss of established function; and the acquisition of a new use need not involve (much) local change to circuit structure (e.g., it might involve only the establishment of functional connections to new neural partners). Thus, neural reuse theories offer a distinct perspective on several topics of general interest, such as: the evolution and development of the brain, including (for instance) the evolutionary-developmental pathway supporting primate tool use and human language; the degree of modularity in brain organization; the degree of localization of cognitive function; and the cortical parcellation problem and the prospects (and proper methods to employ) for function to structure mapping. The idea also has some practical implications in the areas of rehabilitative medicine and machine interface design.
The view that the returns to educational investments are highest for early childhood interventions is widely held and stems primarily from several influential randomized trials (Abecedarian, Perry, and the Early Training Project) that point to super-normal returns to early interventions. This article presents a de novo analysis of these experiments, focusing on two core issues that have received limited attention in previous analyses: treatment effect heterogeneity by gender and overrejection of the null hypothesis due to multiple inference. To address the latter issue, a statistical framework that combines summary index tests with familywise error rate and false discovery rate corrections is implemented. The first technique reduces the number of tests conducted; the latter two techniques adjust the p values for multiple inference. The primary finding of the reanalysis is that girls garnered substantial short- and long-term benefits from the interventions, but there were no significant long-term benefits for boys. These conclusions, which have appeared ambiguous when using "naive" estimators that fail to adjust for multiple testing, contribute to a growing literature on the emerging female-male academic achievement gap. They also demonstrate that in complex studies where multiple questions are asked of the same data set, it can be important to declare the family of tests under consideration and to either consolidate measures or report adjusted and unadjusted p values.
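The two p-value adjustments named in this abstract can be illustrated directly. The sketch below implements Holm's step-down familywise error rate correction and the Benjamini-Hochberg false discovery rate correction on an invented family of p values; the article's framework also uses summary index tests, which are not shown here.

```python
import numpy as np

def holm_adjust(pvals):
    """Holm step-down adjustment controlling the familywise error rate."""
    p = np.asarray(pvals, dtype=float)
    m = p.size
    order = np.argsort(p)                # smallest p first
    adj = np.empty(m)
    running_max = 0.0
    for rank, idx in enumerate(order):
        running_max = max(running_max, (m - rank) * p[idx])
        adj[idx] = min(1.0, running_max)
    return adj

def bh_adjust(pvals):
    """Benjamini-Hochberg step-up adjustment controlling the false discovery rate."""
    p = np.asarray(pvals, dtype=float)
    m = p.size
    order = np.argsort(p)[::-1]          # largest p first
    adj = np.empty(m)
    running_min = 1.0
    for rank, idx in enumerate(order):
        running_min = min(running_min, p[idx] * m / (m - rank))
        adj[idx] = running_min
    return adj

# Illustrative p values from a hypothetical family of outcome tests.
pvals = [0.001, 0.012, 0.03, 0.04, 0.20]
print("Holm (FWER):", np.round(holm_adjust(pvals), 3))
print("BH   (FDR): ", np.round(bh_adjust(pvals), 3))
```

Note how results that look significant at the naive 0.05 threshold (e.g., 0.03 and 0.04) survive the FDR correction at that level but not the stricter familywise correction, which is the abstract's point about "naive" estimators.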
Theory of the Earth is an interdisciplinary advanced textbook on the origin, composition, and evolution of the Earth's interior: geophysics, geochemistry, dynamics, convection, mineralogy, volcanism, energetics and thermal history. This is the only book on the whole landscape of deep Earth processes which ties together all the strands of the subdisciplines. It is a complete update of Anderson's Theory of the Earth (1989). It includes many new sections and dozens of new figures and tables. As with the original book, this new edition will prove to be a stimulating textbook for advanced courses in geophysics, geochemistry, and planetary science, and a supplementary textbook for a wide range of other advanced Earth science courses. It will also be an essential reference and resource for all researchers in the solid Earth sciences.
This document is an update to the 2011 Clinical Pharmacogenetics Implementation Consortium (CPIC) guideline for CYP2C9 and VKORC1 genotypes and warfarin dosing. Evidence from the published literature is presented for CYP2C9, VKORC1, CYP4F2, and rs12777823 genotype-guided warfarin dosing to achieve a target international normalized ratio of 2–3 when clinical genotype results are available. In addition, this updated guideline incorporates recommendations for adult and pediatric patients that are specific to continental ancestry.
Narrative reviews of paediatric NAFLD quote prevalences in the general population that range from 9% to 37%; however, no systematic review of the prevalence of NAFLD in children/adolescents has been conducted. We aimed to estimate the prevalence of non-alcoholic fatty liver disease (NAFLD) in young people and to determine whether this varies by BMI category, gender, age, diagnostic method, geographical region and study sample size.
We conducted a systematic review and meta-analysis of all studies reporting a prevalence of NAFLD based on any diagnostic method in participants 1-19 years old, regardless of whether assessing NAFLD prevalence was the main aim of the study.
The pooled mean prevalence of NAFLD in children from general population studies was 7.6% (95% CI: 5.5% to 10.3%) and 34.2% (95% CI: 27.8% to 41.2%) in studies based on child obesity clinics. In both populations there was marked heterogeneity between studies (I² = 98%). There was evidence that prevalence was generally higher in males compared with females and increased incrementally with greater BMI. There was evidence for differences between regions in clinical population studies, with estimated prevalence being highest in Asia. There was no evidence that prevalence changed over time. Prevalence estimates in studies of children/adolescents attending obesity clinics and in obese children/adolescents from the general population were substantially lower when elevated alanine aminotransferase (ALT) was used to assess NAFLD compared with biopsies, ultrasound scan (USS) or magnetic resonance imaging (MRI).
Our review suggests the prevalence of NAFLD in young people is high, particularly in those who are obese and in males.
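The random-effects pooling behind estimates like those reported above can be sketched generically. The code below implements DerSimonian-Laird pooling of logit-transformed prevalences with the I² heterogeneity statistic; the study counts are invented for illustration and are not the review's data, and the review's actual model may differ in detail.

```python
import numpy as np

def pool_prevalence(events, totals):
    """DerSimonian-Laird random-effects pooling of logit-transformed
    prevalences; returns the pooled prevalence and the I^2 statistic."""
    events = np.asarray(events, dtype=float)
    totals = np.asarray(totals, dtype=float)
    p = events / totals
    y = np.log(p / (1 - p))                  # logit prevalence per study
    v = 1 / events + 1 / (totals - events)   # approximate variance of the logit
    w = 1 / v                                # fixed-effect (inverse variance) weights
    y_fixed = np.sum(w * y) / np.sum(w)
    q = np.sum(w * (y - y_fixed) ** 2)       # Cochran's Q
    df = y.size - 1
    c = np.sum(w) - np.sum(w ** 2) / np.sum(w)
    tau2 = max(0.0, (q - df) / c)            # between-study variance estimate
    w_re = 1 / (v + tau2)                    # random-effects weights
    y_re = np.sum(w_re * y) / np.sum(w_re)
    i2 = max(0.0, (q - df) / q) * 100 if q > 0 else 0.0
    pooled = 1 / (1 + np.exp(-y_re))         # back-transform to a proportion
    return pooled, i2

# Hypothetical study counts (NAFLD cases, sample size), for illustration only.
events = [30, 120, 45, 200]
totals = [400, 1500, 900, 2200]
pooled, i2 = pool_prevalence(events, totals)
print(f"Pooled prevalence: {pooled:.1%}, I^2 = {i2:.0f}%")
```

The logit transform keeps pooled proportions inside (0, 1), and τ² widens the weights when studies disagree, which matters for reviews like this one with I² near 98%.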
Conspectus: This Account is about templates as construction tools: molecules for making molecules. A template organizes the reactants and provides information to promote formation of a specific product, but it is not part of the final product. We have developed many different strategies for using oligopyridines as templates for the synthesis of alkyne-linked π-conjugated metalloporphyrin oligomers. These compounds include some of the largest macrocycles ever synthesized, such as a 50-porphyrin ring with a diameter of 21 nm containing a ring of 750 C–C bonds. Metalloporphyrins are excellent models for exploring template-directed synthesis, as they can be functionalized in many different positions and the central metal (typically Zn or Mg) provides a handle for coordination to templates. Classical template-directed macrocyclization reactions have a 1:1 complementarity between the template and the product. This strategy works well for preparing nanorings of 5–7 porphyrin units, but larger templates are laborious to synthesize. Rings of 8 or more porphyrin units are most easily prepared using “nonclassical” strategies, in which several small templates work together to direct the formation of a large ring. In the Vernier approach, a mismatch between the number of binding sites on the template and the building block leads to a mathematical amplification of the length scale: the number of binding sites in the product is the lowest common multiple of those in the template and the building block. For example, a 40-porphyrin ring can be prepared by coupling a linear decamer in the presence of an octadentate template. Linear Vernier templating opens up intriguing possibilities for self-replication. When several small radial oligopyridine templates bind inside a large nanoring they can form complexes with some vacant coordination sites that display correlated motion like the caterpillar tracks of a bulldozer.
These caterpillar track complexes can be used in template-directed synthesis and they provide the most convenient route to 8- and 10-porphyrin rings. Russian doll complexes provide another strategy for template-directed synthesis: a number of specifically designed ligands bind to a central nanoring to form a template for constructing a larger concentric nanoring. The same oligopyridine templates that are used to prepare nanorings can also be used to synthesize three-dimensional nanotubes and nanoballs. Again, nonclassical approaches, in which several small templates work together cooperatively, are much simpler than creating a single large template with sufficient binding sites to define the whole geometry of the product. Oligopyridine ligands can also be used as shadow mask templates to control the demetalation of magnesium porphyrin nanorings, because metal centers that are not coordinated by the template can be selectively demetalated with acid. Thus, the template forms a permanent shadow on the porphyrin nanostructure that remains after the template has been removed. Shadow mask templates provide a simple route to heterometalated molecular architectures. The insights emerging from these studies are widely applicable, and there are many opportunities for inventing new ways of using templates to control reactions.
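The Vernier length-scale amplification described in this Account is simple arithmetic: the product's binding-site count is the lowest common multiple of the counts on the building block and the template. A minimal check of the decamer/octadentate example:

```python
from math import lcm

# Vernier templating: product size = lcm(building-block sites, template sites).
building_block = 10   # linear porphyrin decamer (10 binding sites)
template = 8          # octadentate template (8 binding sites)

ring_size = lcm(building_block, template)
print(f"Vernier product: a {ring_size}-porphyrin nanoring")  # 40, as in the text
```

The mismatch is the point: matched counts (say 8 and 8) would give back an 8-ring, while 10 against 8 forces the assembly to grow until both counts divide it evenly.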
Ensemble Kalman filters use the sample covariance of an observation and a model state variable to update a prior estimate of the state variable. The sample covariance can be suboptimal as a result of small ensemble size, model error, model nonlinearity, and other factors. The most common algorithms for dealing with these deficiencies are inflation and covariance localization. A statistical model of errors in ensemble Kalman filter sample covariances is described and leads to an algorithm that reduces ensemble filter root-mean-square error for some applications. This sampling error correction algorithm uses prior information about the distribution of the correlation between an observation and a state variable. Offline Monte Carlo simulation is used to build a lookup table that contains a correction factor between 0 and 1 depending on the ensemble size and the ensemble sample correlation. Correction factors are applied like a traditional localization for each pair of observations and state variables during an ensemble assimilation. The algorithm is applied to two low-order models and reduces the sensitivity of the ensemble assimilation error to the strength of traditional localization. When tested in perfect model experiments in a larger model, the dynamical core of a general circulation model, the sampling error correction algorithm produces analyses that are closer to the truth and also reduces sensitivity to traditional localization strength.
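The offline Monte Carlo lookup-table step can be sketched in simplified form. The code below assumes a uniform prior over true correlations and takes, for each bin of |sample correlation|, the ratio of the expected true correlation to the bin center as the correction factor; the published algorithm's prior and estimator details may differ, so treat this as an illustration of the idea only.

```python
import numpy as np

rng = np.random.default_rng(1)

def build_correction_table(ens_size, n_draws=20000, n_bins=20):
    """Simplified offline Monte Carlo build of a lookup table mapping
    |sample correlation| bins to a correction factor in [0, 1].
    Assumes a uniform prior over true correlations (an illustrative
    choice, not necessarily the published algorithm's prior)."""
    true_r = rng.uniform(-1.0, 1.0, n_draws)
    # Draw an ensemble of correlated (x, y) pairs for each true correlation.
    z1 = rng.standard_normal((n_draws, ens_size))
    z2 = rng.standard_normal((n_draws, ens_size))
    y = true_r[:, None] * z1 + np.sqrt(1.0 - true_r[:, None] ** 2) * z2
    # Row-wise sample correlation between x and y.
    xm = z1 - z1.mean(axis=1, keepdims=True)
    ym = y - y.mean(axis=1, keepdims=True)
    sample_r = (xm * ym).sum(1) / np.sqrt((xm ** 2).sum(1) * (ym ** 2).sum(1))
    bins = np.linspace(0.0, 1.0, n_bins + 1)
    centers = 0.5 * (bins[:-1] + bins[1:])
    idx = np.clip(np.digitize(np.abs(sample_r), bins) - 1, 0, n_bins - 1)
    table = np.ones(n_bins)
    for b in range(n_bins):
        mask = idx == b
        if mask.any():
            # Shrink toward the expected true correlation given the sample value.
            expected = np.mean(true_r[mask] * np.sign(sample_r[mask]))
            table[b] = np.clip(expected / centers[b], 0.0, 1.0)
    return table

table = build_correction_table(ens_size=10)
print("Correction factors (small -> large |sample correlation|):")
print(np.round(table, 2))
```

During assimilation, the factor looked up for the current ensemble size and sample correlation would multiply the sample covariance for that observation-state pair, playing the same damping role as a localization weight.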
Carbon allotropes built from rings of two-coordinate atoms, known as cyclo[n]carbons, have fascinated chemists for many years, but until now they could not be isolated or structurally characterized because of their high reactivity. We generated cyclo[18]carbon (C18) using atom manipulation on bilayer NaCl on Cu(111) at 5 kelvin by eliminating carbon monoxide from a cyclocarbon oxide molecule, C24O6. Characterization of cyclo[18]carbon by high-resolution atomic force microscopy revealed a polyynic structure with defined positions of alternating triple and single bonds. The high reactivity of cyclocarbon and cyclocarbon oxides allows covalent coupling between molecules to be induced by atom manipulation, opening an avenue for the synthesis of other carbon allotropes and carbon-rich materials from the coalescence of cyclocarbon molecules.