International education is impacted by multiple discourses, in particular the discourse of university as an educational institution responsible for producing and curating knowledge for the public good, pursuing truth and transforming student life, and the neoliberal marketing discourse which portrays the university as a business organization providing a service for international students as customers/consumers. Following a multimodal discourse analytic perspective, this study examines ‘Why Choose’ webpages of one British and two Australian universities to identify how the apparently conflicting higher education and neoliberal marketing discourses are managed in the interdiscursive space using language, images and videos. The results reveal that ‘Why Choose’ webpages are hybrid texts where the discourse of higher education is upheld in relation to the neoliberal marketing discourse through multimodal strategies of accentuation, infusion and progression. The study argues for the necessity of undertaking a multimodal discourse approach to understand how various positions are negotiated interdiscursively in online media.
Automaticity in social-cognitive processes
Bargh, John A.; Schwader, Kay L.; Hailey, Sarah E., et al.
Trends in Cognitive Sciences, 12/2012, Volume 16, Issue 12. Journal article, peer-reviewed.
Over the past several years, the concept of automaticity of higher cognitive processes has permeated nearly all domains of psychological research. In this review, we highlight insights arising from studies in decision-making, moral judgments, close relationships, emotional processes, face perception and social judgment, motivation and goal pursuit, conformity and behavioral contagion, embodied cognition, and the emergence of higher-level automatic processes in early childhood. Taken together, recent work in these domains demonstrates that automaticity does not result exclusively from a process of skill acquisition (in which a process always begins as a conscious and deliberate one, becoming capable of automatic operation only with frequent use) – there are evolved substrates and early childhood learning mechanisms involved as well.
The role of short-chain fatty acids (SCFAs) in the brain in the developmental programming of hypertension is poorly understood. The present study explored dysregulated tissue levels of SCFAs and expression of SCFA-sensing receptors in the hypothalamic paraventricular nucleus (PVN), a key forebrain region engaged in neural regulation of blood pressure, in offspring exposed to maternal high-fructose diet (HFD). We further investigated the engagement of SCFA-sensing receptors in the PVN in the beneficial effects of -biotics (prebiotic, probiotic, synbiotic, and postbiotic) on programmed hypertension. Maternal HFD during gestation and lactation significantly reduced circulating butyrate, along with decreased tissue level of butyrate, increased expression of the SCFA-sensing receptors GPR41 and olfr78, and tissue oxidative stress and neuroinflammation in the PVN of HFD offspring; these changes were rectified by oral supplement with -biotics. Gene silencing of GPR41 or olfr78 mRNA in the PVN also protected adult HFD offspring from programmed hypertension and alleviated the induced oxidative stress and inflammation in the PVN. In addition, oral supplement with postbiotic butyrate restored tissue butyrate levels, rectified expression of GPR41 and olfr78 in the PVN, and protected against programmed hypertension in adult HFD offspring. These data suggest that alterations in tissue butyrate level, expression of GPR41 and olfr78, and activation of SCFA-sensing receptor-dependent tissue oxidative stress and neuroinflammation in the PVN could be novel mechanisms that underlie hypertension programmed by maternal HFD exposure in adult offspring. Furthermore, oral -biotics supplementation may exert beneficial effects on hypertension of developmental origin by targeting dysfunctional SCFA-sensing receptors in the PVN to exert antioxidant and anti-inflammatory actions in the brain.
Tolerance to central hypovolemia is highly variable, and accumulating evidence suggests that protection of anterior cerebral blood flow (CBF) is not an underlying mechanism. We hypothesized that individuals with high tolerance to central hypovolemia would exhibit protection of cerebral oxygenation (ScO2), and prolonged preservation of CBF in the posterior vs. anterior cerebral circulation. Eighteen subjects (7 male/11 female) completed a presyncope-limited lower body negative pressure (LBNP) protocol (3 mmHg/min onset rate). ScO2 (via near-infrared spectroscopy), middle cerebral artery velocity (MCAv), posterior cerebral artery velocity (PCAv) (both via transcranial Doppler ultrasound), and arterial pressure (via finger photoplethysmography) were measured continuously. Subjects who completed ≥70 mmHg LBNP were classified as high tolerant (HT; n = 7), and those who completed ≤60 mmHg LBNP as low tolerant (LT; n = 11). The minimum difference in LBNP tolerance between groups was 193 s (LT = 1,243 ± 185 s vs. HT = 1,996 ± 212 s; P < 0.001; Cohen's d = 3.8). Despite similar reductions in mean MCAv in both groups, ScO2 decreased in LT subjects from -15 mmHg LBNP (P = 0.002; Cohen's d = 1.8), but was maintained at baseline values until -75 mmHg LBNP in HT subjects (P < 0.001; Cohen's d = 2.2); ScO2 was lower at -30 and -45 mmHg LBNP in LT subjects (P ≤ 0.02; Cohen's d ≥ 1.1). Similarly, mean PCAv decreased below baseline from -30 mmHg LBNP in LT subjects (P = 0.004; Cohen's d = 1.0), but remained unchanged from baseline in HT subjects until -75 mmHg (P = 0.006; Cohen's d = 2.0); PCAv was lower at -30 and -45 mmHg LBNP in LT subjects (P ≤ 0.01; Cohen's d ≥ 0.94). Individuals with higher tolerance to central hypovolemia exhibit prolonged preservation of CBF in the posterior cerebral circulation and sustained cerebral tissue oxygenation, both associated with a delay in the onset of presyncope.
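As a check on the reported between-group effect size (Cohen's d = 3.8 for tolerance time), the value can be reproduced with the standard pooled-standard-deviation formula, assuming the ± figures in the abstract are standard deviations rather than standard errors:

```python
import math

def cohens_d(m1, s1, n1, m2, s2, n2):
    """Cohen's d using a pooled standard deviation for two independent groups."""
    pooled_sd = math.sqrt(((n1 - 1) * s1**2 + (n2 - 1) * s2**2) / (n1 + n2 - 2))
    return (m2 - m1) / pooled_sd

# Reported LBNP tolerance times (mean ± SD, seconds):
# LT (n = 11): 1,243 ± 185 s; HT (n = 7): 1,996 ± 212 s
d = cohens_d(1243, 185, 11, 1996, 212, 7)
print(round(d, 2))  # ≈ 3.85, consistent with the reported d = 3.8
```

The agreement with the reported value supports reading the ± figures as standard deviations.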
This article presents a mixed methods approach for analysing text and image relations in violent extremist discourse. The approach involves integrating multimodal discourse analysis with data mining and information visualisation, resulting in theoretically informed empirical techniques for automated analysis of text and image relations in large datasets. The approach is illustrated by a study which aims to analyse how violent extremist groups use language and images to legitimise their views, incite violence, and influence recruits in online propaganda materials, and how the images from these materials are re-used in different media platforms in ways that support and resist violent extremism. The approach developed in this article contributes to what promises to be one of the key areas of research in the coming decades: namely the interdisciplinary study of big (digital) datasets of human discourse, and the implications of this for terrorism analysis and research.
Research in face recognition has tended to focus on discriminating between individuals, or “telling people apart.” It has recently become clear that it is also necessary to understand how images of the same person can vary, or “telling people together.” Learning a new face, and tracking its representation as it changes from unfamiliar to familiar, involves an abstraction of the variability in different images of that person's face. Here, we present an application of principal components analysis computed across different photos of the same person. We demonstrate that people vary in systematic ways, and that this variability is idiosyncratic—the dimensions of variability in one face do not generalize well to another. Learning a new face therefore entails learning how that face varies. We present evidence for this proposal and suggest that it provides an explanation for various effects in face recognition. We conclude by making a number of testable predictions derived from this framework.
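The core technique named above — principal components analysis computed across different photos of the same person — can be sketched as follows. The data here are synthetic stand-ins for aligned, vectorised photographs of one individual; the array shapes and variable names are illustrative, not taken from the study:

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic stand-in: 50 "photos" of one person, each flattened to a
# 100-dimensional vector (in practice: aligned pixel intensities).
photos = rng.normal(size=(50, 100))

# PCA across different photos of the SAME person: centre on that
# person's mean image, then decompose the within-person variation.
mean_face = photos.mean(axis=0)
centred = photos - mean_face
_, singular_values, components = np.linalg.svd(centred, full_matrices=False)

# Proportion of this person's variability explained by each dimension.
explained = singular_values**2 / np.sum(singular_values**2)
print(explained[:3])  # the leading idiosyncratic dimensions of variation
```

The study's claim that variability is idiosyncratic corresponds to these components fitting one person's images well but transferring poorly to another person's images.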
In this paper, we introduce the Intervention Mapping (IM) taxonomy of behaviour change methods and its potential to be developed into a coding taxonomy. That is, although IM and its taxonomy of behaviour change methods are not in fact new, because IM was originally developed as a tool for intervention development, this potential was not immediately apparent. Second, in explaining the IM taxonomy and defining the relevant constructs, we call attention to the existence of parameters for effectiveness of methods, and explicate the related distinction between theory-based methods and practical applications and the probability that poor translation of methods may lead to erroneous conclusions as to method effectiveness. Third, we recommend a minimal set of intervention characteristics that may be reported when intervention descriptions and evaluations are published. Specifying these characteristics can greatly enhance the quality of our meta-analyses and other literature syntheses. In conclusion, the dynamics of behaviour change are such that any taxonomy of methods of behaviour change needs to acknowledge the importance of, and provide instruments for dealing with, three conditions for effectiveness for behaviour change methods. For a behaviour change method to be effective: (1) it must target a determinant that predicts behaviour; (2) it must be able to change that determinant; (3) it must be translated into a practical application in a way that preserves the parameters for effectiveness and fits with the target population, culture, and context. Thus, taxonomies of methods of behaviour change must distinguish the specific determinants that are targeted, the practical, specific applications, and the theory-based methods they embody. In addition, taxonomies should acknowledge that the lists of behaviour change methods will be used by, and should be used by, intervention developers.
Ideally, the taxonomy should be readily usable for this goal; alternatively, it should at least be clear how the information in the taxonomy can be used in practice. The IM taxonomy satisfies these requirements, and it would be beneficial if other taxonomies were extended to also meet these needs.
This paper presents a novel framework for undertaking climate change impact studies, which can be used for testing the robustness of precautionary climate change allowances used in engineering design. It is illustrated with respect to fluvial flood risk in the UK. The methodology departs from conventional scenario-led impact studies because it is based on sensitivity analyses of catchment responses to a plausible range of climate changes (rather than the time-varying outcome of individual scenarios), making it scenario-neutral. The method involves separating the climate change projections (the hazard) from the catchment responsiveness (the vulnerability) expressed as changes in peak flows. By combining current understanding of the likelihood of the climate change hazard with knowledge of the sensitivity of a given catchment, it is possible to evaluate the fraction of climate model projections that would not be accommodated by specified safety margins. This enables rapid appraisal of existing or new precautionary allowances not only for a given set of climate change projections, but also for any new set of climate change projections, for example those arising from a new generation of climate models as soon as they are available, or when focusing on a different planning time horizon, without the need for undertaking a new climate change impact analysis with the new scenarios. The approach is demonstrated via an assessment of the UK Government’s 20% allowance for climate change applied in two contrasting catchments. In these exemplars, the allowance defends against the majority of sampled climate projections for the 2080s from the IPCC-AR4 GCM and UKCP09 RCM runs, but it is still possible to identify a sub-set of regional scenarios that would exceed the 20% threshold.
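The framework's central calculation — the fraction of projections not accommodated by a safety margin — reduces to an exceedance count once each projection has been translated into a percentage change in peak flow. A minimal sketch, with illustrative (not the study's) sampled changes and the UK's 20% allowance as the threshold:

```python
import numpy as np

rng = np.random.default_rng(1)

# Illustrative sample of projected percentage changes in peak flow for one
# catchment. In the real framework, each value comes from passing a climate
# projection through the catchment's sensitivity response surface.
peak_flow_changes = rng.normal(loc=12.0, scale=8.0, size=1000)  # % change

allowance = 20.0  # precautionary allowance on peak flow, in percent

# Fraction of sampled projections NOT accommodated by the safety margin.
exceedance_fraction = float(np.mean(peak_flow_changes > allowance))
print(f"{exceedance_fraction:.1%} of sampled projections exceed the allowance")
```

Because the sensitivity response is computed once per catchment, re-running this comparison against a new projection ensemble is cheap, which is what makes the approach scenario-neutral.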