Mutations in genes encoding proteins of the photorespiratory core cycle and associated processes are characterised by lethality in normal air but viability under elevated CO2 conditions. This feature has been described as 'the photorespiratory phenotype' and was long assumed to be essentially uniform across all of these mutants. In recent years a broad collection of photorespiratory mutants has been isolated, allowing a comparative analysis. Distinct phenotypic features were observed when Arabidopsis thaliana mutants defective in photorespiratory enzymes were compared, and during shifts from elevated to ambient CO2 conditions. The exact reasons for the mutant-specific photorespiratory phenotypes are mostly unknown, but they point to even greater plasticity of photorespiratory metabolism. Moreover, a growing body of evidence indicates that mutant features can be modulated by several factors, such as CO2:O2 ratios, photoperiod, light intensity, organic carbon supply and pathogens. Hence, systematic analyses of the responses to these factors appear crucial for unravelling the mechanisms by which photorespiration adapts to and interacts with whole-cell metabolism. Here we review current knowledge of photorespiratory mutants and propose a new level of phenotypic sub-classification. Finally, we present further questions that should be addressed in the field of photorespiration.
The quality of communication in oncology significantly impacts patients' health outcomes, as poor communication increases the risk of unnecessary treatment, inadequate pain relief, higher anxiety levels, and acute hospitalizations. Additionally, ineffective communication is associated with stress, low job satisfaction, and burnout among doctors working in oncology. While the importance of effective communication is widely acknowledged, the specific features of successful communication skills training (CST) remain uncertain. Role-play and recorded consultations with direct feedback appear promising for CST but can be time-consuming, and transferring the acquired skills to clinical contexts remains a challenge. Our aim is to bridge this gap by proposing a novel approach: On-site Supportive Communication Training (On-site SCT). The concept integrates knowledge from previous studies but represents the first randomized controlled trial to employ actual doctor-patient interactions during CST.
This randomized multicenter trial is conducted at three departments of oncology in Denmark. Doctors are randomized 1:1 to the intervention and control groups. Doctors in the intervention group participate in three full days of On-site SCT facilitated by a trained psychologist. On-site SCT focuses on imparting communication techniques, establishing a reflective learning environment, and offering emotional support with a compassionate mindset. The primary endpoint is the change in the percentage of items rated "excellent" by patients in the validated 15-item Communication Assessment Tool questionnaire. The secondary endpoints are changes in doctors' ratings of self-efficacy in health communication, burnout, and job satisfaction, measured by validated questionnaires. Qualitative interviews will be conducted with the doctors after the intervention to evaluate its relevance, feasibility, and working mechanisms. Doctors were actively recruited during summer/autumn 2023. Baseline questionnaires from patients have been collected. Recruitment of new patients for evaluation questionnaires is scheduled for Q1-Q2 2024.
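The primary endpoint above reduces to simple arithmetic on the 15 CAT items. The sketch below illustrates that computation; the function name and all example ratings are invented for illustration and are not trial data.

```python
# Sketch of the primary-endpoint computation: the percentage of
# Communication Assessment Tool (CAT) items rated "excellent" (5 on a
# 1-5 scale), before and after the intervention. Hypothetical data only.

def percent_excellent(ratings):
    """Fraction of CAT items rated 5 ('excellent'), as a percentage."""
    return 100.0 * sum(1 for r in ratings if r == 5) / len(ratings)

# One patient's ratings on the 15 CAT items (made-up numbers).
baseline = [5, 4, 5, 3, 5, 4, 4, 5, 3, 4, 5, 4, 4, 5, 4]
followup = [5, 5, 5, 4, 5, 5, 4, 5, 4, 5, 5, 5, 4, 5, 5]

change = percent_excellent(followup) - percent_excellent(baseline)
print(f"baseline:  {percent_excellent(baseline):.1f}%")
print(f"follow-up: {percent_excellent(followup):.1f}%")
print(f"change:    {change:+.1f} percentage points")
```

In the trial, this per-patient percentage would be aggregated across patients and compared between arms.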
This trial aims to quantify the efficacy of On-site SCT. If it significantly benefits patients and doctors, it could serve as a scalable CST concept for clinical practice. Additionally, the qualitative interviews will reveal doctors' insights into which parts of the curriculum were most comprehensible.
April 2023 - ClinicalTrials.gov (NCT05842083). April 2023 - The Research Ethics Committee at the University of Southern Denmark (23/19397).
Summary
Background
Despite extensive research, the aetiology of atopic dermatitis remains largely unknown, but reduced intestinal microbiota diversity in neonates has been linked to subsequent atopic dermatitis. Consequently, postnatal antibiotics have been proposed as a risk factor, but a potential association between prenatal antibiotics and atopic dermatitis is not well studied. Overall, the current evidence suggests a positive association between exposure to prenatal antibiotics and atopic dermatitis.
Objective
To investigate the association between prenatal antibiotics and atopic dermatitis among 18‐month‐old children.
Methods
This study, conducted within the Danish National Birth Cohort, included 62 560 mother‐child pairs. Data on maternal prenatal antibiotic use were collected in the 30th gestational week and 6 months post‐partum, and data on offspring atopic dermatitis 18 months post‐partum, through telephone interviews. Antibiotic use was categorized by timing of exposure: 1st‐2nd trimester (gestation week 0‐29), 3rd trimester (gestation week 30 to birth), all three trimesters, or none. Data were analysed by logistic regression, adjusting for potential confounders.
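The adjusted odds ratios reported below come from multivariable logistic regression. As a simplified, hedged illustration of the underlying quantity, the sketch below computes an unadjusted odds ratio with a Wald 95% confidence interval from a 2x2 exposure-outcome table; all counts are invented and are not cohort data.

```python
import math

def odds_ratio_ci(exposed_cases, exposed_controls,
                  unexposed_cases, unexposed_controls, z=1.96):
    """Unadjusted odds ratio with a Wald 95% confidence interval.

    The CI is built on the log-odds scale, where the standard error is
    sqrt of the sum of reciprocal cell counts.
    """
    or_ = (exposed_cases * unexposed_controls) / (exposed_controls * unexposed_cases)
    se_log = math.sqrt(1 / exposed_cases + 1 / exposed_controls
                       + 1 / unexposed_cases + 1 / unexposed_controls)
    lo = math.exp(math.log(or_) - z * se_log)
    hi = math.exp(math.log(or_) + z * se_log)
    return or_, lo, hi

# Hypothetical counts: AD cases/controls by prenatal antibiotic exposure.
or_, lo, hi = odds_ratio_ci(120, 880, 900, 9100)
print(f"OR {or_:.2f} (95% CI {lo:.2f}-{hi:.2f})")
```

The study's adjusted estimates additionally condition on confounders, which this crude 2x2 calculation does not capture.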
Results
Prenatal exposure to antibiotics was associated with increased odds of atopic dermatitis among children born to atopic mothers, but only when antibiotics were used in both the 1st‐2nd and the 3rd trimester (ORadj 1.45, 95% CI: 1.19‐1.76). The findings were consistent across different definitions of atopic dermatitis.
Conclusions and Clinical Relevance
Prenatal exposure to antibiotics throughout pregnancy was associated with an increased risk of atopic dermatitis within the first 18 months of life, but only among children born to atopic mothers. The clinical usefulness of this finding must rest on corroboration in independent data sources.
We present a 33-year-old woman with what is, to our knowledge, the first case of new daily persistent headache (NDPH) associated with a large right-sided benign non-toxic multinodular goiter causing carotid and vertebral artery compression, with complete resolution of the headache immediately after thyroidectomy. Although this presentation may be quite rare, hypothyroidism or hyperthyroidism causing NDPH, migraine, or an exacerbation of pre-existing migraine is not. Clinicians should consider routinely obtaining serum thyroid-stimulating hormone (TSH) and free T4 in patients with new-onset frequent headaches or an exacerbation of prior primary headaches.
Being intimately intertwined with (C3) photosynthesis, photorespiration is an extremely high-flux pathway. Traditionally, the photorespiratory cycle was viewed as a closed pathway that refills the Calvin-Benson cycle with organic carbon. However, given the network nature of metabolism, it follows that photorespiration interacts with many other pathways. In this article, we review the current understanding of these interactions and attempt to define key priorities for future research that will deepen our fundamental comprehension of the general metabolic and developmental consequences of perturbing this crucial metabolic process.
The need for computing in the HEP community follows cycles of peaks and valleys driven mainly by conference dates, accelerator shutdowns, holiday schedules, and other factors. Because of this, the classical method of provisioning these resources at providing facilities has drawbacks such as potential overprovisioning. As the appetite for computing increases, however, so does the need to maximize cost efficiency by developing a model for dynamically provisioning resources only when needed. To address this issue, the HEPCloud project was launched by the Fermilab Scientific Computing Division in June 2015. Its goal is to develop a facility that provides a common interface to a variety of resources, including local clusters, grids, high-performance computers, and community and commercial clouds. Initially targeted experiments include CMS and NOvA, as well as other Fermilab stakeholders. In its first phase, the project demonstrated the use of the "elastic" provisioning model offered by commercial clouds such as Amazon Web Services (AWS), in which resources are rented and provisioned automatically over the Internet upon request. In January 2016, the project demonstrated the ability to increase the total amount of global CMS resources by 58,000 cores from 150,000 cores - a 38 percent increase - in preparation for the Rencontres de Moriond. In March 2016, the NOvA experiment also demonstrated resource-burst capabilities with an additional 7,300 cores, achieving a scale almost four times as large as the locally allocated resources and utilizing AWS S3 storage to optimize data-handling operations and costs. NOvA used the same familiar services as for local computations, such as data handling and job submission, in preparation for the Neutrino 2016 conference. In both cases, the cost was contained by the use of the Amazon Spot Instance Market and the Decision Engine, a HEPCloud component that aims to minimize cost and job interruption.
This paper describes the Fermilab HEPCloud Facility and the challenges overcome for the CMS and NOvA communities.
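The cost-containment role of the Decision Engine can be sketched as a constrained minimization over spot-market offerings: pick the cheapest price per core among offers whose recent interruption rate is acceptable. The function, field names, and all numbers below are invented for illustration; the real Decision Engine works from live AWS Spot Market data and a richer cost model.

```python
# Hedged sketch of a Decision-Engine-style selection: minimize $/core-hour
# over spot offerings, excluding those likely to be interrupted.
# All offerings and rates below are hypothetical.

def pick_offering(offerings, max_interrupt_rate=0.05):
    """Return the acceptable offering with the lowest price per core-hour."""
    acceptable = [o for o in offerings
                  if o["interrupt_rate"] <= max_interrupt_rate]
    return min(acceptable, key=lambda o: o["price_per_hour"] / o["cores"])

offerings = [
    {"type": "c4.8xlarge",  "cores": 36, "price_per_hour": 0.70, "interrupt_rate": 0.03},
    {"type": "m4.10xlarge", "cores": 40, "price_per_hour": 0.90, "interrupt_rate": 0.02},
    # Cheapest per core, but its interruption rate exceeds the cap:
    {"type": "c4.4xlarge",  "cores": 16, "price_per_hour": 0.28, "interrupt_rate": 0.09},
]

best = pick_offering(offerings)
print(best["type"])  # cheapest per core among offers under the interruption cap
```

In practice the trade-off between price and interruption risk is continuous rather than a hard cutoff, but the sketch captures the basic optimization the abstract describes.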
Fermilab supports a scientific program that includes experiments and scientists located across the globe. To better serve this community, in 2004 the (then) Computing Division adopted the strategy of placing all of its High Throughput Computing (HTC) resources in a Campus Grid known as FermiGrid, supported by common shared services. In 2007, the FermiGrid Services group deployed a service infrastructure that utilized Xen virtualization, LVS network routing and MySQL circular replication to deliver highly available services offering significant performance, reliability and serviceability improvements. This deployment was further enhanced through a distributed redundant network core architecture and the physical distribution of the systems hosting the virtual machines across multiple buildings on the Fermilab campus. In 2010, building on the experience pioneered by FermiGrid in delivering production services in a virtual infrastructure, the Computing Sector commissioned the FermiCloud, General Physics Computing Facility (GPCF) and Virtual Services projects to serve as platforms for scientific computing (FermiCloud and GPCF) and core computing (Virtual Services). This work presents the evolution of the Fermilab Campus Grid, virtualization and cloud computing infrastructure, together with plans for the future.
Experience in using commercial clouds in CMS. Bauerdick, L; Bockelman, B; Dykstra, D; et al. Journal of Physics: Conference Series, vol. 898, no. 5, 10/2017.
Historically, high energy physics computing has been performed on large purpose-built computing systems. In the beginning there were single-site computing facilities, which evolved into the Worldwide LHC Computing Grid (WLCG) used today. The vast majority of WLCG resources are used for LHC computing, and the resources are scheduled to be used continuously throughout the year. In the last several years there has been an explosion in the capacity and capability of commercial and academic computing clouds. Cloud resources are highly virtualized and intended to be flexibly deployed for a variety of computing tasks, and there is growing interest among cloud providers in demonstrating the capability to perform large-scale scientific computing. In this presentation we discuss results from the CMS experiment using the Fermilab HEPCloud Facility, which utilized both local Fermilab resources and Amazon Web Services (AWS). The goal was to work with AWS, through a matching grant, to demonstrate a sustained scale approximately equal to half of the worldwide processing resources available to CMS. We discuss the planning and technical challenges involved in running the most IO-intensive CMS workflows on a large-scale set of virtualized resources provisioned by the Fermilab HEPCloud, and describe the data-handling and data-management challenges. We also discuss the economic issues, comparing cost and operational efficiency with our dedicated resources. Finally, we consider how the availability of large-scale resources scheduled at peak times changes the working model of HEP computing.
The goal of the Fermilab HEPCloud Facility Project is to extend the current Fermilab facility interface to provide transparent access to disparate resources, including commercial and community clouds, grid federations, and HPC centers. This facility enables experiments to perform the full spectrum of computing tasks, including data-intensive simulation and reconstruction. We have evaluated the use of the commercial cloud to provide elasticity in response to peaks of demand without overprovisioning local resources. Full-scale data-intensive workflows have been successfully completed on Amazon Web Services for two High Energy Physics experiments, CMS and NOνA, at the scale of 58,000 simultaneous cores. This paper describes the significant improvements made to the virtual machine provisioning system, code caching system, and data movement system to accomplish this work. The virtual image provisioning and contextualization service was extended to multiple AWS regions and to support experiment-specific data configurations. A prototype Decision Engine was written to determine the optimal availability zone and instance type to run on, minimizing cost and job interruptions. We have deployed a scalable on-demand caching service to deliver code and database information to jobs running on the commercial cloud. It uses the frontier-squid server and CERN VM File System (CVMFS) clients on EC2 instances and utilizes various AWS services to build the infrastructure (stack). We discuss the architecture and load-testing benchmarks on the squid servers. We also describe the various approaches that were evaluated to transport experimental data to and from the cloud, and the optimal solutions used for the bulk of the data transport. Finally, we summarize lessons learned from this scale test and our future plans to expand and improve the Fermilab HEPCloud Facility.
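The squid/CVMFS caching tier described above exists so that thousands of cloud jobs requesting the same code or database payload generate only a handful of origin fetches. The sketch below illustrates that read-through caching pattern in miniature; the class, path, and payload names are illustrative and are not part of frontier-squid or CVMFS.

```python
# Minimal read-through cache sketch illustrating the pattern behind the
# squid/CVMFS caching tier: serve from cache when possible, otherwise
# fetch from the origin once and store the result. Hypothetical names.

class ReadThroughCache:
    def __init__(self, fetch_from_origin):
        self._fetch = fetch_from_origin
        self._store = {}
        self.origin_hits = 0  # counts requests that reached the origin

    def get(self, key):
        if key not in self._store:
            self.origin_hits += 1
            self._store[key] = self._fetch(key)
        return self._store[key]

# Simulated origin server: returns a payload for each requested path.
cache = ReadThroughCache(lambda path: f"payload:{path}")

for _ in range(1000):  # 1000 jobs all requesting the same software release
    cache.get("/sw/release")

print(cache.origin_hits)  # only one request reached the origin
```

A real deployment adds expiry, eviction, and hierarchical proxies, but the core benefit at scale is exactly this collapse of repeated requests onto a single origin fetch.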