Abstract
Background
Prognosis in heart failure with preserved ejection fraction (HFpEF) is determined by risk factor control and treatment of comorbidities. Industrially processed trans fatty acids (IP-TFA) from partially hydrogenated oils have been linked to altered lipoprotein metabolism, endothelial dysfunction, increased biomarkers of inflammation, and increased NT-proBNP. In patients with HFpEF, associations of TFA blood levels with patient characteristics are unknown.
Purpose
To evaluate associations of blood TFA with cardiovascular risk factors, aerobic capacity and cardiac function in patients with HFpEF.
Methods
This is a secondary analysis of the Aldo-DHF-RCT. Of 422 patients, individual blood TFA were analyzed at baseline in n=404 using the HS-Omega-3-Index® methodology. Patient characteristics were: age 67±8 years, 53% female, NYHA II/III (87%/13%), ejection fraction ≥50%, E/e' 7.1±1.5, median NT-proBNP 158 ng/L (IQR 82–298). Multiple linear regression analyses, with sex and age as covariates, were used to describe associations of TFA with metabolic phenotype, functional capacity, echocardiographic markers of left ventricular diastolic function (LVDF), and neurohumoral activation at baseline and at 12-month follow-up (12mFU). To account for randomization group, all analyses were repeated as a sensitivity analysis with group as an additional covariate. A significance level of α=5% was used for all tests. As all tests were hypothesis-generating without confirmatory interpretation, no correction for multiple comparisons was applied.
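The covariate-adjusted regressions described above can be sketched as follows (a minimal illustration on synthetic data, not the study's actual analysis code; the simulated effect sizes, noise levels, and variable names are hypothetical):

```python
import numpy as np

rng = np.random.default_rng(0)
n = 404  # number of patients analyzed at baseline

# Synthetic illustration: blood TFA level, age, and sex as predictors of a
# lipid outcome. All values below are invented for the sketch.
tfa = rng.normal(0.4, 0.1, n)   # hypothetical blood TFA (% of total fatty acids)
age = rng.normal(67, 8, n)      # years, matching the cohort mean and SD
sex = rng.integers(0, 2, n)     # 0 = male, 1 = female (~53% female in the trial)
outcome = 100 + 19.7 * tfa + 0.2 * age - 3.0 * sex + rng.normal(0, 5, n)

# Multiple linear regression: outcome ~ TFA + age + sex (ordinary least squares,
# with an intercept column prepended to the design matrix)
X = np.column_stack([np.ones(n), tfa, age, sex])
beta, *_ = np.linalg.lstsq(X, outcome, rcond=None)
print(f"beta_TFA = {beta[1]:.1f}")  # recovers roughly the simulated effect of 19.7
```

The sensitivity analysis described in the text corresponds to adding one more column (randomization group) to the design matrix.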
Results
Higher blood levels of the naturally occurring TFA C16:1n-7t were broadly associated with a more favorable lipid profile, lower body weight/central adiposity, lower white blood cell count, and lower biochemical markers of non-alcoholic fatty liver disease at baseline/12mFU. Conversely, blood levels of the IP-TFA C18:1n9t were directly associated with the lipid risk markers triglycerides (β=19.7, p<0.001), non-HDL-C (β=7.9, p=0.001), and LDL-C (β=5.4, p=0.011). The two IP-TFA C18:2n6 isomers, C18:2n6tt and C18:2n6ct, were positively associated with HbA1c (β=14.6, p=0.003 and β=4.2, p=0.014, respectively). The IP-TFA C18:2n6tt/-ct isomers were also associated with lower submaximal aerobic capacity (distance covered in the 6MWT) at baseline/12mFU. No significant association was found between TFA blood levels and left ventricular filling pressures, left ventricular relaxation, or neurohumoral activation. Significant effects of group allocation (spironolactone +/−) were found for the 12mFU outcomes systolic/diastolic blood pressure (all p<0.001), heart rate, E/e', and HbA1c.
Conclusions
In HFpEF patients, higher blood levels of industrially processed TFA, but not of the TFA C16:1n-7t found naturally in full-fat dairy and meat, were associated with a higher-risk phenotype and lower aerobic capacity. Our findings support efforts to remove IP-TFA from the food supply to improve risk factor control in HFpEF patients.
Funding Acknowledgement
Type of funding sources: Foundation. Main funding source(s): German Foundation of Heart Research
The ALICE online data storage system Divià, R; Fuchs, U; Makhlyueva, I ...
Journal of Physics: Conference Series, 04/2010, Volume 219, Issue 5
Journal Article
Peer reviewed
Open access
The ALICE (A Large Ion Collider Experiment) Data Acquisition (DAQ) system has the unprecedented requirement to ensure a very high volume, sustained data stream between the ALICE Detector and the Permanent Data Storage (PDS) system, which is used as the main data repository for event processing and offline computing. The key component to accomplish this task is the Transient Data Storage System (TDS), a set of data storage elements with associated hardware and software components. The TDS supports raw data collection; its conversion into a format suitable for subsequent high-level analysis; the storage of the result using highly parallelized architectures; its access via a cluster file system capable of creating high-speed partitions via its affinity feature; and its transfer to the final destination via dedicated data links. We describe the methods and components used to validate, test, implement, operate, and monitor the ALICE Online Data Storage system and the way it has been used in the early days of commissioning and operation of the ALICE Detector. We will also introduce the future developments needed from next year, when the ALICE Data Acquisition System will shift its requirements from those of the test and commissioning phase to those imposed by long-duration data-taking periods alternated with shorter validation and maintenance tasks needed to adequately operate the ALICE Experiment.
ALICE (A Large Ion Collider Experiment) is the heavy-ion detector designed to study the physics of strongly interacting matter and the quark-gluon plasma at the CERN Large Hadron Collider (LHC). A large-bandwidth, flexible Data Acquisition System (DAQ) has been designed and deployed to collect sufficient statistics in the short running time available per year for heavy ions and to accommodate the very different requirements of the 18 sub-detectors. This paper will present the large-scale tests conducted to assess the standalone DAQ performance, the interfaces with the other online systems, and the extensive commissioning performed in order to be fully prepared for physics data taking. It will review the experience accumulated since May 2007 during the standalone commissioning of the main detectors and the global cosmic runs, and the lessons learned from this exposure on the "battle field". It will also discuss the test protocol followed to integrate and validate each sub-detector with the online systems, and it will conclude with the first results of the LHC injection tests and startup in September 2008. Several papers of the same conference present some elements of the ALICE DAQ system in more detail.
This paper will describe the hardware and software developed to build a random trigger simulator used to test the detectors of the ALICE experiment. It will also describe the tests performed in our laboratory at CERN on the random trigger generator to confirm its correct behavior, and the details of its installation in one of the counting rooms of ALICE, where it provides the triggers for all the sub-detectors.
Preparing the ALICE DAQ upgrade Carena, F; Carena, W; Chapeland, S ...
Journal of Physics: Conference Series, 01/2012, Volume 396, Issue 1
Journal Article
Peer reviewed
Open access
In November 2009, after 15 years of design and installation, the ALICE experiment started to detect and record the first collisions produced by the LHC. It has been collecting hundreds of millions of events ever since, with both proton and heavy-ion collisions. The future scientific programme of ALICE has been refined following the first year of data taking. The physics targeted beyond 2018 will be the study of rare signals. Several detectors will be upgraded, modified, or replaced to prepare ALICE for future physics challenges. An upgrade of the triggering and readout systems is also required to accommodate the needs of the upgraded ALICE and to better select the data of the rare physics channels. The ALICE upgrade will have major implications for the detector electronics and controls, data acquisition, event triggering, and offline computing and storage systems. Moreover, the experience accumulated during more than two years of operation has also led to new requirements for the control software. We will review all these new needs and the current R&D activities to address them. Several papers of the same conference present some elements of the ALICE online system in more detail.
ALICE moves into warp drive Carena, F; Carena, W; Chapeland, S ...
Journal of Physics: Conference Series, 01/2012, Volume 396, Issue 1
Journal Article
Peer reviewed
Open access
A Large Ion Collider Experiment (ALICE) is the heavy-ion detector designed to study the physics of strongly interacting matter and the quark-gluon plasma at the CERN Large Hadron Collider (LHC). Since its successful start-up in 2010, the LHC has been performing outstandingly, providing the experiments with long periods of stable collisions and an integrated luminosity that greatly exceeds the planned targets. To fully exploit these privileged conditions, we aim at maximizing the experiment's data-taking productivity during stable collisions. This paper presents the evolution of the online systems towards helping us understand the causes of inefficiency and address new requirements. It describes the features added to the ALICE Electronic Logbook (eLogbook) that allow the Run Coordination team to identify, prioritize, fix, and follow up causes of inefficiency in the experiment. Thorough monitoring of the data-taking efficiency provides reports for the collaboration to portray its evolution and to evaluate the measures (fixes and new features) taken to increase it. In particular, the eLogbook helps decision making by providing quantitative input, which can be used to better balance the risks of changes in the production environment against potential gains in the quantity and quality of physics data. The paper also presents the evolution of the Experiment Control System (ECS) to allow on-the-fly error recovery actions on the detector apparatus while limiting as much as possible the loss of integrated luminosity. It concludes with a review of the ALICE efficiency so far and the future plans to improve its monitoring.
ALICE (A Large Ion Collider Experiment) [1] is an experiment at the LHC (Large Hadron Collider) optimized for the study of heavy-ion collisions. The main aim of the experiment is to study the behavior of strongly interacting matter and the quark-gluon plasma. The ALICE DAQ (Data Acquisition) system has been deployed and used intensively during the commissioning of the experiment. This paper will present the evolution of one particular area of the system: the detector readout.
The data produced by each detector are received by DATE [2] (the ALICE Data Acquisition program) using a PCI (Peripheral Component Interconnect) based card called the D-RORC [3] (DAQ Readout Receiver Card). On the order of 400 of these cards are installed in the PCs of the DAQ farm, and they are connected by optical links called DDL [4] (Detector Data Link) to the detector readout electronics. The D-RORC is controlled by the readout software, the part of the DATE program that reads the events coming from these cards. We will present the results obtained during the performance tests of the new release of the D-RORC, based on PCI Express, in development at CERN. The paper will review the working principles of the D-RORC, its use by the readout software, and the benefits of using PCI Express instead of PCI-X. It will also introduce the work in progress on the new release of the readout software, targeting the next hardware platform based on 64-bit computer architecture and DDLs based on 10G Ethernet.
[1] ALICE experiment web site, May 10, 2009, http://public.web.cern.ch/public/en/LHC/ALICEen.html
[2] ALICE data acquisition web site, May 10, 2009, http://phdepaid.web.cern.ch/phdepaid/
[3] D-RORC web site, May 10, 2009, http://alice-proj-ddl.web.cern.ch/alice-proj-ddl/rorc_intro.html
[4] DDL web site, May 10, 2009, http://alice-proj-ddl.web.cern.ch/alice-proj-ddl/ddl_intro.html
ALICE [1] (A Large Ion Collider Experiment) is the detector system at the LHC (Large Hadron Collider) optimized for the study of heavy-ion collisions. Its main aim is to study the behavior of strongly interacting matter and the quark-gluon plasma. Currently, all the information sent by the 18 sub-detectors composing ALICE is read out by DATE [2] (Data Acquisition and Test Environment), the ALICE data acquisition software, using several optical links called DDL [3] (Detector Data Link), each with a maximum throughput of 200 MB/s. In the last year, commercial transmission links with a throughput of 10 Gb/s have become a reality at a widely affordable price. The DATE system has been upgraded to also support this technology in addition to the DDL. This contribution will describe the VHDL firmware of a detector readout board that sends data using the UDP protocol, and the changes made to the readout [4] part of the DATE software to receive information coming from the 1 or 10 Gb/s Ethernet link. It will also describe the relevant details of the test firmware and software and will conclude with the results of the performance tests done at CERN using the new setup.
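The UDP-based readout path described above can be illustrated with a toy sender/receiver pair (a minimal sketch on the loopback interface; the payload size and socket details are assumptions for illustration, not the actual DATE readout implementation):

```python
import socket

# Minimal sketch of a UDP readout: a "detector" side sends one event fragment,
# and a "readout" side receives it. Illustrative only; the payload stands in
# for a fragment arriving over the 1/10 Gb/s Ethernet link.
PAYLOAD = b"\x00" * 1024  # hypothetical 1 KiB event fragment

recv_sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
recv_sock.bind(("127.0.0.1", 0))   # OS-assigned port on loopback
recv_sock.settimeout(2.0)
port = recv_sock.getsockname()[1]

send_sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
send_sock.sendto(PAYLOAD, ("127.0.0.1", port))  # "detector" sends a fragment

data, _ = recv_sock.recvfrom(4096)  # "readout" receives the datagram
print(len(data))                    # fragment size in bytes

# For scale: a 10 Gb/s Ethernet link carries 10e9 / 8 bytes per second,
# i.e. the raw bandwidth of about six of the 200 MB/s DDLs.
ddl_equiv = (10e9 / 8) / 200e6

send_sock.close()
recv_sock.close()
```

The real readout runs at link speed with hardware support, of course; the point of the sketch is only the datagram-per-fragment reception model that the upgraded DATE readout supports alongside the DDL.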