In the fall of 2016, GeantV went through a thorough community evaluation of the project status and of its strategy for sharing the R&D results with the LHC experiments and with the HEP simulation community in general. Following this discussion, GeantV embarked on an ambitious two-year roadmap aiming to deliver a beta version that has most of the final design and several performance features of the final product, partially integrated with some of the experiments' frameworks. The initial GeantV prototype has been updated to a vector-aware concurrent framework, which is able to deliver high-density floating-point computations for most of the performance-critical components, such as propagation in field and physics models. Electromagnetic physics models were adapted to the specific GeantV requirements, aiming for a full demonstration of shower physics performance in the alpha release at the end of 2017. We have revisited and formalized the GeantV user interfaces and helper protocols, which allow connecting user code, provide recipes for accessing MC truth efficiently, and support generating user data in a concurrent environment.
The recent emergence of hardware architectures characterized by many-core or accelerated processors has opened new opportunities for concurrent programming models taking advantage of both SIMD and SIMT architectures. GeantV, a next-generation detector simulation toolkit, has been designed to exploit both the vector capability of mainstream CPUs and the multi-threading capabilities of coprocessors, including NVidia GPUs and the Intel Xeon Phi. The characteristics of these architectures are very different in terms of the vectorization depth and the type of parallelization needed to achieve optimal performance. In this paper we describe the implementation of electromagnetic physics models developed for parallel computing architectures as part of the GeantV project. Results of a preliminary performance evaluation and physics validation are presented as well.
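To make the basketized SIMD approach concrete, here is a minimal sketch in plain C++ of the structure-of-arrays (SoA) layout that lets a compiler vectorize a tight loop over many tracks. The `TrackBasket` type and the constant-dE/dx energy-loss model are illustrative assumptions, not the actual GeantV API.

```cpp
#include <cstddef>
#include <vector>

// Hypothetical SoA basket: one contiguous array per track property,
// so a loop over a single property maps naturally onto SIMD lanes.
struct TrackBasket {
    std::vector<double> energy;  // kinetic energy per track [MeV]
    std::vector<double> step;    // step length per track [cm]
};

// Toy continuous energy-loss kernel: E -= dE/dx * step for every track.
// Each iteration is independent and touches contiguous memory, so an
// optimizing compiler can emit SIMD instructions for the whole loop.
void ApplyEnergyLoss(TrackBasket& basket, double dEdx /* toy constant, MeV/cm */) {
    const std::size_t n = basket.energy.size();
    for (std::size_t i = 0; i < n; ++i) {
        const double eloss = dEdx * basket.step[i];
        basket.energy[i] = (basket.energy[i] > eloss) ? basket.energy[i] - eloss : 0.0;
    }
}
```

The same loop body can be retargeted to SIMT hardware by mapping one iteration to one GPU thread, which is the essence of maintaining a single kernel for both kinds of architecture.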
An intensive R&D and programming effort is required to meet the new challenges posed by future experimental high-energy particle physics (HEP) programs. The GeantV project aims to narrow the gap between the performance of the existing HEP detector simulation software and the ideal performance achievable, exploiting the latest advances in computing technology. The project has developed a particle detector simulation prototype capable of transporting particles in parallel through complex geometries, exploiting instruction-level microparallelism (SIMD and SIMT), task-level parallelism (multithreading) and high-level parallelism (MPI), leveraging both multi-core and many-core opportunities. We present preliminary verification results concerning the electromagnetic (EM) physics models developed for parallel computing architectures within the GeantV project. In order to exploit the potential of vectorization and accelerators and to make the physics models effectively parallelizable, advanced sampling techniques have been implemented and tested. In this paper we introduce a set of automated statistical tests to verify the vectorized models by checking their consistency with the corresponding Geant4 models and to validate them against experimental data.
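One family of advanced sampling techniques that suits vectorization is the alias method, which replaces the data-dependent looping of rejection sampling with a fixed cost of one table lookup and one comparison per draw, keeping SIMD lanes (or GPU warps) in step. The sketch below is a generic textbook (Walker/Vose) implementation offered as an illustration, not GeantV code:

```cpp
#include <cstddef>
#include <vector>

// Generic alias table for a discrete distribution with n outcomes.
class AliasTable {
public:
    explicit AliasTable(const std::vector<double>& weights) {
        const std::size_t n = weights.size();
        double sum = 0.0;
        for (double w : weights) sum += w;
        prob_.resize(n);
        alias_.resize(n, 0);
        std::vector<double> scaled(n);
        std::vector<std::size_t> small, large;
        for (std::size_t i = 0; i < n; ++i) {
            scaled[i] = weights[i] * n / sum;
            (scaled[i] < 1.0 ? small : large).push_back(i);
        }
        // Pair each under-full bin with an over-full donor bin.
        while (!small.empty() && !large.empty()) {
            const std::size_t s = small.back(); small.pop_back();
            const std::size_t l = large.back(); large.pop_back();
            prob_[s] = scaled[s];
            alias_[s] = l;
            scaled[l] -= 1.0 - scaled[s];
            (scaled[l] < 1.0 ? small : large).push_back(l);
        }
        for (std::size_t i : large) prob_[i] = 1.0;
        for (std::size_t i : small) prob_[i] = 1.0;  // float round-off leftovers
    }

    // u1, u2 are independent uniforms in [0, 1): branch-light, fixed cost.
    std::size_t Sample(double u1, double u2) const {
        const std::size_t bin = static_cast<std::size_t>(u1 * prob_.size());
        return (u2 < prob_[bin]) ? bin : alias_[bin];
    }

private:
    std::vector<double> prob_;        // acceptance threshold per bin
    std::vector<std::size_t> alias_;  // fallback outcome per bin
};
```

For the verification step, histograms of samples drawn from such tables can be compared bin by bin against the corresponding Geant4 model with standard consistency tests (e.g., chi-squared or Kolmogorov-Smirnov), which is the kind of automated check the abstract refers to.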
GeantV. Amadio, G.; Ananya, A.; Apostolakis, J. ...
Computing and Software for Big Science, 12/2021, Volume 5, Issue 1
Journal Article
Open Access
Full detector simulation was among the largest CPU consumers in all CERN experiment software stacks for the first two runs of the Large Hadron Collider. In the early 2010s, it was projected that simulation demands would scale linearly with increasing luminosity, with only partial compensation from increasing computing resources. The extension of fast simulation approaches to cover more use cases that represent a larger fraction of the simulation budget is only part of the solution, because of intrinsic precision limitations. The remainder corresponds to speeding up the simulation software by several factors, which is not achievable by just applying simple optimizations to the current code base. In this context, the GeantV R&D project was launched, aiming to redesign the legacy particle transport code in order to benefit from features of fine-grained parallelism, including vectorization and increased locality of both instruction and data. This paper provides an extensive presentation of the results and achievements of this R&D project, as well as the conclusions and lessons learned from the beta version prototype.
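The increased locality of instruction and data mentioned here comes from grouping similar tracks into baskets, so that each worker thread repeatedly runs one small kernel over a whole basket: the hot code path stays in the instruction cache and the inputs stay contiguous. Below is a minimal sketch of such a dispatch pattern using standard C++ threading primitives, assuming a hypothetical `Basket` type and nothing about the real GeantV scheduler:

```cpp
#include <condition_variable>
#include <deque>
#include <mutex>
#include <vector>

struct Basket { std::vector<double> energy; };  // hypothetical grouped tracks

// Thread-safe basket queue: producers push, workers block on Pop().
class BasketQueue {
public:
    void Push(Basket b) {
        { std::lock_guard<std::mutex> lk(m_); q_.push_back(std::move(b)); }
        cv_.notify_one();
    }
    bool Pop(Basket& out) {  // blocks; returns false on shutdown
        std::unique_lock<std::mutex> lk(m_);
        cv_.wait(lk, [this] { return !q_.empty() || done_; });
        if (q_.empty()) return false;
        out = std::move(q_.front());
        q_.pop_front();
        return true;
    }
    void Shutdown() {
        { std::lock_guard<std::mutex> lk(m_); done_ = true; }
        cv_.notify_all();
    }
private:
    std::deque<Basket> q_;
    std::mutex m_;
    std::condition_variable cv_;
    bool done_ = false;
};

// Each worker drains baskets and runs the same tight loop over each one.
void Worker(BasketQueue& queue) {
    Basket b;
    while (queue.Pop(b)) {
        for (double& e : b.energy) e *= 0.99;  // placeholder physics step
    }
}
```

In use, one would spawn several `std::thread(Worker, std::ref(queue))` workers, push baskets from the transport loop, and call `Shutdown()` before joining the threads.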
Performance of GeantV EM Physics Models. Amadio, G.; Ananya, A.; Apostolakis, J. ...
Journal of Physics: Conference Series, 10/2017, Volume 898, Issue 7
Journal Article
Peer-reviewed
Open Access
The recent progress in parallel hardware architectures, with deeper vector pipelines and many-core technologies, brings opportunities for HEP experiments to take advantage of SIMD and SIMT computing models. Launched in 2013, the GeantV project studies performance gains in propagating multiple particles in parallel, improving instruction throughput and data locality in HEP event simulation on modern parallel hardware architectures. Due to the complexity of the geometry description and physics algorithms of a typical HEP application, performance analysis is indispensable for identifying the factors limiting parallel execution. In this report, we present design considerations and the preliminary computing performance of GeantV physics models on coprocessors (Intel Xeon Phi and NVidia GPUs) as well as on mainstream CPUs.
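The simplest form of the performance analysis mentioned here is a throughput micro-benchmark that times one kernel across basket sizes, exposing where vector units or caches saturate. The harness below is a sketch using only the standard library and a placeholder kernel, not the project's actual profiling setup:

```cpp
#include <chrono>
#include <cstddef>
#include <cstdio>
#include <vector>

// Placeholder standing in for a physics-model kernel call.
void Kernel(std::vector<double>& e) {
    for (double& x : e) x = x * 0.99 + 0.01;
}

int main() {
    // Sweep basket sizes; ns/track should drop as SIMD costs amortize,
    // then flatten (or rise) once caches become the limiting factor.
    for (std::size_t n : {8u, 64u, 512u, 4096u, 32768u}) {
        std::vector<double> energies(n, 100.0);
        const int reps = 1'000'000 / static_cast<int>(n) + 1;
        const auto t0 = std::chrono::steady_clock::now();
        for (int r = 0; r < reps; ++r) Kernel(energies);
        const auto t1 = std::chrono::steady_clock::now();
        const double ns = std::chrono::duration<double, std::nano>(t1 - t0).count();
        std::printf("basket=%6zu  %8.2f ns/track\n", n, ns / (reps * n));
    }
    return 0;
}
```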
Abstract
Background
Oral anticoagulation (OAC) is paramount to effective thromboprophylaxis, yet adherence to OAC remains largely suboptimal in patients with atrial fibrillation (AF).
Purpose
We aimed to assess the impact of an educational, motivational intervention on adherence to OAC in patients with non-valvular AF.
Methods
Hospitalised patients with non-valvular AF who received OAC were randomly assigned to usual medical care or to a proactive intervention comprising motivational interviewing and tailored counseling on medication adherence. The primary study outcome was adherence to OAC at 1 year, evaluated as the Proportion of Days Covered (PDC) by OAC regimens and assessed through nationwide prescription registers. Secondary outcomes included the rate of persistence with OAC, gaps in treatment, the proportion of VKA-takers with labile INR (defined as time in therapeutic range <70%) and clinical events.
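For readers unfamiliar with the metric, PDC is the fraction of days in the observation window on which the patient had OAC available from dispensed prescriptions. A worked example with illustrative numbers (not study data):

\[
\mathrm{PDC} = \frac{\text{days covered by dispensed OAC}}{\text{days in observation period}},
\qquad \text{e.g.}\quad \frac{292}{365} = 0.80,
\]

so a patient covered on more than 292 of 365 follow-up days clears the good-adherence threshold of PDC > 80% used below.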
Results
A total of 1009 patients were randomised: 500 to the intervention group and 509 to the control group. At 1-year follow-up, 77.2% (386/500) of patients in the intervention group had good adherence (PDC>80%), compared with 55% (280/509) in the control group (adjusted odds ratio 2.84, 95% confidence interval 2.14–3.75; p<0.001). Mean PDC±SD was 0.85±0.26 and 0.75±0.31, respectively (p<0.001). Patients who received the intervention were more likely to persist with their OAC therapy at 1 year, while usual medical care was associated with more major (≥3 months) treatment gaps (Figure). Among 212 VKA-takers, patients in the intervention group were less likely to have labile INR than those in the control group: 21/120 (17.1%) vs 34/92 (37.1%); OR 0.33, 95% CI 0.15–0.72; p=0.005. Clinical events over a median follow-up period of 2 years occurred at a numerically lower, yet non-significant, rate in the intervention group (Table).
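As a quick consistency check, the crude (unadjusted) odds ratio can be recomputed from the reported counts; it differs slightly from 2.84 because the published estimate is covariate-adjusted:

\[
\mathrm{OR}_{\text{crude}} = \frac{386/(500-386)}{280/(509-280)} = \frac{386/114}{280/229} \approx \frac{3.39}{1.22} \approx 2.77 .
\]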
Conclusions
In patients receiving OAC therapy for non-valvular AF, a motivational intervention significantly improved patterns of medication adherence, without significantly affecting clinical outcomes.
Table. Primary and secondary outcomes
Funding Acknowledgement
Type of funding source: None
Abstract
Background
The association of heart failure (HF) with the prognosis of atrial fibrillation (AF) remains unclear.
Objectives
To assess all-cause mortality in patients following hospitalization with comorbid AF, in relation to the presence of HF.
Methods
We performed a cross-sectional analysis of data from 977 patients discharged from the cardiology ward of a single tertiary center between 2015 and 2018 and followed for a median of 2 years. The association between HF and the primary endpoint of death from any cause was assessed using multivariable Cox regression.
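For reference, the multivariable Cox model expresses the hazard of death as a baseline hazard scaled by exponentiated covariate effects, so the adjusted hazard ratio reported below for HF is the exponential of its fitted coefficient; the notation is generic, not taken from the study:

\[
h(t \mid \mathbf{x}) = h_0(t)\,\exp\!\big(\beta_1 x_1 + \cdots + \beta_p x_p\big),
\qquad \mathrm{aHR}_{\mathrm{HF}} = e^{\beta_{\mathrm{HF}}} .
\]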
Results
HF was documented in 505 (51.7%) of the AF cases at discharge, comprising HFrEF (17.9%), HFmrEF (16.5%) and HFpEF (25.2%). A primary endpoint event occurred in 212 patients (42%) in the AF-HF group and in 86 patients (18.2%) in the AF-no-HF group (adjusted hazard ratio [aHR] 2.27; 95% confidence interval [CI] 1.65 to 3.13; p<0.001). HF was also associated with a higher risk of the composite secondary endpoint of death from any cause or AF- or HF-specific hospitalization (aHR 1.69; 95% CI 1.32 to 2.16; p<0.001). The associations of HF with the primary and secondary endpoints were significant and similar for AF-HFrEF, AF-HFmrEF and AF-HFpEF.
Conclusions
HF was present in half of the patients discharged from the hospital with comorbid AF. The presence of HF on top of AF was independently associated with a significantly higher risk of all-cause mortality than its absence, irrespective of HF subtype.
Funding Acknowledgement
Type of funding source: None
Abstract
Background
Prior risk stratification schemes for atrial fibrillation (AF) have focused extensively on stroke as the principal outcome. However, accurate estimation of the risk of death in patients with AF has received disproportionately little attention.
Purpose
The aim of this study was to develop and validate a risk score for predicting mortality in patients with AF who underwent a hospitalization for cardiac reasons.
Methods
The new risk score was developed and internally validated in 887 patients with AF who were followed up for a median of 2 years. The outcome measure was all-cause mortality. Biomarker samples, echocardiographic data and renal function values were obtained at the date closest to hospital discharge. A Cox model that identified the variables contributing significantly to the prediction of all-cause mortality was converted to a risk-points system by weighting the model coefficients. The model was internally validated by bootstrapping, assessing both discrimination and calibration.
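The abstract does not specify the coefficient-to-points conversion; a common scheme (the Sullivan/Framingham approach, assumed here purely for illustration) scales each Cox coefficient by a constant \(B\) and rounds,

\[
\text{points}_i = \operatorname{round}\!\left(\frac{\beta_i\,\Delta x_i}{B}\right),
\]

where \(\Delta x_i\) is the increment of predictor \(i\) being scored and \(B\) is the coefficient change chosen to correspond to one point.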
Results
311 all-cause deaths were reported during 1755 person-years of follow-up (incidence rate 17.7 events per 100 person-years). The most important predictors of death were N-terminal pro B-type natriuretic peptide (NT-proBNP), high-sensitivity troponin-T (hs-TnT), left atrial area indexed to body surface area (LAAi), prior cardiac arrest, kidney impairment, congestive heart failure and age; these were included in the BLACCK (AF) death risk score. The score was well calibrated (observed probabilities agreed with predicted probabilities) and showed good discriminative ability (c-index 0.87; 95% CI 0.85–0.90). Internal validation of the score showed minimal over-fitting (optimism-corrected c-index of 0.85). The 1-, 2- and 3-year risks of death can be read directly from the nomogram using the score's total points (Figure 1).
Figure 1. BLACCK (AF) risk score nomogram
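As a quick arithmetic check, the reported incidence rate follows directly from the raw counts:

\[
\frac{311\ \text{deaths}}{1755\ \text{person-years}} \approx 0.177\ \text{per person-year} = 17.7\ \text{per 100 person-years}.
\]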
Conclusions
We developed a simple, well-calibrated and internally validated novel risk score for predicting the 1-, 2- and 3-year risk of death in patients with AF after a hospitalization for cardiac reasons. The BLACCK (AF) death risk score combines cardiac biomarkers with clinical information, performed well in internal validation, and may assist physicians in decision-making when treating patients with AF.