Digital twins to personalize medicine
Björnsson, Bergthor; Borrebaeck, Carl; Elander, Nils ...
Genome Medicine, 12/2019, Volume 12, Issue 1
Journal Article
Peer reviewed
Open access
Personalized medicine requires the integration and processing of vast amounts of data. Here, we propose a solution to this challenge that is based on constructing Digital Twins. These are high-resolution models of individual patients that are computationally treated with thousands of drugs to find the drug that is optimal for the patient.
The ATLAS experiment at the Large Hadron Collider has a broad physics programme ranging from precision measurements to direct searches for new particles and new interactions, requiring ever larger and ever more accurate datasets of simulated Monte Carlo events. Detector simulation with
Geant4
is accurate but requires significant CPU resources. Over the past decade, ATLAS has developed and utilized tools that replace the most CPU-intensive component of the simulation—the calorimeter shower simulation—with faster simulation methods. Here, AtlFast3, the next generation of high-accuracy fast simulation in ATLAS, is introduced. AtlFast3 combines parameterized approaches with machine-learning techniques and is deployed to meet current and future computing challenges, and simulation needs of the ATLAS experiment. With highly accurate performance and significantly improved modelling of substructure within jets, AtlFast3 can simulate large numbers of events for a wide range of physics processes.
The recent trend for journals to require open access to primary data included in publications has been embraced by many biologists, but has caused apprehension amongst researchers engaged in long-term ecological and evolutionary studies. A worldwide survey of 73 principal investigators (PIs) with long-term studies revealed positive attitudes towards sharing data with the agreement or involvement of the PI, and 93% of PIs have historically shared data. Only 8% were in favor of uncontrolled, open access to primary data, while 63% expressed serious concern. We present here their viewpoint on an issue that can have non-trivial scientific consequences. We discuss potential costs of public data archiving and provide possible solutions to meet the needs of journals and researchers.
Public data archiving is the archiving of primary data used in publications so that they can be preserved and made accessible to all online.
Public data archiving is increasingly required by journals. However, the costs of public data archiving might be underestimated, in particular with respect to long-term studies.
Long-term studies have been responsible for the answers to many important questions in evolution and ecology which could only be answered through following the life-histories of individuals for decades.
Several papers have been published in favor of public data archiving, but a more balanced viewpoint is necessary to allow a discussion to emerge on a code of ethics and ways to preserve and protect the data, encourage the initiation and continuation of long-term studies, and meet the requirements of the whole scientific community.
•Behavioural study investigated transmission risks and mitigations for UK mass events in 2021.
•Explored non-pharmaceutical interventions (NPIs), e.g. social distancing and face coverings.
•21 events observed. Data collection method presented with overview of data.
The Events Research Programme (ERP) was a multi-disciplinary study undertaken during the COVID-19 pandemic to investigate SARS-CoV-2 transmission risks and mitigations around the reopening of mass events in the UK in 2021 – including a behavioural study, exploring how non-pharmaceutical interventions (NPIs), such as social distancing, pre-event virus testing, and the use of face coverings might enable people to attend events safely. This behavioural study is discussed here.
A total of 21 pilot events were involved in the study between April and July 2021. The venues used for the pilots varied in size, layout, occupancy level, and crowd management approaches. Data was extracted from manual qualitative observations and venue CCTV cameras which recorded routinely at venues. In addition, 890 cameras were installed during the events to capture attendee movement outside the venues, during arrival, in event areas, circulation spaces, bars and restaurants, and on exiting. A mixed method approach was adopted to ensure locations and activities of interest were captured, quantitative data gathered, and that this data could be placed in context. This enabled a behavioural study, quantifying crowd performance behaviours for comparison between and within events. This paper describes the background to this work, the method adopted and provides a brief overview of the data collected, relating primarily to (i) crowd densities, (ii) social distancing (captured here as contact distancing), and (iii) the use of face coverings.
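The two headline quantities in this study, crowd density and contact distancing, can be computed directly once attendee positions have been estimated from camera footage. The sketch below is illustrative only: the function names, the position-estimation step, and the 2 m threshold are assumptions, not details taken from the study's own analysis pipeline.

```python
import numpy as np

def contact_pairs(positions, threshold_m=2.0):
    """Count pairs of attendees closer than a distancing threshold.

    `positions` is an (N, 2) array of attendee coordinates in metres,
    e.g. estimated from camera footage. The 2 m default is an assumed
    stand-in for a social-distancing guideline, not the study's value.
    """
    pts = np.asarray(positions, dtype=float)
    # Pairwise Euclidean distances between all attendees.
    diff = pts[:, None, :] - pts[None, :, :]
    dist = np.linalg.norm(diff, axis=-1)
    # Upper triangle only, so each unordered pair is counted once.
    i, j = np.triu_indices(len(pts), k=1)
    return int(np.sum(dist[i, j] < threshold_m))

def crowd_density(n_people, area_m2):
    """Crowd density in persons per square metre."""
    return n_people / area_m2
```

For example, three attendees at (0, 0), (1, 0) and (10, 10) metres yield one sub-threshold contact pair, and 50 people in a 25 m² circulation space give a density of 2 persons/m².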
Introduction: Deep learning models for detecting episodes of atrial fibrillation (AF) using rhythm information in long-term ambulatory ECG recordings have shown high performance. However, the rhythm-based approach does not take advantage of the morphological information conveyed by the different ECG waveforms, particularly the f-waves. As a result, the performance of such models may be inherently limited. Methods: To address this limitation, we have developed a deep learning model, named RawECGNet, to detect episodes of AF and atrial flutter (AFl) using the raw, single-lead ECG. We compare the generalization performance of RawECGNet on two external data sets that account for distribution shifts in geography, ethnicity, and lead position. RawECGNet is further benchmarked against a state-of-the-art deep learning model, named ArNet2, which utilizes rhythm information as input. Results: Using RawECGNet, the F1 scores for the different leads in the external test sets were 0.91-0.94 in RBDB and 0.93 in SHDB, compared to 0.89-0.91 in RBDB and 0.91 in SHDB for ArNet2. These results highlight RawECGNet as a high-performance, generalizable algorithm for detection of AF and AFl episodes, exploiting information on both rhythm and morphology.
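The F1 score used to benchmark RawECGNet against ArNet2 is the harmonic mean of precision and recall. A minimal, sample-wise sketch is shown below; the helper name `binary_f1` is hypothetical, and the paper's actual evaluation is at the episode level, which this simplified per-sample version does not reproduce.

```python
def binary_f1(y_true, y_pred):
    """F1 score over binary per-sample labels (1 = AF/AFl, 0 = other).

    Simplified illustration: counts true positives, false positives and
    false negatives, then returns 2PR / (P + R).
    """
    tp = sum(1 for t, p in zip(y_true, y_pred) if t == 1 and p == 1)
    fp = sum(1 for t, p in zip(y_true, y_pred) if t == 0 and p == 1)
    fn = sum(1 for t, p in zip(y_true, y_pred) if t == 1 and p == 0)
    if tp == 0:
        return 0.0  # no correct detections: precision or recall is zero
    precision = tp / (tp + fp)
    recall = tp / (tp + fn)
    return 2 * precision * recall / (precision + recall)
```

With one true positive, one false positive and one false negative, precision and recall are both 0.5, so F1 = 0.5.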
Past research provides instructive yet incomplete answers as to how incumbent firms can address competing concerns as they embrace digital innovation. In particular, it offers only partial explanations of why different concerns emerge, how they manifest, and how firms can manage them. In response, we present a longitudinal case study of Volvo Cars' connected car initiative. Combining extant literature with insights from the case, we argue that incumbent firms face four competing concerns—capability (existing versus requisite), focus (product versus process), collaboration (internal versus external), and governance (control versus flexibility)—and that these concerns are systemically interrelated. Firms must therefore manage these concerns cohesively by continuously balancing new opportunities and established practices.
The integration of 3D city models with Building Information Models (BIM), coined as GeoBIM, facilitates improved data support to several applications, e.g., 3D map updates, building permit issuing, detailed city analysis, infrastructure design, and context-based building design, to name a few. To achieve the integration, several issues need to be tackled and solved, i.e., harmonization of features, interoperability, format conversions, and integration of procedures. The GeoBIM benchmark 2019, funded by ISPRS and EuroSDR, evaluated the state of implementation of tools addressing some of those issues. In particular, the part of the benchmark described in this paper investigates the application of georeferencing to Industry Foundation Classes (IFC) models and consistent conversion between 3D city models and BIM, considering OGC CityGML and buildingSMART IFC as reference standards. In the benchmark, sample datasets in the two reference standards were provided. External volunteers were asked to describe and test georeferencing procedures for IFC models and conversion tools between CityGML and IFC. The analysis of the delivered answers and processed datasets showed that, while tools and procedures to support georeferencing and data conversion are available, a comprehensive definition of the requirements, clear rules for performing these two tasks, and solid technological solutions implementing them are still lacking. These specific issues can be a sensible starting point for planning the next GeoBIM integration agendas.
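Georeferencing an IFC model essentially maps its local engineering coordinates into a geodetic coordinate reference system. The 2D sketch below is loosely modelled on the offset-plus-rotation parameters IFC carries for map conversion; the function name and parameterization are illustrative assumptions, not the standard's API, and a real implementation would also handle scale, height, and the CRS definition itself.

```python
import math

def ifc_to_map(x_local, y_local, eastings, northings, rotation_rad):
    """Transform 2D IFC local coordinates into map coordinates.

    Illustrative only: applies a rotation of the local x-axis followed
    by an eastings/northings offset, the kind of parameters an IFC
    map-conversion record typically carries.
    """
    c, s = math.cos(rotation_rad), math.sin(rotation_rad)
    # Rotate the local point, then translate into the map CRS.
    e = eastings + c * x_local - s * y_local
    n = northings + s * x_local + c * y_local
    return e, n
```

For instance, a point 1 m along the local x-axis, with a 90° rotation and an offset of (100, 200), lands at approximately (100, 201) in map coordinates.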