Digital twins to personalize medicine. Björnsson, Bergthor; Borrebaeck, Carl; Elander, Nils, et al.
Genome Medicine, 12/2019, Volume 12, Issue 1
Journal Article
Peer reviewed
Open access
Personalized medicine requires the integration and processing of vast amounts of data. Here, we propose a solution to this challenge that is based on constructing Digital Twins. These are high-resolution models of individual patients that are computationally treated with thousands of drugs to find the drug that is optimal for the patient.
The ATLAS experiment at the Large Hadron Collider has a broad physics programme ranging from precision measurements to direct searches for new particles and new interactions, requiring ever larger and ever more accurate datasets of simulated Monte Carlo events. Detector simulation with Geant4 is accurate but requires significant CPU resources. Over the past decade, ATLAS has developed and utilized tools that replace the most CPU-intensive component of the simulation—the calorimeter shower simulation—with faster simulation methods. Here, AtlFast3, the next generation of high-accuracy fast simulation in ATLAS, is introduced. AtlFast3 combines parameterized approaches with machine-learning techniques and is deployed to meet the current and future computing challenges and simulation needs of the ATLAS experiment. With highly accurate performance and significantly improved modelling of substructure within jets, AtlFast3 can simulate large numbers of events for a wide range of physics processes.
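The core idea of fast simulation, replacing a stepwise, CPU-intensive calculation with a single draw from a fitted parameterization, can be illustrated with a toy sketch in Python (all functions and numbers below are hypothetical illustrations, not ATLAS or Geant4 code):

```python
import random

def full_sim_energy(e_in, steps=1000, rng=random):
    # Toy stand-in for a stepwise "full" simulation: the particle loses
    # a small random fraction of its remaining energy at each step.
    deposited = 0.0
    remaining = e_in
    for _ in range(steps):
        frac = rng.uniform(0.0, 2.0 / steps)
        deposited += remaining * frac
        remaining -= remaining * frac
    return deposited

def fast_sim_energy(e_in, response=0.63, smear=0.05, rng=random):
    # Toy parameterized replacement: instead of looping over many steps,
    # draw the total detector response from a distribution whose
    # parameters (invented here) would be fitted to the full simulation.
    return e_in * rng.gauss(response, smear)
```

In a real fast simulation the parameterization is fitted to, and validated against, the full simulation; the point of the sketch is only that one cheap draw replaces many expensive steps.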
The recent trend for journals to require open access to primary data included in publications has been embraced by many biologists, but has caused apprehension amongst researchers engaged in long-term ecological and evolutionary studies. A worldwide survey of 73 principal investigators (PIs) with long-term studies revealed positive attitudes towards sharing data with the agreement or involvement of the PI, and 93% of PIs have historically shared data. Only 8% were in favor of uncontrolled, open access to primary data, while 63% expressed serious concern. We present here their viewpoint on an issue that can have non-trivial scientific consequences. We discuss potential costs of public data archiving and provide possible solutions to meet the needs of journals and researchers.
Public data archiving is the archiving of primary data used in publications so that they can be preserved and made accessible to all online.
Public data archiving is increasingly required by journals. However, the costs of public data archiving might be underestimated, in particular with respect to long-term studies.
Long-term studies have been responsible for the answers to many important questions in evolution and ecology that could only be addressed by following the life histories of individuals for decades.
Several papers have been published in favor of public data archiving, but a more balanced viewpoint is necessary to allow a discussion to emerge on a code of ethics and ways to preserve and protect the data, encourage the initiation and continuation of long-term studies, and meet the requirements of the whole scientific community.
•Behavioural study investigated transmission risks and mitigations for UK mass events in 2021.
•Explored non-pharmaceutical interventions (NPIs), e.g. social distancing and face coverings.
•21 events observed. Data collection method presented with overview of data.
The Events Research Programme (ERP) was a multi-disciplinary study undertaken during the COVID-19 pandemic to investigate SARS-CoV-2 transmission risks and mitigations around the reopening of mass events in the UK in 2021 – including a behavioural study, exploring how non-pharmaceutical interventions (NPIs), such as social distancing, pre-event virus testing, and the use of face coverings might enable people to attend events safely. This behavioural study is discussed here.
A total of 21 pilot events were involved in the study between April and July 2021. The venues used for the pilots varied in size, layout, occupancy level, and crowd management approaches. Data was extracted from manual qualitative observations and venue CCTV cameras which recorded routinely at venues. In addition, 890 cameras were installed during the events to capture attendee movement outside the venues, during arrival, in event areas, circulation spaces, bars and restaurants, and on exiting. A mixed method approach was adopted to ensure locations and activities of interest were captured, quantitative data gathered, and that this data could be placed in context. This enabled a behavioural study, quantifying crowd performance behaviours for comparison between and within events. This paper describes the background to this work, the method adopted and provides a brief overview of the data collected, relating primarily to (i) crowd densities, (ii) social distancing (captured here as contact distancing), and (iii) the use of face coverings.
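The two headline quantities, crowd density and contact distancing, can be sketched in a few lines of Python (the positions, threshold, and area below are hypothetical examples, not ERP data):

```python
import math

def pairwise_contacts(positions, threshold_m=2.0):
    """Count pairs of attendees closer than threshold_m metres apart."""
    contacts = 0
    for i in range(len(positions)):
        for j in range(i + 1, len(positions)):
            dx = positions[i][0] - positions[j][0]
            dy = positions[i][1] - positions[j][1]
            if math.hypot(dx, dy) < threshold_m:
                contacts += 1
    return contacts

def crowd_density(n_people, area_m2):
    """People per square metre for an observed zone."""
    return n_people / area_m2

# Invented example: three attendees on a 30 m^2 concourse.
positions = [(0.0, 0.0), (1.0, 0.0), (5.0, 5.0)]  # metres
pairwise_contacts(positions)   # 1 pair closer than 2 m
crowd_density(3, 30.0)         # 0.1 people per square metre
```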
Introduction: Deep learning models for detecting episodes of atrial fibrillation (AF) using rhythm information in long-term ambulatory ECG recordings have shown high performance. However, the rhythm-based approach does not take advantage of the morphological information conveyed by the different ECG waveforms, particularly the f-waves. As a result, the performance of such models may be inherently limited. Methods: To address this limitation, we have developed a deep learning model, named RawECGNet, to detect episodes of AF and atrial flutter (AFl) using the raw, single-lead ECG. We compare the generalization performance of RawECGNet on two external data sets that account for distribution shifts in geography, ethnicity, and lead position. RawECGNet is further benchmarked against a state-of-the-art deep learning model, named ArNet2, which utilizes rhythm information as input. Results: Using RawECGNet, the F1 scores for the different leads in the external test sets were 0.91-0.94 in RBDB and 0.93 in SHDB, compared to 0.89-0.91 in RBDB and 0.91 in SHDB for ArNet2. The results highlight RawECGNet as a high-performance, generalizable algorithm for detection of AF and AFl episodes, exploiting information on both rhythm and morphology.
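The F1 score reported above is the harmonic mean of precision and recall. A minimal reference implementation over per-sample labels (toy data, plain Python; not the RawECGNet evaluation code):

```python
def f1_score(y_true, y_pred, positive=1):
    # Precision = TP / (TP + FP); recall = TP / (TP + FN);
    # F1 is their harmonic mean for the positive class.
    tp = sum(1 for t, p in zip(y_true, y_pred) if t == positive and p == positive)
    fp = sum(1 for t, p in zip(y_true, y_pred) if t != positive and p == positive)
    fn = sum(1 for t, p in zip(y_true, y_pred) if t == positive and p != positive)
    if tp == 0:
        return 0.0
    precision = tp / (tp + fp)
    recall = tp / (tp + fn)
    return 2 * precision * recall / (precision + recall)

# Toy example: 2 true positives, 1 false positive, 1 false negative.
f1_score([1, 1, 0, 1, 0], [1, 0, 0, 1, 1])  # ≈ 0.667
```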
Colonoscopy is considered the gold standard for detection of colorectal cancer and its precursors. Existing examination methods are, however, hampered by high overall miss-rate, and many abnormalities are left undetected. Computer-Aided Diagnosis systems based on advanced machine learning algorithms are touted as a game-changer that can identify regions in the colon overlooked by the physicians during endoscopic examinations, and help detect and characterize lesions. In previous work, we have proposed the ResUNet++ architecture and demonstrated that it produces more efficient results compared with its counterparts U-Net and ResUNet. In this paper, we demonstrate that further improvements to the overall prediction performance of the ResUNet++ architecture can be achieved by using Conditional Random Field (CRF) and Test-Time Augmentation (TTA). We have performed extensive evaluations and validated the improvements using six publicly available datasets: Kvasir-SEG, CVC-ClinicDB, CVC-ColonDB, ETIS-Larib Polyp DB, ASU-Mayo Clinic Colonoscopy Video Database, and CVC-VideoClinicDB. Moreover, we compare our proposed architecture and resulting model with other state-of-the-art methods. To explore the generalization capability of ResUNet++ on different publicly available polyp datasets, so that it could be used in a real-world setting, we performed an extensive cross-dataset evaluation. The experimental results show that applying CRF and TTA improves the performance on various polyp segmentation datasets both on the same dataset and cross-dataset. To check the model's performance on difficult-to-detect polyps, we selected, with the help of an expert gastroenterologist, 196 sessile or flat polyps that are less than ten millimeters in size. This additional data has been made available as a subset of Kvasir-SEG. Our approaches showed good results for flat or sessile and smaller polyps, which are known to be one of the major reasons for high polyp miss-rates.
This is one of the significant strengths of our work and indicates that our methods should be investigated further for use in clinical practice.
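Test-time augmentation averages a model's predictions over transformed copies of the input, undoing each transform on the corresponding prediction before averaging. A minimal sketch with flips (the `predict` function is a hypothetical stand-in for a segmentation model, not ResUNet++):

```python
import numpy as np

def predict(image):
    # Stand-in for a segmentation model (hypothetical); returns a
    # per-pixel foreground probability map the same shape as the input.
    return np.clip(image.astype(float), 0.0, 1.0)

def tta_predict(image, model=predict):
    """Average predictions over identity, horizontal and vertical flips."""
    preds = []
    for flip in (lambda x: x,
                 lambda x: x[:, ::-1],   # horizontal flip
                 lambda x: x[::-1, :]):  # vertical flip
        p = model(flip(image))
        preds.append(flip(p))           # undo the flip on the prediction
    return np.mean(preds, axis=0)
```

Averaging over transformed views tends to smooth out orientation-dependent errors, which is one plausible reason TTA helps on segmentation tasks.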
Past research provides instructive yet incomplete answers as to how incumbent firms can address competing concerns as they embrace digital innovation. In particular, it offers only partial explanations of why different concerns emerge, how they manifest, and how firms can manage them. In response, we present a longitudinal case study of Volvo Cars’ connected car initiative. Combining extant literature with insights from the case, we argue that incumbent firms face four competing concerns—capability (existing versus requisite), focus (product versus process), collaboration (internal versus external), and governance (control versus flexibility)—and that these concerns are systemically interrelated. Firms must therefore manage these concerns cohesively by continuously balancing new opportunities and established practices.