Abstract
We present PHANGS–ALMA, the first survey to map CO J = 2 → 1 line emission at ∼1″ (∼100 pc) spatial resolution from a representative sample of 90 nearby (d ≲ 20 Mpc) galaxies that lie on or near the z = 0 “main sequence” of star-forming galaxies. CO line emission traces the bulk distribution of molecular gas, which is the cold, star-forming phase of the interstellar medium. At the resolution achieved by PHANGS–ALMA, each beam reaches the size of a typical individual giant molecular cloud, so that these data can be used to measure the demographics, life cycle, and physical state of molecular clouds across the population of galaxies where the majority of stars form at z = 0. This paper describes the scientific motivation and background for the survey, sample selection, global properties of the targets, Atacama Large Millimeter/submillimeter Array (ALMA) observations, and characteristics of the delivered data and derived data products. As the ALMA sample serves as the parent sample for parallel surveys with MUSE on the Very Large Telescope, the Hubble Space Telescope, AstroSat, the Very Large Array, and other facilities, we include a detailed discussion of the sample selection. We detail the estimation of galaxy mass, size, star formation rate, CO luminosity, and other properties, compare estimates using different systems, and provide best-estimate integrated measurements for each target. We also report the design and execution of the ALMA observations, which combine a Cycle 5 Large Program, a series of smaller programs, and archival observations. Finally, we present the first 1″ resolution atlas of CO emission from nearby galaxies and describe the properties and contents of the first PHANGS–ALMA public data release.
The ethics of AI in health care: A mapping review
Morley, Jessica; Machado, Caio C.V.; Burr, Christopher ...
Social Science & Medicine (1982), September 2020, Volume 260
Journal Article, Peer reviewed
This article presents a mapping review of the literature concerning the ethics of artificial intelligence (AI) in health care. The goal of this review is to summarise current debates and identify open questions for future research. Five literature databases were searched to support the following research question: how can the primary ethical risks presented by AI-health be categorised, and what issues must policymakers, regulators and developers consider in order to be ‘ethically mindful’? A series of screening stages were carried out, for example removing articles that focused on digital health in general (e.g. data sharing, data access, data privacy, surveillance/nudging, consent, ownership of health data, evidence of efficacy), yielding a total of 156 papers that were included in the review.
We find that ethical issues can be (a) epistemic, related to misguided, inconclusive or inscrutable evidence; (b) normative, related to unfair outcomes and transformative effects; or (c) related to traceability. We further find that these ethical issues arise at six levels of abstraction: individual, interpersonal, group, institutional, sectoral, and societal. Finally, we outline a number of considerations for policymakers and regulators, mapping these to the existing literature, and categorising each as epistemic, normative or traceability-related at the relevant level of abstraction. Our goal is to inform policymakers, regulators and developers of what they must consider if they are to enable health and care systems to capitalise on the dual advantage of ethical AI: maximising the opportunities to cut costs, improve care, and improve the efficiency of health and care systems, whilst proactively avoiding the potential harms. We argue that if action is not swiftly taken in this regard, a new ‘AI winter’ could occur due to chilling effects related to a loss of public trust in the benefits of AI for health care.
• This article maps the ethics of artificial intelligence in health care.
• Ethical issues can be epistemic, normative, or related to traceability.
• Issues affect individuals, relationships, groups, institutions, sectors, and societies.
• An agreed standard for ethical analysis is needed, split by issue and level.
We present a modular approach for analyzing calcium imaging recordings of large neuronal ensembles. Our goal is to simultaneously identify the locations of the neurons, demix spatially overlapping components, and denoise and deconvolve the spiking activity from the slow dynamics of the calcium indicator. Our approach relies on a constrained nonnegative matrix factorization that expresses the spatiotemporal fluorescence activity as the product of a spatial matrix that encodes the spatial footprint of each neuron in the optical field and a temporal matrix that characterizes the calcium concentration of each neuron over time. This framework is combined with a novel constrained deconvolution approach that extracts estimates of neural activity from fluorescence traces, to create a spatiotemporal processing algorithm that requires minimal parameter tuning. We demonstrate the general applicability of our method by applying it to in vitro and in vivo multi-neuronal imaging data, whole-brain light-sheet imaging data, and dendritic imaging data.
• We present a new method for analyzing large-scale calcium imaging datasets.
• The method identifies the cell locations and deconvolves their neural activity.
• Applications to in vivo somatic and dendritic imaging are presented.
• We make available MATLAB and Python implementations of our method.
Advances in calcium imaging pose significant statistical analysis challenges. Pnevmatikakis et al. present a method for identifying and spatially demixing imaged neural components and deconvolving their activity from the indicator dynamics. The method is applied to a variety of datasets.
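The core factorization described above writes the movie Y (pixels × frames) as the product of a nonnegative spatial matrix A (footprints) and a nonnegative temporal matrix C (activity traces). As a rough illustration only, the sketch below implements plain multiplicative-update NMF; the paper's actual constrained NMF adds spatial sparsity, temporal deconvolution, and background terms, none of which are modeled here, and the function name and parameters are illustrative.

```python
import numpy as np

def nmf(Y, k, n_iter=500, eps=1e-9):
    """Toy multiplicative-update NMF: Y (pixels x frames) ~ A @ C,
    with A >= 0 (spatial footprints) and C >= 0 (temporal traces).
    A simplified stand-in for the constrained NMF in the paper."""
    rng = np.random.default_rng(0)
    n, t = Y.shape
    A = rng.random((n, k)) + eps   # positive init keeps updates nonnegative
    C = rng.random((k, t)) + eps
    for _ in range(n_iter):
        # Lee-Seung multiplicative updates minimize ||Y - A C||_F^2
        C *= (A.T @ Y) / (A.T @ A @ C + eps)
        A *= (Y @ C.T) / (A @ C @ C.T + eps)
    return A, C
```

In the real pipeline each column of A would be constrained to a localized footprint and each row of C would be further deconvolved against the calcium-indicator dynamics to recover spiking activity.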
Von Hippel-Lindau (VHL) disease is a hereditary tumor syndrome in which carriers are at an increased risk of developing a variety of tumors in multiple organ systems. A clinical diagnosis of VHL is determined by the presence of specific clinical manifestations, while a molecular genetic diagnosis results from a pathogenic variant in the VHL gene. The majority of mutations occur in VHL coding exons, and DNA analysis of these regions has a reported sensitivity of nearly 100%. However, rare variants in the VHL gene promoter may be detected in some cases of suspected VHL disease. We report two cases where VHL promoter variants were detected and describe the role of multi-step mRNA and protein analysis in the diagnostic evaluation of these cases.
Proton beam dumps are prolific sources of mesons, enabling a powerful technique to search for vector-mediator coupling of dark matter to neutral-pion and higher-mass meson decays. By the end of the decade, the PIP-II linac will be delivering up to 1 MW of proton power to the FNAL campus. This includes a significant increase of power to the Booster Neutrino Beamline (BNB), which delivers 8 GeV protons to the Short Baseline Neutrino (SBN) detectors. By building a new dedicated beam dump target station and using the SBN detectors, an increase in dark matter search sensitivity of more than an order of magnitude relative to the recent MiniBooNE beam dump search can be achieved. This modest-cost upgrade to the BNB would begin testing models of the highly motivated relic-density limit predictions and provide novel ways to test explanations of the anomalous excess of low-energy events seen by MiniBooNE.
A neutrino community workshop was held at Fermilab in January 2020, with the aim of developing an implementation plan for a set of common interfaces to neutrino event generators. This white paper summarizes discussions at the workshop and the resulting plan.