Here, we describe an algorithm for identifying point-source transients and moving objects on reference-subtracted optical images containing artifacts of processing and instrumentation. The algorithm makes use of the supervised machine learning technique known as Random Forest. We present results from its use in the Dark Energy Survey Supernova program (DES-SN), where it was trained using a sample of 898,963 signal and background events generated by the transient detection pipeline. After reprocessing the data collected during the first DES-SN observing season (2013 September through 2014 February) using the algorithm, the number of transient candidates eligible for human scanning decreased by a factor of 13.4, while only 1.0% of the artificial Type Ia supernovae (SNe) injected into search images to monitor survey efficiency were lost, most of them very faint events. Here we characterize the algorithm's performance in detail, and we discuss how it can inform pipeline design decisions for future time-domain imaging surveys, such as the Large Synoptic Survey Telescope and the Zwicky Transient Facility. An implementation of the algorithm and the training data used in this paper are available at http://portal.nersc.gov/project/dessn/autoscan.
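The abstract does not list the feature set or hyperparameters used; as a minimal sketch of the general approach (with synthetic data and hypothetical features, not the autoscan implementation itself), a Random Forest classifier separating "signal" from "background" detections might look like:

```python
# Minimal sketch of Random Forest signal/artifact classification.
# Features and data here are synthetic placeholders, NOT the actual
# autoscan feature set or training sample.
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)
n = 2000
# Hypothetical per-detection features, e.g. shape and flux diagnostics.
X_signal = rng.normal(loc=[1.0, 0.0, 5.0], scale=1.0, size=(n, 3))
X_background = rng.normal(loc=[-1.0, 2.0, 1.0], scale=1.0, size=(n, 3))
X = np.vstack([X_signal, X_background])
y = np.concatenate([np.ones(n), np.zeros(n)])  # 1 = real, 0 = artifact

X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)
clf = RandomForestClassifier(n_estimators=100, random_state=0)
clf.fit(X_tr, y_tr)

# Scores in [0, 1]; the acceptance threshold trades candidate purity
# against completeness for injected fakes.
scores = clf.predict_proba(X_te)[:, 1]
accepted = scores > 0.5
print(f"held-out accuracy: {clf.score(X_te, y_te):.2f}")
```

Raising the threshold above 0.5 is how a pipeline of this kind would shrink the human-scanning load while monitoring the fraction of injected fakes lost.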
We present cosmological results from a combined analysis of galaxy clustering and weak gravitational lensing, using 1321 deg2 of griz imaging data from the first year of the Dark Energy Survey (DES Y1). We combine three two-point functions: (i) the cosmic shear correlation function of 26 million source galaxies in four redshift bins, (ii) the galaxy angular autocorrelation function of 650,000 luminous red galaxies in five redshift bins, and (iii) the galaxy-shear cross-correlation of luminous red galaxy positions and source galaxy shears. To demonstrate the robustness of these results, we use independent pairs of galaxy shape, photometric-redshift estimation and validation, and likelihood analysis pipelines. To prevent confirmation bias, the bulk of the analysis was carried out while "blind" to the true results; we describe an extensive suite of systematics checks performed and passed during this blinded phase. The data are modeled in flat ΛCDM and wCDM cosmologies, varying 6 (for ΛCDM) or 7 (for wCDM) cosmological parameters including the neutrino mass density, marginalizing over 20 nuisance parameters, and using a 457×457-element analytic covariance matrix. We find consistent cosmological results from these three two-point functions, and from their combination we obtain S8 ≡ σ8(Ωm/0.3)^0.5 = 0.773^{+0.026}_{−0.020} and Ωm = 0.267^{+0.030}_{−0.017} for ΛCDM; for wCDM, we find S8 = 0.782^{+0.036}_{−0.024}, Ωm = 0.284^{+0.033}_{−0.030}, and w = −0.82^{+0.21}_{−0.20} at 68% C.L. The precision of these DES Y1 constraints rivals that of the Planck cosmic microwave background measurements, allowing a comparison of structure in the very early and late Universe on equal terms. Although the DES Y1 best-fit values for S8 and Ωm are lower than the central values from Planck for both ΛCDM and wCDM, the Bayes factor indicates that the DES Y1 and Planck data sets are consistent with each other in the context of ΛCDM.
Combining DES Y1 with Planck, baryonic acoustic oscillation measurements from SDSS, 6dF, and BOSS, and Type Ia supernovae from the Joint Lightcurve Analysis data set, we derive very tight constraints on cosmological parameters: S8 = 0.802 ± 0.012 and Ωm = 0.298 ± 0.007 in ΛCDM, and w = −1.00^{+0.05}_{−0.04} in wCDM. Upcoming Dark Energy Survey analyses will provide more stringent tests of the ΛCDM model and of extensions such as a time-varying dark energy equation of state or modified gravity.
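The S8 parameter quoted above is a simple function of σ8 and Ωm; a quick worked example of the arithmetic (the σ8 value here is a placeholder for illustration, since it is not quoted in the abstract):

```python
# S8 ≡ σ8 * (Ωm / 0.3)^0.5, evaluated at illustrative values.
omega_m = 0.267   # DES Y1 ΛCDM central value, from the abstract
sigma_8 = 0.8     # placeholder, NOT a quoted DES Y1 value

s8 = sigma_8 * (omega_m / 0.3) ** 0.5
print(f"S8 = {s8:.3f}")
```

The 0.5 exponent is chosen so that S8 runs along the Ωm-σ8 degeneracy direction of lensing surveys, making it the best-constrained combination.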
Large-scale imaging surveys will increase the number of galaxy-scale strong lensing candidates by perhaps three orders of magnitude beyond the number known today. Finding these rare objects will require picking them out of at least tens of millions of images, and deriving scientific results from them will require quantifying the efficiency and bias of any search method. To achieve these objectives, automated methods must be developed. Because gravitational lenses are rare objects, reducing false positives will be particularly important. We present a description and results of an open gravitational lens finding challenge. Participants were asked to classify 100 000 candidate objects as gravitational lenses or not, with the goal of developing better automated methods for finding lenses in large data sets. A variety of methods were used, including visual inspection, arc and ring finders, support vector machines (SVMs), and convolutional neural networks (CNNs). We find that many of the methods will easily be fast enough to analyse the anticipated data flow. In test data, several methods were able to identify upwards of half the lenses, after applying thresholds on lens characteristics such as lensed image brightness, size, or contrast with the lens galaxy, without making a single false-positive identification. This is significantly better than direct inspection by humans achieved. Multi-band, ground-based data are found to be better for this purpose than single-band, space-based data with lower noise and higher resolution, suggesting that multi-colour information is crucial; multi-band, space-based data will be superior to ground-based data. The most difficult challenge for a lens finder is differentiating between true gravitational lenses and rare, irregular, ring-like face-on galaxies.
The degree to which the efficiency and biases of lens finders can be quantified largely depends on the realism of the simulated data on which the finders are trained.
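The challenge metric described above, the fraction of lenses recovered without a single false positive, can be computed directly from classifier scores; a minimal sketch with synthetic scores (not the actual challenge data):

```python
# Completeness at zero false positives, from toy classifier scores.
# Score distributions here are invented for illustration.
import numpy as np

rng = np.random.default_rng(1)
scores_lens = rng.normal(0.8, 0.1, 500)       # lenses score higher...
scores_nonlens = rng.normal(0.3, 0.1, 9500)   # ...than non-lenses

# Strictest threshold rejecting every non-lens: just above the
# highest-scoring non-lens in the test set.
threshold = scores_nonlens.max()
completeness = np.mean(scores_lens > threshold)
print(f"lenses recovered at zero false positives: {completeness:.1%}")
```

Because the threshold is set by the single worst non-lens, this metric is very sensitive to the rare confusing objects (such as ring-like face-on galaxies) mentioned above.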
We derived constraints on cosmological parameters using weak lensing peak statistics measured on ∼130 deg2 of the Canada–France–Hawaii Telescope Stripe 82 Survey. This analysis demonstrates the feasibility of using peak statistics in cosmological studies. For our measurements, we considered peaks with signal-to-noise ratio in the range ν = 3–6. For a flat Λ cold dark matter model with only (Ωm, σ8) as free parameters, we constrained the parameters of the relation Σ8 = σ8(Ωm/0.27)^α to be Σ8 = 0.82 ± 0.03 and α = 0.43 ± 0.02. The α value found is considerably smaller than that measured in two-point and three-point cosmic shear correlation analyses, showing that peak statistics provide a significant complement to standard weak lensing cosmological studies. The derived constraints on (Ωm, σ8) are fully consistent with those from either WMAP9 or Planck. From the weak lensing peak abundances alone, we obtained marginalized mean values of $\Omega _{\rm m}=0.38^{+0.27}_{-0.24}$ and σ8 = 0.81 ± 0.26. Finally, we also explored the potential of using weak lensing peak statistics to constrain the mass–concentration relation of dark matter haloes simultaneously with cosmological parameters.
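The Σ8 relation above parameterizes the degeneracy in the (Ωm, σ8) plane; a quick sketch of how it maps Ωm to σ8 along the degeneracy, using the quoted central values Σ8 = 0.82 and α = 0.43:

```python
# σ8 implied by the degeneracy relation Σ8 = σ8 (Ωm/0.27)^α,
# inverted as σ8 = Σ8 (Ωm/0.27)^(-α). Central values from the abstract.
Sigma8, alpha = 0.82, 0.43

def sigma8_on_degeneracy(omega_m):
    return Sigma8 * (omega_m / 0.27) ** (-alpha)

for om in (0.20, 0.27, 0.38):
    print(f"Ωm = {om:.2f} -> σ8 ≈ {sigma8_on_degeneracy(om):.2f}")
```

The smaller α found for peak statistics means this curve is flatter than the corresponding cosmic shear degeneracy, which is why combining the two probes breaks the degeneracy.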
Parasite lactate dehydrogenase (pLDH) is a potential drug target for new antimalarials owing to parasite dependence on glycolysis for ATP production. The pLDH from all four species of human malarial parasites were cloned, expressed, and analyzed for structural and kinetic properties that might be exploited for drug development. pLDH from Plasmodium vivax, P. malariae, and P. ovale exhibit 90−92% identity to pLDH from Plasmodium falciparum. Catalytic residues are identical. Residues I250 and T246, conserved in most LDH, are replaced by proline in all pLDH. The pLDH contain the same five-amino-acid insert (DKEWN) in the substrate specificity loops. Within the cofactor site, pLDH from P. falciparum and P. malariae are identical, while pLDH from P. vivax and P. ovale have one substitution. Homology modeling of pLDH from P. vivax, P. ovale, and P. malariae with the crystal structure of pLDH from P. falciparum gave nearly identical structures. Nevertheless, the kinetic properties and sensitivities to inhibitors targeted to the cofactor binding site differ significantly. Michaelis constants for pyruvate and lactate differ 8−9-fold; Michaelis constants for NADH, NAD+, and the NAD+ analogue 3-acetylpyridine adenine dinucleotide differ up to 4-fold. Dissociation constants for the inhibitors differ up to 21-fold. Molecular docking studies of the binding of the inhibitors to the cofactor sites of all four pLDH predict similar orientations, with the docked ligands positioned at the nicotinamide end of the cofactor site. pH studies indicate that inhibitor binding is independent of pH in the pH 6−8 range, suggesting that differences in dissociation constants for a specific inhibitor are not due to altered active-site pK values among the four pLDH.
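The Michaelis constants compared above enter the standard Michaelis–Menten rate law v = Vmax·[S]/(Km + [S]); a small worked example of how an 8-fold Km difference changes the rate at fixed substrate concentration (all numbers illustrative, not measured values from the paper):

```python
# Michaelis–Menten rate at substrate concentration s for two enzymes
# with the same Vmax but Km values differing 8-fold. Values illustrative.
def mm_rate(s, vmax, km):
    return vmax * s / (km + s)

vmax = 1.0               # arbitrary units
km_a, km_b = 0.1, 0.8    # mM; hypothetical 8-fold difference
s = 0.2                  # mM

v_a = mm_rate(s, vmax, km_a)
v_b = mm_rate(s, vmax, km_b)
print(f"enzyme A: v = {v_a:.2f} Vmax; enzyme B: v = {v_b:.2f} Vmax")
```

At substrate levels near or below Km, such fold-differences translate almost directly into rate differences, which is why the kinetic divergence among the four pLDH matters despite their near-identical structures.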
We have developed two diagnostic assays based on the specific detection of Plasmodium lactate dehydrogenase (pLDH) activity. These assays exploit a panel of monoclonal antibodies that capture the parasite enzyme and allow for the quantitation and speciation of human malaria infections. An immunocapture pLDH activity assay (ICpLDH) allows for the rapid purification and measurement of pLDH from infected blood using the NAD analog APAD, which reacts specifically with Plasmodium LDH isoforms. An immunochromatographic test (the OptiMAL assay) was also formatted and allowed the detection of parasite infections of approximately 200 parasites/μl of blood. By using a combination of antibodies, both tests can not only detect but also differentiate between P. falciparum and non-P. falciparum malaria. Both assays show a sensitivity comparable to that of other commercial nonmicroscopic tests; importantly, we found very few instances of false-positive samples, especially among samples from patients recently cleared of malaria infection. Furthermore, we find that with the quantitative ICpLDH assay, the levels of pLDH activity closely mirror the levels of parasitemia, both at initial diagnosis and while following patients through therapy. We conclude that diagnostic tests based on the detection of pLDH are both sensitive and practical for the detection, speciation, and quantitation of all human Plasmodium infections and can also be used to indicate drug-resistant infections.
We report the discovery of eight new Milky Way companions in optical imaging data collected during the first year of the Dark Energy Survey (DES). Each system is identified as a statistically significant over-density of individual stars consistent with the expected isochrone and luminosity function of an old and metal-poor stellar population. The objects span a wide range of absolute magnitudes, physical sizes, and heliocentric distances. Based on the low surface brightnesses, large physical sizes, and/or large Galactocentric distances of these objects, several are likely to be new ultra-faint satellite galaxies of the Milky Way and/or Magellanic Clouds. We introduce a likelihood-based algorithm to search for and characterize stellar over-densities, as well as to identify stars with high satellite membership probabilities. We also present completeness estimates for detecting ultra-faint galaxies of varying luminosities, sizes, and heliocentric distances in the first-year DES data.
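A likelihood-based over-density search of the kind described above is often built on a Poisson likelihood ratio comparing a satellite-plus-background model to background alone; a toy sketch on a simplified 2D stellar catalog (invented data, not the paper's actual algorithm, which also uses isochrone and luminosity-function information):

```python
# Toy Poisson log-likelihood-ratio scan for a stellar over-density.
import numpy as np

rng = np.random.default_rng(2)
# Synthetic catalog: uniform background stars plus a compact satellite.
bg = rng.uniform(0, 1, size=(1000, 2))
sat = rng.normal(0.5, 0.01, size=(50, 2))
stars = np.vstack([bg, sat])

# Bin the sky; in each cell compare observed counts to the expected
# uniform background via a Poisson log-likelihood ratio.
nbins = 20
counts, _, _ = np.histogram2d(stars[:, 0], stars[:, 1],
                              bins=nbins, range=[[0, 1], [0, 1]])
expected = len(stars) / nbins**2
with np.errstate(divide="ignore", invalid="ignore"):
    llr = np.where(counts > 0,
                   counts * np.log(counts / expected) - (counts - expected),
                   0.0)
peak = np.unravel_index(np.argmax(llr), llr.shape)
print(f"most significant cell: {peak}, log-likelihood ratio {llr[peak]:.1f}")
```

In a real search the same likelihood machinery also yields per-star membership probabilities, by asking how much each star contributes to the satellite term of the model.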
Derivatives of the sesquiterpene 8-deoxyhemigossylic acid (2,3-dihydroxy-6-methyl-4-(1-methylethyl)-1-naphthoic acid) were synthesized that contain altered alkyl groups in the 4-position and alkyl or aralkyl groups in the 7-position. These substituted dihydroxynaphthoic acids are selective inhibitors of human lactate dehydrogenase-H (LDH-H) and LDH-M and of lactate dehydrogenase from the malarial parasite Plasmodium falciparum (pLDH). All inhibitors are competitive with the binding of NADH. Selectivity for LDH-H, LDH-M, or pLDH depends strongly on the groups in the 4- and 7-positions of the dihydroxynaphthoic acid backbone. Dissociation constants as low as 50 nM were observed, with selectivity as high as 400-fold.
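For competitive inhibitors such as these (competing with NADH binding), the inhibitor raises the apparent Michaelis constant for the cofactor by the factor (1 + [I]/Ki); a small illustrative calculation (numbers hypothetical except the 50 nM Ki quoted above):

```python
# Apparent Km under competitive inhibition: Km,app = Km * (1 + [I]/Ki).
# Km and [I] are hypothetical; Ki = 50 nM is the lowest value quoted.
km = 20.0        # μM, hypothetical Km for NADH
ki = 0.05        # μM (50 nM, from the abstract)
inhibitor = 0.5  # μM, hypothetical inhibitor concentration

km_app = km * (1 + inhibitor / ki)
print(f"apparent Km rises from {km:.0f} to {km_app:.0f} μM")
```

A 400-fold selectivity ratio between isoforms means the same inhibitor concentration shifts the apparent Km of the target enzyme far more than that of the off-target human LDH.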
We present results from a study of the photometric redshift performance of the Dark Energy Survey (DES), using early data from a Science Verification period of observations in late 2012 and early 2013 that provided science-quality images for almost 200 sq. deg. at the nominal depth of the survey. We assess the photometric redshift (photo-z) performance using about 15 000 galaxies with spectroscopic redshifts available from other surveys. These galaxies are used, in different configurations, as a calibration sample, and photo-z's are obtained and studied using most of the existing photo-z codes. A weighting method in a multidimensional colour–magnitude space is applied to the spectroscopic sample in order to evaluate the photo-z performance with sets that mimic the full DES photometric sample, which is on average significantly deeper than the calibration sample due to the limited depth of spectroscopic surveys. Empirical photo-z methods using, for instance, artificial neural networks or random forests yield the best performance in the tests, achieving core photo-z resolutions of ∼0.08. Moreover, the results from most of the codes, including template-fitting methods, comfortably meet the DES requirements on photo-z performance, thereby providing an excellent precedent for future DES data sets.
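As a minimal illustration of the empirical (machine learning) photo-z approach mentioned above (not one of the DES codes themselves), a random forest can regress from magnitudes to spectroscopic redshift on a toy sample:

```python
# Toy random forest photo-z: regress redshift from synthetic griz
# magnitudes. The data and magnitude-redshift relation are invented
# purely for illustration.
import numpy as np
from sklearn.ensemble import RandomForestRegressor
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(3)
n = 5000
z = rng.uniform(0.1, 1.4, n)          # "spectroscopic" training redshifts
# Fake griz magnitudes that drift with redshift, plus photometric noise.
mags = np.column_stack([22 + k * z + rng.normal(0, 0.2, n)
                        for k in (1.0, 0.7, 0.4, 0.2)])

m_tr, m_te, z_tr, z_te = train_test_split(mags, z, random_state=0)
rf = RandomForestRegressor(n_estimators=100, random_state=0)
rf.fit(m_tr, z_tr)

resid = rf.predict(m_te) - z_te
sigma = np.std(resid)
print(f"photo-z scatter: {sigma:.3f}")
```

Because such empirical methods interpolate the training sample, the colour-magnitude weighting described in the abstract is what makes performance estimates on a shallow spectroscopic sample transferable to the deeper photometric survey.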