ABSTRACT
Many scientific investigations of photometric galaxy surveys require redshift estimates, whose uncertainty properties are best encapsulated by photometric redshift (photo-z) posterior probability density functions (PDFs). A plethora of photo-z PDF estimation methodologies abound, producing discrepant results with no consensus on a preferred approach. We present the results of a comprehensive experiment comparing 12 photo-z algorithms applied to mock data produced for The Rubin Observatory Legacy Survey of Space and Time Dark Energy Science Collaboration. By supplying perfect prior information, in the form of the complete template library and a representative training set as inputs to each code, we demonstrate the impact of the assumptions underlying each technique on the output photo-z PDFs. In the absence of a notion of true, unbiased photo-z PDFs, we evaluate and interpret multiple metrics of the ensemble properties of the derived photo-z PDFs as well as traditional reductions to photo-z point estimates. We report systematic biases and overall over/under-breadth of the photo-z PDFs of many popular codes, which may indicate avenues for improvement in the algorithms or implementations. Furthermore, we draw attention to the limitations of established metrics for assessing photo-z PDF accuracy; though we identify the conditional density estimate loss as a promising metric of photo-z PDF performance in the case where true redshifts are available but true photo-z PDFs are not, we emphasize the need for science-specific performance metrics.
It is well known in astronomy that propagating non-Gaussian prediction uncertainty in photometric redshift estimates is key to reducing bias in downstream cosmological analyses. Similarly, likelihood-free inference approaches, which are beginning to emerge as a tool for cosmological analysis, require a characterization of the full uncertainty landscape of the parameters of interest given observed data. However, most machine learning (ML) or training-based methods with open-source software target point prediction or classification, and hence fall short in quantifying uncertainty in complex regression and parameter inference settings such as the applications mentioned above. As an alternative to methods that focus on predicting the response (or parameters) y from features x, we provide nonparametric conditional density estimation (CDE) tools for approximating and validating the entire probability density function (PDF) p(y|x) of y given (i.e., conditional on) x. This density approach offers a more nuanced accounting of uncertainty in situations with, e.g., nonstandard error distributions and multimodal or heteroskedastic response variables that are often present in astronomical data sets. As there is no one-size-fits-all CDE method, and the ultimate choice of model depends on the application and the training sample size, the goal of this work is to provide a comprehensive range of statistical tools and open-source software for nonparametric CDE and method assessment which can accommodate different types of settings – involving, e.g., mixed-type input from multiple sources, functional data, and images – and which in addition can easily be fit to the problem at hand. Specifically, we introduce four CDE software packages in Python and R based on ML prediction methods adapted and optimized for CDE: NNKCDE, RFCDE, FlexCode, and DeepCDE. Furthermore, we present the cdetools package with evaluation metrics. This package includes functions for computing a CDE loss function for tuning and assessing the quality of individual PDFs, together with diagnostic functions that probe the population-level performance of the PDFs. We provide sample code in Python and R as well as examples of applications to photometric redshift estimation and likelihood-free cosmological inference via CDE.
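A notable property of the CDE loss described above is that it can be estimated without access to the true conditional density, using only grid-evaluated density estimates and observed responses. The following is a minimal sketch of such an estimator; the function name, the uniform-grid interface, and the nearest-grid-point evaluation are illustrative assumptions for this example, not the actual cdetools API:

```python
import numpy as np

def cde_loss(cde_estimates, y_grid, y_true):
    """Estimate the CDE loss (up to an additive constant) of a set of
    conditional density estimates.

    cde_estimates : (n, m) array, f_hat(y | x_i) evaluated on y_grid
    y_grid        : (m,) array of uniformly spaced grid points
    y_true        : (n,) array of observed responses

    The estimator is  mean_i ∫ f_hat(y|x_i)^2 dy  -  2 mean_i f_hat(y_i|x_i),
    with the integral approximated by a Riemann sum on the uniform grid.
    """
    n, m = cde_estimates.shape
    dy = y_grid[1] - y_grid[0]
    # First term: average integral of the squared density estimate.
    term1 = np.mean(np.sum(cde_estimates ** 2, axis=1) * dy)
    # Second term: density estimate evaluated at each observed response,
    # approximated by the nearest grid point.
    idx = np.argmin(np.abs(y_grid[None, :] - y_true[:, None]), axis=1)
    term2 = np.mean(cde_estimates[np.arange(n), idx])
    return term1 - 2.0 * term2
```

Lower values indicate better density estimates: an estimator that concentrates mass near the observed responses drives the second term up and the loss down, which is what makes this loss usable for model tuning when true PDFs are unavailable.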
A novel modification of the classical Julia–Lythgoe olefination, using sulfoxides instead of sulfones, affords, after in situ benzoylation and SmI2/HMPA- or SmI2/DMPU-mediated reductive elimination, 1,2-di-, tri-, and tetrasubstituted olefins in moderate to good yields and E/Z selectivity. The conditions are mild and the procedure is widely applicable. The reaction mechanism was studied, and a general model describing the reaction selectivity is proposed.
Reaction of the antitumor agent leinamycin with cellular thiols results in conversion of the natural product to a DNA-alkylating episulfonium ion via an intriguing sequence of chemical reactions. To establish whether the chemistry first seen in leinamycin represents a general motif that can function in various molecular frameworks, construction of greatly simplified analogues containing only the “core” functional groups anticipated to be necessary for thiol-triggered generation of an alkylating agent was undertaken. For this purpose, the “stripped-down” leinamycin analogue 7-(3-methyl-but-2-enyl)-1-oxo-1H-λ4-benzo[1,2]dithiol-3-one (4) was synthesized. Treatment of 4 with thiol under several different conditions results in efficient conversion of the compound to cyclized 2,3-dihydro-benzo[b]thiophene-7-carboxylic acid products (13) that are envisioned to arise from Markovnikov addition of solvent to an intermediate episulfonium ion (14). Thus, the relatively simple molecule 4 is able to mimic the thiol-triggered alkylating properties displayed by the natural product leinamycin. This work helps define the core functional groups required for thiol-triggered generation of an alkylating intermediate from leinamycin and indicates that substantially altered analogues of the natural product may retain alkylating properties. In a broader context, the results provide evidence that the unique cascade of chemical reactions first seen in the context of leinamycin represents a general motif that can operate in a variety of molecular frameworks.
2,5-Dimethylphenacyl phosphoric and sulfonic esters release the corresponding acids upon irradiation in nearly quantitative isolated yields, with quantum yields Φ = 0.71 and 0.68 in methanol, and 0.09 and 0.19 in benzene, respectively. In methanol solution the reactions proceed predominantly via the (Z)-photoenol, the lifetimes of which (20 and 25 μs) were determined by laser flash photolysis. The chromophore is proposed as an excellent photoremovable protecting group for use in organic synthesis and biochemistry.