Stellar Population Inference with Prospector
Johnson, Benjamin D.; Leja, Joel; Conroy, Charlie ...
The Astrophysical Journal Supplement Series, 06/2021, Volume 254, Issue 2
Journal Article
Peer reviewed
Open access
Abstract
Inference of the physical properties of stellar populations from observed photometry and spectroscopy is a key goal in the study of galaxy evolution. In recent years, the quality and quantity of the available data have increased, and there have been corresponding efforts to increase the realism of the stellar population models used to interpret these observations. Describing the observed galaxy spectral energy distributions in detail now requires physical models with a large number of highly correlated parameters. These models do not fit easily on grids and necessitate a full exploration of the available parameter space. We present Prospector, a flexible code for inferring stellar population parameters from photometry and spectroscopy spanning UV through IR wavelengths. This code is based on forward modeling the data and Monte Carlo sampling the posterior parameter distribution, enabling complex models and exploration of moderate-dimensional parameter spaces. We describe the key ingredients of the code and discuss the general philosophy driving the design of these ingredients. We demonstrate some capabilities of the code on several data sets, including mock and real data.
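The forward-modeling-plus-sampling loop the abstract describes can be illustrated with a minimal sketch. The toy SED model, parameter names, and simple Metropolis-Hastings sampler below are all illustrative assumptions; Prospector's actual models are physical stellar population syntheses, and it samples with ensemble or nested samplers rather than a hand-rolled random walk.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical toy "stellar population" model: flux in four photometric
# bands as a simple function of two parameters (an amplitude and a dust
# attenuation term). Purely illustrative, not Prospector's physical model.
bands = np.array([0.3, 0.6, 1.0, 2.0])  # effective wavelengths (microns)

def forward_model(theta):
    amp, dust = theta
    return amp * np.ones_like(bands) - dust / bands  # toy SED

# Mock observations: truth plus Gaussian noise
theta_true = np.array([2.0, 0.5])
sigma = 0.05
obs = forward_model(theta_true) + rng.normal(0, sigma, bands.size)

def log_posterior(theta):
    # Flat prior within a box, Gaussian likelihood
    if np.any(theta < -5) or np.any(theta > 5):
        return -np.inf
    resid = (obs - forward_model(theta)) / sigma
    return -0.5 * np.sum(resid ** 2)

# Metropolis-Hastings random walk over the two parameters
theta = np.zeros(2)
lp = log_posterior(theta)
chain = []
for _ in range(30000):
    prop = theta + rng.normal(0, 0.05, 2)
    lp_prop = log_posterior(prop)
    if np.log(rng.uniform()) < lp_prop - lp:
        theta, lp = prop, lp_prop
    chain.append(theta)
chain = np.array(chain[10000:])  # discard burn-in

print(chain.mean(axis=0))  # posterior mean, near theta_true
```

Because the model is forward-generated at every proposal, arbitrarily complex (non-grid) parameterizations can be handled the same way; only `forward_model` and the priors change.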
It has been over 30 years since a paradigm shift from abstract stochastic process models to more concrete Fraction-of-Time Probability models for time-series data was called for and was supported by this journal's editor-in-chief. Yet, little, if any, detectable progress in making this transition has occurred. This paper reviews this needed transition and attempts to facilitate it with a new type of stochastic process model. The primary purpose of this model is to serve as a pedagogical tool for facilitating the conceptual transition from the standard relatively abstract way of thinking to a more concrete alternative. The utility of this parsimonious alternative was thoroughly proven when it was introduced in an advanced 1987 textbook, and the evidence in support has continued to accumulate in subsequent theoretical and applied research publications. But resistance to change is ever present.
• Comparison of two alternative generic stochastic process models for data analysis and inference.
• The standard model is relatively abstract; the new model is better suited to empirical data.
• Mathematical and pragmatic pros and cons are exposed.
• A paradigm shift is urged.
When a non-contact and remote temperature measurement is of importance, several environmental factors must be considered. Of these, two are of crucial importance: the surface emissivity of the observed object and the directional emissivity in the object-sensor relationship. This paper proposes FDEM, a Fuzzy Directional Emissivity Model built with the assumption that the observed surface is cylindrically shaped. The model consists of two submodels, for the vertical and horizontal directions separately. The main contribution of this paper is the formalization of a linguistic description of measurement conditions typical for an industrial environment. Additionally, dual linguistic fuzzy modifiers were used to increase the accuracy of the modeled emissivity. Finally, the proposed model achieved up to a 40% reduction in RMSE when used for correcting the temperature measurements of a rotating cylinder. Overall, our model provides a novel and reliable way for temperature readout correction on the entire surface of a drying roller, such as those used in the papermaking industry. Furthermore, the proposed approach is not only inherently capable of compensating for the position of the camera, but also does so in an interpretable, and hence expert-verifiable, manner.
• Directional emissivity in non-contact temperature measurements.
• Fuzzy process of thermal image reconstruction of a cylindrical rotating surface.
• Fuzzy logic modifiers in thermal image processing.
• Precise imaging and control of thermal fields.
• Qualitative knowledge representation using fuzzy set theory and fuzzy logic.
In mobile scenarios, there is a need for general user representations to solve multiple target tasks. However, there are some challenges in the related research (e.g., the difficulty of learning a representation that achieves both strong generalization and high task performance). To address these problems, we proposed a network for downstream-adaptable mobile user modeling, which employed a novel fine-tuning strategy for optimizing the performance of several downstream tasks. Additionally, we designed a time-difference module to eliminate the impact of low-frequency and non-uniform app usage behavior over time. A parallel decoder structure was developed to incorporate multi-type features by minimizing information loss. We evaluated our method on a real-world dataset of 100,000 mobile users and three downstream tasks (i.e., age prediction, gender prediction, and app recommendation). The experimental results showed that our method could outperform existing methods significantly. It achieved 96.5% ACC on gender prediction, 68.1% ACC on age prediction, and 64.2% Recall@5 on app recommendation. These results imply that our method performs well in terms of both generalization and task performance, and can be anticipated to be promising for inference on unseen tasks.
• The paper develops a novel dynamic PCA (DiPCA) algorithm to extract a set of dynamic latent variables that capture the most dynamic variations in the data with a given number of latent factors.
• The new models generate a number of principal time series that are most correlated to their past and thus most predictable from their past data.
• The residuals are essentially uncorrelated in time after the DiPCA factor extraction and can be handled by static PCA.
• Geometric properties are explored to give insight into the new dynamic model structure.
• Process monitoring and fault detection indices based on DiPCA are developed, which monitor the dynamic latent factors and residuals to detect faults that violate different parts of the model correlations.
Principal component analysis (PCA) has been widely applied for data modeling and process monitoring. However, it is not appropriate to directly apply PCA to data from a dynamic process, since PCA focuses on variance maximization only and pays no attention to whether the components contain dynamics or not. In this paper, a novel dynamic PCA (DiPCA) algorithm is proposed to explicitly extract a set of dynamic latent variables that capture the most dynamic variations in the data. After the dynamic variations are extracted, the residuals are essentially uncorrelated in time and static PCA can be applied. The new models generate a subspace of principal time series that are most predictable from their past data. Geometric properties are explored to give insight into the new dynamic model structure. For the purpose of process monitoring, fault detection indices are developed based on the proposed DiPCA model. Case studies on simulation data, data from an industrial boiler process, and the Tennessee Eastman process are presented to illustrate the effectiveness of the proposed dynamic models and fault detection methods.
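The core DiPCA idea, finding the latent direction whose time series is most predictable from its own past rather than the direction of maximum variance, can be illustrated on synthetic data. The brute-force direction scan below is a simplification (the paper solves a constrained optimization problem), and the AR(1) test data are hypothetical.

```python
import numpy as np

rng = np.random.default_rng(1)
n = 2000

# Synthetic 2-D process: one latent direction carries AR(1) dynamics,
# the orthogonal direction is white noise.
t = np.zeros(n)
for k in range(1, n):
    t[k] = 0.9 * t[k - 1] + rng.normal()
noise = rng.normal(size=n)
true_dir = np.array([np.cos(0.7), np.sin(0.7)])
orth_dir = np.array([-true_dir[1], true_dir[0]])
X = np.outer(t, true_dir) + np.outer(noise, orth_dir)
X -= X.mean(axis=0)

def lag1_autocorr(s):
    # Lag-1 autocorrelation: how predictable a series is from its past
    s = s - s.mean()
    return (s[1:] @ s[:-1]) / (s @ s)

# Scan unit directions; the "dynamic" latent variable is the projection
# with the highest lag-1 autocorrelation.
angles = np.linspace(0, np.pi, 360, endpoint=False)
scores = [lag1_autocorr(X @ np.array([np.cos(a), np.sin(a)])) for a in angles]
best = angles[int(np.argmax(scores))]
w = np.array([np.cos(best), np.sin(best)])

print(abs(w @ true_dir))  # close to 1: the dynamic direction is recovered
```

Note that ordinary PCA would not reliably separate these two directions by variance alone; ranking by temporal predictability is what isolates the dynamic factor, after which the residual (the orthogonal projection) is essentially white and suitable for static PCA.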
Abstract
The direct detection of a bright, ring-like structure in horizon-resolving images of M87* by the Event Horizon Telescope (EHT) is a striking validation of general relativity. The angular size and shape of the ring is a degenerate measure of the location of the emission region, mass, and spin of the black hole. However, we show that the observation of multiple rings, corresponding to the low-order photon rings, can break this degeneracy and produce mass and spin measurements independent of the shape of the rings. We describe two potential experiments that would measure the spin. In the first, observations of the direct emission and n = 1 photon ring are made at multiple epochs with different emission locations. This method is conceptually similar to spacetime constraints that arise from variable structures (or hot spots) in that it breaks the near-perfect degeneracy between emission location, mass, and spin for polar observers using temporal variability. In the second, observations of the direct emission and n = 1 and n = 2 photon rings are made during a single epoch. For both schemes, additional observations comprise a test of general relativity. Thus, comparisons of EHT observations in 2017 and 2018 may be capable of producing the first horizon-scale spin estimates of M87* inferred from strong lensing alone. Additional observation campaigns from future high-frequency, Earth-sized, and space-based radio interferometers can produce high-precision tests of general relativity.
Abstract
In 2017 April, the Event Horizon Telescope (EHT) observed the near-horizon region around the supermassive black hole at the core of the M87 galaxy. These 1.3 mm wavelength observations revealed a compact asymmetric ring-like source morphology. This structure originates from synchrotron emission produced by relativistic plasma located in the immediate vicinity of the black hole. Here we present the corresponding linear-polarimetric EHT images of the center of M87. We find that only a part of the ring is significantly polarized. The resolved fractional linear polarization has a maximum located in the southwest part of the ring, where it rises to the level of ∼15%. The polarization position angles are arranged in a nearly azimuthal pattern. We perform quantitative measurements of relevant polarimetric properties of the compact emission and find evidence for the temporal evolution of the polarized source structure over one week of EHT observations. The details of the polarimetric data reduction and calibration methodology are provided. We carry out the data analysis using multiple independent imaging and modeling techniques, each of which is validated against a suite of synthetic data sets. The gross polarimetric structure and its apparent evolution with time are insensitive to the method used to reconstruct the image. These polarimetric images carry information about the structure of the magnetic fields responsible for the synchrotron emission. Their physical interpretation is discussed in an accompanying publication.
Abstract
The increasing prevalence of antibiotic-resistant pathogens necessitates the development of novel antimicrobial agents. Herein, PEGylated konjac gum-supported rosin pentaerythritol nanocomposites (KG/PEG/RE PNCs) were synthesized using an environmentally friendly sonochemical method, aiming to explore their potential antibacterial and antifungal properties against a range of pathogens, including Candida albicans, Escherichia coli, Pseudomonas aeruginosa, Aspergillus brasiliensis, and Staphylococcus aureus. An elaborate investigation into the rheological properties of these PNCs highlighted the dependence of viscosity on synthesis parameters such as RE concentration, sonication time, and KG/RE blend ratio, with the Higiro model validated as a suitable mathematical model for describing the intricate relationship between the experimental parameters and the resulting viscosity of the PNCs. The integration of machine learning (ML), particularly polynomial regression, enabled the modeling of the complex dynamics influencing PNC viscosity, thus advancing comprehension of PNC behavior in relation to the synthesis parameters. The modeling facilitated precise formulation to predict PNC viscosity with high accuracy, as confirmed by a mean squared error (MSE) of 3.81 and an R² of 0.993. Moreover, the PNCs demonstrated broad-spectrum antimicrobial activity, reaching an inhibition plateau during the first week, confirming their efficacy as a versatile antibacterial and antifungal agent. Combining advanced data modeling techniques with biological assessments, this integrated approach represents a step forward in understanding and optimizing polymeric nanostructures.
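The polynomial-regression modeling step can be sketched as follows. The synthesis-parameter names and the synthetic data below are hypothetical stand-ins (the paper's actual dataset and fitted coefficients are not reproduced), but the fitting procedure and the MSE/R² evaluation follow the same recipe.

```python
import numpy as np

rng = np.random.default_rng(2)

# Hypothetical stand-in data: viscosity as a smooth nonlinear function of
# two synthesis parameters (e.g. RE concentration and sonication time).
conc = rng.uniform(0.5, 2.0, 120)
sonic = rng.uniform(5, 30, 120)
visc = (3.0 + 2.1 * conc + 0.08 * sonic + 1.4 * conc**2
        - 0.02 * conc * sonic + rng.normal(0, 0.2, 120))

# Degree-2 polynomial features, fit by ordinary least squares
A = np.column_stack([np.ones_like(conc), conc, sonic,
                     conc**2, conc * sonic, sonic**2])
coef, *_ = np.linalg.lstsq(A, visc, rcond=None)
pred = A @ coef

mse = np.mean((visc - pred) ** 2)
r2 = 1 - np.sum((visc - pred) ** 2) / np.sum((visc - visc.mean()) ** 2)
print(f"MSE={mse:.3f}, R^2={r2:.3f}")
```

Once fitted, the coefficient vector gives an explicit closed-form predictor of viscosity from the synthesis parameters, which is what makes the "precise formulation" use case possible.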
After a very fast and efficient discriminative broad learning system (BLS) that takes advantage of a flattened structure and incremental learning has been developed, here, a mathematical proof of the universal approximation property of BLS is provided. In addition, the framework of several BLS variants with their mathematical modeling is given. The variations include cascade, recurrent, and broad-deep combination structures. From the experimental results, the BLS and its variations outperform several existing learning algorithms in regression performance on function approximation, time series prediction, and face recognition databases. In addition, experiments on extremely challenging data sets, such as MS-Celeb-1M, are given. Compared with other convolutional networks, the effectiveness and efficiency of the variants of BLS are demonstrated.
Broad Learning System (BLS), which aims to offer an alternative way of learning to deep structure, is proposed in this paper. Deep structure and learning suffer from a time-consuming training process because of a large number of connecting parameters in filters and layers. Moreover, they require a complete retraining process if the structure is not sufficient to model the system. The BLS is established in the form of a flat network, where the original inputs are transferred and placed as "mapped features" in feature nodes and the structure is expanded in width through the "enhancement nodes." Incremental learning algorithms are developed for fast remodeling in broad expansion without a retraining process if the network needs to be expanded. Two incremental learning algorithms are given, for the increment of the feature nodes (or filters in deep structure) and for the increment of the enhancement nodes. The designed model and algorithms are very versatile for selecting a model rapidly. In addition, another incremental learning algorithm is developed for the case in which a system that has already been modeled encounters a new incoming input; specifically, the system can be remodeled incrementally without retraining from the beginning. A satisfactory model-reduction result using singular value decomposition is also presented to simplify the final structure. Compared with existing deep neural networks, experimental results on the Modified National Institute of Standards and Technology (MNIST) database and the NYU NORB object recognition dataset demonstrate the effectiveness of the proposed BLS.