Accurate extraction of detected gravitational wave (GW) signal waveforms is essential to validate a detection and to probe the astrophysics of the sources producing the GWs. This, however, can be difficult in realistic scenarios where the signals detected by existing GW detectors are contaminated with nonstationary and non-Gaussian noise. While existing waveform extraction methods achieve optimal performance, they are not fast enough for online application, which is important for multimessenger astronomy. In this paper, we demonstrate that a deep learning architecture consisting of convolutional neural network and bidirectional long short-term memory components can be used to extract binary black hole (BBH) GW waveforms from realistic noise in a few milliseconds. We have tested our network systematically on GW signals, with component masses uniformly distributed in the range of 10 to 80 M⊙, injected into Gaussian noise and Laser Interferometer Gravitational Wave Observatory (LIGO) detector noise. We find that our model can extract GW waveforms with overlaps of more than 0.95 with pure numerical relativity templates for signals with signal-to-noise ratio greater than six, and that it is also robust against interfering "glitches". We then apply our model to all ten detected BBH events from LIGO-Virgo's first (O1) and second (O2) observation runs, obtaining overlaps of ≥ 0.97 between all ten extracted BBH waveforms and the corresponding pure templates. We discuss the implications of our results and their future applications to GW localization and mass estimation.
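As a concrete illustration of the overlap statistic quoted above, here is a minimal numpy sketch of the (noise-free, unmaximised) normalised inner product between two time-domain waveforms. The paper's actual match computation may additionally maximise over time and phase shifts; the waveforms below are purely synthetic:

```python
import numpy as np

def overlap(h1, h2):
    """Normalised inner product between two time-domain waveforms.

    A value of 1 means identical shapes; the quoted >0.95 overlaps
    compare extracted waveforms against numerical relativity templates.
    """
    inner = np.dot(h1, h2)
    norm = np.sqrt(np.dot(h1, h1) * np.dot(h2, h2))
    return inner / norm

# Toy check: a chirp-like waveform against a slightly noisy copy of itself.
t = np.linspace(0.0, 1.0, 4096)
h = np.sin(200 * t) * np.exp(-4 * (1 - t))
rng = np.random.default_rng(0)
h_noisy = h + 0.01 * rng.standard_normal(t.size)
print(round(overlap(h, h), 3))  # identical waveforms -> 1.0
```

A full match statistic would also whiten both waveforms by the detector noise spectrum before taking the inner product; that step is omitted here for brevity.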
Deep learning algorithms, in particular neural networks, have been steadily gaining popularity in the gravitational wave community over the last few years. The reliability and accuracy of deep learning approaches to gravitational wave detection, parameter estimation, and glitch classification have already been demonstrated and verified by several groups in recent years. In this paper, we report on the construction of a deep artificial neural network (ANN) to localize simulated gravitational wave signals in the sky with high accuracy. We have modeled the sky as a sphere and have considered cases in which the sphere is divided into 18, 50, 128, 1024, 2048, and 4096 sectors. The sky direction of the gravitational wave source is estimated by classifying the signal into one of these sectors based on its right ascension and declination values for each of these cases. To do this, we have injected simulated binary black hole gravitational wave signals, with component masses sampled uniformly between 30 M⊙ and 80 M⊙, into Gaussian noise and used the whitened strain values to obtain the input features for training our ANN. We compute input features, such as the delays in arrival times, phase differences, and amplitude ratios across the three detectors (Hanford, Livingston, and Virgo), from the raw time-domain strain values as well as from analytic versions of these signals obtained through the Hilbert transform. We show that our model is able to classify gravitational wave samples not used in the training process into their correct sectors with very high accuracy (greater than 90%) at coarse angular resolution, using 18, 50, and 128 sectors. We also test our localization on samples with the injection parameters of the published LIGO binary black hole merger events GW150914, GW170818, and GW170823 for 1024, 2048, and 4096 sectors, and compare the results with those from BAYESTAR and parameter estimation.
In addition, we report that the time taken by our model to localize one gravitational wave signal is around 0.018 s on 14 Intel Xeon CPU cores.
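To make the sector-classification idea concrete, here is a minimal sketch of one plausible way to map a (right ascension, declination) pair to a sector index: slicing sin(declination) into equal-area bands and right ascension into columns. The band/column counts and the tiling itself are illustrative assumptions, not necessarily the paper's exact sky division:

```python
import numpy as np

def sector_index(ra_deg, dec_deg, n_bands=3, n_cols=6):
    """Map (RA, dec) in degrees to one of n_bands * n_cols sky sectors.

    Slicing sin(dec) uniformly gives equal-area declination bands; this
    is an illustrative tiling, not necessarily the paper's exact scheme.
    """
    z = np.sin(np.radians(dec_deg))                  # in [-1, 1]
    band = min(int((z + 1.0) / 2.0 * n_bands), n_bands - 1)
    col = min(int(ra_deg / 360.0 * n_cols), n_cols - 1)
    return band * n_cols + col

# With the defaults, the sky is tiled into 3 * 6 = 18 sectors,
# matching the coarsest resolution considered in the abstract.
print(sector_index(0.0, 0.0))  # equatorial point -> sector 6
```

A classifier then outputs a probability over these sector labels, turning sky localization into an ordinary multi-class classification problem.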
We use the energy-balance code magphys to determine stellar and dust masses, and dust-corrected star formation rates, for over 200 000 GAMA galaxies, 170 000 G10-COSMOS galaxies, and 200 000 3D-HST galaxies. Our values agree well with previously reported measurements and constitute a representative and homogeneous data set spanning a broad range in stellar mass (10⁸–10¹² M⊙), dust mass (10⁶–10⁹ M⊙), and star formation rate (0.01–100 M⊙ yr⁻¹), over a broad redshift range (0.0 < z < 5.0). We combine these data to measure the cosmic star formation history (CSFH), the stellar-mass density (SMD), and the dust-mass density (DMD) over a 12 Gyr timeline. The data mostly agree with previous estimates, where they exist, and provide a quasi-homogeneous data set using consistent mass and star formation estimators with consistent underlying assumptions over the full time range. As a consequence, our formal errors are significantly reduced compared with the historic literature. Integrating our CSFH, we precisely reproduce the SMD with an interstellar medium replenishment factor of 0.50 ± 0.07, consistent with our choice of Chabrier initial mass function plus some modest amount of stripped stellar mass. Exploring the cosmic dust density evolution, we find a gradual increase in dust density with lookback time. We build a simple phenomenological model from the CSFH to account for the dust-mass evolution, and infer two key conclusions: (1) for every unit of stellar mass formed, 0.0065–0.004 units of dust mass are also formed; (2) over the history of the Universe, approximately 90–95 per cent of all dust formed has been destroyed and/or ejected.
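Schematically, the SMD reconstruction amounts to integrating the CSFH and retaining the fraction of mass not returned to the interstellar medium. A toy numerical sketch follows, using an entirely invented Gaussian-shaped star formation history; the paper's actual CSFH fit and cosmological time conversion are not reproduced here:

```python
import numpy as np

# Toy cosmic star formation history psi(t) in Msun/yr/Mpc^3 over a
# 12 Gyr timeline (purely illustrative numbers, not the paper's fit).
t_gyr = np.linspace(0.0, 12.0, 1201)
psi = 0.1 * np.exp(-((t_gyr - 3.5) ** 2) / 8.0)

R = 0.50  # ISM replenishment (return) factor reported in the abstract

# Stellar-mass density: trapezoidal integral of the CSFH, keeping only
# the unreturned fraction. Units: (Msun/yr/Mpc^3) * Gyr * 1e9 yr/Gyr.
integral = np.sum(0.5 * (psi[1:] + psi[:-1]) * np.diff(t_gyr))
smd = (1.0 - R) * integral * 1e9  # Msun/Mpc^3
print(f"{smd:.3e}")
```

The point of the exercise is the bookkeeping: the integrated CSFH overshoots the observed SMD unless roughly half the formed mass is returned to the ISM, which is what the 0.50 ± 0.07 factor expresses.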
The mergers of neutron star–neutron star and neutron star–black hole binaries (NSBHs) are the most promising gravitational wave (GW) events with electromagnetic (EM) counterparts. The rapid detection, localization, and simultaneous multimessenger follow-up of these sources are of primary importance in the upcoming science runs of the LIGO-Virgo-KAGRA Collaboration. While prompt EM counterparts during binary mergers can last less than 2 s, the timescales of existing localization methods that use Bayesian techniques vary from seconds to days. In this paper, we propose the first deep learning–based approach for rapid and accurate sky localization of all types of binary coalescences, including, for the first time, neutron star–neutron star binaries and NSBHs. Specifically, we train and test a normalizing flow model on matched-filtering output from GW searches to obtain sky direction posteriors in around 1 s on a single P100 GPU, several orders of magnitude faster than full Bayesian techniques.
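The normalizing-flow idea rests on the change-of-variables rule: an invertible transform maps a simple base density to the target posterior, with the Jacobian correcting the density. A deliberately minimal one-layer affine sketch in numpy follows; real flow models, like the one described above, stack many learned invertible layers and condition on the matched-filtering data:

```python
import numpy as np

def log_prob(y, a, b):
    """log p(y) under the one-layer flow y = a*x + b with x ~ N(0, 1).

    Change of variables: log p(y) = log p_base(x) + log |dx/dy|,
    where x = (y - b) / a is the inverse transform.
    """
    x = (y - b) / a                                # inverse transform
    log_base = -0.5 * (x ** 2 + np.log(2 * np.pi))
    log_det = -np.log(np.abs(a))                   # log |dx/dy|
    return log_base + log_det

# The implied density is N(b, a^2); check against the closed form.
a, b, y = 2.0, 1.0, 2.5
closed_form = -0.5 * (((y - b) / a) ** 2 + np.log(2 * np.pi * a * a))
print(np.isclose(log_prob(y, a, b), closed_form))  # True
```

Because the transform is invertible with a tractable Jacobian, both density evaluation and sampling are cheap, which is what makes around-1-second posteriors feasible.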
High-Performance Thin-Layer Chromatography (HPTLC) was used in a chemometric investigation of the sugar and organic extract profiles of two different honeys (Manuka and Jarrah) with adulterants. Each honey was adulterated with one of six different sugar syrups (rice, corn, golden, treacle, glucose, and maple syrups) at five different concentrations (10%, 20%, 30%, 40%, and 50% w/w). The chemometric analysis was based on the combined sugar and organic extract profile datasets. To obtain the sugar profiles, the amounts of fructose, glucose, maltose, and sucrose present in the honey were quantified; for the organic extract profiles, the honey's dichloromethane extract was investigated at 254 and 366 nm, as well as under white light in transmittance (T) mode and at 366 nm after derivatisation. The presence of sugar syrups, even at a concentration of only 10%, significantly influenced the honeys' sugar and organic extract profiles, and multivariate data analysis of these profiles, in particular cluster analysis (CA), principal component analysis (PCA), principal component regression (PCR), partial least-squares regression (PLSR), and machine learning using an artificial neural network (ANN), was able to detect post-harvest syrup adulteration and to discriminate between neat and adulterated honey samples. Cluster analysis and principal component analysis, for instance, could easily differentiate between neat and adulterated honeys through the use of CA or PCA plots. In particular, the presence of excess amounts of maltose and sucrose allowed for the detection of sugar adulterants and adulterated honeys by HPTLC multivariate data analysis. Partial least-squares regression and artificial neural networks were employed, with augmented datasets, to develop optimal calibrations for the adulterated honeys and to predict them accurately, suggesting a good predictive capacity of the developed models.
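As a sketch of the PCA step of such a chemometric workflow, the following numpy-only example projects invented sugar-profile measurements (fructose, glucose, maltose, sucrose) onto the first principal component, where samples with excess maltose and sucrose separate from neat ones. All concentrations below are made up for illustration:

```python
import numpy as np

rng = np.random.default_rng(2)
# Columns: fructose, glucose, maltose, sucrose (arbitrary units).
# Adulterated samples carry elevated maltose and sucrose, mirroring
# the marker behaviour described in the abstract.
neat = np.array([38.0, 31.0, 1.5, 0.5]) + rng.normal(0, 0.5, (10, 4))
adulterated = np.array([33.0, 29.0, 8.0, 5.0]) + rng.normal(0, 0.5, (10, 4))
X = np.vstack([neat, adulterated])

# PCA via SVD on the mean-centred data matrix.
Xc = X - X.mean(axis=0)
U, S, Vt = np.linalg.svd(Xc, full_matrices=False)
pc1 = Xc @ Vt[0]  # scores on the first principal component

# Orient PC1 so adulterated samples score higher, then measure the
# gap between the two groups along it.
if pc1[10:].mean() < pc1[:10].mean():
    pc1 = -pc1
gap = pc1[10:].min() - pc1[:10].max()
print(gap > 0)  # clean separation -> True
```

In practice the paper's datasets also fold in the organic-extract densitograms, but the separation mechanism in a PCA score plot is the same.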
In this work we present our experience from the first year of theSkyNet Pan-STARRS1 Optical Galaxy Survey (POGS) project. This citizen scientist–driven research project uses the Berkeley Open Infrastructure for Network Computing (BOINC) middleware and thousands of Internet-connected computers to measure the resolved galactic structural properties of ∼100,000 low-redshift galaxies. We are combining the spectral coverage of GALEX, Pan-STARRS1, SDSS, and WISE to generate a value-added, multi-wavelength UV–optical–NIR galaxy atlas for the nearby Universe. Specifically, we are measuring physical parameters (such as local stellar mass, star formation rate, and first-order star formation history) on a resolved, pixel-by-pixel basis using spectral energy distribution (SED) fitting techniques in a distributed computing mode.
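The pixel-by-pixel idea can be sketched as a per-pixel template fit: for each pixel's multi-band fluxes, choose the library template with the lowest chi-square and read off its best-fitting amplitude as a mass-like normalisation. magphys performs full energy-balance SED fitting; this numpy toy, with an invented template library and noise-free fluxes, only illustrates the per-pixel structure:

```python
import numpy as np

rng = np.random.default_rng(1)
n_bands, n_templates = 5, 3
templates = rng.random((n_templates, n_bands)) + 0.5  # unit-mass SEDs

# A tiny 4x4 "galaxy": every pixel is template 1 scaled by a mass map.
mass_true = rng.random((4, 4)) * 10
fluxes = mass_true[..., None] * templates[1]          # (4, 4, n_bands)

best = np.zeros((4, 4), dtype=int)
mass_fit = np.zeros((4, 4))
for i in range(4):
    for j in range(4):
        f = fluxes[i, j]
        # Best least-squares amplitude per template, then chi-square.
        amps = (f @ templates.T) / np.sum(templates ** 2, axis=1)
        chi2 = np.sum((f - amps[:, None] * templates) ** 2, axis=1)
        best[i, j] = np.argmin(chi2)
        mass_fit[i, j] = amps[best[i, j]]

print(np.all(best == 1), np.allclose(mass_fit, mass_true))
```

Each pixel is fit independently, which is exactly what makes the problem embarrassingly parallel and a natural fit for BOINC-style distributed computing.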
Accurate wind speed and direction forecasting is paramount across many sectors, spanning agriculture, renewable energy generation, and bushfire management. However, conventional forecasting models encounter significant challenges in precisely predicting wind conditions at high spatial resolution for individual locations or small geographical areas (< 20 km²) and in capturing medium- to long-range temporal trends and comprehensive spatio-temporal patterns. This study presents a spatio-temporal approach for high-resolution gridded wind forecasting at heights of 3 and 10 metres across large areas of the Southwest of Western Australia to overcome these challenges. Utilising data that cover a broad geographic area and a diverse array of meteorological factors, including terrain characteristics, air pressure, 10-metre wind forecasts from the European Centre for Medium-Range Weather Forecasts, and limited observation data from sparsely distributed weather stations (such as 3-metre wind profiles, humidity, and temperature), the model demonstrates promising advancements in wind forecasting accuracy and reliability across the entire region of interest. This paper shows the potential of our machine learning model for wind forecasts across various prediction horizons and spatial coverages. It can help facilitate more informed decision-making and enhance resilience across critical sectors.
Inclusive employment opportunities for individuals living with disabilities have been an ongoing issue in society, creating barriers and challenges for this community. Digital assistive technologies (DATs) are, and continue to be, helpful tools in aiding this inclusivity, but they have not always been accessible to all. The SARS-CoV-2 pandemic, during which online work became the "new normal", has brought this into sharp focus, giving individuals access to, and the ability to utilize, different online tools that support people living with various disabilities in doing their work. To better understand the current context concerning DATs and remote working for individuals living with disabilities, we conducted a scoping review in 2021/2022 and identified relevant papers that helped catalogue validated digital assistive technologies. Our study aims to continue supporting individuals living with disabilities in accessing the technology needed to join, or remain within, the workforce, and to work towards dismantling the barriers that prevent this.