In his book ‘Is a Good God Logically Possible?’, James Sterba argues that the existence of much of the evil to be found in the world is logically incompatible with the existence of God. I defend the Thomistic view that when one properly understands the nature of God and of his relationship to the world, this so-called logical problem of evil does not arise. While Sterba has responded to the version of the Thomistic position presented by Brian Davies, I argue that his response fails.
This review assesses storm studies over the North Atlantic and northwestern Europe regarding the occurrence of potential long-term trends. Based on a systematic review of available articles, trends are classified according to different geographical regions, datasets, and time periods. Articles that used measurement and proxy data, reanalyses, and regional and global climate model data on past and future trends are evaluated for changes in storm climate. The most important result is that trends in storm activity depend critically on the time period analysed. An increase in storm numbers over the most recent decades is evident for the reanalysis period, whereas most long-term studies show merely decadal variability over the last 100–150 years.
Storm trends derived from reanalysis and climate model data for the past are mostly limited to the last four to six decades. The majority of these studies find increasing storm activity north of about 55–60° N over the North Atlantic, with a negative tendency southward. This increase from about the 1970s until the mid-1990s is also mirrored by long-term proxies and the North Atlantic Oscillation and constitutes a part of their decadal variability. Studies based on proxy and measurement data, or model studies over the North Atlantic for the past, which cover more than 100 years, show large decadal variations and either no trend or a decrease in storm numbers. Future scenarios until about the year 2100 mostly indicate an increase in winter storm intensity over the North Atlantic and western Europe. However, future trends in total storm numbers are quite heterogeneous and depend on the model generation used.
The aging process is characterized by gradual changes to an organism’s macromolecules, which negatively impacts biological processes. The complex macromolecular structure of chromatin regulates all nuclear processes requiring access to the DNA sequence. As such, maintenance of chromatin structure is an integral component to deter premature aging. In this review, we describe current research that links aging to chromatin structure. Histone modifications influence chromatin compaction and gene expression and undergo many changes during aging. Histone protein levels also decline during aging, dramatically affecting chromatin structure. Excitingly, lifespan can be extended by manipulations that reverse the age-dependent changes to chromatin structure, indicating the pivotal role chromatin structure plays during aging.
Aims.
The primary difficulty in understanding the sources and processes that powered cosmic reionization is that it is not possible to directly probe the ionizing Lyman-continuum (LyC) radiation at ...that epoch as those photons have been absorbed by the intervening neutral hydrogen. It is therefore imperative to build a model to accurately predict LyC emission using other properties of galaxies in the reionization era.
Methods.
In recent years, studies have shown that the LyC emission from galaxies may be correlated with their Lyman-alpha (Lyα) emission. In this paper we study this correlation by analyzing thousands of simulated galaxies at high redshift in the SPHINX cosmological simulation. We post-process these galaxies with the Lyα radiative transfer code RASCAS and analyze the Lyα–LyC connection.
Results.
We find that the Lyα and LyC luminosities are strongly correlated with each other, although with dispersion. There is a positive correlation between the escape fractions of Lyα and LyC radiation in the brightest Lyman-alpha emitters (LAEs; escaping Lyα luminosity L_esc^Lyα > 10^41 erg s^-1), similar to that reported by recent observational studies. However, when we also include fainter LAEs, the correlation disappears, which suggests that the observed relation may be driven by selection effects. We also find that the brighter LAEs are dominant contributors to reionization, with L_esc^Lyα > 10^40 erg s^-1 galaxies accounting for > 90% of the total amount of LyC radiation escaping into the intergalactic medium in the simulation. Finally, we build predictive models using multivariate linear regression, where we use the physical and Lyα properties of simulated reionization-era galaxies to predict their LyC emission. We build a set of models using different sets of galaxy properties as input parameters and predict their intrinsic and escaping LyC luminosities with a high degree of accuracy (the adjusted R^2 values of these predictions in our fiducial model are 0.89 and 0.85, respectively, where R^2 is a measure of how much of the response variance is explained by the model). We find that the most important galaxy properties for predicting the escaping LyC luminosity of a galaxy are its L_esc^Lyα, gas mass, gas metallicity, and star formation rate.
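To make the modeling approach concrete, the following is a minimal sketch of a multivariate linear regression scored with the adjusted R^2 described above. The predictor names mirror the galaxy properties listed in the abstract, but the data below are synthetic mock values, not SPHINX outputs, and the coefficients are purely illustrative assumptions.

```python
# Hedged sketch: OLS regression of a mock log LyC luminosity on four
# standardized mock galaxy properties, scored with the adjusted R^2.
import numpy as np

rng = np.random.default_rng(0)
n = 500
# Hypothetical standardized log-scaled properties:
# columns: log L_esc^Lya, log M_gas, log Z_gas, log SFR (mock data).
X = rng.normal(size=(n, 4))
true_coef = np.array([0.9, 0.3, -0.2, 0.5])  # illustrative, not from the paper
y = X @ true_coef + rng.normal(scale=0.3, size=n)  # mock response

# Ordinary least squares with an intercept column.
A = np.column_stack([np.ones(n), X])
beta, *_ = np.linalg.lstsq(A, y, rcond=None)
y_hat = A @ beta

# Adjusted R^2 penalizes the ordinary R^2 for the number of predictors p,
# so adding uninformative predictors cannot inflate the score.
p = X.shape[1]
ss_res = np.sum((y - y_hat) ** 2)
ss_tot = np.sum((y - y.mean()) ** 2)
r2 = 1 - ss_res / ss_tot
r2_adj = 1 - (1 - r2) * (n - 1) / (n - p - 1)
```

Since (n - 1)/(n - p - 1) > 1, the adjusted R^2 is always below the ordinary R^2 whenever the fit is imperfect, which is the property that makes it a fairer comparison across models with different numbers of input parameters.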
Conclusions.
These results and the predictive models can be useful for predicting the LyC emission from galaxies using their physical and Lyα properties and can thus help us identify the sources of reionization.
Average (bio)equivalence tests are used to assess whether a parameter, such as the mean difference in treatment response between two conditions, lies within a given equivalence interval, hence allowing one to conclude that the conditions have “equivalent” means. The two one-sided tests (TOST) procedure, which consists of testing whether the target parameter is significantly greater than a pre-defined lower equivalence limit and significantly lower than an upper one, is typically used in this context, usually by checking whether the confidence interval for the target parameter lies within these limits. This intuitive and visual procedure is, however, known to be conservative, especially in the case of highly variable drugs, where it shows a rapid power loss, often reaching zero, making it impossible to conclude equivalence even when it actually holds. Here, we propose a finite-sample correction of the TOST procedure, the α-TOST, which adjusts the significance level of the TOST so as to guarantee a test size (or type-I error rate) of α. This new procedure essentially corresponds to a finite-sample and variability correction of the TOST procedure. We show that this procedure is uniformly more powerful than the TOST, easy to compute, and that its operating characteristics outperform those of its competitors. A case study on econazole nitrate deposition in porcine skin illustrates the benefits of the proposed method and its advantages compared to other available procedures.
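As a concrete illustration of the baseline procedure being corrected, the classical TOST decision rule can be sketched as follows. This is a hedged sketch using a large-sample normal (z) approximation rather than the exact t-based test, it does not implement the α-TOST correction proposed in the paper, and the function names and data are illustrative only.

```python
# Minimal sketch of the classical two one-sided tests (TOST) procedure for
# average equivalence of a mean difference, under a large-sample normal
# approximation (an assumption made to keep this dependency-free).
from statistics import NormalDist, mean, stdev

def tost(differences, lower, upper, alpha=0.05):
    """Declare equivalence if BOTH one-sided z-tests reject at level alpha."""
    n = len(differences)
    m = mean(differences)
    se = stdev(differences) / n ** 0.5
    z = NormalDist()
    # Test 1: H0: mean <= lower  vs  H1: mean > lower
    p_lower = 1 - z.cdf((m - lower) / se)
    # Test 2: H0: mean >= upper  vs  H1: mean < upper
    p_upper = z.cdf((m - upper) / se)
    # Equivalence is concluded only if both tests reject, i.e. max p < alpha.
    return max(p_lower, p_upper) < alpha

def tost_ci(differences, lower, upper, alpha=0.05):
    """Equivalent check: does the (1 - 2*alpha) CI lie within the limits?"""
    n = len(differences)
    m = mean(differences)
    se = stdev(differences) / n ** 0.5
    half = NormalDist().inv_cdf(1 - alpha) * se  # a 90% CI when alpha = 0.05
    return lower < m - half and m + half < upper
```

The second function is the familiar "confidence interval inclusion" view of the TOST mentioned above: rejecting both one-sided tests at level α is equivalent to the (1 − 2α) confidence interval falling inside the equivalence limits, which is where the procedure's conservativeness comes from.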
Latent time series models, such as (the independent sum of) ARMA(p, q) models with additional stochastic processes, are increasingly used for data analysis in biology, ecology, engineering, and economics. Inference on and/or prediction from these models can be highly challenging: (i) the data may contain outliers that can adversely affect the estimation procedure; (ii) the computational complexity can become prohibitive when the time series are extremely large; (iii) model selection adds another layer of (computational) complexity; and (iv) solutions that address (i), (ii), and (iii) simultaneously do not exist in practice. This paper aims to jointly address these challenges by proposing a general framework for robust two-step estimation based on a bounded-influence M-estimator of the wavelet variance. We first develop the conditions for the joint asymptotic normality of the latter estimator, thereby providing the necessary tools to perform (direct) inference for scale-based analysis of signals. Taking advantage of the model-independent weights of this first-step estimator, we then develop the asymptotic properties of two-step robust estimators using the framework of the generalized method of wavelet moments (GMWM). Simulation studies illustrate the good finite sample performance of the robust GMWM estimator, and applied examples highlight the practical relevance of the proposed approach.
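To make the first step concrete, here is a minimal sketch of a Haar wavelet-variance estimator together with a crude robust variant. The robust variant substitutes a MAD-based scale estimate purely to illustrate why robustifying the wavelet variance resists outliers; the bounded-influence M-estimator developed in the paper is a different, more refined construction, and all names below are hypothetical.

```python
# Hedged sketch: classical vs. crude robust estimation of the Haar wavelet
# variance at dyadic scale 2**scale, on non-overlapping coefficient blocks.
from statistics import median

def haar_coeffs(x, scale=1):
    """Non-overlapping Haar wavelet coefficients at scale 2**scale:
    (mean of second half-block) - (mean of first half-block), halved."""
    h = 2 ** (scale - 1)
    step = 2 ** scale
    return [
        (sum(x[t + h : t + step]) - sum(x[t : t + h])) / step
        for t in range(0, len(x) - step + 1, step)
    ]

def wavelet_variance(x, scale=1):
    """Classical estimator: mean of squared coefficients (outlier-sensitive)."""
    w = haar_coeffs(x, scale)
    return sum(c * c for c in w) / len(w)

def robust_wavelet_variance(x, scale=1):
    """Crude robust variant: squared MAD of the coefficients, rescaled to be
    consistent at the Gaussian (NOT the paper's M-estimator)."""
    w = haar_coeffs(x, scale)
    m = median(w)
    mad = median(abs(c - m) for c in w)
    return (1.4826 * mad) ** 2
```

A single large outlier in the series contaminates one wavelet coefficient, which can dominate the mean-of-squares estimator while leaving a median-based scale nearly unchanged; bounding the influence of each coefficient, as in the paper's M-estimator, formalizes this idea.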