Computing the probability of mission survival for systems using a mission abort policy is not a classical reliability problem, generally defined as the probability of a system performing its intended mission for a specified period of time, because the assessment does not, in general, involve a mission of a specific duration.
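To make the distinction concrete, here is a minimal Monte Carlo sketch with entirely hypothetical dynamics (the degradation increments, hazard rates, and abort threshold are invented for illustration, not taken from the paper): the mission can end at the planned horizon, at an abort triggered by accumulated degradation, or at a failure, so the relevant time horizon is itself random.

```python
import numpy as np

rng = np.random.default_rng(0)

def survival_estimate(n=5_000, dt=0.01, horizon=10.0,
                      base_hazard=0.005, hazard_slope=0.01,
                      abort_level=5.0):
    """Estimate P(complete), P(abort), P(fail) for a degrading system
    whose mission is aborted once degradation crosses abort_level."""
    complete = abort = fail = 0
    for _ in range(n):
        x, t = 0.0, 0.0                          # degradation, elapsed time
        while t < horizon:
            x += rng.exponential(0.005)          # hypothetical degradation increment
            if rng.random() < (base_hazard + hazard_slope * x) * dt:
                fail += 1                        # failure during the mission
                break
            if x >= abort_level:
                abort += 1                       # abort policy triggers
                break
            t += dt
        else:
            complete += 1                        # mission runs its full course
    return complete / n, abort / n, fail / n

print("P(complete), P(abort), P(fail):", survival_estimate())
```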
Bed load transport is a highly stochastic, multiscale process, where particle advection and diffusion regimes are governed by the dynamics of each sediment grain during its motion and resting states. A quantitative understanding of the macroscale behavior emerging from the microscale interactions is important for proper model selection in the absence of individual grain-scale observations. Here we develop a semimechanistic sediment transport model based on individual particle dynamics, which incorporates the episodic movement (steps separated by rests) of sediment particles, and we study their macroscale behavior. By incorporating different types of probability distribution functions (PDFs) of particle resting times T_r, under the assumption of a thin-tailed PDF of particle velocities, we study the emergent behavior of particle advection and diffusion regimes across a wide range of spatial and temporal scales. For exponential PDFs of resting times T_r, we observe normal advection and diffusion at long time scales. For a power-law PDF of resting times (i.e., f(T_r) ∼ T_r^(−ν)), the tail thickness parameter ν is observed to affect both the advection regimes (subadvective and normal advective) and the diffusion regimes (subdiffusive and superdiffusive). By comparing our semimechanistic model with two random walk models in the literature, we further suggest that, to reproduce the emerging diffusive regimes accurately, the resting time model has to be coupled with a particle motion model able to produce finite particle velocities during steps, as in the episodic model discussed here. (A minimal simulation sketch follows the key points below.)
Key Points:
Macroscale behavior of sediment transport reveals aspects of microscale particle dynamics
Heavy‐tailed particle resting times can result in both sub- and superdiffusion
Higher‐order statistical moments are needed to differentiate between various complexity models
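As a companion to the abstract above, the following minimal sketch (hypothetical parameters, not the paper's model or calibration) simulates an episodic step-rest walk with thin-tailed Gaussian velocities and power-law resting times f(T_r) ∼ T_r^(−ν) sampled by inverse transform, so one can probe how the variance scaling changes with ν.

```python
import numpy as np

rng = np.random.default_rng(1)

def episodic_positions(n_particles=2000, t_max=2000.0, nu=1.5,
                       step_duration=1.0, v_mean=1.0, v_std=0.3):
    """Positions at time t_max of particles alternating finite-velocity
    steps with power-law rests f(T_r) ~ T_r^(-nu), T_r >= 1 (Pareto)."""
    x = np.zeros(n_particles)
    for i in range(n_particles):
        t, xi = 0.0, 0.0
        while t < t_max:
            xi += rng.normal(v_mean, v_std) * step_duration   # finite-velocity step
            t += step_duration
            t += (1.0 - rng.random()) ** (-1.0 / (nu - 1.0))  # power-law rest
        x[i] = xi
    return x

# Comparing the positional variance at two observation times gives a crude
# estimate of gamma in var ~ t^gamma (1 = normal diffusion, < 1 subdiffusive,
# > 1 superdiffusive).
for t_max in (500.0, 2000.0):
    print(t_max, episodic_positions(t_max=t_max).var())
```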
Determining reliable probability distributions for ice sheet mass change over the coming century is critical to refining uncertainties in sea-level rise projections. Bayesian calibration, a method for constraining projection uncertainty using observations, has previously been applied to ice sheet projections, but the impact of the chosen observation type on the calibrated posterior probability distributions has not been quantified. Here, we perform three separate Bayesian calibrations to constrain uncertainty in Greenland Ice Sheet (GrIS) simulations of the committed mass loss in 2100 under the current climate, using observations of velocity change, dynamic ice thickness change, and mass change. Comparing the posterior probability distributions shows that the median ice sheet mass change can differ by 119% for the particular model ensemble that we used, depending on the observation type used in the calibration. More importantly for risk-averse sea-level planning, posterior probabilities of high-end mass change scenarios are highly sensitive to the observation selected for calibration. Furthermore, we show that using mass change observations alone may result in model simulations that overestimate flow acceleration and underestimate dynamic thinning around the margin of the ice sheet. Finally, we look ahead and present ideas for ways to improve Bayesian calibration of ice sheet projections.
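The following is a minimal, self-contained sketch of the calibration idea with made-up numbers rather than the paper's ensemble: each ensemble member's modeled observable is scored against a single observation under a Gaussian error model, and the resulting weights reshape the distribution of projected mass change.

```python
import numpy as np

rng = np.random.default_rng(2)

# Hypothetical ensemble: each member i predicts an observable sim_obs[i]
# (e.g., velocity change) and a projection mass_change[i].
n = 500
mass_change = rng.normal(-50.0, 20.0, n)               # hypothetical projections
sim_obs = 0.8 * mass_change + rng.normal(0.0, 5.0, n)  # hypothetical modeled observable
obs, obs_sigma = -45.0, 6.0                            # hypothetical observation, error

# Gaussian likelihood reweights the (initially equal-weight) members.
log_like = -0.5 * ((sim_obs - obs) / obs_sigma) ** 2
w = np.exp(log_like - log_like.max())
w /= w.sum()

# Posterior quantiles via weighted resampling.
posterior = rng.choice(mass_change, size=100_000, p=w)
print("prior median:", np.median(mass_change),
      "posterior median:", np.quantile(posterior, 0.5))
```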
The statistics profession is at a unique point in history. The need for valid statistical tools is greater than ever; data sets are massive, often comprising hundreds of thousands of measurements for a single subject. The field is ready to move toward clear, objective benchmarks under which tools can be evaluated. Targeted learning allows (1) the full generalization and utilization of cross-validation as an estimator selection tool, so that the subjective choices once made by humans are now made by the machine, and (2) targeting the fit of the probability distribution of the data toward the target parameter representing the scientific question of interest. This book is aimed at both statisticians and applied researchers interested in causal inference and general effect estimation for observational and experimental data. Part I is an accessible introduction to super learning and the targeted maximum likelihood estimator, including related concepts necessary to understand and apply these methods. Parts II-IX handle complex data structures and topics that applied researchers will immediately recognize from their own research, including time-to-event outcomes, direct and indirect effects, positivity violations, case-control studies, censored data, longitudinal data, and genomic studies.
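As a toy illustration of point (1), the sketch below (synthetic data and two hypothetical candidate estimators, not the book's examples) lets K-fold cross-validated risk, rather than the analyst, select the estimator.

```python
import numpy as np

rng = np.random.default_rng(3)

# Synthetic regression data.
X = rng.uniform(-2, 2, 300)
y = np.sin(X) + rng.normal(0, 0.3, 300)

def fit_mean(Xtr, ytr):        # candidate 1: constant predictor
    m = ytr.mean()
    return lambda x: np.full_like(x, m)

def fit_poly3(Xtr, ytr):       # candidate 2: cubic polynomial fit
    c = np.polyfit(Xtr, ytr, 3)
    return lambda x: np.polyval(c, x)

def cv_risk(fit, X, y, k=5):
    """K-fold cross-validated mean squared error of a fitting routine."""
    idx = rng.permutation(len(X))
    risks = []
    for fold in np.array_split(idx, k):
        train = np.setdiff1d(idx, fold)
        pred = fit(X[train], y[train])(X[fold])
        risks.append(np.mean((y[fold] - pred) ** 2))
    return np.mean(risks)

# The machine, not the analyst, picks the candidate with lowest CV risk.
for name, fit in [("mean", fit_mean), ("poly3", fit_poly3)]:
    print(name, "CV risk:", cv_risk(fit, X, y))
```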
Traditional structural uncertainty analysis is mainly based on probability models and requires the establishment of accurate parametric probability distribution functions from large numbers of experimental samples. In many actual engineering problems, sufficient samples are available to establish probability distributions for some parameters, whereas for others, owing to scarce or poor-quality samples, only variation intervals can be obtained, or the probability distribution type can be determined from the existing data while some of the distribution parameters, such as the mean and standard deviation, can only be given as interval estimates. This constitutes an important class of probability-interval hybrid uncertain problems, in which aleatory and epistemic uncertainties coexist. Probability-interval hybrid uncertainty analysis provides an important means for the reliability analysis and design of many complex structures, and has become one of the research focuses in the field of structural uncertainty analysis over the past decades. This paper reviews the four main research directions in this area, i.e., uncertainty modeling, uncertainty propagation analysis, structural reliability analysis, and reliability-based design optimization. It summarizes the main scientific problems, technical difficulties, and current research status of each direction. Based on the review, this paper also provides an outlook for future research in probability-interval hybrid uncertainty analysis.
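A minimal double-loop Monte Carlo sketch of such a hybrid problem, with invented numbers: the load is aleatory (normally distributed), but its mean is epistemic and known only to lie in an interval, so the failure probability comes out as an interval rather than a single value.

```python
import numpy as np

rng = np.random.default_rng(4)

R = 10.0                      # hypothetical deterministic resistance
mu_interval = (6.0, 8.0)      # epistemic interval for the load mean
sigma = 1.5                   # known aleatory standard deviation

pf = []
for mu in np.linspace(*mu_interval, 21):     # outer (epistemic) loop
    S = rng.normal(mu, sigma, 200_000)       # inner (aleatory) loop
    pf.append(np.mean(S > R))                # P(load exceeds resistance)

print("failure probability bounds: [%.4g, %.4g]" % (min(pf), max(pf)))
```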
Two years (2021-2022) of high-frequency-radar (HFR) sea surface current data in the Gulf of Trieste (northern Adriatic Sea) are analysed. Two different timescales are extracted using a superstatistical formalism: a relaxation time and a larger timescale over which the system is Gaussian. We propose obtaining an ocean current probability density function (PDF) by combining (i) a Gaussian PDF for the fast fluctuations and (ii) a convolution of exponential PDFs for the slowly evolving variance of the Gaussian function, rather than for the thermodynamic β = 1/σ² in a system with a few degrees of freedom, as the latter has divergent moments. The Gaussian PDF reflects the entropy maximization for real-valued variables with a given variance. On the other hand, if a positive variable, such as a variance, has a specified mean, the maximum-entropy solution is an exponential PDF. In our case the system has 2 degrees of freedom, and therefore the PDF of the variance is the convolution of two exponentials.
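A minimal numerical sketch of this construction, with hypothetical scales: the variance is drawn as the sum of two independent exponentials (the stated convolution, equivalently a Gamma distribution with shape 2), and the current component is Gaussian given that variance; the excess kurtosis of the resulting marginal signals its heavier-than-Gaussian tails.

```python
import numpy as np

rng = np.random.default_rng(5)

n = 1_000_000
# Slowly evolving variance: convolution of two exponential PDFs (2 dof).
variance = rng.exponential(0.5, n) + rng.exponential(0.5, n)
# Fast fluctuations: Gaussian given the variance.
u = rng.normal(0.0, np.sqrt(variance))

# Excess kurtosis: ~0 for a pure Gaussian; positive values indicate the
# heavier tails produced by the superstatistical variance mixing.
print("excess kurtosis:", np.mean(u**4) / np.mean(u**2) ** 2 - 3.0)
```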
The paper is concerned with the problem of regularization by noise of 3D Navier–Stokes equations. As opposed to several attempts made with additive noise, which remained inconclusive, we show here that a suitable multiplicative noise of transport type has a regularizing effect. It is proven that stochastic transport noise provides a bound on vorticity which gives well-posedness, with high probability. The result holds for sufficiently large noise intensity and a sufficiently high spectrum of the noise.
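For orientation, a schematic of Navier–Stokes equations with multiplicative transport noise in the Stratonovich convention; the notation (noise vector fields σ_k, independent Brownian motions W^k) is generic and not taken verbatim from the paper.

```latex
% Schematic: 3D Navier--Stokes with multiplicative transport noise
% (Stratonovich form); sigma_k are divergence-free vector fields and
% W^k independent Brownian motions -- generic notation, not the paper's.
\[
  \mathrm{d}u + \bigl[(u \cdot \nabla)u + \nabla p\bigr]\,\mathrm{d}t
    = \nu \Delta u \,\mathrm{d}t
    + \sum_{k} (\sigma_k \cdot \nabla) u \circ \mathrm{d}W^k_t,
  \qquad \nabla \cdot u = 0 .
\]
```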
There are many Markov chains on infinite dimensional spaces whose one-step transition kernels are mutually singular when starting from different initial conditions. We give results which prove unique ergodicity under minimal assumptions on one hand and the existence of a spectral gap under conditions reminiscent of Harris’ theorem. The first uses the existence of couplings which draw the solutions together as time goes to infinity. Such “asymptotic couplings” were central to (Mattingly and Sinai in Comm Math Phys 219(3):523–565, 2001; Mattingly in Comm Math Phys 230(3):461–462, 2002; Hairer in Prob Theory Relat Field 124:345–380, 2002; Bakhtin and Mattingly in Commun Contemp Math 7:553–582, 2005), on which this work builds. As in Bakhtin and Mattingly (2005), the emphasis here is on stochastic differential delay equations. Harris’ celebrated theorem states that if a Markov chain admits a Lyapunov function whose level sets are “small” (in the sense that transition probabilities are uniformly bounded from below), then it admits a unique invariant measure and transition probabilities converge towards it at exponential speed. This convergence takes place in a total variation norm, weighted by the Lyapunov function. A second aim of this article is to replace the notion of a “small set” by the much weaker notion of a “d-small set,” which takes the topology of the underlying space into account via a distance-like function d. With this notion at hand, we prove an analogue to Harris’ theorem, where the convergence takes place in a Wasserstein-like distance weighted again by the Lyapunov function. This abstract result is then applied to the framework of stochastic delay equations. In this framework, the usual theory of Harris chains does not apply, since there are natural examples for which there exist no small sets (except for sets consisting of only one point). This gives a solution to the long-standing open problem of finding natural conditions under which a stochastic delay equation admits at most one invariant measure and transition probabilities converge to it.
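A minimal numerical sketch of the coupling idea for a stochastic delay equation, with hypothetical coefficients: two copies of dX(t) = (aX(t) + bX(t−τ))dt + s dW(t) are driven by the same Brownian increments from different initial histories; their difference obeys a stable deterministic delay equation and contracts, illustrating how an asymptotic coupling can operate where no nontrivial small sets exist.

```python
import numpy as np

rng = np.random.default_rng(6)

a, b, s = -2.0, 0.5, 1.0        # hypothetical drift, delay feedback, noise
tau, dt, T = 1.0, 0.01, 30.0
lag = int(tau / dt)
hist = lag + 1                  # indices 0..lag cover times in [-tau, 0]
n = int(T / dt)

x = np.concatenate([np.full(hist, 2.0), np.zeros(n)])   # initial history +2
y = np.concatenate([np.full(hist, -1.0), np.zeros(n)])  # initial history -1
for i in range(hist, hist + n):
    dW = np.sqrt(dt) * rng.normal()                     # shared noise increment
    x[i] = x[i-1] + (a * x[i-1] + b * x[i-1-lag]) * dt + s * dW
    y[i] = y[i-1] + (a * y[i-1] + b * y[i-1-lag]) * dt + s * dW

# The difference solves a stable deterministic delay equation (|b| < -a),
# so the two copies are drawn together as time grows.
print("|X - Y| at t = 0, T/2, T:",
      abs(x[hist] - y[hist]),
      abs(x[hist + n // 2] - y[hist + n // 2]),
      abs(x[-1] - y[-1]))
```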
The power spectral density (PSD) of any time-dependent stochastic process X_t is a meaningful feature of its spectral content. In its textbook definition, the PSD is the Fourier transform of the covariance function of X_t over an infinitely large observation time T, that is, it is defined as an ensemble-averaged property taken in the limit T → ∞. A legitimate question is what information on the PSD can be reliably obtained from single-trajectory experiments, if one goes beyond the standard definition and analyzes the PSD of a single trajectory recorded for a finite observation time T. In quest of this answer, for a d-dimensional Brownian motion (BM) we calculate the probability density function of a single-trajectory PSD for arbitrary frequency f, finite observation time T, and an arbitrary number k of projections of the trajectory onto different axes. We show analytically that the scaling exponent for the frequency dependence of the PSD specific to an ensemble of BM trajectories can already be obtained from a single trajectory, while the numerical amplitude in the relation between the ensemble-averaged and single-trajectory PSDs is a fluctuating property which varies from realization to realization. The distribution of this amplitude is calculated exactly and discussed in detail. Our results are confirmed by numerical simulations and single-particle tracking experiments, with remarkably good agreement. In addition we consider a truncated Wiener representation of BM, and the case of a discrete-time lattice random walk. We highlight some differences in the behavior of a single-trajectory PSD for BM and for the two latter situations. The framework developed herein will allow for a meaningful physical analysis of experimental stochastic trajectories.
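A minimal sketch of the single-trajectory PSD for one-dimensional BM, using a discrete approximation of S_T(f) = (1/T)|∫₀ᵀ e^(i2πft) X(t) dt|²; the grid, the fitting band, and the parameters are illustrative choices, not those of the paper.

```python
import numpy as np

rng = np.random.default_rng(7)

T, n = 100.0, 2 ** 14
dt = T / n

for realization in range(3):
    # One Brownian trajectory with diffusion coefficient D = 1.
    X = np.cumsum(np.sqrt(2 * dt) * rng.normal(size=n))
    F = np.fft.rfft(X) * dt                   # finite-time Fourier transform
    f = np.fft.rfftfreq(n, dt)
    S = np.abs(F) ** 2 / T                    # single-trajectory PSD

    # Fit log S ~ alpha * log f over an intermediate frequency band:
    # the exponent is reproducible realization to realization, while the
    # amplitude fluctuates.
    band = (f > 10 / T) & (f < 1 / (10 * dt))
    alpha = np.polyfit(np.log(f[band]), np.log(S[band]), 1)[0]
    print("realization %d: spectral exponent ~ %.2f (ensemble value -2)"
          % (realization, alpha))
```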
This paper presents an extended polynomial chaos formalism for epistemic uncertainties and a new framework for evaluating sensitivities and variations of output probability density functions (PDFs) with respect to uncertainty in probabilistic models of input variables. An "extended" polynomial chaos expansion (PCE) approach is developed that accounts for both aleatory and epistemic uncertainties, modeled as random variables, thus allowing a unified treatment of both types of uncertainty. We explore in particular the epistemic uncertainty associated with the choice of prior probabilistic models for input parameters. A PCE-based kernel density estimation (KDE) construction provides a composite map from the PCE coefficients and germ to the PDF of quantities of interest (QoI). The sensitivities of these PDFs with respect to the parameters of the input probabilistic models are then evaluated. By sampling over the epistemic random variable, a family of PDFs is generated, and the failure probability is itself estimated as a random variable with its own PCE. Integrating epistemic uncertainties within the PCE framework results in a computationally efficient paradigm for propagation and sensitivity evaluation. Two typical illustrative examples are used to demonstrate the proposed approach (a minimal sketch of the idea follows the highlights below).
Highlights:
• The EPCE for quantification of multi-uncertainty is introduced.
• EPCE-KDE coupling enables straightforward assessment of stochastic sensitivities.
• The total variation in the response PDF is estimated via stochastic sensitivities.
• Failure probability is characterized as a random variable and represented by EPCE.
• The framework enables scientific and efficient prediction from incomplete data.
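A minimal sketch of the underlying idea, not the paper's EPCE formulation: the quantity of interest depends on an aleatory germ ξ and an epistemic germ η (here an uncertain prior mean); sampling η yields a family of QoI PDFs via kernel density estimation, and the failure probability becomes a random variable over η. All model forms and thresholds below are invented for illustration.

```python
import numpy as np
from scipy.stats import gaussian_kde

rng = np.random.default_rng(8)

def qoi(xi, mu):
    """Hypothetical model response: aleatory germ xi, epistemic mean mu."""
    x = mu + 0.5 * xi
    return x ** 2 + 0.1 * x

threshold = 1.5                                # hypothetical failure threshold
grid = np.linspace(0.0, 4.0, 200)

family, pf_samples = [], []
for eta in rng.uniform(-1.0, 1.0, 5):          # epistemic germ samples
    mu = 1.0 + 0.2 * eta                       # uncertain prior mean
    q = qoi(rng.normal(size=20_000), mu)       # aleatory propagation
    family.append(gaussian_kde(q)(grid))       # one member of the PDF family
    pf_samples.append(np.mean(q > threshold))  # failure probability for this eta

print("failure probability varies with the epistemic germ:",
      np.round(pf_samples, 4))
```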