This volume provides an introduction to stochastic differential equations with jumps, in both theory and application. The book is accessible and contains not only many new results on numerical methods but also innovative methodologies in quantitative finance.
We call peacock an integrable process which is increasing in the convex order; such a notion plays an important role in Mathematical Finance. A deep theorem due to Kellerer states that a process is a peacock if and only if it has the same one-dimensional marginals as a martingale. Such a martingale is then said to be associated with this peacock. In this monograph, we exhibit numerous examples of peacocks and associated martingales with the help of different methods: construction of sheets, time reversal, time inversion, self-decomposability, SDEs, and Skorokhod embeddings. These are developed in eight chapters, with about a hundred exercises.
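As a minimal numerical illustration (our own, not taken from the monograph), the defining property can be checked for Brownian motion, which is a martingale and hence a peacock: for every strike K, the map t ↦ E[(B_t − K)^+] should be nondecreasing. The closed form below is the standard Bachelier-style formula for a centered Gaussian; the helper name `call_price` is ours.

```python
import math

def call_price(t, K):
    """E[(B_t - K)^+] for B_t ~ N(0, t): sqrt(t)*phi(d) - K*(1 - Phi(d)), d = K/sqrt(t)."""
    if t == 0:
        return max(-K, 0.0)
    s = math.sqrt(t)
    d = K / s
    phi = math.exp(-d * d / 2) / math.sqrt(2 * math.pi)   # standard normal pdf
    Phi = 0.5 * (1 + math.erf(d / math.sqrt(2)))          # standard normal cdf
    return s * phi - K * (1 - Phi)

# Convex-order check: for each strike K, t -> E[(B_t - K)^+] is nondecreasing.
for K in (-1.0, 0.0, 0.5, 2.0):
    prices = [call_price(t, K) for t in (0.5, 1.0, 2.0, 4.0)]
    assert all(a <= b + 1e-12 for a, b in zip(prices, prices[1:]))
```

The same one-parameter family of checks (one per convex function, here the calls x ↦ (x − K)^+) is exactly what "increasing in the convex order" requires on test functions of this type.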
Receiver operating characteristic (ROC) analysis is performed with a curve, called the ROC curve, which plots detection probability, <inline-formula> <tex-math notation="LaTeX">P_{\text {D}} </tex-math></inline-formula>, versus false alarm probability, <inline-formula> <tex-math notation="LaTeX">P_{\text {F}} </tex-math></inline-formula>, and has been widely used as an evaluation tool for signal detection. Specifically, the area under an ROC curve (AUC) is calculated and used as a detection measure. Unfortunately, finding the distributions of <inline-formula> <tex-math notation="LaTeX">P_{\text {D}} </tex-math></inline-formula> and <inline-formula> <tex-math notation="LaTeX">P_{\text {F}} </tex-math></inline-formula> needed to generate a continuous ROC curve is practically infeasible. This article investigates approaches to generating a discrete 2D ROC curve of (<inline-formula> <tex-math notation="LaTeX">P_{\text {D}} </tex-math></inline-formula>,<inline-formula> <tex-math notation="LaTeX">P_{\text {F}} </tex-math></inline-formula>) without appealing to probability distributions. Since <inline-formula> <tex-math notation="LaTeX">P_{\text {D}} </tex-math></inline-formula> and <inline-formula> <tex-math notation="LaTeX">P_{\text {F}} </tex-math></inline-formula> are determined by the same threshold <inline-formula> <tex-math notation="LaTeX">\tau </tex-math></inline-formula> that specifies a detector, an ROC curve of (<inline-formula> <tex-math notation="LaTeX">P_{\text {D}} </tex-math></inline-formula>,<inline-formula> <tex-math notation="LaTeX">P_{\text {F}} </tex-math></inline-formula>) can only be used to evaluate the effectiveness of a detector, but neither its target detectability (TD) nor its background suppressibility (BS).
To address this issue, a 3D ROC curve is generated as a function of (<inline-formula> <tex-math notation="LaTeX">P_{\text {D}} </tex-math></inline-formula>,<inline-formula> <tex-math notation="LaTeX">P_{\text {F}} </tex-math></inline-formula>, <inline-formula> <tex-math notation="LaTeX">\tau </tex-math></inline-formula>) by introducing the threshold parameter <inline-formula> <tex-math notation="LaTeX">\tau </tex-math></inline-formula> as a third independent variable. As a result, a 3D ROC curve, along with its three derived 2D ROC curves of (<inline-formula> <tex-math notation="LaTeX">P_{\text {D}} </tex-math></inline-formula>,<inline-formula> <tex-math notation="LaTeX">P_{\text {F}} </tex-math></inline-formula>), (<inline-formula> <tex-math notation="LaTeX">P_{\text {D}} </tex-math></inline-formula>,<inline-formula> <tex-math notation="LaTeX">\tau </tex-math></inline-formula>), and (<inline-formula> <tex-math notation="LaTeX">P_{\text {F}} </tex-math></inline-formula>,<inline-formula> <tex-math notation="LaTeX">\tau </tex-math></inline-formula>), can further be used to design new quantitative measures that evaluate the effectiveness of a detector as well as its TD and BS. To demonstrate the full utility of 3D ROC analysis in target detection, extensive experiments are performed on two types of target detection, prior target detection and anomaly detection, to conduct a comprehensive analysis of 3D ROC curves using newly designed detection measures to evaluate target/anomaly detection performance.
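The distribution-free construction described above can be sketched in a few lines (our own illustration, not the article's code): sweep the threshold τ over the observed detector scores, record the resulting discrete (P_F, P_D) pairs, and integrate by the trapezoidal rule for the AUC. The helper names `roc_points` and `auc` are hypothetical.

```python
def roc_points(target_scores, background_scores):
    """Discrete (P_F, P_D) pairs obtained by sweeping the threshold tau
    over every observed score, with no distributional assumptions."""
    thresholds = sorted(set(target_scores) | set(background_scores), reverse=True)
    points = [(0.0, 0.0)]
    for tau in thresholds:
        pd = sum(s >= tau for s in target_scores) / len(target_scores)
        pf = sum(s >= tau for s in background_scores) / len(background_scores)
        points.append((pf, pd))
    points.append((1.0, 1.0))
    return points

def auc(points):
    """Area under the (P_F, P_D) curve by the trapezoidal rule."""
    return sum((x2 - x1) * (y1 + y2) / 2
               for (x1, y1), (x2, y2) in zip(points, points[1:]))

# Toy scores: higher means "more target-like".
pts = roc_points([0.9, 0.8, 0.7, 0.3], [0.6, 0.4, 0.2, 0.1])
area = auc(pts)
```

Recording the triples (P_D, P_F, τ) instead of only the pairs is precisely what lifts this discrete curve to the 3D ROC analysis of the article, since the threshold axis is what separates TD from BS.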
We are interested in the time discretization of stochastic differential equations with additive $d$-dimensional Brownian noise and $L^{q}$-$L^{\rho}$ drift coefficient when the condition $\frac{d}{\rho}+\frac{2}{q}<1$, under which Krylov and Röckner [26] proved existence of a unique strong solution, is met. We show weak convergence with order $\frac{1}{2}\bigl(1-\bigl(\frac{d}{\rho}+\frac{2}{q}\bigr)\bigr)$, which corresponds to half the distance to the threshold, for the Euler scheme with randomized time variable and truncated drift coefficient, where the truncation ensures that the drift contribution on each time step does not dominate the Brownian contribution. More precisely, we prove that both the diffusion and this Euler scheme admit transition densities and that the difference between these densities is bounded from above by the time step raised to this order, multiplied by some centered Gaussian density.
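A rough sketch of the scheme just described, under our own simplifying assumptions (scalar state, a toy singular drift, and one possible truncation level of order $h^{-1/2}$ so that the per-step drift contribution $h\,b$ stays of order $\sqrt{h}$, comparable to the Brownian increment); this is an illustration, not the paper's construction:

```python
import math, random

def euler_randomized(b, x0, T, n, seed=0):
    """Euler scheme with randomized time variable and truncated drift for
    dX_t = b(t, X_t) dt + dW_t.  On each step of size h = T/n, the drift
    is evaluated at a uniformly sampled time within the step and clipped
    at level h^{-1/2} (one possible choice of cutoff)."""
    rng = random.Random(seed)
    h = T / n
    cutoff = 1.0 / math.sqrt(h)               # truncation level
    x, t = x0, 0.0
    for _ in range(n):
        u = t + h * rng.random()              # randomized evaluation time
        drift = max(-cutoff, min(cutoff, b(u, x)))
        x += drift * h + rng.gauss(0.0, math.sqrt(h))
        t += h
    return x

# Toy irregular drift, singular at the origin (illustrative only).
b = lambda t, x: math.copysign(1.0, x) / math.sqrt(abs(x)) if x != 0 else 0.0
x_T = euler_randomized(b, x0=1.0, T=1.0, n=1000)
```

The randomized evaluation time is what removes the bias a fixed left-endpoint rule would incur against a merely integrable-in-time drift.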
We study the convergence rate of the optimal quantization for a probability measure sequence $(\mu_{n})_{n\in\mathbb{N}^{*}}$ on $\mathbb{R}^{d}$ converging in the Wasserstein distance, in two aspects: the first is the convergence rate of the optimal quantizer $x^{(n)}\in(\mathbb{R}^{d})^{K}$ of $\mu_{n}$ at level $K$; the other is the convergence rate of the distortion function evaluated at $x^{(n)}$, called the "performance" of $x^{(n)}$. Moreover, we also study the mean performance of the optimal quantization for the empirical measure of a distribution $\mu$ with finite second moment but possibly unbounded support. As an application, we show that the mean performance for the empirical measure of the multidimensional normal distribution $\mathcal{N}(m, \Sigma)$, and of distributions with hyper-exponential tails, behaves like $\mathcal{O}(\frac{\log n}{\sqrt{n}})$. This extends the results of [BDL08], obtained for compactly supported distributions. We also derive an upper bound which is sharper in the quantization level $K$ but suboptimal in $n$ by applying results in [FG15].
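For intuition about the objects involved, a minimal one-dimensional Lloyd fixed-point iteration (our own sketch, not the paper's method) computes a K-level quantizer of an empirical measure and the distortion, i.e. the "performance", attained at it. The helper name `lloyd_1d` is hypothetical.

```python
import random

def lloyd_1d(sample, K, iters=50):
    """Lloyd iteration for a K-level quantizer of the empirical measure of
    `sample` (1D).  Returns the quantizer grid and its quadratic distortion
    (mean squared distance to the nearest level)."""
    sample = sorted(sample)
    # initialize with K points spread through the sorted sample
    grid = [sample[int(i * (len(sample) - 1) / (K - 1))] for i in range(K)]
    for _ in range(iters):
        cells = [[] for _ in range(K)]
        for x in sample:
            j = min(range(K), key=lambda k: (x - grid[k]) ** 2)  # nearest level
            cells[j].append(x)
        # each level moves to the mean of its Voronoi cell
        grid = [sum(c) / len(c) if c else g for c, g in zip(cells, grid)]
    distortion = sum(min((x - g) ** 2 for g in grid) for x in sample) / len(sample)
    return grid, distortion

rng = random.Random(42)
sample = [rng.gauss(0.0, 1.0) for _ in range(2000)]   # empirical N(0,1) measure
grid2, d2 = lloyd_1d(sample, K=2)
grid8, d8 = lloyd_1d(sample, K=8)
```

Running this for growing sample sizes n is exactly the "mean performance of the empirical measure" setting of the abstract, where the paper establishes the $\mathcal{O}(\frac{\log n}{\sqrt{n}})$ behavior for Gaussian targets.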
Benford's Law
Miller, Steven J.
2015-06-09, eBook
Benford's law states that the leading digits of many data sets are not uniformly distributed from one through nine, but rather exhibit a profound bias. This bias is evident in everything from electricity bills and street addresses to stock prices, population numbers, mortality rates, and the lengths of rivers. This work demonstrates the many useful techniques that arise from the law, showing how truly multidisciplinary it is, and encouraging collaboration.
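A quick empirical check of the law (our own illustration, not from the book): the leading digits of the powers of 2 are a classic example known to follow Benford's distribution P(d) = log10(1 + 1/d), which gives about 30.1% for a leading 1 and only 4.6% for a leading 9.

```python
import math

def leading_digit(x):
    """First decimal digit of a positive integer."""
    return int(str(x)[0])

N = 3000
counts = {d: 0 for d in range(1, 10)}
for k in range(1, N + 1):
    counts[leading_digit(2 ** k)] += 1

freq = {d: counts[d] / N for d in counts}                  # observed frequencies
benford = {d: math.log10(1 + 1 / d) for d in range(1, 10)} # Benford prediction
```

The observed frequencies track the prediction closely already at N = 3000, and the bias toward small leading digits is unmistakable.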
ABSTRACT We present a comprehensive analysis of the structural properties and luminosities of the 23 dwarf spheroidal galaxies that fall within the footprint of the Pan-Andromeda Archaeological Survey (PAndAS). These dwarf galaxies represent the large majority of Andromeda's known satellite dwarf galaxies and cover a wide range in luminosity and surface brightness. We confirm most previous measurements, but we find And XIX to be significantly larger than previously reported, and we cannot derive parameters for And XXVII as it is likely not a bound stellar system. We also significantly revise downward the luminosities of And XV and And XVI. Finally, we provide the first detailed analysis of Cas II/And XXX, a fairly faint system of typical size, located in close proximity to the two bright elliptical dwarf galaxies NGC 147 and NGC 185. Combined with the set of homogeneous distances published in an earlier contribution, our analysis dutifully tracks all relevant sources of uncertainty in the determination of the properties of the dwarf galaxies from the PAndAS photometric catalog. We further publish the posterior probability distribution functions of all the parameters we fit, in the form of MCMC chains available online; these inputs should be used in any analysis that aims to remain truthful to the data and properly account for covariance between parameters.
Abstract
Learning about hypothesis evaluation using the Bayes factor could enhance psychological research. In contrast to null-hypothesis significance testing, it renders the evidence in favor of each of the hypotheses under consideration (it can be used to quantify support for the null hypothesis) instead of a dichotomous reject/do-not-reject decision; it can straightforwardly be used for the evaluation of multiple hypotheses without having to worry about how to properly account for multiple testing; and it allows continuous reevaluation of hypotheses after additional data have been collected (Bayesian updating). This tutorial addresses researchers who are considering evaluating their hypotheses by means of the Bayes factor. The focus is completely applied, and each topic discussed is illustrated using Bayes factors for the evaluation of hypotheses in the context of an ANOVA model, obtained using the R package bain. Readers can execute all the analyses presented while reading this tutorial if they download bain and the R code used. It is elaborated in a completely nontechnical manner: what the Bayes factor is, how it can be obtained, how Bayes factors should be interpreted, and what can be done with Bayes factors. After reading this tutorial and executing the associated code, researchers will be able to use their own data for the evaluation of hypotheses by means of the Bayes factor, not only in the context of ANOVA models but also in the context of other statistical models.
Translational Abstract
Learning about hypothesis evaluation using the Bayes factor could enhance psychological research. The Bayes factor quantifies the support in the data for two competing hypotheses. These may be the traditional null and alternative hypotheses, but they may also be informative hypotheses such as m1 > m2 > m3 and (m1 − m2) > (m2 − m3), where m1, m2, and m3 denote the means in three experimental groups. Bayesian hypothesis evaluation offers options such as quantifying evidence in favor of the null hypothesis, simultaneous evaluation of multiple hypotheses, and Bayesian updating, that is, recomputation of the Bayes factor after additional data have been collected.
This tutorial elaborates how researchers can use the Bayes factor for the analysis of their own data. The focus is completely applied, and each topic discussed is illustrated using Bayes factors for the evaluation of hypotheses in the context of an ANOVA model, obtained using the R package bain. Readers can execute all the analyses presented while reading this tutorial if they download bain and the R code used from the bain website. It is elaborated in a completely nontechnical manner: what the Bayes factor is, how it can be obtained, how Bayes factors should be interpreted, and what can be done with Bayes factors. After reading this tutorial and executing the associated code, researchers will be able to use their own data for the evaluation of hypotheses by means of the Bayes factor, not only in the context of ANOVA models but also in the context of other statistical models.
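As an independent, minimal illustration of what a Bayes factor is (this is a Python sketch of our own, not bain and not the tutorial's R code), consider the exact Bayes factor for a binomial test of H0: θ = 0.5 against H1: θ ~ Uniform(0, 1); both marginal likelihoods are available in closed form, and the function name is ours.

```python
from math import comb

def bf01_binomial(k, n):
    """Bayes factor BF01 for H0: theta = 0.5 versus H1: theta ~ Uniform(0,1),
    given k successes in n Bernoulli trials.
    p(data | H0) = C(n,k) * 0.5**n
    p(data | H1) = integral of C(n,k) * theta**k * (1-theta)**(n-k) = 1/(n+1)
    BF01 > 1 means the data support H0 over H1."""
    p_h0 = comb(n, k) * 0.5 ** n
    p_h1 = 1.0 / (n + 1)
    return p_h0 / p_h1

# 5 successes in 10 trials: mild support for theta = 0.5 (BF01 about 2.7).
# 9 successes in 10 trials: BF01 < 1, i.e. the data favor H1.
```

The same logic, evidence for H0 quantified on a continuous scale and recomputable as data accumulate, is what the tutorial develops for ANOVA-type hypotheses with bain.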
This paper investigates the probability that the delay and the peak age of information exceed a desired threshold in a point-to-point communication system with short information packets. The packets are generated according to a stationary memoryless Bernoulli process, placed in a single-server queue, and then transmitted over a wireless channel. A variable-length stop-feedback coding scheme, a general strategy that encompasses simple automatic repeat request (ARQ) and more sophisticated hybrid ARQ techniques as special cases, is used by the transmitter to convey the information packets to the receiver. By leveraging finite-blocklength results, the delay violation and the peak-age violation probabilities are characterized without resorting to approximations based on large-deviation theory as in previous literature. Numerical results illuminate the dependence of the delay and peak-age violation probabilities on system parameters such as the frame size and the undetected error probability, and on the chosen packet-management policy. The guidelines provided by our analysis are particularly useful for the design of low-latency ultra-reliable communication systems.
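The queueing setup can be mimicked with a small Monte Carlo sketch (a toy model of our own, not the paper's finite-blocklength analysis): Bernoulli packet arrivals into a discrete-time single-server FCFS queue, with service times made geometric as a stand-in for the random transmission durations of a stop-feedback scheme, estimating the delay violation probability P(delay > threshold).

```python
import random

def delay_violation_prob(p_arrival, p_finish, threshold, n_slots=50000, seed=1):
    """Monte Carlo estimate of P(delay > threshold) in a discrete-time
    single-server FCFS queue: Bernoulli(p_arrival) arrivals per slot,
    the head-of-line packet departing with probability p_finish per slot
    (geometric service).  Stable when p_arrival < p_finish."""
    rng = random.Random(seed)
    queue = []      # arrival slots of waiting / in-service packets
    delays = []
    for t in range(n_slots):
        if rng.random() < p_arrival:
            queue.append(t)
        if queue and rng.random() < p_finish:      # head-of-line departure
            delays.append(t - queue.pop(0) + 1)    # sojourn time in slots
    if not delays:
        return 0.0
    return sum(d > threshold for d in delays) / len(delays)
```

Sweeping `p_finish` (the per-slot success probability, which the paper ties to frame size and undetected error probability) shows the trade-off the numerical results explore.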
The concept of Wiener chaos generalizes to an infinite-dimensional setting the properties of orthogonal polynomials associated with probability distributions on the real line. It plays a crucial role in modern probability theory, with applications ranging from Malliavin calculus to stochastic differential equations and from probabilistic approximations to mathematical finance.

This book is concerned with combinatorial structures arising from the study of chaotic random variables related to infinitely divisible random measures. The combinatorial structures involved are those of partitions of finite sets, over which Möbius functions and related inversion formulae are defined. This combinatorial standpoint (which is originally due to Rota and Wallstrom) provides an ideal framework for diagrams, which are graphical devices used to compute moments and cumulants of random variables.

Several applications are described, in particular recent limit theorems for chaotic random variables. An appendix presents a computer implementation in MATHEMATICA for many of the formulae.
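The moment-cumulant relation that these partition-based diagrams encode reads m_n = Σ_π Π_{B∈π} κ_{|B|}, the sum running over all partitions π of {1, …, n}. A small Python sketch of this formula (our own, independent of the book's MATHEMATICA appendix):

```python
def set_partitions(elements):
    """Generate all partitions of a list into non-empty blocks."""
    if not elements:
        yield []
        return
    first, rest = elements[0], elements[1:]
    for partition in set_partitions(rest):
        # put `first` into each existing block ...
        for i in range(len(partition)):
            yield partition[:i] + [[first] + partition[i]] + partition[i + 1:]
        # ... or into a new block of its own
        yield [[first]] + partition

def moment_from_cumulants(n, kappa):
    """Moment-cumulant formula: m_n = sum over partitions pi of {1..n}
    of the product of kappa_{|B|} over the blocks B of pi.
    `kappa` maps block size -> cumulant value (missing sizes count as 0)."""
    total = 0
    for partition in set_partitions(list(range(n))):
        prod = 1
        for block in partition:
            prod *= kappa.get(len(block), 0)
        total += prod
    return total
```

For a standard Gaussian (κ₂ = 1, all other cumulants 0) only pair partitions survive, recovering m₄ = 3 and m₆ = 15, i.e. the double factorials counted by the classical Wick/diagram formula.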