In this paper, we study the existence and consistency of the maximum likelihood estimator of the extreme value index based on k-record values. Following the method used by Drees et al. (2004) and Zhou (2009), we prove that the likelihood equations, in terms of k-record values, eventually admit a strongly consistent solution without any restriction on the extreme value index, which is not the case in the aforementioned studies.
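As background, here is a minimal Python sketch (illustrative, not from the paper) that extracts the upper k-record values of a sample, using the standard convention that the n-th k-record is the k-th largest observation at the n-th k-record time; k = 1 recovers the ordinary record sequence:

```python
import heapq

def k_record_values(xs, k):
    """Upper k-record values of the sequence xs.

    The n-th k-record is the k-th largest observation at the n-th
    k-record time; for k = 1 this reduces to ordinary records.
    """
    if len(xs) < k:
        return []
    top_k = list(xs[:k])          # the k largest values seen so far
    heapq.heapify(top_k)          # min-heap: root = current k-th largest
    records = [top_k[0]]          # first k-record: k-th largest of x_1..x_k
    for x in xs[k:]:
        if x > top_k[0]:          # new observation beats the k-th largest
            heapq.heapreplace(top_k, x)
            records.append(top_k[0])  # new k-th largest is the next k-record
    return records

print(k_record_values([5, 1, 4, 7, 3, 8, 6, 9], k=2))  # [1, 4, 5, 7, 8]
```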
Option prices in the market embed a probability distribution over the option's payoff, which can be fairly easily extracted. But in this risk-neutral distribution, the market's true expectations about future returns are distorted by risk preferences, in particular aversion to volatility risk. To separate the return forecasts from risk aversion, one needs another way to estimate the market's probability estimates, which the author calls "actuarial probabilities." This difficult problem becomes even harder in the swaptions market, because the risk exposure of a swaption depends on two different time horizons. For example, a 2×5 swaption has two years ("expiry") of optionality culminating in the option to enter into a swap with a maturity ("tail") of five years. The volatility surface is three-dimensional, with two time variables and the strike interest rate (the "volatility cube"), and it is difficult to estimate statistically using available data on long-lived contracts.
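The abstract does not say how the extraction is done; one standard route (an assumption here, not necessarily the author's method) is the Breeden-Litzenberger relation f(K) = e^{rT} ∂²C/∂K², approximated by central finite differences on a strike grid:

```python
import numpy as np

def risk_neutral_density(strikes, call_prices, r, T):
    """Breeden-Litzenberger: f(K) = exp(r*T) * d2C/dK2.

    strikes: evenly spaced strike grid for a common expiry T;
    call_prices: observed call quotes on that grid; r: risk-free rate.
    Returns the interior strikes and the implied risk-neutral density.
    """
    K = np.asarray(strikes, dtype=float)
    C = np.asarray(call_prices, dtype=float)
    dK = K[1] - K[0]
    # central second difference approximates d2C/dK2
    d2C = (C[2:] - 2.0 * C[1:-1] + C[:-2]) / dK**2
    return K[1:-1], np.exp(r * T) * d2C
```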
Dietze's conjecture concerns the problem of equipping a tree automaton M with weights to make it probabilistic, in such a way that the resulting automaton N predicts a given corpus C as accurately as possible. The conjecture states that the accuracy cannot increase if the states in M are merged with respect to an equivalence relation ∼ so that the result is a smaller automaton M∼. Put differently, merging states can never improve predictions. This is under the assumption that both M and M∼ are bottom-up deterministic and accept every tree in C. We prove that the conjecture holds, using a construction that turns any probabilistic version N∼ of M∼ into a probabilistic version N of M, such that N assigns at least as great a weight to each tree in C as N∼ does.
The modeling of forest growth and production is an essential tool for forestry management because it allows us to perform simulations and project forest biometric variables into the future, thus assisting in stock planning and economic analyses. In this work, a growth and production model by diameter distribution was proposed, applying the Weibull function with parameters recovered through simplified functions relating forest attributes to the Weibull parameters. The algorithm was developed in Excel's VBA language. Validation was performed with data from the Continuous Forest Inventory (CFI) in a stand of Khaya grandifoliola and in rows of Eucalyptus spp. in the ILPF (crop-livestock-forest integration) system, organized into seven date combinations ordered from the most distant to the closest to the projection date. The results were evaluated by the percentage standard error (SE%) applied to the projected and observed volumes, and by the Kolmogorov-Smirnov test applied to the diameter distributions to verify adherence. An exact relationship was identified for parameter c of the Weibull function as a function of the percentiles, and likewise for parameter b, improving the parameter recovery method. Another methodological improvement was the use of maximum diameter and maximum height by age to adjust the hypsometric function. The algorithm produced total-volume errors of at most 20% in 85% of the tests.
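The simplified recovery functions themselves are not given in the abstract; for illustration, the classical percentile-based recovery of a two-parameter Weibull F(x) = 1 − exp(−(x/b)^c) from two diameter percentiles looks like this (the percentile pair and diameter values below are hypothetical):

```python
import math

def weibull_from_percentiles(x1, p1, x2, p2):
    """Recover Weibull scale b and shape c from two percentiles,
    using F(x) = 1 - exp(-(x/b)**c), so that
    ln(-ln(1 - p)) = c*ln(x) - c*ln(b) is linear in ln(x)."""
    c = (math.log(-math.log(1 - p2)) - math.log(-math.log(1 - p1))) \
        / (math.log(x2) - math.log(x1))
    b = x1 / (-math.log(1 - p1)) ** (1.0 / c)
    return b, c

# e.g. 25th and 95th diameter percentiles of a stand (hypothetical values, cm)
b, c = weibull_from_percentiles(x1=12.0, p1=0.25, x2=24.0, p2=0.95)
```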
The identification of a person on the basis of scanned images of handwriting is a useful biometric modality with applications in forensic and historic document analysis, and constitutes an exemplary study area within the research field of behavioral biometrics. We developed new and very effective techniques for automatic writer identification and verification that use probability distribution functions (PDFs) extracted from the handwriting images to characterize writer individuality. A defining property of our methods is that they are designed to be independent of the textual content of the handwritten samples. Our methods operate at two levels of analysis: the texture level and the character-shape (allograph) level. At the texture level, we use contour-based joint directional PDFs that encode orientation and curvature information to give an intimate characterization of individual handwriting style. In our analysis at the allograph level, the writer is considered to be characterized by a stochastic pattern generator of ink-trace fragments, or graphemes. The PDF of these simple shapes in a given handwriting sample is characteristic of the writer and is computed using a common shape codebook obtained by grapheme clustering. Combining multiple features (directional, grapheme, and run-length PDFs) yields increased writer identification and verification performance. The proposed methods are applicable to free-style handwriting (both cursive and isolated) and are practically feasible, under the assumption that a few text lines of handwritten material are available in order to obtain reliable probability estimates.
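As a simplified sketch of the texture-level idea (not the paper's exact feature set), the following computes a normalized histogram of stroke-edge orientations from a binary handwriting image and compares two such PDFs with a chi-square distance, a common choice for matching writer feature distributions:

```python
import numpy as np

def contour_direction_pdf(img, n_bins=16):
    """Normalized histogram (PDF) of local edge orientations.

    img: 2-D array with ink = 1, background = 0.  Orientations come
    from image gradients on the ink boundary; this simplifies the
    contour-based joint directional PDFs described in the paper.
    """
    gy, gx = np.gradient(img.astype(float))
    mag = np.hypot(gx, gy)
    ang = np.arctan2(gy, gx) % np.pi                 # orientation in [0, pi)
    edge = mag > 0                                   # boundary pixels only
    hist, _ = np.histogram(ang[edge], bins=n_bins, range=(0, np.pi),
                           weights=mag[edge])
    return hist / max(hist.sum(), 1e-12)

def chi2_distance(p, q, eps=1e-12):
    """Chi-square distance between two PDFs."""
    return 0.5 * np.sum((p - q) ** 2 / (p + q + eps))
```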
An investigation was carried out into the charging-rate dependence of local structural changes in a layered 0.5Li2MnO3-0.5LiMn1/3Ni1/3Co1/3O2 solid solution during the initial charging cycle. To clarify the mechanism involved in local atomic rearrangement, a pair distribution function (PDF) analysis was performed using the results of powder neutron diffraction and synchrotron X-ray total scattering measurements. First-principles calculations (VASP code) were used to determine the initial structure when performing the PDF analysis. The bond-length strain (λ) and the bond-angle strain (σ²) for the optimized model were calculated following the PDF analysis in order to clarify the effect of the charging rate on the crystal structure distortion. Before charging, the distortion was smaller for MnO6 octahedra than for NiO6 and CoO6 octahedra. During charging at a rate of 1C, the MnO6 octahedra experienced increasing distortion, whereas at 3C the CoO6 octahedra became more distorted. In addition, when charging at 3C, the values of λ and σ² increased for NiO6 octahedra that had entered the Li layer as a result of cation mixing. This appeared to be related to whether the localized atom was Mn or Co within the average structure during the charging process. It is thought that distortion occurs in MO6 octahedra containing whichever element becomes localized, and this depends on the charging rate. This raises the possibility that decreasing the fractional composition of the element that becomes localized may lead to reduced distortion and improved cyclability.
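The abstract names λ and σ² without defining them; a common pair of octahedral distortion metrics carrying these symbols is the quadratic elongation and bond-angle variance of Robinson et al. (1971), sketched below under the simplifying assumption that the ideal bond length is taken as the mean observed bond length:

```python
import numpy as np

def octahedral_distortion(bond_lengths, cis_angles_deg):
    """Quadratic elongation (lambda) and bond-angle variance (sigma^2)
    for an MO6 octahedron, after Robinson et al. (1971).

    bond_lengths: the six M-O distances (angstrom).
    cis_angles_deg: the twelve cis O-M-O angles (degrees).
    The ideal length l0 is approximated by the mean bond length rather
    than that of the equal-volume regular octahedron (an assumption).
    """
    l = np.asarray(bond_lengths, dtype=float)
    l0 = l.mean()                               # simplified ideal length
    lam = np.mean((l / l0) ** 2)                # quadratic elongation
    phi = np.asarray(cis_angles_deg, dtype=float)
    sigma2 = np.sum((phi - 90.0) ** 2) / 11.0   # bond-angle variance (deg^2)
    return lam, sigma2
```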
The paper investigates a new scheme for generating lifetime probability distributions, called the Exponential-H family of distributions. The paper presents an application of this family using the Weibull distribution; the resulting distribution is called the New Flexible Exponential (NFE) distribution. Various statistical properties are derived, such as the quantile function, order statistics, and moments. Two real-life data sets and a simulation study are used to assess the flexibility of the proposed model. The proposed distribution is shown to offer better results than the Exponential, Weibull Exponential, and Exponentiated Exponential distributions.
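The NFE distribution itself is not available in standard libraries, so as a hedged illustration of the kind of goodness-of-fit comparison the paper performs, here is a sketch that fits standard lifetime models by maximum likelihood and ranks them by AIC with scipy:

```python
import numpy as np
from scipy import stats

def compare_fits(data, candidates=(stats.expon, stats.weibull_min,
                                   stats.exponweib)):
    """Fit each candidate distribution by maximum likelihood and rank
    by AIC (lower is better).  stats.exponweib is the exponentiated
    Weibull; the paper's NFE model is not in scipy, so this only
    illustrates the comparison methodology."""
    results = []
    for dist in candidates:
        params = dist.fit(data)                       # MLE fit
        loglik = np.sum(dist.logpdf(data, *params))
        aic = 2 * len(params) - 2 * loglik
        results.append((dist.name, aic))
    return sorted(results, key=lambda t: t[1])

# usage on synthetic lifetime data:
print(compare_fits(stats.expon.rvs(size=200, random_state=0)))
```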
In this paper, we extend the variational method of M. Agueh to a large class of parabolic equations involving the q(x)-Laplacian, ∂ρ(t,x)/∂t = div_x(ρ(t,x) |∇_x(G′(ρ)+V)|^{q(x)−2} ∇_x(G′(ρ)+V)). The potential V is not necessarily smooth but belongs to the Sobolev space W^{−1,∞}(Ω). Given the initial datum ρ₀ as a probability density on Ω, we use a descent algorithm in the probability space to discretize the q(x)-Laplacian parabolic equation in time. Then, we use the compact embedding W^{1,q(·)}(Ω) ↪ L^{q(·)}(Ω) established by Fan and Zhao to study the convergence of our algorithm to a weak solution of the q(x)-Laplacian parabolic equation. Finally, we establish the convergence of solutions of the q(x)-Laplacian parabolic equation to equilibrium in the p(·)-variable exponent Wasserstein space.
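The abstract does not spell out the descent step; as a sketch, a minimizing-movement (JKO-type) discretization with time step h > 0 and energy E(ρ) = ∫_Ω (G(ρ) + Vρ) dx typically iterates

\[
\rho_h^{k+1} \in \operatorname*{argmin}_{\rho \in \mathcal{P}(\Omega)}
  \; \frac{1}{h}\, W_{p(\cdot)}\bigl(\rho_h^{k}, \rho\bigr)
  + \int_\Omega \bigl( G(\rho) + V\rho \bigr)\, dx ,
\]

where W_{p(·)} denotes the variable-exponent Wasserstein distance; the exact power and normalization of the distance term depend on the paper's setup and are assumptions here. The piecewise-constant interpolation of (ρ_h^k) is then shown to converge to a weak solution as h → 0.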
Cluster analysis is aimed at classifying elements into categories on the basis of their similarity. Its applications range from astronomy to bioinformatics, bibliometrics, and pattern recognition. We propose an approach based on the idea that cluster centers are characterized by a higher density than their neighbors and by a relatively large distance from points with higher densities. This idea forms the basis of a clustering procedure in which the number of clusters arises intuitively, outliers are automatically spotted and excluded from the analysis, and clusters are recognized regardless of their shape and of the dimensionality of the space in which they are embedded. We demonstrate the power of the algorithm on several test cases.
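A minimal numpy sketch of the two quantities and the assignment rule follows (not the authors' implementation; the cut-off distance d_c and the number of centers are chosen by hand here, and the outlier/halo step is omitted):

```python
import numpy as np

def density_peaks(X, d_c, n_clusters):
    """rho_i = number of neighbours within d_c; delta_i = distance to
    the nearest point of higher density.  Centers are the points with
    the largest rho_i * delta_i; the rest inherit the label of their
    nearest denser point, in decreasing order of density."""
    D = np.linalg.norm(X[:, None, :] - X[None, :, :], axis=-1)
    rho = (D < d_c).sum(axis=1) - 1              # exclude the point itself
    n = len(X)
    order = np.argsort(-rho)                     # decreasing density
    delta = np.empty(n)
    nearest_higher = np.full(n, -1)
    delta[order[0]] = D[order[0]].max()          # densest point: max distance
    for rank, i in enumerate(order[1:], 1):
        higher = order[:rank]                    # all denser points
        j = higher[np.argmin(D[i, higher])]
        delta[i], nearest_higher[i] = D[i, j], j
    centers = np.argsort(-(rho * delta))[:n_clusters]  # densest point always qualifies
    labels = np.full(n, -1)
    labels[centers] = np.arange(n_clusters)
    for i in order:                              # denser points labeled first
        if labels[i] < 0:
            labels[i] = labels[nearest_higher[i]]
    return labels, rho, delta
```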