We construct a binomial tree model fitting all moments of the approximated geometric Brownian motion. Our construction generalizes the classical Cox–Ross–Rubinstein, the Jarrow–Rudd, and the Tian binomial tree models. The new binomial model is used to resolve a discontinuity problem in option pricing.
• We provide a multi-purpose binomial tree model.
• Our model generalizes the Cox–Ross–Rubinstein, Jarrow–Rudd, and Tian models.
• Our model can fit all moments of the approximated geometric Brownian motion.
• Our binomial model is used to resolve the discontinuity problem in option pricing.
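As a concrete baseline, the classical Cox–Ross–Rubinstein tree that this construction generalizes can be sketched as below. This is a minimal illustrative implementation, not the paper's generalized moment-matching tree; the function name and parameters are chosen here for illustration.

```python
import math

def crr_call_price(s0, k, r, sigma, t, n):
    """European call price on a classical Cox-Ross-Rubinstein binomial tree.

    The CRR parametrization matches the first two moments of geometric
    Brownian motion as the number of steps n grows; the generalized tree
    extends this moment matching, but is not reproduced here.
    """
    dt = t / n
    u = math.exp(sigma * math.sqrt(dt))   # up factor
    d = 1.0 / u                           # down factor (the CRR choice)
    p = (math.exp(r * dt) - d) / (u - d)  # risk-neutral up probability
    disc = math.exp(-r * dt)              # per-step discount factor
    # option payoffs at the terminal nodes
    values = [max(s0 * u**j * d**(n - j) - k, 0.0) for j in range(n + 1)]
    # backward induction through the tree
    for _ in range(n):
        values = [disc * (p * values[j + 1] + (1 - p) * values[j])
                  for j in range(len(values) - 1)]
    return values[0]
```

For an at-the-money call (s0 = k = 100, r = 0.05, sigma = 0.2, t = 1) the price converges toward the Black–Scholes value of about 10.45 as n grows.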
Study of gas purifiers for the CMS RPC detector
Benussi, L.; Bianco, S.; Colafranceschi, S. ...
Nuclear Instruments & Methods in Physics Research Section A: Accelerators, Spectrometers, Detectors and Associated Equipment, 01/2012, Volume: 661
Journal Article
Peer reviewed
Open access
The CMS RPC muon detector utilizes a gas recirculation system called closed loop (CL) to cope with large gas mixture volumes and costs. A systematic study of CL gas purifiers has been carried out over 400 days between July 2008 and August 2009 at CERN in a low-radiation test area, using RPC chambers with current monitoring and gas analysis sampling points. The study aimed to fully clarify the presence of pollutants, the chemistry of the purifiers used in the CL, and the regeneration procedure. Preliminary results on contaminant release and purifier characterization are reported.
Optimal Financial Portfolios
Stoyanov, S. V.; Rachev, S. T.; Fabozzi, F. J.
Applied Mathematical Finance, 12/2007, Volume: 14, Issue: 5
Journal Article
Peer reviewed
The classes of reward-risk optimization problems that arise from different choices of reward and risk measures are considered. In certain examples the generic problem reduces to linear or quadratic programming problems. An algorithm based on a sequence of convex feasibility problems is given for the general quasi-concave ratio problem. Reward-risk ratios that are appropriate in particular for non-normal asset return distributions and are not quasi-concave are also considered.
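The "sequence of convex feasibility problems" idea can be illustrated with a minimal bisection sketch: ask, for a candidate ratio level lam, whether any portfolio achieves reward − lam · risk ≥ 0, and bisect on lam. Everything below is illustrative, not the paper's algorithm: reward is taken as the mean return, risk as the mean absolute deviation, and the convex feasibility step is replaced by a coarse two-asset weight grid.

```python
def max_reward_risk_ratio(rets_a, rets_b, lo=0.0, hi=10.0, tol=1e-6):
    """Maximize a reward-risk ratio over two-asset portfolios by bisection.

    Each bisection step answers a feasibility question: is there a
    portfolio with reward - lam * risk >= 0?  In a real implementation
    this step is a convex feasibility problem; here it is approximated
    by scanning a grid of long-only weights for illustration.
    """
    weights = [i / 1000 for i in range(1001)]  # weight on asset A
    pairs = []
    for w in weights:
        port = [w * a + (1 - w) * b for a, b in zip(rets_a, rets_b)]
        mu = sum(port) / len(port)                       # reward: mean return
        mad = sum(abs(x - mu) for x in port) / len(port)  # risk: mean abs. deviation
        pairs.append((mu, mad))
    while hi - lo > tol:
        lam = 0.5 * (lo + hi)
        if any(mu - lam * mad >= 0 for mu, mad in pairs):
            lo = lam  # ratio level lam is attainable, search higher
        else:
            hi = lam
    return lo
```

The bisection converges to the best attainable ratio on the grid, mirroring how the quasi-concave ratio problem is reduced to a sequence of feasibility checks.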
The CMS experiment is expected to start data taking during 2008, and large data samples, of the petabyte scale, will be produced each year. The CMS Physics Tools package provides the CMS physicist with a powerful and flexible software layer for analysis of these huge datasets that is well integrated into the CMS experiment software. C++ generic programming is used to allow simple extensions of analysis tools. A core part of this package is the Candidate Model, providing a coherent interface to different types of data. Standard tasks such as combinatorial analyses, generic cuts, MC truth matching, and constrained fitting are supported. Advanced template techniques enable the user to add missing features easily. We explain the underlying model and certain details of the implementation, and present some use cases showing how the tools are currently used in generator and full simulation studies as preparation for analysis of real data.
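A hypothetical sketch of what a candidate model with combinatorial combination and a generic cut can look like (in Python rather than C++ for brevity; this is not the actual CMS Candidate API, and all names are invented for illustration):

```python
import itertools
import math

class Candidate:
    """Toy stand-in for an analysis 'candidate': a four-momentum (E, px, py, pz)."""
    def __init__(self, e, px, py, pz):
        self.e, self.px, self.py, self.pz = e, px, py, pz

    def __add__(self, other):
        # a composite candidate: four-momenta add component-wise
        return Candidate(self.e + other.e, self.px + other.px,
                         self.py + other.py, self.pz + other.pz)

    def mass(self):
        # invariant mass m^2 = E^2 - |p|^2 (clamped against rounding)
        m2 = self.e**2 - (self.px**2 + self.py**2 + self.pz**2)
        return math.sqrt(max(m2, 0.0))

def combine(coll_a, coll_b, cut):
    """Combinatorial analysis: pair candidates from two collections,
    keeping the composites that pass a generic, user-supplied cut."""
    return [a + b for a, b in itertools.product(coll_a, coll_b) if cut(a + b)]
```

For example, two massless back-to-back candidates with unit energy combine into one composite of invariant mass 2, and a mass-window cut selects or rejects it.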
The elliptic azimuthal anisotropy coefficient (v2) is measured for charm (D0) and strange (K0S, Λ, Ξ−, and Ω−) hadrons, using a data sample of p+Pb collisions collected by the CMS experiment at a nucleon-nucleon center-of-mass energy of √sNN = 8.16 TeV. A significant positive v2 signal from long-range azimuthal correlations is observed for all particle species in high-multiplicity p+Pb collisions. The measurement represents the first observation of possible long-range collectivity for open heavy-flavor hadrons in small systems. The results suggest that charm quarks have a smaller v2 than the lighter quarks, probably reflecting a weaker collective behavior. This effect is not seen in the larger PbPb collision system at √sNN = 5.02 TeV, for which results are also presented.
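A minimal sketch of what v2 quantifies, assuming an idealized known event plane at Ψ = 0: the azimuthal distribution is modulated as dN/dφ ∝ 1 + 2·v2·cos(2φ), and v2 is recovered as the average of cos(2φ). The actual measurement uses long-range two-particle correlations and is considerably more involved; this toy is for intuition only.

```python
import math
import random

def sample_angles(v2, n, seed=1):
    """Draw n azimuthal angles from dN/dphi ~ 1 + 2*v2*cos(2*phi)
    (event plane fixed at zero) by acceptance-rejection sampling."""
    rng = random.Random(seed)
    fmax = 1 + 2 * v2  # maximum of the modulated density
    angles = []
    while len(angles) < n:
        phi = rng.uniform(0.0, 2 * math.pi)
        if rng.uniform(0.0, fmax) < 1 + 2 * v2 * math.cos(2 * phi):
            angles.append(phi)
    return angles

def estimate_v2(angles):
    """With a known event plane Psi = 0, v2 = <cos 2(phi - Psi)>."""
    return sum(math.cos(2 * p) for p in angles) / len(angles)
```

With the modulation above, the expectation of cos(2φ) is exactly v2, so the estimator recovers the input coefficient up to statistical fluctuations.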
In 2012, 14 Italian institutions participating in LHC experiments (10 in CMS) won a grant from the Italian Ministry of Research (MIUR) to optimize analysis activities and, more generally, the Tier2/Tier3 infrastructure. A wide range of activities is being carried out: data distribution over the WAN, dynamic provisioning for both scheduled and interactive processing, design and development of tools for distributed data analysis, and tests on porting the CMS software stack to new high-performance, low-power architectures.