  • SHARE: Statistical hadronization with resonances
    Torrieri, G.; Steinke, S.; Broniowski, W.; Florkowski, W.; Letessier, J.; Rafelski, J.

    Computer Physics Communications, 05/2005, Volume 167, Issue 3
    Journal Article

SHARE is a collection of programs designed for the statistical analysis of particle production in relativistic heavy-ion collisions. With the physical input of intensive statistical parameters, it generates the ratios of particle abundances. The program includes cascade decays of all confirmed resonances from the Particle Data Tables. The complete treatment of these resonances has been known to be a crucial factor behind the success of the statistical approach. An optional feature implemented is the Breit–Wigner distribution for strong resonances. An interface for fitting the parameters of the model to the experimental data is provided.

Title of the program: SHARE, October 2004, version 1.2
Catalogue identifier: ADVD
Program summary URL: http://cpc.cs.qub.ac.uk/summaries/ADVD
Program obtainable from: CPC Program Library, Queen's University of Belfast, N. Ireland
Computer: PC, Pentium III, 512 MB RAM (not hardware dependent)
Operating system: Linux: RedHat 6.1, 7.2, FEDORA, etc. (not system dependent)
Programming language: FORTRAN77 (g77, f77), as well as Mathematica, ver. 4 or 5, for the case of full chemical equilibrium and particle widths set to zero
Size of the package: 645 KB directory including example programs (87 KB compressed distribution archive)
External routines: KERNLIB, MATHLIB and PACKLIB from the CERN Program Library (see http://cernlib.web.cern.ch for download and installation instructions)
Distribution format: tar.gz
Number of lines in distributed program, including test data, etc.: 15 277
Number of bytes in distributed program, including test data, etc.: 88 522
Computer: Any computer with an f77 compiler

Nature of the physical problem: Statistical analysis of particle production in relativistic heavy-ion collisions involves the formation and subsequent decays of a large number of resonances.
With the physical input of thermal parameters, such as the temperature and fugacities, and considering cascading decays along with weak-interaction feed-down corrections, the observed hadron abundances are obtained. SHARE incorporates diverse physical approaches, with a flexible choice of the details of the statistical hadronization model, including the selection of a chemical (non-)equilibrium condition. SHARE also offers evaluation of the extensive properties of the particle source, such as energy, entropy, baryon number and strangeness, as well as the determination of the best intensive input parameters fitting a set of experimental yields. This allows exploration of a proposed physical hypothesis about hadron production mechanisms and the determination of the properties of their source.

Method of solving the problem: Distributions at freeze-out of both the stable particles and the hadronic resonances are set according to a statistical prescription, technically calculated via a series of Bessel functions, using CERN library programs. We also offer the option of including the finite widths of the resonances. While this is computationally expensive, it is necessary to fully implement the essence of the strong-interaction dynamics within the statistical hadronization picture. In fact, including finite widths has a considerable effect when modeling directly detectable short-lived resonances (Λ(1520), K*, etc.), and is noticeable in fits to experimentally measured yields of stable particles. After production, all hadronic resonances decay. Resonance decays are accomplished by adding the parent abundance, normalized by the branching ratio, to the daughter. Weak-interaction decays receive special treatment, where we introduce daughter-particle acceptance factors for both strongly interacting decay products. An interface for fitting the statistical model parameters to experimental particle ratios with the help of MINUIT [1] is provided.
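The two central steps described above (statistical yields evaluated through a series of Bessel functions, then resonance feed-down by branching ratio) can be sketched in Python. This is a minimal illustration under stated assumptions, not code from SHARE: the Boltzmann-limit series (quantum-statistics alternating signs omitted), natural units (GeV³), and the particle list and decay table are all hypothetical, and K₂ is evaluated here by direct numerical integration rather than by the CERN library routines.

```python
import math

def bessel_k2(x, t_max=12.0, n=12000):
    """Modified Bessel function K_2(x) from its integral representation,
    K_2(x) = ∫_0^∞ exp(-x cosh t) cosh(2t) dt, by the trapezoidal rule."""
    h = t_max / n
    f = lambda t: math.exp(-x * math.cosh(t)) * math.cosh(2.0 * t)
    total = 0.5 * (f(0.0) + f(t_max))
    total += sum(f(i * h) for i in range(1, n))
    return total * h

def thermal_density(m, g, T, lam=1.0, kmax=5):
    """Boltzmann-series particle density in natural units (GeV^3):
    n = g T m^2 / (2 pi^2) * sum_k (lam^k / k) K_2(k m / T)."""
    return g * T * m * m / (2.0 * math.pi ** 2) * sum(
        lam ** k / k * bessel_k2(k * m / T) for k in range(1, kmax + 1))

def feed_down(yields, decays):
    """Resonance feed-down: add the parent abundance, weighted by the
    branching ratio, to the daughter.  `decays` is an ordered list of
    (parent, daughter, branching_ratio), heaviest parent first, so that
    cascade decays propagate down the chain."""
    out = dict(yields)
    for parent, daughter, br in decays:
        out[daughter] = out.get(daughter, 0.0) + br * out.get(parent, 0.0)
    return out

# Hypothetical two-species demo: direct pions plus rho0 -> pi+ pi- feed-down.
T = 0.160  # GeV, an assumed chemical freeze-out temperature
primary = {"pi+": thermal_density(0.13957, 1, T),
           "rho0": thermal_density(0.77526, 3, T)}
observed = feed_down(primary, [("rho0", "pi+", 1.0)])
```

The full program instead sums over all confirmed resonances of the Particle Data Tables and, optionally, folds each yield with a Breit–Wigner distribution over the resonance width.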
The χ² function is defined in the standard way. For an investigated quantity f and experimental error Δf,

  χ² = (f_experiment − f_theory)² / (Δf_statistical + Δf_systematic)²,  (1)

  N_DoF = N_data points − N_free parameters.  (2)

(Note that the statistical and systematic errors are added linearly rather than in quadrature, since the systematic error is not a random variable.) Aside from χ², the program also calculates the statistical significance [2], defined as the probability that, given a "true" theory and a statistical (Gaussian) experimental error, the fitted χ² assumes a value at or above the considered one. If the best fit has a statistical significance much below unity, the model under consideration is very likely inappropriate. In the limit of many degrees of freedom (N_DoF), the statistical significance function depends only on χ²/N_DoF, with 90% statistical significance at χ²/N_DoF ∼ 1, falling steeply for χ²/N_DoF > 1. However, fits involving ratios generally do not have enough degrees of freedom to reach this asymptotic limit, so the statistical significance depends strongly on χ² and N_DoF separately. In particular, for N_DoF < 20 an acceptable statistical significance often requires χ²/N_DoF significantly less than 1. The fit routine does not always find the true lowest χ² minimum: multi-parameter fits with too few degrees of freedom generally exhibit a non-trivial structure in parameter space, with several secondary minima, saddle points, valleys, etc. To help the user perform the minimization effectively, we have added tools to compute χ² contours and profiles. In addition, the program's flexibility allows for many fitting strategies. It is therefore possible, following the techniques described in Section 3.7, to scan the parameter space and ensure that the minimum found is the true one.
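Eqs. (1) and (2) and the significance measure can be sketched as follows. This is an illustrative Python fragment, not SHARE's Fortran implementation; the function names and the two-point data set are hypothetical, and the chi-square survival function is written in its closed form for an even number of degrees of freedom, Q(χ², N) = e^(−χ²/2) Σ_{j<N/2} (χ²/2)^j / j!.

```python
import math

def chi_squared(f_exp, f_th, df_stat, df_syst):
    """Eq. (1): statistical and systematic errors added linearly,
    since the systematic error is not a random variable."""
    return (f_exp - f_th) ** 2 / (df_stat + df_syst) ** 2

def significance(chi2, n_dof):
    """Probability that a correct model with Gaussian errors yields a
    chi^2 at or above the observed value (chi-square survival function,
    closed form valid only for even n_dof)."""
    assert n_dof > 0 and n_dof % 2 == 0
    x = chi2 / 2.0
    return math.exp(-x) * sum(x ** j / math.factorial(j)
                              for j in range(n_dof // 2))

# Hypothetical fit to two measured ratios with no free parameters:
data = [  # (f_experiment, f_theory, delta_stat, delta_syst)
    (1.05, 1.00, 0.03, 0.02),
    (0.48, 0.50, 0.02, 0.01),
]
chi2_total = sum(chi_squared(*point) for point in data)
n_dof = len(data)  # Eq. (2): N_data_points - N_free_parameters (zero here)
p = significance(chi2_total, n_dof)
```

Scanning a grid of fixed parameter values and re-evaluating `chi2_total` at each point is the idea behind the χ² contour and profile tools mentioned above.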
Further systematic deviations between the model and experiment can be recognized via the program's output, which includes a particle-by-particle comparison between experiment and theory.

Additional comments: In view of the wide stream of new data coming out of RHIC, there is ongoing activity, with several groups performing analyses of particle yields. It is our hope that SHARE will help establish an analysis standard within the community. It can be useful for analyzing experimental data, verifying simple physical assumptions and evaluating expected yields, as well as for comparing the various similar models and programs currently in use.

Typical running time: For the Fortran code, the computation time with the provided default input files is about 10 minutes on a 1 GHz processor. The time may rise significantly (by a factor of 300) if full-fledged optimization and finite widths are included. In Mathematica, typical running times are of the order of minutes.

Accessibility: The program is available from:
  • The CPC Program Library,
  • The following websites: http://www.ifj.edu.pl/Dept4/share.html or http://www.physics.arizona.edu/~torrieri/SHARE/share.html,
  • The authors, upon request.