Modeling zones of maximum river flooding (flood zones) is relevant to engineering surveys for the design and construction of economic facilities, and to projects for recultivating disturbed lands in zones directly affected by water bodies, because, under ever-increasing requirements for the security and business continuity of planned facilities, the existing methods of flood-risk assessment do not always allow a conclusive determination of whether a planned facility is at risk. The main aim of the research is to analyze domestic and foreign experience in modeling flood zones at maximum river water levels for the practical solution of engineering-design problems, to assess the advantages and disadvantages of the main approaches used in survey practice, and to demonstrate the possibilities of the modeling method on a particular engineering problem. Methods. Spatial data analysis in geoinformation systems was used to build a digital terrain model and prepare the initial data; numerical simulation of the flood zone of the Inya river and of an abandoned quarry excavation on its left bank, carried out as part of a disturbed-lands recultivation project, was used to calculate the maximum flood zone. Numerical modeling was performed with the HEC-RAS simulation system, version 5.0.5. Results. The author reviews the existing worldwide and domestic practice of numerical flood-zone modeling for rivers in engineering design, and presents an original model of the calculated flood zone for the «Mokhovskoye pole» recultivation site.
An example is given of assessing the impact on the recultivated object under the most adverse scenario (probable flooding of the pit during the maximum water flow of 1% exceedance probability), and the advantages and disadvantages of applying the different approaches in practice are described.
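The hydraulic core of such flood-zone calculations is friction-controlled open-channel flow. As a minimal sketch (not the HEC-RAS solver itself, which solves the full 1D energy equation between surveyed cross-sections), the following Python function evaluates Manning's equation for a rectangular channel; all parameter values in the usage example are hypothetical:

```python
import math

def manning_discharge(n: float, b: float, y: float, s: float) -> float:
    """Discharge Q [m^3/s] through a rectangular channel via Manning's equation.

    n -- Manning roughness coefficient [s/m^(1/3)]
    b -- channel bottom width [m]
    y -- flow depth [m]
    s -- longitudinal bed slope [m/m]
    """
    area = b * y                      # flow cross-section area [m^2]
    perimeter = b + 2.0 * y           # wetted perimeter [m]
    radius = area / perimeter         # hydraulic radius [m]
    return (1.0 / n) * area * radius ** (2.0 / 3.0) * math.sqrt(s)

# Hypothetical example: a 20 m wide channel with 2 m flow depth,
# floodplain-like roughness n = 0.035 and a bed slope of 0.001
q = manning_discharge(n=0.035, b=20.0, y=2.0, s=0.001)
print(f"Q = {q:.1f} m^3/s")
```

A flood-zone model inverts this relationship: given the design discharge (here, the 1% exceedance flow), it searches for the water level, and hence the flooded polygon, that conveys that discharge through each cross-section.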
A measurement of the ratio of branching fractions of the decays B^{+}→K^{+}μ^{+}μ^{-} and B^{+}→K^{+}e^{+}e^{-} is presented. The proton-proton collision data used correspond to an integrated luminosity of 5.0 fb^{-1} recorded with the LHCb experiment at center-of-mass energies of 7, 8, and 13 TeV. For the dilepton mass-squared range 1.1 < q^{2} < 6.0 GeV^{2}/c^{4} the ratio of branching fractions is measured to be R_{K} = 0.846^{+0.060}_{-0.054}(stat)^{+0.016}_{-0.014}(syst), where the first uncertainty is statistical and the second systematic. This is the most precise measurement of R_{K} to date and is compatible with the standard model at the level of 2.5 standard deviations.
First evidence of a structure in the J/ψΛ invariant mass distribution is obtained from an amplitude analysis of Ξ_b^- → J/ψΛK^- decays. The observed structure is consistent with being due to a charmonium pentaquark with strangeness, with a significance of 3.1σ including systematic uncertainties and the look-elsewhere effect. Its mass and width are determined to be 4458.8 ± 2.9 ^{+4.7}_{-1.1} MeV and 17.3 ± 6.5 ^{+8.0}_{-5.7} MeV, respectively, where the quoted uncertainties are statistical and systematic. The structure is also consistent with being due to two resonances. In addition, the narrow excited Ξ^- states, Ξ(1690)^- and Ξ(1820)^-, are seen for the first time in a Ξ_b^- decay, and their masses and widths are measured with improved precision. The analysis is performed using pp collision data corresponding to a total integrated luminosity of 9 fb^{-1}, collected with the LHCb experiment at centre-of-mass energies of 7, 8 and 13 TeV.
The ratio of branching fractions R(D*−)≡B(B0→D*−τ+ντ)/B(B0→D*−μ+νμ) is measured using a data sample of proton-proton collisions collected with the LHCb detector at center-of-mass energies of 7 and 8 TeV, corresponding to an integrated luminosity of 3 fb−1. The τ lepton is reconstructed with three charged pions in the final state. A novel method is used that exploits the different vertex topologies of signal and backgrounds to isolate samples of semitauonic decays of b hadrons with high purity. Using the B0→D*−π+π−π+ decay as the normalization channel, the ratio B(B0→D*−τ+ντ)/B(B0→D*−π+π−π+) is measured to be 1.97±0.13±0.18, where the first uncertainty is statistical and the second systematic. An average of branching fraction measurements for the normalization channel is used to derive B(B0→D*−τ+ντ)=(1.42±0.094±0.129±0.054)%, where the third uncertainty is due to the limited knowledge of B(B0→D*−π+π−π+). A test of lepton flavor universality is performed using the well-measured branching fraction B(B0→D*−μ+νμ) to compute R(D*−)=0.291±0.019±0.026±0.013, where the third uncertainty originates from the uncertainties on B(B0→D*−π+π−π+) and B(B0→D*−μ+νμ). This measurement is in agreement with the Standard Model prediction and with previous measurements.
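The derivation chain in this abstract (measured ratio times normalization branching fraction, then division by the muonic branching fraction) can be sketched numerically. The external inputs `b_norm` and `b_munu` below are world-average values consistent with the derived numbers quoted in the abstract; they are assumptions, not part of this record:

```python
import math

# Values quoted in the abstract
k     = 1.97                     # B(B0 -> D*- tau+ nu) / B(B0 -> D*- pi+ pi- pi+)
k_err = math.hypot(0.13, 0.18)   # stat and syst combined in quadrature

# External world-average branching fractions (assumed inputs)
b_norm = 7.21e-3                 # B(B0 -> D*- pi+ pi- pi+)
b_munu = 4.88e-2                 # B(B0 -> D*- mu+ nu)

b_taunu     = k * b_norm         # absolute semitauonic branching fraction
b_taunu_err = k_err * b_norm     # propagating only the measured ratio's error
r_dstar     = b_taunu / b_munu   # lepton-flavor-universality ratio

print(f"B(B0 -> D*- tau+ nu) = ({100*b_taunu:.2f} +- {100*b_taunu_err:.2f})%")  # (1.42 +- 0.16)%
print(f"R(D*-) = {r_dstar:.3f}")                                                # 0.291
```

This reproduces the central values of the abstract; the full error budget additionally propagates the uncertainties on the two external branching fractions, which is where the third quoted uncertainty comes from.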
Searches are performed for both promptlike and long-lived dark photons, A^{'}, produced in proton-proton collisions at a center-of-mass energy of 13 TeV, using A^{'}→μ^{+}μ^{-} decays and a data sample corresponding to an integrated luminosity of 1.6 fb^{-1} collected with the LHCb detector. The promptlike A^{'} search covers the mass range from near the dimuon threshold up to 70 GeV, while the long-lived A^{'} search is restricted to the low-mass region 214<m(A^{'})<350 MeV. No evidence for a signal is found, and 90% confidence level exclusion limits are placed on the γ-A^{'} kinetic-mixing strength. The constraints placed on promptlike dark photons are the most stringent to date for the mass range 10.6<m(A^{'})<70 GeV, and are comparable to the best existing limits for m(A^{'})<0.5 GeV. The search for long-lived dark photons is the first to achieve sensitivity using a displaced-vertex signature.
LHCbDirac: distributed computing in LHCb Stagni, F; Charpentier, P; Graciani, R ...
Journal of Physics: Conference Series, 01/2012, Volume 396, Issue 3
Journal Article
Peer-reviewed
Open access
We present LHCbDirac, an extension of the DIRAC community Grid solution that handles LHCb specificities. The DIRAC software was developed for many years within LHCb only; nowadays it is generic software used by many scientific communities worldwide. Each community wanting to take advantage of DIRAC has to develop an extension containing all the code necessary for handling its specific cases. LHCbDirac is an actively developed extension that implements the LHCb computing model and workflows, handling all the distributed computing activities of LHCb. Such activities include real data processing (reconstruction, stripping and streaming), Monte Carlo simulation and data replication. Other activities are group and user analysis, data management, resources management and monitoring, data provenance, and accounting for user and production jobs. LHCbDirac also provides extensions of the DIRAC interfaces, including a secure web client, Python APIs and CLIs. Before a new release is put into production, a number of certification tests are run in a dedicated setup. This contribution highlights the versatility of the system, and presents the experience with real data processing, data and resources management, and monitoring of activities and resources.
A test of lepton universality, performed by measuring the ratio of the branching fractions of the $B^0 → K^{*0}μ^+μ^-$ and $B^0 → K^{*0}e^+e^-$ decays, $R_{K^{*0}}$, is presented. The $K^{*0}$ meson is reconstructed in the final state $K^+π^-$, which is required to have an invariant mass within 100 MeV/c^2 of the known $K^*(892)^0$ mass. The analysis is performed using proton-proton collision data, corresponding to an integrated luminosity of about 3 fb^{-1}, collected by the LHCb experiment at centre-of-mass energies of 7 and 8 TeV. The ratio is measured in two regions of the dilepton invariant mass squared, $q^2$, to be $R_{K^{*0}} = 0.66^{+0.11}_{-0.07}\,(\mathrm{stat}) \pm 0.03\,(\mathrm{syst})$ for $0.045 < q^2 < 1.1\ \mathrm{GeV}^2/c^4$ and $R_{K^{*0}} = 0.69^{+0.11}_{-0.07}\,(\mathrm{stat}) \pm 0.05\,(\mathrm{syst})$ for $1.1 < q^2 < 6.0\ \mathrm{GeV}^2/c^4$. The corresponding 95.4% confidence level intervals are [0.52, 0.89] and [0.53, 0.94]. The results, which represent the most precise measurements of $R_{K^{*0}}$ to date, are compatible with the Standard Model expectations at the level of 2.1–2.3 and 2.4–2.5 standard deviations in the two $q^2$ regions, respectively.
The first simultaneous test of muon-electron universality using B+→K+ℓ+ℓ− and B0→K*0ℓ+ℓ− decays is performed, in two ranges of the dilepton invariant-mass squared, q2. The analysis uses beauty mesons produced in proton-proton collisions collected with the LHCb detector between 2011 and 2018, corresponding to an integrated luminosity of 9 fb−1. Each of the four lepton universality measurements reported is either the first in the given q2 interval or supersedes previous LHCb measurements. The results are compatible with the predictions of the Standard Model.
Self managing experiment resources Stagni, F; Ubeda, M; Tsaregorodtsev, A ...
Journal of Physics: Conference Series, 01/2014, Volume 513, Issue 3
Journal Article
Peer-reviewed
Open access
Within this paper we present an autonomic computing resources management system, used by LHCb for assessing the status of its Grid resources. Virtual Organization Grids include heterogeneous resources: for example, LHC experiments very often use resources not provided by WLCG, and Cloud computing resources will soon provide a non-negligible fraction of their computing power. The lack of standards and procedures across experiments and sites has led to the appearance of multiple information systems, monitoring tools, ticket portals, and the like, which nowadays coexist and represent a very precious source of information both for the computing systems of running HEP experiments and for sites. These two facts lead to many particular solutions for a general problem: managing the experiment resources. In this paper we present how LHCb, via the DIRAC interware, addressed these issues. With a renewed Central Information Schema hosting all resource metadata and a Status System (Resource Status System) delivering real-time information, the system controls the resource topology independently of the resource types. The Resource Status System applies data-mining techniques to all available information sources and assesses status changes, which are then propagated to the topology description. Obviously, giving full control to such an automated system is not risk-free; therefore, in order to minimise the probability of misbehaviour, a battery of tests has been developed to certify the correctness of its assessments. We demonstrate the performance and efficiency of such a system in terms of cost reduction and reliability.
A search for the rare decays $B^0_s→μ^+μ^-$ and $B^0→μ^+μ^-$ is performed at the LHCb experiment using data collected in pp collisions corresponding to a total integrated luminosity of 4.4 fb^{-1}. An excess of $B^0_s→μ^+μ^-$ decays is observed with a significance of 7.8 standard deviations, representing the first observation of this decay in a single experiment. The branching fraction is measured to be $B(B^0_s→μ^+μ^-) = (3.0 ± 0.6^{+0.3}_{-0.2}) × 10^{-9}$, where the first uncertainty is statistical and the second systematic. The first measurement of the $B^0_s→μ^+μ^-$ effective lifetime, $τ(B^0_s→μ^+μ^-) = 2.04 ± 0.44 ± 0.05$ ps, is reported. No significant excess of $B^0→μ^+μ^-$ decays is found, and a 95% confidence level upper limit, $B(B^0→μ^+μ^-) < 3.4 × 10^{-10}$, is determined. All results are in agreement with the standard model expectations.