A determination of the composition of primary cosmic rays in the energy range between 1 and 1000 PeV is an important objective in studies of the processes governing the formation and propagation of cosmic rays. On the basis of experience gained in operating the balloon-borne SPHERE-2 facility, a SPHERE-3 facility with a wider aperture and better optical resolution is being developed. The current status of the design work on this facility is presented.
A bootstrap analysis method and a form of data presentation for behavioral tests are proposed for recording single and/or multiple preference choices, with a study of the preferences of Egyptian fruit bat pups (Rousettus aegyptiacus) serving as an example. The bootstrap method allows the reliability of a result obtained from sparse data to be evaluated. The proposed visualization method makes it possible to present data from complex-choice experiments clearly.
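The reliability estimate described above can be illustrated with a percentile bootstrap on binary choice data. This is a minimal sketch, not the study's actual procedure; the sample values, function name, and 95% interval are illustrative assumptions.

```python
# A minimal percentile-bootstrap sketch for a preference test with
# binary choices (1 = chose option A). Data and parameters are
# hypothetical, not taken from the study.
import random

def bootstrap_ci(choices, n_resamples=10000, alpha=0.05, seed=42):
    """Percentile bootstrap confidence interval for the preference rate."""
    rng = random.Random(seed)
    n = len(choices)
    stats = sorted(
        sum(rng.choice(choices) for _ in range(n)) / n
        for _ in range(n_resamples)
    )
    lo = stats[int(alpha / 2 * n_resamples)]
    hi = stats[int((1 - alpha / 2) * n_resamples) - 1]
    return lo, hi

# Hypothetical data: 14 of 20 pups chose option A.
choices = [1] * 14 + [0] * 6
low, high = bootstrap_ci(choices)
# If the interval excludes 0.5, the preference is unlikely to be chance.
```

With small samples the interval is wide, which is exactly the point: the bootstrap makes the uncertainty of a "preference" estimate from sparse data explicit.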
Here we present the current state of the technical design of the SPHERE project’s new detector. The SPHERE project aims at primary cosmic-ray studies in the 1–1000 PeV energy range using the reflected Cherenkov light method. The concept of a drone-mounted detector with a photosensitive camera based on silicon photomultipliers is discussed, and the design details of a small-scale prototype of this detector are presented.
This paper contains the first results on the development of the SPHERE-3 telescope for primary cosmic-ray studies in the 1–1000 PeV energy range using reflected and direct Cherenkov light generated by extensive air showers. It also outlines our new approach to the design of the telescope.
Further development of the method of studying primary cosmic rays by detecting the reflected Cherenkov light of extensive air showers is planned, based on the successful implementation of the SPHERE-2 balloon-borne experiment. The possibility of simultaneously detecting direct and reflected Cherenkov light from extensive air showers is demonstrated. Prospects for creating a new SPHERE-3 detector are discussed, and the first simulation results are presented.
Direct Dark Matter searches are nowadays one of the most active research topics, with many experimental efforts devoted to the search for nuclear recoils induced by the scattering of Weakly Interacting Massive Particles (WIMPs). Detectors able to reconstruct the direction of the nucleus recoiling against the scattering WIMP are opening a new frontier that could extend Dark Matter searches beyond the neutrino background. Exploiting directionality would also prove the galactic origin of Dark Matter with an unambiguous signal-to-background separation. Indeed, the angular distribution of the recoiled nuclei is centered on the direction of the Cygnus constellation, while the background distribution is expected to be isotropic. Current directional experiments are based on gaseous TPCs, whose sensitivity is limited by the small achievable detector mass. In this paper we present the discovery potential of a directional experiment based on a solid target made of newly developed nuclear emulsions and on optical readout systems reaching unprecedented nanometric resolution.
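The directional discrimination described above can be sketched with a toy Monte Carlo: signal recoil directions clustered around the Cygnus direction versus an isotropic background. The Cygnus vector, smearing strength, and sample sizes below are illustrative placeholders, not the experiment's actual kinematics.

```python
# Toy illustration of directional signal/background separation.
# The "Cygnus" direction and the smearing are arbitrary assumptions.
import numpy as np

rng = np.random.default_rng(1)
cygnus = np.array([0.0, 0.0, 1.0])  # placeholder unit vector toward Cygnus

def random_unit_vectors(n):
    """Isotropically distributed unit vectors on the sphere."""
    v = rng.normal(size=(n, 3))
    return v / np.linalg.norm(v, axis=1, keepdims=True)

# Isotropic background: cos(angle to Cygnus) is uniform on [-1, 1].
cos_bkg = random_unit_vectors(5000) @ cygnus

# Toy signal: recoil directions smeared around the Cygnus direction.
raw = random_unit_vectors(5000) + 3.0 * cygnus
cos_sig = (raw / np.linalg.norm(raw, axis=1, keepdims=True)) @ cygnus
# cos_sig piles up near 1, while cos_bkg is flat across [-1, 1].
```

A flat cos-angle distribution for background against a forward-peaked one for signal is what makes the Cygnus pointing an unambiguous discriminant.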
The study of the cosmic-ray mass composition is an important problem in high-energy physics. The main goal of the SPHERE-2 experiment was to study the energy spectrum of primary cosmic rays in the 10–300 PeV energy range. The experimental data also allow their mass composition to be approached. Separating events into nuclei groups makes it possible to estimate the average mass over the sample. Using machine learning methods, we developed a method for separating the groups of primary nuclei that formed extensive air showers, based on simulated events for the SPHERE-2 telescope. Various models of high-energy nucleus–nucleus interactions were used, but their predictions differ significantly. In the analysis of the SPHERE-2 experimental data, this problem was solved, first, by using the Cherenkov light data, which depend only weakly on the hadronic-interaction model, and second, by training the neural network simultaneously on two interaction models (QGSJET-01 and QGSJETII-04), which differ greatly from each other. The independence of the experimental data processing from the choice of the nuclear-interaction model was thereby ensured. The regression task is solved by machine learning methods. Separation of events into three groups of nuclei, protons (p), nitrogen (N), and iron (Fe), by a neural network is more precise than with traditional methods.
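The idea of training one classifier on events simulated with two interaction models at once can be sketched as follows. This is a toy softmax classifier on synthetic two-dimensional features, not the SPHERE-2 network or its Cherenkov-image parameters; the cluster centers and the "model shift" are invented for illustration.

```python
# Toy sketch: one classifier trained on a mixture of events from two
# simulated "interaction models", so the result does not depend on
# either model alone. Features and labels are synthetic placeholders.
import numpy as np

rng = np.random.default_rng(0)

def make_events(n, shift):
    """One feature cluster per nucleus group (p, N, Fe);
    `shift` mimics a systematic offset between interaction models."""
    X, y = [], []
    for label, center in enumerate([0.0, 2.0, 4.0]):  # p, N, Fe
        X.append(rng.normal(center + shift, 0.6, size=(n, 2)))
        y.append(np.full(n, label))
    return np.vstack(X), np.concatenate(y)

# Mix events from "model A" and "model B" in one training set.
Xa, ya = make_events(200, shift=0.0)   # stands in for QGSJET-01
Xb, yb = make_events(200, shift=0.3)   # stands in for QGSJETII-04
X = np.vstack([Xa, Xb]); y = np.concatenate([ya, yb])

# Softmax (multinomial logistic) classifier via gradient descent.
W = np.zeros((2, 3)); b = np.zeros(3)
onehot = np.eye(3)[y]
for _ in range(500):
    logits = X @ W + b
    p = np.exp(logits - logits.max(axis=1, keepdims=True))
    p /= p.sum(axis=1, keepdims=True)
    grad = (p - onehot) / len(X)
    W -= 0.5 * X.T @ grad
    b -= 0.5 * grad.sum(axis=0)

pred = (X @ W + b).argmax(axis=1)
accuracy = (pred == y).mean()  # well above the 1/3 chance level
```

Because the training set interleaves both "models", the fitted decision boundaries cannot exploit features specific to either one, which is the model-independence argument made in the abstract.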
High-energy cosmic-ray research via the detection of Cherenkov radiation from extensive air showers began in the Tunka valley (50 km west of the southern tip of Lake Baikal) in the early 1990s. A series of large arrays, combined into the TAIGA (Tunka Advanced Instrument for cosmic-ray physics and Gamma Astronomy) astrophysical facility and designed to study gamma rays and charged cosmic rays, has since been created. Descriptions of the facility's arrays and the main results obtained in investigating high-energy cosmic rays are presented. Plans for further development of the astrophysical facility are discussed.
This paper presents the results of an analysis of observations of the Crab Nebula gamma-ray source with the first two atmospheric Cherenkov telescopes of the TAIGA (Tunka Advanced Instrument for cosmic ray physics and Gamma Astronomy) astrophysical complex in the stereo mode of observation. Observational data from 2020 to 2021 were analyzed. Over 36 hours of observations, a signal was obtained at a statistical significance of 5σ, and the gamma-ray spectrum was plotted in the energy range from 2 to 70 TeV. The paper describes a technique for gamma–hadron separation and for reconstructing the energy of the detected gamma rays.
• EuPA YPIC challenge.
• Synthetic peptides, sequencing, unusual database.
• De novo MS/MS, PTMs, sequencing/synthesis errors.
Here we present the results of our attempt at the EuPA YPIC challenge. The task was to sequence the provided synthetic peptides, reconstruct the sentence encrypted in them, and determine from which book the sentence originated.
The task itself, while holding no direct scientific value, offers insight, in less formal terms (for the participants at least), into how the overall process of a scientific study of a “new protein” looks. We therefore decided to treat the challenge as a general task of sequencing an unknown protein against an unusual proteome database. To solve the task we used LC–MS/MS, MALDI-MS, and de novo sequencing. The combination of two MS instruments and de novo MS/MS data analysis made it possible to sequence new peptides and proteins not yet present in proteomic databases.