We have investigated the behaviour of a core-softened system in a narrow slit pore (the width of the pore is equal to three particle diameters). Previous studies showed that strongly confined systems form crystalline phases consisting of several triangular or square layers; these phases can also be considered as cuts of FCC or HCP structures. We have shown that the behaviour of a core-softened system is more complex. We have also observed a quasicrystalline phase. Moreover, the phase with two triangular layers appears at lower densities than the phase with two square layers, in contrast to the behaviour of the systems studied before. These results demonstrate that the phase behaviour of strongly confined systems can be even more complex than previously supposed.
Modern X-ray free-electron lasers (XFELs) operating at high repetition rates produce a tremendous amount of data. It is a great challenge to classify this information and reduce the initial data set to a manageable size for further analysis. Here an approach for classification of diffraction patterns measured in prototypical diffract-and-destroy single-particle imaging experiments at XFELs is presented. It is proposed that the data are classified on the basis of a set of parameters that take into account the underlying diffraction physics and the specific relations between the real-space structure of a particle and its reciprocal-space intensity distribution. The approach is demonstrated by applying principal component analysis and support vector machine algorithms to simulated and measured X-ray data sets.
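Since the abstract names principal component analysis and support vector machines explicitly, a minimal scikit-learn sketch of such a pipeline is given below. The raw-pixel features and the hit/non-hit labels are hypothetical placeholders: the paper's physics-informed parameters are not spelled out in the abstract.

```python
# Minimal sketch of a PCA + SVM classification pipeline for diffraction
# patterns, using scikit-learn. Features here are raw pixel values; the
# actual work derives physics-informed parameters from each pattern, so
# treat X and the labels as hypothetical placeholders.
import numpy as np
from sklearn.decomposition import PCA
from sklearn.model_selection import train_test_split
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC

rng = np.random.default_rng(0)
X = rng.random((500, 64 * 64))   # 500 flattened diffraction patterns
y = rng.integers(0, 2, 500)      # 0 = blank / 1 = single-particle hit

X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

clf = make_pipeline(
    StandardScaler(),           # normalize each feature
    PCA(n_components=20),       # compress to the leading components
    SVC(kernel="rbf"),          # separate hit / non-hit classes
)
clf.fit(X_train, y_train)
print("test accuracy:", clf.score(X_test, y_test))
```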
The large amounts of data generated at large-scale facilities such as the European X-ray Free-Electron Laser (XFEL) require new approaches to data processing and analysis. One of the most computationally challenging experiments at an XFEL is single-particle structure determination. In this paper we propose a new design for an integrated software platform which combines well-established techniques for XFEL data analysis with High Performance Data Analysis (HPDA) methods. In our software platform we combine streaming data analysis algorithms with high-performance computing solutions. This approach should allow analysis of the experimental dataflow in a quasi-online regime.
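As an illustration of what quasi-online streaming analysis can look like, the following is a hypothetical sketch in which detector frames are consumed as they arrive and only likely hits are passed downstream; the frame source, the hit-finding rule and the threshold are all assumptions, not the platform's actual design.

```python
# Hypothetical sketch of a quasi-online streaming filter: frames are
# consumed as they arrive and only likely hits are passed downstream.
# The frame source, threshold, and hit criterion are all assumptions.
from typing import Iterable, Iterator
import numpy as np

def hit_finder(frames: Iterable[np.ndarray],
               lit_pixel_threshold: int = 200) -> Iterator[np.ndarray]:
    """Yield only frames whose count of bright pixels suggests a hit."""
    for frame in frames:
        n_lit = int((frame > frame.mean() + 3 * frame.std()).sum())
        if n_lit > lit_pixel_threshold:
            yield frame

def stream(n_frames: int) -> Iterator[np.ndarray]:
    """Stand-in for a live detector feed (Poisson noise frames)."""
    rng = np.random.default_rng(1)
    for _ in range(n_frames):
        yield rng.poisson(1.0, size=(128, 128))

hits = sum(1 for _ in hit_finder(stream(1000)))
print(f"kept {hits} of 1000 frames")
```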
Databases of scientific publications contain millions of articles. To access relevant information, effective search methods are needed that exploit factors related to the key essence of publications. This paper presents an approach to extracting the essence of articles from the brief descriptions used in references, using neural network models. As a result, a prototype search service was created that allows scientific publications to be found by short descriptions.
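The abstract does not name the specific neural models; one common realization of search by short descriptions is to embed texts with a pretrained sentence encoder and rank articles by cosine similarity, as in the sketch below (the sentence-transformers model and the toy corpus are assumptions).

```python
# Illustrative sketch of searching publications by short descriptions
# using sentence embeddings and cosine similarity. The model name and
# toy corpus are assumptions, not the paper's actual components.
import numpy as np
from sentence_transformers import SentenceTransformer

model = SentenceTransformer("all-MiniLM-L6-v2")

corpus = [
    "Classification of XFEL diffraction patterns with PCA and SVM.",
    "Somatic embryogenesis in Norway spruce from immature zygotic embryos.",
    "Integrating supercomputers into the PanDA workload management system.",
]
corpus_emb = model.encode(corpus, normalize_embeddings=True)

query = "machine learning for single-particle imaging data"
query_emb = model.encode([query], normalize_embeddings=True)[0]

scores = corpus_emb @ query_emb          # cosine similarity (unit vectors)
for i in np.argsort(-scores):
    print(f"{scores[i]:.3f}  {corpus[i]}")
```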
The results of a study of somatic embryogenesis in Norway spruce (Picea abies (L.) H. Karst.) growing in the middle taiga subzone of the Republic of Karelia (Russia) are presented. Immature zygotic embryos were collected from ten clones of plus trees at the Petrozavodsk Seed Orchard and from a tree in Petrozavodsk at a sum of effective temperatures from 728 to 1257 degree-days (at a base temperature of 5°C). It was found that the LM nutrient medium should be used for the induction of somatic embryogenesis and the proliferation of cell lines, with embryos at developmental stages from globular to cotyledonary used as explants. After 14 months of cultivation, 12 of the 26 (46%) cell lines obtained from explants from the Petrozavodsk Seed Orchard and two of the 23 (9%) cell lines from the tree in Petrozavodsk were preserved. As a result of the study, maternal genotypes of Norway spruce plus-tree clones capable of forming an embryonic suspensor mass, of long-term proliferation, and of forming regenerated plants were identified.
The LHC experiments are preparing for the precision measurements and further discoveries that will be made possible by the higher LHC energies from April 2015 (LHC Run 2). The need for simulation, data processing and analysis would overwhelm the expected capacity of the grid computing facilities deployed by the Worldwide LHC Computing Grid (WLCG). To meet this challenge, the integration of opportunistic resources into the LHC computing model is highly important. The Tier-1 facility at the Kurchatov Institute (NRC-KI) in Moscow is part of the WLCG and will process, simulate and store up to 10% of the total data obtained from the ALICE, ATLAS and LHCb experiments. In addition, the Kurchatov Institute has supercomputers with a peak performance of 0.12 PFLOPS. Delegating even a fraction of these supercomputing resources to LHC computing will notably increase the total capacity. In 2014 the development of a portal combining the Tier-1 and a supercomputer at the Kurchatov Institute was started to provide common interfaces and storage. The portal will be used not only for HENP experiments but also by other data- and compute-intensive sciences such as biology (genome sequencing analysis) and astrophysics (cosmic-ray analysis, antimatter and dark-matter searches).
The Large Hadron Collider (LHC), operating at the international CERN Laboratory in Geneva, Switzerland, is leading Big Data driven scientific explorations. Experiments at the LHC explore the fundamental nature of matter and the basic forces that shape our universe, and were recently credited with the discovery of the Higgs boson. ATLAS, one of the largest collaborations ever assembled in the sciences, is at the forefront of research at the LHC. To address an unprecedented multi-petabyte data processing challenge, the ATLAS experiment relies on a heterogeneous distributed computational infrastructure. The ATLAS experiment uses the PanDA (Production and Distributed Analysis) Workload Management System to manage the workflow for all data processing at over 140 data centers. Through PanDA, ATLAS physicists see a single computing facility that enables rapid scientific breakthroughs for the experiment, even though the data centers are physically scattered all over the world. While PanDA currently uses more than 250,000 cores with a peak performance of 0.3+ petaFLOPS, the next LHC data-taking runs will require more resources than Grid computing can possibly provide. To alleviate these challenges, the LHC experiments are engaged in an ambitious program to expand the current computing model to include additional resources such as the opportunistic use of supercomputers. We will describe a project aimed at the integration of the PanDA WMS with supercomputers in the United States, Europe and Russia (in particular with the Titan supercomputer at the Oak Ridge Leadership Computing Facility (OLCF), the supercomputer at the National Research Center “Kurchatov Institute”, IT4 in Ostrava, and others). The current approach utilizes a modified PanDA pilot framework for job submission to the supercomputers' batch queues and for local data management, with light-weight MPI wrappers to run single-threaded workloads in parallel on Titan's multi-core worker nodes. This implementation was tested with a variety of Monte Carlo workloads on several supercomputing platforms. We will present our current accomplishments in running the PanDA WMS on supercomputers and demonstrate our ability to use PanDA as a portal independent of the computing facility's infrastructure for High Energy and Nuclear Physics, as well as for other data-intensive science applications, such as bioinformatics and astroparticle physics.
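The light-weight MPI wrapper technique mentioned above lends itself to a compact illustration. The following is a minimal sketch, assuming mpi4py is available; the payload executable (./simulate_events) and the input-file naming are hypothetical stand-ins for an actual Monte Carlo workload.

```python
# Minimal sketch of a light-weight MPI wrapper that fans out independent
# single-threaded payloads across cores, one payload per MPI rank.
# Assumes mpi4py; the payload command and input naming are hypothetical.
import subprocess
import sys

from mpi4py import MPI

comm = MPI.COMM_WORLD
rank = comm.Get_rank()
size = comm.Get_size()

# Each rank picks its own slice of the job list so payloads never overlap.
jobs = [f"event_block_{i:04d}.in" for i in range(int(sys.argv[1]))]
my_jobs = jobs[rank::size]

for job in my_jobs:
    # The payload itself is an ordinary single-threaded executable;
    # parallelism comes only from running one copy per rank.
    subprocess.run(["./simulate_events", job], check=True)

comm.Barrier()  # wait for all ranks before the batch job ends
```

Launched as, e.g., `mpirun -n 16 python wrapper.py 160`, each of the 16 ranks would process 10 of the 160 hypothetical input blocks.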
The stability ranges and regions of stoichiometric stability of nickel(II) and cobalt(II) sulfide phases in water–salt solutions are determined. The homogeneity regions of the NiS and CoS monosulfides are considered. The composition of a microemulsion used as a nanoreactor for synthesizing the sulfides is determined, and its phase diagram is plotted. The results of estimating the size of the obtained nanocrystals are presented.
It is known that during spacecraft launches the properties of the ionospheric plasma are modified as a result of the impact of shock-acoustic waves generated by the supersonic motion of the carrier rocket. As a rule, ionospheric plasma variations are investigated using signals of the Global Navigation Satellite Systems (GPS/GLONASS), which requires a network of ground stations. There is no such network near the “Vostochniy” cosmodrome, which makes it necessary to search for an alternative solution. One alternative may be the use of ionosphere vertical and oblique sounding stations. Based on the analysis of data from such stations, the possibility of evaluating ionosphere modification during launches from the “Vostochniy” cosmodrome is shown.
The ability to investigate the 3D structure of biomolecules, such as proteins and viruses, is essential in biology and medicine. With the invention of super-bright X-ray free-electron lasers (e.g. the European XFEL and the Linac Coherent Light Source (LCLS)), the Single Particle Imaging (SPI) approach allows 3D structures to be reconstructed from many 2D diffraction images produced in the experiment by X-rays scattered on the biomolecule exposed in different orientations. At the same time, there are still many challenging problems in experiment setup, sample preparation and injection, which limit the number and quality of the obtained diffraction patterns and, consequently, the achievable resolution of the recovered 3D structure. However, even with the current experimental limitations, it is possible to reconstruct the 3D structures of some large biomolecules. An important question arises: what 3D resolution can be achieved under the experimental conditions available now? We investigated how the number and quality of diffraction images affect the 3D resolution. First, the SPI experiment was simulated and reconstructed with the Dragonfly software. Then we analyzed how the number of diffraction images and the beam intensity affect the final 3D resolution. We come to the following conclusions: (1) starting from a beam intensity (fluence) of 3×10¹² photons/μm², the resolution becomes almost constant; (2) the resolution depends strongly on the number of diffraction patterns: more than 10 000 diffraction images are required to reach a 4 nm resolution.
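The abstract does not state which resolution metric was used; in SPI studies the resolution is commonly estimated with a Fourier shell correlation (FSC) between reconstructions from two independent halves of the data. The numpy sketch below is only an illustration of that approach; the half-maps, the voxel size, and the 0.5 threshold are assumptions.

```python
# Illustrative sketch of a Fourier shell correlation (FSC) resolution
# estimate between two independent half-dataset 3D reconstructions.
# Half-maps, voxel size, and the 0.5 threshold are assumptions: the
# abstract does not state which resolution metric was used.
import numpy as np

def fsc(map1, map2, voxel_size_nm):
    """Return (spatial frequency in 1/nm, FSC) per Fourier shell."""
    f1 = np.fft.fftshift(np.fft.fftn(map1))
    f2 = np.fft.fftshift(np.fft.fftn(map2))
    n = map1.shape[0]
    # Integer radius of each voxel from the Fourier-space origin.
    grid = np.indices(map1.shape) - n // 2
    radii = np.sqrt((grid ** 2).sum(axis=0)).astype(int)
    nshells = n // 2
    num = np.zeros(nshells, dtype=complex)
    den1 = np.zeros(nshells)
    den2 = np.zeros(nshells)
    for r in range(nshells):
        shell = radii == r
        num[r] = (f1[shell] * np.conj(f2[shell])).sum()
        den1[r] = (np.abs(f1[shell]) ** 2).sum()
        den2[r] = (np.abs(f2[shell]) ** 2).sum()
    corr = np.real(num) / np.sqrt(den1 * den2 + 1e-12)
    freq = np.arange(nshells) / (n * voxel_size_nm)  # 1/nm
    return freq, corr

def resolution_at_half(freq, corr):
    """Resolution (nm) where the FSC first drops below 0.5."""
    below = np.where(corr < 0.5)[0]
    if below.size == 0 or freq[below[0]] == 0:
        return np.inf
    return 1.0 / freq[below[0]]
```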