This report summarises some of the activities of the HiggsTools initial training network working group in the period 2015-2017. The main goal of this working group was to produce a document discussing various aspects of state-of-the-art Higgs physics at the large hadron collider (LHC) in a pedagogic manner. The first part of the report is devoted to a description of phenomenological searches for new physics (NP) at the LHC. All of the available studies of the couplings of the new resonance discovered in 2012 by the ATLAS and CMS experiments (Aad et al (ATLAS Collaboration) 2012 Phys. Lett. B 716 1-29; Chatrchyan et al (CMS Collaboration) 2012 Phys. Lett. B 716 30-61) conclude that it is compatible with the Higgs boson of the standard model (SM) within present precision. So far the LHC experiments have given no direct evidence for any physical phenomena that cannot be described by the SM. As the experimental measurements become more and more precise, there is a pressing need for a consistent framework in which deviations from the SM predictions can be computed precisely. Such a framework should be applicable to measurements in all sectors of particle physics: not only LHC Higgs measurements but also electroweak precision data, etc. We critically review the use of the κ-framework, fiducial and simplified template cross sections, effective field theories, pseudo-observables and phenomenological Lagrangians. Some of the concepts presented here are well known and were already in use at the time of the large electron-positron collider (LEP) experiments. However, after years of theoretical and experimental development, these techniques have been refined, and we describe new tools that have been introduced to improve the comparison between theory and experimental data.
In the second part of the report, we propose φ* as a new and complementary observable for studying Higgs boson production at large transverse momentum in the case where the Higgs boson decays to two photons. The φ* variable depends on measurements of the angular directions and rapidities of the two Higgs decay products rather than their energies, and exploits the information provided by the calorimeter in the detector. We show that, even without tracking information, the experimental resolution for φ* is better than that of the transverse momentum of the photon pair, particularly at low transverse momentum. We make a detailed study of the phenomenology of the φ* variable, contrasting its behaviour with the Higgs transverse momentum distribution using a variety of theoretical tools, including event generators and fixed-order perturbative computations. We consider the theoretical uncertainties associated with both the p_T^H and φ* distributions. Unlike the transverse momentum distribution, the φ* distribution is well predicted using the Higgs effective field theory in which the top quark is integrated out, even at large values of φ*, thereby making it a better observable for extracting the parameters of the Higgs interaction. In contrast, the potential of the φ* distribution as a probe of NP is rather limited: although the overall rate is affected by the presence of additional heavy fields, the shape of the φ* distribution is relatively insensitive to heavy particle thresholds.
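The abstract does not reproduce the definition of φ*, but the standard angular variable used for boson decay pairs (often written φ*_η) can be computed from the two photons' pseudorapidities and azimuthal angles alone, which is why it needs no energy measurement. A minimal sketch of that standard definition; the function name and example values are our own, not taken from the report:

```python
import math

def phi_star_eta(eta1, phi1, eta2, phi2):
    """Compute phi*_eta from the pseudorapidities (eta) and azimuthal
    angles (phi) of the two decay photons.

    Standard definition: phi* = tan(phi_acop / 2) * sin(theta*_eta),
    with phi_acop = pi - |delta phi| and cos(theta*_eta) = tanh(delta eta / 2).
    """
    dphi = abs(phi1 - phi2)
    if dphi > math.pi:                     # wrap azimuthal difference into [0, pi]
        dphi = 2.0 * math.pi - dphi
    phi_acop = math.pi - dphi              # acoplanarity angle
    # sin(theta*) follows from cos(theta*) = tanh(x): sin = 1/cosh(x)
    sin_theta_star = 1.0 / math.cosh((eta1 - eta2) / 2.0)
    return math.tan(phi_acop / 2.0) * sin_theta_star

# Nearly back-to-back photons (small acoplanarity) give a small phi*,
# mirroring the low transverse momentum of the photon pair.
print(phi_star_eta(0.3, 0.1, -0.2, 0.1 + math.pi - 0.01))
```

Because only angles enter, the calorimeter's angular resolution dominates, which is the reason the abstract argues φ* is measured more precisely than the pair transverse momentum at low p_T.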
A high-granularity timing detector for the ATLAS phase-II upgrade
Casado, M.P.; Adam Bourdarios, C.; Belfkir, M. ...
Nuclear Instruments & Methods in Physics Research. Section A: Accelerators, Spectrometers, Detectors and Associated Equipment, 2022, Volume 1032
Journal article, peer reviewed, open access
The large increase of pileup interactions is one of the main experimental challenges for the HL-LHC physics programme. A powerful new way to mitigate the effects of pileup is to use high-precision timing information to distinguish between collisions occurring close in space but well separated in time. A High-Granularity Timing Detector (HGTD), based on low-gain avalanche detector technology, is therefore proposed for the ATLAS Phase-II upgrade. Covering the pseudorapidity region between 2.4 and 4.0, this device will improve the detector physics performance in the forward region. The typical number of hits per track in the detector was optimized so that the target average time resolution per track for a minimum-ionising particle is 30 ps at the start of lifetime, increasing to 50 ps at the end of HL-LHC operation. The high-precision timing information improves pileup rejection, and thus forward object reconstruction, complementing the capabilities of the upgraded Inner Tracker (ITk) in the forward regions of ATLAS and leading to improved performance for both jet and lepton reconstruction. These improvements in object reconstruction performance translate into sensitivity gains and enhance the reach of the ATLAS physics programme at the HL-LHC. In addition, the HGTD offers unique capabilities for online and offline luminosity determination, an important requirement for precision physics measurements.
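The abstract links the number of hits per track to the per-track resolution target. A rough way to see why is that, if the hit-time measurements on a track are uncorrelated and equally precise, the per-track resolution improves as 1/sqrt(N). A sketch of that scaling; the per-hit resolution and hit count below are illustrative assumptions, not figures from the article:

```python
import math

def track_time_resolution(sigma_hit_ps, n_hits):
    """Per-track time resolution, assuming n_hits uncorrelated hit-time
    measurements, each with resolution sigma_hit_ps (picoseconds):
    sigma_track = sigma_hit / sqrt(n_hits)."""
    return sigma_hit_ps / math.sqrt(n_hits)

# Illustrative: a ~50 ps per-hit resolution with ~3 hits per track gives
# roughly 29 ps per track, in the ballpark of the 30 ps start-of-lifetime
# target quoted above.
print(round(track_time_resolution(50.0, 3), 1))
```

This also suggests why the target degrades toward end of life: radiation damage worsens the per-hit resolution while the hit count per track stays fixed.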
Power-to-gas (PtG), a technology that converts electricity into hydrogen, is expected to become a core component of future low-carbon energy systems. While its economics and performance as a sector-coupling technique have been well studied in the context of perfectly competitive energy markets, the distortions caused by the presence of large strategic players with a multi-market presence have received little attention. In this paper, we examine them by specifying a partial equilibrium model that provides a stylized representation of the interactions among the natural gas, electricity, and hydrogen markets. We calibrate our model using the future Dutch energy system as a reference. Using this model, we compare several possible ownership organizations for PtG to investigate how imperfect competition affects its operations. Evidence gained from these market simulations shows that the effects of PtG vary with the multi-market profile of its operator. Producers of fossil-based hydrogen tend to make little use of PtG, whereas renewable power producers use it more in order to increase electricity prices. Although PtG operations are profitable and can be welfare-enhancing, welfare distribution among agents is highly unequal when PtG is strategically operated in conjunction with variable renewable generation. In that case, PtG also raises environmental concerns, as it stimulates the use of polluting thermoelectric generation.
The article traces the fundamental question that has confronted critical theory since Marx: the relationship between theory and practice. The focus is mainly on the role of theory: what is, what can be, and what should be the relationship of critical theory (chiefly Theodor Adorno's, as negative dialectics) to practice, to context, to the world. This question has faced critical theory since the Eleventh Thesis on Feuerbach, in which the distinction between descriptive, purely theoretical philosophy and practical philosophy is addressed. On the other hand, the main accusations against the philosophy of Adorno and Horkheimer concern a lack of practical commitment, a resignation; here, once again, the central question is posed: what are the practical potential and commitment of Adorno's negative dialectics specifically? Adorno calls for a return to theory and a move away from revolutionary practice, as he believes that in that context (1968/9) such practice is impossible. The question is traced through the debate between Adorno and Marcuse and their disagreement regarding the potential and role of critical theory.
This text aims to reveal and analyze certain moments of the historical, biographical, cultural, and intellectual context that influenced the critical theory of Max Horkheimer and Theodor W. Adorno. The method of contextualization has theoretical groundings that correspond with the critical approaches to social analysis of the authors in focus: the theoretical sphere is not separated from the practical and social sphere, and to understand a theory fully one should explore its context. The text focuses on the intellectual influences, excluding Marx, since that would require a longer and separate investigation.
The gamma tomography technique can provide the cross-sectional visualization of an object that is needed for investigating pipe scale in geothermal power plants. Parallel-beam tomography has advantages such as a simple system, making it easier to apply in the field, but the scanning duration is relatively long. This paper discusses the effect of the number of projections on the quality of the reconstructed images and proposes the most effective number of projections for pipe scale investigation. A geothermal pipe sample (OD = 275 mm, t = 10 mm) with scale was scanned with a parallel-beam gamma tomography system. The system consists of a gamma radiation source (137Cs, 80 mCi), a scintillation detector, a motorized gantry, a control module, a data acquisition system, and a computer. The images were reconstructed with six different numbers of projections (128, 64, 32, 16, 8, and 4) and scanning durations of 530.8, 258.9, 127.9, 63.5, 31.7, and 15.8 minutes, respectively. Visually, the 128-projection data produces the smoothest image, whereas the 64- and 32-projection images look almost the same. The 16- and 8-projection images are still able to distinguish between the pipe wall, scale, and void, even though the 8-projection image looks very blurry. The 4-projection image is not able to visualize the shape of the object. The gaps between the average grey-scale pixel values of the pipe wall and the void for the 128-, 64-, 32-, 16-, 8-, and 4-projection images are 175, 174, 167, 153, 106, and 45, respectively. Based on the scanning duration, visualization, and grey contrast, the most effective number of projections is 32.
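As a quick check on the quoted scanning durations, the time spent per projection is nearly constant, so total scan time scales roughly linearly with the number of projections; this is the trade-off the paper balances against contrast. A sketch using only the figures quoted above:

```python
# Quoted projection counts and scanning durations (minutes) from the study.
projections = [128, 64, 32, 16, 8, 4]
durations_min = [530.8, 258.9, 127.9, 63.5, 31.7, 15.8]

# Minutes per projection: roughly 4 minutes each across all six scans,
# confirming the near-linear relation between projection count and duration.
per_projection = [d / n for n, d in zip(projections, durations_min)]
print([round(p, 2) for p in per_projection])
```

Halving the projection count from 64 to 32 therefore roughly halves the scan time while, per the study, barely changing image quality, which supports the choice of 32 as the most effective setting.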
Polygenic risk scores (PRS) have been widely applied in research studies, showing how population groups can be stratified into risk categories for many common conditions. As healthcare systems consider applying PRS to keep their populations healthy, little work has been carried out demonstrating their implementation at an individual level. Our results highlight the need for further standardisation in the way PRS are developed and shared, the importance of individual risk assessment rather than the assumption of inherited averages, and the challenges currently posed when translating PRS into risk metrics.
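The abstract does not spell out how a PRS is computed; the standard additive construction is a weighted sum of risk-allele dosages with per-variant effect sizes (typically log odds ratios from a genome-wide association study). A minimal sketch with hypothetical numbers, not taken from any real PRS model:

```python
def polygenic_risk_score(dosages, effect_sizes):
    """Standard additive PRS: weighted sum of risk-allele dosages
    (0, 1, or 2 copies per variant) with per-variant effect sizes."""
    if len(dosages) != len(effect_sizes):
        raise ValueError("one effect size per variant is required")
    return sum(d * b for d, b in zip(dosages, effect_sizes))

# Hypothetical three-variant score for one individual: two copies of the
# first risk allele, none of the second, one of the third.
score = polygenic_risk_score([2, 0, 1], [0.12, 0.05, -0.03])
print(score)
```

The standardisation problem the abstract raises is visible even here: the score's scale depends entirely on which variants and effect sizes a given model ships, so raw scores from different PRS models are not directly comparable.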
Expression Atlas (http://www.ebi.ac.uk/gxa) is a value-added database providing information about gene, protein and splice variant expression in different cell types, organism parts, developmental stages, diseases and other biological and experimental conditions. The database consists of selected high-quality microarray and RNA-sequencing experiments from ArrayExpress that have been manually curated, annotated with Experimental Factor Ontology terms and processed using standardized microarray and RNA-sequencing analysis methods. The new version of Expression Atlas introduces the concept of 'baseline' expression, i.e. gene and splice variant abundance levels in healthy or untreated conditions, such as tissues or cell types. Differential gene expression data benefit from an in-depth curation of experimental intent, resulting in biologically meaningful 'contrasts', i.e. instances of differential pairwise comparisons between two sets of biological replicates. Other novel aspects of Expression Atlas are its strict quality control of raw experimental data, up-to-date RNA-sequencing analysis methods, expression data at the level of gene sets as well as individual genes, and a more powerful search interface designed to maximize the biological value provided to the user.