Machine learning is an important applied research area in particle physics, beginning with applications to high-level physics analysis in the 1990s and 2000s, followed by an explosion of applications in particle and event identification and reconstruction in the 2010s. In this document we discuss promising future research and development areas in machine learning in particle physics, together with a roadmap for their implementation, software and hardware resource requirements, collaborative initiatives with the data science community, academia, and industry, and training the particle physics community in data science. The main objective of the document is to connect and motivate these areas of research and development with the physics drivers of the High-Luminosity Large Hadron Collider and of future neutrino experiments, and to identify the resource needs for their implementation. Additionally, we identify areas where collaboration with external communities will be of great benefit.
A search for the Standard Model Higgs boson (H) produced in association with a weak vector boson (VH) and decaying into a bottom-antibottom quark pair (bb̄) is reported for the decay channels Z(νν̄)H(bb̄), W(eν)H(bb̄), W(μν)H(bb̄), Z(eē)H(bb̄), and Z(μμ̄)H(bb̄), with a focus on Z(νν̄)H(bb̄) in particular. The search is performed with a dataset corresponding to an integrated luminosity of 41.3 fb⁻¹ at a center-of-mass energy of √s = 13 TeV, recorded by the CMS experiment at the LHC during Run 2 in 2017. An excess of events is observed above the expected background with a significance of 3.3 standard deviations, which is compatible with the Standard Model expectation of 3.1 standard deviations for a Higgs boson of mass mH = 125.09 GeV. When combined with the results of previous VH measurements and H → bb̄ searches using other Higgs production modes, the observed (expected) significance is 5.6 (5.5) standard deviations. This represents the first observation of H → bb̄ by the CMS experiment.
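A quoted significance of Z standard deviations corresponds to a one-sided Gaussian tail probability under the background-only hypothesis. As a rough illustration of that conversion only (the CMS analysis itself uses a full profile-likelihood fit, not this shortcut), the mapping can be sketched as:

```python
from math import erfc, sqrt

def p_value(z: float) -> float:
    """One-sided tail probability of a standard normal at significance z."""
    return 0.5 * erfc(z / sqrt(2.0))

# 3.3 sigma: roughly a few-in-10,000 background fluctuation probability
print(f"p(3.3 sigma) ~ {p_value(3.3):.1e}")
# 5.6 sigma: far beyond the conventional 5-sigma discovery threshold
print(f"p(5.6 sigma) ~ {p_value(5.6):.1e}")
```

This is the standard frequentist convention in collider physics: "5 sigma" is shorthand for a one-sided p-value of about 2.9 × 10⁻⁷.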
The rapid evolution of technology and the parallel increase in the complexity of algorithmic analysis in HEP require developers to acquire a much larger portfolio of programming skills. Young researchers graduating from universities worldwide currently do not receive adequate preparation in the very diverse fields of modern computing to respond to the growing needs of the most advanced experimental challenges. There is a growing consensus in the HEP community on the need for training programmes to bring researchers up to date with new software technologies, in particular in the domains of concurrent programming and artificial intelligence. We review some of the initiatives under way to introduce new training programmes and highlight some of the issues that need to be taken into account for these to be successful.
Determinations of the proton's collinear parton distribution functions (PDFs) are emerging with growing precision due to increased experimental activity at facilities like the Large Hadron Collider. While this copious information is valuable, the speed at which it is released makes it difficult to quickly assess its impact on the PDFs, short of performing computationally expensive global fits. As an alternative, we explore new methods for quantifying the potential impact of experimental data on the extraction of proton PDFs. Our approach relies crucially on the Hessian correlation between theory-data residuals and the PDFs themselves, as well as on a newly defined quantity, the sensitivity, which represents an extension of the correlation and reflects both PDF-driven and experimental uncertainties. This approach is realized in a new, publicly available analysis package, PDFSense, which operates with these statistical measures to identify particularly sensitive experiments, weigh their relative or potential impact on PDFs, and visualize their detailed distributions in a space of the parton momentum fraction x and factorization scale μ. This tool offers a new means of understanding the influence of individual measurements in existing fits, as well as a predictive device for directing future fits toward the highest-impact data and assumptions. Along the way, many new physics insights can be gained or reinforced. As one of many examples, PDFSense is employed to rank the projected impact of new LHC measurements in jet, vector boson, and tt̄ production, and leads us to the conclusion that inclusive jet production at the LHC has the potential to play an indispensable role in future PDF fits. These conclusions are independently verified by preliminarily fitting this experimental information and investigating the constraints it supplies using the Lagrange multiplier technique.
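The Hessian correlation referred to here has a standard closed form in the Hessian PDF-error framework: for two quantities X and Y evaluated on N pairs of eigenvector error sets, Corr(X, Y) = (1/(4 ΔX ΔY)) Σᵢ (Xᵢ⁺ − Xᵢ⁻)(Yᵢ⁺ − Yᵢ⁻), with ΔX = (1/2)[Σᵢ (Xᵢ⁺ − Xᵢ⁻)²]^(1/2). A minimal Python sketch of that formula with toy inputs (not PDFSense code; the paper's sensitivity measure additionally folds in experimental uncertainties and is not reproduced here):

```python
from math import sqrt

def hessian_delta(pairs):
    """Symmetric Hessian uncertainty: (1/2) sqrt(sum over eigenvectors of (X+ - X-)^2)."""
    return 0.5 * sqrt(sum((plus - minus) ** 2 for plus, minus in pairs))

def hessian_correlation(x_pairs, y_pairs):
    """Correlation cosine over the Hessian eigenvector sets:
    Corr = sum_i (Xi+ - Xi-)(Yi+ - Yi-) / (4 * dX * dY)."""
    dx, dy = hessian_delta(x_pairs), hessian_delta(y_pairs)
    num = sum((xp - xm) * (yp - ym) for (xp, xm), (yp, ym) in zip(x_pairs, y_pairs))
    return num / (4.0 * dx * dy)

# Toy (plus, minus) eigenvector variations for two observables;
# y is x's variation pattern scaled by 2, so they are perfectly correlated.
x = [(1.2, 0.8), (1.1, 0.9), (0.95, 1.05)]
y = [(2.4, 1.6), (2.2, 1.8), (1.9, 2.1)]
print(hessian_correlation(x, y))
```

A correlation near ±1 flags an observable whose Hessian variations track a PDF closely, which is what makes it a candidate constraint in a fit; in practice the inputs would be theory-data residuals and PDF values drawn from an eigenvector error set rather than toy numbers.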
Cullin-RING ubiquitin ligases (CRLs) are critical in ubiquitinating Myc, while the COP9 signalosome (CSN) controls neddylation of Cullin in CRLs. The mechanistic link between Cullin neddylation and Myc ubiquitination/degradation is unclear. Here we show that Myc is a target of the CSN subunit 6 (CSN6)-Cullin signalling axis and that CSN6 is a positive regulator of Myc. CSN6 enhanced neddylation of Cullin-1 and facilitated autoubiquitination/degradation of Fbxw7, a component of the CRL involved in Myc ubiquitination, thereby stabilizing Myc. Csn6 haplo-insufficiency decreased Cullin-1 neddylation but increased Fbxw7 stability to compromise Myc stability and activity in an Eμ-Myc mouse model, resulting in decelerated lymphomagenesis. We found that CSN6 overexpression, which leads to aberrant expression of Myc target genes, is frequent in human cancers. Together, these results define a mechanism for the regulation of Myc stability through the CSN-Cullin-Fbxw7 axis and provide insights into the correlation of CSN6 overexpression with Myc stabilization/activation during tumorigenesis.