Abstract
We present the development of a single-photon detector and the connected read-out electronics. This "hybrid" detector is based on a vacuum tube, a transmission photocathode, a microchannel plate, and a pixelated CMOS read-out anode encapsulating the analog and digital front-end electronics. This assembly will be capable of detecting up to 10⁹ photons per second with simultaneous measurement of position and time.
The pixelated read-out anode used is based on the Timepix4 ASIC (65 nm CMOS technology) designed in the framework of the Medipix4 collaboration. This ASIC is an array of 512 × 448 pixels distributed on a 55 μm square pitch, with a sensitive area of ∼7 cm². It features an equivalent noise charge of 50–70 e⁻ and a maximum rate of 2.5 Ghits/s, and it time-stamps the leading-edge time and measures the Time-over-Threshold (ToT) for each pixel. The pixel-cluster position combined with its ToT information will allow a position resolution of 5–10 μm to be reached. The ToT information can also be used to correct for the leading-edge time-walk, achieving a timing resolution of the order of 10 ps.
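The two ToT-based techniques mentioned above can be illustrated with a minimal sketch. This is our own illustration, not the actual Timepix4 reconstruction code: the cluster position is estimated as the ToT-weighted centroid of the fired pixels, and each pixel's leading-edge timestamp is corrected with a hypothetical 1/ToT time-walk model (the constants `a` and `b` are placeholder calibration parameters, determined per pixel in practice).

```python
import numpy as np

PITCH_UM = 55.0  # Timepix4 pixel pitch in micrometres

def cluster_centroid(cols, rows, tot):
    """ToT-weighted centroid of a pixel cluster, in micrometres."""
    w = np.asarray(tot, dtype=float)
    x = np.average(np.asarray(cols, dtype=float) * PITCH_UM, weights=w)
    y = np.average(np.asarray(rows, dtype=float) * PITCH_UM, weights=w)
    return x, y

def timewalk_correct(t_leading_ns, tot, a=2.0, b=0.1):
    """Subtract an assumed a/ToT + b time-walk term from the
    leading-edge timestamps (a, b are placeholder constants)."""
    return np.asarray(t_leading_ns, float) - (a / np.asarray(tot, float) + b)
```

Weighting the centroid by ToT exploits the charge sharing between neighbouring pixels, which is what pushes the resolution well below the 55 μm pitch.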
The detector will be highly compact thanks to the encapsulated front-end electronics, which allow local data processing and digitization. An FPGA-based data acquisition board, placed far from the detector, will receive the detector hits over 16 electro-optical links operated at 10.24 Gbps. The data acquisition board will decode the information and store the relevant data on a server for offline analysis.
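A back-of-the-envelope check (our illustration, not a figure from the text) shows why 16 such links are needed: at the quoted maximum of 2.5 Ghits/s, and assuming roughly 64 bits per hit packet, the hit data rate approaches the aggregate capacity of the 16 links.

```python
# Bandwidth sanity check; BITS_PER_HIT = 64 is an assumed packet size.
HITS_PER_S = 2.5e9    # maximum Timepix4 hit rate
BITS_PER_HIT = 64     # assumed bits per hit packet
N_LINKS = 16
LINK_GBPS = 10.24     # per-link line rate

data_rate_gbps = HITS_PER_S * BITS_PER_HIT / 1e9   # 160.0 Gbps of hit data
capacity_gbps = N_LINKS * LINK_GBPS                # 163.84 Gbps of link capacity
```

Under this assumption the links run close to saturation at the maximum hit rate, leaving only a few percent of headroom for protocol overhead.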
This performance will enable significant advances in particle physics, the life sciences, quantum optics, and other emerging fields where the detection of single photons with simultaneously excellent timing and position resolution is required.
Abstract
In this paper, we present Hog (HDL on git), a set of Tcl scripts and a suitable methodology to allow a fruitful use of git as an HDL repository and to guarantee synthesis and placement reproducibility and binary file traceability. Tcl scripts able to recreate the HDL projects are committed to the repository. This ensures that all modifications made to a project are correctly propagated, allowing reproducibility. To make the system more user friendly, all the source files used in each project are listed in dedicated text files that are read by the project Tcl file and imported into the project. Hog supports Xilinx Vivado, ISE (PlanAhead) and Intel Quartus.
To guarantee binary file traceability, Hog permanently links each binary file to a specific git commit by embedding the git commit hash (SHA) into the binary file via HDL generics stored in firmware registers. This is done by means of a pre-synthesis script, which interacts with the git repository. The project-creation and pre/post-synthesis Tcl scripts make use of the Hog utility library, which includes functions to handle git, parse tags, read list files, etc.
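The core idea can be sketched as follows. This is our illustration only (Hog itself does this in Tcl at pre-synthesis time, and the function names here are our own): the repository HEAD SHA is read from git, and its first 8 hex digits are packed into a 32-bit value that can be passed as an HDL generic feeding a firmware register.

```python
import subprocess

def sha_to_register(sha: str) -> int:
    """Pack the first 8 hex digits of a commit SHA into a 32-bit value."""
    return int(sha[:8], 16)

def current_commit_register() -> int:
    """Read the repository HEAD SHA and convert it (requires a git repo)."""
    sha = subprocess.check_output(
        ["git", "rev-parse", "HEAD"], text=True
    ).strip()
    return sha_to_register(sha)
```

Reading the value back from the firmware register then identifies, unambiguously, the commit from which the binary was built.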
Gitlab Continuous Integration (CI) is automatically configured by Hog to simulate, synthesise, and build the design. Hog-CI generates binary files and checks for timing violations. This permits validating new modifications before accepting them, by exploiting the Gitlab Merge Request (MR) system, and is meant to avoid polluting the official branch, which would undermine the starting point for other developers. Hog-CI runs on shared and private Gitlab runners (the latter must have the needed IDE installed). It can parse MR parameters, allowing directives to be specified through special keywords in the MR title/description on the Gitlab website.
Hog: handling HDL repository on git — Biesuz, N.V.; Cieri, D.; Giangiacomi, N.; et al.
Journal of Instrumentation, April 2022, Vol. 17, No. 4
Journal Article, Peer-reviewed
Abstract
Handling HDL project development within large collaborations presents many challenges in terms of maintenance and versioning, due to the lack of standardised procedures. Hog (HDL on git) is a Tcl-based open-source management tool, created to simplify HDL project development and management by exploiting git and Gitlab Continuous Integration (CI). Hog is compatible with the major HDL IDEs from Xilinx and Intel-FPGA, and guarantees synthesis and placement reproducibility and binary file traceability by linking each binary file to a specific git commit. Hog-CI validates any changes to the code, handles automatic versioning and can automatically simulate, synthesise and build the design.
The Fast TracKer (FTK) is an ATLAS trigger upgrade built for full-event, low-latency, high-rate tracking. The FTK core, made of 9U VME boards, performs the most demanding computational task. The associative memory board (AMB) serial link processor and the auxiliary card (AUX), plugged into the front and back sides of the same VME slot, constitute the processing unit (PU), which finds tracks using hits from eight layers of the inner detector. The PU works in a pipeline with the second stage board (SSB), which finds 12-layer tracks by adding extra hits to the identified tracks. In the designed configuration, 16 PUs and four SSBs are installed in a VME crate. The high power consumption of the AMB, AUX, and SSB (about 250, 70, and 160 W per board, respectively) required the development of a custom cooling system. Even though the expected power consumption for each VME crate of the FTK system is high compared with a common VME setup, the 8 FTK core crates will use ≈60 kW, which is just a fraction of the power and space needed for a CPU farm performing the same task. We report on the integration of 32 PUs and eight SSBs inside the FTK system, on the infrastructure needed to run and cool them, and on the tests performed to verify the system processing rate and the temperature stability at a safe value.
Hog (HDL on Git): An easy system to handle HDL on a git-based repository — Biesuz, N.V.; Cieri, D.; Gonnella, F.; et al.
Nuclear Instruments & Methods in Physics Research, Section A: Accelerators, Spectrometers, Detectors and Associated Equipment, April 2023, Vol. 1049
Journal Article, Peer-reviewed, Open access
Coordinating firmware development among many international collaborators is becoming a widespread problem in high-energy physics. Guaranteeing firmware synthesis reproducibility and assuring traceability of binary files is paramount.
We devised Hog - HDL on git (cern.ch/hog), a set of Tcl and Shell scripts that tackles these issues and is deeply integrated with HDL IDEs, such as Xilinx Vivado Design Suite and ISE PlanAhead or Intel Quartus Prime, and all major simulation tools, like Siemens ModelSim or Aldec Riviera Pro.
Git is a very powerful tool and has been chosen as a standard by several research institutions, including CERN. Hog integrates seamlessly with git to ensure full control of HDL source files, constraint files, and IDE and simulation settings. It guarantees traceability by automatically embedding the git commit SHA and a numeric version into the binary file, which is also automatically renamed.
Hog does not rely on any external tool apart from the HDL IDE and git, so it is highly portable and does not require any installation. Developers can get up to speed quickly: clone the repository, run the Hog script, and work normally with the IDE.
The learning curve for users is minimal. Once the HDL project is created, developers can work on it either through the IDE graphical interface or with the provided Shell scripts that run the workflow.
Hog works on Windows and Linux, supports IPbus and Sigasi, and provides pre-made YAML files that set up a working Continuous Integration pipeline on GitLab (Hog-CI) with no additional effort, running the HDL implementation for the desired projects. Other Hog-CI features include the automatic creation of tags and of GitLab releases with timing and utilisation reports.
Currently, Hog is successfully used by several firmware projects within the High-Energy Physics community, e.g. in the ATLAS and CMS Phase-II upgrades.
This article documents the muon reconstruction and identification efficiency obtained by the ATLAS experiment for 139 fb⁻¹ of pp collision data at √s = 13 TeV collected between 2015 and 2018 during Run 2 of the LHC. The increased instantaneous luminosity delivered by the LHC over this period required a reoptimisation of the criteria for the identification of prompt muons. Improved and newly developed algorithms were deployed to preserve high muon identification efficiency with a low misidentification rate and good momentum resolution. The availability of large samples of Z → μμ and J/ψ → μμ decays, and the minimisation of systematic uncertainties, allows the efficiencies of criteria for muon identification, primary vertex association, and isolation to be measured with an accuracy at the per-mille level in the bulk of the phase space, and up to the percent level in complex kinematic configurations. Excellent performance is achieved over a range of transverse momenta from 3 GeV to several hundred GeV, and across the full muon detector acceptance of |η| < 2.7.
Jet energy scale and resolution measurements with their associated uncertainties are reported for jets using 36–81 fb⁻¹ of proton–proton collision data with a centre-of-mass energy of √s = 13 TeV collected by the ATLAS detector at the LHC. Jets are reconstructed using two different input types: topo-clusters formed from energy deposits in calorimeter cells, as well as an algorithmic combination of charged-particle tracks with those topo-clusters, referred to as the ATLAS particle-flow reconstruction method. The anti-kt jet algorithm with radius parameter R = 0.4 is the primary jet definition used for both jet types. This result presents new jet energy scale and resolution measurements in the high pile-up conditions of late LHC Run 2 as well as a full calibration of particle-flow jets in ATLAS. Jets are initially calibrated using a sequence of simulation-based corrections. Next, several in situ techniques are employed to correct for differences between data and simulation and to measure the resolution of jets. The systematic uncertainties in the jet energy scale for central jets (|η| < 1.2) vary from 1% for a wide range of high-pT jets (250 < pT < 2000 GeV), to 5% at very low pT (20 GeV) and 3.5% at very high pT (> 2.5 TeV). The relative jet energy resolution is measured and ranges from (24 ± 1.5)% at 20 GeV to (6 ± 0.5)% at 300 GeV.
Abstract
The timing performance of the Timepix4 application-specific integrated circuit (ASIC) bump-bonded to a 100 μm thick n-on-p silicon sensor is presented. A picosecond pulsed infrared laser was used to generate electron-hole pairs in the silicon bulk in a repeatable fashion, controlling the amount, position and time of the stimulated charge signal. The timing resolution for a single pixel was measured to be 107 ps r.m.s. for laser-stimulated signals in the silicon sensor bulk. For multi-pixel clusters, the measured timing resolution reached 33 ps r.m.s. by exploiting oversampling of the timing information over several pixels.
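The multi-pixel improvement is consistent with simple averaging of independent per-pixel timestamps, for which the resolution scales roughly as σ/√N. A minimal sketch of this scaling follows; it is our illustration under a naive independent-noise assumption, and the actual gain also depends on per-pixel weighting and correlated noise contributions.

```python
import math

SIGMA_PIXEL_PS = 107.0  # measured single-pixel resolution, ps r.m.s.

def averaged_resolution(n_pixels: int) -> float:
    """Expected r.m.s. of the mean of n independent pixel timestamps."""
    return SIGMA_PIXEL_PS / math.sqrt(n_pixels)

# With roughly 10 contributing pixels the naive estimate approaches
# the measured 33 ps cluster resolution: 107/sqrt(10) is about 33.8 ps.
```

The fact that the measured 33 ps is slightly below this naive estimate suggests that the per-pixel combination does somewhat better than an unweighted mean.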
Abstract
A search for new-physics resonances decaying into a lepton and a jet performed by the ATLAS experiment is presented. Scalar leptoquarks pair-produced in pp collisions at √s = 13 TeV at the Large Hadron Collider are considered using an integrated luminosity of 139 fb⁻¹, corresponding to the full Run 2 dataset. They are searched for in events with two electrons or two muons and two or more jets, including jets identified as arising from the fragmentation of c- or b-quarks. The observed yield in each channel is consistent with the Standard Model background expectation. Leptoquarks with masses below 1.8 TeV and 1.7 TeV are excluded in the electron and muon channels, respectively, assuming a branching ratio into a charged lepton and a quark of 100%, with minimal dependence on the quark flavour. Upper limits on the aforementioned branching ratio are also given as a function of the leptoquark mass.
Abstract
A search for charged Higgs bosons decaying into a top quark and a bottom quark is presented. The data analysed correspond to 139 fb⁻¹ of proton-proton collisions at √s = 13 TeV, recorded with the ATLAS detector at the LHC. The production of a heavy charged Higgs boson in association with a top quark and a bottom quark, pp → tbH⁺ → tbtb, is explored in the H⁺ mass range from 200 to 2000 GeV using final states with jets and one electron or muon. Events are categorised according to the multiplicity of jets and b-tagged jets, and multivariate analysis techniques are used to discriminate between signal and background events. No significant excess above the background-only hypothesis is observed and exclusion limits are derived for the production cross-section times branching ratio of a charged Higgs boson as a function of its mass; they range from 3.6 pb at 200 GeV to 0.036 pb at 2000 GeV at 95% confidence level. The results are interpreted in the hMSSM and M_h^125 scenarios.