We propose a technique for metal artefact reduction in digital tomosynthesis reconstruction. Although the problem has been addressed earlier in the literature, we suggest another approach which is, in our opinion, simpler and easier to implement. It is a two-stage algorithm. In the first stage, attenuation images are segmented by decomposing their intensity distributions into Gaussian-like components. Statistical information contained in each component is used for pixel classification. Components corresponding to metallic objects are identified, and a pixel threshold value separating regions occupied by metal objects from the rest of the image is found. Based on this value, in the second stage, a smooth mapping of image intensity is applied. This makes dense regions transparent, resulting in artefact reduction in the reconstruction. The methodology is demonstrated by several examples.
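The second stage described above amounts to a smooth compression of intensities beyond the metal threshold. A minimal sketch in Python, assuming a logistic blend (the function name `suppress_metal`, the `softness` parameter, and the particular mapping are illustrative assumptions, not the authors' implementation):

```python
import numpy as np

def suppress_metal(image, threshold, softness=0.1):
    """Smoothly compress intensities above `threshold` so that dense
    (metal) regions become nearly transparent before reconstruction.
    A logistic blend avoids the hard edges that cause streak artefacts."""
    # weight w -> 1 for pixels well above the threshold (metal), -> 0 below
    w = 1.0 / (1.0 + np.exp(-(image - threshold) / (softness * threshold)))
    # blend the original intensity with the threshold value itself,
    # so metal pixels are mapped close to the threshold level
    return (1.0 - w) * image + w * threshold
```

Because the mapping is smooth and monotonic, it avoids introducing new discontinuities that the reconstruction would turn into streaks.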
Monte Carlo (MC) codes serve as the gold-standard simulation tool during the design and optimisation of x-ray imaging systems. Such simulations often model Rayleigh scattering with the Independent Atom Approximation Model (IAM). This model neglects the low-range molecular interference (MI) effects of non-crystalline materials such as human tissues. Previous work has found discrepancies between IAM and MI models in simulated planar x-ray images; however, only insignificant differences were found for computed tomography (CT) reconstructions. In this work we present Geant4 MC simulations of a flat-panel-source digital tomosynthesis (DT) system for human extremities. Results show that with a 1:9 scatter-to-primary ratio (SPR) in the x-ray projections, the DT reconstructions are insensitive to the differences between the IAM and MI models. Therefore, MC codes that use the IAM model are sufficient for the study of DT systems. This is because DT reconstruction algorithms affect image quality far more than the few-percent change in noise due to the choice of physical model, and noise-suppression methods make this change even less important. The dependence of this conclusion on SPR must be considered in other DT modalities where the SPR may be larger.
A novel adaptive mesh technique is introduced for problems of image reconstruction in luminescence optical tomography. A dynamical adaptation of the three-dimensional scheme based on the finite-volume formulation reduces computational time and balances the ill-posed nature of the inverse problem. The arbitrary shape of the bounding surface is handled by an additional refinement of computational cells on the boundary. Dynamical shrinking of the search volume is introduced to improve computational performance and accuracy while locating the luminescence target. Light propagation in the medium is modeled by the telegraph equation, and the image-reconstruction algorithm is derived from the Fredholm integral equation of the first kind. Stability and computational efficiency of the introduced method are demonstrated for image reconstruction of one and two spherical luminescent objects embedded within a breastlike tissue phantom. Experimental measurements are simulated by the solution of the forward problem on a grid of 5×5 light guides attached to the surface of the phantom.
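For reference, one common P1-type form of the telegraph equation used for photon transport in scattering media is (the exact coefficients used in the paper may differ; this form is an assumption):

```latex
\frac{3D}{c^2}\,\frac{\partial^2 \phi}{\partial t^2}
+ \frac{1}{c}\,\frac{\partial \phi}{\partial t}
- \nabla\cdot\bigl(D\,\nabla\phi\bigr)
+ \mu_a\,\phi = q,
\qquad
D = \frac{1}{3\,(\mu_a + \mu_s')},
```

where \(\phi\) is the photon fluence, \(\mu_a\) and \(\mu_s'\) the absorption and reduced scattering coefficients, \(c\) the speed of light in the medium, and \(q\) the source term. The reconstruction then inverts a Fredholm integral equation of the first kind,

```latex
m(\mathbf{r}_d) = \int_\Omega K(\mathbf{r}_d,\mathbf{r})\, x(\mathbf{r})\, d\mathbf{r},
```

with \(m\) the boundary measurements, \(K\) the sensitivity kernel from the forward model, and \(x\) the unknown luminescence source distribution.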
Three-dimensional bioluminescence imaging is an emerging technique that can be used to monitor molecular events in intact living systems. The inverse problem of 3D bioluminescence imaging does not have a unique solution because it requires reconstruction of a 3D source function from a 2D one. A novel approach that addresses this problem with the aid of a simple experimental setup and solves the uniqueness problem of the solution for a monochromatic measurement set is suggested here. The approach is verified numerically by reconstructing bioluminescent objects of various shapes embedded inside highly scattering media, such as biological tissue.
3D imaging modalities such as computed tomography and digital tomosynthesis typically scan the patient from different angles with a lengthy mechanical movement of a single x-ray tube. The millions of 3D scans performed each year therefore require expensive mechanisms that support a heavy x-ray source and must compensate for machine vibrations and patient movement. However, recent developments in cold-cathode field-emission technology allow the creation of compact, stationary arrays of emitters. Adaptix Ltd has developed a novel, low-cost, square array of such emitters and demonstrated 3D digital tomosynthesis of human extremities and small animals. The use of cold-cathode field emitters also makes the system compact and lightweight. This paper presents Monte Carlo simulations of a concept upgrade of the Adaptix system from the current 60 kVp to 90 kVp and 120 kVp, voltages better suited to chest imaging. Between 90 kVp and 120 kVp, 3D image quality appears insensitive to voltage; at 90 kVp the photon yield is reduced by 40%-50% while the effective dose declines by 14%. A square array of emitters can adequately illuminate a subject for tomosynthesis from a shorter source-to-image distance, thereby reducing the required input power and offsetting the 28%-50% more input power required for operation at 90 kVp. This modelling suggests that lightweight, stationary cold-cathode x-ray source arrays could be used for chest tomosynthesis at a lower voltage, with less dose and without sacrificing image quality. This would reduce weight, size and cost, enabling 3D imaging to be brought to the bedside.
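The quoted yield and power figures are roughly consistent with Kramers' law, under which bremsstrahlung output per unit tube current scales approximately with the square of the tube voltage (this scaling argument is an assumption here, not stated in the abstract):

```python
# Rough consistency check of the 90 kVp vs 120 kVp trade-off, assuming
# Kramers' law: photon yield per unit tube current scales ~ V^2.
V_hi, V_lo = 120.0, 90.0

yield_drop = 1 - (V_lo / V_hi) ** 2          # fractional yield loss at 90 kVp
extra_current = (V_hi / V_lo) ** 2           # current ratio needed to restore yield
extra_power = (V_lo * extra_current) / V_hi  # input power ratio, since P = V * I

print(f"yield drop at 90 kVp:  {yield_drop:.0%}")       # ~44%, within quoted 40%-50%
print(f"extra power at 90 kVp: {extra_power - 1:.0%}")  # ~33%, within quoted 28%-50%
```

Both numbers fall inside the ranges quoted in the abstract, which supports the claim that a shorter source-to-image distance can offset the additional input power required at the lower voltage.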
Measurements of beam backgrounds in SuperKEKB Phase 2. Liptak, Z.; Paladino, A.; Santelj, L.
Nuclear Instruments & Methods in Physics Research, Section A: Accelerators, Spectrometers, Detectors and Associated Equipment, 10/2022, Volume 1040.
Journal article, peer-reviewed, open access.
The high design luminosity of the SuperKEKB electron–positron collider will result in challenging levels of beam-induced backgrounds in the interaction region. Understanding and mitigating these backgrounds is critical to the success of the Belle II experiment. We report on the first background measurements performed after roll-in of the Belle II detector, a period known as SuperKEKB Phase 2, utilizing both the BEAST II system of dedicated background detectors and the Belle II detector itself. We also report on first revisions to the background simulation made in response to our findings. Backgrounds measured include contributions from synchrotron radiation, beam-gas, Touschek, and injection backgrounds. At the end of Phase 2, single-beam backgrounds originating from the 4 GeV positron Low Energy Ring (LER) agree reasonably well with simulation, while backgrounds from the 7 GeV electron High Energy Ring (HER) are approximately one order of magnitude higher than simulation. We extrapolate these backgrounds forward and conclude it is safe to install the Belle II vertex detector.
A novel adaptive mesh technique in the Fourier domain is introduced for problems in fluorescence lifetime imaging. A dynamical adaptation of the three-dimensional scheme based on the finite-volume formulation reduces computational time and balances the ill-posed nature of the inverse problem. Light propagation in the medium is modeled by the telegraph equation, while the lifetime reconstruction algorithm is derived from the Fredholm integral equation of the first kind. Stability and computational efficiency of the method are demonstrated by image reconstruction of two spherical fluorescent objects embedded in a tissue phantom.
In the original published article, some of the symbols in figure 1A were modified incorrectly during the typesetting and publication process. The correct version of the figure is provided in this correction.
We consider the problem of fluorescence lifetime optical tomographic imaging in a weakly scattering medium in the presence of highly scattering inclusions. We suggest an approximation to the radiative transfer equation, which results from the assumption that the transport coefficient of the scattering media differs by an order of magnitude between weakly and highly scattering regions. The image reconstruction algorithm is based on the variational framework and employs angularly selective intensity measurements. We present numerical simulations of light scattering in a weakly scattering medium that embeds highly scattering objects. Our reconstruction algorithm is verified by recovering optical and fluorescent parameters from numerically simulated datasets.
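For reference, the time-independent radiative transfer equation being approximated is commonly written as (standard form; the paper's notation and any fluorescence source terms may differ):

```latex
\hat{\mathbf{s}}\cdot\nabla I(\mathbf{r},\hat{\mathbf{s}})
+ \bigl(\mu_a + \mu_s\bigr)\, I(\mathbf{r},\hat{\mathbf{s}})
= \mu_s \int_{4\pi} p(\hat{\mathbf{s}},\hat{\mathbf{s}}')\,
  I(\mathbf{r},\hat{\mathbf{s}}')\, d\hat{\mathbf{s}}'
+ q(\mathbf{r},\hat{\mathbf{s}}),
```

where \(I\) is the radiance in direction \(\hat{\mathbf{s}}\), \(\mu_a\) and \(\mu_s\) the absorption and scattering coefficients, \(p\) the scattering phase function, and \(q\) the source. The approximation described in the abstract exploits \(\mu_s\) differing by an order of magnitude between the weakly and highly scattering regions.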
Charged particle multiplicity distributions in positron-proton deep inelastic scattering at a centre-of-mass energy √s = 319 GeV are measured. The data are collected with the H1 detector at HERA, corresponding to an integrated luminosity of 136 pb⁻¹. Charged particle multiplicities are measured as a function of the photon virtuality Q², the inelasticity y and the pseudorapidity η in the laboratory and hadronic centre-of-mass frames. Predictions from different Monte Carlo models are compared to the data. The first and second moments of the multiplicity distributions are determined and the KNO scaling behaviour is investigated. The multiplicity distributions as a function of Q² and the Bjorken variable x_bj are converted to the hadron entropy S_hadron, and predictions from a quantum entanglement model are tested.
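The hadron entropy referred to above is commonly defined as the Shannon entropy of the multiplicity distribution, S = −Σ_N P(N) ln P(N). A minimal sketch (the function name and the use of raw event counts per multiplicity bin are illustrative assumptions; the paper's exact definition may include corrections):

```python
import math

def hadron_entropy(counts):
    """Shannon entropy S = -sum_N P(N) ln P(N) of a charged-particle
    multiplicity distribution, given raw event counts per multiplicity N."""
    total = sum(counts)
    # normalise counts to probabilities, skipping empty bins
    probs = [c / total for c in counts if c > 0]
    return -sum(p * math.log(p) for p in probs)
```

A uniform distribution over N bins gives the maximal entropy ln N, while a distribution concentrated in a single bin gives zero.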