Purpose
: The problem of metal artifact reduction (MAR) is almost as old as the clinical use of computed tomography itself. When metal implants are present in the field of measurement, severe artifacts degrade the image quality and the diagnostic value of CT images. Up to now, no generally accepted solution to this issue has been found. In this work, a method based on a new MAR concept is presented: frequency split metal artifact reduction (FSMAR). It ensures efficient reduction of metal artifacts at high image quality with enhanced preservation of details close to metal implants.
Methods
: FSMAR combines a raw data inpainting-based MAR method with an image-based frequency split approach. Many typical MAR methods are inpainting-based and simply replace unreliable parts of the projection data, for example, by linear interpolation. Frequency split approaches have been used in CT before, for example, to combine two reconstruction methods in order to reduce cone-beam artifacts. FSMAR combines the high frequencies of an uncorrected image, for which all available data were used in the reconstruction, with the more reliable low frequencies of an image corrected with an inpainting-based MAR method. The algorithm is tested in combination with normalized metal artifact reduction (NMAR) and with a standard inpainting-based MAR approach. NMAR is a more sophisticated inpainting-based MAR method that introduces fewer new artifacts resulting from interpolation errors. A quantitative evaluation was performed using a simulation of the XCAT phantom and a scan of a spine phantom. Further evaluation includes patients with different types of metal implants: hip prostheses, dental fillings, neurocoil, and spine fixation, scanned with a modern clinical dual source CT scanner.
Results
: FSMAR ensures sharp edges and a preservation of anatomical details that is in many cases better than with an inpainting-based MAR method alone. In contrast to other MAR methods, FSMAR yields images without the usual blurring close to implants.
Conclusions
: FSMAR should be used together with NMAR, a combination which ensures an accurate correction of both high and low frequencies. The algorithm is computationally inexpensive compared to iterative methods and methods with complex inpainting schemes. No parameters were chosen manually; it is ready for application in clinical routine.
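The frequency-split idea described above can be illustrated with a minimal 1-D sketch. This is an illustration of the concept only, not the published FSMAR implementation: the box filter, the `radius` parameter, and the function names are assumptions standing in for the actual low-pass filter and 2-D CT images.

```python
def low_pass(signal, radius=2):
    """Box-filter stand-in for the low-pass used in the frequency split."""
    out = []
    n = len(signal)
    for i in range(n):
        lo, hi = max(0, i - radius), min(n, i + radius + 1)
        out.append(sum(signal[lo:hi]) / (hi - lo))
    return out

def frequency_split_mar(uncorrected, corrected, radius=2):
    """Combine the low frequencies of `corrected` with the high frequencies
    of `uncorrected` (hypothetical 1-D profiles standing in for CT images)."""
    low = low_pass(corrected, radius)
    high = [u - l for u, l in zip(uncorrected, low_pass(uncorrected, radius))]
    return [l + h for l, h in zip(low, high)]
```

With this split, low-frequency metal artifacts are taken from the corrected image, while edges and fine detail close to the implant are carried by the high-pass component of the uncorrected image.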
Electroencephalogram (EEG) plays an important role in identifying brain activity and behavior. However, the recorded electrical activity is often contaminated with artifacts, which affect the analysis of the EEG signal. Hence, it is essential to develop methods that effectively detect artifacts and extract clean EEG data from the recordings. Several methods have been proposed to remove artifacts, but artifact removal remains an open research problem. This paper reviews current artifact removal methods for the various types of contamination. We first discuss the characteristics of EEG data and the types of artifacts. Then, a general overview of the state-of-the-art methods and a detailed analysis of them are presented. Lastly, a comparative analysis is provided for choosing a suitable method for a particular application.
The Gravity Recovery and Climate Experiment (GRACE) gravitational models suffer from a dominant systematic error, usually referred to as “longitudinal stripes.” These stripes contaminate useful geophysical signals and limit the spectrum of geoscience applications that can benefit from GRACE and GRACE‐Follow On. Analyses of the spatiotemporal structure of latitudinal stripe profiles show consistent spectral characteristics throughout three years of monthly solutions. Using a combination of GRACE sampling characteristics and moiré theory, we show that the GRACE stripes are sub‐Nyquist (pseudo‐moiré) artifacts arising from the oversampling of the Earth's low‐frequency static disturbing potential (geoid) along the parallels. The low‐frequency geoid modulates the total sampled gravitational signal with a frequency near (m/n)fs, where fs is the sampling frequency of the GRACE ground track “bundles” along the parallels of latitude, and m and n are mutually prime integers with 2m ≤ n.
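The sub‐Nyquist mechanism can be reproduced numerically. The sketch below uses arbitrary illustrative values for m, n, and the frequency offset (not the GRACE parameters): it samples a signal whose frequency lies near (m/n)·fs and shows that decimating by n exposes a slow moiré beat.

```python
import math

# Sub-Nyquist (pseudo-moire) artifact sketch with hypothetical parameters:
# sample a signal whose frequency sits near a rational fraction m/n of the
# sampling frequency fs, with m and n mutually prime and 2m <= n.
m, n = 2, 5
fs = 1.0               # sampling frequency (normalized)
eps = 0.003            # small offset from the exact rational fraction
f = (m / n + eps) * fs # signal frequency near (m/n) * fs

# Samples taken at the sampling frequency fs
x = [math.cos(2 * math.pi * f * k / fs) for k in range(200)]

# Keeping every n-th sample cancels the fast m/n component and exposes a
# slow "moire" beat of frequency n*eps -- the analogue of the stripes.
beat = [x[k * n] for k in range(40)]
expected = [math.cos(2 * math.pi * n * eps * k) for k in range(40)]
assert all(abs(a - b) < 1e-9 for a, b in zip(beat, expected))
```

Decimating by n removes the integer part m of the phase advance accumulated over n samples, leaving only the slow residual of frequency n·eps: a low-frequency pattern emerging from an almost-commensurate sampling geometry.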
Plain Language Summary
The Gravity Recovery and Climate Experiment (GRACE) mission allowed the monitoring of the Earth's static and dynamic gravity field. Although GRACE has been the subject of an extensive number of studies, GRACE‐based gravity field models display strong systematic noise in the form of “stripes” extending longitudinally around the globe, contaminating useful geophysical signals. Their origin had not been explained since the launch of GRACE in 2002. Thorough analyses of the spatiotemporal structure of GRACE stripes reveal that their location is nonstationary but with consistent spectral characteristics over monthly and long‐term gravity models. In this contribution, we quantify the GRACE sampling characteristics and use them to prove that the stripes are sub‐Nyquist (pseudo‐moiré) artifacts. We prove that the stripes are the result of oversampling the low‐frequency geoid along the east‐west (latitudinal) direction. We generate synthetic stripes using moiré theory, and by means of rigorous spectral analysis, we show that their spectral and spatial characteristics are very similar to the observed ones.
Key Points
The effective latitudinal sampling interval of GRACE equals Δs = 1.14°
Stripes are sub‐Nyquist artifacts, driven by the oversampling of the low‐degree latitudinal geoid
GRACE sampling parameters and spectral moiré theory lead to simulated sub‐Nyquist artifacts (synthetic stripes)
DNA metabarcoding is promising for cost-effective biodiversity monitoring, but reliable diversity estimates are difficult to achieve and validate. Here we present and validate a method, called LULU, for removing erroneous molecular operational taxonomic units (OTUs) from community data derived by high-throughput sequencing of amplified marker genes. LULU identifies errors by combining sequence similarity and co-occurrence patterns. To validate the LULU method, we use a unique data set of high-quality survey data of vascular plants paired with plant ITS2 metabarcoding data of DNA extracted from soil from 130 sites in Denmark spanning major environmental gradients. OTU tables are produced with several different OTU definition algorithms, subsequently curated with LULU, and validated against field survey data. LULU curation consistently improves α-diversity estimates and other biodiversity metrics, and does not require a sequence reference database; thus, it represents a promising method for reliable biodiversity estimation.
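The combination of sequence similarity and co-occurrence can be sketched as follows. This is a simplified illustration of the idea, not the published LULU algorithm: the table format, thresholds, and function name are assumptions, and the real method also considers abundance ratios between parent and child OTUs.

```python
def lulu_like_curation(otu_table, similarity, sim_threshold=0.84,
                       cooccur_threshold=0.95):
    """Discard an OTU when a more abundant retained OTU is sufficiently
    similar and present in nearly all of its sites (simplified sketch).
    otu_table: {otu_id: [read counts per site]} (hypothetical format);
    similarity: {(id_a, id_b): pairwise sequence similarity}."""
    def sim(a, b):
        return max(similarity.get((a, b), 0.0), similarity.get((b, a), 0.0))

    retained = []
    # Visit OTUs from most to least abundant so potential "parents" are
    # always considered before their putative error "children".
    for otu in sorted(otu_table, key=lambda o: sum(otu_table[o]), reverse=True):
        sites = [i for i, reads in enumerate(otu_table[otu]) if reads > 0]
        flagged = any(
            sim(otu, parent) >= sim_threshold
            and sites
            and sum(1 for i in sites if otu_table[parent][i] > 0) / len(sites)
                >= cooccur_threshold
            for parent in retained
        )
        if not flagged:
            retained.append(otu)
    return set(retained)

# Hypothetical example: B is similar to the abundant A and co-occurs with it,
# so it is flagged as an error; C is dissimilar and survives curation.
table = {"A": [10, 10, 10], "B": [1, 1, 1], "C": [0, 5, 0]}
sims = {("A", "B"): 0.97}
assert lulu_like_curation(table, sims) == {"A", "C"}
```

The key design choice is that similarity alone is not enough to discard an OTU: a low-abundance variant is only removed when it also co-occurs with its more abundant parent, which is what distinguishes sequencing errors from genuinely rare taxa.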
Abstract
Obtaining a clear picture of the events and artifacts that have occurred is a difficult task that must be solved in digital forensics. Recovering the chronology of events and artifacts allows experts to understand the timeline of digital crimes and to present conclusions in the form of digital evidence using live-response technologies.