Abstract
An object-based forecasting, nowcasting, and alerting system prototype was demonstrated during the summer 2015 Environment Canada Pan Am Science Showcase (ECPASS) in Toronto. Part of this demonstration involved the generation of experimental thunderstorm threat areas by both automated NWP postprocessing algorithms and a pair of human forecasters. This paper first develops a rigorous verification methodology for the intercomparison of continuous as well as categorical probabilistic thunderstorm forecasts. The methodology is then applied to the intercomparison of thunderstorm forecasts made during ECPASS. Statistical postprocessing of forecasts by smoothing with optimal bandwidth followed by recalibration is found to improve the skill scores of all thunderstorm forecasts studied at all lead times between 6 and 48 h. In addition, the calibrated ensemble mean forecasts are found to be better than the calibrated deterministic thunderstorm forecasts for all lead times considered, though postprocessing of the convective rain-rate forecast gives results that are statistically comparable with the ensemble mean forecast. Thunderstorm threat areas that were automatically generated by thresholding the output of NWP-based postprocessing algorithms have better scores than those generated by human forecasters for most lead times beyond 9 h, indicating that they could be integrated as an automated tool for providing high-quality “first-guess” thunderstorm threat areas in an object-based forecasting, nowcasting, and alerting system. A unique contribution of this paper is a novel verification methodology for the fair comparison between continuous and categorical probabilistic forecasts, a methodology that could be used for other experiments involving human- and automatically generated object-based forecasts derived from probabilistic forecasts.
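The abstract above reports skill scores for recalibrated probabilistic thunderstorm forecasts without naming the score used. As an illustration only, a standard choice for this kind of verification is the Brier skill score relative to climatology; the function names and toy data below are ours, not the paper's:

```python
import numpy as np

def brier_score(p, o):
    """Mean squared difference between forecast probabilities p and binary outcomes o."""
    p = np.asarray(p, dtype=float)
    o = np.asarray(o, dtype=float)
    return np.mean((p - o) ** 2)

def brier_skill_score(p, o):
    """Skill relative to a climatological reference forecast (the observed base rate).
    Positive values mean the forecast beats climatology; 1 is a perfect forecast."""
    o = np.asarray(o, dtype=float)
    ref = brier_score(np.full_like(o, o.mean()), o)
    return 1.0 - brier_score(p, o) / ref

# Toy example: forecast thunderstorm probabilities vs. observed occurrence (1/0)
p = [0.9, 0.1, 0.8, 0.2, 0.7]
o = [1, 0, 1, 0, 0]
print(round(brier_skill_score(p, o), 3))
```

A recalibration step such as the one described in the abstract aims to improve exactly this kind of score by mapping raw forecast probabilities onto observed relative frequencies.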
Abstract
The Integrated Multi-satellitE Retrievals for Global Precipitation Measurement (GPM; IMERG) is a high-resolution gridded precipitation dataset widely used around the world. This study assessed the performance of the half-hourly IMERG v06 Early and Final Runs over a 5-yr period versus 19 high-quality surface stations in the Great Lakes region of North America. This assessment not only looked at precipitation occurrence and amount, but also studied the IMERG Quality Index (QI) and errors related to passive microwave (PMW) sources. Analysis of bias in accumulated precipitation amount and precipitation occurrence statistics suggests that IMERG presents various uncertainties with respect to time scale, meteorological season, PMW source, QI, and land surface type. Results indicate that 1) the cold season’s (November–April) larger relative bias can be mitigated via backward morphing; 2) IMERG 6-h precipitation amount scored best in the warmest season (JJA) with a consistent overestimation of the frequency bias index minus one (FBI−1); 3) the performance of five PMW sources is affected by the season to different degrees; 4) in terms of some metrics, skills do not always enhance with increasing QI; 5) local lake effects lead to higher correlation and equitable threat score (ETS) for the stations closest to the lakes. Results of this study will be beneficial to both developers and users of IMERG precipitation products.
Significance Statement
The purpose of the study was to assess the performance of the gridded precipitation product from the Integrated Multi-satellitE Retrievals for Global Precipitation Measurement (IMERG) version 6 over the Great Lakes region of North America. The assessment performs a statistical comparison of precipitation amounts from IMERG versus surface stations as a function of time scale, season, precipitation event threshold, and input source among satellites. Interpretation of the results identifies shortcomings in the IMERG algorithms, particularly in extreme precipitation events and over ice-covered surfaces. The results also describe spatial variability in the IMERG data quality due to the complex geography of the study area and offer a clear threshold in the Quality Index (QI) flag for optimal application of the precipitation products.
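The FBI and ETS cited in this assessment are standard scores computed from a 2×2 precipitation contingency table (hits, false alarms, misses, correct negatives). A minimal sketch under the usual definitions (the counts below are hypothetical; the FBI−1 reported above is simply this bias minus one):

```python
def contingency_scores(hits, false_alarms, misses, correct_negs):
    """Frequency Bias Index and Equitable Threat Score from a 2x2 contingency table.
    FBI > 1 means events are forecast more often than observed; ETS discounts
    hits expected by random chance."""
    n = hits + false_alarms + misses + correct_negs
    fbi = (hits + false_alarms) / (hits + misses)
    hits_random = (hits + false_alarms) * (hits + misses) / n  # chance-expected hits
    ets = (hits - hits_random) / (hits + false_alarms + misses - hits_random)
    return fbi, ets

# Hypothetical counts for one station and one precipitation threshold
fbi, ets = contingency_scores(hits=40, false_alarms=20, misses=10, correct_negs=130)
print(round(fbi, 3), round(ets, 3))
```

Scores like these are computed per threshold and per season, which is how the study separates warm-season from cold-season performance.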
Accurate and cost‐effective dreissenid mussel abundance maps are vital to assess their ecological roles in aquatic systems. A deep neural network (DNN) modeling framework using semantic segmentation was developed to automatically assess the abundance distribution of two invasive mussel species: zebra and quagga. DNN models were trained on images captured in Lake Erie and Lake Ontario using an underwater color imaging technique. The accuracy of the method was assessed relative to manual laboratory counts of harvested mussels, their dry biomass, and percentage live coverage estimated from fixed‐size quadrats. Assessments performed on a test set collected from 2016 to 2018 show that DNN‐based mussel coverage predictions explain 79% of the variance in log biomass, and 71% for log abundance (N = 125). For reference, live coverage estimated by scuba divers was transformed and found to be a better predictor of biomass (93%) and abundance (91%) (N = 725), leaving room for improvement of our automated method. When identical images were presented to eight human analysts and the DNN, the agreement in live mussel coverage prediction was 85% (N = 189). Models generalize well to diverse underwater illuminations, camera orientations, and resolutions, but are adversely impacted by occluding vegetation and suspended sediment. DNN models are an efficient and accurate solution for mapping mussel abundances at a scale that was previously impossible. The method may be integrated with other studies to assess the mussels' impacts in a variety of aquatic ecosystems. Source code (https://github.com/AngusG/deep-learning-dreissenid) and data (https://doi.org/10.5683/SP3/MZEBOJ) for reproducing our method are publicly available.
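The 79% and 71% figures above are variance explained (R²) when regressing log biomass and log abundance on DNN-predicted percent live coverage. A minimal sketch of the two quantities involved, assuming a binary live-mussel segmentation mask; the function names and data are illustrative, not from the paper's codebase:

```python
import numpy as np

def live_coverage(mask):
    """Fraction of pixels labelled live mussel (1) in a binary segmentation mask."""
    return np.asarray(mask, dtype=float).mean()

def r_squared(x, y):
    """Variance in y explained by a simple linear regression of y on x."""
    x, y = np.asarray(x, dtype=float), np.asarray(y, dtype=float)
    slope, intercept = np.polyfit(x, y, 1)
    resid = y - (slope * x + intercept)
    return 1.0 - resid.var() / y.var()

# Toy mask: 25% of the quadrat covered by live mussels
print(live_coverage([[1, 0], [0, 0]]))
```

In the study's setting, `x` would be DNN coverage predictions across quadrats and `y` the log of laboratory-measured biomass or abundance.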
Recently, sparse representation based methods have proven to be successful at solving image restoration problems. The objective of these methods is to use a sparsity prior of the underlying signal in terms of some dictionary and achieve optimal performance in terms of mean-squared error, a metric that has been widely criticized in the literature due to its poor performance as a visual quality predictor. In this work, we make one of the first attempts to employ the structural similarity (SSIM) index, a more accurate perceptual image measure, by incorporating it into the framework of sparse signal representation and approximation. Specifically, the proposed optimization problem solves for coefficients with minimum ℓ₀ norm and maximum SSIM index value. Furthermore, a gradient descent algorithm is developed to achieve an SSIM-optimal compromise in combining the input and sparse dictionary reconstructed images. We demonstrate the performance of the proposed method by using image denoising and super-resolution methods as examples. Our experimental results show that the proposed SSIM-based sparse representation algorithm achieves better SSIM performance and better visual quality than the corresponding least square-based method.
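For orientation, the SSIM index that this abstract optimizes combines luminance, contrast, and structure comparisons; in practice it is computed in local windows and averaged, but a single-window version captures the formula. This sketch is a generic illustration, not the paper's optimization algorithm:

```python
import numpy as np

def ssim_global(x, y, c1=(0.01 * 255) ** 2, c2=(0.03 * 255) ** 2):
    """Single-window SSIM between two image patches (8-bit dynamic range assumed).
    The full index applies this locally with a sliding window and averages the map."""
    x, y = np.asarray(x, dtype=float), np.asarray(y, dtype=float)
    mx, my = x.mean(), y.mean()          # mean intensities (luminance)
    vx, vy = x.var(), y.var()            # variances (contrast)
    cov = ((x - mx) * (y - my)).mean()   # covariance (structure)
    return ((2 * mx * my + c1) * (2 * cov + c2)) / \
           ((mx ** 2 + my ** 2 + c1) * (vx + vy + c2))
```

Identical patches score exactly 1; degraded reconstructions score lower, which is the quantity the paper's gradient descent trades off against the ℓ₀ sparsity of the dictionary coefficients.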
Probabilistic methods are useful to estimate the uncertainty in spatial meteorological fields (e.g., the uncertainty in spatial patterns of precipitation and temperature across large domains). In ensemble probabilistic methods, “equally plausible” ensemble members are used to approximate the probability distribution, and hence the uncertainty, of a spatially distributed meteorological variable conditioned on the available information. The ensemble members can be used to evaluate the impact of uncertainties in spatial meteorological fields for a myriad of applications. This study develops the Ensemble Meteorological Dataset for North America (EMDNA). EMDNA has 100 ensemble members with daily precipitation amount, mean daily temperature, and daily temperature range at 0.1° spatial resolution (approx. 10 km grids) from 1979 to 2018, derived from a fusion of station observations and reanalysis model outputs. The station data used in EMDNA are from a serially complete dataset for North America (SCDNA) that fills gaps in precipitation and temperature measurements using multiple strategies. Outputs from three reanalysis products are regridded, corrected, and merged using Bayesian model averaging. Optimal interpolation (OI) is used to merge station- and reanalysis-based estimates. EMDNA estimates are generated using spatiotemporally correlated random fields to sample from the OI estimates. Evaluation results show that (1) the merged reanalysis estimates outperform raw reanalysis estimates, particularly in high latitudes and mountainous regions; (2) the OI estimates are more accurate than the reanalysis and station-based regression estimates, with the most notable improvements for precipitation evident in sparsely gauged regions; and (3) EMDNA estimates exhibit good performance according to the diagrams and metrics used for probabilistic evaluation. We discuss the limitations of the current framework and highlight that further research is needed to improve ensemble meteorological datasets. Overall, EMDNA is expected to be useful for hydrological and meteorological applications in North America. The entire dataset and a teaser dataset (a small subset of EMDNA for easy download and preview) are available at https://doi.org/10.20383/101.0275 (Tang et al., 2020a).
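The optimal interpolation step used above follows the standard OI analysis update x_a = x_b + K(y − Hx_b) with gain K = BHᵀ(HBHᵀ + R)⁻¹, blending a background field with observations according to their error covariances. A minimal sketch of that update; the toy covariances and values are illustrative, not EMDNA's actual error models:

```python
import numpy as np

def oi_update(background, obs, H, B, R):
    """Optimal interpolation analysis: move the background field toward the
    observations, weighted by background (B) and observation (R) error covariances."""
    background = np.asarray(background, dtype=float)
    obs = np.asarray(obs, dtype=float)
    K = B @ H.T @ np.linalg.inv(H @ B @ H.T + R)   # gain matrix
    return background + K @ (obs - H @ background)  # analysis field

# Two grid points, one station observing the first point
xb = np.array([10.0, 12.0])             # background (e.g. merged reanalysis)
H = np.array([[1.0, 0.0]])              # observation operator
B = np.array([[2.0, 1.0], [1.0, 2.0]])  # background error covariance
R = np.array([[1.0]])                   # observation error variance
y = np.array([13.0])                    # station observation
print(oi_update(xb, y, H, B, R))        # → [12. 13.]
```

Note how the off-diagonal term in B spreads the station's correction to the unobserved second grid point, which is how OI improves estimates in sparsely gauged regions.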
Identifying women at risk of venous thromboembolism (VTE) is a major public health issue. The objective of this study was to identify environmental and genetic determinants of VTE risk in a large sample of women under combined oral contraceptives (COC). A total of 968 women who had had one event of VTE during COC use were compared to 874 women under COC but with no personal history of VTE. Clinical data were collected and a systematic thrombophilia screening was performed together with ABO blood group assessment. After adjusting for age, family history, and type and duration of COC use, the main environmental determinants of VTE were smoking (odds ratio, OR = 1.65, 95% confidence interval 1.30–2.10) and a body mass index higher than 35 kg·m⁻² (OR = 3.46, 95% CI 1.81–7.03). In addition, severe inherited thrombophilia (OR = 2.13, 95% CI 1.32–3.51) and non-O blood groups (OR = 1.98, 95% CI 1.57–2.49) were strong genetic risk factors for VTE. Family history poorly predicted thrombophilia, as its prevalence was similar in patients with or without a first-degree family history of VTE (29.3% vs 23.9%, p = 0.09). In conclusion, this study confirms the influence of smoking and obesity and shows for the first time the impact of ABO blood group on the risk of VTE in women under COC. It also confirms the inaccuracy of family history of VTE for detecting inherited thrombophilia.
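The odds ratios above are adjusted estimates from a regression model not reproduced in the abstract. For orientation only, a crude (unadjusted) odds ratio with a Wald 95% confidence interval can be computed from a 2×2 exposure table; the counts below are hypothetical:

```python
import math

def odds_ratio_ci(exposed_cases, exposed_controls,
                  unexposed_cases, unexposed_controls, z=1.96):
    """Unadjusted odds ratio with a Wald 95% confidence interval from a 2x2 table."""
    a, b = exposed_cases, exposed_controls
    c, d = unexposed_cases, unexposed_controls
    or_ = (a * d) / (b * c)
    se = math.sqrt(1 / a + 1 / b + 1 / c + 1 / d)  # SE of log odds ratio
    lo = math.exp(math.log(or_) - z * se)
    hi = math.exp(math.log(or_) + z * se)
    return or_, lo, hi

# Hypothetical: smokers among VTE cases vs. controls
print(odds_ratio_ci(exposed_cases=300, exposed_controls=200,
                    unexposed_cases=668, unexposed_controls=674))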
The clinical presentation of pervasive refusal syndrome is marked by a refusal to eat, walk, or talk, firm resistance, and an aggressive refusal to accept help and care. The management of patients with this syndrome is physically and emotionally draining for caregivers. Because the presentation is so unusual and singular, the quality of the caregiver–patient relationship can quickly deteriorate. Training, communication, and support within the team are essential to be able to continue providing compassionate care.
Cardiovascular diseases, including fatal myocardial infarctions from atheromatous plaques, are the primary global mortality cause. Detecting stenotic atheromatous plaques is possible through coronary angiography, but vulnerable plaques with eccentric remodeling are undetectable with current diagnostic methods. Addressing this challenge, our group developed a radiopharmaceutical drug targeting vascular cell adhesion molecule 1 (VCAM-1), radiolabeled with technetium-99m. Given the absence of a monograph in the European Pharmacopoeia, and in order to draft the investigational medicinal product documentation, analytical methods had to be validated by high performance liquid chromatography (HPLC) and thin layer chromatography (TLC) to determine the radiochemical purity (RCP) of 99mTc-cAbVCAM1–5. This study therefore presents the results of the validation of analytical methods obtained in this context. The method validation followed the European Association of Nuclear Medicine (EANM) recommendations adapted from ICH Q2(R1), ensuring conformity with specificity, accuracy, repeatability and intermediate precision, linearity, robustness, limit of quantification (LOQ), and range criteria. Regarding specificity, both HPLC and TLC methods demonstrated excellent separation of 99mTc-cAbVCAM1–5 from the impurity 99mTcO4-. Accuracy results indicated recovery percentages within the range of 99.52–101.40% for HPLC and 99.51–101.97% for TLC, ensuring reliable measurements for each concentration of 99mTcO4-. Precision of the methods was validated by assessing repeatability and intermediate precision. Linearity was determined over the usual concentration range, and the correlation coefficient was greater than 0.99 for both methods. The limit of quantification was measured by diluting the 99mTcO4- to obtain a signal-to-noise ratio of around 10:1. Under these conditions, we obtained an LOQ of 2.10 MBq/mL for HPLC and 2 MBq/mL for TLC.
In conclusion, the analytical methods developed in this study comply with EANM recommendations. This allows us to correctly assess the radiochemical purity of 99mTc-cAbVCAM1–5, a new radiotracer targeting inflammation in vulnerable plaques.
• 99mTc-cAbVCAM1–5 is a new radiopharmaceutical targeting a marker of inflammation found in vulnerable atheromatous plaques in cardiac diseases.
• A validation of the analytical methods to evaluate the radiochemical purity (RCP) by HPLC and TLC was carried out.
• Specificity, accuracy, precision, linearity, robustness, limit of quantification, and range criteria were assessed.
• The results demonstrate that the analytical methods are suitable for assessing the RCP of this new radiopharmaceutical.
Geodesics of the Structural Similarity index
Brunet, Dominique; Vass, József; Vrscay, Edward R.
Applied Mathematics Letters, 11/2012, Volume 25, Issue 11
Journal Article · Peer reviewed · Open access
We construct metrics from the geodesics of the Structural Similarity index, an image quality assessment measure. An analytical solution is given for the simple case of zero stability constants, and the general solution involving the numerical solution of a nonlinear equation is also found.