The seasonal cycle (SC) of surface air temperature is a fundamental element of climatology. Several basic and important topics in climate change studies depend on accurate and reliable estimation of SCs, such as percentile-based indices of extremes and changes in the probability density function (PDF) of daily temperatures. For both, most studies have characterized SCs using averages over multi-day windows, which are neither smooth nor accurate enough to represent the extreme thresholds and climatological normals of SCs. Smooth and well-founded SCs are therefore needed for more accurate estimation of temperature changes in extreme thresholds and PDFs. In this study, we propose a flexible method based on generalized additive models for location, scale and shape (GAMLSS) and the penalized B-spline smoothing technique with appropriate distributions to construct smooth SCs and SCs for extreme temperatures (SCETs). The constructed smooth SCETs are accurate, with estimation biases of percentiles tending to zero. They also exhibit good stability over time: when the period of interest shifts, the magnitude of temperature change on each calendar day is close to the climatic change of mean temperature. Based on the constructed smooth SCs, climatic changes by season and in PDFs between two periods, 1961–1990 and 1991–2020, are examined over China. The increase in thresholds for hot extremes in spring during the recent period is prominent, and the increase in thresholds for daytime cold extremes in summer over parts of central to southern China is also notable. The smooth SCs and SCETs produced by our flexible statistical modelling framework characterize daily extreme temperatures reasonably and accurately, and can be expected to find further applications for a better understanding of climate changes related to distributions and seasonal cycles.
We propose a flexible method based on the GAMLSS framework and the penalized B-spline smoothing technique with appropriate distributions to construct smooth seasonal cycles of extreme temperatures (SCETs). Results show that the smooth SCETs obtained by statistical modelling are reasonable and can be expected to be applied in further studies related to distributions and seasonal cycles.
Individual processes shaping geographical patterns of biodiversity are increasingly understood, but their complex interactions on broad spatial and temporal scales remain beyond the reach of analytical models and traditional experiments. To meet this challenge, we built a spatially explicit, mechanistic simulation model implementing adaptation, range shifts, fragmentation, speciation, dispersal, competition, and extinction, driven by modeled climates of the past 800,000 years in South America. Experimental topographic smoothing confirmed the impact of climate heterogeneity on diversification. The simulations identified regions and episodes of speciation (cradles), persistence (museums), and extinction (graves). Although the simulations had no target pattern and were not parameterized with empirical data, emerging richness maps closely resembled contemporary maps for major taxa, confirming powerful roles for evolution and diversification driven by topography and climate.
We study the problem of estimating high-dimensional regression models regularized by a structured sparsity-inducing penalty that encodes prior structural information on either the input or output variables. We consider two widely adopted types of penalties of this kind as motivating examples: (1) the general overlapping-group-lasso penalty, generalized from the group-lasso penalty; and (2) the graph-guided-fused-lasso penalty, generalized from the fused-lasso penalty. For both types of penalties, due to their nonseparability and nonsmoothness, developing an efficient optimization method remains a challenging problem. In this paper, we propose a general optimization approach, the smoothing proximal gradient (SPG) method, which can solve structured sparse regression problems with any smooth convex loss under a wide spectrum of structured sparsity-inducing penalties. Our approach combines a smoothing technique with an effective proximal gradient method. It achieves a convergence rate significantly faster than standard first-order methods such as subgradient methods, and is much more scalable than the most widely used interior-point methods. The efficiency and scalability of our method are demonstrated on both simulation experiments and real genetic data sets.
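The SPG idea above can be sketched for a graph-guided fused-lasso problem: the non-smooth fusion penalty Σ|βᵢ − βⱼ| is replaced by its Nesterov smoothing (a Huber function with parameter μ), and the remaining simple ℓ₁ term is handled by the soft-thresholding proximal operator. The problem sizes, penalty weights, and `mu` below are illustrative, and plain ISTA is used rather than the accelerated variant.

```python
import numpy as np

def soft_threshold(v, t):
    return np.sign(v) * np.maximum(np.abs(v) - t, 0.0)

def huber_grad(z, mu):
    """Gradient of the Nesterov-smoothed absolute value |z|."""
    return np.clip(z / mu, -1.0, 1.0)

def spg_fused(X, y, edges, lam1=0.1, lam2=0.1, mu=0.01, iters=500):
    n, p = X.shape
    # crude Lipschitz bound: loss curvature + smoothed-penalty curvature
    step = 1.0 / (np.linalg.norm(X, 2) ** 2 + lam2 * len(edges) * 2 / mu)
    b = np.zeros(p)
    for _ in range(iters):
        g = X.T @ (X @ b - y)
        for i, j in edges:               # smoothed fusion-penalty gradient
            a = lam2 * huber_grad(b[i] - b[j], mu)
            g[i] += a
            g[j] -= a
        b = soft_threshold(b - step * g, step * lam1)   # prox of l1 term
    return b

rng = np.random.default_rng(1)
X = rng.normal(size=(50, 4))
beta_true = np.array([1.0, 1.0, 0.0, 0.0])
y = X @ beta_true + 0.05 * rng.normal(size=50)
edges = [(0, 1), (2, 3)]                 # graph encouraging fused pairs
b_hat = spg_fused(X, y, edges)
```

The key design point is that smoothing only touches the structured (nonseparable) part of the penalty, so each iteration stays a cheap gradient step plus a closed-form prox.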
In this article, a numerical approximation of the modified Kawahara equation is investigated using a kernel smoothing method. The spatial derivatives involved in the modified Kawahara equation are approximated by the smoothing-kernel method, whereas for the time integration we employ the Crank–Nicolson method. The conservative nature of the proposed scheme is demonstrated through the mass conservation constant (I1) and the energy conservation constant (I2). To quantify the quality of the proposed scheme, we also performed numerical tests on a collection of test problems.
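As a simplified illustration of the Crank–Nicolson time integration mentioned above, here it is applied to the 1-D heat equation u_t = u_xx rather than the modified Kawahara equation (whose fifth-order spatial operator is more involved); the grid sizes and zero-Dirichlet setup are illustrative choices.

```python
import numpy as np

def crank_nicolson_heat(u0, dx, dt, steps):
    """Crank-Nicolson stepping for u_t = u_xx with zero Dirichlet boundaries."""
    n = u0.size
    r = dt / (2 * dx * dx)
    # second-difference matrix on the interior points
    A = (np.diag(np.full(n, -2.0))
         + np.diag(np.ones(n - 1), 1)
         + np.diag(np.ones(n - 1), -1))
    I = np.eye(n)
    lhs = I - r * A          # implicit half-step
    rhs = I + r * A          # explicit half-step
    u = u0.copy()
    for _ in range(steps):
        u = np.linalg.solve(lhs, rhs @ u)
    return u

x = np.linspace(0.0, 1.0, 101)[1:-1]       # interior grid points
u0 = np.sin(np.pi * x)                     # decays as exp(-pi^2 t)
u = crank_nicolson_heat(u0, dx=0.01, dt=0.001, steps=100)
exact = np.exp(-np.pi ** 2 * 0.1) * np.sin(np.pi * x)
```

Averaging the implicit and explicit halves is what gives the scheme its second-order accuracy in time and its good conservation behavior.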
Attribution methods can provide powerful insights into the reasons for a classifier's decision. We argue that a key desideratum of an explanation is its robustness to input hyperparameter changes that are often randomly set or empirically tuned. High sensitivity to arbitrary hyperparameter choices not only impedes reproducibility but also calls the correctness of an explanation into question and impairs end-users' trust. In this paper, we provide a thorough empirical study on the sensitivity of existing attribution methods. We found an alarming trend that many methods are highly sensitive to changes in their common hyperparameters: even changing a random seed can yield a different explanation! In contrast, explanations generated for robust classifiers that are trained to be invariant to pixel-wise perturbations are surprisingly more robust. Interestingly, such sensitivity is not reflected in the average explanation correctness scores over the entire dataset as commonly reported in the literature.
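The seed sensitivity described above is easy to reproduce in miniature. The sketch below computes SmoothGrad-style attributions for a tiny fixed two-layer network twice, changing only the random seed of the noise samples; the toy model, sample counts, and cosine-similarity check are all illustrative assumptions, not the paper's experimental setup.

```python
import numpy as np

# fixed toy network f(x) = v . tanh(W x)
rng_model = np.random.default_rng(42)
W = rng_model.normal(size=(8, 4))
v = rng_model.normal(size=8)

def grad_f(x):
    """Analytic input gradient of the toy network."""
    h = np.tanh(W @ x)
    return W.T @ (v * (1 - h ** 2))

def smoothgrad(x, sigma=0.3, n_samples=25, seed=0):
    """Average input gradient over Gaussian-perturbed copies of x."""
    rng = np.random.default_rng(seed)
    g = np.zeros_like(x)
    for _ in range(n_samples):
        g += grad_f(x + rng.normal(0.0, sigma, x.shape))
    return g / n_samples

x = np.ones(4)
a1 = smoothgrad(x, seed=0)   # same input, same hyperparameters...
a2 = smoothgrad(x, seed=1)   # ...different seed, different attribution
cos = a1 @ a2 / (np.linalg.norm(a1) * np.linalg.norm(a2))
```

Even with identical inputs and hyperparameters, the two attribution vectors differ sample-by-sample; with small `n_samples` the discrepancy can be substantial, which is the kind of instability the study quantifies at scale.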
The max-relative entropy together with its smoothed version is a basic tool in quantum information theory. In this paper, we derive the exact exponent for the asymptotic decay of the small modification of the quantum state in smoothing the max-relative entropy based on purified distance. We then apply this result to the problem of privacy amplification against quantum side information, and we obtain an upper bound for the exponent of the asymptotic decrease of the insecurity, measured using either purified distance or relative entropy. Our upper bound complements the earlier lower bound established by Hayashi, and the two bounds match when the rate of randomness extraction is above a critical value. Thus, for the case of high rate, we have determined the exact security exponent. Following this, we give examples and show that in the low-rate case, neither the upper bound nor the lower bound is tight in general. This exhibits a picture similar to that of the error exponent in channel coding. Lastly, we investigate the asymptotics of equivocation and its exponent under the security measure using the sandwiched Rényi divergence of order $s \in (1,2]$, which has not been addressed previously in the quantum setting.
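For reference, the standard definitions of the max-relative entropy and its smoothed version (general background, not a result of this paper) are:

```latex
D_{\max}(\rho\|\sigma) \;=\; \inf\{\lambda \in \mathbb{R} : \rho \le 2^{\lambda}\sigma\},
\qquad
D_{\max}^{\varepsilon}(\rho\|\sigma) \;=\; \min_{\tilde\rho \,:\, P(\tilde\rho,\rho)\le\varepsilon} D_{\max}(\tilde\rho\|\sigma),
```

where the minimization runs over states $\tilde\rho$ within purified distance $\varepsilon$ of $\rho$, with $P(\rho,\sigma) = \sqrt{1 - F(\rho,\sigma)^2}$ and $F$ the fidelity. The "small modification of the state" whose decay exponent the paper determines is exactly the optimizer $\tilde\rho$ in the second definition.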
Purpose
Accurate delineation of the urethra is a prerequisite for urethral dose reduction in prostate radiotherapy. However, even in magnetic resonance-guided radiation therapy (MRgRT), consistent delineation of the urethra is challenging, particularly in online adaptive radiotherapy. This paper presents a fully automatic MRgRT-based prostatic urethra segmentation framework.
Methods
Twenty-eight prostate cancer patients were included in this study. In-house 3D half-Fourier single-shot turbo spin-echo (HASTE) and turbo spin-echo (TSE) sequences were used to image the Foley-free urethra on a 0.35 T MRgRT system. The segmentation pipeline uses 3D nnU-Net as the base and innovatively combines the ground truth and its corresponding radial distance (RD) map during training supervision. Additionally, we evaluate the benefit of incorporating a convolutional long short-term memory (LSTM-Conv) layer and a spatial recurrent convolution layer (RCL) into nnU-Net. A novel slice-by-slice simple exponential smoothing (SEPS) method designed specifically for tubular structures was used to post-process the segmentation results.
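A slice-by-slice exponential-smoothing post-processing step of the kind described above can be sketched on per-slice centroids of a binary tube. This is a hedged illustration: the paper's SEPS method operates on MRgRT urethra segmentations, while the centroid-based formulation and the `alpha` value here are assumptions for demonstration.

```python
import numpy as np

def ses(series, alpha=0.3):
    """Simple exponential smoothing of a 1-D sequence along the slice axis."""
    out = np.empty(len(series), dtype=float)
    out[0] = series[0]
    for k in range(1, len(series)):
        out[k] = alpha * series[k] + (1 - alpha) * out[k - 1]
    return out

def smooth_centroids(mask):
    """mask: (slices, H, W) binary volume -> smoothed (row, col) centroids."""
    cents = np.array([np.argwhere(s).mean(axis=0) for s in mask])
    return np.column_stack([ses(cents[:, 0]), ses(cents[:, 1])])

# toy tube drifting across slices, with one outlier slice (index 3)
mask = np.zeros((6, 16, 16), dtype=bool)
cols = [5, 5, 6, 11, 6, 7]
for z, c in enumerate(cols):
    mask[z, 7:9, c:c + 2] = True
sm = smooth_centroids(mask)
```

Smoothing along the slice axis pulls the outlier slice's centroid back toward its neighbors, which is the behavior one wants for an anatomically continuous tubular structure.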
Results
The experimental results show that nnU‐Net trained using a combination of Dice, cross‐entropy and RD achieved a Dice score of 77.1 ± 2.3% in the testing dataset. With SEPS, Hausdorff distance (HD) and 95% HD were reduced to 2.95 ± 0.17 mm and 1.84 ± 0.11 mm, respectively. LSTM‐Conv and RCL layers only minimally improved the segmentation precision.
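The Dice score reported above is the standard overlap metric for binary segmentations; the minimal sketch below computes it on toy masks (the arrays are illustrative, not the paper's urethra data).

```python
import numpy as np

def dice(a, b, eps=1e-8):
    """Dice coefficient: 2|A ∩ B| / (|A| + |B|)."""
    a, b = a.astype(bool), b.astype(bool)
    return 2.0 * np.logical_and(a, b).sum() / (a.sum() + b.sum() + eps)

pred = np.zeros((8, 8), dtype=bool)
pred[2:6, 2:6] = True     # 16 predicted pixels
gt = np.zeros((8, 8), dtype=bool)
gt[3:7, 3:7] = True       # 16 ground-truth pixels, 9 overlapping
score = dice(pred, gt)    # 2 * 9 / 32
```

Unlike Dice, the Hausdorff distances also reported above are boundary metrics, which is why the SEPS post-processing (a shape-level correction) improves them while leaving the Dice score largely unchanged.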
Conclusion
We present the first Foley‐free MRgRT‐based automated urethra segmentation study. Our method is built on a data‐driven neural network with novel cost functions and a post‐processing step designed for tubular structures. The performance is consistent with the need for online and offline urethra dose reduction in prostate radiotherapy.
The ant colony optimization (ACO) algorithm is a classical swarm intelligence algorithm that is especially suitable for combinatorial optimization problems. To further improve the convergence speed without affecting solution quality, in this paper a novel strengthened pheromone update mechanism is designed that reinforces the pheromone on selected edges, which has not been done before, utilizing dynamic information to perform path optimization. In addition, to enhance the global search capability, a novel pheromone-smoothing mechanism is designed to reinitialize the pheromone matrix when the ACO algorithm's search process approaches a defined stagnation state. The improved algorithm is analyzed and tested on a set of benchmark test cases. The experimental results show that the improved ant colony optimization algorithm outperforms the compared algorithms in terms of both the diversity of the solutions obtained and the convergence speed.
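A pheromone-smoothing reinitialization of the general kind described above can be illustrated with the classic MMAS-style smoothing rule: on stagnation, every pheromone value is pulled toward the current maximum, restoring exploration while keeping a trace of the learned structure. This rule and the `delta` value are an illustrative stand-in, not the paper's novel mechanism.

```python
import numpy as np

def smooth_pheromone(tau, delta=0.5):
    """On stagnation: tau_ij <- tau_ij + delta * (tau_max - tau_ij).

    delta = 0 leaves the matrix unchanged; delta = 1 resets all entries
    to tau_max (a full restart). Intermediate values trade off retained
    knowledge against renewed exploration.
    """
    return tau + delta * (tau.max() - tau)

tau = np.array([[0.0, 4.0],
                [0.1, 2.0]])
tau_s = smooth_pheromone(tau)
```

After smoothing, low-pheromone edges become selectable again, which is what lets the colony escape the stagnation state without discarding everything it has learned.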
Objective: Electrical impedance myography (EIM) is a quantitative and objective tool for evaluating muscle status. EIM offers the possibility of replacing conventional physical functioning scores or quality-of-life measures, which depend on patient cooperation and mood. Methods: Here, we propose a functional mixed-effects model using a state-space approach to describe the response trajectories of EIM data measured on 16 boys with Duchenne muscular dystrophy (DMD) and 12 healthy controls, both groups measured over a period of two years. The modeling framework imposes a smoothing spline structure on the EIM data collected at each visit and accounts for within-subject correlations of these curves across the longitudinal measurements. The framework is recast in a state-space form, thereby allowing the employment of computationally efficient diffuse Kalman filtering and smoothing algorithms for model estimation, as well as estimates of the posterior variance-covariance matrix for the construction of Bayesian 95% confidence bands. Results: The proposed model allows us to simultaneously adjust for baseline variables, differentiate the longitudinal changes in the smooth functional response, and estimate the subject- and subject-time-specific deviations from the population-averaged response curves. The code is made publicly available in the supplementary material. Significance: The modeling approach presented will potentially enhance the capability of EIM to serve as a biomarker for testing therapeutic efficacy in DMD and other clinical trials.
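The Kalman filtering-and-smoothing machinery behind the state-space formulation above can be sketched on a scalar local-level model y_t = x_t + noise, x_t = x_{t-1} + noise. This is not the paper's full functional mixed-effects model with diffuse initialization; the variances `q` and `r` and the toy signal are illustrative.

```python
import numpy as np

def kalman_smooth(y, q=0.01, r=1.0):
    """Kalman filter + RTS smoother for a scalar local-level model."""
    n = len(y)
    xf = np.zeros(n); pf = np.zeros(n)      # filtered mean / variance
    xp = np.zeros(n); pp = np.zeros(n)      # predicted mean / variance
    x, p = y[0], 1.0
    for t in range(n):
        xp[t], pp[t] = x, p + q             # predict
        k = pp[t] / (pp[t] + r)             # Kalman gain
        x = xp[t] + k * (y[t] - xp[t])      # update with observation
        p = (1 - k) * pp[t]
        xf[t], pf[t] = x, p
    xs = xf.copy()                          # RTS backward smoothing pass
    for t in range(n - 2, -1, -1):
        g = pf[t] / pp[t + 1]
        xs[t] = xf[t] + g * (xs[t + 1] - xp[t + 1])
    return xs

rng = np.random.default_rng(3)
t = np.linspace(0, 1, 200)
truth = np.sin(2 * np.pi * t)
y = truth + rng.normal(0, 0.5, t.size)
xs = kalman_smooth(y)
```

Both passes cost O(n), which is the computational efficiency the state-space recasting buys over fitting the smoothing-spline model directly.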
This paper presents the winning submission of the M4 forecasting competition. The submission utilizes a dynamic computational graph neural network system that enables a standard exponential smoothing model to be mixed with advanced long short-term memory networks in a common framework. The result is a hybrid and hierarchical forecasting method.
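The exponential-smoothing half of the hybrid above can be sketched with a Holt-style level/trend recursion; in the winning system the smoothing parameters are learned jointly with an LSTM that models the nonlinear residual structure, which this numpy sketch omits. The `alpha`/`beta` values are illustrative.

```python
import numpy as np

def holt_forecast(y, alpha=0.5, beta=0.3, horizon=5):
    """Holt's linear exponential smoothing: recursive level and trend."""
    level, trend = y[0], y[1] - y[0]
    for obs in y[1:]:
        prev = level
        level = alpha * obs + (1 - alpha) * (level + trend)
        trend = beta * (level - prev) + (1 - beta) * trend
    # extrapolate the final level and trend over the forecast horizon
    return level + trend * np.arange(1, horizon + 1)

y = np.array([10.0, 12.0, 14.0, 16.0, 18.0])   # clean linear trend
fc = holt_forecast(y, horizon=3)               # -> [20, 22, 24]
```

On a clean linear trend the recursion recovers the trend exactly; the point of the hybrid is that the neural component takes over where such simple recursions break down.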