We present a generalization of Hilfer derivatives in which Riemann–Liouville integrals are replaced by more general Prabhakar integrals. We analyze and discuss its properties. Furthermore, we show some applications of these generalized Hilfer–Prabhakar derivatives in classical equations of mathematical physics, such as the heat and the free electron laser equations, and in difference–differential equations governing the dynamics of generalized renewal stochastic processes.
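For context, the Prabhakar integral replaces the power-law kernel of the Riemann–Liouville integral with a kernel built from the three-parameter Mittag–Leffler function; a standard form (the paper's notation may differ) is:

```latex
% Three-parameter (Prabhakar) Mittag--Leffler function
E_{\rho,\mu}^{\gamma}(z)
  = \sum_{k=0}^{\infty}
    \frac{\Gamma(\gamma+k)}{\Gamma(\gamma)\,k!\,\Gamma(\rho k+\mu)}\, z^{k},
  \qquad \operatorname{Re}(\rho) > 0.

% Prabhakar integral of f, generalizing the Riemann--Liouville integral
\bigl(\mathbf{E}_{\rho,\mu,\omega,0^{+}}^{\gamma} f\bigr)(t)
  = \int_{0}^{t} (t-s)^{\mu-1}
    E_{\rho,\mu}^{\gamma}\!\bigl(\omega (t-s)^{\rho}\bigr)\, f(s)\,ds .
```

For $\gamma = 0$ the kernel reduces to $(t-s)^{\mu-1}/\Gamma(\mu)$, recovering the Riemann–Liouville integral, which is why the construction above is a genuine generalization.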
We present a model of trade, global innovation, and diffusion, inspired by Eaton and Kortum (1999). The specific structure for innovation and diffusion we propose, which leverages general results developed in our previous work (Lind and Ramondo, 2023a), allows us to measure the flow of ideas across countries and over time. By deriving tractable expressions for productivity and expenditure, we can use easily available international trade data to estimate both innovation and diffusion rates across countries and over time. We find that, although innovation is correlated with economic growth, many high-income countries primarily produce using ideas diffused from foreign sources.
Spectrum sharing between wireless networks improves the efficiency of spectrum usage, and thereby alleviates spectrum scarcity due to growing demands for wireless broadband access. To improve the usual underutilization of the cellular uplink spectrum, this paper addresses spectrum sharing between a cellular uplink and a mobile ad hoc network. These networks access either all frequency subchannels or their disjoint subsets, called spectrum underlay and spectrum overlay, respectively. Given these spectrum sharing methods, the capacity trade-off between the coexisting networks is analyzed based on the transmission capacity of a network with Poisson distributed transmitters. This metric is defined as the maximum density of transmitters subject to an outage constraint for a given signal-to-interference ratio (SIR). Using tools from stochastic geometry, the transmission-capacity trade-off between the coexisting networks is analyzed, where both spectrum overlay and underlay as well as successive interference cancellation (SIC) are considered. In particular, for small target outage probability, the transmission capacities of the coexisting networks are proved to satisfy a linear equation, whose coefficients depend on the spectrum sharing method and on whether SIC is applied. This linear equation shows that spectrum overlay is more efficient than spectrum underlay. Furthermore, this result also provides insight into the effects of network parameters on transmission capacities, including link diversity gains, transmission distances, and the base station density. In particular, SIC is shown to increase the transmission capacities of both coexisting networks by a linear factor, which depends on the interference-power threshold for qualifying canceled interferers.
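The outage constraint underlying transmission capacity can be illustrated with a small Monte-Carlo sketch: place interferers as a Poisson point process, compute the SIR of a typical link, and count how often it falls below the target. This is not the paper's analysis, only an illustration of the metric; all parameter values (path-loss exponent, link distance, region size) are arbitrary assumptions.

```python
import math
import random

def _poisson(rng, lam):
    # Knuth's method for sampling a Poisson count (adequate for moderate lam)
    limit, k, p = math.exp(-lam), 0, 1.0
    while True:
        p *= rng.random()
        if p <= limit:
            return k
        k += 1

def outage_prob(density, sir_threshold_db=3.0, tx_dist=1.0, alpha=4.0,
                region_radius=20.0, trials=2000, seed=0):
    """Monte-Carlo outage probability for a typical link of length tx_dist
    surrounded by a Poisson field of interferers with path-loss exponent
    alpha. Interference-limited (no noise); purely illustrative."""
    rng = random.Random(seed)
    theta = 10 ** (sir_threshold_db / 10)
    signal = tx_dist ** (-alpha)
    area = math.pi * region_radius ** 2
    outages = 0
    for _ in range(trials):
        n = _poisson(rng, density * area)
        interference = 0.0
        for _ in range(n):
            # uniform point in the disk: radius via inverse-CDF sampling
            r = region_radius * math.sqrt(rng.random())
            interference += max(r, 1e-3) ** (-alpha)
        if interference > 0 and signal / interference < theta:
            outages += 1
    return outages / trials

p_sparse = outage_prob(0.001)
p_dense = outage_prob(0.05)
```

The outage probability grows with transmitter density, which is exactly the tension the transmission-capacity metric formalizes: the maximum density at which outage stays below the target.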
Wavelet shrinkage estimation has received considerable attention as a non-parametric way to estimate stochastic processes such as non-homogeneous Poisson processes, and has been applied to software reliability estimation/prediction. However, it lacks the ability to predict unknown future patterns over the long term, which hampers software reliability assessment in practice. In this paper, we focus on the long-term prediction of the number of software faults detected in the testing phase and propose several novel long-term prediction methods based on wavelet shrinkage estimation. The fundamental idea is to adopt both the denoised fault-count data and prediction values, and to minimize several kinds of loss functions to make effective predictions. We also develop an automated wavelet-based software reliability assessment tool, W-SRAT2, a drastic extension of the existing tool, W-SRAT, obtained by adding these prediction algorithms. In numerical experiments with data from six actual software development projects, we investigate the predictive performance of our long-term prediction approaches, which comprise 2,640 combinations, and compare them with common software reliability growth models fitted by maximum likelihood estimation. We show that our wavelet shrinkage estimation/prediction methods outperform the existing software reliability growth models.
•Propose wavelet-based long-term prediction methods for NHPP-based SRMs.
•Provide a total of 2,640 prediction methods by combining different wavelet methods.
•Our wavelet prediction methods outperform parametric models in experiments.
•Develop an automated wavelet-based software reliability assessment tool.
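The denoising idea behind wavelet shrinkage can be sketched in a few lines: transform the data, soft-threshold the detail coefficients, and invert. This is a generic one-level Haar/soft-threshold sketch, not the W-SRAT2 implementation, and the threshold value is an arbitrary assumption.

```python
import math

def haar_forward(x):
    """One level of the orthonormal Haar transform; len(x) must be even."""
    s = 1 / math.sqrt(2)
    approx = [s * (x[i] + x[i + 1]) for i in range(0, len(x), 2)]
    detail = [s * (x[i] - x[i + 1]) for i in range(0, len(x), 2)]
    return approx, detail

def haar_inverse(approx, detail):
    """Inverse of haar_forward: perfect reconstruction when detail is kept."""
    s = 1 / math.sqrt(2)
    x = []
    for a, d in zip(approx, detail):
        x.extend([s * (a + d), s * (a - d)])
    return x

def soft_threshold(coeffs, lam):
    """Donoho-style soft thresholding: shrink toward zero by lam."""
    return [math.copysign(max(abs(c) - lam, 0.0), c) for c in coeffs]

def denoise(x, lam):
    """Suppress small detail coefficients, keeping the smooth trend."""
    approx, detail = haar_forward(x)
    return haar_inverse(approx, soft_threshold(detail, lam))
```

With `lam = 0` the signal is reconstructed exactly; as `lam` grows, pairwise fluctuations are flattened toward local averages, which is the smoothing that makes denoised fault-count data usable for trend prediction.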
This paper presents a spatial and temporal model of electric vehicle charging demand for a rapid charging station located near a highway exit. Most previous studies have assumed a fixed charging location and a fixed charging time during off-peak hours when anticipating electric vehicle charging demand. Other studies have been based on limited charging scenarios at typical locations rather than on a mathematical model. From a distribution system perspective, therefore, electric vehicle charging demand remains an unidentified quantity that may vary over space and time. In this context, this study proposes a mathematical model of electric vehicle charging demand at a rapid charging station, based on the fluid dynamic traffic model and M/M/s queueing theory. First, the arrival rate of discharged vehicles at a charging station is predicted by the fluid dynamic model. Then, charging demand is forecast by M/M/s queueing theory using the arrival rate of discharged vehicles. This mathematical model of charging demand may allow distribution system planners to anticipate the charging demand profile at a charging station. A numerical example shows that the proposed model captures the spatial and temporal dynamics of charging demand at a highway charging station.
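The M/M/s step can be illustrated with the standard Erlang C formula: given the arrival rate of discharged vehicles (from the fluid model) and the charger service rate, it yields the probability an arriving vehicle must wait and the mean waiting time. This is a textbook M/M/s sketch, not the paper's code; the example rates and charger count are hypothetical.

```python
import math

def erlang_c(arrival_rate, service_rate, servers):
    """Erlang C probability that an arriving vehicle must wait in an
    M/M/s queue; requires utilization rho = lam / (s * mu) < 1."""
    a = arrival_rate / service_rate          # offered load in Erlangs
    rho = a / servers
    if rho >= 1:
        raise ValueError("queue is unstable: rho >= 1")
    summ = sum(a ** k / math.factorial(k) for k in range(servers))
    top = a ** servers / math.factorial(servers) / (1 - rho)
    return top / (summ + top)

def mean_wait(arrival_rate, service_rate, servers):
    """Mean waiting time in queue (Wq) for the M/M/s model."""
    c = erlang_c(arrival_rate, service_rate, servers)
    return c / (servers * service_rate - arrival_rate)

# hypothetical numbers: 10 vehicles/hour arriving, 15-minute charges
# (service rate 4/hour), 4 rapid chargers
p_wait = erlang_c(10.0, 4.0, 4)
wq_hours = mean_wait(10.0, 4.0, 4)
```

Feeding a time-varying arrival rate through these formulas, interval by interval, is one simple way to turn predicted arrivals into a charging demand profile over the day.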
We consider cluster size data of SARS-CoV-2 transmissions for a number of different settings, drawn from recently published data. The statistical characteristics of superspreading events are commonly described by fitting a negative binomial distribution to secondary infection and cluster size data as a longer-tailed alternative to the Poisson distribution, with emphasis given to the value of the extra parameter, which allows the variance to exceed the mean. Here we investigate whether other long-tailed distributions arising from more general extended Poisson process modelling can better describe the distribution of cluster sizes for SARS-CoV-2 transmissions.
We use the extended Poisson process modelling (EPPM) approach with nested sets of models that include the Poisson and negative binomial distributions to assess the adequacy of models based on these standard distributions for the data considered.
We confirm the inadequacy of the Poisson distribution in most cases, and demonstrate the inadequacy of the negative binomial distribution in some cases.
The probability of a superspreading event may be underestimated by use of the negative binomial distribution, since EPPM distributions indicate much larger tail probabilities than their negative binomial alternatives. Among the settings considered, we show that large shared accommodation, meal, and work settings have the potential for more severe superspreading events than a negative binomial distribution would predict. Public health efforts to prevent transmission in such settings should therefore be prioritised.
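The tail comparison at the heart of this argument can be sketched directly: moment-match a negative binomial to overdispersed cluster data and compare its upper-tail probability with that of a Poisson of the same mean. This is an illustration of why tail choice matters, not the EPPM fitting procedure itself; the mean/variance values are hypothetical.

```python
import math

def poisson_tail(k, mean):
    """P(X >= k) for a Poisson(mean) random variable."""
    pmf, total = math.exp(-mean), 0.0
    for i in range(k):
        total += pmf
        pmf *= mean / (i + 1)
    return max(0.0, 1.0 - total)

def nb_tail(k, mean, var):
    """P(X >= k) for a negative binomial moment-matched to (mean, var)."""
    assert var > mean, "negative binomial requires overdispersion"
    p = mean / var                    # success probability
    r = mean * mean / (var - mean)    # shape ("size") parameter
    total = 0.0
    for i in range(k):
        # log pmf: Gamma(i+r) / (Gamma(r) i!) * p^r * (1-p)^i
        logpmf = (math.lgamma(i + r) - math.lgamma(r) - math.lgamma(i + 1)
                  + r * math.log(p) + i * math.log(1 - p))
        total += math.exp(logpmf)
    return max(0.0, 1.0 - total)

# hypothetical cluster-size moments: mean 2 secondary cases, variance 10
p_pois = poisson_tail(20, 2.0)
p_nb = nb_tail(20, 2.0, 10.0)
```

Even between these two standard models the probability of a 20-case cluster differs by many orders of magnitude; the EPPM distributions considered in the paper push the tail out further still, which is why the negative binomial can understate superspreading risk.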
An accurate forecast of a clinical trial enrollment timeline at the planning phase is of great importance to both corporate strategic planning and trial operational excellence. The naive approach ...often calculates an average enrollment rate from historical data and generates an inaccurate prediction based on a linear trend with the average rate. Under the traditional framework of a Poisson–Gamma model, site activation delays are often modeled with either fixed initiation time or a simple random distribution while incorporating the user‐provided site planning information to achieve good forecast accuracy. However, such user‐provided information is not available at the early portfolio planning stage. We present a novel statistical approach based on generalized linear mixed‐effects models and the use of non‐homogeneous Poisson processes through the Bayesian framework to model the country initiation, site activation, and subject enrollment sequentially in a systematic fashion. We validate the performance of our proposed enrollment modeling framework based on a set of 25 preselected studies from four therapeutic areas. Our modeling framework shows a substantial improvement in prediction accuracy in comparison to the traditional statistical approach. Furthermore, we show that our modeling and simulation approach calibrates the data variability appropriately and gives correct coverage rates for prediction intervals of various nominal levels. Finally, we demonstrate the use of our approach to generate the predicted enrollment curves through time with confidence bands overlaid.
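The non-homogeneous Poisson process at the core of this framework can be simulated with standard Lewis–Shedler thinning, which is one simple way to generate predicted enrollment curves once an intensity has been estimated. The ramp-up intensity below is a hypothetical stand-in for the fitted country/site-activation structure, not the authors' model.

```python
import math
import random

def sample_nhpp(rate_fn, rate_max, horizon, seed=0):
    """Lewis-Shedler thinning: sample event times of a non-homogeneous
    Poisson process with intensity rate_fn(t) <= rate_max on [0, horizon].
    Candidates from a homogeneous process at rate_max are accepted with
    probability rate_fn(t) / rate_max."""
    rng = random.Random(seed)
    t, events = 0.0, []
    while True:
        t += rng.expovariate(rate_max)
        if t > horizon:
            return events
        if rng.random() < rate_fn(t) / rate_max:
            events.append(t)

# hypothetical intensity: enrollment ramps up as sites activate, then
# plateaus at 5 subjects per month over a 12-month horizon
rate = lambda t: 5.0 * (1 - math.exp(-t / 3.0))
times = sample_nhpp(rate, 5.0, horizon=12.0)
```

Repeating the simulation many times and taking quantiles of the cumulative counts at each time point yields prediction bands of the kind described above.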