We studied the taxonomy of Pluteus podospileus and similar species using morphological and molecular (nrITS, TEF1-α) data, including a detailed study of the type collections of P. inflatus var. alneus, Pluteus minutissimus f. major, and P. granulatus var. tenellus. Within the P. podospileus complex, we phylogenetically confirmed six species in Europe, five in Asia, and eight in North America. Based on our results, we recognize P. seticeps as a separate species occurring in North America, while P. podospileus is limited to Eurasia. We describe six new species and a new variety: P. absconditus, P. fuscodiscus, P. gausapatus, P. inexpectatus, P. millsii, and P. notabilis and its variety, P. notabilis var. insignis. We elevate Pluteus seticeps var. cystidiosus to species rank as Pluteus cystidiosus. Based on the holotype of P. inflatus var. alneus, collections of P. inflatus identified by Velenovský, and several modern collections, we resurrect the name P. inflatus. Based on molecular analyses of syntypes of Pluteus minutissimus f. major and the holotype of Pluteus granulatus var. tenellus, we synonymize both under P. inflatus. We also expand knowledge of the morphology and distribution of P. cutefractus.
The social cost of carbon dioxide (SC-CO₂) measures the monetized value of the damages to society caused by an incremental metric tonne of CO₂ emissions and is a key metric informing climate policy. Used by governments and other decision-makers in benefit-cost analysis for over a decade, SC-CO₂ estimates draw on climate science, economics, demography and other disciplines. However, a 2017 report by the US National Academies of Sciences, Engineering, and Medicine (NASEM) highlighted that current SC-CO₂ estimates no longer reflect the latest research. The report provided a series of recommendations for improving the scientific basis, transparency and uncertainty characterization of SC-CO₂ estimates. Here we show that improved probabilistic socioeconomic projections, climate models, damage functions, and discounting methods that collectively reflect theoretically consistent valuation of risk substantially increase estimates of the SC-CO₂. Our preferred mean SC-CO₂ estimate is $185 per tonne of CO₂ ($44-$413 per tCO₂: 5%-95% range, 2020 US dollars) at a near-term risk-free discount rate of 2%, a value 3.6 times higher than the US government's current value of $51 per tCO₂. Our estimates incorporate updated scientific understanding throughout all components of SC-CO₂ estimation in the new open-source Greenhouse Gas Impact Value Estimator (GIVE) model, in a manner fully responsive to the near-term NASEM recommendations. Our higher SC-CO₂ values, compared with estimates currently used in policy evaluation, substantially increase the estimated benefits of greenhouse gas mitigation and thereby increase the expected net benefits of more stringent climate policies.
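The core valuation step described above can be sketched in a few lines: the SC-CO₂ is the present value of the stream of marginal damages caused by one extra tonne of CO₂. The damage numbers and horizon below are purely hypothetical; the actual GIVE model combines probabilistic socioeconomic, climate, and damage modules rather than a fixed stream.

```python
def social_cost(damages, rate):
    """Discount a stream of annual marginal damages (USD per tCO2,
    year 0 first) to a present value at a constant discount rate."""
    return sum(d / (1.0 + rate) ** t for t, d in enumerate(damages))

# Hypothetical flat $5/year damage stream over 60 years, discounted at
# the 2% near-term rate mentioned in the abstract.
stream = [5.0] * 60
print(round(social_cost(stream, 0.02), 2))
```

Lowering the discount rate raises the present value of far-future damages, which is one reason discounting assumptions move SC-CO₂ estimates so strongly.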
Starving cells of Dictyostelium discoideum undergo a developmental cycle where cAMP is autocatalytically produced and relayed from cell to cell, resulting in the propagation of excitation waves over a spatially extended population. Later on, the homogeneous cell layer transforms into a pattern of cell streams directed perpendicular to the cAMP waves. Here we chemically influence aggregation-competent cells with isopropylideneadenosine (IPA), an adenosine derivative. It can be assumed that IPA acts via specific adenosine binding sites localized in the cellular membrane. We find, however, that pattern formation and cellular aggregation under the influence of IPA differ considerably from experiments with adenosine. In particular, our observations point towards an inhibitory effect on adenylate cyclase (ACA), the key enzyme in the autocatalytic production of cAMP inside the cell. Our results suggest the existence of a direct coupling (via an intracellular effect) or indirect coupling (via inhibition of cAMP binding) of the specific adenosine receptors to the regulatory circuit that controls intra- and extracellular cAMP concentrations.
Statistical Simulations on Parallel Computers
Sevcikova, Hana
Journal of Computational and Graphical Statistics, December 2004, Volume 13, Issue 4
Journal Article
Peer-reviewed
Open access
The potential benefits of parallel computing for time-consuming statistical applications are well known, but have not been widely realized in practice, perhaps in part due to associated technical obstacles. This article develops a simple framework for programming statistical simulations using parallel processing, which does not require changing programming language or forgoing the use of standard statistical libraries. The basic idea of using parallel computing for statistical simulation studies is straightforward in principle, and is based on the standard master-slave model. However, there are several technical obstacles that can make it difficult to implement in practice. These include: nonreproducibility of results due to variations in the distribution of random numbers among processes, creation of excessive numbers of slaves, proliferation of slaves with very short lifetimes, and slaves destroyed due to hardware failures. This article proposes solutions for each of these difficulties, and together these solutions constitute an overall parallel computing framework for statistical simulation studies.
In an experiment with 15 processors, the methods detailed here led to increases in speed by factors that can actually exceed the maximum expected factor of 15, due to the efficiencies of the proposed problem decomposition methods. Different gains may be achieved with different strategies, depending on the problem decomposition used and the heterogeneity of the processors. Fault tolerance is an important feature of the framework. In an experiment with faults, a non-fault-tolerant version of our method ran almost twice as long yet produced no results, while the fault-tolerant method dealt efficiently with the faults.
We conclude that parallel computing can greatly improve the efficiency of statistical computation without greatly increasing programming complexity, and that it deserves wider investigation for such applications. Software to implement the proposed framework in R is available from http://www.stat.washington.edu/hana.
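The nonreproducibility problem named in the abstract (results depending on how random numbers are distributed among processes) has a standard remedy: seed the random stream by task, not by worker. A minimal Python sketch of that idea, not the authors' R framework:

```python
import random

def run_replication(task_id, base_seed="sim-base"):
    # Tie the random stream to the task ID rather than to the worker
    # process, so the result of each replication is identical no matter
    # which slave runs it, or in what order.
    rng = random.Random(f"{base_seed}-{task_id}")
    sample = [rng.gauss(0.0, 1.0) for _ in range(100)]
    return sum(sample) / len(sample)  # the statistic of interest

# Same results whether tasks run in order or scattered across workers.
in_order = [run_replication(i) for i in range(8)]
scattered = [run_replication(i) for i in reversed(range(8))][::-1]
assert in_order == scattered
```

The same per-task seeding also makes recovery from a crashed slave trivial: the master simply reissues the failed task ID, and the rerun reproduces exactly the stream the lost slave would have used.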
402 subjects with diabetes mellitus were vaccinated out of the total of 34,000 vaccinees immunized during the study period of 9 and a half months. Altogether 229 diabetic patients (56.97%) were vaccinated against tick-borne encephalitis (TBE) and 74 (18.4%) against viral hepatitis (41 types A+B, 30 type A, 3 type B). The average age for the four most commonly administered vaccines (FSME IMMUN 0.5 ML, Twinrix Adult, Typhim Vi, and Havrix 1440) was 65, 52, 56, and 54 years, respectively. Live attenuated vaccines were given to 6 patients with diabetes (1.49%): 5 travellers to endemic countries received the yellow fever vaccine Stamaril (1 female, 4 male) and one male patient the varicella vaccine Varilrix. Among the least common vaccines in diabetic patients were those against invasive pneumococcal and meningococcal infections. Not a single unexpected side effect was observed following vaccination in any diabetic patient. Based on the results of this retrospective study, we can conclude that vaccination in diabetic patients is free of any risk, provided that there are no other contraindications, e.g. allergy to vaccine components or severe acute febrile illness. In the case of unstable glycaemia and a significantly impaired immune system due to diabetes mellitus, vaccination with live attenuated vaccines should be carefully considered and weighed against the risks of exposure to each specific infectious agent. There is no reason to fear vaccination in diabetic patients provided that general contraindications are respected. On the contrary, this risk group can benefit from vaccination even more, since it may have some life-saving potential.
Background: The antiplatelet effect of acetylsalicylic acid (ASA) varies among individual patients. We assessed the short-term reproducibility (STR) and long-term reproducibility (LTR) of light transmission aggregometry (LTA). Methods: Residual platelet reactivity was measured twice using LTA in a group of 207 consecutive patients (56 females, mean age 67 ± 9 years) on ASA therapy, at an interval of 10 ± 6 months. The STR was assessed in 15 patients (6 females, mean age 61 ± 7 years) with 10 measurements over 2 consecutive days. Results: There was no correlation between the two measurements in the long-term part of the study, and the Bland-Altman plot also showed a diverging pattern. However, LTA STR was good, with a correlation coefficient of .800 (P < .05), confirmed by the Bland-Altman plot. Conclusions: Although the short-term intraindividual reproducibility of LTA assessment of platelet reactivity is very good, in the long-term perspective the antiplatelet effectiveness of ASA may be influenced by additional variables, and repeated measurements are warranted.
The link between age and migration propensity is long established, but existing models of country-level net migration ignore the effect of population age distribution on past and projected migration rates. We propose a method to estimate and forecast international net migration rates for the 200 most populous countries, taking account of changes in population age structure. We use age-standardized estimates of country-level net migration rates and in-migration rates over quinquennial periods from 1990 through 2020 to decompose past net migration rates into in-migration rates and out-migration rates. We then recalculate historic migration rates on a scale that removes the influence of the population age distribution. This is done by scaling past and projected migration rates in terms of a reference population and period. We show that this can be done very simply, using a quantity we call the migration age structure index (MASI). We use a Bayesian hierarchical model to generate joint probabilistic forecasts of total and age- and sex-specific net migration rates over five-year periods for all countries from 2020 through 2100. We find that accounting for population age structure in historic and forecast net migration rates leads to narrower prediction intervals by the end of the century for most countries. Also, applying a Rogers & Castro-like migration age schedule to migration outflows reduces uncertainty in population pyramid forecasts. Finally, accounting for population age structure leads to less out-migration among countries with rapidly aging populations that are forecast to contract most rapidly by the end of the century. This leads to less drastic population declines than are forecast without accounting for population age structure.
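The paper's MASI is not reproduced here, but the underlying idea of removing age-structure effects from a crude rate is the classical direct-standardization step, which can be sketched as follows (all rates and population shares below are invented for illustration):

```python
def age_standardized_rate(age_rates, ref_shares):
    """Direct standardization: weight age-specific rates by a fixed
    reference age distribution (shares must sum to 1)."""
    assert abs(sum(ref_shares) - 1.0) < 1e-9
    return sum(r * w for r, w in zip(age_rates, ref_shares))

def crude_rate(age_rates, pop_shares):
    """Crude rate: the same rates weighted by the country's own pyramid."""
    return sum(r * w for r, w in zip(age_rates, pop_shares))

# Two countries with identical age-specific out-migration rates but
# different age pyramids (age groups: 15-29, 30-49, 50+).
rates = [0.020, 0.010, 0.002]
young_pyramid = [0.50, 0.35, 0.15]   # youthful population
old_pyramid = [0.20, 0.35, 0.45]     # rapidly aging population
reference = [0.30, 0.40, 0.30]

# Crude rates differ purely because of age structure, while the
# age-standardized rate is the same for both countries.
print(crude_rate(rates, young_pyramid),
      crude_rate(rates, old_pyramid),
      age_standardized_rate(rates, reference))
```

This illustrates why ignoring age structure overstates future out-migration from aging countries: as the pyramid shifts toward older, low-propensity ages, the crude rate falls even when age-specific behaviour is unchanged.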
A planar wave propagating in the Belousov–Zhabotinsky reaction medium with pyrogallol as substrate is shown to undergo multiple reversals upon switching the polarity of an imposed dc electric field. During the multiple reversals an asymmetry arises in: (i) the propagation velocities of reversed waves, (ii) the ferroin concentrations in front of reversed waves shortly before switching the electric field polarity, and (iii) the location at which a new wave emerges in the wake of the original one. Multiple reversals occur in a limited range of control parameter values and depend on the pyrogallol concentration.
A crucial step in all gradient-based algorithms for calculating the nonparametric maximum likelihood estimator in mixture models is the global maximization of the gradient function. For example, in mixtures of exponentials, the methods usually proposed fail. Based on a discretization which is adapted to the data points, a method for maximizing the gradient is suggested. The method is implemented in different gradient-based algorithms; a comparison shows that on mixtures of exponentials, the ISDM algorithm introduced by Lesperance and Kalbfleisch is much faster than its competitors.
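The gradient function in question has a standard form: for a candidate mixing distribution G, D_G(θ) = Σᵢ f(xᵢ; θ)/f_G(xᵢ) − n, and the NPMLE is characterized by sup_θ D_G(θ) = 0. A minimal Python sketch for exponential components, using a naive grid of candidate rates 1/xᵢ adapted to the data points (a simplification for illustration, not the discretization scheme of the abstract):

```python
import math

def exp_pdf(x, rate):
    """Density of the exponential distribution with the given rate."""
    return rate * math.exp(-rate * x)

def gradient(data, support, weights, rate):
    """D_G(rate) = sum_i f(x_i; rate) / f_G(x_i) - n, where the current
    mixing distribution G puts the given weights on the support rates."""
    def mix(x):
        return sum(w * exp_pdf(x, r) for r, w in zip(support, weights))
    return sum(exp_pdf(x, rate) / mix(x) for x in data) - len(data)

data = [0.5, 1.2, 0.8, 3.0, 2.2]
# Current candidate G: a single exponential at the one-component MLE.
mle_rate = len(data) / sum(data)
# Data-adapted candidate grid: one rate per observed point.
grid = sorted(1.0 / x for x in data)
best = max(grid, key=lambda r: gradient(data, [mle_rate], [1.0], r))
# At the one-component MLE itself the gradient is exactly zero; a grid
# point with a positive gradient signals that adding it to the support
# would increase the likelihood.
print(best, gradient(data, [mle_rate], [1.0], best))
```

Gradient-based algorithms such as ISDM repeatedly add the maximizing θ to the support and reweight, so a reliable global maximization of D_G over θ is exactly the step the abstract is about.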
In two-component mixtures of exponential distributions, different strategies for starting the likelihood maximization algorithm converge to different types of maxima. The power of an LR test of homogeneity against such a mixture strongly depends on the chosen strategy, and global maximization need not result in the largest power. An explanation is given on the basis of a systematic investigation of the likelihood function in a large number of simulations, using a variety of diagnostic tools. Thereby, we also gain a deeper insight into the properties of the samples that generate particular types of solutions of the likelihood equation. In particular, "spurious solutions" often occur; these are mainly responsible for the fact that global maximization may not result in a statistically meaningful estimator. Removing the smallest elements of a sample may drastically increase the power of previously inferior strategies.