The Vuong test for non-nested models is being widely used as a test of zero-inflation. We show that such use is erroneous and argue that it stems from a misunderstanding of the term “non-nested model”. We suggest other approaches for detecting zero-inflation.
• The “Vuong test for non-nested models” is being misused to test zero-inflation.
• This is due to a misunderstanding of the term “non-nested model”.
• It is also due to the mathematically advanced nature of the test's prerequisites.
• The hypotheses of the test are misunderstood.
• Allowing negative zero-inflation parameters is a promising alternative.
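In the spirit of the last highlight, here is a minimal sketch (Python, using statsmodels; the simulated data and all variable names are illustrative assumptions, not the paper's code): rather than applying a Vuong test, one can fit the Poisson and zero-inflated Poisson models directly and compare them with a likelihood-ratio statistic, remembering that under the null of no inflation the zero-inflation parameter sits on the boundary of the parameter space, so the usual chi-squared(1) p-value is conservative.

```python
# Illustrative sketch: comparing Poisson vs. zero-inflated Poisson fits
# instead of applying the Vuong test. Data are simulated for illustration.
import numpy as np
import statsmodels.api as sm
from statsmodels.discrete.count_model import ZeroInflatedPoisson
from scipy import stats

rng = np.random.default_rng(42)
n = 500
x = rng.normal(size=n)
X = sm.add_constant(x)
lam = np.exp(0.3 + 0.5 * x)            # Poisson mean
zero = rng.random(n) < 0.25            # 25% structural zeros
y = np.where(zero, 0, rng.poisson(lam))

pois = sm.Poisson(y, X).fit(disp=0)
zip_ = ZeroInflatedPoisson(y, X, exog_infl=np.ones((n, 1))).fit(disp=0)

# Under H0 (no inflation) the inflation parameter lies on the boundary,
# so referring lr to chi2(1) is conservative (the exact reference is a
# 50:50 mixture of a point mass at 0 and chi2(1)).
lr = 2 * (zip_.llf - pois.llf)
print("LR =", lr, "conservative p =", stats.chi2.sf(lr, df=1))
print("AIC: Poisson", pois.aic, " ZIP", zip_.aic)
```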
• Dimension-reduction method proposed for efficiency estimation.
• Method substantially reduces estimation error in many cases.
• Computational burden is reduced.
• Convexity and returns-to-scale are preserved.
• Useful not only for estimation, but also for hypothesis testing.
It is well known that the convergence rates of nonparametric efficiency estimators (e.g., free-disposal hull and data envelopment analysis estimators) become slower as the number of input and output quantities (i.e., the dimensionality) increases. Dimension reduction is often employed in nonparametric density estimation and regression, where similar problems occur, but has been used in only a few instances in the context of efficiency estimation. This paper explains why the problem occurs in nonparametric models of production and proposes three diagnostics for when dimension reduction might lead to more accurate estimation of efficiency. Simulation results provide additional insight and suggest that in many cases dimension reduction is advantageous in terms of reducing estimation error. The simulation results also suggest that when dimensionality is reduced, free-disposal hull estimators become an attractive, viable alternative to the more frequently used (and more restrictive) data envelopment analysis estimators. In the context of efficiency estimation, these results provide the first quantification of the tradeoff between the information lost and the improvement in estimation error due to dimension reduction. Results from several papers in the literature are revisited to show what might be gained from reducing dimensionality and how interpretations might differ.
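A minimal numerical sketch of the idea (Python; the data, the eigenvector-based aggregation, and all names are illustrative assumptions, not the paper's method verbatim): replace the p input columns by their projection onto the leading eigenvector of X'X, then compute input-oriented free-disposal hull (FDH) efficiencies in the reduced space. For strictly positive data the leading eigenvector can be signed to have positive loadings, so the projection stays positive and the ratio-based FDH estimator remains well defined.

```python
# Illustrative sketch: FDH efficiency before/after reducing input dimension.
import numpy as np

def fdh_input_efficiency(X, Y):
    """Input-oriented FDH efficiency for each of n units.
    X: (n, p) inputs, Y: (n, q) outputs, all strictly positive.
    theta[k] <= 1, with 1 meaning the unit is FDH-efficient."""
    n = X.shape[0]
    theta = np.empty(n)
    for k in range(n):
        dom = np.all(Y >= Y[k], axis=1)           # units dominating k in outputs
        theta[k] = np.max(X[dom] / X[k], axis=1).min()
    return theta

def reduce_inputs(X):
    """Project inputs onto the leading eigenvector of X'X; with positive
    data its loadings can be signed positive, keeping ratios well defined."""
    _, vecs = np.linalg.eigh(X.T @ X)
    w = vecs[:, -1]                                # eigenvector, largest eigenvalue
    w = w if w.sum() > 0 else -w                   # fix sign so loadings are positive
    return (X @ w)[:, None]

rng = np.random.default_rng(1)
X = np.exp(rng.normal(size=(100, 4)))              # 4 positive inputs
Y = np.exp(rng.normal(size=(100, 1)))              # 1 positive output
print(fdh_input_efficiency(X, Y).mean())                  # full dimension
print(fdh_input_efficiency(reduce_inputs(X), Y).mean())   # reduced to 1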
Both renewable and nuclear energy can provide significant contributions to decarbonizing the electric sector. However, a grid employing large amounts of wind and solar energy requires the balance of the system to be highly flexible to respond to the increased variability of the net load. This makes deployment of conventional nuclear power challenging, owing both to the technical difficulties of plant cycling and to the economic penalty of a reduced capacity factor. In the United States, nuclear power plants generally provide constant, baseload power and are most economic when operated at constant power levels. Operating nuclear power plants in load-following modes decreases the plants' annual energy output and increases the levelized cost of energy, reducing economic competitiveness.
One possible solution is to couple thermal energy storage to nuclear power plants. This would enable the reactor to remain at nearly constant output, while cycling the electrical generator in response to the variability of the net load. This paper conceptually explores combinations of wind, solar, and nuclear that can provide a large fraction of a system's electricity, assuming the use of thermal energy storage that would allow nuclear power to provide load following and cycling duty while operating at a constant reactor power output.
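A toy energy-balance sketch of the concept (Python; all parameter names and numbers are hypothetical, and conversion efficiencies, losses, and ramp limits are ignored): the reactor runs at constant output, storage absorbs hours when net load falls below that output, and discharges, up to power and energy limits, when net load exceeds it.

```python
# Toy sketch: constant-output reactor + thermal storage following net load.
# Hypothetical parameters; efficiencies, losses, and ramp limits are ignored.
import numpy as np

def dispatch(net_load, p_reactor, e_max, p_extra_max):
    """net_load: hourly demand minus wind+solar (MW);
    p_reactor: constant reactor output (MW);
    e_max: storage energy capacity (MWh);
    p_extra_max: maximum extra discharge power (MW).
    Returns power delivered each hour and the storage state of charge."""
    soc, delivered, state = 0.0, [], []
    for load in net_load:
        if load <= p_reactor:                        # surplus hour: charge
            soc = min(e_max, soc + (p_reactor - load))
            delivered.append(load)
        else:                                        # deficit hour: discharge
            draw = min(load - p_reactor, p_extra_max, soc)
            soc -= draw
            delivered.append(p_reactor + draw)
        state.append(soc)
    return np.array(delivered), np.array(state)

hours = np.arange(48)
net_load = 800 + 300 * np.sin(2 * np.pi * hours / 24)   # synthetic net load
delivered, soc = dispatch(net_load, p_reactor=800, e_max=2000, p_extra_max=300)
print(f"unserved energy: {np.sum(net_load - delivered):.0f} MWh")
```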
► Both renewable and nuclear energy have been proposed to decarbonize the electric sector.
► Deploying large amounts of wind and solar energy requires the balance of the grid to be highly flexible.
► Current reactor designs have technical and economic challenges in providing load-following power.
► Coupling thermal energy storage to nuclear power plants could improve their flexibility.
► Renewables and flexible nuclear power could together significantly decarbonize the electric sector.
Research on the human microbiome has established that commensal and pathogenic bacteria can influence obesity, cancer, and autoimmunity through mechanisms mostly unknown. We found that a component of bacterial biofilms, the amyloid protein curli, irreversibly formed fibers with bacterial DNA during biofilm formation. This interaction accelerated amyloid polymerization and created potent immunogenic complexes that activated immune cells, including dendritic cells, to produce cytokines such as type I interferons, which are pathogenic in systemic lupus erythematosus (SLE). When given systemically, curli-DNA composites triggered immune activation and production of autoantibodies in lupus-prone and wild-type mice. We also found that the infection of lupus-prone mice with curli-producing bacteria triggered higher autoantibody titers compared to curli-deficient bacteria. These data provide a mechanism by which the microbiome and biofilm-producing enteric infections may contribute to the progression of SLE and point to a potential molecular target for treatment of autoimmunity.
• Bacterial amyloid curli and DNA composites form within bacterial biofilms
• DNA accelerates the polymerization of bacterial amyloid curli
• Curli-DNA composites induce autoantibodies and type I interferon
• Infections with amyloid-expressing bacteria trigger autoimmunity
Biofilms, multicellular bacterial communities, are associated with numerous infections including UTIs, rhinosinusitis, and periodontal disease. Tükel and colleagues show that bacterial amyloids and eDNA, components of biofilms, form immunogenic complexes that accelerate the progression of an autoimmune disease, SLE, via the generation of autoantibodies and type I interferon response.
Objectives: Social prescribing is a way of linking patients in primary care with sources of support within the community to help improve their health and well-being. Social prescribing programmes are being widely promoted and adopted in the UK National Health Service, and so we conducted a systematic review to assess the evidence for their effectiveness.
Setting/data sources: Nine databases were searched from 2000 to January 2016 for studies conducted in the UK. Relevant reports and guidelines, websites and reference lists of retrieved articles were scanned to identify additional studies. All the searches were restricted to English language only.
Participants: Systematic reviews and any published evaluation of programmes where patient referral was made from a primary care setting to a link worker or facilitator of social prescribing were eligible for inclusion. Risk-of-bias assessment for included studies was undertaken independently by two reviewers, and a narrative synthesis was performed.
Primary and secondary outcome measures: Primary outcomes of interest were any measures of health and well-being and/or usage of health services.
Results: We included a total of 15 evaluations of social prescribing programmes. Most were small scale and limited by poor design and reporting. All were rated as having a high risk of bias. Common design issues included a lack of comparative controls, short follow-up durations, a lack of standardised and validated measuring tools, missing data and a failure to consider potential confounding factors. Despite clear methodological shortcomings, most evaluations presented positive conclusions.
Conclusions: Social prescribing is being widely advocated and implemented but current evidence fails to provide sufficient detail to judge either success or value for money. If social prescribing is to realise its potential, future evaluations must be comparative by design and consider when, by whom, for whom, how well and at what cost.
Trial registration number: PROSPERO Registration: CRD42015023501.
Theories occupy different positions in the scientific circle of enquiry as they vary in scope, abstraction, and complexity. Mid-range theories play a crucial bridging role between raw empirical observations and all-encompassing grand-theoretical schemes. A shift of perspective from 'theories' as products to 'theorising' as a process can enable empirical researchers to capitalise on the two-way relationships between empirical data and different levels of theory and contribute to the advancement of knowledge. This can be facilitated by embracing theoretically informative (in addition to merely theoretically informed) research, developing mechanism-based explanations, and broadening the repertoire of grand-theoretical orientations.
Two-stage DEA: caveat emptor
Simar, Léopold; Wilson, Paul W.
Journal of Productivity Analysis, October 2011, Volume 36, Issue 2
Journal article, peer-reviewed
This paper examines the widespread practice where data envelopment analysis (DEA) efficiency estimates are regressed on some environmental variables in a second-stage analysis. In the literature, only two statistical models have been proposed in which second-stage regressions are well-defined and meaningful. In the model considered by Simar and Wilson (J Prod Anal 13:49–78, 2007), truncated regression provides consistent estimation in the second stage, whereas in the model proposed by Banker and Natarajan (Oper Res 56:48–58, 2008a), ordinary least squares (OLS) provides consistent estimation. This paper examines, compares, and contrasts the very different assumptions underlying these two models, and makes clear that second-stage OLS estimation is consistent only under very peculiar and unusual assumptions on the data-generating process that limit its applicability. In addition, we show that in either case, bootstrap methods provide the only feasible means for inference in the second stage. We also comment on ad hoc specifications of second-stage regression equations that ignore the part of the data-generating process that yields data used to obtain the initial DEA estimates.
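For reference, a compact statement (our paraphrase, not a verbatim quotation) of the second-stage truncated-regression model of Simar and Wilson (2007) referred to above, where δ_i ≥ 1 is the true Farrell output efficiency of firm i and z_i the environmental variables:

```latex
% Second-stage DGP (paraphrase of Simar and Wilson, 2007):
\[
  \delta_i = z_i \beta + \varepsilon_i \;\ge\; 1, \qquad
  \varepsilon_i \sim N(0, \sigma_\varepsilon^2)
  \ \text{left-truncated at } 1 - z_i \beta ,
\]
% so truncated maximum likelihood (not OLS or tobit) is the consistent
% second-stage estimator; in practice the delta_i are replaced by DEA
% estimates, which is why bootstrap methods are needed for inference.
```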
The response of Earth's climate system to orbital forcing has been highly state dependent over the past 66 million years.

The states of past climate

Deep-sea benthic foraminifera preserve an essential record of Earth's past climate in their oxygen- and carbon-isotope compositions. However, this record lacks sufficient temporal resolution and/or age control in some places to determine which climate forcing and feedback mechanisms were most important. Westerhold et al. present a highly resolved and well-dated record of benthic carbon and oxygen isotopes for the past 66 million years. Their reconstruction and analysis show that Earth's climate can be grouped into discrete states separated by transitions related to changing greenhouse gas levels and the growth of polar ice sheets. Each climate state is paced by orbital cycles but responds to variations in radiative forcing in a state-dependent manner.

Science, this issue p. 1383
Much of our understanding of Earth’s past climate comes from the measurement of oxygen and carbon isotope variations in deep-sea benthic foraminifera. Yet, long intervals in existing records lack the temporal resolution and age control needed to thoroughly categorize climate states of the Cenozoic era and to study their dynamics. Here, we present a new, highly resolved, astronomically dated, continuous composite of benthic foraminifer isotope records developed in our laboratories. Four climate states—Hothouse, Warmhouse, Coolhouse, Icehouse—are identified on the basis of their distinctive response to astronomical forcing depending on greenhouse gas concentrations and polar ice sheet volume. Statistical analysis of the nonlinear behavior encoded in our record reveals the key role that polar ice volume plays in the predictability of Cenozoic climate dynamics.
Many papers have regressed non-parametric estimates of productive efficiency on environmental variables in two-stage procedures to account for exogenous factors that might affect firms’ performance. ...None of these have described a coherent data-generating process (DGP). Moreover, conventional approaches to inference employed in these papers are invalid due to complicated, unknown serial correlation among the estimated efficiencies. We first describe a sensible DGP for such models. We propose single and double bootstrap procedures; both permit valid inference, and the double bootstrap procedure improves statistical efficiency in the second-stage regression. We examine the statistical performance of our estimators using Monte Carlo experiments.
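A minimal sketch of the single (parametric) bootstrap idea for the second stage (Python; the function names and setup are our illustrative assumptions, the first-stage DEA estimation step is omitted, and the double-bootstrap bias corrections described in the paper are not implemented): estimate the left-truncated normal regression by maximum likelihood, then resample from the fitted truncated model to obtain percentile confidence intervals for β.

```python
# Illustrative sketch of a single parametric bootstrap for the second-stage
# truncated regression; assumes estimated efficiencies delta_hat > 1 are
# already in hand. First-stage DEA and double-bootstrap corrections omitted.
import numpy as np
from scipy import optimize, stats

def trunc_reg_mle(delta, Z):
    """MLE of delta = Z b + e with N(0, s^2) errors truncated so delta >= 1."""
    k = Z.shape[1]

    def negloglik(params):
        b, s = params[:k], np.exp(params[k])
        mu = Z @ b
        # log-density of N(mu, s^2) restricted to [1, inf)
        ll = stats.norm.logpdf(delta, mu, s) - stats.norm.logsf(1.0, mu, s)
        return -ll.sum()

    b0 = np.linalg.lstsq(Z, delta, rcond=None)[0]
    res = optimize.minimize(negloglik, np.r_[b0, 0.0], method="BFGS")
    return res.x[:k], np.exp(res.x[k])

def bootstrap_ci(delta_hat, Z, reps=999, seed=0):
    """Percentile intervals for b via a parametric (single) bootstrap."""
    rng = np.random.default_rng(seed)
    b_hat, s_hat = trunc_reg_mle(delta_hat, Z)
    mu = Z @ b_hat
    a = (1.0 - mu) / s_hat               # truncation point in standard units
    draws = np.empty((reps, Z.shape[1]))
    for r in range(reps):
        eps = stats.truncnorm.rvs(a, np.inf, scale=s_hat,
                                  size=len(mu), random_state=rng)
        draws[r] = trunc_reg_mle(mu + eps, Z)[0]
    return b_hat, np.percentile(draws, [2.5, 97.5], axis=0)
```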
• Establishes existence of limiting distributions for components of Malmquist indices.
• Central limit theorems for indices measuring components of productivity change.
• Usual central limit theorems are invalid for components with 3 or more dimensions.
• New results permit inference via subsampling for individual firms.
• New results permit inference about geometric means of productivity components.
Malmquist indices are often used to measure productivity changes in dynamic settings and have been widely applied. The indices are typically estimated using data envelopment analysis (DEA) estimators. Malmquist indices are often decomposed into sub-indices that measure the sources of productivity change (e.g., changes in efficiency, technology or other factors). Recently, Kneip et al. (2018) provide new theoretical results enabling inference about productivity change for individual firms as well as average productivity change measured in terms of geometric means. This paper extends those results to components of productivity change arising from various decompositions of Malmquist indices. New central limit theorems are developed to allow inference about arithmetic means of logarithms of the sub-indices as well as geometric means of (untransformed) sub-indices. The results are quite general and extend to other sub-indices not explicitly considered in this paper.
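For context, a standard decomposition of the kind the paper analyzes (the textbook Färe et al. form; the paper's notation may differ), writing D^s(x, y) for the distance function relative to the period-s technology:

```latex
% Malmquist index between periods t and t+1, split into the two
% sub-indices most commonly studied: efficiency change and technology change.
\[
  \mathcal{M}(t,t+1) =
  \underbrace{\frac{D^{t+1}(x^{t+1}, y^{t+1})}{D^{t}(x^{t}, y^{t})}}_{\text{efficiency change}}
  \times
  \underbrace{\left[
    \frac{D^{t}(x^{t+1}, y^{t+1})}{D^{t+1}(x^{t+1}, y^{t+1})} \cdot
    \frac{D^{t}(x^{t}, y^{t})}{D^{t+1}(x^{t}, y^{t})}
  \right]^{1/2}}_{\text{technology change}}
\]
```

The paper's central limit theorems then cover arithmetic means of the logarithms of such sub-indices as well as geometric means of the untransformed sub-indices.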