The preregistration revolution
Nosek, Brian A.; Ebersole, Charles R.; DeHaven, Alexander C.; et al.
Proceedings of the National Academy of Sciences, 03/2018, Volume 115, Issue 11
Journal Article
Peer reviewed
Open access
Progress in science relies in part on generating hypotheses with existing observations and testing hypotheses with new observations. This distinction between postdiction and prediction is appreciated conceptually but is not respected in practice. Mistaking generation of postdictions with testing of predictions reduces the credibility of research findings. However, ordinary biases in human reasoning, such as hindsight bias, make it hard to avoid this mistake. An effective solution is to define the research questions and analysis plan before observing the research outcomes—a process called preregistration. Preregistration distinguishes analyses and outcomes that result from predictions from those that result from postdictions. A variety of practical strategies are available to make the best possible use of preregistration in circumstances that fall short of the ideal application, such as when the data are preexisting. Services are now available for preregistration across all disciplines, facilitating a rapid increase in the practice. Widespread adoption of preregistration will increase distinctiveness between hypothesis generation and hypothesis testing and will improve the credibility of research findings.
This paper examines the dynamics influencing the outcomes of risk assessment (RA) in scientific research projects (SRPs), employing the Naive Bayes algorithm. The methodology involves the selection of diverse SRP cases and the gathering of data encompassing project scale, budget investment, team experience, and other pertinent factors. The paper advances the application of the Naive Bayes algorithm by introducing enhancements, specifically integrating the Tree-augmented Naive Bayes (TANB) model. This augmentation serves to estimate risk probabilities for different research projects, shedding light on the interplay and contributions of various factors in the RA process. The findings underscore the efficacy of the TANB algorithm, demonstrating good accuracy (average accuracy 89.2%) in RA for SRPs. Notably, budget investment (regression coefficient: 0.68, P < 0.05) and team experience (regression coefficient: 0.51, P < 0.05) emerge as significant determinants of RA outcomes. Conversely, the impact of project size (regression coefficient: 0.31, P < 0.05) is relatively modest. This paper furnishes a concrete reference framework for project managers, facilitating informed decision-making in SRPs. By comprehensively analyzing the influence of various factors on RA, the paper not only contributes empirical insights to project decision-making but also elucidates the relationships between different factors. The research advocates heightened attention to budget investment and team experience when formulating risk management strategies. This strategic focus is posited to enhance the precision of RAs and the scientific foundation of decision-making processes.
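The abstract above estimates risk probabilities with a (Tree-augmented) Naive Bayes model. As a minimal sketch of the plain Naive Bayes step only, the following computes class posteriors from per-class Gaussian feature estimates; the feature names, labels, and data are invented for illustration and are not taken from the paper.

```python
import math

def fit_naive_bayes(X, y):
    """Estimate class priors and per-class Gaussian feature parameters."""
    model = {}
    for label in set(y):
        rows = [x for x, cls in zip(X, y) if cls == label]
        cols = list(zip(*rows))
        means = [sum(c) / len(c) for c in cols]
        # Floor the variance so a constant feature cannot zero out the density.
        vars_ = [max(sum((v - m) ** 2 for v in c) / len(c), 1e-9)
                 for c, m in zip(cols, means)]
        model[label] = (len(rows) / len(y), means, vars_)
    return model

def predict_proba(model, x):
    """Posterior P(class | x) under the naive conditional-independence assumption."""
    log_scores = {}
    for label, (prior, means, vars_) in model.items():
        lp = math.log(prior)
        for xi, m, v in zip(x, means, vars_):
            # Work in log space to avoid underflow for distant points.
            lp += -((xi - m) ** 2) / (2 * v) - 0.5 * math.log(2 * math.pi * v)
        log_scores[label] = lp
    shift = max(log_scores.values())  # log-sum-exp normalization
    total = sum(math.exp(s - shift) for s in log_scores.values())
    return {label: math.exp(s - shift) / total
            for label, s in log_scores.items()}

# Invented example: [budget score, team-experience score] per project.
X = [[0.9, 1.0], [1.1, 0.8], [3.0, 3.2], [2.8, 3.1]]
y = ["high_risk", "high_risk", "low_risk", "low_risk"]
model = fit_naive_bayes(X, y)
probs = predict_proba(model, [1.0, 0.9])  # query near the high-risk cluster
```

The TANB variant in the paper additionally learns a tree of dependencies between features, relaxing the independence assumption sketched here.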
The Circular Economy (CE) is currently a popular notion within policy and business advocacy groups. Despite being visionary and provocative in its message, research on the CE concept is still emerging. The two intertwined objectives of the paper are: first, to identify, discuss, and develop the various definitions provided by the emerging literature; second, to suggest an initial research approach with which research on CE can be conducted. Our analysis shows that the existing CE work is mainly done on the practical and technical levels of the actual physical flows of materials and energy in production-consumption systems. The focus of the extant literature is on concrete metrics, tools, instruments, and computations. Therefore, the basic assumptions concerning the values, societal structures, cultures, underlying world-views, and the paradigmatic potential of CE remain largely unexplored. We argue that CE has already become what Gallie (1955) more than six decades ago termed an "essentially contested concept" (ECC). The paper further suggests a model for CE research that helps in the categorization, classification, and organization of research and investigation on CE. The model can help to limit the observed imbalance and enhance the contribution of the CE approach to a more sustainable global society.
• Circular Economy is linked to the classical work on Essentially Contested Concepts by Gallie.
• A new definition of Circular Economy is given.
• A model for future research on Circular Economy is proposed.
Machines powered by artificial intelligence increasingly mediate our social, cultural, economic and political interactions. Understanding the behaviour of artificial intelligence systems is essential to our ability to control their actions, reap their benefits and minimize their harms. Here we argue that this necessitates a broad scientific research agenda to study machine behaviour that incorporates and expands upon the discipline of computer science and includes insights from across the sciences. We first outline a set of questions that are fundamental to this emerging field and then explore the technical, legal and institutional constraints on the study of machine behaviour.
The decline of science in corporate R&D
Arora, Ashish; Belenzon, Sharon; Patacconi, Andrea
Strategic Management Journal, 01/2018, Volume 39, Issue 1
Journal Article
Peer reviewed
Open access
Research summary: In this article, we document a shift away from science by large corporations between 1980 and 2006. We find that publications by company scientists have declined over time in a range of industries. We also find that the value attributable to scientific research has dropped, whereas the value attributable to technical knowledge (as measured by patents) has remained stable. These trends are unlikely to be driven principally by changes in publication practices. Furthermore, science continues to be useful as an input into innovation. Our evidence points to a reduction of the private benefits of internal research. Large firms still value the golden eggs of science (as reflected in patents), but seem to be increasingly unwilling to invest in the golden goose itself (the internal scientific capabilities). Managerial summary: There is a widespread belief among commentators that large American corporations are withdrawing from research. Large corporations may still collaborate with universities and acquire promising science-based start-ups, but their labs increasingly focus on developing existing knowledge and commercializing it, rather than creating new knowledge. In this article, we combine firm-level financial information with a large and comprehensive data set on firm publications, patents and acquisitions to quantify the withdrawal from science by large American corporations between 1980 and 2006. This withdrawal is associated with a decline in the private value of research activities, even though scientific knowledge itself remains important for corporate invention. We discuss the managerial and policy implications of our findings.
Abandon Statistical Significance
McShane, Blakeley B.; Gal, David; Gelman, Andrew; et al.
The American Statistician, 03/2019, Volume 73, Issue sup1
Journal Article
Peer reviewed
Open access
We discuss problems the null hypothesis significance testing (NHST) paradigm poses for replication, and more broadly in the biomedical and social sciences, as well as how these problems remain unresolved by proposals involving modified p-value thresholds, confidence intervals, and Bayes factors. We then discuss our own proposal, which is to abandon statistical significance. We recommend dropping the NHST paradigm, and the p-value thresholds intrinsic to it, as the default statistical paradigm for research, publication, and discovery in the biomedical and social sciences. Specifically, we propose that the p-value be demoted from its threshold screening role and instead, treated continuously, be considered along with currently subordinate factors (e.g., related prior evidence, plausibility of mechanism, study design and data quality, real-world costs and benefits, novelty of finding, and other factors that vary by research domain) as just one among many pieces of evidence. We have no desire to "ban" p-values or other purely statistical measures. Rather, we believe that such measures should not be thresholded and that, thresholded or not, they should not take priority over the currently subordinate factors. We also argue that it seldom makes sense to calibrate evidence as a function of p-values or other purely statistical measures. We offer recommendations for how our proposal can be implemented in the scientific publication process as well as in statistical decision making more broadly.
Global scientific production around the Covid-19 pandemic, across disciplines and across the international scientific bibliographic databases, has grown exponentially. This output constitutes a source of scientific enrichment and an important lever for researchers around the world, each in their own field and position, with the ultimate aim of overcoming the pandemic. In this context, bibliometric data constitute a fundamental source in the process of evaluating scientific production in the academic world; bibliometrics provides researchers and institutions with crucial strategic information for promoting their research results within the local and international scientific community, especially during this international pandemic.
Misapplication of statistical data analysis is a common cause of spurious discoveries in scientific research. Existing approaches to ensuring the validity of inferences drawn from data assume a fixed procedure to be performed, selected before the data are examined. In common practice, however, data analysis is an intrinsically adaptive process, with new analyses generated on the basis of data exploration, as well as the results of previous analyses on the same data. We demonstrate a new approach for addressing the challenges of adaptivity based on insights from privacy-preserving data analysis. As an application, we show how to safely reuse a holdout data set many times to validate the results of adaptively chosen analyses.
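The mechanism this abstract describes for safely reusing a holdout set is known as Thresholdout. A simplified sketch of one query under that scheme follows; the threshold and noise parameters here are illustrative choices, not values from the paper.

```python
import random

def thresholdout(train_vals, holdout_vals, threshold=0.04, sigma=0.01, rng=None):
    """Answer one adaptively chosen query against a reusable holdout.

    train_vals / holdout_vals: the query's per-example values on the
    training and holdout sets (e.g. 0/1 correctness of a candidate model).
    Returns the training-set mean when it agrees with the holdout within
    a noisy threshold; otherwise returns a noised holdout mean.
    """
    if rng is None:
        rng = random.Random()
    train_mean = sum(train_vals) / len(train_vals)
    holdout_mean = sum(holdout_vals) / len(holdout_vals)
    # Randomizing the comparison limits what repeated queries can
    # reveal about individual holdout examples.
    if abs(train_mean - holdout_mean) < threshold + rng.gauss(0, sigma):
        return train_mean
    return holdout_mean + rng.gauss(0, sigma)

# An overfit query: perfect on training data, 50% on the holdout.
train_acc = [1.0] * 10
holdout_acc = [1.0, 0.0] * 5
answer = thresholdout(train_acc, holdout_acc)  # reverts to roughly 0.5, not 1.0
```

The key point is that when a query does not overfit, the analyst sees only the training estimate, so the holdout leaks no information; noise is spent only when train and holdout disagree.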
Objective
To verify whether reverse baseplate positioning without the support of intraoperative three-dimensional technology is within the acceptable parameters in the literature and whether glenoid bone deformity (GBD) compromises this positioning.
Methods
Sixty-nine reverse shoulder arthroplasties were evaluated with volumetric computed tomography (CT). Two radiologists performed blinded CT scan analysis and evaluated baseplate position within 2 mm of the inferior glenoid; the inclination and version of the baseplate in relation to the Friedman line; and the end-point positioning of the upper and lower screws and the baseplate metallic peg. The patients were divided according to the presence of GBD for statistical analyses.
Results
The two radiologists concurred reasonably in their interpretations of the following analyzed parameters: baseplate position within 2 mm of the inferior glenoid rim (97.1% and 95.7%), baseplate inclination (82.6% and 81.2%), baseplate version (69.6% and 56.5%), the upper screw reaching the base of the coracoid process (71% and 79.7%), the inferior screw remaining inside the scapula (88.4% and 84.1%), and the metallic peg of the baseplate considered intraosseous (88.4% and 72.5%).
Conclusion
Reverse baseplate positioning without intraoperative three-dimensional technology is within the acceptable parameters reported in the literature, except for baseplate version and upper screw position. GBD did not interfere with baseplate positioning in reverse shoulder arthroplasty.