Best practices for analysing microbiomes
Knight, Rob; Vrbanac, Alison; Taylor, Bryn C., et al.
Nature Reviews Microbiology, 07/2018, Volume 16, Issue 7
Journal Article
Peer reviewed
Complex microbial communities shape the dynamics of various environments, ranging from the mammalian gastrointestinal tract to the soil. Advances in DNA sequencing technologies and data analysis have provided drastic improvements in microbiome analyses, for example, in taxonomic resolution, false discovery rate control and other properties, over earlier methods. In this Review, we discuss the best practices for performing a microbiome study, including experimental design, choice of molecular analysis technology, methods for data analysis and the integration of multiple omics data sets. We focus on recent findings that suggest that operational taxonomic unit-based analyses should be replaced with new methods that are based on exact sequence variants, methods for integrating metagenomic and metabolomic data, and issues surrounding compositional data analysis, where advances have been particularly rapid. We note that although some of these approaches are new, it is important to keep sight of the classic issues that arise during experimental design and relate to research reproducibility. We describe how keeping these issues in mind allows researchers to obtain more insight from their microbiome data sets.
Issues with data and analyses
Brown, Andrew W.; Kaiser, Kathryn A.; Allison, David B.
Proceedings of the National Academy of Sciences (PNAS), 03/2018, Volume 115, Issue 11
Journal Article
Peer reviewed
Open access
Some aspects of science, taken at the broadest level, are universal in empirical research. These include collecting, analyzing, and reporting data. In each of these aspects, errors can and do occur. In this work, we first discuss the importance of focusing on statistical and data errors to continually improve the practice of science. We then describe underlying themes of the types of errors and postulate contributing factors. To do so, we describe a case series of relatively severe data and statistical errors coupled with surveys of some types of errors to better characterize the magnitude, frequency, and trends. Having examined these errors, we then discuss the consequences of specific errors or classes of errors. Finally, given the extracted themes, we discuss methodological, cultural, and system-level approaches to reducing the frequency of commonly observed errors. These approaches will plausibly contribute to the self-critical, self-correcting, ever-evolving practice of science, and ultimately to furthering knowledge.
Constraints on Generality (COG)
Simons, Daniel J.; Shoda, Yuichi; Lindsay, D. Stephen
Perspectives on Psychological Science, 11/2017, Volume 12, Issue 6
Journal Article
Peer reviewed
Open access
Psychological scientists draw inferences about populations based on samples—of people, situations, and stimuli—from those populations. Yet, few papers identify their target populations, and even fewer justify how or why the tested samples are representative of broader populations. A cumulative science depends on accurately characterizing the generality of findings, but current publishing standards do not require authors to constrain their inferences, leaving readers to assume the broadest possible generalizations. We propose that the discussion section of all primary research articles specify Constraints on Generality (i.e., a “COG” statement) that identify and justify target populations for the reported findings. Explicitly defining the target populations will help other researchers to sample from the same populations when conducting a direct replication, and it could encourage follow-up studies that test the boundary conditions of the original finding. Universal adoption of COG statements would change publishing incentives to favor a more cumulative science.
Replication, an important, uncommon, and misunderstood practice, is gaining appreciation in psychology. Achieving replicability is important for making research progress. If findings are not replicable, then prediction and theory development are stifled. If findings are replicable, then interrogation of their meaning and validity can advance knowledge. Assessing replicability can be productive for generating and testing hypotheses by actively confronting current understandings to identify weaknesses and spur innovation. For psychology, the 2010s might be characterized as a decade of active confrontation. Systematic and multi-site replication projects assessed current understandings and observed surprising failures to replicate many published findings. Replication efforts highlighted sociocultural challenges such as disincentives to conduct replications and a tendency to frame replication as a personal attack rather than a healthy scientific practice, and they raised awareness that replication contributes to self-correction. Nevertheless, innovation in doing and understanding replication and its cousins, reproducibility and robustness, has positioned psychology to improve research practices and accelerate progress.
This thesis comprises a number of studies into controlling the photonic properties of materials by way of controlling their molecular conformation. The absorption and emission behaviours of fundamental AIE molecules TPE and HPS were probed in solutions with varying degrees of aggregation. It was found, in contrast to the existing RIR hypothesis, that the monomers were able to emit in good solutions with minimal aggregation, and continued to do so upon low levels of aggregation. The monomer emission of TPE was found to be more prevalent than that of HPS owing to the difference in conjugation pathways offered by each molecule. These results suggest that the term AIE is misleading: the assumption that molecular emission 'switches on' upon aggregation disregards the fact that the more efficient fluorescence actually results from a quenching process (excimer formation), with RIR causing the increase in efficiency. A new method for controlling the aggregation in fluorene/fluorenone oligomers was also investigated. Water was found to be highly effective for screening the fluorenone aggregation, with fractions of only 10% sufficient to eliminate the excimer emission. The fluorenone was then successfully reaggregated to varying degrees as the water fraction was increased, owing to non-solvent effects. Investigating solutions of thermally oxidised FFF also revealed the method's ability to identify the presence of fluorenone moieties when there was no indication of oxidation in the absorption or PL. Lastly, dip-pen nanolithography was employed to pattern photonic structures in PFO thin films. Beta-phase dots with a diameter of 548 nm were achieved, demonstrating the potential of DPN in photonic device fabrication; however, there were issues with consistency of feature size and damage to the film. An outline of the development of the microscale PL set-up used to analyse the patterns, and of the steps taken to maximise its resolution and spectral contrast, is also given.
The set-up was found to be well suited to highlighting the beta phase of PFO; however, there were issues with the reproducibility of the PL maps, as well as a trade-off between signal strength and photodegradation of the sample. When compared with patterns imaged on a different set-up, it was clear that the set-up designed here produced too weak a signal for accurate comparison of the new patterns. Further work in sample fabrication, DPN operation and microscale PL optimisation was therefore identified to improve the quality and reproducibility of the patterns.
Introduction: To evaluate the aortic valve fibrocalcific volume by computed tomography (CT) angiography in patients with aortic stenosis (AS); in particular, to assess its reproducibility and its association with disease severity, to test its ability to predict and track AS progression, and to perform histological validation.
Methods: In a post-hoc analysis of 136 patients with AS participating in the SALTIRE 2 trial, fibrocalcific volume was calculated using semi-automated software on CT angiograms at baseline and after one year. The distributions of CT attenuation were analysed using Gaussian mixture modelling to derive thresholds for aortic valve tissue types, enabling the quantification of calcific, non-calcific and fibrocalcific volumes indexed to annulus area. Scan-rescan reproducibility was assessed. Aortic valves from 41 patients undergoing valve replacement were included in the histological validation cohort.
Results: Fibrocalcific volume measurements demonstrated excellent scan-rescan reproducibility (mean difference -1%, limits of agreement -4.5% to 2.8%). Baseline fibrocalcific volumes correlated with baseline mean aortic valve gradients on echocardiography in both men and women (rho=0.64 and 0.69, respectively; p<0.001 for both). The relationship was driven principally by calcific volume in men and fibrotic volume in women. After one year, fibrocalcific volume increased by 17% and correlated with an increase in mean gradient (rho=0.32, p=0.003). Baseline fibrocalcific volume was the strongest predictor of disease progression on multivariable analysis, with a particularly strong association in women (rho=0.75, p<0.001). Histologically, there was a good correlation between fibrocalcific volume and valve weight (r=0.51, p<0.001). Furthermore, non-calcific volumes on CT were higher in patients with a higher fibrosis score on histology, and similarly, calcific volumes on CT were higher in patients with higher Warren-Yong scores for calcification on histology.
Conclusions: Fibrocalcific volume is a highly reproducible, anatomic, CT-derived assessment of AS. It correlates with AS severity and haemodynamic progression, and there is good correlation between fibrocalcific volume and histological parameters.
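The thresholding step this abstract describes (Gaussian mixture modelling of CT-attenuation distributions to derive tissue-type cut-offs) can be sketched as a minimal two-component, one-dimensional EM fit. This is an illustrative, stdlib-only sketch, not the trial's semi-automated software: the function names, iteration counts and attenuation values below are all assumptions, and a real pipeline would fit more components and work on calibrated Hounsfield units.

```python
import math
import random


def _gauss_pdf(x, mu, var):
    """Density of a 1-D Gaussian with mean mu and variance var."""
    return math.exp(-(x - mu) ** 2 / (2.0 * var)) / math.sqrt(2.0 * math.pi * var)


def _variance(xs):
    m = sum(xs) / len(xs)
    return sum((v - m) ** 2 for v in xs) / len(xs)


def fit_gmm_1d(x, n_iter=200):
    """Fit a two-component 1-D Gaussian mixture by EM.

    Returns (weights, means, variances), with the lower-mean
    component first (initialised by a median split, so components
    keep a stable low/high ordering for well-separated data)."""
    xs = sorted(x)
    mid = len(xs) // 2
    mu = [sum(xs[:mid]) / mid, sum(xs[mid:]) / (len(xs) - mid)]
    var = [max(1e-6, _variance(xs[:mid])), max(1e-6, _variance(xs[mid:]))]
    w = [0.5, 0.5]
    for _ in range(n_iter):
        # E-step: posterior responsibility of each component for each point.
        resp = []
        for xi in x:
            p = [w[k] * _gauss_pdf(xi, mu[k], var[k]) for k in range(2)]
            s = p[0] + p[1]
            resp.append([p[0] / s, p[1] / s] if s > 0.0 else [0.5, 0.5])
        # M-step: re-estimate weights, means and variances.
        for k in range(2):
            nk = sum(r[k] for r in resp)
            w[k] = nk / len(x)
            mu[k] = sum(r[k] * xi for r, xi in zip(resp, x)) / nk
            var[k] = max(1e-6, sum(r[k] * (xi - mu[k]) ** 2
                                   for r, xi in zip(resp, x)) / nk)
    return w, mu, var


def threshold_between(w, mu, var, lo, hi, steps=1000):
    """Attenuation value where the two weighted component densities
    cross, found by grid search on [lo, hi]; voxels above it would be
    labelled as the higher-attenuation (e.g. calcific) class."""
    best, best_gap = lo, float("inf")
    for i in range(steps + 1):
        t = lo + (hi - lo) * i / steps
        gap = abs(w[0] * _gauss_pdf(t, mu[0], var[0])
                  - w[1] * _gauss_pdf(t, mu[1], var[1]))
        if gap < best_gap:
            best, best_gap = t, gap
    return best
```

As a usage sketch, simulated attenuation samples for "non-calcific" tissue (e.g. centred near 100) and "calcific" tissue (centred near 600) can be pooled, fitted with `fit_gmm_1d`, and separated with `threshold_between(w, mu, var, mu[0], mu[1])`; the derived threshold then drives per-voxel classification and hence the calcific and non-calcific volume measurements.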