"Conservation genomics" encompasses the idea that genome-scale data will improve the capacity of resource managers to protect species. Although genetic approaches have long been used in conservation ...research, it has only recently become tractable to generate genome-wide data at a scale that is useful for conservation. In this Review, we discuss how genome-scale data can inform species delineation in the face of admixture, facilitate evolution through the identification of adaptive alleles, and enhance evolutionary rescue based on genomic patterns of inbreeding. As genomic approaches become more widely adopted in conservation, we expect that they will have a positive impact on management and policy decisions.
Could extinct species, like mammoths and passenger pigeons, be brought back to life? The science says yes. In How to Clone a Mammoth, Beth Shapiro, evolutionary biologist and pioneer in "ancient DNA" research, walks readers through the astonishing and controversial process of de-extinction. From deciding which species should be restored, to sequencing their genomes, to anticipating how revived populations might be overseen in the wild, Shapiro vividly explores the extraordinary cutting-edge science that is being used--today--to resurrect the past. Journeying to far-flung Siberian locales in search of ice age bones and delving into her own research--as well as that of fellow experts such as Svante Paabo, George Church, and Craig Venter--Shapiro considers de-extinction's practical benefits and ethical challenges. Would de-extinction change the way we live? Is this really cloning? What are the costs and risks? And what is the ultimate goal?
Using DNA collected from remains as a genetic blueprint, scientists aim to engineer extinct traits--traits that evolved by natural selection over thousands of years--into living organisms. But rather than viewing de-extinction as a way to restore one particular species, Shapiro argues that the overarching goal should be the revitalization and stabilization of contemporary ecosystems. For example, elephants with genes modified to express mammoth traits could expand into the Arctic, re-establishing lost productivity to the tundra ecosystem.
Looking at the very real and compelling science behind an idea once seen as science fiction, How to Clone a Mammoth demonstrates how de-extinction will redefine conservation's future.
Environmental DNA (eDNA) metabarcoding is an increasingly popular tool for measuring and cataloguing biodiversity. Because the environments and substrates in which DNA is preserved differ considerably, eDNA research often requires bespoke approaches to generating eDNA data. Here, we explore how two experimental choices in eDNA study design—the number of PCR replicates and the depth of sequencing of PCR replicates—influence the composition and consistency of taxa recovered from eDNA extracts. We perform 24 PCR replicates from each of six soil samples using two of the most common metabarcodes for Fungi and Viridiplantae (ITS1 and ITS2), and sequence each replicate to an average depth of ~84,000 reads. We find that PCR replicates are broadly consistent in composition and relative abundance of dominant taxa, but that low abundance taxa are often unique to one or a few PCR replicates. Taxa observed in one out of 24 PCR replicates make up 21–29% of the total taxa detected. We also observe that sequencing depth or rarefaction influences alpha diversity and beta diversity estimates. Read sampling depth influences local contribution to beta diversity, placement in ordinations, and beta dispersion in ordinations. Our results suggest that, because common taxa drive some alpha diversity estimates, few PCR replicates and low read sampling depths may be sufficient for many biological applications of eDNA metabarcoding. However, because rare taxa are recovered stochastically, eDNA metabarcoding may never fully recover the true amplifiable alpha diversity in an eDNA extract. Rare taxa drive PCR replicate outliers of alpha and beta diversity and lead to dispersion differences at different read sampling depths. We conclude that researchers should consider the complexity and unevenness of a community when choosing analytical approaches, read sampling depths, and filtering thresholds to arrive at stable estimates.
Previous studies examining how methodological choices in environmental DNA processing influence the resulting observed communities are a valuable resource for designing study methods and for interpreting results accurately. Here, we designed a methodological experiment to address the influences of PCR replication, sequencing depth, and a minimum read threshold on measures of biodiversity. We found that high-abundance taxa were stable across PCR replicates, whereas low-abundance taxa varied considerably, and this variation significantly influences alpha and beta diversity.
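The rarefaction step described above, subsampling each PCR replicate to a common read depth before estimating richness, can be sketched in a few lines. This is a toy illustration of the general technique, not the authors' pipeline; the taxon names and counts are hypothetical.

```python
import random

def rarefy_richness(counts, depth, seed=0):
    """Subsample `depth` reads without replacement from one PCR
    replicate and return the observed richness (taxa recovered).

    counts: dict mapping taxon name -> read count in the replicate.
    """
    # Expand counts into a pool of individual reads, then subsample.
    pool = [taxon for taxon, n in counts.items() for _ in range(n)]
    rng = random.Random(seed)
    sample = rng.sample(pool, depth)
    return len(set(sample))

# Hypothetical replicate: one dominant taxon plus several rare ones
replicate = {"t1": 900, "t2": 50, "t3": 30, "t4": 10, "t5": 5, "t6": 5}
print(rarefy_richness(replicate, depth=100))
```

Because rare taxa (here t5 and t6) are drawn stochastically, repeated rarefactions of the same uneven replicate yield different richness values, which is exactly the instability in alpha diversity estimates that the study documents.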
Minimizing polymerase biases in metabarcoding. Nichols, Ruth V.; Vollmers, Christopher; Newsom, Lee A.; et al.
Molecular Ecology Resources, September 2018, Volume 18, Issue 5. Journal article, peer-reviewed, open access.
DNA metabarcoding is an increasingly popular method to characterize and quantify biodiversity in environmental samples. Metabarcoding approaches simultaneously amplify a short, variable genomic region, or "barcode," from a broad taxonomic group via the polymerase chain reaction (PCR), using universal primers that anneal to flanking conserved regions. Results of these experiments are reported as occurrence data, which provide a list of taxa amplified from the sample, or relative abundance data, which measure the relative contribution of each taxon to the overall composition of amplified product. The accuracy of both occurrence and relative abundance estimates can be affected by a variety of biological and technical biases. For example, taxa with larger biomass may be better represented in environmental samples than those with smaller biomass. Here, we explore how polymerase choice, a potential source of technical bias, might influence results in metabarcoding experiments. We compared potential biases of six commercially available polymerases using a combination of mixtures of amplifiable synthetic sequences and real sedimentary DNA extracts. We find that polymerase choice can affect both occurrence and relative abundance estimates and that the main source of this bias appears to be polymerase preference for sequences with specific GC contents. We further recommend an experimental approach for metabarcoding based on results of our synthetic experiments.
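The GC-preference bias described above is straightforward to screen for: given an equimolar synthetic input pool, group post-PCR read counts by the GC content of each template and look for skew. The sketch below shows that bookkeeping; the sequences and counts are hypothetical, and this is not the paper's own analysis code.

```python
def gc_content(seq):
    """Fraction of G and C bases in a DNA sequence."""
    seq = seq.upper()
    return (seq.count("G") + seq.count("C")) / len(seq)

def abundance_by_gc(reads):
    """Group read counts by the (rounded) GC content of their template.

    reads: dict mapping template sequence -> observed read count.
    For an equimolar input pool, unequal totals across GC bins point
    to a polymerase preference for particular GC contents.
    """
    out = {}
    for seq, n in reads.items():
        gc = round(gc_content(seq), 1)
        out[gc] = out.get(gc, 0) + n
    return out

# Hypothetical equimolar synthetic pool sequenced after PCR
reads = {"ATATATATAT": 120, "GCGCGATATA": 300, "GCGCGCGCGC": 90}
print(abundance_by_gc(reads))  # {0.0: 120, 0.5: 300, 1.0: 90}
```

In this toy example a polymerase over-amplifies the mid-GC template; a bias-free enzyme would return roughly equal counts in every bin.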
Controversy persists about why so many large‐bodied mammal species went extinct around the end of the last ice age. Resolving this is important for understanding extinction processes in general, for assessing the ecological roles of humans, and for conserving remaining megafaunal species, many of which are endangered today. Here we explore an integrative hypothesis that asserts that an underlying cause of Late Quaternary megafaunal extinctions was a fundamental shift in the spatio‐temporal fabric of ecosystems worldwide. This shift was triggered by the loss of the millennial‐scale climate fluctuations that were characteristic of the ice age but ceased approximately 11,700 years ago on most continents. Under ice‐age conditions, which prevailed for much of the preceding 2.6 Ma, these radical and rapid climate changes prevented many ecosystems from fully equilibrating with their contemporary climates. Instead of today's ‘striped’ world in which species' ranges have equilibrated with gradients of temperature, moisture, and seasonality, the ice‐age world was a disequilibrial ‘plaid’ in which species' ranges shifted rapidly and repeatedly over time and space, rarely catching up with contemporary climate. In the transient ecosystems that resulted, certain physiological, anatomical, and ecological attributes shared by megafaunal species pre‐adapted them for success. These traits included greater metabolic and locomotory efficiency, increased resistance to starvation, longer life spans, greater sensory ranges, and the ability to be nomadic or migratory. When the plaid world of the ice age ended, many of the advantages of being large were either lost or became disadvantages. For instance, in a striped world, the low population densities and slow reproductive rates associated with large body size reduced the resiliency of megafaunal species to population bottlenecks.
As the ice age ended, the downsides of being large in striped environments lowered the extinction thresholds of megafauna worldwide, which then increased the vulnerability of individual species to a variety of proximate threats they had previously tolerated, such as human predation, competition with other species, and habitat loss. For many megafaunal species, the plaid‐to‐stripes transition may have been near the base of a hierarchy of extinction causes whose relative importances varied geographically, temporally, and taxonomically.
Although phylogenetic inference of protein-coding sequences continues to dominate the literature, few analyses incorporate evolutionary models that consider the genetic code. This problem is exacerbated by the exclusion of codon-based models from commonly employed model selection techniques, presumably due to the computational cost associated with codon models. We investigated an efficient alternative to standard nucleotide substitution models, in which codon position (CP) is incorporated into the model. We determined the most appropriate model for alignments of 177 RNA virus genes and 106 yeast genes, using 11 substitution models including one codon model and four CP models. The majority of analyzed gene alignments are best described by CP substitution models, rather than by standard nucleotide models, and without the computational cost of full codon models. These results have significant implications for phylogenetic inference of coding sequences as they make it clear that substitution models incorporating CPs not only are a computationally realistic alternative to standard models but may also frequently be statistically superior.
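The mechanical core of a CP model is simply partitioning the alignment columns by codon position so that each position can receive its own substitution parameters (third positions, for example, typically evolve fastest). A minimal sketch of that partitioning step follows; the taxon names and sequences are hypothetical, and real analyses would hand the partitions to a phylogenetics package rather than print them.

```python
def partition_by_codon_position(alignment):
    """Split an in-frame codon alignment into three sub-alignments,
    one per codon position, so each partition can be assigned its
    own substitution model parameters (the 'CP' models in the text).

    alignment: dict mapping taxon name -> coding sequence string
    whose length is a multiple of 3.
    """
    partitions = {1: {}, 2: {}, 3: {}}
    for taxon, seq in alignment.items():
        assert len(seq) % 3 == 0, "alignment must be in frame"
        for pos in (1, 2, 3):
            # Every third column, starting at the given codon position.
            partitions[pos][taxon] = seq[pos - 1::3]
    return partitions

aln = {"taxonA": "ATGAAATTT", "taxonB": "ATGAAGTTC"}
parts = partition_by_codon_position(aln)
print(parts[3])  # third positions: {'taxonA': 'GAT', 'taxonB': 'GGC'}
```

Because each partition is still an ordinary nucleotide alignment, CP models avoid the 61-state machinery of full codon models, which is the computational saving the abstract highlights.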
Estimation of demographic history from nucleotide sequences represents an important component of many studies in molecular ecology. For example, knowledge of a population's history can allow us to test hypotheses about the impact of climatic and anthropogenic factors. In the past, demographic analysis was typically limited to relatively simple population models, such as exponential or logistic growth. More flexible approaches are now available, including skyline-plot methods that are able to reconstruct changes in population sizes through time. This technical review focuses on these skyline-plot methods. We describe some general principles relating to sampling design and data collection. We then provide an outline of the methodological framework, which is based on coalescent theory, before tracing the development of the various skyline-plot methods and describing their key features. The performance and properties of the methods are illustrated using two simulated data sets.
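The coalescent logic underlying the classic skyline plot fits in a few lines. While i lineages persist, the expected waiting time to the next coalescence is E[t_i] = 2N / (i(i-1)), so each observed interval yields the moment estimate N_hat = i(i-1) t_i / 2. The sketch below implements that estimator on hypothetical waiting times (in coalescent time units); later skyline variants add smoothing and Bayesian machinery on top of this idea.

```python
def classic_skyline(intervals):
    """Classic skyline-plot estimate of (scaled) effective population
    size from intercoalescent waiting times.

    intervals: list of (i, t_i) pairs, where t_i is the waiting time
    during which i lineages persist before coalescing to i - 1.
    Under the coalescent, E[t_i] = 2N / (i (i - 1)), so each interval
    gives the moment estimate N_hat = i (i - 1) t_i / 2.
    """
    return [(i, i * (i - 1) * t / 2) for i, t in intervals]

# Hypothetical waiting times from a genealogy of four sequences
print(classic_skyline([(4, 0.5), (3, 1.0), (2, 3.0)]))
# [(4, 3.0), (3, 3.0), (2, 3.0)]  -> consistent with constant N = 3
```

Note how the later intervals (fewer lineages) must be longer to imply the same population size: with only two lineages left, coalescence is slow even in a small population, which is why skyline estimates for the deep past are the noisiest.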
The rapid loss of intraspecific variation is a hidden biodiversity crisis. Intraspecific variation, which includes the genomic and phenotypic diversity found within and among populations, is threatened by local extinctions, abundance declines, and anthropogenic selection. However, biodiversity assessments often fail to highlight this loss of diversity within species. We review the literature on how intraspecific variation supports critical ecological functions and nature's contributions to people (NCP). Results show that the main categories of NCP (material, non-material, and regulating) are supported by intraspecific variation. We highlight new strategies that are needed to further explore these connections and to make explicit the value of intraspecific variation for NCP. These strategies will require collaboration with local and Indigenous groups who possess critical knowledge on the relationships between intraspecific variation and ecosystem function. New genomic methods provide a promising set of tools to uncover hidden variation. Urgent action is needed to document, conserve, and restore the intraspecific variation that supports nature and people. Thus, we propose that the maintenance and restoration of intraspecific variation should be raised to a major global conservation objective.
Significance The domestication of the horse revolutionized warfare, trade, and the exchange of people and ideas. This at least 5,500-y-long process, which ultimately transformed wild horses into the hundreds of breeds living today, is difficult to reconstruct from archeological data and modern genetics alone. We therefore sequenced two complete horse genomes, predating domestication by thousands of years, to characterize the genetic footprint of domestication. These ancient genomes reveal predomestic population structure and a significant fraction of genetic variation shared with the domestic breeds but absent from Przewalski’s horses. We find positive selection on genes involved in various aspects of locomotion, physiology, and cognition. Finally, we show that modern horse genomes contain an excess of deleterious mutations, likely representing the genetic cost of domestication.
The domestication of the horse ∼5.5 kya and the emergence of mounted riding, chariotry, and cavalry dramatically transformed human civilization. However, the genetics underlying horse domestication are difficult to reconstruct, given the near extinction of wild horses. We therefore sequenced two ancient horse genomes from Taymyr, Russia (at 7.4- and 24.3-fold coverage), both predating the earliest archeological evidence of domestication. We compared these genomes with genomes of domesticated horses and the wild Przewalski’s horse and found genetic structure within Eurasia in the Late Pleistocene, with the ancient population contributing significantly to the genetic variation of domesticated breeds. We furthermore identified a conservative set of 125 potential domestication targets using four complementary scans for genes that have undergone positive selection. One group of genes is involved in muscular and limb development, articular junctions, and the cardiac system, and may represent physiological adaptations to human utilization. A second group consists of genes with cognitive functions, including social behavior, learning capabilities, fear response, and agreeableness, which may have been key for taming horses. We also found that domestication is associated with inbreeding and an excess of deleterious mutations. This genetic load is in line with the “cost of domestication” hypothesis also reported for rice, tomatoes, and dogs, and it is generally attributed to the relaxation of purifying selection resulting from the strong demographic bottlenecks accompanying domestication. Our work demonstrates the power of ancient genomes to reconstruct the complex genetic changes that transformed wild animals into their domesticated forms, and the population context in which this process took place.
Cell-free DNA (cfDNA), present in circulating blood plasma, contains information about prenatal health, organ transplant reception, and cancer presence and progression. Originally developed for the genomic analysis of highly degraded ancient DNA, single-stranded DNA (ssDNA) library preparation methods are gaining popularity in the field of cfDNA analysis due to their efficiency and ability to convert short, fragmented DNA into sequencing libraries without altering DNA ends. However, current ssDNA methods are costly and time-consuming.
Here we present an efficient ligation-based single-stranded library preparation method that is engineered to produce complex libraries in under 2.5 h from as little as 1 nanogram of input DNA without alteration to the native ends of template molecules. Our method, called Single Reaction Single-stranded LibrarY or SRSLY, ligates uniquely designed Next-Generation Sequencing (NGS) adapters in a one-step combined phosphorylation/ligation reaction that foregoes end-polishing. Using synthetic DNA oligos and cfDNA, we demonstrate the efficiency and utility of this approach and compare with existing double-stranded and single-stranded approaches for library generation. Finally, we demonstrate that cfDNA NGS data generated from SRSLY can be used to analyze DNA fragmentation patterns to deduce nucleosome positioning and transcription factor binding.
SRSLY is a versatile tool for converting short and fragmented DNA molecules, like cfDNA fragments, into sequencing libraries while retaining native lengths and ends.
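The fragmentation analysis mentioned above typically starts from the distribution of aligned fragment lengths: plasma cfDNA characteristically peaks near ~167 bp, the length protected by a nucleosome plus linker. The sketch below computes that histogram from (start, end) alignment coordinates; it is a generic illustration with hypothetical fragments, not the SRSLY analysis code.

```python
from collections import Counter

def fragment_length_histogram(fragments):
    """Histogram of cfDNA fragment lengths from (start, end)
    alignment coordinates on a single reference.

    The mode of this histogram (typically ~167 bp for plasma cfDNA,
    reflecting nucleosome + linker protection) is a quick sanity
    check on a library, and the full shape feeds nucleosome
    positioning and transcription factor footprint analyses.
    """
    return Counter(end - start for start, end in fragments)

# Hypothetical aligned fragments (chromosomal start/end positions)
frags = [(100, 267), (500, 667), (900, 1067), (2000, 2145)]
hist = fragment_length_histogram(frags)
print(hist.most_common(1))  # [(167, 3)]
```

Because SRSLY preserves the native ends of template molecules, histograms and end coordinates computed this way reflect in vivo fragmentation rather than artifacts of end-polishing.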