110 research outputs found

    Eighteenth century Yersinia pestis genomes reveal the long-term persistence of an historical plague focus

    © Bos et al. The 14th-18th century pandemic of Yersinia pestis caused devastating disease outbreaks in Europe for almost 400 years. The reasons for plague's persistence and abrupt disappearance in Europe are poorly understood, but could have been due either to the presence of now-extinct plague foci in Europe itself or to successive disease introductions from other locations. Here we present five Y. pestis genomes from one of the last European outbreaks of plague, from 1722 in Marseille, France. The lineage identified has not been found in any extant Y. pestis foci sampled to date, and has its ancestry in strains obtained from victims of the 14th century Black Death. These data suggest the existence of a previously uncharacterized historical plague focus that persisted for at least three centuries. We propose that this disease source may have been responsible for the many resurgences of plague in Europe following the Black Death.

    Quantification of damage in DNA recovered from highly degraded samples – a case study on DNA in faeces

    BACKGROUND: Poorly preserved biological tissues have become an important source of DNA for a wide range of zoological studies. Measuring the quality of DNA obtained from these samples is often desired; however, there are no widely used techniques available for quantifying damage in highly degraded DNA samples. We present a general method that can be used to determine the frequency of polymerase-blocking DNA damage in specific gene regions in such samples. The approach uses quantitative PCR to measure the amount of DNA present at several fragment sizes within a sample. Under a model of random degradation, the amount of available template will decline exponentially with increasing fragment size in damaged samples, and the frequency of DNA damage (λ) can be estimated by determining the rate of decline. RESULTS: The method is illustrated through the analysis of DNA extracted from sea lion faecal samples. Faeces contain a complex mixture of DNA from several sources, and the different components are expected to be differentially degraded. We estimated the frequency of DNA damage in both predator and prey DNA within individual faecal samples. The distribution of fragment lengths for each target fit well with the assumption of a random degradation process and, in keeping with our expectations, the estimated frequency of damage was always lower in predator DNA than in prey DNA within the same sample (mean λ(predator) = 0.0106 per nucleotide; mean λ(prey) = 0.0176 per nucleotide). This study is the first to explicitly define the amount of template damage in any DNA extracted from faeces and the first to quantify the amount of predator and prey DNA present within individual faecal samples. CONCLUSION: We present an approach for characterizing mixed, highly degraded PCR templates such as those often encountered in ecological studies using non-invasive samples as a source of DNA, in wildlife forensics investigations and in ancient DNA research. This method will allow researchers to measure template quality in order to evaluate alternate sources of DNA, different methods of sample preservation and different DNA extraction protocols. The technique could also be applied to study the process of DNA decay.
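
    The exponential model above lends itself to a simple log-linear fit. The sketch below is illustrative only, not the authors' code: it uses made-up qPCR quantities at four hypothetical amplicon sizes and estimates λ as the negative slope of ln(quantity) against fragment size, following N(L) = N0·e^(−λL).

```python
# Minimal sketch (not the authors' code): estimating the damage frequency
# lambda from qPCR quantities measured at several amplicon sizes, assuming
# the random-degradation model N(L) = N0 * exp(-lambda * L) described above.
# The quantities below are invented for illustration.
import math

def estimate_lambda(fragment_sizes, quantities):
    """Fit ln(quantity) = ln(N0) - lambda * size by least squares and
    return (lambda_per_nucleotide, N0)."""
    xs = fragment_sizes
    ys = [math.log(q) for q in quantities]
    n = len(xs)
    mean_x = sum(xs) / n
    mean_y = sum(ys) / n
    slope = (sum((x - mean_x) * (y - mean_y) for x, y in zip(xs, ys))
             / sum((x - mean_x) ** 2 for x in xs))
    intercept = mean_y - slope * mean_x
    return -slope, math.exp(intercept)

# Hypothetical qPCR results: amplicon length (bp) -> template copies detected
sizes = [80, 150, 250, 400]
copies = [9200, 4300, 1500, 310]

lam, n0 = estimate_lambda(sizes, copies)
print(f"estimated lambda = {lam:.4f} per nucleotide, N0 = {n0:.0f} copies")
```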

    Abundant Human DNA Contamination Identified in Non-Primate Genome Databases

    During routine screens of the NCBI databases using human repetitive elements we discovered an unexpectedly high level of nucleotide identity across a broad range of phyla. To ascertain whether databases containing DNA sequences, genome assemblies and trace archive reads were contaminated with human sequences, we performed an in-depth search for sequences of human origin in non-human species. Using a primate-specific SINE, AluY, we screened 2,749 non-primate public databases from NCBI, Ensembl, JGI, and UCSC and found 492 to be contaminated with human sequence. These represent species ranging from bacteria (B. cereus) to plants (Z. mays) to fish (D. rerio), with examples found in most phyla. The identification of such extensive contamination with human sequence across databases and sequence types warrants caution among the sequencing community in future sequencing efforts, such as human re-sequencing. We discuss the issues this may raise and present data that give insight into how this contamination may be occurring.
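
    As a rough illustration of this kind of screen (not the authors' pipeline), the following sketch runs NCBI blastn against a pre-built BLAST database of a non-primate assembly using an AluY consensus as the query; the file names, database name and thresholds are placeholders.

```python
# Illustrative sketch only: screening an assembly for hits to a human AluY
# consensus with NCBI blastn. Assumes blastn is on PATH, "aluY.fasta" holds
# the AluY consensus, and "assembly_db" is a BLAST database built from the
# non-primate assembly with makeblastdb. All names are placeholders.
import csv
import subprocess

result = subprocess.run(
    ["blastn", "-query", "aluY.fasta", "-db", "assembly_db",
     "-outfmt", "6", "-evalue", "1e-10", "-perc_identity", "90"],
    capture_output=True, text=True, check=True,
)

# Tabular output (outfmt 6): qseqid sseqid pident length mismatch gapopen
# qstart qend sstart send evalue bitscore
hits = csv.reader(result.stdout.splitlines(), delimiter="\t")
for qseqid, sseqid, pident, length, *rest in hits:
    print(f"possible human contamination: {sseqid} "
          f"({pident}% identity over {length} bp)")
```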

    Amplification of cox2 (∼620 bp) from 2 mg of Up to 129 Years Old Herbarium Specimens, Comparing 19 Extraction Methods and 15 Polymerases

    In recent years an increasing number of studies have focussed on the use of herbarium specimens for molecular phylogenetic investigations, and several comparative studies have been published. However, in the studies reported so far, rather large amounts of material (typically around 100 mg) were usually sampled for DNA extraction, roughly equivalent to 8 cm2 of a medium-thick leaf. For investigating the phylogeny of plant pathogens, such large amounts of tissue are usually not available, or sampling them would irretrievably damage the specimens. Through systematic comparison of 19 DNA extraction protocols applied to only 2 mg of infected leaf tissue and testing of 15 different DNA polymerases, we could successfully amplify a mitochondrial DNA region (cox2; ∼620 bp) from herbarium specimens well over a hundred years old. We conclude that the DNA extraction method and the choice of DNA polymerase are crucial factors for successful PCR amplification from small samples of historic herbarium specimens. Through a combination of suitable DNA extraction protocols and DNA polymerases, only a fraction of the preserved plant material commonly used is necessary for successful PCR amplification. This facilitates the potential use of a far larger number of preserved specimens for molecular phylogenetic investigation and provides access to a wealth of genetic information preserved in specimens deposited in herbaria around the world, without reducing their scientific or historical value.

    Characterization of Nucleotide Misincorporation Patterns in the Iceman's Mitochondrial DNA

    BACKGROUND: The degradation of DNA represents one of the main issues in the genetic analysis of archeological specimens. In recent years, a particular kind of post-mortem DNA modification giving rise to nucleotide misincorporation ("miscoding lesions") has been the object of extensive investigation. METHODOLOGY/PRINCIPAL FINDINGS: To improve our knowledge of the nature and incidence of ancient DNA nucleotide misincorporations, we analysed 6,859 (629,975 bp) mitochondrial (mt) DNA sequences obtained from the 5,350-5,100-year-old, freeze-desiccated human mummy popularly known as the Tyrolean Iceman or Ötzi. To generate the sequences, we applied a mixed PCR/pyrosequencing procedure that allows a particularly high sequence coverage to be obtained. As a control, we produced a further 8,982 (805,155 bp) mtDNA sequences from a contemporary specimen using the same system and starting from the same template copy number as the ancient sample. From the analysis of the nucleotide misincorporation rate in ancient, modern, and putative contaminant sequences, we observed that the rate of misincorporation is significantly lower in the modern and putative contaminant sequence datasets than in the ancient sequences. Type 2 transitions represent the vast majority (85%) of the observed nucleotide misincorporations in ancient sequences. CONCLUSIONS/SIGNIFICANCE: This study provides a further contribution to the knowledge of nucleotide misincorporation patterns in DNA sequences obtained from freeze-preserved archeological specimens. In the Iceman system, ancient sequences can be clearly distinguished from contaminants on the basis of nucleotide misincorporation rates. This observation confirms a previous identification of the ancient mummy sequences made on a purely phylogenetic basis. The present investigation provides further indication that the majority of ancient DNA damage is reflected in type 2 (cytosine→thymine / guanine→adenine) transitions and that type 1 transitions are essentially PCR artifacts.
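
    To make the classification concrete, the toy sketch below (hypothetical sequences, not the authors' pipeline) tallies type 1 (A→G / T→C) and type 2 (C→T / G→A) transitions between aligned reads and a consensus, the kind of per-site misincorporation rate the abstract compares between ancient and contaminant datasets.

```python
# Minimal sketch with invented data: counting transition types between
# reads and a consensus of the same length (gap-free alignment assumed).
# Type 2 = C->T / G->A, type 1 = A->G / T->C, as in the abstract above.
from collections import Counter

TYPE1 = {("A", "G"), ("T", "C")}
TYPE2 = {("C", "T"), ("G", "A")}

def misincorporation_counts(consensus, reads):
    """Return (Counter of transition types, number of sites compared)."""
    counts = Counter()
    sites = 0
    for read in reads:
        for ref_base, read_base in zip(consensus, read):
            sites += 1
            if (ref_base, read_base) in TYPE2:
                counts["type2"] += 1
            elif (ref_base, read_base) in TYPE1:
                counts["type1"] += 1
    return counts, sites

# Toy example: two short "reads" against a consensus
consensus = "ACGTACGTCC"
reads = ["ACGTACGTTC",   # one C->T difference
         "ACATACGTCC"]   # one G->A difference
counts, sites = misincorporation_counts(consensus, reads)
print(counts, f"rate = {sum(counts.values()) / sites:.3f} per site")
```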

    To Clone or Not To Clone: Method Analysis for Retrieving Consensus Sequences In Ancient DNA Samples

    The challenges associated with the retrieval and authentication of ancient DNA (aDNA) evidence are principally due to post-mortem damage, which makes ancient samples particularly prone to contamination from “modern” DNA sources. The necessity for authentication of results has led many aDNA researchers to adopt methods considered to be “gold standards” in the field, including cloning aDNA amplicons as opposed to directly sequencing them. However, no standardized protocol has emerged regarding the necessary number of clones to sequence, how a consensus sequence is most appropriately derived, or how results should be reported in the literature. In addition, there has been no systematic demonstration of the degree to which direct sequences are affected by damage, or of whether direct sequencing would produce results that differ from a consensus of clones.
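
    For illustration only (toy data, not a protocol endorsed by the authors), the sketch below derives a simple majority-rule consensus from aligned clone sequences, the kind of consensus-calling step the abstract refers to when comparing cloning with direct sequencing.

```python
# Illustrative sketch: a simple majority-rule consensus across aligned
# clone sequences of equal length. Clone sequences here are invented.
from collections import Counter

def majority_consensus(aligned_clones, min_fraction=0.5):
    """Call the most common base at each column; columns where no base
    exceeds min_fraction of clones are reported as 'N'."""
    consensus = []
    for column in zip(*aligned_clones):
        base, count = Counter(column).most_common(1)[0]
        consensus.append(base if count / len(column) > min_fraction else "N")
    return "".join(consensus)

clones = [
    "ACGTTCGA",
    "ACATTCGA",   # singleton difference in one clone (possible damage)
    "ACGTTCGA",
    "ACGTCCGA",   # singleton difference in another clone
]
print(majority_consensus(clones))   # -> ACGTTCGA
```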

    Methods for comparative metagenomics

    BACKGROUND: Metagenomics is a rapidly growing field of research that aims at studying uncultured organisms to understand the true diversity of microbes, their functions, cooperation and evolution, in environments such as soil, water, ancient remains of animals, or the digestive systems of animals and humans. The recent development of ultra-high-throughput sequencing technologies, which do not require cloning or PCR amplification and can produce huge numbers of DNA reads at an affordable cost, has boosted the number and scope of metagenomic sequencing projects. Increasingly, there is a need for new ways of comparing multiple metagenomic datasets, and for fast and user-friendly implementations of such approaches. RESULTS: This paper introduces a number of new methods for interactively exploring, analyzing and comparing multiple metagenomic datasets, which will be made freely available in a new, comparative version 2.0 of the stand-alone metagenome analysis tool MEGAN. CONCLUSION: There is a great need for powerful and user-friendly tools for comparative analysis of metagenomic data, and MEGAN 2.0 will help to fill this gap.

    Fragmentation of Contaminant and Endogenous DNA in Ancient Samples Determined by Shotgun Sequencing; Prospects for Human Palaeogenomics

    Despite the successful retrieval of genomes from past remains, the prospects for human palaeogenomics remain unclear because of the difficulty of distinguishing contaminant from endogenous DNA sequences. Previous sequence data generated on high-throughput sequencing platforms indicate that fragmentation of ancient DNA sequences is a characteristic trait, arising primarily from depurination processes that create abasic sites and lead to DNA breaks.
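
    Since fragment length is one of the traits used to separate endogenous from contaminant molecules, a minimal sketch along the following lines (placeholder BAM file names, assuming the pysam library; not the authors' pipeline) summarizes read-length distributions for reads mapped to an endogenous reference versus a human reference.

```python
# Hedged sketch: compare read-length distributions of reads mapped to an
# endogenous reference and to a putative contaminant (human) reference.
# The BAM file names are placeholders for pre-computed alignments.
import statistics
import pysam

def length_distribution(bam_path):
    """Collect query lengths of mapped reads from a BAM file."""
    lengths = []
    with pysam.AlignmentFile(bam_path, "rb") as bam:
        for read in bam:
            if not read.is_unmapped:
                lengths.append(read.query_length)
    return lengths

endogenous = length_distribution("reads_vs_target_genome.bam")
contaminant = length_distribution("reads_vs_human_reference.bam")

for label, lengths in (("endogenous", endogenous),
                       ("contaminant", contaminant)):
    print(f"{label}: n={len(lengths)}, "
          f"median length={statistics.median(lengths)} bp")
```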