7 research outputs found

    High-Precision, Whole-Genome Sequencing of Laboratory Strains Facilitates Genetic Studies

    Whole-genome sequencing is a powerful technique for obtaining the reference sequence information of multiple organisms. Its use can be dramatically expanded to rapidly identify genomic variations, which can be linked with phenotypes to obtain biological insights. We explored these potential applications using the emerging next-generation sequencing platform Solexa Genome Analyzer and the well-characterized model bacterium Bacillus subtilis. Combining sequencing with experimental verification, we first improved the accuracy of the published sequence of the B. subtilis reference strain 168, then obtained sequences of multiple related laboratory strains and different isolates of each strain. This provides a framework for comparing the divergence between different laboratory strains and between their individual isolates. We also demonstrated the power of Solexa sequencing by using its results to predict a defect in the citrate signal transduction pathway of a common laboratory strain, which we verified experimentally. Finally, we examined the molecular nature of spontaneously generated mutations that suppress the growth defect caused by deletion of the stringent response mediator relA. Using whole-genome sequencing, we rapidly mapped these suppressor mutations to two small homologs of relA. Interestingly, stable suppressor strains had mutations in both genes, with each mutation alone partially relieving the relA growth defect. This supports an intriguing three-locus interaction module that is not easily identifiable through traditional suppressor mapping. We conclude that whole-genome sequencing can drastically accelerate the identification of suppressor mutations and complex genetic interactions, and it can be applied as a standard tool to investigate the genetic traits of model organisms.

    The Digital Revolution Begets the Global Spatial Data

    Regardless of when the digital revolution began, profound changes have occurred during the past 50 years in the way spatial data are collected, stored, manipulated, and used. In years past, maps were made using plane table methods and other manual surveying techniques. But mapping and the automation of spatial data collection have progressed enormously since the 1950s and 1960s, when photogrammetric mapping techniques greatly simplified design activities for the U.S. interstate highway system. Similarly, the electronic calculator replaced the slide rule in the 1970s, and surveyors have been using electronic total stations since the early 1980s. During the past 20 years, the global positioning system (GPS), geographic information systems (GIS's), and the world-wide-web (WWW) have joined the technological onslaught, so that now spatial data can be characterized as digital and three-dimensional (3-D). Regrettably, development and implementation of the conceptual models used to handle spatial data have not kept pace. But a comprehensive 3-D global spatial data model (GSDM) has been defined which accommodates new technology, existing practices, and any location on earth or within the birdcage of orbiting GPS satellites. Spatial data users in many disciplines all over the world stand to benefit from adopting and using a comprehensive standard 3-D model. In 1569 Gerhardus Mercator published his 21-sheet map of the world, based upon latitude and longitude spacing, that became known as a conformal map projection. The universal transverse Mercator (UTM) projection is still used all over the world, and the state plane coordinate system in the United States uses three different conformal projections. But a map projection is strictly a 2-D model, and spatial data are 3-D. Furthermore, the ubiquitous digital computer is now part of everything we do.
With the advent of the digital revolution, analog maps have given way to electronic digital databases, GPS provides instantaneous position to novice and expert alike, and remote sensing satellites are capturing images of our planet earth 24 hours a day, 7 days a week. In a speech on "The Digital Earth" (www.isde5.org/al_gore_speech.htm) given by then Vice President Al Gore at the California Science Center on January 31, 1998, he said, "The hard part of taking advantage of this flood of geo-spatial information will be making sense of it - turning raw data into understandable information." The National Spatial Data Infrastructure (NSDI) and associated geographic information systems (www.fgdc.gov/nsdi/nsdi.html) are promoted as being the best way to handle geo-spatial data. The NSDI offers many advantages (e.g., specificity and standardization), but the NSDI has two serious flaws: 1) without reliable geoid heights, the NSDI is not a true 3-D database, and 2) metadata do an incomplete job of describing spatial data accuracy. The analogy is putting new (digital) wine into old bottles.

    Barcroft-Warburg Manometric Apparatus - Usage, Recent Developments, and Applications
