
    Methods to Isolate Possible Bacteriophages for Micrococcus luteus and Acinetobacter baumannii

    The increasing prevalence of antibiotic-resistant strains of bacteria has led to a crisis in treatment options. Acinetobacter baumannii is an example of a bacterium that has developed a dangerous level of multidrug resistance. Not only does it carry genes conferring resistance to antibiotics, but it also produces a protective biofilm. In recent years, A. baumannii has become a major contributor to nosocomial infections, making it critical to develop new treatment methods. Micrococcus luteus, while typically not thought of as a pathogen, is also developing resistance to antibiotics. M. luteus is capable of forming a biofilm on its own, which is worrisome given that it has increasingly been noted as an opportunistic pathogen. One potential response to antibiotic resistance is bacteriophage therapy, which uses bacterial viruses to target and treat the infection. This study examines methods for isolating novel bacteriophages from dairy cattle feces, specifically for the biofilm producers A. baumannii and M. luteus.

    Text mixing shapes the anatomy of rank-frequency distributions

    Natural languages are full of rules and exceptions. One of the most famous quantitative rules is Zipf's law, which states that the frequency of occurrence of a word is approximately inversely proportional to its rank. Though this law of ranks has been found to hold across disparate texts and forms of data, analyses of increasingly large corpora since the late 1990s have revealed the existence of two scaling regimes. These regimes have thus far been explained by a hypothesis suggesting a separability of languages into core and noncore lexica. Here we present and defend an alternative hypothesis: that the two scaling regimes result from the act of aggregating texts. We observe that text mixing leads to an effective decay of word introduction, which we show provides accurate predictions of the location and severity of breaks in scaling. Upon examining large corpora from 10 languages in the Project Gutenberg eBooks collection, we find emphatic empirical support for the universality of our claim.
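    As a rough companion to the rank-frequency analysis described above, the following minimal Python sketch (not the authors' code) computes a word rank-frequency distribution from a plain-text file and estimates a single Zipf exponent by a log-log least-squares fit. The file name "corpus.txt" and the fitting cutoff are illustrative placeholders; a break between two scaling regimes would show up as a change in slope beyond the fitted head of the distribution.

```python
# Minimal sketch (not the authors' implementation): compute a rank-frequency
# distribution for a plain-text file and estimate a Zipf exponent alpha in
# f(r) ~ r^(-alpha) via a least-squares fit in log-log space.
# "corpus.txt" and the fitting cutoff are illustrative placeholders.
import re
from collections import Counter

import numpy as np

with open("corpus.txt", encoding="utf-8") as fh:
    words = re.findall(r"[a-z']+", fh.read().lower())

counts = Counter(words)
freqs = np.array(sorted(counts.values(), reverse=True), dtype=float)
ranks = np.arange(1, len(freqs) + 1, dtype=float)

# Fit only the head of the distribution (first scaling regime); a break in
# scaling would appear as a change in slope beyond this cutoff.
cutoff = min(10_000, len(freqs))
slope, intercept = np.polyfit(np.log(ranks[:cutoff]), np.log(freqs[:cutoff]), 1)
print(f"estimated Zipf exponent alpha ~ {-slope:.2f}")
```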

    Low temperature reduction of hexavalent chromium by a microbial enrichment consortium and a novel strain of Arthrobacter aurescens

    BACKGROUND: Chromium is a transition metal most commonly found in the environment in its trivalent [Cr(III)] and hexavalent [Cr(VI)] forms. The EPA maximum total chromium contaminant level for drinking water is 0.1 mg/l (0.1 ppm). Many water sources, especially underground sources, are at low temperatures (less than or equal to 15 °C) year round. It is important to evaluate the possibility of microbial remediation of Cr(VI) contamination using microorganisms adapted to these low temperatures (psychrophiles). RESULTS: Core samples obtained from a Cr(VI)-contaminated aquifer at the Hanford facility in Washington were enriched in Vogel-Bonner medium at 10 °C with 0, 25, 50, 100, 200, 400 and 1000 mg/l Cr(VI). The extent of Cr(VI) reduction was evaluated using the diphenylcarbazide assay. Resistance to Cr(VI) up to and including 1000 mg/l Cr(VI) was observed in the consortium experiments. Reduction was slow or not observed at and above 100 mg/l Cr(VI) using the enrichment consortium. The average time to complete reduction of Cr(VI) in the 30 and 60 mg/l Cr(VI) cultures of the consortium was 8 and 17 days, respectively, at 10 °C. Lyophilized consortium cells did not demonstrate adsorption of Cr(VI) over a 24 hour period. Successful isolation of a Cr(VI)-reducing organism (designated P4) from the consortium was confirmed by 16S rDNA amplification and sequencing. The average time to complete reduction of Cr(VI) at 10 °C in the 25 and 50 mg/l Cr(VI) cultures of the isolate P4 was 3 and 5 days, respectively. The 16S rDNA sequence from isolate P4 identified this organism as a strain of Arthrobacter aurescens, a species that has not previously been shown to be capable of low-temperature Cr(VI) reduction. CONCLUSION: A. aurescens, indigenous to the subsurface, has the potential to be a predominant metal reducer in enhanced, in situ subsurface bioremediation efforts involving Cr(VI) and possibly other heavy metals and radionuclides.
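    As a back-of-the-envelope illustration only (not a calculation from the paper), the complete-reduction times quoted above imply the approximate average reduction rates computed below, assuming the full initial Cr(VI) concentration was reduced over each stated interval.

```python
# Back-of-the-envelope sketch (not from the paper): average Cr(VI) reduction
# rates implied by the complete-reduction times quoted above, assuming the
# entire initial concentration was reduced over the stated interval.
cases = {
    "consortium, 30 mg/l": (30, 8),   # (mg/l Cr(VI), days to complete reduction)
    "consortium, 60 mg/l": (60, 17),
    "isolate P4, 25 mg/l": (25, 3),
    "isolate P4, 50 mg/l": (50, 5),
}

for label, (conc_mg_per_l, days) in cases.items():
    rate = conc_mg_per_l / days
    print(f"{label}: ~{rate:.1f} mg/l per day at 10 °C")
```

    On these rough figures, isolate P4 reduces Cr(VI) roughly two to three times faster than the enrichment consortium at 10 °C.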

    The sociospatial factors of death: Analyzing effects of geospatially-distributed variables in a Bayesian mortality model for Hong Kong

    Human mortality is in part a function of multiple socioeconomic factors that differ both spatially and temporally. Adjusting for other covariates, the human lifespan is positively associated with household wealth. However, the extent to which mortality in a geographical region is a function of socioeconomic factors in both that region and its neighbors is unclear. There is also little information on the temporal components of this relationship. Using the districts of Hong Kong over multiple census years as a case study, we demonstrate that there are differences in how wealth indicator variables are associated with longevity in (a) areas that are affluent but neighbored by socially deprived districts versus (b) wealthy areas surrounded by similarly wealthy districts. We also show that the inclusion of spatially-distributed variables reduces uncertainty in mortality rate predictions in each census year when compared with a baseline model. Our results suggest that geographic mortality models should incorporate nonlocal information (e.g., spatial neighbors) to lower the variance of their mortality estimates, and point to a more in-depth analysis of sociospatial spillover effects on mortality rates.
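    A minimal sketch of the kind of model this abstract describes, under heavy assumptions and not the authors' actual specification: a Poisson regression on district death counts in PyMC, with one coefficient for a district's own wealth indicator and one for the average indicator of its spatial neighbors. The district count, adjacency weights, and all data arrays below are synthetic placeholders.

```python
# Minimal sketch (assumptions, not the authors' model): a Poisson mortality
# model in PyMC where each district's death count depends on its own wealth
# indicator and on the average of its spatial neighbors' indicators.
# All data arrays below are illustrative placeholders.
import numpy as np
import pymc as pm

rng = np.random.default_rng(0)
n_districts = 18                                 # placeholder district count
deaths = rng.poisson(200, n_districts)           # observed death counts
exposure = rng.uniform(5e4, 5e5, n_districts)    # person-years at risk
wealth = rng.normal(0, 1, n_districts)           # local wealth indicator

# Row-normalized adjacency matrix W, so that W @ wealth is the neighbor average.
W = rng.random((n_districts, n_districts))
np.fill_diagonal(W, 0)
W /= W.sum(axis=1, keepdims=True)
neighbor_wealth = W @ wealth

with pm.Model() as model:
    intercept = pm.Normal("intercept", 0.0, 5.0)
    b_local = pm.Normal("b_local", 0.0, 1.0)        # effect of own wealth
    b_neighbor = pm.Normal("b_neighbor", 0.0, 1.0)  # effect of neighbors' wealth
    log_rate = intercept + b_local * wealth + b_neighbor * neighbor_wealth
    mu = pm.math.exp(log_rate) * exposure
    pm.Poisson("deaths", mu=mu, observed=deaths)
    idata = pm.sample(1000, tune=1000, chains=2)
```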

    Transitions in climate and energy discourse between Hurricanes Katrina and Sandy

    Although climate change and energy are intricately linked, their explicit connection is not always prominent in public discourse and the media. Disruptive extreme weather events, including hurricanes, focus public attention in new and different ways, offering a unique window of opportunity to analyze how a focusing event influences public discourse. Media coverage of extreme weather events simultaneously shapes and reflects public discourse on climate issues. Here, we analyze climate and energy newspaper coverage of Hurricanes Katrina (2005) and Sandy (2012) using topic models, mathematical techniques used to discover abstract topics within a set of documents. Our results demonstrate that post-Katrina media coverage does not contain a climate change topic, and the energy topic is limited to discussion of energy prices, markets, and the economy, with almost no explicit linkages made between energy and climate change. In contrast, post-Sandy media coverage does contain a prominent climate change topic, a distinct energy topic, and an integrated representation of climate change and energy, indicating a shift in climate and energy reporting between Hurricane Katrina and Hurricane Sandy.
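    For readers unfamiliar with topic models, the following is a generic illustration using latent Dirichlet allocation in scikit-learn; it is not the authors' pipeline or corpus, and the three toy "articles" are placeholders standing in for newspaper coverage.

```python
# Illustrative sketch only: a generic LDA topic model over a few toy documents,
# in the spirit of the topic modeling described above (not the authors' exact
# method or corpus).
from sklearn.decomposition import LatentDirichletAllocation
from sklearn.feature_extraction.text import CountVectorizer

articles = [
    "storm surge flooded the city and damaged the power grid",
    "gasoline prices rose sharply after refineries shut down",
    "scientists link warming oceans to stronger hurricanes",
]

vectorizer = CountVectorizer(stop_words="english")
doc_term = vectorizer.fit_transform(articles)

lda = LatentDirichletAllocation(n_components=2, random_state=0)
lda.fit(doc_term)

# Print the top words of each discovered topic.
terms = vectorizer.get_feature_names_out()
for k, weights in enumerate(lda.components_):
    top = [terms[i] for i in weights.argsort()[::-1][:5]]
    print(f"topic {k}: {', '.join(top)}")
```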

    Identifying missing dictionary entries with frequency-conserving context models

    In an effort to better understand meaning from natural language texts, we explore methods aimed at organizing lexical objects into contexts. A number of these methods for organization fall into a family defined by word ordering. Unlike demographic or spatial partitions of data, these collocation models are of special importance for their universal applicability. While we are interested here in text and have framed our treatment appropriately, our work is potentially applicable to other areas of research (e.g., speech, genomics, and mobility patterns) where one has ordered categorical data (e.g., sounds, genes, and locations). Our approach focuses on the phrase (whether a word or larger) as the primary meaning-bearing lexical unit and object of study. To do so, we employ our previously developed framework for generating word-conserving phrase-frequency data. Upon training our model with the Wiktionary, an extensive, online, collaborative, and open-source dictionary that contains over 100,000 phrasal definitions, we develop highly effective filters for the identification of meaningful, missing phrase entries. With our predictions we then engage the editorial community of the Wiktionary and propose short lists of potential missing entries for definition, developing a breakthrough lexical extraction technique and expanding our knowledge of the defined English lexicon of phrases.
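    A deliberately simplified sketch of the general idea, not the authors' frequency-conserving framework: count candidate two-word phrases in a text and flag frequent ones missing from a toy stand-in for a dictionary's phrase entries.

```python
# Simple sketch for intuition (not the authors' frequency-conserving method):
# count candidate two-word phrases and flag frequent ones that are missing
# from a toy "dictionary" of known phrase entries. The text and dictionary
# contents are illustrative placeholders.
import re
from collections import Counter

text = (
    "she gave up on the project but did not give up hope, "
    "and in the end the project paid off in the end"
)
known_phrases = {"give up", "pay off"}   # toy stand-in for Wiktionary entries

words = re.findall(r"[a-z']+", text.lower())
bigrams = Counter(zip(words, words[1:]))

candidates = [
    (" ".join(pair), n)
    for pair, n in bigrams.most_common()
    if n >= 2 and " ".join(pair) not in known_phrases
]
print(candidates)   # frequent bigrams not yet in the dictionary
```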

    Evaluation of a Bayesian inference network for ligand-based virtual screening

    Background: Bayesian inference networks enable the computation of the probability that an event will occur. They have been used previously to rank textual documents in order of decreasing relevance to a user-defined query. Here, we modify the approach to enable a Bayesian inference network to be used for chemical similarity searching, where a database is ranked in order of decreasing probability of bioactivity. Results: Bayesian inference networks were implemented using two different types of network and four different types of belief function. Experiments with the MDDR and WOMBAT databases show that a Bayesian inference network can be used to provide effective ligand-based screening, especially when the active molecules being sought have a high degree of structural homogeneity; in such cases, the network substantially outperforms a conventional, Tanimoto-based similarity searching system. However, the effectiveness of the network is much lower when structurally heterogeneous sets of actives are being sought. Conclusion: A Bayesian inference network provides an interesting alternative to existing tools for ligand-based virtual screening.
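    The Tanimoto-based baseline mentioned above has a simple closed form: for two fingerprints represented as bit sets A and B, the coefficient is |A ∩ B| / |A ∪ B|. The sketch below ranks a toy database against a query by that measure; the fingerprints are invented bit sets rather than real molecular descriptors, and this illustrates the conventional baseline, not the Bayesian inference network itself.

```python
# Minimal sketch of the conventional baseline mentioned above: Tanimoto
# similarity between binary molecular fingerprints, used to rank a database
# against a query. Fingerprints here are toy bit sets, not real descriptors.
def tanimoto(fp_a: set[int], fp_b: set[int]) -> float:
    """Tanimoto coefficient |A & B| / |A | B| for two bit sets."""
    if not fp_a and not fp_b:
        return 0.0
    return len(fp_a & fp_b) / len(fp_a | fp_b)

query = {1, 4, 7, 9, 12}
database = {
    "mol_1": {1, 4, 7, 9, 13},
    "mol_2": {2, 3, 5, 8},
    "mol_3": {1, 4, 9, 12, 15, 18},
}

# Rank the database in order of decreasing similarity to the query.
ranked = sorted(database.items(), key=lambda kv: tanimoto(query, kv[1]), reverse=True)
for name, fp in ranked:
    print(name, round(tanimoto(query, fp), 3))
```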