
    Transfer origins in the conjugative Enterococcus faecalis plasmids pAD1 and pAM373: identification of the pAD1 nic site, a specific relaxase and a possible TraG-like protein

    The Enterococcus faecalis conjugative plasmids pAD1 and pAM373 encode a mating response to the peptide sex pheromones cAD1 and cAM373, respectively. Sequence determination of both plasmids has recently been completed, with strong similarity evident over many of the structural genes related to conjugation. pAD1 has two origins of transfer: oriT1 is located within the repA determinant, whereas the more efficiently utilized oriT2 is located between orf53 and orf57, two genes found in the present study to be essential for conjugation. We found a similarly located oriT to be present in pAM373. oriT2 corresponds to about 285 bp, based on its ability to facilitate mobilization by pAD1 when ligated to the shuttle vector pAM401; however, it was not mobilized by pAM373. In contrast, a similarly ligated fragment containing the oriT of pAM373 did not facilitate mobilization by pAD1 but was efficiently mobilized by pAM373. The oriT sites of the two plasmids each contained a homologous large inverted repeat (spanning about 140 bp) adjacent to a series of non-homologous short (6 bp) direct repeats. A hybrid construction containing the inverted repeat of pAM373 and the direct repeats of pAD1 was mobilized efficiently by pAD1 but not by pAM373, indicating that a significantly greater degree of specificity is associated with the direct repeats. Mutational (deletion) analyses of the pAD1 oriT2 inverted repeat structure suggested its importance in facilitating transfer, or perhaps in ligation of the ends of the newly transferred DNA strand. Analyses showed that Orf57 (to be called TraX) is the relaxase, which was found to induce a specific nick in the large inverted repeat inside oriT; the protein also facilitated site-specific recombination between two oriT2 sites. Orf53 (to be called TraW) exhibits certain structural similarities to TraG-like proteins, although there is little overall homology.
    Peer Reviewed
    http://deepblue.lib.umich.edu/bitstream/2027.42/72536/1/j.1365-2958.2002.03007.x.pd
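Since the oriT architecture above hinges on an inverted repeat (a segment followed by the reverse complement of itself), a minimal sketch of how such a structure can be recognized computationally may be useful. The sequences below are illustrative placeholders, not the actual pAD1 or pAM373 oriT sequences.

```python
# Sketch: recognizing an inverted repeat, the kind of structure reported
# at the pAD1/pAM373 oriT sites. Sequences are illustrative only.

COMPLEMENT = {"A": "T", "T": "A", "G": "C", "C": "G"}

def reverse_complement(seq: str) -> str:
    """Return the reverse complement of a DNA sequence."""
    return "".join(COMPLEMENT[base] for base in reversed(seq.upper()))

def is_inverted_repeat(arm1: str, arm2: str) -> bool:
    """Two arms form an inverted repeat if arm2 is the
    reverse complement of arm1."""
    return arm2.upper() == reverse_complement(arm1)

# Example with made-up arms: GGATC pairs with GATCC.
print(is_inverted_repeat("GGATC", "GATCC"))  # True
```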

    Update on a Pharmacokinetic-Centric Alternative Tier II Program for MMT—Part II: Physiologically Based Pharmacokinetic Modeling and Manganese Risk Assessment

    Recently, a variety of physiologically based pharmacokinetic (PBPK) models have been developed for the essential element manganese. This paper reviews the development of PBPK models (e.g., for adult, pregnant, lactating, and neonatal rats, nonhuman primates, and adult, pregnant, lactating, and neonatal humans) and relevant risk assessment applications. Each PBPK model incorporates critical features, including dose-dependent saturable tissue capacities and asymmetrical diffusional flux of manganese into brain and other tissues. Varied influx and efflux diffusion rates and binding constants for different brain regions account for the differential increases in regional brain manganese concentrations observed experimentally. We also present novel PBPK simulations to predict manganese tissue concentrations in fetal, neonatal, pregnant, or aged individuals, as well as in individuals with liver disease or chronic manganese inhalation. The results of these simulations could help guide risk assessors in the application of uncertainty factors as they establish exposure guidelines for the general public or workers.
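The two model features emphasized above, asymmetric diffusional flux and a saturable tissue binding capacity, can be sketched in a few lines. This is a minimal illustration, not any published manganese PBPK model: all rate constants and the binding parameters `bmax` and `kd` are hypothetical placeholders.

```python
# Minimal sketch (not a published MMT/manganese model): one brain-region
# compartment exchanging manganese with blood by asymmetric diffusion
# (k_in != k_out), plus a rapid-equilibrium saturable binding pool.
# All parameter values are illustrative placeholders.

def simulate_region(c_blood, k_in=0.5, k_out=0.2, bmax=10.0, kd=1.0,
                    t_end=50.0, dt=0.01):
    """Euler-integrate the free tissue concentration, then add a bound
    pool following the saturable isotherm bound = bmax*free/(kd+free).
    Returns (free, total) tissue concentration."""
    free, t = 0.0, 0.0
    while t < t_end:
        dfree = k_in * c_blood - k_out * free  # asymmetric influx/efflux
        free += dfree * dt
        t += dt
    bound = bmax * free / (kd + free)          # saturable binding
    return free, free + bound

free1, total1 = simulate_region(1.0)
free2, total2 = simulate_region(2.0)
```

Doubling the blood concentration less than doubles the total tissue burden once binding saturates, which is the dose-dependent nonlinearity the abstract describes.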

    Dispelling urban myths about default uncertainty factors in chemical risk assessment - Sufficient protection against mixture effects?

    © 2013 Martin et al.; licensee BioMed Central Ltd. This article has been made available through the Brunel Open Access Publishing Fund.
    Assessing the detrimental health effects of chemicals requires the extrapolation of experimental data in animals to human populations. This is achieved by applying a default uncertainty factor of 100 to doses not found to be associated with observable effects in laboratory animals. It is commonly assumed that the toxicokinetic and toxicodynamic sub-components of this default uncertainty factor represent worst-case scenarios and that the multiplication of those components yields conservative estimates of safe levels for humans. It is sometimes claimed that this conservatism also offers adequate protection from mixture effects. By analysing the evolution of uncertainty factors from a historical perspective, we show that the default factor and its sub-components are intended to represent adequate rather than worst-case scenarios. The intention of using assessment factors for mixture effects was abandoned thirty years ago. It is also often ignored that the conservatism (or otherwise) of uncertainty factors can only be considered in relation to a defined level of protection. A protection equivalent to an effect magnitude of 0.001-0.0001% over background incidence is generally considered acceptable. However, it is impossible to say whether this level of protection is in fact realised with the tolerable doses that are derived by employing uncertainty factors. Accordingly, it is difficult to assess whether uncertainty factors overestimate or underestimate the sensitivity differences in human populations. It is also often not appreciated that the outcome of probabilistic approaches to the multiplication of sub-factors depends on the choice of probability distributions.
    Therefore, the idea that default uncertainty factors are overly conservative worst-case scenarios which can both account for the lack of statistical power in animal experiments and protect against potential mixture effects is ill-founded. We contend that precautionary regulation should provide an incentive to generate better data, and recommend adopting a pragmatic but scientifically better-founded approach to mixture risk assessment.
    Oak Foundation
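The point that probabilistic multiplication of sub-factors depends on the chosen distributions is easy to demonstrate with a toy Monte Carlo simulation. This is a sketch under arbitrary assumptions, not any regulatory procedure: the nominal sub-factor of 10^0.5 (so four sub-factors multiply to the default of 100), the lognormal log-spread of 0.4, and the uniform bounds are illustrative choices only.

```python
# Toy illustration: the upper percentile of a product of four uncertainty
# sub-factors depends on the distributional assumption, even when the
# per-factor medians match. All distribution parameters are arbitrary.
import math
import random

random.seed(0)

MEDIAN = math.sqrt(10)  # nominal sub-factor 10**0.5; four multiply to 100

def combined_factor(draw, n=50_000, q=0.95):
    """q-th percentile of the product of four i.i.d. sub-factors."""
    products = sorted(draw() * draw() * draw() * draw() for _ in range(n))
    return products[int(q * n)]

def lognorm():
    # lognormal with median MEDIAN and log-sd 0.4 (illustrative)
    return math.exp(random.gauss(math.log(MEDIAN), 0.4))

def flat():
    # uniform with the same median, flat spread (illustrative)
    return random.uniform(1.0, 2 * MEDIAN - 1.0)

p_log = combined_factor(lognorm)
p_flat = combined_factor(flat)
# Same nominal combined factor of 100, yet the 95th-percentile product
# differs noticeably between the two distributional choices.
```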

    Movable genetic elements and antibiotic resistance in enterococci

    The enterococci possess genetic elements able to move from one strain to another via conjugation. Certain enterococcal plasmids exhibit a broad host range among gram-positive bacteria, but only when matings are performed on solid surfaces. Other plasmids are more specific to enterococci, transfer efficiently in broth, and encode a response to recipient-produced sex pheromones. Transmissible non-plasmid elements, the conjugative transposons, are widespread among the enterococci and determine their own fertility properties. Drug resistance, hemolysin, and bacteriocin determinants are commonly found on the various transmissible enterococcal elements. Examples of the different systems are discussed in this review.
    Peer Reviewed
    http://deepblue.lib.umich.edu/bitstream/2027.42/47900/1/10096_2005_Article_BF01963632.pd

    Soil and water bioengineering: practice and research needs for reconciling natural hazard control and ecological restoration

    Soil and water bioengineering is a technology that encourages scientists and practitioners to combine their knowledge and skills in the management of ecosystems, with the common goal of maximizing benefits to both man and the natural environment. It involves techniques that use plants as living building materials for (i) natural hazard control (e.g., soil erosion, torrential floods, and landslides) and (ii) ecological restoration or nature-based re-introduction of species on degraded lands, river embankments, and disturbed environments. For a bioengineering project to be successful, engineers are required to highlight all the potential benefits and ecosystem services by documenting the technical, ecological, economic, and social values. The novel approaches used by bioengineers raise questions for researchers and necessitate innovation from practitioners to design bioengineering concepts and techniques. Our objective in this paper, therefore, is to highlight the practice and research needs in soil and water bioengineering for reconciling natural hazard control and ecological restoration. Firstly, we review the definition and development of bioengineering technology, while stressing issues concerning the design, implementation, and monitoring of bioengineering actions. Secondly, we highlight the need to reconcile natural hazard control and ecological restoration by posing novel practice and research questions.

    A mathematical model for breath gas analysis of volatile organic compounds with special emphasis on acetone

    Recommended standardized procedures for determining exhaled lower respiratory nitric oxide and nasal nitric oxide have been developed by task forces of the European Respiratory Society and the American Thoracic Society. These recommendations have paved the way for the measurement of nitric oxide to become a diagnostic tool for specific clinical applications. It would be desirable to develop similar guidelines for the sampling of other trace gases in exhaled breath, especially volatile organic compounds (VOCs), which reflect ongoing metabolism. The concentrations of water-soluble, blood-borne substances in exhaled breath are influenced by (i) breathing patterns affecting gas exchange in the conducting airways, (ii) the concentrations in the tracheo-bronchial lining fluid, and (iii) the alveolar and systemic concentrations of the compound. The classical Farhi equation takes only the alveolar concentrations into account. Real-time measurements of acetone in end-tidal breath under an ergometer challenge show characteristics which cannot be explained within the Farhi setting. Here we develop a compartment model that reliably captures these profiles and is capable of relating breath to the systemic concentrations of acetone. By comparison with experimental data it is inferred that the major part of variability in breath acetone concentrations (e.g., in response to moderate exercise or altered breathing patterns) can be attributed to airway gas exchange, with minimal changes of the underlying blood and tissue concentrations. Moreover, it is deduced that measured end-tidal breath concentrations of acetone determined during resting conditions and free breathing will be rather poor indicators of endogenous levels. In particular, the current formulation includes the classical Farhi and the Scheid series inhomogeneity models as special limiting cases.
    Comment: 38 pages
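The classical Farhi relation mentioned above can be stated compactly: the steady-state alveolar concentration of an inert gas is C_A = C_v / (λ_b:air + V̇_A/Q̇), with blood:air partition coefficient λ_b:air, alveolar ventilation V̇_A, and cardiac output Q̇. The sketch below uses a partition coefficient of 340 for acetone (a literature-typical order of magnitude, used here illustratively) and round ventilation-perfusion numbers; it shows why, in the Farhi setting alone, exercise should barely change breath acetone for a highly blood-soluble compound.

```python
# The classical Farhi equation for an inert gas at steady state:
#   C_A = C_v / (lambda_b_air + VA / Q)
# Parameter values below are illustrative round numbers, not fitted data.

def farhi_alveolar(c_venous, lam_blood_air, va, q):
    """Steady-state alveolar air concentration (same units as c_venous)."""
    return c_venous / (lam_blood_air + va / q)

# Highly soluble compound (acetone-like, lambda ~ 340): the VA/Q term
# is negligible, so rest vs. exercise changes almost nothing.
c_rest = farhi_alveolar(1.0, 340.0, 6.0, 6.0)    # VA/Q = 1 at rest
c_exer = farhi_alveolar(1.0, 340.0, 60.0, 20.0)  # VA/Q = 3 in exercise

# Poorly soluble gas (lambda ~ 0.5): VA/Q dominates and breath levels
# shift strongly with ventilation-perfusion changes.
c_rest_lo = farhi_alveolar(1.0, 0.5, 6.0, 6.0)
c_exer_lo = farhi_alveolar(1.0, 0.5, 60.0, 20.0)
```

The contrast is the abstract's point: since the pure Farhi model predicts almost no exercise response for acetone, the observed end-tidal changes must come from elsewhere, e.g. airway gas exchange.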

    A Qualitative Modeling Approach for Whole Genome Prediction Using High-Throughput Toxicogenomics Data and Pathway-Based Validation

    Efficient high-throughput transcriptomics (HTT) tools promise inexpensive, rapid assessment of the possible biological consequences of human and environmental exposures to the tens of thousands of chemicals in commerce. HTT systems have used relatively small sets of gene expression measurements coupled with mathematical prediction methods to estimate genome-wide gene expression, and are often trained and validated using pharmaceutical compounds. It is unclear whether these training sets are suitable for general toxicity testing applications and the more diverse chemical space represented by commercial chemicals and environmental contaminants. In this work, we built predictive computational models that inferred whole-genome transcriptional profiles from a smaller sample of surrogate genes. The model was trained and validated using a large-scale toxicogenomics database with gene expression data from exposure to heterogeneous chemicals from a wide range of classes (the Open TG-GATEs database). The method of predictor selection was designed to allow high-fidelity gene prediction from any pre-existing gene expression data set, regardless of animal species or measurement platform. Predictive qualitative models were developed with these TG-GATEs data, which contained gene expression data from human primary hepatocytes with 941 samples covering 158 compounds. A sequential forward search-based greedy algorithm, combining different fitting approaches and machine learning techniques, was used to find an optimal set of surrogate genes that predicted differential expression changes of the remaining genome. We then used pathway enrichment of up-regulated and down-regulated genes to assess the ability of a limited gene set to determine relevant patterns of tissue response.
    In addition, we compared prediction performance using the surrogate genes found by our greedy algorithm (referred to as the SV2000) with the landmark genes provided by existing technologies such as L1000 (Genometry) and S1500 (Tox21), finding better predictive performance for the SV2000. The ability of these predictive algorithms to predict pathway-level responses is a positive step toward incorporating mode-of-action (MOA) analysis into the high-throughput prioritization and testing of the large number of chemicals in need of safety evaluation.
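A sequential forward (greedy) surrogate-gene search of the general kind described above can be sketched as follows. This is a simplified stand-in, not the paper's algorithm: prediction quality is scored here by absolute Pearson correlation on a toy two-module expression matrix, whereas the actual SV2000 selection combined several fitting and machine-learning approaches.

```python
# Sketch of a sequential forward (greedy) search for surrogate genes:
# at each step, add the gene that best improves coverage of the rest of
# the "genome". Scoring by |Pearson correlation| is a simplification.
import math
import random

random.seed(1)

def pearson(x, y):
    """Pearson correlation of two equal-length value lists."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    num = sum((a - mx) * (b - my) for a, b in zip(x, y))
    den = math.sqrt(sum((a - mx) ** 2 for a in x)
                    * sum((b - my) ** 2 for b in y))
    return num / den if den else 0.0

def greedy_surrogates(expr, n_surrogates):
    """expr: dict gene -> expression values across samples.
    Greedily pick genes; each remaining gene is credited with its best
    correlation to any chosen surrogate."""
    chosen, remaining = [], set(expr)
    while len(chosen) < n_surrogates and remaining:
        def coverage(g):
            others = remaining - {g}
            return sum(max(abs(pearson(expr[s], expr[t]))
                           for s in chosen + [g]) for t in others)
        best = max(remaining, key=coverage)
        chosen.append(best)
        remaining.discard(best)
    return chosen

# Toy data: two co-expressed modules; one surrogate per module suffices.
samples = 30
base1 = [random.gauss(0, 1) for _ in range(samples)]
base2 = [random.gauss(0, 1) for _ in range(samples)]
expr = {f"m1_g{i}": [b + random.gauss(0, 0.1) for b in base1]
        for i in range(3)}
expr.update({f"m2_g{i}": [b + random.gauss(0, 0.1) for b in base2]
             for i in range(3)})
picked = greedy_surrogates(expr, 2)  # one gene from each module
```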

    Incorporating New Technologies Into Toxicity Testing and Risk Assessment: Moving From 21st Century Vision to a Data-Driven Framework

    Based on existing data and previous work, a series of studies is proposed as a pragmatic early step in transforming toxicity testing. These studies were assembled into a data-driven framework that invokes successive tiers of testing, with margin of exposure (MOE) as the primary metric. The first tier of the framework integrates data from high-throughput in vitro assays, in vitro-to-in vivo extrapolation (IVIVE) pharmacokinetic modeling, and exposure modeling. The in vitro assays are used to separate chemicals based on their relative selectivity in interacting with biological targets and to identify the concentrations at which these interactions occur. The IVIVE modeling converts in vitro concentrations into external doses for calculation of the point of departure (POD) and comparison with human exposure estimates to yield an MOE. The second tier involves short-term in vivo studies, expanded pharmacokinetic evaluations, and refined human exposure estimates. The results of the second-tier studies provide more accurate estimates of the POD and the MOE. The third tier contains the traditional animal studies currently used to assess chemical safety. In each tier, the POD for selective chemicals is based primarily on endpoints associated with a proposed mode of action, whereas the POD for nonselective chemicals is based on potential biological perturbation. Based on the MOE, a significant percentage of chemicals evaluated in the first two tiers could be eliminated from further testing. The framework provides a risk-based and animal-sparing approach to evaluating chemical safety, drawing broadly from previous experience but incorporating technological advances to increase efficiency.
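The first-tier screening logic described above reduces to a simple computation: MOE = POD / exposure estimate, with chemicals whose MOE exceeds some cutoff set aside and the remainder escalated to the next tier. A minimal sketch, with a hypothetical cutoff of 100 and made-up chemical values:

```python
# Sketch of a tier-1 margin-of-exposure screen. The cutoff of 100 and
# all POD/exposure values are illustrative placeholders, not from the
# framework itself.

def margin_of_exposure(pod_mg_kg_day, exposure_mg_kg_day):
    """MOE = point of departure / estimated human exposure."""
    return pod_mg_kg_day / exposure_mg_kg_day

def tier1_screen(chemicals, moe_cutoff=100.0):
    """chemicals: dict name -> (POD, exposure estimate).
    Returns the subset needing tier-2 testing (MOE at or below cutoff)."""
    return {name for name, (pod, expo) in chemicals.items()
            if margin_of_exposure(pod, expo) <= moe_cutoff}

chems = {
    "chem_A": (50.0, 0.001),  # MOE 50,000: low concern, screened out
    "chem_B": (1.0, 0.05),    # MOE 20: escalate to tier 2
    "chem_C": (10.0, 0.2),    # MOE 50: escalate to tier 2
}
needs_tier2 = tier1_screen(chems)  # {"chem_B", "chem_C"}
```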