
    Exploiting the pathway structure of metabolism to reveal high-order epistasis

    Background: Biological robustness results from redundant pathways that achieve an essential objective, e.g. the production of biomass. As a consequence, the biological roles of many genes can only be revealed through multiple knockouts that identify a set of genes as essential for a given function. The identification of such "epistatic" essential relationships between network components is critical for the understanding and eventual manipulation of robust systems-level phenotypes. Results: We introduce and apply a network-based approach for genome-scale metabolic knockout design. We apply this method to uncover over 11,000 minimal knockouts for biomass production in an in silico genome-scale model of E. coli. A large majority of these "essential sets" contain 5 or more reactions, and thus represent complex epistatic relationships between components of the E. coli metabolic network. Conclusion: The complex minimal biomass knockouts discovered with our approach illuminate robust essential systems-level roles for reactions in the E. coli metabolic network. Unlike previous approaches, our method yields results regarding high-order epistatic relationships and is applicable at the genome scale.
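The idea of a minimal "essential set" can be illustrated on a toy network: a set of reactions is a minimal knockout if removing it blocks biomass production, while no proper subset does. The sketch below brute-forces this on a hypothetical five-reaction pathway; it is an illustration of the concept only, not the genome-scale network-based method the abstract describes, and all reaction and metabolite names are invented.

```python
from itertools import combinations

# Hypothetical toy network: reaction -> (input metabolites, output metabolites).
# Not the E. coli model from the abstract.
reactions = {
    "r1": ({"glc"}, {"g6p"}),
    "r2": ({"g6p"}, {"pyr"}),
    "r3": ({"g6p"}, {"pent"}),
    "r4": ({"pent"}, {"pyr"}),
    "r5": ({"pyr"}, {"biomass"}),
}

def producible(active, seeds=frozenset({"glc"})):
    """Metabolites reachable when only the `active` reactions may fire."""
    mets = set(seeds)
    changed = True
    while changed:
        changed = False
        for r in active:
            ins, outs = reactions[r]
            if ins <= mets and not outs <= mets:
                mets |= outs
                changed = True
    return mets

def minimal_knockouts(target="biomass"):
    """Enumerate minimal reaction sets whose removal blocks `target`."""
    found = []
    names = list(reactions)
    for k in range(1, len(names) + 1):
        for combo in combinations(names, k):
            # A superset of a known knockout is blocking but not minimal.
            if any(set(f) <= set(combo) for f in found):
                continue
            if target not in producible(set(names) - set(combo)):
                found.append(combo)
    return found

print(minimal_knockouts())
```

Here {r2, r4} is an epistatic pair: neither reaction is essential alone, because each provides an alternative route to pyruvate, but deleting both blocks biomass. The paper's contribution is finding such sets (often of size 5 or more) at genome scale, where brute force is infeasible.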

    Dynamical Boson Stars

    The idea of stable, localized bundles of energy has strong appeal as a model for particles. In the 1950s John Wheeler envisioned such bundles as smooth configurations of electromagnetic energy that he called {\em geons}, but none were found. Instead, particle-like solutions were found in the late 1960s with the addition of a scalar field, and these were given the name {\em boson stars}. Since then, boson stars have found use in a wide variety of models: as sources of dark matter, as black hole mimickers, in simple models of binary systems, and as a tool in finding black holes in higher dimensions with only a single Killing vector. We discuss important varieties of boson stars, their dynamic properties, and some of their uses, concentrating on recent efforts. Comment: 79 pages, 25 figures; invited review for Living Reviews in Relativity; major revision in 201

    CLINICAL CHARACTERISTICS, OUTCOMES AND RISK FACTORS FOR DEATH AMONG CRITICALLY ILL PATIENTS WITH HIV-RELATED ACUTE KIDNEY INJURY

    SUMMARY Background: The aim of this study is to describe the clinical characteristics, outcomes and risk factors for death among patients with HIV-related acute kidney injury (AKI) admitted to an intensive care unit (ICU). Methods: A retrospective study was conducted with HIV-infected AKI patients admitted to the ICU of an infectious diseases hospital in Fortaleza, Brazil. All patients with a confirmed diagnosis of HIV and AKI admitted from January 2004 to December 2011 were included. A comparison between survivors and non-survivors was performed, and risk factors for death were investigated. Results: Among 256 AKI patients admitted to the ICU in the study period, 73 were identified as HIV-infected, with a predominance of male patients (83.6%); the mean age was 41.2 ± 10.4 years. Non-survivor patients presented higher APACHE II scores (61.4 ± 19 vs. 38.6 ± 18, p = 0.004), used more vasoconstrictors (70.9 vs. 37.5%, p = 0.02) and needed more mechanical ventilation - MV (81.1 vs. 35.3%, p = 0.001). There were 55 deaths (75.3%), most of them (53.4%) due to septic shock. Independent risk factors for mortality were septic shock (OR = 14.2, 95% CI = 2.0-96.9, p = 0.007) and respiratory insufficiency requiring MV (OR = 27.6, 95% CI = 5.0-153.0, p < 0.001). Conclusion: Non-survivor HIV-infected patients with AKI admitted to the ICU presented higher APACHE II severity scores, more respiratory damage and more hemodynamic impairment than survivors. Septic shock and respiratory insufficiency were independently associated with death
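The odds ratios the abstract reports come from a multivariable analysis of the cohort. As a minimal sketch of what such an effect estimate looks like, the snippet below computes an unadjusted odds ratio with a Wald-type 95% confidence interval from a 2x2 table; the counts are hypothetical, not the study's data, and an adjusted OR from logistic regression would generally differ.

```python
import math

def odds_ratio_ci(a, b, c, d, z=1.96):
    """Unadjusted odds ratio with a 95% Wald confidence interval
    for a 2x2 table:
        a = exposed non-survivors,   b = exposed survivors,
        c = unexposed non-survivors, d = unexposed survivors.
    """
    or_ = (a * d) / (b * c)
    # Standard error of log(OR) is sqrt of summed reciprocal cell counts.
    se_log = math.sqrt(1 / a + 1 / b + 1 / c + 1 / d)
    lo = math.exp(math.log(or_) - z * se_log)
    hi = math.exp(math.log(or_) + z * se_log)
    return or_, (lo, hi)

# Hypothetical counts for mechanical ventilation vs. death:
or_mv, (lo_mv, hi_mv) = odds_ratio_ci(43, 10, 12, 8)
print(f"OR = {or_mv:.1f}, 95% CI = {lo_mv:.1f}-{hi_mv:.1f}")
```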

    Ecosystem development after mangrove wetland creation : plant–soil change across a 20-year chronosequence

    This paper is not subject to U.S. copyright. The definitive version was published in Ecosystems 15 (2012): 848-866, doi:10.1007/s10021-012-9551-1. Mangrove wetland restoration and creation efforts are increasingly proposed as mechanisms to compensate for mangrove wetland losses. However, ecosystem development and functional equivalence in restored and created mangrove wetlands are poorly understood. We compared a 20-year chronosequence of created tidal wetland sites in Tampa Bay, Florida (USA) to natural reference mangrove wetlands. Across the chronosequence, our sites represent the succession from salt marsh to mangrove forest communities. Our results identify important soil and plant structural differences between the created and natural reference wetland sites; however, they also depict a positive developmental trajectory for the created wetland sites that reflects tightly coupled plant-soil development. Because upland soils and/or dredge spoils were used to create the new mangrove habitats, the soils at younger created sites and at lower depths (10–30 cm) had higher bulk densities, higher sand content, lower soil organic matter (SOM), lower total carbon (TC), and lower total nitrogen (TN) than did natural reference wetland soils. However, in the upper soil layer (0–10 cm), SOM, TC, and TN increased with created wetland site age simultaneously with mangrove forest growth. The rate of created wetland soil C accumulation was comparable to literature values for natural mangrove wetlands. Notably, the time to equivalence for the upper soil layer of created mangrove wetlands appears to be faster than for many other wetland ecosystem types. Collectively, our findings characterize the rate and trajectory of above- and below-ground changes associated with ecosystem development in created mangrove wetlands; this is valuable information for environmental managers planning to sustain existing mangrove wetlands or mitigate for mangrove wetland losses

    Analysis of the reaction of subcutaneous tissues in rats and the antimicrobial activity of calcium hydroxide paste used in association with different substances

    The aim of this study was to evaluate the subcutaneous tissue response in rats and the antimicrobial activity of intracanal calcium hydroxide dressings mixed with different substances against E. faecalis. Fifty-four rats were divided into three experimental groups according to the vehicle in the calcium hydroxide treatment: 0.4% chlorhexidine in propylene glycol (PG), Casearia sylvestris Sw in PG, and calcium hydroxide+PG (control group). The pastes were placed into polyethylene tubes and implanted into the subcutaneous tissue. After 7, 14 and 30 days, the samples were processed and histologically evaluated (hematoxylin and eosin). The tissue surface in contact with the material was analyzed, and the quantitative analysis determined the volume density occupied by the inflammatory infiltrate (giant cells, polymorphonuclear cells and mononuclear cells), fibroblasts, collagen fibers and blood vessels. For the antimicrobial analysis, 20 dentin blocks infected with E. faecalis were treated for 7 days with calcium hydroxide pastes in different vehicles: 0.4% chlorhexidine in PG, PG, extract from Casearia sylvestris Sw in PG, and a positive control (infection without medication). The efficiency of the pastes was evaluated by the live/dead technique and confocal microscopy. The results showed that 0.4% chlorhexidine induced a higher inflammatory response than the other groups. The Casearia sylvestris Sw extract showed satisfactory results in relation to the intensity of the inflammatory response. In the microbiological test, there were no statistical differences between the evaluated intracanal dressings, and the percentage of bacterial viability was between 33 and 42%; the control group showed 86% viability. Antimicrobial components such as chlorhexidine or Casearia sylvestris Sw did not improve the antimicrobial activity against E. faecalis in comparison to the calcium hydroxide+PG treatment. In addition, the incorporation of chlorhexidine in the calcium hydroxide paste promoted the highest inflammatory response

    Genomics-assisted breeding in four major pulse crops of developing countries: present status and prospects

    The global population is continuously increasing and is expected to reach nine billion by 2050. This huge population pressure will lead to severe shortages of food, natural resources and arable land. Such an alarming situation is most likely to arise in developing countries due to an increase in the proportion of people suffering from protein and micronutrient malnutrition. Pulses, being a primary and affordable source of proteins and minerals, play a key role in alleviating protein calorie malnutrition, micronutrient deficiencies and other undernourishment-related issues. Additionally, pulses are a vital source of livelihood generation for millions of resource-poor farmers practising agriculture in the semi-arid and sub-tropical regions. The limited success achieved through conventional breeding so far in most of the pulse crops will not be enough to feed the ever-increasing population. In this context, genomics-assisted breeding (GAB) holds promise in enhancing genetic gains. Though pulses have long been considered orphan crops, recent advances in the area of pulse genomics are noteworthy, e.g. the discovery of genome-wide genetic markers, high-throughput genotyping and sequencing platforms, high-density genetic linkage/QTL maps and, more importantly, the availability of whole-genome sequences. With genome sequences in hand, there is great scope to apply genome-wide methods for trait mapping using association studies and to choose desirable genotypes via genomic selection. It is anticipated that GAB will speed up the progress of genetic improvement of pulses, leading to the rapid development of cultivars with higher yield, enhanced stress tolerance and wider adaptability

    Children's and adolescents' rising animal-source food intakes in 1990-2018 were impacted by age, region, parental education and urbanicity

    Animal-source foods (ASF) provide nutrition for children's and adolescents' physical and cognitive development. Here, we use data from the Global Dietary Database and Bayesian hierarchical models to quantify global, regional and national ASF intakes between 1990 and 2018 by age group across 185 countries, representing 93% of the world's child population. Mean ASF intake was 1.9 servings per day, with 16% of children consuming at least three daily servings. Intake was similar between boys and girls, but higher among urban children with educated parents. Consumption varied by age, from 0.6 servings per day at <1 year to 2.5 at 15–19 years. Between 1990 and 2018, mean ASF intake increased by 0.5 servings per week, with increases in all regions except sub-Saharan Africa. In 2018, total ASF consumption was highest in Russia, Brazil, Mexico and Turkey, and lowest in Uganda, India, Kenya and Bangladesh. These findings can inform policy to address malnutrition through targeted ASF consumption programmes. (c) 2023, The Author(s)

    Incident type 2 diabetes attributable to suboptimal diet in 184 countries

    The global burden of diet-attributable type 2 diabetes (T2D) is not well established. This risk assessment model estimated T2D incidence among adults attributable to direct and body weight-mediated effects of 11 dietary factors in 184 countries in 1990 and 2018. In 2018, 14.1 million (95% uncertainty interval (UI), 13.8–14.4 million) incident T2D cases were estimated to be attributable to suboptimal intake of these dietary factors, representing 70.3% (68.8–71.8%) of new cases globally. The largest T2D burdens were attributable to insufficient whole-grain intake (26.1% (25.0–27.1%)), excess refined rice and wheat intake (24.6% (22.3–27.2%)) and excess processed meat intake (20.3% (18.3–23.5%)). Across regions, the highest proportional burdens were in central and eastern Europe and central Asia (85.6% (83.4–87.7%)) and in Latin America and the Caribbean (81.8% (80.1–83.4%)); the lowest proportional burdens were in South Asia (55.4% (52.1–60.7%)). Proportions of diet-attributable T2D were generally larger in men than in women and were inversely correlated with age. Diet-attributable T2D was generally larger among urban versus rural residents and higher versus lower educated individuals, except in high-income countries, central and eastern Europe and central Asia, where burdens were larger in rural residents and in lower educated individuals. Compared with 1990, global diet-attributable T2D increased by 2.6 absolute percentage points (8.6 million more cases) in 2018, with variation in these trends by world region and dietary factor. These findings inform nutritional priorities and clinical and public health planning to improve dietary quality and reduce T2D globally. (c) 2023, The Author(s)
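The proportional burdens in this abstract are population attributable fractions (PAFs). As a minimal sketch of the underlying arithmetic, the snippet below computes a single-factor PAF with Levin's formula and combines several factors under a standard independence assumption; the relative risks and exposure prevalences are hypothetical, not the study's inputs, and the study's full model additionally handles body weight-mediated effects and uncertainty propagation.

```python
def paf(rr, p_exposed):
    """Levin's population attributable fraction for one risk factor,
    given the relative risk `rr` and the exposure prevalence."""
    x = p_exposed * (rr - 1.0)
    return x / (1.0 + x)

def combined_paf(pafs):
    """Joint attributable fraction for multiple factors, assuming
    independent multiplicative effects: 1 - prod(1 - PAF_i)."""
    remaining = 1.0
    for f in pafs:
        remaining *= (1.0 - f)
    return 1.0 - remaining

# Hypothetical inputs, not the study's estimates:
factors = {
    "insufficient whole grains": paf(1.5, 0.6),
    "excess refined grains": paf(1.3, 0.5),
}
print(f"combined attributable fraction: {combined_paf(factors.values()):.1%}")
```

Note that the combined fraction is less than the sum of the individual PAFs, which is why multi-factor attributable burdens like the 70.3% reported here cannot simply be read as a sum over factors.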