65 research outputs found

    Utilization of Renewable Biomass and Waste Materials for Production of Environmentally-Friendly, Bio-based Composites

    The introduction of renewable biomass into a polymer matrix is an option competing with other possibilities, such as energy recovery, re-use in the carbonized state, or production of chemicals, which, in the case of ligno-cellulosic waste, concentrates on the production of simple sugars and may then lead to the development of biopolymers. These competing applications also have some interest and market, but involve considerable energy, water and materials consumption, partly because of their not always high yields. A further possibility is therefore to use renewable biomass as a filler, to increase the mechanical performance of polymers or to enable, for example, the absorption of toxic chemicals. This review concentrates on the use of biomass as close as possible to the “as received” state, avoiding thermal treatment wherever suitable. More specifically, it focuses on the introduction of biomass into three categories of matrix: oil-based engineered polymers (or their bio-based replacements), industrial biopolymers such as poly(lactic acid) (PLA), and self-developed biopolymers such as thermoplastic starch (TPS).

    Motor Cortex Representation of the Upper-Limb in Individuals Born without a Hand

    The body schema is an action-related representation of the body that arises from activity in a network of multiple brain areas. While it was initially thought that the body schema developed with experience, the existence of phantom limbs in individuals born without a limb (amelics) led to the suggestion that it was innate. The problem with this idea, however, is that the vast majority of amelics do not report the presence of a phantom limb. Transcranial magnetic stimulation (TMS) applied over the primary motor cortex (M1) of traumatic amputees can evoke movement sensations in the phantom, suggesting that traumatic amputation does not delete movement representations of the missing hand. Given this, we asked whether the absence of a phantom limb in the majority of amelics means that the motor cortex does not contain a cortical representation of the missing limb, or whether it is present but has been deactivated by the lack of sensorimotor experience. In four upper-limb amelic subjects we directly stimulated the arm/hand region of M1 to see 1) whether we could evoke phantom sensations, and 2) whether muscle representations in the two cortices were organised asymmetrically. TMS applied over the motor cortex contralateral to the missing limb evoked contractions in stump muscles but did not evoke phantom movement sensations. The location and extent of muscle maps varied between hemispheres but did not reveal any systematic asymmetries. In contrast, forearm muscle thresholds were always higher for the missing limb side. We suggest that phantom movement sensations reported by some upper-limb amelics are mostly driven by vision and not by the persistence of motor commands to the missing limb within the sensorimotor cortex. We propose that prewired movement representations of a limb need the experience of movement to be expressed within the primary motor cortex.

    Genomic Organization of H2Av Containing Nucleosomes in Drosophila Heterochromatin

    H2Av is a versatile histone variant that plays both positive and negative roles in transcription, DNA repair, and chromatin structure in Drosophila. H2Av, and its broader homolog H2A.Z, tend to be enriched toward 5′ ends of genes, and exist in both euchromatin and heterochromatin. Its organization around euchromatic genes and other features has been described in many eukaryotic model organisms. However, less is known about H2Av nucleosome organization in heterochromatin. Here we report the properties and organization of individual H2Av nucleosomes around genes and transposable elements located in Drosophila heterochromatic regions. We compare their similarities and differences with those found in euchromatic regions. Our analyses suggest that nucleosomes are intrinsically positioned on inverted repeats of DNA transposable elements such as those related to the “1360” element, but are not intrinsically positioned on retrotransposon-related elements.

    Mapping geographical inequalities in access to drinking water and sanitation facilities in low-income and middle-income countries, 2000-17

    Background Universal access to safe drinking water and sanitation facilities is an essential human right, recognised in the Sustainable Development Goals as crucial for preventing disease and improving human wellbeing. Comprehensive, high-resolution estimates are important to inform progress towards achieving this goal. We aimed to produce high-resolution geospatial estimates of access to drinking water and sanitation facilities. Methods We used a Bayesian geostatistical model and data from 600 sources across more than 88 low-income and middle-income countries (LMICs) to estimate access to drinking water and sanitation facilities on continuous continent-wide surfaces from 2000 to 2017, and aggregated results to policy-relevant administrative units. We estimated mutually exclusive and collectively exhaustive subcategories of facilities for drinking water (piped water on or off premises, other improved facilities, unimproved, and surface water) and sanitation facilities (septic or sewer sanitation, other improved, unimproved, and open defecation) with use of ordinal regression. We also estimated the number of diarrhoeal deaths in children younger than 5 years attributed to unsafe facilities and estimated deaths that were averted by increased access to safe facilities in 2017, and analysed geographical inequality in access within LMICs. Findings Across LMICs, access to both piped water and improved water overall increased between 2000 and 2017, with progress varying spatially. For piped water, the safest water facility type, access increased from 40.0% (95% uncertainty interval [UI] 39.4-40.7) to 50.3% (50.0-50.5), but was lowest in sub-Saharan Africa, where access to piped water was mostly concentrated in urban centres. Access to both sewer or septic sanitation and improved sanitation overall also increased across all LMICs during the study period. 
For sewer or septic sanitation, access was 46.3% (95% UI 46.1-46.5) in 2017, compared with 28.7% (28.5-29.0) in 2000. Although some units improved access to the safest drinking water or sanitation facilities since 2000, a large absolute number of people continued to not have access in several units with high access to such facilities (>80%) in 2017. More than 253 000 people did not have access to sewer or septic sanitation facilities in the city of Harare, Zimbabwe, despite 88.6% (95% UI 87.2-89.7) access overall. Many units were able to transition from the least safe facilities in 2000 to safe facilities by 2017; for units in which populations primarily practised open defecation in 2000, 686 (95% UI 664-711) of the 1830 (1797-1863) units transitioned to the use of improved sanitation. Geographical disparities in access to improved water across units decreased in 76.1% (95% UI 71.6-80.7) of countries from 2000 to 2017, and in 53.9% (50.6-59.6) of countries for access to improved sanitation, but remained evident subnationally in most countries in 2017. Interpretation Our estimates, combined with geospatial trends in diarrhoeal burden, identify where efforts to increase access to safe drinking water and sanitation facilities are most needed. By highlighting areas with successful approaches or in need of targeted interventions, our estimates can enable precision public health to effectively progress towards universal access to safe water and sanitation. Copyright (C) 2020 The Author(s). Published by Elsevier Ltd.
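The mutually exclusive, collectively exhaustive facility subcategories estimated with ordinal regression can be illustrated with a minimal sketch: cumulative probabilities over the ordered categories are differenced into per-category shares that sum to one. The category ordering and the numeric values below are hypothetical stand-ins, not the study's estimates or its actual model.

```python
import numpy as np

# Hypothetical cumulative probabilities for one location, ordered from the
# safest drinking-water category downwards:
# P(at least piped), P(at least other improved), P(at least unimproved).
cumulative = np.array([0.50, 0.78, 0.93])  # illustrative values only

# Differencing adjacent cumulative probabilities yields mutually exclusive
# category shares; surface water is the remainder up to 1.0.
bounds = np.concatenate(([0.0], cumulative, [1.0]))
shares = np.diff(bounds)  # piped, other improved, unimproved, surface water

labels = ["piped", "other improved", "unimproved", "surface water"]
for name, p in zip(labels, shares):
    print(f"{name}: {p:.0%}")

# Collectively exhaustive: the four shares sum to exactly one.
assert abs(shares.sum() - 1.0) < 1e-9
```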

    Mapping geographical inequalities in oral rehydration therapy coverage in low-income and middle-income countries, 2000-17

    Background Oral rehydration solution (ORS) is a form of oral rehydration therapy (ORT) for diarrhoea that has the potential to drastically reduce child mortality; yet, according to UNICEF estimates, less than half of children younger than 5 years with diarrhoea in low-income and middle-income countries (LMICs) received ORS in 2016. A variety of recommended home fluids (RHF) exist as alternative forms of ORT; however, it is unclear whether RHF prevent child mortality. Previous studies have shown considerable variation between countries in ORS and RHF use, but subnational variation is unknown. This study aims to produce high-resolution geospatial estimates of relative and absolute coverage of ORS, RHF, and ORT (use of either ORS or RHF) in LMICs. Methods We used a Bayesian geostatistical model including 15 spatial covariates and data from 385 household surveys across 94 LMICs to estimate annual proportions of children younger than 5 years of age with diarrhoea who received ORS or RHF (or both) on continuous continent-wide surfaces in 2000-17, and aggregated results to policy-relevant administrative units. Additionally, we analysed geographical inequality in coverage across administrative units and estimated the number of diarrhoeal deaths averted by increased coverage over the study period. Uncertainty in the mean coverage estimates was calculated by taking 250 draws from the posterior joint distribution of the model and creating uncertainty intervals (UIs) with the 2.5th and 97.5th percentiles of those 250 draws. Findings While ORS use among children with diarrhoea increased in some countries from 2000 to 2017, coverage remained below 50% in the majority (62.6%; 12 417 of 19 823) of second administrative-level units and an estimated 6 519 000 children (95% UI 5 254 000-7 733 000) with diarrhoea were not treated with any form of ORT in 2017.
Increases in ORS use corresponded with declines in RHF in many locations, resulting in relatively constant overall ORT coverage from 2000 to 2017. Although ORS was uniformly distributed subnationally in some countries, within-country geographical inequalities persisted in others; 11 countries had at least a 50% difference in one of their units compared with the country mean. Increases in ORS use over time were correlated with declines in RHF use and in diarrhoeal mortality in many locations, and an estimated 52 230 diarrhoeal deaths (36 910-68 860) were averted by scaling up of ORS coverage between 2000 and 2017. Finally, we identified key subnational areas in Colombia, Nigeria, and Sudan as examples of where diarrhoeal mortality remains higher than average, while ORS coverage remains lower than average. Interpretation To our knowledge, this study is the first to produce and map subnational estimates of ORS, RHF, and ORT coverage and attributable child diarrhoeal deaths across LMICs from 2000 to 2017, allowing for tracking progress over time. Our novel results, combined with detailed subnational estimates of diarrhoeal morbidity and mortality, can support subnational needs assessments aimed at furthering policy makers' understanding of within-country disparities. Over 50 years after the discovery that led to this simple, cheap, and life-saving therapy, large gains in reducing mortality could still be made by reducing geographical inequalities in ORS coverage. Copyright (c) 2020 The Author(s). Published by Elsevier Ltd. This is an Open Access article under the CC BY 4.0 license.
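The uncertainty-interval procedure described in the Methods (percentiles of draws from the posterior joint distribution) can be sketched as follows. The draw matrix, its beta distribution, and the number of administrative units are illustrative assumptions, not the study's model or data.

```python
import numpy as np

# Hypothetical posterior sample: 250 draws of coverage for 4 administrative
# units, simulated here with a beta distribution purely for illustration.
rng = np.random.default_rng(0)
draws = rng.beta(a=40, b=60, size=(250, 4))  # rows: draws, columns: units

# Point estimate: mean across draws.
mean_coverage = draws.mean(axis=0)

# 95% uncertainty interval: 2.5th and 97.5th percentiles of the draws.
ui_lower = np.percentile(draws, 2.5, axis=0)
ui_upper = np.percentile(draws, 97.5, axis=0)

for unit, (m, lo, hi) in enumerate(zip(mean_coverage, ui_lower, ui_upper)):
    print(f"unit {unit}: {m:.1%} (95% UI {lo:.1%}-{hi:.1%})")
```

Aggregation to larger administrative units would be done per draw (so that correlations are preserved) before taking the percentiles, which is why draw-level rather than interval-level results are carried through such pipelines.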

    Pooled analysis of WHO Surgical Safety Checklist use and mortality after emergency laparotomy

    Background The World Health Organization (WHO) Surgical Safety Checklist has fostered safe practice for 10 years, yet its place in emergency surgery has not been assessed on a global scale. The aim of this study was to evaluate reported checklist use in emergency settings and examine the relationship with perioperative mortality in patients who had emergency laparotomy. Methods In two multinational cohort studies, adults undergoing emergency laparotomy were compared with those having elective gastrointestinal surgery. Relationships between reported checklist use and mortality were determined using multivariable logistic regression and bootstrapped simulation. Results Of 12 296 patients included from 76 countries, 4843 underwent emergency laparotomy. After adjusting for patient and disease factors, checklist use before emergency laparotomy was more common in countries with a high Human Development Index (HDI) (2455 of 2741, 89.6 per cent) compared with that in countries with a middle (753 of 1242, 60.6 per cent; odds ratio (OR) 0.17, 95 per cent c.i. 0.14 to 0.21, P < 0.001) or low (363 of 860, 42.2 per cent; OR 0.08, 0.07 to 0.10, P < 0.001) HDI. Checklist use was less common in elective surgery than for emergency laparotomy in high-HDI countries (risk difference -9.4 (95 per cent c.i. -11.9 to -6.9) per cent; P < 0.001), but the relationship was reversed in low-HDI countries (+12.1 (+7.0 to +17.3) per cent; P < 0.001). In multivariable models, checklist use was associated with a lower 30-day perioperative mortality (OR 0.60, 0.50 to 0.73; P < 0.001). The greatest absolute benefit was seen for emergency surgery in low- and middle-HDI countries. Conclusion Checklist use in emergency laparotomy was associated with a significantly lower perioperative mortality rate. Checklist use in low-HDI countries was half that in high-HDI countries.
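The bootstrapped simulation mentioned in the Methods can be illustrated with a minimal sketch: patients are resampled with replacement and the effect estimate is recomputed on each resample, the 2.5th and 97.5th percentiles of the resampled estimates giving a confidence interval. The simulated cohort, event rates, and the simple 2×2 odds ratio below are hypothetical stand-ins; the study itself used multivariable logistic regression with risk adjustment.

```python
import numpy as np

# Hypothetical cohort: checklist use (0/1) and 30-day mortality (True/False).
rng = np.random.default_rng(1)
n = 2000
checklist = rng.integers(0, 2, size=n)
# Illustrative mortality probabilities: lower with checklist use.
mortality = rng.random(n) < np.where(checklist == 1, 0.05, 0.08)

def odds_ratio(use, death):
    """Unadjusted odds ratio from the 2x2 table of use vs death."""
    a = ((use == 1) & (death == 1)).sum()  # used checklist, died
    b = ((use == 1) & (death == 0)).sum()  # used checklist, survived
    c = ((use == 0) & (death == 1)).sum()  # no checklist, died
    d = ((use == 0) & (death == 0)).sum()  # no checklist, survived
    return (a * d) / (b * c)

# Bootstrap: resample patients with replacement, recompute the OR each time.
boot = []
for _ in range(1000):
    idx = rng.integers(0, n, size=n)
    boot.append(odds_ratio(checklist[idx], mortality[idx]))

or_point = odds_ratio(checklist, mortality)
lo, hi = np.percentile(boot, [2.5, 97.5])
print(f"OR {or_point:.2f} (95% CI {lo:.2f}-{hi:.2f})")
```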

    Global variation in anastomosis and end colostomy formation following left-sided colorectal resection

    Background End colostomy rates following colorectal resection vary across institutions in high-income settings, being influenced by patient, disease, surgeon and system factors. This study aimed to assess global variation in end colostomy rates after left-sided colorectal resection. Methods This study comprised an analysis of GlobalSurg-1 and -2 international, prospective, observational cohort studies (2014, 2016), including consecutive adult patients undergoing elective or emergency left-sided colorectal resection within discrete 2-week windows. Countries were grouped into high-, middle- and low-income tertiles according to the United Nations Human Development Index (HDI). Factors associated with colostomy formation versus primary anastomosis were explored using a multilevel, multivariable logistic regression model. Results In total, 1635 patients from 242 hospitals in 57 countries undergoing left-sided colorectal resection were included: 113 (6·9 per cent) from low-HDI, 254 (15·5 per cent) from middle-HDI and 1268 (77·6 per cent) from high-HDI countries. There was a higher proportion of patients with perforated disease (57·5, 40·9 and 35·4 per cent; P < 0·001) and subsequent use of end colostomy (52·2, 24·8 and 18·9 per cent; P < 0·001) in low- compared with middle- and high-HDI settings. The association with colostomy use in low-HDI settings persisted (odds ratio (OR) 3·20, 95 per cent c.i. 1·35 to 7·57; P = 0·008) after risk adjustment for malignant disease (OR 2·34, 1·65 to 3·32; P < 0·001), emergency surgery (OR 4·08, 2·73 to 6·10; P < 0·001), time to operation at least 48 h (OR 1·99, 1·28 to 3·09; P = 0·002) and disease perforation (OR 4·00, 2·81 to 5·69; P < 0·001). Conclusion Global differences existed in the proportion of patients receiving end stomas after left-sided colorectal resection based on income, which went beyond case mix alone.

    Assessment of Bt trait purity in different generations of transgenic cottons

    Adequate expression of Bt (Bacillus thuringiensis) toxins and purity of seeds of Bt-transgenic cottons are important for controlling bollworms, and thereby increasing cotton productivity. Therefore, we examined the variability in expression of Bt toxin proteins in the seeds and leaves of different cotton (Gossypium hirsutum L.) hybrids (JKCH 226, JKCH 1947, JKCH Durga, JKCH Ishwar, JKCH Varun, KDCHH 441 and KDCHH 621) expressing Bt toxins in the F1 and F2 generations, using bioassays against the cotton bollworm, Helicoverpa armigera (Hübner), and the lateral flow strip (LFS) test. Toxicity of Bt toxin proteins in the seeds of Bt-transgenic cottons to H. armigera correlated with their toxicity in the leaves in one-toxin Bt cotton hybrids. The Bt-F1 and Bt-F2 seeds of JKCH 1947 were more toxic to H. armigera than those of JKCH Varun. The seeds and leaves of F1s showed greater toxicity than the F2 seeds or leaves of one-toxin (cry1Ac) Bt cotton hybrids. However, no significant differences were observed for the two-toxin (cry1Ac and cry2Ab) hybrid, KDCHH 621. Toxicity of leaves to H. armigera increased with crop age, until 112 days after seedling emergence. The Bt trait purity in F1 seeds of four two-toxin Bt cotton hybrids ranged from 86.7 to 100%. The present study emphasizes the necessity of 95% Bt trait purity in seeds of transgenic cotton for sustainable crop production.