Oxidant status of children infected with Plasmodium falciparum
Background: Malaria is a global menace caused by the transfer of a Plasmodium parasite to a host by an infected Anopheles mosquito. Upon infection, the overwhelmed host releases free radicals that can induce oxidative damage through lipid peroxidation. This study was undertaken to assess the effect of malaria caused by Plasmodium falciparum on some antioxidant markers and lipid peroxidation levels in children attending hospitals in Katsina State, Nigeria. Materials and Methods: Blood samples were collected from untreated subjects upon confirmation of Plasmodium falciparum parasitaemia using the Giemsa stain technique. One hundred and sixty (160) consenting individuals (80 infected patients and 80 uninfected subjects) of both sexes were randomly selected. The levels of antioxidant markers and of malondialdehyde (MDA), a lipid peroxidation marker, were determined. Descriptive analysis was performed using SPSS version 16.0, and significance between groups was assessed with Student's t-test. Results: P. falciparum infection significantly (p < 0.05) reduced the antioxidant markers [vitamins A, C and E, and reduced glutathione (GSH)] by 65.4%, 29.7%, 48.1% and 40.4%, respectively, in males and by 54.2%, 36.6%, 55.7% and 36.6% in females when compared with values obtained from uninfected, healthy children. Conversely, lipid peroxidation levels were significantly (p < 0.05) higher in children with parasitaemia than in non-parasitaemic controls: levels increased by more than 200% in males and by 138% in females. Conclusion: Our findings indicate a reciprocal relationship in which high levels of lipid peroxidation correspond to low levels of antioxidants, which may be due to over-utilization of the antioxidants to counteract the effect of free radicals. This may be responsible for the oxidative stress and consequent tissue damage associated with the pathology of malaria in Nigerian children. Key words: antioxidant markers, Plasmodium falciparum, lipid peroxidation, children
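The comparison described above (a between-group Student's t-test plus percent reductions relative to controls) can be illustrated with a minimal sketch. The values below are synthetic placeholders, not the study's data, and the marker and units are chosen only for illustration.

```python
# Illustrative sketch only: synthetic values, not the study's measurements.
# Shows a between-group Student's t-test and the percent reduction relative
# to uninfected controls, as described in the abstract.
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
# Hypothetical GSH concentrations (arbitrary units) for 80 subjects per group.
control = rng.normal(loc=5.2, scale=0.8, size=80)
infected = rng.normal(loc=3.1, scale=0.9, size=80)

# Independent-samples (Student's) t-test, as named in the abstract.
t_stat, p_value = stats.ttest_ind(infected, control)

# Percent reduction relative to the uninfected, healthy controls.
pct_reduction = 100 * (control.mean() - infected.mean()) / control.mean()

print(f"t = {t_stat:.2f}, p = {p_value:.4f}, reduction = {pct_reduction:.1f}%")
```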
Hyperbaric treatment for children with autism: a multicenter, randomized, double-blind, controlled trial
Background: Several uncontrolled studies of hyperbaric treatment in children with autism have reported clinical improvements; however, this treatment has not been evaluated to date with a controlled study. We performed a multicenter, randomized, double-blind, controlled trial to assess the efficacy of hyperbaric treatment in children with autism. Methods: 62 children with autism recruited from 6 centers, ages 2–7 years (mean 4.92 ± 1.21), were randomly assigned to 40 hourly treatments of either hyperbaric treatment at 1.3 atmosphere (atm) and 24% oxygen ("treatment group", n = 33) or slightly pressurized room air at 1.03 atm and 21% oxygen ("control group", n = 29). Outcome measures included the Clinical Global Impression (CGI) scale, Aberrant Behavior Checklist (ABC), and Autism Treatment Evaluation Checklist (ATEC). Results: After 40 sessions, mean physician CGI scores significantly improved in the treatment group compared to controls in overall functioning (p = 0.0008), receptive language (p < 0.0001), social interaction (p = 0.0473), and eye contact (p = 0.0102); 9/30 children (30%) in the treatment group were rated as "very much improved" or "much improved" compared to 2/26 (8%) of controls (p = 0.0471); 24/30 (80%) in the treatment group improved compared to 10/26 (38%) of controls (p = 0.0024). Mean parental CGI scores significantly improved in the treatment group compared to controls in overall functioning (p = 0.0336), receptive language (p = 0.0168), and eye contact (p = 0.0322). On the ABC, significant improvements were observed in the treatment group in total score, irritability, stereotypy, hyperactivity, and speech (p < 0.03 for each), but not in the control group. In the treatment group compared to the control group, mean changes on the ABC total score and subscales were similar except that a greater number of children improved in irritability (p = 0.0311). On the ATEC, sensory/cognitive awareness significantly improved (p = 0.0367) in the treatment group compared to the control group. Post-hoc analysis indicated that children over age 5 and children with lower initial autism severity had the most robust improvements. Hyperbaric treatment was safe and well-tolerated. Conclusion: Children with autism who received hyperbaric treatment at 1.3 atm and 24% oxygen for 40 hourly sessions had significant improvements in overall functioning, receptive language, social interaction, eye contact, and sensory/cognitive awareness compared to children who received slightly pressurized room air. Trial registration: clinicaltrials.gov NCT00335790
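The responder comparison quoted above (9/30 treated vs 2/26 control children rated "much improved" or better) can be checked with a standard 2x2 test. The abstract does not state which test the trial used, so this is only a hedged illustration and its p-value need not reproduce the published one exactly.

```python
# Illustrative only: compare responder counts from the abstract with Fisher's
# exact test. The trial's actual statistical test is not specified here.
from scipy.stats import fisher_exact

table = [[9, 30 - 9],    # treatment group: improved vs not improved
         [2, 26 - 2]]    # control group:   improved vs not improved
odds_ratio, p_value = fisher_exact(table, alternative="two-sided")
print(f"odds ratio = {odds_ratio:.2f}, p = {p_value:.4f}")
```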
Fast T2 gradient-spin-echo (T2-GraSE) mapping for myocardial edema quantification: first in vivo validation in a porcine model of ischemia/reperfusion
BACKGROUND: Several T2-mapping sequences have been recently proposed to quantify myocardial edema by providing T2 relaxation time values. However, no T2-mapping sequence has ever been validated against actual myocardial water content for edema detection. In addition, these T2-mapping sequences are either time-consuming or require specialized software for data acquisition and/or post-processing, factors impeding their routine clinical use. Our objective was to obtain in vivo validation of a sequence for fast and accurate myocardial T2-mapping (T2 gradient-spin-echo [GraSE]) that can be easily integrated into routine protocols. METHODS: The study population comprised 25 pigs. Closed-chest 40 min ischemia/reperfusion was performed in 20 pigs. Pigs were sacrificed at 120 min (n = 5), 24 h (n = 5), 4 days (n = 5) and 7 days (n = 5) after reperfusion, and heart tissue was extracted for quantification of myocardial water content. For the evaluation of T2 relaxation time, cardiovascular magnetic resonance (CMR) scans, including T2 turbo-spin-echo (T2-TSE, reference standard) mapping and T2-GraSE mapping, were performed at baseline and at every follow-up until sacrifice. Five additional pigs were sacrificed after the baseline CMR study and served as controls. RESULTS: Acquisition of T2-GraSE mapping was significantly (3-fold) faster than conventional T2-TSE mapping. Myocardial T2 relaxation measurements performed by T2-TSE and T2-GraSE mapping demonstrated an almost perfect correlation (R² = 0.99) and agreement with no systematic error between techniques. The two T2-mapping sequences showed similarly good correlations with myocardial water content: R² = 0.75 and R² = 0.73 for T2-TSE and T2-GraSE mapping, respectively. CONCLUSIONS: We present the first in vivo validation of T2-mapping to assess myocardial edema. Given its shorter acquisition time and no requirement for specific software for data acquisition or post-processing, fast T2-GraSE mapping of the myocardium offers an attractive alternative to current CMR sequences for T2 quantification.
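T2 mapping of any flavor (TSE or GraSE) ultimately rests on a pixel-wise fit of the transverse relaxation decay across echo times. The sketch below shows that generic fit; the echo times, signal values and noise are synthetic assumptions, not data or reconstruction details from this study.

```python
# A minimal, generic sketch of the fit underlying T2 mapping:
# the signal at echo time TE follows S(TE) = S0 * exp(-TE / T2), and T2 is
# estimated by fitting this decay per pixel. All numbers are synthetic.
import numpy as np
from scipy.optimize import curve_fit

def t2_decay(te_ms, s0, t2_ms):
    """Mono-exponential transverse relaxation model."""
    return s0 * np.exp(-te_ms / t2_ms)

te_ms = np.array([10.0, 25.0, 40.0, 55.0, 70.0, 85.0])  # hypothetical echo times (ms)
rng = np.random.default_rng(1)
signal = t2_decay(te_ms, 1000.0, 52.0) + rng.normal(0.0, 5.0, te_ms.size)  # one noisy "pixel"

popt, _ = curve_fit(t2_decay, te_ms, signal, p0=(signal[0], 40.0))
print(f"fitted T2 ≈ {popt[1]:.1f} ms")  # prolonged T2 is the signature of edema
```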
Approaches in biotechnological applications of natural polymers
Natural polymers, such as gums and mucilages, are biocompatible, cheap, easily available and non-toxic materials of natural origin. These polymers are increasingly preferred over synthetic materials for industrial applications because of their intrinsic properties, and they are also considered alternative sources of raw materials given their sustainability, biodegradability and biosafety. By definition, gums and mucilages are polysaccharides or complex carbohydrates consisting of one or more monosaccharides or their derivatives linked in a bewildering variety of linkages and structures. Natural gums are polysaccharides naturally occurring in a variety of plant seeds and exudates, tree or shrub exudates, seaweed extracts, fungi, bacteria, and animal sources. Water-soluble gums, also known as hydrocolloids, are considered exudates and are pathological products; they therefore do not form part of the cell wall. Mucilages, on the other hand, are part of the cell and are physiological products. It is important to highlight that gums represent the largest amount of polymer material derived from plants. Gums have enormously broad applications in both the food and non-food industries, being commonly used as thickening, binding, emulsifying, suspending and stabilizing agents, and as matrices for drug release in the pharmaceutical and cosmetic industries. In the food industry, their gelling properties and their ability to form edible films and coatings are extensively studied. The use of gums depends on the intrinsic properties that they provide, often at costs below those of synthetic polymers. To upgrade their value, gums are being processed into various forms, including the most recent nanomaterials, for various biotechnological applications. Thus, the main natural polymers, including galactomannans, cellulose, chitin, agar, carrageenan, alginate, cashew gum, pectin and starch, together with current research on them, are reviewed in this article. Acknowledgements: To the Conselho Nacional de Desenvolvimento Científico e Tecnológico (CNPq) for fellowships (LCBBC and MGCC) and the Coordenação de Aperfeiçoamento de Pessoal de Nível Superior (CAPES) (PBSA). This study was supported by the Portuguese Foundation for Science and Technology (FCT) under the scope of the strategic funding of the UID/BIO/04469/2013 unit, the project RECI/BBB-EBI/0179/2012 (FCOMP-01-0124-FEDER-027462) and COMPETE 2020 (POCI-01-0145-FEDER-006684) (JAT).
Mapping geographical inequalities in access to drinking water and sanitation facilities in low-income and middle-income countries, 2000-17
Background: Universal access to safe drinking water and sanitation facilities is an essential human right, recognised in the Sustainable Development Goals as crucial for preventing disease and improving human wellbeing. Comprehensive, high-resolution estimates are important to inform progress towards achieving this goal. We aimed to produce high-resolution geospatial estimates of access to drinking water and sanitation facilities. Methods: We used a Bayesian geostatistical model and data from 600 sources across more than 88 low-income and middle-income countries (LMICs) to estimate access to drinking water and sanitation facilities on continuous continent-wide surfaces from 2000 to 2017, and aggregated results to policy-relevant administrative units. We estimated mutually exclusive and collectively exhaustive subcategories of facilities for drinking water (piped water on or off premises, other improved facilities, unimproved, and surface water) and sanitation facilities (septic or sewer sanitation, other improved, unimproved, and open defecation) with use of ordinal regression. We also estimated the number of diarrhoeal deaths in children younger than 5 years attributed to unsafe facilities and estimated deaths that were averted by increased access to safe facilities in 2017, and analysed geographical inequality in access within LMICs. Findings: Across LMICs, access to both piped water and improved water overall increased between 2000 and 2017, with progress varying spatially. For piped water, the safest water facility type, access increased from 40.0% (95% uncertainty interval [UI] 39.4-40.7) to 50.3% (50.0-50.5), but was lowest in sub-Saharan Africa, where access to piped water was mostly concentrated in urban centres. Access to both sewer or septic sanitation and improved sanitation overall also increased across all LMICs during the study period. For sewer or septic sanitation, access was 46.3% (95% UI 46.1-46.5) in 2017, compared with 28.7% (28.5-29.0) in 2000. Although some units improved access to the safest drinking water or sanitation facilities since 2000, a large absolute number of people continued to not have access in several units with high access to such facilities (>80%) in 2017. More than 253 000 people did not have access to sewer or septic sanitation facilities in the city of Harare, Zimbabwe, despite 88.6% (95% UI 87.2-89.7) access overall. Many units were able to transition from the least safe facilities in 2000 to safe facilities by 2017; for units in which populations primarily practised open defecation in 2000, 686 (95% UI 664-711) of the 1830 (1797-1863) units transitioned to the use of improved sanitation. Geographical disparities in access to improved water across units decreased in 76.1% (95% UI 71.6-80.7) of countries from 2000 to 2017, and in 53.9% (50.6-59.6) of countries for access to improved sanitation, but remained evident subnationally in most countries in 2017. Interpretation: Our estimates, combined with geospatial trends in diarrhoeal burden, identify where efforts to increase access to safe drinking water and sanitation facilities are most needed. By highlighting areas with successful approaches or in need of targeted interventions, our estimates can enable precision public health to effectively progress towards universal access to safe water and sanitation. Copyright © 2020 The Author(s). Published by Elsevier Ltd.
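The step of aggregating gridded access estimates to policy-relevant administrative units can be pictured with a toy population-weighted average. This is not the authors' pipeline; the grid-cell values, populations and the simplification are illustrative assumptions only.

```python
# Toy illustration (not the study's pipeline) of aggregating gridded access
# estimates to one administrative unit by population-weighting its grid cells.
import numpy as np

# Hypothetical grid cells falling inside a single administrative unit.
cell_access = np.array([0.92, 0.75, 0.40, 0.88, 0.10])      # proportion with piped water
cell_population = np.array([12000, 8500, 3000, 20000, 1500])

unit_access = np.average(cell_access, weights=cell_population)
people_without_access = np.sum((1.0 - cell_access) * cell_population)

print(f"unit-level access ≈ {unit_access:.1%}")
print(f"people without access ≈ {people_without_access:,.0f}")
```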
Measurement of the tt̄tt̄ production cross section in pp collisions at √s = 13 TeV with the ATLAS detector
A measurement of four-top-quark production using proton-proton collision data at a centre-of-mass energy of 13 TeV collected by the ATLAS detector at the Large Hadron Collider, corresponding to an integrated luminosity of 139 fb⁻¹, is presented. Events are selected if they contain a single lepton (electron or muon) or an opposite-sign lepton pair, in association with multiple jets. The events are categorised according to the number of jets and how likely these are to contain b-hadrons. A multivariate technique is then used to discriminate between signal and background events. The measured four-top-quark production cross section is found to be 26 +17/−15 fb, with a corresponding observed (expected) significance of 1.9 (1.0) standard deviations over the background-only hypothesis. The result is combined with the previous measurement performed by the ATLAS Collaboration in the multilepton final state. The combined four-top-quark production cross section is measured to be 24 +7/−6 fb, with a corresponding observed (expected) signal significance of 4.7 (2.6) standard deviations over the background-only predictions. It is consistent within 2.0 standard deviations with the Standard Model expectation of 12.0 ± 2.4 fb.
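The quoted compatibility with the Standard Model can be reproduced approximately with a back-of-the-envelope check using only the numbers in the abstract. This is not the collaboration's statistical treatment, which is based on a full likelihood fit; it simply adds the relevant uncertainties in quadrature, and it also shows how a quoted significance maps to a one-sided p-value.

```python
# Rough compatibility check (not ATLAS's likelihood-based procedure):
# compare the combined measurement (24 +7/−6 fb) with the SM prediction
# (12.0 ± 2.4 fb), adding the lower measurement error and the SM uncertainty
# in quadrature since the prediction lies below the measurement.
import math
from scipy.stats import norm

measured, err_low = 24.0, 6.0
sm_pred, sm_err = 12.0, 2.4

pull = (measured - sm_pred) / math.hypot(err_low, sm_err)
print(f"deviation ≈ {pull:.1f} standard deviations")   # ≈ 1.9, i.e. within 2.0

# Converting an observed significance (e.g. 4.7 sigma) to a one-sided p-value.
print(f"p-value for 4.7 sigma ≈ {norm.sf(4.7):.2e}")
```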
Mapping geographical inequalities in oral rehydration therapy coverage in low-income and middle-income countries, 2000-17
Background: Oral rehydration solution (ORS) is a form of oral rehydration therapy (ORT) for diarrhoea that has the potential to drastically reduce child mortality; yet, according to UNICEF estimates, less than half of children younger than 5 years with diarrhoea in low-income and middle-income countries (LMICs) received ORS in 2016. A variety of recommended home fluids (RHF) exist as alternative forms of ORT; however, it is unclear whether RHF prevent child mortality. Previous studies have shown considerable variation between countries in ORS and RHF use, but subnational variation is unknown. This study aims to produce high-resolution geospatial estimates of relative and absolute coverage of ORS, RHF, and ORT (use of either ORS or RHF) in LMICs. Methods: We used a Bayesian geostatistical model including 15 spatial covariates and data from 385 household surveys across 94 LMICs to estimate annual proportions of children younger than 5 years of age with diarrhoea who received ORS or RHF (or both) on continuous continent-wide surfaces in 2000-17, and aggregated results to policy-relevant administrative units. Additionally, we analysed geographical inequality in coverage across administrative units and estimated the number of diarrhoeal deaths averted by increased coverage over the study period. Uncertainty in the mean coverage estimates was calculated by taking 250 draws from the posterior joint distribution of the model and creating uncertainty intervals (UIs) with the 2.5th and 97.5th percentiles of those 250 draws. Findings: While ORS use among children with diarrhoea increased in some countries from 2000 to 2017, coverage remained below 50% in the majority (62.6%; 12 417 of 19 823) of second administrative-level units and an estimated 6 519 000 children (95% UI 5 254 000-7 733 000) with diarrhoea were not treated with any form of ORT in 2017. Increases in ORS use corresponded with declines in RHF in many locations, resulting in relatively constant overall ORT coverage from 2000 to 2017. Although ORS was uniformly distributed subnationally in some countries, within-country geographical inequalities persisted in others; 11 countries had at least a 50% difference in one of their units compared with the country mean. Increases in ORS use over time were correlated with declines in RHF use and in diarrhoeal mortality in many locations, and an estimated 52 230 diarrhoeal deaths (36 910-68 860) were averted by scaling up of ORS coverage between 2000 and 2017. Finally, we identified key subnational areas in Colombia, Nigeria, and Sudan as examples of where diarrhoeal mortality remains higher than average, while ORS coverage remains lower than average. Interpretation: To our knowledge, this study is the first to produce and map subnational estimates of ORS, RHF, and ORT coverage and attributable child diarrhoeal deaths across LMICs from 2000 to 2017, allowing for tracking progress over time. Our novel results, combined with detailed subnational estimates of diarrhoeal morbidity and mortality, can support subnational needs assessments aimed at furthering policy makers' understanding of within-country disparities. Over 50 years after the discovery that led to this simple, cheap, and life-saving therapy, large gains in reducing mortality could still be made by reducing geographical inequalities in ORS coverage. Copyright © 2020 The Author(s). Published by Elsevier Ltd. This is an Open Access article under the CC BY 4.0 license.
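The uncertainty-interval construction described in the Methods (250 posterior draws, with UIs taken at the 2.5th and 97.5th percentiles) is straightforward to sketch. The posterior draws below are synthetic stand-ins, not output from the study's geostatistical model.

```python
# Minimal sketch of the UI construction described above: take posterior draws
# of a coverage estimate and report the 2.5th and 97.5th percentiles.
# The draws are synthetic placeholders, not from the study's model.
import numpy as np

rng = np.random.default_rng(42)
posterior_draws = rng.beta(a=45, b=55, size=250)   # hypothetical ORS coverage draws

mean_coverage = posterior_draws.mean()
ui_lower, ui_upper = np.percentile(posterior_draws, [2.5, 97.5])

print(f"coverage = {mean_coverage:.1%} (95% UI {ui_lower:.1%}-{ui_upper:.1%})")
```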
Multi-messenger observations of a binary neutron star merger
On 2017 August 17 a binary neutron star coalescence candidate (later designated GW170817) with merger time 12:41:04 UTC was observed through gravitational waves by the Advanced LIGO and Advanced Virgo detectors. The Fermi Gamma-ray Burst Monitor independently detected a gamma-ray burst (GRB 170817A) with a time delay of ~1.7 s with respect to the merger time. From the gravitational-wave signal, the source was initially localized to a sky region of 31 deg² at a luminosity distance of 40 +8/−8 Mpc and with component masses consistent with neutron stars. The component masses were later measured to be in the range 0.86 to 2.26 M☉. An extensive observing campaign was launched across the electromagnetic spectrum leading to the discovery of a bright optical transient (SSS17a, now with the IAU identification of AT 2017gfo) in NGC 4993 (at ~40 Mpc) less than 11 hours after the merger by the One-Meter, Two Hemisphere (1M2H) team using the 1 m Swope Telescope. The optical transient was independently detected by multiple teams within an hour. Subsequent observations targeted the object and its environment. Early ultraviolet observations revealed a blue transient that faded within 48 hours. Optical and infrared observations showed a redward evolution over ~10 days. Following early non-detections, X-ray and radio emission were discovered at the transient's position ~9 and ~16 days, respectively, after the merger. Both the X-ray and radio emission likely arise from a physical process that is distinct from the one that generates the UV/optical/near-infrared emission. No ultra-high-energy gamma-rays and no neutrino candidates consistent with the source were found in follow-up searches. These observations support the hypothesis that GW170817 was produced by the merger of two neutron stars in NGC 4993, followed by a short gamma-ray burst (GRB 170817A) and a kilonova/macronova powered by the radioactive decay of r-process nuclei synthesized in the ejecta.