
    INFLUENCE OF MINERAL NITROGEN, COMPOST AND NITROGEN FIXING BACTERIA ON TOMATO PLANTS GROWN IN SANDY SOIL

    Pot trials were conducted under plastic-house conditions during two successive seasons (2013/2014 and 2014/2015) at the experimental site of the Central Laboratory for Agricultural Climate (CLAC), Agricultural Research Center, Giza, Egypt. The study aimed to determine whether mineral nitrogen fertilization of tomato can be partially replaced by nitrogen-fixing bacteria, with or without added compost, in sandy soil. Tomato seedlings (Lora F1 hybrid) were transplanted during the first week of October into plastic pots (30 cm diameter) filled with 10 kg of sandy soil. Three rates (25, 50 and 75%) of the recommended mineral nitrogen in the tomato nutrient solution, combined with compost at 2% and nitrogen-fixing bacteria (Azotobacter chroococcum and Azospirillum brasilense) at 20 ml/plant, either individually or in combination, were evaluated for their effects on growth, mineral composition and yield of tomato plants, compared with 100% of the recommended nitrogen alone (control). The plants were irrigated daily by drip irrigation and received 200 ml/plant of nutrient solution twice weekly. The results showed that 50 or 75% of the mineral nitrogen fertilizer plus compost plus nitrogen-fixing bacteria gave the highest values of growth, mineral composition and yield of tomato. It is therefore recommended that 50% of the mineral nitrogen fertilizer for tomato plants can be replaced by nitrogen-fixing bacteria in the presence of compost, which in turn reduces the environmental pollution caused by extensive application of mineral nitrogen fertilizers.

    Fundamental Role of Neurochemicals Aberration in the Pathogenesis of Autism Spectrum Disorders

    AIM: The aim of this research was to establish the perturbation of reliable biomarkers implicated in the pathophysiology of autism, to aid early diagnosis, to serve as targets in the treatment of autism spectrum disorders (ASDs) in children, and to spotlight the complex crosstalk between these biomarkers. PATIENTS AND METHODS: This study included 90 autistic children aged from 2 to 7 years, classified into two groups: an atypical autism group of 30 children and a childhood autism group. The childhood autism group was further divided into a mild-moderate autism group and a severe autism group, each of 30 children. The control group included 30 matched healthy children. All participants underwent full psychiatric examinations, psychological investigations, and biochemical measurements, including gamma-aminobutyric acid (GABA), serotonin, and dopamine (DA) in plasma, and brain-derived neurotrophic factor (BDNF) in serum. RESULTS: The autistic groups showed a highly significant increase in GABA, serotonin, DA, and BDNF levels compared to the control. Of note, the levels of GABA, DA, and BDNF increased significantly with increasing disease severity. Furthermore, a significant positive correlation between BDNF levels and both GABA and DA levels was recorded in the childhood autism group. CONCLUSION: The present clinical setting provides new insight into the fundamental role of BDNF in the brain of autistic children, as alterations in its level due to increased GABA cause changes in serotonin and DA levels, which have empirical support in the pathophysiology of ASD. The results obtained in this research create a fertile base for establishing specific targets for intervention in this disorder.
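
    The correlation analysis reported above can be illustrated with a minimal sketch: a Pearson correlation between BDNF and GABA levels plus a simple group comparison against controls. All values, group sizes, and variable names below are synthetic placeholders, not the study's measurements.

    # Illustrative only: synthetic biomarker values, not the study's data.
    import numpy as np
    from scipy import stats

    rng = np.random.default_rng(1)
    n = 30                                   # e.g. one diagnostic group of 30 children
    gaba = rng.normal(50, 10, n)             # plasma GABA (arbitrary units)
    bdnf = 0.8 * gaba + rng.normal(0, 5, n)  # serum BDNF, made to covary with GABA
    controls_bdnf = rng.normal(30, 5, n)     # placeholder control-group BDNF

    r, p = stats.pearsonr(bdnf, gaba)
    print(f"BDNF-GABA correlation: r={r:.2f}, p={p:.3g}")

    t, p_t = stats.ttest_ind(bdnf, controls_bdnf)
    print(f"autism vs control BDNF: t={t:.2f}, p={p_t:.3g}")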

    Impact of opioid-free analgesia on pain severity and patient satisfaction after discharge from surgery: multispecialty, prospective cohort study in 25 countries

    Background: Balancing opioid stewardship and the need for adequate analgesia following discharge after surgery is challenging. This study aimed to compare the outcomes for patients discharged with opioid versus opioid-free analgesia after common surgical procedures. Methods: This international, multicentre, prospective cohort study collected data from patients undergoing common acute and elective general surgical, urological, gynaecological, and orthopaedic procedures. The primary outcomes were patient-reported time in severe pain measured on a numerical analogue scale from 0 to 100% and patient-reported satisfaction with pain relief during the first week following discharge. Data were collected by in-hospital chart review and patient telephone interview 1 week after discharge. Results: The study recruited 4273 patients from 144 centres in 25 countries; 1311 patients (30.7%) were prescribed opioid analgesia at discharge. Patients reported being in severe pain for 10 (i.q.r. 1-30)% of the first week after discharge and rated satisfaction with analgesia as 90 (i.q.r. 80-100) of 100. After adjustment for confounders, opioid analgesia on discharge was independently associated with increased pain severity (risk ratio 1.52, 95% c.i. 1.31 to 1.76; P < 0.001) and re-presentation to healthcare providers owing to side-effects of medication (OR 2.38, 95% c.i. 1.36 to 4.17; P = 0.004), but not with satisfaction with analgesia (beta coefficient 0.92, 95% c.i. -1.52 to 3.36; P = 0.468) compared with opioid-free analgesia. Although opioid prescribing varied greatly between high-income and low- and middle-income countries, patient-reported outcomes did not. Conclusion: Opioid analgesia prescription on surgical discharge is associated with a higher risk of re-presentation owing to side-effects of medication and increased patient-reported pain, but not with changes in patient-reported satisfaction. Opioid-free discharge analgesia should be adopted routinely.
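
    A minimal sketch of the kind of adjusted analysis described above: a risk ratio from a modified Poisson model (Poisson family, log link, robust standard errors) for pain severity and an odds ratio from logistic regression for re-presentation. The variable names, covariates, and data are illustrative placeholders, not the study's dataset or exact model specification.

    # Hypothetical sketch; synthetic data stand in for the cohort.
    import numpy as np
    import pandas as pd
    import statsmodels.api as sm
    import statsmodels.formula.api as smf

    rng = np.random.default_rng(0)
    n = 500
    df = pd.DataFrame({
        "severe_pain":   rng.integers(0, 2, n),   # 1 = reported severe pain after discharge
        "re_presented":  rng.integers(0, 2, n),   # 1 = returned owing to medication side-effects
        "opioid_rx":     rng.integers(0, 2, n),   # 1 = discharged with opioid analgesia
        "age":           rng.integers(18, 90, n),
        "major_surgery": rng.integers(0, 2, n),
    })

    # Risk ratio via modified Poisson regression with robust standard errors
    rr_model = smf.glm("severe_pain ~ opioid_rx + age + major_surgery",
                       data=df, family=sm.families.Poisson()).fit(cov_type="HC1")
    print("adjusted RR for opioid prescription:", np.exp(rr_model.params["opioid_rx"]))

    # Odds ratio via logistic regression
    or_model = smf.logit("re_presented ~ opioid_rx + age + major_surgery", data=df).fit(disp=0)
    print("adjusted OR for re-presentation:", np.exp(or_model.params["opioid_rx"]))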

    The evolving SARS-CoV-2 epidemic in Africa: Insights from rapidly expanding genomic surveillance

    INTRODUCTION Investment in Africa over the past year with regard to severe acute respiratory syndrome coronavirus 2 (SARS-CoV-2) sequencing has led to a massive increase in the number of sequences, which, to date, exceeds 100,000 sequences generated to track the pandemic on the continent. These sequences have profoundly affected how public health officials in Africa have navigated the COVID-19 pandemic. RATIONALE We demonstrate how the first 100,000 SARS-CoV-2 sequences from Africa have helped monitor the epidemic on the continent, how genomic surveillance expanded over the course of the pandemic, and how we adapted our sequencing methods to deal with an evolving virus. Finally, we also examine how viral lineages have spread across the continent in a phylogeographic framework to gain insights into the underlying temporal and spatial transmission dynamics for several variants of concern (VOCs). RESULTS Our results indicate that the number of countries in Africa that can sequence the virus within their own borders is growing and that this is coupled with a shorter turnaround time from the time of sampling to sequence submission. Ongoing evolution necessitated the continual updating of primer sets, and, as a result, eight primer sets were designed in tandem with viral evolution and used to ensure effective sequencing of the virus. The pandemic unfolded through multiple waves of infection that were each driven by distinct genetic lineages, with B.1-like ancestral strains associated with the first pandemic wave of infections in 2020. Successive waves on the continent were fueled by different VOCs, with Alpha and Beta cocirculating in distinct spatial patterns during the second wave and Delta and Omicron affecting the whole continent during the third and fourth waves, respectively. Phylogeographic reconstruction points toward distinct differences in viral importation and exportation patterns associated with the Alpha, Beta, Delta, and Omicron variants and subvariants, when considering both Africa versus the rest of the world and viral dissemination within the continent. Our epidemiological and phylogenetic inferences therefore underscore the heterogeneous nature of the pandemic on the continent and highlight key insights and challenges, for instance, recognizing the limitations of low testing proportions. We also highlight the early warning capacity that genomic surveillance in Africa has had for the rest of the world with the detection of new lineages and variants, the most recent being the characterization of various Omicron subvariants. CONCLUSION Sustained investment for diagnostics and genomic surveillance in Africa is needed as the virus continues to evolve. This is important not only to help combat SARS-CoV-2 on the continent but also because it can be used as a platform to help address the many emerging and reemerging infectious disease threats in Africa. In particular, capacity building for local sequencing within countries or within the continent should be prioritized because this is generally associated with shorter turnaround times, providing the most benefit to local public health authorities tasked with pandemic response and mitigation and allowing for the fastest reaction to localized outbreaks. These investments are crucial for pandemic preparedness and response and will serve the health of the continent well into the 21st century

    Mortality from gastrointestinal congenital anomalies at 264 hospitals in 74 low-income, middle-income, and high-income countries: a multicentre, international, prospective cohort study

    Summary Background Congenital anomalies are the fifth leading cause of mortality in children younger than 5 years globally. Many gastrointestinal congenital anomalies are fatal without timely access to neonatal surgical care, but few studies have been done on these conditions in low-income and middle-income countries (LMICs). We compared outcomes of the seven most common gastrointestinal congenital anomalies in low-income, middle-income, and high-income countries globally, and identified factors associated with mortality. Methods We did a multicentre, international prospective cohort study of patients younger than 16 years, presenting to hospital for the first time with oesophageal atresia, congenital diaphragmatic hernia, intestinal atresia, gastroschisis, exomphalos, anorectal malformation, and Hirschsprung’s disease. Recruitment was of consecutive patients for a minimum of 1 month between October, 2018, and April, 2019. We collected data on patient demographics, clinical status, interventions, and outcomes using the REDCap platform. Patients were followed up for 30 days after primary intervention, or 30 days after admission if they did not receive an intervention. The primary outcome was all-cause, in-hospital mortality for all conditions combined and each condition individually, stratified by country income status. We did a complete case analysis. Findings We included 3849 patients with 3975 study conditions (560 with oesophageal atresia, 448 with congenital diaphragmatic hernia, 681 with intestinal atresia, 453 with gastroschisis, 325 with exomphalos, 991 with anorectal malformation, and 517 with Hirschsprung’s disease) from 264 hospitals (89 in high-income countries, 166 in middle-income countries, and nine in low-income countries) in 74 countries. Of the 3849 patients, 2231 (58·0%) were male. Median gestational age at birth was 38 weeks (IQR 36–39) and median bodyweight at presentation was 2·8 kg (2·3–3·3). Mortality among all patients was 37 (39·8%) of 93 in low-income countries, 583 (20·4%) of 2860 in middle-income countries, and 50 (5·6%) of 896 in high-income countries (p<0·0001 between all country income groups). Gastroschisis had the greatest difference in mortality between country income strata (nine [90·0%] of ten in low-income countries, 97 [31·9%] of 304 in middle-income countries, and two [1·4%] of 139 in high-income countries; p≤0·0001 between all country income groups). Factors significantly associated with higher mortality for all patients combined included country income status (low-income vs high-income countries, risk ratio 2·78 [95% CI 1·88–4·11], p<0·0001; middle-income vs high-income countries, 2·11 [1·59–2·79], p<0·0001), sepsis at presentation (1·20 [1·04–1·40], p=0·016), higher American Society of Anesthesiologists (ASA) score at primary intervention (ASA 4–5 vs ASA 1–2, 1·82 [1·40–2·35], p<0·0001; ASA 3 vs ASA 1–2, 1·58 [1·30–1·92], p<0·0001), surgical safety checklist not used (1·39 [1·02–1·90], p=0·035), and ventilation or parenteral nutrition unavailable when needed (ventilation 1·96 [1·41–2·71], p=0·0001; parenteral nutrition 1·35 [1·05–1·74], p=0·018). Administration of parenteral nutrition (0·61 [0·47–0·79], p=0·0002) and use of a peripherally inserted central catheter (0·65 [0·50–0·86], p=0·0024) or percutaneous central line (0·69 [0·48–1·00], p=0·049) were associated with lower mortality.
    Interpretation Unacceptable differences in mortality exist for gastrointestinal congenital anomalies between low-income, middle-income, and high-income countries. Improving access to quality neonatal surgical care in LMICs will be vital to achieve Sustainable Development Goal 3.2 of ending preventable deaths in neonates and children younger than 5 years by 2030.
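
    The headline mortality figures can be recomputed directly from the counts quoted in the abstract; the short sketch below does so and also derives crude (unadjusted) risk ratios, which naturally differ from the paper's confounder-adjusted estimates.

    # Crude mortality proportions and unadjusted risk ratios from the abstract's counts.
    deaths = {"low": 37, "middle": 583, "high": 50}
    totals = {"low": 93, "middle": 2860, "high": 896}

    mortality = {k: deaths[k] / totals[k] for k in deaths}
    for group, rate in mortality.items():
        print(f"{group}-income mortality: {rate:.1%}")   # 39.8%, 20.4%, 5.6%

    print(f"crude RR low vs high:    {mortality['low'] / mortality['high']:.2f}")
    print(f"crude RR middle vs high: {mortality['middle'] / mortality['high']:.2f}")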

    Forecasting the seepage loss for lined and un-lined canals using artificial neural network and gene expression programming

    Canal lining is commonly used to improve water-use efficiency and reduce seepage loss. The major water losses in an irrigation channel are due to leakage and evaporation. The Egyptian General Integrated Management for Water Resources and Irrigation introduced a proposal for lining the Al-Hagar canal based on these losses. This study investigates the effect of lining the Al-Hagar canal on flow characteristics and compares the canal before and after lining. Additionally, it discusses the most common type of water loss, namely loss due to seepage. Fieldwork was conducted on the Al-Hagar canal, Al-Saff Center, south of Helwan city, Egypt. The results revealed that the discharge of the lined canal is approximately 1.362–1.573 times greater than that of the un-lined section. Water losses in the Al-Hagar canal were 38.736% when un-lined but decreased to 29.253% when lined. The conveyance efficiency of the un-lined canal, approximately 61.26%, increased to 70.75% when the entire canal is lined, a 9.483% improvement in conveyance. New relations were introduced using an Artificial Neural Network (ANN) and Gene Expression Programming (GEP) to forecast the seepage loss in the lined and un-lined canals as a function of Manning’s coefficient, Froude number and hydraulic radius. The GEP models performed better than the ANN models for both the lined and the un-lined canals: the determination coefficient was 0.98, the correlation factor 0.99, and the RMSE 0.0017 for the lined canal, while the determination coefficient was 1, the correlation factor 1, and the RMSE 0.0003 for the un-lined canal.
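
    A minimal sketch of the ANN side of the approach described above: a small multilayer perceptron predicting seepage loss from Manning’s coefficient, Froude number, and hydraulic radius, evaluated with the same metrics the abstract reports (determination coefficient, correlation factor, RMSE). The data are synthetic placeholders, not the Al-Hagar canal measurements, and the network architecture is assumed.

    # Illustrative ANN sketch with synthetic data; not the authors' trained model.
    import numpy as np
    from sklearn.neural_network import MLPRegressor
    from sklearn.pipeline import make_pipeline
    from sklearn.preprocessing import StandardScaler
    from sklearn.model_selection import train_test_split
    from sklearn.metrics import r2_score, mean_squared_error

    rng = np.random.default_rng(42)
    n = 200
    manning = rng.uniform(0.015, 0.035, n)   # Manning's roughness coefficient
    froude  = rng.uniform(0.1, 0.6, n)       # Froude number
    hyd_rad = rng.uniform(0.5, 2.5, n)       # hydraulic radius (m)
    # placeholder relation standing in for observed seepage loss
    seepage = 2.0 * manning + 0.05 * froude + 0.02 / hyd_rad + rng.normal(0, 0.005, n)

    X = np.column_stack([manning, froude, hyd_rad])
    X_train, X_test, y_train, y_test = train_test_split(X, seepage, random_state=0)

    model = make_pipeline(StandardScaler(),
                          MLPRegressor(hidden_layer_sizes=(10, 10), max_iter=5000, random_state=0))
    model.fit(X_train, y_train)
    pred = model.predict(X_test)

    print("R^2 :", r2_score(y_test, pred))                      # determination coefficient
    print("r   :", np.corrcoef(y_test, pred)[0, 1])             # correlation factor
    print("RMSE:", np.sqrt(mean_squared_error(y_test, pred)))   # root mean square error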

    Identifying the Basal Ganglia Network Model Markers for Medication-Induced Impulsivity in Parkinson's Disease Patients


    International Nosocomial Infection Control Consortium report, data summary of 50 countries for 2010-2015: Device-associated module

    •We report INICC device-associated module data of 50 countries from 2010-2015.
    •We collected prospective data from 861,284 patients in 703 ICUs for 3,506,562 days.
    •DA-HAI rates and bacterial resistance were higher in the INICC ICUs than in CDC-NHSN's.
    •Device utilization ratio in the INICC ICUs was similar to CDC-NHSN's.
    Background: We report the results of the International Nosocomial Infection Control Consortium (INICC) surveillance study from January 2010-December 2015 in 703 intensive care units (ICUs) in Latin America, Europe, the Eastern Mediterranean, Southeast Asia, and the Western Pacific. Methods: During the 6-year study period, using Centers for Disease Control and Prevention National Healthcare Safety Network (CDC-NHSN) definitions for device-associated health care-associated infection (DA-HAI), we collected prospective data from 861,284 patients hospitalized in INICC hospital ICUs for an aggregate of 3,506,562 days. Results: Although device use in INICC ICUs was similar to that reported from CDC-NHSN ICUs, DA-HAI rates were higher in the INICC ICUs. In the INICC medical-surgical ICUs, the pooled rate of central line-associated bloodstream infection, 4.1 per 1,000 central line-days, was nearly 5-fold higher than the 0.8 per 1,000 central line-days reported from comparable US ICUs; the overall rate of ventilator-associated pneumonia was also higher (13.1 versus 0.9 per 1,000 ventilator-days), as was the rate of catheter-associated urinary tract infection (5.07 versus 1.7 per 1,000 catheter-days). From blood culture samples, frequencies of resistance of Pseudomonas isolates to amikacin (29.87% vs 10%) and to imipenem (44.3% vs 26.1%), and of Klebsiella pneumoniae isolates to ceftazidime (73.2% vs 28.8%) and to imipenem (43.27% vs 12.8%), were also higher in the INICC ICUs than in the CDC-NHSN ICUs. Conclusions: Although DA-HAIs in INICC ICU patients continue to be higher than the rates reported from CDC-NHSN ICUs, which represent the developed world, we have observed a significant trend toward the reduction of DA-HAI rates in INICC ICUs, as shown in each international report. It is INICC's main goal to continue facilitating education, training, and basic and cost-effective tools and resources, such as standardized forms and an online platform, to tackle this problem effectively and systematically.
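
    The device-associated rates quoted above follow the standard definitions: infections per 1,000 device-days and the device utilization ratio (device-days divided by patient-days). A brief sketch, with made-up counts chosen only to reproduce the quoted 4.1 per 1,000 central line-days figure:

    # Illustrative counts only; not INICC or CDC-NHSN data.
    def infection_rate_per_1000(infections: int, device_days: int) -> float:
        """DA-HAI rate expressed per 1,000 device-days."""
        return 1000 * infections / device_days

    def device_utilization_ratio(device_days: int, patient_days: int) -> float:
        """Proportion of patient-days on which the device was in place."""
        return device_days / patient_days

    # Example: 41 CLABSIs over 10,000 central line-days accrued during 18,000 patient-days.
    print(infection_rate_per_1000(41, 10_000))        # 4.1 per 1,000 central line-days
    print(device_utilization_ratio(10_000, 18_000))   # ~0.56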