
    Impact of irrigation regimes on productivity and profitability of maize + peanut intercropping system in Upper Egypt

    Good management of soil and water use is one of the most important factors in agricultural sustainability, and intercropping systems are an important component of good agricultural practices. Thus, a field experiment was conducted at the Experimental Farm of Arab Al-Awamer Research Station, Assiut Governorate, Agriculture Research Center, Egypt, during the summer seasons of 2021 and 2022 to investigate the effect of a maize (M) + peanut (P) intercropping system on productivity, water use efficiency, and profitability under varying irrigation regimes. The experiment was laid out in a randomized complete block design with a split-plot arrangement and three replicates. Irrigation regimes (120, 100, and 80% ETc) were assigned to the main plots, while the intercropping systems (100% P + 25% M, 100% P + 33% M, and 100% P + 50% M) were allocated to the sub-plots. The results showed that most traits of peanut and maize decreased substantially under the 80% ETc irrigation regime, while the largest trait values were associated with 120% ETc. Averaged across the two seasons, the highest net return (1,441 US$/ha) was obtained when 100% peanut plants were intercropped with 25% maize under the 120% ETc irrigation regime. Therefore, we recommend intercropping maize (25%) with peanut (100%) irrigated at 120% ETc to achieve higher yields and net return. DOI: http://dx.doi.org/10.5281/zenodo.1041338
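    The abstract expresses its irrigation regimes as percentages of crop evapotranspiration (ETc) and reports water use efficiency (WUE). Neither formula is restated above; a common convention is to scale seasonal ETc by the regime percentage and to define WUE as yield per unit of water applied. A minimal sketch under those assumptions, with illustrative numbers that are not taken from the study:

    ```python
    # Sketch: how irrigation regimes scale crop evapotranspiration (ETc)
    # and how water use efficiency (WUE) is commonly derived.
    # All numeric values are illustrative, NOT data from the study.

    def water_applied(etc_mm: float, regime_pct: float) -> float:
        """Seasonal irrigation water (mm) for a regime given as % of ETc."""
        return etc_mm * regime_pct / 100.0

    def wue(yield_kg_ha: float, water_mm: float) -> float:
        """Water use efficiency in kg/m^3 (1 mm over 1 ha = 10 m^3)."""
        return yield_kg_ha / (water_mm * 10.0)

    seasonal_etc = 650.0           # hypothetical seasonal ETc, mm
    for regime in (80, 100, 120):  # the regimes tested, % of ETc
        w = water_applied(seasonal_etc, regime)
        print(f"{regime}% ETc -> {w:.0f} mm applied, "
              f"WUE at 4000 kg/ha: {wue(4000.0, w):.2f} kg/m^3")
    ```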

    Relay intercropping of maize with common dry beans to rationalize nitrogen fertilizer

    Maize (Zea mays L.) and dry beans (Phaseolus vulgaris L.) are important staple food and cash crops worldwide. Common bean intercropped with maize contributes to biological nitrogen fixation, which stabilizes the productivity of cropping systems and reduces negative environmental impacts and loss of biodiversity for sustainable agriculture. A field experiment was performed during 2020 and 2021 at Sers El-Layian Station, northern Egypt. The study aimed to assess the effect of three sowing dates of maize, representing three co-growth durations [T1: at the flowering stage (FS) of common beans (60 days of co-growth), T2: 15 days after FS (45 days of co-growth), and T3: 30 days after FS (30 days of co-growth with beans)], and three N fertilizer levels (N1: 190.4, N2: 238.0, and N3: 285.6 kg N/ha of maize) on productivity, profitability, and N fertilizer rationalization. The longest co-growth duration of maize intercropped with common beans (T1) significantly (P ≤ 0.05) decreased common bean and maize yields compared with T2 and T3. The performance of common beans did not vary significantly (P ≤ 0.05) under the different N fertilizer levels of maize. Maize yield and its components increased significantly (P ≤ 0.05) as the N fertilizer level was raised up to N3. Although maize yield did not differ significantly between N2 and N3, nitrogen use efficiency (NUE) was significantly (P ≤ 0.05) higher, by 18.34%, under N2 than under N3. Regardless of planting time and N fertilizer level of maize, the combined productivity of common beans and maize increased in the intercropped system, as indicated by higher total land equivalent ratios (LER) and area-time equivalent ratios (ATER). The highest LER value (1.99) was observed at the shortest co-growth period (T3) under N3, followed by 1.97 under N2. Positive values of the actual yield loss index (AYL) indicated an intercropping advantage. Different competition indices (aggressivity, Ag; competitive ratio, CR; actual yield loss, AYL) showed a greater dominance of maize over common beans. However, the intercropping systems increased the economic advantage (intercropping advantage index, IAI, and monetary advantage index, MAI) over monoculture. These results imply that shortening the co-growth period of maize with common beans (T3) and applying 238.0 kg N/ha in the relay intercropping system reduced mineral N fertilizer use by 16.67% compared to the advised level of 285.6 kg N/ha, along with increased productivity per unit area and economic advantages for smallholder farmers.
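    The LER and ATER indices named above have standard definitions in the intercropping literature: LER sums each crop's intercrop-to-sole-crop yield ratio, and ATER weights those partial ratios by each crop's field duration. The reported 16.67% fertilizer saving is simple arithmetic on the two N levels. A sketch under those standard definitions, with hypothetical yields and durations rather than the study's data:

    ```python
    # Sketch of the standard intercropping indices named in the abstract.
    # All yields and durations below are illustrative placeholders,
    # NOT the study's data.

    def partial_ler(intercrop_yield: float, sole_yield: float) -> float:
        """One crop's contribution to the land equivalent ratio."""
        return intercrop_yield / sole_yield

    # Hypothetical yields (t/ha): intercropped vs. sole-cropped
    bean_ic, bean_sole = 1.8, 2.4
    maize_ic, maize_sole = 6.5, 7.0

    ler_bean = partial_ler(bean_ic, bean_sole)
    ler_maize = partial_ler(maize_ic, maize_sole)
    total_ler = ler_bean + ler_maize          # > 1 indicates a land advantage

    # ATER weights each partial LER by that crop's field duration (days).
    t_bean, t_maize, t_system = 95, 120, 150  # hypothetical durations
    ater = (ler_bean * t_bean + ler_maize * t_maize) / t_system

    # The reported fertilizer saving follows from the two N levels:
    n_saving = (285.6 - 238.0) / 285.6 * 100  # = 16.67%

    print(f"LER={total_ler:.2f}, ATER={ater:.2f}, N saving={n_saving:.2f}%")
    ```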

    Cognitive Functioning in a Pilot Sample of Childhood Cancer Patients in Egypt

    Abstract: A subset of cancer survivors experience cognitive deficits that can last for many years after the completion of chemotherapy. The etiology of this problem is largely unknown, so the present study aimed to assess cognitive functioning in childhood cancer patients and to investigate proposed predisposing factors, including variables related to the disease, its treatment, and some socio-demographic characteristics. In a case-control study, the parents of 67 cancer patients aged 8-12 years completed the parent proxy report of the PedsQL™ 3.0 Cognitive Functioning Scale (Arabic version), as well as a separate sheet for socio-demographic data. The control group consisted of 37 healthy subjects from the same age group who underwent the same methodology for comparison. All patients in the study had successfully completed their treatment protocol and were in complete remission at the time of evaluation. Hematological malignancies represented 70.1% of the patient sample, with the highest proportion being ALL (52.2%). Brain tumors represented 40% of the solid malignancies (29.9% of the study patients). The cognitive functioning score was significantly lower in the solid-tumor group (69.6±37.3) compared to the hematologic group (85.1±22.2) (t = 2.1, p = 0.038). The cognitive functioning score was also lower in the solid-tumor group versus control subjects (p = 0.047), while it showed no significant difference between the hematological malignancies and control groups. Older age at diagnosis, urban residence, illiterate mothers, longer duration of treatment, and longer duration of hospital admission were associated with a lower cognitive score in the solid-tumor group compared to the hematological group.
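    The group comparison above (t = 2.1, p = 0.038) is a two-sample t-test on summary statistics. Group sizes are not stated explicitly; the reported 70.1%/29.9% split of 67 patients implies roughly 47 hematologic and 20 solid-tumor patients, so the sketch below is illustrative rather than a reproduction of the study's analysis:

    ```python
    # Sketch: a pooled-variance two-sample t-test from summary statistics.
    # Group sizes (47 vs 20) are inferred from the reported percentages,
    # so treat the result as illustrative.
    from scipy.stats import ttest_ind_from_stats

    t, p = ttest_ind_from_stats(
        mean1=85.1, std1=22.2, nobs1=47,   # hematologic group
        mean2=69.6, std2=37.3, nobs2=20,   # solid-tumor group
        equal_var=True,                    # Student's (pooled) t-test
    )
    print(f"t = {t:.1f}, p = {p:.3f}")     # close to the reported t=2.1, p=0.038
    ```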

    Adaptive Functioning and Psychosocial Problems in Children with Beta Thalassemia Major

    BACKGROUND: Beta thalassemia major is considered one of the most serious health problems and the commonest hemoglobinopathy in Egypt. It creates a burden not only on the health system but also on the affected families and children, who become vulnerable to emotional, social, psychological, and behavioural problems. AIM: This study was designed to assess the psychosocial burden and adaptive functioning in children with beta-thalassemia major. SUBJECTS AND METHODS: A group of 50 children with thalassemia major and 50 normal children matched for age and sex were included in a case-control study. The Vineland Adaptive Functioning Scale was used to assess adaptive functions, while the Pediatric Symptom Checklist (PSCL) was used to assess psychosocial morbidity. RESULTS: The 50 children with thalassemia major, aged 5-17 years (mean age 11.05 ± 3.8), showed statistically significantly lower total adaptive behaviour and communication subscale scores. All the mean values of adaptive behaviour for cases and controls were within the average range. Results from the PSCL revealed no significant difference between the mean scores of children with thalassemia and controls. The attention domain score was markedly higher in children with thalassemia. Internalising behaviour was the most dominant problem, detected in 10% of the patient group. CONCLUSION: Thalassemic patients showed relatively mild impairment of adaptive and psychosocial functioning; this may be explained by the social and medical support they receive, which may increase their competence and psychological wellbeing.

    Role of Procalcitonin As an Inflammatory Marker in a Sample of Egyptian Children with Simple Obesity

    BACKGROUND: Obesity is a multifactorial disease associated with metabolic disorders and chronic low-grade inflammation. Procalcitonin (PCT) is well known as a biomarker of infection and systemic inflammation. Recently, it has shown potential as a marker of chronic low-grade inflammation. AIM: This study aims to evaluate the role of serum PCT as an inflammatory biomarker in the diagnosis of obesity-related low-grade inflammation. METHOD: In this case-control study, 50 obese and 35 normal-weight children and adolescents aged 5–15 years were enrolled. Anthropometric parameters were measured in all subjects. Blood samples were collected for measurement of lipid profile, blood glucose, insulin, high-sensitivity CRP (Hs-CRP), and serum procalcitonin. Serum PCT levels were assessed using an enzyme-linked immunosorbent assay. RESULTS: Obese participants had higher concentrations of serum PCT, total cholesterol, triglycerides, LDL-c, glucose, and Hs-CRP than the control group. On correlation analysis, procalcitonin had a significant positive correlation with body mass index (BMI) z-score (P = 0.02), insulin (P = 0.00), insulin resistance (HOMA-IR) (P = 0.006), Hs-CRP (P = 0.02), total cholesterol (P = 0.04), and triglycerides (P = 0.00) in the obese group. CONCLUSION: Increased serum procalcitonin concentrations were closely related to measures of adiposity, Hs-CRP, and insulin resistance, suggesting that PCT may be an excellent biomarker for obesity-related chronic low-grade inflammation in children and adolescents.
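    HOMA-IR, the insulin-resistance index correlated with PCT above, is derived from fasting glucose and insulin. The abstract does not restate the formula; the standard definition is glucose (mg/dL) × insulin (µU/mL) / 405 (equivalently, glucose in mmol/L × insulin / 22.5). A minimal sketch with illustrative values:

    ```python
    # Sketch: the standard HOMA-IR calculation referenced in the abstract.
    # Input values are illustrative, not patient data from the study.

    def homa_ir(glucose_mg_dl: float, insulin_uU_ml: float) -> float:
        """HOMA-IR = fasting glucose (mg/dL) * fasting insulin (uU/mL) / 405.
        The 405 constant is the mg/dL form of the mmol/L-based 22.5 formula."""
        return glucose_mg_dl * insulin_uU_ml / 405.0

    print(homa_ir(glucose_mg_dl=92.0, insulin_uU_ml=14.0))  # ~3.18
    ```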

    Neurocognitive Function and Its Related Potentials in Children with Beta Thalassemia Major: An Egyptian Study

    BACKGROUND: Repeated blood transfusions and hemolysis in children with β-thalassemia major lead to iron overload in various organs, including the brain, which may cause neurodegeneration. AIM: To evaluate the intelligence quotient of children with β-thalassemia major and healthy counterparts and to assess risk factors for cognitive problems. SUBJECTS AND METHODS: This case-control study was performed on 50 children aged 6-16 years with β-thalassemia major as the patient group, compared with 50 healthy children matched for age, sex, and social class as a control group. Cognitive functions were evaluated using the Wechsler Intelligence Scale for Children. Serum ferritin and iron were measured by ELISA. RESULTS: Mean performance and full-scale IQ scores were significantly lower in the patient group than in controls, whereas there was no significant difference between the groups in verbal IQ score. In thalassemic children, block design, comprehension, and arithmetic scores were negatively correlated with age at disease onset, duration of illness, and onset of chelation therapy. Serum iron and ferritin were negatively correlated with similarities and digit span. Serum iron levels were negatively correlated with the performance IQ score. CONCLUSION: Children with β-thalassemia major need more academic attention and cognitive assessment to improve their IQ.

    Impact of opioid-free analgesia on pain severity and patient satisfaction after discharge from surgery: multispecialty, prospective cohort study in 25 countries

    Background: Balancing opioid stewardship and the need for adequate analgesia following discharge after surgery is challenging. This study aimed to compare the outcomes of patients discharged with opioid versus opioid-free analgesia after common surgical procedures. Methods: This international, multicentre, prospective cohort study collected data from patients undergoing common acute and elective general surgical, urological, gynaecological, and orthopaedic procedures. The primary outcomes were patient-reported time in severe pain, measured on a numerical analogue scale from 0 to 100%, and patient-reported satisfaction with pain relief during the first week following discharge. Data were collected by in-hospital chart review and patient telephone interview 1 week after discharge. Results: The study recruited 4273 patients from 144 centres in 25 countries; 1311 patients (30.7%) were prescribed opioid analgesia at discharge. Patients reported being in severe pain for 10 (i.q.r. 1-30)% of the first week after discharge and rated satisfaction with analgesia as 90 (i.q.r. 80-100) of 100. After adjustment for confounders, opioid analgesia on discharge was independently associated with increased pain severity (risk ratio 1.52, 95% c.i. 1.31 to 1.76; P < 0.001) and re-presentation to healthcare providers owing to side-effects of medication (OR 2.38, 95% c.i. 1.36 to 4.17; P = 0.004), but not with satisfaction with analgesia (beta coefficient 0.92, 95% c.i. -1.52 to 3.36; P = 0.468), compared with opioid-free analgesia. Although opioid prescribing varied greatly between high-income and low- and middle-income countries, patient-reported outcomes did not. Conclusion: Opioid analgesia prescription on surgical discharge is associated with a higher risk of re-presentation owing to medication side-effects and with increased patient-reported pain, but not with changes in patient-reported satisfaction. Opioid-free discharge analgesia should be adopted routinely.

    The evolving SARS-CoV-2 epidemic in Africa: Insights from rapidly expanding genomic surveillance

    INTRODUCTION Investment in Africa over the past year with regard to severe acute respiratory syndrome coronavirus 2 (SARS-CoV-2) sequencing has led to a massive increase in the number of sequences, which, to date, exceeds 100,000 sequences generated to track the pandemic on the continent. These sequences have profoundly affected how public health officials in Africa have navigated the COVID-19 pandemic. RATIONALE We demonstrate how the first 100,000 SARS-CoV-2 sequences from Africa have helped monitor the epidemic on the continent, how genomic surveillance expanded over the course of the pandemic, and how we adapted our sequencing methods to deal with an evolving virus. Finally, we also examine how viral lineages have spread across the continent in a phylogeographic framework to gain insights into the underlying temporal and spatial transmission dynamics for several variants of concern (VOCs). RESULTS Our results indicate that the number of countries in Africa that can sequence the virus within their own borders is growing and that this is coupled with a shorter turnaround time from the time of sampling to sequence submission. Ongoing evolution necessitated the continual updating of primer sets, and, as a result, eight primer sets were designed in tandem with viral evolution and used to ensure effective sequencing of the virus. The pandemic unfolded through multiple waves of infection that were each driven by distinct genetic lineages, with B.1-like ancestral strains associated with the first pandemic wave of infections in 2020. Successive waves on the continent were fueled by different VOCs, with Alpha and Beta cocirculating in distinct spatial patterns during the second wave and Delta and Omicron affecting the whole continent during the third and fourth waves, respectively. Phylogeographic reconstruction points toward distinct differences in viral importation and exportation patterns associated with the Alpha, Beta, Delta, and Omicron variants and subvariants, when considering both Africa versus the rest of the world and viral dissemination within the continent. Our epidemiological and phylogenetic inferences therefore underscore the heterogeneous nature of the pandemic on the continent and highlight key insights and challenges, such as the limitations imposed by low testing proportions. We also highlight the early warning capacity that genomic surveillance in Africa has had for the rest of the world with the detection of new lineages and variants, the most recent being the characterization of various Omicron subvariants. CONCLUSION Sustained investment for diagnostics and genomic surveillance in Africa is needed as the virus continues to evolve. This is important not only to help combat SARS-CoV-2 on the continent but also because it can be used as a platform to help address the many emerging and reemerging infectious disease threats in Africa. In particular, capacity building for local sequencing within countries or within the continent should be prioritized because this is generally associated with shorter turnaround times, providing the most benefit to local public health authorities tasked with pandemic response and mitigation and allowing for the fastest reaction to localized outbreaks. These investments are crucial for pandemic preparedness and response and will serve the health of the continent well into the 21st century.

    Vaping, Environmental Toxicants Exposure, and Lung Cancer Risk

    Lung cancer (LC) is the second-most prevalent tumor worldwide. According to the most recent GLOBOCAN data, over 2.2 million new LC cases were reported in 2020, with an estimated 1,796,144 deaths from the disease. Genetic, lifestyle, and environmental exposures play an important role as risk factors for LC. The use of e-cigarette, or vaping, products (EVPs) has been increasing dramatically worldwide. There is growing concern that EVP consumption may increase the risk of LC because EVPs contain several proven carcinogenic compounds. However, the relationship between EVPs and LC is not well established. E-cigarettes contain nicotine derivatives (e.g., nitrosonornicotine, nitrosamine ketone), heavy metals (including organometal compounds), polycyclic aromatic hydrocarbons, and flavorings (aldehydes and complex organics). Several environmental toxicants have been proven to contribute to LC. Proven and plausible environmental carcinogens include physical agents (ionizing and non-ionizing radiation), chemicals (such as asbestos, formaldehyde, and dioxins), and heavy metals (such as cobalt, arsenic, cadmium, chromium, and nickel). Air pollution, especially particulate matter (PM) emitted from vehicles and industrial exhausts, is linked with LC. Although extensive environmental exposure prevention policies and smoking reduction strategies have been adopted globally, the dangers remain. Combined, EVPs and toxic environmental exposures may exhibit significant synergistic oncogenicity. This review aims to analyze the current publications on the relationship between EVP consumption and environmental toxicants in the pathogenesis of LC.