Soil hydrophysical properties as affected by kind of added polymer.
Laboratory experiments were carried out to clarify the impact of different types of natural and synthetic polymers on some hydrophysical properties (soil hydraulic parameters) of a sandy soil. Adding 0.5% (w/w) of each treatment to the soil significantly increased water retention at saturation, field capacity, total available water, and readily available water. These treatments also decreased the value of the inflection point on the water retention curve as a result of improved water behavior in the soil. The obtained results revealed that soil water storage significantly increased from 0.271 in the control treatment (without added polymer) up to 0.414 in treatment T10 (acrylic acid + xanthan), while field capacity increased significantly from 0.078 in the control up to 0.242 in the same treatment (T10). Regarding the effects of polymer application on total available water and readily available water, the data revealed significant increases in both parameters: total available water increased from 0.044 in the control treatment up to 0.153 in T10, and readily available water increased from 0.057 in the control treatment up to 0.185 in T10. Concerning the inflection point on the soil water retention curve, the obtained results revealed that the inflection point of the control treatment (1000 mbar) decreased to 590 mbar as a result of adding the acrylic acid + xanthan polymer mixture (T10). The soil water depletion rate decreased by 25% up to 75% with polymer application, depending on the type of polymer and whether it was added individually or in combination with another polymer. This effect led to significant differences between the control treatment and the other treatments. Generally, all polymers had significant effects on the studied hydrophysical properties of the sandy soil, i.e., soil water storage capacity, soil water depletion rate, field capacity, total available water, readily available water, and the inflection point on the soil water retention curve. Acrylic acid recorded the best results concerning soil water behavior, whether added individually to the sandy soil (Treatment 2) or in combination with xanthan (Treatment 10) or lignosulphonate (Treatment 11).
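The inflection point and the available-water quantities above are normally derived from a fitted water retention model such as van Genuchten's; for that model with m = 1 − 1/n, the inflection point of θ(h) falls in closed form at h_i = (1/α)·m^(1/n). As a minimal illustration of how these quantities relate, here is a Python sketch; the parameter values are generic textbook values for a sandy soil (not the study's fitted parameters), and the field capacity and wilting point suctions (330 and 15,000 cm, with 1 mbar ≈ 1 cm H2O) are conventional assumptions.

```python
import numpy as np

def van_genuchten(h, theta_r, theta_s, alpha, n):
    """Water content (cm3/cm3) at suction h (cm), van Genuchten (1980)."""
    m = 1.0 - 1.0 / n
    return theta_r + (theta_s - theta_r) / (1.0 + (alpha * h) ** n) ** m

def inflection_point(alpha, n):
    """Suction at the inflection point of theta(h): h_i = (1/alpha) * m**(1/n)."""
    m = 1.0 - 1.0 / n
    return (1.0 / alpha) * m ** (1.0 / n)

# Illustrative textbook parameters for sand, NOT the study's fitted values.
theta_r, theta_s, alpha, n = 0.045, 0.43, 0.145, 2.68

h = np.array([0.0, 330.0, 15000.0])  # saturation, field capacity, wilting point
theta = van_genuchten(h, theta_r, theta_s, alpha, n)
taw = theta[1] - theta[2]            # total available water = FC - PWP
print(f"theta(sat, FC, PWP) = {theta}")
print(f"TAW = {taw:.3f}, inflection at {inflection_point(alpha, n):.1f} cm suction")
```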
Screening Spring Wheat Genotypes for TaDreb-B1 and Fehw3 Genes under Severe Drought Stress at the Germination Stage Using KASP Technology
Drought stress is a major yield-limiting factor throughout the world in wheat (Triticum aestivum L.), causing losses of up to 80% of the total yield. The identification of factors affecting drought stress tolerance at the seedling stage is especially important to increase adaptation and accelerate the grain yield potential. In the current study, 41 spring wheat genotypes were tested for their tolerance to drought at the germination stage under two different polyethylene glycol (PEG) concentrations of 25% and 30%. For this purpose, twenty seedlings from each genotype were evaluated in triplicate with a randomized complete block design (RCBD) in a controlled growth chamber. The following nine parameters were recorded: germination pace (GP), germination percentage (G%), number of roots (NR), shoot length (SL), root length (RL), shoot–root length ratio (SRR), fresh biomass weight (FBW), dry biomass weight (DBW), and water content (WC). An analysis of variance (ANOVA) revealed highly significant differences (p < 0.01) among the genotypes, treatments (PEG25%, PEG30%), and genotype × treatment interactions for all traits. The broad-sense heritability (H2) estimates were very high under both concentrations, ranging from 89.4 to 98.9% under PEG25% and from 70.8 to 98.7% under PEG30%. Citr15314 (Afghanistan) was among the best-performing genotypes under both concentrations for most of the germination traits. Two KASP markers for the TaDreb-B1 and Fehw3 genes were used to screen all genotypes and to study the effect of these genes on drought tolerance at the germination stage. All genotypes with Fehw3 only showed a better performance for most traits under both concentrations compared to genotypes having TaDreb-B1 or both genes. To our knowledge, this work is the first report showing the effect of the two genes on germination traits under severe drought stress conditions.
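The abstract does not state the heritability formula it used; one common entry-mean-basis estimate from an RCBD ANOVA is H² = σ²g / (σ²g + σ²e/r), with σ²g = (MS_genotype − MS_error)/r. A minimal Python sketch under that assumption, with invented mean squares:

```python
def broad_sense_heritability(ms_genotype: float, ms_error: float, reps: int) -> float:
    """Entry-mean-basis H^2 = sigma2_g / (sigma2_g + sigma2_e / r),
    with sigma2_g = (MS_genotype - MS_error) / r from the ANOVA table."""
    sigma2_g = (ms_genotype - ms_error) / reps
    sigma2_e = ms_error
    return sigma2_g / (sigma2_g + sigma2_e / reps)

# Hypothetical mean squares for one trait scored in triplicate (r = 3):
print(f"H^2 = {broad_sense_heritability(ms_genotype=12.4, ms_error=0.9, reps=3):.3f}")
```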
IL-4Rα on dendritic cells in neonates and Th2 immunopathology in respiratory syncytial virus infection
Respiratory syncytial virus (RSV) is one of the leading causes of bronchiolitis in children, and severe RSV infection early in life has been associated with asthma development. Using a neonatal mouse model, we have shown that down-regulation of IL-4 receptor α (IL-4Rα) with antisense oligonucleotides in the lung during neonatal infection protected from RSV immunopathophysiology. Significant down-regulation of IL-4Rα was observed on pulmonary CD11b+ myeloid dendritic cells (mDCs), suggesting a role for IL-4Rα on mDCs in the immunopathogenesis of neonatal RSV infection. Here, we demonstrated that neonatal CD11b+ mDCs expressed higher levels of IL-4Rα than their adult counterparts. Because CD11b+ mDCs mainly present antigens to CD4+ T cells, we hypothesized that increased expression of IL-4Rα on neonatal CD11b+ mDCs was responsible for the Th2-biased RSV immunopathophysiology. Indeed, when IL-4Rα was selectively deleted from CD11b+ mDCs, the immunopathophysiology typically observed following RSV reinfection was ablated, including Th2 inflammation, airway mucus hyperproduction, and pulmonary dysfunction. Further, overexpression of IL-4Rα on adult CD11b+ DCs and their adoptive transfer into adult mice was able to recapitulate the Th2-biased RSV immunopathology typically observed only in neonates infected with RSV. IL-4Rα levels on CD11c+ cells were inversely correlated with the maturation status of CD11b+ mDCs upon RSV infection. Our data demonstrate that developmentally regulated IL-4Rα expression is critical for the maturity of pulmonary CD11b+ mDCs and the Th2-biased immunopathogenesis of neonatal RSV infection.
Improving fodder yields and nutritive value of some forage grasses as animal feeds through intercropping with Egyptian clover (Trifolium alexandrinum L.)
The present study aimed to evaluate the potential for improving the feeding value of Egyptian clover (EC), ryegrass (R), triticale (T), barley (B), and oats (O) grown as monocultures, or of Egyptian clover mixed with ryegrass (EC+R), oats (EC+O), barley (EC+B), or triticale (EC+T) at a 75:25% seeding rate, during two successive winter seasons (2018/19 and 2019/20). Plots were harvested at 5 cm stubble height after 60, 100, and 140 days from sowing. The in vitro nutritive value and ruminal fermentation of the monocultures and the intercrops containing EC were evaluated. The green forage yield of EC was higher than that of the other plants, at about 160% of the fresh forage of T, O, or the EC+T intercrop. The highest crude protein (CP) concentration was noted in EC, while the lowest (p < 0.001) concentration was observed in T, which had the highest fiber fraction content. Ryegrass had the highest net in vitro gas production (GP), while EC+R had the lowest GP (p < 0.05). EC increased dry matter and organic matter degradability. EC and R reduced the protozoal count, while total volatile fatty acids (VFA), acetate, and propionate were increased with B and the EC+T intercrop (p < 0.05). Overall, intercropping EC with triticale or ryegrass at a 75:25% mixing rate improved fresh and dry forage yields. The legume–grass intercropping improved the protozoal count, the partitioning factor (an index of microbial protein synthesis), and the total VFA concentration.
Demographics and Epidemiology of Hepatitis B in the State of Qatar: A Five-Year Surveillance-Based Incidence Study
Background: Expatriates represent >80% of Qatar's population, mostly arriving from countries in Africa and Asia that are endemic with many diseases. This increases the risk of introducing new pathogens into the country and provides a platform for maintaining endemic pathogen circulation. Here, we report on the incidence and epidemiological characteristics of hepatitis B in Qatar between 2010 and 2014. Methods: We performed a retrospective epidemiological analysis of data available from the surveillance system of the Ministry of Public Health (MOPH) in Qatar. Data were collected from various public and private health care facilities across the nation. Reported hepatitis B cases represent patients who met the stringent case definition per World Health Organization (WHO) and Centers for Disease Control and Prevention (CDC) guidelines and were eventually reported to the MOPH. Results: The annual incidence rates of hepatitis B were 30.0, 34.2, 30.5, 39.4, and 19.8 per 100,000 population in 2010, 2011, 2012, 2013, and 2014, respectively. There was no specific trend or seasonality in the reported cases. The incidence rates were higher in females than in males between 2010 and 2012, but similar in 2013 and 2014. The highest incidence rates were reported among individuals between 25 and 34 years of age. No cases were reported in children younger than five years in 2013 and 2014. Rates of hepatitis B cases declined dramatically in 2014, in both Qataris and non-Qataris, compared to previous years. Conclusion: Our results indicate a dramatic decline in hepatitis B cases in Qatar but mandate improved surveillance and vaccination efforts among expatriates in the nation.
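For reference, the incidence rates quoted above follow the standard surveillance arithmetic (cases / population × 100,000); a minimal sketch, with a hypothetical case count and population chosen only to reproduce one of the reported figures:

```python
def incidence_per_100k(cases: int, population: int) -> float:
    """Annual incidence rate per 100,000 population."""
    return cases / population * 100_000

# Hypothetical: 660 reported cases against a mid-year population of 2.2 million.
print(f"{incidence_per_100k(660, 2_200_000):.1f} per 100,000")  # -> 30.0
```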
Antimicrobial resistance among migrants in Europe: a systematic review and meta-analysis
BACKGROUND: Rates of antimicrobial resistance (AMR) are rising globally and there is concern that increased migration is contributing to the burden of antibiotic resistance in Europe. However, the effect of migration on the burden of AMR in Europe has not yet been comprehensively examined. Therefore, we did a systematic review and meta-analysis to identify and synthesise data for AMR carriage or infection in migrants to Europe to examine differences in patterns of AMR across migrant groups and in different settings. METHODS: For this systematic review and meta-analysis, we searched MEDLINE, Embase, PubMed, and Scopus with no language restrictions from Jan 1, 2000, to Jan 18, 2017, for primary data from observational studies reporting antibacterial resistance in common bacterial pathogens among migrants to 21 European Union-15 and European Economic Area countries. To be eligible for inclusion, studies had to report data on carriage or infection with laboratory-confirmed antibiotic-resistant organisms in migrant populations. We extracted data from eligible studies and assessed quality using piloted, standardised forms. We did not examine drug resistance in tuberculosis and excluded articles solely reporting on this parameter. We also excluded articles in which migrant status was determined by ethnicity, country of birth of participants' parents, or was not defined, and articles in which data were not disaggregated by migrant status. Outcomes were carriage of or infection with antibiotic-resistant organisms. We used random-effects models to calculate the pooled prevalence of each outcome. The study protocol is registered with PROSPERO, number CRD42016043681. FINDINGS: We identified 2274 articles, of which 23 observational studies reporting on antibiotic resistance in 2319 migrants were included. The pooled prevalence of any AMR carriage or AMR infection in migrants was 25·4% (95% CI 19·1-31·8; I2 =98%), including meticillin-resistant Staphylococcus aureus (7·8%, 4·8-10·7; I2 =92%) and antibiotic-resistant Gram-negative bacteria (27·2%, 17·6-36·8; I2 =94%). The pooled prevalence of any AMR carriage or infection was higher in refugees and asylum seekers (33·0%, 18·3-47·6; I2 =98%) than in other migrant groups (6·6%, 1·8-11·3; I2 =92%). The pooled prevalence of antibiotic-resistant organisms was slightly higher in high-migrant community settings (33·1%, 11·1-55·1; I2 =96%) than in migrants in hospitals (24·3%, 16·1-32·6; I2 =98%). We did not find evidence of high rates of transmission of AMR from migrant to host populations. INTERPRETATION: Migrants are exposed to conditions favouring the emergence of drug resistance during transit and in host countries in Europe. Increased antibiotic resistance among refugees and asylum seekers and in high-migrant community settings (such as refugee camps and detention facilities) highlights the need for improved living conditions, access to health care, and initiatives to facilitate detection of and appropriate high-quality treatment for antibiotic-resistant infections during transit and in host countries. Protocols for the prevention and control of infection and for antibiotic surveillance need to be integrated in all aspects of health care, which should be accessible for all migrant groups, and should target determinants of AMR before, during, and after migration. 
FUNDING: UK National Institute for Health Research Imperial Biomedical Research Centre, Imperial College Healthcare Charity, the Wellcome Trust, and UK National Institute for Health Research Health Protection Research Unit in Healthcare-Associated Infections and Antimicrobial Resistance at Imperial College London.
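The pooled prevalences above come from a random-effects model; the abstract does not name the estimator, so here is a minimal sketch of the classic DerSimonian-Laird approach with invented study data (proportions and sample sizes are illustrative only):

```python
import numpy as np

def dersimonian_laird(p, n):
    """Pool proportions p with sample sizes n under a random-effects model;
    returns the pooled estimate, a 95% CI, and the I^2 heterogeneity statistic."""
    p, n = np.asarray(p, float), np.asarray(n, float)
    var = p * (1 - p) / n              # within-study variance of each proportion
    w = 1.0 / var                      # fixed-effect (inverse-variance) weights
    p_fixed = np.sum(w * p) / np.sum(w)
    q = np.sum(w * (p - p_fixed) ** 2) # Cochran's Q
    df = len(p) - 1
    c = np.sum(w) - np.sum(w ** 2) / np.sum(w)
    tau2 = max(0.0, (q - df) / c)      # between-study variance
    w_re = 1.0 / (var + tau2)          # random-effects weights
    p_re = np.sum(w_re * p) / np.sum(w_re)
    se = np.sqrt(1.0 / np.sum(w_re))
    i2 = 100.0 * max(0.0, (q - df) / q) if q > 0 else 0.0
    return p_re, (p_re - 1.96 * se, p_re + 1.96 * se), i2

# Hypothetical per-study AMR carriage proportions and sample sizes:
pooled, ci, i2 = dersimonian_laird(p=[0.31, 0.18, 0.27, 0.09], n=[120, 85, 200, 60])
print(f"pooled = {pooled:.3f}, 95% CI {ci[0]:.3f}-{ci[1]:.3f}, I2 = {i2:.0f}%")
```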
Impact of opioid-free analgesia on pain severity and patient satisfaction after discharge from surgery: multispecialty, prospective cohort study in 25 countries
Background: Balancing opioid stewardship and the need for adequate analgesia following discharge after surgery is challenging. This study aimed to compare the outcomes for patients discharged with opioid versus opioid-free analgesia after common surgical procedures. Methods: This international, multicentre, prospective cohort study collected data from patients undergoing common acute and elective general surgical, urological, gynaecological, and orthopaedic procedures. The primary outcomes were patient-reported time in severe pain, measured on a numerical analogue scale from 0 to 100%, and patient-reported satisfaction with pain relief during the first week following discharge. Data were collected by in-hospital chart review and patient telephone interview 1 week after discharge. Results: The study recruited 4273 patients from 144 centres in 25 countries; 1311 patients (30.7%) were prescribed opioid analgesia at discharge. Patients reported being in severe pain for 10 (i.q.r. 1-30)% of the first week after discharge and rated satisfaction with analgesia as 90 (i.q.r. 80-100) of 100. After adjustment for confounders, opioid analgesia on discharge was independently associated with increased pain severity (risk ratio 1.52, 95% c.i. 1.31 to 1.76; P < 0.001) and re-presentation to healthcare providers owing to side-effects of medication (OR 2.38, 95% c.i. 1.36 to 4.17; P = 0.004), but not with satisfaction with analgesia (beta coefficient 0.92, 95% c.i. -1.52 to 3.36; P = 0.468), compared with opioid-free analgesia. Although opioid prescribing varied greatly between high-income and low- and middle-income countries, patient-reported outcomes did not. Conclusion: Opioid analgesia prescription on surgical discharge is associated with a higher risk of re-presentation owing to side-effects of medication and increased patient-reported pain, but not with changes in patient-reported satisfaction. Opioid-free discharge analgesia should be adopted routinely.
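The abstract reports an adjusted risk ratio but does not state the model used; one common way to obtain an adjusted risk ratio for a binary outcome is modified Poisson regression (log link with robust standard errors). A minimal sketch on simulated data, with all variable names and effect sizes invented for illustration:

```python
import numpy as np
import pandas as pd
import statsmodels.api as sm
import statsmodels.formula.api as smf

rng = np.random.default_rng(0)
n = 2000
df = pd.DataFrame({
    "opioid": rng.integers(0, 2, n),   # discharged with opioid analgesia?
    "age": rng.normal(50, 15, n),      # a single illustrative confounder
})
# Simulate a binary "severe pain" outcome that depends on both covariates.
p = 1 / (1 + np.exp(-(-1.5 + 0.4 * df["opioid"] + 0.01 * (df["age"] - 50))))
df["severe_pain"] = (rng.random(n) < p).astype(int)

# Modified Poisson regression: exp(coefficient) estimates the adjusted RR.
fit = smf.glm("severe_pain ~ opioid + age", data=df,
              family=sm.families.Poisson()).fit(cov_type="HC0")
print(f"adjusted RR for opioid discharge: {np.exp(fit.params['opioid']):.2f}")
```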