61 research outputs found
Cartesian Parallel Manipulator Modeling, Control and Simulation
A book chapter by Ayssam Elkady, Galal Elkobrosy, Sarwat Hanna, and Tarek Sobh on the modeling, control, and simulation of a Cartesian parallel manipulator.
A Combined Use of Rhizobacteria and Moringa Leaf Extract Mitigates the Adverse Effects of Drought Stress in Wheat (Triticum aestivum L.)
Low nutrient availability and drought stress are serious concerns in agriculture, and both biotic and abiotic stress factors can limit crop productivity. However, organic extracts obtained from moringa leaves may induce immunity in plants under nutritional and drought stress, increasing their survival. Additionally, some rhizobacterial strains can enhance root growth for better nutrient and water uptake under stress conditions. This study was conducted to cover the knowledge gap on the interactive effects of beneficial rhizobacteria and moringa leaf extracts (MLEs). The aim of this experimental study was to investigate the effectiveness of sole and combined use of rhizobacteria and MLEs against nutritional and drought stress in wheat. The nitrogen-fixing bacterium Pseudomonas aeruginosa (Pa) (10^8 CFU ml^-1) was inoculated onto wheat plants with and without foliar-applied MLEs at two different concentrations (MLE 1 = 1:15 v/v and MLE 2 = 1:30 v/v), applied twice at 25 and 35 days after seed sowing (50 ml per plant) after the establishment of drought stress. Results revealed that Pa + MLE 2 significantly increased fresh weight (FW), dry weight (DW), root and shoot lengths, and photosynthetic pigment contents of wheat. A significant enhancement in total soluble sugars, total soluble proteins, calcium, potassium, phosphate, and nitrate contents validated the beneficial effect of Pa + MLE 2 over control-treated plants. A significant decrease in sodium, proline, glycine betaine, electrolyte leakage, malondialdehyde, hydrogen peroxide, superoxide dismutase (SOD), and peroxidase (POD) concentrations in wheat cultivated under drought stress also reflects the important role of Pa + MLE 2 relative to the control. In conclusion, Pa + MLE 2 can alleviate nutritional stress and drought effects in wheat. More research in this field is required to establish Pa + MLE 2 as the most effective amendment against drought stress across distinct agroecological zones, different soil types, and contrasting wheat cultivars worldwide.
A Frontier Based Eco-Efficiency Assessment of Electric Vehicles: The Case of European Union Countries Using Mixed and Renewable Sources of Energy
Electric vehicles (EVs) are seen as a promising solution for creating more efficient and sustainable transportation systems. European Union (EU) members show a strong interest in implementing EVs, and governments support the concept by offering incentives to buyers. Although electric vehicles can be operated with non-polluting fuels such as natural gas, and fuel cells are even more efficient, generating the electricity they consume can still affect the environment and the economy. Three environmental features (water consumption, GHG emissions, and energy consumption), together with the contribution to gross domestic product (GDP) as an economic feature, were analyzed for 28 EU member states to measure electric vehicle eco-efficiency. An input-oriented data envelopment analysis (DEA) model was employed to compute the efficiency scores, and the k-means clustering algorithm was used to define high-, medium-, and low-efficiency groups. Moreover, the total efficiency scores in this study show that EVs supplied with solar energy outperform those relying on mixed energy sources.
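To make the method concrete, here is a minimal sketch of an input-oriented (CCR) DEA score computation followed by k-means grouping, written in Python with scipy and scikit-learn. It is an illustration under stated assumptions, not the authors' code: the synthetic data and names such as env_pressures and gdp_contribution are placeholders.

```python
import numpy as np
from scipy.optimize import linprog
from sklearn.cluster import KMeans

def dea_input_oriented(inputs, outputs):
    """Input-oriented CCR DEA: one linear programme per decision-making unit.
    inputs  : (n_units, n_inputs)  e.g. water use, GHG emissions, energy use
    outputs : (n_units, n_outputs) e.g. contribution to GDP
    Returns efficiency scores theta in (0, 1]."""
    X, Y = np.asarray(inputs, float), np.asarray(outputs, float)
    n, m = X.shape
    _, s = Y.shape
    scores = np.empty(n)
    for o in range(n):
        # decision variables: [theta, lambda_1 ... lambda_n]
        c = np.r_[1.0, np.zeros(n)]
        # sum_j lambda_j * x_j <= theta * x_o   (inputs can be contracted)
        A_in = np.hstack([-X[o][:, None], X.T])
        b_in = np.zeros(m)
        # sum_j lambda_j * y_j >= y_o           (outputs must be maintained)
        A_out = np.hstack([np.zeros((s, 1)), -Y.T])
        b_out = -Y[o]
        res = linprog(c, A_ub=np.vstack([A_in, A_out]), b_ub=np.r_[b_in, b_out],
                      bounds=[(0, None)] * (n + 1), method="highs")
        scores[o] = res.x[0]
    return scores

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    env_pressures = rng.uniform(1, 10, size=(28, 3))    # water, GHG, energy (synthetic)
    gdp_contribution = rng.uniform(1, 5, size=(28, 1))  # economic output (synthetic)
    scores = dea_input_oriented(env_pressures, gdp_contribution)
    # Group the 28 countries into high / medium / low efficiency, as in the abstract
    groups = KMeans(n_clusters=3, n_init=10, random_state=0).fit_predict(scores.reshape(-1, 1))
```

Each unit's score is the largest proportional contraction of its environmental inputs that still allows its economic output to be matched by a weighted combination of the other units; a score of 1 places the country on the efficient frontier.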
Associative effects of activated carbon biochar and arbuscular mycorrhizal fungi on wheat for reducing nickel food chain bioavailability
Heavy metal stress and low nutrient availability are among the major concerns in agriculture, and both abiotic stresses have the potential to decrease crop productivity. On the other hand, organic amendments such as activated carbon biochar (ACB) and arbuscular mycorrhizal fungi (AMF) increase tolerance to nutritional stress and to heavy metals such as nickel (Ni), helping plants survive in unfavorable environments. Previous studies have only looked at single applications of either ACB or AMF, and there is limited evidence of their synergistic effects, especially in plants growing in Ni-contaminated soil. The present study was conducted to cover this knowledge gap on the combined effects of AMF inoculation (Glomus intraradices) and/or wheat straw biochar amendments on wheat growth, antioxidant activities, and osmolyte concentrations. The use of either the AMF inoculant or the ACB alone resulted in improved wheat growth and decreased Ni uptake; sole AMF or ACB also reduced Ni stress effectively, allowing wheat to grow faster and reducing the transfer of soil Ni into plant tissue. In comparison with a control, adding ACB together with the AMF inoculant considerably increased fungal populations. The most significant increase in wheat growth and decrease in tissue Ni contents came from amending soil with both the AMF inoculant and biochar. By inducing soil alkalinization, immobilizing Ni, and decreasing Ni phyto-availability, the combined treatment had a synergistic impact. These findings imply that AMF inoculation combined with ACB treatment could be used not only for wheat production but also for phyto-stabilization of Ni-contaminated soil.
Effects of hospital facilities on patient outcomes after cancer surgery: an international, prospective, observational study
Background: Early death after cancer surgery is higher in low-income and middle-income countries (LMICs) than in high-income countries, yet the impact of facility characteristics on early postoperative outcomes is unknown. The aim of this study was to examine the association between hospital infrastructure, resource availability, and processes and early outcomes after cancer surgery worldwide. Methods: A multimethods analysis was performed as part of the GlobalSurg 3 study, a multicentre, international, prospective cohort study of patients who had surgery for breast, colorectal, or gastric cancer. The primary outcomes were 30-day mortality and 30-day major complication rates. Potentially beneficial hospital facilities were identified by variable selection to select those associated with 30-day mortality. Adjusted outcomes were determined using generalised estimating equations to account for patient characteristics and country income group, with population stratification by hospital. Findings: Between April 1, 2018, and April 23, 2019, facility-level data were collected for 9685 patients across 238 hospitals in 66 countries (91 hospitals in 20 high-income countries; 57 hospitals in 19 upper-middle-income countries; and 90 hospitals in 27 low-income to lower-middle-income countries). The availability of five hospital facilities was inversely associated with mortality: ultrasound, CT scanner, critical care unit, opioid analgesia, and oncologist. After adjustment for case mix and country income group, hospitals with three or fewer of these facilities (62 hospitals, 1294 patients) had higher mortality than those with four or five (adjusted odds ratio [OR] 3.85 [95% CI 2.58-5.75]; p<0.0001), with the excess mortality predominantly explained by a limited capacity to rescue following the development of major complications (63.0% vs 82.7%; OR 0.35 [0.23-0.53]; p<0.0001). Across LMICs, improvements in hospital facilities would prevent one to three deaths for every 100 patients undergoing surgery for cancer. Interpretation: Hospitals with higher levels of infrastructure and resources have better outcomes after cancer surgery, independent of country income. Without urgent strengthening of hospital infrastructure and resources, the reductions in cancer-associated mortality associated with improved access will not be realised.
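A rough sketch of the kind of generalised-estimating-equation model described in the Methods, using statsmodels in Python; the data file and column names (died_30d, facility_group, hospital_id, and the case-mix covariates) are hypothetical placeholders rather than the study's actual variables.

```python
import numpy as np
import pandas as pd
import statsmodels.api as sm
import statsmodels.formula.api as smf

# One row per patient; the file name and columns below are hypothetical.
df = pd.read_csv("globalsurg3_patients.csv")

model = smf.gee(
    "died_30d ~ C(facility_group) + age + C(asa_grade) + C(cancer_site) + C(income_group)",
    groups="hospital_id",                     # patients clustered within hospitals
    data=df,
    family=sm.families.Binomial(),            # logistic link, so coefficients map to odds ratios
    cov_struct=sm.cov_struct.Exchangeable(),  # within-hospital correlation structure
).fit()

# Exponentiated coefficients give adjusted odds ratios with 95% confidence intervals
odds_ratios = np.exp(model.params)
conf_int = np.exp(model.conf_int())
```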
Burnout among surgeons before and during the SARS-CoV-2 pandemic: an international survey
Background: The SARS-CoV-2 pandemic has had many significant impacts within the surgical realm, and surgeons have been obligated to reconsider almost every aspect of daily clinical practice. Methods: This is a cross-sectional study reported in compliance with the CHERRIES guidelines and conducted through an online platform from June 14th to July 15th, 2020. The primary outcome was the burden of burnout during the pandemic, indicated by the validated Shirom-Melamed Burnout Measure. Results: Nine hundred fifty-four surgeons completed the survey. The median length of practice was 10 years; 78.2% of respondents were male, with a median age of 37 years; 39.5% were consultants, 68.9% were general surgeons, and 55.7% were affiliated with an academic institution. Overall, there was a significant increase in the mean burnout score during the pandemic; longer years of practice and older age were significantly associated with less burnout. There were significant reductions in the median number of outpatient visits, operated cases, on-call hours, emergency visits, and research work, and 48.2% of respondents felt that the training resources were insufficient. The majority (81.3%) of respondents reported that their hospitals were involved in the management of COVID-19, 66.5% felt their roles had been minimized, 41% were asked to assist in non-surgical medical practices, and 37.6% were directly involved in COVID-19 management. Conclusions: There was significant burnout among trainees. Almost all aspects of clinical and research activity were affected, with a significant reduction in the volume of research, outpatient clinic visits, surgical procedures, on-call hours, and emergency cases hindering training. Trial registration: The study was registered on clinicaltrials.gov "NCT04433286" on 16/06/2020.
Impact of opioid-free analgesia on pain severity and patient satisfaction after discharge from surgery: multispecialty, prospective cohort study in 25 countries
Background: Balancing opioid stewardship and the need for adequate analgesia following discharge after surgery is challenging. This study aimed to compare the outcomes for patients discharged with opioid versus opioid-free analgesia after common surgical procedures. Methods: This international, multicentre, prospective cohort study collected data from patients undergoing common acute and elective general surgical, urological, gynaecological, and orthopaedic procedures. The primary outcomes were patient-reported time in severe pain measured on a numerical analogue scale from 0 to 100% and patient-reported satisfaction with pain relief during the first week following discharge. Data were collected by in-hospital chart review and patient telephone interview 1 week after discharge. Results: The study recruited 4273 patients from 144 centres in 25 countries; 1311 patients (30.7%) were prescribed opioid analgesia at discharge. Patients reported being in severe pain for 10 (i.q.r. 1-30)% of the first week after discharge and rated satisfaction with analgesia as 90 (i.q.r. 80-100) of 100. After adjustment for confounders, opioid analgesia on discharge was independently associated with increased pain severity (risk ratio 1.52, 95% c.i. 1.31 to 1.76; P < 0.001) and re-presentation to healthcare providers owing to side-effects of medication (OR 2.38, 95% c.i. 1.36 to 4.17; P = 0.004), but not with satisfaction with analgesia (beta coefficient 0.92, 95% c.i. -1.52 to 3.36; P = 0.468), compared with opioid-free analgesia. Although opioid prescribing varied greatly between high-income and low- and middle-income countries, patient-reported outcomes did not. Conclusion: Opioid analgesia prescription on surgical discharge is associated with a higher risk of re-presentation owing to side-effects of medication and increased patient-reported pain, but not with changes in patient-reported satisfaction. Opioid-free discharge analgesia should be adopted routinely.
The evolving SARS-CoV-2 epidemic in Africa: Insights from rapidly expanding genomic surveillance
INTRODUCTION
Investment in Africa over the past year in severe acute respiratory syndrome coronavirus 2 (SARS-CoV-2) sequencing has led to a massive increase in the number of sequences generated to track the pandemic on the continent, which to date exceeds 100,000. These sequences have profoundly affected how public health officials in Africa have navigated the COVID-19 pandemic.
RATIONALE
We demonstrate how the first 100,000 SARS-CoV-2 sequences from Africa have helped monitor the epidemic on the continent, how genomic surveillance expanded over the course of the pandemic, and how we adapted our sequencing methods to deal with an evolving virus. Finally, we also examine how viral lineages have spread across the continent in a phylogeographic framework to gain insights into the underlying temporal and spatial transmission dynamics for several variants of concern (VOCs).
RESULTS
Our results indicate that the number of countries in Africa that can sequence the virus within their own borders is growing and that this is coupled with a shorter turnaround time from the time of sampling to sequence submission. Ongoing evolution necessitated the continual updating of primer sets, and, as a result, eight primer sets were designed in tandem with viral evolution and used to ensure effective sequencing of the virus. The pandemic unfolded through multiple waves of infection that were each driven by distinct genetic lineages, with B.1-like ancestral strains associated with the first pandemic wave of infections in 2020. Successive waves on the continent were fueled by different VOCs, with Alpha and Beta cocirculating in distinct spatial patterns during the second wave and Delta and Omicron affecting the whole continent during the third and fourth waves, respectively. Phylogeographic reconstruction points toward distinct differences in viral importation and exportation patterns associated with the Alpha, Beta, Delta, and Omicron variants and subvariants, when considering both Africa versus the rest of the world and viral dissemination within the continent. Our epidemiological and phylogenetic inferences therefore underscore the heterogeneous nature of the pandemic on the continent and highlight key insights and challenges, for instance, recognizing the limitations of low testing proportions. We also highlight the early warning capacity that genomic surveillance in Africa has had for the rest of the world with the detection of new lineages and variants, the most recent being the characterization of various Omicron subvariants.
CONCLUSION
Sustained investment for diagnostics and genomic surveillance in Africa is needed as the virus continues to evolve. This is important not only to help combat SARS-CoV-2 on the continent but also because it can be used as a platform to help address the many emerging and reemerging infectious disease threats in Africa. In particular, capacity building for local sequencing within countries or within the continent should be prioritized because this is generally associated with shorter turnaround times, providing the most benefit to local public health authorities tasked with pandemic response and mitigation and allowing for the fastest reaction to localized outbreaks. These investments are crucial for pandemic preparedness and response and will serve the health of the continent well into the 21st century
Early mobilisation in critically ill COVID-19 patients: a subanalysis of the ESICM-initiated UNITE-COVID observational study
Background
Early mobilisation (EM) is an intervention that may improve the outcome of critically ill patients. There are limited data on EM in COVID-19 patients and its use during the first pandemic wave.
Methods
This is a pre-planned subanalysis of the ESICM UNITE-COVID, an international multicenter observational study involving critically ill COVID-19 patients in the ICU between February 15th and May 15th, 2020. We analysed variables associated with the initiation of EM (within 72 h of ICU admission) and explored the impact of EM on mortality, ICU and hospital length of stay, as well as discharge location. Statistical analyses were done using (generalised) linear mixed-effect models and ANOVAs.
Results
Mobilisation data from 4190 patients from 280 ICUs in 45 countries were analysed. 1114 (26.6%) of these patients received mobilisation within 72 h after ICU admission; 3076 (73.4%) did not. In our analysis of factors associated with EM, mechanical ventilation at admission (OR 0.29; 95% CI 0.25, 0.35; p = 0.001), higher age (OR 0.99; 95% CI 0.98, 1.00; p ≤ 0.001), pre-existing asthma (OR 0.84; 95% CI 0.73, 0.98; p = 0.028), and pre-existing kidney disease (OR 0.84; 95% CI 0.71, 0.99; p = 0.036) were negatively associated with the initiation of EM. EM was associated with a higher chance of being discharged home (OR 1.31; 95% CI 1.08, 1.58; p = 0.007) but was not associated with length of stay in ICU (adj. difference 0.91 days; 95% CI −0.47, 1.37; p = 0.34) and hospital (adj. difference 1.4 days; 95% CI −0.62, 2.35; p = 0.24) or mortality (OR 0.88; 95% CI 0.7, 1.09; p = 0.24) when adjusted for covariates.
Conclusions
Our findings demonstrate that a quarter of COVID-19 patients received EM. EM was not associated with ICU or hospital length of stay or with mortality. However, EM in COVID-19 patients was associated with increased odds of being discharged home rather than to a care facility.
Trial registration ClinicalTrials.gov: NCT04836065 (retrospectively registered April 8th 2021)
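A hedged sketch of a generalised linear mixed-effect model of the sort mentioned in the Methods above (a logistic model for early mobilisation with a random intercept per ICU), using statsmodels' Bayesian mixed GLM; the file and column names (early_mob, mech_vent, icu_id, etc.) are hypothetical, and the study itself may have used different software.

```python
import pandas as pd
from statsmodels.genmod.bayes_mixed_glm import BinomialBayesMixedGLM

# One row per patient; the file name and columns below are hypothetical.
df = pd.read_csv("unite_covid_patients.csv")

model = BinomialBayesMixedGLM.from_formula(
    "early_mob ~ mech_vent + age + asthma + kidney_disease",  # fixed effects
    {"icu": "0 + C(icu_id)"},                                  # random intercept for each ICU
    df,
)
result = model.fit_vb()   # variational Bayes estimation
print(result.summary())   # exponentiating the fixed-effect means gives odds ratios
```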
Submerged macrophytes as bioindicators for pollution in Lake Mariut along the Mediterranean coast of Egypt
Lake Mariut is the only coastal lake in Egypt with no natural connection to the Mediterranean Sea, a situation that has aggravated the accumulation of pollutants in the lake. The effect of different pollutants on the composition and distribution of submerged aquatic plant communities in Lake Mariut (Egypt) was studied. The effects of heavy metals, nitrates, nitrites, and phosphorus in the water and sediments on the dry weight standing crop (DWSC) of submerged macrophytes and on their spatial distribution were investigated. Seven stations (comprising 20 sites) were selected to represent the different pollution sources. The species-environment data were analyzed using Canonical Correspondence Analysis (CCA). Only five submerged macrophyte species were recorded in Lake Mariut, and there was an obvious variation in the DWSC of the species according to the pollution source.
Potamogeton pectinatus and Ceratophyllum demersum were the best-growing species with maximum DWSC, while C. muricatum, Myriophyllum spicatum, and Najas marina subsp. armata grew weakly in most stations, particularly the heavily polluted ones. The CCA ordination showed that dissolved nitrite, iron, phosphate, salinity, and sulphate, as well as sediment copper and cadmium, are the variables most strongly affecting the DWSC of these macrophytes. C. demersum is better developed in agriculturally polluted areas with high levels of water phosphate and sulphate and low levels of water and sediment copper, chromium, and lead. On the other hand, P. pectinatus characterizes the industrially polluted areas, with a high level of sediment cadmium and low levels of water copper and chromium. These results are discussed and compared with other related studies.
Galal Tarek M., Farahat Emad A., Fawzy Manal. Submerged macrophytes as bioindicators for pollution in Lake Mariut along the Mediterranean coast of Egypt. In: Ecologia mediterranea, tome 34, 2008. pp. 83-91.
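For readers unfamiliar with Canonical Correspondence Analysis, the sketch below implements the core computation with numpy, following the standard Legendre & Legendre algorithm (chi-square-transformed abundances regressed on weighted, standardized environmental variables, then an SVD of the fitted values). It is an illustration of the technique, not the analysis code used in the study; the input matrices are assumed to be site-by-species abundances and site-by-environment measurements.

```python
import numpy as np

def cca_eigenvalues(Y, X):
    """Minimal canonical correspondence analysis (CCA): returns the canonical
    eigenvalues summarising how much species variation the environmental
    variables explain."""
    Y = np.asarray(Y, dtype=float)          # sites x species abundances
    X = np.asarray(X, dtype=float)          # sites x environmental variables
    f = Y.sum()
    P = Y / f                               # relative abundances
    r = P.sum(axis=1)                       # site (row) weights
    c = P.sum(axis=0)                       # species (column) weights
    # Matrix of contributions to the chi-square statistic
    Qbar = (P - np.outer(r, c)) / np.sqrt(np.outer(r, c))
    # Weighted standardisation of the environmental variables
    Xc = X - r @ X
    Xs = Xc / np.sqrt(r @ (Xc ** 2))
    Xw = Xs * np.sqrt(r)[:, None]           # row-weighted predictors
    # Weighted regression: project Qbar onto the environmental space
    B, *_ = np.linalg.lstsq(Xw, Qbar, rcond=None)
    Yhat = Xw @ B
    # Singular values of the fitted matrix give the canonical axes
    s = np.linalg.svd(Yhat, compute_uv=False)
    return s ** 2                           # canonical eigenvalues

# Usage (hypothetical inputs): eigenvalues = cca_eigenvalues(species_abundance, env_variables)
```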