19 research outputs found

    Prevalence and Risk Factors of Obesity among Elderly attending Geriatric Outpatient Clinics in Mansoura City

    Obesity is a major public health problem affecting all ages in both developed and developing countries. It is considered the fifth leading risk factor for death worldwide, with about 2.8 million people dying each year, directly or indirectly, as a result of obesity. Obesity in the elderly is one of the most serious public health challenges worldwide. It is a complex, multifactorial disease arising from interactions between genetic, environmental, and behavioral factors that result in energy imbalance and promote excessive fat deposition. Aim: to determine the prevalence and risk factors of obesity among elderly people attending geriatric outpatient clinics in Mansoura City. Method: A descriptive, analytical, cross-sectional, hospital-based research design was used. The study was carried out on 126 elderly people attending geriatric outpatient clinics in the specialized medical hospital and general hospital in Mansoura City. Data were collected using three tools: a socio-demographic and clinical data structured interview sheet, the Health-Promoting Lifestyle Profile II (HPLP II), and Body Mass Index. Results: The prevalence of obesity among elderly people attending geriatric outpatient clinics in Mansoura City was 33.3%, and there was a significant relation between obesity and a positive family history of obesity and an unhealthy lifestyle (poor nutritional habits, lack of physical activity, and poor stress management). Conclusion: Increasing awareness of obesity and healthy lifestyles is essential for elderly people to prevent obesity and its complications. Keywords: Obesity, Elderly, Risk Factors, Prevalence, Lifestyle
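The Body Mass Index tool mentioned above follows a standard formula; as a minimal sketch (function names are mine, and the conventional WHO adult cut-offs shown are an assumption — the study may have used cut-offs adjusted for elderly participants):

```python
def bmi(weight_kg: float, height_m: float) -> float:
    """Body Mass Index: weight in kilograms divided by height in metres squared."""
    return weight_kg / height_m ** 2

def who_category(bmi_value: float) -> str:
    """Conventional WHO adult BMI categories (assumed; the study may differ)."""
    if bmi_value < 18.5:
        return "underweight"
    if bmi_value < 25:
        return "normal"
    if bmi_value < 30:
        return "overweight"
    return "obese"

# Example: 85 kg at 1.60 m gives a BMI of about 33.2, i.e. "obese".
print(who_category(bmi(85, 1.60)))
```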

    Risk factors of falls among elderly living in Urban Suez - Egypt

    Introduction: Falling is one of the most common geriatric syndromes threatening the independence of older persons. Falls result from a complex and interactive mix of biological or medical, behavioral, and environmental factors, many of which are preventable. Studying these diverse risk factors would aid their early detection and management at the primary care level. Methods: This cross-sectional study of risk factors for falls was conducted among 340 elders in urban Suez: all patients over 60 who attended two family practice centers there. Results: When asked about falling during the past 12 months, 205 elders recalled at least one incident of falling. Of them, 36% had their falls outdoors, and 24% mentioned stairs as the most prevalent site for indoor falls. Falls were also reported more among dependent than independent elderly. Using univariate regression analysis, almost all tested risk factors were significantly associated with falls in the studied population. These risk factors include: living alone, having chronic diseases, using medications, having a physical deficit, being inactive, and having a high nutritional risk. However, multivariate regression analysis showed that the strongest risk factors were a low level of physical activity (OR 0.6, P value 0.03), using a cane or walker (OR 1.69, P value 0.001), and impairment of activities of daily living (OR 1.7, P value 0.001). Conclusion: Although falling is a serious problem among the elderly, with many consequences, it has many preventable risk factors. Health care providers should advise people to remain active, and more research is needed in this important area of family practice. Pan African Medical Journal 2013; 14:2
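The odds ratios reported above come from regression modelling; the underlying quantity can be illustrated from a 2x2 table (the counts below are invented for illustration and are not from the study):

```python
def odds_ratio(exposed_cases, exposed_controls, unexposed_cases, unexposed_controls):
    """Odds ratio from a 2x2 table: (a/b) / (c/d)."""
    return (exposed_cases / exposed_controls) / (unexposed_cases / unexposed_controls)

# Hypothetical counts: fallers vs. non-fallers among cane/walker users and non-users.
print(round(odds_ratio(60, 30, 145, 105), 2))  # 1.45: users have higher odds of falling
```

An OR above 1 (e.g. cane or walker use, OR 1.69) marks a factor associated with higher odds of falling; an OR below 1 (low physical activity, OR 0.6, as reported) marks lower odds relative to the comparison group.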

    Insecticide resistance in the sand fly, Phlebotomus papatasi from Khartoum State, Sudan

    Background: Phlebotomus papatasi, the vector of cutaneous leishmaniasis (CL), is the most widely spread sand fly in Sudan. No data had previously been collected on insecticide susceptibility and/or resistance of this vector, and a first study to establish baseline data is reported here. Methods: Sand flies were collected from Surogia village (Khartoum State), Rahad Game Reserve (eastern Sudan), and the White Nile area (central Sudan) using light traps, and were reared in the Tropical Medicine Research Institute laboratory. The insecticide susceptibility status of the first progeny (F1) of P. papatasi from each population was tested using WHO insecticide kits. P. papatasi specimens from Surogia village and Rahad Game Reserve were also assayed for the activities of enzyme systems involved in insecticide resistance: acetylcholinesterase (AChE), non-specific carboxylesterases (EST), glutathione-S-transferases (GSTs), and cytochrome P450 monooxygenases (Cyt P450). Results: Populations of P. papatasi from the White Nile and Rahad Game Reserve were sensitive to dichlorodiphenyltrichloroethane (DDT), permethrin, malathion, and propoxur. However, the P. papatasi population from Surogia village was sensitive to DDT and permethrin but highly resistant to malathion and propoxur. Furthermore, P. papatasi from Surogia village had significantly higher insecticide-detoxification enzyme activity than those from Rahad Game Reserve. The sand fly population in Surogia displayed high AChE activity, and only three specimens had elevated levels of EST and GST. Conclusions: The study provided evidence for malathion and propoxur resistance in the sand fly population of Surogia village, which probably resulted from anti-malarial control activities carried out in the area during the past 50 years.

    Impact of opioid-free analgesia on pain severity and patient satisfaction after discharge from surgery: multispecialty, prospective cohort study in 25 countries

    Background: Balancing opioid stewardship and the need for adequate analgesia following discharge after surgery is challenging. This study aimed to compare the outcomes of patients discharged with opioid versus opioid-free analgesia after common surgical procedures. Methods: This international, multicentre, prospective cohort study collected data from patients undergoing common acute and elective general surgical, urological, gynaecological, and orthopaedic procedures. The primary outcomes were patient-reported time in severe pain, measured on a numerical analogue scale from 0 to 100%, and patient-reported satisfaction with pain relief during the first week following discharge. Data were collected by in-hospital chart review and patient telephone interview 1 week after discharge. Results: The study recruited 4273 patients from 144 centres in 25 countries; 1311 patients (30.7%) were prescribed opioid analgesia at discharge. Patients reported being in severe pain for 10 (i.q.r. 1-30)% of the first week after discharge and rated satisfaction with analgesia as 90 (i.q.r. 80-100) of 100. After adjustment for confounders, opioid analgesia on discharge was independently associated with increased pain severity (risk ratio 1.52, 95% c.i. 1.31 to 1.76; P < 0.001) and re-presentation to healthcare providers owing to side-effects of medication (OR 2.38, 95% c.i. 1.36 to 4.17; P = 0.004), but not with satisfaction with analgesia (beta coefficient 0.92, 95% c.i. -1.52 to 3.36; P = 0.468), compared with opioid-free analgesia. Although opioid prescribing varied greatly between high-income and low- and middle-income countries, patient-reported outcomes did not. Conclusion: Opioid analgesia prescription on surgical discharge is associated with a higher risk of re-presentation owing to side-effects of medication and increased patient-reported pain, but not with changes in patient-reported satisfaction. Opioid-free discharge analgesia should be adopted routinely.

    The evolving SARS-CoV-2 epidemic in Africa: Insights from rapidly expanding genomic surveillance

    INTRODUCTION Investment in Africa over the past year with regard to severe acute respiratory syndrome coronavirus 2 (SARS-CoV-2) sequencing has led to a massive increase in the number of sequences, which, to date, exceeds 100,000 sequences generated to track the pandemic on the continent. These sequences have profoundly affected how public health officials in Africa have navigated the COVID-19 pandemic. RATIONALE We demonstrate how the first 100,000 SARS-CoV-2 sequences from Africa have helped monitor the epidemic on the continent, how genomic surveillance expanded over the course of the pandemic, and how we adapted our sequencing methods to deal with an evolving virus. Finally, we also examine how viral lineages have spread across the continent in a phylogeographic framework to gain insights into the underlying temporal and spatial transmission dynamics for several variants of concern (VOCs). RESULTS Our results indicate that the number of countries in Africa that can sequence the virus within their own borders is growing and that this is coupled with a shorter turnaround time from the time of sampling to sequence submission. Ongoing evolution necessitated the continual updating of primer sets, and, as a result, eight primer sets were designed in tandem with viral evolution and used to ensure effective sequencing of the virus. The pandemic unfolded through multiple waves of infection that were each driven by distinct genetic lineages, with B.1-like ancestral strains associated with the first pandemic wave of infections in 2020. Successive waves on the continent were fueled by different VOCs, with Alpha and Beta cocirculating in distinct spatial patterns during the second wave and Delta and Omicron affecting the whole continent during the third and fourth waves, respectively. 
Phylogeographic reconstruction points toward distinct differences in viral importation and exportation patterns associated with the Alpha, Beta, Delta, and Omicron variants and subvariants, when considering both Africa versus the rest of the world and viral dissemination within the continent. Our epidemiological and phylogenetic inferences therefore underscore the heterogeneous nature of the pandemic on the continent and highlight key insights and challenges, for instance, recognizing the limitations of low testing proportions. We also highlight the early warning capacity that genomic surveillance in Africa has had for the rest of the world with the detection of new lineages and variants, the most recent being the characterization of various Omicron subvariants. CONCLUSION Sustained investment for diagnostics and genomic surveillance in Africa is needed as the virus continues to evolve. This is important not only to help combat SARS-CoV-2 on the continent but also because it can be used as a platform to help address the many emerging and reemerging infectious disease threats in Africa. In particular, capacity building for local sequencing within countries or within the continent should be prioritized because this is generally associated with shorter turnaround times, providing the most benefit to local public health authorities tasked with pandemic response and mitigation and allowing for the fastest reaction to localized outbreaks. These investments are crucial for pandemic preparedness and response and will serve the health of the continent well into the 21st century

    Global, regional, and national comparative risk assessment of 79 behavioural, environmental and occupational, and metabolic risks or clusters of risks, 1990-2015: a systematic analysis for the Global Burden of Disease Study 2015

    Forouzanfar MH, Afshin A, Alexander LT, et al. Global, regional, and national comparative risk assessment of 79 behavioural, environmental and occupational, and metabolic risks or clusters of risks, 1990-2015: a systematic analysis for the Global Burden of Disease Study 2015. Lancet. 2016;388(10053):1659-1724. Background The Global Burden of Diseases, Injuries, and Risk Factors Study 2015 provides an up-to-date synthesis of the evidence for risk factor exposure and the attributable burden of disease. By providing national and subnational assessments spanning the past 25 years, this study can inform debates on the importance of addressing risks in context. Methods We used the comparative risk assessment framework developed for previous iterations of the Global Burden of Disease Study to estimate attributable deaths, disability-adjusted life-years (DALYs), and trends in exposure by age group, sex, year, and geography for 79 behavioural, environmental and occupational, and metabolic risks or clusters of risks from 1990 to 2015. This study included 388 risk-outcome pairs that met World Cancer Research Fund-defined criteria for convincing or probable evidence. We extracted relative risk and exposure estimates from randomised controlled trials, cohorts, pooled cohorts, household surveys, census data, satellite data, and other sources. We used statistical models to pool data, adjust for bias, and incorporate covariates. We developed a metric that allows comparisons of exposure across risk factors: the summary exposure value. Using the counterfactual scenario of theoretical minimum risk level, we estimated the portion of deaths and DALYs that could be attributed to a given risk. We decomposed trends in attributable burden into contributions from population growth, population age structure, risk exposure, and risk-deleted cause-specific DALY rates. We characterised risk exposure in relation to a Socio-demographic Index (SDI).
Findings Between 1990 and 2015, global exposure to unsafe sanitation, household air pollution, childhood underweight, childhood stunting, and smoking each decreased by more than 25%. Global exposure for several occupational risks, high body-mass index (BMI), and drug use increased by more than 25% over the same period. All risks jointly evaluated in 2015 accounted for 57.8% (95% CI 56.6-58.8) of global deaths and 41.2% (39.8-42.8) of DALYs. In 2015, the ten largest contributors to global DALYs among Level 3 risks were high systolic blood pressure (211.8 million [192.7 million to 231.1 million] global DALYs), smoking (148.6 million [134.2 million to 163.1 million]), high fasting plasma glucose (143.1 million [125.1 million to 163.5 million]), high BMI (120.1 million [83.8 million to 158.4 million]), childhood undernutrition (113.3 million [103.9 million to 123.4 million]), ambient particulate matter (103.1 million [90.8 million to 115.1 million]), high total cholesterol (88.7 million [74.6 million to 105.7 million]), household air pollution (85.6 million [66.7 million to 106.1 million]), alcohol use (85.0 million [77.2 million to 93.0 million]), and diets high in sodium (83.0 million [49.3 million to 127.5 million]). From 1990 to 2015, attributable DALYs declined for micronutrient deficiencies, childhood undernutrition, unsafe sanitation and water, and household air pollution; reductions in risk-deleted DALY rates rather than reductions in exposure drove these declines. Rising exposure contributed to notable increases in attributable DALYs from high BMI, high fasting plasma glucose, occupational carcinogens, and drug use. Environmental risks and childhood undernutrition declined steadily with SDI; low physical activity, high BMI, and high fasting plasma glucose increased with SDI. In 119 countries, metabolic risks, such as high BMI and fasting plasma glucose, contributed the most attributable DALYs in 2015. 
Regionally, smoking still ranked among the leading five risk factors for attributable DALYs in 109 countries; childhood underweight and unsafe sex remained primary drivers of early death and disability in much of sub-Saharan Africa. Interpretation Declines in some key environmental risks have contributed to declines in critical infectious diseases. Some risks appear to be invariant to SDI. Increasing risks, including high BMI, high fasting plasma glucose, drug use, and some occupational exposures, contribute to rising burden from some conditions, but also provide opportunities for intervention. Some highly preventable risks, such as smoking, remain major causes of attributable DALYs, even as exposure is declining. Public policy makers need to pay attention to the risks that are increasingly major contributors to global burden. Copyright (C) The Author(s). Published by Elsevier Ltd
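The attributable-burden estimates above rest on the population attributable fraction, which compares observed exposure with the counterfactual theoretical-minimum-risk (TMREL) distribution. In its standard comparative-risk-assessment form, with observed exposure distribution P(x), counterfactual distribution P'(x), and relative risk RR(x) at exposure level x:

```latex
\mathrm{PAF} = \frac{\sum_{x} P(x)\,\mathrm{RR}(x) - \sum_{x} P'(x)\,\mathrm{RR}(x)}{\sum_{x} P(x)\,\mathrm{RR}(x)}
```

Attributable deaths or DALYs for a risk-outcome pair are then the observed burden for that outcome multiplied by this fraction.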

    Application of Kano model for optimizing the training system among nursing internship students: a mixed-method Egyptian study

    Abstract Background Clinical experience is an important component of nursing education because it translates students' knowledge into practice, which serves as the cornerstone of nursing practice in health care delivery. Purpose The study aims to explore the quality attributes required for optimizing the training system of nursing internship students using the Kano model. Methods A concurrent exploratory sequential triangulation design was used for mixed-methods research. A total of 295 nursing internship students (the target population) were recruited (whole-population sampling) from the study settings in Egypt. Of them, 280 (97.2%) agreed to participate in the study and completed the interview and the self-administered questionnaire. Data collection was done over 6 months, from February to August 2022. Inferential statistics and thematic data analysis were used to analyze the results. Results Findings revealed that there were 35 fundamental attributes required for high-quality nursing students' internship training. The Kano model was used to categorize and prioritize the 35 quality attributes. Kano analysis revealed that 22 attributes were categorized as "attractive", 11 were categorized as "must-be", and two were indifferent attributes. Conclusion Incorporating the voice of nurse interns during their training is the key to providing an efficient and high-quality internship training experience. It could give realistic impressions of the drawbacks of training and proposed solutions. Implications of the study Nurse managers and educators in clinical settings and educational institutions should put much emphasis on the training attributes and pillars to ensure that nursing internship students are mastering the skills of competent alumni.
    The provision of a conducive training environment that fulfills the basic needs of internship students, maintaining their passion for learning as well as their commitment to the nursing profession, will improve satisfaction and the quality of education, training, and practice. Incorporating an internship-student support system with motivation strategies is also a helpful tool for maintaining exemplary performance of internship students during the training period.
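Kano categories such as "attractive", "must-be", and "indifferent" are conventionally assigned by cross-tabulating paired functional ("how do you feel if present?") and dysfunctional ("how do you feel if absent?") answers through an evaluation table. A minimal sketch of that standard table (simplified and incomplete; the study's exact instrument may differ):

```python
# Simplified subset of the standard Kano evaluation table (an illustration,
# not the study's instrument). Answers use the usual 5-point wording:
# like, must-be, neutral, live-with, dislike.
KANO_TABLE = {
    ("like", "dislike"): "one-dimensional",
    ("like", "must-be"): "attractive",
    ("like", "neutral"): "attractive",
    ("like", "live-with"): "attractive",
    ("must-be", "dislike"): "must-be",
    ("neutral", "dislike"): "must-be",
    ("live-with", "dislike"): "must-be",
    ("neutral", "neutral"): "indifferent",
}

def classify(functional_answer: str, dysfunctional_answer: str) -> str:
    """Map a (functional, dysfunctional) answer pair to a Kano category."""
    return KANO_TABLE.get((functional_answer, dysfunctional_answer), "indifferent/questionable")

print(classify("like", "neutral"))     # attractive
print(classify("must-be", "dislike"))  # must-be
```

Per-attribute categories are then typically decided by the most frequent classification across respondents.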

    Job embeddedness and missed nursing care at the operating theatres: the mediating role of polychronicity

    Abstract Background Perioperative missed nursing care is a serious issue that can compromise patient safety and quality of care. However, little is known about the factors that influence perioperative missed nursing care. Aim This study aimed to examine the effects of job embeddedness and polychronicity on perioperative missed nursing care, and to test the mediating role of polychronicity in the relationship between job embeddedness and perioperative missed nursing care. Method This was a cross-sectional correlational study that used a convenience sample of 210 operating room nurses from nine hospitals in Egypt. Data were collected using self-administered questionnaires that measured job embeddedness, polychronicity, and perioperative missed nursing care. Structural equation modeling was used to test the hypothesized relationships among the variables. Results The findings demonstrated a significant, moderate negative association between missed perioperative care and both nurses' job embeddedness and polychronicity. Moreover, there was a moderate, significant positive correlation between polychronicity and job embeddedness. Path analysis revealed a significant positive causal effect of job embeddedness on polychronicity. The mediation results revealed that the indirect effect of job embeddedness on missed care through polychronicity was statistically significant, suggesting that polychronicity partially mediated this relationship. Conclusion This study sheds light on the intricate relationship between nurses' job embeddedness, missed care, and polychronicity in the operating theatre context. By enhancing job embeddedness and fostering polychronicity among nurses, healthcare organizations can reduce perioperative missed care and ultimately improve patient care outcomes in this critical healthcare setting.
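The partial-mediation finding rests on the standard product-of-coefficients decomposition of effects; a minimal illustration (the path values below are invented, not the study's estimates):

```python
# Simple mediation sketch: X -> M (path a), M -> Y controlling for X (path b),
# direct effect X -> Y (path c'). Indirect effect = a * b.
# All coefficients are hypothetical, not the study's estimates.
a = 0.45         # job embeddedness -> polychronicity
b = -0.30        # polychronicity -> missed care (negative: more polychronicity, less missed care)
c_prime = -0.25  # direct effect of embeddedness on missed care

indirect = a * b          # effect transmitted through the mediator
total = c_prime + indirect
print(round(indirect, 3), round(total, 3))
```

Partial mediation corresponds to both the indirect effect (a*b) and the direct effect (c') being significantly different from zero.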

    Perceived stress, coping strategies, symptoms severity and function status among carpal tunnel syndrome patients: a nurse-led correlational Study

    Abstract Background Carpal tunnel syndrome (CTS) is a prevalent condition characterized by hand pain, tingling, and numbness. The severity of symptoms and functional status in CTS patients may be influenced by perceived stress and how individuals cope with it. However, little is known about the role of coping strategies as moderators in this relationship. Unfolding the roles of perceived stress and coping strategies in CTS management will help nurses provide comprehensive and tailored nursing care, ultimately improving patient comfort, functionality, and quality of life. Purpose This study aimed to examine the role of coping strategies (adaptive and maladaptive) in the relationship between perceived stress and both symptom severity and functional status among these patients. Method We employed a multisite, correlational study design with moderation analysis. The study included 215 patients with CTS from neurosurgery outpatient clinics at three hospitals in Egypt. After obtaining their consent to participate, eligible participants completed anonymous, self-reported measures of perceived stress, the Brief COPE inventory, and the Boston Carpal Tunnel Questionnaire. Demographic and biomedical data were also collected. The questionnaire took about 20 minutes to complete. Data were collected over six months, starting in February 2023. Results The results showed that perceived stress, adaptive coping, and maladaptive coping were significant predictors of symptom severity and functional status. Adaptive coping moderated the relationships between perceived stress and both symptom severity and functional status, while maladaptive coping did not. The interaction between perceived stress and adaptive coping had a moderate effect on symptom severity and functional status after controlling for the main effects and the covariates.
    Conclusion This study explored the relationship between perceived stress, coping strategies, and outcomes in patients with CTS. The results indicate that nurses play a vital role in assessing patients and helping them adopt effective coping strategies to manage perceived stress and alleviate symptoms and functional impairment. Moreover, the findings support the need for psychological interventions that address both perceived stress and coping strategies as a way to enhance the functional status and quality of life of patients with CTS.
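Moderation of the kind reported is commonly tested by adding a stress-by-coping interaction term to a regression model; a minimal sketch with simulated data (all numbers and effect sizes below are hypothetical, not the study's):

```python
import numpy as np

rng = np.random.default_rng(0)
n = 200
stress = rng.normal(size=n)   # perceived stress (standardized), simulated
coping = rng.normal(size=n)   # adaptive coping (standardized), simulated

# Hypothetical data-generating process with a true interaction (moderation):
severity = 0.5 * stress - 0.3 * coping - 0.2 * stress * coping \
           + rng.normal(scale=0.5, size=n)

# Moderation model: Y = b0 + b1*stress + b2*coping + b3*(stress*coping)
X = np.column_stack([np.ones(n), stress, coping, stress * coping])
b0, b1, b2, b3 = np.linalg.lstsq(X, severity, rcond=None)[0]
print(round(b3, 2))  # estimated interaction coefficient, near the true -0.2
```

A significant b3 means the slope of stress on severity depends on the level of coping, i.e. coping moderates the stress-severity relationship.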

    Assessment of Hair Aluminum, Lead, and Mercury in a Sample of Autistic Egyptian Children: Environmental Risk Factors of Heavy Metals in Autism

    Background and Aims. Both genetic and environmental factors have been implicated in the etiology of autism, which remains elusive and controversial. The aim of this study was to assess the levels of, possible environmental risk factors for, and sources of exposure to mercury, lead, and aluminum in children with autism spectrum disorder (ASD) compared with matched controls. Methods. One hundred ASD children were studied in comparison to 100 controls. All participants were subjected to clinical evaluation and measurement of mercury, lead, and aluminum through hair analysis, which reflects past exposure. Results. The mean levels of mercury, lead, and aluminum in the hair of the autistic patients were significantly higher than those of controls. Mercury, lead, and aluminum levels were positively correlated with maternal fish consumption, living near gasoline stations, and the use of aluminum pans, respectively. Conclusion. Levels of mercury, lead, and aluminum in the hair of autistic children are higher than in controls. Environmental exposure to these toxic heavy metals at key times in development may play a causal role in autism.