
    Energy Technologies for Food Utilization for Displaced People: from identification to evaluation

    By end-2014, the number of forcibly displaced people in the world was 59.5 million, the highest since World War II. UNHCR (2015) reports 19.5 million refugees, 38.2 million internally displaced persons (IDPs), and 1.8 million asylum-seekers, with numbers progressively increasing over the last four years and an estimated 13.9 million newly displaced in 2014. Such people have several needs, especially in terms of food security. Humanitarian actors usually try to address them by focusing on food availability and access, while food utilization is often neglected (Haver K., Harmer A., Taylor G., 2013). The utilization of food, including access to drinking water, is one of the four pillars of food security, and affects food properties in terms of nutritional intake, especially micronutrients, and healthiness (European Commission, 2009). Appropriate technologies for cooking, food preservation, and water purification are required, but all of them entail access to fuel or other energy sources. Indeed, access to energy for displaced people is very important from different perspectives, but it is often problematic and entails five key challenges: “protection, relations between hosts and displaced people, environmental problems, household energy-related natural resource restrictions and livelihood-related challenges” (Lyytinen 2009, p. 1). The importance of energy for development was pointed out by the Sustainable Energy for All (SE4All) Initiative, while Safe Access to Fuel and Energy (SAFE) focused attention on crisis-affected populations, in particular refugees and IDPs (SAFE, 2015). Indeed, if people living in camps, and similarly in informal settlements, are provided with energy services, they may access a wide range of opportunities to change their condition and conduct a more productive and active life (Bellanca, 2014).
Unfortunately, several gaps are still present in the humanitarian response for providing displaced people with adequate access to energy, and studies are few, mainly related to stoves and generally lacking an independent impact assessment (Gunning, 2014). Very few displaced people have access to modern forms of energy: generally their practices are unsustainable, with average household costs of at least 200 USD per year (for a family of five) and disproportionate CO2 emissions compared to the quantity and quality of energy finally utilized (Lahn & Grafham, 2015). Therefore, the gap in giving the right importance to energy access – in particular in linking relief, rehabilitation and development – is clear.

    Towards an holistic approach to energy access in humanitarian settings: the SET4food project from technology transfer to knowledge sharing

    The increasing number of displaced people in the world requires not only rapid humanitarian action but also attention to host communities and a holistic, long-term vision. Energy has not yet been treated as a major topic in people displacement, resulting in negative impacts on several aspects, including food security. New solutions are required in terms of energy planning, technology development and adaptation, as well as decision making, sensitization, training, and support to humanitarian actors. Phase 1 of the Sustainable Energy Technologies for food security (SET4food) project (2014–2015) developed a number of tools to support the identification, adaptation, and introduction of appropriate solutions, tested some pilot innovations in critical areas, and promoted the enhancement of humanitarian response capability in the energy sector via an extensive capacity building program. A second phase of the project (2015–2018) fostered networking and collaboration between the main actors by developing an e-sharing platform, called ENERGYCoP, which includes a global not-for-profit community of practice for humanitarian professionals working in the energy sector. The platform may enable the shift from traditional “technology transfer” to a more participative approach based on co-design and technological cooperation, activated by a knowledge-sharing mechanism. This paper outlines the main challenges and the achieved results of SET4food, providing recommendations for researchers and practitioners on the way forward.

    Laboratory testing of the innovative low-cost Mewar Angithi insert for improving energy efficiency of cooking tasks on three-stone fires in critical contexts

    Currently, about 2.7 billion people across the world still lack access to clean cooking means. Humanitarian emergencies and post-emergencies are among the most critical situations: the utilization of traditional devices such as three-stone fires has a huge negative impact not only on food security but also on people's socio-economic status, their health, and the surrounding environment. Advanced cooking stoves may constitute better systems compared to the current ones; however, financial, logistic, and time constraints have strongly limited interventions in critical contexts until now. The innovative, low-cost Mewar Angithi insert for improving the energy efficiency of three-stone fires may play a role in the transition to better cooking systems in such contexts. In this paper, we rely on the Water Boiling Test 4.2.3 to assess the performance of the Mewar Angithi insert with respect to a traditional three-stone fire, and we analyse the results through a robust statistical procedure. The potentiality and suitability of this novel solution are discussed for use in critical contexts.
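    The comparison above rests on quantities measured during the Water Boiling Test. As a rough sketch of how thermal efficiency is typically derived under that protocol (the constants and example figures below are illustrative assumptions, not the paper's data):

```python
# Sketch of a Water Boiling Test style thermal-efficiency calculation.
# Constants and example figures are illustrative, not the paper's data.

CP_WATER = 4.186e3      # specific heat of water, J/(kg*K)
H_FG = 2.26e6           # latent heat of vaporization of water, J/kg

def thermal_efficiency(m_water_kg, delta_t_k, m_evaporated_kg,
                       m_fuel_kg, lhv_j_per_kg):
    """Fraction of the fuel's energy transferred to the water:
    (sensible heat gained + latent heat of water evaporated) /
    (energy released by the fuel burned)."""
    useful = m_water_kg * CP_WATER * delta_t_k + m_evaporated_kg * H_FG
    released = m_fuel_kg * lhv_j_per_kg
    return useful / released

# Example: heating 5 kg of water by 75 K, evaporating 0.2 kg,
# while burning 0.5 kg of wood (LHV ~ 16 MJ/kg).
eff = thermal_efficiency(5.0, 75.0, 0.2, 0.5, 16e6)
print(f"{eff:.1%}")  # -> 25.3%
```

    Running the same calculation for the bare three-stone fire and for the fire fitted with the insert, over repeated test runs, yields the paired efficiency samples that a statistical comparison would then operate on.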

    Role of serial ultrasound screening of venous thrombosis in oncologic children with central lines

    Objective: Pediatric oncology patients are at increased risk of venous thromboembolic events related to central venous catheters (CVCs). The study aim was to determine the incidence of catheter-related thrombosis (CRT) in a cohort of pediatric oncology patients using vascular ultrasound (US). Methods: Consecutive children at a single cancer referral center, requiring medium- to long-term CVC implantation, were screened for CRT using serial ultrasound exams. Measurements and main results: US examinations were performed 15, 30, and 90 days after CVC implantation. A total of 113 catheters were studied in 103 patients (median age 10.5 years). Ultrasound screening was completed in 80.5% of patients. Apart from three subjects, US investigations were well tolerated. Patients were followed for a median of 87 days. No symptomatic CRT was recorded throughout. Three cases of asymptomatic thrombosis were identified with early US screening; the incidence of CRT events per 1000 catheter-days was 0.11. The presence of a previous catheter-related infection and a history of one or more previous CVC placements were identified as risk factors. Conclusions: In our pediatric patients the incidence of CRT is low. Ultrasound monitoring is well tolerated and allows detection of asymptomatic CRT. Patients with a previous CVC infection or insertion seem to have a higher risk of CRT (p = 0.003 and p = 0.043, respectively). Keywords: Central venous catheters, venous thrombosis, vascular ultrasound, vascular catheter infections, children
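    The incidence figure quoted above is an incidence density, i.e. events normalized by total catheter exposure time. A minimal sketch of the calculation (the catheter-day total below is hypothetical, chosen only to illustrate the formula, not taken from the study):

```python
# Incidence density per 1000 catheter-days, as commonly reported in
# CVC studies. The exposure total below is hypothetical.

def incidence_per_1000_catheter_days(events, total_catheter_days):
    """Events per 1000 days of catheter exposure (incidence density)."""
    return 1000.0 * events / total_catheter_days

# Example: 3 asymptomatic thromboses over a hypothetical 27,000
# catheter-days of cumulative follow-up.
rate = incidence_per_1000_catheter_days(3, 27000)
print(f"{rate:.2f} events per 1000 catheter-days")
```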

    Dissemination of patient blood management practices in Swiss intensive care units: a cross-sectional survey

    BACKGROUND Patient blood management (PBM) promotes the routine detection and treatment of anaemia before surgery and optimises the management of bleeding disorders, thus minimising iatrogenic blood loss and pre-empting allogeneic blood utilisation. PBM programmes have expanded from the elective surgical setting to nonsurgical patients, including those in intensive care units (ICUs), but their dissemination across a whole country is unknown. METHODS We performed a cross-sectional, anonymous survey (10 October 2018 to 13 March 2019) of all ordinary medical members of the Swiss Society of Intensive Care Medicine and the registered ICU nurses from the 77 certified adult Swiss ICUs. We analysed PBM-related interventions adopted in Swiss ICUs and related them to the spread of PBM in Swiss hospitals. We explored blood test ordering policies, blood-sparing strategies and red blood cell-related transfusion practices in ICUs. RESULTS A total of 115 medical doctors and 624 nurses (response rates 27% and 30%, respectively) completed the surveys. Hospitals had implemented a PBM programme according to 42% of physicians, more commonly in Switzerland's German-speaking regions (odds ratio [OR] 3.39, 95% confidence interval [CI] 1.23-9.35; p = 0.018) and in hospitals with more than 500 beds (OR 3.91, 95% CI 1.48-10.4; p = 0.006). The PBM programmes targeted the detection and correction of anaemia before surgery (79%), minimising perioperative blood loss (94%) and optimising anaemia tolerance (98%). Laboratory tests were ordered by the intensivist during morning rounds in 70.4% of cases; nurses performed arterial blood gas analyses autonomously in 48.4%. Blood-sparing techniques were used by only 42.1% of nurses (263 of 624; missing: 6) and 47.0% of physicians (54 of 115). Approximately 60% of respondents used an ICU-specific transfusion guideline. The reported haemoglobin threshold for the nonbleeding ICU population was 70 g/l and was therefore at the lower limit of current guidelines.
CONCLUSIONS Based on this survey, the estimated proportion of intensivists working in hospitals with a PBM initiative is 42%, with significant variability between regions and hospitals of various sizes. The risk of iatrogenic anaemia is relevant due to liberal blood sample collection practices and the underuse of blood-sparing techniques. The reported transfusion threshold suggests excellent adherence to current international ICU-specific transfusion guidelines.
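    The survey reports its associations as odds ratios with 95% confidence intervals. A minimal sketch of the standard 2×2-table calculation with a Wald interval (the cell counts below are hypothetical, not the survey's data):

```python
import math

# Odds ratio with a Wald 95% CI from a 2x2 exposure/outcome table.
# Cell counts here are made up for illustration only.

def odds_ratio_ci(a, b, c, d, z=1.96):
    """a, b = outcome yes/no among exposed; c, d = yes/no among unexposed.
    Returns (OR, CI lower bound, CI upper bound)."""
    or_ = (a * d) / (b * c)
    se_log_or = math.sqrt(1 / a + 1 / b + 1 / c + 1 / d)
    lower = math.exp(math.log(or_) - z * se_log_or)
    upper = math.exp(math.log(or_) + z * se_log_or)
    return or_, lower, upper

or_, lower, upper = odds_ratio_ci(30, 20, 25, 40)
print(f"OR {or_:.2f} (95% CI {lower:.2f}-{upper:.2f})")
```

    The Wald interval shown is the textbook large-sample approximation; the survey's analysis may have used a different (e.g. regression-based) estimator.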

    The trans-subclavian retrograde approach for transcatheter aortic valve replacement: Single-center experience

    Objective: Aortic valve disease is the most common acquired valvular heart disease in adults. With the increasing elderly population, the proportion of patients with symptomatic aortic stenosis who are unsuitable for conventional surgery is increasing. Transcatheter aortic valve implantation has rapidly gained credibility as a valuable alternative to surgery to treat these patients; however, they often have severe iliac-femoral arteriopathy, which renders the transfemoral approach unusable. We report our experience with the trans-subclavian approach for transcatheter aortic valve implantation using the CoreValve (Medtronic CV Luxembourg S.a.r.l.) in 6 patients. Methods: From May 2008 to September 2009, 6 patients (mean age 82 ± 5 years) with symptomatic aortic stenosis and no reasonable surgical option because of excessive risk were excluded from percutaneous femoral CoreValve implantation because of iliac-femoral arteriopathy. These patients underwent transcatheter aortic valve implantation via the axillary artery. Procedures were performed by a combined team of cardiologists, cardiac surgeons, and anesthetists in the catheterization laboratory. The CoreValve 18F delivery system was introduced via the left subclavian artery in all 6 patients, 1 of whom had a patent left internal thoracic to left anterior descending artery graft. Results: Procedural success was obtained in all patients, and the mean aortic gradient decreased to 5 mm Hg or less immediately after valve deployment. One patient required implantation of a permanent pacemaker. One patient required subclavian covered stent implantation to treat a postimplant artery dissection associated with difficult surgical hemostasis. One patient was discharged in good condition but died of pneumonia 40 days after the procedure. All patients were asymptomatic on discharge, with good mid-term prosthesis performance. Conclusions: Transcatheter aortic valve implantation via a surgical subclavian approach seems safe and feasible, offering a new option to treat selected, inoperable, high-risk patients with severe aortic stenosis and peripheral vasculopathy.

    Spectral exponent assessment and neurofilament light chain: a comprehensive approach to describe recovery patterns in stroke

    Introduction: Understanding the residual recovery potential in stroke patients is crucial for tailoring effective neurorehabilitation programs. We propose using EEG and plasma neurofilament light chain (NfL) levels as a model to depict longitudinal patterns of stroke recovery. Methods: We enrolled 13 patients (4 female, mean age 74.7 ± 8.8 years) who had suffered a stroke in the previous month and were hospitalized for a 2-month rehabilitation. Patients underwent blood withdrawal, clinical evaluation, and high-definition EEG at T1 (first week of rehabilitation) and at T2 (53 ± 10 days after). We assessed the levels of NfL and analyzed the EEG signal, extracting spectral exponent (SE) values. We compared our variables between the two timepoints and between cortical and non-cortical strokes. Results: We found a significant difference in the symmetry of SE values between cortical and non-cortical stroke at both T1 (p = 0.005) and T2 (p = 0.01). SE in the affected hemisphere showed significantly steeper values at T1 than at T2 (p = 0.001). EEG measures were consistently related to clinical scores, while NfL at T1 was related to the volume of ischemic lesions (r = 0.75; p = 0.003). Additionally, the combined use of NfL and SE indicated varying trends in longitudinal clinical recovery. Conclusion: We present a proof of concept of a promising approach for the characterization of different recovery patterns in stroke patients.
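    The spectral exponent used above is conventionally estimated as the slope of the EEG power spectrum on log-log axes, with steeper (more negative) slopes reflecting a shift toward lower frequencies. A minimal pure-Python sketch on a synthetic 1/f² spectrum follows; this illustrates the general form of the measure, not the authors' exact pipeline:

```python
import math

# Spectral exponent as the least-squares slope of log10(power) vs
# log10(frequency). Demonstrated on synthetic 1/f^2 data, not real EEG.

def spectral_exponent(freqs_hz, psd):
    """Slope of the log-log power spectrum; steeper = more negative."""
    xs = [math.log10(f) for f in freqs_hz]
    ys = [math.log10(p) for p in psd]
    n = len(xs)
    mean_x, mean_y = sum(xs) / n, sum(ys) / n
    num = sum((x - mean_x) * (y - mean_y) for x, y in zip(xs, ys))
    den = sum((x - mean_x) ** 2 for x in xs)
    return num / den

# Synthetic spectrum with power ~ 1/f^2 -> exponent close to -2.
freqs = [1, 2, 4, 8, 16, 32]
psd = [1.0 / f ** 2 for f in freqs]
print(round(spectral_exponent(freqs, psd), 2))  # -> -2.0
```

    In practice the fit is restricted to a frequency band of interest and applied to the aperiodic part of the spectrum; the paper's processing details are not reproduced here.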

    How future surgery will benefit from SARS-COV-2-related measures: a SPIGC survey conveying the perspective of Italian surgeons

    COVID-19 negatively affected surgical activity, but the potential benefits resulting from the adopted measures remain unclear. The aim of this study was to evaluate the change in surgical activity and the potential benefit of COVID-19 measures from the perspective of Italian surgeons, on behalf of SPIGC. A nationwide online survey on surgical practice before, during, and after the COVID-19 pandemic was conducted in March-April 2022 (NCT:05323851). The effects of COVID-19 hospital-related measures on surgical patients' management and on personal professional development across surgical specialties were explored. Data on demographics, pre-operative/peri-operative/post-operative management, and professional development were collected. Outcomes were matched with the corresponding volume. Four hundred and seventy-three respondents across 14 surgical specialties were included in the final analysis. Since the SARS-CoV-2 pandemic, the application of telematic consultations (4.1% vs. 21.6%; p < 0.0001) and diagnostic evaluations (16.4% vs. 42.2%; p < 0.0001) has increased. Elective surgical activity was significantly reduced, and surgeons opted more frequently for conservative management with a possible indication for elective (26.3% vs. 35.7%; p < 0.0001) or urgent (20.4% vs. 38.5%; p < 0.0001) surgery. All new COVID-related measures are expected to be maintained in the future. Surgeons' personal online education increased from 12.6% (pre-COVID) to 86.6% (post-COVID; p < 0.0001), and online educational activities are considered a beneficial effect of the pandemic (56.4%). COVID-19 had a great impact on surgical specialties, with a significant reduction in operation volume. However, some forced changes turned out to be beneficial: isolation measures promoted the use of telemedicine and telemetric devices for outpatient practice and favored communication for educational purposes and surgeon-patient/family communication.
From the Italian surgeons' perspective, COVID-related measures will continue to influence future surgical clinical practice.

    Prescription appropriateness of anti-diabetes drugs in elderly patients hospitalized in a clinical setting: evidence from the REPOSI Register

    Diabetes is an increasing global health burden, with the highest prevalence (24.0%) observed in elderly people. Older diabetic adults have a greater risk of hospitalization and of several geriatric syndromes than older nondiabetic adults. For these conditions, special care is required in prescribing therapies, including anti-diabetes drugs. The aim of this study was to evaluate the appropriateness of, and the adherence to safety recommendations in, the prescription of glucose-lowering drugs in hospitalized elderly patients with diabetes. Data for this cross-sectional study were obtained from the REgistro POliterapie-Società Italiana Medicina Interna (REPOSI) register, which collected clinical information on patients aged ≥ 65 years acutely admitted to Italian internal medicine and geriatric non-intensive care units from 2010 up to 2019. Prescription appropriateness was assessed according to the 2019 AGS Beers Criteria and anti-diabetes drug data sheets. Among 5349 patients, 1624 (30.3%) had a diagnosis of type 2 diabetes. At admission, 37.7% of diabetic patients received treatment with metformin, 37.3% insulin therapy, 16.4% sulfonylureas, and 11.4% glinides. Surprisingly, only 3.1% of diabetic patients were treated with new classes of anti-diabetes drugs. According to the prescription criteria, at admission 15.4% of patients treated with metformin and 2.6% of those treated with sulfonylureas received these treatments inappropriately. At discharge, the inappropriateness of metformin therapy decreased (10.2%, p < 0.0001). According to the Beers criteria, inappropriate prescriptions of sulfonylureas rose to 29% both at admission and at discharge. This study shows poor adherence to current guidelines on diabetes management in hospitalized elderly people, with a high prevalence of inappropriate use of sulfonylureas according to the Beers criteria.

    The “Diabetes Comorbidome”: A Different Way for Health Professionals to Approach the Comorbidity Burden of Diabetes

    (1) Background: The disease burden related to diabetes is increasing greatly, particularly in older subjects. A more comprehensive approach towards the assessment and management of diabetes' comorbidities is necessary. The aim of this study was to implement our previous data identifying and representing the prevalence of the comorbidities, their association with mortality, and the strength of their relationship in hospitalized elderly patients with diabetes, developing, at the same time, a new graphic representation model of the comorbidome called the “Diabetes Comorbidome”. (2) Methods: Data were collected from the RePoSi register. Comorbidities, socio-demographic data, severity and comorbidity indexes (Cumulative Illness Rating Scale, CIRS-SI and CIRS-CI), and functional status (Barthel Index) were recorded. Mortality rates were assessed in hospital and at 3 and 12 months after discharge. (3) Results: Of the 4714 hospitalized elderly patients, 1378 had diabetes. The distribution of comorbidities showed that arterial hypertension (57.1%), ischemic heart disease (31.4%), chronic renal failure (28.8%), atrial fibrillation (25.6%), and COPD (22.7%) were the most frequent in subjects with diabetes. The graphic comorbidome showed that the strongest predictors of death in hospital and at the 3-month follow-up were dementia and cancer. At the 1-year follow-up, cancer was the first comorbidity independently associated with mortality. (4) Conclusions: The “Diabetes Comorbidome” represents an effective instrument for determining the prevalence of comorbidities and the strength of their relationship with the risk of death, as well as the need for effective treatment to improve clinical outcomes.