55 research outputs found

    Environmental justice in the Republic of Equatorial Guinea and its post-oil reality

    2013 Summer. Text in Spanish; title page and abstract in Spanish and English. Includes bibliographical references. After gaining its independence from Spain in 1968 and the subsequent discovery of some of the largest offshore oil reserves in Africa in 1995, the socioeconomic reality of Equatorial Guinea has transformed dramatically over the past decades. Once considered economically stagnant and politically corrupt, the country today has an economy categorized as one of the fastest growing in the world. Yet despite these seemingly positive changes, the current political powers have perpetuated a state structure that hinders the great majority of the country’s population, creating a state of environmental injustice in which the Equatoguinean people suffer the consequences of the exploitation of their natural resources without the opportunity to benefit from the positive development that the hydrocarbon industry brings to the nation’s economy. The present investigation focuses on the factors that have contributed to this imbalance between social and economic sectors in Equatorial Guinea, and on how this “negative development” has affected the reality and identity of the nation’s people in modern times. This work also highlights the evolution of the servile relationship between the Equatoguinean government, other international political entities, and the transnational oil corporations that have established themselves in the region, with special attention to the indifference they have shown toward the overall welfare of the Equatoguinean people. To conclude, I consider the country’s possible future socioeconomic trajectory in light of this information, focusing primarily on its overall relevance to the field of Environmental Justice.

    Expenditure, Coping, and Academic Behaviors Among Food-Insecure College Students at 10 Higher Education Institutes in the Appalachian and Southeastern Regions

    Background A number of studies have measured college student food insecurity prevalence at levels higher than the national average; however, no multicampus regional study among students at 4-y institutions has been undertaken in the Appalachian and Southeast regions of the United States. Objectives The aims of this study were to determine the prevalence of food insecurity among college students in the Appalachian and Southeastern regions of the United States, and to determine the association between food-insecurity status and money expenditures, coping strategies, and academic performance among a regional sample of college students. Methods This regional, cross-sectional, online survey study included 13,642 college students at 10 public universities. Food-insecurity status was measured through the use of the USDA Adult Food Security Survey. The outcomes were associations between food insecurity and behaviors determined with the use of the money expenditure scale (MES), the coping strategy scale (CSS), and the academic progress scale (APS). A forward-selection logistic regression model was used with all variables significant from individual Pearson chi-square and Wilcoxon analyses. The significance criterion α for all tests was 0.05. Results The prevalence of food insecurity at the universities ranged from 22.4% to 51.8%, with an average prevalence of 30.5% for the full sample. From the forward-selection logistic regression model, MES (OR: 1.47; 95% CI: 1.40, 1.55), CSS (OR: 1.19; 95% CI: 1.18, 1.21), and APS (OR: 0.95; 95% CI: 0.91, 0.99) scores remained significant predictors of food insecurity. Grade point average, academic year, health, race/ethnicity, financial aid, cooking frequency, and health insurance also remained significant predictors of food security status. Conclusions Food insecurity prevalence was higher than the national average. Food-insecure college students were more likely to display high money expenditures and exhibit coping behaviors, and to have poor academic performance.
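    The abstract above reports odds ratios (ORs) with 95% CIs from a forward-selection logistic regression. As a rough illustration of how such estimates are typically obtained (this is not the authors' analysis code; the input file and column names such as food_insecure and mes_score are hypothetical), a fitted logistic model's coefficients and confidence bounds can be exponentiated to give ORs:

```python
# Illustrative sketch (not the authors' code): fit a logistic regression for
# food-insecurity status and report exponentiated coefficients as odds ratios.
# The input file and column names below are hypothetical.
import numpy as np
import pandas as pd
import statsmodels.api as sm

df = pd.read_csv("survey_responses.csv")  # hypothetical survey export
X = sm.add_constant(df[["mes_score", "css_score", "aps_score"]])
model = sm.Logit(df["food_insecure"], X).fit()

# Exponentiating the coefficients and their CI bounds yields ORs with 95% CIs,
# in the same form as the MES estimate above (OR 1.47; 95% CI 1.40, 1.55).
or_table = np.exp(pd.concat([model.params, model.conf_int()], axis=1))
or_table.columns = ["OR", "2.5%", "97.5%"]
print(or_table)
```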

    Antimicrobial resistance among migrants in Europe: a systematic review and meta-analysis

    BACKGROUND: Rates of antimicrobial resistance (AMR) are rising globally and there is concern that increased migration is contributing to the burden of antibiotic resistance in Europe. However, the effect of migration on the burden of AMR in Europe has not yet been comprehensively examined. Therefore, we did a systematic review and meta-analysis to identify and synthesise data for AMR carriage or infection in migrants to Europe to examine differences in patterns of AMR across migrant groups and in different settings. METHODS: For this systematic review and meta-analysis, we searched MEDLINE, Embase, PubMed, and Scopus with no language restrictions from Jan 1, 2000, to Jan 18, 2017, for primary data from observational studies reporting antibacterial resistance in common bacterial pathogens among migrants to 21 European Union-15 and European Economic Area countries. To be eligible for inclusion, studies had to report data on carriage or infection with laboratory-confirmed antibiotic-resistant organisms in migrant populations. We extracted data from eligible studies and assessed quality using piloted, standardised forms. We did not examine drug resistance in tuberculosis and excluded articles solely reporting on this parameter. We also excluded articles in which migrant status was determined by ethnicity, country of birth of participants' parents, or was not defined, and articles in which data were not disaggregated by migrant status. Outcomes were carriage of or infection with antibiotic-resistant organisms. We used random-effects models to calculate the pooled prevalence of each outcome. The study protocol is registered with PROSPERO, number CRD42016043681. FINDINGS: We identified 2274 articles, of which 23 observational studies reporting on antibiotic resistance in 2319 migrants were included. The pooled prevalence of any AMR carriage or AMR infection in migrants was 25·4% (95% CI 19·1-31·8; I2 =98%), including meticillin-resistant Staphylococcus aureus (7·8%, 4·8-10·7; I2 =92%) and antibiotic-resistant Gram-negative bacteria (27·2%, 17·6-36·8; I2 =94%). The pooled prevalence of any AMR carriage or infection was higher in refugees and asylum seekers (33·0%, 18·3-47·6; I2 =98%) than in other migrant groups (6·6%, 1·8-11·3; I2 =92%). The pooled prevalence of antibiotic-resistant organisms was slightly higher in high-migrant community settings (33·1%, 11·1-55·1; I2 =96%) than in migrants in hospitals (24·3%, 16·1-32·6; I2 =98%). We did not find evidence of high rates of transmission of AMR from migrant to host populations. INTERPRETATION: Migrants are exposed to conditions favouring the emergence of drug resistance during transit and in host countries in Europe. Increased antibiotic resistance among refugees and asylum seekers and in high-migrant community settings (such as refugee camps and detention facilities) highlights the need for improved living conditions, access to health care, and initiatives to facilitate detection of and appropriate high-quality treatment for antibiotic-resistant infections during transit and in host countries. Protocols for the prevention and control of infection and for antibiotic surveillance need to be integrated in all aspects of health care, which should be accessible for all migrant groups, and should target determinants of AMR before, during, and after migration. 
FUNDING: UK National Institute for Health Research Imperial Biomedical Research Centre, Imperial College Healthcare Charity, the Wellcome Trust, and UK National Institute for Health Research Health Protection Research Unit in Healthcare-associated Infections and Antimicrobial Resistance at Imperial College London.
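    The pooled prevalences above come from random-effects meta-analysis. The sketch below shows one common way such pooling is done (a DerSimonian-Laird model on logit-transformed proportions); it is an illustrative assumption rather than the authors' code, and the study counts in the example are invented:

```python
# Illustrative sketch of a DerSimonian-Laird random-effects pooled prevalence
# on the logit scale, broadly the kind of model behind the pooled AMR
# estimates above. Study counts are invented for the example.
import numpy as np

def pooled_prevalence(events, totals):
    events, totals = np.asarray(events, float), np.asarray(totals, float)
    p = events / totals
    y = np.log(p / (1 - p))                      # logit-transformed prevalence
    v = 1.0 / events + 1.0 / (totals - events)   # approximate within-study variance
    w = 1.0 / v
    y_fixed = np.sum(w * y) / np.sum(w)
    q = np.sum(w * (y - y_fixed) ** 2)           # Cochran's Q
    k = len(y)
    tau2 = max(0.0, (q - (k - 1)) / (np.sum(w) - np.sum(w**2) / np.sum(w)))
    w_star = 1.0 / (v + tau2)                    # random-effects weights
    y_re = np.sum(w_star * y) / np.sum(w_star)
    se = np.sqrt(1.0 / np.sum(w_star))
    expit = lambda x: 1.0 / (1.0 + np.exp(-x))
    i2 = 100.0 * max(0.0, (q - (k - 1)) / q) if q > 0 else 0.0
    return expit(y_re), expit(y_re - 1.96 * se), expit(y_re + 1.96 * se), i2

est, lo, hi, i2 = pooled_prevalence([30, 12, 55], [120, 80, 150])  # hypothetical studies
print(f"pooled prevalence {est:.1%} (95% CI {lo:.1%} to {hi:.1%}), I2 = {i2:.0f}%")
```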

    Safety, immunogenicity, and reactogenicity of BNT162b2 and mRNA-1273 COVID-19 vaccines given as fourth-dose boosters following two doses of ChAdOx1 nCoV-19 or BNT162b2 and a third dose of BNT162b2 (COV-BOOST): a multicentre, blinded, phase 2, randomised trial


    The development and validation of a scoring tool to predict the operative duration of elective laparoscopic cholecystectomy

    Background: The ability to accurately predict operative duration has the potential to optimise theatre efficiency and utilisation, thus reducing costs and increasing staff and patient satisfaction. With laparoscopic cholecystectomy being one of the most commonly performed procedures worldwide, a tool to predict operative duration could be extremely beneficial to healthcare organisations. Methods: Data collected from the CholeS study on patients undergoing cholecystectomy in UK and Irish hospitals between 04/2014 and 05/2014 were used to study operative duration. A multivariable binary logistic regression model was produced in order to identify significant independent predictors of long (> 90 min) operations. The resulting model was converted to a risk score, which was subsequently validated on a second cohort of patients using ROC curves. Results: After exclusions, data were available for 7227 patients in the derivation (CholeS) cohort. The median operative duration was 60 min (interquartile range 45–85), with 17.7% of operations lasting longer than 90 min. Ten factors were found to be significant independent predictors of operative durations > 90 min, including ASA, age, previous surgical admissions, BMI, gallbladder wall thickness and CBD diameter. A risk score was then produced from these factors, and applied to a cohort of 2405 patients from a tertiary centre for external validation. This returned an area under the ROC curve of 0.708 (SE = 0.013, p < 0.001), with the proportion of operations lasting > 90 min increasing more than eightfold, from 5.1% to 41.8%, between the extremes of the score. Conclusion: The scoring tool produced in this study was found to be significantly predictive of long operative durations on validation in an external cohort. As such, the tool may have the potential to enable organisations to better organise theatre lists and deliver greater efficiencies in care.
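    External validation of the risk score is summarised by the area under the ROC curve. A minimal sketch of that step, assuming a hypothetical validation file with a risk_score column and a binary long_operation outcome (not the CholeS data), might look like this:

```python
# Illustrative sketch (not the CholeS analysis code): externally validate a
# risk score against the binary outcome "operation lasted > 90 min".
# The file and column names are hypothetical.
import numpy as np
import pandas as pd
from sklearn.metrics import roc_auc_score, roc_curve

val = pd.read_csv("validation_cohort.csv")       # hypothetical external cohort
auc = roc_auc_score(val["long_operation"], val["risk_score"])
print(f"AUC = {auc:.3f}")                        # the paper reports 0.708

# A simple cut-off can be chosen by maximising Youden's J (sensitivity + specificity - 1).
fpr, tpr, thresholds = roc_curve(val["long_operation"], val["risk_score"])
best_cut = thresholds[np.argmax(tpr - fpr)]
print(f"candidate score cut-off: {best_cut}")
```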

    Surgical site infection after gastrointestinal surgery in high-income, middle-income, and low-income countries: a prospective, international, multicentre cohort study

    Background: Surgical site infection (SSI) is one of the most common infections associated with health care, but its importance as a global health priority is not fully understood. We quantified the burden of SSI after gastrointestinal surgery in countries in all parts of the world. Methods: This international, prospective, multicentre cohort study included consecutive patients undergoing elective or emergency gastrointestinal resection within 2-week time periods at any health-care facility in any country. Countries with participating centres were stratified into high-income, middle-income, and low-income groups according to the UN's Human Development Index (HDI). Data variables from the GlobalSurg 1 study and other studies that have been found to affect the likelihood of SSI were entered into risk adjustment models. The primary outcome measure was the 30-day SSI incidence (defined by US Centers for Disease Control and Prevention criteria for superficial and deep incisional SSI). Relationships with explanatory variables were examined using Bayesian multilevel logistic regression models. This trial is registered with ClinicalTrials.gov, number NCT02662231. Findings: Between Jan 4, 2016, and July 31, 2016, 13 265 records were submitted for analysis. 12 539 patients from 343 hospitals in 66 countries were included. 7339 (58·5%) patients were from high-HDI countries (193 hospitals in 30 countries), 3918 (31·2%) patients were from middle-HDI countries (82 hospitals in 18 countries), and 1282 (10·2%) patients were from low-HDI countries (68 hospitals in 18 countries). In total, 1538 (12·3%) patients had SSI within 30 days of surgery. The incidence of SSI varied between countries with high (691 [9·4%] of 7339 patients), middle (549 [14·0%] of 3918 patients), and low (298 [23·2%] of 1282 patients) HDI (p < 0·001). The highest SSI incidence in each HDI group was after dirty surgery (102 [17·8%] of 574 patients in high-HDI countries; 74 [31·4%] of 236 patients in middle-HDI countries; 72 [39·8%] of 181 patients in low-HDI countries). Following risk factor adjustment, patients in low-HDI countries were at greatest risk of SSI (adjusted odds ratio 1·60, 95% credible interval 1·05–2·37; p=0·030). 132 (21·6%) of 610 patients with an SSI and a microbiology culture result had an infection that was resistant to the prophylactic antibiotic used. Resistant infections were detected in 49 (16·6%) of 295 patients in high-HDI countries, in 37 (19·8%) of 187 patients in middle-HDI countries, and in 46 (35·9%) of 128 patients in low-HDI countries (p < 0·001). Interpretation: Countries with a low HDI carry a disproportionately greater burden of SSI than countries with a middle or high HDI and might have higher rates of antibiotic resistance. In view of WHO recommendations on SSI prevention that highlight the absence of high-quality interventional research, urgent, pragmatic, randomised trials based in LMICs are needed to assess measures aiming to reduce this preventable complication.
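    The analysis above uses Bayesian multilevel logistic regression with patients nested in hospitals. The sketch below shows a generic random-intercept version of such a model in PyMC, fitted to simulated data; it is an assumption-laden illustration, not the GlobalSurg model specification or variable set:

```python
# Illustrative sketch of a Bayesian multilevel (random-intercept) logistic
# regression for 30-day SSI with patients nested in hospitals, in the spirit
# of the models described above. The data are simulated and the covariates
# are placeholders, not the GlobalSurg variables.
import numpy as np
import pymc as pm

rng = np.random.default_rng(0)
n_hospitals, n_patients, n_covariates = 20, 500, 3
hospital_idx = rng.integers(0, n_hospitals, n_patients)  # hospital of each patient
X = rng.normal(size=(n_patients, n_covariates))          # e.g. HDI group, wound class, age
ssi = rng.integers(0, 2, n_patients)                     # observed 30-day SSI (0/1)

with pm.Model() as model:
    mu_hosp = pm.Normal("mu_hosp", 0.0, 1.0)
    sigma_hosp = pm.HalfNormal("sigma_hosp", 1.0)
    hosp_intercept = pm.Normal("hosp_intercept", mu_hosp, sigma_hosp, shape=n_hospitals)
    beta = pm.Normal("beta", 0.0, 1.0, shape=n_covariates)
    logit_p = hosp_intercept[hospital_idx] + pm.math.dot(X, beta)
    pm.Bernoulli("obs", logit_p=logit_p, observed=ssi)
    idata = pm.sample(1000, tune=1000, chains=2)

# Exponentiated covariate coefficients approximate adjusted odds ratios,
# analogous to the adjusted OR of 1.60 reported for low-HDI countries.
print(np.exp(idata.posterior["beta"].mean(dim=("chain", "draw"))))
```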

    State of the climate in 2018

    In 2018, the dominant greenhouse gases released into Earth’s atmosphere—carbon dioxide, methane, and nitrous oxide—continued their increase. The annual global average carbon dioxide concentration at Earth’s surface was 407.4 ± 0.1 ppm, the highest in the modern instrumental record and in ice core records dating back 800 000 years. Combined, greenhouse gases and several halogenated gases contribute just over 3 W m−2 to radiative forcing and represent a nearly 43% increase since 1990. Carbon dioxide is responsible for about 65% of this radiative forcing. With a weak La Niña in early 2018 transitioning to a weak El Niño by the year’s end, the global surface (land and ocean) temperature was the fourth highest on record, with only 2015 through 2017 being warmer. Several European countries reported record high annual temperatures. There were also more high, and fewer low, temperature extremes than in nearly all of the 68-year extremes record. Madagascar recorded a record daily temperature of 40.5°C in Morondava in March, while South Korea set its record high of 41.0°C in August in Hongcheon. Nawabshah, Pakistan, recorded its highest temperature of 50.2°C, which may be a new daily world record for April. Globally, the annual lower troposphere temperature was third to seventh highest, depending on the dataset analyzed. The lower stratospheric temperature was approximately fifth lowest. The 2018 Arctic land surface temperature was 1.2°C above the 1981–2010 average, tying for third highest in the 118-year record, following 2016 and 2017. June’s Arctic snow cover extent was almost half of what it was 35 years ago. Across Greenland, however, regional summer temperatures were generally below or near average. Additionally, a satellite survey of 47 glaciers in Greenland indicated a net increase in area for the first time since records began in 1999. Increasing permafrost temperatures were reported at most observation sites in the Arctic, with the overall increase of 0.1°–0.2°C between 2017 and 2018 being comparable to the highest rate of warming ever observed in the region. On 17 March, Arctic sea ice extent marked the second smallest annual maximum in the 38-year record, larger than only 2017. The minimum extent in 2018 was reached on 19 September and again on 23 September, tying 2008 and 2010 for the sixth lowest extent on record. The 23 September date tied 1997 as the latest sea ice minimum date on record. First-year ice now dominates the ice cover, comprising 77% of the March 2018 ice pack compared to 55% during the 1980s. Because thinner, younger ice is more vulnerable to melting out in summer, this shift in sea ice age has contributed to the decreasing trend in minimum ice extent. Regionally, Bering Sea ice extent was at record lows for almost the entire 2017/18 ice season. For the Antarctic continent as a whole, 2018 was warmer than average. On the highest points of the Antarctic Plateau, the automatic weather station Relay (74°S) broke or tied six monthly temperature records throughout the year, with August breaking its record by nearly 8°C. However, cool conditions in the western Bellingshausen Sea and Amundsen Sea sector contributed to a low melt season overall for 2017/18. High SSTs contributed to low summer sea ice extent in the Ross and Weddell Seas in 2018, underpinning the second lowest Antarctic summer minimum sea ice extent on record. 
Despite conducive conditions for its formation, the ozone hole at its maximum extent in September was near the 2000–18 mean, likely due to an ongoing slow decline in stratospheric chlorine monoxide concentration. Across the oceans, globally averaged SST decreased slightly since the record El Niño year of 2016 but was still far above the climatological mean. On average, SST has been increasing at a rate of 0.10° ± 0.01°C per decade since 1950. The warming appeared largest in the tropical Indian Ocean and smallest in the North Pacific. The deeper ocean continues to warm year after year. For the seventh consecutive year, global annual mean sea level became the highest in the 26-year record, rising to 81 mm above the 1993 average. As anticipated in a warming climate, the hydrological cycle over the ocean is accelerating: dry regions are becoming drier and wet regions rainier. Closer to the equator, 95 named tropical storms were observed during 2018, well above the 1981–2010 average of 82. Eleven tropical cyclones reached Saffir–Simpson scale Category 5 intensity. North Atlantic Major Hurricane Michael’s landfall intensity of 140 kt was the fourth strongest for any continental U.S. hurricane landfall in the 168-year record. Michael caused more than 30 fatalities and $25 billion (U.S. dollars) in damages. In the western North Pacific, Super Typhoon Mangkhut led to 160 fatalities and $6 billion (U.S. dollars) in damages across the Philippines, Hong Kong, Macau, mainland China, Guam, and the Northern Mariana Islands. Tropical Storm Son-Tinh was responsible for 170 fatalities in Vietnam and Laos. Nearly all the islands of Micronesia experienced at least moderate impacts from various tropical cyclones. Across land, many areas around the globe received copious precipitation, notable at different time scales. Rodrigues and Réunion Island near southern Africa each reported their third wettest year on record. In Hawaii, 1262 mm precipitation at Waipā Gardens (Kauai) on 14–15 April set a new U.S. record for 24-h precipitation. In Brazil, the city of Belo Horizonte received nearly 75 mm of rain in just 20 minutes, nearly half its monthly average. Globally, fire activity during 2018 was the lowest since the start of the record in 1997, with a combined burned area of about 500 million hectares. This reinforced the long-term downward trend in fire emissions driven by changes in land use in frequently burning savannas. However, wildfires burned 3.5 million hectares across the United States, well above the 2000–10 average of 2.7 million hectares. Combined, U.S. wildfire damages for the 2017 and 2018 wildfire seasons exceeded $40 billion (U.S. dollars).

    QF2011: a protocol to study the effects of the Queensland flood on pregnant women, their pregnancies, and their children's early development


    Percutaneous revascularization for ischemic left ventricular dysfunction: Cost-effectiveness analysis of the REVIVED-BCIS2 trial

    BACKGROUND: Percutaneous coronary intervention (PCI) is frequently undertaken in patients with ischemic left ventricular systolic dysfunction. The REVIVED (Revascularization for Ischemic Ventricular Dysfunction)-BCIS2 (British Cardiovascular Intervention Society-2) trial concluded that PCI did not reduce the incidence of all-cause death or heart failure hospitalization; however, patients assigned to PCI reported better initial health-related quality of life than those assigned to optimal medical therapy (OMT) alone. The aim of this study was to assess the cost-effectiveness of PCI+OMT compared with OMT alone. METHODS: REVIVED-BCIS2 was a prospective, multicenter UK trial, which randomized patients with severe ischemic left ventricular systolic dysfunction to either PCI+OMT or OMT alone. Health care resource use (including planned and unplanned revascularizations, medication, device implantation, and heart failure hospitalizations) and health outcomes data (EuroQol 5-dimension 5-level questionnaire) on each patient were collected at baseline and up to 8 years post-randomization. Resource use was costed using publicly available national unit costs. Within the trial, mean total costs and quality-adjusted life-years (QALYs) were estimated from the perspective of the UK health system. Cost-effectiveness was evaluated using estimated mean costs and QALYs in both groups. Regression analysis was used to adjust for clinically relevant predictors. RESULTS: Between 2013 and 2020, 700 patients were recruited (mean age: PCI+OMT=70 years, OMT=68 years; male (%): PCI+OMT=87, OMT=88); median follow-up was 3.4 years. Over the full follow-up period, PCI yielded similar health benefits at higher costs compared with OMT alone (PCI+OMT: 4.14 QALYs, £22 352; OMT alone: 4.16 QALYs, £15 569; difference: −0.015, £6782). For both groups, most health resource consumption occurred in the first 2 years post-randomization. Probabilistic results showed that the probability of PCI being cost-effective was 0. CONCLUSIONS: A minimal difference in total QALYs was identified between arms, and PCI+OMT was not cost-effective compared with OMT, given its additional cost. A strategy of routine PCI to treat ischemic left ventricular systolic dysfunction does not seem to be a justifiable use of health care resources in the United Kingdom.
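    The core within-trial cost-effectiveness arithmetic described above can be illustrated directly from the reported means. The sketch below uses the abstract's rounded costs and QALYs and assumes a conventional UK willingness-to-pay threshold of £20 000 per QALY, which is not a figure stated by the trial:

```python
# Illustrative arithmetic only: incremental cost, incremental QALYs, and
# incremental net monetary benefit (INMB) from the mean values reported in
# the abstract. The £20,000/QALY threshold is a conventional UK assumption.
cost_pci, qaly_pci = 22_352.0, 4.14   # PCI + OMT arm
cost_omt, qaly_omt = 15_569.0, 4.16   # OMT-alone arm

d_cost = cost_pci - cost_omt          # ~ £6,783 (abstract: £6,782 from unrounded data)
d_qaly = qaly_pci - qaly_omt          # ~ -0.02 QALYs (abstract: -0.015)

# PCI is more costly and no more effective, so it is dominated: the INMB is
# negative at any plausible willingness-to-pay threshold.
threshold = 20_000.0                  # £ per QALY
inmb = threshold * d_qaly - d_cost
print(f"incremental cost £{d_cost:,.0f}, incremental QALYs {d_qaly:+.3f}, "
      f"INMB £{inmb:,.0f} at £{threshold:,.0f}/QALY")
```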