
    Corporate governance and financial constraints on strategic turnarounds

    The paper extends the Robbins and Pearce (1992) two-stage turnaround response model to include governance factors. In addition to the retrenchment and recovery stages, the paper proposes a realignment stage, referring specifically to the re-alignment of expectations of principal and agent groups. The realignment stage imposes a threshold that must be crossed before the retrenchment, and hence recovery, stage can be entered. Crossing this threshold is problematic to the extent that the interests of governance-stakeholder groups diverge in a crisis situation. The severity of the crisis affects the bases of strategy-contingent asset valuation, leading to the fragmentation of stakeholder interests. In some cases the consequence may be that management is prevented from carrying out a turnaround by governance constraints. The paper uses a case study to illustrate these dynamics and, like the Robbins and Pearce study, focuses on the textile industry. A longitudinal approach is used to show the impact of the removal of governance constraints. The empirical evidence suggests that such financial constraints become less serious to the extent that there is a functioning market for corporate control. Building on the governance and turnaround literatures, the paper also outlines the general-case necessary and sufficient conditions for successful turnarounds.

    A SARS-CoV-2 outbreak in a plastics manufacturing plant.

    BACKGROUND: A SARS-CoV-2 outbreak with an attack rate of 14.3% was reported at a plastics manufacturing plant in England. METHODS: Between 23rd March and 13th May 2021, the COVID-OUT team undertook a comprehensive outbreak investigation, including environmental assessment, surface sampling, molecular and serological testing, and detailed questionnaires, to identify potential SARS-CoV-2 transmission routes and workplace- and worker-related risk factors. RESULTS: While ventilation, indicated using real-time CO2 proxy measures, was generally adequate on-site, the technical office with the highest localized attack rate (21.4%) frequently reached CO2 peaks of 2,100 ppm. SARS-CoV-2 RNA was found at low levels (Ct ≥35) in surface samples collected across the site. High noise levels (79 dB) were recorded in the main production area, and study participants reported having close work contacts (73.1%) and sharing tools (75.5%). Only 20.0% of participants reported using a surgical mask and/or FFP2/FFP3 respirator at least half the time, and 71.0% expressed concerns regarding potential pay decreases and/or unemployment due to self-isolation or workplace closure. CONCLUSIONS: The findings reinforce the importance of enhanced infection control measures in manufacturing sectors, including improved ventilation with possible consideration of CO2 monitoring, utilising air cleaning interventions in enclosed environments, and provision of good-quality face masks (i.e., surgical masks or FFP2/FFP3 respirators), especially when social distancing cannot be maintained. Further research on the impacts of job security-related concerns is warranted.
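
    A minimal sketch of the standard attack-rate calculation used in outbreak investigations such as this one. The case and denominator counts below are hypothetical placeholders, since the abstract reports only the resulting rates (14.3% site-wide, 21.4% in the technical office).

```python
def attack_rate(cases: int, persons_at_risk: int) -> float:
    """Attack rate: cases divided by persons at risk, expressed as a percentage."""
    return 100.0 * cases / persons_at_risk

if __name__ == "__main__":
    # Hypothetical example: 3 cases among 14 staff in one work area.
    print(f"{attack_rate(3, 14):.1f}%")  # -> 21.4%
```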

    Sustainable food security in India—Domestic production and macronutrient availability

    India has been perceived as a development enigma: recent rates of economic growth have not been matched by similar rates of health and nutritional improvement. To meet the second Sustainable Development Goal (SDG2) of achieving zero hunger by 2030, India faces a substantial challenge in meeting basic nutritional needs in addition to addressing population, environmental and dietary pressures. Here we have mapped, for the first time, the Indian food system from crop production to household-level availability across three key macronutrient categories: ‘calories’, ‘digestible protein’ and ‘fat’. To better understand the potential of reduced food chain losses and improved crop yields to close future food deficits, scenario analysis was conducted to 2030 and 2050. Under India’s current self-sufficiency model, our analysis indicates severe shortfalls in availability of all macronutrients across a large proportion (>60%) of the Indian population. The extent of projected shortfalls continues to grow such that, even in ambitious waste reduction and yield scenarios, enhanced domestic production alone will be inadequate in closing the nutrition supply gap. We suggest that to meet SDG2 India will need to take a combined approach of optimising domestic production and increasing its participation in global trade.
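
    An illustrative sketch, not the authors' model, of the kind of supply-gap arithmetic that underlies this sort of scenario analysis: per-capita macronutrient availability after food-chain losses is compared with a nominal requirement. All figures below are hypothetical placeholders.

```python
def per_capita_availability(production_kcal: float, loss_fraction: float,
                            population: float) -> float:
    """Daily per-capita calorie availability after food-chain losses."""
    return production_kcal * (1.0 - loss_fraction) / population

if __name__ == "__main__":
    requirement = 2200.0                  # kcal/person/day, nominal requirement (assumption)
    supply = per_capita_availability(
        production_kcal=3.0e12,           # total daily kcal produced (hypothetical)
        loss_fraction=0.25,               # food-chain loss fraction (hypothetical)
        population=1.4e9,
    )
    print(f"availability: {supply:.0f} kcal/day, gap: {requirement - supply:.0f} kcal/day")
```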

    Genome-Wide Association Study Identifies Two Novel Regions at 11p15.5-p13 and 1p31 with Major Impact on Acute-Phase Serum Amyloid A

    Elevated levels of acute-phase serum amyloid A (A-SAA) cause amyloidosis and are a risk factor for atherosclerosis and its clinical complications, type 2 diabetes, as well as various malignancies. To investigate the genetic basis of A-SAA levels, we conducted the first genome-wide association study on baseline A-SAA concentrations in three population-based studies (KORA, TwinsUK, Sorbs) and one prospective case cohort study (LURIC), including a total of 4,212 participants of European descent, and identified two novel genetic susceptibility regions at 11p15.5-p13 and 1p31. The region at 11p15.5-p13 (rs4150642; p = 3.20×10⁻¹¹¹) contains serum amyloid A1 (SAA1) and the adjacent general transcription factor 2 H1 (GTF2H1), Hermansky-Pudlak Syndrome 5 (HPS5), lactate dehydrogenase A (LDHA), and lactate dehydrogenase C (LDHC). This region explains 10.84% of the total variation of A-SAA levels in our data, which makes up 18.37% of the total estimated heritability. The second region encloses the leptin receptor (LEPR) gene at 1p31 (rs12753193; p = 1.22×10⁻¹¹) and has been found to be associated with CRP and fibrinogen in previous studies. Our findings demonstrate a key role of the 11p15.5-p13 region in the regulation of baseline A-SAA levels and provide confirmative evidence of the importance of the 1p31 region for inflammatory processes and the close interplay between A-SAA, leptin, and other acute-phase proteins.
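
    A back-of-the-envelope check implied by the abstract's own figures: if the 11p15.5-p13 region explains 10.84% of total A-SAA variance and that share equals 18.37% of the estimated heritability, the implied heritability is their ratio. This is a reading of the reported numbers, not an analysis of the underlying data.

```python
variance_explained = 0.1084      # fraction of total A-SAA variance explained by 11p15.5-p13
share_of_heritability = 0.1837   # that variance as a fraction of the estimated heritability

implied_heritability = variance_explained / share_of_heritability
print(f"implied heritability of A-SAA levels ~ {implied_heritability:.2f}")  # ~ 0.59
```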

    The Effect of Carbon Credits on Savanna Land Management and Priorities for Biodiversity Conservation

    Carbon finance offers the potential to change land management and conservation planning priorities. We develop a novel approach to planning for improved land management to conserve biodiversity while utilizing potential revenue from carbon biosequestration. We apply our approach in northern Australia's tropical savanna, a region of global significance for biodiversity and carbon storage, both of which are threatened by current fire and grazing regimes. Our approach aims to identify priority locations for protecting species and vegetation communities by retaining existing vegetation and managing fire and grazing regimes at a minimum cost. We explore the impact of accounting for potential carbon revenue (using a carbon price of US$14 per tonne of carbon dioxide equivalent) on priority areas for conservation and the impact of explicitly protecting carbon stocks in addition to biodiversity. Our results show that improved management can potentially raise approximately US$5 per hectare per year in carbon revenue and prevent the release of 1–2 billion tonnes of carbon dioxide equivalent over approximately 90 years. This revenue could be used to reduce the costs of improved land management by three quarters, or to double the number of biodiversity targets achieved and meet carbon storage targets for the same cost. These results are based on generalised cost and carbon data; more comprehensive applications will rely on fine-scale, site-specific data and a supportive policy environment. Our research illustrates that the dual objective of conserving biodiversity and reducing the release of greenhouse gases offers important opportunities for cost-effective land management investments.
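
    A rough arithmetic sketch linking the figures quoted in the abstract: at a carbon price of US$14 per tonne CO2-e, revenue of about US$5 per hectare per year implies roughly 0.36 t CO2-e abated per hectare per year. This is purely illustrative; the study's own estimates rest on generalised cost and carbon data.

```python
carbon_price = 14.0    # US$ per tonne CO2-e (price assumed in the study)
revenue_per_ha = 5.0   # US$ per hectare per year (approximate figure from the abstract)

abatement_per_ha = revenue_per_ha / carbon_price
print(f"implied abatement ~ {abatement_per_ha:.2f} t CO2-e per hectare per year")  # ~ 0.36
```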

    Circulating microRNAs in sera correlate with soluble biomarkers of immune activation but do not predict mortality in ART treated individuals with HIV-1 infection: A case control study

    Introduction: The use of anti-retroviral therapy (ART) has dramatically reduced HIV-1 associated morbidity and mortality. However, HIV-1 infected individuals have increased rates of morbidity and mortality compared to the non-HIV-1 infected population, and this appears to be related to end-organ diseases collectively referred to as Serious Non-AIDS Events (SNAEs). Circulating miRNAs are reported as promising biomarkers for a number of human disease conditions, including those that constitute SNAEs. Our study sought to investigate the potential of selected miRNAs in predicting mortality in HIV-1 infected ART treated individuals. Materials and Methods: A set of miRNAs was chosen based on published associations with human disease conditions that constitute SNAEs. This case-control study compared 126 cases (individuals who died whilst on therapy) and 247 matched controls (individuals who remained alive). Cases and controls were ART treated participants of two pivotal HIV-1 trials. The relative abundance of each miRNA in serum was measured by RT-qPCR. Associations with mortality (all-cause, cardiovascular and malignancy) were assessed by logistic regression analysis. Correlations between miRNAs and CD4+ T cell count, hs-CRP, IL-6 and D-dimer were also assessed. Results: None of the selected miRNAs was associated with all-cause, cardiovascular or malignancy mortality. The levels of three miRNAs (miR-21, miR-122 and miR-200a) correlated with IL-6, while miR-21 also correlated with D-dimer. Additionally, the abundance of miR-31, miR-150 and miR-223 correlated with baseline CD4+ T cell count, while the same three miRNAs plus miR-145 correlated with nadir CD4+ T cell count. Discussion: No associations with mortality were found with any circulating miRNA studied. These results cast doubt on the effectiveness of circulating miRNAs as early predictors of mortality or of the major underlying diseases that contribute to mortality in participants treated for HIV-1 infection.
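
    A minimal sketch of the two analyses described (logistic regression of case status on miRNA abundance, and correlation of a miRNA with a soluble biomarker such as IL-6). The input file and column names are hypothetical placeholders, not the study's dataset.

```python
import pandas as pd
import statsmodels.formula.api as smf
from scipy.stats import spearmanr

# Hypothetical input file and column names; the study's actual variables differ.
df = pd.read_csv("mirna_case_control.csv")

# Logistic regression: case status (died on therapy vs. alive) on miRNA abundance.
model = smf.logit("case ~ mir21 + age + baseline_cd4", data=df).fit()
print(model.summary())

# Correlation between miRNA abundance and a soluble biomarker (IL-6).
rho, p = spearmanr(df["mir21"], df["il6"])
print(f"Spearman rho = {rho:.2f}, p = {p:.3g}")
```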

    Risk factors associated with short-term complications in mandibular fractures: the MANTRA study—a Maxillofacial Trainee Research Collaborative (MTReC)

    Introduction Complications following mandibular fractures occur in 9–23% of patients. Identifying those at risk is key to prevention. Previous studies highlighted smoking, age and time from injury to presentation as risk factors but rarely recorded other possible confounders. In this paper, we use a collaborative snapshot audit to document novel risk factors and confirm established risks for complications following the treatment of mandibular fractures. Methods The audit was carried out by 122 OMFS trainees across the UK and Ireland (49 centres) over 6 months, coordinated by the Maxillofacial Surgery Trainees Research Collaborative. Variables recorded included basic demography, medical and social history, injury mechanism and type, management and 30-day outcome. Results Nine hundred and forty-seven (947) patients with fractured mandibles were recorded. Surgical management was carried out in 76.3%. Complications at 30 days occurred in 65 (9%) of those who were managed surgically. Risk factors for complications included male sex, increasing age, any medical history, increasing number of cigarettes smoked per week, increasing alcohol use per week, worse oral hygiene and increased time from injury to presentation. Discussion We have used a large prospective snapshot audit to confirm established risk factors and identify novel risk factors. We demonstrate that time from injury to presentation is confounded by other indicators of poor health behaviour. These results are important in designing trial protocols for management of mandibular fractures and in targeting health interventions to patients at highest risk of complications.
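
    A quick consistency check of the figures in the abstract: 76.3% of 947 patients were managed surgically, and 65 complications in that group corresponds to roughly 9% at 30 days.

```python
total_patients = 947       # patients recorded in the audit
surgical_fraction = 0.763  # proportion managed surgically
complications = 65         # 30-day complications in the surgical group

surgical_patients = round(total_patients * surgical_fraction)  # ~ 723
rate = 100.0 * complications / surgical_patients
print(f"{surgical_patients} surgical patients, 30-day complication rate ~ {rate:.0f}%")
```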

    Development and Validation of a Risk Score for Chronic Kidney Disease in HIV Infection Using Prospective Cohort Data from the D:A:D Study

    Ristola M. is a member of the following working groups: DAD Study Grp; Royal Free Hosp Clin Cohort; INSIGHT Study Grp; SMART Study Grp; ESPRIT Study Grp. Background Chronic kidney disease (CKD) is a major health issue for HIV-positive individuals, associated with increased morbidity and mortality. Development and implementation of a risk score model for CKD would allow comparison of the risks and benefits of adding potentially nephrotoxic antiretrovirals to a treatment regimen and would identify those at greatest risk of CKD. The aims of this study were to develop a simple, externally validated, and widely applicable long-term risk score model for CKD in HIV-positive individuals that can guide decision making in clinical practice. Methods and Findings A total of 17,954 HIV-positive individuals from the Data Collection on Adverse Events of Anti-HIV Drugs (D:A:D) study with >= 3 estimated glomerular filtration rate (eGFR) values after 1 January 2004 were included. Baseline was defined as the first eGFR > 60 ml/min/1.73 m2 after 1 January 2004; individuals with exposure to tenofovir, atazanavir, atazanavir/ritonavir, lopinavir/ritonavir, or other boosted protease inhibitors before baseline were excluded. CKD was defined as a confirmed (>3 mo apart) fall in eGFR below the 60 ml/min/1.73 m2 baseline threshold. In the D:A:D study, 641 individuals developed CKD during 103,185 person-years of follow-up (PYFU; incidence 6.2/1,000 PYFU, 95% CI 5.7-6.7; median follow-up 6.1 y, range 0.3-9.1 y). Older age, intravenous drug use, hepatitis C coinfection, lower baseline eGFR, female gender, lower CD4 count nadir, hypertension, diabetes, and cardiovascular disease (CVD) predicted CKD. The adjusted incidence rate ratios of these nine categorical variables were scaled and summed to create the risk score. The median risk score at baseline was -2 (interquartile range -4 to 2). There was a 1:393 chance of developing CKD in the next 5 y in the low risk group, with progressively higher risks in the medium and high risk groups (risk score >= 5, 505 events). Number needed to harm (NNTH) at 5 y when starting unboosted atazanavir or lopinavir/ritonavir among those with a low risk score was 1,702 (95% CI 1,166-3,367); NNTH was 202 (95% CI 159-278) and 21 (95% CI 19-23), respectively, for those with a medium and high risk score. NNTH was 739 (95% CI 506-1,462), 88 (95% CI 69-121), and 9 (95% CI 8-10) for those with a low, medium, and high risk score, respectively, starting tenofovir, atazanavir/ritonavir, or another boosted protease inhibitor. The Royal Free Hospital Clinic Cohort included 2,548 individuals, of whom 94 developed CKD (3.7%) during 18,376 PYFU (median follow-up 7.4 y, range 0.3-12.7 y). Of 2,013 individuals included from the SMART/ESPRIT control arms, 32 developed CKD (1.6%) during 8,452 PYFU (median follow-up 4.1 y, range 0.6-8.1 y). External validation showed that the risk score predicted well in these cohorts. Limitations of this study included limited data on race and no information on proteinuria. Conclusions Both traditional and HIV-related risk factors were predictive of CKD. These factors were used to develop an externally validated risk score for CKD in HIV infection that has direct clinical relevance for patients and clinicians to weigh the benefits of certain antiretrovirals against the risk of CKD and to identify those at greatest risk of CKD.
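
    A hedged sketch of how a number-needed-to-harm (NNTH) figure of this kind is derived: NNTH is the reciprocal of the absolute risk increase, here taken as the baseline 5-year CKD risk multiplied by (relative risk - 1). The relative risk below is a hypothetical placeholder, not a value from the study.

```python
def nnth(baseline_risk: float, relative_risk: float) -> float:
    """Number needed to harm: 1 / absolute risk increase."""
    absolute_increase = baseline_risk * (relative_risk - 1.0)
    return 1.0 / absolute_increase

if __name__ == "__main__":
    low_risk = 1.0 / 393   # 5-year CKD risk in the low risk group (from the abstract)
    # A relative risk of 1.5 is a hypothetical placeholder, not a study estimate.
    print(f"NNTH ~ {nnth(low_risk, relative_risk=1.5):.0f}")
```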