
    An analysis of the role of state, economy and civil society in the development, management and reform of the NHS 1948 - 1997

    The NHS is the centrepiece of the UK welfare state. For fifty years it has provided the majority of health care in the UK. However, the running of the service has not been smooth. Repeated reforms have been undertaken since 1948 in attempts to increase the efficiency and effectiveness of the service, and these reforms have met with varying degrees of success. Even the most radical reforms, initiated in 1990, have been marked by some failures, particularly with respect to the provision of services to 'at risk' groups such as the elderly, leading to criticisms of a lack of coherent policy making. The reasons that underlie the success of the NHS in the midst of failure are complicated, but one hypothesis is that the structure of the NHS does not reflect its basic functions. Those functions can be broken down into two: first, the relief of suffering from illness; second, the support of the industrial and economic base of the UK. The existence of this pluralistic purpose implies that managing the service requires balancing the demands that the economy, the state and civil society place on the NHS. This management is embodied in a complicated institutionalisation of care, covered in chapter 1. The empirical evidence gathered in the thesis, in chapters 2 and 3, from both the literature and case studies, indicates that at least part of the problems seen in the NHS result from a failure to balance this institutionalisation. The situation is made more complex because this imbalance creates further demands from some of the elements involved in the management of the service. The failure to balance the interactions that surround the NHS therefore increases the pressures on it, which in turn increases the imbalance, creating a feedback loop that magnifies the problem. The source and consequences of this feedback are best exemplified by a case study of the most recent reforms, covered in chapters 4-11 of this thesis. This case study demonstrates that the way in which the 1990 reforms were formulated and implemented took little notice of the impact of the changes on street-level NHS managers, with the result that the reforms did not represent a coherent policy. The consequence of this lack of coherence is that the changes have not generated efficiency gains and in some cases have diverted resources away from those most in need. The underlying cause is the predominance of non-market forces in the decision-making process, i.e. the values of the purchasers and the power of the providers to influence decision making. The linkages between these features of the post-reform NHS are described in chapter 12. It is likely that the only way in which the circle of problems in the NHS can be addressed is by re-establishing the corporate relationship that surrounds health care. However, unlike previous relationships, the evidence suggests that this relationship should be established at the policy level, rather than following the current trend towards a local-level relationship. The NHS is not unique in this respect, as this is the pattern of change seen in many European countries.

    Changes in health in England, with analysis by English regions and areas of deprivation, 1990-2013: a systematic analysis for the Global Burden of Disease Study 2013.

    BACKGROUND: In the Global Burden of Disease Study 2013 (GBD 2013), knowledge about health and its determinants has been integrated into a comparable framework to inform health policy. Outputs of this analysis are relevant to current policy questions in England and elsewhere, particularly on health inequalities. We use GBD 2013 data on mortality and causes of death, and disease and injury incidence and prevalence to analyse the burden of disease and injury in England as a whole, in English regions, and within each English region by deprivation quintile. We also assess disease and injury burden in England attributable to potentially preventable risk factors. England and the English regions are compared with the remaining constituent countries of the UK and with comparable countries in the European Union (EU) and beyond. METHODS: We extracted data from the GBD 2013 to compare mortality, causes of death, years of life lost (YLLs), years lived with a disability (YLDs), and disability-adjusted life-years (DALYs) in England, the UK, and 18 other countries (the first 15 EU members [apart from the UK] and Australia, Canada, Norway, and the USA [EU15+]). We extended elements of the analysis to English regions, and subregional areas defined by deprivation quintile (deprivation areas). We used data split by the nine English regions (corresponding to the European boundaries of the Nomenclature for Territorial Statistics level 1 [NUTS 1] regions), and by quintile groups within each English region according to deprivation, thereby making 45 regional deprivation areas. Deprivation quintiles were defined by area of residence ranked at national level by Index of Multiple Deprivation score, 2010. Burden due to various risk factors is described for England using new GBD methodology to estimate independent and overlapping attributable risk for five tiers of behavioural, metabolic, and environmental risk factors. We present results for 306 causes and 2337 sequelae, and 79 risks or risk clusters. FINDINGS: Between 1990 and 2013, life expectancy from birth in England increased by 5·4 years (95% uncertainty interval 5·0-5·8) from 75·9 years (75·9-76·0) to 81·3 years (80·9-81·7); gains were greater for men than for women. Rates of age-standardised YLLs reduced by 41·1% (38·3-43·6), whereas DALYs were reduced by 23·8% (20·9-27·1), and YLDs by 1·4% (0·1-2·8). For these measures, England ranked better than the UK and the EU15+ means. Between 1990 and 2013, the range in life expectancy among 45 regional deprivation areas remained 8·2 years for men and decreased from 7·2 years in 1990 to 6·9 years in 2013 for women. In 2013, the leading cause of YLLs was ischaemic heart disease, and the leading cause of DALYs was low back and neck pain. Known risk factors accounted for 39·6% (37·7-41·7) of DALYs; leading behavioural risk factors were suboptimal diet (10·8% [9·1-12·7]) and tobacco (10·7% [9·4-12·0]). INTERPRETATION: Health in England is improving although substantial opportunities exist for further reductions in the burden of preventable disease. The gap in mortality rates between men and women has reduced, but marked health inequalities between the least deprived and most deprived areas remain. Declines in mortality have not been matched by similar declines in morbidity, resulting in people living longer with diseases. Health policies must therefore address the causes of ill health as well as those of premature mortality. 
Systematic action locally and nationally is needed to reduce risk exposures, support healthy behaviours, alleviate the severity of chronic disabling disorders, and mitigate the effects of socioeconomic deprivation. FUNDING: Bill & Melinda Gates Foundation and Public Health England.
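
    The burden measures in this abstract combine fatal and non-fatal components: DALYs are the sum of YLLs and YLDs, and the 45 regional deprivation areas come from crossing the nine English regions with five deprivation quintiles. The sketch below illustrates that bookkeeping with invented numbers; it is not GBD code and the rates are placeholders.

# Minimal sketch (not GBD code): how the abstract's burden metrics combine.
# DALYs are the sum of fatal (YLL) and non-fatal (YLD) burden, and the 45
# "regional deprivation areas" arise from 9 English regions x 5 deprivation
# quintiles. All numbers below are made up for illustration.
from itertools import product

regions = [f"region_{i}" for i in range(1, 10)]      # 9 NUTS 1 regions
quintiles = [1, 2, 3, 4, 5]                          # 1 = least deprived

# hypothetical age-standardised rates per 100,000 for one cause
example = {
    (r, q): {"yll": 900.0 + 40 * q, "yld": 1100.0 + 10 * q}
    for r, q in product(regions, quintiles)
}

def dalys(yll: float, yld: float) -> float:
    """DALYs = years of life lost + years lived with disability."""
    return yll + yld

assert len(example) == 45  # 9 regions x 5 quintiles = 45 deprivation areas

for (region, quintile), rates in sorted(example.items())[:3]:
    print(region, quintile, dalys(rates["yll"], rates["yld"]))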

    Global urban environmental change drives adaptation in white clover

    Urbanization transforms environments in ways that alter biological evolution. We examined whether urban environmental change drives parallel evolution by sampling 110,019 white clover plants from 6169 populations in 160 cities globally. Plants were assayed for a Mendelian antiherbivore defense that also affects tolerance to abiotic stressors. Urban-rural gradients were associated with the evolution of clines in defense in 47% of cities throughout the world. Variation in the strength of clines was explained by environmental changes in drought stress and vegetation cover that varied among cities. Sequencing 2074 genomes from 26 cities revealed that the evolution of urban-rural clines was best explained by adaptive evolution, but the degree of parallel adaptation varied among cities. Our results demonstrate that urbanization leads to adaptation at a global scale.
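
    A cline here is a gradient in the frequency of the defense phenotype along an urban-rural transect. The sketch below shows one simple way such a cline could be tested, by regressing population-level defense frequency on distance from the city centre; it uses fabricated data and is not the study's analysis pipeline.

# Illustrative sketch only (not the study's pipeline): testing for an
# urban-rural cline by regressing population-level frequency of the
# Mendelian defense phenotype on distance from the urban centre.
# The data below are fabricated for demonstration.
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
distance_km = np.linspace(0, 40, 30)                      # transect positions
true_cline = 0.004 * distance_km + 0.35                   # hypothetical trend
defense_freq = np.clip(true_cline + rng.normal(0, 0.05, 30), 0, 1)

fit = stats.linregress(distance_km, defense_freq)
print(f"slope = {fit.slope:.4f} per km, p = {fit.pvalue:.3g}")
# A slope reliably different from zero would indicate a cline in this city;
# repeating the test per city gives the proportion of cities with clines.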

    Effect of angiotensin-converting enzyme inhibitor and angiotensin receptor blocker initiation on organ support-free days in patients hospitalized with COVID-19

    IMPORTANCE Overactivation of the renin-angiotensin system (RAS) may contribute to poor clinical outcomes in patients with COVID-19. OBJECTIVE To determine whether angiotensin-converting enzyme (ACE) inhibitor or angiotensin receptor blocker (ARB) initiation improves outcomes in patients hospitalized for COVID-19. DESIGN, SETTING, AND PARTICIPANTS In an ongoing, adaptive platform randomized clinical trial, 721 critically ill and 58 non–critically ill hospitalized adults were randomized to receive an RAS inhibitor or control between March 16, 2021, and February 25, 2022, at 69 sites in 7 countries (final follow-up on June 1, 2022). INTERVENTIONS Patients were randomized to receive open-label initiation of an ACE inhibitor (n = 257), ARB (n = 248), ARB in combination with DMX-200 (a chemokine receptor-2 inhibitor; n = 10), or no RAS inhibitor (control; n = 264) for up to 10 days. MAIN OUTCOMES AND MEASURES The primary outcome was organ support–free days, a composite of hospital survival and days alive without cardiovascular or respiratory organ support through 21 days. The primary analysis was a bayesian cumulative logistic model. Odds ratios (ORs) greater than 1 represent improved outcomes. RESULTS On February 25, 2022, enrollment was discontinued due to safety concerns. Among 679 critically ill patients with available primary outcome data, the median age was 56 years and 239 participants (35.2%) were women. Median (IQR) organ support–free days among critically ill patients was 10 (–1 to 16) in the ACE inhibitor group (n = 231), 8 (–1 to 17) in the ARB group (n = 217), and 12 (0 to 17) in the control group (n = 231) (median adjusted odds ratios of 0.77 [95% bayesian credible interval, 0.58-1.06] for improvement for ACE inhibitor and 0.76 [95% credible interval, 0.56-1.05] for ARB compared with control). The posterior probabilities that ACE inhibitors and ARBs worsened organ support–free days compared with control were 94.9% and 95.4%, respectively. Hospital survival occurred in 166 of 231 critically ill participants (71.9%) in the ACE inhibitor group, 152 of 217 (70.0%) in the ARB group, and 182 of 231 (78.8%) in the control group (posterior probabilities that ACE inhibitor and ARB worsened hospital survival compared with control were 95.3% and 98.1%, respectively). CONCLUSIONS AND RELEVANCE In this trial, among critically ill adults with COVID-19, initiation of an ACE inhibitor or ARB did not improve, and likely worsened, clinical outcomes. TRIAL REGISTRATION ClinicalTrials.gov Identifier: NCT0273570
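
    The reported posterior probabilities of harm are consistent with the reported odds ratios and credible intervals. The back-of-the-envelope check below approximates the posterior of the log odds ratio as normal and recovers roughly 95% for both comparisons; it is only a plausibility check, not the trial's bayesian cumulative logistic model.

# Back-of-the-envelope check (not the trial's bayesian cumulative logistic
# model): approximate the posterior of log(OR) as normal, using the reported
# median OR and 95% credible interval, and compute P(OR < 1), i.e. the
# probability that the intervention worsened organ support-free days.
from math import log, sqrt, erf

def prob_or_below_one(or_median: float, ci_low: float, ci_high: float) -> float:
    mu = log(or_median)
    sigma = (log(ci_high) - log(ci_low)) / (2 * 1.96)   # width of a 95% interval
    z = (0.0 - mu) / sigma
    return 0.5 * (1 + erf(z / sqrt(2)))                  # normal CDF at 0

print(f"ACE inhibitor: {prob_or_below_one(0.77, 0.58, 1.06):.1%}")  # ~95%
print(f"ARB:           {prob_or_below_one(0.76, 0.56, 1.05):.1%}")  # ~95%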

    Programme Selection in Research and Development

    Mathematical programming as an aid to R&D project portfolio selection has been suggested by many authors, but few practical applications have been reported. This paper discusses some of the difficulties associated with the application of the proposed models and describes procedures for handling these problems. The problem areas considered are: allowance for future opportunities in multi-period models; generation of alternative risk-return solutions; inclusion of projects not completed during the planning period; and comparison of linear and integer programming outputs. In the main, both the problems and the methods of solution are clarified by means of numerical examples, and details of a practical case study are given.
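
    The kind of model the paper discusses can be illustrated with a toy zero-one selection problem: choose projects to maximise expected value subject to a budget. The sketch below uses invented figures and exhaustive enumeration in place of an integer-programming solver; it illustrates the problem class, not the paper's procedure.

# Toy illustration (numbers invented): zero-one project selection of the kind
# the paper discusses -- maximise expected value subject to a single-period
# budget. Exhaustive enumeration stands in for an integer-programming solver.
from itertools import combinations

projects = {            # name: (cost, expected value), all figures fabricated
    "A": (40, 90),
    "B": (30, 55),
    "C": (20, 45),
    "D": (35, 60),
    "E": (25, 40),
}
budget = 100

best_value, best_set = 0, ()
names = list(projects)
for r in range(len(names) + 1):
    for subset in combinations(names, r):
        cost = sum(projects[p][0] for p in subset)
        value = sum(projects[p][1] for p in subset)
        if cost <= budget and value > best_value:
            best_value, best_set = value, subset

print("selected portfolio:", best_set, "value:", best_value)
# An LP relaxation would allow fractional projects; comparing its solution
# with the integer optimum is one of the issues the paper examines.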

    sj-docx-1-rea-10.1177_17470161241235910 – Supplemental material for Reshaping consent so we might improve participant choice (III) – How is the research participant’s understanding currently checked and how might we improve this process?

    Supplemental material, sj-docx-1-rea-10.1177_17470161241235910 for Reshaping consent so we might improve participant choice (III) – How is the research participant’s understanding currently checked and how might we improve this process? by Hugh Davies, Simon E Kolstoe and Anthony Lockett in Research Ethics.

    Urea treatment in fluid restriction-refractory hyponatraemia

    Objective: Hyponatraemia in hospitalized patients is common and associated with increased mortality. International guidelines give conflicting advice regarding the role of urea in the treatment of SIADH. We hypothesized that urea is a safe, effective treatment for fluid restriction-refractory hyponatraemia. Design: Review of urea for the treatment of hyponatraemia in patients admitted to a tertiary hospital during 2016-2017. Primary end-point: proportion of patients achieving a serum sodium ≥130 mmol/L at 72 hours. Patients: Urea was used on 78 occasions in 69 patients. The median age was 67 years (IQR 52-76) and 41% were female. Seventy (89.7%) had hyponatraemia due to SIADH, of which CNS pathology (64.3%) was the most common cause. The duration was acute in 32 (41%), chronic in 35 (44.9%) and unknown in the rest. Results: The median nadir serum sodium was 122 mmol/L (IQR 118-126). Fluid restriction was first-line treatment in 65.4%. Urea was used first line in 21.8% and second line in 78.2%. Fifty treatment episodes (64.1%) resulted in serum sodium ≥130 mmol/L at 72 hours. In 56 patients who received other prior treatment, the mean sodium change at 72 hours (6.9 ± 4.8 mmol/L) was greater than with the preceding treatments (−1.0 ± 4.7 mmol/L). No patients overcorrected (a rise of >10 mmol/L in 24 hours or >18 mmol/L in 48 hours) or died. Conclusions: Urea is safe and effective in fluid restriction-refractory hyponatraemia. We recommend urea, at a starting dose of ≥30 g/d, in patients with SIADH and moderate to profound hyponatraemia who are unable to undergo, or have failed, fluid restriction.
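
    The overcorrection thresholds quoted above (a rise of >10 mmol/L in 24 hours or >18 mmol/L in 48 hours) amount to a simple check over serial sodium measurements. The helper below is an illustrative sketch with hypothetical values, not code from the study.

# Illustrative helper (not from the study): flag sodium overcorrection using
# the thresholds quoted in the abstract, i.e. a rise of >10 mmol/L within
# 24 hours or >18 mmol/L within 48 hours. Measurements are (hour, mmol/L)
# pairs; the example values are hypothetical.
def overcorrected(measurements: list[tuple[float, float]]) -> bool:
    for i, (t1, na1) in enumerate(measurements):
        for t2, na2 in measurements[i + 1:]:
            dt, rise = t2 - t1, na2 - na1
            if dt <= 24 and rise > 10:
                return True
            if dt <= 48 and rise > 18:
                return True
    return False

serial_sodium = [(0, 118), (12, 124), (24, 127), (48, 130)]  # hypothetical
print(overcorrected(serial_sodium))  # False: rise of 9 mmol/L in 24 h, 12 in 48 h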

    Dietary supplements, guideline alignment and biochemical nutrient status in pregnancy: findings from the Queensland Family Cohort pilot study

    In high-income nations, multiple micronutrient (MMN) supplementation during pregnancy is a common practice. We aimed to describe maternal characteristics associated with supplement use and the daily dose of supplemental nutrients consumed in pregnancy, and whether guideline alignment and nutrient status are related to supplement use. The Queensland Family Cohort is a prospective, Australian observational longitudinal study. Maternal characteristics, nutrient intake from food and supplements, and biochemical nutrient status were assessed in the second trimester (n = 127). Supplement use was reported by 89% of participants, of whom 91% reported taking an MMN supplement. Participants who received private obstetric care, had private health insurance and had greater alignment to meat/vegetarian alternatives recommendations were more likely to report MMN supplement use. Private obstetric care and general practitioner shared care were associated with a higher daily dose of supplemental nutrients consumed compared with midwifery group practice. There was high reliance on supplements to meet nutrient reference values for folate, iodine and iron, but only plasma folate concentrations were higher in MMN supplement users than in nonsupplement users. Exceeding the upper level of intake for folic acid and iron was more likely among users of combined MMN and individual supplements, and was associated with higher plasma concentrations of the respective nutrients. Given the low alignment with food group recommendations and the potential risks associated with high MMN supplement use, whole food diets should be emphasized. This study confirms the need to define effective strategies for optimizing nutrient intake in pregnancy, especially among those most vulnerable, where MMN supplement use may be appropriate.
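
    The exceedance analysis described above amounts to comparing combined intake from food and supplements against an upper level for each nutrient. The sketch below shows that comparison with placeholder upper levels and hypothetical intakes; the values are assumptions for illustration, not the cohort's data or official nutrient reference values.

# Minimal sketch of the kind of exceedance check described above: combine
# intake from food and supplements and compare it with an upper level (UL).
# The UL values and intakes here are illustrative placeholders, not the
# cohort's data or an authoritative nutrient reference.
UPPER_LEVELS = {"folic_acid_ug": 1000, "iron_mg": 45}   # placeholder ULs

def exceeds_upper_level(nutrient: str, food: float, supplements: float) -> bool:
    """Return True if combined daily intake is above the stated UL."""
    return (food + supplements) > UPPER_LEVELS[nutrient]

# hypothetical participant: MMN supplement plus an individual iron supplement
print(exceeds_upper_level("folic_acid_ug", food=200, supplements=800))  # False
print(exceeds_upper_level("iron_mg", food=12, supplements=60))          # True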