
    Coding accuracy for Parkinson's disease hospital admissions: implications for healthcare planning in the UK

    Objectives: Hospital Episode Statistics data are used for healthcare planning and hospital reimbursement. The reliability of these data depends on the accuracy with which individual hospitals report to the Secondary Uses Service (SUS), which includes hospitalisation data. The number of Parkinson's disease hospital admissions at a tertiary centre in Birmingham, and their coding accuracy, were assessed.
    Study design: Retrospective, routine-data-based study.
    Methods: A retrospective electronic database search for all Parkinson's disease patients admitted to the tertiary hospital over a 4-year period (2009–2013) was performed on the SUS database using International Classification of Diseases codes, and on the local inpatient electronic prescription database, the Prescription and Information Communications System, using medication prescriptions. Capture-recapture methods were used to estimate the number of patients and admissions missed by both databases.
    Results: Across the two databases, between July 2009 and June 2013, 1068 patients with Parkinson's disease accounted for 1999 admissions in which Parkinson's disease was coded as a primary or secondary diagnosis. Ninety-one per cent of these admissions were recorded on the SUS database. Capture-recapture methods estimated that 1127 patients (95% confidence interval: 1107–1146) with Parkinson's disease were admitted during this period. A supplementary search of both SUS and the Prescription and Information Communications System, using the hospital numbers of these 1068 patients, identified another 479 admissions. The SUS database underestimated Parkinson's disease admissions by 27% during the study period.
    Conclusion: The accuracy of disease coding is critical for healthcare policy planning and must be improved. If the under-reporting of Parkinson's disease admissions on the SUS database is repeated nationally, expenditure on Parkinson's disease admissions in England is underestimated by approximately £61 million per year.
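The two-source capture-recapture approach described in the methods can be sketched with Chapman's nearly unbiased variant of the Lincoln-Petersen estimator. The counts below are hypothetical illustrations, not the study's per-database figures.

```python
# Minimal sketch of two-source capture-recapture estimation (Chapman's
# variant of the Lincoln-Petersen estimator). Hypothetical counts only.

def chapman_estimate(n1, n2, m):
    """Estimate total population size from two overlapping case lists.

    n1: cases captured by source 1 (e.g. a coding database such as SUS)
    n2: cases captured by source 2 (e.g. a prescription database)
    m:  cases captured by both sources
    """
    return (n1 + 1) * (n2 + 1) / (m + 1) - 1

# Hypothetical: 900 patients on one list, 800 on the other, 700 on both.
estimate = chapman_estimate(900, 800, 700)
```

The smaller the overlap between the two sources relative to their sizes, the larger the estimated number of cases missed by both.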

    ‘Sons of athelings given to the earth’: Infant Mortality within Anglo-Saxon Mortuary Geography

    FOR 20 OR MORE YEARS early Anglo-Saxon archaeologists have believed children to be under-represented in the cemetery evidence. They conclude that excavation misses small bones, that previous attitudes to reporting overlooked the very young, or that infants and children were buried elsewhere. This is all well and good, but we must be careful of oversimplifying compound social and cultural responses to childhood and infant mortality. Previous approaches have faced methodological quandaries in response to this under-representation. However, proportionally more infants were placed in large cemeteries and sometimes in specific zones. This trend is statistically significant and is therefore unlikely to result entirely from preservation or excavation problems. Early medieval cemeteries were part of regional mortuary geographies and provided places to stage events that promoted social cohesion across kinship systems extending over tribal territories. This paper argues that patterns in early Anglo-Saxon infant burial were the result of female mobility. Many women probably travelled locally to marry, in a union which reinforced existing social networks. For an expectant mother, however, the safest place to give birth was with experienced women in her maternal home. Infant identities were shaped by personal and legal association with their mother’s parental kindred, so when an infant died during childbirth or months and years later, it was the mother’s identity which dictated burial location. As a result, cemeteries central to tribal identities became places to bury the sons and daughters of a regional tribal aristocracy.

    Successes and Challenges of Implementing Tobacco Dependency Treatment in Health Care Institutions in England

    There is a significant body of evidence that delivering tobacco dependency treatment within acute care hospitals can deliver high rates of tobacco abstinence and substantial benefits for both patients and the healthcare system. This evidence has driven renewed investment in the UK healthcare service to ensure all patients admitted to hospital are provided with evidence-based interventions during admission and after discharge. An early implementer of this new wave of hospital-based tobacco dependency treatment services is “the CURE project” in Greater Manchester, a region in the North West of England. The CURE project strives to change the culture of a hospital system, to medicalise tobacco dependency, and to empower front-line hospital staff to deliver an admission bundle of care, including identification of patients who smoke, provision of very brief advice (VBA), protocolised prescription of pharmacotherapy, and opt-out referral to the specialist CURE practitioners. This specialist team provides expert treatment and behaviour change support during the hospital admission and can agree a support package after discharge, with either hospital-led or community-led follow-up. The programme has shown exceptional clinical effectiveness, with 22% of all smokers admitted to hospital abstinent from tobacco at 12 weeks, and exceptional cost-effectiveness, with a public value return on investment of GBP 30.49 per GBP 1 invested and a cost per QALY of GBP 487. There have been many challenges in implementing this service, underpinned by the need for system-wide culture change and for good communication and engagement of all stakeholders across the complex networks of the tobacco control and healthcare systems. The delivery of hospital-based tobacco dependency services across all NHS acute care hospitals represents a substantial step forward in the fight against the tobacco epidemic.

    Symmetry breaking in mass-recruiting ants: extent of foraging biases depends on resource quality

    The communication involved in the foraging behaviour of social insects is integral to their success. Many ant species use trail pheromones to make decisions about where to forage. The strong positive feedback caused by the trail pheromone is thought to drive a collective decision between two or more options. When the two options are of identical quality, this is known as symmetry breaking, and it is important because it helps colonies to monopolise food sources in a competitive environment. Symmetry breaking is thought to increase with the quantity of pheromone deposited by ants, but empirical studies exploring the factors affecting symmetry breaking are limited. Here, we tested (i) whether greater disparity between two food sources increased the degree to which the higher quality food source was favoured and (ii) whether the quality of identical food sources affected the degree of symmetry breaking that occurred. Using the mass-recruiting Pharaoh ant, Monomorium pharaonis, we carried out binary choice tests to investigate how food quality affects the choice and distribution of colony foraging decisions. We found that colonies could coordinate foraging to exploit food sources of greater quality, and that a greater contrast in quality between the food sources created a stronger collective decision. Contrary to prediction, we found that symmetry breaking decreased as the quality of the two identical food sources increased. We discuss how stochastic effects might lead to relatively strong differences in the amount of pheromone on alternative routes when food source quality is low. Significance statement: Pheromones used by social insects should guide a colony, via positive feedback, to distribute colony members at resources in the most adaptive way given the current environment. This study shows that when food resources are of equal quality, Pharaoh ant foragers distribute themselves more evenly if the two food sources are both of high quality than if both are of low quality. The results highlight the way in which individual ants can modulate their response to pheromone trails, which may lead colonies to exploit resources more evenly in a resource-rich environment.
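The pheromone-driven positive feedback behind symmetry breaking can be illustrated with a minimal simulation of a Deneubourg-style binary choice. The response function and all parameter values below are illustrative assumptions, not quantities fitted to Pharaoh ants.

```python
import random

def simulate(n_ants=1000, deposit=1.0, k=6.0, alpha=2.0, seed=1):
    """Sequential binary choice with a Deneubourg-style pheromone response:
    P(choose A) = (k + xA)**alpha / ((k + xA)**alpha + (k + xB)**alpha),
    where xA, xB are pheromone levels and each ant deposits `deposit`
    on the branch it takes. Returns the final share of ants on branch A.
    All parameter values are illustrative assumptions."""
    random.seed(seed)
    xA = xB = 0.0
    count_a = 0
    for _ in range(n_ants):
        wA = (k + xA) ** alpha
        wB = (k + xB) ** alpha
        if random.random() < wA / (wA + wB):
            xA += deposit
            count_a += 1
        else:
            xB += deposit
    return count_a / n_ants

# No deposition: choices stay independent, so the split hovers near 50:50.
share_no_feedback = simulate(deposit=0.0)
# Strong deposition: early random fluctuations are amplified and the
# colony locks onto one branch, i.e. symmetry breaking.
share_strong_feedback = simulate(deposit=5.0)
```

In this toy model, weaker per-ant deposition (one plausible proxy for individual modulation of the pheromone response) keeps the distribution of foragers closer to even, which is the qualitative pattern the authors report for high-quality identical food sources.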

    Predicting the Risk of Disease Recurrence and Death Following Curative-intent Radiotherapy for Non-small Cell Lung Cancer: The Development and Validation of Two Scoring Systems From a Large Multicentre UK Cohort

    AIMS: There is a paucity of evidence on which to base recommendations on either the clinical or the imaging follow-up of lung cancer patients after curative-intent radiotherapy. In the 2019 National Institute for Health and Care Excellence lung cancer guidelines, further research into risk-stratification models to inform follow-up protocols was recommended. MATERIALS AND METHODS: A retrospective study of consecutive patients undergoing curative-intent radiotherapy for non-small cell lung cancer from 1 October 2014 to 1 October 2016 across nine UK trusts was carried out. Twenty-two demographic, clinical and treatment-related variables were collected, and multivariable logistic regression was used to develop and validate two risk-stratification models to determine the risk of disease recurrence and death. RESULTS: In total, 898 patients were included in the study. The mean age was 72 years, 63% (562/898) had a good performance status (0-1) and 43% (388/898), 15% (134/898) and 42% (376/898) were clinical stage I, II and III, respectively. Thirty-six per cent (322/898) suffered disease recurrence and 41% (369/898) died in the first 2 years after radiotherapy. The ASSENT score (age, performance status, smoking status, staging endobronchial ultrasound, N-stage, T-stage) was developed, which stratifies the risk of disease recurrence within 2 years, with an area under the receiver operating characteristic curve (AUROC) for the total score of 0.712 (0.671-0.753) and 0.72 (0.65-0.789) in the derivation and validation sets, respectively. The STEPS score (sex, performance status, staging endobronchial ultrasound, T-stage, N-stage) was developed, which stratifies the risk of death within 2 years, with an AUROC for the total score of 0.625 (0.581-0.669) and 0.607 (0.53-0.684) in the derivation and validation sets, respectively. CONCLUSIONS: These validated risk-stratification models could be used to inform follow-up protocols after curative-intent radiotherapy for lung cancer. The modest performance highlights the need for more advanced risk prediction tools.
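The AUROC values quoted for the ASSENT and STEPS scores measure discrimination: the probability that a randomly chosen patient who has the event is ranked above a randomly chosen patient who does not (the Mann-Whitney equivalence). A minimal sketch on made-up scores, not ASSENT or STEPS values, is:

```python
# Rank-sum (Mann-Whitney) computation of the AUROC. The scores and
# outcomes below are invented for illustration only.

def auroc(scores, labels):
    """Fraction of positive/negative pairs in which the positive case
    scores higher; ties count half."""
    pos = [s for s, y in zip(scores, labels) if y]
    neg = [s for s, y in zip(scores, labels) if not y]
    wins = 0.0
    for p in pos:
        for n in neg:
            if p > n:
                wins += 1.0
            elif p == n:
                wins += 0.5
    return wins / (len(pos) * len(neg))

# Hypothetical risk scores (higher = higher predicted risk) and outcomes.
scores = [1, 2, 2, 3, 4, 5, 6, 7]
event = [0, 0, 1, 0, 1, 0, 1, 1]
discrimination = auroc(scores, event)  # 0.78125 for these made-up data
```

An AUROC of 0.5 means the score ranks patients no better than chance; values around 0.6-0.7, as reported above, indicate modest discrimination.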

    Neutrophil-Lymphocyte Ratio and Absolute Lymphocyte Count as Prognostic Markers in Patients Treated with Curative-intent Radiotherapy for Non-small Cell Lung Cancer

    Aims: The neutrophil–lymphocyte ratio (NLR) and the absolute lymphocyte count (ALC) have been proposed as prognostic markers in non-small cell lung cancer (NSCLC). The objective of this study was to examine the association of NLR/ALC before and after curative-intent radiotherapy for NSCLC with disease recurrence and overall survival. Materials and methods: A retrospective study of consecutive patients who underwent curative-intent radiotherapy for NSCLC across nine sites in the UK from 1 October 2014 to 1 October 2016. A multivariate analysis was carried out to assess the ability of pre-treatment NLR/ALC, post-treatment NLR/ALC and change in NLR/ALC, adjusted for confounding factors using the Cox proportional hazards model, to predict disease recurrence and overall survival within 2 years of treatment. Results: In total, 425 patients were identified with complete blood parameter values. None of the NLR/ALC parameters were independent predictors of disease recurrence. Higher pre-NLR, post-NLR and change in NLR, plus lower post-ALC, were all independent predictors of worse survival. Receiver operating characteristic curve analysis found that a pre-NLR > 2.5 (odds ratio 1.71, 95% confidence interval 1.06–2.79), a post-NLR > 5.5 (odds ratio 2.36, 95% confidence interval 1.49–3.76), a change in NLR > 3.6 (odds ratio 2.41, 95% confidence interval 1.5–3.91, P < 0.001) and a post-ALC < 0.8 (odds ratio 2.86, 95% confidence interval 1.76–4.69, P < 0.001) optimally predicted poor overall survival on both univariate and multivariate analysis when adjusted for confounding factors. Median overall survival for the high- versus low-risk groups was: pre-NLR 770 versus 1009 days (P = 0.34), post-NLR 596 versus 1287 days (P ≤ 0.001), change in NLR 553 versus 1214 days (P ≤ 0.001) and post-ALC 594 versus 1287 days (P ≤ 0.001). Conclusion: NLR and ALC, surrogate markers for systemic inflammation, have prognostic value in NSCLC patients treated with curative-intent radiotherapy. These simple and readily available parameters may have a future role in post-treatment risk stratification to inform the intensity of surveillance protocols.
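Cutoffs such as pre-NLR > 2.5 are commonly derived from the ROC curve by maximising Youden's J statistic (sensitivity + specificity − 1). A minimal sketch on synthetic marker values, not the study's data, is:

```python
# Choosing an "optimal" prognostic cutoff by maximising Youden's J.
# The marker values and outcomes below are synthetic illustrations.

def youden_cutoff(values, outcomes):
    """Return (cutoff, J) maximising sensitivity + specificity - 1,
    treating values strictly above the cutoff as test-positive."""
    pos = [v for v, o in zip(values, outcomes) if o]
    neg = [v for v, o in zip(values, outcomes) if not o]
    best_cut, best_j = None, -1.0
    for cut in sorted(set(values)):
        sens = sum(v > cut for v in pos) / len(pos)
        spec = sum(v <= cut for v in neg) / len(neg)
        j = sens + spec - 1
        if j > best_j:
            best_cut, best_j = cut, j
    return best_cut, best_j

# Synthetic NLR-like values; non-survivors tend to have higher values.
nlr = [1.8, 2.1, 2.4, 2.6, 3.0, 4.5, 5.0, 6.2, 7.1, 8.0]
died = [0, 0, 0, 0, 1, 1, 0, 1, 1, 1]
cut, j = youden_cutoff(nlr, died)  # cut = 2.6 for these synthetic data
```

Youden's J weights sensitivity and specificity equally; in surveillance settings a different trade-off (e.g. favouring sensitivity) may be preferred, which is one reason published cutoffs should be externally validated before clinical use.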

    Randomised trial of indwelling pleural catheters for refractory transudative pleural effusions

    Objective: Refractory symptomatic transudative pleural effusions are an indication for pleural drainage. There has been supportive observational evidence for the use of indwelling pleural catheters (IPCs) for transudative effusions, but no randomised trials. We aimed to investigate the effect of IPCs on breathlessness in patients with transudative pleural effusions when compared with standard care. / Methods: A multicentre randomised controlled trial in which patients with transudative pleural effusions were randomly assigned to either an IPC (intervention) or therapeutic thoracentesis (TT; standard care). The primary outcome was the mean daily breathlessness score over 12 weeks from randomisation. / Results: 220 patients were screened from April 2015 to August 2019 across 13 centres, with 33 randomised to intervention (IPC) and 35 to standard care (TT). Underlying aetiology was heart failure in 46 patients, liver failure in 16 and renal failure in six. In the primary outcome analysis, the mean±sd breathlessness score over the 12-week study period was 39.7±29.4 mm in the IPC group and 45.0±26.1 mm in the TT group (p=0.67). Secondary outcome analysis demonstrated that mean±sd drainage was 17 412±17 936 mL and 2901±2416 mL in the IPC and TT groups, respectively. A greater proportion of patients had at least one adverse event in the IPC group (p=0.04). / Conclusion: We found no significant difference in breathlessness over 12 weeks between IPCs and TT. TT was associated with fewer complications, while IPCs reduced the number of invasive pleural procedures required. Patient preference and circumstances should be considered when selecting the intervention in this cohort.

    How should performance in EBUS mediastinal staging in lung cancer be measured?

    There has been a paradigm shift in mediastinal staging algorithms in non-small cell lung cancer over the last decade in the United Kingdom (UK). This has seen endoscopic nodal staging (predominantly endobronchial ultrasound, EBUS) almost replace surgical staging (predominantly mediastinoscopy) as the pathological staging procedure of first choice.