
    Understanding the role of obsolescence in PPP/PFI

    In 2013 the Guardian newspaper reported that the UK Government had accrued £300 billion of capital costs and unitary payments under what was formerly known as the Private Finance Initiative – now Public Private Partnerships. This paper is not about the economics or the moral debate over the successes and failures of PPPs within the UK, but rather the untold story of the impact of obsolescence upon the integral asset systems which support service delivery. Prisons require supportable and maintainable security systems; the same can be said for government and defence buildings, not to mention the life-critical systems within hospitals and clinics across the country. This untold story is affecting the through-life (lifecycle) costs of supporting and maintaining key asset systems, driving additional lifecycle expenditure that may be unforeseen. This paper presents evidence of the scale of the financial impact of obsolescence through obsolescence-driven investments, not to mention the potential operational impacts if systems become unsupportable. It begins to create a foundation for future research focusing on obsolescence and how best to monitor and mitigate its effects.

    Identifying and Managing Asset Obsolescence within the Built Environment

    Obsolescence in practice commonly occurs in two forms: the asset in question is no longer suitable for current demands, or it is no longer available from manufacturers. Most research surrounding obsolescence has targeted short-lifecycle components such as electronics or software (2-5 years). There is little consideration of low-volume, long-life assets (20+ years) that are commonplace within the built environment (e.g. uninterruptible power supply systems, building management systems and fire alarm systems). This paper evidences the importance of identifying asset obsolescence within the built environment by observing 'lifecycle mismatches' within a live case study of a ten-year-old UK Private Finance Initiative (PFI). It develops and proposes an original assessment tool for identifying obsolescence within the built environment and empirically tests it within the case study. The methodology and results combine to evidence the importance of obsolescence and the contractual and financial risk it poses. The model is transferable and scalable, allowing larger portfolios to be considered. The rate at which obsolescence is identified within long-life assets is increasing, whilst the lifecycles of certain component groups are decreasing, posing a growing problem for future facility managers.

    Cost-effectiveness of bevacizumab for diabetic macular oedema

    This is the author accepted manuscript. The final version is available from Mark Allen Healthcare via the DOI in this record. A Markov model was developed to predict the outcomes and cost-effectiveness of bevacizumab compared to macular laser therapy for diabetes patients with clinically significant macular oedema (CSMO). The study used outcome data from a randomised controlled trial, utility data and health states from a ranibizumab health technology assessment, and costs from the UK national tariff. A total of 37.73% of patients treated with bevacizumab in the model had a visual acuity of at least 76 Early Treatment Diabetic Retinopathy Study Research Group (ETDRS) letters after four years, compared with 4.09% of laser therapy patients. Only 0.11% of bevacizumab patients were blind after four years, compared with 6.45% of laser therapy patients. However, with an incremental cost-effectiveness ratio of £51,182, we predict that bevacizumab would not be cost-effective compared to laser therapy, owing to the influence of the NHS national tariff costs for monitoring patients and administering bevacizumab, and the inability of the EQ-5D measure to capture sufficiently the impact of sensory deprivation on quality of life. This study recommends significant caution when interpreting the results of cost-effectiveness analyses of vision-related interventions. National Institute for Health Research (NIHR).
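The kind of Markov cohort model described in this abstract can be sketched in a few lines: patients are distributed across health states, a transition matrix advances the cohort each cycle, and per-state costs and utilities accumulate into an incremental cost-effectiveness ratio (ICER). All states, transition probabilities, costs and utilities below are hypothetical illustrations, not the study's parameters.

```python
import numpy as np

# Illustrative 3-state Markov cohort model (states: good vision, poor vision,
# blind). All numbers are hypothetical, not taken from the study above.
P_bev = np.array([[0.92, 0.06, 0.02],    # annual transitions, bevacizumab arm
                  [0.10, 0.85, 0.05],
                  [0.00, 0.00, 1.00]])   # blindness is absorbing
P_laser = np.array([[0.85, 0.11, 0.04],  # annual transitions, laser arm
                    [0.05, 0.88, 0.07],
                    [0.00, 0.00, 1.00]])
utility = np.array([0.85, 0.70, 0.50])   # QALY weight per state per year
cost = {"bev": np.array([3000.0, 3000.0, 500.0]),    # annual cost per state
        "laser": np.array([800.0, 800.0, 500.0])}

def run_cohort(P, annual_cost, years=4):
    """Return (total cost, total QALYs) per patient over the horizon."""
    state = np.array([1.0, 0.0, 0.0])    # everyone starts with good vision
    total_cost = total_qaly = 0.0
    for _ in range(years):
        total_cost += state @ annual_cost
        total_qaly += state @ utility
        state = state @ P                # advance the cohort one cycle
    return total_cost, total_qaly

c_b, q_b = run_cohort(P_bev, cost["bev"])
c_l, q_l = run_cohort(P_laser, cost["laser"])
icer = (c_b - c_l) / (q_b - q_l)         # incremental cost per QALY gained
print(f"ICER: £{icer:,.0f} per QALY")
```

An intervention is then judged against a willingness-to-pay threshold; as the abstract notes, an ICER above roughly £20,000-£30,000 per QALY is generally not considered cost-effective in the UK.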

    Evaluating the impact of a simulation study in emergency stroke care

    Very few discrete-event simulation studies follow up on recommendations with an evaluation of whether modelled benefits have been realised and of the extent to which modelling contributed to any change. This paper evaluates changes made to the emergency stroke care pathway at a UK hospital, informed by a simulation modelling study. The aims of the study were to increase the proportion of people with strokes that undergo a time-sensitive treatment to break down a blood clot within the brain, and to decrease the time to treatment. Evaluation involved analysis of stroke treatment pre- and post-implementation, as well as a comparison of how the research team believed the intervention would aid implementation with what actually happened. Two years after the care pathway was changed, treatment rates had increased in line with expectations and the hospital was treating four times as many patients as before the intervention, in half the time. There is evidence that the modelling process aided implementation, but not always in line with the expectations of the research team. Despite user involvement throughout the study, it proved difficult to involve a representative group of clinical stakeholders in conceptual modelling, and this affected model credibility. The research team also found batch experimentation more useful than visual interactive simulation for structuring debate and decision making. In particular, simple charts of results focused debates on the clinical effectiveness of drugs, an emergent barrier to change. Visual interactive simulation proved more useful for engaging different hospitals and initiating new projects.

    Can the retinal screening interval be safely increased to 2 years for type 2 diabetic patients without retinopathy?

    This is the final version. Available from the American Diabetes Association via the DOI in this record. OBJECTIVE: In the U.K., people with diabetes are typically screened for retinopathy annually. However, diabetic retinopathy sometimes has a slow progression rate. We developed a simulation model to predict the likely impact of screening patients with type 2 diabetes, who have not been diagnosed with diabetic retinopathy, every 2 years rather than annually. We aimed to assess whether such a policy would increase the proportion of patients who developed retinopathy-mediated vision loss compared with the current policy, along with the potential cost savings that could be achieved. RESEARCH DESIGN AND METHODS: We developed a model that simulates the progression of retinopathy in type 2 diabetic patients, and the screening of these patients, to predict rates of retinopathy-mediated vision loss. We populated the model with data obtained from a National Health Service Foundation Trust. We generated comparative 15-year forecasts to assess the differences between the current and proposed screening policies. RESULTS: The simulation model predicts that implementing a 2-year screening interval for type 2 diabetic patients without evidence of diabetic retinopathy does not increase their risk of vision loss. Furthermore, we predict that this policy could reduce screening costs by ~25%. CONCLUSIONS: Screening people with type 2 diabetes, who have not yet developed retinopathy, every 2 years rather than annually is a safe and cost-effective strategy. Our findings support those of other studies, and we therefore recommend a review of the current National Institute for Health and Clinical Excellence (NICE) guidelines for diabetic retinopathy screening implemented in the U.K. National Institute for Health Research (NIHR).
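The trade-off this abstract describes, longer screening intervals cutting cost at the price of later detection, can be illustrated with a toy patient-level simulation. Onset probability, horizon and screening cost below are hypothetical, not the study's parameters, and disease onset is modelled as a single annual Bernoulli event rather than a graded progression.

```python
import random

# Toy simulation of retinopathy screening intervals. All parameters are
# hypothetical illustrations, not those of the study above.
P_ONSET = 0.05          # annual probability of developing retinopathy
COST_PER_SCREEN = 30.0  # cost of one screening appointment
HORIZON = 15            # years simulated, matching the paper's 15-year forecasts

def simulate(interval, n=20_000, seed=1):
    """Return (mean screening cost per patient, mean detection delay in years)."""
    rng = random.Random(seed)
    screens = 0
    delays = []          # years between disease onset and detection at a screen
    for _ in range(n):
        onset = None
        for year in range(1, HORIZON + 1):
            if onset is None and rng.random() < P_ONSET:
                onset = year
            if year % interval == 0:        # screening appointment this year
                screens += 1
                if onset is not None:       # disease present: detected now
                    delays.append(year - onset)
                    break
    mean_delay = sum(delays) / len(delays)
    return screens * COST_PER_SCREEN / n, mean_delay

cost1, delay1 = simulate(interval=1)
cost2, delay2 = simulate(interval=2)
print(f"annual:   £{cost1:.0f}/patient, mean detection delay {delay1:.2f} y")
print(f"biennial: £{cost2:.0f}/patient, mean detection delay {delay2:.2f} y")
```

The policy question is then whether the extra detection delay under biennial screening translates into measurable vision loss; the paper's model, populated with real Trust data, predicts that for patients without existing retinopathy it does not.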

    Streamlining pathways for minor injuries in emergency departments through radiographer-led discharge

    This is the author accepted manuscript. The final version is available from Elsevier via the DOI in this record. Diagnostic imaging services are essential to the diagnosis pathway for many patients arriving at hospital emergency departments with a suspected fracture. Commonly, these patients need to be seen again by a doctor or emergency nurse practitioner after an X-ray image has been taken in order to finalise the diagnosis and determine the next stage in the patient's pathway. Significant waiting times can accrue for these follow-up consultations after radiographic imaging, although the vast majority of patients are discharged. Research evidence from pilot studies suggests that patients with minor appendicular injuries could be safely discharged by a suitably qualified radiographer directly after imaging, thereby avoiding queues for repeated consultation. In this study, we model patient pathways through an emergency department (ED) at a hospital in the South West of England using process mapping, interviews with ED staff and discrete event simulation (DES). The DES model allowed us to compare current practice at the hospital with scenarios using radiographer-led discharge of patients directly after imaging, and to assess the reduction in patients' length of stay in the ED. We also quantified trade-offs between the provision of radiographer-led discharge and its effects, i.e. the reduction in waiting times and ED workload. Finally, we discuss how this decision support tool can be used to support understanding for patients and members of staff. Part of this research, i.e. the work of Martin Pitt and Sebastian Rachuba, was supported by the National Institute for Health Research (NIHR) Collaboration for Leadership in Applied Health Research and Care South West Peninsula (NIHR CLAHRC South West Peninsula).
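The core mechanism the DES study exploits can be shown with a minimal single-server queue: radiographer-led discharge thins the stream of patients re-queuing for a clinician consultation, which shortens waits for everyone who still needs one. The sketch below uses the Lindley recursion for an M/M/1 queue rather than a full DES model, and all rates are hypothetical illustrations, not the hospital's data.

```python
import random

# Minimal single-server queue comparison via the Lindley recursion:
# W_{n+1} = max(0, W_n + S_n - A_n). All rates are hypothetical.
def mean_wait(arrival_rate, service_rate, n=50_000, seed=7):
    """Estimate the mean wait in an M/M/1 queue (rates per hour)."""
    rng = random.Random(seed)
    wait = 0.0
    total = 0.0
    for _ in range(n):
        inter = rng.expovariate(arrival_rate)    # time since previous arrival
        service = rng.expovariate(service_rate)  # this patient's consultation
        wait = max(0.0, wait + service - inter)  # next patient's wait
        total += wait
    return total / n

ARRIVALS_PER_HR = 5.0    # imaged patients joining the consultation queue
CONSULT_PER_HR = 6.0     # clinician consultations completed per hour
DISCHARGE_SHARE = 0.4    # fraction safely discharged by the radiographer

baseline = mean_wait(ARRIVALS_PER_HR, CONSULT_PER_HR)
# Radiographer-led discharge thins the arrival stream into the queue.
with_rld = mean_wait(ARRIVALS_PER_HR * (1 - DISCHARGE_SHARE), CONSULT_PER_HR)
print(f"mean wait baseline: {baseline:.2f} h, with RLD: {with_rld:.2f} h")
```

Because waiting time grows sharply as utilisation approaches 1, removing even a modest share of arrivals yields a disproportionately large reduction in mean wait, which is why the modelled scenarios in the paper can show substantial cuts in length of stay.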

    Evidence for a Single-Spin Azimuthal Asymmetry in Semi-inclusive Pion Electroproduction

    Single-spin asymmetries for semi-inclusive pion production in deep-inelastic scattering have been measured for the first time. A significant target-spin asymmetry of the distribution in the azimuthal angle φ of the pion relative to the lepton scattering plane was observed for π^+ electroproduction on a longitudinally polarized hydrogen target. The corresponding analyzing power in the sin φ moment of the cross section is 0.022 ± 0.005 ± 0.003. This result can be interpreted as the effect of terms in the cross section involving chiral-odd spin distribution functions in combination with a chiral-odd fragmentation function that is sensitive to the transverse polarization of the fragmenting quark.

    Large Sample Asymptotics of the Pseudo-Marginal Method

    The pseudo-marginal algorithm is a variant of the Metropolis-Hastings algorithm which samples asymptotically from a probability distribution when it is only possible to estimate unbiasedly an unnormalized version of its density. In practice, one has to trade off the computational resources used to obtain this estimator against the asymptotic variances of the ergodic averages obtained by the pseudo-marginal algorithm. Recent works optimizing this trade-off rely on strong assumptions which can cast doubt over their practical relevance. In particular, they all assume that the distribution of the difference between the log-density and its estimate is independent of the parameter value at which it is evaluated. Under regularity conditions we show here that, as the number of data points tends to infinity, a space-rescaled version of the pseudo-marginal chain converges weakly towards another pseudo-marginal chain for which this assumption indeed holds. A study of this limiting chain allows us to provide parameter dimension-dependent guidelines on how to optimally scale a normal random walk proposal and the number of Monte Carlo samples for the pseudo-marginal method in the large-sample regime. This complements and validates currently available results. Comment: 76 pages, 3 figures.
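The algorithm itself is compact: run Metropolis-Hastings with a normal random walk proposal, but replace the exact target density with an unbiased estimate, and crucially reuse the stored estimate for the current state rather than refreshing it. The sketch below targets a standard normal with an artificial log-normal noise on the density (chosen so the estimator has expectation equal to the true density); both the target and the noise scale are illustrative choices, not from the paper.

```python
import math
import random

# Sketch of the pseudo-marginal Metropolis-Hastings algorithm: the target
# density is only available through a noisy but unbiased estimator, and the
# current state's estimate is recycled across iterations.
rng = random.Random(0)

def log_density(x):
    return -0.5 * x * x              # unnormalised log N(0, 1)

def noisy_log_density(x, sigma=0.5):
    """Unbiased density estimator: multiply the density by exp(N(-s^2/2, s^2)),
    which has expectation 1, so the estimate is unbiased for the density."""
    noise = rng.gauss(-0.5 * sigma * sigma, sigma)
    return log_density(x) + noise

def pseudo_marginal_mh(n_iters=20_000, step=1.0):
    x = 0.0
    log_est = noisy_log_density(x)   # stored estimate for the current state
    samples = []
    for _ in range(n_iters):
        x_prop = x + rng.gauss(0.0, step)      # normal random walk proposal
        log_est_prop = noisy_log_density(x_prop)
        # Usual MH accept/reject, but with the *estimates*; the current
        # state's estimate is reused, never recomputed.
        if math.log(rng.random()) < log_est_prop - log_est:
            x, log_est = x_prop, log_est_prop
        samples.append(x)
    return samples

samples = pseudo_marginal_mh()
mean = sum(samples) / len(samples)
var = sum((s - mean) ** 2 for s in samples) / len(samples)
print(f"sample mean {mean:.3f}, variance {var:.3f} (target: 0, 1)")
```

The trade-off the abstract analyses is visible here: a larger noise scale (e.g. fewer Monte Carlo samples in the estimator) cheapens each iteration but makes the chain "stick" at states whose estimate happened to be unusually high, inflating the asymptotic variance of the ergodic averages.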

    Using a probabilistic approach to derive a two-phase model of flow-induced cell migration

    Interstitial fluid flow is a feature of many solid tumours. In vitro experiments have shown that such fluid flow can direct tumour cell movement upstream or downstream depending on the balance between the competing mechanisms of tensotaxis and autologous chemotaxis. In this work we develop a probabilistic-continuum, two-phase model for cell migration in response to interstitial flow. We use a Fokker-Planck-type equation for the cell-velocity probability density function, and model the flow-dependent mechanochemical stimulus as a forcing term which biases cell migration upstream or downstream. Using velocity-space averaging, we reformulate the model as a system of continuum equations for the spatio-temporal evolution of the cell volume fraction and flux, in response to forcing terms which depend on the local direction and magnitude of the mechanochemical cues. We specialise our model to describe a one-dimensional cell layer subject to fluid flow. Using a combination of numerical simulations and asymptotic analysis, we delineate the parameter regime in which transitions from downstream to upstream cell migration occur. As has been observed experimentally, the model predicts downstream-oriented, chemotactic migration at low cell volume fractions, and upstream-oriented, tensotactic migration at larger volume fractions. We show that the locus of the critical volume fraction, at which the system transitions from downstream to upstream migration, is dominated by the ratio of the rates of chemokine secretion and advection. Our model predicts that, because the tensotactic stimulus depends strongly on the cell volume fraction, upstream migration occurs only transiently when the cells are initially seeded, and transitions to downstream migration occur at later times due to the dispersive effect of cell diffusion. Comment: 20 pages, 6 figures. Submitted to Biophysical Journal.
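The modelling framework the abstract describes can be illustrated by the generic form of a kinetic (Fokker-Planck-type) equation for a cell-velocity probability density $p(\mathbf{x}, \mathbf{v}, t)$, with the mechanochemical stimulus entering as a forcing term $\mathbf{F}$ in velocity space. This is a standard textbook form given for orientation only, not the paper's specific equations:

```latex
\frac{\partial p}{\partial t}
  + \mathbf{v} \cdot \nabla_{\mathbf{x}}\, p
  + \nabla_{\mathbf{v}} \cdot \big( \mathbf{F}(\mathbf{x}, \mathbf{v}, t)\, p \big)
  = D\, \nabla_{\mathbf{v}}^{2}\, p,
```

where $D$ sets the strength of velocity fluctuations. Velocity-space averaging then yields continuum fields such as the cell volume fraction $n(\mathbf{x}, t) = \int p \,\mathrm{d}\mathbf{v}$ and the cell flux $\int \mathbf{v}\, p \,\mathrm{d}\mathbf{v}$, which is the reformulation step the abstract refers to.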