62 research outputs found

    Palliative radiotherapy in addition to self-expanding metal stent for improving dysphagia and survival in advanced oesophageal cancer (ROCS: Radiotherapy after Oesophageal Cancer Stenting): study protocol for a randomized controlled trial

    Background: The single most distressing symptom for patients with advanced esophageal cancer is dysphagia. Amongst the more effective treatments for relief of dysphagia is insertion of a self-expanding metal stent (SEMS). It is possible that the addition of a palliative dose of external beam radiotherapy may prolong the relief of dysphagia and provide additional survival benefit. The ROCS trial will assess the effect of adding palliative radiotherapy after esophageal stent insertion. Methods/Design: The study is a randomized multicenter phase III trial, with an internal pilot phase, comparing stent alone versus stent plus palliative radiotherapy in patients with incurable esophageal cancer. Eligible participants are those with advanced esophageal cancer who are in need of stent insertion for primary management of dysphagia. Radiotherapy will be administered as 20 Gray (Gy) in five fractions over one week or 30 Gy in 10 fractions over two weeks, within four weeks of stent insertion. The internal pilot will assess rates and methods of recruitment; pre-agreed criteria will determine progression to the main trial. In total, 496 patients will be randomized in a 1:1 ratio with follow up until death. The primary outcome is time to progression of patient-reported dysphagia. Secondary outcomes include survival, toxicity, health resource utilization, and quality of life. An embedded qualitative study will explore the feasibility of patient recruitment by examining patients’ motivations for involvement and their experiences of consent and recruitment, including reasons for not consenting. It will also explore patients’ experiences of each trial arm. Discussion: The ROCS study will be a challenging trial studying palliation in patients with a poor prognosis. The internal pilot design will optimize methods for recruitment and data collection to ensure that the main trial is completed on time. 
As a pragmatic trial, study strengths include collection of all follow-up data in the usual place of care, and a focus on patient-reported, rather than disease-orientated, outcomes. Exploration of patient experience and health economic analyses will be integral to the assessment of benefit for patients and the NHS.

    Development of an enhanced scoring system to predict ICU readmission or in-hospital death within 24 hours using routine patient data from two NHS Foundation Trusts

    Rationale: Intensive care units (ICUs) admit the most severely ill patients. Once these patients are discharged from the ICU to a step-down ward, they continue to have their vital signs monitored by nursing staff, with Early Warning Score (EWS) systems being used to identify those at risk of deterioration. Objectives: We report the development and validation of an enhanced continuous scoring system for predicting adverse events, which combines vital signs measured routinely on acute care wards (as used by most EWS systems) with a risk score of a future adverse event calculated on discharge from the ICU. Design: A modified Delphi process identified candidate variables commonly available in electronic records as the basis for a ‘static’ score of the patient’s condition immediately after discharge from the ICU. L1-regularised logistic regression was used to estimate the in-hospital risk of a future adverse event. We then constructed a model of physiological normality using vital sign data from the day of hospital discharge. This is combined with the static score and used continuously to quantify and update the patient’s risk of deterioration throughout their hospital stay. Setting: Data from two National Health Service Foundation Trusts (UK) were used to develop and (externally) validate the model. Participants: A total of 12 394 vital sign measurements were acquired from 273 patients after ICU discharge for the development set, and 4831 from 136 patients in the validation cohort. Results: Outcome validation of our model yielded an area under the receiver operating characteristic curve of 0.724 for predicting ICU readmission or in-hospital death within 24 hours. It showed improved performance compared with other risk scoring systems, including the National EWS (0.653). Conclusions: We showed that a scoring system incorporating data from a patient’s stay in the ICU has better performance than commonly used EWS systems based on vital signs alone.
Trial registration number: ISRCTN32008295
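The two-part scoring approach described above can be sketched in outline. This is a minimal illustration, not the paper's model: the feature names, coefficients and the blending rule below are all invented, whereas the paper derives the static score with L1-regularised logistic regression and combines it with a learned model of vital-sign normality.

```python
import math

# Hypothetical coefficients for the 'static' discharge risk score
# (illustrative only; the paper fits these with L1-regularised
# logistic regression over Delphi-selected variables).
STATIC_COEFS = {"age": 0.02, "icu_los_days": 0.05, "ventilated": 0.8}
STATIC_INTERCEPT = -4.0

def static_risk(features):
    """Logistic risk of an adverse event, fixed at ICU discharge."""
    z = STATIC_INTERCEPT + sum(STATIC_COEFS[k] * v for k, v in features.items())
    return 1.0 / (1.0 + math.exp(-z))

def combined_risk(static, vital_sign_abnormality, weight=0.5):
    """Continuously updated risk: blend the fixed static score with a
    time-varying vital-sign abnormality measure (0 = normal, 1 = grossly
    abnormal). The simple weighted blend is a placeholder, not the
    paper's model of physiological normality."""
    return weight * static + (1 - weight) * vital_sign_abnormality

s = static_risk({"age": 70, "icu_los_days": 10, "ventilated": 1})
print(round(s, 3))                    # static risk at ICU discharge
print(round(combined_risk(s, 0.9), 3))  # updated risk when vitals worsen
```

The point of the sketch is only to show how a score fixed at ICU discharge and a continuously recomputed vital-sign score can be merged into one running risk estimate that updates throughout the ward stay.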

    Antibiotics and antibiotic-resistant bacteria in waters associated with a hospital in Ujjain, India

    Background: Concerns have been raised about the public health implications of the presence of antibiotic residues in the aquatic environment and their effect on the development of bacterial resistance. While there is information on antibiotic residue levels in hospital effluent from some other countries, information on antibiotic residue levels in effluent from Indian hospitals is not available. Also, concurrent studies of antibiotic prescription quantities in a hospital and of antibiotic residue levels and resistant bacteria in the effluent of the same hospital are few. Therefore, we quantified antibiotic residues in waters associated with a hospital in India and assessed their association, if any, with the quantities of antibiotics prescribed in the hospital and the susceptibility of Escherichia coli found in the hospital effluent. Methods: This cross-sectional study was conducted in a teaching hospital outside the city of Ujjain in India. Seven antibiotics (amoxicillin, ceftriaxone, amikacin, ofloxacin, ciprofloxacin, norfloxacin and levofloxacin) were selected. Prescribed quantities were obtained from hospital records. Samples of the hospital-associated water were analysed for the above-mentioned antibiotics using a validated liquid chromatography/tandem mass spectrometry technique, after selectively isolating the analytes from the matrix by solid-phase extraction. Escherichia coli isolates from these waters were tested for antibiotic susceptibility by the standard Kirby-Bauer disc diffusion method using Clinical and Laboratory Standards Institute breakpoints. Results: Ciprofloxacin was the most prescribed antibiotic in the hospital and its residue levels in the hospital wastewater were also the highest. In samples of the municipal water supply and the groundwater, no antibiotics were detected.
There was a positive correlation between the quantity of antibiotics prescribed in the hospital and antibiotic residue levels in the hospital wastewater. Wastewater samples collected in the afternoon contained both a greater number and higher levels of antibiotics than samples collected in the morning. No amikacin was found in the wastewater, but E. coli isolates from all wastewater samples were resistant to amikacin. Although ciprofloxacin was the most prevalent antibiotic detected in the wastewater, E. coli was not resistant to it. Conclusions: Antibiotics are entering the aquatic environment of countries like India through hospital effluent. In-depth studies are needed to establish the correlation, if any, between the quantities of antibiotics prescribed in hospitals and the levels of antibiotic residues found in hospital effluent. Further, the effect of this on the development of bacterial resistance in the environment, and its subsequent public health impact, need thorough assessment.
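The reported association between prescribed quantities and wastewater residue levels is a correlation across the seven antibiotics. The sketch below illustrates the calculation with a hand-rolled Pearson coefficient; the prescription totals and residue concentrations are invented for illustration, since the abstract gives no numeric values.

```python
# Pearson correlation between prescribed quantity and wastewater residue
# level, one point per antibiotic. All figures below are hypothetical.
def pearson(xs, ys):
    """Pearson correlation coefficient of two equal-length sequences."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = sum((x - mx) ** 2 for x in xs) ** 0.5
    sy = sum((y - my) ** 2 for y in ys) ** 0.5
    return cov / (sx * sy)

# Hypothetical totals (g prescribed) and residue levels (ug/L), in the
# order: amoxicillin, ceftriaxone, amikacin, ofloxacin, ciprofloxacin,
# norfloxacin, levofloxacin.
prescribed_g = [120, 80, 40, 60, 300, 90, 50]
residue_ug_l = [5.1, 2.9, 0.0, 2.2, 14.8, 3.5, 1.9]
r = pearson(prescribed_g, residue_ug_l)
print(round(r, 2))  # a strongly positive r, as the study reports
```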

    Changes in grassland management and linear infrastructures associated with the decline of an endangered bird population

    European grassland birds are experiencing major population declines, mainly due to changes in farmland management. We analyzed the role of habitat availability, grazing management and linear infrastructures (roads and power lines) in explaining spatial and temporal variation in the population density of little bustards (Tetrax tetrax) in Portugal, during a decade in which the species' population size halved. We used data from 51 areas (totaling ca. 150,000 ha) that were sampled in two different periods (2003–2006 and 2016). In 2003–2006, when the species occurred at high densities, habitat availability was the only factor affecting spatial variation in bustard density. In the 2016 survey, variation in density was explained by habitat availability and livestock management, with reduced bird numbers in areas with higher proportions of cattle. Population declines across the study period were steeper in areas that initially held higher densities of bustards and in areas with a higher proportion of cattle in the total stocking rate. Areas with higher densities of power lines also registered greater density declines, probably due to avoidance behavior and increased mortality. Overall, our results show that little bustards currently lack high quality grassland habitat, whose persistence depends on extensive grazing regimes and low linear infrastructure densities.

    Magnetic resonance imaging of anterior cruciate ligament rupture

    BACKGROUND: Magnetic resonance (MR) imaging is a useful diagnostic tool for the assessment of knee joint injury, and anterior cruciate ligament repair is a commonly performed orthopaedic procedure. This paper examines the concordance between MR imaging and arthroscopic findings. METHODS: Between February 1996 and February 1998, 48 patients who underwent MR imaging of the knee were reported to have complete tears of the anterior cruciate ligament (ACL). Of the 48 patients, 36 were male and 12 female; the average age was 27 years (range: 15 to 45). Operative reconstruction using a patellar bone-tendon-bone autograft was arranged for each patient, and an arthroscopic examination was performed to confirm the diagnosis immediately prior to reconstructive surgery. RESULTS: In 16 of the 48 patients, reconstructive surgery was cancelled because only incomplete lesions were noted during arthroscopy, making reconstruction unnecessary. The remaining 32 patients were found to have complete tears of the ACL and therefore underwent reconstructive surgery. Using arthroscopy as an independent, reliable reference standard for the diagnosis of ACL tears, the true positive rate for complete ACL tear diagnosis with MR imaging was 67%, so false-positive reports of "complete ACL tear" are unavoidable with MR imaging alone. CONCLUSIONS: Since conservative treatment is sufficient for incomplete ACL tears, the decision to undertake ACL reconstruction should not be based on MR findings alone.
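The 67% figure can be reproduced from the counts given in the abstract. Since every patient in the series had an MR report of a complete tear, it is the proportion of MR diagnoses that arthroscopy confirmed:

```python
# Counts from the abstract: 48 patients with MR reports of complete
# ACL tear; arthroscopy confirmed a complete tear in 32 of them
# (the other 16 had incomplete lesions).
mr_positive = 48
confirmed_complete = 32
rate = confirmed_complete / mr_positive
print(f"{rate:.0%}")  # 67%
```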

    Evaluation of the effects of implementing an electronic early warning score system: protocol for a stepped wedge study

    Background: An Early Warning Score is a clinical risk score based upon vital signs intended to aid recognition of patients in need of urgent medical attention. The use of an escalation of care policy based upon an Early Warning Score is mandated as the standard of practice in British hospitals. Electronic systems for recording vital sign observations and Early Warning Score calculation offer theoretical benefits over paper-based systems. However, the evidence for their clinical benefit is limited, and previous studies have shown inconsistent results. The majority have employed a “before and after” study design, which may be strongly confounded by simultaneously occurring events. This study aims to examine how the implementation of an electronic early warning score system, System for Notification and Documentation (SEND), affects the recognition of clinical deterioration occurring in hospitalised adult patients. Methods: This study is a non-randomised stepped wedge evaluation carried out across the four hospitals of the Oxford University Hospitals NHS Trust, comparing charting on paper and charting using SEND. We assume that more frequent monitoring of acutely ill patients is associated with better recognition of patient deterioration. The primary outcome measure is the time between a patient’s first observation set with an Early Warning Score above the alerting threshold and their subsequent observation set. Secondary outcome measures are in-hospital mortality, cardiac arrest and Intensive Care admission rates, hospital length of stay and system usability measured using the System Usability Scale. We will also measure Intensive Care length of stay, Intensive Care mortality, and Acute Physiology and Chronic Health Evaluation (APACHE) II acute physiology score on admission, to examine whether the introduction of SEND has any effect on Intensive Care-related outcomes.
Discussion: The development of this protocol has been informed by guidance from the Agency for Healthcare Research and Quality (AHRQ) Health Information Technology Evaluation Toolkit and DeLone and McLean's Model of Information System Success. Our chosen trial design, a stepped wedge study, is well suited to the study of a phased roll-out. The choice of primary endpoint is challenging. We have selected the time from the first triggering observation set to the subsequent observation set. This has the benefit of being easy to measure on both paper and electronic charting and having a straightforward interpretation. We have collected qualitative measures of system quality via a user questionnaire and organisational descriptors to help readers understand the context in which SEND has been implemented.
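The primary endpoint, the interval from a patient's first above-threshold observation set to the next observation set, can be sketched as follows. The alerting threshold of 3 and the example timestamps are invented for illustration; the abstract does not state the threshold used in the SEND evaluation.

```python
from datetime import datetime, timedelta

# Assumed alerting threshold (not specified in the abstract).
ALERT_THRESHOLD = 3

def time_to_next_obs(observations):
    """observations: list of (timestamp, ews_score) in time order.
    Returns the interval between the first above-threshold observation
    set and the next observation set, or None if there is no such pair."""
    for i, (t, score) in enumerate(observations):
        if score > ALERT_THRESHOLD and i + 1 < len(observations):
            return observations[i + 1][0] - t
    return None

# Invented charting data for one patient.
obs = [
    (datetime(2016, 1, 1, 8, 0), 1),
    (datetime(2016, 1, 1, 12, 0), 4),   # first alerting observation set
    (datetime(2016, 1, 1, 13, 30), 5),
]
print(time_to_next_obs(obs))  # 1:30:00
```

A shorter interval indicates that the alerting observation prompted earlier re-assessment, which is the behaviour the study treats as better recognition of deterioration.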

    Enhancement strategies for transdermal drug delivery systems: current trends and applications

    Transdermal drug delivery systems have become an intriguing research topic in pharmaceutical technology and are among the most frequently developed pharmaceutical products on the global market. These systems can overcome drawbacks associated with other delivery routes, such as oral and parenteral administration. The authors review current trends and future applications of transdermal technologies, with a specific focus on providing a comprehensive understanding of transdermal drug delivery systems and their enhancement strategies. The article first discusses each transdermal enhancement method used in the development of first-generation transdermal products: drug/vehicle interactions, vesicles and particles, stratum corneum modification, energy-driven methods and stratum corneum bypassing techniques. Through suitable design and implementation of active stratum corneum bypassing methods, notably microneedle technology, transdermal delivery systems have been shown to deliver both low and high molecular weight drugs. Microneedle platforms have proven more versatile than other transdermal systems, with opportunities for intradermal delivery of drugs and biotherapeutics and for therapeutic drug monitoring, making microneedles a promising strategy for improving transdermal delivery systems. Graphical abstract: [Figure not available: see full text.]

    Effect of renal denervation on blood pressure in the presence of antihypertensive drugs: 6-month efficacy and safety results from the SPYRAL HTN-ON MED proof-of-concept randomised trial.

    Background: Previous catheter-based renal denervation studies have reported variable efficacy results. We aimed to evaluate safety and blood pressure response after renal denervation or sham control in patients with uncontrolled hypertension on antihypertensive medications, with drug adherence testing. Methods: In this international, randomised, single-blind, sham-controlled, proof-of-concept trial, patients with uncontrolled hypertension (aged 20-80 years) were enrolled at 25 centres in the USA, Germany, Japan, the UK, Australia, Austria, and Greece. Eligible patients had an office systolic blood pressure of between 150 mm Hg and 180 mm Hg and a diastolic blood pressure of 90 mm Hg or higher; a 24 h ambulatory systolic blood pressure of between 140 mm Hg and 170 mm Hg at second screening; and were on one to three antihypertensive drugs at stable doses for at least 6 weeks. Patients underwent renal angiography and were randomly assigned to undergo renal denervation or sham control. Patients, caregivers, and those assessing blood pressure were masked to randomisation assignments. The primary efficacy endpoint was blood pressure change from baseline (measured at screening visit two), based on ambulatory blood pressure measurements assessed at 6 months, as compared between treatment groups. Drug surveillance was used to assess medication adherence. The primary analysis was done in the intention-to-treat population. Safety events through 6 months were assessed in terms of major adverse events. This trial is registered with ClinicalTrials.gov, number NCT02439775, and follow-up is ongoing. Findings: Between July 22, 2015, and June 14, 2017, 467 patients were screened and enrolled. This analysis presents results for the first 80 patients randomly assigned to renal denervation (n=38) or sham control (n=42).
Office and 24 h ambulatory blood pressure decreased significantly from baseline to 6 months in the renal denervation group (mean baseline-adjusted treatment differences: 24 h systolic blood pressure -7·0 mm Hg, 95% CI -12·0 to -2·1, p=0·0059; 24 h diastolic blood pressure -4·3 mm Hg, -7·8 to -0·8, p=0·0174; office systolic blood pressure -6·6 mm Hg, -12·4 to -0·9, p=0·0250; and office diastolic blood pressure -4·2 mm Hg, -7·7 to -0·7, p=0·0190). The change in blood pressure was significantly greater at 6 months in the renal denervation group than in the sham-control group for office systolic blood pressure (difference -6·8 mm Hg, 95% CI -12·5 to -1·1; p=0·0205), 24 h systolic blood pressure (difference -7·4 mm Hg, -12·5 to -2·3; p=0·0051), office diastolic blood pressure (difference -3·5 mm Hg, -7·0 to -0·0; p=0·0478), and 24 h diastolic blood pressure (difference -4·1 mm Hg, -7·8 to -0·4; p=0·0292). Evaluation of hourly changes in 24 h systolic and diastolic blood pressure showed blood pressure reduction throughout the 24 h period in the renal denervation group. Blood pressure reductions at 3 months were not significantly different between groups. Medication adherence was about 60% and varied for individual patients throughout the study. No major adverse events were recorded in either group. Interpretation: Renal denervation in the main renal arteries and branches significantly reduced blood pressure compared with sham control, with no major safety events. Incomplete medication adherence was common. Funding: Medtronic.

    The mechanisms of action of vaccines containing aluminum adjuvants: an in vitro vs in vivo paradigm
