
    An evaluation of the development and use of a microcomputer assisted system for planning individualised adult literacy programmes in an adult basic education unit

    The thesis describes the development, implementation and evaluation of a computer assisted system for planning individualised adult literacy programmes in an adult basic education (ABE) unit located in an English College of Further Education. After examining past and current developments of computer-based applications in education, both in general and in literacy teaching applications, conclusions as to the appropriate use of computer-based learning in the proposed context are drawn. Human and hardware resources available in the ABE unit are detailed, and appropriate aims for a proposed system based on the earlier conclusions are set out. A possible system instructional model is discussed via details of the current teaching, monitoring and evaluation activities of the unit. An examination of the current theory, practice and literature relating to literacy and adult literacy teaching supports the conclusion that a student-centred approach, in a real-world context, using a common core curriculum, is most suitable. A detailed common-core curriculum model for teaching adult literacy is then proposed, following which a Warnier-Orr design exercise of a computer-based system known as MALCM, using the model, is described, from initial considerations through to system testing. The implementation and evaluation of the MALCM system in the setting of the ABE unit is then described in the form of a case study. The reported and observed experiences of staff involved are analysed, and the appropriateness of the case study as a means for evaluation is discussed. The thesis concludes by endorsing the potential for a system such as MALCM but underlines the need for user involvement in any computer-based learning management development. It suggests that further development of the MALCM system as currently constituted is non-viable without considerable refinements to take account of developments in the field of hardware and intelligent knowledge-based systems.

    The development and validation of a scoring tool to predict the operative duration of elective laparoscopic cholecystectomy

    Background: The ability to accurately predict operative duration has the potential to optimise theatre efficiency and utilisation, thus reducing costs and increasing staff and patient satisfaction. With laparoscopic cholecystectomy being one of the most commonly performed procedures worldwide, a tool to predict operative duration could be extremely beneficial to healthcare organisations. Methods: Data collected from the CholeS study on patients undergoing cholecystectomy in UK and Irish hospitals between 04/2014 and 05/2014 were used to study operative duration. A multivariable binary logistic regression model was produced in order to identify significant independent predictors of long (> 90 min) operations. The resulting model was converted to a risk score, which was subsequently validated on a second cohort of patients using ROC curves. Results: After exclusions, data were available for 7227 patients in the derivation (CholeS) cohort. The median operative duration was 60 min (interquartile range 45–85), with 17.7% of operations lasting longer than 90 min. Ten factors were found to be significant independent predictors of operative durations > 90 min, including ASA, age, previous surgical admissions, BMI, gallbladder wall thickness and CBD diameter. A risk score was then produced from these factors and applied to a cohort of 2405 patients from a tertiary centre for external validation. This returned an area under the ROC curve of 0.708 (SE = 0.013), with the proportion of operations lasting > 90 min increasing more than eightfold, from 5.1% to 41.8%, between the extremes of the score. Conclusion: The scoring tool produced in this study was found to be significantly predictive of long operative durations on validation in an external cohort. As such, the tool may have the potential to enable organisations to better organise theatre lists and deliver greater efficiencies in care.
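    The derivation-and-validation pipeline described above (fit a multivariable logistic model, convert its coefficients to integer score points, then check discrimination on an external cohort with ROC AUC) can be sketched as follows. The coefficients, predictors and cohorts here are entirely synthetic assumptions, not CholeS data; only the shape of the workflow mirrors the abstract.

    ```python
    import numpy as np

    rng = np.random.default_rng(0)

    # Assume coefficients already estimated by multivariable logistic
    # regression on a derivation cohort (hypothetical values, two predictors).
    coef = np.array([0.8, 0.5])
    # One common simplification: rescale log-odds to small integer points.
    points = np.round(coef / np.abs(coef).min()).astype(int)  # -> [2, 1]

    # Synthetic "external validation cohort": outcome is a long (> 90 min)
    # operation, generated from the same hypothetical logistic model.
    Xv = rng.normal(size=(1000, 2))
    p_long = 1 / (1 + np.exp(-(Xv @ coef - 1.5)))
    yv = (rng.random(1000) < p_long).astype(int)
    score = Xv @ points

    # ROC AUC via the Mann-Whitney statistic: the probability that a random
    # "long" case scores higher than a random "short" one (ties count half).
    pos, neg = score[yv == 1], score[yv == 0]
    auc = (pos[:, None] > neg[None, :]).mean() + \
          0.5 * (pos[:, None] == neg[None, :]).mean()
    print(f"validation AUC = {auc:.3f}")
    ```

    With a real score, the same AUC computation on the held-out cohort is what yields figures like the 0.708 reported above.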

    Global patient outcomes after elective surgery: prospective cohort study in 27 low-, middle- and high-income countries.

    BACKGROUND: As global initiatives increase patient access to surgical treatments, there remains a need to understand the adverse effects of surgery and define appropriate levels of perioperative care. METHODS: We designed a prospective international 7-day cohort study of outcomes following elective adult inpatient surgery in 27 countries. The primary outcome was in-hospital complications. Secondary outcomes were death following a complication (failure to rescue) and death in hospital. Process measures were admission to critical care immediately after surgery or to treat a complication and duration of hospital stay. A single definition of critical care was used for all countries. RESULTS: A total of 474 hospitals in 19 high-, 7 middle- and 1 low-income country were included in the primary analysis. Data included 44 814 patients with a median hospital stay of 4 (range 2-7) days. A total of 7508 patients (16.8%) developed one or more postoperative complication and 207 died (0.5%). The overall mortality among patients who developed complications was 2.8%. Mortality following complications ranged from 2.4% for pulmonary embolism to 43.9% for cardiac arrest. A total of 4360 (9.7%) patients were admitted to a critical care unit as routine immediately after surgery, of whom 2198 (50.4%) developed a complication, with 105 (2.4%) deaths. A total of 1233 patients (16.4%) were admitted to a critical care unit to treat complications, with 119 (9.7%) deaths. Despite lower baseline risk, outcomes were similar in low- and middle-income compared with high-income countries. CONCLUSIONS: Poor patient outcomes are common after inpatient surgery. Global initiatives to increase access to surgical treatments should also address the need for safe perioperative care. STUDY REGISTRATION: ISRCTN5181700

    Large expert-curated database for benchmarking document similarity detection in biomedical literature search

    Document recommendation systems for locating relevant literature have mostly relied on methods developed a decade ago. This is largely due to the lack of a large offline gold-standard benchmark of relevant documents that covers a variety of research fields, such that newly developed literature search techniques can be compared, improved and translated into practice. To overcome this bottleneck, we have established the RElevant LIterature SearcH consortium, consisting of more than 1500 scientists from 84 countries, who have collectively annotated the relevance of over 180 000 PubMed-listed articles with regard to their respective seed (input) article(s). The majority of annotations were contributed by highly experienced, original authors of the seed articles. The collected data cover 76% of all unique PubMed Medical Subject Headings descriptors. No systematic biases were observed across different experience levels, research fields or time spent on annotations. More importantly, annotations of the same document pairs contributed by different scientists were highly concordant. We further show that the three representative baseline methods used to generate recommended articles for evaluation (Okapi Best Matching 25, Term Frequency-Inverse Document Frequency and PubMed Related Articles) had similar overall performances. Additionally, we found that these methods each tend to produce distinct collections of recommended articles, suggesting that a hybrid method may be required to completely capture all relevant articles. The established database server located at https://relishdb.ict.griffith.edu.au is freely available for the downloading of annotation data and the blind testing of new methods. We expect that this benchmark will be useful for stimulating the development of new powerful techniques for title and title/abstract-based search engines for relevant articles in biomedical research.
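    Of the three baselines named above, TF-IDF with cosine similarity is the simplest to illustrate. The toy corpus and query below are hypothetical (not RELISH data), and the other two baselines, BM25 and PubMed Related Articles, use different weighting schemes than this sketch.

    ```python
    import math
    from collections import Counter

    docs = [
        "gene expression in tumor cells",
        "tumor suppressor gene mutations",
        "weather patterns and climate models",
    ]
    query = "tumor gene regulation"

    # Inverse document frequency over the toy corpus: rare terms weigh more.
    df = Counter(w for d in docs for w in set(d.split()))
    idf = {w: math.log(len(docs) / c) for w, c in df.items()}

    def tfidf(text):
        # Term frequency times inverse document frequency, as a sparse dict.
        tf = Counter(text.split())
        return {w: c * idf.get(w, 0.0) for w, c in tf.items()}

    def cosine(a, b):
        dot = sum(v * b.get(w, 0.0) for w, v in a.items())
        na = math.sqrt(sum(v * v for v in a.values()))
        nb = math.sqrt(sum(v * v for v in b.values()))
        return dot / (na * nb) if na and nb else 0.0

    q = tfidf(query)
    ranked = sorted(docs, key=lambda d: cosine(q, tfidf(d)), reverse=True)
    print(ranked)
    ```

    The off-topic document scores zero because it shares no terms with the query, which is exactly the brittleness that motivates hybrid or learned alternatives.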

    Percutaneous revascularization for ischemic left ventricular dysfunction: Cost-effectiveness analysis of the REVIVED-BCIS2 trial

    BACKGROUND: Percutaneous coronary intervention (PCI) is frequently undertaken in patients with ischemic left ventricular systolic dysfunction. The REVIVED (Revascularization for Ischemic Ventricular Dysfunction)-BCIS2 (British Cardiovascular Intervention Society-2) trial concluded that PCI did not reduce the incidence of all-cause death or heart failure hospitalization; however, patients assigned to PCI reported better initial health-related quality of life than those assigned to optimal medical therapy (OMT) alone. The aim of this study was to assess the cost-effectiveness of PCI+OMT compared with OMT alone. METHODS: REVIVED-BCIS2 was a prospective, multicenter UK trial, which randomized patients with severe ischemic left ventricular systolic dysfunction to either PCI+OMT or OMT alone. Health care resource use (including planned and unplanned revascularizations, medication, device implantation, and heart failure hospitalizations) and health outcomes data (EuroQol 5-dimension 5-level questionnaire) on each patient were collected at baseline and up to 8 years post-randomization. Resource use was costed using publicly available national unit costs. Within the trial, mean total costs and quality-adjusted life-years (QALYs) were estimated from the perspective of the UK health system. Cost-effectiveness was evaluated using estimated mean costs and QALYs in both groups. Regression analysis was used to adjust for clinically relevant predictors. RESULTS: Between 2013 and 2020, 700 patients were recruited (mean age: PCI+OMT = 70 years, OMT = 68 years; male: PCI+OMT = 87%, OMT = 88%); median follow-up was 3.4 years. Over all follow-up, patients undergoing PCI gained similar health benefits at higher costs compared with OMT alone (PCI+OMT: 4.14 QALYs, £22 352; OMT alone: 4.16 QALYs, £15 569; difference: −0.015 QALYs, £6782). For both groups, most health resource consumption occurred in the first 2 years post-randomization. Probabilistic analysis showed that the probability of PCI being cost-effective was 0. CONCLUSIONS: A minimal difference in total QALYs was identified between arms, and PCI+OMT was not cost-effective compared with OMT alone, given its additional cost. A strategy of routine PCI to treat ischemic left ventricular systolic dysfunction does not appear to be a justifiable use of health care resources in the United Kingdom.
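    The headline comparison reduces to standard incremental cost-effectiveness arithmetic on the mean costs and QALYs reported above. The `compare` helper below is illustrative, not code from the REVIVED analysis, and it uses the unadjusted means (the trial's −0.015 QALY difference is regression-adjusted).

    ```python
    def compare(cost_new, qaly_new, cost_old, qaly_old):
        """Classify a new strategy against a comparator on the cost/QALY plane."""
        d_cost = cost_new - cost_old
        d_qaly = qaly_new - qaly_old
        if d_cost > 0 and d_qaly <= 0:
            return "dominated"   # costs more, no health gain: never cost-effective
        if d_cost <= 0 and d_qaly >= 0:
            return "dominant"    # cheaper and at least as effective
        # Otherwise report the incremental cost-effectiveness ratio.
        return f"ICER = {d_cost / d_qaly:,.0f} per QALY"

    # Values from the abstract: PCI+OMT (£22 352, 4.14 QALYs) vs OMT alone
    # (£15 569, 4.16 QALYs). PCI costs more and yields slightly fewer QALYs.
    print(compare(22352, 4.14, 15569, 4.16))  # -> dominated
    ```

    A dominated strategy has a zero probability of being cost-effective at any willingness-to-pay threshold, which is consistent with the probabilistic result reported above.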

    Arrhythmia and death following percutaneous revascularization in ischemic left ventricular dysfunction: Prespecified analyses from the REVIVED-BCIS2 trial

    BACKGROUND: Ventricular arrhythmia is an important cause of mortality in patients with ischemic left ventricular dysfunction. Revascularization with coronary artery bypass graft or percutaneous coronary intervention is often recommended for these patients before implantation of a cardiac defibrillator because it is assumed that this may reduce the incidence of fatal and potentially fatal ventricular arrhythmias, although this premise has not been evaluated in a randomized trial to date. METHODS: Patients with severe left ventricular dysfunction, extensive coronary disease, and viable myocardium were randomly assigned to receive either percutaneous coronary intervention (PCI) plus optimal medical and device therapy (OMT) or OMT alone. The composite primary outcome was all-cause death or aborted sudden death (defined as an appropriate implantable cardioverter defibrillator [ICD] therapy or a resuscitated cardiac arrest) at a minimum of 24 months, analyzed as time to first event on an intention-to-treat basis. Secondary outcomes included cardiovascular death or aborted sudden death, appropriate ICD therapy or sustained ventricular arrhythmia, and number of appropriate ICD therapies. RESULTS: Between August 28, 2013, and March 19, 2020, 700 patients were enrolled across 40 centers in the United Kingdom. A total of 347 patients were assigned to the PCI+OMT group and 353 to the OMT alone group. The mean age of participants was 69 years; 88% were male; 56% had hypertension; 41% had diabetes; and 53% had a clinical history of myocardial infarction. The median left ventricular ejection fraction was 28%; 53.1% had an ICD inserted before randomization or during follow-up. All-cause death or aborted sudden death occurred in 144 patients (41.6%) in the PCI group and 142 patients (40.2%) in the OMT group (hazard ratio, 1.03 [95% CI, 0.82–1.30]; P = 0.80). There was no between-group difference in the occurrence of any of the secondary outcomes. CONCLUSIONS: PCI was not associated with a reduction in all-cause mortality or aborted sudden death. In patients with ischemic cardiomyopathy, PCI is not beneficial solely for the purpose of reducing potentially fatal ventricular arrhythmias. REGISTRATION: URL: https://www.clinicaltrials.gov ; Unique identifier: NCT01920048

    A blood atlas of COVID-19 defines hallmarks of disease severity and specificity.

    Treatment of severe COVID-19 is currently limited by clinical heterogeneity and incomplete description of specific immune biomarkers. We present here a comprehensive multi-omic blood atlas for patients with varying COVID-19 severity in an integrated comparison with influenza and sepsis patients versus healthy volunteers. We identify immune signatures and correlates of host response. Hallmarks of disease severity involved cells, their inflammatory mediators and networks, including progenitor cells and specific myeloid and lymphocyte subsets, features of the immune repertoire, acute phase response, metabolism, and coagulation. Persisting immune activation involving AP-1/p38MAPK was a specific feature of COVID-19. The plasma proteome enabled sub-phenotyping into patient clusters, predictive of severity and outcome. Systems-based integrative analyses, including tensor and matrix decomposition of all modalities, revealed feature groupings linked with severity and specificity compared to influenza and sepsis. Our approach and blood atlas will support future drug development, clinical trial design, and personalized medicine approaches for COVID-19.

    Effect of angiotensin-converting enzyme inhibitor and angiotensin receptor blocker initiation on organ support-free days in patients hospitalized with COVID-19

    IMPORTANCE Overactivation of the renin-angiotensin system (RAS) may contribute to poor clinical outcomes in patients with COVID-19. OBJECTIVE To determine whether angiotensin-converting enzyme (ACE) inhibitor or angiotensin receptor blocker (ARB) initiation improves outcomes in patients hospitalized for COVID-19. DESIGN, SETTING, AND PARTICIPANTS In an ongoing, adaptive platform randomized clinical trial, 721 critically ill and 58 non–critically ill hospitalized adults were randomized to receive an RAS inhibitor or control between March 16, 2021, and February 25, 2022, at 69 sites in 7 countries (final follow-up on June 1, 2022). INTERVENTIONS Patients were randomized to receive open-label initiation of an ACE inhibitor (n = 257), ARB (n = 248), ARB in combination with DMX-200 (a chemokine receptor-2 inhibitor; n = 10), or no RAS inhibitor (control; n = 264) for up to 10 days. MAIN OUTCOMES AND MEASURES The primary outcome was organ support–free days, a composite of hospital survival and days alive without cardiovascular or respiratory organ support through 21 days. The primary analysis was a bayesian cumulative logistic model. Odds ratios (ORs) greater than 1 represent improved outcomes. RESULTS On February 25, 2022, enrollment was discontinued due to safety concerns. Among 679 critically ill patients with available primary outcome data, the median age was 56 years and 239 participants (35.2%) were women. Median (IQR) organ support–free days among critically ill patients was 10 (−1 to 16) in the ACE inhibitor group (n = 231), 8 (−1 to 17) in the ARB group (n = 217), and 12 (0 to 17) in the control group (n = 231) (median adjusted odds ratios of 0.77 [95% bayesian credible interval, 0.58–1.06] for improvement for ACE inhibitor and 0.76 [95% credible interval, 0.56–1.05] for ARB compared with control). The posterior probabilities that ACE inhibitors and ARBs worsened organ support–free days compared with control were 94.9% and 95.4%, respectively. Hospital survival occurred in 166 of 231 critically ill participants (71.9%) in the ACE inhibitor group, 152 of 217 (70.0%) in the ARB group, and 182 of 231 (78.8%) in the control group (posterior probabilities that ACE inhibitor and ARB worsened hospital survival compared with control were 95.3% and 98.1%, respectively). CONCLUSIONS AND RELEVANCE In this trial, among critically ill adults with COVID-19, initiation of an ACE inhibitor or ARB did not improve, and likely worsened, clinical outcomes. TRIAL REGISTRATION ClinicalTrials.gov Identifier: NCT0273570
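    The reported posterior probabilities of harm can be roughly cross-checked from the published credible intervals using a normal approximation on the log-odds scale. The trial's actual analysis was a full Bayesian cumulative logistic model, so this is only an approximate reconstruction, not the study's method.

    ```python
    import math

    def p_harm(or_median, ci_lo, ci_hi):
        """Approximate P(OR < 1), i.e. worse outcomes here, assuming the
        posterior for log(OR) is normal with the given 95% credible interval."""
        mu = math.log(or_median)
        sd = (math.log(ci_hi) - math.log(ci_lo)) / (2 * 1.96)
        z = (0.0 - mu) / sd
        # Standard normal CDF via the error function.
        return 0.5 * (1 + math.erf(z / math.sqrt(2)))

    # Credible intervals for organ support-free days, as reported above.
    print(f"ACE inhibitor: {p_harm(0.77, 0.58, 1.06):.1%}")
    print(f"ARB:           {p_harm(0.76, 0.56, 1.05):.1%}")
    ```

    The approximation lands within about a point of the reported 94.9% and 95.4%, which is expected since the published intervals summarize a posterior that is close to log-normal.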

    31st Annual Meeting and Associated Programs of the Society for Immunotherapy of Cancer (SITC 2016) : part two

    Background The immunological escape of tumors represents one of the main obstacles to the treatment of malignancies. The blockade of PD-1 or CTLA-4 receptors represented a milestone in the history of immunotherapy. However, immune checkpoint inhibitors seem to be effective only in specific cohorts of patients. It has been proposed that their efficacy relies on the presence of an immunological response. Thus, we hypothesized that disruption of the PD-L1/PD-1 axis would synergize with our oncolytic vaccine platform PeptiCRAd. Methods We used murine B16OVA in vivo tumor models and flow cytometry analysis to investigate the immunological background. Results First, we found that high-burden B16OVA tumors were refractory to combination immunotherapy. However, with a more aggressive schedule, tumors with a lower burden were more susceptible to the combination of PeptiCRAd and PD-L1 blockade. The therapy significantly increased the median survival of mice (Fig. 7). Interestingly, the reduced growth of contralaterally injected B16F10 cells suggested the presence of a long-lasting immunological memory also against non-targeted antigens. Concerning the functional state of tumor infiltrating lymphocytes (TILs), we found that all the immune therapies enhanced the percentage of activated (PD-1pos TIM-3neg) T lymphocytes and reduced the amount of exhausted (PD-1pos TIM-3pos) cells compared to placebo. As expected, we found that PeptiCRAd monotherapy could increase the number of antigen-specific CD8+ T cells compared to other treatments. However, only the combination with PD-L1 blockade could significantly increase the ratio between activated and exhausted pentamer-positive cells (p = 0.0058), suggesting that by disrupting the PD-1/PD-L1 axis we could decrease the amount of dysfunctional antigen-specific T cells. We observed that the anatomical location deeply influenced the state of CD4+ and CD8+ T lymphocytes. In fact, TIM-3 expression was increased 2-fold on TILs compared to splenic and lymphoid T cells. In the CD8+ compartment, the expression of PD-1 on the surface seemed to be restricted to the tumor micro-environment, while CD4+ T cells had high expression of PD-1 also in lymphoid organs. Interestingly, we found that the levels of PD-1 were significantly higher on CD8+ T cells than on CD4+ T cells in the tumor micro-environment (p < 0.0001). Conclusions We demonstrated that the efficacy of immune checkpoint inhibitors might be strongly enhanced by their combination with cancer vaccines. PeptiCRAd was able to increase the number of antigen-specific T cells, and PD-L1 blockade prevented their exhaustion, resulting in long-lasting immunological memory and increased median survival.