
    Vaccination with Plasmodium vivax Duffy-binding protein inhibits parasite growth during controlled human malaria infection

    There are no licensed vaccines against Plasmodium vivax. We conducted two phase 1/2a clinical trials to assess two vaccines targeting P. vivax Duffy-binding protein region II (PvDBPII). Recombinant viral vaccines using chimpanzee adenovirus 63 (ChAd63) and modified vaccinia virus Ankara (MVA) vectors, as well as a protein and adjuvant formulation (PvDBPII/Matrix-M), were tested in both a standard and a delayed dosing regimen. Volunteers underwent controlled human malaria infection (CHMI) after their last vaccination, alongside unvaccinated controls. Efficacy was assessed by comparing parasite multiplication rates in the blood. PvDBPII/Matrix-M, given in a delayed dosing regimen, elicited the highest antibody responses and reduced the mean parasite multiplication rate after CHMI by 51% (n = 6) compared with unvaccinated controls (n = 13), whereas no other vaccine or regimen affected parasite growth. Both viral-vectored and protein vaccines were well tolerated and elicited expected, short-lived adverse events. Together, these results support further clinical evaluation of the PvDBPII/Matrix-M P. vivax vaccine.
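    For context on the efficacy metric, the parasite multiplication rate (PMR) in CHMI studies is commonly derived from a log-linear fit to serial qPCR parasitaemia measurements and expressed as fold-growth per 48-hour replication cycle. The sketch below illustrates only that common definition; the trial's actual statistical model is not reproduced here, and the function and argument names are placeholders.

```python
import numpy as np

def parasite_multiplication_rate(days, parasites_per_ml):
    """Fold-change in parasitaemia per 48 h, from a log-linear fit of
    log10(qPCR parasitaemia) against time (a common PMR definition)."""
    days = np.asarray(days, dtype=float)
    log_parasitaemia = np.log10(np.asarray(parasites_per_ml, dtype=float))
    slope_per_day, _intercept = np.polyfit(days, log_parasitaemia, 1)
    return 10 ** (slope_per_day * 2.0)  # growth over one 48 h cycle

# Example: parasitaemia growing tenfold every two days gives a PMR of ~10.
# parasite_multiplication_rate([0, 2, 4, 6], [100, 1_000, 10_000, 100_000])
```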

    Studies of jet mass in dijet and W/Z plus jet events

    Invariant mass spectra for jets reconstructed using the anti-kT and Cambridge-Aachen algorithms are studied for different jet “grooming” techniques in data corresponding to an integrated luminosity of 5 fb⁻¹, recorded with the CMS detector in proton-proton collisions at the LHC at a center-of-mass energy of 7 TeV. Leading-order QCD predictions for inclusive dijet and W/Z+jet production combined with parton-shower Monte Carlo models are found to agree overall with the data, and the agreement improves with the implementation of jet grooming methods used to distinguish merged jets of large transverse momentum from softer QCD gluon radiation.
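    As background for the observables studied, the anti-kT algorithm clusters particles by repeatedly merging the pair with the smallest inverse-pT-weighted distance, and a jet's mass is the invariant mass of the summed four-momenta of its constituents. The toy sketch below is illustrative only: analyses of this kind typically use the FastJet implementations, and the grooming techniques discussed in the paper are not shown.

```python
import numpy as np

def antikt_cluster(particles, R=0.7):
    """Toy anti-kT clustering (E-scheme recombination). particles: iterable of
    four-momenta (E, px, py, pz). Returns the final jets as summed four-momenta.
    O(n^3) pure Python, fine only for a handful of particles."""
    objs = [np.asarray(p, dtype=float) for p in particles]
    jets = []

    def pt_y_phi(p):
        E, px, py, pz = p
        pt = np.hypot(px, py)
        y = 0.5 * np.log((E + pz) / (E - pz))  # rapidity (assumes E > |pz|)
        phi = np.arctan2(py, px)
        return pt, y, phi

    while objs:
        kin = [pt_y_phi(p) for p in objs]
        best, best_d = None, np.inf
        for i, (pti, yi, phii) in enumerate(kin):
            if pti ** -2 < best_d:                       # beam distance d_iB
                best, best_d = ("beam", i, None), pti ** -2
            for j in range(i + 1, len(kin)):
                ptj, yj, phij = kin[j]
                dphi = np.mod(phii - phij + np.pi, 2 * np.pi) - np.pi
                dij = min(pti ** -2, ptj ** -2) * ((yi - yj) ** 2 + dphi ** 2) / R ** 2
                if dij < best_d:                         # pairwise distance d_ij
                    best, best_d = ("pair", i, j), dij
        kind, i, j = best
        if kind == "beam":
            jets.append(objs.pop(i))                     # declare i a final jet
        else:
            merged = objs[i] + objs[j]                   # E-scheme: add four-momenta
            objs = [p for k, p in enumerate(objs) if k not in (i, j)]
            objs.append(merged)
    return jets

def jet_mass(p):
    """Invariant mass m = sqrt(E^2 - |p|^2) of a jet four-momentum."""
    E, px, py, pz = p
    return float(np.sqrt(max(E * E - px * px - py * py - pz * pz, 0.0)))
```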

    The development and validation of a scoring tool to predict the operative duration of elective laparoscopic cholecystectomy

    Background: The ability to accurately predict operative duration has the potential to optimise theatre efficiency and utilisation, thus reducing costs and increasing staff and patient satisfaction. With laparoscopic cholecystectomy being one of the most commonly performed procedures worldwide, a tool to predict operative duration could be extremely beneficial to healthcare organisations. Methods: Data collected from the CholeS study on patients undergoing cholecystectomy in UK and Irish hospitals between 04/2014 and 05/2014 were used to study operative duration. A multivariable binary logistic regression model was produced in order to identify significant independent predictors of long (> 90 min) operations. The resulting model was converted to a risk score, which was subsequently validated on a second cohort of patients using ROC curves. Results: After exclusions, data were available for 7227 patients in the derivation (CholeS) cohort. The median operative duration was 60 min (interquartile range 45–85), with 17.7% of operations lasting longer than 90 min. Ten factors were found to be significant independent predictors of operative durations > 90 min, including ASA, age, previous surgical admissions, BMI, gallbladder wall thickness and CBD diameter. A risk score was then produced from these factors and applied to a cohort of 2405 patients from a tertiary centre for external validation. This returned an area under the ROC curve of 0.708 (SE = 0.013), with the proportion of operations lasting > 90 min increasing more than eightfold, from 5.1% to 41.8%, between the extremes of the score. Conclusion: The scoring tool produced in this study was found to be significantly predictive of long operative durations on validation in an external cohort. As such, the tool may have the potential to enable organisations to better organise theatre lists and deliver greater efficiencies in care.
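    A hedged sketch of the general workflow described in the Methods: fit a multivariable logistic model for operations lasting more than 90 minutes, convert the coefficients into integer points to form a score, and check discrimination on an external cohort with the area under the ROC curve. The predictor names, the points scheme and the column layout below are illustrative assumptions, not the published CholeS score.

```python
import numpy as np
import pandas as pd
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import roc_auc_score

def derive_and_validate_score(derivation: pd.DataFrame, validation: pd.DataFrame,
                              predictors: list) -> dict:
    """Fit a logistic model for operations > 90 min on the derivation cohort,
    turn coefficients into integer points, and report derivation/validation AUC.
    Assumes a binary column 'duration_gt_90min' and numeric predictor columns."""
    y_dev = derivation["duration_gt_90min"]
    model = LogisticRegression(max_iter=1000).fit(derivation[predictors], y_dev)

    # Simple points scheme for illustration: scale coefficients by the smallest
    # absolute coefficient and round to integers.
    coefs = model.coef_.ravel()
    points = np.round(coefs / np.abs(coefs).min()).astype(int)

    score_dev = derivation[predictors].to_numpy() @ points
    score_val = validation[predictors].to_numpy() @ points
    return {
        "points_per_predictor": dict(zip(predictors, points)),
        "derivation_auc": roc_auc_score(y_dev, score_dev),
        "validation_auc": roc_auc_score(validation["duration_gt_90min"], score_val),
    }
```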

    The impact of immediate breast reconstruction on the time to delivery of adjuvant therapy: the iBRA-2 study

    Background: Immediate breast reconstruction (IBR) is routinely offered to improve quality of life for women requiring mastectomy, but there are concerns that more complex surgery may delay adjuvant oncological treatments and compromise long-term outcomes. High-quality evidence is lacking. The iBRA-2 study aimed to investigate the impact of IBR on time to adjuvant therapy. Methods: Consecutive women undergoing mastectomy ± IBR for breast cancer between July and December 2016 were included. Patient demographics, operative, oncological and complication data were collected. Time from last definitive cancer surgery to first adjuvant treatment for patients undergoing mastectomy ± IBR was compared, and risk factors associated with delays were explored. Results: A total of 2540 patients were recruited from 76 centres; 1008 (39.7%) underwent IBR (implant-only [n = 675, 26.6%]; pedicled flaps [n = 105, 4.1%] and free flaps [n = 228, 8.9%]). Complications requiring re-admission or re-operation were significantly more common in patients undergoing IBR than in those undergoing mastectomy alone. Adjuvant chemotherapy or radiotherapy was required by 1235 (48.6%) patients. No clinically significant differences were seen in time to adjuvant therapy between patient groups, but major complications, irrespective of the surgery received, were significantly associated with treatment delays. Conclusions: IBR does not result in clinically significant delays to adjuvant therapy, but post-operative complications are associated with treatment delays. Strategies to minimise complications, including careful patient selection, are required to improve outcomes for patients.

    Impact of opioid-free analgesia on pain severity and patient satisfaction after discharge from surgery: multispecialty, prospective cohort study in 25 countries

    Background: Balancing opioid stewardship and the need for adequate analgesia following discharge after surgery is challenging. This study aimed to compare the outcomes for patients discharged with opioid versus opioid-free analgesia after common surgical procedures. Methods: This international, multicentre, prospective cohort study collected data from patients undergoing common acute and elective general surgical, urological, gynaecological, and orthopaedic procedures. The primary outcomes were patient-reported time in severe pain, measured on a numerical analogue scale from 0 to 100%, and patient-reported satisfaction with pain relief during the first week following discharge. Data were collected by in-hospital chart review and patient telephone interview 1 week after discharge. Results: The study recruited 4273 patients from 144 centres in 25 countries; 1311 patients (30.7%) were prescribed opioid analgesia at discharge. Patients reported being in severe pain for 10 (i.q.r. 1–30)% of the first week after discharge and rated satisfaction with analgesia as 90 (i.q.r. 80–100) of 100. After adjustment for confounders, opioid analgesia on discharge was independently associated with increased pain severity (risk ratio 1.52, 95% c.i. 1.31 to 1.76; P < 0.001) and re-presentation to healthcare providers owing to side-effects of medication (OR 2.38, 95% c.i. 1.36 to 4.17; P = 0.004), but not with satisfaction with analgesia (beta coefficient 0.92, 95% c.i. −1.52 to 3.36; P = 0.468), compared with opioid-free analgesia. Although opioid prescribing varied greatly between high-income and low- and middle-income countries, patient-reported outcomes did not. Conclusion: Opioid analgesia prescription on surgical discharge is associated with a higher risk of re-presentation owing to side-effects of medication and increased patient-reported pain, but not with changes in patient-reported satisfaction. Opioid-free discharge analgesia should be adopted routinely.
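    As a rough illustration of how an adjusted risk ratio such as the 1.52 reported above can be obtained, one common approach for binary outcomes is a modified Poisson regression with robust (sandwich) standard errors. The formula, variable names and confounder list below are assumptions made for the sketch; they are not the study's actual model or codebook.

```python
import numpy as np
import pandas as pd
import statsmodels.api as sm
import statsmodels.formula.api as smf

def adjusted_risk_ratio(df: pd.DataFrame):
    """Modified Poisson regression (Poisson family + robust errors) for a binary
    outcome; exponentiated coefficients are interpreted as risk ratios.
    Assumes 0/1 columns 'severe_pain' and 'opioid_on_discharge', plus
    hypothetical confounders 'age', 'procedure_type', 'country_income'."""
    model = smf.glm(
        "severe_pain ~ opioid_on_discharge + age + procedure_type + country_income",
        data=df,
        family=sm.families.Poisson(),
    ).fit(cov_type="HC1")  # robust errors compensate for Poisson on binary data
    rr = np.exp(model.params["opioid_on_discharge"])
    ci_low, ci_high = np.exp(model.conf_int().loc["opioid_on_discharge"])
    return rr, (ci_low, ci_high)
```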

    Plasma Neurofilament Light for Prediction of Disease Progression in Familial Frontotemporal Lobar Degeneration

    Objective: We tested the hypothesis that plasma neurofilament light chain (NfL) identifies asymptomatic carriers of familial frontotemporal lobar degeneration (FTLD)-causing mutations at risk of disease progression. Methods: Baseline plasma NfL concentrations were measured with single-molecule array in original (n = 277) and validation (n = 297) cohorts. C9orf72, GRN, and MAPT mutation carriers and noncarriers from the same families were classified by disease severity (asymptomatic, prodromal, and full phenotype) using the CDR Dementia Staging Instrument plus behavior and language domains from the National Alzheimer's Disease Coordinating Center FTLD module (CDR+NACC-FTLD). Linear mixed-effects models related NfL to clinical variables. Results: In both cohorts, baseline NfL was higher in asymptomatic mutation carriers who showed phenoconversion or disease progression than in nonprogressors (original: 11.4 ± 7 pg/mL vs 6.7 ± 5 pg/mL, p = 0.002; validation: 14.1 ± 12 pg/mL vs 8.7 ± 6 pg/mL, p = 0.035). Plasma NfL discriminated symptomatic from asymptomatic mutation carriers or those with prodromal disease (original cutoff: 13.6 pg/mL, 87.5% sensitivity, 82.7% specificity; validation cutoff: 19.8 pg/mL, 87.4% sensitivity, 84.3% specificity). Higher baseline NfL correlated with worse longitudinal CDR+NACC-FTLD sum of boxes scores, neuropsychological function, and atrophy, regardless of genotype or disease severity, including in asymptomatic mutation carriers. Conclusions: Plasma NfL identifies asymptomatic carriers of FTLD-causing mutations at short-term risk of disease progression and is a potential tool to select participants for prevention clinical trials. Trial registration information: ClinicalTrials.gov Identifiers: NCT02372773 and NCT02365922. Classification of evidence: This study provides Class I evidence that, in carriers of FTLD-causing mutations, elevation of plasma NfL predicts short-term risk of clinical progression.
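    To illustrate how a discriminatory cutoff with a reported sensitivity and specificity can be derived, the sketch below picks the plasma NfL threshold that maximises Youden's index on a ROC curve. The variable names are placeholders and the Youden rule is only one common choice; the study's own cutoff derivation may differ.

```python
import numpy as np
from sklearn.metrics import roc_curve

def nfl_cutoff(nfl_pg_ml: np.ndarray, symptomatic: np.ndarray) -> dict:
    """Choose the NfL threshold maximising sensitivity + specificity - 1
    (Youden's index) for discriminating symptomatic from asymptomatic carriers."""
    fpr, tpr, thresholds = roc_curve(symptomatic, nfl_pg_ml)
    best = np.argmax(tpr - fpr)
    return {
        "cutoff_pg_ml": float(thresholds[best]),
        "sensitivity": float(tpr[best]),
        "specificity": float(1 - fpr[best]),
    }
```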

    Breast cancer management pathways during the COVID-19 pandemic: outcomes from the UK ‘Alert Level 4’ phase of the B-MaP-C study

    Background: The B-MaP-C study aimed to determine alterations to breast cancer (BC) management during the peak transmission period of the UK COVID-19 pandemic and the potential impact of these treatment decisions. Methods: This was a national cohort study of patients with early BC undergoing multidisciplinary team (MDT)-guided treatment recommendations during the pandemic, designated ‘standard’ or ‘COVID-altered’, in the preoperative, operative and post-operative setting. Findings: Of 3776 patients (from 64 UK units) in the study, 2246 (59%) had ‘COVID-altered’ management. ‘Bridging’ endocrine therapy was used (n = 951) where theatre capacity was reduced. There was increasing access to COVID-19 low-risk theatres during the study period (59%). In line with national guidance, immediate breast reconstruction was avoided (n = 299). Where adjuvant chemotherapy was omitted (n = 81), the median benefit was only 3% (IQR 2–9%) using ‘NHS Predict’. There was rapid adoption of new evidence-based hypofractionated radiotherapy (n = 781, from 46 units). Only 14 patients (1%) tested positive for SARS-CoV-2 during their treatment journey. Conclusions: Most ‘COVID-altered’ management decisions were in line with pre-COVID evidence-based guidelines, implying that breast cancer survival outcomes are unlikely to be negatively impacted by the pandemic. However, in this study, the potential impact of delays to BC presentation or diagnosis remains unknown.