
    Human Antigen-Specific Regulatory T Cells Generated by T Cell Receptor Gene Transfer

    Therapies directed at augmenting regulatory T cell (Treg) activity in vivo as a systemic treatment for autoimmune disorders and transplantation may be associated with significant off-target effects, including generalized immunosuppression that can compromise beneficial immune responses to infections and cancer cells. Adoptive cellular therapy using purified, expanded Tregs represents an attractive alternative to systemic treatments, and results from animal studies indicate that antigen-specific Tregs have greater therapeutic potency than polyclonal populations. However, current methodologies cannot isolate and expand a sufficient quantity of endogenous antigen-specific Tregs for therapeutic intervention. Moreover, FOXP3+ Tregs fall largely within the CD4+ T cell subset and are thus routinely MHC class II-specific, whereas class I-specific Tregs may function optimally in vivo by facilitating direct tissue recognition. To overcome these limitations, we have developed a novel means of generating large numbers of antigen-specific Tregs involving lentiviral T cell receptor (TCR) gene transfer into in vitro expanded polyclonal natural Treg populations. Tregs redirected with a high-avidity class I-specific TCR were capable of recognizing the melanoma antigen tyrosinase in the context of HLA-A*0201 and could be further enriched during the expansion process by antigen-specific reactivation with peptide-loaded artificial antigen-presenting cells. These in vitro expanded Tregs continued to express FOXP3 and functional TCRs, and maintained the capacity to suppress conventional T cell responses directed against tyrosinase, as well as bystander T cell responses. Using this methodology in a model tumor system, murine Tregs engineered to express the tyrosinase TCR effectively blocked antigen-specific effector T cell (Teff) activity, as determined by tumor cell growth and luciferase reporter-based imaging. These results support the feasibility of class I-restricted TCR transfer as a promising strategy for redirecting the functional properties of Tregs and providing a more efficacious adoptive cell therapy.

    The impact of surgical delay on resectability of colorectal cancer: An international prospective cohort study

    AIM: The SARS-CoV-2 pandemic has provided a unique opportunity to explore the impact of surgical delays on cancer resectability. This study aimed to compare resectability for colorectal cancer patients undergoing delayed versus non-delayed surgery. METHODS: This was an international prospective cohort study of consecutive colorectal cancer patients with a decision for curative surgery (January-April 2020). Surgical delay was defined as an operation taking place more than 4 weeks after the treatment decision in a patient who did not receive neoadjuvant therapy. A subgroup analysis explored the effects of delay in elective patients only, and a sensitivity analysis explored the impact of longer delays. The primary outcome was complete resection, defined as curative resection with an R0 margin. RESULTS: Overall, 5453 patients from 304 hospitals in 47 countries were included, of whom 6.6% (358/5453) did not receive their planned operation. Of the 4304 operated patients without neoadjuvant therapy, 40.5% (1744/4304) were delayed beyond 4 weeks. Delayed patients were more likely to be older, male and more comorbid, to have a higher body mass index, and to have rectal cancer and early-stage disease. Delayed patients had higher unadjusted rates of complete resection (93.7% vs. 91.9%, P = 0.032) and lower rates of emergency surgery (4.5% vs. 22.5%, P < 0.001). After adjustment, delay was not associated with a lower rate of complete resection (OR 1.18, 95% CI 0.90-1.55, P = 0.224), a finding that was consistent in elective patients only (OR 0.94, 95% CI 0.69-1.27, P = 0.672). Longer delays were not associated with poorer outcomes. CONCLUSION: One in 15 colorectal cancer patients did not receive their planned operation during the first wave of COVID-19. Surgical delay did not appear to compromise resectability, raising the hypothesis that any reduction in long-term survival attributable to delays is likely to be due to micro-metastatic disease.
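
    To make the cohort definitions concrete, the following Python sketch computes unadjusted complete-resection rates for delayed versus non-delayed patients. It is a minimal illustration only, not the study's code; the column names (neoadjuvant, weeks_to_surgery, r0_resection) are hypothetical stand-ins for the study's variables.

    import pandas as pd

    def resection_rates_by_delay(df: pd.DataFrame) -> pd.Series:
        # Restrict to patients who did not receive neoadjuvant therapy,
        # mirroring the study's definition of the delay cohort.
        cohort = df.loc[~df["neoadjuvant"]].copy()
        # Delay: operation more than 4 weeks after the treatment decision.
        cohort["delayed"] = cohort["weeks_to_surgery"] > 4
        # Primary outcome: complete (R0) resection rate per group.
        return cohort.groupby("delayed")["r0_resection"].mean()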

    Pooled analysis of WHO Surgical Safety Checklist use and mortality after emergency laparotomy

    Background The World Health Organization (WHO) Surgical Safety Checklist has fostered safe practice for 10 years, yet its place in emergency surgery has not been assessed on a global scale. The aim of this study was to evaluate reported checklist use in emergency settings and examine the relationship with perioperative mortality in patients who had emergency laparotomy. Methods In two multinational cohort studies, adults undergoing emergency laparotomy were compared with those having elective gastrointestinal surgery. Relationships between reported checklist use and mortality were determined using multivariable logistic regression and bootstrapped simulation. Results Of 12 296 patients included from 76 countries, 4843 underwent emergency laparotomy. After adjusting for patient and disease factors, checklist use before emergency laparotomy was more common in countries with a high Human Development Index (HDI) (2455 of 2741, 89.6 per cent) than in countries with a middle (753 of 1242, 60.6 per cent; odds ratio (OR) 0.17, 95 per cent c.i. 0.14 to 0.21, P < 0.001) or low (363 of 860, 42.2 per cent; OR 0.08, 0.07 to 0.10, P < 0.001) HDI. Checklist use was less common in elective surgery than for emergency laparotomy in high-HDI countries (risk difference -9.4 (95 per cent c.i. -11.9 to -6.9) per cent; P < 0.001), but the relationship was reversed in low-HDI countries (+12.1 (+7.0 to +17.3) per cent; P < 0.001). In multivariable models, checklist use was associated with lower 30-day perioperative mortality (OR 0.60, 0.50 to 0.73; P < 0.001). The greatest absolute benefit was seen for emergency surgery in low- and middle-HDI countries. Conclusion Checklist use in emergency laparotomy was associated with a significantly lower perioperative mortality rate. Checklist use in low-HDI countries was half that in high-HDI countries.
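
    The adjusted odds ratio and bootstrapped analysis named in the Methods could be reproduced along the following lines. This is a minimal sketch, not the authors' code: the covariates and column names (mortality_30d, checklist_used, age, asa_grade, hdi_tertile) are assumptions, and the real model adjusted for a fuller set of patient and disease factors.

    import numpy as np
    import pandas as pd
    import statsmodels.formula.api as smf

    def checklist_odds_ratio(df: pd.DataFrame) -> float:
        # Multivariable logistic regression of 30-day mortality on
        # reported checklist use, adjusted for illustrative covariates.
        fit = smf.logit(
            "mortality_30d ~ checklist_used + age"
            " + C(asa_grade) + C(hdi_tertile)",
            data=df,
        ).fit(disp=False)
        return float(np.exp(fit.params["checklist_used"]))

    def bootstrap_ci(df: pd.DataFrame, n_boot: int = 1000, seed: int = 0):
        # Percentile confidence interval for the odds ratio, obtained by
        # resampling patients with replacement and refitting the model.
        rng = np.random.default_rng(seed)
        ors = [
            checklist_odds_ratio(
                df.sample(frac=1.0, replace=True,
                          random_state=int(rng.integers(2**31)))
            )
            for _ in range(n_boot)
        ]
        return np.percentile(ors, [2.5, 97.5])

    An adjusted OR below 1 with a confidence interval excluding 1 would correspond to the reported association between checklist use and lower mortality.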

    Global variation in anastomosis and end colostomy formation following left-sided colorectal resection

    Background End colostomy rates following colorectal resection vary across institutions in high-income settings, being influenced by patient, disease, surgeon and system factors. This study aimed to assess global variation in end colostomy rates after left-sided colorectal resection. Methods This study comprised an analysis of the GlobalSurg-1 and -2 international, prospective, observational cohort studies (2014, 2016), including consecutive adult patients undergoing elective or emergency left-sided colorectal resection within discrete 2-week windows. Countries were grouped into high-, middle- and low-income tertiles according to the United Nations Human Development Index (HDI). Factors associated with colostomy formation versus primary anastomosis were explored using a multilevel, multivariable logistic regression model. Results In total, 1635 patients from 242 hospitals in 57 countries undergoing left-sided colorectal resection were included: 113 (6.9 per cent) from low-HDI, 254 (15.5 per cent) from middle-HDI and 1268 (77.6 per cent) from high-HDI countries. There was a higher proportion of patients with perforated disease (57.5, 40.9 and 35.4 per cent; P < 0.001) and subsequent use of end colostomy (52.2, 24.8 and 18.9 per cent; P < 0.001) in low- compared with middle- and high-HDI settings. The association with colostomy use in low-HDI settings persisted (odds ratio (OR) 3.20, 95 per cent c.i. 1.35 to 7.57; P = 0.008) after risk adjustment for malignant disease (OR 2.34, 1.65 to 3.32; P < 0.001), emergency surgery (OR 4.08, 2.73 to 6.10; P < 0.001), time to operation of at least 48 h (OR 1.99, 1.28 to 3.09; P = 0.002) and disease perforation (OR 4.00, 2.81 to 5.69; P < 0.001). Conclusion Global differences existed in the proportion of patients receiving end stomas after left-sided colorectal resection based on income, which went beyond case mix alone.
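
    The multilevel model in the Methods pairs patient-level fixed effects with a random intercept for each hospital, so that clustering of practice within institutions is not mistaken for a patient-level effect. A minimal sketch, assuming hypothetical column names (end_colostomy, hdi_tertile, malignant, emergency, delay_48h, perforation, hospital), is shown below; statsmodels fits this class of model as a Bayesian mixed GLM.

    import numpy as np
    import pandas as pd
    from statsmodels.genmod.bayes_mixed_glm import BinomialBayesMixedGLM

    def fit_colostomy_model(df: pd.DataFrame):
        # Logistic regression of end colostomy (vs primary anastomosis)
        # on illustrative risk factors, with a random intercept per
        # hospital to absorb between-institution variation.
        model = BinomialBayesMixedGLM.from_formula(
            "end_colostomy ~ C(hdi_tertile) + malignant + emergency"
            " + delay_48h + perforation",
            {"hospital": "0 + C(hospital)"},  # random intercept per hospital
            data=df,
        )
        result = model.fit_vb()  # variational Bayes fit
        # Posterior means of the fixed effects are on the log-odds
        # scale; exponentiating gives odds ratios as reported above.
        return np.exp(result.fe_mean)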

    Trends in the development of mammalian pest control technology in New Zealand

    Rodenticide and vertebrate pesticide registrations have declined worldwide over the last 30 years. New Zealand has not followed this trend, instead retaining essential toxins and traps, improving their use and exploring new mammal control tools. Looking to the immediate future, beyond continuing to improve the use of existing tools, there are opportunities for further advances in emerging technologies: wireless technology for species recognition and for aiding trapping programmes; self-resetting traps and toxin-delivery systems enhanced with advanced lures; and new toxins that increasingly combine ‘low-residue’ characteristics with selectivity and humaneness. More selective baiting and delivery systems will enable more targeted control of possums, mustelids and rodents. The use of new toxins with advantages in specific settings should be complemented by improvements in resetting-trap technology, barrier approaches, and novel biocontrol and genetic concepts. Sodium fluoroacetate (1080) and other important tools have been retained; the ingredients for transformational change are in place, and new tools are emerging from a research and development pipeline. However, there has been limited practical experience with emerging technologies compared with traditional baits such as 1080. Additional investment and practical experience are now imperative to enable new toxins and other tools to reach their potential. It is also important for the future of New Zealand’s biodiversity that research continues to focus on emerging technologies as well as on completely novel ideas.