
    Editorial comment


    The epidemiology and biology of pulmonary metastases

Our goal in this chapter is to explore the complex processes of metastasis and why there is a predisposition for this to occur in the lung. In addition, we aim to describe the incidence of pulmonary metastases in various contexts and based on the origin of the primary tumor. There are unique characteristics of the pulmonary system that make metastases more likely to occur in the lung than anywhere else in the body. Some of these characteristics include receiving the entire cardiac output every minute, having the densest capillary bed in the body, and being the first reservoir of most lymphatic drainage entering the venous system. There are multiple postulated routes of metastasis to the pulmonary system, including hematogenous and lymphatic routes with early or late dissemination. The vascularization of pulmonary metastases is variable and complex, often recruiting supply of both bronchial and pulmonary origin. Many biochemical factors in the tumor microenvironment also play a key role in the development of lung metastases, including vascular endothelial growth factor (VEGF), interleukin-8 (IL-8), very late antigen 4 (VLA-4), and intercellular adhesion molecule 1 (ICAM-1). Studies vary widely in reported rates of pulmonary metastases owing to differences in clinical study design; however, it is commonly accepted that up to half of autopsies performed on patients who died of malignancy show pulmonary metastases. In a surgical series describing the incidence of primary cancer types with resected pulmonary metastases, the most common sites were thyroid, colon, breast, genitourinary tract, skin, liver, and adrenal glands.

    Treatment utilization and outcomes in elderly patients with locally advanced esophageal carcinoma: A review of the National Cancer Database

For elderly patients with locally advanced esophageal cancer, therapeutic approaches and outcomes in a modern cohort are not well characterized. Patients ≥70 years old with clinical stage II and III esophageal cancer diagnosed between 1998 and 2012 were identified from the National Cancer Database and stratified by treatment type. Variables associated with treatment utilization were evaluated using logistic regression, and survival was evaluated using Cox proportional hazards analysis. Propensity matching (1:1) was performed to help account for selection bias. A total of 21,593 patients were identified. Median and maximum ages were 77 and 90 years, respectively. Treatment included palliative therapy (24.3%), chemoradiation (37.1%), trimodality therapy (10.0%), esophagectomy alone (5.6%), or no therapy (12.9%). Age ≥80 (OR 0.73), female gender (OR 0.81), Charlson-Deyo comorbidity score ≥2 (OR 0.82), and treatment at high-volume centers (OR 0.83) were associated with a decreased likelihood of palliative therapy versus no treatment. Age ≥80 (OR 0.79) and clinical stage III (OR 0.33) were associated with a decreased likelihood, while adenocarcinoma histology (OR 1.33) and nonacademic cancer centers (OR 3.9) were associated with an increased likelihood, of esophagectomy alone compared with definitive chemoradiation. Age ≥80 (OR 0.15), female gender (OR 0.80), and non-Caucasian race (OR 0.63) were associated with a decreased likelihood, while adenocarcinoma histology (OR 2.10) and high-volume centers (OR 2.34) were associated with an increased likelihood, of trimodality therapy compared with definitive chemoradiation. Every treatment type demonstrated improved survival compared with no therapy, ranging from palliative treatment (HR 0.49) to trimodality therapy (HR 0.25), with significant differences between all groups. Any therapy, including palliative care, was associated with improved survival; however, subsets of elderly patients with locally advanced esophageal cancer are less likely to receive aggressive therapy. Care should be taken not to deprive these individuals unnecessarily of treatment that may improve survival.
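The 1:1 propensity matching mentioned above can be illustrated with a greedy nearest-neighbor pairing on estimated propensity scores. This is a hypothetical sketch for intuition only; the study does not report its matching algorithm or caliper, and the function and data below are illustrative assumptions:

```python
def greedy_match_1to1(treated, control):
    """Greedy 1:1 nearest-neighbor matching on propensity scores.

    treated, control: lists of (patient_id, propensity_score) tuples.
    Returns a list of (treated_id, control_id) matched pairs; each
    control is used at most once.
    """
    pairs = []
    used = set()
    # Walk treated patients in propensity-score order and pair each
    # with the closest still-unused control.
    for t_id, t_ps in sorted(treated, key=lambda x: x[1]):
        best_id, best_dist = None, float("inf")
        for c_id, c_ps in control:
            if c_id in used:
                continue
            dist = abs(t_ps - c_ps)
            if dist < best_dist:
                best_id, best_dist = c_id, dist
        if best_id is not None:
            pairs.append((t_id, best_id))
            used.add(best_id)
    return pairs
```

Real analyses typically also apply a caliper (a maximum allowed score distance) and discard unmatched patients, which this sketch omits.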

    Lung Transplantation in the United States, 1999–2008

This article highlights trends and changes in lung and heart–lung transplantation in the United States from 1999 to 2008. While adult lung transplantation grew significantly over the past decade, rates of heart–lung and pediatric lung transplantation have remained low. Since implementation of the lung allocation score (LAS) donor allocation system in 2005, the number of active waiting list patients, waiting times for lung transplantation, and death rates on the waiting list have all decreased. However, characteristics of recipients transplanted in the LAS era differed from those transplanted earlier. The proportion of candidates undergoing lung transplantation for chronic obstructive pulmonary disease decreased, while the proportion with pulmonary fibrosis increased. In the LAS era, older, sicker, and previously transplanted candidates underwent transplantation more frequently compared with the previous era. Despite these changes, 1-year survival after lung transplantation did not significantly change after LAS inception compared with the pre-LAS era. The long-term effects of the change in the characteristics of lung transplant recipients on overall outcomes for lung transplantation remain unknown. Continued surveillance and refinements to the LAS system will affect the distribution and types of candidates transplanted and hopefully lead to improved system efficiency and outcomes.

    Gene transfer of tumor necrosis factor inhibitor improves the function of lung allografts

Background: Tumor necrosis factor is an important mediator of lung transplant acute rejection. Soluble type I tumor necrosis factor receptor binds to tumor necrosis factor-α and -β and inhibits their function. The objectives of this study were to demonstrate efficient in vivo gene transfer of a soluble type I tumor necrosis factor receptor fusion protein (sTNF-RI-Ig) and determine its effects on lung allograft acute rejection. Methods: Three groups of Fischer rats (n = 6 per group) underwent recipient intramuscular transfection 24 hours before transplantation with saline, 1 × 10^10 plaque-forming units of control adenovirus encoding β-galactosidase, or 1 × 10^10 plaque-forming units of adenovirus encoding human sTNF-RI-Ig (Ad.sTNF-RI-Ig). One group (n = 6) received recipient intramuscular transfection with 1 × 10^10 plaque-forming units of Ad.sTNF-RI-Ig at the time of transplantation. Brown Norway donor lung grafts were stored for 5 hours before orthotopic lung transplantation. Graft function and rejection scores were assessed 5 days after transplantation. Time-dependent transgene expression in muscle, serum, and lung grafts was evaluated by enzyme-linked immunosorbent assay of human soluble type I tumor necrosis factor receptor. Results: Recipient intramuscular transfection with 1 × 10^10 plaque-forming units of Ad.sTNF-RI-Ig significantly improved arterial oxygenation when delivered 24 hours before transplantation compared with saline, β-galactosidase, and Ad.sTNF-RI-Ig transfection at the time of transplantation (435.8 ± 106.6 mm Hg vs 142.3 ± 146.3 mm Hg, 177.4 ± 153.7 mm Hg, and 237.3 ± 185.2 mm Hg; P = .002, .005, and .046, respectively). Transgene expression was time dependent, and there was a trend toward lower vascular rejection scores (P = .066) in the Ad.sTNF-RI-Ig group transfected 24 hours before transplantation. Conclusions: Recipient intramuscular Ad.sTNF-RI-Ig gene transfer improves allograft function in a well-established model of acute rejection. Maximum benefit was observed when transfection occurred 24 hours before transplantation.

    Does reperfusion injury still cause significant mortality after lung transplantation?

Objectives: Severe reperfusion injury after lung transplantation has mortality rates approaching 40%. The purpose of this investigation was to identify whether our improved 1-year survival after lung transplantation is related to a change in reperfusion injury. Methods: We reported in March 2000 that early institution of extracorporeal membrane oxygenation can improve lung transplantation survival. The records of consecutive lung transplant recipients from 1990 to March 2000 (early era, n = 136) were compared with those of recipients from March 2000 to August 2006 (current era, n = 155). Reperfusion injury was defined by an oxygenation index of greater than 7, where oxygenation index = (percentage of inspired oxygen × mean airway pressure)/partial pressure of oxygen. Risk factors for reperfusion injury, treatment of reperfusion injury, and 30-day mortality were compared between eras by using χ2, Fisher's exact, or Student's t tests where appropriate. Results: Although the incidence of reperfusion injury did not change between the eras, 30-day mortality after lung transplantation improved from 11.8% in the early era to 3.9% in the current era (P = .003). In patients without reperfusion injury, mortality was low in both eras. Patients with reperfusion injury had less severe reperfusion injury (P = .01) and lower mortality in the current era (11.4% vs 38.2%, P = .01). Primary pulmonary hypertension was more common in the early era (10% [14/136] vs 3.2% [5/155], P = .02). Graft ischemic time increased from 223.3 ± 78.5 to 286.32 ± 88.3 minutes in the current era (P = .0001). The mortality of patients with reperfusion injury requiring extracorporeal membrane oxygenation improved in the current era (80.0% [8/10] vs 25.0% [3/12], P = .01). Conclusions: Improved early survival after lung transplantation is due to less severe reperfusion injury, as well as improvements in survival with extracorporeal membrane oxygenation.
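The study's oxygenation index and its reperfusion-injury threshold can be expressed directly. A minimal sketch, using the definition given in the abstract (function names and example values are assumptions for illustration):

```python
def oxygenation_index(fio2_percent, mean_airway_pressure, pao2):
    """Oxygenation index as defined in the study:
    (FiO2 [%] x mean airway pressure) / PaO2 [mm Hg]."""
    return fio2_percent * mean_airway_pressure / pao2

def has_reperfusion_injury(fio2_percent, mean_airway_pressure, pao2):
    # The study defined reperfusion injury as an oxygenation index > 7.
    return oxygenation_index(fio2_percent, mean_airway_pressure, pao2) > 7
```

For example, a ventilated patient on 100% oxygen with a mean airway pressure of 12 and a PaO2 of 150 mm Hg has an index of 8, meeting the study's definition.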

    Analysis of delayed surgical treatment and oncologic outcomes in clinical stage I non-small cell lung cancer

    Importance: The association between delayed surgical treatment and oncologic outcomes in patients with non-small cell lung cancer (NSCLC) is poorly understood given that prior studies have used imprecise definitions for the date of cancer diagnosis. Objective: To use a uniform method to quantify surgical treatment delay and to examine its association with several oncologic outcomes. Design, Setting, and Participants: This retrospective cohort study was conducted using a novel data set from the Veterans Health Administration (VHA) system. Included patients had clinical stage I NSCLC and were undergoing resection from 2006 to 2016 within the VHA system. Time to surgical treatment (TTS) was defined as the time between preoperative diagnostic computed tomography imaging and surgical treatment. We evaluated the association between TTS and several delay-associated outcomes using restricted cubic spline functions. Data analyses were performed in November 2021. Exposure: Wait time between cancer diagnosis and surgical treatment (ie, TTS). Main Outcomes and Measures: Several delay-associated oncologic outcomes, including pathologic upstaging, resection with positive margins, and recurrence, were assessed. We also assessed overall survival. Results: Among 9904 patients who underwent surgical treatment for clinical stage I NSCLC, 9539 (96.3%) were men, 4972 individuals (50.5%) were currently smoking, and the mean (SD) age was 67.7 (7.9) years. The mean (SD) TTS was 70.1 (38.6) days. TTS was not associated with increased risk of pathologic upstaging or positive margins. Recurrence was detected in 4158 patients (42.0%) with median (interquartile range) follow-up of 6.15 (2.51-11.51) years. 
Factors associated with increased risk of recurrence included younger age (hazard ratio [HR] for every 1-year increase in age, 0.992; 95% CI, 0.987-0.997; P = .003), higher Charlson Comorbidity Index score (HR for every 1-unit increase in composite score, 1.055; 95% CI, 1.037-1.073; P < .001), segmentectomy (HR vs lobectomy, 1.352; 95% CI, 1.179-1.551; P < .001) or wedge resection (HR vs lobectomy, 1.282; 95% CI, 1.179-1.394; P < .001), larger tumor size (eg, 31-40 mm vs <10 mm; HR, 1.209; 95% CI, 1.051-1.390; P = .008), higher tumor grade (eg, II vs I; HR, 1.210; 95% CI, 1.085-1.349; P < .001), lower number of lymph nodes examined (eg, ≥10 vs <10; HR, 0.866; 95% CI, 0.803-0.933; P < .001), higher pathologic stage (III vs I; HR, 1.571; 95% CI, 1.351-1.837; P < .001), and longer TTS, with increasing risk after 12 weeks. For each week of surgical delay beyond 12 weeks, the hazard for recurrence increased by 0.4% (HR, 1.004; 95% CI, 1.001-1.006; P = .002). Factors associated with delayed surgical treatment included African American race (odds ratio [OR] vs White race, 1.267; 95% CI, 1.112-1.444; P < .001), higher area deprivation index (ADI) score (OR for every 1-unit increase in ADI score, 1.005; 95% CI, 1.002-1.007; P = .002), lower hospital case load (OR for every 1-unit increase in case load, 0.998; 95% CI, 0.998-0.999; P = .001), and year of diagnosis, with less recent procedures more likely to have delay (OR for each additional year, 0.900; 95% CI, 0.884-0.915; P < .001). Patients with surgical treatment within 12 weeks of diagnosis had significantly better overall survival than those with procedures delayed more than 12 weeks (HR, 1.132; 95% CI, 1.064-1.204; P < .001). Conclusions and Relevance: Using a more precise definition for TTS, this study found that surgical procedures delayed more than 12 weeks were associated with increased risk of recurrence and worse survival. These findings suggest that patients with clinical stage I NSCLC should undergo expeditious treatment within that time frame.
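The reported per-week hazard ratio compounds multiplicatively, so the cumulative effect of a long delay can be sketched from the abstract's figures alone. A minimal illustration, assuming the HR of 1.004 applies uniformly to each week beyond the 12-week threshold (the function name and linear-compounding assumption are ours, not the study's):

```python
def cumulative_recurrence_hr(weeks_to_surgery, weekly_hr=1.004,
                             threshold_weeks=12):
    """Cumulative hazard ratio for recurrence, relative to surgery
    within 12 weeks, compounding the reported per-week HR (1.004)
    for each week of delay beyond the 12-week threshold."""
    extra_weeks = max(0, weeks_to_surgery - threshold_weeks)
    return weekly_hr ** extra_weeks
```

Under this reading, surgery at 22 weeks (10 weeks past the threshold) carries a cumulative HR of about 1.04, i.e., roughly a 4% higher recurrence hazard than timely surgery.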

    Bleeding and thrombotic complications associated with anticoagulation prior to lung transplantation: A case series

Background: Scarce data are available on therapeutic anticoagulation (AC) in patients undergoing pulmonary transplantation. We describe our institutional experience with AC-induced coagulopathy in recipients at the time of transplantation and evaluate its impact on posttransplant outcomes. Methods: Records of adult patients on therapeutic AC at the time of lung transplantation from January 2014 to July 2021 were reviewed. Administration of preoperative pharmacologic reversal was assessed, with adequate reversal defined as an international normalized ratio (INR) ≤1.5. We evaluated the incidence of major bleeding complications [delayed sternal closure, reoperation due to bleeding, chest tube output ≥1,500 cc, ≥4 units of packed red blood cells, ≥4 units of platelets, or ≥5 units of fresh frozen plasma (FFP)], major thrombotic complications [venous thromboembolism (VTE) or other major thrombosis on imaging], and inpatient mortality. Results: Of 602 lung transplant recipients, 10 patients taking preoperative warfarin were included in the study. While most patients received pharmacologic reversal preoperatively (n=9, 90%), successful reversal was rarely achieved (n=3, 30%). Inadequate INR reversal was associated with major bleeding events (n=6, 60%). Major thrombotic complications were more frequent (n=7, 70%) than bleeding events. Notably, all fatalities within the cohort (n=2, 20%) were associated with thrombotic, but not bleeding, complications. Conclusions: This is the first known report on the incidence and impact of AC-induced coagulopathy in patients undergoing lung transplantation. Major thrombotic events are frequent and associated with high mortality. Routine surveillance and treatment may be warranted.
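The study's two composite definitions, adequate reversal and major bleeding, translate directly into simple predicates. A minimal sketch using the thresholds stated in the abstract (function and parameter names are illustrative assumptions):

```python
def adequately_reversed(inr):
    """Adequate pharmacologic reversal was defined as INR <= 1.5."""
    return inr <= 1.5

def major_bleeding(delayed_sternal_closure=False, reop_for_bleeding=False,
                   chest_tube_output_cc=0, prbc_units=0,
                   platelet_units=0, ffp_units=0):
    """Major bleeding per the study's composite definition: any of
    delayed sternal closure, reoperation for bleeding, chest tube
    output >= 1,500 cc, >= 4 units PRBC, >= 4 units platelets, or
    >= 5 units FFP."""
    return (delayed_sternal_closure or reop_for_bleeding
            or chest_tube_output_cc >= 1500
            or prbc_units >= 4
            or platelet_units >= 4
            or ffp_units >= 5)
```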