
    Supportive Oncology Collaborative: Initial impact of supportive oncology screening and care

    180 Background: The Institute of Medicine (IOM) 2013 report recommends supportive oncology care from diagnosis through survivorship and end of life. The Coleman Supportive Oncology Collaborative (CSOC) developed a city-wide plan to improve supportive oncology. Metrics derived from the Commission on Cancer (CoC), the ASCO Quality Oncology Practice Initiative (ASCO-QOPI), and the National Quality Forum (NQF) were used to assess the impact of the CSOC. Methods: Medical records of consecutive cancer patients from 6 practice-improvement cancer centers in Chicago (3 academic, 2 safety-net, 1 public) were reviewed for 2 periods: 2014 (n = 843) and Q1 of 2015 (n = 313). Descriptive statistics assessed differences in quality metrics. Results: Significant improvement was achieved in 6 of 8 core supportive oncology metrics (see table). Conclusions: Consolidated metrics are feasible for assessing supportive oncology quality. Early data indicate improvement and the effectiveness of the collaborative approach. [Table: see text]

    A consolidated screening tool for supportive oncology needs and distress

    72 Background: The IOM 2013 report recommends that supportive oncology care start at cancer diagnosis; Commission on Cancer (CoC) Standard 3.2 requires distress screening and indicated action. Screening tools are not standardized across institutions and often address only a portion of patients’ supportive oncology needs. Methods: A collaborative of 100+ clinicians, funded by The Coleman Foundation, developed a patient-centric consolidated screening tool based on validated instruments (NCCN Distress Problem List, PHQ-4, PROMIS) and IOM and CoC recommendations. The screening tool was piloted at 6 practice-improvement cancer centers in the Chicago area (3 academic, 2 safety-net, 1 public). Patients, providers assessing each patient’s screening results (assessors), and providers receiving referrals (referral providers) were surveyed after each use of the screening tool. Descriptive statistics were used to assess the effectiveness of the tool. Results: Responders included 29 patients, 81 assessors, and 26 referral providers (social workers, chaplains, subspecialists). The majority of patients (22/29, 75%) completed the screening in < 10 minutes without assistance and were willing to complete it at every visit. Most assessors (59/77, 76%) spent < 5 minutes reviewing screening results. The majority of patients, assessors, and referral providers reported that the screening tool asked the “right questions”. Assessors who reported that some screening questions were only partially relevant (for 34% [26/77] of patients) nonetheless uncovered ≥ 1 relevant need for 96% (25/26) of those patients (p = 0.002). Conclusions: Use of a consolidated supportive oncology screening tool across multiple institutions is feasible, uncovered unmet patient needs, and was beneficial for assessors and providers. As the tool is adopted by collaborating institutions, variation in supportive oncology screening may decline, improving access to supportive oncology care, with implications for national dissemination. [Table: see text]

    A consolidated screening tool for supportive oncology needs and distress

    47 Background: The IOM 2013 report recommends that supportive oncology care start at cancer diagnosis; Commission on Cancer (CoC) Standard 3.2 requires distress screening and indicated action. Screening tools are not standardized across institutions and often address only a portion of patients’ supportive oncology needs. Methods: A collaborative of 100+ clinicians, funded by The Coleman Foundation, developed a patient-centric consolidated screening tool based on validated instruments (NCCN Distress, PHQ-4, PROMIS) and IOM and CoC recommendations. The screening tool was piloted at 6 practice-improvement cancer centers in the Chicago area (3 academic, 2 safety-net, 1 public). Patients, providers assessing patients’ screening results (assessors), and providers receiving referrals (referral providers) were surveyed after use of the screening tool. Descriptive statistics were used to assess the effectiveness of the tool. Results: Responders included 175 patients, 81 assessors, and 26 referral providers (social workers, chaplains, subspecialists). The majority of patients (160/175, 91%) completed the screening in < 10 minutes; across all patients, screening averaged 4½ minutes. Most assessors (59/77, 76%) spent < 5 minutes reviewing screening results. Most patients, assessors, and referral providers reported that the screening tool asked the “right questions”. Assessors who reported that some screening questions were only partially relevant (for 34% [26/77] of patients) nonetheless uncovered ≥ 1 relevant need for 96% (25/26) of those patients (p = 0.002). Conclusions: Use of a consolidated supportive oncology screening tool across multiple institutions is feasible, identified unmet patient needs, and was beneficial for assessors and providers. As the tool is adopted by collaborating institutions, variability in supportive oncology screening practices may decline, thus improving patient care. The tool has implications for quality improvement and national dissemination. [Table: see text]
