Thoracic Surgeons' Perception of Frail Behavior in Videos of Standardized Patients
Background: Frailty is a predictor of poor outcomes following many types of operations. We measured thoracic surgeons' accuracy in assessing patient frailty using videos of standardized patients demonstrating signs of physical frailty. We compared their performance to that of geriatrics specialists.

Methods: We developed an anchored scale for rating degree of frailty. Reference categories were assigned to 31 videos of standardized patients trained to exhibit five levels of activity ranging from “vigorous” to “frail.” Following an explanation of frailty, thoracic surgeons and geriatrics specialists rated the videos. We evaluated inter-rater agreement and tested differences between ratings and reference categories. The influences of clinical specialty, clinical experience, and self-rated expertise were examined.

Results: Inter-rater rank correlation among all participants was high (Kendall's W 0.85), whereas exact agreement (Fleiss' kappa) was only moderate (0.47). Inter-rater agreement was better for videos exhibiting extremes of behavior. Exact agreement was better for thoracic surgeons (n = 32) than for geriatrics specialists (n = 9; p = 0.045), whereas rank correlation was similar for both groups. Neither more years of clinical experience nor self-reported expertise was associated with better inter-rater agreement.

Conclusions: When referenced to an anchored scale, thoracic surgeons rate videos of standardized patients exhibiting varying degrees of frailty with internal consistency and as accurately as geriatrics specialists. Ratings were less consistent for moderate degrees of frailty, suggesting that physicians require training to recognize early frailty. Such videos may be useful in assessing and teaching frailty recognition.
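The two agreement statistics reported in the abstract measure different things: Fleiss' kappa scores exact category matches among raters, while Kendall's W scores rank-order concordance, which is how a panel can show high W (0.85) alongside only moderate kappa (0.47). A minimal sketch of both computations, using invented toy ratings rather than the study's data:

```python
def fleiss_kappa(ratings, n_categories):
    """Fleiss' kappa (exact inter-rater agreement).

    ratings: one score list per rater, all covering the same subjects,
    with integer scores in 1..n_categories.
    """
    n_raters, n_subjects = len(ratings), len(ratings[0])
    # counts[i][c]: how many raters placed subject i in category c+1
    counts = [[0] * n_categories for _ in range(n_subjects)]
    for rater in ratings:
        for i, score in enumerate(rater):
            counts[i][score - 1] += 1
    # mean per-subject agreement
    p_bar = sum(
        (sum(c * c for c in row) - n_raters) / (n_raters * (n_raters - 1))
        for row in counts
    ) / n_subjects
    # chance agreement from overall category proportions
    p_c = [sum(row[c] for row in counts) / (n_subjects * n_raters)
           for c in range(n_categories)]
    p_e = sum(p * p for p in p_c)
    return (p_bar - p_e) / (1 - p_e)


def kendalls_w(ratings):
    """Kendall's W (rank concordance among raters), with tie correction."""
    m, n = len(ratings), len(ratings[0])
    rank_rows, tie_term = [], 0.0
    for scores in ratings:
        # convert each rater's scores to ranks, averaging tied ranks
        order = sorted(range(n), key=lambda i: scores[i])
        ranks = [0.0] * n
        j = 0
        while j < n:
            k = j
            while k + 1 < n and scores[order[k + 1]] == scores[order[j]]:
                k += 1
            avg_rank = (j + k) / 2 + 1
            for idx in order[j:k + 1]:
                ranks[idx] = avg_rank
            t = k - j + 1
            tie_term += t ** 3 - t
            j = k + 1
        rank_rows.append(ranks)
    totals = [sum(r[i] for r in rank_rows) for i in range(n)]
    mean_total = sum(totals) / n
    s = sum((t - mean_total) ** 2 for t in totals)
    return 12 * s / (m * m * (n ** 3 - n) - m * tie_term)


# Toy example: three raters score four videos on a five-point scale.
toy = [[1, 2, 4, 5], [1, 3, 4, 5], [2, 2, 4, 5]]
print(fleiss_kappa(toy, 5))  # ~0.571: only moderate exact agreement
print(kendalls_w(toy))       # ~0.977: near-perfect rank concordance
```

The toy panel disagrees on exact categories for the first two videos but orders all four identically, reproducing in miniature the high-W, moderate-kappa pattern the study reports.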
Exact agreement and rank correlation according to expertise and years of experience.
Fleiss' kappa: exact inter-rater agreement; Kendall's W: relative agreement among raters; Kendall's tau: relative agreement with a standard value.
Exact agreement with reference scores by video category according to self-described expertise in recognizing frailty.
Exact agreement and rank correlation for all raters and for raters according to specialty.
Fleiss' kappa: exact inter-rater agreement; Kendall's W: relative agreement among raters; Kendall's tau: relative agreement with a standard value.
Exact agreement with reference scores by video category according to years of clinical experience.
Exact agreement with reference scores by video category for all raters and according to specialty.