
    The New Frontier: Can Faculty be Consistent When Rating Clinical Skills Virtually?

    Purpose/Hypothesis: Accreditation criteria mandate the evaluation of student technical skills. The emerging need for DPT programs to deliver course content remotely and subsequently assess student clinical skills highlights the lack of research surrounding faculty rating consistency when evaluations occur virtually. This study aimed to investigate rating consistency among faculty testers when assessing clinical skills virtually. The primary questions were: (1) is there faculty rating consistency for virtual practical assessments, and (2) are there trends that impact faculty rating of virtual practical performance?

    Number of Subjects: 623

    Materials and Methods: Faculty utilized checklist rubrics based on Miller's Pyramid of Assessment to evaluate students' virtual practical performances. During the case-based virtual practical, students were required to simulate a face-to-face patient encounter or to verbally describe how to appropriately perform skills during a patient encounter. A convenience sample of 623 individual student scores from across the DPT curriculum was collected. One-way ANOVA with post hoc analysis was employed to determine differences between faculty raters (a sketch of this analysis follows the abstract).

    Results: There were 4 to 7 faculty raters per course, with faculty testing 7 to 13 students on average. Students were expected to complete the virtual practical performance within 11-20 minutes (47.5%), 21-30 minutes (25.5%), or 41-60 minutes (15%). Individual course analysis revealed differences in faculty rating of the students' virtual practical skills for 6 of the 13 courses. One course in the first year and five courses in the second year of the curriculum had significant differences in faculty rating of student virtual skills performances (p=0.018, p=0.001, p=0.045, p=0.013, p=0.004, p=0.001). Overall, the students' scores earned from the faculty raters were consistent when compared with traditional face-to-face practical scores.

    Conclusions: Faculty rating of students' virtual skills performance was more consistent in the first year of the DPT curriculum, with more variability in rating for the program's second-year courses. It is possible that more faculty rating errors during the second year of the curriculum affected how the students were rated. Even with the differences in faculty rating, virtual skills practicals may be an acceptable option for DPT programs.

    Clinical Relevance: The recent coronavirus disease 2019 (COVID-19) pandemic has increased the need for innovative virtual methods of testing the technical skills taught in physical therapy programs. Assessing whether consistency between faculty raters can be maintained in the virtual environment is essential to determining the effectiveness of this form of examination. The results of this study indicate that consistency appears to be better maintained earlier in the curriculum; the reason for this trend is unknown. Some differences in how faculty rated students could be attributed to differences among the courses. This study is significant in helping to show that effective faculty rating of students' performance of virtual technical skills is possible.
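    To illustrate the analysis described in Materials and Methods, the sketch below runs a one-way ANOVA across faculty raters' scores for a single course and, if significant, follows up with Tukey HSD pairwise comparisons. The scores, rater labels, and the choice of Tukey HSD are illustrative assumptions; the abstract does not state which software or post hoc procedure was used.

    # Hypothetical sketch: one-way ANOVA across faculty raters for one course,
    # with Tukey HSD post hoc comparisons. All scores and rater labels are made up;
    # the study's actual data and post hoc procedure are not specified in the abstract.
    import pandas as pd
    from scipy import stats
    from statsmodels.stats.multicomp import pairwise_tukeyhsd

    # One row per student practical score, labeled by the faculty rater (hypothetical data).
    scores = pd.DataFrame({
        "rater": ["A"] * 10 + ["B"] * 12 + ["C"] * 9,
        "score": [88, 91, 85, 90, 87, 93, 86, 89, 92, 84,
                  78, 82, 80, 85, 79, 81, 83, 77, 84, 80, 82, 79,
                  90, 88, 92, 87, 91, 89, 93, 86, 90],
    })

    # One-way ANOVA: do mean practical scores differ across raters in this course?
    groups = [g["score"].values for _, g in scores.groupby("rater")]
    f_stat, p_value = stats.f_oneway(*groups)
    print(f"F = {f_stat:.2f}, p = {p_value:.3f}")

    # If the ANOVA is significant, Tukey HSD identifies which pairs of raters differ.
    if p_value < 0.05:
        print(pairwise_tukeyhsd(scores["score"], scores["rater"], alpha=0.05).summary())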

    Creating Bridges of Interprofessional Education: Opportunities for Collaborating Across Multiple Disciplines and Campuses

    Introduction and Purpose: The purpose of this special interest report is to discuss strategies used to integrate simulation activities across multiple campuses and programs to foster inter- and intraprofessional education. Interprofessional (IP) simulations were conducted between multiple campuses of a large-scale, multi-state health science university. These simulations involved Doctor of Occupational Therapy (OTD), Doctor of Physical Therapy (DPT), and Master of Science Speech-Language Pathology (MS-SLP) students and faculty. Intraprofessional simulations involved DPT students and faculty from the health sciences university in one state and physical therapist assistant (PTA) students and faculty from a different university in that same state.

    Summary of Use and Results: When IP learning opportunities are designed into health science curricula, they can enhance the knowledge, skills, and attitudes that prepare future clinicians to work as part of a collaborative, practice-ready workforce. Students participating in IP activities identified themselves as having more competence and a greater sense of salience with IP interactions. Valuable student learning occurs when students are armed with the attitudes and knowledge of IP collaboration. Moving forward from these experiences, students can progress to IP activities that further translate into enhanced competencies and develop IP technical skills.

    Importance to Members: Integrating inter- and intraprofessional education in clinical and academic settings has the potential to improve behaviors among team members, promoting improved patient safety and patient outcomes. However, there continues to be a need for research demonstrating the efficacy of IPE and its impact on student and clinician learning and on patient outcomes.

    IPE Simulation Opportunity 1: IP Collaboration of OT and PT Students: Within the first two weeks of our OT and PT programs, students are introduced to an interdisciplinary simulation experience that anchors emerging skills to basic concepts of patient care. The experience is simple and intimate, but it organically creates open discussion that goes deeper than a lab activity in which students partner with other students to practice learned therapeutic interventions and procedures. The experience is set up with a simulated patient prompted to ask the students questions such as "What is the difference between OT and PT?" Student observers watch as their classmates in the simulation hot seat problem solve within a realistic hospital environment. The experience brings to light the social components involved in the usually rehearsed informed consent, environmental barriers such as bed rails and tray tables, interprofessional communication with the other discipline while a patient is present, and interpretation of vital sign assessments in patient-friendly language. The class debriefs after the simulation to share observations, perspectives, and insights. This initial simulation provides an early experience bridging academia and real-world clinical practice.

    IPE Simulation Opportunity 2: Intraprofessional Collaboration of PT and PTA Students: Three short simulations were conducted with a group of DPT students and a group of PTA students. In the first simulation, PTA and DPT students reviewed a medical record together and planned a treatment session for a complex neurological case with a recent history of seizures, traumatic brain injury (TBI), and craniotomy, guided by a simulated Clinical Instructor (CI). The second simulation involved mobilizing this patient in an EVA walker with multiple lines and tubes while the patient's mother was present, with CI guidance as needed. The third simulation involved responding to the patient having a seizure with the mother present, followed by a debriefing with the CI on the session using the Clinical Performance Indicators (CPI) to guide the discussion.

    IPE Simulation Opportunity 3: Interprofessional Collaboration of OT, PT, and SLP: During SLP program development, IPEC core competencies were mapped to the clinical curriculum, and activities were developed for OT, PT, and SLP students' IPE experiences. The activities, designed by an IPE faculty team, included simulation case scenarios experienced by students each trimester. The case scenarios vary in complexity across IPEC competencies and sub-competencies. For each case scenario simulation, an SLP, a PT, and an OT student volunteer to act out the improvised scenario, with one or two other students acting as patients, caregivers, or other team members. The remaining participants observe the scenario, and upon completion all students (actors and observers) participate in a debriefing, which includes reactions, observations, reflections, and discussion. The debrief sessions are structured using the General Interprofessional Debriefing Questions Facilitator Guide to ensure that IPEC core competencies are addressed in the simulation. Student feedback has been positive, with general requests to offer additional experiences.

    Conclusions: It is important for future research to explore the validity, reliability, and efficacy of IP learning activities to develop best practices in IPE. This will help clinicians and educators customize their approaches to meet patient outcomes, accreditation standards, programmatic goals, and institutional goals in their respective programs and settings.