Automated Evaluation Of Psychotherapy Skills Using Speech And Language Technologies
With the growing prevalence of psychological interventions, it is vital to
have measures that rate the effectiveness of psychological care to assist in
training, supervision, and quality assurance of services. Traditionally,
quality assessment is carried out by human raters who evaluate recorded
sessions along specific dimensions, often codified through constructs relevant
to the therapeutic approach and domain. However, this method is
cost-prohibitive and time-consuming, which limits its feasibility and
real-world use.
To facilitate this process, we have developed an automated competency rating
tool able to process the raw recorded audio of a session, analyzing who spoke
when, what they said, and how the health professional used language to provide
therapy. Focusing on a use case of a specific type of psychotherapy called
Motivational Interviewing, our system gives comprehensive feedback to the
therapist, including information about the dynamics of the session (e.g.,
therapist's vs. client's talking time), low-level psychological language
descriptors (e.g., type of questions asked), as well as other high-level
behavioral constructs (e.g., the extent to which the therapist understands the
client's perspective). We describe our platform and its performance using a
dataset of more than 5,000 recordings drawn from its deployment in a real-world
clinical setting, where it assists in the training of new therapists.
Widespread use of
automated psychotherapy rating tools may augment experts' capabilities by
providing an avenue for more effective training and skill improvement,
eventually leading to more positive clinical outcomes.
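As an illustration of one of the session-dynamics metrics mentioned above, the following sketch computes therapist vs. client talking time from speaker-labeled segments. This is not the authors' code; the segment format and role labels are assumptions, standing in for whatever a diarization step followed by role assignment would produce.

```python
def talking_time(segments):
    """Sum speech duration per speaker role.

    `segments` is a hypothetical list of (start_sec, end_sec, role)
    tuples, e.g. the output of diarization plus role assignment.
    """
    totals = {}
    for start, end, role in segments:
        totals[role] = totals.get(role, 0.0) + (end - start)
    return totals

# Example diarized session (times in seconds; values are made up):
segments = [
    (0.0, 12.5, "therapist"),
    (12.5, 40.0, "client"),
    (40.0, 55.0, "therapist"),
    (55.0, 90.0, "client"),
]

totals = talking_time(segments)
ratio = totals["therapist"] / sum(totals.values())
print(f"therapist share of talk time: {ratio:.0%}")  # prints "31%"
```

A real system would derive `segments` from the raw audio; descriptors such as question type or empathy-related constructs would then be computed from the transcribed text rather than from timing alone.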