
    Opening the Black Box of Family-Based Treatments: An Artificial Intelligence Framework to Examine Therapeutic Alliance and Therapist Empathy

    The evidence-based treatment (EBT) movement has primarily focused on core intervention content or treatment fidelity and has largely ignored practitioner skills to manage interpersonal process issues that emerge during treatment, especially with difficult-to-treat adolescents (delinquent, substance-using, medically non-adherent) and those of color. A chief complaint of real-world practitioners about manualized treatments is the lack of correspondence between following a manual and managing microsocial interpersonal processes (e.g. negative affect) that arise in treating real-world clients. Although family-based EBTs share core similarities (e.g. focus on family interactions, emphasis on practitioner engagement, family involvement), most of these treatments do not have an evidence base regarding common implementation and treatment process problems that practitioners experience in delivering particular models, especially in mid-treatment, when demands on families to change their behavior are greatest - a lack that characterizes the field as a whole. Failure to effectively address common interpersonal processes with difficult-to-treat families likely undermines treatment fidelity, treatment outcome, and sustained use of EBTs, and contributes to treatment dropout and treatment nonadherence. Recent advancements in wearables, sensing technologies, multivariate time-series analyses, and machine learning allow scientists to advance the study of psychotherapy processes by looking under the skin of the provider-client interpersonal interactions that define therapeutic alliance, empathy, and empathic accuracy, and by examining the predictive validity of these therapy processes (therapeutic alliance, therapist empathy) for treatment outcome.
Moreover, assessment of these processes can be extended to develop procedures for training providers to manage difficult interpersonal processes while maintaining a physiological profile consistent with astute psychotherapeutic skills. This paper argues for opening the black box of therapy to advance the science of evidence-based psychotherapy by examining the clinical interior of evidence-based treatments to develop the next generation of audit-and-feedback (i.e., systematic review of professional performance) supervision systems.

    A Comprehensive Study on State-of-the-Art Learning Algorithms in Emotion Recognition

    The potential uses of emotion recognition in domains like human-robot interaction, marketing, emotional gaming, and human-computer interfaces have made it a prominent research subject. A better understanding of emotions enables the development of technologies that can accurately interpret and respond to human emotions, resulting in better user experiences. This paper presents a thorough analysis of developments in emotion recognition techniques, with an emphasis on the use of multiple sensors and computational algorithms. Our results show that using more than one modality improves the performance of emotion recognition across a variety of metrics and computational techniques. This paper adds to the body of knowledge by thoroughly examining and contrasting several state-of-the-art computational techniques and measurements for emotion recognition. The study emphasizes how crucial it is to use a variety of modalities along with cutting-edge machine learning algorithms in order to attain more precise and trustworthy emotion assessment. Additionally, we pinpoint prospective avenues for further investigation and advancement, including the incorporation of multimodal data and the investigation of innovative features and fusion methodologies. This study contributes to the development of technology that can better comprehend and react to human emotions by offering practitioners and academics in the field of emotion recognition insightful advice.

    Empathy Detection Using Machine Learning on Text, Audiovisual, Audio or Physiological Signals

    Empathy is a social skill that indicates an individual's ability to understand others. Over the past few years, empathy has drawn attention from various disciplines, including but not limited to Affective Computing, Cognitive Science and Psychology. Empathy is a context-dependent term; thus, detecting or recognising empathy has potential applications in society, healthcare and education. Despite being a broad and overlapping topic, the avenue of empathy detection studies leveraging Machine Learning remains underexplored from a holistic literature perspective. To this end, we systematically collect and screen 801 papers from 10 well-known databases and analyse the selected 54 papers. We group the papers based on input modalities of empathy detection systems, i.e., text, audiovisual, audio and physiological signals. We examine modality-specific pre-processing and network architecture design protocols, popular dataset descriptions and availability details, and evaluation protocols. We further discuss the potential applications, deployment challenges and research gaps in the Affective Computing-based empathy domain, which can facilitate new avenues of exploration. We believe that our work is a stepping stone to developing a privacy-preserving and unbiased empathic system inclusive of culture, diversity and multilingualism that can be deployed in practice to enhance the overall well-being of human life.

    Having a Bad Day? Detecting the Impact of Atypical Life Events Using Wearable Sensors

    Life events can dramatically affect our psychological state and work performance. Stress, for example, has been linked to professional dissatisfaction, increased anxiety, and workplace burnout. We explore the impact of positive and negative life events on a number of psychological constructs through a multi-month longitudinal study of hospital and aerospace workers. Through causal inference, we demonstrate that positive life events increase positive affect, while negative events increase stress, anxiety and negative affect. While most events have a transient effect on psychological states, major negative events, like illness or attending a funeral, can reduce positive affect for multiple days. Next, we assess whether these events can be detected through wearable sensors, which can cheaply and unobtrusively monitor health-related factors. We show that these sensors paired with embedding-based learning models can be used "in the wild" to capture atypical life events in hundreds of workers across both datasets. Overall, our results suggest that automated interventions based on physiological sensing may be feasible to help workers regulate the negative effects of life events.
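The paper detects atypical days with embedding-based learning models; as a much simpler illustration of the underlying idea (flagging days whose sensor readings deviate from a worker's personal baseline), here is a hedged sketch. The feature choices (daily mean heart rate, step count), window size, and threshold are all illustrative assumptions, not the paper's method.

```python
import numpy as np

def atypical_days(daily_features, window=14, threshold=3.0):
    """Flag days whose sensor features deviate strongly from a rolling
    personal baseline (a crude stand-in for an embedding-based detector)."""
    daily_features = np.asarray(daily_features, dtype=float)
    flags = []
    for i in range(len(daily_features)):
        baseline = daily_features[max(0, i - window):i]
        if len(baseline) < 3:           # not enough history yet
            flags.append(False)
            continue
        mu = baseline.mean(axis=0)
        sigma = baseline.std(axis=0) + 1e-8
        z = np.abs((daily_features[i] - mu) / sigma)  # per-feature z-scores
        flags.append(bool(z.max() > threshold))
    return flags

# Synthetic example: steady heart-rate/step features with one spike.
rng = np.random.default_rng(0)
days = rng.normal([70.0, 8000.0], [2.0, 500.0], size=(30, 2))
days[20] = [95.0, 1500.0]               # an atypical day (e.g. illness)
flags = atypical_days(days)
```

A real system would replace the z-score rule with a learned embedding of each day's full sensor stream, but the baseline-vs-deviation structure is the same.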

    FusionSense: Emotion Classification Using Feature Fusion of Multimodal Data and Deep Learning in a Brain-Inspired Spiking Neural Network

    Using multimodal signals to solve the problem of emotion recognition is one of the emerging trends in affective computing. Several studies have utilized state-of-the-art deep learning methods and combined physiological signals, such as the electrocardiogram (ECG), electroencephalogram (EEG), and skin temperature, along with facial expressions, voice, and posture, to name a few, in order to classify emotions. Spiking neural networks (SNNs) represent the third generation of neural networks and employ biologically plausible models of neurons. SNNs have been shown to handle spatio-temporal data, which is essentially the nature of the data encountered in emotion recognition, in an efficient manner. In this work, for the first time, we propose the application of SNNs to solve the emotion recognition problem with multimodal data. Specifically, we use the NeuCube framework, which employs an evolving SNN architecture to classify emotional valence, and evaluate the performance of our approach on the MAHNOB-HCI dataset. The multimodal data used in our work consists of facial expressions along with physiological signals such as ECG, skin temperature, skin conductance, respiration signal, mouth length, and pupil size. We perform classification under the Leave-One-Subject-Out (LOSO) cross-validation mode. Our results show that the proposed approach achieves an accuracy of 73.15% for classifying binary valence when applying feature-level fusion, which is comparable to other deep learning methods. We achieve this accuracy even without using EEG, which other deep learning methods have relied on to achieve this level of accuracy. In conclusion, we have demonstrated that the SNN can be successfully used for solving the emotion recognition problem with multimodal data, and we also provide directions for future research utilizing SNNs for affective computing.
In addition to its good accuracy, the SNN recognition system is incrementally trainable on new data in an adaptive way. It requires only one-pass training, which makes it suitable for practical and online applications. These features are not manifested in other methods for this problem.
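The feature-level fusion the paper applies can be sketched very simply: per-modality feature vectors are concatenated into one vector per sample before classification. The sketch below uses synthetic stand-ins for the MAHNOB-HCI modalities and a conventional SVM instead of the NeuCube SNN, so it illustrates only the fusion step, not the paper's architecture.

```python
import numpy as np
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC

def fuse_features(*modalities):
    """Feature-level fusion: concatenate per-modality feature vectors
    along the feature axis into a single vector per sample."""
    return np.concatenate([np.asarray(m, dtype=float) for m in modalities],
                          axis=1)

# Synthetic stand-ins for ECG, skin-temperature and facial features;
# each feature's mean shifts with the (binary) valence label.
rng = np.random.default_rng(1)
n = 100
valence = rng.integers(0, 2, size=n)
ecg  = rng.normal(valence[:, None], 1.0, size=(n, 4))
temp = rng.normal(valence[:, None], 1.0, size=(n, 2))
face = rng.normal(valence[:, None], 1.0, size=(n, 6))

X = fuse_features(ecg, temp, face)      # shape (100, 12)
clf = make_pipeline(StandardScaler(), SVC()).fit(X[:80], valence[:80])
acc = clf.score(X[80:], valence[80:])
```

Feature-level (early) fusion lets the classifier exploit cross-modality correlations, at the cost of requiring all modalities to be present for every sample.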

    The Possibilities of Classification of Emotional States Based on User Behavioral Characteristics

    The classification of users' emotions based on their behavioral characteristics, namely their keyboard typing and mouse usage patterns, is an effective and non-invasive way of gathering users' data without imposing any limitations on their ability to perform tasks. To gather data for the classifier we used an application, the Emotnizer, which we had developed for this purpose. The output of the classification is categorized into 4 emotional categories from Russell's circumplex model - happiness, anger, sadness and the state of relaxation. The sample of the reference database consisted of 50 students. Multiple regression analyses gave us a model that allowed us to predict the valence and arousal of the subject based on the input from the keyboard and mouse. Upon re-testing with another test group of 50 students and processing the data, we found that our Emotnizer program can classify emotional states with an average success rate of 82.31%.
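The two-stage scheme described above (regression from keyboard/mouse features to valence and arousal, then a quadrant lookup in Russell's circumplex) can be sketched as follows. The feature names, synthetic data, and coefficients are illustrative assumptions; the study's actual feature set and fitted model are not reproduced here.

```python
import numpy as np
from sklearn.linear_model import LinearRegression

def quadrant_emotion(valence, arousal):
    """Map a (valence, arousal) point to the four circumplex quadrants
    used by the study: happiness, anger, sadness, relaxation."""
    if valence >= 0:
        return "happiness" if arousal >= 0 else "relaxation"
    return "anger" if arousal >= 0 else "sadness"

# Hypothetical per-session features: typing speed, error rate,
# mouse speed, click rate (stand-ins for the real feature set).
rng = np.random.default_rng(2)
X = rng.normal(size=(50, 4))
true_W = np.array([[0.8, -0.2, 0.1, 0.3],    # valence weights (made up)
                   [0.1, 0.6, -0.4, 0.2]])   # arousal weights (made up)
Y = X @ true_W.T + rng.normal(scale=0.1, size=(50, 2))

model = LinearRegression().fit(X, Y)         # multi-output regression
v, a = model.predict(X[:1])[0]
label = quadrant_emotion(v, a)
```

The regression stage is continuous, so the discretization into four categories happens only at the final lookup, which matches the described pipeline.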

    ECG-based Human Emotion Recognition Using Generative Models

    Human emotion recognition (HER) is ever-evolving and has become an important research field. In autonomous driving, HER can be vital in developing autonomous vehicles. Introducing autonomous vehicles is expected to increase safety, having the potential to prevent accidents. Recognizing the passengers' emotional reactions while driving can help machine learning algorithms learn human behavior in traffic. In this thesis, the focus has been on HER using electrocardiogram (ECG) data. The effect of Autoencoders and Sparse Autoencoders in HER using ECG data has been explored and compared to the state-of-the-art. Additionally, the extent of ECG data as a single modality for HER has been discussed. Three pipelines were constructed to explore how Autoencoders and Sparse Autoencoders affect HER. In all pipelines, the signals were denoised and resampled using the Pan-Tompkins algorithm. Additionally, the pipelines were all trained, validated, and tested using the Support Vector Classifier (SVC). The first pipeline uses the Pan-Tompkins-processed signals as input to the SVC. In the second pipeline, the input to the SVC is features extracted from the signals using an Autoencoder. The last pipeline uses the latent space of a Sparse Autoencoder as input to the SVC. The target emotions for the classification task were based on the two-dimensional emotion model of valence and arousal, resulting in four classes. The pipeline including an Autoencoder for feature extraction outperformed the pipeline without feature extraction, in addition to reducing the bias the models showed towards one class. Using a Sparse Autoencoder, the overall results were lower, but it was able to reduce the bias toward one class further. These results show that the Autoencoder has potential in ECG-based HER and could contribute to the field.
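The second pipeline above (Autoencoder latent features fed to an SVC) can be sketched as follows. To stay self-contained the sketch uses synthetic stand-ins for Pan-Tompkins-processed ECG windows and approximates the autoencoder with an MLP trained to reconstruct its input, reading the hidden layer as the latent space; the thesis's actual architecture and data are not reproduced.

```python
import numpy as np
from sklearn.neural_network import MLPRegressor
from sklearn.svm import SVC

rng = np.random.default_rng(3)

# Synthetic stand-ins for processed ECG windows; labels are the four
# valence/arousal quadrant classes, shifting each window's mean.
n, d = 200, 32
y = rng.integers(0, 4, size=n)
X = rng.normal(size=(n, d)) + y[:, None] * 0.5

# A crude autoencoder: an MLP fit to reconstruct its own input.
# The 8-unit hidden layer plays the role of the latent space.
ae = MLPRegressor(hidden_layer_sizes=(8,), activation="tanh",
                  max_iter=2000, random_state=0).fit(X, X)

def latent(X):
    """Hidden-layer activations = learned latent features."""
    return np.tanh(X @ ae.coefs_[0] + ae.intercepts_[0])

Z = latent(X)                          # compressed representation (200, 8)
clf = SVC().fit(Z[:150], y[:150])      # classify in latent space
acc = clf.score(Z[150:], y[150:])
```

A sparse autoencoder, as in the third pipeline, would add a sparsity penalty on the hidden activations; the SVC stage is unchanged.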