22,947 research outputs found

    Modeling Individual Cyclic Variation in Human Behavior

    Cycles are fundamental to human health and behavior. However, modeling cycles in time series data is challenging because in most cases the cycles are not labeled or directly observed and must be inferred from multidimensional measurements taken over time. Here, we present CyHMMs, a cyclic hidden Markov model method for detecting and modeling cycles in a collection of multidimensional, heterogeneous time series data. In contrast to previous cycle modeling methods, CyHMMs address several challenges encountered in modeling real-world cycles: they can model multivariate data with discrete and continuous dimensions; they explicitly model, and are robust to, missing data; and they can share information across individuals to model variation both within and between individual time series. Experiments on synthetic and real-world health-tracking data demonstrate that CyHMMs infer cycle lengths more accurately than existing methods, with 58% lower error on simulated data and 63% lower error on real-world data than the best-performing baseline. CyHMMs can also perform functions that baselines cannot: they can model the progression of individual features/symptoms over the course of the cycle, identify the most variable features, and cluster individual time series into groups with distinct characteristics. Applying CyHMMs to two real-world health-tracking datasets -- of menstrual cycle symptoms and physical activity tracking data -- yields important insights, including which symptoms to expect at each point during the cycle. We also find that people fall into several groups with distinct cycle patterns, and that these groups differ along dimensions not provided to the model. For example, by modeling missing data in the menstrual cycles dataset, we are able to discover a medically relevant group of birth control users even though information on birth control is not given to the model. Comment: Accepted at WWW 201
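One way to picture the "cyclic" structure is a hidden Markov model whose states form a ring, so the chain can only stay in the current cycle phase or advance to the next one. The sketch below is a minimal illustration of that idea, not the authors' implementation; the state count and stay probabilities are invented for the example:

```python
import numpy as np

def cyclic_transition_matrix(stay_probs):
    """Build a K-state ring-topology transition matrix: each hidden state
    either stays put or advances to the next state (wrapping around)."""
    K = len(stay_probs)
    T = np.zeros((K, K))
    for i, p in enumerate(stay_probs):
        T[i, i] = p                    # remain in the current phase
        T[i, (i + 1) % K] = 1.0 - p    # advance to the next phase
    return T

def expected_cycle_length(stay_probs):
    """Expected number of steps to traverse the ring once: dwell time in
    each state is geometric, with mean 1 / (1 - stay_prob)."""
    return sum(1.0 / (1.0 - p) for p in stay_probs)

T = cyclic_transition_matrix([0.8] * 5)
assert np.allclose(T.sum(axis=1), 1.0)       # rows are valid distributions
print(expected_cycle_length([0.8] * 5))      # ≈ 25 steps per cycle
```

With five states that each self-transition with probability 0.8, the mean dwell time per state is 1/(1 - 0.8) = 5 steps, so the expected cycle length is about 25 steps; fitting the stay probabilities to data is what turns this into a cycle-length estimator.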

    Characterizing physiological and symptomatic variation in menstrual cycles using self-tracked mobile health data

    The menstrual cycle is a key indicator of overall health for women of reproductive age. Previously, menstruation was primarily studied through survey results; however, as menstrual tracking mobile apps become more widely adopted, they provide an increasingly large, content-rich source of menstrual health experiences and behaviors over time. By exploring a database of user-tracked observations from the Clue app by BioWink, covering over 378,000 users and 4.9 million natural cycles, we show that self-reported menstrual tracker data can reveal statistically significant relationships between per-person cycle length variability and self-reported qualitative symptoms. A concern for self-tracked data is that they reflect not only physiological behaviors, but also the engagement dynamics of app users. To mitigate such potential artifacts, we develop a procedure to exclude cycles lacking user engagement, thereby allowing us to better distinguish true menstrual patterns from tracking anomalies. We find that women located at different ends of the menstrual variability spectrum, based on the consistency of their cycle length statistics, exhibit statistically significant differences in their cycle characteristics and symptom tracking patterns. We also find that cycle and period length statistics are stationary over the app usage timeline across the variability spectrum. The symptoms that we identify as showing statistically significant associations with timing data can be useful to clinicians and users for predicting cycle variability from symptoms, or as potential health indicators for conditions like endometriosis. Our findings showcase the potential of longitudinal, high-resolution self-tracked data to improve understanding of menstruation and women's health as a whole. Comment: The Supplementary Information for this work, as well as the code required for data pre-processing and producing results, is available at https://github.com/iurteaga/menstrual_cycle_analysi
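A procedure in this spirit can be sketched in a few lines: drop implausibly long "cycles" that more likely reflect lapsed app engagement than physiology, then summarize a user's remaining history with a consecutive-difference variability statistic. The 90-day cutoff and the median-absolute-difference summary below are illustrative assumptions, not the paper's exact exclusion rule:

```python
import statistics

def exclude_artifact_cycles(cycle_lengths, max_days=90):
    """Drop implausibly long 'cycles' that likely reflect lapses in app
    engagement rather than physiology (the threshold is an assumption)."""
    return [c for c in cycle_lengths if c <= max_days]

def cycle_variability(cycle_lengths):
    """Per-person variability: median absolute difference between
    consecutive cycle lengths (one common summary among several)."""
    diffs = [abs(b - a) for a, b in zip(cycle_lengths, cycle_lengths[1:])]
    return statistics.median(diffs)

history = [28, 30, 29, 180, 27, 31]       # 180-day gap = lapsed tracking
clean = exclude_artifact_cycles(history)  # -> [28, 30, 29, 27, 31]
print(cycle_variability(clean))           # -> 2.0
```

Users could then be split into low- and high-variability groups by thresholding this statistic, mirroring the "ends of the variability spectrum" comparison described above.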

    Time-delay neural network for continuous emotional dimension prediction from facial expression sequences

    "(c) 2015 IEEE. Personal use of this material is permitted. Permission from IEEE must be obtained for all other users, including reprinting/republishing this material for advertising or promotional purposes, creating new collective works for resale or redistribution to servers or lists, or reuse of any copyrighted components of this work in other works."

    Automatic continuous affective state prediction from naturalistic facial expressions is a very challenging research topic, but an important one for human-computer interaction. One of the main challenges is modeling the dynamics that characterize naturalistic expressions. In this paper, a novel two-stage automatic system is proposed to continuously predict affective dimension values from facial expression videos. In the first stage, traditional regression methods are used to classify each individual video frame, while in the second stage, a Time-Delay Neural Network (TDNN) is proposed to model the temporal relationships between consecutive predictions. The two-stage approach separates the modeling of emotional state dynamics from the individual emotional state prediction step based on input features. In doing so, the temporal information used by the TDNN is not biased by the high variability between features of consecutive frames, and allows the network to more easily exploit the slowly changing dynamics between emotional states. The system was fully tested and evaluated on three different facial expression video datasets. Our experimental results demonstrate that the use of a two-stage approach, combined with the TDNN to take previously classified frames into account, significantly improves the overall performance of continuous emotional state estimation in naturalistic facial expressions. The proposed approach won the affect recognition sub-challenge of the third international Audio/Visual Emotion Recognition Challenge (AVEC 2013).
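The two-stage idea can be sketched as a linear time-delay unit applied on top of per-frame predictions: each output combines the current and previous first-stage estimates through a small weight window. In the real system those weights are learned by the TDNN; here they are fixed, invented values used only for illustration:

```python
import numpy as np

def tdnn_smooth(frame_preds, weights):
    """Second-stage temporal model: each output is a weighted combination
    of the current and previous first-stage predictions (a single linear
    time-delay unit; the last weight applies to the current frame)."""
    preds = np.asarray(frame_preds, dtype=float)
    w = np.asarray(weights, dtype=float)
    # Pad the front with the first prediction so every frame has a full window.
    padded = np.concatenate([np.full(len(w) - 1, preds[0]), preds])
    return np.array([padded[t:t + len(w)] @ w for t in range(len(preds))])

# Noisy per-frame arousal estimates from a hypothetical first stage:
preds = np.array([0.1, 0.9, 0.2, 0.8, 0.3])
smoothed = tdnn_smooth(preds, weights=[0.25, 0.25, 0.5])
assert smoothed.std() < preds.std()   # temporal pooling damps frame-to-frame noise
```

This separation is why the second stage sees only the slowly changing prediction sequence, rather than the highly variable raw frame features.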

    Ubiquitous emotion-aware computing

    Emotions are a crucial element of personal and ubiquitous computing. What to sense and how to sense it, however, remain a challenge. This study explores the rare combination of speech, the electrocardiogram, and a revised Self-Assessment Mannequin to assess people's emotions. Forty people watched 30 International Affective Picture System pictures in either an office or a living-room environment. Additionally, the personality traits neuroticism and extroversion and demographic information (i.e., gender, nationality, and level of education) were recorded. The resulting data were analyzed using both basic emotion categories and the valence-arousal model, which enabled a comparison between the two representations. The combination of heart rate variability and three speech measures (i.e., variability of the fundamental frequency of pitch (F0), intensity, and energy) explained 90% (p < .001) of the variance in participants' experienced valence-arousal, with 88% for valence and 99% for arousal (ps < .001). The six basic emotions could also be discriminated (p < .001), although the explained variance was much lower: 18-20%. Environment (or context), the personality trait neuroticism, and gender proved useful when a nuanced assessment of people's emotions was needed. Taken together, this study provides a significant leap toward robust, generic, and ubiquitous emotion-aware computing.
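The "percentage of experienced valence-arousal explained" corresponds to the R² of a regression of the ratings on the physiological and speech features. A minimal sketch of that computation with synthetic stand-in features (the data and coefficients here are invented, not the study's):

```python
import numpy as np

def explained_variance(X, y):
    """R^2 of an ordinary least-squares fit of y on feature matrix X
    (with an intercept) -- the 'percentage explained' figure of merit."""
    X1 = np.column_stack([np.ones(len(X)), X])
    coef, *_ = np.linalg.lstsq(X1, y, rcond=None)
    resid = y - X1 @ coef
    return 1.0 - resid.var() / y.var()

rng = np.random.default_rng(0)
hrv = rng.normal(size=200)    # stand-in for heart rate variability
f0 = rng.normal(size=200)     # stand-in for F0 (pitch) variability
arousal = 0.9 * hrv + 0.4 * f0 + 0.1 * rng.normal(size=200)
r2 = explained_variance(np.column_stack([hrv, f0]), arousal)
assert r2 > 0.9   # a strong linear fit, as reported for the arousal model
```

In practice the reported 88%/99% figures would come from fitting real HRV and speech measures against self-assessed valence and arousal ratings in exactly this way.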

    Naturalistic monitoring of the affect-heart rate relationship: A Day Reconstruction Study

    Objective: Prospective studies have linked both negative affective states and trait neuroticism with hypertension, cardiovascular disease, and mortality. However, identifying how fluctuations in cardiovascular activity in day-to-day settings relate to changes in affect and stable personality characteristics has remained a methodological and logistical challenge. Design: In the present study, we tested the association between affect, affect variability, personality, and heart rate (HR) in daily life. Measures: We used an online day reconstruction survey to produce a continuous account of affect, interaction, and activity patterns during waking hours. Ambulatory HR was assessed during the same period. Consumption, activity, and baseline physiological characteristics were assessed in order to isolate the relationships between affect, personality, and heart rate. Results: Negative affect and variability in positive affect predicted an elevated ambulatory HR, and tiredness a lower HR. Emotional stability was inversely related to HR, whereas agreeableness predicted a higher HR. Baseline resting HR was unrelated to either affect or personality. Conclusion: The results suggest that both state and trait factors implicated in negative affectivity may be risk factors for increased cardiovascular reactivity in everyday life. Combining day reconstruction with psychophysiological and environmental monitoring is discussed as a minimally invasive method with promising interdisciplinary relevance. Keywords: heart rate, negative affect, affect variability, Big Five, Day Reconstruction Method
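The core data-fusion step here, pairing reconstructed episodes with ambulatory recordings, amounts to averaging the HR samples whose timestamps fall inside each episode. A minimal sketch with invented timestamps and episode labels:

```python
def episode_mean_hr(episodes, hr_samples):
    """Average the ambulatory heart-rate samples falling inside each
    reconstructed (start, end, label) episode; timestamps in minutes."""
    out = {}
    for start, end, label in episodes:
        beats = [hr for t, hr in hr_samples if start <= t < end]
        out[label] = sum(beats) / len(beats) if beats else None
    return out

episodes = [(0, 60, "commute"), (60, 120, "meeting")]
hr = [(10, 82), (40, 88), (70, 74), (110, 70)]
print(episode_mean_hr(episodes, hr))   # {'commute': 85.0, 'meeting': 72.0}
```

The per-episode means can then be regressed on the affect ratings reconstructed for the same episodes, which is the association the study tests.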

    Perception of emotion in sounded and imagined music

    We studied the emotional responses of musicians to familiar classical music excerpts both when the music was sounded and when it was imagined. We used continuous response methodology to record response profiles for the dimensions of valence and arousal simultaneously, and then on the single dimension of emotionality. The response profiles were compared using cross-correlation analysis and an analysis of responses to musical feature turning points, which isolate instances of change in musical features thought to influence valence and arousal responses. We found strong similarity in the use of the emotionality and arousal scales across the stimuli, regardless of condition (imagined or sounded). A majority of participants were able to create emotional response profiles while imagining the music that were similar in timing to the response profiles created while listening to the sounded music. We conclude that similar mechanisms may be involved in the processing of emotion in music whether the music is sounded or imagined.

    Facial Expression Recognition in the Presence of Head Motion
