
    EVALUATION OF THE EMOTIONAL STATE IN THE OUTDOOR RECREATIONAL ACTIVITIES

    Gundega Ulme, Behnam Boobani, Daina Arne, Juris Grants. Latvian Academy of Sport Education, 333 Brivibas Street, Riga, LV-1006, Latvia. E-mail: [email protected], [email protected], [email protected], [email protected]. The aim of the study is to evaluate the emotional state of volunteer participants in outdoor recreational activities (downhill skiing and cycling) by analysing their facial expressions and their self-assessments of emotional state before and after the activities. Twenty-four volunteers (8 women and 16 men, average age 38 years) participated in the study. The participants' emotional state was assessed using the Sports Emotion Questionnaire (SEQ) and the software FaceReader 3.0. The data obtained in the study were processed using SPSS software, and Spearman's rank correlation coefficient was calculated. The data show that all of the participants' emotions are more intense before the recreational activities than after. Summarizing the findings, the results indicate the value of administering the Sports Emotion Questionnaire both before and after recreational activities. Because data from self-assessments of emotional state often differ from data obtained by observation or by FaceReader, the recreation specialist needs to develop the ability to observe the emotional state of respondents using different methods; an interview or conversation is recommended before, during, and after the recreational activity. Information on emotional state would allow recreational activities to be adapted to the current needs of the participant, encouraging more efficient involvement in the physical recreation process.
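    The Spearman's rank correlation used in the study can be illustrated with a short, self-contained sketch. The data values and variable names below are invented for demonstration; the study itself used SPSS, not Python.

```python
# Pure-Python sketch of Spearman's rank correlation (ties get average ranks).
# Invented demo data: SEQ self-assessment scores vs. FaceReader intensities.

def rank(values):
    """Return average ranks (1-based) for a list of values, handling ties."""
    ordered = sorted(values)
    ranks = []
    for v in values:
        first = ordered.index(v) + 1           # rank of first occurrence
        count = ordered.count(v)               # number of tied values
        ranks.append(first + (count - 1) / 2)  # average rank across the tie
    return ranks

def spearman_rho(x, y):
    """Spearman's rho = Pearson correlation of the two rank vectors."""
    n = len(x)
    rx, ry = rank(x), rank(y)
    mx, my = sum(rx) / n, sum(ry) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(rx, ry))
    sx = sum((a - mx) ** 2 for a in rx) ** 0.5
    sy = sum((b - my) ** 2 for b in ry) ** 0.5
    return cov / (sx * sy)

# Hypothetical scores for 8 participants (1-5 SEQ scale, 0-1 FaceReader scale)
seq_scores = [4, 3, 5, 2, 4, 3, 5, 1]
facereader_scores = [0.7, 0.4, 0.9, 0.3, 0.6, 0.5, 0.8, 0.2]
print(f"Spearman rho = {spearman_rho(seq_scores, facereader_scores):.2f}")
```

    A rank-based coefficient suits this setting because the SEQ and FaceReader report emotion on different, ordinal-like scales, so only the monotonic agreement between them is meaningful.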

    Facial Emotional Classifier For Natural Interaction

    The recognition of emotional information is a key step toward giving computers the ability to interact more naturally and intelligently with people. We present a simple and computationally feasible method to perform automatic emotional classification of facial expressions. We propose the use of a set of characteristic facial points (that are part of the MPEG4 feature points) to extract relevant emotional information (basically five distances, presence of wrinkles in the eyebrow and mouth shape). The method defines and detects the six basic emotions (plus the neutral one) in terms of this information and has been fine-tuned with a database of more than 1500 images. The system has been integrated in a 3D engine for managing virtual characters, allowing the exploration of new forms of natural interaction
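    The distance-based classification idea described above can be sketched roughly as follows. The thresholds, rules, and feature names are hypothetical placeholders; the paper's actual system uses MPEG-4 feature points and was fine-tuned on a database of more than 1500 images.

```python
# Hypothetical sketch of rule-based emotion classification from a few facial
# distances, each normalized so that a neutral face gives 1.0. The rules and
# thresholds below are invented for illustration only.

def classify_expression(brow_eye_dist, mouth_width, mouth_open, brow_wrinkle):
    """Return a basic emotion label (or 'neutral') from normalized distances.

    brow_eye_dist: eyebrow-to-eye distance relative to neutral
    mouth_width:   mouth corner-to-corner distance relative to neutral
    mouth_open:    lip separation relative to neutral
    brow_wrinkle:  whether wrinkles are present between the eyebrows
    """
    if brow_wrinkle and brow_eye_dist < 0.8:
        return "anger"        # lowered brows plus wrinkling
    if brow_eye_dist > 1.2 and mouth_open > 1.5:
        return "surprise"     # raised brows and open mouth
    if mouth_width > 1.2 and mouth_open < 1.2:
        return "happiness"    # widened (smiling) mouth
    if mouth_width < 0.9 and brow_eye_dist < 0.9:
        return "sadness"      # narrowed mouth, lowered brows
    return "neutral"

print(classify_expression(1.3, 1.0, 1.6, False))
```

    In practice such rules would be derived from labelled images rather than hand-set, but the structure shows how a handful of point-to-point distances can drive a computationally cheap classifier.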

    Measuring user emotionality on online videos: A comparison between self-report and facial expression analysis

    A common factor underlying the popularity of online videos is their virality. Marketers and academics have sought to understand not only how online virality occurs but also how it can be measured. The aim of this paper is therefore threefold: a) to advance the understanding of what online video virality is; b) to propose a conceptual framework for measuring video virality; c) to evaluate two contrasting methods for measuring it. The conceptual framework identifies emotions and social groups as key elements of video virality, and the tools proposed for measuring online video virality are FaceReader and an online web questionnaire. The findings indicate the existence of discriminant validity between the two methods, which adds to the theoretical advancement the notion that video marketers and researchers cannot use self-report to measure emotions interchangeably with facial expression analysis on online videos.

    Ambiente Virtual de Aprendizagem: uma abordagem baseada em mediação tecnológica personalizada/ Virtual Learning Environment: an approach based on personalized technological mediation

    This research concerns the development of a personalized Virtual Learning Environment capable of identifying users' learning styles, mediated by the pedagogical agent Dóris. The objective is to identify the relationship established between learning styles, content presentation, and interaction with Dóris. An Eye Tracker device and the FaceReader software were adopted as methodological tools to evaluate the results of users' interaction with the virtual environment. This approach showed that preference regarding the model of content presentation is closely related to the orientation of the subject addressed and may vary according to the area of knowledge involved.

    ANS Responses and Facial Expressions Differentiate between the Taste of Commercial Breakfast Drinks

    The high failure rate of new market introductions, despite initially successful testing with traditional sensory and consumer tests, necessitates the development of other tests. This study explored the ability of selected physiological and behavioral measures of the autonomic nervous system (ANS) to distinguish between repeated exposures to foods from a single category (breakfast drinks) with similar liking ratings. In this within-subject study, 19 healthy young adults sipped from five breakfast drinks, each presented five times, while ANS responses (heart rate, skin conductance response, and skin temperature), facial expressions, liking, and intensities were recorded. The results showed that liking was associated with increased heart rate and skin temperature and more neutral facial expressions. Intensity was associated with reduced heart rate and skin temperature, more neutral expressions, and more negative expressions of sadness, anger, and surprise. The strongest associations with liking were found after 1 second of tasting, whereas the strongest associations with intensity were found after 2 seconds. Future studies should verify the contribution of this additional information to the prediction of market success.

    An Efficient Boosted Classifier Tree-Based Feature Point Tracking System for Facial Expression Analysis

    The study of facial movement and expression has been a prominent area of research since the early work of Charles Darwin. The Facial Action Coding System (FACS), developed by Paul Ekman, introduced the first universal method of coding and measuring facial movement. Human-computer interaction seeks to make human interaction with computer systems more effective, easier, safer, and more seamless. Facial expression recognition can be broken down into three distinct subsections: facial feature localization, facial action recognition, and facial expression classification. The first and most important stage in any facial expression analysis system is the localization of key facial features. Localization must be accurate and efficient to ensure reliable tracking, and must leave time for computation and comparison against learned facial models while maintaining real-time performance. Two possible methods for localizing facial features are discussed in this dissertation. The Active Appearance Model is a statistical model describing an object's parameters through the use of both shape and texture models, resulting in appearance. Statistical model-based training for object recognition takes multiple instances of the object class of interest (positive samples) and multiple negative samples, i.e., images that do not contain objects of interest. Viola and Jones present a highly robust real-time face detection system based on a statistically boosted attentional detection cascade composed of many weak feature detectors. A basic algorithm for eliminating unnecessary sub-frames during Viola-Jones face detection is presented to further reduce image search time. A real-time emotion detection system is presented which is capable of identifying seven affective states (agreeing, concentrating, disagreeing, interested, thinking, unsure, and angry) from a near-infrared video stream.
    The Active Appearance Model is used to place 23 landmark points around key areas of the eyes, brows, and mouth. A prioritized binary decision tree then detects, based on the actions of these key points, whether one of the seven emotional states occurs as frames pass. The completed system runs accurately and achieves a real-time frame rate of approximately 36 frames per second. A novel facial feature localization technique utilizing a nested cascade classifier tree is proposed. A coarse-to-fine search is performed in which the regions of interest are defined by the responses of the Haar-like features comprising the cascade classifiers. The individual responses of the Haar-like features are also used to activate finer-level searches. A specially cropped training set derived from the Cohn-Kanade AU-coded database is also developed and tested. Extensions of this research include further testing to verify the novel facial feature localization technique for a full 26-point face model, and implementation of a real-time, intensity-sensitive, automated Facial Action Coding System.
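    The sub-frame elimination idea mentioned in this abstract (skipping sliding-window positions that fall within an already-detected face region) can be sketched as follows. The rectangle representation, step size, and the absence of an actual detector are assumptions made for illustration; they are not the dissertation's implementation.

```python
# Sketch of sub-frame elimination for a sliding-window face search:
# once a face has been detected, window positions overlapping that
# rectangle are skipped, reducing the number of sub-frames evaluated.
# Rectangles are (x, y, w, h) tuples; the detector itself is omitted.

def overlaps(window, face):
    """True if two axis-aligned rectangles intersect."""
    wx, wy, ww, wh = window
    fx, fy, fw, fh = face
    return not (wx + ww <= fx or fx + fw <= wx or
                wy + wh <= fy or fy + fh <= wy)

def scan(image_size, window, step, detected_faces):
    """Yield sliding-window positions, skipping those inside known faces."""
    W, H = image_size
    w, h = window
    for y in range(0, H - h + 1, step):
        for x in range(0, W - w + 1, step):
            if any(overlaps((x, y, w, h), f) for f in detected_faces):
                continue  # sub-frame overlaps a detected face: skip it
            yield (x, y, w, h)

# With no known faces, a 100x100 image scanned by a 20x20 window at
# step 20 yields 25 sub-frames; a prior detection prunes some of them.
print(len(list(scan((100, 100), (20, 20), 20, [(0, 0, 40, 40)]))))
```

    In a real cascade-based detector the pruning gain compounds, since every skipped sub-frame avoids the full sequence of weak-classifier evaluations.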