
    FACETEQ interface demo for emotion expression in VR

    © 2017 IEEE. Faceteq prototype v.05 is a wearable technology for measuring facial expressions and biometric responses in experimental Virtual Reality studies. Developed by the Emteq Ltd laboratory, Faceteq can enable new avenues for virtual reality research through a combination of high-performance patented dry-sensor technologies, proprietary algorithms, and real-time data acquisition and streaming. Emteq founded the Faceteq project with the aim of providing a human-centered additional tool for emotion expression, affective human-computer interaction, and social virtual environments. The proposed demonstration will exhibit the hardware and its functionality by allowing attendees to experience three of the showcase applications we developed this year.

    FACETEQ: A novel platform for measuring emotion in VR

    FaceTeq prototype v.05 is a wearable technology for measuring facial expressions and biometric responses in experimental Virtual Reality studies. Developed by the Emteq Ltd laboratory, FaceTeq can enable new avenues for virtual reality research through a combination of high-performance patented dry-sensor technologies, proprietary algorithms, and real-time data acquisition and streaming. The FaceTeq project was founded with the aim of providing a human-centred additional tool for emotion expression, affective human-computer interaction, and social virtual environments. The proposed poster will exhibit the hardware and its functionality.

    User emotional interaction processor: a tool to support the development of GUIs through physiological user monitoring

    Ever since computers entered humans' daily lives, activity between the human and digital ecosystems has increased. This increase encourages the development of smarter and more user-friendly human-computer interfaces. However, the means of testing these interfaces have been limited, for the most part restricted to the conventional "manual" interface, in which physical input is required: participants test the interfaces using a keyboard, mouse, or touch screen, and communication between participants and designers is required. Another method, applied in this dissertation, requires no physical input from participants: Affective Computing. This dissertation presents the development of a tool to support the development of graphical interfaces, based on monitoring psychological and physiological aspects of the user (emotions and attention), aiming to improve the end user's experience, with the ultimate goal of improving interface design. The development of this tool is described. The results, provided by designers from an IT company, suggest that the tool is useful, but that the optimized interface it generates still has some flaws. These flaws are mainly related to the lack of consideration of a general context in the interface-generation process.

    Towards emotional interaction: using movies to automatically learn users’ emotional states

    The HCI community is actively seeking novel methodologies to gain insight into the user's experience during interaction with both the application and the content. We propose an emotion recognition engine capable of automatically recognizing a set of human emotional states using psychophysiological measures of the autonomic nervous system, including galvanic skin response, respiration, and heart rate. A novel pattern recognition system, based on discriminant analysis and support vector machine classifiers, is trained using movie scenes selected to induce emotions ranging across the valence dimension from positive to negative, including happiness, anger, disgust, sadness, and fear. In this paper we introduce the emotion recognition system and evaluate its accuracy by presenting the results of an experiment conducted with three physiological sensors.
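    The classification step described above can be illustrated in miniature. The following is a hedged sketch only, not the authors' implementation: it substitutes a simple nearest-centroid rule (a minimal stand-in for discriminant analysis) for the paper's trained discriminant-analysis/SVM pipeline, and all feature values and emotion centroids are synthetic, invented for illustration.

    ```python
    # Illustrative sketch: classify a physiological sample into an emotion class
    # by nearest centroid. The centroids below are hypothetical, NOT values from
    # the paper; a real system would learn them from movie-induced training data.
    import math

    # Hypothetical per-emotion centroids of [GSR (microsiemens),
    # respiration rate (breaths/min), heart rate (bpm)].
    CENTROIDS = {
        "happiness": [4.0, 14.0, 75.0],
        "fear":      [9.0, 20.0, 95.0],
        "sadness":   [3.0, 12.0, 65.0],
    }

    def classify(sample):
        """Return the emotion whose centroid is nearest (Euclidean) to the sample."""
        return min(CENTROIDS, key=lambda e: math.dist(sample, CENTROIDS[e]))

    print(classify([8.5, 19.0, 92.0]))  # high-arousal sample -> "fear"
    print(classify([3.1, 12.5, 66.0]))  # low-arousal sample  -> "sadness"
    ```

    An SVM, as used in the paper, would instead learn a maximum-margin boundary between these classes, but the input/output shape (feature vector in, emotion label out) is the same.
    
    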

    Feel the Moosic: Emotion-based Music Selection and Recommendation

    Digital transformation has changed all aspects of life, including the music market and listening habits. The spread of mobile devices and music streaming services has made it possible to access a huge selection of music regardless of time or place. However, this access confronts the customer with the problem of choosing the right music for a given situation or mood, and users are often overwhelmed by the choice. Context information, especially the user's emotional state, can help in this process. The possibilities for emotion-based music selection are currently limited: providers rely on predefined playlists for different situations or moods, and the problem with these lists is that they do not adapt to changing user conditions. Simple, intuitive, and automatic emotion-based music selection has so far been little investigated in IS practice and research. This paper describes the IS music research project Moosic, which investigates and iteratively implements an intuitive emotion-based music recommendation application. In addition, an initial evaluation of the prototype is discussed and an outlook on further development is given.

    Emotions, behaviour and belief regulation in an intelligent guide with attitude

    Abstract unavailable; please refer to PDF.

    Grabbing attention while reading website pages: the influence of verbal emotional cues in advertising

    The increasing use of the World Wide Web has promised a huge advertising platform for marketers. Investment in online advertising is growing and is expected to overtake traditional media. However, recent studies have reported that users avoid looking at advertising displayed on the World Wide Web. This study examined, using an eye-tracking system, the impact of verbal emotional cues (negative/neutral/positive) on capturing attention in websites' advertising areas. The results revealed statistically significant differences between fixations on negative and positive words versus neutral words. Significant differences between the number of fixations and recognition of the target words were found only for negatively valenced words. We conclude that negative emotional words may play a major role in user attention to advertising.