18 research outputs found

    Towards an Effective Arousal Detection System for Virtual Reality

    Immersive technologies offer the potential to drive engagement and create exciting experiences. A better understanding of the emotional state of the user within immersive experiences can assist in healthcare interventions and the evaluation of entertainment technologies. This work describes a feasibility study exploring the effect of affective video content on heart-rate recordings for Virtual Reality applications. A low-cost reflected-mode photoplethysmographic sensor and an electrocardiographic chest-belt sensor were attached to a novel non-invasive wearable interface designed specifically for this study. The responses of 11 participants were analysed, and heart-rate metrics were used for arousal classification. The reported results demonstrate that the fusion of physiological signals yields a significant performance improvement, and hence the feasibility of our new approach.
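    As a rough illustration of the kind of pipeline this abstract describes, the sketch below fuses heart-rate metrics from two modalities at the feature level and compares single-sensor versus fused classification. The specific features (mean HR, SDNN, RMSSD), the SVM classifier, and the synthetic data are assumptions for illustration, not the study's actual method.

    ```python
    # Hypothetical sketch of feature-level fusion of heart-rate metrics
    # from a PPG sensor and an ECG chest belt for binary arousal
    # classification. Feature names and data are illustrative only.
    import numpy as np
    from sklearn.model_selection import cross_val_score
    from sklearn.pipeline import make_pipeline
    from sklearn.preprocessing import StandardScaler
    from sklearn.svm import SVC

    rng = np.random.default_rng(0)
    n = 110  # e.g. several stimulus segments per participant (synthetic here)

    # Per-segment heart-rate metrics (e.g. mean HR, SDNN, RMSSD) per sensor.
    ppg_features = rng.normal(size=(n, 3))
    ecg_features = rng.normal(size=(n, 3))
    arousal = rng.integers(0, 2, size=n)  # low/high arousal labels

    # Feature-level fusion: concatenate the two modalities.
    fused = np.hstack([ppg_features, ecg_features])

    clf = make_pipeline(StandardScaler(), SVC(kernel="rbf"))
    print("PPG only :", cross_val_score(clf, ppg_features, arousal, cv=5).mean())
    print("Fused    :", cross_val_score(clf, fused, arousal, cv=5).mean())
    ```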

    Wearable artificial intelligence for anxiety and depression: A scoping review

    Background: Anxiety and depression are the most common mental disorders worldwide. Owing to the shortage of psychiatrists around the world, AI has been incorporated into wearable devices (wearable artificial intelligence (AI)) to provide mental health services. Objective: This review aimed to explore the features of wearable AI used for anxiety and depression in order to identify application areas and open research issues. Methods: We searched 8 electronic databases (MEDLINE, PsycINFO, EMBASE, CINAHL, IEEE Xplore, ACM Digital Library, Scopus, and Google Scholar). We then checked studies that cited the included studies and screened studies that were cited by the included studies. Study selection and data extraction were carried out by two reviewers independently. The extracted data were aggregated and summarized using narrative synthesis. Results: Of the 1203 citations identified, 69 studies were included in this review. About two-thirds of the studies used wearable AI for depression, while the remaining studies used it for anxiety. The most frequent application of wearable AI was diagnosing anxiety and depression, while no studies used it for treatment purposes. The majority of studies targeted individuals between the ages of 18 and 65. The most common wearable device used in the studies was the Actiwatch AW4, and wrist-worn devices were the most common form factor. The most commonly used data for model development were physical activity data, sleep data, and heart rate data. The most frequently used open dataset was Depresjon. The most commonly used algorithms were Random Forest (RF) and Support Vector Machine (SVM). Conclusions: Wearable AI offers great promise for providing mental health services related to anxiety and depression. Wearable AI can be used by individuals as a pre-screening assessment for anxiety and depression. Further reviews are needed to statistically synthesize studies' results on the performance and effectiveness of wearable AI. Given its potential, tech companies should invest more in wearable AI for the treatment of anxiety and depression.
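    To make the modelling pattern reported by the review concrete, the minimal sketch below trains a Random Forest screening model on the kinds of features the review says are most common (activity, sleep, heart rate). The feature names, labels, and data are synthetic stand-ins invented for illustration; they are not drawn from Depresjon or any included study.

    ```python
    # Illustrative only: Random Forest screening on synthetic wearable-style
    # features (actigraphy, sleep, resting heart rate).
    import numpy as np
    import pandas as pd
    from sklearn.ensemble import RandomForestClassifier
    from sklearn.model_selection import cross_val_score

    rng = np.random.default_rng(1)
    n = 200
    features = pd.DataFrame({
        "mean_daily_activity": rng.normal(300, 80, n),   # actigraphy counts
        "sleep_duration_h": rng.normal(7, 1.2, n),
        "sleep_efficiency": rng.uniform(0.6, 0.98, n),
        "resting_hr_bpm": rng.normal(65, 8, n),
    })
    label = rng.integers(0, 2, n)  # 1 = screened positive (synthetic labels)

    model = RandomForestClassifier(n_estimators=200, random_state=0)
    scores = cross_val_score(model, features, label, cv=5, scoring="roc_auc")
    print("AUC on synthetic data:", scores.mean().round(3))
    ```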

    Classifying Smart Personal Assistants: An Empirical Cluster Analysis

    The digital age has yielded systems that increasingly reduce the complexity of our everyday lives. Smart personal assistants (SPAs) such as Amazon’s Alexa or Apple’s Siri combine the comfort of intuitive natural-language interaction with the utility of personalized and situation-dependent information and service provision. However, research on SPAs is becoming increasingly complex and opaque. To reduce this complexity, this paper introduces a classification system for SPAs. Based on a systematic literature review, a cluster analysis reveals five SPA archetypes: Adaptive Voice (Vision) Assistants, Chatbot Assistants, Embodied Virtual Assistants, Passive Pervasive Assistants, and Natural Conversation Assistants.
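    The sketch below shows, in outline, what a cluster analysis over literature-coded assistant characteristics can look like. The coded traits, the distance/linkage choices, and the data are invented assumptions; the paper's coding scheme and clustering procedure are not specified here.

    ```python
    # Hypothetical cluster analysis of SPAs coded on binary design traits,
    # cut into five groups to mirror the five archetypes the paper reports.
    import numpy as np
    from scipy.cluster.hierarchy import linkage, fcluster
    from scipy.spatial.distance import pdist

    rng = np.random.default_rng(2)
    # Rows = SPA instances from a literature sample; columns = coded traits
    # (e.g. voice interface, embodiment, proactivity, text chat) -- assumed.
    coded = rng.integers(0, 2, size=(40, 4))

    # Average linkage on a binary (Jaccard) distance, cut into five clusters.
    dist = pdist(coded.astype(bool), metric="jaccard")
    tree = linkage(dist, method="average")
    archetype = fcluster(tree, t=5, criterion="maxclust")
    print(np.bincount(archetype)[1:])  # cluster sizes
    ```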

    Towards Optimized K Means Clustering using Nature-inspired Algorithms for Software Bug Prediction

    In today's software development environment, the necessity of providing quality software products has undoubtedly remained the greatest difficulty. As a result, early software bug prediction in the development phase is critical for lowering maintenance costs and improving overall software performance. Clustering is a well-known unsupervised method for data classification and for finding related patterns hidden in a dataset.
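    As a rough sketch of the idea in the title, the example below uses a tiny genetic algorithm (a stand-in for whichever nature-inspired algorithms the paper actually evaluates, which the truncated abstract does not name) to search for good K-Means centroids on software-metric-like data, then clusters modules into two groups. The data and fitness choices are illustrative assumptions.

    ```python
    # Nature-inspired (genetic-algorithm-style) search over K-Means centroid
    # initialisations, evaluated by within-cluster sum of squares (inertia).
    import numpy as np
    from sklearn.cluster import KMeans
    from sklearn.datasets import make_blobs

    rng = np.random.default_rng(3)
    X, _ = make_blobs(n_samples=300, centers=2, n_features=5, random_state=3)
    k = 2  # e.g. likely-buggy vs. clean clusters

    def fitness(centroids):
        # Lower inertia is better for a candidate centroid set.
        km = KMeans(n_clusters=k, init=centroids, n_init=1).fit(X)
        return km.inertia_

    # Small population of candidate centroid sets sampled from the data.
    pop = [X[rng.choice(len(X), k, replace=False)] for _ in range(20)]
    for _ in range(30):                                   # generations
        pop.sort(key=fitness)
        parents = pop[:10]                                # selection
        children = [p + rng.normal(0, 0.3, p.shape) for p in parents]  # mutation
        pop = parents + children

    best = min(pop, key=fitness)
    labels = KMeans(n_clusters=k, init=best, n_init=1).fit_predict(X)
    print(np.bincount(labels))
    ```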

    Social Robots in Hospitals: A Systematic Review

    Hospital environments are facing new challenges this century. One of the most important is the quality of services to patients. Social robots are gaining prominence due to the advantages they offer; in particular, several of their main uses have proven beneficial during the pandemic. This study aims to shed light on the current status of the design of social robots and their interaction with patients. To this end, a systematic review was conducted using WoS and MEDLINE, and the results were exhaustively analyzed. The authors found that most of the initiatives and projects serve the elderly and children, and specifically, that they helped these groups fight diseases such as dementia, autism spectrum disorder (ASD), cancer, and diabetes.

    The Compositor Tool: Investigating Consumer Experiences in the Circular Economy

    Humanity is living through a crisis that sees our way of life exhausting the resources of the earth and ourselves. The fashion sector shows the negative impacts of conspicuous consumption on our socioenvironmental wellbeing. Despite citizens’ growing awareness of their responsibility within consumption cycles, they reveal concerns about their lack of understanding and the support required for them to become agents of responsible consumption. The Circular Economy flourishes as a conceptual approach to help society transition to a more sustainable existence. This paper explores how emerging creative technology and interaction design might support a shift in the role of citizens in the Circular Economy. We performed a design inquiry that investigated the moment of acquisition via configuration of products, storytelling, and multimodal interaction techniques for the creation of experiences that could catalyse citizen-consumers to become custodians of materials. We developed a retail-based concept tool—The Compositor Tool—with which we ran a user study to investigate new experiential ways that consumers can participate in materials’ circularity. The study highlighted how experience design and new interaction techniques can introduce circularity as part of the consumer experience by forging deeper connections between people and products/materials and enabling consumers to have more creative and informative material engagement.

    Recognition of Human Emotion using Radial Basis Function Neural Networks with Inverse Fisher Transformed Physiological Signals

    Emotion is a complex state of the human mind influenced by bodily physiological changes and interdependent external events, thus making automatic recognition of emotional state a challenging task. A number of recognition methods have been applied in recent years to recognize human emotion. The motivation for this study is therefore to discover a combination of emotion features and a recognition method that will produce the best result in building an efficient emotion recognizer for an affective system. We introduced a shifted tanh normalization scheme to realize the inverse Fisher transformation applied to the DEAP physiological dataset and subsequently performed a series of experiments using Radial Basis Function Artificial Neural Networks (RBFANN). In our experiments, we compared the performance of digital-image-based feature extraction techniques such as the Histogram of Oriented Gradients (HOG), Local Binary Patterns (LBP), and the Histogram of Images (HIM). These feature extraction techniques were used to extract discriminatory features from the multimodal DEAP dataset of physiological signals. Experimental results indicate that the best recognition accuracy was achieved on the EEG modality using the HIM feature extraction technique, with classification performed along the dominance emotion dimension. The result compares very favourably with existing results in the literature, including deep learning studies that have utilized the DEAP corpus, and the approach is also applicable to diverse fields of engineering.
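    A hedged sketch of two ideas the abstract mentions is given below: a shifted tanh normalization, which realises the inverse Fisher transform x → tanh(x) rescaled into [0, 1], followed by a simple RBF network (K-Means centres, Gaussian hidden layer, linear read-out). This is not the paper's exact pipeline; the feature matrix, labels, and hyperparameters are synthetic stand-ins for DEAP-derived features.

    ```python
    # Shifted-tanh (inverse Fisher) normalisation + a minimal RBF network.
    import numpy as np
    from sklearn.cluster import KMeans
    from sklearn.linear_model import LogisticRegression
    from sklearn.metrics import accuracy_score
    from sklearn.model_selection import train_test_split

    rng = np.random.default_rng(4)
    X = rng.normal(size=(320, 16))    # stand-in physiological features
    y = rng.integers(0, 2, size=320)  # e.g. low/high dominance labels

    def shifted_tanh(x):
        # Standardise, apply the inverse Fisher transform (tanh), then shift
        # the result from [-1, 1] into [0, 1].
        z = (x - x.mean(axis=0)) / x.std(axis=0)
        return 0.5 * (np.tanh(z) + 1.0)

    def rbf_layer(x, centres, gamma=1.0):
        # Gaussian activation of each sample w.r.t. each hidden-unit centre.
        d2 = ((x[:, None, :] - centres[None, :, :]) ** 2).sum(axis=2)
        return np.exp(-gamma * d2)

    Xn = shifted_tanh(X)
    X_tr, X_te, y_tr, y_te = train_test_split(Xn, y, test_size=0.25, random_state=0)

    centres = KMeans(n_clusters=20, n_init=10, random_state=0).fit(X_tr).cluster_centers_
    readout = LogisticRegression(max_iter=1000).fit(rbf_layer(X_tr, centres), y_tr)
    pred = readout.predict(rbf_layer(X_te, centres))
    print("Accuracy on synthetic data:", round(accuracy_score(y_te, pred), 3))
    ```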

    Emotional expressions reconsidered: challenges to inferring emotion from human facial movements

    It is commonly assumed that a person’s emotional state can be readily inferred from his or her facial movements, typically called emotional expressions or facial expressions. This assumption influences legal judgments, policy decisions, national security protocols, and educational practices; guides the diagnosis and treatment of psychiatric illness, as well as the development of commercial applications; and pervades everyday social interactions as well as research in other scientific fields such as artificial intelligence, neuroscience, and computer vision. In this article, we survey examples of this widespread assumption, which we refer to as the common view, and we then examine the scientific evidence that tests this view, focusing on the six most popular emotion categories used by consumers of emotion research: anger, disgust, fear, happiness, sadness, and surprise. The available scientific evidence suggests that people do sometimes smile when happy, frown when sad, scowl when angry, and so on, as proposed by the common view, more than would be expected by chance. Yet how people communicate anger, disgust, fear, happiness, sadness, and surprise varies substantially across cultures, situations, and even across people within a single situation. Furthermore, similar configurations of facial movements variably express instances of more than one emotion category. In fact, a given configuration of facial movements, such as a scowl, often communicates something other than an emotional state. Scientists agree that facial movements convey a range of information and are important for social communication, emotional or otherwise. But our review suggests an urgent need for research that examines how people actually move their faces to express emotions and other social information in the variety of contexts that make up everyday life, as well as careful study of the mechanisms by which people perceive instances of emotion in one another. We make specific research recommendations that will yield a more valid picture of how people move their faces to express emotions and how they infer emotional meaning from facial movements in situations of everyday life. This research is crucial to provide consumers of emotion research with the translational information they require.