7 research outputs found

    Informing Intelligent User Interfaces by Inferring Affective States from Body Postures in Ubiquitous Computing Environments

    No full text
    Intelligent User Interfaces can benefit from having knowledge of the user’s emotion. However, current approaches to detecting affective states often constrain the user’s freedom of movement by instrumenting her with sensors, which prevents affective computing from being deployed in naturalistic and ubiquitous computing contexts. In this paper, we present a novel system called mASqUE, which uses a set of association rules to infer someone’s affective state from their body postures. This is done without any user instrumentation, using off-the-shelf and inexpensive commodity hardware: a depth camera tracks the body postures of the users, and these postures also serve as an indicator of their openness. By combining the posture information with physiological sensor measurements we were able to mine a set of association rules relating postures to affective states. We demonstrate the possibility of inferring affective states from body postures in ubiquitous computing environments, and our study also provides insights into how this opens up new possibilities for IUIs to access the affective states of users from body postures in a non-intrusive way.
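
    The abstract does not detail the rule-mining step; as a rough, hypothetical sketch of the general technique it names (association rules relating posture descriptors to affective states), the Python fragment below runs a minimal Apriori-style pass over labelled observations. The descriptor names, labels, and thresholds are invented for illustration and are not taken from mASqUE.

        from itertools import combinations
        from collections import Counter

        # Each observation pairs discretised posture descriptors with a labelled affective state.
        # Descriptor and label names here are illustrative, not the mASqUE vocabulary.
        observations = [
            ({"leaning_forward", "arms_open"}, "engaged"),
            ({"leaning_back", "arms_crossed"}, "bored"),
            ({"leaning_forward", "head_tilt"}, "engaged"),
            ({"leaning_back", "arms_crossed"}, "bored"),
        ]

        def mine_rules(observations, min_support=0.4, min_confidence=0.7):
            """Return rules (posture descriptors -> affective state) above the thresholds."""
            n = len(observations)
            rules = []
            antecedent_counts, joint_counts = Counter(), Counter()
            # Count every posture subset together with the affective state it co-occurred with.
            for postures, state in observations:
                for size in range(1, len(postures) + 1):
                    for subset in combinations(sorted(postures), size):
                        antecedent_counts[subset] += 1
                        joint_counts[(subset, state)] += 1
            for (subset, state), joint in joint_counts.items():
                support = joint / n
                confidence = joint / antecedent_counts[subset]
                if support >= min_support and confidence >= min_confidence:
                    rules.append((subset, state, support, confidence))
            return rules

        for antecedent, state, support, confidence in mine_rules(observations):
            print(f"{antecedent} -> {state} (support={support:.2f}, confidence={confidence:.2f})")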

    Patient Relationship Management (PRM) and AI: The role of Affective Computing

    Get PDF
    Dissertation presented as the partial requirement for obtaining a Master’s degree in Information Management, specialization in Knowledge Management and Business Intelligence. Artificial Intelligence (AI) has been praised as the next big thing in the computing revolution and touted as a game-changer in a variety of fields, including healthcare. The increasing popularity of this technology is driving early adoption and leading to a lack of consideration of the patient perspective in its use, bringing new sources of distrust that stem from the absence of human attributes. This study aims to address this problem by presenting a strategy in the area of affective computing to counter the absence of empathy experienced by the patient during a medical process. To reach this goal, a Design Science Research Methodology will be followed. A preliminary literature study has already been completed, and the research topic and objectives have been established. In addition, to apply the artifact developments, a bot will be built and evaluated by a set of users. The increased awareness of these AI systems is expected to stimulate their use. By adding new research to the affective computing field, this work is also expected to contribute to the evolution of digital healthcare and to encourage further scientific progress in this area.

    Designing an Educational and Intelligent Human-Computer Interface for Older Adults

    Get PDF
    As computing devices continue to become more heavily integrated into our lives, proper design of human-computer interfaces becomes an increasingly important topic. Efficient and useful human-computer interfaces need to take into account the abilities of the humans who will use them and adapt to the difficulties different users may face, such as those faced by older users. However, various issues remain in the design of human-computer interfaces for older users: older adults display a wide variance of ability, which can be difficult to design for; motions and notions found intuitive by younger users can be anything but intuitive for the older user; and properly designed devices must assist without injuring the pride and independence of their users, since devices designed “for the elderly” may encounter a poor reception when introduced to the ageing community. Affective computing gives HCI researchers a useful opportunity to develop applications whose interfaces detect mood and attention via nonverbal cues and take appropriate actions accordingly. Current work on affective computing applications with older adult users points to possibilities for reducing feelings of loneliness in the older adult population via these affective applications. However, we believe that everyday applications, such as chat programs or operating systems, can also take advantage of affective computing principles to make themselves more accessible for older adults via communication enhancement. In this thesis, we document a variety of work in the field of human-computer interfaces for older adult users and the requirements each of these studies confirms regarding human-computer interaction design for the elderly. We then explain how integrating affective computing can positively affect these designs and outline a design approach for human-computer interfaces for the elderly that takes affective computing principles into account. We then develop a case study around a chat application, ChitChat, which takes these principles and guidelines into account from the beginning, and give several examples of real-world applications also built with these guidelines. Finally, we conclude by summarizing the broader impacts of this work.
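
    As a purely hypothetical illustration of the communication-enhancement idea described above (not the actual ChitChat design), the sketch below maps a detected affective state to simple interface adaptations; the state names, adaptation fields, and policy values are all assumptions made for the example.

        from dataclasses import dataclass

        @dataclass
        class Adaptation:
            """Interface changes applied in response to a detected affective state."""
            font_scale: float      # enlarge text when frustration suggests readability problems
            suggest_break: bool    # offer a pause instead of pushing more notifications
            tone_hint: str         # hint for message phrasing shown to the conversation partner

        # Illustrative policy table: affective state -> adaptation (invented, not from the thesis).
        POLICY = {
            "frustrated": Adaptation(font_scale=1.25, suggest_break=True,  tone_hint="keep replies short"),
            "lonely":     Adaptation(font_scale=1.0,  suggest_break=False, tone_hint="invite conversation"),
            "engaged":    Adaptation(font_scale=1.0,  suggest_break=False, tone_hint="none"),
        }

        def adapt_interface(detected_state: str) -> Adaptation:
            # Fall back to a neutral adaptation when the detector reports an unknown state.
            return POLICY.get(detected_state, Adaptation(1.0, False, "none"))

        print(adapt_interface("frustrated"))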

    Developing a Hand Gesture Recognition System for Mapping Symbolic Hand Gestures to Analogous Emoji in Computer-mediated Communication

    Get PDF
    Recent trends in computer-mediated communication (CMC) have not only led to expanded instant messaging through the use of images and videos, but have also expanded traditional text messaging with richer content, so-called visual communication markers (VCMs) such as emoticons, emojis, and stickers. VCMs can prevent a potential loss of the subtle emotional content of conversation in CMC, which is otherwise delivered by nonverbal cues conveying affective and emotional information. However, as the number of VCMs in the selection set grows, the problem of VCM entry needs to be addressed. Additionally, conventional ways of accessing VCMs continue to rely on input entry methods that are not directly and intimately tied to expressive nonverbal cues. One such form of expressive nonverbal cue that does exist and is well studied is the hand gesture. In this work, I propose a user-defined hand gesture set that is highly representative of VCMs and a two-stage hand gesture recognition system (trajectory-based, shape-based) that distinguishes the user-defined hand gestures. While the trajectory-based recognizer distinguishes gestures based on the movements of the hands, the shape-based recognizer classifies gestures based on the shapes of the hands. The goal of this research is to allow users to be more immersed, natural, and quick in generating VCMs through gestures. The idea is for users to maintain the lower-bandwidth online communication of text messaging, largely retaining its convenient and discreet properties, while also incorporating the advantages of the higher-bandwidth online communication of video messaging by having users naturally gesture their emotions, which are then closely mapped to VCMs. Results show that user-dependent recognition accuracy is approximately 86% and user-independent recognition accuracy is about 82%.
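
    The thesis defines its own user-defined gesture set and recognizers; the sketch below only illustrates the two-stage idea (a trajectory classifier followed by a shape classifier, with the combined result looked up as an emoji). The trajectory heuristics, finger-state encoding, and gesture-to-emoji table are invented for the example.

        import math

        # Hypothetical templates: (trajectory class, hand-shape class) -> emoji.
        # The actual mapping in the thesis is user-defined; these entries are invented examples.
        GESTURES = {
            ("wave_path", "open_palm"): "👋",
            ("updown_path", "thumbs_up"): "👍",
            ("circle_path", "heart_hands"): "❤️",
        }

        def classify_trajectory(points):
            """Stage 1: a crude trajectory classifier based on net displacement (illustrative only)."""
            dx = points[-1][0] - points[0][0]
            dy = points[-1][1] - points[0][1]
            length = sum(math.dist(a, b) for a, b in zip(points, points[1:]))
            if length > 3 * math.hypot(dx, dy):
                return "circle_path"          # path much longer than displacement: looping motion
            return "wave_path" if abs(dx) > abs(dy) else "updown_path"

        def classify_shape(finger_states):
            """Stage 2: a crude shape classifier from per-finger extension flags (illustrative only)."""
            extended = sum(finger_states)
            if extended == 5:
                return "open_palm"
            if extended == 1 and finger_states[0]:   # only the thumb extended
                return "thumbs_up"
            return "heart_hands"

        def gesture_to_emoji(points, finger_states):
            key = (classify_trajectory(points), classify_shape(finger_states))
            return GESTURES.get(key)  # None when the two stages match no template

        # Example: a mostly horizontal sweep with an open palm maps to the wave emoji.
        print(gesture_to_emoji([(0, 0), (2, 0.1), (4, -0.1), (6, 0)], [1, 1, 1, 1, 1]))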

    A Model-Based Approach for Gesture Interfaces

    Get PDF
    The description of a gesture requires temporal analysis of the values generated by input sensors, and it does not fit well with the observer pattern traditionally used by frameworks to handle user input. The current solution is to embed particular gesture-based interactions into frameworks by notifying the application only when a gesture has been detected completely. This approach suffers from a lack of flexibility unless the programmer performs explicit temporal analysis of raw sensor data. This thesis proposes a compositional, declarative meta-model for gesture definition based on Petri Nets. Basic traits are used as building blocks for defining gestures; each one notifies the change of a feature value. A complex gesture is defined by the composition of other sub-gestures using a set of operators. The user interface behaviour can be associated with the recognition of the whole gesture or with any sub-component, addressing the problem of granularity in event notification. The meta-model can be instantiated for different gesture recognition supports, and its definition has been validated through a proof-of-concept library. Sample applications have been developed supporting multi-touch gestures on iOS and full-body gestures with Microsoft Kinect. In addition to the solution for the event granularity problem, this thesis discusses how to separate the definition of a gesture from the user interface behaviour using the proposed compositional approach. The gesture description meta-model has been integrated into MARIA, a model-based user interface description language, extending it with the description of full-body gesture interfaces.
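
    The meta-model itself is defined over Petri Nets; the sketch below is a deliberately simplified, non-Petri-Net illustration of the compositional idea: basic traits react to feature-change events, a composition operator (here, a sequence) builds a complex gesture from them, and UI behaviour can be attached either to a sub-gesture or to the whole composition. All names are illustrative assumptions, not the thesis API.

        class Trait:
            """A basic building block that completes when one feature-change event arrives."""
            def __init__(self, feature, on_complete=None):
                self.feature, self.on_complete, self.done = feature, on_complete, False

            def feed(self, event):
                if not self.done and event == self.feature:
                    self.done = True
                    if self.on_complete:
                        self.on_complete(self)
                return self.done

        class Sequence:
            """Composition operator: sub-gestures must complete in order (simplified semantics)."""
            def __init__(self, *parts, on_complete=None):
                self.parts, self.on_complete, self.index = parts, on_complete, 0

            def feed(self, event):
                if self.index < len(self.parts) and self.parts[self.index].feed(event):
                    self.index += 1
                    if self.index == len(self.parts) and self.on_complete:
                        self.on_complete(self)
                return self.index == len(self.parts)

        # UI behaviour can be attached to a sub-gesture (the press) or to the whole composition,
        # which is the granularity problem the meta-model addresses.
        press = Trait("finger_down", on_complete=lambda t: print("feedback: highlight target"))
        release = Trait("finger_up")
        tap = Sequence(press, release, on_complete=lambda g: print("action: activate target"))

        for event in ["finger_down", "finger_up"]:
            tap.feed(event)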

    How can people’s spatial behaviour be used to dynamically lay out content on multi-user, interactive screens, and how does this dynamic layout affect people’s spatial behaviour?

    Get PDF
    This thesis explores the factors that influence layout and presentation changes on large interactive and adaptive displays in multi-user interaction and social organisation. While significant bodies of work have considered the interactivity of digital displays to identify phenomena of use, these studies have been conducted in localised isolation and do not address the wider ecological impact of emergent organisations of simultaneous use where a system or display supports it. Through consideration of how display presentation and layout can influence the emergence of social organisation, a series of iterative lab-based studies was carried out to assess and inform a number of interaction modalities. This leads to a series of design recommendations around a system-led approach that presents a mechanism to support approach behaviours and maximise the utility of a large display while mitigating conflict between social boundaries and impact on user experience. The work identifies a range of factors in both the mechanisms of natural social organisation and the layout changes and adaptations that maintain user experience, supporting wider use, scaffolding features of the environment, on-going use, and adaptation within a novel system-led approach. It presents clear implications for the field and identifies significant areas for further research to refine the subtle factors of interaction identified here.
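
    The abstract does not specify the layout mechanism; the fragment below is a hypothetical sketch of a system-led policy that assigns each tracked user a content region based on their position along the display, keeping a margin between regions as a simple social boundary. The geometry, parameter names, and values are assumptions for illustration only.

        from dataclasses import dataclass

        @dataclass
        class Region:
            user_id: int
            left: float   # region bounds in metres along the display
            right: float

        def layout_regions(user_positions, display_width=4.0, margin=0.2, max_width=1.5):
            """Assign each tracked user a content region centred on their position.

            Regions are clamped to the display and shifted so neighbouring regions keep a
            margin between them, acting as a simple social boundary. Parameters are illustrative.
            """
            regions = []
            ordered = sorted(user_positions.items(), key=lambda item: item[1])
            for user_id, x in ordered:
                left = max(0.0, x - max_width / 2)
                right = min(display_width, x + max_width / 2)
                if regions and left < regions[-1].right + margin:
                    left = regions[-1].right + margin   # push the region right to preserve the gap
                if right > left:                        # drop regions squeezed off the display
                    regions.append(Region(user_id, round(left, 2), round(right, 2)))
            return regions

        # Two users standing 1.0 m and 1.8 m along the display each get a non-overlapping region.
        print(layout_regions({7: 1.0, 3: 1.8}))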
