405 research outputs found

    ToFFi – Toolbox for frequency-based fingerprinting of brain signals

    Spectral fingerprints (SFs) are unique power-spectrum signatures of human brain regions of interest (ROIs; Keitel & Gross, 2016). SFs allow for accurate ROI identification and can serve as biomarkers of differences exhibited by non-neurotypical groups. At present, there are no open-source, versatile tools for calculating spectral fingerprints. We have filled this gap by creating a modular, highly configurable MATLAB Toolbox for Frequency-based Fingerprinting (ToFFi). It can transform magnetoencephalographic and electroencephalographic signals into unique spectral representations using ROIs provided by anatomical (AAL, Desikan-Killiany), functional (Schaefer), or other custom volumetric brain parcellations. The toolbox design supports reproducibility and parallel computation.
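The per-ROI power spectrum at the core of a spectral fingerprint can be illustrated with a short sketch. This is not ToFFi's actual pipeline (the toolbox is written in MATLAB, and the function names here are hypothetical); it simply averages Welch power spectral densities across the source channels of one ROI using SciPy.

```python
import numpy as np
from scipy.signal import welch

def spectral_fingerprint(roi_signals, fs=1000.0, nperseg=1024):
    """Average Welch power spectra across the channels of one ROI.

    roi_signals: array (n_channels, n_samples) of M/EEG source time courses.
    Returns (freqs, mean_psd) -- a simple per-ROI spectral signature.
    """
    freqs, psd = welch(roi_signals, fs=fs, nperseg=nperseg, axis=-1)
    return freqs, psd.mean(axis=0)

# Toy example: 5 "channels" carrying a 10 Hz oscillation plus noise
rng = np.random.default_rng(0)
t = np.arange(0, 2.0, 1 / 1000.0)
sig = np.sin(2 * np.pi * 10 * t) + 0.1 * rng.standard_normal((5, t.size))
freqs, fp = spectral_fingerprint(sig)
peak = freqs[np.argmax(fp)]  # the fingerprint peaks near 10 Hz
```

A real fingerprint would be computed per parcellation ROI after source reconstruction; the averaging step above stands in for that aggregation.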

    Affective Stack — A Model for Affective Computing Application Development


    Physiological Sensing for Affective Computing

    This thesis addresses two aspects related to enabling systems to recognize the affective state of people and respond sensibly to it. First, the issue of representing affective states and unambiguously assigning physiological measurements to them is addressed by suggesting a new approach based on the dimensional emotion model of valence and arousal. Second, the issue of sensing affect-related physiological data is addressed by suggesting a concept for physiological sensor systems that live up to the requirements of adaptive, user-centred systems. In this thesis, a concept for the unambiguous assignment of physiological measurement data to emotional states is developed that avoids the problems of classical approaches. The thesis further addresses the acquisition of emotion-related physiological parameters: a concept for sensor systems is presented that enables reliable acquisition of the relevant physiological parameters without unduly encumbering the user, with particular emphasis on a design suitable for everyday use.
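The dimensional valence-arousal model mentioned above can be made concrete with a toy sketch. The feature names, weights, and quadrant labels below are purely illustrative assumptions, not the thesis's actual mapping: the point is that an affective state becomes a coordinate in a two-dimensional plane rather than a discrete category.

```python
def valence_arousal(scl_norm, hr_norm, smile_norm, frown_norm):
    """Toy mapping of normalized physiological features (0..1) onto the
    valence-arousal plane. Weights are illustrative, not from the thesis."""
    # Skin conductance and heart rate stand in for sympathetic activation.
    arousal = max(0.0, min(1.0, 0.6 * scl_norm + 0.4 * hr_norm))
    # Facial-muscle activity shifts valence up or down from neutral (0.5).
    valence = max(0.0, min(1.0, 0.5 + 0.5 * smile_norm - 0.5 * frown_norm))
    return valence, arousal

def quadrant(valence, arousal):
    """Coarse label for the four quadrants of the dimensional model."""
    if valence >= 0.5:
        return "excited" if arousal >= 0.5 else "content"
    return "distressed" if arousal >= 0.5 else "depressed"

# High sympathetic activation, negative facial expression
v, a = valence_arousal(scl_norm=0.9, hr_norm=0.8, smile_norm=0.1, frown_norm=0.7)
label = quadrant(v, a)
```

The benefit of the dimensional representation is exactly what this sketch shows: measurements map to continuous coordinates first, and any discrete label is a derived, secondary view.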

    Socially intelligent robots that understand and respond to human touch

    Touch is an important nonverbal form of interpersonal interaction which is used to communicate emotions and other social messages. As interactions with social robots are likely to become more common in the near future, these robots should also be able to engage in tactile interaction with humans. Therefore, the aim of the research presented in this dissertation is to work towards socially intelligent robots that can understand and respond to human touch. To become a socially intelligent actor, a robot must be able to sense, classify and interpret human touch and respond to it in an appropriate manner. To this end we present work that addresses different parts of this interaction cycle. The contributions of this dissertation are the following. We have made a touch gesture dataset available to the research community and have presented benchmark results. Furthermore, we have sparked interest in the new field of social touch recognition by organizing a machine learning challenge and have pinpointed directions for further research. Also, we have exposed potential difficulties for the recognition of social touch in more naturalistic settings. Moreover, the findings presented in this dissertation can help to inform the design of a behavioral model for robot pet companions that can understand and respond to human touch. Additionally, we have focused on the requirements for tactile interaction with robot pets for health care applications.
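The sense-classify-interpret cycle for touch gestures can be sketched in a few lines. This is not the dissertation's method or its dataset's schema; it is an illustrative pipeline, with hypothetical feature names and gesture classes, that turns a pressure-sensor frame sequence into hand-crafted features and applies a nearest-centroid rule.

```python
import numpy as np

def touch_features(frames):
    """frames: array (n_frames, rows, cols) of pressure values in [0, 1].
    Returns [mean pressure, peak pressure, active-area fraction, duration]."""
    mean_p = frames.mean()
    peak_p = frames.max()
    contact = (frames > 0.1).mean()  # fraction of taxel readings activated
    return np.array([mean_p, peak_p, contact, float(len(frames))])

def fit_centroids(X, y):
    """One mean feature vector per gesture class."""
    return {c: X[y == c].mean(axis=0) for c in set(y)}

def predict(centroids, x):
    """Assign the class whose centroid is nearest in feature space."""
    return min(centroids, key=lambda c: np.linalg.norm(x - centroids[c]))

rng = np.random.default_rng(1)
pat = rng.random((12, 8, 8)) * 0.3            # light, brief contact
press = rng.random((40, 8, 8)) * 0.3 + 0.7    # strong, sustained contact
X = np.array([touch_features(pat), touch_features(press)])
cents = fit_centroids(X, np.array(["pat", "press"]))

# A new strong, sustained touch should land near the "press" centroid
guess = predict(cents, touch_features(rng.random((38, 8, 8)) * 0.3 + 0.7))
```

Real social-touch recognition replaces the centroid rule with a trained classifier, but the feature-extraction-then-decision structure is the same.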

    Emerging ExG-based NUI Inputs in Extended Realities: A Bottom-up Survey

    Incremental and quantitative improvements of two-way interactions with extended realities (XR) are contributing toward a qualitative leap into a state of XR ecosystems being efficient, user-friendly, and widely adopted. However, there are multiple barriers on the way toward the omnipresence of XR, among them the computational and power limitations of portable hardware, the social acceptance of novel interaction protocols, and the usability and efficiency of interfaces. In this article, we overview and analyse novel natural user interfaces based on sensing electrical bio-signals that can be leveraged to tackle the challenges of XR input interactions. Electroencephalography-based brain-machine interfaces that enable thought-only hands-free interaction, myoelectric input methods that track body gestures employing electromyography, and gaze-tracking electrooculography input interfaces are examples of electrical bio-signal sensing technologies united under the collective concept of ExG. ExG signal acquisition modalities provide a way to interact with computing systems using natural, intuitive actions, enriching interactions with XR. This survey provides a bottom-up overview starting from (i) underlying biological aspects and signal acquisition techniques, (ii) ExG hardware solutions, (iii) ExG-enabled applications, (iv) a discussion of the social acceptance of such applications and technologies, as well as (v) research challenges, application directions, and open problems, evidencing the benefits that ExG-based Natural User Interface inputs can introduce to the area of XR. Peer reviewed.
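One of the simplest ExG input events is an electrooculography (EOG) blink, often used as a hands-free "click". The sketch below is illustrative, not from the survey; the threshold, refractory period, and synthetic signal shape are assumptions chosen for a clean demonstration of amplitude-threshold event detection.

```python
import numpy as np

def detect_blinks(eog, fs, thresh=200e-6, refractory=0.3):
    """Return sample indices where `eog` crosses `thresh` upward,
    ignoring crossings within `refractory` seconds of the previous event."""
    above = eog > thresh
    crossings = np.flatnonzero(above[1:] & ~above[:-1]) + 1
    events, last = [], -np.inf
    for i in crossings:
        if (i - last) / fs >= refractory:
            events.append(int(i))
            last = i
    return events

fs = 250.0
t = np.arange(0, 4.0, 1 / fs)
eog = 20e-6 * np.random.default_rng(2).standard_normal(t.size)  # baseline noise
for blink_t in (1.0, 2.5):  # two synthetic blink deflections (~400 uV)
    eog += 400e-6 * np.exp(-((t - blink_t) ** 2) / (2 * 0.05 ** 2))
events = detect_blinks(eog, fs)
```

Production ExG pipelines add band-pass filtering and artifact rejection before any such decision rule, but the event-detection core looks much like this.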

    Affective Computing for Emotion Detection using Vision and Wearable Sensors

    The research explores the opportunities, challenges, and limitations of, and presents advancements in, computing that relates to, arises from, or deliberately influences emotions (Picard, 1997). The field is referred to as Affective Computing (AC) and is expected to play a major role in the engineering and development of computationally and cognitively intelligent systems, processors and applications in the future. Today the field of AC is bolstered by the emergence of multiple sources of affective data and is fuelled by developments under various Internet of Things (IoT) projects and the fusion potential of multiple sensory affective data streams. The core focus of this thesis is to investigate whether the sensitivity and specificity (predictive performance) of AC, based on the fusion of multi-sensor data streams, is fit for purpose: can such AC-powered technologies and techniques truly deliver increasingly accurate emotion predictions of subjects in the real world? The thesis begins by presenting a number of research justifications and AC research questions that are used to formulate the original thesis hypothesis and thesis objectives. As part of the research conducted, a detailed state-of-the-art investigation explored many aspects of AC from both a scientific and technological perspective. The complexity of AC as a multi-sensor, multi-modality, data-fusion problem unfolded during the state-of-the-art research, and this ultimately led to the creation of a conceptual AC architecture that acts as a practical and theoretical foundation for the engineering of future AC platforms and solutions.
The conceptual AC architecture developed as a result of this research was applied to the engineering of a series of software artifacts that were combined to create a prototypical AC multi-sensor platform known as the Emotion Fusion Server (EFS), used in the AC experimentation phases of the research. The thesis research used the EFS platform to conduct a detailed series of AC experiments to investigate whether the fusion of multiple sensory sources of affective data can significantly increase the accuracy of emotion prediction by computationally intelligent means. The research involved numerous controlled experiments along with statistical analysis of the performance of sensors for the purposes of AC, the findings of which serve to assess the feasibility of AC in various domains and point to future directions for the field. The experimental data investigations relating to the thesis hypothesis used applied statistical methods and techniques, and the results, analytics and evaluations are presented throughout the two thesis research volumes. The thesis concludes by providing a detailed set of formal findings, conclusions and decisions in relation to the overarching research hypothesis on the sensitivity and specificity of the fusion of vision and wearable sensor modalities, and offers foresight and guidance on the many problems, challenges and projections for the AC field into the future.
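The two ingredients of the thesis question, fusing per-modality emotion scores and measuring sensitivity and specificity of the fused prediction, can be sketched minimally. This is not the EFS platform; the weights and scores below are made-up illustrations of decision-level fusion of a vision stream and a wearable stream.

```python
def fuse(scores, weights):
    """Weighted average of per-modality probabilities for the target class."""
    return sum(w * s for w, s in zip(weights, scores)) / sum(weights)

def sensitivity_specificity(y_true, y_pred):
    """Sensitivity = TP/(TP+FN); specificity = TN/(TN+FP)."""
    tp = sum(1 for t, p in zip(y_true, y_pred) if t and p)
    tn = sum(1 for t, p in zip(y_true, y_pred) if not t and not p)
    fp = sum(1 for t, p in zip(y_true, y_pred) if not t and p)
    fn = sum(1 for t, p in zip(y_true, y_pred) if t and not p)
    return tp / (tp + fn), tn / (tn + fp)

# Six samples: vision and wearable scores for one target emotion,
# fused with the vision stream weighted slightly higher (illustrative).
vision = [0.9, 0.2, 0.8, 0.4, 0.7, 0.1]
wear   = [0.6, 0.3, 0.9, 0.2, 0.4, 0.3]
fused  = [fuse((v, w), (0.7, 0.3)) for v, w in zip(vision, wear)]
y_pred = [f >= 0.5 for f in fused]
y_true = [1, 0, 1, 0, 1, 0]
sens, spec = sensitivity_specificity(y_true, y_pred)
```

The empirical question the thesis pursues is whether such fused predictions beat the best single modality on these same two metrics under real-world conditions.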

    Tangible interaction with anthropomorphic smart objects in instrumented environments

    A major technological trend is to augment everyday objects with sensing, computing and actuation power in order to provide new services beyond the objects' traditional purpose, indicating that such smart objects might become an integral part of our daily lives. To be able to interact with smart object systems, users will need appropriate interfaces that respect their distinctive characteristics. Concepts of tangible and anthropomorphic user interfaces are combined in this dissertation to create a novel paradigm for smart object interaction. This work provides an exploration of the design space, introduces design guidelines, and provides a prototyping framework to support the realisation of the proposed interface paradigm. Furthermore, novel methods for expressing personality and emotion by auditory means are introduced and elaborated, constituting essential building blocks for anthropomorphised smart objects. Two experimental user studies are presented, confirming the endeavours to reflect personality attributes through prosody-modelled synthetic speech and to express emotional states through synthesised affect bursts. The dissertation concludes with three example applications, demonstrating the potential of the concepts and methodologies elaborated in this thesis. The integration of information technology into everyday objects is a current technological trend that enables such objects, by means of sensing, actuation and wireless communication, to offer new services beyond their original purpose. Using these so-called smart objects requires novel user interfaces that take the specific properties and application areas of such systems into account. This dissertation combines concepts from the fields of tangible interaction and anthropomorphic user interfaces to develop a new interaction paradigm for smart objects.
To this end, this work explores the design space and identifies relevant aspects from related disciplines. Building on this, guidelines are introduced that accompany and support the design of user interfaces following the approach presented here. For the prototypical implementation of such interfaces, an architecture is presented that takes the requirements of smart object systems in instrumented environments into account. An important component of this architecture is sensor processing, which among other things enables interaction recognition on the object itself and thus physical input. Furthermore, novel methods for the auditory expression of emotion and personality are developed, which constitute essential building blocks for anthropomorphised smart objects and have been evaluated in user studies. The dissertation concludes with descriptions of three applications that were developed in the course of this work and reflect the potential of the concepts and methods elaborated here.