
    Emotion detection from handwriting and drawing samples using an attention-based transformer model

    © 2024 The Author(s). This is an open access article distributed under the terms of the Creative Commons Attribution License (CC BY), https://creativecommons.org/licenses/by/4.0/. Emotion detection (ED) involves identifying and understanding an individual’s emotional state through various cues such as facial expressions, voice tone, physiological changes, and behavioral patterns. In this context, behavioral analysis is employed to observe actions and behaviors for emotional interpretation. This work specifically employs behavioral metrics, namely drawing and handwriting, to determine a person’s emotional state, recognizing these actions as physical functions that integrate motor and cognitive processes. The study proposes an attention-based transformer model as an innovative approach to identifying emotions from handwriting and drawing samples, thereby extending ED into the domains of fine motor skills and artistic expression. The raw data consist of a set of points corresponding to the handwriting or drawing strokes. Each stroke point is delivered to the attention-based transformer model, which embeds it into a high-dimensional vector space. The model predicts the emotional state of the person who generated the sample by integrating the most important components and patterns in the input sequence through self-attention. The proposed approach has a distinct advantage in its enhanced capacity to capture long-range correlations compared to conventional recurrent neural networks (RNNs), which makes it particularly well suited to the precise identification of emotions from handwriting and drawing samples and marks a notable advancement in the field of emotion detection. The proposed method achieved state-of-the-art results of 92.64% on the benchmark dataset EMOTHAW (Emotion Recognition via Handwriting and Drawing). Peer reviewed
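
The abstract describes embedding each stroke point into a high-dimensional vector space and aggregating the sequence with self-attention to predict an emotion label. Since no implementation details are given here, the following is a minimal PyTorch sketch under assumed choices: the per-point features (x, y, pressure, pen state, timestamp), the three emotion classes, and the `StrokeEmotionClassifier` name are illustrative, not the authors' code.

```python
# Minimal sketch (not the authors' implementation) of a transformer encoder
# that classifies emotion from a sequence of handwriting/drawing stroke points.
import torch
import torch.nn as nn

class StrokeEmotionClassifier(nn.Module):
    def __init__(self, in_features=5, d_model=128, n_heads=8,
                 n_layers=4, n_classes=3, max_len=2048):
        super().__init__()
        self.embed = nn.Linear(in_features, d_model)           # project each stroke point
        self.pos = nn.Embedding(max_len, d_model)               # learned positional encoding
        layer = nn.TransformerEncoderLayer(d_model, n_heads,
                                           dim_feedforward=4 * d_model,
                                           batch_first=True)
        self.encoder = nn.TransformerEncoder(layer, n_layers)   # self-attention over the sequence
        self.head = nn.Linear(d_model, n_classes)                # emotion logits

    def forward(self, points, padding_mask=None):
        # points: (batch, seq_len, in_features); padding_mask: (batch, seq_len), True = padding
        idx = torch.arange(points.size(1), device=points.device)
        h = self.embed(points) + self.pos(idx)
        h = self.encoder(h, src_key_padding_mask=padding_mask)
        return self.head(h.mean(dim=1))                          # pool over points, then classify

# Example: a batch of 2 samples with 300 stroke points each
logits = StrokeEmotionClassifier()(torch.randn(2, 300, 5))
```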

    Haptics: Science, Technology, Applications

    This open access book constitutes the proceedings of the 12th International Conference on Human Haptic Sensing and Touch Enabled Computer Applications, EuroHaptics 2020, held in Leiden, The Netherlands, in September 2020. The 60 papers presented in this volume were carefully reviewed and selected from 111 submissions. They were organized in topical sections on haptic science, haptic technology, and haptic applications. This year's focus is on accessibility.

    Capturing tactile properties of real surfaces for haptic reproduction

    Tactile feedback of an object’s surface enables us to discern its material properties and affordances. This understanding is used in digital fabrication processes by creating objects with high-resolution surface variations to influence a user’s tactile perception. As the design of such surface haptics commonly relies on knowledge from real-life experiences, it is unclear how to adapt this information for digital design methods. In this work, we investigate replicating the haptics of real materials. Using an existing process for capturing an object’s microgeometry, we digitize and reproduce the stable surface information of a set of 15 fabric samples. In a psychophysical experiment, we evaluate the tactile qualities of our set of original samples and their replicas. Our results show that direct reproduction of surface variations can influence different psychophysical dimensions of the tactile perception of surface textures. While the fabrication process did not preserve all properties, our approach underlines that replicating surface microgeometries benefits fabrication methods in terms of haptic perception by covering a large range of tactile variations. Moreover, by changing the surface structure of a single fabricated material, its material perception can be influenced. We conclude by proposing strategies for capturing and reproducing digitized textures to better resemble the perceived haptics of the originals.

    Augmented visual, auditory, haptic, and multimodal feedback in motor learning: A review

    It is generally accepted that augmented feedback, provided by a human expert or a technical display, effectively enhances motor learning. However, how to most effectively provide augmented feedback remains controversial. Related studies have focused primarily on simple or artificial tasks enhanced by visual feedback. Recently, technical advances have made it possible also to investigate more complex, realistic motor tasks and to implement not only visual, but also auditory, haptic, or multimodal augmented feedback. The aim of this review is to address the potential of augmented unimodal and multimodal feedback in the framework of motor learning theories. The review addresses the reasons for the different impacts of feedback strategies within or between the visual, auditory, and haptic modalities and the challenges that need to be overcome to provide appropriate feedback in these modalities, either in isolation or in combination. Accordingly, the design criteria for successful visual, auditory, haptic, and multimodal feedback are elaborated.

    Robot-Assisted Rehabilitation of Forearm and Hand Function After Stroke

    Ph.D. (Doctor of Philosophy)

    Development of training tools for haptic teleoperation of a humanoid robot

    Master's in Mechanical Engineering. In robotics, the teleoperation of bipedal humanoids is one of the most exciting topics: it offers the possibility of bypassing complex dynamic models through learning-from-demonstration algorithms driven by human interaction. To this end, the Humanoid Project at the University of Aveiro (PHUA) is engaged in the development of a 27 degree-of-freedom full-body humanoid platform teleoperated by means of haptic devices. The current project also comprises a robot model that has been imported into the Virtual Robot Experimentation Platform (V-REP). Using the simulator allows multiple exercises to be run with greater speed and shorter setup times than teleoperating the real robot, while also providing more safety for both the platform and the operator during the tests. With the simulator, the user can perform tests and work toward the reproduction of human movement through the interaction of two haptic devices that provide force feedback to the operator. The kinematic and dynamic data of the performed manoeuvres are stored for later use in learning-by-demonstration algorithms. However, producing more complex and detailed movements requires considerable motor skill from the operator. Due to the continuous turnover of users in the PHUA, an adaptation period is required for newly arrived operators to develop an affinity with the complex control system. This work focuses on developing methodologies to reduce the time required for this training process. Thanks to the versatility of customization provided by V-REP, it was possible to implement interfaces that use visual and haptic guidance to enhance the operator's learning. A dedicated workstation, new formulations, and support tools that control the simulation were developed to provide more intuitive control over the humanoid platform. Operators were instructed to reproduce complex 3D movements under several training conditions (visual and haptic feedback, haptic feedback only, visual feedback only, with guidance tools, and without guidance). Performance was measured in terms of speed, drift from the intended trajectory, response to that drift, and amplitude of the movement. The findings indicate that, with the newly implemented mechanisms, operators are able to gain control over the humanoid platform within a relatively short period of training. Operators who used the guidance programs required an even shorter training period while exhibiting high performance across the overall system. These facts support the role of haptic guidance in acquiring kinaesthetic memory in high-DOF systems.
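
The thesis evaluates operator performance partly as drift from the intended trajectory. As a rough illustration only (the exact metric used in the work is not specified here), the sketch below computes a simple nearest-point drift between an executed and a reference trajectory; the `trajectory_drift` helper and the circular example path are hypothetical.

```python
# Minimal sketch (assumption, not the thesis implementation) of quantifying
# drift from an intended trajectory for a teleoperated end-effector.
import numpy as np

def trajectory_drift(executed, reference):
    """Mean distance from each executed point to its nearest reference point.

    executed: (N, 3) Cartesian positions actually traversed.
    reference: (M, 3) Cartesian positions of the intended path.
    """
    diffs = executed[:, None, :] - reference[None, :, :]   # (N, M, 3) pairwise offsets
    dists = np.linalg.norm(diffs, axis=-1)                 # (N, M) pairwise distances
    return dists.min(axis=1).mean()                        # average nearest-point error

# Example with a synthetic circular reference path and a noisy execution of it
t = np.linspace(0, 2 * np.pi, 200)
reference = np.stack([np.cos(t), np.sin(t), np.zeros_like(t)], axis=1)
executed = reference + 0.02 * np.random.randn(*reference.shape)
print(f"mean drift: {trajectory_drift(executed, reference):.4f} m")
```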

    A virtual hand assessment system for efficient outcome measures of hand rehabilitation

    Previously held under moratorium from 1st December 2016 until 1st December 2021. Hand rehabilitation is an extremely complex and critical process in the medical rehabilitation field, mainly due to the high articulation of hand functionality. Recent research has focused on employing new technologies, such as robotics and system control, to improve the precision and efficiency of the standard clinical methods used in hand rehabilitation. However, the designs of these devices were either oriented toward a particular hand injury or heavily dependent on subjective assessment techniques to evaluate progress. These limitations reduce the effectiveness of hand rehabilitation devices in restoring the lost functionality of the impaired hand. In this project, a novel and efficient hand assessment system is developed that can objectively measure the restoration outcome and dynamically evaluate its performance. The proposed system uses a data glove to measure the ranges of motion of the hand joints, and a Virtual Reality system to provide an illustrative and safe visual assistance environment that self-adjusts to the subject’s performance. The system implements an original finger performance measurement method for analysing the various hand functionalities, achieved by extracting multiple features of the digits’ motion, such as speed, consistency of finger movements, and stability during hold positions. Furthermore, an advanced data glove calibration method was developed and implemented to accurately manipulate the virtual hand model and calculate the hand kinematics in compliance with the biomechanical structure of the hand. The experimental studies were performed on a controlled group of 10 healthy subjects (25 to 42 years of age). The results showed intra-subject reliability between trials (average cross-correlation ρ = 0.7) and inter-subject repeatability across subjects’ performance (p > 0.01 for the sessions with real objects, with few departures in some of the virtual reality sessions). In addition, the finger performance values were found to be effective in detecting the multiple elements of finger performance, including the load effect on the forearm. Moreover, the electromyography measurements in the virtual reality sessions showed high sensitivity in detecting the tremor effect (mean power frequency difference of 176 Hz on the right extensor digitorum muscle). The finger performance values for the virtual reality sessions also have the same average distance as the real-life sessions (RSQ = 0.07). Besides offering an efficient and quantitative evaluation of hand performance, the system proved compatible with different hand rehabilitation techniques and can outline the primarily affected parts of the dysfunctional hand. It can also be easily adjusted to comply with the subject’s specifications and clinical hand assessment procedures, autonomously detecting classification task events and analysing them with high reliability. The developed system is also adaptable to disciplines other than hand rehabilitation, such as ergonomic studies, hand robot control, brain-computer interfaces, and other fields involving hand control.
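
The abstract lists speed, consistency of finger movements, and stability during hold positions as the extracted finger-performance features, with trial-to-trial reliability summarized by cross-correlation. The sketch below shows one plausible way such features could be computed from a joint-angle time series; the helper names, the 1 s hold window, and the synthetic data are assumptions, not the thesis implementation.

```python
# Minimal sketch (hypothetical helpers, not the thesis code) of finger-performance
# features of the kind described above, computed from a joint-angle signal.
import numpy as np

def finger_features(angles, fs):
    """angles: 1-D joint-angle trace in degrees; fs: sampling rate in Hz."""
    velocity = np.gradient(angles) * fs          # angular velocity in deg/s
    speed = np.abs(velocity).mean()              # average movement speed
    stability = angles[-int(fs):].std()          # jitter over the final 1 s "hold"
    return speed, stability

def trial_consistency(trial_a, trial_b):
    """Normalized cross-correlation at zero lag between two repetitions."""
    a = (trial_a - trial_a.mean()) / trial_a.std()
    b = (trial_b - trial_b.mean()) / trial_b.std()
    n = min(len(a), len(b))
    return float(np.mean(a[:n] * b[:n]))         # 1.0 means identical movement shape

# Example with two synthetic flexion-extension repetitions sampled at 100 Hz
fs = 100
t = np.linspace(0, 3, 3 * fs)
rep1 = 45 * np.sin(2 * np.pi * 0.5 * t) + np.random.randn(len(t))
rep2 = 45 * np.sin(2 * np.pi * 0.5 * t + 0.1) + np.random.randn(len(t))
print(finger_features(rep1, fs), trial_consistency(rep1, rep2))
```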