8 research outputs found

    Telerobotic 3D Articulated Arm-Assisted Surgery Tools with Augmented Reality for Surgery Training

    In this research, the human body is marked and tracked using a depth camera. The trainer's arm motion is sent over the network and mapped onto a 3D robotic arm on the destination server, so the robotic arm moves according to the trainer. Meanwhile, trainees follow the movement and learn how to perform particular tasks as demonstrated by the trainer. The telerobotic-assisted surgery tools give guidance on how to make an incision or perform simple surgery in several steps, through 3D medical images displayed on the human body. The user trains by selecting body parts and then analysing them. The system provides specific tasks to be completed during training and measures how many tasks the user can accomplish within the surgical time. The telerobotic-assisted virtual surgery tools using augmented reality (AR) are expected to be widely used in medical education as a low-cost alternative system.
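
    As an illustration only, the sketch below shows how the trainer's tracked arm pose might be streamed to the destination server. The joint names, message format, and UDP transport are assumptions, since the abstract names neither the depth-camera SDK nor the robot API.

        # Hypothetical trainer-side sender: pack the tracked arm joints and
        # stream them to the server that drives the 3D robotic arm.
        import json
        import socket

        TRACKED_JOINTS = ["shoulder", "elbow", "wrist"]   # assumed joint set

        def pack_arm_pose(skeleton):
            """Serialize the trainer's tracked joints ((x, y, z) per joint)."""
            pose = {name: skeleton[name] for name in TRACKED_JOINTS}
            return json.dumps(pose).encode("utf-8")

        def send_pose(sock, skeleton, server=("192.0.2.10", 9000)):
            """Send one pose sample; the server maps it to robot joint angles."""
            sock.sendto(pack_arm_pose(skeleton), server)

        sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
        send_pose(sock, {"shoulder": (0.2, 0.4, 1.8),
                         "elbow": (0.3, 0.2, 1.7),
                         "wrist": (0.4, 0.0, 1.6)})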

    Real Time Interactive Presentation Apparatus based on Depth Image Recognition

    This research addresses human-computer interaction, where a long-standing goal is to move beyond conventional input devices toward natural interaction. Kinect is one of the tools able to provide a Natural User Interface (NUI): it can track hand gestures and interpret the user's actions from the depth data stream, with the human hand tracked as a point cloud and synchronized simultaneously. The method starts by collecting depth images, which are analyzed with a random decision forest algorithm. The algorithm chooses a set of thresholds and feature splits and then yields the body skeleton. In this project, hand gestures are divided into several actions: waving to the right or left of the head position is interpreted as moving to the next or previous slide, with the wave measured as an angle around the head as the center point, while a pushing action triggers a pop-up window for the specific slide containing more detailed information. The results of the implementation are quite encouraging: users can control PowerPoint and are even able to design the presentation in different ways. Furthermore, we also present a new style of presentation using a WPF form connected to a database as a dynamic presentation tool.
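
    A minimal sketch of the final gesture rules described above, assuming (x, y, z) joint positions in metres and illustrative thresholds; the paper's actual recognizer is a random decision forest over depth features, so only the head-relative wave and push tests are shown here.

        import math

        WAVE_ANGLE_DEG = 30.0   # assumed wave threshold around the head
        PUSH_DEPTH_M = 0.35     # assumed hand-ahead-of-shoulder push distance

        def classify_gesture(head, hand, shoulder):
            """Map one skeleton frame to a presentation command, or None."""
            # Wave: angle of the hand around the head, measured from vertical.
            angle = math.degrees(math.atan2(hand[0] - head[0],
                                            head[1] - hand[1]))
            if angle > WAVE_ANGLE_DEG:
                return "next_slide"
            if angle < -WAVE_ANGLE_DEG:
                return "previous_slide"
            # Push: hand well in front of the shoulder along the depth axis.
            if shoulder[2] - hand[2] > PUSH_DEPTH_M:
                return "open_popup"
            return None

        # Hand raised to the right of the head -> next slide.
        print(classify_gesture(head=(0.0, 0.6, 2.0),
                               hand=(0.5, 0.6, 2.0),
                               shoulder=(0.2, 0.4, 2.0)))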

    TOU-AR:Touchable Interface for Interactive Interaction in Augmented Reality Environment

    A touchable interface is one of the future interfaces that can be implemented on almost any medium, such as water, a table, or even sand. The term multi-touch refers to the ability to distinguish between two or more fingers touching a touch-sensing surface, such as a touch screen or a touch pad. The interface is provided by tracking the interaction area with a depth camera and projecting the interface onto the medium. Such interfaces are widely used in augmented reality environments: the user projects a particular interface onto a real-world medium, and the user's hand is tracked simultaneously when it touches the area. Users can interact more freely, as naturally as they do in their daily life.
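
    As a sketch of the depth-based touch test implied above, a pixel can be counted as a touch when the hand sits only a few millimetres above a calibrated depth map of the empty surface. NumPy and the specific thresholds are assumptions, not details from the abstract.

        import numpy as np

        TOUCH_MIN_MM = 5    # assumed: below this is surface noise
        TOUCH_MAX_MM = 25   # assumed: above this is a hover, not a touch

        def detect_touches(depth_frame, surface_depth):
            """Boolean mask of touched pixels (all depths in millimetres)."""
            height = surface_depth - depth_frame   # height above the surface
            return (height > TOUCH_MIN_MM) & (height < TOUCH_MAX_MM)

        # Flat surface at 1000 mm with one fingertip at 985 mm.
        surface = np.full((4, 4), 1000)
        frame = surface.copy()
        frame[2, 2] = 985
        print(detect_touches(frame, surface))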

    Emotional Facial Expression Based On Action Units and Facial Muscle

    Virtual humans play vital roles in virtual reality and games, and enriching a virtual human through its expressions is one of the aspects researchers have studied and improved most. This study aims to demonstrate the combination of facial action units, as defined in the Facial Action Coding System (FACS), with facial muscles to produce realistic facial expressions. The experiments succeeded in producing particular expressions, such as anger, happiness, and sadness, that convey the emotional state of the virtual human. This achievement is believed to bring fuller mental immersion between the virtual human and the audience. Future work will generate more complex virtual human expressions that combine physical factors such as wrinkles and fluid dynamics for tears or sweating.
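
    As a hedged illustration of the action-unit mapping discussed above, the sketch below scales a small table of standard FACS action units per emotion; the AU sets and intensity model are textbook conventions, not the authors' actual muscle rig.

        # Emotion -> action units (AU number -> activation weight 0..1).
        EXPRESSIONS = {
            "happy": {6: 0.8, 12: 1.0},           # cheek raiser, lip corner puller
            "sad":   {1: 0.7, 4: 0.5, 15: 0.8},   # inner brow raiser, brow
                                                  # lowerer, lip corner depressor
            "anger": {4: 1.0, 5: 0.6, 23: 0.7},   # brow lowerer, upper lid
                                                  # raiser, lip tightener
        }

        def expression_weights(emotion, intensity=1.0):
            """Scale an emotion's AU weights; each AU drives a facial muscle."""
            return {au: min(1.0, w * intensity)
                    for au, w in EXPRESSIONS[emotion].items()}

        print(expression_weights("anger", intensity=0.75))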

    Framework of controlling 3d virtual human emotional walking using BCI

    A Brain-Computer Interface (BCI) is a device that can read and acquire brain activity. The human body is controlled by brain signals, which act as its main controller: human emotions and thoughts are carried by these signals and expressed as mood. Brain signals are the key component measured in electroencephalography (EEG), and with signal processing, features representing human mood (behavior) can be extracted, with emotion as the major feature. This paper proposes a new framework for recognizing human inner emotions from EEG signals acquired with a BCI device controller. The framework proceeds in five steps: read the brain signal, classify it to obtain the emotion, map the emotion, synchronize the animation of the 3D virtual human, and test and evaluate the work. To the best of our knowledge, there is no existing framework for controlling a 3D virtual human in this way. Implementing this framework will benefit games by enabling control of a 3D virtual human's emotional walking, making it more realistic. Commercial games and augmented reality systems are possible beneficiaries of this technique. © 2015 Penerbit UTM Press. All rights reserved.
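
    The five steps can be summarised as a pipeline skeleton; every name and signature below is a placeholder, since the abstract specifies neither the EEG device API nor the classifier used.

        def read_signal(device):
            """Step 1: acquire raw EEG samples from the BCI headset."""
            return device.read()

        def classify_emotion(signal):
            """Step 2: extract features and classify the emotional state."""
            ...

        def map_emotion(emotion):
            """Step 3: map the emotion to walking parameters (pace, posture)."""
            ...

        def synchronize_animation(virtual_human, walk_params):
            """Step 4: drive the 3D virtual human's emotional walk."""
            ...

        def evaluate(session_log):
            """Step 5: test and evaluate recognized vs. intended emotions."""
            ...

        def run(device, virtual_human, session_log):
            emotion = classify_emotion(read_signal(device))
            synchronize_animation(virtual_human, map_emotion(emotion))
            evaluate(session_log)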

    Haptics in games: tactile feedback as a stimulus for emotion

    Touch and kinesthetic perception make up a large part of game activities in the real world. Whether pushing, kicking, running, jumping, or pulling, we almost always perform actions that involve motor relations and aid our perception of the game, while also triggering emotional responses. In digital games, however, although developers constantly strive to provide complex immersion in the narrative, little has been discussed technically about the role of tactile references in stimulating emotion. This article presents a review of the literature on studies that consider haptics and the player's emotional reference. It also presents an applied study for a preliminary mapping of emotional responses in games. It observes that the tactile feedback provided by haptics can offer an immersive interactive relationship that enhances the player's emotional experience. Finally, it concludes that studies on this topic are still incipient and call for further development.

    Exploring and designing for multisensory interactions with 3D printed food

    Experience of food is as varied as it is widespread, part of mundane activities but also embedded in rituals and celebrations. Despite its pervasive richness, it has yet to be fully exploited to support embodied and multisensory experiences within Human-Computer Interaction (HCI). This thesis addresses this shortcoming, drawing on the unique qualities of food experience in combination with novel technology to design rich, affective, and embodied interactions through food. The work approaches 3D printed food as a material for designing emotion- and memory-based experiences with food, and 3D printing of food as a technology for crafting multisensory user experiences in everyday contexts. These perspectives are integrated through the design and evaluation of novel interactions with 3D printed food, following a Research through Design approach combined with material approaches. Through this enquiry, novel research tools for HCI were also created for working with food, flavour, and taste. The thesis comprises seven studies that advance knowledge based on gaps and novel theoretical framings identified in a systematic literature review. A survey of user perceptions of 3D printed food highlighted opportunities for user experience-based applications. An identified opportunity for affective interactions through taste was explored through lab-based studies and interviews with chefs and food designers about using 3D printed food. This was extended through a co-design study with couples in romantic relationships, creating flavours of 3D printed food to support emotional expression and co-regulation. The use of flavours to cue experience was then explored in relation to self-defining memories with older adults. Across both co-design studies, a multisensory probe kit was built and evaluated to support designing with the senses in HCI, and an app prototype was designed for creating personalised flavour-based memory cues, further exploring the ideas from the study of food and memory. Collectively, these studies support emotional and memory-based applications of 3D printed food in HCI and make theoretical and methodological contributions to multisensory design and to designing with food and the body in HCI.