    Telerobotic 3D Articulated Arm-Assisted Surgery Tools with Augmented Reality for Surgery Training

    In this research, the human body is marked and tracked using a depth camera. The trainer's arm motion is sent over the network and mapped onto a 3D robotic arm on the destination server, so the robotic arm moves in accordance with the trainer. Meanwhile, trainees follow the movement and learn how to perform particular tasks demonstrated by the trainer. The telerobotic-assisted surgery tools provide guidance on how to make incisions or carry out simple surgical steps through 3D medical images displayed on the human body. During training, the user selects body parts and then analyzes them. The system provides specific tasks to be completed during training and measures how many tasks the user can accomplish within the surgical time. The telerobotic-assisted virtual surgery tools using augmented reality (AR) are expected to be widely used in medical education as a low-cost alternative system.
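
    The abstract above describes a pipeline in which the trainer's tracked arm motion is streamed over the network and mapped onto a 3D robotic arm on a destination server. The minimal Python sketch below illustrates only that data flow; the joint names, JSON-lines message format, port, and plain-socket transport are assumptions for illustration and are not specified in the abstract.

```python
import json
import socket
import threading
import time

PORT = 9099                              # assumed port, not from the paper
JOINTS = ("shoulder", "elbow", "wrist")  # assumed subset of tracked arm joints


def serve_robotic_arm(host="127.0.0.1", port=PORT):
    """Destination server: receive joint positions and map them to the arm."""
    with socket.create_server((host, port)) as srv:
        conn, _ = srv.accept()
        with conn, conn.makefile("r") as stream:
            for line in stream:
                frame = json.loads(line)
                # A real system would run inverse kinematics here and command
                # the 3D robotic arm; this sketch only echoes the frame.
                print("mapping frame onto robotic arm:", frame)


def send_trainer_frames(frames, host="127.0.0.1", port=PORT):
    """Trainer side: send one tracked depth-camera frame per line as JSON."""
    with socket.create_connection((host, port)) as conn:
        for frame in frames:
            conn.sendall((json.dumps(frame) + "\n").encode())


if __name__ == "__main__":
    server = threading.Thread(target=serve_robotic_arm)
    server.start()
    time.sleep(0.5)  # give the server time to start listening
    # Fake depth-camera output: an x/y/z position per joint (illustrative only).
    demo = [{j: [0.1 * i, 0.2, 0.3] for j in JOINTS} for i in range(3)]
    send_trainer_frames(demo)
    server.join()
```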

    Machine Understanding of Human Behavior

    A widely accepted prediction is that computing will move to the background, weaving itself into the fabric of our everyday living spaces and projecting the human user into the foreground. If this prediction is to come true, then next-generation computing, which we will call human computing, should be about anticipatory user interfaces that are human-centered and built for humans based on human models. These interfaces should transcend the traditional keyboard and mouse to include natural, human-like interactive functions, including understanding and emulating certain human behaviors such as affective and social signaling. This article discusses a number of components of human behavior, how they might be integrated into computers, and how far we are from realizing the front end of human computing, that is, from enabling computers to understand human behavior.