
    Physical rehabilitation based on kinect serious games: ThG therapy game

    This thesis presents a serious game platform developed using the Unity 3D game engine and the Kinect V2 sensor as a natural user interface. The aim of this work was to provide a tool for the objective evaluation of patients' movements during physiotherapy sessions, as well as an enjoyable way to increase patient engagement in motor rehabilitation training exercises. The platform uses the Kinect V2 sensor to detect the 3D motion of different body joints and stores the captured data in a remote database. The patient data managed during the physiotherapy process includes biometric data, information relevant to the physiotherapist about the patient's clinical history, scores obtained during serious game based training, and values of metrics such as the distance between the feet during a game, left and right foot usage frequency, and the execution time of the movements imposed by the game mechanics. The thesis describes the technologies and techniques used to develop the platform and presents results on its usability.
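
    The feet-distance metric mentioned above reduces to a Euclidean distance between two tracked joints. As a minimal sketch, assuming hypothetical joint records keyed by the Kinect V2 SDK joint names (the platform's actual data model is not published here):

        import math

        # A Kinect V2 body frame tracks 25 joints; assume each joint is an
        # (x, y, z) position in metres, keyed by its SDK joint name.
        def foot_distance(joints):
            """Euclidean distance between the left and right foot joints."""
            lx, ly, lz = joints["FootLeft"]
            rx, ry, rz = joints["FootRight"]
            return math.sqrt((lx - rx) ** 2 + (ly - ry) ** 2 + (lz - rz) ** 2)

        # Hypothetical frame, camera-space coordinates in metres:
        frame = {"FootLeft": (-0.15, -0.90, 2.10), "FootRight": (0.17, -0.91, 2.08)}
        print(f"feet distance: {foot_distance(frame):.2f} m")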

    The 2010 Horizon report

    Title from title screen (viewed March 22, 2010).

    Designing inclusive & playful technologies for pre-school children

    This paper reports on an investigation into the potential of everyday technologies to foster playful experiences for young children prior to their formal education. The aim is to consider how best to design age-appropriate experiences that are desirable and useful within pre-school settings, and to assist practitioners in experimenting with technologies in the early years school curriculum. This phase of the study focuses on observations of the real-time, non-digital play of young children in a pre-school playgroup and the subsequent introduction of group activities with affordable, non-specialist devices such as ReacTickles, the Wii Remote, and a microphone. The study captures the vital inspiration phase of design research. By using observation and interviews as an analytical framework to help practitioners articulate the nuances of playful interaction, the designers have been able to draw early conclusions that provide guiding principles for future design.

    Natural navigation in space and time

    Faculty in the Department of Computer Science at RIT developed the Spiegel, a scientific data visualization framework. The system needed a natural interface for controlling 3D data visualizations in real time. This thesis describes an extendable system for testing remote control interfaces for 3-dimensional virtual spaces. We developed and compared four remote controls: a multi-touch TouchPad, a gyroscope-based GyroPad, a wearable Data Glove, and a Kinect-based Hands controller. Our research study revealed the TouchPad to be the most natural remote control.
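
    One way to make such a testing system extendable is to hide each device behind a common controller interface, so the navigation loop only ever consumes abstract camera deltas. The sketch below is an assumed design, not the thesis's actual code; the names RemoteControl and Camera are illustrative:

        from abc import ABC, abstractmethod

        class RemoteControl(ABC):
            """Interface each controller (TouchPad, GyroPad, Data Glove,
            Kinect Hands) would implement for the navigation loop."""

            @abstractmethod
            def poll(self):
                """Return a (dx, dy, dz, dyaw, dpitch) camera delta for this frame."""

        class Camera:
            def __init__(self):
                self.position = [0.0, 0.0, 0.0]
                self.yaw = 0.0
                self.pitch = 0.0

            def apply(self, delta):
                # Apply one frame's navigation delta from whichever controller
                # is active; swapping controllers never touches this code.
                dx, dy, dz, dyaw, dpitch = delta
                self.position[0] += dx
                self.position[1] += dy
                self.position[2] += dz
                self.yaw += dyaw
                self.pitch += dpitch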

    A Software Development Kit for Camera-Based Gesture Interaction

    Human-Computer Interaction is a rapidly expanding field in which new implementations of ideas are consistently being released. In recent years, much of the concentration in this field has been on gesture-based control, either touch-based or camera-based. Even though camera-based gesture recognition was previously seen more in science fiction than in reality, this method of interaction is rising in popularity. A number of devices readily available to the average consumer are designed to support this type of input, including the popular Microsoft Kinect and Leap Motion devices. Despite this rise in availability and popularity, development for these devices is currently an arduous task unless only the simplest gestures are required. The goal of this thesis is to develop a Software Development Kit (SDK) with which developers can more easily build interfaces that use gesture-based control. If successful, this SDK could significantly reduce the amount of work (both in effort and in lines of code) necessary for a programmer to implement gesture control in an application. This, in turn, could help reduce the intellectual barrier many face when attempting to implement a new interface. The developed SDK has three main goals: it places an emphasis on simplicity of code for developers using it; it allows for a variety of gestures, including gestures made by single or multiple trackable objects (e.g., hands and fingers), gestures performed in stages, and continuously updating gestures; and it is device-agnostic, in that it is not written exclusively for a single device. The thesis presents the results of a system validation study suggesting that all of these goals have been met.
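
    The three SDK goals above suggest a layered design: device adapters emit a device-agnostic stream of tracked objects, and gesture logic consumes only that stream. The sketch below is an assumed illustration of a staged gesture in this style, not the SDK's real API; all names are hypothetical:

        from dataclasses import dataclass

        @dataclass
        class TrackedObject:
            """Device-agnostic record for anything a camera can track
            (a hand, a fingertip), in normalized [0, 1] coordinates."""
            id: int
            x: float
            y: float
            z: float

        class SwipeRightGesture:
            """A staged gesture: arm when the object enters the left region,
            fire once it crosses into the right region."""

            def __init__(self):
                self.armed = False

            def update(self, obj: TrackedObject) -> bool:
                if obj.x < 0.3:
                    self.armed = True      # stage 1: start zone reached
                elif self.armed and obj.x > 0.7:
                    self.armed = False     # stage 2: end zone reached -> fire
                    return True
                return False

        # Any device adapter (Kinect, Leap Motion, ...) only has to emit
        # TrackedObject streams; the gesture logic stays unchanged.
        gesture = SwipeRightGesture()
        for x in (0.2, 0.4, 0.6, 0.8):
            if gesture.update(TrackedObject(id=1, x=x, y=0.5, z=0.0)):
                print("swipe right recognized")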

    Systematic literature review of hand gestures used in human computer interaction interfaces

    Gestures, widely accepted as humans' natural mode of interaction with their surroundings, have been considered for use in human-computer interfaces since the early 1980s. They have been explored and implemented, with a range of success and maturity levels, in a variety of fields, facilitated by a multitude of technologies. Underpinning gesture theory, however, focuses on gestures performed simultaneously with speech, and the majority of gesture-based interfaces are supported by other modes of interaction. This article reports the results of a systematic review undertaken to identify characteristics of touchless/in-air hand gestures used in interaction interfaces. 148 articles reporting on gesture-based interaction interfaces were reviewed, identified through searches of engineering and science databases (Engineering Village, ProQuest, ScienceDirect, Scopus, and Web of Science). The goal of the review was to map the field of gesture-based interfaces, investigate patterns in gesture use, and identify common combinations of gestures for different combinations of applications and technologies. The review suggests the community is disparate, with little evidence of building upon prior work, and no fundamental framework of gesture-based interaction is evident. However, the findings can help inform future developments and provide valuable information about the benefits and drawbacks of different approaches. It was further found that the nature and appropriateness of the gestures used was not a primary factor in gesture elicitation when designing gesture-based systems, and that ease of technology implementation often took precedence.

    Distant pointing in desktop collaborative virtual environments

    Deictic pointing (pointing at things during conversations) is natural and ubiquitous in human communication. It is important in the real world, and it is equally important in collaborative virtual environments (CVEs), which are 3D virtual environments that resemble the real world. CVEs connect people from different locations, allowing them to communicate and collaborate remotely. However, the interaction and communication capabilities of CVEs are not as good as those in the real world. In CVEs, people interact with each other using avatars (the visual representations of users). One problem with avatars is that they are not expressive enough compared with what we can do in the real world. In particular, deictic pointing has many limitations and is not well supported. This dissertation focuses on improving the expressiveness of distant pointing (where referents are out of reach) in desktop CVEs. This is done by developing a framework that guides the design and development of pointing techniques; by identifying important aspects of distant pointing through observation of how people point at distant referents in the real world; by designing, implementing, and evaluating distant-pointing techniques; and by providing a set of guidelines for the design of distant pointing in desktop CVEs. The evaluations of distant-pointing techniques examine whether pointing without extra visual effects (natural pointing) has sufficient accuracy; whether people can control free arm movement (free pointing) along with other avatar actions; and whether free and natural pointing are useful and valuable in desktop CVEs. Overall, this research provides better support for deictic pointing in CVEs by improving the expressiveness of distant pointing. With better pointing support, gestural communication can be more effective and can ultimately enhance the primary function of CVEs: supporting distributed collaboration.
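
    In a desktop CVE, distant pointing is commonly resolved by casting a ray from the avatar's hand along the pointing direction and selecting the first referent it hits. As a minimal sketch of that underlying mechanism (a ray-sphere test; the dissertation's techniques layer visual effects and arm control on top of something like this):

        import math

        def ray_hits_sphere(origin, direction, center, radius):
            """Distance along the ray to the first intersection with a
            sphere, or None if the ray misses it. direction is normalized."""
            ox, oy, oz = origin
            dx, dy, dz = direction
            cx, cy, cz = center
            # Vector from sphere centre to ray origin.
            fx, fy, fz = ox - cx, oy - cy, oz - cz
            b = 2.0 * (fx * dx + fy * dy + fz * dz)
            c = fx * fx + fy * fy + fz * fz - radius * radius
            disc = b * b - 4.0 * c
            if disc < 0:
                return None              # ray misses the sphere
            t = (-b - math.sqrt(disc)) / 2.0
            return t if t >= 0 else None

        # The referent closest along the ray is the one being pointed at.
        referents = {"chart": ((0, 0, 10), 1.0), "table": ((3, 0, 10), 1.0)}
        hits = {name: ray_hits_sphere((0, 0, 0), (0, 0, 1), c, r)
                for name, (c, r) in referents.items()}
        target = min((n for n in hits if hits[n] is not None), key=lambda n: hits[n])
        print("pointing at:", target)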

    Arm-Hand-Finger Video Game Interaction

    Despite the growing popularity of video game interaction techniques and the expansion of research in hand gesture recognition, video game interaction using arm, hand, and finger motion has not been extensively explored. Most current gesture-based approaches to video game interaction neglect the fingers, but including them allows for more natural and unique interaction and merits further research. To implement arm-, hand-, and finger-based interaction for the video game domain, several problems must be solved, including gesture recognition, segmentation, hand visualization, and video game interaction that responds to arm, hand, and finger input. Solutions to each of these problems have been implemented. The potential of this interaction style is illustrated through an arm-, hand-, and finger-controlled video game system that responds to players' hand gestures. It includes a finger-gesture recognizer as well as a video game system employing various interaction styles: a first-person shooter game, a driving game, and a menu interaction system. Several users interacted with and played these games, and this form of interaction proved especially suitable for real-time interaction in first-person games. This is perhaps the first implementation of its kind for video game interaction. Based on test results, arm, hand, and finger interaction is a viable form of interaction that deserves further research. The implementation bridges the gap between existing gesture interaction methods and more advanced virtual reality techniques, combining the solutions to each problem above into a single, working video game system. This type of interaction proved more intuitive than existing gesture controls in many situations and less complex to implement than a full virtual reality setup. It allows more control by using the hands' natural motion and lets each hand interact independently, and it can be reliably implemented using today's technology. The implementation is a base system that can be greatly expanded, and many possibilities for future work apply to this form of interaction.
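
    A recognizer of the kind described can be framed, at its simplest, as a mapping from per-frame finger states to game commands. The sketch below is a hypothetical illustration of that idea, not the thesis's implementation; the bindings and the finger-count representation are assumptions:

        # Map the number of extended fingers on each hand to a game command,
        # as a stand-in for a recognizer working on tracked arm/hand/finger
        # motion. The bindings here are hypothetical examples for a
        # first-person shooter.
        FIRST_PERSON_BINDINGS = {
            (5, 0): "move_forward",   # open left hand
            (0, 5): "fire",           # open right hand
            (1, 1): "open_menu",      # one finger on each hand
        }

        def recognize(left_extended: int, right_extended: int) -> str:
            # Each hand is read independently, so hands can act independently.
            return FIRST_PERSON_BINDINGS.get((left_extended, right_extended), "idle")

        print(recognize(5, 0))   # -> move_forward
        print(recognize(0, 5))   # -> fire
        print(recognize(2, 3))   # -> idle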