825 research outputs found

    Non-Verbal Communication for a Virtual Reality Interface

    The steady growth of technology has allowed all forms of human-computer communication to be extended. Since the emergence of more sophisticated interaction devices, Human-Computer Interaction (HCI) science has taken up the issue of Non-Verbal Communication (NVC). Nowadays, many applications, such as interactive entertainment and virtual reality, require more natural and intuitive interfaces. Human gestures constitute a great space of actions expressed by the body, face, and/or hands. Hand gestures are frequently used in people's daily lives, and are thus an easy alternative way to communicate with computers. This paper introduces a real-time hand gesture recognition and tracking system to identify different and dynamic hand postures. In order to improve the user experience, a set of different system functions in a virtual world has been implemented so that interaction can be performed by the user through a data glove device. XIV Workshop Computación Gráfica, Imágenes y Visualización (WCGIV). Red de Universidades con Carreras en Informática (RedUNCI).
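    The posture-recognition step described in this abstract can be sketched as a nearest-template classifier over data-glove finger-flexion readings. This is an illustrative assumption, not the paper's actual algorithm; the posture names and template values below are invented for the example.

```python
# Minimal sketch (illustrative, not the paper's method): classify a static hand
# posture from data-glove finger-flexion readings by nearest template match.

import math

# Hypothetical calibration templates: normalized flexion of five fingers
# (0.0 = fully extended, 1.0 = fully flexed), thumb first.
POSTURE_TEMPLATES = {
    "open_hand": [0.0, 0.0, 0.0, 0.0, 0.0],
    "fist":      [1.0, 1.0, 1.0, 1.0, 1.0],
    "point":     [1.0, 0.0, 1.0, 1.0, 1.0],  # index finger extended
}

def classify_posture(flexion, templates=POSTURE_TEMPLATES):
    """Return the name of the template closest (Euclidean) to the glove reading."""
    def dist(a, b):
        return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))
    return min(templates, key=lambda name: dist(flexion, templates[name]))

print(classify_posture([0.9, 0.1, 0.95, 0.85, 0.9]))  # "point"
```

    A real system would calibrate templates per user and add temporal tracking on top of this per-frame classification to handle dynamic gestures.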

    Using natural user interfaces to support synchronous distributed collaborative work

    Synchronous Distributed Collaborative Work (SDCW) occurs when group members work together at the same time from different places to achieve a common goal. Effective SDCW requires good communication, continuous coordination and shared information among group members. SDCW is possible because of groupware, a class of computer software systems that supports group work. Shared-workspace groupware systems provide a common workspace that aims to replicate aspects of a physical workspace shared among group members in a co-located environment. Shared-workspace groupware systems have failed to provide the same degree of coordination and awareness among distributed group members that exists in co-located groups, owing to the unintuitive interaction techniques that these systems have incorporated. Natural User Interfaces (NUIs) focus on reusing natural human abilities such as touch, speech, gestures and proximity awareness to allow intuitive human-computer interaction. These interaction techniques could provide solutions to the existing issues of groupware systems by breaking down the barrier between people and technology created by the interaction techniques currently utilised. The aim of this research was to investigate how NUI interaction techniques could be used to effectively support SDCW. An architecture for such a shared-workspace groupware system was proposed, and a prototype, called GroupAware, was designed and developed based on this architecture. GroupAware allows multiple users in distributed locations to simultaneously view and annotate text documents and create graphic designs in a shared workspace. Documents are represented as visual objects that can be manipulated through touch gestures. Group coordination and awareness are maintained through document updates via immediate workspace synchronization, user action tracking via user labels, and user availability identification via basic proxemic interaction. 
Members can effectively communicate via audio and video conferencing. A user study was conducted to evaluate GroupAware and determine whether NUI interaction techniques effectively supported SDCW. Ten groups of three members each participated in the study. High levels of performance, user satisfaction and collaboration demonstrated that GroupAware was an effective groupware system that was easy to learn and use, and that effectively supported group work in terms of communication, coordination and information sharing. Participants gave highly positive comments about the system that further supported the results. The successful implementation of GroupAware and the positive results obtained from the user evaluation provide evidence that NUI interaction techniques can effectively support SDCW.
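    The immediate-workspace-synchronization idea described above can be illustrated with a toy replica model: each client applies a shared, ordered log of annotation operations, so all replicas converge to the same workspace state. This sketch is my own illustration, not GroupAware's implementation; the operation names and data layout are assumptions.

```python
# Toy model of shared-workspace synchronization: every client replica applies
# the same ordered operation log, so replicas converge regardless of when
# each one last synchronized.

class WorkspaceReplica:
    def __init__(self):
        self.annotations = {}   # annotation id -> (user, text)
        self.applied = 0        # index into the shared log already applied

    def sync(self, shared_log):
        """Apply any operations this replica has not yet seen."""
        for op, ann_id, payload in shared_log[self.applied:]:
            if op == "add":
                self.annotations[ann_id] = payload
            elif op == "remove":
                self.annotations.pop(ann_id, None)
        self.applied = len(shared_log)

# Two distributed clients sharing one ordered log:
log = []
alice, bob = WorkspaceReplica(), WorkspaceReplica()
log.append(("add", 1, ("alice", "revise section 2")))
alice.sync(log)
log.append(("add", 2, ("bob", "typo on page 3")))
bob.sync(log)    # bob applies both operations at once
alice.sync(log)  # alice catches up; both replicas now agree
```

    A production system would also need conflict handling and network transport; the point here is only that ordered replay of updates yields the shared awareness the abstract describes.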

    Exploring Natural User Abstractions For Shared Perceptual Manipulator Task Modeling & Recovery

    State-of-the-art domestic robot assistants are essentially autonomous mobile manipulators capable of exerting human-scale precision grasps. To maximize utility and economy, non-technical end-users would need to be nearly as efficient as trained roboticists in the control and collaboration of manipulation task behaviors. However, this remains a significant challenge given that many WIMP-style tools require superficial proficiency in robotics, 3D graphics, and computer science for rapid task modeling and recovery. Research on robot-centric collaboration has garnered momentum in recent years; robots now plan in partially observable environments that maintain geometries and semantic maps, presenting opportunities for non-experts to cooperatively control task behavior with autonomous-planning agents that exploit this knowledge. However, as autonomous systems are not immune to errors under perceptual difficulty, a human-in-the-loop is needed to bias autonomous planning towards recovery conditions that resume the task and avoid similar errors. In this work, we explore interactive techniques that allow non-technical users to model task behaviors and perceive cooperatively with a service robot under robot-centric collaboration. We evaluate stylus and touch modalities through which users can intuitively and effectively convey natural abstractions of high-level tasks, semantic revisions, and geometries about the world. Experiments are conducted with 'pick-and-place' tasks in an ideal 'Blocks World' environment using a Kinova JACO six degree-of-freedom manipulator. Possibilities for the architecture and interface are demonstrated with the following features: (1) semantic 'Object' and 'Location' grounding that describes function and ambiguous geometries, (2) task specification with an unordered list of goal predicates, and (3) guiding task recovery with implied scene geometries and trajectories via symmetry cues and configuration space abstraction. 
Empirical results from four user studies show that our interface was strongly preferred over the control condition, demonstrating high learnability and ease of use that enabled our non-technical participants to model complex tasks, provide effective recovery assistance, and perform teleoperative control.
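    Feature (2), task specification with an unordered list of goal predicates, can be illustrated in a few lines: a Blocks World goal is a set of ground predicates, and it is satisfied whenever the current world state contains all of them, regardless of order. The predicate encoding below is an assumption for illustration, not the authors' actual representation.

```python
# Illustrative sketch: unordered goal predicates for a pick-and-place task.
# A goal is satisfied when every goal predicate holds in the current state;
# set containment makes the "unordered" property explicit.

def goal_satisfied(state, goal):
    """state and goal are sets of ground predicates, e.g. ('on', 'red', 'blue')."""
    return goal <= state  # subset test: order of predicates is irrelevant

state = {("on", "red", "table"), ("on", "blue", "table"),
         ("clear", "red"), ("clear", "blue")}
goal = {("on", "red", "blue")}
print(goal_satisfied(state, goal))  # False: red is not yet stacked on blue

# After a simulated pick-and-place of 'red' onto 'blue':
state = {("on", "red", "blue"), ("on", "blue", "table"), ("clear", "red")}
print(goal_satisfied(state, goal))  # True
```

    A planner would search for actions that make the subset test pass; the user only supplies the predicate set.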

    A General Framework for Motion Sensor Based Web Services

    With the development of motion sensing technology, motion sensor based services have been put to a wide range of applications in recent years. Demand for consuming such services on mobile devices has already emerged. However, as most motion sensors are specifically designed for heavyweight clients such as PCs or game consoles, several technical challenges prevent motion sensors from being used by lightweight clients such as mobile devices. For example, there is no direct approach to connect a motion sensor to a mobile device, and most mobile devices do not have enough computational power to consume the motion sensor outputs. To address these problems, I have designed and implemented a framework for publishing general motion sensor functionality as a RESTful web service that is accessible to mobile devices via HTTP connections. In the framework, a pure HTML5-based interface is delivered to the clients to ensure good accessibility, a WebSocket-based data transfer scheme is adopted to guarantee data transfer efficiency, a server-side gesture pipeline is proposed to reduce the client-side computational burden, and a distributed architecture is designed to make the service scalable. Finally, I conducted three experiments to evaluate the framework's compatibility, scalability and data transfer performance.
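    The server-side gesture pipeline idea can be sketched as follows: the server smooths raw sensor frames and emits compact JSON gesture events, so the lightweight client never processes raw sensor data. This is an illustrative sketch under assumed names (moving_average, detect_swipe, the swipe threshold), not the framework's actual code.

```python
# Sketch of a server-side gesture pipeline: smooth raw frames on the server,
# then send the mobile client only a small JSON event over the WebSocket.

import json

def moving_average(frames, window=3):
    """Smooth a sequence of scalar sensor readings (server-side filtering)."""
    out = []
    for i in range(len(frames)):
        lo = max(0, i - window + 1)
        out.append(sum(frames[lo:i + 1]) / (i - lo + 1))
    return out

def detect_swipe(xs, threshold=0.5):
    """Emit a JSON 'swipe' event if the smoothed x-position travels far enough."""
    xs = moving_average(xs)
    if xs[-1] - xs[0] > threshold:
        return json.dumps({"event": "swipe", "direction": "right"})
    return json.dumps({"event": "none"})

raw_x = [0.0, 0.1, 0.3, 0.6, 0.9, 1.2]  # hand moving right across frames
print(detect_swipe(raw_x))
```

    The JSON string is what would travel over the WebSocket; the client only parses a few bytes instead of running the filter and detector itself.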

    Interaction Design for Digital Musical Instruments

    The thesis aims to elucidate the process of designing interactive systems for musical performance that combine software and hardware in an intuitive and elegant fashion. The original contribution to knowledge consists of: (1) a critical assessment of recent trends in digital musical instrument design, (2) a descriptive model of interaction design for the digital musician and (3) a highly customisable multi-touch performance system that was designed in accordance with the model. Digital musical instruments are composed of a separate control interface and a sound generation system that exchange information. When designing the way in which a digital musical instrument responds to the actions of a performer, we are creating a layer of interactive behaviour that is abstracted from the physical controls. Often, the structure of this layer depends heavily upon:
    1. the accepted design conventions of the hardware in use;
    2. established musical systems, acoustic or digital;
    3. the physical configuration of the hardware devices and the grouping of controls that such a configuration suggests.
    This thesis proposes an alternative way to approach the design of digital musical instrument behaviour: examining the implicit characteristics of its composite devices. When we separate the conversational ability of a particular sensor type from its hardware body, we can look in a new way at the actual communication tools at the heart of the device. We can subsequently combine these separate pieces using a series of generic interaction strategies in order to create rich interactive experiences that are not immediately obvious or directly inspired by the physical properties of the hardware. This research ultimately aims to enhance and clarify the existing toolkit of interaction design for the digital musician.
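    The proposed separation of a sensor's communicative ability from its hardware body can be illustrated by making the mapping between a control signal and a synthesis parameter its own object, composed from generic strategies. The strategy names and parameter ranges below are illustrative assumptions, not taken from the thesis.

```python
# Illustrative sketch: the interactive-behaviour layer as explicit mapping
# objects, so one sensor signal can drive different musical parameters
# through interchangeable, generic strategies.

def linear(value, lo, hi):
    """Map a normalized 0..1 control value into [lo, hi]."""
    return lo + value * (hi - lo)

def inverted(value, lo, hi):
    """Same range, but responds in the opposite direction."""
    return lo + (1.0 - value) * (hi - lo)

class Mapping:
    def __init__(self, strategy, lo, hi):
        self.strategy, self.lo, self.hi = strategy, lo, hi
    def __call__(self, control_value):
        return self.strategy(control_value, self.lo, self.hi)

# The same normalized touch-position signal reused for two musical roles:
cutoff = Mapping(linear, 200.0, 2000.0)  # filter cutoff in Hz
volume = Mapping(inverted, 0.0, 1.0)     # fades out as the finger rises

print(cutoff(0.5))   # 1100.0
print(volume(0.25))  # 0.75
```

    Because the strategy is decoupled from the hardware, swapping `linear` for `inverted` (or any new curve) changes the instrument's feel without touching the sensor code.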

    Augmented interaction for custom-fit products by means of interaction devices at low costs

    This Ph.D. thesis refers to a research project that aims at developing an innovative platform to design lower limb prostheses (both for below- and above-knee amputations), centered on the virtual model of the amputee and based on a computer-aided and knowledge-guided approach. The attention has been put on the modeling tool for the socket, which is the most critical component of the whole prosthesis. The main aim has been to redesign and develop a new prosthetic CAD tool, named SMA2 (Socket Modelling Assistant 2), exploiting low-cost IT technologies (e.g., hand/finger tracking devices) and making the user's interaction as natural as possible and similar to hand-made manipulation. The research activities have been carried out in six phases, as described in the following. First, the limits and criticalities of the already available modeling tool (namely SMA) were identified. To this end, the first version of SMA was tested with Ortopedia Panini and the orthopedic research group of Salford University in Manchester on real case studies. The main criticalities were related to: (i) the automatic reconstruction of the residuum geometric model starting from medical images, (ii) the performance of the virtual modeling tools used to generate the socket shape, and (iii) interaction based mainly on traditional devices (e.g., mouse and keyboard). The second phase led to the software reengineering of SMA according to the limits identified in the first phase. The software architecture was redesigned adopting an object-oriented paradigm, and its modularity permits removing or adding features in a very simple way. The new modeling system, i.e., SMA2, has been implemented entirely using open source Software Development Kits (e.g., the Visualization ToolKit (VTK), OpenCASCADE and the Qt SDK) and is based on low-cost technology. It includes:
    • A new module to automatically reconstruct the 3D model of the residual limb from MRI images.
In addition, a new procedure based on low-cost technology, such as the Microsoft Kinect v2 sensor, has been identified to acquire the 3D external shape of the residuum.
    • An open source software library, named SimplyNURBS, for NURBS modeling, used specifically for the automatic reconstruction of the residuum's 3D model from medical images. Although SimplyNURBS was conceived for the prosthetic domain, it can be used to develop NURBS-based modeling tools for a range of application domains, from health care to clothing design.
    • A module for mesh editing to emulate the hand-made operations carried out by orthopedic technicians during the traditional socket manufacturing process. In addition, several virtual widgets have been implemented to provide virtual tools similar to the real ones used by the prosthetist, such as a tape measure and a pencil.
    • A Natural User Interface (NUI) allowing interaction with the residuum and socket models using hand-tracking and haptic devices.
    • A module to generate the geometric models for additive manufacturing of the socket.
    The third phase concerned the study and design of augmented interaction, with particular attention to the Natural User Interface (NUI) for using hand-tracking and haptic devices in SMA2. The NUI is based on the Leap Motion device. A set of gestures, mainly iconic and suitable for the considered domain, has been identified, taking into account ergonomic issues (e.g., arm posture) and ease of use. The modularity of SMA2 permits us to easily generate the software interface for each device used for augmented interaction. To this end, a software module, named the Tracking plug-in, has been developed to automatically generate the source code of the software interfaces for managing interaction with low-cost hand-tracking devices (e.g., the Leap Motion and the Intel Gesture Camera) and to replicate/emulate the manual operations usually performed to design custom-fit products, such as medical devices and garments.
Regarding haptic rendering, two different devices have been considered: the Novint Falcon and a haptic mouse developed in-house. In the fourth phase, additive manufacturing technologies, in particular FDM, have been investigated. 3D printing has been exploited to permit the creation of trial sockets in the laboratory and thus evaluate the potential of SMA2. Furthermore, research activities have been carried out to study new ways to design the socket. An innovative way to build the socket, based on multi-material 3D printing, has been developed. Taking advantage of flexible materials and multi-material printing, new 3D printers make it possible to create objects with both soft and hard parts. In this phase, issues about infill, materials and comfort have been faced and solved by considering different compositions of materials to redesign the socket shape. In the fifth phase, the implemented solution, integrated within the whole prosthesis design platform, was tested with a transfemoral amputee. The following activities were performed:
    • 3D acquisition of the residuum using MRI and commercial 3D scanning systems (low-cost and professional).
    • Creation of the residual limb and socket geometry.
    • Multi-material 3D printing of the socket using FDM technology.
    • Gait analysis of the amputee wearing the socket using a markerless motion capture system.
    • Acquisition of the contact pressure between the residual limb and a trial socket by means of Tekscan's F-Socket system.
    The acquired data have been combined in an ad hoc application that permits simultaneously visualizing the pressure data on the 3D model of the residual lower limb and the animation of the gait analysis. This application makes it possible to correlate the phases of the gait cycle with the pressure data at the same time.
    The reached results have been considered very interesting, and several tests have been planned in order to try the system in orthopedic laboratories on real cases. These results have been very useful to evaluate the quality of SMA2 as a future instrument that orthopedic technicians can exploit to create real sockets for patients. The solution has the potential to become a commercial product able to substitute the classic procedure for socket design. The sixth phase concerned the evolution of SMA2 into a Mixed Reality environment, named the Virtual Orthopedic LABoratory (VOLAB). The proposed solution is based on low-cost devices and open source libraries (e.g., OpenCL and VTK). In particular, the hardware architecture consists of three Microsoft Kinect v2 sensors for human body tracking, the Oculus Rift DK2 head-mounted display for rendering the 3D environment, and the Leap Motion device for hand/finger tracking. The software development has been based on the modular structure of SMA2, and dedicated modules have been developed to guarantee communication among the devices. At present, two preliminary tests have been carried out: the first to verify the real-time performance of the virtual environment, and the second to verify augmented interaction with the hands using the SMA2 modeling tools. The achieved results are very promising but highlighted some limitations of this first version of VOLAB, and improvements are necessary. For example, the quality of the 3D real-world reconstruction, especially concerning the residual limb, could be improved by using two HD RGB cameras together with the Oculus Rift. To conclude, the obtained results have been evaluated as very interesting and encouraging by the technical staff of the orthopedic laboratory. SMA2 will make possible an important change in the process of designing the socket of a lower limb prosthesis, from a traditional hand-made manufacturing process to a totally virtual, knowledge-guided one.
The proposed solutions and the results reached so far can be exploited in other industrial sectors where the final product heavily depends on human body morphology. In fact, preliminary software development has been done to create a virtual environment for clothing design, starting from the basic modules exploited in SMA2.
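    The mesh-editing module that emulates hand-made socket rectification can be illustrated with a toy vertex-displacement operation: vertices within a radius of a touched point are pushed inward, with the influence falling off toward the edge of the region, much like pressing a thumb into a plaster model. This sketch is my own illustration, not SMA2 code; the function name and falloff law are assumptions.

```python
# Toy mesh-editing sketch: displace vertices near a touched point, with
# linear falloff, to emulate a manual socket rectification on a 3D model.

import math

def rectify(vertices, center, radius, depth):
    """Push (x, y, z) vertices inward along z near `center`; full depth at the
    center of the touched region, zero displacement at its edge and beyond."""
    out = []
    for x, y, z in vertices:
        d = math.dist((x, y), center)
        if d < radius:
            z -= depth * (1.0 - d / radius)  # linear falloff with distance
        out.append((x, y, z))
    return out

mesh = [(0.0, 0.0, 1.0), (0.5, 0.0, 1.0), (2.0, 0.0, 1.0)]
edited = rectify(mesh, center=(0.0, 0.0), radius=1.0, depth=0.2)
print(edited[0][2])  # fully pressed at the center
print(edited[2][2])  # untouched outside the radius
```

    In an interactive tool, `center` would come from the hand-tracking or haptic device, turning the same operation into a direct, natural manipulation of the socket surface.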

    Investigating Natural User Interfaces (NUIs): technologies and interaction in the context of accessibility

    Advisor: Maria Cecília Calani Baranauskas. Doctoral thesis, Universidade Estadual de Campinas, Instituto de Computação. Abstract: Natural User Interfaces (NUIs) represent a new interaction paradigm, with the promise of being more intuitive and easier to use than its predecessor, which relies on mouse and keyboard. In a context where technology is becoming ever more invisible and pervasive, not only the number but also the diversity of the people who participate in this context is increasing. In this case, it must be studied how this new interaction paradigm can, in fact, be accessible to all the people who may use it in their daily routine. Furthermore, it is also necessary to characterize the paradigm itself, to understand what makes it, in fact, natural. Therefore, in this thesis we present the path we took in search of these two answers: how to characterize NUIs in the current technological context, and how to make NUIs accessible to all. To do so, we first present a systematic literature review of the state of the art. Then, we show a set of heuristics for the design and evaluation of NUIs, which were applied in practical case studies. Afterwards, we structure the ideas of this research within the artifacts of Organizational Semiotics, and obtain insights into how to design NUIs with accessibility, whether through Universal Design or by proposing Assistive Technologies. Then, we present three case studies with NUI systems that we designed. From these case studies, we expanded our theoretical framework and were finally able to identify three elements that sum up our characterization of NUIs: differences, affordances and enaction. Doctorate in Computer Science. Grant 160911/2015-0. CAPES. CNP