
    Body swarm interface (BOSI): controlling robotic swarms using human bio-signals

    Traditionally, robots are controlled using joysticks, keyboards, mice and similar human computer interface (HCI) devices. Although this approach is effective and practical in some cases, it is restricted to able-bodied users and requires the user to master the device before use. It also becomes complicated and non-intuitive when multiple robots must be controlled simultaneously, as in Human Swarm Interfaces (HSI). This work presents a novel concept of using human bio-signals to control swarms of robots. This concept has two major advantages: first, it gives amputees and people with certain disabilities the ability to control robotic swarms, which has previously not been possible; second, it gives the user a more intuitive interface for controlling robot swarms through gestures, thoughts, and eye movement. We measure different bio-signals from the human body, including Electroencephalography (EEG), Electromyography (EMG), and Electrooculography (EOG), using off-the-shelf products. After minimal signal processing, we decode the intended control action using machine learning techniques such as Hidden Markov Models (HMM) and K-Nearest Neighbors (K-NN). We employ distance- and displacement-based formation controllers to control the shape and motion of the robotic swarm. Thought and gesture classifications are compared against ground truth, and the resulting pipelines are evaluated in both simulations and hardware experiments with swarms of ground robots and aerial vehicles.
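    The abstract names K-Nearest Neighbors as one of the decoders but gives no implementation details; a minimal K-NN sketch over windowed bio-signal features, with hypothetical feature values and gesture labels, might look like:

```python
import math
from collections import Counter

def knn_classify(train, query, k=3):
    """Classify a feature vector by majority vote among its k nearest
    training samples (Euclidean distance)."""
    neighbors = sorted(train, key=lambda s: math.dist(s[0], query))[:k]
    votes = Counter(label for _, label in neighbors)
    return votes.most_common(1)[0][0]

# Hypothetical two-feature EMG windows (e.g. mean absolute value,
# zero-crossing rate), each paired with a gesture label.
train = [
    ((0.10, 0.20), "rest"), ((0.15, 0.25), "rest"), ((0.05, 0.10), "rest"),
    ((0.80, 0.90), "fist"), ((0.85, 0.95), "fist"), ((0.90, 0.80), "fist"),
]
print(knn_classify(train, (0.82, 0.88)))  # → fist
```

    A real pipeline would extract such features from sliding windows of the raw EMG stream before classification.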

    Comparing Hand Gestures and a Gamepad Interface for Locomotion in Virtual Environments

    Hand gestures are a new and promising interface for locomotion in virtual environments. While several previous studies have proposed hand gestures for virtual locomotion, little is known about how they differ in performance and user preference in virtual locomotion tasks. In this paper, we present three hand gesture interfaces and their algorithms for locomotion: the Finger Distance gesture, the Finger Number gesture and the Finger Tapping gesture. These gestures were inspired by previous studies of gesture-based locomotion interfaces and are typical gestures that people are familiar with from daily life. Implementing these hand gesture interfaces enabled us to systematically compare their differences. In addition, to compare the usability of these gestures against gamepad-based locomotion interfaces, we also designed and implemented a gamepad interface based on the Xbox One controller. We conducted empirical studies comparing these four interfaces through two virtual locomotion tasks. A desktop setup was used instead of sharing a head-mounted display among participants because of COVID-19 concerns. Through these tasks, we assessed the performance and user preference of the interfaces for speed control and waypoint navigation. Results showed that user preference and performance of the Finger Distance gesture were close to those of the gamepad interface. The Finger Number gesture also had performance and user preference close to those of the Finger Distance gesture. Our study demonstrates that the Finger Distance gesture and the Finger Number gesture are very promising interfaces for virtual locomotion. We also discuss why the Finger Tapping gesture needs further improvement before it can be used for virtual walking.
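    The abstract does not specify how the Finger Distance gesture maps to movement speed; a plausible minimal sketch, assuming a clamped linear mapping from thumb-index fingertip distance to walking speed with hypothetical thresholds, is:

```python
def speed_from_finger_distance(dist_cm, d_min=2.0, d_max=10.0, v_max=5.0):
    """Map thumb-index fingertip distance (cm) to a walking speed (m/s):
    pinched fingers stop, fully spread fingers give maximum speed.
    All thresholds here are hypothetical."""
    t = (dist_cm - d_min) / (d_max - d_min)
    t = max(0.0, min(1.0, t))  # clamp to [0, 1]
    return t * v_max

print(speed_from_finger_distance(6.0))  # → 2.5 (halfway spread, half speed)
```

    A continuous mapping like this is what allows fine-grained speed control, which the study found comparable to an analog gamepad trigger.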

    Towards Naturalistic Interfaces of Virtual Reality Systems

    Interaction plays a key role in achieving realistic experience in virtual reality (VR). Its realization depends on interpreting the intents of human motions to give inputs to VR systems. Thus, understanding human motion from the computational perspective is essential to the design of naturalistic interfaces for VR. This dissertation studied three types of human motions, including locomotion (walking), head motion and hand motion in the context of VR. For locomotion, the dissertation presented a machine learning approach for developing a mechanical repositioning technique based on a 1-D treadmill for interacting with a unique new large-scale projective display, called the Wide-Field Immersive Stereoscopic Environment (WISE). The usability of the proposed approach was assessed through a novel user study that asked participants to pursue a rolling ball at variable speed in a virtual scene. In addition, the dissertation studied the role of stereopsis in avoiding virtual obstacles while walking by asking participants to step over obstacles and gaps under both stereoscopic and non-stereoscopic viewing conditions in VR experiments. In terms of head motion, the dissertation presented a head gesture interface for interaction in VR that recognizes real-time head gestures on head-mounted displays (HMDs) using Cascaded Hidden Markov Models. Two experiments were conducted to evaluate the proposed approach. The first assessed its offline classification performance while the second estimated the latency of the algorithm to recognize head gestures. The dissertation also conducted a user study that investigated the effects of visual and control latency on teleoperation of a quadcopter using head motion tracked by a head-mounted display. As part of the study, a method for objectively estimating the end-to-end latency in HMDs was presented. 
    For hand motion, the dissertation presented an approach that recognizes dynamic hand gestures, implementing a hand gesture interface for VR based on a static head gesture recognition algorithm. The proposed algorithm was evaluated offline in terms of its classification performance. A user study was conducted to compare the performance and usability of the head gesture interface, the hand gesture interface and a conventional gamepad interface for answering Yes/No questions in VR. Overall, the dissertation makes two main contributions toward more naturalistic interaction in VR systems. First, the interaction techniques presented can be directly integrated into existing VR systems, offering end users more choices for interaction. Second, the results of the user studies of the presented VR interfaces serve as guidelines for VR researchers and engineers designing future VR systems.
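    The dissertation classifies head gestures with Cascaded Hidden Markov Models; its actual models are not given in the abstract. A minimal sketch of plain HMM-based gesture classification, scoring a discrete observation sequence under competing toy models with the forward algorithm, might look like this (all symbols and parameters hypothetical):

```python
import math

def hmm_log_likelihood(obs, pi, A, B):
    """Forward algorithm with per-step normalisation; returns the
    log-likelihood of a discrete observation sequence under an HMM."""
    n = len(pi)
    alpha = [pi[s] * B[s][obs[0]] for s in range(n)]
    norm = sum(alpha)
    ll, alpha = math.log(norm), [a / norm for a in alpha]
    for o in obs[1:]:
        alpha = [B[s][o] * sum(alpha[p] * A[p][s] for p in range(n))
                 for s in range(n)]
        norm = sum(alpha)
        ll, alpha = ll + math.log(norm), [a / norm for a in alpha]
    return ll

# Toy 2-state models over head-motion symbols 0=down, 1=up, 2=left, 3=right.
A_ = [[0.1, 0.9], [0.9, 0.1]]  # states tend to alternate
MODELS = {
    "nod":   ([1.0, 0.0], A_, [[0.80, 0.10, 0.05, 0.05],
                               [0.10, 0.80, 0.05, 0.05]]),
    "shake": ([1.0, 0.0], A_, [[0.05, 0.05, 0.80, 0.10],
                               [0.05, 0.05, 0.10, 0.80]]),
}

def classify(obs):
    """Pick the gesture model assigning the sequence the highest likelihood."""
    return max(MODELS, key=lambda g: hmm_log_likelihood(obs, *MODELS[g]))

print(classify([0, 1, 0, 1, 0, 1]))  # → nod
```

    In practice the models would be trained on quantised head-tracker data, and a cascade would first separate gesture from non-gesture motion before this kind of per-gesture scoring.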

    Exploring the movement dynamics of deception

    Both the science and the everyday practice of detecting a lie rest on the same assumption: cognitive states that the liar would like to keep hidden nevertheless influence observable behavior. There is good evidence for this assumption. The insights of professional interrogators, anecdotal evidence, and body language textbooks have all built up a sizeable catalog of non-verbal cues claimed to distinguish deceptive from truthful behavior. Typically, these cues are discrete, individual behaviors—a hand touching a mouth, the rise of a brow—that distinguish lies from truths solely in terms of their frequency or duration. Research to date has failed to establish any of these non-verbal cues as a reliable marker of deception. Here we argue that this may be because simple tallies of behavior miss the rich but subtle organization of behavior as it unfolds over time. Research in cognitive science from a dynamical systems perspective has shown that behavior is structured across multiple timescales, with more or less regularity and structure. Using tools that are sensitive to these dynamics, we analyzed body motion data from an experiment that put participants in a realistic situation of choosing, or not, to lie to an experimenter. Our analyses indicate that when being deceptive, continuous fluctuations of movement in the upper face, and somewhat in the arms, are characterized by dynamical properties of less stability but greater complexity. For the upper face, these distinctions are present despite no apparent differences in the overall amount of movement between deception and truth. We suggest that these unique dynamical signatures of motion are indicative of both the cognitive demands inherent to deception and the need to respond adaptively in a social context.
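    The abstract does not name the specific dynamical measures used. As one simple illustration of quantifying the "complexity" of a movement time series, permutation entropy scores a signal by the diversity of its ordinal patterns: regular motion reuses a few patterns, complex motion spreads over many (all data below are toy values, not the study's):

```python
import math
from collections import Counter

def permutation_entropy(x, m=3):
    """Shannon entropy of the ordinal patterns of length m in series x:
    low for regular motion, higher for complex, less predictable motion.
    Ties are broken by time index (stable ordering)."""
    patterns = Counter(
        tuple(sorted(range(m), key=lambda k: (x[i + k], k)))
        for i in range(len(x) - m + 1)
    )
    n = sum(patterns.values())
    return -sum((c / n) * math.log(c / n) for c in patterns.values())

regular   = [0, 1] * 4               # perfectly periodic movement
irregular = [4, 7, 9, 10, 6, 11, 3]  # more varied ordinal structure
print(permutation_entropy(regular) < permutation_entropy(irregular))  # → True
```

    Measures in this family (entropies, recurrence-based indices) are the kind of dynamics-sensitive tools the abstract alludes to, as opposed to simple tallies of discrete cues.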

    Addressing the challenges posed by human machine interfaces based on force sensitive resistors for powered prostheses

    Despite advancements in the mechatronic aspects of prosthetic devices, prosthesis control still lacks an interface that satisfies the needs of the majority of users. The research community has put great effort into advancing prosthesis control techniques to address users' needs. However, most of these efforts focus on the development and assessment of technologies in the controlled environments of laboratories, and such findings do not fully transfer to the daily use of prosthetic systems. The objectives of this thesis focus on factors that affect the use of Force Myography (FMG) controlled prostheses in practical scenarios. The first objective assessed the use of FMG as an alternative or synergistic Human Machine Interface (HMI) to the more traditional HMI, surface Electromyography (sEMG). The assessment was conducted in conditions relatively close to the real use case of prosthetic applications: the HMI was embedded in a custom prosthetic prototype, developed for the pilot participant of the study around an off-the-shelf prosthetic end effector, and prosthesis control was assessed as the user moved their limb in a dynamic protocol. The results of that study motivated the second objective: to investigate the possibility of reducing the complexity of high density FMG systems without sacrificing classification accuracy. This was achieved through a design method that uses a high density FMG apparatus and feature selection to determine the number and location of sensors that can be eliminated without significantly degrading the system's performance. The third objective investigated two factors that contribute to increased errors in the force sensitive resistor (FSR) signals used in FMG controlled prostheses: bending of the force sensors and variations in the volume of the residual limb. 
    Two studies were conducted that propose solutions to mitigate the negative impact of these factors, and the incorporation of these solutions into prosthetic devices is discussed in those studies. It was demonstrated that FMG is a promising HMI for prosthesis control, and pattern recognition with FMG showed potential for intuitive prosthetic control. Moreover, a method was proposed and tested for designing a system that determines the required number of sensors and their locations for each individual, yielding a simpler system with performance comparable to high density FMG systems. The effects of the two factors considered in the third objective were determined, and it was demonstrated that the proposed solutions can increase the accuracy of signals commonly used in FMG controlled prostheses.
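    The second objective's sensor-reduction idea can be sketched as a greedy backward-elimination loop, a common feature-selection strategy. This is not necessarily the thesis's actual method, and the scoring function below is a toy stand-in for cross-validated classification accuracy:

```python
INFORMATIVE = {2, 5}  # toy ground truth: only these sensors carry signal

def toy_score(subset):
    """Stand-in for classification accuracy: counts informative sensors
    kept, with a tiny bonus per sensor so removals are never free."""
    return len(INFORMATIVE & set(subset)) + 0.01 * len(subset)

def reduce_sensors(sensors, score, budget):
    """Greedy backward elimination: repeatedly drop the sensor whose
    removal hurts the score least, until only `budget` sensors remain."""
    kept = list(sensors)
    while len(kept) > budget:
        drop = max(kept, key=lambda s: score([x for x in kept if x != s]))
        kept.remove(drop)
    return kept

print(reduce_sensors(list(range(8)), toy_score, 2))  # → [2, 5]
```

    With a real accuracy estimator in place of `toy_score`, the same loop yields, per user, a reduced sensor count and placement, which is the design outcome the thesis describes.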