26,197 research outputs found

    The design and evaluation of an interface and control system for a scariculated rehabilitation robot arm

    This thesis is concerned with the design and development of a prototype implementation of a rehabilitation robotic manipulator based on a novel kinematic configuration. The initial aim of the research was to identify appropriate design criteria for the design of a user interface and control system, and for the subsequent evaluation of the manipulator prototype. This led to a review of the field of rehabilitation robotics, focusing on user evaluations of existing systems. The review showed that the design objectives of individual projects were often contradictory, and that a requirement existed for a more general and complete set of design criteria. These were identified through an analysis of the strengths and weaknesses of existing systems, including an assessment of manipulator performance, commercial success and user feedback. The resulting criteria were used for the design and development of a novel interface and control system for the Middlesex Manipulator, the novel scariculated robotic system. A highly modular architecture was adopted, allowing the manipulator to provide a level of adaptability not approached by existing rehabilitation robotic systems. This allowed the interface to be configured to match the controlling ability and input device selections of individual users. A range of input devices was employed, offering variation in communication mode and bandwidth. These included a commercial voice recognition system and a novel gesture recognition device. The latter was designed using electrolytic tilt sensors, the outputs of which were encoded by artificial neural networks, allowing the manipulator to be controlled through head or hand gestures. An individual with spinal-cord injury undertook a single-subject user evaluation of the Middlesex Manipulator over a period of four months. The evaluation provided evidence for the value of the adaptability offered by the user interface. It also showed that the prototype did not yet conform to all the design criteria, but allowed areas for design improvement to be identified. This work led to a second research objective, concerned with the problem of configuring an adaptable user interface for a specific individual. A novel form of task analysis is presented within the thesis that allows the relative usability of interface configurations to be predicted based upon individual user and input device characteristics. An experiment was undertaken with 6 subjects performing 72 task runs with 2 interface configurations controlled by user gestures. Task completion times fell within the predicted range, where the range was generated using confidence intervals (α = 0.05) on point estimates of user and device characteristics. This allowed successful prediction, over all task runs, of the relative task completion times of interface configurations for a given user.
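
    The thesis abstract gives no implementation details of the recognizer beyond tilt sensors feeding artificial neural networks, so the following is a minimal sketch under assumed specifics: windows of ten two-axis (pitch, roll) readings, invented gesture labels, synthetic training data, and scikit-learn's MLPClassifier standing in for the networks described.

    ```python
    # A minimal sketch of a tilt-sensor gesture recognizer, assuming:
    # windows of 10 two-axis (pitch, roll) readings, three invented gesture
    # labels, synthetic data, and MLPClassifier standing in for the
    # artificial neural networks the thesis describes.
    import numpy as np
    from sklearn.neural_network import MLPClassifier

    rng = np.random.default_rng(0)

    def synthetic_gesture(label, n=50):
        """Generate n flattened sensor windows clustered around a tilt profile."""
        base = {"nod": (0.6, 0.0), "tilt_left": (0.0, -0.6), "tilt_right": (0.0, 0.6)}[label]
        windows = rng.normal(loc=base, scale=0.1, size=(n, 10, 2))
        return windows.reshape(n, -1), [label] * n

    X_parts, y = [], []
    for gesture in ("nod", "tilt_left", "tilt_right"):
        xs, ys = synthetic_gesture(gesture)
        X_parts.append(xs)
        y.extend(ys)
    X = np.vstack(X_parts)

    # A small feed-forward network encodes sensor windows into gesture classes.
    clf = MLPClassifier(hidden_layer_sizes=(16,), max_iter=2000, random_state=0)
    clf.fit(X, y)

    # Classify a fresh window resembling a "nod" head gesture.
    window = rng.normal(loc=(0.6, 0.0), scale=0.1, size=(10, 2)).reshape(1, -1)
    print(clf.predict(window))  # expected: ['nod']
    ```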

    Controlling a remotely located Robot using Hand Gestures in real time: A DSP implementation

    Telepresence is a present-day necessity, since we cannot be everywhere at once, and it is useful for saving human lives in dangerous places. A robot that can be controlled from a distant location, whether over radio links or networking methods, can solve these problems. Control should also be smooth and in real time, so that the robot can act effectively on every minor signal. This paper discusses a method to control a robot over the network from a distant location. The robot was controlled by hand gestures captured by a live camera. A TMS320DM642EVM DSP board was used to implement image pre-processing and to speed up the whole system. PCA was used for gesture classification, and robot actuation was done according to predefined procedures. In the experiment, the classification information was sent over the network. The method is robust and could be used to control any kind of robot at a distance.
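
    As a rough illustration of the paper's classification stage, here is a sketch assuming PCA over preprocessed camera frames with a nearest-centroid decision in the reduced space; the frame size, gesture set, synthetic data, and socket comment are all assumptions, not the paper's actual setup.

    ```python
    # Illustrative sketch: frames are projected onto principal components
    # and classified in the reduced space; only the resulting label would be
    # sent over the network. Frame size, gesture set, and synthetic data are
    # assumptions, not the paper's setup.
    import numpy as np
    from sklearn.decomposition import PCA
    from sklearn.neighbors import NearestCentroid

    rng = np.random.default_rng(1)

    H, W, N_PER_CLASS = 32, 32, 40
    gestures = ["forward", "stop", "left", "right"]

    # Synthetic stand-ins for preprocessed hand images, one cluster per gesture.
    X = np.vstack([rng.normal(loc=i, scale=1.0, size=(N_PER_CLASS, H * W))
                   for i in range(len(gestures))])
    y = np.repeat(gestures, N_PER_CLASS)

    # PCA reduces each flattened frame to a low-dimensional feature vector.
    pca = PCA(n_components=20).fit(X)
    clf = NearestCentroid().fit(pca.transform(X), y)

    def classify(frame):
        """Classify one flattened, preprocessed frame into a gesture label."""
        return clf.predict(pca.transform(frame.reshape(1, -1)))[0]

    print(classify(X[0]))  # expected: 'forward'

    # Only the short label crosses the network; the robot maps it to a
    # predefined action, e.g.:
    # sock.sendall(classify(frame).encode())  # hypothetical TCP socket to the robot
    ```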

    GART: The Gesture and Activity Recognition Toolkit

    Presented at the 12th International Conference on Human-Computer Interaction, Beijing, China, July 2007. The original publication is available at www.springerlink.com. The Gesture and Activity Recognition Toolkit (GART) is a user interface toolkit designed to enable the development of gesture-based applications. GART provides an abstraction to machine learning algorithms suitable for modeling and recognizing different types of gestures. The toolkit also provides support for data collection and the training process. In this paper, we present GART and its machine learning abstractions. Furthermore, we detail the components of the toolkit and present two example gesture recognition applications.
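
    To give a feel for the style of abstraction the abstract describes, here is a hypothetical sketch in which a small class hides data collection and the learning algorithm from application code. All names are invented for illustration; this is not GART's real API, and k-nearest-neighbours merely stands in for whatever model the toolkit actually wraps.

    ```python
    # Hypothetical toolkit-style abstraction: the application only adds
    # samples, trains, and recognizes; the learning algorithm stays hidden.
    # Class and method names are invented, not GART's real API.
    from collections import defaultdict
    import numpy as np
    from sklearn.neighbors import KNeighborsClassifier

    class GestureLibrary:
        """Collects labeled gesture samples and trains a recognizer."""

        def __init__(self):
            self._samples = defaultdict(list)
            self._model = None

        def add_sample(self, label, features):
            """Store one feature vector recorded for the given gesture label."""
            self._samples[label].append(np.asarray(features, dtype=float))

        def train(self):
            """Fit the underlying model on everything collected so far."""
            X = [f for fs in self._samples.values() for f in fs]
            y = [label for label, fs in self._samples.items() for _ in fs]
            self._model = KNeighborsClassifier(n_neighbors=3).fit(X, y)

        def recognize(self, features):
            """Return the most likely gesture label for a new feature vector."""
            return self._model.predict([np.asarray(features, dtype=float)])[0]

    # Application code never touches the learning algorithm directly:
    lib = GestureLibrary()
    for i in range(5):
        lib.add_sample("swipe", [1.0 + 0.1 * i, 0.0])
        lib.add_sample("circle", [0.0, 1.0 + 0.1 * i])
    lib.train()
    print(lib.recognize([1.2, 0.1]))  # expected: 'swipe'
    ```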

    Design and Evaluation of Menu Systems for Immersive Virtual Environments

    Interfaces for system control tasks in virtual environments (VEs) have not been extensively studied. This paper focuses on various types of menu systems to be used in such environments. We describe the design of the TULIP menu, a menu system using Pinch Gloves™, and compare it to two common alternatives: floating menus and pen and tablet menus. These three menus were compared in an empirical evaluation. The pen and tablet menu was found to be significantly faster, while users had a preference for TULIP. Subjective discomfort levels were also higher with the floating menus and pen and tablet.

    Assessing the effectiveness of direct gesture interaction for a safety critical maritime application

    Multi-touch interaction, in particular multi-touch gesture interaction, is widely believed to give a more natural interaction style. We investigated the utility of multi-touch interaction in the safety-critical domain of maritime dynamic positioning (DP) vessels. We conducted initial paper prototyping with domain experts to gain an insight into natural gestures; we then conducted observational studies aboard a DP vessel during operational duties and two rounds of formal evaluation of prototypes, the second on a motion-platform ship simulator. Despite following a careful user-centred design process, the final results show that traditional touch-screen button and menu interaction was quicker and less error-prone than gestures. Furthermore, the moving environment accentuated this difference, and we observed initial use problems and handedness asymmetries on some multi-touch gestures. On the positive side, our results showed that users were able to suspend gestural interaction more naturally, thus improving situational awareness.

    Interaction With Tilting Gestures In Ubiquitous Environments

    In this paper, we introduce a tilting interface that controls direction-based applications in ubiquitous environments. A tilt interface is useful in situations that require remote and quick interactions or that are carried out in public spaces. We explored the proposed tilting interface with different application types and classified the tilting interaction techniques. Augmenting objects with sensors can potentially address the lack of intuitive and natural input devices in ubiquitous environments. We conducted an experiment to test the usability of the proposed tilting interface and to compare it with conventional input devices and hand gestures. The experimental results showed that tilt gestures outperformed hand gestures in terms of speed, accuracy, and user satisfaction.
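
    To make the idea of a direction-controlling tilt interface concrete, here is a minimal sketch assuming a sensor that reports pitch and roll in degrees; the dead-zone threshold and command names are illustrative, not values from the paper.

    ```python
    # Minimal sketch of a tilt-to-command mapping, assuming a sensor that
    # reports (pitch, roll) in degrees. Threshold and command names are
    # illustrative assumptions, not values from the paper.

    DEAD_ZONE_DEG = 10.0  # ignore small tilts so resting jitter is not a command

    def tilt_to_command(pitch_deg: float, roll_deg: float) -> str:
        """Map a (pitch, roll) reading to a discrete direction command."""
        if max(abs(pitch_deg), abs(roll_deg)) < DEAD_ZONE_DEG:
            return "neutral"
        # The dominant tilt axis decides the command.
        if abs(pitch_deg) >= abs(roll_deg):
            return "forward" if pitch_deg > 0 else "backward"
        return "right" if roll_deg > 0 else "left"

    print(tilt_to_command(2.0, -3.0))    # neutral: inside the dead zone
    print(tilt_to_command(25.0, 5.0))    # forward: pitch dominates
    print(tilt_to_command(-4.0, -18.0))  # left: roll dominates, negative
    ```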