
    First validation of the Haptic Sandwich: a shape changing handheld haptic navigation aid

    This paper presents the Haptic Sandwich, a handheld robotic device designed to provide pedestrian navigation instructions through a novel shape-changing modality. The device resembles a cube with an articulated upper half that can rotate and translate (extend) relative to the bottom half, which is grounded in the user’s hand when the device is held. The poses assumed by the device simultaneously correspond to heading and proximity to a navigational target. The Haptic Sandwich provides an alternative to screen- and/or audio-based pedestrian navigation technologies for both visually impaired and sighted users. Unlike other robotic or haptic navigational solutions, the Haptic Sandwich is discreet in both form and sensory stimulus. Due to the novel and unexplored nature of shape-changing interfaces, two user studies were undertaken to validate the concept and device. In the first experiment, stationary participants attempted to identify poses assumed by the device, which was hidden from view. In the second experiment, participants attempted to locate a sequence of invisible navigational targets while walking with the device. Of 1080 pose presentations to 10 individuals in experiment one, 80% were correctly identified and 17.5% had the minimal possible error. Multi-DOF errors accounted for only 1.1% of all answers. The role of simultaneous versus independent actuator motion on final shape perception was tested, with no significant performance difference. The rotation and extension DOF had significantly different perception accuracy. In the second experiment, participants demonstrated good navigational ability with the device after minimal training and were able to locate all presented targets. Mean motion efficiency of the participants was between 32% and 56%. Participants made use of both DOF.
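    The abstract does not define "motion efficiency"; a common definition in locomotion studies, assumed here purely for illustration, is the ratio of the straight-line distance from start to target to the distance actually walked. The function below is a hypothetical sketch of that metric, not code from the paper:

    ```python
    import math

    def motion_efficiency(path):
        """Ratio of straight-line start-to-goal distance to distance
        actually walked; 1.0 means a perfectly direct route.
        (Illustrative definition, not taken from the paper.)"""
        straight = math.dist(path[0], path[-1])
        walked = sum(math.dist(a, b) for a, b in zip(path, path[1:]))
        return straight / walked if walked else 0.0

    # A detour that doubles back halves efficiency relative to a direct walk.
    direct = [(0, 0), (10, 0)]
    detour = [(0, 0), (10, 0), (10, 5), (10, 0)]
    print(motion_efficiency(direct))  # 1.0
    print(motion_efficiency(detour))  # 0.5
    ```

    Under this definition, the reported 32%–56% range would mean participants walked roughly two to three times the straight-line distance to each target.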

    Haptic wearables as sensory replacement, sensory augmentation and trainer - a review

    Sensory impairments decrease quality of life and can slow or hinder rehabilitation. Small, computationally powerful electronics have enabled the recent development of wearable systems aimed at improving function for individuals with sensory impairments. The purpose of this review is to synthesize current haptic wearable research for clinical applications involving sensory impairments. We define haptic wearables as untethered, ungrounded, body-worn devices that interact with the skin directly or through clothing and can be used in natural environments outside a laboratory. Results of this review are categorized by degree of sensory impairment. Total impairment, such as in an amputee, blind, or deaf individual, involves haptics acting as sensory replacement; partial impairment, as is common in rehabilitation, involves haptics as sensory augmentation; and no impairment involves haptics as trainer. This review found that wearable haptic devices improved function for a variety of clinical applications, including rehabilitation, prosthetics, vestibular loss, osteoarthritis, vision loss, and hearing loss. Future haptic wearable development should focus on clinical needs, intuitive and multimodal haptic displays, low energy demands, and biomechanical compliance for long-term usage.

    The Analysis of design and manufacturing tasks using haptic and immersive VR - Some case studies

    The use of virtual reality in interactive design and manufacture has been researched extensively, but the practical application of this technology in industry is still very much in its infancy. This is surprising, as one would have expected that, after some 30 years of research, commercial applications of interactive design or manufacturing planning and analysis would be widespread throughout the product design domain. One of the major but less well-known advantages of VR technology is that logging the user yields a great deal of rich data, which can be used to automatically generate designs or manufacturing instructions, analyse design and manufacturing tasks, map engineering processes and, tentatively, acquire expert knowledge. The authors feel that the benefits of VR in these areas have not been fully disseminated to the wider industrial community and - with the advent of cheaper PC-based VR solutions - perhaps a wider appreciation of the capabilities of this type of technology may encourage companies to adopt VR solutions for some of their product design processes. With this in mind, this paper describes in detail applications of haptics in assembly, demonstrating how user task logging can lead to the analysis of design and manufacturing tasks at a level of detail not previously possible, as well as giving usable engineering outputs. The haptic 3D VR study involves the use of a Phantom and 3D system to analyse and compare this technology against real-world user performance. This work demonstrates that the detailed logging of tasks in a virtual environment gives considerable potential for understanding how virtual tasks can be mapped onto their real-world equivalents, as well as showing how haptic process plans can be generated in a similar manner to the conduit design and assembly planning HMD VR tool reported in PART A.
The paper concludes with the authors' view of how the use of VR systems in product design and manufacturing should evolve in order to enable the industrial adoption of this technology in the future.

    Haptics: Science, Technology, Applications

    This open access book constitutes the proceedings of the 12th International Conference on Human Haptic Sensing and Touch Enabled Computer Applications, EuroHaptics 2020, held in Leiden, The Netherlands, in September 2020. The 60 papers presented in this volume were carefully reviewed and selected from 111 submissions. They were organized in topical sections on haptic science, haptic technology, and haptic applications. This year's focus is on accessibility.

    DAC-h3: A Proactive Robot Cognitive Architecture to Acquire and Express Knowledge About the World and the Self

    This paper introduces a cognitive architecture for a humanoid robot to engage in a proactive, mixed-initiative exploration and manipulation of its environment, where the initiative can originate from both the human and the robot. The framework, based on a biologically grounded theory of the brain and mind, integrates a reactive interaction engine, a number of state-of-the-art perceptual and motor learning algorithms, as well as planning abilities and an autobiographical memory. The architecture as a whole drives the robot's behavior to solve the symbol grounding problem, acquire language capabilities, execute goal-oriented behavior, and express a verbal narrative of its own experience in the world. We validate our approach in human-robot interaction experiments with the iCub humanoid robot, showing that the proposed cognitive architecture can be applied in real time within a realistic scenario and that it can be used with naive users.

    Neuromorphic hardware for somatosensory neuroprostheses

    In individuals with sensory-motor impairments, missing limb functions can be restored using neuroprosthetic devices that directly interface with the nervous system. However, restoring the natural tactile experience through electrical neural stimulation requires complex encoding strategies, which are presently limited by bandwidth constraints in effectively conveying or restoring tactile sensations. Neuromorphic technology, which mimics the natural behavior of neurons and synapses, holds promise for replicating the encoding of natural touch, potentially informing neurostimulation design. In this perspective, we propose that incorporating neuromorphic technologies into neuroprostheses could be an effective approach for developing more natural human-machine interfaces, potentially leading to advancements in device performance, acceptability, and embeddability. We also highlight ongoing challenges and the actions required to facilitate the future integration of these advanced technologies.

    Expressing Tacit Material Sensations from a Robo-Sculpting Process by Communicating Shared Haptic Experiences

    A sculptor's sense of touch is paramount because we experience sculpting in the iterative process of making new objects. Making sculpture is a process of expressing the inner 'tacit-self' by way of tangible material interactions that become shared artefacts. The existence of tacit-tactile awareness indicates a natural world of personal haptic experience that this thesis will attempt to unpack. Tele-haptic solutions are presented in the form of two robotic sculptures, Touchbot #1 and Touchbot #2. Touchbots (collectively) are the study objects that this practice-based art-research thesis produced, to ask the question: Is it possible to create a machine that could capture and retransmit tacit-tactile experiences within the artistic act of sculpting, through material engagement, from a sculptor's hand to a non-sculptor's hand? The research conducted and presented aims to demonstrate that robotic haptic feedback is a vehicle for communicating 'touch' messages through mechanical transmission during sculptural actions (demonstrated through participant interviews and video observation analysis). Additionally, an epistemological context for exploring 'hands-on' knowledge and practice deficits in machine-assisted object modelling is presented, including: Michael Polanyi's Tacit Dimension (Polanyi, 2009), David Gooding's Thing Knowledge (Gooding, 2004, p. 1) and Lambros Malafouris' "Material Agency" and material culture (Malafouris, 2008, pp. 19-36). Intersecting bodies of knowledge weave a common thread to support developing a method of communicating tacit sculptural information using haptic touch experience. Unfortunately, more tele-haptics and telerobotics technology exists for industrial applications than for artworks using the same technology. For instance, 'rapid prototyping' technology—such as 3D printers—is removing human tactile-material interaction from object making altogether.
In response to the technological obstacle of expanding contemporary interactive sculpture, haptics is applied to include real-time, iterative, robotically assisted object modelling. A review of contemporary haptic technology demonstrates a gap in our understanding of embodied knowledge transference. A shortlist of contemporary artists and their works that address the communication of tacit-haptic experiences is also offered, highlighting the importance of exploring embodied knowledge transfer.

    The Use of Tactile Sensors in Oral and Maxillofacial Surgery: An Overview

    Background: This overview aimed to characterize the type, development, and use of haptic technologies for maxillofacial surgical purposes. The aim is to summarize and evaluate the current advantages, drawbacks, and design choices of the presented technologies for each field of application, in order to address and promote future research as well as to provide a global view of the issue. Methods: Relevant manuscripts were searched electronically through the Scopus, MEDLINE/PubMed, and Cochrane Library databases until 1 November 2022. Results: After analyzing the available literature, 31 articles regarding tactile sensors and interfaces, sensorized tools, haptic technologies, and integrated platforms in oral and maxillofacial surgery have been included. Moreover, a quality rating is provided for each article following appropriate evaluation metrics. Discussion: Many efforts have been made to overcome the technological limits of computer-assisted diagnosis, surgery, and teaching. Nonetheless, a research gap is evident between dental/maxillofacial surgery and other specialties such as endovascular, laparoscopic, and microsurgery, especially with regard to electrical and optical sensors for instrumented tools and sensorized tools for contact-force detection. The application of existing technologies is mainly focused on digital simulation purposes, and integration into Computer Assisted Surgery (CAS) is far from widespread. Virtual reality, increasingly adopted in various fields of surgery (e.g., sino-nasal, traumatology, implantology), has shown interesting results and has the potential to revolutionize teaching and learning. A major concern regarding the actual state of the art is the absence of randomized controlled trials and the prevalence of case reports, retrospective cohorts, and experimental studies.
Nonetheless, as the research is fast growing, we can expect many developments to be incorporated into maxillofacial surgery practice, after adequate evaluation by the scientific community.