
    Characterization Of Somatosensation In The Brainstem And The Development Of A Sensory Neuroprosthesis

    Innovations in neuroprosthetics have restored sensorimotor function to paralysis patients and amputees. However, to date there is a lack of solutions that adequately address the needs of spinal cord injury (SCI) patients. In this dissertation we develop a novel sensor-brain interface (SBI) that delivers electrical microstimulation to the cuneate nucleus (CN) to restore somatosensory feedback in patients with intact limbs. In Chapter II, we develop a fully passive liquid metal antenna, using gallium-indium (GaIn) alloy injected into polydimethylsiloxane (PDMS) channels, to measure forces within the physiological sensitivity of a human fingertip. In Chapter III, we present the first chronic neural interface with the CN in primates, providing access to long-term unit recordings and stimulation. In Chapter IV, we demonstrate that microstimulation of the CN is detectable in a Three-Alternative Forced-Choice Oddity task in awake, behaving primates. In Chapter V, we explore the downstream effects of CN stimulation on primary somatosensory cortex, in the context of spontaneous and evoked spindles under sedation. In summary, these findings constitute a proof of concept for the sensory half of a bidirectional sensorimotor prosthesis in the CN.

    Distributive tactile sensing using fibre Bragg grating sensors

    Two distributive tactile sensing systems are presented, based on fibre Bragg grating sensors. The first is a one-dimensional metal strip with an array of four sensors, capable of detecting the magnitude and position of a contacting load. This system is compared experimentally with a similar system using resistive strain gauges. The second is a two-dimensional steel plate with nine sensors, able to distinguish the position and shape of a contacting load. This system is compared with a similar system using 16 infrared displacement sensors. Each system uses neural networks to process the sensor data into information about the type of contact.
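As a rough illustration of how an array of strain readings can be mapped to a contact position, the sketch below uses a strain-weighted centroid. This is a deliberately simplified stand-in for the neural-network mapping the abstract describes, and all names and values are hypothetical.

```python
def estimate_load_position(readings, sensor_positions):
    """Estimate the position of a contacting load along a 1-D strip as the
    strain-weighted centroid of the sensor readings.

    readings: strain magnitudes from the grating sensors
    sensor_positions: positions of those sensors along the strip, in the
    same units as the returned estimate
    """
    total = sum(readings)
    if total == 0:
        raise ValueError("no load detected")
    return sum(r * p for r, p in zip(readings, sensor_positions)) / total

# A load centred between sensors 2 and 3 of a four-sensor strip:
# estimate_load_position([1.0, 3.0, 3.0, 1.0], [0.0, 1.0, 2.0, 3.0])  -> 1.5
```

A trained network can additionally recover the load magnitude and, on a two-dimensional plate, its shape; the centroid above only captures the position half of that mapping.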

    Robotic hand augmentation drives changes in neural body representation

    Humans have long been fascinated by the opportunities afforded through augmentation. This vision not only depends on technological innovations but also critically relies on our brain's ability to learn, adapt, and interface with augmentation devices. Here, we investigated whether successful motor augmentation with an extra robotic thumb can be achieved and what its implications are for the neural representation and function of the biological hand. Able-bodied participants were trained to use an extra robotic thumb (called the Third Thumb) over 5 days, including both lab-based and unstructured daily use. We challenged participants to complete normally bimanual tasks using only the augmented hand and examined their ability to develop hand-robot interactions. Participants were tested on a variety of behavioral and brain imaging tests designed to interrogate the augmented hand's representation before and after the training. Training improved Third Thumb motor control, dexterity, and hand-robot coordination, even when cognitive load was increased or when vision was occluded. It also resulted in an increased sense of embodiment over the Third Thumb. Consequently, augmentation influenced key aspects of hand representation and motor control. Third Thumb usage weakened natural kinematic synergies of the biological hand. Furthermore, brain decoding revealed a mild collapse of the augmented hand's motor representation after training, even while the Third Thumb was not worn. Together, our findings demonstrate that motor augmentation can be readily achieved, with potential for flexible use, reduced cognitive reliance, and an increased sense of embodiment. Yet augmentation may incur changes to the biological hand representation. Such neurocognitive consequences are crucial for the successful implementation of future augmentation technologies.

    The development of assistive technology to reveal knowledge of physical world concepts in young people who have profound motor impairments.

    Cognitively able children and young people who have profound motor impairments and complex communication needs (the target group, or TG) face many barriers to learning, communication, personal development, physical interaction and play experiences, compared to their typically developing peers. Physical interaction and play are known to be important components of child development, but this group currently has few suitable ways in which to participate in these activities. Furthermore, the TG may have knowledge about real-world physical concepts despite having limited physical interaction experiences, but it can be difficult to reveal this knowledge, and conventional assessment techniques are not suitable for this group, largely due to accessibility issues. This work presents a pilot study involving a robotics-based intervention which enabled members of the TG to experience simulated physical interaction, and which was designed to identify and develop the knowledge and abilities of the TG relating to physical concepts involving temporal, spatial or movement elements. The intervention involved the participants using an eye-gaze-controlled robotic arm with a custom-made haptic feedback device to complete a set of tasks. To address issues with assessing the TG, two new digital Assistive Technology (AT) accessible assessments were created for this research, one using static images, the other video clips. Two participants belonging to the TG took part in the study. The outcomes indicated a high level of capability in performing the tasks, with the participants exhibiting a level of knowledge and ability which was much higher than anticipated. One explanation for this finding could be that they acquired this knowledge through past experiences and ‘observational learning’. The custom haptic device was found to be useful for assessing the participants’ sense of touch in a way which is less invasive than conventional ‘pin-prick’ techniques.
The new digital AT accessible assessments seemed especially suitable for one participant, while results were mixed for the other. This suggests that a combination of ‘traditional’ assessment and a ‘practical’ intervention assessment approach may help to provide a clearer, more rounded understanding of individuals within the TG. The work makes contributions to knowledge in the field of disability and Assistive Technology, specifically regarding: AT accessible assessments; haptic device design for the TG; the combination of robotics, haptics and eye gaze for use by the TG to interact with the physical world; a deeper understanding of the TG in general; and insights into designing for and working with the TG. The work and information gathered can help therapists and education staff to identify strengths and gaps in knowledge and skills, to focus learning and therapy activities appropriately, and to change the perceptions of those who work with this group, encouraging them to broaden their expectations of the TG.

    Biomimetic tactile sensing


    BRIX - An Easy-to-Use Modular Sensor and Actuator Prototyping Toolkit

    Zehe S, Großhauser T, Hermann T. BRIX - An Easy-to-Use Modular Sensor and Actuator Prototyping Toolkit. In: Tenth Annual IEEE International Conference on Pervasive Computing and Communications, Workshop Proceedings. Lugano, Switzerland: IEEE; 2012: 817-822.
    In this paper we present BRIX, a novel modular hardware prototyping platform for applications in mobile, wearable and stationary sensing, data streaming and feedback. The system consists of three different types of compact stackable modules, which can adapt to various applications and scenarios. The core of BRIX is a base module that contains basic motion sensors, a processor and a wireless interface. A battery module provides power for the system and makes it a mobile device. Different types of extension modules can be stacked onto the base module to extend its scope of functions with sensors, actuators and interactive elements. BRIX allows very intuitive, inexpensive and expeditious prototyping that does not require knowledge of electronics or hardware design. In an example application, we demonstrate how BRIX can be used to track human body movements.

    Smart textiles for improved quality of life and cognitive assessment

    Smart textiles can be used as innovative solutions to amuse, meaningfully engage, comfort, entertain, stimulate, and overall improve the quality of life of people living in care homes with dementia or its precursor, mild cognitive impairment (MCI). This concept paper presents a smart textile prototype designed both to entertain and to monitor/assess the behavior of the relevant clients. The prototype includes physical computing components for music playing and simple interaction, as well as games and data logging systems to determine baselines of activity and interaction. Using microelectronics, light-emitting diodes (LEDs) and capacitive touch sensors woven into a fabric, the study demonstrates the kinds of augmentations possible over the normal manipulation of the traditional non-smart activity apron, by incorporating light and sound effects as feedback when patients interact with different regions of the textile. A data logging system will record the patient’s behavioral patterns, including the location, frequency, and time of the patient’s activities within the different textile areas. The textile will be placed across the resident’s lap for them to play with, permitting the development of a behavioral profile through the gamification of cognitive tests. This concept paper outlines the development of a prototype sensor system and highlights the challenges related to its use in a care home setting. The research implements a wide range of functionality through a novel, loosely coupled architecture, concentrating artifacts on the top layer and technology on the bottom layer. Components in a loosely coupled system can be replaced with alternative implementations that provide the same services, which gives the solution the greatest flexibility. The literature shows that existing architectures that are strongly coupled make it difficult to model different individuals without incurring significant costs.
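A minimal sketch of the kind of data-logging component the abstract describes, recording the region, time, and frequency of interactions with the textile. The class and field names here are illustrative assumptions, not the authors' implementation.

```python
import time
from dataclasses import dataclass, field

@dataclass
class TouchEvent:
    region: str  # textile area the resident interacted with
    timestamp: float = field(default_factory=time.time)

class InteractionLog:
    """Records touch events so a behavioral profile can be built later."""

    def __init__(self):
        self.events = []

    def record(self, region):
        """Log one interaction with the named textile region."""
        self.events.append(TouchEvent(region))

    def frequency(self, region):
        """How often a given textile region has been touched so far."""
        return sum(1 for e in self.events if e.region == region)
```

Per-region counts and timestamps like these are enough to derive the location, frequency, and time-of-activity baselines mentioned above.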

    Integrating passive ubiquitous surfaces into human-computer interaction

    Mobile technologies enable people to interact with computers ubiquitously. This dissertation investigates how ordinary, ubiquitous surfaces can be integrated into human-computer interaction to extend the interaction space beyond the edge of the display. It turns out that acoustic and tactile features generated during an interaction can be combined to identify input events, the user, and the surface. In addition, it is shown that a heterogeneous distribution of different surfaces is particularly suitable for realizing versatile interaction modalities. However, privacy concerns must be considered when selecting sensors, and context can be crucial in determining whether and what interaction to perform.
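One simple way to combine acoustic and tactile features, as the abstract suggests, is to normalise each modality and concatenate the vectors before classification. The sketch below uses a nearest-centroid classifier as a stand-in for whatever model the dissertation actually used; all function names and data are illustrative assumptions.

```python
import math

def zscore(xs):
    """Normalise one modality so neither dominates the fused vector."""
    mean = sum(xs) / len(xs)
    std = math.sqrt(sum((x - mean) ** 2 for x in xs) / len(xs)) or 1e-8
    return [(x - mean) / std for x in xs]

def fuse(acoustic, tactile):
    """Concatenate per-modality normalised features into one vector."""
    return zscore(acoustic) + zscore(tactile)

def classify(sample, centroids):
    """Nearest-centroid classification of a fused feature vector.
    centroids: dict mapping an event label to its fused reference vector."""
    return min(centroids, key=lambda label: math.dist(sample, centroids[label]))
```

The same fused representation could in principle separate users or surfaces rather than input events, which is the combination the dissertation exploits.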

    Towards observable haptics: Novel sensors for capturing tactile interaction patterns

    Kõiva R. Towards observable haptics: Novel sensors for capturing tactile interaction patterns. Bielefeld: Bielefeld University; 2014.
    Touch is one of the primary senses humans use when performing coordinated interaction, but the lack of a sense of touch in the majority of contemporary interactive technical systems, such as robots operating in non-deterministic environments, results in interactions that can at best be described as clumsy. Observing human haptics and extracting the salient information from the gathered data is not only relevant if we are to try to understand the involved underlying cognitive processes, but should also provide us with significant clues for designing future intelligent interactive systems. Such systems could one day help to take the burden of tedious tasks off our hands, in a similar fashion to how industrial robots revolutionized manufacturing. The aim of the work in this thesis was to provide significant advancements in tactile sensing technology, and thus move us a step closer to realizing this goal. The contributions contained herein can be broken into two major parts. The first part investigates capturing interaction patterns in humans, with the goals of better understanding manual intelligence and improving the lives of hand amputees, while the second part is focused on augmenting technical systems with a sense of touch. tacTiles, a wireless tactile sensitive surface element attached to a deformable textile, was developed to capture human full-body interactions with the large surfaces we come into contact with in our daily lives, such as floors, chairs, sofas or other furniture. The Tactile Dataglove, iObject and the Tactile Pen were developed especially to observe human manual intelligence.
Whereas iObject allows motion sensing and a higher-definition tactile signal to be captured than the Tactile Dataglove (220 tactile cells in the first iObject prototype versus 54 cells in the glove), the wearable glove makes haptic interactions with arbitrary objects observable. The Tactile Pen was designed to measure grip force during handwriting in order to better facilitate therapeutic treatment assessments. These sensors have already been extensively used by various research groups, including our own, to gain a better understanding of human manual intelligence. The Finger-Force-Linear-Sensor and the Tactile Bracelet are two novel sensors that were developed to facilitate more natural control of dexterous multi-Degree-of-Freedom (DOF) hand prostheses. The Finger-Force-Linear-Sensor is a very accurate bidirectional single-finger force ground-truth measurement device, designed to enable the testing and development of algorithms that map muscle activations to single-finger forces. The Tactile Bracelet was designed to provide a more robust and intuitive means of control for multi-DOF hand prostheses by measuring the bulging of the remnant muscles of lower-arm amputees. It is currently in development and will eventually cover the complete forearm circumference with high-spatial-resolution tactile sensitive surfaces. An experiment involving a large number of lower-arm amputees has already been planned. The Modular flat tactile sensor system, the Fabric-based touch sensitive artificial skin and the 3D shaped tactile sensor were developed to cover the surfaces of technical systems and add touch sensing capabilities to them. The rapid augmentation of systems with a sense of touch was the main goal of the modular flat tactile sensor system. The developed sensor modules can be used alone or in an array to form larger tactile sensitive surfaces, such as tactile sensitive tabletops.
As many robots have curved surfaces, using flat rigid modules severely limits the areas that can be covered with tactile sensors. The Fabric-based tactile sensor, originally developed to form a tactile dataglove for human hands, can with minor modifications also function as an artificial skin for technical systems. Finally, the 3D shaped tactile sensor based on Laser-Direct-Structuring technology is a novel tactile sensor that has a true 3D shape and provides high sensitivity and high spatial resolution. These sensors take us further along the path towards creating general-purpose technical systems that in time can be of great help to us in our daily lives. The desired tactile sensor characteristics differ significantly according to which haptic interaction patterns we wish to measure. Large tactile sensor arrays that are used to capture full-body haptic interactions with floors and upholstered furniture, or that are designed to cover large areas of technical system surfaces, need to be scalable, have low power consumption and should ideally have a low material cost. Two examples of such sensors are tacTiles and the Fabric-based sensor for curved surfaces. At the other end of the tactile sensor development spectrum, if we want to observe manual interactions, high spatial and temporal resolution are crucial to enable the measurement of fine grasping and manipulation actions. Our fingertips contain the highest-density area of mechanoreceptors, the organs that sense mechanical pressure and distortion. Thus, to construct biologically inspired anthropomorphic robotic hands, the artificial tactile sensors for the fingertips must be similarly high-fidelity, with surfaces that curve under small bending radii in two dimensions and with high spatial density, while simultaneously providing high sensitivity.
With the fingertip tactile sensor, designed to fit the Shadow Robot Hands' fingers, I show in the 3D-shaped high-spatial-resolution tactile sensor section of my thesis that such sensors can indeed be constructed. With my work I have made a significant contribution towards making haptics more observable. I achieved this by developing a number of novel tactile sensors that are usable, give a deeper insight into human haptic interactions, have great potential to help amputees, and make technical systems, such as robots, more capable.

    Treadmill User Centering

    The goal of team CENTREAD was to design a device that allows a person with a visual disability to run efficiently and effectively on a treadmill without fear of falling off or injuring themselves. The customer wished for the device to be small and lightweight, with an easy, autonomous setup, while providing wireless feedback that lets the user correct their own movement. The ultimate goal of the device is to keep the user comfortable, safe, and free while using it, ensuring they have the best running experience. The device uses ultrasonic sensors, mounted in housings, to detect the distances of objects using sound-wave pulses. Each sensor emits a pulse and measures the time it takes for the echo to return, converting that time into a distance. These distances are fed directly into a microcontroller, which collects and analyzes the data, looking for readings that fall within the boundaries defined as not-safe zones. Such readings are assigned a value and sent to a wireless transmitter that communicates with its sister receiver. The receiver detects the signal from the transmitter and passes it to a second microcontroller for processing, which maps the transmitted value to a pin and applies a voltage to it. Each pin drives a small eccentric weighted motor that vibrates when a voltage is applied. The user interprets the vibration by moving in the opposite direction, correcting their position. The device uses two housings, one along the length axis of the treadmill belt and one along the width axis. These housings measure, respectively, the backward distance from the front edge of the belt, and the left-right distance from the inside face of the right treadmill arm.
These housings each contain their own microcontroller and transmitter, which communicate with the receiver. The receiver is housed in a belt that the user wears and collects signals from both housings. The belt's microcontroller interprets these signals and applies a voltage to the respective motor. The motors are located on the left, right, and back of the belt, and correct the user to the right, left, and forwards, respectively. This feedback system ultimately solves the user's problem and is an effective way of helping them get back to running confidently and safely again.
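The echo-timing arithmetic described above can be sketched as follows. The speed of sound, the zone boundaries, and the function names are illustrative assumptions, not the team's actual firmware.

```python
SPEED_OF_SOUND = 343.0  # m/s in air at roughly 20 degrees C (assumed)

def echo_to_distance(round_trip_s):
    """Convert an ultrasonic pulse's round-trip time into one-way distance (m).
    The pulse travels to the runner and back, hence the division by two."""
    return SPEED_OF_SOUND * round_trip_s / 2.0

def correction_signal(distance_m, safe_min_m, safe_max_m):
    """Return 0 inside the safe zone, -1 when the runner is too close to the
    sensor, +1 when too far; a nonzero value would drive a vibration motor."""
    if distance_m < safe_min_m:
        return -1
    if distance_m > safe_max_m:
        return +1
    return 0
```

In the device, one such signal per axis (length and width of the belt) is radioed to the receiver, which maps it to the left, right, or back motor.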