Machine Learning in Sensors and Imaging
Machine learning is extending its applications into various fields, such as image processing, the Internet of Things, user interfaces, big data, manufacturing, and management. Because data are required to build machine learning networks, sensors are among the most important enabling technologies. In addition, machine learning networks can contribute to improved sensor performance and the creation of new sensor applications. This Special Issue addresses all types of machine learning applications related to sensors and imaging. It covers computer vision-based control, activity recognition, fuzzy label classification, failure classification, motor temperature estimation, camera calibration for intelligent vehicles, error detection, color prior modeling, compressive sensing, wildfire risk assessment, shelf auditing, forest growing-stem-volume estimation, road management, image denoising, and touchscreens.
Interactive robot control system and method of use
A robotic system includes a robot having joints, actuators, and sensors, and a distributed controller. The controller includes a command-level controller, embedded joint-level controllers each controlling a respective joint, and a joint coordination-level controller coordinating the motion of the joints. A central data library (CDL) centralizes all control and feedback data, and a user interface displays the status of each joint, actuator, and sensor using the CDL. A parameterized action sequence has a hierarchy of linked events and allows the control data to be modified in real time. A method of controlling the robot includes transmitting control data through the various levels of the controller, routing all control and feedback data to the CDL, and displaying the status and operation of the robot using the CDL. Parameterized action sequences are generated for execution by the robot, and a hierarchy of linked events is created within each sequence.
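As a rough illustration of how a parameterized action sequence with a hierarchy of linked events might be represented, here is a minimal Python sketch; the `Event` class, its fields, and the sample sequence are illustrative assumptions, not the patent's actual data structures:

```python
from dataclasses import dataclass, field

@dataclass
class Event:
    """One node in a hierarchy of linked events (names are hypothetical)."""
    name: str
    params: dict = field(default_factory=dict)   # modifiable in real time
    children: list = field(default_factory=list)  # linked sub-events

def flatten(event):
    """Expand the hierarchy depth-first into an executable order."""
    order = [event.name]
    for child in event.children:
        order.extend(flatten(child))
    return order

# Illustrative sequence: a "pick" action parameterized by speed and force.
grasp = Event("grasp", {"force_limit": 5.0})
reach = Event("reach", {"speed": 0.2}, children=[grasp])
sequence = Event("pick", children=[reach, Event("lift", {"height": 0.1})])
print(flatten(sequence))  # depth-first execution order
```

Because each event carries its own `params` dictionary, a supervisory layer could adjust, say, `force_limit` while the sequence is running, which is the kind of real-time modification the abstract describes.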
A portable robotic rehabilitation system towards improving impaired function of the hand due to stroke
Background: Stroke is the leading cause of adult disability, with 70 to 85% of initial strokes resulting in hemiparesis. Physical impairment as a result of stroke tends to be severe, and the majority of impairments are upper limb-related. Impairment is usually accompanied by long-term functional loss, which requires dedicated post-stroke rehabilitation to regain motor function. The incidence of stroke is increasing rapidly while there remains a shortage of therapists to provide sufficient rehabilitation, so demand far outstrips the capacity to attend to the rising number of stroke survivors. Robot-aided therapy has emerged as a beneficial tool for providing continuous rehabilitation of the upper limb and is being widely implemented. With this technology, there is great potential to reduce the ill effects of the low therapist-patient ratio, which has hindered sufficient rehabilitation and consequently the effective recovery of motor function among stroke survivors. Hypothesis: The use of a portable robotic rehabilitation system as a complementary tool in hand therapy would promote continuous rehabilitation by encouraging repetition of task-oriented exercises, which would enhance the motor function of an impaired hand. Task-oriented writing practice would potentially improve hand coordination and result in better accuracy, while repetitive training would potentially increase hand motor strength. Objectives: 1. To design and manufacture a portable robotic rehabilitation system. 2. To test the performance and usability of the system. Methods: The system was manufactured and its performance tested in a pilot pre-clinical trial involving three participants. The system's ease of use was assessed using a standardised usability scale. Writing accuracy and hand motor strength were also assessed, and the results were analysed at the end of the study.
Results: The average overall usability score for the rehabilitation system was a few points higher than the average score. Users of the system also experienced increased motivation while performing the repetitive, task-oriented exercises. There was an improvement in the completion time of the writing accuracy test and the tasks of the trace sample test. The variation in grip strength of the non-dominant hand during the rehabilitation period was small for each of the participants. Conclusion: The rehabilitation system motivated its users to repetitively perform rehabilitative training, which may have improved writing accuracy.
Synchronized computational architecture for generalized bilateral control of robot arms
A master six-degree-of-freedom Force Reflecting Hand Controller (FRHC) is available at a master site, where a received image displays, in essentially real time, a remote robotic manipulator that is being controlled in the corresponding six degrees of freedom by command signals transmitted to the remote site in accordance with the movement of the FRHC at the master site. Software is user-initiated at the master site to establish the basic system conditions; a physical movement of the FRHC in Cartesian space is then reflected at the master site by six absolute numbers that are sensed, translated, and computed as a difference signal relative to the earlier position. The change in position is transmitted in that differential form over a high-speed synchronized bilateral communication channel, which simultaneously returns robot-sensed response information to the master site as forces applied to the FRHC, so that the FRHC reflects the feel of what is taking place at the remote site. A system-wide clock rate is selected high enough that the operator at the master site experiences the force-reflecting operation in real time.
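The differential command scheme described above (transmit only the change relative to the earlier position, then integrate it at the remote site) can be sketched as follows; the function names and the six-element pose layout are illustrative assumptions, not the patent's implementation:

```python
def delta_command(prev_pose, curr_pose):
    """Difference signal: per-axis change across the six Cartesian axes
    since the previous clock cycle (axis ordering is illustrative)."""
    return [c - p for p, c in zip(prev_pose, curr_pose)]

def apply_delta(remote_pose, delta):
    """The remote site integrates the differential command into its pose."""
    return [r + d for r, d in zip(remote_pose, delta)]

# One cycle: the master moved slightly; only the difference is transmitted.
master_prev = [0.0] * 6
master_curr = [0.01, 0.0, -0.02, 0.0, 0.0, 0.001]
remote = apply_delta([0.0] * 6, delta_command(master_prev, master_curr))
```

Sending differences rather than absolute poses keeps each packet small and lets the synchronized channel carry force feedback in the opposite direction on the same clock.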
New generation of interactive platforms based on novel printed smart materials
Doctoral programme in Electronic and Computer Engineering (Instrumentation and Electronic Microsystems area). The last decade was marked by a change in the computing paradigm, with other digital devices such as tablets and smartphones suddenly becoming available to the general public. A shift in perspective from the computer to materials as the centerpiece of digital interaction is leading to a diversification of interaction contexts, objects and applications, relying on intuitive commands and dynamic content that can provide more interesting and satisfying experiences.
In parallel, polymer-based sensors and actuators, and their integration into different substrates or devices, form an area of increasing scientific and technological interest, whose current state of the art is beginning to permit smart sensors and actuators to be embodied seamlessly within objects. Electronics is no longer a rigid board full of chips: recent technological advances have turned to printed electronics on polymers, textiles or paper. We are witnessing the scaling down of computational power into everyday objects, a fusion of the computer with the material. Interactivity is being transposed to objects that were once inanimate.
In this work, strain and deformation sensors and actuators were developed using functional polymer composites with metallic and carbonaceous nanoparticle (NP) inks, exploiting capacitive, piezoresistive and piezoelectric effects, envisioning the creation of tangible user interfaces (TUIs). Based on smart polymer substrates such as polyvinylidene fluoride (PVDF) and polyethylene terephthalate (PET), among others, prototypes were prepared using piezoelectric and dielectric technologies. Piezoresistive prototypes were prepared with resistive inks and resistive functional polymers. Materials were printed by screen printing, inkjet printing, aerosol jet printing and doctor blade coating. Finally, a case study of the integration of the different materials and technologies developed is presented in a book form factor.
This project was supported by FCT – Fundação para a Ciência e a Tecnologia, within the doctoral grant with reference SFRH/BD/110622/2015, by POCH – Programa Operacional Capital Humano, and by EU – European Union.
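For the piezoresistive prototypes mentioned above, the standard strain-gauge relation ΔR/R₀ = GF·ε relates a resistance change to strain. A minimal sketch of inverting that relation, assuming a hypothetical 1 kΩ printed trace with gauge factor 2 (both values are illustrative, not from this work):

```python
def strain_from_resistance(r_measured, r_nominal, gauge_factor):
    """Invert the piezoresistive relation delta_R / R0 = GF * strain."""
    return (r_measured - r_nominal) / (r_nominal * gauge_factor)

# Hypothetical example: a 1 kOhm trace with gauge factor 2 reading 1004 Ohm.
strain = strain_from_resistance(1004.0, 1000.0, 2.0)  # dimensionless strain
```

In practice the readout electronics would measure resistance through a voltage divider or bridge; this sketch covers only the conversion step.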
New Technologies for On-Demand Hand Rehabilitation in the Living Environment after Neurologic Injury
High-dosage rehabilitation therapy enhances neuroplasticity and motor recovery after neurologic injuries such as stroke and spinal cord injury. The optimal exercise dosage necessary to promote upper extremity (UE) recovery is unknown; however, the dosages delivered in current occupational and physical therapy sessions are orders of magnitude too low to optimally drive recovery. Taking therapy outside of the clinic and into the living environment using sensing and computer technologies is attractive because it could extend therapy dosage in a more cost-efficient and effective way. This dissertation developed innovative wearable sensing algorithms and a novel robotic system to enhance hand rehabilitation. We used these technologies to provide on-demand exercise in the living environment in ways not previously achieved, as well as to gain new insights into UE use and recovery after neurologic injuries.

Currently, the standard of practice for wearable sensing of UE movement after stroke is bimanual wrist accelerometry. While this approach has been validated as a way to monitor the amount of UE activity, and has been shown to correlate with clinical assessments, it is unclear what new information can be obtained from it. We developed two new kinematic metrics of movement quality obtainable from bimanual wrist accelerometry. Using data from stroke survivors, we applied principal component analysis to show that these metrics encode information distinct from that typically carried by conventional clinical assessments. We presented these results in a new graphical format that facilitates the identification of limb-use asymmetries.

Wrist accelerometry has the limitation that it cannot isolate functional use of the hand. Previously, we had developed a sensing system, the Manumeter, that quantifies finger movement by sensing the magnetic field changes induced by movement of a ring worn on the finger, using a magnetometer array worn at the wrist.
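The principal component analysis step used to test whether the new kinematic metrics carry unique information can be sketched as below; the feature matrix here is random placeholder data, not the study's accelerometry metrics:

```python
import numpy as np

def principal_components(features):
    """PCA via eigendecomposition of the covariance matrix.
    Rows are participants, columns are kinematic metrics (placeholders)."""
    centered = features - features.mean(axis=0)
    cov = np.cov(centered, rowvar=False)
    eigvals, eigvecs = np.linalg.eigh(cov)   # eigh returns ascending order
    order = np.argsort(eigvals)[::-1]        # sort descending by variance
    return eigvals[order], eigvecs[:, order]

# Placeholder data: 20 participants, 4 candidate metrics.
rng = np.random.default_rng(0)
X = rng.normal(size=(20, 4))
variances, components = principal_components(X)
```

If a new metric loads mainly on a component that existing clinical scores do not, that is evidence it encodes information the conventional assessments miss, which is the kind of argument the dissertation makes.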
We developed, optimized, and validated a calibration-free algorithm, the "HAND" algorithm, for real-time counting of isolated, functional hand movements with the Manumeter. Using data from a robotic wrist simulator, unimpaired volunteers, and stroke survivors, we showed that HAND counted movements with ~85% accuracy, missing mainly smaller, slower movements. We also showed that HAND counts correlated strongly with clinical assessments of hand function, indicating validity across a range of hand impairment levels.

To date, there have been few attempts to increase hand use and recovery after stroke by providing real-time feedback from wearable sensors. We used HAND and the Manumeter to perform a first-of-its-kind randomized controlled trial of the effect of real-time hand movement feedback on hand use and recovery after chronic stroke. We found that real-time feedback on hand movement was ineffective in increasing hand-use intensity and improving hand function. We also showed for the first time the non-linear relationship between hand capacity, measured in the laboratory, and actual hand use, measured at home: even people with a moderate level of clinical hand function exhibit very low hand use at home. Finally, the challenge of improving hand function for people with moderate to severe injuries highlights the need for novel approaches to rehabilitation. One emerging technique is regenerative rehabilitation, in which regenerative therapies, such as stem cell engraftment, are coupled with intensive rehabilitation. In collaboration with the Department of Veterans Affairs Gordon Mansfield Spinal Cord Injury Translational Collaborative Consortium, we developed a robot for promoting on-demand hand rehabilitation in a non-human primate model of hemiparetic spinal cord injury that is being used to synergize hand rehabilitation with novel regenerative therapies.
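The dissertation does not reproduce the internals of the HAND algorithm here, but a pedometer-style counter of the general kind it describes can be sketched as a threshold crossing with a refractory period; the threshold, refractory length, and sample trace are all illustrative assumptions:

```python
def count_movements(signal, threshold=1.0, refractory=5):
    """Count upward threshold crossings at least `refractory` samples apart.
    A hypothetical stand-in for functional hand-movement counting, not the
    actual HAND algorithm."""
    count, last_hit = 0, -refractory
    for i, value in enumerate(signal):
        if value > threshold and i - last_hit >= refractory:
            count += 1
            last_hit = i
    return count

# Two bursts of simulated finger activity separated by rest.
trace = [0.0] * 3 + [2.0] + [0.0] * 10 + [1.5] + [0.0] * 3
```

The refractory period prevents one sustained movement from being counted repeatedly, analogous to how a pedometer debounces steps.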
Using an innovative bimanual manipulation paradigm, we show that subjects engaged with the device at a similar rate before and after injury across a range of hand impairment severities. We also demonstrate that we could shape relative use of the arm and increase the number of exercise repetitions per reward by changing parameters of the robot. We then evaluated how the peak grip force that the subjects applied to the robot decreased after SCI, demonstrating that it can serve as a potential marker of recovery.

These developments provide a foundation for future work on technologies for therapeutic movement rehabilitation in the living environment by establishing: 1) new metrics of upper extremity movement quality; 2) a validated algorithm for achieving a "pedometer for the hand" using wearable magnetometry; 3) a negative clinical trial result on the therapeutic effect of real-time hand feedback after stroke, which raises the question of what can be improved in future trials; 4) the nonlinear relationship between hand movement ability and at-home use, supporting the concept of learned non-use; and 5) the first example of robotic regenerative rehabilitation.
Distributed Sensing and Stimulation Systems Towards Sense of Touch Restoration in Prosthetics
Modern prostheses aim to restore the functional and aesthetic characteristics of the lost limb. To foster prosthesis embodiment and functionality, it is necessary to restore both volitional control and sensory feedback. Contemporary feedback interfaces presented in research use few sensors and stimulation units and feed back at most two discrete variables (e.g. grasping force and aperture), whereas the human sense of touch relies on a distributed network of mechanoreceptors providing high-fidelity spatial information. To provide this type of feedback in prosthetics, it is necessary to sense tactile information from artificial skin placed on the prosthesis and to deliver tactile feedback above the amputation in order to map the interaction between the prosthesis and the environment. This thesis proposes the integration of distributed sensing systems (e-skin) to acquire tactile sensation, and of non-invasive multichannel electrotactile feedback and virtual reality to deliver high-bandwidth information to the user. Its core focus is the development and testing of a closed-loop sensory feedback human-machine interface, based on the latest distributed sensing and stimulation techniques, for restoring the sense of touch in prosthetics. To this end, the thesis comprises two introductory chapters that describe the state of the art in the field, the objectives, the methodology used, and the contributions, as well as three studies spanning the stimulation-system and sensing-system levels.
The first study presents the development of a closed-loop compensatory tracking system to evaluate the usability and effectiveness of electrotactile sensory feedback in enabling real-time closed-loop control in prosthetics. It examines and compares the subject's adaptive performance and tolerance to random latencies while performing a dynamic control task (i.e. position control) and simultaneously receiving either visual or electrotactile feedback communicating the momentary tracking error. Moreover, it reports the minimum time delay that produces an abrupt impairment of users' performance. The experimental results show that performance with electrotactile feedback is less sensitive to longer delays, whereas performance with visual feedback degrades faster as the time delay increases. This is a good indication of the effectiveness of electrotactile feedback in enabling closed-loop control in prosthetics, since some delays are inevitable.
The second study describes the development of a novel non-invasive, compact, multichannel interface for electrotactile feedback, containing a 24-pad electrode matrix with a fully programmable stimulation unit, and investigates the ability of able-bodied human subjects to localize electrotactile stimuli delivered through the electrode matrix. Furthermore, it introduces a novel dual-parameter modulation (interleaved frequency and intensity) and compares it to conventional stimulation (the same frequency for all pads). In addition, and for the first time, it compares electrotactile stimulation to mechanical stimulation. It also presents the integration of a virtual prosthesis with the developed system, mapping real-time tactile data and feeding it back simultaneously to the user to improve user experience and object manipulation. The experimental results demonstrate that the proposed interleaved coding substantially improved spatial localization compared to same-frequency stimulation. Furthermore, they show that same-frequency stimulation was equivalent to mechanical stimulation, whereas performance with dual-parameter modulation was significantly better.
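The contrast between conventional same-frequency stimulation and interleaved frequency coding can be sketched as below; the frequency values and the simple even/odd alternation across the 24 pads are illustrative assumptions, not the study's actual stimulation parameters:

```python
def conventional_frequencies(n_pads, base_hz=25.0):
    """Conventional coding: every pad stimulates at the same frequency."""
    return [base_hz] * n_pads

def interleaved_frequencies(n_pads, base_hz=25.0, step_hz=25.0):
    """Interleaved coding (sketch): neighbouring pads alternate between two
    frequencies so adjacent stimuli are easier to tell apart."""
    return [base_hz + (i % 2) * step_hz for i in range(n_pads)]

pad_freqs = interleaved_frequencies(24)
```

In the study the interleaving is combined with intensity modulation as a second parameter; this sketch shows only the frequency half of that dual-parameter scheme.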
The third study presents the realization of a novel, flexible, screen-printed e-skin based on P(VDF-TrFE) piezoelectric polymers that covers the fingertips and the palm of a prosthetic hand (specifically the Michelangelo hand by Ottobock), along with an assistive sensorized glove for stroke patients. Moreover, it develops a new validation methodology to examine the sensors' behavior under mechanical load. The characterization results showed agreement between the expected (modeled) electrical response of each sensor and the measured mechanical (normal) force at the skin surface, which in turn confirmed that the combined fabrication and assembly processes were successful. This paves the way toward a practical, simplified and reproducible characterization protocol for e-skin patches.
In conclusion, by adopting innovative methodologies in sensing and stimulation systems, this thesis advances the overall development of closed-loop sensory feedback human-machine interfaces used for the restoration of the sense of touch in prosthetics. Moreover, this research could lead to high-bandwidth, high-fidelity transmission of tactile information for modern dexterous prostheses, which could improve the end-user experience and facilitate their acceptance in daily life.
TableHop: an actuated fabric display using transparent electrodes
We present TableHop, a tabletop display that provides controlled self-actuated deformation and vibro-tactile feedback to an elastic fabric surface while retaining the ability for high-resolution visual projection. The TableHop surface is made of a highly stretchable pure spandex fabric that is electrostatically actuated using electrodes mounted on its underside. We use transparent indium tin oxide electrodes and high-voltage modulation to create controlled surface deformations. This setup actuates pixels and creates deformations in the fabric of up to 5 mm. Since the electrodes are transparent, the fabric surface can function as a diffuser for rear-projected visual images and avoid occlusion by users. Users can touch and interact with the fabric to create expressive interactions, as with any fabric-based shape-changing interface. By using frequency modulation in the high-voltage circuit, we can also create localised tactile sensations on the user's fingertip when touching the surface. We provide detailed simulation results for the shape of the surface deformation and the frequency of the haptic vibrations. These results can be used to build prototypes of different sizes and form factors. We finally create a working prototype of TableHop that has a 30×40 cm surface area and uses a 3×3 grid of transparent electrodes. Our prototype uses a maximum of 2.2 mW and can create tactile vibrations of up to 20. TableHop can be scaled to large interactive surfaces and integrated with other objects and devices. TableHop will improve the user interaction experience on 2.5D deformable displays.
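To localise a tactile sensation under the user's fingertip, the controller must map a touch point to an electrode in the grid. A minimal sketch, assuming a 40 cm × 30 cm surface and a 3×3 electrode grid (the mapping itself is an illustrative assumption, not the paper's implementation):

```python
def nearest_electrode(x, y, width=0.40, height=0.30, cols=3, rows=3):
    """Map a touch point (metres from the top-left corner) to the index of
    the nearest electrode in a rows-by-cols grid, numbered row-major.
    Surface dimensions and grid size are assumptions for illustration."""
    col = min(int(x / width * cols), cols - 1)
    row = min(int(y / height * rows), rows - 1)
    return row * cols + col
```

The selected electrode would then be driven with the frequency-modulated high-voltage signal while its neighbours hold the static deformation, producing a vibration localised under the finger.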
Exploring human-object interaction through force vector measurement
Thesis: S.M., Massachusetts Institute of Technology, School of Architecture and Planning, Program in Media Arts and Sciences, 2019. Cataloged from the PDF version of the thesis. Includes bibliographical references (pages 101-107).
I introduce SCALE, a project aiming to further understand human-object interaction through the real-time analysis of force vector signals, which I define as "force-based interaction" in this thesis. Force conveys fundamental information in force-based interaction, including force intensity, its direction, and object weight — information otherwise difficult to access or infer from other sensing modalities. To explore the design space of force-based interaction, I developed the SCALE toolkit, which is composed of modularized 3-axis force sensors and application APIs. In collaboration with large industry partners, this system has been applied to a variety of application domains and settings, including a retail store, a smart home and a farmers market. In this thesis, I propose the base system SCALE and two additional advanced projects, titled KI/OSK and DepthTouch, which build upon the SCALE project.
by Takatoshi Yoshida, S.M., Massachusetts Institute of Technology, School of Architecture and Planning, Program in Media Arts and Sciences
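The force-vector quantities the thesis names (intensity, direction, and object weight) follow directly from a single 3-axis force sample; a minimal sketch, with the weight estimate assuming a purely static vertical load (the function names are illustrative, not the SCALE toolkit's API):

```python
import math

def force_magnitude(fx, fy, fz):
    """Intensity of the 3-axis force vector, in the same units as its components."""
    return math.sqrt(fx * fx + fy * fy + fz * fz)

def force_direction(fx, fy, fz):
    """Unit vector giving the direction of the applied force."""
    m = force_magnitude(fx, fy, fz)
    return (fx / m, fy / m, fz / m)

def weight_estimate(fz_newton, g=9.81):
    """Object mass (kg) inferred from the static vertical force component."""
    return fz_newton / g
```

Shear (the fx, fy components) is what distinguishes, say, sliding an object from lifting it, which is why a 3-axis sensor recovers interactions that a simple scale cannot.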