134 research outputs found

    Multimodal, Embodied and Location-Aware Interaction

    This work demonstrates the development of mobile, location-aware, eyes-free applications which utilise multiple sensors to provide continuous, rich and embodied interaction. We bring together ideas from the fields of gesture recognition, continuous multimodal interaction, probability theory and audio interfaces to design and develop location-aware applications and embodied interaction in both a small-scale, egocentric, body-based case and a large-scale, exocentric, 'world-based' case. BodySpace is a gesture-based application which utilises multiple sensors and pattern recognition, enabling the human body to be used as the interface for an application. As an example, we describe the development of a gesture-controlled music player, which functions by placing the device at different parts of the body. We describe a new approach to the segmentation and recognition of gestures for this kind of application and show how simulated physical model-based interaction techniques and the use of real-world constraints can shape the gestural interaction. GpsTunes is a mobile, multimodal navigation system equipped with inertial control that enables users to actively explore and navigate through an area in an augmented physical space, incorporating and displaying the uncertainty resulting from inaccurate sensing and unknown user intention. The system propagates uncertainty appropriately via Monte Carlo sampling, and output is displayed both visually and in audio, with audio rendered via granular synthesis. We demonstrate the use of uncertain prediction in the real world and show that appropriate display of the full distribution of potential future user positions with respect to sites-of-interest can improve the quality of interaction over a simplistic interpretation of the sensed data. We show that this system enables eyes-free navigation around set trajectories or paths unfamiliar to the user, for varying trajectory width and context. We demonstrate the possibility of creating a simulated model of user behaviour, which may be used to gain insight into the user behaviour observed in our field trials. The extension of this application to provide a general mechanism for highly interactive context-aware applications via density exploration is also presented. AirMessages is an example application enabling users to take an embodied approach to scanning a local area to find messages left in their virtual environment.
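
The Monte Carlo propagation of position uncertainty described above can be sketched minimally as follows; the constant-velocity motion model, the noise magnitudes and all function names are illustrative assumptions rather than the system's actual implementation:

```python
import math
import random

def propagate_positions(x, y, heading, speed, n_samples=1000,
                        pos_noise=5.0, heading_noise=0.3, horizon=10.0):
    """Sample plausible future positions from noisy GPS and heading.

    pos_noise is the GPS standard deviation in metres, heading_noise is in
    radians, and horizon is the look-ahead time in seconds (all illustrative).
    """
    samples = []
    for _ in range(n_samples):
        h = random.gauss(heading, heading_noise)
        sx = random.gauss(x, pos_noise) + speed * horizon * math.cos(h)
        sy = random.gauss(y, pos_noise) + speed * horizon * math.sin(h)
        samples.append((sx, sy))
    return samples

def fraction_near(samples, site, radius):
    """Share of sampled futures falling within `radius` of a site-of-interest."""
    sx0, sy0 = site
    hits = sum(1 for sx, sy in samples
               if math.hypot(sx - sx0, sy - sy0) <= radius)
    return hits / len(samples)
```

Displaying `fraction_near` for each site, whether visually or as the density of a granular-synthesis audio stream, conveys the full distribution of likely futures rather than a single point estimate.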

    Robotic manipulators for single access surgery

    This thesis explores the development of cooperative robotic manipulators for enhancing surgical precision and patient outcomes in single-access surgery and, specifically, Transanal Endoscopic Microsurgery (TEM). During these procedures, surgeons manipulate a heavy set of instruments via a mechanical clamp inserted into the patient’s body through a surgical port, resulting in imprecise movements, increased patient risks and increased operating time. Therefore, an articulated robotic manipulator with passive joints is initially introduced, featuring built-in position and force sensors in each joint and electronic joint brakes for instant lock/release capability. The articulated manipulator concept is further improved with motorised joints, evolving into an active tool holder. The joints allow the incorporation of advanced robotic capabilities such as ultra-lightweight gravity compensation and hands-on kinematic reconfiguration, which can optimise the placement of the tool holder in the operating theatre. Owing to the enhanced sensing capabilities, the application of the active robotic manipulator was further explored in conjunction with advanced image-guidance approaches such as endomicroscopy. Recent advances in probe-based optical imaging, such as confocal endomicroscopy, are making inroads into clinical use. However, the challenging manipulation of imaging probes hinders their practical adoption. Therefore, a combination of the fully cooperative robotic manipulator with a high-speed scanning endomicroscopy instrument is presented, simplifying the incorporation of optical biopsy techniques into routine surgical workflows. Finally, another embodiment of a cooperative robotic manipulator is presented as an input interface to control a highly articulated robotic instrument for TEM.
This master-slave interface alleviates the drawbacks of traditional master-slave devices, e.g. the use of clutching mechanisms to compensate for the mismatch between slave and master workspaces, and the lack of intuitive manipulation feedback, e.g. joint limits, to the user. To address those drawbacks, a joint-space robotic manipulator is proposed that emulates the kinematic structure of the flexible robotic instrument under control.
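
The joint-space mapping idea above can be illustrated with a minimal sketch, assuming a master whose joints correspond one-to-one with the instrument's joints; the joint limits and all names below are hypothetical:

```python
def joint_space_command(master_angles, joint_limits):
    """Map master joint angles directly onto a kinematically similar slave.

    Because the mapping is one-to-one in joint space, no clutching is needed
    to reconcile mismatched workspaces, and clamping at a joint limit occurs
    at the corresponding master joint (limits here are hypothetical, radians).
    """
    return [min(max(q, lo), hi)
            for q, (lo, hi) in zip(master_angles, joint_limits)]
```

For example, `joint_space_command([0.5, -2.0], [(-1.0, 1.0), (-1.0, 1.0)])` clamps the second joint to its limit, which a haptic master can render back to the user as resistance at that same joint.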

    Wearables for Movement Analysis in Healthcare

    Quantitative movement analysis is widely used in clinical practice and research to investigate movement disorders objectively and comprehensively. Conventionally, body-segment kinematic and kinetic parameters are measured in gait laboratories using marker-based optoelectronic systems, force plates, and electromyographic systems. Although such movement analyses are considered accurate, the need for specific laboratories, high costs, and dependency on trained users sometimes limit their use in clinical practice. A variety of compact wearable sensors are available today and have allowed researchers and clinicians to pursue applications in which individuals are monitored in their homes and in community settings within different fields of study, such as movement analysis. Wearable sensors may thus contribute to the implementation of quantitative movement analysis even in out-patient use, reducing evaluation times and providing objective, quantifiable data on patients’ capabilities, unobtrusively and continuously, for clinical purposes.

    Technological advances in deep brain stimulation:Towards an adaptive therapy

    Parkinson's disease (PD) is a neurodegenerative movement disorder, and a treatment method called deep brain stimulation (DBS) may considerably reduce a patient’s motor symptoms. The clinical procedure involves the implantation of a DBS lead, consisting of multiple electrode contacts, through which continuous high-frequency (around 130 Hz) electric pulses are delivered into the brain. In this thesis, I present research whose goal is to improve current DBS technology, focusing on bringing the conventional DBS system a step closer to adaptive DBS, a personalized DBS therapy. The chapters in this thesis can be seen as individual building blocks for such an adaptive DBS system. In the two chapters following the general introduction, two novel DBS lead designs are studied in a computational model. The model showed that both studied leads were able to exploit the novel distribution of the electrode contacts to shape and steer the stimulation field, activating more neurons in the chosen target than the conventional lead and counteracting lead displacement. In the fourth chapter, an inverse current source density (CSD) method is applied to local field potentials (LFP) measured in a rat model. The pattern of CSD sources can act as a landmark within the subthalamic nucleus (STN) to locate the potential stimulation target. The fifth and final chapter describes the last building block of the DBS system: a measurement system based on inertial and force sensors, which can record the hand kinematics and joint stiffness of PD patients and act as a source of feedback signals in an adaptive DBS system.
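
The field shaping and steering described above can be illustrated with a minimal point-source model of a multi-contact lead in an infinite homogeneous medium; the conductivity value, contact geometry and function name are illustrative assumptions, not the computational model used in the thesis:

```python
import math

def potential(point, contacts, currents, sigma=0.2):
    """Extracellular potential (V) at `point` from point-source contacts:
    V = sum_i I_i / (4 * pi * sigma * r_i), with sigma in S/m (illustrative).
    Redistributing the currents over the contacts steers the field."""
    v = 0.0
    for contact, i_amp in zip(contacts, currents):
        r = max(math.dist(point, contact), 1e-6)  # avoid the r = 0 singularity
        v += i_amp / (4.0 * math.pi * sigma * r)
    return v
```

Shifting current from one contact to another moves the high-potential region toward the sourcing contact, which is the basic mechanism a steering lead exploits to recover from lead displacement.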

    Vitreo-retinal eye surgery robot : sustainable precision

    Vitreo-retinal eye surgery encompasses the surgical procedures performed on the vitreous humor and the retina. A procedure typically consists of the removal of the vitreous humor, the peeling of a membrane and/or the repair of a retinal detachment. Vitreo-retinal surgery is performed minimally invasively: small, needle-shaped instruments are inserted into the eye and manipulated by hand in four degrees of freedom about the insertion point. Two rotations move the instrument tip laterally, in addition to a translation in the axial instrument direction and a rotation about its longitudinal axis. The actuation of the instrument tip, e.g. a gripping motion, can be considered a fifth degree of freedom. While performing vitreo-retinal surgery manually, the surgeon faces various challenges. Typically, delicate tissue only micrometres thick is operated on, which requires steady hand movements and highly accurate instrument manipulation. Lateral instrument movements are inverted by the pivoting insertion point and scaled depending on the instrument insertion depth. A maximum of two instruments can be used simultaneously. There is nearly no perception of surgical forces, since most forces are below the human detection limit. Therefore, the surgeon relies only on visual feedback, obtained via a microscope or endoscope. Both vision systems force the surgeon to work in a static and non-ergonomic body posture. Although the surgeon’s proficiency improves throughout his career, hand tremor becomes a problem at an older age. Robotically assisted surgery with a master-slave system can help the surgeon meet these challenges. The slave system performs the actual surgery by means of instrument manipulators which handle the instruments. The surgeon remains in control of the instruments by operating haptic interfaces via a master. The master and slave are connected through electronic hardware and control software.
Advantages such as tremor filtering, up-scaled force feedback, down-scaled motions and stabilized instrument positioning will enhance dexterity in surgical tasks. Furthermore, providing the surgeon with an ergonomic body posture will prolong the surgeon’s career. This thesis focuses on the design and realization of a high-precision slave system for eye surgery. The master-slave system uses a table-mounted design, in which the system is compact, lightweight, easy to set up and equipped to perform a complete intervention. The slave system consists of two main parts: the instrument manipulators and their passive support system. Requirements are derived from manual eye surgery, conversations with medical specialists and analysis of the human anatomy and vitreo-retinal interventions. The passive support system provides a stiff connection between the instrument manipulator, the patient and the surgical table. Given human anatomical diversity, pre-surgical adjustments can be made to position the instrument manipulators over each eye. Most of the support system is integrated within the patient’s headrest. On either the left or right side, two exchangeable manipulator-support arms can be installed onto the support system, depending on the eye being operated upon. The compact, lightweight and easy-to-install design allows for a short setup time and quick removal in case of a complication. The slave system’s surgical reach is optimized to emulate manually performed surgery. For bimanual instrument operation, two instrument manipulators are used. Additional instrument manipulators can be used for non-active tools, e.g. an illumination probe or an endoscope. An instrument manipulator allows the same degrees of freedom and a similar reach as manually performed surgery. Instrument forces are measured to supply force feedback to the surgeon via the haptic interfaces.
The instrument manipulator is designed for high stiffness, is play-free and has low friction to allow tissue manipulation with high accuracy. Each instrument manipulator is equipped with an on-board instrument-change system, by which instruments can be changed quickly and securely. A compact design near the instrument allows easy access to the surgical area, leaving room for the microscope and peripheral equipment. The acceptance of a surgical robot for eye surgery relies mostly on equipment safety and reliability. The design of the slave system features various safety measures, e.g. a quick-release mechanism for the instrument manipulator and additional locks on the pre-surgical adjustment fixation clamp. Additional safety measures are proposed, such as a hard cover over the instrument manipulator and redundant control loops in the controlling FPGA. A method to fixate the patient’s head to the headrest using a custom-shaped polymer mask is also proposed. Two instrument manipulators and their passive support system have been realized so far, and the first experimental results confirm the designed low actuation torque and high-precision performance.
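
The inversion and depth-dependent scaling of lateral movements mentioned above can be sketched with a small-angle lever model about the insertion point; the function and argument names are hypothetical:

```python
def lateral_tip_motion(hand_dx, insertion_depth, outside_length):
    """Small-angle sketch of an instrument pivoting about the insertion point.

    A hand movement hand_dx applied at outside_length above the pivot produces
    a tip movement that is inverted (opposite sign) and scaled by the ratio
    insertion_depth / outside_length.
    """
    return -hand_dx * (insertion_depth / outside_length)
```

With the instrument inserted 10 mm and held 20 mm above the pivot, a 2 mm hand movement yields a 1 mm tip movement in the opposite direction; a robotic slave removes both the inversion and the depth dependence by commanding the tip position directly.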

    Rehabilitation Engineering

    Population ageing has major consequences and implications in all areas of our daily life as well as in other important respects, such as economic growth, savings, investment and consumption, labour markets, pensions, property and care from one generation to another. Additionally, health and related care, family composition and lifestyle, housing and migration are also affected. Given the rapid increase in the ageing of the population and the further increase expected in the coming years, an important problem to be faced is the corresponding increase in chronic illness, disabilities and loss of functional independence endemic to the elderly (WHO 2008). For this reason, novel methods of rehabilitation and care management are urgently needed. This book covers many rehabilitation support systems and robots developed for the upper limbs, the lower limbs and the visually impaired. Beyond the upper limb, lower-limb research works are also discussed, such as a motorized footrest for an electric-powered wheelchair and a standing-assistance device.

    Pushing the limits of inertial motion sensing
