
    Imagining & Sensing: Understanding and Extending the Vocalist-Voice Relationship Through Biosignal Feedback

    The voice is body and instrument. Third-person interpretation of the voice by listeners, vocal teachers, and digital agents is centred largely around audio feedback. For a vocalist, physical feedback from within the body provides an additional interaction. The vocalist's understanding of their multi-sensory experiences comes through tacit knowledge of the body. This knowledge is difficult to articulate, yet awareness and control of the body are innate. As technology that quantifies or interprets physiological processes becomes ever more common, we must also remain conscious of embodiment and of human perception of these processes. Focusing on the vocalist-voice relationship, this thesis expands knowledge of human interaction and of how technology influences our perception of our bodies.

    To unite these different perspectives in the vocal context, I draw on mixed methods from cognitive science, psychology, music information retrieval, and interactive system design. Objective methods such as vocal audio analysis provide a third-person observation. Subjective practices such as micro-phenomenology capture the experiential, first-person perspectives of the vocalists themselves. This quantitative-qualitative blend provides detail not only on novel interaction, but also an understanding of how technology influences existing understanding of the body.

    I worked with vocalists to understand how they use their voice through abstract representations, use mental imagery to adapt to altered auditory feedback, and teach fundamental practice to others. Vocalists use multi-modal imagery, for instance understanding physical sensations through auditory sensations. The understanding of the voice exists in a pre-linguistic representation which draws on embodied knowledge and lived experience from outside contexts.

    I developed a novel vocal interaction method which measures laryngeal muscular activation through surface electromyography. Biofeedback was presented to vocalists through sonification. Acting as an indicator of vocal activity for both conscious and unconscious gestures, this feedback allowed vocalists to explore their movement through sound. This formed new perceptions but also called into question existing understandings of the body. The thesis also uncovers ways in which vocalists are both in control of and controlled by their bodies, work with and against them, and feel at one with them at times and entirely separate from them at others.

    I conclude this thesis by demonstrating a nuanced account of human interaction and perception of the body through vocal practice, as an example of how technological intervention enables exploration of, and influence over, embodied understanding. This further highlights the need to understand the human experience in embodied interaction, rather than relying solely on digital interpretation, when introducing technology into these relationships.
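    The abstract does not describe the sonification mapping in detail, so the following is only an illustrative sketch, assuming a simple amplitude-to-pitch scheme: the rectified, smoothed envelope of a laryngeal sEMG channel modulates the frequency of a sine tone, so that stronger muscular activation is heard as a higher pitch. All parameter values (window length, pitch range, sampling rates) are invented for the example.

```python
import numpy as np

def emg_envelope(emg, fs_emg, win_ms=50):
    """Rectified, moving-average envelope of a raw sEMG trace."""
    win = max(1, int(fs_emg * win_ms / 1000))
    rectified = np.abs(emg - emg.mean())
    return np.convolve(rectified, np.ones(win) / win, mode="same")

def sonify(envelope, fs_emg, fs_audio=44100, f_lo=220.0, f_hi=880.0):
    """Map the normalised envelope to the pitch of a sine tone at audio rate."""
    env = (envelope - envelope.min()) / (np.ptp(envelope) + 1e-9)
    t_emg = np.arange(len(env)) / fs_emg
    t_audio = np.arange(int(t_emg[-1] * fs_audio)) / fs_audio
    env_audio = np.interp(t_audio, t_emg, env)       # upsample envelope to audio rate
    freq = f_lo + env_audio * (f_hi - f_lo)          # stronger activation -> higher pitch
    phase = 2 * np.pi * np.cumsum(freq) / fs_audio   # integrate instantaneous frequency
    return 0.2 * np.sin(phase)                       # quiet sine tone in [-0.2, 0.2]

# Example: 5 s of synthetic "sEMG" at 1 kHz, sonified into a 44.1 kHz audio buffer.
fs_emg = 1000
t = np.linspace(0, 5, 5 * fs_emg)
emg = np.random.randn(len(t)) * (0.5 + 0.5 * np.sin(2 * np.pi * 0.4 * t))  # waxing/waning activity
audio = sonify(emg_envelope(emg, fs_emg), fs_emg)
```

    In a real biofeedback setting the audio buffer would be streamed to the sound card in real time rather than synthesised offline.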

    Analysis, Design and Fabrication of Micromixers, Volume II

    Micromixers are an important component in micro-total analysis systems and lab-on-a-chip platforms, which are widely used for sample preparation and analysis, drug delivery, and biological and chemical synthesis. The Special Issue "Analysis, Design and Fabrication of Micromixers II" published in Micromachines covers new mechanisms, numerical and/or experimental mixing analysis, design, and fabrication of various micromixers. This reprint includes an editorial, two review papers, and eleven research papers reporting on five active and six passive micromixers. Three of the active micromixers use an electrokinetic driving force, while the other two are actuated by a mechanical mechanism and by acoustic streaming. Three studies employ non-Newtonian working fluids, one of which deals with non-Newtonian nanofluids. Most of the studies investigated micromixer design.

    Novel Bidirectional Body-Machine Interface to Control Upper Limb Prosthesis

    Objective. The journey of a bionic prosthesis user is characterized by the opportunities and limitations involved in adopting a device (the prosthesis) that should enable activities of daily living (ADL). Within this context, experiencing a bionic hand as a functional (and, possibly, embodied) limb constitutes the premise for mitigating the risk of abandonment through continuous use of the device. To achieve such a result, different aspects must be considered in making the artificial limb an effective support for carrying out ADLs. Among them, intuitive and robust control is fundamental to improving the quality of life of amputees using upper limb prostheses. Still, as artificial proprioception is essential to perceive the prosthesis movement without constant visual attention, a good control framework may not be enough to restore practical functionality to the limb. To overcome this, bidirectional communication between the user and the prosthesis has recently been introduced and is a requirement of utmost importance in developing prosthetic hands. Indeed, closing the control loop between the user and the prosthesis by providing artificial sensory feedback is a fundamental step towards the complete restoration of the lost sensory-motor functions. Within my PhD work, I proposed the development of a more controllable and sensitive human-like hand prosthesis, i.e., the Hannes prosthetic hand, to improve its usability and effectiveness.

    Approach. To achieve the objectives of this thesis, I developed a modular and scalable software and firmware architecture to control the Hannes multi-Degree-of-Freedom (DoF) prosthetic system and to fit all users' needs (hand aperture, wrist rotation, and wrist flexion in different combinations). On top of this, I developed several Pattern Recognition (PR) algorithms to translate electromyographic (EMG) activity into complex movements. However, stability and repeatability were still unmet requirements in multi-DoF upper limb systems; hence, I started by investigating different strategies to produce a more robust control. To do this, EMG signals were collected from trans-radial amputees using an array of up to six sensors placed over the skin. Secondly, I developed a vibrotactile system implementing haptic feedback to restore proprioception and create a bidirectional connection between the user and the prosthesis. Similarly, I implemented object stiffness detection to restore a tactile sensation connecting the user with the external world. This closed loop between EMG control and vibration feedback is essential to implementing a Bidirectional Body-Machine Interface with a strong impact on amputees' daily life. For each of these three activities, (i) implementation of robust pattern recognition control algorithms, (ii) restoration of proprioception, and (iii) restoration of the feeling of the grasped object's stiffness, I performed a study in which data from healthy subjects and amputees were collected in order to demonstrate the efficacy and usability of my implementations. In each study, I evaluated both the algorithms and the subjects' ability to use the prosthesis by means of the F1-score metric (offline) and the Target Achievement Control (TAC) test (online). With this test, I analyzed the error rate, path efficiency, and time efficiency in completing different tasks.

    Main results. Among the several Pattern Recognition methods tested, Non-Linear Logistic Regression (NLR) proved to be the best algorithm in terms of F1-score (99%, robustness), and the minimum number of electrodes needed for its functioning was determined to be four in the offline analyses. Further, I demonstrated that its low computational burden allowed its implementation and integration on a microcontroller running at a sampling frequency of 300 Hz (efficiency). Finally, the online implementation allowed the subject to simultaneously control the Hannes prosthesis DoFs in a bioinspired and human-like way. In addition, I performed further tests with the same NLR-based control by endowing it with closed-loop proprioceptive feedback. In this scenario, the TAC test yielded an error rate of 15% and a path efficiency of 60% in experiments where no other sources of information were available (no visual and no audio feedback). These results demonstrated an improvement in the controllability of the system, with an impact on user experience.

    Significance. The obtained results confirm the hypothesis that the robustness and efficiency of prosthetic control improve thanks to the implemented closed-loop approach. The bidirectional communication between the user and the prosthesis is capable of restoring the lost sensory functionality, with promising implications for direct translation into clinical practice.
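    The thesis names Non-Linear Logistic Regression as the best-performing classifier, but the abstract does not specify its features or training pipeline. The sketch below is therefore only a hedged stand-in: windowed time-domain EMG features (MAV, RMS, waveform length) fed to scikit-learn's logistic regression, evaluated with the F1-score as in the offline analyses. The window length, overlap, gesture labels, and synthetic data are assumptions for illustration; only the 300 Hz sampling rate and the four-electrode minimum come from the abstract.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import f1_score
from sklearn.model_selection import train_test_split

FS = 300      # sampling frequency reported in the abstract (Hz)
WIN = 60      # 200 ms analysis window at 300 Hz (window length is an assumption)
N_CH = 4      # minimum number of electrodes reported in the offline analyses

def features(window):
    """Classic time-domain EMG features per channel: MAV, RMS, waveform length."""
    mav = np.mean(np.abs(window), axis=0)
    rms = np.sqrt(np.mean(window ** 2, axis=0))
    wl = np.sum(np.abs(np.diff(window, axis=0)), axis=0)
    return np.concatenate([mav, rms, wl])

def make_dataset(emg, labels):
    """Slice a continuous (n_samples, n_channels) recording into labelled windows."""
    X, y = [], []
    for start in range(0, len(emg) - WIN, WIN // 2):    # 50% window overlap
        X.append(features(emg[start:start + WIN]))
        y.append(labels[start + WIN // 2])              # gesture label at window centre
    return np.array(X), np.array(y)

# Random stand-in data (3 gestures, 4 channels), so the printed score is near chance;
# real recordings from the sensor array would replace these arrays.
rng = np.random.default_rng(0)
emg = rng.standard_normal((30_000, N_CH))
labels = rng.integers(0, 3, size=30_000)

X, y = make_dataset(emg, labels)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=0)
clf = LogisticRegression(max_iter=1000).fit(X_tr, y_tr)
print("macro F1:", f1_score(y_te, clf.predict(X_te), average="macro"))
```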

    Sensory mechanisms involved in obtaining frictional information for perception and grip force adjustment during object manipulation

    Sensory signals informing about the frictional properties of a surface are used both for perception, to experience material properties, and for motor control, to handle objects using adequate manipulative forces. There are fundamental differences between these two purposes and in the scenarios in which sensory information is typically obtained. This thesis aims to explore the mechanisms involved in the perception of the frictional properties of touched surfaces under conditions relevant to object manipulation. Firstly, I show that in the passive touch condition, when the surface is brought into contact with the immobilised finger, humans are unable to use existing friction-related mechanical cues and perceptually associate them with frictional properties. However, a lateral movement in the submillimetre range significantly improved the subjects' ability to evaluate the frictional properties of two otherwise identical surfaces. It is demonstrated that partial slips within the contact area and fingertip tissue deformation create very potent sensory stimuli, enabling tactile afferents to signal friction-dependent mechanical effects that translate into slipperiness (friction) perception. Further, I demonstrate that natural movement kinematics facilitate the development of such small skin displacements within the contact area and may play a central role in enabling the perception of surface slipperiness and in adjusting grip force to friction when manipulating objects. This demonstrates an intimate interdependence between the motor and sensory systems. This work significantly extends our understanding of the fundamental tactile sensory processes involved in friction signalling in the context of motor control and dexterous object manipulation. This knowledge and the discovered friction-sensing principles may assist in designing haptic rendering devices and artificial tactile sensors, as well as associated control algorithms, to be used in robotic grippers and hand prostheses.
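    As a minimal illustration of the grip-force-to-friction adjustment discussed above (not taken from the thesis), the classic two-digit precision-grip model with Coulomb friction gives a slip limit of G >= L / (2 μ) for load force L and friction coefficient μ, and humans typically apply a proportional safety margin above this minimum. The margin value below is an assumed, illustrative number.

```python
# Classic two-digit precision-grip model under Coulomb friction (illustrative only;
# not the thesis's model or data). With two opposing digits sharing a load force L
# on surfaces with friction coefficient mu, slip is avoided when the grip force G
# satisfies G >= L / (2 * mu).
def minimum_grip_force(load_n: float, mu: float) -> float:
    """Slip limit for a two-finger precision grip."""
    return load_n / (2.0 * mu)

def applied_grip_force(load_n: float, mu: float, safety_margin: float = 0.3) -> float:
    """Grip force with a relative safety margin above the slip limit (margin assumed)."""
    return (1.0 + safety_margin) * minimum_grip_force(load_n, mu)

# Holding a 3 N load: slippery (low-mu) surfaces demand much higher grip forces.
for mu in (0.4, 0.8, 1.5):
    print(f"mu={mu:.1f}: slip limit {minimum_grip_force(3.0, mu):.2f} N, "
          f"applied {applied_grip_force(3.0, mu):.2f} N")
```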

    Haptics: Science, Technology, Applications

    This open access book constitutes the proceedings of the 13th International Conference on Human Haptic Sensing and Touch Enabled Computer Applications, EuroHaptics 2022, held in Hamburg, Germany, in May 2022. The 36 regular papers included in this book were carefully reviewed and selected from 129 submissions. They were organized in topical sections as follows: haptic science; haptic technology; and haptic applications.

    Aerial Robotics for Inspection and Maintenance

    Aerial robots with perception, navigation, and manipulation capabilities are extending the range of applications of drones, allowing the integration of different sensor devices and robotic manipulators to perform inspection and maintenance operations on infrastructures such as power lines, bridges, viaducts, or walls, typically involving physical interaction in flight. New research and technological challenges arise from applications demanding the benefits of aerial robots, particularly in outdoor environments. This book collects eleven papers from different research groups from Spain, Croatia, Italy, Japan, the USA, the Netherlands, and Denmark, focused on the design, development, and experimental validation of methods and technologies for inspection and maintenance using aerial robots.

    Planning and control of robotic manipulation actions for extreme environments

    A large societal and economic need arises for advanced robotic capabilities to perform complex human-like tasks, such as tool use, in environments that are hazardous for human workers. This thesis addresses a collection of problems which arise when robotic manipulators must perform complex tasks in cluttered and constrained environments. The work is illustrated by example scenarios of robotic tool use, grasping and manipulation, motivated by the challenges of dismantling operations in the extreme environments of nuclear decommissioning.

    Contrary to popular assumptions, legacy nuclear facilities (which can date back three-quarters of a century in the UK) can be highly unstructured and uncertain environments, with insufficient a-priori information available for, e.g., conventional pre-programming of robot tasks. Meanwhile, situational awareness and direct teleoperation can be extremely difficult for human operators working in a safe zone that is physically remote from the robot. This engenders a need for significant autonomous capabilities. Robots must use vision and sensory systems to perceive their environment, and plan and execute complex actions on complex objects in cluttered and constrained environments. Significant radiation, of different types and intensities, provides further challenges in terms of sensor noise. Perception uncertainty can also result from, e.g., vision systems observing shiny, featureless metal structures. Robotic actions therefore need to be: i) planned in ways that are robust to uncertainties; and ii) controlled in ways which enable robust reaction to disturbances.

    In particular, we investigate motion planning and control in tasks where the robot must: maintain contact while moving over arbitrarily shaped surfaces with end-effector tools; exert forces and withstand perturbations during forceful contact actions; avoid collisions with obstacles; avoid singular configurations; and increase robustness by maximising manipulability during task execution. Furthermore, we consider the issues of robust planning and control with respect to uncertain information derived from noisy sensors in challenging environments. We exploit Riemannian geometry and the robot's manipulability to yield path planners that produce paths, for both fixed-base and floating-base robots, whose tools always stay in contact with the object's surface. Our planners overcome disturbances in perception and account for robot/environment interactions that may demand unexpected forces. Task execution is entrusted to a hybrid force/motion controller whose motion subspace behaves compliantly to accommodate unexpected stiffness changes throughout the contact.

    We also examine the problem of grasping a tool for performing a task. First, we introduce a method for selecting a grasp candidate on an object that yields collision-free motion for the robot in its post-grasp movements. Furthermore, we study the case of a dual-arm robot performing full-force tasks on an object while slippage at the grasp is allowed; we account for this slippage throughout task execution using a novel controller based on sliding mode control.
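    The abstract refers to avoiding singular configurations and maximising manipulability, without giving the specific measure or planner used. As a hedged illustration only, the snippet below computes the standard Yoshikawa manipulability index w = sqrt(det(J J^T)) for a hypothetical planar three-link arm; the link lengths and joint angles are invented for the example, and the thesis's Riemannian-geometry planners are not reproduced here.

```python
import numpy as np

def planar_jacobian(q, lengths):
    """Position Jacobian (2 x n) of a planar serial arm with relative joint angles q."""
    q = np.asarray(q, dtype=float)
    lengths = np.asarray(lengths, dtype=float)
    cum = np.cumsum(q)                               # absolute orientation of each link
    J = np.zeros((2, len(q)))
    for i in range(len(q)):
        # joint i moves every link from i onwards
        J[0, i] = -np.sum(lengths[i:] * np.sin(cum[i:]))
        J[1, i] = np.sum(lengths[i:] * np.cos(cum[i:]))
    return J

def manipulability(q, lengths):
    """Yoshikawa index w = sqrt(det(J J^T)); it drops to zero at a singularity."""
    J = planar_jacobian(q, lengths)
    return np.sqrt(max(np.linalg.det(J @ J.T), 0.0))

lengths = [0.4, 0.3, 0.2]
print(manipulability([0.3, 0.8, -0.5], lengths))   # well-conditioned posture
print(manipulability([0.3, 0.0, 0.0], lengths))    # fully stretched (singular) -> ~0
```

    A planner of the kind described above could use such an index as a cost or constraint term, preferring postures where the measure stays well away from zero.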