15 research outputs found

    A.Eye Drive: gaze-based semi-autonomous wheelchair interface

    Existing wheelchair control interfaces, such as sip & puff or screen-based gaze-controlled cursors, are challenging for the severely disabled to navigate safely and independently, as users continuously need to interact with an interface during navigation. This puts a significant cognitive load on users and prevents them from interacting with the environment in other ways while navigating. We have combined eye tracking/gaze-contingent intention decoding with computer-vision context-aware algorithms and autonomous navigation drawn from self-driving vehicles to allow paralysed users to drive by eye, simply by decoding natural gaze about where the user wants to go: A.Eye Drive. Our “Zero UI” driving platform allows users to look at and visually interact with an object or destination of interest in their visual scene, and the wheelchair autonomously takes the user to the intended destination, while continuously updating the computed path around static and dynamic obstacles. This intention-decoding technology empowers the end user by promising more independence through their own agency.
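
    A minimal sketch of the decode-then-replan loop described above, assuming gaze fixations arrive as dwell-weighted scene cells and the environment is a 2-D occupancy grid; the function names and grid layout are illustrative, not the A.Eye Drive implementation:

```python
# Illustrative sketch only (assumed interfaces, not the A.Eye Drive code): decode a gazed-at
# destination, then (re)plan a path around the current obstacle map on a 2-D occupancy grid.
from collections import deque

def decode_gaze_target(fixations, dwell_threshold_ms=300):
    """Toy intention decoder: pick the longest-dwelled fixation as the navigation goal."""
    best = max(fixations, key=lambda f: f["dwell_ms"])
    return best["cell"] if best["dwell_ms"] >= dwell_threshold_ms else None

def plan_path(grid, start, goal):
    """Breadth-first search on an occupancy grid (0 = free, 1 = obstacle)."""
    rows, cols = len(grid), len(grid[0])
    queue, parents = deque([start]), {start: None}
    while queue:
        cell = queue.popleft()
        if cell == goal:                      # reconstruct the path back to the start
            path = []
            while cell is not None:
                path.append(cell)
                cell = parents[cell]
            return path[::-1]
        r, c = cell
        for nxt in ((r + 1, c), (r - 1, c), (r, c + 1), (r, c - 1)):
            nr, nc = nxt
            if 0 <= nr < rows and 0 <= nc < cols and grid[nr][nc] == 0 and nxt not in parents:
                parents[nxt] = cell
                queue.append(nxt)
    return None  # goal currently unreachable; wait for the obstacle map to change

# One control cycle: decode the intended destination from gaze, then plan around obstacles.
# The cycle would be re-run whenever the obstacle map or the decoded intention updates.
grid = [[0, 0, 0, 0],
        [0, 1, 1, 0],
        [0, 0, 0, 0]]
fixations = [{"cell": (2, 3), "dwell_ms": 450}, {"cell": (0, 1), "dwell_ms": 80}]
print(plan_path(grid, start=(0, 0), goal=decode_gaze_target(fixations)))
```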

    Sparse Eigenmotions derived from daily life kinematics implemented on a dextrous robotic hand

    Our hands are considered among the most complex actuated systems to control; thus, emulating the manipulative skills of real hands remains an open challenge even for anthropomorphic robotic hands. While the action of the 4 long fingers and simple grasp motions through opposable thumbs have been successfully implemented in robotic designs, complex in-hand manipulation of objects has been difficult to achieve. We take an approach grounded in data-driven extraction of control primitives from natural human behaviour to develop novel ways to understand the dexterity of hands. We collected hand kinematics datasets from natural, unconstrained daily-life behaviour of 8 healthy subjects in a studio flat environment. We then applied our Sparse Motion Decomposition approach to extract spatio-temporally localised modes of hand motion that are both time-scale and amplitude-scale invariant. These Sparse EigenMotions (SEMs) [1] form a sparse symbolic code that encodes continuous hand motions. We mechanically implemented the common SEMs on our novel dexterous robotic hand [2] in open-loop control. We report that, without processing any feedback during grasp control, several of the SEMs resulted in stable grasps of different daily-life objects. The finding that SEMs extracted from daily life implement stable grasps in open-loop control of dexterous hands lends further support to our hypothesis that the brain controls the hand using sparse control strategies.
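
    The abstract does not give the Sparse Motion Decomposition algorithm itself; as a generic stand-in, the sketch below runs ordinary sparse dictionary learning on windowed joint-angle data to convey the flavour of extracting sparse motion primitives (SEMs are additionally time- and amplitude-scale invariant, which this does not reproduce):

```python
# Generic sparse-coding stand-in (not the authors' Sparse Motion Decomposition): learn a
# dictionary of candidate motion primitives from windowed joint-angle data, so that each
# window is encoded by only a few active atoms (a sparse symbolic code).
import numpy as np
from sklearn.decomposition import DictionaryLearning

rng = np.random.default_rng(0)
n_windows, n_joints, window_len = 500, 20, 10
# Toy stand-in for recorded hand kinematics: each row is one flattened window of joint angles.
windows = rng.standard_normal((n_windows, n_joints * window_len))

coder = DictionaryLearning(n_components=40, alpha=1.0, max_iter=20, random_state=0)
codes = coder.fit_transform(windows)                              # sparse coefficients per window
primitives = coder.components_.reshape(40, n_joints, window_len)  # candidate motion primitives

print("mean number of active atoms per window:", np.count_nonzero(codes, axis=1).mean())
```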

    Neurobehavioural signatures in race car driving

    Recent technological developments in mobile brain and body imaging are enabling new frontiers of real-world neuroscience. Simultaneous recordings of body movement and brain activity from highly skilled individuals as they demonstrate their exceptional skills in real-world settings can shed new light on the neurobehavioural structure of human expertise. Driving is a real-world skill which many of us acquire at different levels of expertise. Here we ran a case study on a subject with the highest level of driving expertise - a Formula E Champion. We studied the expert driver’s neural and motor patterns while he drove a sports car on the “Top Gear” race track under extreme conditions (high speed, low visibility, low temperature, wet track). His brain activity, eye movements and hand/foot movements were recorded. Brain activity in the delta, alpha, and beta frequency bands showed a causal relation to hand movements. In summary, we demonstrate that even in extreme situations (race track driving) it is possible to collect human ethomic (Ethology + Omics) data that encompass information on the sensory inputs and motor outputs of the brain, as well as on brain state, to characterise complex human skills.
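
    The exact causality analysis is not specified in the abstract; one common way to probe such a relation is a Granger-causality test between EEG band power and hand movement, sketched below on synthetic data (the sampling rate, band limits, and windowing are assumptions made for illustration):

```python
# Illustrative stand-in (the abstract does not specify the analysis): probe whether past EEG
# band power helps predict hand movement using a Granger-causality test on synthetic data.
import numpy as np
from scipy.signal import butter, filtfilt
from statsmodels.tsa.stattools import grangercausalitytests

fs = 250                                    # assumed sampling rate (Hz)
rng = np.random.default_rng(1)
n = 60 * fs                                 # one minute of toy data
eeg = rng.standard_normal(n)                                     # toy EEG channel
hand = 0.5 * np.roll(eeg, 50) + 0.1 * rng.standard_normal(n)     # toy hand speed, lagging the EEG

def band_power(signal, lo, hi, win=fs):
    """Mean power in [lo, hi] Hz over consecutive 1-second windows."""
    b, a = butter(4, [lo / (fs / 2), hi / (fs / 2)], btype="band")
    filtered = filtfilt(b, a, signal)
    return np.array([np.mean(filtered[i:i + win] ** 2) for i in range(0, signal.size - win, win)])

alpha_power = band_power(eeg, 8, 12)
hand_speed = np.array([np.mean(np.abs(hand[i:i + fs])) for i in range(0, hand.size - fs, fs)])

# Column order: effect first, candidate cause second (does alpha power Granger-cause hand speed?).
grangercausalitytests(np.column_stack([hand_speed, alpha_power]), maxlag=2)
```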

    Motor learning in real-world pool billiards

    The neurobehavioral mechanisms of human motor control and learning evolved in free-behaving, real-life settings, yet they are studied mostly in reductionistic lab-based experiments. Here we take a step towards a more real-world motor neuroscience, using wearables for naturalistic full-body motion tracking and the sport of pool billiards to frame a real-world skill-learning experiment. First, we asked if well-known features of motor learning in lab-based experiments generalize to a real-world task. We found similarities in many features, such as multiple learning rates and the relationship between task-related variability and motor learning. Our data-driven approach reveals the structure and complexity of movement, variability, and motor learning, enabling an in-depth understanding of the structure of motor learning in three ways. First, while we expected most of the movement learning to be done by the cue-wielding arm, we find that motor learning affects the whole body, changing motor control from head to toe. Second, during learning, all subjects decreased their movement variability and their variability in the outcome. Subjects who were initially more variable were also more variable after learning. Lastly, when screening the link across subjects between initial variability in individual joints and learning, we found that only the initial variability in the right forearm supination shows a significant correlation with the subjects’ learning rates. This is in line with the relationship between learning and variability: while learning leads to an overall reduction in movement variability, only initial variability in specific task-relevant dimensions can facilitate faster learning.
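
    As an illustration of the two quantities discussed (multiple learning rates and the variability-learning link), the sketch below fits a double-exponential learning curve to simulated per-trial errors and correlates each simulated subject's initial variability with the fitted fast rate; it is not the study's analysis pipeline:

```python
# Illustrative sketch on simulated subjects (not the study's pipeline): fit a double-exponential
# learning curve (the "multiple learning rates" signature) to per-trial error, then correlate
# each subject's initial variability with the fitted fast learning rate.
import numpy as np
from scipy.optimize import curve_fit
from scipy.stats import pearsonr

def double_exp(trial, a_fast, r_fast, a_slow, r_slow):
    """Error decaying with one fast and one slow time constant."""
    return a_fast * np.exp(-r_fast * trial) + a_slow * np.exp(-r_slow * trial)

rng = np.random.default_rng(2)
trials = np.arange(100)
fast_rates, initial_var = [], []
for _ in range(20):                                             # 20 simulated subjects
    true_fast = rng.uniform(0.1, 0.5)
    errors = double_exp(trials, 3.0, true_fast, 1.0, 0.01) + rng.normal(0, 0.3, trials.size)
    params, _ = curve_fit(double_exp, trials, errors, p0=[2, 0.2, 1, 0.02], maxfev=10000)
    fast_rates.append(params[1])                                # fitted fast learning rate
    initial_var.append(errors[:10].std())                       # variability over early trials

r, p = pearsonr(initial_var, fast_rates)
print(f"initial variability vs. fast learning rate: r={r:.2f}, p={p:.3f}")
```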

    The measurement, evolution, and neural representation of action grammars of human behavior

    Human behaviors from toolmaking to language are thought to rely on a uniquely evolved capacity for hierarchical action sequencing. Testing this idea will require objective, generalizable methods for measuring the structural complexity of real-world behavior. Here we present a data-driven approach for extracting action grammars from basic ethograms, exemplified with respect to the evolutionarily relevant behavior of stone toolmaking. We analyzed sequences from the experimental replication of ~ 2.5 Mya Oldowan vs. ~ 0.5 Mya Acheulean tools, finding that, while using the same “alphabet” of elementary actions, Acheulean sequences are quantifiably more complex and Oldowan grammars are a subset of Acheulean grammars. We illustrate the utility of our complexity measures by re-analyzing data from an fMRI study of stone toolmaking to identify brain responses to structural complexity. Beyond specific implications regarding the co-evolution of language and technology, this exercise illustrates the general applicability of our method to investigate naturalistic human behavior and cognition
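
    The paper's grammar-extraction method is not reproduced here; as a rough, generic proxy for "quantifiably more complex" action sequences, the sketch below scores toy ethograms over a shared action alphabet by how well their repeated sub-sequences compress:

```python
# Rough, generic complexity proxy (NOT the paper's action-grammar method): score action
# sequences over a shared alphabet by how well their repeated sub-sequences compress.
import random
import zlib

def complexity(ethogram):
    """Compressed length per symbol; higher means less repetitive structure to exploit."""
    raw = ethogram.encode()
    return len(zlib.compress(raw, 9)) / len(raw)

random.seed(0)
alphabet = "ABCD"                                   # placeholder elementary actions
simple_routine = "ABABABC" * 20                     # short, highly repetitive routine
complex_routine = "".join(random.choice(alphabet) for _ in range(140))  # less predictable ordering

print("repetitive sequence :", round(complexity(simple_routine), 2))
print("less repetitive one :", round(complexity(complex_routine), 2))
```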

    A wearable motion capture suit and machine learning predict disease progression in Friedreich's ataxia.

    Friedreich's ataxia (FA) is caused by a variant of the Frataxin (FXN) gene, leading to its downregulation and progressively impaired cardiac and neurological function. Current gold-standard clinical scales use simplistic behavioral assessments, which require 18- to 24-month-long trials to determine if therapies are beneficial. Here we captured full-body movement kinematics from patients with wearable sensors, enabling us to define digital behavioral features based on the data from nine FA patients (six females and three males) and nine age- and sex-matched controls, who performed the 8-m walk (8-MW) test and 9-hole peg test (9 HPT). We used machine learning to combine these features to longitudinally predict the clinical scores of the FA patients, and compared these with two standard clinical assessments, Spinocerebellar Ataxia Functional Index (SCAFI) and Scale for the Assessment and Rating of Ataxia (SARA). The digital behavioral features enabled longitudinal predictions of personal SARA and SCAFI scores 9 months into the future and were 1.7 and 4 times more precise than longitudinal predictions using only SARA and SCAFI scores, respectively. Unlike the two clinical scales, the digital behavioral features accurately predicted FXN gene expression levels for each FA patient in a cross-sectional manner. Our work demonstrates how data-derived wearable biomarkers can track personal disease trajectories and indicates the potential of such biomarkers for substantially reducing the duration or size of clinical trials testing disease-modifying therapies and for enabling behavioral transcriptomics
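
    A hedged sketch of the prediction setup described above: baseline wearable-derived behavioural features are regressed against each patient's future SARA score with leave-one-subject-out validation. The toy data, feature count, and ridge model are assumptions, not the study's pipeline.

```python
# Hedged sketch of the prediction setup (toy data; ridge regression and the feature set are
# assumptions, not the study's pipeline): regress each patient's future SARA score on baseline
# wearable-derived behavioural features with leave-one-subject-out validation.
import numpy as np
from sklearn.linear_model import Ridge
from sklearn.model_selection import LeaveOneOut, cross_val_predict

rng = np.random.default_rng(3)
n_patients, n_features = 9, 30                       # 9 FA patients, 30 digital features
X_baseline = rng.standard_normal((n_patients, n_features))                     # toy features
sara_future = X_baseline[:, :3].sum(axis=1) + rng.normal(0, 0.5, n_patients)   # toy 9-month SARA

predicted = cross_val_predict(Ridge(alpha=1.0), X_baseline, sara_future, cv=LeaveOneOut())
mae = np.abs(predicted - sara_future).mean()
print(f"leave-one-subject-out MAE on future SARA score: {mae:.2f}")
```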

    Decoding of human hand actions to handle missing limbs in neuroprosthetics

    The only way we can interact with the world is through movements, and our primary interactions are via the hands, thus any loss of hand function has an immediate impact on our quality of life. However, to date it has not been systematically assessed how coordination in the hand's joints affects everyday actions. This is important for two fundamental reasons: first, to understand the representations and computations underlying motor control in "in-the-wild" situations, and second, to develop smarter controllers for prosthetic hands that have the same functionality as natural limbs. In this work we exploit the correlation structure of our hand and finger movements in daily life. The novelty of our idea is that instead of averaging variability out, we take the view that the structure of variability may contain valuable information about the task being performed. We asked seven subjects to interact in 17 daily-life situations, and quantified behavior in a principled manner using CyberGlove body sensor networks that, after accurate calibration, track all major joints of the hand. Our key findings are: (1) We confirmed that hand control in daily-life tasks is very low-dimensional, with four to five dimensions being sufficient to explain 80–90% of the variability in the natural movement data. (2) We established a universally applicable measure of manipulative complexity that allowed us to measure and compare limb movements across tasks. We used Bayesian latent variable models to model the low-dimensional structure of finger joint angles in natural actions. (3) This allowed us to build a naïve classifier that, within the first 1000 ms of action initiation (from a flat-hand start configuration), predicted which of the 17 actions was going to be executed, enabling us to reliably predict the action intention from very short-time-scale initial data and further revealing the predictable nature of hand movements for neuroprosthetic control and teleoperation purposes. (4) Using the Expectation-Maximization algorithm on our latent variable model permitted us to reconstruct with high accuracy (<5–6° MAE) the movement trajectory of missing fingers by simply tracking the remaining fingers. Overall, our results support the hypothesis that specific hand actions are orchestrated by the brain in such a way that, in the natural tasks of daily life, there is sufficient redundancy and predictability to be directly exploited for neuroprosthetics.
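
    The study's Bayesian latent variable models and EM procedure are not reproduced here; the simplified stand-in below uses plain PCA and linear regression on toy joint-angle data to illustrate the same two ideas: a handful of components captures most of the variance, and missing-finger joints can be reconstructed from the joints still being tracked.

```python
# Simplified stand-in (the paper uses Bayesian latent variable models and EM; here plain PCA
# and linear regression on toy data illustrate the same two ideas): a few components explain
# most joint-angle variance, and "missing finger" joints can be reconstructed from the rest.
import numpy as np
from sklearn.decomposition import PCA
from sklearn.linear_model import LinearRegression

rng = np.random.default_rng(4)
# Toy glove data: 2000 samples x 20 joint angles generated from a 4-D latent hand state,
# so low-dimensional structure is built in by construction.
latent = rng.standard_normal((2000, 4))
angles = latent @ rng.standard_normal((4, 20)) + 0.1 * rng.standard_normal((2000, 20))

pca = PCA().fit(angles)
print("variance explained by the first 5 PCs:", pca.explained_variance_ratio_[:5].sum().round(3))

# Reconstruct 4 "missing finger" joints from the 16 joints that are still tracked.
missing, remaining = angles[:, :4], angles[:, 4:]
model = LinearRegression().fit(remaining[:1500], missing[:1500])
mae = np.abs(model.predict(remaining[1500:]) - missing[1500:]).mean()
print("held-out reconstruction MAE (toy units):", mae.round(3))
```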