
    Arianna: towards a new paradigm for assistive technology at home

    Enabling the elderly and people with special needs to retain their independence as long as possible is one of the biggest challenges of the society of tomorrow. Teseo, a startup company spun off from the University of Genoa, aims at accelerating the transition towards a sustainable healthcare system. Teseo's first concept and product, Arianna, allows for the automated recognition of activities of daily living at home and acts as a personalized wellbeing and healthcare assistant. This abstract outlines the main concepts underlying its features and capabilities. Comment: Paper accepted at the Eighth Italian Forum on Ambient Assisted Living (ForItAAL 2017).

    Detection of bimanual gestures everywhere: why it matters, what we need and what is missing

    Bimanual gestures are of the utmost importance for the study of motor coordination in humans and in everyday activities. A reliable detection of bimanual gestures in unconstrained environments is fundamental for their clinical study and for assessing common activities of daily living. This paper investigates techniques for a reliable, unconstrained detection and classification of bimanual gestures. It assumes the availability of inertial data originating from the two hands/arms, builds upon a previously developed technique for gesture modelling based on Gaussian Mixture Modelling (GMM) and Gaussian Mixture Regression (GMR), and compares different modelling and classification techniques, which are based on a number of assumptions inspired by the literature on how bimanual gestures are represented and modelled in the brain. Experiments show results for 5 everyday bimanual activities, which have been selected on the basis of three main parameters: whether the two hands are constrained by a physical tool, whether a specific sequence of single-hand gestures is required, and whether the activity is recursive. In the best-performing combination of modelling approach and classification technique, all five activities are recognized with an accuracy of 97%, a precision of 82% and a recall of 100%. Comment: Submitted to Robotics and Autonomous Systems (Elsevier).
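    The classification scheme the abstract describes (per-class generative models over inertial features, with recognition by likelihood comparison) can be sketched roughly as follows. This is an illustrative toy, not the paper's implementation: the gesture names, feature dimensionality, and synthetic data are all invented for the example, and scikit-learn's GaussianMixture stands in for the authors' GMM/GMR pipeline.

    ```python
    # Toy sketch: fit one Gaussian Mixture Model per gesture class on
    # (synthetic) inertial feature vectors, then classify a new sample by
    # the class whose GMM assigns it the highest log-likelihood.
    import numpy as np
    from sklearn.mixture import GaussianMixture

    rng = np.random.default_rng(0)

    def make_samples(center, n=200, dim=6):
        """Synthetic 6-D inertial features (e.g. 3-axis accel per hand)."""
        return center + rng.normal(scale=0.3, size=(n, dim))

    # Two hypothetical bimanual gestures with distinct feature distributions.
    train = {
        "open_jar": make_samples(np.zeros(6)),
        "sweeping": make_samples(np.ones(6)),
    }

    # One GMM per class, fitted on that class's training features only.
    models = {name: GaussianMixture(n_components=2, random_state=0).fit(X)
              for name, X in train.items()}

    def classify(x):
        """Return the class whose GMM gives the highest log-likelihood."""
        scores = {name: m.score_samples(x[None, :])[0]
                  for name, m in models.items()}
        return max(scores, key=scores.get)

    print(classify(np.zeros(6)))  # a point near the "open_jar" cluster
    ```

    In the paper, GMR additionally regresses the expected gesture trajectory over time from the fitted mixtures; the likelihood-comparison step above only illustrates the classification side.
    
    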

    Flexible human-robot cooperation models for assisted shop-floor tasks

    The Industry 4.0 paradigm emphasizes the crucial benefits that collaborative robots, i.e., robots able to work alongside and together with humans, could bring to the whole production process. In this context, an enabling technology not yet achieved is the design of flexible robots able to deal at all levels with humans' intrinsic variability, which is not only necessary for a comfortable working experience for the person but also a precious capability for efficiently dealing with unexpected events. In this paper, a sensing, representation, planning and control architecture for flexible human-robot cooperation, referred to as FlexHRC, is proposed. FlexHRC relies on wearable sensors for human action recognition, AND/OR graphs for the representation of and reasoning upon cooperation models, and a Task Priority framework to decouple action planning from robot motion planning and control. Comment: Submitted to Mechatronics (Elsevier).
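    An AND/OR graph of the kind the abstract mentions can be sketched as follows. This is a minimal illustration, not FlexHRC's actual data structure: the task and action names are hypothetical. An AND node is complete when all of its children are; an OR node when any child is; leaves are atomic actions marked done by the action-recognition layer.

    ```python
    # Minimal AND/OR graph for a cooperation model (hypothetical task).
    class Node:
        def __init__(self, name, kind="leaf", children=()):
            self.name, self.kind, self.children = name, kind, list(children)
            self.done = False  # set on leaves as actions are recognized

        def complete(self):
            if self.kind == "leaf":
                return self.done
            if self.kind == "and":
                return all(c.complete() for c in self.children)
            return any(c.complete() for c in self.children)  # "or" node

    # Hypothetical assembly: fetch the part, then fasten it by either
    # screwing or gluing (the OR branch captures human variability in
    # how the task may be carried out).
    fetch = Node("fetch_part")
    screw = Node("screw")
    glue = Node("glue")
    fasten = Node("fasten", "or", [screw, glue])
    task = Node("assemble", "and", [fetch, fasten])

    fetch.done = True
    print(task.complete())  # False: neither fastening option is done yet
    screw.done = True
    print(task.complete())  # True: AND satisfied via the "screw" branch
    ```

    Reasoning over such a graph lets the planner accept whichever feasible branch the human actually takes, which is the flexibility the paper argues for.
    
    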

    Play is Child's Work: Play Pods Invade North Philadelphia to Put Children to Work
