3 research outputs found

    Ivy: a software bus for the development of interactive system prototypes

    This document presents the experience gained during the development and use of the Ivy software bus, in the context of prototyping interactive systems for air traffic control. After describing the system's operating principle, we show how this tool has influenced our approach to specific HCI issues such as multimodality, distributed interaction, and mobility. The emphasis is on the services this bus provides for the rapid development of "lightweight" interactive systems that can easily be integrated into a demonstration bench and that build on the logic of scripting languages. By presenting this tool, which we have now been using for five years, we hope to share experience useful for the design of future interactive system architectures for exploratory research purposes.
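    The delivery principle the abstract describes — agents bind regular expressions to callbacks, and any text message matching a binding triggers the callback with the captured groups — can be sketched as a toy in-process bus. This is a minimal illustration of the idea, not the actual Ivy API; the class and method names below are hypothetical.

    ```python
    import re

    class MiniBus:
        """Toy in-process analogue of a text-message software bus:
        agents bind regexps to callbacks, and a message matching a
        binding triggers the callback with the captured groups."""

        def __init__(self):
            self.bindings = []  # list of (compiled regexp, callback) pairs

        def bind(self, pattern, callback):
            self.bindings.append((re.compile(pattern), callback))

        def send(self, message):
            # Deliver the message to every binding whose regexp matches.
            for regexp, callback in self.bindings:
                match = regexp.match(message)
                if match:
                    callback(*match.groups())

    # Hypothetical usage: a subscriber extracts fields from a radar message.
    bus = MiniBus()
    received = []
    bus.bind(r"^RADAR Plane=(\S+) Alt=(\d+)$",
             lambda callsign, alt: received.append((callsign, int(alt))))
    bus.send("RADAR Plane=AF123 Alt=9500")
    # received is now [("AF123", 9500)]
    ```

    Because subscription is by text pattern rather than by typed interface, loosely coupled prototypes written in different scripting languages can join and leave the bus without recompiling the other agents, which is the property the abstract emphasizes.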

    Combined Surface Electromyography and Motion Capture for Quantitative Analysis of Facial Movements

    Research Objectives: The aim of this work was to test the feasibility of integrating EMG into our motion capture protocol for quantitative analysis of facial movements, in order to measure simultaneously the action potentials by sEMG and the marker displacement amplitudes for a given movement.
    Design: A prospective study for quantitative analysis of facial movements by the simultaneous use of surface electromyography (sEMG) was conducted.
    Setting: This study was approved by the local independent ethics committees (CPP Nord Ouest II, Amiens, France; references: ID-RCB 2011-A00532-39; CPP 2011-23) and registered at clinicaltrials.gov (NCT02002572).
    Participants: This feasibility trial was performed on two healthy male volunteers with no history of facial pathology.
    Interventions: We used a motion capture system consisting of 10 Vantage optoelectronic cameras (Vicon Ltd, Oxford, UK). In addition, we used wireless bipolar PicoEMG sensors (Cometa Systems, Milan, Italy), placed on the subject's face at the motor points of the frontalis and zygomaticus major muscles.
    Main Outcome Measures: Kinematic and EMG data were exported as a .csv file that combined EMG signals and marker displacement amplitudes over time. A custom algorithm was developed for processing and analyzing both signals.
    Results: We obtained noiseless, rectified, and filtered EMG signals, which allowed a readable visual graphical analysis. Comparison of EMG signals between the sensors on each side of the face showed similar signal patterns between right and left muscles, for each acquisition and subject. Regarding motion capture data, the marker displacement amplitudes were compared between the two sides of the face according to muscular zone, and the results were symmetric between the muscles on each side of the face.
    Conclusions: We showed the feasibility of using motion capture and electromyography for quantitative analysis of facial movements in a single acquisition. We obtained facial expression indicators that can be used for a simultaneous multimodal analysis.
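    The abstract mentions rectified and filtered EMG signals but does not detail the custom algorithm. A generic sketch of that kind of preprocessing — full-wave rectification followed by a moving-average envelope — is shown below; this is an illustrative assumption, not the authors' actual pipeline, and `emg_envelope` is a hypothetical name.

    ```python
    def emg_envelope(samples, window=5):
        """Full-wave rectify a raw EMG trace, then smooth it with a
        centered moving average to obtain a readable envelope."""
        rectified = [abs(s) for s in samples]  # full-wave rectification
        half = window // 2
        envelope = []
        for i in range(len(rectified)):
            lo = max(0, i - half)
            hi = min(len(rectified), i + half + 1)
            # Average over the (possibly truncated) window around sample i.
            envelope.append(sum(rectified[lo:hi]) / (hi - lo))
        return envelope

    # A short synthetic burst with negative and positive excursions.
    raw = [0.0, -0.4, 0.8, -0.9, 0.7, -0.2, 0.1]
    env = emg_envelope(raw, window=3)
    ```

    In practice EMG envelopes are often computed with a band-pass filter plus a low-pass filter on the rectified signal; the moving average here stands in for that low-pass stage to keep the sketch dependency-free.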

    Erratum to: Guidelines for the use and interpretation of assays for monitoring autophagy (3rd edition) (Autophagy, 12, 1, 1-222, 10.1080/15548627.2015.1100356)
