20 research outputs found

    Fixed-gaze head movement detection for triggering commands

    Get PDF
    In the field of human-computer interaction, mobile eye-tracking devices (worn like a pair of glasses) can be used to interact with objects remotely and on the move, while keeping both hands free for the main activity. This makes it possible to interact with objects beyond our reach, and it can sometimes be faster than traditional interaction. On the other hand, since the primary task of the eyes is to observe the environment, it is difficult to distinguish simple observation of an object in the scene from the intent to interact with it (the Midas touch problem). Solutions proposed in the literature include voluntary eye movements, smooth pursuit, and coupling eye-tracking with a secondary device. In this context, our study focuses on analyzing voluntary head movements performed while the user's gaze stays fixed on the object of interest, in order to trigger various commands. To evaluate the suitability of this approach in realistic situations and to measure its performance, we tested the detection of six different fixed-gaze head movements with 40 participants: head shaking (right, left), nodding (up, down), and tilting (right, left). During this test, we asked participants to quickly learn these six movements and then trigger various commands with them. The overall success rate is 70%, although it varies across individuals and gestures. Since these movements are rarely produced while merely observing an object, the Midas touch problem can be avoided while keeping both hands free.
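    The abstract above does not publish the detection algorithm itself, but the core idea (classifying a deliberate head gesture while the gaze point stays fixed) can be sketched roughly as follows. This is a hypothetical illustration: the function name, the (dx, dy, droll) displacement format, and the motion threshold are all assumptions, not the authors' method.

    ```python
    def classify_fixed_gaze_gesture(displacements, min_motion=5.0):
        """Classify a deliberate head gesture from per-frame scene-camera
        displacement estimates (dx, dy, droll), e.g. from frame-to-frame
        image registration: while the gaze stays fixed on one object, the
        scene image shifts opposite to the head motion.
        Returns 'shake' (horizontal), 'nod' (vertical), 'tilt' (roll),
        or None if the total motion is too small to be intentional."""
        totals = [0.0, 0.0, 0.0]
        for d in displacements:
            for i in range(3):
                totals[i] += abs(d[i])
        if max(totals) < min_motion:   # ignore involuntary micro-movements
            return None
        return ('shake', 'nod', 'tilt')[totals.index(max(totals))]

    # Mostly-horizontal scene motion -> interpreted as a head shake
    frames = [(4.0, 0.3, 0.1), (-3.5, 0.2, 0.0), (3.8, -0.1, 0.1)]
    print(classify_fixed_gaze_gesture(frames))  # 'shake'
    ```

    A real implementation would also distinguish direction (left vs. right, up vs. down) to obtain the six gestures tested in the study; the sketch collapses each pair into one axis for brevity.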

    Robust gaze tracking for advanced mobile interaction

    No full text
    Eye-tracking has very strong potential as an input modality in human-computer interaction (HCI), particularly in mobile situations. In this thesis, we demonstrate this potential by highlighting scenarios in which eye-tracking has clear advantages over all other interaction modalities. During our research, we found that this technology lacks convenient command-triggering methods, which limits the use of such devices. In this context, we investigate the combination of eye-tracking and fixed-gaze head movements, which allows various commands to be triggered without using the hands or changing gaze direction. We propose a new algorithm for fixed-gaze head movement detection that uses only the images captured by the scene camera mounted on head-worn eye-trackers, in order to reduce computation time. To test the performance of our detection algorithm, and user acceptance of triggering commands with these movements while both hands are occupied by another activity, we carried out systematic experiments with the EyeMusic application that we designed and developed. EyeMusic is a music-learning system that can play the notes of a measure in a score that the user does not understand. By making a voluntary head movement while fixing their gaze on a particular measure of the score, the user obtains audio feedback. The design, development, and usability testing of the first prototype of this application are presented in this thesis. 
    The usability of EyeMusic is confirmed by the experimental results: 85% of participants were able to use all the fixed-gaze head movements implemented in the prototype. The application's average success rate is 70%, which is partly influenced by the intrinsic performance of the eye-tracker we use. Our fixed-gaze head movement detection algorithm achieves 85% accuracy, with no significant difference in performance between the head movements tested. We also explored two application scenarios based on the same command principles, EyeRecipe and EyePay, which are detailed in this thesis as well.

    Utilisation de l'eye-tracking pour l'interaction mobile dans un environnement réel augmenté

    Get PDF
    Eye-tracking has very strong potential as an input modality in human-computer interaction (HCI), particularly in mobile situations. In this thesis, we demonstrate this potential by highlighting scenarios in which eye-tracking has clear advantages over all other interaction modalities. During our research, we found that this technology lacks convenient command-triggering methods, which limits the use of such devices. In this context, we investigate the combination of eye-tracking and fixed-gaze head movements, which allows various commands to be triggered without using the hands or changing gaze direction. We propose a new algorithm for fixed-gaze head movement detection that uses only the images captured by the scene camera mounted on head-worn eye-trackers, in order to reduce computation time. To test the performance of our detection algorithm, and user acceptance of triggering commands with these movements while both hands are occupied by another activity, we carried out systematic experiments with the EyeMusic application that we designed and developed. EyeMusic is a music-learning system that can play the notes of a measure in a score that the user does not understand. By making a voluntary head movement while fixing their gaze on a particular measure of the score, the user obtains audio feedback. The design, development, and usability testing of the first prototype of this application are presented in this thesis. 
    The usability of EyeMusic is confirmed by the experimental results: 85% of participants were able to use all the fixed-gaze head movements implemented in the prototype. The application's average success rate is 70%, which is partly influenced by the intrinsic performance of the eye-tracker we use. Our fixed-gaze head movement detection algorithm achieves 85% accuracy, with no significant difference in performance between the head movements tested. We also explored two application scenarios based on the same command principles, EyeRecipe and EyePay, which are detailed in this thesis as well.

    Assisted Music Score Reading Using Fixed-Gaze Head Movement: Empirical Experiment and Design Implications

    No full text
    Eye-tracking has very strong potential as an input modality in human-computer interaction (HCI), particularly in mobile situations. However, it lacks convenient action-triggering methods. In our research, we investigate the combination of eye-tracking and fixed-gaze head movement, which allows various commands to be triggered without using the hands or changing gaze direction. We propose a new algorithm for fixed-gaze head movement detection that uses only the images captured by the scene camera mounted in front of the head-mounted eye-tracker, in order to save computation time. To test the performance of our fixed-gaze head movement detection algorithm, and the acceptance of triggering commands with these movements while the user's hands are occupied by another task, we designed and developed an experimental application known as EyeMusic. EyeMusic is a music reading system that can play the notes of a measure in a music score that the user does not understand. By making a voluntary head movement while fixing their gaze on a particular point of the score, the user obtains the desired audio feedback. The design, development, and usability testing of the first prototype of this application are presented in this paper. The usability of the application is confirmed by the experimental results, as 85% of participants were able to use all the head movements implemented in the prototype. The application's average success rate is 70%, which is partly influenced by the performance of the eye-tracker we use. Our fixed-gaze head movement detection algorithm achieves 85% accuracy, with no significant differences between the performance of each head movement.
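    A small piece of the EyeMusic interaction (deciding which measure of the score the gaze is on before a head gesture triggers playback) can be sketched as a simple hit-test. This is a minimal illustration under assumed conventions: the function name and the (x, y, w, h) bounding-box format are hypothetical, since the prototype's internal layout is not published in the abstract.

    ```python
    def measure_under_gaze(gaze_xy, measure_boxes):
        """Return the index of the score measure whose bounding box
        contains the gaze point, or None if the gaze is off the score.
        measure_boxes is a list of (x, y, w, h) rectangles in scene-image
        coordinates (an assumed format)."""
        gx, gy = gaze_xy
        for i, (x, y, w, h) in enumerate(measure_boxes):
            if x <= gx < x + w and y <= gy < y + h:
                return i
        return None

    # Two measures side by side on one staff line
    boxes = [(0, 0, 100, 40), (100, 0, 100, 40)]
    print(measure_under_gaze((130, 20), boxes))  # 1
    ```

    In a full system, the returned index would be combined with a detected head gesture to decide which measure to play back.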

    Utilisation de l'Eye-tracking pour l'Interaction Mobile dans un Environnement Réel Augmenté

    No full text
    In our thesis, we study the potential of eye-tracking for mobile interaction and for interaction with physical objects in augmented real environments (in the sense of pervasive computing). Eye-tracking has very strong potential as an input modality in HCI, particularly in mobile situations, and it can facilitate the learning process. In order to test the capabilities of gaze-based interaction when the user's hands are occupied by another task, we designed an interactive music-learning application using eye-tracking. The design of the first prototype of this application is also presented in this article.

    The Circular RNA Profiles of Colorectal Tumor Metastatic Cells

    No full text
    Circular RNAs (circRNAs) have been reported as potential biomarkers for colorectal cancer (CRC) and other types of tumors. However, few studies have investigated the potential role of circRNAs in tumor metastasis. Here, we examined the circRNAs in two CRC cell lines (the primary tumor cell line SW480 and its metastatic derivative SW620) and found a large set of circRNAs (2,919 ncDECs) with significantly differential expression patterns relative to normal cells (NCM460). In addition, we uncovered a set of 623 pmDECs that differ between the primary CRC cells and the metastatic cells. Both differentially expressed circRNA (DEC) sets contain many previously unknown putative CRC-related circRNAs, thereby providing many new circRNAs as candidate biomarkers for CRC development and metastasis. These studies are the first large-scale identification of metastasis-related circRNAs for CRC and provide valuable candidate biomarkers for diagnosis and a starting point for further investigation of CRC metastasis.
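    The core screening step described above (flagging circRNAs whose expression differs between two samples, e.g. primary SW480 versus metastatic SW620) can be sketched as a simple fold-change filter. This is an illustrative sketch only: the function name, the log2 fold-change threshold, and the pseudocount are assumptions, not the statistical pipeline used in the study.

    ```python
    import math

    def differentially_expressed(expr_a, expr_b, min_abs_log2fc=1.0, pseudo=1.0):
        """Return {circRNA id: log2 fold change} for circRNAs whose
        expression differs between sample A and sample B by at least
        |log2 FC| >= min_abs_log2fc. A pseudocount avoids division by
        zero for circRNAs undetected in one sample."""
        decs = {}
        for circ, a in expr_a.items():
            b = expr_b.get(circ, 0.0)
            lfc = math.log2((b + pseudo) / (a + pseudo))
            if abs(lfc) >= min_abs_log2fc:
                decs[circ] = lfc
        return decs

    # Toy expression values (hypothetical circRNA ids)
    primary    = {"circA": 10.0, "circB": 8.0, "circC": 0.0}
    metastatic = {"circA": 45.0, "circB": 9.0, "circC": 6.0}
    print(sorted(differentially_expressed(primary, metastatic)))  # ['circA', 'circC']
    ```

    A real differential-expression analysis would additionally normalize read counts and control for statistical significance (e.g. adjusted p-values), not just apply a fold-change cutoff.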

    Table2.XLSX

    No full text

    Table4.xlsx

    No full text

    Table7.XLSX

    No full text