
    Assessing Visual Attention Using Eye Tracking Sensors in Intelligent Cognitive Therapies Based on Serious Games

    This study examines the use of eye tracking sensors as a means to identify children's behavior in attention-enhancement therapies. For this purpose, data collected from 32 children with different attention skills are analyzed during their interaction with a set of puzzle games. The authors hypothesize that participants with better performance have quantifiably different eye-movement patterns from users with poorer results. The use of eye trackers outside the research community may help extend their potential to available intelligent therapies, bringing state-of-the-art technologies to users. Gaze data constitute a new information source in intelligent therapies that may help to build new approaches fully customized to end users' needs, for example by implementing machine learning algorithms for classification. An initial study of the dataset yielded a classification accuracy of 0.88 (±0.11) with a random forest classifier, using cross-validation and hierarchical tree-based feature selection. Further approaches need to be examined in order to establish more detailed attention behaviors and patterns among children with and without attention problems.
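    The kind of eye-movement features such a classifier might consume can be sketched as follows. This is an illustrative sketch only: the dispersion-threshold fixation detection (I-DT style) and the specific features and thresholds below are assumptions, not the study's actual pipeline.

```python
# Illustrative I-DT-style fixation detection and simple gaze features.
# Thresholds and feature set are hypothetical, not taken from the study.
import math

def detect_fixations(samples, dispersion_thresh=30.0, min_duration=0.1):
    """Group (x, y, t) gaze samples into fixations.

    A window of samples counts as a fixation when its spatial dispersion
    (x-range + y-range) stays under `dispersion_thresh` for at least
    `min_duration` seconds."""
    fixations, start = [], 0
    while start < len(samples):
        end = start
        while end + 1 < len(samples):
            window = samples[start:end + 2]
            xs = [p[0] for p in window]
            ys = [p[1] for p in window]
            if (max(xs) - min(xs)) + (max(ys) - min(ys)) > dispersion_thresh:
                break
            end += 1
        duration = samples[end][2] - samples[start][2]
        if duration >= min_duration and end > start:
            window = samples[start:end + 1]
            cx = sum(p[0] for p in window) / len(window)
            cy = sum(p[1] for p in window) / len(window)
            fixations.append((cx, cy, duration))
            start = end + 1
        else:
            start += 1
    return fixations

def gaze_features(samples):
    """Summarize a gaze trace as features a classifier could consume."""
    fix = detect_fixations(samples)
    n = len(fix)
    mean_dur = sum(f[2] for f in fix) / n if n else 0.0
    # Saccade amplitude: distance between consecutive fixation centroids.
    amps = [math.dist(fix[i][:2], fix[i + 1][:2]) for i in range(n - 1)]
    mean_amp = sum(amps) / len(amps) if amps else 0.0
    return {"n_fixations": n, "mean_fix_dur": mean_dur,
            "mean_saccade_amp": mean_amp}
```

    Feature vectors of this shape, one per child per game session, would then be fed to the random forest with cross-validation.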

    Data-Driven Human Intention Analysis: Supported by Virtual Reality and Eye Tracking

    The ability to determine an upcoming action, or what decision a human is about to take, can be useful in multiple areas, for example in manufacturing, where humans work alongside collaborative robots and knowing the operator's intent could provide the robot with important information to help it navigate more safely. Another field that could benefit from a system providing information about human intentions is psychological testing, where such a system could serve as a platform for new research or as one way to provide information in the diagnostic process. The work presented in this thesis investigates the potential use of virtual reality as a safe, customizable environment for collecting gaze and movement data; eye tracking as the non-invasive system input that gives insight into the human mind; and deep machine learning as the tool that analyzes the data. The thesis defines an experimental procedure that can be used to construct a virtual reality based testing system that gathers gaze and movement data, carries out a test study to gather data from human participants, and implements an artificial neural network in order to analyze human behaviour. This is followed by four studies that give evidence for the decisions made in the experimental procedure and show the potential uses of such a system.

    Gaze Based Human Intention Analysis

    The ability to determine an upcoming action, or what decision a human is about to take, can be useful in multiple areas, for example during human-robot collaboration in manufacturing, where knowing the intent of the operator could provide the robot with important information to help it navigate more safely. Another field that could benefit from a system providing information about human intentions is psychological testing, where such a system could serve as a platform for new research or as one way to provide information in the diagnostic process. The work presented in this thesis investigates the potential use of virtual reality as a safe, measurable, and customizable environment for collecting gaze and movement data; eye tracking as the non-invasive system input that gives insight into the human mind; and deep machine learning as one tool to analyze the data. The thesis defines an experimental procedure that can be used to construct a virtual reality based testing system that gathers gaze and movement data, carries out a test study to gather data from human participants, and implements artificial neural networks in order to analyze human behaviour. This is followed by two studies that give evidence for the decisions made in the experimental procedure and show the potential uses of such a system.
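    As a toy stand-in for the learning step described above, a minimal classifier over fixed-length gaze/movement feature vectors might look like the following. This is a sketch only: the thesis uses deep neural networks, whereas this uses a single logistic unit trained by gradient descent, and the feature layout is assumed.

```python
# Toy gradient-descent classifier over fixed-length feature vectors.
# A stand-in sketch: the thesis employs deep neural networks; this single
# logistic unit only illustrates the train-then-predict workflow.
import math
import random

def train_logistic(X, y, lr=0.5, epochs=200, seed=0):
    """Fit weights w and bias b by full-batch gradient descent on log-loss."""
    rng = random.Random(seed)
    d = len(X[0])
    w = [rng.uniform(-0.1, 0.1) for _ in range(d)]
    b = 0.0
    for _ in range(epochs):
        gw, gb = [0.0] * d, 0.0
        for xi, yi in zip(X, y):
            z = sum(wj * xj for wj, xj in zip(w, xi)) + b
            p = 1.0 / (1.0 + math.exp(-z))
            err = p - yi  # gradient of log-loss with respect to z
            for j in range(d):
                gw[j] += err * xi[j]
            gb += err
        n = len(X)
        w = [wj - lr * gj / n for wj, gj in zip(w, gw)]
        b -= lr * gb / n
    return w, b

def predict(w, b, x):
    """Class label from the sign of the linear score."""
    z = sum(wj * xj for wj, xj in zip(w, x)) + b
    return 1 if z >= 0.0 else 0
```

    In the thesis's setting the inputs would be windows of gaze and movement data recorded in VR, and the two classes would correspond to alternative upcoming actions or decisions.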

    Biometrics

    Biometrics comprises methods for the unique recognition of humans based upon one or more intrinsic physical or behavioral traits. In computer science in particular, biometrics is used as a form of identity and access management and access control. It is also used to identify individuals in groups that are under surveillance. The book consists of 13 chapters, each focusing on a certain aspect of the problem, divided into three sections: physical biometrics, behavioral biometrics, and medical biometrics. The key objective of the book is to provide a comprehensive reference and text on human authentication and people identity verification from physiological, behavioural, and other points of view. It aims to publish new insights into current innovations in computer systems and technology for biometrics development and its applications. The book was reviewed by the editor, Dr. Jucheng Yang, and many of the guest editors, such as Dr. Girija Chetty, Dr. Norman Poh, Dr. Loris Nanni, Dr. Jianjiang Feng, Dr. Dongsun Park, and Dr. Sook Yoon, who also made significant contributions to the book.

    Dynamic Contrast-Enhanced MR Microscopy: Functional Imaging in Preclinical Models of Cancer

    Dynamic contrast-enhanced (DCE) MRI has been widely used as a quantitative imaging method for monitoring tumor response to therapy. The pharmacokinetic parameters derived from this technique have been used in more than 100 phase I trials and investigator-led studies. The simultaneous challenges of increasing temporal and spatial resolution, in a setting where the signal from the much smaller voxel is weaker, have made this MR technique difficult to implement in small-animal imaging. Existing preclinical DCE-MRI protocols acquire a limited number of slices, potentially losing information in the third dimension. Furthermore, drug efficacy studies measuring the effect of an anti-angiogenic treatment often compare the derived biomarkers on manually selected tumor regions or over the entire volume. These measurements include domains where the interpretation of the biomarkers may be unclear (such as in necrotic areas).

    This dissertation describes and compares a family of four-dimensional (3D spatial + time), projection-acquisition, keyhole-sampling strategies that support high spatial and temporal resolution. An interleaved 3D radial trajectory with a quasi-uniform distribution of points in k-space was used for sampling temporally resolved datasets. These volumes were reconstructed with three different k-space filters encompassing a range of possible keyhole strategies. The effect of k-space filtering on spatial and temporal resolution was studied in phantoms and in vivo. The statistical variation of the DCE-MRI measurement is analyzed by considering the fundamental sources of error in the MR signal intensity acquired with the spoiled gradient-echo (SPGR) pulse sequence. Finally, the technique was applied to measuring the extent of the opening of the blood-brain barrier in a mouse model of pediatric glioma and to identifying regions of therapeutic effect in a model of colorectal adenocarcinoma.

    It is shown that 4D radial keyhole imaging preserves the system's spatial and temporal resolution at the cost of a 20-40% decrease in SNR. The time-dependent concentration of the contrast agent measured in vivo is within the theoretically predicted limits. The uncertainty in measuring the pharmacokinetic parameters with these sequences is of the same order as, but always higher than, the uncertainty in measuring the pre-injection longitudinal relaxation time. The histogram of the time-to-peak (TTP) provides useful knowledge about the spatial distribution of K^trans and microvascular density. Two regions with distinct kinetic parameters were identified when the TTP map from DCE-MRM was thresholded at 1000 sec. The effect of bevacizumab, as measured by a decrease in K^trans, was confined to one of these regions. DCE-MRI studies may contribute unique insights into the response of the tumor microenvironment to therapy.
    Dissertation
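    The standard steady-state SPGR signal model underlying this kind of error analysis can be sketched numerically. The equations (SPGR signal, Ernst angle, and the linear-relaxivity link between contrast-agent concentration and T1) are textbook relations; the parameter values are illustrative, not taken from the dissertation.

```python
# SPGR (spoiled gradient-echo) steady-state signal model used in DCE-MRI.
# Parameter values in examples are illustrative only.
import math

def spgr_signal(t1, tr, flip_deg, m0=1.0):
    """Steady-state SPGR magnitude:
    S = M0 * sin(a) * (1 - E1) / (1 - cos(a) * E1), E1 = exp(-TR/T1)."""
    a = math.radians(flip_deg)
    e1 = math.exp(-tr / t1)
    return m0 * math.sin(a) * (1.0 - e1) / (1.0 - math.cos(a) * e1)

def ernst_angle_deg(t1, tr):
    """Flip angle maximizing the SPGR signal: a_E = arccos(exp(-TR/T1))."""
    return math.degrees(math.acos(math.exp(-tr / t1)))

def t1_after_contrast(t1_0, r1, conc):
    """Linear relaxivity model: 1/T1 = 1/T1_0 + r1 * C.
    This T1 shortening is what makes the SPGR signal sensitive to the
    time-dependent contrast-agent concentration."""
    return 1.0 / (1.0 / t1_0 + r1 * conc)
```

    Propagating noise through these relations is what links uncertainty in the measured signal (and in the pre-injection T1) to uncertainty in the derived pharmacokinetic parameters.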

    Toward the real-time estimation of the attentional state through ocular activity analysis

    The analysis of aviation incidents and laboratory experiments has shown that attentional tunneling leads pilots to neglect critical alarms. One interesting avenue for dealing with this issue is to consider adaptive systems that would assist the operator in real time (for instance, by switching the autopilot mode). Such adaptive systems require the operator's state as an input; therefore, both attentional tunneling metrics and operator-state inference techniques have to be proposed. The goal of this PhD thesis is to show that attentional tunneling can be detected in real time. To this end, an adaptive neuro-fuzzy inference method based on attentional tunneling metrics is proposed, together with new metrics that are independent of the operator's context and computable in real time. The Eye State Identification Algorithm (ESIA), which analyses ocular activity, is proposed for this purpose. Metrics derived from it are tested in a robotic experiment designed to favour attentional tunneling. We also propose a new definition of the information exploitation/exploration ratio, which is shown statistically to be a relevant marker of attentional tunneling. The work is then discussed and applied to various case studies in aviation and robotics.
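    An exploitation/exploration ratio of the kind mentioned above can be illustrated with a simple computation over labelled eye states. The state labels and the definition below (fixation time versus saccade/pursuit time) are a plausible reading for illustration, not necessarily the thesis's exact formulation.

```python
# Illustrative exploitation/exploration ratio over labelled eye states.
# The mapping (fixation = exploitation; saccade/pursuit = exploration) is
# an assumption; the thesis derives eye states with its ESIA algorithm
# and may define the ratio differently.

def exploit_explore_ratio(states):
    """states: list of (label, duration) pairs, with label in
    {"fixation", "saccade", "pursuit"}.

    Returns total exploitation time divided by total exploration time."""
    exploit = sum(d for lbl, d in states if lbl == "fixation")
    explore = sum(d for lbl, d in states if lbl in ("saccade", "pursuit"))
    if explore == 0:
        return float("inf")  # pure exploitation, e.g. complete tunneling
    return exploit / explore
```

    Computed over a sliding time window, a sharply rising ratio could serve as the kind of real-time, context-independent tunneling marker the thesis argues for.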

    Statistical and Computational Methods for Analyzing and Visualizing Large-Scale Genomic Datasets

    Advances in large-scale genomic data production have led to a need for better methods to process, interpret, and organize these data. Starting from raw sequencing data, generating results requires many complex processing steps, from quality control, alignment, and variant calling to genome-wide association studies (GWAS) and characterization of expression quantitative trait loci (eQTL). In this dissertation, I present methods to address issues faced when working with large-scale genomic datasets. In Chapter 2, I present an analysis of 4,787 whole genomes sequenced for the study of age-related macular degeneration (AMD) as a follow-up fine-mapping study to previous work from the International AMD Genomics Consortium (IAMDGC). Through whole genome sequencing, we comprehensively characterized genetic variants associated with AMD in known loci to provide additional insights into the variants potentially responsible for the disease, leveraging 60,706 additional controls. Our study improved the understanding of loci associated with AMD and demonstrated the advantages and disadvantages of different approaches for fine-mapping studies with sequence-based genotypes. In Chapter 3, I describe a novel method and a software tool to perform Hardy-Weinberg equilibrium (HWE) tests for structured populations. In sequence-based genetic studies, HWE test statistics are important quality metrics for distinguishing true genetic variants from artifactual ones, but they become much less informative when applied to a heterogeneous and/or structured population. As next-generation sequencing studies contain samples from increasingly diverse ancestries, we developed a new HWE test that addresses both the statistical and computational challenges of modern large-scale sequencing data and implemented the method in a publicly available software tool. Moreover, we extensively evaluated our proposed method against alternative HWE tests in both simulated and real datasets.
    Our method has been successfully applied to the latest variant calling QC pipeline in the TOPMed project. In Chapter 4, I describe PheGET, a web application to interactively visualize expression quantitative trait loci (eQTLs) across tissues, genes, and regions to aid functional interpretation of regulatory variants. Tissue-specific expression has become increasingly important for understanding the links between genetic variation and disease. To address this need, the Genotype-Tissue Expression (GTEx) project collected and analyzed a treasure trove of expression data. However, effectively navigating this wealth of data to find signals relevant to researchers has become a major challenge. I demonstrate the functionalities of PheGET using the newest GTEx data on our eQTL browser website at https://eqtl.pheweb.org/, allowing the user to 1) view all cis-eQTLs for a single variant; 2) view and compare single-tissue, single-gene associations within any genomic region; 3) find the best eQTL signal in any given genomic region or gene; and 4) customize the plotted data in real time. PheGET is designed to handle and display the kind of complex multidimensional data often seen in our post-GWAS era, such as multi-tissue expression data, in an intuitive and convenient interface, giving researchers an additional tool to better understand the links between genetics and disease.
    PhD, Biostatistics, University of Michigan, Horace H. Rackham School of Graduate Studies
    http://deepblue.lib.umich.edu/bitstream/2027.42/162918/1/amkwong_1.pd
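    The classical single-population HWE goodness-of-fit test that Chapter 3 generalizes can be sketched as follows. This is the textbook baseline only; it does not handle population structure, which is precisely the limitation the dissertation's method addresses.

```python
# Classical one-locus HWE chi-square test for a single homogeneous
# population: the baseline that breaks down under population structure.
import math

def hwe_chisq(n_aa, n_ab, n_bb):
    """Chi-square statistic (1 df) comparing observed genotype counts
    to the Hardy-Weinberg expectations n*p^2, 2n*p*q, n*q^2."""
    n = n_aa + n_ab + n_bb
    p = (2 * n_aa + n_ab) / (2 * n)  # frequency of the A allele
    q = 1.0 - p
    expected = (n * p * p, 2 * n * p * q, n * q * q)
    observed = (n_aa, n_ab, n_bb)
    return sum((o - e) ** 2 / e for o, e in zip(observed, expected) if e > 0)

def hwe_pvalue(chi2):
    """P-value for a 1-df chi-square statistic: P = erfc(sqrt(x / 2))."""
    return math.erfc(math.sqrt(chi2 / 2.0))
```

    In a structured sample, mixing subpopulations with different allele frequencies produces a heterozygote deficit (the Wahlund effect), so this statistic flags perfectly genuine variants; accounting for per-sample ancestry is what the chapter's test adds.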

    Client data files and auditor skepticism: How do “dirty” files influence auditors’ skeptical judgments and actions?

    Auditors receive an abundance of client-prepared data files when performing audit work. In today's increasingly data-rich environment, these files are likely becoming even more challenging for auditors to cognitively process. Specifically, the files may have characteristics (e.g., errors or irrelevant information; i.e., "dirty" files) that reduce their ease of use and interpretation (i.e., processing fluency). Depending on this ease, auditors may view the files as relatively less reliable and trustworthy, resulting in skeptical judgments and actions that are sometimes excessive. This paper reports two experiments examining whether two features of the data files, the presence of minor errors (absent or present) and information load (low or high), influence auditors' processing fluency, skeptical judgments, and actions. While minor errors should raise auditors' concerns, greater information load should not. However, we find the lowest processing fluency and the highest skeptical judgments and actions when minor errors are present and information load is high. Our study contributes to the literature by presenting an alternative issue to those raised by regulators (i.e., too much skepticism rather than too little) that can occur when auditors struggle to interpret large amounts of data. From a practical perspective, while access to increased amounts of client data may have benefits, audit firms and clients need to be wary of wasted time that could create inefficiencies affecting audit quality.