
    Ontology-Based Meta-Analysis of Global Collections of High-Throughput Public Data

    The investigation of the interconnections between the molecular and genetic events that govern biological systems is essential if we are to understand the development of disease and design effective novel treatments. Microarray and next-generation sequencing technologies have the potential to provide this information. However, taking full advantage of these approaches requires that biological connections be made across large quantities of highly heterogeneous genomic datasets. Leveraging the increasingly huge quantities of genomic data in the public domain is fast becoming one of the key challenges in the research community today.

    We have developed a novel data mining framework that enables researchers to use this growing collection of public high-throughput data to investigate any set of genes or proteins. The connectivity between molecular states across thousands of heterogeneous datasets from microarrays and other genomic platforms is determined through a combination of rank-based enrichment statistics, meta-analyses, and biomedical ontologies. We address data quality concerns through dataset replication and meta-analysis, and ensure that the majority of the findings are derived from multiple lines of evidence. As an example of our strategy and the utility of this framework, we apply our data mining approach to explore the biology of brown fat within the context of the thousands of publicly available gene expression datasets.

    Our work presents a practical strategy for organizing, mining, and correlating global collections of large-scale genomic data to explore normal and disease biology. Using a hypothesis-free approach, we demonstrate how a data-driven analysis across very large collections of genomic data can reveal novel discoveries and evidence to support existing hypotheses.
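    The combination of rank-based enrichment statistics and meta-analysis described in this abstract can be sketched in a few lines. The sketch below is illustrative only and does not come from the paper: it assumes, per dataset, a list of genes ranked by differential expression and a query gene set, applies a hypergeometric top-k enrichment test, and combines the per-dataset p-values with Stouffer's method. All function names are hypothetical.

```python
# Illustrative sketch (hypothetical names): per-dataset rank-based enrichment
# followed by a simple meta-analytic combination of p-values across datasets.
from math import comb
from statistics import NormalDist

def topk_enrichment_pvalue(ranked_genes, gene_set, k):
    """Hypergeometric tail probability of seeing at least the observed
    overlap between the top-k ranked genes and the query gene set."""
    N = len(ranked_genes)
    K = sum(1 for g in ranked_genes if g in gene_set)      # set members present
    x = sum(1 for g in ranked_genes[:k] if g in gene_set)  # observed overlap
    return sum(comb(K, i) * comb(N - K, k - i)
               for i in range(x, min(K, k) + 1)) / comb(N, k)

def stouffer_combine(pvalues):
    """Combine one p-value per dataset into a single meta-analytic p-value."""
    nd = NormalDist()
    z = sum(nd.inv_cdf(1 - p) for p in pvalues) / len(pvalues) ** 0.5
    return 1 - nd.cdf(z)

# Usage: one enrichment p-value per dataset, then combine across datasets.
ranked = [f"g{i}" for i in range(1, 11)]        # toy ranking for one dataset
p_single = topk_enrichment_pvalue(ranked, {"g1", "g2", "g3"}, k=3)
p_meta = stouffer_combine([p_single, p_single])  # pretend two replicate datasets
```

    Replication across datasets is what makes the combined p-value meaningful here: a gene set that is weakly enriched in many independent datasets can still yield strong combined evidence.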

    Collective intelligence in action


    Intelligent Sensor Validation and Sensor Fusion for Reliability and Safety Enhancement in Vehicle Control

    In this report we present an evaluation of methods for the validation and fusion of readings obtained from multiple sensors, to be used in tracking automated vehicles and avoiding obstacles in their path. Validation and fusion are performed in two modules that are part of a larger five-module hierarchical supervisory control architecture. This supervisory control architecture operates at two levels of the Automated Vehicle Control Systems (AVCS): the regulation level and the coordination level. Supervisory control activities at the regulation layer deal with validation and fusion of the sensor data, as well as fault diagnosis of the actuators, sensors, and the vehicle itself. Supervisory control activities at the coordination layer deal with detecting potential hazards, assessing the feasibility of potential maneuvers, and making recommendations to avert accidents in emergency situations. In this grant we formulated the need for a hierarchical approach and then focused in depth on the two modules for sensor validation and sensor fusion. Tracking models were introduced for the various operating states of the automated vehicle, namely vehicle following; maneuvering (i.e., split, merge, and lane change); emergencies; and the lead vehicle in a platoon. The Probabilistic Data Association Filter (based on Kalman filtering) is proposed for the formation of real-time validation gates and for fusing the validated readings. A topology for an influenc
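    The validation-gate idea behind the Probabilistic Data Association Filter can be illustrated with a one-dimensional Kalman filter that rejects readings whose normalized innovation falls outside a chi-square gate. This is a minimal sketch under assumed scalar dynamics, not the report's actual multi-sensor filter; the class and parameter names are hypothetical.

```python
# Minimal sketch (hypothetical names): a 1-D Kalman filter with a validation
# gate, in the spirit of the PDAF gating described in the abstract.
class GatedKalman1D:
    """Scalar Kalman filter; readings whose normalized innovation squared
    exceeds `gate` (a chi-square threshold, 1 dof) are rejected as outliers."""

    def __init__(self, x0, p0, q, r, gate=9.0):
        self.x, self.p = x0, p0   # state estimate and its variance
        self.q, self.r = q, r     # process and measurement noise variances
        self.gate = gate          # gate threshold (9.0 is roughly 3-sigma)

    def step(self, z):
        # Predict: variance grows by the process noise.
        self.p += self.q
        # Validate: normalized innovation squared against the gate.
        s = self.p + self.r       # innovation variance
        nu = z - self.x           # innovation (measurement residual)
        if nu * nu / s > self.gate:
            return self.x, False  # reading rejected; keep the prediction
        # Update with the validated reading.
        k = self.p / s
        self.x += k * nu
        self.p *= (1 - k)
        return self.x, True

# Usage: a plausible reading is fused; a gross outlier is gated out.
kf = GatedKalman1D(x0=0.0, p0=1.0, q=0.01, r=1.0)
kf.step(0.5)    # accepted, estimate moves toward 0.5
kf.step(100.0)  # rejected, estimate unchanged
```

    The full PDAF goes further, weighting every validated reading by its association probability rather than committing to a single measurement; the gate shown here is the common first stage.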

    Intelligent Sensor

    No sensor will deliver accurate information at all times. Many uncertain influences may add noise to sensor readings, or cause a total malfunction of the sensor. To avert these effects, and to maintain the high level of safety that is an integral part of intelligent transportation systems, we propose a fault-tolerant supervisory control architecture. It consists of five main modules: sensor validation, sensor fusion, fault diagnosis, hazard analysis, and a safety decision maker. Two methods have been identified and developed for these modules. One is based on probability theory, the other on fuzzy logic. The probabilistic approach models the sensors, along with potential failures, as probabilistic events and adaptively estimates the probabilities of these events online. For this purpose, vector dynamic probabilistic networks have been developed, and rules have been derived for inference in these networks. Such networks provide a theoretically sound method for modeling the uncertainty inherent in a system. The process of sensor validation, data fusion, and fault diagnosis is thus converted into a decision-analytic problem, in which consistent decisions can be made to maximize expected utility. The fuzzy logic approach makes use of fuzzy time series for prediction, validation gates for fusion, and abductive inference for diagnosis. The fuzzy validation gates are bounded by the physically possible changes from one time step to the next. Curves denoting confidence values for each sensor reading are fitted into the validation gates, which have their maximum value at the predicted value (calculate
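    A fuzzy validation gate of the kind described here, with a confidence curve peaking at the predicted value and falling to zero at the edge of the physically possible change per time step, can be illustrated with a triangular membership function and a confidence-weighted fusion of the validated readings. This is a hypothetical sketch, not the authors' implementation; the triangular shape and the function names are assumptions.

```python
# Illustrative sketch (hypothetical names): a triangular fuzzy validation gate
# and confidence-weighted fusion of readings from redundant sensors.
def fuzzy_gate_confidence(reading, predicted, max_change):
    """Triangular confidence curve fitted into the validation gate:
    1.0 at the predicted value, falling linearly to 0.0 at the edge of
    the physically possible change `max_change` per time step."""
    return max(0.0, 1.0 - abs(reading - predicted) / max_change)

def fuse(readings, predicted, max_change):
    """Confidence-weighted average of the readings inside the gate;
    readings outside the gate get zero weight and are thus discarded."""
    weights = [fuzzy_gate_confidence(z, predicted, max_change) for z in readings]
    total = sum(weights)
    if total == 0:
        return predicted  # no reading validated: fall back on the prediction
    return sum(w * z for w, z in zip(weights, readings)) / total

# Usage: two consistent readings are fused; one gross outlier is ignored.
fused = fuse([10.0, 10.4, 25.0], predicted=10.0, max_change=2.0)
```

    Because a reading outside the gate receives zero confidence rather than triggering a hard failure, a faulty sensor degrades the fused estimate gracefully instead of corrupting it.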