    Efficient basic event orderings for binary decision diagrams

    Over the last five years significant advances have been made in methodologies to analyse the fault tree diagram. The most successful of these developments has been the Binary Decision Diagram (BDD) approach. The Binary Decision Diagram approach has been shown to improve both the efficiency of determining the minimal cut sets of the fault tree and also the accuracy of the calculation procedure used to determine the top event parameters. The BDD technique provides a potential alternative to the traditional approaches based on Kinetic Tree Theory. To utilise the Binary Decision Diagram approach the fault tree structure is first converted to the BDD format. This conversion can be accomplished efficiently but requires the basic events in the fault tree to be placed in an ordering. A poor ordering can result in a Binary Decision Diagram which is not an efficient representation of the fault tree logic structure. The advantages to be gained by utilising the BDD technique rely on the efficiency of the ordering scheme. Alternative ordering schemes have been investigated and no one scheme is appropriate for every tree structure. Research to date has not found any rule-based means of determining the best way of ordering basic events for a given fault tree structure. The work presented in this paper takes a machine learning approach based on Genetic Algorithms to select the most appropriate ordering scheme. Features which describe a fault tree structure have been identified and these provide the inputs to the machine learning algorithm. A set of possible ordering schemes has been selected based on previous heuristic work. The objective of the work detailed in the paper is to predict the most efficient of the possible ordering alternatives from parameters which describe a fault tree structure.
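    As a rough illustration of why the ordering matters, the sketch below builds a reduced ordered BDD for a small hypothetical fault tree, with top event (a AND b) OR (c AND d) OR (e AND f), under two orderings and compares node counts. It is not the paper's algorithm or its ordering schemes; the example tree, event names, and node-count measure are illustrative assumptions.

```python
# Minimal sketch (not the paper's method): how basic event ordering
# affects the size of a reduced ordered BDD for a toy fault tree.

def build_bdd(formula, ordering):
    """Build a reduced ordered BDD by Shannon expansion.

    formula  -- callable taking {event: bool} -> bool (top event logic)
    ordering -- list of basic event names, fixed decision order
    Returns the number of internal BDD nodes as a rough size measure.
    """
    unique = {}                       # (var, low, high) -> node id, enforces sharing
    TERMINAL_0, TERMINAL_1 = 0, 1
    next_id = [2]

    def mk(var, low, high):
        if low == high:               # redundant test: skip the node entirely
            return low
        key = (var, low, high)
        if key not in unique:
            unique[key] = next_id[0]
            next_id[0] += 1
        return unique[key]

    def expand(level, assignment):
        if level == len(ordering):
            return TERMINAL_1 if formula(assignment) else TERMINAL_0
        var = ordering[level]
        low = expand(level + 1, {**assignment, var: False})
        high = expand(level + 1, {**assignment, var: True})
        return mk(var, low, high)

    expand(0, {})
    return len(unique)

# Hypothetical fault tree: TOP = (a AND b) OR (c AND d) OR (e AND f)
top = lambda s: (s['a'] and s['b']) or (s['c'] and s['d']) or (s['e'] and s['f'])

grouped = ['a', 'b', 'c', 'd', 'e', 'f']      # keeps each AND-gate pair adjacent
interleaved = ['a', 'c', 'e', 'b', 'd', 'f']  # splits the pairs apart

print(build_bdd(top, grouped), build_bdd(top, interleaved))  # 6 vs. 14 internal nodes
```

    With the grouped ordering the AND-gate pairs stay adjacent and the shared subgraph is reused, giving 6 internal nodes; interleaving the pairs forces the diagram to remember earlier events before they can be resolved, and the same logic grows to 14 nodes.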

    Learning Bayesian Networks with Thousands of Variables

    We present a method for learning Bayesian networks from data sets containing thousands of variables without the need for structure constraints. Our approach consists of two parts. The first is a novel algorithm that effectively explores the space of possible parent sets of a node. It guides the exploration towards the most promising parent sets on the basis of an approximated score function that is computed in constant time. The second part is an improvement of an existing ordering-based algorithm for structure optimization. The new algorithm provably achieves a higher score compared to its original formulation. Our novel approach consistently outperforms the state of the art on very large data sets.
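    A minimal sketch of the ordering-based idea appears below: given a fixed variable ordering, each node scores candidate parent sets drawn from its predecessors and keeps the best one. The paper's contribution is to make this scale, via approximated scores computed in constant time and an improved ordering-based optimizer; the exhaustive search, Gaussian BIC score, and synthetic data here are simplifying assumptions.

```python
# Minimal sketch (not the authors' algorithm) of ordering-based structure
# learning: each node greedily picks its best-scoring parent set among
# the variables that precede it in the ordering.

import itertools
import numpy as np

def gaussian_bic(data, child, parents):
    """BIC of a linear-Gaussian local model child ~ parents."""
    n = data.shape[0]
    y = data[:, child]
    if parents:
        X = np.column_stack([data[:, list(parents)], np.ones(n)])
    else:
        X = np.ones((n, 1))
    beta, *_ = np.linalg.lstsq(X, y, rcond=None)
    resid = y - X @ beta
    sigma2 = max(resid @ resid / n, 1e-12)
    loglik = -0.5 * n * (np.log(2 * np.pi * sigma2) + 1)
    k = X.shape[1] + 1                       # regression coefficients + variance
    return loglik - 0.5 * k * np.log(n)

def learn_structure(data, ordering, max_parents=2):
    """Exhaustively score all small parent sets among each node's predecessors."""
    parents_of = {}
    for i, node in enumerate(ordering):
        candidates = ordering[:i]
        best_score, best_set = gaussian_bic(data, node, ()), ()
        for size in range(1, min(max_parents, len(candidates)) + 1):
            for ps in itertools.combinations(candidates, size):
                s = gaussian_bic(data, node, ps)
                if s > best_score:
                    best_score, best_set = s, ps
        parents_of[node] = best_set
    return parents_of

# Synthetic chain X0 -> X1 -> X2
rng = np.random.default_rng(0)
x0 = rng.normal(size=500)
x1 = 2.0 * x0 + rng.normal(size=500)
x2 = -1.5 * x1 + rng.normal(size=500)
data = np.column_stack([x0, x1, x2])
print(learn_structure(data, ordering=[0, 1, 2]))  # typically {0: (), 1: (0,), 2: (1,)}
```

    Because parents are restricted to predecessors in the ordering, the learned graph is acyclic by construction, which is what makes ordering-based search attractive for very large networks.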

    GANN: Genetic algorithm neural networks for the detection of conserved combinations of features in DNA

    BACKGROUND: The multitude of motif detection algorithms developed to date has largely focused on the detection of patterns in primary sequence. Since sequence-dependent DNA structure and flexibility may also play a role in protein-DNA interactions, the simultaneous exploration of sequence- and structure-based hypotheses about the composition of binding sites and the ordering of features in a regulatory region should be considered as well. The consideration of structural features requires the development of new detection tools that can deal with data types other than primary sequence. RESULTS: GANN (available at ) is a machine learning tool for the detection of conserved features in DNA. The software suite contains programs to extract different regions of genomic DNA from flat files and convert these sequences to indices that reflect sequence and structural composition or the presence of specific protein binding sites. The machine learning component allows the classification of different types of sequences based on subsamples of these indices, and can identify the best combinations of indices and machine learning architecture for sequence discrimination. Another key feature of GANN is the replicated splitting of data into training and test sets, and the implementation of negative controls. In validation experiments, GANN successfully merged important sequence and structural features to yield good predictive models for synthetic and real regulatory regions. CONCLUSION: GANN is a flexible tool that can search through large sets of sequence and structural feature combinations to identify those that best characterize a set of sequences.
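    The sketch below illustrates the general idea rather than the published GANN software: DNA windows are mapped to a handful of sequence- and structure-derived indices (GC content, purine fraction, a toy dinucleotide "flexibility" average, AT skew), and a small genetic algorithm searches for the index subset that best separates two synthetic sequence classes. The flexibility table, the nearest-centroid fitness function, and the data are all illustrative assumptions.

```python
# Minimal sketch of the GANN-style workflow (illustrative only): encode DNA
# into sequence/structure indices, then evolve a feature subset by GA.

import random
import numpy as np

# Toy dinucleotide "flexibility" values (illustrative, not measured data).
FLEX = {a + b: 0.1 * i for i, (a, b) in
        enumerate((x, y) for x in "ACGT" for y in "ACGT")}

def indices(seq):
    """Map a DNA string to sequence- and structure-derived indices."""
    gc = (seq.count("G") + seq.count("C")) / len(seq)
    purine = (seq.count("A") + seq.count("G")) / len(seq)
    flex = np.mean([FLEX[seq[i:i + 2]] for i in range(len(seq) - 1)])
    at_skew = (seq.count("A") - seq.count("T")) / len(seq)
    return np.array([gc, purine, flex, at_skew])

def fitness(mask, X, y):
    """Accuracy of a nearest-centroid classifier on the selected indices."""
    if not mask.any():
        return 0.0
    Xs = X[:, mask]
    c0, c1 = Xs[y == 0].mean(axis=0), Xs[y == 1].mean(axis=0)
    pred = np.linalg.norm(Xs - c1, axis=1) < np.linalg.norm(Xs - c0, axis=1)
    return float((pred == y).mean())

def ga_select(X, y, n_features, generations=30, pop_size=20):
    """Evolve boolean feature masks by truncation selection plus mutation."""
    pop = [np.random.rand(n_features) < 0.5 for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=lambda m: fitness(m, X, y), reverse=True)
        survivors = pop[: pop_size // 2]
        children = [m ^ (np.random.rand(n_features) < 0.1) for m in survivors]
        pop = survivors + children
    return max(pop, key=lambda m: fitness(m, X, y))

# Synthetic data: class 1 windows are GC-rich, class 0 are AT-rich.
def random_seq(p_gc, length=50):
    return "".join(random.choice("GC") if random.random() < p_gc
                   else random.choice("AT") for _ in range(length))

seqs = [random_seq(0.3) for _ in range(100)] + [random_seq(0.7) for _ in range(100)]
y = np.array([0] * 100 + [1] * 100)
X = np.vstack([indices(s) for s in seqs])
best_mask = ga_select(X, y, n_features=X.shape[1])
print("selected indices:", best_mask, "accuracy:", fitness(best_mask, X, y))
```

    A real application would replace the nearest-centroid fitness with the neural network architectures GANN searches over, and the toy indices with measured sequence and structural parameters.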

    Learning Discriminative Bayesian Networks from High-dimensional Continuous Neuroimaging Data

    Due to their causal semantics, Bayesian networks (BN) have been widely employed to discover the underlying data relationships in exploratory studies, such as brain research. Despite its success in modeling the probability distribution of variables, a BN is naturally a generative model, which is not necessarily discriminative. This may cause subtle but critical network changes, which are of investigative value across populations, to be overlooked. In this paper, we propose to improve the discriminative power of BN models for continuous variables from two different perspectives. This brings two general discriminative learning frameworks for Gaussian Bayesian networks (GBN). In the first framework, we employ the Fisher kernel to bridge the generative models of GBN and the discriminative classifiers of SVMs, and convert the GBN parameter learning to Fisher kernel learning via minimizing a generalization error bound of SVMs. In the second framework, we employ the max-margin criterion and build it directly upon GBN models to explicitly optimize the classification performance of the GBNs. The advantages and disadvantages of the two frameworks are discussed and experimentally compared. Both of them demonstrate strong power in learning discriminative parameters of GBNs for neuroimaging-based brain network analysis, as well as maintaining reasonable representation capacity. The contributions of this paper also include a new Directed Acyclic Graph (DAG) constraint with theoretical guarantee to ensure the graph validity of GBN.
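    The sketch below illustrates the first framework's general idea (a generative model feeding a discriminative classifier via Fisher scores) in the simplest possible setting: a diagonal Gaussian fitted to all samples, per-sample gradients of its log-likelihood as features, and a linear SVM on top. It is not the paper's GBN parameterization, its kernel-learning procedure, or the max-margin framework; the data and model are synthetic assumptions.

```python
# Minimal Fisher-kernel sketch (generic diagonal Gaussian, not the paper's GBN):
# per-sample gradients of a generative model's log-likelihood become features
# for a discriminative SVM.

import numpy as np
from sklearn.svm import SVC

def fisher_scores(X, mu, var):
    """Gradients of the diagonal-Gaussian log-likelihood per sample.

    d/d mu  log N(x; mu, var) = (x - mu) / var
    d/d var log N(x; mu, var) = ((x - mu)^2 - var) / (2 var^2)
    """
    d_mu = (X - mu) / var
    d_var = ((X - mu) ** 2 - var) / (2 * var ** 2)
    return np.hstack([d_mu, d_var])

rng = np.random.default_rng(1)
# Two synthetic "populations" differing only in the variance of one variable.
X0 = rng.normal(0, 1.0, size=(200, 5))
X1 = rng.normal(0, 1.0, size=(200, 5))
X1[:, 2] *= 2.0                       # subtle distributional difference
X = np.vstack([X0, X1])
y = np.array([0] * 200 + [1] * 200)

# Fit one generative model on all data, then go discriminative on its scores.
mu, var = X.mean(axis=0), X.var(axis=0)
Phi = fisher_scores(X, mu, var)
clf = SVC(kernel="linear").fit(Phi, y)
print("training accuracy:", clf.score(Phi, y))
```

    The point of the construction is that a linear classifier on the raw variables cannot see a purely distributional (here, variance) difference between the groups, whereas the Fisher scores expose it as ordinary features the SVM can weight.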