
    Efficient fault tree analysis using binary decision diagrams

    The Binary Decision Diagram (BDD) method has emerged as an alternative to conventional techniques for performing both qualitative and quantitative analysis of fault trees. BDDs are already proving to be of considerable use in reliability analysis, providing a more efficient means of analysing a system without the need for the approximations used in the traditional approach of Kinetic Tree Theory. To implement this technique, a BDD must be constructed from the fault tree according to some ordering of the fault tree variables. The selected variable ordering has a crucial effect on the resulting BDD size and the number of calculations required for its construction; a bad choice of ordering can lead to excessive calculations and a BDD many orders of magnitude larger than one obtained with an ordering better suited to the tree. Within this thesis a comparison is made of the effectiveness of several ordering schemes, some of which have not previously been investigated. Techniques are then developed for the efficient construction of BDDs from fault trees. The method of Faunet reduction is applied to a set of fault trees and is shown to significantly reduce the size of the resulting BDDs. The technique is then extended with an additional stage that yields further reductions in BDD size. A fault tree analysis strategy is proposed that increases the likelihood of obtaining a BDD for any given fault tree. This method applies simplification techniques to the fault tree to obtain a set of concise and independent subtrees equivalent to the original fault tree structure. BDDs are constructed for each subtree, and the quantitative analysis is developed for the set of BDDs to obtain the top event parameters and the event criticality functions.
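    The construction step lends itself to a short sketch. The following is a minimal, illustrative Python BDD builder, not the thesis code: it applies Shannon decomposition to a fault tree given as nested AND/OR tuples over basic-event names, under a caller-supplied variable ordering, and shares identical subgraphs through a unique table so the effect of different orderings on node count can be compared. The fault-tree encoding and all identifiers are assumptions made for this example.

```python
# Minimal illustrative BDD builder (not the thesis code): Shannon decomposition
# of a fault tree given as nested ('AND', ...) / ('OR', ...) tuples over
# basic-event names, under a caller-supplied variable ordering. A unique table
# enforces the usual ROBDD reduction rules (node sharing, redundant-test removal).

def build_bdd(tree, ordering):
    unique = {}    # (var, low, high) -> node id  (node sharing)
    nodes = {}     # node id -> (var, low, high)
    counter = [2]  # ids 0 and 1 are reserved for the terminal nodes

    def mk(var, low, high):
        if low == high:                      # redundant test: skip the node
            return low
        key = (var, low, high)
        if key not in unique:
            unique[key] = counter[0]
            nodes[counter[0]] = key
            counter[0] += 1
        return unique[key]

    def restrict(t, var, val):
        """Substitute var := val and simplify AND/OR gates."""
        if isinstance(t, str):
            return val if t == var else t
        op, *args = t
        sub = [restrict(a, var, val) for a in args]
        if op == 'AND':
            if 0 in sub:
                return 0
            sub = [a for a in sub if a != 1]
            return 1 if not sub else sub[0] if len(sub) == 1 else ('AND', *sub)
        else:  # 'OR'
            if 1 in sub:
                return 1
            sub = [a for a in sub if a != 0]
            return 0 if not sub else sub[0] if len(sub) == 1 else ('OR', *sub)

    def build(t, depth):
        if t == 0 or t == 1:
            return t
        var = ordering[depth]
        return mk(var,
                  build(restrict(t, var, 0), depth + 1),
                  build(restrict(t, var, 1), depth + 1))

    return build(tree, 0), nodes

# Top event = (x1 AND y1) OR (x2 AND y2); compare two orderings by BDD size.
tree = ('OR', ('AND', 'x1', 'y1'), ('AND', 'x2', 'y2'))
for ordering in (['x1', 'y1', 'x2', 'y2'], ['x1', 'x2', 'y1', 'y2']):
    root, nodes = build_bdd(tree, ordering)
    print(ordering, '->', len(nodes), 'non-terminal nodes')
```

    Running the example gives 4 non-terminal nodes for the interleaved ordering and 6 for the blocked one, the standard small illustration of how strongly the ordering drives BDD size.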

    A generic optimising feature extraction method using multiobjective genetic programming

    In this paper, we present a generic, optimising feature extraction method using multiobjective genetic programming. We re-examine the feature extraction problem and show that effective feature extraction can significantly enhance the performance of pattern recognition systems with simple classifiers. A framework is presented to evolve optimised feature extractors that transform an input pattern space into a decision space in which maximal class separability is obtained. We have applied this method to real-world datasets from the UCI Machine Learning and StatLog databases to verify our approach and compare our proposed method with other reported results. We conclude that our algorithm is able to produce classifiers of superior (or equivalent) performance to the conventional classifiers examined, suggesting that the need to exhaustively evaluate a large family of conventional classifiers on any new problem can be removed. (C) 2010 Elsevier B.V. All rights reserved.
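    As a rough illustration of the underlying idea (not the authors' system), the sketch below evolves tiny arithmetic feature extractors under two objectives, a Fisher-style separability score to maximise and expression size to minimise, and reports the Pareto-optimal candidates on a toy two-class problem. Every function, constant, and dataset here is invented for the example.

```python
import random
import numpy as np

# Illustrative multiobjective sketch: random arithmetic feature extractors are
# scored on (class separability, expression size) and the Pareto front is kept.
OPS = {'+': np.add, '-': np.subtract, '*': np.multiply}

def random_expr(n_inputs, depth=2):
    """A small random arithmetic expression tree over the input features."""
    if depth == 0 or random.random() < 0.3:
        return ('x', random.randrange(n_inputs))
    op = random.choice(list(OPS))
    return (op, random_expr(n_inputs, depth - 1), random_expr(n_inputs, depth - 1))

def evaluate(expr, X):
    if expr[0] == 'x':
        return X[:, expr[1]]
    return OPS[expr[0]](evaluate(expr[1], X), evaluate(expr[2], X))

def size(expr):
    return 1 if expr[0] == 'x' else 1 + size(expr[1]) + size(expr[2])

def separability(feature, y):
    """One-dimensional Fisher-style ratio: spread of class means over within-class spread."""
    classes = np.unique(y)
    means = np.array([feature[y == c].mean() for c in classes])
    within = sum(feature[y == c].var() + 1e-9 for c in classes)
    return means.var() / within

def pareto_front(population, X, y):
    """Keep candidates not dominated on (separability up, size down)."""
    scored = [(separability(evaluate(e, X), y), -size(e), e) for e in population]
    return [s for s in scored
            if not any(o[0] >= s[0] and o[1] >= s[1] and o[:2] != s[:2] for o in scored)]

random.seed(0)
rng = np.random.default_rng(0)
X = rng.normal(size=(200, 4))
y = (X[:, 0] + X[:, 1] > 0).astype(int)          # toy two-class problem
population = [random_expr(n_inputs=4) for _ in range(200)]
for score, neg_size, expr in sorted(pareto_front(population, X, y), reverse=True)[:3]:
    print(f"separability={score:.3f}  size={-neg_size}  expr={expr}")
```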

    A Comparative Study of the Application of Different Learning Techniques to Natural Language Interfaces

    In this paper we present the first results of a comparative study. Its aim is to test the feasibility of different inductive learning techniques for the automatic acquisition of linguistic knowledge within a natural language database interface. In our interface architecture, the machine learning module replaces an elaborate semantic analysis component. The learning module learns the correct mapping of a user's input to the corresponding database command from a collection of past input data. We use an existing interface to a production planning and control system for evaluation and compare the results achieved by different instance-based and model-based learning algorithms. Comment: 10 pages, to appear in CoNLL9
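    A hedged sketch of the comparison setup, using scikit-learn stand-ins rather than the paper's actual learners and data: an instance-based classifier (1-nearest-neighbour over TF-IDF features) and a model-based one (a decision tree) are cross-validated on invented utterance-to-command pairs.

```python
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.model_selection import cross_val_score
from sklearn.neighbors import KNeighborsClassifier
from sklearn.pipeline import make_pipeline
from sklearn.tree import DecisionTreeClassifier

# Toy utterance -> database-command pairs, invented for this sketch.
utterances = [
    "show all orders for customer 42", "list the orders of customer 42",
    "how many parts are in stock", "current stock level of all parts",
    "delete order 17", "cancel order number 17",
    "when is job 5 scheduled", "show the schedule of job 5",
]
commands = [
    "SELECT_ORDERS", "SELECT_ORDERS", "SELECT_STOCK", "SELECT_STOCK",
    "DELETE_ORDER", "DELETE_ORDER", "SELECT_SCHEDULE", "SELECT_SCHEDULE",
]

learners = [
    ("instance-based (1-NN)", KNeighborsClassifier(n_neighbors=1)),
    ("model-based (decision tree)", DecisionTreeClassifier(random_state=0)),
]
for name, clf in learners:
    pipeline = make_pipeline(TfidfVectorizer(), clf)   # bag-of-words text features
    scores = cross_val_score(pipeline, utterances, commands, cv=2)
    print(f"{name}: mean accuracy {scores.mean():.2f}")
```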

    Power System Parameters Forecasting Using Hilbert-Huang Transform and Machine Learning

    A novel hybrid data-driven approach is developed for forecasting power system parameters with the goal of increasing the efficiency of short-term forecasting studies for non-stationary time series. The proposed approach is based on mode decomposition and a feature analysis of initial retrospective data using the Hilbert-Huang transform and machine learning algorithms. The random forests and gradient boosting trees learning techniques were examined. The decision tree techniques were used to rank the importance of the variables employed in the forecasting models, with the Mean Decrease Gini index as the impurity function. The resulting hybrid forecasting models employ a radial basis function neural network and support vector regression. Apart from the introduction and references, the paper is organized as follows. Section 2 presents the background and a review of several approaches to short-term forecasting of power system parameters. In the third section, a hybrid machine-learning algorithm using the Hilbert-Huang transform is developed for short-term forecasting of power system parameters. The fourth section describes the decision tree learning algorithms used to assess variable importance. Finally, in section six, experimental results are presented for the following electric power problems: active power flow forecasting, electricity price forecasting, and wind speed and direction forecasting.
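    The shape of such a pipeline can be sketched as follows, on synthetic data and with stand-in components: instantaneous-amplitude features from the Hilbert transform (a simplification of the full Hilbert-Huang decomposition), impurity-based variable ranking with a random forest, and an RBF-kernel support vector regressor for the forecast. The series, lag structure, and parameter choices are all illustrative, not taken from the paper.

```python
import numpy as np
from scipy.signal import hilbert
from sklearn.ensemble import RandomForestRegressor
from sklearn.svm import SVR

# Synthetic stand-in for a power-system series: daily and weekly cycles plus noise.
rng = np.random.default_rng(1)
t = np.arange(2000)
load = (np.sin(2 * np.pi * t / 24) + 0.3 * np.sin(2 * np.pi * t / 168)
        + 0.1 * rng.standard_normal(t.size))
envelope = np.abs(hilbert(load))        # instantaneous amplitude via the Hilbert transform

def lagged(series, n_lags=24):
    """Stack the previous n_lags values as inputs; the next value is the target."""
    X = np.column_stack([series[i:i - n_lags] for i in range(n_lags)])
    return X, series[n_lags:]

X_load, y = lagged(load)
X_env, _ = lagged(envelope)
X = np.hstack([X_load, X_env])          # raw lags plus amplitude-feature lags

# Rank inputs by impurity-based importance and keep the strongest lags.
rf = RandomForestRegressor(n_estimators=100, random_state=0).fit(X[:-200], y[:-200])
top = np.argsort(rf.feature_importances_)[::-1][:8]

# Forecast the held-out tail with an RBF-kernel support vector regressor.
svr = SVR(kernel="rbf", C=10.0).fit(X[:-200, top], y[:-200])
pred = svr.predict(X[-200:, top])
print("hold-out MAE:", float(np.mean(np.abs(pred - y[-200:]))))
```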

    Improved Memory-Bounded Dynamic Programming for Decentralized POMDPs

    Memory-Bounded Dynamic Programming (MBDP) has proved extremely effective in solving decentralized POMDPs with large horizons. We generalize the algorithm and improve its scalability by reducing the complexity with respect to the number of observations from exponential to polynomial. We derive error bounds on solution quality with respect to this new approximation and analyze the convergence behavior. To evaluate the effectiveness of the improvements, we introduce a new, larger benchmark problem. Experimental results show that despite the high complexity of decentralized POMDPs, scalable solution techniques such as MBDP perform surprisingly well. Comment: Appears in Proceedings of the Twenty-Third Conference on Uncertainty in Artificial Intelligence (UAI2007)
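    To make the memory-bounding idea concrete, here is a heavily simplified, single-agent sketch rather than the decentralized algorithm of the paper: policy trees are grown by exhaustive backup and then pruned back to a fixed number, selected at randomly sampled beliefs as a stand-in for MBDP's heuristic portfolio. The tiger-problem numbers are the usual textbook values; everything else is invented for illustration.

```python
import itertools
import random

# Single-agent toy (tiger problem); only the backup-and-prune skeleton is shown.
STATES = ["tiger-left", "tiger-right"]
ACTIONS = ["listen", "open-left", "open-right"]
OBS = ["hear-left", "hear-right"]

def reward(s, a):
    if a == "listen":
        return -1.0
    opened_left = (a == "open-left")
    return -100.0 if (s == "tiger-left") == opened_left else 10.0

def trans_probs(s, a):
    # Listening leaves the state unchanged; opening a door resets it uniformly.
    return {s: 1.0} if a == "listen" else {s2: 0.5 for s2 in STATES}

def obs_prob(o, s, a):
    if a != "listen":
        return 0.5                    # observations are uninformative after opening
    correct = (o == "hear-left") == (s == "tiger-left")
    return 0.85 if correct else 0.15

def tree_value(tree, s):
    """Expected value of executing a policy tree (action, {obs: subtree}) from state s."""
    a, children = tree
    v = reward(s, a)
    if children is None:
        return v
    for s2, p_t in trans_probs(s, a).items():
        for o in OBS:
            v += p_t * obs_prob(o, s2, a) * tree_value(children[o], s2)
    return v

def exhaustive_backup(trees):
    """All one-step-longer policy trees built from the current tree set."""
    return [(a, dict(zip(OBS, combo)))
            for a in ACTIONS
            for combo in itertools.product(trees, repeat=len(OBS))]

def mbdp(horizon=4, max_trees=3, n_beliefs=20, seed=0):
    rng = random.Random(seed)
    trees = [(a, None) for a in ACTIONS]          # depth-1 trees: just an action
    for _ in range(horizon - 1):
        candidates = exhaustive_backup(trees)
        values = [{s: tree_value(t, s) for s in STATES} for t in candidates]
        kept = []
        for _ in range(n_beliefs):                # random beliefs stand in for the portfolio
            b = rng.random()
            belief = {"tiger-left": b, "tiger-right": 1.0 - b}
            best = max(range(len(candidates)),
                       key=lambda i: sum(belief[s] * values[i][s] for s in STATES))
            if candidates[best] not in kept:
                kept.append(candidates[best])
            if len(kept) == max_trees:
                break
        trees = kept                              # memory bound: at most max_trees trees
    return trees

policy_trees = mbdp()
print(len(policy_trees), "policy trees retained; root actions:",
      [t[0] for t in policy_trees])
```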