    Linguistic Decision Tree Induction

    Verifiable Reinforcement Learning via Policy Extraction

    While deep reinforcement learning has successfully solved many challenging control tasks, its real-world applicability has been limited by the inability to ensure the safety of learned policies. We propose an approach to verifiable reinforcement learning by training decision tree policies, which can represent complex policies (since they are nonparametric), yet can be efficiently verified using existing techniques (since they are highly structured). The challenge is that decision tree policies are difficult to train. We propose VIPER, an algorithm that combines ideas from model compression and imitation learning to learn decision tree policies guided by a DNN policy (called the oracle) and its Q-function, and show that it substantially outperforms two baselines. We use VIPER to (i) learn a provably robust decision tree policy for a variant of Atari Pong with a symbolic state space, (ii) learn a decision tree policy for a toy game based on Pong that provably never loses, and (iii) learn a provably stable decision tree policy for cart-pole. In each case, the decision tree policy achieves performance equal to that of the original DNN policy.
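    VIPER's core loop is a DAgger-style imitation procedure: roll out the current tree policy, label the visited states with the oracle's action, weight states by how much the oracle's Q-function says the action choice matters, and refit the tree on the aggregated data. A minimal sketch of that loop, assuming a toy paddle/ball state space with stand-in oracle and Q-gap functions (not the paper's DNN), with scikit-learn providing the tree learner:

```python
import random
from sklearn.tree import DecisionTreeClassifier

def oracle_policy(state):
    # stand-in oracle: move toward the ball (0 = down, 1 = up)
    paddle, ball = state
    return 0 if ball < paddle else 1

def oracle_q_gap(state):
    # stand-in for max_a Q(s, a) - min_a Q(s, a): how costly a wrong
    # action is in this state (larger when the ball is far away)
    paddle, ball = state
    return abs(ball - paddle) + 1

def rollout(policy, n=200, seed=0):
    # toy stand-in for environment rollouts: sample visited states
    rng = random.Random(seed)
    return [(rng.randint(0, 9), rng.randint(0, 9)) for _ in range(n)]

def viper_distill(iterations=3, seed=0):
    dataset, tree = [], None
    for i in range(iterations):
        policy = oracle_policy if tree is None else (
            lambda s: int(tree.predict([s])[0]))
        # label every state visited by the current policy with the
        # oracle's action; replication approximates Q-gap weighting
        for s in rollout(policy, seed=seed + i):
            dataset += [(s, oracle_policy(s))] * oracle_q_gap(s)
        X = [s for s, _ in dataset]
        y = [a for _, a in dataset]
        tree = DecisionTreeClassifier(max_depth=3, random_state=0).fit(X, y)
    return tree
```

    Because the result is a small, fixed tree, each root-to-leaf path is a conjunction of threshold tests that existing verifiers can check directly.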

    An application of decision trees method for fault diagnosis of induction motors

    The decision tree is one of the most effective and widely used methods for building classification models. Researchers from disciplines such as statistics, machine learning, pattern recognition, and data mining have adopted the decision tree method as an effective solution to problems in their fields. In this paper, an application of the decision tree method to classifying faults of induction motors is proposed. The raw experimental data are processed by feature calculation to extract useful information as attributes. These data are then assigned class labels, based on our experience, before becoming inputs to the decision tree. In total, 9 classes are defined. An implementation of the decision tree written in Matlab is used for these data.
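    The pipeline the abstract describes — raw signals reduced to feature attributes, then classified by an induced tree — can be sketched as follows; the feature set (RMS, crest factor), class names, and hand-built thresholds are illustrative assumptions, not the paper's:

```python
import math

def features(signal):
    # reduce a raw signal to summary attributes for tree induction
    rms = math.sqrt(sum(x * x for x in signal) / len(signal))
    peak = max(abs(x) for x in signal)
    return {"rms": rms, "crest": peak / rms if rms else 0.0}

def classify(feats):
    # hand-built stand-in for an induced tree over the computed
    # features; thresholds and fault labels are illustrative only
    if feats["rms"] > 1.5:
        return "bearing_fault" if feats["crest"] > 2.0 else "rotor_imbalance"
    return "healthy"
```

    In the paper's setting, the induction step would learn such thresholds from the labeled feature vectors rather than fixing them by hand.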

    Application of a Decision Tree Algorithm for Collateral Appraisal in Credit Applications (Penerapan Algoritma Decision Tree untuk Penilaian Agunan Pengajuan Kredit)

    There remains a possibility of error in appraising the collateral used as the reference for a loan's value, which opens the door to non-performing loans (NPL). An appraisal (value-prediction) method that is suitably proportional, credible, and accurate is therefore needed; inaccurate predictions lead to improper credit-management planning. Predicting collateral value has attracted many researchers because of its theoretical and empirical importance, and different models can yield different accuracies. This study therefore applies the C4.5 decision tree algorithm to collateral appraisal for credit applications, using collateral data from credit applications in the city of Banjarmasin. Algorithm performance is evaluated using precision, recall, and AUC, and the predictions of the C4.5 classification algorithm are compared and analyzed against those of other analysis methods (Naive Bayes, k-NN). The results show that the C4.5 decision tree can be applied to credit collateral appraisal with 71% accuracy and an AUC above 0.6, and that it predicts more accurately than k-NN, Naive Bayes, and biased calculation.
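    The AUC reported above can be computed without any library via its Mann-Whitney interpretation: the probability that a randomly chosen positive example is scored higher than a randomly chosen negative one, with ties counting half. A self-contained sketch:

```python
def auc(labels, scores):
    # AUC as the Mann-Whitney U statistic, normalized: fraction of
    # (positive, negative) pairs where the positive is scored higher
    pos = [s for y, s in zip(labels, scores) if y == 1]
    neg = [s for y, s in zip(labels, scores) if y == 0]
    wins = sum((p > n) + 0.5 * (p == n) for p in pos for n in neg)
    return wins / (len(pos) * len(neg))
```

    An AUC of 0.5 corresponds to random scoring, which is why the paper treats values above 0.6 as meaningful discrimination.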

    Extension of Decision Tree Algorithm for Stream Data Mining Using Real Data

    Recently, as the amount of data in society has grown, data stream mining, which targets large-scale data, has attracted attention. Data mining is a technology for discovering new knowledge and patterns in massive amounts of data; applied to data streams, it is called data stream mining. In this paper, we propose feature selection with an online decision tree. First, we construct an online decision tree, treating credit card transaction data as a data stream. Second, we select the attributes thought to be important for detecting illegal use. We apply the VFDT (Very Fast Decision Tree learner) algorithm to construct the online decision tree.
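    VFDT decides when a leaf has seen enough stream examples to split by comparing the advantage of the best attribute over the runner-up against the Hoeffding bound ε = sqrt(R² ln(1/δ) / 2n). A minimal sketch of that test (parameter names are generic, not VFDT's implementation):

```python
import math

def hoeffding_bound(value_range, delta, n):
    # with probability 1 - delta, the mean of n i.i.d. observations of
    # a quantity with range value_range is within epsilon of the truth
    return math.sqrt(value_range ** 2 * math.log(1.0 / delta) / (2.0 * n))

def should_split(best_gain, second_gain, value_range, delta, n):
    # VFDT's split rule: commit to the leading attribute only once its
    # observed advantage over the runner-up exceeds the bound
    return (best_gain - second_gain) > hoeffding_bound(value_range, delta, n)
```

    The bound shrinks as more examples arrive, so a leaf that cannot split early will eventually do so once the evidence is statistically sufficient, without ever storing the stream.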

    Modifiable risk factors predicting major depressive disorder at four year follow-up: a decision tree approach

    BACKGROUND: Relative to physical health conditions such as cardiovascular disease, little is known about risk factors that predict the prevalence of depression. The present study investigates the expected effects of a reduction of these risks over time, using the decision tree method favoured in assessing cardiovascular disease risk. METHODS: The PATH through Life cohort was used for the study, comprising 2,105 20-24 year olds, 2,323 40-44 year olds and 2,177 60-64 year olds sampled from the community in the Canberra region, Australia. A decision tree methodology was used to predict the presence of major depressive disorder after four years of follow-up. The decision tree was compared with a logistic regression analysis using ROC curves. RESULTS: The decision tree was found to distinguish and delineate a wide range of risk profiles. Previous depressive symptoms were most highly predictive of depression after four years; however, modifiable risk factors such as substance use and employment status played significant roles in assessing the risk of depression. The decision tree was found to have better sensitivity and specificity than a logistic regression using identical predictors. CONCLUSION: The decision tree method was useful in assessing the risk of major depressive disorder over four years. Application of the model to the development of a predictive tool for tailored interventions is discussed.
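    Sensitivity and specificity, the metrics used above to compare the tree against logistic regression, reduce to simple confusion-matrix counts. A small illustrative sketch:

```python
def sensitivity_specificity(y_true, y_pred):
    # sensitivity = true positive rate; specificity = true negative rate
    tp = sum(1 for t, p in zip(y_true, y_pred) if t == 1 and p == 1)
    fn = sum(1 for t, p in zip(y_true, y_pred) if t == 1 and p == 0)
    tn = sum(1 for t, p in zip(y_true, y_pred) if t == 0 and p == 0)
    fp = sum(1 for t, p in zip(y_true, y_pred) if t == 0 and p == 1)
    return tp / (tp + fn), tn / (tn + fp)
```

    Sweeping a classifier's decision threshold and plotting these two quantities yields the ROC curves the study uses for model comparison.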

    On the parity complexity measures of Boolean functions

    The parity decision tree model extends the decision tree model by allowing the computation of a parity function in one step. We prove that the deterministic parity decision tree complexity of any Boolean function is polynomially related to the non-deterministic complexity of the function or its complement. We also show that they are polynomially related to an analogue of the block sensitivity. We further study parity decision trees in relation to an intermediate variant of decision trees, as well as to communication complexity.
    Comment: submitted to TCS on 16-MAR-200
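    The gap between the two models can be seen concretely on the parity function itself: an ordinary decision tree must query all n bits of PARITY_n, while a parity decision tree answers it with a single parity query. A brute-force computation of deterministic decision-tree depth D(f) for small n (an illustrative aid, not from the paper):

```python
from itertools import product

def tree_depth(f, n, fixed=None):
    # exact deterministic decision-tree complexity D(f) by brute force:
    # minimum over queried variables of the worst-case remaining depth
    fixed = fixed or {}
    free = [i for i in range(n) if i not in fixed]
    values = {f(tuple(fixed[i] if i in fixed else bits[free.index(i)]
                      for i in range(n)))
              for bits in product((0, 1), repeat=len(free))}
    if len(values) <= 1:
        return 0  # f is constant on this subcube: no queries needed
    return min(1 + max(tree_depth(f, n, {**fixed, i: v}) for v in (0, 1))
               for i in free)

def parity(x):
    # D(parity) = n in the ordinary model; a parity decision tree
    # computes it in one step by querying x_1 XOR ... XOR x_n directly
    return sum(x) % 2
```

    The exponential enumeration limits this to small n, but it suffices to verify that parity saturates the ordinary model while costing one query in the parity model.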