558 research outputs found

    Graphical models for the identification of causal structures in multivariate time series models

    In this paper we present a semi-automated search procedure to deal with the problem of identifying the contemporaneous causal structure connected to a large class of multivariate time series models. We refe…

    Common Causes and The Direction of Causation

    Is the common cause principle merely one of a set of useful heuristics for discovering causal relations, or is it rather a piece of heavy-duty metaphysics, capable of grounding the direction of causation itself? Since the principle was introduced in Reichenbach’s groundbreaking work The Direction of Time (1956), there have been a series of attempts to pursue the latter program—to take the probabilistic relationships constitutive of the principle of the common cause and use them to ground the direction of causation. These attempts have not all explicitly appealed to the principle as originally formulated; it has also appeared in the guise of independence conditions, counterfactual overdetermination, and, in the causal modelling literature, as the causal Markov condition. In this paper, I identify a set of difficulties for grounding the asymmetry of causation on the principle and its descendants. The first difficulty, concerning what I call the vertical placement of causation, consists of a tension between considerations that drive towards the macroscopic scale and considerations that drive towards the microscopic scale; the worry is that these considerations cannot both be comfortably accommodated. The second difficulty consists of a novel potential counterexample to the principle based on the familiar Einstein–Podolsky–Rosen (EPR) cases in quantum mechanics.
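    Reichenbach's principle has a precise probabilistic core: a common cause screens its joint effects off from one another. A minimal simulation of that screening-off property (an illustration of the principle itself, not of anything specific to this paper's argument):

```python
import numpy as np

rng = np.random.default_rng(0)
n = 100_000

# Common cause C; effects A and B depend on C but not on each other.
c = rng.normal(size=n)
a = c + rng.normal(scale=0.5, size=n)
b = c + rng.normal(scale=0.5, size=n)

def corr(x, y):
    return float(np.corrcoef(x, y)[0, 1])

# Marginally, the common cause induces a strong A-B correlation.
marginal = corr(a, b)

# "Conditioning" on C by regressing it out of both effects:
a_res = a - np.polyfit(c, a, 1)[0] * c
b_res = b - np.polyfit(c, b, 1)[0] * c
conditional = corr(a_res, b_res)  # near zero: C screens A off from B
```

    With these coefficients the marginal correlation is about 0.8, while the residual correlation after conditioning on C is essentially zero, which is exactly the pattern the principle predicts.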

    Finding Exogenous Variables in Data with Many More Variables than Observations

    Many statistical methods have been proposed to estimate causal models in classical situations with fewer variables than observations (p < n, where p is the number of variables and n the number of observations). However, modern datasets, including gene expression data, require high-dimensional causal modeling in challenging situations with orders of magnitude more variables than observations (p >> n). In this paper, we propose a method to find exogenous variables in a linear non-Gaussian causal model which requires much smaller sample sizes than conventional methods and works even when p >> n. The key idea is to identify which variables are exogenous based on non-Gaussianity instead of estimating the entire structure of the model. Exogenous variables work as triggers that activate a causal chain in the model, and their identification leads to more efficient experimental designs and better understanding of the causal mechanism. We present experiments with artificial data and real-world gene expression data to evaluate the method. Comment: A revised version of this was published in Proc. ICANN201
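    The chain-triggering intuition can be illustrated with a toy heuristic (a simplification for illustration, not the paper's estimator): in a linear model driven by non-Gaussian noise, an endogenous variable is a sum of several independent non-Gaussian terms and therefore tends to look more Gaussian than the exogenous variable that starts the chain. Ranking variables by a non-Gaussianity measure such as excess kurtosis then points at the exogenous one:

```python
import numpy as np

rng = np.random.default_rng(1)
n = 50_000

# Linear non-Gaussian chain: x1 -> x2 -> x3, uniform external influences.
x1 = rng.uniform(-1, 1, n)                    # exogenous, strongly non-Gaussian
x2 = 0.8 * x1 + rng.uniform(-1, 1, n)
x3 = 0.8 * x2 + rng.uniform(-1, 1, n)

def excess_kurtosis(x):
    z = (x - x.mean()) / x.std()
    return float((z ** 4).mean() - 3.0)

# Endogenous variables mix several uniforms, so their kurtosis drifts toward 0.
scores = {name: abs(excess_kurtosis(v))
          for name, v in {"x1": x1, "x2": x2, "x3": x3}.items()}
most_non_gaussian = max(scores, key=scores.get)  # expect "x1"
```

    Here the uniform root has excess kurtosis near -1.2, while the downstream variables sit closer to zero, so the ranking recovers the exogenous variable without estimating any edges.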

    Discovering granger-causal features from deep learning networks

    © Springer Nature Switzerland AG 2018. In this research, we propose deep networks that discover Granger causes from multivariate temporal data generated in financial markets. We introduce a Deep Neural Network (DNN) and a Recurrent Neural Network (RNN) that discover Granger-causal features for bivariate regression on bivariate time series data distributions. These features are subsequently used to discover Granger-causal graphs for multivariate regression on multivariate time series data distributions. Our supervised feature learning process in the proposed deep regression networks yields favourable F-tests for feature selection and t-tests for model comparisons. The experiments, minimizing root mean squared errors in the regression analysis on real stock market data obtained from Yahoo Finance, demonstrate that our causal features significantly improve the existing deep learning regression models.
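    The deep models above generalize the classical linear Granger test, which is easy to state: x Granger-causes y if adding lags of x to an autoregression of y significantly reduces the residual sum of squares. A minimal version of that baseline F-test (an illustration of the underlying idea, not the paper's networks):

```python
import numpy as np

rng = np.random.default_rng(2)
T, p = 500, 2  # series length and lag order

# Simulate x Granger-causing y: past x enters y's equation.
x = rng.normal(size=T)
y = np.zeros(T)
for t in range(2, T):
    y[t] = 0.5 * y[t - 1] + 0.8 * x[t - 1] + 0.1 * rng.normal()

def rss(design, target):
    beta, *_ = np.linalg.lstsq(design, target, rcond=None)
    resid = target - design @ beta
    return float(resid @ resid)

Y = y[p:]
# Restricted model: y's own lags plus an intercept.
own_lags = np.column_stack([y[p - 1:-1], y[p - 2:-2], np.ones(T - p)])
# Full model: additionally include lags of x.
full = np.column_stack([own_lags, x[p - 1:-1], x[p - 2:-2]])

rss_r, rss_f = rss(own_lags, Y), rss(full, Y)
q, dof = 2, T - p - full.shape[1]
F = ((rss_r - rss_f) / q) / (rss_f / dof)  # large F => reject "no Granger cause"
```

    In this simulation the x lags explain most of y's variance, so F is far beyond any conventional critical value; on real market data the same statistic is what the F-tests mentioned in the abstract assess.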

    Bioinformatics tools in predictive ecology: Applications to fisheries

    This article is made available through the Brunel Open Access Publishing Fund - Copyright © 2012 Tucker et al. There has been a huge effort in the advancement of analytical techniques for molecular biological data over the past decade. This has led to many novel algorithms that are specialized to deal with data associated with biological phenomena, such as gene expression and protein interactions. In contrast, ecological data analysis has remained focused to some degree on off-the-shelf statistical techniques, though this is starting to change with the adoption of state-of-the-art methods, where few assumptions can be made about the data and a more explorative approach is required, for example, through the use of Bayesian networks. In this paper, some novel bioinformatics tools for microarray data are discussed along with their ‘crossover potential’ with an application to fisheries data. In particular, a focus is placed on the development of models that identify functionally equivalent species in different fish communities, with the aim of predicting functional collapse.

    A common framework for learning causality

    Causality is a fundamental part of reasoning to model the physics of an application domain, to understand the behaviour of an agent or to identify the relationship between two entities. Causality occurs when an action is taken, and may also occur when two happenings come undeniably together. The study of causal inference aims at uncovering causal dependencies among observed data and at coming up with automated methods to find such dependencies. While there exists a broad range of principles and approaches involved in causal inference, in this position paper we argue that it is possible to unify different causality views under a common framework of symbolic learning. This work is supported by the Spanish MINECO project TIN2017-88476-C2-1-R. Diego Aineto is partially supported by the FPU16/03184 and Sergio Jimenez by the RYC15/18009, both programs funded by the Spanish government. Onaindia De La Rivaherrera, E.; Aineto, D.; Jiménez-Celorrio, S. (2018). A common framework for learning causality. Progress in Artificial Intelligence 7(4):351-357. https://doi.org/10.1007/s13748-018-0151-y

    The IBMAP approach for Markov networks structure learning

    In this work we consider the problem of learning the structure of Markov networks from data. We present an approach for tackling this problem, called IBMAP, together with an efficient instantiation of the approach: the IBMAP-HC algorithm, designed to avoid important limitations of existing independence-based algorithms. These algorithms proceed by performing statistical independence tests on data, completely trusting the outcome of each test. In practice, tests may be incorrect, resulting in potential cascading errors and a consequent reduction in the quality of the structures learned. IBMAP accounts for this uncertainty in the outcome of the tests through a probabilistic maximum-a-posteriori approach. The approach is instantiated in the IBMAP-HC algorithm, a structure selection strategy that performs a polynomial heuristic local search in the space of possible structures. We present an extensive empirical evaluation on synthetic and real data, showing that our algorithm significantly outperforms the current independence-based algorithms in terms of data efficiency and quality of learned structures, with equivalent computational complexity. We also show the performance of IBMAP-HC in a real-world application of knowledge discovery: EDAs, evolutionary algorithms that use structure learning at each generation to model the distribution of populations. The experiments show that when IBMAP-HC is used to learn the structure, EDAs improve their convergence to the optimum.
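    The core move, scoring whole structures by the joint probability of their implied independence assertions rather than trusting each test's hard accept/reject decision, can be sketched in a few lines (a toy with invented numbers, not the paper's actual scoring function or hill-climbing search):

```python
import math

# Each candidate structure implies a set of (in)dependence assertions, and
# each statistical test only supports its assertion with some probability.
# Instead of trusting every test outright, score a structure by the joint
# probability that all of its assertions hold, and take the maximum a
# posteriori candidate.

def structure_score(assertion_probs):
    """Log-probability that every assertion implied by a structure is true."""
    return sum(math.log(p) for p in assertion_probs)

candidates = {
    # Hypothetical structures with the per-assertion support their tests gave.
    "A-B  C":  [0.95, 0.60],   # one confident test, one borderline
    "A-B-C":   [0.95, 0.90],
    "A  B  C": [0.55, 0.60],
}
best = max(candidates, key=lambda s: structure_score(candidates[s]))
```

    A hard-threshold algorithm would accept the borderline 0.60 test for "A-B  C" as readily as the confident 0.95 one; the joint score instead lets the uncertain tests be outvoted, which is the cascading-error protection the abstract describes.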

    Modeling the Cognitive Task Load and Performance of Naval Operators

    Abstract. Operators on naval ships have to act in dynamic, critical and high-demand task environments. For these environments, a cognitive task load (CTL) model has been proposed as the foundation of three operator support functions: adaptive task allocation, cognitive aids and resource feedback. This paper presents the construction of such a model as a Bayesian network with probability relationships between CTL and performance. The network is trained and tested with two datasets: operator performance with an adaptive user interface in a lab setting, and operator performance on a high-tech sailing ship. The “Naïve Bayesian network” turned out to be the best choice, providing performance estimations with 86% and 74% accuracy for the lab and ship data, respectively. Overall, the resulting model generalizes nicely over the two datasets. It will be used to estimate operator performance under momentary CTL conditions, and to set the thresholds of the load-mitigation strategies for the three support functions.
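    A naïve Bayesian network is simply a class node whose feature children are conditionally independent given the class, so classification reduces to multiplying per-feature likelihoods. A minimal discrete sketch of that mechanism (the feature and label names are hypothetical, not taken from the paper):

```python
from collections import Counter, defaultdict

# Training samples: (discrete CTL features, performance label). The features
# here are invented stand-ins for whatever the paper's network conditions on.
train = [
    ({"load": "high", "switches": "many"}, "degraded"),
    ({"load": "high", "switches": "few"},  "degraded"),
    ({"load": "low",  "switches": "few"},  "ok"),
    ({"load": "low",  "switches": "many"}, "ok"),
    ({"load": "low",  "switches": "few"},  "ok"),
]

priors = Counter(label for _, label in train)
counts = defaultdict(Counter)            # (feature, label) -> value counts
for feats, label in train:
    for f, v in feats.items():
        counts[(f, label)][v] += 1

def predict(feats, alpha=1.0):
    """MAP label under the naive (conditional independence) assumption."""
    scores = {}
    for label, prior in priors.items():
        p = prior / len(train)
        for f, v in feats.items():
            c = counts[(f, label)]
            # Laplace smoothing; each toy feature has 2 possible values.
            p *= (c[v] + alpha) / (sum(c.values()) + alpha * 2)
        scores[label] = p
    return max(scores, key=scores.get)
```

    The same structure scales to the paper's setting by swapping in the real CTL features; training is just counting, which is one reason the naïve model transfers well across the lab and ship datasets.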

    Extracting causal rules from spatio-temporal data

    The final publication is available at Springer via http://dx.doi.org/10.1007/978-3-319-23374-1_2. This paper is concerned with the problem of detecting causality in spatio-temporal data. In contrast to most previous work on causality, we adopt a logical rather than a probabilistic approach. By defining the logical form of the desired causal rules, the algorithm developed in this paper searches for instances of rules of that form that explain as fully as possible the observations found in a data set. Experiments with synthetic data, where the underlying causal rules are known, show that in many cases the algorithm is able to retrieve close approximations to the rules that generated the data. However, experiments with real data concerning the movement of fish in a large Australian river system reveal significant practical limitations, primarily as a consequence of the coarse granularity of such movement data. In response, instead of focusing on strict causation (where an environmental event initiates a movement event), further experiments focused on perpetuation (where environmental conditions are the drivers of ongoing processes of movement). After being retasked to search for a different logical form of rules compatible with perpetuation, our algorithm was able to identify perpetuation rules that explain a significant proportion of the fish movements. For example, approximately one fifth of the detected long-range movements of fish over a period of six years were accounted for by 26 rules taking account of variations in water-level alone. EPSRC; Australian Research Council (ARC) under the Discovery Projects Scheme
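    The search-for-rule-instances idea can be caricatured in a few lines (a deliberately tiny toy with invented observations, not the paper's algorithm or data): propose candidate rules of a fixed logical form, measure how much of the observed movement each rule explains, and keep the well-supported ones.

```python
# Hypothetical observations: per time step, a water-level condition and
# whether a movement event was detected at the next step.
obs = [
    ("rising", True), ("rising", True), ("falling", False),
    ("stable", False), ("rising", True), ("falling", False),
    ("rising", False), ("stable", False),
]

def rule_support(condition):
    """Fraction of movement events explained by `condition -> movement`."""
    moved = [c for c, m in obs if m]
    return sum(c == condition for c in moved) / len(moved)

# Enumerate candidate rules of the fixed form and keep the supported ones.
rules = {c: rule_support(c) for c in {"rising", "falling", "stable"}}
kept = [c for c, s in rules.items() if s >= 0.5]
```

    Real rule languages (and the perpetuation variant) quantify over regions and intervals rather than single conditions, but the explain-as-much-as-possible selection criterion is the same shape as this support test.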