217,488 research outputs found

    Interpretable Self-Attention Temporal Reasoning for Driving Behavior Understanding

    Performing driving behaviors based on causal reasoning is essential to ensure driving safety. In this work, we investigated how state-of-the-art 3D Convolutional Neural Networks (CNNs) perform on classifying driving behaviors based on causal reasoning. We proposed a perturbation-based visual explanation method to inspect the models' performance visually. By examining the video attention saliency, we found that existing models could not precisely capture the causes (e.g., traffic light) of a specific action (e.g., stopping). Therefore, the Temporal Reasoning Block (TRB) was proposed and introduced to the models. With the TRB models, we achieved an accuracy of 86.3%, which outperforms the state-of-the-art 3D CNNs from previous works. The attention saliency also demonstrated that TRB helped models focus on the causes more precisely. With both numerical and visual evaluations, we concluded that our proposed TRB models were able to provide accurate driving behavior prediction by learning the causal reasoning of the behaviors. Comment: Submitted to IEEE ICASSP 2020; PyTorch code will be released soon.
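    The perturbation idea behind the paper's visual explanation can be sketched in a few lines: occlude one frame at a time and measure how much the target-class score drops. A minimal sketch, assuming a toy scalar-per-frame "clip" and an illustrative `toy_model` (not the authors' 3D-CNN setup):

```python
# Perturbation-based temporal saliency: score each frame by the drop in
# the target-class score when that frame is occluded with a baseline.
def temporal_saliency(model, clip, target_class, baseline=0.0):
    base = model(clip)[target_class]
    saliency = []
    for t in range(len(clip)):
        perturbed = list(clip)
        perturbed[t] = baseline          # occlude frame t
        saliency.append(base - model(perturbed)[target_class])
    return saliency

# Toy model (an assumption for illustration): all class-0 evidence, say a
# red traffic light, sits in frame 2, so a faithful explanation should
# single out that frame.
def toy_model(clip):
    return [clip[2], 0.0]

clip = [1.0, 1.0, 1.0, 1.0]              # per-frame intensity stand-ins
scores = temporal_saliency(toy_model, clip, target_class=0)
# frame 2 gets all the saliency mass; the other frames get none
```

    The same loop generalizes to spatial occlusion by masking patches instead of whole frames.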

    Adding Causal Relationships to DL-based Action Formalisms

    In the reasoning about actions community, causal relationships have been proposed as a possible approach for solving the ramification problem, i.e., the problem of how to deal with indirect effects of actions. In this paper, we show that causal relationships can be added to action formalisms based on Description Logics without destroying the decidability of the consistency and the projection problem.
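    The ramification problem the abstract refers to can be illustrated with a classic example: an action's direct effect triggers further, indirect effects through causal relationships. A minimal sketch (the switch/light domain and rule format are illustrative assumptions, not the paper's DL formalism):

```python
# Indirect effects via causal relationships: apply the action's direct
# effects, then propagate causal rules of the form (cause, effect)
# until a fixpoint is reached.
def apply_action(state, direct_effects, causal_rules):
    state = set(state) | set(direct_effects)
    changed = True
    while changed:                       # propagate ramifications
        changed = False
        for cause, effect in causal_rules:
            if cause in state and effect not in state:
                state.add(effect)
                changed = True
    return state

rules = [("switch_on", "circuit_closed"),
         ("circuit_closed", "light_on")]
result = apply_action({"dark"}, {"switch_on"}, rules)
# flipping the switch indirectly closes the circuit and turns on the light
```

    Here the action's formalized effect is only `switch_on`; `circuit_closed` and `light_on` are ramifications derived from the causal rules rather than stated in the action description.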

    REBA: A Refinement-Based Architecture for Knowledge Representation and Reasoning in Robotics

    This paper describes an architecture for robots that combines the complementary strengths of probabilistic graphical models and declarative programming to represent and reason with logic-based and probabilistic descriptions of uncertainty and domain knowledge. An action language is extended to support non-boolean fluents and non-deterministic causal laws. This action language is used to describe tightly-coupled transition diagrams at two levels of granularity, with a fine-resolution transition diagram defined as a refinement of a coarse-resolution transition diagram of the domain. The coarse-resolution system description, and a history that includes (prioritized) defaults, are translated into an Answer Set Prolog (ASP) program. For any given goal, inference in the ASP program provides a plan of abstract actions. To implement each such abstract action, the robot automatically zooms to the part of the fine-resolution transition diagram relevant to this action. A probabilistic representation of the uncertainty in sensing and actuation is then included in this zoomed fine-resolution system description, and used to construct a partially observable Markov decision process (POMDP). The policy obtained by solving the POMDP is invoked repeatedly to implement the abstract action as a sequence of concrete actions, with the corresponding observations being recorded in the coarse-resolution history and used for subsequent reasoning. The architecture is evaluated in simulation and on a mobile robot moving objects in an indoor domain, to show that it supports reasoning with violation of defaults, noisy observations and unreliable actions, in complex domains. Comment: 72 pages, 14 figures.
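    The coarse-to-fine control loop described above, plan abstract actions, then refine each one into concrete steps, can be caricatured in a few lines. A deliberately simplified sketch: BFS over rooms stands in for ASP inference at the coarse level, and a canned doorway routine stands in for the zoomed POMDP (all names and the domain are illustrative, not REBA's API):

```python
from collections import deque

def coarse_plan(start_room, goal_room, adjacency):
    # Coarse level: plan a path over rooms (stand-in for ASP inference).
    frontier, seen = deque([[start_room]]), {start_room}
    while frontier:
        path = frontier.popleft()
        if path[-1] == goal_room:
            return path
        for nxt in adjacency[path[-1]]:
            if nxt not in seen:
                seen.add(nxt)
                frontier.append(path + [nxt])

def refine(room_a, room_b, doorways):
    # Fine level: implement one abstract move as concrete steps
    # (stand-in for executing the zoomed POMDP's policy).
    return ["goto " + doorways[(room_a, room_b)], "cross into " + room_b]

adjacency = {"kitchen": ["hall"], "hall": ["kitchen", "office"], "office": ["hall"]}
doorways = {("kitchen", "hall"): "d1", ("hall", "office"): "d2"}
plan = coarse_plan("kitchen", "office", adjacency)
steps = [s for a, b in zip(plan, plan[1:]) for s in refine(a, b, doorways)]
```

    What the sketch omits is the heart of the architecture: the fine level is probabilistic, so each abstract action may take a variable number of noisy concrete steps, and the observations it produces feed back into the coarse-resolution history.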

    Bringing action language C+ to normative contexts: preliminary report

    C+ is an action language for specifying and reasoning about the effects of actions and the persistence of facts over time. Based on it, we present CN+, an operational enhanced form of C+ designed for representing complex normative systems and integrating them easily into the semantics of the causal theory of actions. The proposed system contains a particular formalization of norms using a life-cycle approach to capture the whole normative meaning of a complex normative framework. We discuss this approach and illustrate it with examples.
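    The life-cycle approach mentioned above treats a norm as an object that moves through states as events occur. A minimal sketch of such a life-cycle automaton (the state and event names are assumptions for illustration, not CN+ syntax):

```python
# Norm life-cycle as a small state machine: a norm is inactive until
# activated, and while active it can be fulfilled, violated, or
# deactivated. Irrelevant events leave the state unchanged.
TRANSITIONS = {
    ("inactive", "activation"):   "active",
    ("active",   "fulfilment"):   "fulfilled",
    ("active",   "violation"):    "violated",
    ("active",   "deactivation"): "inactive",
}

def step(state, event):
    return TRANSITIONS.get((state, event), state)

norm_state = "inactive"
for event in ["activation", "violation"]:
    norm_state = step(norm_state, event)
# the norm ends up violated: it was activated and then breached
```

    In an action-language setting, each transition would be expressed as a causal law triggered by the occurrence of the corresponding event.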

    A common framework for learning causality

    [EN] Causality is a fundamental part of reasoning to model the physics of an application domain, to understand the behaviour of an agent, or to identify the relationship between two entities. Causality occurs when an action is taken, and it may also occur when two happenings come undeniably together. The study of causal inference aims at uncovering causal dependencies among observed data and at devising automated methods to find such dependencies. While there exists a broad range of principles and approaches involved in causal inference, in this position paper we argue that it is possible to unify different causality views under a common framework of symbolic learning. This work is supported by the Spanish MINECO project TIN2017-88476-C2-1-R. Diego Aineto is partially supported by the FPU16/03184 and Sergio Jiménez by the RYC15/18009, both programs funded by the Spanish government. Onaindia De La Rivaherrera, E.; Aineto, D.; Jiménez-Celorrio, S. (2018). A common framework for learning causality. Progress in Artificial Intelligence 7(4), 351–357. https://doi.org/10.1007/s13748-018-0151-y
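    One symbolic-learning view of causality that the position paper builds on is learning action models from observed transitions. A toy sketch of inferring an action's add and delete effects from before/after state pairs, in the spirit of STRIPS action-model learning (the domain and set-based encoding are illustrative assumptions):

```python
# Learn the effects an action reliably causes by intersecting the
# observed changes across all of its recorded executions.
def learn_effects(transitions):
    adds, dels = None, None
    for before, after in transitions:
        a, d = after - before, before - after
        adds = a if adds is None else adds & a   # added in every observation
        dels = d if dels is None else dels & d   # deleted in every observation
    return adds, dels

# Two observed executions of the same "open door" action; the lamp's
# state is incidental and should not be learned as an effect.
obs = [({"door_closed"},            {"door_open"}),
       ({"door_closed", "lamp_on"}, {"door_open", "lamp_on"})]
adds, dels = learn_effects(obs)
# -> the action causes door_open and removes door_closed, nothing else
```

    Intersecting over observations is what filters correlation (the lamp happened to be on) from the action's actual causal effects, which is the kind of unification across causality views the paper argues for.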

    A Temporal Ontology for Reasoning about Actions

    In this paper, our work is devoted to a systematic study of theories of action using a logical formalism based on a first-order language augmented by operators whose main role is to facilitate the representation of causal and temporal relationships between actions and their effects, as well as between actions and events. In the formalisms of Allen and McDermott, we notice that the notions of past, present and future do not appear in the predicate Ecause: how, then, can one affirm that effects do not precede causes? To use the concept of temporality without being limited to intervals, we enrich our language with an operator defined on time-elements; our formalism thus avoids ambiguities such as an effect preceding its cause. The originality of this work lies in the proposal of a formalism based on equivalence classes. We also define an operator that allows us to represent the evolutions of the universe for various futures and pasts. These operators allow us to represent the types of reasoning that are prediction, explanation and planning. We propose a new ontology for the causal and temporal representation of actions and events; the ontology used in our formalism consists of facts, events, processes, causality, actions and planning.