
    Possible and Necessary Answer Sets of Possibilistic Answer Set Programs

    Answer set programming (ASP) and possibility theory can be combined to form possibilistic answer set programming (PASP), a framework for non-monotonic reasoning under uncertainty. Existing proposals view answer sets of PASP programs as weighted epistemic states, in which the strength with which different literals are believed to hold may vary. In contrast, in this paper we propose an approach in which epistemic states remain Boolean, but some epistemic states may be considered more plausible than others. A PASP program is then an incomplete description of these epistemic states, in which a certainty is associated with each rule and interpreted in terms of a necessity measure. The main contribution of this paper is the introduction of a new semantics for PASP, together with a study of the resulting complexity.
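One way to picture this reading is a minimal Python sketch (an illustration of the idea, not the paper's algorithm): fully certain rules are mandatory, less certain rules are optional, and a literal is *possible* if it holds in some resulting answer set and *necessary* if it holds in all of them. A naive forward-chaining closure over a toy positive program stands in for a real ASP solver, and the rule names (`rain`, `wet`, `cold`) are invented for the example.

```python
from itertools import chain, combinations

# Toy positive program: each rule is (head, body, certainty in (0, 1]).
# Rules with certainty 1 are mandatory; less certain rules are optional.
RULES = [
    ("wet",  ("rain",), 0.6),   # uncertain rule
    ("rain", (),        1.0),   # fact, fully certain
    ("cold", ("wet",),  1.0),
]

def closure(rules):
    """Forward-chain a set of definite rules to a fixpoint (the unique
    answer set of a positive program)."""
    facts = set()
    changed = True
    while changed:
        changed = False
        for head, body, _ in rules:
            if head not in facts and all(b in facts for b in body):
                facts.add(head)
                changed = True
    return frozenset(facts)

def possible_and_necessary(rules):
    """Enumerate every subset of the optional rules and collect the
    answer sets of the resulting programs."""
    certain  = [r for r in rules if r[2] == 1.0]
    optional = [r for r in rules if r[2] < 1.0]
    models = {closure(certain + list(sub))
              for sub in chain.from_iterable(
                  combinations(optional, k)
                  for k in range(len(optional) + 1))}
    possible  = frozenset().union(*models)       # holds in some model
    necessary = frozenset.intersection(*models)  # holds in every model
    return possible, necessary

possible, necessary = possible_and_necessary(RULES)
print(sorted(possible))   # ['cold', 'rain', 'wet']
print(sorted(necessary))  # ['rain']
```

Dropping the uncertain rule for `wet` removes `wet` and `cold` from one of the candidate epistemic states, so only `rain` survives as a necessary conclusion.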


    Using rules of thumb to repair inconsistent knowledge


    Knowledge modelling and exploitation for collaborative expertise processes

    Expertise processes are nowadays applied in many fields, and particularly in industry, to assess situations, understand problems, or anticipate risks. Positioned upstream of complex, ill-defined problems, they serve to build an understanding of those problems and thus facilitate decision-making. These processes have become so widespread that they have been the subject of a standard (NF X 50-110) and of a recommendation guide published in 2011 (FDX 50-046). They rely mainly on the formulation of hypotheses, held with some degree of doubt, by one or more experts. These hypotheses are then progressively validated or invalidated during the successive phases of the process against the available knowledge. The certainties attached to the hypotheses thus evolve over those phases and eventually yield a degree of certainty about the understanding of a problem, based on the hypotheses that remain valid. Although this approach to studying problems has been standardized, it lacks automatic or semi-automatic tools to assist domain experts during the exploratory phases of a problem. Moreover, this largely manual approach lacks appropriate mechanisms for managing the knowledge produced so that it is both understandable by humans and processable by machines. Before proposing solutions to these limitations of current expertise processes, a review of fundamental and applied work in logic, in knowledge representation for expertise and experience, and in collaborative intelligence was carried out to identify the technological building blocks of the proposed solutions.
An analysis of the NF X 50-100 standard was conducted to understand the characteristics of Expertise Processes and how they can be formally represented and used as experience feedback. A study of past expert reports on aircraft accidents was carried out to determine how such reports can be represented in a machine-readable format that is general, extensible, domain-independent, and shareable between systems. This thesis makes the following contributions to the expertise process: a knowledge formalization and a methodology for collaborative problem solving using hypotheses, illustrated with a case study drawn from a manufacturing problem in which a manufactured product was rejected by customers, together with inference mechanisms compatible with the proposed formal representation; a collaborative non-monotonic reasoning approach based on answer set programming and on uncertainty theory using belief functions; and a semantic, ontology-based representation of expert reports. First, these contributions enable a formal and systematic execution of Expertise Processes, with a human-centred motivation. Second, they support their use for appropriate processing with respect to essential properties such as traceability, transparency, non-monotonic reasoning, and uncertainty, taking into account human doubt and the limited knowledge of experts. Finally, they provide a human- and machine-readable semantic representation of the expertise carried out.
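The belief-function machinery the abstract alludes to can be illustrated with Dempster's rule of combination, the standard way to fuse two experts' uncertain opinions about a hypothesis. This is a hedged sketch, not taken from the thesis: the frame, the mass values, and the two-expert scenario are invented for the example, with mass on the full frame encoding an expert's doubt.

```python
# Frame of discernment: the hypothesis H either holds or does not.
FRAME = frozenset({"H", "not_H"})

def dempster(m1, m2):
    """Combine two mass functions over the same frame with Dempster's
    rule: multiply masses, pool them on set intersections, and
    renormalize by the total non-conflicting mass."""
    combined, conflict = {}, 0.0
    for a, wa in m1.items():
        for b, wb in m2.items():
            inter = a & b
            if inter:
                combined[inter] = combined.get(inter, 0.0) + wa * wb
            else:
                conflict += wa * wb
    if conflict >= 1.0:
        raise ValueError("total conflict: sources cannot be combined")
    return {s: w / (1.0 - conflict) for s, w in combined.items()}

# Expert 1 is 60% sure H holds and 40% undecided; Expert 2 is 70% / 30%.
expert1 = {frozenset({"H"}): 0.6, FRAME: 0.4}
expert2 = {frozenset({"H"}): 0.7, FRAME: 0.3}

m = dempster(expert1, expert2)
print(round(m[frozenset({"H"})], 2))  # combined belief in H: 0.88
print(round(m[FRAME], 2))             # remaining doubt: 0.12
```

Two concurring but doubtful experts reinforce each other: the combined certainty in the hypothesis exceeds either individual certainty, which matches the abstract's picture of hypothesis certainties evolving as the expertise process accumulates knowledge.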

    Epistemic extensions of answer set programming

    Answer Set Programming (ASP) is a declarative programming language based on the stable model semantics and geared towards solving complex combinatorial problems. The strength of ASP stems from the use of a non-monotonic operator. This operator allows us to retract previously made conclusions as new information becomes available. Similarly, in common-sense reasoning, we may arrive at conclusions based on the absence of information. When an animal is for example a bird, and we do not know that this bird is a penguin, we conclude that the bird can fly. When new knowledge becomes available (e.g. the bird is a penguin) we may need to retract conclusions. However, while ASP similarly allows us to revise knowledge, it is not an ideal framework to model common-sense reasoning. For example, in ASP we cannot model multi-context systems, where each context encodes a different aspect of the real world. Extensions of ASP have been proposed to model such multi-context systems, but the exact effect of communication on the overall expressiveness remains unclear. In addition, ASP lacks the means to easily model and reason about uncertain information. While extensions of ASP have been proposed to deal with uncertainty, namely Possibilistic Answer Set Programming (PASP), there are contexts in which the current semantics for PASP lead to unintuitive results. In this thesis we address these issues in the following ways.
Firstly, we introduce Communicating Answer Set Programming (CASP), which is a framework that allows us to study the formal properties of communication and the complexity of the resulting system in ASP. It is based on an extension of ASP in which we consider a network of ordinary ASP programs. These communicating programs are extended with a new kind of literal based on the notion of asking questions. As such, one ASP program can conceptually query another program as to whether it believes some literal to be true or not, i.e. they can communicate. For the least complex variant of ASP, simple programs, it is shown that the addition of this easy form of communication allows us to move one step up in the polynomial hierarchy. Furthermore, we modify the communication mechanism to also allow us to focus on a sequence of communicating programs, where each program in the sequence may successively remove some of the remaining models. This mimics a sequence of leaders, where the first leader has the first say and may remove models that he or she finds unsatisfactory. Using this particular communication mechanism allows us to capture the entire polynomial hierarchy.
Secondly, we show how semantics for PASP can be defined in terms of constraints on possibility distributions. These new semantics adhere to a different intuition for negation-as-failure than current work on PASP to avoid unintuitive conclusions in specific settings. In addition, since ASP is a special case of PASP in which all the rules are entirely certain, we obtain a new characterization of ASP in terms of constraints on possibility distributions. This allows us to uncover a new form of disjunction, called weak disjunction, that has not been previously considered in the ASP literature. When examining the complexity of weak disjunction we unearth that, while the complexity of most reasoning tasks coincides with disjunction in ordinary ASP, some decision problems are easier.
Thirdly, we highlight how the weight attached to a rule in PASP can be interpreted in different ways. On the one hand, the weight can reflect the certainty with which we can conclude the head of a rule when its body is satisfied. This corresponds with how the weight is understood when defining semantics for PASP in terms of constraints on possibility distributions. On the other hand, the weight can reflect the certainty that the rule itself is correct. ASP programs with incorrect rules may have erroneous conclusions, but due to the non-monotonic nature of ASP, omitting a correct rule may also lead to errors. To derive the most certain conclusions from an uncertain ASP program, we thus need to consider all situations in which some, none, or all of the least certain rules are omitted. This corresponds to treating some rules as optional and reasoning about which conclusions remain valid regardless of the inclusion of these optional rules. Semantics for PASP are introduced based on this idea and it is shown that some interesting problems in Artificial Intelligence can be expressed in terms of optional rules. For both CASP and the new semantics for PASP we show that most of the concepts that we introduced can be simulated using classical ASP. This provides us with implementations of these concepts and furthermore allows us to benefit from the performance of state-of-the-art ASP solvers.
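The bird/penguin default in the abstract can be sketched in a few lines of Python (an illustrative toy, not the thesis's ASP machinery): the rule "birds fly unless known to be penguins" is applied by negation-as-failure, so conclusions depend on the *absence* of information and are retracted when new facts arrive.

```python
# Facts are (predicate, individual) pairs. The default rule
# "flies(X) :- bird(X), not penguin(X)" is evaluated by
# negation-as-failure: 'not penguin(X)' succeeds whenever
# penguin(X) is not currently provable.
def conclusions(facts):
    """Apply the default 'birds fly unless known to be penguins'."""
    derived = set(facts)
    for x in {a for (p, a) in facts if p == "bird"}:
        if ("penguin", x) not in facts:   # not provably a penguin
            derived.add(("flies", x))
    return derived

kb = {("bird", "tweety")}
assert ("flies", "tweety") in conclusions(kb)      # a bird, so it flies

kb.add(("penguin", "tweety"))                      # new information
assert ("flies", "tweety") not in conclusions(kb)  # conclusion retracted
```

Adding a fact removes a conclusion, which is exactly the non-monotonic behaviour that classical logic cannot express and that the stable model semantics formalizes.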