
    Multi-attribute decision making with weighted description logics

    We introduce a decision-theoretic framework based on Description Logics (DLs), which can be used to encode and solve single-stage multi-attribute decision problems. In particular, we treat the background knowledge as a DL knowledge base in which each attribute is represented by a concept, weighted by a utility value asserted by the user. This yields a compact representation of preferences over attributes. Moreover, we represent choices as knowledge base individuals, and induce a ranking via the aggregation of the weights of the attributes they satisfy. We discuss the benefits of the approach from a decision-theoretic point of view. Furthermore, we introduce an implementation of the framework as a Protégé plugin called uDecide. The plugin takes an ontology as input background knowledge, and returns the choices consistent with the preferences encoded in the knowledge base. We describe a use case with data from DBpedia, and provide empirical results on how its performance scales with the size of the ontology, using the reasoner Konclude.
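
    The ranking idea described above can be illustrated with a short, self-contained sketch of the aggregation step only. The attribute weights, attribute sets, and choice names below are invented for illustration; the actual uDecide plugin obtains attribute membership from a DL reasoner such as Konclude rather than from hand-written sets.

        # Minimal sketch of the weighted-attribute aggregation described above.
        # Weights, attributes, and choices are illustrative assumptions.
        weights = {"NearCityCenter": 3.0, "HasPool": 1.5, "PetFriendly": 2.0}

        # Choices modeled as knowledge base individuals, each with the set of
        # attribute concepts they are inferred to satisfy.
        choices = {
            "hotel_a": {"NearCityCenter", "HasPool"},
            "hotel_b": {"PetFriendly"},
            "hotel_c": {"NearCityCenter", "PetFriendly"},
        }

        def score(attrs):
            """Aggregate the utility weights of the satisfied attributes."""
            return sum(weights.get(a, 0.0) for a in attrs)

        # Rank choices by aggregated utility, best first.
        ranking = sorted(choices, key=lambda c: score(choices[c]), reverse=True)
        print(ranking)  # ['hotel_c', 'hotel_a', 'hotel_b']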

    Case-Based Knowledge and Planning

    "Case-Based Decision Theory" is a theory of decision making under uncertainty, suggesting that people tend to choose acts that performed well in similar cases they recall. The theory has been developed from a decision-/game-/economic-theoretical point of view, as a potential alternative to expected utility theory. In this paper we attempt to re-consider CBDT as a theory of knowledge representation and of planning, to contrast it with the rule-based approach, and to study its implications regarding the process of induction.

    Case-Based Knowledge Representation

    The representation of knowledge in terms of rules is fraught with theoretical problems, such as the justification of induction, the "right" way to do it, and the revision of knowledge in the face of contradictions. In this paper we argue that these problems, and especially the inconsistency of "knowledge," are partly due to the fact that we pretend to know what in fact cannot be known. Rather than coping with the problems that explicit induction raises, we suggest avoiding it. Instead of formulating rules which we supposedly "know," we may make do with the knowledge of actual cases from our experience. Starting from this viewpoint, we continue to derive Case-Based Decision Theory (CBDT), and propose it as a less ambitious, yet less problematic, theory of knowledge representation. CBDT deals with decision making under uncertainty, and can be viewed as performing implicit induction, that is, as using past experience to make decisions without resorting to the explicit formulation of rules. We discuss two levels on which implicit induction takes place, and the corresponding two roles that "rules" may have in case-based decision making. We also discuss the process of learning and the concept of "expertise" as they are reflected in our model.

    Effective retrieval and new indexing method for case based reasoning: Application in chemical process design

    In this paper we try to improve the retrieval step of case-based reasoning for preliminary design. This improvement concerns three major parts of our CBR system. First, in the preliminary design step, some uncertainties such as imprecise or unknown values remain in the description of the problem, because they require a deeper analysis to be resolved. To deal with this issue, the description of the problem at hand is softened using fuzzy set theory. Features are described with a central value, a percentage of imprecision, and a relation with respect to the central value. These additional data allow us to build a domain of possible values for each attribute. This representation affects the calculation of the similarity function, so the characteristic function is used to compute the local similarity between two features. Second, we focus our attention on the main goal of the retrieval step in CBR: finding relevant cases for adaptation. In this second part, we discuss the assumption of similarity used to find the most appropriate case. We highlight that in some situations this classical similarity must be enriched with further knowledge to facilitate case adaptation. To avoid failure during the adaptation step, we implement a method that couples the similarity measure with an adaptability measure, in order to approximate the utility of cases more accurately. The latter provides deeper information for reusing cases. In the last part, we present a generic indexing technique for the case base, and a new algorithm for retrieving relevant cases from memory. The sphere indexing algorithm is a domain-independent index whose performance is equivalent to that of decision trees. Its main strength is that it places the current problem at the center of the search area, avoiding boundary issues. All these points are discussed and exemplified through the preliminary design of a chemical engineering unit operation.
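
    The fuzzy feature encoding and the local similarity step can be illustrated with a small sketch. The (central value, imprecision percentage, relation) triple follows the abstract, but the concrete interval construction and the crisp characteristic function below are assumptions made for illustration, not the paper's exact formulation.

        def possible_domain(central, imprecision_pct, relation):
            """Build the domain of possible values for one problem feature."""
            delta = central * imprecision_pct / 100.0
            if relation == "about":       # value lies around the central value
                return (central - delta, central + delta)
            if relation == "at_most":     # value is at most the central value
                return (central - delta, central)
            if relation == "at_least":    # value is at least the central value
                return (central, central + delta)
            raise ValueError(f"unknown relation: {relation}")

        def local_similarity(problem_feature, case_value):
            """Characteristic function: 1 if the case value falls inside the
            problem feature's domain of possible values, 0 otherwise."""
            low, high = possible_domain(*problem_feature)
            return 1.0 if low <= case_value <= high else 0.0

        # Problem feature: temperature around 350 K with 5% imprecision.
        problem_temp = (350.0, 5.0, "about")
        print(local_similarity(problem_temp, 360.0))  # 1.0 (inside [332.5, 367.5])
        print(local_similarity(problem_temp, 400.0))  # 0.0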

    Rule-based Shield Synthesis for Partially Observable Monte Carlo Planning

    Partially Observable Monte-Carlo Planning (POMCP) is a powerful online algorithm able to generate approximate policies for large Partially Observable Markov Decision Processes. The online nature of this method supports scalability by avoiding complete policy representation. The lack of an explicit representation, however, hinders policy interpretability and makes policy verification very complex. In this work, we propose two contributions. The first is a method for identifying unexpected actions selected by POMCP with respect to expert prior knowledge of the task. The second is a shielding approach that prevents POMCP from selecting unexpected actions. The first method is based on Maximum Satisfiability Modulo Theory (MAX-SMT). It inspects traces (i.e., sequences of belief-action-observation triplets) generated by POMCP to compute the parameters of logical formulas about policy properties defined by the expert. The second contribution is a module that uses the logical formulas online to identify anomalous actions selected by POMCP and substitutes them with actions that satisfy the logical formulas, thus fulfilling the expert knowledge. We evaluate our approach in two domains. Results show that the shielded POMCP outperforms the standard POMCP in a case study in which a wrong parameter of POMCP makes it select wrong actions from time to time.
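
    The trace-inspection step can be illustrated with a deliberately simplified stand-in for MAX-SMT: given a rule template defined by the expert, choose the free parameter that is consistent with as many trace steps as possible. The rule ("if the belief of an obstacle exceeds a threshold, go slow"), the trace, and the brute-force search below are assumptions for illustration only; the paper uses a MAX-SMT solver over belief-action-observation traces.

        # Trace steps: (believed probability of an obstacle, action chosen by POMCP).
        trace = [
            (0.05, "fast"), (0.10, "fast"), (0.40, "slow"),
            (0.55, "slow"), (0.70, "slow"), (0.30, "fast"),
        ]

        def satisfied(threshold, belief, action):
            """Expert rule template: if belief of obstacle > threshold, go slow."""
            return action == "slow" if belief > threshold else True

        def fit_threshold(trace, candidates):
            """Pick the parameter value satisfying the most trace steps."""
            return max(
                candidates,
                key=lambda t: sum(satisfied(t, b, a) for b, a in trace),
            )

        best = fit_threshold(trace, [i / 100 for i in range(0, 101, 5)])
        print(best)  # 0.3 for this trace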

    Rule-based shielding for Partially Observable Monte-Carlo Planning

    Partially Observable Monte-Carlo Planning (POMCP) is a powerful online algorithm able to generate approximate policies for large Partially Observable Markov Decision Processes. The online nature of this method supports scalability by avoiding complete policy representation. The lack of an explicit representation, however, hinders policy interpretability and makes policy verification very complex. In this work, we propose two contributions. The first is a method for identifying unexpected actions selected by POMCP with respect to expert prior knowledge of the task. The second is a shielding approach that prevents POMCP from selecting unexpected actions. The first method is based on Satisfiability Modulo Theory (SMT). It inspects traces (i.e., sequences of belief-action-observation triplets) generated by POMCP to compute the parameters of logical formulas about policy properties defined by the expert. The second contribution is a module that uses the logical formulas online to identify anomalous actions selected by POMCP and substitutes them with actions that satisfy the logical formulas, thus fulfilling the expert knowledge. We evaluate our approach on Tiger, a standard benchmark for POMDPs, and on a real-world problem related to velocity regulation in mobile robot navigation. Results show that the shielded POMCP outperforms the standard POMCP in a case study in which a wrong parameter of POMCP makes it select wrong actions from time to time. Moreover, we show that the approach maintains good performance even when the parameters of the logical formula are optimized using trajectories containing some wrong actions.
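
    The online shielding module can be sketched as a thin wrapper around action selection: before executing the action proposed by POMCP, check it against the expert rule and substitute a compliant action if the rule is violated. The rule, the threshold value, and the action names below are illustrative assumptions loosely matching the velocity-regulation use case, not the paper's exact implementation.

        THRESHOLD = 0.3  # e.g., fitted offline from traces as sketched earlier

        def shield(belief_obstacle, proposed_action):
            """Return a rule-compliant action, overriding POMCP when needed."""
            rule_violated = belief_obstacle > THRESHOLD and proposed_action == "fast"
            return "slow" if rule_violated else proposed_action

        # The shield wraps the planner's action selection at run time.
        print(shield(0.10, "fast"))  # 'fast'  (rule allows the proposed action)
        print(shield(0.65, "fast"))  # 'slow'  (anomalous action substituted)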

    The Solidarity Manifesto: A New Network for Future Change

    Colonialism is a scheme of standpoints: colonizer versus colonized, West versus East, good versus bad. When put in the foreground, the value of what we see heavily relies on our perspective and knowledge. When learning to dissect, deconstruct, and decolonize spaces, we need to start utilizing decolonial thought as a historical tool rather than a true depiction of reality. Decolonizing spaces and recognizing Western colonization practices means challenging the normative structures in colonial history, thus breaking the cycle of oppression through building community and fostering solidarity. Drawing on theories exploring access to public spheres, representation, protection, permanence, cultural displacement, and the creation of cross-cultural ecosystems, this study pays particular attention to the (dis)connection between global policy processes and local initiatives through a decolonial feminist lens. Prescribing the need for decolonial discourses to help bridge the gap between the literary and physical spaces that inform decision-making bodies today, this thesis places emphasis on Françoise Vergès' A Decolonial Feminism and A Feminist Theory of Violence: A Decolonial Perspective to inform solidarity-centered approaches to future change in policy making. Through a decolonial case study analysis of the Italian occupation of Libya, the exclusive power of language, and observations of NGO work at the United Nations, and by proposing the Solidarity Model based on accountability and representation, the aim of this study is to deconstruct current systems and their discourses to explore future international networks based on human solidarity.

    Re-Inventing the Public Sphere: Critical Theory, Social Responsibility, Schools, and the Press.

    This study examines the contemporary discourses of journalism and pedagogy from the standpoint of critical theory to assess the impact of technocratic rationality and instrumental logic on the practices of communication and education. It is premised on the observation that, spurred by the imperatives of trans-national capital accumulation, privatization inimical to democratic interests has begun to colonize public education. The study represents an effort to reactivate a concept and rhetoric of social responsibility that would animate a project of reclaiming cultural space to be occupied by a public sphere, in a struggle analogous to that waged against feudalism and monarchical Divine Right. The study argues that communication and education, the essential minima of language, are the basic elements of all cultural development. It makes the case that, by deploying artificial antinomies, the education and communication techno-bureaucracy conceals fundamental similarities between the projects of journalism and pedagogy at the levels of both theory and practice--with respect to their complementary roles in enabling citizen participation and appropriating social knowledge in democratic culture--in order to better facilitate reproduction of dominant corporatist ideologies. Taking as the paradigm case the U.S. Supreme Court's 1988 decision in the matter of Hazelwood School District v. Kuhlmeier, the study applies a Foucauldian analytic to evaluate both the Court's decision and responses to it in mainstream press editorials, press industry trade and association periodicals, and journalism reviews. It finds mainstream acceptance on the grounds of its representation of real-world conditions, equivocal balance in the trades, and resistance themes in the reviews. The study then thematizes the operation of techno-bureaucratic rationality in the decline of the bourgeois public sphere, and responds to critics who have disparaged social responsibility theory. Finally, it argues for the relevance of such a theory, and explores its implications as a rationale for educational praxis based on the public sphere as a counterpoise to the hegemony of state corporatism. Suggestions are proffered for further research on the impact potential of desk-top publishing installed in communities, condominium-style, and prepared for by teaching journalistic praxis for a democratic local press.

    A Graph-Based Modeling Framework for Tracing Hydrological Pollutant Transport in Surface Waters

    Anthropogenic pollution of hydrological systems affects diverse communities and ecosystems around the world. Data analytics and modeling tools play a key role in fighting this challenge, as they can help identify key sources as well as trace transport and quantify impact within complex hydrological systems. Several tools exist for simulating and tracing pollutant transport throughout surface waters using detailed physical models; these tools are powerful, but can be computationally intensive, require significant amounts of data to be developed, and require expert knowledge for their use (ultimately limiting application scope). In this work, we present a graph modeling framework, which we call HydroGraphs, for understanding pollutant transport and fate across waterbodies, rivers, and watersheds. This framework uses a simplified representation of hydrological systems that can be constructed purely from open-source data (the National Hydrography Dataset and the Watershed Boundary Dataset). The graph representation provides a flexible, intuitive approach for capturing connectivity, identifying upstream pollutant sources, and tracing downstream impacts within small and large hydrological systems. Moreover, the graph representation can facilitate the use of advanced algorithms and tools from graph theory, topology, optimization, and machine learning to aid data analytics and decision-making. We demonstrate the capabilities of our framework using case studies in the State of Wisconsin; here, we aim to identify upstream nutrient pollutant sources that arise from agricultural practices and trace downstream impacts to waterbodies, rivers, and streams. Our tool ultimately seeks to help stakeholders design effective pollution prevention/mitigation practices and evaluate how surface waters respond to such practices.
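
    The upstream/downstream tracing enabled by the graph representation can be illustrated with a small networkx sketch. The node names and edges below are invented for illustration; HydroGraphs builds the actual graph from the National Hydrography Dataset and Watershed Boundary Dataset, with edges oriented in the direction of flow.

        import networkx as nx

        G = nx.DiGraph()
        # Edges point in the direction of flow (upstream node -> downstream node).
        G.add_edges_from([
            ("farm_field_1", "creek_a"),
            ("farm_field_2", "creek_a"),
            ("creek_a", "river_main"),
            ("creek_b", "river_main"),
            ("river_main", "lake_outlet"),
        ])

        # All upstream areas that can contribute pollutants to a waterbody.
        upstream_sources = nx.ancestors(G, "river_main")
        # All waterbodies affected downstream of a pollutant source.
        downstream_impacts = nx.descendants(G, "farm_field_1")

        print(sorted(upstream_sources))    # ['creek_a', 'creek_b', 'farm_field_1', 'farm_field_2']
        print(sorted(downstream_impacts))  # ['creek_a', 'lake_outlet', 'river_main']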