
    A Comparison of Axiomatic Approaches to Qualitative Decision Making Using Possibility Theory

    A longer version of this paper is available from KU ScholarWorks as Giang, P. H. and P. P. Shenoy, "Two Axiomatic Approaches to Decision Making Using Possibility Theory," European Journal of Operational Research, Vol. 162, No. 2, 2005, pp. 450--467. In this paper we analyze two recent axiomatic approaches, proposed by Dubois et al. and by Giang and Shenoy respectively, for qualitative decision making where uncertainty is described by possibility theory. Both axiomatizations are inspired by von Neumann and Morgenstern's system of axioms for the case of probability theory. We show that our approach naturally unifies the two axiomatic systems that correspond, respectively, to the pessimistic and optimistic decision criteria proposed by Dubois et al. The simplifying unification is achieved by (i) replacing the axioms that are supposed to reflect two informational attitudes (uncertainty aversion and uncertainty attraction) with an axiom that imposes an order on the set of standard lotteries, and (ii) using a binary utility scale in which each utility level is represented by a pair of numbers. The research was supported by a grant from the School of Business PhD Summer Research Fund to both authors.
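As an illustration of the pair-valued scale mentioned in point (ii), the following sketch follows the max-min convention used in Giang and Shenoy's qualitative utility work; the function name, the lottery encoding, and the exact convention are assumptions for illustration, not reproduced from the paper.

```python
def binary_utility(lottery):
    """Qualitative possibilistic utility on a binary (pair-valued) scale.

    lottery: list of (pi, (lam, mu)) pairs, where pi is the possibility
    degree of an outcome and (lam, mu) its utility level, normalised so
    that max(lam, mu) = 1. Higher lam and lower mu is better.
    """
    lam = max(min(pi, l) for pi, (l, _) in lottery)
    mu = max(min(pi, m) for pi, (_, m) in lottery)
    return (lam, mu)


# A lottery giving the best outcome (1, 0) with full possibility and the
# worst outcome (0, 1) with possibility 0.4 evaluates to (1.0, 0.4).
print(binary_utility([(1.0, (1.0, 0.0)), (0.4, (0.0, 1.0))]))
```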

    A Process Modelling Framework Based on Point Interval Temporal Logic with an Application to Modelling Patient Flows

    This thesis considers an application of a temporal theory to describing and modelling the patient journey in the hospital accident and emergency (A&E) department. The aim is to introduce a generic yet dynamic method applicable to any setting, including healthcare. Constructing a consistent process model can be instrumental in streamlining healthcare issues. Current process modelling techniques used in healthcare, such as flowcharts, the unified modelling language activity diagram (UML AD), and business process modelling notation (BPMN), are intuitive but imprecise. They cannot fully capture the complexities of the types of activities and the full extent of temporal constraints to an extent where one could reason about the flows. Formal approaches such as Petri nets have also been reviewed to investigate their applicability to modelling processes in the healthcare domain. Additionally, current modelling standards offer no formal mechanism for scheduling patient flows, so healthcare relies on the critical path method (CPM) and the program evaluation and review technique (PERT), which also have limitations, e.g. the finish-start barrier. It is imperative to specify the temporal constraints between the start and/or end of a process, e.g. that the beginning of a process A precedes the start (or end) of a process B. However, these approaches fail to provide a mechanism for handling such temporal situations. A formal representation, if provided, can assist in effective knowledge representation and quality enhancement concerning a process. It would also help uncover the complexities of a system and assist in modelling it in a consistent way, which is not possible with the existing modelling techniques. The above issues are addressed in this thesis by proposing a framework, based on point interval temporal logic (PITL), which treats points and intervals as primitives and provides a knowledge base for modelling patient flows accurately.
    These objects constitute the knowledge base for the formal description of a system. With the aid of the inference mechanism of the temporal theory presented here, the exhaustive temporal constraints derived from the proposed axiomatic system's components serve as a knowledge base. The proposed methodological framework adopts a model-theoretic approach in which a theory is developed and considered as a model, while the corresponding instance is considered as its application. Using this approach assists in identifying the core components of the system and their precise operation, representing a real-life domain suited to the process modelling issues specified in this thesis. Thus, I have evaluated the modelling standards for their most-used terminologies and constructs to identify their key components. This also assists in the generalisation of the critical terms of the process modelling standards based on their ontology. The proposed set of generalised terms serves as an enumeration of the theory and subsumes the core modelling elements of the process modelling standards. The catalogue presents a knowledge base for the business and healthcare domains, and its components are formally defined (semantics). Furthermore, a resolution-based theorem proof is used to show the structural features of the theory (model) and to establish that it is sound and complete. After establishing that the theory is sound and complete, the next step is to provide an instantiation of the theory. This is achieved by mapping the core components of the theory to their corresponding instances. Additionally, a formal graphical tool termed the point graph (PG) is used to visualise the cases of the proposed axiomatic system. The PG facilitates modelling and scheduling patient flows, and enables the analysis of existing models for possible inaccuracies and inconsistencies, supported by a reasoning mechanism based on PITL.
    Following that, a transformation is developed to map the core modelling components of the standards into the extended PG (PG*), based on the semantics presented by the axiomatic system. A real-life case, from the trauma patient pathway of the King's College Hospital accident and emergency (A&E) department, is considered to validate the framework. It is divided into three patient flows that depict the journey of a patient with significant trauma: arriving at A&E, undergoing a procedure, and subsequently being discharged. The department's staff relied upon UML AD and BPMN to model the patient flows. An evaluation of their representation is presented to show the shortfalls of the modelling standards in modelling patient flows. The last step is to model these patient flows using the developed approach, which is supported by enhanced reasoning and scheduling.
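A minimal illustration of the kind of point-based reasoning the abstract describes: represent each process by its start and end points, record strict precedence constraints between points, and check the model for consistency by testing the precedence graph for cycles. This is a sketch under assumed names, not the thesis's actual PITL inference mechanism.

```python
from collections import defaultdict


def consistent(points, precedes):
    """Check a set of strict precedence constraints for consistency.

    points: iterable of time-point names (e.g. starts/ends of processes).
    precedes: list of (p, q) pairs meaning "p is strictly before q".
    The constraints are consistent iff the precedence graph is acyclic,
    which Kahn's algorithm detects: a full topological order exists
    exactly when every point can be scheduled.
    """
    graph = defaultdict(list)
    indeg = {p: 0 for p in points}
    for p, q in precedes:
        graph[p].append(q)
        indeg[q] += 1
    queue = [p for p in points if indeg[p] == 0]
    seen = 0
    while queue:
        p = queue.pop()
        seen += 1
        for q in graph[p]:
            indeg[q] -= 1
            if indeg[q] == 0:
                queue.append(q)
    return seen == len(indeg)


# A toy A&E flow: triage must finish before treatment starts.
pts = ["triage.start", "triage.end", "treat.start", "treat.end"]
flow = [("triage.start", "triage.end"),
        ("triage.end", "treat.start"),
        ("treat.start", "treat.end")]
print(consistent(pts, flow))                                   # consistent
print(consistent(pts, flow + [("treat.end", "triage.start")]))  # cyclic
```

Adding the constraint that treatment ends before triage starts closes a cycle, so the second call reports an inconsistent model.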

    Mathematics as the role model for neoclassical economics (Blanqui Lecture)

    Born out of the conscious effort to imitate mechanical physics, neoclassical economics ended up in the mid 20th century embracing a purely mathematical notion of rigor as embodied by the axiomatic method. This lecture tries to explain how this could happen, or why and when the economists' role model became the mathematician rather than the physicist. According to the standard interpretation, the triumph of axiomatics in modern neoclassical economics can be explained in terms of the discipline's increasing awareness of its lack of good experimental and observational data, and thus of its intrinsic inability to fully abide by the paradigm of mechanics. Yet this story fails to properly account for the transformation that the word "rigor" itself underwent, first and foremost in mathematics, as well as for the existence of a specific motivation behind the economists' decision to pursue the axiomatic route. While the full argument is developed in Giocoli 2003, these pages offer a taste of a (partially) alternative story which begins with the so-called formalist revolution in mathematics, then crosses the economists' almost innate urge to bring their discipline to the highest possible level of generality and conceptual integrity, and ends with the advent and consolidation of the core set of methods, tools and ideas that constitute the contemporary image of economics.

    Keywords: axiomatic method, formalism, rationality, neoclassical economics

    Compensatory Transfers in Two-Player Decision Problems

    This paper presents an axiomatic characterization of a family of solutions to two-player quasi-linear social choice problems. In these problems the players select a single action from a set available to them. They may also transfer money between themselves. The solutions form a one-parameter family, where the parameter is a nonnegative number, t. The solutions can be interpreted as follows: Any efficient action can be selected. Based on this action, compute for each player a "best claim for compensation". A claim for compensation is the difference between the value of an alternative action and that of the selected efficient action, minus a penalty proportional to the extent to which the alternative action is inefficient. The coefficient of proportionality of this penalty is t. The best claim for compensation for a player is the maximum of this computed claim over all possible alternative actions. The solution, at the parameter value t, is to implement the chosen efficient action and make a monetary transfer equal to the average of these two best claims. The characterization relies on three main axioms. The paper presents and justifies these axioms and compares them to related conditions used in other bargaining contexts. In Nash Bargaining Theory, the axioms analogous to these three are in conflict with each other. In contrast, in the quasi-linear social choice setting of this paper, all three conditions can be satisfied simultaneously.
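The computation the abstract describes can be sketched directly. The value functions v1, v2 and all names below are hypothetical, and "a monetary transfer equal to the average of these two best claims" is read here as a net payment of (c1 - c2) / 2 to player 1; the abstract itself does not fix the transfer's sign convention.

```python
def compensatory_solution(v1, v2, actions, t):
    """Sketch of the one-parameter family of solutions described above.

    v1, v2: dicts mapping each action to that player's quasi-linear value.
    actions: list of available actions.
    t: the nonnegative penalty coefficient parameterising the family.
    """
    # Joint surplus of each action; an efficient action maximises it.
    surplus = {a: v1[a] + v2[a] for a in actions}
    a_star = max(actions, key=lambda a: surplus[a])

    def best_claim(v):
        # claim(a) = gain from switching to alternative a, minus a
        # penalty of t times the inefficiency of a. The selected action
        # itself yields a claim of 0, so best claims are nonnegative.
        return max((v[a] - v[a_star]) - t * (surplus[a_star] - surplus[a])
                   for a in actions)

    c1, c2 = best_claim(v1), best_claim(v2)
    transfer_to_1 = (c1 - c2) / 2.0  # negative: player 1 pays player 2
    return a_star, c1, c2, transfer_to_1
```

For example, with v1 = {"x": 4, "y": 0}, v2 = {"x": 0, "y": 3} and t = 1, action "x" is efficient; player 1's best claim is 0 and player 2's is (3 - 0) - 1 * (4 - 3) = 2, so player 1 pays player 2 one unit.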

    From Wald to Savage: homo economicus becomes a Bayesian statistician

    Bayesian rationality is the paradigm of rational behavior in neoclassical economics. A rational agent in an economic model is one who maximizes her subjective expected utility and consistently revises her beliefs according to Bayes's rule. The paper raises the question of how, when and why this characterization of rationality came to be endorsed by mainstream economists. Though no definitive answer is provided, it is argued that the question is far from trivial and of great historiographic importance. The story begins with Abraham Wald's behaviorist approach to statistics and culminates with Leonard J. Savage's elaboration of subjective expected utility theory in his 1954 classic The Foundations of Statistics. It is the latter's acknowledged failure to achieve its planned goal, namely the reinterpretation of traditional inferential techniques along subjectivist and behaviorist lines, which raises the puzzle of how a failed project in statistics could turn into such a tremendous hit in economics. A couple of tentative answers are also offered, involving the role of the consistency requirement in neoclassical analysis and the impact of the postwar transformation of US business schools.

    Keywords: Savage, Wald, rational behavior, Bayesian decision theory, subjective probability, minimax rule, statistical decision functions, neoclassical economics

    (WP 2014-03) Bounded Rationality and Bounded Individuality

    This paper argues that since the utility function conception of the individual is derived from standard rationality theory, the view that rationality is bounded suggests that individuality should also be seen as bounded. The meaning of this idea is developed in terms of two ways in which individuality can be said to be bounded, with one bound associated with Kahneman and Tversky's prospect theory and the 'new' behavioral economics, and a second bound associated with Simon's evolutionary thinking and the 'old' behavioral economics. The paper then shows how different bounded individuality conceptions operate in nudge economics, agent-based modeling, and social identity theory, explaining these conceptions in terms of how they relate to these two behavioral economics views of bounded rationality. How the 'new' and 'old' individuality bounds might be combined in a single account is briefly explored in connection with Kirman's Marseille fish market analysis.