
    Mathematical Foundations of Consciousness

    We employ the Zermelo-Fraenkel Axioms, which characterize sets as mathematical primitives. The Anti-Foundation Axiom plays a significant role in our development since, among its other features, its replacement of the Axiom of Foundation in the Zermelo-Fraenkel Axioms motivates Platonic interpretations. These interpretations also depend on such allied notions for sets as pictures, graphs, decorations, labelings and the various mappings that we use. A syntax and semantics of operators acting on sets is developed. Such features enable the construction of a theory of non-well-founded sets that we use to frame mathematical foundations of consciousness. To do this we introduce a supplementary axiomatic system that characterizes experience and consciousness as primitives. The new axioms proceed through the characterization of so-called consciousness operators. The Russell operator plays a central role and is shown to be one example of a consciousness operator. Neural networks supply striking examples of non-well-founded graphs whose decorations generate associated sets, each with a Platonic aspect. Employing our foundations, we show how the supervening of consciousness on its neural correlates in the brain enables the framing of a theory of consciousness by applying appropriate consciousness operators to the generated sets in question.
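
    As a point of reference for the Anti-Foundation Axiom and the Russell operator mentioned in this abstract, a minimal sketch in standard (Aczel-style) notation follows; it is illustrative background, not the paper's own formulation. The Anti-Foundation Axiom asserts that every graph has a unique decoration $d$, where
    \[
      d(n) = \{\, d(m) : n \to m \text{ is an edge} \,\}.
    \]
    The one-node graph with a single loop is thus decorated by the unique non-well-founded set
    \[
      \Omega = \{\Omega\},
    \]
    which the Axiom of Foundation forbids. The Russell operator is commonly written as
    \[
      R(x) = \{\, y \in x : y \notin y \,\},
    \]
    though the paper's own definitions of its consciousness operators may differ in detail.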

    From Bounded Rationality to Behavioral Economics

    The paper provides a brief overview of the “state of the art” in the theory of rational decision making since the 1950s, focusing especially on the evolutionary justification of rationality. It is claimed that this justification, and more generally the economic methodology inherited from the Chicago school, becomes untenable once Kauffman’s NK model is taken into account: if evolution is based on a trial-and-error search process, it generally leads to sub-optimal stable solutions, so the ‘as if’ justification of perfect rationality proves to be a fallacious metaphor. The normative interpretation of decision-making theory is therefore questioned, and two challenges to this approach, Simon’s bounded rationality and Allais’ criticism of expected utility theory, are discussed. On this ground it is shown that the cognitive characteristics of choice processes are becoming increasingly important for explaining economic behavior and deviations from rationality. In particular, following Kahneman’s Nobel Lecture, it is suggested that the distinction between two types of cognitive processes – the effortful process of deliberate reasoning on the one hand, and the automatic process of unconscious intuition on the other – can provide a different map with which to explain a broad class of deviations from pure ‘Olympian’ rationality. This view requires re-establishing and revising connections between psychology and economics: an ongoing challenge to the normative approach to economic methodology.
    Keywords: Bounded Rationality, Behavioral Economics, Evolution, As If
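
    To make the claim about trial-and-error search concrete, the following sketch builds a toy NK fitness landscape and runs single-bit-flip hill climbing; the parameter values, names and the simple search rule are illustrative assumptions, not taken from the paper or from Kauffman's original formulation.

    import itertools, random

    random.seed(0)
    N, K = 10, 3                      # N binary loci, each coupled to K others (assumed values)
    neighbours = [random.sample([j for j in range(N) if j != i], K) for i in range(N)]
    contrib = {}                      # lazily generated random fitness contributions

    def fitness(genome):
        total = 0.0
        for i in range(N):
            key = (i, genome[i]) + tuple(genome[j] for j in neighbours[i])
            if key not in contrib:
                contrib[key] = random.random()
            total += contrib[key]
        return total / N

    def hill_climb(genome):
        # trial-and-error search: flip one bit at a time, accept only improvements
        while True:
            flips = [genome[:i] + (1 - genome[i],) + genome[i+1:] for i in range(N)]
            best = max(flips, key=fitness)
            if fitness(best) <= fitness(genome):
                return genome         # stuck on a local, typically sub-optimal, peak
            genome = best

    start = tuple(random.randint(0, 1) for _ in range(N))
    local_peak = hill_climb(start)
    global_peak = max(itertools.product((0, 1), repeat=N), key=fitness)
    print(fitness(local_peak), fitness(global_peak))

    With K > 0 the landscape is rugged, and repeated runs typically end on a peak strictly below the global optimum, which is the point behind the critique of the 'as if' argument.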

    Doing and Showing

    The persisting gap between formal and informal mathematics is due to an inadequate notion of mathematical theory behind current formalization techniques. I mean the (informal) notion of axiomatic theory according to which a mathematical theory consists of a set of axioms and further theorems deduced from these axioms according to certain rules of logical inference. Thus the usual notion of the axiomatic method is inadequate and needs a replacement.
    Comment: 54 pages, 2 figures
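
    The received notion of axiomatic theory criticised here can be stated schematically (a standard rendering, not the paper's own notation): a theory is the deductive closure of its axioms $A$ under the chosen rules of inference,
    \[
      T \;=\; \mathrm{Cn}(A) \;=\; \{\, \varphi : A \vdash \varphi \,\}.
    \]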

    Should we discount future generations’ welfare? A survey on the “pure” discount rate debate.

    In A Mathematical Theory of Saving (1928), Frank Ramsey not only laid the foundations of the fruitful optimal growth literature, but also launched a major moral debate: should we discount future generations’ well-being? While Ramsey regarded such “pure” discounting as “ethically indefensible”, several philosophers and economists have developed arguments justifying the “pure” discounting practice since the early 1960s. This essay surveys those arguments. After a brief examination of the – often implicit – treatment of future generations’ welfare by utilitarian thinkers before Ramsey’s view was expressed, later arguments of various kinds are analysed. It is argued that, under the assumption of perfect certainty regarding future human life, the “pure” discounting practice seems ethically untenable. However, once we account for the uncertainty regarding future generations’ existence, “pure” discounting seems more acceptable, even if strong criticisms remain, especially regarding the adequacy of expected utility theory in such a normative context; those limits, however, would be faced by any other consequences-based ethical theory confronted with Different Number Choices.
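
    The debate can be anchored in the standard discounted-utilitarian criterion (textbook notation, not the essay's own): with a pure rate of time preference $\rho$, social welfare is
    \[
      W = \sum_{t=0}^{\infty} \frac{u(c_t)}{(1+\rho)^{t}},
    \]
    and Ramsey's position is that ethics requires $\rho = 0$. The uncertainty argument mentioned above reads the discount factor as a survival probability: if humanity exists at date $t$ with probability $(1-\pi)^{t}$, expected welfare is
    \[
      W = \sum_{t=0}^{\infty} (1-\pi)^{t}\, u(c_t),
    \]
    which behaves like “pure” discounting at a rate $\rho \approx \pi$ even with no intrinsic preference for the present.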

    Invisible Hand in the Process of Making Economics or on the Method and Scope of Economics

    As a social science, economics cannot be reduced simply to an a priori science or an ideology. Nor can economics be solely an empirical or a historical science. Economics is a research field which studies only one dimension of human behavior, with the four fields of mathematics, econometrics, ethics and history intersecting one another. The purpose of this paper is to discuss the two parts of this proposition in connection with the controversies surrounding the method and the scope of economics: economics as applied mathematics and economics as a predictive/empirical science.
    Keywords: Invisible hand, Scope and method in economics, Economics as an applied mathematics, Economics as an empirical science, Economics as ideology.

    Cauchy, infinitesimals and ghosts of departed quantifiers

    Procedures relying on infinitesimals in Leibniz, Euler and Cauchy have been interpreted in both the Weierstrassian and Robinson's frameworks. The latter provides closer proxies for the procedures of the classical masters. Thus, Leibniz's distinction between assignable and inassignable numbers finds a proxy in the distinction between standard and nonstandard numbers in Robinson's framework, while Leibniz's law of homogeneity, with its implied notion of equality up to negligible terms, finds a mathematical formalisation in terms of the standard part. It is hard to provide parallel formalisations in a Weierstrassian framework, but scholars since Ishiguro have engaged in a quest for ghosts of departed quantifiers to provide a Weierstrassian account for Leibniz's infinitesimals. Euler similarly had notions of equality up to negligible terms, of which he distinguished two types: geometric and arithmetic. Euler routinely used product decompositions into a specific infinite number of factors, and used the binomial formula with an infinite exponent. Such procedures have immediate hyperfinite analogues in Robinson's framework, while in a Weierstrassian framework they can only be reinterpreted by means of paraphrases departing significantly from Euler's own presentation. Cauchy gives lucid definitions of continuity in terms of infinitesimals that find ready formalisations in Robinson's framework, but scholars working in a Weierstrassian framework bend over backwards either to claim that Cauchy was vague or to engage in a quest for ghosts of departed quantifiers in his work. Cauchy's procedures in the context of his 1853 sum theorem (for series of continuous functions) are more readily understood from the viewpoint of Robinson's framework, where one can exploit tools such as the pointwise definition of the concept of uniform convergence.
    Keywords: historiography; infinitesimal; Latin model; butterfly model
    Comment: 45 pages, published in Mat. Stu
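
    For readers unfamiliar with Robinson's framework, the formalisations alluded to here take roughly the following standard form (general nonstandard-analysis notation, not quoted from the paper). Writing $\mathrm{st}$ for the standard part and $\approx$ for "differs by an infinitesimal":
    \[
      f \text{ is continuous at } x \iff \forall \varepsilon \approx 0:\; f(x+\varepsilon) \approx f(x).
    \]
    Leibnizian equality up to negligible terms becomes $\frac{dy}{dx} = \mathrm{st}\!\left(\frac{\Delta y}{\Delta x}\right)$, and Euler's binomial formula with an infinite exponent reads, for an infinite hyperinteger $N$,
    \[
      e^{x} = \mathrm{st}\!\left[\left(1 + \frac{x}{N}\right)^{N}\right].
    \]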