Accounting for Framing-Effects - an informational approach to intensionality in the Bolker-Jeffrey decision model
We subscribe to an account of framing effects in decision theory in terms of an inference to background information made by the hearer when a speaker uses a certain frame while other equivalent frames were also available. This account was sketched by Craig McKenzie. We embed it in the Bolker-Jeffrey decision model (or logic of action); one main reason for this is that this model makes preferences bear on propositions. We can thus deduce a given anomaly or cognitive bias (namely framing effects) within a formal decision theory. This leads to some philosophical considerations on the relationship between the rationality of preferences and the sensitivity to descriptions or labels of states of affairs (intensionality) in decision-making.

Keywords: information processing and decision-making, framing effects, intensionality, Bolker-Jeffrey
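The distinctive feature of the Bolker-Jeffrey model invoked above is that desirability attaches to propositions (sets of possible worlds) rather than to acts. A minimal sketch of that idea, with illustrative numbers that are assumptions rather than anything from the paper:

```python
# Jeffrey-style desirability: preference bears on propositions (sets of
# worlds), not acts. Worlds, probabilities, and utilities below are
# purely illustrative assumptions.

def desirability(worlds, prob, util):
    """Desirability of a proposition: the probability-weighted average
    utility of its worlds, conditional on the proposition being true."""
    p = sum(prob[w] for w in worlds)
    if p == 0:
        raise ValueError("proposition has zero probability")
    return sum(prob[w] * util[w] for w in worlds) / p

prob = {"w1": 0.6, "w2": 0.4}
util = {"w1": 10.0, "w2": 2.0}

# Two extensionally equivalent frames pick out the same proposition and so
# receive the same desirability; on McKenzie's account, framing effects must
# therefore enter through what the *choice* of frame signals to the hearer.
whole = desirability({"w1", "w2"}, prob, util)  # 0.6*10 + 0.4*2 = 6.8
```

Since desirability is a function of the proposition itself, any preference reversal between equivalent descriptions must be traced to the information conveyed by frame selection, which is the inference the account formalizes.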
Linguistic Criteria of Intentionality
The aim of this paper is to discuss theories that attempt to single out the class of intentional states by appealing to factors that are supposedly criterial for intentional sentences. The paper starts by distinguishing two issues that arise when one thinks about intentional expressions: the Taxonomy Problem and the Fundamental Demarcation Problem. The former concerns the relation between the classes of distinct intentional verbs and distinct intentional states. The latter concerns the question of how to distinguish intentional states and acts from non-intentional ones. Next, the general desiderata for theories providing criteria for singling out the class of intentional sentences are introduced. Finally, distinct proposals for providing such criteria are analyzed. The author argues that none of them is satisfactory.
Polynomial Path Orders
This paper is concerned with the complexity analysis of constructor term rewrite systems and its ramifications in implicit computational complexity. We introduce a path order with multiset status, the polynomial path order POP*, that is applicable in two related but distinct contexts. On the one hand, POP* induces polynomial innermost runtime complexity and hence may serve as a syntactic, and fully automatable, method to analyse the innermost runtime complexity of term rewrite systems. On the other hand, POP* provides an order-theoretic characterisation of the polytime computable functions: the polytime computable functions are exactly the functions computable by an orthogonal constructor TRS compatible with POP*.

Comment: LMCS version. This article supersedes arXiv:1209.379
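To make the objects of study concrete, here is a hedged toy illustration (not the paper's formalism): a two-rule constructor TRS for addition on unary numerals, evaluated with an innermost strategy. Systems of this simple shape are the kind whose innermost runtime POP* bounds polynomially.

```python
# Toy constructor TRS for unary addition, rewritten innermost-first.
# Terms are tuples: ("0",), ("s", t), ("add", t1, t2).
# Rules: add(0, y) -> y    and    add(s(x), y) -> s(add(x, y))

def innermost_step(t):
    """Apply one innermost rewrite step; return None if t is a normal form."""
    if t[0] == "add":
        x, y = t[1], t[2]
        # innermost strategy: rewrite inside the arguments first
        for i, arg in enumerate((x, y), start=1):
            s = innermost_step(arg)
            if s is not None:
                new = list(t)
                new[i] = s
                return tuple(new)
        # both arguments are normal forms: apply a root rule
        if x[0] == "0":
            return y                          # add(0, y) -> y
        if x[0] == "s":
            return ("s", ("add", x[1], y))    # add(s(x), y) -> s(add(x, y))
    if t[0] == "s":
        s = innermost_step(t[1])
        if s is not None:
            return ("s", s)
    return None  # constructors 0 and s are never rewritten at the root

def normalize(t):
    """Rewrite to normal form, counting steps (the runtime complexity)."""
    steps = 0
    while (nxt := innermost_step(t)) is not None:
        t, steps = nxt, steps + 1
    return t, steps

two = ("s", ("s", ("0",)))
three = ("s", two)
nf, steps = normalize(("add", two, three))  # 2 + 3 in 3 steps -> s^5(0)
```

The step count grows linearly in the first argument here; POP* certifies, purely syntactically, that such systems stay within a polynomial bound.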
Non‐Classical Knowledge
The Knower paradox purports to place surprising a priori limitations on what we can know. According to orthodoxy, it shows that we need to abandon one of three plausible and widely-held ideas: that knowledge is factive, that we can know that knowledge is factive, and that we can use logical/mathematical reasoning to extend our knowledge via very weak single-premise closure principles. I argue that classical logic, not any of these epistemic principles, is the culprit. I develop a consistent theory validating all these principles by combining Hartry Field's theory of truth with a modal enrichment developed for a different purpose by Michael Caie. The only casualty is classical logic: the theory avoids paradox by using a weaker-than-classical K3 logic.
I then assess the philosophical merits of this approach. I argue that, unlike the traditional semantic paradoxes involving extensional notions like truth, its plausibility depends on the way in which sentences are referred to: whether in natural languages via direct sentential reference, or in mathematical theories via indirect sentential reference by Gödel coding. In particular, I argue that from the perspective of natural language, my non-classical treatment of knowledge as a predicate is plausible, while from the perspective of mathematical theories, its plausibility depends on unresolved questions about the limits of our idealized deductive capacities.
Making AI Meaningful Again
Artificial intelligence (AI) research enjoyed an initial period of enthusiasm in the 1970s and 80s. But this enthusiasm was tempered by a long interlude of frustration when genuinely useful AI applications failed to be forthcoming. Today, we are experiencing once again a period of enthusiasm, fired above all by the successes of the technology of deep neural networks or deep machine learning. In this paper we draw attention to what we take to be serious problems underlying current views of artificial intelligence encouraged by these successes, especially in the domain of language processing. We then show an alternative approach to language-centric AI, in which we identify a role for philosophy.
Slingshot Arguments and the Intensionality of Identity
It is argued that the slingshot argument does not soundly challenge the truth-maker correspondence theory of truth, by which at least some distinct true propositions are expected to have distinct truth-makers. Objections are presented to possible exact interpretations of the essential slingshot assumption, in which no fully acceptable reconstruction is discovered. A streamlined version of the slingshot is evaluated, in which an explicit contradiction results, on the assumption that identity and nonidentity contexts are purely extensional relations, effectively establishing the intensionality of identity.
Kolmogorov Complexity in perspective. Part II: Classification, Information Processing and Duality
We survey diverse approaches to the notion of information, from Shannon entropy to Kolmogorov complexity. Two of the main applications of Kolmogorov complexity are presented: randomness and classification. The survey is divided into two parts published in the same volume. Part II is dedicated to the relation between logic and information systems, within the scope of Kolmogorov algorithmic information theory. We present a recent application of Kolmogorov complexity: classification using compression, an idea with provocative implementations by authors such as Bennett, Vitanyi and Cilibrasi. This stresses how Kolmogorov complexity, besides being a foundation of randomness, is also related to classification. Another approach to classification is also considered: the so-called "Google classification". It uses another original and attractive idea which is connected to classification using compression and to Kolmogorov complexity from a conceptual point of view. We present and unify these different approaches to classification in terms of Bottom-Up versus Top-Down operational modes, whose fundamental principles and underlying duality we point out. We look at the way these two dual modes are used in different approaches to information systems, particularly the relational model for databases introduced by Codd in the 1970s. This allows us to point out diverse forms of a fundamental duality. These operational modes are also reinterpreted in the context of the comprehension schema of axiomatic set theory ZF. This leads us to develop how Kolmogorov complexity is linked to intensionality, abstraction, classification and information systems.

Comment: 43 page
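The classification-by-compression idea credited above to Bennett, Vitanyi and Cilibrasi can be sketched with the normalized compression distance (NCD), which replaces the uncomputable Kolmogorov complexity C(x) with the output size of a real compressor; zlib is used here as an illustrative stand-in.

```python
# Normalized compression distance (Cilibrasi & Vitanyi), approximating
# Kolmogorov complexity C(x) by compressed size.
import zlib

def c(data: bytes) -> int:
    """Compressed size: a computable proxy for Kolmogorov complexity."""
    return len(zlib.compress(data, 9))

def ncd(x: bytes, y: bytes) -> float:
    """NCD(x, y) = (C(xy) - min(C(x), C(y))) / max(C(x), C(y))."""
    cx, cy, cxy = c(x), c(y), c(x + y)
    return (cxy - min(cx, cy)) / max(cx, cy)

# Illustrative data: two similar texts and one unrelated byte sequence.
a = b"the cat sat on the mat " * 50
b = b"the cat sat on the mat " * 49 + b"the dog sat on the log "
z = bytes(range(256)) * 5

# Similar inputs compress well together, giving a small NCD;
# unrelated inputs share little structure, giving a distance near 1.
```

Clustering objects by their pairwise NCD values is the "classification using compression" the survey discusses; the "Google classification" variant replaces compressed sizes with web search hit counts in an analogous formula.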