118,272 research outputs found

    First-order modal logic in the necessary framework of objects

    I consider the first-order modal logic which counts as valid those sentences which are true on every interpretation of the non-logical constants. Based on the assumptions that it is necessary what individuals there are and that it is necessary which propositions are necessary, Timothy Williamson has tentatively suggested an argument for the claim that this logic is determined by a possible-world structure consisting of an infinite set of individuals and an infinite set of worlds. He notes that only the cardinalities of these sets matter, and that not all pairs of infinite sets determine the same logic. I use so-called two-cardinal theorems from model theory to investigate the space of logics and consequence relations determined by pairs of infinite sets, and show how to eliminate the assumption that worlds are individuals from Williamson's argument.

    The Strength of Abstraction with Predicative Comprehension

    Frege's theorem says that second-order Peano arithmetic is interpretable in Hume's Principle and full impredicative comprehension. Hume's Principle is one example of an abstraction principle, while another paradigmatic example is Basic Law V from Frege's Grundgesetze. In this paper we study the strength of abstraction principles in the presence of predicative restrictions on the comprehension schema, and in particular we study a predicative Fregean theory which contains all the abstraction principles whose underlying equivalence relations can be proven to be equivalence relations in a weak background second-order logic. We show that this predicative Fregean theory interprets second-order Peano arithmetic.
    Comment: Forthcoming in the Bulletin of Symbolic Logic. Slight change in title from the previous version, at the request of the referee.
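    For reference, the two abstraction principles named in this abstract have standard formulations (these are the textbook statements, not quotations from the paper):

```latex
% Hume's Principle: the number of the Fs equals the number of the Gs
% just in case F and G are equinumerous (related by a bijection).
\#F = \#G \;\leftrightarrow\; F \approx G

% Basic Law V: the extension of F equals the extension of G
% just in case F and G are coextensive.
\varepsilon F = \varepsilon G \;\leftrightarrow\; \forall x\,(Fx \leftrightarrow Gx)
```

    Hume's Principle is consistent, while Basic Law V famously yields Russell's paradox under full comprehension, which is why the strength of such principles under predicative comprehension is the natural question studied here.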

    What Statutes Mean: Interpretive Lessons from Positive Theories of Communication and Legislation

    How should judges interpret statutes? For some scholars and judges, interpreting statutes requires little more than a close examination of statutory language, with perhaps a dictionary and a few interpretive canons nearby. For others, statutory interpretation must be based upon an assessment of a statute's underlying purpose, an evaluation of society's current norms and values, or a normative objective, such as the law's integrity. With such differences squarely framed in the literature, it is reasonable to ask whether anything of value can be added. We contend that there is.

    Institutionalization and Structuration: Studying the Links between Action and Institution

    Institutional theory and structuration theory both contend that institutions and actions are inextricably linked and that institutionalization is best understood as a dynamic, ongoing process. Institutionalists, however, have pursued an empirical agenda that has largely ignored how institutions are created, altered, and reproduced, in part because their models of institutionalization as a process are underdeveloped. Structuration theory, on the other hand, largely remains a process theory of such abstraction that it has generated few empirical studies. This paper discusses the similarities between the two theories, develops an argument for why a fusion of the two would enable institutional theory to advance significantly, develops a model of institutionalization as a structuration process, and proposes methodological guidelines for investigating the process empirically.

    Problem of Time in Quantum Gravity

    The Problem of Time occurs because the 'time' of GR and of ordinary Quantum Theory are mutually incompatible notions. This is problematic in trying to replace these two branches of physics with a single framework in situations in which the conditions of both apply, e.g. in black holes or in the very early universe. Emphasis in this Review is on the Problem of Time being multi-faceted and on the nature of each of the eight principal facets. Namely, the Frozen Formalism Problem, Configurational Relationalism Problem (formerly Sandwich Problem), Foliation Dependence Problem, Constraint Closure Problem (formerly Functional Evolution Problem), Multiple Choice Problem, Global Problem of Time, Problem of Beables (alias Problem of Observables) and Spacetime Reconstruction/Replacement Problem. Strategizing in this Review is not just centred about the Frozen Formalism Problem facet, but rather about each of the eight facets. Particular emphasis is placed upon A) relationalism as an underpinning of the facets and as a selector of particular strategies (especially a modification of Barbour relationalism, though also with some consideration of Rovelli relationalism). B) Classifying approaches by the full ordering in which they embrace constrain, quantize, find time/history and find observables, rather than only by partial orderings such as "Dirac-quantize". C) Foliation (in)dependence and Spacetime Reconstruction for a wide range of physical theories, strategizing centred about the Problem of Beables, the Patching Approach to the Global Problem of Time, and the role of the question-types considered in physics. D) The Halliwell- and Gambini-Porto-Pullin-type combined Strategies in the context of semiclassical quantum cosmology.
    Comment: Invited Review: 26 pages including 2 Figures. This v2 has a number of minor improvements and corrections.

    Tensions and paradoxes in electronic patient record research: a systematic literature review using the meta-narrative method

    Background: The extensive and rapidly expanding research literature on electronic patient records (EPRs) presents challenges to systematic reviewers. This literature is heterogeneous and at times conflicting, not least because it covers multiple research traditions with different underlying philosophical assumptions and methodological approaches. Aim: To map, interpret and critique the range of concepts, theories, methods and empirical findings on EPRs, with a particular emphasis on the implementation and use of EPR systems. Method: Using the meta-narrative method of systematic review, and applying search strategies that took us beyond the Medline-indexed literature, we identified over 500 full-text sources. We used ‘conflicting’ findings to address higher-order questions about how the EPR and its implementation were differently conceptualised and studied by different communities of researchers. Main findings: Our final synthesis included 24 previous systematic reviews and 94 additional primary studies, most of the latter from outside the biomedical literature. A number of tensions were evident, particularly in relation to: [1] the EPR (‘container’ or ‘itinerary’); [2] the EPR user (‘information-processor’ or ‘member of socio-technical network’); [3] organizational context (‘the setting within which the EPR is implemented’ or ‘the EPR-in-use’); [4] clinical work (‘decision-making’ or ‘situated practice’); [5] the process of change (‘the logic of determinism’ or ‘the logic of opposition’); [6] implementation success (‘objectively defined’ or ‘socially negotiated’); and [7] complexity and scale (‘the bigger the better’ or ‘small is beautiful’).
    Findings suggest that integration of EPRs will always require human work to re-contextualize knowledge for different uses; that whilst secondary work (audit, research, billing) may be made more efficient by the EPR, primary clinical work may be made less efficient; that paper, far from being technologically obsolete, currently offers greater ecological flexibility than most forms of electronic record; and that smaller systems may sometimes be more efficient and effective than larger ones. Conclusions: The tensions and paradoxes revealed in this study extend and challenge previous reviews and suggest that the evidence base for some EPR programs is more limited than is often assumed. We offer this paper as a preliminary contribution to a much-needed debate on this evidence and its implications, and suggest avenues for new research.

    Formalization of Universal Algebra in Agda

    In this work we present a novel formalization of universal algebra in Agda. We show that heterogeneous signatures can be elegantly modelled in type theory using sets indexed by arities to represent operations. We prove elementary results about heterogeneous algebras, including the proof that the term algebra is initial and the proofs of the three isomorphism theorems. We further formalize equational theory and prove soundness and completeness. Finally, we define (derived) signature morphisms, from which we obtain the contravariant functor between algebras; moreover, we also prove that, under some restrictions, the translation of a theory induces a contravariant functor between models.
    Fil: Gunther, Emmanuel. Universidad Nacional de Córdoba. Facultad de Matemática, Astronomía y Física; Argentina. Consejo Nacional de Investigaciones Científicas y Técnicas; Argentina
    Fil: Gadea, Alejandro Emilio. Universidad Nacional de Córdoba. Facultad de Matemática, Astronomía y Física; Argentina. Consejo Nacional de Investigaciones Científicas y Técnicas; Argentina
    Fil: Pagano, Miguel Maria. Consejo Nacional de Investigaciones Científicas y Técnicas; Argentina. Universidad Nacional de Córdoba. Facultad de Matemática, Astronomía y Física; Argentina
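    A minimal sketch, in Python rather than the paper's Agda, of the construction this abstract describes: operations indexed by their arities, the term algebra over a signature, and the evaluation map that witnesses its initiality. All names here (Op, Term, evaluate) are illustrative, not taken from the paper's development.

```python
# Sketch of a many-sorted (heterogeneous) signature: each operation
# carries its arity (a tuple of argument sorts) and its result sort.
from dataclasses import dataclass


@dataclass(frozen=True)
class Op:
    name: str
    arity: tuple   # argument sorts, e.g. ("nat",)
    result: str    # result sort


@dataclass(frozen=True)
class Term:
    """An element of the term algebra: an operation applied to subterms."""
    op: Op
    args: tuple = ()


def evaluate(term: Term, interp: dict):
    """Initiality of the term algebra: any interpretation of the
    operation symbols extends uniquely to a homomorphism from terms."""
    return interp[term.op.name](*(evaluate(a, interp) for a in term.args))


# Example: the one-sorted signature of natural numbers.
zero = Op("zero", (), "nat")
suc = Op("suc", ("nat",), "nat")

# The term suc(suc(zero)).
two = Term(suc, (Term(suc, (Term(zero),)),))

# One interpretation among many: the Python integers with 0 and +1.
interp = {"zero": lambda: 0, "suc": lambda n: n + 1}
```

    In the Agda formalization the arity index is a dependent type checked statically, so ill-sorted terms cannot even be written; this runtime sketch only mirrors the shape of the construction.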

    Cauchy, infinitesimals and ghosts of departed quantifiers

    Procedures relying on infinitesimals in Leibniz, Euler and Cauchy have been interpreted both in a Weierstrassian framework and in Robinson's framework. The latter provides closer proxies for the procedures of the classical masters. Thus, Leibniz's distinction between assignable and inassignable numbers finds a proxy in the distinction between standard and nonstandard numbers in Robinson's framework, while Leibniz's law of homogeneity, with its implied notion of equality up to negligible terms, finds a mathematical formalisation in terms of the standard part. It is hard to provide parallel formalisations in a Weierstrassian framework, but scholars since Ishiguro have engaged in a quest for ghosts of departed quantifiers to provide a Weierstrassian account of Leibniz's infinitesimals. Euler similarly had notions of equality up to negligible terms, of which he distinguished two types: geometric and arithmetic. Euler routinely used product decompositions into a specific infinite number of factors, and used the binomial formula with an infinite exponent. Such procedures have immediate hyperfinite analogues in Robinson's framework, while in a Weierstrassian framework they can only be reinterpreted by means of paraphrases departing significantly from Euler's own presentation. Cauchy gives lucid definitions of continuity in terms of infinitesimals that find ready formalisations in Robinson's framework, but scholars working in a Weierstrassian framework bend over backwards either to claim that Cauchy was vague or to engage in a quest for ghosts of departed quantifiers in his work. Cauchy's procedures in the context of his 1853 sum theorem (for series of continuous functions) are more readily understood from the viewpoint of Robinson's framework, where one can exploit tools such as the pointwise definition of the concept of uniform convergence.
    Keywords: historiography; infinitesimal; Latin model; butterfly model
    Comment: 45 pages, published in Mat. Stu
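    The Robinson-framework notions invoked above have standard formalisations in nonstandard analysis (these are the textbook definitions, not formulas taken from the paper):

```latex
% Cauchy-style continuity at a real point x, in Robinson's framework:
% an infinitesimal change of input yields an infinitesimal change of output.
f \text{ is continuous at } x
  \iff \forall \varepsilon \approx 0 :\;
       f(x + \varepsilon) - f(x) \approx 0

% Leibnizian equality up to negligible terms, via the standard part
% function st, which sends each finite hyperreal to its nearest real:
x \text{ and } y \text{ are equal up to a negligible term}
  \iff \operatorname{st}(x) = \operatorname{st}(y)

% Euler's binomial formula with an infinite exponent: for a hyperfinite
% integer N, the finite binomial expression is infinitely close to e^x.
e^{x} = \operatorname{st}\!\left( \left(1 + \tfrac{x}{N}\right)^{\!N} \right)
```

    The Weierstrassian paraphrases mentioned in the abstract replace each occurrence of the relation $\approx$ with a quantified $\epsilon$-$\delta$ clause, which is exactly the translation step the paper scrutinises.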