
    "Revolution? What Revolution?" Successes and limits of computing technologies in philosophy and religion

    Computing technologies, like other technological innovations in the modern West, are inevitably introduced with the rhetoric of "revolution". Especially during the 1980s (the PC revolution) and 1990s (the Internet and Web revolutions), enthusiasts insistently celebrated radical changes, changes ostensibly inevitable and certainly as radical as those brought about by the invention of the printing press, if not the discovery of fire. These enthusiasms now seem very "1990s", in part because the revolution stumbled with the dot-com failures and the devastating impacts of 9/11. Moreover, as I will sketch out below, the patterns of diffusion and impact in philosophy and religion show both tremendous success, as certain revolutionary promises are indeed kept, as well as (sometimes spectacular) failures. Perhaps we use revolutionary rhetoric less frequently because the revolution has indeed succeeded: computing technologies, and many of the powers and potentials they bring us as scholars and religionists, have become so ubiquitous and normal that they no longer seem "revolutionary" at all. At the same time, many of the early hopes and promises instantiated in such specific projects as Artificial Intelligence and anticipations of virtual religious communities have been dashed against the apparently intractable limits of even these most remarkable technologies. While these failures are usually forgotten, they leave in their wake a clearer sense of what these new technologies can, and cannot, do.

    Parikh and Wittgenstein

    A survey of Parikh’s philosophical appropriations of Wittgensteinian themes, placed into historical context against the backdrop of Turing’s famous paper, “On computable numbers, with an application to the Entscheidungsproblem” (Turing in Proc Lond Math Soc 2(42): 230–265, 1936/1937), and its connections with Wittgenstein and the foundations of mathematics. Characterizing Parikh’s contributions to the interaction between logic and philosophy at its foundations, we argue that his work gives the lie to recent presentations of Wittgenstein’s so-called metaphilosophy (e.g., Horwich in Wittgenstein’s metaphilosophy. Oxford University Press, Oxford, 2012) as a kind of “dead end” quietism. From early work on the idea of feasibility in arithmetic (Parikh in J Symb Log 36(3):494–508, 1971) and vagueness (Parikh in Logic, language and method. Reidel, Boston, pp 241–261, 1983) to his more recent program in social software (Parikh in Advances in modal logic, vol 2. CSLI Publications, Stanford, pp 381–400, 2001a), Parikh’s work encompasses and touches upon many foundational issues in epistemology, philosophy of logic, philosophy of language, and value theory. But it expresses a unified philosophical point of view. In his most recent work, questions about public and private languages, opportunity spaces, strategic voting, non-monotonic inference, and knowledge in literature provide a remarkable series of suggestions about how to present issues of fundamental importance in theoretical computer science as serious philosophical issues.

    Reading in the Disciplines: The Challenges of Adolescent Literacy

    A companion report to Carnegie's Time to Act, this report focuses on the specific skills and literacy support needed for reading in academic subject areas in the higher grades. It outlines strategies for teaching content knowledge and reading strategies together.

    A Consensus on the Definition and Knowledge Base for Computer Graphics

    Despite several decades of historical innovation, measurable impacts, and multiple specializations, the existing knowledge base for Computer Graphics (CG) lacks consensus, and numerous definitions for it have been published based on distinct contexts. Disagreement among post-secondary academics has divided CG programs into three contextual areas that emphasize different topics. This division has resulted in the decontextualization of CG education, and CG programs now face several challenges in meeting the needs of industry. Employing the Delphi Method, this investigation explored the perceptions of post-secondary educators and industry professionals regarding the definition of CG and how it is identified in terms of characteristics and context. The outcomes of this investigation placed CG within the technological paradigm and provided a road map toward a true definition and distinct knowledge base necessary for establishing CG as a formal computing discipline.

    Designing Normative Theories for Ethical and Legal Reasoning: LogiKEy Framework, Methodology, and Tool Support

    A framework and methodology, termed LogiKEy, for the design and engineering of ethical reasoners, normative theories, and deontic logics is presented. The overall motivation is the development of suitable means for the control and governance of intelligent autonomous systems. LogiKEy's unifying formal framework is based on semantical embeddings of deontic logics, logic combinations, and ethico-legal domain theories in expressive classical higher-order logic (HOL). This meta-logical approach enables powerful tool support in LogiKEy: off-the-shelf theorem provers and model finders for HOL assist the LogiKEy designer of ethical intelligent agents in flexibly experimenting with underlying logics and their combinations, with ethico-legal domain theories, and with concrete examples, all at the same time. Continuous improvements of these off-the-shelf provers, without further ado, leverage the reasoning performance in LogiKEy. Case studies in which the LogiKEy framework and methodology have been applied and tested give evidence that HOL's undecidability often does not hinder efficient experimentation.

    The Role of the Subjectivist Position in the Probabilization of Forensic Science

    This paper is concerned with the contribution of forensic science to the legal process by helping to reduce uncertainty. Although it is now widely accepted that uncertainty should be handled by probability because it is a safeguard against incoherent proceedings, there remain diverging and conflicting views on how probability ought to be interpreted. This is exemplified by proposals in the scientific literature that call for procedures of probability computation referred to as "objective," suggesting that scientists ought to use them in their reporting to recipients of expert information. I find such proposals objectionable. They need to be viewed cautiously, essentially because the ensuing probabilistic statements can be perceived as making forensic science prescriptive. A motivating example from the context of forensic DNA analysis is chosen to illustrate this. As a main point, it is argued that such constraining suggestions can be avoided by interpreting probability as a measure of personal belief, that is, subjective probability. Invoking references to foundational literature from mathematical statistics and philosophy of science, the discussion explores the consequences of this interdisciplinary viewpoint for the practice of forensic expert reporting. It is emphasized that, as an operational interpretation of probability, the subjectivist perspective enables forensic science to add value to the legal process, in particular by avoiding inferential impasses to which other interpretations of probability may lead. Moreover, understanding probability from a subjective perspective can encourage participants in the legal process to take on more responsibility in matters regarding the coherent handling of uncertainty. This would assure more balanced interactions at the interface between science and the law. This, in turn, provides support for ongoing developments that can be called the "probabilization" of forensic science.

    Investigating Cultural Values and Educational Technology Adoption in Central Asia: A Case Study

    Although the adoption of new tools for communication and learning could reasonably be expected to influence culture, little is known about the relationship between cultural values and the adoption or diffusion of Web 2.0 technologies. This case study examines the way in which the cultural values of 59 teachers in four Central Asian countries influenced, and were influenced by, Web 2.0 technologies during five to eighteen months of online professional development. Data were collected through self-introductions, Likert-scale and open-ended prompts on initial and final surveys, online forum discussions, and capstone projects. This allows an examination of changes in the participants’ expressed attitudes toward, and use of, Web 2.0 educational technology, as well as the identification of cultural values (Hofstede, 1980b) associated with these patterns of adoption and diffusion. The findings are especially beneficial to decision-makers who care about the way the use of Web 2.0 educational technologies could impact educational systems and cultures.