91 research outputs found

    Apperceptive patterning: Artefaction, extensional beliefs and cognitive scaffolding

    In “Psychopower and Ordinary Madness” my ambition, as it relates to Bernard Stiegler’s recent literature, was twofold: 1) critiquing Stiegler’s work on exosomatization and artefactual posthumanism—or, more specifically, nonhumanism—to problematize approaches to media archaeology that rely upon technical exteriorization; 2) challenging how Stiegler engages with Giuseppe Longo and Francis Bailly’s conception of negative entropy. These efforts were directed by a prevalent techno-cultural qualifier: the rise of Synthetic Intelligence (including neural nets, deep learning, predictive processing and Bayesian models of cognition). This paper continues this project, but first directs a critical analytic lens at the Derridean practice of the ontologization of grammatization from which Stiegler emerges, while also distinguishing how metalanguages operate in relation to object-oriented environmental interaction by way of inferentialism. Stalking continental (Kapp, Simondon, Leroi-Gourhan, etc.) and analytic traditions (e.g., Carnap, Chalmers, Clark, Sutton, Novaes, etc.), we move from artefacts to AI and Predictive Processing so as to link theories of technicity with philosophy of mind. Simultaneously drawing forth Robert Brandom’s conceptualization of the roles that commitments play in retrospectively reconstructing the social experiences that lead to our endorsement(s) of norms, we complement this account with Reza Negarestani’s deprivatized account of intelligence while analyzing the equipollent roles of language and media (both digital and analog).

    From parts to wholes and back again

    Causality and subjectivity in a usage-based approach to the grammar of Dutch

    Foucault against Ethics: Subjectivity and Critique after Humanism

    This dissertation is in the first place an interpretation of the thought of Michel Foucault. Beyond interpretation, it also provides a qualified defense of his views on the significance of ethical theory, particularly in its “critical” forms, the shape of the space of reasons, and the role of subjectivity within it. I take as my starting point an orthodox view of Foucault’s work, namely, that it can be divided in terms of its content into three distinct periods. First, an “archaeological” phase spanning most of the 1960s. Second, a “genealogical” period devoted to unearthing power-relations beneath purportedly progressive institutions. Finally, an “ethical” period, focused on rehabilitating practices of moral self-formation in Antiquity. This so-called “ethical turn” has been a source of persistent criticism of Foucault’s thought for several decades. I claim that this periodization is mistaken: there is no substantively “ethical” period in Foucault’s work that would stand in contrast to his genealogical inquiries. In the first chapter, I present overwhelming textual evidence against this interpretation and then diagnose the motivation for it: the charge of “ethical nihilism” and the demand for a normative framework from critics of Foucault’s genealogical works. In brief, the charge is that in revealing the power-relations that partially constitute Enlightenment institutions and the ideals that sustain them, Foucault deprives himself of the resources required to construct the kind of ethical theory needed to ground his critical project. In the second and third chapters, I bring Foucault into conversation with several figures in analytic philosophy, most prominently Wilfrid Sellars, the “Pittsburgh School,” and P.F. Strawson. I argue that Foucault’s archaeological and genealogical works are best construed as a historical inquiry into the construction of “spaces of reasons,” in which we find ourselves subject to normative evaluation and direction. I then argue that the charge of nihilism against Foucault is the result of a process of neutralizing and depoliticizing the essentially plural, agonistic character of the space of reasons. I conclude by using my interpretation to explain and defend Foucault’s controversial engagement with the Iranian Revolution.

    The Blurring of Boundaries in Bioscientific Discourse

    New technologies have revealed previously unknown and invisible parts of the human body, making it visible at the molecular level and disclosing more detailed structures and arrangements than were previously accessible. In doing so, they refine, expand, and in many ways even completely overturn forms of contemporary knowledge. This book maps the shifts and blurring of boundaries in contemporary bioscientific discourse. The authors of its chapters trace these shifts in terms of the gradual blurring of the validity of established concepts, interpretive frameworks, and standards of judgment, which are analysed from ontological, gnoseological, ethical, and social perspectives. At the same time, they also map the blurring of boundaries in terms of interdisciplinary crossings between various scientific and artistic disciplines. The shifting of boundaries ultimately forms a part of these boundaries’ definition; on the basis of a rationally guided discussion, these shifts can be steered and corrected so as to avoid irreversible damage. Jana Tomašovičová is a philosopher with a special interest in contemporary philosophy and bioethics. She analyses the impact of biotechnology on traditional social, ethical, and anthropological concepts and their relevance under new conditions. She is an associate professor at the Faculty of Arts, University of Ss. Cyril and Methodius in Trnava, Slovakia. During her bioethics research, she conducted short research stays at the universities of Bonn, Heidelberg, Tübingen, and Zürich.

    Approaching the screenplay as a complex system: underlying mechanics, interrelating dynamics and the plot-algorithmic process

    The advancement of theoretical screenwriting has been limited to popularized “how-to” techniques, based on internalised rules-of-thumb drawn from inductive observations of existing screenplays. Such analyses fail to answer two troubling fundamental questions: first, what makes stories emerge in the context of narrative, and second, what are the underlying dynamics that allow a screenplay to function as a unified whole? The contribution of Screenplectics lies, first, in explaining how a screenplay functions synergistically and, appropriating the necessary metaphors, systemically; and second, in explaining the mechanism at work among compositional interactions at various structural levels that allows the coherent accumulative derivative we call story to emerge. The transition from an empirical to a theoretical perspective is achieved by examining these dynamics through the prism of holism and by introducing the characteristics of complex systems: a network of hierarchically arranged components that interact with one another in parallel and in non-linear ways. This hierarchy forms the foundation of the different layers of structure in a screenplay: deep, intermediate, and surface structure. This research consolidates the notion that comprehending such complex dynamics requires a more comprehensive theory of narrative.
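    To make the complex-systems framing concrete, here is a minimal toy sketch in Python, not taken from the dissertation: hypothetical Component objects stand for story elements at the deep, intermediate, and surface layers the abstract names, and bidirectional links model the parallel, non-linear interactions it describes.

```python
# Illustrative toy model (not from the dissertation): a screenplay as a
# hierarchical network of interacting components across three layers.
from dataclasses import dataclass, field

@dataclass
class Component:
    name: str
    layer: str                      # "deep", "intermediate", or "surface"
    links: list["Component"] = field(default_factory=list)

    def connect(self, other: "Component") -> None:
        """Non-linear interaction: influence runs both ways."""
        self.links.append(other)
        other.links.append(self)

# Deep structure: themes and premises.
premise = Component("premise", "deep")
# Intermediate structure: character arcs and plot lines.
arc = Component("protagonist_arc", "intermediate")
# Surface structure: individual scenes and dialogue.
scene = Component("opening_scene", "surface")

premise.connect(arc)    # the premise constrains the arc...
arc.connect(scene)      # ...the arc is realized in scenes...
scene.connect(premise)  # ...and scenes feed back into theme (non-linearity).
```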

    BIOMEDICAL LANGUAGE UNDERSTANDING AND EXTRACTION (BLUE-TEXT): A MINIMAL SYNTACTIC, SEMANTIC METHOD

    Clinical text understanding (CTU) is of interest to health informatics because critical clinical information is frequently represented as unconstrained text in electronic health records; this text is extensively used by human experts to guide clinical practice and decision making and to document delivery of care, but is largely unusable by information systems for queries and computations. Recent initiatives advocating for translational research call for technologies that can integrate structured clinical data with unstructured data, provide a unified interface to all data, and contextualize clinical information for reuse in the multidisciplinary and collaborative environment envisioned by the CTSA program. This implies that technologies for the processing and interpretation of clinical text should be evaluated not only in terms of their validity and reliability in their intended environment, but also in light of their interoperability and their ability to support information integration and contextualization in a distributed and dynamic environment. This vision adds a new layer of information-representation requirements that needs to be accounted for when planning the implementation or acquisition of clinical text processing tools for multidisciplinary research. At the same time, electronic health records frequently contain unconstrained clinical text with high variability in the use of terms and documentation practices, and without commitment to the grammatical or syntactic structure of the language (e.g., triage notes, physician and nurse notes, chief complaints). This hinders the performance of natural language processing technologies, which typically rely heavily on the syntax and grammatical structure of the text. This document introduces our method for transforming unconstrained clinical text found in electronic health information systems into a formal (computationally understandable) representation that is suitable for querying, integration, contextualization, and reuse, and that is resilient to the grammatical and syntactic irregularities of clinical text. We present our design rationale, our method, and the results of an evaluation on chief complaints and triage notes from eight emergency departments in Houston, Texas. Finally, we discuss the significance of our contribution in enabling the use of clinical text in a practical bio-surveillance setting.
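    The general idea of syntax-resilient extraction can be sketched briefly. The following is a minimal illustration, not the BLUE-TEXT system itself: it maps telegraphic, ungrammatical chief-complaint text to normalized concepts by lexicon lookup rather than full parsing, using a hypothetical micro-lexicon (a real system would draw on a terminology such as UMLS or SNOMED CT).

```python
# Minimal sketch of syntax-resilient concept extraction (illustrative only,
# not the BLUE-TEXT method): telegraphic notes like "pt c/o sob, cp x2 days"
# still yield usable concepts because no grammatical parse is required.
import re

# Hypothetical micro-lexicon mapping surface phrases to normalized concepts.
LEXICON = {
    "sob": "shortness_of_breath",
    "shortness of breath": "shortness_of_breath",
    "cp": "chest_pain",
    "chest pain": "chest_pain",
    "n/v": "nausea_vomiting",
    "abd pain": "abdominal_pain",
}

def extract_concepts(note: str) -> list[str]:
    """Return normalized concepts found in a free-text triage note."""
    text = note.lower()
    found = []
    # Match longer phrases first so "chest pain" wins over "cp".
    for phrase in sorted(LEXICON, key=len, reverse=True):
        if re.search(r"\b" + re.escape(phrase) + r"\b", text):
            concept = LEXICON[phrase]
            if concept not in found:
                found.append(concept)
            text = text.replace(phrase, " ")  # avoid double-counting
    return found

print(extract_concepts("pt c/o SOB, cp x2 days"))
# ['shortness_of_breath', 'chest_pain']
```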

    Promotion of critical thinking in school physical science.

    Thesis (Ph.D.), University of KwaZulu-Natal, Durban, 2008. This dissertation describes an action research study aimed at promoting critical thinking in learners while they learn physical science within the South African national curriculum. The data were primarily qualitative and were collected mainly through participant observation, comprising audio- and video-recorded lessons, interviews, questionnaires, journal entries, and written material. Data collection, analysis, and interpretation were done in the inductive, cyclic manner of action research. This process was guided by research questions about task characteristics, their position in the teaching sequence, the role of the learning environment, and the need to adjust tasks to fit the needs of different learners, so as to effectively promote critical thinking. A pragmatic approach was used. It was found that it is possible, using particular strategies and tasks, to promote critical thinking while meeting the curriculum outcomes, although the intense syllabus pressure of the curriculum makes this challenging. Task design characteristics, positioning in the teaching sequence, and conditions of the learning environment were found to affect a task’s effectiveness at promoting critical thinking. Various teaching strategies can improve attainability for a wider range of learners. An instructional model, the Ladder Approach, emerged as the most likely to promote success. It was found to be successful when evaluated against criteria of active engagement and interest by learners, attainability with effort, display of critical thinking traits, and compatibility with the South African curriculum. In this model, an interesting problem is posed at the start of a section, after which direct instruction and learner engagement with the problem run parallel to one another, linked by scaffolding tools that learners engage with individually and collaboratively.