
    A black art: Ontology, data, and the Tower of Babel problem

    Computational ontologies are a new type of emerging scientific media (Smith, 2016) that process large quantities of heterogeneous data about portions of reality. Applied computational ontologies are used for semantically integrating (Heiler, 1995; Pileggi & Fernandez-Llatas, 2012) divergent data to represent reality; in so doing, they alter conceptions of materiality and produce new realities based on levels of informational granularity and abstraction (Floridi, 2011), resulting in a new type of informational ontology (Iliadis, 2013), the critical analysis of which requires new methods and frameworks. Currently, there is a lack of literature addressing the theoretical, social, and critical dimensions of such informational ontologies, applied computational ontologies, and the interdisciplinary communities of practice (Brown & Duguid, 1991; Wenger, 1998) that produce them. This dissertation fills a lacuna in communicative work in an emerging subfield of Science and Technology Studies (Latour & Woolgar, 1979) known as Critical Data Studies (boyd & Crawford, 2012; Dalton & Thatcher, 2014; Kitchin & Lauriault, 2014) by adopting a critical framework to analyze the systems of thought that inform applied computational ontology while offering insight into its realism-based methods and philosophical frameworks to gauge their ethical import. Since the early 1990s, computational ontologies have been used to organize massive amounts of heterogeneous data by individuating reality into computable parts, attributes, and relations. This dissertation provides a theory of computational ontologies as technologies of individuation (Simondon, 2005) that translate disparate data to produce informational cohesion. By technologies of individuation I mean engineered artifacts whose purpose is to partition portions of reality into computable informational objects.
    I argue that data are metastable entities and that computational ontologies restrain heterogeneous data via a process of translation to produce semantic interoperability. In this way, I show that computational ontologies effectively re-ontologize (Floridi, 2013) and produce reality, and thus have ethical consequences, specifically in terms of their application to social reality and social ontology (Searle, 2006). I use the Basic Formal Ontology (Arp, Smith, & Spear, 2015)—the world's most widely used upper-level ontology—as a case study and analyze its methods and ensuing ethical issues concerning its social application in the Military Ontology before recommending an ethical framework. "Ontology" is a term that is used in philosophy and computer science in related but different ways—philosophical ontology typically concerns metaphysics, while computational ontology typically concerns databases. This dissertation provides a critical history and theory of ontology and the interdisciplinary teams of researchers that came to adopt methods from philosophical ontology to build, persuade, and reason with applied computational ontology. Following a critical communication approach, I define applied computational ontology construction as a solution to a communication problem among scientists who seek to create semantic interoperability among data, and I argue that applied ontology is philosophical, informational in nature, and communicatively constituted (McPhee & Zaug, 2000). The primary aim is to explain how philosophy informs applied computational ontology while showing how such ontologies became instantiated in material organizations, how to study them, and what their ethical implications are.

    Gilbert Simondon and the Philosophy of Information: An Interview with Jean-Hugues Barthélémy

    Interview with Jean-Hugues Barthélémy.

    Learning About Metadata and Machines: Teaching Students Using a Novel Structured Database Activity

    Machines produce and operate using complex systems of metadata that need to be catalogued, sorted, and processed. Many students lack experience with metadata and sufficient knowledge about it to understand it as part of their data literacy skills. This paper describes an educational and interactive database activity designed for teaching undergraduate communication students about the creation, value, and logic of structured data. Through a set of virtual instructional videos and interactive visualizations, the paper describes how students can gain experience with structured data and apply that knowledge to successfully find, curate, and classify a digital archive of media artifacts. The pedagogical activity, teaching materials, and archives are facilitated through and housed in an online resource called Fabric of Digital Life (fabricofdigitallife.com). We end by discussing the activity's relevance for the emerging field of human-machine communication.

    CAVRN Syllabus, Vol. 1

    In this inaugural volume, we introduce CAVRN and set out an agenda for a Critical Augmented and Virtual Reality research Network. Through what we refer to as 'critical AR and VR studies', we argue there is an urgent need for research that takes stock of rapid developments in the AR and VR space – accounting for the ethical, social, political, and economic implications of these technologies. This volume of CAVRN presents 16 contributions offering critical perspectives on AR and VR. The contributions encompass diverse domains but are united in their call for a deeper exploration of the complexities of virtual interaction, advocating for an approach to the critique of VR that accounts for both its material-technical affordances and its socio-cultural dimensions. The contributions in this volume cover four main areas: 1) the policy, regulatory, and legal implications of AR and VR; 2) media theoretical approaches to studying VR; 3) responses to the emerging 'metaverse'; and 4) VR experiences and storytelling.

    Catching Element Formation In The Act

    Gamma-ray astronomy explores the most energetic photons in nature to address some of the most pressing puzzles in contemporary astrophysics. It encompasses a wide range of objects and phenomena: stars, supernovae, novae, neutron stars, stellar-mass black holes, nucleosynthesis, the interstellar medium, cosmic rays and relativistic-particle acceleration, and the evolution of galaxies. MeV gamma-rays provide a unique probe of nuclear processes in astronomy, directly measuring radioactive decay, nuclear de-excitation, and positron annihilation. The substantial information carried by gamma-ray photons allows us to see deeper into these objects; the bulk of the power is often emitted at gamma-ray energies, and radioactivity provides a natural physical clock that adds unique information. New science will be driven by time-domain population studies at gamma-ray energies. This science is enabled by next-generation gamma-ray instruments with one to two orders of magnitude better sensitivity, larger sky coverage, and faster cadence than all previous gamma-ray instruments. This transformative capability permits: (a) the accurate identification of the gamma-ray emitting objects and correlations with observations taken at other wavelengths and with other messengers; (b) construction of new gamma-ray maps of the Milky Way and other nearby galaxies where extended regions are distinguished from point sources; and (c) considerable serendipitous science of scarce events -- nearby neutron star mergers, for example. Advances in technology push the performance of new gamma-ray instruments to address a wide set of astrophysical questions.
    Comment: 14 pages including 3 figures.

    Binary systems and their nuclear explosions

    Peer Reviewed. Preprint.

    IMPACT-Global Hip Fracture Audit: Nosocomial infection, risk prediction and prognostication, minimum reporting standards and global collaborative audit. Lessons from an international multicentre study of 7,090 patients conducted in 14 nations during the COVID-19 pandemic
