
    Avoiding deontic explosion by contextually restricting aggregation

    In this paper, we present an adaptive logic for deontic conflicts, called P2.1(r), that is based on Goble's logic SDLaPe, a bimodal extension of Goble's logic P that invalidates aggregation for all prima facie obligations. The logic P2.1(r) has several advantages over SDLaPe. For consistent sets of obligations it yields the same results as Standard Deontic Logic, and for inconsistent sets of obligations it validates aggregation "as much as possible". It thus leads to a richer consequence set than SDLaPe. The logic P2.1(r) avoids Goble's criticisms of other non-adjunctive systems of deontic logic. Moreover, it can handle all the 'toy examples' from the literature as well as more complex ones.
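    As background to why restricting aggregation matters, the block below gives the standard "deontic explosion" derivation from the literature: a generic sketch, not a derivation in P2.1(r) itself, showing how one conflict of obligations plus unrestricted aggregation and inheritance yields every obligation.

    ```latex
    % Standard illustration (not specific to P2.1(r)): a single deontic
    % conflict plus unrestricted aggregation (AND) and inheritance (RM)
    % makes every obligation OB derivable -- "deontic explosion".
    \documentclass{article}
    \usepackage{amsmath}
    \begin{document}
    \begin{align*}
    1.\;& O A                             && \text{premise (one horn of the conflict)}\\
    2.\;& O \neg A                        && \text{premise (the other horn)}\\
    3.\;& O (A \wedge \neg A)             && \text{from 1, 2 by aggregation (AND)}\\
    4.\;& (A \wedge \neg A) \rightarrow B && \text{propositional tautology, $B$ arbitrary}\\
    5.\;& O B                             && \text{from 3, 4 by inheritance (RM)}
    \end{align*}
    \end{document}
    ```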

    ASSESSING THE RISKS OF A FUTURE RAPID LARGE SEA LEVEL RISE: A REVIEW

    Our aim is to make an appropriate characterization and interpretation of the risk problem of rapid large sea level rise, one that reflects the very large uncertainty in present-day knowledge concerning this possibility and that will be useful in informing discussion about risk management approaches. We consider mainly the potential collapse of the West Antarctic ice sheet as the source of such a sea level rise. Our review, characterization and interpretation lead us to conclude that the risk of a rapid large sea level rise is characterized by potentially catastrophic consequences and high epistemic uncertainty; effective risk management must involve highly adaptive management regimes, vulnerability reduction, and prompt development of capabilities for precautionary reduction of climate change forcings.
    Keywords: sea level rise, West Antarctic ice sheet, climate change, adaptive management, epistemic uncertainty, risk management arenas, vulnerability

    Thermal Radiation Analysis System (TRASYS)

    A user's manual is presented for TRASYS, a digital software system with a generalized capability for solving radiation problems. Subroutine, file, and variable definitions are presented, along with subroutine and function descriptions for the preprocessor. Definitions and descriptions of the components of the processor are also presented.
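    TRASYS's own subroutine interfaces are not reproduced here; as a hedged illustration of the class of radiation problem such a system solves, the Python sketch below evaluates black-body radiative exchange between two surfaces from a user-supplied view factor, using the standard Stefan-Boltzmann relation. It is an illustrative assumption, not TRASYS code.

    ```python
    # Illustrative only -- not the TRASYS interface. Black-body radiative
    # exchange between surface 1 and surface 2, given area A1 (m^2), the
    # view factor F12, and absolute temperatures T1, T2 (K).
    SIGMA = 5.670374419e-8  # Stefan-Boltzmann constant, W/(m^2 K^4)

    def radiative_exchange(a1: float, f12: float, t1: float, t2: float) -> float:
        """Net radiative heat flow from surface 1 to surface 2, in watts."""
        return SIGMA * a1 * f12 * (t1 ** 4 - t2 ** 4)

    if __name__ == "__main__":
        # Hypothetical case: a 2 m^2 panel at 350 K facing a surface at 300 K, F12 = 0.4.
        print(radiative_exchange(2.0, 0.4, 350.0, 300.0), "W")
    ```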

    Lost in translation: data integration tools meet the Semantic Web (experiences from the Ondex project)

    More information is now being published in machine-processable form on the web and, as de facto distributed knowledge bases are materializing, partly encouraged by the vision of the Semantic Web, the focus is shifting from the publication of this information to its consumption. Platforms for data integration, visualization and analysis that are based on a graph representation of information appear to be first candidates to consume web-based information that is readily expressible as graphs. The question is whether applying these platforms to information available on the Semantic Web requires some adaptation of their data structures and semantics. Ondex is a network-based data integration, analysis and visualization platform which has been developed in a Life Sciences context. A number of features, including semantic annotation via ontologies and an attention to provenance and evidence, make it an ideal candidate to consume Semantic Web information, as well as a prototype for the application of network analysis tools in this context. By analyzing the Ondex data structure and its usage, we have found a set of discrepancies and errors arising from the semantic mismatch between a procedural approach to network analysis and the implications of a web-based representation of information. We report in the paper on the simple methodology that we adopted to conduct this analysis, and on the issues we found, which may be relevant for a range of similar platforms.
    Comment: Presented at DEIT, Data Engineering and Internet Technology, 2011 IEEE: CFP1113L-CD
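    To illustrate the kind of consumption scenario discussed here, pulling web-style RDF triples into the procedural property-graph structures a network-analysis platform works with, the following minimal Python sketch uses rdflib and networkx as stand-ins. It is an assumption-laden illustration, not Ondex code; the file path is a placeholder.

    ```python
    # Minimal sketch (not Ondex itself): read RDF triples with rdflib and
    # rebuild them as a labelled multigraph in networkx, the kind of
    # procedural graph structure a network-analysis platform expects.
    from rdflib import Graph
    import networkx as nx

    def rdf_to_property_graph(path: str) -> nx.MultiDiGraph:
        rdf = Graph()
        rdf.parse(path)  # serialization format is guessed from the file extension
        g = nx.MultiDiGraph()
        for s, p, o in rdf:
            # Every triple becomes an edge; the predicate IRI is kept as a label.
            g.add_edge(str(s), str(o), predicate=str(p))
        return g

    if __name__ == "__main__":
        graph = rdf_to_property_graph("example.ttl")  # placeholder input file
        print(graph.number_of_nodes(), "nodes,", graph.number_of_edges(), "edges")
    ```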

    Habits of Mind and the Split-Mind Effect: When Computer-Assisted Qualitative Data Analysis Software is Used in Phenomenological Research

    When Marshall McLUHAN famously stated "the medium is the message," he was echoing Martin HEIDEGGER's assertion that through our use of technology we can become functions of it. How, then, does adopting computer-assisted qualitative data analysis software affect our research activities and, more importantly, our conception of research? These questions are explored by examining the influence NVivo had upon an interdisciplinary phenomenological research project in health ethics. We identify the software's effects and situate our decision to use it within the Canadian health sciences research landscape. We also explore the challenges of remaining true to our project's philosophical foundations, as well as how NVivo altered our being-in-the-world as researchers. This case demonstrates McLUHAN's claim that new technologies invariably initiate new practices and modes of being, and urges researchers to attend to how we are both shaping and being shaped by software.
    URN: http://nbn-resolving.de/urn:nbn:de:0114-fqs120227