2,577 research outputs found

    The Gödel and the Splitting Translations

    When the new research area of logic programming and non-monotonic reasoning emerged at the end of the 1980s, it focused notably on the study of mathematical relations between different non-monotonic formalisms, especially between the semantics of stable models and various non-monotonic modal logics. Given the many and varied embeddings of stable models into systems of modal logic, the modal interpretation of logic programming connectives and rules became the dominant view until well into the new century. Recently, modal interpretations are once again receiving attention in the context of hybrid theories that combine reasoning with non-monotonic rules and ontologies or external knowledge bases. In this talk I explain how familiar embeddings of stable models into modal logics can be seen as special cases of two translations that are very well-known in non-classical logic. They are, first, the translation used by Gödel in 1933 to embed Heyting’s intuitionistic logic H into a modal provability logic equivalent to Lewis’s S4; second, the splitting translation, known since the mid-1970s, that allows one to embed extensions of S4 into extensions of the non-reflexive logic K4. By composing the two translations one can obtain (Goldblatt, 1978) an adequate provability interpretation of H within the Gödel-Löb logic GL, the system shown by Solovay (1976) to capture precisely the provability predicate of Peano Arithmetic. These two translations and their composition not only apply to monotonic logics extending H and S4, they also apply in several relevant cases to non-monotonic logics built upon such extensions, including equilibrium logic, non-monotonic S4F and autoepistemic logic. The embeddings obtained are not merely faithful and modular, they are based on fully recursive translations applicable to arbitrary logical formulas. Besides providing a uniform picture of some older results in LPNMR, the translations yield a perspective from which some new logics of belief emerge in a natural way.
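
    For reference, the two translations can be written down concisely. The clause-by-clause presentation below is a standard formulation from the non-classical logic literature (one of several equivalent variants), supplied here for illustration rather than quoted from the talk.

```latex
% Gödel translation G: embeds Heyting's intuitionistic logic H into S4
% by guarding atoms and implications with the necessity operator.
\begin{align*}
  G(p)                    &= \Box p \\
  G(\bot)                 &= \bot \\
  G(\varphi \wedge \psi)  &= G(\varphi) \wedge G(\psi) \\
  G(\varphi \vee \psi)    &= G(\varphi) \vee G(\psi) \\
  G(\varphi \to \psi)     &= \Box\bigl(G(\varphi) \to G(\psi)\bigr)
\end{align*}

% Splitting translation S: embeds extensions of S4 into extensions of K4
% by "splitting" each box into the formula conjoined with its box;
% S commutes with all the other connectives.
\begin{align*}
  S(\Box\varphi) &= S(\varphi) \wedge \Box\, S(\varphi)
\end{align*}

% Composing the two yields Goldblatt's (1978) provability reading of H in GL:
%   H \vdash \varphi \quad\text{iff}\quad GL \vdash S(G(\varphi))
```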

    The visceral response to underbody blast

    Blast is the most common cause of injury and death in contemporary warfare. Blast injuries may be categorised based upon their mechanism, with underbody blast describing the effect of an explosive device detonating underneath a vehicle. Torso injuries are highly lethal within this environment, and yet their mechanism in response to underbody blast is poorly understood. This work seeks to understand the pattern and mechanism of these injuries and to link them to physical underbody blast loading parameters in order to enable mitigation and prevention of serious injury and death. An analysis of the United Kingdom Joint Theatre Trauma Registry for underbody blast events demonstrates that torso injury is a major cause of morbidity and mortality from such incidents. Mediastinal injury, including trauma to the heart and thoracic great vessels, is shown to confer the greatest lethality within this complex environment. This work establishes the need for a novel in vivo model of underbody loading in order to explore the mechanisms of severe torso injury and to define the relationship between the “dose” of underbody loading and resultant injury. The work includes the development of a new rig which imparts vertical accelerations, analogous to underbody blast, upon a seated rat model. Injuries caused by this loading to both the chest and abdomen are best predicted by examining the kinematic response of the torso to the loading. Axial compression of the torso, a previously undescribed injury metric, is shown to be the best predictor of injury. The ability of these results to translate to a human model is explored in detail, with focus upon the biomechanical rationale: that torso organ injuries occur through both direct compression and shearing of tethering attachments. Survivability of underbody blast could be improved by applying these principles to the design and modification of seats, vehicles and posture. Open Access

    Orbital Parameter Determination for Wide Stellar Binary Systems in the Age of Gaia

    The orbits of binary stars and planets, particularly eccentricities and inclinations, encode the angular momentum within these systems. Within stellar multiple systems, the magnitude and (mis)alignment of angular momentum vectors among stars, disks, and planets probes the complex dynamical processes guiding their formation and evolution. The accuracy of the \textit{Gaia} catalog can be exploited to enable comparison of binary orbits with known planet or disk inclinations without costly long-term astrometric campaigns. We show that \textit{Gaia} astrometry can place meaningful limits on orbital elements in cases with reliable astrometry, and discuss metrics for assessing the reliability of \textit{Gaia} DR2 solutions for orbit fitting. We demonstrate our method by determining orbital elements for three systems (DS Tuc AB, GK/GI Tau, and Kepler-25/KOI-1803) using \textit{Gaia} astrometry alone. We show that DS Tuc AB's orbit is nearly aligned with the orbit of DS Tuc Ab, GK/GI Tau's orbit might be misaligned with their respective protoplanetary disks, and the Kepler-25/KOI-1803 orbit is not aligned with either component's transiting planetary system. We also demonstrate cases where \textit{Gaia} astrometry alone fails to provide useful constraints on orbital elements. To enable broader application of this technique, we introduce the Python tool \texttt{lofti\_gaiaDR2} to allow users to easily determine orbital element posteriors. Comment: 18 pages, 10 figures, accepted for publication in Ap
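
    To make the data side of this concrete, the sketch below shows one way to retrieve DR2 astrometry for the two components of a wide binary with astroquery and to form the relative proper motion that an orbit fit would consume. The source_id values are placeholders, and this is an illustrative sketch, not the lofti_gaiaDR2 code itself.

```python
# Minimal sketch: pull Gaia DR2 astrometry for two binary components,
# then form the relative proper motion an orbit fit would use.
# The source_id values below are placeholders, not real identifiers.
import numpy as np
from astroquery.gaia import Gaia

source_id_A = 1234567890123456789  # placeholder for the primary
source_id_B = 1234567890123456790  # placeholder for the secondary

job = Gaia.launch_job(
    "SELECT source_id, ra, dec, parallax, pmra, pmdec, "
    "parallax_error, pmra_error, pmdec_error "
    "FROM gaiadr2.gaia_source "
    f"WHERE source_id IN ({source_id_A}, {source_id_B})"
)
tbl = job.get_results()
a, b = tbl[0], tbl[1]

# Relative proper motion (mas/yr), converted to a projected relative
# velocity (km/s) using the primary's parallax (mas) for the distance (pc).
dpm_ra = b["pmra"] - a["pmra"]
dpm_dec = b["pmdec"] - a["pmdec"]
dist_pc = 1000.0 / a["parallax"]
kms_per_masyr = 4.74e-3 * dist_pc  # 4.74 km/s per arcsec/yr at 1 pc
v_rel = np.hypot(dpm_ra, dpm_dec) * kms_per_masyr
print(f"relative PM: ({dpm_ra:.3f}, {dpm_dec:.3f}) mas/yr, ~{v_rel:.2f} km/s")
```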

    Disease activity and cognition in rheumatoid arthritis : an open label pilot study

    Acknowledgements: This work was supported in part by NIHR Newcastle Biomedical Research Centre. Funding for this study was provided by Abbott Laboratories. Abbott Laboratories were not involved in study design; in the collection, analysis and interpretation of data; or in the writing of the report. Peer reviewed. Publisher PDF

    Improved Parametric Empirical Determination of Module Short Circuit Current for Modelling and Optimization of Solar Photovoltaic Systems

    Correct modelling of solar photovoltaic (PV) system yields is necessary to optimize system design, improve the reliability of projected outputs to ensure favourable project financing, and facilitate proper operations and maintenance. An improved methodology for fine-resolution modelling of PV systems using module short-circuit current (Isc) at 5-minute time-scales is presented, which clearly identifies pertinent error mechanisms that arise when working at this high resolution. This work uses a modified version of the Sandia array performance model and introduces new factors into the calculation of Isc to account for the identified error mechanisms, including instrumentation alignment, spectral, and module power tolerance errors. A simple methodology is introduced and verified whereby specific module parameters can be derived solely from properly filtered performance time series data. In particular, this paper focuses on methodologies for determining the predicted Isc for a variety of solar PV module types. These methods of regressive analysis significantly reduce the error of the predicted model and demonstrate the need for this form of modelling when evaluating long-term PV array performance. This methodology has applications for current systems operators, enabling the extraction of useful module parameters from existing data as well as more precise continuous monitoring of existing systems, and can also be used to more accurately model and optimize new systems.
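
    For context, the sketch below writes out the baseline Sandia array performance model expression for module short-circuit current that this kind of modelling starts from. The coefficient values are placeholders, and the paper's additional alignment, spectral, and power-tolerance correction factors are not reproduced here.

```python
# Minimal sketch of the baseline Sandia Array Performance Model (SAPM)
# short-circuit current, before any additional correction factors.
# All module coefficients below are placeholder values for illustration.
import numpy as np

def sapm_isc(e_beam, e_diff, aoi, airmass_abs, temp_cell,
             isc0=9.0,            # reference Isc at STC (A), placeholder
             alpha_isc=0.0005,    # Isc temperature coefficient (1/degC), placeholder
             fd=1.0,              # fraction of diffuse irradiance used by the module
             a=(0.9, 0.07, -0.01, 0.0, 0.0),               # f1(AMa) polynomial, placeholder
             b=(1.0, -0.002, 3e-4, -1e-5, 1e-7, -4e-9),    # f2(AOI) polynomial, placeholder
             e0=1000.0, t0=25.0):
    """Isc = Isc0 * f1(AMa) * ((Eb*f2(AOI) + fd*Ediff)/E0) * (1 + alpha*(Tc - T0))."""
    f1 = np.polyval(a[::-1], airmass_abs)   # spectral (air mass) modifier
    f2 = np.polyval(b[::-1], aoi)           # angle-of-incidence modifier
    e_eff = (e_beam * f2 + fd * e_diff) / e0
    return isc0 * f1 * e_eff * (1.0 + alpha_isc * (temp_cell - t0))

# A 5-minute-resolution model would evaluate this once per time-series sample.
print(sapm_isc(e_beam=800.0, e_diff=150.0, aoi=30.0, airmass_abs=1.5, temp_cell=45.0))
```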

    dcc --help: Generating Context-Aware Compiler Error Explanations with Large Language Models

    In the challenging field of introductory programming, high enrollments and failure rates drive us to explore tools and systems to enhance student outcomes, especially automated tools that scale to large cohorts. This paper presents and evaluates the dcc --help tool, an integration of a Large Language Model (LLM) into the Debugging C Compiler (DCC) to generate unique, novice-focused explanations tailored to each error. dcc --help prompts an LLM with contextual information about compile- and run-time error occurrences, including the source code, error location and standard compiler error message. The LLM is instructed to generate novice-focused, actionable error explanations and guidance, designed to help students understand and resolve problems without providing solutions. dcc --help was deployed to our CS1 and CS2 courses, with 2,565 students using the tool over 64,000 times in ten weeks. We analysed a subset of these error/explanation pairs to evaluate their properties, including conceptual correctness, relevancy, and overall quality. We found that the LLM-generated explanations were conceptually accurate in 90% of compile-time and 75% of run-time cases, but often disregarded the instruction not to provide solutions in code. Our findings, observations and reflections following deployment indicate that dcc --help provides novel opportunities for scaffolding students' introduction to programming. Comment: 7 pages, 2 figures. Accepted in SIGCSE'2
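
    As a rough illustration of the prompting pattern described, the sketch below assembles a context-aware prompt from a compiler error and asks an LLM for a novice-focused explanation that withholds corrected code. It is not the actual dcc --help implementation; the model name and prompt wording are assumptions.

```python
# Illustrative sketch of the prompting pattern described above: gather the
# compiler error context and ask an LLM for a novice-focused explanation
# that deliberately avoids handing out a corrected solution.
# Not the actual dcc --help implementation; the model name and prompt
# wording are assumptions for illustration.
from openai import OpenAI

def explain_error(source_code: str, error_message: str, error_line: int) -> str:
    prompt = (
        "You are helping a beginner C programmer in an introductory course.\n"
        "Explain the following compiler error in plain language and suggest "
        "what to investigate, but do NOT write corrected code for them.\n\n"
        f"Error (reported at line {error_line}):\n{error_message}\n\n"
        f"Student's source code:\n{source_code}\n"
    )
    client = OpenAI()  # reads OPENAI_API_KEY from the environment
    response = client.chat.completions.create(
        model="gpt-4o-mini",  # assumed model choice for illustration
        messages=[{"role": "user", "content": prompt}],
    )
    return response.choices[0].message.content
```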

    Obesity prevention and the role of hospital and community-based health services: A scoping review

    Background: Control of obesity is an important priority to reduce the burden of chronic disease. Clinical guidelines focus on the role of primary healthcare in obesity prevention. The purpose of this scoping review is to examine what the published literature indicates about the role of hospital and community-based health services in adult obesity prevention, in order to map the evidence and identify gaps in existing research. Methods: Databases were searched for articles published in English between 2006 and 2016 and screened against inclusion and exclusion criteria. Further papers were identified through a manual search of the reference lists. Included papers evaluated interventions aimed at preventing overweight and obesity in adults that were implemented within and/or by hospital and community health services; were an empirical description of obesity prevention within a health setting; or reported health staff perceptions of obesity and obesity prevention. Results: The evidence supports screening all healthcare patients for obesity, combined with referral to appropriate intervention services, but indicates that health professionals do not typically adopt this practice. As well as practical issues such as time and resourcing, implementation is impacted by health professionals’ views about the causes of obesity and doubts about the benefits of the health sector intervening once someone is already obese. Implementation of practice guidelines is further hindered by health professionals’ lack of confidence or knowledge about how to integrate prevention into clinical care, by judgements about who might benefit from prevention, and by negative views about the effectiveness of prevention. This is compounded by an often prevailing view that preventing obesity is a matter of personal responsibility and choice. Conclusions: This review highlights that, whilst a population health approach is important to address the complexity of obesity, the remit of health services should be extended beyond medical treatment to incorporate obesity prevention through screening and referral. Further research into the role of health services in obesity prevention should take a systems approach to examine how health service structures, policy and practice interrelationships, and service delivery boundaries, processes and perspectives impact on changing models of care.