
    Resilience markers for safer systems and organisations

    If computer systems are to be designed to foster resilient performance, it is important to be able to identify the contributors to resilience. The emerging practice of Resilience Engineering has identified that people are still a primary source of resilience, and that the design of distributed systems should provide ways of helping people and organisations to cope with complexity. Although resilience has been identified as a desired property, researchers and practitioners do not yet have a clear understanding of what manifestations of resilience look like. This paper discusses examples of strategies that people can adopt to improve the resilience of a system. Critically, analysis reveals that the generation of these strategies is only possible if the system facilitates them. As an example, this paper discusses practices, such as reflection, that are known to encourage resilient behaviour in people. Reflection allows systems to prepare better for oncoming demands. We show that contributors to the practice of reflection manifest themselves at different levels of abstraction: from individual strategies to practices in, for example, control-room environments. The analysis of interaction at these levels enables resilient properties of a system to be ‘seen’, so that systems can be designed to support them explicitly. We then present an analysis of resilience at an organisational level within the nuclear domain. This highlights some of the challenges facing the Resilience Engineering approach and the need for a collective language to articulate knowledge of resilient practices across domains.

    A Correlation Between Hard Gamma-ray Sources and Cosmic Voids Along the Line of Sight

    We estimate the galaxy density along lines of sight to hard extragalactic gamma-ray sources by correlating source positions on the sky with a void catalog based on the Sloan Digital Sky Survey (SDSS). Extragalactic gamma-ray sources that are detected at very high energy (VHE; E>100 GeV) or have been highlighted as VHE-emitting candidates in the Fermi Large Area Telescope hard source catalog (together referred to as "VHE-like" sources) are distributed along underdense lines of sight at the 2.4 sigma level. There is also a less suggestive correlation for the Fermi hard source population (1.7 sigma). A correlation between 10-500 GeV flux and the underdense fraction along the line of sight for VHE-like and Fermi hard sources is found at 2.4 sigma and 2.6 sigma, respectively. The preference for underdense sight lines is not displayed by gamma-ray-emitting galaxies within the second Fermi catalog, containing sources detected above 100 MeV, or by the SDSS DR7 quasar catalog. We investigate whether this marginal correlation might be a result of lower extragalactic background light (EBL) photon density within the underdense regions and find that, even in the most extreme case of an entirely underdense sight line, the EBL photon density is only 2% less than the nominal EBL density. Translating this into gamma-ray attenuation along the line of sight for a highly attenuated source with opacity tau(E,z) ~ 5, we estimate that the attenuation of gamma-rays decreases by no more than 10%. This decrease, although non-negligible, is unable to account for the apparent hard-source correlation with underdense lines of sight. Comment: Accepted by MNRAS
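The attenuation argument above can be checked with a back-of-the-envelope calculation. Assuming the opacity tau scales linearly with the EBL photon density (the only assumption here; the abstract does not state the scaling explicitly), a 2 % density deficit along a sight line with nominal tau ≈ 5 changes the transmitted flux by a factor exp(0.02 · tau):

```python
import math

def flux_boost(tau_nominal: float, density_deficit: float) -> float:
    """Fractional increase in transmitted flux when the EBL photon
    density (and hence the opacity tau) is reduced by `density_deficit`."""
    tau_void = tau_nominal * (1.0 - density_deficit)
    # Transmission is exp(-tau); compare void and nominal sight lines.
    return math.exp(-tau_void) / math.exp(-tau_nominal) - 1.0

# tau ~ 5 source behind a fully underdense sight line (2% lower EBL density)
print(f"{flux_boost(5.0, 0.02):.1%}")  # prints "10.5%"
```

This reproduces the "no more than 10%" figure quoted in the abstract: even for a heavily attenuated source, the voids cannot brighten it by more than about a tenth.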

    A study of the relative effectiveness and cost of computerized information retrieval in the interactive mode

    Results of a number of experiments to illuminate the relative effectiveness and costs of computerized information retrieval in the interactive mode are reported. It was found that, for equal time spent in preparing the search strategy, the batch and interactive modes gave approximately equal recall and relevance. The interactive mode, however, encourages the searcher to devote more time to the task and therefore usually yields improved output. As a result, engineering costs are higher in this mode. Estimates of the associated hardware costs also indicate that operation in this mode is more expensive. Skilled RECON users like the rapid feedback and additional features offered by this mode when they are not constrained by considerations of cost.

    Absolute calibration and beam reconstruction of MITO (a ground-based instrument in the millimetric region)

    An efficient sky-data reconstruction relies on a precise characterization of the observing instrument. Here we describe the reconstruction of the performance of a single-pixel 4-band photometer installed at the MITO (Millimeter and Infrared Testagrigia Observatory) focal plane. The strategy of differential sky observations at millimeter wavelengths, scanning the field of view at constant elevation while wobbling the subreflector, yields good knowledge of the beam profile and beam-throw amplitude, allowing efficient data recovery. The problems that arise when estimating detector throughput by drift scanning on planets are shown. Atmospheric transmission, monitored by the skydip technique, is taken into account when deriving the final responsivities for the 4 channels, using planets as primary calibrators. Comment: 14 pages, 6 figures, accepted for publication by New Astronomy (25 March

    Opportunities and barriers for adoption of a decision-support tool for Alzheimer's Disease

    Clinical decision-support tools (DSTs) represent a valuable resource in healthcare. However, a lack of Human Factors considerations and of early design research has often limited their successful adoption. To complement previous technically focused work, we studied adoption opportunities for a future DST built on a predictive model of Alzheimer’s Disease (AD) progression. Our aim is two-fold: exploring adoption opportunities for DSTs in AD clinical care, and testing a novel combination of methods to support this process. We focused on understanding current clinical needs and practices, and the potential for such a tool to be integrated into the setting, prior to its development. Our user-centred approach was based on field observations and semi-structured interviews, analysed through workflow analysis, user profiles, and a design-reality gap model. The first two are common practice, whilst the latter provided added value in highlighting specific adoption needs. We identified the likely early adopters of the tool as psychiatrists and neurologists based in research-oriented clinical settings. We defined ten key requirements for the translation and adoption of DSTs for AD, covering IT, user, and contextual factors. Future work can use and build on these requirements to give such tools a greater chance of being adopted in the clinical setting.

    James Hutton’s geological tours of Scotland : romanticism, literary strategies, and the scientific quest

    This article explores a somewhat neglected part of the story of the emergence of geology as a science and discourse in the late eighteenth century – James Hutton’s posthumously published accounts of the geological tours of Scotland that he undertook between 1785 and 1788 in search of empirical evidence for his theory of the Earth, and that he intended to include in the projected third volume of his Theory of the Earth of 1795. The article brings some of the assumptions and techniques of literary criticism to bear on Hutton’s scientific travel writing in order to open up new connections between geology, Romantic aesthetics, and eighteenth-century travel writing about Scotland. Close analysis of Hutton’s accounts of his field trips to Glen Tilt, Galloway, and Arran, supplemented by later accounts of the discoveries at Jedburgh and Siccar Point, reveals the interplay between desire, travel, and the scientific quest, and foregrounds the textual strategies that Hutton uses to persuade his readers that they share in the experience of geological discovery and interpretation as ‘virtual witnesses’. As well as allowing us to revisit the interrelation between scientific theory and discovery, the article concludes that Hutton was a much better writer than he has been given credit for, and suggests that, had these geological tours been published in 1795, they would have made it impossible for critics to dismiss him as an armchair geologist.

    Carers’ experiences of home enteral feeding: a survey exploring medicines administration challenges and strategies


    Genetic screening of 202 individuals with congenital limb malformations and requiring reconstructive surgery

    BACKGROUND: Congenital limb malformations (CLMs) are common and present to a variety of specialties, notably plastic and orthopaedic surgeons and clinical geneticists. The authors aimed to characterise causative mutations in an unselected cohort of patients with CLMs requiring reconstructive surgery. METHODS: 202 patients presenting with CLM were recruited. The authors obtained G-banded karyotypes and screened EN1, GLI3, HAND2, HOXD13, ROR2, SALL1, SALL4, the ZRS of SHH, SPRY4, TBX5, TWIST1 and WNT7A for point mutations using denaturing high-performance liquid chromatography (DHPLC) and direct sequencing. Multiplex ligation-dependent probe amplification (MLPA) kits were developed and used to measure copy number in GLI3, HOXD13, ROR2, SALL1, SALL4, TBX5 and the ZRS of SHH. RESULTS: Within the cohort, causative genetic alterations were identified in 23 patients (11%): mutations in GLI3 (n = 5), HOXD13 (n = 5), the ZRS of SHH (n = 4), and chromosome abnormalities (n = 4) were the most common lesions found. Clinical features that predicted the discovery of a genetic cause included a bilateral malformation, a positive family history, and an increasing number of limbs affected (all p<0.01). Additionally, specific patterns of malformation predicted mutations in specific genes. CONCLUSIONS: Based on the higher mutation prevalence, the authors propose that GLI3, HOXD13 and the ZRS of SHH should be prioritised for introduction into molecular genetic testing programmes for CLM. The authors have developed simple criteria that can refine the selection of patients by surgeons for referral to clinical geneticists. The cohort also represents an excellent resource for testing for mutations in novel candidate genes.
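The headline detection rate follows from the counts given in the abstract. A minimal tally (the "other" entry is inferred as a remainder, since the abstract does not break down the less common findings):

```python
# Counts reported in the abstract; "other" is the remainder, an inference
# rather than an explicitly reported per-gene figure.
cohort = 202
findings = {"GLI3": 5, "HOXD13": 5, "ZRS of SHH": 4, "chromosome abnormality": 4}
findings["other"] = 23 - sum(findings.values())  # 23 solved cases in total

solved = sum(findings.values())
print(f"{solved}/{cohort} = {solved / cohort:.0%}")  # prints "23/202 = 11%"
```

GLI3, HOXD13, and the ZRS of SHH together account for the majority of solved cases, which is the basis for prioritising them in testing programmes.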

    Shrub Communities, Spatial Patterns, and Shrub-Mediated Tree Mortality following Reintroduced Fire in Yosemite National Park, California, USA

    Shrubs contribute to the forest fuel load, and their distribution is important to tree mortality, regeneration, and vertebrate occupancy. We used a method new to fire ecology—extensive continuous mapping of trees and shrub patches within a single large (25.6 ha) study site—to identify changes in shrub area, biomass, and spatial pattern following fire reintroduction by a backfire after a century of fire exclusion in the lower montane forests of the Sierra Nevada, California, USA. We examined whether trees in close proximity to shrubs prior to the fire experienced higher mortality rates than trees in areas without shrubs. We calculated shrub biomass using demography subplots and existing allometric equations, and we developed new equations for beaked hazel (Corylus cornuta ssp. californica [A. de Candolle] E. Murray) from full dissection of 50 stems. Fire decreased shrub patch area from 15.1 % to 0.9 %, reduced live shrub biomass from 3.49 Mg ha−1 to 0.27 Mg ha−1, and consumed 4.41 Mg ha−1 of living and dead shrubs. The number of distinct (non-overlapping) shrub patches decreased from 47 ha−1 to 6 ha−1. The mean distance between shrub patches increased by 135 %. Distances between montane chaparral patches increased by 285 %, compared with a 54 % increase between riparian shrub patches and a 267 % increase between generalist shrub patches. Fire-related tree mortality within shrub patches was marginally lower (67.6 % versus 71.8 %), showing a contrasting effect of shrubs on tree mortality between this forest ecosystem and chaparral-dominated ecosystems, in which most trees are killed by fire.
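The reported figures can be cross-checked for internal consistency. In particular, the split between consumed live and consumed dead biomass is derived below rather than directly reported, since only the live before/after values and the combined total appear in the abstract:

```python
def pct_change(before: float, after: float) -> float:
    """Percent change from `before` to `after` (negative = decrease)."""
    return (after - before) / before * 100.0

# Shrub patch area and live biomass, before vs. after the fire
print(f"patch area:   {pct_change(15.1, 0.9):.0f}%")   # prints "-94%"
print(f"live biomass: {pct_change(3.49, 0.27):.0f}%")  # prints "-92%"

# Consumed live biomass vs. the reported 4.41 Mg/ha total (living + dead)
live_consumed = 3.49 - 0.27
dead_consumed = 4.41 - live_consumed  # derived: ~1.19 Mg/ha of dead shrubs
print(f"dead consumed: {dead_consumed:.2f} Mg/ha")
```

The ~94 % loss of patch area and ~92 % loss of live biomass agree, and roughly a quarter of the consumed fuel was standing dead shrub material.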