
    Safer clinical systems: interim report, August 2010

    Safer Clinical Systems is the Health Foundation's new five-year programme of work to test and demonstrate ways of improving healthcare systems and processes in order to improve patient safety. It builds on learning from the Safer Patients Initiative (SPI) and on models of system improvement from healthcare and other industries. Learning from the SPI highlighted the need to take a clinical systems approach to improving safety: many hospitals struggle to implement improvement in clinical areas because of inherent problems with support mechanisms, and clinical processes and systems, rather than individuals, are often the contributors to breakdowns in patient safety. The Safer Clinical Systems programme aimed to measure the reliability of clinical processes, identify defects within those processes, and identify the systems that give rise to those defects. Methods to improve system reliability were then to be tested and redeveloped in order to reduce the risk of harm to patients; such system-level awareness should also lead to improvements in other patient care pathways. The relationship between system reliability and actual harm is challenging to identify and measure. In other programmes that used specific, well-defined, small-scale processes, system reliability has been shown to have a direct causal relationship with harm (e.g. care bundle compliance in an intensive care unit can reduce the incidence of ventilator-associated pneumonia). However, it has become evident that harm can be caused by a variety of factors over time; in broader, more complex and dynamic systems, changes in outcome are difficult to attribute to specific improvements, and relating evidence to the resulting harm is similarly difficult. The overall aim of Phase 1 of the Safer Clinical Systems programme was to demonstrate proof of concept that a systems-based approach could contribute to improved patient safety. In Phase 1, experienced NHS teams from four locations worked together with expert advisers to co-design the Safer Clinical Systems programme.

    Resiliency in numerical algorithm design for extreme scale simulations

    This work is based on the seminar 'Resiliency in Numerical Algorithm Design for Extreme Scale Simulations', held March 1–6, 2020, at Schloss Dagstuhl, which was attended by all the authors. Advanced supercomputing is characterized by very high computation speeds at the cost of an enormous amount of resources. A typical large-scale computation running for 48 h on a system consuming 20 MW, as predicted for exascale systems, would consume a million kWh, corresponding to about 100k Euro in energy cost for executing 10²³ floating-point operations. It is clearly unacceptable to lose the whole computation if any of the several million parallel processes fails during the execution. Moreover, if a single operation suffers from a bit-flip error, should the whole computation be declared invalid? What about the notion of reproducibility itself: should this core paradigm of science be revised and refined for results that are obtained by large-scale simulation? Naive versions of conventional resilience techniques will not scale to the exascale regime: with a main memory footprint of tens of petabytes, synchronously writing checkpoint data all the way to background storage at frequent intervals will create intolerable overheads in runtime and energy consumption. Forecasts show that the mean time between failures could be lower than the time to recover from such a checkpoint, so that large calculations at scale might not make any progress if robust alternatives are not investigated. More advanced resilience techniques must be devised. The key may lie in exploiting both advanced system features and specific application knowledge. Research will face two essential questions: (1) what are the reliability requirements for a particular computation, and (2) how do we best design the algorithms and software to meet these requirements? While the analysis of use cases can help understand the particular reliability requirements, the construction of remedies is currently wide open. One avenue would be to refine and improve system- or application-level checkpointing and rollback strategies for the case in which an error is detected. Developers might use fault notification interfaces and flexible runtime systems to respond to node failures in an application-dependent fashion. Novel numerical algorithms or more stochastic computational approaches may be required to meet accuracy requirements in the face of undetectable soft errors. These ideas constituted an essential topic of the seminar. The goal of this Dagstuhl Seminar was to bring together a diverse group of scientists with expertise in exascale computing to discuss novel ways to make applications resilient against detected and undetected faults. In particular, participants explored the role that algorithms and applications play in the holistic approach needed to tackle this challenge. This article gathers a broad range of perspectives on the role of algorithms, applications and systems in achieving resilience for extreme scale simulations. The ultimate goal is to spark novel ideas and encourage the development of concrete solutions for achieving such resilience holistically. Peer reviewed. Article signed by 36 authors: Emmanuel Agullo, Mirco Altenbernd, Hartwig Anzt, Leonardo Bautista-Gomez, Tommaso Benacchio, Luca Bonaventura, Hans-Joachim Bungartz, Sanjay Chatterjee, Florina M. Ciorba, Nathan DeBardeleben, Daniel Drzisga, Sebastian Eibl, Christian Engelmann, Wilfried N. Gansterer, Luc Giraud, Dominik Göddeke, Marco Heisig, Fabienne Jézéquel, Nils Kohl, Xiaoye Sherry Li, Romain Lion, Miriam Mehl, Paul Mycek, Michael Obersteiner, Enrique S. Quintana-Ortí, Francesco Rizzi, Ulrich Rüde, Martin Schulz, Fred Fung, Robert Speck, Linda Stals, Keita Teranishi, Samuel Thibault, Dominik Thönnes, Andreas Wagner and Barbara Wohlmuth. Postprint (author's final draft).
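
    The checkpoint-versus-failure tension described in this abstract can be made concrete with the standard first-order Young/Daly model for the optimal checkpoint interval. The sketch below is not from the article; the checkpoint cost, mean time between failures and energy price are illustrative assumptions.

    ```python
    # A minimal sketch (not from the article) of the first-order Young/Daly
    # checkpoint-interval model, plus the abstract's energy arithmetic.
    import math

    def young_daly_interval(checkpoint_cost_s: float, mtbf_s: float) -> float:
        """First-order optimal compute time between checkpoints (Young, 1974)."""
        return math.sqrt(2.0 * checkpoint_cost_s * mtbf_s)

    # Illustrative exascale-like numbers (assumptions, not measurements):
    C = 1800.0      # seconds to synchronously write one checkpoint
    M = 4 * 3600.0  # system mean time between failures, in seconds

    tau = young_daly_interval(C, M)
    print(f"checkpoint every ~{tau / 3600:.1f} h")  # ~2.0 h for these numbers

    # If recovering (reloading the checkpoint and redoing the lost work) takes
    # longer than the MTBF, the run makes no forward progress (the scenario
    # the abstract warns about).
    recovery = C + tau  # rough worst case
    print("no progress expected" if recovery > M else "progress possible")

    # Energy arithmetic from the abstract: 20 MW for 48 h is about a million kWh.
    energy_kwh = 20_000 * 48  # kW * h = 960,000 kWh
    print(f"{energy_kwh:,} kWh; ~{energy_kwh * 0.10:,.0f} EUR at 0.10 EUR/kWh")
    ```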

    Flexible Macroblock Ordering for Context-Aware Ultrasound Video Transmission over Mobile WiMAX

    The most recent network technologies are enabling a variety of new applications, thanks to the provision of increased bandwidth and better management of Quality of Service. Nevertheless, telemedical services involving multimedia data are still lagging behind, owing to the concerns of end users, that is, clinicians and also patients, about the low quality provided. Indeed, emerging network technologies should be appropriately exploited by designing the transmission strategy around quality provision for end users. Stemming from this principle, we propose a context-aware transmission strategy for medical video transmission over WiMAX systems. Context, in terms of the regions of interest (ROIs) in a specific session, is taken into account to identify multiple regions of interest, and compression/transmission strategies are tailored to this context information. We present a methodology based on H.264 medical video compression and Flexible Macroblock Ordering (FMO) for ROI identification, together with two different unequal error protection methodologies that provide higher protection to the most diagnostically relevant data.
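
    As a rough illustration of the FMO mechanism described above (my sketch, not the authors' implementation), the following builds an explicit H.264 slice-group map (FMO map type 6) that separates a hypothetical rectangular ROI from the background, so the ROI slices can be given the stronger of the two unequal error protection levels.

    ```python
    # Minimal sketch of an explicit H.264 FMO slice-group map (map type 6):
    # macroblocks inside the diagnostically relevant ROI go to slice group 0,
    # background macroblocks to group 1. The ROI coordinates are hypothetical.

    def fmo_roi_map(frame_w_mb, frame_h_mb, roi):
        """roi = (x0, y0, x1, y1) in macroblock units, inclusive-exclusive."""
        x0, y0, x1, y1 = roi
        slice_group = []
        for y in range(frame_h_mb):
            for x in range(frame_w_mb):
                in_roi = x0 <= x < x1 and y0 <= y < y1
                slice_group.append(0 if in_roi else 1)  # group 0 = ROI
        return slice_group

    # Example: a CIF ultrasound frame (352x288 pixels -> 22x18 macroblocks
    # of 16x16 pixels) with an assumed central ROI around the scan sector.
    mb_map = fmo_roi_map(22, 18, roi=(6, 4, 16, 14))
    print(f"{mb_map.count(0)} ROI macroblocks, {mb_map.count(1)} background")
    ```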

    Quantitative modelling approaches for lean manufacturing under uncertainty

    Lean manufacturing (LM) applies different tools that help to eliminate waste, as well as the operations that do not add value to the product or processes, in order to increase the value of each performed activity. The main motivation here is to study how quantitative modelling approaches can support LM tools even under system and environment uncertainties. The main contributions of the article are: (i) providing a systematic literature review of 99 works related to the modelling of uncertainty in LM environments; (ii) proposing a methodology to classify the reviewed works; (iii) classifying LM works under uncertainty; and (iv) identifying quantitative models, and their solution approaches, for dealing with uncertainty in LM environments, along with the main variables involved. Hence this article provides a conceptual framework for future LM quantitative modelling under uncertainty as a guide for academics, researchers and industrial practitioners. The main findings identify that LM under uncertainty has been empirically investigated mainly in the US, India and the UK, in the automotive and aerospace manufacturing sectors, using analytical and simulation models to minimise time and cost. Value stream mapping (VSM) and just in time (JIT) are the most used LM techniques to reduce waste in a context of system uncertainty. The research leading to these results received funding from the project 'Industrial Production and Logistics Optimization in Industry 4.0' (i4OPT) (Ref. PROMETEO/2021/065) granted by the Valencian Regional Government, and grant PDC2022-133957-I00 funded by MCIN/AEI/10.13039/501100011033 and by European Union Next Generation EU/PRTR. Rojas, T.; Mula, J.; Sanchis, R. (2023). Quantitative modelling approaches for lean manufacturing under uncertainty. International Journal of Production Research, 1-27. https://doi.org/10.1080/00207543.2023.2293138

    Developing resilient safety culture for construction projects in Vietnam

    Although the traditional safety culture approach has significantly contributed to accident reduction, it may be inadequate for responding to all of the changing and unforeseen safety risks associated with the complex nature of construction projects. Resilient safety culture has therefore been proposed as a promising concept to address the limitations of the traditional safety culture approach and achieve a sustained improvement of safety performance in the construction environment. The aim of this study is to investigate the development of resilient safety culture in the construction environment. To fulfil the research aim and objectives, a quantitative approach and a survey research design were adopted. Data were collected using questionnaires targeting the construction project managers involved in the delivery of 78 recently completed building projects in Vietnam. The structural equation modelling (SEM) technique with partial least-squares estimation (PLS) was used to analyse the data. The key findings pertaining to the research objectives are: (1) This study examined the dimensions of resilient safety culture of construction projects. The results confirm 24 measurable scale items comprising three dimensions (i.e. psychological resilience, contextual resilience and behavioural resilience) to define and assess resilient safety culture. (2) This study explored the drivers of resilient safety culture. It was found that hazard prevention practice has a positive impact on contextual and behavioural resilience, error management practice has a positive impact on psychological resilience, and mindful organising practice has a positive impact on contextual resilience. (3) This study examined the interactive effects of resilient safety culture and project complexity on the safety performance of construction projects. It was found that the resilient safety culture dimensions have positive impacts on safety performance. Psychological resilience has a weaker impact on accident prevention under higher contextual and behavioural resilience levels. Technical and environmental project complexities have negative impacts on safety performance. The negative impact of project complexity on safety performance becomes less significant at higher levels of psychological, contextual and behavioural resilience, and might not be significant at all if psychological, contextual and behavioural resilience were high. The findings of this study contribute to the knowledge of construction safety management by providing theoretical development and empirical evidence to clarify the concept of resilient safety culture in terms of definition, purpose, value, and assessment and improvement mechanisms in the context of construction projects. Practically, this study (1) provides a frame of safety practices to assess organisations' capabilities to manage safety risks and achieve a sustained improvement of safety performance regardless of the changing complexity levels of construction projects, and (2) recommends appropriate strategies to build up such capabilities.
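
    The interactive (moderation) effects reported above were estimated with PLS-SEM; as a simplified stand-in, the sketch below shows how one such interaction between a resilience dimension and project complexity could be tested with ordinary least squares in Python. The variable names and the simulated data are hypothetical.

    ```python
    # Simplified OLS stand-in for the study's PLS-SEM moderation analysis.
    # All variables and data are simulated for illustration only.
    import numpy as np
    import pandas as pd
    import statsmodels.formula.api as smf

    rng = np.random.default_rng(0)
    n = 78  # mirrors the study's sample of completed building projects
    df = pd.DataFrame({
        "psych_resilience": rng.normal(size=n),
        "complexity": rng.normal(size=n),
    })
    # Simulated outcome: complexity hurts safety performance, but less so
    # when psychological resilience is high (a positive interaction term).
    df["safety_perf"] = (0.4 * df["psych_resilience"]
                         - 0.5 * df["complexity"]
                         + 0.3 * df["psych_resilience"] * df["complexity"]
                         + rng.normal(scale=0.5, size=n))

    # 'a * b' expands to a + b + a:b, i.e. both main effects plus the
    # interaction; a significant a:b coefficient indicates moderation.
    model = smf.ols("safety_perf ~ psych_resilience * complexity", data=df).fit()
    print(model.summary().tables[1])
    ```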

    Benthic mapping of the Bluefields Bay fish sanctuary, Jamaica

    Small island states, such as those in the Caribbean, are dependent on the nearshore marine ecosystem complex and its resources; the goods and services provided by seagrass and coral reef, for example, are particularly indispensable to the tourism and fishing industries. In recognition of these valuable contributions, and in an effort to promote sustainable use of marine resources, some nearshore areas have been designated as fish sanctuaries, marine parks and protected areas. To manage these coastal zones effectively, a spatial basis is vital to understanding the ecological dynamics and ultimately informing management practices. However, the current extent of habitats within designated sanctuaries across Jamaica is unknown, and owing to this the Government of Jamaica is desirous of mapping the benthic features in these areas. Given the several habitat mapping methodologies that exist, it was deemed necessary to test the practicality of applying two remote sensing methods, optical and acoustic, at a pilot site in western Jamaica, the Bluefields Bay fish sanctuary. The optical remote sensing method involved a pixel-based supervised classification of two available multispectral images (WorldView-2 and GeoEye-1), whilst the acoustic method comprised a sonar survey using a BioSonics DT-X Portable Echosounder and subsequent indicator kriging interpolation to create continuous benthic surfaces. Image classification resulted in the mapping of three benthic classes, namely submerged vegetation, bare substrate and coral reef, with an overall map accuracy of 89.9% for WorldView-2 and 86.8% for GeoEye-1 imagery. These accuracies surpassed those of the acoustic classification method, which attained 76.6% accuracy for vegetation presence and 53.5% for bottom substrate (silt, sand and coral reef/hard bottom). Both approaches confirmed that Bluefields Bay is dominated by submerged aquatic vegetation, with contrastingly smaller areas of bare sediment and coral reef patches. Additionally, the sonar revealed that silty substrate exists along the shoreline, whilst sand is found further offshore. Ultimately, the methods employed in this study were compared, and although satellite image classification was found to be perhaps the most cost-effective and well suited to Jamaica given currently available equipment and expertise, it is acknowledged that acoustic technology offers the greater thematic detail required by a number of stakeholders and is capable of operating in turbid waters and cloud-covered environments ill suited to image classification. On the other hand, a major consideration for the acoustic classification process is the interpolation of processed data; this step gives rise to a number of potential limitations, such as those associated with the choice of interpolation algorithm, available software and expertise. The choice of mapping approach, as well as of the survey design and processing steps, is not an easy task; however, the results of this study highlight the various benefits and shortcomings of implementing optical and acoustic classification approaches in Jamaica.

    People automatically associate tropical waters with spectacular views of coral reefs and colourful fish; however, many are perhaps not aware that these coral reefs, as well as the other living organisms inhabiting the seabed, are in fact extremely valuable to our existence. Healthy coral reefs and seagrass help maintain the sand on our beaches and fish populations, and are thereby crucial to the tourism and fishing industries in the Caribbean. For this reason, a number of areas are protected by law and have been designated fish sanctuaries or marine protected areas. To understand the functioning of these areas and effectively inform management strategy, knowing the configuration of what exists on the seafloor is crucial. Just as a motorist needs a road map to navigate unknown areas, coastal stakeholders require maps of the seafloor in order to understand what is happening beneath the water's surface. The locations of seafloor habitats within fish sanctuaries in Jamaica are currently unknown, and the Government is interested in mapping them. However, a myriad of methods exist that could be employed to achieve this goal. Remote sensing is a broad grouping of methods that involve collecting information about an object without being in direct physical contact with it. Many researchers have successfully mapped marine areas using these techniques, and it was considered crucial to test the practicality of two such methods, specifically optical and acoustic remote sensing. The main question to be answered by this study was therefore: which mapping approach is better for benthic habitat mapping in Jamaica and possibly the wider Caribbean? Optical remote sensing relates to the interaction of energy with the Earth's surface: a digital photograph is taken from a satellite and subsequently interpreted. Acoustic/sonar technology involves recording waveforms reflected from the seabed. Both methods were employed at a pilot site, the Bluefields Bay fish sanctuary, situated in western Jamaica. The optical remote sensing method involved the classification of two satellite images (WorldView-2 and GeoEye-1), informed by known positions of seafloor features, a process known as supervised image classification. With regard to the acoustic method, a field survey utilising sonar equipment (a BioSonics DT-X Portable Echosounder) was undertaken to collect the necessary sonar data. The processed field data were then modelled to convert lines of field point data into one continuous map of the sanctuary, a process known as interpolation. The accuracy of each method was then tested using field knowledge of what exists in the sanctuary. The map resulting from the image classification revealed three seafloor types, namely submerged vegetation, coral reef and bare seafloor. The overall map accuracy was 89.9% for the WorldView-2 image and 86.8% for GeoEye-1 imagery. These accuracies surpassed those attained by the acoustic classification method (76.6% for vegetation presence and 53.5% for bottom type: silt, sand and coral reef/hard bottom). Similar to previous studies, it was shown that the seabed of Bluefields Bay is primarily inhabited by submerged aquatic vegetation (including seagrass and algae), with contrastingly smaller areas of bare sediment and coral reef. Ultimately, the methods employed in this study were compared and the pros and cons of each were weighed in order to deem one method more suitable for Jamaica. Often, the presence of cloud and of suspended matter in the water blocks the view of the seafloor, making image classification difficult. By contrast, acoustic surveys are capable of operating in cloudy conditions and attaining more detailed information about the ocean floor than is possible with optical remote sensing. A major step in the acoustic classification process, however, is the interpolation of the processed data, which may introduce additional limitations if careful consideration is not given to the intricacies of the process. Lastly, the acoustic survey certainly required greater financial resources than satellite image classification. In answer to the main question of this study, the most cost-effective and feasible mapping method for Jamaica, based on the results attained, is satellite image classification. It must be stressed, however, that the effective implementation of any method will depend on a number of factors, such as available software, equipment, expertise and user needs, which must be weighed in order to select the most feasible mapping method for a particular site.
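
    For readers who want a concrete picture of pixel-based supervised classification, here is a minimal sketch using a scikit-learn random forest as a stand-in classifier (the thesis does not prescribe this particular algorithm or library). The band values and labels are randomly generated placeholders; only the three class names mirror the study's benthic classes.

    ```python
    # Sketch of pixel-based supervised classification with a stand-in
    # random forest; the data are random placeholders, not the study's imagery.
    import numpy as np
    from sklearn.ensemble import RandomForestClassifier
    from sklearn.metrics import accuracy_score
    from sklearn.model_selection import train_test_split

    # Rows = pixels, columns = the 8 WorldView-2 bands; labels 0/1/2 stand
    # for bare substrate, submerged vegetation and coral reef.
    rng = np.random.default_rng(1)
    X = rng.random((5000, 8))
    y = rng.integers(0, 3, size=5000)

    X_train, X_test, y_train, y_test = train_test_split(
        X, y, test_size=0.3, random_state=1, stratify=y)

    clf = RandomForestClassifier(n_estimators=200, random_state=1)
    clf.fit(X_train, y_train)

    # Overall accuracy on held-out ground-truth pixels (cf. the 89.9% and
    # 86.8% reported for the WorldView-2 and GeoEye-1 maps).
    print(f"overall accuracy: {accuracy_score(y_test, clf.predict(X_test)):.3f}")
    ```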

    Light Water Reactor Sustainability Program: Optimizing Information Automation Using a New Method Based on System-Theoretic Process Analysis

    This report describes interim progress on research supporting the design and optimization of information automation systems for nuclear power plants. Much of the domestic nuclear fleet is currently focused on modernizing technologies and processes, including transitioning toward digitalization in the control room and elsewhere throughout the plant, along with greater use of automation, artificial intelligence, robotics, and other emerging technologies. While there are significant opportunities to apply these technologies toward greater plant safety, efficiency, and overall cost-effectiveness, optimizing their design and avoiding potential safety and performance risks depends on ensuring that human-performance-related organizational and technical design issues are identified and addressed. This report describes modeling tools and techniques, based on sociotechnical systems theory, that support these design goals, and their application in the current research effort. The report is intended for senior nuclear energy stakeholders, including regulators, corporate management, and senior plant management. We have developed and employed a method to design an optimized information automation ecosystem (IAE) based on the systems-theoretic constructs underlying sociotechnical systems theory in general and the Systems-Theoretic Accident Model and Processes (STAMP) approach in particular. We argue that an IAE can be modeled as an interactive information control system whose behavior can be understood in terms of dynamic control and feedback relationships among the system's technical and organizational components. Up to this point, we have employed the Causal Analysis based on STAMP (CAST) technique to examine a performance- and safety-related incident at an industry partner's plant that involved the unintentional activation of an emergency diesel generator. This analysis provided insight into the behavior of the plant's current information control structure within the context of a specific, significant event. Our ongoing analysis is focused on identifying near-term process improvements and longer-term design requirements for an optimized IAE system. The latter analyses will employ a second STAMP-derived technique, System-Theoretic Process Analysis (STPA). STPA is a useful modeling tool for generating and analyzing actual or potential information control structures. Finally, we have begun modeling plantwide organizational relationships and processes. Organizational system modeling will supplement our CAST and STPA findings and provide a basis for mapping out a plantwide information control architecture. CAST analysis findings indicate that an important underlying contributor to the incident under investigation, and a significant risk to information automation system performance, was perceived schedule pressure, which exposed weaknesses in interdepartmental coordination between and within responsible plant organizations and challenged the resilience of established plant processes until a human caused the initiating event. These findings are discussed in terms of their risk to overall system performance and their implications for information automation system resilience and brittleness. We present two preliminary information automation models. The Proactive Issue Resolution model is a test case of an information automation concept with significant near-term potential for application and a subsequent reduction in significant plant events. The IAE model is a more general representation of a broader, plantwide information automation system. From our results, we have generated a set of preliminary system-level requirements and safety constraints. These requirements will be further developed over the remainder of the project in collaboration with nuclear industry subject matter experts and specialists in the technical systems under consideration. Additionally, we will continue to pursue the system analyses initiated in the first part of our effort, with a particular emphasis on STPA as the main tool to identify weak or weakening control structures that affect the resilience of organizations and programs. Our intent is to broaden the scope of the analysis from an individual use case to a related set of use cases (e.g., maintenance tasks, compliance tasks) with similar human-system performance challenges. This will enable more generalized findings to refine the Proactive Issue Resolution and IAE models, as well as their system-level requirements and safety constraints. We will use organizational system modeling analyses to supplement STPA findings and model development. We conclude the report with a set of summary recommendations and an initial draft list of system-level requirements and safety constraints for optimized information automation systems.
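
    As a concrete illustration of the STPA step mentioned above (my sketch, not the report's models), the following enumerates candidate unsafe control actions (UCAs) by crossing each control action in a control structure with STPA's four standard guide phrases. The controller and control action named here are hypothetical, loosely echoing the diesel generator incident.

    ```python
    # Sketch of STPA's unsafe-control-action (UCA) enumeration step.
    # The controller and control action names are hypothetical examples.

    GUIDE_PHRASES = (
        "not provided when needed",
        "provided when it causes a hazard",
        "provided too early, too late, or out of order",
        "stopped too soon or applied too long",
    )

    def candidate_ucas(controller, control_actions):
        """Cross each control action with the four STPA guide phrases."""
        return [f"{controller}: '{action}' {phrase}"
                for action in control_actions
                for phrase in GUIDE_PHRASES]

    # Analysts then judge which of these candidates are hazardous in context.
    for uca in candidate_ucas("control-room operator",
                              ["start emergency diesel generator"]):
        print(uca)
    ```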