
    Discrete-time dynamic modeling for software and services composition as an extension of the Markov chain approach

    Discrete Time Markov Chains (DTMCs) and Continuous Time Markov Chains (CTMCs) are often used to model various types of phenomena, such as the behavior of software products. In particular, Markov chains are widely used to describe the possible time-varying behavior of “self-adaptive” software systems, where the transition from one state to another represents an alternative choice at the software code level, taken according to a certain probability distribution. From a control-theoretical standpoint, some of these probabilities can be interpreted as control signals, while others can only be observed. However, the translation between a DTMC or CTMC model and a corresponding first-principles model that can be used to design a control system is not immediate. This paper investigates a possible solution for translating a CTMC model into a dynamic system, with a focus on the control of computing-system components. Notice that DTMC models can be translated as well, providing additional information.
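
    To make the idea concrete, here is a minimal sketch (not from the paper) of the DTMC side of this translation: the probability distribution over states already evolves as a linear discrete-time system x_{k+1} = P^T x_k, which is the kind of first-principles form a control designer can work with. The three-state chain and its probabilities are invented for illustration.

    ```python
    import numpy as np

    # Illustrative 3-state DTMC for a self-adaptive software component.
    # Rows are current states, columns are next states; each row sums to 1.
    # All values are made up for demonstration only.
    P = np.array([
        [0.7, 0.2, 0.1],   # "normal"   -> normal / degraded / failed
        [0.3, 0.6, 0.1],   # "degraded" (the 0.3 could act as a control signal)
        [0.5, 0.0, 0.5],   # "failed"   (restarts with probability 0.5)
    ])

    # Viewed as a dynamic system, the state distribution x_k evolves
    # linearly: x_{k+1} = P^T x_k.
    x = np.array([1.0, 0.0, 0.0])   # start in "normal" with certainty
    for _ in range(50):
        x = P.T @ x

    print("approximate stationary distribution:", x)
    ```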

    Human performance in manufacturing tasks: Optimization and assessment of required workload and capabilities

    This paper discusses some examples where human performance and/or human error prediction was achieved using a modified version of the Rasch model (1980), in which the probability of a specified outcome is modelled as a logistic function of the difference between the person's capacity and the item's difficulty. The model needs to be modified to account for outcomes that may not be dichotomous and for the interaction between two macro factors: (a) task complexity, which summarises all factors contributing to the physical and mental workload requirements for executing a given operative task, and (b) human capability, which considers the skills, training, and experience of the people facing the tasks, representing a synthesis of their physical and cognitive abilities, in order to verify whether or not they match the task requirements. Task complexity can be evaluated as a mathematical construct considering the compound effects of the mental and physical workload demands associated with an operator's task. Similarly, operator capability can be estimated on the basis of the operator's set of cognitive capabilities and physical conditions. The examples chosen for the application of the model are quite different: one is a set of assembly workstations in a large computer-manufacturing company, the other a set of workstations in the automotive sector. This paper presents and discusses the modelling hypothesis, the interim field data collection, the results, and possible future directions of the studies.
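
    The core of the Rasch model described above is a one-line logistic function. The sketch below shows it in Python; the capability and complexity values are arbitrary points on a common scale, not data from the paper's case studies.

    ```python
    import math

    def rasch_success_probability(capability: float, complexity: float) -> float:
        """Logistic (Rasch-type) probability that an operator with a given
        capability successfully completes a task of a given complexity."""
        return 1.0 / (1.0 + math.exp(-(capability - complexity)))

    # Illustrative values on an arbitrary common scale (not from the paper):
    for capability, complexity in [(1.5, 0.5), (0.5, 0.5), (0.0, 1.5)]:
        p = rasch_success_probability(capability, complexity)
        print(f"capability={capability:+.1f} complexity={complexity:+.1f} "
              f"-> P(success)={p:.2f}")
    ```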

    Risk-based approach for procedures' optimization

    Despite increasing process automation, various activities remain mainly operator-driven, such as the loading and unloading of tankers, maintenance operations, and so on. In these cases, the activities performed by the operator can be critical, both for safety and for product quality. Optimizing the operational procedures is thus a key factor for quality and safety. A risk assessment of the procedure can be adopted as a basis for optimization, highlighting which tasks within the procedure contribute most to the risk of the working activity. Usually the analysis of procedures is carried out through a task analysis, as in Builes et al. (2014). In this paper, the task analysis is used as a starting point for a quantitative risk assessment carried out through an integrated dynamic decision analysis. The logical-probabilistic model of the procedure is elaborated jointly with a consequence analysis, yielding a risk assessment for all the sequences of tasks in the work procedure under analysis. The risk assessment considers both possible equipment failures and potential operational errors in executing the tasks. The proposed approach is demonstrated in this paper through the application of the integrated decision analysis to the unloading of ammonia in a plant for the production and storage of fertilizers.
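
    As a hedged illustration of the kind of task-level risk ranking such an assessment builds on, the sketch below scores a hypothetical unloading procedure; the task list, probabilities, and consequence values are placeholders, and the paper's actual method (integrated dynamic decision analysis) is considerably richer.

    ```python
    # Hypothetical tasks from a tanker-unloading procedure; probabilities and
    # consequence severities are illustrative, not from the paper's case study.
    tasks = [
        # (task, P(human error), P(equipment failure), consequence severity)
        ("connect unloading arm",  1e-3, 5e-4, 100.0),
        ("open discharge valve",   5e-4, 1e-4, 500.0),
        ("monitor transfer",       2e-3, 1e-4,  50.0),
        ("disconnect and purge",   1e-3, 5e-4, 200.0),
    ]

    # A simple per-task risk index: (P_error + P_failure) * consequence.
    # Ranking tasks by this index shows where the procedure's risk
    # concentrates, which is the starting point for optimization.
    risks = [(name, (pe + pf) * c) for name, pe, pf, c in tasks]
    for name, r in sorted(risks, key=lambda t: t[1], reverse=True):
        print(f"{name:25s} risk index = {r:.3f}")
    print("total procedure risk index:", sum(r for _, r in risks))
    ```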

    Risk-Informed design process of the IRIS reactor

    Westinghouse is currently conducting the pre-application licensing of the International Reactor Innovative and Secure (IRIS). The design philosophy of IRIS has been based on the concept of Safety-by-Design™, and within this framework the PSA is being used as an integral part of the design process. The basis for the PSA contribution to the design phase of the reactor is the close iteration between the PSA team and the design and safety analysis teams. In this process, the design team is not only involved in the initial phase of providing system information to the PSA team, thereby allowing the identification of high-risk scenarios, but it also receives feedback from the PSA team suggesting design modifications aimed at reaching risk-related goals. During the first iteration of this process, the design modifications proposed by the PSA team reduced the initial estimate of the Core Damage Frequency (CDF) due to internal events from 2E-6/ry to 2E-8/ry. Since the IRIS design is still in a development phase, a number of assumptions have to be confirmed when the design is finalized. Among the key assumptions are the success criteria for both the accident sequences analyzed and the systems involved in the mitigation strategies. The PSA team developed the initial accident-sequence event trees according to the information from the preliminary analysis and feasibility studies. A recent coupling between the RELAP and GOTHIC codes made possible the actual simulation of all LOCA sequences identified in the first draft of the event trees. Working in close coordination, the PSA and safety analysis teams developed a matrix of sequence cases, not only to test the assumed success criteria, but also to identify alternative sequences, developed mainly by relaxing the extremely conservative assumptions previously made. The results of these simulations, themselves bounded by conservative assumptions on the core damage definition, suggested two new versions of the LOCA event tree with two possible configurations of the Automatic Depressurization System. The new CDF has been evaluated for both configurations, and the design team has been provided with an additional, risk-related perspective that will help in choosing the design alternative to be implemented.
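
    The CDF figures above come from quantifying event trees. The toy sketch below shows the basic arithmetic for a single LOCA initiating event with two mitigation branches; the frequencies and failure probabilities are illustrative and are not the IRIS PSA values.

    ```python
    # A toy two-branch event tree for a LOCA initiating event.
    # All numbers are illustrative only, not the IRIS PSA results.
    loca_frequency = 1e-4          # initiating events per reactor-year
    p_ads_fails = 1e-3             # Automatic Depressurization System fails
    p_injection_fails = 1e-2       # emergency injection fails (given ADS ok)

    # Sequences ending in core damage:
    #   LOCA -> ADS ok   -> injection fails
    #   LOCA -> ADS fails (assumed here to lead directly to core damage)
    cdf = loca_frequency * ((1 - p_ads_fails) * p_injection_fails + p_ads_fails)
    print(f"core damage frequency ~ {cdf:.2e} per reactor-year")
    ```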

    Cost benefit evaluation of maintenance options for aging equipment using monetised risk values: A practical application

    With constant pressure to reduce maintenance costs, as well as short-term budget constraints in a changing market environment, asset managers are compelled to continue operating aging assets while deferring maintenance and investment. The scope of the paper is to give an overview of the methods used to evaluate the risks and opportunities of deferred maintenance interventions on aging equipment, and to underline the importance of including monetised-risk and timeline considerations when evaluating the different scenarios connected with the possible options. Monetised risk values offer the opportunity to support risk-based decision-making using data collected from the field. The paper presents examples of two different methods and their practical applicability in two case studies in the energy sector, for a company managing power stations. The use of the existing and the newly proposed solutions is discussed on the basis of their applicability to the concrete examples.
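
    A minimal sketch of the monetised-risk comparison the paper argues for: each option's expected cost combines the intervention cost with failure probability times consequence cost, discounted over the timeline. All figures below are invented placeholders.

    ```python
    # Compare "maintain now" vs "defer one year" for an aging asset using
    # monetised risk. All numbers are illustrative placeholders.
    failure_prob_now = 0.02        # annual failure probability if maintained
    failure_prob_deferred = 0.08   # annual failure probability if deferred
    failure_cost = 2_000_000.0     # monetised consequence of a failure (EUR)
    maintenance_cost = 150_000.0   # cost of the intervention (EUR)
    discount_rate = 0.05

    def monetised_risk(p_fail: float) -> float:
        """Expected annual loss: failure probability times consequence cost."""
        return p_fail * failure_cost

    option_now = maintenance_cost + monetised_risk(failure_prob_now)
    option_defer = (monetised_risk(failure_prob_deferred)
                    + maintenance_cost / (1 + discount_rate))  # pay next year

    print(f"maintain now : {option_now:,.0f} EUR expected")
    print(f"defer 1 year : {option_defer:,.0f} EUR expected")
    ```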

    Preparation and characterisation of isotopically enriched Ta₂O₅ targets for nuclear astrophysics studies

    The direct measurement of reaction cross sections at astrophysical energies often requires the use of solid targets of known thickness, isotopic composition, and stoichiometry that are able to withstand high beam currents for extended periods of time. Here, we report on the production and characterisation of isotopically enriched Ta₂O₅ targets for the study of proton-induced reactions at the Laboratory for Underground Nuclear Astrophysics facility of the Laboratori Nazionali del Gran Sasso. The targets were prepared by anodisation of tantalum backings in enriched water (up to 66% in ¹⁷O and up to 96% in ¹⁸O). Special care was devoted to minimising the presence of any contaminants that could induce unwanted background reactions with the beam in the energy region of astrophysical interest. Results from target characterisation measurements are reported, and the conclusions for proton capture measurements with these targets are drawn.
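
    For a rough sense of what "known thickness, isotopic composition, and stoichiometry" buys, the sketch below estimates the ¹⁸O areal density of an enriched Ta₂O₅ film, assuming bulk-like density and a 100 nm thickness; both assumptions are mine, not the paper's measured values.

    ```python
    # Estimate the 18O areal density of an anodic Ta2O5 film from its
    # thickness; values are indicative only, not the paper's results.
    N_A = 6.022e23            # Avogadro's number, 1/mol
    density = 8.2             # g/cm^3 nominal Ta2O5 (anodic films can be lower)
    molar_mass = 441.9        # g/mol for Ta2O5
    thickness_nm = 100.0      # assumed film thickness
    enrichment_18O = 0.96     # fraction of oxygen that is 18O (up to 96% here)

    thickness_cm = thickness_nm * 1e-7
    molecules_per_cm2 = density * thickness_cm / molar_mass * N_A
    oxygen_per_cm2 = 5 * molecules_per_cm2    # 5 O atoms per Ta2O5 unit
    o18_per_cm2 = enrichment_18O * oxygen_per_cm2

    print(f"18O areal density ~ {o18_per_cm2:.2e} atoms/cm^2")
    ```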

    Analysis of lower limb internal kinetics and electromyography in elite race walking.

    The aim of this study was to analyse lower limb joint moments, powers, and electromyography patterns in elite race walking. Twenty international male and female race walkers performed at their competitive pace in a laboratory setting. The collection of ground reaction forces (1000 Hz) was synchronised with two-dimensional high-speed videography (100 Hz) and electromyography of seven lower limb muscles (1000 Hz). In addition to key performance variables such as speed and stride length, normalised joint moments and powers were calculated. The rule in race walking that requires the knee to be extended from initial contact to midstance effectively made the knee redundant during stance with regard to energy generation. Instead, the leg functioned as a rigid lever, which affected the role of the hip and ankle joints. The main contributors to energy generation were the hip extensors during late swing and early stance, and the ankle plantarflexors during late stance. The restricted functioning of the knee during stance increased the importance of the swing leg's contribution to forward momentum. The knee flexors underwent a phase of considerable energy absorption during the swing phase, and this could increase the risk of injury to the hamstring muscles.
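
    The joint powers reported in such studies are computed as the product of net joint moment and joint angular velocity, usually normalised to body mass. The sketch below shows that calculation on synthetic signals; the waveforms and body mass are placeholders, not study data.

    ```python
    import numpy as np

    # Joint power is the product of the net joint moment and the joint
    # angular velocity; normalising by body mass makes subjects comparable.
    # The arrays below are synthetic placeholders, not study data.
    fs = 1000.0                                   # sampling rate, Hz
    t = np.arange(0, 0.5, 1 / fs)                 # 0.5 s of stance
    moment = 150.0 * np.sin(2 * np.pi * 2 * t)    # net ankle moment, N*m
    ang_vel = 3.0 * np.sin(2 * np.pi * 2 * t)     # angular velocity, rad/s
    body_mass = 65.0                              # kg

    power = moment * ang_vel                      # W; positive = generation
    power_norm = power / body_mass                # W/kg

    print(f"peak normalised power: {power_norm.max():.2f} W/kg")
    ```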

    Comparative retrospective study on the modalities of biopsying peripheral neuroblastic tumors: a report from the Italian Pediatric Surgical Oncology Group (GICOP)

    Background: Peripheral neuroblastic tumors are the most common extracranial solid neoplasms in children. Early and adequate tissue sampling may speed up the diagnostic process and ensure a prompt start of optimal treatment whenever needed. Different biopsy techniques have been described. The purpose of this multi-center study is to evaluate the accuracy and safety of the various examined techniques and to determine whether a preferential procedure exists. Methods: All children diagnosed with a peripheral neuroblastic tumor who underwent a biopsy between January 2010 and December 2014 were retrospectively reviewed. Data collected included patients’ demographics, clinical presentation, intraoperative technical details, postoperative parameters, complications, and histology reports. The Mann–Whitney U and Fisher's exact tests were used for statistical analysis. Results: The cohort included 100 patients, 32 of whom underwent an incisional biopsy (performed through open or minimally invasive access) (Group A), while the remaining 68 underwent multiple needle-core biopsies (either imaging-guided or laparoscopy/thoracoscopy-assisted) (Group B). Comparing the two groups revealed that Group A patients had a higher rate of complications, a greater need for postoperative analgesia, and required red blood cell transfusion more often. The overall adequacy rate was 94%, without significant differences between the two groups (100% vs. 91.2% for Group A and Group B, respectively, P = 0.0933). Conclusions: Both incisional and needle-core biopsy methods provided sub-optimal to optimal sampling adequacy rates in children affected by peripheral neuroblastic tumors. However, the former method was associated with a higher risk of both intraoperative and postoperative complications compared with the latter.
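
    As a rough check of the reported comparison, the sketch below reconstructs the 2x2 adequacy table from the quoted rates (100% of 32 in Group A; 91.2% of 68, i.e. 62 of 68, in Group B) and runs Fisher's exact test with SciPy. The reconstructed counts and the two-sided setting are my assumptions, so the result need not match the published P = 0.0933 exactly.

    ```python
    from scipy.stats import fisher_exact

    # 2x2 adequacy table inferred from the reported percentages; the
    # published value may have used different counts or sidedness.
    table = [[32, 0],    # Group A: adequate, inadequate
             [62, 6]]    # Group B: adequate, inadequate

    odds_ratio, p_value = fisher_exact(table, alternative="two-sided")
    print(f"Fisher's exact test: p = {p_value:.4f}")
    ```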

    Toward a reassessment of the ¹⁹F(p, α₀)¹⁶O reaction rate at astrophysical temperatures

    The ¹⁹F(p, α₀)¹⁶O reaction at low energies plays an important role in fundamental physics. In nuclear astrophysics in particular, it represents, together with the ¹⁹F(p, γ)²⁰Ne reaction, the crossing point between the CNO and NeNa cycles in stars. Further, in hydrogen-rich stellar environments, it is the most important fluorine destruction channel. In this paper we report new measurements of the ¹⁹F(p, α₀)¹⁶O reaction at deeply sub-Coulomb energies (0.2–0.6 MeV), a region where, despite the key role of this reaction, only a few, dated data sets are available. The deduced astrophysical S-factor is ≈1.5–2 times larger than currently adopted extrapolations, with possibly important astrophysical consequences.
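
    The astrophysical S-factor mentioned above removes the Coulomb-barrier penetrability from the cross section: S(E) = σ(E) E exp(2πη), with 2πη ≈ 31.29 Z₁Z₂ √(μ/E) for E in keV and μ in amu. The sketch below applies this standard conversion; the cross-section value is a placeholder, not the paper's data.

    ```python
    import math

    # Convert a measured cross section to the astrophysical S-factor:
    #   S(E) = sigma(E) * E * exp(2*pi*eta)
    #   2*pi*eta = 31.29 * Z1 * Z2 * sqrt(mu / E)   (E in keV, mu in amu)
    Z1, Z2 = 1, 9                                  # proton on 19F
    mu = 1.0078 * 18.9984 / (1.0078 + 18.9984)     # reduced mass, amu

    def s_factor(sigma_barn: float, e_cm_kev: float) -> float:
        """S-factor in keV*barn from cross section (barn) and E_cm (keV)."""
        two_pi_eta = 31.29 * Z1 * Z2 * math.sqrt(mu / e_cm_kev)
        return sigma_barn * e_cm_kev * math.exp(two_pi_eta)

    # e.g. a hypothetical 1 microbarn measurement at E_cm = 300 keV:
    print(f"S(300 keV) ~ {s_factor(1e-6, 300.0):.1f} keV*barn")
    ```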