Lightweight adaptive filtering for efficient learning and updating of probabilistic models
Adaptive software systems are designed to cope with unpredictable and evolving usage behaviors and environmental conditions. Such systems need reasoning mechanisms to drive their evolution, usually based on models that capture relevant aspects of the running software. Continuously updating these models in evolving environments requires efficient learning procedures that have low overhead and are robust to changes. Most of the available approaches achieve one of these goals at the price of the other. In this paper we propose a lightweight adaptive filter to accurately learn time-varying transition probabilities of discrete-time Markov models, which provides robustness to noise and fast adaptation to changes with very low overhead. A formal assessment of the stability, unbiasedness and consistency of the learning approach is provided, as well as an experimental comparison with state-of-the-art alternatives.
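The abstract does not give the filter's equations; as a minimal sketch of the general idea, the snippet below estimates time-varying transition probabilities of a discrete-time Markov chain with an exponential-forgetting count update. The forgetting factor `lam`, the smoothing prior and the function names are illustrative assumptions, not the authors' actual algorithm.

```python
import numpy as np

def update_counts(counts, prev_state, next_state, lam=0.95):
    """Exponentially-forgetting count update for one observed transition.

    counts : (S, S) array of discounted transition counts
    lam    : forgetting factor in (0, 1]; smaller values adapt faster to
             changes but are noisier (illustrative choice, not from the paper)
    """
    counts *= lam                      # discount old evidence
    counts[prev_state, next_state] += 1.0
    return counts

def transition_matrix(counts, prior=1e-3):
    """Normalize discounted counts row-wise into transition probabilities."""
    smoothed = counts + prior          # small prior avoids all-zero rows
    return smoothed / smoothed.sum(axis=1, keepdims=True)

# Usage: feed an observed state trajectory one step at a time.
S = 3
counts = np.zeros((S, S))
trajectory = [0, 1, 1, 2, 0, 1]
for s, s_next in zip(trajectory, trajectory[1:]):
    counts = update_counts(counts, s, s_next)
P_hat = transition_matrix(counts)
```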
Experimental status of 7Be production and destruction at astrophysically relevant energies
The production and destruction of 7Be play a significant role in Big Bang nucleosynthesis as well as in the context of solar neutrinos. The 3He(α, γ)7Be reaction cross section has been measured several times in the last decades, but the precision achieved in reaction rate determinations at the relevant astrophysical energies is not yet satisfactory. The experimental status of this reaction will be critically reviewed, and the available theoretical descriptions will be discussed.
Human performance in manufacturing tasks: Optimization and assessment of required workload and capabilities
This paper discusses some examples where human performance and/or human error prediction was achieved by using a modified version of the Rasch model (1980), in which the probability of a specified outcome is modelled as a logistic function of the difference between the person's capacity and the item's difficulty. The model needs to be modified to take into account an outcome that may not be dichotomous and to take into account the interaction between two macro factors: (a) task complexity, which summarises all factors contributing to the physical and mental workload requirements for execution of a given operative task, and (b) human capability, which considers the skills, training and experience of the people facing the tasks, representing a synthesis of their physical and cognitive abilities, in order to verify whether or not they match the task requirements. Task complexity can be evaluated as a mathematical construct considering the compound effects of the mental workload demands and physical workload demands associated with an operator task. Similarly, operator capability can be estimated on the basis of the operators' set of cognitive capabilities and physical conditions. The examples chosen for the application of the model were quite different: one is a set of assembly workstations in a large computer manufacturing company and the other a set of workstations in the automotive sector. This paper presents and discusses the modelling hypothesis, the interim field data collection, the results and possible future directions of the studies.
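The abstract describes the core relation only verbally; as a hedged illustration, the standard Rasch form below models the probability of successful task execution as a logistic function of capability minus complexity. The scaling factor and variable names are illustrative assumptions and are not taken from the paper.

```python
import math

def p_success(capability, complexity, scale=1.0):
    """Rasch-style probability of successful task execution.

    capability : estimate of the operator's capability (theta)
    complexity : estimate of the task's complexity/difficulty (b)
    scale      : optional scaling factor (assumed here; the basic
                 dichotomous Rasch model uses scale = 1)
    """
    return 1.0 / (1.0 + math.exp(-scale * (capability - complexity)))

# Example: an operator slightly above the task's demand level.
print(p_success(capability=1.2, complexity=0.8))  # ~0.60
```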
Risk-Informed design process of the IRIS reactor
Westinghouse is currently conducting the pre-application licensing of the International Reactor Innovative and Secure (IRIS). The design philosophy of IRIS is based on the concept of Safety-by-Design™, and within this framework the PSA is being used as an integral part of the design process. The basis for the PSA contribution to the design phase of the reactor is a close iteration between the PSA team and the design and safety analysis teams. In this process the design team is not only involved in the initial phase of providing system information to the PSA team, allowing in this way the identification of the high-risk scenarios, but it also receives feedback from the PSA team suggesting design modifications aimed at reaching risk-related goals.
During the first iteration of this process, the design modifications proposed by the PSA team allowed the initial estimate of the Core Damage Frequency (CDF) due to internal events to be reduced from 2E-6/ry to 2E-8/ry. Since the IRIS design is still in a development phase, a number of assumptions have to be confirmed when the design is finalized. Among the key assumptions are the success criteria for both the accident sequences analyzed and the systems involved in the mitigation strategies. The PSA team developed the initial accident sequence event trees according to the information from the preliminary analysis and feasibility studies. A recent coupling between the RELAP and GOTHIC codes made possible the actual simulation of all LOCA sequences identified in the first draft of the Event Trees. Working in close coordination, the PSA and safety analysis teams developed a matrix of sequence cases, not only with the purpose of testing the assumed success criteria, but also with the perspective of identifying alternative sequences, obtained mainly by relaxing the extremely conservative assumptions previously made.
The results of these simulations, themselves bounded by conservative assumptions on the Core Damage definition, suggested two new versions of the LOCA Event Tree with two possible configurations of the Automatic Depressurization System. The new CDF has been evaluated for both configurations, and the design team has been provided with an additional, risk-related perspective that will help in choosing the design alternative to be implemented.
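The abstract refers to quantifying the CDF from accident sequence event trees; as a generic, hedged illustration of how such a quantification works (not the IRIS PSA model itself), the snippet below combines an initiating-event frequency with branch probabilities to obtain sequence frequencies and sums the core-damage sequences. All numbers and sequence definitions are invented for the example.

```python
# Hedged sketch of event-tree quantification: sequence frequency =
# initiating-event frequency x product of branch probabilities along the path.
# Values below are purely illustrative, not from the IRIS PSA.
INITIATING_EVENT_FREQ = 1.0e-3  # assumed small-LOCA frequency per reactor-year

# Each sequence: (branch probabilities along the path, leads_to_core_damage)
sequences = [
    ([0.99, 0.999], False),   # both mitigation systems succeed
    ([0.99, 0.001], True),    # second system fails
    ([0.01, 0.95],  False),   # first system fails, backup succeeds
    ([0.01, 0.05],  True),    # both systems fail
]

cdf = 0.0
for branch_probs, core_damage in sequences:
    freq = INITIATING_EVENT_FREQ
    for p in branch_probs:
        freq *= p
    if core_damage:
        cdf += freq

print(f"Core Damage Frequency contribution: {cdf:.2e} /ry")
```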
Cost Benefit Evaluation of Maintenance Options for Aging Equipment Using Monetised Risk Values: A practical application
With constant pressure to reduce maintenance costs as well as short-term budget constraints in a changing market environment, asset managers are compelled to continue operating aging assets while deferring maintenance and investment. The scope of the paper is to give an overview of the methods used to evaluate the risks and opportunities of deferred maintenance interventions on aging equipment, and to underline the importance of including monetised risk and timeline considerations when evaluating the different scenarios connected with the possible options. Monetised risk values offer the opportunity to support risk-based decision-making using data collected from the field. The paper presents two different methods and their practical applicability in two case studies in the energy sector, for a company managing power stations. The use of the existing and the newly proposed solutions is discussed on the basis of their applicability to the concrete examples.
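The abstract does not spell out how a monetised risk value is built; a common hedged formulation, not necessarily the one used in the paper, multiplies an annual failure probability by its consequence cost and discounts the result over the deferral timeline, as sketched below. The discount rate, probabilities and costs are illustrative assumptions.

```python
def monetised_risk(failure_prob_per_year, consequence_cost, years, discount_rate=0.05):
    """Present value of the expected failure cost over a deferral horizon.

    Hedged sketch: monetised risk = sum over years of
    (annual failure probability x consequence cost), discounted.
    """
    return sum(
        failure_prob_per_year * consequence_cost / (1 + discount_rate) ** t
        for t in range(1, years + 1)
    )

# Compare deferring an overhaul for 3 years against doing it now (all figures assumed).
risk_if_deferred = monetised_risk(failure_prob_per_year=0.04,
                                  consequence_cost=2_000_000, years=3)
overhaul_cost = 150_000
print(f"Expected risk cost of deferral: {risk_if_deferred:,.0f}")
print("Defer" if risk_if_deferred < overhaul_cost else "Maintain now")
```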
Linear regression models and k-means clustering for statistical analysis of fNIRS data
We propose a new algorithm, based on a linear regression model, to statistically estimate the hemodynamic activations in fNIRS data sets. The main concern guiding the development of the algorithm was to minimize the assumptions and approximations made on the data set for the application of statistical tests. Further, we propose a k-means method to cluster fNIRS data (i.e. channels) as activated or not activated. The methods were validated on both simulated and in vivo fNIRS data. A time-domain (TD) fNIRS technique was preferred because of its high performance in discriminating cortical activation from superficial physiological changes. However, the proposed method is also applicable to continuous-wave or frequency-domain fNIRS data sets.
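The abstract does not include the regression design or the clustering step; the sketch below is a minimal illustration under stated assumptions (a boxcar task regressor, ordinary least squares per channel, and a two-cluster k-means on the fitted amplitudes), not the authors' exact algorithm.

```python
import numpy as np
from sklearn.cluster import KMeans

rng = np.random.default_rng(0)

# Toy fNIRS-like data: 20 channels x 200 time points, a few channels "activated".
n_channels, n_samples = 20, 200
stimulus = np.zeros(n_samples)
stimulus[50:100] = 1.0                      # boxcar task regressor (assumed design)
data = rng.normal(0, 0.5, (n_channels, n_samples))
data[:5] += 1.5 * stimulus                  # inject activation in 5 channels

# Linear regression per channel: y = beta0 + beta1 * stimulus + noise
X = np.column_stack([np.ones(n_samples), stimulus])
betas = np.linalg.lstsq(X, data.T, rcond=None)[0]   # shape (2, n_channels)
activation_amplitude = betas[1]

# k-means with two clusters separates "activated" from "not activated" channels.
labels = KMeans(n_clusters=2, n_init=10, random_state=0).fit_predict(
    activation_amplitude.reshape(-1, 1))
activated_cluster = labels[np.argmax(activation_amplitude)]
print("Activated channels:", np.where(labels == activated_cluster)[0])
```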
Radiochemical separation of 7Be from the cooling water of the neutron spallation source SINQ at PSI
7Be is a key radionuclide for the investigation of several astrophysical processes and phenomena. In addition, it is used as a tracer in wear measurements. It is produced in considerable amounts in the cooling water (D2O) of the Spallation Induced Neutron Source (SINQ) facility at PSI by spallation reactions on 16O induced by the generated fast neutrons. A shielded ion-exchange filter containing 100 mL of the mixed-bed ion exchanger LEWATIT was installed as a bypass in the cooling loop of SINQ for three months. The collected activity of 7Be was in the range of several hundred GBq. The 7Be was then separated and purified remotely in a hot cell using an installed separation system. With the exception of 10Be, radioactive by-products can be neglected, so this cooling water can serve as an ideal source of highly active 7Be samples. The facility is capable of producing 7Be with activities up to 1 TBq per year. The 7Be sample preparation is described in detail and the possible uses are discussed. In particular, some preliminary results on 7Be ion beam production are presented.
Measurement of the 1323 and 1487 keV resonances in 15N(α, γ)19F with the recoil separator ERNA
The origin of fluorine is a widely debated issue. Nevertheless, the 15N(α, γ)19F reaction is a common feature among the various production channels proposed so far. Its reaction rate at the relevant temperatures is determined by a number of narrow resonances, together with the direct capture (DC) component and the tails of the two broad resonances at E_c.m. = 1323 and 1487 keV. Measurements through the direct detection of the 19F recoil ions with the European Recoil separator for Nuclear Astrophysics (ERNA) were performed. The reaction was initiated by a 15N beam impinging onto a 4He windowless gas target. The observed yield of the resonances at E_c.m. = 1323 and 1487 keV is used to determine their widths in the α and γ channels. We show that a direct measurement of the cross section of the 15N(α, γ)19F reaction can be successfully obtained with the recoil separator ERNA, and the widths Γ_γ and Γ_α of the two broad resonances have been determined. While fair agreement is found with earlier determinations of the widths of the 1487 keV resonance, a significant difference is found for Γ_α of the 1323 keV resonance. The revision of the widths of the two most relevant broad resonances in the 15N(α, γ)19F reaction presented in this work is a first step toward a firmer determination of the reaction rate. At present, the residual uncertainty at the temperatures of 19F stellar nucleosynthesis is dominated by the uncertainties affecting the direct capture component and the 364 keV narrow resonance, both so far investigated only through indirect experiments.