
    Derivation of diagnostic models based on formalized process knowledge

    © IFAC. Industrial systems are vulnerable to faults. Early and accurate fault detection and diagnosis in production systems can minimize down-time, increase the safety of plant operation, and reduce manufacturing costs. Knowledge- and model-based approaches to automated fault detection and diagnosis have been demonstrated to be suitable for fault cause analysis within a broad range of industrial processes and research case studies. However, the implementation of these methods demands a complex and error-prone development phase, especially due to the extensive effort required to derive and validate the models. To reduce this modeling complexity, this paper presents a structured causal modeling approach that supports the derivation of diagnostic models from formalized process knowledge. The method exploits the Formalized Process Description Guideline VDI/VDE 3682 to establish causal relations among key process variables, extends the Signed Digraph model with fuzzy set theory to allow more accurate causality descriptions, and proposes a representation of the resulting diagnostic model in CAEX/AutomationML targeting dynamic data access, portability, and seamless information exchange.
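    The core idea of a signed digraph extended with fuzzy influence strengths can be sketched as follows. This is a minimal illustration, not the paper's implementation: the variable names, the edge strengths, and the min/max combination of memberships are all assumptions chosen for the example (an acyclic graph is also assumed).

    ```python
    # Fuzzy signed digraph sketch: nodes are process variables, each edge
    # carries a sign (+1/-1) and a fuzzy "influence strength" in [0, 1].
    # Candidate fault causes for an observed deviation are found by tracing
    # edges backwards, combining strengths with min (fuzzy AND) along a path
    # and max (fuzzy OR) across paths. Assumes the graph has no cycles.

    from collections import defaultdict

    class FuzzySignedDigraph:
        def __init__(self):
            # incoming[v] = list of (u, sign, strength): u influences v
            self.incoming = defaultdict(list)

        def add_edge(self, cause, effect, sign, strength):
            self.incoming[effect].append((cause, sign, strength))

        def candidate_causes(self, symptom, deviation):
            """Trace back from a deviated variable; return {root: (deviation, belief)}."""
            results = {}
            stack = [(symptom, deviation, 1.0)]
            while stack:
                var, dev, belief = stack.pop()
                parents = self.incoming[var]
                if not parents:                      # root node: candidate fault origin
                    best = results.get(var, (dev, 0.0))
                    results[var] = (dev, max(best[1], belief))  # fuzzy OR over paths
                    continue
                for cause, sign, strength in parents:
                    # a "+" edge transmits the deviation direction, a "-" edge inverts it
                    stack.append((cause, dev * sign, min(belief, strength)))
            return results

    # hypothetical tank example: valve opening drives flow, flow drives level,
    # a leak lowers the level
    g = FuzzySignedDigraph()
    g.add_edge("valve_position", "flow", +1, 0.9)
    g.add_edge("flow", "level", +1, 0.8)
    g.add_edge("leak", "level", -1, 0.6)

    causes = g.candidate_causes("level", -1)
    # a low level is explained by a low valve position (belief 0.8) or a leak (0.6)
    ```

    The min/max combination is one common fuzzy choice; product-based t-norms would work equally well in this role.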

    Carbon capture from natural gas combined cycle power plants: Solvent performance comparison at an industrial scale

    Natural gas is an important source of energy. This article addresses the problem of integrating an existing natural gas combined cycle (NGCC) power plant with a carbon capture process using various solvents. The power plant and capture process interact through the flue gas flow rate and composition on one side and the extracted steam required for solvent regeneration on the other. Therefore, evaluating solvent performance at a single (nominal) operating point is not representative; solvent performance should instead be assessed subject to overall process operability and over a wide range of operating conditions. In the present research, a novel optimization framework was developed in which the design and operation of the capture process are optimized simultaneously and their interactions with the upstream power plant are fully captured. The framework was applied to solvent comparison, which demonstrated that GCCmax, a newly developed solvent, features superior performance compared to the monoethanolamine baseline solvent.
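    The simultaneous design/operation idea can be illustrated with a toy joint search. Everything here is hypothetical: the surrogate models for reboiler duty and capture rate, the variable ranges, and the 90% capture target are invented numbers standing in for the rigorous process models the paper would use.

    ```python
    # Illustrative sketch of simultaneous design/operation optimization:
    # rather than fixing a design and then tuning operation separately, both
    # variables are searched jointly so that the trade-off between them
    # (and hence the steam demand seen by the upstream plant) is captured.

    import itertools

    def reboiler_duty(height_m, solvent_flow):
        # hypothetical surrogate: taller columns need less regeneration steam;
        # off-nominal solvent flow penalizes the duty quadratically
        return 4.0 - 0.05 * height_m + 0.8 * (solvent_flow - 1.0) ** 2

    def capture_rate(height_m, solvent_flow):
        # hypothetical surrogate: more packing and more solvent capture more CO2
        return min(0.99, 0.5 + 0.02 * height_m + 0.15 * solvent_flow)

    def optimize_jointly(target=0.90):
        heights = list(range(10, 31))             # design variable: column height [m]
        flows = [f / 10 for f in range(5, 31)]    # operating variable: relative flow
        feasible = ((h, f) for h, f in itertools.product(heights, flows)
                    if capture_rate(h, f) >= target)
        # pick the feasible (design, operation) pair with the lowest energy penalty
        return min(feasible, key=lambda hf: reboiler_duty(*hf))

    best_height, best_flow = optimize_jointly()
    ```

    A grid search keeps the sketch dependency-free; the actual framework would use a proper optimizer over rigorous models, but the joint feasible-set structure is the same.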

    Guidelines for Weighting Factors Adjustment in Finite State Model Predictive Control of Power Converters and Drives

    International Conference on Industrial Technology (2009, Victoria, Australia). Model Predictive Control with a finite control set has emerged as a promising control tool for power converters and drives. One of its major advantages is the possibility of controlling several system variables with a single control law by including them with appropriate weighting factors. However, at the present state of the art, these coefficients are determined empirically: no analytical or numerical method has yet been proposed to obtain an optimal solution. In addition, the empirical method is not always straightforward, and no procedures have been reported. This paper presents a first approach to a set of guidelines that reduce the uncertainty of this process. First, a classification of the different types of cost functions and weighting factors is presented. Then, the different steps of the empirical process are explained. Finally, results for several power converter and drive applications are analyzed, which show the effectiveness of the proposed guidelines in reaching appropriate weighting factors and control performance.
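    The role of a weighting factor in a finite-control-set MPC cost function can be sketched in a few lines. The RL load model, parameter values, and the switching-effort term are illustrative assumptions, not the paper's setup; the point is only how one coefficient trades current tracking against commutations.

    ```python
    # Finite-control-set MPC sketch: enumerate the finite set of output voltage
    # levels, predict the load current one step ahead, and pick the level that
    # minimizes a cost mixing tracking error and switching effort via lambda_sw.

    R, L, Ts, VDC = 2.0, 10e-3, 50e-6, 300.0
    LEVELS = (-VDC, 0.0, +VDC)            # finite control set (hypothetical 3-level leg)

    def predict_current(i_k, v):
        # forward-Euler discretization of an RL load: L di/dt = v - R i
        return i_k + (Ts / L) * (v - R * i_k)

    def fcs_mpc_step(i_k, i_ref, v_prev, lambda_sw):
        # cost = |tracking error| + lambda_sw * normalized switching effort
        def cost(v):
            return abs(i_ref - predict_current(i_k, v)) + lambda_sw * abs(v - v_prev) / VDC
        return min(LEVELS, key=cost)

    # with lambda_sw = 0 the controller switches freely to chase the reference;
    # a large weighting factor penalizes commutations and keeps the previous level
    v_free = fcs_mpc_step(i_k=0.0, i_ref=1.0, v_prev=0.0, lambda_sw=0.0)
    v_heavy = fcs_mpc_step(i_k=0.0, i_ref=1.0, v_prev=0.0, lambda_sw=10.0)
    ```

    Sweeping `lambda_sw` and observing ripple versus switching frequency is exactly the kind of empirical tuning step the guidelines aim to systematize.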

    Detection of replay attacks in cyber-physical systems using a frequency-based signature

    This paper proposes a frequency-based approach for the detection of replay attacks affecting cyber-physical systems (CPS). In particular, the method injects a sinusoidal signal with a time-varying frequency (authentication signal) into the closed-loop system and checks whether the time profile of the frequency components in the output signal is compatible with the authentication signal. To achieve this, the couplings between inputs and outputs are eliminated using a dynamic decoupling technique based on vector fitting. In this way, a signature introduced on a specific input channel affects only the output selected to be associated with that input, a property that can be exploited to determine which channels are being affected. A bank of band-pass filters is used to generate signals whose energies can be compared to reconstruct an estimate of the time-varying frequency profile. By matching the known frequency profile with its estimate, the detector can determine whether a replay attack is being carried out. The design of the signal generator and the detector are thoroughly discussed, and an example based on a quadruple-tank process is used to show the application and effectiveness of the proposed method.
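    The detection logic can be sketched in a simplified, self-contained form. This is not the paper's scheme: single DFT-bin energies per window stand in for the band-pass filter bank, the plant and decoupling step are omitted, and the sampling rate, window length, and frequency schedules are invented for the example.

    ```python
    # Replay-detection sketch: the authentication signal steps through a secret
    # frequency schedule; per window, the dominant frequency is estimated by
    # comparing single-bin energies (a stand-in for band-pass filter energies).
    # A replayed recording carries the *old* schedule, so its estimated profile
    # no longer matches the schedule currently in force.

    import cmath, math

    FS = 1000                  # sampling rate [Hz] (illustrative)
    WIN = 200                  # samples per window = one frequency step

    def make_signal(freq_profile):
        sig = []
        for f in freq_profile:
            sig += [math.sin(2 * math.pi * f * n / FS) for n in range(WIN)]
        return sig

    def bin_energy(window, f):
        # energy of the window at frequency f (single DFT bin)
        z = sum(x * cmath.exp(-2j * math.pi * f * n / FS) for n, x in enumerate(window))
        return abs(z) ** 2

    def estimate_profile(sig, candidates):
        est = []
        for k in range(0, len(sig), WIN):
            w = sig[k:k + WIN]
            est.append(max(candidates, key=lambda f: bin_energy(w, f)))
        return est

    def replay_detected(measured, expected_profile, candidates):
        return estimate_profile(measured, candidates) != expected_profile

    candidates = [50, 100, 150, 200]       # admissible authentication frequencies
    current = [50, 150, 100, 200]          # schedule in force right now
    recorded = [100, 50, 200, 150]         # schedule active when the attacker recorded

    honest = make_signal(current)
    replayed = make_signal(recorded)
    ```

    The window length is chosen so each candidate frequency falls on an exact DFT bin, making the per-window estimate unambiguous for a clean signal; with noise, the filter-bank energy comparison of the paper plays the same role.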

    Analysis of a batch-service queue with variable service capacity, correlated customer types and generally distributed class-dependent service times

    Queueing models with batch service have been studied frequently, for instance in the domain of telecommunications or manufacturing. Although the batch server's capacity may be variable in practice, only a few authors have included variable capacity in their models. We analyse a batch server with multiple customer classes and a variable service capacity that depends on both the number of waiting customers and their classes. The service times are generally distributed and class-dependent. These features complicate the analysis in a non-trivial way. We tackle it by examining the system state at embedded points and studying the resulting Markov chain. We first establish the joint probability generating function (pgf) of the service capacity and the number of customers left behind in the queue immediately after service initiation epochs. From this joint pgf, we extract the pgfs of the number of customers in the queue and in the system at service initiation epochs and departure epochs respectively, and the pgf of the actual server capacity. Combined with additional techniques, we also obtain the pgfs of the queue and system content at customer arrival epochs and random slot boundaries, and the pgf of the delay of a random customer. In the numerical experiments, we focus on the impact of correlation between the classes of consecutive customers, and on the influence of different service time distributions on system performance. © 2019 Elsevier B.V. All rights reserved.
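    The mechanics of the model (not its exact pgf analysis) can be illustrated with a slotted toy simulation. All parameters are made up: Bernoulli arrivals of two classes, a capacity rule that depends on the head-of-line class, and geometric class-dependent service times stand in for the general distributions of the paper.

    ```python
    # Slotted batch-service queue sketch: per-slot arrivals of classes 'A'/'B',
    # a server whose batch capacity depends on the waiting customers' classes,
    # and class-dependent (here geometric) batch service times. The simulation
    # estimates the mean queue content at slot boundaries.

    import random

    random.seed(1)

    def simulate(slots=200_000, p_arrival=0.3, p_class_a=0.5):
        queue = []                   # classes of waiting customers
        remaining = 0                # slots left on the current batch service
        total_q = 0
        for _ in range(slots):
            if random.random() < p_arrival:
                queue.append('A' if random.random() < p_class_a else 'B')
            if remaining == 0 and queue:
                # variable capacity rule: up to 3 customers per batch, but only
                # 2 if the head-of-line customer is class 'B' (class-dependent)
                cap = 3 if queue[0] == 'A' else 2
                batch = [queue.pop(0) for _ in range(min(cap, len(queue)))]
                # class-dependent geometric service time: batches with a 'B'
                # customer are served more slowly
                p_done = 0.5 if 'B' not in batch else 0.3
                remaining = 1
                while random.random() > p_done:
                    remaining += 1
            elif remaining > 0:
                remaining -= 1
            total_q += len(queue)
        return total_q / slots

    mean_queue = simulate()
    ```

    Such a simulation is useful as a sanity check against the analytical pgf results; replacing the geometric service times with other distributions reproduces the kind of sensitivity study reported in the numerical experiments.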

    Is Anonymity the Missing Link Between Commercial and Industrial Revolution?

    The Industrial Revolution is often characterized as the culmination of a process of commercialisation; however, the precise nature of such a link remains unclear. This paper models and analyses one such link: the impact of a higher degree of anonymity of market transactions on relative factor prices. Commercialisation raises wages as impersonal labour market transactions replace personalized customary relations. This leads, in equilibrium, to higher real wages to prevent shirking. To the extent that capital and labour are (imperfect) substitutes, the resulting shift in relative factor prices leads to the adoption of a more capital-intensive production technology which, in turn, results in a faster rate of technological progress via enhanced learning by doing. We provide evidence, using European historical data, that England was among the most urbanized and highest-wage countries at the onset of the Industrial Revolution.