
    Statistical process monitoring of a multiphase flow facility

    Industrial needs are evolving rapidly towards more flexible manufacturing schemes. As a consequence, plant production often has to be adapted to demand, which can be volatile depending on the application. It is therefore important to develop tools that can monitor the condition of a process operating under varying conditions. Canonical Variate Analysis (CVA) is a multivariate, data-driven methodology which has been demonstrated to be superior to other methods, particularly under dynamically changing operational conditions. Such comparative studies normally use computer-simulated data from benchmark case studies such as the Tennessee Eastman Process Plant (Ricker, N.L., Tennessee Eastman Challenge Archive, available at http://depts.washington.edu/control/LARRY/TE/download.html, accessed 21.03.2014). The aim of this work is to provide a benchmark case that demonstrates the ability of different monitoring techniques to detect and diagnose artificially seeded faults in an industrial-scale multiphase flow experimental rig. The changing operational conditions and the size and complexity of the test rig make this case study an ideal candidate for a benchmark providing a test bed in which the performance of novel multivariate process monitoring techniques can be evaluated on real experimental data. In this paper, the capability of CVA to detect and diagnose faults in a real system working under changing operating conditions is assessed and compared with other methodologies. The results demonstrate that CVA can be effectively applied to the detection and diagnosis of faults in real complex systems, and reinforce the finding that its performance is superior to that of other algorithms.
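    As a rough illustration of the kind of monitoring scheme this abstract describes, the sketch below implements a basic CVA fault detector in Python/NumPy: canonical states are estimated from lagged past/future vectors of fault-free training data, and a Hotelling-style T^2 statistic is checked against an empirical control limit. The lag length, state order, and 99% limit are illustrative assumptions, not values from the paper; train and test stand for (samples x variables) arrays of rig sensor data.

    import numpy as np

    def stack_lags(X, l):
        # Build past vectors p_t = [x_{t-1}; ...; x_{t-l}] and future vectors
        # f_t = [x_t; ...; x_{t+l-1}] from a (samples x variables) array.
        n = X.shape[0] - 2 * l + 1
        past = np.hstack([X[l - 1 - k : l - 1 - k + n] for k in range(l)])
        fut = np.hstack([X[l + k : l + k + n] for k in range(l)])
        return past, fut

    def inv_sqrt(S, eps=1e-8):
        # Regularized symmetric inverse square root via eigendecomposition.
        w, V = np.linalg.eigh(S)
        return V @ np.diag(1.0 / np.sqrt(np.maximum(w, eps))) @ V.T

    def cva_alarms(train, test, l=5, r=10):
        p_tr, f_tr = stack_lags(train, l)
        mu_p = p_tr.mean(0)
        P, F = p_tr - mu_p, f_tr - f_tr.mean(0)
        n = P.shape[0]
        Spp, Sff, Sfp = P.T @ P / (n - 1), F.T @ F / (n - 1), F.T @ P / (n - 1)
        # SVD of the scaled cross-covariance yields the canonical directions.
        _, _, Vt = np.linalg.svd(inv_sqrt(Sff) @ Sfp @ inv_sqrt(Spp))
        J = Vt[:r] @ inv_sqrt(Spp)             # maps a past vector to a state
        limit = np.percentile(((P @ J.T) ** 2).sum(1), 99)  # empirical limit
        p_te, _ = stack_lags(test, l)
        t2 = (((p_te - mu_p) @ J.T) ** 2).sum(1)            # Hotelling-style T^2
        return t2 > limit                      # True where a fault is flagged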

    Enhancing service requirements of technical product-service systems

    Due to the integration of products and services as a new business model, product reliability and strategies for cost reduction at the early design stage have become important factors for many manufacturing firms. It is therefore critical at this phase to analyse the risk of noncompliance with Service Requirements in order to help designers make informed decisions, as these decisions have a large impact on the Product Life Cycle (PLC). An investigation has been performed into how Service Requirements are analysed in a service-orientated business to achieve reduced Life Cycle Cost (LCC) and improvements to existing Service Requirements. The Weibull distribution and the Monte Carlo method are proposed for this purpose, as they are among the most widely used techniques in industrial product reliability studies. A generic methodology for evaluating the risk that a new product fails to meet its Service Requirements is presented in this paper. This work is part of an ongoing research project which, in addition to comparing current and targeted Service Requirements, facilitates their optimisation at minimum risk of nonconformity.
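    A minimal sketch of the proposed Weibull/Monte Carlo idea, in Python: fleet lifetimes are drawn from an assumed Weibull model, and the risk of noncompliance is estimated as the fraction of simulated fleets whose reliability at the contracted interval falls below the target. The shape, scale, fleet size, and the 95%-at-5,000-hours requirement are all made-up illustrations, not figures from the paper.

    import numpy as np

    rng = np.random.default_rng(0)
    beta, eta = 1.8, 35_000.0          # assumed Weibull shape and scale (hours)
    t_req, r_target = 5_000.0, 0.95    # assumed requirement: R(5,000 h) >= 95%
    n_trials, n_fleet = 10_000, 200    # Monte Carlo trials, units per fleet

    lifetimes = eta * rng.weibull(beta, size=(n_trials, n_fleet))
    fleet_reliability = (lifetimes > t_req).mean(axis=1)  # survival fraction
    risk = (fleet_reliability < r_target).mean()          # P(noncompliance)
    print(f"estimated risk of missing the requirement: {risk:.3f}")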

    An integrated aerospace requirement setting and risk analysis tool for life cycle cost reduction and system design improvement

    In the early conceptual stage of the service-orientated model, decisions regarding the design of a new technical product are largely influenced by Service Requirements. Those decisions therefore have to merge technical and business aspects to achieve the desired product reliability and a reduced Whole Life Cost (WLC). It is critical at that phase to define the risk of potential noncompliance with Service Requirements in order to ensure the right design choices, as these decisions have a large impact on the overall product and service development. This paper presents the outcome of a research project investigating different approaches used by companies to analyse Service Requirements in order to achieve reduced Life Cycle Cost (LCC). Analysis using the Weibull distribution and the Monte Carlo method is proposed here; the literature review conducted indicates that these are the most widely used techniques in product reliability studies. Based on those techniques, a methodology and its software tool for evaluating the risk of failing to deliver a new product against Service Requirements are presented in this paper. This is part of an ongoing research project which, in addition to analysing the gap between current Service Requirement achievements and the design targets for a new aircraft engine, facilitates an optimisation of those requirements at minimum risk of nonconformity.
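    Extending the Monte Carlo sketch given above, one way to approach the optimisation this abstract mentions is to scan candidate requirement targets and keep the tightest one whose estimated risk of nonconformity stays within a tolerance. All parameters below are illustrative assumptions, not values from the project.

    import numpy as np

    rng = np.random.default_rng(1)
    beta, eta = 1.8, 35_000.0              # assumed Weibull shape/scale (hours)
    t_req, n_fleet, n_trials = 5_000.0, 200, 10_000
    risk_tolerance = 0.05                  # accept at most 5% nonconformity risk

    lifetimes = eta * rng.weibull(beta, size=(n_trials, n_fleet))
    fleet_rel = (lifetimes > t_req).mean(axis=1)   # per-trial fleet reliability

    targets = [0.90, 0.92, 0.94, 0.95, 0.96, 0.97]
    risks = {tg: (fleet_rel < tg).mean() for tg in targets}
    for tg in targets:
        print(f"target R({t_req:.0f} h) >= {tg:.2f}: risk = {risks[tg]:.3f}")
    ok = [tg for tg in targets if risks[tg] <= risk_tolerance]
    print("tightest target within tolerance:", max(ok) if ok else "none")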

    Potassium chlorate decomposition under high pressure

    High-pressure physics involves placing various substances under high pressure and observing the resulting changes. In this experiment, high pressure is generated using a diamond anvil cell, in which a metal gasket holds the sample between two diamonds that press on it. High pressures are reached with a moderate amount of force by exerting that force over a very small area. Diamonds are used because of their hardness and resistance to compression. The pressure exerted on the sample in a diamond anvil cell is often measured using ruby fluorescence: the behavior of ruby under high pressure is well known, so the pressure inside the cell can be determined by observing the fluorescence of a ruby chip placed inside the gasket along with the sample, where it is always at the same pressure as the sample. Potassium chlorate is a chemical often used as an oxygen producer, and as an explosive when mixed with other chemicals. It decomposes under heat to release oxygen gas, which is the reaction we are trying to induce by placing the chemical under pressure. When molecules heat up they vibrate more rapidly and are more likely to collide with each other; under higher pressures, atoms are forced closer together and collisions likewise become more likely. The purpose of this experiment is to determine whether pressure can induce the same decomposition in potassium chlorate as heat does.
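    The pressure readout described here has a standard closed form: one widely used calibration (Mao et al., 1986) converts the red shift of the ruby R1 fluorescence line into pressure. The short Python sketch below applies it; the measured wavelength in the example is made up for illustration.

    A_GPA, B = 1904.0, 7.665        # Mao et al. (1986) quasi-hydrostatic constants
    LAMBDA_0 = 694.24               # ambient-pressure R1 line position, nm

    def ruby_pressure_gpa(lambda_nm):
        # P = (A/B) * ((lambda / lambda_0)**B - 1), pressure in GPa
        return (A_GPA / B) * ((lambda_nm / LAMBDA_0) ** B - 1.0)

    print(f"{ruby_pressure_gpa(697.5):.1f} GPa")  # ~3.3 nm red shift -> ~9 GPa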

    Parallax and Distance Estimates for Twelve Cataclysmic Variable Stars

    We report parallax and distance estimates for twelve more cataclysmic binaries and related objects observed with the 2.4m Hiltner telescope at MDM Observatory. The final parallax accuracy is typically about 1 mas. For only one of the twelve objects, IR Gem, do we fail to detect a significant parallax. Notable results include distances for V396 Hya (CE 315), a helium double degenerate with a relatively long orbital period, and for MQ Dra (SDSSJ155331+551615), a magnetic system with a very low accretion rate. We find that the Z Cam star KT Persei is physically paired with a K main-sequence star lying 15 arcsec away. Several of the targets have distance estimates in the literature that are based on the white dwarf's effective temperature and flux; our measurements broadly corroborate these estimates, but tend to put the stars a bit closer, indicating that the white dwarfs may have rather larger masses than assumed. As a side note, we briefly describe radial velocity spectroscopy that refines the orbital period of V396 Hya to 65.07 +- 0.08 min.
    Comment: Accepted for the Astronomical Journal. 19 pages, no figures
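    The distances quoted rest on the elementary relation d [pc] = 1000 / parallax [mas]. The sketch below propagates a parallax uncertainty into a distance interval by naive Monte Carlo; the 8.0 +- 1.0 mas input is a made-up example (the paper quotes only a typical accuracy of about 1 mas), and real work would apply a prior to handle the bias of inverting noisy parallaxes.

    import numpy as np

    rng = np.random.default_rng(0)
    p_mas, sigma_mas = 8.0, 1.0                  # assumed parallax measurement
    samples = rng.normal(p_mas, sigma_mas, 100_000)
    samples = samples[samples > 0]               # crude cut; real work uses priors
    d_pc = 1000.0 / samples                      # d [pc] = 1000 / parallax [mas]
    lo, med, hi = np.percentile(d_pc, [16, 50, 84])
    print(f"distance = {med:.0f} (+{hi - med:.0f} / -{med - lo:.0f}) pc")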

    A study of subgroup discovery approaches for defect prediction

    Context: Although many papers have been published on software defect prediction techniques, machine learning approaches have yet to be fully explored. Objective: In this paper we suggest using a descriptive approach for defect prediction rather than the precise classification techniques that are usually adopted. This allows us to characterise defective modules with simple rules that can easily be applied by practitioners, delivering a practical (or engineering) approach rather than a highly accurate result. Method: We describe two well-known subgroup discovery algorithms, the SD algorithm and the CN2-SD algorithm, used to obtain rules that identify defect-prone modules. The empirical work is performed with publicly available datasets from the Promise repository and object-oriented metrics from an Eclipse repository related to defect prediction. Subgroup discovery algorithms are robust to characteristics of datasets that hinder the applicability of classification algorithms, and so remove the need for preprocessing techniques. Results: The results show that the generated rules can be used to guide testing effort in order to improve the quality of software development projects. Such rules can indicate metrics, their threshold values, and relationships between metrics of defective modules. Conclusions: The induced rules are simple to use and easy to understand, as they provide a description rather than a complete classification of the whole dataset. Thus this paper represents an engineering approach to defect prediction, i.e., an approach which is useful in practice, easily understandable, and applicable by practitioners.
    Funding: ICEBERG IAPP-2012-324356; MICINN TIN2011-28956-C02-0
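    For concreteness, the sketch below computes weighted relative accuracy (WRAcc), the rule-quality measure used by CN2-SD, for a few hypothetical metric-threshold rules on a toy defect dataset; the data and the LOC > threshold rules are fabricated purely for illustration.

    import numpy as np

    rng = np.random.default_rng(0)
    loc = rng.lognormal(4.0, 1.0, 500)           # toy lines-of-code metric
    defect = rng.random(500) < np.clip(loc / 400, 0.02, 0.9)  # toy labels

    def wracc(cond, target):
        # WRAcc = P(cond) * (P(target | cond) - P(target))
        if not cond.any():
            return 0.0
        return cond.mean() * (target[cond].mean() - target.mean())

    for thr in (50, 100, 200, 400):
        rule = loc > thr                         # candidate rule: LOC > thr
        print(f"LOC > {thr:>3}: coverage = {rule.mean():.2f}, "
              f"WRAcc = {wracc(rule, defect):.3f}")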

    Observing scattered X-ray radiation from gamma-ray bursts: a way to measure their collimation angles

    There are observational facts and theoretical arguments for an origin of gamma-ray bursts in molecular clouds in distant galaxies. If this is true, one could detect a significant flux of GRB prompt and early-afterglow X-ray radiation scattered into our line of sight by the molecular and atomic matter located within tens of parsecs of the GRB site, long after the afterglow has faded away. The scattered flux directly measures the typical density of the GRB ambient medium. Furthermore, if the primary emission is beamed, the scattered X-ray flux will decrease slowly for several months to years before falling off rapidly. Therefore, it should be possible to estimate the collimation angle of a burst from the light curve of its X-ray echo and a measured value of the line-of-sight absorption column depth. It is shown that, for the brightest GRBs, detection of such an echo is just within the reach of the Chandra and XMM-Newton observatories.
    Comment: 6 pages, 2 figures. Accepted for publication in Astronomy and Astrophysics
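    The timing argument behind this proposal can be made concrete: light scattered at radius R by material an angle theta off the line of sight arrives with a delay of (R/c)(1 - cos theta), so a jet of half-opening angle theta_j illuminates scatterers only out to theta_j, and the echo light curve should cut off near (R/c)(1 - cos theta_j). The sketch below evaluates this for an assumed 30 pc cloud; the radius and angles are illustrative, not values from the paper.

    import math

    C = 2.998e8                     # speed of light, m/s
    PC = 3.086e16                   # one parsec, m
    R = 30 * PC                     # assumed scattering-cloud radius

    for theta_j_deg in (5, 10, 20):
        theta = math.radians(theta_j_deg)
        t_cut = R / C * (1 - math.cos(theta)) / 86_400  # cutoff delay in days
        print(f"theta_j = {theta_j_deg:>2} deg -> echo cutoff ~ {t_cut:,.0f} days")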

    On the physical parametrization and magnetic analogs of the Emparan-Teo dihole solution

    The Emparan-Teo non-extremal black dihole solution is reparametrized using Komar quantities and the separation distance as arbitrary parameters. We show how the potential A_3 can be calculated for the magnetic analogs of this solution in the Einstein-Maxwell and Einstein-Maxwell-dilaton theories. We also demonstrate that, as in the extreme case, an external magnetic field can remove the supporting strut in the non-extremal black dihole.
    Comment: 9 pages, 1 figure