
    Managing Well Integrity using Reliability Based Models


    ISBIS 2016: Meeting on Statistics in Business and Industry

    This book includes the abstracts of the talks presented at the 2016 International Symposium on Business and Industrial Statistics (ISBIS 2016), held in Barcelona, June 8-10, 2016, and hosted by the Department of Statistics and Operations Research of the Universitat Politècnica de Catalunya - Barcelona TECH. The meeting took place in the ETSEIB building (Escola Tècnica Superior d'Enginyeria Industrial) at Avda. Diagonal 647. The organizers celebrated the continued success of the ISBIS and ENBIS societies, and the meeting drew together the international community of statisticians, both academics and industry professionals, who share the goal of making statistics the foundation for decision making in business and related applications. The Scientific Program Committee consisted of: David Banks, Duke University; Amílcar Oliveira, DCeT - Universidade Aberta and CEAUL; Teresa A. Oliveira, DCeT - Universidade Aberta and CEAUL; Nalini Ravishankar, University of Connecticut; Xavier Tort Martorell, Universitat Politècnica de Catalunya, Barcelona TECH; Martina Vandebroek, KU Leuven; Vincenzo Esposito Vinzi, ESSEC Business School

    Financial constraints and capacity adjustment in the United Kingdom: Evidence from a large panel of survey data

    The interrelationship between financial constraints and firm activity is a hotly debated issue. The way firms cope with financial constraints is fundamental to the analysis of monetary transmission, of financial stability, and of growth and development. The CBI Industrial Trends Survey contains detailed information on the financial constraints faced by a large sample of UK manufacturers. This paper uses the quarterly CBI Industrial Trends Survey firm-level data between January 1989 and October 1999. The cleaned sample contains 49,244 quarterly observations on 5,196 firms. As more than 63% of the observations refer to firms with fewer than 200 employees, the data set is especially well suited for comparing large and small companies. The data set is presented and a new method of checking the informational content of the data is developed. Whereas the relationship between investment activity and financial constraints is theoretically ambiguous due to simultaneity, the link between financial constraints on the one hand and the prevalence and duration of capacity gaps on the other should be unambiguously positive. Looking at the relationship between both types of constraints, two important results emerge. First, there is shown to be informational content in the survey data on financial constraints: financially constrained firms take longer to close capacity gaps, which indicates that financial constraints do indeed play a part in the investment process. Second, small firms close their capacity gaps faster than large firms do, but financial constraints seem to be of greater relevance to their adjustment dynamics. Keywords: financial constraints, investment, capacity adjustment, small firm finance, duration analysis
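
    The reported duration-analysis result (constrained firms take longer to close capacity gaps) can be illustrated with a proportional hazards regression of gap-closing spells on a financial-constraint indicator and a firm-size dummy. The sketch below runs on synthetic data: the column names (duration, closed, fin_constrained, small_firm) are hypothetical stand-ins for the CBI survey variables, and the Cox model is one plausible estimator for this kind of duration analysis, not necessarily the one the paper uses.

    import numpy as np
    import pandas as pd
    from lifelines import CoxPHFitter

    rng = np.random.default_rng(0)
    n = 500
    fin = rng.integers(0, 2, n)    # survey-reported financial constraint
    small = rng.integers(0, 2, n)  # fewer than 200 employees
    # constrained firms get longer gap spells; small firms shorter ones
    duration = rng.exponential(4.0 * 1.5**fin * 0.7**small).round() + 1
    closed = (rng.random(n) > 0.2).astype(int)  # 0 = right-censored spell

    spells = pd.DataFrame({"duration": duration, "closed": closed,
                           "fin_constrained": fin, "small_firm": small})

    cph = CoxPHFitter()
    cph.fit(spells, duration_col="duration", event_col="closed")
    # a negative coefficient on fin_constrained means constrained firms
    # close their capacity gaps more slowly, as the paper reports
    cph.print_summary()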

    New approaches to parameter estimation with statistical censoring by means of the CEV algorithm: Characterization of its properties for high-performance normal processes

    The process of parameter estimation to characterize a population using algorithms is under constant development and refinement. Recent years have shown that data-based decision-making is complex when there is uncertainty generated by statistical censoring. The purpose of this article is to evaluate the effect of statistical censoring on the normal distribution, which is common in many processes. Parameter estimation properties are characterized with the conditional expected value (CEV) algorithm, using different censoring percentages and sample sizes. The estimation properties chosen for the study focus on the monitoring of, and decision-making in, industrial processes in the presence of censoring. Neira Rueda, J.; Carrión García, A. (2023). New approaches to parameter estimation with statistical censoring by means of the CEV algorithm: Characterization of its properties for high-performance normal processes. Communications in Statistics - Theory and Methods. 52(10):3557-3573. https://doi.org/10.1080/03610926.2021.1977323
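
    A minimal sketch of a conditional expected value (CEV) iteration for a right-censored normal sample follows: each censored observation is replaced by its conditional expectation above the censoring limit, E[X | X > c] = mu + sigma * phi(z) / (1 - Phi(z)) with z = (c - mu) / sigma, and mu and sigma are re-estimated until convergence. This is a common formulation of the CEV idea; the exact variant characterized in the article may differ in detail.

    import numpy as np
    from scipy.stats import norm

    def cev_estimate(x, censored, tol=1e-8, max_iter=200):
        """Estimate (mu, sigma) of a normal sample under right censoring.

        x        : data vector, holding the censoring limit where censored
        censored : boolean mask marking the right-censored entries
        """
        y = x.astype(float).copy()
        mu, sigma = y.mean(), y.std(ddof=1)
        for _ in range(max_iter):
            z = (x[censored] - mu) / sigma
            # E[X | X > c] = mu + sigma * phi(z) / (1 - Phi(z))
            y[censored] = mu + sigma * norm.pdf(z) / norm.sf(z)
            mu_new, sigma_new = y.mean(), y.std(ddof=1)
            if abs(mu_new - mu) < tol and abs(sigma_new - sigma) < tol:
                break
            mu, sigma = mu_new, sigma_new
        return mu, sigma

    # example: a standard normal sample right-censored at c = 1.0
    rng = np.random.default_rng(1)
    sample = rng.normal(0.0, 1.0, 500)
    cens = sample > 1.0
    mu_hat, sigma_hat = cev_estimate(np.where(cens, 1.0, sample), cens)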

    A data analytics approach to gas turbine prognostics and health management

    As a consequence of the recent deregulation of the electrical power production industry, there has been a shift in the traditional ownership of power plants and in the way they are operated. To hedge their business risks, many of the new private entrepreneurs enter into long-term service agreements (LTSA) with third parties for their operation and maintenance activities. As the major LTSA providers, original equipment manufacturers have invested huge amounts of money to develop preventive maintenance strategies that minimize the occurrence of costly unplanned outages resulting from failures of the equipment covered under LTSA contracts. Indeed, a recent study by the Electric Power Research Institute estimates the cost benefit of preventing a failure of a General Electric 7FA or 9FA technology compressor at $10 to $20 million. Therefore, in this dissertation, a two-phase data analytics approach is proposed that uses existing gas path and vibration monitoring sensor data to, first, develop a proactive strategy that systematically detects and validates catastrophic failure precursors so as to avoid the failure and, second, estimate the residual time to failure of the unhealthy items. For the first part of this work, the time-frequency technique of the wavelet packet transform is used to de-noise the noisy sensor data. Next, the time-series signal of each sensor is decomposed in a multi-resolution analysis to extract its features. After that, probabilistic principal component analysis is applied as a data fusion technique to reduce the potentially correlated multi-sensor measurements to a few uncorrelated principal components. The last step of the failure precursor detection methodology, the anomaly detection decision, is itself a multi-stage process. The principal components obtained from the data fusion step are first combined into a one-dimensional reconstructed signal representing the overall health of the monitored systems. Then, two damage indicators of the reconstructed signal are defined and monitored for defects using a statistical process control approach. Finally, the Bayesian evaluation method for hypothesis testing is applied against a computed threshold to test for deviations from the healthy band. To model the residual time to failure, the anomaly severity index and the anomaly duration index are defined as defect characteristics. Two modeling techniques are investigated for prognosticating the survival time after an anomaly is detected: a deterministic regression approach, and a parametric approximation of the non-parametric Kaplan-Meier estimator. It is established that the deterministic regression provides poor prediction estimates. The non-parametric survival analysis technique of the Kaplan-Meier estimator provides the empirical survivor function of a data set comprising both non-censored and right-censored data. Though powerful because no lifetime distribution is assumed a priori, the Kaplan-Meier result lacks the flexibility to be transplanted to other units of a given fleet. The parametric analysis of the survival data is performed with two popular failure analysis distributions: the exponential distribution and the Weibull distribution. The conclusion from the parametric analysis of the Kaplan-Meier plot is that the larger the data set, the more accurate the prognostication ability of the residual-time-to-failure model. Ph.D. Committee Chair: Mavris, Dimitri; Committee Members: Jiang, Xiaomo; Kumar, Virendra; Saleh, Joseph; Vittal, Sameer; Volovoi, Vital
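
    As an illustrative sketch of the pipeline's first step, the function below de-noises one sensor channel by soft-thresholding its wavelet packet coefficients with the universal threshold. The settings (db4 wavelet, three decomposition levels, MAD noise estimate) are assumptions for the example, not the dissertation's actual configuration.

    import numpy as np
    import pywt

    def wp_denoise(x, wavelet="db4", level=3):
        """De-noise a 1-D signal via soft-thresholded wavelet packets."""
        # noise scale from the finest detail coefficients (MAD estimate)
        _, cD = pywt.dwt(x, wavelet)
        sigma = np.median(np.abs(cD)) / 0.6745
        thresh = sigma * np.sqrt(2.0 * np.log(len(x)))  # universal threshold

        wp = pywt.WaveletPacket(data=x, wavelet=wavelet,
                                mode="symmetric", maxlevel=level)
        for node in wp.get_level(level, order="natural"):
            node.data = pywt.threshold(node.data, thresh, mode="soft")
        return wp.reconstruct(update=False)[:len(x)]

    # example: a noisy vibration-like test signal
    t = np.linspace(0.0, 1.0, 1024)
    noisy = (np.sin(2 * np.pi * 5 * t)
             + 0.4 * np.random.default_rng(3).normal(size=t.size))
    clean = wp_denoise(noisy)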

    Merging Data Sources to Predict Remaining Useful Life – An Automated Method to Identify Prognostic Parameters

    The ultimate goal of most prognostic systems is accurate prediction of the remaining useful life (RUL) of individual systems or components based on their use and performance. This class of prognostic algorithms is termed Degradation-Based, or Type III Prognostics. As equipment degrades, measured parameters of the system tend to change; these sensed measurements, or appropriate transformations thereof, may be used to characterize degradation. Traditionally, individual-based prognostic methods use a measure of degradation to make RUL estimates. Degradation measures may include sensed measurements, such as temperature or vibration level, or inferred measurements, such as model residuals or physics-based model predictions. Often, it is beneficial to combine several measures of degradation into a single parameter. Selection of an appropriate parameter is key for making useful individual-based RUL estimates, but methods to aid in this selection are absent in the literature. This dissertation introduces a set of metrics that characterize the suitability of a prognostic parameter. Parameter features such as trendability, monotonicity, and prognosability can be used to compare candidate prognostic parameters to determine which is most useful for individual-based prognosis. Trendability indicates the degree to which the parameters of a population of systems have the same underlying shape. Monotonicity characterizes the underlying positive or negative trend of the parameter. Finally, prognosability gives a measure of the variance in the critical failure value of a population of systems. By quantifying these features for a given parameter, the metrics can be used with any traditional optimization technique, such as genetic algorithms, to identify the optimal parameter for a given system. An appropriate parameter may be used with a General Path Model (GPM) approach to make RUL estimates for specific systems or components. A dynamic Bayesian updating methodology is introduced to incorporate prior information in the GPM methodology. The proposed methods are illustrated with two applications: first, to the simulated turbofan engine data provided in the 2008 Prognostics and Health Management Conference Prognostics Challenge and, second, to data collected in a laboratory milling equipment wear experiment. The automated system was shown to identify appropriate parameters in both situations and to facilitate Type III prognostic model development.
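
    The three suitability metrics admit compact formalizations along the lines below. These follow common definitions from the prognostics literature (signed differences for monotonicity, spread of failure values relative to the distance traveled for prognosability, minimum pairwise correlation of resampled paths for trendability); the dissertation's exact formulas may differ in detail.

    import numpy as np

    def monotonicity(paths):
        # mean over units of |#positive steps - #negative steps| / (n - 1)
        vals = [abs((np.diff(p) > 0).sum() - (np.diff(p) < 0).sum())
                / (len(p) - 1) for p in paths]
        return float(np.mean(vals))

    def prognosability(paths):
        # spread of the failure value relative to the distance traveled
        end = np.array([p[-1] for p in paths])
        travel = np.array([abs(p[-1] - p[0]) for p in paths])
        return float(np.exp(-end.std() / travel.mean()))

    def trendability(paths, m=100):
        # minimum absolute correlation between paths resampled to length m
        grid = np.linspace(0.0, 1.0, m)
        r = np.array([np.interp(grid, np.linspace(0.0, 1.0, len(p)), p)
                      for p in paths])
        return float(np.min(np.abs(np.corrcoef(r))))

    A weighted sum of the three scores then serves naturally as the fitness function that a genetic algorithm maximizes over candidate parameter transformations.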

    Hazard rate models for early warranty issue detection using upstream supply chain information

    This research presents a statistical methodology for constructing an early automotive warranty issue detection model based on upstream supply chain information. This is in contrast to extant methods, which are mostly reactive and rely only on data available from the OEMs (original equipment manufacturers). For upstream supply chain information with a direct history of warranty claims, the research proposes hazard rate models that link this information as explanatory covariates for early detection of warranty issues. For upstream supply chain information without a direct warranty claims history, Bayesian hazard rate models are introduced to account for the uncertainties of the explanatory covariates. In doing so, the methodology improves both the accuracy of warranty issue detection and the lead time for detection. The proposed methodology is illustrated and validated using real-world data from a leading global Tier-1 automotive supplier.
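
    One concrete way to link upstream covariates into a hazard rate model is a parametric accelerated-failure-time regression, sketched below with a Weibull model on synthetic data. The covariate names (supplier_defect_rate, inspection_fail) are hypothetical placeholders for the supply chain information the paper describes, and the AFT form is one plausible choice among hazard regression models, not necessarily the authors'.

    import numpy as np
    import pandas as pd
    from lifelines import WeibullAFTFitter

    rng = np.random.default_rng(2)
    n = 1000
    defect_rate = rng.uniform(0.0, 0.05, n)  # hypothetical supplier signal
    inspect_fail = rng.integers(0, 2, n)     # hypothetical inspection flag
    # worse upstream quality -> earlier warranty claims
    months = (rng.weibull(1.5, n) * 36.0
              * np.exp(-8.0 * defect_rate - 0.3 * inspect_fail))
    claimed = (months < 36.0).astype(int)    # claims within coverage window
    months = np.minimum(months, 36.0)        # right-censor at 36 months

    claims = pd.DataFrame({"months": months, "claimed": claimed,
                           "supplier_defect_rate": defect_rate,
                           "inspection_fail": inspect_fail})

    aft = WeibullAFTFitter()
    aft.fit(claims, duration_col="months", event_col="claimed")
    aft.print_summary()  # negative coefficients -> shorter time to claim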

    Methods for dependability analysis of small satellite missions

    The use of small satellites as platforms for fast, relatively low-cost access to space has increased in recent years. In particular, many universities around the world now have permanent hands-on education programs based on CubeSats. These small and cheap platforms are becoming more and more attractive also for other-than-educational missions, such as technology demonstration, science applications, and Earth observation. These new objectives require the development of adequate technology to increase CubeSat performance. Furthermore, it is necessary to improve mission reliability. This research studies methods for the dependability analysis of small satellite missions. The attention is focused on reliability, as the main attribute of dependability, of CubeSats and CubeSat missions. The work is structured in three main blocks. The first part is dedicated to a general study of dependability from the theoretical point of view: the dependability attributes, the threats that can affect the dependability of a system, the techniques used to mitigate those threats, the parameters used to measure dependability, and models and techniques for dependability modelling. The second part contains a study of the failures that occurred during CubeSat missions in the last ten years, together with an evaluation of their observed reliability. To perform this analysis, a database was created containing information on all CubeSats launched up to December 2013. The information was gathered from public sources (i.e. CubeSat project websites, publications in international journals, etc.) and comprises general information (e.g. launch date, objectives) and data regarding possible failures. All this information is then used to conduct a quantitative reliability analysis of these missions by means of non-parametric and parametric methods, demonstrating that these failures follow a Weibull distribution. In the third section, different methods based on the concepts of fault prevention, removal, and tolerance are proposed to evaluate and increase the dependability, and specifically the reliability, of CubeSats and their missions. Three methods have been developed: 1) after an analysis of the activities conducted by CubeSat developers during the whole CubeSat life-cycle, a wide range of activities to be conducted during all phases of the satellite's life-cycle is proposed to increase the mission success rate; 2) reliability is increased through CubeSat verification, mainly by tailoring international ECSS standards for application to a CubeSat project; 3) reliability is raised at the mission level by implementing distributed mission architectures instead of classical monolithic architectures. All the methods developed in this PhD research have been applied to real space projects under development at Politecnico di Torino within the e-st@r program, which is conducted by the CubeSat Team of the Mechanical and AeroSpace Engineering Department. Specifically, the e-st@r-I, e-st@r-II, and 3STAR CubeSats have been used as test cases for the proposed methods. Moreover, part of the present research was conducted during an internship at the European Space Research and Technology Centre (ESTEC) of the European Space Agency (ESA) in Noordwijk (The Netherlands). In particular, the partial realisation of the CubeSat database, the analysis of the activities conducted by CubeSat developers, and the definition of activities to increase the mission success rate were carried out during the internship.
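
    In outline, the observed-reliability analysis can be reproduced by fitting a non-parametric Kaplan-Meier curve to the right-censored mission lifetimes and then a two-parameter Weibull model for comparison. The lifetimes below are synthetic placeholders, not the actual CubeSat database.

    import numpy as np
    from lifelines import KaplanMeierFitter, WeibullFitter

    # synthetic mission lifetimes in days; event = 0 means the CubeSat
    # was still operating when observation ended (right-censored)
    T = np.array([1, 3, 10, 45, 60, 120, 200, 365, 400, 700])
    E = np.array([1, 1, 1, 1, 0, 1, 0, 1, 0, 0])

    kmf = KaplanMeierFitter()
    kmf.fit(T, event_observed=E)
    print(kmf.survival_function_)  # empirical reliability R(t)

    wf = WeibullFitter()
    wf.fit(T, event_observed=E)
    # a shape parameter (rho_) below 1 indicates infant mortality,
    # consistent with early failures reported for university CubeSats
    print(wf.lambda_, wf.rho_)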
