
    On the Statistical Modeling and Analysis of Repairable Systems

    We review basic modeling approaches for failure and maintenance data from repairable systems. In particular we consider imperfect repair models, defined in terms of virtual age processes, and the trend-renewal process, which extends the nonhomogeneous Poisson process and the renewal process. In the case where several systems of the same kind are observed, we show how observed covariates and unobserved heterogeneity can be included in the models. We also consider various approaches to trend testing. Modern reliability databases usually contain information on the type of failure, the type of maintenance, and so forth, in addition to the failure times themselves. Basing our work on recent literature, we present a framework where the observed events are modeled as marked point processes, with marks labeling the types of events. Throughout the paper the emphasis is more on modeling than on statistical inference. Comment: Published at http://dx.doi.org/10.1214/088342306000000448 in Statistical Science (http://www.imstat.org/sts/) by the Institute of Mathematical Statistics (http://www.imstat.org).
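
    The models above describe failure times through a time-varying intensity function. As an illustration of how such a process can be simulated, the sketch below draws event times from a nonhomogeneous Poisson process by Lewis-Shedler thinning; the linearly increasing intensity and all names are illustrative assumptions, not taken from the paper.

```python
import random

def simulate_nhpp(intensity, t_max, lam_max, seed=0):
    """Simulate event times of a nonhomogeneous Poisson process on (0, t_max]
    by Lewis-Shedler thinning; requires intensity(t) <= lam_max on the interval."""
    rng = random.Random(seed)
    t, events = 0.0, []
    while True:
        # candidate arrival from a homogeneous process with rate lam_max
        t += rng.expovariate(lam_max)
        if t > t_max:
            break
        # accept the candidate with probability intensity(t) / lam_max
        if rng.random() < intensity(t) / lam_max:
            events.append(t)
    return events

# Increasing intensity: failures become more frequent as the system ages
times = simulate_nhpp(lambda t: 0.5 * t, t_max=10.0, lam_max=5.0)
```

    The same skeleton works for any bounded intensity; an imperfect-repair model would additionally reset or rescale the time argument via a virtual age process.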

    Screening and metamodeling of computer experiments with functional outputs. Application to thermal-hydraulic computations

    To perform uncertainty, sensitivity, or optimization analysis on scalar variables calculated by a CPU-time-expensive computer code, a widely accepted methodology consists in first identifying the most influential uncertain inputs (by screening techniques), and then replacing the expensive model with a CPU-inexpensive mathematical function, called a metamodel. This paper extends this methodology to the functional-output case, for instance when the model output variables are curves. The screening approach is based on analysis of variance and principal component analysis of the output curves. The functional metamodeling consists of a curve classification step, a dimension reduction step, and then a classical metamodeling step. An industrial nuclear reactor application (dealing with uncertainties in pressurized thermal shock analysis) illustrates all these steps.
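
    The dimension-reduction-plus-metamodel idea for functional outputs can be sketched with plain NumPy: reduce the output curves to a few principal component scores, fit a cheap model per score, and reconstruct predicted curves. The toy simulator, the number of retained components, and the linear metamodel are illustrative assumptions; the paper's own classification and screening steps are not reproduced here.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy stand-in for an expensive simulator: 2 scalar inputs -> output curves
n_runs, n_time = 40, 100
X = rng.uniform(-1, 1, size=(n_runs, 2))
t = np.linspace(0, 1, n_time)
Y = (X[:, [0]] * np.sin(2 * np.pi * t)      # mode driven by input 1
     + X[:, [1]] * t                        # mode driven by input 2
     + 0.01 * rng.standard_normal((n_runs, n_time)))  # small noise

# Dimension reduction: PCA of the centered output curves via SVD
Y_mean = Y.mean(axis=0)
U, s, Vt = np.linalg.svd(Y - Y_mean, full_matrices=False)
k = 2                                # retained principal components
scores = U[:, :k] * s[:k]            # run-wise PC scores, shape (n_runs, k)

# Metamodel: one cheap linear model per retained PC score
A = np.c_[np.ones(n_runs), X]
coef, *_ = np.linalg.lstsq(A, scores, rcond=None)
Y_hat = Y_mean + (A @ coef) @ Vt[:k]  # predicted curves from the metamodel

rel_err = np.linalg.norm(Y_hat - Y) / np.linalg.norm(Y)
```

    In a real study the linear fit per score would typically be replaced by a Gaussian-process or polynomial-chaos metamodel, but the pipeline shape is the same.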

    Intersession Reliability and Within-Session Stability of a Novel Perception-Action Coupling Task

    BACKGROUND: The perception-action coupling task (PACT) was designed as a more ecologically valid measure of alertness/reaction time than the measures currently used by aerospace researchers. The purpose of this study was to assess the reliability, within-subject variability, and systematic bias associated with the PACT. METHODS: There were 16 subjects (men/women = 9/7; age = 27.8 ± 3.6 yr) who completed 4 identical testing sessions. The PACT requires subjects to judge whether a virtual ball could fit into an aperture. For each session, subjects completed nine cycles of the PACT, with each cycle lasting 5 min. Judgement accuracy and reaction time parameters were calculated for each cycle. Systematic bias was assessed with repeated-measures ANOVA, reliability with intraclass correlation coefficients (ICC), and within-subject variability with coefficients of variation (CVTE). RESULTS: Initiation time (mean = 0.1065 s) showed the largest systematic bias, requiring the elimination of three cycles to reduce bias; all other variables required at most one. All variables showed acceptable reliability (ICC > 0.70) and within-subject variability (CVTE < 20%) with only one cycle after elimination of the first three cycles. CONCLUSIONS: With a three-cycle familiarization period, the PACT was found to be reliable and stable.
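
    The reliability statistics named above follow standard formulas. The sketch below computes ICC(3,1) (two-way mixed, consistency, single measures) and a typical-error-based coefficient of variation on synthetic subject-by-trial data; the data and the exact CVTE definition are illustrative assumptions, not the study's.

```python
import numpy as np

def icc_3_1(data):
    """ICC(3,1): two-way mixed, consistency, single measures.
    data: (n_subjects, k_trials) array."""
    n, k = data.shape
    grand = data.mean()
    ss_rows = k * ((data.mean(axis=1) - grand) ** 2).sum()   # subjects
    ss_cols = n * ((data.mean(axis=0) - grand) ** 2).sum()   # trials
    ss_err = ((data - grand) ** 2).sum() - ss_rows - ss_cols
    ms_rows = ss_rows / (n - 1)
    ms_err = ss_err / ((n - 1) * (k - 1))
    return (ms_rows - ms_err) / (ms_rows + (k - 1) * ms_err)

def cv_te(data):
    """Within-subject CV (%) from the typical error: SD of consecutive
    trial-to-trial differences divided by sqrt(2), relative to the mean."""
    te = np.diff(data, axis=1).std(ddof=1) / np.sqrt(2)
    return 100 * te / data.mean()

rng = np.random.default_rng(1)
true_score = rng.normal(0.5, 0.1, size=(16, 1))        # stable subject ability
data = true_score + rng.normal(0, 0.02, size=(16, 4))  # 4 trials, small noise

icc = icc_3_1(data)   # near 1 when between-subject variance dominates
cv = cv_te(data)      # small when trial-to-trial noise is small
```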

    Input variable selection in time-critical knowledge integration applications: A review, analysis, and recommendation paper

    This is the post-print version of the final paper published in Advanced Engineering Informatics. The published article is available from the link below. Changes resulting from the publishing process, such as peer review, editing, corrections, structural formatting, and other quality control mechanisms, may not be reflected in this document. Changes may have been made to this work since it was submitted for publication. Copyright © 2013 Elsevier B.V.

    The purpose of this research is twofold: first, to undertake a thorough appraisal of existing Input Variable Selection (IVS) methods within the context of time-critical and computation-resource-limited dimensionality reduction problems; second, to demonstrate improvements to, and the application of, a recently proposed time-critical sensitivity analysis method called EventTracker to an environmental science industrial use case, i.e., sub-surface drilling. Producing time-critical, accurate knowledge about the state of a system (effect) under computational and data acquisition (cause) constraints is a major challenge, especially if the knowledge required is critical to system operation where the safety of operators or the integrity of costly equipment is at stake. Understanding and interpreting a chain of interrelated events, predicted or unpredicted, that may or may not result in a specific state of the system is the core challenge of this research. The main objective is then to identify which set of input data signals has a significant impact on the set of system state information (i.e., output). Through a cause-effect analysis technique, the proposed technique supports the filtering of unsolicited data that can otherwise clog up the communication and computational capabilities of a standard supervisory control and data acquisition (SCADA) system. The paper analyzes the performance of input variable selection techniques from a series of perspectives. It then expands the categorization and assessment of sensitivity analysis methods in a structured framework that takes into account the relationship between inputs and outputs, the nature of their time series, and the computational effort required. The outcome of this analysis is that established methods have limited suitability for use in time-critical variable selection applications. By way of a geological drilling monitoring scenario, the suitability of the proposed EventTracker sensitivity analysis method for use in high-volume and time-critical input variable selection problems is demonstrated.
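
    EventTracker itself is not reproduced here, but the general idea of event-triggered cause-effect association can be sketched: score each input by how often a change (event) in that input coincides with a change in the output, then rank inputs by that score. Everything below (signal model, threshold, scoring rule) is an illustrative assumption, not the published algorithm.

```python
import numpy as np

def rank_inputs_by_event_sensitivity(X, y, threshold=1e-9):
    """Illustrative event-based sensitivity ranking (NOT the EventTracker
    algorithm itself): score each input column by the fraction of its change
    events that coincide with a change event in the output."""
    dX = np.abs(np.diff(X, axis=0)) > threshold   # input-change events
    dy = np.abs(np.diff(y)) > threshold           # output-change events
    # score = P(output event | input event), a crude cause-effect association
    scores = (dX & dy[:, None]).sum(axis=0) / np.maximum(dX.sum(axis=0), 1)
    return np.argsort(scores)[::-1], scores

def piecewise_signal(rng, n, p_change=0.2):
    """Piecewise-constant signal that jumps at random steps, so that
    'events' (changes between consecutive samples) are sparse."""
    jumps = rng.random(n) < p_change
    return np.cumsum(np.where(jumps, rng.standard_normal(n), 0.0))

rng = np.random.default_rng(2)
n = 500
X = np.column_stack([piecewise_signal(rng, n) for _ in range(4)])
y = X[:, 0]                       # output driven entirely by input 0

order, scores = rank_inputs_by_event_sensitivity(X, y)
# input 0 should come out on top: its events always coincide with output events
```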

    SCOR Quality Model Affecting Manufacturing Firm’s Supply Chain Quality Performance and the Moderating Effect of QMS

    The main objective of this study is hypothesis testing to explain the nature of the relationship between the independent variables (the SCOR quality model) and the dependent variable (a firm's supply chain quality performance), with the quality management system (QMS) as the moderating variable.
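
    Testing a moderating effect of this kind is commonly done with moderated regression: include the product of the independent variable and the moderator, and examine the interaction coefficient. The sketch below does this on synthetic data; all variable names and effect sizes are illustrative assumptions, not the study's results.

```python
import numpy as np

rng = np.random.default_rng(3)
n = 200
scor = rng.normal(size=n)   # SCOR quality practices (standardized, synthetic)
qms = rng.normal(size=n)    # QMS maturity, the moderator (standardized, synthetic)
# Performance depends on SCOR, QMS, and their interaction (the moderation)
perf = 0.5 * scor + 0.2 * qms + 0.4 * scor * qms + rng.normal(scale=0.3, size=n)

# Moderated regression: add the product term scor * qms to the design matrix
X = np.column_stack([np.ones(n), scor, qms, scor * qms])
beta, *_ = np.linalg.lstsq(X, perf, rcond=None)
interaction_effect = beta[3]  # clearly nonzero => QMS moderates the relationship
```

    A significance test on the interaction coefficient (e.g. a t-test on beta[3]) is what would formally decide the moderation hypothesis.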