
    On the constrained economic design of control charts: a literature review

    Economic design is an appealing approach for setting the design parameters of a control chart. Unfortunately, economic models for designing control charts have rarely been implemented by quality practitioners, owing to the simplifying assumptions these models make when representing the multifaceted complexity and constraints present in manufacturing and transactional environments. Although scepticism about the practical usefulness of economic models has been growing, some recent studies in the literature approach the economic design of control charts from a new point of view: the objective is to achieve a well-balanced trade-off between the operational and the statistical aspects. From this perspective, the economic design problem can be understood in a broader sense as the constrained design of an SPC inspection procedure. This paper discusses some recent trends in the economic design stream of research and outlines the importance of considering the constraints related to the availability of SPC resources and of modelling the occurrence of random shifts.
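
    As an illustrative sketch only — the cost model, parameter values, and constraint threshold below are assumptions for exposition, not the formulations reviewed in the paper — a constrained economic design can be phrased as minimizing a Duncan-style hourly cost for an X-bar chart over the sample size n, the sampling interval h, and the control-limit width k, subject to a statistical constraint on the in-control average run length (ARL):

```python
# Hedged sketch of a constrained economic design of an X-bar chart.
# All costs and process parameters are illustrative assumptions.
import numpy as np
from scipy.optimize import minimize
from scipy.stats import norm

delta = 1.0      # assumed shift size (in sigma units)
lam = 0.02       # assumed shift rate (shifts per hour)
c_sample = 1.0   # assumed cost per sampled unit
c_false = 50.0   # assumed cost per false alarm
c_out = 100.0    # assumed hourly cost of out-of-control operation

def arl_in(n, k):
    # in-control ARL; false-alarm rate of k-sigma limits does not depend on n
    return 1.0 / (2 * norm.sf(k))

def arl_out(n, k):
    # out-of-control ARL for a mean shift of delta sigma
    beta = norm.cdf(k - delta * np.sqrt(n)) - norm.cdf(-k - delta * np.sqrt(n))
    return 1.0 / (1.0 - beta)

def hourly_cost(x):
    n, h, k = x
    detect_time = arl_out(n, k) * h          # expected detection delay (hours)
    cycle = 1.0 / lam + detect_time          # expected cycle length (hours)
    false_alarms = (1.0 / lam) / (arl_in(n, k) * h)  # per in-control period
    return (c_sample * n / h
            + c_false * false_alarms / cycle
            + c_out * detect_time / cycle)

# statistical constraint: in-control ARL of at least 370 samples
cons = [{"type": "ineq", "fun": lambda x: arl_in(x[0], x[2]) - 370.0}]
res = minimize(hourly_cost, x0=[5.0, 1.0, 3.0], constraints=cons,
               bounds=[(1, 25), (0.1, 8.0), (2.0, 4.0)])
n_opt, h_opt, k_opt = res.x
print(f"n={n_opt:.1f}, h={h_opt:.2f} h, k={k_opt:.2f}, cost={res.fun:.2f}/h")
```

    Since n is treated as continuous for the optimizer, a practitioner would round it and re-check the ARL constraint afterwards.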

    A Theoretical Foundation for the Development of Process Capability Indices and Process Parameters Optimization under Truncated and Censoring Schemes

    Process capability indices (PCIs) provide a measure of the output of an in-control process that conforms to a set of specification limits. These measures, which assume that process output is approximately normally distributed, are intended for measuring process capability in manufacturing systems. After inspection, however, non-conforming products are typically scrapped when units fail to meet the specification limits; hence, the actual distribution of shipped products that customers perceive is truncated. In this research, a set of customer-perceived PCIs based on the truncated normal distribution is developed as an extension of traditional manufacturer-based indices. Comparative studies and numerical examples reveal considerable differences between the traditional PCIs and the proposed PCIs. The comparison results suggest using the proposed PCIs for capability analyses when non-conforming products are scrapped prior to shipping to customers. Confidence interval approximations for the proposed PCIs are also developed, and a simulation technique is implemented to compare the proposed PCIs with their traditional counterparts across multiple performance scenarios. Robust parameter design (RPD), a systematic method for determining the optimum operating conditions that achieve quality improvement goals, is also studied in the context of censored data. Data censoring occurs in time-oriented observations when some values cannot be measured outside a predetermined study period. The conceptual basis underlying current RPD studies is random sampling from a normal distribution, assuming that all data points are uncensored; however, censoring schemes are widely implemented in lifetime testing, survival analysis, and reliability studies. This study therefore develops detailed guidelines for a new RPD method that accounts for type I right-censoring. The response functions are developed using nonparametric methods, including the Kaplan-Meier estimator, Greenwood's formula, and the Cox proportional hazards regression method. Various response-surface-based robust parameter design optimization models are proposed and demonstrated through a numerical example. Further, a process capability index for type I right-censored data using the nonparametric methods is also developed for assessing the performance of a product based on its lifetime.
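
    As an illustration of the core idea — not necessarily the exact indices developed in this research — the sketch below contrasts a traditional Cpk computed from the untruncated process parameters with an analogous index computed from the moments of the normal distribution truncated at the specification limits, i.e., the distribution of shipped units that customers actually perceive. All parameter values are assumed for the example:

```python
# Sketch: manufacturer-based Cpk vs. a customer-perceived index built
# from the truncated normal distribution of shipped (in-spec) units.
from scipy.stats import truncnorm

mu, sigma = 10.2, 1.0        # assumed in-control process parameters
lsl, usl = 7.0, 13.0         # assumed specification limits

# traditional manufacturer-based Cpk uses the untruncated parameters
cpk = min(usl - mu, mu - lsl) / (3 * sigma)

# after inspection, out-of-spec units are scrapped, so customers see
# the normal distribution truncated to [lsl, usl]
a, b = (lsl - mu) / sigma, (usl - mu) / sigma
mu_t = truncnorm.mean(a, b, loc=mu, scale=sigma)
sigma_t = truncnorm.std(a, b, loc=mu, scale=sigma)
cpk_t = min(usl - mu_t, mu_t - lsl) / (3 * sigma_t)

print(f"traditional Cpk            = {cpk:.3f}")
print(f"customer-perceived (trunc) = {cpk_t:.3f}")
```

    Because truncation shrinks the perceived standard deviation, the two indices can differ considerably, which is the motivation for distinguishing them.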

    Computational intelligence approaches to robotics, automation, and control [Volume guest editors]

    No abstract available

    Robust Design and Monitoring Tools for Sustainable and Resilient Structural Design and Infrastructure Management

    Structural systems are subject to inherent uncertainties due to the variability in many hard-to-control 'noise factors' that include, but are not limited to, external loads, material properties, and construction workmanship. Two design methodologies are widely accepted in the practicing engineering community for managing the variability associated with operational structures: Allowable Stress Design (ASD) and Load and Resistance Factor Design (LRFD). These traditional approaches explicitly recognize the presence of uncertainty; however, they do not take robustness against this uncertainty into consideration. Overlooking robustness in the structural design process has two drawbacks. First, the design may not satisfy the safety requirements if the actual uncertainties in the noise factors are underestimated: the safety requirements can easily be violated because of the high variation of the system response due to noise factors. Second, to guarantee safety in the presence of this high variability of the system response, the structural designer may be forced to choose an overly conservative, inefficient, and thus costly design. When robustness against uncertainty is not treated as one of the design objectives, this trade-off between over-design for safety and under-design for cost savings is exacerbated. The second chapter of this thesis demonstrates that safe and cost-effective designs can be achieved by implementing Robust Design concepts, originally developed in manufacturing engineering, to account for robustness against uncertainty. Robust Design concepts can be used to formulate structural designs that are insensitive to inherent variability in the design process, thus saving cost while satisfying the main objectives of safety and serviceability. The second chapter presents two methodologies for applying Robust Design principles to structural design using two optimization schemes: the one-at-a-time optimization method and the Particle Swarm Optimization (PSO) method. Next, this multi-disciplinary research project introduces a methodology for building a new framework, Structural Life-Cycle Assessment (S-LCA), for quantifying the structural sustainability and resiliency of built systems. The project brings together techniques and concepts from two distinct disciplines, Structural Health Monitoring (SHM) from civil engineering and Life Cycle Assessment (LCA) from environmental engineering, to construct the S-LCA charts. The intellectual innovation of this project lies in advancing infrastructure management techniques through the development of S-LCA charts, which can serve as an infrastructure monitoring and decision-making tool for quantifying the structural sustainability and resiliency of built systems. Such a tool would be of great use in aiding infrastructure managers when prescribing maintenance and repair schemes, and in aiding emergency managers and first responders when allocating disaster relief resources. Moreover, a quantitative, real-time evaluation of structural damage after a disaster will support emergency managers in resource allocation. The project integrates science-based modeling and simulation techniques with advanced monitoring and sensing tools, resulting in scientifically defensible, objective, and quantitative metrics of sustainability and resiliency for use in infrastructure management.
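
    A minimal sketch of the Robust Design idea in this setting, using a bare-bones PSO: design variables are chosen to keep a structural response on target while minimizing its variance under Monte Carlo-sampled noise factors. The toy response function, noise model, and PSO settings below are illustrative assumptions, not the thesis's formulation:

```python
# Sketch: robust design via PSO; the response g(x, z) is a hypothetical
# stand-in for a structural response in design variables x and noise z.
import numpy as np

rng = np.random.default_rng(0)

# common random noise samples (e.g., load and material variability),
# reused for every candidate so comparisons are deterministic
Z = rng.normal(0.0, 0.1, size=(200, 2))

def response(x, z):
    # hypothetical response (e.g., a normalized deflection)
    return x[0] * (1.0 + z[0]) - 0.5 * x[1] * (1.0 + z[1])

def robust_objective(x, target=1.0, weight=2.0):
    # penalize deviation of the mean from target plus response variance
    g = np.array([response(x, z) for z in Z])
    return (g.mean() - target) ** 2 + weight * g.var()

# bare-bones particle swarm optimization over the box [0, 5]^2
n_particles, n_iter, dim = 30, 100, 2
pos = rng.uniform(0, 5, size=(n_particles, dim))
vel = np.zeros_like(pos)
pbest = pos.copy()
pbest_val = np.array([robust_objective(p) for p in pos])
gbest = pbest[pbest_val.argmin()].copy()

for _ in range(n_iter):
    r1, r2 = rng.random((2, n_particles, dim))
    vel = 0.7 * vel + 1.5 * r1 * (pbest - pos) + 1.5 * r2 * (gbest - pos)
    pos = np.clip(pos + vel, 0, 5)
    val = np.array([robust_objective(p) for p in pos])
    improved = val < pbest_val
    pbest[improved], pbest_val[improved] = pos[improved], val[improved]
    gbest = pbest[pbest_val.argmin()].copy()

print("robust design:", gbest, "objective:", pbest_val.min())
```

    The same objective could equally be attacked with the one-at-a-time scheme mentioned above; PSO simply avoids getting trapped by interactions between design variables.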

    JOINING SEQUENCE ANALYSIS AND OPTIMIZATION FOR IMPROVED GEOMETRICAL QUALITY

    Disturbances in manufacturing and assembly processes cause geometrical variation from the ideal geometry. This variation eventually results in functional and aesthetic problems in the final product, so being able to control these disturbances is a key concern of the manufacturing industry. Joining sequences considerably affect the final geometrical outcome of an assembly, but optimizing the sequence for improved geometrical outcome is expensive both experimentally and computationally: in simulation-based approaches built on the finite element method, a large number of sequences must be evaluated. In this thesis, simulation-based joining sequence optimization using non-rigid variation simulation is studied. Initially, the limitations of the algorithms applied in the literature are addressed. A rule-based optimization approach based on meta-heuristic algorithms and heuristic search methods is introduced to improve the time-efficiency and accuracy of the previously applied algorithms. Based on the identified rules and heuristics, a reduced formulation of the sequence optimization is introduced by identifying the points critical to geometrical quality; a subset of the sequence problem is identified and solved in this formulation. For real-time optimization of the joining sequence problem, time-efficiency needs to be further enhanced through parallel computation. By identifying the sequence-deformation behavior of the assemblies, black-box surrogate models are introduced, enabling parallel evaluations and accurate approximation of the geometrical quality. Based on this finding, a deterministic stepwise search algorithm for rapid identification of the optimal sequence is introduced. Furthermore, a numerical approach is introduced to identify the number, location (from a set of alternatives), and sequence of the joining points critical to geometrical quality. Finally, the cause of the various deformations produced by different joining sequences is identified, and a time-efficient non-rigid variation simulation approach for evaluating the geometrical quality with respect to the sequences is proposed. The results of the studies presented indicate that simulation-based real-time optimization of joining sequences is achievable through a parallelized search algorithm and rapid evaluation of the sequences. The joining points critical to geometrical quality are identified while the sequence is optimized. The results help control the assembly process with respect to the joining operation, improve geometrical quality, and save significant computational time.
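
    The sketch below conveys the flavor of a deterministic stepwise search over joining sequences: the sequence is grown one joint at a time, keeping the extension with the best predicted geometrical quality. The joining points and the quality function are purely hypothetical placeholders for the expensive variation simulation or a trained surrogate model:

```python
# Sketch: greedy stepwise search over joining sequences, with a toy
# stand-in for the non-rigid variation simulation / surrogate model.
import itertools

POINTS = ["J1", "J2", "J3", "J4", "J5"]   # hypothetical joining points

def evaluate_sequence(seq):
    # placeholder for the predicted geometrical quality (e.g., RMS
    # deviation from nominal geometry) of a partial or full sequence
    return sum((i - POINTS.index(p)) ** 2 for i, p in enumerate(seq))

def stepwise_search(points):
    # extend the sequence one joint at a time; the candidate extensions
    # are independent, so a surrogate could evaluate them in parallel
    seq, remaining = [], set(points)
    while remaining:
        best = min(remaining, key=lambda p: evaluate_sequence(seq + [p]))
        seq.append(best)
        remaining.remove(best)
    return seq

best_seq = stepwise_search(POINTS)
print("stepwise sequence:", best_seq, "quality:", evaluate_sequence(best_seq))

# brute force over all n! sequences for comparison (feasible only for tiny n)
exhaustive = min(itertools.permutations(POINTS),
                 key=lambda s: evaluate_sequence(list(s)))
print("exhaustive optimum:", list(exhaustive))
```

    The stepwise search evaluates on the order of n^2 partial sequences instead of n!, which is what makes real-time use plausible once each evaluation is cheap.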

    Analysis and Tests for a Hybrid Model created from Classical Taguchi and Goal Post Manufacturing Loss Models

    We present an analysis of the previously proposed “modified quadratic” loss function. This loss model integrates elements from both the classical Taguchi and the Goal Post manufacturing loss models. Specifically, the analyzed hybrid model follows the quadratic Taguchi loss dependence between the upper and lower manufacturing specification limits, while outside these limits the loss rule agrees with the Goal Post model. The analysis contained herein shows that the Taguchi-Hybrid model does not overestimate the loss, as is inherent to the classical Taguchi model; nor does it ignore deviations from the exact target, so it does not underestimate the manufacturing loss, a symptom characteristic of the Goal Post model. The analysis of the Taguchi-Hybrid model is carried out assuming two different distributions of the manufacturing parameters, namely uniform and Gaussian. Exact analysis is provided for these part distributions, including the cases where the mean is on and off the ideal target value, i.e., with and without target bias. Under the assumption of a Gaussian part distribution, the expectation of the Taguchi-Hybrid loss function is representable in terms of the process capability and the normalized target bias. In the Gaussian case, this expectation can be cast into a five-term representation in which two of the terms are the classical Taguchi loss and the Goal Post loss, and the remaining three are “negative” corrective losses that compensate for the overestimation of loss by the first two terms. A wide range of tests was performed with the analytical model for parts distributed both uniformly and Gaussian, and numerical integration was employed to validate the derived expressions for the associated loss expectations. A hypothetical example of voltage regulator drift demonstrates that, in the uniform distribution case, the predicted loss of the Taguchi-Hybrid model lies between the loss predictions of the more conservative Taguchi loss model and the least conservative Goal Post loss model. A second hypothetical example details a procedure to generate salient target bias design limits for the channel length of a metal-oxide-semiconductor field-effect transistor (MOSFET). In this procedure, loss limits were applied both to the expectation of the Taguchi-Hybrid Goal Post loss term and, similarly, to the Taguchi-Hybrid quadratic loss term. With reasonable loss limits assigned to these expectations, it was found that the process design rule for target bias was controlled by the limit imposed on the Taguchi-Hybrid quadratic loss, which is related to the quality of parts passing inspection rather than to the fraction of parts rejected.
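
    To make the hybrid loss concrete, the sketch below defines the three loss rules and compares their expectations under a Gaussian part distribution with target bias, using numerical integration in the spirit of the validation described above. All parameter values are illustrative assumptions:

```python
# Sketch: classical Taguchi, Goal Post, and hybrid ("modified quadratic")
# losses, with expectations under a biased Gaussian part distribution.
from scipy.integrate import quad
from scipy.stats import norm

T, lsl, usl = 10.0, 7.0, 13.0    # target and specification limits (assumed)
k = 1.0                          # quadratic loss coefficient (assumed)
A = k * (usl - T) ** 2           # scrap cost, pinned to loss at the limits
mu, sigma = 10.5, 1.0            # Gaussian part distribution with bias

taguchi = lambda x: k * (x - T) ** 2                     # quadratic everywhere
goalpost = lambda x: 0.0 if lsl <= x <= usl else A       # step at the limits
hybrid = lambda x: taguchi(x) if lsl <= x <= usl else A  # quadratic inside

def expected_loss(loss):
    # integrate loss * pdf; 'points' flags the discontinuities at the limits
    val, _ = quad(lambda x: loss(x) * norm.pdf(x, mu, sigma),
                  mu - 8 * sigma, mu + 8 * sigma, points=[lsl, usl])
    return val

for name, loss in [("Taguchi", taguchi), ("Goal Post", goalpost),
                   ("Hybrid", hybrid)]:
    print(f"E[{name} loss] = {expected_loss(loss):.4f}")
```

    With the scrap cost pinned to the quadratic loss at the limits and a centered target, the hybrid expectation falls between the Goal Post and Taguchi expectations, consistent with the ordering reported for the voltage regulator example.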

    Modeling and Optimization of Stochastic Process Parameters in Complex Engineering Systems

    For quality engineering researchers and practitioners, a wide range of statistical tools and techniques is available for use in the manufacturing industry. The goal in applying these tools has always been to improve or optimize a product or process in terms of efficiency, production cost, or product quality. While tremendous progress has been made in the design of quality optimization models, a significant gap remains between existing research and the needs of the industrial community. Contemporary manufacturing processes are inherently more complex: they may involve multiple stages of production or require the assessment of multiple quality characteristics. New and emerging fields, such as nanoelectronics and molecular biometrics, demand degrees of precision and estimation that are not attainable with current tools and measures. And since most researchers focus on a specific type of characteristic or a given set of conditions, there are many critical industrial processes to which existing models are not applicable. Thus, the objective of this research is to improve existing techniques by expanding not only their range of applicability but also their ability to model a given process realistically. Several quality models are proposed that seek greater precision in the estimation of process parameters and the removal of assumptions that limit their breadth and scope. An extension is made to examine the effectiveness of these models both under non-standard conditions and in areas that have not previously been investigated. Following an in-depth literature review, various quality models are proposed, and numerical examples are used to validate the methodologies.