
    Artificial Neural Network-based error compensation procedure for low-cost encoders

    An Artificial Neural Network (ANN)-based error compensation method is proposed for improving the accuracy of resolver-based 16-bit encoders by compensating for their respective systematic error profiles. For a particular encoder, the compensation procedure involves obtaining its error profile by calibrating it on a precision rotary table, training the neural network on part of these data, and then determining the corrected encoder angle by subtracting the ANN-predicted error from the measured encoder angle. Since the resolvers cannot be assumed to have identical error profiles, owing to inherent micro-scale differences in their construction, the ANN is trained on one error profile at a time and the resulting weight file is used only for compensating the systematic error of that particular encoder. The systematic nature of each encoder's error profile has also been validated by repeated calibration over a period of time: the error profiles of a given encoder recorded at different epochs show nearly reproducible behavior. The ANN-based error compensation procedure has been implemented for four encoders by training the ANN with their respective error profiles, and the results indicate that the accuracy of the encoders can be improved by nearly an order of magnitude, from the quoted ~6 arc-min to ~0.65 arc-min, when the corresponding ANN-generated weight files are used for determining the corrected encoder angle.
    Comment: 16 pages, 4 figures. Accepted for publication in Measurement Science and Technology (MST).
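    As a rough illustration of the procedure, the sketch below fits a small feed-forward network to a synthetic error profile and subtracts its prediction from the measured angle. The network architecture, the sinusoidal error model, and the scikit-learn tooling are all illustrative assumptions; the abstract does not specify the paper's actual ANN configuration.

    import numpy as np
    from sklearn.model_selection import train_test_split
    from sklearn.neural_network import MLPRegressor

    # Synthetic stand-in for one encoder's calibrated error profile (arc-min),
    # obtained in practice from a precision rotary table.
    rng = np.random.default_rng(0)
    measured_deg = np.linspace(0.0, 360.0, 720)
    error_arcmin = 4.0 * np.sin(np.radians(2.0 * measured_deg)) + rng.normal(0.0, 0.3, 720)

    X = measured_deg.reshape(-1, 1)
    X_train, X_test, y_train, y_test = train_test_split(X, error_arcmin, random_state=0)

    # One network per encoder: its weight file encodes that encoder's
    # systematic profile and is used only for that encoder.
    ann = MLPRegressor(hidden_layer_sizes=(20, 20), max_iter=5000, random_state=0)
    ann.fit(X_train, y_train)

    def corrected_angle(angle_deg):
        """Subtract the ANN-predicted systematic error from the raw reading."""
        predicted_error_deg = ann.predict([[angle_deg]])[0] / 60.0  # arc-min -> deg
        return angle_deg - predicted_error_deg

    residual = y_test - ann.predict(X_test)
    print("residual error ~ %.2f arc-min" % np.abs(residual).mean())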

    Investigation of Adjoint Based Shape Optimization Techniques in NASCART-GT using Automatic Reverse Differentiation

    Automated shape optimization involves making suitable modifications to a geometry that can lead to significant improvements in aerodynamic performance. Currently available mid-fidelity aerodynamic optimizers cannot be utilized in the late stages of the design process for performing minor, but consequential, tweaks in geometry. High-fidelity shape optimization techniques are therefore explored which, even though computationally demanding, are invaluable since they can account for realistic effects like turbulence and viscosity. The high computational costs associated with the optimization are avoided by using an indirect optimization approach, which decouples the effect of the flow field variables on the gradients involved. The main challenge in performing the optimization is to maintain low sensitivity to the number of input design variables, which necessitates the use of reverse automatic differentiation tools to generate the gradient. All efforts have been made to keep computational costs to a minimum, thereby enabling high-fidelity optimization to be used even in the initial design stages. A preliminary roadmap is laid out for an initial implementation of optimization algorithms using the adjoint approach in the high-fidelity CFD code NASCART-GT.
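    The following minimal sketch illustrates, under simplified assumptions, why the adjoint approach keeps gradient cost insensitive to the number of design variables: a single adjoint solve replaces one forward solve per design variable. The linear residual R(u, x) = Au - Bx and the quadratic objective below are invented for illustration and are unrelated to NASCART-GT's actual discretization.

    import numpy as np

    n_state, n_design = 50, 1000
    rng = np.random.default_rng(1)
    A = np.eye(n_state) * 4.0 + rng.normal(0.0, 0.1, (n_state, n_state))
    B = rng.normal(0.0, 1.0, (n_state, n_design))  # db/dx, assumed linear here
    x = rng.normal(0.0, 1.0, n_design)

    u = np.linalg.solve(A, B @ x)           # forward (flow) solve: A u = B x
    dJ_du = 2.0 * u                         # J = u.u  =>  dJ/du = 2u
    lam = np.linalg.solve(A.T, dJ_du)       # single adjoint solve: A^T lam = dJ/du
    dJ_dx = lam @ B                         # full gradient over all 1000 variables

    # Finite-difference check on one design variable.
    eps, i = 1e-6, 7
    x2 = x.copy(); x2[i] += eps
    u2 = np.linalg.solve(A, B @ x2)
    print(dJ_dx[i], (u2 @ u2 - u @ u) / eps)  # should agree closely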

    Feasibility study of an Integrated Program for Aerospace vehicle Design (IPAD). Volume 2: The design process

    The extent to which IPAD is to support the design process is identified. Case studies of representative aerospace products were developed as models to characterize the design process and to provide design requirements for the IPAD computing system.

    Data Driven Surrogate Based Optimization in the Problem Solving Environment WBCSim

    Large scale, multidisciplinary engineering designs are difficult due to the complexity and dimensionality of these problems. Direct coupling between the analysis codes and the optimization routines can be prohibitively time consuming due to the complexity of the underlying simulation codes. One way of tackling this problem is to construct computationally cheaper approximations of the expensive simulations that mimic the behavior of the simulation model as closely as possible. This paper presents a data driven, surrogate based optimization algorithm that uses a trust region based sequential approximate optimization (SAO) framework and a statistical sampling approach based on design of experiment (DOE) arrays. The algorithm is implemented using techniques from two packages, SURFPACK and SHEPPACK, that provide a collection of approximation algorithms to build the surrogates, and three different DOE techniques, full factorial (FF), Latin hypercube sampling (LHS), and central composite design (CCD), are used to train the surrogates. The results are compared with the optimization results obtained by directly coupling an optimizer with the simulation code. The biggest concern in using the SAO framework based on statistical sampling is the generation of the required database: as the number of design variables grows, the computational cost of generating it grows rapidly. A data driven approach is proposed to tackle this situation, in which the expensive simulation is run if and only if a nearby data point does not exist in the cumulatively growing database, as sketched below. Over time the database matures and is enriched as more and more optimizations are performed. Results show that the proposed methodology dramatically reduces the total number of calls to the expensive simulation during the optimization process.
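    A minimal sketch of that reuse rule, assuming a Euclidean "nearby" test with a fixed tolerance (the paper's actual distance criterion and tolerance are not given in the abstract):

    import numpy as np

    class SimulationDatabase:
        def __init__(self, simulate, tol=1e-2):
            self.simulate = simulate      # expensive analysis code
            self.tol = tol                # "nearby" radius in design space
            self.X, self.y = [], []       # cumulatively growing database

        def evaluate(self, x):
            x = np.asarray(x, dtype=float)
            if self.X:
                d = np.linalg.norm(np.array(self.X) - x, axis=1)
                i = int(np.argmin(d))
                if d[i] <= self.tol:      # a nearby point exists: reuse it
                    return self.y[i]
            fx = self.simulate(x)         # otherwise pay for a real run
            self.X.append(x)
            self.y.append(fx)
            return fx

    db = SimulationDatabase(simulate=lambda x: float(np.sum(x**2)), tol=0.05)
    print(db.evaluate([0.30, 0.40]))      # real simulation run
    print(db.evaluate([0.31, 0.40]))      # within tol: served from the database
    print(len(db.X))                      # only one expensive call was made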

    Winning customer loyalty in an automotive company through Six Sigma: a case study

    Six Sigma is a disciplined approach to improving product, process and service quality. Since its inception at Motorola in the mid 1980s, Six Sigma has evolved significantly and continues to expand to improve process performance, enhance business profitability and increase customer satisfaction. This paper presents an extensive literature review based on the experiences of both academics and practitioners on Six Sigma, followed by the application of the Define, Measure, Analyse, Improve, Control (DMAIC) problem-solving methodology to identify the parameters causing casting defects and to control these parameters. The results of the study are based on the application of tools and techniques in the DMAIC methodology, i.e. Pareto Analysis, Measurement System Analysis, Regression Analysis and Design of Experiment. The results show that the application of the Six Sigma methodology reduced casting defects and increased the process capability from 0.49 to 1.28. The application of DMAIC has resulted in a significant financial impact (over U.S. $110,000 per annum) on the bottom line of the company.
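    For readers unfamiliar with the capability indices quoted above, the short sketch below computes Cp and Cpk from sample data. The specification limits and the two synthetic samples are invented so that the "before" and "after" values land near the reported 0.49 and 1.28; they are not the case study's data.

    import numpy as np

    def capability(samples, lsl, usl):
        mu, sigma = np.mean(samples), np.std(samples, ddof=1)
        cp = (usl - lsl) / (6 * sigma)                 # potential capability
        cpk = min(usl - mu, mu - lsl) / (3 * sigma)    # accounts for centering
        return cp, cpk

    rng = np.random.default_rng(42)
    before = rng.normal(10.0, 0.34, 200)   # wide spread: Cp ~ 0.49
    after = rng.normal(10.0, 0.13, 200)    # after DMAIC: Cp ~ 1.28
    lsl, usl = 9.5, 10.5

    print("before: Cp=%.2f Cpk=%.2f" % capability(before, lsl, usl))
    print("after:  Cp=%.2f Cpk=%.2f" % capability(after, lsl, usl))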

    Optimal linear combination of poisson variables for multivariate statistical process control

    In this paper we analyze the simultaneous monitoring of p Poisson quality characteristics, developing a new multivariate control chart based on a linear combination of the Poisson variables: the LCP control chart. The optimization of the coefficients of this linear combination (and of the control limit) to minimize the out-of-control ARL is constrained by the desired in-control ARL. To facilitate the use of this new control chart, the optimization is carried out with user-friendly Windows software, which also compares the performance of this chart against other schemes based on monitoring a set of Poisson variables, namely a control chart on the sum of the variables (MP chart), a control chart on their maximum (MX chart), and an optimized set of univariate Poisson charts (Multiple scheme). The LCP control chart shows very good performance. First, the desired in-control ARL (ARL0) is matched exactly because the linear combination of Poisson variables is not constrained to integer values, an advantage over the other charts, which in general cannot match the required ARL0 value. Second, in the vast majority of cases this scheme signals process shifts faster than the other charts.
    This work has been supported by the Ministry of Education and Science of Spain (research project DPI2009-09925), the CNPq (the Brazilian Council for Scientific and Technological Development, projects 302326/2008-1 and 473706/2010-5), and SENESCYT (National Secretariat of Higher Education, Science, Technology and Innovation of Ecuador). The authors are grateful to the referees for their comments, which led to significant improvement of the paper.
    Kahn Epprecht, E.; Aparisi García, F. J.; García Bustos, S. L. (2013). Optimal linear combination of Poisson variables for multivariate statistical process control. Computers and Operations Research, 40(12), 3021-3032. https://doi.org/10.1016/j.cor.2013.07.007
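    A Monte-Carlo sketch of the LCP idea is given below: a linear combination of p Poisson counts is monitored against a single upper control limit, and the ARL is estimated by simulation. The coefficients, Poisson means, and control limit are illustrative placeholders, not the optimized values produced by the authors' software.

    import numpy as np

    def simulate_arl(lam, coeffs, ucl, n_runs=2000, max_len=10000, seed=0):
        """Estimate the average run length of the chart by simulation."""
        rng = np.random.default_rng(seed)
        run_lengths = []
        for _ in range(n_runs):
            for t in range(1, max_len + 1):
                x = rng.poisson(lam)      # one sample of the p counts
                if coeffs @ x > ucl:      # LCP statistic crosses the limit
                    run_lengths.append(t)
                    break
        return float(np.mean(run_lengths))

    lam_in = np.array([4.0, 2.0, 1.0])    # in-control Poisson means
    coeffs = np.array([0.7, 1.1, 1.6])    # non-integer weights let the chart
    ucl = 13.3                            # match a target ARL0 closely

    print("in-control ARL ~", simulate_arl(lam_in, coeffs, ucl))
    print("shifted ARL    ~", simulate_arl(1.5 * lam_in, coeffs, ucl))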

    Development of an experiment-based robust design paradigm for multiple quality characteristics using physical programming

    The well-known quality improvement methodology, robust design, is a powerful and cost-effective technique for building quality into the design of products and processes. Although several approaches to robust design have been proposed in the literature, little attention has been given to the development of a flexible robust design model. Specifically, flexibility is needed in order to consider multiple quality characteristics simultaneously, just as customers do when judging products, and to capture design preferences with a reasonable degree of accuracy. Physical programming, a relatively new optimization technique, is an effective tool for transforming design preferences into specific weighted objectives. In this paper, we extend the basic concept of physical programming to robust design by establishing the links of experimental design and response surface methodology to address designers' preferences in a multiresponse robust design paradigm. A numerical example illustrates the proposed procedure, and the results obtained are validated through a sensitivity study.
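    As a rough sketch of how physical programming turns preferences into an objective, the code below maps two responses through piecewise-linear "smaller is better" class functions and minimizes their sum. The preference boundaries, the responses, and the use of scipy's Nelder-Mead are illustrative assumptions, not the paper's formulation.

    import numpy as np
    from scipy.optimize import minimize

    def class_1s(g, ranges, prefs=(0.0, 1.0, 3.0, 7.0)):
        """Piecewise-linear 'smaller is better' class function: the penalty
        grows as g crosses the desirable -> tolerable -> undesirable
        boundaries, and keeps growing linearly beyond the last one."""
        if g >= ranges[-1]:
            slope = (prefs[-1] - prefs[-2]) / (ranges[-1] - ranges[-2])
            return prefs[-1] + slope * (g - ranges[-1])
        return float(np.interp(g, ranges, prefs))

    def responses(x):
        """Two quality characteristics of a design x: a mean response and a
        spread term standing in for response variability (robust design)."""
        mean_resp = (x[0] - 1.0) ** 2 + x[1] ** 2
        spread = 0.5 * abs(x[0]) + 0.1
        return mean_resp, spread

    def objective(x):
        mean_resp, spread = responses(x)
        return (class_1s(mean_resp, (0.0, 0.5, 1.0, 2.0))
                + class_1s(spread, (0.0, 0.2, 0.4, 0.8)))

    result = minimize(objective, x0=[1.5, 0.5], method="Nelder-Mead")
    print(result.x, responses(result.x))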

    An integrated methodology for the design of Ro-Ro passenger ships

    The present paper provides a brief introduction to the holistic approach to ship design, defines the generic ship design optimization problem, and demonstrates its solution by use of advanced optimization techniques.

    Applications of structural optimization methods to fixed-wing aircraft and spacecraft in the 1980s

    This report summarizes a technical survey of the applications of structural optimization in the U.S. aerospace industry through the 1980s. Since applications to rotary-wing aircraft are covered by other literature, only fixed-wing aircraft and spacecraft were considered. It became clear that very significant progress was made during this decade, indicating that this technology is about to become one of the practical tools in computer-aided structural design.