
    Polynomial Response Surface Approximations for the Multidisciplinary Design Optimization of a High Speed Civil Transport

    Surrogate functions have become an important tool in multidisciplinary design optimization to deal with noisy functions, high computational cost, and the practical difficulty of integrating legacy disciplinary computer codes. A combination of mathematical, statistical, and engineering techniques, well known in other contexts, has made polynomial surrogate functions viable for MDO. Despite the obvious limitations imposed by sparse high fidelity data in high dimensions and the locality of low order polynomial approximations, the success of the panoply of techniques based on polynomial response surface approximations for MDO shows that the implementation details are more important than the underlying approximation method (polynomial, spline, DACE, kernel regression, etc.). This paper surveys some of the ancillary techniques—statistics, global search, parallel computing, variable complexity modeling—that augment the construction and use of polynomial surrogates.
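    The core idea behind a polynomial response surface can be sketched in a few lines: sample the expensive analysis at a handful of points, fit a low-order polynomial by least squares, and optimize the cheap fit instead. This is a minimal illustration only, not the paper's implementation; it assumes NumPy and a full quadratic basis.

    ```python
    import numpy as np

    def _basis(X):
        """Design matrix for a full quadratic model: 1, x_i, x_i*x_j (i <= j)."""
        n, d = X.shape
        cols = [np.ones(n)]
        cols += [X[:, i] for i in range(d)]
        cols += [X[:, i] * X[:, j] for i in range(d) for j in range(i, d)]
        return np.column_stack(cols)

    def fit_quadratic_surrogate(X, y):
        """Ordinary least-squares fit; returns a callable surrogate model."""
        coeffs, *_ = np.linalg.lstsq(_basis(X), y, rcond=None)
        return lambda Xnew: _basis(np.atleast_2d(np.asarray(Xnew, float))) @ coeffs

    # Recover a known quadratic from noise-free samples of a toy "analysis code".
    rng = np.random.default_rng(0)
    X = rng.uniform(-1, 1, size=(30, 2))
    y = 1.0 + 2.0 * X[:, 0] - X[:, 1] + 0.5 * X[:, 0] * X[:, 1] + 3.0 * X[:, 1] ** 2
    surrogate = fit_quadratic_surrogate(X, y)
    print(surrogate([[0.2, -0.3]]))   # close to the true value 1.94
    ```

    With noisy or high-dimensional data the fit is no longer exact, which is where the statistical techniques surveyed in the paper come in.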

    Data Driven Surrogate Based Optimization in the Problem Solving Environment WBCSim

    Large-scale, multidisciplinary engineering designs are always difficult due to the complexity and dimensionality of these problems. Direct coupling between the analysis codes and the optimization routines can be prohibitively time consuming due to the complexity of the underlying simulation codes. One way of tackling this problem is by constructing computationally cheap(er) approximations of the expensive simulations that mimic the behavior of the simulation model as closely as possible. This paper presents a data driven, surrogate based optimization algorithm that uses a trust region based sequential approximate optimization (SAO) framework and a statistical sampling approach based on design of experiment (DOE) arrays. The algorithm is implemented using techniques from two packages, SURFPACK and SHEPPACK, which provide a collection of approximation algorithms to build the surrogates; three different DOE techniques (full factorial (FF), Latin hypercube sampling (LHS), and central composite design (CCD)) are used to train the surrogates. The results are compared with the optimization results obtained by directly coupling an optimizer with the simulation code. The biggest concern in using the SAO framework based on statistical sampling is the generation of the required database. As the number of design variables grows, the computational cost of generating the required database grows rapidly. A data driven approach is proposed to tackle this situation, where the trick is to run the expensive simulation if and only if a nearby data point does not exist in the cumulatively growing database. Over time the database matures and is enriched as more and more optimizations are performed. Results show that the proposed methodology dramatically reduces the total number of calls to the expensive simulation during the optimization process.
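    The "run the expensive simulation if and only if a nearby data point does not exist" trick can be sketched as a distance-thresholded cache. This is a minimal illustration of the idea, not the paper's code; the class and tolerance names are hypothetical.

    ```python
    import math

    class SimulationCache:
        """Cumulatively growing database of simulation runs. The expensive
        code is invoked only when no stored point lies within `tol` of the
        query (Euclidean distance in the design-variable space)."""

        def __init__(self, simulate, tol=1e-2):
            self.simulate = simulate   # the expensive analysis code
            self.tol = tol
            self.points = []           # (x, y) pairs, persists across optimizations
            self.calls = 0             # count of true simulation runs

        def evaluate(self, x):
            for xp, yp in self.points:
                if math.dist(x, xp) <= self.tol:
                    return yp          # reuse the nearby stored result
            y = self.simulate(x)       # no neighbor: run the real simulation
            self.calls += 1
            self.points.append((tuple(x), y))
            return y

    # Toy "expensive" simulation: a simple quadratic.
    cache = SimulationCache(lambda x: x[0] ** 2 + x[1] ** 2, tol=0.05)
    cache.evaluate((1.0, 1.0))
    cache.evaluate((1.01, 1.0))   # within tol: served from the database
    print(cache.calls)            # -> 1
    ```

    In practice the returned neighbor value introduces approximation error of order `tol`, so the tolerance trades accuracy against the number of simulation calls.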

    Exploiting correlation in the construction of D-optimal response surface designs.

    Cost considerations and difficulties in performing completely randomized experiments often dictate the necessity to run response surface experiments in a bi-randomization format. The resulting compound symmetric error structure not only affects estimation and inference procedures but also has severe consequences for the optimality of the designs used. For this reason, it should be taken into account explicitly when constructing the design. In this paper, an exchange algorithm for constructing D-optimal bi-randomization designs is developed and the resulting designs are analyzed. Finally, the concept of bi-randomization experiments is refined, yielding very efficient designs which, in many cases, outperform D-optimal completely randomized experiments.
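    The basic point-exchange idea behind such algorithms is easy to sketch: swap design points for candidate points whenever the swap increases det(X'X). The sketch below assumes i.i.d. errors; the paper's algorithm would additionally weight the information matrix by the inverse of the compound-symmetric error covariance. All names are illustrative, not the paper's code.

    ```python
    import numpy as np
    from itertools import product

    def d_optimal_exchange(candidates, model, n_runs, seed=0):
        """Greedy point exchange for an exact D-optimal design: repeatedly
        replace a design point with a candidate point whenever the swap
        increases det(X'X)."""
        rng = np.random.default_rng(seed)
        F = np.array([model(c) for c in candidates])   # candidate model rows
        idx = list(rng.choice(len(candidates), n_runs, replace=False))

        def logdet(ix):
            X = F[ix]
            sign, ld = np.linalg.slogdet(X.T @ X)
            return ld if sign > 0 else -np.inf

        best, improved = logdet(idx), True
        while improved:
            improved = False
            for pos in range(n_runs):
                for c in range(len(candidates)):
                    trial = idx.copy()
                    trial[pos] = c
                    if logdet(trial) > best + 1e-9:
                        idx, best, improved = trial, logdet(trial), True
        return [candidates[i] for i in idx], best

    # Main-effects-plus-interaction model on a 3x3 candidate grid, 4 runs;
    # a few random restarts guard against local optima of the greedy search.
    cands = list(product([-1, 0, 1], repeat=2))
    model = lambda x: [1.0, x[0], x[1], x[0] * x[1]]
    design, _ = max((d_optimal_exchange(cands, model, 4, seed=s) for s in range(5)),
                    key=lambda r: r[1])
    print(sorted(design))   # typically the four 2^2 factorial corners
    ```

    For this tiny problem the exchange recovers the 2^2 factorial, the known D-optimal four-run design for a first-order-plus-interaction model.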

    Outperforming completely randomized designs.

    Bi-randomization designs have become increasingly popular in industry because some of the factors under investigation are often hard-to-change. It is well known that the resulting compound symmetric error structure not only affects estimation and inference procedures but also the efficiency of the experimental designs used. In this paper, the use of bi-randomization designs is shown to outperform completely randomized designs in terms of D-efficiency. This result suggests that bi-randomization designs should be considered as an alternative to completely randomized designs even if all experimental factors are easy-to-change.

    Design Issues for Generalized Linear Models: A Review

    Generalized linear models (GLMs) have been used quite effectively in the modeling of a mean response under nonstandard conditions, where discrete as well as continuous data distributions can be accommodated. The choice of design for a GLM is a very important task in the development and building of an adequate model. However, one major problem that handicaps the construction of a GLM design is its dependence on the unknown parameters of the fitted model. Several approaches have been proposed in the past 25 years to solve this problem. These approaches, however, have provided only partial solutions that apply in only some special cases, and the problem, in general, remains largely unresolved. The purpose of this article is to focus attention on the aforementioned dependence problem. We provide a survey of various existing techniques dealing with the dependence problem. This survey includes discussions concerning locally optimal designs, sequential designs, Bayesian designs and the quantile dispersion graph approach for comparing designs for GLMs. Comment: Published at http://dx.doi.org/10.1214/088342306000000105 in Statistical Science (http://www.imstat.org/sts/) by the Institute of Mathematical Statistics (http://www.imstat.org).
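    The dependence problem is concrete in the simplest case: for a one-variable logistic model, the Fisher information of a design depends on the unknown coefficients through the GLM weight p(1-p), so the same design can be efficient under one parameter guess and poor under another. A minimal NumPy illustration (not from the article):

    ```python
    import numpy as np

    def logistic_information(design_x, beta):
        """Fisher information matrix for the logistic model
        logit p(x) = b0 + b1*x at the given design points. Its dependence
        on beta is exactly the GLM design-dependence problem."""
        M = np.zeros((2, 2))
        for x in design_x:
            f = np.array([1.0, x])
            p = 1.0 / (1.0 + np.exp(-(beta[0] + beta[1] * x)))
            M += p * (1 - p) * np.outer(f, f)   # GLM weight w(x) = p(1 - p)
        return M

    # The same three-point design scored under two parameter guesses:
    design = [-1.0, 0.0, 1.0]
    dets = [np.linalg.det(logistic_information(design, beta))
            for beta in [(0.0, 1.0), (2.0, 3.0)]]
    print(dets)   # clearly different D-criterion values
    ```

    A locally optimal design fixes a best guess for beta before optimizing this criterion; sequential and Bayesian approaches instead update or average over the unknown parameters.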

    Highly Abrasion-resistant and Long-lasting Concrete

    Studded tire usage in Alaska contributes to rutting damage on pavements, resulting in high maintenance costs and safety issues. In this study, binary, ternary, and quaternary highly abrasion-resistant concrete mix designs, using supplementary cementitious materials (SCMs), were developed. The fresh, mechanical, and durability properties of these mix designs were then tested to determine an optimum highly abrasion-resistant concrete mix that could be placed in cold climates to reduce rutting damage. SCMs used included silica fume, ground granulated blast furnace slag, and type F fly ash. Tests conducted measured workability, air content, drying shrinkage, compressive strength, flexural strength, and chloride ion permeability. Resistance to freeze-thaw cycles, scaling due to deicers, and abrasion resistance were also measured. A survey and literature review on concrete pavement practices in Alaska and other cold climates was also conducted. A preliminary construction cost analysis comparing the concrete mix designs developed was also completed.

    Deep Underground Science and Engineering Laboratory - Preliminary Design Report

    The DUSEL Project has produced the Preliminary Design of the Deep Underground Science and Engineering Laboratory (DUSEL) at the rehabilitated former Homestake mine in South Dakota. The Facility design calls for, on the surface, two new buildings - one a visitor and education center, the other an experiment assembly hall - and multiple repurposed existing buildings. To support underground research activities, the design includes two laboratory modules and additional spaces at a level 4,850 feet underground for physics, biology, engineering, and Earth science experiments. On the same level, the design includes a Department of Energy-shepherded Large Cavity supporting the Long Baseline Neutrino Experiment. At the 7,400-foot level, the design incorporates one laboratory module and additional spaces for physics and Earth science efforts. With input from some 25 science and engineering collaborations, the Project has designed for critical experimental space and infrastructure needs, including space for a suite of multidisciplinary experiments in a laboratory whose projected life span is at least 30 years. From these experiments, a critical suite of experiments is outlined, whose construction will be funded along with the facility. The Facility design permits expansion and evolution, as may be driven by future science requirements, and enables participation by other agencies. The design leverages South Dakota's substantial investment in facility infrastructure, risk retirement, and operation of its Sanford Laboratory at Homestake. The Project is planning education and outreach programs, and has initiated efforts to establish regional partnerships with underserved populations - regional American Indian and rural populations.

    Design of experiments for non-manufacturing processes : benefits, challenges and some examples

    Design of Experiments (DoE) is a powerful technique for process optimization that has been widely deployed in almost all types of manufacturing processes and is used extensively in product and process design and development. There have not been as many efforts to apply powerful quality improvement techniques such as DoE to improve non-manufacturing processes. Factor levels often involve changing the way people work and so have to be handled carefully. It is even more important to get everyone working as a team. This paper explores the benefits and challenges in the application of DoE in non-manufacturing contexts. The viewpoints regarding the benefits and challenges of DoE in the non-manufacturing arena are gathered from a number of leading academics and practitioners in the field. The paper also makes an attempt to dispel the myth that DoE is applicable only to manufacturing industries; rather, it is equally applicable to non-manufacturing processes within manufacturing companies. The last part of the paper illustrates some case examples showing the power of the technique in non-manufacturing environments.
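    A DoE run plan for a non-manufacturing process is generated exactly as for a manufacturing one. The sketch below builds a full 2^3 factorial for a hypothetical invoice-processing experiment; the factors and levels are invented for illustration and do not come from the paper.

    ```python
    from itertools import product

    # Hypothetical service-process experiment: three two-level factors.
    factors = {
        "batching":  ["daily", "hourly"],
        "form":      ["paper", "online"],
        "approvals": ["one", "two"],
    }

    # Full 2^3 factorial: every combination of factor levels is one run.
    runs = [dict(zip(factors, levels)) for levels in product(*factors.values())]
    for i, run in enumerate(runs, 1):
        print(i, run)
    print(len(runs))   # -> 8 runs
    ```

    The "handle people-factors carefully" caveat in the abstract shows up here as the choice of levels: each run changes how staff work, so randomizing and resetting runs is harder than on a machine.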

    Optimisation of the shear forming process by means of multivariate statistical methods

    Shear forming is a versatile process for manufacturing complex lightweight components which are required in increasing numbers by many different industries. Inherent advantages of the process are simple tooling, low tool costs, good external and internal surface quality, close dimensional accuracy, and good mechanical properties of the components. In a free market economy, it is necessary on the one hand to fulfill the increasing demands on the quality characteristics and on the other hand to reduce the development time needed to manufacture such a high quality component. Since shear forming is a complex and sensitive process in terms of deformation characteristics, this is not an easy task. To assess the overall quality of a component, several mutually contradictory quality characteristics have to be considered simultaneously. While conventionally each characteristic is considered separately, in this paper a statistical approach is presented which copes with the above mentioned demands and provides the opportunity for an efficient, multivariate optimisation of the process. With a minimum of statistically planned experiments, mathematical models are derived which describe the influence of the machine parameters and their interactions on quantitative as well as qualitative component characteristics. A multivariate optimisation procedure based on the concept of desirabilities is used to find the best compromise between the mutually contradictory quality characteristics. With this statistical approach, a workpiece for the electrical industry is manufactured which requires a very good surface quality and close geometrical tolerances. Keywords: shear forming, experimental design, multivariate optimisation, high voltage divider
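    Desirability-based optimisation combines each quality characteristic's individual desirability (a score in [0, 1]) into one overall figure, usually by a geometric mean, so that any unacceptable response vetoes the whole compromise. A minimal sketch of the Derringer-Suich larger-is-better transform (illustrative only; the responses named here are invented, not the paper's data):

    ```python
    def desirability_larger_is_better(y, lo, hi, s=1.0):
        """Individual desirability d in [0, 1] for a response to be maximised:
        0 below lo, 1 above hi, a power ramp with exponent s in between."""
        if y <= lo:
            return 0.0
        if y >= hi:
            return 1.0
        return ((y - lo) / (hi - lo)) ** s

    def overall_desirability(ds):
        """Geometric mean of individual desirabilities; a single unacceptable
        response (d = 0) makes the whole compromise unacceptable."""
        prod = 1.0
        for d in ds:
            prod *= d
        return prod ** (1.0 / len(ds))

    # Hypothetical responses: surface-quality and dimensional-accuracy scores.
    d1 = desirability_larger_is_better(0.8, lo=0.0, hi=1.0)  # 0.8
    d2 = desirability_larger_is_better(0.5, lo=0.0, hi=1.0)  # 0.5
    print(overall_desirability([d1, d2]))  # sqrt(0.4), about 0.632
    ```

    The machine parameters would then be chosen to maximise this overall desirability over the fitted response models, which is the best-compromise search the abstract describes.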