    A Combined Method in Parameters Optimization of Hydrocyclone

    To achieve efficient separation of calcium hydroxide and impurities in carbide slag using a hydrocyclone, three parameters were selected for optimization: the particle-size characteristics of the carbide slag, the slurry concentration, and the slurry inlet velocity. The optimization approach combines the Design of Experiments (DOE) method with the Computational Fluid Dynamics (CFD) method. Using Design Expert software, a central composite design (CCD) with three factors and five levels was constructed, giving 20 test runs with five measured responses each, and the experiments were performed with the numerical simulation software FLUENT. Through analysis of variance of the simulation results, regression equations were obtained for the pressure drop, overflow concentration, purity, and separation efficiencies of the two solid phases, and the influence of each factor on each response was analyzed. Finally, optimized settings were obtained by multiobjective optimization in Design Expert. Under the optimized conditions, validation was carried out separately by numerical simulation and by a separation experiment. The results show that the combined method can be used efficiently to study the hydrocyclone and that it performs well in engineering applications.
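
    A rotatable central composite design of the kind described above can be written down directly: for three factors it combines 8 factorial, 6 axial, and 6 centre points into 20 runs with five coded levels per factor. The Python sketch below is illustrative only; the factor ranges and their mapping to slurry concentration and inlet velocity come from Design Expert in the paper, not from this code.

```python
# Minimal sketch of a rotatable three-factor central composite design (CCD):
# 8 factorial + 6 axial + 6 center points = 20 runs, five levels per factor.
# Coded units only; mapping to physical factor ranges is done separately.
import itertools
import numpy as np

k = 3
alpha = (2 ** k) ** 0.25                    # rotatability: alpha = (2^k)^(1/4) ~ 1.682

factorial = np.array(list(itertools.product([-1.0, 1.0], repeat=k)))  # 8 corner runs
axial = np.vstack([alpha * np.eye(k), -alpha * np.eye(k)])            # 6 star points
center = np.zeros((6, k))                                             # 6 center runs

design = np.vstack([factorial, axial, center])
print(design.shape)                          # (20, 3)
print(np.unique(np.round(design[:, 0], 3)))  # five levels: -1.682, -1, 0, 1, 1.682
```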

    Data Driven Surrogate Based Optimization in the Problem Solving Environment WBCSim

    Large-scale, multidisciplinary engineering designs are difficult because of the complexity and dimensionality of these problems. Direct coupling between the analysis codes and the optimization routines can be prohibitively time consuming owing to the complexity of the underlying simulation codes. One way of tackling this problem is to construct computationally cheap(er) approximations of the expensive simulations that mimic the behavior of the simulation model as closely as possible. This paper presents a data-driven, surrogate-based optimization algorithm that uses a trust-region-based sequential approximate optimization (SAO) framework and a statistical sampling approach based on design of experiments (DOE) arrays. The algorithm is implemented using techniques from two packages, SURFPACK and SHEPPACK, that provide a collection of approximation algorithms to build the surrogates, and three different DOE techniques (full factorial (FF), Latin hypercube sampling (LHS), and central composite design (CCD)) are used to train the surrogates. The results are compared with the optimization results obtained by directly coupling an optimizer with the simulation code. The biggest concern in using the SAO framework based on statistical sampling is the generation of the required database: as the number of design variables grows, the computational cost of generating it grows rapidly. A data-driven approach is proposed to tackle this situation, where the trick is to run the expensive simulation if and only if a nearby data point does not exist in the cumulatively growing database. Over time the database matures and is enriched as more and more optimizations are performed. Results show that the proposed methodology dramatically reduces the total number of calls to the expensive simulation during the optimization process.
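
    The "run the expensive simulation only when no nearby point exists" idea lends itself to a very small sketch. The following Python fragment is an assumption-laden illustration: the distance metric, the tolerance, and the `expensive_sim` placeholder are not from the paper.

```python
# Illustrative sketch of the data-driven caching idea described above: the
# expensive simulation runs only when no stored sample lies within a tolerance
# of the requested design point. Names (expensive_sim, tol) are assumptions.
import numpy as np

database_x, database_y = [], []   # cumulatively growing archive of (x, f(x))

def evaluate(x, expensive_sim, tol=1e-2):
    x = np.asarray(x, dtype=float)
    if database_x:
        dists = np.linalg.norm(np.asarray(database_x) - x, axis=1)
        nearest = int(np.argmin(dists))
        if dists[nearest] <= tol:
            return database_y[nearest]     # reuse a nearby stored response
    y = expensive_sim(x)                   # otherwise call the expensive code
    database_x.append(x)
    database_y.append(y)
    return y
```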

    Development of a Degradation Model for the Collapse Analysis of Composite Aerospace Structures

    For stiffened structures in compression, the most critical damage mechanism leading to structural collapse is delamination or adhesive disbonding between the skin and stiffener. This paper presents the development of a numerical approach capable of simulating interlaminar crack growth in composite structures as a representation of this damage mechanism. A degradation methodology was proposed using shell layers connected at the nodes by user-defined multiple point constraints (MPCs), and then controlling the properties of these MPCs to simulate the initiation and propagation of delamination and disbonding. A fracture mechanics approach based on the Virtual Crack Closure Technique (VCCT) is used to detect growth at the delamination front. Numerical predictions using the degradation methodology were compared to experimental results for double cantilever beam (DCB) specimens to demonstrate the effectiveness of the current approach. Future development will focus on addressing the apparent conservatism of the VCCT approach, and on extending the application of the method to other specimen types and stiffened structures representative of composite fuselage designs. This work is part of the European Commission Project COCOMAT (Improved MATerial Exploitation at Safe Design of COmposite Airframe Structures by Accurate Simulation of COllapse), an ongoing four-year project that aims to exploit the large strength reserves of composite aerospace structures through more accurate prediction of collapse.
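
    For the mode-I case relevant to DCB specimens, the VCCT estimate of the energy release rate has a compact closed form, G_I = F·d / (2·b·Δa), where F is the crack-tip nodal force, d the relative opening displacement behind the tip, b the specimen width, and Δa the element length. A minimal sketch follows; the function names and numbers are illustrative, not the paper's implementation.

```python
# Minimal sketch of the mode-I Virtual Crack Closure Technique (VCCT) check
# used to decide whether the delamination front advances. DCB-style inputs
# and names below are illustrative assumptions.
def vcct_mode_i(nodal_force, opening_displacement, width, element_length):
    """Mode-I energy release rate: G_I = F * d / (2 * b * da)."""
    return nodal_force * opening_displacement / (2.0 * width * element_length)

def crack_advances(g_i, g_ic):
    """Release the MPC (crack grows) when G_I reaches the toughness G_Ic."""
    return g_i >= g_ic

# Example: F = 60 N, d = 0.2 mm, b = 25 mm, da = 0.5 mm
print(vcct_mode_i(60.0, 0.2, 25.0, 0.5))   # 0.48 N/mm = 480 J/m^2
```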

    The application of multi-objective robust design methods in ship design

    When designing large complex vessels, the evaluation of a particular design can be both complicated and time consuming. Designers often resort to concept design models, enabling a reduction in both complexity and evaluation time. Various optimisation methods are then typically used to explore the design space, facilitating the selection of optimum or near-optimum designs. It is now possible to incorporate considerations of seakeeping, stability and cost at the earliest stage of the ship design process. However, to ensure that reliable results are obtained, the models used are generally complex and computationally expensive. Methods have been developed which avoid the need for an exhaustive search of the complete design space. One such method is described here, based on the theory of Design of Experiments (DOE), which enables the design space to be explored efficiently. The objective of the DOE stage is to produce response surfaces which can then be used by an optimisation module to search the design space. It is assumed that the concept exploration tool, whilst a simplification of the design problem, is still sufficiently sophisticated to enable reliable evaluation of a particular design concept. The response surface is used as a representation of the concept exploration tool and, by its nature, can be used to evaluate a design concept rapidly, hence reducing concept exploration time. While the methodology has wide applicability in ship design and production, it is illustrated here by application to the design of a catamaran with respect to seakeeping. The paper presents results exploring the design space for the catamaran. A concept is selected which is robust with respect to the Relative Bow Motion (RBM), heave, pitch and roll at any particular wave heading. The design space is defined by six controllable design parameters (hull length, breadth-to-draught ratio, distance between demihull centres, coefficient of waterplane, longitudinal centre of flotation, and longitudinal centre of buoyancy) and by one noise parameter, the wave heading. A Pareto-optimal set of solutions is obtained using RBM, heave, pitch and roll as criteria; the designer can then select from this set the design which most closely satisfies their requirements. Typical solutions are shown to yield average reductions of over 25% in the objective functions when compared to earlier results obtained using conventional optimisation methods.
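
    Once the response surfaces are available, extracting the Pareto-optimal set reduces to non-dominated filtering over the sampled designs. A generic sketch in Python/NumPy follows; the random candidates stand in for response-surface evaluations of the four criteria (RBM, heave, pitch, roll), all minimised.

```python
# Illustrative Pareto filtering over candidate designs; all objectives are
# minimised. The random candidate matrix is an assumption for demonstration.
import numpy as np

def pareto_front(objectives):
    """Return a boolean mask selecting the non-dominated rows."""
    n = objectives.shape[0]
    keep = np.ones(n, dtype=bool)
    for i in range(n):
        # Row j dominates row i if it is no worse everywhere, better somewhere.
        dominates = (np.all(objectives <= objectives[i], axis=1)
                     & np.any(objectives < objectives[i], axis=1))
        if np.any(dominates):
            keep[i] = False
    return keep

candidates = np.random.rand(200, 4)        # 200 designs x (RBM, heave, pitch, roll)
front = candidates[pareto_front(candidates)]
print(front.shape)
```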

    Validation of Simulation, With and Without Real Data

    This paper gives a survey of how to validate simulation models through the application of mathematical statistics. The type of statistical test actually applied depends on the availability of data on the real system: (i) no data, (ii) only output data, and (iii) both input and output data. In case (i), the system analysts can still experiment with the simulation model to obtain simulated data; those experiments should be guided by the statistical theory of design of experiments (DOE), whereas an inferior, but popular, approach is to change only one factor at a time. In case (ii), real and simulated output data may be compared through the well-known Student t statistic. In case (iii), trace-driven simulation becomes possible. Validation, however, should then not proceed as follows: make a scatter plot of real and simulated outputs, fit a line, and test whether that line has unit slope and passes through the origin. Instead, better tests are presented. Several case studies are summarized to illustrate the three types of situations.
    Keywords: verification; credibility; assessment; sensitivity; robustness; regression
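
    Case (ii), comparing real and simulated output data through the Student t statistic, is straightforward to illustrate. Below is a minimal sketch with placeholder data; SciPy's two-sample test is used here as one standard implementation, and the numbers are not from the paper's case studies.

```python
# Minimal sketch of case (ii): compare real and simulated outputs with a
# two-sample Student t test (equal-variance form). Data are placeholders.
import numpy as np
from scipy import stats

real = np.array([12.1, 11.8, 12.6, 12.0, 11.5])
simulated = np.array([12.4, 12.2, 11.9, 12.8, 12.1])

t_stat, p_value = stats.ttest_ind(real, simulated)
print(f"t = {t_stat:.3f}, p = {p_value:.3f}")   # small p: model output differs
```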

    Quasi-Static Folding and Deployment of Ultrathin Composite Tape-Spring Hinges

    Deployable structures made from ultrathin composite materials can be folded elastically and are able to self-deploy by releasing the stored strain energy. This paper presents a detailed study of the folding and deployment of a tape-spring hinge made from a two-ply plain-weave laminate of carbon-fiber reinforced plastic. A particular version of this hinge was constructed, and its moment-rotation profile during quasi-static deployment was measured. The present study is the first to incorporate in the simulation an experimentally validated elastic micromechanical model and to provide quantitative comparisons between the simulations and the measured behavior of an actual hinge. Folding and deployment simulations of the tape-spring hinge were carried out with the commercial finite element package Abaqus/Explicit, starting from the as-built unstrained structure. The folding simulation includes the effects of pinching the hinge in the middle to reduce the peak moment required to fold it. The deployment simulation fully captures both the steady-state moment part of the deployment and the final snap back to the deployed configuration. An alternative simulation without pinching the hinge provides an estimate of the maximum moment that could be carried by the hinge during operation. This is about double the snap-back moment.

    Experimental Design for Sensitivity Analysis, Optimization and Validation of Simulation Models

    This chapter gives a survey on the use of statistical designs for what-if analysis in simulation, including sensitivity analysis, optimization, and validation/verification. Sensitivity analysis is divided into two phases. The first phase is a pilot stage, which consists of screening or searching for the important factors among (say) hundreds of potentially important factors. A novel screening technique is presented, namely sequential bifurcation. The second phase uses regression analysis to approximate the input/output transformation that is implied by the simulation model; the resulting regression model is also known as a metamodel or a response surface. Regression analysis gives better results when the simulation experiment is well designed, using either classical statistical designs (such as fractional factorials) or optimal designs (such as pioneered by Fedorov, Kiefer, and Wolfowitz). To optimize the simulated system, the analysts may apply Response Surface Methodology (RSM); RSM combines regression analysis, statistical designs, and steepest-ascent hill-climbing. To validate a simulation model, again regression analysis and statistical designs may be applied. Several numerical examples and case studies illustrate how statistical techniques can reduce the ad hoc character of simulation; that is, these statistical techniques can make simulation studies give more general results, in less time. Appendix 1 summarizes confidence intervals for expected values, proportions, and quantiles, in terminating and steady-state simulations. Appendix 2 gives details on four variance reduction techniques, namely common pseudorandom numbers, antithetic numbers, control variates or regression sampling, and importance sampling. Appendix 3 describes jackknifing, which may give robust confidence intervals.
    Keywords: least squares; distribution-free; non-parametric; stopping rule; run-length; Von Neumann; median; seed; likelihood ratio
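
    The group-screening idea behind sequential bifurcation can be sketched in a few lines: a group of factors whose aggregate effect is negligible is discarded wholesale; otherwise the group is bisected and each half re-tested. The Python sketch below is a deliberately simplified illustration; the full technique additionally exploits known effect signs and an (approximately) additive metamodel, and the names, levels, and threshold here are assumptions.

```python
# Simplified sketch of sequential bifurcation: discard a whole group of
# factors when its aggregate effect is negligible, otherwise split it in two.
def sequential_bifurcation(simulate, factors, lo=0.0, hi=1.0, eps=1e-6):
    def effect(group):
        base = simulate({f: lo for f in factors})
        bumped = simulate({f: hi if f in group else lo for f in factors})
        return bumped - base

    important, queue = [], [list(factors)]
    while queue:
        group = queue.pop()
        if abs(effect(group)) <= eps:
            continue                        # whole group negligible: discard
        if len(group) == 1:
            important.append(group[0])      # isolated an important factor
        else:
            mid = len(group) // 2
            queue += [group[:mid], group[mid:]]
    return sorted(important)

# Toy usage: only factors "x2" and "x5" matter in this assumed additive response.
demo = lambda x: 3.0 * x["x2"] + 1.5 * x["x5"]
print(sequential_bifurcation(demo, [f"x{i}" for i in range(1, 9)]))  # ['x2', 'x5']
```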

    Failure analysis of CFRP laminates subjected to Compression After Impact: FE simulation using discrete interface elements

    This paper presents a model for the numerical simulation of impact damage, permanent indentation, and compression after impact (CAI) in CFRP laminates. The same model is used to simulate the damage that develops during both low-velocity/low-energy impact tests and CAI tests. The elementary damage types of impact and CAI are taken into account, i.e. matrix cracking, fiber failure, and interface delamination. Experimental tests and model results are compared, and this comparison is used to highlight the laminate failure scenario during residual compression tests. Finally, the effect of impact energy on residual strength is evaluated and compared to experimental results.
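
    The elementary ply-level damage checks named above are commonly expressed as simple stress-based criteria; the maximum-stress and Hashin-like forms sketched below are generic illustrations of such checks, not the specific criteria used in this paper.

```python
# Generic ply-level failure checks of the kinds listed above. These simple
# forms (maximum stress for fibers, a Hashin-like quadratic interaction for
# matrix cracking) are illustrative assumptions, not the paper's criteria.
def fiber_failure(sigma_11, x_t, x_c):
    """Fiber fails in tension beyond X_t or in compression beyond X_c."""
    return sigma_11 >= x_t or sigma_11 <= -x_c

def matrix_cracking(sigma_22, tau_12, y_t, s_12):
    """Transverse-tension / in-plane-shear interaction reaching 1 means cracking."""
    return (max(sigma_22, 0.0) / y_t) ** 2 + (tau_12 / s_12) ** 2 >= 1.0

# Example with assumed strengths in MPa:
print(fiber_failure(2400.0, x_t=2200.0, x_c=1200.0))          # True
print(matrix_cracking(40.0, 60.0, y_t=60.0, s_12=90.0))       # False (0.89 < 1)
```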

    Modelling multi-tier enterprise applications behaviour with design of experiments technique

    Queueing network models are commonly used for performance modelling. However, during the application development stage, analytical models may not be able to continuously reflect performance, for example because of performance bugs or minor changes in the application code that cannot readily be reflected in the queueing model. To cope with this problem, a measurement-based approach adopting the Design of Experiments (DoE) technique is proposed. The applicability of the proposed method is demonstrated on a complex 3-tier e-commerce application that is difficult to model with queueing networks.
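
    A minimal version of such a measurement-based DoE study is a two-level full factorial over a handful of application tunables, from which main effects on the measured response time are estimated. Everything in the sketch below (the factor names, the `measure` callable) is a hypothetical placeholder for the real 3-tier deployment.

```python
# Illustrative two-level full factorial over assumed application tunables,
# estimating each factor's main effect on measured mean response time.
import itertools

factors = ["db_pool_size", "app_threads", "cache_mb"]   # hypothetical tunables

def main_effects(measure):
    runs = []
    for levels in itertools.product([-1, +1], repeat=len(factors)):
        config = dict(zip(factors, levels))     # coded low/high settings
        runs.append((levels, measure(config)))  # measured mean response time
    effects = {}
    for i, name in enumerate(factors):
        hi = [y for lv, y in runs if lv[i] == +1]
        lo = [y for lv, y in runs if lv[i] == -1]
        effects[name] = sum(hi) / len(hi) - sum(lo) / len(lo)
    return effects

# Usage: pass a callable that deploys the configuration and measures latency.
# print(main_effects(run_load_test))   # run_load_test is a placeholder
```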

    Impact of Embedded Carbon Fiber Heating Panel on the Structural/Mechanical Performance of Roadway Pavement

    INE/AUTC 12.3