
    Simulation of non-linear bearing forces for post-stability investigation

    Different types of bearing designs have been developed to improve the dynamic properties of rotor-bearing systems. Elliptical bearings, multisleeve bearings, tilting-pad bearings and other designs such as herringbone-groove bearings have been used to increase resistance to the onset of self-excited vibrations. Because experimental trials are costly, two alternative methods are used to gain qualitative insight. The first creates a mathematical model and applies either a digital or an analog computer simulation. The second investigates the phenomena on a laboratory rig in which the bearing is replaced by an electronic simulating device working in a feedback loop, producing forces that are functions of journal displacement and velocity. The simulated hydrodynamic forces are generated according to assumed characteristics matched to the bearing type. The principal benefit of the analog simulation is that the nonlinear characteristics of a subsystem are precisely identified and that mathematical methods applicable to a wide class of problems can be checked on the experimental installation.
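
    The feedback-loop idea can be sketched numerically: at each time step a force is computed from the current journal displacement and velocity and fed back into the equation of motion. The stiffness, cubic nonlinearity and damping coefficients below are illustrative placeholders, not a real hydrodynamic bearing characteristic.

```python
def simulate_journal(x0, v0, dt=1e-3, steps=5000):
    """Toy feedback loop: the 'bearing' force F(x, v) is recomputed from
    displacement x and velocity v at every step, mimicking the electronic
    simulating device on the rig (unit journal mass assumed)."""
    x, v = x0, v0
    for _ in range(steps):
        force = -50.0 * x - 200.0 * x ** 3 - 2.0 * v  # F(x, v), made up
        x, v = x + dt * v, v + dt * force
    return x, v
```

    With positive damping the simulated journal motion decays toward rest; substituting a destabilising force characteristic would let the same loop be used to study the onset of self-excited vibration.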

    Experimental designs for multiple-level responses, with application to a large-scale educational intervention

    Educational research often studies subjects that are in naturally clustered groups of classrooms or schools. When designing a randomized experiment to evaluate an intervention directed at teachers, but with effects on teachers and their students, the power or anticipated variance for the treatment effect needs to be examined at both levels. If the treatment is applied to clusters, power is usually reduced. At the same time, a cluster design decreases the probability of contamination, and contamination can also reduce the power to detect a treatment effect. Designs that are optimal at one level may be inefficient for estimating the treatment effect at another level. In this paper we study the efficiency of three designs and their ability to detect a treatment effect: randomize schools to treatment, randomize teachers within schools to treatment, and completely randomize teachers to treatment. The three designs are compared at both the teacher and student level within the mixed-model framework, and a simulation study is conducted to compare expected treatment variances for the three designs under various levels of correlation within and between clusters. We present a computer program that study designers can use to explore the anticipated variances of treatment effects under proposed experimental designs and settings. Comment: Published at http://dx.doi.org/10.1214/08-AOAS216 in the Annals of Applied Statistics (http://www.imstat.org/aoas/) by the Institute of Mathematical Statistics (http://www.imstat.org)
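
    The variance penalty for randomizing clusters rather than individuals can be illustrated with the standard design-effect formula. This is a textbook sketch under a balanced two-arm design, not the paper's mixed-model program.

```python
def cluster_treatment_variance(sigma2, n_clusters, cluster_size, icc):
    """Anticipated variance of a difference-in-means treatment effect when
    whole clusters (e.g. schools) are randomized: the simple two-arm
    variance inflated by the design effect 1 + (m - 1) * rho, where m is
    the cluster size and rho the intra-cluster correlation."""
    n = n_clusters * cluster_size          # total subjects
    simple = 4.0 * sigma2 / n              # two equal arms, difference of means
    design_effect = 1.0 + (cluster_size - 1) * icc
    return simple * design_effect
```

    For example, with 20 schools of 10 teachers and an intra-cluster correlation of 0.2, the design effect is 2.8, so school-level randomization yields almost three times the variance of teacher-level randomization; this is the power cost the paper weighs against the reduced contamination of cluster designs.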

    Kriging metamodeling for simulation

    Many scientific disciplines use mathematical models to describe complicated real systems. Often, analytical methods are inadequate, so simulation is applied. This thesis focuses on computer-intensive simulation experiments in Operations Research/Management Science. For such experiments it is necessary to apply interpolation. In this thesis, Kriging interpolation for random simulation is proposed and a novel type of Kriging, called Detrended Kriging, is developed. Kriging turns out to give better predictions in random simulation than classic low-order polynomial regression. Kriging is not sensitive to variance heterogeneity; i.e., Kriging is a robust method. Moreover, the thesis develops a novel method to select experimental designs for expensive simulation. This method is sequential and accounts for the specific input/output function implied by the underlying simulation model. For deterministic simulation the designs are constructed through cross-validation and jackknifing, whereas for random simulation the customization is achieved through bootstrapping. The novel method simulates relatively more input combinations in the interesting parts of the input/output function, and gives better predictions than traditional Latin Hypercube Sampling designs with prespecified sample sizes.
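
    The interpolation property that distinguishes Kriging from low-order polynomial regression can be shown in a few lines. The Gaussian correlation function, the zero prior mean and the value of theta below are illustrative choices; this is plain simple kriging in one dimension, not the thesis's Detrended Kriging.

```python
import numpy as np

def kriging_predict(X, y, x_new, theta=10.0, nugget=1e-10):
    """Simple-kriging predictor r(x)' R^{-1} y with Gaussian correlation
    exp(-theta * d^2); the tiny nugget only stabilizes the linear solve,
    so the predictor interpolates the training data."""
    def corr(a, b):
        d = a[:, None] - b[None, :]
        return np.exp(-theta * d ** 2)
    R = corr(X, X) + nugget * np.eye(len(X))   # correlations among designs
    r = corr(X, np.atleast_1d(x_new))          # correlations to new point
    return float(np.linalg.solve(R, r).ravel() @ y)
```

    Unlike a fitted quadratic, this predictor returns the observed output exactly at every simulated input combination, which is the behaviour wanted for deterministic simulation output.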

    Human behaviour simulation using gaming software based on video observation analysis

    Computer simulation is one of the techniques applied by engineers and architects to evaluate building designs before real construction is undertaken. Computer simulation is also applied in crowd research to evaluate the safety of building designs for human evacuation during emergency situations. By using computer simulation, the best and worst-case scenarios during emergency evacuation can be predicted without using real humans by carrying out the simulation many times (Gwynne, et al., 1999). This provides many advantages over experimental methods (e.g. fire drills) when dealing with ethical issues and rarely occurring events. In addition, simulations can be applied to investigate the outcome of different evacuation strategies (Hsiung, et al., 2009) and to investigate emergent behaviour based on new theories or hypotheses (Pan, et al., 2006). In this research, computer simulation is applied to develop a prototype ‘toolkit’, a computer program that is able to model and simulate human movement and behaviour in crowded spaces. The research has its origin in the AUNT-SUE (Accessibility and User Needs in Transport - Sustainable Urban Environment) project, which emphasized the need to accommodate the largest possible range of humans with different abilities and aspirations (Marshall, et al., 2008), based on the philosophy of ‘inclusive design’, ‘design for all’ or ‘universal design’. A video observation method was used in this research to record ‘real’ human movement and behaviour in crowded spaces. Once analysed, the recorded video serves as input data for the human behaviour simulation. The simulation focuses on the microscopic scale, where each individual character within the crowd is considered. Additionally, heterogeneous characters, i.e. different types of humans such as older people, disabled people, and young able-bodied people, are also considered. However, the simulation only covers normal situations with no panic conditions. The gaming software DarkBASIC Professional was applied after the video observation analysis.

    Comparison of data-acquisition designs for single-shot edge-illumination X-ray phase-contrast tomography

    Edge-illumination X-ray phase-contrast tomography (EIXPCT) is an emerging technique that enables practical phase-contrast imaging with laboratory-based X-ray sources. A joint reconstruction method was proposed for reconstructing EIXPCT images, enabling novel flexible data-acquisition designs. However, only limited effort has been devoted to optimizing data-acquisition designs for use with the joint reconstruction method. In this study, several promising designs are introduced, such as the constant aperture position (CAP) strategy and the alternating aperture position (AAP) strategy covering different angular ranges. In computer-simulation studies, these designs are analyzed and compared. Experimental data are employed to test the designs in real-world applications. All candidate designs are also compared for their implementation complexity. The trade-off between data-acquisition time and image quality is discussed.
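
    Reading the two design names literally, the acquisition schedules can be sketched as follows; the 0/1 position labels and the two-position alternation are assumptions made for illustration, not the paper's exact parameterization.

```python
def aperture_schedule(n_views, strategy="CAP"):
    """CAP (constant aperture position): one mask position for every
    tomographic view. AAP (alternating aperture position): the position
    alternates view by view, so both illumination conditions are sampled
    across the angular range."""
    if strategy == "CAP":
        return [0] * n_views
    return [i % 2 for i in range(n_views)]  # AAP
```

    Both schedules take one exposure per view, which is what makes the single-shot comparison between them a trade-off in image quality rather than acquisition time.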

    Robust data analysis for factorial experimental designs: Improved methods and software

    Factorial experimental designs are a large family of experimental designs. Robust statistics has been a subject of considerable research in recent decades, so robust analysis of factorial designs is applicable to many real problems. Seheult and Tukey (2001) suggested a method of robust analysis of variance for a full factorial design without replication. Their method is generalised here to many other factorial designs, without the restriction of one observation in each cell. Furthermore, a new algorithm to decompose data from a factorial design is introduced and programmed in the statistical computer package R. The whole procedure of robust data analysis is also programmed in R, and it is intended that the library be submitted to CRAN, the repository of R software. In the procedure of robust data analysis, a cut-off value is needed to detect possible outliers. A set of optimum cut-off values for univariate data and some dimensions of two-way designs (complete and incomplete) is also provided, using an improved design of simulation study.
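
    The decomposition idea behind such robust analyses can be illustrated with Tukey's median polish of a two-way table, which sweeps out medians instead of means so that a single outlier inflates one residual rather than every effect estimate. This is a generic sketch of the underlying technique, not the paper's R library.

```python
import numpy as np

def median_polish(table, n_iter=10):
    """Decompose a two-way table as overall + row + column + residual by
    repeatedly sweeping out row and column medians; outlying cells are
    left exposed in the residuals."""
    resid = np.asarray(table, dtype=float).copy()
    rows = np.zeros(resid.shape[0])
    cols = np.zeros(resid.shape[1])
    for _ in range(n_iter):
        rmed = np.median(resid, axis=1)
        rows += rmed
        resid -= rmed[:, None]
        cmed = np.median(resid, axis=0)
        cols += cmed
        resid -= cmed[None, :]
    # move the medians of the effects into the overall term
    overall = np.median(rows) + np.median(cols)
    rows -= np.median(rows)
    cols -= np.median(cols)
    return overall, rows, cols, resid
```

    On a table such as [[1, 2, 3], [4, 5, 6], [7, 8, 100]] the additive fit reproduces every clean cell, and the entire distortion caused by the outlying 100 lands in that cell's own residual, which is exactly what a cut-off rule can then flag.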

    Steady-State Co-Kriging Models

    In deterministic computer experiments, a computer code can often be run at different levels of complexity/fidelity, yielding a hierarchy of levels of code. The higher the fidelity, and hence the computational cost, the more accurate the output data obtained. Methods based on the co-kriging methodology of Cressie (2015) for predicting the output of a high-fidelity computer code by combining data generated at varying levels of fidelity have become popular over the last two decades. For instance, Kennedy and O'Hagan (2000) first proposed building a metamodel for multi-level computer codes using an auto-regressive model structure. Forrester et al. (2007) provide details on estimation of the model parameters and further investigate the use of co-kriging for multi-fidelity optimization based on the efficient global optimization algorithm of Jones et al. (1998). Qian and Wu (2008) propose a Bayesian hierarchical modeling approach for combining low-accuracy and high-accuracy experiments. More recently, Gratiet and Cannamela (2015) propose sequential design strategies using fast cross-validation techniques for multi-fidelity computer codes.

    This research extends the co-kriging metamodeling methodology to steady-state simulation experiments. First, the mathematical structure of co-kriging is extended to take into account heterogeneous simulation output variances. Next, efficient steady-state simulation experimental designs are investigated for co-kriging to achieve high prediction accuracy in the estimation of steady-state parameters. Specifically, designs consisting of replicated longer simulation runs at a few design points and replicated shorter simulation runs at a larger set of design points are considered. Designs with no replicated runs at the long-simulation points are also studied, along with different methods for calculating the output variance in the absence of replicated outputs. The stochastic co-kriging (SCK) method is applied to an M/M/1 as well as an M/M/5 queueing system. In both examples, the prediction performance of the SCK model is promising, and the SCK method is shown to provide better response surfaces than the stochastic kriging (SK) method.
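
    The auto-regressive structure of Kennedy and O'Hagan links the fidelity levels as Z_high(x) = rho * Z_low(x) + delta(x). A toy version with a constant discrepancy fitted by least squares at the shared design points conveys the idea; the full co-kriging model instead treats delta as a Gaussian process and, in the stochastic extension above, also handles heterogeneous output variances.

```python
import numpy as np

def fit_autoregressive_link(y_low, y_high):
    """Fit y_high ≈ rho * y_low + delta at inputs where both the cheap
    and the expensive code were run; rho scales the low-fidelity output
    and the constant delta stands in for the GP discrepancy term of the
    real co-kriging model."""
    A = np.column_stack([y_low, np.ones_like(y_low)])
    (rho, delta), *_ = np.linalg.lstsq(A, y_high, rcond=None)
    return rho, delta
```

    Once rho and the discrepancy are estimated, cheap low-fidelity runs can be corrected toward the expensive code, which is what makes mixed designs of many short and few long runs attractive.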

    Analysis of opposed jet hydrogen-air counter flow diffusion flame

    A computational simulation of the opposed-jet diffusion flame is performed to study its structure and extinction limits. The present analysis concentrates on the nitrogen-diluted hydrogen-air diffusion flame, which provides basic information for many vehicle designs, such as the aerospace plane, for which hydrogen is a candidate fuel. The computer program uses the time-marching technique to solve the energy and species equations, coupled with the momentum equation solved by the collocation method. The procedure is implemented in two stages. In the first stage, a one-step forward overall chemical reaction is chosen, with the gas-phase chemical reaction rate determined by comparison with experimental data. In the second stage, a complete chemical reaction mechanism is introduced with detailed thermodynamic and transport property calculations. Comparison between experimental extinction data and theoretical predictions is discussed. The effects of thermal diffusion as well as of Lewis number and Prandtl number variations on the diffusion flame are also presented.
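
    The time-marching technique advances an unsteady problem until the transients decay and only the steady solution remains. As a minimal stand-in for the coupled energy and species equations, an explicit march of the 1-D diffusion equation with fixed boundary values settles to the linear steady profile.

```python
import numpy as np

def march_to_steady_state(n=11, r=0.4, steps=1000):
    """Explicit time-march of u_t = u_xx on a grid of n points, with
    u = 0 and u = 1 held fixed at the two ends; r = dt/dx^2 < 0.5 for
    stability. Marching long enough leaves only the steady solution."""
    u = np.zeros(n)
    u[-1] = 1.0  # fixed boundary values
    for _ in range(steps):
        u[1:-1] += r * (u[2:] - 2.0 * u[1:-1] + u[:-2])
    return u
```

    The same idea scales to the flame problem: the steady flame structure, and its disappearance at the extinction limit, are read off from where the march settles.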

    Systematic comparison of designs and emulators for computer experiments using a library of test functions

    As computational resources have become faster and more economical, scientific research has transitioned from using only physical experiments to using simulation-based exploration. A body of literature has since grown aimed at the design and analysis of so-called computer experiments. While this literature is large and active, little work has been focused on comparing methods. This project presents ways of comparing and evaluating both design and emulation methods for computer experiments. Using a suite of test functions (in this work we introduce the Virtual Library of Computer Experiments), a procedure is established that can provide guidance on how to proceed in simulation problems. An illustrative comparison is performed for each context, putting three emulators and then four experimental designs up against each other, while also highlighting possible considerations for test-function choice.

    Optimal design for correlated processes with input-dependent noise

    Optimal design for parameter estimation in Gaussian process regression models with input-dependent noise is examined. The motivation stems from the area of computer experiments, where computationally demanding simulators are approximated using Gaussian process emulators that act as statistical surrogates. In the case of stochastic simulators, which produce a random output for a given set of model inputs, repeated evaluations are useful, supporting the use of replicate observations in the experimental design. The findings are also applicable to the wider context of experimental design for Gaussian process regression and kriging. Designs are proposed with the aim of minimising the variance of the Gaussian process parameter estimates. A heteroscedastic Gaussian process model is presented which allows for an experimental design technique based on an extension of Fisher information to heteroscedastic models. It is empirically shown that the error of the approximation of the parameter variance by the inverse of the Fisher information is reduced as the number of replicated points is increased. Through a series of simulation experiments on both synthetic data and a systems biology stochastic simulator, optimal designs with replicate observations are shown to outperform space-filling designs both with and without replicate observations. Guidance is provided on best practice for optimal experimental design for stochastic response models.
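
    The value of replicate observations for noise-parameter estimation can be seen from a textbook identity: n independent Gaussian replicates at one input carry Fisher information n / (2 * sigma^4) about the variance there, so the asymptotic variance of the estimate shrinks like 1/n. This is the one-point special case, not the paper's full heteroscedastic-GP information matrix.

```python
def fisher_info_variance(n_reps, sigma2):
    """Fisher information about the variance parameter sigma^2 carried by
    n_reps i.i.d. N(mu, sigma2) replicates: n / (2 * sigma2**2). Its
    inverse lower-bounds the variance of any unbiased estimate, which is
    the quantity an optimal design tries to shrink."""
    return n_reps / (2.0 * sigma2 ** 2)
```

    Doubling the replicates at a design point doubles the information about the local noise level, which is why designs with replicate observations can beat space-filling designs when the noise is input-dependent.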