104 research outputs found

    Concepts for a theoretical and experimental study of lifting rotor random loads and vibrations, Phase 2

    A comparison with NASA-conducted simulator studies has shown that the approximate digital method for computing rotor blade flapping responses to random inputs, tentatively suggested in the Phase I report, gives the wrong trend with increasing rotor advance ratio. Consequently, three alternative methods of solution have been considered and are described: (1) an approximate method based on the functional relation between input and output double-frequency spectra, (2) a numerical method based on the system responses to deterministic inputs, and (3) a perturbation approach. Among these, the perturbation method requires the least amount of computation and has been developed in two forms: the first to obtain the response correlation function and the second for the time-averaged spectra of flapping oscillations.
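    The time-averaged statistics mentioned above can be illustrated with a minimal simulation sketch (not the report's actual method): a first-order flapping-type equation whose damping coefficient varies periodically, standing in for the periodic-coefficient dynamics at nonzero advance ratio, is driven by white noise, and the response correlation function is estimated by time averaging. All parameter values are arbitrary illustrations.

```python
import math
import random

random.seed(0)

# Toy flapping-type equation with a periodic damping coefficient and
# white-noise forcing:  dbeta/dt = -a(t) * beta + w(t).
dt, n_steps = 0.01, 100_000
omega = 2 * math.pi                  # "rotor revolution" frequency (arbitrary)
beta, samples = 0.0, []
for i in range(n_steps):
    t = i * dt
    a = 2.0 + 0.5 * math.cos(omega * t)             # periodic coefficient
    w = random.gauss(0.0, 1.0) * math.sqrt(1.0 / dt)  # discretised white noise
    beta += dt * (-a * beta + w)                    # Euler-Maruyama step
    samples.append(beta)

# Time-averaged (biased) correlation estimate R(k*dt) = <beta(t) beta(t+k*dt)>
N = len(samples)
def corr(k):
    return sum(samples[i] * samples[i + k] for i in range(N - k)) / N

R = [corr(k) for k in range(0, 200, 20)]   # lags 0.0, 0.2, ..., 1.8
print(R[0], R[-1])                         # variance, then a decayed lag
```

    The biased estimator is used deliberately: it yields a positive semidefinite correlation sequence, so the lag-zero value always dominates.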

    Probabilistic constraint reasoning with Monte Carlo integration to robot localization

    This work studies the combination of safe and probabilistic reasoning through the hybridization of Monte Carlo integration techniques with continuous constraint programming. In continuous constraint programming there are variables ranging over continuous domains (represented as intervals) together with constraints over them (relations between variables), and the goal is to find values for those variables that satisfy all the constraints (consistent scenarios). Constraint programming “branch-and-prune” algorithms produce safe enclosures of all consistent scenarios. Specially proposed algorithms for probabilistic constraint reasoning compute the probability of sets of consistent scenarios, which implies the calculation of an integral over these sets (quadrature). In this work we propose to extend the “branch-and-prune” algorithms with Monte Carlo integration techniques to compute such probabilities. This approach can be useful in robotics for localization problems. Traditional approaches are based on probabilistic techniques that search for the most likely scenario, which may not satisfy the model constraints. We show how to apply our approach to cope with this problem and provide functionality in real time.
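    As a rough sketch of the idea (with a toy constraint, not the paper's algorithm), the following combines interval-based branch-and-prune with Monte Carlo sampling of the undecided boundary boxes to estimate the probability of the consistent region x² + y² ≤ 1 under a uniform prior on [0,1]²; the true answer is π/4.

```python
import math
import random

random.seed(1)

# Interval evaluation of x^2 + y^2 over a box [xl,xu] x [yl,yu]
# (all-nonnegative domain, so the bounds are monotone).
def interval_eval(xl, xu, yl, yu):
    return xl * xl + yl * yl, xu * xu + yu * yu

def branch_and_prune(xl, xu, yl, yu, depth, inner, boundary):
    lo, hi = interval_eval(xl, xu, yl, yu)
    if hi <= 1.0:                       # box fully consistent
        inner.append((xu - xl) * (yu - yl))
    elif lo >= 1.0:                     # box fully inconsistent: prune
        return
    elif depth == 0:                    # undecided leaf box
        boundary.append((xl, xu, yl, yu))
    elif xu - xl >= yu - yl:            # branch on the wider dimension
        xm = 0.5 * (xl + xu)
        branch_and_prune(xl, xm, yl, yu, depth - 1, inner, boundary)
        branch_and_prune(xm, xu, yl, yu, depth - 1, inner, boundary)
    else:
        ym = 0.5 * (yl + yu)
        branch_and_prune(xl, xu, yl, ym, depth - 1, inner, boundary)
        branch_and_prune(xl, xu, ym, yu, depth - 1, inner, boundary)

inner, boundary = [], []
branch_and_prune(0.0, 1.0, 0.0, 1.0, 10, inner, boundary)

# Probability = summed volume of fully consistent boxes + Monte Carlo
# estimate of the consistent fraction of each undecided boundary box.
prob = sum(inner)
for (xl, xu, yl, yu) in boundary:
    hits = sum(1 for _ in range(200)
               if random.uniform(xl, xu) ** 2 + random.uniform(yl, yu) ** 2 <= 1.0)
    prob += (hits / 200) * (xu - xl) * (yu - yl)

print(prob, math.pi / 4)
```

    Deeper branching shrinks the boundary boxes, so the safely enclosed mass grows and the Monte Carlo work concentrates on an ever-thinner uncertain region.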

    Numerical solution of linear integral equations with random forcing terms, 1987

    In Chapter One of this report we define Fredholm integral equations of the second kind and Volterra integral equations of the second kind, differentiate between the two, and explain why integral equations are important. In Chapter Two we discuss numerical procedures for integral equations. The equations used in this report are of two types: (1) Fredholm equations and (2) Volterra equations. The methods used for Fredholm equations are: (i) Simpson's rule, (ii) the trapezoidal rule, (iii) Weddle's rule, (iv) the collocation method, and (v) the Galerkin method. For Volterra equations we used the successive approximation method with (i) Simpson's rule, (ii) the trapezoidal rule, and (iii) Weddle's rule to evaluate the integrals. In both the Fredholm and Volterra integral equations the forcing term is random. Our simulation results are presented in tables and graphs.
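    One of the Fredholm procedures can be sketched as a Nyström-type discretisation with the trapezoidal rule (the report's own implementation may differ). The sketch below solves a second-kind equation with a known solution; a random forcing term would simply replace the deterministic f with sampled realisations.

```python
# Nystrom method with the trapezoidal rule for
#   u(x) = f(x) + integral_0^1 K(x,t) u(t) dt,
# with K(x,t) = x*t and f(x) = 2x/3, whose exact solution is u(x) = x.
n = 21
h = 1.0 / (n - 1)
xs = [i * h for i in range(n)]
w = [h] * n
w[0] = w[-1] = h / 2                  # trapezoidal weights

K = lambda x, t: x * t
f = lambda x: 2.0 * x / 3.0

# Assemble (I - W K) u = f and solve by Gaussian elimination.
A = [[(1.0 if i == j else 0.0) - w[j] * K(xs[i], xs[j]) for j in range(n)]
     for i in range(n)]
b = [f(x) for x in xs]

for col in range(n):                  # forward elimination (no pivoting needed:
    piv = A[col][col]                 # the matrix is a small perturbation of I)
    for row in range(col + 1, n):
        m = A[row][col] / piv
        for j in range(col, n):
            A[row][j] -= m * A[col][j]
        b[row] -= m * b[col]
u = [0.0] * n
for i in range(n - 1, -1, -1):        # back substitution
    u[i] = (b[i] - sum(A[i][j] * u[j] for j in range(i + 1, n))) / A[i][i]

err = max(abs(u[i] - xs[i]) for i in range(n))
print("max error vs exact solution u(x) = x:", err)
```

    The discretisation error is O(h²), consistent with the trapezoidal rule underlying the quadrature.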

    Monte Carlo integration.

    by Sze Tsz-leung. Thesis (M.Phil.), Chinese University of Hong Kong, 1993. Includes bibliographical references (leaves 91).
    Chapter 1: Introduction
        1.1 Basic concepts of Monte Carlo integration
            1.1.1 Importance sampling
            1.1.2 Control variate
            1.1.3 Antithetic variate
            1.1.4 Stratified sampling
            1.1.5 Biased estimator
        1.2 Some special methods in Monte Carlo integration
            1.2.1 Haber's modified Monte Carlo quadrature I
            1.2.2 Haber's modified Monte Carlo quadrature II
            1.2.3 Weighted Monte Carlo integration
            1.2.4 Adaptive importance sampling
    Chapter 2: New methods
        2.1 The use of Newton-Cotes quadrature formulae in stage one
            2.1.1 Using the one-dimensional trapezoidal rule
            2.1.2 Using the two-dimensional or higher-dimensional product trapezoidal rule
            2.1.3 Extension to higher-order one-dimensional Newton-Cotes formulae
        2.2 The use of the Gauss quadrature rule in stage one
        2.3 Some variations of the new methods
            2.3.1 Using probability points in both stages
            2.3.2 Importance sampling
                2.3.2.1 Triangular distribution
                2.3.2.2 Beta distribution
    Chapter 3: Examples
        3.1 Example one: using the trapezoidal rule as basic rule
            3.1.1 One-dimensional case
            3.1.2 Two-dimensional case
        3.2 Example two: using Simpson's 3/8 rule as basic rule
        3.3 Example three: using the Gauss rule as basic rule
    Chapter 4: Conclusion and discussions
    References
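    The variance-reduction techniques listed in Chapter 1 are standard; for instance, antithetic variates (Section 1.1.3) pair each uniform draw u with 1 - u. The sketch below (not taken from the thesis) compares plain Monte Carlo and antithetic estimates of the integral of eˣ over [0,1], whose exact value is e - 1.

```python
import math
import random

random.seed(2)

n = 10_000
g = math.exp

# Plain Monte Carlo with 2n evaluations of g.
plain = [g(random.random()) for _ in range(2 * n)]
est_plain = sum(plain) / len(plain)

# Antithetic variates: n pairs (u, 1-u), also 2n evaluations of g.
pairs = []
for _ in range(n):
    u = random.random()
    pairs.append(0.5 * (g(u) + g(1.0 - u)))
est_anti = sum(pairs) / n

def var(xs):
    m = sum(xs) / len(xs)
    return sum((x - m) ** 2 for x in xs) / (len(xs) - 1)

# Same total work; the antithetic estimator has a much smaller variance
# because e^x is monotone, so g(u) and g(1-u) are negatively correlated.
print(est_plain, est_anti)
print(var(plain) / (2 * n), var(pairs) / n)
```

    For monotone integrands the negative correlation within each pair cancels most of the sampling noise, which is why the antithetic estimator's variance is dramatically lower here.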

    Nonlinear probabilistic finite element models of laminated composite shells

    A probabilistic finite element analysis procedure for laminated composite shells has been developed. A total Lagrangian finite element formulation, employing a degenerated 3-D laminated composite shell with the full Green-Lagrange strains and first-order shear-deformable kinematics, forms the modeling foundation. The first-order second-moment technique for probabilistic finite element analysis of random fields is employed, and results are presented in the form of the mean and variance of the structural response. The effects of material nonlinearity are included through a rate-independent anisotropic plasticity formulation from the macroscopic point of view. Both ply-level and micromechanics-level random variables can be selected, the latter by means of the Aboudi micromechanics model. A number of sample problems are solved to verify the accuracy of the procedures developed and to quantify the variability of certain material type/structure combinations. Comparisons with experimental data are made in many cases, and the Monte Carlo simulation method is used to check the probabilistic results. In general, the procedure is quite effective in modeling the mean and variance of the linear and nonlinear response of laminated composite shells.
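    The first-order second-moment technique propagates input means and variances through a linearisation of the response about the mean. A minimal single-variable sketch (a toy cantilever tip deflection with a random elastic modulus, not the paper's shell model; all values hypothetical) checked against Monte Carlo, as the paper does for its own results:

```python
import math
import random

random.seed(3)

# Toy response: cantilever tip deflection delta = P*L^3 / (3*E*I),
# with random elastic modulus E (mean mu_E, std sd_E, 5% CoV).
P, L, I = 1000.0, 2.0, 1.0e-5
mu_E, sd_E = 200.0e9, 10.0e9

g = lambda E: P * L ** 3 / (3.0 * E * I)

# FOSM: mean(g) ~ g(mu),  var(g) ~ (dg/dE)^2 * var(E)
h = mu_E * 1e-6
dg = (g(mu_E + h) - g(mu_E - h)) / (2.0 * h)   # central-difference derivative
mean_fosm = g(mu_E)
sd_fosm = abs(dg) * sd_E

# Monte Carlo check of the second-moment approximation.
samples = [g(random.gauss(mu_E, sd_E)) for _ in range(50_000)]
mean_mc = sum(samples) / len(samples)
sd_mc = math.sqrt(sum((s - mean_mc) ** 2 for s in samples) / (len(samples) - 1))

print(mean_fosm, sd_fosm)
print(mean_mc, sd_mc)
```

    At small input coefficients of variation the linearisation is accurate; the gap between the FOSM and Monte Carlo moments grows with input variability and response nonlinearity.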

    The development and application of a normative framework for considering uncertainty and variability in economic evaluation

    The focus of this thesis is the development and application of a normative framework for handling both variability and uncertainty in making decisions using economic evaluation. The framework builds on recent work which takes an intuitive Bayesian approach to handling uncertainty, and adds a similar approach for the handling of variability. The technique of stratified cost-effectiveness analysis is introduced as an innovative, intuitive and theoretically sound basis for the consideration of variability with respect to cost-effectiveness. The technique requires the identification of patient strata such that there are differences between strata but individual strata are relatively homogeneous. For handling uncertainty, the normative framework requires a twofold approach. First, the cost-effectiveness of therapies within each patient stratum must be assessed using probabilistic analysis. Secondly, techniques for estimation of the expected value of perfect information (EVPI) should be applied to determine an efficient research plan for the disease of interest. For the latter, a new technique for estimating EVPI based on quadrature is described which is both accurate and allows simpler calculation of the expected value of sample information. In addition, the unit normal loss integral method, previously ignored as a method of estimating the expected value of partial perfect information (EVPPI), is shown to be appropriate in specific circumstances. The normative framework is applied to decisions relating to the public funding of the treatment of osteoporosis in the province of Ontario. The optimal limited-use criteria would be to fund treatment with alendronate for women aged 75 years and over with a previous fracture and 77 years and over with no previous fracture. An efficient research plan would fund a randomised controlled trial comparing etidronate to no therapy with a sample size of 640. Certain other research studies are of lesser value.
    Subsequent to the analysis contained in this thesis, the province of Ontario revised their limited-use criteria to be broadly in line with the conclusions of this analysis. Thus, the application of the framework to this area demonstrates both its feasibility and acceptability. The normative framework developed in this thesis provides an optimal solution for decision makers in terms of handling uncertainty and variability in economic evaluation. Further research refining methods for estimating information value and considering other forms of uncertainty within models will enhance the framework. (EThOS - Electronic Theses Online Service; University of Ottawa : Ottawa Health Research Institute; United Kingdom)
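    The unit normal loss integral method mentioned above has a simple closed form when the incremental net benefit (INB) of one therapy over another is normally distributed: per-patient EVPI = σ·L(|μ|/σ), where L(z) = φ(z) − z(1 − Φ(z)). The sketch below (with hypothetical numbers, not the thesis's osteoporosis data) checks this closed form against a Monte Carlo estimate.

```python
import math
import random

random.seed(4)

# Standard normal density and CDF from the math module.
phi = lambda z: math.exp(-0.5 * z * z) / math.sqrt(2.0 * math.pi)
Phi = lambda z: 0.5 * (1.0 + math.erf(z / math.sqrt(2.0)))

def evpi_unli(mu, sigma):
    """Per-patient EVPI via the unit normal loss integral."""
    z = abs(mu) / sigma
    return sigma * (phi(z) - z * (1.0 - Phi(z)))

mu, sigma = 500.0, 2000.0       # hypothetical INB distribution (currency units)
closed = evpi_unli(mu, sigma)

# Monte Carlo check: with mu > 0 the apparently best option is adopted, so
# EVPI = E[ max(0, -INB) ], the expected loss when that choice is wrong.
n = 200_000
mc = sum(max(0.0, -random.gauss(mu, sigma)) for _ in range(n)) / n

print(closed, mc)
```

    The closed form makes EVPI essentially free to evaluate inside a larger research-planning loop, which is the appeal of loss-integral methods over brute-force nested simulation.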

    Probabilistic fracture mechanics by boundary element method

    In this work, a new boundary element method is presented for probabilistic fracture mechanics analysis. The method developed allows the probabilistic analysis of cracked structures to be accomplished by the dual boundary element method (DBEM), in which the traction integral equation is used on one of the crack faces as opposed to the usual displacement integral equation. The stress intensity factors and their first-order derivatives are evaluated for mode-I and mixed-mode fracture problems. A new boundary element formulation is derived and implemented to evaluate the design variable sensitivities. This method involves the solution of matrix systems formed by the direct differentiation of the discretised dual boundary element equations with respect to each random parameter. The derivatives of fracture parameters with respect to design variables are calculated using the implicit differentiation method (IDM) in the DBEM for mode-I and mixed-mode fracture problems. The gradient of the performance function is determined analytically, and the total derivative method (TDM) is used in probabilistic fatigue crack growth problems. Randomness in the geometry, material properties and the applied stress is considered in 2-D fracture problems, while the initial crack size, final crack size, material properties and applied stress are considered in fatigue crack growth. Uncertainties in other aspects of the problem can be included. The First-Order Reliability Method (FORM) is used for predicting the reliability of cracked structures. The Hasofer-Lind-Rackwitz-Fiessler algorithm is used to find the most probable point, whose distance from the origin defines the reliability index. Finally, the validation and applications of the stochastic boundary element method coupled with FORM are presented. Numerical calculations are shown to be in good agreement with either the analytical solution or Monte Carlo simulation.
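    The Hasofer-Lind-Rackwitz-Fiessler iteration at the heart of FORM can be sketched in standard normal space on a classic parabolic limit state with known reliability index β = 2.5 (a textbook illustration, not the paper's fracture problem):

```python
import math

# Limit state in standard normal space; failure when g(u) < 0:
#   g(u) = 2.5 - (u1 + u2)/sqrt(2) + 0.1*(u1 - u2)^2.
def g(u):
    return 2.5 - (u[0] + u[1]) / math.sqrt(2.0) + 0.1 * (u[0] - u[1]) ** 2

def grad_g(u):
    c = -1.0 / math.sqrt(2.0)
    d = 0.2 * (u[0] - u[1])
    return [c + d, c - d]

# HLRF update: u_{k+1} = [(grad.u_k - g(u_k)) / |grad|^2] * grad
u = [0.0, 0.0]                        # start at the origin
for _ in range(50):
    gv, gr = g(u), grad_g(u)
    norm2 = gr[0] ** 2 + gr[1] ** 2
    scale = (gr[0] * u[0] + gr[1] * u[1] - gv) / norm2
    u = [scale * gr[0], scale * gr[1]]

beta = math.hypot(u[0], u[1])         # reliability index = distance to the MPP
pf = 0.5 * (1.0 - math.erf(beta / math.sqrt(2.0)))   # FORM estimate Phi(-beta)
print("beta:", beta, "Pf:", pf)
```

    In a fracture application, g would wrap the (boundary element) structural response, and the gradient would come from the analytic sensitivities the paper derives rather than from closed-form expressions as here.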

    Techniques to accelerate boundary element contributions in elasticity

    The problem of rapid re-analysis of small problems in elasticity is investigated. The aim is to enable updated stress contours to be displayed in real time as a design geometry is dynamically modified. The focus of this work is small to medium-sized problems; as a result it cannot be assumed that the solution phase dominates, and so the evaluation of boundary integrals is considered as well as the equation solution. Two strategies are employed for the acceleration of boundary element integrals: the use of look-up tables (LUTs) containing precomputed integrals, and the use of approximate analytical expressions derived from surface fits. These may be used in the matrix assembly and internal point calculations. LUTs are derived for both flat and circular-arc elements for both the displacement and stress boundary integral equations. Details are provided on suitable LUT refinements, and the approach is benchmarked against conventional Gauss-Legendre quadrature. The surface fit approach is presented as an alternative to LUTs that does not incur their considerable memory cost; this approach has been limited to flat elements. The equation solution is cast in a re-solution framework, in which we use a GMRES iterative solver. Convergence is greatly accelerated by using an approximate but complete LU preconditioner updated periodically using multi-threading. The choice of update period is investigated with reference to the spread of eigenvalues in the preconditioned system. The resulting system achieves the aim of providing real-time update of contours for small to medium-sized problems on a PC. This development is expected to allow a qualitative change in the way engineers might use computer-aided engineering tools, in which design ideas may be assessed immediately as a change is made.
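    The look-up table strategy can be sketched with a simplified scalar kernel (a 2-D Laplace-type logarithmic integral over a unit element, rather than the elasticity kernels used in the work): element integrals are precomputed on a grid of source distances once, and each assembly-time evaluation becomes a cheap interpolation instead of a fresh quadrature.

```python
import math

# Influence of a unit element [0,1] at a source point p > 1 on its axis:
#   I(p) = integral_0^1 ln(p - t) dt = p*ln(p) - (p-1)*ln(p-1) - 1.
# (The closed form here stands in for an expensive quadrature.)
def exact(p):
    return p * math.log(p) - (p - 1.0) * math.log(p - 1.0) - 1.0

# Precompute the LUT once over the range of source distances of interest...
p_min, p_max, m = 1.1, 5.0, 400
hp = (p_max - p_min) / (m - 1)
lut = [exact(p_min + k * hp) for k in range(m)]

# ...then assembly-time lookups are linear interpolations into the table.
def lookup(p):
    s = (p - p_min) / hp
    k = min(int(s), m - 2)
    frac = s - k
    return (1.0 - frac) * lut[k] + frac * lut[k + 1]

# Worst interpolation error over a fine sweep of source distances.
worst = max(abs(lookup(1.1 + 0.001 * i) - exact(1.1 + 0.001 * i))
            for i in range(3801))
print("worst interpolation error:", worst)
```

    The trade-off is exactly the one the work studies: table refinement (and hence memory) against interpolation accuracy, with the error largest near the element where the kernel varies fastest.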

    Colloquium numerical treatment of integral equations
