20 research outputs found

    Multiscale numerical methods for the simulation of diffusion processes in random heterogeneous media with guaranteed accuracy

    Get PDF
    The possibility of combining several constituents to obtain properties that cannot be obtained with any of them alone explains the growing proliferation of composites in mechanical structures. However, the modelling of such heterogeneous systems poses extreme challenges to computational mechanics. Their direct simulation gives rise to computational models that are extremely expensive, if not impossible, to solve. Through homogenisation, the excessive computational burden is eliminated by separating the two scales (the scale of the constituents and the scale of the structure). Nonetheless, the hypotheses under which homogenisation applies are usually violated, and traditional homogenisation schemes provide no means to quantify the resulting error. The first contribution of this thesis is the development of a method to quantify the homogenisation error. In this method, the heterogeneous medium is represented by a stochastic partial differential equation where each realisation corresponds to a particle layout. This representation allows us to derive guaranteed error estimates at a low computational cost. The effectivity (ratio between the true error and the estimate) is characterised, and a relation is established between the error estimates and classical results in micromechanics. Moreover, a strategy to reduce the homogenisation error is presented. The second contribution of this thesis is the development of a numerical method with guaranteed error bounds that directly approximates the solution of heterogeneous models by using shape functions that incorporate information from the microscale. The construction of those shape functions resembles the methods of computational homogenisation, where microscale boundary value problems are solved to obtain homogenised properties.
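    A minimal sketch of the scale separation described above: in 1D, the homogenised diffusion coefficient of a rapidly oscillating two-phase medium is the harmonic mean of the phase coefficients, so the coarse (homogenised) solution closely tracks the fine-scale one. The phase contrast and cell count below are hypothetical, not taken from the thesis.

```python
import numpy as np

def solve_fd(a_cells, f=1.0):
    """Solve -(a(x) u')' = f on (0,1), u(0)=u(1)=0, by finite differences.
    a_cells: diffusion coefficient on each cell between consecutive nodes."""
    n = len(a_cells)
    h = 1.0 / n
    # tridiagonal system for the n-1 interior nodes
    A = np.zeros((n - 1, n - 1))
    for i in range(n - 1):
        A[i, i] = a_cells[i] + a_cells[i + 1]
        if i > 0:
            A[i, i - 1] = -a_cells[i]
        if i < n - 2:
            A[i, i + 1] = -a_cells[i + 1]
    b = np.full(n - 1, f * h * h)
    return np.linalg.solve(A, b)

# fine-scale model: coefficient alternating between two phases (toy values)
n = 400
a_fine = np.where(np.arange(n) % 2 == 0, 1.0, 10.0)
u_fine = solve_fd(a_fine)

# homogenised model: in 1D the effective coefficient is the harmonic mean
a_star = 1.0 / np.mean(1.0 / a_fine)
u_hom = solve_fd(np.full(n, a_star))

# homogenisation error at the midpoint is small for fine oscillations
print(abs(u_fine[n // 2 - 1] - u_hom[n // 2 - 1]))
```

The gap between `u_fine` and `u_hom` is the homogenisation error the thesis proposes to bound; it shrinks as the microstructural period decreases.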

    Guaranteed error bounds in homogenisation: an optimum stochastic approach to preserve the numerical separation of scales

    Get PDF
    This paper proposes a new methodology to guarantee the accuracy of the homogenisation schemes that are traditionally employed to approximate the solution of PDEs with random, fast-evolving diffusion coefficients. More precisely, in the context of linear elliptic diffusion problems in randomly packed particulate composites, we develop an approach to strictly bound the error in the expectation and second moment of quantities of interest, without ever solving the fine-scale, intractable stochastic problem. The most attractive feature of our approach is that the error bounds are computed without any integration of the fine-scale features. Our computations are purely macroscopic, deterministic and remain tractable even for small scale ratios. The second contribution of the paper is an alternative derivation of modelling error bounds through the Prager–Synge hypercircle theorem. We show that this approach allows us to fully characterise and optimally tighten the interval in which predicted quantities of interest are guaranteed to lie. We interpret our optimum result as an extension of Reuss–Voigt approaches, which are classically used to estimate the homogenised diffusion coefficients of composites, to the estimation of macroscopic engineering quantities of interest. Finally, we make use of these derivations to obtain an efficient procedure for multiscale model verification and adaptation.
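    The Reuss–Voigt approach mentioned above can be sketched for a two-phase composite: the arithmetic (Voigt) and harmonic (Reuss) volume averages bracket the homogenised diffusion coefficient. The phase coefficients and volume fraction below are hypothetical.

```python
# Voigt (arithmetic mean) and Reuss (harmonic mean) bounds on the
# homogenised diffusion coefficient of a two-phase composite.
# Hypothetical phase data: matrix k1, particles k2, particle fraction phi.
k1, k2, phi = 1.0, 10.0, 0.3

k_voigt = (1 - phi) * k1 + phi * k2          # upper bound
k_reuss = 1.0 / ((1 - phi) / k1 + phi / k2)  # lower bound

assert k_reuss <= k_voigt
print(f"homogenised coefficient lies in [{k_reuss:.3f}, {k_voigt:.3f}]")
```

The paper's contribution is to extend this kind of bracketing from effective coefficients to macroscopic engineering quantities of interest.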

    Efficient modeling of random heterogeneous materials with a uniform probability density function (slides)

    Get PDF
    Homogenised constitutive laws are widely used to predict the behaviour of composite structures. Assessing the validity of such homogenised models can be done by making use of the concept of “modelling error”. First, a microscopic “faithful” (and potentially intractable) model of the structure is defined. Then, one tries to quantify the effect of the homogenisation procedure on a result that would be obtained by directly using the “faithful” model. Such an approach requires (a) the “faithful” model to be more representative of the physical phenomena of interest than the homogenised model and (b) a reliable approximation of the result obtained using the “faithful” and intractable model to be available at low cost. We focus here on point (b), and more precisely on the extension of the techniques developed in [2, 3] to estimate the error due to the homogenisation of linear, spatially random composite materials. In particular, we approximate the unknown probability density function by bounding its first moment. In this paper, we present this idea in more detail, discussing the numerical efficiency and computational cost of the error estimation. The fact that the probability density function is uniform is exploited to greatly reduce the computational cost. We also show some first attempts to correct the homogenised model using non-conforming, weakly intrusive microscopic patches.
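    The exploitation of a uniform probability density function can be sketched on a toy problem: for -k u'' = 1 on (0,1) with homogeneous Dirichlet conditions, the midpoint quantity of interest is Q = 1/(8k), and when k is uniformly distributed its first moment has a closed form that a Monte Carlo estimate should reproduce. All bounds and sample counts below are assumed for illustration.

```python
import numpy as np

rng = np.random.default_rng(0)
a, b = 1.0, 2.0     # hypothetical bounds of the uniform coefficient
n = 100_000

# QoI: midpoint value of -k u'' = 1 on (0,1), u(0)=u(1)=0  =>  Q = 1/(8k)
k = rng.uniform(a, b, n)
q_mc = np.mean(1.0 / (8.0 * k))

# closed-form first moment, available here because the PDF is uniform
q_exact = np.log(b / a) / (8.0 * (b - a))
print(q_mc, q_exact)
```

In the paper's setting the fine-scale problem is intractable, so the first moment is bounded rather than sampled; the uniform PDF is what keeps those bounds cheap to evaluate.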

    Multi-scale methods for fracture: model learning across scales, digital twinning and factors of safety

    Get PDF
    Authors: S. P. A. Bordas, L. A. A. Beex, P. Kerfriden, D. A. Paladim, O. Goury, A. Akbari, H. Rappel. Fracture and material instabilities originate at spatial scales much smaller than that of the structure of interest: delamination, debonding, fibre breakage and cell-wall buckling are examples of nano-, micro- or meso-scale mechanisms which can lead to global failure of the material and structure. Such mechanisms cannot, for computational and practical reasons, be accounted for at the structural scale, so that acceleration methods are necessary. We review in this presentation recently proposed approaches to reduce the computational expense associated with multi-scale modelling of fracture. In light of two particular examples, we show connections between algebraic reduction (model order reduction and quasi-continuum methods) and homogenisation-based reduction. We open the discussion towards suitable approaches for machine-learning and Bayesian-statistics-based multi-scale model selection. Such approaches could fuel a digital-twin concept enabling models to learn from real-time data acquired during the life of the structure, accounting for “real” environmental conditions during predictions, and, eventually, moving beyond the “factors of safety” era.

    Implementation of extended finite element method in the commercial library diffpack

    Full text link
    Alves Paladim, D. (2012). Implementation of extended finite element method in the commercial library diffpack. http://hdl.handle.net/10251/28930

    Etnogeografia: reflections on school education and the spatial and territorial dynamics of the Xakriabá people in the north of Minas Gerais

    No full text
    This thesis reports research in Human Geography at the University of São Paulo, carried out through involvement with the school education of the Xakriabá people in the north of Minas Gerais. This people lives in two Indigenous Lands located between the municipalities of Itacarambi and São João das Missões. The research theme was the importance of Indigenous school education for the maintenance, transformation and conquest of territory. To that end, I sought to understand the relations between the school units and the transformations of the territory in which the people live. Understanding the processes of spatialisation and territorialisation of this ethnic group helped to explain how it resists globalisation through the strength of the place where it lives. The study was conducted through participant observation, interviews, audiovisual workshops and dialogues. The reflection was grounded in the Indigenous Question as it relates to concepts from the teaching of Agrarian Geography and the theoretical and methodological controversies surrounding the theme, outlining the possibilities and limits in understanding the actions of the Indigenous school movement, understood as a socio-territorial movement.

    Untitled in English

    No full text
    We study the hydraulic forces and moments acting on the rotor of a large spherical valve used to protect the generating unit in hydroelectric power plants. To analyse the physical phenomenon, we use concepts of Computational Fluid Dynamics and apply the Finite Volume Method to a three-dimensional simulation domain representing the equipment under study. Steady-state simulations were carried out with the rotor in discrete positions, simulating the closing of the spherical valve. The fluid, water, was considered incompressible and the flow fully developed and turbulent. The k-epsilon RNG turbulence model with a standard wall function, together with the SIMPLEC scheme for the pressure-velocity coupling, was adopted to solve the three-dimensional problem in a segregated manner. The boundary conditions were a prescribed velocity at the inlet of the computational domain and a locally parabolic condition at the outlet; a pressure value was applied at the inlet only as a reference. We investigate the magnitude and behaviour of the forces and moments acting on the rotor, as well as the flow pattern inside the valve during the closing operation. The variation of coefficients such as drag, lift, flow and cavitation coefficients is shown, and the other turbulent flow quantities obtained from the simulation are also examined.
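    As a small illustration of how such integrated loads are typically non-dimensionalised into the coefficients mentioned above, the sketch below computes a drag coefficient from an integrated rotor force. All numerical values are assumed for illustration; none are taken from the study.

```python
# Non-dimensionalising an integrated hydraulic load as a drag coefficient,
# c_d = F / (0.5 * rho * U^2 * A). All values below are assumed.
rho = 998.0      # kg/m^3, density of water
u_in = 4.0       # m/s, prescribed inlet velocity
area = 2.5       # m^2, rotor reference area
f_drag = 5.2e4   # N, force integrated over the rotor surface

c_d = f_drag / (0.5 * rho * u_in**2 * area)
print(f"drag coefficient: {c_d:.3f}")
```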

    An adaptive scheme for homogenised domains

    Get PDF
    In this paper, we extend the concept of modelling error estimation to the homogenisation of elliptic PDEs. In order to do so, we fully acknowledge that the rapid spatial variation of the microscopic diffusion coefficients cannot be known exactly. Therefore, we represent the microscopic diffusion coefficients as a random field. In this context, the accuracy of surrogate models, such as homogenisation schemes, can be quantified by estimating the error in the first moments of the probability density function of a quantity of interest. We propose a way to bound the error in the first two moments, following and extending the seminal work of [1]. Our derivations rely on the Constitutive Relation Error (CRE) [2], which states that a certain distance between the solutions delivered by the primal and dual surrogates of the original stochastic problem is equal to some measure of the exact and unaffordable errors. We further assume that these surrogates are deterministic, consistently with the theory of homogenisation. Minimising the CRE in this subset of homogenisation schemes leads us to an optimal surrogate that is closely related to the classical Voigt and Reuss models. This result is used in a goal-oriented setting to establish upper and lower bounds for the first two moments of the quantity of interest. We show that the method respects the numerical separation of scales, and is therefore affordable and easy to implement, and that it produces useful results as long as the mismatch between the diffusion coefficients of the microstructure remains small. We propose extensions for the case of high mismatch, by allowing the surrogate solutions to fluctuate in the stochastic domain.
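    The CRE idea can be illustrated on a 1D toy problem: combining a surrogate primal field with any statically admissible dual flux yields a guaranteed upper bound on the energy-norm error (the Prager–Synge hypercircle). The exact and surrogate coefficients below are hypothetical stand-ins for the fine-scale and homogenised models.

```python
import numpy as np

# Constitutive Relation Error sketch in 1D, for -(a u')' = 1 on (0,1),
# u(0) = u(1) = 0, with constant (toy) coefficients a and a0.
a, a0 = 2.0, 1.5
x = np.linspace(0.0, 1.0, 2001)

def integrate(f):
    """Composite trapezoidal rule on the grid x."""
    return float(np.sum(0.5 * (f[1:] + f[:-1]) * np.diff(x)))

du_exact = (1.0 - 2.0 * x) / (2.0 * a)    # exact gradient u'(x)
du_h     = (1.0 - 2.0 * x) / (2.0 * a0)   # surrogate primal gradient
p_h      = 0.4 - x                         # statically admissible flux: p' = -f

# energy-norm error of the primal surrogate, and the CRE bound
err2 = integrate(a * (du_exact - du_h) ** 2)
cre2 = integrate((p_h - a * du_h) ** 2 / a)

print(err2, cre2)   # CRE^2 is a guaranteed upper bound on the error
assert err2 <= cre2
```

Note that any dual field with p' = -f works; the paper's optimisation over surrogates amounts to tightening this kind of bound.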

    Efficient modeling of random heterogeneous materials with a uniform probability density function

    Get PDF
    Homogenised constitutive laws are widely used to predict the behaviour of composite structures. Assessing the validity of such homogenised models can be done by making use of the concept of “modelling error”. First, a microscopic “faithful” (and potentially intractable) model of the structure is defined. Then, one tries to quantify the effect of the homogenisation procedure on a result that would be obtained by directly using the “faithful” model. Such an approach requires (a) the “faithful” model to be more representative of the physical phenomena of interest than the homogenised model and (b) a reliable approximation of the result obtained using the “faithful” and intractable model to be available at low cost. We focus here on point (b), and more precisely on the extension of the techniques developed in [2, 3] to estimate the error due to the homogenisation of linear, spatially random composite materials. In particular, we approximate the unknown probability density function by bounding its first moment. In this paper, we present this idea in more detail, discussing the numerical efficiency and computational cost of the error estimation. The fact that the probability density function is uniform is exploited to greatly reduce the computational cost. We also show some first attempts to correct the homogenised model using non-conforming, weakly intrusive microscopic patches.

    Implementation of a XFEM toolbox in Diffpack

    Get PDF
    The Diffpack Development Framework is an object-oriented software environment for the numerical solution of partial differential equations (PDEs). By its design, Diffpack intends to close the gap between black-box simulation packages and technical computing environments that use interpreted computer languages. The framework provides a high degree of modelling flexibility, while still offering the computational efficiency needed for the most demanding simulation problems in science and engineering. Technically speaking, Diffpack is a collection of C++ libraries with classes, functions and utility programs. The numerical functionality is embedded in an environment of software engineering tools supporting the management of Diffpack development projects. Diffpack supports a variety of numerical methods with a distinct focus on the finite element method (FEM), but has no inherent restrictions on the types of PDEs, and therefore applications, to be solved. The key point of partition-of-unity enriched methods such as XFEM and GFEM is to help capture discontinuities, singularities or large gradients in solutions, which are not well resolved by h- or p-refinement [1]. The general idea is that the mesh need not conform to moving boundaries, so that minimal or no remeshing is required during the analysis. Our main motivation is to provide a generic implementation of enrichment within a flexible C++ environment, namely the Diffpack platform. The work was inspired by some of our earlier work [6,9] and that of other colleagues [5,7,8]. We demonstrate how object-oriented programming is particularly useful for the treatment of data structures and operations associated with XFEM: mesh-geometry interaction, non-standard integration rules, application of boundary conditions, and treatment of level set data [2,6]. We detail the implementation of such features and verify and validate the implementation based on [5].
    We show results based on unshifted and shifted [1] enrichment, and study the behaviour of the stable generalized finite element method (SGFEM), which avoids blending effects and helps control the conditioning of the system matrix [4]. For the integration of elements cut by an interface, we use an in-house Delaunay triangulation algorithm proposed by [3,5] and presented in detail in a companion paper.
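    The shifted enrichment mentioned above can be sketched in 1D: each standard shape function is multiplied by the Heaviside function shifted by its nodal value, so the enriched functions vanish at the nodes and the standard degrees of freedom keep their physical meaning. The single element and interface position below are assumed for illustration.

```python
# 1D sketch of shifted Heaviside enrichment on one element [0, 1]
# with a hypothetical material interface at x = XI.
XI = 0.4
NODES = [0.0, 1.0]

def n_std(i, x):
    """Standard linear shape functions on [0, 1]."""
    return 1.0 - x if i == 0 else x

def heaviside(x):
    """Sign-type Heaviside function across the interface."""
    return 1.0 if x >= XI else -1.0

def n_enr(i, x):
    """Shifted enrichment N_i(x) * (H(x) - H(x_i)): zero at every node."""
    return n_std(i, x) * (heaviside(x) - heaviside(NODES[i]))

# Kronecker-delta property preserved: enrichment vanishes at the nodes
assert n_enr(0, 0.0) == 0.0 and n_enr(1, 1.0) == 0.0
# the enriched function jumps by 2 * N_0 across the interface
print(n_enr(0, 0.39), n_enr(0, 0.41))
```

Unshifted enrichment (N_i(x) * H(x)) captures the same jump but pollutes the nodal values, which is one motivation for the shifted and SGFEM variants discussed in the abstract.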