
    Buckling Sensitivity of Tow-Steered Plates Subjected to Multiscale Defects by High-Order Finite Elements and Polynomial Chaos Expansion

    It is well known that fabrication processes inevitably lead to defects in the manufactured components. However, thanks to the manufacturing capabilities that have emerged over recent decades, the number of imperfections has diminished, and numerical models can increasingly describe the as-manufactured designs. Even so, many defect types have not been studied yet, let alone the coupling among them. This paper aims to characterise the buckling response of Variable Stiffness Composite (VSC) plates subjected to spatially varying fibre volume content as well as fibre misalignments, yielding a multiscale sensitivity analysis. On the one hand, VSCs have been modelled by means of the Carrera Unified Formulation (CUF) and a layer-wise (LW) approach, with which independent stochastic fields can be assigned to each composite layer. On the other hand, microscale analysis has been performed by employing CUF-based Mechanics of Structure Genome (MSG), which was used to build surrogate models relating the fibre volume fraction to the material elastic properties. Stochastic buckling analyses were then carried out following a multiscale Monte Carlo approach to characterise the buckling load distributions statistically. It was demonstrated that this multiscale sensitivity approach can be accelerated by an adequate use of sampling techniques and surrogate models such as Polynomial Chaos Expansion (PCE). Finally, it was shown that the sensitivity is strongly affected by the nominal fibre orientation and the multiscale uncertainty features.
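
    As a sketch of the acceleration claim, the minimal example below (assuming numpy; the quadratic buckling_load function, sample sizes and PCE degree are invented stand-ins, not the paper's CUF/MSG model) fits a least-squares Polynomial Chaos Expansion to a toy buckling-load response and recovers its statistics far more cheaply than brute-force Monte Carlo:

    ```python
    import math

    import numpy as np
    from numpy.polynomial.hermite_e import hermevander

    rng = np.random.default_rng(0)

    # Toy "buckling load" as a smooth function of a standard-normal
    # misalignment parameter xi (stand-in for the expensive FE model).
    def buckling_load(xi):
        return 100.0 - 8.0 * xi - 1.5 * xi**2

    # Training samples; in practice each would be an expensive FE solve.
    xi = rng.standard_normal(200)
    y = buckling_load(xi)

    # Least-squares PCE fit in probabilists' Hermite polynomials, degree 4.
    deg = 4
    Psi = hermevander(xi, deg)                  # design matrix He_0..He_4
    coef, *_ = np.linalg.lstsq(Psi, y, rcond=None)

    # For standard-normal input: mean = c_0, variance = sum_k k! * c_k^2
    # (orthogonality of the He_k basis).
    fact = np.array([math.factorial(k) for k in range(deg + 1)])
    mean_pce = coef[0]
    var_pce = np.sum(fact[1:] * coef[1:] ** 2)

    # Monte Carlo reference on a much larger sample.
    y_mc = buckling_load(rng.standard_normal(100_000))
    print(f"PCE mean={mean_pce:.3f} var={var_pce:.3f}")
    print(f"MC  mean={y_mc.mean():.3f} var={y_mc.var():.3f}")
    ```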

    A modular nonlinear stochastic finite element formulation for uncertainty estimation

    The Monte Carlo method is widely used for the estimation of uncertainties in mechanical engineering design. However, while flexible, this method remains impractical in terms of computational time and scalability. In the 1990s, Ghanem and Spanos introduced a more efficient approach. Albeit code-intrusive, this framework has the advantage of being sampling-independent, provides accurate output statistics and is modular in terms of operation. However, it has traditionally been limited to linear elasticity. This thesis aims to extend the Galerkin Stochastic Finite Element Method (GSFEM) to a wider range of industry-relevant applications. First, wavelet expansions are incorporated into the GSFEM. This new feature enables capturing discontinuous response surfaces. The flexibility of this approach is illustrated with a 3D hyperelastic example whose nonlinear behaviour arises from buckling uncertainty. Then, the GSFEM is specialised into the A Posteriori Finite Element Method (APFEM), where uniform distributions are taken by default so that parametric studies of the inputs of interest can be performed as a post-processing step after the simulation. In doing so, APFEM only requires knowledge of the vertices of the parameter space. One key advantage of APFEM is its use in the context of Bayesian inference, where the random evaluations required by the Bayesian setting (usually done through Monte Carlo) can be performed exactly without the need for further simulations. Finally, APFEM is modified to obtain the output response under random geometric perturbations. This framework allows reliable prediction of mechanical outcomes for the set of geometries comprised between two given configurations. The GSFEM has also been extended to a wide panel of constitutive models and modified to incorporate simple contact features, leveraging the maturity of our stochastic algebra. Using the Monte Carlo (MC) method as a baseline, results show the excellent accuracy of the stochastic methods described above. The developed framework could allow further enhancement of the design step in the engineering process, as a single simulation is enough to obtain all possible mechanical outcomes.
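
    A minimal sketch of the sampling-independent Galerkin idea, reduced to a single stochastic degree of freedom (the stiffness values, load and PCE order below are illustrative, not taken from the thesis): the random equation (k0 + k1*xi) u(xi) = f is projected onto a Hermite chaos basis and solved once, with no sampling loop.

    ```python
    import math

    import numpy as np
    from numpy.polynomial.hermite_e import hermegauss, hermeval

    # 1-DOF model problem: (k0 + k1*xi) * u(xi) = f, with xi ~ N(0, 1).
    k0, k1, f = 10.0, 1.0, 1.0
    P = 6  # PCE order

    # Gauss-Hermite(e) quadrature for expectations under the standard normal.
    x, w = hermegauss(40)
    w = w / math.sqrt(2.0 * math.pi)   # normalise weights to the Gaussian pdf

    # Evaluate He_0..He_P at the quadrature nodes.
    H = np.stack([hermeval(x, np.eye(P + 1)[k]) for k in range(P + 1)])

    # Galerkin system: A[i,j] = E[(k0 + k1*xi) He_j He_i], b[i] = f * E[He_i].
    A = np.einsum('q,iq,jq->ij', w * (k0 + k1 * x), H, H)
    b = f * (H @ w)
    u = np.linalg.solve(A, b)          # PCE coefficients of u(xi), one solve

    # Compare against direct sampling of u(xi) = f / (k0 + k1*xi).
    rng = np.random.default_rng(1)
    xs = rng.standard_normal(200_000)
    print(f"Galerkin mean={hermeval(xs, u).mean():.5f}  "
          f"MC mean={(f / (k0 + k1 * xs)).mean():.5f}")
    ```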

    Metamodel-based uncertainty quantification for the mechanical behavior of braided composites

    The main design requirement for any high-performance structure is minimal dead weight. Producing lighter structures for the aerospace and automotive industries directly leads to fuel efficiency and, hence, cost reduction. For wind energy, lighter wings allow larger rotor blades and, consequently, better performance. Prosthetic implants for missing body parts and athletic equipment such as rackets and sticks should also be lightweight for augmented functionality. Additional demands, depending on the application, often include improved fatigue strength and damage tolerance, crashworthiness, and temperature and corrosion resistance. Fiber-reinforced composite materials lie at the intersection of all the above requirements, since they offer competitive stiffness and ultimate strength levels at much lower weight than metals, as well as high optimization and design potential due to their versatility. Braided composites are a special category with continuous fiber bundles interlaced around a preform. The automated braiding manufacturing process allows simultaneous material-structure assembly and, therefore, high-rate production with minimal material waste. The multi-step material processes and the intrinsic heterogeneity are the basic origins of the variability observed during mechanical characterization and operation of composite end-products. Conservative safety factors are applied during the design process to account for uncertainties, even though stochastic modeling approaches lead to more rational estimations of structural safety and reliability. Such approaches require statistical modeling of the uncertain parameters, which is expensive to perform experimentally. A robust virtual uncertainty quantification framework is presented, able to integrate material and geometric uncertainties of different nature and to statistically assess the response variability of braided composites in terms of effective properties. Information-passing multiscale algorithms are employed for high-fidelity predictions of stiffness and strength. To bypass the numerical cost of the repeated multiscale model evaluations required by the probabilistic approach, efficient solutions are needed. Surrogate models are thus trained to map manifolds at different scales and eventually substitute the finite element models. The use of machine learning is viable for uncertainty quantification, optimization and reliability applications of textile materials, but it is not straightforward for failure responses with complex response surfaces. Novel techniques based on variable-fidelity data and hybrid surrogate models are also integrated. Uncertain parameters are classified according to their significance for the corresponding response via variance-based global sensitivity analysis procedures. Quantification of the random properties in terms of mean and variance can be achieved by inverse approaches based on Bayesian inference. All stochastic and machine learning methods included in the framework are non-intrusive and data-driven, ensuring direct extension towards more load cases and different materials. Moreover, experimental validation of the adopted multiscale models is presented, and an application of stochastic recreation of random textile yarn distortions based on computed tomography data is demonstrated.
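
    The variance-based global sensitivity step can be illustrated with a pick-freeze (Saltelli-type) estimator of first-order Sobol indices. The effective-stiffness function and parameter ranges below are invented stand-ins for the multiscale braided-composite model, not values from the thesis:

    ```python
    import numpy as np

    rng = np.random.default_rng(2)

    # Toy effective stiffness of a braided unit cell: inputs are fibre
    # volume fraction Vf and braid angle theta (names are illustrative).
    def stiffness(Vf, theta_deg):
        theta = np.radians(theta_deg)
        return 230.0 * Vf * np.cos(theta) ** 4 + 4.0 * (1.0 - Vf)

    def sample(n):
        Vf = rng.uniform(0.45, 0.65, n)
        th = rng.uniform(15.0, 45.0, n)
        return np.column_stack([Vf, th])

    # Pick-freeze estimator (Saltelli et al., 2010) of first-order indices:
    # S_i = E[y_B * (y_ABi - y_A)] / Var(Y), where ABi is A with column i
    # taken from B.
    n = 100_000
    A, B = sample(n), sample(n)
    yA = stiffness(A[:, 0], A[:, 1])
    yB = stiffness(B[:, 0], B[:, 1])
    var = yA.var()
    for i, name in enumerate(["Vf", "theta"]):
        ABi = A.copy()
        ABi[:, i] = B[:, i]
        yABi = stiffness(ABi[:, 0], ABi[:, 1])
        Si = np.mean(yB * (yABi - yA)) / var
        print(f"S_{name} ~ {Si:.3f}")
    ```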

    Multiscale Methods for Random Composite Materials

    Simulation of material behaviour is a vital tool not only in accelerating product development and increasing design efficiency but also in advancing our fundamental understanding of materials. While homogeneous, isotropic materials are often simple to simulate, advanced, anisotropic materials pose a more sizeable challenge. In simulating entire composite components, such as a 25m aircraft wing made by stacking several 0.25mm thick plies, finite element models typically run to many millions or even a billion unknowns. This problem is exacerbated by the inclusion of sub-millimetre manufacturing defects for two reasons. Firstly, a finer resolution is required, which makes the problem larger. Secondly, defects introduce randomness. Traditionally, this randomness or uncertainty has been quantified heuristically, since commercial codes are largely unsuccessful in solving problems of this size. This thesis develops a rigorous uncertainty quantification (UQ) framework permitted by a state-of-the-art finite element package, dune-composites, also developed here, designed for but not limited to composite applications. A key feature of this open-source package is a robust, parallel and scalable preconditioner, GenEO, that guarantees constant iteration counts independent of problem size. It boasts near-perfect scaling properties, in both a strong and a weak sense, on over 15,000 cores. It is numerically verified by solving industrially motivated problems containing upwards of 200 million unknowns. Equipped with the capability of solving expensive models, a novel stochastic framework is developed to quantify variability in part performance arising from localized out-of-plane defects. Theoretical part strength is determined for independent samples drawn from a distribution inferred from B-scans of wrinkles. Supported by the literature, the results indicate a strong dependence between maximum misalignment angle and strength knockdown, based on which an engineering model is presented to allow rapid estimation of residual strength, bypassing expensive simulations. The engineering model itself is built from a large set of simulations of residual strength, each of which is computed using the following two-step approach. First, a novel parametric representation of wrinkles is developed, where the spread of parameters defines the wrinkle distribution. Second, expensive forward models are solved only for independent wrinkles using dune-composites. Besides scalability, the other key feature of dune-composites, the GenEO coarse space, doubles as an excellent multiscale basis, which is exploited to build high-quality reduced-order models that are orders of magnitude smaller. This is important because it enables multiple coarse solves for the cost of one fine solve. In an MCMC framework, where many solves are wasted in arriving at the next independent sample, this is a sought-after quality, because it greatly increases the effective sample size for a fixed computational budget, thus providing a route to high-fidelity UQ. This thesis exploits both the new solvers and the multiscale methods developed here to design an efficient Bayesian framework that carries out previously intractable (large-scale) simulations calibrated by experimental data. These new capabilities provide the basis for future work on modelling random heterogeneous materials, while also offering scope for building virtual test programs, including nonlinear analyses, all of which can be implemented within a probabilistic setting.
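
    The two-step wrinkle approach might be sketched as follows. The wrinkle parameterisation, the parameter distributions and the exponential knockdown law are illustrative placeholders, not the thesis' calibrated engineering model or the distribution inferred from B-scans:

    ```python
    import numpy as np

    rng = np.random.default_rng(3)

    # Illustrative parametric wrinkle: out-of-plane waviness w(x) with
    # random amplitude a, wavelength lam and envelope width s (all in mm).
    def wrinkle(x, a, lam, s):
        return a * np.exp(-(x / s) ** 2) * np.cos(2.0 * np.pi * x / lam)

    def max_misalignment_deg(a, lam, s):
        x = np.linspace(-30.0, 30.0, 1001)
        dw = np.gradient(wrinkle(x, a, lam, s), x)  # fibre rotation = slope
        return np.degrees(np.arctan(np.abs(dw).max()))

    # Draw wrinkle parameters and evaluate a toy knockdown law: residual
    # strength fraction decaying with the maximum misalignment angle.
    n = 5_000
    a = rng.gamma(2.0, 0.25, n)          # amplitude
    lam = rng.uniform(10.0, 25.0, n)     # wavelength
    s = rng.uniform(5.0, 15.0, n)        # envelope width
    angles = np.array([max_misalignment_deg(*p) for p in zip(a, lam, s)])
    knockdown = np.exp(-0.08 * angles)   # placeholder engineering model
    print(f"median max angle = {np.median(angles):.1f} deg, "
          f"median residual strength = {np.median(knockdown):.2f}")
    ```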

    A probabilistic peridynamic framework with an application to the study of the statistical size effect

    The high computational expense of peridynamic models remains a major limitation, hindering 'outer-loop' applications that require a large number of simulations, for example, uncertainty quantification. This contribution presents a framework that makes such computations feasible. By employing a Multilevel Monte Carlo framework, in which the majority of simulations are performed using a coarse mesh and relatively few simulations are performed using a fine mesh, a significant reduction in computational cost can be realised, and statistics of structural failure can be estimated. The maximum observed speed-up factor is 16 when compared to a standard Monte Carlo estimator, thus enabling the efficient forward propagation of uncertain parameters in a computationally expensive peridynamic model. Furthermore, the multilevel method provides an estimate of both the discretisation error and the sampling error, thereby improving confidence in numerical predictions. The performance of the approach is demonstrated through an examination of the statistical size effect in quasi-brittle materials.
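
    A minimal two-level version of the estimator illustrates where the speed-up comes from. The toy response function below stands in for a peridynamic failure-load solve and is not from the paper; the sample sizes are likewise illustrative:

    ```python
    import numpy as np

    rng = np.random.default_rng(4)

    # P(xi, level) stands in for a solve at mesh level `level`; the bias
    # term h*cos(3*xi) shrinks as the mesh is refined.
    def P(xi, level):
        h = 0.5 ** level
        return np.sin(xi) + 2.0 + h * np.cos(3.0 * xi)

    # Level 0: many cheap coarse samples. Level 1: few coupled corrections
    # P_1 - P_0 evaluated on the *same* random inputs; the coupling makes
    # the correction variance, hence the required sample size, small.
    xi0 = rng.standard_normal(100_000)
    xi1 = rng.standard_normal(2_000)
    mlmc = P(xi0, 0).mean() + (P(xi1, 1) - P(xi1, 0)).mean()

    # Standard MC at the fine level for comparison.
    mc = P(rng.standard_normal(5_000), 1).mean()
    print(f"MLMC estimate = {mlmc:.4f},  fine-level MC = {mc:.4f}")
    ```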

    Function-oriented in-line quality assurance of hybrid sheet molding compound

    The use of fibre-reinforced polymers (FRP) is steadily increasing worldwide. Combining discontinuous sheet molding compound (DiCo-SMC) and continuous SMC (Co-SMC) into a new hybrid material class (CoDiCo-SMC) promises low manufacturing costs together with high local stiffness and strength. However, manufacturing deviations jeopardise the functional performance of the produced parts. The resulting demand for defect-free FRP parts is, alongside the high price of raw materials, a further cost driver. This work addresses the problem with a part-specific, function-oriented in-line quality assurance approach. For this type of quality assurance, in-line measurement results are integrated into functional models. Metamodels of the functional models accelerate the functional evaluation and enable a statement about the part function within the production cycle time. In this work, part-specific, function-oriented in-line quality assurance was implemented exemplarily for the new material class CoDiCo-SMC. Three different measurement techniques were used to quantify three relevant manufacturing deviations (local glass-fibre content, pose of the Co-SMC patch, delamination). Terahertz spectroscopy was used for the first time for the in-line measurement of local glass-fibre content in DiCo-SMC. Pulse-phase thermography was used to quantify delamination, and an industrial camera to measure the pose of the Co-SMC patch. For each measurement technique, the measurement uncertainty was quantified according to the "Guide to the expression of uncertainty in measurement" (GUM). The measurement results were further processed in a parameterised finite element (FE) model and aggregated into a functional prediction. Using the measurement results and the modelled function, metamodels could be trained on these input-output relationships. In this work, the predicted part function is itself treated as a measurement result; therefore, the measurement uncertainties of both the FE models and the metamodels were determined. The proposed approach was validated on two exemplary test specimens. The results show that, in particular, the measurements of local glass-fibre content and of the Co-SMC patch pose allow conclusions to be drawn about the part-specific stiffness. However, the measurement uncertainties determined so far still preclude industrial application. Using part-specific functional information after manufacturing makes it possible to reduce the customary safety factors applied in the dimensioning of FRP parts.
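
    The GUM-style aggregation of in-line measurements into a function prediction with a stated uncertainty might look like the following sketch. The metamodel, the measured values and the uncertainty budgets are invented for illustration and are not the thesis' trained models:

    ```python
    import numpy as np

    # Illustrative metamodel of part stiffness as a function of two in-line
    # measurands: local glass-fibre content phi (%) and patch offset x (mm).
    # Coefficients are invented stand-ins for a model trained on FE results.
    def stiffness(phi, x):
        return 1.2 * phi + 0.05 * phi**2 - 0.8 * np.abs(x)

    # GUM-style first-order propagation: combined standard uncertainty from
    # the standard uncertainties of each input (finite-difference slopes).
    phi0, x0 = 25.0, 1.5          # measured values
    u_phi, u_x = 1.2, 0.4         # standard uncertainties from GUM budgets
    eps = 1e-4
    dphi = (stiffness(phi0 + eps, x0) - stiffness(phi0 - eps, x0)) / (2 * eps)
    dx = (stiffness(phi0, x0 + eps) - stiffness(phi0, x0 - eps)) / (2 * eps)
    u_c = np.hypot(dphi * u_phi, dx * u_x)
    print(f"predicted stiffness = {stiffness(phi0, x0):.1f} ± {u_c:.1f} (k=1)")
    ```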

    UQ and AI: data fusion, inverse identification, and multiscale uncertainty propagation in aerospace components

    A key requirement for engineering designs is that they offer good performance across a range of uncertain conditions while exhibiting an admissibly low probability of failure. To design such components, it is necessary to account for the effect of the uncertainties associated with a candidate design. Uncertainty Quantification (UQ) methods are statistical methods that may be used to quantify the effect of the uncertainties inherent in a system on its performance. This thesis expands the envelope of UQ methods for the design of aerospace components, supporting the integration of UQ methods in product development by addressing four industrial challenges. Firstly, a method for propagating uncertainty through computational models in a hierarchy of scales is described that is based on probabilistic equivalence and Non-Intrusive Polynomial Chaos (NIPC). This problem is relevant to the design of aerospace components because the computational models used to evaluate candidate designs are typically multiscale. This method was then extended to develop a formulation for inverse identification, where the probability distributions for the material properties of a coupon are deduced from measurements of its response. We demonstrate how probabilistic equivalence and the Maximum Entropy Principle (MEP) may be used to leverage data from simulations alongside scarce experimental data, with the intention of making this stage of product design less expensive and time-consuming. The third contribution of this thesis is to develop two novel meta-modelling strategies to promote wider exploration of the design space during the conceptual design phase. Design Space Exploration (DSE) in this phase is crucial, as decisions made at the early, conceptual stages of an aircraft design can restrict the range of alternative designs available at later stages in the design process, despite only limited quantitative knowledge of the interaction between requirements being available at this stage. A histogram interpolation algorithm is presented that allows the designer to interactively explore the design space with a model-free formulation, while a meta-model based on Knowledge-Based Neural Networks (KBaNNs) is proposed, in which the outputs of a high-level, inexpensive computer code are informed by the outputs of a neural network, in this way addressing the criticism of neural networks that they are purely data-driven and operate as black boxes. The final challenge addressed by this thesis is how to iteratively improve a meta-model by expanding the dataset used to train it. Given the reliance of UQ methods on meta-models, this is an important challenge. This thesis proposes an adaptive learning algorithm for Support Vector Machine (SVM) metamodels, which are used to approximate an unknown function. In particular, we apply the adaptive learning algorithm to test cases in reliability analysis.
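
    The NIPC propagation step can be sketched by spectral projection, where PCE coefficients are obtained purely from black-box model evaluations at quadrature nodes, with no change to the solver itself. The model function and chaos order below are illustrative assumptions, not taken from the thesis:

    ```python
    import math

    import numpy as np
    from numpy.polynomial.hermite_e import hermegauss, hermeval

    # Stand-in for an expensive multiscale simulation of one random input.
    def model(xi):
        return np.exp(0.3 * xi) + 0.1 * xi**2

    P = 5
    x, w = hermegauss(30)
    w = w / math.sqrt(2.0 * math.pi)    # weights for the standard-normal pdf
    y = model(x)

    # Projection: c_k = <y, He_k> / ||He_k||^2, with ||He_k||^2 = k!.
    coef = np.empty(P + 1)
    for k in range(P + 1):
        Hk = hermeval(x, np.eye(P + 1)[k])
        coef[k] = np.sum(w * y * Hk) / math.factorial(k)

    mean = coef[0]
    var = sum(math.factorial(k) * coef[k] ** 2 for k in range(1, P + 1))
    print(f"NIPC mean = {mean:.4f}, variance = {var:.4f}")
    ```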

    Robust improvement of the asymmetric post-buckling behavior of a composite panel by perturbing fiber paths

    The buckling behavior of structures is highly sensitive to imperfections, i.e., deviations from the geometry and material properties of the ideal structure. In this paper, an approach is presented in which the effects of spatially varying fiber misalignments in composite structures are assessed through random field analysis and are subsequently used to improve the structure while simultaneously making it more robust to fiber misalignments. Effects of misalignments are quantified by applying random fields on the structure, which represent fiber misalignments. Using analyses of the effect of the random local stiffness changes due to fiber misalignments, a pattern of the relative influence these local changes have on the buckling load is created. By applying a small change to the local fiber orientation corresponding to this pattern to the original structure, the performance of the design is improved. Additional stochastic analyses are performed using the improved design, reanalyzing the effects local fiber misalignments have on the structural performance and the subsequent changes in robustness. Stochastic results show an overall increase in the mean buckling load and a reduction in the coefficient of variation in the analysis of the perturbed structure. The approach is applied to a composite panel exhibiting asymmetric post-buckling behavior, i.e., having an unstable post-buckling branch and an (initially) stable branch. Results show that perturbations in the fiber path can nudge a structure into a more stable post-buckling path by promoting a post-buckling path using local changes in structural stiffness. The robustness of improved designs can also increase, making structures less susceptible to local fiber misalignments.
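
    A hedged sketch of the random-field step: a correlated fibre-misalignment field is sampled by Cholesky factorisation of a squared-exponential covariance and pushed through a toy buckling proxy. All numbers, the covariance choice and the proxy itself are illustrative assumptions, not the paper's model:

    ```python
    import numpy as np

    rng = np.random.default_rng(5)

    # 1D Gaussian random field of fibre misalignment along the panel,
    # squared-exponential covariance, sampled via Cholesky factorisation.
    n, L, corr_len, sigma_deg = 100, 500.0, 80.0, 1.5
    x = np.linspace(0.0, L, n)
    C = sigma_deg**2 * np.exp(-((x[:, None] - x[None, :]) / corr_len) ** 2)
    Lc = np.linalg.cholesky(C + 1e-10 * np.eye(n))   # jitter for stability

    # Toy buckling-load proxy: local stiffness scales with cos^2 of the
    # perturbed fibre angle, and the panel load tracks its spatial mean.
    def buckling_proxy(misalign_deg, nominal_deg=45.0):
        theta = np.radians(nominal_deg + misalign_deg)
        return np.mean(np.cos(theta) ** 2)

    loads = np.array([buckling_proxy(Lc @ rng.standard_normal(n))
                      for _ in range(2_000)])
    print(f"mean = {loads.mean():.4f}, CoV = {loads.std() / loads.mean():.3f}")
    ```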
    • 

    corecore