
    A stochastic continuum damage model for dynamic fracture analysis of quasi-brittle materials using asynchronous Spacetime Discontinuous Galerkin (aSDG) method

    Microstructural design has an essential effect on the fracture response of brittle materials. We present a stochastic bulk damage formulation to model dynamic brittle fracture and compare it with a similar interfacial model for homogeneous and heterogeneous materials. The damage models are rate-dependent, and the corresponding damage evolution includes delay effects. The evolution equation specifies the rate at which damage tends to its quasi-static limit. The relaxation time of the model introduces an intrinsic length scale for dynamic fracture and addresses the mesh-sensitivity problem of earlier damage models with much lower computational effort. The ordinary differential form of the damage equation makes this remedy quite simple and enables capturing the loading-rate sensitivity of the stress-strain response. A stochastic field is defined for material cohesion and fracture strength to incorporate microstructure effects into the proposed formulation. The statistical fields are constructed through the Karhunen-Loève (KL) method. An advanced asynchronous Spacetime Discontinuous Galerkin (aSDG) method is used to discretize the final system of coupled equations. A local and asynchronous solution process, linear complexity of the solution in the number of elements, local recovery of balance properties, and high spatial and temporal orders of accuracy are some of the main advantages of the aSDG method. Several numerical examples demonstrate the mesh insensitivity of the method and the effect of boundary conditions on dynamic fracture patterns. It is shown that inhomogeneity greatly differentiates fracture patterns from those of a homogeneous rock, including the location of zones with maximum damage. Moreover, as the correlation length of the random field decreases, fracture patterns resemble the angled cracks observed in compressive rock fracture. The final results show that the stochastic bulk damage model produces more realistic results than a homogenized model.
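The truncated Karhunen-Loève construction mentioned in the abstract can be sketched in a few lines: discretize the covariance kernel, take its leading eigenpairs, and weight them with independent standard-normal coefficients. The 1D domain, exponential covariance kernel, and all parameter values below are illustrative assumptions, not details taken from the paper.

```python
import numpy as np

def kl_random_field(x, mean, std, corr_len, n_modes, rng):
    # Exponential covariance kernel (illustrative choice)
    C = std**2 * np.exp(-np.abs(x[:, None] - x[None, :]) / corr_len)
    # KL modes: eigenpairs of the discretized covariance, largest first
    eigvals, eigvecs = np.linalg.eigh(C)
    idx = np.argsort(eigvals)[::-1][:n_modes]
    lam, phi = eigvals[idx], eigvecs[:, idx]
    # Truncated KL expansion with independent standard-normal coefficients
    xi = rng.standard_normal(n_modes)
    return mean + phi @ (np.sqrt(np.maximum(lam, 0.0)) * xi)

rng = np.random.default_rng(0)
x = np.linspace(0.0, 1.0, 200)
# One realization of a strength-like field (hypothetical units and values)
field = kl_random_field(x, mean=10.0, std=1.5, corr_len=0.2, n_modes=20, rng=rng)
```

Shrinking `corr_len` forces more modes to carry significant energy and yields rougher realizations, which mirrors the abstract's observation that shorter correlation lengths change the resulting fracture patterns.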

    Tailoring structures using stochastic variations of structural parameters.

    Imperfections, meaning deviations from an idealized structure, can manifest through unintended variations in a structure’s geometry or material properties. Such imperfections affect the stiffness properties and can change the way structures behave under load. The magnitude of these effects determines how reliable and robust a structure is under loading. Minor changes in geometry and material properties can also be added intentionally, creating a more beneficial load response or making a more robust structure. Examples of this are variable stiffness composites, which have varying fiber paths, or structures with thickened patches. The work presented in this thesis aims to introduce a general approach to creating geodesic random fields in finite elements and exploiting these to improve designs. Random fields can be assigned to a material or geometric parameter. Stochastic analysis can then quantify the effects of variations on a structure for a given type of imperfection. Information extracted from the effects of imperfections can also identify areas critical to a structure’s performance. Post-processing the stochastic results by computing the correlation between local changes and the structural performance results in a pattern describing the effects of local changes. Perturbing the ideal deterministic geometry or material distribution of a structure using the pattern of local influences can increase performance. Examples demonstrate the approach by increasing the deterministic (without imperfections applied) linear buckling load, fatigue life, and post-buckling path of structures. Deterministic improvements can have a detrimental effect on the robustness of a structure. Increasing the amplitude of perturbation applied to the original design can improve the robustness of a structure’s response. Robustness analyses on a curved composite panel show that increasing the amplitude of design changes makes a structure less sensitive to variations. 
    The example studied shows that an increase in robustness comes with a relatively small decrease in the deterministic improvement.
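The correlation-pattern idea described above can be sketched generically: sample local parameter deviations, correlate each with a scalar performance measure, and perturb the nominal design along the resulting pattern. Everything below (the toy linear performance model, array shapes, and amplitude scaling) is an illustrative assumption, not the thesis implementation.

```python
import numpy as np

def influence_pattern(samples, performance):
    # samples: (n_mc, n_params) local parameter deviations per realization
    # performance: (n_mc,) scalar response (e.g. a buckling load) per realization
    dx = samples - samples.mean(axis=0)
    dy = performance - performance.mean()
    # Pearson correlation between each local deviation and the performance
    return (dx * dy[:, None]).sum(axis=0) / (
        np.sqrt((dx**2).sum(axis=0) * (dy**2).sum()) + 1e-12)

def perturb_design(nominal, pattern, amplitude):
    # Push each local parameter in the direction correlated with better
    # performance, scaled to a chosen perturbation amplitude
    return nominal + amplitude * pattern / (np.abs(pattern).max() + 1e-12)

rng = np.random.default_rng(1)
S = rng.standard_normal((2000, 3))          # Monte Carlo deviations
p = S @ np.array([1.0, -1.0, 0.5])          # toy "performance" model
pattern = influence_pattern(S, p)
new_design = perturb_design(np.zeros(3), pattern, amplitude=0.1)
```

Larger `amplitude` values correspond to the robustness trade-off discussed above: stronger design changes desensitize the structure to variations at some cost in the deterministic gain.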

    UQ and AI: data fusion, inverse identification, and multiscale uncertainty propagation in aerospace components

    A key requirement for engineering designs is that they offer good performance across a range of uncertain conditions while exhibiting an admissibly low probability of failure. To design such components, it is necessary to account for the effect of the uncertainties associated with a candidate design. Uncertainty Quantification (UQ) methods are statistical methods that quantify the effect of the uncertainties inherent in a system on its performance. This thesis expands the envelope of UQ methods for the design of aerospace components, supporting the integration of UQ methods into product development by addressing four industrial challenges. Firstly, a method for propagating uncertainty through computational models in a hierarchy of scales is described, based on probabilistic equivalence and Non-Intrusive Polynomial Chaos (NIPC). This problem is relevant to the design of aerospace components because the computational models used to evaluate candidate designs are typically multiscale. This method was then extended into a formulation for inverse identification, where the probability distributions for the material properties of a coupon are deduced from measurements of its response. We demonstrate how probabilistic equivalence and the Maximum Entropy Principle (MEP) may be used to leverage simulation data alongside scarce experimental data, with the intention of making this stage of product design less expensive and time-consuming. The third contribution of this thesis is two novel meta-modelling strategies that promote wider exploration of the design space during the conceptual design phase.
    Design Space Exploration (DSE) in this phase is crucial because decisions made at the early, conceptual stages of an aircraft design can restrict the range of alternative designs available later in the design process, even though only limited quantitative knowledge of the interaction between requirements is available at this stage. A histogram interpolation algorithm is presented that allows the designer to interactively explore the design space with a model-free formulation, while a meta-model based on Knowledge Based Neural Networks (KBaNNs) is proposed in which the outputs of a high-level, inexpensive computer code are informed by the outputs of a neural network, addressing the criticism that neural networks are purely data-driven and operate as black boxes. The final challenge addressed by this thesis is how to iteratively improve a meta-model by expanding the dataset used to train it; given the reliance of UQ methods on meta-models, this is an important challenge. This thesis proposes an adaptive learning algorithm for Support Vector Machine (SVM) meta-models, which are used to approximate an unknown function. In particular, we apply the adaptive learning algorithm to test cases in reliability analysis.
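A minimal sketch of boundary-seeking adaptive learning for an SVM meta-model, in the spirit described above: repeatedly label the unlabeled candidate closest to the current decision boundary, since that is where the classifier is most uncertain. The circular limit-state function, RBF kernel settings, and pool size are hypothetical choices for illustration, not details from the thesis.

```python
import numpy as np
from sklearn.svm import SVC

def true_label(X):
    # Hypothetical limit-state: "failure" inside a circle of radius 0.5
    return (X[:, 0]**2 + X[:, 1]**2 < 0.25).astype(int)

def adaptive_svm(n_init=20, n_adapt=30, seed=0):
    rng = np.random.default_rng(seed)
    # Two fixed anchor points guarantee both classes appear in the initial set
    X = np.vstack([[0.0, 0.0], [0.9, 0.9],
                   rng.uniform(-1, 1, size=(n_init, 2))])
    y = true_label(X)
    pool = rng.uniform(-1, 1, size=(2000, 2))   # unlabeled candidates
    for _ in range(n_adapt):
        model = SVC(kernel="rbf", gamma=2.0).fit(X, y)
        # Enrichment criterion: the smallest |decision_function| value marks
        # the candidate lying closest to the current decision boundary
        i = int(np.argmin(np.abs(model.decision_function(pool))))
        X = np.vstack([X, pool[i]])
        y = np.append(y, true_label(pool[i:i + 1]))
        pool = np.delete(pool, i, axis=0)
    return SVC(kernel="rbf", gamma=2.0).fit(X, y), X, y

model, X_train, y_train = adaptive_svm()
```

Concentrating new samples near the limit state is what makes such schemes attractive for reliability analysis, where only the location of the failure boundary matters for the probability-of-failure estimate.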

    Snapshot-Based Methods and Algorithms

    The increasing complexity of models used to predict real-world systems creates a need for algorithms that replace complex models with far simpler ones while preserving the accuracy of the predictions. This two-volume handbook covers methods as well as applications. This second volume focuses on applications in engineering, biomedical engineering, computational physics and computer science.
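As a concrete instance of the snapshot idea, Proper Orthogonal Decomposition (POD) builds a reduced basis from the SVD of a matrix whose columns are full-model states; the sketch below is a generic illustration of that technique, not code from the handbook.

```python
import numpy as np

def pod_basis(snapshots, r):
    # snapshots: (n_dof, n_snap), columns are states of the full-order model
    # POD basis = leading left singular vectors of the snapshot matrix
    U, s, _ = np.linalg.svd(snapshots, full_matrices=False)
    return U[:, :r], s          # reduced basis and singular values

def project(full_state, basis):
    return basis.T @ full_state  # full space -> reduced coordinates

def reconstruct(reduced_state, basis):
    return basis @ reduced_state # reduced coordinates -> full space

# Demo: snapshots spanned by exactly two spatial modes are reproduced
# exactly by a rank-2 POD basis
x = np.linspace(0.0, 1.0, 64)
modes = np.stack([np.sin(np.pi * x), np.sin(2 * np.pi * x)], axis=1)
snapshots = modes @ np.random.default_rng(2).standard_normal((2, 30))
basis, sing_vals = pod_basis(snapshots, r=2)
approx = reconstruct(project(snapshots, basis), basis)
```

The decay of `sing_vals` tells the practitioner how small `r` can be made before accuracy is lost, which is the core trade-off these volumes address.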

    Model Order Reduction

    The increasing complexity of models used to predict real-world systems creates a need for algorithms that replace complex models with far simpler ones while preserving the accuracy of the predictions. This two-volume handbook covers methods as well as applications. This second volume focuses on applications in engineering, biomedical engineering, computational physics and computer science.

    Applications


    Model Order Reduction

    The increasing complexity of models used to predict real-world systems creates a need for algorithms that replace complex models with far simpler ones while preserving the accuracy of the predictions. This three-volume handbook covers methods as well as applications. This third volume focuses on applications in engineering, biomedical engineering, computational physics and computer science.