64 research outputs found

    Suitably graded THB-spline refinement and coarsening: Towards an adaptive isogeometric analysis of additive manufacturing processes

    In the present work we introduce a complete set of algorithms to efficiently perform adaptive refinement and coarsening by exploiting truncated hierarchical B-splines (THB-splines) defined on suitably graded isogeometric meshes, called admissible mesh configurations. We apply the proposed algorithms to two-dimensional linear heat transfer problems with a localized moving heat source, as simplified models for additive manufacturing applications. We first verify the accuracy of the admissible adaptive scheme against an overkill solution, and then compare our results with those of similar schemes that use different refinement and coarsening algorithms, with or without grading parameters. This study shows that the THB-spline admissible solution delivers an optimal discretization in terms of both the accuracy of the approximation and the (reduced) number of degrees of freedom per time step. In the last example we investigate the capability of the algorithms to approximate the thermal history of the problem for a more complicated source path. The comparison with uniform and non-admissible hierarchical meshes demonstrates that in this case, too, our adaptive scheme achieves the desired accuracy while strongly improving computational efficiency. Comment: 20 pages, 12 figures
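As an illustrative aside (not code from the paper), the refine-near-the-source, coarsen-elsewhere idea can be sketched in one dimension. The routine name `adapt_mesh`, the refinement radius `r_fine`, and the element-size bounds `h_min`/`h_max` are hypothetical stand-ins for the THB-spline admissibility machinery:

```python
def adapt_mesh(nodes, source_x, r_fine=0.1, h_min=0.025, h_max=0.2):
    """One refine/coarsen pass over a sorted 1D mesh: halve elements
    near the moving source, merge pairs of elements far away from it."""
    new = [nodes[0]]
    i = 0
    while i < len(nodes) - 1:
        a, b = nodes[i], nodes[i + 1]
        mid = 0.5 * (a + b)
        near = abs(mid - source_x) < r_fine
        if near and (b - a) > h_min:                  # refine: split element
            new += [mid, b]
        elif (not near and i + 2 < len(nodes)
              and nodes[i + 2] - a <= h_max
              and abs(0.5 * (a + nodes[i + 2]) - source_x) >= r_fine):
            new.append(nodes[i + 2])                  # coarsen: merge two elements
            i += 1
        else:
            new.append(b)
        i += 1
    return new
```

Applied repeatedly as the source moves, the mesh stays fine only in a band around the source, which is the mechanism behind the reduced number of degrees of freedom per time step.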

    B-Spline based uncertainty quantification for stochastic analysis

    The consideration of uncertainties has become inevitable in state-of-the-art science and technology. Research in the field of uncertainty quantification has gained much importance in recent decades. The main focus of scientists is the identification of uncertain sources, the determination and hierarchization of uncertainties, and the investigation of their influence on system responses. Polynomial chaos expansion, among others, is suitable for this purpose and has established itself as a versatile and powerful tool in various applications. In recent years, its combination with dimension reduction methods has been intensively pursued to support the processing of high-dimensional input variables. This difficulty is known as the curse of dimensionality, and overcoming it would be considered a milestone in uncertainty quantification. This is where the present thesis starts: it investigates spline spaces, as a natural extension of polynomials, in the field of uncertainty quantification. The newly developed method, 'spline chaos', aims to employ the more complex, but thereby more flexible, structure of splines to tackle harder real-world applications where polynomial chaos fails. Ordinarily, the bases of polynomial chaos expansions are orthogonal polynomials, which are replaced by B-spline basis functions in this work. Convergence of the new method is proved and illustrated by numerical examples, which are extended to an accuracy analysis with multi-dimensional input. Moreover, by solving several stochastic differential equations, it is shown that spline chaos is a generalization of multi-element Legendre chaos and superior to it. Finally, spline chaos is applied to solving partial differential equations, resulting in a stochastic Galerkin isogeometric analysis that contributes to the efficient uncertainty quantification of elliptic partial differential equations. A general framework, together with an a priori error estimate of the expected solution, is provided.
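For illustration only (not code from the thesis), the B-spline basis functions that replace the orthogonal polynomials can be evaluated with the Cox-de Boor recursion; the knot vector in the example is a hypothetical choice:

```python
def bspline_basis(i, p, u, knots):
    """Cox-de Boor recursion: value of the i-th B-spline basis
    function of degree p at parameter u."""
    if p == 0:
        return 1.0 if knots[i] <= u < knots[i + 1] else 0.0
    left = right = 0.0
    if knots[i + p] > knots[i]:                    # skip zero-width spans
        left = (u - knots[i]) / (knots[i + p] - knots[i]) \
               * bspline_basis(i, p - 1, u, knots)
    if knots[i + p + 1] > knots[i + 1]:
        right = (knots[i + p + 1] - u) / (knots[i + p + 1] - knots[i + 1]) \
                * bspline_basis(i + 1, p - 1, u, knots)
    return left + right
```

On an open knot vector these functions are non-negative and sum to one at every parameter value (a partition of unity), which is one property that makes them workable as an expansion basis in place of orthogonal polynomials.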

    Probabilistic design optimization of horizontal axis wind turbine rotors

    Interest in renewable energy has increased considerably in recent years due to concerns over the environmental impact of conventional energy sources and their price volatility. In particular, wind power has enjoyed dramatic global growth in installed capacity over the past few decades. Nowadays, the advancement of the wind turbine industry represents a challenge for several engineering areas, including materials science, computer science, aerodynamics, analytical design and analysis methods, testing and monitoring, and power electronics. In particular, the technological improvement of wind turbines is currently tied to the use of advanced design methodologies, allowing designers to develop new and more efficient design concepts. Integrating mathematical optimization techniques into the multidisciplinary design of wind turbines constitutes a promising way to enhance the profitability of these devices. In the literature, wind turbine design optimization is typically performed deterministically. Deterministic optimizations do not consider any degree of randomness affecting the inputs of the system under consideration and therefore result in a unique set of outputs. However, given the stochastic nature of the wind and the uncertainties associated, for instance, with wind turbine operating conditions or geometric tolerances, deterministically optimized designs may be inefficient. Therefore, one way to further improve the design of modern wind turbines is to take the aforementioned sources of uncertainty into account in the optimization process, achieving robust configurations with minimal performance sensitivity to factors causing variability.
The research work presented in this thesis deals with the development of a novel integrated multidisciplinary design framework for the robust aeroservoelastic design optimization of multi-megawatt horizontal axis wind turbine (HAWT) rotors, accounting for the stochastic variability of the input variables. The design system is based on a multidisciplinary analysis module integrating the several simulation tools needed to characterize the aeroservoelastic behavior of wind turbines and determine their economic performance by means of the levelized cost of energy (LCOE). The reported design framework is portable and modular in that any of its analysis modules can be replaced with counterparts of user-selected fidelity. The presented technology is applied to the design of a 5-MW HAWT rotor to be used at sites of wind power density class from 3 to 7, where the mean wind speed at 50 m above the ground ranges from 6.4 to 11.9 m/s. Assuming the mean wind speed to vary stochastically in this range, the rotor design is optimized by minimizing the mean and standard deviation of the LCOE. Airfoil shapes, spanwise distributions of blade chord and twist, internal structural layup, and rotor speed are optimized concurrently, subject to an extensive set of structural and aeroelastic constraints. The effectiveness of the multidisciplinary and robust design framework is demonstrated by showing that the probabilistically designed turbine achieves more favorable probabilistic performance than both the initial baseline turbine and a turbine designed deterministically.
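The mean-plus-standard-deviation treatment of the LCOE described above can be illustrated with a toy Monte Carlo sketch. All names and the surrogate cost here are hypothetical, not the thesis's framework; only the 6.4-11.9 m/s wind-speed range is taken from the abstract:

```python
import random
import statistics

def robust_objective(design, winds, cost):
    """Robust measure: mean plus standard deviation of the cost
    over sampled wind conditions."""
    vals = [cost(design, w) for w in winds]
    return statistics.mean(vals) + statistics.pstdev(vals)

def robust_search(candidates, cost, n_samples=500, seed=0):
    """Pick the candidate design with the best robust objective under
    mean wind speeds drawn uniformly from 6.4-11.9 m/s (class 3-7 sites)."""
    rng = random.Random(seed)
    winds = [rng.uniform(6.4, 11.9) for _ in range(n_samples)]
    return min(candidates, key=lambda d: robust_objective(d, winds, cost))
```

With a surrogate cost that penalizes the mismatch between a design parameter and the realized mean wind, the search favors designs near the middle of the wind-speed range, rather than a design tuned to a single deterministic wind speed.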

    Efficient strategies for material characterization and mechanical model calibration for the virtual design of sheet metals

    The mechanical design of sheet metal forming parts tends to be more virtual, reducing delays and manufacturing costs. Reliable numerical simulations can also lead to optimized metallic parts using accurately calibrated advanced constitutive models. Thus, the aim of this thesis is to improve the representation of the mechanical behavior of the material in the numerical model by developing efficient and accurate methodologies to calibrate advanced constitutive models. A recent trend in material characterization is the use of a limited number of heterogeneous mechanical tests, which provide more valuable data than classical quasi-homogeneous tests. Yet the design of the most suitable tests is still an open question. To that end, an overview of heterogeneous mechanical tests for metallic sheets is provided. However, no standards exist for such tests, so specific metrics to analyze the achieved mechanical states are suggested and applied to four tests. Results show that the use of various metrics provides a good basis to qualitatively and quantitatively evaluate heterogeneous mechanical tests. Thanks to the development of full-field measurement techniques, it is possible to use heterogeneous mechanical tests to characterize the behavior of materials. However, no analytical relation exists between the measured fields and the material parameters. Inverse methodologies are required to calibrate constitutive models, using an optimization algorithm to find the best material parameters. Most applications tend to use a gradient-based algorithm without exploring other possibilities. The performance of gradient-based and gradient-free algorithms in the calibration of a thermoelastoviscoplastic model is discussed in terms of efficiency and robustness of the optimization process. Often, plane stress conditions are assumed in the calibration of constitutive models. Nevertheless, it is still unclear whether this assumption is acceptable when dealing with large deformations.
To further understand these limitations, the calibration of constitutive models is compared using the virtual fields method implemented in 2D and 3D frameworks. However, the 3D framework requires volumetric information on the kinematic fields, which is experimentally difficult to obtain. To address this constraint, an existing volume reconstruction method, named internal mesh generation, is improved to take strain gradients in the thickness into account. The uncertainty of the method is quantified through virtual experiments and synthetic images. Overall, the impact of this thesis relates to (i) the importance of establishing standard metrics for the selection and design of heterogeneous mechanical tests, and (ii) enhancing the calibration of advanced constitutive models from a 2D to a 3D framework.
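As a toy illustration of the inverse-calibration loop described in this abstract (not the thesis's virtual fields implementation), a derivative-free coordinate search can fit the parameters of a hypothetical linear-hardening law to measured data; every name below is an assumption for the sketch:

```python
def calibrate(strains, stresses, model, p0, step=0.5, tol=1e-6):
    """Derivative-free coordinate search: try +/- step moves on each
    parameter, halving the step whenever the misfit between measured
    and predicted stresses stops improving."""
    def misfit(p):
        return sum((s - model(e, p)) ** 2 for e, s in zip(strains, stresses))
    p, best = list(p0), misfit(p0)
    while step > tol:
        improved = False
        for i in range(len(p)):
            for d in (step, -step):
                trial = p.copy()
                trial[i] += d
                f = misfit(trial)
                if f < best:
                    p, best, improved = trial, f, True
        if not improved:
            step *= 0.5
    return p
```

With a hypothetical law sigma = sigma0 + H * eps and noise-free data, the search recovers the generating parameters, which is the gradient-free alternative the abstract contrasts with gradient-based calibration.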

    Annales Mathematicae et Informaticae (56.)


    Procedurally generated models for Isogeometric Analysis

    Increasingly powerful hardware and software allows for the numerical simulation of complex physical phenomena with high levels of detail. In light of this development, the definition of numerical models for the Finite Element Method (FEM) has become the bottleneck of the simulation process. Characteristic features of model generation are large manual efforts and a decoupling of the geometric and numerical models. In the highly probable case of design revisions, all steps of model preprocessing and mesh generation have to be repeated. This includes the idealization and approximation of a geometric model as well as the definition of boundary conditions and model parameters. Design variants leading to more resource-efficient structures might hence be disregarded due to limited budgets and constrained time frames. A potential solution to the above problem is given by the concept of Isogeometric Analysis (IGA). The core idea of this method is to directly employ a geometric model for numerical simulations, which makes it possible to circumvent model transformations and the accompanying data losses. The basis for this method is geometric models described in terms of Non-Uniform Rational B-Splines (NURBS). This class of piecewise rational polynomial functions is ubiquitous in computer graphics and Computer-Aided Design (CAD). It allows the description of a wide range of geometries using a compact mathematical representation. The shape of an object results from a weighted combination of a set of control points by means of the NURBS functions, allowing efficient representations for curves, surfaces, and solid bodies alike. Existing software applications, however, only support the modeling and manipulation of the former two. The description of three-dimensional solid bodies consequently requires significant manual effort, essentially forbidding the setup of complex models. This thesis proposes a procedural approach for the generation of volumetric NURBS models.
That is, a model is not described in terms of its data structures but as a sequence of modeling operations applied to a simple initial shape. In a sense, this describes the "evolution" of the geometric model under the sequence of operations. In order to adapt this concept to NURBS geometries, only a compact set of commands is necessary, which, in turn, can be adapted from existing algorithms. A model can then be treated in terms of interpretable model parameters. This leads to an abstraction from its data structures, and model variants can be set up by varying the governing parameters. The proposed concept complements existing template modeling approaches: templates can not only be defined in terms of modeling commands but can also serve as input geometry for those operations. Such templates, arranged in a nested hierarchy, provide an elegant model representation. They offer adaptivity on each tier of the model hierarchy and allow complex models to be created from only a few model parameters. This is demonstrated for volumetric fluid domains used in the simulation of vertical-axis wind turbines. Starting from a template representation of airfoil cross-sections, the complete "negative space" around the rotor blades can be described by a small set of model parameters, and model variants can be set up in a fraction of a second. NURBS models offer high geometric flexibility, allowing a given shape to be represented in different ways. Different model instances can exhibit varying suitability for numerical analyses. For their assessment, finite element mesh quality metrics are considered. These metrics are based on purely geometric criteria and make it possible to identify the model degenerations commonly used to achieve certain geometric features. They can be used to decide upon model adaptations and provide a measure of their efficacy. Unfortunately, they do not reveal a relation between mesh distortion and the ill-conditioning of the equation systems resulting from the numerical model.
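The procedural idea can be sketched with a toy example (not the thesis's NURBS command set): a model is stored as a sequence of parameterized operations folded over an initial shape. Here `translate` and `scale` are hypothetical stand-ins for the actual NURBS operators such as extrusion or revolution:

```python
def apply_ops(shape, ops):
    """Procedural model: the final geometry is the result of applying a
    sequence of (operation, parameters) pairs to an initial shape."""
    for op, params in ops:
        shape = op(shape, **params)
    return shape

def translate(pts, dx=0.0, dy=0.0):
    """Shift every point of a 2D point list."""
    return [(x + dx, y + dy) for x, y in pts]

def scale(pts, s=1.0):
    """Uniformly scale a 2D point list about the origin."""
    return [(s * x, s * y) for x, y in pts]
```

Editing a single parameter in the operation list regenerates a model variant without ever touching the underlying geometric data structures, which is the abstraction the abstract describes.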

    Simulation of Abstract Scenarios: Towards Automated Tooling in Criticality Analysis

    While the introduction of automated vehicles to public roads promises various ecological, economical, and societal benefits, reliable verification & validation processes that guarantee the safe operation of automated vehicles are subject to ongoing research. As automated vehicles are safety-critical complex systems operating in an open context, the uncountably infinite set of potentially critical situations renders traditional, distance-based approaches to verification & validation infeasible. Leveraging the power of abstraction, current scenario-based approaches aim at reducing this complexity by elicitation of representative scenario classes while simultaneously shifting significant analysis and testing efforts to virtual environments. In this work we bridge the gap between high-level, abstract scenario specifications and state-of-the-art detailed vehicle and traffic simulators. While the former allow for coverage argumentation through the definition of finite and well-manageable sets of scenario classes, the latter are necessary for an in-depth assessment of the vehicle implementation and its interaction with the physical environment. We present a method and prototypical implementation, based on constraint solving techniques, to generate (sets of) concrete simulation tasks defined in the well-established OpenDRIVE/OpenSCENARIO formats from abstract scenarios specified as Traffic Sequence Charts. Feasibility is demonstrated using a highway parallel overtaking scenario as a running example.
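A minimal sketch of the constraint-solving step (with hypothetical variables and constraints; the actual tool operates on Traffic Sequence Charts and emits OpenSCENARIO): discretize the abstract scenario's parameter ranges and search for an assignment that satisfies every constraint.

```python
import itertools

def concretize(variables, constraints):
    """Return the first concrete parameter assignment, drawn from the
    discretized ranges, that satisfies every constraint; None if none does."""
    names = list(variables)
    for values in itertools.product(*variables.values()):
        scene = dict(zip(names, values))
        if all(check(scene) for check in constraints):
            return scene
    return None

# Hypothetical abstract overtaking scenario: ego must be clearly faster
# than the other vehicle, with a safe initial gap.
variables = {
    "ego_speed": range(20, 41, 5),     # m/s, discretized
    "other_speed": range(20, 41, 5),
    "gap": range(10, 51, 10),          # m
}
constraints = [
    lambda s: s["ego_speed"] > s["other_speed"] + 5,
    lambda s: s["gap"] >= 20,
]
scene = concretize(variables, constraints)
```

A production tool would hand such constraints to a real solver and enumerate diverse solutions rather than taking the first one, but the contract is the same: abstract class in, concrete simulation task out.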

    AutoGraff: towards a computational understanding of graffiti writing and related art forms.

    The aim of this thesis is to develop a system that generates letters and pictures with a style that is immediately recognizable as graffiti art or calligraphy. The proposed system can be used similarly to, and in tight integration with, conventional computer-aided geometric design tools; it can generate synthetic graffiti content for urban environments in games and movies, and guide robotic or fabrication systems that materialise its output with physical drawing media. The thesis is divided into two main parts. The first part describes a set of stroke primitives: building blocks that can be combined to generate different designs that resemble graffiti or calligraphy. These primitives mimic the process typically used to design graffiti letters and exploit well-known principles of motor control to model the way an artist moves when incrementally tracing stylised letter forms. The second part demonstrates how these stroke primitives can be automatically recovered from input geometry defined in vector form, such as the digitised traces of writing made by a user or the glyph outlines in a font. This procedure converts the input geometry into a seed that can be transformed into a variety of calligraphic and graffiti stylisations, which depend on parametric variations of the strokes.
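As a toy illustration (not the thesis's motor-control model), a smooth stroke between anchor points can be approximated by sampling a quadratic Bezier segment; the function name and parameters are assumptions of this sketch:

```python
def bezier_stroke(p0, p1, p2, n=20):
    """Sample a quadratic Bezier segment: a minimal stand-in for a
    smooth stroke primitive traced between anchor points p0 and p2,
    pulled toward the control point p1."""
    pts = []
    for k in range(n + 1):
        t = k / n
        x = (1 - t) ** 2 * p0[0] + 2 * (1 - t) * t * p1[0] + t ** 2 * p2[0]
        y = (1 - t) ** 2 * p0[1] + 2 * (1 - t) * t * p1[1] + t ** 2 * p2[1]
        pts.append((x, y))
    return pts
```

Chaining such segments, and varying their control points parametrically, mimics how a small set of stroke primitives can be recombined into many letter stylisations.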

    Automatic parametric digital design of custom-fit bicycle helmets based on 3D anthropometry and novel clustering algorithm

    Bicycle helmets can provide valuable protection to the wearer's head in the event of a crash. However, the level of protection that helmets offer varies greatly between users for similar impacts. Although these discrepancies can have many causes, several researchers have highlighted the poor fit of helmets experienced by some users as a possible explanation. Poor helmet fit may be attributed to two main causes. First, the helmet could be worn incorrectly, with the helmet either worn back to front or tilted forward or backward; the chin strap could also be unfastened. Second, the helmet sizes and shapes available to the public might not be suitable for the full range of head morphologies observed in the population. Indeed, for some users there could be a large gap and/or pressure points between the inner surfaces of the helmet and the head, or low coverage of the skull with significant unprotected regions of the head. While the poorly informed usage of bicycle helmets is partly rectifiable through education programs, the mismatch between the head and the helmet's inside surfaces primarily relates to the conventional design methods and manufacturing techniques used in the industry today. In addition to the safety concerns described above, poorly fitted helmets can cause significant discomfort and may lead people to cycle infrequently or not at all. Such a reaction is detrimental to the user, since the health benefits of regular cycling are significant. Some organisations and institutions even believe that the risks involved in cycling without a helmet (in non-extreme practice, as opposed to disciplines such as mountain biking) might be outweighed by the health benefits of the consistent physical exercise that the activity provides. However, this is impractical in countries such as Australia where mandatory helmet laws (MHL) are in place.
Improper helmet fit coupled with MHL might be the reason why Australians cycle less than they used to, despite many government initiatives to grow the activity. In summary, current commercially available bicycle helmets suffer from a lack of fit accuracy, are uncomfortable, and can consequently discourage riding, especially in countries like Australia where MHL exist. Therefore, the main purpose of this research has been to develop an innovative method to produce bicycle helmet models that provide a highly accurate fit to the wearer's head. To achieve this goal, a mass customisation (MC) framework was initiated. MC systems combine the low unit costs of mass production with the flexibility of individual customisation. Although MC is defined as the use of both computer-aided design and manufacturing systems to produce custom output, it was decided to focus exclusively, in this study, on the design part of the MC framework for bicycle helmets. More specifically, I tried to answer the following central research question: how can one automatically create commercially ready, custom-fit digital 3D models of bicycle helmets based on 3D anthropometric data? One objective was to create certified design models, since helmets must comply with relevant safety regulations to be sold in a country. Safety standards generally determine the amount of energy a helmet must absorb during a crash, which mostly affects the thickness of its foam liner. Since customisation strongly affects the liner's thickness, special consideration was given to how the automatic process should modify the helmet's shape. Contrary to conventional helmet production techniques, this method was based on state-of-the-art technologies and techniques, such as three-dimensional (3D) anthropometry, supervised and unsupervised machine-learning methods, and fully parametric design models.
Indeed, until now, traditional 1D anthropometric data (e.g., head circumference, head length, and head breadth) have been the primary sources of information used by ergonomists for the design of user-centred products such as helmets. Although these data are simple to use and understand, they only provide univariate measures of key dimensions, and these only partially represent the actual shape characteristics of the head. In contrast, 3D anthropometric data can capture the full shape of a scanned surface, thereby providing meaningful information for the design of properly fitted headgear. However, the interpretation of these data can be complicated by the abundance of information they contain (a 3D head scan can comprise up to several million data points). In recent years, the use of 3D measurements for product design has become more appealing thanks to advances in mesh parameterization, multivariate analyses, and clustering algorithms. Such analyses and algorithms have been adopted in this project. To the author's knowledge, this is the first time these methods have been applied to the design of helmets within a mass customisation framework. As a result, a novel method has been developed to automatically create a complete, certified custom-fit 3D model of a bicycle helmet based on the 3D head scan of a specific individual. Even though the manufacturing of the generated customised helmets is not discussed in detail in this research, it is envisaged that the models could be fabricated using advanced subtractive and additive manufacturing technologies (e.g., numerical control machining and 3D printing), standard moulding techniques, or a combination of both. The proposed design framework was demonstrated using a case study in which customised helmet models were created for Australian cyclists. The computed models were evaluated and validated using objective (digital model) fit assessments.
A significant improvement in fit accuracy was observed compared to commercially available helmet models. More specifically, a set of new techniques and algorithms was developed to successively: (i) clean, repair, and transform a digitized head scan to a registered state; (ii) compare it to the population of interest and categorize it into a predefined group; and (iii) modify the group's generic helmet 3D model to precisely follow the head shape considered. To implement these steps, a 3D anthropometric database comprising 222 Australian cyclists was first established using a cutting-edge handheld white-light 3D scanner. Subsequently, a clustering algorithm, called 3D-HEAD-CLUSTERING, was introduced to categorize individuals with similar head shapes into groups. The algorithm successfully classified 95% of the sample into four groups. A new supervised learning method, named 3D-HEAD-CLASSIFIER, was then developed to classify new customers into one of the four computed groups. Generic 3D helmet models were then generated for each group using the minimum, maximum, and mean shapes of all the participants classified within it. The generic models were designed specifically to comply with the relevant safety standard while accounting for all the possible head shape variations within a group. Furthermore, a novel quantitative method for investigating the fit accuracy of helmets was presented. The creation of the new method was deemed necessary, since the few computational methods available in the literature for fit assessment of user-centred products were inadequate for the complex shapes of today's modern bicycle helmets. The HELMET-FIT-INDEX (HFI) was thus introduced, providing a fit score on a scale from 0 (excessively poor fit) to 100 (perfect fit) for a specific helmet and a specific individual.
In-depth analysis of three commercially available helmets and 125 participants demonstrated a consistent correlation between subjective assessments of helmet fit and the index. The HFI provided a detailed understanding of helmet efficiency regarding fit. For example, it was shown that females and Asians experience lower helmet fit accuracy than males and Caucasians, respectively. The index was used during the MC design process to validate the high fit accuracy of the generated customised helmet models. As far as the author is aware, HFI is the first method to successfully evaluate users' perception of fit using computational analysis. The user-centred framework presented in this work for the customisation of bicycle helmet models has proven to be a valuable alternative to current standard design processes. With the new approach presented in this research, the fit accuracy of bicycle helmets is optimised, improving both the comfort and the safety characteristics of the headgear. The method is easily adjustable to other helmet types (e.g., motorcycle, rock climbing, football, military, and construction), and the author believes that similar MC frameworks could readily be developed for other user-centred products such as shoes, glasses, and gloves. Future work should first emphasise the fabrication side of the proposed MC system and describe how customised helmet models can be accommodated in a global supply chain. Other research projects could focus on adapting the proposed customisation framework to other user-centred products.
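The grouping step can be illustrated with a plain k-means sketch. This is a hypothetical stand-in for the 3D-HEAD-CLUSTERING algorithm, with head scans reduced to simple 2D feature vectors and a deterministic initialization for reproducibility:

```python
import math

def kmeans(points, k, iters=20):
    """Plain k-means: group feature vectors (here, stand-ins for
    registered head-shape descriptors) into k size groups."""
    centers = [points[i] for i in range(k)]       # deterministic init
    groups = []
    for _ in range(iters):
        # assignment step: each point joins its nearest center's group
        groups = [[] for _ in range(k)]
        for p in points:
            j = min(range(k), key=lambda c: math.dist(p, centers[c]))
            groups[j].append(p)
        # update step: move each center to its group's mean
        centers = [tuple(sum(x) / len(g) for x in zip(*g)) if g else centers[j]
                   for j, g in enumerate(groups)]
    return centers, groups
```

Each resulting group would then receive one generic helmet model sized to cover the shape variation of its members, as the abstract describes for the four computed groups.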

    Extreme Rainfall Events: Incorporating Temporal and Spatial Dependence to Improve Statistical Models

    Get PDF
    The proper design of protective measures against floods caused by heavy precipitation has long been a question of interest in many fields of study. A crucial component of such design is the analysis of extreme historical rainfall using Extreme Value Theory (EVT) methods, which provide information about the frequency and magnitude of possible future events. Characterizing an entire basin or geographical catchment requires extending univariate EVT methods to capture the spatial variability of the data. This extension requires that the similarity of the data from nearby stations be included in the model, resulting in more efficient use of the data. This dissertation focuses on statistical models that incorporate spatial dependence for modeling annual rainfall maxima. Additionally, we present ways of adapting the models to capture the dependence between rainfall at different time scales. These models are used to pursue two aims. The first is to improve our understanding of the mechanisms that lead to dependence in extreme rainfall. The second is to improve the resulting estimates when the dependence is incorporated into the models. Two published studies make up the main findings of this dissertation. The models used in both studies involve Brown-Resnick max-stable processes, allowing them to explicitly account for dependence in either the temporal or the spatial domain. In both cases, these models are compared to a model that ignores the dependence, allowing us to determine the impact of the dependence in each situation. Contributions to three other studies using the concept of dependence are also summarized. In the first study, we assess the impact of including the dependence between rainfall series of different aggregation durations when estimating Intensity-Duration-Frequency curves. This assessment was carried out in a case study for the Wupper catchment in Germany. 
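The univariate block-maxima step underlying these models fits a Generalized Extreme Value (GEV) distribution to annual maxima and reads return levels from its quantiles. A minimal sketch with `scipy.stats.genextreme`, using synthetic data in place of station records (the parameter values are illustrative only):

```python
import numpy as np
from scipy.stats import genextreme

# Synthetic annual rainfall maxima (mm); real input would be station records.
rng = np.random.default_rng(42)
annual_maxima = genextreme.rvs(c=-0.1, loc=40.0, scale=10.0,
                               size=80, random_state=rng)

# Fit the GEV distribution. Note that scipy's shape parameter c is the
# negative of the xi commonly used in the EVT literature.
c_hat, loc_hat, scale_hat = genextreme.fit(annual_maxima)

# 100-year return level: the value exceeded with probability 1/100 each year,
# i.e. the quantile at non-exceedance probability 0.99.
rl_100 = genextreme.ppf(1 - 1 / 100, c_hat, loc=loc_hat, scale=scale_hat)
```

Spatial max-stable models extend exactly this marginal step by coupling the GEV margins at nearby stations through a dependence structure.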
This study found that including the dependence in the model had a positive effect on prediction accuracy when focusing on rainfall with short durations (d <= 10 h) and large probabilities of non-exceedance. We therefore recommend using max-stable processes when a study focuses on short-duration rainfall. In the second study, we investigate how the spatial dependence of extreme rainfall in Berlin-Brandenburg changes seasonally and how this change could affect the estimates from a max-stable model that includes this dependence. The seasonality was determined by estimating the parameters of a semi-annual block maxima model for summer and winter. The results from this study showed that, for the summer maxima, the dependence structure was adequately captured by an isotropic Brown-Resnick model. By contrast, the same model performed poorly for the winter maxima, suggesting that a change in the assumptions is needed when dealing with winter events, which are typically frontal or stratiform in this region. These results show that accounting for the meteorological properties of the rainfall-generating processes can provide useful information for the design of the models. Overall, our findings show that including meteorological knowledge in statistical models can improve their resulting estimates. In particular, we find that, under certain conditions, using statistical dependence to incorporate knowledge about the differences in the temporal and spatial scales of rainfall-generating mechanisms can have a positive impact on the models.
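The strength of spatial dependence in a Brown-Resnick model is commonly summarized by the pairwise extremal coefficient theta(h) = 2 * Phi(sqrt(gamma(h)) / 2), where gamma is the process variogram and Phi the standard normal CDF; theta runs from 1 (complete dependence) to 2 (independence). A sketch assuming a power variogram gamma(h) = (h / scale) ** alpha, with illustrative parameter values not taken from the dissertation:

```python
import numpy as np
from scipy.stats import norm

def extremal_coefficient(h, scale=20.0, alpha=1.0):
    """Pairwise extremal coefficient theta(h) of a Brown-Resnick process
    with power variogram gamma(h) = (h / scale) ** alpha.

    theta = 1: maxima at the two sites are completely dependent;
    theta = 2: maxima are independent. Distances h in the variogram's units.
    """
    gamma = (np.asarray(h, dtype=float) / scale) ** alpha
    return 2.0 * norm.cdf(np.sqrt(gamma) / 2.0)
```

Comparing an empirical theta(h) against this curve is one way to judge whether an isotropic Brown-Resnick model captures the dependence, which is essentially the summer-versus-winter diagnostic described above.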