1,087 research outputs found
Time-varying model identification for time-frequency feature extraction from EEG data
A novel modelling scheme that can be used to estimate and track time-varying properties of nonstationary signals is investigated. This scheme is based on a class of time-varying AutoRegressive with eXogenous input (ARX) models in which the time-varying parameters are represented by multi-wavelet basis functions. The orthogonal least squares (OLS) algorithm is then applied to refine the parameter estimates of the time-varying ARX model. The main feature of the multi-wavelet approach is that it can track smooth trends while also capturing sharp changes in the time-varying process parameters. Simulation studies and applications to real EEG data show that the proposed algorithm can provide important transient information on the inherent dynamics of nonstationary processes.
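The core trick can be sketched in a few lines: expanding each time-varying parameter over a fixed basis turns the time-varying identification problem into an ordinary time-invariant least-squares problem. The sketch below uses a plain polynomial basis and full least squares as a stand-in for the paper's multi-wavelet basis and OLS term selection; the system, coefficients, and basis order are invented for the example.

```python
import numpy as np

rng = np.random.default_rng(0)
T = 400
t = np.arange(T) / T

# Invented slowly varying TVARX(1) system: y[k] = a(t)*y[k-1] + b(t)*u[k] + e[k]
a_true = 0.5 + 0.3 * np.sin(2 * np.pi * t)   # time-varying AR coefficient
b_true = 1.0 - 0.5 * t                        # time-varying input gain
u = rng.standard_normal(T)
y = np.zeros(T)
for k in range(1, T):
    y[k] = a_true[k] * y[k - 1] + b_true[k] * u[k] + 0.05 * rng.standard_normal()

# Expand each time-varying parameter over a fixed basis; the estimation
# problem then becomes linear in the (constant) expansion coefficients.
order = 6
B = np.vstack([t**p for p in range(order)]).T          # (T, order) basis matrix
X = np.hstack([y[:-1, None] * B[1:], u[1:, None] * B[1:]])
theta, *_ = np.linalg.lstsq(X, y[1:], rcond=None)

a_hat = B @ theta[:order]     # reconstructed parameter trajectories
b_hat = B @ theta[order:]
```

In the paper itself the basis is a multi-wavelet family and OLS selects a sparse subset of basis terms rather than fitting all of them, which is what allows both smooth and sharp parameter variations to be captured.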
UQ and AI: data fusion, inverse identification, and multiscale uncertainty propagation in aerospace components
A key requirement for engineering designs is that they offer good performance across a range of uncertain conditions while exhibiting an admissibly low probability of failure. Meeting this requirement means taking account of the effect of the uncertainties associated with a candidate design. Uncertainty Quantification (UQ) methods are statistical methods that quantify the effect of the uncertainties inherent in a system on its performance. This thesis expands the envelope of UQ methods for the design of aerospace components, supporting the integration of UQ methods in product development by addressing four industrial challenges.
Firstly, a method for propagating uncertainty through computational models in a hierarchy of scales is described, based on probabilistic equivalence and Non-Intrusive Polynomial Chaos (NIPC). This problem is relevant to the design of aerospace components because the computational models used to evaluate candidate designs are typically multiscale. This method was then extended to develop a formulation for inverse identification, in which the probability distributions of the material properties of a coupon are deduced from measurements of its response. We demonstrate how probabilistic equivalence and the Maximum Entropy Principle (MEP) may be used to combine simulation data with scarce experimental data, with the intention of making this stage of product design less expensive and time-consuming.
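The non-intrusive ingredient can be illustrated with a tiny regression-based polynomial chaos expansion: only input/output samples of the model are needed, and the output moments follow directly from the expansion coefficients. The scalar model, sample size, and truncation order below are invented for the sketch.

```python
import math

import numpy as np
from numpy.polynomial.hermite_e import hermevander

rng = np.random.default_rng(1)

# Invented scalar model of a standard-normal input xi (itself a short PCE,
# so the regression can recover it exactly)
def model(xi):
    return 1.0 + 0.5 * xi + 0.2 * (xi**2 - 1.0)

xi = rng.standard_normal(200)        # non-intrusive: sample, then evaluate
y = model(xi)

# Regress the outputs on probabilists' Hermite polynomials He_0..He_P
P = 3
Psi = hermevander(xi, P)             # columns He_0(xi) .. He_P(xi)
c, *_ = np.linalg.lstsq(Psi, y, rcond=None)

# Moments follow from orthogonality: E[He_k] = 0 and E[He_k^2] = k! for k >= 1
mean_pce = c[0]
var_pce = sum(math.factorial(k) * c[k] ** 2 for k in range(1, P + 1))
```

For this toy model the recovered mean is 1.0 and the variance 0.25 + 2(0.2)^2 = 0.33. Propagating through a hierarchy of scales chains such expansions, with distributions matched at each interface in the spirit of probabilistic equivalence.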
The third contribution of this thesis is two novel meta-modelling strategies that promote wider exploration of the design space during the conceptual design phase. Design Space Exploration (DSE) in this phase is crucial, because decisions made at the early, conceptual stages of an aircraft design can restrict the range of alternative designs available at later stages of the design process, even though only limited quantitative knowledge of the interaction between requirements is available at this stage. A histogram interpolation algorithm is presented that allows the designer to interactively explore the design space with a model-free formulation, while a meta-model based on Knowledge-Based Neural Networks (KBaNNs) is proposed in which the outputs of a high-level, inexpensive computer code are informed by the outputs of a neural network, thereby addressing the criticism that neural networks are purely data-driven and operate as black boxes.
The final challenge addressed by this thesis is how to iteratively improve a meta-model by expanding the dataset used to train it. Given the reliance of UQ methods on meta-models, this is an important challenge. This thesis proposes an adaptive learning algorithm for Support Vector Machine (SVM) meta-models, which are used to approximate an unknown function. In particular, we apply the adaptive learning algorithm to test cases in reliability analysis.
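The adaptive idea can be sketched as follows, with an RBF kernel interpolator standing in for the SVM and an invented limit-state function: each iteration evaluates the candidate point about which the current surrogate is most ambiguous (closest to its decision boundary), which concentrates samples near the limit state, exactly where a reliability analysis needs accuracy.

```python
import numpy as np

rng = np.random.default_rng(2)

# Invented limit-state function for a reliability test case: failure if g < 0
def g(x):
    return 4.0 - x[:, 0] ** 2 - x[:, 1]

# RBF kernel interpolator used here as a lightweight stand-in for the SVM
def fit(X, y, gamma=0.5, lam=1e-6):
    K = np.exp(-gamma * ((X[:, None, :] - X[None, :, :]) ** 2).sum(-1))
    return np.linalg.solve(K + lam * np.eye(len(X)), y)

def score(X, Xtr, alpha, gamma=0.5):
    K = np.exp(-gamma * ((X[:, None, :] - Xtr[None, :, :]) ** 2).sum(-1))
    return K @ alpha

pool = rng.uniform(-3, 3, size=(500, 2))   # candidate designs
idx = list(range(20))                      # small initial training design
for _ in range(30):                        # adaptive enrichment loop
    Xtr = pool[idx]
    alpha = fit(Xtr, np.sign(g(Xtr)))
    ambiguity = np.abs(score(pool, Xtr, alpha))  # distance to surrogate boundary
    ambiguity[idx] = np.inf                      # skip already-evaluated points
    idx.append(int(np.argmin(ambiguity)))        # add the most ambiguous point

Xtr = pool[idx]
alpha = fit(Xtr, np.sign(g(Xtr)))
accuracy = np.mean(np.sign(score(pool, Xtr, alpha)) == np.sign(g(pool)))
```

A production version would replace the interpolator with a trained SVM and use a proper learning criterion, but the enrichment loop has the same shape.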
Water waves over a rough bottom in the shallow water regime
This is a study of the Euler equations for free surface water waves in the
case of varying bathymetry, considering the problem in the shallow water
scaling regime. In the case of rapidly varying periodic bottom boundaries this
is a problem of homogenization theory. In this setting we derive a new model
system of equations, consisting of the classical shallow water equations
coupled with nonlocal evolution equations for a periodic corrector term. We
also exhibit a new resonance phenomenon between surface waves and a periodic
bottom. This resonance, which gives rise to secular growth of surface wave
patterns, can be viewed as a nonlinear generalization of the classical Bragg
resonance. We justify the derivation of our model with a rigorous mathematical
analysis of the scaling limit and the resulting error terms. The principal
issue is that the shallow water limit and the homogenization process must be
performed simultaneously. Our model equations and the error analysis are valid
for both the two- and the three-dimensional physical problems.
Comment: Revised version, to appear in Annales de l'Institut Henri Poincaré.
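For reference, the classical shallow water system to which the nonlocal corrector equations are coupled can be written, in standard notation with $\eta$ the surface elevation, $u$ the depth-averaged velocity, $h$ the mean depth, and $g$ gravity, as

$$
\partial_t \eta + \nabla \cdot \big( (h + \eta)\, u \big) = 0, \qquad
\partial_t u + (u \cdot \nabla) u + g\, \nabla \eta = 0 .
$$

The periodic corrector term of the paper enters on top of this system; the notation here is assumed, not taken from the article.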
Sparse Regression for the Breathing Mode Instability: Extracting Governing Equations from Hall Effect Thrusters Simulation Data
The development of efficient and reliable electric space propulsion systems relies on accurate modeling and identification of their underlying dynamics. Traditional approaches to model identification often involve intricate physical analysis or making extensive assumptions, limiting their applicability and scalability. In this thesis an algorithm for accurate modeling and identification of electric space propulsion systems is presented. The algorithm, based on sparse regression and model parsimony, allows automatic data-driven identification of models for space plasma thrusters. It incorporates statistical techniques, physical constraints, and trajectory-based information for robust system identification. The algorithm is demonstrated using PIC/fluid simulation data from a Hall Effect Thruster for several operating points. Models of varying complexity are obtained, focusing on physical explainability and coefficient variation with operating point. The resulting equations for average ion and neutral densities align well with existing models. Point-wise density models exhibit location dependency in the discharge chamber. The algorithm showcases its general applicability to other electric propulsion systems. This work has been carried out as part of the ZARATHUSTRA project, which has received funding from the European Research Council (ERC) under the European Union’s Horizon 2020 research and innovation programme (Grant Agreement No. 950466).
Máster en Ingeniería Espacial
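The core of such an algorithm, sequentially thresholded least squares over a library of candidate terms as in SINDy-style sparse regression, can be sketched as follows; the data, candidate terms, and threshold are invented for the example rather than taken from the PIC/fluid simulations.

```python
import numpy as np

rng = np.random.default_rng(3)

# Invented stand-in data: samples of two densities and a "measured" derivative
n = rng.uniform(0.5, 2.0, 300)        # e.g. neutral density
m = rng.uniform(0.5, 2.0, 300)        # e.g. ion density
dndt = 1.2 * n - 0.8 * n * m          # true dynamics hidden in the library

# Library of physically plausible candidate terms
Theta = np.column_stack([n, m, n * m, n ** 2, m ** 2])
names = ["n", "m", "n*m", "n^2", "m^2"]

# Sequentially thresholded least squares: fit, prune weak terms, refit
xi, *_ = np.linalg.lstsq(Theta, dndt, rcond=None)
for _ in range(10):
    small = np.abs(xi) < 0.1          # prune terms with small coefficients
    xi[small] = 0.0
    keep = ~small
    sol, *_ = np.linalg.lstsq(Theta[:, keep], dndt, rcond=None)
    xi[keep] = sol

model = {name: coef for name, coef in zip(names, xi) if coef != 0.0}
# model -> {"n": 1.2, "n*m": -0.8} up to numerical precision
```

The thesis builds on this skeleton with statistical coefficient selection, physical constraints, and trajectory-based validation to keep the identified models parsimonious and explainable.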
Metamodel-based uncertainty quantification for the mechanical behavior of braided composites
The main design requirement for any high-performance structure is minimal dead weight. Producing lighter structures for the aerospace and automotive industries directly leads to fuel efficiency and, hence, cost reduction. For wind energy, lighter wings allow larger rotor blades and, consequently, better performance. Prosthetic implants for missing body parts and athletic equipment such as rackets and sticks should also be lightweight for augmented functionality. Additional demands, depending on the application, often include improved fatigue strength and damage tolerance, crashworthiness, and temperature and corrosion resistance. Fiber-reinforced composite materials lie at the intersection of all the above requirements, since they offer competitive stiffness and ultimate strength at much lower weight than metals, as well as high optimization and design potential due to their versatility. Braided composites are a special category with continuous fiber bundles interlaced around a preform. The automated braiding manufacturing process allows simultaneous material-structure assembly and, therefore, high-rate production with minimal material waste. The multi-step material processes and the intrinsic heterogeneity are the basic origins of the variability observed during mechanical characterization and operation of composite end-products. Conservative safety factors are applied during the design process to account for uncertainties, even though stochastic modeling approaches lead to more rational estimations of structural safety and reliability. Such approaches require statistical modeling of the uncertain parameters, which is quite expensive to perform experimentally. A robust virtual uncertainty quantification framework is presented, able to integrate material and geometric uncertainties of different nature and to statistically assess the response variability of braided composites in terms of effective properties.
Information-passing multiscale algorithms are employed for high-fidelity predictions of stiffness and strength. In order to bypass the numerical cost of the repeated multiscale model evaluations required for the probabilistic approach, smart and efficient solutions should be applied. Surrogate models are thus trained to map manifolds at different scales and eventually substitute the finite element models. The use of machine learning is viable for uncertainty quantification, optimization, and reliability applications of textile materials, but not straightforward for failure responses with complex response surfaces. Novel techniques based on variable-fidelity data and hybrid surrogate models are also integrated. Uncertain parameters are classified according to their significance to the corresponding response via variance-based global sensitivity analysis procedures. Quantification of the random properties in terms of mean and variance can be achieved by inverse approaches based on Bayesian inference. All stochastic and machine learning methods included in the framework are non-intrusive and data-driven, to ensure direct extensions towards more load cases and different materials. Moreover, experimental validation of the adopted multiscale models is presented, and an application of stochastic recreation of random textile yarn distortions based on computed tomography data is demonstrated.
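As a minimal illustration of the variance-based sensitivity step, the sketch below estimates first-order Sobol indices with a Saltelli-style pick-freeze estimator for an invented two-input response (a stand-in for, say, an effective property as a function of two uncertain material or geometry parameters).

```python
import numpy as np

rng = np.random.default_rng(4)

# Invented response standing in for an effective property from the model chain
def response(x):
    return 2.0 * x[:, 0] + 0.5 * x[:, 1] ** 2

N = 100_000
A = rng.uniform(0.0, 1.0, (N, 2))     # two independent base sample matrices
B = rng.uniform(0.0, 1.0, (N, 2))
fA, fB = response(A), response(B)
var = np.var(np.concatenate([fA, fB]))

S = []
for i in range(2):
    ABi = A.copy()
    ABi[:, i] = B[:, i]               # "pick-freeze": swap in column i from B
    fABi = response(ABi)
    S.append(np.mean(fB * (fABi - fA)) / var)   # first-order index S_i
```

For this response the exact values are S1 = (1/3)/(1/3 + 1/45) = 0.9375 and S2 = 0.0625, so the estimator should report that the first input dominates; in the framework the same indices are used to decide which uncertain parameters merit detailed statistical modeling.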
Hybrid Approach in Microscale Transport Phenomena: Application to Biodiesel Synthesis in Micro-reactors
A hybrid engineering approach to the study of transport phenomena, based on the
synergy among computational, analytical, and experimental methodologies is
reviewed. The focus of the chapter is on fundamental analysis and proof of concept
developments in the use of nano- and micro-technologies for energy efficiency and
heat and mass transfer enhancement applications. The hybrid approach described
herein combines improved lumped-differential modeling, hybrid numerical-analytical solution methods, mixed symbolic-numerical computations, and
advanced experimental techniques for micro-scale transport phenomena. An
application dealing with micro-reactors for continuous synthesis of biodiesel is
selected to demonstrate the instrumental role of the hybrid approach in achieving
improved design and enhanced performance.
Self-Evaluation Applied Mathematics 2003-2008 University of Twente
This report contains the self-study for the research assessment of the Department of Applied Mathematics (AM) of the Faculty of Electrical Engineering, Mathematics and Computer Science (EEMCS) at the University of Twente (UT). The report provides the information for the Research Assessment Committee for Applied Mathematics, dealing with mathematical sciences at the three universities of technology in the Netherlands. It describes the state of affairs pertaining to the period 1 January 2003 to 31 December 2008.
Statistical State Dynamics: a new perspective on turbulence in shear flow
Traditionally, single realizations of the turbulent state have been the
object of study in shear flow turbulence. When a statistical quantity was
needed it was obtained from a spatial, temporal or ensemble average of sample
realizations of the turbulence. However, there are important advantages to
studying the dynamics of the statistical state (the SSD) directly. In highly
chaotic systems statistical quantities are often the most useful and the
advantage of obtaining these statistics directly from a state variable is
obvious. Moreover, quantities such as the probability density function (pdf)
are often difficult to obtain accurately by sampling state trajectories even if
the pdf is stationary. In the event that the pdf is time dependent, solving
directly for the pdf as a state variable is the only alternative. However,
perhaps the greatest advantage of the SSD approach is conceptual: adopting this
perspective reveals directly the essential cooperative mechanisms among the
disparate spatial and temporal scales that underlie the turbulent state. While
these cooperative mechanisms have distinct manifestations in the dynamics of
turbulence realizations, neither the mechanisms nor the phenomena associated
with them are as amenable to analysis through the study of realizations as
they are through the study of the associated SSD. In this
review a selection of example problems in the turbulence of planetary and
laboratory flows is examined using recently developed SSD analysis methods in
order to illustrate the utility of this approach to the study of turbulence in
shear flow.
Comment: 27 pages, 18 figures. To appear in the book "Zonal jets: Phenomenology, genesis, physics", Cambridge University Press, edited by B. Galperin and P. L. Read.
Nervous–system–wise Functional Estimation of Directed Brain–Heart Interplay through Microstate Occurrences
Background: The quantification of functional brain–heart interplay (BHI) through analysis of the dynamics of the central and autonomic nervous systems provides effective biomarkers for cognitive, emotional, and autonomic state changes. Several computational models have been proposed to estimate BHI, focusing on a single sensor, brain region, or frequency band. However, no current model provides a directional estimation of such interplay at the organ level.
Objective: This study proposes an analysis framework for BHI estimation that quantifies the directional information flow between whole-brain and heartbeat dynamics.
Methods: System-wise directed functional estimation is performed through an ad hoc symbolic transfer entropy implementation, which leverages EEG-derived microstate series and a partition of heart rate variability series. The proposed framework is validated on two experimental datasets: the first investigates cognitive workload induced by mental arithmetic, and the second focuses on an autonomic maneuver using a cold pressor test (CPT).
Results: The experimental results highlight a significant bidirectional increase in BHI during cognitive workload with respect to the preceding resting phase, and a higher descending interplay during the CPT compared with the preceding rest and following recovery phases. These changes are not detected by the intrinsic self-entropy of isolated cortical and heartbeat dynamics.
Conclusion: This study corroborates the literature on the BHI phenomenon under these experimental conditions, and the new perspective provides novel insights from an organ-level viewpoint.
Significance: A system-wise perspective on the BHI phenomenon may provide new insights into physiological and pathological processes that may not be completely understood at a lower level/scale of analysis.
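A toy version of the directional estimator, a plug-in symbolic transfer entropy with history length 1 applied to invented symbol streams rather than actual microstate and HRV series, shows how asymmetric coupling appears in the estimates:

```python
import numpy as np
from collections import Counter

def transfer_entropy(x, y):
    """Plug-in transfer entropy from symbol series x to y, history 1, in bits."""
    n = len(y) - 1
    trip = Counter(zip(y[1:], y[:-1], x[:-1]))      # (y_next, y_now, x_now)
    pair_yx = Counter(zip(y[:-1], x[:-1]))
    pair_yy = Counter(zip(y[1:], y[:-1]))
    hist_y = Counter(y[:-1])
    te = 0.0
    for (y1, y0, x0), c in trip.items():
        p_cond_joint = c / pair_yx[(y0, x0)]         # p(y1 | y0, x0)
        p_cond_self = pair_yy[(y1, y0)] / hist_y[y0] # p(y1 | y0)
        te += (c / n) * np.log2(p_cond_joint / p_cond_self)
    return te

rng = np.random.default_rng(5)
# Toy coupled streams: y copies x with lag 1, with 10% random symbols mixed in
x = rng.integers(0, 4, 2000)
noise = rng.random(2000) < 0.1
y = np.where(noise, rng.integers(0, 4, 2000), np.roll(x, 1))

te_xy = transfer_entropy(x.tolist(), y.tolist())
te_yx = transfer_entropy(y.tolist(), x.tolist())
```

Because y copies x with a one-step lag, the x-to-y transfer entropy is large while the reverse direction stays near the small positive finite-sample bias; the study's estimator adds the symbolization of EEG microstates and HRV partitions on top of this kind of directed measure.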