
    On the development and analysis of coupled surface-subsurface models of catchments. Part 3. Analytical solutions and scaling laws

    The objective of this three-part work is the formulation and rigorous analysis of a number of reduced mathematical models that are nevertheless capable of describing the hydrology at the scale of a river basin (i.e. catchment). Coupled effects of surface and subsurface flows are considered. In this third part, we focus on the development of analytical solutions and scaling laws for a benchmark catchment model describing the river flow (runoff) generated during a single rainfall event. We demonstrate that for catchments characterised by a shallow impenetrable bedrock, the shallow-water approximation allows a reduction of the governing formulation to a coupled system of one-dimensional time-dependent equations for the surface and subsurface flows. Asymptotic analysis is used to derive semi-analytical solutions of the model, and we provide simple asymptotic scaling laws describing the formation of the peak flow. These scaling laws can be used as an analytical benchmark for assessing the validity of other physical, conceptual, or statistical models of catchments.

    Rectification properties of conically shaped nanopores: consequences of miniaturization

    Nanopores have attracted a great deal of scientific interest as templates for biological sensors and as model systems for understanding transport phenomena at the nanoscale. The experimental and theoretical analysis of nanopores has so far focused on understanding the effect of the pore opening diameter on ionic transport. In this article we present systematic studies on the dependence of ion transport properties on the pore length. Particular attention was given to the effect of ion current rectification exhibited by conically shaped nanopores with homogeneous surface charges. We found that reducing the length of conically shaped nanopores significantly lowered their ability to rectify ion current. However, the rectification properties of short pores can be enhanced by tailoring the surface charge and the shape of the narrow opening. Furthermore, we analyze the relationship between rectification behavior and ion selectivity for different pore lengths. All simulations were performed using MsSimPore, a software package for solving the Poisson-Nernst-Planck (PNP) equations. It is based on a novel finite element solver and allows for simulations up to surface charge densities of -2 e/nm^2. MsSimPore is based on a 1D reduction of the PNP model, but allows for a direct treatment of the pore together with bulk electrolyte reservoirs, a feature previously available only in higher-dimensional models. MsSimPore includes these reservoirs in the calculations, a property that is especially important for short pores, where the ionic concentrations and the electric potential vary strongly inside the pore as well as in the regions next to the pore entrance.
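
    The degree of rectification discussed above is commonly quantified as the ratio of current magnitudes at opposite bias voltages. The sketch below illustrates that measure with hypothetical I-V values; it is not MsSimPore output, and the numbers are purely illustrative.

```python
import numpy as np

# Hypothetical I-V curve for a conical nanopore (illustrative values only).
voltage = np.array([-1.0, -0.5, 0.5, 1.0])    # V
current = np.array([-0.4, -0.15, 0.45, 1.6])  # nA

def rectification_ratio(V, I, v=1.0):
    """|I(+v)| / |I(-v)|: a value above 1 indicates rectifying behavior."""
    i_pos = I[np.argmin(np.abs(V - v))]  # current at +v
    i_neg = I[np.argmin(np.abs(V + v))]  # current at -v
    return abs(i_pos) / abs(i_neg)

print(rectification_ratio(voltage, current))  # 4.0 for these values
```

    Shortening the pore, in the paper's findings, drives this ratio towards 1 (ohmic behavior) unless the tip geometry or surface charge is tailored.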

    ENSO dynamics in current climate models: an investigation using nonlinear dimensionality reduction

    Linear dimensionality reduction techniques, notably principal component analysis, are widely used in climate data analysis as a means to aid in the interpretation of datasets of high dimensionality. These linear methods may not be appropriate for the analysis of data arising from nonlinear processes occurring in the climate system. Numerous techniques for nonlinear dimensionality reduction have been developed recently that may provide a potentially useful tool for the identification of low-dimensional manifolds in climate data sets arising from nonlinear dynamics. Here, we apply Isomap, one such technique, to the study of El Niño/Southern Oscillation variability in tropical Pacific sea surface temperatures, comparing observational data with simulations from a number of current coupled atmosphere-ocean general circulation models. We use Isomap to examine El Niño variability in the different datasets and assess the suitability of the Isomap approach for climate data analysis. We conclude that, for the application presented here, analysis using Isomap does not provide additional information beyond that already provided by principal component analysis.
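
    The PCA-versus-Isomap comparison at the heart of this abstract can be sketched with scikit-learn. The data below is a synthetic nonlinear manifold standing in for gridded SST anomaly fields; the method names are real, the dataset and parameter choices are assumptions.

```python
import numpy as np
from sklearn.decomposition import PCA
from sklearn.manifold import Isomap

# Synthetic 1-D nonlinear manifold embedded in 3-D, as a stand-in for
# high-dimensional SST anomaly fields (illustrative, not climate data).
rng = np.random.default_rng(0)
t = rng.uniform(0, 3 * np.pi, 500)
X = np.column_stack([np.cos(t), np.sin(t), t / 3])
X += 0.05 * rng.normal(size=X.shape)

# Linear reduction: PCA projects onto the leading variance directions.
pca = PCA(n_components=2).fit(X)
emb_pca = pca.transform(X)

# Nonlinear reduction: Isomap preserves geodesic distances on the manifold.
emb_iso = Isomap(n_neighbors=10, n_components=2).fit_transform(X)

print("PCA variance ratio:", np.round(pca.explained_variance_ratio_, 3))
print("Isomap embedding shape:", emb_iso.shape)
```

    The paper's conclusion is that, for ENSO SST variability, the extra machinery of Isomap recovers essentially the same low-dimensional structure as PCA.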

    Adaptive surrogates of crashworthiness models for multi-purpose engineering analyses accounting for uncertainty

    © 2022 Elsevier. This manuscript version is made available under the CC-BY-NC-ND 4.0 license http://creativecommons.org/licenses/by-nc-nd/4.0/. Uncertainty Quantification (UQ) is a booming discipline for complex computational models, based on the analysis of robustness, reliability, and credibility. UQ analysis for nonlinear crash models with high-dimensional outputs presents important challenges. In crashworthiness, nonlinear structural behaviours with multiple hidden modes require expensive models (18 h for a single run). Surrogate models (metamodels) allow substituting the full-order model, introducing a response surface fitted to a reduced training set of numerical experiments. Moreover, uncertain inputs and a large number of degrees of freedom result in high-dimensional problems, creating a bottleneck that limits the computational efficiency of the metamodels. Kernel Principal Component Analysis (kPCA) is a dimensionality reduction technique for nonlinear problems, with the advantage of capturing the most relevant information from the response and improving the efficiency of the metamodel, with the aim of computing the minimum number of samples with the full-order model. The proposed methodology is tested on a practical industrial problem arising from the automotive industry. This work is partially funded by Generalitat de Catalunya (Grant Number 1278 SGR 2017-2019 and Pla de Doctorats Industrials 2017 DI 058) and Ministerio de Economía y Empresa and Ministerio de Ciencia, Innovación y Universidades (Grant Number DPI2017-85139-C2-2-R).
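
    The surrogate-plus-kPCA workflow described above can be sketched as follows: compress the high-dimensional crash outputs with kernel PCA, fit one cheap surrogate per latent coordinate, and map predictions back to the full output space. The data is synthetic, and the Gaussian-process surrogate is a stand-in assumption, not the paper's specific metamodel.

```python
import numpy as np
from sklearn.decomposition import KernelPCA
from sklearn.gaussian_process import GaussianProcessRegressor

# Synthetic stand-in for expensive crash-model runs: 40 training samples,
# 3 uncertain inputs, each run producing a 1000-component response vector.
rng = np.random.default_rng(1)
X_train = rng.uniform(-1, 1, size=(40, 3))
Y_train = np.sin(X_train @ rng.normal(size=(3, 1000)))

# Compress the high-dimensional outputs with kPCA (RBF kernel), keeping
# the inverse map so latent predictions can be lifted back to full space.
kpca = KernelPCA(n_components=5, kernel="rbf", fit_inverse_transform=True)
Z_train = kpca.fit_transform(Y_train)

# One cheap surrogate per latent coordinate (GP chosen for illustration).
surrogates = [GaussianProcessRegressor().fit(X_train, Z_train[:, i])
              for i in range(Z_train.shape[1])]

# Predict a new run without the full-order model.
x_new = rng.uniform(-1, 1, size=(1, 3))
z_new = np.column_stack([s.predict(x_new) for s in surrogates])
y_new = kpca.inverse_transform(z_new)
print(y_new.shape)  # (1, 1000): full-dimensional response recovered
```

    Replacing an 18-hour crash simulation with this pipeline is what makes large UQ sampling campaigns tractable.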

    Nonlinear Dimensionality Reduction Methods in Climate Data Analysis

    Linear dimensionality reduction techniques, notably principal component analysis, are widely used in climate data analysis as a means to aid in the interpretation of datasets of high dimensionality. These linear methods may not be appropriate for the analysis of data arising from nonlinear processes occurring in the climate system. Numerous techniques for nonlinear dimensionality reduction have been developed recently that may provide a potentially useful tool for the identification of low-dimensional manifolds in climate data sets arising from nonlinear dynamics. In this thesis I apply three such techniques to the study of El Niño/Southern Oscillation variability in tropical Pacific sea surface temperatures and thermocline depth, comparing observational data with simulations from coupled atmosphere-ocean general circulation models from the CMIP3 multi-model ensemble. The three methods used here are a nonlinear principal component analysis (NLPCA) approach based on neural networks, the Isomap isometric mapping algorithm, and Hessian locally linear embedding. I use these three methods to examine El Niño variability in the different data sets and assess the suitability of these nonlinear dimensionality reduction approaches for climate data analysis. I conclude that although, for the application presented here, analysis using NLPCA, Isomap and Hessian locally linear embedding does not provide additional information beyond that already provided by principal component analysis, these methods are effective tools for exploratory data analysis. Comment: 273 pages, 76 figures; University of Bristol Ph.D. thesis; version with high-resolution figures available from http://www.skybluetrades.net/thesis/ian-ross-thesis.pdf (52 MB download).
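
    Of the three methods named above, Hessian locally linear embedding is the least familiar; scikit-learn exposes it through `LocallyLinearEmbedding` with `method="hessian"`. The sketch below runs it on a synthetic manifold (the thesis's actual inputs were SST and thermocline fields); the data and parameters are assumptions.

```python
import numpy as np
from sklearn.manifold import LocallyLinearEmbedding

# Synthetic spiral-like manifold in 3-D, a stand-in for climate fields.
rng = np.random.default_rng(3)
t = rng.uniform(0, 3 * np.pi, 400)
X = np.column_stack([t * np.cos(t), rng.uniform(0, 5, 400), t * np.sin(t)])

# Hessian LLE: n_neighbors must exceed n_components * (n_components + 3) / 2.
hlle = LocallyLinearEmbedding(method="hessian", n_neighbors=12,
                              n_components=2, eigen_solver="dense")
emb = hlle.fit_transform(X)
print(emb.shape)  # (400, 2)
```

    The thesis's point is that even such flexible embeddings, applied to ENSO variability, add little beyond what leading PCA modes already capture, while remaining useful exploratory tools.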

    Generalization of particle impact behavior in gas turbine via non-dimensional grouping

    Fouling in gas turbines is caused by airborne contaminants which, under certain conditions, adhere to aerodynamic surfaces upon impact. The growth of solid deposits causes geometric modifications of the blades in terms of both mean shape and roughness level. The consequences of particle deposition range from performance deterioration to life reduction to complete loss of power. Due to the importance of the phenomenon, several methods to model particle sticking have been proposed in the literature. Most models are based on the idea of a sticking probability, defined as the likelihood a particle has to stick to a surface upon impact. Other models investigate the phenomenon from a deterministic point of view by calculating the energy available before and after the impact. The nature of the materials encountered within this environment does not lend itself to a very precise characterization; consequently, it is difficult to establish the limits of validity of sticking models based on field data or even laboratory-scale experiments. As a result, predicting the growth of solid deposits in gas turbines is still a task fraught with difficulty. In this work, two non-dimensional parameters are defined to describe the interaction between incident particles and a substrate, with particular reference to sticking behavior in a gas turbine. In the first part of the work, historical experimental data on particle adhesion under gas turbine-like conditions are analyzed by means of relevant dimensional quantities (e.g. particle viscosity, surface tension, and kinetic energy). After a dimensional analysis, the data are then classified using non-dimensional groups, and a universal threshold for the transition from erosion to deposition and from fragmentation to splashing, based on particle properties and impact conditions, is identified. The ratio of particle kinetic energy to surface energy, together with the particle temperature normalized by the softening temperature, constitutes the original non-dimensional groups, which form the basis of a promising adhesion criterion.
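
    The two non-dimensional groups described above can be sketched numerically. The exact definitions below (a Weber-like kinetic-to-surface-energy ratio and a softening-normalized temperature) are plausible assumptions for illustration, not the paper's published formulas, and the particle properties are hypothetical.

```python
import numpy as np

def kinetic_to_surface_energy(diameter_m, density_kg_m3, velocity_m_s,
                              surface_tension_N_m):
    """Ratio of particle kinetic energy to surface energy (assumed form)."""
    volume = np.pi / 6 * diameter_m**3          # spherical particle
    kinetic = 0.5 * density_kg_m3 * volume * velocity_m_s**2
    surface = surface_tension_N_m * np.pi * diameter_m**2
    return kinetic / surface

def normalized_temperature(particle_T_K, softening_T_K):
    """Particle temperature scaled by the material softening temperature."""
    return particle_T_K / softening_T_K

# Hypothetical 10 µm ash particle impacting at 250 m/s.
ke_ratio = kinetic_to_surface_energy(10e-6, 2500.0, 250.0, 0.35)
t_norm = normalized_temperature(1400.0, 1500.0)
print(ke_ratio, t_norm)
```

    Plotting historical impact data in this (energy ratio, normalized temperature) plane is what allows a single threshold curve to separate erosion from deposition across different materials and rigs.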

    CWITools: A Python3 Data Analysis Pipeline for the Cosmic Web Imager Instruments

    The Palomar Cosmic Web Imager (PCWI) and Keck Cosmic Web Imager (KCWI) are integral-field spectrographs on the Hale 5 m telescope at Palomar Observatory and the Keck-2 10 m telescope at W. M. Keck Observatory, respectively. In recent years, these instruments have been increasingly used to conduct survey work, focused in particular on the circumgalactic and intergalactic media at high redshift. Extracting faint signals from three-dimensional IFU data is a complex task which can become prohibitively difficult for large samples without the proper tools. We present CWITools, a package written in Python3 for the analysis of PCWI and KCWI data. CWITools is designed to provide a pipeline between the output of the standard instrument data reduction pipelines and scientific products such as surface brightness maps, spectra, and velocity maps, as well as a wide array of associated models and measurements. While the package is designed specifically for PCWI and KCWI data, it is open source and can be adapted to accommodate any three-dimensional integral field spectroscopy data. Here, we describe this pipeline and the methodology behind individual steps, and provide example applications.

    Statistical characterization of technical surface microstructure

    In the development and production of industrial parts, both the macroscopic shape and the microstructure of the part's surface on a µm scale strongly influence the part's properties. For instance, a surface in frictional contact should be structured in a way that reduces the expected wear by optimizing its lubrication properties; a gasket surface must not be too rough, to prevent leakage; and so on. The measurement of surface roughness started a few decades ago with the advent of tactile profilometers. These drag a stylus along a line over the surface and record its vertical deflection as it moves, thus recording the height of the surface at the sampling points. Modern measurement techniques make it possible to acquire a complete three-dimensional height map of a surface. Obviously, the techniques for analysing two-dimensional profiles are not adequate for the analysis of three-dimensional height maps. Although many propositions for 3D analysis have been made, these often lack a sound theoretical background. Hence, their understanding is limited and only a few are used regularly, resulting in an inadequate surface description. A simple but powerful approach is to use the Minkowski functionals of the excursion sets of the data to characterize the surface structure. These functionals can be interpreted in different ways depending on the model for the surface. Two models seem especially suited for technical surfaces: random fields for surfaces with no obvious structure, e.g. shot-blasted surfaces, and Boolean grain models for surfaces consisting of smaller structuring elements, e.g. sintered materials. In this thesis, a complete framework for the analysis of three-dimensional surface data using the Minkowski functionals is developed. This novel approach allows for a stepwise data reduction: a complex data set is first reduced to three characterizing functions, from which further parameters can be derived. Due to a novel fast and accurate estimator for the characterizing functions, this technique is also suitable for time-critical tasks such as application in production automation.
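
    The simplest of the three characterizing functions mentioned above is the first Minkowski functional of the excursion set, the fraction of the surface lying above a given height threshold. A minimal sketch on a synthetic height map (a Gaussian random field stand-in for measured data, not the thesis's estimator):

```python
import numpy as np

# Synthetic height map standing in for a measured technical surface.
rng = np.random.default_rng(2)
height_map = rng.normal(size=(256, 256))

def area_fraction(z, thresholds):
    """First Minkowski functional of the excursion set {z >= h}:
    fraction of surface points above each threshold h."""
    return np.array([(z >= h).mean() for h in thresholds])

thresholds = np.linspace(-2.0, 2.0, 9)
v0 = area_fraction(height_map, thresholds)
print(np.round(v0, 3))  # decreases from near 1 towards near 0
```

    The remaining two functionals (boundary length and Euler characteristic of the excursion sets), computed over a range of thresholds, complete the reduction of a full height map to three curves.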