279 research outputs found

    Explicit parametric solutions of lattice structures with proper generalized decomposition (PGD): applications to the design of 3D-printed architectured materials

    Get PDF
    The final publication is available at Springer via http://dx.doi.org/10.1007/s00466-017-1534-9
    Architectured materials (or metamaterials) are constituted by a unit-cell with a complex structural design repeated periodically, forming a bulk material with emergent mechanical properties. One may obtain specific macro-scale (or bulk) properties in the resulting architectured material by properly designing the unit-cell. Typically, this is stated as an optimal design problem in which the parameters describing the shape and mechanical properties of the unit-cell are selected in order to produce the desired bulk characteristics. This is especially pertinent due to the ease of manufacturing these complex structures with 3D printers. The proper generalized decomposition provides explicit parametric solutions of parametric PDEs. Here, the same ideas are used to obtain parametric solutions of the algebraic equations arising from lattice structural models. Once the explicit parametric solution is available, the optimal design problem is a simple post-process. The same strategy is applied in the numerical illustrations, first to a unit-cell (and then homogenized with periodicity conditions), and in a second phase to the complete structure of a lattice material specimen.
    Peer Reviewed. Postprint (author's final draft).
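
    The claim that optimal design becomes a cheap post-process once a separated parametric solution is available can be illustrated with a minimal sketch. The code below is not the authors' implementation: the spatial modes, the polynomial parametric modes, and the compliance-like cost functional are hypothetical placeholders standing in for quantities a PGD solver would actually provide.

```python
# Minimal sketch (not the authors' code): once a separated, PGD-style
# parametric solution u(x, mu) ~ sum_m F_m(x) * g_m(mu) is available,
# optimal design reduces to cheap evaluations over the parameter range.
import numpy as np

rng = np.random.default_rng(0)

n_dof, n_modes = 500, 8                     # hypothetical sizes
F = rng.standard_normal((n_modes, n_dof))   # spatial modes F_m(x) (placeholder data)
coef = rng.standard_normal((n_modes, 3))    # coefficients of polynomial parametric modes g_m(mu)

def g(mu):
    """Hypothetical parametric modes g_m(mu), here simple polynomials in mu."""
    return coef @ np.array([1.0, mu, mu**2])

def u(mu):
    """Evaluate the separated solution for a given unit-cell parameter mu."""
    return g(mu) @ F                        # sum over modes of g_m(mu) * F_m(x)

def cost(mu):
    """Placeholder cost functional; a real design study would use a bulk property."""
    return float(np.sum(u(mu) ** 2))

# Optimal design as a post-process: sweep the parameter and pick the minimizer.
mus = np.linspace(0.1, 1.0, 200)
best = min(mus, key=cost)
print(f"best unit-cell parameter ~ {best:.3f}")
```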

    Nonlinear dimensionality reduction for parametric problems: a kernel proper orthogonal decomposition

    Get PDF
    This is the peer reviewed version of the following article: Diez, P. [et al.]. Nonlinear dimensionality reduction for parametric problems: a kernel proper orthogonal decomposition. "International journal for numerical methods in engineering", 30 December 2021, vol. 122, no. 24, p. 7306-7327, which has been published in final form at DOI: 10.1002/nme.6831. This article may be used for non-commercial purposes in accordance with Wiley Terms and Conditions for Self-Archiving.
    Reduced-order models are essential tools to deal with parametric problems in the context of optimization, uncertainty quantification, or control and inverse problems. The set of parametric solutions lies in a low-dimensional manifold (with dimension equal to the number of independent parameters) embedded in a large-dimensional space (dimension equal to the number of degrees of freedom of the full-order discrete model). A posteriori model reduction is based on constructing a basis from a family of snapshots (solutions of the full-order model computed offline), and then using this new basis to solve the subsequent instances online. Proper orthogonal decomposition (POD) reduces the problem into a linear subspace of lower dimension, eliminating redundancies in the family of snapshots. The strategy proposed here is to use a nonlinear dimensionality reduction technique, namely the kernel principal component analysis (kPCA), in order to find a nonlinear manifold, with an expected much lower dimension, and to solve the problem in this low-dimensional manifold. Guided by this paradigm, the methodology devised here introduces different novel ideas, namely: 1) the nonlinear manifold is characterized using local tangent spaces, where the reduced-order problem is linear and based on the neighboring snapshots; 2) the approximation space is enriched with the cross-products of the snapshots, introducing a quadratic description; 3) the kernel for kPCA is defined ad hoc, based on physical considerations; and 4) the iterations in the reduced-dimensional space are performed using an algorithm based on a Delaunay tessellation of the cloud of snapshots in the reduced space. The resulting computational strategy performs outstandingly in the numerical tests, alleviating many of the problems associated with POD and improving the numerical accuracy.
    Generalitat de Catalunya, 2017-SGR-1278; Ministerio de Ciencia e Innovación, CEX2018-000797-S; PID2020-113463RB-C32; PID2020-113463RB-C33.
    Peer Reviewed. Postprint (published version).
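
    The core step, mapping a family of snapshots to a low-dimensional nonlinear manifold with kernel PCA, can be sketched as follows. This is not the paper's implementation: the snapshot matrix is synthetic, and a standard RBF kernel stands in for the ad hoc, physics-based kernel the paper defines.

```python
# Minimal sketch (not the paper's code): kernel PCA of a snapshot matrix,
# with an RBF kernel standing in for the physics-based kernel of the paper.
import numpy as np
from sklearn.decomposition import KernelPCA

rng = np.random.default_rng(1)

# Hypothetical snapshot matrix: n_snapshots full-order solutions of size n_dof,
# lying close to a two-dimensional nonlinear manifold (two independent parameters).
n_snapshots, n_dof = 200, 1000
params = rng.uniform(0.0, 1.0, size=(n_snapshots, 2))
snapshots = np.column_stack([np.sin(2 * np.pi * params @ rng.uniform(size=2))
                             for _ in range(n_dof)])

# kPCA with the inverse transform enabled, so reduced coordinates can be
# mapped back (approximately) to the snapshot space.
kpca = KernelPCA(n_components=2, kernel="rbf", fit_inverse_transform=True)
z = kpca.fit_transform(snapshots)           # reduced coordinates on the manifold
back = kpca.inverse_transform(z)            # approximate pre-image in snapshot space

print("reduced coordinates:", z.shape)      # (200, 2)
print("relative reconstruction error:",
      np.linalg.norm(back - snapshots) / np.linalg.norm(snapshots))
```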

    Automatic plankton quantification using deep features

    Get PDF
    The study of marine plankton data is vital to monitor the health of the world’s oceans. In recent decades, automatic plankton recognition systems have proved useful to address the vast amount of data collected by specially engineered in situ digital imaging systems. At the beginning, these systems were developed and put into operation using traditional automatic classification techniques, which were fed with hand-designed local image descriptors (such as Fourier features), obtaining quite successful results. In the past few years, there have been many advances in the computer vision community with the rebirth of neural networks. In this paper, we show how descriptors computed using Convolutional Neural Networks (CNNs) trained with out-of-domain data can replace hand-designed descriptors in the task of estimating the prevalence of each plankton class in a water sample. To achieve this goal, we have designed a broad set of experiments that show how effective these deep features are when working in combination with state-of-the-art quantification algorithms.
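
    A minimal sketch of the general idea follows: an off-the-shelf CNN trained on out-of-domain data is used purely as a feature extractor, and prevalences are estimated from the resulting deep features. This is not the paper's pipeline; the ResNet backbone, the logistic-regression classifier and the plain classify-and-count estimator are assumed stand-ins for the descriptors and the state-of-the-art quantification algorithms the paper evaluates.

```python
# Minimal sketch (not the paper's pipeline): deep features from a pretrained,
# out-of-domain CNN feed a classifier, and prevalences are estimated by
# "classify and count".
import numpy as np
import torch
import torchvision.models as models
import torchvision.transforms as T
from sklearn.linear_model import LogisticRegression

# Pretrained backbone used only as a feature extractor (final layer removed).
backbone = models.resnet18(weights=models.ResNet18_Weights.DEFAULT)
backbone.fc = torch.nn.Identity()
backbone.eval()

preprocess = T.Compose([T.Resize((224, 224)), T.ToTensor(),
                        T.Normalize(mean=[0.485, 0.456, 0.406],
                                    std=[0.229, 0.224, 0.225])])

@torch.no_grad()
def deep_features(pil_images):
    """Return one 512-d descriptor per plankton image."""
    batch = torch.stack([preprocess(img) for img in pil_images])
    return backbone(batch).numpy()

def estimate_prevalence(train_imgs, train_labels, sample_imgs, n_classes):
    """Classify-and-count prevalence estimate for a water sample (illustrative)."""
    clf = LogisticRegression(max_iter=1000)
    clf.fit(deep_features(train_imgs), train_labels)
    pred = clf.predict(deep_features(sample_imgs))
    counts = np.bincount(pred, minlength=n_classes)
    return counts / counts.sum()
```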

    Physics-based manifold learning in scaffolds for tissue engineering: application to inverse problems

    Get PDF
    In the field of bone regeneration, insertion of scaffolds favours bone formation by triggering the differentiation of mesenchymal cells into osteoblasts. The presence of calcium ions (Ca2+) in the interstitial fluid across scaffolds is thought to play a relevant role in the process. In particular, the Ca2+ patterns can be used as an indicator of where to expect bone formation. In this work, we analyse the inverse problem for these distribution patterns, using an advection-diffusion nonlinear model for the concentration of Ca2+. That is, given a set of observables which are related to the amount of expected bone formation, we aim at determining the values of the parameters that best fit the data. The problem is solved in a realistic 3D-printed structured scaffold for two uncertain parameters: the amplitude of the velocity of the interstitial fluid and the ionic release rate from the scaffold. The minimization in the inverse problem requires multiple evaluations of the nonlinear model. The computational cost is alleviated by the combination of standard Proper Orthogonal Decomposition (POD), to reduce the number of degrees of freedom, with an ad hoc hyper-reduction strategy, which avoids the assembly of a full-order system at every iteration of Newton’s method. The proposed hyper-reduction method is formulated using the Principal Component Analysis (PCA) decomposition of suitable training sets, devised from the weak form of the problem. In the numerical tests, the hyper-reduced formulation leads to accurate results with a significant reduction of the computational demands with respect to standard POD.
    Peer Reviewed. Postprint (author's final draft).
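
    The POD ingredient of this strategy (the hyper-reduction part is problem-specific and omitted) can be sketched as follows: a reduced basis is extracted from a snapshot matrix via truncated SVD and used for a Galerkin projection. The snapshot data and the full-order operator below are hypothetical placeholders, not the scaffold model.

```python
# Minimal sketch (not the paper's solver): POD basis from a snapshot matrix
# via truncated SVD, followed by a Galerkin projection of a linear(ized) system.
import numpy as np

rng = np.random.default_rng(2)

n_dof, n_snapshots, n_modes = 2000, 60, 10     # hypothetical sizes
S = rng.standard_normal((n_dof, n_snapshots))  # snapshot matrix (placeholder data)

# POD basis: left singular vectors associated with the largest singular values.
U, sigma, _ = np.linalg.svd(S, full_matrices=False)
Phi = U[:, :n_modes]                           # reduced basis, n_dof x n_modes

# Reduced-order Galerkin projection of a full-order system K u = f.
K = np.eye(n_dof)                              # placeholder full-order operator
f = rng.standard_normal(n_dof)
K_r = Phi.T @ K @ Phi                          # n_modes x n_modes reduced operator
f_r = Phi.T @ f
u_r = np.linalg.solve(K_r, f_r)
u_approx = Phi @ u_r                           # reconstruction in the full space

# Energy retained by the truncation, a common criterion to choose n_modes.
energy = np.cumsum(sigma**2) / np.sum(sigma**2)
print(f"energy captured by {n_modes} modes: {energy[n_modes - 1]:.3f}")
```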

    Optimal COVID-19 Adapted Table Disposition in Hostelry for Guaranteeing the Social Distance through Memetic Algorithms

    Get PDF
    [EN] The COVID-19 pandemic has challenged all physical interactions. Social distancing, face masks and other rules have reshaped our way of living during the last year. The impact of these measures on indoor establishments, such as education or hostelry businesses, resulted in a considerable organisation problem. Achieving a table distribution inside these indoor spaces that fulfils the distancing requirements while allocating the maximum number of tables has proved to be a considerable task for multiple establishments during the pandemic. This problem, defined as the Table Location Problem (TLP), is categorised as NP-hard, thus a metaheuristic resolution is recommended. In our previous works, a Genetic Algorithm (GA) optimisation was proposed for optimising the table distribution in real classrooms. However, the proposed algorithm performed poorly for high obstacle density scenarios, especially when allocating a considerable number of tables, due to the existing dependency between adjacent tables in the distance distribution. Therefore, in this paper, we introduce for the first time, to the authors’ best knowledge, a Memetic Algorithm (MA) optimisation that improves the previously designed GA through the introduction of a Gradient-Based Local Search. Multiple configurations have been analysed for a real hostelry-related scenario and a comparison between methodologies has been performed. Results show that the proposed MA optimisation obtained adequate solutions that the GA was unable to reach, demonstrating a superior convergence performance and an overall greater flexibility. The MA performance denoted its value not only from a COVID-19 distancing perspective but also as a flexible managing algorithm for daily table arrangement, thus fulfilling the main objectives of this paper.
    Ministerio de Ciencia, Innovación y Universidades.
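
    The structure of a memetic algorithm, a genetic algorithm whose offspring are refined by a local search step, can be sketched generically as below. This is not the paper's TLP solver: the real-valued encoding, the placeholder fitness function and the hill-climbing refinement (standing in for the gradient-based local search) are assumptions made purely for illustration.

```python
# Minimal sketch (not the paper's TLP solver): a generic memetic algorithm,
# i.e. a genetic algorithm whose offspring are refined by local search.
import numpy as np

rng = np.random.default_rng(3)

def fitness(x):
    """Placeholder objective; the real problem scores table layouts under
    social-distance constraints."""
    return -np.sum((x - 0.5) ** 2)

def local_search(x, step=0.05, iters=10):
    """Hill-climbing refinement standing in for the gradient-based local
    search used in the paper."""
    best = x.copy()
    for _ in range(iters):
        cand = np.clip(best + rng.normal(scale=step, size=best.shape), 0.0, 1.0)
        if fitness(cand) > fitness(best):
            best = cand
    return best

def memetic(pop_size=30, dim=20, gens=50, mut=0.1):
    pop = rng.uniform(size=(pop_size, dim))
    for _ in range(gens):
        scores = np.array([fitness(ind) for ind in pop])
        parents = pop[np.argsort(scores)[-pop_size // 2:]]           # truncation selection
        children = []
        while len(children) < pop_size - len(parents):
            a, b = parents[rng.integers(len(parents), size=2)]
            cut = rng.integers(1, dim)
            child = np.concatenate([a[:cut], b[cut:]])               # one-point crossover
            child += rng.normal(scale=mut, size=dim)                 # mutation
            children.append(local_search(np.clip(child, 0.0, 1.0)))  # memetic step
        pop = np.vstack([parents, children])
    return max(pop, key=fitness)

best = memetic()
print("best fitness:", fitness(best))
```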

    Towards industry 5.0 through metaverse

    Get PDF
    [EN] The digital transformation of the industry allows the optimization of resources through enabling technology that virtualizes the behavior of Cyber-Physical Systems (CPS) along the entire value chain. However, these virtual environments characterized by machine-to-machine interactions have lacked the presence of humans, who are at the center of the next industrial revolution, Industry 5.0. The goal is for humans to be actively integrated into these virtual environments, called metaverses, where interactions with environmental digital assets are possible. To achieve this human-centered industrial metaverse perspective, it is necessary to provide humans with technologies that allow them to reach a more immersive and realistic conception of the production processes. For this purpose, we present in this paper a framework based on hyperconnectivity where several enabling technologies (e.g., Digital Twins, Virtual Reality, Industrial Internet of Things (IIoT)) are integrated in order to converge towards the industrial human-centered metaverse. To validate our framework, a demonstrator has been developed enabling the evaluation of the behavior of humans in virtual environments when facing collaborative tasks that require human-to-human interaction. Within the evaluation of this demonstrator, an experiment based on an assembly task that requires interaction with an autonomous vehicle has been carried out both in reality and in the virtual world. The results obtained indicate that the avatars’ metaverse performance is closer to reality when individuals have previous experience with VR goggles, proving, in this case, the effectiveness of the metaverse for training industrial operators. In addition, the performance of the application has been evaluated with technical parameters, and the perception of the users has been analyzed by conducting a survey, receiving very positive feedback and results. Therefore, the industrial metaverse, blending cutting-edge technology with a human-centric approach for Industry 5.0, is now a reality.
    Open-access publication funded by the Consorcio de Bibliotecas Universitarias de Castilla y León (BUCLE), under the Programa Operativo 2014ES16RFOP009 FEDER 2014-2020 de Castilla y León, Actuación 20007-CL - Apoyo Consorcio BUCLE.

    Adaptive surrogates of crashworthiness models for multi-purpose engineering analyses accounting for uncertainty

    Get PDF
    © 2022 Elsevier. This manuscript version is made available under the CC-BY-NC-ND 4.0 license http://creativecommons.org/licenses/by-nc-nd/4.0/
    Uncertainty Quantification (UQ) is a booming discipline for complex computational models, based on the analysis of robustness, reliability and credibility. UQ analysis for nonlinear crash models with high-dimensional outputs presents important challenges. In crashworthiness, nonlinear structural behaviours with multiple hidden modes require expensive models (18 h for a single run). Surrogate models (metamodels) allow substituting the full-order model, introducing a response surface for a reduced training set of numerical experiments. Moreover, uncertain inputs and a large number of degrees of freedom result in high-dimensional problems, which creates a bottleneck that limits the computational efficiency of the metamodels. Kernel Principal Component Analysis (kPCA) is a dimensionality reduction technique for nonlinear problems, with the advantage of capturing the most relevant information from the response and improving the efficiency of the metamodel, while aiming to compute the minimum number of samples with the full-order model. The proposed methodology is tested with a practical industrial problem that arises from the automotive industry.
    This work is partially funded by Generalitat de Catalunya (Grant Number 1278 SGR 2017-2019 and Pla de Doctorats Industrials 2017 DI 058) and Ministerio de Economía y Empresa and Ministerio de Ciencia, Innovación y Universidades (Grant Number DPI2017-85139-C2-2-R).
    Peer Reviewed. Postprint (author's final draft).
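
    The combination described here, compressing a high-dimensional response with kPCA and fitting a surrogate in the reduced space, can be sketched as follows. This is not the paper's crash workflow: the synthetic "crash histories", the RBF kernel and the Gaussian-process surrogates are assumed stand-ins for the industrial model and the metamodels actually used.

```python
# Minimal sketch (not the paper's workflow): high-dimensional outputs are
# compressed with kernel PCA, one surrogate is fitted per reduced coordinate,
# and new outputs are reconstructed through the kPCA pre-image.
import numpy as np
from sklearn.decomposition import KernelPCA
from sklearn.gaussian_process import GaussianProcessRegressor

rng = np.random.default_rng(5)

n_train, n_inputs, n_out = 40, 3, 5000        # hypothetical sizes
X = rng.uniform(size=(n_train, n_inputs))     # uncertain inputs (training designs)
t = np.linspace(0, 1, n_out)
Y = np.sin(2 * np.pi * np.outer(X[:, 0], t)) * X[:, [1]]   # placeholder response histories

# Compress the output space (an RBF kernel stands in for the paper's choice).
kpca = KernelPCA(n_components=3, kernel="rbf", fit_inverse_transform=True)
Z = kpca.fit_transform(Y)                     # n_train x 3 reduced outputs

# One cheap surrogate per reduced coordinate.
surrogates = [GaussianProcessRegressor(normalize_y=True).fit(X, Z[:, j])
              for j in range(Z.shape[1])]

def predict_output(x_new):
    """Predict a full high-dimensional response for new uncertain inputs."""
    z_new = np.column_stack([s.predict(x_new) for s in surrogates])
    return kpca.inverse_transform(z_new)      # back to the physical output space

print(predict_output(rng.uniform(size=(2, n_inputs))).shape)   # (2, 5000)
```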

    Nonintrusive Uncertainty Quantification for automotive crash problems with VPS/Pamcrash

    Full text link
    Uncertainty Quantification (UQ) is a key discipline for computational modeling of complex systems, enhancing the reliability of engineering simulations. In crashworthiness, having an accurate assessment of the model uncertainty allows reducing the number of prototypes and the associated costs. Carrying out UQ in this framework is especially challenging because it requires highly expensive simulations. In this context, surrogate models (metamodels) allow drastically reducing the computational cost of the Monte Carlo process. Different techniques to describe the metamodel are considered: Ordinary Kriging, Polynomial Response Surfaces, and a novel strategy (based on Proper Generalized Decomposition) denoted Separated Response Surface (SRS). A large number of uncertain input parameters may jeopardize the efficiency of the metamodels. Thus, prior to defining a metamodel, kernel Principal Component Analysis (kPCA) is found to be effective in simplifying the description of the model outcome. A benchmark crash test is used to show the efficiency of combining metamodels with kPCA.
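
    The surrogate-based Monte Carlo idea can be sketched generically as below. This is not the VPS/Pamcrash workflow of the paper: a scikit-learn Gaussian-process regressor stands in for Ordinary Kriging, and a cheap analytic function replaces the expensive crash simulation.

```python
# Minimal sketch (not the paper's workflow): a Gaussian-process surrogate
# stands in for Ordinary Kriging; the expensive crash simulation is replaced
# by a cheap placeholder, and Monte Carlo is run on the metamodel.
import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF, ConstantKernel

rng = np.random.default_rng(4)

def expensive_model(x):
    """Placeholder for a crash simulation output (e.g., a scalar intrusion)."""
    return np.sin(3 * x[:, 0]) + 0.5 * x[:, 1] ** 2

# Small design of experiments: the only points where the full model is run.
X_train = rng.uniform(-1.0, 1.0, size=(30, 2))
y_train = expensive_model(X_train)

gp = GaussianProcessRegressor(kernel=ConstantKernel() * RBF(), normalize_y=True)
gp.fit(X_train, y_train)

# Monte Carlo on the surrogate: thousands of evaluations at negligible cost.
X_mc = rng.uniform(-1.0, 1.0, size=(100_000, 2))
y_mc = gp.predict(X_mc)
print(f"output mean ~ {y_mc.mean():.3f}, std ~ {y_mc.std():.3f}")
```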

    Evaluation Strategies of Nanomaterials Toxicity

    Get PDF
    The revolutionary development of nanoscience in recent years has increased the number of studies in the field evaluating toxicity and risk levels. The design of different nanomaterials together with biological components has driven advances in biomedicine. Specifically, nanoparticles seem to be a promising platform due to their features, including nanoscale dimensions and physical and chemical characteristics that can be modified depending on the final application. Herein, we review the main studies carried out with nanoparticles in order to understand and characterize the cellular uptake mechanisms involved in biocompatibility and toxicity, and how nanoparticles alter biological processes to prevent disease progression.