HAL-Polytechnique
Deciphering the impact of future individual Antarctic freshwater sources on the Southern Ocean properties and ice shelf basal melting
The Antarctic ice sheet is losing mass, primarily through ice shelf basal melting and the subsequent acceleration of glaciers. The substantial freshwater fluxes resulting from ice shelf and iceberg melting affect the Southern Ocean and beyond. As emphasized by several studies, they slow down the decline of Antarctic sea ice and hinder mixing between surface water and Circumpolar Deep Water, further intensifying ice shelf basal melting. In this context, most studies so far have neglected the impact of surface meltwater runoff, but recent CMIP6 projections under the SSP5-8.5 scenario challenge this view, suggesting runoff values in 2100 similar to current basal melt rates. This prompts a reassessment of the future impact of surface meltwater on the ocean. We use the ocean and sea-ice model NEMO-SI3, resolving the sub-shelf cavities of Antarctica and including an interactive iceberg module. We perform thorough sensitivity experiments to disentangle the effects of changes in atmospheric forcing, increased ice shelf basal melting, surface freshwater runoff, and iceberg calving flux by 2100 in a high-end scenario. Contrary to expectations, the atmosphere alone does not substantially warm ice shelf cavities compared to present temperatures. However, the introduction of additional freshwater sources amplifies warming, leading to escalated melt rates and establishing a positive feedback. The magnitude of this effect correlates with the quantity of released freshwater, with the most substantial impact originating from ice shelf basal melting. Moreover, larger surface freshwater runoff and iceberg calving flux contribute to further cavity warming, resulting in a noteworthy 10% increase in ice shelf basal melt rates. We also describe a potential tipping point for cold ice shelves, such as Filchner-Ronne, before the year 2100.
Global ocean ventilation: a comparison between a general circulation model and data-constrained inverse models
Ocean ventilation, or the transfer of tracers from the surface boundary layer into the ocean interior, is a critical process in the climate system. Here, we assess steady-state ventilation patterns and rates in three models of ocean transports: a 1° global configuration of the Nucleus for European Modelling of the Ocean (NEMO), version 2 of the Ocean Circulation Inverse Model (OCIM), and the Total Matrix Intercomparison (TMI). We release artificial dyes in six surface regions of each model and compare equilibrium dye distributions as well as ideal age distributions. We find good qualitative agreement in large-scale dye distributions across the three models. However, the distributions indicate that TMI is more diffusive than OCIM, itself more diffusive than NEMO. NEMO simulates a sharp separation between bottom and intermediate water ventilation zones in the Southern Ocean, leading to a weaker influence of the latter zone on the abyssal ocean. A shallow bias of North Atlantic ventilation in NEMO contributes to a stronger presence of the North Atlantic dye in the mid-depth Southern Ocean and Pacific. This isopycnal communication between the North Atlantic surface and the mid-depth Pacific is very slow, however, and NEMO simulates a maximum age in the North Pacific about 900 years higher than the data-constrained models. Possible causes of this age bias are interrogated with NEMO sensitivity experiments. Implementation of an observation-based 3D map of isopycnal diffusivity augments the maximum age, due to weaker isopycnal diffusion at depth. We suggest that tracer upwelling in the subarctic Pacific is underestimated in NEMO and a key missing piece in the representation of global ocean ventilation in general circulation models.
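As a side note, the "ideal age" tracer compared across these models can be illustrated in a minimal setting. The sketch below is our own toy construction, not the NEMO/OCIM/TMI configuration: ideal age is time-stepped in a single 1-D water column with uniform diffusivity, held at zero at the "surface" while the interior ages at one unit per unit time; the steady profile then has the closed form a(z) = z(2H − z)/(2κ).

```python
import numpy as np

# Ideal age: steady state of  da/dt = 1 + kappa * d2a/dz2,
# with a = 0 at the surface (z = 0) and no flux at the bottom (z = H).
kappa, H, N = 1.0, 1.0, 51
z = np.linspace(0.0, H, N)
dz = z[1] - z[0]
dt = 0.4 * dz**2 / kappa              # stable explicit diffusion step
a = np.zeros(N)
for _ in range(40000):
    lap = np.zeros(N)
    lap[1:-1] = (a[2:] - 2 * a[1:-1] + a[:-2]) / dz**2
    lap[-1] = 2 * (a[-2] - a[-1]) / dz**2   # no-flux bottom via ghost point
    a[1:] += dt * (1.0 + kappa * lap[1:])   # surface node stays at age 0
# Analytic steady profile for comparison
exact = z * (2 * H - z) / (2 * kappa)
print(abs(a - exact).max())
```

Because the steady profile is quadratic, the central-difference Laplacian reproduces it exactly, so the numerical column converges to the analytic curve up to time-stepping residual.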
Interactive modeling of evolving and emergent bio-inspired 3D shapes
Due to manufacturing constraints, Computer-Aided Design has primarily focused on combinations of mathematical functions and simple parametric forms. However, the landscape changed with the advent of 3D printing, which allows for high shape complexity. The cost of additive manufacturing is now dominated by part size and material used rather than by complexity, paving the way for a reevaluation of 3D modeling practices, including interactive conception and increased complexity.

Inspired by the self-organizing principles observed in living organisms, the field of morphogenesis presents an intriguing alternative for 3D modeling. Unlike traditional CAD systems relying on explicit user-defined parameters, morphogenetic models leverage dynamic processes that exhibit emergence, evolution, adaptation to the environment, or self-healing. The general purpose of this Ph.D. is to explore and develop new approaches to 3D modeling based on highly detailed evolutionary shapes inspired by morphogenesis.

The thesis begins with an in-depth exploration of bio-inspired 3D modeling, encompassing various methodologies, challenges, and options for incorporating bio-inspired concepts into 3D modeling practices. Subsequent chapters delve into specific morphogenesis models. The first part adapts a biologically inspired model, Physarum polycephalum, to computer graphics for designing organic-like microstructures; it offers a comprehensive methodological development, analyzes model parameters, and discusses potential applications in fields such as additive manufacturing, design, and biology. The second part investigates a novel approach that uses reaction-diffusion models to grow lattice-like and membrane-like structures within arbitrary shapes. The methodology is based on anisotropic reaction-diffusion systems and diffusion tensor fields, demonstrating applications to mechanical properties, validation through nonlinear analysis, user interaction, and scalability. Finally, the third part explores the application of deep learning techniques to learn the rules of morphogenesis processes, specifically reaction-diffusion. It begins by illustrating the richness offered by reaction-diffusion systems before delving into the training of cellular automata and reaction-diffusion rules to learn system parameters, resulting in robust and "life-like" behaviors.
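To make the reaction-diffusion ingredient concrete, here is a minimal Gray-Scott simulation on a periodic 2-D grid, a toy stand-in for the anisotropic systems the thesis develops; the parameter values are common demonstration choices, not taken from the thesis.

```python
import numpy as np

def laplacian(Z):
    # 5-point Laplacian on a periodic grid (unit spacing)
    return (np.roll(Z, 1, 0) + np.roll(Z, -1, 0) +
            np.roll(Z, 1, 1) + np.roll(Z, -1, 1) - 4.0 * Z)

def gray_scott(U, V, Du=0.16, Dv=0.08, F=0.035, k=0.065, dt=1.0, steps=2000):
    # U is fed at rate F; V consumes U (U + 2V -> 3V) and is removed at rate F + k.
    for _ in range(steps):
        UVV = U * V * V
        U += dt * (Du * laplacian(U) - UVV + F * (1.0 - U))
        V += dt * (Dv * laplacian(V) + UVV - (F + k) * V)
    return U, V

n = 64
U = np.ones((n, n)); V = np.zeros((n, n))
U[28:36, 28:36] = 0.50; V[28:36, 28:36] = 0.25   # seed a local perturbation
U, V = gray_scott(U, V)
print(V.min(), V.max())
```

Starting from the uniform state plus a small square seed, local reaction and differential diffusion are enough for spatial structure to emerge, which is the basic mechanism the thesis builds on.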
Bayesian Calibration in a multi-output transposition context
Bayesian calibration is an effective approach for ensuring that numerical simulations accurately reflect the behavior of physical systems. However, because numerical models are never perfect, a discrepancy known as model error exists between the model outputs and the observed data, and it must be quantified. Conventional methods cannot be applied in transposition situations, such as when a model has multiple outputs but only one is experimentally observed. To account for model error in this context, we propose augmenting the calibration process by introducing additional numerical input parameters through a hierarchical Bayesian model, which includes hyperparameters for the prior distribution of the calibration variables. Importance sampling estimators are used to avoid increasing computational costs. Performance metrics are introduced to assess the proposed probabilistic model and the accuracy of its predictions. The method is applied to a computer code with three outputs that models the Taylor cylinder impact test. The outputs are treated as the observed variable one at a time, yielding three different transposition situations. The proposed method is compared with other approaches that embed model error, demonstrating the significance of the hierarchical formulation.
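The importance-sampling reweighting invoked above, reusing samples drawn under one distribution to estimate expectations under another without new model runs, can be illustrated on a synthetic 1-D example; the Gaussians and numbers here are ours, not the paper's.

```python
import numpy as np

rng = np.random.default_rng(0)
# Proposal: theta ~ N(0, 2^2); target (posterior-like): theta ~ N(1, 1).
theta = rng.normal(0.0, 2.0, 200_000)
# Self-normalized weights: target logpdf minus proposal logpdf (constants cancel).
log_w = (-0.5 * (theta - 1.0) ** 2) - (-0.5 * (theta / 2.0) ** 2 - np.log(2.0))
w = np.exp(log_w - log_w.max())
w /= w.sum()
est_mean = np.sum(w * theta)   # estimate of E[theta] under the target, i.e. about 1.0
print(est_mean)
```

Self-normalization means only density ratios up to a constant are needed, which is why such estimators can reweight existing simulator evaluations instead of rerunning the code.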
Fabricating the data: the labor behind Artificial Intelligence
AI generates both enthusiasm and disillusionment, with promises that often go unfulfilled. It is therefore not surprising that human labor, its fundamental component, is subject to the same deceptions. The development of "smart technologies" depends, at different stages, on a multitude of precarious, underpaid, and invisible workers who, dispersed globally, carry out repetitive, fragmented activities, paid per task and completed in a few seconds. These are workers who label data to train algorithms, through tasks that require the intuitive, creative, and cognitive abilities of human beings: categorizing images, classifying advertisements, transcribing audio and video, evaluating advertisements, moderating content on social media, labeling human anatomical points of interest, digitizing documents, and so on. This form of work is often referred to as "microwork". Our contribution, which documents the conditions of microwork in Brazil and offers portraits of the workers, is a step in the wider effort to overcome their current invisibilization. It opens up avenues for future research, with the aim of better characterizing this new form of work, tracing its changes over time in relation to the dynamics of globalization, and, ideally, identifying levers for action and transitions.
Approximation of stochastic epidemic models on large multi-level graphs
We study an SIR model with two levels of mixing, namely a uniformly mixing global level and a local level with two layers of household and workplace contacts, respectively. More precisely, we aim at proposing reduced models which approximate the epidemic dynamics at hand well, while being more amenable to mathematical analysis and/or numerical exploration.

We first investigate the epidemic impact of the workplace size distribution. Our simulation study shows that if the average workplace size is kept fixed, the variance of the workplace size distribution is a good indicator of its influence on key epidemic outcomes. In addition, this allows us to design an efficient teleworking strategy. Next, we demonstrate that a deterministic, uniformly mixing SIR model calibrated using the epidemic growth rate yields a parsimonious approximation of the household-workplace model. However, the accuracy of this reduced model deteriorates over time and lacks theoretical guarantees. Hence, we study the large population limit of the stochastic household-workplace model, which we formalize as a measure-valued process with continuous state space. In a general setting, we establish convergence to the unique deterministic solution of a measure-valued equation. In the case of exponentially distributed infectious periods, a stronger reduction to a finite-dimensional dynamical system is obtained.

Further, in order to gain finer insight into the impact of the model parameters on the performance of both reduced models, we perform a sensitivity study. We show that the large population limit of the household-workplace model can approximate the epidemic well even if some assumptions on the contact network are relaxed. Similarly, we quantify the impact of epidemic parameters on the capacity of the uniformly mixing reduced model to predict key epidemic outcomes. Finally, we consider density-dependent population processes in general. We establish a many-to-one formula which reduces the typical lineage of a sampled individual to a time-inhomogeneous spinal process. In addition, we use a coupling argument to quantify the large population convergence of a spinal process.
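The uniformly mixing deterministic SIR model used as a reduced model above can be sketched in a few lines; the rates below are arbitrary illustrative values, not calibrated to the household-workplace model.

```python
# Forward-Euler integration of the uniformly mixing SIR ODEs (population fractions):
#   S' = -beta*S*I,  I' = beta*S*I - gamma*I,  R' = gamma*I
beta, gamma = 0.5, 0.2        # illustrative transmission and recovery rates (R0 = 2.5)
S, I, R = 0.99, 0.01, 0.0     # start with 1% infected
dt, T = 0.01, 200.0
for _ in range(int(T / dt)):
    new_inf = beta * S * I
    S, I, R = S - dt * new_inf, I + dt * (new_inf - gamma * I), R + dt * gamma * I
print(S, I, R)   # with R0 > 1 most of the population ends up recovered
```

The scheme conserves S + I + R exactly at every step, and the long-run susceptible fraction matches the classical final-size relation for the chosen R0.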
Property-Based Testing by Elaborating Proof Outlines
Property-based testing (PBT) is a technique for validating code against an executable specification by automatically generating test data. We present a proof-theoretical reconstruction of this style of testing for relational specifications and employ the Foundational Proof Certificate framework to describe test generators. We do this by encoding certain kinds of "proof outlines" as proof certificates that can describe various common generation strategies in the PBT literature, ranging from random to exhaustive, including their combination. We also address the shrinking of counterexamples as a first step toward their explanation. Once generation is accomplished, the testing phase is a standard logic programming search. After illustrating our techniques on simple, first-order (algebraic) data structures, we lift them to data structures containing bindings by using the λ-tree syntax approach to encode bindings. The λProlog programming language can perform both the generation and the checking of tests using this approach to syntax. We then further extend PBT to specifications in a fragment of linear logic. Under consideration in Theory and Practice of Logic Programming (TPLP).
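For readers unfamiliar with PBT, the generate-then-shrink loop the abstract refers to can be illustrated with a self-contained toy (in Python rather than the logic-programming setting of the paper): a true and a deliberately false property about list reversal, random test-data generation, and greedy shrinking of a found counterexample.

```python
import random

def prop_rev_append(xs, ys):
    # True property: reverse distributes over append with arguments swapped.
    return list(reversed(xs + ys)) == list(reversed(ys)) + list(reversed(xs))

def prop_bad(xs, ys):
    # Deliberately false property, to exercise shrinking.
    return list(reversed(xs + ys)) == list(reversed(xs)) + list(reversed(ys))

def random_list(rng, n=6):
    return [rng.randrange(10) for _ in range(rng.randrange(n))]

def shrink(xs):
    # Candidate smaller inputs: drop one element at a time.
    for i in range(len(xs)):
        yield xs[:i] + xs[i + 1:]

def check(prop, tries=200, seed=0):
    rng = random.Random(seed)
    for _ in range(tries):
        xs, ys = random_list(rng), random_list(rng)
        if not prop(xs, ys):
            # Greedily shrink each argument while the property still fails.
            changed = True
            while changed:
                changed = False
                for cand in shrink(xs):
                    if not prop(cand, ys):
                        xs, changed = cand, True
                        break
                for cand in shrink(ys):
                    if not prop(xs, cand):
                        ys, changed = cand, True
                        break
            return (xs, ys)
    return None

print(check(prop_rev_append))  # None: no counterexample found
print(check(prop_bad))         # a small counterexample (two short lists)
```

Swapping `random_list` for an enumerator of all lists up to a bound turns the same checker into an exhaustive generator, which is exactly the kind of strategy choice the paper encodes as proof outlines.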
Stochastic Computation of Barycentric Coordinates
This paper presents a practical and general approach for computing barycentric coordinates through stochastic sampling. Our key insight is a reformulation of the kernel integral defining barycentric coordinates into a weighted least-squares minimization that enables Monte Carlo integration without sacrificing linear precision. Our method can thus compute barycentric coordinates directly at the points of interest, both inside and outside the cage, using just proximity queries to the cage such as closest points and ray intersections. As a result, we can evaluate barycentric coordinates for a large variety of cage representations (from quadrangulated surface meshes to parametric curves) seamlessly, bypassing any volumetric discretization or custom solves. To address the archetypal noise induced by sample-based estimates, we also introduce a denoising scheme tailored to barycentric coordinates. We demonstrate the efficiency and flexibility of our formulation by implementing a stochastic generation of harmonic coordinates, mean-value coordinates, and positive mean-value coordinates.
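For context, one of the coordinate families mentioned, mean-value coordinates, has a classical closed form in 2D; the sketch below evaluates that deterministic formula for a polygonal cage and checks the partition-of-unity and linear-precision properties that the paper's stochastic estimator is designed to preserve. This is the standard evaluation, not the paper's Monte Carlo method.

```python
import numpy as np

def mean_value_coords(p, verts):
    # Mean-value coordinates of a point p inside a closed CCW 2D polygon:
    #   w_i = (tan(alpha_{i-1}/2) + tan(alpha_i/2)) / |v_i - p|, then normalize,
    # where alpha_i is the signed angle at p between v_i and v_{i+1}.
    v = np.asarray(verts, float) - np.asarray(p, float)
    r = np.linalg.norm(v, axis=1)
    n = len(v)
    nxt = np.roll(np.arange(n), -1)
    cross = v[:, 0] * v[nxt, 1] - v[:, 1] * v[nxt, 0]
    dot = (v * v[nxt]).sum(axis=1)
    alpha = np.arctan2(cross, dot)
    t = np.tan(alpha / 2.0)
    w = (np.roll(t, 1) + t) / r
    return w / w.sum()

square = [(0, 0), (2, 0), (2, 2), (0, 2)]
lam = mean_value_coords((0.5, 1.2), square)
# Partition of unity and linear precision: sum lam_i = 1, sum lam_i v_i = p
print(lam.sum(), lam @ np.asarray(square, float))
```

Linear precision, the property that the coordinates reproduce the query point from the cage vertices, is exactly what the paper's weighted least-squares reformulation is built to keep under sampling.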
Mitigating Farmland Biodiversity Loss: A Bio-Economic Model of Land Consolidation and Pesticide Use
Biodiversity loss driven by agricultural intensification is a pressing global issue, with significant implications for ecosystem stability and human well-being. We design an integrated bio-economic agent-based model, informed by historical data from the French agricultural sector, to project future biodiversity trends and evaluate policy interventions. Our model predicts further biodiversity decline under a business-as-usual scenario, primarily due to intensified land consolidation. We evaluate two policy options: reducing pesticide use and subsidizing small farmers. While pesticide reduction benefits biodiversity at first, it eventually leads to increased land consolidation and further biodiversity loss. In contrast, subsidizing small farmers by reallocating a small fraction of existing subsidies stabilizes farm sizes and enhances biodiversity in the long run. The most effective strategy combines both policies, leveraging pesticide reduction alongside targeted subsidies to balance economic pressures and consistently improve biodiversity.
Impact of the tilted cloud vertical structure on a northward-progress episode of the East Asian summer monsoonal precipitation belt
The impact of cloud vertical structure (CVS) on a northward-progressing rainfall episode of the East Asian summer monsoon (EASM) is explored using the Weather Research and Forecasting model, in which a CloudSat observation-based vertical structure of cloud liquid water content (LWC) can be imposed. The composite LWC anomaly from CloudSat data shows a northward-tilted structure from the upper to the lower troposphere. Compared to the control simulation (without modification of LWC), the simulation with LWC imposed but without the tilted structure does not show significant changes. When LWC is introduced and northward tilted, the geopotential height (HGT) decreases north of the convective center, which increases the meridional wind and provides favorable conditions for the northward shift of the precipitation belt. When LWC is southward tilted, HGT decreases in the middle and lower troposphere south of the convective center and increases in the north, which slows down the northward shift of the precipitation belt. Adding cloud water leads to an increase in humidity and a decrease in temperature, causing a significant increase in stratiform clouds and related precipitation. In the northward-tilted LWC configuration, a low-temperature, high-humidity area is located on the north side of the convective center, favorable for the occurrence and northward shift of the precipitation belt. Deep convection is weakened and convective precipitation reduced, while shallow convection enhances latent heat release in the lower troposphere. Therefore, more water vapor and energy are transported from the boundary layer to the free atmosphere, promoting the northward shift of the precipitation belt.