    Understanding the impact of culture and values on economic behaviour

    Yann Algan conducts research in experimental and behavioural economics at Sciences Po, where he mainly teaches macroeconomics and economic policy. Here, he answers questions from nonfiction.fr as part of a dossier devoted to the new French economists.

    Proposal to recover an extensive ground state degeneracy in a two-dimensional square array of nanomagnets

    We investigate numerically the micromagnetic properties and the low-energy physics of an artificial square spin system in which the nanomagnets are physically connected at the lattice vertices. Micromagnetic simulations reveal that the energy stored at the vertex sites strongly depends on the type of magnetic domain wall formed by the four connected nanomagnets. As a consequence, the energy gap between the vertex types can be partially modified by varying the geometrical parameters of the nanomagnets, such as their width and thickness. Based on the energy levels given by the micromagnetic simulations, we compute the thermodynamic properties of the corresponding spin models using Monte Carlo simulations. We find two regimes, both characterized by an extensive ground-state manifold, in sharp contrast with similar lattices of disconnected nanomagnets. For narrow, thin nanomagnets, low-energy spin configurations consist of independent ferromagnetic straight lines crossing the whole lattice. The ground-state manifold is thus highly degenerate, although this degeneracy is subdominant. In the limit of thick, wide nanomagnets, our findings suggest that the celebrated square ice model may be fabricated experimentally from a simple square lattice of connected elements. These results show that the micromagnetic nature of artificial spin systems involves another degree of freedom that can be finely tuned to explore strongly correlated disordered magnetic states of matter.
    Comment: 6 pages, 5 figures.
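
    As a rough illustration of the Monte Carlo step described above, the sketch below runs a single-spin-flip Metropolis simulation of a toy vertex model on a periodic square lattice. The quadratic vertex-charge energy is a stand-in assumption; in the paper, the per-vertex-type energies come from the micromagnetic simulations.

```python
import numpy as np

rng = np.random.default_rng(0)
L = 16          # L x L vertices, periodic boundary conditions
T = 0.5         # temperature in units of the toy coupling

# Ising spins on the links of the square lattice:
# h[x, y] = +1 -> horizontal link at (x, y) points along +x,
# v[x, y] = +1 -> vertical link at (x, y) points along +y.
h = rng.choice([-1, 1], size=(L, L))
v = rng.choice([-1, 1], size=(L, L))

def vertex_charge(x, y):
    """Net number of link spins pointing into vertex (x, y)."""
    return h[(x - 1) % L, y] - h[x, y] + v[x, (y - 1) % L] - v[x, y]

def vertex_energy(x, y):
    # Toy 16-vertex energy: penalize ice-rule violations quadratically.
    # In the paper, the vertex-type energies come from micromagnetics.
    return vertex_charge(x, y) ** 2

def link_energy(x, y, d):
    """Energy of the two vertices touched by link (x, y) of type d."""
    if d == 0:   # horizontal link touches vertices (x, y) and (x+1, y)
        return vertex_energy(x, y) + vertex_energy((x + 1) % L, y)
    return vertex_energy(x, y) + vertex_energy(x, (y + 1) % L)

def metropolis_sweep():
    for _ in range(2 * L * L):           # one attempted flip per link
        d = int(rng.integers(2))
        x, y = int(rng.integers(L)), int(rng.integers(L))
        spins = h if d == 0 else v
        e_old = link_energy(x, y, d)
        spins[x, y] *= -1
        if rng.random() >= np.exp(-(link_energy(x, y, d) - e_old) / T):
            spins[x, y] *= -1            # reject the move

for _ in range(500):
    metropolis_sweep()
e = sum(vertex_energy(x, y) for x in range(L) for y in range(L))
print(f"energy per vertex: {e / L**2:.3f}")
```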

    Detection of emerging neurodegeneration using Bayesian linear mixed-effects modeling

    Early detection of neurodegeneration, and prediction of when neurodegenerative diseases will lead to symptoms, are critical for developing and initiating disease modifying treatments for these disorders. While each neurodegenerative disease has a typical pattern of early changes in the brain, these disorders are heterogeneous, and early manifestations can vary greatly across people. Methods for detecting emerging neurodegeneration in any part of the brain are therefore needed. Prior publications have described the use of Bayesian linear mixed-effects (BLME) modeling for characterizing the trajectory of change across the brain in healthy controls and patients with neurodegenerative disease. Here, we use an extension of such a model to detect emerging neurodegeneration in cognitively healthy individuals at risk for dementia. We use BLME to quantify individualized rates of volume loss across the cerebral cortex from the first two MRIs in each person and then extend the BLME model to predict future values for each voxel. We then compare observed values at subsequent time points with the values that were expected from the initial rates of change and identify voxels that are lower than the expected values, indicating accelerated volume loss and neurodegeneration. We apply the model to longitudinal imaging data from cognitively normal participants in the Alzheimer's Disease Neuroimaging Initiative (ADNI), some of whom subsequently developed dementia, and two cognitively normal cases who developed pathology-proven frontotemporal lobar degeneration (FTLD). These analyses identified regions of accelerated volume loss prior to or accompanying the earliest symptoms, and expanding across the brain over time, in all cases. The changes were detected in regions that are typical for the likely diseases affecting each patient, including medial temporal regions in patients at risk for Alzheimer's disease, and insular, frontal, and/or anterior/inferior temporal regions in patients with likely or proven FTLD. In the cases where detailed histories were available, the first regions identified were consistent with early symptoms. Furthermore, survival analysis in the ADNI cases demonstrated that the rate of spread of accelerated volume loss across the brain was a statistically significant predictor of time to conversion to dementia. This method for detection of neurodegeneration is a potentially promising approach for identifying early changes due to a variety of diseases, without prior assumptions about what regions are most likely to be affected first in an individual.
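
    The following is a minimal numpy sketch of the detection logic described above, not the ADNI pipeline itself: given a per-voxel expected rate of change with uncertainty (in the paper, from a Bayesian linear mixed-effects fit), it extrapolates to a later time point and flags voxels falling below the prediction interval. All numbers are invented for illustration.

```python
import numpy as np

rng = np.random.default_rng(1)
n_vox = 10_000
baseline = rng.normal(2.5, 0.3, n_vox)      # e.g. cortical thickness (mm)
slope = rng.normal(-0.01, 0.005, n_vox)     # mm/year from first two MRIs
slope_sd = np.full(n_vox, 0.004)            # uncertainty of each slope
noise_sd = 0.05                             # scan-to-scan measurement noise

t = 3.0                                     # years since baseline
expected = baseline + slope * t
pred_sd = np.sqrt((slope_sd * t) ** 2 + noise_sd ** 2)

# Simulate an observed follow-up with accelerated loss in 2% of voxels.
observed = expected + rng.normal(0, noise_sd, n_vox)
accel = rng.random(n_vox) < 0.02
observed[accel] -= 0.2

# Flag voxels significantly below the predicted value (one-sided test).
z = (observed - expected) / pred_sd
flagged = z < -1.96
print(f"flagged {flagged.sum()} voxels; truly accelerated: {accel.sum()}")
```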

    Probabilistic simulation for the certification of railway vehicles

    The current dynamic certification process, which is based on experiments, has essentially been built on experience. The introduction of simulation techniques into this process would be of great interest. However, an accurate simulation of complex, nonlinear systems is a difficult task, in particular when rare events (for example, unstable behaviour) are considered. After analysing the system and the currently used procedure, this paper proposes a method to achieve, in some particular cases, a simulation-based certification. It focuses on the need for precise and representative excitations (running conditions) and on their variable nature. A probabilistic approach is therefore proposed and illustrated with an example. First, the paper gives a short description of the vehicle/track system and of the experimental procedure. The proposed simulation process is then described. The requirement to analyse a set of running conditions at least as large as the one tested experimentally is explained. In the third section, a sensitivity analysis to determine the most influential parameters of the system is reported. Finally, the proposed method is summarized and an application is presented.
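
    To make the probabilistic approach concrete, here is a hedged Monte Carlo sketch: sample variable running conditions, evaluate a response quantity for each draw, and estimate the probability of exceeding a certification limit. The response function is a made-up surrogate, not a vehicle/track simulation, and all parameters are placeholders.

```python
import numpy as np

rng = np.random.default_rng(42)
n = 100_000

# Sample variable running conditions.
speed = rng.uniform(40.0, 80.0, n)                         # m/s
irregularity = rng.lognormal(mean=0.0, sigma=0.4, size=n)  # relative level

# Hypothetical surrogate for a lateral force criterion (kN); a real
# study would run the multibody vehicle/track model for each draw.
response = 20.0 + 0.3 * speed * irregularity + rng.normal(0, 2.0, n)

limit = 60.0                                 # certification limit (kN)
p_exceed = np.mean(response > limit)
print(f"P(response > {limit} kN) ~ {p_exceed:.4f}")
```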

    Finite element simulation of polycrystalline aggregates subjected to thermomechanical loadings arising from welding

    Multi-pass welding of austenitic stainless steel pipes involves complex cyclic thermomechanical loadings that may affect the material beyond the heat-affected zone, and whose effects on intra- and intergranular stress and strain heterogeneities are still poorly understood. To analyse numerically the influence of the crystallographic texture (crystal orientations and misorientations) and of the morphological texture (grain size and shape) on these heterogeneities, finite element simulations of idealized structures were carried out with the Abaqus code. 3D polycrystalline aggregates were generated by extruding 2D geometries obtained either from regular tilings or from images of real microstructures determined by SEM-EBSD. The single-crystal behaviour was assumed to be anisotropic thermoelastoviscoplastic. The orientation of each grain was either defined from the EBSD measurements or generated randomly. The loading conditions were applied from a welding computation performed by Areva with the Sysweld software. Taking all of these assumptions into account in Abaqus required the development of user procedures allowing the two codes to communicate and thus handle the multi-scale aspect. The parameters of the constitutive law were identified by an inverse method from literature data on the tensile behaviour of the material at temperatures from 300 to 1200 K. Specific post-processing procedures in Python were also developed, in particular to compute the normal and tangential stresses at grain boundaries. The results show how the distributions of the mechanical fields evolve for different configurations, characterized by random or exacerbated misorientations and by different orientations of the grain boundaries with respect to the axes of the imposed loading. The results are discussed by comparison with numerical simulations performed for an isotropic mechanical behaviour of a homogeneous material.
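
    The grain-boundary post-processing step mentioned above reduces to splitting the traction vector into normal and tangential components. A minimal sketch, with illustrative numbers rather than output from the Abaqus/Sysweld computation:

```python
import numpy as np

# Cauchy stress tensor at a grain-boundary point (MPa, symmetric);
# values are illustrative placeholders.
sigma = np.array([[120.0,  30.0,   0.0],
                  [ 30.0,  80.0,  10.0],
                  [  0.0,  10.0,  50.0]])

n = np.array([1.0, 1.0, 0.0])
n /= np.linalg.norm(n)                  # unit normal to the boundary

t = sigma @ n                           # traction vector t = sigma . n
sigma_n = t @ n                         # normal stress = n . sigma . n
tau = np.linalg.norm(t - sigma_n * n)   # tangential (shear) stress

print(f"normal stress: {sigma_n:.1f} MPa, shear stress: {tau:.1f} MPa")
```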

    Triathlon of Lightweight Block Ciphers for the Internet of Things

    In this paper, we introduce a framework for benchmarking lightweight block ciphers on a multitude of embedded platforms. Our framework is able to evaluate execution time, RAM footprint, and binary code size, and allows one to define a custom figure of merit according to which all evaluated candidates can be ranked. We used the framework to benchmark implementations of 19 lightweight ciphers, namely AES, Chaskey, Fantomas, HIGHT, LBlock, LEA, LED, Piccolo, PRESENT, PRIDE, PRINCE, RC5, RECTANGLE, RoadRunneR, Robin, Simon, SPARX, Speck, and TWINE, on three microcontroller platforms: 8-bit AVR, 16-bit MSP430, and 32-bit ARM. Our results bring some new insights into the question of how well these lightweight ciphers are suited to securing the Internet of Things (IoT). The benchmarking framework provides cipher designers with an easy-to-use tool to compare new algorithms with the state of the art, and allows standardization organizations to conduct a fair and consistent evaluation of a large number of candidates.
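
    A minimal sketch of the figure-of-merit idea: normalize each metric by the best candidate's value and combine the ratios with user-chosen weights, so that smaller is better. The cipher names, numbers, and weighting scheme below are placeholders, not the paper's definition or measured results.

```python
# Per-cipher metrics: (cycles, RAM bytes, code bytes) -- invented values.
ciphers = {
    "CipherA": (12_000, 180, 1_400),
    "CipherB": (15_000, 120, 1_100),
    "CipherC": ( 9_000, 300, 2_300),
}
weights = (1.0, 1.0, 1.0)   # relative importance of time / RAM / code size

# Normalize each metric by the best (smallest) value among candidates,
# then take the weighted average of the ratios.
mins = [min(v[i] for v in ciphers.values()) for i in range(3)]
fom = {
    name: sum(w * v[i] / mins[i] for i, w in enumerate(weights)) / sum(weights)
    for name, v in ciphers.items()
}

for name, score in sorted(fom.items(), key=lambda kv: kv[1]):
    print(f"{name}: FOM = {score:.2f}")   # rank candidates, best first
```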

    Connectivity-Based Parcellation of the Cortical Mantle Using q-Ball Diffusion Imaging

    This paper exploits the idea that each individual brain region has a specific connection profile to create parcellations of the cortical mantle using MR diffusion imaging. The parcellation is performed in two steps. First, the cortical mantle is split at a macroscopic level into 36 large gyri using a sulcus recognition system. Then, for each voxel of the cortex, a connection profile is computed using a probabilistic tractography framework. The tractography is performed from q-fields using regularized particle trajectories. Fiber ODFs are inferred from the q-balls using a sharpening process that focuses the weight around the q-ball local maxima. A sophisticated propagation mask, computed from a T1-weighted image perfectly aligned with the diffusion data, prevents the particles from crossing the cortical folds. During propagation, the particles spawn child particles in order to improve the sampling of long fascicles. For each voxel, the intersection of the particle trajectories with the gyri leads to a connectivity profile made up of only 36 connection strengths. These profiles are clustered on a gyrus-by-gyrus basis using a K-means approach that includes spatial regularization. The reproducibility of the results is studied for three subjects using spatial normalization.
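
    A toy version of the final clustering step, with random data standing in for the tractography-derived profiles and the paper's spatial regularization omitted:

```python
import numpy as np
from sklearn.cluster import KMeans

rng = np.random.default_rng(0)
n_voxels, n_gyri, k = 500, 36, 4

# Each voxel of a gyrus carries a 36-dimensional connectivity profile
# (one connection strength per target gyrus); random placeholder data.
profiles = rng.random((n_voxels, n_gyri))

# Normalize so each voxel's profile sums to 1 (relative connectivity).
profiles /= profiles.sum(axis=1, keepdims=True)

# Group voxels with similar connection profiles into k parcels.
labels = KMeans(n_clusters=k, n_init=10, random_state=0).fit_predict(profiles)
print(np.bincount(labels))   # number of voxels per parcel
```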

    Extensive degeneracy, Coulomb phase and magnetic monopoles in an artificial realization of the square ice model

    Artificial spin ice systems have been introduced as a possible means to investigate frustration effects in a well-controlled manner, by fabricating lithographically patterned two-dimensional arrangements of interacting magnetic nanostructures. This approach offers the opportunity to visualize unconventional states of matter directly in real space, and has triggered a wealth of studies at the frontier between nanomagnetism, statistical thermodynamics, and condensed matter physics. Despite the strong efforts made over the last ten years to provide an artificial realization of the celebrated square ice model, no simple geometry based on arrays of nanomagnets has succeeded in capturing the macroscopically degenerate ground-state manifold of the corresponding model. Instead, in all works reported so far, square lattices of nanomagnets are characterized by a magnetically ordered ground state consisting of local flux-closure configurations with alternating chirality. Here we show, both experimentally and theoretically, that all the characteristics of the square ice model can be observed if the artificial square lattice is properly designed. The spin configurations we image after demagnetizing our arrays reveal unambiguous signatures of an algebraic spin liquid state, characterized by the presence of pinch points in the associated magnetic structure factor. Local excitations, i.e. classical analogues of magnetic monopoles, are found to be free to evolve in a massively degenerate, divergence-free vacuum. We thus provide the first lab-on-chip platform allowing the investigation of collective phenomena, including Coulomb phases and ice-like physics.
    Comment: 26 pages, 10 figures.
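
    As a sketch of the structure-factor diagnostic mentioned above: up to the transverse projection used for vector spins (omitted here), the magnetic structure factor is the squared magnitude of the Fourier transform of the spin field. The configuration below is a random placeholder; pinch points would only appear for ice-rule-obeying states.

```python
import numpy as np

rng = np.random.default_rng(7)
L = 64
spins = rng.choice([-1.0, 1.0], size=(L, L))   # placeholder spin field

# Structure factor S(q) = |FT{s}|^2 / N, with q = 0 shifted to the center.
s_q = np.fft.fftshift(np.fft.fft2(spins))
structure_factor = np.abs(s_q) ** 2 / spins.size

print(structure_factor.shape, structure_factor.max())
```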

    The GRAVITY Coudé Infrared Adaptive Optics (CIAO) system for the VLT Interferometer

    GRAVITY is a second-generation instrument for the VLT Interferometer, designed to enhance the near-infrared astrometric and spectro-imaging capabilities of the VLTI. Combining beams from four telescopes, GRAVITY will provide an astrometric precision of order 10 micro-arcseconds, an imaging resolution of 4 milli-arcseconds, and low- and medium-resolution spectro-interferometry, pushing its performance far beyond current infrared interferometric capabilities. To maximise the performance of GRAVITY, adaptive optics correction will be implemented at each of the VLT Unit Telescopes to correct for the effects of atmospheric turbulence. To achieve this, the GRAVITY project includes a development programme for four new wavefront sensors (WFS) and a NIR-optimized real-time control system. These devices will enable closed-loop adaptive correction at the four Unit Telescopes in the range 1.4-2.4 μm. This is crucially important for an efficient adaptive optics implementation in regions where optically bright reference sources are scarce, such as the Galactic Centre. We present here the design of the GRAVITY wavefront sensors and give an overview of the expected adaptive optics performance under typical observing conditions. Benefiting from the newly developed SELEX/ESO SAPHIRA electron avalanche photodiode (eAPD) detectors, which provide fast readout with low noise in the near-infrared, the AO systems are expected to achieve residual wavefront errors of ≤ 400 nm at an operating frequency of 500 Hz.
    Comment: to be published in Proc. SPIE vol. 8446 (2012).
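
    For context on the quoted ≤ 400 nm residual, the extended Maréchal approximation S ≈ exp(−(2πσ/λ)²) gives a rough Strehl-ratio estimate; this is a generic rule of thumb, not a GRAVITY performance figure:

```python
import math

sigma = 400e-9                   # residual wavefront error (m)
for lam in (1.65e-6, 2.2e-6):    # approximate H- and K-band wavelengths (m)
    # Extended Marechal approximation, valid for modest aberrations.
    strehl = math.exp(-(2 * math.pi * sigma / lam) ** 2)
    print(f"lambda = {lam * 1e6:.2f} um -> Strehl ~ {strehl:.2f}")
```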