    Self-consistent simulations of a von Kármán type dynamo in a spherical domain with metallic walls

    We have performed numerical simulations of boundary-driven dynamos using a three-dimensional non-linear magnetohydrodynamical model in a spherical shell geometry. A conducting fluid of magnetic Prandtl number Pm=0.01 is driven into motion by the counter-rotation of the two hemispheric walls. The resulting flow is of von Kármán type, consisting of a layer of zonal velocity close to the outer wall and a secondary meridional circulation. Above a certain forcing threshold, the mean flow is unstable to non-axisymmetric motions within an equatorial belt. For fixed forcing above this threshold, we have studied the dynamo properties of this flow. The presence of a conducting outer wall is essential to the existence of a dynamo at these parameters. We have therefore studied the effect of changing the material parameters of the wall (magnetic permeability, electrical conductivity, and thickness) on the dynamo. In common with previous studies, we find that dynamos are obtained only when either the conductivity or the permeability is sufficiently large. However, we find that the effects of these two parameters on the dynamo process differ and can even compete to the detriment of the dynamo. Our self-consistent approach allows us to analyse the dynamo feedback loop in detail. The dynamos we obtain are typically dominated by an axisymmetric toroidal magnetic field and an axial dipole component. We show that the ability of the outer shear layer to produce a strong toroidal field depends critically on the presence of a conducting outer wall, which shields the fluid from the vacuum outside. The generation of the axisymmetric poloidal field, on the other hand, occurs in the equatorial belt and does not depend on the wall properties.
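
    For orientation, the incompressible MHD equations underlying such simulations can be written schematically as below; the paper's exact non-dimensionalisation is not specified here, so the scaling is an assumption.

```latex
\begin{aligned}
\partial_t \mathbf{u} + (\mathbf{u}\cdot\nabla)\,\mathbf{u}
  &= -\nabla p + \nu\,\nabla^2 \mathbf{u}
     + (\nabla\times\mathbf{B})\times\mathbf{B}, \\
\partial_t \mathbf{B}
  &= \nabla\times(\mathbf{u}\times\mathbf{B}) + \eta\,\nabla^2 \mathbf{B},
  \qquad \nabla\cdot\mathbf{u} = \nabla\cdot\mathbf{B} = 0,
\end{aligned}
```

    with the magnetic Prandtl number Pm = ν/η fixed at 0.01, and units chosen so that density and permeability are absorbed into B in the Lorentz force term.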

    The XENON100 exclusion limit without considering Leff as a nuisance parameter

    In 2011, the XENON100 experiment set unprecedented constraints on dark matter-nucleon interactions, excluding dark matter candidates with masses down to 6 GeV if the corresponding cross section is larger than 10^{-39} cm^2. The dependence of the exclusion limit on the scintillation efficiency (Leff) has been debated at length. To address possible criticisms, XENON100 performed an analysis in which Leff was treated as a nuisance parameter and its uncertainties were profiled out using a Gaussian likelihood whose mean value corresponds to the best-fit Leff, smoothly extrapolated to zero below 3 keVnr. Although such a method seems fairly robust, it does not account for more extreme types of extrapolation, nor does it make it possible to anticipate how much the exclusion limit would vary if, for example, new data were to support a flat behaviour of Leff below 3 keVnr. Yet such a question is crucial for light dark matter models which lie close to the published XENON100 limit. To address this issue, we use a maximum likelihood ratio analysis, as done by the XENON100 collaboration, but do not consider Leff as a nuisance parameter. Instead, Leff is obtained directly from fits to the data. This enables us to define frequentist confidence intervals by marginalising over Leff.
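
    A minimal sketch of the statistical idea, assuming a toy binned Poisson model (all numbers, binning, and the signal model below are invented for illustration; this is not the XENON100 analysis code): the cross-section limit comes from a likelihood-ratio scan in which Leff is fitted freely rather than constrained by a Gaussian nuisance term.

```python
import numpy as np
from scipy.optimize import minimize_scalar
from scipy.stats import chi2

background = np.array([4.0, 3.0, 2.0, 1.0])   # expected background counts
observed   = np.array([5,   2,   3,   1])     # toy observed counts

def signal_expectation(sigma, leff):
    # Toy response: expected signal counts scale with the cross
    # section sigma and with how far leff lifts events above threshold.
    response = np.array([0.1, 0.4, 0.8, 1.0]) * leff
    return sigma * response

def nll(sigma, leff):
    mu = background + signal_expectation(sigma, leff)
    return np.sum(mu - observed * np.log(mu))  # Poisson NLL (up to const.)

def profiled_nll(sigma):
    # Leff is fitted within a plausible band instead of being
    # constrained by a Gaussian nuisance term.
    res = minimize_scalar(lambda l: nll(sigma, l),
                          bounds=(0.05, 0.2), method="bounded")
    return res.fun

# 90% CL upper limit: where the profiled NLL rises by
# chi2.ppf(0.9, df=1)/2 above its minimum along the sigma scan.
sigmas = np.linspace(0.0, 30.0, 301)
scan = np.array([profiled_nll(s) for s in sigmas])
limit = sigmas[scan <= scan.min() + chi2.ppf(0.90, df=1) / 2].max()
print(f"toy 90% CL upper limit: sigma ~ {limit:.1f} (arbitrary units)")
```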

    The transportation sector and low-carbon growth pathways: modeling urban, infrastructure and spatial determinants of mobility

    There is still controversy as to the effect of spatial organization on CO2 emissions. This paper contributes to this debate by investigating the potential offered by infrastructure measures favoring lower mobility in the transition to a low-carbon economy. This is done by embedding a detailed description of passenger and freight transportation in an energy-economy-environment (E3) model. In addition to the standard representation of transport technologies, this framework explicitly considers the "behavioural" determinants of mobility that drive the demand for transport but are often disregarded in mitigation assessments: constrained mobility needs (essentially commuting) imposed by the spatial organization of residence and production, modal choices triggered by installed infrastructure, and the freight transport intensity of production processes. This study demonstrates that implementing measures that foster a modal shift towards low-carbon modes and a decoupling of mobility needs from economic activity significantly modifies the sectoral distribution of mitigation efforts and reduces the carbon tax levels necessary to reach a given climate target relative to a "carbon price only" policy. This result is robust to a wide range of assumptions about exogenous parameters.
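
    A toy numerical illustration of the headline mechanism (all numbers and functional forms below are invented; this is not the paper's E3 model): shifting the modal mix toward low-carbon modes lowers the carbon tax required to meet a fixed emissions target.

```python
import numpy as np
from scipy.optimize import brentq

intensity = {"car": 150.0, "rail": 30.0}   # gCO2 per passenger-km (toy)
demand0 = 100.0                            # baseline passenger-km

def emissions(tax, car_share):
    # Demand falls with the tax (toy constant elasticity of -0.3).
    demand = demand0 * (1.0 + tax / 100.0) ** -0.3
    mix = car_share * intensity["car"] + (1 - car_share) * intensity["rail"]
    return demand * mix

# Target: 30% cut relative to the untaxed, car-heavy baseline.
target = 0.7 * emissions(0.0, car_share=0.8)

for share, label in [(0.8, "carbon price only"), (0.6, "price + modal shift")]:
    tax = brentq(lambda t: emissions(t, share) - target, 0.0, 10000.0)
    print(f"{label}: required tax ~ {tax:.0f} (toy units)")
```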

    Existence of Weak Solutions for the Unsteady Interaction of a Viscous Fluid with an Elastic Plate

    We consider a three-dimensional viscous incompressible fluid governed by the Navier-Stokes equations, interacting with an elastic plate located on one part of the fluid boundary. We do not neglect the deformation of the fluid domain, which consequently depends on the displacement of the structure. The purpose of this work is to study the solutions of this unsteady fluid-structure interaction problem as the coefficient modeling the viscoelasticity (resp. the rotatory inertia) of the plate tends to zero. As a consequence, we obtain the existence of at least one weak solution for the limit problem (Navier-Stokes equations coupled with a plate in flexion) as long as the structure does not touch the bottom of the fluid cavity.
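
    Schematically, and as an assumption about notation rather than a quotation of the paper, the plate displacement η can be thought of as obeying an equation of the form

```latex
\partial_t^2 \eta
\;-\; \beta\,\Delta\,\partial_t^2 \eta
\;+\; \Delta^2 \eta
\;+\; \gamma\,\Delta^2\,\partial_t \eta
\;=\; f_{\mathrm{fluid}},
```

    where the β term models rotatory inertia, the γ term viscoelastic damping, and f_fluid the stress exerted by the fluid on the plate; the result concerns weak solutions in the limit γ → 0 (resp. β → 0). The exact form and scalings used by the authors may differ.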

    Nonparametric Bayesian extraction of object configurations in massive data

    This study presents an unsupervised method for detecting configurations of objects based on a point process in a nonparametric Bayesian framework; the model is nonparametric in the sense that its number of parameters grows with the number of detected objects. The marked point process yields a natural sparse representation of the object configuration, even in massive data fields. However, Bayesian methods can require the evaluation of densities that raise computational issues, due to the huge number of detected objects. We have developed an iterative update of these densities when changes are made to the object configuration, which reduces the computational cost. The performance of the proposed algorithm is illustrated on synthetic data and on very challenging quasi-real hyperspectral data for young galaxy detection.
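
    A minimal sketch of the incremental-update idea (my own toy construction, not the authors' algorithm): in a birth/death sampler over object configurations, each proposal only touches a small patch of the data field, so the score change can be computed locally instead of re-evaluating the whole field.

```python
import numpy as np

rng = np.random.default_rng(1)
field = rng.normal(size=(64, 64))          # toy data field
field[20:24, 30:34] += 3.0                 # one hidden "object"

def local_gain(pos):
    # Data-fit gain of an object at pos: mean intensity of the 4x4
    # patch it covers -- only this patch is read, never the full field.
    y, x = pos
    return field[y:y + 4, x:x + 4].mean()

objects, log_score = [], 0.0
penalty = 2.0                               # prior cost per object

for step in range(5000):
    if objects and rng.random() < 0.5:      # death move
        i = rng.integers(len(objects))
        delta = penalty - local_gain(objects[i])
        if np.log(rng.random()) < delta:    # Metropolis-style accept
            log_score += delta
            objects.pop(i)
    else:                                   # birth move
        pos = (int(rng.integers(0, 60)), int(rng.integers(0, 60)))
        delta = local_gain(pos) - penalty
        if np.log(rng.random()) < delta:
            log_score += delta
            objects.append(pos)

print(len(objects), "objects kept; final score", round(log_score, 2))
```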

    Error control for the detection of rare and weak events in massive data fields

    In this paper, we address the general issue of detecting rare and weak signatures in very noisy data. Multiple hypothesis testing approaches can be used to extract a list of components of the data that are likely to be contaminated by a source while controlling a global error criterion. However, most of the efficient methods available in the literature hold only for independent tests, or require specific dependency hypotheses. Building on the work of Benjamini and Yekutieli [1], we show that under some classical positivity assumptions, the Benjamini-Hochberg procedure for False Discovery Rate (FDR) control [2] can be applied directly to the statistics produced by a very common tool in signal and image processing that introduces dependency: the matched filter.

    Error control for the detection of rare and weak signatures in massive data

    In this paper, we address the general issue of detecting rare and weak signatures in very noisy data. Multiple hypothesis testing approaches can be used to extract a list of components of the data that are likely to be contaminated by a source while controlling a global error criterion. However, most of the efficient methods available in the literature are derived for independent tests. Building on the work of Benjamini and Yekutieli [1], we show that under some classical positivity assumptions, the Benjamini-Hochberg procedure for False Discovery Rate (FDR) control can be applied directly to the result produced by a very common tool in signal and image processing: the matched filter. This shows that despite the dependency structure between the components of the matched filter output, the Benjamini-Hochberg procedure still guarantees FDR control. This is illustrated on both synthetic and real data.
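
    A self-contained sketch of the pipeline under stated assumptions (toy data and thresholds; not the authors' code): matched filtering of a noisy 1-D signal followed by the Benjamini-Hochberg step-up procedure on the resulting p-values.

```python
import numpy as np
from scipy.signal import fftconvolve
from scipy.stats import norm

rng = np.random.default_rng(2)
n = 10000
template = np.exp(-0.5 * (np.arange(-5, 6) / 1.5) ** 2)
template /= np.linalg.norm(template)       # unit-norm matched filter

signal = np.zeros(n)
signal[rng.choice(n, 20)] = 5.0            # 20 rare, weak sources
data = fftconvolve(signal, template, mode="same") + rng.normal(size=n)

# Matched-filter statistic: unit variance under H0 since ||template|| = 1,
# but neighbouring outputs are correlated through the template.
mf = fftconvolve(data, template[::-1], mode="same")
pvals = norm.sf(mf)                        # one-sided p-values

# Benjamini-Hochberg step-up procedure at FDR level q.
q = 0.05
order = np.argsort(pvals)
below = pvals[order] <= q * np.arange(1, n + 1) / n
k = np.nonzero(below)[0].max() + 1 if below.any() else 0
detections = order[:k]
print(f"{len(detections)} samples flagged at FDR level {q}")
```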

    Qudits of composite dimension, mutually unbiased bases and projective ring geometry

    The d^2 Pauli operators attached to a composite qudit in dimension d may be mapped to the vectors of the symplectic module Z_d^2 (Z_d being the modular ring). As a result, perpendicular vectors correspond to commuting operators, a free cyclic submodule to a maximal commuting set, and disjoint such sets to mutually unbiased bases. For the dimensions d = 6, 10, 15, 12, and 18, the fine structure and the incidence between maximal commuting sets are found to reproduce the projective line over the rings Z_6, Z_10, Z_15, Z_6 × F_4 and Z_6 × Z_3, respectively.
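
    The commutation criterion can be checked mechanically. In the standard generalized-Pauli convention, X^a Z^b and X^a' Z^b' commute iff ab' - ba' ≡ 0 (mod d); the sketch below (my own illustration, not the authors' code) enumerates the free cyclic submodules of Z_d^2 for d = 6 and verifies that each one is a commuting set.

```python
from itertools import product

d = 6  # one of the composite dimensions considered in the paper

def commute(u, v):
    # Symplectic form on Z_d^2: X^a Z^b and X^a' Z^b' commute
    # exactly when a*b' - b*a' vanishes mod d.
    (a, b), (ap, bp) = u, v
    return (a * bp - b * ap) % d == 0

vectors = [v for v in product(range(d), repeat=2) if v != (0, 0)]

# Free cyclic submodules Z_d * (a, b) correspond to maximal
# commuting sets; collect the distinct ones.
submodules = set()
for v in vectors:
    orbit = frozenset(((k * v[0]) % d, (k * v[1]) % d) for k in range(1, d))
    if len(orbit) == d - 1:                 # v generates a free submodule
        submodules.add(orbit)

print(f"d={d}: {len(submodules)} maximal commuting sets")  # 12 = |P1(Z_6)|
for s in sorted(map(sorted, submodules))[:3]:
    assert all(commute(u, v) for u in s for v in s)
    print(s)
```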

    Worldwide spreading of economic crisis

    We model the spreading of a crisis by constructing a global economic network and applying the Susceptible-Infected-Recovered (SIR) epidemic model with a variable probability of infection. The probability of infection depends on the strength of the economic relations between a pair of countries and on the economic strength of the target country. It is expected that a crisis originating in a large country, such as the USA, has the potential to spread globally, like the recent crisis. Surprisingly, we show that countries with much lower GDP, such as Belgium, are also able to initiate a global crisis. Using the k-shell decomposition method to quantify the spreading power of a node, we obtain a measure of "centrality" of each country as a spreader in the economic network. We rank the countries according to the shell they belong to and identify the 12 most central ones. These countries are the most likely to spread a crisis globally. Of these 12, only six are large economies, while the other six are medium or small ones, a result that could not otherwise have been anticipated. Furthermore, we use our model to predict the crisis spreading potential of countries belonging to different shells according to the crisis magnitude.
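
    A toy reconstruction of the simulation setup (the network, weights, and infection rule below are assumptions for illustration, not the paper's data): SIR spreading on a weighted network with a link-dependent infection probability, with candidate spreaders ranked by their k-shell index via networkx.

```python
import networkx as nx
import numpy as np

rng = np.random.default_rng(3)
G = nx.barabasi_albert_graph(200, 3, seed=3)        # toy topology
for u, v in G.edges:
    G[u][v]["w"] = rng.pareto(2.0) + 0.1            # toy trade intensity
strength = {n: sum(G[n][m]["w"] for m in G[n]) for n in G}

def infection_prob(u, v):
    # Stronger trade link -> higher exposure; stronger target
    # economy -> harder to infect (both assumptions of this toy).
    return min(1.0, G[u][v]["w"] / strength[v])

def sir(seed):
    infected, recovered = {seed}, set()
    while infected:
        new = set()
        for u in infected:
            for v in G[u]:
                if v not in infected and v not in recovered:
                    if rng.random() < infection_prob(u, v):
                        new.add(v)
        recovered |= infected
        infected = new
    return len(recovered)

# Rank nodes by k-shell index; the innermost shell holds the
# candidate global spreaders.
core = nx.core_number(G)
top_shell = [n for n in G if core[n] == max(core.values())]
print("outbreak sizes from top-shell seeds:",
      [sir(n) for n in top_shell[:5]])
```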