5,993 research outputs found
Ash plume properties retrieved from infrared images: a forward and inverse modeling approach
We present a coupled fluid-dynamic and electromagnetic model for volcanic ash
plumes. In a forward approach, the model is able to simulate the plume dynamics
from prescribed input flow conditions and generate the corresponding synthetic
thermal infrared (TIR) image, allowing a comparison with field-based
observations. An inversion procedure is then developed to retrieve ash plume
properties from TIR images.
The adopted fluid-dynamic model is based on a one-dimensional, stationary
description of a self-similar (top-hat) turbulent plume, for which an
asymptotic analytical solution is obtained. The electromagnetic
emission/absorption model is based on Schwarzschild's equation and on Mie's
theory for dispersed particles, assuming that the particles are coarser than
the radiation wavelength and neglecting scattering. [...]
Application of the inversion procedure to an ash plume at Santiaguito volcano
(Guatemala) has allowed us to retrieve the main plume input parameters, namely
the initial radius, velocity, temperature and gas mass ratio, the entrainment
coefficient, and their related uncertainties. Moreover, by coupling with the
electromagnetic model, we have been able to obtain a reliable estimate of the
equivalent Sauter diameter of the total particle size distribution.
The presented method is general and, in principle, can be applied to the
spatial distribution of particle concentration and temperature obtained by any
fluid-dynamic model, either integral or multidimensional, stationary or
time-dependent, single or multiphase. The method discussed here is fast and
robust, thus indicating potential for applications to real-time estimation of
ash mass flux and particle size distribution, which is crucial for model-based
forecasts of the volcanic ash dispersal process.
Comment: 41 pages, 13 figures, submitted paper
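The equivalent Sauter diameter mentioned above has a standard definition, independent of the authors' inversion procedure: it is the diameter of a sphere with the same volume-to-surface ratio as the whole distribution. As a minimal illustration (not the paper's method), it can be computed from binned particle counts:

```python
def sauter_diameter(diameters, counts):
    """Sauter (volume-to-surface equivalent) diameter d32 of a binned
    particle size distribution: d32 = sum(n_i d_i^3) / sum(n_i d_i^2)."""
    num = sum(n * d**3 for d, n in zip(diameters, counts))
    den = sum(n * d**2 for d, n in zip(diameters, counts))
    return num / den

# Two equally populated size bins (arbitrary units):
print(sauter_diameter([1.0, 2.0], [1, 1]))  # (1 + 8) / (1 + 4) = 1.8
```

Because d32 weights by volume over surface, it is dominated by the coarser bins, which is why it is a natural single-number summary for radiative-transfer purposes.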
Emerging role of angiogenesis in adaptive and maladaptive right ventricular remodeling in pulmonary hypertension
Right ventricular (RV) function is the primary prognostic factor for both morbidity and mortality in pulmonary hypertension (PH). RV hypertrophy is initially an adaptive physiological response to increased overload; however, with a persistent and/or progressive afterload increase, this response frequently transitions to more pathological maladaptive remodeling. The mechanisms and disease processes underlying this transition are mostly unknown. Angiogenesis has recently emerged as a major modifier of RV adaptation in the setting of pressure overload. A novel paradigm has emerged suggesting that angiogenesis and angiogenic signaling are required for RV adaptation to afterload increases and that impaired and/or insufficient angiogenesis is a major driver of RV decompensation. Here, we summarize our current understanding of the concepts of maladaptive and adaptive RV remodeling, discuss the current literature on angiogenesis in the adapted and failing RV, and identify potential therapeutic approaches targeting angiogenesis in RV failure.
The Maison Europeenne des Procedes Innovants (MEPI), an example of a piloting and industrial demonstration facility for green process engineering
Abstract. Economic, energy-saving and environmental challenges require a real technological breakthrough in process engineering, with productivity, product quality, safety and reliability objectives. This explains the present growth of interest in innovative technologies (intensified devices for reaction, mixing and separation) and methods (multifunctionality, hybrid separation, batch-to-continuous methodology, new media …), the whole being recognized as Process Intensification (PI). Up to now, few of these innovations have been successfully industrialized, probably owing to the lack of experience and of retrofitting options in the face of a breakthrough that always represents a technical and financial risk. There is now clearly a need for industrial demonstrations of successful PI experiments that avoid questions of confidentiality. Consequently, a piloting and demonstration facility has been created in Toulouse in order to accelerate the implementation of PI technology in industry and the development of green process engineering. The idea is to build a data bank of success stories. The principle of this industrial technical platform lies in the association of three types of partners: universities, equipment providers and industrial end-users.
Baryon Asymmetry of the Universe without Boltzmann or Kadanoff-Baym
We present a formalism that allows the computation of the baryon asymmetry of
the universe from first principles of statistical physics and quantum field
theory that is applicable to certain types of beyond-the-Standard-Model physics
(such as the neutrino Minimal Standard Model -- the νMSM) and does not require
the solution of Boltzmann or Kadanoff-Baym equations. The formalism works if a
thermal bath of Standard Model particles is very weakly coupled to a new sector
(sterile neutrinos in the νMSM case) that is out of equilibrium. The key
point that allows a computation without kinetic equations is that the number of
sterile neutrinos produced during the relevant cosmological period remains
small. In such a case, it is possible to expand the formal solution of the von
Neumann equation perturbatively and obtain a master formula for the lepton
asymmetry expressed in terms of non-equilibrium Wightman functions. The master
formula neatly separates CP-violating contributions from finite temperature
correlation functions and satisfies all three Sakharov conditions. These
correlation functions can then be evaluated perturbatively; the validity of the
perturbative expansion depends on the parameters of the model considered. Here
we choose a toy model (containing only two active and two sterile neutrinos) to
illustrate the use of the formalism, but it could be applied to other models.
Comment: 26 pages, 10 figures
Eigenvalues and simplicity of interval exchange transformations
For a class of d-interval exchange transformations, which we call the symmetric class, we define a new self-dual induction process in which the system is successively induced on a union of sub-intervals. This algorithm gives rise to an underlying graph structure which reflects the dynamical behavior of the system, through the Rokhlin towers of the induced maps. We apply it to build a wide assortment of explicit examples on four intervals having different dynamical properties: these include the first nontrivial examples with eigenvalues (rational or irrational), the first ever example of an exchange on more than three intervals satisfying Veech's simplicity (though this weakening of the notion of minimal self-joinings was designed in 1982 to be satisfied by interval exchange transformations), and an unexpected example which is non-uniquely ergodic, weakly mixing for one invariant ergodic measure but has rational eigenvalues for the other invariant ergodic measure.
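As a minimal illustration of the objects involved (not of the paper's self-dual induction), an interval exchange transformation cuts [0, 1) into d subintervals of prescribed lengths and rearranges them by a permutation. The sketch below uses the order-reversing permutation, which is our reading of the "symmetric" class named above:

```python
def interval_exchange(lengths, perm):
    """Build the interval exchange map of [0, 1) that cuts the interval
    into len(lengths) pieces and reorders them so that piece i is moved
    to position perm[i] (0-indexed, left to right)."""
    d = len(lengths)
    lefts = [sum(lengths[:i]) for i in range(d)]     # left endpoints before the exchange
    order = sorted(range(d), key=lambda i: perm[i])  # pieces in their new left-to-right order
    new_lefts, acc = {}, 0.0
    for i in order:
        new_lefts[i] = acc
        acc += lengths[i]
    def T(x):
        for i in range(d):
            if lefts[i] <= x < lefts[i] + lengths[i]:
                return new_lefts[i] + (x - lefts[i])  # translate the piece rigidly
        raise ValueError("x must lie in [0, 1)")
    return T

# 4-interval example with the order-reversing ("symmetric") permutation:
T = interval_exchange([0.1, 0.2, 0.3, 0.4], [3, 2, 1, 0])
print(T(0.05))  # the first piece [0, 0.1) is sent to the rightmost slot [0.9, 1.0)
```

Each piece is translated rigidly, so T is a piecewise isometry; all the dynamical subtlety (minimality, eigenvalues, ergodicity) comes from how the translated pieces interleave under iteration.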
Correlation functions of the one-dimensional attractive Bose gas
The zero-temperature correlation functions of the one-dimensional attractive
Bose gas with delta-function interaction are calculated analytically for any
value of the interaction parameter and number of particles, directly from the
integrability of the model. We point out a number of interesting features,
including zero recoil energy for large number of particles, analogous to a
Mössbauer effect.
Comment: 4 pages, 2 figures
Revisiting Shared Data Protection Against Key Exposure
This paper sheds new light on secure data storage inside distributed
systems. Specifically, it revisits computational secret sharing in a situation
where the encryption key is exposed to an attacker. It comes with several
contributions: First, it defines a security model for encryption schemes, where
we ask for additional resilience against exposure of the encryption key.
Precisely we ask for (1) indistinguishability of plaintexts under full
ciphertext knowledge, (2) indistinguishability for an adversary who learns: the
encryption key, plus all but one share of the ciphertext. (2) relaxes the
"all-or-nothing" property to a more realistic setting, where the ciphertext is
transformed into a number of shares, such that the adversary can't access one
of them. (1) asks that, unless the user's key is disclosed, no one other than
the user can retrieve information about the plaintext. Second, it introduces a
new computationally secure encryption-then-sharing scheme that protects the
data
in the previously defined attacker model. It consists of data encryption
followed by a linear transformation of the ciphertext, then its fragmentation
into shares, along with secret sharing of the randomness used for encryption.
The computational overhead in addition to data encryption is reduced by half
with respect to the state of the art. Third, it provides for the first time
cryptographic proofs in this context of key exposure. It emphasizes that the
security of our scheme relies only on a simple cryptanalysis resilience
assumption for blockciphers in public key mode: indistinguishability from
random, of the sequence of differentials of a random value. Fourth, it
provides an alternative scheme relying on the more theoretical random
permutation model. It consists of encrypting with sponge functions in duplex
mode and then, as before, secret-sharing the randomness.
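For intuition only (this is not the paper's scheme, which uses a linear transform of the ciphertext plus computational secret sharing), the core idea that an adversary missing one share learns nothing can be seen in the simplest n-of-n construction, XOR secret sharing of a ciphertext:

```python
import os
from functools import reduce

def xor_bytes(a: bytes, b: bytes) -> bytes:
    return bytes(x ^ y for x, y in zip(a, b))

def share(data: bytes, n: int) -> list:
    """Split data into n shares. All n shares XOR back to data, while any
    n-1 of them are jointly uniform random and reveal nothing about data."""
    shares = [os.urandom(len(data)) for _ in range(n - 1)]
    shares.append(reduce(xor_bytes, shares, data))
    return shares

def recover(shares: list) -> bytes:
    return reduce(xor_bytes, shares)

ct = b"some ciphertext bytes"
parts = share(ct, 5)
assert recover(parts) == ct
```

The n-1 random shares act as one-time pads for the last one, which is where the information-theoretic "missing one share" guarantee comes from; the paper's contribution lies in achieving a comparable guarantee at lower computational cost.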
Mandated data archiving greatly improves access to research data
The data underlying scientific papers should be accessible to researchers
both now and in the future, but how best can we ensure that these data are
available? Here we examine the effectiveness of four approaches to data
archiving: no stated archiving policy, recommending (but not requiring)
archiving, and two versions of mandating data deposition at acceptance. We
control for differences between data types by trying to obtain data from papers
that use a single, widespread population genetic analysis, STRUCTURE. At one
extreme, we found that mandated data archiving policies that require the
inclusion of a data availability statement in the manuscript improve the odds
of finding the data online almost a thousand-fold compared to having no policy.
However, archiving rates at journals with less stringent policies were only
very slightly higher than those with no policy at all. We also assessed the
effectiveness of asking
for data directly from authors and obtained over half of the requested
datasets, albeit with a delay of about 8 days and some disagreement with authors.
Given the long-term benefits of data accessibility to the academic community,
we believe that journal-based mandatory data archiving policies and mandatory
data availability statements should be more widely adopted.
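The "almost a thousand-fold" comparison above is an odds ratio between archiving policies. As a small illustration with hypothetical counts (the paper's actual counts are not reproduced here), it is computed as:

```python
def odds_ratio(found_a, total_a, found_b, total_b):
    """Odds ratio of finding data online under policy A versus policy B."""
    odds_a = found_a / (total_a - found_a)  # odds of success under policy A
    odds_b = found_b / (total_b - found_b)  # odds of success under policy B
    return odds_a / odds_b

# Hypothetical: 90/100 datasets found under a mandate, 10/100 with no policy.
print(odds_ratio(90, 100, 10, 100))  # (90/10) / (10/90) = 81
```

Odds ratios grow much faster than ratios of raw proportions as the mandated-policy success rate approaches 100%, which is how a modest-looking difference in percentages can translate into a near-thousand-fold figure.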
Refurbishing Voyager 1 & 2 Planetary Radio Astronomy (PRA) Data
Voyager/PRA (Planetary Radio Astronomy) data from digitized tapes archived at
CNES have been reprocessed and recalibrated. The data cover the Jupiter and
Saturn flybys of both Voyager probes. We have also reconstructed
goniopolarimetric datasets (flux and polarization) at full resolution. These
datasets are currently not available to the scientific community, but they are
of primary interest for the analysis of the Cassini data at Saturn, and the
Juno data at Jupiter, as well as for the preparation of the JUICE mission. We
present the first results derived from the re-analysis of this dataset.
Comment: Accepted manuscript for the PRE8 (Planetary Radio Emission VIII
conference) proceedings
Robust Constraint on a Drifting Proton-to-Electron Mass Ratio at z=0.89 from Methanol Observation at Three Radio Telescopes
A limit on a possible cosmological variation of the proton-to-electron mass
ratio μ is derived from methanol (CH3OH) absorption lines in the benchmark
PKS1830-211 lensing galaxy at redshift z=0.89, observed with the Effelsberg
100-m radio telescope, the Institut de Radioastronomie Millimétrique 30-m
telescope, and the Atacama Large Millimeter/submillimeter Array. Ten different
absorption lines of CH3OH covering a wide range of sensitivity coefficients
are used to derive a purely statistical 1σ constraint on Δμ/μ for a lookback
time of 7.5 billion years. Systematic effects of chemical segregation,
excitation temperature, frequency dependence and time variability of the
background source are quantified. A multi-dimensional linear regression
analysis leads to a robust constraint on Δμ/μ.
Comment: 5 pages, 3 figures. Published in PR
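The statistical core of such an analysis is a linear regression of per-line velocity offsets against the lines' sensitivity coefficients, whose slope estimates the fractional drift. A minimal sketch on synthetic data (the coefficient values, noise model and sign convention are illustrative assumptions, not the paper's):

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic per-line data: sensitivity coefficients K and velocity offsets
# (km/s) generated with a known injected slope plus Gaussian noise.
K = np.array([-1.0, 0.1, 0.3, 1.5, 3.0, 7.4])  # illustrative values
true_slope = 0.02                               # km/s per unit K (injected)
v = true_slope * K + rng.normal(0.0, 0.001, K.size)

# Least-squares fit of v = slope * K + intercept:
slope, intercept = np.polyfit(K, v, 1)
print(slope)  # close to the injected 0.02
```

Spreading the lines over a wide range of K is what gives the fit its leverage: the wider the spread, the smaller the uncertainty on the slope for a given per-line velocity precision.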