The Deflationary Bias of the ZLB and the FED’s Strategic Response
The paper shows, in a simple analytical framework, the existence of a deflationary bias in an
economy with a low natural rate of interest, a Zero Lower Bound (ZLB) constraint on
nominal interest rates and a discretionary Central Bank with an inflation mandate. The
presence of the ZLB prevents the central bank from offsetting negative shocks to inflation
whereas it can offset positive shocks. This asymmetry pushes average inflation below the
target which in turn drags down inflation expectations and reinforces the likelihood of hitting
the ZLB. We show that this deflationary bias is particularly relevant for a Central Bank with
a symmetric dual mandate (i.e. minimizing deviations from inflation and employment),
especially when facing demand shocks. But a strict inflation targeter cannot escape the suboptimal deflationary equilibrium either. The deflationary bias can be mitigated by targeting
“shortfalls” instead of “deviations” from maximum employment and/or using flexible
average inflation targeting. However, changing monetary policy strategy risks inflation
expectations becoming entrenched above the target if the natural interest rate increases.
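The asymmetry described above can be illustrated with a toy simulation. The model below is a deliberately crude sketch, not the paper's analytical framework: the central bank offsets each inflation shock one-for-one via the nominal rate, but the rate cannot fall below zero, so part of any large negative shock goes unabsorbed and average inflation lands below the target.

```python
import random

def simulate_avg_inflation(r_natural, pi_target, n=100_000, shock_sd=2.0, seed=0):
    """Toy illustration of the ZLB deflationary bias (stylized, not the
    paper's model). Each period an inflation shock hits; the bank offsets
    it one-for-one with the nominal rate, subject to a zero lower bound."""
    rng = random.Random(seed)
    total = 0.0
    for _ in range(n):
        shock = rng.gauss(0.0, shock_sd)
        desired_rate = r_natural + pi_target + shock  # rate that fully offsets the shock
        actual_rate = max(0.0, desired_rate)          # ZLB constraint
        unoffset = desired_rate - actual_rate         # <= 0: unabsorbed part of a negative shock
        total += pi_target + unoffset
    return total / n
```

With a low natural rate the ZLB binds often and average inflation falls short of the target; with a high natural rate the constraint is essentially never binding and the target is hit on average.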
Space-weighted seismic attenuation mapping of the aseismic source of Campi Flegrei 1983-84 unrest
Quasideterministic generation of maximally entangled states of two mesoscopic atomic ensembles by adiabatic quantum feedback
We introduce an efficient, quasideterministic scheme to generate maximally
entangled states of two atomic ensembles. The scheme is based on quantum
nondemolition measurements of total atomic populations and on adiabatic quantum
feedback conditioned on the measurement outputs. The high efficiency of the
scheme is tested and confirmed numerically for ideal photodetection as well as
in the presence of losses.
Comment: 7 pages, 6 figures; title changed; revised version published in Phys. Rev.
Decoherence of number states in phase-sensitive reservoirs
The non-unitary evolution of initial number states in general Gaussian
environments is solved analytically. Decoherence in the channels is quantified
by determining explicitly the purity of the state at any time. The influence of
the squeezing of the bath on decoherence is discussed. The behavior of coherent
superpositions of number states is addressed as well.
Comment: 5 pages, 2 figures; minor changes; references added.
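As a minimal, concrete special case of the channels above, consider a zero-temperature (phase-insensitive) lossy channel: an initial number state |n⟩ decays into a diagonal binomial mixture of number states, so its purity can be written down directly. The snippet below is an illustrative sketch of this special case, not the paper's general phase-sensitive Gaussian-environment solution.

```python
from math import comb

def fock_purity_after_loss(n, eta):
    """Purity of an initial number state |n> after a pure-loss channel
    with transmissivity eta. Under amplitude damping each photon survives
    independently with probability eta, so the output is the diagonal
    mixture rho = sum_k C(n,k) eta^k (1-eta)^(n-k) |k><k|, and the
    purity Tr(rho^2) is the sum of the squared weights."""
    probs = [comb(n, k) * eta**k * (1 - eta)**(n - k) for k in range(n + 1)]
    return sum(p * p for p in probs)
```

For example, |3⟩ at eta = 0.5 spreads over four number states with weights (1/8, 3/8, 3/8, 1/8), giving purity 20/64 = 0.3125; at eta = 1 (no loss) or eta = 0 (full decay to vacuum) the state stays pure.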
Tunable non-Gaussian resources for continuous-variable quantum technologies
We introduce and discuss a set of tunable two-mode states of
continuous-variable systems, as well as an efficient scheme for their
experimental generation. This novel class of tunable entangled resources is
defined by a general ansatz depending on two experimentally adjustable
parameters. The class is broad and flexible, as it encompasses Gaussian as well as
non-Gaussian states. The latter include, among others, known states such as
squeezed number states and de-Gaussified photon-added and photon-subtracted
squeezed states, which are the most efficient non-Gaussian resources
currently available in the laboratory. Moreover, it contains the classes of
squeezed Bell states and even more general non-Gaussian resources that can be
optimized according to the specific quantum technological task that needs to be
realized. The proposed experimental scheme exploits linear optical operations
and photon detections performed on a pair of uncorrelated two-mode Gaussian
squeezed states. The desired non-Gaussian state is then realized via ancillary
squeezing and conditioning. Two independent, freely tunable experimental
parameters can be exploited to generate different states and to optimize the
performance in implementing a given quantum protocol. As a concrete instance,
we analyze in detail the performance of different states considered as
resources for the realization of quantum teleportation in realistic conditions.
For the fidelity of teleportation of an unknown coherent state, we show that
the resources associated to the optimized parameters outperform, in a
significant range of experimental values, both Gaussian twin beams and
photon-subtracted squeezed states.
Comment: 13 pages, 7 figures.
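For reference, the Gaussian twin-beam benchmark mentioned above has a simple closed form in the ideal, lossless Braunstein-Kimble protocol; the optimized non-Gaussian resources of the paper are designed to beat it. A minimal sketch, assuming the ideal protocol with unit gain:

```python
from math import exp

def twin_beam_fidelity(r):
    """Fidelity of ideal Braunstein-Kimble CV teleportation of an
    unknown coherent state using a two-mode squeezed vacuum (twin beam)
    with squeezing parameter r: F = 1 / (1 + exp(-2 r)).
    The classical (no-entanglement) limit F = 1/2 is recovered at r = 0."""
    return 1.0 / (1.0 + exp(-2.0 * r))
```

The fidelity grows monotonically with the squeezing r and approaches 1 only in the unphysical limit of infinite squeezing, which is one motivation for the optimized non-Gaussian resources discussed in the abstract.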
Optimal estimation of losses at the ultimate quantum limit with non-Gaussian states
We address the estimation of the loss parameter of a bosonic channel probed
by arbitrary signals. Unlike the optimal Gaussian probes, which can attain the
ultimate bound on precision asymptotically either for very small or very large
losses, we prove that Fock states at any fixed photon number saturate the bound
unconditionally for any value of the loss. In the relevant regime of low-energy
probes, we demonstrate that superpositions of the first low-lying Fock states
yield an absolute improvement over any Gaussian probe. Such few-photon states
can be recast quite generally as truncations of de-Gaussified photon-subtracted
states.
Comment: 4 pages, 3 figures.
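The role of Fock probes can be checked in a simple special case: photon counting at the output of a loss (beam-splitter) channel fed with |n⟩ yields binomial statistics, whose classical Fisher information about the transmissivity η is n/(η(1−η)) for every η. The numerical check below is an illustrative sketch under this photon-counting assumption, not the paper's general proof:

```python
from math import comb, log

def binomial_fisher_info(n, eta, h=1e-6):
    """Classical Fisher information about eta carried by photon counting
    after a Fock probe |n> through a loss channel with transmissivity eta.
    The counts are Binomial(n, eta); numerically differentiating the
    log-likelihood recovers the analytic value n / (eta * (1 - eta))."""
    def logp(k, e):
        return log(comb(n, k)) + k * log(e) + (n - k) * log(1 - e)
    fi = 0.0
    for k in range(n + 1):
        p = comb(n, k) * eta**k * (1 - eta)**(n - k)
        score = (logp(k, eta + h) - logp(k, eta - h)) / (2 * h)  # d(log p)/d(eta)
        fi += p * score * score
    return fi
```

Since n/(η(1−η)) holds at any fixed photon number and any loss value, this is consistent with the abstract's claim that Fock probes saturate the precision bound unconditionally.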
Global gyrokinetic simulations of ITG turbulence in the configuration space of the Wendelstein 7-X stellarator
We study the effect of turbulent transport in different magnetic
configurations of the Wendelstein 7-X stellarator. In particular, we performed
direct numerical simulations with the global gyrokinetic code GENE-3D, modeling
the behavior of Ion Temperature Gradient turbulence in the Standard,
High-Mirror, and Low-Mirror configurations of W7-X. We found that the
Low-Mirror configuration produces more transport than both the High-Mirror and
the Standard configurations. By comparison with radially local simulations, we
have demonstrated the importance of performing global nonlinear simulations to
predict the turbulent fluxes quantitatively.
A Gaussian-Mixture based stochastic framework for the interpretation of spatial heterogeneity in multimodal fields
We provide theoretical formulations enabling characterization of the spatial distributions of variables (such as, e.g., conductivity/permeability, porosity, vadose zone hydraulic parameters, and reaction rates) that are typical of hydrogeological and/or geochemical scenarios associated with randomly heterogeneous geomaterials and are organized on various scales of heterogeneity. Our approach and the ensuing formulations embed the joint assessment of the probability distribution of a target variable Y and of its associated spatial increments, ΔY, taken between locations separated by any given distance (or lag). The spatial distribution of Y is interpreted through a bimodal Gaussian mixture model. The modes of the latter correspond to an indicator random field, which is in turn related to the occurrence of different processes and/or geomaterials within the domain of observation. The distribution of each component of the mixture is governed by a given length scale driving the strength of its spatial correlation.

Our model embeds within a unique theoretical framework the main traits arising in a stochastic analysis of these systems. These include (i) a slight to moderate asymmetry in the distribution of Y and (ii) the occurrence of a dominant peak and secondary peaks in the distribution of ΔY, whose importance changes with lag together with the moments of the distribution. This causes the probability distribution of the increments to scale with lag in a way that is consistent with observed experimental patterns.

We analyze the main features of the modeling and parameter estimation framework through a set of synthetic scenarios. We then consider two experimental datasets associated with different processes and observation scales. We start with an original dataset comprising microscale reaction rate maps taken at various observation times. These are evaluated from AFM imaging of the surface of a calcite crystal in contact with a fluid and subject to dissolution. Such recent high-resolution imaging techniques are key to enhancing our knowledge of the processes driving the reaction. The second dataset is a well-established collection of Darcy-scale air-permeability data acquired by Tidwell and Wilson (1999) [Water Resour. Res., 35, 3375-3387] on a block of volcanic tuff through minipermeameters associated with various measurement scales.
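The bimodal Gaussian mixture at the core of the model above is easy to sketch numerically. The snippet below is a generic illustration with invented parameter names, not the paper's estimation framework: an indicator (mode) variable selects one of two Gaussian components with distinct means and spreads, reproducing the bimodal density used for Y and its increments.

```python
import random
from math import exp, pi, sqrt

def mixture_pdf(y, w, mu1, s1, mu2, s2):
    """Density of a two-component (bimodal) Gaussian mixture with
    weight w on the first component. Parameter names are illustrative,
    not the paper's notation."""
    g = lambda x, m, s: exp(-((x - m) ** 2) / (2 * s * s)) / (s * sqrt(2 * pi))
    return w * g(y, mu1, s1) + (1 - w) * g(y, mu2, s2)

def sample_mixture(n, w, mu1, s1, mu2, s2, seed=0):
    """Draw n samples: first pick a mode via the indicator (probability w),
    then sample from the corresponding Gaussian component."""
    rng = random.Random(seed)
    return [rng.gauss(mu1, s1) if rng.random() < w else rng.gauss(mu2, s2)
            for _ in range(n)]
```

The sample mean converges to the weighted mean w·mu1 + (1−w)·mu2, while the density itself stays bimodal whenever the component means are well separated relative to their spreads, which is the qualitative behaviour the paper exploits for multimodal fields.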
Exploring the links between cancer and placenta development
The development of metastatic cancer is a multistage process, which often requires decades to complete. Impairments in DNA damage control and DNA repair in cancer cell precursors generate genetically heterogeneous cell populations. However, despite this heterogeneity, most solid cancers show stereotypical behaviours, including invasiveness and suppression of immune responses that can be unleashed with immunotherapy targeting lymphocyte checkpoints. The mechanisms leading to the acquisition of these stereotypical properties remain poorly understood. Reactivation of embryonic development processes in cells with unstable genomes might contribute to tumour expansion and metastasis formation. However, it is unclear whether these events are linked to immune response modulation. Tumours and embryos both carry non-self components and need to avoid immune responses in their microenvironment. In mammalian embryos, neo-antigens are of paternal origin, whereas in tumour cells they are generated by DNA mismatch repair and replication defects. Inactivation of the maternal immune response towards the embryo, which occurs at the placental-maternal interface, is key to ensuring embryonic development. This regulation is accomplished by the trophoblast, which mimics several malignant cell features, including the ability to invade normal tissues and to avoid host immune responses, often adopting the same cancer immunoediting strategies. A better understanding of whether and how genotoxic stress promotes cancer development through reactivation of programmes occurring during the early stages of mammalian placentation could help to clarify resistance to drugs targeting immune checkpoints and DNA damage responses, and to develop new therapeutic strategies to eradicate cancer.