Supply-Side Policies and the Zero Lower Bound
This paper examines how supply-side policies may help fight the low aggregate demand that traps an economy at the zero lower bound (ZLB) of nominal interest rates. Future increases in productivity, or reductions in mark-ups triggered by supply-side policies, generate a wealth effect that pulls current consumption and output up. Because the economy is at the ZLB, increases in the interest rate do not undo this wealth effect, as they would outside the ZLB. We illustrate this mechanism with a simple two-period New Keynesian model, discuss possible objections to this set of policies, and relate supply-side policies to more conventional monetary and fiscal policies.
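A back-of-the-envelope sketch of the wealth-effect mechanism, assuming log utility and illustrative parameter values that are not taken from the paper:

```python
# Two-period consumption Euler equation with log utility:
#   1/c1 = BETA * (1 + i) / PI2 * (1/c2)
# rearranged to c1 = c2 * PI2 / (BETA * (1 + i)).

BETA = 0.99   # discount factor (assumed)
PI2 = 1.0     # gross future inflation, held fixed for simplicity

def current_consumption(c2, i):
    """Period-1 consumption implied by the Euler equation."""
    return c2 * PI2 / (BETA * (1.0 + i))

# At the ZLB the nominal rate i is stuck at zero, so a supply-side
# reform that raises expected future output c2 pulls c1 up with it.
baseline = current_consumption(1.00, i=0.0)
reform = current_consumption(1.05, i=0.0)    # 5% higher future output
assert reform > baseline

# Outside the ZLB, a 5% rate hike exactly undoes the wealth effect here.
assert abs(current_consumption(1.05, i=0.05) - baseline) < 1e-9
```

The point of the toy calculation is only that c1 is increasing in c2 when i cannot move, which is the mechanism the abstract describes.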
Processes with Long Memory: Regenerative Construction and Perfect Simulation
We present a perfect simulation algorithm for stationary processes indexed by
Z, with summable memory decay. Depending on the decay, we construct the process
on finite or semi-infinite intervals, explicitly from an i.i.d. uniform
sequence. Even though the process has infinite memory, its value at time 0
depends only on a finite, but random, number of these uniform variables. The
algorithm is based on a recent regenerative construction of these measures by
Ferrari, Maass, Martínez and Ney. As applications, we discuss the perfect
simulation of binary autoregressions and Markov chains on the unit interval.
Comment: 27 pages, one figure. Version accepted by Annals of Applied Probability.
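The paper's construction covers chains with genuinely infinite memory; a minimal sketch of the same regeneration idea for the much simpler Markov special case (kernel, minorization constant, and regeneration measure all invented for illustration) shows how the value at time 0 can be built from a finite but random number of i.i.d. uniforms:

```python
import random

# Two-state Markov kernel and a Doeblin minorization:
# P(x, y) >= EPS * NU(y) for every x, so each step "regenerates"
# (forgets the whole past) with probability EPS.
P = [[0.7, 0.3],
     [0.4, 0.6]]
EPS = 0.5                      # any EPS with P(x,y) >= EPS*NU(y) works
NU = [0.5, 0.5]                # regeneration measure
R = [[(P[x][y] - EPS * NU[y]) / (1 - EPS) for y in (0, 1)]
     for x in (0, 1)]          # residual kernel

def perfect_sample(rng):
    """Return X_0 distributed exactly by the stationary law, as a
    function of finitely many uniforms attached to times 0, -1, -2, ..."""
    # Scan backwards until the step into some time tau regenerates.
    uniforms = [rng.random()]          # uniform attached to time 0
    while uniforms[-1] >= EPS:
        uniforms.append(rng.random())  # uniforms for times -1, -2, ...
    # At tau the state comes from NU, independent of everything earlier.
    x = 0 if uniforms[-1] / EPS < NU[0] else 1
    # Rebuild forward to time 0 through the residual kernel.
    for u in reversed(uniforms[:-1]):
        v = (u - EPS) / (1 - EPS)
        x = 0 if v < R[x][0] else 1
    return x

rng = random.Random(1)
freq0 = sum(perfect_sample(rng) == 0 for _ in range(20000)) / 20000
# The stationary law of P puts mass 4/7 ~ 0.571 on state 0.
assert abs(freq0 - 4 / 7) < 0.02
```

Because EPS * NU(y) + (1 - EPS) * R(x, y) reproduces P(x, y) exactly, the backward scan terminates after a geometric number of steps and the output has no burn-in bias, which is the defining property of perfect simulation.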
An Empirical Study of Real-World SPARQL Queries
Understanding how users tailor their SPARQL queries is crucial when designing
query evaluation engines or fine-tuning RDF stores with performance in mind. In
this paper we analyze 3 million real-world SPARQL queries extracted from logs
of the DBpedia and SWDF public endpoints. We aim to find which language
elements are most used, from both syntactical and structural perspectives,
paying special attention to triple patterns and joins, since they are among
the most expensive SPARQL operations at the evaluation phase. We have
determined that most queries are simple and include few triple patterns
and joins, with Subject-Subject, Subject-Object and Object-Object being the
most common join types. The graph patterns are usually star-shaped and,
although triple pattern chains exist, they are generally short.
Comment: 1st International Workshop on Usage Analysis and the Web of Data
(USEWOD2011) at the 20th International World Wide Web Conference (WWW2011),
Hyderabad, India, March 28th, 2011.
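A toy classifier (not the authors' analysis code) illustrates the join taxonomy the abstract uses, where the join type is named by the position of the shared variable in each triple pattern:

```python
# Classify the join type between two SPARQL triple patterns, following
# the Subject-Subject / Subject-Object / Object-Object taxonomy.
# Patterns are (subject, predicate, object) tuples; variables start
# with '?', everything else is treated as a constant.

def join_type(tp1, tp2):
    s1, _, o1 = tp1
    s2, _, o2 = tp2
    shared = ({v for v in (s1, o1) if v.startswith('?')} &
              {v for v in (s2, o2) if v.startswith('?')})
    if not shared:
        return None  # no join on subject/object positions
    v = shared.pop()
    pos1 = 'S' if v == s1 else 'O'
    pos2 = 'S' if v == s2 else 'O'
    return '-'.join(sorted((pos1, pos2)))  # 'S-S', 'O-S', or 'O-O'

# A star-shaped pattern: every triple shares the same subject variable.
star = [('?person', 'rdf:type', 'dbo:Scientist'),
        ('?person', 'dbo:birthPlace', '?city'),
        ('?person', 'foaf:name', '?name')]
assert join_type(star[0], star[1]) == 'S-S'

# A short chain: the object of one triple is the subject of the next.
chain = [('?person', 'dbo:birthPlace', '?city'),
         ('?city', 'dbo:country', '?country')]
assert join_type(chain[0], chain[1]) == 'O-S'
```

Star shapes generate Subject-Subject joins and chains generate Subject-Object joins, which is why those two join types dominate in the query logs the paper analyzes.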
Homomorphic Encryption of the k=2 Bernstein-Vazirani Algorithm
The nonrecursive Bernstein-Vazirani algorithm was the first quantum algorithm
to show a superpolynomial improvement over the corresponding best classical
algorithm. Here we define a class of circuits that solve a particular case of
this problem for second-level recursion. This class of circuits reduces the
number of gates required to construct the oracle, making it grow only linearly
with the number of qubits in the problem. We find an application of this scheme
to quantum homomorphic encryption (QHE) which is an important cryptographic
technology useful for delegated quantum computation. It allows a remote server
to perform quantum computations on encrypted quantum data, so that the server
cannot know anything about the client's data. Liang developed QHE schemes with
perfect security, F-homomorphism, no interaction between server and client,
and quasi-compactness bounded by O(M), where M is the number of T gates in the
circuit. Precisely, these schemes are suitable for circuits with a polynomial
number of T gates. Following these schemes, the simplified circuits we have
constructed can be evaluated homomorphically in an efficient way.
Comment: RevTeX file, color figure.
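The paper's contribution concerns the k=2 recursive case; a statevector simulation of the nonrecursive algorithm it builds on (a sketch, not the paper's circuit construction) shows the one-query structure:

```python
import numpy as np

# Nonrecursive Bernstein-Vazirani: prepare H^n |0...0>, apply the
# phase oracle x -> (-1)^(a.x), apply H^n again, and the measurement
# reveals the secret string a deterministically in a single query.

def hadamard_n(psi):
    """Apply H on every qubit via a fast Walsh-Hadamard transform."""
    psi = psi.copy()
    h = 1
    while h < len(psi):
        for i in range(0, len(psi), 2 * h):
            a = psi[i:i + h].copy()
            b = psi[i + h:i + 2 * h].copy()
            psi[i:i + h] = a + b
            psi[i + h:i + 2 * h] = a - b
        h *= 2
    return psi / np.sqrt(len(psi))   # (1/sqrt(2))^n overall

def bernstein_vazirani(secret, n):
    N = 2 ** n
    psi = np.zeros(N)
    psi[0] = 1.0                              # |0...0>
    psi = hadamard_n(psi)                     # uniform superposition
    phases = np.array([(-1) ** bin(x & secret).count('1')
                       for x in range(N)])    # oracle (-1)^(a.x)
    psi = hadamard_n(psi * phases)            # final layer of Hadamards
    return int(np.argmax(np.abs(psi)))        # outcome is |a> with prob. 1

assert bernstein_vazirani(0b1011, 4) == 0b1011
```

Because the final state is exactly |a>, the measurement is deterministic; the recursive k=2 variant the paper addresses nests this routine inside a second oracle layer.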
Adaptive multiscapes: an up-to-date metaphor to visualize molecular adaptation
Wright's metaphor of the fitness landscape has shaped and conditioned our view of the adaptation of populations for almost a century. Since its inception, and including criticism raised by Wright himself, the concept has been surrounded by controversy. Among other reasons, the debate stems from the intrinsic difficulty of capturing important features of the space of genotypes, such as its high dimensionality or the existence of abundant ridges, in a visually appealing two-dimensional picture. Two additional, currently widespread observations further constrain the applicability of the original metaphor: the very skewed distribution of phenotype sizes (which may actively prevent, due to entropic effects, the achievement of fitness maxima), and functional promiscuity (i.e. the existence of secondary functions which entail partial adaptation to environments never encountered before by the population).
Results: Here we revise some of the shortcomings of the fitness landscape metaphor and propose a new "scape" formed by interconnected layers, each layer containing the phenotypes viable in a given environment. Different phenotypes within a layer are accessible through mutations with selective value, while neutral mutations cause displacements of populations within a phenotype. A different environment is represented as a separate layer, where phenotypes may have new fitness values, other phenotypes may be viable, and the same genotype may yield a different phenotype, representing genotypic promiscuity. This scenario explicitly includes the many-to-many structure of the genotype-to-phenotype map. A number of empirical observations regarding the adaptation of populations in the light of adaptive multiscapes are reviewed.
Conclusions: Several shortcomings of Wright's visualization of fitness landscapes can be overcome through adaptive multiscapes.
This work was supported by the Spanish projects ViralESS (FIS2014-57686-P, MINECO) and FIS2015-64349-P (MINECO/FEDER, UE). The funding body did not have any role in the design of the study and the collection, analysis, and interpretation of data, and did not contribute to writing the manuscript.
Reaction Dynamics for the Systems 7Be,8B + 208Pb at Coulomb Barrier Energies
In this contribution we describe the first results obtained for the investigation of
the elastic scattering process in the reactions induced by the Radioactive Ion Beams 7Be and
8B on a 208Pb target at Coulomb barrier energies. The experimental data were analyzed within
the framework of the optical model in order to extract the total reaction cross section. The
comparison with data available in the literature for reactions induced on 208Pb by light ions in the
mass range A = 6-8 shows that the loosely-bound 8B has the largest reactivity.
A Global Review of PWR Nuclear Power Plants
[Abstract] Nuclear energy is presented as a real option in the face of the current problem of climate change and the need to reduce CO2 emissions. The nuclear reactor design with the greatest global impact throughout history and which has the most ambitious development plans is the Pressurized Water Reactor (PWR). Thus, a global review of such a reactor design is presented in this paper, utilizing the analysis of (i) technical aspects of the different variants of the PWR design implemented over the past eight years, (ii) the level of implementation of PWR nuclear power plants in the world, and (iii) a life extension scenario and future trends in PWR design based on current research and development (R&D) activity. To develop the second analysis, a statistical study of the implementation of the different PWR variants has been carried out. Such a statistical analysis is based on the operating factor, which represents the relative frequency of reactors operating around the world. The results reflect the hegemony of the western variants in the 300 reactors currently operating, highlighting the North American and French versions. Furthermore, a simulation of a possible scenario of increasing the useful life of operational PWRs up to 60 years has been proposed,
showing that by 2050 the generation capacity of PWR nuclear power plants would decrease by 50%, and the number of operating reactors by 70%.
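The operating factor the statistical analysis rests on is just a relative frequency over the operating fleet; a minimal sketch with invented counts (not the paper's data) makes the definition concrete:

```python
from collections import Counter

# "Operating factor" = relative frequency of each PWR variant among
# reactors currently in operation. The fleet below is made up purely
# for illustration; it is NOT the paper's dataset.
fleet = (["Westinghouse"] * 12 + ["Framatome"] * 9 +
         ["VVER"] * 6 + ["Other"] * 3)

counts = Counter(fleet)
total = sum(counts.values())
operating_factor = {variant: n / total for variant, n in counts.items()}

assert abs(sum(operating_factor.values()) - 1.0) < 1e-9
assert operating_factor["Westinghouse"] == 12 / 30
```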
ToyLIFE: a computational framework to study the multi-level organisation of the genotype-phenotype map
The genotype-phenotype map is essential for understanding organismal complexity and adaptability. However, its experimental characterisation is a daunting task. Thus, simple models have been proposed and investigated. They have revealed that genotypes differ in their robustness to mutations; phenotypes are represented by a broadly varying number of genotypes; and simple point mutations suffice to navigate the space of genotypes while maintaining a phenotype. Nonetheless, most current models focus on only one level of the map (folded molecules, gene regulatory networks, or networks of metabolic reactions), so that many relevant questions cannot be addressed. Here we introduce toyLIFE, a multi-level model for the genotype-phenotype map based on simple genomes and interaction rules from which complex behaviour emerges at upper levels: remarkably plastic gene regulatory networks and metabolism. toyLIFE is a tool that permits the investigation of how different levels are coupled, in particular how and where mutations affect phenotype, or how the presence of certain metabolites determines the dynamics of toyLIFE gene regulatory networks. The model can easily incorporate evolution through more complex mutations, recombination, or gene duplication and deletion, thus opening an avenue to explore extended genotype-phenotype maps.
This work was supported through projects FIS2011-22449 (CFA, PC and JAC) and FIS2011-27569 (SM) of the Spanish MINECO.
Method for real-time measurement of the nonlinear refractive index
In this work, we propose a novel method for continuous real-time measurement
of the dynamics of the nonlinear refractive index n2. This is particularly
important for characterizing phenomena or materials (such as biological
tissues, gases and other compounds) whose nonlinear behavior or structure
varies rapidly with time. The proposed method combines two powerful tools:
the spectral broadening induced by self-phase modulation and real-time
spectral analysis using the dispersive Fourier transformation. The
feasibility of the technique is demonstrated experimentally, achieving
high-speed measurements at rates of several MHz.
Response Time Distributions in Rapid Chess: A Large-Scale Decision Making Experiment
Rapid chess provides an unparalleled laboratory for understanding decision making in a natural environment. In a chess game, players choose around 40 consecutive moves within a finite time budget. The goodness of each choice can be determined quantitatively, since current chess algorithms estimate the value of a position precisely. Web-based chess produces vast amounts of data, millions of decisions per day, far beyond the reach of traditional psychological experiments. We generated a database of response times (RTs) and position values in rapid chess games. We measured robust emergent statistical observables: (1) RT distributions are long-tailed and show qualitatively distinct forms at different stages of the game; (2) RTs of successive moves are highly correlated, both for intra- and inter-player moves. These findings have theoretical implications, since they deny two basic assumptions of sequential decision making algorithms: RTs are not stationary and cannot be generated by a state function. Our results also have practical implications. First, we characterized the capacity of blunders and score fluctuations to predict a player's strength, which remains an open problem in chess software. Second, we show that the winning likelihood can be reliably estimated from a weighted combination of remaining times and position evaluation.
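Two steps of the pipeline the abstract describes can be sketched directly; the weights and sample clock readings below are invented for illustration, not fitted values from the study:

```python
import math

def response_times(clock):
    """Per-move response times from a player's remaining-clock
    readings (seconds left after each of their moves)."""
    return [before - after for before, after in zip(clock, clock[1:])]

def win_probability(eval_pawns, time_diff, w_eval=0.5, w_time=0.01):
    """Logistic weighted combination of position evaluation (in pawns)
    and remaining-time difference (in seconds), the kind of estimator
    the abstract proposes; weights here are arbitrary placeholders."""
    z = w_eval * eval_pawns + w_time * time_diff
    return 1.0 / (1.0 + math.exp(-z))

clock = [180, 176, 175, 160, 158]   # rapid game, seconds remaining
assert response_times(clock) == [4, 1, 15, 2]

# Equal position and equal clocks -> even chances.
assert abs(win_probability(0.0, 0.0) - 0.5) < 1e-9
# A material edge plus a time lead pushes the estimate above 0.5.
assert win_probability(1.0, 30.0) > 0.5
```

In real data the weights would be fit by regression against game outcomes; the sketch only shows the functional form of the combination.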