On what I do not understand (and have something to say): Part I
This is a non-standard paper, containing some problems in set theory in which
I have, to varying degrees, been interested. The problems are sometimes
presented with a discussion of what I have to say, sometimes of what makes
them interesting to me, sometimes with an account of how I have tried to
solve them, and sometimes with failed attempts, anecdotes and opinions. The
discussion is therefore quite personal; in other words, egocentric and
somewhat accidental. As many problems are discussed, history and side
references are erratic and usually kept to a minimum (``see ...'' means: see
the references there and possibly the paper itself).
The paper is based on lectures given at Rutgers in Fall 1997 and reflects my
knowledge at that time. The other half, concentrating on model theory, will
appear subsequently.
Entanglement tongue and quantum synchronization of disordered oscillators
We study the synchronization of dissipatively-coupled van der Pol oscillators
in the quantum limit, when each oscillator is near its quantum ground state.
Two quantum oscillators with different frequencies exhibit an entanglement
tongue, the quantum analogue of an Arnold tongue: the oscillators are
entangled in the steady state when the coupling strength exceeds a critical
value, and this critical coupling increases with detuning. An
ensemble of many oscillators with random frequencies still exhibits a
synchronization phase transition in the quantum limit, and we analytically
calculate how the critical coupling depends on the frequency disorder. Our
results can be experimentally observed with trapped ions or neutral atoms.
Comment: 11 pages, 5 figures
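The classical Arnold tongue that the entanglement tongue generalizes can be illustrated with the Adler phase equation: two oscillators phase-lock whenever the coupling exceeds the detuning. The sketch below is a classical analogue only, not the paper's quantum van der Pol model; the function name and numerical thresholds are our own choices.

```python
import numpy as np

def is_phase_locked(delta, K, t_max=200.0, dt=0.01):
    """Integrate the Adler equation dphi/dt = delta - K*sin(phi)
    and report whether the phase difference settles (synchronization)."""
    phi = 0.0
    for _ in range(int(t_max / dt)):
        phi += dt * (delta - K * np.sin(phi))
    # Inside the tongue (K >= |delta|) the residual phase drift vanishes.
    drift = abs(delta - K * np.sin(phi))
    return drift < 1e-6

print(is_phase_locked(delta=0.5, K=1.0))   # True: inside the tongue
print(is_phase_locked(delta=0.5, K=0.2))   # False: phase keeps drifting
```

The tongue boundary K = |delta| mirrors the abstract's statement that the critical coupling grows with detuning; in the quantum regime the paper replaces phase locking by an entanglement criterion.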
Quantum Simulation Logic, Oracles, and the Quantum Advantage
Query complexity is a common tool for comparing quantum and classical
computation, and it has produced many examples of how quantum algorithms differ
from classical ones. Here we investigate in detail the role that oracles play
for the advantage of quantum algorithms. We do so by using a simulation
framework, Quantum Simulation Logic (QSL), to construct oracles and algorithms
that solve some problems with the same success probability and number of
queries as the quantum algorithms. The framework can be simulated using only
classical resources at a constant overhead as compared to the quantum resources
used in quantum computation. Our results clarify the assumptions made and the
conditions needed when using quantum oracles. Using the same assumptions on
oracles within the simulation framework we show that for some specific
algorithms, like the Deutsch-Jozsa and Simon's algorithms, there simply is no
advantage in terms of query complexity. This does not detract from the fact
that quantum query complexity provides examples of how a quantum computer can
be expected to behave, which in turn has proved useful for finding new quantum
algorithms outside of the oracle paradigm, where the most prominent example is
Shor's algorithm for integer factorization.
Comment: 48 pages, 46 figures
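As a point of reference for the query-complexity discussion, a plain state-vector simulation (not the QSL framework itself) shows the quantum baseline being compared against: a single phase-oracle query settles the Deutsch-Jozsa promise problem. Function and variable names here are illustrative.

```python
import numpy as np

def deutsch_jozsa_one_query(f, n):
    """State-vector simulation of Deutsch-Jozsa with a phase oracle:
    one oracle query decides whether f is constant or balanced."""
    N = 2 ** n
    H = np.array([[1, 1], [1, -1]]) / np.sqrt(2)
    Hn = H
    for _ in range(n - 1):
        Hn = np.kron(Hn, H)          # n-qubit Hadamard transform
    state = Hn @ np.eye(N)[0]        # uniform superposition from |0...0>
    state = np.array([(-1) ** f(x) for x in range(N)]) * state  # one query
    state = Hn @ state
    p0 = abs(state[0]) ** 2          # probability of measuring |0...0>
    return "constant" if p0 > 0.5 else "balanced"

print(deutsch_jozsa_one_query(lambda x: 0, n=3))      # constant
print(deutsch_jozsa_one_query(lambda x: x & 1, n=3))  # balanced
```

The paper's claim is that, under the same oracle assumptions, a QSL construction matches this one-query success probability with classical resources at constant overhead.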
Bell's Theorem and Locally-Mediated Reformulations of Quantum Mechanics
Bell's Theorem rules out many potential reformulations of quantum mechanics,
but within a generalized framework, it does not exclude all "locally-mediated"
models. Such models describe the correlations between entangled particles as
mediated by intermediate parameters which track the particle world-lines and
respect Lorentz covariance. These locally-mediated models require the
relaxation of an arrow-of-time assumption which is typically taken for granted.
Specifically, some of the mediating parameters in these models must
functionally depend on measurement settings in their future, i.e., on input
parameters associated with later times. This option (often called
"retrocausal") has been repeatedly pointed out in the literature, but the
exploration of explicit locally-mediated toy-models capable of describing
specific entanglement phenomena has begun only in the past decade. A brief
survey of such models is included here. These models provide a continuous and
consistent description of events associated with spacetime locations, with
aspects that are solved "all-at-once" rather than unfolding from the past to
the future. The tension between quantum mechanics and relativity which is
usually associated with Bell's Theorem does not occur here. Unlike conventional
quantum models, the number of parameters needed to specify the state of a
system does not grow exponentially with the number of entangled particles. The
promise of generalizing such models to account for all quantum phenomena is
identified as a grand challenge.
Comment: 61 pages, 2 figures; accepted for publication by Rev. Mod. Phys.
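The conventional Bell-scenario baseline against which such models are judged can be checked numerically: for the singlet state with suitably chosen measurement settings, the CHSH value reaches 2*sqrt(2), above the local-hidden-variable bound of 2. A minimal numpy sketch with the standard textbook settings (not the paper's locally-mediated models):

```python
import numpy as np

# Pauli matrices and the two-qubit singlet state (|01> - |10>)/sqrt(2)
X = np.array([[0, 1], [1, 0]], dtype=complex)
Z = np.array([[1, 0], [0, -1]], dtype=complex)
singlet = np.array([0, 1, -1, 0], dtype=complex) / np.sqrt(2)

def corr(a_obs, b_obs):
    """Expectation value <A (x) B> in the singlet state."""
    return np.real(singlet.conj() @ np.kron(a_obs, b_obs) @ singlet)

# Settings that maximize CHSH for the singlet
A0, A1 = Z, X
B0 = -(Z + X) / np.sqrt(2)
B1 = (X - Z) / np.sqrt(2)

S = corr(A0, B0) + corr(A0, B1) + corr(A1, B0) - corr(A1, B1)
print(S)  # 2*sqrt(2) ~ 2.828, beyond the local bound of 2
```

Any locally-mediated reformulation of the kind surveyed must reproduce exactly these correlations; what it relaxes is the arrow-of-time assumption, not the predictions.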
Environmental Efficiency, Emission Trends and Labour Productivity: Trade-Off or Joint Dynamics? Empirical Evidence Using NAMEA Panel Data
The paper provides new empirical evidence on the relationship between environmental efficiency and labour productivity using industry-level data. We first provide a critical and extensive discussion of the interconnected issues of environmental efficiency and performance, firm performance and labour productivity, and environmental and non-environmental innovation dynamics, taking the most recent literature on environmental innovation, environmental regulation and economic performance as reference. We then test a newly adapted EKC hypothesis by examining the correlation between the trends of environmental productivity (namely, sector emissions relative to value added) and labour productivity (value added per employee) along a dynamic path. We exploit official NAMEA data sources for Italy over 1990-2002 for 29 sectoral branches. The period is crucial, since environmental issues, and subsequently environmental policies, entered the arena during these years, and a restructuring of the economy occurred. It is thus interesting to assess the extent to which capital investments for the economy as a whole are associated with a positive or negative correlation between the environmental efficiency of productive branches and labour productivity, a link often claimed by mainstream theory dealing with innovation in environmental economics. On the basis of theoretical and empirical analyses focusing on innovation paths, firm performance and environmental externalities, we believe there are good reasons to expect a positive correlation between environmental and labour productivity or, in alternative terms, a negative correlation between the emission intensity of production and labour productivity. The tested hypothesis is crucial within the long-standing discussion of the potential trade-off or complementarity between environmental and labour productivity, strictly associated with sectoral and national technological innovation paths.
The main added value of the paper is the analysis of the aforementioned hypothesis by exploiting a panel data set based on official NAMEA sectorally disaggregated accounting data, providing both cross-sectional heterogeneity and a sufficient time span. We find that for most emissions, if not all, a negative correlation emerges between labour productivity and environmental productivity. Though this trend appears to be driven by the macro sectors of services, manufacturing and industry, the evidence is not homogeneous across emissions. In some cases U-shapes arise, mainly for services, and the assessment of turning points is crucial. Manufacturing and industry, all in all, seem to carry a stronger weight. Overall, then, labour productivity dynamics seem to be complementary to a decreasing emission intensity of productive processes. The extent to which this evidence derives from endogenous market forces, industrial restructuring and/or policy effects is a matter for further research. The relative roles of manufacturing and services in explaining this pattern also remain to be analysed in future empirical work. In addition, the roles of capital stocks and trade openness are extensions that may add value to future analyses carried out on the same NAMEA dataset.
Keywords: Decoupling, NAMEA Emissions, Labour Productivity, Sectoral Added Value, Kuznets Curves, Environmental Efficiency
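The within-sector correlation test described above can be sketched on synthetic data. The NAMEA figures themselves are not reproduced here: every number below is fabricated for illustration, with the panel dimensions merely mirroring the 29 branches over 1990-2002.

```python
import numpy as np

rng = np.random.default_rng(0)
n_sectors, n_years = 29, 13     # mirrors 29 branches over 1990-2002

# Synthetic panel, for illustration only: labour productivity (value
# added per employee) and emission intensity (emissions per unit of
# value added), generated with a built-in negative association.
sector_fe = rng.normal(0.0, 1.0, (n_sectors, 1))   # sector fixed effects
labour_prod = sector_fe + rng.normal(0.0, 0.3, (n_sectors, n_years))
emission_int = -0.5 * labour_prod + rng.normal(0.0, 0.3, (n_sectors, n_years))

# Within transformation: demean each sector to strip fixed effects,
# then correlate the demeaned series (a minimal fixed-effects analogue).
lp = (labour_prod - labour_prod.mean(axis=1, keepdims=True)).ravel()
ei = (emission_int - emission_int.mean(axis=1, keepdims=True)).ravel()
r = np.corrcoef(lp, ei)[0, 1]
print(r < 0)    # True: higher labour productivity, lower emission intensity
```

A negative within correlation of this kind is what the complementarity reading of the paper's finding amounts to; the actual estimation would of course use the real NAMEA panel and a full regression specification.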
On Random Unitary Channels
In this article we provide necessary and sufficient conditions for a
completely positive trace-preserving (CPT) map to be decomposable into a convex
combination of unitary maps. Additionally, we set out to define a proper
distance measure between a given CPT map and the set of random unitary maps,
and methods for calculating it. In this way one could determine whether
non-classical error mechanisms such as spontaneous decay or photon loss
dominate over classical uncertainties, for example in a phase parameter. The
present paper is a step towards achieving this goal.
Comment: 11 pages, typeset using RevTeX
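A minimal example of the decomposition the abstract discusses: the qubit phase-flip channel is manifestly a random unitary channel, i.e., a convex combination of the unitaries I and Z. The numpy sketch below is our own elementary construction, not the paper's general criterion.

```python
import numpy as np

# Phase-flip channel: rho -> p*rho + (1-p)*Z rho Z, a convex
# combination of the unitaries I and Z (hence "random unitary").
p = 0.75
I2 = np.eye(2, dtype=complex)
Z = np.diag([1.0, -1.0]).astype(complex)
kraus = [np.sqrt(p) * I2, np.sqrt(1 - p) * Z]

# Trace preservation: sum_k K_k^dagger K_k = I
completeness = sum(K.conj().T @ K for K in kraus)
assert np.allclose(completeness, I2)

def apply_channel(rho):
    """Apply the channel in Kraus form: sum_k K_k rho K_k^dagger."""
    return sum(K @ rho @ K.conj().T for K in kraus)

# Coherences shrink by the factor 2p - 1; populations are untouched.
rho_plus = np.array([[0.5, 0.5], [0.5, 0.5]], dtype=complex)  # |+><+|
out = apply_channel(rho_plus)
print(out.real)  # off-diagonals 0.25 = 0.5 * (2*0.75 - 1)
```

By contrast, amplitude damping (spontaneous decay) admits no such mixture-of-unitaries form, which is exactly the distinction between classical uncertainties and non-classical error mechanisms that the proposed distance measure is meant to quantify.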