Quantum stochastic integrals as operators
We construct quantum stochastic integrals for the integrator being a
martingale in a von Neumann algebra, and the integrand -- a suitable process
with values in the same algebra, as densely defined operators affiliated with
the algebra. In the case of a finite algebra we allow the integrator to be an
L^2-martingale, in which case the integrals are L^2-martingales too.
Servitization 2.0: The significance of product and service dominant logics for public service organisations
This conceptual paper explores servitization as significant to public service organisations (PSOs), which are required to administer lean and sustainable provision. It recognises that the digital transformation of services has embraced customer-processing machine technologies that facilitate volume growth alongside information sharing, thereby fostering co-operation within collaborative network systems while pro-actively operating as elements of the product-service system (PSS). It demonstrates the significance of goods-dominant logic (GDL) and service-dominant logic (SDL) perspectives when considering servitization within specific PSOs, and thereby seeks to better understand the strategic and operational realities of the era of Servitization 2.0.
A Probabilistic Defense of Proper De Jure Objections to Theism
A common view among nontheists combines the de jure objection that theism is epistemically unacceptable with agnosticism about the de facto objection that theism is false. Following Plantinga, we can call this a “proper” de jure objection—a de jure objection that does not depend on any de facto objection. In his Warranted Christian Belief, Plantinga has produced a general argument against all proper de jure objections. Here I first show that this argument is logically fallacious (it makes subtle probabilistic fallacies disguised by scope ambiguities), and then proceed to lay the groundwork for the construction of actual proper de jure objections.
COORDINATING MACROECONOMIC POLICY IN A SIMPLE AK GROWTH MODEL
Modern theories of government finance stress the importance of an economy’s fiscal deficits in determining the course of monetary policy. Modern growth theory stresses the role of monetary factors in economic growth. This paper explores how these two are interrelated, using a simple AK growth model, one with money, reserve requirements, and government debt. We provide a comprehensive look at the coordination of macroeconomic policy and its effects on long-run growth under three alternative coordinating arrangements. We uncover some unconventional results regarding the relationship between growth and a number of policy variables; these rest squarely on the constraint of the coordination process.
Keywords: Monetary and fiscal policy; AK growth model; inflation targeting; open market operations; reserve requirements
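For readers unfamiliar with the framework, the textbook AK technology underlying such models can be stated in two lines (the paper's monetary extensions with money, reserve requirements, and government debt are not reproduced here):

```latex
Y_t = A K_t, \qquad \dot{K}_t = s Y_t - \delta K_t
\quad\Longrightarrow\quad
g \;\equiv\; \frac{\dot{K}_t}{K_t} \;=\; sA - \delta .
```

Because the marginal product of capital is constant, any policy arrangement that shifts the effective saving rate s changes the long-run growth rate permanently, which is why the coordination of monetary and fiscal policy can matter for growth in this class of models.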
Mediapolis: an introduction
The organisation of this workshop has been prompted by concerns with the way media so
often seem to get left out of writing on cities and urban politics (rather than vice-versa).
We agree with Iveson’s (2007) argument that urban and media studies have much more
in the way of shared concerns when it comes to politics than is conventionally thought to
be the case. As a result, we are hoping this workshop will create an occasion for urban
scholars to meet those studying media, to explore what difference it makes to explicitly
consider the place of media practices in making a politics of cities, and conversely, to
consider what is left out when such practices are relegated to the background. In certain
ways, we are suggesting a contemporary return to something like Robert Park’s
inclination in relation to cities and media. In his seminal essay on the natural history of
the newspaper, for example (Park, 1925), Park exhibits a style which does not generally
seem to distinguish between or oppose the urban and the media when studying politics
and democracy. This surely has something to do with Park’s own intellectual period, and
the absence of established disciplines in media or urban studies. Yet this is also precisely
the point of the workshop: an opportunity for engagement and discussion through a
similar sort of pre-disciplinary spirit.
Retrodiction with two-level atoms: atomic previvals
In the Jaynes-Cummings model a two-level atom interacts with a single-mode
electromagnetic field. Quantum mechanics predicts collapses and revivals in the
probability that a measurement will show the atom to be excited at various
times after the initial preparation of the atom and field. In retrodictive
quantum mechanics we seek the probability that the atom was prepared in a
particular state given the initial state of the field and the outcome of a
later measurement on the atom. Although this is not simply the time reverse of
the usual predictive problem, we demonstrate in this paper that retrodictive
collapses and revivals also exist. We highlight the differences between
predictive and retrodictive evolutions and describe an interesting situation
where the prepared state is essentially unretrodictable.
Comment: 15 pages, 3 (5) figures
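The predictive collapses and revivals mentioned above can be reproduced numerically. A minimal sketch of the standard calculation, assuming an initially excited atom, a coherent field with illustrative mean photon number n̄ = 25, and coupling g (none of these values are fixed by the abstract):

```python
import numpy as np
from math import lgamma

def excitation_probability(t, g=1.0, nbar=25.0, nmax=200):
    """Predictive probability that the atom is excited at time t in the
    Jaynes-Cummings model, for an initially excited atom and a coherent
    field with mean photon number nbar (illustrative parameter values)."""
    n = np.arange(nmax)
    # Poisson photon-number distribution of the coherent state
    log_pn = -nbar + n * np.log(nbar) - np.array([lgamma(k + 1) for k in n])
    pn = np.exp(log_pn)
    # Each Fock component oscillates at its own Rabi frequency 2*g*sqrt(n+1);
    # dephasing between components gives the collapse, rephasing the revival
    return 0.5 * (1.0 + np.sum(pn * np.cos(2.0 * g * np.sqrt(n + 1) * t)))
```

At t = 0 the atom is certainly excited; for g·t of order a few the oscillations collapse towards P ≈ 1/2, and near g·t ≈ 2π√n̄ they partially revive.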
Difficulty of distinguishing product states locally
Non-locality without entanglement is a rather counter-intuitive phenomenon in
which information may be encoded entirely in product (unentangled) states of
composite quantum systems in such a way that local measurement of the
subsystems is not enough for optimal decoding. For simple examples of pure
product states, the gap in performance is known to be rather small when
arbitrary local strategies are allowed. Here we restrict to local strategies
readily achievable with current technology; those requiring neither a quantum
memory nor joint operations. We show that, even for measurements on pure
product states there can be a large gap between such strategies and
theoretically optimal performance. Thus even in the absence of entanglement
physically realizable local strategies can be far from optimal for extracting
quantum information.
Comment: 5 pages, 1 figure
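The kind of gap the abstract refers to can be illustrated with a two-qubit toy example (these particular states and the fixed computational-basis strategy are illustrative, not the paper's construction): two nonorthogonal product states, compared against the Helstrom bound for the optimal joint measurement.

```python
import numpy as np

zero = np.array([1.0, 0.0])
plus = np.array([1.0, 1.0]) / np.sqrt(2.0)

# Two nonorthogonal product states with equal prior probability
psi1 = np.kron(zero, plus)   # |0>|+>
psi2 = np.kron(plus, zero)   # |+>|0>

# Optimal joint measurement: Helstrom bound for two pure states
overlap = abs(np.dot(psi1, psi2))
p_helstrom = 0.5 * (1.0 + np.sqrt(1.0 - overlap**2))

# Fixed local strategy (no quantum memory, no joint operations):
# both parties measure in the computational basis, and the more likely
# state given the pair of outcomes is guessed
p_local = 0.0
for a in range(2):
    for b in range(2):
        basis_state = np.kron(np.eye(2)[a], np.eye(2)[b])
        p1 = np.dot(basis_state, psi1) ** 2
        p2 = np.dot(basis_state, psi2) ** 2
        p_local += 0.5 * max(p1, p2)
```

For this toy pair the joint-measurement bound is about 0.93 while the memoryless local strategy succeeds with probability 0.75, so a gap is already visible even without entanglement.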
From retrodiction to Bayesian quantum imaging
We employ quantum retrodiction to develop a robust Bayesian algorithm for reconstructing the intensity values of an image from sparse photocount data, while also accounting for detector noise in the form of dark counts. This method yields not only a reconstructed image but also provides the full probability distribution function for the intensity at each pixel. We use simulated as well as real data to illustrate both the applications of the algorithm and the analysis options that are only available when the full probability distribution functions are known. These include calculating Bayesian credible regions for each pixel intensity, allowing an objective assessment of the reliability of the reconstructed image intensity values.
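A minimal per-pixel sketch of the Bayesian idea (the paper's actual algorithm, prior, and noise model are not reproduced here; the flat prior, grid discretisation, and dark-count rate below are assumptions for illustration):

```python
import numpy as np

def pixel_posterior(counts, dark=0.5, grid=None):
    """Posterior density over the intensity I of one pixel, given a list
    of photocounts and a Poisson noise model with dark-count rate `dark`
    (illustrative values; flat prior on a finite grid)."""
    if grid is None:
        grid = np.linspace(0.0, 50.0, 1001)
    rate = grid + dark                     # detected rate = signal + dark counts
    loglik = np.zeros_like(grid)
    for k in counts:
        loglik += k * np.log(rate) - rate  # Poisson log-likelihood, k! dropped
    post = np.exp(loglik - loglik.max())
    post /= np.trapz(post, grid)           # normalise the density
    return grid, post

def credible_interval(grid, post, level=0.95):
    """Equal-tailed Bayesian credible interval from the posterior density."""
    cdf = np.cumsum(post) * (grid[1] - grid[0])
    lo = grid[np.searchsorted(cdf, (1.0 - level) / 2.0)]
    hi = grid[np.searchsorted(cdf, 1.0 - (1.0 - level) / 2.0)]
    return lo, hi
```

With counts [4, 6, 5] and dark = 0.5, for example, the posterior concentrates near I ≈ 4.5, and the credible interval quantifies the reliability of that estimate in the sense described in the abstract.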
High performance structures
Materials selection, structural geometry, proof testing and statistical screening, prestressing, and system energy as tools for designing optimum trusses and other high performance structures.