1,476 research outputs found
A provenance task abstraction framework
Visual analytics tools integrate provenance recording to externalize analytic processes or user insights. Provenance can be captured on varying levels of detail, and in turn activities can be characterized from different granularities. However, current approaches do not support inferring activities that can only be characterized across multiple levels of provenance. We propose a task abstraction framework that consists of a three-stage approach, composed of (1) initializing a provenance task hierarchy, (2) parsing the provenance hierarchy by using an abstraction mapping mechanism, and (3) leveraging the task hierarchy in an analytical tool. Furthermore, we identify implications to accommodate iterative refinement, context, variability, and uncertainty during all stages of the framework. A use case exemplifies our abstraction framework, demonstrating how context can influence the provenance hierarchy to support analysis. The paper concludes with an agenda, raising and discussing challenges that need to be considered for successfully implementing such a framework.
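As a rough sketch of stages (1) and (2), a provenance task hierarchy can be modeled as a tree of events plus an abstraction mapping that groups low-level events into higher-level tasks. All names and the event grammar below are illustrative assumptions, not the framework's actual interface:

```python
# Illustrative sketch: a flat provenance event log is parsed into a
# shallow task hierarchy via an abstraction mapping
# (event pattern -> abstract task). Names are hypothetical.
from dataclasses import dataclass, field

@dataclass
class TaskNode:
    label: str
    children: list = field(default_factory=list)

# Hypothetical mapping from pairs of low-level interaction events
# to higher-level abstract tasks.
ABSTRACTION_MAP = {
    ("filter", "filter"): "refine selection",
    ("zoom", "pan"): "navigate view",
}

def parse_events(events):
    """Group consecutive event pairs into abstract tasks where a
    mapping exists; keep unmapped events as leaf tasks."""
    root = TaskNode("session")
    i = 0
    while i < len(events):
        pair = tuple(events[i:i + 2])
        if pair in ABSTRACTION_MAP:
            node = TaskNode(ABSTRACTION_MAP[pair])
            node.children = [TaskNode(e) for e in pair]
            root.children.append(node)
            i += 2
        else:
            root.children.append(TaskNode(events[i]))
            i += 1
    return root

tree = parse_events(["zoom", "pan", "filter", "filter", "annotate"])
print([n.label for n in tree.children])
# -> ['navigate view', 'refine selection', 'annotate']
```

A full implementation would also attach the context, variability, and uncertainty annotations the paper discusses; this sketch shows only the parsing step.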
A Quantum Rosetta Stone for Interferometry
Heisenberg-limited measurement protocols can be used to gain an increase in measurement precision over classical protocols. Such measurements can be implemented using, e.g., optical Mach-Zehnder interferometers and Ramsey spectroscopes. We address the formal equivalence between the Mach-Zehnder interferometer, the Ramsey spectroscope, and the discrete Fourier transform. Based on this equivalence we introduce the "quantum Rosetta stone", and we describe a projective-measurement scheme for generating the desired correlations between the interferometric input states in order to achieve Heisenberg-limited sensitivity. The Rosetta stone then tells us the same method should work in atom spectroscopy.
Comment: 8 pages, 4 figures
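The formal equivalence mentioned above can be checked numerically in the simplest case: the symmetric 50:50 beam splitter of a Mach-Zehnder interferometer equals the 2-point discrete Fourier transform up to local phase shifts on the modes. The following self-contained sketch (pure Python, 2x2 matrices as nested lists) is an illustration of that textbook identity, not the paper's construction:

```python
import cmath

def matmul(A, B):
    """2x2 complex matrix product."""
    return [[sum(A[i][k] * B[k][j] for k in range(2)) for j in range(2)]
            for i in range(2)]

s = 1 / 2 ** 0.5
# 2-point discrete Fourier transform.
F = [[s, s], [s, -s]]
# Symmetric 50:50 beam splitter.
BS = [[s, 1j * s], [1j * s, s]]
# Local phase shift (diagonal unitary) on the second mode.
D = [[1, 0], [0, 1j]]

# Up to phase shifters on input and output, the beam splitter
# IS the 2-point DFT: D @ F @ D == BS.
lhs = matmul(matmul(D, F), D)
assert all(abs(lhs[i][j] - BS[i][j]) < 1e-12
           for i in range(2) for j in range(2))

# Mach-Zehnder: beam splitter, relative phase phi, beam splitter.
phi = 0.3
P = [[1, 0], [0, cmath.exp(1j * phi)]]
MZ = matmul(matmul(BS, P), BS)
# Output probability at port 0 follows the sin^2(phi/2) fringe.
print(abs(MZ[0][0]) ** 2)
```

The same 2x2 unitary describes a Ramsey pi/2 pulse, which is the sense in which the two devices are formally equivalent.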
A novel approach to task abstraction to make better sense of provenance data
Working group report in 'Provenance and Logging for Sense Making', the report from Dagstuhl Seminar 18462, Dagstuhl Reports, Volume 8, Issue 1.
From Linear Optical Quantum Computing to Heisenberg-Limited Interferometry
The working principles of linear optical quantum computing are based on photodetection, namely, projective measurements. The use of photodetection can provide efficient nonlinear interactions between photons at the single-photon level, which is technically problematic otherwise. We report an application of such a technique to prepare quantum correlations as an important resource for Heisenberg-limited optical interferometry, where the sensitivity of phase measurements can be improved beyond the usual shot-noise limit. Furthermore, using such nonlinearities, optical quantum nondemolition measurements can now be carried out at the single-photon level.
Comment: 10 pages, 5 figures; submitted to a special issue of J. Opt. B on "Fluctuations and Noise in Photonics and Quantum Optics" (Herman Haus Memorial Issue); v2: minor changes
Music in advertising and consumer identity: The search for Heideggerian authenticity
This study discusses netnographic findings involving 472 YouTube postings categorized to identify themes regarding consumers' experience of music in advertisements. Key themes relate to musical taste, musical indexicality, musical repetition and musical authenticity. Postings reveal how music conveys individual taste and is linked to personal memories and Heidegger's coincidental time, where moments of authenticity may be triggered in a melee of emotions, memories and projections. Identity protection is enabled as consumers frequently resist advertisers' attempts to use musical repetition to impose normative identity. Critiques of repetition in the music produce Heideggerian anxiety, leading to critically reflective resistance. Similarly, where advertising devalues the authenticity of iconic pieces of music, consumers often resist such authenticity transgressions as a threat to their own identity. The Heideggerian search for meaning in life emphasizes the significance of philosophically driven ideological authenticity in consumers' responses to music in advertisements.
BCG as a case study for precision vaccine development: lessons from vaccine heterogeneity, trained immunity, and immune ontogeny
Vaccines have been traditionally developed with the presumption that they exert identical immunogenicity regardless of target population and that they provide protection solely against their target pathogen. However, it is increasingly appreciated that vaccines can have off-target effects and that vaccine immunogenicity can vary substantially with demographic factors such as age and sex. Bacille Calmette-Guérin (BCG), the live attenuated Mycobacterium bovis vaccine against tuberculosis (TB), represents a key example of these concepts. BCG vaccines are manufactured under different conditions across the globe, generating divergent formulations. Epidemiologic studies have linked early life immunization with certain BCG formulations to an unanticipated reduction (~50%) in all-cause mortality, especially in low birthweight males, greatly exceeding that attributable to TB prevention. This mortality benefit has been related to prevention of sepsis and respiratory infections, suggesting that BCG induces "heterologous" protection against unrelated pathogens. Proposed mechanisms for heterologous protection include vaccine-induced immunometabolic shifts, epigenetic reprogramming of innate cell populations, and modulation of hematopoietic stem cell progenitors resulting in altered responses to subsequent stimuli, a phenomenon termed "trained immunity." In addition to genetic differences, licensed BCG formulations differ markedly in content of viable mycobacteria key for innate immune activation, potentially contributing to differences in the ability of these diverse formulations to induce TB-specific and heterologous protection. BCG immunomodulatory properties have also sparked interest in its potential use to prevent or alleviate autoimmune and inflammatory diseases, including type 1 diabetes mellitus and multiple sclerosis.
BCG can also serve as a model: nanoparticle vaccine formulations incorporating Toll-like receptor 8 agonists can mimic some of BCG's innate immune activation, suggesting that aspects of BCG's effects can be induced with non-replicating stimuli. Overall, BCG represents a paradigm for precision vaccinology, lessons from which will help inform next generation vaccines.
Approaching the Heisenberg limit with two mode squeezed states
Two-mode squeezed states can be used to achieve Heisenberg-limit scaling in interferometry: a phase shift scaling inversely with the total photon number can be resolved. The proposed scheme relies on balanced homodyne detection and can be implemented with current technology. The most important experimental imperfections are studied and their impact quantified.
Comment: 4 pages, 7 figures
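For context on what Heisenberg-limit scaling buys: with n photons, the classical shot-noise phase uncertainty scales as 1/sqrt(n), while the Heisenberg limit scales as 1/n. These are the standard textbook limits, not results taken from this abstract; a minimal numerical illustration:

```python
import math

def shot_noise_limit(n):
    """Classical (shot-noise) phase uncertainty with n photons."""
    return 1 / math.sqrt(n)

def heisenberg_limit(n):
    """Heisenberg-limited phase uncertainty with n photons."""
    return 1 / n

for n in (100, 10_000):
    print(n, shot_noise_limit(n), heisenberg_limit(n),
          shot_noise_limit(n) / heisenberg_limit(n))
# The advantage grows as sqrt(n): x10 at n = 100, x100 at n = 10,000.
```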
Anisotropic Vacuum Induced Interference in Decay Channels
We demonstrate how the anisotropy of the vacuum of the electromagnetic field can lead to quantum interferences among the decay channels of close-lying states. Our key result is that interferences are given by the scalar formed from the antinormally ordered electric field correlation tensor for the anisotropic vacuum and the dipole matrix elements for the two transitions. We present results for emission between two conducting plates as well as for a two-photon process involving fluorescence produced under coherent cw excitation.
Comment: 6 pages with 2 figures, to appear in Phys. Rev. Lett. (tentative June 2000)
Quantum computation with linear optics
We present a constructive method to translate small quantum circuits into their optical analogues, using linear components of present-day quantum optics technology only. These optical circuits perform precisely the computation that the quantum circuits are designed for, and can thus be used to test the performance of quantum algorithms. The method relies on the representation of several quantum bits by a single photon, and on the implementation of universal quantum gates using simple optical components (beam splitters, phase shifters, etc.). The optical implementation of Brassard et al.'s teleportation circuit, a non-trivial 3-bit quantum computation, is presented as an illustration.
Comment: LaTeX with llncs.cls, 11 pages with 5 postscript figures, Proc. of 1st NASA Workshop on Quantum Computation and Quantum Communication (QCQC 98)
On Tackling the Limits of Resolution in SAT Solving
The practical success of Boolean Satisfiability (SAT) solvers stems from the CDCL (Conflict-Driven Clause Learning) approach to SAT solving. However, from a propositional proof complexity perspective, CDCL is no more powerful than the resolution proof system, for which many hard examples exist. This paper proposes a new problem transformation, which enables reducing the decision problem for formulas in conjunctive normal form (CNF) to the problem of solving maximum satisfiability over Horn formulas. Given the new transformation, the paper proves a polynomial bound on the number of MaxSAT resolution steps for pigeonhole formulas. This result is in clear contrast with earlier results on the length of proofs of MaxSAT resolution for pigeonhole formulas. The paper also establishes the same polynomial bound in the case of modern core-guided MaxSAT solvers. Experimental results, obtained on CNF formulas known to be hard for CDCL SAT solvers, show that these can be efficiently solved with modern MaxSAT solvers.
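The pigeonhole formulas mentioned above are easy to generate explicitly. The sketch below emits DIMACS-style clauses for "n+1 pigeons into n holes" using the standard at-least-one / at-most-one encoding (an assumed encoding, not necessarily the exact one used in the paper):

```python
# Generate the (unsatisfiable) pigeonhole principle CNF PHP(n+1, n).
from itertools import combinations

def pigeonhole_cnf(holes):
    """CNF for 'holes+1 pigeons fit into holes holes'.
    Variable var(p, h) is true iff pigeon p sits in hole h;
    clauses are lists of signed DIMACS-style integers."""
    pigeons = holes + 1
    var = lambda p, h: p * holes + h + 1
    clauses = []
    # Every pigeon occupies at least one hole.
    for p in range(pigeons):
        clauses.append([var(p, h) for h in range(holes)])
    # No hole holds two pigeons.
    for h in range(holes):
        for p, q in combinations(range(pigeons), 2):
            clauses.append([-var(p, h), -var(q, h)])
    return clauses

cnf = pigeonhole_cnf(3)
# 4 at-least-one clauses + 3 * C(4,2) at-most-one clauses = 22.
print(len(cnf))  # -> 22
```

These instances are classically hard for resolution-based CDCL solvers (exponential-length proofs), which is what makes the paper's polynomial MaxSAT-resolution bound notable.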
- …