Two-photon quantum walks in an elliptical direct-write waveguide array
Integrated optics provides an ideal test bed for the emulation of quantum
systems via continuous-time quantum walks. Here we study the evolution of
two-photon states in an elliptic array of waveguides. We characterise the
photonic chip via coherent-light tomography and use the results to predict
distinct differences between temporally indistinguishable and distinguishable
two-photon inputs which we then compare with experimental observations. Our
work highlights the feasibility for emulation of coherent quantum phenomena in
three-dimensional waveguide structures. (Comment: 8 pages, 7 figures)
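The continuous-time quantum walk such an array emulates can be sketched numerically. Below is a minimal single-photon model under a tight-binding Hamiltonian; the array size, uniform coupling `C`, and propagation time `t` are illustrative assumptions, not the paper's chip, whose elliptic geometry and couplings are characterised via coherent-light tomography:

```python
import numpy as np

# Sketch of a continuous-time quantum walk on a line of waveguides.
# H holds the inter-waveguide couplings; the photon amplitude evolves
# under U = exp(-i H t), computed here via eigendecomposition.
N = 8                        # number of waveguides (assumed)
C = 1.0                      # nearest-neighbour coupling rate (assumed)
H = np.zeros((N, N))
for i in range(N - 1):
    H[i, i + 1] = H[i + 1, i] = C

t = 1.5                      # propagation time (arbitrary units)
evals, evecs = np.linalg.eigh(H)
U = evecs @ np.diag(np.exp(-1j * evals * t)) @ evecs.conj().T  # exp(-iHt)

psi0 = np.zeros(N, dtype=complex)
psi0[N // 2] = 1.0           # photon injected into a central waveguide
prob = np.abs(U @ psi0) ** 2  # output probability across waveguides
print(np.round(prob, 3))
```

For two-photon inputs, the correlations follow from sums over products of entries of `U` (with a relative sign between indistinguishable and distinguishable photons), which is where the difference the abstract predicts shows up.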
Convective transport of formaldehyde to the upper troposphere and lower stratosphere and associated scavenging in thunderstorms over the central United States during the 2012 DC3 study
Quantum-inspired interferometry with chirped laser pulses
We introduce and implement an interferometric technique based on chirped
femtosecond laser pulses and nonlinear optics. The interference manifests as a
high-visibility (> 85%) phase-insensitive dip in the intensity of an optical
beam when the two interferometer arms are equal to within the coherence length
of the light. This signature is unique in classical interferometry, but is a
direct analogue to Hong-Ou-Mandel quantum interference. Our technique exhibits
all the metrological advantages of the quantum interferometer, but with signals
at least 10^7 times greater. In particular we demonstrate enhanced resolution,
robustness against loss, and automatic dispersion cancellation. Our
interferometer offers significant advantages over previous technologies, both
quantum and classical, in precision time delay measurements and biomedical
imaging. (Comment: 6 pages, 4 figures)
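The dip described above has the same functional form as the Hong-Ou-Mandel coincidence dip. A minimal sketch, assuming a Gaussian spectrum of bandwidth `sigma` and using the >85% visibility quoted in the abstract (`sigma` itself is an assumed parameter):

```python
import numpy as np

# Phase-insensitive interference dip versus arm delay tau:
#   S(tau) = 1 - V * exp(-(sigma * tau)**2)
# The dip reaches 1 - V at zero delay and recovers to 1 once the
# delay exceeds the coherence length of the light.
def dip_signal(tau, sigma=1.0, visibility=0.85):
    return 1.0 - visibility * np.exp(-(sigma * tau) ** 2)

delays = np.linspace(-4, 4, 9)
print(np.round(dip_signal(delays), 3))
```

The width of the dip is set by the coherence length (inverse bandwidth), so a broader spectrum gives a sharper dip and hence finer time-delay resolution.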
Quantum computing with mixed states
We discuss a model for quantum computing with initially mixed states.
Although such a computer is known to be less powerful than a quantum computer
operating with pure (entangled) states, it may efficiently solve some problems
for which no efficient classical algorithms are known. We suggest a new
implementation of quantum computation with initially mixed states in which an
algorithm realization is achieved by means of optimal basis independent
transformations of qubits. (Comment: 2 figures, 52 references)
Silencing Early Viral Replication in Macrophages and Dendritic Cells Effectively Suppresses Flavivirus Encephalitis
West Nile (WN) and St. Louis encephalitis (SLE) viruses can cause fatal
neurological infection and currently there is neither a specific treatment nor
an approved vaccine for these infections. In our earlier studies, we have
reported that siRNAs can be developed as broad-spectrum antivirals for the
treatment of infection caused by related viruses and that a small peptide called
RVG-9R can deliver siRNA to neuronal cells as well as macrophages. To increase
the repertoire of broad-spectrum antiflaviviral siRNAs, we screened 25 siRNAs
targeting conserved regions in the viral genome. Five siRNAs were found to
inhibit both WNV and SLE replication in vitro, reflecting broad-spectrum
antiviral activity, and one of these was also validated in vivo. In addition,
we show that RVG-9R delivers siRNA to macrophages and dendritic cells,
resulting in effective suppression of virus replication. Mice were challenged
intraperitoneally (i.p.) with West Nile virus (WNV) and treated i.v. with
siRNA/peptide complex. Peritoneal macrophages were isolated on day 3 post
infection and transferred to new hosts. Mice receiving macrophages from the
antiviral-siRNA-treated animals failed to develop any disease, while control
mice that received macrophages from irrelevant-siRNA-treated animals all died of
encephalitis. These studies suggest that early suppression of viral replication
in macrophages and dendritic cells by RVG-9R-mediated siRNA delivery is key to
preventing the development of a fatal neurological disease.
On opportunistic software reuse
The availability of open source assets for almost all imaginable domains has led the software industry to opportunistic design, an approach in which people develop new software systems in an ad hoc fashion by reusing and combining components that were not designed to be used together. In this paper we investigate this emerging approach. We demonstrate the approach with an industrial example in which Node.js modules and various subsystems are used in an opportunistic way. Furthermore, to study opportunistic reuse as a phenomenon, we present the results of three contextual interviews and a survey with reuse practitioners to understand to what extent opportunistic reuse offers improvements over traditional systematic reuse approaches.
Development of an algorithm for phenotypic screening of carbapenemase-producing Enterobacteriaceae in the routine laboratory
‘It's the relationship you develop with them’: emotional intelligence in nurse leadership. A qualitative study
Interpreting Meta-Analyses of Genome-Wide Association Studies
Meta-analysis is an increasingly popular tool for combining multiple genome-wide association studies in a single analysis to identify associations with small effect sizes. The effect sizes between studies in a meta-analysis may differ, and these differences, or heterogeneity, can be caused by many factors. If heterogeneity is observed in the results of a meta-analysis, interpreting its cause is important because the correct interpretation can lead to a better understanding of the disease and a more effective design of a replication study. However, interpreting heterogeneous results is difficult. The standard approach of examining the association p-values of the studies does not effectively predict whether the effect exists in each study. In this paper, we propose a framework facilitating the interpretation of the results of a meta-analysis. Our framework is based on a new statistic representing the posterior probability that the effect exists in each study, which is estimated utilizing cross-study information. Simulations and application to real data show that our framework can effectively segregate the studies predicted to have an effect, the studies predicted to not have an effect, and the ambiguous studies that are underpowered. In addition to helping interpretation, the new framework also allows us to develop a new association testing procedure that takes the existence of effects into account.
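The posterior-probability idea can be illustrated with a simplified two-hypothesis Bayes computation: for each study, compare the likelihood of its effect estimate under "no effect" against a cross-study alternative whose mean and heterogeneity variance are plugged in from the other studies. The plug-in estimators, the 0.5 prior, and the toy data below are all assumptions for illustration, not the paper's statistic:

```python
import numpy as np
from math import exp, sqrt, pi

def normal_pdf(x, mean, var):
    # Density of N(mean, var) at x.
    return exp(-(x - mean) ** 2 / (2 * var)) / sqrt(2 * pi * var)

def effect_posteriors(betas, ses, prior_exists=0.5):
    """Posterior probability that an effect exists in each study.

    H0: beta_i ~ N(0, se_i^2);  H1: beta_i ~ N(mu, se_i^2 + tau2),
    with mu and tau2 estimated from the *other* studies (cross-study
    information). A simplified sketch, not the published estimator.
    """
    betas, ses = np.asarray(betas, float), np.asarray(ses, float)
    posts = []
    for i in range(len(betas)):
        others = np.delete(np.arange(len(betas)), i)
        w = 1.0 / ses[others] ** 2
        mu = np.sum(w * betas[others]) / np.sum(w)      # plug-in mean effect
        tau2 = max(np.var(betas[others]) - np.mean(ses[others] ** 2), 0.0)
        like_h1 = normal_pdf(betas[i], mu, ses[i] ** 2 + tau2)
        like_h0 = normal_pdf(betas[i], 0.0, ses[i] ** 2)
        p1 = prior_exists * like_h1
        posts.append(p1 / (p1 + (1 - prior_exists) * like_h0))
    return np.array(posts)

# Toy data: four concordant studies and one null-looking study.
betas = [0.30, 0.28, 0.33, 0.29, 0.01]
ses   = [0.05, 0.06, 0.05, 0.07, 0.05]
print(np.round(effect_posteriors(betas, ses), 3))
```

On this toy input the four concordant studies receive posteriors near 1 and the discordant study a posterior near 0, mirroring the segregation into "effect", "no effect", and ambiguous studies described above.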
