An All Optical Fibre Quantum Controlled-NOT Gate
We report the first experimental demonstration of an optical controlled-NOT
gate constructed entirely in fibre. We operate the gate using two heralded
optical fibre single photon sources and find an average logical fidelity of 90%
and an average process fidelity of 0.83<F<0.91. On the basis of a simple model
we are able to conclude that imperfections are primarily due to the photon
sources, meaning that the gate itself works with very high fidelity.
Comment: 4 pages, 4 figures, comments welcome
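The "average logical fidelity" quoted above is conventionally the probability of obtaining the ideal CNOT output, averaged over the four computational-basis inputs. A minimal sketch of that bookkeeping, using made-up synthetic counts rather than data from the paper:

```python
import numpy as np

# Ideal CNOT truth table in the computational basis:
# input |c t> -> output |c, t XOR c>
IDEAL = {"00": "00", "01": "01", "10": "11", "11": "10"}

def logical_fidelity(counts):
    """Average probability of the ideal CNOT output over the four
    computational-basis inputs.

    counts: dict mapping input string -> {output string: counts}.
    """
    fidelities = []
    for inp, outs in counts.items():
        total = sum(outs.values())
        fidelities.append(outs.get(IDEAL[inp], 0) / total)
    return float(np.mean(fidelities))

# Illustrative (synthetic) measurement record, not data from the paper:
counts = {
    "00": {"00": 92, "01": 3, "10": 3, "11": 2},
    "01": {"01": 91, "00": 4, "10": 2, "11": 3},
    "10": {"11": 89, "10": 6, "00": 3, "01": 2},
    "11": {"10": 88, "11": 7, "00": 2, "01": 3},
}
print(round(logical_fidelity(counts), 2))  # ~0.90 for this synthetic record
```

The process fidelity bound 0.83 < F < 0.91 is a stronger statement obtained from process tomography; the truth-table average above only probes classical input-output behaviour.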
Transgenic amplification of glucocorticoid action in adipose tissue causes high blood pressure in mice
Obesity is closely associated with the metabolic syndrome, a combination of disorders including insulin resistance, diabetes, dyslipidemia, and hypertension. A role for local glucocorticoid reamplification in obesity and the metabolic syndrome has been suggested. The enzyme 11β-hydroxysteroid dehydrogenase type 1 (11β-HSD1) regenerates active cortisol from inactive 11-keto forms, and aP2-HSD1 mice with relative transgenic overexpression of this enzyme in fat cells develop visceral obesity with insulin resistance and dyslipidemia. Here we report that aP2-HSD1 mice also have high arterial blood pressure (BP). The mice have increased sensitivity to dietary salt and increased plasma levels of angiotensinogen, angiotensin II, and aldosterone. This hypertension is abolished by a selective angiotensin II receptor AT-1 antagonist at a low dose that does not affect BP in non-Tg littermates. These findings suggest that activation of the circulating renin-angiotensin system (RAS) develops in aP2-HSD1 mice. The long-term hypertension is further reflected by an appreciable hypertrophy and hyperplasia of the distal tubule epithelium of the nephron, resembling salt-sensitive or angiotensin II-mediated hypertension. Taken together, our findings suggest that overexpression of 11β-HSD1 in fat is sufficient to cause salt-sensitive hypertension mediated by an activated RAS. The potential role of adipose 11β-HSD1 in mediating critical features of the metabolic syndrome extends beyond obesity and metabolic complications to include the most central cardiovascular feature of this disorder.
Swift Observations of GRB 050603: An afterglow with a steep late time decay slope
We report the results of Swift observations of the Gamma Ray Burst GRB
050603. With a magnitude of V=18.2 about 10 hours after the burst, the optical
afterglow was the brightest so far detected by Swift and one of the brightest
optical afterglows ever seen. The Burst Alert Telescope (BAT) light curves show
three fast-rise-exponential-decay spikes with a duration of 12 s and a fluence of
7.6x10^-6 erg cm^-2 in the 15-150 keV band. It was also one of the most energetic
bursts of all time. The Swift spacecraft began observing the afterglow with
the narrow-field instruments about 10 hours after the detection of the burst.
The burst was bright enough to be detected by the Swift UV/Optical telescope
(UVOT) for almost 3 days and by the X-ray Telescope (XRT) for a week after the
burst. The X-ray light curve shows a rapidly fading afterglow with a decay
index alpha=1.76. The X-ray energy spectral index was
beta=0.71±0.10 with the column density in agreement with the
Galactic value. The spectral analysis does not show an obvious change in the
X-ray spectral slope over time. The optical UVOT light curve decays with a
slope of alpha=1.8±0.2.
The steepness and the similarity of the optical and X-ray decay rates suggest
that the afterglow was observed after the jet break. We estimate a jet opening
angle of about 1-2 degrees.
Comment: 14 pages, accepted for publication in Ap
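The decay indices quoted above describe power-law fading, F(t) ∝ t^(-alpha), so the index is simply the negative slope of the light curve in log-log space. A minimal sketch of that fit on synthetic, noiseless data (alpha = 1.76 as reported for the X-ray afterglow; the normalization is arbitrary and not from the paper):

```python
import numpy as np

def decay_index(t, flux):
    """Least-squares slope of log(flux) vs log(t); an afterglow fading as
    F(t) ~ t^(-alpha) gives a fitted slope of -alpha."""
    slope, _ = np.polyfit(np.log10(t), np.log10(flux), 1)
    return -slope

# Synthetic light curve with alpha = 1.76 (the value reported above)
t = np.logspace(-0.5, 1.5, 30)          # days after burst
flux = 2.0e-12 * t ** (-1.76)           # arbitrary normalization
print(round(decay_index(t, flux), 2))   # -> 1.76
```

With real data the fit would of course be weighted by the flux uncertainties, and a broken power law would be used across the jet break.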
Influence of Nanoparticle Size and Shape on Oligomer Formation of an Amyloidogenic Peptide
Understanding the influence of macromolecular crowding and nanoparticles on
the formation of in-register β-sheets, the primary structural component
of amyloid fibrils, is a first step towards describing \emph{in vivo} protein
aggregation and interactions between synthetic materials and proteins. Using
all-atom molecular simulations in implicit solvent, we illustrate the effects of
nanoparticle size, shape, and volume fraction on oligomer formation of an
amyloidogenic peptide from the transthyretin protein. Surprisingly, we find
that inert spherical crowding particles destabilize in-register β-sheets
formed by dimers while stabilizing β-sheets composed of trimers and
tetramers. As the radius of the nanoparticle increases, crowding effects
decrease, implying smaller crowding particles have the largest influence on the
earliest amyloid species. We explain these results using a theory based on the
depletion effect. Finally, we show that spherocylindrical crowders destabilize
the ordered β-sheet dimer to a greater extent than spherical crowders,
which underscores the influence of nanoparticle shape on protein aggregation.
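The depletion-effect theory invoked above can be made quantitative with the standard Asakura-Oosawa expression, a textbook result stated here for context rather than taken from the paper. For two hard spheres of radius R in a bath of ideal depletants of radius r at number density n, the induced attraction at surface separation h is

```latex
U_{\mathrm{dep}}(h) \;=\; -\, n\, k_{B} T \;\frac{\pi}{6}\,(2r - h)^{2}\left(3R + 2r + \frac{h}{2}\right),
\qquad 0 \le h \le 2r .
```

At fixed depletant volume fraction the contact attraction scales roughly as R/r, so smaller crowders produce the stronger depletion force, consistent with the observation that the smallest crowding particles most strongly affect the earliest oligomeric species.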
On the fate of the secondary white dwarf in double-degenerate double-detonation Type Ia supernovae
The progenitor systems and explosion mechanism of Type Ia supernovae are
still unknown. Currently favoured progenitors include double-degenerate systems
consisting of two carbon-oxygen white dwarfs with thin helium shells. In the
double-detonation scenario, violent accretion leads to a helium detonation on
the more massive primary white dwarf that turns into a carbon detonation in its
core and explodes it. We investigate the fate of the secondary white dwarf,
focusing on changes of the ejecta and observables of the explosion if the
secondary explodes as well rather than survives. We simulate a binary system of
two carbon-oxygen white dwarfs, each with a thin helium shell. We follow the
system self-consistently from
inspiral to ignition, through the explosion, to synthetic observables. We
confirm that the primary white dwarf explodes self-consistently. The helium
detonation around the secondary white dwarf, however, fails to ignite a carbon
detonation. We restart the simulation igniting the carbon detonation in the
secondary white dwarf by hand and compare the ejecta and observables of both
explosions. We find that the outer ejecta are
indistinguishable. Light curves and spectra are very similar for the first days
after explosion, and the ejecta are much more spherical than for violent merger
models. The inner ejecta differ significantly which slows down the decline rate
of the bolometric light curve after maximum of the model with a secondary
explosion by about 20 per cent. We expect future synthetic 3D nebular spectra
to confirm or rule out either model.
Comment: 12 pages, 7 figures, submitted to MNRAS, comments welcome
Massless D-strings and moduli stabilization in type I cosmology
We consider the cosmological evolution induced by the free energy F of a gas
of maximally supersymmetric heterotic strings at finite temperature and weak
coupling in dimension D>=4. We show that F, which plays the role of an
effective potential, has minima associated to enhanced gauge symmetries, where
all internal moduli can be attracted and dynamically stabilized. Using the fact
that the heterotic/type I S-duality remains valid at finite temperature and can
be applied at each instant of a quasi-static evolution, we find in the dual
type I cosmology that all internal NS-NS and RR moduli in the closed string
sector and the Wilson lines in the open string sector can be stabilized. For
the special case of D=6, the internal volume modulus remains a flat direction,
while the dilaton is stabilized. An essential role is played by light D-string
modes wrapping the internal manifold and whose contribution to the free energy
cannot be omitted, even when the type I string is at weak coupling. As a
result, the order of magnitude of the internal radii expectation values on the
type I side is (lambda_I alpha')^{1/2}, where lambda_I is the ten-dimensional
string coupling. The non-perturbative corrections to the type I free energy can
alternatively be described as effects of "thermal E1-instantons", whose
worldsheets wrap the compact Euclidean time cycle.
Comment: 39 pages, 1 figure
Experimental Quantum Hamiltonian Learning
Efficiently characterising quantum systems, verifying operations of quantum
devices and validating underpinning physical models, are central challenges for
the development of quantum technologies and for our continued understanding of
foundational physics. Machine-learning enhanced by quantum simulators has been
proposed as a route to improve the computational cost of performing these
studies. Here we interface two different quantum systems through a classical
channel - a silicon-photonics quantum simulator and an electron spin in a
diamond nitrogen-vacancy centre - and use the former to learn the latter's
Hamiltonian via Bayesian inference. We learn the salient Hamiltonian parameter
with high precision. Furthermore, an observed
saturation in the learning algorithm suggests deficiencies in the underlying
Hamiltonian model, which we exploit to further improve the model itself. We go
on to implement an interactive version of the protocol and experimentally show
its ability to characterise the operation of the quantum photonic device. This
work demonstrates powerful new quantum-enhanced techniques for investigating
foundational physical models and characterising quantum technologies.
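The Bayesian-inference step described above can be illustrated with a toy grid-based version. The experiment itself used sequential Monte Carlo with a photonic simulator computing the likelihoods, but the textbook single-parameter precession likelihood P(0 | omega, t) = cos^2(omega t / 2) captures the update rule; everything below is an illustrative sketch, not the paper's implementation:

```python
import numpy as np

# Toy grid-based Bayesian update for a single Hamiltonian parameter omega.
# Likelihood model: P(outcome 0 | omega, t) = cos^2(omega * t / 2), the
# standard single-qubit precession model used in Hamiltonian-learning demos.

rng = np.random.default_rng(0)
true_omega = 2.0                          # "true" parameter to be learned
omegas = np.linspace(0.0, 4.0, 2001)      # hypothesis grid
posterior = np.ones_like(omegas) / len(omegas)

for _ in range(200):
    t = rng.uniform(0.1, 5.0)             # evolution time for this experiment
    p0 = np.cos(true_omega * t / 2) ** 2
    outcome0 = rng.random() < p0          # simulate the measurement outcome
    like = np.cos(omegas * t / 2) ** 2    # likelihood of outcome 0 per hypothesis
    if not outcome0:
        like = 1.0 - like
    posterior *= like                     # Bayes rule, then renormalize
    posterior /= posterior.sum()

estimate = omegas[np.argmax(posterior)]
print(round(estimate, 2))                 # should land close to 2.0
```

The saturation effect mentioned in the abstract shows up in such schemes when the assumed likelihood model is wrong: no amount of further data sharpens the posterior around a parameter value that reproduces all observations.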
Why do authoritarian regimes provide public goods? Policy communities, external shocks and ideas in China's rural social policy making
Recent research on authoritarian regimes argues that they provide public goods in order to prevent rebellion. This essay shows that the "threat of rebellion" alone cannot explain Chinese party-state policies to extend public goods to rural residents in the first decade of the twenty-first century. Drawing on theories of policy making, it argues that China's one-party regime extended public goods to the rural population under the influence of ideas and policy options generated by policy communities of officials, researchers, international organisations and other actors. The party-state centre adopted and implemented these ideas and policy options when they provided solutions to external shocks and supported economic development goals. Explanations of policies and their outcomes in authoritarian political systems need to include not only "dictators" but also other actors, and the ideas they generate.
An Integrated-Photonics Optical-Frequency Synthesizer
Integrated-photonics microchips now enable a range of advanced
functionalities for high-coherence applications such as data transmission,
highly optimized physical sensors, and harnessing quantum states, but with
cost, efficiency, and portability much beyond tabletop experiments. Through
high-volume semiconductor processing built around advanced materials there
exists an opportunity for integrated devices to impact applications cutting
across disciplines of basic science and technology. Here we show how to
synthesize the absolute frequency of a lightwave signal, using integrated
photonics to implement lasers, system interconnects, and nonlinear frequency
comb generation. The laser frequency output of our synthesizer is programmed by
a microwave clock across 4 THz near 1550 nm with 1 Hz resolution and
traceability to the SI second. This is accomplished with a heterogeneously
integrated III/V-Si tunable laser, which is guided by dual
dissipative-Kerr-soliton frequency combs fabricated on silicon chips. Through
out-of-loop measurements of the phase-coherent, microwave-to-optical link, we
verify that the fractional-frequency instability of the integrated photonics
synthesizer matches the reference-clock instability for a 1
second acquisition, and constrain any synthesis error while
stepping the synthesizer across the telecommunication C band. Any application
of an optical frequency source would be enabled by the precision optical
synthesis presented here. Building on the ubiquitous capability in the
microwave domain, our results demonstrate a first path to synthesis with
integrated photonics, leveraging low-cost, low-power, and compact features that
will be critical for its widespread use.
Comment: 10 pages, 6 figures
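The synthesis described above rests on the frequency-comb relation nu_n = n * f_rep + f_ceo, which ties an optical frequency to microwave-rate quantities that a clock can count. A toy calculation with illustrative numbers (not the device parameters from the paper):

```python
# Frequency-comb arithmetic behind optical synthesis: comb line n sits at
# nu_n = n * f_rep + f_ceo, so an optical frequency is fixed entirely by
# microwave-rate quantities. All numbers below are illustrative.

f_rep = 22.0e9        # repetition rate, Hz (microwave-rate soliton comb)
f_ceo = 0.0           # carrier-envelope offset, assumed stabilized to zero
n = 8790              # comb mode index chosen to land near 1550 nm

nu_n = n * f_rep + f_ceo
c = 299_792_458.0     # speed of light, m/s

print(f"comb line: {nu_n / 1e12:.2f} THz")     # 193.38 THz
print(f"wavelength: {c / nu_n * 1e9:.1f} nm")  # ~1550.3 nm
```

Programming the laser then amounts to selecting a mode index and adding a microwave offset from the nearest comb line, which is what makes 1 Hz resolution across several THz achievable with only microwave electronics.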
A knowledge-based framework for service management
The purpose of this paper is to investigate how information and communication technologies are used for service standardisation, customisation, and modularisation by knowledge-intensive service firms through the development and empirical validation of a knowledge-based framework. This paper uses 59 in-depth interviews, observational data, and document analysis from case studies of three service-related departments in high-technology, multinational knowledge-intensive business services (KIBS). Prior research does not conceptualise the relationships between service customisation, standardisation and modularisation. This paper seeks to overcome this gap by integrating insights from research on the role played by both knowledge and information and communication technologies (ICTs) to construct and validate such a framework. It outlines the implications for service firms' use of ICT to deal with increasing knowledge intensity as well as indicating the circumstances under which service knowledge is best customised, standardised and modularised. Further testing in other industries would prove useful in extending the usefulness and applicability of the findings. The originality of the paper lies in developing and validating the first framework to outline the relationship between how service knowledge is customised, standardised or modularised and indicating the associated issues and challenges. It emphasises the role of knowledge and technology. The value of this framework increases as more firms deal with increasing knowledge intensity in the services they provide and in their use of ICTs to reap the benefits of appropriate knowledge reuse.