(Re)presenting heritage: laser scanning and 3D visualisations for cultural resilience and community engagement.
Cultural heritage is increasingly being viewed as an economic asset for geographic areas that aim to capitalise on the surge of public interest in local history and heritage tourism. Digital technologies have been developed that facilitate new forms of engagement with heritage and allow local areas to showcase their history, potentially broadening interest to a wider audience and thus acting as a driver for cultural and economic resilience. The research presented in this paper explores this through interdisciplinary research combining laser scanning and visualisation with social research in Elgin. 3D data capture technologies were used to develop and test 3D data visualisations and protocols through which the urban built heritage can be digitally recorded. The main focus of this paper is the application and perception of these technologies. Findings suggest that the primary driver for cultural heritage developments was economic (with an emphasis on tourism), but further benefits and key factors of community engagement, social learning and cultural resilience were also reported. Stakeholder engagement and partnership working, in particular, were identified as critical success factors. The findings from the community engagement events demonstrate that laser scanning and visualisation provide a novel and engaging mechanism for co-producing heritage assets. There is a high level of public interest in such technologies, and users who engaged with these models reported that they gained new perspectives (including spatial and temporal perspectives) on the built heritage of the area.
The inevitable youthfulness of known high-redshift radio galaxies
Radio galaxies can be seen out to very high redshifts, where in principle
they can serve as probes of the early evolution of the Universe. Here we show
that for any model of radio-galaxy evolution in which the luminosity decreases
with time after an initial rapid increase (that is, essentially all reasonable
models), all observable high-redshift radio galaxies must be seen when the
lobes are less than 10^7 years old. This means that high-redshift radio
galaxies can be used as a high-time-resolution probe of evolution in the early
Universe. Moreover, this result helps to explain many observed trends of
radio-galaxy properties with redshift [(i) the `alignment effect' of optical
emission along radio-jet axes, (ii) the increased distortion in radio
structure, (iii) the decrease in physical sizes, (iv) the increase in radio
depolarisation, and (v) the increase in dust emission] without needing to
invoke explanations based on cosmology or strong evolution of the surrounding
intergalactic medium with cosmic time, thereby avoiding conflict with current
theories of structure formation.
Comment: To appear in Nature. 4 pages, 2 colour figures available on request.
Also available at http://www-astro.physics.ox.ac.uk/~km
Complete experimental toolbox for alignment-free quantum communication
Quantum communication employs the counter-intuitive features of quantum
physics to perform tasks that are impossible in the classical world. It is
crucial for testing the foundations of quantum theory and promises to
revolutionize our information and communication technologies. However, for two
or more parties to execute even the simplest quantum transmission, they must
establish, and maintain, a shared reference frame. This introduces a
considerable overhead in communication resources, particularly if the parties
are in motion or rotating relative to each other. We experimentally demonstrate
how to circumvent this problem with the efficient transmission of quantum
information encoded in rotationally invariant states of single photons. By
developing a complete toolbox for the efficient encoding and decoding of
quantum information in such photonic qubits, we demonstrate the feasibility
of alignment-free quantum key distribution, and perform a proof-of-principle
alignment-free entanglement distribution and violation of a Bell inequality.
Our scheme should find applications in fundamental tests of quantum mechanics
and satellite-based quantum communication.
Comment: Main manuscript: 7 pages, 3 figures; Supplementary Information: 7 pages, 3 figures
Testing foundations of quantum mechanics with photons
The foundational ideas of quantum mechanics continue to give rise to
counterintuitive theories and physical effects that are in conflict with a
classical description of Nature. Experiments with light at the single photon
level have historically been at the forefront of tests of fundamental quantum
theory and new developments in photonics engineering continue to enable new
experiments. Here we review recent photonic experiments to test two
foundational themes in quantum mechanics: wave-particle duality, central to
recent complementarity and delayed-choice experiments; and Bell nonlocality
where recent theoretical and technological advances have allowed all
controversial loopholes to be separately addressed in different photonics
experiments.
Comment: 10 pages, 5 figures, published as a Nature Physics Insight review article
Demonstration of the event identification capabilities of the NEXT-White detector
In experiments searching for neutrinoless double-beta decay, the possibility of identifying the two emitted electrons is a powerful tool for rejecting background events and therefore improving the overall sensitivity of the experiment. In this paper we present the first measurement of the efficiency of a cut based on the different event signatures of double and single electron tracks, using data from the NEXT-White detector, the first detector of the NEXT experiment operating underground. Using a 228Th calibration source to produce signal-like and background-like events with energies near 1.6 MeV, a signal efficiency of (71.6 ± 1.5 (stat) ± 0.3 (sys))% for a background acceptance of (20.6 ± 0.4 (stat) ± 0.3 (sys))% is found, in good agreement with Monte Carlo simulations. An extrapolation to the energy region of the neutrinoless double-beta decay by means of Monte Carlo simulations is also carried out, and the results obtained show an improvement in background rejection over those obtained at lower energies.
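As a rough illustration of why such a cut improves sensitivity, a common figure of merit for a cut-based selection in a counting experiment is the signal efficiency divided by the square root of the background acceptance; this heuristic, and the sketch below, are not taken from the paper, which performs a full sensitivity study.

```python
import math

# Central values quoted in the abstract (uncertainties ignored in this sketch)
signal_eff = 0.716      # signal efficiency of the topological cut
bkg_acceptance = 0.206  # background acceptance of the same cut

# epsilon / sqrt(b): values above 1 indicate a net gain in sensitivity
# relative to applying no cut at all (standard counting-experiment heuristic)
fom = signal_eff / math.sqrt(bkg_acceptance)
print(f"figure of merit: {fom:.2f}")  # ~1.58
```

A value near 1.6 means the topological cut buys roughly a 60% sensitivity improvement over no cut under this simple heuristic, consistent with the abstract's statement that the cut improves the overall sensitivity.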
Radiogenic backgrounds in the NEXT double beta decay experiment
Natural radioactivity represents one of the main backgrounds in the search for neutrinoless double beta decay. Within the NEXT physics program, the radioactivity-induced backgrounds are measured with the NEXT-White detector. Data from 37.9 days of low-background operations at the Laboratorio Subterráneo de Canfranc with xenon depleted in 136Xe are analyzed to derive a total background rate of (0.84±0.02) mHz above 1000 keV. The comparison of data samples with and without the use of the radon abatement system demonstrates that the contribution of airborne Rn is negligible. A radiogenic background model is built upon the extensive radiopurity screening campaign conducted by the NEXT collaboration. A spectral fit to this model yields the specific contributions of 60Co, 40K, 214Bi and 208Tl to the total background rate, as well as their location in the detector volumes. The results are used to evaluate the impact of the radiogenic backgrounds in the double beta decay analyses, after the application of topological cuts that reduce the total rate to (0.25±0.01) mHz. Based on the best-fit background model, the NEXT-White median sensitivity to the two-neutrino double beta decay is found to be 3.5σ after 1 year of data taking. The background measurement in a Qββ±100 keV energy window validates the best-fit background model also for the neutrinoless double beta decay search with NEXT-100. Only one event is found, while the model expectation is (0.75±0.12) events.
Guaranteed violation of a Bell inequality without aligned reference frames or calibrated devices
Bell tests---the experimental demonstration of a Bell inequality
violation---are central to understanding the foundations of quantum mechanics,
underpin quantum technologies, and are a powerful diagnostic tool for
technological developments in these areas. To date, Bell tests have relied on
careful calibration of the measurement devices and alignment of a shared
reference frame between the two parties---both technically demanding tasks in
general. Surprisingly, we show that neither of these operations is necessary:
Bell inequalities can be violated with near certainty with (i) unaligned, but
calibrated, measurement devices, and (ii) uncalibrated and unaligned devices.
We demonstrate generic quantum nonlocality with randomly chosen local
measurements on a singlet state of two photons implemented with reconfigurable
integrated optical waveguide circuits based on voltage-controlled phase
shifters. The observed results demonstrate the robustness of our schemes to
imperfections and statistical noise. This new approach is likely to have
important applications in both fundamental science and in quantum technologies,
including device-independent quantum key distribution.
Comment: 7 pages, 7 figures
Energy calibration of the NEXT-White detector with 1% resolution near Qββ of 136Xe
Excellent energy resolution is one of the primary advantages of electroluminescent high-pressure xenon TPCs. These detectors are promising tools in searching for rare physics events, such as neutrinoless double-beta decay (ββ0ν), which require precise energy measurements. Using the NEXT-White detector, developed by the NEXT (Neutrino Experiment with a Xenon TPC) collaboration, we show for the first time that an energy resolution of 1% FWHM can be achieved at 2.6 MeV, establishing the present technology as the one with the best energy resolution of all xenon detectors for ββ0ν searches.
Generating, manipulating and measuring entanglement and mixture with a reconfigurable photonic circuit
Entanglement is the quintessential quantum mechanical phenomenon understood
to lie at the heart of future quantum technologies and the subject of
fundamental scientific investigations. Mixture, resulting from noise, is often
an unwanted result of interaction with an environment, but is also of
fundamental interest, and is proposed to play a role in some biological
processes. Here we report an integrated waveguide device that can generate and
completely characterize pure two-photon states with any amount of entanglement
and arbitrary single-photon states with any amount of mixture. The device
consists of a reconfigurable integrated quantum photonic circuit with eight
voltage-controlled phase shifters. We demonstrate that for thousands of
randomly chosen configurations the device performs with high fidelity. We
generate maximally and non-maximally entangled states, violate a Bell-type
inequality with a continuum of partially entangled states, and demonstrate
generation of arbitrary one-qubit mixed states.
Comment: 6 pages, 6 figures
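To make "any amount of entanglement" concrete, a short NumPy sketch (hypothetical, not the paper's characterisation procedure) computes the concurrence of the tunable pure state cos(θ)|00⟩ + sin(θ)|11⟩, which sweeps continuously from a product state (C = 0) to a maximally entangled state (C = 1):

```python
import numpy as np

def tunable_state(theta):
    """|psi(theta)> = cos(theta)|00> + sin(theta)|11> in the computational basis."""
    psi = np.zeros(4)
    psi[0], psi[3] = np.cos(theta), np.sin(theta)
    return psi

def concurrence(psi):
    """Concurrence of a pure two-qubit state: C = |<psi| (sigma_y x sigma_y) |psi*>|."""
    sigma_y = np.array([[0, -1j], [1j, 0]])
    return abs(psi.conj() @ np.kron(sigma_y, sigma_y) @ psi.conj())

for theta in (0.0, np.pi / 8, np.pi / 4):
    print(f"theta = {theta:.3f}: C = {concurrence(tunable_state(theta)):.3f}")
# C = |sin(2*theta)|: 0.000 (product), 0.707 (partial), 1.000 (maximal)
```

For this family the concurrence is |sin 2θ|, so a single circuit parameter already interpolates between separable and maximally entangled, which is the sense in which a reconfigurable device can generate states "with any amount of entanglement".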