Blockchain in Libraries
This issue of Library Technology Reports (vol. 55, no. 8), “Blockchain in Libraries,” examines the application of blockchain in libraries. Blockchain technology has the potential to transform how libraries provide services and organize information. To date, most of these applications are still in the conceptual stage. However, sooner or later, development and implementation will follow. This report is intended to provide a primer on the technology and some thought starters. In Chapter 2, the concept of blockchain is explained. Chapter 3 provides eight thought and conversation starters that look at how blockchain could be applied in libraries. Chapter 4 looks at the barriers and challenges of implementing blockchain in libraries. Chapter 5 raises some questions around ethical issues that librarians should consider with respect to blockchain implementation.
Variational quantum simulation of U(1) lattice gauge theories with qudit systems
Lattice gauge theories are fundamental to various fields, including particle
physics, condensed matter, and quantum information theory. Recent progress in
the control of quantum systems allows for studying Abelian lattice gauge
theories in table-top experiments. However, several challenges remain, such as
implementing dynamical fermions in higher spatial dimensions and magnetic field
terms. Here, we map D-dimensional U(1) Abelian lattice gauge theories onto
qudit systems with local interactions for arbitrary D. We propose a variational
quantum simulation scheme for the qudit system with a local Hamiltonian, which
can be implemented on a universal qudit quantum device such as the one developed
in [Nat. Phys. 18, 1053-1057 (2022)]. We describe how to implement the variational
imaginary-time evolution protocol for ground state preparation as well as the
variational real-time evolution protocol to simulate non-equilibrium physics on
universal qudit quantum computers, supplemented with numerical simulations. Our
proposal can serve as a way of simulating lattice gauge theories, particularly
in higher spatial dimensions, with minimal resources in terms of both system
size and gate count.
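
The variational protocols are only named in the abstract. As a rough, hypothetical illustration of the imaginary-time step, the Python/NumPy sketch below applies a McLachlan-type update A dθ/dτ = -C, with A_ij = Re<∂_i ψ|∂_j ψ> and C_i = Re<∂_i ψ|H|ψ>, to a toy two-parameter single-qutrit ansatz and a toy Hamiltonian (neither is taken from the paper); state derivatives are taken by finite differences.

    import numpy as np
    from scipy.linalg import expm

    # Toy qutrit Hamiltonian (illustrative only, not the lattice gauge theory model).
    H = np.diag([0.0, 1.0, 4.0]) - 0.5 * (np.diag([1.0, 1.0], 1) + np.diag([1.0, 1.0], -1))

    G01 = np.array([[0, -1j, 0], [1j, 0, 0], [0, 0, 0]], dtype=complex)  # 0-1 rotation generator
    G12 = np.array([[0, 0, 0], [0, 0, -1j], [0, 1j, 0]], dtype=complex)  # 1-2 rotation generator

    def ansatz(theta):
        # Two single-qudit rotations acting on |0>.
        psi0 = np.array([1.0, 0.0, 0.0], dtype=complex)
        return expm(-1j * theta[1] * G12) @ expm(-1j * theta[0] * G01) @ psi0

    def state_derivatives(theta, eps=1e-6):
        # Finite-difference derivatives d|psi>/d theta_i.
        return [(ansatz(theta + eps * np.eye(len(theta))[i]) - ansatz(theta)) / eps
                for i in range(len(theta))]

    theta, dtau = np.array([0.3, 0.3]), 0.05
    for _ in range(200):
        psi, dpsi = ansatz(theta), state_derivatives(theta)
        A = np.real([[np.vdot(di, dj) for dj in dpsi] for di in dpsi])   # metric tensor
        C = np.real([np.vdot(di, H @ psi) for di in dpsi])               # energy gradient
        theta = theta - dtau * np.linalg.solve(A + 1e-9 * np.eye(2), C)  # Euler step of A dtheta/dtau = -C

    psi = ansatz(theta)
    print("variational energy:", np.real(np.vdot(psi, H @ psi)))
    print("exact ground energy:", np.linalg.eigvalsh(H)[0])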
Independent Promotion of Young Talents in Satellite Development on the Full-Scale Satellite Mission SOURCE
The SOURCE mission is the first student satellite developed at the University of Stuttgart. This unique opportunity for undergraduate and graduate students is made possible by the cooperation between the Institute of Space Systems (IRS) and the Small Satellite Student Society (KSat e.V.).
Exploring larval phenology as predictor for range expansion in an invasive species
Predicting range expansion of invasive species is one of the key challenges in ecology. We modelled the phenological window for successful larval release and development (WLR) in order to predict poleward expansion of the invasive crab Hemigrapsus sanguineus along the Atlantic coast of North America and northern Europe. WLR quantifies the number of opportunities (in days) when larval release leads to a successful completion of the larval phase; WLR depends on the effects of temperature on the duration of larval development and survival. Successful larval development is a necessary requirement for the establishment of self‐persistent local populations. WLR was computed from a mechanistic model, based on in situ temperature time series and a laboratory-calibrated curve predicting duration of larval development from temperature. As a validation step, we checked that model predictions of the time of larval settlement matched observations from the field for our local population (Helgoland, North Sea). We then applied our model to the North American shores because larvae from our European population showed, in the laboratory, responses to temperature similar to those of a North American population. WLR correctly predicted the northern distribution limit on North American shores, where the poleward expansion of H. sanguineus appears to have stalled (as of 2015). For northern Europe, where H. sanguineus is a recent invader, WLR predicted ample room for poleward expansion towards NE England and S Norway. We also explored the importance of year‐to‐year variation in temperature for WLR and potential expansion: variations in WLR highlighted the role of heat waves as likely promoters of recruitment subsidising sink populations located at the distribution limits. Overall, phenological windows may be used as part of a warning system enabling more targeted monitoring programs.
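
The abstract describes WLR only at a high level. The Python/NumPy sketch below shows one hypothetical way such a phenological window could be computed from a daily temperature series; the development-rate curve, survival threshold, and synthetic temperatures are placeholders, not the laboratory calibration or field data used in the study.

    import numpy as np

    def development_rate(temp_c):
        # Fraction of larval development completed per day at a given temperature
        # (hypothetical degree-day-style curve above a 12 C threshold).
        return np.maximum(temp_c - 12.0, 0.0) / 300.0

    def wlr(daily_temp_c, min_survival_temp=10.0):
        # Count release days from which larvae complete development (progress >= 1)
        # before the series ends, without temperature dropping below the survival limit.
        rates = development_rate(daily_temp_c)
        window = 0
        for release_day in range(len(daily_temp_c)):
            progress = np.cumsum(rates[release_day:])
            if not np.any(progress >= 1.0):
                continue
            done = int(np.argmax(progress >= 1.0))
            if np.all(daily_temp_c[release_day:release_day + done + 1] >= min_survival_temp):
                window += 1
        return window

    # Example: one synthetic year of sea-surface temperatures.
    days = np.arange(365)
    sst = 12.0 + 8.0 * np.sin(2 * np.pi * (days - 120) / 365)
    print("WLR (days with successful larval development):", wlr(sst))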
Characterizing quantum instruments: from non-demolition measurements to quantum error correction
In quantum information processing, quantum operations are often interleaved
with measurements which produce classical data. Due to the information gain
from these classical measurement outputs, non-unitary dynamical processes can
take place on the system, whose time evolution common quantum channel
descriptions fail to capture. Quantum measurements are correctly treated by
means of so-called quantum instruments capturing both classical outputs and
post-measurement quantum states. Here we present a general recipe to
characterize quantum instruments, alongside its experimental implementation and
analysis. Thereby, the full dynamics of a quantum instrument can be captured,
exhibiting details of the quantum dynamics that would be overlooked with common
tomography techniques. For illustration, we apply our characterization
technique to a quantum instrument used for the detection of qubit loss and
leakage, which was recently implemented as a building block in a quantum error
correction (QEC) experiment (Nature 585, 207-210 (2020)). Our analysis reveals
unexpected and in-depth information about the failure modes of the
implementation of the quantum instrument. We then numerically study the
implications of these experimental failure modes on QEC performance, when the
instrument is employed as a building block in QEC protocols on a logical qubit.
Our results highlight the importance of careful characterization and modelling
of failure modes in quantum instruments, as compared to simplistic
hardware-agnostic phenomenological noise models, which fail to predict the
undesired behavior of faulty quantum instruments. The presented methods and
results are directly applicable to generic quantum instruments.
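
For readers unfamiliar with the formalism, a quantum instrument can be represented numerically as a collection of Kraus operators grouped by classical outcome. The Python/NumPy sketch below illustrates this with a hypothetical "noisy Z measurement" instrument on a single qubit; it is not the loss- and leakage-detection instrument characterized in the paper.

    import numpy as np

    P0 = np.diag([1.0, 0.0]).astype(complex)
    P1 = np.diag([0.0, 1.0]).astype(complex)
    X = np.array([[0, 1], [1, 0]], dtype=complex)
    eps = 0.02  # hypothetical probability of a bit flip accompanying the readout

    # Instrument: classical outcome -> list of Kraus operators.
    instrument = {
        0: [np.sqrt(1 - eps) * P0, np.sqrt(eps) * X @ P0],
        1: [np.sqrt(1 - eps) * P1, np.sqrt(eps) * X @ P1],
    }

    # Completeness: the Kraus operators of all outcomes together form a channel.
    total = sum(K.conj().T @ K for Ks in instrument.values() for K in Ks)
    assert np.allclose(total, np.eye(2))

    def apply_instrument(rho, instrument):
        # Return {outcome: (probability, normalized post-measurement state)}.
        result = {}
        for k, Ks in instrument.items():
            unnormalized = sum(K @ rho @ K.conj().T for K in Ks)
            p = np.real(np.trace(unnormalized))
            result[k] = (p, unnormalized / p if p > 0 else unnormalized)
        return result

    rho = 0.5 * np.array([[1, 1], [1, 1]], dtype=complex)   # |+><+|
    for k, (p, post) in apply_instrument(rho, instrument).items():
        print(f"outcome {k}: probability {p:.3f}")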
Experimental single-setting quantum state tomography
Quantum computers solve ever more complex tasks using steadily growing system
sizes. Characterizing these quantum systems is vital, yet becoming increasingly
challenging. The gold-standard is quantum state tomography (QST), capable of
fully reconstructing a quantum state without prior knowledge. Measurement and
classical computing costs, however, increase exponentially in the system size -
a bottleneck given the scale of existing and near-term quantum devices. Here,
we demonstrate a scalable and practical QST approach that uses a single
measurement setting, namely symmetric informationally complete (SIC) positive
operator-valued measures (POVM). We implement these nonorthogonal measurements
on an ion trap device by utilizing more energy levels in each ion - without
ancilla qubits. More precisely, we locally map the SIC POVM to orthogonal
states embedded in a higher-dimensional system, which we read out using
repeated in-sequence detections, providing full tomographic information in
every shot. Combining this SIC tomography with the recently developed
randomized measurement toolbox ("classical shadows") proves particularly
powerful. SIC tomography alleviates the need for choosing measurement
settings at random ("derandomization"), while classical shadows enable the
estimation of arbitrary polynomial functions of the density matrix orders of
magnitude faster than standard methods. The latter enables in-depth
entanglement studies, which we experimentally showcase on a 5-qubit absolutely
maximally entangled (AME) state. Moreover, the fact that the full tomography
information is available in every shot enables online QST in real time. We
demonstrate this on an 8-qubit entangled state, as well as for fast state
identification. All in all, these features single out SIC-based classical
shadow estimation as a highly scalable and convenient tool for quantum state
characterization.
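
As a single-qubit illustration of how SIC-POVM outcomes convert into classical shadows, the Python/NumPy sketch below builds the tetrahedral SIC POVM, samples measurement outcomes for a test state, and averages the frame-inverted snapshots 3|psi_k><psi_k| - I, which are unbiased estimators of the state. The test state and shot count are arbitrary; the experiment applies such local SIC measurements to many qubits at once.

    import numpy as np

    rng = np.random.default_rng(0)

    # Tetrahedral Bloch vectors defining the single-qubit SIC states.
    bloch = np.array([[1, 1, 1], [1, -1, -1], [-1, 1, -1], [-1, -1, 1]]) / np.sqrt(3)
    sx = np.array([[0, 1], [1, 0]], dtype=complex)
    sy = np.array([[0, -1j], [1j, 0]], dtype=complex)
    sz = np.array([[1, 0], [0, -1]], dtype=complex)
    I2 = np.eye(2, dtype=complex)

    proj = [0.5 * (I2 + r[0] * sx + r[1] * sy + r[2] * sz) for r in bloch]   # |psi_k><psi_k|
    povm = [0.5 * P for P in proj]                                           # E_k = |psi_k><psi_k| / 2

    rho = 0.5 * (I2 + 0.3 * sx + 0.5 * sz)                 # state to characterize
    probs = np.real([np.trace(E @ rho) for E in povm])

    # Each shot yields one outcome k; its classical shadow is 3|psi_k><psi_k| - I.
    shots = rng.choice(4, size=20000, p=probs / probs.sum())
    shadows = np.array([3 * proj[k] - I2 for k in shots])
    rho_est = shadows.mean(axis=0)

    print("estimated <Z>:", np.real(np.trace(sz @ rho_est)))
    print("true      <Z>:", np.real(np.trace(sz @ rho)))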
Characterizing large-scale quantum computers via cycle benchmarking
Quantum computers promise to solve certain problems more efficiently than
their digital counterparts. A major challenge towards practically useful
quantum computing is characterizing and reducing the various errors that
accumulate during an algorithm running on large-scale processors. Current
characterization techniques are unable to adequately account for the
exponentially large set of potential errors, including cross-talk and other
correlated noise sources. Here we develop cycle benchmarking, a rigorous and
practically scalable protocol for characterizing local and global errors across
multi-qubit quantum processors. We experimentally demonstrate its practicality
by quantifying such errors in non-entangling and entangling operations on an
ion-trap quantum computer with up to 10 qubits, with total process fidelities
for multi-qubit entangling gates ranging from 99.6(1)% for 2 qubits to 86(2)%
for 10 qubits. Furthermore, cycle benchmarking data validates that the error
rate per single-qubit gate and per two-qubit coupling does not increase with
increasing system size.
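
Cycle benchmarking extracts per-cycle decay rates from how Pauli expectation values fall off with the number of repeated cycles. The Python sketch below shows only that fitting step, on synthetic data rather than experimental values; the decay model A * p**m and the chosen parameters are illustrative.

    import numpy as np
    from scipy.optimize import curve_fit

    rng = np.random.default_rng(1)

    def decay(m, A, p):
        # Expectation value of a Pauli observable after m repetitions of the cycle.
        return A * p ** m

    true_A, true_p = 0.98, 0.995
    cycles = np.array([2, 4, 8, 16, 32, 64])
    signal = decay(cycles, true_A, true_p) + rng.normal(0, 0.01, size=cycles.size)

    (A_fit, p_fit), _ = curve_fit(decay, cycles, signal, p0=[1.0, 0.99])
    print(f"fitted per-cycle decay p = {p_fit:.4f} (true {true_p})")

    # In the full protocol, many such decays (one per sampled Pauli) are combined
    # into an estimate of the overall process fidelity of the cycle.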
Simulating 2D lattice gauge theories on a qudit quantum computer
Particle physics underpins our understanding of the world at a fundamental
level by describing the interplay of matter and forces through gauge theories.
Yet, despite their unmatched success, the intrinsic quantum mechanical nature
of gauge theories makes important problem classes notoriously difficult to
address with classical computational techniques. A promising way to overcome
these roadblocks is offered by quantum computers, which are based on the same
laws that make such classical computations so difficult. Here, we present a
quantum computation of the properties of the basic building block of
two-dimensional lattice quantum electrodynamics, involving both gauge fields
and matter. This computation is made possible by the use of a trapped-ion qudit
quantum processor, where quantum information is encoded in multiple states
per ion, rather than in just two states as in qubits. Qudits are ideally suited for
describing gauge fields, which are naturally high-dimensional, leading to a
dramatic reduction in the quantum register size and circuit complexity. Using a
variational quantum eigensolver, we find the ground state of the model and
observe the interplay between virtual pair creation and quantized magnetic
field effects. The qudit approach further allows us to seamlessly observe the
effect of different gauge field truncations by controlling the qudit dimension.
Our results open the door for hardware-efficient quantum simulations with
qudits in near-term quantum devices.
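
To make the truncation idea concrete, the Python/NumPy sketch below diagonalizes a toy single-rotor U(1) Hamiltonian, H = (g^2/2) E^2 - (U + U_dagger)/(2 g^2), for several truncations of the electric field, where the qudit dimension d = 2l + 1 sets the cutoff. This toy model and the coupling value are hypothetical and are not the plaquette Hamiltonian simulated in the paper; it only shows how the qudit dimension acts as a gauge-field truncation.

    import numpy as np

    def ground_energy(l, g=0.8):
        # Electric-basis truncation to integer field values -l, ..., +l.
        d = 2 * l + 1
        E = np.diag(np.arange(-l, l + 1).astype(float))    # electric field eigenvalues
        U = np.diag(np.ones(d - 1), k=-1)                   # lowering operator within the truncation
        H = 0.5 * g**2 * E @ E - (U + U.T) / (2 * g**2)
        return np.linalg.eigvalsh(H)[0]

    for l in [1, 2, 3, 5]:
        print(f"truncation l={l} (qudit dimension d={2 * l + 1}): E0 = {ground_energy(l):.6f}")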
A compact ion-trap quantum computing demonstrator
Quantum information processing is steadily progressing from a purely academic
discipline towards applications throughout science and industry. Transitioning
from lab-based, proof-of-concept experiments to robust, integrated realizations
of quantum information processing hardware is an important step in this
process. However, the nature of traditional laboratory setups does not lend
itself readily to scaling up system sizes or to applications outside of
laboratory-grade environments. This transition requires overcoming challenges
in engineering and integration without sacrificing the state-of-the-art
performance of laboratory implementations. Here, we present a 19-inch rack
quantum computing demonstrator based on optical qubits in
a linear Paul trap to address many of these challenges. We outline the
mechanical, optical, and electrical subsystems. Further, we describe the
automation and remote access components of the quantum computing stack. We
conclude by describing characterization measurements relevant to digital
quantum computing including entangling operations mediated by the
Mølmer-Sørensen interaction. Using this setup we produce maximally entangled
Greenberger-Horne-Zeilinger states with up to 24 ions without the use of
post-selection or error mitigation techniques, on par with well-established
conventional laboratory setups.