Research on 4th-Generation Mobile Systems
The standardization of 3G mobile systems is nearing completion, at least as far as the defining capabilities are concerned. It is therefore vital to investigate the techniques and procedures that will play a decisive role in the next, 4G systems. Several such research directions exist; in our project we concentrated on the most important ones. Below we list the areas investigated and briefly summarize the results achieved. Spread spectrum systems: we developed a new call admission control method applicable on the radio (air) interface and confirmed its efficiency by simulation studies. Project member Gábor Jeney successfully defended his Ph.D. dissertation on multiuser detection techniques based on neural networks, and the results were also incorporated into project leader Sándor Imre's DSc (MTA doctoral) dissertation. Application of IP in mobile systems: we further developed, tested and generalized the new ring-based IP access concept created within the project, which offers higher reliability than current solutions; the corresponding protocols have also been developed. Project member Máté Szalay's Ph.D. dissertation in this field has reached the stage of public defense. Quantum computing based solutions in 3G/4G detection: we devised a new multiuser detection method built on quantum computing principles, developing new quantum algorithms for this purpose. The results were published in international journals and in a book published by Wiley, entitled 'Quantum Computing and Communications - an engineering approach'.
Quantum state preparation and macroscopic entanglement in gravitational-wave detectors
Long-baseline laser-interferometer gravitational-wave detectors are operating
at a factor of 10 (in amplitude) above the standard quantum limit (SQL) within
a broad frequency band. Such a low classical noise budget has already allowed
the creation of a controlled 2.7 kg macroscopic oscillator with an effective
eigenfrequency of 150 Hz and an occupation number of 200. This result, along
with the prospect for further improvements, heralds the new possibility of
experimentally probing macroscopic quantum mechanics (MQM) - quantum mechanical
behavior of objects in the realm of everyday experience - using
gravitational-wave detectors. In this paper, we provide the mathematical
foundation for the first step of a MQM experiment: the preparation of a
macroscopic test mass into a nearly minimum-Heisenberg-limited Gaussian quantum
state, which is possible if the interferometer's classical noise beats the SQL
in a broad frequency band. Our formalism, based on Wiener filtering, allows a
straightforward conversion from the classical noise budget of a laser
interferometer, in terms of noise spectra, into the strategy for quantum state
preparation, and the quality of the prepared state. Using this formalism, we
consider how Gaussian entanglement can be built between two macroscopic test
masses, and assess the performance of the planned Advanced LIGO interferometers
in quantum-state preparation.
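The Wiener-filter step described above can be illustrated with a toy estimation problem. The sketch below is not the interferometer model of the paper; it only shows how a signal spectrum and a classical-noise spectrum combine into the non-causal Wiener filter H(f) = S_x(f) / (S_x(f) + S_n(f)) and how the resulting estimate improves on the raw measurement record. All spectra and rates are made-up illustrative numbers.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 4096
fs = 1024.0  # Hz, assumed sampling rate

# "Signal": a band-limited Gaussian process peaked near 150 Hz (cf. the
# oscillator eigenfrequency in the abstract); "noise": flat classical noise.
freqs = np.fft.rfftfreq(n, 1 / fs)
sig_psd = np.exp(-0.5 * ((freqs - 150.0) / 10.0) ** 2)  # assumed signal PSD shape
noise_psd = 0.05 * np.ones_like(freqs)                  # assumed flat noise PSD

def realization(psd):
    """Draw a random real time series whose spectrum follows `psd` (up to scale)."""
    spec = np.sqrt(psd) * np.exp(1j * rng.uniform(0, 2 * np.pi, len(psd)))
    return np.fft.irfft(spec, n)

x = realization(sig_psd)         # true state to be estimated
y = x + realization(noise_psd)   # noisy measurement record

# Non-causal Wiener filter applied in the frequency domain
H = sig_psd / (sig_psd + noise_psd)
x_hat = np.fft.irfft(H * np.fft.rfft(y), n)

mse_raw = np.mean((y - x) ** 2)
mse_wiener = np.mean((x_hat - x) ** 2)
print(mse_wiener < mse_raw)  # the filtered estimate should have lower error
```

The same conversion, spectra in, estimator quality out, is what lets a measured noise budget be turned directly into the conditional state of the test mass.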
The UTMOST: A hybrid digital signal processor transforms the MOST
The Molonglo Observatory Synthesis Telescope (MOST) is an 18,000 square meter
radio telescope situated some 40 km from the city of Canberra, Australia. Its
operating band (820-850 MHz) is now partly allocated to mobile phone
communications, making radio astronomy challenging. We describe how the
deployment of new digital receivers (RX boxes), Field Programmable Gate Array
(FPGA) based filterbanks and server-class computers equipped with 43 GPUs
(Graphics Processing Units) has transformed MOST into a versatile new
instrument (the UTMOST) for studying the dynamic radio sky on millisecond
timescales, ideal for work on pulsars and Fast Radio Bursts (FRBs). The
filterbanks, servers and their high-speed, low-latency network form part of a
hybrid solution to the observatory's signal processing requirements. The
emphasis on software and commodity off-the-shelf hardware has enabled rapid
deployment through the re-use of proven 'software backends' for its signal
processing. The new receivers have ten times the bandwidth of the original MOST
and double the sampling of the line feed, which doubles the field of view. The
UTMOST can simultaneously excise interference, make maps, coherently dedisperse
pulsars, and perform real-time searches of coherent fan beams for dispersed
single pulses. Although system performance is still sub-optimal, a pulsar
timing and FRB search programme has commenced and the first UTMOST maps have
been made. The telescope operates as a robotic facility, deciding how to
efficiently target pulsars and how long to stay on source, via feedback from
real-time pulsar folding. The regular timing of over 300 pulsars has resulted
in the discovery of 7 pulsar glitches and 3 FRBs. The UTMOST demonstrates that
if sufficient signal processing can be applied to the voltage streams it is
possible to perform innovative radio science in hostile radio frequency
environments.
Comment: 12 pages, 6 figures
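The dedispersion task mentioned above amounts to removing the cold-plasma arrival-time delay between frequency channels before pulses can be detected. A minimal sketch using the standard dispersion-delay formula, with an illustrative (not UTMOST-measured) dispersion measure:

```python
# Cold-plasma dispersion delay between two observing frequencies: the quantity
# any dedispersion pipeline (coherent or incoherent) must correct for.
K_DM_MS = 4.148808  # ms GHz^2 pc^-1 cm^3, standard dispersion constant

def dispersion_delay_ms(dm, f_lo_ghz, f_hi_ghz):
    """Delay (ms) of the low-frequency band edge relative to the high edge."""
    return K_DM_MS * dm * (f_lo_ghz ** -2 - f_hi_ghz ** -2)

# Across the 820-850 MHz MOST/UTMOST band, for a hypothetical FRB-like DM:
delay = dispersion_delay_ms(dm=700.0, f_lo_ghz=0.820, f_hi_ghz=0.850)
print(f"{delay:.1f} ms")  # roughly 0.3 s of smearing across the band
```

Searching in dispersed fan beams means evaluating this correction over a grid of trial DM values in real time.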
A non-adiabatic approach to entanglement distribution over long distances
Entanglement distribution between trapped-atom quantum memories, viz. single
atoms in optical cavities, is addressed. In most scenarios, the rate of
entanglement distribution depends on the efficiency with which the state of
traveling single photons can be transferred to trapped atoms. This loading
efficiency is analytically studied for two-level, Λ-level, V-level,
and double-Λ-level atomic configurations by means of a system-reservoir
approach. An off-resonant non-adiabatic approach to loading Λ-level
trapped-atom memories is proposed, and the ensuing trade-offs between the
atom-light coupling rate and input photon bandwidth for achieving a high
loading probability are identified. The non-adiabatic approach allows a broad
class of optical sources to be used, and in some cases it provides a higher
system throughput than what can be achieved by adiabatic loading mechanisms.
The analysis is extended to the case of two double-Λ trapped-atom
memories illuminated by a polarization-entangled biphoton.
Comment: 15 pages, 15 figures
Phenomenological Study of Decoherence in Solid-State Spin Qubits due to Nuclear Spin Diffusion
We present a study of the prospects for coherence preservation in solid-state
spin qubits using dynamical decoupling protocols. Recent experiments have
provided the first demonstrations of multipulse dynamical decoupling sequences
in this qubit system, but quantitative analyses of potential coherence
improvements have been hampered by a lack of concrete knowledge of the relevant
noise processes. We present simulations of qubit coherence under the
application of arbitrary dynamical decoupling pulse sequences based on an
experimentally validated semiclassical model. This phenomenological approach
bundles the details of underlying noise processes into a single experimentally
relevant noise power spectral density. Our results show that the dominant
features of experimental measurements in a two-electron singlet-triplet spin
qubit can be replicated using a noise power spectrum associated
with nuclear-spin-flips in the host material. Beginning with this validation we
address the effects of nuclear programming, high-frequency nuclear-spin
dynamics, and other high-frequency classical noise sources, with conjectures
supported by physical arguments and microscopic calculations where relevant.
Our results provide expected performance bounds and identify diagnostic metrics
that can be measured experimentally in order to better elucidate the underlying
nuclear spin dynamics.
Quantum interface between frequency-uncorrelated down-converted entanglement and atomic-ensemble quantum memory
Photonic entanglement source and quantum memory are two basic building blocks
of linear-optical quantum computation and long-distance quantum communication.
In the past decades, intensive research has been carried out, and remarkable
progress, particularly based on spontaneous parametric down-conversion (SPDC)
entanglement sources and atomic ensembles, has been achieved. Currently,
an important task towards scalable quantum information processing (QIP) is to
efficiently write and read entanglement generated from an SPDC source into and
out of an atomic quantum memory. Here we report the first experimental
realization of a quantum interface by building a 5 MHz frequency-uncorrelated
SPDC source and reversibly mapping the generated entangled photons into and out
of a remote optically thick cold atomic memory using electromagnetically
induced transparency. The frequency correlation between the entangled photons
is almost fully eliminated with a suitable pump pulse. The storage of a
triggered single photon with arbitrary polarization is shown to reach an
average fidelity of 92% for 200 ns storage time. Moreover,
polarization-entangled photon pairs are prepared, and one of the photons is
stored in the atomic memory while the other continues to propagate. A CHSH
Bell inequality measurement shows a clear violation for storage times up to 1
microsecond, demonstrating that the entanglement survives storage. Our work
establishes a crucial element for implementing scalable all-optical QIP, and
thus represents substantial progress in quantum information science.
Comment: 28 pages, 4 figures, 1 table
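For reference, the ideal CHSH value that the stored entanglement is tested against can be computed directly for a Bell state. The sketch below uses textbook measurement settings in the x-z plane (assumed for illustration, not the settings of the experiment) and reproduces the Tsirelson bound 2√2, against the classical limit of 2:

```python
import numpy as np

# Pauli matrices and the |Phi+> = (|00> + |11>)/sqrt(2) Bell state
sz = np.array([[1, 0], [0, -1]], dtype=complex)
sx = np.array([[0, 1], [1, 0]], dtype=complex)
phi_plus = np.array([1, 0, 0, 1], dtype=complex) / np.sqrt(2)

def correlator(a, b):
    """E(a, b) = <Phi+| A(a) (x) B(b) |Phi+> for spin measurements at angles a, b."""
    A = np.cos(a) * sz + np.sin(a) * sx
    B = np.cos(b) * sz + np.sin(b) * sx
    return np.real(phi_plus.conj() @ np.kron(A, B) @ phi_plus)

# Standard optimal settings: a = 0, a' = pi/2, b = pi/4, b' = 3*pi/4
a, a2, b, b2 = 0.0, np.pi / 2, np.pi / 4, 3 * np.pi / 4
S = correlator(a, b) - correlator(a, b2) + correlator(a2, b) + correlator(a2, b2)
print(round(S, 3))  # 2.828, i.e. 2*sqrt(2) > 2
```

Any degradation during storage (decoherence, imperfect retrieval) pulls the measured S below this ideal value; a violation of S > 2 certifies that entanglement survived.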
Review of high-contrast imaging systems for current and future ground- and space-based telescopes I. Coronagraph design methods and optical performance metrics
The Optimal Optical Coronagraph (OOC) Workshop at the Lorentz Center in
September 2017 in Leiden, the Netherlands gathered a diverse group of 25
researchers working on exoplanet instrumentation to stimulate the emergence and
sharing of new ideas. In this first installment of a series of three papers
summarizing the outcomes of the OOC workshop, we present an overview of design
methods and optical performance metrics developed for coronagraph instruments.
The design and optimization of coronagraphs for future telescopes has
progressed rapidly over the past several years in the context of space mission
studies for Exo-C, WFIRST, HabEx, and LUVOIR as well as ground-based
telescopes. Design tools have been developed at several institutions to
optimize a variety of coronagraph mask types. We aim to give a broad overview
of the approaches used, examples of their utility, and provide the optimization
tools to the community. Though it is clear that the basic function of
coronagraphs is to suppress starlight while maintaining light from off-axis
sources, our community lacks a general set of standard performance metrics that
apply to both detecting and characterizing exoplanets. The attendees of the OOC
workshop agreed that it would benefit our community to clearly define
quantities for comparing the performance of coronagraph designs and systems.
Therefore, we also present a set of metrics that may be applied to theoretical
designs, testbeds, and deployed instruments. We show how these quantities may
be used to relate the basic properties of the optical instrument to the
detection significance of a given point source in the presence of realistic
noise.
Comment: To appear in Proceedings of the SPIE, vol. 1069
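As a toy example of such a metric, the sketch below relates raw contrast and off-axis throughput to a photon-noise-limited detection significance. All rates, contrasts, and times are hypothetical illustrative values, not numbers from any of the cited mission studies:

```python
import numpy as np

# Photon-noise-limited point-source detection significance behind a coronagraph.
star_rate = 1e8        # stellar photons/s entering the instrument (assumed)
raw_contrast = 1e-9    # residual starlight intensity at the planet's location (assumed)
planet_contrast = 5e-10  # planet/star flux ratio (assumed)
throughput = 0.3       # off-axis (planet) throughput of the coronagraph (assumed)
t_int = 3600.0         # integration time in seconds

n_planet = star_rate * planet_contrast * throughput * t_int  # planet photons
n_speckle = star_rate * raw_contrast * t_int                 # residual-starlight photons

# Shot-noise-limited significance of the detection
snr = n_planet / np.sqrt(n_planet + n_speckle)
print(f"SNR ~ {snr:.1f}")
```

Even this crude budget shows why raw contrast and off-axis throughput must be quoted together: halving the residual starlight or doubling the planet throughput changes the achievable significance for the same telescope and integration time.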