23,381 research outputs found
Global crop production forecasting data system analysis
The author identified the following significant results. The findings led to a theory of radiometric discrimination built on the mathematical framework used to discriminate between scintillating radar targets. The theory indicates that discrimination accuracy is driven by two quantities: the contrast ratio between targets and the number of samples, or pixels, observed. These theoretical results have three primary consequences for the data system: (1) agricultural targets must be imaged at correctly chosen times, when the relative stage of crop development maximizes their contrast; (2) under these favorable conditions, the number of observed pixels can be reduced significantly relative to wall-to-wall measurements; and (3) remotely sensed radiometric data must be suitably combined with auxiliary data derived from external sources.
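The dependence of accuracy on contrast ratio and pixel count can be illustrated with a small Monte Carlo sketch (a toy model under assumed exponential speckle statistics, not the author's data system): two scintillating targets are classified by thresholding the mean of N observed pixels.

```python
# Toy sketch (assumed model, not the paper's): discriminating two scintillating
# targets whose per-pixel intensities follow exponential (fully developed
# speckle) statistics, by thresholding the mean of n_pixels samples at the
# geometric mean of the two expected radiances.
import numpy as np

def error_rate(contrast, n_pixels, trials=20000, rng=None):
    """Empirical misclassification rate for two exponential targets.

    contrast : ratio of the two mean radiances (> 1)
    n_pixels : number of independent pixels averaged per decision
    """
    rng = rng or np.random.default_rng(0)
    mu_a, mu_b = 1.0, contrast
    thresh = np.sqrt(mu_a * mu_b)          # geometric-mean threshold
    a = rng.exponential(mu_a, (trials, n_pixels)).mean(axis=1)
    b = rng.exponential(mu_b, (trials, n_pixels)).mean(axis=1)
    errors = np.sum(a > thresh) + np.sum(b < thresh)
    return errors / (2 * trials)

# Error falls as either the contrast ratio or the pixel count grows.
for n in (1, 4, 16, 64):
    print(n, error_rate(contrast=2.0, n_pixels=n))
```

The monotone decrease with both parameters mirrors the abstract's two driving functions: higher contrast or more pixels both shrink the misclassification rate.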
Gravity gradient preliminary investigations on exhibit ''A'' Final report
Quartz microbalance gravity gradiometer performance test
Sensor-assisted Video Mapping of the Seafloor
In recent years video surveys have become an increasingly important ground-truthing tool for acoustic seafloor characterization and benthic habitat mapping studies. However, the ground-truthing and detailed characterization provided by video are still typically done using sparse sample imagery supplemented by physical samples. Combining single video frames into a seamless mosaic can provide a tool by which imagery achieves significant areal coverage while still showing small fauna and biological features at mm resolution. Generating such a mosaic is challenging because of height variations in the imaged terrain and only decimeter-scale knowledge of camera position. This paper discusses the current role of underwater video survey, and the potential for generating consistent, quantitative image maps from video data accompanied by quantities that auxiliary sensors can measure with sufficient accuracy, such as camera tilt and heading, and their use in automated mosaicking techniques. The camera attitude data also provide the information needed to support the development of a video collage. The collage provides a quick look at the large-spatial-scale features in a scene and can be used to pinpoint regions that are likely to yield useful information when rendered into high-resolution mosaics. It is proposed that high-quality mosaics can be produced using consumer-grade cameras and low-cost sensors, thereby allowing for economical scientific video surveys. A case study is presented with results from benthic habitat mapping and the ground-truthing of seafloor acoustic data, using both real underwater imagery and simulations. Computer modeling of the video data acquisition process (in particular over non-flat terrain) allows for a better understanding of the main sources of error in mosaic generation and for the choice of near-optimal processing strategies.
Various spatial patterns of video survey coverage are compared, and it is shown that some patterns offer advantages in terms of accumulated error and overall mosaic accuracy
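Why survey pattern matters for accumulated error can be seen in a minimal sketch (illustrative, not the paper's pipeline): when frames are placed in a common mosaic frame by chaining pairwise homographies, small registration errors compound along the chain.

```python
# Minimal sketch (assumed toy model): frames related by noisy pairwise
# translations, composed as 3x3 homographies. Drift at the end of the chain
# grows with chain length, which is why survey patterns allowing loop
# closures improve overall mosaic accuracy.
import numpy as np

rng = np.random.default_rng(1)

def noisy_translation(dx, dy, sigma=0.05):
    """3x3 homography for a pure translation, with registration noise."""
    H = np.eye(3)
    H[0, 2] = dx + rng.normal(0, sigma)
    H[1, 2] = dy + rng.normal(0, sigma)
    return H

# Chain 100 frames, each nominally shifted 1 unit along x.
H_global = np.eye(3)
for _ in range(100):
    H_global = H_global @ noisy_translation(1.0, 0.0)

# Distance between the estimated and true end-of-chain positions.
drift = np.linalg.norm(H_global[:2, 2] - np.array([100.0, 0.0]))
print(f"end-of-chain drift: {drift:.3f}")
```

With independent per-step noise the drift grows roughly as sqrt(N) times the per-registration error, so patterns that revisit earlier frames let that accumulation be corrected.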
Implementation of quantum maps by programmable quantum processors
A quantum processor is a device with a data register and a program register.
The input to the program register determines the operation, which is a
completely positive linear map, that will be performed on the state in the data
register. We develop a mathematical description for these devices, and apply it
to several different examples of processors. The problem of finding a processor
that will be able to implement a given set of mappings is also examined, and it
is shown that while it is possible to design a finite processor to realize the
phase-damping channel, it is not possible to do so for the amplitude-damping
channel.
Comment: 10 revtex pages, no figures
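The phase-damping channel named in the abstract is a completely positive map with a standard Kraus-operator form; a short sketch of that textbook form (not the paper's processor-based realization) shows its effect on a coherent state.

```python
# Sketch of the single-qubit phase-damping channel in its standard Kraus
# form (textbook convention; the processor construction from the paper is
# not reproduced here).
import numpy as np

def phase_damp(rho, lam):
    """Apply phase damping with parameter lam to density matrix rho."""
    K0 = np.array([[1, 0], [0, np.sqrt(1 - lam)]], dtype=complex)
    K1 = np.array([[0, 0], [0, np.sqrt(lam)]], dtype=complex)
    # Completeness: K0^† K0 + K1^† K1 = I, so the map is trace preserving.
    return K0 @ rho @ K0.conj().T + K1 @ rho @ K1.conj().T

# |+><+| keeps its populations but loses off-diagonal coherence.
plus = np.full((2, 2), 0.5, dtype=complex)
out = phase_damp(plus, 0.5)
print(np.round(out, 4))
```

The diagonal entries are untouched while the off-diagonal elements shrink by a factor sqrt(1 - lam), which is the defining signature of pure dephasing.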
Twenty-one centimeter tomography with foregrounds
Twenty-one centimeter tomography is emerging as a powerful tool to explore
the end of the cosmic dark ages and the reionization epoch, but it will only be
as good as our ability to accurately model and remove astrophysical foreground
contamination. Previous treatments of this problem have focused on the angular
structure of the signal and foregrounds and what can be achieved with limited
spectral resolution (bandwidths in the 1 MHz range). In this paper we introduce
and evaluate a "blind" method to extract the multifrequency 21 cm signal by
taking advantage of the smooth frequency structure of the Galactic and
extragalactic foregrounds. We find that 21 cm tomography is typically limited
by foregrounds on large scales and by noise on small scales, provided that the
experimental bandwidth can be made substantially
smaller than 0.1 MHz. Our results show that this approach is quite promising
even for scenarios with rather extreme contamination from point sources and
diffuse Galactic emission, which bodes well for upcoming experiments such as
LOFAR, MWA, PAST, and SKA.
Comment: 10 pages, 6 figures. Revised version including various cases with high noise level. Major conclusions unchanged. Accepted for publication in Ap
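The "blind" cleaning idea, fitting a smooth function of frequency and keeping the residual, can be sketched along a single line of sight (a toy model with invented numbers, not the paper's method):

```python
# Illustrative sketch (assumed toy model): fit a low-order polynomial in
# log-frequency to a mock total spectrum. The smooth, power-law-like
# foreground is absorbed by the fit; the spectrally rough 21 cm signal
# largely survives in the residual.
import numpy as np

rng = np.random.default_rng(2)
nu = np.linspace(100.0, 200.0, 256)            # observing frequencies, MHz
foreground = 1e4 * (nu / 150.0) ** -2.6        # smooth synchrotron-like spectrum
signal = rng.normal(0.0, 1.0, nu.size)         # rough mock 21 cm fluctuations

total = foreground + signal
coeffs = np.polyfit(np.log(nu), np.log(total), deg=3)
smooth_fit = np.exp(np.polyval(coeffs, np.log(nu)))
residual = total - smooth_fit

print("foreground rms:", foreground.std())
print("residual rms:  ", residual.std())
```

The residual rms drops to roughly the signal level even though the foreground is orders of magnitude brighter, which is the leverage the smooth frequency structure provides; the cost is that signal modes as smooth as the fit are removed too.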
Quantum Error Correction on Linear Nearest Neighbor Qubit Arrays
A minimal depth quantum circuit implementing 5-qubit quantum error correction
in a manner optimized for a linear nearest neighbor architecture is described.
The canonical decomposition is used to construct fast and simple gates that
incorporate the necessary swap operations. Simulations of the circuit's
performance when subjected to discrete and continuous errors are presented. The
relationship between the error rate of a physical qubit and that of a logical
qubit is investigated with emphasis on determining the concatenated error
correction threshold.
Comment: 4 pages, 5 figures
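The concatenated-threshold idea mentioned at the end of the abstract can be sketched with the standard recursion (illustrative constant, not the value derived in the paper): if one level of a distance-3 code maps physical error rate p to roughly c*p^2, then below p_th = 1/c each added level of concatenation helps, and above it each level hurts.

```python
# Toy sketch of the concatenation picture behind an error correction
# threshold. The constant c = 100 (so p_th = 0.01) is an assumed,
# illustrative value, not the paper's result.
def concatenated_rate(p, c=100.0, levels=4):
    """Logical error rate after `levels` rounds of p -> min(1, c*p^2)."""
    for _ in range(levels):
        p = min(1.0, c * p * p)   # cap at 1: it is a probability
    return p

print(concatenated_rate(0.005))  # below p_th: falls doubly exponentially
print(concatenated_rate(0.02))   # above p_th: blows up (capped at 1)
```

Below threshold the logical rate is suppressed doubly exponentially in the concatenation level, which is why determining the threshold of a specific circuit, as the paper does by simulation, is the key figure of merit.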
Implementation of the Quantum Fourier Transform
The quantum Fourier transform (QFT) has been implemented on a three bit
nuclear magnetic resonance (NMR) quantum computer, providing a first step
towards the realization of Shor's factoring and other quantum algorithms.
Implementation of the QFT is presented with fidelity measures and state
tomography. Experimentally realizing the QFT is a clear demonstration of NMR's
ability to control quantum systems.
Comment: 6 pages, 2 figures
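The three-qubit QFT realized in the experiment has a standard gate decomposition, Hadamards and controlled-phase rotations followed by a bit-reversing swap, which can be checked numerically against the discrete Fourier transform matrix (a generic textbook construction, not the NMR pulse sequence used in the paper):

```python
# Build the standard 3-qubit QFT circuit as 8x8 matrices and verify it
# equals the DFT matrix. Qubit 0 is the most significant bit.
import numpy as np

def h_on(q, n=3):
    """Hadamard on qubit q of an n-qubit register."""
    H = np.array([[1, 1], [1, -1]]) / np.sqrt(2)
    ops = [np.eye(2)] * n
    ops[q] = H
    U = ops[0]
    for op in ops[1:]:
        U = np.kron(U, op)
    return U

def crk(a, b, k, n=3):
    """Controlled-R_k: phase exp(2*pi*i/2**k) when qubits a and b are both 1."""
    dim = 2 ** n
    diag = np.ones(dim, dtype=complex)
    for x in range(dim):
        if (x >> (n - 1 - a)) & 1 and (x >> (n - 1 - b)) & 1:
            diag[x] = np.exp(2j * np.pi / 2 ** k)
    return np.diag(diag)

def swap(a, b, n=3):
    """Permutation matrix exchanging qubits a and b."""
    dim = 2 ** n
    P = np.zeros((dim, dim))
    for x in range(dim):
        bits = [(x >> (n - 1 - q)) & 1 for q in range(n)]
        bits[a], bits[b] = bits[b], bits[a]
        P[sum(bit << (n - 1 - q) for q, bit in enumerate(bits)), x] = 1
    return P

# Gate sequence (rightmost acts first), then a final bit-reversal swap.
U = swap(0, 2) @ h_on(2) @ crk(2, 1, 2) @ h_on(1) \
    @ crk(2, 0, 3) @ crk(1, 0, 2) @ h_on(0)

w = np.exp(2j * np.pi / 8)
F = np.array([[w ** (j * k) for k in range(8)] for j in range(8)]) / np.sqrt(8)
print(np.allclose(U, F))  # True
```

The final swap undoes the bit-reversed output order of the Hadamard/controlled-phase cascade; in NMR implementations that reversal is often handled by relabeling spins rather than by physical swap gates.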
Quantum computing with nearest neighbor interactions and error rates over 1%
Large-scale quantum computation will only be achieved if experimentally
implementable quantum error correction procedures are devised that can tolerate
experimentally achievable error rates. We describe a quantum error correction
procedure that requires only a 2-D square lattice of qubits that can interact
with their nearest neighbors, yet can tolerate quantum gate error rates over
1%. The precise maximum tolerable error rate depends on the error model, and we
calculate values in the range 1.1--1.4% for various physically reasonable
models. Even the lowest value represents the highest threshold error rate
calculated to date in a geometrically constrained setting, and a 50%
improvement over the previous record.
Comment: 4 pages, 8 figures
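The meaning of a threshold in this setting can be sketched with the generic scaling form commonly used for topological codes (illustrative numbers only; the paper's computed thresholds are 1.1--1.4%): below threshold, increasing the code distance d suppresses the logical error rate, and above it larger codes do worse.

```python
# Illustrative scaling sketch (generic ansatz with assumed constants, not
# the paper's simulation): logical error rate ~ A * (p/p_th)**((d+1)//2).
def logical_rate(p, d, p_th=0.011, A=0.1):
    """Assumed ansatz for the logical error rate of a distance-d code."""
    return A * (p / p_th) ** ((d + 1) // 2)

for p in (0.005, 0.02):
    print(p, [round(logical_rate(p, d), 6) for d in (3, 5, 7)])
```

Crossing p_th flips the sign of the exponent's effect: the same increase in d that helps at p = 0.005 hurts at p = 0.02, which is why the maximum tolerable error rate is the headline number for such schemes.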