
    Efficient high-dimensional entanglement imaging with a compressive sensing, double-pixel camera

    We implement a double-pixel, compressive sensing camera to efficiently characterize, at high resolution, the spatially entangled fields produced by spontaneous parametric downconversion. This technique leverages sparsity in the spatial correlations between entangled photons to improve acquisition times over raster scanning by a scaling factor of up to n^2/log(n) for n-dimensional images. We image at resolutions up to 1024 dimensions per detector and demonstrate a channel capacity of 8.4 bits per photon. By comparing the classical mutual information in conjugate bases, we violate an entropic Einstein-Podolsky-Rosen separability criterion at all measured resolutions. More broadly, our result indicates that compressive sensing can be especially effective for higher-order measurements on correlated systems. Comment: 10 pages, 7 figures
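The channel capacity quoted above is the classical mutual information of the joint photon-detection distribution. A minimal sketch of that quantity, computed from a joint coincidence histogram (the function name and the toy 4x4 example are ours, not the paper's):

```python
import numpy as np

def mutual_information_bits(joint_counts):
    """Classical mutual information I(X;Y), in bits, of a 2-D joint
    histogram of coincidence counts (illustrative helper only)."""
    p = joint_counts / joint_counts.sum()   # joint distribution
    px = p.sum(axis=1, keepdims=True)       # marginal of X (column vector)
    py = p.sum(axis=0, keepdims=True)       # marginal of Y (row vector)
    nz = p > 0                              # skip empty cells: 0*log(0) = 0
    return float((p[nz] * np.log2(p[nz] / (px @ py)[nz])).sum())

# Perfectly correlated detections over 4 positions carry log2(4) = 2 bits.
print(mutual_information_bits(np.eye(4)))  # → 2.0
```

With a diagonal (perfectly correlated) joint distribution over n outcomes this gives log2(n) bits per photon, which is why higher detector resolution raises the measurable channel capacity.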

    Compressively characterizing high-dimensional entangled states with complementary, random filtering

    The resources needed to conventionally characterize a quantum system are overwhelmingly large for high-dimensional systems. This obstacle may be overcome by abandoning traditional cornerstones of quantum measurement, such as general quantum states, strong projective measurement, and assumption-free characterization. Following this reasoning, we demonstrate an efficient technique for characterizing high-dimensional, spatial entanglement with one set of measurements. We recover sharp distributions with local, random filtering of the same ensemble in momentum followed by position, something the uncertainty principle forbids for projective measurements. Exploiting the expectation that entangled signals are highly correlated, we use fewer than 5,000 measurements to characterize a 65,536-dimensional state. Finally, we use entropic inequalities to witness entanglement without a density matrix. Our method represents the sea change unfolding in quantum measurement, where methods influenced by the information-theory and signal-processing communities replace unscalable, brute-force techniques, a progression previously followed by classical sensing. Comment: 13 pages, 7 figures

    Locally Adaptive Block Thresholding Method with Continuity Constraint

    We present an algorithm that enables one to perform locally adaptive block thresholding while maintaining image continuity. Images are divided into sub-images based on some standard image attributes, and a thresholding technique is applied to each sub-image. The present algorithm uses the thresholds of neighboring sub-images to calculate a range of values. Image continuity is maintained by constraining the threshold of the sub-image under consideration to lie within this range. After examining the average range values for various sub-image sizes across a variety of images, it was found that the range of acceptable threshold values is substantially wide, justifying our assumption of exploiting the freedom of range to bring out local details. Comment: 12 pages, 4 figures, 1 table
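The two steps described above can be sketched as follows. This is a minimal illustration under our own assumptions: the abstract does not fix the base thresholding technique (a block mean is used as a stand-in) or the neighbourhood definition (4-connected neighbours are assumed):

```python
import numpy as np

def block_thresholds(image, block=32):
    # One threshold per sub-image. The paper's base thresholding
    # technique is unspecified; the block mean is a simple stand-in.
    h, w = image.shape
    ny, nx = h // block, w // block
    t = np.empty((ny, nx))
    for i in range(ny):
        for j in range(nx):
            t[i, j] = image[i*block:(i+1)*block,
                            j*block:(j+1)*block].mean()
    return t

def enforce_continuity(t):
    # Continuity constraint: clamp each block's threshold into the range
    # spanned by its 4-connected neighbours (one pass, reading the
    # original thresholds so the result is order-independent).
    ny, nx = t.shape
    out = t.copy()
    for i in range(ny):
        for j in range(nx):
            nb = [t[a, b]
                  for a, b in ((i-1, j), (i+1, j), (i, j-1), (i, j+1))
                  if 0 <= a < ny and 0 <= b < nx]
            if nb:
                out[i, j] = min(max(t[i, j], min(nb)), max(nb))
    return out
```

An outlier block threshold (e.g. 0 surrounded by neighbours at 10) gets pulled into the neighbours' range, which is what keeps the binarized image continuous across block boundaries.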

    The influence of quantization process on the performance of global entropic thresholding algorithms using electrical capacitance tomography data

    In measuring component fraction in multiphase flows using tomographic techniques, it is desirable to use a high-speed tomography system capable of generating 100 tomograms per second. The electrical capacitance tomography system is considered the best among the available tomographic techniques in this regard. However, due to its inherent limitations the system generates distorted reconstructed tomograms, necessitating extra signal-processing techniques such as thresholding to minimize these distortions. Whilst thresholding has been effective in minimizing distortions, the additional computation associated with the process limits the speed of tomogram generation desired from the system. Further, the accuracy of the technique is limited to the higher portion of the full component fraction range. However, since its performance can be influenced by the nature of the quantization process required a priori, optimal quantization parameters can be found and used to improve performance. In this article the influence of quantization resolution and rate on the performance of global entropic thresholding algorithms has been investigated. Measurement of gas volume component fraction in a multiphase gas/liquid flow with an electrical capacitance tomography system was used for evaluation, with both simulated and online capacitance measurement data. Results show that the optimal quantizer resolution is flow-regime dependent: higher resolutions are optimal for annular flow, while lower resolutions suit stratified flow regimes. A higher resolution also significantly reduces the dependency of the thresholding algorithm on the object to be searched, thereby reducing the complexity of designing a thresholder. Overall, the optimal quantization resolution is 256. Tanzania Journal of Science Vol. 31 (2) 2005: pp. 63-7
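Global entropic thresholding of the kind evaluated here typically means choosing the threshold that maximizes the summed Shannon entropies of the two classes of a quantized histogram; Kapur's method is the best-known instance and is used below as a stand-in, since the abstract does not name the exact algorithms. The quantizer resolution studied in the article corresponds to the `levels` parameter:

```python
import numpy as np

def kapur_threshold(data, levels=256):
    # Quantize the raw measurements into `levels` histogram bins (the
    # quantizer resolution), then pick the bin index that maximizes the
    # summed entropies of the two resulting classes.
    hist, _ = np.histogram(data, bins=levels,
                           range=(data.min(), data.max()))
    p = hist / hist.sum()
    best_t, best_h = 1, -np.inf
    for t in range(1, levels):
        p0, p1 = p[:t].sum(), p[t:].sum()
        if p0 == 0 or p1 == 0:
            continue                       # skip degenerate splits
        q0 = p[:t][p[:t] > 0] / p0         # class-conditional probabilities
        q1 = p[t:][p[t:] > 0] / p1
        h = -(q0 * np.log(q0)).sum() - (q1 * np.log(q1)).sum()
        if h > best_h:
            best_t, best_h = t, h
    return best_t
```

The inner loop over bins is why a finer quantizer costs more computation per tomogram, and the histogram shape is why the optimal resolution depends on the flow regime.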

    Edges Detection Based On Renyi Entropy with Split/Merge

    Most classical methods for edge detection are based on the first- and second-order derivatives of the gray levels of the pixels of the original image. These processes lead to a steep increase in computation time, especially for large images. This paper presents a new algorithm for edge detection based on both the Rényi entropy and the Shannon entropy, using a split-and-merge technique. The objective is to find the best edge representation while decreasing computation time. A set of experiments in the domain of edge detection is presented. The system yields edge-detection performance comparable to classic methods such as Canny, LoG, and Sobel. The experimental results show that this method performs better than the LoG and Sobel methods, and that it beats all three classical methods in CPU time. Another benefit is the easy implementation of this method. Keywords: Rényi entropy, information content, edge detection, thresholding
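For reference, the two entropies the algorithm combines can be sketched in a few lines; the Rényi entropy of order α reduces to the Shannon entropy as α → 1 (the split/merge machinery itself is omitted here):

```python
import numpy as np

def renyi_entropy(p, alpha):
    # H_alpha(p) = ln(sum_i p_i^alpha) / (1 - alpha); the alpha -> 1
    # limit is the Shannon entropy -sum_i p_i ln p_i.
    p = np.asarray(p, dtype=float)
    p = p[p > 0]                                  # ignore empty bins
    if np.isclose(alpha, 1.0):
        return float(-(p * np.log(p)).sum())      # Shannon limit
    return float(np.log((p ** alpha).sum()) / (1.0 - alpha))

# A uniform distribution has the same entropy, ln(n), for every order.
u = [0.25] * 4
print(renyi_entropy(u, 0.5), renyi_entropy(u, 1.0), renyi_entropy(u, 2.0))
```

Varying α changes how strongly the measure weights dominant gray levels, which is the degree of freedom the method exploits relative to Shannon-only entropic thresholding.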

    Histogram analysis of the human brain MR images based on the S-function membership and Shannon's entropy function

    The analysis of medical images for the purpose of computer-aided diagnosis and therapy planning includes segmentation as a preliminary stage for visualization or quantification. In this paper, we present the first step in our fuzzy segmentation system, which is capable of segmenting magnetic resonance (MR) images of a human brain. Histogram analysis based on the S-function membership and Shannon's entropy function allows exact segmentation points to be found. In the final stage, pixel classification is performed using rule-based fuzzy-logic inference. When the segmentation is complete, attributes of the resulting classes may be determined (e.g., volumes), or the classes may be visualized as spatial objects. In contrast to other segmentation methods, such as thresholding and region-based algorithms, our method proceeds automatically and allows more exact delineation of the anatomical structures.
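The S-function referred to here is Zadeh's standard quadratic-ramp membership function. A minimal sketch of it together with the fuzzy Shannon-style entropy of a membership vector (parameter names follow the usual convention with crossover b = (a + c) / 2; the paper's exact parameter-selection rule is not given in the abstract):

```python
import numpy as np

def s_function(x, a, b, c):
    # Zadeh's S-function: 0 below a, quadratic rise to 0.5 at b,
    # quadratic approach to 1 at c, and 1 above c.
    x = np.asarray(x, dtype=float)
    y = np.zeros_like(x)
    rise = (x >= a) & (x <= b)
    fall = (x > b) & (x <= c)
    y[rise] = 2.0 * ((x[rise] - a) / (c - a)) ** 2
    y[fall] = 1.0 - 2.0 * ((x[fall] - c) / (c - a)) ** 2
    y[x > c] = 1.0
    return y

def fuzzy_shannon_entropy(mu):
    # Shannon-style fuzzy entropy of memberships: maximal where the
    # assignment is most ambiguous (mu = 0.5), zero where mu is 0 or 1.
    mu = np.clip(np.asarray(mu, dtype=float), 1e-12, 1 - 1e-12)
    return float(-(mu * np.log(mu) + (1 - mu) * np.log(1 - mu)).sum())
```

Sweeping the S-function parameters over the gray-level histogram and scoring each candidate with the fuzzy entropy is one common way such segmentation points are located automatically.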