
    Entanglement quantification from incomplete measurements: Applications using photon-number-resolving weak homodyne detectors

    The certificate of success for a number of important quantum information processing protocols, such as entanglement distillation, is based on the difference in the entanglement content of the quantum states before and after the protocol. In such cases, effective bounds must be placed on the entanglement of non-local states consistent with the statistics obtained from local measurements. In this work, we numerically study the ability of a novel type of homodyne detector, which combines phase sensitivity with photon-number resolution, to set accurate bounds on the entanglement content of two-mode quadrature-squeezed states without the need for full state tomography. We show that it is possible to set tight lower bounds on the entanglement of a family of two-mode degaussified states using only a few measurements. This is a significant improvement over the resource requirements of experimental demonstrations of continuous-variable entanglement distillation, which traditionally rely on full quantum state tomography.
    Comment: 18 pages, 6 figures
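    For context, the Gaussian benchmark such bounds are compared against is straightforward to compute: the logarithmic negativity of a two-mode squeezed vacuum follows directly from its covariance matrix. A minimal NumPy sketch (illustrative only, not the estimator studied in the paper; conventions assumed: quadrature ordering (x1, p1, x2, p2), vacuum covariance equal to the identity):

        import numpy as np

        def log_negativity(cov):
            # Smallest symplectic eigenvalue of the partially transposed
            # two-mode Gaussian state, via the standard PPT criterion.
            A, B, C = cov[:2, :2], cov[2:, 2:], cov[:2, 2:]
            delta = np.linalg.det(A) + np.linalg.det(B) - 2 * np.linalg.det(C)
            nu = np.sqrt((delta - np.sqrt(delta**2 - 4 * np.linalg.det(cov))) / 2)
            return max(0.0, -np.log2(nu))

        r = 0.5  # squeezing parameter
        c, s = np.cosh(2 * r), np.sinh(2 * r)
        Z = np.diag([1.0, -1.0])
        cov = np.block([[c * np.eye(2), s * Z], [s * Z, c * np.eye(2)]])
        print(log_negativity(cov))  # analytic value: 2r / ln 2 ~ 1.443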

    Optical implementations of radial basis classifiers

    We describe two optical systems based on the radial basis function approach to pattern classification. An optical-disk-based system for handwritten character recognition is demonstrated. The optical system computes the Euclidean distance between an unknown input and 650 stored patterns at a demonstrated rate of 26,000 pattern comparisons/s. The ultimate performance of this system is limited by optical-disk resolution to 10^11 binary operations/s. An adaptive system is also presented that facilitates on-line learning and provides additional robustness.
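    As a point of reference, the computation the optical system performs in parallel, Euclidean distances from an unknown input to every stored pattern, fed through radial basis units, takes only a few lines electronically. A minimal NumPy sketch with an assumed Gaussian kernel and illustrative names:

        import numpy as np

        def rbf_classify(x, prototypes, labels, sigma=1.0):
            # Squared Euclidean distance from the input to each stored
            # pattern -- the quantity the optical disk system computes
            # at 26,000 comparisons/s.
            d2 = np.sum((prototypes - x) ** 2, axis=1)
            act = np.exp(-d2 / (2.0 * sigma**2))  # Gaussian radial basis units
            classes = np.unique(labels)
            scores = [act[labels == c].sum() for c in classes]
            return classes[int(np.argmax(scores))]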

    MIMO-aided near-capacity turbo transceivers: taxonomy and performance versus complexity

    In this treatise, we first review Multiple-Input Multiple-Output (MIMO) system theory and the family of hard-decision and soft-decision detection algorithms in the context of Spatial Division Multiplexing (SDM) systems. Our discussion culminates in the introduction of a range of powerful novel MIMO detectors, such as the Markov Chain assisted Minimum Bit-Error Rate (MC-MBER) detector, which are capable of operating reliably in the challenging rank-deficient scenarios where there are more transmitters than receivers and the resultant channel matrix is hence non-invertible; conventional detectors exhibit a high residual error floor in this regime. We then invoke Soft-Input Soft-Output (SISO) MIMO detectors to create turbo-detected two- or three-stage concatenated SDM schemes and investigate their attainable performance in the light of their computational complexity. Finally, we introduce the powerful design tool of EXtrinsic Information Transfer (EXIT) charts and characterize the achievable performance of the diverse near-capacity SISO detectors with their aid.
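    To make the rank-deficiency problem concrete: a conventional linear detector such as the Minimum Mean-Square Error (MMSE) receiver (shown below purely as a baseline, not one of the treatise's novel detectors) stays computable when the channel matrix H has more columns than rows, but its residual interference is exactly what produces the error floor mentioned above.

        import numpy as np

        def mmse_detect(y, H, noise_var):
            # Linear MMSE estimate x_hat = (H^H H + sigma^2 I)^(-1) H^H y.
            # The sigma^2 I term keeps the matrix invertible even in the
            # rank-deficient case (more transmit than receive antennas),
            # but it cannot remove the resulting residual interference.
            Nt = H.shape[1]
            G = H.conj().T @ H + noise_var * np.eye(Nt)
            return np.linalg.solve(G, H.conj().T @ y)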

    A Thesis on a 3D Input Device for Sketching Characters

    The goal of this project is to develop a 3D input device from a stiff piece of paper and a camera. The camera tracks the paper in 3D space; the user orients the paper and then draws on it with a pen-like device. The camera also tracks the movement of the pen on the paper, so the pen's location in 3D space can be calculated from the paper's orientation. A drawing application that uses this 3D input device was also developed. The application lets a user build characters by sketching ellipses: it creates a virtual rendering of the paper and displays it to the user, and as the user positions the real paper, the virtual one mirrors its movements. Shapes the user draws on the paper are then rendered in the virtual scene.
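    The geometric core of such a device is small: once the camera has recovered the paper's pose, a pen position measured in the paper's own 2D frame lifts to 3D by a single rigid transform. A minimal sketch, assuming (hypothetically) that the tracker supplies a rotation matrix R and translation t for the paper plane:

        import numpy as np

        def pen_tip_world(u, v, R, t):
            # (u, v): pen position in the paper's own 2D frame (e.g. metres).
            # R (3x3) and t (3,) describe the paper's pose from the tracker.
            # The pen tip is assumed to lie on the paper plane, i.e. z = 0.
            return R @ np.array([u, v, 0.0]) + t

        # Example: paper tilted 90 degrees about x, held 0.3 m from the camera.
        R = np.array([[1, 0, 0], [0, 0, -1], [0, 1, 0]], float)
        print(pen_tip_world(0.05, 0.02, R, np.array([0.0, 0.0, 0.3])))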

    A perspective on physical reservoir computing with nanomagnetic devices

    Neural networks have revolutionized the field of artificial intelligence and introduced transformative applications to almost every scientific field and industry. However, this success comes at a great price: the energy requirements for training advanced models are unsustainable. One promising way to address this pressing issue is to develop low-energy neuromorphic hardware that directly supports the algorithm's requirements. The intrinsic non-volatility, non-linearity, and memory of spintronic devices make them appealing candidates for neuromorphic hardware. Here we focus on the reservoir computing paradigm, a recurrent network with a simple training algorithm that is well suited to spintronic devices, since they natively provide the required non-linearity and memory. We review technologies and methods for developing neuromorphic spintronic devices and conclude with the critical open issues that must be addressed before such devices become widely used.
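    The training economy that makes reservoir computing attractive for physical substrates is easy to see in software: the recurrent reservoir (in hardware, the device physics itself) is left fixed, and learning reduces to a single linear least-squares fit of the readout. A minimal echo state network sketch, with illustrative sizes and a toy memory task:

        import numpy as np

        rng = np.random.default_rng(0)
        N, T = 100, 500                            # reservoir size, sequence length
        W_in = rng.uniform(-0.5, 0.5, N)           # fixed input weights
        W = rng.normal(0.0, 1.0, (N, N))
        W *= 0.9 / np.max(np.abs(np.linalg.eigvals(W)))  # enforce fading memory

        u = rng.uniform(-1.0, 1.0, T)              # input signal
        target = np.roll(u, 2)                     # toy task: recall input two steps back

        x = np.zeros(N)
        states = np.empty((T, N))
        for t in range(T):
            x = np.tanh(W @ x + W_in * u[t])       # fixed non-linear reservoir dynamics
            states[t] = x

        # Training touches only the linear readout: one least-squares solve.
        w_out, *_ = np.linalg.lstsq(states, target, rcond=None)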

    Proceedings of the second "international Traveling Workshop on Interactions between Sparse models and Technology" (iTWIST'14)

    The implicit objective of the biennial "international Traveling Workshop on Interactions between Sparse models and Technology" (iTWIST) is to foster collaboration between international scientific teams by disseminating ideas through both specific oral/poster presentations and free discussions. For its second edition, the iTWIST workshop took place in the medieval and picturesque town of Namur, Belgium, from Wednesday August 27th to Friday August 29th, 2014. The workshop was conveniently located in "The Arsenal" building, within walking distance of both hotels and the town center. iTWIST'14 gathered about 70 international participants and featured 9 invited talks, 10 oral presentations, and 14 posters on the following themes, all related to the theory, application and generalization of the "sparsity paradigm": Sparsity-driven data sensing and processing; Union of low-dimensional subspaces; Beyond linear and convex inverse problems; Matrix/manifold/graph sensing/processing; Blind inverse problems and dictionary learning; Sparsity and computational neuroscience; Information theory, geometry and randomness; Complexity/accuracy tradeoffs in numerical methods; Sparsity? What's next?; Sparse machine learning and inference.
    Comment: 69 pages, 24 extended abstracts; iTWIST'14 website: http://sites.google.com/site/itwist1

    CAD Tools for DNA Micro-Array Design, Manufacture and Application

    Motivation: As the human genome project progresses and more microbial and eukaryotic genomes are sequenced, biotechnological processes have attracted increasing numbers of biologists, bioengineers and computer scientists. These processes centre on the production and analysis of high-throughput experimental data: numerous sequence libraries of DNA and protein structures for a large number of micro-organisms, along with a variety of other databases related to biology and chemistry, are now available. Microarray technology, for example, promises to monitor a whole genome at once, so that researchers can study gene expression at the global level, across thousands of genes simultaneously. Today it is widely used in many fields: disease diagnosis, gene classification, gene regulatory network inference, and drug discovery. Designing an organism-specific microarray and analysing the experimental data require combining heterogeneous computational tools that usually differ in data format, such as GeneMark for ORF extraction, Promide for DNA probe selection, Chip for probe placement on the microarray, BLAST for sequence comparison, MEGA for phylogenetic analysis, and ClustalX for multiple alignment. Solution: Surprisingly, despite the huge research effort invested in DNA array applications, very few works are devoted to computer-aided optimization of DNA array design and manufacturing. Current design practice is dominated by ad-hoc heuristics incorporated into proprietary tools with unknown suboptimality. This will soon become a bottleneck for the new generation of high-density arrays, such as those currently being designed at Perlegen [109]. The goal of the research accomplished so far was to develop highly scalable tools, with predictable runtime and quality, for cost-effective, computer-aided design and manufacturing of DNA probe arrays. We illustrate the utility of our approach with the concrete example of combining the microarray design tools for Herpes B virus DNA data.
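    One concrete objective in this design space (a standard formulation in the DNA-array CAD literature, offered here as an illustration rather than as these tools' actual cost function) is border-length minimization in probe placement: adjacent probes that differ in many positions make the photolithographic masks more expensive, so a placement tool tries to put similar probes next to each other. A minimal sketch of the cost being minimized:

        def border_cost(grid):
            # grid: 2D list of equal-length probe strings placed on the chip.
            # Total Hamming distance between horizontally and vertically
            # adjacent probes -- the "border length" a placement minimizes.
            def ham(a, b):
                return sum(x != y for x, y in zip(a, b))
            rows, cols = len(grid), len(grid[0])
            cost = 0
            for i in range(rows):
                for j in range(cols):
                    if i + 1 < rows:
                        cost += ham(grid[i][j], grid[i + 1][j])
                    if j + 1 < cols:
                        cost += ham(grid[i][j], grid[i][j + 1])
            return cost

        print(border_cost([["ACGT", "ACGA"], ["ACGT", "TCGA"]]))  # -> 4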