
    Discrimination on the Grassmann Manifold: Fundamental Limits of Subspace Classifiers

    We present fundamental limits on the reliable classification of linear and affine subspaces from noisy, linear features. Drawing an analogy between discrimination among subspaces and communication over vector wireless channels, we propose two Shannon-inspired measures to characterize asymptotic classifier performance. First, we define the classification capacity, which characterizes necessary and sufficient conditions for the misclassification probability to vanish as the signal dimension, the number of features, and the number of subspaces to be discerned all approach infinity. Second, we define the diversity-discrimination tradeoff, which, by analogy with the diversity-multiplexing tradeoff of fading vector channels, characterizes relationships between the number of discernible subspaces and the misclassification probability as the noise power approaches zero. We derive upper and lower bounds on these measures which are tight in many regimes. Numerical results, including a face recognition application, validate the results in practice. Comment: 19 pages, 4 figures. Revised submission to IEEE Transactions on Information Theory.
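
    A minimal sketch of the setting described above, assuming a nearest-subspace decision rule: signals drawn from K random k-dimensional subspaces of R^n are observed through a Gaussian linear feature map plus noise, and each observation is assigned to the subspace whose image under the feature map best explains it. The dimensions, the feature matrix, and the residual-based classifier are illustrative assumptions, not the paper's exact construction.

```python
# Nearest-subspace classification from noisy linear features (toy sketch).
import numpy as np

rng = np.random.default_rng(0)
n, m, k, K = 64, 24, 3, 10          # ambient dim, #features, subspace dim, #classes

# K random k-dimensional subspaces of R^n, each given by an orthonormal basis.
bases = [np.linalg.qr(rng.standard_normal((n, k)))[0] for _ in range(K)]
A = rng.standard_normal((m, n)) / np.sqrt(m)    # linear feature map

def classify(y, A, bases):
    """Pick the subspace whose image under A best explains the feature vector y."""
    residuals = []
    for U in bases:
        Q, _ = np.linalg.qr(A @ U)              # orthonormal basis of the image A @ U
        residuals.append(np.linalg.norm(y - Q @ (Q.T @ y)))
    return int(np.argmin(residuals))

# Simulate: draw a signal from a random class, observe noisy features, classify.
errors, trials = 0, 500
for _ in range(trials):
    c = rng.integers(K)
    x = bases[c] @ rng.standard_normal(k)       # signal lying in subspace c
    y = A @ x + 0.05 * rng.standard_normal(m)   # noisy linear features
    errors += classify(y, A, bases) != c
print(f"empirical misclassification rate: {errors / trials:.3f}")
```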

    Closed-loop optimization of fast-charging protocols for batteries with machine learning.

    Simultaneously optimizing many design parameters in time-consuming experiments causes bottlenecks in a broad range of scientific and engineering disciplines [1,2]. One such example is process and control optimization for lithium-ion batteries during materials selection, cell manufacturing and operation. A typical objective is to maximize battery lifetime; however, conducting even a single experiment to evaluate lifetime can take months to years [3-5]. Furthermore, both large parameter spaces and high sampling variability [3,6,7] necessitate a large number of experiments. Hence, the key challenge is to reduce both the number and the duration of the experiments required. Here we develop and demonstrate a machine learning methodology to efficiently optimize a parameter space specifying the current and voltage profiles of six-step, ten-minute fast-charging protocols for maximizing battery cycle life, which can alleviate range anxiety for electric-vehicle users [8,9]. We combine two key elements to reduce the optimization cost: an early-prediction model [5], which reduces the time per experiment by predicting the final cycle life using data from the first few cycles, and a Bayesian optimization algorithm [10,11], which reduces the number of experiments by balancing exploration and exploitation to efficiently probe the parameter space of charging protocols. Using this methodology, we rapidly identify high-cycle-life charging protocols among 224 candidates in 16 days (compared with over 500 days using exhaustive search without early prediction), and subsequently validate the accuracy and efficiency of our optimization approach. Our closed-loop methodology automatically incorporates feedback from past experiments to inform future decisions and can be generalized to other applications in battery design and, more broadly, other scientific domains that involve time-intensive experiments and multi-dimensional design spaces.
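
    A minimal closed-loop Bayesian optimization sketch in the spirit of the abstract: a stand-in "early predictor" supplies a cheap, noisy cycle-life estimate, and a Gaussian-process surrogate with an upper-confidence-bound rule picks the next charging protocol. The protocol parameterization, the synthetic lifetime function, and the noise model are illustrative assumptions, not the paper's setup.

```python
# Closed-loop Bayesian optimization with an early-prediction surrogate (toy sketch).
import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF

rng = np.random.default_rng(1)

# Candidate protocols: 224 combinations of three charging-current parameters (hypothetical).
candidates = rng.uniform(3.0, 8.0, size=(224, 3))

def true_cycle_life(p):
    # Hypothetical ground truth: milder, balanced currents last longer.
    return 1200 - 40 * np.sum((p - 4.5) ** 2, axis=-1)

def early_prediction(p):
    # Stand-in for the early-prediction model: truth plus measurement noise.
    return true_cycle_life(p) + rng.normal(0, 30)

observed_idx, observed_y = [], []
for round_ in range(15):                        # 15 closed-loop rounds
    if len(observed_idx) < 3:                   # seed with a few random protocols
        next_idx = int(rng.integers(len(candidates)))
    else:
        gp = GaussianProcessRegressor(kernel=RBF(length_scale=2.0),
                                      normalize_y=True).fit(
            candidates[observed_idx], observed_y)
        mean, std = gp.predict(candidates, return_std=True)
        ucb = mean + 2.0 * std                  # exploration/exploitation tradeoff
        ucb[observed_idx] = -np.inf             # do not repeat experiments
        next_idx = int(np.argmax(ucb))
    observed_idx.append(next_idx)
    observed_y.append(early_prediction(candidates[next_idx]))

best = observed_idx[int(np.argmax(observed_y))]
print("best protocol found:", candidates[best], "predicted life:", max(observed_y))
```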

    Identification and Lossy Reconstruction in Noisy Databases

    A high-dimensional database system is studied where the noisy versions of the underlying feature vectors are observed in both the enrollment and query phases. The noisy observations are compressed before being stored in the database, and the user wishes to both identify the correct entry corresponding to the noisy query vector and reconstruct the original feature vector within a desired distortion level. A fundamental capacity-storage-distortion tradeoff is identified for this system in the form of single-letter information theoretic expressions. The relation of this problem to the classical Wyner-Ziv rate-distortion problem is shown, where the noisy query vector acts as the correlated side information available only in the lossy reconstruction of the feature vector. © 1963-2012 IEEE
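
    A toy sketch of the enrollment/identification/reconstruction pipeline described above: noisy enrollment vectors are coarsely quantized before storage, a noisy query is matched to the closest stored entry, and the query itself is reused as side information (in the spirit of Wyner-Ziv) when reconstructing the feature vector. The scalar quantizer, noise levels, and averaging-based reconstruction are illustrative assumptions, not the paper's coding scheme.

```python
# Identification and lossy reconstruction from a compressed noisy database (toy sketch).
import numpy as np

rng = np.random.default_rng(2)
n_entries, dim, step = 100, 16, 0.5             # database size, feature dim, quantizer step

features = rng.standard_normal((n_entries, dim))                      # underlying feature vectors
enroll_obs = features + 0.2 * rng.standard_normal((n_entries, dim))   # noisy enrollment observations

# Compression before storage: uniform scalar quantization of the noisy observations.
database = step * np.round(enroll_obs / step)

# Query phase: a noisy observation of one (unknown) entry.
true_idx = int(rng.integers(n_entries))
query = features[true_idx] + 0.2 * rng.standard_normal(dim)

# Identification: nearest stored (compressed) entry to the query.
dists = np.linalg.norm(database - query, axis=1)
identified = int(np.argmin(dists))

# Lossy reconstruction: combine the stored codeword with the query, which acts
# as correlated side information available only at reconstruction time.
reconstruction = 0.5 * database[identified] + 0.5 * query
distortion = np.mean((reconstruction - features[true_idx]) ** 2)
print("correct identification:", identified == true_idx)
print(f"reconstruction distortion (MSE): {distortion:.3f}")
```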

    On palimpsests in neural memory: an information theory viewpoint

    The finite capacity of neural memory and the reconsolidation phenomenon suggest it is important to be able to update stored information as in a palimpsest, where new information overwrites old information. Moreover, changing information in memory is metabolically costly. In this paper, we suggest that information-theoretic approaches may inform the fundamental limits in constructing such a memory system. In particular, we define malleable coding, which considers not only representation length but also ease of representation update, thereby encouraging some form of recycling to convert an old codeword into a new one. Malleability cost is the difficulty of synchronizing compressed versions, and malleable codes are of particular interest when representing information and modifying the representation are both expensive. We examine the tradeoff between compression efficiency and malleability cost, under a malleability metric defined with respect to a string edit distance. This introduces a metric topology to the compressed domain. We characterize the exact set of achievable rates and malleability as the solution of a subgraph isomorphism problem. This is all done within the optimization approach to biology framework. Accepted manuscript.
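
    A small illustration of the compression-versus-malleability tension discussed above: a standard compressor (zlib) shortens the representation, but a single-symbol update to the source forces a large edit, in string edit distance, to the compressed representation, whereas the uncompressed string needs only a one-symbol edit. Using zlib and Levenshtein distance here is an illustrative choice, not the paper's coding scheme or malleability metric.

```python
# Malleability cost of a conventional compressor under a one-symbol source edit (toy sketch).
import zlib

def edit_distance(a: bytes, b: bytes) -> int:
    """Plain dynamic-programming Levenshtein distance."""
    prev = list(range(len(b) + 1))
    for i, ca in enumerate(a, 1):
        curr = [i]
        for j, cb in enumerate(b, 1):
            curr.append(min(prev[j] + 1,                 # deletion
                            curr[j - 1] + 1,             # insertion
                            prev[j - 1] + (ca != cb)))   # substitution
        prev = curr
    return prev[-1]

old = b"the quick brown fox jumps over the lazy dog " * 20
new = bytearray(old)
new[10] ^= 0x01                                   # flip one byte of the source
new = bytes(new)

old_c, new_c = zlib.compress(old), zlib.compress(new)
print("source edit distance:     ", edit_distance(old, new))        # 1
print("compressed lengths:       ", len(old_c), len(new_c))
print("compressed edit distance: ", edit_distance(old_c, new_c))    # usually much larger than 1
```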

    Unconditional security from noisy quantum storage

    We consider the implementation of two-party cryptographic primitives based on the sole assumption that no large-scale reliable quantum storage is available to the cheating party. We construct novel protocols for oblivious transfer and bit commitment, and prove that realistic noise levels provide security even against the most general attack. Such unconditional results were previously only known in the so-called bounded-storage model, which is a special case of our setting. Our protocols can be implemented with present-day hardware used for quantum key distribution. In particular, no quantum storage is required for the honest parties. Comment: 25 pages (IEEE two-column), 13 figures. v4: published version (to appear in IEEE Transactions on Information Theory), including bitwise min-entropy sampling. However, for experimental purposes block sampling can be much more convenient; please see the v3 arXiv version if needed. See arXiv:0911.2302 for a companion paper addressing aspects of a practical implementation using block sampling.
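
    A toy numeric companion to the min-entropy sampling idea mentioned in the comment above, under a strong simplifying assumption (independent biased bits rather than general adversarial side information): the min-entropy of a randomly sampled subset of positions scales with the subset size, so the per-bit min-entropy rate is preserved on average.

```python
# Per-bit min-entropy of a full bit string versus a random sample of its positions (toy sketch).
import numpy as np

rng = np.random.default_rng(3)
n = 1000
biases = rng.uniform(0.5, 0.9, size=n)             # bit i equals 1 with probability biases[i]

def min_entropy(bias_subset):
    """Min-entropy of independent bits: sum_i -log2 max(p_i, 1 - p_i)."""
    return float(np.sum(-np.log2(np.maximum(bias_subset, 1 - bias_subset))))

full_rate = min_entropy(biases) / n
sample = rng.choice(n, size=200, replace=False)    # sample 200 of the 1000 positions
sample_rate = min_entropy(biases[sample]) / len(sample)
print(f"per-bit min-entropy rate, full string:  {full_rate:.3f}")
print(f"per-bit min-entropy rate, sampled bits: {sample_rate:.3f}")
```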