4 research outputs found

    Conformal Prediction for Time Series with Modern Hopfield Networks

    Full text link
    To quantify uncertainty, conformal prediction methods are attracting increasing interest and have already been successfully applied in various domains. However, they are difficult to apply to time series, as the autocorrelation structure of time series violates basic assumptions required by conformal prediction. We propose HopCPT, a novel conformal prediction approach for time series that not only copes with temporal structures but leverages them. We show that our approach is theoretically well justified for time series in which temporal dependencies are present. In experiments, we demonstrate that our new approach outperforms state-of-the-art conformal prediction methods on multiple real-world time series datasets from four different domains.
    Comment: presented at NeurIPS 2023
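    As background, the assumption the abstract refers to underlies standard split conformal prediction, sketched below in Python. This is a generic illustration of the interval construction that approaches like HopCPT build on, not the paper's method; all names and parameters are illustrative.

        import numpy as np

        def split_conformal_interval(cal_residuals, y_pred, alpha=0.1):
            """Plain split conformal interval from held-out calibration residuals.

            Assumes the residuals are exchangeable with the test residual --
            exactly the assumption that autocorrelated time series violate.
            """
            n = len(cal_residuals)
            # Finite-sample corrected quantile level.
            q_level = min(np.ceil((n + 1) * (1 - alpha)) / n, 1.0)
            q = np.quantile(np.abs(cal_residuals), q_level)
            return y_pred - q, y_pred + q

        # Illustrative usage with synthetic calibration residuals.
        rng = np.random.default_rng(0)
        residuals = rng.normal(scale=0.5, size=200)
        lo, hi = split_conformal_interval(residuals, y_pred=3.2, alpha=0.1)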

    Geometric Methods for Memory and Learning (Méthodes géométriques pour la mémoire et l'apprentissage)

    Get PDF
    This thesis is devoted to geometric methods in optimization, learning, and neural networks. In many problems of (supervised and unsupervised) learning, pattern recognition, and clustering, there is a need to take into account the internal (intrinsic) structure of the underlying space, which is not necessarily Euclidean. For Riemannian manifolds we construct computational algorithms for the Newton method, conjugate-gradient methods, and some non-smooth optimization methods such as the r-algorithm. For this purpose we develop methods for geodesic computation in submanifolds based on Hamilton's equations and symplectic integration. We then construct a new type of neural associative memory capable of unsupervised learning and clustering. Its learning is based on generalized averaging over Grassmann manifolds. A further extension of this memory involves implicit space transformations and kernel machines. We also consider geometric algorithms for signal processing and adaptive filtering. The proposed methods are tested on academic examples as well as on real-life problems of image recognition and signal processing. The application of the proposed neural networks is demonstrated in a complete real-life project of chemical image recognition (an electronic nose).
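    The geodesic computation described above lends itself to a small numerical sketch. Below is a simplified projection-based integrator for geodesic flow on the unit sphere, written in Python; it is a generic illustration in the spirit of Hamiltonian/symplectic schemes (the thesis treats general submanifolds), and all names are illustrative.

        import numpy as np

        def sphere_geodesic_step(x, p, dt):
            """One projected leapfrog-style step of geodesic flow on the unit sphere.

            Free-particle Hamiltonian H = 0.5 * p @ p constrained to |x| = 1;
            the sphere's constraint force is -(p @ p) * x, and projections keep
            x on the sphere and p in its tangent space.
            """
            p_half = p - 0.5 * dt * (p @ p) * x          # half kick
            x_new = x + dt * p_half                      # drift
            x_new /= np.linalg.norm(x_new)               # project onto the sphere
            p_new = p_half - 0.5 * dt * (p_half @ p_half) * x_new
            p_new -= (p_new @ x_new) * x_new             # project to tangent space
            return x_new, p_new

        # Trace a short geodesic (a great-circle arc) from x with velocity p.
        x = np.array([1.0, 0.0, 0.0])
        p = np.array([0.0, 1.0, 0.0])
        for _ in range(100):
            x, p = sphere_geodesic_step(x, p, dt=0.01)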

    Storage capacity of kernel associative memories

    No full text
    This contribution discusses the thermodynamic phases and storage capacity of an extension of the Hopfield-Little model of associative memory via kernel functions. The analysis is presented for the cases of polynomial and Gaussian kernels within a replica-symmetric ansatz. As a general result, we find for both kernels that the storage capacity increases considerably compared to that of the Hopfield-Little model.
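    For orientation, the kernel extension discussed here replaces the linear pattern overlaps of the Hopfield-Little model with kernel evaluations. The Python sketch below shows a retrieval update of this general type with a polynomial kernel; it is a minimal illustration of the model class, not the paper's exact construction, and all names are illustrative.

        import numpy as np

        def kernel_hopfield_update(s, patterns, degree=3):
            """One synchronous update of a kernelized Hopfield-style memory.

            The local field weights each stored pattern by a polynomial kernel
            of its overlap with the current state, (patterns @ s / N) ** degree,
            instead of the plain linear overlap -- the kind of change that
            raises storage capacity in models of this family.
            """
            N = s.shape[0]
            overlaps = (patterns @ s) / N         # one overlap per stored pattern
            field = (overlaps ** degree) @ patterns
            return np.sign(field + (field == 0))  # break ties toward +1

        # Retrieve a stored pattern from a corrupted probe.
        rng = np.random.default_rng(1)
        patterns = rng.choice([-1, 1], size=(20, 100)).astype(float)
        probe = patterns[0].copy()
        probe[:15] *= -1                          # flip 15 of 100 spins
        for _ in range(5):
            probe = kernel_hopfield_update(probe, patterns)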