9 research outputs found

    Quantum Quantile Mechanics: Solving Stochastic Differential Equations for Generating Time-Series

    We propose a quantum algorithm for sampling from a solution of stochastic differential equations (SDEs). Using differentiable quantum circuits (DQCs) with a feature map encoding of latent variables, we represent the quantile function for an underlying probability distribution and extract samples as DQC expectation values. Using quantile mechanics we propagate the system in time, thereby allowing for time-series generation. We test the method by simulating the Ornstein-Uhlenbeck process and sampling at times different from the initial point, as required in financial analysis and dataset augmentation. Additionally, we analyse continuous quantum generative adversarial networks (qGANs), and show that they represent quantile functions with a modified (reordered) shape that impedes their efficient time-propagation. Our results shed light on the connection between quantum quantile mechanics (QQM) and qGANs for SDE-based distributions, and point to the importance of differential constraints for model training, in analogy with the recent success of physics-informed neural networks. Comment: v3, minor update
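    The quantile-function view can be illustrated classically: the Ornstein-Uhlenbeck marginal at time t is Gaussian with a known mean and variance, so its quantile function is available in closed form and samples are drawn by evaluating it at uniform latent variables — the role the DQC plays in the paper. A minimal sketch (parameter values are illustrative, not from the paper):

    ```python
    # Classical illustration of quantile-function sampling for the
    # Ornstein-Uhlenbeck (OU) process dX = -theta*(X - mu)*dt + sigma*dW.
    # The marginal at time t from x0 is Gaussian, so the quantile function
    # q(t, z) is known in closed form; the paper instead represents q with
    # a differentiable quantum circuit and propagates it in time.
    import math
    import random
    from statistics import NormalDist

    def ou_quantile(t, z, x0=1.0, theta=1.0, mu=0.0, sigma=0.5):
        """Quantile function of the OU marginal at time t, z in (0, 1)."""
        mean = mu + (x0 - mu) * math.exp(-theta * t)
        var = sigma**2 / (2 * theta) * (1 - math.exp(-2 * theta * t))
        return NormalDist(mean, math.sqrt(var)).inv_cdf(z)

    # Draw samples at t = 2.0 by pushing uniform latents through the map.
    random.seed(0)
    samples = [ou_quantile(2.0, random.random()) for _ in range(10_000)]
    print(sum(samples) / len(samples))  # close to the analytic mean at t = 2
    ```

    Sampling at any later time only requires re-evaluating the quantile map, which is what makes the quantile representation attractive for time-series generation.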

    What can we learn from quantum convolutional neural networks?

    We can learn from analyzing quantum convolutional neural networks (QCNNs) that: 1) working with quantum data can be perceived as embedding physical system parameters through a hidden feature map; 2) their high performance for quantum phase recognition can be attributed to the generation of a very suitable basis set during the ground state embedding, where quantum criticality of spin models leads to basis functions with rapidly changing features; 3) pooling layers of QCNNs are responsible for picking those basis functions that can contribute to forming a high-performing decision boundary, and the learning process corresponds to adapting the measurement such that few-qubit operators are mapped to full-register observables; 4) generalization of QCNN models strongly depends on the embedding type, and rotation-based feature maps with the Fourier basis require careful feature engineering; 5) accuracy and generalization of QCNNs with readout based on a limited number of shots favor the ground state embeddings and associated physics-informed models. We demonstrate these points in simulation, where our results shed light on classification for physical processes, relevant for applications in sensing. Finally, we show that QCNNs with properly chosen ground state embeddings can be used for fluid dynamics problems, expressing shock wave solutions with good generalization and proven trainability. Comment: 13 pages, 7 figures
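    Point 4) — that rotation-based feature maps generate a Fourier basis in the data — can be sketched with a single qubit (an assumed toy example, not the paper's QCNN): encoding x with RY(x), applying a trainable RY(phi), and measuring Z gives cos(x + phi), i.e. a degree-1 Fourier series in x.

    ```python
    # Toy single-qubit sketch of a rotation feature map producing a
    # Fourier-basis model: <Z> after RY(phi) RY(x) |0> equals cos(x + phi).
    import numpy as np

    def ry(a):
        c, s = np.cos(a / 2), np.sin(a / 2)
        return np.array([[c, -s], [s, c]])

    Z = np.diag([1.0, -1.0])

    def model(x, phi):
        """Expectation <Z> of the state RY(phi) RY(x) |0>."""
        psi = ry(phi) @ ry(x) @ np.array([1.0, 0.0])
        return psi @ Z @ psi

    for x in np.linspace(0, np.pi, 5):
        assert np.isclose(model(x, 0.3), np.cos(x + 0.3))
    ```

    Deeper encodings add higher harmonics, which is why such models behave as truncated Fourier series and need the careful feature engineering the abstract mentions.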

    Quantum Chebyshev Transform: Mapping, Embedding, Learning and Sampling Distributions

    We develop a paradigm for building quantum models in the orthonormal space of Chebyshev polynomials. We show how to encode data into quantum states with amplitudes being Chebyshev polynomials with degree growing exponentially in the system size. Similar to the quantum Fourier transform, which maps the computational basis space into the phase (Fourier) basis, we describe the quantum circuit for the mapping between computational and Chebyshev spaces. We propose an embedding circuit for generating the orthonormal Chebyshev basis of exponential capacity, represented by a continuously-parameterized shallow isometry. This enables automatic quantum model differentiation, and opens a route to solving stochastic differential equations. We apply the developed paradigm to generative modeling from physically- and financially-motivated distributions, and use the quantum Chebyshev transform for efficient sampling of these distributions in the extended computational basis. Comment: 6 pages (+ references), 3 figures
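    The classical backbone of a Chebyshev-basis model is the identity T_k(x) = cos(k arccos x) together with discrete orthogonality on Chebyshev nodes (the same structure behind the DCT). A minimal numpy sketch of that orthogonality, with the basis size N standing in for the exponential-capacity basis the paper builds on n qubits:

    ```python
    # Discrete orthogonality of Chebyshev polynomials of the first kind on
    # Chebyshev nodes: sum_j T_k(x_j) T_l(x_j) = 0 for k != l (k, l < N).
    # N is an illustrative basis size; the paper reaches degree ~ 2^n on n qubits.
    import numpy as np

    def cheb(k, x):
        """T_k(x) = cos(k * arccos(x)) on [-1, 1]."""
        return np.cos(k * np.arccos(x))

    N = 8
    nodes = np.cos(np.pi * (np.arange(N) + 0.5) / N)  # Chebyshev nodes

    G = np.array([[cheb(k, nodes) @ cheb(l, nodes) for l in range(N)]
                  for k in range(N)])
    assert np.allclose(G, np.diag(np.diag(G)))  # Gram matrix is diagonal
    ```

    Orthogonality is what makes the basis suitable for a unitary (isometric) embedding and for the Fourier-transform-like change of basis the abstract describes.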

    Quantum Kernel Methods for Solving Differential Equations

    We propose several approaches for solving differential equations (DEs) with quantum kernel methods. We compose quantum models as weighted sums of kernel functions, where variables are encoded using feature maps and model derivatives are represented using automatic differentiation of quantum circuits. While previously quantum kernel methods primarily targeted classification tasks, here we consider their applicability to regression tasks, based on available data and differential constraints. We use two strategies to approach these problems. First, we devise a mixed model regression with a trial solution represented by kernel-based functions, which is trained to minimize a loss for specific differential constraints or datasets. Second, we use support vector regression that accounts for the structure of differential equations. The developed methods are capable of solving both linear and nonlinear systems. Contrary to prevailing hybrid variational approaches for parametrized quantum circuits, we perform training of the weights of the model classically. Under certain conditions this corresponds to a convex optimization problem, which can be solved with provable convergence to the global optimum of the model. The proposed approaches also favor hardware implementations, as optimization only uses evaluated Gram matrices, but require a quadratic number of function evaluations. We highlight trade-offs when comparing our methods to those based on variational quantum circuits, such as the recently proposed differentiable quantum circuits (DQC) approach. The proposed methods offer potential quantum enhancement through the rich kernel representations using the power of quantum feature maps, and start the quest towards provably trainable quantum DE solvers. Comment: 13 pages, 6 figures
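    The "mixed model" strategy can be sketched classically: a trial solution f(x) = sum_k alpha_k K(x, c_k) whose weights are fit by a convex least-squares problem enforcing the differential constraint. The Gaussian kernel, its bandwidth, the centers, and the collocation grid below are illustrative choices, here solving f' = -f with f(0) = 1:

    ```python
    # Kernel-expansion trial solution for the ODE f'(x) = -f(x), f(0) = 1.
    # The model weights alpha are found classically by least squares over
    # collocation constraints, mirroring the convex training in the abstract.
    import numpy as np

    centers = np.linspace(0.0, 2.0, 15)
    ell = 0.3  # Gaussian kernel bandwidth (assumed)

    def K(x, c):
        return np.exp(-(x - c) ** 2 / (2 * ell ** 2))

    def dK(x, c):  # derivative of K with respect to x
        return -(x - c) / ell ** 2 * K(x, c)

    xs = np.linspace(0.0, 2.0, 40)                              # collocation points
    A_ode = dK(xs[:, None], centers) + K(xs[:, None], centers)  # rows: f' + f = 0
    A_ic = K(np.array([[0.0]]), centers)                        # row: f(0) = 1
    A = np.vstack([A_ode, 10.0 * A_ic])                         # weight the initial condition
    b = np.concatenate([np.zeros(len(xs)), [10.0]])
    alpha, *_ = np.linalg.lstsq(A, b, rcond=None)

    f = K(xs[:, None], centers) @ alpha
    print(np.max(np.abs(f - np.exp(-xs))))  # small residual vs exact exp(-x)
    ```

    In the quantum setting the kernel entries and their derivatives would come from circuit evaluations (the Gram matrices mentioned in the abstract); the classical solve for alpha is unchanged.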

    Stimulation of tree defenses by Ophiostomatoid fungi can explain the success of bark beetle attacks on conifers

    No full text

    Predicting Drug Extraction in the Human Gut Wall: Assessing Contributions from Drug Metabolizing Enzymes and Transporter Proteins using Preclinical Models

    No full text