DPPy: Sampling Determinantal Point Processes with Python
Determinantal point processes (DPPs) are specific probability distributions over clouds of points that are used as models and computational tools across physics, probability, statistics, and more recently machine learning. Sampling from DPPs is a challenge, and we therefore present DPPy, a Python toolbox that gathers known exact and approximate sampling algorithms. The project is hosted on GitHub and equipped with extensive documentation. This documentation takes the form of a short survey of DPPs and relates each mathematical property to DPPy objects.
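The classical exact sampler that toolboxes like DPPy implement, the spectral (Hough–Krishnapur–Peres–Virág) algorithm for finite DPPs, can be sketched in plain NumPy. The sketch below is illustrative and not taken from the DPPy codebase; the rank-3 projection kernel is an assumed toy example chosen so that every sample has exactly three items.

```python
import numpy as np

def sample_dpp(K, rng):
    """Exact DPP sampling via the spectral (HKPV) algorithm.

    K is a symmetric correlation kernel with eigenvalues in [0, 1].
    Returns a list of selected item indices.
    """
    eigvals, eigvecs = np.linalg.eigh(K)
    # Phase 1: keep eigenvector i independently with probability eigvals[i]
    V = eigvecs[:, rng.random(len(eigvals)) < eigvals]
    sample = []
    while V.shape[1] > 0:
        # Phase 2: pick an item with probability proportional to its squared row norm
        p = np.sum(V**2, axis=1)
        i = int(rng.choice(len(p), p=p / p.sum()))
        sample.append(i)
        # Condition on item i: remove one column, zero out row i, re-orthonormalize
        j = int(np.argmax(np.abs(V[i])))
        Vj = V[:, j].copy()
        V = np.delete(V, j, axis=1)
        V = V - np.outer(Vj / Vj[i], V[i])
        if V.shape[1] > 0:
            V, _ = np.linalg.qr(V)
    return sample

# Illustrative kernel: a rank-3 orthogonal projection on 6 items,
# so every sample contains exactly 3 distinct items.
rng = np.random.default_rng(0)
Q, _ = np.linalg.qr(rng.standard_normal((6, 3)))
K = Q @ Q.T
print(sample_dpp(K, rng))
```

DPPy itself exposes this and several other exact and approximate samplers behind a common interface; the sketch only shows the core mechanism.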
Improved Financial Forecasting via Quantum Machine Learning
Quantum algorithms have the potential to enhance machine learning across a
variety of domains and applications. In this work, we show how quantum machine
learning can be used to improve financial forecasting. First, we use classical
and quantum Determinantal Point Processes to enhance Random Forest models for
churn prediction, improving precision by almost 6%. Second, we design quantum
neural network architectures with orthogonal and compound layers for credit
risk assessment, which match classical performance with significantly fewer
parameters. Our results demonstrate that leveraging quantum ideas can
effectively enhance the performance of machine learning, both today as
quantum-inspired classical ML solutions, and even more in the future, with the
advent of better quantum hardware.
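The abstract does not spell out how the DPP enters the Random Forest pipeline, but a common classical (quantum-inspired) construction is to pick diverse feature subsets for the trees via a greedy MAP approximation of a k-DPP. The kernel below (squared feature correlations) and the greedy rule are illustrative assumptions, not the paper's method:

```python
import numpy as np

def greedy_kdpp_select(X, k):
    """Greedy MAP approximation of a k-DPP over the columns (features) of X.

    The assumed kernel L uses squared feature correlations, so det(L_S)
    is large for feature subsets that are mutually decorrelated (diverse).
    """
    Xc = X - X.mean(axis=0)
    Xn = Xc / np.linalg.norm(Xc, axis=0)
    L = (Xn.T @ Xn) ** 2                 # similarity kernel with entries in [0, 1]
    selected, remaining = [], list(range(X.shape[1]))
    for _ in range(k):
        best, best_val = None, -np.inf
        for j in remaining:
            idx = selected + [j]
            # log-determinant of the kernel restricted to the candidate subset
            val = np.linalg.slogdet(L[np.ix_(idx, idx)])[1]
            if val > best_val:
                best, best_val = j, val
        selected.append(best)
        remaining.remove(best)
    return selected

# Features 0 and 1 are near-duplicates; a diverse selection avoids the pair.
rng = np.random.default_rng(1)
X = rng.standard_normal((200, 4))
X[:, 1] = X[:, 0] + 0.01 * rng.standard_normal(200)
print(greedy_kdpp_select(X, 2))
```

Feeding such decorrelated subsets to the individual trees is one plausible way diversity-promoting sampling can raise ensemble precision, which is the effect the paper reports.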
A Polynomial Time MCMC Method for Sampling from Continuous DPPs
We study the Gibbs sampling algorithm for continuous determinantal point processes. We show that, given a warm start, the Gibbs sampler generates a random sample from a continuous k-DPP defined on a d-dimensional domain in a polynomial number of steps. As an application, we design an algorithm to generate random samples from k-DPPs defined by a spherical Gaussian kernel on the unit sphere in d dimensions, in polynomial time.
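The paper's chain operates on continuous domains, but its idea carries over to the finite setting, where it is easy to sketch: a Metropolis swap chain over size-k subsets whose stationary distribution is the k-DPP, P(S) proportional to det(L_S). The kernel and parameters below are illustrative assumptions, not the paper's continuous algorithm:

```python
import numpy as np

def swap_chain_kdpp(L, k, n_steps, rng):
    """Metropolis swap chain targeting a finite k-DPP: P(S) ~ det(L_S)
    over size-k subsets of the ground set. Each step swaps one element."""
    n = L.shape[0]
    S = list(rng.choice(n, size=k, replace=False))        # warm start
    logdet = np.linalg.slogdet(L[np.ix_(S, S)])[1]
    for _ in range(n_steps):
        i = int(rng.integers(k))                          # position to swap out
        j = int(rng.choice([x for x in range(n) if x not in S]))
        T = list(S)
        T[i] = j
        new_logdet = np.linalg.slogdet(L[np.ix_(T, T)])[1]
        # Metropolis: accept with probability min(1, det(L_T) / det(L_S));
        # the swap proposal is symmetric, so no correction factor is needed
        if np.log(rng.random()) < new_logdet - logdet:
            S, logdet = T, new_logdet
    return sorted(S)

# Illustrative PSD likelihood kernel on 6 items.
rng = np.random.default_rng(0)
A = rng.standard_normal((6, 6))
L = A @ A.T
print(swap_chain_kdpp(L, 3, 200, rng))
```

The paper's contribution is the mixing-time analysis, a polynomial bound on how many such steps the continuous analogue of this chain needs from a warm start, not the chain itself, which is standard.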