Kernels over Sets of Finite Sets using RKHS Embeddings, with Application to Bayesian (Combinatorial) Optimization
We focus on kernel methods for set-valued inputs and their application to
Bayesian set optimization, notably combinatorial optimization. We investigate
two classes of set kernels that both rely on Reproducing Kernel Hilbert Space
embeddings: the "Double Sum" (DS) kernels recently considered in
Bayesian set optimization, and a class introduced here, called "Deep
Embedding" (DE) kernels, that consists of applying a radial kernel
on Hilbert space on top of the canonical distance induced by another kernel
such as a DS kernel. We establish in particular that while DS kernels typically
suffer from a lack of strict positive definiteness, vast subclasses of DE
kernels built upon DS kernels do possess this property, which in turn enables
combinatorial optimization without the need to introduce a jitter parameter.
Proofs of the theoretical results about the considered kernels are complemented
by practical considerations regarding hyperparameter fitting. We furthermore
demonstrate the applicability of our approach in prediction and optimization
tasks, relying both on toy examples and on two test cases from mechanical
engineering and hydrogeology, respectively. Experimental results highlight the
applicability and relative merits of the considered approaches while opening
new perspectives in prediction and sequential design with set inputs.
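As an illustration of the two kernel classes described above, the following sketch builds a DS kernel as a double sum of base-kernel evaluations and a DE kernel by applying a Gaussian radial kernel to the canonical RKHS distance induced by the DS kernel. The Gaussian base kernel, the unnormalized double sum, and all hyperparameter values are illustrative assumptions; the abstract does not fix these choices.

```python
import numpy as np

def base_kernel(x, y, lengthscale=1.0):
    # Gaussian (RBF) base kernel between two input points
    # (an assumption; any positive definite kernel could be used)
    return np.exp(-np.sum((x - y) ** 2) / (2 * lengthscale ** 2))

def ds_kernel(A, B, lengthscale=1.0):
    # "Double Sum" (DS) kernel between two finite sets A and B:
    # the sum of base-kernel evaluations over all cross pairs
    return sum(base_kernel(x, y, lengthscale) for x in A for y in B)

def de_kernel(A, B, lengthscale=1.0, theta=1.0):
    # "Deep Embedding" (DE) kernel: a radial (Gaussian) kernel applied
    # on top of the canonical Hilbert-space distance induced by the
    # DS kernel, d(A, B)^2 = k(A, A) - 2 k(A, B) + k(B, B)
    d2 = (ds_kernel(A, A, lengthscale)
          - 2 * ds_kernel(A, B, lengthscale)
          + ds_kernel(B, B, lengthscale))
    return np.exp(-d2 / (2 * theta ** 2))
```

By construction, `de_kernel(A, A)` equals 1 for any set A, since the induced distance from a set to itself vanishes.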
Benchmarking quantum control methods on a 12-qubit system
In this Letter, we present an experimental benchmark of operational control
methods in quantum information processors, extended up to 12 qubits. We
implement universal control of this large Hilbert space using two complementary
approaches and discuss their accuracy and scalability. Despite decoherence, we
were able to reach a 12-coherence state (a 12-qubit pseudo-pure cat state) and
decode it into an 11-qubit plus one-qutrit labeled observable pseudo-pure state
using liquid-state nuclear magnetic resonance quantum information processors.
Comment: 11 pages, 4 figures, to be published in PR
Predicting pharmaceutical particle size distributions using kernel mean embedding
In the pharmaceutical industry, the transition to continuous manufacturing of solid dosage forms is being adopted by more and more companies. For these continuous processes, high-quality process models are needed. In pharmaceutical wet granulation, a unit operation in the ConsiGma™-25 continuous powder-to-tablet system (GEA Pharma systems, Collette, Wommelgem, Belgium), the product under study presents itself as a collection of particles that differ in shape and size. The measurement of this collection results in a particle size distribution. However, the theoretical basis to describe the physical phenomena leading to changes in this particle size distribution is lacking. It is essential to understand how the particle size distribution changes as a function of the unit operation's process settings, as it has a profound effect on the behavior of the fluid bed dryer. Therefore, we suggest a data-driven modeling framework that links the machine settings of the wet granulation unit operation to the output distribution of granules. We do this without making any assumptions on the nature of the distributions under study. A simulation of the granule size distribution could act as a soft sensor when in-line measurements are challenging to perform. The method of this work is a two-step procedure: first, the measured distributions are transformed into a high-dimensional feature space, where the relation between the machine settings and the distributions can be learnt. Second, the inverse transformation is performed, allowing an interpretation of the results in the original measurement space. Further, a comparison is made with previous work, which employs a more mechanistic framework for describing the granules. A reliable prediction of the granule size is vital in the assurance of quality in the production line, and is needed in the assessment of upstream (feeding) and downstream (drying, milling, and tableting) issues.
Now that a validated data-driven framework for predicting pharmaceutical particle size distributions is available, it can be applied in settings such as model-based experimental design and, owing to its fast computation, real-time model predictive control.
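The first step of the two-step procedure described above can be sketched as follows: a measured size distribution is mapped to its kernel mean embedding (here represented by its evaluations on a fixed grid), and a regression model then links machine settings to embeddings. The helper names, the Gaussian kernel, the grid representation, and the ridge-regression learning step are illustrative assumptions, not the paper's exact method.

```python
import numpy as np

def mean_embedding(sizes, weights, grid, lengthscale=0.1):
    # Kernel mean embedding of a measured distribution with support
    # points `sizes` and probabilities `weights`, represented by its
    # evaluations mu(t) = sum_i w_i * k(s_i, t) on a fixed grid
    # (Gaussian kernel k is an assumption)
    K = np.exp(-(sizes[:, None] - grid[None, :]) ** 2
               / (2 * lengthscale ** 2))
    return weights @ K

def fit_embedding_regression(settings, embeddings, alpha=1e-6):
    # Learn a linear (ridge) map from machine settings to distribution
    # embeddings -- a simplified stand-in for the learning step
    X = np.column_stack([settings, np.ones(len(settings))])  # intercept
    W = np.linalg.solve(X.T @ X + alpha * np.eye(X.shape[1]),
                        X.T @ embeddings)
    return W

def predict_embedding(W, setting):
    # Predicted embedding for a new machine setting
    return np.append(setting, 1.0) @ W
```

The second step, mapping a predicted embedding back to a distribution in the original measurement space, is a pre-image problem and is omitted from this sketch.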