The intrinsic value of HFO features as a biomarker of epileptic activity
High frequency oscillations (HFOs) are a promising biomarker of epileptic
brain tissue and activity. HFOs additionally serve as a prototypical example of
challenges in the analysis of discrete events in high-temporal resolution,
intracranial EEG data. Two primary challenges are 1) dimensionality reduction,
and 2) assessing feasibility of classification. Dimensionality reduction
assumes that the data lie on a manifold with dimension less than that of the
feature space. However, previous HFO analyses have assumed a linear manifold,
global across time, space (i.e. recording electrode/channel), and individual
patients. Instead, we assess both a) whether linear methods are appropriate and
b) the consistency of the manifold across time, space, and patients. We also
estimate bounds on the Bayes classification error to quantify the distinction
between two classes of HFOs (those occurring during seizures and those
occurring due to other processes). This analysis provides the foundation for
future clinical use of HFO features and buides the analysis for other discrete
events, such as individual action potentials or multi-unit activity.Comment: 5 pages, 5 figure
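The Bayes-error bounds the abstract refers to can be illustrated with the classical Bhattacharyya bound. Below is a minimal 1-D sketch on synthetic data; the Gaussian class model, feature values, and function name are illustrative assumptions, not the paper's estimator:

```python
import numpy as np

def bhattacharyya_bound(x0, x1):
    """Upper bound on the Bayes error between two equiprobable classes of
    1-D features, assuming each class is approximately Gaussian."""
    m0, m1 = x0.mean(), x1.mean()
    v0, v1 = x0.var(), x1.var()
    # Bhattacharyya distance between two Gaussians, then P_e <= 0.5 * exp(-D_B)
    db = 0.25 * (m0 - m1) ** 2 / (v0 + v1) \
         + 0.5 * np.log((v0 + v1) / (2.0 * np.sqrt(v0 * v1)))
    return 0.5 * np.exp(-db)

rng = np.random.default_rng(0)
ictal = rng.normal(0.0, 1.0, 500)   # hypothetical seizure-HFO feature values
other = rng.normal(3.0, 1.0, 500)   # hypothetical non-seizure feature values
print(bhattacharyya_bound(ictal, other))
```

A small bound (well below the chance level of 0.5) indicates the two HFO classes are, in principle, distinguishable from this feature.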
Lipschitz Optimisation for Lipschitz Interpolation
Techniques known as Nonlinear Set Membership prediction, Kinky Inference or
Lipschitz Interpolation are fast and numerically robust approaches to
nonparametric machine learning that have been proposed to be utilised in the
context of system identification and learning-based control. They utilise
presupposed Lipschitz properties in order to compute inferences over unobserved
function values. Unfortunately, most of these approaches rely on exact
knowledge about the input space metric as well as about the Lipschitz constant.
Furthermore, existing techniques to estimate the Lipschitz constants from the
data are not robust to noise or seem to be ad-hoc and typically are decoupled
from the ultimate learning and prediction task. To overcome these limitations,
we propose an approach for optimising parameters of the presupposed metrics by
minimising validation set prediction errors. To avoid poor performance due to
local minima, we propose to utilise Lipschitz properties of the optimisation
objective to ensure global optimisation success. The resulting approach is a
new flexible method for nonparametric black-box learning. We provide
experimental evidence of the competitiveness of our approach on artificial as
well as on real data.
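The inference rule underlying these methods is simple: predict with the midpoint of the tightest upper and lower envelopes consistent with the data under an assumed Lipschitz constant. A minimal 1-D sketch (the constant `L` is presupposed here; choosing it well is exactly what the abstract's optimisation addresses):

```python
import numpy as np

def kinky_inference(X, y, L):
    """Lipschitz interpolation ('kinky inference') in 1-D: predict the
    midpoint of the tightest ceiling and floor envelopes consistent with
    the samples (X, y) under an assumed Lipschitz constant L."""
    X = np.asarray(X, dtype=float)
    y = np.asarray(y, dtype=float)
    def predict(x):
        slack = L * np.abs(X - x)
        upper = np.min(y + slack)   # lowest admissible ceiling
        lower = np.max(y - slack)   # highest admissible floor
        return 0.5 * (upper + lower)
    return predict

f = kinky_inference([0.0, 1.0, 2.0], [0.0, 1.0, 0.0], L=1.0)
print(f(0.5), f(1.0))   # -> 0.5 1.0 ; the predictor interpolates the samples
```

Note that the predictor reproduces each sample exactly, which is why an over- or under-estimated `L` (or a poorly chosen input metric) degrades only the predictions between samples.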
Error Rates of the Maximum-Likelihood Detector for Arbitrary Constellations: Convex/Concave Behavior and Applications
Motivated by a recent surge of interest in convex optimization techniques,
convexity/concavity properties of error rates of the maximum likelihood
detector operating in the AWGN channel are studied and extended to
frequency-flat slow-fading channels. Generic conditions are identified under
which the symbol error rate (SER) is convex/concave for arbitrary
multi-dimensional constellations. In particular, the SER is convex in SNR for
any one- and two-dimensional constellation, and also in higher dimensions at
high SNR. Pairwise error probability and bit error rate are shown to be convex
at high SNR, for arbitrary constellations and bit mapping. Universal bounds for
the SER 1st and 2nd derivatives are obtained, which hold for arbitrary
constellations and are tight for some of them. Applications of the results are
discussed, which include optimum power allocation in spatial multiplexing
systems, optimum power/time sharing to decrease or increase (jamming problem)
error rate, an implication for fading channels ("fading is never good in low
dimensions") and optimization of a unitary-precoded OFDM system. For example,
the error rate bounds of a unitary-precoded OFDM system with QPSK modulation,
which reveal the best and worst precoding, are extended to arbitrary
constellations, which may also include coding. The reported results also apply
to the interference channel under Gaussian approximation, to the bit error rate
when it can be expressed or approximated as a non-negative linear combination
of individual symbol error rates, and to coded systems.
Comment: accepted by IEEE IT Transactions
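The claimed convexity can be checked numerically for the simplest one-dimensional constellation, BPSK, whose SER over AWGN is Q(sqrt(2*SNR)) = 0.5*erfc(sqrt(SNR)) in linear SNR. A minimal sketch testing the sign of the second difference:

```python
import math

def ser_bpsk(snr):
    """SER of BPSK over AWGN: Q(sqrt(2*SNR)) = 0.5*erfc(sqrt(SNR)), SNR linear."""
    return 0.5 * math.erfc(math.sqrt(snr))

# Convexity check: the centred second difference should be positive at every SNR.
h = 1e-3
second_diffs = [
    (ser_bpsk(g + h) - 2.0 * ser_bpsk(g) + ser_bpsk(g - h)) / h**2
    for g in (0.1, 0.5, 1.0, 2.0, 5.0, 10.0)
]
print(all(d2 > 0.0 for d2 in second_diffs))   # prints True
```

This is consistent with the abstract's statement that the SER is convex in SNR for any one-dimensional constellation; convexity is what licenses the power-allocation and jamming applications discussed.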
Measurements in two bases are sufficient for certifying high-dimensional entanglement
High-dimensional encoding of quantum information provides a promising method
of transcending current limitations in quantum communication. One of the
central challenges in the pursuit of such an approach is the certification of
high-dimensional entanglement. In particular, it is desirable to do so without
resorting to inefficient full state tomography. Here, we show how carefully
constructed measurements in two bases (one of which is not orthonormal) can be
used to faithfully and efficiently certify bipartite high-dimensional states
and their entanglement for any physical platform. To showcase the practicality
of this approach under realistic conditions, we put it to the test for photons
entangled in their orbital angular momentum. In our experimental setup, we are
able to verify 9-dimensional entanglement for a pair of photons on an
11-dimensional subspace each, at present the highest amount certified without
any assumptions on the state.
Comment: 11+14 pages, 2+7 figures
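The arithmetic behind such a certificate can be illustrated with the standard fidelity-based Schmidt-number witness: any state of Schmidt number k on a d x d subspace has fidelity at most k/d with the maximally entangled state, so a higher measured fidelity certifies dimension at least k+1. This is a generic criterion offered for illustration, not necessarily the paper's exact witness:

```python
import math

def certified_schmidt_number(fidelity, d):
    """Lower bound on the entanglement (Schmidt) dimension of a bipartite
    state on a d x d subspace, from its fidelity F with the maximally
    entangled state: Schmidt number k implies F <= k/d, so observing
    F > k/d certifies dimension at least k + 1."""
    x = fidelity * d
    k = int(x) - 1 if x == int(x) else int(x)   # largest integer k with k < F*d
    return k + 1

print(certified_schmidt_number(0.75, 11))   # -> 9 on an 11-dimensional subspace
```

For example, a fidelity just above 8/11 (about 0.727) on an 11-dimensional subspace already certifies 9-dimensional entanglement, matching the figures quoted in the abstract.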