Best Arm Identification with Fairness Constraints on Subpopulations
We formulate, analyze, and solve the problem of best arm identification with
fairness constraints on subpopulations (BAICS). Standard best arm
identification problems aim to select the arm with the largest expected
reward, where the expectation is taken over the entire population. The BAICS
problem requires that a selected arm be fair to all subpopulations (e.g.,
different ethnic groups, age groups, or customer types) by satisfying
constraints that the expected reward conditional on each subpopulation must
exceed a given threshold. The BAICS problem aims to correctly identify, with
high confidence, the arm with the largest expected reward among all arms that
satisfy the subpopulation constraints. We analyze the complexity of the BAICS
problem by proving a best achievable lower bound on the sample complexity with
a closed-form representation. We then design an algorithm and prove that its
sample complexity matches the lower bound in order. Numerical experiments are
conducted to illustrate the theoretical findings.
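As an illustration of the selection criterion (not the paper's algorithm, which couples a sampling rule with a stopping rule), a naive plug-in rule for a BAICS-style instance can be sketched as follows; the arm means, subpopulation weights, and thresholds are hypothetical:

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical instance: 4 arms, 2 subpopulations.
# Rows = arms, columns = subpopulation-conditional expected rewards.
true_means = np.array([[0.90, 0.20],   # best overall mean, but unfair to group 2
                       [0.60, 0.50],
                       [0.50, 0.55],
                       [0.40, 0.40]])
weights = np.array([0.5, 0.5])         # subpopulation proportions
thresholds = np.array([0.35, 0.35])    # fairness threshold per subpopulation

# Plug-in estimates from noisy samples (uniform allocation, no stopping rule).
n = 20000
est = np.array([[rng.normal(m, 1.0, n).mean() for m in row]
                for row in true_means])

feasible = np.all(est >= thresholds, axis=1)   # fairness constraint check per arm
overall = est @ weights                        # population-level expected reward
best_arm = int(np.argmax(np.where(feasible, overall, -np.inf)))
print(best_arm)  # arm 0 has the largest overall mean but is excluded as unfair
```

Note that the unconstrained winner (arm 0) is infeasible, so the rule returns the best arm among those satisfying all subpopulation constraints.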
Entropic characterization of optimal rates for learning Gaussian mixtures
We consider the question of estimating multi-dimensional Gaussian mixtures
(GM) with compactly supported or subgaussian mixing distributions. The minimax
estimation rate for this class (under the Hellinger, TV, and KL divergences) is
a long-standing open question, even in one dimension. In this paper we
characterize this rate (for all constant dimensions) in terms of the metric
entropy of the class. Such characterizations originate from seminal works of Le
Cam (1973); Birge (1983); Haussler and Opper (1997); Yang and Barron (1999).
However, for GMs a key ingredient missing from earlier work (and widely
sought-after) is a comparison result showing that the KL and the squared
Hellinger distance are within a constant multiple of each other uniformly over
the class. Our main technical contribution is in showing this fact, from which
we derive the entropy characterization of the estimation rate under Hellinger and KL.
Interestingly, the sequential (online learning) estimation rate is
characterized by the global entropy, while the single-step (batch) rate
corresponds to local entropy, paralleling a similar result for the Gaussian
sequence model recently discovered by Neykov (2022) and Mourtada (2023).
Additionally, since Hellinger is a proper metric, our comparison shows that GMs
under KL satisfy the triangle inequality within multiplicative constants,
implying that the proper and improper estimation rates coincide.
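The paper's hard direction is bounding KL by a constant multiple of squared Hellinger uniformly over the GM class. The easy direction holds universally: with the convention H²(P,Q) = 1 − ∫√(pq), one has KL ≥ −2 log(1 − H²) ≥ 2H². A quick numerical sanity check for two unit-variance Gaussians (not mixtures), using the closed forms for that pair:

```python
import numpy as np

# For N(mu1, 1) vs N(mu2, 1) with mean gap d = mu1 - mu2:
#   KL  = d**2 / 2
#   H^2 = 1 - exp(-d**2 / 8)    (convention H^2 = 1 - Bhattacharyya coefficient)
# Verify the universal one-sided bounds KL >= -2*log(1 - H^2) >= 2*H^2.
for d in np.linspace(0.1, 5.0, 50):
    kl = d**2 / 2
    h2 = 1 - np.exp(-d**2 / 8)
    assert kl >= -2 * np.log(1 - h2) - 1e-12
    assert 2 * h2 <= kl + 1e-12
print("checked")
```

Here −2 log(1 − H²) simplifies to d²/4, so the chain of inequalities is exact: d²/2 ≥ d²/4 ≥ 2(1 − e^{−d²/8}).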
Enhance Temporal Relations in Audio Captioning with Sound Event Detection
Automated audio captioning aims at generating natural language descriptions
for given audio clips, not only detecting and classifying sounds, but also
summarizing the relationships between audio events. Recent research advances in
audio captioning have introduced additional guidance to improve the accuracy of
audio events in generated sentences. However, temporal relations between audio
events have received little attention, even though revealing such relations is
a key component of summarizing audio content. Therefore, this paper aims to
better capture temporal relationships in caption generation with sound event
detection (SED), a task that locates events' timestamps. We investigate the
best approach to integrate temporal information in a captioning model and
propose a temporal tag system to transform the timestamps into comprehensible
relations. Results evaluated by the proposed temporal metrics show that
substantial improvement is achieved in temporal relation generation.
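A temporal tag system of this kind could, for instance, map pairs of SED timestamps to discrete relation tags; the tag names and tolerance below are illustrative assumptions, not the paper's exact scheme:

```python
def temporal_tag(a, b, tol=0.5):
    """Map two (onset, offset) timestamps in seconds to a relation tag.

    Hypothetical tag set {before, after, simultaneous, overlapping};
    `tol` is an assumed tolerance for treating onsets/offsets as aligned.
    """
    a_on, a_off = a
    b_on, b_off = b
    if a_off <= b_on:
        return "before"
    if b_off <= a_on:
        return "after"
    if abs(a_on - b_on) <= tol and abs(a_off - b_off) <= tol:
        return "simultaneous"
    return "overlapping"

# E.g. a dog bark at 0-2 s followed by a car horn at 3-5 s:
print(temporal_tag((0.0, 2.0), (3.0, 5.0)))  # before
print(temporal_tag((0.0, 4.0), (1.0, 3.0)))  # overlapping
```

Tags like these can then be injected into the captioning model's input so the decoder can verbalize relations ("followed by", "at the same time as") instead of raw timestamps.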
Efficient quantum compression for identically prepared states with arbitrary dimension
In this paper, we present an efficient quantum compression method for
identically prepared states with arbitrary dimension.
Entanglement as the cross-symmetric part of quantum discord
In this paper, we show that the minimal quantum discord over
"cross-symmetric" state extensions is an entanglement monotone. In particular,
we show that the minimal Bures distance of discord over cross-symmetric
extensions is equivalent to the Bures distance of entanglement. Finally, we
refute a long-held but unstated convention that only contractive distances can
be used to construct entanglement monotones by showing that the entanglement
quantifier induced by the Hilbert-Schmidt distance, which is not contractive
under quantum operations, is also an entanglement monotone.
Comment: 9 pages, 1 figure. arXiv admin note: text overlap with
arXiv:2012.0383
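As a minimal numerical illustration of the distance underlying that quantifier (not the monotone itself, which requires a minimization over a set of states), the Hilbert-Schmidt distance between a Bell state and the maximally mixed two-qubit state can be computed as:

```python
import numpy as np

def hs_distance(rho, sigma):
    """Hilbert-Schmidt distance ||rho - sigma||_2 = sqrt(Tr[(rho - sigma)^2])."""
    d = rho - sigma
    return float(np.sqrt(np.real(np.trace(d.conj().T @ d))))

# Bell state |Phi+> = (|00> + |11>)/sqrt(2) and the maximally mixed state I/4.
phi = np.zeros(4)
phi[0] = phi[3] = 1 / np.sqrt(2)
rho_bell = np.outer(phi, phi.conj())
rho_mixed = np.eye(4) / 4

print(hs_distance(rho_bell, rho_mixed))  # sqrt(3)/2 ≈ 0.8660
```

For a pure state ρ and I/4 the trace evaluates to 1 − 2·(1/4) + 1/4 = 3/4, hence the √3/2.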