Brane structures in microlocal sheaf theory
Let L be an exact Lagrangian submanifold of a cotangent bundle T^*M,
asymptotic to a Legendrian submanifold \Lambda \subset T^\infty M. We study
a locally constant sheaf of \infty-categories on L, called the sheaf of
brane structures or Brane_L. Its fiber is the \infty-category of
spectra, and we construct a Hamiltonian invariant, fully faithful functor from
the global sections of Brane_L to the \infty-category of sheaves of spectra on M
with singular support in \Lambda.
Comment: 35 pages, 13 figures
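The main result can be displayed compactly. Assuming standard notation (L \subset T^*M an exact Lagrangian, \Lambda its Legendrian boundary at infinity, Sp the \infty-category of spectra; the symbols here are reconstructed, not taken verbatim from the paper), the functor takes the form:

```latex
% Hamiltonian-invariant, fully faithful functor (notation assumed)
\Gamma(L, \mathrm{Brane}_L)\;\hookrightarrow\;\mathrm{Sh}_{\Lambda}(M;\,\mathrm{Sp})
```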
Asymptotic minimaxity of False Discovery Rate thresholding for sparse exponential data
We apply FDR thresholding to a non-Gaussian vector whose coordinates X_i,
i=1,..., n, are independent exponential with individual means \mu_i. The
vector \mu = (\mu_i) is thought to be sparse, with most coordinates 1 but a
small fraction significantly larger than 1; roughly, most coordinates are
simply `noise,' but a small fraction contain `signal.' We measure risk by
per-coordinate mean-squared error in recovering \log(\mu_i), and study
minimax estimation over parameter spaces defined by constraints on the
per-coordinate p-norm of \log(\mu_i):
(1/n) \sum_{i=1}^n \log^p(\mu_i) \leq \eta^p. We show for large n and
small \eta that FDR thresholding can be nearly minimax. The FDR control
parameter 0<q<1 plays an important role: when q \leq 1/2, the FDR estimator is
nearly minimax, while choosing a fixed q>1/2 prevents near minimaxity. These
conclusions mirror those found in the Gaussian case in Abramovich et al. [Ann.
Statist. 34 (2006) 584--653]. The techniques developed here seem applicable to
a wide range of other distributional assumptions, other loss measures and
non-i.i.d. dependency structures.
Comment: Published at http://dx.doi.org/10.1214/009053606000000920 in the Annals of Statistics (http://www.imstat.org/aos/) by the Institute of Mathematical Statistics (http://www.imstat.org)
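The thresholding rule described above can be sketched in a few lines. A minimal sketch, assuming the null mean is 1 (so p-values are exp(-x) under Exp(1)) and using the standard Benjamini-Hochberg step-up rule; the returned estimate of log(mu_i) is log(x_i) above the data-dependent threshold and 0 (the null value) below it. This is illustrative, not the exact estimator analyzed in the paper:

```python
import numpy as np

def fdr_threshold_exponential(x, q=0.25):
    """FDR (Benjamini-Hochberg) hard thresholding for exponential data.

    Null: X_i ~ Exp(mean 1), so the one-sided p-value of x is exp(-x).
    Returns an estimate of log(mu_i): log(x_i) where x_i survives the
    FDR threshold, 0 elsewhere. A sketch, not the paper's exact estimator.
    """
    x = np.asarray(x, dtype=float)
    n = len(x)
    pvals = np.exp(-x)                      # p-values under the Exp(1) null
    order = np.argsort(pvals)               # most significant first
    sorted_p = pvals[order]
    below = sorted_p <= q * np.arange(1, n + 1) / n
    est = np.zeros(n)
    if below.any():
        k = np.max(np.nonzero(below)[0])    # BH: largest index passing
        survivors = order[:k + 1]           # every coordinate at least as significant
        est[survivors] = np.log(x[survivors])
    return est
```

With sparse data (most means 1, a few much larger), the estimator keeps roughly the signal coordinates and zeroes out the noise.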
Higher criticism for detecting sparse heterogeneous mixtures
Higher criticism, or second-level significance testing, is a
multiple-comparisons concept mentioned in passing by Tukey. It concerns a
situation where there are many independent tests of significance and one is
interested in rejecting the joint null hypothesis. Tukey suggested comparing
the fraction of observed significances at a given \alpha-level to the expected
fraction under the joint null. In fact, he suggested standardizing the
difference of the two quantities and forming a z-score; the resulting z-score
tests the significance of the body of significance tests. We consider a
generalization, where we maximize this z-score over a range of significance
levels 0<\alpha\leq\alpha_0.
We are able to show that the resulting higher criticism statistic is
effective at resolving a very subtle testing problem: testing whether n normal
means are all zero versus the alternative that a small fraction is nonzero. The
subtlety of this ``sparse normal means'' testing problem can be seen from work
of Ingster and Jin, who studied such problems in great detail. In their
studies, they identified an interesting range of cases where the small fraction
of nonzero means is so small that the alternative hypothesis exhibits little
noticeable effect on the distribution of the p-values either for the bulk of
the tests or for the few most highly significant tests.
In this range, when the amplitude of nonzero means is calibrated with the
fraction of nonzero means, the likelihood ratio test for a precisely specified
alternative would still succeed in separating the two hypotheses.
Comment: Published by the Institute of Mathematical Statistics (http://www.imstat.org) in the Annals of Statistics (http://www.imstat.org/aos/) at http://dx.doi.org/10.1214/00905360400000026
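The maximized z-score described above is straightforward to compute from a set of p-values. A minimal sketch of one common variant of the statistic (maximizing over the smallest alpha0-fraction of order statistics; the endpoint clipping is an assumed numerical guard, and several variants exist in the literature):

```python
import numpy as np

def higher_criticism(pvals, alpha0=0.5):
    """Higher criticism statistic (a common variant).

    Standardizes the gap between the empirical fraction of significances
    and its expectation under the joint null, then maximizes the z-score
    over significance levels 0 < alpha <= alpha0.
    """
    p = np.sort(np.asarray(pvals, dtype=float))
    n = len(p)
    i = np.arange(1, n + 1)
    pc = np.clip(p, 1e-12, 1.0 - 1e-12)   # guard the division at the endpoints
    z = np.sqrt(n) * (i / n - pc) / np.sqrt(pc * (1.0 - pc))
    m = max(1, int(alpha0 * n))           # restrict to the smallest alpha0-fraction
    return float(z[:m].max())
```

Under the joint null the statistic stays moderate; a small fraction of very significant p-values drives it up sharply, which is what makes it effective for sparse alternatives.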
Two-Layered Superposition of Broadcast/Multicast and Unicast Signals in Multiuser OFDMA Systems
We study optimal delivery strategies for one common message and multiple
independent messages from a source to multiple users in wireless environments. In
particular, two-layered superposition of broadcast/multicast and unicast
signals is considered in a downlink multiuser OFDMA system. In the literature
and industry, the two-layer superposition is often considered as a pragmatic
approach to make a compromise between the simple but suboptimal orthogonal
multiplexing (OM) and the optimal but complex fully-layered non-orthogonal
multiplexing. In this work, we show that only two layers are necessary to
achieve the maximum sum-rate when the common message has higher priority than
the individual unicast messages, and OM cannot be sum-rate optimal in
general. We develop an algorithm that finds the optimal power allocation over
the two layers and across the OFDMA radio resources in static channels and a
class of fading channels. Two main use cases are considered: i) multicast and
unicast multiplexing when users with uplink capabilities request both
common and independent messages, and ii) broadcast and unicast multiplexing
when the common message targets receive-only devices and users with uplink
capabilities additionally request independent messages. Finally, we develop a
transceiver design for broadcast/multicast and unicast superposition
transmission based on LTE-A-Pro physical layer and show with numerical
evaluations in mobile environments with multipath propagation that the capacity
improvements can be translated into significant practical performance gains
compared to the orthogonal schemes in the 3GPP specifications. We also analyze
the impact of real channel estimation and show that significant gains in terms
of spectral efficiency or coverage area are still available even with
estimation errors and imperfect interference cancellation for the two-layered
superposition system.
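The power split that the allocation algorithm optimizes can be illustrated on a single subcarrier. A toy sketch (hypothetical function names, not the paper's algorithm), assuming the common layer is decoded first with the unicast layer treated as noise, then cancelled via SIC before the scheduled user decodes its unicast message:

```python
import math

def min_common_fraction(snr, rate0):
    """Smallest fraction of subcarrier power on the common (multicast) layer
    so a user at the given SNR decodes it at rate0 bits/s/Hz, treating the
    unicast layer as noise. Returns a value > 1 when infeasible.
    Derived from log2(1 + a*snr / (1 + (1-a)*snr)) = rate0."""
    c = 2.0 ** rate0 - 1.0
    return c * (1.0 + snr) / (snr * 2.0 ** rate0)

def two_layer_rates(snrs, rate0):
    """Two-layer superposition on one subcarrier: size the common layer for
    the worst user, then give the remaining power to the best user's unicast
    message after successive interference cancellation."""
    worst, best = min(snrs), max(snrs)
    a = min_common_fraction(worst, rate0)
    if a > 1.0:
        return None                              # common-rate target infeasible
    unicast = math.log2(1.0 + (1.0 - a) * best)  # post-SIC unicast rate
    return a, unicast
```

The closed form for the minimum common-layer share is what lets a per-subcarrier optimizer sweep only the residual unicast allocation.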
Priority-Based Synchronization of Distributed Data
We consider the general problem of synchronizing the data on two devices using a minimum amount of communication, a core infrastructural requirement for a large variety of distributed systems. Our approach considers the interactive synchronization of prioritized data, where, for example, certain information is more time-sensitive than other information. We propose and analyze a new scheme for efficient priority-based synchronization, which promises benefits over conventional synchronization approaches.
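The abstract does not spell out the scheme, but the priority-ordering idea can be sketched. A toy illustration (hypothetical names; it ships whole tiers on any digest mismatch, whereas the paper's point is to bound communication during reconciliation):

```python
import hashlib

def digest(items):
    """Order-independent digest of a set of byte strings."""
    return hashlib.sha256(b"".join(sorted(items))).hexdigest()

def sync_by_priority(local, remote):
    """Toy priority-ordered reconciliation.

    local/remote map priority tier (lower = more urgent) to a set of items.
    Compare cheap per-tier digests first, then pull missing items into
    `local`, most urgent tier first. Returns (tier, items_moved) pairs.
    """
    transferred = []
    for tier in sorted(set(local) | set(remote)):
        mine = local.setdefault(tier, set())
        theirs = remote.get(tier, set())
        if digest(mine) != digest(theirs):   # cheap check before any transfer
            missing = theirs - mine
            if missing:
                mine |= missing
                transferred.append((tier, len(missing)))
    return transferred
```

Exchanging digests before data is what keeps already-synchronized tiers essentially free, so urgent tiers converge with minimal communication.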