An information criterion for auxiliary variable selection in incomplete data analysis
Statistical inference is considered for variables of interest, called primary
variables, when auxiliary variables are observed along with the primary
variables. We consider the setting of incomplete data analysis, where some
primary variables are not observed. Using a parametric model of the joint
distribution of the primary and auxiliary variables, it is possible to improve
estimation of the model for the primary variables when the auxiliary variables
are closely related to them. However, estimation accuracy decreases when the
auxiliary variables are irrelevant to the primary variables. To select useful
auxiliary variables, we formulate the problem
as model selection, and propose an information criterion for predicting primary
variables by leveraging auxiliary variables. The proposed information criterion
is an asymptotically unbiased estimator of the Kullback-Leibler divergence for
complete data of the primary variables under some reasonable conditions. We
also establish an asymptotic equivalence between the proposed information
criterion and a variant of leave-one-out cross-validation. The performance of
our method is demonstrated via a simulation study and a real-data example.
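The idea of scoring whether an auxiliary variable helps can be illustrated with a standard information criterion. The sketch below is not the paper's proposed criterion; it uses an ordinary Gaussian AIC comparison on simulated data (all variable names and the simulation setup are illustrative assumptions) to show why a relevant auxiliary variable lowers the criterion value while an irrelevant one would not.

```python
import numpy as np

rng = np.random.default_rng(0)

# Simulate a primary variable y and an auxiliary variable x related to y.
n = 500
x = rng.normal(size=n)
y = 2.0 * x + rng.normal(scale=0.5, size=n)

# Some primary values are missing (missing completely at random here).
observed = rng.random(n) > 0.3
y_obs, x_obs = y[observed], x[observed]

def gaussian_aic(resid, k):
    """AIC = -2*loglik + 2k for a Gaussian model with k parameters."""
    n_obs = len(resid)
    sigma2 = np.mean(resid ** 2)
    loglik = -0.5 * n_obs * (np.log(2 * np.pi * sigma2) + 1.0)
    return -2.0 * loglik + 2.0 * k

# Model A: ignore the auxiliary variable (mean-only model, 2 parameters).
aic_without = gaussian_aic(y_obs - y_obs.mean(), k=2)

# Model B: regress y on the auxiliary x (slope, intercept, sigma).
slope, intercept = np.polyfit(x_obs, y_obs, 1)
aic_with = gaussian_aic(y_obs - (slope * x_obs + intercept), k=3)

# A useful auxiliary variable yields the lower criterion value.
print(aic_with < aic_without)  # True: x is strongly related to y
```

The paper's criterion differs in that it targets the Kullback-Leibler divergence for the complete data of the primary variables, but the selection logic, comparing criterion values across candidate sets of auxiliary variables, is the same.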
New Guarantees for Blind Compressed Sensing
Blind Compressed Sensing (BCS) is an extension of Compressed Sensing (CS)
where the optimal sparsifying dictionary is assumed to be unknown and subject
to estimation (in addition to the CS sparse coefficients). Since the emergence
of BCS, dictionary learning, also known as sparse coding, has been studied as
a matrix factorization problem whose sample complexity, uniqueness, and
identifiability have been addressed thoroughly. However, despite the strong
connections between BCS and sparse coding, recent results from the sparse
coding literature have not been exploited within the context of BCS. In
particular, prior BCS efforts have focused on learning constrained and complete
dictionaries that limit the scope and utility of these efforts. In this paper,
we develop new theoretical bounds for perfect recovery for the general
unconstrained BCS problem. These unconstrained BCS bounds cover the case of
overcomplete dictionaries, and hence, they go well beyond the existing BCS
theory. Our perfect recovery results integrate the combinatorial theories of
sparse coding with some of the recent results from low-rank matrix recovery. In
particular, we propose an efficient CS measurement scheme that results in
practical recovery bounds for BCS. Moreover, we discuss the performance of BCS
under polynomial-time sparse coding algorithms.
Comment: To appear in the 53rd Annual Allerton Conference on Communication,
Control and Computing, University of Illinois at Urbana-Champaign, IL, USA,
201
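The classical CS setting that BCS generalizes can be sketched concretely. The example below is not the paper's measurement scheme or recovery bound; it assumes the sparsifying dictionary is known (the situation BCS relaxes) and recovers a sparse vector from random Gaussian measurements with orthogonal matching pursuit, a standard greedy sparse-coding solver. All dimensions and names are illustrative.

```python
import numpy as np

rng = np.random.default_rng(1)

# Classical CS: y = A @ x with x sparse and A known. BCS additionally
# treats the sparsifying dictionary as unknown and subject to estimation.
m, n, s = 25, 40, 3                         # measurements, dimension, sparsity
A = rng.normal(size=(m, n)) / np.sqrt(m)    # Gaussian measurement matrix
x_true = np.zeros(n)
support = rng.choice(n, size=s, replace=False)
x_true[support] = rng.normal(size=s)
y = A @ x_true

def omp(A, y, s):
    """Orthogonal matching pursuit: greedily pick s atoms, refit each step."""
    residual, idx = y.copy(), []
    for _ in range(s):
        # Select the atom most correlated with the current residual.
        idx.append(int(np.argmax(np.abs(A.T @ residual))))
        # Refit coefficients on the selected atoms by least squares.
        coef, *_ = np.linalg.lstsq(A[:, idx], y, rcond=None)
        residual = y - A[:, idx] @ coef
    x = np.zeros(A.shape[1])
    x[idx] = coef
    return x

x_hat = omp(A, y, s)
print(np.max(np.abs(x_hat - x_true)))
```

With m well above the sparsity level, recovery here is exact up to numerical precision. The BCS problem replaces the known A (or its sparsifying dictionary factor) with an unknown, possibly overcomplete dictionary, which is what makes the recovery guarantees in the paper substantially harder to obtain.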