Generic continuous spectrum for multi-dimensional quasi-periodic Schrödinger operators with rough potentials
We study the multi-dimensional operator , where is the shift of the torus T^d. When , we show the spectrum of is almost surely purely continuous for a.e. frequency and generic continuous potentials. When , the same result holds for frequencies satisfying an explicit arithmetic criterion. We also show that general multi-dimensional operators with measurable potentials do not have eigenvalues for generic
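The inline formulas in this abstract did not survive extraction. For orientation only, a standard form of a multi-dimensional quasi-periodic Schrödinger operator driven by torus shifts (an assumed general setup, not necessarily this paper's exact definition) is:

```latex
\[
(H_{V,\omega,x}\, u)_n \;=\; \sum_{\lVert m-n\rVert_1 = 1} u_m \;+\; V(x + n\cdot\omega)\, u_n,
\qquad n \in \mathbb{Z}^d,\quad u \in \ell^2(\mathbb{Z}^d),
\]
```

where $x \in \mathbb{T}^d$, $V$ is the potential, and $n\cdot\omega = n_1\omega^{(1)} + \cdots + n_d\omega^{(d)}$ for frequency vectors $\omega^{(i)} \in \mathbb{T}^d$, so that the potential is sampled along iterates of the shift $x \mapsto x + \omega^{(i)}$ of the torus.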
Amplifying Inter-message Distance: On Information Divergence Measures in Big Data
Message identification (M-I) divergence is an important measure of the
information distance between probability distributions, similar to the
Kullback-Leibler (K-L) and Rényi divergences. In fact, M-I divergence with a
variable parameter can effectively characterize the distinction between two
distributions. Furthermore, by choosing an appropriate parameter for the M-I
divergence, it is possible to amplify the information distance between
adjacent distributions while maintaining enough of a gap between two
nonadjacent ones. Therefore, M-I divergence can play a vital role in
distinguishing distributions more clearly. In this paper, we first define a
parametric M-I divergence from the viewpoint of information theory and then
present its major properties. In addition, we design an M-I divergence
estimation algorithm by means of an ensemble of the proposed weighted kernel
estimators, which improves the convergence rate of the mean squared error from
to . We also discuss decision making with M-I divergence for clustering or
classification, and investigate its performance in a statistical sequence
model of big data for the outlier detection problem. Comment: 30 pages, 4 figures
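The M-I divergence and its kernel-based estimator are specific to this paper, so they are not sketched here. The general idea of amplifying the distance between two distributions with a variable divergence parameter can, however, be illustrated with the closely related Rényi-α divergence, estimated by a simple plug-in Gaussian kernel density estimator (a minimal sketch under assumed choices of bandwidth and parameter, not the paper's ensemble estimator):

```python
import numpy as np

def gaussian_kde(samples, points, bandwidth):
    """Evaluate a Gaussian kernel density estimate at `points`."""
    diffs = (points[:, None] - samples[None, :]) / bandwidth
    return np.exp(-0.5 * diffs**2).sum(axis=1) / (
        len(samples) * bandwidth * np.sqrt(2 * np.pi))

def renyi_divergence(x, y, alpha, bandwidth=0.3):
    """Plug-in estimate of the Renyi-alpha divergence D_alpha(P || Q)
    from samples x ~ P and y ~ Q, using KDE density ratios at the
    P-samples: D_alpha = log E_P[(p/q)^(alpha-1)] / (alpha - 1)."""
    p = gaussian_kde(x, x, bandwidth)
    q = gaussian_kde(y, x, bandwidth)
    return np.log(np.mean((p / q) ** (alpha - 1))) / (alpha - 1)

rng = np.random.default_rng(0)
x = rng.normal(0.0, 1.0, 2000)   # P = N(0, 1)
y = rng.normal(1.0, 1.0, 2000)   # Q = N(1, 1), an "adjacent" distribution

d_small = renyi_divergence(x, y, alpha=1.5)
d_large = renyi_divergence(x, y, alpha=4.0)
print(d_small, d_large)
```

For equal-variance Gaussians the true Rényi divergence grows linearly in α, so raising the parameter amplifies the estimated distance between the two adjacent distributions, which is the effect the abstract attributes to the parametric M-I divergence.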
Nonparametric Independence Screening in Sparse Ultra-High Dimensional Additive Models
A variable screening procedure via correlation learning was proposed by Fan
and Lv (2008) to reduce dimensionality in sparse ultra-high dimensional
models. Even when the true model is linear, the marginal regression can be
highly nonlinear. To address this issue, we further extend the correlation
learning to marginal nonparametric learning. Our nonparametric independence
screening, called NIS, is a specific member of the sure independence
screening family. Several closely related variable screening procedures are
proposed. For nonparametric additive models, it is shown that, under some
mild technical conditions, the proposed independence screening methods enjoy
a sure screening property. The extent to which the dimensionality can be
reduced by independence screening is also explicitly quantified. As a
methodological extension, an iterative nonparametric independence screening
(INIS) procedure is also proposed to enhance the finite-sample performance
for fitting sparse additive models. Simulation results and a real data
analysis demonstrate that the proposed procedure works well with moderate
sample size and large dimension, and performs better than competing
methods. Comment: 48 pages
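The screening idea above can be sketched in a few lines: fit a marginal nonparametric regression of the response on each feature separately, rank features by goodness of fit, and keep the top few. The sketch below substitutes a cubic polynomial fit for the B-spline marginal fits used in NIS, so it illustrates the screening principle rather than the paper's exact estimator:

```python
import numpy as np

def nis_screen(X, y, degree=3, top_k=10):
    """Rank features by the marginal R^2 of a polynomial regression of
    y on each feature alone (a stand-in for NIS's spline fits), and
    return the indices of the top_k features plus all scores."""
    n, p = X.shape
    scores = np.empty(p)
    for j in range(p):
        coeffs = np.polyfit(X[:, j], y, degree)
        resid = y - np.polyval(coeffs, X[:, j])
        scores[j] = 1.0 - resid.var() / y.var()   # marginal R^2
    return np.argsort(scores)[::-1][:top_k], scores

rng = np.random.default_rng(1)
n, p = 200, 1000                          # ultra-high dimension: p >> n
X = rng.normal(size=(n, p))
# Sparse additive truth: only features 0, 1, 2 matter, two nonlinearly.
y = (np.sin(X[:, 0]) + 0.5 * X[:, 1]**2 + X[:, 2]
     + 0.1 * rng.normal(size=n))

selected, scores = nis_screen(X, y, top_k=10)
print(sorted(selected.tolist()))          # should contain 0, 1, and 2
```

Because each fit is univariate and nonparametric (here, low-degree polynomial), nonlinear signals such as the quadratic term are still ranked highly, which a purely linear correlation screen could miss; this is the motivation for extending correlation learning to marginal nonparametric learning.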