Semidefinite programming and eigenvalue bounds for the graph partition problem
The graph partition problem is the problem of partitioning the vertex set of
a graph into a fixed number of sets of given sizes such that the sum of weights
of edges joining different sets is optimized. In this paper we simplify a known
matrix-lifting semidefinite programming relaxation of the graph partition
problem for several classes of graphs and also show how to aggregate additional
triangle and independent set constraints for graphs with symmetry. We present
an eigenvalue bound for the graph partition problem of a strongly regular
graph, extending a similar result for the equipartition problem. We also derive
a linear programming bound of the graph partition problem for certain Johnson
and Kneser graphs. Using what we call the Laplacian algebra of a graph, we
derive an eigenvalue bound for the graph partition problem that is the first
known closed form bound that is applicable to any graph, thereby extending a
well-known result in spectral graph theory. Finally, we strengthen a known
semidefinite programming relaxation of a specific quadratic assignment problem
and the above-mentioned matrix-lifting semidefinite programming relaxation by
adding two constraints that correspond to assigning two vertices of the graph
to different parts of the partition. This strengthening performs well on highly
symmetric graphs when other relaxations provide weak or trivial bounds.
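To make the objective concrete: the graph partition problem minimizes (or maximizes) the total weight of edges joining different parts, over all partitions with prescribed part sizes. A minimal brute-force sketch for unit edge weights and two parts (the function name and setup are illustrative assumptions, not the paper's method):

```python
from itertools import combinations

def min_partition_cut(n, edges, sizes):
    """Brute-force the graph partition problem: split vertices 0..n-1 into
    two parts of the given sizes and minimize the number of edges joining
    different parts (unit weights, for simplicity)."""
    assert len(sizes) == 2 and sum(sizes) == n
    best = None
    for part in combinations(range(n), sizes[0]):
        part = set(part)
        cut = sum(1 for u, v in edges if (u in part) != (v in part))
        best = cut if best is None else min(best, cut)
    return best

# 4-cycle split into two parts of size 2: keeping the paths 0-1 and 2-3
# together cuts exactly 2 edges, which is optimal.
c4 = [(0, 1), (1, 2), (2, 3), (3, 0)]
print(min_partition_cut(4, c4, (2, 2)))  # -> 2
```

Exhaustive enumeration is only feasible for tiny graphs, which is exactly why the relaxations and eigenvalue bounds in the abstract matter.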
New bounds for the max-cut and chromatic number of a graph
We consider several semidefinite programming relaxations for the max-cut
problem, with increasing complexity. The optimal solution of the weakest
presented semidefinite programming relaxation has a closed form expression that
includes the largest Laplacian eigenvalue of the graph under consideration.
This is the first known eigenvalue bound for the max-cut that is
applicable to any graph. This bound is exploited to derive a new eigenvalue
bound on the chromatic number of a graph. For regular graphs, the new bound on
the chromatic number is the same as the well-known Hoffman bound; however, the
two bounds are incomparable in general. We prove that the eigenvalue bound for
the max-cut is tight for several classes of graphs. We investigate the
presented bounds for specific classes of graphs, such as walk-regular graphs,
strongly regular graphs, and graphs from the Hamming association scheme.
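A closed-form expression in the largest Laplacian eigenvalue is in the spirit of the classical bound $\mathrm{mc}(G) \le \frac{n}{4}\lambda_{\max}(L)$. A quick numerical sanity check on the 5-cycle (an illustrative sketch of that classical bound, not the paper's relaxation):

```python
import numpy as np
from itertools import product

def laplacian(n, edges):
    """Combinatorial Laplacian L = D - A of an unweighted graph."""
    L = np.zeros((n, n))
    for u, v in edges:
        L[u, u] += 1; L[v, v] += 1
        L[u, v] -= 1; L[v, u] -= 1
    return L

def max_cut(n, edges):
    """Brute force over all 2^n vertex bipartitions."""
    return max(sum(s[u] != s[v] for u, v in edges)
               for s in product((0, 1), repeat=n))

c5 = [(i, (i + 1) % 5) for i in range(5)]
lam_max = np.linalg.eigvalsh(laplacian(5, c5)).max()
bound = 5 / 4 * lam_max  # (n/4) * lambda_max(L)
print(max_cut(5, c5), round(bound, 3))  # -> 4 4.523
```

For the odd cycle $C_5$ the true max-cut is 4, so the eigenvalue bound of about 4.52 is close but not tight, consistent with the abstract's interest in classes of graphs where such bounds are tight.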
Learning curves for Gaussian process regression: Approximations and bounds
We consider the problem of calculating learning curves (i.e., average
generalization performance) of Gaussian processes used for regression. On the
basis of a simple expression for the generalization error, in terms of the
eigenvalue decomposition of the covariance function, we derive a number of
approximation schemes. We identify where these become exact, and compare with
existing bounds on learning curves; the new approximations, which can be used
for any input space dimension, generally get substantially closer to the truth.
We also study possible improvements to our approximations. Finally, we use a
simple exactly solvable learning scenario to show that there are limits of
principle on the quality of approximations and bounds expressible solely in
terms of the eigenvalue spectrum of the covariance function.
Comment: 25 pages, 10 figures
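Approximation schemes of this type express the average generalization error through the eigenvalues of the covariance function. A minimal sketch of one common eigenvalue-based approximation of this flavor, $\epsilon(n) \approx \sum_i \sigma^2 \lambda_i / (\sigma^2 + n\lambda_i)$ (the exact form, the power-law spectrum, and the noise level are assumptions here, not necessarily the paper's schemes):

```python
import numpy as np

# Assumed power-law kernel spectrum and noise variance (illustrative only).
lam = 1.0 / np.arange(1, 201) ** 2
sigma2 = 0.1

def eps(n):
    """Approximate learning curve: each eigenmode contributes an error
    term that is suppressed once n * lambda_i exceeds the noise level."""
    return float(np.sum(sigma2 * lam / (sigma2 + n * lam)))

curve = [eps(n) for n in (1, 10, 100, 1000)]
print(curve)  # monotonically decreasing toward 0
```

The qualitative behavior, fast decay of error on well-expressed eigenmodes and slow decay on the tail of the spectrum, is what makes the eigenvalue decomposition a natural language for learning curves.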
Correlation bounds for fields and matroids
Let $G$ be a finite connected graph, and let $T$ be a spanning tree of $G$
chosen uniformly at random. The work of Kirchhoff on electrical networks can be
used to show that the events $e_1 \in T$ and $e_2 \in T$ are negatively
correlated for any distinct edges $e_1$ and $e_2$. What can be said for such
events when the underlying matroid is not necessarily graphic? We use Hodge
theory for matroids to bound the correlation between the events $e \in B$,
where $B$ is a randomly chosen basis of a matroid. As an application, we prove
Mason's conjecture that the number of $k$-element independent sets of a matroid
forms an ultra-log-concave sequence in $k$.
Comment: 16 pages. Supersedes arXiv:1804.0307
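The negative correlation of spanning-tree edge events can be verified directly on a tiny graph. A brute-force check on $K_4$ (an illustrative sketch, not the Hodge-theoretic argument):

```python
from itertools import combinations

def is_spanning_tree(n, tree_edges):
    """n-1 edges form a spanning tree iff they connect all n vertices
    (acyclicity then follows from the edge count). Union-find check."""
    parent = list(range(n))
    def find(x):
        while parent[x] != x:
            parent[x] = parent[parent[x]]
            x = parent[x]
        return x
    comps = n
    for u, v in tree_edges:
        ru, rv = find(u), find(v)
        if ru != rv:
            parent[ru] = rv
            comps -= 1
    return comps == 1

edges = [(u, v) for u in range(4) for v in range(u + 1, 4)]  # K4
trees = [t for t in combinations(edges, 3) if is_spanning_tree(4, t)]
N = len(trees)  # Cayley's formula: 4^(4-2) = 16 spanning trees

# Negative correlation: P(e1 in T and e2 in T) <= P(e1 in T) P(e2 in T),
# i.e. c12 * N <= c1 * c2 for the uniform distribution on trees.
for e1, e2 in combinations(edges, 2):
    c1 = sum(e1 in t for t in trees)
    c2 = sum(e2 in t for t in trees)
    c12 = sum(e1 in t and e2 in t for t in trees)
    assert c12 * N <= c1 * c2
print(N)  # -> 16
```

On $K_4$ the inequality is strict for adjacent edge pairs and holds with equality for the three pairs of disjoint edges.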
Estimation with Norm Regularization
Analysis of non-asymptotic estimation error and structured statistical
recovery based on norm regularized regression, such as Lasso, needs to consider
four aspects: the norm, the loss function, the design matrix, and the noise
model. This paper presents generalizations of such estimation error analysis on
all four aspects compared to the existing literature. We characterize the
restricted error set where the estimation error vector lies, establish
relations between error sets for the constrained and regularized problems, and
present an estimation error bound applicable to any norm. Precise
characterizations of the bound are presented for isotropic as well as
anisotropic subGaussian design matrices, subGaussian noise models, and convex
loss functions, including least squares and generalized linear models. Generic
chaining and associated results play an important role in the analysis. A key
result from the analysis is that the sample complexity of all such estimators
depends on the Gaussian width of a spherical cap corresponding to the
restricted error set. Further, once the number of samples $n$ crosses the
required sample complexity, the estimation error decreases as $c/\sqrt{n}$,
where $c$ depends on the Gaussian width of the unit norm ball.
Comment: Fixed technical issues. Generalized some results.
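The $c/\sqrt{n}$ decay of the estimation error is easy to observe empirically even in the simplest unregularized case. A small numerical experiment with ordinary least squares (illustrative only; the abstract's results cover general norms, losses, and designs):

```python
import numpy as np

rng = np.random.default_rng(0)
p, sigma = 10, 1.0

def lstsq_error(n):
    """Average L2 estimation error of ordinary least squares with n
    i.i.d. Gaussian samples; expected to scale like c / sqrt(n)."""
    theta = np.ones(p)
    errs = []
    for _ in range(20):
        X = rng.standard_normal((n, p))
        y = X @ theta + sigma * rng.standard_normal(n)
        est, *_ = np.linalg.lstsq(X, y, rcond=None)
        errs.append(np.linalg.norm(est - theta))
    return float(np.mean(errs))

e100, e1600 = lstsq_error(100), lstsq_error(1600)
print(e100, e1600)  # roughly a factor of sqrt(1600/100) = 4 apart
```

Quadrupling the sample size roughly halves the error twice, matching the $1/\sqrt{n}$ rate; for structured estimators like the Lasso, the constant $c$ is governed by the Gaussian width quantities in the abstract.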
Eigenvalue interlacing and weight parameters of graphs
Eigenvalue interlacing is a versatile technique for deriving results in
algebraic combinatorics. In particular, it has been successfully used for
proving a number of results about the relation between the (adjacency matrix or
Laplacian) spectrum of a graph and some of its properties. For instance, some
characterizations of regular partitions, and bounds for some parameters, such
as the independence and chromatic numbers, the diameter, the bandwidth, etc.,
have been obtained. For each parameter of a graph involving the cardinality of
some vertex sets, we can define its corresponding weight parameter by giving
some "weights" (that is, the entries of the positive eigenvector) to the
vertices and replacing cardinalities by square norms. The key point is that
such weights "regularize" the graph, and hence allow us to define a kind of
regular partition, called "pseudo-regular," intended for general graphs. Here
we show how to use interlacing for proving results about some weight parameters
and pseudo-regular partitions of a graph. For instance, generalizing a
well-known result of Lov\'asz, it is shown that the weight Shannon capacity
$\Theta^*$ of a connected graph $G$, with $n$ vertices and (adjacency matrix)
eigenvalues $\lambda_1 > \lambda_2 \ge \cdots \ge \lambda_n$, satisfies
$\Theta \le \Theta^* \le \frac{\|\nu\|^2}{1-\frac{\lambda_1}{\lambda_n}}$,
where $\Theta$ is the (standard) Shannon capacity and $\nu$ is the positive
eigenvector normalized to have smallest entry 1. In the special case of regular
graphs, the results obtained have some interesting corollaries, such as an
upper bound for some of the multiplicities of the eigenvalues of a
distance-regular graph. Finally, some results involving the Laplacian spectrum
are derived.
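For a regular graph the positive eigenvector is the all-ones vector, so the spectral upper bound on the Shannon capacity takes the form $n/(1-\lambda_1/\lambda_n)$. A quick numerical check on the 5-cycle, where this value is $\sqrt{5}$, the Shannon capacity of $C_5$ determined by Lov\'asz (an illustrative sketch, not code from the paper):

```python
import numpy as np

# Adjacency matrix of the 5-cycle C5 (2-regular).
n = 5
A = np.zeros((n, n))
for i in range(n):
    A[i, (i + 1) % n] = A[(i + 1) % n, i] = 1

lam = np.linalg.eigvalsh(A)      # eigenvalues in ascending order
lam1, lamn = lam[-1], lam[0]     # largest (= 2) and smallest (~ -1.618)
bound = n / (1 - lam1 / lamn)    # regular-graph form of the upper bound
print(bound)  # -> 2.236..., i.e. sqrt(5)
```

That the bound is attained exactly for $C_5$ illustrates why spectral upper bounds of this type can be sharp on highly structured graphs.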