SOCP relaxation bounds for the optimal subset selection problem applied to robust linear regression
This paper deals with the problem of finding the globally optimal subset of h
elements from a larger set of n elements in d space dimensions so as to
minimize a quadratic criterion, with a special emphasis on applications to
computing the Least Trimmed Squares Estimator (LTSE) for robust regression. The
computation of the LTSE is a challenging subset selection problem involving a
nonlinear program with continuous and binary variables, linked in a highly
nonlinear fashion. The selection of a globally optimal subset using the branch
and bound (BB) algorithm is limited to problems in very low dimension,
typically d<5, as the complexity of the problem increases exponentially with d.
We introduce a bold pruning strategy in the BB algorithm that results in a
significant reduction in computing time, at the price of a negligible loss of
accuracy. The novelty of our algorithm is that the bounds at nodes of the BB tree
come from pseudo-convexifications derived using a linearization technique with
approximate bounds for the nonlinear terms. The approximate bounds are computed
solving an auxiliary semidefinite optimization problem. We show through a
computational study that our algorithm performs well in a wide set of the most
difficult instances of the LTSE problem.
Comment: 12 pages, 3 figures, 2 tables
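To make the combinatorial difficulty concrete, here is a minimal sketch of the LTS objective and a brute-force globally optimal solver that enumerates all h-subsets. This enumeration is not the paper's branch-and-bound algorithm; its exponential cost in n is precisely what the paper's pruning strategy is designed to avoid.

```python
import itertools
import numpy as np

def lts_objective(X, y, beta, h):
    """Sum of the h smallest squared residuals for coefficients beta."""
    r2 = (y - X @ beta) ** 2
    return np.sort(r2)[:h].sum()

def lts_brute_force(X, y, h):
    """Globally optimal LTS fit by enumerating every h-subset.

    Cost grows combinatorially with n -- feasible only for toy
    problems, which is why pruned branch-and-bound matters.
    """
    best_obj, best_beta = np.inf, None
    for subset in itertools.combinations(range(len(y)), h):
        idx = list(subset)
        # Ordinary least squares on the candidate subset.
        beta, *_ = np.linalg.lstsq(X[idx], y[idx], rcond=None)
        obj = ((y[idx] - X[idx] @ beta) ** 2).sum()
        if obj < best_obj:
            best_obj, best_beta = obj, beta
    return best_obj, best_beta
```

With one gross outlier among n = 5 points and h = 4, the optimal subset is the four clean points, recovering the underlying line exactly.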
Least quantile regression via modern optimization
We address the Least Quantile of Squares (LQS) (and in particular the Least
Median of Squares) regression problem using modern optimization methods. We
propose a Mixed Integer Optimization (MIO) formulation of the LQS problem which
allows us to find a provably global optimal solution for the LQS problem. Our
MIO framework has the appealing characteristic that if we terminate the
algorithm early, we obtain a solution with a guarantee on its sub-optimality.
We also propose continuous optimization methods based on first-order
subdifferential methods, sequential linear optimization and hybrid combinations
of them to obtain near optimal solutions to the LQS problem. The MIO algorithm
is found to benefit significantly from high quality solutions delivered by our
continuous optimization based methods. We further show that the MIO approach
leads to (a) an optimal solution for any dataset, where the data points are
not necessarily in general position, (b) a simple
proof of the breakdown point of the LQS objective value that holds for any
dataset and (c) an extension to situations where there are polyhedral
constraints on the regression coefficient vector. We report computational
results with both synthetic and real-world datasets showing that the MIO
algorithm with warm starts from the continuous optimization methods solves
small and medium size problems to provable optimality in under two hours, and
outperforms all publicly available methods for large-scale (10,000 data points)
LQS problems.
Comment: Published at http://dx.doi.org/10.1214/14-AOS1223 in the Annals of
Statistics (http://www.imstat.org/aos/) by the Institute of Mathematical
Statistics (http://www.imstat.org)
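The LQS criterion that this abstract optimizes is simply an order statistic of the squared residuals. A small sketch (not the MIO formulation itself) shows why it is robust: a single gross outlier cannot move the q-th smallest squared residual, whereas it drags the OLS fit.

```python
import numpy as np

def lqs_objective(X, y, beta, q):
    """The LQS criterion: the q-th smallest squared residual.
    With q about n/2 this is essentially the Least Median of
    Squares objective."""
    r2 = np.sort((y - X @ beta) ** 2)
    return r2[q - 1]  # q-th order statistic, 1-indexed
```

For data on an exact line plus one outlier, the true coefficients achieve an LQS value of zero, while the outlier-contaminated OLS fit does not.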
Robust variable screening for regression using factor profiling
Sure Independence Screening is a fast procedure for variable selection in
ultra-high dimensional regression analysis. Unfortunately, its performance
greatly deteriorates with increasing dependence among the predictors. To solve
this issue, Factor Profiled Sure Independence Screening (FPSIS) models the
correlation structure of the predictor variables, assuming that it can be
represented by a few latent factors. The correlations can then be profiled out
by projecting the data onto the orthogonal complement of the subspace spanned
by these factors. However, neither of these methods can handle the presence of
outliers in the data. Therefore, we propose a robust screening method which
uses a least trimmed squares method to estimate the latent factors and the
factor profiled variables. Variable screening is then performed on factor
profiled variables by using regression MM-estimators. Different types of
outliers in this model and their roles in variable screening are studied. Both
simulation studies and a real data analysis show that the proposed robust
procedure has good performance on clean data and outperforms the two nonrobust
methods on contaminated data.
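The screening step that FPSIS robustifies can be sketched in a few lines. Below is plain Sure Independence Screening (rank predictors by absolute marginal correlation with the response and keep the top d); the robust variant described in the abstract would first profile out latent factors via least trimmed squares and replace the marginal fits with MM-estimators, which this sketch deliberately omits.

```python
import numpy as np

def sis_screen(X, y, d):
    """Basic Sure Independence Screening: keep the indices of the d
    predictors with the largest absolute marginal correlation with y.
    (Non-robust; FPSIS would profile out latent factors first.)"""
    Xc = X - X.mean(axis=0)
    yc = y - y.mean()
    corr = np.abs(Xc.T @ yc) / (np.linalg.norm(Xc, axis=0) * np.linalg.norm(yc))
    return np.argsort(corr)[::-1][:d]  # indices of the top-d predictors
```

On clean data with a single active predictor, the screen recovers it immediately; the abstract's point is that correlated predictors and outliers break this simple ranking.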
Deterministic Sampling and Range Counting in Geometric Data Streams
We present memory-efficient deterministic algorithms for constructing
epsilon-nets and epsilon-approximations of streams of geometric data. Unlike
probabilistic approaches, these deterministic samples provide guaranteed bounds
on their approximation factors. We show how our deterministic samples can be
used to answer approximate online iceberg geometric queries on data streams. We
use these techniques to approximate several robust statistics of geometric data
streams, including Tukey depth, simplicial depth, regression depth, the
Theil-Sen estimator, and the least median of squares. Our algorithms use only a
polylogarithmic amount of memory, provided the desired approximation factors
are inverse-polylogarithmic. We also include a lower bound for non-iceberg
geometric queries.
Comment: 12 pages, 1 figure
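The epsilon-approximation guarantee behind these deterministic samples is easiest to see in one dimension, where range queries are intervals. The static sketch below (not the paper's streaming merge-and-reduce construction) takes every k-th point of the sorted order with k = ceil(eps * n); because an interval covers a contiguous run of sorted points, scaling the sample count by k estimates any interval count with error below eps * n.

```python
import numpy as np

def eps_approximation_1d(points, eps):
    """Deterministic eps-approximation of a 1-D point set for interval
    (range-counting) queries: every k-th point of the sorted order,
    k = ceil(eps * n). Estimating an interval's count as
    (sample count) * k errs by fewer than k <= eps*n + 1 points."""
    pts = np.sort(np.asarray(points, dtype=float))
    n = len(pts)
    k = max(1, int(np.ceil(eps * n)))
    return pts[k - 1 :: k]  # one representative per block of k points
```

A streaming version would maintain such samples over buffers and merge them hierarchically; this sketch only illustrates the deterministic guarantee that probabilistic sampling lacks.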
Bayesian Model Averaging and Weighted Average Least Squares: Equivariance, Stability, and Numerical Issues
This article is concerned with the estimation of linear regression models with uncertainty about the choice of the explanatory variables. We introduce the Stata commands bma and wals, which implement, respectively, the exact Bayesian Model Averaging (BMA) estimator and the Weighted Average Least Squares (WALS) estimator developed by Magnus et al. (2010). Unlike standard pretest estimators, which are based on some preliminary diagnostic test, these model averaging estimators provide a coherent way of making inference on the regression parameters of interest by taking into account the uncertainty due to both the estimation and the model selection steps. Special emphasis is given to a number of practical issues that users are likely to face in applied work: equivariance to certain transformations of the explanatory variables, stability, accuracy, computing speed, and out-of-memory problems. The performance of our bma and wals commands is illustrated using simulated data and empirical applications from the literature on model averaging estimation.
Keywords: Model uncertainty; Model averaging; Bayesian analysis; Exact computation
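The model-averaging step these commands implement can be sketched as follows. This toy version averages least-squares coefficients over all 2^p regressor subsets with exp(-BIC/2) weights; the actual bma and wals commands use the exact posterior weights and priors of Magnus et al. (2010), so treat this only as an illustration of averaging over models rather than selecting one.

```python
import itertools
import numpy as np

def bma_bic(X, y):
    """Average OLS coefficients over all 2^p regressor subsets,
    weighted by exp(-BIC/2). A BIC-approximate stand-in for exact
    BMA weights, for illustration only."""
    n, p = X.shape
    bics, betas = [], []
    for mask in itertools.product([0, 1], repeat=p):
        idx = [j for j in range(p) if mask[j]]
        beta_full = np.zeros(p)
        if idx:
            beta_sub, *_ = np.linalg.lstsq(X[:, idx], y, rcond=None)
            beta_full[idx] = beta_sub
            resid = y - X[:, idx] @ beta_sub
        else:
            resid = y  # null model: no regressors
        rss = max(float(resid @ resid), 1e-12)
        bics.append(n * np.log(rss / n) + len(idx) * np.log(n))
        betas.append(beta_full)
    b = np.array(bics)
    w = np.exp(-0.5 * (b - b.min()))  # shift by min BIC for stability
    w /= w.sum()
    return w @ np.array(betas)
```

When one regressor carries all the signal, the weights concentrate on models containing it, and the averaged coefficients shrink the irrelevant ones toward zero.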