Geometric fitting by two coaxial cylinders
Fitting two coaxial cylinders to data is a standard problem in computational metrology and reverse engineering, and it also arises in medical imaging. Many fitting criteria can be used; one that is widely used in metrology, for example, is the sum of squared minimal distances. A numerical method is developed to fit two coaxial cylinders in general position to 3D data, and numerical examples are given
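The sum-of-squared-minimal-distances criterion mentioned above can be sketched for the two-coaxial-cylinder case as follows. This is an illustrative sketch, not the paper's algorithm: the parameter names (axis point `a`, unit direction `d`, radii `r1`, `r2`) and the rule of assigning each point to the nearer surface are assumptions.

```python
import math

def cylinder_residual(p, a, d, r):
    """Minimal (orthogonal) distance from point p to the surface of a
    cylinder with axis through a in unit direction d and radius r."""
    v = [p[i] - a[i] for i in range(3)]
    t = sum(v[i] * d[i] for i in range(3))          # projection onto the axis
    radial = [v[i] - t * d[i] for i in range(3)]    # component orthogonal to the axis
    return math.sqrt(sum(c * c for c in radial)) - r

def objective(points, a, d, r1, r2):
    """Least-squares objective for two coaxial cylinders: each point
    contributes its squared distance to the nearer of the two surfaces."""
    total = 0.0
    for p in points:
        e1 = cylinder_residual(p, a, d, r1)
        e2 = cylinder_residual(p, a, d, r2)
        total += min(e1 * e1, e2 * e2)
    return total
```

A general-purpose minimizer (e.g. Gauss-Newton over the axis and radius parameters) would then drive this objective to a local minimum.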
Use of lp norms in fitting curves and surfaces to data
Given a family of curves or surfaces in R^s, an important problem is that of finding a member of the family which gives a "best" fit to m given data points. A criterion which is relevant to many application areas is orthogonal distance regression, where the sum of squares of the orthogonal distances from the data points to the surface is minimized. For example, this is important in metrology, where measured data from a manufactured part may have to be modelled. The least squares norm is not always suitable (for example, there may be wild points in the data, or accept/reject decisions may be required), which justifies looking at the use of other l_p norms. There are different ways to formulate the problem, and we examine methods which generalize in a natural way those available for least squares. The emphasis is on the efficient numerical treatment of the resulting problems
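The effect of the norm choice is easiest to see in the simplest fitting problem, a constant model: the l_2 minimizer is the mean and the l_1 minimizer is the median, and a wild point drags the former far more than the latter. A toy sketch (not the paper's surface-fitting formulation):

```python
def lp_objective(c, data, p):
    """Sum of |x - c|^p residuals for fitting the constant c to data."""
    return sum(abs(x - c) ** p for x in data)

data = [1.0, 1.1, 0.9, 1.0, 100.0]   # one wild point

# The l2 minimizer is the mean; the l1 minimizer is the median.
l2_fit = sum(data) / len(data)
l1_fit = sorted(data)[len(data) // 2]

print(l2_fit)  # pulled far toward the wild point (about 20.8)
print(l1_fit)  # robust to the wild point (1.0)
```

The same trade-off motivates l_p fitting of curves and surfaces: p near 1 downweights wild points, while larger p sharpens accept/reject behaviour.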
On the local stability of semidefinite relaxations
We consider a parametric family of quadratically constrained quadratic
programs (QCQP) and their associated semidefinite programming (SDP)
relaxations. Given a nominal value of the parameter at which the SDP relaxation
is exact, we study conditions (and quantitative bounds) under which the
relaxation will continue to be exact as the parameter moves in a neighborhood
around the nominal value. Our framework captures a wide array of statistical
estimation problems including tensor principal component analysis, rotation
synchronization, orthogonal Procrustes, camera triangulation and resectioning,
essential matrix estimation, system identification, and approximate GCD. Our
results can also be used to analyze the stability of SOS relaxations of general
polynomial optimization problems.

Comment: 23 pages, 3 figures
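Concretely, the semidefinite relaxation in question is the standard Shor lifting: replace the rank-one matrix x x^T by a general positive semidefinite matrix. A standard-form sketch (equality constraints chosen for simplicity):

```latex
% QCQP in standard form
\min_{x \in \mathbb{R}^n} \; x^\top A x
\quad \text{s.t.} \quad x^\top B_i x = b_i, \quad i = 1, \dots, m.

% SDP relaxation: substitute X = x x^\top and drop the rank-one constraint
\min_{X \in \mathbb{S}^n} \; \operatorname{tr}(A X)
\quad \text{s.t.} \quad \operatorname{tr}(B_i X) = b_i, \quad X \succeq 0.
```

The relaxation is exact at a given parameter value when some optimal X has rank one; it is the local stability of this property, as the parameter varies, that the paper analyzes.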
Adaptive Momentum for Neural Network Optimization
In this thesis, we develop a novel and efficient algorithm for optimizing neural networks, inspired by a recently proposed geodesic optimization algorithm. Our algorithm, which we call Stochastic Geodesic Optimization (SGeO), adds an adaptive coefficient on top of Polyak's Heavy Ball method, effectively controlling the amount of weight put on the previous parameter update based on the change of direction in the optimization path. Experimental results on strongly convex functions with Lipschitz gradients and on deep autoencoder benchmarks show that SGeO reaches lower errors than established first-order methods and competes well, with lower or similar errors, against a recent second-order method called K-FAC (Kronecker-Factored Approximate Curvature). We also incorporate a Nesterov-style lookahead gradient into our algorithm (SGeO-N) and observe notable improvements. We believe that our research will open up new directions for high-dimensional neural network optimization, where combining the efficiency of first-order methods with the effectiveness of second-order methods is a promising avenue to explore
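The general idea of a direction-dependent momentum coefficient can be sketched on a toy quadratic. The specific rule below (momentum suppressed when successive updates reverse direction) is an illustrative assumption, not SGeO's actual adaptive coefficient:

```python
def adaptive_heavy_ball(grad, x0, lr=0.1, beta_max=0.9, steps=200):
    """Heavy-ball iteration x_{k+1} = x_k - lr*g_k + beta_k*(x_k - x_{k-1}),
    with the momentum coefficient beta_k zeroed when the update direction
    reverses (a hypothetical adaptive rule, for illustration only)."""
    x, prev_update = x0, 0.0
    for _ in range(steps):
        g = grad(x)
        update = -lr * g
        # Keep full momentum when current and previous updates agree in
        # direction; drop it entirely on a reversal.
        if prev_update != 0.0 and update * prev_update > 0:
            beta = beta_max
        else:
            beta = 0.0
        step = update + beta * prev_update
        x += step
        prev_update = step
    return x

# Minimize f(x) = (x - 3)^2, whose gradient is 2*(x - 3).
x_star = adaptive_heavy_ball(lambda x: 2.0 * (x - 3.0), x0=0.0)
```

On this quadratic the iterate converges to the minimizer at x = 3; the direction test damps the oscillation that a fixed large momentum coefficient would otherwise produce near the minimum.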
Fast and numerically stable circle fit
We develop a new algorithm for fitting circles that does not have drawbacks commonly found in existing circle fits. Our fit achieves ultimate accuracy (to machine precision), avoids divergence, and is numerically stable even when the fitted circles become arbitrarily large. Lastly, our algorithm takes fewer than 10 iterations to converge, on average.

Comment: 16 pages
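For contrast with the geometric fit described above, the classical algebraic (Kasa) circle fit reduces to a linear least-squares problem. It is simple and fast but lacks the accuracy and large-circle stability properties the abstract claims for the new method; this sketch is the textbook algebraic fit, not the paper's algorithm:

```python
import math

def kasa_circle_fit(points):
    """Algebraic circle fit: minimize the sum over points of
    (x^2 + y^2 + D*x + E*y + F)^2, which is linear least squares in
    (D, E, F); then recover the center and radius."""
    # Accumulate the 3x3 normal equations M [D, E, F]^T = rhs.
    M = [[0.0] * 3 for _ in range(3)]
    rhs = [0.0] * 3
    for x, y in points:
        z = x * x + y * y
        row = (x, y, 1.0)
        for i in range(3):
            for j in range(3):
                M[i][j] += row[i] * row[j]
            rhs[i] += -z * row[i]
    # Solve by Gaussian elimination with partial pivoting.
    for col in range(3):
        piv = max(range(col, 3), key=lambda r: abs(M[r][col]))
        M[col], M[piv] = M[piv], M[col]
        rhs[col], rhs[piv] = rhs[piv], rhs[col]
        for r in range(col + 1, 3):
            f = M[r][col] / M[col][col]
            for c in range(col, 3):
                M[r][c] -= f * M[col][c]
            rhs[r] -= f * rhs[col]
    sol = [0.0] * 3
    for r in range(2, -1, -1):
        s = rhs[r] - sum(M[r][c] * sol[c] for c in range(r + 1, 3))
        sol[r] = s / M[r][r]
    D, E, F = sol
    cx, cy = -D / 2.0, -E / 2.0
    radius = math.sqrt(cx * cx + cy * cy - F)
    return cx, cy, radius
```

On noise-free data the algebraic fit recovers the circle exactly; its well-known drawback, which geometric fits avoid, is bias toward smaller circles on noisy incomplete arcs.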
Linear dimensionality reduction: Survey, insights, and generalizations
Linear dimensionality reduction methods are a cornerstone of analyzing high
dimensional data, due to their simple geometric interpretations and typically
attractive computational properties. These methods capture many data features
of interest, such as covariance, dynamical structure, correlation between data
sets, input-output relationships, and margin between data classes. Methods have
been developed with a variety of names and motivations in many fields, and
perhaps as a result the connections between all these methods have not been
highlighted. Here we survey methods from this disparate literature as
optimization programs over matrix manifolds. We discuss principal component
analysis, factor analysis, linear multidimensional scaling, Fisher's linear
discriminant analysis, canonical correlations analysis, maximum autocorrelation
factors, slow feature analysis, sufficient dimensionality reduction,
undercomplete independent component analysis, linear regression, distance
metric learning, and more. This optimization framework gives insight to some
rarely discussed shortcomings of well-known methods, such as the suboptimality
of certain eigenvector solutions. Modern techniques for optimization over
matrix manifolds enable a generic linear dimensionality reduction solver, which
accepts as input data and an objective to be optimized, and returns, as output,
an optimal low-dimensional projection of the data. This simple optimization
framework further allows straightforward generalizations and novel variants of
classical methods, which we demonstrate here by creating an
orthogonal-projection canonical correlations analysis. More broadly, this
survey and generic solver suggest that linear dimensionality reduction can move
toward becoming a blackbox, objective-agnostic numerical technology.

JPC and ZG received funding from the UK Engineering and Physical Sciences Research Council (EPSRC EP/H019472/1). JPC received funding from a Sloan Research Fellowship, the Simons Foundation (SCGB#325171 and SCGB#325233), the Grossman Center at Columbia University, and the Gatsby Charitable Trust.

This is the author accepted manuscript. The final version is available from MIT Press via http://jmlr.org/papers/v16/cunningham15a.htm
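The simplest instance of the survey's framework is PCA: maximize projected variance over unit-norm projections, argmax over w with ||w|| = 1 of w^T C w for the sample covariance C. A one-component, 2-D sketch via power iteration (pure Python, illustrative only):

```python
import math

def leading_pc(data, iters=500):
    """Leading principal component of 2-D data via power iteration on
    the sample covariance matrix C: argmax_{||w||=1} w^T C w."""
    n = len(data)
    mx = sum(x for x, _ in data) / n
    my = sum(y for _, y in data) / n
    # Entries of the 2x2 sample covariance matrix.
    cxx = sum((x - mx) ** 2 for x, _ in data) / n
    cyy = sum((y - my) ** 2 for _, y in data) / n
    cxy = sum((x - mx) * (y - my) for x, y in data) / n
    w = (1.0, 0.0)
    for _ in range(iters):
        v = (cxx * w[0] + cxy * w[1], cxy * w[0] + cyy * w[1])
        norm = math.hypot(v[0], v[1])
        w = (v[0] / norm, v[1] / norm)
    return w

# Data spread along y = x with tiny alternating noise: the leading
# component should align with (1, 1)/sqrt(2).
pts = [(float(t), t + 0.01 * ((-1) ** i)) for i, t in enumerate(range(-5, 6))]
w = leading_pc(pts)
```

The survey's point is that this same pattern, an objective optimized over a matrix manifold (here the unit sphere, more generally the Stiefel or Grassmann manifold), covers all the listed methods, so one generic solver can replace the per-method eigenvector recipes.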