Randomized progressive iterative approximation for B-spline curve and surface fittings
For large-scale data fitting, the least-squares progressive iterative
approximation is a widely used method in many applied domains because of its
intuitive geometric meaning and efficiency. In this work, we present a
randomized progressive iterative approximation (RPIA) for the B-spline curve
and surface fittings. In each iteration, RPIA locally adjusts the control
points according to a random criterion of index selection. The
adjustment of each control point is derived in the manner of the
randomized block coordinate descent method. We illustrate RPIA from both
geometric and algebraic perspectives. We prove that RPIA constructs a
series of fitting curves (resp., surfaces) whose limit curve (resp.,
surface) converges in expectation to the least-squares fitting result of
the given data points. Numerical experiments confirm our results and
show the benefits of RPIA.
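The randomized block update the abstract describes can be sketched as
follows. This is an illustrative reconstruction, not the authors' code:
the hat-function basis stands in for a true B-spline collocation matrix,
and the problem sizes, block size, and step size are all assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical stand-in for the B-spline collocation matrix A, where
# A[i, j] = N_j(t_i) is the j-th basis function at parameter t_i.
# Hat functions are used only for illustration, not true cubic B-splines.
m, n = 40, 8                        # m data points, n control points
t = np.linspace(0, 1, m)
knots = np.linspace(0, 1, n)
A = np.maximum(0.0, 1.0 - (n - 1) * np.abs(t[:, None] - knots[None, :]))

Q = np.column_stack([np.cos(2 * np.pi * t), np.sin(2 * np.pi * t)])  # data
P = np.zeros((n, 2))                                 # control points

mu = 1.0 / np.linalg.norm(A.T @ A, 2)  # safe step, below 2 / lambda_max

for _ in range(5000):
    residual = Q - A @ P                          # current fitting error
    block = rng.choice(n, size=3, replace=False)  # random index block
    # adjust only the selected control points (randomized block step)
    P[block] += mu * A[:, block].T @ residual

# The limit agrees with the ordinary least-squares fit of the data
P_ls, *_ = np.linalg.lstsq(A, Q, rcond=None)
print(float(np.abs(P - P_ls).max()))
```

Each iteration touches only a random subset of control points, which is
what makes the per-step cost small for large-scale fitting.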
Preconditioned geometric iterative methods for cubic B-spline interpolation curves
The geometric iterative method (GIM) is widely used in data
interpolation/fitting, but its slow convergence affects the computational
efficiency. Recently, much work has been done in the literature to
accelerate GIM. In this work, we aim to further accelerate the rate of
convergence by introducing a preconditioning technique. After
constructing the preconditioner, we precondition the progressive
iterative approximation (PIA) and its variants, yielding the
preconditioned GIMs. We show that the proposed preconditioned GIMs
converge and that the extra computational cost introduced by
preconditioning is negligible. Several numerical experiments demonstrate
that our preconditioner accelerates the convergence rate of PIA and its
variants.
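A minimal sketch of the idea, assuming the uniform cubic B-spline
interpolation matrix with its standard (1/6, 2/3, 1/6) interior stencil
(endpoint conditions simplified); the Jacobi preconditioner below is
only an illustrative choice, not the paper's construction:

```python
import numpy as np

n = 20
# Uniform cubic B-spline interpolation: the collocation matrix B is
# tridiagonal with stencil (1/6, 2/3, 1/6) in the interior (endpoint
# conditions simplified for illustration).
B = (np.diag(np.full(n, 2 / 3))
     + np.diag(np.full(n - 1, 1 / 6), 1)
     + np.diag(np.full(n - 1, 1 / 6), -1))

Q = np.sin(np.linspace(0, 2 * np.pi, n))[:, None]  # data to interpolate

def pia(C, iters):
    P = Q.copy()                     # PIA starts from the data points
    for _ in range(iters):
        P = P + C @ (Q - B @ P)      # (preconditioned) PIA update
    return P

I = np.eye(n)
C_jacobi = np.diag(1.0 / np.diag(B))  # illustrative Jacobi preconditioner

# Convergence speed is governed by the spectral radius of I - C B
rho = lambda M: np.abs(np.linalg.eigvals(M)).max()
print("plain PIA rho:", rho(I - B))
print("preconditioned rho:", rho(I - C_jacobi @ B))

res_pre = np.linalg.norm(B @ pia(C_jacobi, 60) - Q)
print("residual after 60 preconditioned steps:", res_pre)
```

The preconditioned iteration has a smaller spectral radius than plain
PIA here, and since C is diagonal the extra per-step cost is negligible.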
Video Data Compression by Progressive Iterative Approximation
In the present paper, the B-spline curve is used to reduce the entropy of video data. We consider the color or luminance variations of a spatial position across a series of frames as input data points in the Euclidean space R or R^3. The progressive and iterative approximation (PIA) method is a direct and intuitive way of generating a series of curves of higher and higher fitting accuracy. The video data points are approximated using least-squares progressive and iterative approximation (LSPIA) fitting. Lossless video data compression is achieved by storing the B-spline curve control points (CPs) and the difference between the fitted and original video data. The proposed method is applied to two classes of video sequences, synthetically produced and naturally recorded, and reduces the entropy of both. However, this reduction is larger for the synthetically created sequences than for the naturally recorded ones. A comparative analysis of experiments on a variety of video sequences shows that the entropy of the output video data is much lower than that of the input video data.
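The storage scheme (control points plus a small integer residual) can be
illustrated as follows. Everything here is an assumption for the sketch:
Gaussian bumps stand in for the B-spline basis, the luminance signal is
synthetic, and a direct least-squares solve stands in for the LSPIA
limit.

```python
import numpy as np

def entropy(x):
    # Shannon entropy (bits per sample) of an integer-valued signal
    _, counts = np.unique(x, return_counts=True)
    p = counts / counts.sum()
    return float(-(p * np.log2(p)).sum())

rng = np.random.default_rng(1)
m, n = 256, 16
t = np.linspace(0, 1, m)
# Smooth Gaussian bumps standing in for a cubic B-spline basis
knots = np.linspace(0, 1, n)
A = np.exp(-(((t[:, None] - knots[None, :]) * (n - 1)) ** 2))

# Luminance of one pixel position across 256 frames: smooth variation
# plus small sensor noise
luma = np.clip(np.round(128 + 80 * np.sin(3 * t))
               + rng.integers(-2, 3, m), 0, 255).astype(int)

# Fit the curve (the limit LSPIA converges to), then store the control
# points P plus the small integer residual instead of the raw frames
P, *_ = np.linalg.lstsq(A, luma.astype(float), rcond=None)
residual = luma - np.round(A @ P).astype(int)

print("residual entropy:", entropy(residual))
print("raw entropy:", entropy(luma))
```

Because the fitted curve captures the smooth variation, the residual is
concentrated on a few small values and codes in far fewer bits per
sample than the raw signal.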
Progressive Analytics: A Computation Paradigm for Exploratory Data Analysis
Exploring data requires a fast feedback loop from the analyst to the system,
with a latency below about 10 seconds because of human cognitive limitations.
When data becomes large or analysis becomes complex, sequential computations
can no longer be completed in a few seconds and data exploration is severely
hampered. This article describes a novel computation paradigm called
Progressive Computation for Data Analysis, or more concisely Progressive
Analytics, that provides a low-latency guarantee at the programming
language level by performing computations in a progressive fashion.
Moving this progressive computation to the language level relieves the
programmer of exploratory data analysis systems from implementing the
whole analytics pipeline in a progressive way from scratch, streamlining
the implementation of scalable exploratory data analysis systems. This
article describes the new paradigm through a prototype implementation
called ProgressiVis, and explains the requirements it implies through
examples.
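The paradigm can be illustrated with a toy progressive computation
(unrelated to the actual ProgressiVis API): a running mean that yields a
refined estimate after each chunk, so the analyst sees a steadily
improving answer instead of waiting for the full pass.

```python
import random
import statistics

def progressive_mean(stream, chunk=1000):
    """Yield an updated mean estimate after each chunk, bounding the
    latency between successive results shown to the analyst."""
    total, count, buf = 0.0, 0, []
    for x in stream:
        buf.append(x)
        if len(buf) == chunk:
            total += sum(buf)
            count += len(buf)
            buf.clear()
            yield total / count      # partial, steadily improving result
    if buf:                          # flush the last partial chunk
        total += sum(buf)
        count += len(buf)
        yield total / count

random.seed(0)
data = [random.gauss(5.0, 1.0) for _ in range(10_000)]
estimates = list(progressive_mean(data))
print(len(estimates), estimates[-1])
```

The final estimate matches the exact mean; the intermediate yields are
what a progressive system would hand to the visualization layer.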
DROP: Dimensionality Reduction Optimization for Time Series
Dimensionality reduction is a critical step in scaling machine learning
pipelines. Principal component analysis (PCA) is a standard tool for
dimensionality reduction, but performing PCA over a full dataset can be
prohibitively expensive. As a result, theoretical work has studied the
effectiveness of iterative, stochastic PCA methods that operate over data
samples. However, stochastic PCA methods either run for a predetermined
number of iterations or until the solution converges, frequently
sampling too many or too few data points for end-to-end runtime
improvements. We show how accounting for downstream analytics operations
during dimensionality reduction via PCA allows stochastic methods to
terminate efficiently after operating over small (e.g., 1%) subsamples
of the input data, reducing whole-workload runtime. Leveraging this, we
propose DROP, a dimensionality reduction optimizer that enables speedups
of up to 5x over singular-value-decomposition-based PCA techniques and
exceeds conventional approaches such as FFT and PAA by up to 16x in
end-to-end workloads.
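The data-dependent termination idea can be sketched as follows; the
synthetic data, the sample-doubling schedule, and the reconstruction
error metric are illustrative assumptions, not DROP's actual optimizer.

```python
import numpy as np

rng = np.random.default_rng(2)

# Synthetic time-series-like data with low intrinsic dimensionality
n, d, k = 5000, 64, 4
X = rng.normal(size=(n, k)) @ rng.normal(size=(k, d)) \
    + 0.05 * rng.normal(size=(n, d))
X -= X.mean(axis=0)

def sample_pca(X, k, frac):
    # PCA fitted on a small random subsample of the rows
    size = min(len(X), max(k, int(frac * len(X))))
    idx = rng.choice(len(X), size=size, replace=False)
    _, _, Vt = np.linalg.svd(X[idx], full_matrices=False)
    return Vt[:k]

def recon_error(X, V):
    # Downstream metric: relative reconstruction error on the full data
    return float(np.linalg.norm(X - (X @ V.T) @ V) / np.linalg.norm(X))

# Grow the sample until the downstream metric stops improving, instead
# of running for a fixed iteration count or to full convergence
frac, prev = 0.01, np.inf
while True:
    V = sample_pca(X, k, frac)
    err = recon_error(X, V)
    if prev - err < 1e-3:            # terminate on diminishing returns
        break
    prev, frac = err, 2 * frac

print(f"stopped at {frac:.0%} sample, relative error {err:.4f}")
```

On low-rank data like this, the loop stops after a tiny subsample
because further sampling no longer improves the downstream metric.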