Constructive function approximation: theory and practice
In this paper we study the theoretical limits of finite constructive convex approximations of a given function in a Hilbert space using elements taken from a reduced subset. We also investigate the trade-off between the global error and the partial error during the iterations of the solution. These results are then specialized to constructive function approximation using sigmoidal neural networks. The emphasis then shifts to the implementation issues associated with the problem of achieving given approximation errors when using a finite number of nodes and a finite data set for training.
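The incremental convex scheme this abstract alludes to can be sketched as a greedy loop: at each step the current approximant f_k is mixed with one dictionary element g via f_{k+1} = (1 - a) f_k + a g, with a in [0, 1] chosen to minimize the squared error. The finite-dimensional setting, the dictionary, and the step rule below are illustrative assumptions, not the paper's construction:

```python
def convex_greedy(target, dictionary, iters=50):
    """Greedy convex approximation of a vector `target` by convex
    combinations of `dictionary` elements (a minimal sketch; the paper
    works in a general Hilbert space)."""
    dim = len(target)
    f = [0.0] * dim  # start from the zero element
    for _ in range(iters):
        best = None
        for g in dictionary:
            # Closed-form optimal mixing weight a for the quadratic error,
            # clipped to [0, 1] so the update stays a convex combination.
            num = sum((target[i] - f[i]) * (g[i] - f[i]) for i in range(dim))
            den = sum((g[i] - f[i]) ** 2 for i in range(dim)) or 1e-12
            a = max(0.0, min(1.0, num / den))
            cand = [(1 - a) * f[i] + a * g[i] for i in range(dim)]
            err = sum((target[i] - cand[i]) ** 2 for i in range(dim))
            if best is None or err < best[0]:
                best = (err, cand)
        f = best[1]
    return f
```

Because each step is a convex combination with one new element, this kind of scheme trades a small per-iteration (partial) error for a global error that shrinks at a rate governed by the iteration count, which is the trade-off the abstract mentions.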
Geometrical Insights for Implicit Generative Modeling
Learning algorithms for implicit generative models can optimize a variety of
criteria that measure how the data distribution differs from the implicit model
distribution, including the Wasserstein distance, the Energy distance, and the
Maximum Mean Discrepancy criterion. A careful look at the geometries induced by
these distances on the space of probability measures reveals interesting
differences. In particular, we can establish surprising approximate global
convergence guarantees for the 1-Wasserstein distance, even when the
parametric generator has a nonconvex parametrization.
Comment: this version fixes a typo in a definition
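Of the criteria the abstract compares, the Energy distance has a particularly simple sample estimate: 2 E|X - Y| - E|X - X'| - E|Y - Y'|. The biased V-statistic form and the 1-D setting below are simplifications for illustration:

```python
def energy_distance(xs, ys):
    """Biased (V-statistic) estimate of the Energy distance between two
    1-D samples: 2 E|X-Y| - E|X-X'| - E|Y-Y'|."""
    def mean_abs(a, b):
        # Average pairwise absolute difference between the two samples.
        return sum(abs(u - v) for u in a for v in b) / (len(a) * len(b))
    return 2 * mean_abs(xs, ys) - mean_abs(xs, xs) - mean_abs(ys, ys)
```

The estimate is zero when the two samples coincide and positive when they are separated, which is what makes it usable as a training criterion for an implicit generative model.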
On Sensor Network Localization Using SDP Relaxation
A Semidefinite Programming (SDP) relaxation is an effective computational
method to solve a Sensor Network Localization problem, which attempts to
determine the locations of a group of sensors given the distances between some
of them [11]. In this paper, we analyze and determine new sufficient conditions
and formulations that guarantee that the SDP relaxation is exact, i.e., gives
the correct solution. These conditions can be useful for designing sensor
networks and managing connectivities in practice.
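To make the underlying problem concrete: with three anchors at known positions and exact distances, a single sensor's location follows from linearizing the distance equations ||x - a_i||^2 = d_i^2 (subtracting the first from the others cancels the quadratic term). This classical trilateration sketch is only the one-sensor special case; the SDP relaxation analyzed in the paper handles many sensors with partial, sensor-to-sensor distance data:

```python
def trilaterate(anchors, dists):
    """Recover one 2-D sensor position from exact distances to three
    known anchors (a sketch of the noiseless, single-sensor case)."""
    (x1, y1), (x2, y2), (x3, y3) = anchors
    d1, d2, d3 = dists
    # Subtracting the first distance equation from the other two
    # cancels ||x||^2 and leaves a 2x2 linear system A x = b.
    A = [[2 * (x2 - x1), 2 * (y2 - y1)],
         [2 * (x3 - x1), 2 * (y3 - y1)]]
    b = [d1**2 - d2**2 + x2**2 - x1**2 + y2**2 - y1**2,
         d1**2 - d3**2 + x3**2 - x1**2 + y3**2 - y1**2]
    det = A[0][0] * A[1][1] - A[0][1] * A[1][0]
    x = (b[0] * A[1][1] - b[1] * A[0][1]) / det
    y = (A[0][0] * b[1] - A[1][0] * b[0]) / det
    return x, y
```

The linear system is solvable only when the anchors are not collinear; the exactness conditions the paper derives play the analogous role for the general SDP formulation.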
Our main contribution is twofold: We present the first non-asymptotic bound
on the connectivity or radio range requirement of the sensors in order to
ensure the network is uniquely localizable. Determining this range is a key
component in the design of sensor networks, and we provide a result that leads
to a correct localization of each sensor, for any number of sensors. Second, we
introduce a new class of graphs that can always be correctly localized by an
SDP relaxation. Specifically, we show that adding a simple objective function
to the SDP relaxation model will ensure that the solution is correct when
applied to a triangulation graph. Since triangulation graphs are very sparse,
this is informationally efficient, requiring an almost minimal amount of
distance information. We also analyze a number of objective functions for the
SDP relaxation to solve the localization problem for a general graph.
Comment: 20 pages, 4 figures, submitted to the Fields Institute Communications
Series on Discrete Geometry and Optimization