
    Construction of analysis-suitable $G^1$ planar multi-patch parameterizations

    Isogeometric analysis allows one to define shape functions of global $C^1$ continuity (or of higher continuity) over multi-patch geometries. The construction of such $C^1$-smooth isogeometric functions is a non-trivial task and requires particular multi-patch parameterizations, so-called analysis-suitable $G^1$ (in short, AS-$G^1$) parameterizations, to ensure that the resulting $C^1$ isogeometric spaces possess optimal approximation properties, cf. [7]. In this work, we show through examples that it is possible to construct AS-$G^1$ multi-patch parameterizations of planar domains, given their boundary. More precisely, given a generic multi-patch geometry, we generate an AS-$G^1$ multi-patch parameterization possessing the same boundary, the same vertices and the same first derivatives at the vertices, and which is as close as possible to this initial geometry. Our algorithm is based on a quadratic optimization problem with linear side constraints. Numerical tests also confirm that $C^1$ isogeometric spaces over AS-$G^1$ multi-patch parameterized domains converge optimally under mesh refinement, while for generic parameterizations the convergence order is severely reduced.
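    The abstract does not spell out the optimization details. As a small finite-dimensional illustration of the kind of problem named (a quadratic objective with linear side constraints), the sketch below solves an equality-constrained QP via its KKT system; the objective, constraint, and all variable names are hypothetical assumptions, not taken from the paper.

```python
import numpy as np

def solve_equality_qp(Q, c, A, b):
    """Minimize 0.5*x^T Q x - c^T x subject to A x = b by solving the
    KKT system directly (Q assumed positive definite on the null
    space of A)."""
    n, m = Q.shape[0], A.shape[0]
    # KKT matrix: [[Q, A^T], [A, 0]]
    K = np.block([[Q, A.T], [A, np.zeros((m, m))]])
    rhs = np.concatenate([c, b])
    sol = np.linalg.solve(K, rhs)
    return sol[:n]  # drop the Lagrange multipliers

# Toy example (assumed setup): keep a coefficient vector x as close
# as possible to a target g while enforcing a linear side constraint.
rng = np.random.default_rng(0)
n = 8
g = rng.normal(size=n)        # "initial geometry" target
Q = np.eye(n)                 # closeness measured in the l2 sense
c = Q @ g                     # minimizing 0.5*|x - g|^2 up to a constant
A = np.ones((1, n))           # one linear side constraint
b = np.array([0.0])           # e.g., coefficients sum to zero
x = solve_equality_qp(Q, c, A, b)
print(np.allclose(A @ x, b))  # constraint satisfied: True
```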

    Unconditionality of orthogonal spline systems in $H^1$

    We give a simple geometric characterization of knot sequences for which the corresponding orthonormal spline system of arbitrary order $k$ is an unconditional basis in the atomic Hardy space $H^1[0,1]$.

    Representation of Functional Data in Neural Networks

    Functional Data Analysis (FDA) is an extension of traditional data analysis to functional data, for example spectra, temporal series, spatio-temporal images, gesture recognition data, etc. Functional data are rarely known exactly in practice; usually only a regular or irregular sampling is available. For this reason, some preprocessing is needed in order to benefit from the smooth character of functional data in the analysis methods. This paper shows how to extend the Radial-Basis Function Network (RBFN) and Multi-Layer Perceptron (MLP) models to functional data inputs, in particular when the latter are known through lists of input-output pairs. Various possibilities for functional preprocessing are discussed, including projection on smooth bases, Functional Principal Component Analysis, functional centering and reduction, and the use of differential operators. It is shown how to incorporate this functional preprocessing into the RBFN and MLP models. The functional approach is illustrated on a benchmark of spectrometric data analysis.
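    As a rough sketch of one preprocessing route the abstract mentions (projection of sampled curves onto a smooth basis before feeding a standard model), the following hypothetical example fits a B-spline to each sampled curve and trains an MLP on the resulting coefficient vectors. The basis size, network layout, and toy data are illustrative assumptions, not the paper's configuration.

```python
import numpy as np
from scipy.interpolate import splrep
from sklearn.neural_network import MLPClassifier

def spline_coefficients(grid, y, num_knots=8, k=3):
    """Project one sampled curve (grid, y) onto a cubic B-spline basis
    (least-squares fit with fixed interior knots) and return its
    coefficient vector as a fixed-length feature."""
    knots = np.linspace(grid[1], grid[-2], num_knots)  # interior knots
    _, coeffs, _ = splrep(grid, y, t=knots, k=k)
    return coeffs[:num_knots + k + 1]  # drop fitpack's zero padding

# Toy functional data: noisy sines vs. noisy cosines on a common grid.
rng = np.random.default_rng(0)
grid = np.linspace(0.0, 1.0, 50)
curves, labels = [], []
for i in range(100):
    phase = rng.uniform(0, 0.2)
    base = (np.sin if i % 2 else np.cos)(2 * np.pi * (grid + phase))
    curves.append(base + 0.1 * rng.normal(size=grid.size))
    labels.append(i % 2)

X = np.array([spline_coefficients(grid, y) for y in curves])
clf = MLPClassifier(hidden_layer_sizes=(16,), max_iter=2000, random_state=0)
clf.fit(X, labels)
print(clf.score(X, labels))
```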

    Support vector machine for functional data classification

    In many applications, input data are sampled functions taking their values in infinite-dimensional spaces rather than standard vectors. This fact has consequences for data analysis algorithms that motivate their modification. Indeed, most traditional data analysis tools for regression, classification and clustering have been adapted to functional inputs under the general name of Functional Data Analysis (FDA). In this paper, we investigate the use of Support Vector Machines (SVMs) for functional data analysis, focusing on the problem of curve discrimination. SVMs are large-margin classifiers based on implicit nonlinear mappings of the data into high-dimensional spaces via kernels. We show how to define simple kernels that take into account the functional nature of the data and lead to consistent classification. Experiments conducted on real-world data emphasize the benefit of taking into account some functional aspects of the problems.
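    The paper's specific kernels are not reproduced in the abstract. As an illustrative stand-in for a kernel that respects the functional nature of the data, the sketch below builds a Gaussian kernel on numerically differentiated curves (a common FDA transformation, so the kernel compares shapes rather than levels) and passes it to an SVM as a precomputed Gram matrix; the transformation and parameters are assumptions for demonstration only.

```python
import numpy as np
from sklearn.svm import SVC

def derivative_gaussian_gram(curves, dt, gamma=1.0):
    """Gram matrix of a Gaussian kernel applied to the first
    derivatives of the sampled curves (finite differences), with the
    squared distances scaled by dt to approximate an L2 integral."""
    D = np.gradient(curves, dt, axis=1)  # curve-wise derivatives
    sq = ((D[:, None, :] - D[None, :, :]) ** 2).sum(axis=2) * dt
    return np.exp(-gamma * sq)

# Toy data: phase-shifted sines (class 0) vs. cosines (class 1).
rng = np.random.default_rng(1)
t = np.linspace(0.0, 1.0, 40)
X = np.array([np.sin(2 * np.pi * t + rng.uniform(0, 0.3)) if i % 2 == 0
              else np.cos(2 * np.pi * t + rng.uniform(0, 0.3))
              for i in range(60)])
y = np.arange(60) % 2

G = derivative_gaussian_gram(X, dt=t[1] - t[0], gamma=0.1)
clf = SVC(kernel="precomputed").fit(G, y)
print(clf.score(G, y))
```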

    Reproducing Kernel Banach Spaces with the $\ell^1$ Norm

    Targeting sparse learning, we construct Banach spaces $B$ of functions on an input space $X$ with the properties that (1) $B$ possesses an $\ell^1$ norm in the sense that it is isometrically isomorphic to the Banach space of integrable functions on $X$ with respect to the counting measure; (2) point evaluations are continuous linear functionals on $B$ and are representable through a bilinear form with a kernel function; (3) regularized learning schemes on $B$ satisfy the linear representer theorem. Examples of kernel functions admissible for the construction of such spaces are given.
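    The construction itself is functional-analytic, but the regularized learning scheme it supports has a familiar finite-sample form once the representer theorem applies: minimize a data-fit term plus an $\ell^1$ penalty on the coefficients of a kernel expansion. A minimal sketch under that reading (the Gaussian kernel, regularization weight, and toy data are arbitrary assumptions, not the paper's examples) is:

```python
import numpy as np
from sklearn.linear_model import Lasso

def gaussian_kernel_matrix(X, Z, gamma=1.0):
    """K[i, j] = exp(-gamma * |X[i] - Z[j]|^2)."""
    sq = ((X[:, None, :] - Z[None, :, :]) ** 2).sum(axis=2)
    return np.exp(-gamma * sq)

# Toy regression: learn f(x) = sin(3x) from noisy samples with a
# sparse kernel expansion f(x) = sum_j c_j k(x, x_j), |c|_1 penalized.
rng = np.random.default_rng(2)
X = rng.uniform(-1, 1, size=(80, 1))
y = np.sin(3 * X[:, 0]) + 0.05 * rng.normal(size=80)

K = gaussian_kernel_matrix(X, X, gamma=5.0)
model = Lasso(alpha=1e-3, max_iter=50000).fit(K, y)
print("nonzero coefficients:", np.count_nonzero(model.coef_), "of", K.shape[0])
```

    The $\ell^1$ penalty typically leaves most expansion coefficients at exactly zero, which is the sparsity the abstract targets; an $\ell^2$ (kernel ridge) penalty would instead spread weight over all samples.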

    Uncertainty Relations for Shift-Invariant Analog Signals

    The past several years have witnessed a surge of research investigating various aspects of sparse representations and compressed sensing. Most of this work has focused on the finite-dimensional setting in which the goal is to decompose a finite-length vector into a given finite dictionary. Underlying many of these results is the conceptual notion of an uncertainty principle: a signal cannot be sparsely represented in two different bases. Here, we extend these ideas and results to the analog, infinite-dimensional setting by considering signals that lie in a finitely-generated shift-invariant (SI) space. This class of signals is rich enough to include many interesting special cases such as multiband signals and splines. By adapting the notion of coherence defined for finite dictionaries to infinite SI representations, we develop an uncertainty principle similar in spirit to its finite counterpart. We demonstrate tightness of our bound by considering a bandlimited lowpass train that achieves the uncertainty principle. Building upon these results and similar work in the finite setting, we show how to find a sparse decomposition in an overcomplete dictionary by solving a convex optimization problem. The distinguishing feature of our approach is the fact that even though the problem is defined over an infinite domain with infinitely many variables and constraints, under certain conditions on the dictionary spectrum our algorithm can find the sparsest representation by solving a finite-dimensional problem.
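    The SI-space machinery is infinite-dimensional, but per the abstract the convex program it reduces to is finite. As a finite-dimensional analogue, the sketch below finds a sparse decomposition in an overcomplete dictionary by $\ell^1$ minimization (basis pursuit), solved via the standard linear-programming reformulation; the identity-plus-DCT dictionary and problem sizes are illustrative assumptions, not the paper's setup.

```python
import numpy as np
from scipy.optimize import linprog

def basis_pursuit(D, s):
    """Solve min |x|_1 subject to D x = s via the standard LP split
    x = u - v with u, v >= 0."""
    n = D.shape[1]
    c = np.ones(2 * n)               # objective: sum(u) + sum(v) = |x|_1
    A_eq = np.hstack([D, -D])        # D(u - v) = s
    res = linprog(c, A_eq=A_eq, b_eq=s, bounds=(0, None))
    u, v = res.x[:n], res.x[n:]
    return u - v

# Overcomplete dictionary: spikes (identity) plus DCT-like atoms.
m = 32
dct = np.cos(np.pi * np.outer(np.arange(m) + 0.5, np.arange(m)) / m)
D = np.hstack([np.eye(m), dct / np.sqrt(m / 2)])

x_true = np.zeros(2 * m)
x_true[[3, 40]] = [1.0, -0.5]        # 2-sparse: one spike, one DCT atom
s = D @ x_true
x_hat = basis_pursuit(D, s)
support = np.nonzero(np.abs(x_hat) > 1e-6)[0]
print("recovered support:", support)  # expected: indices 3 and 40
```

    Because the spike/DCT pair has low mutual coherence, a 2-sparse signal is within the classical recovery threshold, so the $\ell^1$ program returns the true support exactly here.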