
ZerNet: Convolutional Neural Networks on Arbitrary Surfaces via Zernike Local Tangent Space Estimation

Abstract

In this paper, we propose a novel formulation that extends CNNs to two-dimensional (2D) manifolds using a family of orthogonal basis functions known as Zernike polynomials. In many areas, geometric features play a key role in understanding scientific phenomena; thus, the ability to codify geometric features into mathematical quantities can be critical. Recently, convolutional neural networks (CNNs) have demonstrated a promising capability for extracting and codifying features from visual information. However, this progress has been concentrated on computer vision applications in which the data possess an inherent grid-like structure. In contrast, many geometry processing problems are defined on curved surfaces, where the generalization of CNNs is far from trivial. The difficulties are rooted in the lack of key ingredients such as a canonical grid-like representation, a notion of consistent orientation, and a compatible local topology across the domain. In this paper, we prove that the convolution of two functions can be represented as a simple dot product of their Zernike polynomial coefficients, and that rotating a convolution kernel amounts to applying a set of 2-by-2 rotation matrices to those coefficients. As such, the key contribution of this work resides in a concise but rigorous mathematical generalization of the CNN building blocks.
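The two statements above can be made concrete with a brief sketch. The notation here (patch function f, kernel g, coefficients a_{nm} and b_{nm}, normalization constants c_{nm}) is illustrative rather than the paper's own, and the sign convention of the rotation matrix depends on the chosen basis and rotation direction; the exact formulation is given in the full text.

% Expand a local tangent-plane patch f and a kernel g in the real Zernike basis on the
% unit disk, Z_n^m(rho, phi) = R_n^{|m|}(rho) cos(m phi) for m >= 0 and
% R_n^{|m|}(rho) sin(|m| phi) for m < 0:
\[
  f(\rho,\varphi) \approx \sum_{n,m} a_{nm}\, Z_n^m(\rho,\varphi), \qquad
  g(\rho,\varphi) \approx \sum_{n,m} b_{nm}\, Z_n^m(\rho,\varphi).
\]
% By orthogonality of the basis, correlating f with g over the patch reduces to a
% weighted dot product of the two coefficient vectors:
\[
  \int_0^{2\pi}\!\!\int_0^1 f(\rho,\varphi)\, g(\rho,\varphi)\, \rho\, d\rho\, d\varphi
    = \sum_{n,m} c_{nm}\, a_{nm}\, b_{nm},
  \qquad
  c_{nm} = \int_0^{2\pi}\!\!\int_0^1 \bigl( Z_n^m(\rho,\varphi) \bigr)^2 \rho\, d\rho\, d\varphi.
\]
% Rotating the kernel by an angle theta, g_theta(rho, phi) = g(rho, phi - theta), mixes only
% the (m, -m) pair of coefficients through a 2-by-2 rotation matrix of angle m*theta:
\[
  \begin{pmatrix} b_{nm}^{\theta} \\ b_{n,-m}^{\theta} \end{pmatrix}
    = \begin{pmatrix} \cos m\theta & -\sin m\theta \\ \sin m\theta & \cos m\theta \end{pmatrix}
      \begin{pmatrix} b_{nm} \\ b_{n,-m} \end{pmatrix}.
\]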
