
    Greedy Shallow Networks: An Approach for Constructing and Training Neural Networks

    We present a greedy approach to constructing an efficient single-hidden-layer neural network with the ReLU activation that approximates a target function. In our approach we obtain a shallow network by applying a greedy algorithm to a prescribed dictionary built from the available training data and a set of candidate inner weights. To facilitate the greedy selection process we employ an integral representation of the network, based on the ridgelet transform, that significantly reduces the cardinality of the dictionary and hence makes the greedy selection feasible. Our approach allows for the construction of efficient architectures which can be treated either as improved initializations to be used in place of random alternatives, or, in certain cases, as fully trained networks, thus potentially removing the need for backpropagation training. Numerical experiments demonstrate the viability of the proposed concept and its advantages over conventional techniques for selecting architectures and initializations for neural networks.
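    The core idea of the abstract can be illustrated with a minimal sketch. This is not the paper's algorithm (in particular, it omits the ridgelet-based dictionary pruning): it simply enumerates a small grid of candidate inner weights (w, b), greedily picks the ReLU atom most correlated with the current residual, and refits the outer coefficients by least squares at each step. All names and parameter choices here are illustrative assumptions.

    ```python
    import numpy as np

    x = np.linspace(-1.0, 1.0, 200)          # training inputs
    y = np.sin(np.pi * x)                    # target function to approximate

    # Dictionary of candidate inner weights (w, b); the paper would shrink
    # this set via the ridgelet transform, here we just enumerate a grid.
    candidates = [(w, b) for w in (-2.0, -1.0, 1.0, 2.0)
                  for b in np.linspace(-1.0, 1.0, 21)]

    def relu(z):
        return np.maximum(z, 0.0)

    selected = []                            # chosen (w, b) pairs
    residual = y.copy()
    for _ in range(10):                      # greedily add 10 neurons
        # pick the atom most correlated with the current residual
        best = max(candidates,
                   key=lambda wb: abs(np.dot(relu(wb[0] * x + wb[1]), residual)))
        selected.append(best)
        # refit all outer coefficients jointly (orthogonal-greedy style)
        A = np.stack([relu(w * x + b) for w, b in selected], axis=1)
        coef, *_ = np.linalg.lstsq(A, y, rcond=None)
        residual = y - A @ coef

    print(f"relative L2 error: {np.linalg.norm(residual) / np.linalg.norm(y):.3f}")
    ```

    The resulting (w, b, coef) triples define a shallow ReLU network directly, which is the sense in which such a construction can serve as either an initialization or, when the fit is good enough, a trained network.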

    Ridgelet-based signature for natural image classification

    This paper presents an approach to grouping natural scenes into (semantically) meaningful categories. The proposed approach exploits the statistics of natural scenes to define relevant image categories. A ridgelet-based signature is used to represent images. This signature is fed to a support vector classifier, which is well suited to high-dimensional features, resulting in an effective recognition system. To illustrate the potential of the approach, several binary-classification experiments (e.g. city/landscape or indoor/outdoor) are conducted on databases of natural scenes.
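    To make the pipeline concrete, here is a hedged sketch of a directional image signature. It is a crude stand-in for the paper's ridgelet-based signature, not a reproduction of it: it bins the image's Fourier energy into angular wedges, which captures the same kind of orientation statistics that ridgelet coefficients summarize. The function name and parameters are assumptions for illustration; the resulting vectors would be the features passed to a support vector classifier.

    ```python
    import numpy as np

    def directional_signature(img, n_angles=8):
        """Energy of the 2-D Fourier spectrum in angular wedges, normalized."""
        f = np.fft.fftshift(np.fft.fft2(img))
        h, w = img.shape
        yy, xx = np.mgrid[-h // 2:h - h // 2, -w // 2:w - w // 2]
        theta = np.arctan2(yy, xx) % np.pi        # fold opposite directions
        bins = (theta / np.pi * n_angles).astype(int).clip(0, n_angles - 1)
        energy = np.abs(f) ** 2
        sig = np.array([energy[bins == k].sum() for k in range(n_angles)])
        return sig / sig.sum()                    # normalize to a distribution

    # Vertical stripes concentrate spectral energy along the horizontal
    # frequency axis, horizontal stripes along the vertical one, so the
    # two images yield clearly different signatures.
    x = np.linspace(0, 8 * np.pi, 64)
    vertical = np.tile(np.sin(x), (64, 1))        # vertical stripes
    horizontal = vertical.T                       # horizontal stripes
    print(directional_signature(vertical).round(2))
    print(directional_signature(horizontal).round(2))
    ```

    A real ridgelet signature additionally resolves scale and line position, but even this coarse orientation histogram separates scene classes whose dominant edge directions differ (e.g. city facades versus open landscapes).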

    Quantum Ridgelet Transform: Winning Lottery Ticket of Neural Networks with Quantum Computation

    A significant challenge in the field of quantum machine learning (QML) is to establish applications of quantum computation that accelerate common machine-learning tasks, such as those involving neural networks. The ridgelet transform has been a fundamental mathematical tool in the theoretical study of neural networks, but its practical applicability to learning tasks has been limited, since its numerical implementation by conventional classical computation requires an exponential runtime exp(O(D)) as the data dimension D increases. To address this problem, we develop a quantum ridgelet transform (QRT), which implements the ridgelet transform of a quantum state within a linear runtime O(D) of quantum computation. As an application, we also show that QRT can be used as a fundamental subroutine for QML to efficiently find a sparse trainable subnetwork of a large shallow wide neural network without conducting large-scale optimization of the original network. This application provides an efficient way, in this regime, to demonstrate the lottery ticket hypothesis on finding such a sparse trainable neural network. These results open an avenue of QML for accelerating learning tasks with commonly used classical neural networks. Comment: 27 pages, 4 figures.

    Signature of a Cosmic String Wake at z = 3

    In this paper, we describe the results of N-body simulation runs that include a cosmic string wake of tension Gμ = 4 × 10⁻⁸ on top of the usual ΛCDM fluctuations. To obtain a higher resolution of the wake in the simulations compared to previous work, we insert the effects of the string wake at a lower redshift and perform the simulations in a smaller volume. A curvelet analysis of the wake and no-wake maps is applied, indicating that the presence of a wake can be extracted at a three-sigma confidence level from maps of the two-dimensional dark matter projection down to a redshift of z = 3. Comment: 8 pages, 6 figures; we have improved the analysis and results, and the text now agrees with the published version.