
    Artificial Neural Networks

    Artificial neural networks (ANNs) constitute a class of flexible nonlinear models designed to mimic biological neural systems. In this entry, we introduce ANNs using familiar econometric terminology and provide an overview of the ANN modeling approach and its implementation methods.
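
    The "flexible nonlinear model" framing used in this entry typically refers to a single-hidden-layer network of the form y_i = sum_j beta_j G(gamma_j' x_i) + e_i with a logistic activation G. The sketch below is illustrative rather than taken from the entry; the function names, activation choice, and hyperparameters are assumptions.

    # Minimal sketch of an ANN as a nonlinear regression, fit by gradient descent.
    # Illustrative only: names, learning rate, and architecture are assumptions.
    import numpy as np

    rng = np.random.default_rng(0)

    def sigmoid(z):
        return 1.0 / (1.0 + np.exp(-z))

    def fit_ann(X, y, hidden=8, lr=0.05, epochs=2000):
        """Fit y ~ sum_j beta_j * sigmoid(X @ gamma_j + c_j) by plain gradient descent."""
        n, d = X.shape
        Gamma = rng.normal(scale=0.5, size=(d, hidden))   # input-to-hidden weights
        c = np.zeros(hidden)                              # hidden intercepts
        beta = rng.normal(scale=0.5, size=hidden)         # hidden-to-output weights
        b0 = 0.0                                          # output intercept
        for _ in range(epochs):
            H = sigmoid(X @ Gamma + c)                    # hidden-unit activations
            resid = H @ beta + b0 - y                     # least-squares residuals
            # gradients of the mean squared error
            g_beta = H.T @ resid / n
            g_b0 = resid.mean()
            g_H = np.outer(resid, beta) * H * (1 - H) / n
            g_Gamma = X.T @ g_H
            g_c = g_H.sum(axis=0)
            beta -= lr * g_beta; b0 -= lr * g_b0
            Gamma -= lr * g_Gamma; c -= lr * g_c
        return Gamma, c, beta, b0

    # Toy nonlinear regression: the network approximates an unknown smooth function.
    X = rng.uniform(-2, 2, size=(200, 1))
    y = np.sin(2 * X[:, 0]) + 0.1 * rng.normal(size=200)
    Gamma, c, beta, b0 = fit_ann(X, y)
    rmse = np.sqrt(np.mean((sigmoid(X @ Gamma + c) @ beta + b0 - y) ** 2))
    print(f"in-sample RMSE: {rmse:.3f}")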

    Differentiable approximation by means of the Radon transformation and its applications to neural networks

    We treat the problem of simultaneously approximating a several-times differentiable function of several variables and its derivatives by a superposition of a function, say g, of one variable. In our theory, the domain of approximation can be either a compact subset or the whole Euclidean space Rd. We prove that if the domain is compact, the function g can be used without scaling, and that even when the domain of approximation is the whole space Rd, g can be used without scaling if it satisfies a certain condition. Moreover, g can be chosen from a wide class of functions. The basic tool is the inverse Radon transform. Since a neural network can output a superposition of g, our results extend well-known neural approximation theorems that are useful in neural computation theory.
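
    The "superposition of a function g of one variable" in such results is the ridge-function form produced by a one-hidden-layer network. The notation below is a hedged reconstruction of that standard form, not taken verbatim from the paper; the symbols K, a_k, b_k, c_k are illustrative.

    % Ridge-function superposition assumed by results of this type (notation illustrative):
    % each term applies the univariate g along a direction a_k in R^d.
    \[
      f(x) \;\approx\; \sum_{k=1}^{K} c_k \, g\!\left(a_k \cdot x - b_k\right),
      \qquad x \in \mathbb{R}^d,\; a_k \in \mathbb{R}^d,\; b_k, c_k \in \mathbb{R},
    \]
    % i.e. the output of a one-hidden-layer network with activation g. The claim is that
    % f and its derivatives up to a given order are approximated simultaneously, with the
    % inverse Radon transform used to recover suitable directions a_k.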

    The Shallow and the Deep: A biased introduction to neural networks and old school machine learning

    The Shallow and the Deep is a collection of lecture notes that offers an accessible introduction to neural networks and machine learning in general. However, it was clear from the beginning that these notes would not be able to cover this rapidly changing and growing field in its entirety. The focus lies on classical machine learning techniques, with a bias towards classification and regression. Other learning paradigms and many recent developments in, for instance, Deep Learning are not addressed or only briefly touched upon. Biehl argues that having a solid knowledge of the foundations of the field is essential, especially for anyone who wants to explore the world of machine learning with an ambition that goes beyond the application of some software package to some data set. Therefore, The Shallow and the Deep places emphasis on fundamental concepts and theoretical background. This also involves delving into the history and pre-history of neural networks, where the foundations for most of the recent developments were laid. These notes aim to demystify machine learning and neural networks without losing the appreciation for their impressive power and versatility.