
    Artificial Neural Networks

    Artificial neural networks (ANNs) constitute a class of flexible nonlinear models designed to mimic biological neural systems. In this entry, we introduce ANNs using familiar econometric terminology and provide an overview of the ANN modeling approach and its implementation methods.
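
    As a minimal sketch of the "flexible nonlinear model" view (our illustration, not the entry's notation): a single-hidden-layer feedforward ANN is simply a nonlinear regression in which the regressors are passed through a layer of nonlinear transforms before a final linear combination.

```python
import numpy as np

def ann_forward(x, W1, b1, W2, b2):
    """Single-hidden-layer feedforward ANN:
    y = W2 @ tanh(W1 @ x + b1) + b2.
    The weights below are random placeholders; in practice they are
    estimated from data (e.g. by nonlinear least squares / backpropagation)."""
    h = np.tanh(W1 @ x + b1)   # hidden units: nonlinear transforms of the inputs
    return W2 @ h + b2         # output: linear combination of hidden units

# illustrative shapes: 3 inputs (regressors), 5 hidden units, 1 output
rng = np.random.default_rng(0)
W1, b1 = rng.normal(size=(5, 3)), np.zeros(5)
W2, b2 = rng.normal(size=(1, 5)), np.zeros(1)
print(ann_forward(np.array([0.2, -1.0, 0.5]), W1, b1, W2, b2))
```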

    Generalization Error in Deep Learning

    Deep learning models have lately shown great performance in various fields such as computer vision, speech recognition, speech translation, and natural language processing. However, alongside their state-of-the-art performance, the source of their generalization ability remains largely unclear. An important question is therefore what makes deep neural networks able to generalize well from the training set to new data. In this article, we provide an overview of the existing theory and bounds that characterize the generalization error of deep neural networks, combining both classical and more recent theoretical and empirical results.
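
    For reference, the quantity usually meant by "generalization error" is the gap between expected and empirical risk; the definitions below are standard, and the notation is ours rather than taken from any specific bound in the article.

```latex
% Risk definitions for a hypothesis f learned from an i.i.d. sample
% S = {(x_i, y_i)}_{i=1}^n drawn from a distribution D, with loss function l.
\[
  R(f) = \mathbb{E}_{(x,y)\sim\mathcal{D}}\big[\ell(f(x), y)\big]
  \qquad \text{(expected risk)},
\]
\[
  \hat{R}_S(f) = \frac{1}{n}\sum_{i=1}^{n} \ell(f(x_i), y_i)
  \qquad \text{(empirical risk)},
\]
\[
  \text{generalization gap} \;=\; R(f) - \hat{R}_S(f).
\]
```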

    Neural Networks

    We present an overview of current research on artificial neural networks, emphasizing a statistical perspective. We view neural networks as parameterized graphs that make probabilistic assumptions about data, and view learning algorithms as methods for finding parameter values that look probable in the light of the data. We discuss basic issues in representation and learning, and treat some of the practical issues that arise in fitting networks to data. We also discuss links between neural networks and the general formalism of graphical models.
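
    In standard statistical terms, "parameter values that look probable in the light of the data" corresponds to maximum-likelihood or maximum-a-posteriori estimation of the network weights; the formulation below is generic, not a formula quoted from the paper.

```latex
% A network with parameters theta defines p(y | x, theta); given data {(x_i, y_i)}_{i=1}^n:
\[
  \hat{\theta}_{\mathrm{ML}} = \arg\max_{\theta} \prod_{i=1}^{n} p(y_i \mid x_i, \theta),
  \qquad
  \hat{\theta}_{\mathrm{MAP}} = \arg\max_{\theta} \; p(\theta)\prod_{i=1}^{n} p(y_i \mid x_i, \theta).
\]
```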

    Geometric deep learning: going beyond Euclidean data

    Many scientific fields study data with an underlying structure that is a non-Euclidean space. Some examples include social networks in computational social sciences, sensor networks in communications, functional networks in brain imaging, regulatory networks in genetics, and meshed surfaces in computer graphics. In many applications, such geometric data are large and complex (in the case of social networks, on the scale of billions), and are natural targets for machine learning techniques. In particular, we would like to use deep neural networks, which have recently proven to be powerful tools for a broad range of problems in computer vision, natural language processing, and audio analysis. However, these tools have been most successful on data with an underlying Euclidean or grid-like structure, and in cases where the invariances of these structures are built into the networks used to model them. Geometric deep learning is an umbrella term for emerging techniques attempting to generalize (structured) deep neural models to non-Euclidean domains such as graphs and manifolds. The purpose of this paper is to overview different examples of geometric deep learning problems and present available solutions, key difficulties, applications, and future research directions in this nascent field.
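
    As one concrete instance of extending neural networks to graph-structured (non-Euclidean) data, a graph-convolution layer in the common GCN style propagates node features through a normalized adjacency matrix. This is a generic textbook formulation, not a specific model from the paper.

```python
import numpy as np

def gcn_layer(A, H, W):
    """One GCN-style graph-convolution layer:
    H' = ReLU(D^{-1/2} (A + I) D^{-1/2} H W),
    where A is the (n, n) adjacency matrix, H the (n, d_in) node features,
    and W the (d_in, d_out) weight matrix."""
    A_hat = A + np.eye(A.shape[0])             # add self-loops
    d = A_hat.sum(axis=1)
    D_inv_sqrt = np.diag(1.0 / np.sqrt(d))     # symmetric degree normalization
    H_new = D_inv_sqrt @ A_hat @ D_inv_sqrt @ H @ W
    return np.maximum(H_new, 0.0)              # ReLU nonlinearity

# tiny 4-node example graph with one-hot node features
A = np.array([[0, 1, 0, 0],
              [1, 0, 1, 1],
              [0, 1, 0, 1],
              [0, 1, 1, 0]], dtype=float)
H = np.eye(4)
W = np.random.default_rng(0).normal(size=(4, 2))
print(gcn_layer(A, H, W).shape)                # (4, 2)
```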

    Neural Networks for Constitutive Modeling -- From Universal Function Approximators to Advanced Models and the Integration of Physics

    Analyzing and modeling the constitutive behavior of materials is a core area in materials sciences and a prerequisite for conducting numerical simulations in which the material behavior plays a central role. Constitutive models have been developed since the beginning of the 19th century and are still under constant development. Besides physics-motivated and phenomenological models, the field of constitutive modeling has, over the last decades, been enriched by machine learning-based constitutive models, especially those using neural networks. The latter are the focus of the present review, which aims to give an overview of neural network-based constitutive models from a methodological perspective. The review summarizes and compares numerous conceptually different neural network-based approaches to constitutive modeling, including neural networks used as universal function approximators, advanced neural network models, and neural network approaches with integrated physical knowledge. The emergence of these methods is, in turn, closely related to advances in computer science, which adds a chronological aspect to this review. We conclude this review paper with important challenges in the field of learning constitutive relations that need to be tackled in the near future.
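
    A minimal sketch of the "neural network as universal function approximator" approach to constitutive modeling: fit a small network mapping strain to stress on synthetic data generated from a hypothetical nonlinear elastic law. All numbers, the law itself, and the network size are illustrative assumptions, not taken from the review.

```python
import numpy as np

# Synthetic 1D stress-strain data from a hypothetical nonlinear elastic law
# (illustrative only; real data would come from experiments or simulations).
rng = np.random.default_rng(0)
eps = np.linspace(-0.05, 0.05, 200)          # strain
sig = 200e3 * eps + 5e7 * eps**3             # assumed "true" stress response

# One-hidden-layer surrogate: sigma_hat(eps) = w2 . tanh(w1 * eps + b1) + b2
n_h = 16
w1 = rng.normal(0, 1.0, n_h); b1 = np.zeros(n_h)
w2 = rng.normal(0, 0.1, n_h); b2 = 0.0

# scale inputs and outputs so plain gradient descent behaves well
x = eps / 0.05
t = sig / np.abs(sig).max()

lr = 0.1
for step in range(5000):
    h = np.tanh(np.outer(x, w1) + b1)        # (N, n_h) hidden activations
    y = h @ w2 + b2                          # (N,) predicted (scaled) stress
    err = y - t
    # gradients of the mean squared error (constant factor 2 absorbed into lr)
    g_w2 = h.T @ err / len(x)
    g_b2 = err.mean()
    dh = np.outer(err, w2) * (1 - h**2)
    g_w1 = (dh * x[:, None]).mean(axis=0)
    g_b1 = dh.mean(axis=0)
    w1 -= lr * g_w1; b1 -= lr * g_b1
    w2 -= lr * g_w2; b2 -= lr * g_b2

print("final MSE (scaled units):", float(np.mean(err**2)))
```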