
    Artificial Neural Networks and Their Applications in Business

    In modern software implementations of artificial neural networks, the biologically inspired approach has largely been abandoned in favor of a more practical approach based on statistics and signal processing. In some of these systems, neural networks, or parts of neural networks (such as artificial neurons), are used as components in larger systems that combine both adaptive and non-adaptive elements. Many problems are solved with neural networks, especially in business and economic domains. Keywords: neuron, neural networks, artificial intelligence, feed-forward neural networks, classification

    Parametrization of stochastic inputs using generative adversarial networks with application in geology

    We investigate artificial neural networks as a parametrization tool for stochastic inputs in numerical simulations. We address parametrization from the point of view of emulating the data-generating process, instead of explicitly constructing a parametric form to preserve predefined statistics of the data. This is done by training a neural network to generate samples from the data distribution using a recent deep learning technique called generative adversarial networks. By emulating the data-generating process, the relevant statistics of the data are replicated. The method is assessed in subsurface flow problems, where effective parametrization of underground properties such as permeability is important due to the high dimensionality and presence of strong spatial correlations. We experiment with realizations of binary channelized subsurface permeability and perform uncertainty quantification and parameter estimation. Results show that the parametrization using generative adversarial networks is very effective in preserving visual realism as well as high-order statistics of the flow responses, while achieving a dimensionality reduction of two orders of magnitude.
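The core idea of the abstract above, replacing an explicit parametric form with a generator that maps a low-dimensional latent code to a high-dimensional field, can be sketched as follows. This is a conceptual illustration only: the generator here is an untrained random linear map with a squashing nonlinearity standing in for a trained GAN generator, and the latent and field dimensions are assumed for illustration, not taken from the paper.

```python
import numpy as np

rng = np.random.default_rng(0)

LATENT_DIM = 3      # parametrization dimension (assumed for illustration)
FIELD_CELLS = 100   # grid cells in the stand-in "permeability" field

# Fixed random weights stand in for a trained GAN generator's parameters.
W = rng.normal(size=(FIELD_CELLS, LATENT_DIM))

def generator(z):
    """Map a latent code z to a pseudo-binary field with values in (0, 1)."""
    return 0.5 * (np.tanh(W @ z) + 1.0)

# Sampling the data distribution reduces to sampling the small latent space;
# downstream tasks (uncertainty quantification, parameter estimation) then
# search over z instead of over the full high-dimensional field.
z = rng.normal(size=LATENT_DIM)
field = generator(z)

print(field.shape)  # (100,) -- a 100-cell field controlled by 3 parameters
```

The dimensionality reduction reported in the abstract corresponds to the ratio between the field size and the latent size; here three latent parameters control a hundred-cell field.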

    Spiking Optical Patterns and Synchronization

    We analyze the time-resolved spike statistics of a solitary and two mutually interacting chaotic semiconductor lasers whose chaos is characterized by apparently random, short intensity spikes. Repulsion between two successive spikes is observed, resulting in a refractory period which is largest at laser threshold. For time intervals between spikes greater than the refractory period, the distribution of the intervals follows a Poisson distribution. The spiking pattern is highly periodic over time windows corresponding to the optical length of the external cavity, with a slow change of the spiking pattern as time increases. When zero-lag synchronization between the two lasers is established, the statistics of the nearly perfectly matched spikes are not altered. The similarity of these features to those found in complex interacting neural networks suggests the use of laser systems as simpler physical models for neural networks.
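The inter-spike statistics described above, a hard refractory period followed by Poisson-distributed intervals, can be reproduced with a simple simulation. The refractory period and spike rate below are arbitrary illustrative values, not figures from the paper.

```python
import numpy as np

rng = np.random.default_rng(1)

T_REFRACTORY = 0.5  # refractory period (arbitrary time units; assumed)
RATE = 2.0          # Poisson rate beyond the refractory period (assumed)

# Each inter-spike interval is a fixed refractory gap plus an exponentially
# distributed waiting time -- the signature of a Poisson process with a
# dead time, as observed in the laser spike trains.
isi = T_REFRACTORY + rng.exponential(1.0 / RATE, size=100_000)

print(isi.min() >= T_REFRACTORY)  # True: no two spikes closer than the gap
print(round(isi.mean(), 1))       # ≈ T_REFRACTORY + 1/RATE = 1.0
```

A histogram of `isi - T_REFRACTORY` would decay exponentially, which is the Poisson-interval behavior the abstract reports for intervals beyond the refractory period.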

    Computing Vertex Centrality Measures in Massive Real Networks with a Neural Learning Model

    Vertex centrality measures are a multi-purpose analysis tool, commonly used in many application environments to retrieve information and unveil knowledge from the graphs and network structural properties. However, the algorithms for such metrics are expensive in terms of computational resources when running on real-time applications or massive real-world networks. Thus, approximation techniques have been developed and used to compute the measures in such scenarios. In this paper, we demonstrate and analyze the use of neural network learning algorithms to tackle this task and compare their performance in terms of solution quality and computation time with other techniques from the literature. Our work offers several contributions. We highlight both the pros and cons of approximating centralities through neural learning. By empirical means and statistics, we then show that the regression model generated with a feedforward neural network trained by the Levenberg-Marquardt algorithm is not only the best option considering computational resources, but also achieves the best solution quality for relevant applications and large-scale networks. Keywords: Vertex Centrality Measures, Neural Networks, Complex Network Models, Machine Learning, Regression Model. Comment: 8 pages, 5 tables, 2 figures, version accepted at IJCNN 2018. arXiv admin note: text overlap with arXiv:1810.1176
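The regression idea in the abstract above, learning to predict an expensive centrality measure from cheap features, can be sketched in a heavily simplified form. The paper trains a feedforward network with Levenberg-Marquardt; here a linear least-squares fit on a degree feature stands in, which keeps the example dependency-free. The graph, feature choice, and fit below are illustrative assumptions, not the paper's setup.

```python
import numpy as np
from collections import deque

rng = np.random.default_rng(2)

# Build a small connected graph: a ring plus random chords.
N = 30
adj = [set() for _ in range(N)]
for i in range(N):
    adj[i].add((i + 1) % N)
    adj[(i + 1) % N].add(i)
for _ in range(40):
    a, b = rng.integers(0, N, size=2)
    if a != b:
        adj[a].add(b)
        adj[b].add(a)

def closeness(v):
    """Exact closeness centrality via BFS -- the 'expensive' target measure."""
    dist = [-1] * N
    dist[v] = 0
    q = deque([v])
    while q:
        u = q.popleft()
        for w in adj[u]:
            if dist[w] < 0:
                dist[w] = dist[u] + 1
                q.append(w)
    return (N - 1) / sum(dist)

# Cheap per-vertex features (bias + degree) regressed onto the exact values;
# the fitted model then predicts centrality without repeating the BFS work.
X = np.array([[1.0, len(adj[v])] for v in range(N)])
y = np.array([closeness(v) for v in range(N)])
coef, *_ = np.linalg.lstsq(X, y, rcond=None)
pred = X @ coef

print(pred.shape)  # (30,) -- one approximate centrality per vertex
```

On massive networks the payoff is that the exact measure is computed only for a training sample of vertices, while the cheap regression covers the rest.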