
    APPLICATION OF NEURAL NETWORKS IN PREDICTIVE DATA MINING

    Neural networks represent a meaningfully different approach to using computers in the workplace. A neural network is used to learn patterns and relationships in data. The data may be the results of a market research effort, or the results of a production process under varying operational conditions. Regardless of the specifics involved, applying a neural network is a substantial departure from traditional approaches. In this paper we look into how neural networks are used in data mining. The ultimate goal of data mining is prediction, and predictive data mining is the most common type of data mining and the one with the most direct business applications. Therefore, we consider how this technique can be used to classify the performance status of a departmental store in monitoring its products.
    Keywords: neural networks, data mining, prediction
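    As a rough illustration of the idea in this abstract (not the paper's actual model), the sketch below trains a single artificial neuron on invented department-store indicators to predict a binary performance status. The feature names, data values, learning rate and epoch count are all assumptions chosen for the example.

    ```python
    import numpy as np

    # Invented example data: each row is one department's
    # [sales index, stock turnover]; label 1 = "good", 0 = "poor" performance.
    X = np.array([[0.9, 0.8], [0.8, 0.9], [0.2, 0.3], [0.1, 0.2]])
    y = np.array([1, 1, 0, 0])

    rng = np.random.default_rng(0)
    w = rng.normal(size=2)      # weights of a single neuron
    b = 0.0                     # bias

    # Perceptron learning: nudge the weights whenever a prediction is wrong.
    for _ in range(100):
        for xi, yi in zip(X, y):
            pred = 1 if xi @ w + b > 0 else 0
            w += 0.1 * (yi - pred) * xi
            b += 0.1 * (yi - pred)

    # Training accuracy after the network has learned the pattern in the data.
    acc = np.mean([(1 if xi @ w + b > 0 else 0) == yi for xi, yi in zip(X, y)])
    print(acc)
    ```

    Because the toy data are linearly separable, the perceptron converges and classifies all training examples correctly; a real predictive-data-mining model would of course use held-out data to assess generalisation.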

    Data Mining In Network Without Higher Order Connections.

    Higher order neural networks (HONN) have been shown to have impressive computational, storage and learning capabilities

    Energy Relaxation For Hopfield Network With The New Learning Rule.

    In this paper, the energy relaxation time for the Little-Hopfield neural network using the new activation rule is shown to be better than the relaxation time using Hebbian learning.
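    The energy relaxation referred to above can be checked with a small simulation. This is a generic sketch of a Little-Hopfield network with ordinary Hebbian weights, not the paper's new rule; the network size, noise level and sweep count are arbitrary choices.

    ```python
    import numpy as np

    rng = np.random.default_rng(1)
    N = 20
    pattern = rng.choice([-1, 1], size=N)

    # Hebbian weights storing one bipolar pattern; zero self-connections.
    W = np.outer(pattern, pattern).astype(float)
    np.fill_diagonal(W, 0.0)

    def energy(s):
        # Lyapunov energy of the network (zero thresholds).
        return -0.5 * s @ W @ s

    # Start from a corrupted copy of the pattern and relax asynchronously.
    state = pattern.copy()
    flip = rng.choice(N, size=5, replace=False)
    state[flip] *= -1

    energies = [energy(state)]
    for _ in range(3):                      # a few asynchronous sweeps
        for i in range(N):
            state[i] = 1 if W[i] @ state >= 0 else -1
            energies.append(energy(state))

    # Energy never increases during asynchronous relaxation,
    # and the corrupted state relaxes back to the stored pattern.
    print(all(e2 <= e1 + 1e-9 for e1, e2 in zip(energies, energies[1:])))
    print(bool(np.array_equal(state, pattern)))
    ```

    With symmetric weights and zero diagonal, each asynchronous update can only lower (or preserve) the energy, which is why the relaxation time is a meaningful quantity to compare across learning rules.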

    Optimization Methods In Training Neural Networks

    There are a number of extremizing techniques for solving linear and nonlinear algebraic problems. Newton's method has a property called quadratic termination, which means that it minimizes a quadratic function exactly in a finite number of iterations. Unfortunately, it requires calculation and storage of the second derivatives of the quadratic function involved. When the number of parameters, n, is large, it may be impractical to compute all the second derivatives. This is especially true for neural networks, where practical applications can require several hundred to many thousands of weights. For these particular cases, methods that require only first derivatives but still have quadratic termination are preferred.
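    A standard example of a first-derivative-only method with quadratic termination is the linear conjugate-gradient method. The sketch below, with an arbitrary 4-by-4 symmetric positive-definite matrix, reaches the exact minimiser of a quadratic in at most n = 4 iterations using only gradients; it is a generic illustration, not code from the thesis.

    ```python
    import numpy as np

    # Quadratic objective f(w) = 0.5 * w.T @ A @ w - b @ w with a symmetric
    # positive-definite A (n = 4); the unique minimiser solves A w = b.
    A = np.array([[4., 1., 0., 0.],
                  [1., 3., 1., 0.],
                  [0., 1., 2., 1.],
                  [0., 0., 1., 2.]])
    b = np.array([1., 2., 3., 4.])
    n = len(b)

    w = np.zeros(n)
    r = b - A @ w            # residual = negative gradient (first derivative only)
    d = r.copy()             # initial search direction
    for _ in range(n):       # quadratic termination: at most n iterations
        alpha = (r @ r) / (d @ A @ d)       # exact line search along d
        w += alpha * d
        r_new = r - alpha * (A @ d)
        if r_new @ r_new < 1e-24:           # already at the minimiser
            break
        beta = (r_new @ r_new) / (r @ r)    # make next direction A-conjugate
        d = r_new + beta * d
        r = r_new

    print(np.allclose(A @ w, b))
    ```

    Unlike Newton's method, no second derivatives are ever formed or stored, which is what makes this family of methods attractive when n runs into the thousands of weights.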

    Knowledge Extraction In Hopfield Network

    Deduction simplifies the knowledge representation without affecting the knowledge content. Thus, deduction makes the clauses more compact and easier to interpret, as a large amount of redundancy may obscure the meaning of the represented knowledge.

    Activation Functions in Neuro Symbolic Integration Using Agent Based Modelling

    Logic programs and neural networks are two important aspects of artificial intelligence. This paper is part of an endeavour towards the integration of neural networks and logic programming. The goal in performing logic programming based on the energy minimization scheme is to achieve the best ratio of global minima. However, there is no guarantee of finding the best minimum in the network. To this end, activation functions are modified to accelerate the neuro-symbolic integration. These activation functions reduce the complexity of doing logic programming in a Hopfield Neural Network (HNN). The activation functions discussed in this paper are the new learning rule, the McCulloch-Pitts function and the hyperbolic tangent activation function. This paper also focuses on agent-based modelling for presenting the performance of doing logic programming in a Hopfield network using various activation functions. The effects of the activation functions are analyzed mathematically and compared with the existing method. Computer simulations are carried out using NetLogo to validate the effectiveness of the new activation function. The results obtained show that the hyperbolic tangent activation function outperforms the other activation functions in doing logic programming in a Hopfield network. The models developed by agent-based modelling also support this finding.
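    As a rough, self-contained illustration of how the choice of activation function changes the relaxation dynamics (this is not the paper's NetLogo model or its logic-programming encoding), the sketch below relaxes the same noisy state in a one-pattern Hopfield network under a McCulloch-Pitts step activation and under a hyperbolic tangent activation. The gain of 0.5, the network size and the noise level are arbitrary assumptions.

    ```python
    import numpy as np

    rng = np.random.default_rng(2)
    N = 12
    pattern = rng.choice([-1, 1], size=N).astype(float)

    # Hebbian weights storing one bipolar pattern; zero self-connections.
    W = np.outer(pattern, pattern)
    np.fill_diagonal(W, 0.0)

    def step_act(h):
        return np.where(h >= 0, 1.0, -1.0)   # McCulloch-Pitts step

    def tanh_act(h):
        return np.tanh(0.5 * h)              # hyperbolic tangent, gain 0.5

    def relax(state, activation, sweeps=20):
        for _ in range(sweeps):
            state = activation(W @ state)    # synchronous update
        return state

    noisy = pattern.copy()
    noisy[:3] *= -1                          # corrupt 3 of the 12 units

    # Both activations recover the stored pattern (up to the sign, for tanh).
    print(np.array_equal(relax(noisy.copy(), step_act), pattern))
    print(np.allclose(np.sign(relax(noisy.copy(), tanh_act)), pattern))
    ```

    The step activation snaps states to ±1 in one update, while the graded tanh units approach the attractor smoothly; the speed and quality of this relaxation is exactly the kind of behaviour the paper compares across activation functions.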

    Neuro Symbolic Integration and Agent Based Modelling

    Logic programs and neural networks are two important perspectives in artificial intelligence. The major domain of neuro-symbolic integration is designed around theories usually known as deductive systems, which lack such elements of human reasoning as adaptation, learning and self-organisation. Neural networks, by contrast, are mathematical models of neurons in the human brain; they have various abilities, provide parallel computation, and can therefore perform some calculations more quickly than classical learning algorithms. A Hopfield network is a feedback (recurrent) neural network consisting of a set of N interconnected neurons in which each neuron is linked to all the others in both directions. Its pattern of synaptic strengths gives rise to a Lyapunov function E (the energy function) that governs energy minimization. It operates as a content-addressable memory system with binary or bipolar threshold units.
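    For reference, the Lyapunov function E mentioned above has, in its standard textbook form (notation not taken from this abstract), the expression below, where $w_{ij}$ is the symmetric synaptic strength between neurons $i$ and $j$, $s_i \in \{-1, +1\}$ is the bipolar state, and $\theta_i$ is the threshold of neuron $i$:

    ```latex
    E = -\frac{1}{2}\sum_{i}\sum_{j \neq i} w_{ij}\, s_i s_j + \sum_i \theta_i s_i,
    \qquad
    s_i \leftarrow \operatorname{sgn}\!\Big(\sum_{j} w_{ij} s_j - \theta_i\Big)
    ```

    With symmetric weights ($w_{ij} = w_{ji}$) and no self-connections ($w_{ii} = 0$), each asynchronous update of the rule on the right can only decrease (or preserve) E, which is what makes the network relax into stored memories.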