40 research outputs found

    IMPROVEMENTS TO THE BACKPROPAGATION ALGORITHM

    This paper presents some simple techniques to improve the backpropagation algorithm. Because learning in neural networks is an NP-complete problem and traditional gradient descent methods converge slowly, many alternatives have been explored to accelerate convergence. Some of the proposed methods are mutually compatible, and a combination of them normally works better than each method alone.
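    The abstract does not name the specific techniques, but gradient descent with momentum is one of the classic accelerations such papers combine. As a hedged illustration only, the sketch below applies momentum to a toy quadratic loss; the loss, the matrix A, and all parameter values are hypothetical, not taken from the paper.

        # Minimal sketch of one classic backpropagation speed-up:
        # gradient descent with momentum on a toy quadratic loss.
        import numpy as np

        def grad(w):
            # Gradient of the illustrative loss f(w) = 0.5 * w^T A w
            A = np.array([[3.0, 0.0], [0.0, 0.5]])  # ill-conditioned curvature
            return A @ w

        def sgd_momentum(w, lr=0.1, beta=0.9, steps=100):
            v = np.zeros_like(w)  # velocity accumulates past gradients
            for _ in range(steps):
                v = beta * v - lr * grad(w)  # momentum update
                w = w + v
            return w

        w = sgd_momentum(np.array([1.0, 1.0]))
        print(w)  # approaches the minimum at the origin

    On ill-conditioned losses like this one, the velocity term damps oscillation along the steep direction while accumulating speed along the shallow one, which is why momentum typically converges faster than plain gradient descent.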

    Approach to an FPGA embedded, autonomous object recognition system: run-time learning and adaptation

    Neural networks, widely used in pattern recognition, security applications and robot control, have been chosen for the task of object recognition within this system. One of the main drawbacks of implementing traditional neural networks in reconfigurable hardware is their huge resource demand, due not only to their intrinsic parallelism but also to the traditionally large networks that are designed. Modern FPGA architectures, however, are well suited to this kind of massively parallel computation. Our proposal is therefore the implementation of Tiny Neural Networks (TNN, a self-coined term) in reconfigurable architectures. One of the most important features of TNNs is their learning ability. What we show here is an attempt to raise the autonomy of the system by triggering a new learning phase at run-time when necessary; in this way, autonomous adaptation of the system is achieved. The system performs shape identification by interpreting object singularities, achieved by interconnecting several specialized TNNs that work cooperatively. To validate the research, the system has been implemented and configured as a perceptron-like TNN with backpropagation learning and applied to the recognition of shapes. Simulation results show that this architecture has significant performance benefits.
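    The abstract says a new learning phase is triggered at run-time "when necessary" but does not detail the trigger. The sketch below is a hypothetical software analogue of that idea, not the paper's FPGA design: a perceptron-like network that signals for retraining when its output confidence falls below a threshold. The confidence heuristic, the threshold, and all names are assumptions.

        # Hypothetical software analogue of a run-time learning trigger
        # for a tiny perceptron-like network with backpropagation.
        import numpy as np

        rng = np.random.default_rng(0)

        def sigmoid(x):
            return 1.0 / (1.0 + np.exp(-x))

        class TinyNet:
            def __init__(self, n_in, n_hidden, n_out):
                self.W1 = rng.normal(0, 0.5, (n_in, n_hidden))
                self.W2 = rng.normal(0, 0.5, (n_hidden, n_out))

            def forward(self, x):
                self.h = sigmoid(x @ self.W1)
                return sigmoid(self.h @ self.W2)

            def train_step(self, x, target, lr=0.5):
                # Plain backpropagation through the two layers.
                y = self.forward(x)
                d_out = (y - target) * y * (1 - y)
                d_hid = (d_out @ self.W2.T) * self.h * (1 - self.h)
                self.W2 -= lr * np.outer(self.h, d_out)
                self.W1 -= lr * np.outer(x, d_hid)

        def recognize(net, x, conf_threshold=0.75):
            y = net.forward(x)
            if np.max(y) < conf_threshold:  # crude confidence measure
                return None  # low confidence: caller triggers re-learning
            return int(np.argmax(y))

        net = TinyNet(n_in=4, n_hidden=3, n_out=2)
        x = rng.random(4)
        label = recognize(net, x)
        if label is None:
            # Run-time learning phase on (hypothetical) fresh labelled data.
            for _ in range(200):
                net.train_step(x, np.array([1.0, 0.0]))
            label = recognize(net, x)
        print("recognized class:", label)

    In the paper's setting the equivalent logic would run in hardware across several cooperating TNNs; this single-network Python version only illustrates the trigger-then-retrain control flow.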