4 research outputs found

    Optimization of back-propagation neural networks architecture and parameters with a hybrid PSO/SA approach

    Determining the architecture and parameters of neural networks is an important scientific challenge. This paper reports a new hybrid method for optimizing the architecture and parameters of back-propagation neural networks with high accuracy. We use particle swarm optimization, which has proven to be very effective and fast and has been shown to increase the efficiency of simulated annealing when applied to a diverse set of optimization problems. To evaluate the proposed method, we employ the PIMA dataset from the University of California machine learning database. Compared with previous work, we show that the developed approach achieves superior classification accuracy rates.
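    The abstract above describes a hybrid of particle swarm optimization (PSO) with a simulated-annealing (SA) acceptance rule for tuning back-propagation network hyperparameters. The sketch below is only an illustration of that general idea, not the paper's implementation: the particle encoding, the constants, and the placeholder fitness function (which would normally be the validation error of a network trained on the PIMA data) are all assumptions.

```python
# Illustrative sketch: PSO over network hyperparameters with an SA-style
# acceptance rule. All names and constants are assumptions, not the paper's.
import math
import random

def fitness(params):
    """Placeholder for the validation error of a back-propagation network
    trained with these hyperparameters (hidden units, learning rate).
    A synthetic bowl-shaped function stands in for that error here."""
    hidden_units, learning_rate = params
    return (hidden_units - 12) ** 2 * 0.01 + (learning_rate - 0.05) ** 2 * 100

def pso_sa(n_particles=20, iterations=100, w=0.7, c1=1.5, c2=1.5,
           temp=1.0, cooling=0.95):
    # Each particle encodes one candidate configuration: [hidden_units, learning_rate].
    positions = [[random.uniform(2, 30), random.uniform(0.001, 0.5)]
                 for _ in range(n_particles)]
    velocities = [[0.0, 0.0] for _ in range(n_particles)]
    pbest = [p[:] for p in positions]
    pbest_fit = [fitness(p) for p in positions]
    gbest = min(pbest, key=fitness)

    for _ in range(iterations):
        for i, pos in enumerate(positions):
            # Standard PSO velocity/position update.
            for d in range(2):
                r1, r2 = random.random(), random.random()
                velocities[i][d] = (w * velocities[i][d]
                                    + c1 * r1 * (pbest[i][d] - pos[d])
                                    + c2 * r2 * (gbest[d] - pos[d]))
                pos[d] += velocities[i][d]
            f = fitness(pos)
            # SA-style acceptance: a worse personal best may still be accepted
            # with a probability that shrinks as the temperature cools, which
            # helps the swarm escape local minima early in the search.
            delta = f - pbest_fit[i]
            if delta < 0 or random.random() < math.exp(-delta / temp):
                pbest[i], pbest_fit[i] = pos[:], f
        gbest = min(pbest, key=fitness)
        temp *= cooling
    return gbest

if __name__ == "__main__":
    best = pso_sa()
    print("best hyperparameters (hidden units, learning rate):", best)
```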

    A Neural Network Architecture for Data Classification
