
    An adaptive stochastic resonance method based on multi-agent cuckoo search algorithm for bearing fault detection

    Bearings are widely used in rotating machinery and are prone to failure because of harsh working environments. Bearing fault-induced impulses are weak owing to heavy background noise, long vibration transmission paths, and slight fault severity, which makes bearing fault detection difficult. A novel adaptive stochastic resonance method based on the multi-agent cuckoo search algorithm is proposed for bearing fault detection. Stochastic resonance (SR) acts like a nonlinear filter that enhances the weak fault-induced impulses while suppressing the noise. However, the parameters of the nonlinear system strongly influence the SR effect, and the optimal parameters are difficult to find. The multi-agent cuckoo search (MACS) algorithm is an effective heuristic optimization algorithm that can be used to search the parameters of the nonlinear system adaptively. Two bearing fault signals are used to validate the effectiveness of the proposed method, and three other adaptive SR methods, based on the cuckoo search algorithm, particle swarm optimization, and the genetic algorithm, are used for comparison. The results show that MACS finds the optimal parameters more quickly and more accurately, and that the proposed method enhances the fault-induced impulses efficiently.
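    As a rough illustration of the idea (a minimal sketch, not the authors' implementation), the code below passes a signal through a classical bistable SR system, dx/dt = a*x - b*x^3 + s(t), and tunes the two system parameters with a simplified cuckoo-search loop. The sampling rate FS, fault frequency FAULT_FREQ, search bounds, and the plain Gaussian walks used in place of Levy flights are all assumptions made for the sketch.

        import numpy as np

        FS = 12_000          # sampling rate in Hz (assumed for the sketch)
        FAULT_FREQ = 107.0   # bearing fault characteristic frequency in Hz (assumed)

        def bistable_sr(signal, a, b, fs=FS):
            """Pass the signal through dx/dt = a*x - b*x^3 + s(t) via forward Euler."""
            x, out, h = 0.0, np.empty_like(signal, dtype=float), 1.0 / fs
            for i, s in enumerate(signal):
                x += h * (a * x - b * x ** 3 + s)
                out[i] = x
            return out

        def snr_at_fault_freq(x, fs=FS, f0=FAULT_FREQ):
            """Score a candidate: spectral power at f0 over the mean power elsewhere."""
            spec = np.abs(np.fft.rfft(x)) ** 2
            freqs = np.fft.rfftfreq(len(x), 1.0 / fs)
            k = int(np.argmin(np.abs(freqs - f0)))
            return spec[k] / ((spec.sum() - spec[k]) / (len(spec) - 1))

        def adaptive_sr(signal, n_nests=15, n_iter=50, pa=0.25, seed=0):
            """Tune (a, b) with a simplified cuckoo search maximizing output SNR."""
            rng = np.random.default_rng(seed)
            lo, hi = 1e-3, 5.0                               # search bounds (assumed)
            nests = rng.uniform(lo, hi, size=(n_nests, 2))   # candidate (a, b) pairs
            score = lambda p: snr_at_fault_freq(bistable_sr(signal, *p))
            fit = np.array([score(p) for p in nests])
            for _ in range(n_iter):
                best = nests[fit.argmax()]
                # random walk biased toward the current best (Levy flights omitted)
                trial = np.clip(nests + 0.3 * rng.standard_normal(nests.shape)
                                * (best - nests), lo, hi)
                tfit = np.array([score(p) for p in trial])
                win = tfit > fit
                nests[win], fit[win] = trial[win], tfit[win]
                drop = rng.random(n_nests) < pa              # abandon poor nests
                drop[fit.argmax()] = False                   # but always keep the best
                nests[drop] = rng.uniform(lo, hi, size=(int(drop.sum()), 2))
                fit[drop] = [score(p) for p in nests[drop]]
            return nests[fit.argmax()]                       # tuned (a, b)

    In practice, adaptive SR methods for bearing signals also rescale the signal amplitude and frequency so that the SR small-parameter requirement holds; the sketch omits that preprocessing step.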

    Adaptive Stochastic Conjugate Gradient optimization for temporal medical image registration

    We propose an Adaptive Stochastic Conjugate Gradient (ASCG) optimization algorithm for temporal medical image registration. The method combines the advantages of the conjugate gradient (CG) method and the adaptive stochastic gradient descent (ASGD) method. The main idea is to replace the search direction of ASGD with stochastic approximations of the conjugate gradient of the cost function. In addition, the step size of ASCG is based on an approximation of the Lipschitz constant of the stochastic gradient function. The algorithm therefore retains the good properties of the conjugate gradient method while, like ASGD, using less gradient computation time per iteration and adjusting the step size adaptively. As a result, it takes less CPU time than the previous ASGD method. We demonstrate the efficiency of our algorithm on the publicly available 4D lung CT data and on our clinical lung/tumor CT data using a general 4D image registration model. We compare ASCG with several existing iterative optimization strategies: the steepest gradient descent method, the conjugate gradient method, a quasi-Newton method (L-BFGS), and the adaptive stochastic gradient descent method. Our preliminary results indicate that ASCG achieves 22% higher accuracy on the POPI dataset and also performs better than the existing methods on the other datasets (the DIR-Lab dataset and our clinical dataset). Furthermore, we demonstrate that ASCG is more robust to image noise than the other methods.
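    The core update can be sketched compactly: conjugate-gradient directions built from stochastic gradients, with the step size tied to a running Lipschitz estimate. In the sketch below, grad_fn, the Fletcher-Reeves coefficient, and the initial estimate L = 1.0 are illustrative assumptions, not the paper's registration-specific choices.

        import numpy as np

        def ascg(grad_fn, x0, batches, eps=1e-12):
            """Sketch: d_k = -g_k + beta_k * d_{k-1}, with step size ~ 1/L where
            L is a local Lipschitz estimate from successive stochastic gradients."""
            x, d, g_prev, x_prev, L = x0.astype(float), None, None, None, 1.0
            for batch in batches:
                g = grad_fn(x, batch)              # stochastic gradient on one batch
                if d is None:
                    d = -g                         # first iteration: steepest descent
                else:
                    beta = (g @ g) / (g_prev @ g_prev + eps)   # Fletcher-Reeves
                    d = -g + beta * d
                    # Lipschitz proxy: ||g_k - g_{k-1}|| / ||x_k - x_{k-1}||
                    L = max(np.linalg.norm(g - g_prev)
                            / (np.linalg.norm(x - x_prev) + eps), eps)
                x_prev, g_prev = x.copy(), g
                x = x + d / L                      # adaptive step size
            return x

        # usage on a toy stochastic least-squares problem (synthetic data)
        rng = np.random.default_rng(0)
        A, b = rng.standard_normal((200, 5)), rng.standard_normal(200)
        grad = lambda x, idx: 2 * A[idx].T @ (A[idx] @ x - b[idx]) / len(idx)
        x_hat = ascg(grad, np.zeros(5), [rng.integers(0, 200, 32) for _ in range(300)])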

    Bolstering Stochastic Gradient Descent with Model Building

    Stochastic gradient descent and its variants constitute the core optimization algorithms that achieve good convergence rates for machine learning problems, particularly when fine-tuned for the application at hand. Although this tuning can incur large computational costs, recent work has shown that these costs can be reduced by line search methods that iteratively adjust the step size. We propose an alternative approach to stochastic line search: a new algorithm based on forward step model building. This model building step incorporates second-order information that allows adjusting not only the step size but also the search direction. Noting that deep learning model parameters come in groups (layers of tensors), our method builds its model and calculates a new step for each parameter group. This novel diagonalization approach makes the selected step lengths adaptive. We provide a convergence rate analysis and experimentally show that the proposed algorithm, stochastic model building (SMB), achieves faster convergence and better generalization on well-known test problems. More precisely, SMB requires less tuning and shows performance comparable to other adaptive methods.
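    The forward-step model building idea admits a compact single-group sketch: take a plain SGD trial step, and if it fails a sufficient-decrease test, fit a quadratic model along the step from the gradients at both end points and re-solve for the step length. The paper builds such a model per parameter group (which also bends the search direction); the NumPy version below uses one group and explicit loss/gradient callables, so it illustrates the mechanism rather than reproducing the authors' SMB.

        import numpy as np

        def smb_step(x, f, grad, lr=1e-3, c1=1e-4, eps=1e-12):
            """One model-building step for a single parameter group."""
            g = grad(x)
            s = -lr * g                          # forward (trial) SGD step
            f0, f_trial = f(x), f(x + s)
            if f_trial <= f0 + c1 * (g @ s):     # Armijo-style sufficient decrease
                return x + s                     # accept the plain SGD step
            # quadratic model of phi(t) = f(x + t*s) from phi'(0) and phi'(1)
            d0, d1 = g @ s, grad(x + s) @ s
            if d1 - d0 > eps:                    # positive curvature along s
                t = float(np.clip(-d0 / (d1 - d0), 0.0, 1.0))
            else:
                t = 1.0                          # model is non-convex: keep the trial
            return x + t * s

        # usage: a few hundred steps on the Rosenbrock function
        f = lambda x: (1 - x[0]) ** 2 + 100 * (x[1] - x[0] ** 2) ** 2
        grad = lambda x: np.array([-2 * (1 - x[0]) - 400 * x[0] * (x[1] - x[0] ** 2),
                                   200 * (x[1] - x[0] ** 2)])
        x = np.array([-1.0, 1.5])
        for _ in range(500):
            x = smb_step(x, f, grad, lr=1e-3)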