    Global Optimisation Of Neural Network Models Via Sequential Sampling-Importance Resampling

    We propose a novel strategy for training neural networks using sequential Monte Carlo algorithms. This global optimisation strategy allows us to learn the probability distribution of the network weights in a sequential framework. It is well suited to applications involving on-line, nonlinear or non-stationary signal processing. We show how the new algorithms can outperform extended Kalman filter (EKF) training.
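The idea sketched in the abstract can be illustrated with a minimal sampling-importance resampling (SIR) particle filter over the weights of a toy network. Everything below is an assumption for illustration, not the paper's actual setup: the 1-3-1 tanh MLP, the particle count, the Gaussian random-walk transition on the weights, and the noise scales `SIGMA_Y` and `SIGMA_W` are all hypothetical choices.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical toy setup: a 1-3-1 tanh MLP whose 10 weights are
# tracked by an SIR particle filter as data arrive sequentially.
D = 10          # weights per particle (3 + 3 + 3 + 1)
N = 1000        # number of particles
SIGMA_Y = 0.1   # assumed observation-noise std
SIGMA_W = 0.05  # assumed random-walk std on the weights

def mlp(w, x):
    """Evaluate the 1-3-1 tanh MLP packed in w at scalar input x."""
    W1, b1, W2, b2 = w[..., 0:3], w[..., 3:6], w[..., 6:9], w[..., 9]
    h = np.tanh(W1 * x + b1)           # hidden activations, shape (..., 3)
    return (W2 * h).sum(axis=-1) + b2  # scalar output per particle

def sir_step(particles, x, y):
    """One sequential sampling-importance-resampling update."""
    # 1. Sample from the transition prior: Gaussian random walk on weights.
    particles = particles + SIGMA_W * rng.standard_normal(particles.shape)
    # 2. Importance weights from the Gaussian likelihood of the new datum.
    err = y - mlp(particles, x)
    logw = -0.5 * (err / SIGMA_Y) ** 2
    w = np.exp(logw - logw.max())
    w /= w.sum()
    # 3. Resample particles in proportion to their importance weights.
    idx = rng.choice(N, size=N, p=w)
    return particles[idx]

# Stream observations of a simple target function and filter on-line.
init = rng.standard_normal((N, D))
particles = init.copy()
for _ in range(300):
    x = rng.uniform(-2.0, 2.0)
    particles = sir_step(particles, x, np.sin(x))

# Compare posterior-mean predictions before and after filtering.
xs = np.linspace(-2.0, 2.0, 21)
mse_before = np.mean([(mlp(init, x).mean() - np.sin(x)) ** 2 for x in xs])
mse_after = np.mean([(mlp(particles, x).mean() - np.sin(x)) ** 2 for x in xs])
```

After processing the data stream, the posterior-mean prediction of the particle cloud should fit the target far better than the initial random particles; the approximated weight distribution (rather than a single point estimate) is what distinguishes this from EKF training.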
