3 research outputs found

    Importance Sampling for Objective Function Estimations in Neural Detector Training Driven by Genetic Algorithms

    To train Neural Networks (NNs) in a supervised way, estimations of an objective function must be carried out. As training progresses, the value of this function decreases, so the number of test observations needed for an accurate estimation must increase. Consequently, estimating very low objective function values becomes computationally unaffordable, and the use of Importance Sampling (IS) techniques becomes convenient. Three different objective functions are studied, leading to proposed IS-based estimators for each: the Mean-Square error, the Cross-Entropy error, and the Misclassification error criteria. The values of these functions are estimated by IS techniques, and the results are used to train NNs by the application of Genetic Algorithms. Results for binary detection in Gaussian noise are provided. These results show the evolution of the parameters during training and the performance of the proposed detectors in terms of error probability and Receiver Operating Characteristic curves. The obtained results justify the use of IS in the training.

    A fast importance sampling algorithm for unsupervised learning of over-complete dictionaries

    We use Bayesian statistics to study the dictionary learning problem, in which an over-complete generative signal model has to be adapted for optimally sparse signal representations. With this formulation, we develop a stochastic gradient learning algorithm based on Importance Sampling techniques to minimise the negative marginal log-likelihood. As this likelihood is not available analytically, approximations have to be utilised. The Importance Sampling Monte Carlo marginalisation proposed here improves on previous methods and addresses three main issues: 1) bias of the gradient estimate; 2) multi-modality of the distribution to be approximated; and 3) computational efficiency. Experimental results show the advantages of the new method when compared to previous techniques. The gained efficiency allows the treatment of large-scale problems in a statistically sound framework, as demonstrated here by the extraction of individual piano notes from a polyphonic piano recording.
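The marginalisation step this abstract refers to is the integral p(y) = ∫ p(y|s) p(s) ds over latent coefficients, which has no closed form for sparse priors. As a hedged toy sketch (a linear-Gaussian model where the truth is known analytically, not the paper's sparse dictionary model), IS Monte Carlo marginalisation looks like this; the model, proposal, and function names are assumptions for illustration:

```python
import math
import random

def normal_pdf(x, mu, sd):
    """Density of N(mu, sd^2) at x."""
    return math.exp(-0.5 * ((x - mu) / sd) ** 2) / (sd * math.sqrt(2.0 * math.pi))

def is_marginal_likelihood(y, sigma=0.5, n=50_000, seed=1):
    """Importance-sampling estimate of p(y) = ∫ p(y|s) p(s) ds
    for the toy model  s ~ N(0, 1),  y | s ~ N(s, sigma^2).

    Proposal q(s) = N(y, sigma^2), concentrated where the likelihood
    p(y|s) is large, so the weights stay well-behaved.
    (Illustrative sketch; the paper's model uses a sparse prior instead.)
    """
    rng = random.Random(seed)
    total = 0.0
    for _ in range(n):
        s = rng.gauss(y, sigma)  # draw from the proposal q
        # weight = p(y|s) * p(s) / q(s)
        w = normal_pdf(y, s, sigma) * normal_pdf(s, 0.0, 1.0) / normal_pdf(s, y, sigma)
        total += w
    return total / n
```

For this toy model the exact marginal is p(y) = N(y; 0, 1 + sigma²), so the estimate can be checked directly; in the dictionary-learning setting the same estimator is plugged into the gradient of the negative marginal log-likelihood.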