
    Theoretical Interpretations and Applications of Radial Basis Function Networks

    Medical applications have usually used Radial Basis Function Networks simply as Artificial Neural Networks. However, RBFNs are Knowledge-Based Networks that can be interpreted in several ways: as Artificial Neural Networks, Regularization Networks, Support Vector Machines, Wavelet Networks, Fuzzy Controllers, Kernel Estimators, or Instance-Based Learners. A survey of these interpretations and of their corresponding learning algorithms is provided, as well as a brief survey of dynamic learning algorithms. RBFNs' interpretations can suggest applications that are particularly interesting in medical domains.
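    As a concrete illustration of the basic construction behind these interpretations, the sketch below builds a generic Gaussian RBF network for a toy regression task, with centres taken as a random subset of the training inputs and output weights fitted by least squares; the data, centre selection and fixed width are illustrative assumptions, not any specific method surveyed above.

```python
# A minimal sketch of a Gaussian RBF network used as a function approximator.
# Centres are a random subset of the training inputs and the width is fixed
# by hand: illustrative assumptions, not a surveyed learning algorithm.
import numpy as np

rng = np.random.default_rng(0)

# Toy regression data: a noisy sine curve.
X = rng.uniform(-3, 3, size=(200, 1))
y = np.sin(X[:, 0]) + 0.1 * rng.standard_normal(200)

n_centres, width = 12, 0.7
centres = X[rng.choice(len(X), size=n_centres, replace=False)]

def design_matrix(X):
    """Gaussian activation of every centre for every input row."""
    d2 = np.sum((X[:, None, :] - centres[None, :, :]) ** 2, axis=2)
    return np.exp(-d2 / (2.0 * width ** 2))

# Linear output layer fitted by ordinary least squares.
Phi = design_matrix(X)
w, *_ = np.linalg.lstsq(Phi, y, rcond=None)

x_test = np.linspace(-3, 3, 5)[:, None]
print(design_matrix(x_test) @ w)   # RBFN predictions at a few test points
```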

    Medical imaging analysis with artificial neural networks

    Given that neural networks have been widely reported in the research community of medical imaging, we provide a focused literature survey on recent neural network developments in computer-aided diagnosis, medical image segmentation and edge detection towards visual content analysis, and medical image registration for its pre-processing and post-processing, with the aims of increasing awareness of how neural networks can be applied to these areas and of providing a foundation for further research and practical development. Representative techniques and algorithms are explained in detail to provide inspiring examples illustrating: (i) how a known neural network with a fixed structure and training procedure could be applied to resolve a medical imaging problem; (ii) how medical images could be analysed, processed, and characterised by neural networks; and (iii) how neural networks could be expanded further to resolve problems relevant to medical imaging. The concluding section highlights comparisons among many neural network applications to provide a global view of computational intelligence with neural networks in medical imaging.
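    For illustration only, the following sketch trains a small neural network as a patch-based pixel classifier on a synthetic image, in the spirit of segmentation by pixel classification; the synthetic data, patch size and use of scikit-learn's MLPClassifier are assumptions of this sketch, not techniques taken from the survey.

```python
# A minimal, hypothetical sketch of patch-based pixel classification for
# image segmentation with a small neural network; the synthetic "image" is
# an illustrative stand-in for real medical data.
import numpy as np
from sklearn.neural_network import MLPClassifier

rng = np.random.default_rng(0)

# Synthetic 64x64 image: a bright disc ("lesion") on a noisy background.
yy, xx = np.mgrid[0:64, 0:64]
mask = ((yy - 32) ** 2 + (xx - 32) ** 2) < 15 ** 2
image = 0.3 * rng.standard_normal((64, 64)) + mask * 1.0

def patches(img, size=5):
    """Flattened size x size patches around every interior pixel."""
    r = size // 2
    out, coords = [], []
    for i in range(r, img.shape[0] - r):
        for j in range(r, img.shape[1] - r):
            out.append(img[i - r:i + r + 1, j - r:j + r + 1].ravel())
            coords.append((i, j))
    return np.array(out), coords

X, coords = patches(image)
y = np.array([mask[i, j] for i, j in coords], dtype=int)

# Train a small MLP to label each pixel from its surrounding patch.
clf = MLPClassifier(hidden_layer_sizes=(32,), max_iter=500, random_state=0)
clf.fit(X, y)
print("training accuracy:", clf.score(X, y))
```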

    NARX-based nonlinear system identification using orthogonal least squares basis hunting

    An orthogonal least squares technique for basis hunting (OLS-BH) is proposed to construct sparse radial basis function (RBF) models for NARX-type nonlinear systems. Unlike most existing RBF or kernel modelling methods, which place the RBF or kernel centers at the training input data points and use a fixed common variance for all the regressors, the proposed OLS-BH technique tunes the RBF center and diagonal covariance matrix of each individual regressor by minimizing the training mean square error. An efficient optimization method is adopted in this basis hunting to select regressors in an orthogonal forward selection procedure. Experimental results obtained using this OLS-BH technique demonstrate that it offers a state-of-the-art method for constructing parsimonious RBF models with excellent generalization performance.
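    The sketch below gives a heavily simplified flavour of basis hunting: regressors are added one at a time, and each new Gaussian regressor's center and width are chosen from random candidates so as to minimize the training mean square error. The orthogonalization and the tuned diagonal covariance of the actual OLS-BH algorithm are omitted, so this is an assumption-laden illustration rather than the proposed method.

```python
# A minimal, illustrative sketch of greedy "basis hunting" for an RBF model
# on a toy NARX-style identification task. Centers and widths are drawn from
# random candidates rather than tuned by a full optimizer, and no
# orthogonalization is performed: a simplification, not the paper's OLS-BH.
import numpy as np

rng = np.random.default_rng(0)

def gaussian(X, center, width):
    """Isotropic Gaussian regressor evaluated at the rows of X."""
    d2 = np.sum((X - center) ** 2, axis=1)
    return np.exp(-d2 / (2.0 * width ** 2))

def basis_hunt(X, y, n_terms=8, n_candidates=200):
    """Greedy forward selection of RBF regressors minimizing training MSE."""
    selected = []                       # list of (center, width) pairs
    Phi = np.empty((len(X), 0))
    for _ in range(n_terms):
        best = None
        for _ in range(n_candidates):
            center = X[rng.integers(len(X))] + 0.05 * rng.standard_normal(X.shape[1])
            width = rng.uniform(0.1, 2.0)
            phi = gaussian(X, center, width)[:, None]
            P = np.hstack([Phi, phi])
            w, *_ = np.linalg.lstsq(P, y, rcond=None)
            mse = np.mean((y - P @ w) ** 2)
            if best is None or mse < best[0]:
                best = (mse, center, width, phi)
        _, center, width, phi = best
        selected.append((center, width))
        Phi = np.hstack([Phi, phi])
    w, *_ = np.linalg.lstsq(Phi, y, rcond=None)
    return selected, w

# Toy NARX data: y(t) depends on y(t-1), y(t-2) and u(t-1).
t = np.arange(300)
u = np.sin(0.1 * t)
y = np.zeros_like(u)
for k in range(2, len(t)):
    y[k] = 0.5 * y[k - 1] - 0.2 * y[k - 2] + np.tanh(u[k - 1])
X = np.column_stack([y[1:-1], y[:-2], u[1:-1]])   # regressor vector
target = y[2:]

terms, weights = basis_hunt(X, target)
Phi = np.column_stack([gaussian(X, c, s) for c, s in terms])
print("training MSE:", np.mean((target - Phi @ weights) ** 2))
```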

    Comparative performance of some popular ANN algorithms on benchmark and function approximation problems

    We report an inter-comparison of some popular algorithms within the artificial neural network domain (viz., local search algorithms, global search algorithms, higher-order algorithms and hybrid algorithms) by applying them to standard benchmark problems such as the IRIS data, XOR/N-bit parity and the Two Spiral problem. Apart from giving a brief description of these algorithms, the results obtained for the above benchmark problems are presented in the paper. The results suggest that while the Levenberg-Marquardt algorithm yields the lowest RMS error for the N-bit parity and Two Spiral problems, the Higher Order Neurons algorithm gives the best results for the IRIS data problem. The best results for the XOR problem are obtained with the Neuro-Fuzzy algorithm. The above algorithms were also applied to several regression problems such as cos(x) and a few special functions like the Gamma function, the complementary error function and the upper tail of the cumulative $\chi^2$-distribution function. The results of these regression problems indicate that, among all the ANN algorithms used in the present study, the Levenberg-Marquardt algorithm yields the best results. Keeping in view the highly non-linear behaviour and wide dynamic range of these functions, it is suggested that they can also be considered standard benchmark problems for function approximation using artificial neural networks. Comment: 18 pages, 5 figures. Accepted in Pramana - Journal of Physics.
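    As a rough illustration of such benchmark comparisons, the sketch below trains small networks on the Iris data and on cos(x) approximation with two generic solvers from scikit-learn ('lbfgs' and 'adam'); these stand in for the algorithms compared in the paper, which are not available in scikit-learn, so the numbers are purely illustrative.

```python
# A minimal sketch of comparing training algorithms on a classification
# benchmark (Iris) and a function-approximation task (cos(x)). The solvers
# used here are generic substitutes, not the algorithms from the paper.
import numpy as np
from sklearn.datasets import load_iris
from sklearn.neural_network import MLPClassifier, MLPRegressor

X_iris, y_iris = load_iris(return_X_y=True)
for solver in ("lbfgs", "adam"):
    clf = MLPClassifier(hidden_layer_sizes=(10,), solver=solver,
                        max_iter=2000, random_state=0)
    clf.fit(X_iris, y_iris)
    print(f"Iris, {solver}: training accuracy = {clf.score(X_iris, y_iris):.3f}")

# Function approximation: learn cos(x) on [0, 2*pi].
rng = np.random.default_rng(0)
x = rng.uniform(0, 2 * np.pi, size=(500, 1))
y = np.cos(x[:, 0])
for solver in ("lbfgs", "adam"):
    reg = MLPRegressor(hidden_layer_sizes=(20,), solver=solver,
                       max_iter=5000, random_state=0)
    reg.fit(x, y)
    rms = np.sqrt(np.mean((reg.predict(x) - y) ** 2))
    print(f"cos(x), {solver}: training RMS error = {rms:.4f}")
```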

    Magnetic dot arrays modeling via the system of the radial basis function networks

    Full text link
    A general two-dimensional square-lattice model of a magnetic dot array is introduced. In this model the intradot self-energy is predicted by a neural network, and the interdot magnetostatic coupling is approximated by a collection of several dipolar terms. The model has been applied to a disk-shaped cluster involving 193 ultrathin dots and 772 interaction centers. In this case, single-vortex magnetization modes play an important role among the intradot magnetic structures retrieved by the neural networks. Several aspects of the model have been studied numerically by means of the simulated annealing method. Comment: 16 pages, 8 figures.
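    The sketch below illustrates only the optimization loop: simulated annealing over in-plane magnetization angles on a small square lattice with nearest-neighbour dipolar-like coupling. The neural-network intradot self-energy of the paper is replaced by a simple placeholder term, and the lattice size, couplings and cooling schedule are assumptions of this sketch.

```python
# A minimal sketch of simulated annealing over in-plane dot magnetization
# angles with nearest-neighbour dipolar-like coupling. The intradot
# self-energy is a crude placeholder, not the paper's neural-network model.
import numpy as np

rng = np.random.default_rng(1)
L = 8                                        # 8 x 8 square lattice of dots
theta = rng.uniform(0, 2 * np.pi, (L, L))    # in-plane magnetization angles

def energy(theta):
    """Dipolar-like nearest-neighbour coupling plus a placeholder self-energy."""
    mx, my = np.cos(theta), np.sin(theta)
    # horizontal bonds (axis along x): m1.m2 - 3 (m1.x)(m2.x)
    e = np.sum(-2.0 * mx[:, :-1] * mx[:, 1:] + my[:, :-1] * my[:, 1:])
    # vertical bonds (axis along y): m1.m2 - 3 (m1.y)(m2.y)
    e += np.sum(-2.0 * my[:-1, :] * my[1:, :] + mx[:-1, :] * mx[1:, :])
    # placeholder intradot self-energy (an assumption of this sketch)
    e += 0.1 * np.sum(np.cos(4 * theta))
    return e

T = 2.0
current = energy(theta)
for step in range(20000):
    i, j = rng.integers(L), rng.integers(L)
    old = theta[i, j]
    theta[i, j] = old + rng.normal(scale=0.5)    # propose a local angle change
    new = energy(theta)
    if new < current or rng.random() < np.exp(-(new - current) / T):
        current = new                            # accept the move
    else:
        theta[i, j] = old                        # reject and restore
    T *= 0.9997                                  # slow geometric cooling

print("final energy:", current)
```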