
    The Kohonen network incorporating explicit statistics and its application to the travelling salesman problem

    In this paper we introduce a new self-organizing neural network, the Kohonen Network Incorporating Explicit Statistics (KNIES), which is based on Kohonen's Self-Organizing Map (SOM). The primary difference between the SOM and KNIES is that every iteration in the training phase includes two distinct modules: the attracting module and the dispersing module. As a result of the newly introduced dispersing module, the neurons maintain the overall statistical properties of the data points. Thus, although in the SOM the neurons individually find their places both statistically and topologically, in KNIES they collectively maintain their mean equal to the mean of the data points which they represent. Although the scheme as currently implemented maintains the mean as its invariant, it can easily be generalized to maintain higher-order central moments as invariants. The new scheme has been used to solve the Euclidean Travelling Salesman Problem (TSP). Experimental results for problems taken from TSPLIB indicate that it is a very accurate neural-network strategy for the TSP, probably the most accurate neural solution available in the literature.
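    The two-module iteration described above can be pictured with a short sketch. This is only an illustrative reading of the idea, assuming a 2-D Euclidean TSP instance and a ring of neurons; the learning rate, the neighbourhood weighting and the exact dispersion rule are assumptions made for the sketch, not the published KNIES parameters.

```python
import numpy as np

def knies_iteration(neurons, city, data_mean, lr=0.5, radius=2):
    """One KNIES-style training step: attract, then disperse (illustrative sketch).

    neurons   : (M, 2) float array of neuron coordinates on a ring
    city      : (2,) coordinates of the presented city
    data_mean : (2,) mean of all city coordinates (the invariant to keep)
    """
    M = len(neurons)

    # --- Attracting module: standard SOM update -------------------------
    # The winner is the neuron closest to the presented city.
    winner = np.argmin(np.linalg.norm(neurons - city, axis=1))
    attracted = []
    for j in range(M):
        # Cyclic (ring) topological distance between neuron j and the winner.
        d = min(abs(j - winner), M - abs(j - winner))
        if d <= radius:
            h = np.exp(-(d ** 2) / (2 * radius ** 2))   # neighbourhood weight
            neurons[j] += lr * h * (city - neurons[j])
            attracted.append(j)

    # --- Dispersing module: restore the global mean ---------------------
    # Move the neurons that did not participate in the attraction so that
    # the mean of all neurons stays equal to the mean of the data points.
    rest = [j for j in range(M) if j not in attracted]
    if rest:
        drift = data_mean - neurons.mean(axis=0)          # how far the mean moved
        neurons[np.array(rest)] += drift * M / len(rest)  # compensate on the rest
    return neurons
```

    The dispersing step is the point of the sketch: whatever displacement the attracting step causes is compensated on the remaining neurons, so the mean of the neurons stays pinned to the mean of the data.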

    Discrete vector quantization for arbitrary distance function estimation


    Vector quantization for arbitrary distance function estimation

    In this article we apply the concepts of vector quantization to the evaluation of arbitrary distance functions, a problem which has important applications in logistics and location analysis. The input to our problem is the set of coordinates of a large number of nodes whose inter-node arbitrary "distances" have to be estimated. To render the problem interesting, nontrivial and realistic, we assume that the explicit form of this distance function is both unknown and uncomputable. Unlike traditional operations research methods, which compute aggregate parameters of functional estimators according to certain goodness-of-fit criteria, we have utilized vector quantization principles to first adaptively polarize the nodes into subregions. Subsequently, the parameters characterizing the subregions are learned by using a variety of methods (including, for academic purposes, a vector quantization strategy in the meta-domain). The algorithms have been rigorously tested on the actual road-travel distances involving cities in Turkey. The results obtained are not only conclusive, but also the best currently available from any single or hybrid strategy.
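    As an illustration of the two-stage idea, polarize with VQ and then learn per-region parameters, the following sketch uses a plain k-means-style quantizer and a per-region-pair distance ratio as the learned parameter. The ratio estimator is an illustrative stand-in for the variety of learning methods the paper compares; all function and variable names are invented for this sketch.

```python
import numpy as np

def estimate_distance_model(coords, sample_pairs, sample_dists, k=8, iters=50, seed=0):
    """Sketch of VQ-based arbitrary-distance estimation.

    coords       : (N, 2) node coordinates
    sample_pairs : list of (i, j) node index pairs with known true distance
    sample_dists : true (e.g. road) distances for those pairs
    Returns (codebook, labels, ratio), where ratio[a, b] scales the Euclidean
    distance between a node in region a and one in region b.
    """
    rng = np.random.default_rng(seed)

    # 1) Polarize the nodes into k subregions with a simple VQ / k-means pass.
    codebook = coords[rng.choice(len(coords), k, replace=False)].copy()
    for _ in range(iters):
        labels = np.argmin(np.linalg.norm(coords[:, None] - codebook[None], axis=2), axis=1)
        for a in range(k):
            if np.any(labels == a):
                codebook[a] = coords[labels == a].mean(axis=0)

    # 2) Learn one parameter per region pair: the running mean of the ratio of
    #    the true distance to the straight-line distance (a "circuity factor").
    ratio = np.ones((k, k))
    count = np.zeros((k, k))
    for (i, j), d in zip(sample_pairs, sample_dists):
        a, b = labels[i], labels[j]
        euclid = np.linalg.norm(coords[i] - coords[j])
        if euclid > 0:
            count[a, b] += 1
            ratio[a, b] += (d / euclid - ratio[a, b]) / count[a, b]
            ratio[b, a] = ratio[a, b]
            count[b, a] = count[a, b]
    return codebook, labels, ratio

def predict_distance(coords, labels, ratio, i, j):
    """Estimate the arbitrary distance between nodes i and j."""
    return ratio[labels[i], labels[j]] * np.linalg.norm(coords[i] - coords[j])
```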

    Fast, efficient and accurate solutions to the Hamiltonian path problem using neural approaches

    Unlike its cousin, the Euclidean Traveling Salesman Problem (TSP), to the best of our knowledge there has been no documented all-neural solution to the Euclidean Hamiltonian Path Problem (HPP). The reason for this is that the heuristics which map the cities onto the neurons 'lose their credibility', because the underlying cyclic property of the order of the neurons used in the TSP is lost in the HPP. In this paper we present three neural solutions to the HPP. The first of these, GSOM_HPP, is a generalization of Kohonen's self-organizing map (SOM) as modified by Angeniol et al. (Neural Networks 1988;1:289-93). The second and third methods use the recently introduced self-organizing neural network, the Kohonen Network Incorporating Explicit Statistics (KNIES) (Oommen et al., Proceedings of WIRN/VIETRI-98, the Tenth Italian Workshop on Neural Nets, Vietri Sul Mare, Italy, May 1998, p. 273-282). The primary difference between KNIES and Kohonen's SOM is that, unlike the SOM, every iteration in the training phase includes two distinct modules: the attracting module and the dispersing module. As a result of the SOM update and the dispersing module introduced in KNIES, the neurons individually find their places both statistically and topologically, and also collectively maintain their mean to be the mean of the data points which they represent. The new philosophy, which has previously (Oommen et al., Proceedings of WIRN/VIETRI-98, the Tenth Italian Workshop on Neural Nets, Vietri Sul Mare, Italy, May 1998, p. 273-282) been used to effectively solve the Euclidean Traveling Salesman Problem (TSP), is now extended to solve the Euclidean Hamiltonian Path Problem (HPP). These algorithms, which are the first all-neural solutions to the HPP, have also been rigorously tested. Experimental results for problems obtained by modifying selected instances from the traveling salesman problem library (TSPLIB) (Reinelt, ORSA Journal on Computing 1991;3:376-84) for the HPP indicate that they are both accurate and efficient.
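    The structural change the abstract points to, the loss of the cyclic neuron order, can be illustrated as follows. The sketch assumes the two path endpoints are given and simply pins the first and last neurons to them; this pinning, like the other parameters, is an assumption made for illustration and is not the GSOM_HPP or KNIES-based procedure of the paper.

```python
import numpy as np

def hpp_step(neurons, city, endpoints, lr=0.5, radius=2):
    """One SOM-style step on an open neuron chain (HPP) rather than a ring (TSP).

    neurons   : (M, 2) float array, the neuron chain in path order
    endpoints : (start_xy, end_xy) of the required path; both are re-pinned
                after every update (an assumption made for this sketch only)
    """
    winner = np.argmin(np.linalg.norm(neurons - city, axis=1))
    for j in range(len(neurons)):
        d = abs(j - winner)                    # chain distance: no wrap-around,
        if d <= radius:                        # the TSP's cyclic property is gone
            h = np.exp(-(d ** 2) / (2 * radius ** 2))
            neurons[j] += lr * h * (city - neurons[j])
    neurons[0], neurons[-1] = endpoints        # keep the path ends fixed
    return neurons
```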

    Arbitrary distance function estimation using discrete vector quantization

    This paper develops a method by which the general philosophies of vector quantization (VQ) and discretized automata learning can be incorporated for the computation of arbitrary distance functions, a problem which has important applications in logistics and location analysis. The input to our problem is the set of coordinates of a large number of nodes whose inter-node arbitrary "distances" have to be estimated. Unlike traditional operations research methods, which use parametric functional estimators, we have utilized discretized VQ principles to first adaptively polarize the nodes into sub-regions. Subsequently, the parameters characterizing the sub-regions are learnt by using a variety of methods. The algorithms have been rigorously tested on the actual road-travel distances involving cities in Türkiye, and the results obtained are conclusive. Indeed, from the point of view of both speed and accuracy, the results presented here are the best currently available from any single or hybrid strategy.
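    The phrase "discretized automata learning" suggests a parameter that is only allowed to take values on a finite grid and is nudged one level at a time. The sketch below shows one such discretized estimator for a single circuity ratio; it is a crude stand-in for the paper's learning scheme, with the grid, the update rule and all names invented here for illustration.

```python
import numpy as np

def discretized_ratio_learner(samples, levels=np.linspace(1.0, 2.0, 21)):
    """Learn a circuity ratio restricted to a finite set of levels.

    samples : iterable of (euclidean_distance, true_distance) pairs
    The estimate may only move one discrete level at a time, in the
    direction that reduces the current prediction error.
    """
    idx = len(levels) // 2                      # start in the middle of the grid
    for euclid, true_dist in samples:
        if euclid <= 0:
            continue
        predicted = levels[idx] * euclid
        if predicted < true_dist and idx < len(levels) - 1:
            idx += 1                            # under-estimating: step the ratio up
        elif predicted > true_dist and idx > 0:
            idx -= 1                            # over-estimating: step the ratio down
    return levels[idx]
```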

    A self-organizing method for map reconstruction

    A variety of problems in geographical and satellite-based remote-sensing signal processing, and in the area of "zero-error" pattern recognition, deal with processing the information contained in the distances between the points in the geographical or feature space. In this paper we consider one such problem, namely, that of reconstructing the points in the geographical or feature space when we are only given the approximate distances between the points themselves. In particular, we are interested in the problem of reconstructing a map when the given data is the set of inter-city road-travel distances. Reported solution approaches primarily involve multi-dimensional scaling techniques. However, we propose a self-organizing method. The new method is tested and compared with classical multi-dimensional scaling and ALSCAL on different data sets obtained from various countries.
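    For reference, the classical multi-dimensional scaling baseline that the proposed self-organizing method is compared against can be written in a few lines. The function below is a standard textbook formulation, not the paper's method, and the self-organizing reconstruction itself is not reproduced here.

```python
import numpy as np

def classical_mds(D, dim=2):
    """Classical multi-dimensional scaling from a distance matrix.

    D   : (n, n) matrix of (approximate) inter-city distances
    dim : target embedding dimension (2 for map reconstruction)
    Returns an (n, dim) coordinate array, unique up to rotation,
    reflection and translation.
    """
    n = D.shape[0]
    J = np.eye(n) - np.ones((n, n)) / n          # centering matrix
    B = -0.5 * J @ (D ** 2) @ J                  # double-centred Gram matrix
    eigvals, eigvecs = np.linalg.eigh(B)
    order = np.argsort(eigvals)[::-1][:dim]      # largest eigenvalues first
    L = np.sqrt(np.clip(eigvals[order], 0, None))
    return eigvecs[:, order] * L                 # coordinates of the n points
```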