
    Applications of Simple Markov Models to Computer Vision

    In this report we advocate the use of computationally simple algorithms for computer vision, operating in parallel. The design of these algorithms is based on physical constraints present in the image and object spaces. In particular, we discuss the design, implementation, and performance of a Markov Random Field-based algorithm for low-level segmentation. In addition to having a simple and fast implementation, the algorithm is flexible enough to allow intensity information to be fused with motion and edge information from other sources.
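
    As a rough sketch of the kind of simple, parallel-friendly computation described above (not the report's implementation), the snippet below segments a grayscale image with a Potts-style MRF prior minimized by iterated conditional modes; the class means `mu`, the coupling weight `beta`, and the toy image are illustrative assumptions.

```python
# Hedged sketch, not the report's implementation: segment a grayscale image
# into K classes with an MRF whose energy combines a squared-error data term
# and a Potts smoothness term, minimized by iterated conditional modes (ICM).
import numpy as np

def mrf_segment(image, mu, beta=1.0, n_iter=10):
    """Return per-pixel labels minimizing data cost + Potts smoothness cost."""
    mu = np.asarray(mu, dtype=float)
    K = len(mu)
    labels = np.argmin((image[..., None] - mu) ** 2, axis=-1)   # data-only init
    H, W = image.shape
    for _ in range(n_iter):
        for y in range(H):
            for x in range(W):
                cost = (image[y, x] - mu) ** 2                  # data term per class
                for dy, dx in ((-1, 0), (1, 0), (0, -1), (0, 1)):
                    ny, nx = y + dy, x + dx
                    if 0 <= ny < H and 0 <= nx < W:
                        # Potts term: penalize labels that differ from the neighbour's.
                        cost += beta * (np.arange(K) != labels[ny, nx])
                labels[y, x] = int(np.argmin(cost))
    return labels

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    img = np.kron([[0.2, 0.8], [0.8, 0.2]], np.ones((16, 16)))  # noisy checkerboard
    img = img + 0.1 * rng.standard_normal(img.shape)
    print(np.unique(mrf_segment(img, mu=[0.2, 0.8], beta=2.0)))
```

    In principle, motion or edge cues from other sources could enter as additional terms in the per-pixel cost, which is the kind of fusion the report describes.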

    Accelerated Learning Through a Dynamic Adaptation of the Error Surface

    In this report we describe a novel technique that accelerates learning processes through a dynamic adaptation of the error surface. The algorithm, here named ARON (Adaptive Region of Nonlinearity), implements a generalization of the basic McCulloch-Pitts neuron that gives each unit the ability to automatically adapt its operational region to the requirements of the problem. The changes to the error surface facilitate the progress of the optimization criterion in its search for a minimum. ARON can be used alongside, and brings benefits to, a large class of other optimization schemes.
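
    The abstract does not give ARON's equations, so the following is only an illustrative guess at the general idea: a single sigmoid unit that learns a gain parameter (loosely, its operational region) by gradient descent together with its weights. The squared-error loss, the update rules, and the AND toy task are assumptions, not the report's method.

```python
# Illustrative guess only (ARON's exact formulation is not reproduced here):
# a single unit computing sigma(a * (w.x + b)) whose gain `a` is learned by
# gradient descent along with `w` and `b`, so the unit can stretch or shrink
# the region where its nonlinearity is active.
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def train_adaptive_unit(X, t, lr=0.5, epochs=2000, seed=1):
    rng = np.random.default_rng(seed)
    w = 0.1 * rng.standard_normal(X.shape[1])
    b, a = 0.0, 1.0                          # bias and adaptive gain
    for _ in range(epochs):
        y = sigmoid(a * (X @ w + b))
        dz = (y - t) * y * (1.0 - y)         # squared-error gradient through the sigmoid
        w -= lr * X.T @ (dz * a) / len(X)
        b -= lr * np.mean(dz * a)
        a -= lr * np.mean(dz * (X @ w + b))  # gain update reshapes the error surface
    return w, b, a

if __name__ == "__main__":
    X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
    t = np.array([0.0, 0.0, 0.0, 1.0])       # AND problem as a toy example
    w, b, a = train_adaptive_unit(X, t)
    print(np.round(sigmoid(a * (X @ w + b)), 2))
```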

    A Fuzzy Locally Sensitive Method for Cluster Analysis

    Cluster analysis plays an important role in pattern recognition, image processing, and time series analysis. The majority of existing clustering algorithms depend on initial parameters and on assumptions about the underlying data structure. In this paper a fuzzy method of mode separation is proposed. The method addresses the task of multi-modal partition through a sequence of locally sensitive searches guided by a stochastic gradient ascent procedure, and addresses the cluster validity problem through a global partition performance criterion. The algorithm is computationally efficient and provided good results when tested on a number of simulated and real data sets.
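
    As a hedged sketch of mode separation by locally sensitive, stochastic gradient ascent (not the paper's algorithm), the code below climbs a kernel density estimate from several starting points, merges nearby modes, and assigns fuzzy memberships by distance to the recovered modes; the bandwidth `h` and fuzzifier `m` are illustrative parameters.

```python
# Hedged sketch, not the paper's algorithm: stochastic gradient ascent on a
# kernel density estimate locates local modes from several starting points;
# nearby modes are merged, and fuzzy memberships are derived from distances
# to the modes.
import numpy as np

def find_modes(X, h=0.5, n_starts=20, steps=100, lr=0.1, seed=0):
    rng = np.random.default_rng(seed)
    starts = X[rng.choice(len(X), size=n_starts, replace=False)].copy()
    for x in starts:                                   # x is a row view; updated in place
        for _ in range(steps):
            batch = X[rng.choice(len(X), size=min(64, len(X)), replace=False)]
            diff = batch - x
            w = np.exp(-np.sum(diff ** 2, axis=1) / (2.0 * h ** 2))
            # Step along the mini-batch KDE gradient direction.
            x += lr * (w[:, None] * diff).sum(axis=0) / (w.sum() + 1e-12)
    modes = []
    for x in starts:                                   # merge near-duplicate modes
        if not any(np.linalg.norm(x - m) < h for m in modes):
            modes.append(x)
    return np.array(modes)

def fuzzy_memberships(X, modes, m=2.0):
    d = np.linalg.norm(X[:, None] - modes[None], axis=2) + 1e-12
    u = d ** (-2.0 / (m - 1.0))                        # fuzzy-c-means-style weights
    return u / u.sum(axis=1, keepdims=True)

if __name__ == "__main__":
    rng = np.random.default_rng(2)
    X = np.vstack([rng.normal(0, 0.3, (100, 2)), rng.normal(3, 0.3, (100, 2))])
    modes = find_modes(X)
    print(len(modes), fuzzy_memberships(X, modes).shape)
```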

    A Selective Committee Architecture for Time Series Prediction and Pattern Classification

    In this report we describe a novel technique to generate a committee architecture for time series prediction. The algorithm, here named the Selective Multiple Prediction Network, consists of three steps: a systematic partition of the input hyperspace, a selective training of many agents, and a flexible combining strategy. Potentially uncorrelated agents are generated, which improves the combination process. The proposed architecture is easily extended to classification problems.
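
    A minimal sketch of the three-step idea, under stated assumptions rather than the report's Selective Multiple Prediction Network: partition the lag-vector space with k-means-style centers, fit one linear "agent" per region, and combine the agents with distance-based weights.

```python
# Minimal sketch under stated assumptions (not the report's exact network):
# 1) partition the lag-vector space with k-means-style centers,
# 2) fit one linear "agent" per region on the samples assigned to it,
# 3) combine agent predictions with weights inversely related to distance.
import numpy as np

def embed(series, lags=4):
    X = np.stack([series[i:len(series) - lags + i] for i in range(lags)], axis=1)
    return X, series[lags:]

def fit_committee(X, y, n_regions=3, iters=20, seed=3):
    rng = np.random.default_rng(seed)
    centers = X[rng.choice(len(X), n_regions, replace=False)].copy()
    for _ in range(iters):                              # simple k-means partition
        assign = np.argmin(((X[:, None] - centers[None]) ** 2).sum(axis=2), axis=1)
        for k in range(n_regions):
            if np.any(assign == k):
                centers[k] = X[assign == k].mean(axis=0)
    agents = []
    for k in range(n_regions):                          # selective training per region
        Xk, yk = X[assign == k], y[assign == k]
        A = np.hstack([Xk, np.ones((len(Xk), 1))])
        agents.append(np.linalg.lstsq(A, yk, rcond=None)[0])
    return centers, agents

def predict(x, centers, agents):
    d = np.linalg.norm(centers - x, axis=1) + 1e-12
    weights = (1.0 / d) / (1.0 / d).sum()               # distance-based combination
    preds = np.array([np.append(x, 1.0) @ theta for theta in agents])
    return float(weights @ preds)

if __name__ == "__main__":
    t = np.arange(400)
    series = np.sin(0.2 * t) + 0.05 * np.random.default_rng(4).standard_normal(len(t))
    X, y = embed(series)
    centers, agents = fit_committee(X, y)
    print(round(predict(X[-1], centers, agents), 3), round(float(y[-1]), 3))
```

    Because each agent is trained only on samples from its own region of the input hyperspace, the agents' errors tend to be less correlated, which is the property the combining step is meant to exploit.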

    Computational Properties of Generalized Hopfield Networks Applied to Nonlinear Optimization

    A nonlinear neural framework, called the Generalized Hopfield Network (GHN), is proposed, which is able to solve systems of nonlinear equations in a parallel, distributed manner. The method is applied to the general nonlinear optimization problem. We demonstrate GHNs implementing the three most important optimization algorithms, namely the Augmented Lagrangian, Generalized Reduced Gradient, and Successive Quadratic Programming methods. The study results in a dynamic view of the optimization problem and offers a straightforward model for the parallelization of the optimization computations, thus significantly extending the practical limits of problems that can be formulated as optimization problems and that can gain from the introduction of nonlinearities in their structure (e.g., pattern recognition, supervised learning, design of content-addressable memories).
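
    As a schematic illustration of the dynamic view described above (assumptions, not the paper's GHN formulation), the sketch below lets the variables and the multiplier of a small equality-constrained problem evolve by a discretized gradient flow on an augmented Lagrangian.

```python
# Schematic sketch (assumptions, not the paper's GHN equations): the variables
# x and the multiplier lam evolve by a discretized gradient flow on the
# augmented Lagrangian  L(x, lam) = f(x) + lam * g(x) + (c / 2) * g(x)**2
# for  min f(x) = (x1 - 1)^2 + (x2 - 2)^2   subject to  g(x) = x1 + x2 - 2 = 0.
import numpy as np

def f_grad(x):
    return np.array([2.0 * (x[0] - 1.0), 2.0 * (x[1] - 2.0)])

def g(x):
    return x[0] + x[1] - 2.0

def g_grad(x):
    return np.array([1.0, 1.0])

def ghn_augmented_lagrangian(x0, lam0=0.0, c=10.0, dt=0.01, steps=5000):
    x, lam = np.array(x0, dtype=float), lam0
    for _ in range(steps):
        dx = -(f_grad(x) + (lam + c * g(x)) * g_grad(x))   # descend in x
        dlam = c * g(x)                                    # ascend in the multiplier
        x += dt * dx
        lam += dt * dlam
    return x, lam

if __name__ == "__main__":
    x_star, lam_star = ghn_augmented_lagrangian([0.0, 0.0])
    print(np.round(x_star, 3), round(lam_star, 3))   # analytic optimum is (0.5, 1.5)
```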