7 research outputs found

    A New Approach to the Design and Training of Neural Networks for Classification

    Thesis (Doctor of Engineering, specialization in Systems Engineering), UANL, 2001. http://www.uanl.mx

    Sample Complexity for Learning Recurrent Perceptron Mappings

    Recurrent perceptron classifiers generalize the classical perceptron model. They take into account those correlations and dependencies among input coordinates which arise from linear digital filtering. This paper provides tight bounds on the sample complexity associated with fitting such models to experimental data. Keywords: perceptrons, recurrent models, neural networks, learning, Vapnik-Chervonenkis dimension. 1 Introduction. One of the most popular approaches to binary pattern classification, underlying many statistical techniques, is based on perceptrons or linear discriminants; see for instance the classical reference [9]. In this context, one is interested in classifying k-dimensional input patterns v = (v_1, ..., v_k) into two disjoint classes A^+ and A^-. A perceptron P which classifies vectors into A^+ and A^- is characterized by a vector (of "weights") c ∈ R^k, and operates as follows. One forms the inner product c·v = c_1 v_1 + ... + c_k v_k. ...
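    The classification rule sketched in the abstract — a weight vector c ∈ R^k assigning a pattern v to A^+ or A^- by the sign of the inner product c·v — can be illustrated with a minimal sketch; the weight and pattern values below are invented for illustration and are not from the paper:

    ```python
    # Minimal sketch of the classical perceptron rule from the abstract:
    # classify v into A+ or A- according to the sign of the inner product c.v.
    def classify(c, v):
        """Return '+' if v is assigned to A+, '-' if to A-."""
        dot = sum(ci * vi for ci, vi in zip(c, v))  # c.v = c_1 v_1 + ... + c_k v_k
        return '+' if dot >= 0 else '-'

    c = [1.0, -2.0, 0.5]                 # hypothetical weight vector in R^3
    print(classify(c, [3.0, 1.0, 0.0]))  # c.v = 1.0  -> '+'
    print(classify(c, [0.0, 2.0, 1.0]))  # c.v = -3.5 -> '-'
    ```

    The recurrent perceptrons studied in the paper extend this by letting the effective input coordinates arise from linear digital filtering of the raw sequence, which this static sketch does not model.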
