
    Second-Order Belief Hidden Markov Models

    Hidden Markov Models (HMMs) are learning methods for pattern recognition, and probabilistic HMMs have been among the most widely used techniques based on the Bayesian model. First-order probabilistic HMMs have been adapted to the theory of belief functions by replacing Bayesian probabilities with mass functions. In this paper, we present a second-order Hidden Markov Model using belief functions. Previous work on belief HMMs has focused on the first-order model; we extend it to the second order.
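
    In a second-order model, a transition depends on the two preceding states rather than one. The sketch below shows that structure in the ordinary probabilistic setting only; the paper replaces these probabilities with belief mass functions, which is not reproduced here, and the toy parameters and names are illustrative assumptions.

```python
import numpy as np

# Illustrative toy parameters (assumptions, not taken from the paper):
pi = np.array([0.6, 0.4])                    # initial distribution P(s_1)
A1 = np.array([[0.7, 0.3],                   # first transition P(s_2 | s_1)
               [0.4, 0.6]])
# Second-order transitions: A2[i, j, k] = P(s_t = k | s_{t-2} = i, s_{t-1} = j)
A2 = np.array([[[0.8, 0.2], [0.5, 0.5]],
               [[0.3, 0.7], [0.6, 0.4]]])
B = np.array([[0.9, 0.1],                    # emissions P(o_t | s_t)
              [0.2, 0.8]])

def forward_second_order(obs):
    """Likelihood of an observation sequence; alpha is indexed by state pairs."""
    # alpha[i, j] = P(o_1..o_t, s_{t-1} = i, s_t = j)
    alpha = pi[:, None] * B[:, obs[0]][:, None] * A1 * B[None, :, obs[1]]
    for o in obs[2:]:
        # sum out the oldest state, step the pair forward, emit the new symbol
        alpha = np.einsum('ij,ijk->jk', alpha, A2) * B[None, :, o]
    return alpha.sum()

print(forward_second_order([0, 1, 1, 0]))
```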

    Fusion of probabilistic knowledge-based classification rules and learning automata for automatic recognition of digital images

    In this paper, the fusion of probabilistic knowledge-based classification rules and learning automata theory is proposed, and as a result we present a set of probabilistic classification rules with self-learning capability. The probabilities of the classification rules change dynamically, guided by a supervised reinforcement process aimed at obtaining optimum classification accuracy. This novel classifier is applied to the automatic recognition of digital images corresponding to visual landmarks for the autonomous navigation of an unmanned aerial vehicle (UAV) developed by the authors. The classification accuracy of the proposed classifier and its comparison with well-established pattern recognition methods are finally reported.
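
    The abstract does not specify the reinforcement scheme, so the sketch below assumes a standard linear reward-inaction learning automaton: when the rule chosen to classify a sample is correct, its selection probability is reinforced at the expense of the others. The setup and names are illustrative, not the authors' implementation.

```python
import numpy as np

rng = np.random.default_rng(0)
p = np.full(4, 0.25)             # selection probabilities of 4 competing rules

def update(p, chosen, correct, lr=0.1):
    """Linear reward-inaction: reinforce the chosen rule only on a correct answer."""
    if correct:
        p = p * (1 - lr)         # shrink every probability...
        p[chosen] += lr          # ...and give the freed mass to the chosen rule
    return p / p.sum()           # renormalize for numerical safety

# Toy supervised loop in which rule 2 happens to be the reliable one:
for _ in range(500):
    chosen = rng.choice(len(p), p=p)
    p = update(p, chosen, correct=(chosen == 2))
print(p)                         # mass concentrates on rule 2
```

    Reward-inaction (leaving the probabilities untouched on an error) is one common automaton scheme among several; a reward-penalty variant would also decay the chosen rule on misclassification.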

    A Reverse Hierarchy Model for Predicting Eye Fixations

    A body of psychological and physiological evidence suggests that early visual attention works in a coarse-to-fine way, which lays the basis for the reverse hierarchy theory (RHT). This theory states that attention propagates from the top level of the visual hierarchy, which processes the gist and abstract information of the input, to the bottom level, which processes local details. Inspired by the theory, we develop a computational model for saliency detection in images. First, the original image is downsampled to different scales to constitute a pyramid. Then, saliency on each layer is obtained by image super-resolution reconstruction from the layer above, and is defined as the unpredictability of this coarse-to-fine reconstruction. Finally, saliency on each layer of the pyramid is fused into stochastic fixations through a probabilistic model, where attention initiates from the top layer and propagates downward through the pyramid. Extensive experiments on two standard eye-tracking datasets show that the proposed method achieves results competitive with state-of-the-art models. (CVPR 2014, 27th IEEE Conference on Computer Vision and Pattern Recognition.)
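
    The pipeline admits a compact sketch: build a downsampling pyramid, predict each layer from the coarser one above it, and treat the prediction error as saliency. The paper uses image super-resolution for the reconstruction and a probabilistic fixation model for the fusion; the nearest-neighbour upsampling and plain averaging below are simplifying assumptions.

```python
import numpy as np

def downsample(img):
    """2x2 block averaging."""
    h, w = img.shape
    return img[:h//2*2, :w//2*2].reshape(h//2, 2, w//2, 2).mean(axis=(1, 3))

def upsample(img):
    """Nearest-neighbour 2x upsampling (a stand-in for the paper's super-resolution)."""
    return np.repeat(np.repeat(img, 2, axis=0), 2, axis=1)

def saliency(img, levels=3):
    # coarse-to-fine pyramid
    pyramid = [img]
    for _ in range(levels):
        pyramid.append(downsample(pyramid[-1]))
    fused = np.zeros_like(img)
    for lvl in range(levels):
        # saliency of a layer = how badly its coarser parent predicts it
        up = upsample(pyramid[lvl + 1])
        err = np.abs(pyramid[lvl][:up.shape[0], :up.shape[1]] - up)
        for _ in range(lvl):                 # bring the map to full resolution
            err = upsample(err)
        fused += err[:img.shape[0], :img.shape[1]] / levels
    return fused

img = np.random.rand(64, 64)
print(saliency(img).shape)                   # (64, 64)
```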

    Applications of Nonclassical Logic Methods for Purposes of Knowledge Discovery and Data Mining

    The work is partially supported by Grant no. NIP917 of the Ministry of Science and Education, Republic of Bulgaria. Methods for solving a large class of problems on the basis of nonclassical, multiple-valued, and probabilistic logics are discussed. A theory of knowledge about changing knowledge, a theory of defeasible inference, and a network approach to analogous derivation are suggested. A method for regularity search, together with logic-axiomatic and logic-probabilistic methods for learning of terms and pattern recognition in the case of multiple-valued logic, is described and generalized. Defeasible analogical inference and new forms of inference using exclusions are considered. The methods are applicable in a broad range of intelligent systems.
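
    The abstract gives no technical detail, but as a small, assumed illustration of the multiple-valued setting it refers to, here are Kleene's strong three-valued connectives, with 0.5 standing for "unknown":

```python
# Kleene's strong three-valued logic over the values 0 (false), 0.5 (unknown), 1 (true).
def k_and(a, b): return min(a, b)
def k_or(a, b):  return max(a, b)
def k_not(a):    return 1 - a

# Print the full truth tables of the three connectives.
for a in (0, 0.5, 1):
    for b in (0, 0.5, 1):
        print(a, b, k_and(a, b), k_or(a, b), k_not(a))
```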

    Computation of moments for probabilistic finite-state automata

    The computation of moments of probabilistic finite-state automata (PFA) is studied in this article. First, the computation of moments of the length of the paths is introduced for general PFA; then, the computation of moments of the number of times that a symbol appears in the strings generated by the PFA is described. These computations require a matrix inversion. Acyclic PFA, such as word graphs, are quite common in many practical applications. Algorithms for the efficient computation of the moments for acyclic PFA are also presented in this paper. This work has been partially supported by the Ministerio de Ciencia y Tecnologia under grant TIN2017-91452-EXP (IBEM), by the Generalitat Valenciana under grant PROMETEO/2019/121 (DeepPattern), and by the grant "Ayudas Fundacion BBVA a equipos de investigacion cientifica 2018" (PR[8]_HUM_C2_0087).
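
    The matrix inversion the abstract mentions has a classical form for the path-length moments: with Q the transition matrix restricted to non-final states, the fundamental matrix N = (I - Q)^{-1} gives the expected path length t = N·1, and the second moment follows as E[L^2] = (2N - I)·t. The toy automaton below is an illustrative assumption, not the authors' example.

```python
import numpy as np

Q = np.array([[0.0, 0.5],    # P(state i -> state j) among non-final states;
              [0.3, 0.2]])   # the remaining mass goes to final (accepting) states
I = np.eye(len(Q))
N = np.linalg.inv(I - Q)          # fundamental matrix: expected visit counts

ones = np.ones(len(Q))
mean_len = N @ ones               # first moment of path length from each state
second = (2 * N - I) @ mean_len   # second moment: E[L^2] = (2N - I) t
var_len = second - mean_len**2

print(mean_len[0], var_len[0])    # moments for paths starting in state 0
```

    This direct inversion works for any consistent PFA; for the acyclic case (word graphs) the paper's point is that the same moments can be obtained more efficiently, e.g. by a single topological-order sweep instead of a full inversion.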