To our knowledge, only a few systems are able to automatically
translate handwritten text images into another language, in particular,
Arabic. Typically, the available systems are based on a concatenation of
two systems: a Handwritten Text Recognition (HTR) system and a Machine
Translation (MT) system. Roughly speaking, in the case of recognition
of Arabic text images, our work has focused on the use of the embedded
Bernoulli (mixture) HMMs (BHMMs), that is, embedded HMMs in which
the emission probabilities are modeled with Bernoulli mixtures. In the case
of Arabic text translation, our work has focused on one of the state-of-the-art
phrase-based log-linear translation models. In this work we evaluate our
system on the LDC corpus introduced in the NIST OpenHaRT 2010 and
2013 evaluations, and report very competitive and promising results. Additionally,
we present the idea of a simple mobile application system for image
translation that recognizes the Arabic text in an image and translates the
recognized text into English.

Alkhoury, I. (2013). Arabic recognition and translation system. http://hdl.handle.net/10251/33086
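As context for the emission model described above, the probability of a binary feature vector x under a Bernoulli mixture is p(x) = Σ_k w_k Π_d p_kd^x_d (1 − p_kd)^(1 − x_d), where w_k are the mixture weights and p_kd the per-dimension prototype probabilities. A minimal sketch of this computation (function name, array shapes, and parameter values are illustrative, not the thesis implementation):

```python
import numpy as np

def bernoulli_mixture_likelihood(x, weights, protos):
    """Emission probability of a binary feature vector x under a Bernoulli
    mixture: sum_k w_k * prod_d p_kd^x_d * (1 - p_kd)^(1 - x_d).

    weights: (K,) mixture coefficients summing to 1.
    protos:  (K, D) Bernoulli prototype probabilities, each in (0, 1).
    """
    x = np.asarray(x, dtype=float)
    # Per-component log-likelihoods, computed in log space for numerical stability.
    log_comp = (x * np.log(protos) + (1.0 - x) * np.log(1.0 - protos)).sum(axis=1)
    return float(np.dot(weights, np.exp(log_comp)))
```

In an embedded BHMM, one such mixture serves as the emission distribution of each HMM state, evaluated column by column on the binarized text image.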
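The phrase-based log-linear model mentioned above scores a translation hypothesis e of a source sentence f as a weighted sum of feature functions, p(e|f) ∝ exp(Σ_m λ_m h_m(e, f)), so decoding amounts to picking the hypothesis that maximizes that sum. A hedged sketch of the scoring step (feature names such as "tm" and "lm" and the weight values are illustrative assumptions, not the system's actual feature set):

```python
def loglinear_score(features, lambdas):
    """Weighted feature sum sum_m lambda_m * h_m(e, f) for one hypothesis.

    features: dict mapping feature name -> log feature value h_m.
    lambdas:  dict mapping the same names -> model weights lambda_m.
    """
    return sum(lambdas[name] * h for name, h in features.items())

def best_hypothesis(hypotheses, lambdas):
    """Return the hypothesis with the highest log-linear score."""
    return max(hypotheses, key=lambda hyp: loglinear_score(hyp["features"], lambdas))
```

Typical features include the phrase translation model, the target language model, distortion, and word penalty; the weights λ_m are tuned on held-out data.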