1,511 research outputs found

    Boosting Handwriting Text Recognition in Small Databases with Transfer Learning

    In this paper we deal with the offline handwriting text recognition (HTR) problem with reduced training datasets. Recent HTR solutions based on artificial neural networks achieve remarkable results on reference databases. These deep neural networks combine convolutional layers (CNN) and long short-term memory recurrent units (LSTM). In addition, connectionist temporal classification (CTC) is the key to avoiding segmentation at the character level, greatly facilitating the labeling task. One of the main drawbacks of CNN-LSTM-CTC (CLC) solutions is that they need a considerable amount of transcribed text for every type of calligraphy, typically on the order of a few thousand lines. Furthermore, in some scenarios the text to transcribe is not that long, e.g. in the Washington database, and the CLC typically overfits with such a reduced number of training samples. Our proposal is based on transfer learning (TL) from the parameters learned with a bigger database. We first investigate, for a reduced and fixed number of training samples (350 lines), how the learning from a large database (IAM) can be transferred to the CLC of a reduced database (Washington), focusing on which layers of the network need not be re-trained. We conclude that the best solution is to re-train the whole CLC, with its parameters initialized to the values obtained after training the CLC on the larger database. We also investigate results when the training size is further reduced. The differences in character error rate (CER) are most remarkable when training with just 350 lines: a CER of 3.3% is achieved with TL, while training from scratch yields a CER of 18.2%. As a byproduct, the learning times are greatly reduced. Similarly good results are obtained on the Parzival database when training with this reduced number of lines and the new approach. Comment: ICFHR 2018 Conference
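
    As a rough illustration of the transfer-learning recipe described in this abstract (initialize a CNN-LSTM-CTC model with the parameters learned on the large database and re-train all of its layers on the small one), here is a minimal PyTorch-style sketch. The toy CRNN architecture, layer sizes and optimizer settings are placeholders chosen for illustration, not the configuration used in the paper.

```python
# Minimal transfer-learning sketch for a CNN-LSTM-CTC (CLC) handwriting recognizer.
# Everything below is illustrative: the architecture and hyper-parameters are assumptions.
import torch
import torch.nn as nn

class CRNN(nn.Module):
    """Toy CNN-LSTM-CTC recognizer; layer sizes are illustrative only."""
    def __init__(self, n_classes=80):
        super().__init__()
        self.cnn = nn.Sequential(
            nn.Conv2d(1, 32, 3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
            nn.Conv2d(32, 64, 3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
        )
        self.lstm = nn.LSTM(input_size=64 * 16, hidden_size=256,
                            num_layers=2, bidirectional=True, batch_first=True)
        self.head = nn.Linear(512, n_classes + 1)      # +1 output for the CTC blank

    def forward(self, x):                              # x: (B, 1, 64, W) line images
        f = self.cnn(x)                                # (B, 64, 16, W/4)
        f = f.permute(0, 3, 1, 2).flatten(2)           # (B, W/4, 64*16) time-major features
        y, _ = self.lstm(f)
        return self.head(y).log_softmax(-1)            # log-probabilities for CTC

# Stand-in for a model already trained on the large database (e.g. IAM).
pretrained = CRNN()

# Model for the small database (e.g. Washington): copy *all* parameters ...
model = CRNN()
model.load_state_dict(pretrained.state_dict())

# ... and re-train the whole network on the few available lines, which is the
# best-performing configuration reported in the abstract (no frozen layers).
optimizer = torch.optim.Adam(model.parameters(), lr=1e-4)
ctc_loss = nn.CTCLoss(blank=80)                        # blank index = the extra output unit
```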

    Higher order Whitehead products and L∞ structures on the homology of a DGL

    We detect higher order Whitehead products on the homology H of a differential graded Lie algebra L in terms of higher brackets in the transferred L∞ structure on H via a given homotopy retraction of L onto H. Comment: New references and minor correction
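
    For orientation, the transferred brackets mentioned here come from the standard homotopy transfer construction. Schematically (signs and the exact summation conventions are omitted, and the formulas below are the generic transfer formulas rather than the paper's own statements), given a homotopy retraction consisting of maps i: H -> L, p: L -> H and h: L -> L with p i = id_H and i p - id_L = d h + h d (up to the chosen sign conventions), the first transferred brackets on H read:

```latex
\ell_2(x_1, x_2) = p\,[\, i x_1, i x_2 \,],
\qquad
\ell_3(x_1, x_2, x_3) = \sum_{\sigma} \pm\, p\,\bigl[\, h[\, i x_{\sigma(1)}, i x_{\sigma(2)} \,],\, i x_{\sigma(3)} \bigr],
```

    and, in general, the higher bracket ℓ_k is a sum over rooted trees with k leaves, with i placed at the leaves, the Lie bracket of L at the internal vertices, h on the internal edges and p at the root. The abstract's statement is that the higher order Whitehead products on H can be detected inside these higher transferred brackets.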

    Tree-structure Expectation Propagation for Decoding LDPC codes over Binary Erasure Channels

    Expectation Propagation (EP) is a generalization of Belief Propagation (BP) in two ways. First, it can be used with any exponential-family distribution over the cliques in the graph. Second, it can impose additional constraints on the marginal distributions. We use this second property to impose pair-wise marginal distribution constraints in some check nodes of the LDPC Tanner graph. These additional constraints allow decoding the received codeword when the BP decoder gets stuck. In this paper, we first present the new decoding algorithm, whose complexity is identical to that of the BP decoder, and we then prove that it is able to decode codewords with a larger fraction of erasures as the block size tends to infinity. The proposed algorithm can also be understood as a simplification of the Maxwell decoder, but without its computational complexity. We also illustrate that the new algorithm outperforms the BP decoder for finite block sizes.
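
    To make the baseline concrete, the sketch below shows (as an assumption of how one might code it, not the authors' implementation) the standard BP/peeling decoder over the BEC that the proposed algorithm extends: it repeatedly resolves check nodes that contain a single erased variable, and it "gets stuck" exactly when no such check remains. The pair-wise constraints described above additionally exploit checks left with two erased variables; a sketch of that extra step is given after the Tree-EP abstract further down this list.

```python
# Peeling formulation of BP erasure decoding over the BEC; purely illustrative.
import numpy as np

def bp_peeling_decode(H, y):
    """H: (m, n) 0/1 parity-check matrix; y: received word, with None marking erasures.
    Returns the word with every erasure that BP can resolve filled in."""
    y = list(y)
    progress = True
    while progress:
        progress = False
        for row in H:
            idx = np.flatnonzero(row)
            erased = [i for i in idx if y[i] is None]
            if len(erased) == 1:                       # degree-one check node
                # The check parity must be even, so the erased bit equals the
                # XOR of the known bits participating in the check.
                y[erased[0]] = sum(y[i] for i in idx if y[i] is not None) % 2
                progress = True
    return y                                           # BP is stuck if any None remains

# Toy example: all-zero codeword of a small code with erasures on positions 1 and 4.
H = np.array([[1, 1, 0, 1, 1, 0, 0],
              [1, 0, 1, 1, 0, 1, 0],
              [0, 1, 1, 1, 0, 0, 1]])
print(bp_peeling_decode(H, [0, None, 0, 0, None, 0, 0]))   # -> [0, 0, 0, 0, 0, 0, 0]
```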

    Turbo EP-based Equalization: a Filter-Type Implementation

    This manuscript has been submitted to Transactions on Communications on September 7, 2017; revised on January 10, 2018 and March 27, 2018; and accepted on April 25, 2018. We propose a novel filter-type equalizer to improve the solution of the linear minimum-mean squared-error (LMMSE) turbo equalizer, with computational complexity constrained to be quadratic in the filter length. When high-order modulations and/or large-memory channels are used, the optimal BCJR equalizer is unavailable due to its computational complexity. In this scenario, filter-type LMMSE turbo equalization exhibits good performance compared to other approximations. In this paper, we show that this solution can be significantly improved by using expectation propagation (EP) in the estimation of the a posteriori probabilities. First, it yields a more accurate estimation of the extrinsic distribution sent to the channel decoder. Second, compared to other EP-based solutions, the computational complexity of the proposed solution is constrained to be quadratic in the length of the finite impulse response (FIR). In addition, we review previous EP-based turbo equalization implementations. Instead of considering default uniform priors, we exploit the outputs of the decoder. Simulation results are included to show that this new EP-based filter remarkably outperforms the turbo approach of previous versions of the EP algorithm and also improves on the LMMSE solution, with and without turbo equalization.
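
    For context, both the classical LMMSE turbo equalizer and its EP-based refinement are built around the same Gaussian posterior update of the transmitted symbols given the received block and per-symbol prior moments. The numpy sketch below shows that block LMMSE step with generic prior means and variances (uniform priors at the first iteration; decoder- or EP-refined moments in later turbo iterations). It is an illustrative assumption of the setting, not the paper's filter-type implementation, which applies the same idea over a sliding FIR window to keep the complexity quadratic in the filter length.

```python
# Block LMMSE symbol estimation, the building block of turbo/EP equalization (illustrative).
import numpy as np

def lmmse_equalize(y, H, prior_mean, prior_var, noise_var):
    """Model y = H x + w with w ~ N(0, noise_var * I) and independent symbol priors
    x_k ~ (prior_mean[k], prior_var[k]). Returns posterior means and variances of x."""
    V = np.diag(prior_var)
    G = V @ H.conj().T @ np.linalg.inv(H @ V @ H.conj().T + noise_var * np.eye(len(y)))
    post_mean = prior_mean + G @ (y - H @ prior_mean)
    post_var = np.real(np.diag(V - G @ H @ V))
    return post_mean, post_var

# Toy usage: BPSK over a 3-tap channel, uniform priors (zero mean, unit variance).
# In a turbo/EP loop the priors would be replaced by the decoder's or EP's refined moments.
rng = np.random.default_rng(0)
h = np.array([0.8, 0.5, 0.3])
x = rng.choice([-1.0, 1.0], size=20)
H = np.array([[h[i - j] if 0 <= i - j < len(h) else 0.0 for j in range(20)]
              for i in range(20)])
y = H @ x + 0.1 * rng.standard_normal(20)
symbol_mean, symbol_var = lmmse_equalize(y, H, np.zeros(20), np.ones(20), 0.01)
```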

    Trace elements and C and N isotope composition in two mushroom species from a mine-spill contaminated site

    Fungi play a key role in the functioning of soil in terrestrial ecosystems, and in particular in the remediation of degraded soils. The contribution of fungi to carbon and nutrient cycles, along with their capability to mobilise soil trace elements, is well-known. However, the importance of life history strategy for these functions has not yet been thoroughly studied. This study explored the soil-fungi relationship of two wild edible fungi, the ectomycorrhizal Laccaria laccata and the saprotroph Volvopluteus gloiocephalus. Fruiting bodies and surrounding soils in a mine-spill contaminated area were analysed. Isotope analyses revealed that Laccaria laccata fruiting bodies were 15N-enriched compared to those of Volvopluteus gloiocephalus, likely due to the transfer of 15N-depleted compounds to their host plant. Moreover, the δ13C values of Laccaria laccata fruiting bodies were closer to those of the host plant than to those of the surrounding soil, while Volvopluteus gloiocephalus matched the δ13C composition of the soil. Both fungal species presented high bioaccumulation and high concentrations of Cd and Cu in their fruiting bodies. Human consumption of these fruiting bodies may represent a toxicological risk due to their elevated Cd concentrations.

    Coordinating heterogeneous IoT devices by means of the centralized vision of the SDN controller

    The IoT (Internet of Things) has become a reality in recent years. The desire to have everything connected to the Internet results in clearly identified benefits that will impact socio-economic development. However, the exponential growth in the number of IoT devices and their heterogeneity open new challenges that must be carefully studied. Coordinating devices to adapt them to their users' context usually requires high volumes of data to be exchanged with the cloud. In order to reduce unnecessary communications and network overhead, this paper proposes a novel network architecture based on the Software-Defined Networking paradigm that allows IoT devices to coordinate and adapt themselves within the scope of a particular context. Universidad de Málaga. Campus de Excelencia Internacional Andalucía Tech
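
    As a purely illustrative sketch of this coordination idea (the class, device model and rule below are assumptions made for illustration; they are not the architecture or API proposed in the paper, and no real SDN controller framework is used), a controller holding the centralized view can react to a device event locally instead of pushing every context update to the cloud:

```python
# Toy centralized controller coordinating heterogeneous devices from its global view.
from dataclasses import dataclass, field

@dataclass
class Device:
    dev_id: str
    kind: str                               # e.g. "light", "presence-sensor"
    state: dict = field(default_factory=dict)

class ContextController:
    """Keeps the centralized view of all registered devices and applies local rules."""
    def __init__(self):
        self.devices = {}                   # dev_id -> Device

    def register(self, dev):
        self.devices[dev.dev_id] = dev

    def on_event(self, dev_id, update):
        # Update the global view from a single device report ...
        self.devices[dev_id].state.update(update)
        # ... and coordinate the other devices locally, with no cloud round-trip.
        if update.get("presence") is False:
            for dev in self.devices.values():
                if dev.kind == "light":
                    dev.state["power"] = "off"

# Usage: a presence sensor reports an empty room; the controller switches the lamp off.
ctrl = ContextController()
ctrl.register(Device("sensor-1", "presence-sensor"))
ctrl.register(Device("lamp-1", "light"))
ctrl.on_event("sensor-1", {"presence": False})
print(ctrl.devices["lamp-1"].state)         # -> {'power': 'off'}
```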

    Tree-Structure Expectation Propagation for LDPC Decoding over the BEC

    We present the tree-structure expectation propagation (Tree-EP) algorithm to decode low-density parity-check (LDPC) codes over discrete memoryless channels (DMCs). EP generalizes belief propagation (BP) in two ways. First, it can be used with any exponential-family distribution over the cliques in the graph. Second, it can impose additional constraints on the marginal distributions. We use this second property to impose pair-wise marginal constraints over pairs of variables connected to a check node of the LDPC code's Tanner graph. Thanks to these additional constraints, the Tree-EP marginal estimates for each variable in the graph are more accurate than those provided by BP. We also reformulate the Tree-EP algorithm for the binary erasure channel (BEC) as a peeling-type algorithm (TEP), and we show that the algorithm has the same computational complexity as BP and decodes a higher fraction of erasures. We describe the TEP decoding process by a set of differential equations that represent the expected residual graph evolution as a function of the code parameters. The solution of these equations is used to predict the TEP decoder performance in both the asymptotic regime and the finite-length regime over the BEC. While the asymptotic threshold of the TEP decoder is the same as that of the BP decoder for regular and optimized codes, we propose a scaling law (SL) for finite-length LDPC codes that accurately approximates the TEP decoder's improved performance and facilitates its optimization.
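
    Complementing the plain peeling sketch given earlier for the BEC, the toy code below illustrates the extra TEP step under the usual description of the decoder as processing both degree-one and degree-two check nodes of the residual graph (this is a sketch of that idea, not the authors' code): a check left with exactly two erased variables fixes their XOR, so the two variables are merged into a single "super-variable" through a union-find structure with a parity offset, after which ordinary peeling continues.

```python
# TEP-style erasure decoding sketch: degree-one checks are solved as in BP peeling,
# degree-two checks merge their two erased variables. Purely illustrative.
import numpy as np

def tep_decode(H, y):
    """H: (m, n) 0/1 parity-check matrix; y: received word, with None marking erasures."""
    n = len(y)
    parent, offset = list(range(n)), [0] * n      # x_i = x_root(i) XOR offset(i)
    value = list(y)                               # values are stored at group roots

    def find(i):                                  # union-find with parity, path compression
        if parent[i] == i:
            return i, 0
        root, par = find(parent[i])
        parent[i], offset[i] = root, offset[i] ^ par
        return root, offset[i]

    def val(i):                                   # current value of x_i, or None if unknown
        r, p = find(i)
        return None if value[r] is None else value[r] ^ p

    progress = True
    while progress:
        progress = False
        for row in H:
            # Reduce the check to its unknown "super-variables" (group roots).
            syndrome, roots = 0, {}
            for i in np.flatnonzero(row):
                v = val(i)
                if v is not None:
                    syndrome ^= v
                else:
                    r, p = find(i)
                    syndrome ^= p
                    roots[r] = roots.get(r, 0) ^ 1     # keep roots of odd multiplicity
            unknown = [r for r, c in roots.items() if c]
            if len(unknown) == 1:                      # degree-one check: solve it (BP step)
                value[unknown[0]] = syndrome
                progress = True
            elif len(unknown) == 2:                    # degree-two check: TEP merge step
                a, b = unknown
                parent[b], offset[b] = a, syndrome     # x_b = x_a XOR syndrome
                progress = True
    return [val(i) for i in range(n)]

# Toy example: all-zero codeword with erasures on positions 0, 1 and 2. Plain BP peeling
# is stuck here (every check contains at least two erasures), but the merge step decodes.
H = np.array([[1, 1, 0, 1, 0, 0],
              [1, 1, 1, 0, 1, 0],
              [1, 0, 1, 0, 0, 1]])
print(tep_decode(H, [None, None, None, 0, 0, 0]))      # -> [0, 0, 0, 0, 0, 0]
```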

    Growth of Single-Crystal LiNbO₃ Particles by Aerosol-Assisted Chemical Vapor Deposition Method

    By adjusting the nucleation conditions, effective shape and size control in the preparation of single-crystal lithium niobate nanoparticles by the aerosol-assisted chemical vapor deposition method was demonstrated. The effect of the most relevant parameters on the shape and size adopted by the nanocrystals once they are synthesized was analyzed. This allowed us to demonstrate that it is possible to control the size and morphology of the particles by adjusting the nucleation conditions. The synthesized nanocrystals showed different morphologies, including quasi-cubic, tetrahedral, polyhedral, and hexagonal shapes, with characteristic sizes ranging from a few tens to a few hundred nanometers. Rod-like structures with characteristic lengths ranging from 3 to 5 μm were also obtained. The structural and morphological characterization by X-ray diffraction and high-resolution electron microscopy revealed the single-crystal nature of the synthesized particles.

    Neuroscience and subjectivity. A proposal for cooperation between neuroscience and some philosophical traditions

    One of the main challenges of biology is to articulate a coherent view of the central nervous system and its structure. To this end, neuroscience has emerged as an interdisciplinary project that seeks to integrate the different disciplines
    • …