44,765 research outputs found

    MedlinePlus®: The National Library of Medicine® Brings Quality Information to Health Consumers

    The National Library of Medicine's (NLM®) MedlinePlus® is a high-quality gateway to consumer health information from NLM, the National Institutes of Health (NIH), and other authoritative organizations. For decades, NLM has been a leader in indexing, organizing, and distributing health information to health professionals. In creating MedlinePlus, NLM uses years of accumulated expertise and technical knowledge to produce an authoritative, reliable consumer health Web site. This article describes the development of MedlinePlus: its quality control processes, the integration of NLM and NIH information, NLM's relationship to other institutions, the technical and staffing infrastructures, the use of feedback for quality improvement, and future plans.

    Enriching Rare Word Representations in Neural Language Models by Embedding Matrix Augmentation

    Neural language models (NLM) achieve strong generalization capability by learning dense representations of words and using them to estimate the probability distribution function. However, learning the representation of rare words is a challenging problem, causing the NLM to produce unreliable probability estimates. To address this problem, we propose a method to enrich the representations of rare words in a pre-trained NLM and consequently improve its probability estimation performance. The proposed method augments the word embedding matrices of the pre-trained NLM while keeping other parameters unchanged. Specifically, our method updates the embedding vectors of rare words using the embedding vectors of other semantically and syntactically similar words. To evaluate the proposed method, we enrich the rare street names in the pre-trained NLM and use it to rescore the 100-best hypotheses output by the Singapore English speech recognition system. The enriched NLM reduces the word error rate by 6% relative and improves the recognition accuracy of the rare words by 16% absolute compared to the baseline NLM.
    Comment: 5 pages, 2 figures, accepted to INTERSPEECH 2019
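    The core update step described in the abstract lends itself to a short sketch. Below is a minimal, hypothetical illustration of mixing a rare word's embedding with a weighted average of similar words' embeddings; it is not the authors' code, and the function name `enrich_rare_embeddings`, the word-to-index map, the precomputed similarity lists, and the mixing weight `alpha` are all assumptions made for illustration.

```python
# Hypothetical sketch: enrich rare-word rows of an embedding matrix using the
# embeddings of semantically/syntactically similar words. Names and defaults
# are illustrative, not taken from the paper's implementation.
import numpy as np

def enrich_rare_embeddings(emb, word2id, similar_words, alpha=0.5):
    """Return a copy of `emb` in which each rare word's vector is blended with
    a similarity-weighted average of its neighbours' vectors; all other rows
    (and all other model parameters) are left unchanged."""
    enriched = emb.copy()
    for rare, neighbours in similar_words.items():
        if rare not in word2id or not neighbours:
            continue
        ids = np.array([word2id[w] for w, _ in neighbours if w in word2id])
        weights = np.array([s for w, s in neighbours if w in word2id], dtype=float)
        if ids.size == 0:
            continue
        weights /= weights.sum()                 # normalise similarity scores
        mixture = weights @ emb[ids]             # weighted average of neighbour vectors
        enriched[word2id[rare]] = alpha * emb[word2id[rare]] + (1 - alpha) * mixture
    return enriched

# Toy usage: a 5-word vocabulary with 3-dimensional embeddings.
if __name__ == "__main__":
    rng = np.random.default_rng(0)
    vocab = ["clementi", "road", "avenue", "street", "drive"]
    word2id = {w: i for i, w in enumerate(vocab)}
    emb = rng.normal(size=(len(vocab), 3))
    similar = {"clementi": [("road", 0.6), ("avenue", 0.4)]}
    print(enrich_rare_embeddings(emb, word2id, similar)[word2id["clementi"]])
```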

    Fast Separable Non-Local Means

    We propose a simple and fast algorithm called PatchLift for computing distances between patches (contiguous blocks of samples) extracted from a given one-dimensional signal. PatchLift is based on the observation that the patch distances can be efficiently computed from a matrix that is derived from the one-dimensional signal using lifting; importantly, the number of operations required to compute the patch distances using this approach does not scale with the patch length. We next demonstrate how PatchLift can be used for patch-based denoising of images corrupted with Gaussian noise. In particular, we propose a separable formulation of the classical Non-Local Means (NLM) algorithm that can be implemented using PatchLift. We demonstrate that the PatchLift-based implementation of separable NLM is a few orders of magnitude faster than standard NLM and is competitive with existing fast implementations of NLM. Moreover, its denoising performance is shown to be consistently superior to that of NLM and some of its variants, both in terms of PSNR/SSIM and visual quality.
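    For orientation, here is a minimal, brute-force sketch of classical 1-D Non-Local Means as described above. It does not use the PatchLift lifting trick (its cost grows with the patch length), and the function name `nlm_1d` as well as the patch radius, search radius, and filtering parameter `h` are illustrative assumptions rather than values from the paper.

```python
# Minimal sketch: brute-force 1-D Non-Local Means denoising. Each sample is
# replaced by a weighted average of samples in a search window, with weights
# decaying with the squared distance between the surrounding patches.
import numpy as np

def nlm_1d(signal, patch_radius=3, search_radius=10, h=0.5):
    """Denoise a 1-D signal with classical NLM using patch distances."""
    n = len(signal)
    padded = np.pad(signal, patch_radius, mode="reflect")
    # patches[i] is the patch of length 2*patch_radius+1 centred at sample i
    patches = np.stack([padded[i:i + 2 * patch_radius + 1] for i in range(n)])
    out = np.empty(n)
    for i in range(n):
        lo, hi = max(0, i - search_radius), min(n, i + search_radius + 1)
        d2 = np.sum((patches[lo:hi] - patches[i]) ** 2, axis=1)
        w = np.exp(-d2 / (h ** 2 * (2 * patch_radius + 1)))
        out[i] = np.sum(w * signal[lo:hi]) / np.sum(w)
    return out

# Toy usage: denoise a noisy step signal and compare mean squared errors.
if __name__ == "__main__":
    rng = np.random.default_rng(1)
    clean = np.repeat([0.0, 1.0], 100)
    noisy = clean + 0.2 * rng.normal(size=clean.size)
    denoised = nlm_1d(noisy)
    print(float(np.mean((noisy - clean) ** 2)), float(np.mean((denoised - clean) ** 2)))
```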