
    Beyond Stemming and Lemmatization: Ultra-stemming to Improve Automatic Text Summarization

    In Automatic Text Summarization, preprocessing is an important phase for reducing the space of the textual representation. Classically, stemming and lemmatization have been widely used to normalize words. However, even with normalization on large texts, the curse of dimensionality can degrade the performance of summarizers. This paper describes a new word-normalization method that further reduces the space of representation: we propose to reduce each word to its initial letters, a form of Ultra-stemming. The results show that Ultra-stemming not only preserves the content of the summaries produced from this representation, but often improves system performance dramatically. Summaries over trilingual corpora were evaluated automatically with Fresa, and the results confirm an increase in performance regardless of the summarizer used.
    Comment: 22 pages, 12 figures, 9 tables
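
    A minimal sketch of the idea in Python may make the preprocessing step concrete. The abstract only says each word is reduced to "its initial letters"; the prefix length n, the tokenization, and the function name below are assumptions for illustration, not the paper's exact procedure.

        import re

        def ultra_stem(text: str, n: int = 4) -> list[str]:
            """Reduce each word to its first n letters (Ultra-stemming sketch).

            Words shorter than n letters are kept whole. The prefix
            length n is a free parameter explored here by assumption.
            """
            words = re.findall(r"\w+", text.lower())
            return [w[:n] for w in words]

        # Distinct inflections collapse onto a single representation,
        # shrinking the vocabulary far below what stemming achieves.
        print(ultra_stem("normalize normalizing normalization"))
        # -> ['norm', 'norm', 'norm']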

    Determination of the financial impact of machine downtime on the Australia Post large letters sorting process

    Machine downtime, whether planned or unplanned, is intuitively costly to manufacturing organisations, yet it is often very difficult to quantify, and costing processes are rarely undertaken within such organisations. It has previously been estimated that 80% of industrial facilities were unable to accurately cost downtime, with many facilities underestimating the total cost by a factor of 200-300% (Crumrine and Post 2006). It has also been acknowledged that the lack of practical guides has hindered the adoption of costing procedures of any nature (Dale and Plunkett 1995). The models that did exist rarely considered more than a subset of the costs identified elsewhere, leading to overly conservative estimates. In addition, because cost definitions are not consistent, methodologies for evaluating and quantifying individual costs have not previously been adequately defined. The work outlined in this paper aims to develop the first comprehensive methodology for determining the cost of downtime, with particular application to Australia Post's automated mail-processing machines. The method presented may be applied to any manufacturing environment that would benefit from a more complete understanding of the magnitude of the cost of machine or process downtime.
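
    The abstract does not enumerate the paper's cost categories, so the sketch below only illustrates the general shape of such a costing model. The categories, rates, and figures are hypothetical placeholders; the point is that a comprehensive model sums several cost components, since (as the paper argues) counting only a subset yields overly conservative estimates.

        from dataclasses import dataclass

        @dataclass
        class DowntimeEvent:
            """One machine stoppage; all figures are hypothetical."""
            hours: float             # duration of the stoppage
            idle_operators: int      # staff paid but unproductive
            labour_rate: float       # $/hour per operator
            lost_throughput: float   # items not processed per hour
            margin_per_item: float   # contribution margin per item
            recovery_cost: float     # overtime, re-sorting, penalties

        def downtime_cost(e: DowntimeEvent) -> float:
            """Sum direct and consequential costs of one event."""
            labour = e.hours * e.idle_operators * e.labour_rate
            lost_margin = e.hours * e.lost_throughput * e.margin_per_item
            return labour + lost_margin + e.recovery_cost

        # A 2-hour stoppage idling 4 operators at $45/h, losing
        # 30,000 items/h at $0.01 margin, plus $500 recovery cost:
        print(downtime_cost(DowntimeEvent(2.0, 4, 45.0, 30_000, 0.01, 500.0)))
        # -> 1460.0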

    Quantum cryptography: key distribution and beyond

    Uniquely among the sciences, quantum cryptography has driven both foundational research and practical real-life applications. We review the progress of quantum cryptography over the last decade, covering quantum key distribution and other applications.
    Comment: A review of quantum cryptography, not restricted to QKD
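
    As an illustration of the basic mechanism the review covers (not of the paper's own contribution), here is a toy sketch of basis sifting in BB84, the canonical quantum key distribution protocol. It assumes an ideal, noiseless channel with no eavesdropper; a real protocol would add error estimation and privacy amplification.

        import secrets

        def bb84_sift(n: int = 16) -> list[int]:
            """Toy BB84 sifting: ideal channel, no eavesdropping.

            Alice sends n qubits, each a random bit in a random basis
            (0 = rectilinear, 1 = diagonal); Bob measures in random
            bases. When bases match, Bob recovers Alice's bit; when
            they differ, the outcome is random and the position is
            discarded during sifting.
            """
            alice_bits  = [secrets.randbelow(2) for _ in range(n)]
            alice_bases = [secrets.randbelow(2) for _ in range(n)]
            bob_bases   = [secrets.randbelow(2) for _ in range(n)]
            return [bit for bit, a, b in
                    zip(alice_bits, alice_bases, bob_bases) if a == b]

        # On average n/2 positions survive sifting to form the raw key.
        print(bb84_sift())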

    Asymmetric spatial processing under cognitive load

    Spatial attention allows us to selectively process information within a certain location in space. Despite the vast literature on spatial attention, the effect of cognitive load on spatial processing is still not fully understood. In this study we added cognitive load to a spatial processing task to see whether it would differentially affect the processing of visual information in the left versus the right hemispace. The main paradigm consisted of a detection task performed during the maintenance interval of a verbal working memory (WM) task. We found that increasing WM load had a more negative impact on detecting targets presented on the left side than on those presented on the right side, and the strength of the load effect correlated with the strength of this interaction at the individual level. The implications of an asymmetric attentional bias, with a relative disadvantage for the left (versus the right) hemispace under high verbal WM load, are discussed.
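
    The reported effect is a load-by-hemifield interaction, which a small worked example can make concrete. The accuracy values below are hypothetical placeholders, not the study's data; they merely show how the interaction term is computed and what sign the reported asymmetry would produce.

        # Hypothetical detection accuracy by WM load and probe side.
        acc = {("low", "left"): 0.93, ("low", "right"): 0.94,
               ("high", "left"): 0.82, ("high", "right"): 0.90}

        # Cost of raising load, separately per hemifield; a positive
        # interaction (left cost > right cost) matches the reported
        # left-hemispace disadvantage under high verbal WM load.
        left_cost  = acc[("low", "left")]  - acc[("high", "left")]
        right_cost = acc[("low", "right")] - acc[("high", "right")]
        print(f"left cost {left_cost:.2f}, right cost {right_cost:.2f}, "
              f"interaction {left_cost - right_cost:+.2f}")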

    Stochastic accumulation of feature information in perception and memory

    It is now well established that the time course of perceptual processing influences the first second or so of performance in a wide variety of cognitive tasks. Over the last 20 years, there has been a shift from modeling the speed at which a display is processed to modeling the speed at which individual features of the display are perceived, and to formalizing how this perceptual information is used in decision making. The first of these models (Lamberts, 1995) was developed to fit the time course of performance in a speeded perceptual categorization task and assumed a simple stochastic accumulation of feature information. Subsequently, similar approaches have been used to model performance in a range of cognitive tasks, including identification, absolute identification, perceptual matching, recognition, visual search, and word processing, again assuming a simple stochastic accumulation of feature information from both the stimulus and representations held in memory. These models are typically fit to data from signal-to-respond experiments, in which the effect of stimulus exposure duration on performance is examined, but response times (RTs) and RT distributions have also been modeled. In this article, we review this approach and explore the insights it has provided about the interplay between perceptual processing, memory retrieval, and decision making in a variety of tasks. In so doing, we highlight how such approaches can continue to contribute usefully to our understanding of cognition.
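
    A minimal simulation can show what "stochastic accumulation of feature information" means in practice. In the spirit of Lamberts (1995), each feature is assumed to become available by exposure time t with probability 1 - exp(-q*t), where q is that feature's sampling rate; the rates below are illustrative assumptions, not fitted parameters.

        import math
        import random

        def perceived_features(rates: list[float], t: float) -> list[bool]:
            """Sample which features are available after t seconds.

            Feature i is perceived by time t with probability
            1 - exp(-rates[i] * t), so longer exposures feed more
            feature information into the decision stage.
            """
            return [random.random() < 1 - math.exp(-q * t) for q in rates]

        rates = [2.0, 1.0, 0.5]      # hypothetical per-feature sampling rates
        for t in (0.1, 0.5, 2.0):    # seconds of stimulus exposure
            trials = [perceived_features(rates, t) for _ in range(10_000)]
            means = [sum(f[i] for f in trials) / len(trials) for i in range(3)]
            print(t, [round(m, 2) for m in means])
        # Proportions approach 1 - exp(-q*t): fast features saturate
        # early, slow ones only at long exposures, mirroring the
        # signal-to-respond accuracy curves these models are fit to.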