
    Beyond Stemming and Lemmatization: Ultra-stemming to Improve Automatic Text Summarization

    In Automatic Text Summarization, preprocessing is an important phase for reducing the space of textual representation. Classically, stemming and lemmatization have been widely used to normalize words. However, even with normalization on large texts, the curse of dimensionality can degrade the performance of summarizers. This paper describes a new method of word normalization that further reduces the space of representation: we propose to reduce each word to its initial letters, a form of Ultra-stemming. The results show that Ultra-stemming not only preserves the content of the summaries produced with this representation, but often improves system performance dramatically. Summaries on trilingual corpora were evaluated automatically with Fresa. Results confirm an increase in performance regardless of the summarizer system used. Comment: 22 pages, 12 figures, 9 tables
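    As a rough illustration only (not the authors' implementation, whose exact normalization details are not given in the abstract), ultra-stemming can be sketched as truncating each token to its first n letters:

    ```python
    import re

    def ultra_stem(text, n=1):
        """Hypothetical sketch of ultra-stemming: lowercase the text,
        tokenize on word characters, and keep only the first n letters
        of each token, shrinking the representation space."""
        words = re.findall(r"\w+", text.lower())
        return [w[:n] for w in words]

    # With n=4 this behaves like a crude stemmer; with n=1 each word
    # collapses to a single letter.
    print(ultra_stem("Stemming reduces words to stems", n=4))
    # → ['stem', 'redu', 'word', 'to', 'stem']
    ```

    Smaller n gives a smaller vocabulary at the cost of more collisions between unrelated words; the paper's results suggest summarizers can tolerate, and even benefit from, this aggressive reduction.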

    A magnetic stimulation examination of orthographic neighborhood effects in visual word recognition

    The split-fovea theory proposes that visual word recognition is mediated by the splitting of the foveal image, with letters to the left of fixation projected to the right hemisphere (RH) and letters to the right of fixation projected to the left hemisphere (LH). We applied repetitive transcranial magnetic stimulation (rTMS) over the left and right occipital cortex during a lexical decision task to investigate the extent to which word recognition processes could be accounted for by the split-fovea theory. Unilateral rTMS significantly slowed lexical decision latencies to centrally presented words, supporting the suggestion that the foveal representation of words is split between the cerebral hemispheres rather than bilateral. Behaviorally, we showed that words with many orthographic neighbors sharing the same initial letters ("lead neighbors") facilitated lexical decision more than words with few lead neighbors. This effect did not apply to end neighbors (orthographic neighbors sharing the same final letters). Crucially, rTMS over the RH impaired lead-, but not end-, neighborhood facilitation. The results support the split-fovea theory, in which the RH has primacy in representing the lead neighbors of a written word.

    Does Phenomenal Consciousness Overflow Attention? An Argument from Feature-Integration

    In the past two decades a number of arguments have been given in favor of the possibility of phenomenal consciousness without attentional access, otherwise known as phenomenal overflow. This paper will show that the empirical data commonly cited in support of this thesis are, at best, ambiguous between two equally plausible interpretations, one of which does not posit phenomenology beyond attention. Next, after citing evidence for the feature-integration theory of attention, this paper will give an account of the relationship between consciousness and attention that accommodates both the empirical data and our phenomenological intuitions without positing phenomenal consciousness beyond attention. Having undercut the motivations for accepting phenomenal overflow, and having given reasons to think that phenomenal overflow does not occur, I end with the tentative conclusion that attention is a necessary condition for phenomenal consciousness.

    Stochastic accumulation of feature information in perception and memory

    It is now well established that the time course of perceptual processing influences the first second or so of performance in a wide variety of cognitive tasks. Over the last 20 years, there has been a shift from modeling the speed at which a display is processed, to modeling the speed at which different features of the display are perceived and formalizing how this perceptual information is used in decision making. The first of these models (Lamberts, 1995) was implemented to fit the time course of performance in a speeded perceptual categorization task and assumed a simple stochastic accumulation of feature information. Subsequently, similar approaches have been used to model performance in a range of cognitive tasks including identification, absolute identification, perceptual matching, recognition, visual search, and word processing, again assuming a simple stochastic accumulation of feature information from both the stimulus and representations held in memory. These models are typically fit to data from signal-to-respond experiments whereby the effects of stimulus exposure duration on performance are examined, but response times (RTs) and RT distributions have also been modeled. In this article, we review this approach and explore the insights it has provided about the interplay between perceptual processing, memory retrieval, and decision making in a variety of tasks. In so doing, we highlight how such approaches can continue to usefully contribute to our understanding of cognition.
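    The core idea of stochastic feature accumulation can be illustrated with a toy simulation (this is a deliberately simplified sketch, not Lamberts' 1995 model; the function name, parameters, and decision rule are all illustrative assumptions): features of a display are perceived probabilistically over time, and a response is made once enough feature information has accrued.

    ```python
    import random

    def accumulate_features(match_probs, threshold=3, max_steps=100, seed=0):
        """Toy stochastic feature accumulator (illustrative only).

        At each time step one feature is sampled at random; with
        probability match_probs[i] that feature is perceived as matching
        the memory representation. Respond "same" once `threshold`
        distinct features have matched, otherwise time out with
        "different". Returns (decision, response_time_in_steps).
        """
        rng = random.Random(seed)
        matched = set()
        for t in range(1, max_steps + 1):
            i = rng.randrange(len(match_probs))   # sample a feature
            if rng.random() < match_probs[i]:
                matched.add(i)                    # feature i perceived
            if len(matched) >= threshold:
                return "same", t                  # decision + RT
        return "different", max_steps

    decision, rt = accumulate_features([0.9, 0.8, 0.7, 0.6])
    print(decision, rt)
    ```

    Longer stimulus exposure (more steps before a response signal) allows more features to accumulate, which is how models of this family capture the effect of exposure duration on accuracy in signal-to-respond experiments.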

    Quantum cryptography: key distribution and beyond

    Uniquely among the sciences, quantum cryptography has driven both foundational research and practical real-life applications. We review the progress of quantum cryptography in the last decade, covering quantum key distribution and other applications. Comment: This is a review of quantum cryptography and is not restricted to QKD.