    Character Recognition

    Character recognition is one of the pattern recognition technologies most widely used in practical applications. This book presents recent advances relevant to character recognition, from technical topics such as image processing, feature extraction, and classification, to new applications including human-computer interfaces. The goal of this book is to provide a reference source for academic research and for professionals working in the character recognition field.

    Neural correlates of visual-motor disorders in children with developmental coordination disorder


    Gone in Sixty Milliseconds: Trademark Law and Cognitive Science

    Trademark dilution is a cause of action for interfering with the uniqueness of a trademark. For example, consumers would probably not think that Kodak soap was produced by the makers of Kodak cameras, but its presence in the market would diminish the uniqueness of the original Kodak mark. Trademark owners think dilution is harmful but have had difficulty explaining why. Many courts have therefore been reluctant to enforce dilution laws, even while legislatures have enacted more of them over the past half century. Courts and commentators have now begun to use psychological theories, drawing on associationist models of cognition, to explain how a trademark can be harmed by the existence of similar marks even when consumers can readily distinguish the marks from one another and thus are not confused. Though the cognitive theory of dilution is internally consistent and appeals to the authority of science, it does not rest on sufficient empirical evidence to justify its adoption. Moreover, the harms it identifies do not generally come from commercial competitors but from free speech about trademarked products. As a result, even a limited dilution law should be held unconstitutional under current First Amendment commercial-speech doctrine. In the absence of constitutional invalidation, the cognitive explanation of dilution is likely to change the law for the worse. Rather than working like fingerprint evidence, which ideally produces more evidence about already-defined crimes, psychological explanations of dilution are more like economic theories in antitrust, which changed the definition of actionable restraints of trade. Given the empirical and normative flaws in the cognitive theory, using it to fill dilution's theoretical vacuum would be a mistake.

    Human-computer interaction in ubiquitous computing environments

    Purpose – The purpose of this paper is to explore characteristics of human-computer interaction when the human body and its movements become input for interaction and interface control in pervasive computing settings. Design/methodology/approach – The paper quantifies the performance of human movement based on Fitts's law and discusses some of the human factors and technical considerations that arise in trying to use human body movements as an input medium. Findings – The paper finds that new interaction technologies utilising human movements may provide more flexible, naturalistic interfaces and support the ubiquitous or pervasive computing paradigm. Practical implications – In pervasive computing environments the challenge is to create intuitive and user-friendly interfaces. Application domains that may utilise human body movements as input are surveyed here, and the paper addresses issues such as culture, privacy, security, and ethics raised by body-movement-based interaction styles. Originality/value – The paper describes the utilisation of human body movements as input for interaction and interface control in pervasive computing settings.
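    The Fitts's-law performance model the abstract refers to can be sketched numerically. In the Shannon formulation, movement time grows with the index of difficulty log2(D/W + 1); the coefficients a and b below are illustrative placeholders, not values fitted by the paper:

    ```python
    import math

    # Fitts's law predicts movement time (MT) for a pointing task from the
    # distance to the target (D) and the target width (W):
    #   MT = a + b * log2(D / W + 1)
    # a and b are empirically fitted constants; the defaults here are
    # illustrative only.
    def fitts_movement_time(distance, width, a=0.1, b=0.15):
        """Predicted movement time in seconds for one pointing movement."""
        index_of_difficulty = math.log2(distance / width + 1)  # in bits
        return a + b * index_of_difficulty

    # A distant, small target is harder (slower) than a near, large one.
    hard = fitts_movement_time(distance=800, width=20)  # ID = log2(41) bits
    easy = fitts_movement_time(distance=100, width=50)  # ID = log2(3) bits
    assert hard > easy
    ```

    Models of this form are commonly used to compare input devices and interaction techniques by fitting a and b to measured pointing data.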

    Off-line Arabic Handwriting Recognition System Using Fast Wavelet Transform

    In this research, an off-line handwriting recognition system for the Arabic alphabet is introduced. The system contains three main stages: a preprocessing, a segmentation, and a recognition stage. In the preprocessing stage, the Radon transform was used in the design of algorithms for page, line, and word skew correction as well as for word slant correction. In the segmentation stage, a Hough transform approach was used for line extraction. For line-to-word and word-to-character segmentation, a statistical method using a mathematical representation of the binary images of lines and words was used. Unlike most current handwriting recognition systems, our system simulates the human mechanism for image recognition, where images are encoded and saved in memory as groups according to their similarity to each other. Characters are decomposed into coefficient vectors using the fast wavelet transform; vectors that represent a character in its different possible shapes are then saved as groups with one representative for each group. Recognition is achieved by comparing the vector of the character to be recognized with the group representatives. Experiments showed that the proposed system achieves a recognition accuracy of 90.26%. The system needs at most 3.41 seconds to recognize a single character in a text of 15 lines, where each line has 10 words on average.
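    The wavelet-feature matching step described above can be illustrated with a minimal sketch: a one-level Haar fast wavelet transform turns a flattened character image into a coefficient vector, which is then assigned to the group whose representative is nearest in Euclidean distance. The Haar wavelet, single decomposition level, and the representatives below are illustrative assumptions; the paper does not specify these details here:

    ```python
    import numpy as np

    def haar_fwt(signal):
        """One level of the Haar fast wavelet transform.

        Returns (approximation, detail) coefficients; the input length
        must be even.
        """
        s = np.asarray(signal, dtype=float)
        approx = (s[0::2] + s[1::2]) / np.sqrt(2)  # low-pass: pairwise averages
        detail = (s[0::2] - s[1::2]) / np.sqrt(2)  # high-pass: pairwise differences
        return approx, detail

    def nearest_group(feature, representatives):
        """Index of the group representative closest in Euclidean distance."""
        dists = [np.linalg.norm(feature - rep) for rep in representatives]
        return int(np.argmin(dists))

    # A constant-by-parts signal has zero Haar detail coefficients.
    approx, detail = haar_fwt([1.0, 1.0, 2.0, 2.0])
    assert np.allclose(detail, 0.0)

    # Match a feature vector against two made-up group representatives.
    reps = [np.array([0.0, 0.0]), np.array([1.0, 0.1])]
    assert nearest_group(np.array([1.0, 0.0]), reps) == 1
    ```

    In a full system, the transform would be applied recursively for several levels and the group representatives would be learned from the training shapes of each character.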