
    Non-English and non-Latin signature verification systems: A survey

    Signatures continue to be an important biometric because they remain widely used as a means of personal verification, and automatic verification systems are therefore needed. Manual signature-based authentication of a large number of documents is a difficult and time-consuming task. Consequently, for many years, in the field of protected communication and financial applications, we have observed explosive growth in biometric personal authentication systems that are closely connected with measurable unique physical characteristics (e.g. hand geometry, iris scan, fingerprints or DNA) or behavioural features. Substantial research has been undertaken in the field of signature verification involving English signatures, but to the best of our knowledge, very few works have considered non-English signatures such as Chinese, Japanese and Arabic. In order to convey the state of the art in the field to researchers, in this paper we present a survey of non-English and non-Latin signature verification systems.

    Sparse Radial Sampling LBP for Writer Identification

    In this paper we present the use of Sparse Radial Sampling Local Binary Patterns, a variant of Local Binary Patterns (LBP), for text-as-texture classification. By adapting and extending the standard LBP operator to the particularities of text, we get a generic text-as-texture classification scheme and apply it to writer identification. In experiments on the CVL and ICDAR 2013 datasets, the proposed feature set demonstrates state-of-the-art (SOA) performance. Among the SOA, the proposed method is the only one based on dense extraction of a single local feature descriptor. This makes it fast and applicable at the earliest stages in a DIA pipeline without the need for segmentation, binarization, or extraction of multiple features. Comment: Submitted to the 13th International Conference on Document Analysis and Recognition (ICDAR 2015).
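
    As an illustration of the text-as-texture idea, the sketch below computes a histogram of standard (dense, uniform) LBP codes over a grayscale page image with scikit-image. This is the baseline operator the paper adapts, not the proposed sparse radial sampling variant; the function name and parameter values are illustrative assumptions.

    # Minimal sketch: dense, uniform LBP histogram for a grayscale document image.
    # Baseline LBP operator only -- NOT the paper's sparse radial sampling variant.
    import numpy as np
    from skimage.feature import local_binary_pattern

    def lbp_histogram(gray_image, n_points=8, radius=1):
        """Return a normalized histogram of uniform LBP codes over the whole image."""
        codes = local_binary_pattern(gray_image, n_points, radius, method="uniform")
        n_bins = n_points + 2  # uniform patterns plus one bin for non-uniform codes
        hist, _ = np.histogram(codes, bins=n_bins, range=(0, n_bins), density=True)
        return hist  # feature vector that a writer-identification classifier could consume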

    University of Missouri, catalog 1972-1973, independent study through correspondence instruction

    "The Independent Study Department is a part of the Extension Division of the University of Missouri. It has administrative responsibility for all correspondence instruction offered by each of the four University of Missouri campuses are Columbia, Kansas City, Rolla, and St. Louis."--Page 3

    The use of data-mining for the automatic formation of tactics

    This paper discusses the use of data-mining for the automatic formation of tactics. It was presented at the Workshop on Computer-Supported Mathematical Theory Development held at IJCAR in 2004. The aim of this project is to evaluate the applicability of data-mining techniques to the automatic formation of tactics from large corpuses of proofs. We data-mine information from large proof corpuses to find commonly occurring patterns. These patterns are then evolved into tactics using genetic programming techniques.
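
    As a rough illustration of the pattern-mining step only (not the authors' actual pipeline), the sketch below counts commonly occurring n-grams of proof-step names across a small, hypothetical proof corpus; the subsequent evolution of such patterns into tactics by genetic programming is not shown.

    # Minimal sketch, under assumptions not stated in the abstract: treat each proof
    # as a sequence of tactic/rule names and mine frequent n-grams as candidate patterns.
    from collections import Counter

    def frequent_patterns(proofs, n=3, min_count=2):
        """Count length-n subsequences of proof steps across a corpus of proofs."""
        counts = Counter()
        for steps in proofs:                      # each proof: a list of step names
            for i in range(len(steps) - n + 1):
                counts[tuple(steps[i:i + n])] += 1
        return [(pat, c) for pat, c in counts.most_common() if c >= min_count]

    # Hypothetical example corpus of proof-step sequences:
    corpus = [
        ["intro", "rewrite", "simp", "apply"],
        ["intro", "rewrite", "simp", "ring"],
        ["cases", "intro", "rewrite", "simp"],
    ]
    print(frequent_patterns(corpus, n=3))  # ('intro', 'rewrite', 'simp') occurs 3 times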

    Big Data Privacy Context: Literature Effects On Secure Informational Assets

    This article's objective is the identification of research opportunities in the current big data privacy domain, evaluating literature effects on secure informational assets. Until now, no study has analyzed such a relation. Its results can foster science, technologies and businesses. To achieve these objectives, a big data privacy Systematic Literature Review (SLR) is performed on the main scientific peer-reviewed journals in the Scopus database. Bibliometrics and text mining analysis complement the SLR. This study provides support to big data privacy researchers on: most and least researched themes, research novelty, most cited works and authors, themes evolution through time and many others. In addition, TOPSIS and VIKOR ranks were developed to evaluate literature effects versus informational assets indicators. Secure Internet Servers (SIS) was chosen as the decision criterion. Results show that big data privacy literature is strongly focused on computational aspects. However, individuals, societies, organizations and governments face a technological change that has just started to be investigated, with growing concerns on law and regulation aspects. TOPSIS and VIKOR ranks differed in several positions, and the only consistent country between literature and SIS adoption is the United States. Countries in the lowest ranking positions represent future research opportunities. Comment: 21 pages, 9 figures.
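
    For readers unfamiliar with the ranking method mentioned above, the sketch below implements a generic TOPSIS ranking with NumPy; the decision matrix, weights, and criteria directions are hypothetical placeholders rather than the study's data.

    # Minimal sketch of a generic TOPSIS ranking. The inputs are hypothetical,
    # not the study's actual literature/SIS indicators.
    import numpy as np

    def topsis(matrix, weights, benefit):
        """Rank alternatives (rows) over criteria (columns) with TOPSIS."""
        X = np.asarray(matrix, dtype=float)
        norm = X / np.sqrt((X ** 2).sum(axis=0))          # vector normalization per criterion
        v = norm * np.asarray(weights, dtype=float)       # weighted normalized matrix
        ideal = np.where(benefit, v.max(axis=0), v.min(axis=0))
        anti = np.where(benefit, v.min(axis=0), v.max(axis=0))
        d_pos = np.sqrt(((v - ideal) ** 2).sum(axis=1))   # distance to ideal solution
        d_neg = np.sqrt(((v - anti) ** 2).sum(axis=1))    # distance to anti-ideal solution
        closeness = d_neg / (d_pos + d_neg)               # higher is better
        return np.argsort(-closeness), closeness          # best-first indices, scores

    # Hypothetical: 3 alternatives scored on 2 criteria, both treated as benefits.
    ranks, scores = topsis([[7, 0.8], [5, 0.9], [9, 0.4]], [0.6, 0.4], [True, True])
    print(ranks, scores)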

    Wills Formalities in the Twenty-First Century

    Individuals have executed wills the same way for centuries. But over time, traditional requirements have relaxed. This Article makes two principal claims, both of which disrupt fundamental assumptions about the purposes and functions of wills formalities. First, the traditional requirements that a will must be in writing and signed by the testator in the presence of (or acknowledged before) witnesses have never adequately served their stated purposes. For that reason, strict compliance with formalities cannot be justified by their cautionary, protective, evidentiary, and channeling functions. Reducing or eliminating most of the long-standing requirements for execution of a will is consistent with the true purpose of wills formalities--authenticating a document as the one executed by the testator with the intention of having it serve as the binding directive for the post-mortem distribution of the testator's property. This Article's account has important implications for the way that legal scholars, lawmakers, and lawyers think about wills. The Article's second claim is that the substantive standard of the harmless error rule--that the decedent intended a particular document to be the decedent's last will and testament--should be the only threshold that must be satisfied for a court to admit the document to probate. Widespread adoption of such an intent-based rule is preferable to one that is overly formalistic. Current formalism leads both to false positives (i.e., grant of probate to a document not intended by the decedent as the decedent's will) and false negatives (i.e., denial of probate of a document clearly intended by the decedent as the decedent's will). An intent-based rule would make more likely the valid execution of wills by poor and middle-income individuals who typically cannot or do not consult attorneys. An intent-based standard also sets the stage for widespread recognition of electronic wills, if states are able to address concerns about authentication, fraud, and safekeeping of electronic documents. Technological developments could make estate planning in the twenty-first century more accessible than ever before to people of all wealth and income levels if the legal profession is prepared to embrace new ways of executing wills.