28,369 research outputs found

    A Fuzzy Logic Programming Environment for Managing Similarity and Truth Degrees

    Full text link
    FASILL (an acronym of "Fuzzy Aggregators and Similarity Into a Logic Language") is a fuzzy logic programming language with implicit/explicit truth degree annotations, a great variety of connectives and unification by similarity. FASILL integrates and extends features coming from MALP (Multi-Adjoint Logic Programming, a fuzzy logic language with explicitly annotated rules) and Bousi~Prolog (which uses a weak unification algorithm and is well suited for flexible query answering). Hence, it properly manages similarity and truth degrees in a single framework, combining the expressive benefits of both languages. This paper presents the main features and implementation details of FASILL. Throughout the paper we describe its syntax and operational semantics and sketch the implementation of the lattice module and the similarity module, two of the main building blocks of the new programming environment, which enriches the FLOPER system developed in our research group. Comment: In Proceedings PROLE 2014, arXiv:1501.0169
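
    To make the weak-unification idea concrete, the following is a minimal sketch, assuming the real unit interval as the truth-degree lattice, the Gödel t-norm as the only connective, and a hand-written similarity relation; it illustrates the mechanism only and is not FASILL's actual lattice or similarity module.

```python
# Minimal sketch (illustration only, not FASILL): truth degrees live in the
# lattice [0, 1], a similarity relation is defined on symbols, and weak
# unification of two ground atoms returns an approximation degree obtained by
# combining symbol similarities with the Goedel t-norm (min).

SIM = {}  # hypothetical similarity relation between predicate/constant symbols

def sim(a, b):
    """Similarity degree of two symbols: 1.0 if identical, 0.0 if unrelated."""
    if a == b:
        return 1.0
    return max(SIM.get((a, b), 0.0), SIM.get((b, a), 0.0))

def t_norm_godel(x, y):
    """Goedel conjunction, one typical lattice connective in this setting."""
    return min(x, y)

def weak_unify_atoms(atom1, atom2):
    """Weakly unify two ground atoms (predicate, args) and return the degree."""
    (p, args1), (q, args2) = atom1, atom2
    if len(args1) != len(args2):
        return 0.0
    degree = sim(p, q)
    for a, b in zip(args1, args2):
        degree = t_norm_godel(degree, sim(a, b))
    return degree

# Example: the query good_hotel(ritz) matches the fact nice_hotel(ritz) with
# degree 0.8 under a hypothetical similarity nice_hotel ~ good_hotel = 0.8.
SIM[("nice_hotel", "good_hotel")] = 0.8
print(weak_unify_atoms(("good_hotel", ["ritz"]), ("nice_hotel", ["ritz"])))  # 0.8
```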

    Biometric cryptosystem using online signatures

    Get PDF
    Biometric cryptosystems combine cryptography and biometrics to benefit from the strengths of both fields. In such systems, while cryptography provides high and adjustable security levels, biometrics brings in non-repudiation and eliminates the need to remember passwords or to carry tokens. In this work we present a biometric cryptosystem which uses online signatures and is based on the Fuzzy Vault scheme of Juels et al. The Fuzzy Vault scheme releases a previously stored key when the biometric data presented for verification matches the previously stored template hidden in a vault. The online signature of a person is a behavioral biometric which is widely accepted as the formal way of approving documents, bank transactions, etc. As such, biometric-based key release using online signatures may have many application areas. We extract minutiae points (trajectory crossings, endings and points of high curvature) from online signatures and use them during the locking and unlocking phases of the vault. We present our preliminary results and demonstrate that a high security level (128-bit encryption key length) can be achieved using online signatures.
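
    As a rough illustration of the vault-locking step described above, here is a simplified sketch: plain modular arithmetic over a prime stands in for the finite-field encoding, signature minutiae are assumed to be already quantized to integers, and the helper name lock_vault and its parameters are invented for the example.

```python
# Illustrative fuzzy vault locking (simplified; not the paper's exact construction).
import random

def lock_vault(key_coeffs, genuine_points, n_chaff=200, field=65521):
    """Hide a key (polynomial coefficients, lowest degree first) among chaff points."""
    def poly(x):
        # Evaluate the secret polynomial at x via Horner's rule, modulo a prime.
        acc = 0
        for c in reversed(key_coeffs):
            acc = (acc * x + c) % field
        return acc

    vault = [(x, poly(x)) for x in genuine_points]      # genuine points lie on the polynomial
    used_x = set(genuine_points)
    while len(vault) < len(genuine_points) + n_chaff:   # chaff points deliberately do not
        x, y = random.randrange(field), random.randrange(field)
        if x not in used_x and y != poly(x):
            vault.append((x, y))
            used_x.add(x)
    random.shuffle(vault)
    return vault

# A genuine user whose fresh minutiae overlap the enrolled ones can pick out
# mostly genuine points and recover the polynomial (hence the key) by
# interpolation / error correction; an attacker cannot tell genuine from chaff.
vault = lock_vault(key_coeffs=[1234, 56, 789], genuine_points=[3, 17, 42, 99, 250])
print(len(vault))  # 205 points, only 5 of which carry the key
```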

    Signature Verification Approach using Fusion of Hybrid Texture Features

    Full text link
    In this paper, a writer-dependent signature verification method is proposed. Two different types of texture features, namely Wavelet and Local Quantized Patterns (LQP) features, are employed to extract two kinds of transform-based and statistical information from signature images. For each writer, two separate one-class support vector machines (SVMs) corresponding to each set of LQP and Wavelet features are trained to obtain two different authenticity scores for a given signature. Finally, a score-level classifier fusion method is used to integrate the scores obtained from the two one-class SVMs into a single verification score. In the proposed method only genuine signatures are used to train the one-class SVMs. The proposed signature verification method has been tested on four different publicly available datasets and the results demonstrate its generality. The proposed system outperforms other existing systems in the literature. Comment: Neural Computing and Applications
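
    A minimal sketch of the verification side under simple assumptions: the Wavelet and LQP descriptors are taken as precomputed arrays, score-level fusion is a plain weighted sum (the paper's fusion rule may differ), and the helper names train_writer_model and verify are invented for the example.

```python
# Writer-dependent sketch: two one-class SVMs (one per feature type), trained on
# genuine signatures only, whose authenticity scores are fused at score level.
import numpy as np
from sklearn.svm import OneClassSVM

def train_writer_model(wavelet_feats, lqp_feats, nu=0.05):
    """Train one one-class SVM per feature type for a single writer."""
    svm_wavelet = OneClassSVM(kernel="rbf", nu=nu, gamma="scale").fit(wavelet_feats)
    svm_lqp = OneClassSVM(kernel="rbf", nu=nu, gamma="scale").fit(lqp_feats)
    return svm_wavelet, svm_lqp

def verify(models, wavelet_feat, lqp_feat, w=0.5, threshold=0.0):
    """Fuse the two authenticity scores; accept if the fused score clears a threshold."""
    svm_wavelet, svm_lqp = models
    s1 = svm_wavelet.decision_function(wavelet_feat.reshape(1, -1))[0]
    s2 = svm_lqp.decision_function(lqp_feat.reshape(1, -1))[0]
    fused = w * s1 + (1 - w) * s2
    return fused, fused >= threshold

# Example with random stand-in features for one writer (10 genuine signatures).
rng = np.random.default_rng(0)
models = train_writer_model(rng.normal(size=(10, 64)), rng.normal(size=(10, 32)))
print(verify(models, rng.normal(size=64), rng.normal(size=32)))
```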

    Inspection System And Method For Bond Detection And Validation Of Surface Mount Devices Using Sensor Fusion And Active Perception

    Get PDF
    A hybrid surface mount component inspection system combines vision and infrared inspection techniques to determine the presence of surface mount components on a printed wiring board and the quality of their solder joints. Data-level sensor fusion combines data from two infrared sensors to obtain emissivity-independent thermal signatures of solder joints, while feature-level sensor fusion with active perception assembles and processes inspection information from any number of sensors to determine characteristic feature sets of different defect classes and classify solder defects. Georgia Tech Research Corporation
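
    One standard way to obtain emissivity-independent readings from two infrared sensors is two-band (ratio) pyrometry under a greybody assumption; the sketch below illustrates that general idea only and is not necessarily the patented data-level fusion method.

```python
# Two-band ratio pyrometry: the unknown emissivity cancels in the ratio of two
# spectral radiances (Wien approximation), so the fused "thermal signature" of a
# solder joint no longer depends on its surface emissivity.
import math

C2 = 1.4388e-2  # second radiation constant, m*K

def ratio_temperature(radiance_1, radiance_2, lambda_1, lambda_2):
    """Recover temperature from the ratio of radiances at two wavelengths (meters)."""
    r = radiance_1 / radiance_2
    return C2 * (1 / lambda_2 - 1 / lambda_1) / (math.log(r) - 5 * math.log(lambda_2 / lambda_1))

def wien_radiance(emissivity, lam, temp):
    """Greybody spectral radiance in the Wien approximation (for the simulation)."""
    c1 = 3.7418e-16  # first radiation constant, W*m^2
    return emissivity * c1 * lam**-5 * math.exp(-C2 / (lam * temp))

# Example: a 500 K solder joint observed at 4 um and 5 um with emissivity 0.3;
# the recovered temperature does not depend on the emissivity value.
l1, l2, eps = 4e-6, 5e-6, 0.3
print(ratio_temperature(wien_radiance(eps, l1, 500.0), wien_radiance(eps, l2, 500.0), l1, l2))
```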

    Cosmological space-times with resolved Big Bang in Yang-Mills matrix models

    Full text link
    We present simple solutions of IKKT-type matrix models that can be viewed as quantized homogeneous and isotropic cosmological space-times, with finite density of microstates and a regular Big Bang (BB). The BB arises from a signature change of the effective metric on a fuzzy brane embedded in Lorentzian target space, in the presence of a quantized 4-volume form. The Hubble parameter is singular at the BB, and becomes small at late times. There is no singularity from the target space point of view, and the brane is Euclidean "before" the BB. Both recollapsing and expanding universe solutions are obtained, depending on the mass parameters. Comment: 22 pages, 4 figures. V2, V3: improved discussion, typos fixed. V3: published version
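
    Schematically, the signature change described above can be pictured with an FLRW-type effective metric whose time-time prefactor changes sign at the Big Bang; the expressions below are a generic illustration of that mechanism, not the paper's specific matrix-model solution.

```latex
% Generic signature-changing FLRW-type effective metric (illustration only):
% f(t) > 0 for t > 0 gives a Lorentzian phase, f(t) < 0 for t < 0 a Euclidean
% one, and the Hubble rate blows up at the Big Bang t = 0.
\[
  ds^2_{\mathrm{eff}} = -f(t)\, dt^2 + a(t)^2\, d\Sigma_3^2 ,
  \qquad \operatorname{sign} f(t) = \operatorname{sign} t ,
\]
\[
  H(t) = \frac{\dot a(t)}{a(t)} \;\longrightarrow\; \infty \quad \text{as } t \to 0^{+},
  \qquad H(t) \ \text{small for large } t .
\]
```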

    Feature Representation for Online Signature Verification

    Full text link
    Biometric systems have been used in a wide range of applications and have improved user authentication. Signature verification is one of the most common biometric methods, with techniques that employ various specifications of a signature. Recently, deep learning has achieved great success in many fields, such as image, sound and text processing. In this paper, a deep learning method is used for feature extraction and feature selection. Comment: 10 pages, 10 figures, Submitted to IEEE Transactions on Information Forensics and Security
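
    As a minimal sketch of what such a learned feature extractor could look like (an assumed architecture, not the paper's network), a small 1D convolutional encoder in PyTorch can map a variable-length sequence of (x, y, pressure) samples to a fixed-length feature vector for a downstream verifier.

```python
# Assumed encoder (illustration only): 1D convolutions over the pen trajectory,
# global average pooling over time, and a linear projection to a feature vector.
import torch
import torch.nn as nn

class SignatureEncoder(nn.Module):
    def __init__(self, in_channels=3, feat_dim=64):
        super().__init__()
        self.conv = nn.Sequential(
            nn.Conv1d(in_channels, 32, kernel_size=5, padding=2), nn.ReLU(),
            nn.Conv1d(32, 64, kernel_size=5, padding=2), nn.ReLU(),
        )
        self.proj = nn.Linear(64, feat_dim)

    def forward(self, seq):            # seq: (batch, channels, time)
        h = self.conv(seq)             # (batch, 64, time)
        h = h.mean(dim=-1)             # global average pooling over time
        return self.proj(h)            # (batch, feat_dim) feature vector

# Example: embed a batch of two signatures, each 400 samples of (x, y, pressure).
feats = SignatureEncoder()(torch.randn(2, 3, 400))
print(feats.shape)  # torch.Size([2, 64])
```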

    AI Solutions for MDS: Artificial Intelligence Techniques for Misuse Detection and Localisation in Telecommunication Environments

    Get PDF
    This report considers the application of Artificial Intelligence (AI) techniques to the problem of misuse detection and misuse localisation within telecommunications environments. A broad survey of techniques is provided, covering, inter alia, rule-based systems, model-based systems, case-based reasoning, pattern matching, clustering and feature extraction, artificial neural networks, genetic algorithms, artificial immune systems, agent-based systems, data mining and a variety of hybrid approaches. The report then considers the central issue of event correlation, which is at the heart of many misuse detection and localisation systems. The notion of being able to infer misuse by the correlation of individual temporally distributed events within a multiple data stream environment is explored, and a range of techniques is reviewed, covering model-based approaches, `programmed' AI and machine learning paradigms. It is found that, in general, correlation is best achieved via rule-based approaches, but that these suffer from a number of drawbacks, such as the difficulty of developing and maintaining an appropriate knowledge base, and the lack of ability to generalise from known misuses to new, unseen misuses. Two distinct approaches are evident. One attempts to encode knowledge of known misuses, typically within rules, and use this to screen events. This approach cannot generally detect misuses for which it has not been programmed, i.e. it is prone to issuing false negatives. The other attempts to `learn' the features of event patterns that constitute normal behaviour, and, by observing patterns that do not match expected behaviour, detect when a misuse has occurred. This approach is prone to issuing false positives, i.e. inferring misuse from innocent patterns of behaviour that the system was not trained to recognise. Contemporary approaches are seen to favour hybridisation, often combining detection or localisation mechanisms for both abnormal and normal behaviour, the former to capture known cases of misuse, the latter to capture unknown cases. In some systems, these mechanisms even work together to update each other to increase detection rates and lower false positive rates. It is concluded that hybridisation offers the most promising future direction, but that a rule- or state-based component is likely to remain, being the most natural approach to the correlation of complex events. The challenge, then, is to mitigate the weaknesses of canonical programmed systems such that learning, generalisation and adaptation are more readily facilitated.
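
    As a toy illustration of the hybrid detection idea discussed above, the sketch below combines a rule-based check for a known misuse pattern with a simple learned profile of normal behaviour; the rule, event types and thresholds are all invented for the example.

```python
# Hybrid sketch: a rule-based component flags known misuse patterns in an event
# window, an anomaly component flags departures from learned normal behaviour,
# and either one can raise an alert (trading false negatives against false positives).
from collections import Counter

KNOWN_MISUSE_RULES = [
    # Hypothetical rule: many failed authentications within one event window.
    lambda events: sum(e["type"] == "auth_fail" for e in events) >= 5,
]

def rule_based_alert(events):
    return any(rule(events) for rule in KNOWN_MISUSE_RULES)

def fit_normal_profile(history):
    """Learn a simple profile of normal behaviour: relative event-type frequencies."""
    counts = Counter(e["type"] for e in history)
    total = sum(counts.values())
    return {t: c / total for t, c in counts.items()}

def anomaly_alert(profile, events, threshold=0.25):
    """Alert if too large a fraction of the window was never seen in normal traffic."""
    unseen = sum(1 for e in events if e["type"] not in profile)
    return bool(events) and unseen / len(events) > threshold

def hybrid_detect(profile, window):
    return rule_based_alert(window) or anomaly_alert(profile, window)

# Example: profile learned from normal traffic, then a suspicious event window.
profile = fit_normal_profile([{"type": "login"}, {"type": "call"}, {"type": "logout"}] * 50)
window = [{"type": "auth_fail"}] * 6 + [{"type": "intl_call_burst"}] * 2
print(hybrid_detect(profile, window))  # True
```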