Pattern mining approaches used in sensor-based biometric recognition: a review
Sensing technologies have generated significant interest in the use of biometrics for the recognition and assessment of individuals. Pattern mining techniques have become a critical step in the progress of sensor-based biometric systems capable of perceiving, recognizing and computing sensor data: they search for high-level pattern information in low-level sensor readings in order to construct an artificial substitute for human recognition. The design of a successful sensor-based biometric recognition system must pay attention to the different issues involved in processing variable data: acquisition of biometric data from a sensor, data pre-processing, feature extraction, recognition and/or classification, clustering, and validation. A significant number of approaches from image processing, pattern identification and machine learning have been used to process sensor data. This paper aims to deliver a state-of-the-art summary and present strategies for applying the widely used pattern mining methods, in order to identify the challenges as well as future research directions of sensor-based biometric systems.
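The processing stages the abstract lists (acquisition, pre-processing, feature extraction, classification) can be sketched as a toy pipeline. Every function name, the features, and the nearest-template classifier below are illustrative assumptions, not the methods surveyed in the paper.

```python
# Toy sketch of the sensor-based biometric pipeline stages: raw
# readings are pre-processed, reduced to features, and matched against
# enrolled templates. All choices here are illustrative assumptions.

def preprocess(raw):
    """Normalise a raw sensor reading to zero mean."""
    mean = sum(raw) / len(raw)
    return [x - mean for x in raw]

def extract_features(signal):
    """Toy features: signal energy and peak-to-peak range."""
    return (sum(x * x for x in signal), max(signal) - min(signal))

def classify(features, templates):
    """Return the label of the nearest enrolled template (Euclidean)."""
    def dist(a, b):
        return sum((x - y) ** 2 for x, y in zip(a, b))
    return min(templates, key=lambda label: dist(features, templates[label]))

# Enrollment: one template per known subject (names are made up)
templates = {
    "subject_a": extract_features(preprocess([1.0, 3.0, 1.0, 3.0])),
    "subject_b": extract_features(preprocess([0.0, 10.0, 0.0, 10.0])),
}
probe = extract_features(preprocess([1.1, 2.9, 1.0, 3.1]))
print(classify(probe, templates))  # → subject_a
```

Real systems replace each stage with the image-processing and machine-learning techniques the survey reviews; only the stage boundaries are fixed.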
FLAG : the fault-line analytic graph and fingerprint classification
Fingerprints can be classified into millions of groups by quantitative measurements of their new representations, Fault-Line Analytic Graphs (FLAG), which describe the relationship between ridge flows and singular points. Because this new model is highly mathematical, human interpretation can be reduced to a minimum and identification time can be significantly reduced.
Fingerprints have some well-known features: singular points, cores and deltas are global features that characterize the fingerprint pattern class, while minutiae are local features that characterize an individual fingerprint image. Singular points are more important than minutiae when classifying fingerprints, because the geometric relationship among the singular points decides the type of fingerprint.
When the number of fingerprint records becomes large, current methods must compare a large number of fingerprint candidates to identify a given fingerprint. This is the result of having only a few synthetic types with which to classify a database of millions of fingerprints. It has been difficult to enlarge the number of classification groups because there was no computational method to systematically describe the geometric relationship among singular points and ridge flows. In order to define a more efficient classification method, this dissertation also provides a systematic approach to detect singular points with almost pinpoint precision of 2x2 pixels using efficient algorithms.
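Singular-point detection is classically done with the Poincaré index over the ridge orientation field; the sketch below shows that standard technique only to illustrate the idea, and is not the dissertation's FLAG-based algorithm. The synthetic patch is made up for the demo.

```python
import math

# Classic Poincare-index test for singular points (cores and deltas)
# in a fingerprint orientation field. Shown as the standard textbook
# technique; the dissertation's own detection algorithm is not
# reproduced here.

def poincare_index(orient, i, j):
    """Sum orientation changes around the 8-neighbourhood of (i, j).

    orient holds ridge orientations in radians (defined modulo pi).
    A total near +pi indicates a core, near -pi a delta, near 0 neither.
    """
    # counterclockwise ring of (row, col) offsets around the centre
    ring = [(0, 1), (-1, 1), (-1, 0), (-1, -1),
            (0, -1), (1, -1), (1, 0), (1, 1)]
    angles = [orient[i + di][j + dj] for di, dj in ring]
    total = 0.0
    for k in range(8):
        d = angles[(k + 1) % 8] - angles[k]
        # orientations are only defined modulo pi: wrap into (-pi/2, pi/2]
        while d > math.pi / 2:
            d -= math.pi
        while d <= -math.pi / 2:
            d += math.pi
        total += d
    return total

# Synthetic 3x3 orientation patch around a core: the orientation turns
# by half a revolution while the ring is traversed once.
core_patch = [[3 * math.pi / 8,  math.pi / 4,  math.pi / 8],
              [math.pi / 2,      0.0,          0.0],
              [-3 * math.pi / 8, -math.pi / 4, -math.pi / 8]]
print(round(poincare_index(core_patch, 1, 1) / math.pi, 2))  # → 1.0
```

The geometric relationships among the points this test finds are exactly what a classification scheme such as FLAG can then encode.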
Automatic signature verification system
Philosophiae Doctor - PhD. In this thesis, we explore dynamic signature verification systems. Unlike other signature models, we use genuine signatures in this project, as they are more appropriate in real-world applications. Signature verification systems are typical examples of biometric devices that use physical and behavioral characteristics to verify that a person really is who he or she claims to be. Other popular biometric examples include fingerprint scanners and hand geometry devices. Handwritten signatures have long been used to endorse financial transactions and legal contracts, although little or no verification of signatures is actually done. This sets signatures apart from other biometrics: they are a well-accepted method of authentication. Until recently, only hidden Markov models were used for model construction. Ongoing research on signature verification has revealed that more accurate results can be achieved by combining the results of multiple models. We also propose to use combinations of multiple univariate models instead of the single multivariate models currently adopted by many systems. Beyond this, the proposed system is an attractive way to make financial transactions more secure and to authenticate electronic documents, as it can be easily integrated into existing transaction procedures and electronic communication.
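The combination of multiple univariate models that the thesis contrasts with a single multivariate model can be sketched as score-level fusion: each feature gets its own one-dimensional model, and the per-model scores are averaged before thresholding. The Gaussian scoring rule, the feature choices, and the threshold are illustrative assumptions, not the actual verification system.

```python
import math

# Minimal sketch of score-level fusion across several univariate
# models, one per signature feature, as opposed to a single
# multivariate model. All parameters are made up for the demo.

def gaussian_score(x, mean, std):
    """Log-likelihood-style score of one feature under one model."""
    return -((x - mean) / std) ** 2 / 2 - math.log(std)

def fused_decision(sample, models, threshold):
    """Average the per-feature model scores; accept if above threshold."""
    scores = [gaussian_score(x, m, s) for x, (m, s) in zip(sample, models)]
    return sum(scores) / len(scores) >= threshold

# One univariate model per signature feature, e.g. stroke duration (s)
# and mean pen speed (mm/s); the (mean, std) pairs are hypothetical.
models = [(1.2, 0.1), (30.0, 5.0)]
print(fused_decision([1.25, 29.0], models, threshold=-1.0))  # genuine-like
print(fused_decision([2.50, 80.0], models, threshold=-1.0))  # forgery-like
```

Fusing independent univariate scores keeps each model small and lets features be added or dropped without retraining a joint density, which is one practical motivation for the combined approach.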
Pattern Recognition
A wealth of advanced pattern recognition algorithms is emerging from the interdiscipline between technologies for effective visual features and the human-brain cognition process. Effective visual features are made possible through rapid developments in appropriate sensor equipment, novel filter designs, and viable information processing architectures, while the understanding of the human-brain cognition process broadens the ways in which computers can perform pattern recognition tasks. The present book is intended to collect representative research from around the globe focusing on low-level vision, filter design, features and image descriptors, data mining and analysis, and biologically inspired algorithms. The 27 chapters covered in this book disclose recent advances and new ideas in promoting the techniques, technology and applications of pattern recognition.
Advances in Information Security and Privacy
With the recent pandemic emergency, many people are spending their days in smart working and have increased their use of digital resources for both work and entertainment. As a result, the amount of digital information handled online has increased dramatically, and we can observe a significant increase in the number of attacks, breaches, and hacks. This Special Issue aims to establish the state of the art in protecting information by mitigating information risks. This objective is reached by presenting both surveys on specific topics and original approaches and solutions to specific problems. In total, 16 papers have been published in this Special Issue.
Data Hiding and Its Applications
Data hiding techniques have been widely used to provide copyright protection, data integrity, covert communication, non-repudiation, and authentication, among other applications. In the context of the increased dissemination and distribution of multimedia content over the internet, data hiding methods, such as digital watermarking and steganography, are becoming increasingly relevant in providing multimedia security. The goal of this book is to focus on the improvement of data hiding algorithms and their different applications (both traditional and emerging), bringing together researchers and practitioners from different research fields, including data hiding, signal processing, cryptography, and information theory, among others
Sign Here!
Sign Here! Handwriting in the Age of New Media features a number of articles from different fields, reaching from cultural and media studies to literature, film and art, and from philosophy and information studies to law and archival studies. Questions addressed in this book are: Will handwriting disappear in the age of new (digital) media? What happens to important cultural and legal concepts, such as original, copy, authenticity, reproducibility, uniqueness, and iterability? Where is the writing hand to be located if handwriting is performed not immediately 'by hand' but when it is (re)mediated by electronic or artistic media? Sign Here! Handwriting in the Age of New Media is the first part in the series Transformations in Art and Culture
Forensic computing strategies for ethical academic writing.
Thesis (M.Com.) - University of KwaZulu-Natal, Westville, 2009. This study resulted in the creation of a conceptual framework for ethical academic writing that can be applied to cases of authorship identification. The framework is the culmination of research into various other forensic frameworks and aspects related to cyber forensics, in order to ensure maximum effectiveness of this newly developed methodology. The research shows how synergies between forensic linguistics and electronic forensics (computer forensics) create the conceptual space for a new, interdisciplinary, cyber forensic linguistics, along with forensic auditing procedures and tools for authorship identification. The research also shows that an individual’s unique word pattern usage can be used to determine document authorship, and that in other instances, authorship can be attributed with a significant degree of probability using the identified process. The importance of this fact cannot be overstated, because accusations of plagiarism have to be based on facts that will withstand cross-examination in a court of law. Therefore, forensic auditing procedures are required when attributing authorship in cases of suspected plagiarism, which is regarded as one of the most serious problems facing any academic institution.
This study identifies and characterises various forms of plagiarism as well as the responses that can be implemented to prevent and deter it. A number of online and offline tools for the detection and prevention of plagiarism are identified, over and above the more commonly used popular tools that, in the author’s view, are overrated because they are based on mechanistic identification of word similarities in source and target texts, rather than on proper grammatical and semantic principles.
Linguistic analysis is a field that is not well understood and is often underestimated. Yet it is a critical field of inquiry in determining specific cases of authorship. The research identifies the various methods of linguistic analysis that could be applied to help establish authorship identity, as well as how they can be applied within a forensic environment. Various software tools that could be used to identify and analyse source documents that were plagiarised are identified and briefly characterised. Concordance, function word analysis and other methods of corpus analysis are explained, along with some of their related software packages. Corpus analysis that in the past would have taken months to perform manually can now take a matter of hours using the correct programs, given the availability of computerised analysis tools.
This research integrates the strengths of these tools within a structurally sound forensic auditing framework, the result of which is a conceptual framework that encompasses all the pertinent factors and ensures admissibility in a court of law by adhering to strict rules and features that are characteristic of the legal requirements for a forensic investigation.
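Function-word analysis, one of the corpus-analysis methods the abstract mentions, can be sketched as a frequency profile over a fixed list of function words; two texts by the same author tend to share a profile even when content words differ. The word list, toy texts, and cosine comparison below are illustrative choices, not the framework's actual procedure.

```python
# Function-word frequency profiling for authorship comparison. A real
# forensic analysis would use a far larger word list and proper
# tokenisation; this is a minimal illustrative sketch.

FUNCTION_WORDS = ["the", "of", "and", "to", "in", "that", "is", "it"]

def profile(text):
    """Relative frequency of each function word in the text."""
    words = text.lower().split()
    return [words.count(w) / len(words) for w in FUNCTION_WORDS]

def similarity(p, q):
    """Cosine similarity between two frequency profiles."""
    dot = sum(a * b for a, b in zip(p, q))
    norm = lambda v: sum(a * a for a in v) ** 0.5 or 1.0
    return dot / (norm(p) * norm(q))

# Two toy texts with different content words but identical
# function-word usage patterns:
known = profile("the aim of the study is to show that the method is sound")
disputed = profile("the goal of the work is to argue that the approach is valid")
print(round(similarity(known, disputed), 2))  # → 1.0
```

Because function words are used largely unconsciously, profiles like these are harder to disguise than vocabulary choice, which is what makes them useful for the kind of attribution the framework supports.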
Data Service Outsourcing and Privacy Protection in Mobile Internet
Mobile Internet data are characterized by large scale, varied patterns, and complex associations. On the one hand, an efficient data processing model is needed to support data services; on the other hand, certain computing resources are needed to provide data security services. Due to the limited resources of mobile terminals, they cannot complete large-scale data computation and storage on their own. However, outsourcing to third parties may pose risks to user privacy. This monograph focuses on the key technologies of data service outsourcing and privacy protection, including existing methods of data analysis and processing, fine-grained data access control through an effective user privacy protection mechanism, and data sharing in the mobile Internet.