On Multiview Analysis for Fingerprint Liveness Detection
Fingerprint recognition systems, like any other biometric system, can be subject to attacks, which are usually carried out using artificial fingerprints. Several approaches to discriminating between live and fake fingerprint images have been presented to address this issue. These methods usually rely on the analysis of individual features extracted from the fingerprint images. Such features represent different and complementary views of the object under analysis, and their fusion is likely to improve classification accuracy. However, very little work in this direction has been reported in the literature. In this work, we present the results of a preliminary investigation on multiview analysis for fingerprint liveness detection. Experimental results show the effectiveness of such an approach, which improves on previous results in the literature.
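To make the fusion idea concrete, here is a minimal sketch of score-level multiview fusion, assuming one probabilistic classifier per feature view; the per-view features and the simple mean rule are illustrative assumptions, not necessarily the paper's exact method.

```python
import numpy as np
from sklearn.svm import SVC

def fuse_views(train_views, labels, test_views):
    """Train one probabilistic classifier per feature 'view' and fuse
    the posterior scores with the mean rule.

    train_views/test_views: lists of (n_samples, n_features) arrays,
    one array per view; labels: 1 = live, 0 = fake.
    """
    scores = []
    for X_train, X_test in zip(train_views, test_views):
        clf = SVC(probability=True).fit(X_train, labels)
        scores.append(clf.predict_proba(X_test)[:, 1])  # P(live | view)
    return np.mean(scores, axis=0)  # fused liveness score per test image
```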
Repeatability and Reproducibility of Decisions by Latent Fingerprint Examiners
The interpretation of forensic fingerprint evidence relies on the expertise of latent print examiners. We tested latent print examiners on the extent to which they reached consistent decisions. This study assessed intra-examiner repeatability by retesting 72 examiners on comparisons of latent and exemplar fingerprints after an interval of approximately seven months; each examiner was reassigned 25 image pairs for comparison, out of a total pool of 744 image pairs. We compared these repeatability results with reproducibility (inter-examiner) results derived from our previous study. Examiners repeated 89.1% of their individualization decisions and 90.1% of their exclusion decisions; most of the changed decisions resulted in inconclusive decisions. Repeatability of comparison decisions (individualization, exclusion, inconclusive) was 90.0% for mated pairs and 85.9% for nonmated pairs. Repeatability and reproducibility were notably lower for comparisons assessed by the examiners as “difficult” than for “easy” or “moderate” comparisons, indicating that examiners' assessments of difficulty may be useful for quality assurance. No false positive errors were repeated (n = 4); 30% of false negative errors were repeated. One percent of latent value decisions were completely reversed (no value even for exclusion vs. of value for individualization). Most of the inter- and intra-examiner variability concerned whether the examiners considered the information available to be sufficient to reach a conclusion; this variability was concentrated on specific image pairs, such that repeatability and reproducibility were very high on some comparisons and very low on others. Much of the variability appears to be due to making categorical decisions in borderline cases.
Complex systems and the technology of variability analysis
Characteristic patterns of variation over time, namely rhythms, represent a defining feature of complex systems, one that is synonymous with life. Despite the intrinsic dynamic, interdependent and nonlinear relationships of their parts, complex biological systems exhibit robust systemic stability. Applied to critical care, it is the systemic properties of the host response to a physiological insult that manifest as health or illness and determine outcome in our patients. Variability analysis provides a novel technology with which to evaluate the overall properties of a complex system. This review highlights the means by which we scientifically measure variation, including analyses of overall variation (time domain analysis, frequency distribution, spectral power), frequency contribution (spectral analysis), scale-invariant (fractal) behaviour (detrended fluctuation and power law analysis) and regularity (approximate and multiscale entropy). Each technique is presented with a definition, interpretation, clinical application, advantages, limitations and a summary of its calculation. The ubiquitous association between altered variability and illness is highlighted, followed by an analysis of how variability analysis may significantly improve prognostication of severity of illness and guide therapeutic intervention in critically ill patients.
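As one concrete instance of the regularity measures listed above, here is a minimal sketch of approximate entropy for a 1-D series; the tolerance heuristic r = 0.2 × SD is a common convention rather than something prescribed by this review.

```python
import numpy as np

def approximate_entropy(x, m=2, r=None):
    """Approximate entropy ApEn(m, r) of a 1-D series.

    Lower values indicate greater regularity; altered (often reduced)
    variability is the pattern associated with illness in the review.
    """
    x = np.asarray(x, dtype=float)
    n = len(x)
    if r is None:
        r = 0.2 * np.std(x)  # common heuristic tolerance

    def phi(m):
        # All overlapping length-m templates of x.
        t = np.array([x[i:i + m] for i in range(n - m + 1)])
        # Chebyshev distance between every pair of templates.
        d = np.max(np.abs(t[:, None, :] - t[None, :, :]), axis=2)
        # Log of the fraction of templates within tolerance r
        # (self-matches included, so the argument is never zero).
        return np.mean(np.log(np.mean(d <= r, axis=1)))

    return phi(m) - phi(m + 1)
```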
Liveness Detection Competition 2009
The widespread use of personal verification systems based on fingerprints has revealed some security weaknesses. Gian Luca Marcialis, assistant professor at the Department of Electrical and Electronic Engineering of the University of Cagliari, reports on the first international Fingerprint Liveness Detection Competition 2009 (LivDet 2009).
It is well known that a fingerprint verification system can be deceived by submitting artificial reproductions of fingerprints made of silicone or gelatine. In a successful attack, these images are processed as “true” fingerprints, allowing a perpetrator to bypass security.
A possible solution in the field of fingerprint verification – especially remote verification – is known as “liveness detection”: a standard verification system is coupled with additional hardware or software modules aimed at certifying the authenticity of the submitted fingerprints.
Hardware-based solutions are the most expensive; software-based ones attempt to measure liveness from characteristics of the images themselves, simply by applying image processing algorithms.
In order to assess the main achievements of fingerprint liveness detection, the Department of Electrical and Electronic Engineering of the University of Cagliari, in cooperation with the Department of Electrical and Computer Engineering of Clarkson University, is holding the first Fingerprint Liveness Detection Competition (LivDet 2009) during the 15th International Conference on Image Analysis and Processing (ICIAP 2009). LivDet 2009 is open to all academic and industrial institutions that have a solution to the software-based fingerprint vitality (or “liveness”) detection problem.
Each participant is invited to submit their algorithm as a Win32 console application. Performance will be evaluated using a very large data set of “fake” and “live” fingerprint images captured with three different optical scanners. The performance ranking will be compiled and published, and the best algorithm will win the “Best Fingerprint Liveness Detection Algorithm” award at ICIAP 2009.
Fingerprint liveness detection background
The duplication of fingerprints (also known as fingerprint spoofing) has remote origins in fantasy novels from the beginning of the twentieth century. More recently, the subject has become the focus of numerous research groups, both academic and industrial. Some of the first studies date back to 2000 and 2002.1,2 These works showed the possibility of reproducing fingerprints and defrauding a biometric system. Two different methods were established:
With cooperation
(1) The user puts his or her finger on a soft material such as Play-Doh, dental impression material or plaster to form the mould (see figure 1).
Figure 1. Consensual method – individual places their finger on a soft material.
(2) The negative impression of the fingerprint is fixed on the surface.
(3) Silicone liquid or another similar material, such as wax or gelatine, is poured into the mould (see figure 2).
Figure 2. Consensual method – the negative impression.
(4) When the liquid has hardened, the spoof is formed (figure 3).
Figure 3. Consensual method – the stamp with the reproduction of the pattern.
Without cooperation
(1) A latent print left by an unintentional user is enhanced with powder applied by a brush (figure 4).
Figure 4. Unconsensual method – latent print.
(2) The fingerprint is photographed and the image is printed in negative on a transparency (figure 5).
Figure 5. Unconsensual method – mask with the pattern for the lithographic process.
(3) The transparency is placed over a printed circuit board (PCB) and exposed to UV light (figure 6).
Figure 6. Unconsensual method – UV development process.
(4) When the photosensitive layer of the board has been developed, the surface is etched in an acid solution (see figure 7).
Figure 7. Unconsensual method – etching process.
(5) The pattern etched into the copper is the mould for the stamp (see figure 8).
Figure 8. Unconsensual method – the negative pattern engraved in the copper.
(6) Liquid silicone, or a similar material such as gelatine or wax, is dripped onto the board (figure 9).
Figure 9. Unconsensual method – silicone stamp.
When faced with this threat, a biometric device must decide whether the finger on the acquisition sensor is alive or fake. In other words, the recognition process must be upgraded with an added function for detecting the “vitality” of the submitted biometric.
Due to the difficulty of the task, the first goal is to achieve good “liveness” detection rates when the consensual method is applied. It is worth noting that this method results in the best-quality replicas and images, since the subject is “consensual”.
Liveness detection can be performed by adding hardware to the capture device (eg for checking blood pressure or heartbeat, which are not present in a “fake” finger), thus increasing its cost. A more challenging and difficult solution is to integrate into standard fingerprint sensors an additional algorithm able to detect the degree of “liveness” from the captured image: the so-called “software-based” approach. For software-based liveness, the question is: are there biometric “liveness” measurements which can be extracted from captured images?
So far, several algorithms for detecting fingerprint liveness have been proposed,3,4,5 but the main problem is to understand how these algorithms may impact fingerprint verification systems when integrated into them. In particular, the objective of this competition is to evaluate the performance of the various approaches under a shared and well-defined experimental protocol, in order to assess which may be more suitable for the task.
Aims
The goal of the competition is to compare different methodologies for software-based fingerprint liveness detection using a common experimental protocol and data set. Its ambition is to become the reference event for academic and industrial research. The competition is not intended as an official system for quality certification of the proposed solutions; rather, it hopes to advance the state of the art in this crucial field: security in biometric systems.
Experimental protocol and evaluation
Due to the wide variety of current liveness detection algorithms, the competition defines some constraints for the submitted algorithms:
• Methods must output, for each image, a “liveness degree” ranging from 0 to 100 (eg the posterior probability of the “true” class); a minimal sketch of this mapping is given after the list.
• A training set of fake and live fingerprint images will be made available to each participant, freely downloadable from the LivDet site after registration. These images are a small subset of the evaluation data set.
• Each submitted algorithm, as a Win32 console application, must follow the required input and output sequence.
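The following is a minimal sketch of how a submission might map a classifier's posterior probability to the required 0–100 degree; it is an illustrative assumption, not official competition code.

```python
def liveness_degree(posterior_live):
    """Map P(live | image) in [0, 1] to the protocol's 0-100 liveness
    degree (100 = maximum liveness, 0 = fake)."""
    p = min(max(float(posterior_live), 0.0), 1.0)  # clamp to [0, 1]
    return int(round(100 * p))

# Example: a posterior of 0.873 becomes a liveness degree of 87.
assert liveness_degree(0.873) == 87
```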
Data set
The data set for the final evaluation comprises three sub-sets, which contain live and fake fingerprint images from three different optical sensors:

DATASET     Scanner       Model No.        Resolution (dpi)   Image size
Dataset #1  Crossmatch    Verifier 300LC   500                480×640
Dataset #2  Identix       DFR2100          686                720×720
Dataset #3  Biometrika    FX2000           569                312×372

DATASET     Live samples   Fake samples
Dataset #1  1500           1500
Dataset #2  2000           2000
Dataset #3  2000           2000

Images have been collected consensually, using different materials for the artificial reproduction of the fingerprints (gelatine, silicone, Play-Doh).
Algorithm submission
Each submitted algorithm must be a Win32 console application with the following list of parameters: LIVENESS_XYZ.exe [ndataset] [inputfile] [outputfile]
LIVENESS_XYZ.exe: the executable name, where XYZ is the identification number of the participant. Format: Win32 console application (.exe).
[ndataset]: the identification number of the data set to analyse (1 = Crossmatch, 2 = Identix, 3 = Biometrika).
[inputfile]: a text file with the list of images to analyse. Each image is in RAW format (ASCII).
[outputfile]: a text file with the output for each processed image, in the same order as the input file. The output is the posterior probability of the live class given the image, or a degree of “liveness” normalised in the range 0 to 100 (100 is the maximum degree of liveness; 0 means that the image is fake). If the algorithm is unable to process an image, the corresponding output must be −1000 (failure to enrol).
A minimal sketch of a conforming command-line harness is given below.
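The following hypothetical Python analogue illustrates the required console behaviour (an actual submission must be a Win32 .exe); score_image is a placeholder for a participant's own classifier.

```python
import sys

def score_image(path, ndataset):
    """Placeholder for a participant's classifier: load the RAW image
    at `path` and return a liveness degree in [0, 100]."""
    raise NotImplementedError

def main(ndataset, inputfile, outputfile):
    # Read the list of images and write one score per line,
    # preserving the order of the input file.
    with open(inputfile) as fin, open(outputfile, "w") as fout:
        for line in fin:
            path = line.strip()
            if not path:
                continue
            try:
                score = score_image(path, int(ndataset))
            except Exception:
                score = -1000  # protocol code for failure to enrol
            fout.write(f"{score}\n")

if __name__ == "__main__":
    # Mirrors: LIVENESS_XYZ.exe [ndataset] [inputfile] [outputfile]
    main(*sys.argv[1:4])
```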
Each parameter related to the data set configuration must be set before submission. Participants can configure their algorithms using the training set available after registration.
Only Win32 console applications with the above characteristics will be accepted for the competition.
Participants may also publish the source code of their algorithm, but this is not mandatory. They are also invited to prepare a paper describing the solution submitted to LivDet, for submission to the ICIAP conference.
Performance evaluation
The parameters adopted for the performance evaluation will be:
• Evaluation per sensor:
– Frej_n: rate of failure to enrol for sub-set n.
– Fcorrlive_n: rate of correctly classified live fingerprints for sub-set n.
– Fcorrfake_n: rate of correctly classified fake fingerprints for sub-set n.
– Ferrlive_n: rate of misclassified live fingerprints for sub-set n.
– Ferrfake_n: rate of misclassified fake fingerprints for sub-set n.
– ET: average processing time per image.
– MAM: maximum allocated memory while the algorithm is running.
• Overall evaluation:
– Frej: rate of failure to enrol.
– Fcorrlive: rate of correctly classified live fingerprints.
– Fcorrfake: rate of correctly classified fake fingerprints.
– Ferrlive: rate of misclassified live fingerprints.
– Ferrfake: rate of misclassified fake fingerprints.
A sketch of how these rates can be computed from ground-truth labels and submitted scores is given below.
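As an illustrative sketch (the 50-point decision threshold is an assumption, not part of the protocol), the classification rates can be computed from labels and scores as follows:

```python
import numpy as np

def livdet_rates(y_true, scores, threshold=50, failure=-1000):
    """Per-subset rates from labels (1 = live, 0 = fake) and scores.

    Scores equal to `failure` count toward the failure-to-enrol rate
    and are excluded from the classification rates.
    """
    y_true, scores = np.asarray(y_true), np.asarray(scores)
    failed = scores == failure
    frej = failed.mean()
    ok = ~failed
    pred_live = scores[ok] >= threshold        # assumed decision rule
    live, fake = y_true[ok] == 1, y_true[ok] == 0
    fcorrlive = (pred_live & live).sum() / max(live.sum(), 1)
    fcorrfake = (~pred_live & fake).sum() / max(fake.sum(), 1)
    return {"Frej": frej,
            "Fcorrlive": fcorrlive, "Ferrlive": 1 - fcorrlive,
            "Fcorrfake": fcorrfake, "Ferrfake": 1 - fcorrfake}
```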
LivDet 2009 – Important dates and contacts
LivDet registration deadline: 28 February 2009
Deadline for exe-file submission: 30 April 2009
Organising Committee
• Professor Fabio Roli, University of Cagliari (Italy), [email protected]
• Professor Stephanie Schuckers, Clarkson University (USA), [email protected]
• Dr Gian Luca Marcialis, University of Cagliari (Italy), [email protected]
Conclusions
It is hoped that this first fingerprint liveness detection competition will be followed by further editions. In particular, our expectation is that the proposed experimental protocol and data sets may become a standard reference point for the research community.
In our opinion, only a well-shared and well-defined common approach can lead to significant achievements in this field, and we hope this first trial will serve as an event that focuses discussion and further proposals.
An algorithm for EMG noise detection in large ECG data
Large collections of electrocardiogram (ECG) recordings are valuable for researchers. However, some sections of the recorded ECG may be corrupted by electromyogram (EMG) noise from muscle activity. Therefore, EMG noise needs to be detected and filtered out before further data processing. In this study, an automated algorithm for detecting EMG noise in large ECG data sets is presented. The algorithm extracts the EMG artifact from the ECG by using a morphological filter; EMG is then identified by setting a threshold on the moving variance of the extracted EMG. The algorithm achieved a 100% detection rate on the training data. It was tested on 150 test signals from three sets (50 signals in each set): set 1 was created by adding EMG noise to EMG-free ECG signals, set 2 consisted of manually selected ECG sections containing EMG noise, and set 3 contained randomly selected ECG signals. Sensitivity was 100%, 94%, and 100% on sets 1, 2, and 3, respectively. All sets had 100% specificity. The algorithm has computational complexity of O(N). © 2004 IEEE
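A minimal sketch of the two-stage idea this abstract describes: estimate the EMG-like residual with a morphological filter, then flag segments whose moving variance exceeds a threshold. The structuring-element size, window length and threshold here are assumptions for illustration, not the paper's tuned values.

```python
import numpy as np
from scipy.ndimage import grey_opening, grey_closing, uniform_filter1d

def detect_emg(ecg, fs, struct_ms=80, win_s=1.0, thresh=0.01):
    """Return a boolean mask marking samples judged to contain EMG noise.

    ecg: 1-D signal array; fs: sampling rate in Hz. Parameter values
    are illustrative assumptions, not the paper's settings.
    """
    ecg = np.asarray(ecg, dtype=float)
    size = max(int(fs * struct_ms / 1000), 1)
    # Morphological smoothing: the average of a grey-scale opening and
    # closing suppresses narrow high-frequency excursions while keeping
    # the overall ECG waveform shape.
    smooth = 0.5 * (grey_opening(ecg, size) + grey_closing(ecg, size))
    residual = ecg - smooth                      # extracted EMG-like noise
    # Moving variance of the residual via running means: O(N) overall.
    win = max(int(fs * win_s), 1)
    mean = uniform_filter1d(residual, win)
    var = uniform_filter1d(residual ** 2, win) - mean ** 2
    return var > thresh
```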