
    Terahertz Security Image Quality Assessment by No-reference Model Observers

    To enable the development of objective image quality assessment (IQA) algorithms for terahertz (THz) security images, we constructed the THz security image database (THSID), comprising 181 THz security images at a resolution of 127×380. The main distortion types in THz security images were first analyzed in order to design the subjective evaluation criteria used to acquire mean opinion scores. Subsequently, existing no-reference IQA algorithms were evaluated: five opinion-aware approaches (NFERM, GMLF, DIIVINE, BRISQUE, and BLIINDS2) and seven opinion-unaware approaches (QAC, SISBLIM, NIQE, FISBLIM, CPBD, S3, and Fish_bb). The statistical results demonstrated the superiority of Fish_bb over the other tested IQA approaches for assessing THz image quality, with PLCC and SROCC values of 0.8925 and -0.8706 and an RMSE of 0.3993. Linear regression analysis and a Bland-Altman plot further verified that Fish_bb could substitute for subjective IQA. Nonetheless, for the classification of THz security images, we prefer S3 as the criterion for ranking THz security image grades because of its relatively low false positive rate (24.69%) when classifying bad THz image quality into the acceptable category. Interestingly, owing to the specific properties of THz images, the average pixel intensity performed better than the more complex IQA algorithms above, with PLCC, SROCC, and RMSE of 0.9001, -0.8800, and 0.3857, respectively. This study will help users such as researchers and security staff obtain THz security images of good quality. Our research group is currently working to make this research more comprehensive.
    Comment: 13 pages, 8 figures, 4 tables
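    The PLCC, SROCC, and RMSE figures quoted above can be computed for any candidate metric against the mean opinion scores. A minimal sketch (the score values below are illustrative, not THSID data):

```python
# Sketch: comparing an objective IQA metric against subjective
# mean opinion scores (MOS) with PLCC, SROCC, and RMSE.
# NOTE: the score arrays are made-up illustrative values.
import numpy as np
from scipy.stats import pearsonr, spearmanr

mos = np.array([1.2, 2.5, 3.1, 3.8, 4.6])        # subjective MOS
predicted = np.array([1.0, 2.7, 3.0, 4.1, 4.4])  # objective metric output

plcc, _ = pearsonr(predicted, mos)    # linear correlation: prediction accuracy
srocc, _ = spearmanr(predicted, mos)  # rank correlation: prediction monotonicity
rmse = np.sqrt(np.mean((predicted - mos) ** 2))

print(f"PLCC={plcc:.4f}, SROCC={srocc:.4f}, RMSE={rmse:.4f}")
```

    A negative SROCC, as reported for Fish_bb and S3 above, simply means the metric decreases as perceived quality increases; its magnitude is what matters.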

    Evaluation of Biometric Systems

    Biometrics is considered a promising alternative to traditional authentication methods based on "what we own" (such as a key) or "what we know" (such as a password): it is based on "what we are" and "how we behave". Few people know that biometrics has been used for ages for identification or signature purposes. In 1928, for example, fingerprints were taken from women clerical employees of the Los Angeles Police Department, as depicted in Figure 1. Fingerprints were already used as a signature for commercial exchanges in Babylon (around 3000 BC). Alphonse Bertillon proposed in 1879 to use anthropometric information for police investigation, and nowadays police forces all over the world use this kind of information to solve crimes. The first prototypes of terminals providing automatic processing of voice and fingerprints were developed in the mid-1970s. Nowadays, biometric authentication systems have many applications [1]: border control, e-commerce, etc. The main benefits of this technology are better security and a simpler authentication process for the user. It is also usually more difficult to copy the biometric characteristics of an individual than to defeat most other authentication methods, such as passwords. Despite these obvious advantages, biometric systems have not spread as widely as expected. The main drawback is the uncertainty of the verification result: in contrast to password checking, the verification of biometric raw data is subject to errors and is expressed as a similarity percentage (100% is never reached). Other drawbacks related to vulnerabilities and usability also exist. In addition, to be usable in an industrial context, the quality of a biometric system must be precisely quantified, so a reliable evaluation methodology is needed to demonstrate the benefit of a new biometric system. Moreover, many questions remain: Should we trust this technology? What kinds of biometric modalities can be used? What are the trends in this domain? The objective of this chapter is to answer these questions by presenting an evaluation methodology for biometric systems.

    Fast computation of the performance evaluation of biometric systems: application to multibiometric

    The performance evaluation of biometric systems is a crucial step when designing and evaluating such systems. The evaluation process uses the Equal Error Rate (EER) metric proposed by the International Organization for Standardization (ISO/IEC). The EER is a powerful metric that allows biometric systems to be compared and evaluated easily. However, computing the EER is usually very time-consuming. In this paper, we propose a fast method that computes an approximate value of the EER. We illustrate the benefit of the proposed method on two applications: the computation of nonparametric confidence intervals and the use of genetic algorithms to learn the parameters of fusion functions. Experimental results show the superiority of the proposed EER approximation method in terms of computing time, and its usefulness in reducing the cost of learning parameters with genetic algorithms. The proposed method opens new perspectives for the development of secure multibiometric systems by speeding up their computation time.
    Comment: Future Generation Computer Systems (2012)
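    For context, the EER is the operating point where the false acceptance rate (FAR) equals the false rejection rate (FRR). With finite score sets the two curves rarely cross exactly, so a common baseline approximation (a generic sketch, not the paper's fast method, and with illustrative scores) sweeps the observed thresholds and averages FAR and FRR at the closest point:

```python
# Baseline EER estimate: sweep decision thresholds over the observed
# similarity scores and take the point where FAR and FRR are closest.
def eer(genuine, impostor):
    """Approximate EER for similarity scores (higher = more alike)."""
    thresholds = sorted(set(genuine) | set(impostor))
    best = None
    for t in thresholds:
        far = sum(s >= t for s in impostor) / len(impostor)  # false accepts
        frr = sum(s < t for s in genuine) / len(genuine)     # false rejects
        gap = abs(far - frr)
        if best is None or gap < best[0]:
            best = (gap, (far + frr) / 2)
    return best[1]

# Made-up example scores for one verification system.
genuine = [0.9, 0.8, 0.85, 0.7, 0.95, 0.6]
impostor = [0.3, 0.4, 0.2, 0.5, 0.65, 0.1]
print(f"approximate EER: {eer(genuine, impostor):.3f}")
```

    This brute-force sweep is quadratic in the number of scores once FAR/FRR recomputation is counted, which is exactly the cost the paper's approximation method aims to avoid.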

    Usability and Trust in Information Systems

    The need for people to protect themselves and their assets is as old as humankind. People's physical safety and their possessions have always been at risk from deliberate attack or accidental damage. The advance of information technology means that many individuals, as well as corporations, have an additional range of physical (equipment) and electronic (data) assets that are at risk. Furthermore, the increased number and types of interactions in cyberspace have enabled new forms of attack on people and their possessions. Consider the grooming of minors in chat-rooms, or Nigerian email cons: minors were targeted by paedophiles before the creation of chat-rooms, and Nigerian criminals sent the same letters by physical mail or fax before there was email. But the technology has decreased the cost of many types of attack, or the degree of risk for the attackers. At the same time, cyberspace is still new to many people, which means they do not understand risks, or recognise the signs of an attack, as readily as they might in the physical world. The IT industry has developed a plethora of security mechanisms that could be used to mitigate risks or make attacks significantly more difficult. Currently, many people are either not aware of these mechanisms, or are unable or unwilling to use them. Security experts have taken to portraying people as "the weakest link" in their efforts to deploy effective security [e.g. Schneier, 2000]. However, recent research has revealed that at least some of the problem may be that security mechanisms are hard to use, or are ineffective. This review summarises current research on the usability of security mechanisms, and discusses options for increasing their usability and effectiveness.

    eCulture: examining and quantifying cultural differences in user acceptance between Chinese and British web site users

    A thesis submitted for the degree of Doctor of Philosophy of the University of Luton.
    The World Wide Web (WWW) has become an important medium for communication between people all over the world. It is regarded as a global system and is associated with a wide diversity of users and social systems. The effects of differing user groups and their associated cultures on user acceptance of web sites can be significant; as a result, understanding the behaviour of web users in various cultures is becoming a significant concern. The eCulture research project is based on previous classical theories and research in culture. It applies a factorial experimental design strategy (the Taguchi method) to cross-cultural usability / acceptability, together with other approaches such as semiotic analysis and card sorting. Two types of analysis, top-down and bottom-up, have been implemented to investigate differences in web site usability and acceptability between users from Mainland China and the United Kingdom. Based on experiments investigating the relationship between cultural issues and usability / acceptability aspects between Chinese and British web users, several issues have emerged, such as cultural factors, cognitive abilities, and social semiotic differences. One of the goals has been to develop 'cultural fingerprints' for both web sites and users in different cultures. By comparing cultural and site fingerprints, the usability and acceptability of web sites can be diagrammatically matched to the target culture. Experiments investigating qualitative factors, together with quantitative data collection and analysis based on the Taguchi method, have led to the successful development of two versions of the 'cultural fingerprint' for both web sites and target cultures in the UK and China. It has been possible to relate these studies to a wider body of knowledge, and to suggest ways in which the work may be extended in the future.

    Evaluation of a Vein Biometric Recognition System on an Ordinary Smartphone

    Nowadays, biometrics based on vein patterns is a promising technique. Vein patterns satisfy universality, distinctiveness, permanence, performance, and protection against circumvention. However, collectability and acceptability are not completely satisfied; these two properties are directly related to acquisition methods. The acquisition of vein images is usually based on the absorption of near-infrared (NIR) light by the hemoglobin inside the veins, which is higher than in the surrounding tissues. Typically, specific devices are designed to improve the quality of the vein images. However, such devices increase collectability costs and reduce acceptability. This paper focuses on using commercial smartphones with ordinary cameras as potential devices to improve collectability and acceptability. In particular, we use smartphone applications (apps), mainly employed for medical purposes, to acquire images with the smartphone camera and improve the contrast of superficial veins, as if using infrared LEDs. A recognition system has been developed that employs the free IRVeinViewer app to acquire images from wrists and dorsal hands, and a feature extraction algorithm based on SIFT (scale-invariant feature transform) with adequate pre- and post-processing stages. The recognition performance has been evaluated on a database of 1000 vein images: five samples from each of 20 wrists and 20 dorsal hands, acquired at different times of day, from people of different ages and genders, under five different environmental conditions: outdoors in daylight, indoors with natural light, indoors with natural light and a dark homogeneous background, indoors with artificial light, and darkness. The variability of the images acquired in different sessions and under different ambient conditions has a large influence on the recognition rates, such that our results are similar to those of other systems in the literature that employ specific smartphones and additional light sources. Since reported quality assessment algorithms do not help to reject poorly acquired images, we have evaluated a solution at enrollment and matching that acquires several images in succession, computes their similarity, and accepts only the samples whose similarity is greater than a threshold. This improves recognition, and it is practical since our implemented system works in real time on Android and the usability of the acquisition app is high.
    Funding: MCIN/AEI/10.13039/50110001103 Grant PDC2021-121589-I00; Fondo Europeo de Desarrollo Regional (FEDER) and Consejería de Transformación Económica, Industria, Conocimiento y Universidades de la Junta de Andalucía Grant US-126514
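    The enrollment/matching safeguard described above (acquire several images in a burst, score each pair, accept only mutually similar samples) can be sketched as follows. The `similarity` function here is a hypothetical stand-in: the real system scores pairs by matching SIFT keypoint descriptors of two vein images.

```python
# Sketch of a burst-acquisition quality gate: keep a set of captures
# only if every pair scores above a similarity threshold.
from itertools import combinations

def similarity(a, b):
    # Placeholder matcher: overlap of feature sets (Jaccard index).
    # A real implementation would compare SIFT descriptors instead.
    return len(set(a) & set(b)) / max(len(set(a) | set(b)), 1)

def accept_capture(samples, threshold=0.5):
    """Accept a burst of samples only if every pair is similar enough."""
    scores = [similarity(a, b) for a, b in combinations(samples, 2)]
    return all(s >= threshold for s in scores), scores

# Three captures of the same wrist: feature sets overlap heavily.
good_burst = [{1, 2, 3, 4}, {1, 2, 3, 5}, {1, 2, 4, 5}]
# One badly acquired image shares almost nothing with the others.
bad_burst = [{1, 2, 3, 4}, {1, 2, 3, 5}, {8, 9}]

print(accept_capture(good_burst)[0])  # True
print(accept_capture(bad_burst)[0])   # False
```

    Requiring all pairwise scores (rather than an average) to clear the threshold makes the gate reject a burst as soon as a single poorly acquired image disagrees with the rest, which matches the stated goal of filtering bad acquisitions before they reach the matcher.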