6,124 research outputs found

    A smart environment for biometric capture

    No full text
    The development of large-scale biometric systems requires experiments to be performed on large amounts of data. Existing capture systems are designed for fixed experiments and are not easily scalable; in this scenario even the addition of extra data is difficult. We developed a prototype biometric tunnel for the capture of non-contact biometrics. It is self-contained and autonomous, a configuration that is ideal for building access or deployment in secure environments. The tunnel captures cropped images of the subject's face and performs a 3D reconstruction of the person's motion, which is used to extract gait information. Interaction between the various parts of the system takes place via an agent framework, and the design is a trade-off between parallel and serial processing imposed by various hardware bottlenecks. When tested on a small population, the extracted features have been shown to be potent for recognition. We currently achieve a moderate throughput of approximately 15 subjects an hour and hope to improve this as the prototype becomes more complete.
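    A minimal sketch of the parallel/serial trade-off the abstract describes, not the authors' agent framework: a bounded queue between a capture stage and a processing stage stands in for a hardware bottleneck, and the stage names and timings are purely illustrative.

```python
import queue
import threading
import time

# Bounded queue: models a hardware bottleneck between capture and processing.
frames = queue.Queue(maxsize=8)

def capture(subject_count):
    """Illustrative capture stage: pushes per-subject frame bundles."""
    for i in range(subject_count):
        frames.put(f"frames_for_subject_{i}")
        time.sleep(0.01)  # stand-in for camera capture time
    frames.put(None)      # sentinel: capture finished

def process():
    """Illustrative processing stage: face cropping, 3D reconstruction,
    gait extraction would run here (represented only by a sleep)."""
    while True:
        item = frames.get()
        if item is None:
            break
        time.sleep(0.02)
        print("processed", item)

t_cap = threading.Thread(target=capture, args=(5,))
t_proc = threading.Thread(target=process)
t_cap.start()
t_proc.start()
t_cap.join()
t_proc.join()
```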

    Fast computation of the performance evaluation of biometric systems: application to multibiometric

    Full text link
    Performance evaluation is a crucial step in the design and assessment of biometric systems. The evaluation process uses the Equal Error Rate (EER) metric proposed by the International Organization for Standardization (ISO/IEC). The EER is a powerful metric that allows biometric systems to be compared and evaluated easily. However, computing the EER is usually very time-consuming. In this paper, we propose a fast method that computes an approximate value of the EER. We illustrate the benefit of the proposed method on two applications: the computation of non-parametric confidence intervals and the use of genetic algorithms to learn the parameters of fusion functions. Experimental results show the superiority of the proposed EER approximation in terms of computing time and its usefulness in reducing the time needed to learn parameters with genetic algorithms. The proposed method opens new perspectives for the development of secure multibiometric systems by speeding up their computation time. (Future Generation Computer Systems, 2012)
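    For context, the EER is the operating point at which the false acceptance rate (FAR) equals the false rejection rate (FRR). The sketch below is not the paper's fast approximation method; it is a baseline illustration, assuming similarity scores in which higher means a better match, that approximates the EER by scanning a coarse threshold grid.

```python
import numpy as np

def approximate_eer(genuine_scores, impostor_scores, num_thresholds=100):
    """Approximate the EER by scanning a coarse grid of thresholds
    instead of every distinct score value."""
    genuine = np.asarray(genuine_scores, dtype=float)
    impostor = np.asarray(impostor_scores, dtype=float)

    lo = min(genuine.min(), impostor.min())
    hi = max(genuine.max(), impostor.max())
    thresholds = np.linspace(lo, hi, num_thresholds)

    best_gap, eer = np.inf, None
    for t in thresholds:
        frr = np.mean(genuine < t)    # genuine users wrongly rejected
        far = np.mean(impostor >= t)  # impostors wrongly accepted
        gap = abs(far - frr)
        if gap < best_gap:
            best_gap, eer = gap, (far + frr) / 2.0
    return eer

# Example with synthetic score distributions (illustrative only).
rng = np.random.default_rng(0)
genuine = rng.normal(0.8, 0.1, 5000)
impostor = rng.normal(0.4, 0.1, 5000)
print(f"approximate EER: {approximate_eer(genuine, impostor):.4f}")
```

    The coarser the threshold grid, the cheaper the computation but the looser the approximation, which is the trade-off any fast EER estimate has to manage.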

    SuperIdentity: fusion of identity across real and cyber domains

    No full text
    Under both benign and malign circumstances, people now manage a spectrum of identities across both real-world and cyber domains. Our belief, however, is that for any individual all these instances ultimately track back to a single ‘SuperIdentity’. This paper outlines the assumptions underpinning the SuperIdentity Project, describing the innovative use of data fusion to incorporate novel real-world and cyber cues into a rich framework appropriate for modern identity. The proposed combinatorial model will support a robust identification or authentication decision, with confidence indexed both by the level of trust in data provenance and by the diagnosticity of the identity factors being used. Additionally, the exploration of correlations between factors may underpin the more intelligent use of identity information, so that known information can be used to predict previously hidden information. With modern living supporting the ‘distribution of identity’ across real and cyber domains, and with criminal elements operating in increasingly sophisticated ways in the hinterland between the two, this approach is suggested as a way forward and is discussed in terms of its impact on privacy, security, and the detection of threats.
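    The abstract does not specify the combinatorial model itself, so the following is only a toy illustration of confidence-weighted fusion: each hypothetical cue carries a match score together with provenance (trust in the data source) and diagnosticity weights, and the fused confidence is their weighted combination.

```python
from dataclasses import dataclass

@dataclass
class IdentityCue:
    name: str             # e.g. "face match", "typing rhythm" (illustrative)
    match_score: float    # 0..1 similarity between observed cue and claimed identity
    provenance: float     # 0..1 trust in where the data came from
    diagnosticity: float  # 0..1 how strongly this cue discriminates identities

def fused_confidence(cues):
    """Toy fusion: each cue's contribution is weighted by provenance and
    diagnosticity; the result is a confidence in the identity claim."""
    weights = [c.provenance * c.diagnosticity for c in cues]
    total = sum(weights)
    if total == 0:
        return 0.0
    return sum(w * c.match_score for w, c in zip(weights, cues)) / total

cues = [
    IdentityCue("face match", 0.92, provenance=0.9, diagnosticity=0.8),
    IdentityCue("typing rhythm", 0.70, provenance=0.6, diagnosticity=0.4),
    IdentityCue("email alias link", 0.85, provenance=0.5, diagnosticity=0.6),
]
print(f"fused identity confidence: {fused_confidence(cues):.2f}")
```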

    Systematic Review on Security and Privacy Requirements in Edge Computing: State of the Art and Future Research Opportunities

    Get PDF
    Edge computing is a promising paradigm that enhances the capabilities of cloud computing. For users to keep adopting these computing services, the environment must be kept free from security and privacy breaches. The security and privacy issues associated with the edge computing environment have limited the overall acceptance of the technology as a reliable paradigm. Many researchers have reviewed security and privacy issues in edge computing, but not all have fully investigated the security and privacy requirements. Security and privacy requirements are the objectives that specify the capabilities and functions a system must perform to eliminate certain security and privacy vulnerabilities. This paper substantially reviews the security and privacy requirements of edge computing and the technological methods employed by the techniques used to curb the threats, with the aim of helping future researchers identify research opportunities. It investigates current studies and highlights the following: (1) the classification of security and privacy requirements in edge computing, (2) the state-of-the-art techniques deployed to curb security and privacy threats, (3) the trends in technological methods employed by these techniques, (4) the metrics used to evaluate the performance of the techniques, (5) a taxonomy of attacks affecting the edge network and the corresponding technological trends for mitigating them, and (6) research opportunities for future researchers in the area of edge computing security and privacy.

    A Framework for Biometric and Interaction Performance Assessment of Automated Border Control Processes

    Get PDF
    Automated Border Control (ABC) in airports and land crossings utilizes automated technology to verify passenger identity claims. Accuracy, interaction stability, the handling of user error, and a harmonized approach to implementation are all required. Two models proposed in this paper establish a global path through ABC processes. The first, the generic model, maps separately the enrolment and verification phases of an ABC scenario. This allows a standardization of the process and an exploration of variances and similarities between configurations across implementations. The second, the identity claim process, decomposes the verification phase of the generic model, giving an enhanced resolution of ABC implementations. Harnessing a human-biometric sensor interaction framework allows the identification and quantification of errors within the system's use, attributing these errors to either system performance or human interaction. Data from a live operational scenario are used to analyze behaviors, which aids in establishing what effect these have on system performance. Utilizing the proposed method will complement already established methods in improving the performance assessment of a system. Through analyzing interactions and possible behavioral scenarios from the live trial, it was observed that 30.96% of interactions included some major user error. Future development using our proposed framework will see technological advances for biometric systems that are able to categorize interaction errors and provide feedback appropriately.
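    As a rough illustration only, not the human-biometric sensor interaction framework itself, the sketch below attributes hypothetical interaction-log errors to either the user or the system and computes the share of interactions that contain a user error, analogous in spirit to the 30.96% figure reported. All error labels and log entries are invented.

```python
from collections import Counter

# Hypothetical interaction log: each ABC gate transaction is tagged with the
# error (if any) observed during the interaction.
interactions = [
    {"id": 1, "error": None},
    {"id": 2, "error": "document_misplaced"},  # user error (assumed label)
    {"id": 3, "error": "camera_timeout"},      # system error (assumed label)
    {"id": 4, "error": "looked_away"},         # user error (assumed label)
    {"id": 5, "error": None},
]

USER_ERRORS = {"document_misplaced", "looked_away", "moved_too_early"}
SYSTEM_ERRORS = {"camera_timeout", "matcher_failure"}

def attribute(error):
    """Attribute an error label to the user, the system, or neither."""
    if error is None:
        return "clean"
    if error in USER_ERRORS:
        return "user"
    if error in SYSTEM_ERRORS:
        return "system"
    return "unknown"

counts = Counter(attribute(i["error"]) for i in interactions)
user_error_rate = counts["user"] / len(interactions)
print(counts, f"user-error share: {user_error_rate:.2%}")
```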

    Quality assessment technique for ubiquitous software and middleware

    Get PDF
    Ubiquitous computing is the new paradigm for computing and information systems. The technology-oriented issues of ubiquitous computing systems have led researchers to concentrate on feasibility studies of the technologies rather than on building quality assurance indices or guidelines. In this context, measuring quality is the key to developing high-quality ubiquitous computing products. For this reason, various quality models have been defined, adopted and enhanced over the years; for example, the recognised standard quality model (ISO/IEC 9126) is the result of a consensus on a software quality model with three levels: characteristics, sub-characteristics, and metrics. However, this scheme is unlikely to be directly applicable to ubiquitous computing environments, which differ considerably from conventional software; this raises the concern that existing methods must be reformulated and, in particular, that new assessment techniques must be elaborated for ubiquitous computing environments. This paper selects appropriate quality characteristics for the ubiquitous computing environment, which can be used as quality targets for both the evaluation and development processes of ubiquitous computing products. Further, each of the quality characteristics has been expanded with evaluation questions and metrics, in some cases with measures. In addition, this quality model has been applied in an industrial ubiquitous computing setting. This application revealed that, while the approach was sound, some parts need further development in the future.
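    The characteristics, sub-characteristics, and weights below are invented for illustration; the sketch only shows how a three-level quality model in the ISO/IEC 9126 style (characteristics, sub-characteristics, metrics) can be represented and aggregated into a single score.

```python
# Illustrative three-level quality model: characteristics contain
# sub-characteristics, each scored by one or more metric values in [0, 1].
quality_model = {
    "reliability": {
        "weight": 0.4,
        "sub": {
            "fault tolerance": {"weight": 0.6, "metric_scores": [0.8, 0.7]},
            "recoverability":  {"weight": 0.4, "metric_scores": [0.9]},
        },
    },
    "usability": {
        "weight": 0.6,
        "sub": {
            "context awareness": {"weight": 1.0, "metric_scores": [0.6, 0.75]},
        },
    },
}

def overall_score(model):
    """Aggregate metric scores up through sub-characteristics and
    characteristics using the given weights."""
    total = 0.0
    for characteristic in model.values():
        char_score = 0.0
        for sub in characteristic["sub"].values():
            avg = sum(sub["metric_scores"]) / len(sub["metric_scores"])
            char_score += sub["weight"] * avg
        total += characteristic["weight"] * char_score
    return total

print(f"overall quality score: {overall_score(quality_model):.3f}")
```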

    The future of Cybersecurity in Italy: Strategic focus area

    Get PDF
    This volume has been created as a continuation of the previous one, with the aim of outlining a set of focus areas and actions that the Italian national research community considers essential. The book touches on many aspects of cyber security, ranging from the definition of the infrastructure and controls needed to organize cyber defence to the actions and technologies to be developed for better protection, and from the identification of the main technologies to be defended to the proposal of a set of horizontal actions for training, awareness raising, and risk management.

    A Survey on Modality Characteristics, Performance Evaluation Metrics, and Security for Traditional and Wearable Biometric Systems

    Get PDF
    Biometric research is directed increasingly towards Wearable Biometric Systems (WBS) for user authentication and identification. However, before engaging in WBS research, it is necessary to understand how their operational dynamics and design considerations differ from those of Traditional Biometric Systems (TBS). While the current literature is cognizant of these differences, no existing work summarizes the factors in which TBS and WBS differ, namely their modality characteristics, performance, security, and privacy. To bridge this gap, this paper reviews and compares the key characteristics of modalities, contrasts the metrics used to evaluate system performance, and highlights the divergence in critical vulnerabilities, attacks, and defenses for TBS and WBS. It further discusses how these factors affect the design considerations for WBS, as well as the open challenges and future directions of research in these areas. In doing so, the paper provides a big-picture overview of the important challenges and potential solutions that researchers entering the field should be aware of. Hence, this survey aims to be a starting point for researchers in comprehending the fundamental differences between TBS and WBS before understanding the core challenges associated with WBS and its design.