
    Feature Level Fusion of Face and Fingerprint Biometrics

    The aim of this paper is to study fusion at the feature-extraction level for face and fingerprint biometrics. The proposed approach fuses the two traits by extracting independent feature point sets from the two modalities and making the two point sets compatible for concatenation. Moreover, to handle the curse of dimensionality, the feature point sets are reduced in dimension. Different feature-reduction techniques are applied, both before and after fusion of the feature point sets, and the results are recorded. The fused feature point sets for the database and the query face and fingerprint images are matched using techniques based on either point-pattern matching or Delaunay triangulation. Comparative experiments are conducted on chimeric and real databases to assess the actual advantage of fusion performed at the feature-extraction level over fusion at the matching-score level. Comment: 6 pages, 7 figures, conference
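    A minimal sketch of the fuse-then-reduce idea (all data is synthetic, and PCA stands in for the paper's unspecified reduction techniques): normalize each modality's feature set, concatenate, then reduce the fused vector's dimension.

    ```python
    import numpy as np
    from sklearn.decomposition import PCA

    rng = np.random.default_rng(0)

    # Hypothetical feature point sets: 100 samples, different dimensionalities.
    face_feats = rng.normal(size=(100, 64))
    finger_feats = rng.normal(size=(100, 48))

    # Normalize each modality so the concatenation is not dominated by scale.
    def zscore(x):
        return (x - x.mean(axis=0)) / x.std(axis=0)

    # Feature-level fusion: concatenate the compatible point sets.
    fused = np.hstack([zscore(face_feats), zscore(finger_feats)])

    # Reduce dimension after fusion to mitigate the curse of dimensionality.
    reduced = PCA(n_components=20).fit_transform(fused)
    print(fused.shape, reduced.shape)  # (100, 112) (100, 20)
    ```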

    The College Admissions Problem Under Uncertainty

    We consider a college admissions problem with uncertainty. We realistically assume that (i) students' college application choices are nontrivial because applications are costly, (ii) college rankings of students are noisy and thus uncertain at the time of application, and (iii) matching between colleges and students takes place in a decentralized setting. We analyze a general-equilibrium model in which two ranked colleges set admissions standards for student quality signals, and students, knowing their types, decide where to apply. We show that the optimal student application portfolio need not be monotone in types, and we construct a robust example to show that this can lead to a failure of assortative matching in equilibrium. More importantly, we prove that a unique equilibrium with assortative matching exists provided application costs are small and the lower-ranked college has sufficiently high capacity. We also provide equilibrium comparative statics with respect to college capacities and application costs. We apply the model to the question of race-based admissions policies. Keywords: matching, directed search, noise
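    The portfolio-choice step can be illustrated with a toy decision problem (the payoffs, cutoffs, cost, and noise level below are made-up numbers, not the paper's model): each student compares the expected value of every application set, given noisy admission signals and a per-application cost.

    ```python
    import math

    # Illustrative parameters (not the paper's calibration): college payoffs,
    # admission cutoffs on the noisy signal, application cost, signal noise.
    PAYOFF = {1: 2.0, 2: 1.0}      # college 1 is the higher-ranked college
    THRESHOLD = {1: 1.5, 2: 0.5}
    COST, NOISE = 0.2, 1.0

    def admit_prob(college, student_type):
        # Signal = type + N(0, NOISE); admitted if the signal clears the cutoff.
        z = (student_type - THRESHOLD[college]) / NOISE
        return 0.5 * (1.0 + math.erf(z / math.sqrt(2.0)))

    def portfolio_value(portfolio, student_type):
        # The student attends the best (highest-payoff) college that admits them.
        value, p_rejected_so_far = 0.0, 1.0
        for c in sorted(portfolio, key=PAYOFF.get, reverse=True):
            p = admit_prob(c, student_type)
            value += p_rejected_so_far * p * PAYOFF[c]
            p_rejected_so_far *= 1.0 - p
        return value - COST * len(portfolio)

    def best_portfolio(student_type):
        options = [(), (1,), (2,), (1, 2)]
        return max(options, key=lambda s: portfolio_value(s, student_type))

    # Low types apply only to the lower-ranked college; higher types add college 1.
    for t in (0.0, 1.0, 2.0):
        print(t, best_portfolio(t))
    ```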

    Price formation in a sequential selling mechanism

    This paper analyzes the trade of an indivisible good within a two-stage mechanism, where a seller first negotiates with one potential buyer over the price of the good. If the negotiation fails to produce a sale, a second-price sealed-bid auction with an additional buyer is conducted. The theoretical model predicts that with risk-neutral agents all sales take place in the auction, rendering the negotiation prior to the auction obsolete. An experimental test of the model provides evidence that average prices and profits are predicted quite precisely by the theoretical benchmark. However, a significant share of sales already occurs during the negotiation stage. We show that risk preferences can theoretically account for the existence of sales during the negotiation stage and improve the fit for buyers' behavior, but are not sufficient to explain sellers' decisions. We discuss other behavioral explanations that could account for the observed deviations.
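    The risk-neutral benchmark can be sketched with a small simulation (uniform private values on [0, 1] are an illustrative assumption, not the paper's parametrization): with a second-price auction and two bidders, revenue equals the lower of the two values, so a risk-neutral seller rejects any negotiated price below that expectation.

    ```python
    import random

    random.seed(1)

    # Two buyers with private values drawn uniformly on [0, 1]; the auction is
    # second-price, so with two bidders revenue equals the lower value.
    def auction_revenue():
        return min(random.random(), random.random())

    n = 200_000
    expected_auction = sum(auction_revenue() for _ in range(n)) / n

    # A risk-neutral seller accepts a negotiated price only if it exceeds this
    # expectation, and a risk-neutral buyer never offers that much, so in theory
    # every sale moves to the auction stage.
    print(round(expected_auction, 3))  # ≈ 1/3, the E[min] of two uniforms
    ```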

    Losing Face

    When person A takes an action that can be interpreted as "making an offer" to person B and B "rejects the offer," then A may "lose face." This loss of face (LoF) and consequent disutility will occur only if these actions are common knowledge to A and B. While under some circumstances this LoF can be rationalized by the consequences for future reputation, we claim it also enters directly into the utility function. LoF concerns can lead to fewer offers and inefficiency in markets that involve matching, discrete transactions, and offers/proposals in both directions. This pertains to the marriage market, certain types of labor markets, admissions to colleges and universities, and certain types of joint ventures and collaborations. We offer a simple model of this, and show that under some circumstances welfare can be improved by a mechanism that only reveals offers when both parties say "yes."

    Losing Face

    When person A makes an offer to person B and B rejects it, then A may "lose face". This loss of face is assumed to occur only if B knows for sure of A's offer. While under some circumstances loss of face can be rationalized by the consequences for future reputation, it may also enter directly into the utility function. Loss-of-face concerns can lead to fewer offers and inefficiency in markets that involve matching, discrete transactions, and offers/proposals in both directions, such as the marriage market, certain types of labor markets, admissions to colleges and universities, and joint ventures and collaborations. We offer a simple model of this, and show that under some circumstances welfare can be improved by a mechanism that only reveals offers when both parties say "yes". Keywords: matching, marriage markets, anonymity, reputation, adverse selection, Bayesian games, emotions
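    The welfare logic of the both-parties-say-"yes" mechanism can be sketched in a few lines (GAIN, LOSS_OF_FACE, and P_YES are made-up illustrative numbers, not the papers' model): hiding rejected offers removes the loss-of-face term from A's expected utility, so A participates more often.

    ```python
    # Illustrative parameters (not from the paper): A's gain from a match, A's
    # loss-of-face disutility if an open offer is rejected, and the probability
    # that B accepts.
    GAIN, LOSS_OF_FACE, P_YES = 1.0, 0.8, 0.5

    def open_offer_value():
        # Making an offer openly: a rejection becomes common knowledge,
        # so A suffers the loss-of-face disutility.
        return P_YES * GAIN - (1.0 - P_YES) * LOSS_OF_FACE

    def mechanism_value():
        # Under the proposed mechanism, an offer is revealed only when both
        # parties say "yes", so a rejected offer stays hidden and costs nothing.
        return P_YES * GAIN

    print(open_offer_value(), mechanism_value())
    ```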

    Human Retina Based Identification System Using Gabor Filters and GDA Technique

    A biometric authentication system provides automatic person authentication based on characteristic features possessed by the individual. Among biometrics, the human retina is a secure and reliable source of person recognition: it is unique and universal, and, since it lies at the back of the eyeball, it is unforgeable. The authentication process mainly comprises pre-processing, feature extraction, and then feature matching and classification. Authentication systems operate in verification or identification mode according to the specific application. In this paper, the pre-processing and image-enhancement stages involve several steps to highlight interesting features in retinal images. The feature-extraction stage is accomplished using a bank of Gabor filters with a number of orientations and scales. The Generalized Discriminant Analysis (GDA) technique is used to reduce the size of the feature vectors and enhance the performance of the proposed algorithm. Finally, classification is accomplished using a k-nearest-neighbor (KNN) classifier to determine the identity of the genuine user or reject a forged one, as the proposed method operates in identification mode. The main contribution of this paper is the use of GDA to address the 'curse of dimensionality' problem; GDA is a novel method in the area of retina recognition.
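    The reduce-then-classify stage can be sketched as follows. The features here are synthetic stand-ins for Gabor-bank responses, and plain LDA stands in for GDA (which is a kernelized discriminant analysis and is not in scikit-learn); the structure, supervised dimensionality reduction followed by nearest-neighbor identification, is the same.

    ```python
    import numpy as np
    from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
    from sklearn.model_selection import train_test_split
    from sklearn.neighbors import KNeighborsClassifier

    rng = np.random.default_rng(0)

    # Stand-in features: pretend each row holds Gabor-bank responses
    # (e.g. 8 orientations x 5 scales, two statistics per filter -> 80 values)
    # for 10 subjects with 12 retinal images each.
    n_subjects, n_per_subject, n_features = 10, 12, 80
    labels = np.repeat(np.arange(n_subjects), n_per_subject)
    centers = rng.normal(size=(n_subjects, n_features))
    X = centers[labels] + 0.3 * rng.normal(size=(labels.size, n_features))

    X_tr, X_te, y_tr, y_te = train_test_split(
        X, labels, test_size=0.25, stratify=labels, random_state=0)

    # Supervised dimensionality reduction (LDA standing in for GDA).
    lda = LinearDiscriminantAnalysis(n_components=n_subjects - 1).fit(X_tr, y_tr)

    # 1-NN identification in the reduced space.
    knn = KNeighborsClassifier(n_neighbors=1).fit(lda.transform(X_tr), y_tr)
    accuracy = knn.score(lda.transform(X_te), y_te)
    print(accuracy)
    ```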

    Inferring the photometric and size evolution of galaxies from image simulations

    Current constraints on models of galaxy evolution rely on morphometric catalogs extracted from multi-band photometric surveys. However, these catalogs are altered by selection effects that are difficult to model, that correlate in nontrivial ways, and that can lead to contradictory predictions if not taken into account carefully. To address this issue, we have developed a new approach combining parametric Bayesian indirect likelihood (pBIL) techniques and empirical modeling with realistic image simulations that reproduce a large fraction of these selection effects. This allows us to perform a direct comparison between observed and simulated images and to infer robust constraints on model parameters. We use a semi-empirical forward model to generate a distribution of mock galaxies from a set of physical parameters. These galaxies are passed through an image simulator reproducing the instrumental characteristics of any survey and are then extracted in the same way as the observed data. The discrepancy between the simulated and observed data is quantified and minimized with a custom sampling process based on adaptive Markov chain Monte Carlo methods. Using synthetic data matching most of the properties of a CFHTLS Deep field, we demonstrate the robustness and internal consistency of our approach by inferring the parameters governing the size and luminosity functions and their evolution for different realistic populations of galaxies. We also compare the results of our approach with those obtained from the classical spectral-energy-distribution fitting and photometric-redshift approach. Our pipeline efficiently infers the luminosity and size distribution and evolution parameters with a very limited number of observables (3 photometric bands). When compared to SED fitting based on the same set of observables, our method yields results that are more accurate and free from systematic biases. Comment: 24 pages, 12 figures, accepted for publication in A&A
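    The compare-summaries-and-sample loop can be sketched on a toy problem (a one-parameter Gaussian forward model and a squared-histogram discrepancy are crude stand-ins for the image simulator and the pBIL auxiliary likelihood; plain Metropolis stands in for the adaptive MCMC):

    ```python
    import numpy as np

    rng = np.random.default_rng(0)

    # Toy forward model: one parameter mu (say, a mean log-size), with the
    # "observation" reduced to a histogram summary of the mock data.
    def simulate(mu, n=2000):
        return rng.normal(mu, 1.0, n)

    def summary(x):
        return np.histogram(x, bins=20, range=(-4.0, 6.0), density=True)[0]

    obs = summary(simulate(1.0))  # synthetic observed data, true mu = 1

    def log_like(mu):
        # Discrepancy between averaged simulated summaries and the observed one.
        sims = np.mean([summary(simulate(mu)) for _ in range(5)], axis=0)
        return -2000.0 * float(np.sum((sims - obs) ** 2))

    # Plain Metropolis sampling over mu.
    mu, ll, chain = 0.0, log_like(0.0), []
    for _ in range(2000):
        cand = mu + 0.1 * rng.normal()
        ll_cand = log_like(cand)
        if np.log(rng.random()) < ll_cand - ll:
            mu, ll = cand, ll_cand
        chain.append(mu)

    posterior_mean = float(np.mean(chain[500:]))
    print(posterior_mean)  # should settle near the true value of 1
    ```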

    Informed Proposal Monte Carlo

    Any search or sampling algorithm for the solution of inverse problems needs guidance to be efficient. Many algorithms collect and apply information about the problem on the fly, and much improvement has been made in this way. However, as a consequence of the No-Free-Lunch Theorem, the only way we can ensure significantly better performance of search and sampling algorithms is to build in as much information about the problem as possible. In the special case of Markov chain Monte Carlo (MCMC) sampling, we review how this is done through the choice of proposal distribution, and we show how this way of adding information about the problem can be made particularly efficient when based on an approximate physics model of the problem. A highly nonlinear inverse scattering problem with a high-dimensional model space serves as an illustration of the gain in efficiency through this approach.
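    The idea of an informed proposal can be sketched on a one-dimensional toy inverse problem (the forward models and noise level below are illustrative, not the paper's scattering setup): a cheap approximate physics model yields a Gaussian approximation of the posterior, which serves as an independence proposal in Metropolis-Hastings.

    ```python
    import math
    import random

    random.seed(0)

    # Toy nonlinear inverse problem: data d = g(m) + noise, with the "expensive"
    # exact forward model g(m) = m + 0.2*sin(3m).
    D_OBS, SIGMA = 1.0, 0.3

    def log_post(m):
        g = m + 0.2 * math.sin(3.0 * m)  # exact forward model
        return -0.5 * ((D_OBS - g) / SIGMA) ** 2

    # An approximate physics model g_a(m) = m gives a cheap Gaussian posterior
    # approximation N(D_OBS, SIGMA): the informed independence proposal.
    def log_q(m):
        return -0.5 * ((m - D_OBS) / SIGMA) ** 2

    m, accepted, n_steps = D_OBS, 0, 5000
    for _ in range(n_steps):
        cand = random.gauss(D_OBS, SIGMA)  # draw from the informed proposal
        log_alpha = (log_post(cand) - log_post(m)) + (log_q(m) - log_q(cand))
        if math.log(random.random()) < log_alpha:
            m, accepted = cand, accepted + 1

    acceptance_rate = accepted / n_steps
    print(acceptance_rate)  # high when the proposal approximates the posterior well
    ```

    A naive random-walk proposal on the same problem would need its step size tuned and would accept far fewer informative moves; the gain grows with the quality of the approximate model.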