
    DeepMasterPrints: Generating MasterPrints for Dictionary Attacks via Latent Variable Evolution

    Recent research has demonstrated the vulnerability of fingerprint recognition systems to dictionary attacks based on MasterPrints. MasterPrints are real or synthetic fingerprints that fortuitously match a large number of other fingerprints, thereby undermining the security afforded by fingerprint systems. Previous work by Roy et al. generated synthetic MasterPrints at the feature level. In this work we generate complete image-level MasterPrints, called DeepMasterPrints, whose attack accuracy is found to be much superior to that of previous methods. The proposed method, referred to as Latent Variable Evolution, is based on training a Generative Adversarial Network on a set of real fingerprint images. Stochastic search in the form of the Covariance Matrix Adaptation Evolution Strategy is then used to find latent input variables to the generator network that maximize the number of impostor matches, as assessed by a fingerprint recognizer. Experiments convey the efficacy of the proposed method in generating DeepMasterPrints. The underlying method is likely to have broad applications in fingerprint security as well as fingerprint synthesis.
    Comment: 8 pages; added new verification systems and diagrams. Accepted to the conference Biometrics: Theory, Applications, and Systems 2018
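
    A minimal sketch of the latent-variable search described above, assuming the `cma` package and hypothetical stand-ins `generate` and `match_score` for the trained GAN generator and the fingerprint matcher; the latent dimension and match threshold are also assumptions, not values from the paper.

```python
# Latent Variable Evolution sketch: CMA-ES over GAN latent vectors.
import numpy as np
import cma  # pip install cma

LATENT_DIM = 100   # assumed latent dimensionality of the generator
THRESHOLD = 0.5    # assumed matcher decision threshold

def generate(z):
    """Hypothetical GAN generator: latent vector -> fingerprint image."""
    raise NotImplementedError

def match_score(image, subject):
    """Hypothetical matcher: similarity of image to an enrolled subject."""
    raise NotImplementedError

def impostor_matches(z, subjects):
    """Fitness: number of enrolled subjects the synthetic print matches."""
    image = generate(z)
    return sum(match_score(image, s) >= THRESHOLD for s in subjects)

def evolve_masterprint(subjects, sigma0=0.5, max_iter=200):
    es = cma.CMAEvolutionStrategy(np.zeros(LATENT_DIM), sigma0)
    for _ in range(max_iter):
        if es.stop():
            break
        candidates = es.ask()
        # CMA-ES minimizes, so negate the match count to maximize it.
        es.tell(candidates, [-impostor_matches(z, subjects) for z in candidates])
    return generate(es.result.xbest)
```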

    Assessing systemic risk due to fire sales spillover through maximum entropy network reconstruction

    Assessing systemic risk in financial markets is of great importance, but it often requires data that are unavailable or available only at very low frequency. For this reason, systemic risk assessment with partial information is potentially very useful for regulators and other stakeholders. In this paper we consider systemic risk due to fire sales spillover and portfolio rebalancing, using the risk metrics defined by Greenwood et al. (2015). Using the Maximum Entropy principle, we propose a method to assess aggregated and single-bank systemicness and vulnerability, and to statistically test for a change in these variables, when only the size of each bank and the capitalization of the investment assets are available. We demonstrate the effectiveness of our method on 2001-2013 quarterly data of US banks for which portfolio composition is available.
    Comment: 36 pages, 6 figures. Accepted in the Journal of Economic Dynamics and Control
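
    A minimal sketch of the maximum-entropy reconstruction step under the stated data constraints: when only bank sizes and asset capitalizations (the matrix marginals) are known, the maximum-entropy estimate of the holdings matrix is the normalized outer product of the marginals. Variable names are illustrative, not from the paper.

```python
# Maximum-entropy reconstruction of a bank-asset holdings matrix.
import numpy as np

def max_entropy_matrix(bank_sizes, asset_caps):
    """Maximum-entropy estimate of the bank-asset holdings matrix.

    With only the marginals constrained, the max-entropy solution is
        W_ij = r_i * c_j / W_total
    which reproduces both the row sums (bank sizes) and the column
    sums (asset capitalizations).
    """
    r = np.asarray(bank_sizes, dtype=float)
    c = np.asarray(asset_caps, dtype=float)
    total = r.sum()
    assert np.isclose(total, c.sum()), "marginals must share the same total"
    return np.outer(r, c) / total

# Example with 3 banks and 2 asset classes (illustrative numbers)
W = max_entropy_matrix([100.0, 60.0, 40.0], [120.0, 80.0])
print(W.sum(axis=1))  # recovers the bank sizes [100., 60., 40.]
```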

    A Hybrid Intelligent Early Warning System for Predicting Economic Crises: The Case of China

    This paper combines artificial neural networks (ANN), fuzzy optimization and time-series econometric models in one unified framework to form a hybrid intelligent early warning system (EWS) for predicting economic crises. Using quarterly data on 12 macroeconomic and financial variables for the Chinese economy between 1999 and 2008, the paper finds that the hybrid model possesses strong predictive power and that the likelihood of economic crises in China during 2009 and 2010 remains high.
    Keywords: computational intelligence; artificial neural networks; fuzzy optimization; early warning system; economic crises
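
    A minimal sketch of the ANN component of such a hybrid EWS, using scikit-learn and synthetic data in place of the paper's 12 quarterly indicators; the fuzzy-optimization and econometric stages of the framework are not reproduced here.

```python
# ANN crisis classifier on quarterly macro/financial indicators (synthetic).
import numpy as np
from sklearn.neural_network import MLPClassifier
from sklearn.preprocessing import StandardScaler
from sklearn.pipeline import make_pipeline

rng = np.random.default_rng(0)
X = rng.normal(size=(40, 12))             # 40 quarters x 12 indicators (synthetic)
y = (X[:, 0] + X[:, 3] > 1).astype(int)   # synthetic crisis label

model = make_pipeline(
    StandardScaler(),
    MLPClassifier(hidden_layer_sizes=(8,), max_iter=2000, random_state=0),
)
model.fit(X, y)

# Probability of a crisis signal for the latest quarter
print(model.predict_proba(X[-1:])[0, 1])
```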

    On the simulation of the seismic energy transmission mechanisms

    In recent years, considerable attention has been paid to the research and development of methods able to assess the propagation of seismic energy across a territory. Seismic energy propagation is strongly related to the complexity of the source, and it is affected by attenuation and scattering effects along the path. Thus, the effect of an earthquake is the result of a complex interaction between the signal emitted by the source and the propagation effects. The purpose of this work is to develop a methodology able to reproduce the propagation law of seismic energy, hypothesizing the "transmission" mechanisms that govern the distribution of seismic effects across the territory, by means of a structural optimization process with a predetermined energy distribution. Briefly, the approach, based on a deterministic physical model, determines an objective correction of the detected distributions of seismic intensity on the soil, forcing the compatibility of the observed data with the physical-mechanical model. It rests on two hypotheses: (1) the earthquake at the epicentre is simulated by means of a system of distortions split into three parameters; (2) the intensity is considered coincident with the density of elastic energy. The optimal distribution of the beams' stiffness is achieved by reducing the difference between the intensity values computed on the mesh and those observed during four historically reported regional events in the Campania region (Italy).
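
    A minimal sketch of the optimization loop implied above: tune a stiffness distribution so that model-predicted intensities match the observed ones. The forward model `predict_intensity` is a hypothetical toy stand-in for the paper's physical beam model, and all numbers are illustrative.

```python
# Fit a stiffness distribution to observed seismic intensities.
import numpy as np
from scipy.optimize import minimize

observed = np.array([9.0, 7.5, 6.0, 4.5])  # illustrative site intensities

def predict_intensity(stiffness):
    """Hypothetical forward model: stiffness distribution -> site intensities.
    A toy linear map stands in for the physical beam model."""
    A = np.array([[1.0, 0.2, 0.1],
                  [0.5, 0.8, 0.2],
                  [0.2, 0.6, 0.5],
                  [0.1, 0.3, 0.9]])
    return A @ stiffness

def misfit(stiffness):
    # Squared difference between computed and observed intensity distributions
    return np.sum((predict_intensity(stiffness) - observed) ** 2)

result = minimize(misfit, x0=np.ones(3), bounds=[(0.0, None)] * 3)
print(result.x)  # optimal stiffness distribution under the toy model
```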

    Pseudo-Separation for Assessment of Structural Vulnerability of a Network

    Based upon the idea that network functionality is impaired if two nodes in a network are sufficiently separated in terms of a given metric, we introduce two combinatorial "pseudocut" problems generalizing the classical min-cut and multi-cut problems. We expect the pseudocut problems will find broad relevance to the study of network reliability. We comprehensively analyze the computational complexity of the pseudocut problems and provide three approximation algorithms for these problems. Motivated by applications in communication networks with strict Quality-of-Service (QoS) requirements, we demonstrate the utility of the pseudocut problems by proposing a targeted vulnerability assessment for the structure of communication networks using QoS metrics; we perform experimental evaluations of our proposed approximation algorithms in this context.
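
    For contrast, a minimal sketch of the classical s-t min-cut that the pseudocut problems generalize, using networkx on an illustrative graph; a pseudocut would only require pushing the s-t distance past a QoS threshold rather than disconnecting the nodes outright.

```python
# Classical s-t min-cut baseline on a small illustrative graph.
import networkx as nx

G = nx.Graph()
G.add_edges_from([(0, 1), (1, 2), (2, 3), (0, 2), (1, 3)])

# Smallest edge set whose removal disconnects node 0 from node 3
cut_edges = nx.minimum_edge_cut(G, s=0, t=3)
H = G.copy()
H.remove_edges_from(cut_edges)
assert not nx.has_path(H, 0, 3)

# A pseudocut variant would instead require, e.g.,
# nx.shortest_path_length(H, 0, 3) > D for some QoS distance bound D.
print(cut_edges)
```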

    Some Remarks about the Complexity of Epidemics Management

    Recent outbreaks of Ebola, H1N1 and other infectious diseases have shown that the assumptions underlying the established theory of epidemics management are too idealistic. To improve the procedures and organizations involved in fighting epidemics, extended models of epidemics management are required. The necessary extensions consist of a representation of the management loop and of the potential frictions influencing that loop. The effects of the non-deterministic frictions can be taken into account by including measures of robustness and risk in the assessment of management options. Thus, besides the increased structural complexity resulting from the model extensions, the computational complexity of the task of epidemics management - interpreted as an optimization problem - is increased as well. This is a serious obstacle to analyzing the model and may require additional pre-processing to simplify the analysis. The paper closes with an outlook discussing some forthcoming problems.
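
    A minimal sketch of assessing management options under non-deterministic frictions, in the spirit of the extended model: each option is simulated across sampled friction scenarios and scored by expected cost, worst case (robustness), and dispersion (risk). The simulation function and friction model are hypothetical.

```python
# Score epidemic-management options across sampled friction scenarios.
import numpy as np

rng = np.random.default_rng(1)

def simulate(option, friction):
    """Hypothetical management simulation: returns total cost under friction."""
    return option["base_cost"] * (1.0 + option["sensitivity"] * friction)

options = [
    {"name": "strict",  "base_cost": 10.0, "sensitivity": 0.2},
    {"name": "relaxed", "base_cost": 7.0,  "sensitivity": 0.9},
]

frictions = rng.exponential(scale=0.5, size=1000)  # assumed friction model

for opt in options:
    costs = np.array([simulate(opt, f) for f in frictions])
    print(opt["name"],
          "expected:", round(costs.mean(), 2),    # average outcome
          "worst-case:", round(costs.max(), 2),   # robustness measure
          "risk(std):", round(costs.std(), 2))    # risk measure
```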