
    A novel hand reconstruction approach and its application to vulnerability assessment

    This is the author’s version of a work that was accepted for publication in Information Sciences. Changes resulting from the publishing process, such as peer review, editing, corrections, structural formatting, and other quality control mechanisms, may not be reflected in this document. Changes may have been made to this work since it was submitted for publication. A definitive version was subsequently published in Information Sciences, 238 (2014), DOI: 10.1016/j.ins.2013.06.015.
    The present work proposes a novel probabilistic method to reconstruct a hand shape image from its template. We analyse the degree of similarity between the reconstructed images and the original samples in order to determine whether the synthetic hands are able to deceive hand recognition systems. This analysis is made through the estimation of the success chances of an attack carried out with the synthetic samples against an independent system. The experimental results show that there is a high chance of breaking a hand recognition system using this approach. Furthermore, since it is a probabilistic method, several synthetic images can be generated from each original sample, which increases the success chances of the attack.
    This work has been partially supported by projects Contexts (S2009/TIC-1485) from CAM, Bio-Challenge (TEC2009-11186), BIOSINT (TEC2012-38630-C04-02) and Bio-Shield (TEC2012-34881) from Spanish MINECO, TABULA RASA (FP7-ICT-257289) and BEAT (FP7-SEC-284989) from EU, and Cátedra UAM-Telefónica. Marta Gomez-Barrero is supported by an FPU Fellowship from Spanish MECD.
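
    The core of the evaluation described above is a two-step loop: generate several synthetic hand images per template, then estimate the probability that at least one of them is accepted by an independent verification system. The sketch below illustrates only that loop, under stated assumptions: the Gaussian perturbation in reconstruct_hand and the cosine matcher are placeholders, not the paper's probabilistic reconstruction or matching algorithm.

```python
import numpy as np

rng = np.random.default_rng(0)

def reconstruct_hand(template, n_samples=5, noise_scale=0.05):
    """Hypothetical stand-in for a probabilistic reconstruction: draw several
    plausible hand-geometry feature vectors around the stolen template."""
    return template + noise_scale * rng.standard_normal((n_samples, template.size))

def attack_success_rate(templates, match_score, threshold, n_samples=5):
    """Fraction of templates for which at least one synthetic sample is
    accepted by the attacked system (score above its verification threshold)."""
    hits = 0
    for t in templates:
        synthetic = reconstruct_hand(t, n_samples)
        if any(match_score(t, s) >= threshold for s in synthetic):
            hits += 1
    return hits / len(templates)

def cosine(a, b):
    """Toy matcher used only for illustration."""
    return float(a @ b / (np.linalg.norm(a) * np.linalg.norm(b)))

# Toy usage with random 9-dimensional hand-geometry features.
templates = rng.standard_normal((100, 9))
print(attack_success_rate(templates, cosine, threshold=0.95))
```

    Raising n_samples can only increase the per-template hit probability, which mirrors the abstract's point that generating multiple synthetic images from each original sample increases the success chances of the attack.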

    Seismic Risk Analysis of Revenue Losses, Gross Regional Product and Transportation Systems

    Natural threats like earthquakes, hurricanes or tsunamis have shown serious impacts on communities. In the past, major earthquakes in the United States like Loma Prieta 1989 and Northridge 1994, or recent events in Italy like the L’Aquila 2009 or Emilia 2012 earthquakes, emphasized the importance of preparedness and awareness to reduce social impacts. Earthquakes impacted businesses and dramatically reduced the gross regional product. Seismic hazard is traditionally assessed using Probabilistic Seismic Hazard Analysis (PSHA). PSHA represents the hazard well at a specific location, but it is unsatisfactory for spatially distributed systems. Scenario earthquakes overcome this problem by representing the actual distribution of shaking over a spatially distributed system. The performance of distributed productive systems during the recovery process needs to be explored. Scenario earthquakes have been used to assess the risk in bridge networks and the social losses in terms of gross regional product reduction.
    The proposed method for scenario earthquakes has been applied to a real case study: Treviso, a city in the North East of Italy. The method requires three models: a representation of the sources (Italian Seismogenic Zonation 9), an attenuation relationship (Sabetta and Pugliese 1996) and a model of the occurrence rate of magnitudes (Gutenberg-Richter). A methodology has been proposed to reduce thousands of scenarios to a subset consistent with the hazard at each location. Earthquake scenarios, along with the Monte Carlo method, have been used to simulate business damage. The response of business facilities to earthquakes has been obtained from fragility curves for precast industrial buildings. Furthermore, from business damage the reduction of productivity has been simulated using economic data from the national statistical service and a proposed piecewise “loss of functionality model”. To simulate the economic process in the time domain, an innovative business recovery function has been proposed.
    The proposed method has been applied to generate scenario earthquakes at the location of bridges and business areas. The proposed selection methodology has been applied to reduce 8000 scenarios to a subset of 60. Subsequently, these scenario earthquakes have been used to calculate three system performance parameters: the risk in transportation networks, the risk in terms of business damage and the losses of gross regional product. A novel model for the business recovery process has been tested. The proposed model has been used to represent the business recovery process and simulate the effects of government aid allocated for reconstruction.
    The proposed method has efficiently modeled the seismic hazard using scenario earthquakes. The scenario earthquakes presented have been used to assess possible consequences of earthquakes in seismic prone zones and to increase preparedness. Scenario earthquakes have also been used to simulate the effects on the economy of the impacted area; a significant gross regional product reduction has been shown, up to 77% for an earthquake with a 0.0003 probability of occurrence. The results showed that the limited funds available after the disaster can be distributed in a more efficient way.
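
    The pipeline described above chains a magnitude model (Gutenberg-Richter), an attenuation relationship and fragility curves inside a Monte Carlo loop. The sketch below shows that chain in minimal form; the coefficients, median capacity and dispersion are illustrative placeholders, not the Sabetta and Pugliese 1996 model or the thesis's calibrated fragility curves for precast industrial buildings.

```python
import numpy as np
from scipy.stats import norm

rng = np.random.default_rng(1)

def sample_gutenberg_richter(n, b=1.0, m_min=4.5, m_max=7.0):
    """Magnitudes from a doubly truncated Gutenberg-Richter law
    (exponential recurrence with rate beta = b * ln 10, truncated at m_max)."""
    beta = b * np.log(10.0)
    u = rng.uniform(size=n)
    return m_min - np.log(1.0 - u * (1.0 - np.exp(-beta * (m_max - m_min)))) / beta

def attenuation(magnitude, distance_km, c0=-1.5, c1=0.5, c2=-1.0, sigma=0.3):
    """Generic log-linear ground-motion model with lognormal scatter
    (placeholder coefficients, not the Sabetta and Pugliese 1996 values)."""
    log_sa = c0 + c1 * magnitude + c2 * np.log10(distance_km + 10.0)
    return 10.0 ** (log_sa + sigma * rng.standard_normal(np.shape(log_sa)))

def damage_probability(sa_g, median_g=0.4, beta=0.6):
    """Lognormal fragility curve; median capacity and dispersion are
    illustrative values for a precast industrial building."""
    return norm.cdf(np.log(sa_g / median_g) / beta)

# Monte Carlo over scenario earthquakes for one business area 20 km from the source.
magnitudes = sample_gutenberg_richter(8000)
sa = attenuation(magnitudes, distance_km=20.0)
print("mean damage probability:", damage_probability(sa).mean())
```

    The scenario-selection step (reducing the 8000 simulated events to a hazard-consistent subset of 60) and the business recovery function would sit on top of this loop; they are not sketched here.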

    Disaster Risk Management by Communities and Local Governments

    This study addresses disaster risk management at the local level. The topic was selected by the members of the Natural Disasters Network of the Regional Policy Dialogue, and was presented during its 3rd Meeting, on March 6 and 7, 2003. The goal of this document is to achieve a better knowledge of the best practices and benefits that disaster risk management represents for Latin America and the Caribbean. Included are comparative case studies of the Philippines, Colombia, Guatemala and Switzerland. Also discussed are strengths and weaknesses of local organizations in decentralized systems and financial services for disaster risk management.
    Keywords: Disasters, Financial Risk, Decentralization, Civil Society, Environment, disaster risk management

    Learning Fast and Slow: PROPEDEUTICA for Real-time Malware Detection

    In this paper, we introduce and evaluate PROPEDEUTICA, a novel methodology and framework for efficient and effective real-time malware detection, leveraging the best of conventional machine learning (ML) and deep learning (DL) algorithms. In PROPEDEUTICA, all software processes in the system start execution subjected to a conventional ML detector for fast classification. If a piece of software receives a borderline classification, it is subjected to further analysis via more computationally expensive and more accurate DL methods, via our newly proposed DL algorithm DEEPMALWARE. Further, we introduce delays to the execution of software subjected to deep learning analysis as a way to "buy time" for DL analysis and to rate-limit the impact of possible malware in the system. We evaluated PROPEDEUTICA with a set of 9,115 malware samples and 877 commonly used benign software samples from various categories for the Windows OS. Our results show that the false positive rate for conventional ML methods can reach 20%, while for modern DL methods it is usually below 6%. However, the classification time for DL can be 100x longer than for conventional ML methods. PROPEDEUTICA improved the detection F1-score from 77.54% (conventional ML method) to 90.25%, and reduced the detection time by 54.86%. Further, the percentage of software subjected to DL analysis was approximately 40% on average. Further, the application of delays in software subjected to ML reduced the detection time by approximately 10%. Finally, we found and discussed a discrepancy between the detection accuracy offline (analysis after all traces are collected) and on-the-fly (analysis in tandem with trace collection). Our insights show that conventional ML and modern DL-based malware detectors in isolation cannot meet the needs of efficient and effective malware detection: high accuracy, low false positive rate, and short classification time.
    Comment: 17 pages, 7 figures
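
    The key mechanism in this abstract is the triage step: a cheap classifier handles the clear-cut cases, and only borderline scores are escalated to the slower DL model. A minimal sketch of that routing logic follows; the thresholds, model interfaces and names are assumptions for illustration, not the PROPEDEUTICA implementation or the DEEPMALWARE architecture.

```python
from dataclasses import dataclass
from typing import Callable, List, Tuple

@dataclass
class TriageDetector:
    """Two-stage triage in the spirit of the abstract: a fast model scores every
    process; only borderline scores pay for the slower, more accurate model."""
    fast_model: Callable[[List[float]], float]   # cheap malware probability
    slow_model: Callable[[List[float]], float]   # expensive (DL) malware probability
    low: float = 0.2                             # below this: confidently benign
    high: float = 0.8                            # above this: confidently malicious

    def classify(self, features: List[float]) -> Tuple[str, bool]:
        """Return (label, escalated_to_dl)."""
        p = self.fast_model(features)
        if p < self.low:
            return "benign", False
        if p > self.high:
            return "malware", False
        # Borderline case: escalate, optionally delaying the process while the
        # (possibly 100x slower) deep model runs.
        return ("malware" if self.slow_model(features) >= 0.5 else "benign"), True

# Toy usage with stand-in models.
detector = TriageDetector(fast_model=lambda f: sum(f) / len(f),
                          slow_model=lambda f: 0.9)
print(detector.classify([0.4, 0.6]))   # borderline score 0.5 -> escalated, labeled malware
```

    Tightening or widening the (low, high) band trades classification latency against how much software is routed to the expensive model, which is the balance the abstract reports as roughly 40% of software reaching DL analysis.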

    Identifying causal gateways and mediators in complex spatio-temporal systems

    J.R. received support from the German National Academic Foundation (Studienstiftung), a Humboldt University Postdoctoral Fellowship, and the German Federal Ministry of Science and Education (Young Investigators Group CoSy-CC2, grant no. 01LN1306A). J.F.D. thanks the Stordalen Foundation and BMBF (project GLUES) for financial support. D.H. has been funded by grant ERC-CZ CORES LL-1201 of the Czech Ministry of Education. M.P. and N.J. received funding from the Czech Science Foundation project No. P303-14-02634S and from the Czech Ministry of Education, Youth and Sports, project No. DAAD-15-30. J.H. was supported by the Czech Science Foundation project GA13-23940S and Czech Health Research Council project NV15-29835A. We thank Mary Lindsey from the National Oceanic and Atmospheric Administration for her kind help with Fig. 4e. NCEP Reanalysis data provided by NOAA/OAR/ESRL PSD, Boulder, Colorado, USA, from their web site at http://www.esrl.noaa.gov/psd/.
    Peer reviewed. Publisher PDF.