4,450 research outputs found

    Improving Loss Estimation for Woodframe Buildings. Volume 2: Appendices

    Get PDF
    This report documents Tasks 4.1 and 4.5 of the CUREE-Caltech Woodframe Project. It presents a theoretical and empirical methodology for creating probabilistic relationships between seismic shaking severity and physical damage and loss for buildings in general, and for woodframe buildings in particular. The methodology, called assembly-based vulnerability (ABV), is illustrated for 19 specific woodframe buildings of varying ages, sizes, configurations, construction quality, and retrofit and redesign conditions. The study employs variations on four basic floorplans, called index buildings: a small house, a large house, a townhouse, and an apartment building. The resulting seismic vulnerability functions give the probability distribution of repair cost as a function of instrumental ground-motion severity. These vulnerability functions are useful by themselves, and are also transformed into seismic fragility functions compatible with the HAZUS software. The methods and data employed here use well-accepted structural engineering techniques, laboratory test data and computer programs produced by Element 1 of the CUREE-Caltech Woodframe Project, other recently published research, and standard construction cost-estimating methods. While based on such well-established principles, this report represents a substantially new contribution to the field of earthquake loss estimation. Its methodology is notable in that it calculates detailed structural response using nonlinear time-history structural analysis rather than the simplifying assumptions required by nonlinear pushover methods. It models physical damage at the level of individual building assemblies (individual windows, segments of wall, etc.), for which detailed laboratory testing is available, as opposed to two or three broad component categories that cannot be directly tested. And it explicitly models uncertainty in ground motion, structural response, component damageability, and contractor costs. Consequently, a very detailed, verifiable, probabilistic picture of physical performance and repair cost is produced, capable of informing a variety of decisions regarding seismic retrofit, code development, code enforcement, performance-based design for above-code applications, and insurance practices.
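    The assembly-based vulnerability approach described above can be summarized as nested Monte Carlo sampling over ground motion, structural response, assembly-level damage, and contractor repair costs. The following Python sketch illustrates that loop under entirely hypothetical assumptions; the assembly inventory, demand relation, capacities, and costs are invented placeholders, not the report's calibrated data.

        import numpy as np

        rng = np.random.default_rng(0)

        # Hypothetical assembly inventory: count, median drift capacity, lognormal dispersion, unit repair cost.
        assemblies = {
            "gypsum_wall_segment": dict(n=40, median_drift=0.010, beta=0.5, unit_cost=450.0),
            "window":              dict(n=12, median_drift=0.020, beta=0.6, unit_cost=600.0),
            "stucco_panel":        dict(n=30, median_drift=0.012, beta=0.5, unit_cost=380.0),
        }

        def simulate_repair_cost(sa, n_sims=5000):
            """Monte Carlo estimate of the repair-cost distribution at spectral acceleration sa (g)."""
            costs = np.zeros(n_sims)
            for i in range(n_sims):
                # 1. Structural response: peak drift sampled around a toy demand relation.
                drift = rng.lognormal(np.log(0.015 * sa), 0.4)
                total = 0.0
                for a in assemblies.values():
                    # 2. Damage: an assembly unit is damaged if its sampled capacity is below the demand.
                    capacity = rng.lognormal(np.log(a["median_drift"]), a["beta"], size=a["n"])
                    n_damaged = np.count_nonzero(capacity < drift)
                    # 3. Repair cost: damaged units times an uncertain contractor unit cost.
                    total += n_damaged * a["unit_cost"] * rng.lognormal(0.0, 0.2)
                costs[i] = total
            return costs

        costs = simulate_repair_cost(sa=0.4)
        print(f"mean repair cost: {costs.mean():,.0f}; 90th percentile: {np.percentile(costs, 90):,.0f}")

    Repeating such a simulation over a range of ground-motion severities yields a vulnerability function of the kind described in the report.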

    Utilizing public repositories to improve the decision process for security defect resolution and information reuse in the development environment

    Get PDF
    Software systems often contain security risks that could have been avoided if the design choices had been analyzed against public information security data sources. Public security sources have been shown to contain more relevant and recent information on current technologies than any textbook or research article, and these sources are often used by developers for solving software-related problems. However, solutions copied directly from public discussion forums such as StackOverflow into the developer's environment may carry security implications. Several methods to identify security bugs are being implemented, and recent efforts are looking into identifying security bugs from communication artifacts produced during the software development lifecycle, as well as into using public security information sources to support secure design and development. The primary goal of this thesis is to investigate how to utilize public information sources to reduce security defects in software artifacts by improving the decision process for defect resolution and information reuse in the development environment. We build a tool for collecting data from public information security sources and public discussion forums, construct machine learning models for classifying discussion forum posts and bug reports as security-related or not, and build word embedding models for finding matches between public security sources and public discussion forum posts or bug reports. The results of this thesis demonstrate that using public information security sources can provide additional validation layers for defect classification models, as well as additional security context for public discussion forum posts. The contributions of this thesis are an improved understanding of how public information security sources can provide context for bug reports and discussion forum posts. Additionally, we provide data collection APIs for collecting datasets from these sources, and classification and word embedding models for recommending related security sources for bug reports and public discussion forum posts.
    Master's thesis in Software Development, in collaboration with HVL (PROG399, MAMN-PRO).
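    As a concrete illustration of the classification step described above, the sketch below trains a TF-IDF plus logistic-regression model in scikit-learn to label posts as security-related or not. The example posts, labels, and pipeline choices are invented for illustration and are not the thesis's dataset or models.

        from sklearn.feature_extraction.text import TfidfVectorizer
        from sklearn.linear_model import LogisticRegression
        from sklearn.pipeline import make_pipeline

        # Toy labelled posts (1 = security-related, 0 = not); real data would come from
        # public discussion forums, bug trackers, and security advisories.
        posts = [
            "How do I prevent SQL injection in a parameterized query?",
            "Hashing passwords with bcrypt versus plain MD5",
            "How to center a div with flexbox",
            "Pandas groupby is slow on large dataframes",
        ]
        labels = [1, 1, 0, 0]

        model = make_pipeline(TfidfVectorizer(ngram_range=(1, 2)), LogisticRegression(max_iter=1000))
        model.fit(posts, labels)

        print(model.predict(["Is it safe to store an API token in localStorage?"]))

    In practice such a classifier would be trained on a much larger labelled corpus and combined with the word embedding models used for matching posts to security sources.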

    A Survey on Automated Software Vulnerability Detection Using Machine Learning and Deep Learning

    Full text link
    Software vulnerability detection is critical in software security because it identifies potential bugs in software systems, enabling immediate remediation and mitigation measures to be implemented before they can be exploited. Automatic vulnerability identification is important because it can evaluate large codebases more efficiently than manual code auditing. Many Machine Learning (ML) and Deep Learning (DL) based models for detecting vulnerabilities in source code have been presented in recent years. However, a survey that summarises, classifies, and analyses the application of ML/DL models for vulnerability detection is missing. Without a comprehensive survey it may be difficult to discover gaps in existing research and potential for future improvement; essential areas of research could be overlooked or under-represented, leading to a skewed understanding of the state of the art in vulnerability detection. This work addresses that gap by presenting a systematic survey to characterize various features of ML/DL-based source-code-level software vulnerability detection approaches via five primary research questions (RQs). Specifically, RQ1 examines the trend of publications that leverage ML/DL for vulnerability detection, including the evolution of research and the distribution of publication venues. RQ2 describes the vulnerability datasets used by existing ML/DL-based models, including their sources, types, and representations, as well as the embedding techniques used by these approaches. RQ3 explores the model architectures and design assumptions of ML/DL-based vulnerability detection approaches. RQ4 summarises the types and frequencies of vulnerabilities covered by existing studies. Lastly, RQ5 presents a list of current challenges to be researched and an outline of a potential research roadmap that highlights crucial opportunities for future work.
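    One recurring step the survey examines (RQ2) is turning source code into fixed-length vector representations before feeding it to a model. The sketch below shows one common variant, token-level Word2Vec embeddings averaged per snippet, using gensim; the snippets, tokenizer, and hyperparameters are illustrative assumptions rather than any specific approach covered by the survey.

        import re
        import numpy as np
        from gensim.models import Word2Vec

        # Toy C snippets: one with a classic unbounded copy, one with a bounded copy.
        snippets = [
            "void f(char *s) { char buf[8]; strcpy(buf, s); }",
            "void g(char *s) { char buf[8]; strncpy(buf, s, sizeof(buf) - 1); }",
        ]

        def tokenize(code):
            # Crude lexer: identifiers, numbers, and single punctuation characters become tokens.
            return re.findall(r"[A-Za-z_]\w*|\d+|[^\sA-Za-z_\d]", code)

        token_lists = [tokenize(s) for s in snippets]

        # Learn token embeddings, then represent each snippet as the mean of its token vectors.
        emb = Word2Vec(sentences=token_lists, vector_size=32, window=5, min_count=1, epochs=50)
        vectors = np.array([np.mean([emb.wv[t] for t in toks], axis=0) for toks in token_lists])
        print(vectors.shape)  # (2, 32): fixed-length inputs for a downstream ML/DL classifier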

    Vulnerability Assessment of Buildings due to Land Subsidence using InSAR Data in the Ancient Historical City of Pistoia (Italy)

    Get PDF
    The launch of the medium-resolution Synthetic Aperture Radar (SAR) Sentinel-1 constellation in 2014 has allowed public and private organizations to introduce SAR interferometry (InSAR) products as a valuable option in their monitoring systems. The massive stacks of displacement data resulting from the processing of large C-band radar image archives can be used to highlight temporal and spatial deformation anomalies, and their detailed analysis and post-processing can generate operational products for end users. In this work, the wide-area mapping capability of Sentinel-1 was used in synergy with COSMO-SkyMed high-resolution SAR data to characterize the ground subsidence affecting the urban fabric of the city of Pistoia (Tuscany Region, central Italy). Line-of-sight velocities were decomposed into vertical and E–W components, revealing slight horizontal movements towards the center of the subsidence area. Vertical displacements and damage field surveys allowed the probability of damage to be calculated as a function of displacement velocity by means of fragility curves. Finally, these data were translated into damage probability and potential loss maps. These products are useful for urban planning and geohazard management, focusing on the identification of the most hazardous areas on which to concentrate efforts and resources.
    This work was supported by the Spanish Ministry of Economy, Industry and Competitiveness (MINECO), the State Agency of Research (AEI) and European Funds for Regional Development (FEDER) under projects AQUARISK (ESP2013-47780-C2-2-R), TEMUSA (TEC2017-85244-C2-1-P) and STAR-EO (TIN2014-55413-C2-2-P). The first author is grateful for the PhD student contract BES-2014-069076. The work was conceived during the research stay of P. Ezquerro and R. Tomás at the Università degli Studi di Firenze and the research stay of G. Herrera at the IGOT Lisbon University, supported by the Spanish Ministry of Education, Culture and Sport under fellowships EEBB-I-18-13014, PRX17/00439 and PRX19/00065, respectively. The S-1 monitoring activity is funded and supported by the Tuscany Region under the agreement “Monitoring ground deformation in the Tuscany Region with satellite radar data”. The authors also gratefully acknowledge TRE ALTAMIRA for having processed the S-1 data. The project was carried out using CSK® Products, © ASI (Italian Space Agency), delivered under ASI Project Id Science 678, “High resolution subsidence investigation in the urban area of Pistoia (Tuscany Region, central Italy)”. The work is carried out under the framework of the e-shape project, which has received funding from the European Union’s Horizon 2020 research and innovation programme under grant agreement 820852. This paper is also supported by the PRIMA programme under grant agreement No 1924, project RESERVOIR. The PRIMA programme is supported by the European Union.
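    The decomposition of line-of-sight (LOS) velocities into vertical and E–W components is typically done by combining ascending and descending acquisitions and solving a small linear system per pixel, assuming negligible N–S motion (to which the near-polar SAR geometry is almost insensitive). The sketch below uses a simplified right-looking geometry with illustrative incidence angles and velocities, not the study's actual processing chain or values.

        import numpy as np

        def decompose_los(v_asc, v_desc, inc_asc_deg, inc_desc_deg):
            """Recover east-west and vertical velocity from ascending/descending LOS velocities.

            Simplified right-looking geometry (heading ignored), LOS positive towards the satellite,
            north-south motion assumed negligible. Velocities in mm/yr, incidence angles in degrees.
            """
            ta, td = np.radians(inc_asc_deg), np.radians(inc_desc_deg)
            # v_asc  = -v_east*sin(ta) + v_up*cos(ta)   (ascending track looks roughly east)
            # v_desc = +v_east*sin(td) + v_up*cos(td)   (descending track looks roughly west)
            A = np.array([[-np.sin(ta), np.cos(ta)],
                          [ np.sin(td), np.cos(td)]])
            v_east, v_up = np.linalg.solve(A, np.array([v_asc, v_desc]))
            return v_east, v_up

        # Illustrative pixel: both geometries see motion away from the satellite (subsidence-dominated).
        print(decompose_los(v_asc=-6.0, v_desc=-8.0, inc_asc_deg=39.0, inc_desc_deg=39.0))

    Per-pixel vertical velocities obtained this way are what the fragility curves then map to damage probabilities.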

    PB-JFT-23

    Get PDF

    Exposure modelling and loss estimation for seismic risk assessment of residential buildings: innovative methods and applications

    Get PDF
    Defining the seismic hazard, assessing the vulnerability of the main components of the built environment and, consequently, estimating the expected losses are key steps for setting up effective post-event emergency plans as well as medium- to long-term mitigation strategies. Despite the significant advancements in knowledge achieved in recent years, several points need to be further developed. Among them, the collection of reliable building inventories, the selection of appropriate measures of seismic intensity and the definition of accurate loss estimation models still pose challenges for the scientific community. The present PhD thesis aims to provide a contribution in this direction. After a comprehensive state-of-the-art review of seismic risk components, along with a literature review of the main models for estimating expected seismic losses, several new procedures related to hazard, exposure and loss estimation are proposed and applied. Firstly, a model for estimating direct economic losses (i.e., building repair costs) has been developed by improving the models currently available in the literature. These models generally account only for the severity of damage (i.e., the maximum damage level), while damage extension and distribution, especially along the building height, are implicitly considered in the repair cost values. While the assessment of safety conditions depends essentially on damage severity, damage extension strongly affects the estimation of the economic impact. In this regard, the proposed model makes it possible to explicitly consider both damage severity and its distribution along the building height. The model is applicable to both Reinforced Concrete (RC) and masonry building types. It requires determining the most frequent damage distributions along the building height. At present, the procedure has been implemented specifically for existing RC building types by performing Non-Linear Dynamic Analyses (NLDAs). As for seismic hazard, correlations between macroseismic intensities and ground motion parameters have been derived by processing data from Italian earthquakes that occurred in the last 40 years. Peak Ground Acceleration (PGA), Peak Ground Velocity (PGV) and Housner Intensity (IH) have been considered as instrumental measures, and the European Macroseismic Scale (EMS-98) and Mercalli-Cancani-Sieberg (MCS) scale as macroseismic measures. The correlations can be used both to apply empirical damage estimation methods (e.g., Damage Probability Matrices) and to convert the macroseismic data of historical earthquakes into instrumental intensity values, which are more suitable for risk analyses and design practice. Concerning exposure, an innovative methodology has been developed to convert the information on typological characteristics collected through the AeDES form (currently used in Italy in post-earthquake usability surveys) into recognized international standards such as the taxonomy proposed by the Global Earthquake Model (GEM) and the EMS-98 building types. The methodology makes it possible to fully exploit the exposure and vulnerability data of post-earthquake surveys of the Italian built environment and to define an exposure model in terms of risk-oriented classes more suitable for large-scale risk assessments.
    Furthermore, an approach based on the integration of data collected with the CARTIS procedure (i.e., a protocol used in Italy for the typological-structural characterization of buildings at regional scale) and the RRVS web-based platform (i.e., a remote visual screening based on satellite images) has been proposed and applied to the village of Calvello (Basilicata region, Southern Italy). This approach is a useful tool for compiling residential building inventories in a quick and inexpensive way, making it very suitable for data-poor and economically developing countries. To illustrate the proposed methodological developments, some applications are provided in the last part of the thesis. The first compares the results obtained by applying several casualty estimation models from the literature to the vulnerability and damage data collected in the L’Aquila urban area after the 2009 earthquake (data available on the Observed Damage Database Da.D.O. platform). Next, using the same data source, an exposure model in terms of EMS-98 types based on the 2009 post-earthquake data has been implemented for the residential buildings of the town of L’Aquila and the surrounding municipalities involved in the usability assessment surveys. The third and most extensive application deals with the seismic risk assessment of the Val d’Agri area (Basilicata region, Southern Italy). This area has a strategic role for Italy due to the large quantities of oil extracted from local deposits, which make large resources from royalties available. Specifically, earthquake damage scenarios for the residential building stock of 19 villages have been prepared. Considering a seismic vulnerability distribution obtained from the integration of a building-by-building inventory and information collected with the CARTIS and RRVS approaches, the expected losses deriving from a seismic event with an exceedance probability of 10% in 50 years (475-year return period) have been determined. Finally, an action plan for seismic risk mitigation, essentially based on reducing the vulnerability of the building stock through a structural strengthening program, has been proposed and applied to one of the villages in the area under study.
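    As a small aside on the hazard level quoted above, the 475-year return period follows directly from a Poisson occurrence model linking exceedance probability p over an exposure time t to the return period T = -t / ln(1 - p). The sketch below shows that arithmetic together with a toy expected-loss calculation; the damage probabilities, cost ratios, and replacement value are invented placeholders, not the thesis's calibrated figures.

        import math

        # Return period from exceedance probability under a Poisson occurrence model.
        p, t = 0.10, 50.0                       # 10% probability of exceedance in 50 years
        T = -t / math.log(1.0 - p)
        print(f"return period: {T:.0f} years")  # ~475 years

        # Toy expected loss: P(damage grade) x repair-cost ratio x replacement value.
        damage_probs = {1: 0.30, 2: 0.20, 3: 0.10, 4: 0.04, 5: 0.01}   # EMS-98 damage grades D1-D5
        cost_ratios  = {1: 0.05, 2: 0.15, 3: 0.35, 4: 0.70, 5: 1.00}   # fraction of replacement cost
        replacement_value = 1200.0 * 90.0        # e.g. 1200 EUR/m2 for a 90 m2 dwelling

        expected_loss = sum(damage_probs[d] * cost_ratios[d] for d in damage_probs) * replacement_value
        print(f"expected loss per dwelling: {expected_loss:,.0f} EUR")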

    Enhancing the collaboration of earthquake engineering research infrastructures

    Get PDF
    Towards stronger international collaboration of earthquake engineering research infrastructures: international collaboration and mobility of researchers is a means of maximising the efficiency of use of research infrastructures. The European infrastructures are committed to widening joint research and access to their facilities. This is relevant to the European framework for research and innovation, the single market and the competitiveness of the construction industry. JRC.G.4 - European Laboratory for Structural Assessment

    Natural and Technological Hazards in Urban Areas

    Get PDF
    Natural hazard events and technological accidents are separate causes of environmental impacts. Natural hazards are physical phenomena that have been active throughout geological time, whereas technological hazards result from actions or facilities created by humans. In our time, however, combined natural and man-made hazard events are increasingly being induced. Overpopulation and urban development in areas prone to natural hazards increase the impact of natural disasters worldwide. Additionally, urban areas are frequently characterized by intense industrial activity and rapid, poorly planned growth that threatens the environment and degrades the quality of life. Proper urban planning is therefore crucial to minimize fatalities and reduce the environmental and economic impacts that accompany both natural and technological hazardous events.

    An empirical comparison of commercial and open‐source web vulnerability scanners

    Get PDF
    Web vulnerability scanners (WVSs) are tools that can detect security vulnerabilities in web services. Although both commercial and open-source WVSs exist, their vulnerability detection capability and performance vary. In this article, we report on a comparative study to determine the vulnerability detection capabilities of eight WVSs (both open-source and commercial) using two vulnerable web applications: WebGoat and Damn Vulnerable Web Application (DVWA). The eight WVSs studied were: Acunetix; HP WebInspect; IBM AppScan; OWASP ZAP; Skipfish; Arachni; Vega; and IronWASP. Performance was evaluated using multiple metrics: precision; recall; Youden index; the OWASP web benchmark evaluation; and the web application security scanner evaluation criteria. The experimental results show that, while the commercial scanners are effective in detecting security vulnerabilities, some open-source scanners (such as ZAP and Skipfish) can also be effective. In summary, this study recommends improving the vulnerability detection capabilities of both the open-source and commercial scanners to enhance code coverage and the detection rate, and to reduce the number of false positives.
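    The detection metrics listed above reduce to counts of true/false positives and negatives per scanner against a benchmark of known vulnerabilities. The sketch below computes precision, recall, and the Youden index (sensitivity + specificity - 1); the counts are made up for illustration and are not the paper's measured results.

        def scanner_metrics(tp, fp, tn, fn):
            """Precision, recall (sensitivity), and Youden index J = sensitivity + specificity - 1."""
            precision = tp / (tp + fp)
            recall = tp / (tp + fn)              # true-positive rate
            specificity = tn / (tn + fp)
            return precision, recall, recall + specificity - 1.0

        # Illustrative counts for two hypothetical scanners.
        for name, counts in {"scanner_A": (45, 10, 90, 15), "scanner_B": (30, 3, 97, 30)}.items():
            p, r, j = scanner_metrics(*counts)
            print(f"{name}: precision={p:.2f} recall={r:.2f} youden={j:.2f}")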