
    Automated Software Testing of Relational Database Schemas

    Relational databases are critical for many software systems, holding organisations' most valuable data. Data engineers build relational databases using schemas that specify the structure of the data within a database and define integrity constraints. These constraints protect the data's consistency and coherency, leading industry experts to recommend testing them. Since manual schema testing is labour-intensive and error-prone, automated techniques enable the generation of test data. Although these generators are well-established and effective, they use default values and often produce many long, similar tests, which decreases fault detection and increases regression testing time and testers' inspection effort. This raises the following questions: How effective is the optimised random generator at generating tests, and how does its fault detection compare to prior methods? What factors make tests understandable for testers? How can tests be reduced while maintaining effectiveness? How effectively do testers inspect differently reduced tests? To answer these questions, the first contribution of this thesis is an empirical evaluation of a new optimised random generator against well-established methods. The second is an identification of the understandability factors of schema tests using a human study. The third is an evaluation of a novel approach that reduces and merges tests against traditional reduction methods. The final contribution is a study of testers' inspection efforts with differently reduced tests, again using a human study. The results show that the optimised random method efficiently generates effective tests compared to well-established methods. Testers reported that many NULLs and negative numbers are confusing, and that they prefer simple repetition of unimportant values and readable strings. The reduction technique with merging is the most effective at minimising tests and producing efficient tests while maintaining effectiveness compared to traditional methods. The merged tests showed an increase in inspection efficiency with a slight decrease in accuracy compared to reduced-only tests. Therefore, these techniques and investigations can help practitioners adopt these generators in practice.
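
    To illustrate the kind of test data such generators derive from integrity constraints, the sketch below is a hypothetical toy example, not the generator evaluated in the thesis; the table, column names, and value-picking strategy (readable strings, simple repeated values) are all assumptions for illustration.

        # Hypothetical sketch: derive schema tests for a toy table with
        # NOT NULL and CHECK constraints (illustrative only).
        schema = {
            "table": "accounts",
            "columns": {
                "id":      {"type": "int",  "not_null": True},
                "name":    {"type": "text", "not_null": True},
                "balance": {"type": "int",  "not_null": False, "check_min": 0},
            },
        }

        def satisfying_row(schema):
            """Pick simple, readable values that satisfy every constraint."""
            row = {}
            for col, spec in schema["columns"].items():
                if spec["type"] == "int":
                    row[col] = max(spec.get("check_min", 0), 1)
                else:
                    row[col] = col  # readable string: reuse the column name
            return row

        def violating_rows(schema):
            """For each constraint, copy the satisfying row and break only that constraint."""
            base = satisfying_row(schema)
            tests = []
            for col, spec in schema["columns"].items():
                if spec["not_null"]:
                    tests.append((f"{col} IS NULL", dict(base, **{col: None})))
                if "check_min" in spec:
                    tests.append((f"{col} below CHECK bound", dict(base, **{col: spec["check_min"] - 1})))
            return tests

        def to_insert(table, row):
            cols = ", ".join(row)
            vals = ", ".join("NULL" if v is None else repr(v) for v in row.values())
            return f"INSERT INTO {table} ({cols}) VALUES ({vals});"

        print(to_insert(schema["table"], satisfying_row(schema)))
        for label, row in violating_rows(schema):
            print("--", label)
            print(to_insert(schema["table"], row))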

    Coverage Testing in a Production Software Development Environment

    This project proposes that the testing methodologies used by standard testing tools are not sufficient to ensure adequate test coverage. Test tools provide important and irreplaceable test data but cannot guarantee a high percentage of path exposure (coverage). If the code path includes decision statements such as if or when, the number of paths to test grows rapidly, and the growth becomes exponential when nested decision statements are considered. The most common methodology used in today's testing environment verifies each line of code but does not verify all path combinations. Testing per line of code cannot guarantee complete test coverage when considering the variations of nested code paths. The result of lower coverage is a higher field defect rate that increases the overall product support costs.
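
    To make the path-explosion argument concrete, the hypothetical sketch below (not from the project) counts the distinct execution paths through a function with three independent if statements: two tests are enough to execute every line, yet 2^3 = 8 input combinations are needed to exercise every path.

        # Illustrative sketch: line coverage vs. path coverage for three
        # independent decisions (invented example).
        from itertools import product

        def process(a, b, c):
            steps = []
            if a:            # decision 1
                steps.append("A")
            if b:            # decision 2
                steps.append("B")
            if c:            # decision 3
                steps.append("C")
            return steps

        # Two tests suffice to execute every line at least once ...
        line_coverage_suite = [(True, True, True), (False, False, False)]

        # ... but covering every *path* needs all 2**3 input combinations.
        all_paths = {tuple(process(a, b, c)) for a, b, c in product([True, False], repeat=3)}
        print(len(line_coverage_suite), "tests for line coverage")
        print(len(all_paths), "distinct paths to cover")  # prints 8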

    Using Agile Software Development Practices in a Research Oriented Distributed Simulation

    Although sometimes controversial, agile methodologies have proven to be a viable choice for some software development projects. Projects suited to agile methodologies are those that involve new technology, have requirements that change rapidly, and are controlled by small, talented teams. Much of the literature about agile software development leans towards business products and non-government entities. Only a handful of literature resources mention agile software development being used in government contracts, and even fewer mention research projects. NASA's Airspace and Traffic Operations Simulation (ATOS) is a research-oriented simulation that doesn't follow the traditional business project mold. In an effort to better understand whether agile could be used effectively in a NASA contract for a research-oriented simulation project, this research examined which agile practices could be used effectively to help gain simulation reliability while simultaneously allowing routine maintenance, current experiment support, new modeling additions, and comprehensive architectural changes.

    Redefining and Evaluating Coverage Criteria Based on the Testing Scope

    Test coverage information can help testers in deciding when to stop testing and in augmenting their test suites when the measured coverage is not deemed sufficient. Since the notion of a test criterion was introduced in the 1970s, research on coverage testing has been very active, with much effort dedicated to the definition of new, more cost-effective coverage criteria or to the adaptation of existing ones to a different domain. All these studies share the premise that after defining the entity to be covered (e.g., branches), one cannot consider a program to be adequately tested if some of its entities have never been exercised by any input data. However, it is not the case that all entities are of interest in every context. This is particularly true for several paradigms that emerged in the last decade (e.g., component-based development, service-oriented architecture). In such cases, traditional coverage metrics might not always provide meaningful information. In this thesis we address this situation and redefine coverage criteria so as to focus on the program parts that are relevant to the testing scope. We instantiate this general notion of scope-based coverage by introducing three coverage criteria and demonstrate how they can be applied to different testing contexts. When applied to the context of software reuse, our approach proved useful for supporting test case prioritization, selection and minimization. Our studies showed that for prioritization we can improve the average rate of faults detected. For test case selection and minimization, we can considerably reduce the test suite size with small to no extra impact on fault detection effectiveness. When the source code is not available, such as in the service-oriented architecture paradigm, we propose an approach that customizes coverage, measured on invocations at the service interface, based on data from similar users. We applied this approach to a real-world application and, in our study, we were able to predict the entities that would be of interest for a given user with high precision. Finally, we introduce the first coverage criterion of its kind for operational profile based testing that exploits program spectra obtained from usage traces. Our study showed that it is better correlated than traditional coverage with the probability that the next test input will fail, which implies that our approach can provide a better stopping rule. Promising results were also observed for test case selection. Our redefinition of coverage criteria approaches the topic of coverage testing from a completely different angle. Such a novel perspective paves the way for new avenues of research towards improving the cost-effectiveness of testing, avenues that are yet to be explored.
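
    As a rough illustration of restricting a criterion to the testing scope, the hypothetical sketch below (names and data are invented, not from the thesis) computes branch coverage twice over the same execution data: once against all branches of a program and once against only the branches declared relevant to the current scope.

        # Hypothetical sketch of scope-based coverage: the same covered entities,
        # measured against all entities vs. only the entities in scope.
        def coverage(covered, entities):
            """Fraction of the given entities exercised by the test suite."""
            entities = set(entities)
            return len(entities & set(covered)) / len(entities) if entities else 1.0

        all_branches = {"b1", "b2", "b3", "b4", "b5", "b6"}   # every branch in the program
        in_scope     = {"b1", "b2", "b5"}                     # branches relevant to this reuse context
        covered      = {"b1", "b2", "b4", "b5"}               # branches hit by the test suite

        print(f"traditional branch coverage: {coverage(covered, all_branches):.0%}")  # 67%
        print(f"scope-based coverage:        {coverage(covered, in_scope):.0%}")      # 100%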

    Detecting feature influences to quality attributes in large and partially measured spaces using smart sampling and dynamic learning

    Emergent application domains (e.g., Edge Computing/Cloud/B5G systems) are too complex to be built manually. They are characterised by high variability and are modelled by large Variability Models (VMs), leading to large configuration spaces. Due to the high number of variants present in such systems, it is challenging to find the best-ranked product regarding particular Quality Attributes (QAs) in a short time. Moreover, measuring QAs is sometimes not trivial, requiring a lot of time and resources, as is the case for the energy footprint of software systems, the focus of this paper. Hence, we need a mechanism to analyse how features and their interactions influence energy footprint, but without measuring all configurations. While practical, sampling and predictive techniques base their accuracy on uniform spaces or some initial domain knowledge, which are not always possible to achieve. Indeed, analysing the energy footprint of products in large configuration spaces raises specific requirements that we explore in this work. This paper presents SAVRUS (Smart Analyser of Variability Requirements in Unknown Spaces), an approach for sampling and dynamic statistical learning that does not rely on initial domain knowledge of large and partially QA-measured spaces. SAVRUS reports the degree to which features and pairwise interactions influence a particular QA, like energy efficiency. We validate and evaluate SAVRUS with a selection of similar systems, which define large search spaces containing scattered measurements. Funding for open access charge: Universidad de Málaga / CBUA. This work is supported by the European Union's H2020 research and innovation programme under grant agreement DAEMON H2020-101017109, by the projects IRIS PID2021-122812OB-I00 (co-financed by FEDER funds), Rhea P18-FR-1081 (MCI/AEI/FEDER, UE), and LEIA UMA18-FEDERIA-157, and the PRE2019-087496 grant from the Ministerio de Ciencia e Innovación, Spain.
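
    A minimal sketch of the kind of analysis described above, not the actual SAVRUS implementation: fit a linear model with pairwise interaction terms over a small sample of measured configurations and read feature and interaction influences off the coefficients. The feature names and energy values are invented for illustration.

        # Hypothetical sketch: estimate how features and pairwise interactions
        # influence a quality attribute (e.g., energy) from sampled configurations.
        import itertools
        import numpy as np

        features = ["cache", "compression", "encryption"]

        # Sampled configurations (1 = feature enabled) and their measured energy (J).
        configs = np.array([
            [0, 0, 0],
            [1, 0, 0],
            [0, 1, 0],
            [0, 0, 1],
            [1, 1, 0],
            [1, 0, 1],
            [0, 1, 1],
            [1, 1, 1],
        ])
        energy = np.array([10.0, 12.1, 11.4, 14.8, 15.6, 17.2, 16.0, 21.5])

        # Design matrix: intercept, single features, and pairwise interactions.
        pairs = list(itertools.combinations(range(len(features)), 2))
        names = ["intercept"] + features + [f"{features[i]}*{features[j]}" for i, j in pairs]
        columns = [np.ones(len(configs))]
        columns += [configs[:, k] for k in range(len(features))]
        columns += [configs[:, i] * configs[:, j] for i, j in pairs]
        X = np.column_stack(columns)

        # Least-squares fit; each coefficient approximates that term's influence on energy.
        coef, *_ = np.linalg.lstsq(X, energy, rcond=None)
        for name, c in zip(names, coef):
            print(f"{name:>25s}: {c:+.2f}")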

    The selection of protected areas in the face of fluctuating populations.

    Protected areas are an integral part of the majority of national conservation strategies, and are seen by many as the most practical means of safeguarding biological diversity. Nonetheless, many such areas exist in name only or have been awarded protected status for their lack of economic potential rather than any genuine biological significance. Given the imminent extinction crisis, it is essential that networks of protected areas are fully representative of key elements of biodiversity. Achieving this goal requires not only the efficient and effective selection of novel protected area networks, but also the regular evaluation of existing/established networks, to identify and ultimately rectify possible weaknesses and gaps in coverage. Despite increasingly urgent calls for the development and application of comprehensive protected area evaluation systems, analyses of this type remain rare, particularly regarding the number of individuals sustained over time. This thesis provides an in-depth analysis of the status of the existing network of wetland sites identified as nationally/internationally important for their wintering waterbird populations (Special Protection Areas and Ramsar Sites) in Great Britain. In addition, the aims are to examine the distribution patterns and population dynamics of selected waterbird species, and their implications for the selection and management of these wetland protected areas both in Great Britain and across the European Union; to examine the effectiveness and utility of alternative site selection methods, in particular the use of linear programming techniques, for real-world conservation issues; to provide suggestions for the improvement of the current network of protected areas both in Great Britain and across the European Union; and to provide recommendations and suggestions for waterbird conservation in general.
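
    The linear programming techniques mentioned above for site selection are commonly cast as a set-covering integer program. The toy sketch below is a hypothetical illustration, not the thesis's method or data: it assumes the open-source PuLP package and invented species/site records, and selects the fewest sites such that every species is represented at least once.

        # Hypothetical set-covering sketch for protected-area selection:
        # choose the fewest sites so every species occurs in at least one chosen site.
        # Assumes the PuLP package (pip install pulp); data are invented.
        from pulp import LpBinary, LpMinimize, LpProblem, LpVariable, lpSum

        occurrences = {                      # species -> sites where it is found
            "teal":      ["A", "B"],
            "wigeon":    ["B", "C"],
            "pochard":   ["C", "D"],
            "goldeneye": ["A", "D"],
        }
        sites = sorted({s for locs in occurrences.values() for s in locs})

        model = LpProblem("site_selection", LpMinimize)
        pick = {s: LpVariable(f"pick_{s}", cat=LpBinary) for s in sites}

        model += lpSum(pick.values())                          # minimise the number of sites
        for species, locs in occurrences.items():
            model += lpSum(pick[s] for s in locs) >= 1, f"cover_{species}"

        model.solve()
        print("selected sites:", [s for s in sites if pick[s].value() == 1])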

    Distribution of genetic diversity in dendritic networks: patterns, processes and implications for biodiversity conservation

    The objective of this thesis is to characterize the distribution of genetic diversity in dendritic networks. First, we identify a general spatial pattern of genetic diversity in these ecosystems, as well as the effects of asymmetric gene flow, differences in effective population sizes, and colonization processes on this pattern. Second, we characterize the patterns of genetic diversity of four freshwater fish species (Gobio occitaniae, Squalius cephalus, Barbatula barbatula and Phoxinus phoxinus) in the Garonne river basin, so as to identify priority areas to protect. Third, we explore the effects of gene flow asymmetry on the inference of populations' demographic histories. Finally, we combine genetic and demographic approaches to evaluate the status of a threatened species (Parachondrostoma toxostoma).

    The Impacts of Climate Change and Anthropogenic Processes on Permafrost Soils and USAF Infrastructure within Northern Tier Bases

    The Department of Defense is planning over $552M in military construction on Eielson Air Force Base within the next three fiscal years. Although many studies have been conducted on permafrost and climate change, the future of our climate, as well as any impacts on permafrost soils, remains unclear. This research focused on future climate predictions to determine likely scenarios for the United States Air Force's Strategic Planners to consider. The most recent 2013 Intergovernmental Panel on Climate Change report predicts a 2.2°C to 7.8°C temperature rise in Arctic regions by the end of the 21st Century under the Representative Concentration Pathway 4.5 (RCP4.5) emissions scenario. This study provides an explanation of the impacts of this temperature rise on permafrost soils and Arctic infrastructure. This study developed regression models to analyze historical data related to degree-days, temperature, and seasonal lengths. Initial analysis using regression/forecast techniques shows a 1.17°C temperature increase in the Arctic by the end of the 21st Century. Additionally, UAF's GIPL 2.1 model was used to calculate active layer thicknesses and permafrost thickness changes from 1947 to 2100. Results show that the active layer is thinning with some permafrost degradation. This research focused on Central Alaska; further research is recommended on the Alaskan North Slope and Greenland to determine additional impacts on Department of Defense infrastructure.
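
    As a rough illustration of the regression/forecast step described above, the hypothetical sketch below fits a linear trend to a short series of mean annual temperatures and extrapolates it to 2100; the observations are invented and are not the thesis data or its models.

        # Hypothetical sketch: fit a linear warming trend to mean annual
        # temperatures and extrapolate to the end of the 21st century.
        import numpy as np

        years = np.arange(1950, 2021, 10)   # 1950, 1960, ..., 2020
        temps = np.array([-3.4, -3.3, -3.1, -3.0, -2.8, -2.7, -2.5, -2.3])  # deg C, invented

        slope, intercept = np.polyfit(years, temps, deg=1)   # least-squares linear trend
        projected_2100 = slope * 2100 + intercept
        rise_from_2020 = projected_2100 - (slope * 2020 + intercept)

        print(f"trend: {slope * 100:.2f} deg C per century")
        print(f"projected additional rise by 2100: {rise_from_2020:.2f} deg C")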