
    Search-based Unit Test Generation for Evolving Software

    Search-based software testing has been successfully applied to generate unit test cases for object-oriented software. Typically, in search-based test generation approaches, evolutionary search algorithms are guided by code coverage criteria such as branch coverage to generate tests for individual coverage objectives. Although this approach has been shown to be effective, fundamental questions remain open. In particular, which criteria should test generation use in order to produce the best test suites? Which evolutionary algorithms are more effective at generating test cases with high coverage? And how can search-based unit test generation be scaled up to software projects that consist of large numbers of components and evolve frequently over time? Because these questions remain open, the applicability of search-based test generation techniques in practice is still fundamentally limited. To answer them, we investigate the following improvements to search-based testing. First, we propose the simultaneous optimisation of several coverage criteria using an evolutionary algorithm, rather than optimising for individual criteria. We then perform an empirical evaluation of different evolutionary algorithms to understand the influence of each one on the test optimisation problem. Next, we extend coverage-based test generation with a non-functional criterion to increase the likelihood of detecting faults and to help developers identify their locations. Finally, we propose several strategies and tools to efficiently apply search-based test generation techniques in large and evolving software projects. Our results show that, overall, optimising several coverage criteria at once is efficient; that one evolutionary algorithm clearly works better for the test generation problem than the others; that the extended coverage-based test generation is effective at revealing and localising faults; and that our proposed strategies, specifically designed to test entire software projects in a continuous way, improve efficiency and lead to higher code coverage. Consequently, the techniques and toolset presented in this thesis, which support all of the contributions described here, bring search-based software testing one step closer to practical usage by equipping software engineers with the state of the art in automated test generation.
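
    To make the first contribution concrete, the sketch below shows a single evolutionary loop whose fitness rewards several coverage criteria at once instead of targeting each objective in isolation. It is a minimal illustration only, not the thesis's actual tool: the two toy criteria, the bit-string encoding of a test suite, and all parameters are assumptions invented for the example.

```python
import random

# Toy stand-ins for coverage criteria: each maps a candidate "test suite"
# (here just a bit string recording which goals it reaches) to a score in
# [0, 1]. Real criteria (branch coverage, mutation score, ...) would come
# from instrumented executions of the class under test.
N_GOALS = 32

def branch_coverage(suite):
    return sum(suite[:16]) / 16.0          # fraction of "branches" covered

def mutation_score(suite):
    return sum(suite[16:]) / 16.0          # fraction of "mutants" killed

CRITERIA = [branch_coverage, mutation_score]

def fitness(suite):
    # Combined fitness: optimise all criteria simultaneously by summing
    # them, rather than running one search per individual coverage goal.
    return sum(criterion(suite) for criterion in CRITERIA)

def mutate(suite, rate=1.0 / N_GOALS):
    return [b ^ 1 if random.random() < rate else b for b in suite]

def crossover(a, b):
    cut = random.randrange(1, N_GOALS)
    return a[:cut] + b[cut:]

def evolve(pop_size=50, generations=100):
    population = [[random.randint(0, 1) for _ in range(N_GOALS)]
                  for _ in range(pop_size)]
    for _ in range(generations):
        population.sort(key=fitness, reverse=True)
        elite = population[: pop_size // 5]        # keep the best 20%
        children = [mutate(crossover(random.choice(elite),
                                     random.choice(elite)))
                    for _ in range(pop_size - len(elite))]
        population = elite + children
    return max(population, key=fitness)

best = evolve()
print("branch:", branch_coverage(best), "mutation:", mutation_score(best))
```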

    Multiscale modelling of delayed hydride cracking

    A mechanistic model of delayed hydride cracking (DHC) is crucial to the nuclear industry as a predictive tool for understanding the structural failure of zirconium alloy components that are used to clad fuel pins in water-cooled reactors. Such a model of DHC failure must be both physically accurate and computationally efficient so that it can inform and guide nuclear safety assessments. However, this endeavour has so far proved to be an insurmountable challenge because of the seemingly intractable multiscale complexity of the DHC phenomenon, which is a manifestation of hydrogen embrittlement that involves the interplay and repetition of three constituent processes: atomic scale diffusion, microscale precipitation and continuum scale fracture. This investigation aims to blueprint a novel multiscale modelling strategy to simulate the early stages of DHC initiation: stress-driven, hydrogen diffusion-controlled precipitation of hydrides near loaded flaws in polycrystalline zirconium. Following a careful review of the experimental observations in the literature as well as the standard modelling techniques that are commonplace in nuclear fuel performance codes in the first part of this dissertation, the second and third parts introduce a hybrid multiscale modelling strategy that integrates concepts across a spectrum of length and time scales into one self-consistent framework whilst accounting for the complicated nuances of the zirconium-hydrogen system. In particular, this strategy dissects the DHC mechanism into three interconnected modules: (i) stress analysis, which performs defect micromechanics in hexagonal close-packed zirconium through the application of the mathematical theory of planar elasticity to anisotropic continua; (ii) stress-diffusion analysis, which bridges the classical long-range elastochemical transport with the quantum structure of the hydrogen interstitialcy in the trigonal environment of the tetrahedral site; and (iii) diffusion-precipitation analysis, which translates empirical findings into an optimised algorithm that emulates the thermodynamically favourable spatial assembly of the microscopic hydride needles into macroscopic hydride colonies at prospective nucleation sites. Each module explores several unique mechanistic modelling considerations, including a multipolar expansion of the forces exerted by hydrogen interstitials, a distributed dislocation representation of the hydride platelets, and a stoichiometric hydrogen mass conservation criterion that dictates the lifecycle of hydrides. The investigation proceeds to amalgamate the stress, stress-diffusion and diffusion-precipitation analyses into a unified theory of the mesoscale mechanics that underpin the early stages of DHC failure and a comprehensive simulation of the flaw-tip hydrogen profiles and hydride microstructures. The multiscale theory and simulation are realised within a bespoke software which incorporates computer vision to generate mesoscale micrographs that depict the geometries, morphologies and contours of key metallographic entities: cracks and notches, grains, intergranular and intragranular nucleation sites, as well as regions of hydrogen enhancement and complex networks of hydride features. Computer vision mediates the balance between simulation accuracy and simulation efficiency, an approach that is novel in the context of DHC research and sits at the intersection of computational science and computer science.
Preliminary tests show that the simulation environment of the hybrid model is significantly more accurate and efficient in comparison with the traditional finite element and phase field methodologies. Due to this unprecedented simulation accuracy-efficiency balance, realistic flaw-tip hydrogen profiles and hydride microstructures can be simulated within seconds, which naturally facilitates statistical averaging over ensembles. Such statistical capabilities are highly relevant to nuclear safety assessments and, therefore, a systematic breakdown of the model formulation is presented in the style of a code specification manual so that the bespoke software can be readily adapted within an industrial setting. As the main contribution to DHC research, the proposed multiscale model comprises a state-of-the-art microstructural solver whose unrivalled versatility is demonstrated by showcasing a series of simulated micrographs that are parametrised by flaw acuity, grain size, texture, alloy composition, and histories of thermomechanical cycles. Direct comparisons with experimental micrographs indicate good quantitative agreement and provide some justification for the known qualitative trends. Furthermore, the overall hybrid methodology is shown to scale linearly with the number of hydrides, which is computationally advantageous in its own right because it allows the bespoke software to be extended without compromising its speed. Several possible extensions are outlined which would improve the phenomenological accuracy of the multiscale model whilst retaining its efficiency. In its current form, however, this hybrid multiscale model of the early stages of DHC goes far beyond existing methodologies in terms of simulation scope.
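
    As a worked example of the "classical long-range elastochemical transport" bridged by the stress-diffusion module, the standard stress-assisted diffusion flux for interstitial hydrogen in a metal lattice is shown below. This is a textbook relation assumed here for illustration, not a formula quoted from the thesis.

```latex
% Stress-assisted hydrogen diffusion flux (standard form, assumed here):
%   \mathbf{J}   : hydrogen flux
%   D            : hydrogen diffusivity
%   C            : hydrogen concentration
%   \bar{V}_H    : partial molar volume of hydrogen in zirconium
%   \sigma_h     : hydrostatic stress (one third of the stress trace)
%   R, T         : gas constant and absolute temperature
\[
  \mathbf{J} \;=\; -D\,\nabla C \;+\; \frac{D\,C\,\bar{V}_H}{RT}\,\nabla \sigma_h
\]
```

    The second term drives hydrogen up gradients of hydrostatic tension, which is why hydrogen accumulates, and hydrides preferentially precipitate, near the tips of loaded flaws.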

    Continuous reservoir model updating by ensemble Kalman filter on Grid computing architectures

    A reservoir engineering Grid computing toolkit, ResGrid, and its extensions were developed and applied to designed reservoir simulation studies and continuous reservoir model updating. The toolkit provides reservoir engineers with high-performance computing capacity to complete their projects without requiring them to delve into Grid resource heterogeneity, security certification, or network protocols. Continuous and real-time reservoir model updating is an important component of closed-loop, model-based reservoir management. The method must rapidly and continuously update reservoir models by assimilating production data, so that the performance predictions and the associated uncertainty are up-to-date for optimization. The ensemble Kalman filter (EnKF), a Bayesian approach for model updating, uses Monte Carlo statistics to fuse observation data with forecasts from simulations and thereby estimate a range of plausible models. The ensemble of updated models can be used for uncertainty forecasting or optimization. Grid environments aggregate geographically distributed, heterogeneous resources. Their virtual architecture can handle many large parallel simulation runs, and is thus well suited to solving model-based reservoir management problems. In this study, the ResGrid workflow for Grid-based designed reservoir simulation, together with an adapted workflow for continuous model updating, provides tools for building prior model ensembles, task farming and execution, extracting simulator output results, and implementing the EnKF, with a web portal for invoking those scripts. The ResGrid workflow is demonstrated for a geostatistical study of 3-D displacements in heterogeneous reservoirs. A suite of 1920 simulations assesses the effects of geostatistical methods and model parameters. Multiple runs are simultaneously executed using parallel Grid computing. Flow response analyses indicate that efficient, widely used sequential geostatistical simulation methods may overestimate flow response variability when compared to more rigorous but computationally costly direct methods. Although the EnKF has attracted great interest in reservoir engineering, some of its aspects remain poorly understood and are explored in this dissertation. First, guidelines are offered to select data assimilation intervals. Second, an adaptive covariance inflation method is shown to be effective at stabilizing the EnKF. Third, we show that simple truncation can correct negative effects of nonlinearity and non-Gaussianity as effectively as more complex and expensive reparameterization methods.
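
    Below is a minimal sketch of the EnKF analysis step outlined above, including a multiplicative covariance inflation of the kind used to stabilize the filter. It uses the perturbed-observation (stochastic) EnKF variant with a linear observation operator; the state dimensions, inflation factor, and demo values are illustrative assumptions, not details taken from the dissertation.

```python
import numpy as np

rng = np.random.default_rng(0)

def enkf_update(X_f, y_obs, H, R, inflation=1.05):
    """One EnKF analysis step.

    X_f       : (n_state, n_ens) forecast ensemble of model parameters
    y_obs     : (n_obs,) observed production data
    H         : (n_obs, n_state) linear observation operator (assumed)
    R         : (n_obs, n_obs) observation-error covariance
    inflation : multiplicative covariance inflation (> 1 widens the
                forecast spread to counteract filter collapse)
    """
    n_state, n_ens = X_f.shape
    x_mean = X_f.mean(axis=1, keepdims=True)
    A = inflation * (X_f - x_mean)            # inflated ensemble anomalies
    X_f = x_mean + A
    P_HT = (A @ (H @ A).T) / (n_ens - 1)      # P_f H^T from the ensemble
    S = H @ P_HT + R                          # innovation covariance
    K = P_HT @ np.linalg.solve(S, np.eye(len(y_obs)))   # Kalman gain
    # Perturb observations so the updated ensemble keeps correct spread.
    Y = y_obs[:, None] + rng.multivariate_normal(
        np.zeros(len(y_obs)), R, size=n_ens).T
    return X_f + K @ (Y - H @ X_f)

# Tiny demo: a 10-parameter "reservoir model", 3 noisy measurements.
n_state, n_obs, n_ens = 10, 3, 50
truth = rng.normal(size=n_state)
H = rng.normal(size=(n_obs, n_state))
R = 0.1 * np.eye(n_obs)
y = H @ truth + rng.multivariate_normal(np.zeros(n_obs), R)
X = rng.normal(size=(n_state, n_ens))         # prior ensemble
X = enkf_update(X, y, H, R)
print("posterior mean error:", np.linalg.norm(X.mean(axis=1) - truth))
```

    In a real closed-loop workflow, the forecast ensemble X_f would come from parallel reservoir simulation runs farmed out on the Grid, and this update would be repeated at each data assimilation interval.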

    GPU Performance Modelling and Optimization

    Ph.D. (NUS-TU/e Joint Ph.D. programme)