
    Fuzzy Adaptive Tuning of a Particle Swarm Optimization Algorithm for Variable-Strength Combinatorial Test Suite Generation

    Combinatorial interaction testing is an important software testing technique that has attracted considerable recent interest. It can reduce the number of test cases needed by considering interactions between combinations of input parameters. Empirical evidence shows that it effectively detects faults, in particular for highly configurable software systems. In real-world software testing, the input variables may vary in how strongly they interact; variable-strength combinatorial interaction testing (VS-CIT) can exploit this for higher effectiveness. The generation of variable-strength test suites is an NP-hard computational problem \cite{BestounKamalFuzzy2017}. Research has shown that stochastic population-based algorithms such as particle swarm optimization (PSO) can be efficient compared to alternatives for VS-CIT problems. Nevertheless, they require detailed control of the exploitation-exploration trade-off to avoid premature convergence (i.e., being trapped in local optima) and to enhance solution diversity. Here, we present a new variant of PSO based on a Mamdani fuzzy inference system \cite{Camastra2015,TSAKIRIDIS2017257,KHOSRAVANIAN2016280} that permits adaptive selection of its global and local search operations. We detail the design of this combined algorithm and evaluate it through experiments on multiple synthetic and benchmark problems. We conclude that fuzzy adaptive selection of global and local search operations is at least feasible, as it performs second-best only to a discrete variant of PSO, called DPSO. In terms of best mean test suite size, the fuzzy adaptation even outperforms DPSO occasionally. We discuss the reasons behind this performance and outline relevant areas of future work.
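
    A minimal Python sketch of the adaptive idea: a Mamdani-style rule base (triangular memberships, weighted-mean defuzzification) maps swarm diversity to the PSO inertia weight, biasing the search toward exploration or exploitation. The rule base, membership functions, and toy objective are illustrative assumptions, not the paper's exact fuzzy system or its VS-CIT encoding.

        import random

        def sphere(x):                      # toy objective: minimize sum of squares
            return sum(v * v for v in x)

        def tri(x, a, b, c):                # triangular membership function
            if x <= a or x >= c:
                return 0.0
            return (x - a) / (b - a) if x <= b else (c - x) / (c - b)

        def fuzzy_inertia(diversity):
            # Mamdani-style rules: LOW diversity -> HIGH inertia (keep exploring),
            # HIGH diversity -> LOW inertia (exploit); weighted-mean defuzzification.
            low = tri(diversity, -0.5, 0.0, 0.5)
            med = tri(diversity, 0.0, 0.5, 1.0)
            high = tri(diversity, 0.5, 1.0, 1.5)
            total = low + med + high
            return (low * 0.9 + med * 0.7 + high * 0.4) / total if total else 0.7

        DIM, SWARM, ITERS = 5, 20, 200
        pos = [[random.uniform(-5, 5) for _ in range(DIM)] for _ in range(SWARM)]
        vel = [[0.0] * DIM for _ in range(SWARM)]
        pbest = [p[:] for p in pos]
        gbest = min(pbest, key=sphere)

        for _ in range(ITERS):
            # diversity: mean distance to the swarm centroid, scaled into [0, 1]
            centroid = [sum(p[d] for p in pos) / SWARM for d in range(DIM)]
            spread = sum(sum((p[d] - centroid[d]) ** 2 for d in range(DIM)) ** 0.5
                         for p in pos) / SWARM
            w = fuzzy_inertia(min(spread / 5.0, 1.0))
            for i in range(SWARM):
                for d in range(DIM):
                    r1, r2 = random.random(), random.random()
                    vel[i][d] = (w * vel[i][d]
                                 + 1.5 * r1 * (pbest[i][d] - pos[i][d])
                                 + 1.5 * r2 * (gbest[d] - pos[i][d]))
                    pos[i][d] += vel[i][d]
                if sphere(pos[i]) < sphere(pbest[i]):
                    pbest[i] = pos[i][:]
            gbest = min(pbest, key=sphere)

        print("best fitness:", sphere(gbest))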

    An improved firefly algorithm for optimal microgrid operation with renewable energy

    Electrical microgrid systems have become very important for rural and remote areas without a connection to the primary power grid. The high cost of fuel, logistics, spare parts, and maintenance drives up the cost of operating microgrid generation to supply electrical power to remote areas and rural communities. This project proposes an Improved Firefly Algorithm (IFA), an improvement of the classical Firefly Algorithm (FA) that uses Lévy flights, to solve the optimal microgrid operation problem. The IFA is used to optimize the cost of power generation in a microgrid system in which daily power balance constraints and generation limits are considered. The microgrid system for this case study includes both renewable energy plants and conventional generator units. Two test systems are considered as case studies. The first is a simple microgrid system consisting of three generators. The second consists of seven generating units, including two wind turbines, three fuel-cell plants, and two diesel generators. The IFA method was implemented in MATLAB. The results obtained by the IFA were compared to FA and other algorithms in terms of optimal cost, convergence characteristics, and robustness to validate the effectiveness of the IFA. The IFA obtains better results in terms of operating costs than FA, Differential Evolution (DE), Particle Swarm Optimization (PSO), and the Cuckoo Search Algorithm (CSA).
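
    The Lévy-flight modification can be sketched in Python as follows: the classical attraction move is kept, but the uniform random perturbation is replaced by a heavy-tailed Lévy step drawn with Mantegna's algorithm, which occasionally produces long jumps that help escape local optima. Constants and the toy cost function are assumptions, not the paper's IFA settings or microgrid model.

        import math, random

        def cost(x):                          # stand-in for a fuel-cost objective
            return sum((v - 1.0) ** 2 for v in x)

        def levy_step(beta=1.5):
            # Mantegna's algorithm for a symmetric Levy-stable step
            num = math.gamma(1 + beta) * math.sin(math.pi * beta / 2)
            den = math.gamma((1 + beta) / 2) * beta * 2 ** ((beta - 1) / 2)
            sigma = (num / den) ** (1 / beta)
            u, v = random.gauss(0, sigma), random.gauss(0, 1)
            return u / abs(v) ** (1 / beta)

        DIM, N, ITERS = 4, 15, 300
        BETA0, GAMMA, ALPHA = 1.0, 1.0, 0.1
        flies = [[random.uniform(-3, 3) for _ in range(DIM)] for _ in range(N)]

        for _ in range(ITERS):
            flies.sort(key=cost)              # brightest (lowest cost) first
            for i in range(N):
                for j in range(i):            # move firefly i toward every brighter j
                    r2 = sum((flies[i][d] - flies[j][d]) ** 2 for d in range(DIM))
                    beta = BETA0 * math.exp(-GAMMA * r2)
                    for d in range(DIM):
                        flies[i][d] += (beta * (flies[j][d] - flies[i][d])
                                        + ALPHA * levy_step())  # Levy noise replaces uniform noise

        print("best cost:", cost(min(flies, key=cost)))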

    Optimal Generation Scheduling of Power System for Maximum Renewable Energy Harvesting and Power Losses Minimization

    This paper proposes an optimal generation scheduling method for a power system integrated with renewable energy source (RES) based distributed generation (DG) and energy storage systems (ESS), considering maximum harvesting of RES outputs and minimum power system operating losses. The main contribution is the economical employment of RES in a power system. In particular, maximum harvesting of renewable energy is achieved by means of ESS management. In addition, minimum operating losses can be obtained by properly scheduling the operation of ESS and controllable generation. A Particle Swarm Optimization (PSO) algorithm is applied to search for near-globally-optimal solutions. The optimization problem is formulated and evaluated taking power system operating constraints into account. Different operation scenarios are used to investigate the effectiveness of the proposed method via DIgSILENT PowerFactory software. The proposed method is examined on the IEEE standard 14-bus and 30-bus test systems.
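
    A common way to handle the power-balance constraint in such a PSO formulation is to fold it into the fitness as a penalty term; the Python sketch below shows one such penalized dispatch objective that a PSO loop could minimize. The quadratic cost coefficients, demand value, and penalty weight are invented for illustration, not the paper's test-system data.

        # (a, b, c) quadratic fuel-cost coefficients per controllable unit
        UNITS = [(0.02, 2.0, 10.0), (0.03, 1.8, 12.0), (0.01, 2.5, 8.0)]
        DEMAND = 150.0          # load minus forecast RES output, in MW (assumed)
        PENALTY = 1e4           # weight on power-balance violation

        def dispatch_cost(p):
            """Fuel cost plus penalty for violating the power balance."""
            fuel = sum(a * pi ** 2 + b * pi + c for (a, b, c), pi in zip(UNITS, p))
            imbalance = abs(sum(p) - DEMAND)
            return fuel + PENALTY * imbalance

        # Example: evaluate one candidate particle position (output level per unit)
        print(dispatch_cost([50.0, 60.0, 40.0]))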

    Dynamic Message Sign and Diversion Traffic Optimization

    This dissertation proposes a Dynamic Message Sign (DMS) diversion control system based on principles of existing Advanced Traveler Information Systems and Advanced Traffic Management Systems (ATMS). The objective of the proposed system is to alleviate total corridor traffic delay by choosing an optimized diversion rate and alternative-road signal-timing plan. The DMS displays adaptive messages at a predefined time interval to guide a certain number of drivers to alternative roads. Messages to be displayed on the DMS are chosen by an on-line optimization model that minimizes corridor traffic delay. The expected diversion rate is assumed to follow a distribution. An optimization model is constructed that considers three traffic delay components: mainline travel delay, alternative-road signal control delay, and the travel time difference between the mainline and alternative roads. Signal timing parameters of alternative-road intersections and the DMS message level are the decision variables; speeds, flow rates, and other corridor traffic data from detectors serve as model inputs. The traffic simulation software CORSIM served as the development environment and test bed for evaluating the proposed system. MATLAB optimization toolboxes were applied to solve the proposed model. A CORSIM Run-Time Extension (RTE) was developed to exchange data between CORSIM and the adopted MATLAB optimization algorithms (Genetic Algorithm, Pattern Search from the direct search toolbox, and Sequential Quadratic Programming). Among the three candidate algorithms, Sequential Quadratic Programming (SQP) showed the fastest execution speed and yielded the smallest total delays in numerical examples. TRANSYT-7F, a widely trusted traffic signal optimization package, was used as a benchmark to verify the proposed model. The total corridor delays obtained from CORSIM with the SQP solutions show average reductions of 8.97%, 14.09%, and 13.09% for heavy, moderate, and light traffic congestion levels, respectively, compared with TRANSYT-7F optimization results. The maximum model execution time at each MATLAB call is less than two minutes, which implies that the system is capable of real-world implementation with a DMS message and signal update interval of two minutes.
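
    In the same spirit, SciPy's SLSQP solver (an SQP method, standing in here for the MATLAB solver used in the dissertation) can minimize a toy two-variable corridor-delay model over the diversion fraction and the alternative road's green split. The delay expressions and flow value below are invented stand-ins, not the dissertation's model.

        from scipy.optimize import minimize

        MAINLINE_FLOW = 1800.0   # veh/h approaching the bottleneck (assumed)

        def total_delay(x):
            q, g = x                         # q: diversion fraction, g: alt-road green split
            # toy queueing-style mainline delay, relieved as traffic diverts
            mainline = (1 - q) * MAINLINE_FLOW * 0.002 / max(1e-6, 1 - 0.9 * (1 - q))
            signal = q * MAINLINE_FLOW * (1 - g) ** 2 / max(1e-6, g)  # Webster-like term
            detour = q * MAINLINE_FLOW * 0.05                         # extra travel time
            return mainline + signal + detour

        res = minimize(total_delay, x0=[0.2, 0.5], method="SLSQP",
                       bounds=[(0.0, 0.8), (0.2, 0.8)])
        print("diversion rate %.2f, green split %.2f, delay %.1f" % (*res.x, res.fun))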

    Transferring Interactive Search-Based Software Testing to Industry

    Search-Based Software Testing (SBST) is the application of optimization algorithms to problems in software testing. In previous work, we implemented and evaluated Interactive Search-Based Software Testing (ISBST) tool prototypes, with the goal of successfully transferring the technique to industry. While SBSE solutions are often validated on benchmark problems, there is a need to validate them in operational settings. The present paper discusses the development and deployment of SBST tools for use in industry and reflects on the transfer of these techniques. In addition to previous work on the development and validation of an ISBST prototype, a new version of the prototype system was evaluated in the laboratory and in industry. This evaluation is based on an industrial System under Test (SUT) and was carried out with industrial practitioners. The Technology Transfer Model is used as a framework to describe the progression of the development and evaluation of the ISBST system in an industrial setting. The paper presents a synthesis of previous work developing and evaluating the ISBST prototype, together with an evaluation, in both academia and industry, of that prototype's latest version. We conclude that the ISBST system is capable of evolving useful test cases for that setting, though improvements are still required in the means the system uses to communicate that information to the user. In addition, a set of lessons learned from the project is listed and discussed. Our objective is to help other researchers who wish to validate search-based systems in industry and to provide more information about the benefits and drawbacks of these systems.

    Parameter Design Strategies: A Comparison Between Human Designers and the Simulated Annealing Algorithm

    Computer-based tools have great potential for facilitating the design of large-scale engineering systems. Interviews with veteran designers of desalination systems revealed that they tend to employ a trial-and-error approach to determine critical design parameters when using software design packages. A series of human experiments was conducted to observe the performance and behavior of test subjects during simulated design processes involving seawater reverse osmosis (SWRO) plants. The subjects were mostly students with a spectrum of knowledge levels in desalination system design. The experiments showed that the top-ranked subjects behaved very differently from the bottom-ranked ones. The problem-solving profiles of the best-performing subjects resembled a well-tuned simulated annealing optimization algorithm, while the worst-performing subjects used a pseudo-random search strategy. This finding could be used to improve computer-based design tools by exploiting the complementary strengths of humans and computers.
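
    For contrast with a pseudo-random search, a minimal simulated-annealing loop of the kind the top subjects' profiles resembled might look like the Python below; the two-parameter plant cost surface is a toy stand-in, not the SWRO simulator used in the experiments.

        import math, random

        def plant_cost(pressure, recovery):          # toy cost surface with one basin
            return (pressure - 60.0) ** 2 / 100 + (recovery - 0.45) ** 2 * 400

        x = [random.uniform(40, 80), random.uniform(0.3, 0.6)]   # initial design
        best, best_cost = x[:], plant_cost(*x)
        T = 1.0
        while T > 1e-3:
            cand = [x[0] + random.gauss(0, 2.0), x[1] + random.gauss(0, 0.02)]
            delta = plant_cost(*cand) - plant_cost(*x)
            if delta < 0 or random.random() < math.exp(-delta / T):
                x = cand                    # accept downhill; uphill with prob e^(-delta/T)
            if plant_cost(*x) < best_cost:
                best, best_cost = x[:], plant_cost(*x)
            T *= 0.995                      # geometric cooling schedule
        print("pressure %.1f bar, recovery %.2f, cost %.3f" % (best[0], best[1], best_cost))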

    Evolutionary algorithms for the multi-objective test data generation problem

    Software: Practice & Experience, 42(11):1331-1362.
    Automatic test data generation is a very popular domain in the field of search-based software engineering. Traditionally, the main goal has been to maximize coverage. However, other objectives can be defined, such as the oracle cost, which is the cost of executing the entire test suite and of checking the system behavior. Indeed, in very large software systems, the cost of testing can be an issue, and it then makes sense to consider two conflicting objectives: maximizing the coverage and minimizing the oracle cost. This is what we do in this paper. We compare two approaches to the multi-objective test data generation problem: a direct multi-objective approach, and a combination of a mono-objective algorithm with multi-objective test case selection. Concretely, we use four state-of-the-art multi-objective algorithms and two mono-objective evolutionary algorithms followed by multi-objective test case selection based on Pareto efficiency. The experimental analysis compares these techniques on two benchmarks. The first is composed of 800 Java programs created with a program generator; the second is composed of 13 real programs extracted from the literature. In the direct multi-objective approach, the results indicate that the oracle cost can be properly optimized; however, full branch coverage of the system poses a great challenge. Although the mono-objective algorithms need a second phase of test case selection to reduce the oracle cost, they are very effective at maximizing branch coverage.
    Funded by the Spanish Ministry of Science and Innovation and FEDER under contract TIN2008-06491-C04-01 (the M project) and the Andalusian Government under contract P07-TIC-03044 (DIRICOM project).
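
    The second phase of the mono-objective approach, test case selection based on Pareto efficiency, can be sketched in Python as follows: keep only the candidates that no other candidate dominates under the two objectives (coverage up, oracle cost down). The candidate data are invented; in the paper they would come from the mono-objective generation phase.

        # (name, branches covered, oracle cost)
        cands = [("t1", {1, 2, 3}, 5.0), ("t2", {1, 2}, 2.0),
                 ("t3", {4}, 1.0), ("t4", {1, 4}, 6.0)]

        def dominates(a, b):
            """a dominates b: no worse on both objectives, strictly better on one."""
            cov_a, cost_a, cov_b, cost_b = len(a[1]), a[2], len(b[1]), b[2]
            return (cov_a >= cov_b and cost_a <= cost_b) and \
                   (cov_a > cov_b or cost_a < cost_b)

        front = [c for c in cands if not any(dominates(o, c) for o in cands)]
        print([c[0] for c in front])   # the non-dominated (Pareto-efficient) test cases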

    A Requirements-Based Partition Testing Framework Using Particle Swarm Optimization Technique

    Modern society is increasingly dependent on the quality of software systems. Software failure can cause severe consequences, including loss of human life. There are various ways of fault prevention and detection that can be deployed in different stages of software development. Testing is the most widely used approach for ensuring software quality. Requirements-Based Testing and Partition Testing are two widely used approaches for testing software systems. Although both techniques are mature and widely addressed in the literature, and despite the general agreement on both of these key techniques of functional testing, their combination lacks a systematic approach. In this thesis, we propose a framework along with a procedural process for testing a system using Requirements-Based Partition Testing (RBPT). This framework helps testers start from the requirements documents and follow a straightforward step-by-step process to generate the required test cases without losing any required data. Although many steps of the process are manual, the framework can be used as a foundation for automating the whole test case generation process. Another issue in testing a software product is the test case selection problem. Choosing appropriate test cases is an essential part of software testing that can lead to significant improvements in efficiency as well as reduced costs of combinatorial testing. Unfortunately, the problem of finding minimum-size test sets is NP-complete in general. Therefore, artificial intelligence-based search algorithms have been widely used for generating near-optimal solutions. In this thesis, we also propose a novel technique for test case generation using Particle Swarm Optimization (PSO), an effective optimization tool that has emerged in the last decade. Empirical studies show that in some domains particle swarm optimization is as well suited as, or even better than, some other techniques. At the same time, a particle swarm algorithm is much simpler, easier to implement, and has only a few parameters for the user to adjust. These properties make PSO an ideal candidate for test case generation. To enable a fair comparison of our newly proposed algorithm against existing techniques, we designed and implemented a framework for automatic evaluation of these methods. Through experiments using our evaluation framework, we illustrate how this new test case generation technique can outperform other existing methodologies.
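
    One plausible fitness for PSO-driven pairwise test generation, sketched in Python below, scores a candidate test by how many still-uncovered 2-way parameter-value combinations it hits; a suite is then grown by repeatedly adding high-fitness tests and removing their pairs from the uncovered set. The parameter domains are invented, and this illustrates the general idea rather than the thesis's exact formulation.

        from itertools import combinations

        DOMAINS = [2, 3, 3, 2]        # sizes of four input parameter domains (assumed)

        def pairs(test):
            """All ((param, value), (param, value)) pairs a concrete test covers."""
            return {((i, test[i]), (j, test[j]))
                    for i, j in combinations(range(len(test)), 2)}

        uncovered = set()             # every 2-way combination still to cover
        for i, j in combinations(range(len(DOMAINS)), 2):
            for vi in range(DOMAINS[i]):
                for vj in range(DOMAINS[j]):
                    uncovered.add(((i, vi), (j, vj)))

        def fitness(test):
            return len(pairs(test) & uncovered)   # greedier tests score higher

        print(fitness([0, 1, 2, 0]))  # one test covers C(4,2) = 6 new pairs initially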

    Search based software engineering: Trends, techniques and applications

    In the past five years there has been a dramatic increase in work on Search-Based Software Engineering (SBSE), an approach to Software Engineering (SE) in which Search-Based Optimization (SBO) algorithms are used to address problems in SE. SBSE has been applied to problems throughout the SE lifecycle, from requirements and project planning to maintenance and reengineering. The approach is attractive because it offers a suite of adaptive, automated, and semi-automated solutions in situations typified by large, complex problem spaces with multiple competing and conflicting objectives. This article provides a review and classification of the SBSE literature, identifies research trends and relationships between the techniques applied and the applications to which they have been applied, and highlights gaps in the literature and avenues for further research.

    Techniques for Low-latency in Software-defined Radio-based Networks

    Decreased budgets have pushed the United States Air Force towards using existing systems in new ways. The use of unmanned aerial vehicle swarms is one example of such reuse. One problem with the increased utilization of these swarms is congestion of the electromagnetic spectrum. Software-defined or cognitive radios have been proposed as the basis for a potentially robust communications solution. The present research develops and tests a genetic algorithm-based cognitive engine as a first step toward real-time engines that could be used in future swarms. Here, latency is the optimization objective of primary importance. In testing the engine, particular items of interest include the number of solutions evaluated within a given bound and the engine's reliability in yielding acceptable network performance. Initial experiments indicate the engine can consider significant portions of the search space within a relatively small bound and that it is efficient at finding highly fit solutions. Future work includes evaluating how well high fitness correlates with acceptable performance and testing the engine with additional noise floors.
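
    A genetic-algorithm cognitive engine of the kind described can be sketched in Python as follows: chromosomes encode radio parameter choices and the fitness is an estimated latency to be minimized. The parameter sets and the latency model are assumptions for illustration, not the engine's actual search space or objective.

        import random

        MODS = [2, 4, 16, 64]                  # modulation orders (1, 2, 4, 6 bits/symbol)
        CODES = [0.5, 0.75, 0.9]               # coding rates

        def latency(ind):
            mod, code = ind
            bits = {2: 1, 4: 2, 16: 4, 64: 6}[mod] * code   # useful bits per symbol
            retx = 1.0 + 0.02 * mod / code     # toy retransmission penalty at high rates
            return retx / bits                 # lower is better

        def breed(a, b):
            child = [random.choice(pair) for pair in zip(a, b)]   # uniform crossover
            if random.random() < 0.2:                             # mutation
                child[0] = random.choice(MODS)
            return child

        pop = [[random.choice(MODS), random.choice(CODES)] for _ in range(20)]
        for _ in range(50):
            pop.sort(key=latency)
            elite = pop[:10]                   # truncation selection
            pop = elite + [breed(random.choice(elite), random.choice(elite))
                           for _ in range(10)]
        print("best config:", min(pop, key=latency))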