
    Which Surrogate Works for Empirical Performance Modelling? A Case Study with Differential Evolution

    It is not uncommon that meta-heuristic algorithms contain some intrinsic parameters, the optimal configuration of which is crucial for achieving their peak performance. However, evaluating the effectiveness of a configuration is expensive, as it involves many costly runs of the target algorithm. Perhaps surprisingly, it is possible to build a cheap-to-evaluate surrogate that models the algorithm's empirical performance as a function of its parameters. Such surrogates constitute an important building block for understanding algorithm performance, algorithm portfolio/selection, and automatic algorithm configuration. In principle, many off-the-shelf machine learning techniques can be used to build surrogates. In this paper, we take differential evolution (DE) as the baseline algorithm for a proof-of-concept study. Regression models are trained to model DE's empirical performance given a parameter configuration. In particular, we evaluate and compare four popular regression algorithms both in terms of how well they predict the empirical performance of a particular parameter configuration and how well they approximate the parameter-versus-performance landscape.
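    To make the surrogate-modelling idea above concrete, here is a minimal sketch in Python: it fits off-the-shelf regressors to a table of (NP, F, CR) configurations and their measured performance, then compares them by predictive accuracy. The regressors, the synthetic data, and the scoring below are illustrative assumptions; the four regression algorithms, benchmarks, and experimental protocol used in the paper are not reproduced here.

```python
# Illustrative sketch: fit cheap surrogate models that map a DE parameter
# configuration (population size NP, scale factor F, crossover rate CR)
# to an observed performance score. The regressors and the synthetic data
# below are placeholders, not those evaluated in the paper.
import numpy as np
from sklearn.ensemble import RandomForestRegressor
from sklearn.neighbors import KNeighborsRegressor
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(0)

# Hypothetical archive of evaluated configurations: columns are (NP, F, CR).
X = np.column_stack([
    rng.integers(20, 200, size=300),   # population size NP
    rng.uniform(0.1, 1.0, size=300),   # scale factor F
    rng.uniform(0.0, 1.0, size=300),   # crossover rate CR
]).astype(float)

# Stand-in for DE's measured empirical performance under each configuration
# (e.g. best objective value reached within a fixed evaluation budget).
y = (np.log(X[:, 0]) - 2.0 * (X[:, 1] - 0.5) ** 2
     + 0.5 * X[:, 2] + rng.normal(scale=0.1, size=300))

surrogates = {
    "random_forest": RandomForestRegressor(n_estimators=200, random_state=0),
    "k_nearest_neighbours": KNeighborsRegressor(n_neighbors=5),
}

# Compare surrogates by how well they predict performance for unseen configurations.
for name, model in surrogates.items():
    r2 = cross_val_score(model, X, y, cv=5, scoring="r2").mean()
    print(f"{name}: mean cross-validated R^2 = {r2:.3f}")
```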

    DeepSQLi: Deep Semantic Learning for Testing SQL Injection

    Security is unarguably the most serious concern for Web applications, and the SQL injection (SQLi) attack is one of the most devastating attacks against them. Automatically testing for SQLi vulnerabilities is of the utmost importance, yet is unfortunately far from trivial to implement. This is because of the huge, potentially infinite, number of variants and semantic possibilities of SQL that can lead to SQLi attacks on various Web applications. In this paper, we propose a tool based on deep natural language processing, dubbed DeepSQLi, to generate test cases for detecting SQLi vulnerabilities. By adopting a deep-learning-based neural language model and sequence-of-words prediction, DeepSQLi is equipped with the ability to learn the semantic knowledge embedded in SQLi attacks, allowing it to translate user inputs (or a test case) into a new test case that is semantically related and potentially more sophisticated. Experiments are conducted to compare DeepSQLi with SQLmap, a state-of-the-art SQLi testing automation tool, on six real-world Web applications of different scales, characteristics and domains. Empirical results demonstrate the effectiveness and the remarkable superiority of DeepSQLi over SQLmap: more SQLi vulnerabilities can be identified with fewer test cases, whilst running much faster.
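    As a rough illustration of the sequence-of-words prediction idea, the sketch below learns token transitions from a handful of known SQLi payloads and extends a seed token into a new candidate test case. The corpus, tokenisation, and bigram model are stand-ins chosen for brevity; DeepSQLi itself uses a deep neural language model, which is not reproduced here.

```python
# Minimal sketch of "sequence-of-words prediction" for test case generation:
# learn token transition statistics from known SQLi payloads and use them to
# extend a seed input into a new, semantically related test case. The bigram
# table here is only a stand-in for the paper's neural language model.
import random
from collections import defaultdict

# Hypothetical corpus of tokenised SQLi payloads (illustrative only).
corpus = [
    ["'", "OR", "1", "=", "1", "--"],
    ["'", "OR", "'a'", "=", "'a'", "--"],
    ["'", "UNION", "SELECT", "username", ",", "password", "FROM", "users", "--"],
    ["'", ";", "DROP", "TABLE", "users", "--"],
]

# Count bigram transitions: token -> possible next tokens.
transitions = defaultdict(list)
for payload in corpus:
    for cur, nxt in zip(payload, payload[1:]):
        transitions[cur].append(nxt)

def generate_test_case(seed_token="'", max_len=10, rng=random.Random(42)):
    """Extend a seed token into a candidate SQLi test case."""
    tokens = [seed_token]
    while len(tokens) < max_len and transitions[tokens[-1]]:
        tokens.append(rng.choice(transitions[tokens[-1]]))
    return " ".join(tokens)

print(generate_test_case())
```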

    Multi-objective Reinforcement Learning Based Multi-microgrid System Optimisation Problem

    This is the author accepted manuscript. The final version is available from Springer Verlag via the DOI in this record. Microgrids with energy storage systems and distributed renewable energy sources play a crucial role in reducing the consumption from traditional power sources and the emission of CO2. Connecting multiple microgrids to a distribution power grid can facilitate more robust and reliable operation and increase the security and privacy of the system. The proposed model consists of three layers: a smart grid layer, an independent system operator (ISO) layer, and a power grid layer. Each layer aims to maximise its benefit. To achieve these objectives, an intelligent multi-microgrid energy management method is proposed based on multi-objective reinforcement learning (MORL) techniques, leading to a Pareto optimal set. A non-dominated solution is selected to implement a fair design that does not favour any particular participant. The simulation results demonstrate the performance of the MORL approach and verify the viability of the proposed method.
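    The final selection step described above can be illustrated with a small sketch: given candidate operating points scored once per layer, keep the non-dominated ones and pick the point whose worst-off layer is best. The benefit values and the max-min fairness rule are assumptions for illustration only, not the paper's MORL procedure.

```python
# Sketch of post-hoc selection from a Pareto set: each candidate holds one
# benefit value per layer; we keep the non-dominated candidates and pick a
# "fair" point that does not favour any single participant. The numbers and
# the max-min fairness rule are illustrative assumptions.
import numpy as np

# Hypothetical benefits for (smart grid layer, ISO layer, power grid layer),
# where larger is better for every layer.
candidates = np.array([
    [0.9, 0.2, 0.4],
    [0.6, 0.6, 0.5],
    [0.5, 0.7, 0.6],
    [0.3, 0.3, 0.3],   # dominated by the second row
    [0.8, 0.4, 0.7],
])

def non_dominated(points):
    """Return the rows not dominated by any other row (maximisation)."""
    keep = []
    for i, p in enumerate(points):
        dominated = any(
            np.all(q >= p) and np.any(q > p)
            for j, q in enumerate(points) if j != i
        )
        if not dominated:
            keep.append(i)
    return points[keep]

pareto = non_dominated(candidates)

# Fair choice: maximise the worst-off layer's normalised benefit.
normalised = (pareto - pareto.min(axis=0)) / (np.ptp(pareto, axis=0) + 1e-12)
fair = pareto[np.argmax(normalised.min(axis=1))]
print("Pareto set:\n", pareto)
print("Fair (max-min) choice:", fair)
```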

    Multidimensional Resource Fragmentation-Aware Virtual Network Embedding in MEC Systems Interconnected by Metro Optical Networks

    The increasing demand for diverse emerging applications has resulted in the interconnection of multi-access edge computing (MEC) systems via metro optical networks. To cater to these diverse applications, network slicing has become a popular tool for creating specialized virtual networks. However, resource fragmentation caused by uneven utilization of multidimensional resources can reduce the utilization of limited edge resources. To tackle this issue, this paper addresses the multidimensional resource fragmentation problem in virtual network embedding (VNE) in MEC systems, with the aim of maximizing the profit of an infrastructure provider (InP). The VNE problem in MEC systems is transformed into a bilevel optimization problem, taking into account the interdependence between virtual node embedding (VNoE) and virtual link embedding (VLiE). To solve this problem, we propose a nested bilevel optimization approach named BiVNE. The VNoE is solved using the ant colony system (ACS) in the upper level, while the VLiE is solved using a combination of a shortest-path algorithm and an exact-fit spectrum slot allocation method in the lower level. Evaluation results show that the BiVNE algorithm can effectively enhance the profit of the InP by increasing the acceptance ratio while avoiding resource fragmentation.
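    As an illustration of the lower-level spectrum assignment mentioned above, the sketch below implements a simple exact-fit rule: among the free contiguous slot blocks on a link, prefer one whose size exactly matches the demand, falling back to the smallest block that still fits. The slot map and the fallback rule are illustrative assumptions rather than the BiVNE implementation.

```python
# Sketch of an "exact-fit" spectrum slot allocation rule: choosing a free
# block that exactly matches the demand avoids leaving small unusable gaps,
# which is the fragmentation effect the paper targets.
from typing import Optional, Tuple

def free_blocks(slots):
    """Yield (start, length) of maximal runs of free (False) slots."""
    start = None
    for i, used in enumerate(slots + [True]):      # sentinel closes the last run
        if not used and start is None:
            start = i
        elif used and start is not None:
            yield start, i - start
            start = None

def exact_fit(slots, demand) -> Optional[Tuple[int, int]]:
    """Return (start, demand) of the chosen block, or None if nothing fits."""
    candidates = [(s, l) for s, l in free_blocks(slots) if l >= demand]
    if not candidates:
        return None
    exact = [c for c in candidates if c[1] == demand]
    start, _ = exact[0] if exact else min(candidates, key=lambda c: c[1])
    return start, demand

# Example: True = occupied slot, False = free slot on one fibre link.
link = [True, False, False, True, False, False, False, True, False, False]
print(exact_fit(link, 2))   # picks the exact-size block starting at index 1
```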

    BiLO-CPDP: Bi-Level Programming for Automated Model Discovery in Cross-Project Defect Prediction

    Cross-Project Defect Prediction (CPDP), which borrows data from similar projects by combining a transfer learner with a classifier, has emerged as a promising way to predict software defects when the available data about the target project is insufficient. However, developing such a model is challenging because it is difficult to determine the right combination of transfer learner and classifier along with their optimal hyper-parameter settings. In this paper, we propose a tool, dubbed BiLO-CPDP, which is the first of its kind to formulate automated CPDP model discovery from the perspective of bi-level programming. In particular, the bi-level programming carries out the optimization at two nested levels in a hierarchical manner. Specifically, the upper-level optimization routine is designed to search for the right combination of transfer learner and classifier, while the nested lower-level optimization routine aims to optimize the corresponding hyper-parameter settings. To evaluate BiLO-CPDP, we conduct experiments on 20 projects to compare it with a total of 21 existing CPDP techniques, along with its single-level optimization variant and Auto-Sklearn, a state-of-the-art automated machine learning tool. Empirical results show that BiLO-CPDP achieves better prediction performance than all 21 existing CPDP techniques on 70% of the projects, while being overwhelmingly superior to Auto-Sklearn and its single-level optimization variant in all cases. Furthermore, the unique bi-level formalization in BiLO-CPDP also permits allocating more budget to the upper level, which significantly boosts the performance.
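    The nested structure described above can be sketched as follows: an upper-level loop enumerates (transfer learner, classifier) combinations, and for each fixed combination a lower-level routine tunes its hyper-parameters before the upper level compares the tuned results. Component names, search spaces, and the scoring function below are placeholders; BiLO-CPDP's actual search operators and evaluation pipeline are not reproduced here.

```python
# Structural sketch of a bi-level model-discovery search: the upper level
# selects the (transfer learner, classifier) combination, the lower level
# tunes that combination's hyper-parameters. All components are placeholders.
import random

rng = random.Random(1)

transfer_learners = ["NNFilter", "TCA"]                  # illustrative choices
classifiers = ["RandomForest", "LogisticRegression"]     # illustrative choices

def evaluate(combo, params) -> float:
    """Stand-in for training the CPDP pipeline and measuring e.g. AUC."""
    base = {"NNFilter": 0.70, "TCA": 0.72}[combo[0]]
    bonus = {"RandomForest": 0.05, "LogisticRegression": 0.03}[combo[1]]
    return base + bonus - abs(params["c"] - 0.4) * 0.1 + rng.gauss(0, 0.005)

def lower_level(combo, budget=20):
    """Tune the hyper-parameters of a fixed combination (random search here)."""
    best_params, best_score = None, float("-inf")
    for _ in range(budget):
        params = {"c": rng.uniform(0.0, 1.0)}
        score = evaluate(combo, params)
        if score > best_score:
            best_params, best_score = params, score
    return best_params, best_score

def upper_level():
    """Search over combinations, each scored by its tuned lower-level result."""
    best = None
    for tl in transfer_learners:
        for clf in classifiers:
            params, score = lower_level((tl, clf))
            if best is None or score > best[2]:
                best = ((tl, clf), params, score)
    return best

print(upper_level())
```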