
    Which Surrogate Works for Empirical Performance Modelling? A Case Study with Differential Evolution

    It is not uncommon for meta-heuristic algorithms to contain intrinsic parameters whose optimal configuration is crucial for achieving peak performance. However, evaluating the effectiveness of a configuration is expensive, as it involves many costly runs of the target algorithm. Perhaps surprisingly, it is possible to build a cheap-to-evaluate surrogate that models the algorithm's empirical performance as a function of its parameters. Such surrogates constitute an important building block for understanding algorithm performance, for algorithm portfolio/selection, and for automatic algorithm configuration. In principle, many off-the-shelf machine learning techniques can be used to build surrogates. In this paper, we take differential evolution (DE) as the baseline algorithm for a proof-of-concept study. Regression models are trained to predict DE's empirical performance given a parameter configuration. In particular, we evaluate and compare four popular regression algorithms in terms of both how well they predict the empirical performance of a particular parameter configuration and how well they approximate the parameter-versus-performance landscape.
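    As a minimal sketch of the surrogate-modelling idea described above, the snippet below samples DE parameter configurations (population size multiplier, mutation factor F, crossover rate CR), measures the empirical performance of each configuration by actually running DE, and fits a regression model to the results. The abstract does not name the four regression algorithms compared, so a random forest is used here purely as an illustrative choice, and a 5-D Sphere function stands in for the unspecified benchmark problem.

    # Sketch: build a cheap-to-evaluate surrogate of DE's empirical performance
    # as a function of its parameters. The regressor (random forest) and the
    # benchmark (Sphere) are illustrative assumptions, not the paper's setup.
    import numpy as np
    from scipy.optimize import differential_evolution
    from sklearn.ensemble import RandomForestRegressor
    from sklearn.model_selection import train_test_split

    rng = np.random.default_rng(0)

    def de_performance(popsize, mutation, recombination):
        """Run DE once on a 5-D Sphere function; return the best value found."""
        result = differential_evolution(
            lambda x: np.sum(x ** 2),
            bounds=[(-5.0, 5.0)] * 5,
            popsize=int(popsize),        # population size multiplier
            mutation=mutation,           # F, must lie in [0, 2)
            recombination=recombination, # CR, must lie in [0, 1]
            maxiter=50,
            seed=1,
            polish=False,
        )
        return result.fun

    # Sample parameter configurations and measure empirical performance.
    X = np.column_stack([
        rng.integers(5, 40, size=100),    # population size multiplier
        rng.uniform(0.1, 1.9, size=100),  # mutation factor F
        rng.uniform(0.0, 1.0, size=100),  # crossover rate CR
    ])
    y = np.array([de_performance(*row) for row in X])

    # Train the surrogate and check how well it predicts unseen configurations.
    X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)
    surrogate = RandomForestRegressor(n_estimators=200, random_state=0)
    surrogate.fit(X_train, y_train)
    print("R^2 on held-out configurations:", surrogate.score(X_test, y_test))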

    An improved deepfake detection method based on CNNs

    Today's image generation technology can produce high-quality face images, and it is difficult for the human eye to judge the authenticity of the generated images. This study aims to improve the detection of deepfakes, a form of face-swapping forgery, by drawing on advances in deep learning. To address poor detection performance across data sets, a unified, enhanced data set is generated from multiple sources using spatial enhancement techniques. Combining the strengths of the Inception and ResNet architectures, a new deepfake detection model composed of 20 network layers is proposed, and its hyperparameter values are optimized to further improve it. Experimental results show that the proposed network significantly outperforms mainstream methods such as ResNeXt50, ResNet101, XceptionNet, and VGG19 in terms of accuracy, loss value, AUC, number of parameters, and FLOPs. Overall, the methods introduced in this study help to expand the data set, better detect deepfake content, and effectively optimize the network model.
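    The abstract does not give the layer-by-layer layout of the proposed 20-layer network, so the sketch below illustrates only the core idea it names: an Inception-style multi-branch convolution wrapped in a ResNet-style skip connection, followed by a binary real/fake classification head. All channel widths, block counts, and layer choices are illustrative assumptions, written here in PyTorch.

    # Sketch of an Inception/ResNet hybrid for deepfake detection. The exact
    # 20-layer architecture is the paper's; everything concrete below is an
    # illustrative assumption.
    import torch
    import torch.nn as nn

    class InceptionResidualBlock(nn.Module):
        def __init__(self, channels):  # channels is assumed divisible by 4
            super().__init__()
            # Parallel branches with different receptive fields (Inception idea).
            self.branch1 = nn.Conv2d(channels, channels // 2, kernel_size=1)
            self.branch3 = nn.Conv2d(channels, channels // 4, kernel_size=3, padding=1)
            self.branch5 = nn.Conv2d(channels, channels // 4, kernel_size=5, padding=2)
            # 1x1 projection back to the input width so the skip connection adds.
            self.project = nn.Conv2d(channels, channels, kernel_size=1)
            self.bn = nn.BatchNorm2d(channels)
            self.act = nn.ReLU(inplace=True)

        def forward(self, x):
            branches = torch.cat(
                [self.branch1(x), self.branch3(x), self.branch5(x)], dim=1
            )
            # ResNet idea: add the block's output to its input.
            return self.act(self.bn(x + self.project(branches)))

    class DeepfakeDetector(nn.Module):
        def __init__(self, num_blocks=4, channels=64):
            super().__init__()
            self.stem = nn.Conv2d(3, channels, kernel_size=7, stride=2, padding=3)
            self.blocks = nn.Sequential(
                *[InceptionResidualBlock(channels) for _ in range(num_blocks)]
            )
            self.head = nn.Sequential(
                nn.AdaptiveAvgPool2d(1), nn.Flatten(), nn.Linear(channels, 1)
            )

        def forward(self, x):
            return self.head(self.blocks(self.stem(x)))  # one logit: real vs. fake

    logits = DeepfakeDetector()(torch.randn(2, 3, 224, 224))
    print(logits.shape)  # torch.Size([2, 1])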
