
    Detecting Clouds in Multispectral Satellite Images Using Quantum-Kernel Support Vector Machines

    Support vector machines (SVMs) are well-established classifiers that have been deployed effectively in a wide array of classification tasks. In this work, we consider extending classical SVMs with quantum kernels and applying them to satellite data analysis. We present the design and implementation of SVMs with quantum kernels (hybrid SVMs). Here, pixels are mapped to a Hilbert space using a family of parameterized quantum feature maps (related to quantum kernels), and the parameters are optimized to maximize the kernel-target alignment. The quantum kernels were selected so that they enable the analysis of numerous relevant properties while remaining simulable on classical computers for a real-life, large-scale dataset. Specifically, we approach the problem of cloud detection in multispectral satellite imagery, which is one of the pivotal steps in both on-the-ground and on-board satellite image analysis processing chains. Experiments performed over the benchmark Landsat-8 multispectral dataset revealed that the simulated hybrid SVM classifies satellite images with accuracy comparable to the classical SVM with the RBF kernel for large datasets. Interestingly, for large datasets, high accuracy was also observed for simple quantum kernels lacking quantum entanglement. (Comment: 12 pages, 10 figures)
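    A minimal, classically simulated sketch of the ingredients the abstract describes: an entanglement-free (product-state) angle-encoding feature map, the resulting quantum kernel, kernel-target alignment as the tuning criterion, and an SVM trained on the precomputed Gram matrix. The feature map, the parameter theta, and the toy data are illustrative assumptions, not the paper's exact construction.

```python
# Hedged sketch: product-state quantum kernel + kernel-target alignment + SVM.
import numpy as np
from sklearn.svm import SVC

def product_state_kernel(X1, X2, theta=1.0):
    """Kernel of a parameterized Ry angle-encoding feature map.

    Each feature x_j is encoded on its own qubit as Ry(theta * x_j)|0>, so the
    fidelity kernel factorizes: k(x, z) = prod_j cos^2(theta * (x_j - z_j) / 2).
    """
    diff = X1[:, None, :] - X2[None, :, :]          # pairwise feature differences
    return np.prod(np.cos(theta * diff / 2.0) ** 2, axis=-1)

def kernel_target_alignment(K, y):
    """Alignment between the Gram matrix K and the ideal kernel y y^T."""
    Y = np.outer(y, y)
    return np.sum(K * Y) / (np.linalg.norm(K) * np.linalg.norm(Y))

# Toy two-class data standing in for (scaled) multispectral pixel vectors.
rng = np.random.default_rng(0)
X = rng.uniform(-1, 1, size=(200, 4))
y = np.where(X[:, 0] + X[:, 1] > 0, 1, -1)

# Choose theta by maximizing kernel-target alignment on a coarse grid.
thetas = np.linspace(0.5, 3.0, 11)
best_theta = max(thetas, key=lambda t: kernel_target_alignment(product_state_kernel(X, X, t), y))

K_train = product_state_kernel(X, X, best_theta)
clf = SVC(kernel="precomputed").fit(K_train, y)
print("alignment:", kernel_target_alignment(K_train, y), "train accuracy:", clf.score(K_train, y))
```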

    Parallel surrogate-assisted global optimization with expensive functions – a survey

    Surrogate-assisted global optimization is gaining popularity. At the same time, modern advances in computing power increasingly rely on parallelization rather than on faster processors. This paper examines some of the methods used to take advantage of parallelization in surrogate-based global optimization. A key issue in this review is how different algorithms balance exploration and exploitation. Most of the papers surveyed describe adaptive samplers that employ Gaussian process or Kriging surrogates. These allow sophisticated approaches for balancing exploration and exploitation, and even make it possible to develop algorithms whose rate of convergence can be calculated as a function of the number of parallel processors. In addition to optimization based on adaptive sampling, surrogate-assisted parallel evolutionary algorithms are also surveyed. Beyond a review of the present state of the art, the paper argues that methods that are easy to parallelize, such as multiple parallel runs, or methods that rely on a population of designs for diversity, deserve more attention. (Funding: United States Department of Energy, National Nuclear Security Administration, Advanced Simulation and Computing Program, cooperative agreement under the Predictive Academic Alliance Program, DE-NA0002378.)
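    A hedged sketch of one way the adaptive samplers in this survey exploit parallelism: a Gaussian-process surrogate proposes a batch of q points per cycle using the "constant liar" heuristic, and the whole batch is then evaluated in parallel. The objective, bounds, batch size, and candidate-sampling scheme are illustrative assumptions, not any specific surveyed algorithm.

```python
# Hedged sketch: batch surrogate-assisted optimization with a GP and constant liar.
import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import Matern
from scipy.stats import norm

def expensive_f(x):                       # stand-in for a costly simulation
    return np.sum((x - 0.3) ** 2, axis=-1)

def expected_improvement(mu, sigma, best):
    sigma = np.maximum(sigma, 1e-12)
    z = (best - mu) / sigma
    return (best - mu) * norm.cdf(z) + sigma * norm.pdf(z)

rng = np.random.default_rng(1)
dim, q = 2, 4                             # q points evaluated in parallel per cycle
X = rng.uniform(0, 1, size=(8, dim))      # initial design
y = expensive_f(X)

for _ in range(5):                        # outer optimization cycles
    Xb, yb = X.copy(), y.copy()
    batch = []
    for _ in range(q):                    # build one batch with the constant-liar trick
        gp = GaussianProcessRegressor(kernel=Matern(nu=2.5), normalize_y=True).fit(Xb, yb)
        cand = rng.uniform(0, 1, size=(512, dim))
        mu, sd = gp.predict(cand, return_std=True)
        x_new = cand[np.argmax(expected_improvement(mu, sd, yb.min()))]
        batch.append(x_new)
        Xb = np.vstack([Xb, x_new])       # "lie": pretend its value is the current best
        yb = np.append(yb, yb.min())
    batch = np.array(batch)
    X = np.vstack([X, batch])             # evaluate the whole batch in parallel
    y = np.append(y, expensive_f(batch))

print("best value found:", y.min())
```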

    A data-driven intelligent decision support system that combines predictive and prescriptive analytics for the design of new textile fabrics

    In this paper, we propose an Intelligent Decision Support System (IDSS) for the design of new textile fabrics. The IDSS uses predictive analytics to estimate fabric properties (e.g., elasticity) and composition values (e.g., % cotton), and then prescriptive techniques to optimize the fabric design inputs that feed the predictive models (e.g., the types of yarns used). Using thousands of data records from a Portuguese textile company, we compared two distinct Machine Learning (ML) predictive approaches: Single-Target Regression (STR), via an Automated ML (AutoML) tool, and Multi-Target Regression, via a deep learning Artificial Neural Network. For the prescriptive analytics, we compared two Evolutionary Multi-objective Optimization (EMO) methods (NSGA-II and R-NSGA-II) when designing 100 new fabrics, aiming to simultaneously minimize the physical-property predictive error and the distance of the optimized values from the learned input space. Overall, the STR approach provided the best results for both prediction tasks, with Normalized Mean Absolute Error values ranging from 4% (weft elasticity) to 11% (pilling) for the fabric properties, and a textile composition classification accuracy of 87% when adopting a small tolerance of 0.01 for predicting the percentages of six types of fibers (e.g., cotton). As for the prescriptive results, they favored the R-NSGA-II EMO method, which tends to select Pareto curves associated with an average 11% predictive error and 16% distance. (Funding: this work was carried out within the project "TexBoost: less Commodities more Specialities", reference POCI-01-0247-FEDER-024523, co-funded by Fundo Europeu de Desenvolvimento Regional (FEDER) through Portugal 2020 (P2020).)
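    A simplified stand-in for the prescriptive step (not the paper's R-NSGA-II): candidate fabric designs are scored on the two objectives the abstract names, (i) predicted-property error against a target and (ii) distance to the learned input space, and the non-dominated (Pareto) set is kept. The predictor, the target value, and the data are toy assumptions.

```python
# Hedged sketch: two-objective design scoring plus a simple non-dominated filter.
import numpy as np

rng = np.random.default_rng(2)
X_hist = rng.uniform(0, 1, size=(500, 6))             # historical fabric design inputs

def predict_property(x):                               # stand-in for the trained ML model
    return 0.4 * x[..., 0] + 0.3 * x[..., 1] ** 2

target = 0.35                                          # desired property value (assumed)

def objectives(designs):
    err = np.abs(predict_property(designs) - target)   # objective 1: predictive error
    dist = np.min(np.linalg.norm(designs[:, None, :] - X_hist[None, :, :], axis=-1), axis=1)
    return np.column_stack([err, dist])                # objective 2: distance to known inputs

def non_dominated(F):
    """Boolean mask of rows not dominated by any other row (minimization)."""
    keep = np.ones(len(F), dtype=bool)
    for i in range(len(F)):
        dominates_i = np.all(F <= F[i], axis=1) & np.any(F < F[i], axis=1)
        if np.any(dominates_i):
            keep[i] = False
    return keep

cands = rng.uniform(0, 1, size=(2000, 6))              # candidate new fabric designs
pareto = cands[non_dominated(objectives(cands))]
print(len(pareto), "non-dominated candidate designs")
```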

    A Comprehensive Review of Bio-Inspired Optimization Algorithms Including Applications in Microelectronics and Nanophotonics

    The application of artificial intelligence in everyday life is becoming all-pervasive and unavoidable. Within that vast field, a special place belongs to biomimetic/bio-inspired algorithms for multiparameter optimization, which find use in a large number of areas. Novel methods and advances are being published at an accelerating pace, so despite the many surveys and reviews in the field, they quickly become dated, and it is important to keep pace with current developments. In this review, we first consider a possible classification of bio-inspired multiparameter optimization methods, because papers dedicated to that question are relatively scarce and often contradictory. We proceed by describing in some detail the more prominent approaches, as well as those published most recently. Finally, we consider the use of biomimetic algorithms in two related and wide-ranging fields: microelectronics (including circuit design optimization) and nanophotonics (including the inverse design of structures such as photonic crystals, nanoplasmonic configurations, and metamaterials). We have attempted to keep this broad survey self-contained, so that it can be of use not only to scholars in the related fields, but also to all those interested in the latest developments in this attractive area.
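    For readers unfamiliar with the class of methods this review covers, a minimal particle swarm optimization (PSO) loop is one concrete example of a bio-inspired multiparameter optimizer. The objective, bounds, and coefficient values below are illustrative assumptions and are not tied to any specific paper in the review.

```python
# Hedged sketch: minimal particle swarm optimization as a bio-inspired optimizer.
import numpy as np

def objective(x):                                   # stand-in design-merit function
    return np.sum(x ** 2, axis=-1)

rng = np.random.default_rng(3)
n_particles, dim, iters = 30, 5, 200
w, c1, c2 = 0.7, 1.5, 1.5                           # inertia and attraction coefficients

pos = rng.uniform(-5, 5, size=(n_particles, dim))
vel = np.zeros_like(pos)
pbest = pos.copy()                                  # personal best positions
pbest_val = objective(pos)
gbest = pbest[np.argmin(pbest_val)]                 # global best position

for _ in range(iters):
    r1, r2 = rng.random((2, n_particles, dim))
    vel = w * vel + c1 * r1 * (pbest - pos) + c2 * r2 * (gbest - pos)
    pos = pos + vel
    val = objective(pos)
    improved = val < pbest_val                      # update personal bests
    pbest[improved], pbest_val[improved] = pos[improved], val[improved]
    gbest = pbest[np.argmin(pbest_val)]             # update global best

print("best value:", objective(gbest))
```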

    Accelerating Manufacturing Decisions using Bayesian Optimization: An Optimization and Prediction Perspective

    Manufacturing is a promising technique for producing complex and custom-made parts with a high degree of precision. It can also provide desired materials and products with specified properties. To achieve that, it is crucial to find the optimum process parameters, which have a significant impact on the properties and quality of the final product. Unfortunately, optimizing these parameters can be challenging due to the complex and nonlinear nature of the underlying process, and it becomes more complicated when there are conflicting, sometimes multiple, objectives. Furthermore, experiments are usually costly and time-consuming, and require expensive materials as well as man and machine hours. Each experiment is therefore valuable, and it is critical to determine the optimal experiment location to gain the most comprehensive understanding of the process. Sequential learning is a promising approach that actively learns from ongoing experiments, iteratively updates the underlying optimization routine, and adapts the data collection process on the go. This thesis presents a multi-objective Bayesian optimization framework for finding the optimum processing conditions for a manufacturing setup. It uses an acquisition function to collect data points sequentially and iteratively updates its understanding of the underlying design space through a Gaussian Process-based surrogate model. In manufacturing processes, the focus is often on obtaining a rough understanding of the design space using minimal experimentation, rather than on finding the optimal parameters; this falls under the category of approximating the underlying function rather than design optimization. Such an approach can provide material scientists or manufacturing engineers with a comprehensive view of the entire design space, increasing the likelihood of making discoveries or robust decisions. However, a good approximation requires a precise and reliable prediction model. To meet this requirement, this thesis proposes an epsilon-greedy sequential prediction framework that is distinct from the optimization framework. The data acquisition strategy is refined to balance exploration and exploitation, and a threshold is established to determine when to switch between the two. The performance of the proposed optimization and prediction frameworks is evaluated on real-life datasets against the traditional design of experiments, and the proposed frameworks generated effective optimization and prediction results using fewer experiments.
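    A hedged, single-objective sketch of the two ideas the abstract combines: a Gaussian-process surrogate refined by sequential data acquisition, and an epsilon-greedy rule that switches between exploitation (best predicted response) and exploration (highest predictive uncertainty). The process model, bounds, budget, and epsilon value are assumptions, not the thesis's actual setup.

```python
# Hedged sketch: GP surrogate with epsilon-greedy sequential data acquisition.
import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import Matern

def run_experiment(x):                      # stand-in for a costly manufacturing trial
    return (x[..., 0] - 0.6) ** 2 + 0.5 * (x[..., 1] - 0.2) ** 2

rng = np.random.default_rng(4)
dim, epsilon, budget = 2, 0.2, 25
X = rng.uniform(0, 1, size=(5, dim))        # small initial design of experiments
y = run_experiment(X)

for _ in range(budget):
    gp = GaussianProcessRegressor(kernel=Matern(nu=2.5), normalize_y=True).fit(X, y)
    cand = rng.uniform(0, 1, size=(256, dim))
    mu, sd = gp.predict(cand, return_std=True)
    if rng.random() < epsilon:              # explore: most uncertain candidate
        x_new = cand[np.argmax(sd)]
    else:                                   # exploit: best predicted response
        x_new = cand[np.argmin(mu)]
    X = np.vstack([X, x_new])               # run the chosen experiment and update data
    y = np.append(y, run_experiment(x_new))

print("best processing condition:", X[np.argmin(y)], "value:", y.min())
```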