84 research outputs found

    Optimized Three Deep Learning Models Based-PSO Hyperparameters for Beijing PM2.5 Prediction

    Deep learning is a machine learning approach that achieves excellent performance in various applications, including natural language processing, image recognition, and forecasting. The performance of a deep learning network depends on its hyperparameter settings. This research optimizes the deep learning architectures of long short-term memory (LSTM), convolutional neural network (CNN), and multilayer perceptron (MLP) networks for forecasting tasks using particle swarm optimization (PSO), a swarm-intelligence-based metaheuristic optimization method, yielding the proposed models M-1 (PSO-LSTM), M-2 (PSO-CNN), and M-3 (PSO-MLP). The Beijing PM2.5 dataset was analyzed to measure the performance of the proposed models. The target variable, PM2.5, is affected by dew point, pressure, temperature, cumulated wind speed, hours of snow, and hours of rain. The network inputs follow three scenarios: daily, weekly, and monthly. The results show that the proposed M-1 with three hidden layers achieves the best RMSE and MAPE compared with the proposed M-2, M-3, and all baselines. These optimized models could inform recommendations for air pollution management.
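
    As an illustration of how PSO can tune such hyperparameters, the sketch below searches over the hidden-unit counts of three LSTM layers and the learning rate. It is a minimal sketch, not the paper's implementation: the bounds, swarm settings, and the evaluate_rmse placeholder (which, in the paper's setting, would train an LSTM on the Beijing PM2.5 data and return validation RMSE) are assumptions made here for illustration.

        # Minimal PSO sketch for LSTM hyperparameter search.
        # evaluate_rmse is a placeholder: replace it with code that builds a
        # three-hidden-layer LSTM from the decoded hyperparameters, trains it on
        # the PM2.5 training split, and returns validation RMSE.
        import numpy as np

        rng = np.random.default_rng(0)

        # Search space: [units_layer1, units_layer2, units_layer3, log10(learning_rate)]
        lower = np.array([16, 16, 16, -4.0])
        upper = np.array([256, 256, 256, -2.0])

        def evaluate_rmse(position):
            """Placeholder fitness; a synthetic stand-in so the sketch runs end to end."""
            units = np.round(position[:3])
            lr = 10 ** position[3]
            return np.sum((units - 128) ** 2) / 1e4 + (np.log10(lr) + 3.0) ** 2

        n_particles, n_iter = 20, 50
        w, c1, c2 = 0.7, 1.5, 1.5                    # inertia and acceleration coefficients

        pos = rng.uniform(lower, upper, size=(n_particles, lower.size))
        vel = np.zeros_like(pos)
        pbest_pos = pos.copy()
        pbest_val = np.array([evaluate_rmse(p) for p in pos])
        gbest_pos = pbest_pos[pbest_val.argmin()].copy()

        for _ in range(n_iter):
            r1, r2 = rng.random(pos.shape), rng.random(pos.shape)
            # Standard velocity update: inertia + cognitive pull + social pull.
            vel = w * vel + c1 * r1 * (pbest_pos - pos) + c2 * r2 * (gbest_pos - pos)
            pos = np.clip(pos + vel, lower, upper)
            vals = np.array([evaluate_rmse(p) for p in pos])
            improved = vals < pbest_val
            pbest_pos[improved], pbest_val[improved] = pos[improved], vals[improved]
            gbest_pos = pbest_pos[pbest_val.argmin()].copy()

        print("best hidden units:", np.round(gbest_pos[:3]), "learning rate:", 10 ** gbest_pos[3])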

    Flood Forecasting Using Machine Learning Methods

    This book is a printed edition of the Special Issue "Flood Forecasting Using Machine Learning Methods" that was published in Water.

    Distributed and Lightweight Meta-heuristic Optimization method for Complex Problems

    The world is becoming larger and more complex every day. Resources are limited, and using them efficiently is one of the most important requirements. Finding an efficient and optimal solution to complex problems requires practical methods. Over the last decades, several optimization approaches have been presented that can be applied to different optimization problems and that achieve different performance on various problems. Different parameters, such as the type of search space, can have a significant effect on the results. Of the two main categories of optimization methods (deterministic and stochastic), stochastic optimization methods work more efficiently on large, complex problems than deterministic methods. In highly complex problems, however, stochastic optimization methods also have issues, such as execution time, convergence to local optima, incompatibility with distributed systems, and dependence on the type of search space. This thesis therefore presents a distributed and lightweight metaheuristic optimization method (MICGA) for complex problems, focusing on four main tracks. 1) The primary goal is to improve execution time with MICGA. 2) The proposed method increases the stability and reliability of the results by using a multi-population strategy. 3) MICGA is compatible with distributed systems. 4) Finally, MICGA is applied to different types of optimization problems with different kinds of search spaces (continuous, discrete, and order-based optimization problems). MICGA has been compared with other efficient optimization approaches. The results show that the proposed work achieves substantial improvement on the main issues of stochastic methods mentioned above.
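
    The multi-population strategy mentioned above is commonly realised as an island model, in which several sub-populations evolve independently and periodically exchange their best individuals. The sketch below is a minimal, generic island-model genetic algorithm on a continuous search space; it is not the MICGA implementation itself, and the population sizes, migration interval, operators, and sphere objective are illustrative assumptions. In a distributed setting, each island would run in its own process or node, with migration as the only communication step.

        # Generic island-model (multi-population) genetic algorithm sketch.
        import numpy as np

        rng = np.random.default_rng(1)
        DIM, ISLANDS, POP, GENS, MIGRATE_EVERY = 10, 4, 30, 200, 25

        def objective(x):
            return np.sum(x ** 2)            # stand-in objective to minimise (sphere function)

        def evolve(pop, fit):
            """One generation: tournament selection, midpoint crossover, Gaussian mutation."""
            new = []
            for _ in range(len(pop)):
                a, b = rng.integers(len(pop), size=2)
                p1 = pop[a] if fit[a] < fit[b] else pop[b]
                a, b = rng.integers(len(pop), size=2)
                p2 = pop[a] if fit[a] < fit[b] else pop[b]
                new.append(0.5 * (p1 + p2) + rng.normal(0, 0.1, DIM))
            return np.array(new)

        islands = [rng.uniform(-5, 5, (POP, DIM)) for _ in range(ISLANDS)]
        for gen in range(GENS):
            fits = [np.apply_along_axis(objective, 1, isl) for isl in islands]
            islands = [evolve(isl, f) for isl, f in zip(islands, fits)]
            if gen % MIGRATE_EVERY == 0:
                # Ring migration: each island's best individual replaces the worst
                # individual of the next island, keeping sub-populations loosely coupled.
                bests = [isl[np.apply_along_axis(objective, 1, isl).argmin()].copy()
                         for isl in islands]
                for i, isl in enumerate(islands):
                    worst = np.apply_along_axis(objective, 1, isl).argmax()
                    isl[worst] = bests[(i - 1) % ISLANDS]

        best = min((isl[np.apply_along_axis(objective, 1, isl).argmin()] for isl in islands),
                   key=objective)
        print("best value found:", objective(best))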

    High-Performance Modelling and Simulation for Big Data Applications

    This open access book was prepared as a final publication of the COST Action IC1406 "High-Performance Modelling and Simulation for Big Data Applications (cHiPSet)" project. Long considered important pillars of the scientific method, Modelling and Simulation have evolved from traditional discrete numerical methods to complex, data-intensive, continuous analytical optimisations. Resolution, scale, and accuracy have become essential to predicting and analysing natural and complex systems in science and engineering. As their level of abstraction rises to give a better discernment of the domain at hand, their representation becomes increasingly demanding of computational and data resources. On the other hand, High Performance Computing typically entails the effective use of parallel and distributed processing units coupled with efficient storage, communication, and visualisation systems to underpin complex, data-intensive applications in distinct scientific and technical domains. A seamless interaction of High Performance Computing with Modelling and Simulation is therefore arguably required in order to store, compute, analyse, and visualise large data sets in science and engineering. Funded by the European Commission, cHiPSet has provided a dynamic trans-European forum for its members and distinguished guests to openly discuss novel perspectives and topics of interest for these two communities. This cHiPSet compendium presents a set of selected case studies related to healthcare, biological data, computational advertising, multimedia, finance, bioinformatics, and telecommunications.

    Sustainable Agriculture and Advances of Remote Sensing (Volume 1)

    Agriculture, as the main source of alimentation and the most important economic activity globally, is being affected by the impacts of climate change. To maintain and increase the production of our global food system, to reduce biodiversity loss, and to preserve our natural ecosystems, new practices and technologies are required. This book focuses on the latest advances in remote sensing technology and agricultural engineering leading to sustainable agriculture practices. Earth observation data and in situ and proxy remote sensing data are the main sources of information for monitoring and analyzing agricultural activities. Particular attention is given to Earth observation satellites and the Internet of Things for data collection, to multispectral and hyperspectral data analysis using machine learning and deep learning, and to WebGIS and the Internet of Things for sharing and publishing the results, among others.

    Proceedings of the 7th International Conference on Functional-Structural Plant Models, Saariselkä, Finland, 9 - 14 June 2013
