67 research outputs found

    Data Challenges and Data Analytics Solutions for Power Systems

    The abstract is in the attachment.

    Improving Demand Forecasting: The Challenge of Forecasting Studies Comparability and a Novel Approach to Hierarchical Time Series Forecasting

    Demand forecasts are indispensable in business. Based on expected customer demand, companies decide, for example, which products to develop, how many factories to build, how many staff to hire, or how much raw material to order. Errors in demand forecasts can have serious consequences, lead to poor decisions, and in the worst case drive a company into bankruptcy. In many cases, however, anticipating actual future demand is complex. The influencing factors can be manifold, for example macroeconomic developments, the behavior of competitors, or technological developments. Even when all influencing factors are known, the relationships and interactions are often hard to quantify. This dissertation contributes to improving the accuracy of demand forecasts. In the first part of the thesis, within a comprehensive survey of the full range of application fields of demand forecasting, a novel approach for systematically comparing demand forecasting studies is introduced and applied to 116 recent studies. Improving the comparability of studies is a substantial contribution to current research: unlike in, say, medical research, there are no substantial comparative quantitative meta-studies for demand forecasting. The reason is that empirical demand forecasting studies do not use a standardized description of their data, methods, and results. If studies can instead be compared directly through a systematic description, other researchers can better analyze how variations in approach affect forecast accuracy, without the costly need to re-run empirical experiments that have already been described in the literature.
    This thesis is the first to introduce such a descriptive framework. The remainder of the thesis addresses forecasting methods for intermittent time series, i.e., time series with a substantial share of zero demands. Such series violate the continuity assumptions of most forecasting methods, which is why common methods often achieve insufficient forecast accuracy. Nevertheless, intermittent time series are highly relevant; spare parts in particular typically exhibit this demand pattern. First, three studies in this thesis show that even the tested state-of-the-art machine learning approaches yield no general improvement on several well-known datasets. As a key contribution to research, the thesis then introduces a novel method: the Similarity-based Time Series Forecasting (STSF) approach uses an aggregation-disaggregation procedure based on a self-generated hierarchy of statistical properties of the time series. With STSF, any available forecasting algorithm can be employed, since the aggregation satisfies the continuity requirement. In experiments on a total of seven public datasets and one proprietary dataset, the thesis shows that forecast accuracy (measured by the root mean square error, RMSE) improves by a statistically significant 1-5% on average over the same method without STSF. The method thus delivers a substantial improvement in forecast accuracy. In summary, this dissertation makes a substantial contribution to the current state of research through the methods named above. The proposed framework for standardizing empirical studies accelerates research progress by enabling comparative studies.
    And with the STSF method, an approach is available that reliably improves forecast accuracy while remaining flexible enough to be used with different kinds of forecasting algorithms. To the best of the knowledge gained from the comprehensive literature review, no comparable approaches have been described to date.
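The aggregation-disaggregation idea behind STSF can be sketched in a few lines. This is a hypothetical illustration, not the thesis's implementation: the mean-based forecaster, the grouping, and the share-based disaggregation rule are assumptions chosen to keep the example self-contained.

```python
# Illustrative sketch of forecasting intermittent series by aggregation:
# sum a group of similar series (removing most zeros), forecast the
# smoother aggregate, then disaggregate by historical demand shares.
import numpy as np

def forecast_by_aggregation(series_group, horizon=1):
    """Return one forecast per series via aggregate-forecast-disaggregate."""
    group = np.asarray(series_group, dtype=float)  # shape: (n_series, n_periods)
    aggregate = group.sum(axis=0)                  # aggregation removes many zeros
    agg_forecast = aggregate.mean()                # placeholder for any forecaster
    totals = group.sum(axis=1)
    shares = totals / totals.sum()                 # historical demand shares
    return shares * agg_forecast                   # per-series forecast

demand = [[0, 3, 0, 0, 2, 0],
          [1, 0, 0, 4, 0, 0],
          [0, 0, 2, 0, 0, 3]]
print(forecast_by_aggregation(demand))
```

Because the forecaster only ever sees the aggregate, any standard algorithm can be slotted in, which mirrors the flexibility claimed for STSF above.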

    Multistep ahead time series prediction

    Time series analysis has been the subject of extensive interest in many fields of study, ranging from weather forecasting to economic prediction, over the past two centuries. It has been fundamental to our understanding of previous patterns within data and has also been used to make predictions over both short- and long-term horizons. When approaching such problems, researchers would typically analyze the given series for a number of distinct characteristics and select the most appropriate technique. However, aligning a set of characteristics with a method has increased in complexity with the advent of machine learning and the introduction of Multi-Step Ahead Prediction (MSAP). We examine the model/strategy approaches currently applied to conduct multi-step ahead prediction in time series data and propose an alternative MSAP strategy known as Multi-Resolution Forecast Aggregation (MRFA). Typically, when researchers propose an alternative strategy or method, they demonstrate it on a relatively small set of time series, so the general breadth of use is unknown. We propose a process that generates a diverse set of synthetic time series, enabling a robust examination of MRFA and other methods/strategies. This dataset, in conjunction with a range of popular prediction methods and MSAP strategies, is then used to develop a meta-learner that estimates the normalized mean square error of the prediction approach for a given time series.
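The model/strategy distinction in MSAP is commonly illustrated by the two classic strategies: recursive (one model fed its own predictions) and direct (one model per horizon). The toy one-step model below is an assumption for illustration; MRFA itself is not reproduced here.

```python
# Two standard multi-step-ahead prediction strategies, side by side.

def recursive_forecast(history, one_step_model, horizon):
    """Recursive strategy: apply a single one-step model repeatedly,
    feeding each prediction back in as the newest observation."""
    window = list(history)
    preds = []
    for _ in range(horizon):
        y_hat = one_step_model(window)
        preds.append(y_hat)
        window.append(y_hat)          # prediction becomes pseudo-observation
    return preds

def direct_forecast(history, step_models):
    """Direct strategy: a dedicated model per horizon, each predicting
    h steps ahead straight from the observed history."""
    return [model(history) for model in step_models]

history = [1.0, 2.0, 3.0, 4.0]
last_mean = lambda w: sum(w[-2:]) / 2   # toy one-step model (assumption)
print(recursive_forecast(history, last_mean, horizon=3))  # [3.5, 3.75, 3.625]
print(direct_forecast(history, [last_mean] * 3))          # [3.5, 3.5, 3.5]
```

The contrast shows why strategy choice matters: the recursive strategy compounds its own errors, while the direct strategy needs a separate model per step.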

    Evolving machine learning and deep learning models using evolutionary algorithms

    Despite their great success in data mining, machine learning and deep learning models are still subject to material obstacles when tackling real-life challenges, such as feature selection, initialization sensitivity, and hyperparameter optimization. The prevalence of these obstacles has severely constrained conventional machine learning and deep learning methods from fulfilling their potential. In this research, three evolving machine learning models and one evolving deep learning model are proposed to eliminate the above bottlenecks, i.e. improving model initialization, enhancing feature representation, and optimizing model configuration, respectively, through hybridization between advanced evolutionary algorithms and conventional ML and DL methods. Specifically, two Firefly Algorithm based evolutionary clustering models are proposed to optimize cluster centroids in K-means and overcome initialization sensitivity as well as local stagnation. Secondly, a Particle Swarm Optimization based evolving feature selection model is developed for automatic identification of the most effective feature subset and reduction of feature dimensionality for tackling classification problems. Lastly, a Grey Wolf Optimizer based evolving Convolutional Neural Network-Long Short-Term Memory method is devised for automatic generation of the optimal topological and learning configurations for Convolutional Neural Network-Long Short-Term Memory networks to undertake multivariate time series prediction problems. Moreover, a variety of tailored search strategies are proposed to eliminate the intrinsic limitations embedded in the search mechanisms of the three employed evolutionary algorithms, i.e. the dictation of the global best signal in Particle Swarm Optimization, the constraint of the diagonal movement in Firefly Algorithm, and the acute contraction of search territory in Grey Wolf Optimizer, respectively.
The remedy strategies include the diversification of guiding signals, the adaptive nonlinear search parameters, the hybrid position updating mechanisms, as well as the enhancement of population leaders. As such, the enhanced Particle Swarm Optimization, Firefly Algorithm, and Grey Wolf Optimizer variants are more likely to attain global optimality on complex search landscapes embedded in data mining problems, owing to the elevated search diversity as well as the achievement of advanced trade-offs between exploration and exploitation.
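To make the "dictation of the global best signal" concrete, here is a bare-bones Particle Swarm Optimization run on the sphere function. All hyperparameters and the test function are illustrative assumptions, not the thesis's configurations; the thesis's variants replace or diversify the single `gbest` pull shown here.

```python
# Minimal PSO on f(x) = sum(x**2); the c2 term is the "global best"
# guiding signal that the remedy strategies above diversify.
import numpy as np

rng = np.random.default_rng(0)  # fixed seed for reproducibility

def pso_sphere(n_particles=20, dims=2, iters=200, w=0.7, c1=1.5, c2=1.5):
    pos = rng.uniform(-5, 5, (n_particles, dims))
    vel = np.zeros_like(pos)
    pbest = pos.copy()                       # each particle's best position
    pbest_val = (pbest ** 2).sum(axis=1)
    gbest = pbest[pbest_val.argmin()].copy() # single swarm-wide best
    for _ in range(iters):
        r1, r2 = rng.random(pos.shape), rng.random(pos.shape)
        # velocity = inertia + pull to personal best + pull to global best
        vel = w * vel + c1 * r1 * (pbest - pos) + c2 * r2 * (gbest - pos)
        pos = pos + vel
        vals = (pos ** 2).sum(axis=1)
        improved = vals < pbest_val
        pbest[improved], pbest_val[improved] = pos[improved], vals[improved]
        gbest = pbest[pbest_val.argmin()].copy()
    return gbest

print(pso_sphere())  # converges near the origin
```

Because every particle is attracted to the same `gbest`, premature convergence on multimodal landscapes is a known risk, which is the motivation for the diversified guiding signals described above.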

    A hierarchical methodology for vessel traffic flow prediction using Bayesian tensor decomposition and similarity grouping

    Accurate vessel traffic flow (VTF) prediction can enhance navigation safety and economic efficiency. To address the challenge of the inherently complex and dynamic growth of the VTF time series, a new hierarchical methodology for VTF prediction is proposed. Firstly, the original VTF data is reconfigured as a three-dimensional tensor by a modified Bayesian Gaussian CANDECOMP/PARAFAC (BGCP) tensor decomposition model. Secondly, the VTF matrix (hour ✕ day) of each week is decomposed into high- and low-frequency matrices using a Bidimensional Empirical Mode Decomposition (BEMD) model to address the non-stationary signals affecting prediction results. Thirdly, the self-similarities between the VTF matrices of each week within the high-frequency tensor are utilised to rearrange the matrices as different one-dimensional time series to address the weak mathematical regularity in the high-frequency matrix. Then, a Dynamic Time Warping (DTW) model is employed to identify grouped segments with high similarities to generate more suitable high-frequency tensors. The experimental results verify that the proposed methodology outperforms the state-of-the-art VTF prediction methods on real Automatic Identification System (AIS) datasets collected from two areas. The methodology can potentially optimise operations and vessel traffic management, benefiting stakeholders such as port authorities, ship operators, and freight forwarders.
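Dynamic Time Warping, the similarity measure used for the grouping step above, can be computed with a small dynamic program. The toy series below are illustrative; the methodology's weekly traffic matrices would be flattened into such one-dimensional series first.

```python
# Classic DTW distance: cost[i][j] = |a_i - b_j| plus the cheapest of the
# three predecessor cells, yielding an alignment-invariant distance.
import numpy as np

def dtw_distance(a, b):
    n, m = len(a), len(b)
    cost = np.full((n + 1, m + 1), np.inf)
    cost[0, 0] = 0.0
    for i in range(1, n + 1):
        for j in range(1, m + 1):
            d = abs(a[i - 1] - b[j - 1])
            # match, insertion, or deletion -- take the cheapest path
            cost[i, j] = d + min(cost[i - 1, j],
                                 cost[i, j - 1],
                                 cost[i - 1, j - 1])
    return cost[n, m]

# A time-shifted copy of a series has DTW distance 0, unlike Euclidean distance.
print(dtw_distance([0, 1, 2, 1], [0, 0, 1, 2, 1]))
```

This alignment invariance is what lets DTW group weekly segments whose traffic peaks occur at slightly different hours.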

    Recent Developments in Smart Healthcare

    Medicine is undergoing a sector-wide transformation thanks to advances in computing and networking technologies. Healthcare is changing from reactive and hospital-centered to preventive and personalized, from disease-focused to well-being-centered. In essence, healthcare systems, as well as fundamental medicine research, are becoming smarter. We anticipate significant improvements in areas ranging from molecular genomics and proteomics to decision support for healthcare professionals through big data analytics, to support for behavior change through technology-enabled self-management, and social and motivational support. Furthermore, with smart technologies, healthcare delivery could also be made more efficient, higher quality, and lower cost. In this special issue, we received a total of 45 submissions and accepted 19 outstanding papers that roughly span several interesting topics in smart healthcare, including public health, health information technology (Health IT), and smart medicine.

    Advances in Computational Intelligence Applications in the Mining Industry

    This book captures advancements in the applications of computational intelligence (artificial intelligence, machine learning, etc.) to problems in the mineral and mining industries. The papers present the state of the art in four broad categories: mine operations, mine planning, mine safety, and advances in the sciences, primarily in image processing applications. Authors in the book include both researchers and industry practitioners.

    Evolutionary Computation 2020

    Intelligent optimization is based on the mechanisms of computational intelligence: refining a suitable feature model, designing an effective optimization algorithm, and then obtaining an optimal or satisfactory solution to a complex problem. Intelligent algorithms are key tools for ensuring global optimization quality, fast optimization efficiency, and robust optimization performance. Intelligent optimization algorithms have been studied by many researchers, leading to improvements in the performance of algorithms such as the evolutionary algorithm, the whale optimization algorithm, the differential evolution algorithm, and particle swarm optimization. Studies in this arena have also resulted in breakthroughs in solving complex problems, including the green shop scheduling problem, the severe nonlinear problem in one-dimensional geodesic electromagnetic inversion, the error- and bug-finding problem in software, the 0-1 knapsack problem, the traveling salesman problem, and the logistics distribution center siting problem. The editors are confident that this book can open a new avenue for further improvement and discoveries in the area of intelligent algorithms. The book is a valuable resource for researchers interested in understanding the principles and design of intelligent algorithms.
    • 
