Big Data Analysis application in the renewable energy market: wind power
Among renewable energies, wind power is one of the fastest-growing technologies worldwide. However, the uncertainty inherent to wind generation should be minimised in order to better schedule and manage traditional generation assets that compensate for electricity shortfalls in power grids. The emergence of data-driven and machine-learning techniques has made it possible to produce high-resolution spatial and temporal predictions of wind speed and power. In this work, three different ANN models are developed, addressing three major problems in time-series prediction with this technique: data quality assurance and imputation of invalid data, hyperparameter assignment, and feature selection. The developed models are based on clustering, optimisation, and signal-processing techniques, and provide short- and medium-term (minutes to hours) predictions of wind speed and power.
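The pipeline the abstract describes, cleaning invalid readings, building lagged features, and fitting a short-term predictor, can be sketched as follows. This is a minimal illustration on synthetic data with a linear autoregressive baseline standing in for the ANN models; the series, lag count, and sampling interval are assumptions, not the work's actual setup.

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic 10-minute wind-speed series with a few invalid (NaN) readings.
t = np.arange(500)
speed = 8 + 2 * np.sin(2 * np.pi * t / 144) + rng.normal(0, 0.5, t.size)
speed[rng.choice(t.size, 20, replace=False)] = np.nan  # simulated sensor dropouts

# Step 1: data quality / imputation -- replace invalid samples by
# linear interpolation between neighbouring valid readings.
valid = ~np.isnan(speed)
speed_imp = np.interp(t, t[valid], speed[valid])

# Step 2: build lagged features for one-step-ahead prediction
# (here, the previous hour of 10-minute readings).
lags = 6
X = np.column_stack([speed_imp[i:i - lags] for i in range(lags)])
y = speed_imp[lags:]

# Step 3: fit a linear autoregressive baseline via least squares.
A = np.column_stack([np.ones(len(X)), X])
coef, *_ = np.linalg.lstsq(A, y, rcond=None)
rmse = float(np.sqrt(np.mean((A @ coef - y) ** 2)))
print(f"one-step RMSE: {rmse:.3f} m/s")
```

Swapping the least-squares fit for an ANN (and the fixed lag count for a feature-selection step) recovers the structure of the models described above.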
Load forecasting: a cross-field study on server and energy load forecasting. Impact of temporal factors on generalization ability and performance of regression models
Server load prediction and energy load forecasting share a wide range of approaches and
applications, their common goal being the prediction of future load on a given system for
a specific period of time. Depending on the specific goal, different methodologies can be
applied.
This dissertation studies the integration of additional temporal information into datasets
as a means to create a more generalized model. The main steps are: a deep literature
review to find the best-suited methodologies and learning methods; a novel dataset
enrichment process through the integration of extra temporal information; and, lastly, a
cross-model testing stage, in which models trained for server load prediction and energy
load forecasting are applied to the opposite field. This last stage tests and analyses the
generalization achieved by the created models through the temporal information
integration procedure.
Both sets of models were oriented to short-term load forecasting problems, using data
from single and combined months: real data from Wikipedia servers for the year 2016 in
the server load prediction case, and real consumption data for the city of Leiria, Portugal,
from April 2016 in the energy load forecasting case study. The learning methods used to
create the different models were linear regression, artificial neural networks, and support
vector machines for regression, specifically the SMOreg implementation. Results show
that it is possible to tune dataset features, e.g., granularity and time window, to improve
prediction results and generalization. Results from this work, together with an optimization
approach based on genetic algorithms and studies of normalization effects, split ratio
versus cross-validation influence, and different granularities and time windows, were
published in peer-reviewed venues.
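The dataset-enrichment idea, adding temporal information so a regression model generalizes better, can be illustrated with a toy comparison. This is a hedged sketch on synthetic hourly load, not the dissertation's Wikipedia or Leiria data: a last-value baseline is compared against the same linear model enriched with a cyclical hour-of-day encoding.

```python
import numpy as np

rng = np.random.default_rng(1)

# Synthetic hourly load for 30 days with a strong daily cycle plus noise.
hours = np.arange(30 * 24)
load = 100 + 30 * np.sin(2 * np.pi * (hours % 24) / 24) + rng.normal(0, 5, hours.size)

def fit_predict(X, y):
    """Least-squares linear fit with intercept; returns in-sample predictions."""
    A = np.column_stack([np.ones(len(X)), X])
    coef, *_ = np.linalg.lstsq(A, y, rcond=None)
    return A @ coef

def rms(p, y):
    return float(np.sqrt(np.mean((p - y) ** 2)))

y = load[1:]
base = load[:-1].reshape(-1, 1)            # feature set 1: previous hour's load only
hod = hours[1:] % 24
enriched = np.column_stack([               # feature set 2: plus hour-of-day encoding
    base.ravel(),
    np.sin(2 * np.pi * hod / 24),
    np.cos(2 * np.pi * hod / 24),
])

rmse_base = rms(fit_predict(base, y), y)
rmse_enr = rms(fit_predict(enriched, y), y)
print(f"baseline RMSE: {rmse_base:.2f}  enriched RMSE: {rmse_enr:.2f}")
```

The enriched feature set resolves the daily cycle that the lag-only baseline can only approximate, mirroring (in miniature) the granularity and time-window tuning reported above.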
Forecasting: theory and practice
Forecasting has always been at the forefront of decision making and planning. The uncertainty that surrounds the future is both exciting and challenging, with individuals and organisations seeking to minimise risks and maximise utilities. The large number of forecasting applications calls for a diverse set of forecasting methods to tackle real-life challenges. This article provides a non-systematic review of the theory and the practice of forecasting. We provide an overview of a wide range of theoretical, state-of-the-art models, methods, principles, and approaches to prepare, produce, organise, and evaluate forecasts. We then demonstrate how such theoretical concepts are applied in a variety of real-life contexts. We do not claim that this review is an exhaustive list of methods and applications. However, we wish that our encyclopedic presentation will offer a point of reference for the rich work that has been undertaken over the last decades, with some key insights for the future of forecasting theory and practice. Given its encyclopedic nature, the intended mode of reading is non-linear. We offer cross-references to allow the readers to navigate through the various topics. We complement the theoretical concepts and applications covered by large lists of free or open-source software implementations and publicly-available databases
Shortest Route at Dynamic Location with Node Combination-Dijkstra Algorithm
Abstract— Online transportation has become a basic
requirement of the general public, supporting everyday activities
such as commuting to work or school and travelling to tourist
sights. Public transportation services compete to provide the best
service so that consumers feel comfortable using them; one key
aspect is finding the shortest route when picking up a customer
or delivering to a destination. The node combination method can
minimize memory usage and is more optimal than A* and Ant
Colony for shortest-route search in the manner of Dijkstra's
algorithm, but it cannot store the history of nodes that have been
passed. As a result, the node combination algorithm is very good
at finding the shortest distance, but not the shortest route. This
paper modifies the node combination algorithm to solve the
problem of finding the shortest route at a dynamic location
obtained from the transport fleet, displaying the nodes along the
shortest path, and implements it in a geographic information
system in the form of a map to facilitate use of the system.
Keywords— Shortest Path, Dijkstra Algorithm, Node
Combination, Dynamic Location
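For contrast with the node combination method's limitation described above, a classic Dijkstra implementation keeps a predecessor map, which is exactly what allows it to reconstruct the shortest route rather than only the shortest distance. The sketch below uses a toy road network, not the paper's GIS data.

```python
import heapq

def dijkstra(graph, src, dst):
    """Classic Dijkstra on an adjacency dict; returns (distance, path)."""
    dist = {src: 0}
    prev = {}          # predecessor map: this is the route "history"
    pq = [(0, src)]
    visited = set()
    while pq:
        d, u = heapq.heappop(pq)
        if u in visited:
            continue
        visited.add(u)
        if u == dst:
            break
        for v, w in graph.get(u, []):
            nd = d + w
            if nd < dist.get(v, float("inf")):
                dist[v] = nd
                prev[v] = u
                heapq.heappush(pq, (nd, v))
    if dst not in dist:
        return float("inf"), []
    # Walk the predecessor map backwards to recover the actual route.
    path, node = [dst], dst
    while node != src:
        node = prev[node]
        path.append(node)
    return dist[dst], path[::-1]

# Toy road network: node -> list of (neighbour, travel time)
roads = {
    "A": [("B", 4), ("C", 1)],
    "C": [("B", 2), ("D", 5)],
    "B": [("D", 1)],
}
print(dijkstra(roads, "A", "D"))  # -> (4, ['A', 'C', 'B', 'D'])
```

Dropping the `prev` map recovers the memory saving of the node combination variant, at the cost of the path itself, which is the trade-off the paper's modification addresses.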
High-Performance Modelling and Simulation for Big Data Applications
This open access book was prepared as a Final Publication of the COST Action IC1406 "High-Performance Modelling and Simulation for Big Data Applications (cHiPSet)" project. Long considered important pillars of the scientific method, Modelling and Simulation have evolved from traditional discrete numerical methods to complex data-intensive continuous analytical optimisations. Resolution, scale, and accuracy have become essential to predict and analyse natural and complex systems in science and engineering. When their level of abstraction raises to have a better discernment of the domain at hand, their representation gets increasingly demanding for computational and data resources. On the other hand, High Performance Computing typically entails the effective use of parallel and distributed processing units coupled with efficient storage, communication and visualisation systems to underpin complex data-intensive applications in distinct scientific and technical domains. It is then arguably required to have a seamless interaction of High Performance Computing with Modelling and Simulation in order to store, compute, analyse, and visualise large data sets in science and engineering. Funded by the European Commission, cHiPSet has provided a dynamic trans-European forum for their members and distinguished guests to openly discuss novel perspectives and topics of interests for these two communities. This cHiPSet compendium presents a set of selected case studies related to healthcare, biological data, computational advertising, multimedia, finance, bioinformatics, and telecommunications.