Research and Education in Computational Science and Engineering
Over the past two decades the field of computational science and engineering
(CSE) has penetrated both basic and applied research in academia, industry, and
laboratories to advance discovery, optimize systems, support decision-makers,
and educate the scientific and engineering workforce. Informed by centuries of
theory and experiment, CSE performs computational experiments to answer
questions that neither theory nor experiment alone is equipped to answer. CSE
provides scientists and engineers of all persuasions with algorithmic
inventions and software systems that transcend disciplines and scales. Carried
on a wave of digital technology, CSE brings the power of parallelism to bear on
troves of data. Mathematics-based advanced computing has become a prevalent
means of discovery and innovation in essentially all areas of science,
engineering, technology, and society; and the CSE community is at the core of
this transformation. However, a combination of disruptive
developments---including the architectural complexity of extreme-scale
computing, the data revolution that engulfs the planet, and the specialization
required to follow the applications to new frontiers---is redefining the scope
and reach of the CSE endeavor. This report describes the rapid expansion of CSE
and the challenges to sustaining its bold advances. The report also presents
strategies and directions for CSE research and education for the next decade.
Comment: Major revision, to appear in SIAM Review
Evolutionary optimization of sparsely connected and time-lagged neural networks for time series forecasting
Time Series Forecasting (TSF) is an important tool to support decision making (e.g., planning production resources). Artificial Neural Networks (ANN) are innate candidates for TSF due to advantages such as nonlinear learning and noise tolerance. However, the search for the best model is a complex task that highly affects forecasting performance. In this work, we propose two novel Evolutionary Artificial Neural Network (EANN) approaches for TSF based on an Estimation of Distribution Algorithm (EDA) search engine. The first approach, Sparsely connected Evolutionary ANN (SEANN), evolves more flexible ANN structures to perform multi-step-ahead forecasts. The second, an automatic Time lag feature selection EANN (TEANN), evolves not only ANN parameters (e.g., input and hidden nodes, training parameters) but also which set of time lags is fed into the forecasting model. Several experiments were conducted using a set of six time series from different real-world domains, and two error metrics (Mean Squared Error and Symmetric Mean Absolute Percentage Error) were analyzed. The two EANN approaches were compared against a base EANN (with no ANN structure or time lag optimization) and four other methods (Autoregressive Integrated Moving Average, Random Forest, Echo State Network and Support Vector Machine). Overall, the proposed SEANN and TEANN methods obtained the best forecasting results. Moreover, they favor simpler neural network models, thus requiring less computational effort than the base EANN.
The research reported here has been supported by the Spanish Ministry of Science and Innovation under project TRA2010-21371-C03-03 and FCT - Fundacao para a Ciencia e Tecnologia within the Project Scope PEst-OE/EEI/UI0319/2014. The authors especially want to thank Martin Stepnicka and Lenka Vavrickova for all their help. The authors also want to thank Ramon Sagarna for introducing the subject of EDA.
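As a minimal illustration of the time-lag selection idea behind TEANN, the sketch below exhaustively scores small lag subsets for a linear one-step forecaster on a synthetic seasonal series. The series, the linear stand-in model and the exhaustive search are all assumptions here, standing in for the evolved neural networks and EDA search engine of the actual approach.

```python
import numpy as np
from itertools import combinations

rng = np.random.default_rng(0)

# Synthetic series with period-4 seasonality plus noise.
n = 300
t = np.arange(n)
series = np.sin(2 * np.pi * t / 4) + 0.1 * rng.normal(size=n)

def forecast_mse(series, lags):
    """Fit a linear model on the given time lags and return one-step
    test MSE (a linear stand-in for the evolved neural network)."""
    max_lag = max(lags)
    X = np.column_stack([series[max_lag - l : -l] for l in lags])
    y = series[max_lag:]
    cut = int(0.7 * len(y))          # train on first 70%, test on the rest
    coef, *_ = np.linalg.lstsq(X[:cut], y[:cut], rcond=None)
    return float(np.mean((X[cut:] @ coef - y[cut:]) ** 2))

# Score every lag subset of size <= 2 drawn from {1..6}; the evolutionary
# search in the paper plays this role over far larger spaces.
candidates = [c for k in (1, 2) for c in combinations(range(1, 7), k)]
best = min(candidates, key=lambda lags: forecast_mse(series, lags))
```

On this period-4 series the search should select a subset containing lag 2 or lag 4, which is exactly the kind of structure TEANN is designed to discover automatically.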
A Survey on Reservoir Computing and its Interdisciplinary Applications Beyond Traditional Machine Learning
Reservoir computing (RC), first applied to temporal signal processing, is a
recurrent neural network in which neurons are randomly connected. Once
initialized, the connection strengths remain unchanged. Such a simple structure
turns RC into a non-linear dynamical system that maps low-dimensional inputs
into a high-dimensional space. The model's rich dynamics, linear separability,
and memory capacity then enable a simple linear readout to generate adequate
responses for various applications. RC spans areas far beyond machine learning,
since it has been shown that the complex dynamics can be realized in various
physical hardware implementations and biological devices. This yields greater
flexibility and shorter computation time. Moreover, the neuronal responses
triggered by the model's dynamics shed light on understanding brain mechanisms
that also exploit similar dynamical processes. While the literature on RC is
vast and fragmented, here we conduct a unified review of RC's recent
developments from machine learning to physics, biology, and neuroscience. We
first review the early RC models, and then survey the state-of-the-art models
and their applications. We further introduce studies on modeling the brain's
mechanisms by RC. Finally, we offer new perspectives on RC development,
including reservoir design, coding frameworks unification, physical RC
implementations, and interaction between RC, cognitive neuroscience and
evolution.
Comment: 51 pages, 19 figures, IEEE Access
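The RC recipe summarized above (a fixed random recurrent reservoir plus a trained linear readout) can be sketched in a few lines of NumPy. The network sizes, spectral-radius scaling and sine-wave task are illustrative assumptions, not models from the survey.

```python
import numpy as np

rng = np.random.default_rng(0)

# Reservoir: fixed random recurrent weights, never trained after initialization.
n_in, n_res = 1, 100
W_in = rng.uniform(-0.5, 0.5, (n_res, n_in))
W = rng.uniform(-0.5, 0.5, (n_res, n_res))
# Scale the spectral radius below 1 (echo state property).
W *= 0.9 / np.max(np.abs(np.linalg.eigvals(W)))

def run_reservoir(u):
    """Map a 1-D input sequence into the high-dimensional reservoir state space."""
    x = np.zeros(n_res)
    states = []
    for u_t in u:
        x = np.tanh(W_in @ np.atleast_1d(u_t) + W @ x)
        states.append(x.copy())
    return np.array(states)

# Task: one-step-ahead prediction of a sine wave.
t = np.arange(400)
u = np.sin(0.1 * t)
X = run_reservoir(u[:-1])
y = u[1:]

# Linear readout via ridge regression, the only trained part of the model.
washout = 50                          # discard initial transient states
A, b = X[washout:], y[washout:]
W_out = np.linalg.solve(A.T @ A + 1e-6 * np.eye(n_res), A.T @ b)

pred = X @ W_out
mse = float(np.mean((pred[washout:] - y[washout:]) ** 2))
```

Only `W_out` is trained; the reservoir weights `W_in` and `W` stay frozen after initialization, which is what reduces training to a cheap linear regression.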
Reservoir of Diverse Adaptive Learners and Stacking Fast Hoeffding Drift Detection Methods for Evolving Data Streams
The last decade has seen a surge of interest in adaptive learning algorithms
for data stream classification, with applications ranging from predicting ozone
level peaks, learning stock market indicators, to detecting computer security
violations. In addition, a number of methods have been developed to detect
concept drifts in these streams. Consider a scenario where we have a number of
classifiers with diverse learning styles and different drift detectors.
Intuitively, the current 'best' (classifier, detector) pair is application
dependent and may change as a result of the stream evolution. Our research
builds on this observation. We introduce the Tornado framework that
implements a reservoir of diverse classifiers, together with a variety of drift
detection algorithms. In our framework, all (classifier, detector) pairs
proceed, in parallel, to construct models against the evolving data streams. At
any point in time, we select the pair which currently yields the best
performance. We further incorporate two novel stacking-based drift detection
methods, namely the FHDDMS and FHDDMS_add approaches. The
experimental evaluation confirms that the current 'best' (classifier, detector)
pair is not only heavily dependent on the characteristics of the stream, but
also that this selection evolves as the stream flows. Further, our
FHDDMS variants detect concept drifts accurately in a timely fashion
while outperforming the state-of-the-art.
Comment: 42 pages and 14 figures
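As a rough sketch of the Hoeffding-bound test underlying the FHDDM family: the accuracy over a sliding window of prediction outcomes is compared with the best windowed accuracy observed so far, and drift is signalled once the gap exceeds the Hoeffding epsilon. The window size, delta and simulated stream below are illustrative assumptions, and unlike the full method this sketch does not reset its state after a detection.

```python
import math
import random
from collections import deque

class FHDDM:
    """Sketch of the Fast Hoeffding Drift Detection Method.

    Slides a window over the stream of prediction outcomes (1 = correct,
    0 = wrong). If the window accuracy drops below the best accuracy seen
    so far by more than the Hoeffding bound, drift is signalled.
    """

    def __init__(self, window_size=100, delta=1e-7):
        self.n = window_size
        # Hoeffding error bound for a mean of n bounded random variables.
        self.eps = math.sqrt(math.log(1.0 / delta) / (2.0 * window_size))
        self.window = deque(maxlen=window_size)
        self.p_max = 0.0

    def add(self, correct):
        """Feed one outcome; return True when drift is detected."""
        self.window.append(1 if correct else 0)
        if len(self.window) < self.n:
            return False
        p = sum(self.window) / self.n
        self.p_max = max(self.p_max, p)
        return self.p_max - p > self.eps

# Simulated stream: a 95%-accurate classifier degrades to 50% at index 500.
random.seed(1)
stream = [random.random() < 0.95 for _ in range(500)] + \
         [random.random() < 0.50 for _ in range(500)]
det = FHDDM()
drift_at = next((i for i, c in enumerate(stream) if det.add(c)), None)
```

With these settings the detector should flag drift shortly after index 500, once enough post-drift outcomes have entered the window.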
Development of soft computing and applications in agricultural and biological engineering
Soft computing is a set of “inexact” computing techniques, which are able to model and analyze very complex problems. For these complex problems, more conventional methods have not been able to produce cost-effective, analytical, or complete solutions. Soft computing has been extensively studied and applied over the last three decades in scientific research and engineering computing. In agricultural and biological engineering, researchers and engineers have developed methods of fuzzy logic, artificial neural networks, genetic algorithms, decision trees, and support vector machines to study soil and water regimes related to crop growth, analyze the operation of food processing, and support decision-making in precision farming. This paper reviews the development of soft computing techniques. Building on these concepts and methods, applications of soft computing in the field of agricultural and biological engineering are presented, especially in the soil and water context for crop management and decision support in precision agriculture. The future of development and application of soft computing in agricultural and biological engineering is discussed.
Artificial Intelligence and Cognitive Computing
Artificial intelligence (AI) is a subject garnering increasing attention in both academia and industry today. The understanding is that AI-enhanced methods and techniques create a variety of opportunities related to improving basic and advanced business functions, including production processes, logistics, financial management and others. As this collection demonstrates, AI-enhanced tools and methods tend to offer more precise results in the fields of engineering, financial accounting, tourism, air-pollution management and many more. The objective of this collection is to bring these topics together to offer the reader a useful primer on how AI-enhanced tools and applications can be of use in today’s world. In the context of the frequently fearful, skeptical and emotion-laden debates on AI and its value added, this volume promotes a positive perspective on AI and its impact on society. AI is part of a broader ecosystem of sophisticated tools, techniques and technologies, and therefore it is not immune to developments in that ecosystem. It is thus imperative that inter- and multidisciplinary research on AI and its ecosystem is encouraged. This collection contributes to that.
Robotic ubiquitous cognitive ecology for smart homes
Robotic ecologies are networks of heterogeneous robotic devices pervasively embedded in everyday environments, where they cooperate to perform complex tasks. While their potential makes them increasingly popular, one fundamental problem is how to make them both autonomous and adaptive, so as to reduce the amount of preparation, pre-programming and human supervision that they require in real-world applications. The project RUBICON develops learning solutions which yield cheaper, adaptive and efficient coordination of robotic ecologies. The approach we pursue builds upon a unique combination of methods from cognitive robotics, machine learning, planning and agent-based control, and wireless sensor networks. This paper illustrates the innovations advanced by RUBICON on each of these fronts before describing how the resulting techniques have been integrated and applied to a smart home scenario. The resulting system is able to provide useful services and pro-actively assist users in their activities. RUBICON learns through an incremental and progressive approach driven by the feedback received from its own activities and from the user, while also self-organizing the manner in which it uses available sensors, actuators and other functional components in the process. This paper summarises some of the lessons learned by adopting such an approach and outlines promising directions for future work.
Oil and Gas flow Anomaly Detection on offshore naturally flowing wells using Deep Neural Networks
Dissertation presented as a partial requirement for obtaining a Master's degree in Data Science and Advanced Analytics, specialization in Data Science.
The Oil and Gas industry faces multiple challenges as never before. It is criticized as dirty and polluting, hence the growing demand for green alternatives. Nevertheless, the world still has to rely heavily on hydrocarbons, the most traditional and stable source of energy, as opposed to the extensively promoted hydro, solar and wind power. Major operators are challenged to produce oil more efficiently to counteract the newly arising energy sources, with a smaller climate footprint and more scrutinized expenditure, while facing high skepticism regarding the industry's future. It has to become greener, and hence to act in a manner not required previously.
While most of the tools used by the Hydrocarbon E&P industry are expensive and have been in use for many years, it is paramount for the industry's survival and prosperity to apply predictive maintenance technologies that can foresee potential failures, making production safer, lowering downtime, increasing productivity and diminishing maintenance costs. Many efforts have been made to define the most accurate and effective predictive methods; however, data scarcity limits the speed and capacity for further experimentation. While it would be highly beneficial for the industry to invest in Artificial Intelligence, this research aims at exploring, in depth, the subject of Anomaly Detection, using the open public data from Petrobras, which was developed by experts.
For this research, deep neural networks, namely Recurrent Neural Networks with LSTM and GRU backbones, were implemented for multi-class classification of undesirable events on naturally flowing wells. Further, several hyperparameter optimization tools were explored, mainly focusing on Genetic Algorithms, among the most advanced methods for such tasks.
The best-performing model used two stacked GRU layers with the hyperparameter vector [1, 47, 40, 14], which stands for timestep 1, 47 hidden units, 40 epochs and batch size 14, producing an F1 score of 0.97.
As the world faces many issues, among them the detrimental effect of heavy industries on the environment and the resulting adverse global climate change, this project is an attempt to contribute to the field of applying Artificial Intelligence in the Oil and Gas industry, with the intention of making it more efficient, transparent and sustainable.
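The genetic-algorithm hyperparameter search described in the dissertation can be sketched as follows. The fitness function here is a hypothetical stand-in: training a GRU is replaced by a toy objective whose optimum coincides with the reported vector [1, 47, 40, 14], and the population size, rates and operators are assumptions.

```python
import random

random.seed(0)

# Search space mirroring the abstract's hyperparameter vector:
# [timestep, hidden units, epochs, batch size].
BOUNDS = [(1, 10), (8, 64), (5, 50), (8, 64)]

def fitness(genome):
    """Hypothetical stand-in objective: in practice this would train a GRU
    with these hyperparameters and return its validation F1 score."""
    t, h, e, b = genome
    return -((t - 1) ** 2 + (h - 47) ** 2 + (e - 40) ** 2 + (b - 14) ** 2)

def mutate(genome, rate=0.3):
    """Resample each gene uniformly within its bounds with some probability."""
    return [random.randint(lo, hi) if random.random() < rate else g
            for g, (lo, hi) in zip(genome, BOUNDS)]

def crossover(a, b):
    """One-point crossover between two parent genomes."""
    cut = random.randint(1, len(a) - 1)
    return a[:cut] + b[cut:]

# Standard generational GA: elitism, crossover among the elite, mutation.
pop = [[random.randint(lo, hi) for lo, hi in BOUNDS] for _ in range(30)]
for _ in range(60):
    pop.sort(key=fitness, reverse=True)
    elite = pop[:6]
    pop = elite + [mutate(crossover(*random.sample(elite, 2)))
                   for _ in range(24)]
best = max(pop, key=fitness)
```

With elitism the best fitness is monotone over generations, so the final genome should sit close to the toy objective's optimum.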
Reservoir characterization using intelligent seismic inversion
Integrating different types of data with different scales is the major challenge in reservoir characterization studies. Seismic data is among those types, usually used by geoscientists for structural mapping of the subsurface and for interpreting the reservoir's facies distribution. Yet it has been a common aim of geoscientists to incorporate seismic data into high-resolution reservoir description through a process called seismic inversion. In this study, an intelligent seismic inversion methodology is presented to achieve a desirable correlation between relatively low-frequency seismic signals and the much higher frequency wireline-log data. Vertical seismic profile (VSP) is used as an intermediate step between the well logs and the surface seismic. A generalized regression neural network (GRNN) is used to build two correlation models: (1) between surface seismic and VSP, and (2) between VSP and well logs, using both synthetic seismic data and real data taken from the Buffalo Valley Field.
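A GRNN prediction reduces to a Gaussian-kernel-weighted average of the training targets (the Nadaraya-Watson form), which is why it needs no iterative training. The sketch below shows that mechanism on toy data; the data and the bandwidth are assumptions, not the study's seismic or log measurements.

```python
import numpy as np

def grnn_predict(X_train, y_train, X_query, sigma=0.5):
    """Generalized regression neural network (Nadaraya-Watson form).

    Each training sample acts as a pattern unit with a Gaussian kernel;
    the prediction is the kernel-weighted average of the training targets.
    """
    X_train = np.atleast_2d(X_train)
    X_query = np.atleast_2d(X_query)
    # Squared Euclidean distances between every query and training point.
    d2 = ((X_query[:, None, :] - X_train[None, :, :]) ** 2).sum(-1)
    w = np.exp(-d2 / (2.0 * sigma ** 2))
    return (w @ y_train) / w.sum(axis=1)

# Toy stand-in for a seismic-attribute-to-log-property correlation:
# learn y = x^2 from noisy samples, then query new attribute values.
rng = np.random.default_rng(0)
x = rng.uniform(-2, 2, (200, 1))
y = x[:, 0] ** 2 + rng.normal(0, 0.05, 200)
pred = grnn_predict(x, y, np.array([[0.0], [1.0]]), sigma=0.2)
```

The bandwidth `sigma` is the only free parameter: small values make the model interpolate sharply, large values smooth toward the global mean of the targets.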