Modeling Financial Time Series with Artificial Neural Networks
Financial time series convey the decisions and actions of a population of human actors over time. Econometric and regressive models have been developed in the past decades for analyzing these time series. More recently, biologically inspired artificial neural network models have been shown to overcome some of the main challenges of traditional techniques by better exploiting the non-linear, non-stationary, and oscillatory nature of noisy, chaotic human interactions. This review paper explores the options, benefits, and weaknesses of the various forms of artificial neural networks as compared with regression techniques in the field of financial time series analysis. Funding: CELEST, a National Science Foundation Science of Learning Center (SBE-0354378); SyNAPSE program of the Defense Advanced Research Projects Agency (HR001109-03-0001).
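As a concrete point of reference for the regression techniques the review compares against, the classic baseline is an AR(p)-style least-squares fit on lagged values. The sketch below uses a synthetic oscillatory series; all data and parameter choices are illustrative, not from the paper:

```python
import numpy as np

def lag_matrix(series, p):
    """Design matrix of p lagged values for an AR(p)-style regression."""
    X = np.column_stack([series[i:len(series) - p + i] for i in range(p)])
    return X, series[p:]

# Synthetic noisy oscillatory series standing in for financial data.
rng = np.random.default_rng(0)
t = np.arange(300)
series = np.sin(0.1 * t) + 0.1 * rng.standard_normal(300)

X, y = lag_matrix(series, p=5)
A = np.column_stack([np.ones(len(X)), X])   # intercept plus lagged values
coef, *_ = np.linalg.lstsq(A, y, rcond=None)
mse = np.mean((A @ coef - y) ** 2)          # in-sample fit of the linear model
```

Neural approaches replace the linear map from lags to the next value with a learned non-linear one, while the lagged-window data preparation stays the same.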
DANTE: Deep AlterNations for Training nEural networks
We present DANTE, a novel method for training neural networks using the
alternating minimization principle. DANTE provides an alternate perspective to
traditional gradient-based backpropagation techniques commonly used to train
deep networks. It utilizes an adaptation of quasi-convexity to cast training a
neural network as a bi-quasi-convex optimization problem. We show that for
neural network configurations with both differentiable (e.g. sigmoid) and
non-differentiable (e.g. ReLU) activation functions, we can perform the
alternations effectively in this formulation. DANTE can also be extended to
networks with multiple hidden layers. In experiments on standard datasets,
neural networks trained using the proposed method were found to be promising
and competitive with traditional backpropagation techniques, both in terms of
the quality of the solution and training speed. Comment: 19 pages
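The alternating idea can be illustrated with a simplified sketch that is not the authors' exact algorithm: for a single-hidden-layer network, fixing the hidden weights makes the readout an ordinary least-squares problem, and fixing the readout leaves a smaller subproblem over the hidden weights (solved here with plain gradient steps; all shapes and data are illustrative):

```python
import numpy as np

rng = np.random.default_rng(1)
X = rng.standard_normal((200, 4))
y = np.tanh(X @ rng.standard_normal(4))[:, None]    # synthetic regression target

sigmoid = lambda z: 1.0 / (1.0 + np.exp(-z))
W1 = 0.1 * rng.standard_normal((4, 8))              # hidden-layer weights
W2 = 0.1 * rng.standard_normal((8, 1))              # output-layer weights
mse0 = np.mean((sigmoid(X @ W1) @ W2 - y) ** 2)     # error before training

for _ in range(50):
    # Step 1: with W1 fixed, the readout is a linear least-squares problem.
    H = sigmoid(X @ W1)
    W2, *_ = np.linalg.lstsq(H, y, rcond=None)
    # Step 2: with W2 fixed, take a few gradient steps on the hidden layer.
    for _ in range(5):
        H = sigmoid(X @ W1)
        err = H @ W2 - y
        W1 -= 0.5 * (X.T @ ((err @ W2.T) * H * (1.0 - H))) / len(X)

mse = np.mean((sigmoid(X @ W1) @ W2 - y) ** 2)
```

Each half-step optimizes one block of weights with the other held fixed, which is the alternating-minimization structure DANTE builds on.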
Design optimization applied in structural dynamics
This paper introduces design optimization strategies, especially for structures subject to dynamic constraints. Design optimization involves first modeling and then optimizing the problem. Using the Finite Element (FE) model of a structure directly in an optimization process requires a long computation time. Therefore, Backpropagation Neural Networks (NNs) are introduced as a so-called surrogate model for the FE model. The optimization techniques covered in this study are the Genetic Algorithm (GA) and the Sequential Quadratic Programming (SQP) methods. As an application of the introduced techniques, a multi-segment cantilever beam problem under constraints on its first and second natural frequencies has been selected and solved using four different approaches.
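The surrogate-based workflow the paper describes (sample the expensive FE model, fit a cheap approximation, optimize the approximation) can be sketched as follows. A cubic polynomial and a grid search stand in for the paper's neural-network surrogate and its GA/SQP optimizers, and the objective is a made-up placeholder for an FE analysis:

```python
import numpy as np

def expensive_model(x):
    """Stand-in for an FE analysis mapping a design variable to a response."""
    return (x - 1.3) ** 2 + 0.5 * np.sin(3.0 * x)

# 1. Evaluate the expensive model at a small number of design points.
xs = np.linspace(-1.0, 3.0, 15)
ys = expensive_model(xs)

# 2. Fit a cheap surrogate (a cubic polynomial here; the paper trains a
#    backpropagation neural network instead).
surrogate = np.poly1d(np.polyfit(xs, ys, deg=3))

# 3. Optimize the surrogate, which is essentially free to evaluate
#    (a dense grid search stands in for GA or SQP).
grid = np.linspace(-1.0, 3.0, 10001)
x_best = grid[np.argmin(surrogate(grid))]
```

The expensive model is queried only 15 times; the thousands of evaluations the optimizer needs all hit the surrogate, which is the source of the speedup.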
Visualizing Deep Networks by Optimizing with Integrated Gradients
Understanding and interpreting the decisions made by deep learning models is
valuable in many domains. In computer vision, computing heatmaps from a deep
network is a popular approach for visualizing and understanding deep networks.
However, heatmaps that do not correlate with the network's decision may mislead
humans; hence, it is crucial that a heatmap provide a faithful explanation of
the underlying deep network. In this paper, we propose I-GOS, which
optimizes for a heatmap so that the classification scores on the masked image
would maximally decrease. The main novelty of the approach is to compute
descent directions based on the integrated gradients instead of the normal
gradient, which avoids local optima and speeds up convergence. Compared with
previous approaches, our method can flexibly compute heatmaps at any resolution
for different user needs. Extensive experiments on several benchmark datasets
show that the heatmaps produced by our approach are more correlated with the
decision of the underlying deep network, in comparison with other
state-of-the-art approaches.
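The integrated-gradients computation at the heart of the descent direction can be sketched on a toy differentiable classifier (a logistic score over a flattened 4x4 "image"); this illustrates integrated gradients itself, not the full I-GOS mask optimization:

```python
import numpy as np

# Toy differentiable "classifier": a logistic score over a flattened 4x4 image.
rng = np.random.default_rng(2)
w = rng.standard_normal(16)
x = rng.standard_normal(16)
baseline = np.zeros(16)                     # all-zero reference image

def score(z):
    return 1.0 / (1.0 + np.exp(-z @ w))

def grad(z):
    s = score(z)
    return s * (1.0 - s) * w                # analytic gradient of the score

# Integrated gradients: average the gradient along the straight-line path from
# the baseline to the input (midpoint rule), then scale by (x - baseline).
steps = 50
avg_grad = np.mean(
    [grad(baseline + ((k - 0.5) / steps) * (x - baseline))
     for k in range(1, steps + 1)],
    axis=0,
)
attributions = (x - baseline) * avg_grad
heatmap = attributions.reshape(4, 4)        # per-pixel attribution map
```

A useful sanity check is the completeness property: the attributions sum (up to discretization error) to the difference between the scores at the input and at the baseline.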
PSO based Neural Networks vs. Traditional Statistical Models for Seasonal Time Series Forecasting
Seasonality is a distinctive characteristic which is often observed in many
practical time series. Artificial Neural Networks (ANNs) are a class of
promising models for efficiently recognizing and forecasting seasonal patterns.
In this paper, the Particle Swarm Optimization (PSO) approach is used to
enhance the forecasting strengths of feedforward ANN (FANN) as well as Elman
ANN (EANN) models for seasonal data. Three widely popular versions of the basic
PSO algorithm, viz. Trelea-I, Trelea-II and Clerc-Type1 are considered here.
The empirical analysis is conducted on three real-world seasonal time series.
Results clearly show that each version of the PSO algorithm achieves notably
better forecasting accuracies than the standard Backpropagation (BP) training
method for both FANN and EANN models. The neural network forecasting results
are also compared with those from the three traditional statistical models,
viz. Seasonal Autoregressive Integrated Moving Average (SARIMA), Holt-Winters
(HW) and Support Vector Machine (SVM). The comparison demonstrates that both
PSO and BP based neural networks outperform SARIMA, HW and SVM models for all
three time series datasets. The forecasting performances of ANNs are further
improved by combining the outputs of the three PSO-based models. Comment: 4 figures, 4 tables, 31 references; conference proceedings.
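A minimal global-best PSO training loop for a tiny feedforward network might look like the following. This is the basic PSO update, not the Trelea or Clerc variants the paper evaluates, and all data and hyperparameters are illustrative:

```python
import numpy as np

rng = np.random.default_rng(3)
X = rng.standard_normal((100, 2))
y = np.sin(X[:, 0]) + 0.5 * X[:, 1]          # synthetic non-linear target

def forward(w, X):
    """Tiny 2-4-1 feedforward net; w is one particle's flat weight vector."""
    W1, b1 = w[:8].reshape(2, 4), w[8:12]
    W2, b2 = w[12:16], w[16]
    return np.tanh(X @ W1 + b1) @ W2 + b2

def mse(w):
    return np.mean((forward(w, X) - y) ** 2)

n, dim = 30, 17
pos = rng.standard_normal((n, dim))          # particle positions = weight vectors
vel = np.zeros((n, dim))
pbest = pos.copy()                           # per-particle best positions
pbest_val = np.array([mse(w) for w in pos])
gbest = pbest[np.argmin(pbest_val)].copy()   # swarm-wide best position

for _ in range(200):
    r1, r2 = rng.random((n, dim)), rng.random((n, dim))
    vel = 0.7 * vel + 1.5 * r1 * (pbest - pos) + 1.5 * r2 * (gbest - pos)
    pos = pos + vel
    vals = np.array([mse(w) for w in pos])
    better = vals < pbest_val
    pbest[better], pbest_val[better] = pos[better], vals[better]
    gbest = pbest[np.argmin(pbest_val)].copy()
```

Because PSO only needs fitness evaluations, no gradients, it side-steps backpropagation entirely, which is the comparison the paper makes.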
Comparative performance of some popular ANN algorithms on benchmark and function approximation problems
We report an inter-comparison of some popular algorithms within the
artificial neural network domain (viz., local search algorithms, global search
algorithms, higher order algorithms and the hybrid algorithms) by applying them
to the standard benchmarking problems like the IRIS data, XOR/N-Bit parity and
Two Spiral. Apart from giving a brief description of these algorithms, the
results obtained for the above benchmark problems are presented in the paper.
The results suggest that while Levenberg-Marquardt algorithm yields the lowest
RMS error for the N-bit Parity and the Two Spiral problems, Higher Order
Neurons algorithm gives the best results for the IRIS data problem. The best
results for the XOR problem are obtained with the Neuro Fuzzy algorithm. The
above algorithms were also applied for solving several regression problems such
as cos(x), and a few special functions like the Gamma function, the
complementary error function, and the upper tail cumulative
χ²-distribution function. The results of these regression problems
indicate that, among all the ANN algorithms used in the present study,
Levenberg-Marquardt algorithm yields the best results. Keeping in view the
highly non-linear behaviour and the wide dynamic range of these functions, it
is suggested that these functions can be also considered as standard benchmark
problems for function approximation using artificial neural networks. Comment: 18 pages, 5 figures. Accepted in Pramana - Journal of Physics.
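The Levenberg-Marquardt update that the study finds strongest can be sketched on a small curve-fitting problem; this shows the damped Gauss-Newton step itself on a two-parameter model rather than a full neural-network training run:

```python
import numpy as np

# Fit y = a*cos(b*x) to data with Levenberg-Marquardt (damped Gauss-Newton).
x = np.linspace(0.0, 4.0, 50)
y = 2.0 * np.cos(1.5 * x)                    # noise-free synthetic data

def residual(p):
    a, b = p
    return a * np.cos(b * x) - y

def jacobian(p):
    a, b = p
    return np.column_stack([np.cos(b * x), -a * x * np.sin(b * x)])

p = np.array([1.5, 1.4])                     # start near the true (2.0, 1.5)
lam = 1e-2                                   # damping parameter
for _ in range(100):
    r, J = residual(p), jacobian(p)
    step = np.linalg.solve(J.T @ J + lam * np.eye(2), -J.T @ r)
    if np.sum(residual(p + step) ** 2) < np.sum(r ** 2):
        p, lam = p + step, lam * 0.5         # accept: trust the model more
    else:
        lam *= 2.0                           # reject: damp more strongly

rms = np.sqrt(np.mean(residual(p) ** 2))
```

Large damping makes the step behave like gradient descent, small damping like Gauss-Newton; adapting lam between the two is what gives the method its fast, stable convergence on least-squares training problems.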
Optimization of Evolutionary Neural Networks Using Hybrid Learning Algorithms
Evolutionary artificial neural networks (EANNs) refer to a special class of
artificial neural networks (ANNs) in which evolution is another fundamental
form of adaptation in addition to learning. Evolutionary algorithms are used to
adapt the connection weights, network architecture and learning algorithms
according to the problem environment. Even though evolutionary algorithms are
well known as efficient global search algorithms, very often they miss the best
local solutions in the complex solution space. In this paper, we propose a
hybrid meta-heuristic learning approach combining evolutionary learning and
local search methods (using first- and second-order error information) to improve
learning and achieve faster convergence compared with a direct evolutionary approach.
The proposed technique is tested on three different chaotic time series and the
test results are compared with some popular neuro-fuzzy systems and a recently
developed cutting angle method of global optimization. Empirical results reveal
that the proposed technique is efficient in spite of the computational
complexity.
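The hybrid idea, an evolutionary outer loop with first-order local refinement of the elite, can be sketched on a multimodal stand-in objective. This memetic-style sketch is an illustration, not the paper's exact meta-heuristic:

```python
import numpy as np

def f(w):
    """Multimodal objective standing in for a network's training-error surface."""
    return np.sum(w ** 2 + 2.0 * (1.0 - np.cos(3.0 * w)))

def grad_f(w):
    return 2.0 * w + 6.0 * np.sin(3.0 * w)

rng = np.random.default_rng(4)
pop = rng.uniform(-4.0, 4.0, size=(20, 5))   # population of candidate solutions
best0 = min(f(w) for w in pop)               # best fitness before optimization

for _ in range(40):
    # Evolutionary step: keep the best half, refill with mutated copies.
    pop = pop[np.argsort([f(w) for w in pop])]
    pop[10:] = pop[:10] + 0.3 * rng.standard_normal((10, 5))
    # Local step: refine the elite with a few first-order gradient moves
    # (second-order information could be used here in the same way).
    for _ in range(5):
        pop[0] = pop[0] - 0.02 * grad_f(pop[0])

best = min(f(w) for w in pop)
```

The outer loop supplies the global exploration; the inner gradient steps supply the local precision that a purely evolutionary search tends to miss.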