A Negative Log Likelihood Function-Based Nonlinear Neural Network Approach

By Porntip Dechpichai and Pamela Davy


The most commonly used objective function in Artificial Neural Networks (ANNs) is the sum of squared errors, which requires the target and forecast output vectors to have the same dimension. In the context of nonlinear financial time series, both the conditional mean and the conditional variance (volatility) tend to evolve over time. It is therefore of interest to consider neural networks with two-dimensional output even though the target data are one-dimensional. The back-propagation algorithm can be extended to this situation: for example, the negative log-likelihood based on a parametric statistical model is a possible alternative to the traditional least-squares objective. It has been found that the RMSPE for the mean is smaller for the developed neural network (LLNN) than for the traditional neural network (NN) in the majority of cases. LLNN also provides performance comparable to NN on a real data set (the Index of the Stock Exchange of Thailand).

KEY WORDS: Artificial neural network, log-likelihood, conditional mean, conditional heteroscedasticity
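The abstract does not give the exact parametric model, but the idea of training a two-output network (conditional mean and variance) against a one-dimensional target can be sketched with a Gaussian negative log-likelihood. The sketch below assumes the network emits a mean `mu` and a log-variance `log_var` for each observation; the gradients of the loss with respect to these two outputs are what back-propagation would feed into the rest of the network. This is an illustrative assumption, not the paper's stated specification.

```python
import numpy as np

def gaussian_nll(y, mu, log_var):
    """Negative log-likelihood of y under N(mu, exp(log_var)),
    summed over observations (constant term included).
    Parameterising the variance on the log scale keeps it positive."""
    var = np.exp(log_var)
    return 0.5 * np.sum(np.log(2 * np.pi) + log_var + (y - mu) ** 2 / var)

def gaussian_nll_grads(y, mu, log_var):
    """Gradients of the NLL with respect to the two output units
    (mean and log-variance), elementwise per observation.
    These replace the (output - target) error term of the usual
    sum-of-squared-errors back-propagation."""
    var = np.exp(log_var)
    d_mu = (mu - y) / var
    d_log_var = 0.5 * (1.0 - (y - mu) ** 2 / var)
    return d_mu, d_log_var
```

Note that when the squared residual equals the predicted variance, the log-variance gradient vanishes, so the variance output is pushed toward the local squared error, which is how the network learns time-varying volatility from a one-dimensional target.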

Year: 2009
Provided by: CiteSeerX