2,139 research outputs found

    A Nonparametric Approach to Pricing Options via Learning Networks

    For practitioners in equity markets, option pricing is a major challenge during high-volatility periods, and the Black-Scholes formula is not the proper tool for very deep out-of-the-money options. Black-Scholes pricing errors are larger for deeper out-of-the-money options than for near-the-money options, and its mispricing worsens with increased volatility. Expert opinion holds that the Black-Scholes model is not the proper pricing tool in high-volatility situations, especially for very deep out-of-the-money options. Experts also argue that prior to the 1987 crash, implied volatilities were symmetric around zero moneyness, with in-the-money and out-of-the-money options having higher implied volatilities than at-the-money options. After the crash, however, call option implied volatilities decreased monotonically as the call went deeper out-of-the-money, while put option implied volatilities decreased monotonically as the put went deeper in-the-money. Since these findings cannot be explained by the Black-Scholes model and its variations, researchers have searched for improved option pricing models. Feedforward networks provide more accurate pricing estimates for deeper out-of-the-money options and handle pricing during high volatility with considerably lower errors for out-of-the-money call and put options. This can be invaluable for practitioners, as option pricing is a major challenge during high-volatility periods. This article presents a nonparametric method for estimating S&P 100 index option prices using artificial neural networks. To show the value of the network pricing formulas, Black-Scholes prices and network prices are compared against market prices. To illustrate the practical relevance of the approach, it is applied to the pricing of S&P 100 index options from April 4, 2014 to April 9, 2014. Over these five days, Black-Scholes formula prices have a mean error of $10.17 for puts and $1.98 for calls, while the neural network's error is less than $5 for puts and $1 for calls.
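    The five-day comparison above can be sketched numerically. Below is a minimal illustration, not the article's own code: the closed-form Black-Scholes call price next to a small feedforward network fitted to quotes, with moneyness (S/K) and time to maturity as inputs, a common setup in nonparametric option pricing. The data, architecture, and names are hypothetical.

        # Sketch: Black-Scholes call price vs. a small feedforward network
        # fitted to (synthetic) option quotes. Illustrative only.
        import numpy as np
        from scipy.stats import norm
        from sklearn.neural_network import MLPRegressor

        def black_scholes_call(S, K, T, r, sigma):
            """Closed-form Black-Scholes price of a European call."""
            d1 = (np.log(S / K) + (r + 0.5 * sigma**2) * T) / (sigma * np.sqrt(T))
            d2 = d1 - sigma * np.sqrt(T)
            return S * norm.cdf(d1) - K * np.exp(-r * T) * norm.cdf(d2)

        rng = np.random.default_rng(0)
        S, r = 100.0, 0.01
        K = rng.uniform(80.0, 130.0, 500)       # strikes, incl. deep out-of-the-money calls
        T = rng.uniform(0.05, 0.5, 500)         # times to maturity (years)
        sigma = 0.2 + 0.1 * (K / S - 1.0)       # a crude volatility skew
        quotes = black_scholes_call(S, K, T, r, sigma) + rng.normal(0.0, 0.05, 500)

        X = np.column_stack([S / K, T])         # moneyness and maturity as features
        net = MLPRegressor(hidden_layer_sizes=(10,), max_iter=5000,
                           random_state=0).fit(X, quotes / K)  # strike-scaled price
        pred = net.predict(X) * K
        print("mean absolute error of network fit:", np.abs(pred - quotes).mean())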

    The Essential Order of (L_p, p<1) Approximation by Regular Neural Networks

    This paper concerns the essential degree of approximation using regular neural networks: how a multivariate function in L_p spaces for p<1 can be approximated by a regular feedforward neural network. We obtain direct and inverse theorems and an equivalence theorem for this multivariate approximation in L_p spaces for p<1 using regular feedforward neural networks.
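    Direct and inverse theorems of this kind typically take a Jackson/Bernstein shape; the display below is only a generic sketch of that shape (with N_n ranging over n-unit regular feedforward networks and omega a modulus of smoothness), not the paper's exact statements.

        % Generic shape of direct/inverse theorems of approximation theory;
        % illustrative only, not the paper's precise results.
        \[
          \inf_{N_n} \| f - N_n \|_{L_p} \;\le\; C\, \omega\!\left(f, n^{-1}\right)_{L_p}
          \qquad \text{(direct, Jackson-type)}
        \]
        \[
          \omega\!\left(f, n^{-1}\right)_{L_p} \;\le\; \frac{C}{n} \sum_{k=1}^{n} \inf_{N_k} \| f - N_k \|_{L_p}
          \qquad \text{(inverse, Bernstein-type)}
        \]

    Together, a matched direct/inverse pair yields an equivalence theorem: the achievable order of approximation is characterized by the smoothness of f.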

    Artificial Neural Networks

    Artificial neural networks (ANNs) constitute a class of flexible nonlinear models designed to mimic biological neural systems. In this entry, we introduce ANNs using familiar econometric terminology and provide an overview of the ANN modeling approach and its implementation methods.
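    In the econometric notation such entries usually adopt, a single-hidden-layer ANN is a flexible nonlinear regression, y = b0 + sum_j b_j G(x'g_j) + e, with G a sigmoid activation. The sketch below simply evaluates that model on simulated data; names and values are illustrative, not taken from the entry.

        # A single-hidden-layer ANN written as a nonlinear regression model:
        # y = b0 + sum_j b[j] * G(x @ g[j]) + e, with logistic activation G.
        # Illustrative only.
        import numpy as np

        def ann(x, b0, b, g):
            """Evaluate b0 + sum_j b[j] * sigmoid(x @ g[j])."""
            h = 1.0 / (1.0 + np.exp(-x @ g.T))   # hidden-unit activations
            return b0 + h @ b

        rng = np.random.default_rng(1)
        x = rng.normal(size=(200, 3))            # regressors
        g = rng.normal(size=(5, 3))              # hidden-layer weights
        b = rng.normal(size=5)                   # output-layer weights
        y = ann(x, 0.5, b, g) + rng.normal(0.0, 0.1, 200)  # data from the model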

    Neural network representation and learning of mappings and their derivatives

    Discussed here are recent theorems proving that artificial neural networks are capable of approximating an arbitrary mapping and its derivatives as accurately as desired. This fact forms the basis for further results establishing the learnability of the desired approximations, using results from nonparametric statistics. These results have potential applications in robotics, chaotic dynamics, control, and sensitivity analysis. An example involving learning the transfer function and its derivatives for a chaotic map is discussed.
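    The chaotic-map example lends itself to a short sketch: fit a small network to the logistic map f(x) = 4x(1-x) and compare the network's derivative (by finite differences) with the true derivative f'(x) = 4 - 8x. This is a hypothetical illustration, not the report's original experiment.

        # Sketch: learn the logistic map with a small network and check
        # how well the learned mapping's derivative matches f'(x) = 4 - 8x.
        # Illustrative only.
        import numpy as np
        from sklearn.neural_network import MLPRegressor

        x = np.linspace(0.0, 1.0, 400).reshape(-1, 1)
        y = 4.0 * x.ravel() * (1.0 - x.ravel())  # logistic (chaotic) map

        net = MLPRegressor(hidden_layer_sizes=(20,), activation="tanh",
                           max_iter=10000, random_state=0).fit(x, y)

        eps = 1e-4                               # finite-difference step
        df_hat = (net.predict(x + eps) - net.predict(x - eps)) / (2 * eps)
        df_true = 4.0 - 8.0 * x.ravel()
        print("max derivative error:", np.abs(df_hat - df_true).max())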

    Measuring efficiency with neural networks. An application to the public sector

    In this note we propose artificial neural networks for measuring efficiency as a complementary tool to the common techniques of the efficiency literature, such as DEA. In an application to the public sector we find that the neural network yields more robust results for ranking decision-making units.
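    One way to read the proposal: fit a network to observed inputs and outputs, then score each decision-making unit by the ratio of observed to fitted output, much as DEA compares units to an estimated frontier. The sketch below is a hypothetical illustration, not the authors' procedure.

        # Sketch: neural-network efficiency scores for decision-making units.
        # Hypothetical data and scoring rule; not the note's own method.
        import numpy as np
        from sklearn.neural_network import MLPRegressor

        rng = np.random.default_rng(2)
        inputs = rng.uniform(1.0, 10.0, size=(50, 2))            # e.g. staff, budget
        output = inputs.sum(axis=1) * rng.uniform(0.6, 1.0, 50)  # observed output

        net = MLPRegressor(hidden_layer_sizes=(8,), max_iter=5000,
                           random_state=0).fit(inputs, output)

        score = output / net.predict(inputs)     # observed vs. fitted output
        ranking = np.argsort(-score)             # most efficient units first
        print("top 5 units:", ranking[:5])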

    Why and When Can Deep -- but Not Shallow -- Networks Avoid the Curse of Dimensionality: a Review

    The paper characterizes classes of functions for which deep learning can be exponentially better than shallow learning. Deep convolutional networks are a special case of these conditions, though weight sharing is not the main reason for their exponential advantage.
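    The review's headline comparison can be stated in rough form; the bounds below are indicative of its binary-tree compositional case (constituent functions of two variables with smoothness m, accuracy eps on an n-variable target) and should be checked against the paper for exact conditions.

        % Indicative unit-count bounds for uniform accuracy eps:
        % shallow networks pay exponentially in the dimension n,
        % deep networks matched to the compositional structure do not.
        \[
          N_{\text{shallow}} = O\!\left(\varepsilon^{-n/m}\right)
          \qquad \text{vs.} \qquad
          N_{\text{deep}} = O\!\left((n-1)\,\varepsilon^{-2/m}\right)
        \]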

    Nonparametric regression using deep neural networks with ReLU activation function

    Consider the multivariate nonparametric regression model. It is shown that estimators based on sparsely connected deep neural networks with ReLU activation function and properly chosen network architecture achieve the minimax rates of convergence (up to log n factors) under a general composition assumption on the regression function. The framework includes many well-studied structural constraints such as (generalized) additive models. While there is a lot of flexibility in the network architecture, the tuning parameter is the sparsity of the network. Specifically, we consider large networks whose number of potential parameters exceeds the sample size. The analysis gives some insights into why multilayer feedforward neural networks perform well in practice. Interestingly, for the ReLU activation function the depth (number of layers) of the network architecture plays an important role, and our theory suggests that for nonparametric regression, scaling the network depth with the sample size is natural. It is also shown that under the composition assumption, wavelet estimators can only achieve suboptimal rates.
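    The composition assumption and the attained rate can be written schematically as follows; this is the general shape of the result (up to log n factors), with the precise conditions in the paper.

        % f is a composition of q+1 layers; each g_i has components that
        % depend on at most t_i variables and are beta_i-smooth.
        \[
          f = g_q \circ g_{q-1} \circ \cdots \circ g_0, \qquad
          \beta_i^{*} := \beta_i \prod_{\ell = i+1}^{q} \min(\beta_\ell, 1),
        \]
        \[
          \phi_n = \max_{0 \le i \le q} n^{-2\beta_i^{*} / (2\beta_i^{*} + t_i)}
        \]
        % phi_n is the minimax rate attained by sparse deep ReLU networks.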