
    Review of modern numerical methods for a simple vanilla option pricing problem

    Option pricing is an attractive problem in financial engineering and optimization. The problem of determining the fair price of an option arises from the assumptions made under a given financial market model. The increasing complexity of these market assumptions contributes to the popularity of numerical treatments of option valuation. The pricing and hedging of plain vanilla options under the Black–Scholes model therefore usually serve as a benchmark for the development of new numerical pricing approaches and of methods designed for advanced option pricing models. The objective of the paper is to present and compare methodological concepts for the valuation of simple vanilla options using relatively modern numerical techniques arising from the discontinuous Galerkin method, the wavelet approach and the fuzzy transform technique. A theoretical comparison is accompanied by an empirical study based on the numerical verification of simple vanilla option prices. The resulting numerical schemes represent a particularly effective option pricing tool that better captures features of options which depend on the discretization of the computational domain and on the order of the polynomial approximation.
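    As context for the Black–Scholes benchmark mentioned above, a minimal sketch of the closed-form price of a European vanilla call (the textbook formula only, not the discontinuous Galerkin, wavelet or fuzzy transform schemes compared in the paper) could look as follows in Python:

        from math import log, sqrt, exp
        from statistics import NormalDist

        def bs_call(S, K, T, r, sigma):
            """Closed-form Black-Scholes price of a European call.

            S: spot, K: strike, T: time to maturity in years,
            r: risk-free rate, sigma: volatility.
            """
            d1 = (log(S / K) + (r + 0.5 * sigma**2) * T) / (sigma * sqrt(T))
            d2 = d1 - sigma * sqrt(T)
            N = NormalDist().cdf
            return S * N(d1) - K * exp(-r * T) * N(d2)

        # Example: at-the-money call, one year to maturity
        print(bs_call(S=100.0, K=100.0, T=1.0, r=0.05, sigma=0.2))

    Numerical schemes for the same problem can then be checked against this analytic value.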

    Fuzzy Supernova Templates I: Classification

    Modern supernova (SN) surveys are now uncovering stellar explosions at rates that far surpass what the world's spectroscopic resources can handle. In order to make full use of these SN datasets, it is necessary to use analysis methods that depend only on the survey photometry. This paper presents two methods for utilizing a set of SN light curve templates to classify SN objects. In the first case we present an updated version of the Bayesian Adaptive Template Matching program (BATM). To address some shortcomings of that strictly Bayesian approach, we introduce a method for Supernova Ontology with Fuzzy Templates (SOFT), which utilizes Fuzzy Set Theory for the definition and combination of SN light curve models. For well-sampled light curves with a modest signal-to-noise ratio (S/N > 10), the SOFT method can correctly separate thermonuclear (Type Ia) SNe from core-collapse SNe with 98% accuracy. In addition, the SOFT method has the potential to classify supernovae into sub-types, providing photometric identification of very rare or peculiar explosions. The accuracy and precision of the SOFT method are verified using Monte Carlo simulations as well as real SN light curves from the Sloan Digital Sky Survey and the SuperNova Legacy Survey. In a subsequent paper the SOFT method is extended to address the problem of parameter estimation, providing estimates of redshift, distance, and host galaxy extinction without any spectroscopy. Comment: 26 pages, 12 figures. Accepted to ApJ.
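    As an illustration of the fuzzy-template idea (not the actual SOFT implementation; the common epoch grid, Gaussian-like membership and max aggregation below are assumptions made for brevity), a classifier sketch might be:

        import numpy as np

        def fuzzy_classify(obs_flux, obs_err, templates):
            """Assign fuzzy memberships of an observed light curve to SN classes.

            obs_flux, obs_err : arrays sampled on a common epoch grid (assumed).
            templates         : dict mapping class label -> list of template flux arrays.
            """
            memberships = {}
            for label, temps in templates.items():
                best = 0.0
                for t in temps:
                    chi2 = np.sum(((obs_flux - t) / obs_err) ** 2) / len(obs_flux)
                    best = max(best, np.exp(-0.5 * chi2))  # membership in [0, 1]
                memberships[label] = best
            total = sum(memberships.values()) or 1.0
            return {k: v / total for k, v in memberships.items()}  # normalized class memberships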

    Exploiting zoning based on approximating splines in cursive script recognition

    Because of its complexity, handwriting recognition has to exploit many sources of information to be successful, e.g. the handwriting zones. The variability of zone-lines, however, requires a more flexible representation than the traditional horizontal or linear methods provide. The proposed method therefore employs approximating cubic splines. Using entire lines of text rather than individual words is shown to improve the zoning accuracy, especially for short words. The new method represents an improvement over existing methods in terms of range of applicability, zone-line precision and zoning-classification accuracy. Application to several problems of handwriting recognition is demonstrated and evaluated.
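    As a rough illustration of zone-line fitting with approximating cubic splines (the baseline points and smoothing factor below are invented for the example; the paper's zoning pipeline is considerably more involved):

        import numpy as np
        from scipy.interpolate import UnivariateSpline

        # (x, y) positions of estimated baseline points along one line of text (assumed data)
        x = np.array([10, 60, 120, 180, 250, 320, 400], dtype=float)
        y = np.array([98, 101, 97, 103, 100, 96, 99], dtype=float)

        # Approximating (smoothing) cubic spline; s trades off fidelity to the
        # points against smoothness of the fitted zone-line.
        baseline = UnivariateSpline(x, y, k=3, s=20.0)

        xs = np.linspace(x.min(), x.max(), 200)
        zone_line = baseline(xs)  # smooth estimate of the baseline across the whole text line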

    Monotonicity preserving approximation of multivariate scattered data

    This paper describes a new method of monotone interpolation and smoothing of multivariate scattered data. It is based on the assumption that the function to be approximated is Lipschitz continuous. The method provides the optimal approximation in the worst-case scenario and tight error bounds. Smoothing of noisy data subject to monotonicity constraints is converted into a quadratic programming problem. Estimation of the unknown Lipschitz constant from the data by sample splitting and cross-validation is described. An extension of the method to locally Lipschitz functions is presented.
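    For intuition, the classical Lipschitz interpolant that underlies this kind of method can be sketched as below; this is the plain, unconstrained interpolant, whereas the monotone smoothing described in the abstract additionally requires solving a quadratic program:

        import numpy as np

        def lipschitz_interpolant(x, X, Y, L):
            """Central Lipschitz interpolant of scattered data at a query point x.

            X : (n, d) array of data sites, Y : (n,) data values,
            L : Lipschitz constant (in practice estimated from the data),
            x : (d,) query point.  Returns the midpoint of the tight bounds.
            """
            d = np.linalg.norm(X - x, axis=1)  # distances to all data sites
            upper = np.min(Y + L * d)          # tightest upper bound at x
            lower = np.max(Y - L * d)          # tightest lower bound at x
            return 0.5 * (upper + lower)

    The midpoint of the upper and lower bounds is what makes the interpolant optimal in the worst-case sense.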

    An evolutionary approach to constraint-regularized learning

    The success of machine learning methods for inducing models from data crucially depends on the proper incorporation of background knowledge about the model to be learned. The idea of constraint-regularized learning is to employ fuzzy set-based modeling techniques in order to express such knowledge in a flexible way, and to formalize it in terms of fuzzy constraints. Thus, background knowledge can be used to appropriately bias the learning process within the regularization framework of inductive inference. After a brief review of this idea, the paper offers an operationalization of constraint-regularized learning. The corresponding framework is based on evolutionary methods for model optimization and employs fuzzy rule bases of the Takagi–Sugeno type as flexible function approximators.
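    To make the modeling ingredient concrete, a minimal sketch of evaluating a zero-order Takagi–Sugeno rule base is given below; the Gaussian antecedents and constant consequents are assumptions for illustration, and the paper optimizes such rule bases with evolutionary methods under fuzzy constraints rather than evaluating a fixed one:

        import numpy as np

        def ts_eval(x, rules):
            """Evaluate a zero-order Takagi-Sugeno rule base at input vector x.

            rules: list of (centers, widths, consequent) triples, one per rule;
            the antecedent is a product of Gaussian memberships and the
            consequent is a constant (assumed structure for this sketch).
            """
            num, den = 0.0, 0.0
            for centers, widths, consequent in rules:
                w = np.prod(np.exp(-0.5 * ((x - centers) / widths) ** 2))  # firing strength
                num += w * consequent
                den += w
            return num / den if den > 0 else 0.0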

    Monotone approximation of aggregation operators using least squares splines

    The need for monotone approximation of scattered data often arises in many problems of regression, when the monotonicity is semantically important. One such domain is fuzzy set theory, where membership functions and aggregation operators are order preserving. Least squares polynomial splines provide great flexibility when modeling non-linear functions, but may fail to be monotone. Linear restrictions on spline coefficients provide necessary and sufficient conditions for spline monotonicity. The basis for splines is selected in such a way that these restrictions take an especially simple form. The resulting non-negative least squares problem can be solved by a variety of standard proven techniques. Additional interpolation requirements can also be imposed in the same framework. The method is applied to fuzzy systems, where membership functions and aggregation operators are constructed from empirical data.
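    As a simplified analogue of the approach (a ramp basis is used here instead of the paper's spline basis, so that non-negative coefficients imply monotonicity and the fit reduces to non-negative least squares):

        import numpy as np
        from scipy.optimize import nnls

        def fit_monotone(x, y, knots):
            """Least-squares fit of a nondecreasing piecewise-linear function.

            A ramp (truncated linear) basis is used so that non-negative
            coefficients guarantee monotonicity; the intercept is split into
            +/- parts so it remains effectively unconstrained.
            """
            ramps = np.maximum(0.0, x[:, None] - knots[None, :])  # basis functions
            A = np.column_stack([np.ones_like(x), -np.ones_like(x), ramps])
            coef, _ = nnls(A, y)  # non-negative least squares

            def fitted(t):
                B = np.column_stack([np.ones_like(t), -np.ones_like(t),
                                     np.maximum(0.0, t[:, None] - knots[None, :])])
                return B @ coef
            return fitted

        # Example: noisy but monotone trend
        x = np.linspace(0, 1, 50)
        y = np.sqrt(x) + 0.05 * np.random.randn(50)
        f = fit_monotone(x, y, knots=np.linspace(0, 1, 10))
        yhat = f(x)  # monotone nondecreasing fit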