
    A Parallel Algorithm for solving BSDEs - Application to the pricing and hedging of American options

    We present a parallel algorithm for solving backward stochastic differential equations (BSDEs for short), which are useful theoretical tools for many financial problems ranging from option pricing to risk management. Our algorithm, based on Gobet and Labart (2010), exploits the link between BSDEs and nonlinear partial differential equations (PDEs for short) and hence makes it possible to solve high-dimensional nonlinear PDEs. In this work, we apply it to the pricing and hedging of American options in high-dimensional local volatility models, which remains very computationally demanding. We have tested our algorithm up to dimension 10 on a cluster of 512 CPUs and obtained linear speedups, which demonstrates the scalability of our implementation.
    Comment: 25 pages.
    Keywords: backward stochastic differential equations, parallel computing, Monte Carlo methods, nonlinear PDE, American options, local volatility model.
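    The heavy lifting in such schemes is path simulation, which parallelizes almost perfectly across workers. The sketch below is a hedged illustration of that structure only, not the Gobet-Labart (2010) scheme itself: it prices a plain European put under Black-Scholes by splitting independent Monte Carlo paths across processes and merging the partial sums, the same embarrassingly parallel pattern behind the linear speedups reported above. All parameter values are arbitrary.

```python
# Hedged sketch: parallel Monte Carlo with paths split across processes.
# This is a simplified stand-in, not the authors' BSDE/Picard algorithm.
import numpy as np
from multiprocessing import Pool

S0, K, r, sigma, T = 100.0, 100.0, 0.05, 0.2, 1.0  # arbitrary toy parameters

def partial_sum(args):
    """Simulate n_paths terminal prices; return the discounted payoff sum."""
    seed, n_paths = args
    rng = np.random.default_rng(seed)
    z = rng.standard_normal(n_paths)
    s_T = S0 * np.exp((r - 0.5 * sigma**2) * T + sigma * np.sqrt(T) * z)
    return np.exp(-r * T) * np.maximum(K - s_T, 0.0).sum(), n_paths

if __name__ == "__main__":
    n_workers, paths_per_worker = 8, 250_000
    with Pool(n_workers) as pool:  # each worker gets an independent seed
        parts = pool.map(partial_sum,
                         [(s, paths_per_worker) for s in range(n_workers)])
    total, n = (sum(v) for v in zip(*parts))
    print(f"European put price ~ {total / n:.4f}")
```

    Workers never communicate until the final reduction, which is why near-linear speedups are attainable as the CPU count grows.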

    PURIFY: a new approach to radio-interferometric imaging

    In a recent article series, the authors have promoted convex optimization algorithms for radio-interferometric imaging in the framework of compressed sensing, which leverages sparsity regularization priors for the associated inverse problem and defines a minimization problem for image reconstruction. This approach was shown, in theory and through simulations in a simple discrete visibility setting, to have the potential to significantly outperform CLEAN and its evolutions. In this work, we leverage the versatility of convex optimization in solving minimization problems to both handle realistic continuous visibilities and offer a highly parallelizable structure, paving the way to significant acceleration of the reconstruction and to high-dimensional data scalability. The new algorithmic structure relies on the simultaneous-direction method of multipliers (SDMM) and contrasts with the current major-minor cycle structure of CLEAN and its evolutions, which in particular cannot handle the state-of-the-art minimization problems under consideration, where neither the regularization term nor the data term is differentiable. We release a beta version of an SDMM-based imaging software written in C and dubbed PURIFY (http://basp-group.github.io/purify/) that handles various sparsity priors, including our recent average sparsity approach, SARA. We evaluate the performance of different priors through simulations in the continuous visibility setting, confirming the superiority of SARA.
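    SDMM splits a sum of non-smooth terms via proximal operators. As a much simpler, hedged stand-in (proximal gradient / ISTA rather than SDMM, and requiring a differentiable data term, unlike the fully non-smooth problems PURIFY targets), the sketch below shows the key ingredient, the l1 proximal operator (soft-thresholding), recovering a sparse signal from toy linear measurements. All sizes and values are illustrative.

```python
# Hedged sketch: ISTA for min_x 0.5*||y - A x||^2 + lam*||x||_1.
# A simplified stand-in for proximal splitting, not PURIFY's SDMM solver.
import numpy as np

def soft_threshold(x, t):
    # proximal operator of t*||.||_1: shrink each entry toward zero
    return np.sign(x) * np.maximum(np.abs(x) - t, 0.0)

def ista(A, y, lam, n_iter=200):
    step = 1.0 / np.linalg.norm(A, 2) ** 2   # 1/L, L = Lipschitz const of grad
    x = np.zeros(A.shape[1])
    for _ in range(n_iter):
        grad = A.T @ (A @ x - y)             # gradient of the smooth data term
        x = soft_threshold(x - step * grad, step * lam)
    return x

rng = np.random.default_rng(1)
A = rng.standard_normal((64, 256))           # toy "measurement" operator
x_true = np.zeros(256)
x_true[[10, 90, 200]] = [3.0, -2.0, 1.5]     # sparse ground truth
y = A @ x_true + 0.01 * rng.standard_normal(64)
print(np.flatnonzero(np.abs(ista(A, y, lam=0.1)) > 0.5))  # recovered support
```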

    ROOT - A C++ Framework for Petabyte Data Storage, Statistical Analysis and Visualization

    ROOT is an object-oriented C++ framework conceived in the high-energy physics (HEP) community, designed for storing and analyzing petabytes of data in an efficient way. Any instance of a C++ class can be stored in a ROOT file in a machine-independent compressed binary format. In ROOT the TTree object container is optimized for statistical data analysis over very large data sets by using vertical (columnar) data storage techniques. These containers can span a large number of files on local disks, the web, or a number of different shared file systems. To analyze this data, the user can choose from a wide set of mathematical and statistical functions, including linear algebra classes, numerical algorithms such as integration and minimization, and various methods for performing regression analysis (fitting). In particular, ROOT offers packages for complex data modeling and fitting, as well as multivariate classification based on machine learning techniques. A central piece of these analysis tools is the set of histogram classes, which provide binning of one- and multi-dimensional data. Results can be saved in high-quality graphical formats like PostScript and PDF, or in bitmap formats like JPG or GIF. Results can also be stored as ROOT macros that allow a full recreation and reworking of the graphics. Users typically create their analysis macros step by step, making use of the interactive C++ interpreter CINT, while running over small data samples. Once development is finished, they can run these macros at full compiled speed over large data sets, using on-the-fly compilation, or by creating a stand-alone batch program. Finally, if processing farms are available, the user can reduce the execution time of intrinsically parallel tasks - e.g. data mining in HEP - by using PROOF, which takes care of optimally distributing the work over the available resources in a transparent way.
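    A hedged sketch of the workflow just described, written against ROOT's Python bindings (PyROOT) rather than as a C++ macro. It assumes a working ROOT installation; the file, tree, and histogram names are arbitrary.

```python
# Hedged PyROOT sketch: write values into a TTree inside a compressed ROOT
# file, then histogram and fit them. Assumes ROOT with Python bindings.
import array
import ROOT

f = ROOT.TFile("demo.root", "RECREATE")    # machine-independent compressed file
tree = ROOT.TTree("events", "toy events")  # columnar (vertical) data container
x = array.array("d", [0.0])
tree.Branch("x", x, "x/D")                 # one double-precision branch

rng = ROOT.TRandom3(42)
for _ in range(10_000):
    x[0] = rng.Gaus(0.0, 1.0)
    tree.Fill()
tree.Write()

# Fill a histogram straight from the tree, then run a quiet Gaussian fit.
hist = ROOT.TH1D("hx", "x;value;entries", 100, -5.0, 5.0)
tree.Draw("x>>hx", "", "goff")             # fill without opening graphics
hist.Fit("gaus", "Q")
f.Close()
```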

    CEDAR: tools for event generator tuning

    I describe the work of the CEDAR collaboration in developing tools for tuning and validating Monte Carlo event generator programs. The core CEDAR task is to interface the Durham HepData database of experimental measurements to event generator validation tools such as the UCL JetWeb system; this has necessitated the migration of HepData to a new relational database system and a Java-based interaction model. The "number crunching" part of JetWeb is also being upgraded, from the Fortran HZTool library to the new C++ Rivet system and a generator interfacing layer named RivetGun. Finally, I describe how Rivet is already being used as a central part of a new generator tuning system, and summarise two other CEDAR activities, HepML and HepForge.
    Comment: 13 pages, prepared for the XI International Workshop on Advanced Computing and Analysis Techniques in Physics Research, Amsterdam, April 23-27, 2007.
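    At the heart of any such validation or tuning loop is a goodness-of-fit comparison between a generator's binned predictions and reference measurements. The sketch below is a hedged illustration of that single step only (it is not JetWeb or Rivet code, and the numbers are invented).

```python
# Hedged sketch: score generator tunes against a reference measurement
# with a per-bin chi-squared, the basic operation in generator tuning.
import numpy as np

def chi2(pred, data, data_err):
    """Chi-squared between binned prediction and measured bin contents."""
    return float(np.sum(((pred - data) / data_err) ** 2))

data     = np.array([120.0, 95.0, 60.0, 31.0, 12.0])  # measured bin contents
data_err = np.sqrt(data)                               # toy Poisson errors
pred_a   = np.array([118.0, 97.0, 58.0, 33.0, 11.0])  # generator tune A
pred_b   = np.array([140.0, 80.0, 70.0, 20.0, 18.0])  # generator tune B

for name, pred in [("tune A", pred_a), ("tune B", pred_b)]:
    print(name, "chi2/ndf =", chi2(pred, data, data_err) / len(data))
# the tune with the lower chi2/ndf better reproduces the measurement
```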

    Inference in Graphical Gaussian Models with Edge and Vertex Symmetries with the gRc Package for R

    In this paper we present the R package gRc for statistical inference in graphical Gaussian models in which symmetry restrictions have been imposed on the concentration or partial correlation matrix. The models are represented by coloured graphs, where parameters associated with edges or vertices of the same colour are restricted to be identical. We describe algorithms for maximum likelihood estimation and discuss model selection issues. The paper illustrates the practical use of the gRc package.
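    To make the symmetry restriction concrete: edges of the same colour share a single concentration parameter, which forces their partial correlations to coincide. The numpy sketch below illustrates the model structure only (it is not the gRc estimation code): it builds a constrained concentration matrix for three variables and reads off the implied partial correlations.

```python
# Hedged sketch: a coloured-graph symmetry restriction on a concentration
# matrix K, with the implied partial correlations. Not the gRc algorithms.
import numpy as np

d_diag = 1.0     # shared vertex-colour parameter (all diagonal entries equal)
e_shared = -0.3  # shared edge-colour parameter for edges (1,2) and (2,3)

K = np.array([
    [d_diag,   e_shared, 0.0],
    [e_shared, d_diag,   e_shared],
    [0.0,      e_shared, d_diag],
])

# Partial correlation of i and j given the rest: -K_ij / sqrt(K_ii * K_jj)
D = np.sqrt(np.diag(K))
partial_corr = -K / np.outer(D, D)
np.fill_diagonal(partial_corr, 1.0)
print(partial_corr)  # the two same-coloured edges give identical values
```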

    Effectiveness and feasibility of lowering playground density during recess to promote physical activity and decrease sedentary time at primary school

    Background: This pilot study aimed to investigate the effectiveness of lowering playground density in increasing children's physical activity and decreasing their sedentary time. The feasibility of the intervention was also tested. Methods: Data were collected in September and October 2012 from 187 children aged 9–12 years in three Belgian schools. During the intervention, playground density was decreased by splitting up recesses and reducing the number of children sharing the playground. A within-subject design was used, and children wore accelerometers during the study week. Three-level (class - participant - measurement (baseline or intervention)) linear regression models were used to determine intervention effects. After the intervention week, the school principals filled out a questionnaire on the feasibility of the intervention. Results: The available play space was 12.18 ± 4.19 m2/child at baseline and increased to 24.24 ± 8.51 m2/child during the intervention. During the intervention, sedentary time decreased (-0.58 min/recess; -3.21%/recess) and moderate-to-vigorous physical activity increased (+1.04 min/recess; +5.9%/recess) during recess, as well as over the entire school day (sedentary time: -3.29%/school day; moderate-to-vigorous physical activity: +1.16%/school day). All principals agreed that children enjoyed the intervention, but some difficulties were reported. Conclusions: Lowering playground density can be an effective intervention for decreasing children's sedentary time and increasing their physical activity levels during recess, especially in the least active children.
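    The three-level model nests repeated measurements within pupils and pupils within classes. As a hedged sketch of that structure (synthetic data and statsmodels rather than the authors' software; all variable names are invented), a mixed model with a class-level random intercept and a pupil-within-class variance component can be fit like this:

```python
# Hedged sketch: three-level model (measurement in pupil in class) on
# synthetic data via statsmodels MixedLM. Not the study's actual analysis.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(0)
rows = []
for c in range(6):                        # classes
    c_eff = rng.normal(0, 1)
    for p in range(10):                   # pupils nested in class
        p_eff = rng.normal(0, 1)
        for cond in ("baseline", "intervention"):
            y = 10 + 1.0 * (cond == "intervention") + c_eff + p_eff + rng.normal(0, 1)
            rows.append({"klass": c, "pupil": f"{c}_{p}",
                         "condition": cond, "mvpa_min": y})
df = pd.DataFrame(rows)

# class random intercept via groups=; pupil-within-class via vc_formula
model = smf.mixedlm("mvpa_min ~ condition", df, groups="klass",
                    re_formula="1", vc_formula={"pupil": "0 + C(pupil)"})
print(model.fit().summary())  # may warn on convergence for a toy sample
```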

    An Object-Oriented Framework for Statistical Simulation: The R Package simFrame

    Simulation studies are widely used by statisticians to gain insight into the quality of newly developed methods. Usually some guidelines regarding, e.g., simulation designs, contamination, missing data models, or evaluation criteria are necessary in order to draw meaningful conclusions. The R package simFrame is an object-oriented framework for statistical simulation that allows researchers to make use of a wide range of simulation designs with minimal programming effort. Its object-oriented implementation provides clear interfaces for extensions by the user. Since statistical simulation is an embarrassingly parallel process, the framework supports parallel computing to increase computational performance. Furthermore, an appropriate plot method is selected automatically depending on the structure of the simulation results. In this paper, the implementation of simFrame is discussed in detail and the functionality of the framework is demonstrated with examples of different simulation designs.
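    A hedged Python analogue of that design (not the simFrame API itself; all names are invented): a simulation design is a grid of scenarios, each replication is an independent task, and the replications run embarrassingly parallel across processes, with a contamination setting built into the data-generating step.

```python
# Hedged sketch: a minimal simulation framework in the spirit described
# above, with a design grid, contamination, and parallel replications.
import random
import statistics
from concurrent.futures import ProcessPoolExecutor
from itertools import product

def one_run(task):
    """One replication: draw a (possibly contaminated) sample, return
    a non-robust (mean) and a robust (median) location estimate."""
    (n, contamination), rep = task
    rng = random.Random(rep)
    xs = [rng.gauss(0, 1) if rng.random() > contamination else rng.gauss(10, 1)
          for _ in range(n)]
    return n, contamination, statistics.fmean(xs), statistics.median(xs)

if __name__ == "__main__":
    design = list(product([50, 500], [0.0, 0.1]))   # simulation design grid
    tasks = [(s, rep) for s in design for rep in range(100)]
    with ProcessPoolExecutor() as pool:             # embarrassingly parallel
        results = list(pool.map(one_run, tasks))
    for n, c in design:
        sub = [(m, med) for (nn, cc, m, med) in results if (nn, cc) == (n, c)]
        print(f"n={n} contam={c}: "
              f"mean={statistics.fmean(m for m, _ in sub):.2f} "
              f"median={statistics.fmean(med for _, med in sub):.2f}")
    # under contamination the mean drifts toward the outliers; the median
    # stays near zero, the kind of comparison such frameworks automate
```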