
    Incorporating statistical model error into the calculation of acceptability prices of contingent claims

    The determination of acceptability prices of contingent claims requires the choice of a stochastic model for the underlying asset price dynamics. Given this model, optimal bid and ask prices can be found by stochastic optimization. However, the model for the underlying asset price process is typically based on data and found by a statistical estimation procedure. We define a confidence set of possible estimated models by a nonparametric neighborhood of a baseline model. This neighborhood serves as the ambiguity set for a multi-stage stochastic optimization problem under model uncertainty. We obtain distributionally robust solutions of the acceptability pricing problem and derive the dual problem formulation. Moreover, we prove a general large deviations result for the nested distance, which allows us to relate the bid and ask prices under model ambiguity to the quality of the observed data. Comment: 27 pages, 2 figures.
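
    As a rough illustration of the pricing problem described above, a distributionally robust ask price can be sketched as follows; the acceptability functional \(\mathcal{A}\), the nested distance \(d\), the radius \(\varepsilon\), and the terminal wealth \(V_T(\xi)\) are placeholder notation chosen here, and the paper's precise formulation may differ.

```latex
% Sketch: acceptability ask price under a single fixed model P
\pi^{a}_{P}(C) \;=\; \inf\Bigl\{\, c \in \mathbb{R} \;:\; \exists\, \text{self-financing } \xi,\;
    V_0(\xi) = c,\;\; \mathcal{A}_{P}\bigl(V_T(\xi) - C\bigr) \ge 0 \,\Bigr\}

% Sketch: distributionally robust version, where acceptability must hold for every
% model Q in a nested-distance ball of radius \varepsilon around the estimated
% baseline model \widehat{P}
\pi^{a}_{\varepsilon}(C) \;=\; \inf\Bigl\{\, c \;:\; \exists\, \xi,\; V_0(\xi) = c,\;\;
    \mathcal{A}_{Q}\bigl(V_T(\xi) - C\bigr) \ge 0 \;\;\forall\, Q:\; d\bigl(Q,\widehat{P}\bigr) \le \varepsilon \,\Bigr\}
```

    The bid price is the symmetric counterpart, a supremum over initial capitals for which the opposite position remains acceptable under every model in the ambiguity set.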

    Online Estimation and Optimization of Utility-Based Shortfall Risk

    Utility-Based Shortfall Risk (UBSR) is a risk metric that is increasingly popular in financial applications, owing to certain desirable properties that it enjoys. We consider the problem of estimating UBSR in a recursive setting, where samples from the underlying loss distribution are available one at a time. We cast the UBSR estimation problem as a root-finding problem and propose stochastic approximation-based estimation schemes. We derive non-asymptotic bounds on the estimation error as a function of the number of samples. We also consider the problem of UBSR optimization within a parameterized class of random variables. We propose a stochastic gradient descent based algorithm for UBSR optimization and derive non-asymptotic bounds on its convergence.
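
    To make the recursive setting concrete, the sketch below runs a Robbins-Monro recursion on the root-finding formulation E[ℓ(X − t)] = λ, consuming one sample per step. The exponential loss, the step-size schedule, and the function names are illustrative choices made here, not the paper's algorithm or its tuning.

```python
import numpy as np

def ubsr_sa_estimate(sample_stream, loss_fn, threshold, n_steps, t0=0.0, step_c=1.0):
    """Stochastic-approximation estimate of utility-based shortfall risk.

    Approximates the root t of  E[loss_fn(X - t)] = threshold  with a
    Robbins-Monro recursion, using one sample X_k per step (the online
    setting described in the abstract).
    """
    t = t0
    for k in range(1, n_steps + 1):
        x = next(sample_stream)
        a_k = step_c / k                            # diminishing step size
        t = t + a_k * (loss_fn(x - t) - threshold)  # push t up if the loss exceeds the threshold
    return t

# Usage: exponential loss ell(u) = exp(beta * u) and standard Gaussian losses,
# for which the UBSR has the closed form beta/2 - ln(lambda)/beta.
if __name__ == "__main__":
    rng = np.random.default_rng(0)
    stream = iter(lambda: rng.normal(0.0, 1.0), None)   # endless sample stream
    beta, lam = 1.0, 1.0
    est = ubsr_sa_estimate(stream, lambda u: np.exp(beta * u), lam, n_steps=100_000)
    print("UBSR estimate:", est)                         # closed form here is 0.5
```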

    On the Minimization of Convex Functionals of Probability Distributions Under Band Constraints

    The problem of minimizing convex functionals of probability distributions is solved under the assumption that the density of every distribution is bounded from above and below. A system of necessary and sufficient first-order optimality conditions as well as a bound on the optimality gap of feasible candidate solutions are derived. Based on these results, two numerical algorithms are proposed that iteratively solve the system of optimality conditions on a grid of discrete points. Both algorithms use a block coordinate descent strategy and terminate once the optimality gap falls below the desired tolerance. While the first algorithm is conceptually simpler and more efficient, it is not guaranteed to converge for objective functions that are not strictly convex. This shortcoming is overcome in the second algorithm, which uses an additional outer proximal iteration and is proven to converge under mild assumptions. Two examples are given to demonstrate the theoretical usefulness of the optimality conditions as well as the high efficiency and accuracy of the proposed numerical algorithms. Comment: 13 pages, 5 figures, 2 tables, published in the IEEE Transactions on Signal Processing. In previous versions, the example in Section VI.B contained some mistakes and inaccuracies, which have been fixed in this version.
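
    The following small, self-contained sketch illustrates the problem class: minimizing a convex functional of a discretized density whose values are confined to a band, with normalization enforced. It uses a naive pairwise mass-transfer coordinate descent and is meant only to make the constraints concrete; it does not reproduce the paper's two algorithms or their convergence guarantees.

```python
import numpy as np
from scipy.optimize import minimize_scalar

def band_constrained_bcd(f, lower, upper, dx, n_sweeps=30):
    """Toy pairwise coordinate descent for
         min  sum_i f(p_i, i) * dx
         s.t. lower_i <= p_i <= upper_i  and  sum_i p_i * dx = 1,
    i.e. a convex functional of a discretized density under band constraints.
    Mass is transferred between pairs of grid points, so the normalization
    constraint is preserved at every step.
    """
    n = len(lower)
    p = np.full(n, 1.0 / (n * dx))                        # uniform start
    assert np.all(p >= lower) and np.all(p <= upper), "uniform start must be feasible"
    for _ in range(n_sweeps):
        for i in range(n):
            for j in range(i + 1, n):
                lo = max(lower[i] - p[i], p[j] - upper[j])   # admissible transfer range
                hi = min(upper[i] - p[i], p[j] - lower[j])
                if hi - lo < 1e-12:
                    continue
                res = minimize_scalar(lambda d: f(p[i] + d, i) + f(p[j] - d, j),
                                      bounds=(lo, hi), method="bounded")
                p[i] += res.x
                p[j] -= res.x
    return p

# Usage: least-squares fit of a density to a linear target that violates both bands.
if __name__ == "__main__":
    n = 30
    dx = 1.0 / n
    lower = np.full(n, 0.5)
    upper = np.full(n, 1.6)
    target = np.linspace(0.0, 2.0, n)
    f = lambda q, i: (q - target[i]) ** 2
    p = band_constrained_bcd(f, lower, upper, dx)
    print("normalization:", p.sum() * dx)                 # ~1.0
    print("band respected:", p.min() >= 0.5, p.max() <= 1.6)
```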

    Mixed-Integer Convex Nonlinear Optimization with Gradient-Boosted Trees Embedded

    Decision trees usefully represent sparse, high-dimensional, and noisy data. Having learned a function from this data, we may want to thereafter integrate the function into a larger decision-making problem, e.g., for picking the best chemical process catalyst. We study a large-scale, industrially relevant mixed-integer nonlinear nonconvex optimization problem involving both gradient-boosted trees and penalty functions mitigating risk. This mixed-integer optimization problem with convex penalty terms broadly applies to optimizing pre-trained regression tree models. Decision makers may wish to optimize discrete models to repurpose legacy predictive models, or they may wish to optimize a discrete model that represents a data set particularly well. We develop several heuristic methods to find feasible solutions, and an exact branch-and-bound algorithm leveraging structural properties of the gradient-boosted trees and penalty functions. We computationally test our methods on a concrete mixture design instance and a chemical catalysis industrial instance.
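
    A minimal sketch of the general idea of embedding a pre-trained tree in a mixed-integer program is given below. It uses the standard big-M encoding with one binary indicator per leaf, a single hand-coded regression tree, and the PuLP modeling library as an assumed dependency; the paper's boosted ensemble, convex penalty terms, and specialized branch-and-bound algorithm are omitted.

```python
import pulp

# One small regression tree over a single feature x in [0, 10]:
#   leaf 0: x <= 3        -> prediction 1.0
#   leaf 1: 3 < x <= 7    -> prediction 2.5
#   leaf 2: x > 7         -> prediction 1.8
leaf_values = [1.0, 2.5, 1.8]

prob = pulp.LpProblem("tree_embedded_opt", pulp.LpMaximize)
x = pulp.LpVariable("x", lowBound=0.0, upBound=10.0)
z = [pulp.LpVariable(f"z{k}", cat="Binary") for k in range(3)]  # leaf indicators

prob += pulp.lpSum(v * zk for v, zk in zip(leaf_values, z))     # maximize the tree's prediction
prob += pulp.lpSum(z) == 1                                      # exactly one leaf is active

M, eps = 10.0, 1e-4                                             # big-M constant, strict-inequality tolerance
prob += x <= 3.0 + M * (1 - z[0])                               # leaf 0 active  =>  x <= 3
prob += x >= 3.0 + eps - M * (1 - z[1])                         # leaf 1 active  =>  x > 3 ...
prob += x <= 7.0 + M * (1 - z[1])                               # ... and x <= 7
prob += x >= 7.0 + eps - M * (1 - z[2])                         # leaf 2 active  =>  x > 7

prob.solve(pulp.PULP_CBC_CMD(msg=False))
print("x* =", pulp.value(x), "prediction =", pulp.value(prob.objective))
```

    With this encoding, optimizing over the tree's input reduces to choosing the active leaf and a feasible input inside that leaf's box, which is the combinatorial structure an exact branch-and-bound method can exploit.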