
    Convex risk measures for portfolio optimization and concepts of flexibility

    Due to their axiomatic foundation and their favorable computational properties, convex risk measures are becoming a powerful tool in financial risk management. In this paper we review the fundamental structural concepts of convex risk measures within the framework of convex analysis. We then exploit these concepts to derive strong duality relations in a generic portfolio optimization context. In particular, the duality relationship can be used to design new, efficient approximation algorithms based on Nesterov's smoothing techniques for non-smooth convex optimization. Furthermore, the presented concepts enable us to formalize the notion of flexibility as the (marginal) risk absorption capacity of a technology or (available) resources.
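
    For illustration (this sketch is not taken from the paper), Conditional Value-at-Risk, a standard coherent and hence convex risk measure, can be minimized over portfolio weights via the Rockafellar-Uryasev linear-programming reformulation. The scenario returns, confidence level, and use of scipy are assumptions of this example.

    ```python
    import numpy as np
    from scipy.optimize import linprog

    # Minimal sketch: minimize CVaR over portfolio weights via the
    # Rockafellar-Uryasev LP. Scenario data is synthetic, not from the paper.
    rng = np.random.default_rng(0)
    n_scenarios, n_assets = 500, 4
    R = rng.normal(0.001, 0.02, size=(n_scenarios, n_assets))  # scenario returns
    alpha = 0.95

    # Variables: [w (weights), t (VaR level), u (scenario excess losses)];
    # minimize t + 1/((1-alpha)*S) * sum(u_s)  with  u_s >= -R_s.w - t, u_s >= 0.
    c = np.concatenate([np.zeros(n_assets), [1.0],
                        np.full(n_scenarios, 1.0 / ((1 - alpha) * n_scenarios))])
    A_ub = np.hstack([-R, -np.ones((n_scenarios, 1)), -np.eye(n_scenarios)])
    b_ub = np.zeros(n_scenarios)
    A_eq = np.concatenate([np.ones(n_assets), [0.0], np.zeros(n_scenarios)])[None, :]
    b_eq = np.array([1.0])  # budget constraint: weights sum to one
    bounds = [(0, None)] * n_assets + [(None, None)] + [(0, None)] * n_scenarios

    res = linprog(c, A_ub=A_ub, b_ub=b_ub, A_eq=A_eq, b_eq=b_eq, bounds=bounds)
    print("weights:", np.round(res.x[:n_assets], 3), "CVaR:", round(res.fun, 5))
    ```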

    Risk management of power portfolios and valuation of flexibility

    Risk management by applying operational flexibility is becoming a key issue for production companies. This paper discusses how a power portfolio can be hedged through its own production assets. In particular, we model the operational flexibility of a hydro pump storage plant and show how to dispatch it to hedge against adverse movements in the portfolio. Moreover, we show how volume risk, which is not hedgeable with standard contracts from power exchanges, can be managed by an intelligent dispatch policy. Despite the incompleteness of the market, we quantify the value of this operational flexibility in the framework of coherent risk measures.
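
    As a toy illustration only (a deterministic, risk-neutral simplification, not the paper's stochastic, risk-measure-based model), the sketch below dispatches a pump-storage plant against a known hourly price curve. All plant parameters and prices are invented.

    ```python
    import numpy as np
    from scipy.optimize import linprog

    # Toy pump-storage dispatch against a known price curve (invented data).
    T = 24
    price = 30 + 15 * np.sin(np.arange(T) / T * 2 * np.pi)  # EUR/MWh
    eta = 0.75                   # round-trip pumping efficiency
    g_max, p_max = 100.0, 80.0   # turbining / pumping limits (MW)
    s0, s_max = 400.0, 800.0     # initial and maximum storage (MWh)

    # Variables x = [g_0..g_{T-1}, p_0..p_{T-1}]; maximize sum price*(g - p).
    c = np.concatenate([-price, price])  # linprog minimizes, so negate revenue

    # Storage s_t = s0 - cumsum(g) + eta*cumsum(p) must stay in [0, s_max].
    L = np.tril(np.ones((T, T)))         # L @ g computes running sums
    A_ub = np.vstack([
        np.hstack([L, -eta * L]),        # s_t >= 0
        np.hstack([-L, eta * L]),        # s_t <= s_max
    ])
    b_ub = np.concatenate([np.full(T, s0), np.full(T, s_max - s0)])
    bounds = [(0, g_max)] * T + [(0, p_max)] * T

    res = linprog(c, A_ub=A_ub, b_ub=b_ub, bounds=bounds)
    # No end-of-horizon constraint, so the optimizer drains the reservoir.
    print("dispatch revenue:", round(-res.fun, 1), "EUR")
    ```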

    Pivoting in Linear Complementarity: Two Polynomial-Time Cases

    We study the behavior of simple principal pivoting methods for the P-matrix linear complementarity problem (P-LCP). We solve an open problem of Morris by showing that Murty’s least-index pivot rule (under any fixed index order) leads to a quadratic number of iterations on Morris’s highly cyclic P-LCP examples. We then show that on K-matrix LCP instances, all pivot rules require only a linear number of iterations. As the main tool, we employ unique-sink orientations of cubes, a useful combinatorial abstraction of the P-LCP.
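
    For concreteness, here is a minimal sketch (not the authors' code) of Murty's least-index principal pivoting scheme for LCP(q, M): repeatedly flip the complementary pair with the smallest violated index. For P-matrices this provably terminates at the unique solution; the matrix and vector below are invented.

    ```python
    import numpy as np

    def murty_least_index(M, q, max_iter=10_000):
        """Murty's least-index principal pivoting for LCP(q, M):
        find w, z >= 0 with w = q + M @ z and w . z = 0."""
        n = len(q)
        basic = np.zeros(n, dtype=bool)  # basic[i]: z_i is basic (then w_i = 0)
        for _ in range(max_iter):
            z = np.zeros(n)
            B = np.where(basic)[0]
            if len(B):                   # w_B = 0  =>  M_BB z_B = -q_B
                z[B] = np.linalg.solve(M[np.ix_(B, B)], -q[B])
            w = q + M @ z
            # sign violations: z_i < 0 for basic i, w_i < 0 for nonbasic i
            viol = np.where((basic & (z < -1e-10)) | (~basic & (w < -1e-10)))[0]
            if len(viol) == 0:
                return w, z
            basic[viol[0]] ^= True       # flip least-index complementary pair
        raise RuntimeError("iteration limit reached")

    # Symmetric positive definite matrices are P-matrices, so this terminates.
    M = np.array([[2.0, 1.0], [1.0, 3.0]])
    q = np.array([-1.0, -2.0])
    w, z = murty_least_index(M, q)
    print("z =", z, "w =", w)
    ```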

    Scheduling to Minimize Maximum Workload

    We pose the following problem: given m jobs, each of which requires a certain total amount of labour that must be performed within specified time periods, how should one schedule the jobs' execution to obtain a total workload that is as even as possible? A related question is: what is the minimal work capacity needed to accomplish all jobs? These questions can be formulated as a linear program, but the number of variables and constraints required will usually be large. Using linear duality theory we instead derive a purely combinatorial problem whose resolution leads to the needed minimal capacity, and thus to the imposed bottleneck. We then concentrate on the important special case where the time constraints for performing each job take the form of a single time interval: we detail a simple procedure that efficiently determines the minimal capacity and the bottleneck. A second efficient combinatorial algorithm determines a feasible execution schedule which minimizes the maximum total workload. These algorithms require a computational time of the order of m² and negligible core memory, and for most practical applications can be implemented on microcomputers.
    Keywords: production/scheduling: work studies; programming: linear; algorithms
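
    The linear program mentioned in the abstract is simple to state: with divisible jobs, minimize a capacity C subject to each job's labour being completed within its window and each period's total load staying below C. The sketch below (an invented three-job instance solved with scipy) illustrates that formulation; it is not the paper's combinatorial procedure.

    ```python
    import numpy as np
    from scipy.optimize import linprog

    jobs = [  # (labour, first period, last period) -- hypothetical instance
        (6.0, 0, 2),
        (4.0, 1, 3),
        (5.0, 2, 4),
    ]
    T = 5
    pairs = [(j, t) for j, (_, a, b) in enumerate(jobs) for t in range(a, b + 1)]
    n = len(pairs)

    # Variables: x_{j,t} for each feasible (job, period) pair, plus capacity C.
    c = np.concatenate([np.zeros(n), [1.0]])   # minimize C
    A_eq = np.zeros((len(jobs), n + 1))        # each job fully executed
    b_eq = np.array([labour for labour, _, _ in jobs])
    A_ub = np.zeros((T, n + 1))                # per-period load <= C
    A_ub[:, -1] = -1.0
    for k, (j, t) in enumerate(pairs):
        A_eq[j, k] = 1.0
        A_ub[t, k] = 1.0

    res = linprog(c, A_ub=A_ub, b_ub=np.zeros(T), A_eq=A_eq, b_eq=b_eq)
    print("minimal capacity C =", round(res.x[-1], 3))  # bottleneck here: 3.0
    ```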

    Combinatorial Maximum Improvement Algorithm for LP and LCP

    In this paper, we show how one can design new pivot algorithms for solving the LP and the LCP. In particular, we are interested in combinatorial pivot algorithms which solve the LP and a certain class of LCPs. Here, a pivot algorithm is called combinatorial if the pivot choice depends only on the signs of the entries of the associated dictionaries. The best source of combinatorial pivot algorithms is the theory of oriented matroid (OM) programming [Bla77a, Edm94, Fuk82, FT92, LL86, Ter87, Tod85, Wan87]. Bland's well-known pivot rule [Bla77b] for the simplex method can be considered a combinatorial algorithm, but it is not a typical one. The main characteristic of the "OM" algorithms is that feasibility need not be preserved at all in either the primal or the dual problem; the finiteness of the algorithms is guaranteed by a purely combinatorial improvement argument rather than by reasoning based on the increase of the objective function value. One immediate advantage of combinatorial algorithms is that degeneracy does not have to be treated separately. Thus a very simple combinatorial algorithm, such as the criss-cross method [Ter87, Wan87], solves the general LP correctly and yields one of the simplest proofs of the strong duality theorem. There is a well-noted disadvantage of combinatorial algorithms: the number of pivot operations needed to solve the LP tends to grow rapidly in practice. Furthermore, it is often quite easy to construct a class of LPs for which a given combinatorial algorithm takes an exponential number of pivot operations in the input size. In this paper, we review the finiteness proofs of combinatorial algorithms and study a new algorithm in this class. The key ingredients of the new algorithm are "history dependency" and "largest combinatorial improvement".
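
    To make the sign-based pivoting notion concrete, the sketch below implements the simplex method with Bland's least-index rule, which the text cites as a (non-typical) combinatorial algorithm: entering and leaving variables are chosen purely by sign and index comparisons, which prevents cycling even under degeneracy. The small instance at the end is invented.

    ```python
    import numpy as np

    def simplex_bland(A, b, c):
        """Simplex with Bland's least-index rule for
        max c.x s.t. A x <= b, x >= 0, assuming b >= 0 (feasible origin)."""
        m, n = A.shape
        T = np.hstack([A, np.eye(m), b.reshape(-1, 1)])   # tableau with slacks
        z = np.concatenate([-c, np.zeros(m + 1)])         # reduced-cost row
        basis = list(range(n, n + m))
        while True:
            # Entering variable: least index with negative reduced cost.
            enter = next((j for j in range(n + m) if z[j] < -1e-12), None)
            if enter is None:
                x = np.zeros(n + m)
                x[basis] = T[:, -1]
                return x[:n], z[-1]
            ratios = [(T[i, -1] / T[i, enter], basis[i], i)
                      for i in range(m) if T[i, enter] > 1e-12]
            if not ratios:
                raise ValueError("LP is unbounded")
            # Leaving variable: least basis index among minimum-ratio rows.
            rmin = min(v for v, _, _ in ratios)
            _, _, row = min(t for t in ratios if t[0] <= rmin + 1e-12)
            T[row] /= T[row, enter]
            for i in range(m):
                if i != row:
                    T[i] -= T[i, enter] * T[row]
            z -= z[enter] * T[row]
            basis[row] = enter

    A = np.array([[1.0, 1.0], [2.0, 1.0]])
    b = np.array([4.0, 6.0])
    c = np.array([3.0, 2.0])
    x, val = simplex_bland(A, b, c)   # expected: x = (2, 2), value 10
    print("x* =", x, "objective =", val)
    ```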