40 research outputs found

    Minimizing measures of risk by saddle point conditions

    Get PDF
    The minimization of risk functions is becoming an important topic because of its applications in mathematical finance and actuarial mathematics. This paper addresses the issue in a general framework that accommodates many types of risk function. A general representation theorem for risk functions is used to transform the initial optimization problem into an equivalent one that overcomes several mathematical difficulties posed by risk functions. The new problem involves Banach spaces, but a mean value theorem for risk measures is stated that simplifies the dual problem. Optimality is then characterized by saddle point properties of a bilinear expression involving the primal and the dual variable, a characterization significantly different from those in the previous literature. Furthermore, the saddle point condition is easy to apply in practice. Four applications in finance and insurance are presented. This research was partially supported by ‘‘Welzia Management SGIIC SA, RD_Sistemas SA’’ and ‘‘MEyC’’ (Spain), Grant ECO2009-14457-C04.
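    The saddle point characterization described above can be illustrated on a toy convex-concave function (the function and all names below are illustrative, not taken from the paper). The sketch runs gradient descent on the primal variable and gradient ascent on the dual variable of f(x, y) = x² − y², whose unique saddle point is (0, 0), i.e. the point satisfying f(x*, y) ≤ f(x*, y*) ≤ f(x, y*) for all x, y:

    ```python
    # Toy saddle point search: gradient descent-ascent on a
    # convex-concave function. Purely illustrative; not the
    # paper's optimization problem or its risk functions.
    def f(x, y):
        return x**2 - y**2  # saddle point at (0, 0)

    x, y = 1.0, 1.0
    lr = 0.1
    for _ in range(200):
        gx = 2.0 * x    # df/dx: descend in the primal variable x
        gy = -2.0 * y   # df/dy: ascend in the dual variable y
        x -= lr * gx
        y += lr * gy

    # Both iterates contract toward the saddle point (0, 0).
    print(x, y)
    ```

    For this function the updates reduce to x ← 0.8·x and y ← 0.8·y, so both variables converge geometrically to the saddle point; for general convex-concave problems, step sizes and convergence require more care.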

    Local Minima, Symmetry-breaking, and Model Pruning in Variational Free Energy Minimization

    No full text
    Abstract Approximate inference by variational free energy minimization (also known as variational Bayes, or ensemble learning) has maximum likelihood and maximum a posteriori method

    March 1999

    No full text
    Introduction Nelder and Mead (1965) give a method for minimizing a function of n variables by first computing its value at each of the n+1 points of a simplex defined by an initial point (x1, x2, ..., xn) and the n points (x1, 0, ..., 0), (0, x2, ..., 0), ..., (0, 0, ..., xn). By a sequence of operations called reflections, contractions, and expansions, the point with the maximum function value is successively replaced by a point with a smaller value. This process is repeated until the variance of the function values over the simplex falls below a given tolerance or a maximum number of cycles is exceeded, whichever occurs first. The present note gives a J program for this algorithm and a Windows form for its convenient use.
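    The abstract's recipe (build a simplex, then reflect/expand/contract the worst vertex, stopping when the variance of the function values is small) can be sketched in Python as follows. This is a simplified textbook variant for illustration, not the J program the note describes; the initial simplex here perturbs each coordinate by a fixed step, and all parameter names are conventional choices, not taken from the source:

    ```python
    import statistics

    def nelder_mead(f, x0, step=0.5, tol=1e-12, max_cycles=2000,
                    alpha=1.0, gamma=2.0, rho=0.5, sigma=0.5):
        """Minimal Nelder-Mead sketch: reflect/expand/contract/shrink."""
        n = len(x0)
        # Initial simplex: x0 plus n points perturbed along each axis.
        simplex = [list(x0)]
        for i in range(n):
            p = list(x0)
            p[i] += step
            simplex.append(p)
        for _ in range(max_cycles):
            simplex.sort(key=f)                  # best first, worst last
            vals = [f(p) for p in simplex]
            # Stop when the variance of the function values is tiny.
            if statistics.pvariance(vals) < tol:
                break
            best, worst = simplex[0], simplex[-1]
            # Centroid of all vertices except the worst.
            cen = [sum(p[i] for p in simplex[:-1]) / n for i in range(n)]
            refl = [cen[i] + alpha * (cen[i] - worst[i]) for i in range(n)]
            if f(refl) < f(best):
                # Reflection is the new best: try expanding further.
                exp = [cen[i] + gamma * (refl[i] - cen[i]) for i in range(n)]
                simplex[-1] = exp if f(exp) < f(refl) else refl
            elif f(refl) < vals[-2]:
                simplex[-1] = refl               # accept the reflection
            else:
                # Contract toward the worst vertex.
                con = [cen[i] + rho * (worst[i] - cen[i]) for i in range(n)]
                if f(con) < vals[-1]:
                    simplex[-1] = con
                else:
                    # Shrink the whole simplex toward the best vertex.
                    simplex = [best] + [
                        [best[i] + sigma * (p[i] - best[i]) for i in range(n)]
                        for p in simplex[1:]]
        return min(simplex, key=f)

    # Example: minimize a shifted quadratic with minimum at (1, -2).
    res = nelder_mead(lambda p: (p[0] - 1) ** 2 + (p[1] + 2) ** 2, [0.0, 0.0])
    print(res)
    ```

    The same logic, with more careful handling of degenerate simplices, is what production implementations such as SciPy's `minimize(..., method='Nelder-Mead')` provide.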

    Improvements to Watson's Incremental Minimization Algorithm

    No full text
    We present three improvements to the incremental minimization algorithm presented by Bruce Watson at FSMNLP'2001 in Helsinki. Two of them do not affect the asymptotic worst-case computational complexity of the algorithm, but they can increase its speed on a large class of automata. The third improvement is the introduction of full memoization (memoization in the original paper was only partial), accompanied by a proof that the algorithm runs in O(|Q|^2) time for all practical sizes of data -- much better than the exponential bound given by Bruce Watson
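    The memoization idea can be sketched on the pairwise state-equivalence test that underlies this style of minimization: two states of a DFA with n states are equivalent iff no distinguishing string of length up to about n exists, so a depth-bounded recursion suffices, and caching every (pair, depth) result avoids re-exploring shared subproblems. The tiny DFA below and the depth-bounded formulation are illustrative simplifications, not the algorithm of Watson or of this paper:

    ```python
    from functools import lru_cache

    # Hypothetical 4-state DFA over {'a', 'b'} for illustration only.
    DELTA = {0: {'a': 1, 'b': 2}, 1: {'a': 1, 'b': 3},
             2: {'a': 1, 'b': 2}, 3: {'a': 1, 'b': 3}}
    FINAL = {1, 3}

    @lru_cache(maxsize=None)  # full memoization of every subproblem
    def equiv(p, q, depth):
        """Do states p and q agree on all strings of length <= depth?"""
        if p == q:
            return True
        if (p in FINAL) != (q in FINAL):
            return False       # the empty string already distinguishes them
        if depth == 0:
            return True
        return all(equiv(DELTA[p][s], DELTA[q][s], depth - 1)
                   for s in ('a', 'b'))

    n = len(DELTA)
    # Distinguishable states are separated by a string of length < n,
    # so depth n is a safe bound for full equivalence.
    print(equiv(1, 3, n))  # states 1 and 3 have identical behaviour
    print(equiv(0, 1, n))  # 0 is non-final, 1 is final
    ```

    Without the cache the recursion can revisit the same state pair exponentially often; with it, each (pair, depth) subproblem is solved once, which is the intuition behind the quadratic bound claimed above.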