    Multivariate utility maximization with proportional transaction costs and random endowment

    In this paper we deal with a utility maximization problem at finite horizon on a continuous-time market with conical (and time-varying) constraints, particularly suited to model a currency market with proportional transaction costs. In particular, we extend the results of Campi and Owen (2011) to the situation where the agent is initially endowed with a random and possibly unbounded quantity of assets. We start by studying some basic properties of the value function (which is now defined on a space of random variables), then we dualize the problem following convex analysis techniques which have proven very useful in this field of research. We finally prove the existence of a solution to the dual problem and (under an additional boundedness assumption on the endowment) to the primal problem. The last section of the paper is devoted to an application of our results to utility indifference pricing.
    Keywords: Transaction costs; Foreign exchange market; Multivariate utility function; Optimal portfolio; Duality theory; Random endowment; Utility-based pricing
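    A schematic view of the primal-dual pair studied here, in generic notation that is an assumption of this note rather than the paper's exact statement (A the set of terminal positions attainable under the conical constraints, E the random endowment, U the multivariate utility with convex conjugate V, D a suitable dual domain):

    \[
      u(E) \;=\; \sup_{X \in \mathcal{A}} \mathbb{E}\bigl[U(X + E)\bigr],
      \qquad
      v(E) \;=\; \inf_{Y \in \mathcal{D}} \mathbb{E}\bigl[V(Y) + Y \cdot E\bigr].
    \]

    The abstract's existence results then read: the dual problem admits a minimizer, and the primal problem admits a maximizer under the additional boundedness assumption on E.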

    Spatial models for flood risk assessment

    The problem of computing risk measures associated with flood events is extremely important, not only from the point of view of civil protection systems but also because municipalities need to insure against the damages. In this work we propose, within an integrated strategy, an operating solution which merges the usually available information in a conditional approach. First, we use a Logistic Auto-Logistic (LAM) model to estimate the univariate conditional probabilities of flood events. This approach has two fundamental advantages: it makes it possible to incorporate auxiliary information, and it does not require the target variables to be independent. Then we simulate the joint distribution of flood events by means of the Gibbs sampler. Finally, we propose an algorithm to increase ex post the spatial autocorrelation of the simulated events. The methodology is shown to be effective through an application to the estimation of the flood probability of Italian hydrographic regions.
    Keywords: Flood risk; Conditional approach; LAM model; Pseudo-maximum likelihood estimation; Spatial autocorrelation; Gibbs sampler
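    A rough sketch, under assumed notation, of the simulation step described above: site-by-site Gibbs sampling from an autologistic (LAM-type) model. The parameters alpha, beta and the neighbourhood matrix W are illustrative placeholders, not the paper's specification.

    import numpy as np

    def gibbs_autologistic(alpha, beta, W, n_sweeps=1000, seed=None):
        """Simulate binary flood indicators y_i in {0, 1}.

        alpha : (n,) site intercepts (can encode auxiliary covariates)
        beta  : scalar spatial-interaction coefficient (assumed)
        W     : (n, n) symmetric 0/1 neighbourhood matrix, zero diagonal (assumed)
        """
        rng = np.random.default_rng(seed)
        n = len(alpha)
        y = rng.integers(0, 2, size=n)      # arbitrary starting configuration
        for _ in range(n_sweeps):
            for i in range(n):
                # full conditional: logit P(y_i = 1 | rest) = alpha_i + beta * (W @ y)_i
                eta = alpha[i] + beta * (W[i] @ y)
                y[i] = rng.random() < 1.0 / (1.0 + np.exp(-eta))
        return y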

    A note on maximum likelihood estimation of a Pareto mixture

    In this paper we study maximum likelihood estimation of the parameters of a Pareto mixture. The Pareto distribution is a commonly used model for heavy-tailed data: it is a two-parameter distribution whose shape parameter determines the degree of heaviness of the tail, so that it can be adapted to data with different features. Applying standard techniques to a mixture of Pareto distributions is problematic, so we develop two alternative algorithms: the first is based on simulated annealing, the second on cross-entropy minimization. This work is motivated by an application in operational risk measurement: we fit a Pareto mixture to operational losses recorded by a bank in two different business lines. Losses below an unknown, business-line-specific threshold are discarded, so that the observed data are truncated. Thus, under the assumption that each population follows a Pareto distribution, the appropriate model is a mixture of Pareto distributions in which all the parameters have to be estimated.
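    A minimal sketch of the estimation step, assuming a two-component mixture and a generic simulated-annealing loop; the parameterization (scale x_m and shape a per component) and the cooling schedule are illustrative, not the paper's exact algorithm.

    import numpy as np

    def pareto_logpdf(x, xm, a):
        # log density of Pareto(xm, a): a * xm^a / x^(a+1) for x >= xm, else 0
        out = np.full_like(x, -np.inf, dtype=float)
        ok = x >= xm
        out[ok] = np.log(a) + a * np.log(xm) - (a + 1) * np.log(x[ok])
        return out

    def mixture_loglik(theta, x):
        w, xm1, a1, xm2, a2 = theta
        return np.logaddexp(np.log(w) + pareto_logpdf(x, xm1, a1),
                            np.log(1 - w) + pareto_logpdf(x, xm2, a2)).sum()

    def anneal(x, theta0, n_iter=20000, t0=1.0, seed=None):
        rng = np.random.default_rng(seed)
        theta = np.asarray(theta0, dtype=float)   # start from a valid guess
        best, ll = theta.copy(), mixture_loglik(theta, x)
        ll_best = ll
        for k in range(n_iter):
            t = t0 / (1 + k)                      # simple cooling schedule
            prop = theta + rng.normal(scale=0.05, size=theta.size)
            if not (0 < prop[0] < 1 and prop[1:].min() > 0):
                continue                          # reject invalid parameters
            ll_prop = mixture_loglik(prop, x)
            if ll_prop > ll or rng.random() < np.exp((ll_prop - ll) / t):
                theta, ll = prop, ll_prop
                if ll > ll_best:
                    best, ll_best = theta.copy(), ll
        return best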

    A framework for cut-off sampling in business survey design

    In sampling theory, the high concentration of the population with respect to most surveyed variables constitutes a problem which is difficult to tackle by means of classical tools. One possible solution is cut-off sampling, which explicitly prescribes discarding part of the population; in particular, when the population is composed of firms or establishments, the method results in the exclusion of the "smallest" firms. Whereas this sampling scheme is common among practitioners, its theoretical foundations tend to be considered weak, because the inclusion probability of some units is equal to zero. In this paper we propose a framework to justify cut-off sampling and to determine the census and cut-off thresholds. We use an estimation model which assumes as known the weight of the discarded units with respect to each variable; we compute the variance of the estimator and its bias, which is caused by violations of the aforementioned hypothesis. We develop an algorithm which minimizes the MSE as a function of multivariate auxiliary information at the population level. Given the combinatorial optimization nature of the problem, we resort to the theory of stochastic relaxation: in particular, we use the simulated annealing algorithm.
    Keywords: Cut-off sampling; skewed populations; model-based estimation; optimal stratification; simulated annealing
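    A toy, single-variable illustration of the threshold-selection idea: the paper works with multivariate auxiliary information and a simulated-annealing search, while the grid search and the simple-random-sampling variance formula below are simplifying assumptions made to keep the sketch short.

    import numpy as np

    def cutoff_mse(x, threshold, n_sample, assumed_share):
        """MSE of estimating the total of x when units below `threshold` are
        discarded and their (unknown) share of the total is replaced by the
        analyst's guess `assumed_share`."""
        kept = x[x >= threshold]
        N = len(kept)
        total = x.sum()
        # expansion estimator: sample total of kept units, inflated by 1/(1 - s)
        expected_est = kept.sum() / (1 - assumed_share)
        bias = expected_est - total                      # model-violation bias
        # SRS variance of the estimated kept-total, inflated by the same factor
        var = (N / (1 - assumed_share)) ** 2 * (1 - n_sample / N) \
              * kept.var(ddof=1) / n_sample
        return var + bias ** 2

    rng = np.random.default_rng(0)
    x = rng.pareto(1.5, size=5000) + 1                   # skewed toy population
    grid = np.quantile(x, np.linspace(0.0, 0.5, 51))
    best = min(grid, key=lambda t: cutoff_mse(x, t, 200, assumed_share=0.05))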

    Utility indifference valuation for non-smooth payoffs with an application to power derivatives

    We consider the problem of exponential utility indifference valuation in a simplified framework where traded and nontraded assets are uncorrelated but where the claim to be priced may depend on both. Traded asset prices follow a multivariate Black and Scholes model, while nontraded asset prices evolve as generalized Ornstein–Uhlenbeck processes. We provide a BSDE characterization of the utility indifference price (UIP) for a large class of non-smooth, possibly unbounded payoffs depending simultaneously on both classes of assets. Focusing then on vanilla claims and using the Gaussian structure of the model, we employ BSDE techniques (in particular, a Malliavin-type representation theorem due to Ma and Zhang, Ann Appl Probab 12:1390–1418, 2002) to prove the regularity of Z and to characterize the UIP for possibly discontinuous vanilla payoffs as a viscosity solution of a suitable PDE with continuous space derivatives. The optimal hedging strategy is identified, essentially, as the delta hedging strategy corresponding to the UIP. Since there are no closed-form formulas in general, we also obtain asymptotic expansions of prices and hedging strategies when the risk aversion parameter is small. Finally, our results are applied to pricing and hedging power derivatives in various structural models for energy markets.
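    For reference, a standard way to write the exponential utility indifference price p(B) of a claim B (generic notation, not necessarily the paper's): with risk aversion \gamma > 0 and X_T^{x,\pi} the terminal wealth of an admissible strategy \pi started from initial capital x,

    \[
      u(x; B) \;=\; \sup_{\pi} \mathbb{E}\Bigl[-e^{-\gamma\,(X_T^{x,\pi} - B)}\Bigr],
      \qquad
      u\bigl(x + p(B); B\bigr) \;=\; u(x; 0).
    \]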

    Private Communications Using Optical Chaos

    After a brief summary of the basic methods for secure transmission using optical chaos, we report on the most recent achievements, namely the comparison between the standard two-laser and three-laser schemes and the network architecture for multiuser secure transmission. From our investigations, we found that while both the basic two-laser and the three-laser schemes are suitable for secure data exchange, the three-laser scheme offers a better level of privacy due to its symmetrical topology. Moreover, while transmission based on optical chaos is usually restricted to point-to-point interconnections, a more advanced solution, derived from well-known public key cryptography, allows for private message transmission between any pair of subscribers in a network.

    In silico screening of mutational effects on enzyme-proteic inhibitor affinity: a docking-based approach

    Background: Molecular recognition between enzymes and proteic inhibitors is crucial for the normal functioning of many biological pathways. Mutations in either the enzyme or the inhibitor protein often lead to a modulation of the binding affinity with no major alterations in the 3D structure of the complex.
    Results: In this study, a rigid-body docking-based approach has been successfully probed for its ability to predict the effects of single and multiple point mutations on the binding energetics in three enzyme-proteic inhibitor systems. The only requirement of the approach is an accurate structural model of the complex between the wild-type forms of the interacting proteins, under the assumption that the architecture of the mutated complexes is almost the same as that of the wild type and that no major conformational changes occur upon binding. The method was applied to 23 variants of the ribonuclease inhibitor-angiogenin complex, to 15 variants of the barnase-barstar complex, and to 8 variants of the bovine pancreatic trypsin inhibitor-β-trypsin system, leading to thermodynamic and kinetic estimates consistent with in vitro data. Furthermore, simulations with and without explicit water molecules at the protein-protein interface suggested that they should be included only when their positions are well defined both in the wild type and in the mutants and when they prove relevant for the modulation of mutational effects on the association process.
    Conclusion: The correlative models built in this study allow for predictions of mutational effects on the thermodynamics and kinetics of association of three substantially different systems, and represent important extensions of our computational approach to cases in which it is not possible to estimate the absolute free energies. Moreover, this study is the first example in the literature of an extensive evaluation of the correlative weights of the single components of the ZDOCK score on the thermodynamics and kinetics of binding of protein mutants compared to the native state. Finally, the results of this study corroborate and extend a previously developed quantitative model for in silico predictions of absolute protein-protein binding affinities spanning a wide range of values, i.e. from -10 up to -21 kcal/mol. The computational approach is simple and fast and can be used for structure-based design of protein-protein complexes and for in silico screening of mutational effects on protein-protein recognition.
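    A hedged sketch of the "correlative model" idea described above: regressing experimental binding free energies of the mutant complexes on the individual docking-score terms (e.g. the ZDOCK components for shape complementarity, electrostatics and desolvation). The data layout and the ordinary-least-squares fit are illustrative assumptions, not the paper's protocol.

    import numpy as np

    def fit_correlative_model(score_components, dG):
        """OLS fit: dG ~ intercept + linear combination of score terms.

        score_components : (n_mutants, n_terms) docking-score terms per complex
        dG               : (n_mutants,) experimental binding free energies (kcal/mol)
        Returns the fitted coefficients and the in-sample R^2.
        """
        X = np.column_stack([np.ones(len(dG)), score_components])
        coef, *_ = np.linalg.lstsq(X, dG, rcond=None)
        pred = X @ coef
        ss_res = np.sum((dG - pred) ** 2)
        ss_tot = np.sum((dG - dG.mean()) ** 2)
        return coef, 1 - ss_res / ss_tot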