Risk averse shape optimization - risk measures and stochastic orders
In this thesis, risk averse shape optimization of elastic structures is at issue. Shape optimization, in general, deals with the type of problems where the variable to be optimized is the geometry or the shape of a domain. Here, the domain represents an elastic body which is subjected to an applied force. Relying on the concepts of shape and topology sensitivity analysis, an algorithm is implemented which successively improves the elastic body concerning the elastic response and the total volume.
Eventually, this procedure results in a body which, on the one hand, is as stiff as possible with respect to the applied force and, on the other hand, has a volume that is as small as possible. Solving such problems numerically requires an efficient solver for the underlying partial differential equation (here the linearized elasticity model). To this end, different finite element methods and mesh generation approaches are applied.
Level set methods are employed to realize the evolution of the elastic body in the discrete setting.
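As a minimal illustration of the level set idea (not the solver used in the thesis), the sketch below advects a signed distance function with a prescribed normal speed. Central differences via np.gradient are an assumption made here for brevity; practical level set solvers use upwind schemes and periodic reinitialization.

```python
import numpy as np

def level_set_step(phi, speed, dt, dx=1.0):
    """One explicit step of the level set equation phi_t + V*|grad phi| = 0.

    The zero contour of phi represents the boundary of the elastic body and
    moves with normal speed V. Central differences (np.gradient) are used
    here for brevity; production solvers use upwind schemes instead.
    """
    gy, gx = np.gradient(phi, dx)
    grad_norm = np.sqrt(gx**2 + gy**2)
    return phi - dt * speed * grad_norm

# Example: a circle of radius 1, described by a signed distance function,
# grows uniformly when moved with positive normal speed.
x = np.linspace(-2.0, 2.0, 81)
X, Y = np.meshgrid(x, x)
phi = np.sqrt(X**2 + Y**2) - 1.0                 # zero contour: unit circle
phi_next = level_set_step(phi, speed=1.0, dt=0.1, dx=x[1] - x[0])
```

Because phi is a signed distance function, |grad phi| is 1 away from the center, so each step simply shifts the level values by dt * speed and the zero contour expands accordingly.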
Uncertainty is then introduced by considering the applied force to be random. Thus, the elastic response of the body can be interpreted as a random variable that depends on the shape. In a first approach, risk measures, which are well known in economics, are proposed to assess random variables. Risk measures give a notion of the risk associated with a random variable. Risk neutral and risk averse models are discussed and used to optimize over a class of shapes.
A new perspective arises when stochastic dominance relations are employed for the assessment of risk. They define a partial order on the space of random variables and make it possible to compare them to each other directly. Taking a benchmark random variable associated with a required behavior under uncertainty, a set of acceptable shapes can be identified by comparison to this benchmark. An additional criterion, e.g. minimal volume, is used to select shapes from this set. In that way, shapes with low volume are found which still meet the prescribed requirements.
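The two assessment tools described above can be sketched numerically. Assuming Monte Carlo samples of the random response are available (the function names, sample sizes, and threshold grid below are illustrative assumptions, not taken from the thesis), a tail risk measure and a stop-loss dominance check against a benchmark look roughly as follows:

```python
import numpy as np

def avar(samples, alpha=0.9):
    """Average value at risk: mean of the worst (1 - alpha) fraction of samples."""
    s = np.sort(samples)
    return s[int(np.ceil(alpha * len(s))):].mean()

def acceptable(compliance, benchmark, thresholds):
    """Stop-loss criterion E[(X - t)+] <= E[(B - t)+] for all thresholds t.

    This characterizes the increasing convex order, a standard stochastic
    order for comparing random costs against a benchmark: True means the
    shape's compliance is no riskier than the benchmark on this grid.
    """
    return all(
        np.mean(np.maximum(compliance - t, 0.0))
        <= np.mean(np.maximum(benchmark - t, 0.0))
        for t in thresholds
    )

# Illustrative samples: a shape with uniformly lower compliance is acceptable.
good = np.zeros(100)
bad = np.ones(100)
grid = np.linspace(0.0, 2.0, 5)
```

A set of shapes passing the `acceptable` test could then be ranked by a secondary criterion such as volume, mirroring the selection step described in the abstract.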
Managing Risk of Bidding in Display Advertising
In this paper, we deal with the uncertainty of bidding for display
advertising. Similar to the financial market trading, real-time bidding (RTB)
based display advertising employs an auction mechanism to automate the
impression-level media buying; and running a campaign is no different from an
investment in acquiring new customers in return for additional
converted sales. Thus, how to optimally bid on an ad impression to drive the
profit and return-on-investment becomes essential. However, the large
randomness of user behavior and the cost uncertainty caused by auction
competition may introduce significant risk into the campaign performance
estimation. In this paper, we explicitly model the uncertainty of user
click-through rate estimation and auction competition to capture the risk. We
borrow an idea from finance and derive the value at risk for each ad display
opportunity. Our formulation results in two risk-aware bidding strategies that
penalize risky ad impressions and focus more on the ones with higher expected
return and lower risk. The empirical study on real-world data demonstrates the
effectiveness of our proposed risk-aware bidding strategies: yielding profit
gains of 15.4% in offline experiments and up to 17.5% in an online A/B test on
a commercial RTB platform over the widely applied bidding strategies.
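The value-at-risk idea in this abstract can be sketched as follows. Assuming a Beta posterior over the click-through rate under a uniform prior (an illustrative modeling choice, not the paper's exact formulation), a risk-aware bid uses a pessimistic CTR quantile instead of the posterior mean, so uncertain estimates are penalized:

```python
import numpy as np

def risk_aware_bid(clicks, impressions, value_per_click,
                   alpha=0.05, n_samples=100_000):
    """Bid on a pessimistic CTR quantile instead of the posterior mean.

    Assumes a Beta(1 + clicks, 1 + impressions - clicks) posterior over the
    click-through rate (uniform prior); the alpha-quantile is a value-at-risk
    style lower bound, so noisy CTR estimates lead to lower bids.
    """
    rng = np.random.default_rng(0)
    ctr = rng.beta(1 + clicks, 1 + impressions - clicks, size=n_samples)
    return value_per_click * np.quantile(ctr, alpha)
```

Under this sketch, a campaign with few observations gets a wider posterior and hence a lower bid than one with the same empirical CTR but more data, which is exactly the "penalize risky impressions" behavior the abstract describes.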
Optimal design, robustness, and risk aversion
Highly optimized tolerance is a model of optimization in engineered systems,
which gives rise to power-law distributions of failure events in such systems.
The archetypal example is the highly optimized forest fire model. Here we give
an analytic solution for this model which explains the origin of the power
laws. We also generalize the model to incorporate risk aversion, which results
in truncation of the tails of the power law so that the probability of
disastrously large events is dramatically lowered, giving the system more
robustness.
Diversification and Endogenous Financial Networks
We test the hypothesis that interconnections across financial institutions
can be explained by a diversification motive. This idea stems from the
empirical evidence of the existence of long-term exposures that cannot be
explained by a liquidity motive (maturity or currency mismatch). We model
endogenous interconnections of heterogeneous financial institutions facing
regulatory constraints by maximizing their expected utility. Both
theoretical and simulation-based results are compared to a stylized genuine
financial network. The diversification motive appears to plausibly explain
interconnections among key players. Using our model, the impact of regulation
on interconnections between banks, currently discussed at the Basel Committee
on Banking Supervision, is analyzed.
Risk Aversion in Finite Markov Decision Processes Using Total Cost Criteria and Average Value at Risk
In this paper we present an algorithm to compute risk averse policies in
Markov Decision Processes (MDP) when the total cost criterion is used together
with the average value at risk (AVaR) metric. Risk averse policies are needed
when large deviations from the expected behavior may have detrimental effects,
and conventional MDP algorithms usually ignore this aspect. We provide
conditions for the structure of the underlying MDP ensuring that approximations
for the exact problem can be derived and solved efficiently. Our findings are
novel inasmuch as average value at risk has not previously been considered in
association with the total cost criterion. Our method is demonstrated in a
rapid deployment scenario, whereby a robot is tasked with reaching a target
location before a temporal deadline, where increased speed is associated with
an increased probability of failure. We demonstrate that the
proposed algorithm not only produces a risk averse policy reducing the
probability of exceeding the expected temporal deadline, but also provides the
statistical distribution of costs, thus offering a valuable analysis tool.
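The AVaR metric in this abstract can be computed directly once a policy's discrete cost distribution is known. The following sketch (illustrative, not the paper's algorithm) averages the worst (1 - alpha) probability mass of the costs:

```python
import numpy as np

def average_value_at_risk(costs, probs, alpha=0.9):
    """AVaR_alpha of a discrete cost distribution: the expected cost within
    the worst (1 - alpha) probability mass, where costs[i] occurs with
    probability probs[i]."""
    order = np.argsort(costs)[::-1]              # largest costs first
    c = np.asarray(costs, dtype=float)[order]
    p = np.asarray(probs, dtype=float)[order]
    tail_mass = 1.0 - alpha
    remaining, acc = tail_mass, 0.0
    for ci, pi in zip(c, p):
        take = min(pi, remaining)                # draw mass from the worst outcomes
        acc += take * ci
        remaining -= take
        if remaining <= 0.0:
            break
    return acc / tail_mass
```

For example, with costs [0, 10] occurring with probabilities [0.9, 0.1], the expected cost is only 1 while AVaR at level 0.9 is 10, which is why a policy that merely minimizes expectation can look deceptively safe in scenarios like the deadline-driven deployment above.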