1,089 research outputs found
Deep Learning in a Generalized HJM-type Framework Through Arbitrage-Free Regularization
We introduce a regularization approach to arbitrage-free factor-model
selection. The considered model selection problem seeks to learn the closest
arbitrage-free HJM-type model to any prespecified factor-model. An asymptotic
solution to this, a priori computationally intractable, problem is represented
as the limit of a 1-parameter family of optimizers to computationally tractable
model selection tasks. Each of these simplified model-selection tasks seeks to
learn the most similar model, to the prescribed factor-model, subject to a
penalty detecting when the reference measure is a local martingale-measure for
the entire underlying financial market. A simple expression for the penalty
terms is obtained in the bond market within the affine term-structure setting,
and it is used to formulate a deep-learning approach to arbitrage-free affine
term-structure modelling. Numerical implementations are also performed to
evaluate the performance in the bond market.
Comment: 23 Pages + References
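The penalized-family construction above can be caricatured numerically. The sketch below is purely illustrative: the "model" is just a parameter vector, and `penalty` is a hypothetical stand-in for the paper's local-martingale-measure penalty, not the actual term. It minimizes a distance to a reference model plus lam times the penalty, and shows the minimizers approaching the closest penalty-free model as lam grows.

```python
import numpy as np

# Toy sketch of the 1-parameter family of penalized model-selection
# problems: minimize ||theta - theta_ref||^2 + lam * penalty(theta).
# `penalty` is a made-up placeholder, zero exactly on {theta[0] <= 0},
# standing in for the paper's arbitrage-detecting penalty.

def penalty(theta):
    return max(theta[0], 0.0) ** 2

def grad_penalty(theta):
    g = np.zeros_like(theta)
    g[0] = 2.0 * max(theta[0], 0.0)
    return g

def minimize(theta_ref, lam, steps=500):
    theta = theta_ref.copy()
    lr = 0.5 / (1.0 + lam)          # step size scaled down for stability
    for _ in range(steps):
        grad = 2.0 * (theta - theta_ref) + lam * grad_penalty(theta)
        theta = theta - lr * grad
    return theta

theta_ref = np.array([1.0, 0.5])    # prespecified reference factor model
for lam in (1.0, 10.0, 100.0):
    theta = minimize(theta_ref, lam)
    print(lam, theta.round(4), round(penalty(theta), 6))
# As lam grows, theta[0] shrinks toward the penalty-free set while
# theta[1] stays at the reference value 0.5.
```

For this quadratic toy the minimizer is 1/(1+lam) in the first coordinate, so letting lam tend to infinity recovers the projection onto the penalty-free set, mirroring the paper's limit of tractable optimizers.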
A survey of random processes with reinforcement
The models surveyed include generalized P\'{o}lya urns, reinforced random
walks, interacting urn models, and continuous reinforced processes. Emphasis is
on methods and results, with sketches provided of some proofs. Applications are
discussed in statistics, biology, economics and a number of other areas.
Comment: Published at http://dx.doi.org/10.1214/07-PS094 in Probability
Surveys (http://www.i-journals.org/ps/) by the Institute of Mathematical
Statistics (http://www.imstat.org)
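The simplest reinforced process in the survey's scope, the classical Pólya urn, can be simulated in a few lines. In this sketch (parameters and seeds are ours), drawing a ball reinforces its color, and the red fraction converges to a random limit; starting from one ball of each color, that limit is Uniform(0,1).

```python
import random

# Classical Polya urn: draw a ball uniformly, return it together with
# one extra ball of the same color. The red fraction is a martingale
# and converges to a Beta(red, black)-distributed random limit.

def polya_urn(red=1, black=1, draws=10_000, rng=None):
    rng = rng or random.Random(0)
    for _ in range(draws):
        if rng.random() < red / (red + black):
            red += 1        # drew red: reinforce red
        else:
            black += 1      # drew black: reinforce black
    return red / (red + black)

# With red = black = 1 the limit is Uniform(0,1): individual runs land
# anywhere in (0,1), but the average over many runs is near 1/2.
limits = [polya_urn(rng=random.Random(seed)) for seed in range(200)]
print(sum(limits) / len(limits))
```

The reinforcement is what distinguishes this from an i.i.d. scheme: the drawing probabilities depend on the whole past, yet the limiting fraction still exists almost surely.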
On Policy Evaluation with Aggregate Time-Series Shocks
We propose a new algorithm for estimating treatment effects in contexts where
the exogenous variation comes from aggregate time-series shocks. Our estimator
combines data-driven unit-level weights with a time-series model. We use the
unit weights to control for unobserved aggregate confounders and use the
time-series model to extract the quasi-random variation from the observed
shock. We examine our algorithm's performance in a realistic simulation based
on Nakamura and Steinsson [2014]. We provide statistical guarantees for our
estimator in a practically relevant regime, where both cross-sectional and
time-series dimensions are large, and show how to use it to conduct robust
inference.
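To fix ideas, the setting can be illustrated with a textbook exposure-times-shock regression on simulated panel data. This is not the paper's estimator (which combines data-driven unit weights with a time-series model); it is a minimal baseline, with all numbers and the data-generating process invented here, showing how unit-level demeaning removes unobserved unit effects before regressing on the exposure-interacted aggregate shock.

```python
import numpy as np

rng = np.random.default_rng(0)
N, T, beta = 50, 40, 2.0
e = rng.normal(size=N)            # unit exposures to the aggregate shock
z = rng.normal(size=T)            # aggregate time-series shock
alpha = rng.normal(size=N)        # unobserved unit-level confounders
noise = rng.normal(scale=0.5, size=(N, T))
y = beta * np.outer(e, z) + alpha[:, None] + noise

# Demean within each unit (removes alpha), then run pooled OLS of the
# demeaned outcome on the demeaned exposure-times-shock regressor.
x = np.outer(e, z)
y_d = y - y.mean(axis=1, keepdims=True)
x_d = x - x.mean(axis=1, keepdims=True)
beta_hat = (x_d * y_d).sum() / (x_d ** 2).sum()
print(beta_hat)   # close to the true effect 2.0
```

The paper's contribution can be read against this baseline: when the shock is observed with confounded aggregate components, simple demeaning is not enough, which is where the data-driven unit weights and the time-series model come in.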
Safety Evaluation Using Counterfactual Simulations: The use of computational driver behavior models in crash avoidance systems and virtual simulations with optimal subsampling
Traffic safety is a problem worldwide. In-vehicle conflict and crash avoidance systems have been under development and assessment for some time, as integral parts of Advanced Driver Assistance Systems (ADAS) and Automated Driving Systems (ADS). Among the methods used to assess conflict and crash avoidance systems developed by the automotive industry, virtual safety assessment methods have been shown to have great potential and efficiency. In fact, scenario-generation-based virtual safety assessments play, and are likely to continue to play, a very important role in the assessment of vehicles at all levels of automation. The ultimate aim of this thesis is to improve the safety performance of conflict and crash avoidance systems. This aim is addressed through the use of computational driver models in two different ways: first, by using comfort-zone boundaries in system design, and second, by using a behavior-based crash-causation model together with a novel optimized scenario generation method for virtual safety assessment.

The first objective of this thesis is to investigate how a driver model which includes road users' comfortable behaviors in crash avoidance algorithms impacts the systems' safety performance and the residual crash characteristics. Chinese car-to-two-wheeler crashes were targeted; Automated Emergency Braking (AEB) algorithms, which comprised the proposed crash avoidance systems, were compared to a traditional AEB algorithm. The proposed algorithms showed larger safety performance benefits. In addition, the similarities in residual crash characteristics regarding impact speed and location across the different AEB implementations can potentially simplify the design of in-crash protection systems in the future.

The second objective is to develop and apply a method for efficient subsampling in crash-causation-model-based scenario generation for virtual safety assessment. The method, which is machine-learning-assisted, actively and iteratively updates the sampling probability based on new simulation results. The crash-causation model is based on off-road glances and a distribution of drivers' maximum decelerations in critical situations. A simple time-to-collision-based AEB algorithm was used to demonstrate the assessment process, as well as the benefits of combining crash-causation-model-based scenario generation with optimal subsampling. The sampling methods are designed to target specific safety benefit indicators, such as impact speed reduction and crash avoidance rate. The results of the study show that the proposed sampling method requires almost 50% fewer simulations than traditional importance sampling.

Future work will focus on applying the active sampling method to driver-model-based car-to-vulnerable-road-user (VRU) scenario generation. In addition to assessing conflict and crash avoidance system performance, a novel stopping criterion based on Bayesian future prediction will be further developed and demonstrated for use in experiments (e.g., as part of developing driver models) and virtual simulations (e.g., using driver-behavior-based crash-causation models). This criterion will be able to indicate when studies are unlikely to yield actionable results within the available budget, facilitating the decision to discontinue them while they are being run.
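The adaptive subsampling idea can be sketched as follows. This is a deliberately simplified stand-in for the thesis's machine-learning-assisted method: a discrete scenario set, a made-up safety indicator `f`, and sampling probabilities that are iteratively tilted toward high-impact scenarios while importance weights keep the estimate unbiased.

```python
import random

# Estimate the mean of f over 100 scenarios. Only a few "critical"
# scenarios contribute, so uniform sampling wastes most simulations.
# We iteratively retarget the sampling distribution and correct each
# draw with an importance weight 1 / (M * p(s)).

def f(scenario):
    # Hypothetical safety indicator (e.g., impact speed reduction):
    # nonzero only for the five critical scenarios.
    return 10.0 if scenario < 5 else 0.0

scenarios = list(range(100))
rng = random.Random(0)
probs = {s: 1.0 / len(scenarios) for s in scenarios}

total, n = 0.0, 0
for batch in range(20):
    for _ in range(50):
        s = rng.choices(scenarios, weights=[probs[x] for x in scenarios])[0]
        total += f(s) / (len(scenarios) * probs[s])   # importance-weighted
        n += 1
    # Retarget future sampling toward high-|f| scenarios. Here we cheat
    # and score with f itself; in practice a surrogate model fit to the
    # simulation results so far would supply these scores.
    scores = {s: abs(f(s)) + 0.1 for s in scenarios}
    z = sum(scores.values())
    probs = {s: scores[s] / z for s in scenarios}

print(total / n)   # close to the true mean, 0.5
```

Because each draw is reweighted by the probability under which it was sampled, the running estimate stays unbiased no matter how aggressively the sampler is retargeted; the payoff is variance reduction, which is the mechanism behind the reported simulation savings.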
Damage segregation at fissioning may increase growth rates: A superprocess model
A fissioning organism may purge unrepairable damage by bequeathing it
preferentially to one of its daughters. Using the mathematical formalism of
superprocesses, we propose a flexible class of analytically tractable models
that allow quite general effects of damage on death rates and splitting rates
and similarly general damage segregation mechanisms. We show that, in a
suitable regime, the effects of randomness in damage segregation at fissioning
are indistinguishable from those of randomness in the mechanism of damage
accumulation during the organism's lifetime. Moreover, the optimal population
growth is achieved for a particular finite, non-zero level of combined
randomness from these two sources. In particular, when damage accumulates
deterministically, optimal population growth is achieved by a moderately
unequal division of damage between the daughters. Too little or too much
division is sub-optimal. Connections are drawn to recent experimental
results on the inheritance of damage in protozoans, to theories of the
evolution of aging, and to models of resource division between siblings.
Comment: Version 2 had significant conceptual and organizational changes,
though only minor changes to the mathematics. Version 3 has minor
proofreading corrections and a few new references. The paper will appear in
Theoretical Population Biology.
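The qualitative setup can be mimicked with a minimal individual-based simulation. This is a caricature, not the superprocess model: all rates below are invented, damage accumulates deterministically, death probability increases with damage, and at fission the damage is split between daughters with an asymmetry parameter `a` (a = 0 is equal division, a = 1 gives all damage to one daughter).

```python
import random
from math import log

def growth_rate(a, steps=200, pop=1000, rng=None):
    # Average per-step log growth rate of a damage-carrying population
    # under damage-segregation asymmetry a in [0, 1].
    rng = rng or random.Random(1)
    cells = [0.0] * pop                 # damage level of each cell
    total_log = 0.0
    for _ in range(steps):
        nxt = []
        for d in cells:
            d += 0.2                    # deterministic damage accumulation
            if rng.random() < min(1.0, 0.3 * d):
                continue                # death, rate increasing in damage
            # Fission: segregate damage unequally between two daughters.
            nxt += [d * (1 + a) / 2, d * (1 - a) / 2]
        total_log += log(len(nxt) / len(cells))
        # Resample to fixed size so only the growth rate is tracked.
        cells = [rng.choice(nxt) for _ in range(pop)]
    return total_log / steps

for a in (0.0, 0.5, 1.0):
    print(a, round(growth_rate(a, rng=random.Random(2)), 3))
```

Under the paper's assumptions the growth rate is maximized at an intermediate, nonzero level of segregation randomness; whether this toy reproduces that depends entirely on the invented death-rate shape, so it should be read as a template for experimenting, not as a confirmation of the result.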