
    Markov switching quadratic term structure models

    In this paper, we consider a discrete-time economy in which the short-term interest rate follows a quadratic term structure driven by a regime-switching asset process. The possible non-linear structure, and the fact that the interest rate can follow different economic or financial trends, justify the interest of the Regime-Switching Quadratic Term Structure Model (RS-QTSM). Indeed, the regime-switching process depends on the values of a Markov chain with a time-dependent transition probability matrix, which captures well the different states (regimes) of the economy. We prove that, under this model, the conditional zero-coupon bond price also admits a quadratic term structure. Moreover, the stochastic coefficients appearing in this decomposition satisfy an explicit system of coupled stochastic backward recursions.
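    As a rough illustration of the mechanism this abstract describes, the following sketch simulates a short rate that is quadratic in a latent factor, with coefficients driven by a two-state Markov chain whose transition matrix varies in time. All numbers (regimes, coefficients, the drifting transition matrix) are invented for illustration and are not taken from the paper.

    ```python
    import numpy as np

    rng = np.random.default_rng(0)

    # Hypothetical two-regime setup: regime 0 = "calm", regime 1 = "stressed".
    # The transition matrix is time dependent (here it drifts linearly, as an example).
    def transition_matrix(t, horizon):
        p = 0.9 - 0.2 * t / horizon   # probability of staying in regime 0 decays over time
        return np.array([[p, 1.0 - p],
                         [0.2, 0.8]])

    # Quadratic short-rate coefficients per regime: r_t = a[z]*x_t**2 + b[z]*x_t + c[z]
    a = np.array([0.5, 1.5])
    b = np.array([0.0, 0.1])
    c = np.array([0.01, 0.03])

    def simulate_short_rate(horizon=50):
        z, x = 0, 0.0
        rates = []
        for t in range(horizon):
            x = 0.9 * x + 0.1 * rng.standard_normal()   # AR(1) latent factor
            rates.append(a[z] * x ** 2 + b[z] * x + c[z])
            # draw the next regime from the current row of the transition matrix
            z = rng.choice(2, p=transition_matrix(t, horizon)[z])
        return np.array(rates)

    rates = simulate_short_rate()
    # with these coefficients the quadratic in x has no real roots, so r_t > 0
    print(len(rates), rates.min() > 0)
    ```

    One practical appeal of the quadratic form is visible here: unlike an affine short rate, positivity can be guaranteed by the choice of coefficients alone.
    
    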

    Fast & Confident Probabilistic Categorization

    We describe NRC's submission to the Anomaly Detection/Text Mining competition organised at the Text Mining Workshop 2007. This submission relies on a straightforward implementation of the probabilistic categoriser described by Gaussier et al. (ECIR'02). The categoriser is adapted to handle multiple labelling, and a piecewise-linear confidence estimation layer is added to provide an estimate of the labelling confidence. This technique achieves a score of 1.689 on the test data.
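    The piecewise-linear confidence layer mentioned above can be pictured as interpolation between calibration knots that map raw categoriser scores to empirical confidence. The knot values below are invented for illustration; the paper's actual calibration is not reproduced here.

    ```python
    import numpy as np

    # Hypothetical calibration knots: raw score -> estimated labelling confidence,
    # as might be fitted on held-out data (all numbers invented for this sketch).
    raw_knots = np.array([0.0, 0.3, 0.6, 0.9, 1.0])
    conf_knots = np.array([0.05, 0.2, 0.7, 0.95, 0.99])

    def confidence(raw_score):
        """Piecewise-linear interpolation between the calibration knots."""
        return float(np.interp(raw_score, raw_knots, conf_knots))

    print(confidence(0.3))   # exactly at a knot, so the knot value 0.2
    print(confidence(0.75))  # midway between the 0.6 and 0.9 knots
    ```

    The appeal of such a layer is that it leaves the underlying categoriser untouched: only the mapping from scores to reported confidence is adjusted.
    
    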

    The microscopic theory of fission

    Full text link
    Fission-fragment properties have been calculated for thermal neutron-induced fission on a 239Pu target, using constrained Hartree-Fock-Bogoliubov calculations with a finite-range effective interaction. A quantitative criterion based on the interaction energy between the nascent fragments is introduced to define the scission configurations. The validity of this criterion is benchmarked against experimental measurements of the kinetic energies and of the multiplicities of neutrons emitted by the fragments. Comment: 8 pages, 4 figures, to be published in Proceedings of the 4th International Workshop on Fission and Fission Product Spectroscopy.

    Astrophysical Implication of Low E(2^+_1) in Neutron-rich Sn Isotopes

    The observation and prediction of unusually depressed first excited 2^+_1 states in even-A neutron-rich isotopes of semi-magic Sn above 132Sn provide motivation for reviewing related problems in nuclear astrophysics. In the present work, the beta-decay rates of the exotic even Sn isotopes (134,136Sn) above the 132Sn core have been calculated as a function of temperature (T). In order to obtain the necessary ft values, the B(GT) values corresponding to allowed Gamow-Teller (GT-) beta decay have been calculated theoretically using the shell model. The total decay rate decreases with increasing temperature, as the ground-state population is depleted and the population of excited states with slower decay rates increases. The abundance at each Z value is inversely proportional to the decay constant of the waiting-point nucleus for that particular Z, so the increase in the half-life of Sn isotopes such as 136Sn might have a substantial impact on r-process nucleosynthesis. Comment: 4th International Workshop on Nuclear Fission and Fission Product Spectroscopy, CEA Cadarache, May 13-16, 2009, 4 pages, 2 figures.
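    The temperature dependence described above follows from thermal population of levels: the effective decay rate is a Boltzmann-weighted average over the levels' individual rates, so a slowly decaying low-lying 2^+ state drags the total rate down as T rises. The level energies, degeneracies, and rates below are invented for illustration only.

    ```python
    import numpy as np

    kB = 8.617e-11  # Boltzmann constant in MeV/K

    # Hypothetical levels: (excitation energy E_i in MeV, degeneracy g_i,
    # decay rate lambda_i in 1/s); the excited state decays more slowly.
    levels = [(0.0, 1, 1.0), (0.7, 5, 0.2)]

    def total_rate(T):
        """Thermally averaged decay rate: Boltzmann-weighted sum over levels."""
        weights = np.array([g * np.exp(-E / (kB * T)) for E, g, _ in levels])
        rates = np.array([lam for _, _, lam in levels])
        return float((weights * rates).sum() / weights.sum())

    # the effective rate drops as the slow excited state becomes populated
    print(total_rate(1e9) > total_rate(1e10))
    ```

    At low temperature only the ground state is populated and the total rate reduces to the ground-state rate, which is the limit relevant to laboratory half-life measurements.
    
    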

    Data Cube Approximation and Mining using Probabilistic Modeling

    On-line Analytical Processing (OLAP) techniques commonly used in data warehouses allow the exploration of data cubes along different analysis axes (dimensions) and at different abstraction levels in a dimension hierarchy. However, such techniques are not aimed at mining multidimensional data. Since data cubes are nothing but multi-way tables, we propose to analyze the potential of two probabilistic modeling techniques, namely non-negative multi-way array factorization and log-linear modeling, with the ultimate objective of compressing and mining aggregate and multidimensional values. With the first technique, we compute the set of components that best fit the initial data set and whose superposition coincides with the original data; with the second technique, we identify a parsimonious model (i.e., one with a reduced set of parameters), highlight strong associations among dimensions, and discover possible outliers in data cells. A real-life example is used to (i) discuss the potential benefits of the modeling output on cube exploration and mining, (ii) show how OLAP queries can be answered in an approximate way, and (iii) illustrate the strengths and limitations of these modeling approaches.
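    A minimal sketch of the first technique, on a 2-D slice of a cube rather than a full multi-way array: non-negative factorization via the classic multiplicative updates, so that the superposition of a few components approximates the original counts. The toy table and rank are invented; the paper's data and algorithmic details are not reproduced.

    ```python
    import numpy as np

    rng = np.random.default_rng(1)

    # Hypothetical 2-D slice of a data cube: counts by (store, product),
    # with two visible "blocks" of correlated cells.
    cube = np.array([[10., 8., 1., 0.],
                     [ 9., 7., 0., 1.],
                     [ 1., 0., 9., 8.],
                     [ 0., 1., 8., 9.]])

    def nmf(V, k=2, iters=500, eps=1e-9):
        """Non-negative factorization V ~ W @ H via multiplicative updates."""
        n, m = V.shape
        W = rng.random((n, k)) + 0.1
        H = rng.random((k, m)) + 0.1
        for _ in range(iters):
            H *= (W.T @ V) / (W.T @ W @ H + eps)   # update components
            W *= (V @ H.T) / (W @ H @ H.T + eps)   # update loadings
        return W, H

    W, H = nmf(cube)
    err = np.linalg.norm(cube - W @ H) / np.linalg.norm(cube)
    print(round(err, 3))  # small relative reconstruction error
    ```

    The non-negativity of W and H is what makes the components readable as additive parts of the cube, which is the property the compression-and-mining argument relies on.
    
    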

    Variance optimal hedging for continuous time additive processes and applications

    For a large class of vanilla contingent claims, we establish an explicit Föllmer-Schweizer decomposition when the underlying is the exponential of an additive process. This allows us to provide an efficient algorithm for solving the mean-variance hedging problem. Applications to models derived from the electricity market are presented.

    On some expectation and derivative operators related to integral representations of random variables with respect to a PII process

    Given a process with independent increments X (not necessarily a martingale) and a large class of square-integrable random variables H = f(X_T), f being the Fourier transform of a finite measure μ, we provide explicit Kunita-Watanabe and Föllmer-Schweizer decompositions. The representation is expressed by means of two significant maps: the expectation and derivative operators related to the characteristics of X. We also provide an explicit expression for the variance-optimal error when hedging the claim H with underlying process X. These questions are motivated by the celebrated problem of global and local quadratic risk minimization in mathematical finance. Comment: 29 pages.

    Variance Optimal Hedging for discrete time processes with independent increments. Application to Electricity Markets

    We consider the discretized version of a (continuous-time) two-factor model introduced by Benth and coauthors for the electricity markets. In this model, the underlying is the exponential of a sum of independent random variables. We provide and test an algorithm, based on the celebrated Föllmer-Schweizer decomposition, for solving the mean-variance hedging problem. In particular, we establish that decomposition explicitly for a large class of vanilla contingent claims. Particular attention is devoted to the choice of rebalancing dates and their impact on the hedging error, with respect to the payoff regularity and the non-stationarity of the log-price process.
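    The quadratic-hedging idea common to the last three abstracts reduces, over a single rebalancing period with zero interest rate, to a regression: the variance-minimising position in the underlying is xi = Cov(ΔS, H) / Var(ΔS). A Monte-Carlo sketch of that one-step case, with an invented lognormal price step and a call payoff (not the paper's two-factor electricity model):

    ```python
    import numpy as np

    rng = np.random.default_rng(2)

    # One-step quadratic hedge: xi = Cov(Delta S, H) / Var(Delta S).
    s0, strike = 100.0, 100.0
    s1 = s0 * np.exp(0.2 * rng.standard_normal(200_000))  # hypothetical price step
    payoff = np.maximum(s1 - strike, 0.0)                  # call payoff H
    ds = s1 - s0

    cov = np.cov(ds, payoff)
    xi = cov[0, 1] / cov[0, 0]            # variance-minimising hedge ratio
    v0 = payoff.mean() - xi * ds.mean()   # initial capital of the hedge

    # the hedge removes the part of H explained by Delta S
    residual = payoff - v0 - xi * ds
    print(0.0 < xi < 1.0, residual.var() < payoff.var())
    ```

    The multi-period Föllmer-Schweizer decomposition iterates this regression backward through the rebalancing dates, which is why the choice of those dates matters for the residual hedging error.
    
    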