Which Method for Pricing Weather Derivatives?
Since the introduction of the first weather derivative in the United States in 1997, a significant body of work has been directed towards the pricing of this product and the modelling of the daily average temperature, which characterizes most traded weather instruments. Weather derivatives were created to enable companies to hedge against climate risk. They respond more to a need to cover seasonal variations, which may cause losses of profit for companies, than to a need for coverage of property damage. Despite the abundance of work on the topic, no consensus has emerged so far about the methodology for evaluating weather derivatives. The major problems with these instruments are, on the one hand, that they are based on a meteorological index that is not traded on financial markets, which rules out traditional pricing methods, and, on the other hand, that it is difficult to get round this obstacle by substituting a linked exchange-traded security for the underlying, since the weather index is weakly correlated with the prices of other financial assets. To further the question of evaluation, we propose in this paper, firstly, to shed light on the difficulties of implementing the three major pricing approaches suggested in the literature for weather derivatives (actuarial, arbitrage-free and consumption-based methods) and, secondly, to compute the prices of a weather contract with the three methodologies for comparison.
Keywords: weather derivatives; arbitrage-free pricing method; actuarial pricing approach; consumption-based pricing model; risk-neutral distribution; market price of risk; finite difference method; Monte-Carlo simulations.
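The actuarial approach mentioned in the abstract can be sketched as a Monte-Carlo computation: simulate daily average temperatures, accumulate a Heating Degree Day (HDD) index over the contract period, and take the discounted expected payoff. This is only an illustrative sketch; the seasonal temperature model, volatility, strike, and tick size below are hypothetical values, not the paper's calibration.

```python
import numpy as np

# Sketch of actuarial Monte-Carlo pricing for an HDD call option.
# All parameters (seasonal shape, noise level, strike, tick) are
# illustrative assumptions, not taken from the paper.

rng = np.random.default_rng(0)

def simulate_hdd_index(n_days=90, n_paths=100_000, base=18.0):
    """Simulate the cumulative HDD index over the contract period."""
    days = np.arange(n_days)
    seasonal_mean = 5.0 + 10.0 * np.sin(2 * np.pi * days / 365.0)  # hypothetical
    temps = seasonal_mean + 3.0 * rng.standard_normal((n_paths, n_days))
    # Daily HDD_t = max(base - T_t, 0), summed over the contract period
    return np.maximum(base - temps, 0.0).sum(axis=1)

def actuarial_price(strike=500.0, tick=20.0, rate=0.03, maturity=0.25):
    """Discounted expected payoff of an HDD call: tick * max(HDD - K, 0)."""
    hdd = simulate_hdd_index()
    payoff = tick * np.maximum(hdd - strike, 0.0)
    return np.exp(-rate * maturity) * payoff.mean()

price = actuarial_price()
print(f"Monte-Carlo actuarial price: {price:.2f}")
```

The arbitrage-free and consumption-based approaches would instead adjust the distribution (or the discount factor) for a market price of weather risk rather than averaging under the physical measure as above.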
Biological applications of the theory of birth-and-death processes
In this review, we discuss the applications of the theory of birth-and-death
processes to problems in biology, primarily, those of evolutionary genomics.
The mathematical principles of the theory of these processes are briefly
described. Birth-and-death processes, with some straightforward additions such
as innovation, are a simple, natural formal framework for modeling a vast
variety of biological processes such as population dynamics, speciation, genome
evolution, including growth of paralogous gene families and horizontal gene
transfer, and somatic evolution of cancers. We further describe how empirical
data, e.g., distributions of paralogous gene family size, can be used to choose
the model that best reflects the actual course of evolution among different
versions of birth-death-and-innovation models. It is concluded that
birth-and-death processes, thanks to their mathematical transparency,
flexibility and relevance to fundamental biological processes, are going to be an
indispensable mathematical tool for the burgeoning field of systems biology.
Comment: 29 pages, 4 figures; submitted to "Briefings in Bioinformatics".
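The birth-death-and-innovation models described in the abstract can be sketched with a standard Gillespie-style stochastic simulation: a family gains members at a per-member birth rate, loses them at a per-member death rate, and receives new members at a constant innovation rate. The rates below are hypothetical illustration values, not fitted to any genomic data.

```python
import random

# Minimal Gillespie sketch of a linear birth-and-death process with
# innovation, of the kind used to model paralogous gene family evolution.
# Rates are hypothetical, chosen only so the process stays bounded.

def simulate_bdi(birth=1.0, death=1.1, innovation=0.5, t_max=50.0, seed=1):
    """Simulate to time t_max and return the final family size."""
    rng = random.Random(seed)
    n, t = 10, 0.0  # start with 10 members in the family
    while t < t_max:
        total_rate = birth * n + death * n + innovation
        t += rng.expovariate(total_rate)  # exponential waiting time
        u = rng.random() * total_rate
        if u < birth * n:
            n += 1                    # duplication (birth)
        elif u < (birth + death) * n:
            n = max(n - 1, 0)         # gene loss (death)
        else:
            n += 1                    # innovation: a new member arrives
    return n

final = simulate_bdi()
print("family size after simulation:", final)
```

With death slightly exceeding birth, the innovation term keeps the family from going permanently extinct, which is the qualitative regime the review compares against empirical family-size distributions.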
HARPO: a TPC as a gamma-ray telescope and polarimeter
A gas Time Projection Chamber can be used for gamma-ray astronomy with
excellent angular precision and sensitivity to faint sources, and for
polarimetry, through the measurement of photon conversion to pairs. We
present the expected performance in simulations and the recent development of a
demonstrator for tests in a polarized photon beam.
Comment: SPIE Astronomical Telescopes + Instrumentation, Ultraviolet to gamma
ray, Montréal, Canada 2014. v2: note added in proof. Copyright 2014 SPIE.
One print or electronic copy may be made for personal use only. Systematic
reproduction and distribution, duplication of any material in this paper for
a fee or for commercial purposes, or modification of the content of the paper
are prohibited.
Benchmark of FEM, Waveguide and FDTD Algorithms for Rigorous Mask Simulation
An extremely fast time-harmonic finite element solver developed for the
transmission analysis of photonic crystals was applied to mask simulation
problems. The applicability was proven by examining a set of typical problems
and by a benchmarking against two established methods (FDTD and a differential
method) and an analytical example. The new finite element approach was up to
100 times faster than the competing approaches for moderate target accuracies,
and it was the only method that made it possible to reach high target accuracies.
Comment: 12 pages, 8 figures (see original publication for images with
better resolution).
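For context on the time-domain method used in the benchmark, a one-dimensional FDTD update on a staggered (Yee) grid can be sketched in a few lines. This is a generic textbook scheme, not the benchmarked solver; the grid size, Courant number, and source pulse are illustrative choices.

```python
import numpy as np

# Minimal 1D FDTD sketch (Yee scheme, normalized units): E and H live on
# interleaved grid points and are updated in leapfrog fashion. Parameters
# are illustrative; courant <= 1 keeps the scheme stable in 1D.

def run_fdtd_1d(n_cells=200, n_steps=400, courant=0.5):
    ez = np.zeros(n_cells)       # electric field at integer points
    hy = np.zeros(n_cells - 1)   # magnetic field at half points
    for t in range(n_steps):
        # Update H from the spatial difference (curl) of E
        hy += courant * (ez[1:] - ez[:-1])
        # Update interior E from the spatial difference of H
        ez[1:-1] += courant * (hy[1:] - hy[:-1])
        # Soft Gaussian source injected mid-grid
        ez[n_cells // 2] += np.exp(-((t - 30) / 10.0) ** 2)
    return ez

field = run_fdtd_1d()
print("max |Ez| on grid:", np.abs(field).max())
```

The benchmark's point is that such explicit time stepping, while simple, converges slowly in accuracy compared with the time-harmonic finite element solver at the target accuracies studied.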
Scale relativity and fractal space-time: theory and applications
In the first part of this contribution, we review the development of the
theory of scale relativity and its geometric framework constructed in terms of
a fractal and nondifferentiable continuous space-time. This theory leads (i) to
a generalization of possible physically relevant fractal laws, written as
partial differential equations acting in the space of scales, and (ii) to a new
geometric foundation of quantum mechanics and gauge field theories and their
possible generalisations. In the second part, we discuss some examples of
application of the theory to various sciences, in particular in cases when the
theoretical predictions have been validated by new or updated observational and
experimental data. This includes predictions in physics and cosmology (value of
the QCD coupling and of the cosmological constant), in astrophysics and
gravitational structure formation (distances of extrasolar planets to their
stars, of Kuiper belt objects, values of solar and solar-like star cycles), in
the life sciences (log-periodic law for species punctuated evolution, human
development and the evolution of societies), in the Earth sciences (log-periodic
deceleration of the rate of California earthquakes and of Sichuan earthquake
aftershocks, critical law for the Arctic sea ice extent) and tentative
applications to systems biology.
Comment: 63 pages, 14 figures. In: First International Conference on the
Evolution and Development of the Universe, 8th-9th October 2008, Paris,
France.
The influence of knowledge in the replication of routines
From a resource-based perspective, one of the most important levers of firm strategy is resources that are difficult to imitate. A crucial challenge for managers is therefore to replicate these resources within the firm while at the same time protecting them from imitation by competitors. Organizational routines are often named as candidates for such resources. A good understanding of the replication of organizational routines is therefore of great strategic interest. This article focuses on one aspect that seems to play an important role in the replication of routines: knowledge. The objective of this article is to identify knowledge-related aspects that influence the replication of routines. In doing so, and by defining routines in their social and cognitive dimensions, it contributes to a better understanding of their duplication process.
RNA-Seq optimization with eQTL gold standards.
Background: RNA-Sequencing (RNA-Seq) experiments have been optimized for library preparation, mapping, and gene expression estimation. These methods, however, have revealed weaknesses in the next stages of analysis of differential expression, with results sensitive to systematic sample stratification or, in more extreme cases, to outliers. Further, a method to assess normalization and adjustment measures imposed on the data is lacking.
Results: To address these issues, we utilize previously published eQTLs as a novel gold standard at the center of a framework that integrates DNA genotypes and RNA-Seq data to optimize analysis and aid in the understanding of genetic variation and gene expression. After detecting sample contamination and sequencing outliers in RNA-Seq data, a set of previously published brain eQTLs was used to determine if sample outlier removal was appropriate. Improved replication of known eQTLs supported removal of these samples in downstream analyses. eQTL replication was further employed to assess normalization methods, covariate inclusion, and gene annotation. This method was validated in an independent RNA-Seq blood data set from the GTEx project and a tissue-appropriate set of eQTLs. eQTL replication in both data sets highlights the necessity of accounting for unknown covariates in RNA-Seq data analysis.
Conclusion: As each RNA-Seq experiment is unique with its own experiment-specific limitations, we offer an easily-implementable method that uses the replication of known eQTLs to guide each step in one's data analysis pipeline. In the two data sets presented herein, we highlight not only the necessity of careful outlier detection but also the need to account for unknown covariates in RNA-Seq experiments.
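The gold-standard idea in this abstract reduces to a simple metric: score each candidate pipeline by the fraction of known eQTLs that replicate in the new data (same direction of effect and nominal significance). The sketch below uses synthetic numbers; in practice the "known" effect sizes would come from a published eQTL study and the new estimates from the pipeline under evaluation.

```python
import numpy as np

# Sketch of an eQTL replication score as a pipeline QC metric.
# Data are synthetic; function and variable names are illustrative.

rng = np.random.default_rng(42)

def replication_rate(known_betas, new_betas, new_pvals, alpha=0.05):
    """Fraction of known eQTLs with concordant sign and p < alpha in new data."""
    concordant = np.sign(known_betas) == np.sign(new_betas)
    significant = new_pvals < alpha
    return float(np.mean(concordant & significant))

# Synthetic example: 1000 known eQTL effects; the new estimates are the
# true effects plus noise, and all tests are nominally significant here.
known = rng.normal(0.0, 1.0, 1000)
new = known + rng.normal(0.0, 0.3, 1000)
pvals = rng.uniform(0.0, 0.04, 1000)

rate = replication_rate(known, new, pvals)
print(f"replication rate: {rate:.3f}")
```

Comparing this rate across normalization choices or covariate sets is the decision rule the abstract describes: keep the pipeline step that yields the higher replication of the gold-standard eQTLs.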