18 research outputs found

    A statistical analysis of product prices in online markets

    Full text link
    We empirically investigate fluctuations in product prices in online markets using tick-by-tick price data collected from a Japanese price comparison site, and find some similarities and differences between product and asset prices. The average price of a product across e-retailers behaves almost like a random walk, although the probability of a price increase/decrease is higher conditional on multiple preceding price increases/decreases. This is quite similar to the property reported in previous studies of asset prices. However, we fail to find a long memory property in the volatility of product price changes. Also, we find that the price change distribution for product prices is close to an exponential distribution rather than a power law distribution. These two findings are in sharp contrast with previous results regarding asset prices. We propose an interpretation that these differences may stem from the absence of speculative activities in product markets; namely, e-retailers seldom repeatedly buy and sell a product, unlike traders in asset markets. Comment: 5 pages, 5 figures, 1 table, proceedings of APFA
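
    The conditional-persistence property mentioned above can be checked with a short script. The following is a minimal sketch, not the authors' code; the array name avg_price and the cut-off max_run are illustrative assumptions.

        import numpy as np

        def run_conditional_up_prob(avg_price, max_run=3):
            """Return P(price up | k consecutive preceding ups) for k = 1..max_run."""
            signs = np.sign(np.diff(avg_price))      # +1 up, -1 down, 0 unchanged
            probs = {}
            for k in range(1, max_run + 1):
                cond, up = 0, 0
                for t in range(k, len(signs)):
                    if np.all(signs[t - k:t] == 1):  # k consecutive increases so far
                        cond += 1
                        up += signs[t] == 1
                probs[k] = up / cond if cond else np.nan
            return probs

        # On a pure random walk every conditional probability stays near 0.5, so
        # values clearly above 0.5 in real data signal the reported persistence.
        rng = np.random.default_rng(0)
        walk = np.cumsum(rng.choice([-1.0, 1.0], size=100_000))
        print(run_conditional_up_prob(walk))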

    Artifactual log-periodicity in finite size data: Relevance for earthquake aftershocks

    Full text link
    The recently proposed discrete scale invariance and its associated log-periodicity are an elaboration of the concept of scale invariance in which the system is scale invariant only under powers of specific values of the magnification factor. We report on the discovery of a novel mechanism for such log-periodicity relying solely on the manipulation of data. This "synthetic" scenario for log-periodicity relies on two steps: (1) the fact that approximately logarithmic sampling in time corresponds to uniform sampling in the logarithm of time; and (2) a low-pass-filtering step, as occurs in constructing cumulative functions, in maximum likelihood estimations, and in de-trending, which reddens the noise and, in a finite sample, creates a maximum in the spectrum, leading to a most probable frequency in the logarithm of time. We explore this mechanism in detail and present extensive numerical simulations. We use this insight to analyze the 27 best aftershock sequences studied by Kisslinger and Jones [1991] to search for traces of genuine log-periodic corrections to Omori's law, which states that the earthquake rate decays approximately as the inverse of the time since the last main shock. The observed log-periodicity is shown to result almost entirely from the "synthetic scenario" arising from the data analysis. From a statistical point of view, resolving the issue of the possible existence of log-periodicity in aftershocks will be very difficult, as Omori's law describes a point process with a uniform sampling in the logarithm of the time. By construction, strong log-periodic fluctuations are thus created by this logarithmic sampling. Comment: LaTeX, JGR preprint with AGU++ v16.b and AGUTeX 5.0, use packages graphicx, psfrag and latexsym, 41 eps figures, 26 pages. In press, J. Geophys. Res.
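
    The "synthetic" mechanism can be reproduced numerically in a few lines. The sketch below is illustrative only (grid size, time span and seed are assumptions, and it is not the paper's analysis code): white noise sampled uniformly in the logarithm of time and then low-pass filtered by a cumulative sum develops a spurious spectral peak, i.e. an apparent most probable log-frequency, in any finite sample.

        import numpy as np

        rng = np.random.default_rng(1)
        n = 512
        log_t = np.linspace(0.0, np.log(1000.0), n)  # uniform grid in ln(t)
        noise = rng.standard_normal(n)

        reddened = np.cumsum(noise)                  # low-pass filtering step (reddening)
        reddened -= reddened.mean()

        spectrum = np.abs(np.fft.rfft(reddened)) ** 2
        freqs = np.fft.rfftfreq(n, d=log_t[1] - log_t[0])  # frequencies f in cos(2*pi*f*ln t)

        peak = freqs[1 + np.argmax(spectrum[1:])]    # skip the zero-frequency bin
        print(f"most probable log-frequency in this realization: {peak:.3f}")

    The location of the peak drifts from one realization to the next; it reflects the finite sample and the reddening, not genuine discrete scale invariance.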

    Economic returns of research: the Pareto law and its implications

    No full text
    At what level should governments or companies support research? This complex, multi-faceted question encompasses such qualitative benefits as satisfying natural human curiosity, the quest for knowledge, and the impact on education and culture, but one of its most scrutinized components reduces to the assessment of economic performance and wealth creation derived from research. Many studies report evidence of positive economic benefits derived from basic research [CITE]. In certain areas such as biotechnology, semi-conductor physics, and optical communications [CITE], the impact of basic research is direct, while in other disciplines the path from discovery to applications is full of surprises. As a consequence, there are persistent uncertainties in the quantification of the exact economic returns of public expenditure on basic research. This gives little help to policy makers trying to determine what the level of funding should be. Here, we suggest that these uncertainties have a fundamental origin in the interplay between the intrinsic "fat tail" power law nature of the distribution of economic returns, characterized by a mathematically diverging variance, and the stochastic character of discovery rates. In the regime where the cumulative economic wealth derived from research is expected to exhibit a long-term positive trend, we show that strong fluctuations significantly blur the short time scales: a few major unpredictable innovations may provide a finite fraction of the total creation of wealth. In such a scenario, any attempt to assess the economic impact of research over a finite time horizon encompassing only a small number of major discoveries is bound to be highly unreliable. New tools, developed in the theory of self-similar and complex systems [CITE] to tackle similar extreme fluctuations in Nature [CITE], can be adapted to measure the economic benefits of research, which are intimately associated with this large variability.
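
    The role of the diverging variance can be illustrated with a short simulation. This is a hedged sketch under assumed numbers (tail exponent mu = 1.5, sample sizes, seed), not an estimate from the paper: draws from a Pareto law with 1 < mu < 2 have a finite mean but an infinite variance, so a handful of events carries a finite fraction of the total and short-horizon averages scatter widely.

        import numpy as np

        rng = np.random.default_rng(2)
        mu = 1.5                                              # tail exponent, 1 < mu < 2
        returns = (1.0 - rng.random(10_000)) ** (-1.0 / mu)   # Pareto(mu) draws, x_min = 1

        # A few major "discoveries" carry a finite fraction of the total wealth created:
        top10_share = np.sort(returns)[-10:].sum() / returns.sum()
        print(f"share of total from the 10 largest events: {top10_share:.2f}")

        # Finite-horizon assessments are unreliable: averages over 100-event windows
        # scatter widely around the (finite) true mean mu / (mu - 1) = 3.
        window_means = returns.reshape(100, 100).mean(axis=1)
        print(f"true mean: {mu / (mu - 1):.1f}, window averages range from "
              f"{window_means.min():.1f} to {window_means.max():.1f}")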

    A Data-Analytic Method for Forecasting Next Record Catastrophe Loss

    No full text
    We develop in this article a data-analytic method to forecast the severity of the next record insured loss to property caused by natural catastrophic events. The method requires and employs the knowledge of an expert and accounts for uncertainty in parameter estimation. Both considerations are essential for the task at hand because the available data are typically scarce in extreme value analysis. In addition, we consider three-parameter Gamma priors for the parameter in the model and thus provide simple analytical solutions to several key elements of interest, such as the predictive moments of the record value. As a result, the model enables practitioners to gain insights into the behavior of such predictive moments without concerning themselves with the computational issues that are often associated with a complex Bayesian analysis. A data set consisting of catastrophe losses occurring in the United States between 1990 and 1999 is analyzed, and forecasts of the next record loss are made under various prior assumptions. We demonstrate that the proposed method provides more reliable and theoretically sound forecasts, whereas the conditional mean approach, which accounts for neither prior information nor uncertainty in parameter estimation, may provide inadmissible forecasts. Copyright The Journal of Risk and Insurance.
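
    To make the flavor of such a predictive forecast concrete, here is a deliberately simplified sketch. It is not the paper's three-parameter Gamma model: instead, record increments are taken as exponential with a Gamma(a, b) prior on the rate, which yields a closed-form Lomax predictive distribution and shows how accounting for parameter uncertainty fattens the forecast relative to a plug-in point estimate. All numbers are hypothetical.

        import numpy as np

        def forecast_next_record(increments, current_record, a=2.0, b=1.0):
            """Return (Bayesian predictive, plug-in) forecasts of the next record loss."""
            n = len(increments)
            a_post = a + n                          # Gamma posterior shape (must be > 1)
            b_post = b + float(np.sum(increments))  # Gamma posterior rate
            bayes = current_record + b_post / (a_post - 1)  # mean of Lomax(a_post, b_post)
            plug_in = current_record + b_post / a_post      # 1 / posterior-mean rate
            return bayes, plug_in

        # Hypothetical jumps (in $bn) between successive record losses; current record 20:
        bayes, plug_in = forecast_next_record([3.1, 7.4, 1.9, 12.5], current_record=20.0)
        print(f"predictive forecast: {bayes:.1f}, plug-in forecast: {plug_in:.1f}")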

    Insurability of Climate Risks

    No full text
    The IPCC 2007 report noted that both the frequency and strength of hurricanes, floods and droughts have increased during the past few years. Thus climate risks, and more specifically natural catastrophes, are now hardly insurable: losses can be huge (and the actuarial pure premium might even be infinite), diversification through the central limit theorem is not possible because of geographical correlation (so a lot of additional capital is required), there might exist no insurance market since the price asked by insurance companies can be much higher than the price householders are willing to pay (short-term horizon of policyholders), and, due to climate change, there is more uncertainty (and thus additional risk). The first idea we will discuss in this paper, about insurance markets and climate risks, is that insurance exists only if risk can be transferred, not only to reinsurance companies but also to capital markets (through securitization or catastrophe options). The second is that the climate is changing and, therefore, not only are prices and required capital important, but uncertainty can also be very large. It is extremely difficult to insure in a changing environment. The Geneva Papers (2008) 33, 91–109. doi:10.1057/palgrave.gpp.2510155
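
    The failure of diversification under geographical correlation can be stated in one formula. The numbers below are illustrative, not from the paper: for n policies with common loss variance sigma^2 and pairwise correlation rho, the variance of the average loss is sigma^2 * (1 + (n - 1) * rho) / n, which tends to rho * sigma^2 rather than to 0, so pooling stops reducing risk once correlated catastrophes dominate.

        # Variance of the average loss over n equicorrelated policies (illustrative values).
        sigma2 = 1.0
        for rho in (0.0, 0.2):
            for n in (10, 1_000, 100_000):
                var_mean = sigma2 * (1 + (n - 1) * rho) / n
                print(f"rho={rho:<4} n={n:>7}  Var(average loss) = {var_mean:.4f}")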