
    The Econometric Analysis of Microscopic Simulation Models

    This paper studies how to compare different microscopic simulation (MS) models and how to compare an MS model with the real world. The parameters of interest are classified and characterized, and various econometric methods are applied for the comparison. We illustrate the methodology on tests of the equality of parameters, such as the mean and the autocorrelation coefficient, both for the case of comparing two different MS models and for the case of comparing an MS model with the real world.
    MS models; econometrics
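    A minimal sketch of the kind of comparison described above: two synthetic series stand in for the outputs of two MS models (or of a model and real-world data), and their means and lag-1 autocorrelation coefficients are compared with standard tests. The data and the test choices are assumptions for illustration, not the paper's procedure.

```python
# Illustrative sketch only: comparing summary parameters (mean and lag-1
# autocorrelation) of two simulated output series. The series are synthetic
# stand-ins, not outputs of the paper's MS models.
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
series_a = rng.normal(loc=0.0, scale=1.0, size=5000)   # "model A" output (assumed)
series_b = rng.normal(loc=0.02, scale=1.0, size=5000)  # "model B" or real-world data (assumed)

# Test equality of means with a two-sample t-test.
t_stat, p_value = stats.ttest_ind(series_a, series_b, equal_var=False)
print(f"mean test: t = {t_stat:.3f}, p = {p_value:.3f}")

def lag1_autocorr(x):
    """Sample lag-1 autocorrelation coefficient."""
    x = x - x.mean()
    return np.dot(x[:-1], x[1:]) / np.dot(x, x)

print("lag-1 autocorrelation:", lag1_autocorr(series_a), lag1_autocorr(series_b))
```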

    Long Memory, Heterogeneity and Trend Chasing

    Long-range dependence in volatility is one of the most prominent examples of applications in financial market research involving universal power laws. Its characterization has recently spurred attempts at a theoretical explanation of the underlying mechanism. This paper contributes to this recent development by analyzing a simple market fraction asset pricing model with two types of traders: fundamentalists, who trade on the price deviation from the estimated fundamental value, and trend followers, who follow a trend that is updated through a geometric learning process. Our analysis shows that heterogeneity, trend chasing through learning, and the interplay of noisy processes with a stable deterministic equilibrium can be the source of power-law distributed fluctuations. Statistical analysis based on Monte Carlo simulations is conducted to characterize the long memory. Realistic estimates of the power-law decay indices and the (FI)GARCH parameters are found.
    asset pricing; fundamentalists and trend followers; market fraction; stability; learning; long memory
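    A minimal sketch of the mechanism described above, under assumed functional forms and parameter values (market-maker price adjustment, fundamentalist demand proportional to the price deviation, trend followers updating their trend by geometric learning); it is not the paper's calibrated model, but it shows how the long-memory signature can be checked via the autocorrelation of absolute returns.

```python
# Toy market-maker model with fundamentalists and trend followers; all
# parameter values and functional forms are illustrative assumptions.
import numpy as np

rng = np.random.default_rng(1)
T = 20000
a, b = 1.0, 0.9          # fundamentalist / trend-follower demand slopes (assumed)
n_f, n_c = 0.6, 0.4      # market fractions (assumed)
mu, delta = 0.2, 0.85    # market-maker speed, geometric-learning weight (assumed)

x = np.zeros(T)          # log-price deviation from the fundamental value
u = 0.0                  # trend followers' learned trend
for t in range(1, T):
    r_prev = x[t - 1] - x[t - 2] if t > 1 else 0.0
    u = delta * u + (1 - delta) * r_prev          # geometric learning of the trend
    excess_demand = n_f * (-a * x[t - 1]) + n_c * (b * u)
    x[t] = x[t - 1] + mu * excess_demand + 0.01 * rng.standard_normal()

returns = np.diff(x)

def acf(series, max_lag):
    """Sample autocorrelation function up to max_lag."""
    s = series - series.mean()
    denom = np.dot(s, s)
    return [np.dot(s[:-k], s[k:]) / denom for k in range(1, max_lag + 1)]

# Slowly decaying autocorrelations of |returns| are the long-memory signature.
acf_abs = np.array(acf(np.abs(returns), 50))
print("ACF of |returns| at lags 1, 10, 50:", acf_abs[[0, 9, 49]].round(3))
```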

    Heterogeneity, Profitability and Autocorrelations

    This paper contributes to the development of the recent literature on the explanatory power and calibration of heterogeneous asset pricing models by presenting a simple stochastic market fraction asset pricing model with two types of traders (fundamentalists and trend followers) under a market maker scenario. It seeks to explain aspects of financial market behaviour (such as market dominance, under- and over-reaction, profitability and survivability) and to characterize various statistical properties (including the autocorrelation structure) of the stochastic model by using the dynamics of the underlying deterministic system, traders' behaviour and market fractions. Statistical analysis based on Monte Carlo simulations shows that the long-run behaviour and convergence of market prices, the long-run (short-run) profitability of the fundamental (trend following) trading strategy, the survivability of chartists, and various under- and over-reaction autocorrelation patterns of returns can be characterized by the stability and bifurcations of the underlying deterministic system. Our analysis underpins the mechanisms behind various market behaviours (such as under/over-reactions), market dominance and stylized facts in high frequency financial markets.
    asset pricing; heterogeneous beliefs; market fraction; stability; bifurcation; market behaviour; profitability; survivability; autocorrelation
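    A toy Monte Carlo sketch of the profitability comparison mentioned above: a mean-reverting price path stands in for the model's price dynamics, and simple position rules stand in for the fundamental and trend-following strategies. Both the price process and the position rules are assumptions for illustration, not the paper's model.

```python
# Toy Monte Carlo comparison of a fundamental and a trend-following strategy
# on a simulated price path; process and position rules are assumed.
import numpy as np

rng = np.random.default_rng(2)

def simulate_path(T=2000, kappa=0.05, sigma=0.01):
    """Mean-reverting log-price deviation from the fundamental value (assumed)."""
    x = np.zeros(T)
    for t in range(1, T):
        x[t] = (1 - kappa) * x[t - 1] + sigma * rng.standard_normal()
    return x

def strategy_pnl(x):
    r = np.diff(x)
    fund_pos = -x[:-1]              # fundamentalists bet on reversion to the fundamental
    trend_pos = np.r_[0.0, r[:-1]]  # trend followers bet on the previous return
    return np.sum(fund_pos * r), np.sum(trend_pos * r)

pnl = np.array([strategy_pnl(simulate_path()) for _ in range(500)])
print("mean P&L (fundamental, trend-following):", pnl.mean(axis=0).round(4))
```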

    Is mortality spatial or social?

    Mortality modelling for the purposes of demographic forecasting and actuarial pricing is generally done at an aggregate level using national data. Modelling at this level fails to capture the variation in mortality within a country and potentially leads to a mis-specification of mortality forecasts for a subset of the population. This can have detrimental effects on pricing and reserving in the actuarial context. In this paper we consider mortality rates at a regional level and analyse the variation in those rates. We consider whether variation in mortality rates within a country can be explained using local economic and social variables. Using Northern Ireland data on mortality and measures of deprivation, we identify the variables explaining mortality variation. We create a population polarisation variable and find that this variable is significant in explaining some of the variation in mortality rates. Further, we consider whether spatial and non-spatial models have a part to play in explaining mortality differentials.
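    A minimal sketch of the non-spatial side of such an analysis, assuming synthetic data and hypothetical covariate names: regional log-mortality rates regressed on a deprivation index and a polarisation variable.

```python
# Illustrative sketch: regressing regional log-mortality rates on a local
# deprivation measure plus a polarisation variable. The data are synthetic
# and the covariates are hypothetical stand-ins for the paper's measures.
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(3)
n_regions = 80
deprivation = rng.normal(size=n_regions)    # assumed deprivation index
polarisation = rng.normal(size=n_regions)   # assumed polarisation variable
log_mortality = (-5.0 + 0.3 * deprivation + 0.1 * polarisation
                 + 0.05 * rng.normal(size=n_regions))

X = sm.add_constant(np.column_stack([deprivation, polarisation]))
ols = sm.OLS(log_mortality, X).fit()
print(ols.summary(xname=["const", "deprivation", "polarisation"]))
```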

    Competition or Authorization—Manufacturers’ Choice of Remanufacturing Strategies

    In the face of the cannibalization caused by remanufactured products produced by independent remanufacturers (IRs), original equipment manufacturers (OEMs) can produce remanufactured products themselves to compete with the IRs, or they can authorize the IRs to cooperate, trading on their seller reputation. This paper studies the key factors that influence OEMs' choice of remanufacturing strategy. By establishing and comparing three two-stage models, the thresholds at which OEMs choose the different remanufacturing strategies are obtained. An interesting finding is that when the authorization fee is higher than a certain value, even if the remanufactured product poses a competitive threat to the new product, the OEM will help the IR improve its remanufacturing technology to save costs and achieve a win-win situation. As the authorization fee increases, the OEM's profit first increases and then decreases, so it is not always better for OEMs to charge higher authorization fees. In both the authorization and the competitive scenario, an improvement in remanufacturing technology by OEMs increases the output of remanufactured products, which is conducive to environmental protection.
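    A toy numerical illustration of the authorization-fee result, under an assumed linear response of the IR's output to the per-unit fee (not the paper's two-stage model): licensing revenue first rises and then falls as the fee increases.

```python
# Toy illustration (not the paper's model): with a simple linear response of
# the IR's remanufacturing output to the per-unit authorization fee, the OEM's
# licensing revenue first rises and then falls in the fee.
import numpy as np

fees = np.linspace(0.0, 1.0, 101)        # per-unit authorization fee (assumed scale)
ir_output = np.maximum(0.0, 1.0 - fees)  # assumed: IR output shrinks as the fee rises
licensing_revenue = fees * ir_output     # OEM revenue from authorization

best = fees[np.argmax(licensing_revenue)]
print(f"revenue peaks at fee = {best:.2f}; higher fees reduce revenue")
```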

    GraphR: Accelerating Graph Processing Using ReRAM

    This paper presents GRAPHR, the first ReRAM-based graph processing accelerator. GRAPHR follows the principle of near-data processing and explores the opportunity of performing massively parallel analog operations at low hardware and energy cost. Analog computation is suitable for graph processing because: 1) the algorithms are iterative and can inherently tolerate imprecision; 2) both probability calculations (e.g., PageRank and Collaborative Filtering) and typical graph algorithms involving integers (e.g., BFS/SSSP) are resilient to errors. The key insight of GRAPHR is that if a vertex program of a graph algorithm can be expressed as sparse matrix-vector multiplication (SpMV), it can be performed efficiently by a ReRAM crossbar. We show that this assumption is generally true for a large set of graph algorithms. GRAPHR is a novel accelerator architecture consisting of two components: memory ReRAM and graph engines (GEs). The core graph computations are performed in sparse matrix format in the GEs (ReRAM crossbars). Vector/matrix-based graph computation is not new, but ReRAM offers the unique opportunity to realize massive parallelism with unprecedented energy efficiency and low hardware cost. With small subgraphs processed by GEs, the gain from performing parallel operations outweighs the waste due to sparsity. The experimental results show that GRAPHR achieves a 16.01x (up to 132.67x) geometric-mean speedup and a 33.82x energy saving compared to a CPU baseline system. Compared to GPU, GRAPHR achieves a 1.69x to 2.19x speedup and consumes 4.77x to 8.91x less energy. GRAPHR gains a speedup of 1.16x to 4.12x and is 3.67x to 10.96x more energy efficient than a PIM-based architecture.
    Comment: Accepted to HPCA 201
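    A software sketch of the key insight, assuming a toy graph: one PageRank iteration written as sparse matrix-vector multiplication (SpMV), the kernel formulation that GRAPHR maps onto ReRAM crossbars. The crossbar hardware itself is not modelled.

```python
# One PageRank iteration expressed as a sparse matrix-vector product; a
# software illustration of the SpMV formulation GRAPHR targets, not a model
# of the ReRAM crossbar hardware.
import numpy as np
from scipy.sparse import csr_matrix

# Toy 4-vertex graph given as directed edges (src -> dst); assumed example data.
edges = [(0, 1), (0, 2), (1, 2), (2, 0), (3, 2)]
n = 4
src, dst = zip(*edges)
out_degree = np.bincount(src, minlength=n).astype(float)

# Column-stochastic transition matrix M, so one step is r = d*M@r + (1-d)/n.
weights = 1.0 / out_degree[list(src)]
M = csr_matrix((weights, (dst, src)), shape=(n, n))

d = 0.85
r = np.full(n, 1.0 / n)
for _ in range(50):
    r = d * (M @ r) + (1 - d) / n   # the SpMV kernel a crossbar would accelerate

print("PageRank scores:", r.round(3))
```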

    A Novel Adaptive Search Range Algorithm for Motion Estimation Based on H.264

    Motion estimation (ME) is vital to video compression. Due to the adoption of high-precision motion vectors (MVs) in the H.264 encoder, the computational cost increases rapidly, and ME takes about 60% of the whole encoding time. In order to accommodate the new variable-block-size motion estimation strategy adopted in H.264, this paper proposes a novel adaptive search range (ASR) algorithm as an optimization of UMHexagonS. The ASR algorithm utilizes not only the median_MVP and inter-frame information but also a penalty function. Experimental results indicate that our proposed method reduces the computational complexity to a certain degree and enhances encoding efficiency, with little change in the reconstructed image quality and bit rate.
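    A minimal sketch of the adaptive-search-range idea, with an assumed scaling heuristic and constants (not the algorithm proposed in the paper): a median motion-vector predictor from neighbouring blocks shrinks the search window when the predicted motion is small.

```python
# Illustrative adaptive-search-range sketch: derive a median MV predictor from
# neighbouring blocks and scale the search window by the predicted motion.
# The scaling rule and constants are assumptions, not the paper's algorithm.
import numpy as np

def median_mvp(mv_left, mv_top, mv_topright):
    """Component-wise median of the three neighbouring motion vectors."""
    mvs = np.array([mv_left, mv_top, mv_topright])
    return np.median(mvs, axis=0)

def adaptive_search_range(mv_left, mv_top, mv_topright,
                          base_range=16, min_range=4):
    """Shrink the search window when the predicted motion is small."""
    pred = median_mvp(mv_left, mv_top, mv_topright)
    magnitude = np.max(np.abs(pred))
    # Assumed heuristic: the window scales with the predicted motion magnitude.
    return int(np.clip(2 * magnitude + min_range, min_range, base_range))

# Example: small, consistent neighbouring MVs lead to a small search range.
print(adaptive_search_range((1, 0), (2, 1), (1, 1)))        # small window
print(adaptive_search_range((12, -9), (10, -8), (14, -7)))  # full window
```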

    Explaining young mortality

    Stochastic modeling of mortality rates focuses on fitting linear models to logarithmically adjusted mortality data from the middle or late ages. Whilst this modeling enables insurers to project mortality rates and hence price mortality products, it does not provide a good fit for younger-age mortality. Mortality rates below the early 20s are important to model as they give an insight into estimates of the cohort effect for more recent years of birth. Given the cumulative nature of life expectancy, it is also important to be able to forecast mortality improvements at all ages. When we attempt to fit existing models to a wider age range, 5-89, rather than 20-89 or 50-89, their weaknesses are revealed as the results are not satisfactory. The linear innovations in existing models are not flexible enough to capture the non-linear profile of mortality rates that we see at the lower ages. In this paper, we modify an existing four-factor model of mortality to enable a better fit over a wider age range, and, using data from seven developed countries, our empirical results show that the proposed model fits the actual data better, is robust, and has good forecasting ability.
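    A sketch of the general approach the abstract refers to, fitting a linear factor model to log mortality rates: a basic Lee-Carter fit via SVD on synthetic data. The paper's modified four-factor model and the 5-89 age-range data are not reproduced here.

```python
# Fit a linear factor model to log mortality rates (basic Lee-Carter via SVD)
# on a synthetic mortality surface; not the paper's modified four-factor model.
import numpy as np

rng = np.random.default_rng(4)
ages, years = np.arange(5, 90), np.arange(1980, 2011)

# Synthetic log-mortality surface: an age profile plus a downward time trend.
alpha = -9.0 + 0.08 * (ages - 5.0)
kappa_true = -0.02 * (years - years[0])
log_m = alpha[:, None] + 0.5 * (ages[:, None] / 90.0) * kappa_true[None, :]
log_m += 0.02 * rng.standard_normal(log_m.shape)

# Lee-Carter: log m(x,t) = a(x) + b(x) k(t); estimate via SVD of the centred surface.
a_x = log_m.mean(axis=1)
U, s, Vt = np.linalg.svd(log_m - a_x[:, None], full_matrices=False)
b_x = U[:, 0] / U[:, 0].sum()          # age sensitivity, normalized to sum to 1
k_t = s[0] * Vt[0] * U[:, 0].sum()     # period mortality index

fitted = a_x[:, None] + b_x[:, None] * k_t[None, :]
print("RMSE of fitted log mortality:", np.sqrt(np.mean((fitted - log_m) ** 2)).round(4))
```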