
    A statistical analysis of particle trajectories in living cells

    Recent advances in molecular biology and fluorescence microscopy imaging have made it possible to infer the dynamics of single molecules in living cells. Such inference helps determine the organization and function of the cell. The trajectories of particles in cells, computed with tracking algorithms, can be modelled with diffusion processes. Three types of diffusion are considered: (i) free diffusion, (ii) subdiffusion and (iii) superdiffusion. The mean square displacement (MSD) is generally used to distinguish these different dynamics of particles in living cells (Qian, Sheetz and Elson, 1991). We propose a non-parametric three-decision test as an alternative to the MSD method. Rejection of the null hypothesis -- free diffusion -- is accompanied by a claim of the direction of the alternative (subdiffusion or superdiffusion). We study the asymptotic behaviour of the test statistic under the null hypothesis, and under parametric alternatives currently considered in the biophysics literature (e.g. Monnier et al., 2012). In addition, we adapt the procedure of Benjamini and Hochberg (2000) to the three-decision setting, so that the test can be applied to a collection of independent trajectories. Monte Carlo experiments confirm that our procedure performs much better than the MSD method. The method is demonstrated on real data sets of protein dynamics observed in fluorescence microscopy.
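    The MSD baseline that the test is compared against can be sketched in a few lines. The following is an illustrative Python implementation, not the authors' code: the function names, the log-log power-law fit, and the classification thresholds (0.7 and 1.3 on the fitted exponent) are assumptions chosen for the sketch.

    ```python
    import numpy as np

    def msd(traj, max_lag):
        """Empirical mean square displacement of a 2-D trajectory.

        traj: array of shape (n, 2) holding particle positions.
        Returns the MSD for time lags 1..max_lag, averaged over all
        overlapping displacement pairs.
        """
        out = np.empty(max_lag)
        for lag in range(1, max_lag + 1):
            disp = traj[lag:] - traj[:-lag]
            out[lag - 1] = np.mean(np.sum(disp ** 2, axis=1))
        return out

    def classify_by_msd(traj, max_lag=20):
        """Crude MSD-based classification: fit MSD(t) ~ t^alpha on a
        log-log scale; alpha near 1 suggests free diffusion, below 1
        subdiffusion, above 1 superdiffusion.  The cut-offs 0.7 and 1.3
        are illustrative, not the paper's decision rule."""
        lags = np.arange(1, max_lag + 1)
        alpha = np.polyfit(np.log(lags), np.log(msd(traj, max_lag)), 1)[0]
        if alpha < 0.7:
            return "subdiffusion"
        if alpha > 1.3:
            return "superdiffusion"
        return "free diffusion"
    ```

    For free diffusion MSD(t) grows linearly in t, sublinearly for subdiffusion and superlinearly for superdiffusion, so the fitted exponent separates the three regimes.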

    The effect of round-off error on long memory processes

    We study how round-off (or discretization) error changes the statistical properties of a Gaussian long memory process. We show that the autocovariance and the spectral density of the discretized process are asymptotically rescaled by a factor smaller than one, and we compute this scaling factor exactly. Consequently, the discretized process is also long memory, with the same Hurst exponent as the original process. We consider the properties of two estimators of the Hurst exponent, namely the local Whittle (LW) estimator and Detrended Fluctuation Analysis (DFA). Using analytical arguments and numerical simulations we show that, in the presence of round-off error, both estimators are severely negatively biased in finite samples. Under regularity conditions we prove that the LW estimator applied to discretized processes is consistent and asymptotically normal. Moreover, we compute the asymptotic properties of the DFA for a generic (i.e. non-Gaussian) long memory process and apply the result to discretized processes.
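    DFA, one of the two Hurst-exponent estimators discussed, is straightforward to sketch. The implementation below is a minimal illustration (first-order detrending, window sizes supplied by the caller), not the paper's exact procedure; it can be applied to a discretized series simply by rounding the input first.

    ```python
    import numpy as np

    def dfa_exponent(x, scales):
        """Detrended Fluctuation Analysis (order 1).

        Integrates the mean-centred series into a profile, splits the
        profile into non-overlapping windows of each size s, removes a
        linear trend per window, and returns the slope of
        log F(s) vs log s as the estimated scaling exponent.
        """
        profile = np.cumsum(x - np.mean(x))
        flucts = []
        for s in scales:
            n_win = len(profile) // s
            segs = profile[:n_win * s].reshape(n_win, s)
            t = np.arange(s)
            sq = [np.mean((seg - np.polyval(np.polyfit(t, seg, 1), t)) ** 2)
                  for seg in segs]
            flucts.append(np.sqrt(np.mean(sq)))
        return np.polyfit(np.log(scales), np.log(flucts), 1)[0]
    ```

    For white noise the exponent is about 0.5 and for a random walk about 1.5; feeding the estimator `np.round(x)` instead of `x` mimics the round-off error studied in the paper.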

    Instrumental Variable Interpretation of Cointegration with Inference Results for Fractional Cointegration

    In this paper we propose an alternative characterization of the central notion of cointegration, exploiting the relationship between the autocovariance and cross-covariance functions of the series. This characterization leads us to propose a new estimator of the cointegrating parameter based on the instrumental-variables (IV) methodology. The instrument is a delayed regressor, obtained from the conditional bivariate system of nonstationary fractionally integrated processes with a weakly stationary error-correction term. We prove the consistency of this estimator and derive its limiting distribution. We also show that, in the I(1) case, with a semiparametric correction simpler than the one required for fully modified ordinary least squares (FM-OLS), our fully modified instrumental-variables (FM-IV) estimator is median-unbiased, mixed normal, and asymptotically efficient. As a consequence, standard inference can be conducted with the new FM-IV estimator of the cointegrating parameter. Monte Carlo simulations show that the small-sample gains of the new IV estimator over OLS are remarkable.
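    The core idea, instrumenting the regressor with its own lag, can be illustrated on a toy I(1) system. This is a hypothetical sketch of a plain IV estimator, not the paper's FM-IV procedure (which adds a semiparametric correction); the variable names and the data-generating process in the usage below are invented for the example.

    ```python
    import numpy as np

    def iv_estimate(y, x, lag=1):
        """IV estimate of beta in y_t = beta * x_t + u_t, using the
        delayed regressor x_{t-lag} as the instrument z_t:
        beta_hat = (z'y) / (z'x)."""
        z = x[:-lag]
        return np.dot(z, y[lag:]) / np.dot(z, x[lag:])
    ```

    With `x` a random walk and `y = 2 * x + u` for a stationary error `u`, the estimate converges quickly to 2 because the lagged instrument is uncorrelated with the contemporaneous error while remaining highly correlated with the nonstationary regressor.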

    Spatial snow water equivalent estimation for mountainous areas using wireless-sensor networks and remote-sensing products

    We developed an approach to estimate snow water equivalent (SWE) by interpolating spatially representative point measurements with a k-nearest-neighbors (k-NN) algorithm and historical spatial SWE data. It accurately reproduced measured SWE when different data sources were used for training and evaluation. In the central-Sierra American River basin, we used the k-NN algorithm to interpolate data from continuous snow-depth measurements in 10 sensor clusters, fusing them with 14 years of daily 500-m-resolution SWE-reconstruction maps. Accurate SWE estimation over the melt season shows the potential for daily, near-real-time distributed snowmelt estimates. Further south, in the Merced and Tuolumne basins, we evaluated the potential of the k-NN approach to improve real-time SWE estimates. Lacking dense ground-measurement networks there, we simulated k-NN interpolation of sensor data using selected pixels of a bi-weekly Lidar-derived SWE product. The k-NN extrapolations underestimate the Lidar-derived SWE, with a maximum bias of −10 cm at elevations below 3000 m and +15 cm above 3000 m. This bias was reduced by using a Gaussian-process regression model to spatially distribute the residuals. Using as few as 10 scenes of Lidar-derived SWE from 2014 as k-NN training data to estimate the 2016 spatial SWE, both RMSE and MAE were reduced from roughly 20–25 cm to 10–15 cm relative to using SWE reconstructions as training data. We found that the spatial accuracy of the historical data matters more for learning the spatial distribution of SWE than the number of historical scenes available. Blending continuous, spatially representative ground-based sensors with a historical library of SWE reconstructions over the same basin can provide real-time spatial SWE maps that accurately represent Lidar-measured snow depth, and the estimates can be improved by using historical Lidar scans instead of SWE reconstructions.
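    One way to read the k-NN fusion step: treat each historical SWE map as a labelled pattern, compare the current sensor readings with the historical values at the sensor pixels, and average the closest scenes. The sketch below is a simplified reconstruction under that reading, not the authors' code; the Euclidean distance, the unweighted average of neighbours, and all names are assumptions.

    ```python
    import numpy as np

    def knn_swe_map(sensor_values, sensor_idx, library, k=3):
        """Estimate a full SWE map from point sensor readings.

        sensor_values: current readings at the sensor locations.
        sensor_idx:    pixel indices of those sensors in a flattened map.
        library:       (n_scenes, n_pixels) historical SWE maps.

        Finds the k historical scenes whose values at the sensor pixels
        are closest (Euclidean distance) to the current readings, and
        returns their pixel-wise average as the estimated map.
        """
        dists = np.linalg.norm(library[:, sensor_idx] - sensor_values, axis=1)
        nearest = np.argsort(dists)[:k]
        return library[nearest].mean(axis=0)
    ```

    Because the estimate is a blend of physically consistent historical maps, it inherits their spatial structure, which is why the abstract stresses that the spatial accuracy of the historical library matters more than its size.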

    Supply driven mortgage choice

    Variable-rate mortgage contracts dominate the UK mortgage market (Miles, 2004). This dominance has important consequences for the transmission mechanism of monetary policy decisions and for systemic risk (Khandani et al., 2012; Fuster and Vickery, 2013). It raises an obvious concern that a mortgage market such as the UK's, where the major proportion of mortgage debt is either at a variable rate or fixed for less than two years (Badarinza et al., 2013; CML, 2012), is vulnerable to changes in the interest-rate regime. Theoretically, mortgage choice is determined by both demand and supply factors. So far, most of the existing literature has focused on the demand side, and supply-side factors have received only limited attention in empirical investigations of mortgage choice. This paper uniquely explores whether supply-side factors may partially explain observed (ex-post) mortgage-type decisions. The empirical results indicate that lenders' profit motives and mortgage funding and pricing practices may have steered preferences toward variable-rate contracts. Securitisation is found to affect gross mortgage lending volumes positively while affecting the share of variable-rate lending flows negatively. This shows that an increase in securitisation not only improves liquidity in the supply of mortgage funds, but also has the potential to shift mortgage choices toward fixed-rate debt. The policy implications may involve a number of measures, including reconsidering the capital requirements for fixed-rate as opposed to variable-rate mortgage debt, growing securitisation, and optimising mortgage-pricing policies.