    Modeling and estimation of multiresolution stochastic processes

    Michele Basseville ... [et al.]. Includes bibliographical references (p. 47-51). Caption title. Research supported in part by the National Science Foundation (ECS-8700903), the Air Force Office of Scientific Research (AFOSR-88-0032), the US Army Research Office (DAAL03-86-K-0171), and INRIA.

    An Automated procedure for simulating complex arrival processes: A Web-based approach

    In industry, simulation is one of the most widely used probabilistic tools for modeling highly complex systems. Major sources of complexity include the inputs that drive the logic of the model. Effective simulation input modeling requires accurate and efficient input modeling procedures. This research focuses on nonstationary arrival processes. The fundamental stochastic model on which this study is conducted is the nonhomogeneous Poisson process (NHPP), which has successfully been used to characterize arrival processes whose arrival rate changes over time. Although a number of methods exist for modeling the rate and mean value functions that define the behavior of NHPPs, one of the most flexible is a multiresolution procedure for modeling the mean value function of processes exhibiting long-term trends or asymmetric, multiple cyclic behavior. In this research, a statistical-estimation procedure for automating the multiresolution procedure is developed that involves the following steps at each resolution level corresponding to a basic cycle: (a) transforming the cumulative relative frequency of arrivals within the cycle to obtain a linear statistical model having normal residuals with homogeneous variance; (b) fitting specially formulated polynomials to the transformed arrival data; (c) performing a likelihood ratio test to determine the degree of the fitted polynomial; and (d) fitting a polynomial of the degree determined in (c) to the original (untransformed) arrival data. Next, an experimental performance evaluation is conducted to test the effectiveness of the estimation method. A web-based application for modeling NHPPs using the automated multiresolution procedure and generating realizations of the NHPP is developed. Finally, a web-based simulation infrastructure that integrates modeling, input analysis, verification, validation and output analysis is discussed.
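
    A minimal sketch of two of the core ideas, in Python with NumPy: fitting a polynomial to the empirical mean value function Λ(t) = E[N(t)], and generating NHPP realizations by thinning. The rate function, the fixed polynomial degree, and all numeric values below are illustrative assumptions; the paper's automated degree selection via the likelihood ratio test and its variance-stabilizing transformation are not reproduced here.

    ```python
    import numpy as np

    def fit_mean_value_poly(arrival_times, degree):
        """Fit a polynomial approximation to the NHPP mean value function
        Lambda(t) = E[N(t)] from observed arrival times. The degree is fixed
        by the caller, standing in for the likelihood-ratio-based selection."""
        t = np.sort(np.asarray(arrival_times))
        counts = np.arange(1, len(t) + 1, dtype=float)  # empirical Lambda(t_i) = i
        return np.poly1d(np.polyfit(t, counts, degree))

    def simulate_nhpp_thinning(rate, rate_max, horizon, seed=None):
        """Generate one NHPP realization on [0, horizon] by thinning a
        homogeneous Poisson process whose rate rate_max dominates rate(t)."""
        rng = np.random.default_rng(seed)
        times, t = [], 0.0
        while True:
            t += rng.exponential(1.0 / rate_max)       # candidate arrival
            if t > horizon:
                break
            if rng.random() < rate(t) / rate_max:      # accept w.p. rate(t)/rate_max
                times.append(t)
        return np.array(times)

    # Hypothetical rate with a long-term trend and a 24-hour cycle.
    rate = lambda t: 5.0 + 0.5 * t + 3.0 * np.sin(2.0 * np.pi * t / 24.0)
    arrivals = simulate_nhpp_thinning(rate, rate_max=50.0, horizon=72.0, seed=1)
    Lambda_hat = fit_mean_value_poly(arrivals, degree=4)
    ```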

    Time-varying parametric modelling and time-dependent spectral characterisation with applications to EEG signals using multi-wavelets

    A new time-varying autoregressive (TVAR) modelling approach is proposed for nonstationary signal processing and analysis, with application to EEG data modelling and power spectral estimation. In the new parametric modelling framework, the time-dependent coefficients of the TVAR model are represented using a novel multi-wavelet decomposition scheme. The time-varying modelling problem is then reduced to regression selection and parameter estimation, which can be solved effectively using a forward orthogonal regression algorithm. Two examples, one for an artificial signal and another for an EEG signal, are given to show the effectiveness and applicability of the new TVAR modelling method.
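
    A minimal sketch of the central reduction, assuming a hand-rolled multi-scale Haar-style basis as a stand-in for the paper's multi-wavelet dictionary: each time-varying AR coefficient is expanded over a fixed basis, turning the TVAR model into an ordinary linear regression. Plain least squares replaces the paper's forward orthogonal regression.

    ```python
    import numpy as np

    def haar_basis(n, max_level):
        """Columns: a constant plus Haar-style square waves at dyadic scales,
        a simple stand-in for a multi-wavelet decomposition."""
        t = np.arange(n)
        cols = [np.ones(n)]
        for level in range(1, max_level + 1):
            period = n / 2 ** (level - 1)
            cols.append(np.where((t % period) < period / 2, 1.0, -1.0))
        return np.column_stack(cols)

    def fit_tvar(y, order, max_level):
        """TVAR(order): y[t] = sum_i a_i(t) * y[t-i] + e[t], with each a_i(t)
        expanded over the basis, solved here by least squares."""
        n = len(y)
        B = haar_basis(n, max_level)                    # (n, m) basis values
        rows, targets = [], []
        for t in range(order, n):
            # Regressors: lagged outputs times the basis evaluated at time t.
            rows.append(np.concatenate([y[t - i] * B[t] for i in range(1, order + 1)]))
            targets.append(y[t])
        theta, *_ = np.linalg.lstsq(np.array(rows), np.array(targets), rcond=None)
        # a_coeffs[i] holds the basis weights of the (i+1)-th AR coefficient.
        a_coeffs = theta.reshape(order, B.shape[1])
        return a_coeffs, B
    ```

    The i-th time-varying coefficient is then recovered as B @ a_coeffs[i], and a time-dependent spectrum follows from the frozen-time AR transfer function at each t.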

    A Multiresolution Stochastic Process Model for Predicting Basketball Possession Outcomes

    Basketball games evolve continuously in space and time as players constantly interact with their teammates, the opposing team, and the ball. However, current analyses of basketball outcomes rely on discretized summaries of the game that reduce such interactions to tallies of points, assists, and similar events. In this paper, we propose a framework for using optical player tracking data to estimate, in real time, the expected number of points obtained by the end of a possession. This quantity, called expected possession value (EPV), derives from a stochastic process model for the evolution of a basketball possession; we model this process at multiple levels of resolution, differentiating between continuous, infinitesimal movements of players, and discrete events such as shot attempts and turnovers. Transition kernels are estimated using hierarchical spatiotemporal models that share information across players while remaining computationally tractable on very large data sets. In addition to estimating EPV, these models reveal novel insights into players' decision-making tendencies as a function of their spatial strategy. Comment: 31 pages, 9 figures.
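
    The paper couples continuous player motion with discrete events; purely as a toy illustration of the coarse resolution, the sketch below computes expected points from an absorbing Markov chain over possession states, via the standard fundamental-matrix identity v = (I − Q)⁻¹ R r. All states, transition probabilities, and shooting percentages are made-up assumptions, far simpler than the hierarchical spatiotemporal kernels estimated in the paper.

    ```python
    import numpy as np

    # Toy macro-state chain for one possession (hypothetical values).
    states = ["ball_handler", "pass", "shot_2pt", "shot_3pt", "turnover"]
    P = np.array([
        # ball_handler  pass  shot_2pt  shot_3pt  turnover
        [0.20,          0.50, 0.15,     0.10,     0.05],   # from ball_handler
        [0.60,          0.20, 0.10,     0.08,     0.02],   # from pass
    ])
    # Expected points on entering each absorbing state (assumed FG%).
    r = np.array([2 * 0.50, 3 * 0.36, 0.0])

    # Split into transient (Q) and absorbing (R) blocks and solve
    # v = Q v + R r  =>  v = (I - Q)^{-1} R r.
    Q, R = P[:, :2], P[:, 2:]
    v = np.linalg.solve(np.eye(2) - Q, R @ r)
    print(dict(zip(states[:2], v)))   # expected points from each transient state
    ```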

    High-Dimensional Bayesian Geostatistics

    With the growing capabilities of Geographic Information Systems (GIS) and user-friendly software, statisticians today routinely encounter geographically referenced data containing observations from a large number of spatial locations and time points. Over the last decade, hierarchical spatiotemporal process models have become widely deployed statistical tools for researchers to better understand the complex nature of spatial and temporal variability. However, fitting hierarchical spatiotemporal models often involves expensive matrix computations whose complexity grows as the cube of the number of spatial locations and time points. This renders such models infeasible for large data sets. This article offers a focused review of two methods for constructing well-defined, highly scalable spatiotemporal stochastic processes. Both of these processes can be used as "priors" for spatiotemporal random fields. The first approach constructs a low-rank process operating on a lower-dimensional subspace. The second approach constructs a Nearest-Neighbor Gaussian Process (NNGP) that ensures sparse precision matrices for its finite realizations. Both processes can be exploited as a scalable prior embedded within a rich hierarchical modeling framework to deliver full Bayesian inference. These approaches can be described as model-based solutions for big spatiotemporal datasets. The models ensure that the algorithmic complexity is ∼n floating point operations (flops) per iteration, where n is the number of spatial locations. We compare these methods and provide some insight into their methodological underpinnings.
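
    A minimal sketch of the first (low-rank) construction: the predictive-process approximation C ≈ C_nm C_mm⁻¹ C_mn built from m knots, so downstream solves cost O(n m²) flops rather than O(n³). The exponential covariance, knot placement, and all parameter values are illustrative assumptions; the NNGP construction is not shown.

    ```python
    import numpy as np
    from scipy.linalg import solve_triangular

    def exp_cov(A, B, sigma2=1.0, phi=1.0):
        """Exponential covariance between row-wise location sets A and B."""
        d = np.linalg.norm(A[:, None, :] - B[None, :, :], axis=-1)
        return sigma2 * np.exp(-phi * d)

    rng = np.random.default_rng(0)
    n, m = 2000, 50                        # n locations, m << n knots
    locs = rng.uniform(0.0, 10.0, size=(n, 2))
    knots = rng.uniform(0.0, 10.0, size=(m, 2))

    C_nm = exp_cov(locs, knots)            # (n, m) cross-covariance
    C_mm = exp_cov(knots, knots) + 1e-8 * np.eye(m)   # jitter for stability

    # Predictive-process approximation: C ~= C_nm C_mm^{-1} C_mn.
    # With C_mm = L L^T, the factor F = C_nm L^{-T} gives C ~= F F^T,
    # so every downstream operation works with the (n, m) factor only.
    L = np.linalg.cholesky(C_mm)
    F = solve_triangular(L, C_nm.T, lower=True).T      # (n, m) factor

    # Example: a cheap approximate Gaussian-process draw at all n locations.
    w = F @ rng.standard_normal(m)
    ```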