Smooth flexible models of nonhomogeneous Poisson processes fit to one or more process realizations
Simulation is the technique of creating representations or models of real-world systems or processes and conducting experiments on them to predict the behavior of the actual systems. Input modeling is a critical aspect of simulation modeling. Stochastic input models are used to represent various aspects of a system under uncertainty, including process times and interarrival times. This research focuses on input models for nonstationary arrival processes that can be represented as nonhomogeneous Poisson processes (NHPPs). In particular, a smooth flexible model for the mean-value function (or integrated rate function) of a general NHPP is estimated. To represent the mean-value function, the method uses a specially formulated polynomial that is constrained in least-squares estimation to be nondecreasing, so the corresponding rate function is nonnegative and continuously differentiable. The degree of the polynomial is determined by applying a modified likelihood ratio test to a set of transformed arrival times resulting from a variance-stabilizing transformation of the observed data. Given the degree of the polynomial, final estimates of the polynomial coefficients are obtained from the original arrival times using least-squares estimation. The method is extended to fit an NHPP model to multiple observed realizations of a process. In addition, the method is adapted to a multiresolution procedure that effectively models NHPPs with long-term trend and cyclic behavior given multiple process realizations. An experimental performance evaluation is conducted to determine the capabilities and limitations of the NHPP fitting procedure for single and multiple realizations of test processes. The method is implemented in a Java-based programming environment along with a web interface that allows users to upload observed data, fit an NHPP, and generate realizations of the fitted NHPP for use in simulation experiments.
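A minimal sketch of the constrained least-squares step, assuming the polynomial degree has already been selected (the variance-stabilizing transformation, likelihood ratio test, and the specially formulated basis of the abstract are not reproduced); monotonicity of the fitted mean-value function is enforced on a finite grid, which is a simplification:

```python
import numpy as np
from scipy.optimize import minimize

def fit_nhpp_mean_value(arrivals, degree, t_end):
    """Constrained least-squares fit of a polynomial mean-value function
    Lambda(t) = sum_k c_k t^k (no constant term, so Lambda(0) = 0) to the
    empirical mean-value function Lambda(t_i) ~ i.  Monotonicity, and hence
    a nonnegative rate lambda(t) = Lambda'(t), is enforced on a grid."""
    arrivals = np.sort(np.asarray(arrivals, dtype=float))
    n = len(arrivals)
    targets = np.arange(1, n + 1, dtype=float)   # i-th arrival -> level i
    powers = np.arange(1, degree + 1)

    def mean_value(c, t):
        return (t[:, None] ** powers) @ c

    def rate(c, t):
        return (powers * t[:, None] ** (powers - 1)) @ c

    grid = np.linspace(1e-9, t_end, 200)         # monotonicity check points
    objective = lambda c: np.sum((mean_value(c, arrivals) - targets) ** 2)
    nondecreasing = {"type": "ineq", "fun": lambda c: rate(c, grid)}
    c0 = np.zeros(degree)
    c0[0] = n / t_end                            # start from a constant rate
    res = minimize(objective, c0, method="SLSQP", constraints=[nondecreasing])
    return res.x                                 # coefficients c_1 .. c_degree
```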
Simulations of some Doubly Stochastic Poisson Point Processes
Computer simulations of point processes are important either to verify the results of certain theoretical calculations that can be very awkward at times, or to obtain practical results when these calculations become almost impossible. One of the most common methods for the simulation of nonstationary Poisson processes is random thinning. Its extension when the intensity becomes random (doubly stochastic Poisson processes) depends on the structure of this intensity. If the random intensity takes only discrete values, which is a common situation in many physical problems where quantum mechanics introduces discrete states, it is shown that the thinning method can be applied without error. We study in particular the case of a binary intensity and present the kind of theoretical calculations that then become possible. The results of various experiments realized with data obtained by simulation show fairly good agreement with the theoretical calculations.
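As an illustration of thinning with a discrete-valued intensity, here is a sketch simulating a doubly stochastic Poisson process whose binary intensity is a two-state Markov (telegraph) signal; the Markov switching mechanism and the initial state are assumptions for the illustration, not the paper's specific model:

```python
import numpy as np

rng = np.random.default_rng(0)

def telegraph_intensity(t_end, q01, q10):
    """Sample a binary intensity path: a Markov telegraph process that
    starts in state 0 and switches 0->1 at rate q01 and 1->0 at rate q10."""
    times, states = [0.0], [0]
    t, s = 0.0, 0
    while t < t_end:
        t += rng.exponential(1.0 / (q01 if s == 0 else q10))
        s = 1 - s
        times.append(min(t, t_end))
        states.append(s)
    return np.array(times), np.array(states)

def thin_doubly_stochastic(t_end, lam0, lam1, q01, q10):
    """Thinning without error: candidate points at rate lam_max are kept
    with probability lam(t)/lam_max, given the sampled intensity path."""
    sw_times, sw_states = telegraph_intensity(t_end, q01, q10)
    lam_max = max(lam0, lam1)
    n_cand = rng.poisson(lam_max * t_end)
    cand = np.sort(rng.uniform(0.0, t_end, n_cand))
    idx = np.searchsorted(sw_times, cand, side="right") - 1
    lam_at = np.where(sw_states[idx] == 0, lam0, lam1)
    keep = rng.uniform(size=n_cand) < lam_at / lam_max
    return cand[keep]
```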
Estimation and control of non-linear and hybrid systems with applications to air-to-air guidance
Issued as Progress report and Final report, Project no. E-21-67
Data-driven Decisions in Service Systems
This thesis makes contributions to help provide data-driven (or evidence-based) decision support to service systems, especially hospitals. Three selected topics are presented.
First, we discuss how Little's Law (L = λW), which relates the time-average number in system, the arrival rate, and the average waiting time, can be applied to service-system data collected over a finite time interval. To support inference based on the resulting indirect estimator of average waiting times, we propose methods for estimating confidence intervals and for adjusting estimates to reduce bias. We show that our new methods are effective using simulations and data from a US bank call center.
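To make the indirect estimator concrete, here is a sketch assuming a complete record of arrival and departure epochs over [0, t_end] and a system that starts empty; the bias adjustments and confidence-interval methods developed in the thesis are not reproduced:

```python
import numpy as np

def littles_law_indirect(arrivals, departures, t_end):
    """Indirect estimate of the mean waiting time via Little's law,
    W = L / lambda, where L is the time-average number in system over
    [0, t_end] and lambda is the observed arrival rate.  Assumes all
    arrivals fall in [0, t_end]; departures after t_end are censored."""
    arrivals = np.asarray(arrivals, float)
    departures = np.minimum(np.asarray(departures, float), t_end)
    # Build the number-in-system step function: +1 at each arrival,
    # -1 at each departure, then integrate it over [0, t_end].
    events = np.concatenate([arrivals, departures])
    steps = np.concatenate([np.ones_like(arrivals), -np.ones_like(departures)])
    order = np.argsort(events)
    times, steps = events[order], steps[order]
    occupancy = np.cumsum(steps)
    durations = np.diff(np.append(times, t_end))
    L = np.sum(occupancy * durations) / t_end    # time-average occupancy
    lam = len(arrivals) / t_end                  # observed arrival rate
    return L / lam                               # indirect estimate of W
```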
Second, we address important issues that need to be taken into account when testing whether real arrival data can be modeled by nonhomogeneous Poisson processes (NHPPs). We apply our method to data from a US bank call center and a hospital emergency department and demonstrate that their arrivals are well modeled by NHPPs.
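One standard building block of such a test, sketched below under simplifying assumptions: conditional on the count, the arrival epochs of a homogeneous Poisson process on an interval are i.i.d. uniform, so rescaled arrivals on a short subinterval (over which an NHPP rate is roughly constant) can be checked with a Kolmogorov-Smirnov test. The data-rounding and subinterval-choice issues the thesis actually addresses are not reproduced here.

```python
import numpy as np
from scipy import stats

def conditional_uniform_ks(arrivals, t_start, t_end):
    """Conditional-uniform KS test on one subinterval: given the count,
    Poisson arrival epochs are i.i.d. uniform, so the rescaled times
    should pass a KS test against Uniform(0, 1)."""
    u = (np.sort(np.asarray(arrivals, float)) - t_start) / (t_end - t_start)
    return stats.kstest(u, "uniform")
```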
Lastly, we discuss an approach to standardizing the Intensive Care Unit admission process, which currently lacks well-defined criteria. Using data from nearly 200,000 hospitalizations, we discuss how to quantify the impact of Intensive Care Unit admission on individual patients' clinical outcomes. We then use this quantified impact and a stylized model to discuss optimal admission policies. We use simulation to compare the performance of our proposed optimal policies to the current admission policy, and show that the gain can be significant.
Consistent estimator of ex-post covariation of discretely observed diffusion processes and its application to high frequency financial time series
The first chapter of my thesis reviews recent developments in the theory and practice of volatility measurement. We review the basic theoretical framework and describe the main approaches to volatility measurement in continuous time. In this literature the central parameter of interest is the integrated variance and its multivariate counterpart. We describe the measurement of these parameters under ideal circumstances and when the data are subject to measurement error and microstructure issues. We also describe some common applications of this literature.
In the second chapter, we propose a new estimator of multivariate ex-post volatility that is robust to microstructure noise and asynchronous data timing. The method is based on Fourier-domain techniques. Its advantage is that it does not require explicit time alignment, unlike existing methods in the literature. We derive the large-sample properties of our estimator under general assumptions that allow the numbers of sample points for different assets to be of different orders of magnitude. We show in extensive simulations that our method outperforms the time-domain estimator, especially when two assets are traded very asynchronously and with different liquidity.
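A simplified sketch of a Fourier-domain covariance estimator in the spirit of Malliavin and Mancino, assuming clean log-prices (the chapter's noise-robustness corrections and asymptotic refinements are not reproduced); note that the two assets' observation grids never need to be aligned:

```python
import numpy as np

def fourier_covariance(t1, p1, t2, p2, N):
    """Fourier-domain estimate of the integrated covariance of two log-price
    series observed at arbitrary, unaligned times.  Times are rescaled to
    [0, 2*pi]; c_k are Fourier coefficients of the return 'measure', and the
    covariance is recovered from products of coefficients up to frequency N."""
    t1, t2 = np.asarray(t1, float), np.asarray(t2, float)
    t_min = min(t1[0], t2[0])
    scale = 2.0 * np.pi / (max(t1[-1], t2[-1]) - t_min)
    k = np.arange(-N, N + 1)

    def coeffs(t, p):
        tau = (t - t_min) * scale
        r = np.diff(np.asarray(p, float))          # log-returns
        # c_k(dp) = (1 / 2*pi) * sum_i exp(-i k tau_i) r_i
        return np.exp(-1j * np.outer(k, tau[:-1])) @ r / (2.0 * np.pi)

    c1, c2 = coeffs(t1, p1), coeffs(t2, p2)
    est = (2.0 * np.pi) ** 2 / (2 * N + 1) * np.sum(c1 * np.conj(c2))
    return float(est.real)
```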
In the third chapter, we propose to model high-frequency price series by a time-deformed Lévy process. The deformation function is modeled as a piecewise-linear function of physical time with a slope depending on the marks associated with intraday transaction data. The performance of a quasi-MLE and of an estimator based on a permutation-like statistic is examined in extensive simulations. We also consider estimating the deformation function nonparametrically by pooling many time series. We show that financial returns spaced by equal increments of estimated deformed time are homogeneous. We propose an order-execution strategy using the fitted deformation time.
Data-driven reconfigurable supply chain design and inventory control
In this dissertation, we examine resource mobility in a supply chain that attempts to satisfy geographically distributed demand through resource sharing, where the resources can be inventory and manufacturing capacity. Our objective is to examine how resource mobility, coupled with data-driven analytics, can produce supply chains that blend the advantages of distributed production-inventory systems (e.g., fast fulfillment) and centralized systems (e.g., economies of scale, less total buffer inventory, and reduced capital expenditures) without reducing customer service levels. We present efficient and effective solution methods for logistics management of multi-location production-inventory systems with transportable production capacity. We also present a novel, generalized representation of demand uncertainty and propose data-driven responses to manage a single-location inventory system under such demands.
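As one simple instance of a data-driven response for a single-location system, here is a sample-average-approximation newsvendor sketch; the dissertation's generalized demand representation is richer than the empirical quantile used here:

```python
import numpy as np

def saa_newsvendor(demand_samples, unit_cost, price):
    """Data-driven base-stock level for a single-location newsvendor:
    the order quantity that maximizes expected profit against the
    empirical demand distribution is its critical-ratio quantile."""
    critical_ratio = (price - unit_cost) / price
    return float(np.quantile(np.asarray(demand_samples, float), critical_ratio))

# Usage: order to the 75th percentile of observed demand when the
# margin-to-price ratio is 0.75.
q = saa_newsvendor([80, 95, 110, 120, 100, 90], unit_cost=2.5, price=10.0)
```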
Scaling Multidimensional Inference for Big Structured Data
In information technology, big data is a collection of data sets so large and complex that they become difficult to process using traditional data-processing applications [151]. In a world of increasing sensor modalities, cheaper storage, and more data-oriented questions, we are quickly passing the limits of tractable computation using traditional statistical analysis methods. Methods that often show great results on simple data have difficulty processing complicated multidimensional data. Accuracy alone can no longer justify unwarranted memory use and computational complexity. Improving the scaling properties of these methods for multidimensional data is the only way to keep them relevant. In this work we explore methods for improving the scaling properties of parametric and nonparametric models. Namely, we exploit the structure of the data to lower the complexity of a specific family of problems. The two types of structure considered in this work are distributed optimization with separable constraints (Chapters 2-3) and scaling Gaussian processes for multidimensional lattice input (Chapters 4-5). By improving the scaling of these methods, we can expand their use to a wide range of applications that were previously intractable and open the door to new research questions.
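To make the lattice-structure idea concrete, here is a sketch of the standard Kronecker trick for Gaussian processes on a multidimensional grid: a product kernel on a lattice factors as a Kronecker product of small per-dimension matrices, so matrix-vector products (the core of GP inference) never form the full covariance. This is the generic technique, not the specific algorithms of Chapters 4-5.

```python
import numpy as np

def rbf(x, ell=1.0):
    """Squared-exponential kernel matrix for a 1-D grid of inputs."""
    d = x[:, None] - x[None, :]
    return np.exp(-0.5 * (d / ell) ** 2)

def kron_mvm(Ks, v):
    """Compute kron(Ks[0], ..., Ks[-1]) @ v without forming the Kronecker
    product: cost O(n * sum_d n_d) for n = prod_d n_d, versus O(n^2) for
    a dense covariance matrix."""
    shape = [K.shape[0] for K in Ks]
    x = v.reshape(shape)
    for d, K in enumerate(Ks):
        # Apply the d-th factor along axis d of the reshaped tensor.
        x = np.moveaxis(np.tensordot(K, x, axes=([1], [d])), 0, d)
    return x.reshape(-1)

# Usage: a 50 x 40 lattice gives a 2000 x 2000 covariance, handled here
# via one 50x50 and one 40x40 factor.
K1, K2 = rbf(np.linspace(0, 1, 50)), rbf(np.linspace(0, 1, 40))
v = np.random.default_rng(0).standard_normal(50 * 40)
assert np.allclose(kron_mvm([K1, K2], v), np.kron(K1, K2) @ v)
```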
Numerical Simulations
This book will interest researchers, scientists, engineers, and graduate students in many disciplines who make use of mathematical modeling and computer simulation. Although it represents only a small sample of the research activity on numerical simulations, the book will certainly serve as a valuable tool for researchers interested in getting involved in this multidisciplinary field. It should encourage further experimental and theoretical research in the areas of numerical simulation mentioned above.