328 research outputs found

    An Automated procedure for simulating complex arrival processes: A Web-based approach

    In industry, simulation is one of the most widely used probabilistic modeling tools for modeling highly complex systems. Major sources of complexity include the inputs that drive the logic of the model. Effective simulation input modeling requires the use of accurate and efficient input modeling procedures. This research focuses on nonstationary arrival processes. The fundamental stochastic model on which this study is conducted is the nonhomogeneous Poisson process (NHPP), which has successfully been used to characterize arrival processes where the arrival rate changes over time. Although a number of methods exist for modeling the rate and mean value functions that define the behavior of NHPPs, one of the most flexible is a multiresolution procedure that is used to model the mean value function for processes possessing long-term trends over time or asymmetric, multiple cyclic behavior. In this research, a statistical-estimation procedure for automating the multiresolution procedure is developed that involves the following steps at each resolution level corresponding to a basic cycle: (a) transforming the cumulative relative frequency of arrivals within the cycle to obtain a linear statistical model having normal residuals with homogeneous variance; (b) fitting specially formulated polynomials to the transformed arrival data; (c) performing a likelihood ratio test to determine the degree of the fitted polynomial; and (d) fitting a polynomial of the degree determined in (c) to the original (untransformed) arrival data. Next, an experimental performance evaluation is conducted to test the effectiveness of the estimation method. A web-based application for modeling NHPPs using the automated multiresolution procedure and generating realizations of the NHPP is developed. Finally, a web-based simulation infrastructure that integrates modeling, input analysis, verification, validation, and output analysis is discussed.
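    Once a rate function is in hand, NHPP realizations are commonly generated by thinning (Lewis-Shedler). The sketch below is a generic illustration under an assumed sinusoidal rate, not the multiresolution procedure or web application described above; `thin_nhpp`, the rate constants, and the seed are all hypothetical.

```python
import math
import random

def thin_nhpp(rate, t_max, rate_max, rng=random.Random(1)):
    """Generate one NHPP realization on [0, t_max] by thinning
    (Lewis-Shedler): propose arrivals from a homogeneous Poisson
    process with rate rate_max >= rate(t) for all t, and accept the
    proposal at time t with probability rate(t) / rate_max."""
    arrivals, t = [], 0.0
    while True:
        t += rng.expovariate(rate_max)          # next candidate arrival
        if t > t_max:
            return arrivals
        if rng.random() < rate(t) / rate_max:   # accept with prob rate(t)/rate_max
            arrivals.append(t)

# Hypothetical arrival rate with one daily cycle: 6 +/- 4 arrivals per hour
rate = lambda t: 6.0 + 4.0 * math.sin(2.0 * math.pi * t / 24.0)
times = thin_nhpp(rate, t_max=24.0, rate_max=10.0)
```

    The expected count over the day is the integral of the rate (144 here); an estimation procedure such as the one above would replace the assumed rate with one fitted to observed arrivals.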

    Smooth flexible models of nonhomogeneous Poisson processes fit to one or more process realizations

    Simulation is a technique of creating representations or models of real world systems or processes and conducting experiments to predict behavior of actual systems. Input modeling is a critical aspect of simulation modeling. Stochastic input models are used to model various aspects of the system under uncertainty, including process times and interarrival times. This research focuses on input models for nonstationary arrival processes that can be represented as nonhomogeneous Poisson processes (NHPPs). In particular, a smooth flexible model for the mean-value function (or integrated rate function) of a general NHPP is estimated. To represent the mean-value function, the method utilizes a specially formulated polynomial that is constrained in least-squares estimation to be nondecreasing so the corresponding rate function is nonnegative and continuously differentiable. The degree of the polynomial is determined by applying a modified likelihood ratio test to a set of transformed arrival times resulting from a variance stabilizing transformation of the observed data. Given the degree of the polynomial, final estimates of the polynomial coefficients are obtained from the original arrival times using least-squares estimation. The method is extended to fit an NHPP model to multiple observed realizations of a process. In addition, the method is adapted to a multiresolution procedure that effectively models NHPPs with long term trend and cyclic behavior given multiple process realizations. An experimental performance evaluation is conducted to determine the capabilities and limitations of the NHPP fitting procedure for single and multiple realizations of test processes. The method is implemented in a Java-based programming environment along with a web interface that allows users to upload observed data, fit an NHPP, and generate realizations of the fitted NHPP for use in simulation experiments.
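    Given a fitted nondecreasing mean-value function Λ(t), realizations can be generated by inversion: unit-rate Poisson arrivals s_i map to t_i = Λ⁻¹(s_i). The sketch below uses a hypothetical cubic Λ and numerical bisection; it illustrates the generation step only, not the constrained least-squares fit itself, and all names and tolerances are assumptions.

```python
import random

def invert(Lambda, s, t_max, tol=1e-9):
    """Numerically invert a nondecreasing mean-value function by bisection."""
    lo, hi = 0.0, t_max
    while hi - lo > tol:
        mid = 0.5 * (lo + hi)
        if Lambda(mid) < s:
            lo = mid
        else:
            hi = mid
    return 0.5 * (lo + hi)

def nhpp_by_inversion(Lambda, t_max, rng=random.Random(7)):
    """Map unit-rate Poisson arrival epochs s_i through Lambda^{-1}
    to obtain NHPP arrival times on [0, t_max]."""
    arrivals, s = [], 0.0
    total = Lambda(t_max)
    while True:
        s += rng.expovariate(1.0)       # next unit-rate arrival epoch
        if s > total:
            return arrivals
        arrivals.append(invert(Lambda, s, t_max))

# Hypothetical fitted cubic mean-value function, nondecreasing on [0, 10]
Lambda = lambda t: 2.0 * t + 0.05 * t ** 3
times = nhpp_by_inversion(Lambda, t_max=10.0)
```

    Because Λ is nondecreasing, the mapped times come out sorted automatically, and the expected number of arrivals on [0, 10] is Λ(10) = 70.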

    Risk analysis of light-frame wood construction due to multiple hazards

    Light-frame wood buildings are widely built in the United States (U.S.). Natural hazards cause huge losses to light-frame wood construction. This study proposes methodologies and a framework to evaluate the performance and risk of light-frame wood construction. Performance-based engineering (PBE) aims to ensure that a building achieves the desired performance objectives when subjected to hazard loads. In this study, the collapse risk of a typical one-story light-frame wood building is determined using the Incremental Dynamic Analysis method. The collapse risks of buildings at four sites in the Eastern, Western, and Central regions of the U.S. are evaluated. Various sources of uncertainties are considered in the collapse risk assessment so that the influence of uncertainties on the collapse risk of light-frame wood construction is evaluated. The collapse risks of the same building subjected to maximum considered earthquakes at different seismic zones are found to be non-uniform. In certain areas of the U.S., snow accumulation is significant, causing huge economic losses and threatening life safety. Limited study has been performed to investigate the snow hazard when combined with a seismic hazard. A Filtered Poisson Process (FPP) model is developed in this study, overcoming the shortcomings of the typically used Bernoulli model. The FPP model is validated by comparing the simulation results to weather records obtained from the National Climatic Data Center. The FPP model is applied in the proposed framework to assess the risk of a light-frame wood building subjected to combined snow and earthquake loads. The snow accumulation has a significant influence on the seismic losses of the building. The Bernoulli snow model underestimates the seismic loss of buildings in areas with snow accumulation. An object-oriented framework is proposed in this study to perform risk assessment for light-frame wood construction.
For homeowners and stakeholders, risk expressed in terms of economic losses is much easier to understand than engineering parameters (e.g., inter-story drift). The proposed framework is used in two applications. One is to assess the loss of the building subjected to mainshock-aftershock sequences. Aftershock and downtime costs are found to be important factors in the assessment of seismic losses. The framework is also applied to a wood building in the state of Washington to assess the loss of the building subjected to combined earthquake and snow loads. The proposed framework is proven to be an appropriate tool for risk assessment of buildings subjected to multiple hazards. Limitations and future work are also identified.
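    A filtered Poisson process has the general form S(t) = Σ_{t_i ≤ t} w_i h(t − t_i): Poisson event times t_i, random marks w_i, and a response function h. The snow-accumulation sketch below uses an exponential-melt response; the rates, units, and names are hypothetical choices for illustration, not the calibrated model validated against the National Climatic Data Center records.

```python
import math
import random

def snow_depth_path(rate_per_day, mean_fall_cm, melt_rate, days,
                    rng=random.Random(3)):
    """Filtered Poisson process sketch: snowfall events arrive as a
    homogeneous Poisson process; event i deposits an
    Exponential(mean_fall_cm) depth w_i that then decays (melts)
    exponentially, so S(t) = sum_{t_i <= t} w_i * exp(-melt_rate * (t - t_i)).
    Returns daily samples of S(t) for t = 0, 1, ..., days."""
    events, t = [], 0.0
    while True:
        t += rng.expovariate(rate_per_day)      # next snowfall event
        if t > days:
            break
        events.append((t, rng.expovariate(1.0 / mean_fall_cm)))
    return [sum(w * math.exp(-melt_rate * (d - ti))
                for ti, w in events if ti <= d)
            for d in range(days + 1)]

# Hypothetical winter: one snowfall every 4 days on average, 8 cm mean fall
depths = snow_depth_path(rate_per_day=0.25, mean_fall_cm=8.0,
                         melt_rate=0.1, days=90)
```

    Unlike a Bernoulli on/off snow indicator, this construction carries accumulated depth across events, which is the feature that matters when snow load is combined with earthquake load.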

    Nonparametric Stochastic Generation of Daily Precipitation and Other Weather Variables

    Traditional stochastic approaches for synthetic generation of weather variables often assume a prior functional form for the stochastic process, are often not capable of reproducing the probabilistic structure present in the data, and may not be uniformly applicable across sites. In an attempt to find a general framework for stochastic generation of weather variables, this study marks a unique departure from the traditional approaches, ushering in the use of data-driven nonparametric techniques and demonstrating their utility. Precipitation is one of the key variables that drive hydrologic systems and hence warrants particular focus. In this regard, two major aspects of precipitation modeling were considered: (1) resampling traces under the assumption of stationarity in the process, or with some treatment of the seasonality, and (2) investigations into interannual and secular trends in precipitation and their likely implications. A nonparametric seasonal wet/dry spell model was developed for the generation of daily precipitation. In this model, the probability density functions of interest are estimated using nonparametric kernel density estimators. In the course of development of this model, various nonparametric density estimators for discrete and continuous data were reviewed, tested, and documented, which resulted in the development of a nonparametric estimator for discrete probability estimation. Variations in seasonality of precipitation as a function of latitude and topographic factors were seen through the nonparametric estimation of the time-varying occurrence frequency. Nonparametric spectral analysis, performed on monthly precipitation, revealed significant interannual frequencies and coherence with known atmospheric oscillations. Consequently, a nonparametric, nonhomogeneous Markov chain for modeling daily precipitation was developed that obviated the need to divide the year into seasons.
A multivariate nonparametric resampling technique drawing from the nonparametrically fitted probability density functions, which can be likened to a smoothed bootstrap approach, was developed for the simulation of other weather variables (solar radiation, maximum and minimum temperature, average dew point temperature, and average wind speed). In this technique the vector of variables on a day is generated by conditioning on the vector of these variables on the preceding day and on the precipitation amount on the current day, generated from the wet/dry spell model.
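    The smoothed-bootstrap idea (resample an observation, then jitter it by the kernel) can be sketched in one dimension. The data, Gaussian kernel, and Silverman rule-of-thumb bandwidth below are illustrative assumptions; the dissertation's conditional, multivariate, and discrete estimators are more involved.

```python
import random
import statistics

def smoothed_bootstrap(data, n, rng=random.Random(11)):
    """Sample n values from a Gaussian kernel density estimate of `data`:
    pick an observation uniformly at random, then add N(0, h^2) noise,
    with bandwidth h set by Silverman's rule of thumb."""
    sd = statistics.stdev(data)
    h = 1.06 * sd * len(data) ** -0.2       # Silverman's rule of thumb
    return [rng.choice(data) + rng.gauss(0.0, h) for _ in range(n)]

# Hypothetical daily precipitation amounts (mm) on wet days
wet_day_mm = [1.2, 0.4, 5.1, 2.3, 0.8, 3.6, 1.9, 0.5, 7.2, 2.8]
synthetic = smoothed_bootstrap(wet_day_mm, n=1000)
```

    A plain Gaussian kernel can produce negative precipitation near the zero boundary; a production model would use a boundary-corrected kernel, which is one motivation for the specialized estimators developed in this work.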

    Parameterized Seismic Reliability Assessment and Life-Cycle Analysis of Aging Highway Bridges

    The highway bridge infrastructure system within the United States is rapidly deteriorating and a significant percentage of these bridges are approaching the end of their useful service life. Deterioration mechanisms affect the load resisting capacity of critical structural components and render aging highway bridges more vulnerable to earthquakes compared to pristine structures. While past literature has traditionally neglected the simultaneous consideration of seismic and aging threats to highway bridges, a joint fragility assessment framework is needed to evaluate the impact of deterioration mechanisms on bridge vulnerability during earthquakes. This research aims to offer an efficient methodology for accurate estimation of the seismic fragility of aging highway bridges. In addition to aging, which is a predominant threat that affects lifetime seismic reliability, other stressors such as repeated seismic events or the simultaneous presence of truck traffic are also incorporated in the seismic fragility analysis. The impact of deterioration mechanisms on bridge component responses is assessed for a range of exposure conditions following the nonlinear dynamic analysis of three-dimensional high-fidelity finite element aging bridge models. Subsequently, time-dependent fragility curves are developed at the bridge component and system level to assess the probability of structural damage given the earthquake intensity. In addition to highlighting the importance of accounting for deterioration mechanisms, these time-evolving fragility curves are used within an improved seismic loss estimation methodology to aid in efficient channeling of monetary resources for structural retrofit or seismic upgrade.
Further, statistical learning methods are employed to derive flexible parameterized fragility models conditioned on earthquake hazard intensity, bridge design parameters, and deterioration-affected structural parameters to provide significant improvements over traditional fragility models and aid in efficient estimation of aging bridge vulnerabilities. In order to facilitate bridge management decision making, a methodology is presented to demonstrate the applicability of the proposed multi-dimensional fragility models to estimate the in-situ aging bridge reliabilities with field-measurement data across a transportation network. Finally, this research proposes frameworks to offer guidance to risk analysts regarding the importance of accounting for supplementary threats stemming from multiple seismic shocks over the service life of the bridge structures and the presence of truck traffic atop the bridge deck during earthquake events.
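    Fragility curves of this kind are commonly parameterized as lognormal CDFs of the hazard intensity measure, P(damage | IM = im) = Φ(ln(im/θ)/β). The sketch below shows how a deterioration-dependent median capacity θ shifts the curve with age; the functional forms, decay rate, and numbers are hypothetical, not the fitted models from this research.

```python
import math

def fragility(im, median, beta):
    """Lognormal fragility curve: probability the damage state is
    exceeded given intensity measure im, with median capacity `median`
    and lognormal dispersion `beta`."""
    z = math.log(im / median) / beta
    return 0.5 * (1.0 + math.erf(z / math.sqrt(2.0)))   # standard normal CDF

def aged_median(median_0, age_years, decay=0.004):
    """Hypothetical aging effect: median capacity degrades exponentially."""
    return median_0 * math.exp(-decay * age_years)

# Same ground-motion intensity (0.4 g), pristine vs. 50-year-old bridge
p_new = fragility(0.4, median=aged_median(0.6, 0), beta=0.5)
p_old = fragility(0.4, median=aged_median(0.6, 50), beta=0.5)
assert p_old > p_new   # deterioration raises the damage probability
```

    Evaluating such curves at a grid of ages yields the time-dependent fragility surfaces used in loss estimation.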

    Efficient algorithms for analyzing large scale network dynamics: Centrality, community and predictability

    Large-scale networks are an indispensable part of our daily life; be they biological networks, smart grids, academic collaboration networks, social networks, vehicular networks, or the networks underpinning various smart environments, they are fast becoming ubiquitous. The successful realization of applications and services over them depends on efficient solutions to their computational challenges, which are compounded by network dynamics. The core challenges underlying large-scale networks, for example determining central (influential) nodes and edges, and the interactions and contacts among nodes, are the basis of successful applications and services. Though at first glance these challenges may seem trivial, network characteristics dictate how they can be evaluated effectively and efficiently. In this dissertation we therefore propose to leverage large-scale network structural characteristics and temporal dynamics in addressing these core challenges. We propose a divide-and-conquer based, computationally efficient algorithm that leverages the underlying network community structure for deterministic computation of betweenness centrality indices for all nodes. As an integral part of it, we also propose a computationally efficient agglomerative hierarchical community detection algorithm. Next, we propose a novel probabilistic link prediction algorithm, based on network structure evolution, that predicts the set of links occurring over subsequent time periods with higher accuracy. To best capture the evolution process and achieve higher prediction accuracy, we pair multiple time scales with the Markov prediction model. Finally, we propose to capture the multi-periodicity of human mobility patterns with the sinusoidal intensity function of a cascaded nonhomogeneous Poisson process, to predict future contacts over mobile networks. We use real data sets and benchmark approaches to validate the better performance of our proposed approaches. --Abstract, page iii
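    For the contact-prediction idea, an NHPP with a sinusoidal intensity has a closed-form mean-value function, so the expected number of contacts in any interval integrates directly. The function and constants below are illustrative assumptions, not the cascaded model proposed in the dissertation.

```python
import math

def expected_contacts(t1, t2, a, b, period, phase=0.0):
    """Expected contact count of an NHPP with sinusoidal intensity
    lambda(t) = a + b * sin(2*pi*t/period + phase) over [t1, t2],
    via the closed-form integral of the rate."""
    w = 2.0 * math.pi / period
    integral = lambda t: a * t - (b / w) * math.cos(w * t + phase)
    return integral(t2) - integral(t1)

# Hypothetical daily contact pattern: baseline 3 contacts/hour, amplitude 2.
# Over one full 24-hour period the sinusoid integrates away, leaving a * 24.
n = expected_contacts(0.0, 24.0, a=3.0, b=2.0, period=24.0)
```

    Summing several such terms with different periods (daily, weekly) is one simple way to encode the multi-periodicity of human mobility.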

    Statistical Modeling: Regression, Survival Analysis, and Time Series Analysis

    Statistical Modeling provides an introduction to regression, survival analysis, and time series analysis for students who have completed calculus-based courses in probability and mathematical statistics. The book uses the R language to fit statistical models, conduct Monte Carlo simulation experiments, and generate graphics. Over 300 exercises at the ends of the chapters make this an appropriate text for a class in statistical modeling.
    Part I: Regression
        Chapter 1: Simple Linear Regression
        Chapter 2: Inference in Simple Linear Regression
        Chapter 3: Topics in Regression
    Part II: Survival Analysis
        Chapter 4: Probability Models in Survival Analysis
        Chapter 5: Statistical Methods in Survival Analysis
        Chapter 6: Topics in Survival Analysis
    Part III: Time Series Analysis
        Chapter 7: Basic Methods in Time Series Analysis
        Chapter 8: Modeling in Time Series Analysis
        Chapter 9: Topics in Time Series Analysis