
    Uniform random generation of large acyclic digraphs

    Directed acyclic graphs are the basic representation of the structure underlying Bayesian networks, which represent multivariate probability distributions. In many practical applications, such as the reverse engineering of gene regulatory networks, not only the estimation of model parameters but also the reconstruction of the structure itself is of great interest. Both for assessing different structure-learning algorithms in simulation studies and for evaluating the prevalence of certain structural features, a uniform sample from the space of directed acyclic graphs is required. Here we analyse how to sample acyclic digraphs uniformly at random through recursive enumeration, an approach previously thought too computationally involved. Based on complexity considerations, we discuss in particular how the enumeration directly provides an exact method, which avoids the convergence issues of the alternative Markov chain methods and is in fact computationally much faster. The limiting behaviour of the distribution of acyclic digraphs then allows us to sample arbitrarily large graphs. Building on the ideas of recursive-enumeration-based sampling, we also introduce a novel hybrid Markov chain with much faster convergence than current alternatives while still being easy to adapt to various restrictions. Finally, we discuss how to include such restrictions in the combinatorial enumeration and the new hybrid Markov chain method for efficient uniform sampling of the corresponding graphs. Comment: 15 pages, 2 figures. To appear in Statistics and Computing.
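The recursive enumeration the abstract refers to builds on the classical recursion for counting labeled DAGs (inclusion-exclusion over the nodes with in-degree zero). A minimal sketch of that counting step, which is the ingredient an exact sampler would draw from:

```python
from functools import lru_cache
from math import comb

@lru_cache(maxsize=None)
def count_dags(n: int) -> int:
    """Number of labeled DAGs on n nodes, via Robinson's recursion:
    inclusion-exclusion over the k nodes with in-degree zero."""
    if n == 0:
        return 1
    return sum(
        (-1) ** (k + 1) * comb(n, k) * 2 ** (k * (n - k)) * count_dags(n - k)
        for k in range(1, n + 1)
    )

# First few values: 1, 1, 3, 25, 543, 29281
print([count_dags(n) for n in range(6)])
```

An exact uniform sampler can then pick the number of in-degree-zero nodes with probability proportional to each term of this sum and recurse; this sketch shows only the enumeration, not the paper's full sampling scheme.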

    Approximating Cross-validatory Predictive P-values with Integrated IS for Disease Mapping Models

    An important statistical task in disease mapping problems is to identify outlier/divergent regions with unusually high or low residual risk of disease. Leave-one-out cross-validatory (LOOCV) model assessment is a gold standard for computing the predictive p-values that can flag such outliers. However, actual LOOCV is time-consuming because one needs to re-simulate a Markov chain for each posterior distribution in which an observation is held out as a test case. This paper introduces a new method, called iIS, for approximating LOOCV using only Markov chain samples simulated from a posterior based on the full data set. iIS is based on importance sampling (IS). iIS integrates the p-value and the likelihood of the test observation with respect to the distribution of the latent variable, without reference to the actual observation. The predictive p-values computed with iIS can be proved to be equivalent to the LOOCV predictive p-values, following the general theory for IS. We compare iIS and three other existing methods from the literature on a lip cancer dataset collected in Scotland. Our empirical results show that iIS provides predictive p-values that are almost identical to the actual LOOCV predictive p-values and outperforms the three existing methods, including the recently proposed ghosting method by Marshall and Spiegelhalter (2007). Comment: 21 pages.
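The plain importance-sampling baseline that iIS improves on can be sketched in a few lines: reweight full-data posterior draws by 1/p(y_i | theta) to approximate the held-out posterior, then average the tail probability. This is a toy normal model with made-up data, not the paper's iIS (which additionally integrates over the latent variable):

```python
import numpy as np
from scipy.stats import norm

rng = np.random.default_rng(0)

# Toy setting: y_i | theta ~ N(theta, 1); "posterior" draws of theta are
# faked as normal draws around the full-data mean (an assumption for the sketch).
y = rng.normal(0.0, 1.0, size=20)
y[0] = 3.5                      # a potentially divergent observation
theta = rng.normal(y.mean(), 1.0 / np.sqrt(len(y)), size=5000)

def is_loo_pvalue(y_i, theta, sd=1.0):
    """Plain IS approximation of the LOOCV predictive p-value
    P(y_rep >= y_i | y_{-i}); weights w_s = 1 / p(y_i | theta_s)."""
    w = 1.0 / norm.pdf(y_i, loc=theta, scale=sd)
    tail = norm.sf(y_i, loc=theta, scale=sd)   # P(y_rep >= y_i | theta_s)
    return np.sum(w * tail) / np.sum(w)

p = is_loo_pvalue(y[0], theta)   # a small p flags the outlier
```

The known weakness of this baseline, which motivates iIS, is that the weights 1/p(y_i | theta_s) can have high variance when the held-out observation is influential.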

    Optimal Distribution of Renewable Energy Systems Considering Aging and Long-Term Weather Effect in Net-Zero Energy Building Design

    Generation system interruptions in net-zero energy buildings (NZEBs) may result in missing the net-zero targets by a great margin. Consequently, it is important to incorporate a realistic reliability model for renewable energy systems (RESs) that considers aging and long-term weather conditions. This study proposed a robust design optimization method for the selection of RESs to achieve NZEB. Three case studies were evaluated: (1) a deterministic approach; (2) Markov chain-based reliability without the aging effect; and (3) Markov chain-based reliability with the aging effect. The results showed that the optimal RES sizes obtained when considering the aging effect were much larger than in the other two cases based on the annual energy balance. Moreover, considering the aging effect in the reliability assessment of the generation system for NZEBs opens a pathway to a more robust and economical design of RESs.
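A minimal sketch of the kind of Markov reliability model with aging the abstract describes: a two-state (up/down) chain whose per-period failure probability grows with age. All rates here are illustrative assumptions, not values from the study:

```python
import numpy as np

def availability_over_life(years, base_fail=0.02, aging=0.004, repair=0.9):
    """Two-state (up/down) discrete-time Markov model of a generation unit.
    The per-period failure probability grows linearly with age to mimic
    degradation. Returns the probability of being 'up' in each year."""
    p_up = 1.0
    avail = []
    for t in range(years):
        fail = min(base_fail + aging * t, 1.0)
        # chain update: up -> up with prob (1 - fail); down -> up with prob repair
        p_up = p_up * (1.0 - fail) + (1.0 - p_up) * repair
        avail.append(p_up)
    return np.array(avail)

a = availability_over_life(25)   # availability declines as the unit ages
```

Sizing the RES against this declining availability curve, rather than against the year-one value, is what drives the larger optimal sizes reported in the aging case.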

    STATISTICAL UNCERTAINTY IN DROUGHT FORECASTING USING MARKOV CHAINS AND THE STANDARD PRECIPITATION INDEX (SPI).

    Droughts affect basic human activities as well as food and industrial production. Adequate drought forecasting is crucial to guarantee the survival of populations and to promote societal development. The Standard Precipitation Index (SPI) is recommended by the World Meteorological Organization (WMO) for monitoring meteorological drought. Building Markov chains from an SPI-based drought classification is a common tool for drought forecasting. However, the Markov chain building process produces uncertainties inherent in estimating the transition probabilities, and these uncertainties are often ignored by practitioners. In this study we analyze the statistical uncertainties of using Markov chains for annual drought forecasting. As a case study, the dry region of the State of Ceará (Northeastern Brazil) is analyzed, considering precipitation records from 1911 to 2019. In addition to the 100-year database (1911-2011) used for Markov chain modeling and the 8-year data (2012-2019) used for forecasting validation, four fictional database extensions were considered in order to assess the effect of database size on the uncertainty. A likelihood ratio is used as an index of model performance. The uncertainty assessment showed that an apparently well-performing Markov chain model for drought class forecasting may be no more informative than the historical proportion of drought classes. Considering these uncertainties is crucial for adequate forecasting with Markov chains.
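The transition probabilities whose uncertainty the study analyzes are estimated from transition counts between drought classes. A minimal sketch with a made-up class sequence (the count matrix is returned alongside, since the counts are what determine the uncertainty of each estimate):

```python
import numpy as np

def fit_transition_matrix(classes, n_states):
    """Maximum-likelihood transition matrix for a first-order Markov chain
    of drought classes (e.g., SPI categories coded 0..n-1). Rows with no
    observed transitions fall back to a uniform row."""
    counts = np.zeros((n_states, n_states))
    for a, b in zip(classes[:-1], classes[1:]):
        counts[a, b] += 1
    rows = counts.sum(axis=1, keepdims=True)
    P = np.divide(counts, rows,
                  out=np.full_like(counts, 1.0 / n_states),
                  where=rows > 0)
    return P, counts

# Hypothetical annual drought classes: 0 = normal, 1 = moderate, 2 = severe
seq = [0, 0, 1, 2, 1, 0, 0, 1, 1, 2, 2, 1, 0]
P, counts = fit_transition_matrix(seq, 3)
```

With only a handful of counts per row, the estimated probabilities carry wide sampling intervals, which is exactly why a short record can make the chain no more informative than the marginal class proportions.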

    Design of an adaptive-rate video-streaming service with different classes of users

    The provision of end-to-end Quality of Service (QoS) for multimedia services over IP-based networks is still an open issue. To achieve this goal, service providers need to manage Service Level Agreements (SLAs), which specify parameters of the service's operation such as availability and performance. Additional mechanisms are needed to quantitatively evaluate the user-level SLA parameters. This work focuses on the evaluation and assessment of different design options for an adaptive VoD service that serves several classes of users while fulfilling the SLA commitments. Based on a straightforward Markov chain, Markov-Reward Chain (MRC) models are developed in order to obtain various QoS measures of the adaptive VoD service. The MRC model maps naturally onto the design and operation of the VoD system. 5th IFIP International Conference on Network Control & Engineering for QoS, Security and Mobility. Red de Universidades con Carreras en Informática (RedUNCI).
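A Markov-reward chain attaches a per-period reward to each state of a Markov chain; long-run QoS measures then follow from the stationary distribution. A minimal sketch with an illustrative 3-state session chain (the state labels and all numbers are assumptions, not taken from the paper):

```python
import numpy as np

# Illustrative session states: 0 = full rate, 1 = reduced rate, 2 = blocked
P = np.array([[0.90, 0.08, 0.02],
              [0.30, 0.60, 0.10],
              [0.50, 0.30, 0.20]])
reward = np.array([1.0, 0.6, 0.0])   # per-period QoS reward for each state

def stationary(P):
    """Stationary distribution pi with pi P = pi and sum(pi) = 1,
    solved as a least-squares system."""
    n = P.shape[0]
    A = np.vstack([P.T - np.eye(n), np.ones(n)])
    b = np.zeros(n + 1)
    b[-1] = 1.0
    pi, *_ = np.linalg.lstsq(A, b, rcond=None)
    return pi

pi = stationary(P)
long_run_qos = pi @ reward   # expected reward per period in steady state
```

Different service designs change P (e.g., how aggressively the rate is reduced under load); comparing pi @ reward across designs is the kind of SLA-level measure the MRC models provide.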


    LIFECYCLE ENERGY CONSUMPTION PREDICTION OF RESIDENTIAL BUILDINGS BY INCORPORATING LONGITUDINAL UNCERTAINTIES

    Accurate prediction of a building's lifecycle energy consumption is a critical part of the lifecycle assessment of residential buildings. Longitudinal variations in building condition, weather conditions and the building's service life can cause the prediction to deviate significantly from the real lifecycle energy consumption. The objective is to improve the accuracy of lifecycle energy consumption prediction by properly modelling these longitudinal variations in a residential energy consumption model using a Markov chain-based stochastic approach. A stochastic Markov model considering longitudinal uncertainties in building condition, degree days, and service life is developed: 1) the building's service life is estimated through a Markov deterioration curve derived from actual building condition data; 2) a neural network is used to project the periodic energy consumption distribution for each joint state of building condition and temperature; 3) lifecycle energy consumption is aggregated based on the Markov process and the state probabilities. A case study on predicting the lifecycle energy consumption of a residential building is presented using the proposed model, and the result is compared to that of a traditional deterministic model and to three years of measured annual energy consumption. The proposed model generates a much narrower distribution than the deterministic model when compared to the measured data, which indicates improved results.
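The aggregation step (step 3 above) can be sketched directly: propagate the condition-state distribution through the deterioration chain and sum the expected per-period energy. The transition matrix and energy figures below are illustrative assumptions, not values from the study:

```python
import numpy as np

def lifecycle_energy(p0, P, energy_by_state, periods):
    """Expected lifecycle energy under a Markov deterioration model:
    propagate p_{t+1} = p_t P and accumulate the expected per-period
    energy p_t . e over the service life."""
    p = np.asarray(p0, dtype=float)
    total = 0.0
    for _ in range(periods):
        total += p @ energy_by_state
        p = p @ P
    return total

# Three building-condition states (good, fair, poor), worsening over time
P = np.array([[0.95, 0.04, 0.01],
              [0.00, 0.93, 0.07],
              [0.00, 0.00, 1.00]])
energy = np.array([100.0, 115.0, 140.0])   # MWh/year per state (assumed)
total = lifecycle_energy([1.0, 0.0, 0.0], P, energy, periods=40)
```

In the full model the per-state energy would itself be a distribution (projected by the neural network over joint condition/temperature states) rather than a fixed number, which is what yields a distribution of lifecycle energy instead of a point estimate.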

    Performance Prediction Modeling of Concrete Bridge Deck Condition Using an Optimized Approach

    Developing an accurate and reliable model of concrete bridge deck deterioration rates is a significant step in improving the condition assessment process. The main goal of this study is to develop a deterioration prediction model based on the condition ratings of concrete bridge decks over the past 25 years as reported in the National Bridge Inventory (NBI) database. While the literature has typically suggested the Markov chain method as the most common technique for the condition assessment of bridges, the analysis in this pilot study suggests that the lognormal distribution function is a better model for concrete bridge deck condition data. This paper compares the two approaches and presents a new approach that combines the more commonly used Markov chain method with the lognormal distribution function to arrive at an optimal model for predicting bridge deck deterioration rates. The prediction error of the combined model is lower than that of each of the two individual models (i.e., Markov and lognormal). Additionally, the steel structure type showed the highest deterioration rates within condition ratings from 8 to 4 compared with the other types. Bridge decks with an ADT of more than 4,000 vehicles/day deteriorated faster than those with an ADT of less than 4,000, for the same structure type and skew angle. Bridge decks with skew angles of more than 30° deteriorate faster than those with skew angles of less than 30°. Furthermore, the results showed that most new Michigan concrete bridge decks may take at least 40 years to drop gradually from a rating of 9 to 3.
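The lognormal alternative to the Markov approach amounts to modeling the time a deck spends above a given rating as a lognormal random variable. A minimal sketch with assumed parameters (these are illustrative, not the study's fitted NBI values):

```python
import numpy as np
from scipy.stats import lognorm

# Illustrative lognormal model of the time (years) before a deck drops
# below a given condition rating; shape/scale are assumptions.
shape, scale = 0.5, 25.0            # median sojourn time ~25 years
T = lognorm(s=shape, scale=scale)

def prob_dropped_by(age_years):
    """P(condition has dropped below the rating by a given deck age)."""
    return T.cdf(age_years)

p10, p40 = prob_dropped_by(10.0), prob_dropped_by(40.0)
```

A combined model of the kind the paper proposes would use such fitted sojourn-time distributions to calibrate or correct the Markov transition probabilities, rather than assuming a constant per-year transition rate.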

    CONTINUOUS TIME MULTI-STATE MODELS FOR INTERVAL CENSORED DATA

    Continuous-time multi-state models are widely used for modeling longitudinal data on disease processes with multiple transient states, yet the analysis is complex when subjects are observed only periodically, resulting in interval-censored data. Most studies to date have modeled the true disease progression as a discrete-time stationary Markov chain, and only a few have addressed non-homogeneous multi-state models in the presence of interval-censored data. In this dissertation, several likelihood-based methodologies are proposed for handling interval-censored data in multi-state models. First, a continuous-time version of a homogeneous Markov multi-state model with backward transitions is proposed to handle uneven follow-up assessments or skipped visits, which result in interval-censored data. Simulations were used to compare the performance of the proposed model with the traditional discrete-time stationary Markov chain under different observation schemes. We applied both methods to the well-known Nun Study, a longitudinal study of 672 participants aged ≥ 75 years at baseline, followed longitudinally with up to ten cognitive assessments per participant. Second, we constructed a non-homogeneous Markov model for this type of panel data. The baseline intensity was assumed to be Weibull distributed to accommodate the non-homogeneity, and the proportional hazards method was used to incorporate risk factors into the transition intensities. Simulation studies showed that the Weibull assumption does not affect the accuracy of the parameter estimates for the risk factors. We applied our model to data from the BRAiNS study, a longitudinal cohort of 531 subjects, each cognitively intact at baseline. Last, we present a parametric method of fitting semi-Markov models based on Weibull transition intensities to interval-censored cognitive data with death as a competing risk. We relax the Markov assumption and account for interval censoring by integrating out all possible unobserved transitions. The proposed model also allows for time-dependent covariates, and goodness of fit is assessed by means of prevalence counts. To illustrate the methods, we applied our model to the BRAiNS study.
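For a time-homogeneous continuous-time chain, the likelihood contribution of an interval-censored observation pair is a transition probability P(t) = expm(Qt), which automatically sums over all unobserved paths between the two assessments. A minimal sketch with an illustrative intensity matrix (the rates are assumptions, not fitted values):

```python
import numpy as np
from scipy.linalg import expm

# Illustrative 3-state process: 0 = intact, 1 = impaired, 2 = dead,
# with a backward transition impaired -> intact; rates are assumed.
Q = np.array([[-0.12,  0.10, 0.02],
              [ 0.05, -0.20, 0.15],
              [ 0.00,  0.00, 0.00]])   # death is absorbing

def transition_probs(Q, t):
    """P(t) = expm(Q t): probability of being in state j at time t given
    state i at time 0, for a time-homogeneous continuous-time Markov chain."""
    return expm(Q * t)

P2 = transition_probs(Q, 2.0)   # e.g., a two-year assessment interval
```

In the non-homogeneous and semi-Markov extensions the abstract describes, this closed form no longer applies, which is why the likelihood must instead integrate explicitly over the possible unobserved transition times.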

    Reliability model for component-based systems in cosmic (a case study)

    Software component technology has a substantial impact on modern IT evolution. The benefits of this technology, such as reusability, complexity management, time and effort reduction, and increased productivity, have been key drivers of its adoption by industry. One of the main issues in building component-based systems is the reliability of the composed functionality of the assembled components. This paper proposes a reliability assessment model based on the architectural configuration of a component-based system and on the reliability of the individual components, which is usage- and testing-independent. The goal of this research is to improve the reliability assessment process for large software component-based systems over time, and to allow alternative component-based system designs to be compared prior to implementation. The novelty of the proposed model lies in evaluating component reliability from the component's behavior specifications, and system reliability from the system's topology; the assessment is performed in the context of the implementation-independent ISO/IEC 19761:2003 International Standard on the COSMIC method, chosen to provide the components' behavior specifications. In essence, each component of the system is modeled as a discrete-time Markov chain derived from its behavior specifications, expressed as extended-state machines. A probabilistic analysis by means of Markov chains is then performed to analyze any uncertainty in the component's behavior. Our hypothesis states that the less uncertainty there is in a component's behavior, the greater its reliability. The system reliability assessment is derived from a typical component-based system architecture with composite reliability structures, which may include compositions of serial, parallel and p-out-of-n reliability structures. The approach of assessing component-based system reliability in the COSMIC context is illustrated with the railroad crossing case study. © 2008 World Scientific Publishing Company.
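Architecture-based reliability of this kind is commonly computed with an absorbing Markov chain: transient states are components, absorbing states are "success" and "failure", and system reliability is the probability of absorption in "success". A minimal sketch in that style (the usage profile and per-visit reliabilities below are illustrative, not from the case study):

```python
import numpy as np

# Per-visit reliabilities of three components (assumed values)
r = np.array([0.99, 0.98, 0.995])

# Control flow between components from an assumed usage profile;
# component 2 exits the system on completion.
usage = np.array([[0.0, 0.7, 0.3],
                  [0.0, 0.0, 1.0],
                  [0.0, 0.0, 0.0]])
exit_to_success = np.array([0.0, 0.0, 1.0])

Q = r[:, None] * usage               # survive the visit, then transfer control
b = r * exit_to_success              # survive the final visit and exit
N = np.linalg.inv(np.eye(3) - Q)     # fundamental matrix: expected visits
system_reliability = (N @ b)[0]      # start execution in component 0
```

Any failure during a visit routes probability mass to the implicit "failure" absorbing state, so the result accounts for every execution path through the architecture, including repeated visits.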