
    Modeling for seasonal marked point processes: An analysis of evolving hurricane occurrences

    Seasonal point processes refer to stochastic models for random events which are only observed in a given season. We develop nonparametric Bayesian methodology to study the dynamic evolution of a seasonal marked point process intensity. We assume the point process is a nonhomogeneous Poisson process and propose a nonparametric mixture of beta densities to model dynamically evolving temporal Poisson process intensities. Dependence structure is built through a dependent Dirichlet process prior for the seasonally-varying mixing distributions. We extend the nonparametric model to incorporate time-varying marks, resulting in flexible inference for both the seasonal point process intensity and for the conditional mark distribution. The motivating application involves the analysis of hurricane landfalls with reported damages along the U.S. Gulf and Atlantic coasts from 1900 to 2010. We focus on studying the evolution of the intensity of the process of hurricane landfall occurrences, and the respective maximum wind speed and associated damages. Our results indicate an increase in the number of hurricane landfall occurrences and a decrease in the median maximum wind speed at the peak of the season. Introducing standardized damage as a mark, such that reported damages are comparable both in time and space, we find that there is no significant rising trend in hurricane damages over time. Comment: Published at http://dx.doi.org/10.1214/14-AOAS796 in the Annals of Applied Statistics (http://www.imstat.org/aoas/) by the Institute of Mathematical Statistics (http://www.imstat.org).
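
    As a rough illustration of the kind of intensity described above, the sketch below simulates one season of a nonhomogeneous Poisson process whose intensity is a fixed finite mixture of beta densities on a season rescaled to (0, 1), using Lewis-Shedler thinning. The weights, shape parameters, and total rate are illustrative placeholders; the paper's dependent Dirichlet process prior over mixing distributions is not implemented here.

```python
import numpy as np
from scipy.stats import beta

rng = np.random.default_rng(0)

def seasonal_intensity(t, weights, shapes, total_rate):
    """Poisson intensity on (0, 1): total_rate times a beta-mixture density."""
    dens = sum(w * beta.pdf(t, a, b) for w, (a, b) in zip(weights, shapes))
    return total_rate * dens

def simulate_nhpp(weights, shapes, total_rate, n_grid=1000):
    """Simulate one season of the NHPP by thinning a dominating homogeneous process."""
    grid = np.linspace(1e-6, 1 - 1e-6, n_grid)
    lam_max = seasonal_intensity(grid, weights, shapes, total_rate).max()
    t, events = 0.0, []
    while True:
        t += rng.exponential(1.0 / lam_max)
        if t >= 1.0:
            break
        # Accept the candidate time with probability lambda(t) / lambda_max.
        if rng.uniform() < seasonal_intensity(t, weights, shapes, total_rate) / lam_max:
            events.append(t)
    return np.array(events)

# Illustrative two-component mixture: a broad early-season bump and a sharper late peak.
events = simulate_nhpp(weights=[0.4, 0.6], shapes=[(2, 5), (8, 3)], total_rate=15)
print(f"{events.size} simulated landfalls in one season")
```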

    Reactive point processes: A new approach to predicting power failures in underground electrical systems

    Reactive point processes (RPPs) are a new statistical model designed for predicting discrete events in time based on past history. RPPs were developed to handle an important problem within the domain of electrical grid reliability: short-term prediction of electrical grid failures ("manhole events"), including outages, fires, explosions and smoking manholes, which can cause threats to public safety and reliability of electrical service in cities. RPPs incorporate self-exciting, self-regulating and saturating components. The self-excitement occurs as a result of a past event, which causes a temporary rise in vulnerability to future events. The self-regulation occurs as a result of an external inspection which temporarily lowers vulnerability to future events. RPPs can saturate when too many events or inspections occur close together, which ensures that the probability of an event stays within a realistic range. Two of the operational challenges for power companies are (i) making continuous-time failure predictions, and (ii) cost/benefit analysis for decision making and proactive maintenance. RPPs are naturally suited for handling both of these challenges. We use the model to predict power-grid failures in Manhattan over a short-term horizon, and to provide a cost/benefit analysis of different proactive maintenance programs. Comment: Published at http://dx.doi.org/10.1214/14-AOAS789 in the Annals of Applied Statistics (http://www.imstat.org/aoas/) by the Institute of Mathematical Statistics (http://www.imstat.org).
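
    The self-excitation, self-regulation, and saturation described above can be mimicked with a very simple conditional intensity, sketched below. This is a stylized stand-in using exponential decay kernels and exponential saturating transforms, with made-up parameter names and values; it is not the authors' parameterization of reactive point processes.

```python
import numpy as np

def rpp_style_intensity(t, event_times, inspection_times,
                        base_rate=0.1, excite=0.5, regulate=0.4,
                        decay_e=1.0, decay_i=2.0):
    """Stylized 'reactive' conditional intensity at time t.

    Past events temporarily raise vulnerability (self-excitation), past
    inspections temporarily lower it (self-regulation), and both effects
    saturate so the intensity stays within a realistic range.
    """
    ev = np.asarray([s for s in event_times if s < t])
    ins = np.asarray([s for s in inspection_times if s < t])
    # Exponentially decaying memory of past events and inspections.
    excitation = np.exp(-decay_e * (t - ev)).sum() if ev.size else 0.0
    regulation = np.exp(-decay_i * (t - ins)).sum() if ins.size else 0.0
    # Saturating transforms bound either effect no matter how many
    # events or inspections pile up close together.
    sat_e = 1.0 - np.exp(-excitation)
    sat_i = 1.0 - np.exp(-regulation)
    return base_rate * (1.0 + excite * sat_e - regulate * sat_i)

# Two recent failures and one recent inspection before time t = 5.0.
print(rpp_style_intensity(5.0, event_times=[1.0, 4.5], inspection_times=[4.8]))
```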

    Yield and Reliability Analysis for Nanoelectronics

    As technology continues to advance and more breakthroughs emerge, semiconductor devices with nanometer dimensions have entered all spheres of our lives. Accordingly, high reliability and high yield are central concerns for the advancement and adoption of nanoelectronic products. However, nanoelectronics faces major reliability challenges: identifying failure mechanisms, improving the low yields of nano products, and managing the scarcity and secrecy of available data [34]. This dissertation therefore investigates four issues related to the yield and reliability of nanoelectronics. Yield and reliability of nanoelectronics are affected by defects generated in the manufacturing processes. An automatic method using model-based clustering has been developed to detect defect clusters and identify their patterns, where the distribution of the clustered defects is modeled by a new mixture of multivariate normal distributions and principal curves. The new mixture model is capable of modeling defect clusters with amorphous, curvilinear, and linear patterns. We evaluate the proposed method using both simulated and experimental data, with promising results. Yield is one of the most important performance indexes for measuring the success of nanofabrication and manufacturing, and accurate yield estimation and prediction are essential for evaluating productivity and estimating production cost. This research studies advanced yield modeling approaches that account for the spatial variation of defects or defect counts. Results from real wafer map data show that the new yield models provide significant improvement in yield estimation compared to the traditional Poisson and negative binomial models. Ultra-thin SiO2 is a major factor limiting the scaling of semiconductor devices, and high-k gate dielectric materials such as HfO2 will replace SiO2 in future generations of MOS devices. This study investigates the two-step breakdown mechanisms and breakdown sequences of double-layered high-k gate stacks by monitoring the relaxation of the dielectric films. Finally, the hazard rate is a widely used metric for the reliability of electronic products. This dissertation studies the hazard rate function of gate dielectric breakdown: a physically feasible failure time distribution is used to model the time-to-breakdown data, and a Bayesian approach is adopted in the statistical analysis.
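
    For the yield-modeling portion, the comparison below uses the standard textbook Poisson and negative binomial yield formulas mentioned as baselines in the abstract. The defect density, chip area, and clustering parameter are illustrative values only, and the dissertation's spatially-aware extensions are not reproduced here.

```python
import numpy as np

def poisson_yield(defect_density, area):
    """Classical Poisson yield model: probability of zero defects on a chip."""
    return np.exp(-area * defect_density)

def negative_binomial_yield(defect_density, area, alpha):
    """Negative binomial yield model, which allows clustered (overdispersed) defects.

    alpha is the clustering parameter; as alpha grows large this recovers
    the Poisson model.
    """
    return (1.0 + area * defect_density / alpha) ** (-alpha)

area, d0 = 1.0, 0.5  # 1 cm^2 chip, 0.5 defects per cm^2 (illustrative)
print(f"Poisson yield:              {poisson_yield(d0, area):.3f}")
print(f"Neg. binomial (alpha = 2):  {negative_binomial_yield(d0, area, 2.0):.3f}")
```

    For the same mean defect density, the negative binomial model predicts a higher yield, reflecting the fact that clustered defects spoil fewer chips than uniformly scattered ones.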

    Model misspecification in peaks over threshold analysis

    Classical peaks over threshold analysis is widely used for statistical modeling of sample extremes, and can be supplemented by a model for the sizes of clusters of exceedances. Under mild conditions a compound Poisson process model allows the estimation of the marginal distribution of threshold exceedances and of the mean cluster size, but requires the choice of a threshold and of a run parameter, K, that determines how exceedances are declustered. We extend a class of estimators of the reciprocal mean cluster size, known as the extremal index, establish consistency and asymptotic normality, and use the compound Poisson process to derive misspecification tests of model validity and of the choice of run parameter and threshold. Simulated examples and real data on temperatures and rainfall illustrate the ideas, both for estimating the extremal index in nonstandard situations and for assessing the validity of extremal models. Comment: Published at http://dx.doi.org/10.1214/09-AOAS292 in the Annals of Applied Statistics (http://www.imstat.org/aoas/) by the Institute of Mathematical Statistics (http://www.imstat.org).
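
    A concrete starting point for the quantities discussed above is the classical runs estimator of the extremal index, sketched below: exceedances of a threshold u are declustered with a run parameter K, and the extremal index is estimated as the reciprocal of the mean cluster size. This is the basic estimator only, not the extended estimators or the misspecification tests developed in the paper; the moving-maximum test series and parameter values are illustrative.

```python
import numpy as np

def runs_extremal_index(x, u, K):
    """Runs estimator of the extremal index.

    Two exceedances of the threshold u start separate clusters when at
    least K non-exceeding observations fall between them; the estimator
    is (number of clusters) / (number of exceedances), i.e. the
    reciprocal mean cluster size.
    """
    exceed_idx = np.flatnonzero(np.asarray(x) > u)
    if exceed_idx.size == 0:
        return np.nan
    gaps = np.diff(exceed_idx)
    n_clusters = 1 + np.count_nonzero(gaps > K)
    return n_clusters / exceed_idx.size

rng = np.random.default_rng(1)
# Moving-maximum series: each large shock appears in two consecutive
# observations, so extremes cluster in pairs and the extremal index is 1/2.
z = 1.0 / -np.log(rng.uniform(size=10001))   # iid unit Frechet noise
x = np.maximum(z[1:], z[:-1])
u = np.quantile(x, 0.98)
print(f"Runs estimate of the extremal index (K = 5): {runs_extremal_index(x, u, 5):.2f}")
```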

    Non-Homogeneous Poisson Process to Model Seasonal Events: Application to the Health Diseases

    The daily number of hospital admissions due to mosquito-borne diseases can vary greatly. This variability can be explained by factors such as season of the year, temperature, and pollution levels, among others. In this paper, we propose a new class of non-homogeneous Poisson processes that incorporates seasonality factors to fit data on rare events more realistically, and in particular we show how the modifications applied to the special NHPP intensity function improve the analysis and fit of daily hospital admissions due to dengue in Ribeirão Preto, São Paulo state, Brazil.
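
    As a minimal sketch of fitting a seasonal NHPP intensity to daily counts, the code below maximizes a Poisson likelihood for a log-linear intensity with a single annual harmonic. The simulated admission counts and the harmonic form are assumptions for illustration only, not the specific seasonal intensity class proposed in the paper.

```python
import numpy as np
from scipy.optimize import minimize

def log_intensity(params, day):
    """Log of a seasonal NHPP intensity: baseline plus one annual harmonic."""
    b0, b1, b2 = params
    w = 2.0 * np.pi * day / 365.0
    return b0 + b1 * np.sin(w) + b2 * np.cos(w)

def neg_log_lik(params, day, counts):
    """Poisson negative log-likelihood for daily counts (up to an additive constant)."""
    lam = np.exp(log_intensity(params, day))
    return np.sum(lam - counts * np.log(lam))

rng = np.random.default_rng(2)
day = np.arange(3 * 365)                                   # three years of daily data
true_lam = np.exp(0.5 + 0.8 * np.sin(2 * np.pi * day / 365.0))
counts = rng.poisson(true_lam)                             # simulated daily admissions

fit = minimize(neg_log_lik, x0=np.zeros(3), args=(day, counts))
print("Fitted (b0, b1, b2):", np.round(fit.x, 2))
```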