184 research outputs found

    Sheffer sequences, probability distributions and approximation operators

    We present a new method to compute formulas for the action on monomials of a generalization of binomial approximation operators of Popoviciu type, or equivalently the moments of the associated discrete probability distributions with finite support. These quantities are needed to check the assumptions of the Korovkin Theorem for approximation operators, or equivalently the Feller Theorem for convergence of the probability distributions. Our method unifies and simplifies the computations for well-known special cases and requires only a few basic facts from Umbral Calculus. We apply it to well-known approximation operators and probability distributions, as well as to some recent q-generalizations of the Bernstein approximation operator introduced by Lewanowicz and Woźny, Lupaş, and Phillips.
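    As a concrete instance of the quantities the abstract refers to, the action of the classical Bernstein operator on a monomial t^j at x equals the j-th moment of X/n with X ~ Binomial(n, x). A minimal numeric sketch (direct summation, not the paper's umbral method):

```python
from math import comb

def bernstein_monomial(n, j, x):
    """Action of the Bernstein operator B_n on the monomial t**j at x,
    i.e. the j-th moment of X/n with X ~ Binomial(n, x)."""
    return sum((k / n) ** j * comb(n, k) * x ** k * (1 - x) ** (n - k)
               for k in range(n + 1))

# Known closed forms to check against: B_n(t) = x and
# B_n(t^2) = x^2 + x(1 - x)/n.
n, x = 10, 0.3
print(bernstein_monomial(n, 1, x))
print(bernstein_monomial(n, 2, x) - (x ** 2 + x * (1 - x) / n))
```

    The first print recovers x itself and the second is zero up to rounding, matching the classical moment formulas that the paper's method generalizes.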

    A selected survey of umbral calculus

    We survey the mathematical literature on umbral calculus (otherwise known as the calculus of finite differences) from its roots in the 19th century (and earlier) as a set of "magic rules" for lowering and raising indices, through its rebirth in the 1970s as Rota's school set it on a firm logical foundation using operator methods, to the current state of the art with numerous generalizations and applications. The survey is complemented by a fairly complete bibliography (over 500 references) which we expect to update regularly.

    Data-driven online monitoring of wind turbines

    Condition-based maintenance is a modern approach to maintenance which has been used successfully in several industrial sectors. In this paper we present a concrete statistical approach to condition-based maintenance for wind turbines, applying ideas from statistical process control. A specific problem in wind turbine maintenance is that failures of one part may have causes that originated in other parts long beforehand. This calls for methods that produce timely warnings by combining sensor data from different sources. Our method improves on existing methods used in wind turbine maintenance by using adaptive alarm thresholds for the monitored parameters that correct for the values of other relevant parameters. We illustrate our method with a case study showing that it can predict upcoming failures much earlier than currently used methods.
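    Covariate-corrected alarm thresholds of the kind described above can be sketched as a regression-adjusted control chart: fit the monitored signal on the relevant covariates and alarm on large residuals. The helper below is a hypothetical illustration of the general idea, not the authors' actual procedure:

```python
import numpy as np

def adaptive_alarm(y, X, k=3.0):
    """Flag observations of a monitored signal y whose residual, after
    correcting for covariates X (e.g. wind speed, ambient temperature),
    exceeds k standard deviations -- a simple regression-adjusted
    control chart in the spirit of statistical process control."""
    X1 = np.column_stack([np.ones(len(y)), X])    # add intercept column
    beta, *_ = np.linalg.lstsq(X1, y, rcond=None) # least-squares fit
    resid = y - X1 @ beta                         # covariate-corrected signal
    sigma = resid.std(ddof=X1.shape[1])
    return np.abs(resid) > k * sigma              # boolean alarm indicator
```

    Because the threshold applies to residuals rather than raw values, a reading that is unusual *given* the covariates triggers an alarm even if it is unremarkable in absolute terms.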

    Small nonparametric tolerance regions

    We present a new, natural way to construct nonparametric multivariate tolerance regions. Unlike the classical nonparametric tolerance intervals, where the endpoints are determined by order statistics chosen beforehand, we take the shortest interval that contains a certain number of observations. We extend this idea to higher dimensions by replacing the class of intervals with other classes of sets, like ellipsoids, hyperrectangles or convex sets. The asymptotic behaviour of our tolerance regions is derived using empirical process theory, in particular the concept of generalized quantiles. Finite sample properties of our tolerance regions are investigated through a simulation study.
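    In one dimension the construction reduces to scanning all windows of k consecutive order statistics and keeping the narrowest one. A minimal sketch (function name and interface are illustrative, not from the paper):

```python
import numpy as np

def shortest_interval(data, k):
    """Return the shortest interval containing at least k of the
    observations: sort the sample, compute the width of every window
    of k consecutive order statistics, and keep the narrowest."""
    x = np.sort(np.asarray(data, dtype=float))
    widths = x[k - 1:] - x[:len(x) - k + 1]   # widths of all k-point windows
    i = int(np.argmin(widths))
    return x[i], x[i + k - 1]
```

    For example, among the points 0, 1, 1.1, 1.2, 5 the shortest interval containing three observations is [1, 1.2]; the higher-dimensional regions in the paper replace intervals by ellipsoids, hyperrectangles or convex sets.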

    Robust and Efficient Uncertainty Quantification and Validation of RFIC Isolation

    Modern communication and identification products impose demanding constraints on the reliability of components. Because of this, statistical constraints increasingly enter the optimization formulations of electronic products. Yield constraints often require efficient sampling techniques to obtain uncertainty quantification also at the tails of the distributions. These sampling techniques should outperform standard Monte Carlo techniques, since the latter are normally not efficient enough to deal with tail probabilities. One such technique, Importance Sampling, has successfully been applied to optimize Static Random Access Memories (SRAMs) while guaranteeing very small failure probabilities, even going beyond 6-sigma variations of the parameters involved. Apart from this, emerging uncertainty quantification techniques offer expansions of the solution that serve as a response surface facility when doing statistics and optimization. To efficiently derive the coefficients in the expansions one either has to solve a large number of problems or a huge combined problem. Here parameterized Model Order Reduction (MOR) techniques can be used to reduce the workload. To also reduce the number of parameters we identify those that only affect the variance in a minor way; these parameters can simply be set to a fixed value. The remaining parameters can be viewed as dominant. Preservation of the variation also allows us to make statements about the approximation accuracy obtained by the parameter-reduced problem. This is illustrated on an RLC circuit. Additionally, the MOR technique used should not affect the variance significantly. Finally we consider a methodology for reliable RFIC isolation using floor-plan modeling and isolation grounding. Simulations show good agreement with measurements.
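    The variance-based parameter reduction mentioned above (fix parameters that barely affect the output variance, keep the dominant ones) can be illustrated with a crude one-at-a-time screening. Everything below, including the names and the threshold, is a hypothetical sketch, not the authors' method:

```python
import numpy as np

def dominant_parameters(model, nominal, sigma, rel_tol=0.01, n=2000, seed=0):
    """One-at-a-time screening: estimate each parameter's contribution
    to the output variance by sampling it alone (others held at their
    nominal values). Parameters contributing less than rel_tol of the
    total are candidates to be fixed; the rest are 'dominant'."""
    rng = np.random.default_rng(seed)
    nominal = np.asarray(nominal, dtype=float)
    contrib = np.empty(len(nominal))
    for j in range(len(nominal)):
        samples = np.tile(nominal, (n, 1))
        samples[:, j] = rng.normal(nominal[j], sigma[j], n)  # vary only p_j
        contrib[j] = np.var([model(s) for s in samples])
    return contrib / contrib.sum() >= rel_tol   # True = keep as dominant
```

    For a linear model this coincides with ranking by coefficient-times-sigma; for the RLC example in the paper a proper variance-based analysis would be used, but the screening idea is the same.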

    Symbolic computation and exact distributions of nonparametric test statistics

    We show how to use computer algebra to compute exact distributions of nonparametric statistics. We give several examples of nonparametric statistics with explicit probability generating functions that can be handled this way. In particular, we give a new table of critical values for the Jonckheere-Terpstra test that extends the tables known in the literature.
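    As an example of a nonparametric statistic with an explicit probability generating function, the Wilcoxon signed-rank statistic W for sample size n has PGF prod_{i=1}^{n} (1 + q^i) / 2^n, which integer polynomial arithmetic expands exactly. A small sketch of this kind of computation:

```python
def signed_rank_pgf(n):
    """Exact distribution of the Wilcoxon signed-rank statistic W for
    sample size n, obtained by expanding the probability generating
    function prod_{i=1}^{n} (1 + q^i) / 2^n as a polynomial in q."""
    coeffs = [1]                       # coeffs[w] = number of sign patterns with W = w
    for i in range(1, n + 1):          # multiply the polynomial by (1 + q^i)
        new = coeffs + [0] * i
        for w, c in enumerate(coeffs):
            new[w + i] += c
        coeffs = new
    total = 2 ** n
    return [c / total for c in coeffs]  # P(W = w) for w = 0 .. n(n+1)/2
```

    Exact critical values then follow by accumulating these probabilities from the tails, which is how tables such as the Jonckheere-Terpstra one can be extended far beyond the classically tabulated sample sizes.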

    Smallest nonparametric tolerance regions

    We present a new, natural way to construct nonparametric multivariate tolerance regions. Unlike the classical nonparametric tolerance intervals, where the endpoints are determined by order statistics chosen beforehand, we take the shortest interval that contains a certain number of observations. We extend this idea to higher dimensions by replacing the class of intervals by a general class of indexing sets, which specializes to the classes of ellipsoids, hyperrectangles or convex sets. The asymptotic behavior of our tolerance regions is derived using empirical process theory, in particular the concept of generalized quantiles. Finite sample properties of our tolerance regions are investigated through a simulation study. Real data examples are also presented.

    A simplified analytical model of ultrafine particle concentration within an indoor environment

    Exposure to indoor fine and ultrafine particulate matter (PM) has been recognised as a fundamental problem, as most people spend over 85% of their time indoors. Experimental data from a field campaign conducted in a confined environment have been used to investigate the physical mechanisms governing indoor-outdoor PM exchange under different operating conditions, e.g. natural ventilation and infiltration. An analytical model based on the mass balance of PM has been used to estimate indoor fine and ultrafine PM concentrations. The indoor-outdoor concentration ratio, penetration factor and air exchange rate have been estimated and related to the differential pressure measured at the openings.
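    The standard single-zone mass-balance model referred to above has a closed-form solution when the outdoor concentration is constant. The sketch below uses textbook notation (a = air exchange rate, P = penetration factor, k = deposition rate) and is an assumption-laden simplification, not necessarily the authors' exact formulation:

```python
import math

def indoor_concentration(t, c_out, a, P, k, c0=0.0):
    """Indoor concentration C_in(t) from the single-zone mass balance
    dC_in/dt = P*a*C_out - (a + k)*C_in with constant outdoor level
    C_out. Units: a and k in 1/h, t in h; P is dimensionless."""
    c_ss = P * a * c_out / (a + k)                # steady-state level
    return c_ss + (c0 - c_ss) * math.exp(-(a + k) * t)
```

    The steady-state indoor-outdoor ratio P*a/(a + k) is exactly the quantity that, together with the penetration factor and air exchange rate, the paper relates to the measured differential pressure at the openings.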

    Importance sampling for high speed statistical Monte-Carlo simulations

    As transistor dimensions of Static Random Access Memory (SRAM) become smaller with each new technology generation, SRAM becomes increasingly susceptible to statistical variations in its parameters, which can result in failing memory. SRAM is used as a building block for the construction of large Integrated Circuits (ICs). To ensure SRAM does not degrade the yield (fraction of functional devices) of ICs, very low failure probabilities of Pfail = 10^-10 are strived for. For instance, in SRAM memory design one aims at a 0.1% yield loss for a 10 Mbit memory, which means that 1 in 10 billion cells fails (Pfail = 10^-10; for a normal distribution this corresponds to a -6.4σ event). To simulate such probabilities, traditional Monte Carlo simulations are not sufficient and more advanced techniques are required. Importance Sampling is a technique that is relatively easy to implement and provides sufficiently accurate results. It is a well-known technique in statistics for estimating the occurrence of rare events. Rare or extreme events can carry dramatic costs, as in finance, or safety risks, as with dikes and power plants. Recently this technique has also received new attention in circuit design. Importance Sampling tunes Monte Carlo to the area in parameter space from which the rare events are generated, achieving a speed-up of several orders of magnitude over standard Monte Carlo methods. We describe the underlying mathematics. Experiments reveal the intrinsic power of the method, whose efficiency increases as the dimension of the parameter space increases. The method could be a valuable extension to the statistical capabilities of any circuit simulator. A Matlab implementation is included in the Appendix.