    Ordered samples control charts for ordinal variables

    The paper presents a new method for statistical process control when ordinal variables are involved, i.e. when a quality characteristic is evaluated on an ordinal scale. The method allows a statistical analysis without resorting to an arbitrary numerical conversion of the scale levels and without using the traditional sample synthesis operators (sample mean and variance). It is based on a new sample scale, obtained by ordering the original sample space of the variable according to specific ‘dominance criteria’ fixed on the basis of the monitored process characteristics. Samples are reported directly on the chart, and no distributional shape is assumed for the population (universe) of evaluations. Finally, a practical application of the method in the health sector is provided.
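    A minimal Python sketch of the ordered-sample idea, assuming a toy dominance criterion (the sum of the level ranks in a sample); the paper's actual criteria are process-specific and are not reproduced here:

        from itertools import combinations_with_replacement

        # Illustrative only: an ordinal quality scale with 4 levels, samples of size 3.
        LEVELS = [0, 1, 2, 3]      # ranks of the ordinal levels (e.g. poor .. excellent)
        SAMPLE_SIZE = 3

        # Enumerate the sample space; within a sample the order of evaluations is irrelevant.
        sample_space = list(combinations_with_replacement(LEVELS, SAMPLE_SIZE))

        # Toy dominance criterion: order samples by the sum of their level ranks.
        # The paper fixes its criteria from the monitored process characteristics instead.
        ordered_space = sorted(sample_space, key=sum)
        position = {s: i for i, s in enumerate(ordered_space)}

        def chart_statistic(sample):
            """Position of an observed sample on the ordered sample scale (the plotted value)."""
            return position[tuple(sorted(sample))]

        observations = [(1, 2, 2), (0, 1, 3), (3, 3, 2)]
        print([chart_statistic(s) for s in observations])

    Each observed sample is then plotted directly via its position on this ordered scale, with control limits defined on the same scale.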

    Solving the stationary Liouville equation via a boundary element method

    Intensity distributions of linear wave fields are, in the high frequency limit, often approximated in terms of flow or transport equations in phase space. Common techniques for solving the flow equations, for both time-dependent and stationary problems, are ray tracing or level set methods. In the context of predicting the vibro-acoustic response of complex engineering structures, reduced ray tracing methods such as Statistical Energy Analysis or variants thereof have found widespread application. Starting directly from the stationary Liouville equation, we develop a boundary element method for solving the transport equations for complex multi-component structures. The method, an improved version of the Dynamical Energy Analysis technique recently introduced by the authors, interpolates between standard Statistical Energy Analysis and full ray tracing, containing both of these methods as limiting cases. We demonstrate that the method can efficiently deal with complex large-scale problems, giving good approximations of the energy distribution when compared to exact solutions of the underlying wave equation.
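    For reference, the stationary Liouville equation in its standard form (generic notation, not necessarily the paper's): the phase-space density \rho(r, p), transported along the ray trajectories generated by a Hamiltonian H(r, p), satisfies

        \frac{\partial \rho}{\partial t} + \{\rho, H\} = 0
        \qquad\Longrightarrow\qquad
        \nabla_{p} H \cdot \nabla_{r} \rho \;-\; \nabla_{r} H \cdot \nabla_{p} \rho = 0
        \quad \text{(stationary case)},

    so that in the stationary case the density is constant along rays, e.g. along straight ray segments between boundary reflections for H = c|p|.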

    Application of optimal data-based binning method to spatial analysis of ecological datasets

    Investigation of highly structured data sets to unveil statistical regularities is of major importance in complex system research. The first step is to choose the scale at which to observe the process, the most informative scale being the one that includes the important features while disregarding noisy details in the data. In the investigation of spatial patterns, the optimal scale defines the optimal bin size of the histogram in which to visualize the empirical density of the pattern. In this paper we investigate a method proposed recently by K. H. Knuth to find the optimal bin size of a histogram as a tool for statistical analysis of spatial point processes. We test it through numerical simulations on various spatial processes of interest in ecology. We show that Knuth's optimal bin size rule, by reducing noisy fluctuations, performs better than standard kernel methods in inferring the intensity of the underlying process. Moreover, it can be used to highlight relevant spatial characteristics of the underlying distribution, such as spatial anisotropy and clustering. We apply these findings to analyse cluster-like structures in the arrangement of plants in the Barro Colorado Island rainforest.
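    A compact one-dimensional sketch of Knuth's rule in Python (the spatial analyses in the paper extend the same idea to two dimensions): the number of equal-width bins is chosen by maximizing Knuth's marginal posterior, written below in terms of log-gamma functions; the search limit max_bins is an arbitrary choice for the example.

        import numpy as np
        from scipy.special import gammaln

        def knuth_log_posterior(data, m):
            """Knuth's (2006) log posterior for a histogram of `data` with m equal-width bins."""
            n = len(data)
            nk, _ = np.histogram(data, bins=m)
            return (n * np.log(m)
                    + gammaln(0.5 * m)
                    - m * gammaln(0.5)
                    - gammaln(n + 0.5 * m)
                    + np.sum(gammaln(nk + 0.5)))

        def knuth_optimal_bins(data, max_bins=200):
            """Bin count that maximizes the posterior over 1..max_bins."""
            m_values = np.arange(1, max_bins + 1)
            scores = [knuth_log_posterior(data, m) for m in m_values]
            return int(m_values[int(np.argmax(scores))])

        rng = np.random.default_rng(0)
        print(knuth_optimal_bins(rng.normal(size=1000)))   # optimal bin count for this sample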

    Statistics of mixing in three-dimensional Rayleigh--Taylor turbulence at low Atwood number and Prandtl number one

    Three-dimensional miscible Rayleigh--Taylor (RT) turbulence at small Atwood number and at Prandtl number one is investigated by means of high resolution direct numerical simulations of the Boussinesq equations. RT turbulence is a paradigmatic time-dependent turbulent system in which the integral scale grows in time following the evolution of the mixing region. In order to fully characterize the statistical properties of the flow, both the temporal and the spatial behavior of relevant statistical indicators has been analyzed. Scaling of both global quantities (e.g., the Rayleigh, Nusselt and Reynolds numbers) and scale-dependent observables built in terms of velocity and temperature fluctuations is considered. We extend the mean-field analysis for velocity and temperature fluctuations to take intermittency into account, in both the time and space domains. We show that the resulting scaling exponents are compatible with those of classical Navier--Stokes turbulence advecting a passive scalar at comparable Reynolds number. Our results support the scenario of universality of turbulence with respect to both the injection mechanism and the geometry of the flow.
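    For orientation, the standard mean-field phenomenology for 3D Boussinesq RT turbulence at small Atwood number A (the baseline that the intermittency analysis refines; notation here is generic, not the paper's):

        h(t) \simeq \alpha\, A g\, t^{2}, \qquad
        u_{\mathrm{rms}}(t) \sim A g\, t, \qquad
        \varepsilon(t) \sim (A g)^{2}\, t,

        \delta_{r} v \sim \varepsilon(t)^{1/3}\, r^{1/3}
        \qquad \text{for}\quad \eta \ll r \ll h(t),

    i.e. Kolmogorov--Obukhov scaling with a slowly time-dependent dissipation rate inside the growing mixing layer; intermittency corrections then modify the exponents of the higher-order structure functions.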

    A cautionary note on using the scale prior for the parameter N of a binomial distribution

    Statistical analysis of ecological data may require the estimation of the size of a population, or of the number of species with a certain population. This task frequently reduces to estimating the discrete parameter N representing the number of trials in a binomial distribution. In Bayesian methods, there has been a substantial amount of discussion on how to select the prior for N. We propose a prior for N based on an objective measure of the worth that each value of N has in being included in the model space. This prior is compared (through the analysis of the popular snowshoe hare dataset) with the scale prior which, in our opinion, cannot be justified from solid objective considerations.
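    A minimal Python sketch of the kind of computation such a comparison involves, assuming the scale prior takes its common form p(N) proportional to 1/N and a uniform prior on the success probability (so the probability can be integrated out in closed form); the counts below are made up and are not the snowshoe hare data:

        import numpy as np
        from scipy.special import gammaln, betaln

        counts = np.array([58, 62, 71, 55, 66])   # hypothetical capture counts, not the hare data

        def log_marginal_likelihood(n, x):
            """log p(x | N = n) with the success probability integrated out under a Uniform(0, 1) prior."""
            if n < x.max():
                return -np.inf
            log_binom = np.sum(gammaln(n + 1) - gammaln(x + 1) - gammaln(n - x + 1))
            return log_binom + betaln(x.sum() + 1, len(x) * n - x.sum() + 1)

        # Posterior over N under the scale prior p(N) = const / N (assumed form of the prior).
        n_grid = np.arange(counts.max(), 2001)
        log_post = np.array([log_marginal_likelihood(n, counts) - np.log(n) for n in n_grid])
        post = np.exp(log_post - log_post.max())
        post /= post.sum()
        print(int(n_grid[np.argmax(post)]))   # posterior mode for N

    Swapping the line that subtracts log(n) for any other log prior over N reproduces the same comparison for an alternative prior on the same grid.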