140 research outputs found

    Construction fever on the Kyiv slopes

    Get PDF
    Forty marine-terminating glaciers have been surveyed daily since 2000 using cloud-free MODIS visible imagery (Box and Decker 2011; http://bprc.osu.edu/MODIS/). The net area change of the 40 glaciers during the period of observation has been -1775 km2, with the 18 northernmost (>72°N) glaciers alone accounting for half of the net area change. In 2012, the northernmost glaciers lost a collective area of 255 km2, or 86% of the total net area change of the 40 glaciers surveyed. The six glaciers with the largest net area loss in 2012 were Petermann (-141 km2), 79 glacier (-27 km2), Zachariae (-26 km2), Steenstrup (-19 km2), Steensby (-16 km2, the greatest retreat since observations began), and Jakobshavn (-13 km2). While the total area change was negative in 2012, the area of four of the forty glaciers increased relative to the end of the 2011 melt season. The anomalous advance of these four glaciers is not easily explained, as the mechanisms controlling the behavior of individual glaciers are uncertain owing to their often unique geographic settings.
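
    The headline figures above are simple tallies of per-glacier area changes. The sketch below shows how such a tally could be computed; it uses only the six largest 2012 losses quoted in the abstract, and the "northern" grouping is an illustrative assumption, so the printed totals will not reproduce the -255 km2 / 86% figures for the full 40-glacier survey.

```python
# Minimal sketch: tallying per-glacier area changes (km^2) and the share
# contributed by a subset, in the spirit of the figures quoted above.
# Only the six largest 2012 losses are listed, and the "northern" grouping
# is an illustrative assumption, so these totals do not reproduce the
# -255 km^2 / 86% figures for the full 40-glacier survey.
area_change_2012_km2 = {
    "Petermann": -141.0,
    "79 glacier": -27.0,
    "Zachariae": -26.0,
    "Steenstrup": -19.0,
    "Steensby": -16.0,
    "Jakobshavn": -13.0,
}

northern = {"Petermann", "79 glacier", "Zachariae", "Steenstrup", "Steensby"}  # assumption

total = sum(area_change_2012_km2.values())
northern_total = sum(v for g, v in area_change_2012_km2.items() if g in northern)

print(f"Net change of the six listed glaciers: {total:.0f} km^2")
print(f"Share from the assumed northern subset: {100 * northern_total / total:.0f}%")
```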

    Sequential design of computer experiments for the estimation of a probability of failure

    Full text link
    This paper deals with the problem of estimating the volume of the excursion set of a function $f:\mathbb{R}^d \to \mathbb{R}$ above a given threshold, under a probability measure on $\mathbb{R}^d$ that is assumed to be known. In the industrial world, this corresponds to the problem of estimating a probability of failure of a system. When only an expensive-to-simulate model of the system is available, the budget for simulations is usually severely limited and therefore classical Monte Carlo methods ought to be avoided. One of the main contributions of this article is to derive SUR (stepwise uncertainty reduction) strategies from a Bayesian decision-theoretic formulation of the problem of estimating a probability of failure. These sequential strategies use a Gaussian process model of $f$ and aim at performing evaluations of $f$ as efficiently as possible to infer the value of the probability of failure. We compare these strategies to other strategies also based on a Gaussian process model for estimating a probability of failure. Comment: This is an author-generated postprint version. The published version is available at http://www.springerlink.co
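
    As a rough illustration of the kind of sequential, surrogate-based approach described above, the sketch below fits a Gaussian process to a toy function and repeatedly evaluates the point whose excursion above the threshold is most ambiguous under the current model. This is a simple uncertainty-sampling heuristic, not the SUR criterion of the paper, and the test function, threshold, and input distribution are arbitrary choices made for the example.

```python
# Sketch: sequential estimation of P(f(X) > u) with a Gaussian process surrogate.
# Uses a simple "most ambiguous point" acquisition rather than the paper's SUR
# criterion; f, the threshold u, and the input measure are toy choices.
import numpy as np
from scipy.stats import norm
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF

rng = np.random.default_rng(0)

def f(x):                       # stand-in for the expensive-to-simulate model
    return np.sin(3 * x[:, 0]) + 0.5 * x[:, 1] ** 2

u = 1.2                                         # failure threshold
X_mc = rng.uniform(-1, 1, size=(5000, 2))       # Monte Carlo sample of the input measure

X = rng.uniform(-1, 1, size=(10, 2))            # small initial design
y = f(X)
gp = GaussianProcessRegressor(kernel=RBF(length_scale=0.5), normalize_y=True)

for _ in range(30):                             # sequential design loop
    gp.fit(X, y)
    mu, sd = gp.predict(X_mc, return_std=True)
    # Probability that each Monte Carlo point exceeds the threshold under the GP
    p_exceed = norm.sf((u - mu) / np.maximum(sd, 1e-9))
    # Evaluate next at the most "ambiguous" candidate (p closest to 0.5)
    idx = np.argmin(np.abs(p_exceed - 0.5))
    X = np.vstack([X, X_mc[idx]])
    y = np.append(y, f(X_mc[idx:idx + 1]))

gp.fit(X, y)
mu, sd = gp.predict(X_mc, return_std=True)
p_fail = norm.sf((u - mu) / np.maximum(sd, 1e-9)).mean()
print(f"Estimated probability of failure: {p_fail:.4f}")
```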

    Reduction of the value of information sharing as demand becomes strongly auto-correlated

    Get PDF
    Information sharing has been identified, in the academic literature, as one of the most important levers for mitigating the bullwhip effect in supply chains. A highly cited article on the bullwhip effect claimed that the percentage inventory reduction resulting from information sharing in a two-level supply chain, when downstream demand is autoregressive of order one, is an increasing function of the autoregressive parameter of the demand. In this paper we show that this holds only for a certain range of the autoregressive parameter, and that there is a maximum value beyond which the bullwhip ratio at the upstream stage is reduced and the percentage inventory reduction resulting from information sharing decreases towards zero. We also show that this maximum value of the autoregressive parameter can be as high as 0.7, a value commonly encountered in practice. This means that large benefits of information sharing cannot be assumed for Stock Keeping Units (SKUs) with highly positively auto-correlated demand; these items require as careful an analysis as SKUs with less strongly auto-correlated demand.
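
    To make the quantities concrete, the toy simulation below generates AR(1) demand and numerically estimates the bullwhip ratio of a standard order-up-to policy with an MMSE lead-time forecast, for several values of the autoregressive parameter. The lead time, noise level, and policy are illustrative assumptions and not the two-level information-sharing model analysed in the paper.

```python
# Toy illustration: bullwhip ratio of an order-up-to policy facing AR(1) demand,
# estimated by simulation for several autoregressive parameters. Lead time,
# noise level, and the MMSE order-up-to policy are illustrative assumptions.
import numpy as np

rng = np.random.default_rng(1)

def bullwhip_ratio(rho, lead_time=2, mu=100.0, sigma=10.0, n=100_000):
    # AR(1) demand: d_t = mu + rho * (d_{t-1} - mu) + eps_t
    eps = rng.normal(0.0, sigma, n)
    d = np.empty(n)
    d[0] = mu
    for t in range(1, n):
        d[t] = mu + rho * (d[t - 1] - mu) + eps[t]

    # The MMSE forecast of lead-time demand shifts the order-up-to level by
    # c * (d_t - d_{t-1}), with c = rho * (1 - rho**L) / (1 - rho).
    c = rho * (1.0 - rho ** lead_time) / (1.0 - rho)
    q = d[1:] + c * (d[1:] - d[:-1])        # orders placed on the upstream stage

    return q.var() / d.var()

for rho in (0.1, 0.3, 0.5, 0.7, 0.9):
    print(f"rho = {rho:.1f}  bullwhip ratio ~ {bullwhip_ratio(rho):.2f}")
```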

    Statistical strategies for avoiding false discoveries in metabolomics and related experiments

    Full text link

    A review of techniques for parameter sensitivity analysis of environmental models

    Full text link
    Mathematical models are utilized to approximate various highly complex engineering, physical, environmental, social, and economic phenomena. The model parameters exerting the most influence on model results are identified through a ‘sensitivity analysis’. A comprehensive review is presented of more than a dozen sensitivity analysis methods. The review is intended for those not intimately familiar with statistics or with the techniques utilized for sensitivity analysis of computer models. The most fundamental sensitivity technique utilizes partial differentiation, whereas the simplest approach requires varying parameter values one at a time. Correlation analysis is used to determine relationships between independent and dependent variables. Regression analysis provides the most comprehensive sensitivity measure and is commonly utilized to build response surfaces that approximate complex models.
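
    For readers who prefer code to taxonomy, the sketch below contrasts two of the reviewed approaches on a made-up model: a one-at-a-time perturbation around a base point, and standardized regression coefficients fitted to a Monte Carlo sample as a simple regression-based measure. The model, parameter ranges, and sample size are illustrative assumptions.

```python
# Minimal sketch of two sensitivity-analysis approaches on a toy model.
# The model, parameter ranges, and sample size are illustrative assumptions.
import numpy as np

rng = np.random.default_rng(42)

def model(x):
    # Toy model: output depends strongly on x1, weakly on x2, not at all on x3.
    return 3.0 * x[..., 0] + 0.5 * x[..., 1] ** 2 + 0.0 * x[..., 2]

base = np.array([1.0, 1.0, 1.0])

# 1) One-at-a-time (OAT): perturb each parameter by +/-10% around the base point.
for i, name in enumerate(["x1", "x2", "x3"]):
    hi, lo = base.copy(), base.copy()
    hi[i] *= 1.1
    lo[i] *= 0.9
    print(f"OAT effect of {name}: {model(hi) - model(lo):+.3f}")

# 2) Regression-based: standardized regression coefficients (SRCs) estimated
#    from a Monte Carlo sample, a common global sensitivity measure.
X = rng.uniform(0.5, 1.5, size=(2000, 3))
y = model(X)
Xs = (X - X.mean(axis=0)) / X.std(axis=0)
ys = (y - y.mean()) / y.std()
src, *_ = np.linalg.lstsq(Xs, ys, rcond=None)
print("Standardized regression coefficients:", np.round(src, 3))
```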