1,738 research outputs found

    A Large Deviation Rate and Central Limit Theorem for Horton Ratios

    Although originating in hydrology, the classical Horton analysis is based on a geometric progression that is widely used in the empirical analysis of branching patterns found in biology, atmospheric science, plant pathology, etc., and more recently in tree register allocation in computer science. The main results of this paper are a large deviation rate and a central limit theorem for Horton bifurcation ratios in a standard network model. The methods are largely self-contained; in particular, derivations of some previously known results of the theory are indicated along the way.
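    For readers unfamiliar with the terminology (this background is standard and not taken from the abstract), Horton's law of stream numbers says that the number N_k of streams of a given order decays roughly geometrically, so the order-to-order ratios are approximately constant:

```latex
% Horton's law of stream numbers (standard notation, assumed here):
% N_k = number of streams of (Strahler) order k in the network.
N_k \approx N_1 \, R_B^{-(k-1)},
\qquad
R_k := \frac{N_k}{N_{k+1}} \approx R_B
```

    The large deviation rate and central limit theorem of the paper quantify how quickly the empirical ratios R_k concentrate around the limiting bifurcation ratio R_B.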

    t-tests, non-parametric tests, and large studies—a paradox of statistical practice?

    Background During the last 30 years, the median sample size of research studies published in high-impact medical journals has increased manyfold, while the use of non-parametric tests has increased at the expense of t-tests. This paper explores this paradoxical practice and illustrates its consequences. Methods A simulation study is used to compare the rejection rates of the Wilcoxon-Mann-Whitney (WMW) test and the two-sample t-test for increasing sample size. Samples are drawn from skewed distributions with equal means and medians but with a small difference in spread. A hypothetical case study is used for illustration and motivation. Results The WMW test produces, on average, smaller p-values than the t-test. This discrepancy increases with increasing sample size, skewness, and difference in spread. For heavily skewed data, the proportion of p<0.05 with the WMW test can be greater than 90% if the standard deviations differ by 10% and the number of observations is 1000 in each group. The high rejection rates of the WMW test should be interpreted as its power to detect that the probability that a random observation from one distribution is less than a random observation from the other exceeds 50%. Conclusions Non-parametric tests are most useful for small studies. Using non-parametric tests in large studies may provide answers to the wrong question, thus confusing readers. For studies with a large sample size, t-tests and their corresponding confidence intervals can and should be used even for heavily skewed data.
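    The simulation setup described in Methods can be sketched as follows. This is a hedged illustration, not the paper's code: the shifted-lognormal construction, parameter values, and sample sizes below are assumptions chosen to produce skewed groups with equal means and medians but slightly different spread.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)

def shifted_lognormal(sigma, mean, median, size, rng):
    # Shifted lognormal X = exp(mu + sigma*Z) + c, Z ~ N(0, 1), with
    #   median(X) = exp(mu) + c  and  E[X] = exp(mu + sigma^2/2) + c.
    # Solve these two equations for mu and c so that both the mean and
    # the median hit their targets, whatever the skew parameter sigma.
    mu = np.log((mean - median) / (np.exp(sigma**2 / 2) - 1))
    c = median - np.exp(mu)
    return np.exp(mu + sigma * rng.standard_normal(size)) + c

n, n_sim = 1000, 200
rej_t = rej_w = 0
for _ in range(n_sim):
    # Equal means and medians; only sigma (hence spread/shape) differs.
    x = shifted_lognormal(1.0, mean=2.0, median=1.0, size=n, rng=rng)
    y = shifted_lognormal(1.1, mean=2.0, median=1.0, size=n, rng=rng)
    rej_t += stats.ttest_ind(x, y, equal_var=False).pvalue < 0.05
    rej_w += stats.mannwhitneyu(x, y).pvalue < 0.05

print(f"t-test rejection rate: {rej_t / n_sim:.2f}")
print(f"WMW rejection rate:    {rej_w / n_sim:.2f}")
```

    With no mean difference, the Welch t-test rejection rate stays near the nominal 5%, while the WMW test responds to the shape/spread difference, i.e. to P(X < Y) deviating from 50%.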

    A model of bimetallism

    Bimetallism has been the subject of considerable debate: Was it a viable monetary system? Was it a desirable system? In our model, the (exogenous and stochastic) amount of each metal can be split between monetary uses to satisfy a cash-in-advance constraint, and nonmonetary uses in which the stock of uncoined metal yields utility. The ratio of the monies in the cash-in-advance constraint is endogenous. Bimetallism is feasible: we find a continuum of steady states (in the certainty case) indexed by the constant exchange rate of the monies; we also prove existence for a range of fixed exchange rates in the stochastic version. Bimetallism does not appear desirable on a welfare basis: among steady states, we prove that welfare under monometallism is higher than under any bimetallic equilibrium. We compute welfare and the variance of the price level under a variety of regimes (bimetallism, monometallism with and without trade money) and find that bimetallism can significantly stabilize the price level, depending on the covariance between the shocks to the supplies of the metals.

    Robust Decentralized Secondary Frequency Control in Power Systems: Merits and Trade-Offs

    Frequency restoration in power systems is conventionally performed by broadcasting a centralized signal to local controllers. As a result of the energy transition, technological advances, and the scientific interest in distributed control and optimization methods, a plethora of distributed frequency control strategies have been proposed recently that rely on communication amongst local controllers. In this paper, we propose a fully decentralized leaky integral controller for frequency restoration that is derived from a classic lag element. We study the steady-state, asymptotic optimality, nominal stability, input-to-state stability, noise rejection, transient performance, and robustness properties of this controller in closed loop with a nonlinear and multivariable power system model. We demonstrate that the leaky integral controller can strike an acceptable trade-off between performance and robustness, as well as between asymptotic disturbance rejection and transient convergence rate, by tuning its DC gain and time constant. We compare our findings to conventional decentralized integral control and distributed-averaging-based integral control in theory and simulations.
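    The core trade-off can be illustrated with a minimal single-area sketch. This is an assumption for illustration only: the paper analyzes a nonlinear, multivariable network model, and the parameter names and values below are hypothetical. A swing equation with inertia M and damping D is hit by a step load disturbance d and closed with a leaky integral (first-order lag) controller of DC gain K and time constant T. Because of the leak, the disturbance is not fully rejected: the frequency deviation settles at -d/(D+K), which shrinks as the DC gain K grows.

```python
import numpy as np

M, D = 10.0, 1.0   # inertia and damping (assumed values)
d = 0.5            # step load disturbance (p.u.)

def simulate(K, T, t_end=200.0, dt=0.01):
    """Forward-Euler simulation of the closed loop; returns the
    frequency deviation omega after the transient has decayed."""
    omega, u = 0.0, 0.0
    for _ in range(int(t_end / dt)):
        domega = (-D * omega - d + u) / M   # linearized swing dynamics
        du = (-u - K * omega) / T           # leaky integral (lag) control
        omega += dt * domega
        u += dt * du
    return omega

K, T = 20.0, 5.0
omega_ss = simulate(K, T)
print(f"steady-state deviation: {omega_ss:.4f} (predicted {-d / (D + K):.4f})")
```

    Increasing K drives the residual deviation toward zero (recovering integral-like disturbance rejection), while the time constant T shapes the transient; the paper's point is that these knobs also trade off robustness and noise amplification.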

    The dark haloes of early-type galaxies in low-density environments: XMM-Newton and Chandra observations of NGC 57, NGC 7796 and IC 1531

    We present analysis of Chandra and XMM-Newton observations of three early-type galaxies, NGC 57, NGC 7796 and IC 1531. All three are found in very low density environments, and appear to have no neighbours of comparable size. NGC 57 has a halo of kT~0.9 keV, solar metallicity gas, while NGC 7796 and IC 1531 both have ~0.55 keV, 0.5-0.6 Zsol haloes. IC 1531 has a relatively compact halo, and we consider it likely that gas has been removed from the system by the effects of AGN heating. For NGC 57 and NGC 7796 we estimate mass, entropy and cooling time profiles and find that NGC 57 has a fairly massive dark halo with a mass-to-light ratio of 44.7 (+4.0,-8.5) Msol/Lsol (1 sigma uncertainties) at 4.75 Re. This is very similar to the mass-to-light ratio found for NGC 4555 and confirms that isolated ellipticals can possess sizable dark matter haloes. We find a significantly lower mass-to-light ratio for NGC 7796, 10.6 (+2.5,-2.3) Msol/Lsol at 5 Re, and discuss the possibility that NGC 7796 hosts a galactic wind, causing us to underestimate its mass.
    Comment: 14 pages, 9 figures, accepted for publication in MNRAS