19,456 research outputs found

    A coupled hydromechanical bounding surface model predicting the hysteretic behaviour of unsaturated soils

    This paper presents a bounding surface model to predict the hydromechanical behaviour of unsaturated soils under isotropic stress states. The model combines the hydraulic law of Gallipoli et al. [8] with the mechanical law of Gallipoli and Bruno [9]. The hydraulic law relates the degree of saturation to the single variable scaled suction, which accounts for the effect of both suction and void ratio on the water retention behaviour of soils. It consists of two closed-form equations, one for drying paths and one for wetting paths. Similarly, the mechanical law relates the void ratio to the single variable scaled stress, which accounts for the effect of both stress state and degree of saturation on the deformation of soils. It consists of two closed-form equations, one for loading paths and one for unloading paths. The proposed hydromechanical model is expressed in a finite form and therefore has the advantage of not requiring any approximate numerical integration. The model has been validated against four sets of laboratory data, showing a good ability to predict the coupled behaviour of unsaturated soils (e.g. collapse-compression upon wetting) by means of a relatively small number of material parameters.
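
    A minimal sketch of the model's structure in the notation suggested by the abstract (the functions and scalings below are placeholders, not the actual closed-form expressions of the cited laws):

        % Structure only: f_d, f_w (hydraulic) and g_l, g_u (mechanical)
        % stand in for the closed-form laws of refs. [8] and [9].
        \begin{align*}
          \bar{s} &= \bar{s}(s, e) && \text{scaled suction: suction $s$ and void ratio $e$}\\
          S_r &= f_d(\bar{s}) \ \text{(drying)}, \quad S_r = f_w(\bar{s}) \ \text{(wetting)}\\
          \bar{p} &= \bar{p}(p, S_r) && \text{scaled stress: stress state $p$ and saturation $S_r$}\\
          e &= g_l(\bar{p}) \ \text{(loading)}, \quad e = g_u(\bar{p}) \ \text{(unloading)}
        \end{align*}

    Because each branch is a closed-form relation between two scalar variables, a suction or stress path can be followed by direct evaluation, which is the "finite form" advantage noted above.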

    Quantifying Bimodality Part 2: A Likelihood Ratio Test for the Comparison of a Unimodal Normal Distribution and a Bimodal Mixture of Two Normal Distributions

    Scientists in a variety of fields are often faced with the question of whether a sample is best described as unimodal or bimodal. In an earlier paper (Frankland & Zumbo, 2002), a simple and convenient method for assessing bimodality was described. That method is extended here by developing and demonstrating a likelihood ratio test (LRT) comparing a unimodal normal distribution with a bimodal mixture of two normal distributions. As in Frankland and Zumbo (2002), the LRT approach is demonstrated using algorithms in SPSS.
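
    A minimal Python sketch of this style of test (a stand-in for the paper's SPSS implementation, not a reproduction of it; the synthetic data and all names are illustrative):

        # Sketch: likelihood ratio comparison of a 1-component (unimodal normal)
        # model vs a 2-component (bimodal mixture) model.
        import numpy as np
        from sklearn.mixture import GaussianMixture

        rng = np.random.default_rng(0)
        # Illustrative sample: a genuinely bimodal mixture of two normals.
        x = np.concatenate([rng.normal(-2.0, 1.0, 500),
                            rng.normal(2.0, 1.0, 500)]).reshape(-1, 1)

        loglik = {}
        for k in (1, 2):
            gm = GaussianMixture(n_components=k, n_init=5, random_state=0).fit(x)
            loglik[k] = gm.score(x) * len(x)   # score() is the mean log-likelihood per sample

        lrt = 2.0 * (loglik[2] - loglik[1])    # -2 ln(Lambda); large values favour bimodality
        print(f"LRT statistic: {lrt:.1f}")

    One caveat worth noting: because the unimodal model sits on the boundary of the mixture parameter space, the usual chi-square reference distribution for the LRT is unreliable, and in practice the null distribution is often obtained by parametric bootstrap.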

    Resolving the Issue of How Reliability is Related to Statistical Power: Adhering to Mathematical Definitions

    Reliability in classical test theory is a population-dependent concept, defined as the ratio of true-score variance to observed-score variance, where observed-score variance is the sum of true and error components. The power of a statistical significance test, on the other hand, is a function of the total variance, irrespective of its decomposition into true and error components. For that reason, the reliability of a dependent variable is a function of the ratio of the true-score and error variances, whereas statistical power is a function of the sum of those same two variances. Controversies about how reliability is related to statistical power can often be explained by authors’ use of the term “reliability” in a general way to mean “consistency,” “precision,” or “dependability,” which does not always correspond to its mathematical definition as a variance ratio. The present note shows how adherence to the mathematical definition can help resolve the issue and presents some derivations and illustrative examples that have further implications for significance testing and practical research.
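
    In symbols (standard classical test theory notation, added here for orientation rather than quoted from the note):

        % Classical test theory: X = T + E, so \sigma_X^2 = \sigma_T^2 + \sigma_E^2.
        \rho_{XX'} = \frac{\sigma_T^2}{\sigma_X^2}
                   = \frac{\sigma_T^2}{\sigma_T^2 + \sigma_E^2},
        \qquad
        \text{power} = f\!\left(\sigma_X^2\right) = f\!\left(\sigma_T^2 + \sigma_E^2\right).
        % e.g. one-sample t test: \delta = (\mu - \mu_0)\sqrt{n}/\sigma_X,
        % so power sees only the total \sigma_X^2, not its true/error split.

    Two measures with the same total variance therefore yield the same power even if they split that variance into true and error components, and hence reliability, very differently.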

    Quantifying Bimodality Part I: An Easily Implemented Method Using SPSS

    Scientists in a variety of fields are faced with the question of whether a particular sample of data is best described as unimodal or bimodal. We provide a simple and convenient method for assessing bimodality. The use of the non-linear algorithms in SPSS for modeling complex mixture distributions is demonstrated on a unimodal normal distribution (with 2 free parameters) and on a bimodal mixture of two normal distributions (with 5 free parameters).
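
    A rough Python analogue of that fitting step (the paper's nonlinear fitting is done in SPSS; here the 2- and 5-parameter models are fitted by direct maximum likelihood with scipy, and the data and starting values are illustrative):

        # Sketch: maximum-likelihood fits of the two competing models,
        # a 2-parameter unimodal normal and a 5-parameter two-normal mixture.
        import numpy as np
        from scipy import optimize, stats

        rng = np.random.default_rng(1)
        x = np.concatenate([rng.normal(-2.0, 1.0, 300), rng.normal(2.0, 1.0, 300)])

        def nll_unimodal(theta):
            mu, sigma = theta
            return -np.sum(stats.norm.logpdf(x, mu, sigma))

        def nll_mixture(theta):
            w, mu1, s1, mu2, s2 = theta
            pdf = w * stats.norm.pdf(x, mu1, s1) + (1.0 - w) * stats.norm.pdf(x, mu2, s2)
            return -np.sum(np.log(pdf))

        fit1 = optimize.minimize(nll_unimodal, x0=[0.0, 1.0],
                                 bounds=[(None, None), (1e-6, None)])
        fit2 = optimize.minimize(nll_mixture, x0=[0.5, -1.0, 1.0, 1.0, 1.0],
                                 bounds=[(1e-3, 1 - 1e-3), (None, None), (1e-6, None),
                                         (None, None), (1e-6, None)])
        print("unimodal NLL:", round(fit1.fun, 1), " mixture NLL:", round(fit2.fun, 1))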

    Multistage Zeeman deceleration of metastable neon

    A supersonic beam of metastable neon atoms has been decelerated by exploiting the interaction between the magnetic moment of the atoms and time-dependent inhomogeneous magnetic fields in a multistage Zeeman decelerator. Using 91 deceleration solenoids, the atoms were decelerated from an initial velocity of 580 m/s to final velocities as low as 105 m/s, corresponding to a removal of more than 95% of their initial kinetic energy. The phase-space distribution of the cold, decelerated atoms was characterized by time-of-flight and imaging measurements, from which a temperature of 10 mK was obtained in the moving frame of the decelerated sample. In combination with particle-trajectory simulations, these measurements allowed the phase-space acceptance of the decelerator to be quantified. The degree of isotope separation that can be achieved by multistage Zeeman deceleration was also studied by performing experiments with pulse sequences generated for ^{20}Ne and ^{22}Ne. (Comment: 16 pages, 15 figures)
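
    The quoted energy removal follows directly from the two velocities:

        % Fractional kinetic energy removed during deceleration:
        \frac{\Delta E_{\mathrm{kin}}}{E_{\mathrm{kin}}}
          = 1 - \left(\frac{v_f}{v_i}\right)^2
          = 1 - \left(\frac{105~\mathrm{m/s}}{580~\mathrm{m/s}}\right)^2
          \approx 0.967,

    i.e. roughly 96.7%, consistent with the stated figure of more than 95%.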

    Adaptive mesh refinement with spectral accuracy for magnetohydrodynamics in two space dimensions

    Get PDF
    We examine the effect of the accuracy of high-order spectral element methods, with or without adaptive mesh refinement (AMR), in the context of a classical configuration of magnetic reconnection in two space dimensions: the so-called Orszag-Tang vortex, made up of a magnetic X-point centered on a stagnation point of the velocity. A recently developed spectral-element adaptive refinement incompressible magnetohydrodynamic (MHD) code is applied to simulate this problem. The MHD solver is explicit and uses the Elsasser formulation on high-order elements. It automatically takes advantage of the adaptive grid mechanics that have been described elsewhere in the fluid context [Rosenberg, Fournier, Fischer, Pouquet, J. Comp. Phys. 215, 59-80 (2006)]; the code allows both statically refined and dynamically refined grids. Tests of the algorithm using analytic solutions are described, and comparisons of the Orszag-Tang solutions with pseudo-spectral computations are performed. We demonstrate for moderate Reynolds numbers that the algorithms using both statically and dynamically refined grids reproduce the pseudo-spectral solutions quite well. We show that low-order truncation, even with a comparable number of global degrees of freedom, fails to correctly model some strong (sup-norm) quantities in this problem, even though it adequately satisfies the weak (integrated) balance diagnostics. (Comment: 19 pages, 10 figures, 1 table. Submitted to New Journal of Physics)
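
    For reference, the Elsasser formulation mentioned above rewrites incompressible MHD in terms of the combined fields z± = u ± b, with b in Alfvén units (the standard form from the general literature, not quoted from the paper):

        % Incompressible MHD in Elsasser variables z^\pm = u \pm b:
        \partial_t \mathbf{z}^{\pm}
          + \left(\mathbf{z}^{\mp} \cdot \nabla\right) \mathbf{z}^{\pm}
          = -\nabla P
          + \nu_{+} \nabla^2 \mathbf{z}^{\pm}
          + \nu_{-} \nabla^2 \mathbf{z}^{\mp},
        \qquad \nabla \cdot \mathbf{z}^{\pm} = 0,
        % with \nu_\pm = (\nu \pm \eta)/2 for viscosity \nu and magnetic
        % diffusivity \eta, and P the total (fluid plus magnetic) pressure.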

    Effects from inhomogeneities in the chiral transition

    We consider an approximation procedure to evaluate the finite-temperature one-loop fermionic density in the presence of a chiral background field, which systematically incorporates effects from inhomogeneities in the chiral field through a derivative expansion. We apply the method to a simple low-energy effective chiral model that is commonly used in the study of the chiral phase transition: the linear sigma model coupled to quarks. The modifications in the effective potential and their consequences for the bubble nucleation process are discussed. (Comment: 11 pages, 5 figures. v2: appendix and references added, published version)
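
    For orientation, the homogeneous (zeroth-order) term in such a derivative expansion is the standard one-loop thermal quark contribution to the effective potential (a textbook form, not quoted from the paper; the Yukawa coupling g and degeneracy factor ν_q are the usual model parameters):

        % One-loop fermionic (quark) contribution at temperature T for a
        % homogeneous background \sigma; gradient terms in \nabla\sigma
        % then correct this local-density result order by order.
        V_q(\sigma, T)
          = -\,\nu_q\, T \int \frac{d^3 k}{(2\pi)^3}
            \ln\!\left[ 1 + e^{-E_k(\sigma)/T} \right],
        \qquad
        E_k(\sigma) = \sqrt{k^2 + g^2 \sigma^2}.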