
    Cosmological Constraints on a Power Law Universe

    Linearly coasting cosmology is comfortably concordant with a host of cosmological observations. It is, surprisingly, an excellent fit to SNe Ia observations and to constraints arising from the ages of old quasars. In this article we highlight the overall viability of an open linear coasting cosmological model. The model is consistent with the latest SNe Ia ``gold'' sample and accommodates a very old high-redshift quasar, which the standard cold-dark-matter model fails to do. Comment: 10 pages, 2 figures
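The age argument behind this abstract is simple arithmetic: in a linearly coasting universe the scale factor grows as a(t) ∝ t, so the age of the universe is the full Hubble time 1/H0, versus 2/(3 H0) for a flat matter-dominated model. A minimal numerical sketch (the H0 value is an assumed illustration, not taken from the paper):

```python
SEC_PER_GYR = 3.1557e16   # seconds per gigayear
KM_PER_MPC = 3.0857e19    # kilometres per megaparsec

def hubble_time_gyr(h0_km_s_mpc: float) -> float:
    """Hubble time 1/H0 in Gyr, for H0 given in km/s/Mpc."""
    h0_per_sec = h0_km_s_mpc / KM_PER_MPC   # convert H0 to 1/s
    return 1.0 / h0_per_sec / SEC_PER_GYR

H0 = 70.0  # assumed value, km/s/Mpc
age_coasting = hubble_time_gyr(H0)        # t0 = 1/H0 for a(t) ∝ t
age_matter = (2.0 / 3.0) * age_coasting   # t0 = 2/(3 H0), flat matter-dominated
print(f"coasting: {age_coasting:.1f} Gyr, matter-dominated: {age_matter:.1f} Gyr")
```

The roughly 50% longer age is what gives a coasting model extra room to accommodate old high-redshift objects.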

    Statistical disclosure control in tabular data

    Data disseminated by National Statistical Agencies (NSAs) can be classified as either microdata or tabular data. Tabular data is obtained from microdata by crossing one or more categorical variables. Although cell tables provide aggregated information, they also need to be protected. This chapter is a short introduction to tabular data protection. It contains three main sections. The first shows the different types of tables that can be obtained, and how they are modeled. The second describes the practical rules used by NSAs for the detection of sensitive cells. Finally, an overview of protection methods is provided, with a particular focus on two of them: the “cell suppression problem” and “controlled tabular adjustment”.
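To make the "detection of sensitive cells" step concrete, here is an illustrative sketch of the widely used p% rule: a cell is flagged as sensitive if, after removing the two largest contributions, the remainder would let the second-largest respondent estimate the largest one to within p percent. The threshold p below is an assumed example value, not one prescribed by the chapter:

```python
def is_sensitive_p_rule(contributions: list[float], p: float = 10.0) -> bool:
    """Flag a cell as sensitive under the p% rule: the sum of all
    contributions except the two largest must not fall within p percent
    of the largest contribution."""
    xs = sorted(contributions, reverse=True)
    if len(xs) < 2:
        return True  # a cell with a single respondent is always disclosive
    remainder = sum(xs[2:])
    return remainder < (p / 100.0) * xs[0]

# One dominant respondent: the others can bound its value too tightly.
print(is_sensitive_p_rule([100.0, 5.0, 3.0]))   # True
# Evenly spread contributions: no respondent is at risk.
print(is_sensitive_p_rule([40.0, 35.0, 30.0]))  # False
```

Cells flagged this way are the ones that cell suppression or controlled tabular adjustment must then protect.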

    Computational Experiments with Minimum-Distance Controlled Perturbation Methods

    Minimum-distance controlled perturbation is a recent family of methods for the protection of statistical tabular data. These methods are both efficient and versatile, since they can deal with large tables of any structure and dimension, and in practice only require the solution of a linear or quadratic optimization problem. The purpose of this paper is to give insight into the behaviour of such methods through some computational experiments. In particular, the paper (1) illustrates the theoretical results about the low disclosure risk of the method; (2) analyzes the solutions provided by the method on a standard set of seven difficult and complex instances; and (3) shows the behaviour of a new approach obtained by the combination of two existing ones.
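The core optimization idea can be sketched in its simplest special case (this is an illustration of the minimum-distance principle, not the paper's solver, and it omits the protection-interval constraints that make the real problem a full linear or quadratic program): minimally adjusting a row of cells, in the L2 sense, so that it adds up to a fixed marginal is a Euclidean projection onto the hyperplane sum(z) = total, which has a closed form.

```python
def l2_adjust_to_total(cells: list[float], total: float) -> list[float]:
    """Minimally perturb `cells` in L2 distance so they sum to `total`.
    The projection onto the hyperplane sum(z) = total spreads the
    discrepancy equally: z_i = a_i + (total - sum(a)) / n."""
    shift = (total - sum(cells)) / len(cells)
    return [c + shift for c in cells]

row = [10.0, 20.0, 30.0]            # interior cells, sum = 60
adjusted = l2_adjust_to_total(row, 66.0)
print(adjusted)                      # [12.0, 22.0, 32.0]
print(sum(adjusted))                 # 66.0
```

The full methods solve the same kind of distance minimization jointly over all cells of a multi-dimensional table, with additivity and protection constraints, which is why a generic LP/QP solver suffices.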

    Scaling predictions for radii of weakly bound triatomic molecules

    The mean-square radii of the molecules $^4$He$_3$, $^4$He$_2$-$^6$Li, $^4$He$_2$-$^7$Li and $^4$He$_2$-$^{23}$Na are calculated using a three-body model with contact interactions. They are obtained from a universal scaling function calculated within a renormalized scheme for three particles interacting through pairwise Dirac-delta interactions. The root-mean-square distance between two atoms of mass $m_A$ in a triatomic molecule is estimated to be of the order of $\mathcal{C}\sqrt{\hbar^2/[m_A(E_3-E_2)]}$, where $E_2$ is the dimer and $E_3$ the trimer binding energy, and $\mathcal{C}$ is a constant (varying from $\sim 0.6$ to $\sim 1$) that depends on the ratio between $E_2$ and $E_3$. Considering previous estimates for the trimer energies, we also predict the sizes of rubidium and sodium trimers in atomic traps. Comment: 7 pages, 2 figures
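A quick numerical sketch of the scaling estimate $\mathcal{C}\sqrt{\hbar^2/[m_A(E_3-E_2)]}$; the binding energies and the value of $\mathcal{C}$ below are placeholder illustrations chosen for a helium-like trimer, not the paper's inputs:

```python
import math

HBAR = 1.0545718e-34           # reduced Planck constant, J·s
AMU = 1.6605390e-27            # atomic mass unit, kg
MK_TO_J = 1.380649e-23 * 1e-3  # 1 mK of binding energy in joules (k_B * 1e-3 K)

def rms_size_m(mass_amu: float, e3_mk: float, e2_mk: float, c: float) -> float:
    """Order-of-magnitude RMS interatomic distance, in metres, from the
    scaling law C * sqrt(hbar^2 / (m_A * (E3 - E2)))."""
    m = mass_amu * AMU
    de = (e3_mk - e2_mk) * MK_TO_J   # E3 - E2 converted to joules
    return c * math.sqrt(HBAR**2 / (m * de))

# Assumed illustrative values: m_A = 4 amu, E3 = 126 mK, E2 = 1.3 mK, C = 0.7.
size = rms_size_m(mass_amu=4.0, e3_mk=126.0, e2_mk=1.3, c=0.7)
print(f"{size * 1e9:.2f} nm")
```

The nanometre-scale result reflects the weak binding: the smaller the trimer-dimer energy gap, the larger the molecule.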

    Network Flows Heuristics for Complementary Cell Suppression: An Empirical Evaluation and Extensions

    Several network flows heuristics have been suggested in the past for the solution of the complementary suppression problem. However, only limited computational experience with them is reported in the literature, and, moreover, they were only appropriate for two-dimensional tables. The purpose of this paper is twofold. First, we perform an empirical comparison of two network flows heuristics. They are improved versions of already existing approaches. Second, we show that extensions of network flows methods (i.e., multicommodity network flows and network flows with side constraints) can model three-dimensional, hierarchical and linked tables. Exploiting this network structure can improve the performance of any solution method solely based on linear programming formulations.
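The structural requirement that makes complementary suppression a network-flow problem in a two-dimensional table can be sketched with a small feasibility check (an illustration of the necessary condition only, not one of the paper's heuristics): every row and column containing a suppressed cell must contain at least two, otherwise the published marginal recovers the lone suppression exactly. Valid patterns therefore form cycles alternating between rows and columns.

```python
def pattern_is_feasible(suppressed: set[tuple[int, int]]) -> bool:
    """Necessary condition for a suppression pattern in a 2-D table:
    no row or column may contain exactly one suppressed cell."""
    rows: dict[int, int] = {}
    cols: dict[int, int] = {}
    for r, c in suppressed:
        rows[r] = rows.get(r, 0) + 1
        cols[c] = cols.get(c, 0) + 1
    return all(n >= 2 for n in rows.values()) and all(n >= 2 for n in cols.values())

print(pattern_is_feasible({(0, 0)}))                          # False: row 0 exposes the cell
print(pattern_is_feasible({(0, 0), (0, 1), (1, 0), (1, 1)}))  # True: a 4-cell cycle
```

Network flows heuristics exploit exactly this cycle structure to find cheap sets of complementary suppressions.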

    Impact of the intensification of beef production in Brazil on greenhouse gas emissions

    The objective of this study was to investigate the impact of increasing pasture productivity, using fertilizers, forage legumes, supplements and concentrates, on the emissions of greenhouse gases (GHGs) in five scenarios for beef production with Nellore cattle in the Cerrado region of Brazil. A life cycle analysis (LCA) approach, from birth of calves to mature animals ready for slaughter at the farm gate, was applied using both the Tier 1 and Tier 2 methodologies of the Intergovernmental Panel on Climate Change, and the results were expressed in carbon dioxide equivalents per kg of carcass produced. The first four scenarios were based solely on cattle production on pasture, ranging from degraded Brachiaria pastures with minimal management, through a mixed legume/Brachiaria pasture reformed every five years with P and K fertilizers and lime, to an improved N-fertilized (150 kg N/ha per year) pasture of Guinea grass (Panicum maximum). The most intensive scenario was also based on a fertilized Guinea grass pasture, but with a 75-day finishing period in confinement with a total mixed ration. To compare scenarios, a herd based on 400 cows was utilized. Across scenarios 1 to 5, the increase in digestibility promoted a reduction in the forage intake required for animal weight gain and a concomitant reduction in methane emissions per herd. For the estimation of nitrous oxide emissions from animal excreta using Tier 2, emission factors from a study in the Cerrado region were utilized, which postulated lower emissions from dung than from urine and much lower emissions during the long dry season in this region. Fossil carbon dioxide emissions from direct use of fuel and energy were also included in the LCA, along with those from the production of fertilizers, supplements and feeds. The greatest impact of intensification of the beef production systems was the reduction of the area necessary for carcass production, from 320 to 45 square meters per kg carcass. Carcass production increased from 43 to 65 Mg per herd across scenarios 1 to 5, and total emissions per kg carcass were estimated by the Tier 2 methodology to be reduced from 53.7 to 27.9 kg carbon dioxide equivalents. GHG emissions per kg carcass were slightly lower for the mixed grass/legume scenario (3), although this was partly due to the lack of data on emissions of nitrous oxide from legume residues. Another large source of uncertainty in the construction of such LCAs was the lack of data on enteric methane emissions from cattle grazing tropical forages.
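The carbon dioxide equivalent accounting behind figures like "53.7 to 27.9 kg CO2-eq per kg carcass" can be sketched as follows. The global warming potential factors are the standard IPCC 100-year values, but the herd emission totals below are hypothetical placeholders, not the study's data:

```python
GWP_CH4 = 25.0    # kg CO2-eq per kg CH4 (IPCC AR4, 100-year horizon)
GWP_N2O = 298.0   # kg CO2-eq per kg N2O (IPCC AR4, 100-year horizon)

def co2eq_per_kg_carcass(ch4_kg: float, n2o_kg: float,
                         fossil_co2_kg: float, carcass_kg: float) -> float:
    """Aggregate herd emissions into CO2 equivalents per kg carcass."""
    total = ch4_kg * GWP_CH4 + n2o_kg * GWP_N2O + fossil_co2_kg
    return total / carcass_kg

# Hypothetical annual herd totals for one scenario.
intensity = co2eq_per_kg_carcass(ch4_kg=60_000, n2o_kg=900,
                                 fossil_co2_kg=200_000, carcass_kg=50_000)
print(f"{intensity:.1f} kg CO2-eq per kg carcass")
```

With these placeholder inputs, enteric methane dominates the total, which is why uncertainty in tropical-forage methane factors matters so much for such LCAs.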

    An assessment of Evans' unified field theory I

    Evans developed a classical unified field theory of gravitation and electromagnetism on the background of a spacetime obeying a Riemann-Cartan geometry. This geometry can be characterized by an orthonormal coframe theta and a (metric-compatible) Lorentz connection Gamma. These two potentials yield the field strengths: torsion T and curvature R. Evans tried to infuse electromagnetic properties into this geometrical framework by taking the coframe theta to be proportional to four extended electromagnetic potentials A; these are assumed to encompass the conventional Maxwellian potential in a suitable limit. The viable Einstein-Cartan(-Sciama-Kibble) theory of gravity was adopted by Evans to describe the gravitational sector of his theory. Including also the results of an accompanying paper by Obukhov and the author, we show that Evans' ansatz for electromagnetism is untenable beyond repair, both from a geometrical and from a physical point of view. As a consequence, his unified theory is obsolete. Comment: 39 pages of latex, modified because of referee report, mistakes and typos removed, partly reformulated, taken care of M.W. Evans' rebuttal