
    Minimum Length from First Principles

    We show that no device or gedanken experiment is capable of measuring a distance less than the Planck length. By "measuring a distance less than the Planck length" we mean, technically, resolving the eigenvalues of the position operator to within that accuracy. The only assumptions in our argument are causality, the uncertainty principle from quantum mechanics, and a dynamical criterion for gravitational collapse from classical general relativity called the hoop conjecture. The inability of any gedanken experiment to measure a sub-Planckian distance suggests the existence of a minimal length.
    Comment: 8 pages, Honorable Mention in the 2005 Gravity Research Foundation Essay Competition
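
    The skeleton of such an argument (a standard back-of-the-envelope version, not the essay's full treatment) fits in a few lines: resolving a distance $\delta x$ requires a probe of energy $E \gtrsim \hbar c/\delta x$ by the uncertainty principle, while the hoop conjecture says the probe collapses into a black hole once its energy is concentrated within its Schwarzschild radius, $\delta x \lesssim 2GE/c^4$. Avoiding collapse therefore forces
    \[
    \delta x \;\gtrsim\; \frac{2GE}{c^4} \;\gtrsim\; \frac{2G\hbar}{c^3\,\delta x}
    \;\Longrightarrow\;
    \delta x \;\gtrsim\; l_P=\sqrt{\frac{\hbar G}{c^3}}\approx 1.6\times 10^{-35}\ {\rm m},
    \]
    up to order-one factors.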

    Towards a High Energy Theory for the Higgs Phase of Gravity

    Spontaneous Lorentz violation due to a time-dependent expectation value for a massless scalar has been suggested as a method for dynamically generating dark energy. A natural candidate for the scalar is a Goldstone boson arising from the spontaneous breaking of a U(1) symmetry. We investigate the low-energy effective action for such a Goldstone boson in a general class of models involving only scalars, proving that if the scalars have standard kinetic terms then at the {\em classical} level the effective action does not have the required features for spontaneous Lorentz violation to occur asymptotically ($t \to \infty$) in an expanding FRW universe. Then we study the large $N$ limit of a renormalizable field theory with a complex scalar coupled to massive fermions. In this model an effective action for the Goldstone boson with the properties required for spontaneous Lorentz violation can be generated. Although the model has shortcomings, we feel it represents progress towards finding a high energy completion for the Higgs phase of gravity.
    Comment: 20 pages, 5 figures; fixed typos and added references
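
    A schematic version of the classical obstruction (our gloss in ghost-condensate language, not the authors' notation): for a shift-symmetric scalar with Lagrangian $P(X)$, $X=\partial_\mu\phi\,\partial^\mu\phi$, the homogeneous equation of motion in an FRW background gives
    \[
    \partial_t\!\left(a^3\,P'(X)\,\dot\phi\right)=0
    \;\Longrightarrow\;
    P'(X)\,\dot\phi \propto a^{-3},
    \]
    so a standard kinetic term ($P'\equiv 1$) forces $\dot\phi\to 0$ as $a\to\infty$, while asymptotic Lorentz violation requires a nontrivial stationary point $P'(c^2)=0$ toward which $\dot\phi\to c\neq 0$ can flow.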

    Minimum Length from Quantum Mechanics and Classical General Relativity

    We derive fundamental limits on measurements of position, arising from quantum mechanics and classical general relativity. First, we show that any primitive probe or target used in an experiment must be larger than the Planck length, $l_P$. This suggests a Planck-size {\it minimum ball} of uncertainty in any measurement. Next, we study interferometers (such as LIGO) whose precision is much finer than the size of any of their individual components and which are hence not obviously limited by the minimum ball. Nevertheless, we deduce a fundamental limit on their accuracy of order $l_P$. Our results imply a {\it device-independent} limit on possible position measurements.
    Comment: 8 pages, latex, to appear in Physical Review Letters
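
    The interferometer bound can be illustrated with a Salecker-Wigner-type estimate (our sketch, not the paper's detailed argument): a free component of mass $m$ monitored over a time $t$ spreads quantum mechanically, while the hoop conjecture caps the mass that fits in a region of size $R$ without collapse,
    \[
    \delta x \;\gtrsim\; \sqrt{\frac{\hbar t}{m}}, \qquad m \;\lesssim\; \frac{c^2 R}{2G},
    \]
    so with a monitoring time $t \sim R/c$ the two bounds combine to give $\delta x^2 \gtrsim 2\hbar G/c^3 \sim l_P^2$, independent of the device.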

    The Baryon-Dark Matter Ratio Via Moduli Decay After Affleck-Dine Baryogenesis

    Low-scale supersymmetry breaking in string-motivated theories implies the presence of O(100) TeV scale moduli, which generically lead to a significant modification of the history of the universe prior to Big Bang Nucleosynthesis. Such an approach implies a non-thermal origin for dark matter resulting from scalar decay, where the lightest supersymmetric particle can account for the observed dark matter relic density. We study the further effect of the decay on the baryon asymmetry of the universe, and find that this can satisfactorily address the problem of the over-production of the baryon asymmetry by the Affleck-Dine mechanism in the MSSM. Remarkably, there is a natural connection between the baryon and dark matter abundances today, which leads to a solution of the `Cosmic Coincidence Problem'.
    Comment: 12 pages, no figures. v2: references added
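
    The scales involved can be checked with a standard estimate (our illustrative numbers, not the paper's): a modulus of mass $m_\phi$ decaying through Planck-suppressed operators has
    \[
    \Gamma_\phi \sim \frac{m_\phi^3}{M_P^2}, \qquad
    T_{\rm rh} \sim \left(\frac{90}{\pi^2 g_*}\right)^{1/4}\sqrt{\Gamma_\phi M_P}
    \;\sim\; \mathcal{O}(10\ {\rm MeV}) \quad\text{for } m_\phi \sim 100\ {\rm TeV},
    \]
    above the $\sim$ MeV scale required by Big Bang Nucleosynthesis but far below typical freeze-out temperatures, so the decay dilutes and re-sets both the dark matter abundance and the Affleck-Dine baryon asymmetry.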

    Seiberg Duality and e+ e- Experiments

    Seiberg duality in supersymmetric gauge theories is the claim that two different theories describe the same physics in the infrared limit. However, one cannot easily work out physical quantities in strongly coupled theories, and hence it has been difficult to compare the physics of the electric and magnetic theories. In order to gain more insight into the equivalence of the two theories, we study the ``e+ e-'' cross sections into ``hadrons'' for both theories in the superconformal window. We describe a technique which allows us to compute the cross sections exactly in the infrared limit. They are indeed equal in the low-energy limit, and the equality is guaranteed by the anomaly matching condition. The ultraviolet behavior of the total ``e+ e-'' cross section is different for the two theories. We comment on proposed non-supersymmetric dualities. We also analyze the agreement of the ``\gamma\gamma'' and ``WW'' scattering amplitudes in both theories, and in particular try to understand whether their equivalence can be explained by the anomaly matching condition.
    Comment: 24 pages, 2 figures, uses psfig
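
    The logic of the infrared computation can be summarized schematically (our paraphrase, not the paper's derivation): the total cross section is controlled by the imaginary part of the two-point function of the conserved global current to which the ``photon'' couples,
    \[
    \sigma(e^+e^-\to\text{``hadrons''}) \;\propto\; \operatorname{Im}\,\Pi(q^2),
    \qquad
    \langle J_\mu(q)\,J_\nu(-q)\rangle \;=\; \left(q^2 g_{\mu\nu}-q_\mu q_\nu\right)\Pi(q^2),
    \]
    and in a superconformal theory supersymmetry ties the coefficient of this correlator to 't Hooft anomalies of the flavor current, which are matched between the electric and magnetic descriptions; hence the agreement of the cross sections in the infrared.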

    The role of Comprehension in Requirements and Implications for Use Case Descriptions

    Within requirements engineering it is generally accepted that in writing specifications (or indeed any requirements phase document), one attempts to produce an artefact which will be simple for the user to comprehend. That is, whether the document is intended for customers to validate requirements, or for engineers to understand what the design must deliver, comprehension is an important goal for the author. Indeed, advice on producing ‘readable’ or ‘understandable’ documents is often included in courses on requirements engineering. However, few researchers, particularly within the software engineering domain, have attempted either to define or to understand the nature of comprehension and its implications for guidance on the production of quality requirements. Therefore, this paper examines the nature of textual comprehension in depth, drawing heavily on research in discourse processing, and suggests some implications for requirements (and other) software documentation. In essence, we find that the guidance on writing requirements that is prevalent within software engineering may rest on assumptions which oversimplify the nature of comprehension. Hence, the paper examines guidelines which have been proposed, in this case for use case descriptions, and the extent to which they agree with discourse process theory, before suggesting refinements to the guidelines which attempt to utilise lessons learned from our richer understanding of the underlying theory. For example, we suggest subtly different sets of writing guidelines for the different tasks of requirements, specification and design.

    Effect of Tuned Parameters on a LSA MCQ Answering Model

    This paper presents the current state of a work in progress whose objective is to better understand the effects of the factors that significantly influence the performance of Latent Semantic Analysis (LSA). A difficult task, answering (French) biology Multiple Choice Questions, is used to test the semantic properties of the truncated singular space and to study the relative influence of the main parameters. Dedicated software has been designed to fine-tune the LSA semantic space for the Multiple Choice Question task. With optimal parameters, the performance of our simple model is, quite surprisingly, equal or superior to that of 7th and 8th grade students. This indicates that the semantic spaces were quite good despite their low dimensions and the small sizes of the training data sets. In addition, we present an original global entropy weighting of the terms of each question's answers, which was necessary to achieve the model's success.
    Comment: 9 pages
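
    A minimal sketch of this kind of pipeline (our reconstruction using NumPy and scikit-learn; the toy corpus, the dimension k, and the exact log-entropy weighting are illustrative assumptions, not the authors' implementation):

    ```python
    import numpy as np
    from sklearn.feature_extraction.text import CountVectorizer
    from sklearn.decomposition import TruncatedSVD

    # Toy corpus; the paper trains on French biology course material.
    corpus = [
        "the cell membrane regulates transport of molecules",
        "mitochondria produce energy for the cell",
        "photosynthesis converts light energy in chloroplasts",
        "dna in the nucleus encodes proteins",
    ]
    k = 2  # dimension of the truncated singular space (a tuned parameter)

    vec = CountVectorizer()
    C = vec.fit_transform(corpus).toarray().astype(float)  # docs x terms

    # Log-entropy weighting: local log(1+tf) times a global entropy weight
    # that down-weights terms spread evenly across documents.
    gf = np.maximum(C.sum(axis=0), 1e-12)    # global frequency per term
    p = C / gf                               # each doc's share of a term
    with np.errstate(divide="ignore", invalid="ignore"):
        plogp = np.where(p > 0, p * np.log(p), 0.0)
    g = 1.0 + plogp.sum(axis=0) / np.log(len(corpus))
    W = np.log1p(C) * g

    svd = TruncatedSVD(n_components=k).fit(W)  # truncated singular space

    def embed(text):
        """Fold a new text into the semantic space."""
        c = vec.transform([text]).toarray().astype(float)
        return svd.transform(np.log1p(c) * g)[0]

    def answer_mcq(question, choices):
        """Return the index of the choice closest (cosine) to the question."""
        q = embed(question)
        sims = []
        for ch in choices:
            a = embed(ch)
            sims.append(float(q @ a) /
                        (np.linalg.norm(q) * np.linalg.norm(a) + 1e-12))
        return int(np.argmax(sims))

    print(answer_mcq("which organelle produces energy?",
                     ["the chloroplast", "the mitochondria", "the nucleus"]))
    ```

    On a real corpus, both k and the term-weighting scheme would be tuned against held-out questions, which is the parameter study the abstract describes.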

    Comments on Non-Commutative Phenomenology

    It is natural to ask whether non-commutative geometry plays a role in four-dimensional physics. By performing explicit computations in various toy models, we show that quantum effects lead to violations of Lorentz invariance at the level of operators of dimension three or four. The resulting constraints are very stringent.
    Comment: Correction of an error in the U(1) and U(N) calculation leads to stronger limits than those given previously. Clarifying comments and references added
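
    Schematically (our illustration of the kind of operator meant, not a result quoted from the paper): on a noncommutative space with $[x^\mu, x^\nu] = i\,\theta^{\mu\nu}$, loop effects can induce terms in which $\theta^{\mu\nu}$ acts as a fixed background tensor, e.g.
    \[
    \mathcal{L} \;\supset\; c\, m\,\bigl(\Lambda^2\theta^{\mu\nu}\bigr)\,\bar\psi\,\sigma_{\mu\nu}\,\psi,
    \]
    where $\Lambda$ is the UV cutoff. For $\Lambda \gtrsim |\theta|^{-1/2}$ the combination $\Lambda^2\theta$ is not small, so operators of this type are not suppressed by the noncommutativity scale, and laboratory bounds on Lorentz violation translate into the very stringent constraints quoted in the abstract.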

    The GUT Scale and Superpartner Masses from Anomaly Mediated Supersymmetry Breaking

    We consider models of anomaly-mediated supersymmetry breaking (AMSB) in which the grand unification (GUT) scale is determined by the vacuum expectation value of a chiral superfield. If the anomaly-mediated contributions to the potential are balanced by gravitational-strength interactions, we find a model-independent prediction for the GUT scale of order $M_{\rm Planck}/(16\pi^2)$. The GUT threshold also affects superpartner masses, and can easily give rise to realistic predictions if the GUT gauge group is asymptotically free. We give an explicit example of a model with these features, in which the doublet-triplet splitting problem is solved. The resulting superpartner spectrum is very different from that of previously considered AMSB models, with gaugino masses typically unifying at the GUT scale.
    Comment: 17 pages
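
    As a quick numerical check (our arithmetic, taking $M_{\rm Planck}$ to be the reduced Planck mass), the predicted scale lands close to where the MSSM gauge couplings are observed to unify:
    \[
    \frac{M_{\rm Planck}}{16\pi^2} \;\approx\; \frac{2.4\times 10^{18}\ {\rm GeV}}{158} \;\approx\; 1.5\times 10^{16}\ {\rm GeV},
    \]
    compared with the conventional unification scale $M_{\rm GUT} \approx 2\times 10^{16}\ {\rm GeV}$.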

    An exploration of concepts of community through a case study of UK university web production

    The paper explores the inter-relations and differences between the concepts of occupational community, community of practice, online community and social network. As a case study illustration it uses the domain of UK university web site production, and specifically a listserv for those involved in it. Different latent occupational communities are explored, and the potential for the listserv to help realize these as an active sense of community is considered. The listserv is not (for most participants) a tight-knit community of practice; indeed, it fails many criteria for an online community. It is perhaps best conceived as a loose-knit network of practice, valued for information, implicit support and the maintenance of weak ties. Through the analysis, the case for using strict definitions of the theoretical concepts is made.