
    Minimum Length from First Principles

    We show that no device or gedanken experiment is capable of measuring a distance less than the Planck length. By "measuring a distance less than the Planck length" we mean, technically, resolving the eigenvalues of the position operator to within that accuracy. The only assumptions in our argument are causality, the uncertainty principle from quantum mechanics, and a dynamical criterion for gravitational collapse from classical general relativity called the hoop conjecture. The inability of any gedanken experiment to measure a sub-Planckian distance suggests the existence of a minimal length. Comment: 8 pages, Honorable Mention in the 2005 Gravity Research Foundation Essay Competition
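
    A standard back-of-the-envelope version of this kind of argument (a heuristic sketch only, not a substitute for the paper's gedanken experiments) combines the two ingredients directly: resolving a distance $\Delta x$ requires a probe of momentum $\Delta p \gtrsim \hbar/\Delta x$, i.e. an energy $E \sim c\,\Delta p$ localized within $\Delta x$. The hoop conjecture then implies gravitational collapse once $\Delta x$ falls below the Schwarzschild radius of that energy, $r_s \sim G E/c^4 \sim G\hbar/(c^3\,\Delta x)$. Demanding that no horizon forms, so that the result of the measurement can still be read out, gives $(\Delta x)^2 \gtrsim G\hbar/c^3$, i.e. $\Delta x \gtrsim l_P = \sqrt{\hbar G/c^3} \approx 1.6 \times 10^{-35}$ m.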

    Towards a High Energy Theory for the Higgs Phase of Gravity

    Spontaneous Lorentz violation due to a time-dependent expectation value for a massless scalar has been suggested as a method for dynamically generating dark energy. A natural candidate for the scalar is a Goldstone boson arising from the spontaneous breaking of a U(1) symmetry. We investigate the low-energy effective action for such a Goldstone boson in a general class of models involving only scalars, proving that if the scalars have standard kinetic terms then at the {\em classical} level the effective action does not have the required features for spontaneous Lorentz violation to occur asymptotically ($t \to \infty$) in an expanding FRW universe. Then we study the large $N$ limit of a renormalizable field theory with a complex scalar coupled to massive fermions. In this model an effective action for the Goldstone boson with the properties required for spontaneous Lorentz violation can be generated. Although the model has shortcomings, we feel it represents progress towards finding a high energy completion for the Higgs phase of gravity. Comment: 20 pages, 5 figures; fixed typos and added reference
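
    For orientation (this is the standard picture the title alludes to, stated generically rather than as this paper's specific construction), the Higgs phase of gravity is usually described by a shift-symmetric effective action $\mathcal{L} = M^4 P(X)$ with $X = \partial_\mu\phi\,\partial^\mu\phi$, and spontaneous Lorentz violation means the Goldstone settles into a time-dependent background $\phi \propto t$, i.e. $X$ is driven to a nonzero constant as $t \to \infty$. For a standard kinetic term, $P(X) = X$, the velocity of a free massless scalar redshifts away in an expanding universe ($\dot\phi \propto a^{-3}$), so $X \to 0$ rather than approaching a Lorentz-violating value; the result quoted above generalizes this obstruction to multi-scalar models with canonical kinetic terms at the classical level.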

    Compression creep of filamentary composites

    Axial and transverse strain fields induced in composite laminates subjected to compressive creep loading were compared for several types of laminate layups. Unidirectional graphite/epoxy as well as multi-directional graphite/epoxy and graphite/PEEK layups were studied. Specimens with and without holes were tested. The specimens were subjected to compressive creep loading for a 10-hour period. In-plane displacements were measured using moire interferometry. A computer-based data reduction scheme was developed which reduces the whole-field displacement fields obtained using moire to whole-field strain contour maps. Only slight viscoelastic response was observed in matrix-dominated laminates, except for one test in which catastrophic specimen failure occurred after a 16-hour period. In this case the specimen response was a complex combination of both viscoelastic and fracture mechanisms. No viscoelastic effects were observed for fiber-dominated laminates over the 10-hour creep time used. The experimental results for specimens with holes were compared with results obtained using a finite-element analysis. The comparison between experiment and theory was generally good. Overall strain distributions were very well predicted. The finite-element analysis typically predicted slightly higher strain values at the edge of the hole, and slightly lower strain values at positions removed from the hole, than were observed experimentally. It is hypothesized that these discrepancies are due to nonlinear material behavior at the hole edge, which was not accounted for in the finite-element analysis.
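
    The data-reduction step described above, which turns whole-field moire displacement fields into strain contour maps, amounts to numerically differentiating the measured in-plane displacements. A minimal sketch of that computation in Python (the grid, the synthetic displacement fields, and the plotting choices are illustrative assumptions, not the authors' software):

    import numpy as np
    import matplotlib.pyplot as plt

    # Regular grid over the specimen surface (dimensions assumed, in mm).
    nx, ny = 100, 200
    dx = dy = 0.1
    x = np.arange(nx) * dx
    y = np.arange(ny) * dy
    X, Y = np.meshgrid(x, y)

    # In-plane displacement fields u(x, y) and v(x, y). In practice these come
    # from the moire fringe analysis; synthetic placeholders are used here.
    u = -2.0e-4 * X + 5.0e-6 * X * Y
    v = 1.0e-4 * Y

    # Small-strain definitions: numerically differentiate the displacement fields.
    du_dy, du_dx = np.gradient(u, dy, dx)
    dv_dy, dv_dx = np.gradient(v, dy, dx)
    eps_x = du_dx              # normal strain in x
    eps_y = dv_dy              # normal strain in y
    gamma_xy = du_dy + dv_dx   # engineering shear strain

    # Whole-field strain contour map (normal strain in x shown as an example).
    plt.contourf(X, Y, eps_x, levels=20)
    plt.colorbar(label="strain")
    plt.show()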

    The Baryon-Dark Matter Ratio Via Moduli Decay After Affleck-Dine Baryogenesis

    Low-scale supersymmetry breaking in string motivated theories implies the presence of O(100) TeV scale moduli, which generically lead to a significant modification of the history of the universe prior to Big Bang Nucleosynthesis. Such an approach implies a non-thermal origin for dark matter resulting from scalar decay, where the lightest supersymmetric particle can account for the observed dark matter relic density. We study the further effect of the decay on the baryon asymmetry of the universe, and find that this can satisfactorily address the problem of the over-production of the baryon asymmetry by the Affleck-Dine mechanism in the MSSM. Remarkably, there is a natural connection between the baryon and dark matter abundances today, which leads to a solution of the `Cosmic Coincidence Problem'. Comment: 12 pages, no figure. v2: references added
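
    Why O(100) TeV moduli reshape the pre-BBN history, yet remain viable, can be seen from a rough decay estimate (a generic order-of-magnitude sketch with an assumed O(1) prefactor, not a calculation from this paper): a modulus decaying through Planck-suppressed couplings has a width of order $m_\phi^3/M_P^2$, giving a reheat temperature of a few MeV to tens of MeV, around the threshold required for successful nucleosynthesis.

    import math

    # Order-of-magnitude reheat temperature from gravitational modulus decay.
    # The prefactors below are assumptions for illustration only.
    M_P = 2.4e18      # reduced Planck mass, GeV
    m_phi = 1.0e5     # modulus mass ~ 100 TeV, GeV
    g_star = 10.75    # relativistic degrees of freedom near ~10 MeV

    gamma = m_phi**3 / (4 * math.pi * M_P**2)                       # decay width, GeV
    T_R = (90 / (math.pi**2 * g_star))**0.25 * math.sqrt(gamma * M_P)

    print(f"Gamma ~ {gamma:.2e} GeV, T_R ~ {T_R*1e3:.1f} MeV")
    # -> a few MeV: late enough that the decay dilutes pre-existing relics
    #    (including an Affleck-Dine asymmetry), yet near the BBN threshold.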

    Minimum Length from Quantum Mechanics and Classical General Relativity

    We derive fundamental limits on measurements of position, arising from quantum mechanics and classical general relativity. First, we show that any primitive probe or target used in an experiment must be larger than the Planck length, $l_P$. This suggests a Planck-size {\it minimum ball} of uncertainty in any measurement. Next, we study interferometers (such as LIGO) whose precision is much finer than the size of any of their individual components, and which hence are not obviously limited by the minimum ball. Nevertheless, we deduce a fundamental limit on their accuracy of order $l_P$. Our results imply a {\it device independent} limit on possible position measurements. Comment: 8 pages, latex, to appear in the Physical Review Letters

    Seiberg Duality and e+ e- Experiments

    Seiberg duality in supersymmetric gauge theories is the claim that two different theories describe the same physics in the infrared limit. However, one cannot easily work out physical quantities in strongly coupled theories, and hence it has been difficult to compare the physics of the electric and magnetic theories. In order to gain more insight into the equivalence of the two theories, we study the ``e+ e-'' cross sections into ``hadrons'' for both theories in the superconformal window. We describe a technique which allows us to compute the cross sections exactly in the infrared limit. They are indeed equal in the low-energy limit, and the equality is guaranteed by the anomaly matching condition. The ultraviolet behavior of the total ``e+ e-'' cross section is different for the two theories. We comment on proposed non-supersymmetric dualities. We also analyze the agreement of the ``\gamma\gamma'' and ``WW'' scattering amplitudes in both theories, and in particular try to understand whether their equivalence can be explained by the anomaly matching condition. Comment: 24 pages, 2 figures, uses psfig
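
    The logic behind such an exact infrared statement (paraphrased generically here; the details are in the paper) is that the ratio of the ``hadronic'' to the leptonic cross section is proportional to $\operatorname{Im}\Pi(s)$, where the external ``photon'' couples to a conserved flavor current $J_\mu$ with $\langle J_\mu(q) J_\nu(-q)\rangle = (q^2\eta_{\mu\nu} - q_\mu q_\nu)\,\Pi(q^2)$. At an infrared fixed point $\operatorname{Im}\Pi$ approaches a constant set by the current two-point coefficient, a quantity tied to 't Hooft anomalies and therefore equal in the electric and magnetic descriptions, while in the ultraviolet the two theories have different matter content and the cross sections differ.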

    A Large-Scale, Open-Domain, Mixed-Interface Dialogue-Based ITS for STEM

    We present Korbit, a large-scale, open-domain, mixed-interface, dialogue-based intelligent tutoring system (ITS). Korbit uses machine learning, natural language processing and reinforcement learning to provide interactive, personalized learning online. Korbit has been designed to easily scale to thousands of subjects by automating, standardizing and simplifying the content creation process. Unlike with other ITSs, a teacher can develop new learning modules for Korbit in a matter of hours. To facilitate learning across a wide range of STEM subjects, Korbit uses a mixed interface, which includes videos, interactive dialogue-based exercises, question-answering, conceptual diagrams, mathematical exercises and gamification elements. Korbit has been built to scale to millions of students by utilizing a state-of-the-art cloud-based micro-service architecture. Korbit launched its first course, on machine learning, in 2019, and since then over 7,000 students have enrolled. Although Korbit was designed to be open-domain and highly scalable, A/B testing experiments with real-world students demonstrate that both student learning outcomes and student motivation are substantially improved compared to typical online courses.
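
    As a purely hypothetical illustration of what a dialogue-based exercise loop in this kind of ITS might look like (the names and the toy grading heuristic are assumptions for illustration, not Korbit's implementation):

    # Illustrative sketch of a dialogue-based exercise loop for an ITS.
    def similarity(student_answer: str, expected: str) -> float:
        """Toy answer-matching heuristic (word overlap); a real system would
        use an NLP model to judge the student's free-text solution."""
        a, b = set(student_answer.lower().split()), set(expected.lower().split())
        return len(a & b) / max(len(b), 1)

    def run_exercise(question: str, expected: str, hints: list[str]) -> bool:
        print(question)
        for hint in hints + [None]:
            answer = input("> ")
            if similarity(answer, expected) > 0.6:
                print("Correct! Moving on to the next exercise.")
                return True
            if hint is not None:
                print(f"Not quite. Hint: {hint}")  # personalized feedback step
        print(f"The expected answer was: {expected}")
        return False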

The Role of Comprehension in Requirements and Implications for Use Case Descriptions

    Within requirements engineering it is generally accepted that in writing specifications (or indeed any requirements phase document), one attempts to produce an artefact which will be simple to comprehend for the user. That is, whether the document is intended for customers to validate requirements, or for engineers to understand what the design must deliver, comprehension is an important goal for the author. Indeed, advice on producing ‘readable’ or ‘understandable’ documents is often included in courses on requirements engineering. However, few researchers, particularly within the software engineering domain, have attempted either to define or to understand the nature of comprehension and its implications for guidance on the production of quality requirements. This paper therefore examines the nature of textual comprehension in detail, drawing heavily on research in discourse process theory, and suggests some implications for requirements (and other) software documentation. In essence, we find that the guidance on writing requirements prevalent within software engineering may be based upon assumptions which oversimplify the nature of comprehension. Hence, the paper examines guidelines which have been proposed, in this case for use case descriptions, and the extent to which they agree with discourse process theory, before suggesting refinements to the guidelines which attempt to utilise lessons learned from our richer understanding of the underlying theory. For example, we suggest subtly different sets of writing guidelines for the different tasks of requirements, specification and design.

    Local design optimization for composite transport fuselage crown panels

    Composite transport fuselage crown panel design and manufacturing plans were optimized, with projected cost and weight savings of 18 percent and 45 percent, respectively. These savings are close to those quoted as overall NASA ACT program goals. Three local optimization tasks were found to influence the cost and weight of fuselage crown panels. This paper summarizes the effect of each task and describes in detail the task associated with a design cost model. Studies were performed to evaluate the relationship between manufacturing cost and design details. A design tool was developed to aid in these investigations; its development involved combining cost and performance constraints with a random search optimization algorithm. The resulting software was used in a series of optimization studies that evaluated the sensitivity of cost to design variables, guidelines, criteria, and material selection. The effect of blending adjacent design points in a full-scale panel subjected to changing load distributions and local variations was shown to be important. Technical issues and directions for future work were identified.
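
    The optimization pattern described above, a cost model plus performance constraints driven by random search, can be sketched generically as follows (the design variables, constraint check, and cost model are illustrative placeholders, not the actual NASA tool):

    import random

    # Generic constrained random search over discrete panel design variables.
    DESIGN_SPACE = {
        "skin_plies":     range(8, 25),          # number of plies in the skin
        "stringer_pitch": [4.0, 5.0, 6.0, 8.0],  # inches (assumed options)
        "frame_spacing":  [18.0, 20.0, 22.0],    # inches (assumed options)
    }

    def sample_design():
        return {k: random.choice(list(v)) for k, v in DESIGN_SPACE.items()}

    def satisfies_constraints(d):
        # Placeholder performance constraints (strength, stability, strain limits).
        return d["skin_plies"] >= 10 or d["stringer_pitch"] <= 5.0

    def cost(d):
        # Placeholder manufacturing-cost model: more plies and tighter stringer
        # pitch mean more material and labor; wider frame spacing is cheaper.
        return (1000 + 40 * d["skin_plies"] + 1200 / d["stringer_pitch"]
                + 50 * (22.0 - d["frame_spacing"]))

    best = None
    for _ in range(10_000):
        d = sample_design()
        if satisfies_constraints(d) and (best is None or cost(d) < cost(best)):
            best = d

    if best is not None:
        print("lowest-cost feasible design found:", best, "cost:", round(cost(best), 1))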

    Comments on Non-Commutative Phenomenology

    It is natural to ask whether non-commutative geometry plays a role in four dimensional physics. By performing explicit computations in various toy models, we show that quantum effects lead to violations of Lorentz invariance at the level of operators of dimension three or four. The resulting constraints are very stringent. Comment: Correction of an error in the U(1) and U(N) calculation leads to stronger limits than those given previously. Clarifying comments and reference added
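
    For context (these are the standard definitions, not results specific to this paper), "non-commutative" here means spacetime coordinates obeying $[x^\mu, x^\nu] = i\theta^{\mu\nu}$ with $\theta^{\mu\nu}$ a constant antisymmetric tensor, and field theories on such a space are built by replacing ordinary products with the Moyal star product, $(f \star g)(x) = f(x)\, e^{\frac{i}{2}\theta^{\mu\nu}\overleftarrow{\partial}_\mu \overrightarrow{\partial}_\nu}\, g(x)$. Because $\theta^{\mu\nu}$ picks out fixed directions in spacetime, loop corrections can generate Lorentz-violating operators with coefficients involving $\theta$ and powers of the cutoff, which is why low-energy tests of Lorentz invariance translate into the stringent constraints mentioned above.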