Protecting the Primordial Baryon Asymmetry From Erasure by Sphalerons
If the baryon asymmetry of the universe was created at the GUT scale,
sphalerons together with exotic sources of (B-L)-violation could have erased
it, unless the latter satisfy stringent bounds. We elaborate on how the small
Yukawa coupling of the electron drastically weakens previous estimates of these
bounds. Comment: 41 pp., 4 LaTeX figures included and 3 uuencoded or PostScript
figures available by request, UMN-TH-1213-9
UWOMJ Volume 30, Number 4, November 1960
Schulich School of Medicine & Dentistry
Algebraic Quantum Gravity (AQG) IV. Reduced Phase Space Quantisation of Loop Quantum Gravity
We perform a canonical, reduced phase space quantisation of General
Relativity by Loop Quantum Gravity (LQG) methods. The explicit construction of
the reduced phase space is made possible by the combination of 1. the
Brown-Kuchar mechanism in the presence of pressure-free dust fields, which
allows one to deparametrise the theory, and 2. Rovelli's relational formalism in the extended
version developed by Dittrich to construct the algebra of gauge invariant
observables. Since the resulting algebra of observables is very simple, one can
quantise it using the methods of LQG. Basically, the kinematical Hilbert space
of non-reduced LQG now becomes a physical Hilbert space, and the kinematical
results of LQG such as discreteness of spectra of geometrical operators now
have physical meaning. The constraints have disappeared; however, the dynamics
of the observables is driven by a physical Hamiltonian which is related to the
Hamiltonian of the standard model (without dust) and which we quantise in this
paper. Comment: 31 pages, no figures
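Schematically, the Brown-Kuchar dust deparametrisation trades the constraints for a physical Hamiltonian built from the gravity-plus-matter constraints C and C_a (this is a sketch of the standard construction as it appears in the deparametrisation literature, not a quotation from the paper):

\[
H_{\mathrm{phys}} \;=\; \int_{\mathcal S} d^3\sigma\,
\sqrt{\,C^2(\sigma) \;-\; q^{ab}(\sigma)\, C_a(\sigma)\, C_b(\sigma)\,},
\]

where \(q^{ab}\) is the inverse spatial metric and the integral runs over the dust-time slice \(\mathcal S\). Because \(H_{\mathrm{phys}}\) is built from gauge-invariant observables, it generates genuine evolution rather than a constraint.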
Using character varieties: Presentations, invariants, divisibility and determinants
If G is a finitely generated group, then the set of all characters from G into a linear algebraic group is a useful (but not complete) invariant of G. In this thesis, we present some new methods for computing with the variety of SL(2,C)-characters of a finitely presented group. We review the theory of Fricke characters, and introduce a notion of presentation simplicity which uses these results. With this definition, we give a set of GAP routines which facilitate the simplification of group presentations. We provide an explicit canonical basis for an invariant ring associated with a symmetrically presented group's character variety. Then, turning to the divisibility properties of trace polynomials, we examine a sequence of polynomials rn(a) governing the weak divisibility of a family of shifted linear recurrence sequences. We prove a discriminant/determinant identity about certain factors of rn(a) in an intriguing manner. Finally, we indicate how ordinary generating functions may be used to discover linear factors of sequences of discriminants.
Other novelties include an unusual binomial identity, which we use to prove a well-known formula for traces; the use of a generating function to find the inverse of a map xn ↦ fn(x); and a brief exploration of the relationship between finding the determinants of a parametrized family of matrices and the Smith Normal Forms of the sequence.
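The "well-known formula for traces" underlying Fricke character theory can be checked numerically. A minimal sketch (the `random_sl2` helper is ours, not from the thesis) verifying the classical SL(2,C) identity tr(AB) + tr(AB⁻¹) = tr(A)tr(B), which follows from Cayley-Hamilton since B + B⁻¹ = tr(B)·I when det B = 1:

```python
import numpy as np

def random_sl2(rng):
    """Random 2x2 complex matrix rescaled to determinant 1, i.e. in SL(2,C)."""
    m = rng.standard_normal((2, 2)) + 1j * rng.standard_normal((2, 2))
    return m / np.sqrt(np.linalg.det(m).astype(complex))

rng = np.random.default_rng(0)
A, B = random_sl2(rng), random_sl2(rng)

# Fricke trace identity: tr(AB) + tr(AB^{-1}) = tr(A) tr(B)
lhs = np.trace(A @ B) + np.trace(A @ np.linalg.inv(B))
rhs = np.trace(A) * np.trace(B)
print(np.allclose(lhs, rhs))  # True
```

This identity is what lets every trace function on an SL(2,C)-representation variety be written as a polynomial in the traces of finitely many group elements.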
Part decomposition of 3D surfaces
This dissertation describes a general algorithm that automatically decomposes real-world scenes and objects into visual parts. The input to the algorithm is a 3D triangle mesh that approximates the surfaces of a scene or object. This geometric mesh completely specifies the shape of interest. The output of the algorithm is a set of boundary contours that dissect the mesh into parts where these parts agree with human perception. In this algorithm, shape alone defines the location of a boundary contour for a part. The algorithm leverages a human vision theory known as the minima rule, which states that human visual perception tends to decompose shapes into parts along lines of negative curvature minima. Specifically, the minima rule governs the location of part boundaries, and as a result the algorithm is known as the Minima Rule Algorithm. Previous computer vision methods have attempted to implement this rule but have used pseudo measures of surface curvature. Thus, these prior methods are not true implementations of the rule. The Minima Rule Algorithm is a three-step process that consists of curvature estimation, mesh segmentation, and quality evaluation. These steps have led to three novel algorithms known as Normal Vector Voting, Fast Marching Watersheds, and Part Saliency Metric, respectively. For each algorithm, this dissertation presents both the supporting theory and experimental results. The results demonstrate the effectiveness of the algorithm using both synthetic and real data and include comparisons with previous methods from the research literature. Finally, the dissertation concludes with a summary of the contributions to the state of the art.
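The minima-rule selection step can be caricatured in a few lines: given estimated minimum principal curvatures at mesh vertices, part-boundary candidates are vertices whose curvature is negative and locally minimal among their neighbours. This is a toy sketch with hypothetical inputs, not the dissertation's Normal Vector Voting or Fast Marching Watersheds implementation:

```python
def boundary_candidates(kappa_min, neighbors):
    """Toy minima-rule filter: keep vertices whose minimum principal
    curvature is negative and locally minimal among mesh neighbours."""
    out = []
    for v, k in enumerate(kappa_min):
        if k < 0 and all(k <= kappa_min[n] for n in neighbors[v]):
            out.append(v)
    return out

# Tiny synthetic "mesh": a path of 5 vertices with a concave crease at vertex 2.
kappa = [0.1, -0.2, -0.9, -0.3, 0.2]
nbrs = {0: [1], 1: [0, 2], 2: [1, 3], 3: [2, 4], 4: [3]}
print(boundary_candidates(kappa, nbrs))  # [2]
```

A real implementation would first estimate robust per-vertex curvatures (the dissertation's Normal Vector Voting step) and then link the flagged vertices into closed boundary contours rather than reporting them individually.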