
    Fermion Masses in SO(10) Models

    We examine many SO(10) models for their viability or otherwise in explaining all the fermion masses and mixing angles. This study is carried out for both supersymmetric and non-supersymmetric models and with minimal ($10+\overline{126}$) and non-minimal ($10+\overline{126}+120$) Higgs content. Extensive numerical fits to fermion masses and mixing are carried out in each case assuming dominance of the type-II or type-I seesaw mechanism. The required scale of $B-L$ breaking is identified in each case. In the supersymmetric case, several sets of data at the GUT scale, with or without inclusion of finite supersymmetric corrections, are used. All models studied provide quite good fits if the type-I seesaw mechanism dominates, while many fail if the type-II seesaw dominates. This can be traced to the absence of $b$-$\tau$ unification at the GUT scale in these models. The minimal non-supersymmetric model with type-I seesaw dominance gives excellent fits. In the presence of a $45_H$ and an intermediate scale, the model can also account for gauge coupling unification, making it a potentially interesting model for complete unification. The structure of the Yukawa coupling matrices obtained numerically in this specific case is shown to follow from a very simple U(1) symmetry and a Froggatt-Nielsen singlet. Comment: 31 pages, 9 tables, 4 figures
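For orientation, the two seesaw contributions compared in the fits take the standard textbook forms (not restated in the abstract; here $m_D$ denotes the Dirac neutrino mass matrix, $M_R$ the right-handed Majorana mass matrix, $f$ the $\overline{126}$ Yukawa coupling, and $v_L$ the induced left-handed triplet vev):

```latex
% Type-I and type-II seesaw contributions to the light neutrino mass matrix
m_\nu^{\mathrm{I}}  = - m_D \, M_R^{-1} \, m_D^{T} ,
\qquad
m_\nu^{\mathrm{II}} = f \, v_L .
```

Type-II dominance ties $m_\nu$ directly to the same $\overline{126}$ coupling $f$ that enters the charged-fermion mass fits, which is why its viability is sensitive to $b$-$\tau$ unification in these models.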

    Graph matching with a dual-step EM algorithm

    This paper describes a new approach to matching geometric structure in 2D point-sets. The novel feature is to unify the tasks of estimating transformation geometry and identifying point-correspondence matches. Unification is realized by constructing a mixture model over the bipartite graph representing the correspondence match and by effecting optimization using the EM algorithm. According to our EM framework, the probabilities of structural correspondence gate contributions to the expected likelihood function used to estimate maximum likelihood transformation parameters. These gating probabilities measure the consistency of the matched neighborhoods in the graphs. The recovery of transformational geometry and hard correspondence matches are interleaved and are realized by applying coupled update operations to the expected log-likelihood function. In this way, the two processes bootstrap one another. This provides a means of rejecting structural outliers. We evaluate the technique on two real-world problems. The first involves the matching of different perspective views of 3.5-inch floppy discs. The second example is furnished by the matching of a digital map against aerial images that are subject to severe barrel distortion due to a line-scan sampling process. We complement these experiments with a sensitivity study based on synthetic data.
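The interleaved alternation described above can be sketched as a toy EM loop (a minimal, illustrative NumPy version for rigid 2D transforms; the function name, the plain Gaussian soft-assignment, and the fixed kernel width `sigma` are choices made for this sketch, not the paper's exact scheme, which additionally gates by structural neighborhood consistency and recovers hard correspondences):

```python
import numpy as np

def em_rigid_match(X, Y, sigma=0.5, n_iter=50):
    """EM alternation for 2D point-set registration (illustrative sketch).

    E-step: soft correspondence probabilities ("gating" weights) from
    Gaussian distances between transformed X points and Y points.
    M-step: weighted Procrustes (Kabsch) estimate of a rigid transform
    (R, t) mapping X onto Y.
    """
    R, t = np.eye(2), np.zeros(2)
    for _ in range(n_iter):
        # E-step: P[i, j] = responsibility that X[i] corresponds to Y[j]
        d2 = (((X @ R.T + t)[:, None, :] - Y[None, :, :]) ** 2).sum(-1)
        P = np.exp(-d2 / (2 * sigma ** 2))
        P /= P.sum(axis=1, keepdims=True)
        # M-step: weighted least squares for R, t
        w = P.sum()
        mu_x = P.sum(axis=1) @ X / w          # weighted centroid of X
        mu_y = P.sum(axis=0) @ Y / w          # weighted centroid of Y
        C = (X - mu_x).T @ P @ (Y - mu_y)     # weighted cross-covariance
        U, _, Vt = np.linalg.svd(C)
        d = np.sign(np.linalg.det(Vt.T @ U.T))  # guard against reflections
        R = Vt.T @ np.diag([1.0, d]) @ U.T
        t = mu_y - R @ mu_x
    return R, t, P
```

The two updates bootstrap each other exactly as in the text: sharper correspondence probabilities give a better transform estimate, and a better transform sharpens the probabilities.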

    Light Neutralinos as Dark Matter in the Unconstrained Minimal Supersymmetric Standard Model

    The allowed parameter space for the lightest neutralino as the dark matter is explored using the Minimal Supersymmetric Standard Model as the low-energy effective theory, without further theoretical constraints such as GUT. Selecting values of the parameters which are in agreement with present experimental limits and applying the additional requirement that the lightest neutralino be in a cosmologically interesting range, we give limits on the neutralino mass and composition. A similar analysis is also performed implementing the grand unification constraints. The elastic scattering cross section of the selected neutralinos on $^{27}$Al and on other materials for dark matter experiments is discussed. Comment: Submitted to Astroparticle Physics, 19 Feb. 96, Latex 23 pages with 24 figures in a gzip compressed file FIGURE.PS.GZ available via anonymous ftp from ftp://iws104.mppmu.mpg.de/pub/gabutt
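For reference (standard MSSM notation, assumed here rather than defined in the abstract), the lightest neutralino is the lowest eigenstate of the neutral gaugino-higgsino mass matrix,

```latex
\chi \equiv \tilde{\chi}_1^0
  = N_{11}\,\tilde{B} + N_{12}\,\tilde{W}^3
  + N_{13}\,\tilde{H}_1^0 + N_{14}\,\tilde{H}_2^0 ,
```

so the "composition" constrained by the analysis is the gaugino fraction $|N_{11}|^2 + |N_{12}|^2$ versus the higgsino fraction $|N_{13}|^2 + |N_{14}|^2$.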

    Set Unification

    The unification problem in algebras capable of describing sets has been tackled, directly or indirectly, by many researchers and it finds important applications in various research areas--e.g., deductive databases, theorem proving, static analysis, rapid software prototyping. The various solutions proposed are spread across a large literature. In this paper we provide a uniform presentation of unification of sets, formalizing it at the level of set theory. We address the problem of deciding existence of solutions at an abstract level. This provides also the ability to classify different types of set unification problems. Unification algorithms are uniformly proposed to solve the unification problem in each of such classes. The algorithms presented are partly drawn from the literature--and properly revisited and analyzed--and partly novel proposals. In particular, we present a new goal-driven algorithm for general ACI1 unification and a new simpler algorithm for general (Ab)(Cl) unification. Comment: 58 pages, 9 figures, 1 table. To appear in Theory and Practice of Logic Programming (TPLP)
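To make the distinctive feature of set unification concrete, here is a small brute-force solver (illustrative only; the function name and representation are invented for this sketch, and it handles only flat sets of constants and variables): a single equation can admit several independent unifiers, because duplicates collapse under set semantics.

```python
from itertools import product

def flat_set_unify(lhs, rhs, variables):
    """Enumerate all ground solutions of the set equation lhs = rhs.

    lhs/rhs are collections of symbols; members of `variables` are
    variables, everything else is a constant.  Brute force: try every
    assignment of variables to known constants and keep those under
    which both sides denote the same set.
    """
    consts = sorted({s for s in list(lhs) + list(rhs) if s not in variables})
    vs = sorted(set(variables))
    solutions = []
    for binding in product(consts, repeat=len(vs)):
        sub = dict(zip(vs, binding))
        instantiate = lambda side: {sub.get(s, s) for s in side}
        if instantiate(lhs) == instantiate(rhs) and sub not in solutions:
            solutions.append(sub)
    return solutions
```

For example, `{X, Y} = {a, b}` has the two unifiers `{X=a, Y=b}` and `{X=b, Y=a}` (but not `X=Y=a`, since `{a} != {a, b}`), while `{X, a} = {a}` is solved by `X=a` because the duplicate collapses.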

    A Formalization of the Theorem of Existence of First-Order Most General Unifiers

    This work presents a formalization of the theorem of existence of most general unifiers in first-order signatures in the higher-order proof assistant PVS. The distinguishing feature of this formalization is that it remains close to the textbook proofs that are based on proving the correctness of the well-known Robinson first-order unification algorithm. The formalization was applied inside a PVS development for term rewriting systems that provides a complete formalization of the Knuth-Bendix Critical Pair theorem, among other relevant theorems of the theory of rewriting. In addition, the formalization methodology has proved to be of practical use for verifying the correctness of unification algorithms in the style of Robinson's original unification algorithm. Comment: In Proceedings LSFA 2011, arXiv:1203.542
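The textbook algorithm the formalization follows can be rendered in a few lines (a minimal Python sketch, assuming variables are capitalized strings and compound terms are tuples `(functor, arg1, ...)`; this is an illustration of Robinson-style unification with occurs check, not the PVS development itself):

```python
def is_var(t):
    """A term is a variable if it is a capitalized string."""
    return isinstance(t, str) and t[:1].isupper()

def walk(t, s):
    """Follow variable bindings in substitution s."""
    while is_var(t) and t in s:
        t = s[t]
    return t

def occurs(v, t, s):
    """Occurs check: does variable v appear in term t under s?"""
    t = walk(t, s)
    if t == v:
        return True
    return isinstance(t, tuple) and any(occurs(v, a, s) for a in t[1:])

def unify(t1, t2, s=None):
    """Return a most general unifier of t1 and t2, or None on failure."""
    s = {} if s is None else s
    t1, t2 = walk(t1, s), walk(t2, s)
    if t1 == t2:
        return s
    if is_var(t1):
        return None if occurs(t1, t2, s) else {**s, t1: t2}
    if is_var(t2):
        return None if occurs(t2, t1, s) else {**s, t2: t1}
    if (isinstance(t1, tuple) and isinstance(t2, tuple)
            and len(t1) == len(t2) and t1[0] == t2[0]):
        for a, b in zip(t1[1:], t2[1:]):
            s = unify(a, b, s)
            if s is None:
                return None
        return s
    return None
```

For example, unifying `f(X, g(Y))` with `f(a, g(X))` yields a substitution under which both `X` and `Y` resolve to `a`, while unifying `X` with `f(X)` fails the occurs check, which is exactly the case the existence theorem must exclude.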

    An Abstract Machine for Unification Grammars

    This work describes the design and implementation of an abstract machine, Amalia, for the linguistic formalism ALE, which is based on typed feature structures. This formalism is one of the most widely accepted in computational linguistics and has been used for designing grammars in various linguistic theories, most notably HPSG. Amalia is composed of data structures and a set of instructions, augmented by a compiler from the grammatical formalism to the abstract instructions, and a (portable) interpreter of the abstract instructions. The effect of each instruction is defined using a low-level language that can be executed on ordinary hardware. The advantages of the abstract machine approach are twofold. From a theoretical point of view, the abstract machine gives a well-defined operational semantics to the grammatical formalism. This ensures that grammars specified using our system are endowed with a well-defined meaning. It makes it possible, for example, to formally verify the correctness of a compiler for HPSG, given an independent definition. From a practical point of view, Amalia is the first system that employs a direct compilation scheme for unification grammars that are based on typed feature structures. The use of Amalia results in a much improved performance over existing systems. In order to test the machine on a realistic application, we have developed a small-scale, HPSG-based grammar for a fragment of the Hebrew language, using Amalia as the development platform. This is the first application of HPSG to a Semitic language. Comment: Doctoral Thesis, 96 pages, many postscript figures, uses pstricks, pst-node, psfig, fullname and a macros file
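The core operation such a machine compiles, feature-structure unification, can be illustrated with a deliberately simplified sketch (nested dicts as feature structures, atoms as values; real ALE structures are typed and may share substructure via reentrancy, both of which this toy version ignores):

```python
def fs_unify(a, b):
    """Unify two feature structures; dicts map features to values.

    Returns the merged structure, or None if an atomic value clashes.
    """
    if isinstance(a, dict) and isinstance(b, dict):
        out = dict(a)
        for feat, bval in b.items():
            if feat in out:
                u = fs_unify(out[feat], bval)
                if u is None:
                    return None        # clash below this feature
                out[feat] = u
            else:
                out[feat] = bval       # feature only in b: copy over
        return out
    return a if a == b else None       # atoms unify only when equal
```

For instance, unifying `{'agr': {'num': 'sg'}, 'cat': 'np'}` with `{'agr': {'per': '3'}}` merges the agreement features, whereas `{'num': 'sg'}` against `{'num': 'pl'}` fails; an abstract machine's gain comes from compiling such unifications into fixed instruction sequences instead of interpreting them structure-by-structure.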