
    Notes on nonabelian (0,2) theories and dualities

    In this paper we explore basic aspects of nonabelian (0,2) GLSMs in two dimensions for unitary gauge groups, an arena that until recently has largely been unexplored. We begin by discussing general aspects of (0,2) theories, including checks of dynamical supersymmetry breaking, spectators and weak coupling limits, and we also build some toy models of (0,2) theories for bundles on Grassmannians, which gives us an opportunity to relate physical anomalies and trace conditions to mathematical properties. We apply these ideas to study (0,2) theories on Pfaffians, using the recent perturbative constructions of Pfaffians of Jockers et al. We discuss how existing dualities in (2,2) nonabelian gauge theories have a simple mathematical understanding, and make predictions for additional dualities in (2,2) and (0,2) gauge theories. Finally, we outline how duality works in open strings in unitary gauge theories, and also describe why, in general terms, we expect analogous dualities in (0,2) theories to be comparatively rare.
    Comment: 93 pages, LaTeX; v2: typos fixed

    DPCA: Dimensionality Reduction for Discriminative Analytics of Multiple Large-Scale Datasets

    Principal component analysis (PCA) has well-documented merits for data extraction and dimensionality reduction. PCA deals with a single dataset at a time, and it is challenged when it comes to analyzing multiple datasets. Yet in certain setups, one wishes to extract the most significant information of one dataset relative to other datasets. Specifically, the interest may be in identifying and extracting features that are specific to a single target dataset but not to the others. This paper develops a novel approach for such so-termed discriminative data analysis, and establishes its optimality in the least-squares (LS) sense under suitable data modeling assumptions. The criterion reveals linear combinations of variables by maximizing the ratio of the variance of the target data to that of the remainders. The novel approach solves a generalized eigenvalue problem by performing an SVD just once. Numerical tests using synthetic and real datasets showcase the merits of the proposed approach relative to its competing alternatives.
    Comment: 5 pages, 2 figures
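    The criterion described in the abstract, maximizing the ratio of the target data's variance to that of the remaining datasets along each projection direction, amounts to a generalized eigenvalue problem between the two sample covariance matrices. The sketch below is only a minimal illustration of that idea, not the authors' implementation: the function name discriminative_pca, the ridge regularization, and the synthetic example are all illustrative assumptions.

```python
# Illustrative sketch: find directions w maximizing w^T C_target w / w^T C_background w,
# i.e. a generalized eigenvalue problem. Not the paper's actual algorithm.
import numpy as np
from scipy.linalg import eigh

def discriminative_pca(X_target, X_background, n_components=2):
    """Return the top discriminative directions and the projected target data.

    X_target, X_background: (n_samples, n_features) arrays, assumed centered.
    """
    # Sample covariance of each dataset.
    C_t = X_target.T @ X_target / X_target.shape[0]
    C_b = X_background.T @ X_background / X_background.shape[0]
    # Small ridge so the background covariance is positive definite (assumption).
    C_b += 1e-6 * np.eye(C_b.shape[0])
    # Generalized eigenvalue problem C_t w = lambda C_b w; eigh returns
    # eigenvalues in ascending order, so keep the last n_components columns.
    eigvals, eigvecs = eigh(C_t, C_b)
    W = eigvecs[:, -n_components:][:, ::-1]
    return W, X_target @ W

# Example: variance injected only into the first two features of the target
# dataset should dominate the leading discriminative components.
rng = np.random.default_rng(0)
X_bg = rng.normal(size=(500, 10))
X_tg = rng.normal(size=(500, 10))
X_tg[:, :2] += rng.normal(scale=3.0, size=(500, 2))  # target-specific structure
W, Z = discriminative_pca(X_tg - X_tg.mean(0), X_bg - X_bg.mean(0))
print(W.shape, Z.shape)  # (10, 2) (500, 2)
```

    In contrast to standard PCA, which would simply pick the directions of largest overall variance, directions that are equally variable in the background data are down-weighted here, which is the "relative to other datasets" behavior the abstract describes.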