
    Noncommutative spaces and matrix embeddings on flat R^{2n+1}

    We conjecture an embedding operator which assigns, to any 2n+1 Hermitian matrices, a 2n-dimensional hypersurface in flat (2n+1)-dimensional Euclidean space. This amounts to a precise definition of the fuzzy D(2n)-brane corresponding to N D0-branes. Points on the emergent hypersurface correspond to zero eigenstates of the embedding operator, which have an interpretation as coherent states underlying the emergent noncommutative geometry. Using this correspondence, all physical properties of the emergent D(2n)-brane can be computed. We apply our conjecture to noncommutative flat and spherical spaces. As a by-product, we obtain a construction of a rotationally symmetric flat noncommutative space in 4 dimensions.
    Comment: 14 pages, no figures. v2: added references and a clarification
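    The abstract does not spell out the operator, but for the lowest case n=1 (three Hermitian matrices) a natural form, used in earlier work on emergent fuzzy surfaces, is H(x) = Σ_i σ_i ⊗ (X_i − x_i), with σ_i the Pauli matrices. The numpy sketch below (function names are ours, not the paper's) builds this operator for a fuzzy sphere, taking the X_i to be spin-j SU(2) generators, and probes where it develops a zero eigenvalue: the emergent surface turns out to be a sphere of radius j.

```python
import numpy as np

# Pauli matrices
SX = np.array([[0, 1], [1, 0]], dtype=complex)
SY = np.array([[0, -1j], [1j, 0]], dtype=complex)
SZ = np.array([[1, 0], [0, -1]], dtype=complex)

def spin_matrices(j):
    """Spin-j SU(2) generators J_x, J_y, J_z (dimension 2j+1)."""
    m = np.arange(j, -j - 1, -1)                       # m = j, j-1, ..., -j
    # raising-operator matrix elements <m+1|J_+|m> = sqrt(j(j+1) - m(m+1))
    jplus = np.diag(np.sqrt(j * (j + 1) - m[1:] * (m[1:] + 1)), 1)
    jx = ((jplus + jplus.T) / 2).astype(complex)
    jy = (jplus - jplus.T) / 2j
    jz = np.diag(m).astype(complex)
    return jx, jy, jz

def embedding_operator(Xs, x):
    """H(x) = sum_i sigma_i (x) (X_i - x_i), acting on C^2 (x) C^N.

    Points x where H(x) has a zero eigenvalue lie on the emergent surface;
    the corresponding zero eigenstates play the role of coherent states.
    """
    N = Xs[0].shape[0]
    return sum(np.kron(s, X - xi * np.eye(N))
               for s, X, xi in zip((SX, SY, SZ), Xs, x))

j = 1.5                                    # fuzzy sphere from spin-3/2 matrices
Js = spin_matrices(j)
for r in (0.0, j, 2 * j):                  # probe along the z-axis
    H = embedding_operator(Js, (0.0, 0.0, r))
    gap = min(abs(np.linalg.eigvalsh(H)))
    print(f"r = {r}: smallest |eigenvalue| = {gap:.3f}")
# the gap closes only at r = j: the emergent surface is a sphere of radius j
```

On the z-axis the state |up> ⊗ |m = j> is an exact eigenvector with eigenvalue j − r, which is why the zero crossing sits exactly at radius j.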

    Non-standard embedding and five-branes in heterotic M-Theory

    We construct vacua of M-theory on S^1/Z_2 associated with Calabi-Yau three-folds. These vacua are appropriate for compactification to N=1 supersymmetric theories in both four and five dimensions. We allow for general E_8 x E_8 gauge bundles and for the presence of five-branes. The five-branes span the four-dimensional uncompactified space and are wrapped on holomorphic curves in the Calabi-Yau space. Properties of these vacua, as well as of the resulting low-energy theories, are discussed. We find that the low-energy gauge group is enlarged by gauge fields that originate on the five-brane world-volumes. In addition, the five-branes enlarge the class of E_8 x E_8 breaking patterns allowed by the non-standard embedding. Characteristic features of the low-energy theory, such as the threshold corrections to the gauge kinetic functions, are significantly modified by the presence of the five-branes, as compared to the case of standard or non-standard embeddings without five-branes.
    Comment: 34 pages, Latex 2e with amsmath, typos removed, factors corrected, refs improved
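    For context (not stated explicitly in the abstract): the allowed combinations of gauge bundles and five-brane curves in such vacua are governed by the standard heterotic M-theory anomaly-cancellation condition in cohomology,

```latex
c_2(V_1) + c_2(V_2) + [W] = c_2(TX),
```

where $V_1, V_2$ are the two $E_8$ bundles, $[W]$ is the class of the holomorphic curve wrapped by the five-branes, and $c_2(TX)$ is the second Chern class of the Calabi-Yau tangent bundle. Without five-branes ($[W]=0$) this reduces to the usual constraint on the bundles alone, which is why adding five-branes widens the space of allowed non-standard embeddings.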

    A Comparison of Tests for Embeddings

    We compare results of the classical tests for embeddings of chaotic data with the results of a recently proposed test. The classical tests, which depend on real numbers (fractal dimensions, Lyapunov exponents) averaged over an attractor, are compared with a topological test that depends on integers. The comparison can only be done for mappings into three dimensions. We find that the classical tests fail to predict when a mapping is an embedding and when it is not. We point out the reasons for this failure, which are not restricted to three dimensions.
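    As an illustration of the real-number quantities the classical tests rely on, the sketch below (our own minimal implementation, not the paper's code) estimates a correlation dimension via the Grassberger-Procaccia correlation sum: for points on a smooth curve the local slope of log C(r) versus log r should come out near 1.

```python
import numpy as np

def correlation_sum(points, r):
    """Fraction of point pairs closer than r (Grassberger-Procaccia C(r))."""
    d = np.linalg.norm(points[:, None, :] - points[None, :, :], axis=-1)
    iu = np.triu_indices(len(points), k=1)     # each pair counted once
    return np.mean(d[iu] < r)

# example: points on a 1-D curve embedded in 3-D
t = np.linspace(0, 1, 400)
curve = np.stack([t, np.sin(3 * t), np.cos(3 * t)], axis=1)
radii = np.array([0.05, 0.1, 0.2])
C = np.array([correlation_sum(curve, r) for r in radii])
slope = np.diff(np.log(C)) / np.diff(np.log(radii))
print("correlation sums:", C)
print("local dimension estimates:", slope)     # close to 1 for a smooth curve
```

Such averaged estimates are exactly the kind of statistic the paper finds insufficient to certify that a mapping is an embedding, in contrast to integer-valued topological invariants.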

    Conditional t-SNE: Complementary t-SNE embeddings through factoring out prior information

    Dimensionality reduction and manifold learning methods such as t-Distributed Stochastic Neighbor Embedding (t-SNE) are routinely used to map high-dimensional data into a 2-dimensional space to visualize and explore the data. However, two dimensions are typically insufficient to capture all structure in the data, the salient structure is often already known, and it is not obvious how to extract the remaining information in a similarly effective manner. To fill this gap, we introduce conditional t-SNE (ct-SNE), a generalization of t-SNE that factors prior information, given in the form of labels, out of the embedding. To achieve this, we propose a conditioned version of the t-SNE objective, obtaining a single, integrated, and elegant method. ct-SNE has one extra parameter over t-SNE; we investigate its effects and show how to efficiently optimize the objective. Factoring out prior knowledge allows complementary structure to be captured in the embedding, providing new insights. Qualitative and quantitative empirical results on synthetic and (large) real data show that ct-SNE is effective and achieves its goal.
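    The core conditioning idea can be sketched as follows. In t-SNE, pairwise affinities are derived from a Student-t kernel; conditioning on labels amounts to down-weighting same-label affinities so the embedding need not reproduce structure the labels already explain. The snippet below is a deliberate simplification (a constant multiplicative discount instead of ct-SNE's derived conditional distribution), purely to illustrate the mechanism:

```python
import numpy as np

def tsne_similarities(Y):
    """Standard low-dimensional t-SNE affinities q_ij (Student-t kernel)."""
    d2 = np.sum((Y[:, None, :] - Y[None, :, :]) ** 2, axis=-1)
    w = 1.0 / (1.0 + d2)
    np.fill_diagonal(w, 0.0)
    return w / w.sum()

def conditioned_similarities(Y, labels, discount=0.2):
    """Sketch of ct-SNE's conditioning: same-label affinities are
    multiplicatively down-weighted (discount < 1), so known label
    structure is 'explained away'. The real ct-SNE objective derives
    these weights from a conditional distribution; the constant
    factor here is only illustrative."""
    d2 = np.sum((Y[:, None, :] - Y[None, :, :]) ** 2, axis=-1)
    w = 1.0 / (1.0 + d2)
    np.fill_diagonal(w, 0.0)
    same = labels[:, None] == labels[None, :]
    w = np.where(same, discount * w, w)
    return w / w.sum()

Y = np.array([[0.0, 0.0], [1.0, 0.0], [0.0, 1.0], [1.0, 1.0]])
labels = np.array([0, 0, 1, 1])
q = tsne_similarities(Y)
qc = conditioned_similarities(Y, labels)
print(np.round(q, 3))
print(np.round(qc, 3))
```

After renormalization, the relative weight of same-label pairs drops against cross-label pairs, which is what frees the embedding to show complementary structure.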

    Approximated and User Steerable tSNE for Progressive Visual Analytics

    Progressive Visual Analytics aims to improve the interactivity of existing analytics techniques by means of visualization of, and interaction with, intermediate results. One key method for data analysis is dimensionality reduction, for example, to produce 2D embeddings that can be visualized and analyzed efficiently. t-Distributed Stochastic Neighbor Embedding (tSNE) is a well-suited technique for the visualization of high-dimensional data. tSNE can create meaningful intermediate results but suffers from a slow initialization that constrains its application in Progressive Visual Analytics. We introduce a controllable tSNE approximation (A-tSNE), which trades off speed and accuracy to enable interactive data exploration. We offer real-time visualization techniques, including a density-based solution and a Magic Lens to inspect the degree of approximation. With this feedback, the user can decide on local refinements and steer the approximation level during the analysis. We demonstrate our technique with several datasets, in a real-world research scenario, and for the real-time analysis of high-dimensional streams, illustrating its effectiveness for interactive data analysis.
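    The slow step A-tSNE attacks is the computation of high-dimensional neighborhoods, which it replaces with approximate nearest-neighbor queries whose precision the user can steer. The toy sketch below (our illustration, not the paper's algorithm, which uses forests of randomized trees) shows the same speed/accuracy knob: rank points by a 1-D random projection and search only a candidate window, then measure neighbor recall against the exact answer.

```python
import numpy as np

rng = np.random.default_rng(0)
X = rng.normal(size=(500, 10))
k = 10

def knn_exact(X, k):
    """Exact k-nearest neighbors via the full distance matrix."""
    d = np.linalg.norm(X[:, None, :] - X[None, :, :], axis=-1)
    np.fill_diagonal(d, np.inf)
    return np.argsort(d, axis=1)[:, :k]

def knn_approx(X, k, n_candidates=50):
    """Cheap approximation: sort by a 1-D random projection and search
    only a window of n_candidates around each point. Larger windows
    give higher recall at higher cost -- the trade-off A-tSNE exposes."""
    proj = X @ rng.normal(size=X.shape[1])
    order = np.argsort(proj)
    rank = np.empty_like(order)
    rank[order] = np.arange(len(X))
    nbrs = np.empty((len(X), k), dtype=int)
    for i in range(len(X)):
        lo = max(0, rank[i] - n_candidates // 2)
        cand = order[lo:lo + n_candidates]
        cand = cand[cand != i]                       # drop the point itself
        d = np.linalg.norm(X[cand] - X[i], axis=1)
        nbrs[i] = cand[np.argsort(d)[:k]]
    return nbrs

exact = knn_exact(X, k)
approx = knn_approx(X, k)
recall = np.mean([len(set(a) & set(b)) / k for a, b in zip(exact, approx)])
print(f"approximate-neighbor recall: {recall:.2f}")
```

Refining a region, as with the paper's Magic Lens, corresponds to re-running the neighbor search there with a larger candidate budget.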

    A Unified Approach to Attractor Reconstruction

    In the analysis of complex, nonlinear time series, scientists in a variety of disciplines have relied on a time-delayed embedding of their data, i.e., attractor reconstruction. The process has focused primarily on heuristic and empirical arguments for selection of the key embedding parameters, delay and embedding dimension. This approach has left several long-standing but common problems unresolved, in which the standard approaches produce inferior results or give no guidance at all. We view the current reconstruction process as unnecessarily broken into separate problems. We propose an alternative approach that views the problem of choosing all embedding parameters as one and the same problem, addressable using a single statistical test formulated directly from the reconstruction theorems. This allows for varying time delays appropriate to the data and simultaneously helps decide on embedding dimension. A second new statistic, undersampling, acts as a check against overly long time delays and overly large embedding dimension. Our approach is more flexible than those currently used, but is more directly connected with the mathematical requirements of embedding. In addition, the statistics developed guide the user by allowing optimization and warning when embedding parameters are chosen beyond what the data can support. We demonstrate our approach on uni- and multivariate data, data possessing multiple time scales, and chaotic data. This unified approach resolves all the main issues in attractor reconstruction.
    Comment: 22 pages, revised version as submitted to CHAOS. Manuscript is currently under review. 4 Figures, 31 references
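    The object whose parameters (delay tau and embedding dimension) the paper's unified test selects is the classical delay-coordinate map. A minimal numpy sketch of that map (our own helper, with illustrative parameter values):

```python
import numpy as np

def delay_embed(x, dim, tau):
    """Time-delay embedding: map a scalar series x_t to vectors
    (x_t, x_{t+tau}, ..., x_{t+(dim-1)*tau})."""
    n = len(x) - (dim - 1) * tau
    if n <= 0:
        raise ValueError("series too short for this dim/tau")
    return np.stack([x[i * tau : i * tau + n] for i in range(dim)], axis=1)

# example: a scalar observation of a sine wave unfolds into a loop in 2-D
t = np.linspace(0, 8 * np.pi, 1000)
x = np.sin(t)
Y = delay_embed(x, dim=2, tau=60)
print(Y.shape)          # (940, 2)
```

Heuristics traditionally choose tau (e.g., via autocorrelation or mutual information) and dim (e.g., via false nearest neighbors) separately; the paper's point is that a single statistic derived from the reconstruction theorems can select both at once, including delays that vary across coordinates.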