16,497 research outputs found

    Non-commutative solitons and strong-weak duality

    Some properties of the non-commutative versions of the sine-Gordon model (NCSG) and the corresponding massive Thirring theories (NCMT) are studied. Our method relies on the NC extension of integrable models and on the master Lagrangian approach to dual theories. The master Lagrangians turn out to be the NC versions of the so-called affine Toda model coupled to matter fields (NCATM) associated with the group GL(2), in which the Toda field belongs to certain representations of either U(1) x U(1) or U(1)_C, corresponding to the Lechtenfeld et al. (NCSG_1) or Grisaru-Penati (NCSG_2) proposals for the NC versions of the sine-Gordon model, respectively. In addition, the relevant NCMT_{1,2} models are written for two (four) types of Dirac fields corresponding to the Moyal-product extension of one (two) copy(ies) of the ordinary massive Thirring model. The NCATM_{1,2} models share the same one-soliton exact solutions (in the real Toda field sector of model 2), which are found without expansion in the NC parameter θ for the corresponding Toda and matter fields describing the strong and weak coupling phases, respectively. The correspondence NCSG_1 ↔ NCMT_1 is promising since it is expected to hold at the quantum level.
    Comment: 24 pages, 1 fig., LaTeX. Typos in the star products of eqs. (3.11)-(3.13) and in footnote 1 were corrected. Version to appear in JHE
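    For orientation, the commutative counterparts of the dual pair above are the ordinary sine-Gordon and massive Thirring models, and the NC deformation is generated by the Moyal star product. The following LaTeX lines record these standard expressions (coupling conventions are illustrative, not taken from the paper):

        \mathcal{L}_{\rm SG} = \frac{1}{2}\,\partial_\mu\phi\,\partial^\mu\phi
            + \frac{m^2}{\beta^2}\left(\cos\beta\phi - 1\right), \qquad
        \mathcal{L}_{\rm MT} = \bar\psi\left(i\gamma^\mu\partial_\mu - m_\psi\right)\psi
            - \frac{g}{2}\,\bar\psi\gamma^\mu\psi\,\bar\psi\gamma_\mu\psi,

        (f \star g)(x) = f(x)\,\exp\!\left(\frac{i}{2}\,\theta^{\mu\nu}\,
            \overleftarrow{\partial}_\mu \overrightarrow{\partial}_\nu\right) g(x).

    In Coleman's original (commutative) correspondence the two couplings are related by \beta^2/4\pi = 1/(1 + g/\pi), which is the kind of strong-weak map the NC construction is meant to extend.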

    Semiparametric Cross Entropy for rare-event simulation

    The Cross Entropy method is a well-known adaptive importance sampling method for rare-event probability estimation, which requires estimating an optimal importance sampling density within a parametric class. In this article we estimate an optimal importance sampling density within a wider semiparametric class of distributions. We show that this semiparametric version of the Cross Entropy method frequently yields efficient estimators. We illustrate the excellent practical performance of the method with numerical experiments and show that, for the problems we consider, it typically outperforms alternative schemes by orders of magnitude.
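    As a hedged, minimal sketch of the parametric baseline that this semiparametric approach generalizes (the function names and the toy problem below are illustrative, not taken from the article), the standard Cross Entropy method for estimating P(S(X) >= gamma) with X ~ N(0, I) iterates a Gaussian importance density whose mean is refit to the elite samples:

        import numpy as np

        def ce_rare_event(S, gamma, d=2, n=10_000, rho=0.1, iters=20, seed=0):
            # Parametric CE sketch: estimate P(S(X) >= gamma) for X ~ N(0, I_d),
            # keeping the importance density inside the family N(mu, I_d).
            rng = np.random.default_rng(seed)
            mu = np.zeros(d)
            for _ in range(iters):
                X = rng.normal(mu, 1.0, size=(n, d))
                scores = S(X)
                # Elite threshold: the (1 - rho) quantile, capped at the target level.
                level = min(np.quantile(scores, 1.0 - rho), gamma)
                elite = X[scores >= level]
                mu = elite.mean(axis=0)          # CE update of the Gaussian mean
                if level >= gamma:
                    break
            # Final importance sampling estimate with likelihood ratios w = f / g.
            X = rng.normal(mu, 1.0, size=(n, d))
            scores = S(X)
            log_w = -0.5 * (X**2).sum(axis=1) + 0.5 * ((X - mu)**2).sum(axis=1)
            return np.mean((scores >= gamma) * np.exp(log_w))

        # Toy rare event: the sum of the components exceeds gamma.
        print(ce_rare_event(lambda X: X.sum(axis=1), gamma=8.0))

    The semiparametric version described in the abstract would replace the fixed Gaussian family above with a wider, partly nonparametric class of importance densities.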

    Estimator Selection: End-Performance Metric Aspects

    Recently, a framework for application-oriented optimal experiment design has been introduced. In this context, the distance of the estimated system from the true one is measured in terms of a particular end-performance metric. This treatment leads to better estimates of the unknown system than classical experiment designs based on the usual pointwise functional distances between the estimated system and the true one. Within this new framework, the system estimator is separated from the experiment design by choosing and fixing the estimation method to either a maximum likelihood (ML) approach or a Bayesian estimator such as the minimum mean square error (MMSE) estimator. Since the MMSE estimator delivers a system estimate with lower mean square error (MSE) than the ML estimator for finite-length experiments, it is usually considered the best choice in practice in signal processing and control applications. Within the application-oriented framework, a natural related question is: are there end-performance metrics for which the ML estimator outperforms the MMSE estimator when the experiment is of finite length? In this paper, we answer this question affirmatively based on a simple linear Gaussian regression example.
    Comment: arXiv admin note: substantial text overlap with arXiv:1303.428
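    A minimal numerical sketch of the kind of comparison raised at the end of the abstract (the regression setup and the end-performance metric below are illustrative assumptions, not the paper's example): for a scalar linear Gaussian regression with a Gaussian prior, the ML estimate is least squares and the MMSE estimate is the posterior mean, and both can be scored under a metric other than the parameter MSE.

        import numpy as np

        rng = np.random.default_rng(1)
        N, sigma2, tau2 = 10, 1.0, 0.25            # data length, noise variance, prior variance
        theta_true = rng.normal(0.0, np.sqrt(tau2))
        x = rng.normal(size=N)
        y = theta_true * x + rng.normal(scale=np.sqrt(sigma2), size=N)

        theta_ml = (x @ y) / (x @ x)                       # maximum likelihood (least squares)
        theta_mmse = (x @ y) / (x @ x + sigma2 / tau2)     # posterior mean under theta ~ N(0, tau2)

        # End-performance metric: prediction error at a particular test input x0,
        # rather than the mean square error of theta itself (an illustrative choice).
        x0 = 3.0
        for name, th in [("ML", theta_ml), ("MMSE", theta_mmse)]:
            print(name, abs((th - theta_true) * x0))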

    Thermal Entanglement of a Spin-1/2 Ising-Heisenberg Model on a Symmetrical Diamond Chain

    The quantum entanglement properties of a spin-1/2 Ising-Heisenberg model on a symmetrical diamond chain were analyzed. Due to the separable nature of the Ising-type exchange interactions between neighboring Heisenberg dimers, the entanglement can be calculated exactly for each individual dimer. Pairwise thermal entanglement was studied in terms of the isotropic Ising-Heisenberg model, and analytical expressions for the concurrence (as a measure of bipartite entanglement) were obtained. The effects of the external magnetic field H and of the next-nearest-neighbor interaction J_m between nodal Ising sites were considered. The ground-state structure and entanglement properties of the system were studied over a wide range of coupling constant values. Various regimes with different values of the ground-state entanglement were revealed, depending on the relation between the competing interaction strengths. Finally, some novel effects, such as the two-peak behavior of the concurrence versus temperature and the coexistence of phases with different values of magnetic entanglement, were observed.
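    As a hedged illustration of the quantity being computed (the Hamiltonian below is a generic Heisenberg dimer in a field, not the paper's specific Ising-Heisenberg chain), the Wootters concurrence of a two-qubit thermal state rho = exp(-H/T)/Z can be evaluated directly:

        import numpy as np
        from scipy.linalg import expm

        # Pauli matrices.
        sx = np.array([[0, 1], [1, 0]], dtype=complex)
        sy = np.array([[0, -1j], [1j, 0]])
        sz = np.array([[1, 0], [0, -1]], dtype=complex)
        I2 = np.eye(2)

        def concurrence(rho):
            # Wootters concurrence of a two-qubit density matrix.
            yy = np.kron(sy, sy)
            R = rho @ yy @ rho.conj() @ yy
            lam = np.sort(np.sqrt(np.abs(np.linalg.eigvals(R))))[::-1]
            return max(0.0, lam[0] - lam[1] - lam[2] - lam[3])

        def thermal_concurrence(J=1.0, h=0.5, T=0.5):
            # Heisenberg dimer H = J S1.S2 + h (S1z + S2z), with S = sigma / 2.
            H = J * (np.kron(sx, sx) + np.kron(sy, sy) + np.kron(sz, sz)) / 4 \
                + h * (np.kron(sz, I2) + np.kron(I2, sz)) / 2
            rho = expm(-H / T)
            return concurrence(rho / np.trace(rho))

        print(thermal_concurrence())

    Sweeping T or h in this sketch shows how the concurrence of a single dimer depends on temperature and field.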

    Optimal Joins Using Compact Data Structures

    Worst-case optimal join algorithms have gained a lot of attention in the database literature. Several algorithms that are optimal in the worst case are now available, and many of them have been implemented and validated in practice. However, the implementation of these algorithms often requires an enhanced indexing structure: to achieve optimality we either need to build completely new indexes, or we must populate the database with several instantiations of indexes such as B+-trees. Either way, this means spending extra storage space that may be non-negligible. We show that optimal algorithms can be obtained directly from a representation that regards the relations as point sets in variable-dimensional grids, without the need for extra storage. Our representation is a compact quadtree for the static indexes, and a dynamic quadtree sharing subtrees (which we dub a qdag) for intermediate results. We develop a compositional algorithm to process full join queries under this representation, and show that the running time of this algorithm is worst-case optimal in data complexity. Remarkably, we can extend our framework to evaluate more expressive queries from relational algebra by introducing a lazy version of qdags (lqdags). Once again, we can show that the running time of our algorithms is worst-case optimal.
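    A brief, hedged sketch of the underlying representation (independent of the paper's actual data structures and code): a binary relation over a 2^k x 2^k grid can be stored as a quadtree of its point set, and joins are built from operations such as the intersection of two such trees, which descends only into quadrants that are non-empty in both operands.

        # A quadtree leaf is True/False (occupied/empty unit cell); an internal
        # node is a 4-tuple of children in the order NW, NE, SW, SE.

        def build(points, x, y, size):
            # Quadtree for the points inside the size x size cell anchored at (x, y).
            pts = [(px, py) for (px, py) in points
                   if x <= px < x + size and y <= py < y + size]
            if not pts:
                return False
            if size == 1:
                return True
            h = size // 2
            return (build(pts, x, y, h),     build(pts, x + h, y, h),
                    build(pts, x, y + h, h), build(pts, x + h, y + h, h))

        def intersect(a, b):
            # AND of two same-size quadtrees: an empty operand kills the branch,
            # a fully occupied leaf passes the other operand through.
            if a is False or b is False:
                return False
            if a is True or b is True:
                return a if b is True else b
            kids = tuple(intersect(ca, cb) for ca, cb in zip(a, b))
            return False if all(k is False for k in kids) else kids

        # Two binary relations R(x, y) and S(x, y) on a 4 x 4 grid.
        R = build([(0, 0), (1, 2), (3, 3)], 0, 0, 4)
        S = build([(1, 2), (3, 3), (2, 1)], 0, 0, 4)
        print(intersect(R, S))

    The qdags and lqdags of the paper add subtree sharing and lazy evaluation on top of this kind of grid representation to keep intermediate results compact.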