    The Competitive Dynamics of Entrepreneurial Market Entry

    Research on general market entry usually focuses on large enterprises; often, however, small entrants can alter the competitive dynamics of an industry. This volume brings together the most prominent thought leaders and the best research on asymmetric entrant-incumbent dynamics. The ideas presented offer a more nuanced perspective on how, when, where, and with what consequences small, single-product firms enter markets dominated by large, multiproduct and multimarket incumbents. Scholars and students in entrepreneurship, strategy, international business, and related fields will find this excellent collection of key published and original material illuminating.

    Metallic characteristics in superlattices composed of insulators, NdMnO3/SrMnO3/LaMnO3

    We report on the electronic properties of superlattices composed of three different antiferromagnetic insulators, NdMnO3/SrMnO3/LaMnO3, grown on SrTiO3 substrates. Photoemission spectra obtained by tuning the x-ray energy to the Mn 2p -> 3d edge show a Fermi cut-off, indicating metallic behavior originating mainly from Mn e_g electrons. Furthermore, the density of states near the Fermi energy and the magnetization obey a similar temperature dependence, suggesting a correlation between the spin and charge degrees of freedom at the interfaces of these oxides.

    Tensor Networks for Dimensionality Reduction and Large-Scale Optimizations. Part 2: Applications and Future Perspectives

    Part 2 of this monograph builds on the introduction to tensor networks and their operations presented in Part 1. It focuses on tensor network models for super-compressed higher-order representation of data/parameters and related cost functions, while providing an outline of their applications in machine learning and data analytics. A particular emphasis is on the tensor train (TT) and Hierarchical Tucker (HT) decompositions, and their physically meaningful interpretations, which reflect the scalability of the tensor network approach. Through a graphical approach, we also elucidate how, by virtue of the underlying low-rank tensor approximations and sophisticated contractions of core tensors, tensor networks can perform distributed computations on otherwise prohibitively large volumes of data/parameters, thereby alleviating or even eliminating the curse of dimensionality. The usefulness of this concept is illustrated across a number of applied areas, including generalized regression and classification (support tensor machines, canonical correlation analysis, higher-order partial least squares), generalized eigenvalue decomposition, Riemannian optimization, and the optimization of deep neural networks. Part 1 and Part 2 of this work can be used either as stand-alone texts or as a conjoint comprehensive review of the exciting field of low-rank tensor networks and tensor decompositions. Comment: 232 pages.
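
    As a minimal sketch of the tensor train idea this abstract describes, the following NumPy code (hypothetical helpers tt_svd and tt_reconstruct, not taken from the monograph) decomposes a dense tensor into TT cores via sequential truncated SVDs, so that storage grows with the TT ranks rather than exponentially in the number of modes:

    import numpy as np

    def tt_svd(tensor, max_rank):
        # Decompose `tensor` into 3-way TT cores G_k of shape (r_{k-1}, n_k, r_k),
        # truncating each intermediate SVD at `max_rank`.
        dims = tensor.shape
        cores, rank_prev = [], 1
        mat = tensor.reshape(dims[0], -1)  # unfold along the first mode
        for k in range(len(dims) - 1):
            u, s, vt = np.linalg.svd(mat, full_matrices=False)
            rank = min(max_rank, len(s))
            cores.append(u[:, :rank].reshape(rank_prev, dims[k], rank))
            # Carry the truncated remainder forward, unfolded for the next mode.
            mat = (s[:rank, None] * vt[:rank]).reshape(rank * dims[k + 1], -1)
            rank_prev = rank
        cores.append(mat.reshape(rank_prev, dims[-1], 1))
        return cores

    def tt_reconstruct(cores):
        # Contract the TT cores back into a dense tensor.
        result = cores[0]
        for core in cores[1:]:
            result = np.tensordot(result, core, axes=([-1], [0]))
        return result.reshape([c.shape[1] for c in cores])

    # Usage: a rank-1 4-way tensor is recovered essentially exactly at TT rank 2.
    rng = np.random.default_rng(0)
    t = np.einsum('i,j,k,l->ijkl', *(rng.standard_normal(6) for _ in range(4)))
    cores = tt_svd(t, max_rank=2)
    print(np.linalg.norm(tt_reconstruct(cores) - t) / np.linalg.norm(t))

    The final line prints the relative reconstruction error; for this rank-1 test tensor it is at machine precision, illustrating how the TT format exploits low-rank structure to collapse storage and computation costs.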
