51 research outputs found

    Numerical methods in Tensor Networks

    In many applications that deal with high-dimensional data, it is important not to store the high-dimensional object itself but a data-sparse representation of it, in order to reduce the storage and computational complexity. There is a general scheme for representing tensors as sums of elementary tensors, where the summation structure is defined by a graph/network. This scheme generalizes commonly used approaches for representing large amounts of numerical data (which can be interpreted as a high-dimensional object) by sums of elementary tensors. The classification not only distinguishes between elementary and non-elementary tensors but also describes the number of terms needed to represent an object of the tensor space. This classification is referred to as a tensor network (format). This work uses the tensor-network-based approach and describes non-linear block Gauss-Seidel methods (ALS and DMRG) in the context of the general tensor network framework. Another contribution of the thesis is the general conversion between different tensor formats: we can efficiently change the underlying graph topology of a given tensor representation while exploiting the similarities (if present) of the original and the desired structure. This is an important feature when only minor structural changes are required. In all approximation cases involving iterative methods, it is crucial to find and use a proper initial guess. For linear iteration schemes, a good initial guess decreases the number of iteration steps needed to reach a certain accuracy, but it does not change the approximation result; for non-linear iteration schemes, the approximation result may depend on the initial guess. This work introduces a method that successively creates an initial guess and thereby improves some approximation results. The algorithm is based on successive rank-1 increments for the r-term format.
    There are still open questions about how to find the optimal tensor format for a given problem (e.g. with respect to storage, operations, etc.). For instance, when a physical background is given, it may be efficient to use this knowledge to create a good network structure. There is, however, no guarantee that a better (with respect to the problem) representation structure does not exist.
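The successive rank-1 increment idea can be sketched as a greedy alternating-least-squares (ALS) scheme: fit a single rank-1 term to the current residual, subtract it, and repeat. The sketch below is an illustrative 3-way example under that assumption, not the thesis's actual algorithm; the function names are hypothetical.

```python
import numpy as np

def rank1_als(T, iters=50):
    """Fit a single rank-1 term a (x) b (x) c to a 3-way tensor via ALS."""
    rng = np.random.default_rng(0)
    a, b, c = (rng.random(n) for n in T.shape)
    for _ in range(iters):
        # Each block update solves its linear least-squares subproblem exactly.
        a = np.einsum('ijk,j,k->i', T, b, c) / ((b @ b) * (c @ c))
        b = np.einsum('ijk,i,k->j', T, a, c) / ((a @ a) * (c @ c))
        c = np.einsum('ijk,i,j->k', T, a, b) / ((a @ a) * (b @ b))
    return a, b, c

def successive_rank1(T, r):
    """Build an r-term approximation by repeatedly fitting a rank-1 term
    to the residual -- a greedy initial guess, not a best rank-r fit."""
    residual = T.copy()
    approx = np.zeros_like(T)
    for _ in range(r):
        a, b, c = rank1_als(residual)
        term = np.einsum('i,j,k->ijk', a, b, c)
        approx += term
        residual -= term
    return approx
```

The r-term approximation produced this way is typically used only as a starting point for a subsequent optimization over all terms simultaneously, since greedy rank-1 deflation is not optimal for tensors of order three and higher.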

    07021 Abstracts Collection -- Symmetric Cryptography

    From .. to .., the Dagstuhl Seminar 07021 ``Symmetric Cryptography'' was held at the International Conference and Research Center (IBFI), Schloss Dagstuhl. During the seminar, several participants presented their current research, and ongoing work and open problems were discussed. Abstracts of the presentations given during the seminar, as well as abstracts of seminar results and ideas, are put together in this paper. The first section describes the seminar topics and goals in general. Links to extended abstracts or full papers are provided, if available.

    09031 Abstracts Collection -- Symmetric Cryptography

    From 11.01.09 to 16.01.09, the Dagstuhl Seminar 09031 ``Symmetric Cryptography'' was held at Schloss Dagstuhl -- Leibniz Center for Informatics. During the seminar, several participants presented their current research, and ongoing work and open problems were discussed. Abstracts of the presentations given during the seminar, as well as abstracts of seminar results and ideas, are put together in this paper. The first section describes the seminar topics and goals in general. Links to extended abstracts or full papers are provided, if available.

    Optimization problems in contracted tensor networks

    We discuss the calculus of variations in tensor representations, with a special focus on tensor networks, and apply it to functionals of practical interest. The survey provides all necessary ingredients for applying minimization methods in a general setting. The important cases of target functionals that are linear and quadratic with respect to the tensor product are discussed, and combinations of these functionals are presented in detail. As an example, we consider representation rank compression in tensor networks. For the numerical treatment, we use the non-linear block Gauss-Seidel method and demonstrate its rate of convergence in numerical tests.
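As a toy illustration of the block Gauss-Seidel idea (not the paper's implementation): for a quadratic target functional J(x) = 1/2 x^T A x - b^T x with A symmetric positive definite, each block subproblem is itself quadratic and is solved exactly by a small linear solve; cycling through the blocks monotonically decreases J. The block partition and function name below are assumptions for the sketch.

```python
import numpy as np

def block_gauss_seidel(A, b, blocks, sweeps=100):
    """Minimize J(x) = 0.5 * x^T A x - b^T x (A symmetric positive
    definite) by exactly solving one block subproblem at a time."""
    x = np.zeros_like(b, dtype=float)
    for _ in range(sweeps):
        for blk in blocks:
            # Residual with the current block's own contribution removed,
            # so the block update sees only the other (frozen) blocks.
            r = b - A @ x + A[:, blk] @ x[blk]
            x[blk] = np.linalg.solve(A[np.ix_(blk, blk)], r[blk])
    return x
```

In the tensor network setting the blocks are the node tensors of the network, the subproblems are the ALS/DMRG micro-steps, and the overall iteration is non-linear because the representation map is multilinear in the blocks.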

    Correlated Multimodal Imaging in Life Sciences:Expanding the Biomedical Horizon

    The frontiers of bioimaging are currently being pushed toward the integration and correlation of several modalities to tackle biomedical research questions holistically and across multiple scales. Correlated Multimodal Imaging (CMI) gathers information about exactly the same specimen with two or more complementary modalities that, in combination, create a composite and complementary view of the sample (including insights into structure, function, dynamics and molecular composition). CMI makes it possible to describe biomedical processes within their overall spatio-temporal context and to gain a mechanistic understanding of cells, tissues, diseases or organisms by untangling their molecular mechanisms within their native environment. The two best-established CMI implementations for small animals and model organisms are hardware-fused platforms in preclinical imaging (Hybrid Imaging) and Correlated Light and Electron Microscopy (CLEM) in biological imaging. Although the merits of Preclinical Hybrid Imaging (PHI) and CLEM are well-established, both approaches would benefit from standardization of protocols, ontologies and data handling, and from the development of optimized and advanced implementations. Specifically, CMI pipelines that aim at bridging preclinical and biological imaging beyond CLEM and PHI are rare but bear great potential to substantially advance both bioimaging and biomedical research. CMI faces three main

    Demo: Visual Programming for the Semantic Desktop with Konduit

    In this demo description, we present Konduit, a desktop-based platform for visual programming with RDF data. Based on the idea of the semantic desktop, non-technical users can create, manipulate and mash up RDF data with Konduit, and thus generate simple applications or workflows aimed at simplifying their everyday work by automating repetitive tasks. The platform makes it possible to combine data from both Web and desktop and to integrate it with existing desktop functionality, thus bringing us closer to a convergence of Web and desktop.

    enVision: An integrated approach towards Semantic Authoring

    Writing is one of the most common activities of a researcher. And since the writing process is not only about writing but also about researching the background of the domain and collecting the necessary references, we propose an integrated writing environment that has semantic web technologies as its foundation and encapsulates all the necessities involved in this process.
