36 research outputs found

    Learning Parallel Computations with ParaLab

    Full text link
    In this paper, we present the ParaLab teachware system, which can be used for learning parallel computation methods. ParaLab provides tools for simulating multiprocessor computational systems with various network topologies, for carrying out computational experiments in simulation mode, and for evaluating the efficiency of parallel computation methods. The key feature of the system is the visual presentation of the parallel computations taking place in the computational experiments. ParaLab can be used for laboratory training within various teaching courses in the field of parallel, distributed, and supercomputer computing.

    Propagating large open quantum systems towards their steady states: cluster implementation of the time-evolving block decimation scheme

    Full text link
    Many-body quantum systems are subject to the curse of dimensionality: the dimension of the Hilbert space $\mathcal{H}$ in which these systems live grows exponentially with the system 'size' (the number of its components, or 'bodies'). This means that, in order to specify a state of a quantum system, we need a description whose length grows exponentially with the system size. However, for some systems it is possible to escape the curse by using low-rank tensor approximations, known as the 'matrix-product state/operator (MPS/O) representation' in the quantum community and as the 'tensor-train decomposition' among applied mathematicians. Motivated by recent advances in computational quantum physics, we consider chains of $N$ spins coupled by nearest-neighbor interactions. The spins are subjected to an action coming from the environment. Spatially disordered interactions and environment-induced decoherence drive the systems into non-trivial asymptotic states. The dissipative evolution is modeled with a Markovian master equation in the Lindblad form. By implementing the MPO technique and propagating system states with the time-evolving block decimation (TEBD) scheme (which allows one to keep the length of the state description fixed), it is in principle possible to reach the corresponding steady states. We propose and realize a cluster implementation of this idea. The implementation on four nodes allowed us to resolve the steady states of model systems with $N = 128$ spins.
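
    The tensor-train / MPS idea mentioned in this abstract can be illustrated with a minimal NumPy sketch (an editorial illustration, not the authors' cluster code; the helper names state_to_mps and mps_to_state are hypothetical): a state vector of n spins is split into a chain of small three-index tensors by successive SVDs, and truncating each SVD to a fixed bond dimension chi keeps the length of the description bounded at the cost of accuracy.

```python
# Minimal sketch of the MPS / tensor-train decomposition of a spin-chain state.
# Assumptions: pure states, qubit (d = 2) sites, plain NumPy, no cluster parallelism.
import numpy as np

def state_to_mps(psi, n_sites, chi_max, d=2):
    """Decompose a d**n_sites state vector into an MPS with bond dimension <= chi_max."""
    cores = []
    rest = psi.reshape(1, -1)              # (left_bond, remaining amplitudes)
    for _ in range(n_sites - 1):
        left_bond = rest.shape[0]
        rest = rest.reshape(left_bond * d, -1)
        u, s, vh = np.linalg.svd(rest, full_matrices=False)
        chi = min(chi_max, len(s))         # truncation: this is where the curse is escaped
        u, s, vh = u[:, :chi], s[:chi], vh[:chi, :]
        cores.append(u.reshape(left_bond, d, chi))
        rest = np.diag(s) @ vh             # push the remainder of the state to the right
    cores.append(rest.reshape(rest.shape[0], d, 1))
    return cores

def mps_to_state(cores):
    """Contract the MPS back into a full state vector (only feasible for small n)."""
    psi = cores[0]
    for core in cores[1:]:
        psi = np.tensordot(psi, core, axes=([-1], [0]))
    return psi.reshape(-1)

if __name__ == "__main__":
    n = 10
    psi = np.random.randn(2**n) + 1j * np.random.randn(2**n)
    psi /= np.linalg.norm(psi)
    for chi in (2, 8, 32):
        approx = mps_to_state(state_to_mps(psi, n, chi))
        print(f"chi={chi:3d}  overlap={abs(np.vdot(psi, approx)):.4f}")
```

    For a generic random state of 10 qubits, chi = 32 equals the half-chain dimension, so the reconstruction is exact; smaller chi trades accuracy for a fixed-size description, which is the property TEBD exploits when propagating states.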

    Localization in periodically modulated speckle potentials

    Full text link
    Disorder in a 1D quantum lattice induces Anderson localization of the eigenstates and drastically alters the transport properties of the lattice. In the original Anderson model, the addition of a periodic driving increases, within a certain range of driving frequencies and amplitudes, the localization length of the emerging Floquet eigenstates. We go beyond the uncorrelated disorder case and address the experimentally relevant situation in which spatial correlations are present in the lattice potential. Their presence creates an effective mobility edge in the energy spectrum of the system. We find that slow driving leads to resonant hybridization of the Floquet states, increasing both the participation numbers and the effective widths of the states in the strongly localized band while decreasing these characteristics for the states in the quasi-extended band. Strong driving homogenizes the bands, so that the Floquet states lose compactness and tend to be spatially smeared. In the basis of the stationary Hamiltonian, these states remain localized in terms of participation number but become delocalized and spectrum-wide in terms of their effective widths. Signatures of thermalization are also observed.
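
    As a generic illustration of the participation number used here as a localization measure (a sketch under simplified assumptions, not the model of the paper: the correlated potential is merely mimicked by smoothing uncorrelated on-site disorder), one can diagonalize a 1D tight-binding lattice and compute PN = 1 / sum_i |psi_i|^4 for each eigenstate; small PN indicates a localized state, PN comparable to the lattice size an extended one.

```python
# Sketch: participation numbers of eigenstates of a 1D lattice with a smoothed
# (spatially correlated) random potential. Illustrative only.
import numpy as np

def participation_numbers(n_sites=500, disorder=1.0, hopping=1.0, corr_len=5, seed=0):
    rng = np.random.default_rng(seed)
    # Uncorrelated disorder, smoothed with a Gaussian kernel to introduce spatial correlations.
    raw = rng.uniform(-disorder, disorder, n_sites)
    x = np.arange(-3 * corr_len, 3 * corr_len + 1)
    kernel = np.exp(-x**2 / (2.0 * corr_len**2))
    onsite = np.convolve(raw, kernel / kernel.sum(), mode="same")
    # Tight-binding Hamiltonian: correlated on-site energies + nearest-neighbor hopping.
    h = (np.diag(onsite)
         + np.diag(-hopping * np.ones(n_sites - 1), 1)
         + np.diag(-hopping * np.ones(n_sites - 1), -1))
    energies, states = np.linalg.eigh(h)            # columns of `states` are eigenstates
    pn = 1.0 / np.sum(np.abs(states)**4, axis=0)    # participation number per eigenstate
    return energies, pn

if __name__ == "__main__":
    e, pn = participation_numbers()
    print("most localized : E=%+.3f  PN=%6.1f" % (e[np.argmin(pn)], pn.min()))
    print("most extended  : E=%+.3f  PN=%6.1f" % (e[np.argmax(pn)], pn.max()))
```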

    Unfolding quantum master equation into a system of real-valued equations: computationally effective expansion over the basis of $SU(N)$ generators

    Full text link
    Dynamics of an open $N$-state quantum system is typically modeled with a Markovian master equation describing the evolution of the system's density operator. By using the generators of the $SU(N)$ group as a basis, the density operator can be transformed into a real-valued 'Bloch vector'. The Lindbladian, a super-operator which serves as the generator of the evolution in the master equation, can be expanded over the same basis and recast in the form of a real matrix. Together, these expansions result in a non-homogeneous system of $N^2-1$ real-valued linear differential equations for the Bloch vector. One can then, e.g., implement a high-performance parallel simplex algorithm to find a solution of this system which guarantees exact preservation of the norm and Hermiticity of the density matrix. However, when performed in a straightforward way, the expansion turns out to be an operation of time complexity $\mathcal{O}(N^{10})$. The complexity can be reduced when the number of dissipative operators is independent of $N$, which is often the case for physically meaningful models. Here we present an algorithm to transform the quantum master equation into a system of real-valued differential equations and propagate it forward in time. By using a scalable model, we evaluate the computational efficiency of the algorithm and demonstrate that it is possible to handle a model system with $N = 10^3$ states on a single node of a computer cluster.
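
    The basic expansion step described in this abstract can be sketched as follows (an editorial example, not the paper's optimized algorithm; the function names su_n_generators and bloch_vector are hypothetical): build the generalized Gell-Mann generators of $SU(N)$, normalized so that Tr(G_a G_b) = 2 delta_ab, and expand a density matrix rho into the real Bloch vector b_a = Tr(rho G_a), from which rho = I/N + (1/2) sum_a b_a G_a.

```python
# Sketch: generalized Gell-Mann generators of SU(N) and the density-matrix -> Bloch-vector map.
# Brute-force construction, O(N^4) memory; the paper's point is avoiding such naive scaling.
import numpy as np

def su_n_generators(n):
    """Return the N^2 - 1 traceless Hermitian generators with Tr(G_a G_b) = 2 delta_ab."""
    gens = []
    for j in range(n):                      # symmetric and antisymmetric off-diagonal generators
        for k in range(j + 1, n):
            sym = np.zeros((n, n), dtype=complex)
            sym[j, k] = sym[k, j] = 1.0
            anti = np.zeros((n, n), dtype=complex)
            anti[j, k], anti[k, j] = -1j, 1j
            gens += [sym, anti]
    for l in range(1, n):                   # diagonal generators
        diag = np.zeros((n, n), dtype=complex)
        diag[:l, :l] = np.eye(l)
        diag[l, l] = -l
        gens.append(np.sqrt(2.0 / (l * (l + 1))) * diag)
    return gens

def bloch_vector(rho, gens):
    """Real-valued expansion coefficients b_a = Tr(rho G_a)."""
    return np.array([np.trace(rho @ g).real for g in gens])

if __name__ == "__main__":
    n = 4
    gens = su_n_generators(n)                                  # N^2 - 1 = 15 generators
    a = np.random.randn(n, n) + 1j * np.random.randn(n, n)
    rho = a @ a.conj().T                                       # random positive Hermitian matrix
    rho /= np.trace(rho).real                                  # normalize to a density matrix
    b = bloch_vector(rho, gens)
    rho_rec = np.eye(n) / n + 0.5 * sum(bi * g for bi, g in zip(b, gens))
    print("max reconstruction error:", np.max(np.abs(rho - rho_rec)))
```

    The reconstruction error is at machine precision, confirming that the identity plus the $N^2-1$ generators form a complete basis for Hermitian matrices; the master equation then becomes a real linear system for the vector b.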

    Improved vectorization of OpenCV algorithms for RISC-V CPUs

    Full text link
    The development of an open and free RISC-V architecture is of great interest for a wide range of areas, including high-performance computing and numerical simulation in mathematics, physics, chemistry, and other problem domains. In this paper, we discuss the possibilities of accelerating computations on available RISC-V processors by improving the vectorization of several computer vision and machine learning algorithms in the widely used OpenCV library. It is shown that improved vectorization speeds up computations on existing prototypes of RISC-V devices by tens of percent.

    Predictive data analysis in the field of CRM

    No full text
    This master's thesis is devoted to predictive data mining in the field of CRM. The work is divided into two parts: the first presents the theoretical background of the problem together with a survey of the current state of the field; the second is the analytical part, in which an analytical question of a real company is answered by means of predictive data mining. The programming language Python, together with the appropriate data-processing libraries, was used to solve the stated question. For the analytical question, the motivation is described (the reason why finding a solution is beneficial for the company), together with an evaluation of the results achieved.

    The analysis of the data from online advertising for the needs of a particular business

    No full text
    This bachelor's thesis analyzes data from the online advertising of the author's own company. The data were exported from the company's CRM system and then analyzed using the academic tool LISp-Miner, which is designed for knowledge discovery in databases. The work is divided into two parts: a theoretical part, which describes the field of knowledge discovery in databases and the chosen tool, and a practical part, in which data preprocessing, modeling, and evaluation of the results are carried out. The main goal of this bachelor's thesis is to find answers to the analytical questions that interest the author as the owner of the data.