
    Thermodynamic length in open quantum systems

    The dissipation generated during a quasistatic thermodynamic process can be characterised by introducing a metric on the space of Gibbs states, in such a way that minimally-dissipating protocols correspond to geodesic trajectories. Here, we show how to generalize this approach to open quantum systems by finding the thermodynamic metric associated with a given Lindblad master equation. The obtained metric can be understood as a perturbation over the background geometry of equilibrium Gibbs states, which is induced by the Kubo-Mori-Bogoliubov (KMB) inner product. We illustrate this construction on two paradigmatic examples: an Ising chain and a two-level system interacting with a bosonic bath with different spectral densities. Comment: 22 pages, 3 figures. v5: minor corrections, accepted in Quantum
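    As an illustrative aside (not drawn from the paper itself), the standard thermodynamic-geometry setup behind such results can be summarised as follows: for a slow protocol lambda(t) driving the control parameters of a Gibbs state, the dissipated work is governed by a metric g on parameter space, the thermodynamic length is the corresponding path length, and the minimally-dissipating path is a geodesic of g; in the quantum equilibrium case the metric is the one induced by the KMB inner product, while the exact metric derived in the paper depends on the specific Lindbladian and perturbs this background form.

        W_{\mathrm{diss}} \;\simeq\; \int_0^{\tau} \dot{\lambda}^{T}\, g(\lambda)\, \dot{\lambda}\,\mathrm{d}t,
        \qquad
        \mathcal{L} \;=\; \int_0^{\tau} \sqrt{\dot{\lambda}^{T} g(\lambda)\, \dot{\lambda}}\;\mathrm{d}t,
        \qquad
        W_{\mathrm{diss}} \;\geq\; \frac{\mathcal{L}^{2}}{\tau},

        \langle A, B\rangle_{\mathrm{KMB}} \;=\; \int_0^{1} \mathrm{Tr}\!\left[\rho^{s} A^{\dagger}\, \rho^{1-s} B\right]\mathrm{d}s,
        \qquad
        \rho \;=\; \frac{e^{-\beta H(\lambda)}}{Z(\lambda)}.

    The bound on the dissipated work follows from the Cauchy-Schwarz inequality and is saturated by geodesics traversed at constant speed.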

    A study into the feasibility of generic programming for the construction of complex software

    A high degree of abstraction and capacity for reuse can be obtained in software design through the use of Generic Programming (GP) concepts. Despite the widespread use of GP in computing, some areas, such as the construction of generic component libraries as the skeleton for complex computing systems with extensive domains, have been neglected. Here we consider the design of a library of generic components based on the GP paradigm and implemented in Java. Our aim is to investigate the feasibility of using the GP paradigm in the construction of complex computer systems where the management of users interacting with the system and the optimisation of the system’s resources are required.
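    As a minimal sketch of the kind of generic component such a library might contain (the class and method names are illustrative assumptions, not the library described in the paper), a Java type-parameterised resource pool shows how GP separates the reusable skeleton from the concrete resource type:

        import java.util.ArrayDeque;
        import java.util.Deque;
        import java.util.function.Supplier;

        /** Hypothetical generic component: a reusable pool of resources of any type T. */
        public class ResourcePool<T> {
            private final Deque<T> idle = new ArrayDeque<>();
            private final Supplier<T> factory;

            public ResourcePool(Supplier<T> factory) {
                this.factory = factory;
            }

            /** Hand out an idle resource, creating a new one if none is available. */
            public synchronized T acquire() {
                return idle.isEmpty() ? factory.get() : idle.pop();
            }

            /** Return a resource to the pool so later requests can reuse it. */
            public synchronized void release(T resource) {
                idle.push(resource);
            }
        }

    The same component can then manage user sessions, connections or worker buffers without modification, e.g. new ResourcePool<>(StringBuilder::new), which is the kind of reuse the GP paradigm aims at.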

    Distributed-based massive processing of activity logs for efficient user modeling in a Virtual Campus

    This paper reports on a multi-fold approach to building user models based on the identification of navigation patterns in a virtual campus, allowing the campus’ usability to be adapted to learners’ actual needs and thus greatly stimulating the learning experience. However, user modeling in this context implies constant processing and analysis of user interaction data during long-term learning activities, which produces huge amounts of valuable data typically stored in server log files. Given the very large size of the log files generated daily, massive processing is a prerequisite for extracting useful information from them. To this end, this work first studies the viability of processing large log data files of a real Virtual Campus using different distributed infrastructures. More precisely, we study the time performance of massive processing of daily log files implemented following the master-slave paradigm and evaluated on Cluster Computing and PlanetLab platforms. The study reveals the complexity and challenges of massive processing in the big data era, such as the need to carefully tune the size of the log data chunks processed at the slave nodes, as well as the bottleneck that arises in truly geographically distributed infrastructures due to the overhead caused by the communication time between the master and slave nodes. An application of the massive processing approach is then presented, in which the log data are processed and stored in a well-structured format. We show how to extract knowledge from the processed log data by using the WEKA framework for data mining, demonstrating its usefulness for effectively building user models in terms of identifying interesting navigation patterns of on-line learners. The study is motivated and conducted in the context of the actual data logs of the Virtual Campus of the Open University of Catalonia.
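    As a rough illustration of the master-slave chunking scheme discussed above (the file name, chunk size and per-chunk computation are assumptions made for the sketch, not the authors' implementation), a master can split a daily log into fixed-size chunks, dispatch them to slave workers and aggregate the partial results:

        import java.nio.file.Files;
        import java.nio.file.Path;
        import java.util.ArrayList;
        import java.util.List;
        import java.util.concurrent.ExecutorService;
        import java.util.concurrent.Executors;
        import java.util.concurrent.Future;

        /** Illustrative master: split a daily log into chunks and process them in parallel. */
        public class LogMaster {
            // The chunk size is the tuning parameter the study highlights.
            static final int CHUNK_LINES = 100_000;

            public static void main(String[] args) throws Exception {
                List<String> lines = Files.readAllLines(Path.of("campus_daily.log")); // hypothetical file
                ExecutorService slaves = Executors.newFixedThreadPool(8);
                List<Future<Long>> partials = new ArrayList<>();
                for (int i = 0; i < lines.size(); i += CHUNK_LINES) {
                    List<String> chunk = lines.subList(i, Math.min(i + CHUNK_LINES, lines.size()));
                    // Each "slave" counts the login entries in its chunk (placeholder analysis).
                    partials.add(slaves.submit(() -> chunk.stream().filter(l -> l.contains("LOGIN")).count()));
                }
                long total = 0;
                for (Future<Long> f : partials) total += f.get(); // master aggregates slave results
                System.out.println("sessions found: " + total);
                slaves.shutdown();
            }
        }

    In a truly geographically distributed deployment the local thread pool is replaced by remote nodes, and the master-slave communication time becomes the bottleneck the study measures.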

    Lectures on dynamical models for quantum measurements

    In textbooks, ideal quantum measurements are described in terms of the tested system only, by the collapse postulate and Born's rule. This level of description offers a rather flexible position for the interpretation of quantum mechanics. Here we analyse an ideal measurement as a process of interaction between the tested system S and an apparatus A, so as to derive the properties postulated in textbooks. We thus consider within standard quantum mechanics the measurement of a quantum spin component $\hat s_z$ by an apparatus A consisting of a magnet coupled to a bath. We first consider the evolution of the density operator of S+A describing a large set of runs of the measurement process. The approach describes the disappearance of the off-diagonal terms ("truncation") of the density matrix as a physical effect due to A, while the registration of the outcome has classical features due to the large size of the pointer variable, the magnetisation. A quantum ambiguity implies that the density matrix at the final time can be decomposed on many bases, not only that of the measurement. This quantum oddity prevents one from connecting individual outcomes to measurements, a difficulty known as the "measurement problem". It is shown that this too is circumvented by the apparatus, since the evolution over a small time interval erases all decompositions except the one on the measurement basis. Once the outcome of individual events can be derived from quantum theory, the so-called "collapse of the wave function" or "reduction of the state" appears as the result of a selection of runs among the original large set. Hence nothing more than standard quantum mechanics is needed to explain the features of measurements. The employed statistical formulation is advocated for the teaching of quantum theory. Comment: 43 pages, 5 figures. Lectures given in the "Advanced School on Quantum Foundations and Open Quantum Systems", Joao Pessoa, Brazil, summer 2012. To appear in the proceedings and in IJMP
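    A compact way to state the "truncation" step described above (a schematic summary, not the lectures' dynamical derivation): writing the initial state of system plus apparatus as a product, the interaction suppresses the off-diagonal blocks in the measurement basis and correlates each outcome with a distinct pointer (magnetisation) state of A,

        \rho(0) \;=\; \Big(\sum_{i,j} c_i\, c_j^{*}\, |i\rangle\langle j|\Big) \otimes \mathcal{R}_A(0)
        \;\longrightarrow\;
        \rho(t_{\mathrm{f}}) \;=\; \sum_{i} p_i\, |i\rangle\langle i| \otimes \mathcal{R}_i,
        \qquad p_i = |c_i|^{2},

    where the |i> are eigenstates of $\hat s_z$ and \mathcal{R}_i is the apparatus state whose magnetisation registers outcome i. Born's rule appears as the statistics p_i over the large set of runs, while assigning an outcome to an individual run requires the additional selection of runs discussed in the lectures.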