
    Classical information driven quantum dot thermal machines

    We analyze the transient response of quantum dot thermal machines that can be driven by the hyperfine interaction acting as a source of classical information. Our setup comprises a quantum dot coupled to two contacts that drive heat flow, while the dot is also coupled to a nuclear spin bath. The quantum dot thermal machines operate both as batteries and as engines, depending on the parameter range. The electrons in the quantum dot interact with the nuclear spins via hyperfine spin-flip processes, as typically seen in solid-state systems such as GaAs quantum dots. The hyperfine interaction in such systems, often treated as a deterrent to quantum information processing, can instead be regarded favorably as a driving agent for classical information flow into a heat-engine setup. We relate this information flow to Landauer's erasure of the nuclear spin bath, leading to battery operation. We further demonstrate that the setup can perform as a transient power source even under a voltage bias across the dot. Focusing on the transient thermoelectric operation, our analysis clearly indicates the role of Landauer's erasure in delivering a higher output power than a conventional quantum dot thermoelectric setup, with an efficiency greater than that of an identical Carnot cycle in steady state, consistent with recently proposed bounds on efficiency for systems subject to a feedback controller. The role of nuclear spin relaxation processes in these aspects is also studied. Finally, we introduce Coulomb interaction in the dot and analyze the transient thermoelectric response of the system. Our results elaborate on the effective use of otherwise undesirable scattering processes as a non-equilibrium source of Shannon information flow in thermal machines, and on the possibilities that may arise from the use of a quantum information source.
    Comment: 10 pages, 7 figures
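    As background for the Landauer-erasure argument above, two textbook bounds frame the abstract's claims; the sketch below states them in standard form and is not the paper's own derivation:

```latex
% Landauer's principle: erasing one bit of information in a bath at
% temperature T dissipates at least
\[
  Q_{\text{diss}} \;\ge\; k_B T \ln 2 .
\]
% Generalized second law with feedback (Sagawa--Ueda): the work
% extracted by a controller that acquires mutual information I about
% the system is bounded by
\[
  \langle W_{\text{ext}} \rangle \;\le\; -\Delta F + k_B T\, I ,
\]
% so an information-driven engine can exceed the bare Carnot
% efficiency without violating the second law, consistent with the
% feedback-controller bounds the abstract cites.
```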

    Cooperative Data Exchange with Weighted Cost based on d-Basis Construction

    We consider the cooperative data exchange problem, in which the nodes are fully connected with each other. Each node initially holds only a subset of the K packets making up a file and wants to recover the whole file. Node i can make a broadcast transmission, which incurs cost w_i and is received by all other nodes. The goal is to minimize the total cost of the transmissions that the nodes send, also called the weighted cost. Following the same idea as our previous work, which provided a method based on d-Basis construction to solve the cooperative data exchange problem without weighted cost, we present a modified method to solve the cooperative data exchange problem with weighted cost. We present a polynomial-time deterministic algorithm to compute the minimum weighted cost and to determine the rate vector and the packets that should be used to generate each transmission. By leveraging the connection to Maximum Distance Separable codes, the coefficients of the linear combinations of the optimal coding scheme can be generated efficiently. Our algorithm has significantly lower complexity than the state of the art. In particular, we prove that the minimum weighted cost is a convex function of the total number of transmissions in the integer-rate case.
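    To make the objective concrete, here is a brute-force toy illustration of the weighted-cost minimization, using the standard cut-set feasibility condition for cooperative data exchange. It is emphatically not the paper's polynomial-time d-Basis algorithm, and the function name `min_weighted_cost` is illustrative:

```python
# Toy sketch: brute-force minimum weighted cost for cooperative data
# exchange. A rate vector r is feasible iff, for every nonempty proper
# subset S of nodes, the nodes outside S transmit enough coded packets
# to cover everything S is jointly missing:
#     sum_{i not in S} r_i >= K - |union of packets held by S|.
from itertools import combinations, product

def min_weighted_cost(packets, K, w):
    """packets[i]: set of packet indices node i holds; w[i]: cost per
    broadcast of node i. Returns (min weighted cost, rate vector)."""
    n = len(packets)
    best = (float("inf"), None)
    for r in product(range(K + 1), repeat=n):        # candidate integer rates
        feasible = True
        for size in range(1, n):                     # proper subsets S
            for S in combinations(range(n), size):
                held = set().union(*(packets[i] for i in S))
                need = K - len(held)
                if sum(r[i] for i in range(n) if i not in S) < need:
                    feasible = False
                    break
            if not feasible:
                break
        if feasible:
            cost = sum(wi * ri for wi, ri in zip(w, r))
            if cost < best[0]:
                best = (cost, r)
    return best

# Three nodes, K = 3 packets, unequal broadcast costs.
packets = [{0, 1}, {1, 2}, {0, 2}]
print(min_weighted_cost(packets, K=3, w=[1, 2, 3]))
# -> (3, (1, 1, 0)): the most expensive node stays silent.
```

    The brute force is exponential in both n and K; the abstract's point is precisely that the same optimum can be reached deterministically in polynomial time via the d-Basis construction.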

    Group Fairness with Uncertainty in Sensitive Attributes

    We consider learning a fair predictive model when sensitive attributes are uncertain, say, due to a limited amount of labeled data, collection bias, or a privacy mechanism. We formulate the problem, for the independence notion of fairness, using the information bottleneck principle, and propose a robust optimization with respect to an uncertainty set of the sensitive attributes. As an illustrative case, we consider the joint Gaussian model and reduce the task to a quadratically constrained quadratic problem (QCQP). To ensure a strict fairness guarantee, we propose a robust QCQP and completely characterize its solution with an intuitive geometric interpretation. When the uncertainty arises from limited labeled sensitive attributes, our analysis reveals the contribution of each new sample towards the optimal performance achieved with unlimited access to labeled sensitive attributes. This allows us to identify non-trivial regimes in which uncertainty incurs no performance loss for the proposed algorithm while it continues to guarantee strict fairness. We also propose a bootstrap-based generic algorithm that is applicable beyond the Gaussian case. We demonstrate the value of our analysis and method on synthetic data as well as on real-world classification and regression tasks.
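    The following minimal sketch conveys the general idea of a bootstrap uncertainty set for a fairness statistic; the paper's actual algorithm and fairness criterion may differ, and the function `robust_parity_gap` and its penalty-based usage are assumptions for illustration only:

```python
# Sketch: build an uncertainty set for a group-fairness statistic by
# bootstrapping the limited labeled sensitive attributes, then penalize
# the worst case, in the spirit of robust optimization over that set.
import numpy as np

rng = np.random.default_rng(0)

def robust_parity_gap(scores, sensitive, n_boot=200):
    """scores: model outputs on samples whose sensitive attribute is
    labeled; sensitive: 0/1 group labels. Returns the worst-case
    demographic-parity gap over bootstrap resamples, a robust
    surrogate for the unknown true gap."""
    scores, sensitive = np.asarray(scores), np.asarray(sensitive)
    n, worst = len(scores), 0.0
    for _ in range(n_boot):
        idx = rng.integers(0, n, size=n)      # bootstrap resample
        s, a = scores[idx], sensitive[idx]
        if a.min() == a.max():                # a group is absent; skip
            continue
        gap = abs(s[a == 1].mean() - s[a == 0].mean())
        worst = max(worst, gap)
    return worst

# Usage: add lam * robust_parity_gap(...) to the training loss so the
# model is penalized for the worst fairness violation consistent with
# the limited sensitive-attribute data.
scores = rng.normal(size=100)
groups = rng.integers(0, 2, size=100)
print(robust_parity_gap(scores, groups))
```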