
    Advances in privacy-preserving machine learning

    Building useful predictive models often involves learning from personal data. For instance, companies use customer data to target advertisements, online education platforms collect student data to recommend content and improve user engagement, and medical researchers fit diagnostic models to patient data. A recent line of research aims to design learning algorithms that provide rigorous privacy guarantees for user data, in the sense that their outputs (models or predictions) leak as little information as possible about individuals in the training data. The goal of this dissertation is to design private learning algorithms with performance comparable to the best possible non-private ones. We quantify privacy using \emph{differential privacy}, a well-studied privacy notion that limits how much information the output of an algorithm leaks about any individual. Training a model with a differentially private algorithm prevents an adversary from confidently determining whether a specific person's data was used in training. We begin by presenting a technique for practical differentially private convex optimization that can leverage any off-the-shelf optimizer as a black box. We also perform an extensive empirical evaluation of state-of-the-art algorithms on a range of publicly available datasets, as well as in an industry application. Next, we present a learning algorithm that outputs a private classifier when given black-box access to a non-private learner and a limited amount of unlabeled public data. We prove that the accuracy guarantee of our private algorithm in the PAC model of learning is comparable to that of the underlying non-private learner; such a guarantee is not possible, in general, without public data. Lastly, we consider building recommendation systems, which we model using matrix completion. We present the first algorithm for matrix completion with provable user-level privacy and accuracy guarantees. Our algorithm consistently outperforms state-of-the-art private algorithms on a suite of datasets. Along the way, we give an optimal algorithm for differentially private singular vector computation, which leads to significant savings in space and time when operating on sparse matrices. It can also be used for private low-rank approximation.
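    The differential privacy guarantee described above is typically achieved by adding noise calibrated to how much one person's data can change the output. As a minimal illustration (not the dissertation's own algorithms), the classic Laplace mechanism releases a statistic plus Laplace noise with scale sensitivity/epsilon; the function and parameter names below are hypothetical:

    ```python
    import numpy as np

    def laplace_mechanism(true_value, sensitivity, epsilon, rng=None):
        """Release true_value with Laplace noise scaled to sensitivity / epsilon.

        Smaller epsilon means stronger privacy and more noise.
        """
        rng = np.random.default_rng() if rng is None else rng
        return true_value + rng.laplace(loc=0.0, scale=sensitivity / epsilon)

    # Example: privately release a count over 1000 patient records.
    # Adding or removing one person changes the count by at most 1,
    # so the sensitivity is 1.
    private_count = laplace_mechanism(true_value=412, sensitivity=1.0, epsilon=0.5)
    ```

    An adversary seeing only `private_count` cannot confidently tell whether any single patient's record was included, which is exactly the kind of guarantee the dissertation's algorithms provide for trained models.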

    Firm Characteristics and Influence on Government Rule-Making: Theory and Evidence

    An adversarial game is used to model the amount of influence a firm has over a government regulator, and its equilibrium level of regulation, as a function of firm fundamentals. The effective influence of a firm is identified as comprising both intrinsic and exerted components, where the latter involves distorting regulation via a transfer to the regulator. Understanding the source of a firm's high influence is found to be important for, among other things, predicting whether it faces a higher or lower regulatory constraint than other firms. Data from the World Business Environment Survey provide strong evidence in support of the model's hypotheses across a wide range of government agents, countries, and regulatory areas. Of particular relevance to public debate, large firms are found to be more likely to be influential, but also more likely to experience regulatory constraint, than smaller firms.

    Keywords: Political Economy; Regulation; Influence