4 research outputs found

    Differentially Private Gaussian Processes

    A major challenge for machine learning is increasing the availability of data while respecting the privacy of individuals. Here we combine the provable privacy guarantees of the Differential Privacy framework with the flexibility of Gaussian processes (GPs). We propose a method using GPs to provide Differentially Private (DP) regression. We then improve this method by crafting the DP noise covariance structure to efficiently protect the training data, while minimising the scale of the added noise. We find that, for the dataset used, this cloaking method achieves the greatest accuracy, while still providing privacy guarantees, and offers practical DP for regression over multi-dimensional inputs. Together these methods provide a starter toolkit for combining differential privacy and GPs.
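    The core idea of DP regression with GPs can be illustrated with a simplified sketch: compute a GP posterior mean, then perturb it with Gaussian noise calibrated via the standard Gaussian mechanism. This is only a minimal illustration, not the paper's cloaking method (which shapes the noise covariance rather than using isotropic noise); the function names, the RBF kernel choice, and the assumed L2 sensitivity are all assumptions introduced here.

    ```python
    import numpy as np

    def rbf_kernel(X1, X2, lengthscale=1.0, variance=1.0):
        # Squared-exponential kernel between two sets of 1-D inputs.
        d2 = (X1[:, None] - X2[None, :]) ** 2
        return variance * np.exp(-0.5 * d2 / lengthscale**2)

    def dp_gp_mean(X_train, y_train, X_test, noise=0.1,
                   sensitivity=1.0, epsilon=1.0, delta=1e-5, rng=None):
        """GP posterior mean with isotropic Gaussian-mechanism noise.

        `sensitivity` is an assumed bound on the L2 change of the mean
        vector when one training output changes; the paper's cloaking
        method instead crafts the noise covariance, which this sketch
        deliberately omits.
        """
        rng = np.random.default_rng(rng)
        K = rbf_kernel(X_train, X_train) + noise * np.eye(len(X_train))
        Ks = rbf_kernel(X_test, X_train)
        mean = Ks @ np.linalg.solve(K, y_train)
        # Gaussian mechanism: sigma calibrated for (epsilon, delta)-DP.
        sigma = sensitivity * np.sqrt(2 * np.log(1.25 / delta)) / epsilon
        return mean + rng.normal(0.0, sigma, size=mean.shape)
    ```

    Larger epsilon (weaker privacy) shrinks sigma and so improves accuracy, which is the trade-off the cloaking method is designed to soften.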

    Differentially private regression with Gaussian processes

    No full text
    A major challenge for machine learning is increasing the availability of data while respecting the privacy of individuals. Here we combine the provable privacy guarantees of the differential privacy framework with the flexibility of Gaussian processes (GPs). We propose a method using GPs to provide differentially private (DP) regression. We then improve this method by crafting the DP noise covariance structure to efficiently protect the training data, while minimising the scale of the added noise. We find that this cloaking method achieves the greatest accuracy, while still providing privacy guarantees, and offers practical DP for regression over multi-dimensional inputs. Together these methods provide a starter toolkit for combining differential privacy and GPs.

    Differentially Private Regression using Gaussian Processes

    No full text

    Quantitative modelling of the Waddington epigenetic landscape

    No full text
    C.H. Waddington introduced the epigenetic landscape as a metaphor to represent cellular decision-making during development. Like a population of balls rolling down a rough hillside, developing cells follow specific trajectories (valleys) and eventually come to rest in one or another low-energy state that represents a mature cell type. Waddington depicted the topography of this landscape as determined by interactions among gene products, thereby connecting genotype to phenotype. In modern terms, each point on the landscape represents a state of the underlying genetic regulatory network, which in turn is described by a gene expression profile. In this chapter we demonstrate how the mathematical formalism of Hopfield networks can be used to model this epigenetic landscape. Hopfield networks are auto-associative artificial neural networks; input patterns are stored as attractors of the network and can be recalled from noisy or incomplete inputs. The resulting models capture the temporal dynamics of a gene regulatory network, yielding quantitative insight into cellular development and phenotype.
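    The attractor behaviour described above can be shown with a minimal Hopfield network: patterns (standing in for cell-type expression profiles) are stored via a Hebbian weight rule, and a noisy input relaxes to the nearest stored attractor. This is a generic textbook sketch, not the chapter's specific gene-regulatory model; the function names and ±1 state encoding are assumptions introduced here.

    ```python
    import numpy as np

    def train_hopfield(patterns):
        # Hebbian learning: store each +/-1 pattern as an attractor.
        P = np.asarray(patterns, dtype=float)
        n = P.shape[1]
        W = (P.T @ P) / n
        np.fill_diagonal(W, 0.0)  # no self-connections
        return W

    def recall(W, state, steps=20):
        # Synchronous updates until the state settles on an attractor
        # (a fixed point of the sign dynamics, i.e. a "valley floor").
        s = np.asarray(state, dtype=float).copy()
        for _ in range(steps):
            new = np.sign(W @ s)
            new[new == 0] = 1.0
            if np.array_equal(new, s):
                break
            s = new
        return s
    ```

    In the landscape metaphor, `recall` is the ball rolling downhill: corrupting a few units of a stored pattern still returns the intact pattern, just as a perturbed cell state relaxes back to its cell-type attractor.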