496 research outputs found

    Bregman divergences based on optimal design criteria and simplicial measures of dispersion

    In previous work the authors defined the k-th order simplicial distance between probability distributions, which arises naturally from a measure of dispersion based on the squared volume of random simplices of dimension k. This theory is embedded in the wider theory of divergences and distances between distributions, which includes the Kullback–Leibler, Jensen–Shannon and Jeffreys–Bregman divergences and the Bhattacharyya distance. A general construction is given based on defining a directional derivative of a function ϕ from one distribution to the other, whose concavity or strict concavity influences the properties of the resulting divergence. For the normal distribution these divergences can be expressed as matrix formulas in the (multivariate) means and covariances. Optimal experimental design criteria contribute a range of functionals applied to non-negative definite, or positive definite, information matrices. Not all of these can distinguish normal distributions, but sufficient conditions are given. The k-th order simplicial distance is revisited from this perspective, and the results are used to test empirically the identity of means and covariances.
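The construction above can be made concrete for a classical design criterion. As a minimal sketch (not the paper's general construction), take ϕ(Σ) = log det Σ, the concave functional behind D-optimality, and form the Bregman-type divergence between two covariance matrices:

```python
import numpy as np

def bregman_logdet(S1, S2):
    """Bregman-type divergence between covariance matrices induced by the
    concave functional phi(S) = log det S (the D-optimality criterion).

    D(S1, S2) = phi(S2) + <grad phi(S2), S1 - S2> - phi(S1)
              = tr(S2^{-1} S1) - log det(S2^{-1} S1) - d  >= 0,
    with equality iff S1 = S2, so this divergence distinguishes
    zero-mean normal distributions by their covariances.
    """
    d = S1.shape[0]
    A = np.linalg.solve(S2, S1)          # S2^{-1} S1
    _, logdet = np.linalg.slogdet(A)     # stable log-determinant
    return np.trace(A) - logdet - d
```

Up to the mean term and a factor of 1/2, this is the Kullback–Leibler divergence between N(0, S1) and N(0, S2), illustrating how a design functional generates a familiar divergence.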

    Extended generalised variances, with applications

    We consider a measure ψk of dispersion which extends the notion of Wilks' generalised variance for a d-dimensional distribution, and is based on the mean squared volume of simplices of dimension k≤d formed by k+1 independent copies. We show how ψk can be expressed in terms of the eigenvalues of the covariance matrix of the distribution, also when an n-point sample is used for its estimation, and prove its concavity when raised to a suitable power. Some properties of dispersion-maximising distributions are derived, including a necessary and sufficient condition for optimality. Finally, we show how this measure of dispersion can be used for the design of optimal experiments, with equivalence to A- and D-optimal design for k=1 and k=d, respectively. Simple illustrative examples are presented.
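The eigenvalue expression mentioned above admits a short computational sketch. The normalisation below is an assumption (taken as E[(k! V_k)²] = (k+1)! e_k(λ), with e_k the k-th elementary symmetric function of the eigenvalues of the covariance matrix), chosen so that the quantity reduces to 2 tr(Σ) for k = 1 and to (d+1)! det Σ for k = d:

```python
import numpy as np
from math import factorial
from itertools import combinations

def mean_sq_simplex_volume(Sigma, k):
    # Mean squared (rescaled) volume of a k-dimensional simplex whose
    # k+1 vertices are independent copies of a distribution with
    # covariance matrix Sigma, expressed through the eigenvalues:
    #   E[(k! V_k)^2] = (k+1)! * e_k(lambda_1, ..., lambda_d)
    # (normalisation assumed here; e_k is the k-th elementary
    # symmetric function of the eigenvalues).
    lam = np.linalg.eigvalsh(Sigma)
    e_k = sum(np.prod(c) for c in combinations(lam, k))
    return factorial(k + 1) * e_k
```

For k = 1 this is twice the trace of Σ, and for k = d it is proportional to det Σ, i.e. to Wilks' generalised variance, consistent with the A- and D-optimality equivalence stated in the abstract.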

    Active Discrimination Learning for Gaussian Process Models

    The paper covers the design and analysis of experiments to discriminate between two Gaussian process models, such as those widely used in computer experiments, kriging, sensor location and machine learning. Two frameworks are considered. First, we study sequential constructions, where successive design (observation) points are selected, either as additional points to an existing design or from the beginning of observation. The selection relies on the maximisation of the difference between the symmetric Kullback–Leibler divergences for the two models, which depends on the observations, or on the mean squared error of both models, which does not. Then, we consider static criteria, such as the familiar log-likelihood ratios and the Fréchet distance between the covariance functions of the two models. Other distance-based criteria, simpler to compute than the previous ones, are also introduced, for which, in the framework of approximate design, a necessary condition for the optimality of a design measure is provided. The paper includes a study of the mathematical links between the different criteria, and numerical illustrations are provided. (25 pages, 10 figures)
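The symmetric Kullback–Leibler criterion has a simple closed form when both models are zero-mean Gaussians observed at the same n design points; a minimal sketch (zero means assumed for brevity, with K0 and K1 the two kernel matrices):

```python
import numpy as np

def sym_kl(K0, K1):
    # Symmetric Kullback-Leibler divergence between N(0, K0) and
    # N(0, K1), the distributions of the n observations under the two
    # Gaussian process models. The log-determinant terms of the two
    # directed divergences cancel, leaving only trace terms:
    #   KL(p0||p1) + KL(p1||p0)
    #     = 0.5 * (tr(K1^{-1} K0) + tr(K0^{-1} K1)) - n
    n = K0.shape[0]
    return 0.5 * (np.trace(np.linalg.solve(K1, K0))
                  + np.trace(np.linalg.solve(K0, K1))) - n
```

A sequential construction in the spirit of the paper would evaluate this quantity with each candidate point appended to the current design and select the maximiser.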

    Discrimination between Gaussian process models: active learning and static constructions

    The paper covers the design and analysis of experiments to discriminate between two Gaussian process models with different covariance kernels, such as those widely used in computer experiments, kriging, sensor location and machine learning. Two frameworks are considered. First, we study sequential constructions, where successive design (observation) points are selected, either as additional points to an existing design or from the beginning of observation. The selection relies on the maximisation of the difference between the symmetric Kullback–Leibler divergences for the two models, which depends on the observations, or on the mean squared error of both models, which does not. Then, we consider static criteria, such as the familiar log-likelihood ratios and the Fréchet distance between the covariance functions of the two models. Other distance-based criteria, simpler to compute than the previous ones, are also introduced, for which, in the framework of approximate design, a necessary condition for the optimality of a design measure is provided. The paper includes a study of the mathematical links between the different criteria, and numerical illustrations are provided.
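Evaluated on a common set of design points, a distance between the two covariance functions becomes a distance between the kernel matrices. A sketch under the assumption that the Fréchet distance meant here is the Fréchet (Bures–Wasserstein) distance between zero-mean Gaussians:

```python
import numpy as np

def sqrtm_psd(A):
    # Matrix square root of a symmetric positive semi-definite matrix
    # via its eigendecomposition (eigenvalues clipped at 0 to guard
    # against tiny negative round-off).
    w, V = np.linalg.eigh(A)
    return (V * np.sqrt(np.clip(w, 0.0, None))) @ V.T

def frechet_dist2(K0, K1):
    # Squared Frechet (Bures-Wasserstein) distance between zero-mean
    # Gaussians with covariance matrices K0 and K1:
    #   d^2 = tr(K0) + tr(K1) - 2 tr((K0^{1/2} K1 K0^{1/2})^{1/2})
    R = sqrtm_psd(K0)
    cross = sqrtm_psd(R @ K1 @ R)
    return np.trace(K0) + np.trace(K1) - 2.0 * np.trace(cross)
```

Unlike the symmetric Kullback–Leibler criterion, this quantity needs no matrix inversion, which is one reason distance-based criteria can be cheaper to compute.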

    Extremal measures maximizing functionals based on simplicial volumes

    We consider functionals measuring the dispersion of a d-dimensional distribution which are based on the volumes of simplices of dimension k ≤ d formed by k + 1 independent copies and raised to some power δ. We study properties of extremal measures that maximize these functionals. In particular, for positive δ we characterize their support, and for negative δ we establish a connection with potential theory and motivate the application to space-filling design for computer experiments. Several illustrative examples are presented.
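The basic quantity behind these functionals, the volume of a simplex formed by sampled points, is computed from a Gram determinant; a small sketch (raising the volume to a power δ, as in the functionals above, is then immediate):

```python
import numpy as np
from math import factorial

def simplex_sq_volume(X):
    # Squared k-dimensional volume of the simplex with vertex array X
    # of shape (k+1, d), k <= d, via the Gram determinant of the edge
    # vectors Y_i = X_i - X_0:
    #   V_k^2 = det(Y Y^T) / (k!)^2
    Y = X[1:] - X[0]
    k = Y.shape[0]
    return np.linalg.det(Y @ Y.T) / factorial(k) ** 2
```

For example, the unit right triangle in the plane (k = 2, d = 2) has area 1/2 and hence squared volume 1/4.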

    Phase II study of lonidamine in metastatic breast cancer.

    Thirty patients with previously treated metastatic breast cancer were entered in a phase II study with oral lonidamine. Twenty-eight patients are evaluable for toxicity and 25 for response. A partial remission was obtained in four patients (16%) and disease stability in 11 (44%); 10 patients progressed (40%). Toxicity was acceptable, consisting mainly of myalgias (39% of patients) and asthenia (21.4%). No myelotoxicity was observed. The drug is active in previously treated metastatic breast cancer and, because of its peculiar pattern of action and toxicity, deserves to be evaluated in combination with cytotoxic chemotherapy.