
    Joint estimation of phase and phase diffusion for quantum metrology

    Phase estimation, at the heart of many quantum metrology and communication schemes, can be strongly affected by noise, whose amplitude may not be known, or might be subject to drift. Here, we investigate the joint estimation of a phase shift and the amplitude of phase diffusion, at the quantum limit. For several relevant instances, this multiparameter estimation problem can be effectively reshaped as a two-dimensional Hilbert space model, encompassing the description of an interferometer phase probed with relevant quantum states -- split single photons, coherent states or N00N states. For these cases, we obtain a trade-off bound on the statistical variances for the joint estimation of phase and phase diffusion, as well as optimum measurement schemes. We use this bound to quantify the effectiveness of an actual experimental setup for joint parameter estimation in polarimetry. We conclude by discussing the form of the trade-off relations for more general states and measurements.

    Comment: Published in Nature Communications. Supplementary Information available at http://www.nature.com/ncomms/2014/140404/ncomms4532/extref/ncomms4532-s1.pd
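    To make the nature of such a bound concrete, trade-offs of this kind are often written in normalized Fisher-information form. The statement below is a sketch in that framing, not necessarily the paper's exact expression: F_phi and F_Delta denote the classical Fisher informations a single fixed measurement supplies about the phase phi and the diffusion amplitude Delta, while H_phi and H_Delta are the quantum Fisher informations attainable when each parameter is estimated alone.

```latex
% Sketch of a normalized trade-off (assumed form, not quoted from the paper):
% F_\phi, F_\Delta : classical Fisher informations of one fixed measurement
% H_\phi, H_\Delta : quantum Fisher informations, each parameter taken alone
\[
  \frac{F_\phi}{H_\phi} + \frac{F_\Delta}{H_\Delta} \;\le\; 1 .
\]
% Combined with the Cramer-Rao bound for \nu repetitions,
% \operatorname{Var}(\hat\phi) \ge 1/(\nu F_\phi), \quad
% \operatorname{Var}(\hat\Delta) \ge 1/(\nu F_\Delta),
% this says no single measurement can saturate both single-parameter
% quantum limits at once: information gained about one parameter is
% paid for in information about the other.
```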

    Learning the Structure and Parameters of Large-Population Graphical Games from Behavioral Data

    We consider learning, from strictly behavioral data, the structure and parameters of linear influence games (LIGs), a class of parametric graphical games introduced by Irfan and Ortiz (2014). LIGs facilitate causal strategic inference (CSI): making inferences from causal interventions on stable behavior in strategic settings. Applications include the identification of the most influential individuals in large (social) networks. Such tasks can also support policy-making analysis. Motivated by the computational work on LIGs, we cast the learning problem as maximum-likelihood estimation (MLE) of a generative model defined by pure-strategy Nash equilibria (PSNE). Our simple formulation uncovers the fundamental interplay between goodness-of-fit and model complexity: good models capture equilibrium behavior within the data while controlling the true number of equilibria, including those unobserved. We provide a generalization bound establishing the sample complexity for MLE in our framework. We propose several algorithms, including convex loss minimization (CLM) and sigmoidal approximations. We prove that the number of exact PSNE in LIGs is small with high probability; thus, CLM is sound. We illustrate our approach on synthetic data and real-world U.S. congressional voting records. We briefly discuss our learning framework's generality and potential applicability to general graphical games.

    Comment: Journal of Machine Learning Research (accepted, pending publication). Last conference version: submitted March 30, 2012 to UAI 2012. First conference version, entitled Learning Influence Games, initially submitted on June 1, 2010 to NIPS 201
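    To give a feel for the setup, here is a minimal sketch, not the authors' algorithm: a toy LIG in which each player's equilibrium action matches the sign of the net influence on them, with a per-player regularized logistic fit standing in for the sigmoidal/convex loss minimization idea. The game generator, sample size, learning rate, and regularization strength are all assumptions of this sketch.

```python
import numpy as np
from itertools import product

rng = np.random.default_rng(1)
n = 5  # players (toy scale; enumeration below is exponential in n)

def random_game():
    # Hypothetical ground-truth LIG: influence weights W (zero diagonal)
    # and thresholds b. A joint action x in {-1,+1}^n is a pure-strategy
    # Nash equilibrium (PSNE) when each x_i agrees with the sign of the
    # net influence sum_j W[i,j] x_j - b[i].
    W = rng.normal(0, 1, (n, n))
    np.fill_diagonal(W, 0.0)
    b = rng.normal(0, 0.5, n)
    return W, b

profiles = np.array(list(product([-1.0, 1.0], repeat=n)))
while True:  # resample until the toy game has at least two equilibria
    W_true, b_true = random_game()
    stable = np.all(profiles * (profiles @ W_true.T - b_true) >= 0, axis=1)
    eq = profiles[stable]
    if len(eq) >= 2:
        break

# "Behavioral data": observed joint actions drawn from the equilibrium set.
X = eq[rng.integers(len(eq), size=400)]

def fit_player(i, lr=0.5, steps=2000, lam=1e-3):
    # Sigmoidal surrogate of convex loss minimization: a regularized
    # logistic-style predictor of x_i from the other players' actions.
    w, b = np.zeros(n), 0.0
    others = X.copy()
    others[:, i] = 0.0  # mask the player's own action
    y = X[:, i]
    for _ in range(steps):
        m = np.clip(y * (others @ w - b), -30, 30)  # best-response margin
        g = 1.0 / (1.0 + np.exp(m))                 # sigmoid(-margin)
        w -= lr * (-(others * (g * y)[:, None]).mean(axis=0) + lam * w)
        b -= lr * (g * y).mean()
    return w, b

W_hat = np.array([fit_player(i)[0] for i in range(n)])
off = ~np.eye(n, dtype=bool)
print("sign agreement with true off-diagonal weights:",
      np.mean(np.sign(W_hat[off]) == np.sign(W_true[off])))
```

    The per-player decomposition is what makes the surrogate convex; the paper's actual guarantees (sample complexity, soundness of CLM) concern the full equilibrium-based likelihood, which this sketch does not implement.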

    Dimensions of design space: a decision-theoretic approach to optimal research design

    Bayesian decision theory can be used not only to establish the optimal sample size and its allocation in a single clinical study, but also to identify an optimal portfolio of research combining different types of study design. Within a single study, the highest societal pay-off to proposed research is achieved when its sample sizes, and their allocation between available treatment options, are chosen to maximise the Expected Net Benefit of Sampling (ENBS). Where a number of different types of study, informing different parameters in the decision problem, could be conducted, the simultaneous estimation of ENBS across all dimensions of the design space is required to identify the optimal sample sizes and allocations within such a research portfolio. This is illustrated through a simple example of a decision model of zanamivir for the treatment of influenza. The possible study designs include: i) a single trial of all the parameters; ii) a clinical trial providing evidence only on clinical endpoints; iii) an epidemiological study of the natural history of the disease; and iv) a survey of quality of life. The possible combinations, sample sizes and allocations between trial arms are evaluated over a range of cost-effectiveness thresholds. The computational challenges are addressed by implementing optimisation algorithms to search the ENBS surface more efficiently over such a large design space.

    Keywords: Bayesian decision theory; expected value of information; research design; cost-effectiveness analysis
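    As a minimal numerical sketch of the single-study case, the snippet below picks a sample size by maximizing ENBS in a one-parameter conjugate normal model: a normal prior on the incremental net benefit of treatment, a normal likelihood for trial data, and ENBS(n) equal to population EVSI minus trial cost. All priors, costs, and population figures are hypothetical, and the paper's portfolio problem searches many more dimensions than this single n.

```python
import numpy as np
from math import erf, sqrt, pi, exp

# Hypothetical inputs (currency units are illustrative only).
m0, s0 = 500.0, 1500.0            # prior mean / sd of incremental net benefit
sigma = 8000.0                    # per-patient sd of observed net benefit
pop = 100_000                     # patients affected by the adoption decision
c_fixed, c_per = 50_000.0, 400.0  # fixed and per-patient trial costs

def norm_cdf(z): return 0.5 * (1.0 + erf(z / sqrt(2.0)))
def norm_pdf(z): return exp(-0.5 * z * z) / sqrt(2.0 * pi)

def enbs(n):
    # Preposterior sd of the posterior mean after observing n patients:
    # Var(posterior mean) = prior var - posterior var (conjugate update).
    post_var = 1.0 / (1.0 / s0**2 + n / sigma**2)
    v = sqrt(max(s0**2 - post_var, 0.0))
    # E[max(M, 0)] for M ~ N(m0, v^2), in closed form.
    e_max = m0 * norm_cdf(m0 / v) + v * norm_pdf(m0 / v) if v > 0 else max(m0, 0.0)
    evsi = e_max - max(m0, 0.0)   # per-patient expected value of sample info
    return pop * evsi - (c_fixed + c_per * n)

best_n = max(range(10, 5000, 10), key=enbs)
print(f"optimal n = {best_n}, ENBS = {enbs(best_n):,.0f}")
```

    In the full portfolio problem each candidate design contributes its own EVSI for different model parameters, so the search runs over several sample sizes and allocations simultaneously, which is why efficient optimisation of the ENBS surface matters.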

    Freeze-drying modeling and monitoring using a new neuro-evolutive technique

    This paper is focused on the design of a black-box model for the process of freeze-drying of pharmaceuticals. A new methodology based on a self-adaptive differential evolution scheme is combined with a back-propagation algorithm, as a local search method, for the simultaneous structural and parametric optimization of the model, which is represented by a neural network. Using the model of the freeze-drying process, both the temperature and the residual ice content in the product vs. time can be determined off-line, given the values of the operating conditions (the temperature of the heating shelf and the pressure in the drying chamber). This makes it possible to determine whether the maximum temperature allowed by the product is exceeded and when sublimation drying is complete, thus providing a valuable tool for recipe design and optimization. Moreover, the black-box model can be applied to monitor the freeze-drying process: in this case, the measurement of product temperature is used as an input variable of the neural network in order to provide an in-line estimation of the state of the product (temperature and residual amount of ice). Various examples are presented and discussed, thus demonstrating the strength of the tool.
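    A compact sketch of the parametric half of such a neuro-evolutive scheme follows: a jDE-style self-adaptive differential evolution loop over the weights of a fixed one-hidden-layer network, with one back-propagation step refining each trial vector. The network size, self-adaptation rule, and surrogate data are assumptions of this sketch; the paper's method also evolves the network structure, which is omitted here.

```python
import numpy as np

rng = np.random.default_rng(0)

# Surrogate data standing in for freeze-drying measurements (hypothetical):
# inputs = [shelf temperature, chamber pressure], target = product response.
X = rng.uniform(-1, 1, (200, 2))
y = np.sin(X[:, 0]) + 0.5 * X[:, 1] ** 2

H = 6                        # hidden units (fixed structure in this sketch)
DIM = 2 * H + H + H + 1      # W1 (2xH) + b1 (H) + W2 (H) + b2 (1)

def unpack(v):
    return (v[:2 * H].reshape(2, H), v[2 * H:3 * H], v[3 * H:4 * H], v[-1])

def mse(v):
    W1, b1, W2, b2 = unpack(v)
    return np.mean((np.tanh(X @ W1 + b1) @ W2 + b2 - y) ** 2)

def backprop_step(v, lr=0.05):
    # One gradient step on MSE: the local search refining each DE trial.
    W1, b1, W2, b2 = unpack(v)
    h = np.tanh(X @ W1 + b1)
    e = 2 * (h @ W2 + b2 - y) / len(y)          # dMSE/dprediction
    dh = np.outer(e, W2) * (1 - h ** 2)         # back-propagated error
    g = np.concatenate([(X.T @ dh).ravel(), dh.sum(axis=0), h.T @ e, [e.sum()]])
    return v - lr * g

# jDE-style self-adaptation: each individual carries its own F (mutation
# scale) and CR (crossover rate), occasionally resampled; successful
# settings survive with their individual.
NP, GENS = 20, 200
pop = rng.normal(0, 0.5, (NP, DIM))
F, CR = np.full(NP, 0.5), np.full(NP, 0.9)
fit = np.array([mse(p) for p in pop])

for _ in range(GENS):
    for i in range(NP):
        Fi = rng.uniform(0.1, 0.9) if rng.random() < 0.1 else F[i]
        CRi = rng.random() if rng.random() < 0.1 else CR[i]
        a, b, c = rng.choice([k for k in range(NP) if k != i], 3, replace=False)
        mutant = pop[a] + Fi * (pop[b] - pop[c])         # DE/rand/1
        cross = rng.random(DIM) < CRi
        cross[rng.integers(DIM)] = True                  # binomial crossover
        trial = backprop_step(np.where(cross, mutant, pop[i]))
        f_trial = mse(trial)
        if f_trial < fit[i]:                             # greedy selection
            pop[i], fit[i], F[i], CR[i] = trial, f_trial, Fi, CRi

print("best MSE:", fit.min())
```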

    Latent class analysis for segmenting preferences of investment bonds

    Market segmentation is a key component of conjoint analysis which addresses consumer preference heterogeneity. Members of a segment are assumed to be homogeneous in their views and preferences when evaluating an item, but distinctly heterogeneous from members of other segments. Latent class methodology is one of several conjoint segmentation procedures that overcome the limitations of aggregate analysis and a priori segmentation. The main benefit of latent class models is that market segment membership and the regression parameters of each derived segment are estimated simultaneously. The latent class model presented in this paper uses mixtures of multivariate conditional normal distributions to analyze rating data, where the likelihood is maximized using the EM algorithm. The application focuses on customer preferences for investment bonds described by four attributes: currency, coupon rate, redemption term and price. A number of demographic variables are used to generate segments that are accessible and actionable.
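    The EM mechanics can be illustrated with a minimal latent class (mixture-of-regressions) sketch: the E-step computes segment responsibilities from the current normal densities, and the M-step re-estimates mixture weights, part-worths, and residual variances by weighted least squares. The toy data, the attribute coding standing in for currency, coupon rate, redemption term and price, and the choice of two segments are all assumptions of this sketch; the paper's model uses multivariate conditional normals rather than this univariate simplification.

```python
import numpy as np

rng = np.random.default_rng(2)

# Toy rating data: intercept plus 4 coded attributes; ratings generated
# from two latent segments with different part-worths (all hypothetical).
n, K = 600, 2
X = np.column_stack([np.ones(n), rng.uniform(-1, 1, (n, 4))])
beta_true = np.array([[5.0, 2.0, 1.5, -0.5, -2.0],
                      [4.0, -1.0, 0.5, 2.0, 0.5]])
z = rng.integers(K, size=n)
y = np.einsum('ij,ij->i', X, beta_true[z]) + rng.normal(0, 0.7, n)

# EM for a mixture of linear regressions: memberships and regression
# parameters are updated jointly, mirroring the latent class approach.
w = np.full(K, 1.0 / K)                 # mixture weights (segment sizes)
beta = rng.normal(0, 1, (K, X.shape[1]))
sig2 = np.full(K, 1.0)                  # per-segment residual variances

for _ in range(200):
    # E-step: responsibilities r[i, k] proportional to w_k * N(y_i | X_i beta_k, sig2_k),
    # computed in log space for numerical stability.
    resid = y[:, None] - X @ beta.T
    logp = np.log(w) - 0.5 * np.log(2 * np.pi * sig2) - 0.5 * resid**2 / sig2
    logp -= logp.max(axis=1, keepdims=True)
    r = np.exp(logp)
    r /= r.sum(axis=1, keepdims=True)
    # M-step: mixture weights, weighted least squares, residual variances.
    w = r.mean(axis=0)
    for k in range(K):
        Wk = r[:, k]
        beta[k] = np.linalg.solve(X.T @ (X * Wk[:, None]), X.T @ (Wk * y))
        sig2[k] = (Wk * (y - X @ beta[k])**2).sum() / Wk.sum()

print("segment sizes:", np.round(w, 2))
print("estimated part-worths:\n", np.round(beta, 2))
```

    Estimating membership and part-worths simultaneously is what distinguishes this from a two-stage cluster-then-regress approach: observations that fit one segment's regression well pull its parameters toward them in the next iteration.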