
    Isoelastic Agents and Wealth Updates in Machine Learning Markets

    Recently, prediction markets have shown considerable promise for developing flexible mechanisms for machine learning. In this paper, agents with isoelastic utilities are considered. It is shown that the costs associated with homogeneous markets of agents with isoelastic utilities produce equilibrium prices corresponding to alpha-mixtures, with a particular form of mixing component relating to each agent's wealth. We also demonstrate that wealth accumulation for logarithmic and other isoelastic agents (through payoffs on prediction of training targets) can implement both Bayesian model updates and mixture weight updates by imposing different market payoff structures. An iterative algorithm is given for market equilibrium computation. We demonstrate that inhomogeneous markets of agents with isoelastic utilities outperform state-of-the-art aggregate classifiers such as random forests, as well as single classifiers (neural networks, decision trees) on a number of machine learning benchmarks, and show that isoelastic combination methods are generally better than their logarithmic counterparts.
    Comment: Appears in Proceedings of the 29th International Conference on Machine Learning (ICML 2012).
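The wealth-update mechanics for the logarithmic-utility special case can be sketched in a few lines. This is an illustrative toy under my own naming, not the paper's code: the market prediction is the wealth-weighted mixture of the agents' predictions, and paying each agent in proportion to the probability it assigned to the realised training target, then renormalising total wealth, amounts to a Bayesian model-averaging update.

```python
# Toy sketch (assumed names, not the paper's implementation): a market of
# logarithmic-utility agents. The market price is the wealth-weighted
# mixture; payoffs on the true outcome implement a Bayesian update.

def market_predict(wealths, predictions):
    """Wealth-weighted mixture; predictions[i][k] = P_i(outcome k)."""
    total = sum(wealths)
    n_outcomes = len(predictions[0])
    return [sum(w * p[k] for w, p in zip(wealths, predictions)) / total
            for k in range(n_outcomes)]

def update_wealths(wealths, predictions, outcome):
    """Log-utility payoff: each wealth scales with the probability the agent
    assigned to the realised outcome, then total wealth is renormalised --
    the same arithmetic as a Bayesian posterior over models."""
    new = [w * p[outcome] for w, p in zip(wealths, predictions)]
    z, total = sum(new), sum(wealths)
    return [total * x / z for x in new]

# Two 'agents' (models): one confident in outcome 0, one uniform.
wealths = [1.0, 1.0]
preds = [[0.9, 0.1], [0.5, 0.5]]
print(market_predict(wealths, preds))   # ~[0.7, 0.3]
wealths = update_wealths(wealths, preds, 0)  # outcome 0 is observed
print(wealths)                          # confident agent gains wealth
```

Repeating the update over a stream of training targets concentrates wealth on the better-calibrated model, which is how the market's mixture weights adapt.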

    Algorithm for SIS and MultiSIS problems

    The SIS problem has numerous applications in cryptography. Known algorithms for solving it are exponential in complexity. A new algorithm is suggested in this note; its complexity is sub-exponential for a range of parameters.

    Containing epidemic outbreaks by message-passing techniques

    The problem of targeted network immunization can be defined as that of finding a subset of nodes in a network to immunize or vaccinate in order to minimize a tradeoff between the cost of vaccination and the final (stationary) expected infection under a given epidemic model. Although computing the expected infection is a hard computational problem, simple and efficient mean-field approximations have been put forward in the literature in recent years. The optimization problem can be recast as a constrained one in which the constraints enforce local mean-field equations describing the average stationary state of the epidemic process. For a wide class of epidemic models, including the susceptible-infected-removed and the susceptible-infected-susceptible models, we define a message-passing approach to network immunization that allows us to study the statistical properties of epidemic outbreaks in the presence of immunized nodes, as well as to find (nearly) optimal immunization sets for a given choice of parameters and costs. The algorithm scales linearly with the size of the graph and can be made efficient even on large networks. We compare its performance with topologically based heuristics, greedy methods, and simulated annealing.
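As a concrete point of reference, the greedy baseline mentioned at the end can be sketched with a quenched mean-field approximation of the SIS stationary state. This is a toy under my own naming, not the paper's message-passing algorithm: iterate the local mean-field fixed point, then greedily immunise the node with the highest stationary infection probability.

```python
# Hedged sketch of a greedy mean-field baseline (not the paper's method):
# quenched mean-field SIS fixed point  p_i = s_i / (mu + s_i),
# s_i = beta * sum_{j in N(i)} p_j, with immunised nodes pinned to 0.

def mf_sis(adj, beta, mu, immune, iters=200):
    """Iterate the mean-field equations to their stationary point."""
    n = len(adj)
    p = [0.0 if i in immune else 0.5 for i in range(n)]
    for _ in range(iters):
        s = [beta * sum(p[j] for j in adj[i]) for i in range(n)]
        p = [0.0 if i in immune else s[i] / (mu + s[i]) for i in range(n)]
    return p

def greedy_immunize(adj, beta, mu, budget):
    """Repeatedly immunise the node with the highest predicted infection."""
    immune = set()
    for _ in range(budget):
        p = mf_sis(adj, beta, mu, immune)
        pick = max((i for i in range(len(adj)) if i not in immune),
                   key=lambda i: p[i])
        immune.add(pick)
    return immune

# Star graph: hub 0 connected to leaves 1..5 -- greedy picks the hub first.
adj = {0: [1, 2, 3, 4, 5], 1: [0], 2: [0], 3: [0], 4: [0], 5: [0]}
print(greedy_immunize(adj, beta=1.0, mu=1.0, budget=1))
```

Each greedy step costs one full fixed-point solve; the message-passing approach of the paper avoids this by propagating local messages once per parameter choice.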

    Resampling: an improvement of Importance Sampling in varying population size models

    Sequential importance sampling algorithms have been defined to estimate likelihoods in models of ancestral population processes. However, these algorithms are based on features of the models with constant population size, and become inefficient when the population size varies in time, making likelihood-based inference difficult in many demographic situations. In this work, we modify a previous sequential importance sampling algorithm to improve the efficiency of the likelihood estimation. Our procedure is still based on features of the model with constant size, but uses a resampling technique with a new resampling probability distribution that depends on the pairwise composite likelihood. We tested our algorithm, called sequential importance sampling with resampling (SISR), on simulated data sets under different demographic cases. In most cases, we halved the computational cost for the same accuracy of inference; in some cases the reduction was a hundredfold. This study provides the first assessment of the impact of such resampling techniques on parameter inference using sequential importance sampling, and extends the range of situations where likelihood inferences can be easily performed.
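The mechanics of the resampling step can be illustrated generically. The paper's resampling distribution is built from the pairwise composite likelihood; the sketch below hedges with plain weight-proportional (multinomial) resampling, which shows the bookkeeping without the population-genetics specifics.

```python
# Generic multinomial resampling step for sequential importance sampling
# (illustrative; the paper's SISR uses a composite-likelihood-based
# resampling distribution instead of the raw weights used here).
import random

def resample(particles, weights, rng):
    """Draw N particles with probability proportional to their importance
    weights, then give every survivor the same weight so the weighted-sum
    estimator keeps its expectation."""
    n = len(particles)
    total = sum(weights)
    picks = rng.choices(particles, weights=weights, k=n)
    return picks, [total / n] * n

particles = ["a", "b", "c", "d"]
weights = [96.0, 2.0, 1.0, 1.0]   # unnormalised weights, severe degeneracy
new_particles, new_weights = resample(particles, weights, random.Random(0))
print(new_particles)              # dominated by 'a'
print(new_weights)                # [25.0, 25.0, 25.0, 25.0]
```

Resampling discards low-weight genealogies early, so subsequent simulation effort concentrates on histories that actually contribute to the likelihood estimate.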

    Effect of collisions on neutrino flavor inhomogeneity in a dense neutrino gas

    We investigate the stability, with respect to spatial inhomogeneity, of a two-dimensional dense neutrino gas. The system exhibits growth of seed inhomogeneity due to nonlinear coherent neutrino self-interactions. In the absence of incoherent collisional effects, we observe a dependence of this instability growth rate on the neutrino mass spectrum: the normal neutrino mass hierarchy exhibits spatial instability over a larger range of neutrino number density than the inverted one. We further consider the effect of elastic incoherent collisions of the neutrinos with a static background of heavy, nucleon-like scatterers. At small length scales, the growth of flavor instability can be suppressed by collisions. At large length scales we find, perhaps surprisingly, that for the inverted neutrino mass hierarchy incoherent collisions fail to suppress flavor instabilities, independent of the coupling strength.
    Comment: 10 pages, 6 figures. Version accepted in PLB. Minor changes; title changed.

    Epidemic Thresholds with External Agents

    We study the effect of external infection sources on phase transitions in epidemic processes. In particular, we consider an epidemic spreading on a network via the SIS/SIR dynamics, which in addition is aided by external agents - sources unconstrained by the graph, but possessing a limited infection rate or virulence. Such a model captures many existing models of externally aided epidemics, and finds use in many settings - epidemiology, marketing and advertising, network robustness, etc. We provide a detailed characterization of the impact of external agents on epidemic thresholds. In particular, for the SIS model, we show that any external infection strategy with constant virulence either fails to significantly affect the lifetime of an epidemic, or at best, sustains the epidemic for a lifetime which is polynomial in the number of nodes. On the other hand, a random external-infection strategy, with rate increasing linearly in the number of infected nodes, succeeds under some conditions to sustain an exponential epidemic lifetime. We obtain similar sharp thresholds for the SIR model, and discuss the relevance of our results in a variety of settings.
    Comment: 12 pages, 2 figures (to appear in INFOCOM 2014).

    Skellam shrinkage: Wavelet-based intensity estimation for inhomogeneous Poisson data

    The ubiquity of integrating detectors in imaging and other applications implies that a variety of real-world data are well modeled as Poisson random variables whose means are in turn proportional to an underlying vector-valued signal of interest. In this article, we first show how the so-called Skellam distribution arises from the fact that Haar wavelet and filterbank transform coefficients corresponding to measurements of this type are distributed as sums and differences of Poisson counts. We then provide two main theorems on Skellam shrinkage, one showing the near-optimality of shrinkage in the Bayesian setting and the other providing for unbiased risk estimation in a frequentist context. These results serve to yield new estimators in the Haar transform domain, including an unbiased risk estimate for shrinkage of Haar-Fisz variance-stabilized data, along with accompanying low-complexity algorithms for inference. We conclude with a simulation study demonstrating the efficacy of our Skellam shrinkage estimators both for the standard univariate wavelet test functions as well as a variety of test images taken from the image processing literature, confirming that they offer substantial performance improvements over existing alternatives.
    Comment: 27 pages, 8 figures, slight formatting changes; submitted for publication.
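The starting observation is elementary to check numerically: a Haar detail coefficient of Poisson counts is a difference of two independent Poisson variables, hence Skellam-distributed with mean λ1 − λ2 and variance λ1 + λ2. A stdlib-only sketch (Knuth's Poisson sampler; variable names are mine):

```python
# Empirical check that a difference of independent Poissons has
# Skellam moments: mean l1 - l2, variance l1 + l2.
import math
import random

def poisson(lam, rng):
    """Knuth's method: count uniform draws until their product < exp(-lam)."""
    L, k, p = math.exp(-lam), 0, 1.0
    while True:
        p *= rng.random()
        if p <= L:
            return k
        k += 1

rng = random.Random(0)
l1, l2, n = 7.0, 3.0, 200_000
diffs = [poisson(l1, rng) - poisson(l2, rng) for _ in range(n)]
mean = sum(diffs) / n
var = sum((d - mean) ** 2 for d in diffs) / n
print(round(mean, 2), round(var, 2))  # close to l1 - l2 = 4 and l1 + l2 = 10
```

Because the difference's distribution is known in closed form, shrinkage rules can be designed for Haar coefficients directly, rather than after an approximate variance-stabilizing transform.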

    Exponential Krylov time integration for modeling multi-frequency optical response with monochromatic sources

    Light incident on a layer of scattering material, such as a piece of sugar or white paper, forms a characteristic speckle pattern in transmission and reflection. The information hidden in the correlations of the speckle pattern with varying frequency, polarization and angle of the incident light can be exploited for applications such as biomedical imaging and high-resolution microscopy. Conventional computational models for multi-frequency optical response involve multiple solution runs of Maxwell's equations with monochromatic sources. Exponential Krylov subspace time solvers are promising candidates for improving the efficiency of such models, as a single monochromatic solution can be reused for the other frequencies without performing full time-domain computations at each frequency. However, we show that the straightforward implementation has serious limitations. We further propose alternative ways to obtain an efficient solution through Krylov subspace methods. Our methods are based on two different splittings of the unknown solution into parts, each of which can be computed efficiently. Experiments demonstrate a significant gain in computation time with respect to the standard solvers.
    Comment: 22 pages, 4 figures.
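The core primitive such solvers rely on, approximating exp(tA)v from a small Krylov subspace (Arnoldi basis plus a small-matrix exponential), can be sketched compactly. This is a dense, pure-Python toy on a 1-D Laplacian, not the paper's Maxwell solver or its splittings.

```python
# Hedged sketch of Krylov approximation of exp(tA)v:
#   exp(tA) v  ~  beta * V_m * exp(t H_m) * e_1,   beta = ||v||,
# with V_m, H_m from m steps of Arnoldi. Toy scale only.
import math

def matvec(A, v):
    return [sum(a * x for a, x in zip(row, v)) for row in A]

def expm(H):
    """Taylor-series matrix exponential (adequate for the tiny H_m here)."""
    n = len(H)
    E = [[float(i == j) for j in range(n)] for i in range(n)]
    T = [row[:] for row in E]
    for k in range(1, 30):
        T = [[sum(T[i][l] * H[l][j] for l in range(n)) / k for j in range(n)]
             for i in range(n)]
        E = [[E[i][j] + T[i][j] for j in range(n)] for i in range(n)]
    return E

def krylov_expv(A, v, t, m):
    """Approximate exp(tA) v from an m-dimensional Krylov subspace."""
    n = len(v)
    beta = math.sqrt(sum(x * x for x in v))
    V = [[x / beta for x in v]]
    H = [[0.0] * m for _ in range(m)]
    for j in range(m):                     # Arnoldi with modified Gram-Schmidt
        w = matvec(A, V[j])
        for i in range(len(V)):
            H[i][j] = sum(a * b for a, b in zip(w, V[i]))
            w = [a - H[i][j] * b for a, b in zip(w, V[i])]
        h = math.sqrt(sum(x * x for x in w))
        if j + 1 < m:
            H[j + 1][j] = h
            V.append([x / h for x in w])
    e = expm([[t * x for x in row] for row in H])
    return [beta * sum(V[i][k] * e[i][0] for i in range(m)) for k in range(n)]

# 1-D Laplacian (heat equation): full-space result vs. small subspace.
n = 8
A = [[-2.0 if i == j else 1.0 if abs(i - j) == 1 else 0.0 for j in range(n)]
     for i in range(n)]
v = [1.0] + [0.0] * (n - 1)
full = krylov_expv(A, v, 0.5, n)    # m = n is exact up to round-off
approx = krylov_expv(A, v, 0.5, 4)  # a 4-dimensional subspace is already close
print(max(abs(a - b) for a, b in zip(full, approx)))
```

The attraction for time-domain electromagnetics is that each such evaluation advances the field over a large time step, whereas standard explicit schemes are tied to a small CFL-limited step.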