
    Homotopy Methods to Compute Equilibria in Game Theory

    This paper presents a complete survey of the use of homotopy methods in game theory. Homotopies allow for a robust computation of game-theoretic equilibria and their refinements. Homotopies are also suitable to compute equilibria that are selected by various selection theories. We present all relevant techniques underlying homotopy algorithms. We give detailed expositions of the Lemke-Howson algorithm and the Van den Elzen-Talman algorithm to compute Nash equilibria in 2-person games, and the Herings-Van den Elzen, Herings-Peeters, and McKelvey-Palfrey algorithms to compute Nash equilibria in general n-person games.
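As an illustration of the homotopy idea behind the McKelvey-Palfrey approach, the following sketch traces the logit quantal response path of a 2-person game as the precision parameter grows: at low precision play is near uniform, and as precision increases the fixed point approaches a Nash equilibrium. The payoff matrix is an assumed prisoner's-dilemma example, not taken from the paper.

```python
import numpy as np

def softmax(x):
    e = np.exp(x - x.max())
    return e / e.sum()

# Prisoner's dilemma payoffs for the row player (symmetric game, so the
# column player's payoff matrix is A.T). The numbers are illustrative.
A = np.array([[3.0, 0.0],    # action 0 = cooperate
              [5.0, 1.0]])   # action 1 = defect (dominant)

# Trace the logit-equilibrium path: start from uniform play at low
# precision lam and re-solve the fixed point as lam grows.
p = np.array([0.5, 0.5])
q = p.copy()
for lam in np.linspace(0.1, 10.0, 100):
    for _ in range(200):  # fixed-point iteration at this lam
        p_new = softmax(lam * A @ q)   # row's logit response to q
        q_new = softmax(lam * A @ p)   # column's logit response (B = A.T)
        if np.allclose(p_new, p) and np.allclose(q_new, q):
            break
        p, q = p_new, q_new

print(p)  # at high lam, play concentrates on the Nash equilibrium (defect)
```

Following the path in small steps of lam, rather than solving directly at high precision, is what makes such homotopy methods numerically robust.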

    Locally adaptive smoothing with Markov random fields and shrinkage priors

    We present a locally adaptive nonparametric curve fitting method that operates within a fully Bayesian framework. This method uses shrinkage priors to induce sparsity in order-k differences in the latent trend function, providing a combination of local adaptation and global control. Using a scale mixture of normals representation of shrinkage priors, we make explicit connections between our method and kth order Gaussian Markov random field smoothing. We call the resulting processes shrinkage prior Markov random fields (SPMRFs). We use Hamiltonian Monte Carlo to approximate the posterior distribution of model parameters because this method provides superior performance in the presence of the high dimensionality and strong parameter correlations exhibited by our models. We compare the performance of three prior formulations using simulated data and find the horseshoe prior provides the best compromise between bias and precision. We apply SPMRF models to two benchmark data examples frequently used to test nonparametric methods. We find that this method is flexible enough to accommodate a variety of data generating models and offers the adaptive properties and computational tractability to make it a useful addition to the Bayesian nonparametric toolbox. Comment: 38 pages, to appear in Bayesian Analysis
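The stated connection to kth order Gaussian Markov random field smoothing can be seen in the normal-prior special case, where the posterior mean of the latent trend is a generalized ridge smoother. A minimal sketch with an assumed penalty weight `lam` and order k = 2 (the matrix construction and weight are illustrative, not the paper's shrinkage-prior model):

```python
import numpy as np

n = 50
x = np.linspace(0.0, 1.0, n)
y = 2.0 + 3.0 * x  # a noiseless linear trend, for illustration

# Second-order difference matrix D: (D @ theta)[i] = theta[i] - 2*theta[i+1] + theta[i+2]
D = np.diff(np.eye(n), n=2, axis=0)

# With a normal (non-shrinkage) prior on the order-2 differences, the
# posterior mean of the latent trend solves a generalized ridge problem:
# the GMRF special case of the SPMRF construction. lam plays the role of
# the prior precision on the differences.
lam = 100.0
theta = np.linalg.solve(np.eye(n) + lam * D.T @ D, y)

# Linear trends lie in the null space of D, so they pass through unshrunk.
print(np.max(np.abs(theta - y)))
```

Replacing the normal prior on the differences with a heavy-tailed shrinkage prior such as the horseshoe is what lets the fitted trend adapt locally instead of being smoothed with one global penalty.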

    Fast DD-classification of functional data

    A fast nonparametric procedure for classifying functional data is introduced. It consists of a two-step transformation of the original data plus a classifier operating on a low-dimensional hypercube. The functional data are first mapped into a finite-dimensional location-slope space and then transformed by a multivariate depth function into the DD-plot, which is a subset of the unit hypercube. This transformation yields a new notion of depth for functional data. Three alternative depth functions are employed for this, as well as two rules for the final classification on [0,1]^q. The resulting classifier has to be cross-validated over a small range of parameters only, which is restricted by a Vapnik-Cervonenkis bound. The entire methodology does not involve smoothing techniques, is completely nonparametric, and achieves Bayes optimality under standard distributional settings. It is robust, efficiently computable, and has been implemented in an R environment. Applicability of the new approach is demonstrated by simulations as well as a benchmark study.
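A minimal sketch of the depth-transform step, assuming Mahalanobis depth (one simple choice of depth function) and the maximum-depth rule in place of the paper's classification rules. The data are synthetic stand-ins for curves already mapped to a 2-D location-slope space:

```python
import numpy as np

def mahalanobis_depth(x, sample):
    """Mahalanobis depth of point x w.r.t. a training sample (rows = points)."""
    mu = sample.mean(axis=0)
    cov_inv = np.linalg.inv(np.cov(sample, rowvar=False))
    d2 = (x - mu) @ cov_inv @ (x - mu)
    return 1.0 / (1.0 + d2)  # always in (0, 1]

rng = np.random.default_rng(0)
# Stand-ins for functional data mapped to location-slope coordinates.
class0 = rng.normal(loc=[0.0, 0.0], scale=0.5, size=(100, 2))
class1 = rng.normal(loc=[3.0, 3.0], scale=0.5, size=(100, 2))

x = np.array([0.1, -0.2])  # a new observation in location-slope space
# Its DD-plot coordinates: depth w.r.t. each class, a point in [0,1]^2.
dd = (mahalanobis_depth(x, class0), mahalanobis_depth(x, class1))
label = int(dd[1] > dd[0])  # maximum-depth rule, the simplest separator
print(dd, label)
```

The DD-plot reduces the classification problem to a low-dimensional one regardless of how complex the original curves are; only the final rule on the unit hypercube needs cross-validation.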

    Regularized Ordinal Regression and the ordinalNet R Package

    Regularization techniques such as the lasso (Tibshirani 1996) and elastic net (Zou and Hastie 2005) can be used to improve regression model coefficient estimation and prediction accuracy, as well as to perform variable selection. Ordinal regression models are widely used in applications where the use of regularization could be beneficial; however, these models are not included in many popular software packages for regularized regression. We propose a coordinate descent algorithm to fit a broad class of ordinal regression models with an elastic net penalty. Furthermore, we demonstrate that each model in this class generalizes to a more flexible form, for instance to accommodate unordered categorical data. We introduce an elastic net penalty class that applies to both model forms. Additionally, this penalty can be used to shrink a non-ordinal model toward its ordinal counterpart. Finally, we introduce the R package ordinalNet, which implements the algorithm for this model class.
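Coordinate descent for elastic net penalties is built around the soft-thresholding operator. A sketch of a single-coordinate update, assuming a standardized predictor; the function names and the exact update form are illustrative textbook versions, not the ordinalNet implementation:

```python
import numpy as np

def soft_threshold(z, gamma):
    """S(z, gamma) = sign(z) * max(|z| - gamma, 0), the lasso proximal map."""
    return np.sign(z) * np.maximum(np.abs(z) - gamma, 0.0)

def elastic_net_coord_update(z, lam, alpha):
    """One elastic net coordinate update for a standardized predictor:
    soft-threshold the partial-residual correlation z at alpha*lam (lasso
    part), then shrink by the ridge term (1 - alpha)*lam."""
    return soft_threshold(z, alpha * lam) / (1.0 + (1.0 - alpha) * lam)

print(soft_threshold(3.0, 1.0))   # 2.0
print(soft_threshold(-0.5, 1.0))  # zero: |z| is below the threshold
print(elastic_net_coord_update(3.0, 1.0, 0.5))
```

Cycling this update over coordinates until convergence is the core of coordinate descent; the thresholding step is what sets small coefficients exactly to zero and so performs variable selection.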

    Development of dynamic travel demand models for hurricane evacuation

    Little attention has been given to estimating dynamic travel demand in transportation planning in the past. However, when factors influencing travel are changing significantly over time, such as with an approaching hurricane, dynamic demand and the resulting variation in traffic flow on the network become important. In this study, dynamic travel demand models for hurricane evacuation were developed with two methodologies: survival analysis and sequential choice modeling. Using survival analysis, the time before evacuation from a pending hurricane is modeled, with those that do not evacuate treated as censored observations. A Cox proportional hazards regression model with time-dependent variables and a piecewise exponential model were estimated. In the sequential choice model, the decision to evacuate in the face of an oncoming hurricane is considered as a series of binary choices over time. A sequential logit model and a sequential complementary log-log model were developed. Each model is capable of predicting the probability of a household evacuating at each time period before hurricane landfall as a function of the household's socio-economic characteristics, the characteristics of the hurricane (such as distance to the storm), and policy decisions (such as the issuing of evacuation orders). Three datasets were used in this study: data from Southwest Louisiana collected following Hurricane Andrew, data from South Carolina collected following Hurricane Floyd, and stated preference survey data collected from the New Orleans area. Based on the analysis, the sequential logit model was found to be the best alternative for modeling dynamic travel demand for hurricane evacuation. The sequential logit model produces predictions which are superior to those of current evacuation participation rate models with response curves.
Transfer of the sequential logit model estimated on the Floyd data to the Andrew data demonstrated that the sequential logit model is capable of estimating dynamic travel demand, with reasonable accuracy, in a different environment than the one in which it was estimated. However, more study is required on the transferability of models of this type, as well as the development of procedures that would allow the updating of transferred model parameters to better reflect local evacuation behavior.
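The sequential logit structure described above, in which evacuation is a series of binary choices over time, can be sketched as follows. The per-period utilities are hypothetical placeholders for the covariate effects (storm distance, evacuation orders, household characteristics) the study estimates:

```python
import math

def sequential_logit_probs(utilities):
    """Turn per-period evacuation utilities x_t'beta into the probability of
    evacuating in each period. The hazard h_t = logistic(x_t'beta) is the
    chance of evacuating at t given no evacuation before t; the unconditional
    probability is h_t times the survival probability up to t."""
    probs, survive = [], 1.0
    for u in utilities:
        h = 1.0 / (1.0 + math.exp(-u))  # binary logit choice at period t
        probs.append(survive * h)
        survive *= 1.0 - h
    return probs, survive  # survive = probability of never evacuating

# Hypothetical utilities for four periods before landfall; rising values
# mimic a storm drawing closer and orders being issued.
probs, stay = sequential_logit_probs([-2.0, -1.0, 0.5, 1.5])
print(probs, stay)
```

By construction the per-period probabilities and the never-evacuate probability sum to one, which is what lets the model double as a dynamic demand curve over the time periods before landfall.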