The Logit Equilibrium: A Perspective on Intuitive Behavioral Anomalies
This paper considers a class of models in which rank-based payoffs are sensitive to small amounts of noise in decision making. Examples include auction, price-competition, coordination, and location games. Observed laboratory behavior in these games is often responsive to asymmetric costs associated with deviations from the Nash equilibrium. These payoff-asymmetry effects are incorporated in an approach that introduces noisy behavior via probabilistic choice. In equilibrium, behavior is characterized by a probability distribution that satisfies a "rational expectations" consistency condition: the beliefs that determine players' expected payoffs match the decision distributions that arise from applying a logit probabilistic choice function to those expected payoffs. We prove existence of a unique, symmetric logit (quantal response) equilibrium and derive comparative statics results. The paper provides a unified perspective on many recent laboratory studies of games in which Nash equilibrium predictions are inconsistent with both intuition and experimental evidence.
Keywords: logit equilibrium, quantal response equilibrium, probabilistic choice, auctions.
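To make the consistency condition concrete, here is a minimal numerical sketch: it computes a symmetric logit equilibrium for a small, hypothetical 2x2 game by iterating the fixed point p = softmax(lambda * U p). The payoff matrix, the precision parameter, and the plain fixed-point iteration are illustrative assumptions, not the paper's games or its proof technique.

```python
import numpy as np

# Hypothetical symmetric 2x2 game: U[i, j] is the payoff to playing action i
# when the opponent plays action j (illustrative numbers only).
U = np.array([[4.0, 0.0],
              [3.0, 3.0]])
lam = 2.0  # logit precision: 0 gives uniform play, large values approach Nash

def logit_choice(p, lam, U):
    """Logit choice probabilities given belief p about the opponent's mix."""
    expected = U @ p                               # expected payoff of each action
    z = np.exp(lam * (expected - expected.max()))  # numerically stable softmax
    return z / z.sum()

# Consistency condition of a symmetric logit equilibrium: the belief p must
# reproduce itself, p = logit_choice(p).  Solve by fixed-point iteration.
p = np.array([0.5, 0.5])
for _ in range(1000):
    p_next = logit_choice(p, lam, U)
    if np.max(np.abs(p_next - p)) < 1e-12:
        break
    p = p_next

print("symmetric logit equilibrium mix:", p)
```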
A tutorial on recursive models for analyzing and predicting path choice behavior
The problem at the heart of this tutorial consists in modeling the path choice behavior of network users. This problem has been extensively studied in transportation science, where it is known as the route choice problem. In this literature, individuals' path choices are typically predicted using discrete choice models. This article is a tutorial on a specific category of discrete choice models called recursive models, and it makes three main contributions. First, for the purpose of assisting future research on route choice, we provide a comprehensive background on the problem, linking it to different fields including inverse optimization and inverse reinforcement learning. Second, we formally introduce the problem and the recursive modeling idea, along with an overview of existing models, their properties, and applications. Third, we extensively analyze illustrative examples from different angles so that a novice reader can gain intuition on the problem and the advantages provided by recursive models in comparison to path-based ones.
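As a rough illustration of the recursive modelling idea, the sketch below computes node value functions on a tiny toy network via the logsum (Bellman-type) recursion used in recursive logit models, then derives arc-level choice probabilities from them. The network, arc utilities, and the plain value-iteration solver are illustrative assumptions, not the tutorial's own examples.

```python
import numpy as np

# Recursive idea: a path choice is decomposed into a sequence of arc choices,
# and the expected downstream utility V(k) at each node satisfies
#   V(k) = log sum_{a out of k} exp( v(a|k) + V(head(a)) ),   V(dest) = 0.
# Toy acyclic network; arcs are (tail, head, deterministic utility, e.g. -time).
arcs = [(0, 1, -1.0), (0, 2, -1.5), (1, 3, -1.2), (2, 3, -0.8), (1, 2, -0.3)]
n_nodes, dest = 4, 3

def value_functions(arcs, n_nodes, dest, n_iter=200):
    V = np.zeros(n_nodes)
    for _ in range(n_iter):                       # value iteration on the logsum
        V_new = np.full(n_nodes, -np.inf)
        V_new[dest] = 0.0
        for tail, head, v in arcs:
            V_new[tail] = np.logaddexp(V_new[tail], v + V[head])
        V = V_new
    return V

V = value_functions(arcs, n_nodes, dest)

def arc_probs(node):
    """Logit choice over outgoing arcs, scored by v(a|k) + V(head(a))."""
    out = [(head, v) for tail, head, v in arcs if tail == node]
    scores = np.array([v + V[head] for head, v in out])
    p = np.exp(scores - scores.max())
    p /= p.sum()
    return {head: prob for (head, _), prob in zip(out, p)}

print("value functions:", V)
print("arc choice probabilities at the origin:", arc_probs(0))
```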
Benefits of spatio-temporal modelling for short-term wind power forecasting at both individual and aggregated levels
The share of wind energy in total installed power capacity has grown rapidly in recent years around the world. Producing accurate and reliable forecasts of wind power production, together with a quantification of the uncertainty, is essential to optimally integrate wind energy into power systems. We build spatio-temporal models for wind power generation and obtain full probabilistic forecasts from 15 minutes to 5 hours ahead. A detailed analysis of forecast performance on individual wind farms and on aggregated wind power is provided. We show that it is possible to improve the results of forecasting aggregated wind power by utilizing spatio-temporal correlations among individual wind farms. Furthermore, spatio-temporal models have the advantage of being able to produce spatially out-of-sample forecasts. We evaluate the predictions on a data set from wind farms in western Denmark and compare the spatio-temporal model with an autoregressive model containing a common autoregressive parameter for all wind farms, identifying the specific cases in which it is important to have a spatio-temporal model instead of a purely temporal one. This case study demonstrates that it is possible to obtain fast and accurate forecasts of wind power generation not only at wind farms where data are available, but also for a larger portfolio including wind farms at new locations. The results and the methodologies are relevant for wind power forecasting across the globe as well as for spatio-temporal modelling in general.
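A hedged sketch of the kind of comparison described above: on simulated data (not the Danish data set used in the paper), a temporal AR(1) with one common coefficient for all farms is compared against a simple spatio-temporal alternative in which each farm's forecast also uses the lagged output of the other farms (a VAR(1)). The simulation and both model choices are illustrative assumptions only.

```python
import numpy as np

rng = np.random.default_rng(0)
T, n_farms = 2000, 5

# Simulate correlated wind power series with cross-farm lag-1 dependence.
A = 0.5 * np.eye(n_farms) + 0.08 * rng.random((n_farms, n_farms))
X = np.zeros((T, n_farms))
for t in range(1, T):
    X[t] = X[t - 1] @ A.T + rng.normal(scale=0.3, size=n_farms)

train, test = X[: T // 2], X[T // 2 :]

# Temporal benchmark: one common AR(1) coefficient, fitted by pooled least squares.
y, x = train[1:].ravel(), train[:-1].ravel()
phi = (x @ y) / (x @ x)
err_temporal = test[1:] - phi * test[:-1]

# Spatio-temporal model: full lag-1 coefficient matrix, so each farm's
# one-step forecast exploits correlations with the other farms.
B, *_ = np.linalg.lstsq(train[:-1], train[1:], rcond=None)
err_spatial = test[1:] - test[:-1] @ B

print("RMSE, common AR(1):   ", np.sqrt(np.mean(err_temporal ** 2)))
print("RMSE, spatio-temporal:", np.sqrt(np.mean(err_spatial ** 2)))
```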
Field-aware Calibration: A Simple and Empirically Strong Method for Reliable Probabilistic Predictions
It is often observed that the probabilistic predictions given by a machine learning model can disagree with averaged actual outcomes on specific subsets of data, an issue known as miscalibration. It is responsible for the unreliability of practical machine learning systems. For example, in online advertising, an ad can receive a click-through rate prediction of 0.1 over some population of users where its actual click rate is 0.15. In such cases, the probabilistic predictions have to be fixed before the system can be deployed.
In this paper, we first introduce a new evaluation metric named field-level calibration error that measures the bias in predictions over the sensitive input field that the decision-maker is concerned about. We show that existing post-hoc calibration methods provide only limited improvement on the new field-level metric and on other non-calibration metrics such as the AUC score. To address this, we propose Neural Calibration, a simple yet powerful post-hoc calibration method that learns to calibrate by making full use of the field-aware information over the validation set. We present extensive experiments on five large-scale datasets. The results show that Neural Calibration significantly improves over uncalibrated predictions in common metrics such as the negative log-likelihood, Brier score, and AUC, as well as in the proposed field-level calibration error.
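For intuition, the following sketch implements one plausible, size-weighted version of a field-level calibration error and evaluates it on the abstract's click-through-rate example. The exact definition in the paper may differ; the data, field values, and function name are illustrative assumptions.

```python
import numpy as np

def field_level_calibration_error(y_true, y_pred, field):
    """Size-weighted absolute bias of predictions within each value of a field."""
    y_true, y_pred, field = map(np.asarray, (y_true, y_pred, field))
    err, n = 0.0, len(y_true)
    for value in np.unique(field):
        mask = field == value
        err += mask.sum() / n * abs(y_pred[mask].mean() - y_true[mask].mean())
    return err

# The abstract's example: population "A" gets a predicted CTR of 0.10 but
# actually clicks at 0.15, while population "B" is roughly well calibrated.
rng = np.random.default_rng(0)
field = np.array(["A"] * 1000 + ["B"] * 1000)
y_pred = np.concatenate([np.full(1000, 0.10), np.full(1000, 0.20)])
y_true = np.concatenate([rng.binomial(1, 0.15, 1000), rng.binomial(1, 0.20, 1000)])

print("overall bias:              ", abs(y_pred.mean() - y_true.mean()))
print("field-level calibration err:", field_level_calibration_error(y_true, y_pred, field))
```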
Methodological and empirical challenges in modelling residential location choices
The modelling of residential locations is a key element in land use and transport planning. There are significant empirical and methodological challenges inherent in such modelling, however, despite recent advances both in the availability of spatial datasets and in computational and choice modelling techniques.
One of the most important of these challenges concerns spatial aggregation. The housing market is characterised by the fact that it offers spatially and functionally heterogeneous products; as a result, if residential alternatives are represented as aggregated spatial units (as in conventional residential location models), the variability of dwelling attributes is lost, which may limit the predictive ability and policy sensitivity of the model. This thesis presents a modelling framework for residential location choice that addresses three key challenges: (i) the development of models at the dwelling-unit level, (ii) the treatment of spatial structure effects in such dwelling-unit level models, and (iii) problems associated with estimation in such modelling frameworks in the absence of disaggregated dwelling unit supply data. The proposed framework is applied to the residential location choice context in London.
Another important challenge in the modelling of residential locations is the choice set formation problem. Most models of residential location choice have been developed based on the assumption that households consider all available alternatives when making location choices. Due to the high search costs associated with the housing market, however, and the limited capacity of households to process information, the validity of this assumption has been an ongoing debate among researchers. There have been some attempts in the literature to incorporate the cognitive capacities of households within discrete choice models of residential location: for instance, by modelling households' choice sets exogenously based on simplifying assumptions regarding their spatial search behaviour (e.g., an anchor-based search strategy) and their characteristics. By undertaking an empirical comparison of alternative models within the context of residential location choice in the Greater London area, this thesis investigates the feasibility and practicality of applying deterministic choice set formation approaches to capture the underlying search process of households. The thesis also investigates the uncertainty of choice sets in residential location choice modelling and proposes a simplified probabilistic choice set formation approach to model choice sets and choices simultaneously.
The dwelling-level modelling framework proposed in this research is practice-ready and can be used to estimate residential location choice models at the level of dwelling units without requiring independent and disaggregated dwelling supply data. The empirical comparison of alternative exogenous choice set formation approaches provides a guideline for modellers and land use planners to avoid inappropriate choice set formation approaches in practice. Finally, the proposed simplified choice set formation model can be applied to model the behaviour of households in online real estate environments.
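To illustrate the general probabilistic choice-set idea referred to above, the sketch below marginalises a logit choice over all possible consideration sets with independent inclusion probabilities (a Manski-style formulation). The utilities, inclusion probabilities, and brute-force enumeration are illustrative assumptions, not the thesis's simplified model.

```python
import numpy as np
from itertools import combinations

V = np.array([1.0, 0.5, 0.2, -0.3])  # systematic utilities of four alternatives (illustrative)
q = np.array([0.9, 0.6, 0.8, 0.4])   # probability that each alternative is considered
J = len(V)

def choice_prob(i):
    """P(i) = sum over consideration sets C of P(C) * logit probability of i within C."""
    total = 0.0
    for r in range(1, J + 1):
        for C in combinations(range(J), r):
            if i not in C:
                continue
            # Probability of this consideration set under independent inclusion.
            pC = np.prod([q[j] if j in C else 1 - q[j] for j in range(J)])
            expV = np.exp(V[list(C)])
            total += pC * np.exp(V[i]) / expV.sum()
    return total

probs = [choice_prob(i) for i in range(J)]
# Probabilities sum to 1 minus the probability of an empty consideration set.
print([round(p, 3) for p in probs], "sum =", round(sum(probs), 3))
```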
Analysing Three Basic Decisions of Tourists: Going Away, Going Abroad and Going on Tour
This study analyses determinant factors in the taking of three basic decisions on the part of holidaymakers: going on holiday, taking foreign holidays, and taking multi-destination holidays. We propose various research hypotheses relating to the impact on these decisions of various personal characteristics. The methodology consists of estimating various binomial logit models. The empirical application, carried out in Spain on a sample of 3,781 individuals, allows us to conclude that personal characteristics related to the chosen destination, personal restrictions, and socio-demographic and psychographic characteristics are determinants of these decisions.
Keywords: Tourism Marketing, Choice Behaviour, Probabilistic Choice Models.
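As a hedged illustration of the estimation approach, the sketch below fits a binomial logit for one of the three binary decisions on simulated data using statsmodels. The variable names, coefficients, and data are invented for the example and are not the Spanish sample analysed in the study.

```python
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(42)
n = 3781
income = rng.normal(0, 1, n)          # standardised income (illustrative characteristic)
age = rng.normal(0, 1, n)             # standardised age
has_children = rng.binomial(1, 0.4, n)

# "True" decision process, used only to generate outcomes for the example.
utility = 0.3 + 1.2 * income - 0.4 * age - 0.6 * has_children
goes_on_holiday = rng.binomial(1, 1 / (1 + np.exp(-utility)))

# Binomial logit of the holiday decision on personal characteristics.
X = sm.add_constant(np.column_stack([income, age, has_children]))
model = sm.Logit(goes_on_holiday, X).fit(disp=False)
print(model.params)  # estimated coefficients should recover the signs above
```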
A Review of Bankruptcy Prediction Studies: 1930-Present
One of the most well-known bankruptcy prediction models was developed by Altman [1968] using multivariate discriminant analysis. Since Altman's model, a multitude of bankruptcy prediction models have flooded the literature. The primary goal of this paper is to summarize and analyze existing research on bankruptcy prediction studies in order to facilitate more productive future research in this area. This paper traces the literature on bankruptcy prediction from the 1930s, when studies focused on the use of simple ratio analysis to predict future bankruptcy, to the present. The authors discuss how bankruptcy prediction studies have evolved, highlighting the different methods, the number and variety of factors, and the specific uses of models.
Analysis of 165 bankruptcy prediction studies published from 1965 to the present reveals trends in model development. For example, discriminant analysis was the primary method used to develop models in the 1960s and 1970s. Investigation of model type by decade shows that the primary method began to shift to logit analysis and neural networks in the 1980s and 1990s. The number of factors utilized in models is also analyzed by decade, showing that the average has varied over time but remains around 10 overall.
Analysis of the accuracy of the models suggests that multivariate discriminant analysis and neural networks are the most promising methods for bankruptcy prediction. The findings also suggest that higher model accuracy is not guaranteed with a greater number of factors: some models with two factors are just as capable of accurate prediction as models with 21 factors.
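For a concrete sense of the two methods the review finds most discussed, the sketch below fits multivariate discriminant analysis and a logit model to simulated financial ratios with scikit-learn and compares holdout accuracy. The synthetic ratios and the setup are illustrative assumptions, not Altman's variables or any study covered by the review.

```python
import numpy as np
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(1)
n = 2000
bankrupt = rng.binomial(1, 0.2, n)
# Two synthetic "ratios" (e.g. working capital / assets, EBIT / assets),
# shifted downward for firms that eventually fail.
ratios = rng.normal(0, 1, (n, 2)) - 1.2 * bankrupt[:, None]

X_tr, X_te, y_tr, y_te = train_test_split(ratios, bankrupt, random_state=0)

for name, model in [("discriminant analysis", LinearDiscriminantAnalysis()),
                    ("logit analysis", LogisticRegression())]:
    model.fit(X_tr, y_tr)
    print(name, "holdout accuracy:", round(model.score(X_te, y_te), 3))
```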
Non compliance in organic certification: determinants for Italy
Organic certification is based on controls on operators, which verify whether they comply with organic regulations. Control procedures are a transaction cost that affects the relative competitiveness of organic farming. Here we propose an analysis aimed at increasing efficiency in the identification of key risk factors in the organic certification process. The study refers to Italian organic farmers and represents an attempt to implement a risk-based inspection scheme grounded in a statistical approach.
