On ordinal utility, cardinal utility, and random utility
Though the Random Utility Model (RUM) was conceived entirely in terms of ordinal utility, the apparatus through which it is widely practised exhibits properties of cardinal utility. The adoption of cardinal utility as a working operation of ordinal utility is perfectly valid, provided interpretations drawn from that operation remain faithful to ordinal utility. The paper considers whether the latter requirement holds true for several measurements commonly derived from RUM. In particular, it is found that measurements of consumer surplus change may depart from ordinal utility and exploit the cardinality inherent in the practical apparatus.
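In the common multinomial-logit operationalization of RUM, the consumer-surplus change the abstract refers to is typically measured as a difference of "logsums" scaled by the marginal utility of income. A minimal sketch of that standard calculation follows; the utilities and the value of alpha are hypothetical illustration values, not taken from the paper.

```python
import math

def logsum(utilities):
    # Expected maximum utility in a multinomial logit model
    # (up to a constant): ln(sum_j exp(V_j)).
    return math.log(sum(math.exp(v) for v in utilities))

def delta_consumer_surplus(v_before, v_after, alpha):
    # Standard logit consumer-surplus change: difference of logsums
    # divided by alpha, the marginal utility of income. Note that this
    # quantity depends on utility differences, i.e. it leans on the
    # cardinal properties of the apparatus.
    return (logsum(v_after) - logsum(v_before)) / alpha

# Hypothetical systematic utilities for three alternatives, before and
# after a policy that improves alternative 0 by one util.
before = [1.0, 0.5, 0.0]
after = [2.0, 0.5, 0.0]
print(delta_consumer_surplus(before, after, alpha=1.0))
```

Because only one of three alternatives improves, the surplus change lies strictly between 0 and the one-util improvement.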
Equivalence and Stooge Strategies in Zero-Sum Games
Classes of two-person zero-sum games termed "equivalent games" are defined. These are games with identical value and identical optimal mixed strategies but with different matrix entries and thus different opportunities for exploiting a nonrational opponent. An experiment was conducted to investigate the strategy-choice behavior of subjects playing pairs of these "equivalent games." Also investigated was the extent to which subjects would exploit a programmed stooge as a function of the degree to which the stooge departed from his optimal strategy mix. The results indicated that subjects learned to exploit the nonrational play of the stooge opponent. The game factor, on the other hand, seemed to have no significant effect upon the strategy-choice behavior of the players. The implications of these results are discussed in light of questions raised by previous research on decision-making in 2 x 2 zero-sum games.
Peer Reviewed. http://deepblue.lib.umich.edu/bitstream/2027.42/67183/2/10.1177_002200277301700306.pd
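The notion of "equivalent games" (same value, same optimal mixed strategies, different entries) can be made concrete with a minimal solver for 2 x 2 zero-sum games. The two example games below are hypothetical, chosen only to exhibit the property; the abstract does not give the experiment's actual matrices.

```python
def solve_2x2_zero_sum(m):
    """Value and the row player's optimal mixed strategy for a 2x2
    zero-sum game; m = [[a, b], [c, d]] holds the row player's payoffs."""
    (a, b), (c, d) = m
    # Pure-strategy check: a saddle point is a row minimum that is
    # also a column maximum.
    for i, row in enumerate(m):
        for j, x in enumerate(row):
            if x == min(row) and x == max(m[0][j], m[1][j]):
                return x, [1.0 - i, float(i)]
    # Otherwise apply the classic mixed-strategy formulas.
    denom = a - b - c + d
    p = (d - c) / denom                  # probability of playing row 0
    value = (a * d - b * c) / denom
    return value, [p, 1 - p]

# Two "equivalent" games in the abstract's sense (hypothetical example):
# identical value and optimal mix, but different matrix entries.
g1 = [[1, -1], [-1, 1]]
g2 = [[2, -2], [-2, 2]]
print(solve_2x2_zero_sum(g1))  # (0.0, [0.5, 0.5])
print(solve_2x2_zero_sum(g2))  # (0.0, [0.5, 0.5])
```

Against an optimal opponent the two games are interchangeable, yet the larger entries of the second game magnify the payoff to exploiting a stooge who deviates from the optimal mix.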
An algorithm to discover the k-clique cover in networks
In social network analysis, a k-clique is a relaxed clique, i.e., a quasi-complete sub-graph: a sub-graph in which the distance between any two vertices is no greater than k. The visualization of a small number of vertices can be easily performed in a graph. However, when the number of vertices and edges increases, the visualization becomes incomprehensible. In this paper, we propose a new graph mining approach based on k-cliques. The concept of relaxed clique is extended to the whole graph, to achieve a general view, by covering the network with k-cliques. The sequence of k-clique covers is presented, combining small-world concepts with community structure components. Computational results and examples are presented.
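The distance-based definition above can be checked directly. The sketch below (plain Python, hypothetical toy graph) verifies whether a vertex set is a k-clique by computing pairwise shortest-path distances with breadth-first search; the cover algorithm itself is not reproduced here.

```python
from collections import deque

def bfs_distances(adj, source):
    # Shortest-path distances from source via breadth-first search.
    dist = {source: 0}
    queue = deque([source])
    while queue:
        u = queue.popleft()
        for v in adj[u]:
            if v not in dist:
                dist[v] = dist[u] + 1
                queue.append(v)
    return dist

def is_k_clique(adj, nodes, k):
    # A vertex set is a k-clique if every pair of its vertices lies
    # within distance k of each other.
    nodes = list(nodes)
    for i, u in enumerate(nodes):
        dist = bfs_distances(adj, u)
        for v in nodes[i + 1:]:
            if dist.get(v, float("inf")) > k:
                return False
    return True

# Toy path graph a-b-c-d: {a, b, c} is a 2-clique but not a 1-clique,
# since a and c are at distance 2.
adj = {"a": ["b"], "b": ["a", "c"], "c": ["b", "d"], "d": ["c"]}
print(is_k_clique(adj, {"a", "b", "c"}, 2))  # True
print(is_k_clique(adj, {"a", "b", "c"}, 1))  # False
```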
Stepwise disarmament and sudden destruction in a two-person game: a research tool
Peer Reviewed. http://deepblue.lib.umich.edu/bitstream/2027.42/66968/2/10.1177_002200276400800104.pd
Combining Interval and Probabilistic Uncertainty: What Is Computable?
In many practical problems, we need to process measurement results. For example, we need such data processing to predict future values of physical quantities. In these computations, it is important to take into account that measurement results are never absolutely exact: there is always measurement uncertainty, because of which the measurement results are, in general, somewhat different from the actual (unknown) values of the corresponding quantities. In some cases, all we know about measurement uncertainty is an upper bound; in this case, we have interval uncertainty, meaning that all we know about the actual value is that it belongs to a certain interval. In other cases, we have some information – usually partial – about the corresponding probability distribution. New data processing challenges appear all the time; in many of these cases, it is important to come up with appropriate algorithms for taking uncertainty into account.
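The interval case admits a very simple computational treatment: propagate the bounds themselves through each arithmetic operation. A minimal sketch of naive interval arithmetic follows (the measurement values are hypothetical; this illustrates only the basic idea, not the computability results the abstract is concerned with).

```python
class Interval:
    """Closed interval [lo, hi] for a value known only up to bounds."""

    def __init__(self, lo, hi):
        assert lo <= hi
        self.lo, self.hi = lo, hi

    def __add__(self, other):
        # [a, b] + [c, d] = [a + c, b + d]
        return Interval(self.lo + other.lo, self.hi + other.hi)

    def __mul__(self, other):
        # [a, b] * [c, d]: take extremes over all endpoint products,
        # which handles sign changes correctly.
        products = [self.lo * other.lo, self.lo * other.hi,
                    self.hi * other.lo, self.hi * other.hi]
        return Interval(min(products), max(products))

    def __repr__(self):
        return f"[{self.lo}, {self.hi}]"

# A quantity measured as 10 with error bound 1, combined with one
# measured as 2.5 with error bound 0.5 (hypothetical values):
x = Interval(9, 11)
y = Interval(2, 3)
print(x + y)  # [11, 14]
print(x * y)  # [18, 33]
```

Note the blow-up from [2, 3] to [18, 33] under multiplication: naive propagation is sound but can be pessimistic, which is one reason computability and tightness of such bounds is a nontrivial question.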
Nonstationary Stochastic Resonance in a Single Neuron-Like System
Stochastic resonance holds much promise for the detection of weak signals in
the presence of relatively loud noise. Following the discovery of nondynamical
and of aperiodic stochastic resonance, it was recently shown that the
phenomenon can manifest itself even in the presence of nonstationary signals.
This was found in a composite system of differentiated trigger mechanisms
mounted in parallel, which suggests that it could be realized in some
elementary neural networks or nonlinear electronic circuits. Here, we find that
even an individual trigger system may be able to detect weak nonstationary
signals using stochastic resonance. The very simple modification to the trigger
mechanism that makes this possible is reminiscent of some aspects of actual
neuron physics. Stochastic resonance may thus become relevant to more types of
biological or electronic systems injected with an ever broader class of
realistic signals.
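The basic trigger mechanism behind nondynamical stochastic resonance can be illustrated in a few lines: a subthreshold signal alone never fires the trigger, but added noise produces threshold crossings that track the signal. The sketch below uses hypothetical signal, threshold, and noise values, and shows only the plain trigger, not the paper's specific modification for nonstationary signals.

```python
import math
import random

def trigger_crossings(signal, noise_sd, threshold, seed=0):
    # Count upward threshold crossings of signal + Gaussian noise.
    rng = random.Random(seed)
    count = 0
    prev = signal[0] + rng.gauss(0, noise_sd)
    for s in signal[1:]:
        x = s + rng.gauss(0, noise_sd)
        if prev < threshold <= x:
            count += 1
        prev = x
    return count

# A weak periodic signal that never reaches the threshold on its own.
signal = [0.5 * math.sin(2 * math.pi * t / 100) for t in range(2000)]
threshold = 1.0

print(trigger_crossings(signal, 0.0, threshold))      # 0: no noise, no events
print(trigger_crossings(signal, 0.4, threshold) > 0)  # True: noise reveals it
```

With no noise the detector is silent; with moderate noise, crossings cluster near the signal peaks, which is the resonance effect. Too much noise would bury the signal again.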
Using clustering of rankings to explain brand preferences with personality and socio-demographic variables
The primary aim of market segmentation is to identify relevant groups of consumers that can be addressed efficiently by marketing or advertising campaigns. This paper addresses the issue of whether consumer groups can be identified from background variables that are not brand-related, and how much personality vs. socio-demographic variables contribute to the identification of consumer clusters. This is done by clustering aggregated preferences for 25 brands across 5 different product categories, and by relating socio-demographic and personality variables to the clusters using logistic regression and random forests over a range of different numbers of clusters. Results indicate that some personality variables contribute significantly to the identification of consumer groups in one sample. However, these results were not replicated on a second sample that was more heterogeneous in terms of socio-demographic characteristics and not representative of the brands' target audience.
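The clustering step of such a pipeline can be sketched on toy data. Below, plain k-means (a stand-in; the abstract does not name the clustering method) groups hypothetical brand-preference vectors for six consumers; relating covariates to the resulting labels via logistic regression or random forests would follow as a second step.

```python
def kmeans(points, k, iters=50):
    # Plain k-means on equal-length numeric vectors, with a simple
    # deterministic initialisation (evenly spaced points) so the toy
    # example is reproducible; k-means++ is the usual choice.
    centers = [list(points[i * len(points) // k]) for i in range(k)]
    labels = [0] * len(points)
    for _ in range(iters):
        # Assignment step: nearest center by squared Euclidean distance.
        labels = [min(range(k),
                      key=lambda c: sum((p[d] - centers[c][d]) ** 2
                                        for d in range(len(p))))
                  for p in points]
        # Update step: each center moves to the mean of its cluster.
        for c in range(k):
            members = [p for p, l in zip(points, labels) if l == c]
            if members:
                centers[c] = [sum(col) / len(members)
                              for col in zip(*members)]
    return labels, centers

# Hypothetical preferences of six consumers over four brands
# (higher = preferred), with two obvious taste groups.
prefs = [[5, 4, 1, 1], [5, 5, 2, 1], [4, 5, 1, 2],
         [1, 1, 5, 4], [2, 1, 4, 5], [1, 2, 5, 5]]
labels, _ = kmeans(prefs, k=2)
print(labels)  # [0, 0, 0, 1, 1, 1]
```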
Social Cohesion, Structural Holes, and a Tale of Two Measures
This is an author's accepted manuscript (deposited at arXiv as arXiv:1211.0719v2 [physics.soc-ph]), subsequently published in Journal of Statistical Physics, May 2013, Volume 151, Issue 3-4, pp. 745-764. The final publication is available at link.springer.com: http://link.springer.com/article/10.1007/s10955-013-0722-
An axiomatization of cumulative prospect theory
This paper presents a method for axiomatizing a variety of models for decision making under uncertainty, including Expected Utility and Cumulative Prospect Theory. This method identifies, for each model, the situations that permit consistent inferences about the ordering of value differences. Examples of rank-dependent and sign-dependent preference patterns are used to motivate the models and the tradeoff consistency axioms that characterize them. The major properties of the value function in Cumulative Prospect Theory—diminishing sensitivity and loss aversion—are contrasted with the principle of diminishing marginal utility that is commonly assumed in Expected Utility.
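Diminishing sensitivity and loss aversion are easy to see numerically in one common parameterization of the CPT value function, the Tversky-Kahneman (1992) two-part power form with their median parameter estimates. The paper itself axiomatizes the model rather than fixing a functional form, so the sketch below is illustrative only.

```python
def cpt_value(x, alpha=0.88, beta=0.88, lam=2.25):
    # Tversky-Kahneman (1992) value function: concave for gains,
    # convex and steeper (by the loss-aversion factor lam) for losses.
    if x >= 0:
        return x ** alpha
    return -lam * (-x) ** beta

# Diminishing sensitivity: the step from 100 to 200 is felt as less
# than the step from 0 to 100.
assert cpt_value(200) - cpt_value(100) < cpt_value(100) - cpt_value(0)

# Loss aversion: a loss looms larger than an equal-sized gain.
assert abs(cpt_value(-100)) > cpt_value(100)

print(cpt_value(100), cpt_value(-100))
```

Under Expected Utility with diminishing marginal utility, curvature alone would have to do the work of both properties; CPT separates them into the shape of the value function around the reference point and the loss-aversion coefficient.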