11,719 research outputs found

    Consumer Profile Identification and Allocation

    We propose an easy-to-use methodology for allocating new individuals to one of the groups previously built from a complete learning database. The learning database contains continuous and categorical variables for each individual. The groups (clusters) are built using only the continuous variables and are described with the help of the categorical ones. For new individuals, only the categorical variables are available, so a model is needed that computes the probability of belonging to each cluster using the categorical variables alone. This model provides a decision rule for assigning the new individuals and gives decision-makers an efficient tool, shown to work well, for example, for allocating customers to consumer clusters for marketing purposes. Comment: Accepted at the IWANN 07 conference, San Sebastian, June 2007
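
    A minimal sketch of the kind of two-stage pipeline described, assuming a scikit-learn workflow: the clustering method, the one-hot encoding, and the multinomial logistic model below are illustrative choices, not necessarily the authors' model.

        # Sketch (not the paper's exact model): cluster on continuous variables,
        # then learn P(cluster | categorical variables) to allocate new individuals.
        import numpy as np
        import pandas as pd
        from sklearn.cluster import KMeans
        from sklearn.linear_model import LogisticRegression
        from sklearn.preprocessing import OneHotEncoder, StandardScaler

        rng = np.random.default_rng(0)

        # Toy learning database with continuous and categorical variables.
        learning = pd.DataFrame({
            "income": rng.normal(50, 15, 300),
            "spend": rng.normal(20, 5, 300),
            "region": rng.choice(["north", "south", "west"], 300),
            "channel": rng.choice(["web", "store"], 300),
        })

        # Step 1: build clusters from the continuous variables only.
        X_cont = StandardScaler().fit_transform(learning[["income", "spend"]])
        clusters = KMeans(n_clusters=3, n_init=10, random_state=0).fit_predict(X_cont)

        # Step 2: model cluster membership from the categorical variables only.
        enc = OneHotEncoder(handle_unknown="ignore")
        X_cat = enc.fit_transform(learning[["region", "channel"]])
        clf = LogisticRegression(max_iter=1000).fit(X_cat, clusters)

        # Allocation rule for a new individual with only categorical data.
        new = pd.DataFrame({"region": ["south"], "channel": ["web"]})
        probs = clf.predict_proba(enc.transform(new))   # P(cluster | categories)
        print(probs, probs.argmax(axis=1))              # assign most probable cluster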

    An axiomatic approach to the measurement of envy

    We characterize a class of envy-as-inequity measures. There are three key axioms. Decomposability requires that overall envy is the sum of the envy within and between subgroups. The other two axioms deal with the two-individual setting and specify how the envy measure should react to simple changes in the individuals’ commodity bundles. The characterized class measures how much one individual envies another by the relative utility difference (using the envious individual’s utility function) between the bundle of the envied and the bundle of the envious, where the utility function used to represent the ordinal preferences is the ‘ray’ utility function. The class measures overall envy by the sum of these (transformed) relative utility differences. We discuss our results in the light of previous contributions to envy measurement and multidimensional inequality measurement.
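
    A rough reading of the measure described, in our own notation (the exact normalization and transform follow the paper's axioms): fix a reference bundle $r$ and take the ‘ray’ utility of individual $i$ at bundle $x$ to be the scalar $u_i(x) = \lambda$ such that $x \sim_i \lambda r$ (one common definition of ray utility). Writing $x_i$, $x_j$ for the bundles held, the pairwise and overall measures then take the form

        e(i,j) = \phi\!\left( \frac{u_i(x_j) - u_i(x_i)}{u_i(x_i)} \right), \qquad E = \sum_i \sum_{j \neq i} e(i,j),

    with $\phi$ an increasing transform applied to the relative utility difference.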

    Simplification and Saving

    The daunting complexity of important financial decisions can lead to procrastination. We evaluate a low-cost intervention that substantially simplifies the retirement savings plan participation decision. Individuals received an opportunity to enroll in a retirement savings plan at a pre-selected contribution rate and asset allocation, allowing them to collapse a multidimensional problem into a binary choice between the status quo and the pre-selected alternative. The intervention increases plan enrollment rates by 10 to 20 percentage points. We find that a similar intervention can be used to increase contribution rates among employees who are already participating in a savings plan.

    On the almost sure convergence of adaptive allocation procedures

    In this paper, we provide general convergence results for adaptive designs for treatment comparison, both in the absence and in the presence of covariates. In particular, we demonstrate the almost sure convergence of the treatment allocation proportion for a vast class of adaptive procedures, including designs that have not been formally investigated but have mainly been explored through simulations, such as Atkinson's optimum biased coin design, Pocock and Simon's minimization method, and some of its generalizations. Although the large majority of proposals in the literature rely on continuous allocation rules, our results allow us to prove, within a single mathematical framework, the convergence of adaptive allocation methods based on both continuous and discontinuous randomization functions. Several examples from earlier work are included to illustrate applicability, and our approach provides substantial insight for future proposals, especially in the absence of a prefixed target and for designs characterized by sequences of allocation rules. Comment: Published at http://dx.doi.org/10.3150/13-BEJ591 in the Bernoulli (http://isi.cbs.nl/bernoulli/) by the International Statistical Institute/Bernoulli Society (http://isi.cbs.nl/BS/bshome.htm)
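
    As a small numerical illustration of the kind of convergence discussed, the simulation below runs Efron's biased coin design, a classic discontinuous allocation rule; it is our own sketch and does not reproduce the paper's proofs or its target-based designs.

        # Efron's biased coin design: allocate with probability p to the
        # under-represented arm; the allocation proportion N_A(n)/n drifts
        # toward the balanced target 1/2.
        import numpy as np

        rng = np.random.default_rng(1)
        p = 2.0 / 3.0            # bias toward the under-represented arm
        n = 10_000
        n_A = 0
        props = []

        for k in range(1, n + 1):
            imbalance = n_A - (k - 1 - n_A)   # D = N_A - N_B so far
            if imbalance == 0:
                prob_A = 0.5
            elif imbalance < 0:
                prob_A = p                    # A under-represented
            else:
                prob_A = 1.0 - p              # A over-represented
            n_A += rng.random() < prob_A
            props.append(n_A / k)

        print(props[99], props[999], props[-1])   # proportion approaching 0.5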

    On the rate-distortion performance and computational efficiency of the Karhunen-Loeve transform for lossy data compression

    We examine the rate-distortion performance and computational complexity of linear transforms for lossy data compression. The goal is to better understand the performance/complexity tradeoffs associated with using the Karhunen-Loeve transform (KLT) and its fast approximations. Since the optimal transform for transform coding is unknown in general, we investigate the performance penalties associated with using the KLT by examining cases where the KLT fails, developing a new transform that corrects the KLT's failures in those examples, and then empirically testing the performance difference between this new transform and the KLT. Experiments demonstrate that while in the worst cases the KLT can yield transform coding performance at least 3 dB worse than that of alternative block transforms, the performance penalty associated with using the KLT on real data sets appears to be significantly smaller, giving at most a 0.5 dB difference in our experiments. The KLT and the fast variations studied here range in complexity from O(n^2) to O(n log n) for coding vectors of dimension n. We empirically investigate the rate-distortion performance tradeoffs associated with traversing this range of options. For example, an algorithm with complexity O(n^(3/2)) and memory O(n) gives a 0.4 dB performance loss relative to the full KLT in our image compression experiment.
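
    A self-contained numpy sketch of KLT-based transform coding on a synthetic correlated source, for orientation only: the source model, the crude coefficient truncation, and the distortion metric below are our assumptions, not the paper's experimental setup.

        # KLT sketch: estimate covariance, eigendecompose, transform, keep the
        # k largest-variance coefficients, reconstruct, and measure distortion.
        import numpy as np

        rng = np.random.default_rng(0)
        n, m = 16, 5000                # vector dimension, number of training vectors

        # Correlated Gaussian source with AR(1)-like covariance.
        rho = 0.9
        cov = rho ** np.abs(np.subtract.outer(np.arange(n), np.arange(n)))
        X = rng.multivariate_normal(np.zeros(n), cov, size=m)   # rows are vectors

        # KLT = eigenvectors of the estimated covariance (O(n^2) per vector to apply).
        C = np.cov(X, rowvar=False)
        eigvals, eigvecs = np.linalg.eigh(C)
        klt = eigvecs[:, ::-1]         # columns sorted by decreasing eigenvalue

        # Crude "compression": drop all but the k highest-variance coefficients.
        k = 4
        Y = X @ klt
        Y[:, k:] = 0.0
        X_hat = Y @ klt.T

        mse = np.mean((X - X_hat) ** 2)
        snr_db = 10 * np.log10(np.mean(X ** 2) / mse)
        print(f"SNR with {k}/{n} coefficients kept: {snr_db:.2f} dB")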

    Efficient Design with Interdependent Valuations

    We study efficient, Bayes-Nash incentive compatible mechanisms in a social choice setting that allows for informational and allocative externalities. We show that such mechanisms exist only if a congruence condition relating private and social rates of information substitution is satisfied. If signals are multidimensional, the congruence condition is determined by an integrability constraint, and it can hold only in non-generic cases such as the private value case or the symmetric case. If signals are one-dimensional, the congruence condition reduces to a monotonicity constraint and it can be generically satisfied. We apply the results to the study of multi-object auctions, and we discuss why such auctions cannot be reduced to one-dimensional models without loss of generality.
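
    A rough sketch of the environment and of the kind of condition described, in our own notation (the precise statement is in the paper): agent $i$ observes a signal $s_i$ and values alternative $k$ at $v_i(k, s_1, \dots, s_n)$, so valuations depend on others' signals (informational externalities) and every agent cares about the chosen alternative (allocative externalities). Efficiency requires implementing $k^*(s) \in \arg\max_k \sum_j v_j(k, s)$, and, loosely, the congruence condition asks that private and social rates of information substitution between alternatives $k$ and $l$ coincide at the relevant margins:

        \frac{\partial v_i(k, s) / \partial s_i}{\partial v_i(l, s) / \partial s_i}
        \;=\;
        \frac{\partial \sum_j v_j(k, s) / \partial s_i}{\partial \sum_j v_j(l, s) / \partial s_i}.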
