    Testing predictor contributions in sufficient dimension reduction

    We develop tests of the hypothesis of no effect for selected predictors in regression, without assuming a model for the conditional distribution of the response given the predictors. Predictor effects need not be limited to the mean function, and smoothing is not required. The general approach is based on sufficient dimension reduction, the idea being to replace the predictor vector with a lower-dimensional version without loss of information on the regression. Methodology using sliced inverse regression is developed in detail.
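
    As a rough illustration of the sliced inverse regression machinery this abstract builds on, the sketch below estimates dimension-reduction directions by eigendecomposing the between-slice covariance of the standardized predictors. The function name, slice count and other defaults are our own choices, not the paper's.

```python
import numpy as np

def sliced_inverse_regression(X, y, n_slices=10, d=1):
    """Estimate d directions spanning the dimension-reduction subspace via SIR."""
    n, p = X.shape

    # Standardize the predictors: Z = (X - mean) @ Sigma^{-1/2}
    Xc = X - X.mean(axis=0)
    Sigma = np.cov(Xc, rowvar=False)
    evals, evecs = np.linalg.eigh(Sigma)
    Sigma_inv_sqrt = evecs @ np.diag(evals ** -0.5) @ evecs.T
    Z = Xc @ Sigma_inv_sqrt

    # Slice the observations by the ordered response and average Z within slices
    slices = np.array_split(np.argsort(y), n_slices)
    M = np.zeros((p, p))
    for idx in slices:
        m_h = Z[idx].mean(axis=0)
        M += (len(idx) / n) * np.outer(m_h, m_h)   # weighted between-slice covariance

    # Leading eigenvectors of M, mapped back to the original predictor scale
    w, v = np.linalg.eigh(M)
    return Sigma_inv_sqrt @ v[:, np.argsort(w)[::-1][:d]]
```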

    Principal Fitted Components for Dimension Reduction in Regression

    We provide a remedy for two concerns that have dogged the use of principal components in regression: (i) principal components are computed from the predictors alone and do not make apparent use of the response, and (ii) principal components are not invariant or equivariant under full rank linear transformation of the predictors. The development begins with principal fitted components [Cook, R. D. (2007). Fisher lecture: Dimension reduction in regression (with discussion). Statist. Sci. 22 1--26] and uses normal models for the inverse regression of the predictors on the response to gain reductive information for the forward regression of interest. This approach includes methodology for testing hypotheses about the number of components and about conditional independencies among the predictors. Comment: Published at http://dx.doi.org/10.1214/08-STS275 in Statistical Science (http://www.imstat.org/sts/) by the Institute of Mathematical Statistics (http://www.imstat.org).
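
    A minimal sketch of principal fitted components under the simplest (isotropic-error) inverse-regression model may help fix ideas; the paper also covers structured error covariances and formal inference on the number of components, which this toy version omits.

```python
import numpy as np

def pfc_isotropic(X, y, d=1, degree=2):
    """Principal fitted components with a polynomial basis and isotropic errors.

    Regress X on basis functions f(y) and return the leading eigenvectors of
    the sample covariance of the fitted values; the reduced predictors are
    X @ Gamma_hat.
    """
    n, p = X.shape
    Xc = X - X.mean(axis=0)

    # Centered polynomial basis functions of the response (an illustrative choice)
    F = np.column_stack([y ** k for k in range(1, degree + 1)])
    F = F - F.mean(axis=0)

    # Fitted values from the multivariate inverse regression of X on f(y)
    B, *_ = np.linalg.lstsq(F, Xc, rcond=None)
    X_fit = F @ B

    # Covariance of the fitted values; its top eigenvectors estimate span(Gamma)
    Sigma_fit = X_fit.T @ X_fit / n
    w, v = np.linalg.eigh(Sigma_fit)
    return v[:, np.argsort(w)[::-1][:d]]
```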

    One Step at a Time: Does Gradualism Build Coordination?

    We study how gradualism -- increasing required levels (“thresholds”) of contributions slowly over time rather than requiring a high level of contribution immediately -- affects individuals’ decisions to contribute to a public project. Using a laboratory binary-choice minimum-effort coordination game, we randomly assign participants to three treatments: starting and continuing at a high threshold, starting at a low threshold but jumping to a high threshold after a few periods, and starting at a low threshold and gradually increasing the threshold over time (the “gradualism” treatment). We find that individuals in the gradualism treatment coordinate most successfully at the high threshold, relative to the other two groups. We propose a theory based on belief updating to explain why gradualism works. We also discuss alternative explanations such as reinforcement learning, conditional cooperation, inertia, preference for consistency, and limited attention. Our findings point to a simple, voluntary mechanism to promote successful coordination when the capacity to impose sanctions is limited. Keywords: Gradualism; Coordination; Cooperation; Public Goods; Belief-based Learning; Laboratory Experiment.
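
    To make the belief-updating intuition concrete, here is a stylized simulation of a binary-choice threshold game, with our own toy payoffs and updating rule rather than the authors' design: a gradually rising threshold sustains coordination, while an immediately high threshold does not.

```python
import numpy as np
from scipy.stats import binom

def simulate(thresholds, n_players=4, bonus=2.0, cost=1.0, q0=0.5, weight=0.5):
    """Binary-choice threshold game with naive belief updating (toy parameters).

    Each period a player contributes if the bonus, weighted by her belief that
    enough of the other players contribute, exceeds the cost; beliefs then move
    toward the contribution rate she observed among the others.
    """
    q = np.full(n_players, q0)              # belief that any other player contributes
    successes = []
    for k in thresholds:                    # k = contributors required this period
        # Probability that at least k-1 of the other n-1 players contribute
        p_enough = 1.0 - binom.cdf(k - 2, n_players - 1, q)
        contribute = p_enough * bonus > cost
        n_contrib = contribute.sum()
        successes.append(bool(n_contrib >= k))
        # Update beliefs toward the observed contribution rate among the others
        observed = (n_contrib - contribute) / (n_players - 1)
        q = (1 - weight) * q + weight * observed
    return successes

# A gradually rising threshold coordinates; starting high immediately does not.
print(simulate([1, 2, 3, 4, 4, 4]))   # all True
print(simulate([4, 4, 4, 4, 4, 4]))   # all False
```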

    Tacit Lobbying Agreements: An Experimental Study

    We experimentally study the common wisdom that money buys political influence. In the game, one lobbyist has the opportunity to influence redistributive tax policies in her favor by transferring money to two competing candidates. The success of the lobbying investment depends on whether the candidates are willing to respond and able to collude on low-tax policies that do not harm their relative chances in the elections. In the experiment, we find that lobbying is never successful when the lobbyist and candidates interact just once. By contrast, it yields substantially lower redistribution in about 40% of societies with finitely repeated encounters. However, lobbying investments are not always profitable, and profit-sharing between the lobbyist and candidates depends on prominent equity norms. Our experimental results shed new light on the complex process of buying political influence in everyday politics and help explain why relatively few corporate firms actually lobby. Keywords: lobbying, redistribution, elections, bargaining, collusion.

    Fisher Lecture: Dimension Reduction in Regression

    Beginning with a discussion of R. A. Fisher's early written remarks that relate to dimension reduction, this article revisits principal components as a reductive method in regression, develops several model-based extensions and ends with descriptions of general approaches to model-based and model-free dimension reduction in regression. It is argued that the role for principal components and related methodology may be broader than previously seen and that the common practice of conditioning on observed values of the predictors may unnecessarily limit the choice of regression methodology. Comment: This paper is commented on in [arXiv:0708.3776], [arXiv:0708.3777] and [arXiv:0708.3779]; rejoinder in [arXiv:0708.3781]. Published at http://dx.doi.org/10.1214/088342306000000682 in Statistical Science (http://www.imstat.org/sts/) by the Institute of Mathematical Statistics (http://www.imstat.org).
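
    For readers unfamiliar with principal components as a reductive device in regression, a minimal sketch of ordinary principal components regression (the baseline the article departs from, not its model-based or model-free extensions) is given below.

```python
import numpy as np

def pc_regression(X, y, d=2):
    """Ordinary principal components regression: regress y on the top d PCs of X."""
    Xc = X - X.mean(axis=0)

    # Component loadings = leading right singular vectors of the centered predictors
    _, _, Vt = np.linalg.svd(Xc, full_matrices=False)
    V = Vt[:d].T                                   # p x d loading matrix
    scores = Xc @ V                                # reduced predictors

    # Least-squares fit of the centered response on the component scores
    coef, *_ = np.linalg.lstsq(scores, y - y.mean(), rcond=None)
    beta = V @ coef                                # coefficients on the original scale
    intercept = y.mean() - X.mean(axis=0) @ beta
    return intercept, beta
```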

    Family Labor Supply and Aggregate Saving

    I study the impact of idiosyncratic risk on savings and employment in a small open economy populated by two-member families. Families incur a fixed cost of participation when both members are employed. Because of market incompleteness and information asymmetries, this cost coupled with labor market frictions can generate multiple equilibria. In particular, there might be one equilibrium with high employment and low saving and another with low employment and high saving. The model predicts that aggregate saving and employment rates are negatively correlated across countries. I present empirical evidence that supports the general equilibrium prediction of the model. Keywords: Saving; Employment; Family labor supply; Multiple equilibria.

    LDR: A Package for Likelihood-Based Sufficient Dimension Reduction

    We introduce a new MATLAB software package that implements several recently proposed likelihood-based methods for sufficient dimension reduction. Current capabilities include estimation of reduced subspaces with a fixed dimension d, as well as estimation of d by likelihood-ratio testing, permutation testing and information criteria. The methods are suitable for preprocessing data for both regression and classification. Implementations of related estimators are also available. Although the software is oriented primarily toward command-line operation, a graphical user interface is also provided for prototype computations.
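
    The sketch below illustrates one of the capabilities described, choosing the dimension d by an information criterion. The callables `fit_loglik` and `n_params` are hypothetical stand-ins for a likelihood-based fit and its parameter count; they do not correspond to the actual LDR interface.

```python
import numpy as np

def choose_dimension(X, y, fit_loglik, n_params, d_max):
    """Pick the reduction dimension d that minimizes BIC.

    `fit_loglik(X, y, d)` and `n_params(d)` are hypothetical stand-ins for a
    likelihood-based sufficient dimension reduction fit (maximized
    log-likelihood at dimension d) and its free-parameter count; they do not
    correspond to the actual LDR interface.
    """
    n = X.shape[0]
    bic = [-2.0 * fit_loglik(X, y, d) + np.log(n) * n_params(d)
           for d in range(d_max + 1)]
    return int(np.argmin(bic)), bic
```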

    Generalized asset pricing: Expected Downside Risk-Based Equilibrium Modelling

    We introduce an equilibrium asset pricing model built on the relationship between a novel risk measure, the Expected Downside Risk (EDR), and the expected return. On the one hand, our proposed risk measure uses a nonparametric approach, which allows us to dispense with any assumption on the distribution of returns. On the other hand, our asset pricing model is based on the loss-averse investors of Prospect Theory, through which we implement the risk-seeking behaviour of investors in a dynamic setting. By including EDR in the proposed model, unrealistic assumptions of commonly used equilibrium models - such as the exclusion of risk-seeking or price-maker investors and the assumption of unlimited leverage at a unique interest rate - can be dropped. We therefore argue that, based on more realistic assumptions, our model describes equilibrium expected returns with higher accuracy, which we also support with empirical evidence. Comment: 55 pages, 15 figures, 1 table, 3 appendices, Econ. Model. (2015).
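
    As a loose illustration of the model-free spirit of a downside risk measure, the sketch below computes a simple empirical shortfall statistic; the paper's EDR may be defined differently in detail, and the function here is our own construction.

```python
import numpy as np

def expected_downside(returns, threshold=0.0):
    """Empirical average shortfall below `threshold`, weighted by its frequency.

    A generic, distribution-free downside-risk statistic for illustration; it
    is not the paper's exact EDR definition.
    """
    r = np.asarray(returns, dtype=float)
    shortfalls = threshold - r[r < threshold]      # magnitude of each shortfall
    if shortfalls.size == 0:
        return 0.0
    return shortfalls.mean() * (shortfalls.size / r.size)

# Example with a short return series
print(expected_downside([0.01, -0.03, 0.02, -0.01, 0.005]))   # ~= 0.008
```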