
    Cross-classification analysis in the field of management and organization : comments on the DEL-technique

    The DEL-technique, a proportionate-reduction-in-error (PRE) measure developed by Hildebrand, Laing and Rosenthal, has been applied and portrayed as a promising prediction analysis technique for evaluating theory on the basis of cross-classification data, although it has been controversial since its introduction in the early 1970s. According to its critics, Goodman and Kruskal, the interpretation of DEL as a proportionate reduction in the error made when a prediction rule is known over when it is not cannot be sustained, because DEL is benchmarked against independence rather than ignorance. Even setting this criticism aside, the DEL-measure is easily misinterpreted as a measure of acceptance of the specified customized hypothesis as the only and best relationship between two categorical variables when the context for interpretation is not carefully stated in terms of the research paradigm adhered to: theory testing versus prediction logic. While its developers deserve credit for clearly addressing some of the methodological problems in prediction research, an alternative proportionate-reduction-in-error measure may generate unequivocally interpretable results and outperform the DEL-technique.
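The DEL-measure discussed above can be sketched in a few lines: given a cross-classification table and the set of cells a prediction rule forbids ("error cells"), DEL is one minus the ratio of observed errors to errors expected under independence. The table and prediction rule below are hypothetical, for illustration only.

```python
# Minimal sketch of the Hildebrand-Laing-Rosenthal DEL measure.
# The table and the prediction rule are hypothetical examples.
import numpy as np

def del_measure(table, error_cells):
    """DEL = 1 - (observed errors) / (errors expected under independence).

    table       : 2-D array of observed cell counts
    error_cells : set of (row, col) cells the prediction rule forbids
    """
    table = np.asarray(table, dtype=float)
    n = table.sum()
    row = table.sum(axis=1)
    col = table.sum(axis=0)
    observed = sum(table[i, j] for i, j in error_cells)
    expected = sum(row[i] * col[j] / n for i, j in error_cells)
    return 1.0 - observed / expected

# Hypothetical 2x2 table; the rule predicts the off-diagonal cells are empty.
table = [[40, 10],
         [5, 45]]
rule_errors = {(0, 1), (1, 0)}
d = del_measure(table, rule_errors)  # -> 0.7 for this table
```

The benchmark in the denominator is exactly the point of the Goodman-Kruskal criticism: it is the error rate under statistical independence, not under complete ignorance of the table.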

    Caveats for using statistical significance tests in research assessments

    This paper raises concerns about the advantages of using statistical significance tests in research assessments, as has recently been suggested in the debate about proper normalization procedures for citation indicators. Statistical significance tests are highly controversial and numerous criticisms have been leveled against their use. Based on examples from articles by proponents of the use of statistical significance tests in research assessments, we address some of the numerous problems with such tests. The issues specifically discussed are the ritual practice of such tests, their dichotomous application in decision making, the difference between statistical and substantive significance, the implausibility of most null hypotheses, the crucial assumption of randomness, as well as the utility of standard errors and confidence intervals for inferential purposes. We argue that applying statistical significance tests and mechanically adhering to their results is highly problematic and detrimental to critical thinking. We claim that the use of such tests does not provide any advantages in relation to citation indicators, interpretations of them, or the decision-making processes based upon them. On the contrary, their use may be harmful. Like many other critics, we generally believe that statistical significance tests are over- and misused in the social sciences, including scientometrics, and we encourage reform on these matters. Comment: Accepted version for the Journal of Informetrics.
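The distinction between statistical and substantive significance that the paper stresses can be demonstrated with standard-library code: a substantively trivial difference in mean citation rates becomes "statistically significant" once the sample is large enough. All numbers below are hypothetical.

```python
# Statistical vs. substantive significance: the same trivial difference
# in means crosses the p < .05 threshold purely because n grows.
# All figures are hypothetical illustrations, not data from the paper.
import math

def two_sample_z(mean1, mean2, sd, n):
    """Two-sided p-value for a difference in means (equal sd and equal n)."""
    se = sd * math.sqrt(2.0 / n)
    z = (mean1 - mean2) / se
    return math.erfc(abs(z) / math.sqrt(2.0))  # 2 * P(Z > |z|)

# A difference of 0.1 citations per paper against a standard deviation of 5:
p_small_n = two_sample_z(10.1, 10.0, sd=5.0, n=100)      # far from "significant"
p_large_n = two_sample_z(10.1, 10.0, sd=5.0, n=100_000)  # highly "significant"
```

The effect is identical in both calls; only the sample size changes, which is why a mechanical p < .05 rule says nothing about whether the difference matters.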

    Dimensionality reduction methods for contingency tables with ordinal variables

    Correspondence analysis is a widely used tool for obtaining a graphical representation of the interdependence between the rows and columns of a contingency table by means of a dimensionality reduction of the spaces. The maximum information regarding the association between the two categorical variables is then visualized, allowing one to understand its nature. Several extensions of this method directly take into account the possible ordinal structure of the variables by using different dimensionality reduction tools. The aim of this paper is to present a unified theoretical framework for several methods of correspondence analysis with ordinal variables.
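The core dimensionality reduction of plain (non-ordinal) correspondence analysis is an SVD of the matrix of standardized residuals; the ordinal extensions the paper surveys replace this decomposition with others. A minimal sketch, on a hypothetical contingency table:

```python
# Plain correspondence analysis via SVD of standardized residuals.
# The 3x3 contingency table is a hypothetical example.
import numpy as np

N = np.array([[20, 10, 5],
              [10, 15, 10],
              [5, 10, 20]], dtype=float)
P = N / N.sum()                                      # correspondence matrix
r = P.sum(axis=1)                                    # row masses
c = P.sum(axis=0)                                    # column masses
S = (P - np.outer(r, c)) / np.sqrt(np.outer(r, c))   # standardized residuals
U, sigma, Vt = np.linalg.svd(S, full_matrices=False)

row_coords = U * sigma / np.sqrt(r)[:, None]         # principal row coordinates
inertia = (sigma ** 2).sum()                         # total inertia = chi-square / n
```

Plotting the first two columns of the row (and analogous column) coordinates gives the usual biplot; the squared singular values show how much association each dimension captures.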

    The Correlation Between Husband's and Wife's Education: Canada, 1971-1996

    We present a measure of the correlation between the education levels of spouses based on a bivariate ordered probit model. The change in this correlation over time can be measured while controlling for the large changes in educational attainment levels. The model is estimated with data from 20 Surveys of Consumer Finances in Canada over 1971-1996. Our main findings are a reduction in this correlation among younger couples beginning in the 1980s, and an inverted U-shaped effect of the spouses' age difference on the correlation, with the maximum correlation occurring approximately when the spouses' ages are equal. Keywords: correlation; education level; bivariate ordered probit model; SCF
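The motivation for the ordered-probit view can be sketched on synthetic data (this is an illustration of the latent-variable setup, not the paper's estimator or data): spouses' education levels are modeled as a latent bivariate normal pair cut at thresholds, and the naive correlation of the observed category codes understates the latent correlation.

```python
# Why model ordinal education with a latent bivariate normal: coarsening
# into categories attenuates the naive correlation of the category codes.
# Data, thresholds, and the latent correlation are hypothetical.
import numpy as np

rng = np.random.default_rng(0)
rho = 0.6                                # assumed latent correlation
n = 50_000
cov = [[1.0, rho], [rho, 1.0]]
latent = rng.multivariate_normal([0.0, 0.0], cov, size=n)

cuts = [-0.5, 0.5, 1.2]                  # thresholds -> 4 education levels
husband = np.digitize(latent[:, 0], cuts)
wife = np.digitize(latent[:, 1], cuts)

naive_corr = np.corrcoef(husband, wife)[0, 1]  # attenuated relative to rho
```

A bivariate ordered probit estimates rho and the thresholds jointly, so the correlation measure is not distorted when marginal attainment distributions (the thresholds) shift over time.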

    Organizing effects of testosterone and economic behavior: not just risk taking

    Recent literature emphasizes the role that testosterone (T), as well as markers indicating early exposure to T and its organizing effect on the brain (such as the ratio of the second to the fourth finger, D2:D4), plays in performance in financial markets. These results may suggest that the main effect of T, either circulating or in fetal exposure, on economic behavior occurs through an increased willingness to take risks. However, these findings indicate that traders with a low digit ratio are not only more profitable but also more able to survive in the long run, so the effect might consist of more than just lower risk aversion. In addition, recent literature suggests a positive correlation between abstract reasoning ability and higher willingness to take risks. To test these two hypotheses about the effect of testosterone on performance in financial activities (an effect on risk attitude versus a complex effect involving both risk attitude and reasoning ability), we gather data on the three variables in a sample of 188 ethnically homogeneous college students (Caucasians). We measure the D2:D4 digit ratio, abstract reasoning ability with the Raven Progressive Matrices task, and risk attitude through choices among lotteries. Low digit ratio in men is associated with higher risk taking and higher scores in abstract reasoning ability when a combined measure of risk aversion over different tasks is used. This explains both the higher performance and higher survival rate observed in traders, as well as the observed correlation between abstract reasoning ability and risk taking. We also analyze how much of the total effect of digit ratio on risk attitude is direct, and how much is mediated. Mediation analysis shows that a substantial part of the effect of T on attitude to risk is mediated by abstract reasoning ability.
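The direct-versus-mediated decomposition in the last step can be sketched with the standard product-of-coefficients approach on synthetic data (variable names, effect sizes, and data below are hypothetical, not the paper's): regress the mediator on the treatment (a-path), the outcome on both (b-path and direct effect), and compare with the total effect.

```python
# Product-of-coefficients mediation sketch on synthetic data:
# digit ratio -> abstract reasoning -> risk taking, plus a direct path.
# All coefficients and distributions are hypothetical illustrations.
import numpy as np

rng = np.random.default_rng(1)
n = 2_000
digit_ratio = rng.normal(0.95, 0.03, n)
reasoning = 10.0 - 5.0 * digit_ratio + rng.normal(0.0, 0.5, n)       # a-path
risk = 2.0 * reasoning - 3.0 * digit_ratio + rng.normal(0.0, 0.5, n)

def ols(y, *xs):
    """Least-squares coefficients [intercept, slopes...]."""
    X = np.column_stack([np.ones_like(y)] + list(xs))
    return np.linalg.lstsq(X, y, rcond=None)[0]

c_total = ols(risk, digit_ratio)[1]                  # total effect
a = ols(reasoning, digit_ratio)[1]                   # effect on mediator
c_direct, b = ols(risk, digit_ratio, reasoning)[1:]  # direct effect, b-path
indirect = a * b                                     # mediated effect
```

For linear OLS with an intercept, the decomposition is exact: the total effect equals the direct effect plus the product of the a- and b-paths.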

    Nonparametric partial correlation


    Nonparametric Identification of Multivariate Mixtures

    This article analyzes the identifiability of k-variate, M-component finite mixture models in which each component distribution has independent marginals, including models in latent class analysis. Without making parametric assumptions on the component distributions, we investigate how one can identify the number of components and the component distributions from the distribution function of the observed data. We reveal an important link between the number of variables (k), the number of values each variable can take, and the number of identifiable components. A lower bound on the number of components (M) is nonparametrically identifiable if k >= 2, and the maximum identifiable number of components is determined by the number of different values each variable takes. When M is known, the mixing proportions and the component distributions are nonparametrically identified from matrices constructed from the distribution function of the data if (i) k >= 3, (ii) two of the k variables take at least M different values, and (iii) these matrices satisfy some rank and eigenvalue conditions. For the unknown-M case, we propose an algorithm that possibly identifies M and the component distributions from data. We discuss a condition for nonparametric identification and its observable implications. In case M cannot be identified, we use our identification condition to develop a procedure that consistently estimates a lower bound on the number of components by estimating the rank of a matrix constructed from the distribution function of observed variables. Keywords: finite mixture, latent class analysis, latent class model, model selection, number of components, rank estimation
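The rank idea behind the lower bound can be made concrete: for a mixture of M components with independent marginals, the joint probability matrix of two discrete variables factors as A diag(pi) B^T, so its rank is at most M. A sketch with hypothetical component distributions:

```python
# Rank of the joint pmf matrix lower-bounds the number of mixture
# components.  The mixing proportions and component distributions below
# are hypothetical examples.
import numpy as np

pi = np.array([0.4, 0.6])            # mixing proportions, M = 2
A = np.array([[0.7, 0.1],            # P(X1 = x | component), 4 values of X1
              [0.1, 0.2],
              [0.1, 0.3],
              [0.1, 0.4]])
B = np.array([[0.2, 0.5],            # P(X2 = x | component), 3 values of X2
              [0.3, 0.3],
              [0.5, 0.2]])

P = A @ np.diag(pi) @ B.T            # joint pmf of (X1, X2): rank <= M
rank = np.linalg.matrix_rank(P)
```

Estimating the rank of the empirical counterpart of P is exactly the procedure the abstract describes for bounding M when it cannot be fully identified.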

    Small Cities Blues: Looking for Growth Factors in Small and Medium-Sized Cities

    The purpose of this exploratory study is to attempt to identify particular public policies which have the potential to increase the economic viability of smaller metropolitan areas and cities. We identify characteristics associated with smaller metro areas that performed better-than-expected (winners) and worse-than-expected (losers) during the 1990s, given their resources, industrial mix, and location as of 1990. Once these characteristics have been identified, we look for evidence that public policy choices may have promoted and enhanced a metro area's ability to succeed and to regain control of its own economic destiny. Methodologically, we construct a regression model which identifies the small metro areas that achieved higher-than-expected economic prosperity (winners) and the areas that saw lower-than-expected economic prosperity (losers) according to the model. Next, we explore whether indications exist that winners and losers are qualitatively different from other areas in ways that may indicate consequences of policy choices. A cluster analysis is completed to group the metro areas based on changes in a host of social, economic, and demographic variables between 1990 and 2000. We then use contingency table analysis and ANOVA to see if "winning" or "losing," as measured by the error term from the regression, is related to the grouping of metro areas in a way that may indicate the presence of deliberate and replicable government policy. Keywords: economic, development, growth, factors, erickcek, mckinney, incentives, local, regional, small, medium, cities
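The pipeline in the methodological paragraph can be sketched end-to-end on synthetic data (all variables, effect sizes, and thresholds below are hypothetical, not the study's data): regress growth on baseline covariates, label large positive and negative residuals as winners and losers, and test the association between that label and cluster membership with a chi-square statistic on the contingency table.

```python
# Winners/losers from regression residuals, cross-tabulated against
# cluster membership.  All data are synthetic, hypothetical examples.
import numpy as np

rng = np.random.default_rng(2)
n = 300
x = rng.normal(size=(n, 3))                 # stand-in for 1990 covariates
cluster = rng.integers(0, 3, n)             # cluster labels from a prior step
growth = x @ [0.5, -0.3, 0.2] + 0.8 * (cluster == 0) + rng.normal(0, 1, n)

# Regression of growth on covariates; residuals define winners and losers.
X = np.column_stack([np.ones(n), x])
beta = np.linalg.lstsq(X, growth, rcond=None)[0]
resid = growth - X @ beta
status = np.digitize(resid, [-1.0, 1.0])    # 0 = loser, 1 = middle, 2 = winner

# Contingency table of status vs. cluster, and the chi-square statistic.
table = np.zeros((3, 3))
for s, c in zip(status, cluster):
    table[s, c] += 1
expected = np.outer(table.sum(axis=1), table.sum(axis=0)) / table.sum()
chi2 = ((table - expected) ** 2 / expected).sum()
```

A large chi-square (relative to its df = 4 reference distribution) would indicate that winner/loser status is not independent of the cluster grouping, which is the pattern the study reads as a possible footprint of policy choices.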