
    MNC staffing policies for the managing director position in foreign subsidiaries: the results of an innovative research method

    This research note draws attention to a harmful consequence of the serious lack of empirical research in the field of International Human Resource Management: myth-building on the basis of one or two publications. The apparent myth of high expatriate failure rates is briefly discussed. To prevent another myth from emerging, this time in the field of staffing policies, this research note provides an empirical test of the framework proposed by Meredith Downes (1996) for making decisions about staffing foreign subsidiaries. The propositions put forward by Downes are tested using a database of nearly 1,800 subsidiaries located in twenty-two different countries. The headquarters of these subsidiaries are located in nine different countries and operate in eight different industries. Although the variables suggested by Downes have fair explanatory power, some of the specific propositions had to be rejected.

    Why replication studies are essential: learning from failure and success

    Van Witteloostuijn’s (2016) commentary “What happened to Popperian Falsification?” is an excellent summary of the many problems that plague research in the (Social) Sciences in general and (International) Business & Management in particular. As van Witteloostuijn (2016) admits, his “[...] diagnosis is anything but new – quite the contrary”, nor is it applicable only to the Social Sciences. When preparing this note, I was reminded of Cargo Cult Science, a 1974 Caltech commencement address by physicist Richard Feynman (Feynman, 1974), which – more than four decades ago – made many of the same points, including the pervasive problem of a lack of replication studies, which will be the topic of this short rejoinder. Conducting replication studies is more difficult in International Business (IB) than it is in many other disciplines. For instance, in Psychology – a discipline that favours experimental research – one might be able to replicate a particular study within weeks or, in some cases, even days. In IB, however, data collection is typically very time-consuming and fraught with many problems not encountered in purely domestic research (for a summary see Harzing, Reiche & Pudelko, 2013). Moreover, most journals in our field only publish articles with novel research findings and a strong theoretical contribution, and are thus not open to replication studies. To date, most studies in IB are therefore unique and are never replicated. This is regrettable because, even though difficult, replication is even more essential in IB than in domestic studies: differences in cultural and institutional environments might limit generalization from studies conducted in a single home or host country. Somehow, though, pleas for replication studies – however well articulated and however often repeated – seem to be falling on deaf ears. Academics are only human, and many humans learn best from personal stories and examples, especially if they evoke vivid emotions or associations. Hence, in this note, instead of providing yet another essayistic plea for replication, I will attempt to argue “by example”. I present two short case studies from my own research: one in which the lack of replication resulted in the creation of myths, and another in which judicious replication strengthened the arguments for a new – less biased – measure of research performance. Finally, I will provide a recommendation on how to move forward that can be implemented immediately, without the need for a complete overhaul of our current system of research dissemination.

    What, who or where? Rejoinder to "Identifying research topic development in Business and Management education research using legitimation code theory"

    Arbaugh, Fornaciari and Hwang (2016) use citation analysis – with Google Scholar as their source of citation data – to track the development of Business and Management Education research by studying the field’s 100 most highly cited articles. In their article, the authors distinguish several factors that might influence an article’s level of citations: the topic it addresses, the profile of the author(s) who wrote it, and the prominence of the journal in which it is published. Although these three factors might seem rather intuitive, and the authors are certainly not the first to identify them, there is a surprising dearth of studies in the bibliometrics literature that attempt to disentangle the relative impact of these factors on citation outcomes. Yet this question is of considerable relevance in the context of academic evaluation. If citation levels of individual articles are determined more by what is published (topic) and who publishes it (author) than by where it is published (journal), this would provide clear evidence that the frequently used practice of employing the ISI journal impact factor to evaluate individual articles or authors is inappropriate.
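
    The what/who/where question lends itself to a simple illustration. The sketch below is a hypothetical, minimal Python example using entirely synthetic data: it regresses (log) citation counts on invented proxies for topic, author profile, and journal prominence to compare their relative explanatory power. It is not the method used by Arbaugh, Fornaciari and Hwang (2016) or in the rejoinder; all variable names and effect sizes are assumptions made for illustration.

```python
# Hypothetical sketch with synthetic data: how much of the variance in citation
# counts is explained by topic (what), author profile (who), and journal
# prominence (where)? Illustration only, not the original study's analysis.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(42)
n = 300

# Synthetic article-level data; all variables and effect sizes are invented.
articles = pd.DataFrame({
    "topic": rng.choice(["teaching_methods", "careers", "technology"], size=n),
    "author_h_index": rng.integers(1, 40, size=n),   # proxy for author profile
    "journal_jif": rng.uniform(0.5, 6.0, size=n),    # proxy for journal prominence
})
articles["citations"] = rng.poisson(
    lam=np.exp(
        1.0
        + 0.4 * (articles["topic"] == "technology")
        + 0.03 * articles["author_h_index"]
        + 0.05 * articles["journal_jif"]
    )
)

# Log-linear model comparing the explanatory power of what, who, and where.
results = smf.ols(
    "np.log1p(citations) ~ C(topic) + author_h_index + journal_jif",
    data=articles,
).fit()
print(results.summary())
```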

    Two new kids on the block: How do Crossref and Dimensions compare with Google Scholar, Microsoft Academic, Scopus and the Web of Science?

    In the last three years, several new (free) sources of academic publication and citation data have joined the now well-established Google Scholar, complementing the two traditional commercial data sources: Scopus and the Web of Science. The most important of these new data sources are Microsoft Academic (2016), Crossref (2017) and Dimensions (2018). Whereas Microsoft Academic has received some attention from the bibliometric community, there are as yet very few studies that have investigated the coverage of Crossref or Dimensions. To address this gap, this brief letter assesses the coverage of Crossref and Dimensions in comparison with Google Scholar, Microsoft Academic, Scopus and the Web of Science, through a detailed investigation of the full publication and citation record of a single academic as well as six top journals in Business & Economics. Overall, this first small-scale study suggests that, when compared to Scopus and the Web of Science, Crossref and Dimensions have similar or better coverage for both publications and citations, but substantially lower coverage than Google Scholar and Microsoft Academic. If our findings can be confirmed by larger-scale studies, Crossref and Dimensions might serve as good alternatives to Scopus and the Web of Science for both literature reviews and citation analysis. However, Google Scholar and Microsoft Academic maintain their position as the most comprehensive free sources of publication and citation data.
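
    As a small illustration of how such coverage checks can be automated, the sketch below queries Crossref's public REST API (api.crossref.org) for a list of DOIs and reports whether each record is indexed and how many citations Crossref records for it (the "is-referenced-by-count" field). The DOIs shown are placeholders, and the snippet is an assumption-laden sketch rather than the procedure used in the letter itself.

```python
# Minimal sketch: check Crossref coverage and citation counts for a list of
# DOIs via the public Crossref REST API. The DOIs below are placeholders,
# not real publications; replace them with an actual publication list.
from typing import Optional

import requests

CROSSREF_WORKS = "https://api.crossref.org/works/"

# Placeholder DOIs (hypothetical examples)
DOIS = [
    "10.1000/example.2018.001",
    "10.1000/example.2018.002",
]


def crossref_record(doi: str) -> Optional[dict]:
    """Return the Crossref metadata record for a DOI, or None if not indexed."""
    response = requests.get(CROSSREF_WORKS + doi, timeout=30)
    if response.status_code == 404:  # DOI not covered by Crossref
        return None
    response.raise_for_status()
    return response.json()["message"]


if __name__ == "__main__":
    for doi in DOIS:
        record = crossref_record(doi)
        if record is None:
            print(f"{doi}: not found in Crossref")
        else:
            citations = record.get("is-referenced-by-count", 0)
            print(f"{doi}: {citations} citations recorded in Crossref")
```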

    Internal vs. external promotion, part two: seven advantages of internal promotion, plus some general tips for both

    In the second and final part of a series considering the relative merits of pursuing internal or external promotion, Anne-Wil Harzing sets out why seeking advancement internally might be the more attractive option, again highlighting seven specific reasons. The series concludes with some more general tips for promotion applications, including how to harness your experience of submitting to academic journals.

    Disambiguating impact

    Outside of specific institutional and organizational settings, discussions about ‘impact’ often descend into confusion, as the term holds several meanings for researchers. Responding to a provocation from Ziyad Marar, “On Measuring Social Science Impact”, and reflecting on different ways of understanding impact, Anne-Wil Harzing provides a guide to the different aspects of impact for academics and a set of tools for assessing impact in different ways.