
    Author Responsibilities in Improving the Quality of Peer Reviews: A Rejoinder to Iivari (2016)

    In this rejoinder to Iivari (2016), I discuss authors’ responsibilities in ensuring quality reviews. I argue that one overlooked element in quality peer reviewing is authors’ unconstrained right to submit manuscripts in whatever form or quality they desire. Accordingly, I suggest adding some constraints on authors and offering more freedom to reviewers to maintain the viability of the scholarly publication system. I offer three responses to Iivari’s suggestions and add two further suggestions for change.

    Information Systems Research Themes: A Seventeen-year Data-driven Temporal Analysis

    Extending the research on our discipline’s identity, we examine how the major research themes have evolved in four top IS journals: Management Information Systems Quarterly (MISQ), Information Systems Research (ISR), Journal of the Association for Information Systems (JAIS), and Journal of Management Information Systems (JMIS). In doing so, we first answer Palvia, Daneshvar Kakhki, Ghoshal, Uppala, and Wang’s (2015) call for continuous updates on IS research trends, given the discipline’s dynamism. Second, building on Sidorova, Evangelopoulos, Valacich, and Ramakrishnan (2008), we examine temporal trends in prominent research streams over the last 17 years. We show that, as IS research evolves, certain themes stand the test of time while others peak and trough. More importantly, our analysis identifies newly emergent themes that have begun to gain prominence in the IS research community. Further, we break down our findings by journal and show the type of content that each journal may desire most. Our findings also allow the IS research community to discern the specific contributions and roles of our premier journals in the evolution of research themes over time.
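    As a rough illustration of what such a data-driven temporal theme analysis can look like, the sketch below applies latent semantic analysis (in the spirit of Sidorova et al. 2008) to article abstracts and traces average theme loadings by year. The corpus, the number of themes, and the scikit-learn toolchain are assumptions for illustration only, not the paper’s actual data or method.

    ```python
    # Minimal sketch of a data-driven temporal theme analysis, assuming an
    # LSA-style approach; the corpus and settings below are hypothetical.
    from collections import defaultdict

    from sklearn.feature_extraction.text import TfidfVectorizer
    from sklearn.decomposition import TruncatedSVD

    # Hypothetical corpus: one abstract per published article, with its year.
    abstracts = [
        "trust and adoption of e-commerce platforms",
        "knowledge management systems in organizations",
        "social media analytics and user engagement",
    ]
    years = [2001, 2009, 2017]

    # Term-document matrix weighted by TF-IDF.
    vectorizer = TfidfVectorizer(stop_words="english", max_features=5000)
    X = vectorizer.fit_transform(abstracts)

    # Latent semantic analysis: project documents onto a small set of themes.
    n_themes = 2  # illustrative; a real study would tune this
    lsa = TruncatedSVD(n_components=n_themes, random_state=0)
    doc_theme = lsa.fit_transform(X)  # shape: (n_documents, n_themes)

    # Trace each theme's prominence over time by averaging loadings per year.
    theme_by_year = defaultdict(lambda: [0.0] * n_themes)
    counts = defaultdict(int)
    for year, loadings in zip(years, doc_theme):
        counts[year] += 1
        for k, value in enumerate(loadings):
            theme_by_year[year][k] += value

    for year in sorted(theme_by_year):
        avg = [v / counts[year] for v in theme_by_year[year]]
        print(year, ["%.3f" % v for v in avg])
    ```

    With a real corpus, the per-year averages would show which themes endure and which peak and trough, which is the kind of trend the abstract describes.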

    An Examination of IS Conference Reviewing Practices

    There has been considerable interest over the years within the IS research community in how to shape articles for successful publication. Little effort has been made, however, to examine the reviewing criteria that make a difference to publication decisions. We argue that, to provide better guidance to authors, more solid evidence is needed on the factors that contribute to acceptance decisions. This paper empirically examines the outcomes of the reviewing processes of three well-known IS conferences held in 2007. Our analyses reveal four major findings. First, the evaluation criteria that influence the acceptance/rejection decision vary by conference. Second, those differences can be explained in terms of the maturity and breadth of the specific conference. Third, while objective review criteria influence acceptance/rejection decisions, subjective assessment on the part of the program committees may also play a substantial role. Fourth, while high scores on objective criteria are essential for acceptance, they do not guarantee acceptance; low scores on any criterion, however, are likely to result in rejection.
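    The abstract does not spell out the statistical approach, but one common way to examine how criterion scores relate to acceptance decisions is a logistic regression over reviewers’ scores. The sketch below uses entirely synthetic scores and decisions, shaped only to echo the abstract’s findings that high scores are necessary but not sufficient and that a low score on any criterion tends to lead to rejection; it does not reflect the paper’s actual data or analysis.

    ```python
    # Illustrative sketch (not the paper's analysis): relate reviewers'
    # criterion scores to acceptance decisions with a logistic regression.
    import numpy as np
    from sklearn.linear_model import LogisticRegression

    rng = np.random.default_rng(0)

    # Hypothetical review data: scores on 1-5 scales for four criteria.
    criteria = ["relevance", "rigor", "novelty", "presentation"]
    n_papers = 200
    scores = rng.integers(1, 6, size=(n_papers, len(criteria))).astype(float)

    # Synthetic decision rule echoing the abstract: high average scores are
    # necessary but not sufficient, a low score on any criterion tends to
    # sink a submission, and the program committee adds subjective judgment.
    high_average = scores.mean(axis=1) >= 3.5
    no_low_score = scores.min(axis=1) >= 2
    committee_ok = rng.random(n_papers) > 0.2
    accepted = (high_average & no_low_score & committee_ok).astype(int)

    # Fit the model and inspect which criteria carry the most weight.
    model = LogisticRegression().fit(scores, accepted)
    for name, coef in zip(criteria, model.coef_[0]):
        print(f"{name:>12}: {coef:+.2f}")
    ```

    On real conference data, comparing the fitted coefficients across venues would be one way to see the cross-conference differences in influential criteria that the paper reports.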