
    The Effects of the Quantification of Faculty Productivity: Perspectives from the Design Science Research Community

    In recent years, efforts to assess faculty research productivity have focused increasingly on the measurable quantification of academic outcomes. For benchmarking academic performance, researchers have developed various ranking and rating lists that define so-called high-quality research. While many scholars in information systems (IS) consider lists such as the Senior Scholars’ basket (SSB) to provide good guidance, others who belong to less-mainstream groups in the IS discipline may perceive these lists as constraining. Thus, we analyzed the perceived impact of the SSB on IS academics working in design science research (DSR) and, in particular, how it has affected their research behavior. We found that the DSR community felt a strong normative influence from the SSB. We also conducted a content analysis of the SSB and found evidence that some of its journals have become more accepting of DSR. We note the emergence of papers in the SSB that outline the role of theory in DSR and describe DSR methodologies, which indicates that the DSR community has rallied to explain to the broader IS community what to expect from a DSR manuscript and to guide the DSR community on how to organize papers for publication in the SSB.

    INFORMATION SYSTEMS AND THE NATURAL SELECTION OF BAD SCIENCE

    Recent studies of information systems suggest a coalescing around a limited set of methods and subject areas, particularly led by the dominance of technology adoption studies and research methods that orbit around the technology acceptance model (TAM). This is interpreted as evidence of a maturing discipline and the development of scientific foundations. I suggest that, far from this being the case, the dominance of particular methods and topics is resulting in disciplinary stagnation and is fuelling the increasing irrelevance of information systems studies to both practice and research innovation. Having illustrated this with reference to two recent information systems trends studies, and briefly critiqued the dominant information systems paradigm, I draw on a recent study of the evolution of the behavioural sciences using computer models. I suggest that the development of information systems is an example of bad science, constrained by social and economic forces. I offer some suggestions on how different environmental forces could be applied to reinvigorate information systems. However, I conclude by suggesting that, regardless of changing evolutionary forces, there is a deeper underlying philosophical concern which is catalysing the malaise of information systems.

    Citation Analyses in Information Systems

    Few scientists who specialize in information systems would recognize the name of one of the field’s most cited authors, Ike Antkare. It is not that Antkare is from an obscure discipline; the aberration is the result of a vulnerability in citation analyses, one that has been demonstrated with a computer program. Today, funding, promotion, and tenure extension depend on the results of these analyses. This paper explores the nature of citation analyses in the information systems (IS) field and classifies them based on an adapted framework from Zupic and Cater (2015). The results illustrate two types of citation analyses. The first type comprises ranking studies that use measures from the h-index family calculated on citation networks. The second type involves co-citation analysis, applying cluster or factor analysis to determine the intellectual structure, trajectory, or maturity of the field.
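
    To make the first type concrete, here is a minimal Python sketch (an illustration with hypothetical citation counts, not code from any of the papers listed here) of the h-index, the best-known member of the h-index family: an author has index h if h of their papers have received at least h citations each.

    def h_index(citation_counts):
        """Return the h-index for a list of per-paper citation counts."""
        ranked = sorted(citation_counts, reverse=True)  # most-cited papers first
        h = 0
        for rank, citations in enumerate(ranked, start=1):
            # h is the largest rank at which the paper in that position
            # still has at least `rank` citations.
            if citations >= rank:
                h = rank
            else:
                break
        return h

    # Hypothetical citation counts for one author's papers.
    print(h_index([25, 8, 5, 3, 3, 1, 0]))  # prints 3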

    Information Systems: To Be, or Not To Be, a Science? Is that the Question?

    In this commentary, we complement McBride’s (2018) paper by setting the debate in its historical context and building on the “rite of passage” notion that Chughtai and Myers (2017) introduced to denote the process of researchers entering a field of practice. We first summarize McBride’s (2018) main point concerning whether or not IS is a science and pick up on the systemic nature of IS. In doing so, we incorporate how researchers have historically treated the debate and distinguish science per se from the scientific method. We then turn to reflect on the point that this debate apparently refuses to die. We conclude with a forward-looking section in which we consider the implications of the topic not for the field as a whole but for individual IS researchers. We end with our own modest call for action in terms of focusing on the everyday practices of IS researchers, specifically the rites of passage or transitions (and lack of them) we (should?) go through in how we practice our research.

    What makes IS Implementation Successful? A Study on Implementation Effectiveness

    It has been noted that implementation climate is positively associated with implementation effectiveness. However, there is still no established recipe for the successful implementation of IS/IT systems. Specifically, it is unclear what a good implementation climate requires, what it should look like, and to what extent it affects the acceptance and success of a new IS/IT system. Despite the successes and opportunities for organizations that innovate with information systems (IS) and information technology (IT), there are also many failures of IS/IT implementations caused by both technical and non-technical problems. This study, based on the Klein-Sorra model of implementation effectiveness, shows that skills and innovation-values fit significantly influence intention to use in the context of our questionnaire-based survey: the implementation of a new document management system (DMS) at the Dutch Police. Survey data were collected from 41 end-users. For practitioners, this research offers practices to be considered during the implementation of a new system.

    EVOLUTION OF IS RESEARCH BASED ON LITERATURE PUBLISHED IN TWO LEADING IS JOURNALS - EJIS AND MISQ

    There is growing interest amongst IS academics and scholars in studying the evolution of IS research. Scholarly literature published in top-ranking IS journals provides a pertinent source for this exploratory study. However, the list of journals selected for such a study should ideally be representative of publication outlets from different regions of the world. Thus, in the research on IS evolution presented in this paper, the authors’ selection of the two leading IS journals – EJIS and MISQ – is motivated by their conscious attempt to chart the evolution of IS research in both European and North American contexts. Towards this end, the paper employs co-citation analysis to identify prominent articles, authors and journals referenced by the citing EJIS and MISQ authors; it also utilises extended citation data (e.g., keywords and article abstracts) to identify frequently occurring noun phrases in the citing articles. The contribution of this paper is the methodological study of the evolution of IS research based on a comparative co-citation analysis of journals. The limitation of the paper is its underlying dataset, which presently comprises only two journals.
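
    As a rough illustration of the co-citation step described above, the Python sketch below (a hypothetical example with invented identifiers, not the paper’s data or code) counts how often pairs of references appear together in the reference lists of citing articles; the most frequently co-cited pairs point to prominent works in the field.

    from collections import Counter
    from itertools import combinations

    # Hypothetical reference lists of citing articles (e.g., EJIS and MISQ papers),
    # each given as a set of cited-work identifiers such as DOIs or citation keys.
    citing_articles = [
        {"davis1989", "delone1992", "orlikowski1991"},
        {"davis1989", "delone1992", "venkatesh2003"},
        {"davis1989", "venkatesh2003"},
    ]

    co_citations = Counter()
    for references in citing_articles:
        # Every unordered pair of references in the same reference list
        # counts as one co-citation of that pair.
        for pair in combinations(sorted(references), 2):
            co_citations[pair] += 1

    # Keep pairs co-cited by two or more citing articles, ranked by frequency.
    for pair, count in co_citations.most_common():
        if count >= 2:
            print(pair, count)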

    A Methodology for Profiling Literature using Co-citation Analysis

    The contribution of this paper is a methodology for profiling literature in Information Systems (IS) using a powerful tool for co-citation analysis – CiteSpace. Co-citation analysis provides important insights into knowledge domains by identifying frequently co-cited papers, authors and journals. The methodology is applied to a dataset comprising citation data pertaining to a leading European journal – the European Journal of Information Systems (EJIS). In this paper we outline the different steps involved in using CiteSpace to profile literature in IS, using the EJIS dataset as an example. We hope that readers will employ and/or extend the given methodology to conduct similar bibliometric studies in IS and other research areas.
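
    The profiling step that tools such as CiteSpace automate can be approximated by grouping strongly co-cited works into clusters. The minimal Python sketch below (an assumption-based illustration, not the paper’s methodology or CiteSpace’s algorithm) thresholds a table of co-citation counts and extracts connected clusters with a simple union-find; the identifiers and the threshold value are invented placeholders.

    # Hypothetical co-citation counts between cited works (see the earlier sketch
    # for one way such counts can be produced from reference lists).
    co_citation_counts = {
        ("davis1989", "venkatesh2003"): 12,
        ("delone1992", "seddon1997"): 9,
        ("davis1989", "delone1992"): 1,  # weak link, dropped by the threshold
    }

    THRESHOLD = 5  # keep only pairs co-cited at least this many times

    parent = {}

    def find(work):
        # Walk up to the cluster root, compressing the path as we go.
        parent.setdefault(work, work)
        while parent[work] != work:
            parent[work] = parent[parent[work]]
            work = parent[work]
        return work

    def union(a, b):
        root_a, root_b = find(a), find(b)
        if root_a != root_b:
            parent[root_a] = root_b

    for (a, b), count in co_citation_counts.items():
        if count >= THRESHOLD:
            union(a, b)

    # Group works by their cluster root; each group is a candidate research theme.
    clusters = {}
    for work in parent:
        clusters.setdefault(find(work), set()).add(work)
    print(list(clusters.values()))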

    Exploring the modelling and simulation knowledge base through journal co-citation analysis

    The final publication is available at Springer via http://dx.doi.org/10.1007/s11192-013-1136-z
    Co-citation analysis is a form of content analysis that can be applied in the context of scholarly publications with the purpose of identifying prominent articles, authors and journals referenced by the citing authors. It identifies co-cited references that occur in the reference lists of two or more citing articles, with the resultant co-citation network providing insights into the constituents of a knowledge domain (e.g., significant authors and papers). The contribution of the paper is twofold: (a) it demonstrates the added value of co-citation analysis, using as its dataset the peer-reviewed journal of the Society for Modeling and Simulation International (SCS), SIMULATION; and (b) with 2012 marking the 60th anniversary of the SCS, the authors hope that this paper will lead to further acknowledgement and appreciation of the Society’s role in charting the growth of Modeling and Simulation (M&S) as a discipline.
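
    Journal co-citation analysis, the variant explored in this paper, follows the same counting principle as the article-level sketch shown earlier, but it aggregates each cited reference to the journal that published it before counting pairs. The short Python sketch below is a hypothetical illustration (invented reference lists, not the paper’s data or code).

    from collections import Counter
    from itertools import combinations

    # Hypothetical reference lists of citing articles, with each cited work
    # already mapped to the journal that published it.
    cited_journals_per_article = [
        ["MIS Quarterly", "EJIS", "Management Science", "MIS Quarterly"],
        ["MIS Quarterly", "Management Science"],
        ["EJIS", "MIS Quarterly", "SIMULATION"],
    ]

    journal_co_citations = Counter()
    for journals in cited_journals_per_article:
        # Deduplicate within one reference list so a journal cited several
        # times by the same article is counted once per pairing.
        for pair in combinations(sorted(set(journals)), 2):
            journal_co_citations[pair] += 1

    # The most frequently co-cited journal pairs sketch the knowledge domain.
    for pair, count in journal_co_citations.most_common(5):
        print(pair, count)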