    Four guiding principles for research on evolved information processing traits and technology-mediated task performance

    Evolved information processing traits are defined as mental traits that our species has evolved in response to evolutionary pressures and that are associated with the processing of information. Evolutionary psychologists and human evolution researchers have long realized that theorizing about evolved mental traits is very difficult to do in ways that lead to valid, testable predictions. Quite often that theorizing leads to what are known as Panglossian (or naïve) explanations, which may at first glance appear to be valid evolutionary explanations of observable traits, but end up proving to be wrong and misleading. We propose four meta-theoretical principles to guide future research on evolved information processing traits and their effects on technology-mediated task performance, and to help researchers avoid Panglossian explanations. We argue that this type of research holds the promise of bringing fresh insights into the study of human behavior toward information and communication technologies, and thus of helping advance the field of information systems along a promising path that has rarely been taken before. We derive the four principles from mathematical formulations developed based on two of the most fundamental conceptual tools employed in population genetics and mathematical modeling of evolutionary processes: Fisher's Fundamental Theorem of Natural Selection and the Price Equation. We provide an illustration of the application of the principles through an empirical study of a technology-mediated learning task. The analysis was conducted using WarpPLS 1.0. The study provides support for a puzzling phenomenon, known as flashbulb memorization, in the context of web-mediated learning.
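    For reference (this rendering is not reproduced from the paper itself), the Price Equation named in the abstract is standardly written as a decomposition of the change in a population's mean trait value into a selection term and a transmission term:

    ```latex
    \bar{w}\,\Delta\bar{z} \;=\; \operatorname{Cov}(w_i, z_i) \;+\; \mathrm{E}\!\left(w_i\,\Delta z_i\right)
    ```

    Here \(z_i\) is the trait value of type \(i\), \(w_i\) its fitness, \(\bar{w}\) mean fitness, and \(\bar{z}\) the mean trait value; the covariance term captures natural selection and the expectation term captures transmission bias.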

    Mapping Design Contributions in Information Systems Research: The Design Research Activity Framework

    Despite growing interest in design science research in information systems, our understanding of what constitutes a design contribution and the range of research activities that can produce design contributions remains limited. We propose the design research activity (DRA) framework for classifying design contributions based on the type of statements researchers use to express knowledge contributions and the researcher's role with respect to the artifact. These dimensions combine to produce a DRA framework that contains four quadrants: construction, manipulation, deployment, and elucidation. We use the framework in two ways. First, we classify design contributions that the Journal of the Association for Information Systems (JAIS) published from 2007 to 2019 and show that the journal published a broad range of design research across all four quadrants. Second, we show how one can use our framework to analyze the maturity of design-oriented knowledge in a specific field as reflected in the degree of activity across the different quadrants. The DRA framework contributes by showing that design research encompasses both design science research and design-oriented behavioral research. The framework can help authors and reviewers assess research with design implications and help researchers position and understand design research as a journey through the four quadrants.

    PLS-Based Model Selection: The Role of Alternative Explanations in Information Systems Research

    Exploring theoretically plausible alternative models for explaining the phenomenon under study is a crucial step in advancing scientific knowledge. This paper advocates model selection in information systems (IS) studies that use partial least squares path modeling (PLS) and suggests the use of model selection criteria derived from information theory for this purpose. These criteria allow researchers to compare alternative models and select a parsimonious yet well-fitting model. However, as our review of prior IS research practice shows, their use—while common in the econometrics field and in factor-based SEM—has not found its way into studies using PLS. Using a Monte Carlo study, we compare the performance of several model selection criteria in selecting the best model from a set of competing models under different model setups and various conditions of sample size, effect size, and loading patterns. Our results suggest that appropriate model selection cannot be achieved by relying on the PLS criteria (i.e., R2, Adjusted R2, GoF, and Q2), as is the current practice in academic research. Instead, model selection criteria—in particular, the Bayesian information criterion (BIC) and the Geweke-Meese criterion (GM)—should be used due to their high model selection accuracy and ease of use. To support researchers in the adoption of these criteria, we introduce a five-step procedure that delineates the roles of model selection and statistical inference, and we discuss misconceptions that may arise in their use.
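    As a minimal illustration of information-theoretic model selection (a sketch using ordinary least squares rather than the authors' PLS procedure; the data, variable names, and helper function below are invented for the example), the BIC for two competing regression models can be compared, with the lower value indicating the preferred, more parsimonious model:

    ```python
    import numpy as np

    def bic_linear(y, X):
        """BIC for an OLS fit: n * ln(RSS / n) + k * ln(n),
        where k counts the estimated coefficients (including the
        intercept added below)."""
        n = len(y)
        Xd = np.column_stack([np.ones(n), X])       # add intercept column
        beta, *_ = np.linalg.lstsq(Xd, y, rcond=None)
        rss = np.sum((y - Xd @ beta) ** 2)          # residual sum of squares
        k = Xd.shape[1]
        return n * np.log(rss / n) + k * np.log(n)

    # Simulated data: y depends on x1 only; x2 is an irrelevant predictor.
    rng = np.random.default_rng(42)
    n = 200
    x1 = rng.normal(size=n)
    x2 = rng.normal(size=n)
    y = 2.0 + 1.5 * x1 + rng.normal(scale=0.5, size=n)

    bic_true = bic_linear(y, np.column_stack([x1]))        # correct model
    bic_over = bic_linear(y, np.column_stack([x1, x2]))    # overfitted model

    # BIC's ln(n) complexity penalty should favor the parsimonious model.
    ```

    The extra parameter in the overfitted model buys only a chance reduction in residual error, which the ln(n) penalty outweighs; this is the trade-off between fit and parsimony that the abstract's criteria formalize.
    
    
    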