    THE ROLE OF SIMULATION IN SUPPORTING LONGER-TERM LEARNING AND MENTORING WITH TECHNOLOGY

    Mentoring is an important part of professional development and longer-term learning. The nature of longer-term mentoring contexts means that designing, developing, and testing adaptive learning systems for use in this kind of context would be very costly, as it would require substantial amounts of financial, human, and time resources. Simulation is a cheaper and quicker approach for evaluating the impact of various design and development decisions. Within the Artificial Intelligence in Education (AIED) research community, however, surprisingly little attention has been paid to how to design, develop, and use simulations in longer-term learning contexts. The central challenge is that adaptive learning system designers and educational practitioners have limited guidance on what steps to consider when designing simulations for supporting longer-term mentoring system design and development decisions. My research takes as a starting point VanLehn et al.’s [1] introduction to applications of simulated students and Erickson et al.’s [2] suggested approach to creating simulated learning environments. My dissertation presents four research directions, using a real-world longer-term mentoring context, a doctoral program, for illustrative purposes. The first direction outlines a framework for guiding system designers as to what factors to consider when building pedagogical simulations, fundamentally to answer the question: how can a system designer capture a representation of a target learning context in a pedagogical simulation model? To illustrate the feasibility of this framework, this dissertation describes how to build the SimDoc model, a pedagogical model of a longer-term mentoring learning environment: a doctoral program. The second direction builds on the first and considers the issue of model fidelity, essentially to answer the question: how can a system designer determine a simulation model’s fidelity to the desired granularity level? This dissertation shows how data from a target learning environment, the research literature, and common sense are combined to achieve SimDoc’s medium-fidelity model. The third research direction explores calibration and validation issues to answer the question: how many simulation runs does it take for a practitioner to have confidence in the simulation model’s output? This dissertation describes the steps taken to calibrate and validate the SimDoc model so that its output statistically matches data from the target doctoral program, the one at the University of Saskatchewan. The fourth direction is to demonstrate the applicability of the resulting pedagogical model. This dissertation presents two experiments using SimDoc to illustrate how to explore pedagogical questions concerning personalization strategies and to determine the effectiveness of different mentoring strategies in a target learning context. Overall, this dissertation shows that simulation is an important tool in the AIED system designer’s toolkit as AIED moves towards designing, building, and evaluating systems meant to support learners in longer-term learning and mentoring contexts. Simulation allows a system designer to experiment with various design and implementation decisions in a cost-effective and timely manner before committing to them in the real world.
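
    The third direction's run-count question can be made concrete with a stopping rule. Below is a minimal sketch, not the dissertation's actual procedure: replications are added until the 95% confidence interval around the output statistic is tight enough to trust. The function run_simulation and the tolerance value are hypothetical stand-ins for a single SimDoc run and the practitioner's precision requirement.

        # Sketch of a confidence-interval stopping rule for choosing the
        # number of simulation runs. run_simulation() is a hypothetical
        # stand-in for one SimDoc run; all numbers are illustrative.
        import random
        import statistics

        def run_simulation() -> float:
            # Placeholder: one simulated outcome, e.g. mean years to degree.
            return random.gauss(5.5, 1.2)

        def runs_needed(tolerance: float = 0.1, z: float = 1.96,
                        min_runs: int = 10, max_runs: int = 10_000):
            outputs = [run_simulation() for _ in range(min_runs)]
            while len(outputs) < max_runs:
                # Half-width of the z-based confidence interval of the mean.
                half_width = z * statistics.stdev(outputs) / len(outputs) ** 0.5
                if half_width < tolerance:
                    break
                outputs.append(run_simulation())
            return len(outputs), statistics.fmean(outputs)

        n, estimate = runs_needed()
        print(f"{n} runs give a 95% CI half-width under 0.1 (mean ~ {estimate:.2f})")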

    Techno-economic Evaluation Methodology and Preliminary Comparison of an Amine-based and Advanced Solid Sorbent-based CO2 Capture Process for NGCC Power Plants

    The post-combustion capture process using the traditional amine-based solvent absorption process is a very mature technology, but it imposes a high energy penalty on the power plant and requires significant capital investment, which sharply increases the cost of electricity. An advanced solid sorbent-based adsorption process is discussed in this work, along with a techno-economic evaluation methodology for comparing the advantages of this novel process to the conventional process. Some indications of the expected technical and economic benefits of the process are also discussed.
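
    Such comparisons typically hinge on two standard metrics: the levelized cost of electricity (LCOE) and the cost of CO2 avoided. The sketch below illustrates only the arithmetic behind those metrics; every figure is an illustrative placeholder, not a value from this evaluation.

        # Sketch of LCOE and cost-of-CO2-avoided arithmetic. All plant
        # figures below are illustrative placeholders, not paper values.

        def crf(rate: float, years: int) -> float:
            """Capital recovery factor: annualizes an upfront investment."""
            return rate * (1 + rate) ** years / ((1 + rate) ** years - 1)

        def lcoe(capex, fixed_om, variable_om, mwh_per_year, rate=0.08, years=25):
            """$/MWh: annualized capital plus fixed O&M per MWh, plus
            variable_om (fuel and variable O&M, already in $/MWh)."""
            annual_capex = capex * crf(rate, years)
            return (annual_capex + fixed_om) / mwh_per_year + variable_om

        # Hypothetical NGCC plant without and with capture (capture costs
        # more and generates less because of the energy penalty).
        base = lcoe(capex=900e6, fixed_om=25e6, variable_om=30.0, mwh_per_year=4.2e6)
        capture = lcoe(capex=1500e6, fixed_om=40e6, variable_om=38.0, mwh_per_year=3.6e6)

        # Cost of CO2 avoided: extra cost per tonne no longer emitted.
        emissions_base, emissions_capture = 0.37, 0.04   # t CO2 / MWh, illustrative
        cost_avoided = (capture - base) / (emissions_base - emissions_capture)
        print(f"LCOE: {base:.1f} vs {capture:.1f} $/MWh; "
              f"CO2 avoided cost: {cost_avoided:.1f} $/t")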

    A Cross-Cohort Changepoint Model for Customer-Base Analysis

    We introduce a new methodology that can capture and explain differences across a series of cohorts of new customers in a repeat-transaction setting. More specifically, this new framework, which we call a vector changepoint model, exploits the underlying regime structure in a sequence of acquired customer cohorts to make predictive statements about new cohorts for which the firm has little or no longitudinal transaction data. To accomplish this, we develop our model within a hierarchical Bayesian framework to uncover evidence of (latent) regime changes for each cohort-level parameter separately, while disentangling cross-cohort changes from calendar-time changes. Calibrating the model using multicohort donation data from a nonprofit organization, we find that holdout predictions for new cohorts using this model have greater accuracy, and greater diagnostic value, compared to a variety of strong benchmarks. Our modeling approach also highlights the perils of pooling data across cohorts without accounting for cross-cohort shifts, thus enabling managers to quantify their uncertainty about potential regime changes and avoid “old data” aggregation bias.
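
    As a toy illustration of the regime idea, the sketch below simulates a single cohort-level parameter that shifts between two regimes and recovers the changepoint by least squares. This is only a one-parameter caricature; the paper's vector changepoint model is hierarchical Bayesian and treats each parameter separately, and all numbers here are hypothetical.

        # Toy single-changepoint detector over cohort-level parameters:
        # find the split minimizing within-regime squared error. The
        # cohort values below are simulated, not data from the paper.
        import random

        # Hypothetical repeat-purchase propensity for 12 acquisition
        # cohorts, with a regime change after cohort 8.
        random.seed(0)
        theta = [random.gauss(0.30, 0.02) for _ in range(8)] + \
                [random.gauss(0.18, 0.02) for _ in range(4)]

        def sse(xs):
            m = sum(xs) / len(xs)
            return sum((x - m) ** 2 for x in xs)

        def best_changepoint(xs):
            # Try every split; keep the lowest total within-regime SSE.
            return min(range(1, len(xs)), key=lambda k: sse(xs[:k]) + sse(xs[k:]))

        print(f"Estimated regime change after cohort {best_changepoint(theta)}")
        # Expect 8, matching the simulated shift.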

    Aerospace Medicine and Biology: A continuing bibliography with indexes, supplement 140

    This bibliography lists 306 reports, articles, and other documents introduced into the NASA scientific and technical information system in March 1975.

    A survey of statistical network models

    Networks are ubiquitous in science and have become a focal point for discussion in everyday life. Formal statistical models for the analysis of network data have emerged as a major topic of interest in diverse areas of study, and most of these involve a form of graphical representation. Probability models on graphs date back to 1959. Along with empirical studies in social psychology and sociology from the 1960s, these early works generated an active network community and a substantial literature in the 1970s. This effort moved into the statistical literature in the late 1970s and 1980s, and the past decade has seen a burgeoning network literature in statistical physics and computer science. The growth of the World Wide Web, the emergence of online networking communities such as Facebook, MySpace, and LinkedIn, and a host of more specialized professional network communities have intensified interest in the study of networks and network data. Our goal in this review is to provide the reader with an entry point to this burgeoning literature. We begin with an overview of the historical development of statistical network modeling and then we introduce a number of examples that have been studied in the network literature. Our subsequent discussion focuses on a number of prominent static and dynamic network models and their interconnections. We emphasize formal model descriptions, and pay special attention to the interpretation of parameters and their estimation. We end with a description of some open problems and challenges for machine learning and statistics.
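
    The 1959 starting point presumably refers to the classical random graphs of the Erdős-Rényi and Gilbert tradition, in which each possible edge appears independently with some probability p. A minimal sketch of that G(n, p) construction, with illustrative parameter values:

        # Sketch of the G(n, p) random graph: each of the n-choose-2
        # possible edges appears independently with probability p.
        import random
        from itertools import combinations

        def gnp_random_graph(n: int, p: float, seed: int = 0):
            rng = random.Random(seed)
            return [(i, j) for i, j in combinations(range(n), 2)
                    if rng.random() < p]

        edges = gnp_random_graph(n=100, p=0.05)
        mean_degree = 2 * len(edges) / 100
        # Mean degree concentrates around (n - 1) * p, here about 5.
        print(f"{len(edges)} edges, mean degree ~ {mean_degree:.1f}")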

    Experimental Methods and the Welfare Evaluation of Policy Lotteries

    Policies impose lotteries of outcomes on individuals, since we never know exactly what the effects of the policy will be. In order to evaluate alternative policies, we therefore need to make some assumptions about individual preferences, even before social welfare functions are applied. Instead of making a priori assumptions about those preferences that are likely to be wrong, there are two broad ways in which experimental methods are used to evaluate policy. One is to use experiments to estimate individual preferences, valuations and beliefs, and use those estimates as priors in the evaluation of policy. The other approach is to undertake deliberate randomization, or exploit accidental or natural randomization, to infer the effects of policy. The strengths and weaknesses of these approaches are reviewed, and their complementarities identified.
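
    To make the first approach concrete: once risk preferences have been estimated experimentally, a policy's outcome lottery can be scored by its certainty equivalent under those preferences. The sketch below assumes a CRRA utility; the risk-aversion values and the lottery itself are hypothetical placeholders, not estimates from any study.

        # Sketch of scoring a policy lottery by its certainty equivalent
        # under CRRA utility. The lottery and the risk-aversion values r
        # are illustrative; in practice r would be estimated from
        # experiments and used as a prior in the policy evaluation.
        import math

        def crra(x: float, r: float) -> float:
            """Constant relative risk aversion utility of income x."""
            return math.log(x) if r == 1 else x ** (1 - r) / (1 - r)

        def certainty_equivalent(outcomes, probs, r):
            """Sure income with the same expected utility as the lottery."""
            eu = sum(p * crra(x, r) for x, p in zip(outcomes, probs))
            return math.exp(eu) if r == 1 else (eu * (1 - r)) ** (1 / (1 - r))

        # Hypothetical income outcomes of a policy and their probabilities.
        outcomes, probs = [20_000, 35_000, 60_000], [0.3, 0.5, 0.2]
        for r in (0.0, 0.5, 0.8):   # risk neutrality to moderate aversion
            ce = certainty_equivalent(outcomes, probs, r)
            print(f"r={r}: certainty equivalent = {ce:,.0f}")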

    A Computational model and empirical study of the self-undermining proposition in job demands-resources theory

    The current conceptual model in job demands-resources (JD-R) theory contains eight propositions to explain the dual processes through which job demands and resources influence individuals' strain, motivation, and job performance. Although the theory is generally well-supported and widely used in the industrial-organizational (I-O) and occupational health (OHP) psychology literature, more research is needed to validate its two most recent propositions: that motivation and strain can lead to increases in job resources and demands through job crafting and self-undermining behaviors, respectively. The goal of this study was to test the dynamic variable relationships in the self-undermining proposition through two research methods in an academic context. First, I developed and tested a computational model of the self-undermining proposition based on JD-R theory and other psychological theories and research. Second, I collected longitudinal data from undergraduate students at two U.S. universities and analyzed the data through cross-lagged panel analyses and repeated-measures multivariate analyses of variance. The results of the two methods were contradictory. Specifically, the specifications and theoretical assumptions of the computational model produced simulations of a perpetual loss spiral via a positive feedback loop, whereas statistical analyses of the longitudinal data did not identify or support the self-undermining proposition. Overall, the results did not support the self-undermining proposition and were influenced by several methodological limitations of this study, but these limitations and results exemplified several broader limitations of JD-R theory and suggested that the theory is not viable in its current form and is in need of respecification.
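
    The loss spiral the computational model produced can be conveyed with a minimal discrete-time sketch: strain drives self-undermining behavior, which adds job demands, which feed back into strain. The coefficients below are illustrative placeholders, not the dissertation's model specification or parameter values.

        # Minimal discrete-time sketch of a self-undermining loss spiral.
        # Coefficients are illustrative placeholders only.

        def simulate_loss_spiral(steps=10, demands=1.0,
                                 a=0.8,   # demands -> strain
                                 b=0.5,   # strain -> self-undermining
                                 c=0.4):  # self-undermining -> added demands
            history = []
            for t in range(steps):
                strain = a * demands
                self_undermining = b * strain
                demands += c * self_undermining   # positive feedback loop
                history.append((t, strain, demands))
            return history

        for t, strain, demands in simulate_loss_spiral():
            print(f"t={t}: strain={strain:.2f}, demands={demands:.2f}")
        # With a * b * c > 0 the loop compounds every step, so strain and
        # demands grow without bound unless an offsetting resource or
        # recovery term is added, which is the spiral the model exhibited.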