
    Using latent semantic analysis to detect non-cognitive variables of academic performance

    This thesis explores the possibilities of using latent semantic analysis to detect evidence of intrapersonal personality variables in post-secondary student essays. Determining student achievement from non-cognitive variables is a complex process. Automated essay scoring tools are already in use today for grading and evaluating student texts on cognitive-domain traits, but they are not yet used to analyze non-cognitive domains such as personality. Could such tools be configured to detect non-cognitive variables in student essays? Key concepts in this proposal (personality traits, latent semantic analysis, automated essay evaluation, and online cinema reviews) are explored, followed by a literature review to justify the research. As a proof-of-concept study, 43 writing samples written in response to a constructed-response task are collected and analyzed by a test model specifically designed to evaluate sentiment in a movie-review constructed-response format. The test model, created with LightSIDE, a software tool for text assessment, predicts the sentiment of these essays with highly encouraging results. The thesis concludes with a path for future research in the largely unexplored area of automated assessment of non-cognitive variables.
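The pipeline this abstract describes, latent semantic analysis feeding a sentiment classifier, can be sketched roughly as follows. This is a minimal illustration, not the thesis's actual LightSIDE model: scikit-learn stands in for LightSIDE, and the tiny corpus and labels are invented.

```python
# Sketch of an LSA-based sentiment pipeline: TF-IDF -> truncated SVD
# (the "latent semantic" step) -> logistic regression classifier.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.decomposition import TruncatedSVD
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import make_pipeline

# Invented toy "movie review" responses with sentiment labels (1 = positive).
reviews = [
    "a moving, brilliant film with superb acting",
    "dull plot and wooden performances, a waste of time",
    "an absolute delight from start to finish",
    "tedious, predictable, and far too long",
]
labels = [1, 0, 1, 0]

model = make_pipeline(
    TfidfVectorizer(),
    TruncatedSVD(n_components=2),  # project terms into 2 latent dimensions
    LogisticRegression(),
)
model.fit(reviews, labels)
print(model.predict(["a brilliant and moving delight"])[0])
```

A real study would of course train on far more text and tune the number of latent dimensions; the point here is only the shape of the pipeline.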

    A descriptive case study: Investigating the implementation of web based, automated grading and tutorial software in a freshman computer literacy course

    Students in higher education require satisfactory computer skills to be successful. While today’s students may have greater exposure to technology, research shows that their actual computer knowledge and skills are superficial and narrow. As a result, the freshman computer literacy course remains an important curricular component. This study investigates the implementation of an innovative Web-based technology for delivering software proficiency training for Microsoft Office. Building upon decades of end-user computing satisfaction and technology acceptance research, the purpose of the study is to describe the instructor and student experiences that result from the implementation and use of MyITLab educational software. The nature of the study is descriptive, rather than evaluative, with the following goals: (a) to describe instructors’ experiences with the software, (b) to identify patterns of technology usage and utility, and (c) to elucidate levels of computing satisfaction and technology acceptance among users. The study applies a mixed-method, single-unit, embedded case study design to focus the inquiry on an introductory computer applications course, offered in the Fall 2011 semester at a college in western Canada. The embedded units consist of five instructors, with 322 students enrolled across 10 sections. Data were analyzed from course documents, classroom observations, instructor interviews, and a student survey that produced 149 satisfactory responses. The survey was constructed by adapting instruments based on the Wixom and Todd (2005) integrated research model and the Unified Theory of Acceptance and Use of Technology (UTAUT) model. 
Results of the study are summarized into five assertions: 1) MyITLab effectively eliminates, or at least reduces, instructor grading workloads for assignments; 2) MyITLab provides students with frequent corrective feedback on assignments; 3) the step-by-step presentation of instructions in MyITLab may not, on its own, meet the needs of solution-based learning outcomes; 4) instructors should be trained on MyITLab to maximize the software's utility; and 5) the MyITLab solution bank of acceptable responses should be expanded to reduce potential grading inaccuracies. An enhanced Wixom and Todd (2005) model is also presented for future research on educational software. Lastly, readers are encouraged to reconsider the information presented and to generalize it for their own purposes.

    Psychometrics in Practice at RCEC

    A broad range of topics is dealt with in this volume: from combining the psychometric generalizability and item response theories to ideas for an integrated formative use of data-driven decision making, assessment for learning, and diagnostic testing. A number of chapters pay attention to computerized (adaptive) and classification testing. Other chapters treat the quality of testing in a general sense, while for topics like maintaining standards or the testing of writing ability, the quality of testing is dealt with more specifically. All authors are connected to RCEC as researchers. They each present one of their current research topics and provide some insight into the focus of RCEC. The topics were selected and edited so that the book should be of special interest to educational researchers, psychometricians, and practitioners in educational assessment.

    A Foundation For Educational Research at Scale: Evolution and Application

    The complexities of how people learn have plagued researchers for centuries. A range of experimental and non-experimental methodologies have been used to isolate and implement positive interventions for students' cognitive, meta-cognitive, behavioral, and socio-emotional successes in learning. But the face of learning is changing in the digital age. The value of accrued knowledge, popular throughout the industrial age, is being overpowered by the value of curiosity and the ability to ask critical questions. Most students can access the largest free collection of human knowledge (and cat videos) with ease using their phones or laptops and omnipresent cellular and Wi-Fi networks. Viewing this new-age capacity for connection as an opportunity, educational stakeholders have delegated many traditional learning tasks to online environments. With this influx of online learning, student errors can be corrected with immediacy, student data is more prevalent and actionable, and teachers can intervene with efficiency and efficacy. As such, endeavors in educational data mining, learning analytics, and authentic educational research at scale have grown popular in recent years: fields afforded by the luxuries of technology and driven by the age-old goal of understanding how people learn. This dissertation explores the evolution and application of ASSISTments Research, an approach to authentic educational research at scale that leverages ASSISTments, a popular online learning platform, to better understand how people learn. Part I details the evolution and advocacy of two tools that form the research arm of ASSISTments: the ASSISTments TestBed and the Assessment of Learning Infrastructure (ALI). An NSF-funded Data Infrastructure Building Blocks grant (#1724889; $494,644, 2017-2020) outlines goals for the new age of ASSISTments Research as a result of lessons learned in recent years.
Part II details a personal application of these research tools, with a focus on the framework of Self-Determination Theory. The primary facets of this theory, thought to positively affect learning and intrinsic motivation, are investigated in depth through randomized controlled trials targeting Autonomy, Belonging, and Competence. Finally, a synthesis chapter highlights important connections between Parts I & II, offering lessons learned regarding ASSISTments Research, suggesting additional guidance for its future development, and broadly defining contributions to the Learning Sciences community.

    Innovation in Pedagogy and Technology Symposium: University of Nebraska, May 8, 2018

    Selected Conference Proceedings, presented by University of Nebraska Online and University of Nebraska Information Technology Services. University of Nebraska Information Technology Services (NU ITS) and University of Nebraska Online (NU Online) present an education and technology symposium each spring. The Innovation in Pedagogy and Technology Symposium provides University of Nebraska (NU) faculty and staff the opportunity to learn from nationally recognized experts, share their experiences, and learn from the initiatives of colleagues from across the system. The event is offered to NU administrators, faculty, and staff free of charge. Tuesday, May 8, 2018, The Cornhusker Marriott, Lincoln, NE. Technology has forever changed the landscape of higher education and continues to do so, often at a rapid pace. At the University of Nebraska, we strive to embrace technology to enhance both teaching and learning, to provide key support systems, and to meet institutional goals. The Innovation in Pedagogy and Technology Symposium is designed for any NU administrator, faculty, or staff member who is involved in the use of technology in education at any level. Past events have drawn over 500 NU faculty, staff, and IT professionals from across the four campuses for a day of discovery and networking. The 2018 event was held in downtown Lincoln.
The schedule included:
• Presentations by University of Nebraska faculty, staff and administrators
• Concurrent sessions focused on pedagogy/instructional design, support and administrative strategies, and emerging technologies
• Panel discussions
• Roundtable discussions and networking time
• Sponsor exhibits
• Continental breakfast and lunch

Keynote Presentation: Learning How to Learn: Powerful Mental Tools to Help You Master Tough Subjects • Barbara Oakley, Ph.D., Oakland University
Fostering Quality by Identifying & Evaluating Effective Practices through Rigorous Research • Tanya Joosten, University of Wisconsin-Milwaukee
Synchronous Online & In Person Classrooms: Challenges & Rewards Five Years Into Practice • Elsbeth Magilton
We Nudge and You Can Too: Improving Outcomes with an Emailed Nudge • Ben Smith
It Takes a System to Build an Affordable Content Program • Brad Severa, Jane Petersen, Kimberly Carlson, Betty Jacques, Brian Moore, Andrew Cano, Michael Jolley
Five Generations: Preparing Multiple Generations of Learners for a Multi-Generational Workforce • Olimpia Leite-Trambly, Sharon Obasi, Toni Hill
Schedule NU! Schedule SC! • Cheri Polenske, Jean Padrnos, Corrie Svehla
See It & Believe It (Assessing Professional Behaviors & Clinical Reasoning with Video Assignments) • Grace Johnson, Megan Frazee
Group Portfolios as a Gateway to Creativity, Collaboration & Synergy in an Environment Course • Katherine Nashleanas
Learning to Learn Online: Helping Online Students Navigate Online Learning • Suzanne Withem
Beyond Closed Captioning: The Other ADA Accessibility Requirements • Analisa McMillan, Peggy Moore (UNMC)
Using Interactive Digital Wall (iWall) Technology to Promote Active Learning • Cheryl Thompson, Suhasini Kotcherlakota, Patrick Rejda, Paul Dye
Cybersecurity Threats & Challenges • JR Noble
Digital Badges: A Focus on Skill Acquisition • Benjamin Malczyk
Creating a Student Success Center: Transitioning Graduate Students to an Online Community • Brian Wilson, Christina Yao, Erica DeFrain, Andrew Cano
Male Allies: Supporting an Inclusive Environment in ITS • Heath Tuttle, Wes Juranek
Featured Extended Presentation: Broaden Your Passion! Encouraging Women in STEM • Barbara Oakley, Oakland University in Rochester, Michigan
Students as Creative Forces to Enhance Curriculum via E-Learning • Betsy Becker, Peggy Moore, Dele Davies
Rethinking Visual Communication Curriculum: The Success of Emporium Style • Adam Wagler (UNL), Katie Krcmarik, Alan Eno
A Course Delivery Evolution: Moving from Lecture to Online to a Flipped Classroom • Kim Michael, Tanya Custer
Enhancing the Quality of Online Teaching via Collaborative Course Development • B. Jean Mandernach, Steve McGahan
Collaborating Across NU for Accessible Video • Heath Tuttle, Jane Petersen, Jaci Lindburg
Structuring Security for Success • Matt Morton, Rick Haugerud
Future Directions for University of Nebraska Wireless Networking • Brian Cox, Jay Wilmes
Using Learning Analytics in Canvas to Improve Online Learning • Martonia Gaskill, Phu Vu
Broaden Your Passion! Encouraging Women in STEM • Featured Speaker: Barbara Oakley, Oakland University in Rochester, MI
Translating Studio Courses Online • Claire Amy Schultz
Hidden Treasures: Lesser Known Secrets of Canvas • Julie Gregg, Melissa Diers, Analisa McMillan
Your Learners, Their Devices & You: Incorporating BYOD Technology into Your Didactics • Tedd Welniak
Extending the Conversation about Teaching with Technology • Marlina Davidson, Timi Barone, Dana Richter-Egger, Schuetzler, Jaci Lindburg
Scaling up Student Assessment: Issues and Solutions • Paul van Vliet
Closing Keynote: Navigating Change: It's a Whitewater Adventure • Marjorie J. Kostelnik, Professor and Senior Associate to the President
doi: 10.13014/K2Q23XFD

    Technology and Testing

    From early answer sheets filled in with number 2 pencils, to tests administered by mainframe computers, to assessments wholly constructed by computers, it is clear that technology is changing the field of educational and psychological measurement. The numerous and rapid advances have immediate impact on test creators, assessment professionals, and those who implement and analyze assessments. This comprehensive new volume brings together leading experts on the issues posed by technological applications in testing, with chapters on game-based assessment, testing with simulations, video assessment, computerized test development, large-scale test delivery, model choice, validity, and error issues. Including an overview of existing literature and ground-breaking research, each chapter considers the technological, practical, and ethical considerations of this rapidly changing area. Ideal for researchers and professionals in testing and assessment, Technology and Testing provides a critical and in-depth look at one of the most pressing topics in educational testing today.

    Student Modeling From Different Aspects

    With the wide usage of online tutoring systems, researchers have become interested in mining the data logged by these systems so as to gain a better understanding of students. A variety of aspects of students' learning have become the focus of studies, such as modeling students' mastery status and affect. Meanwhile, the Randomized Controlled Trial (RCT), an unbiased method for gaining insights into education, has found its way into Intelligent Tutoring Systems. Firstly, people are curious about which settings would work better. Secondly, such a tutoring system, with many students and teachers using it, provides an opportunity for building an RCT infrastructure underlying the system. Given the increasing interest in data mining and RCTs, this thesis focuses on these two aspects. In the first part, we focus on analyzing and mining data from ASSISTments, an online tutoring system run by a team at Worcester Polytechnic Institute. Through the data, we try to answer several questions about different aspects of students' learning. The first question is what matters more to student modeling: skill information or student information. The second question is whether it is necessary to model students' learning at different opportunity counts. The third question is about the benefits of using partial credit, rather than binary credit, as a measurement of students' learning in RCTs. The fourth question focuses on the amount of time students spend wheel spinning in the tutoring system. The fifth question studies the tradeoff between the mastery threshold and the time spent in the tutoring system. By answering these five questions, we both propose machine learning methodology that can be applied in educational data mining and present findings from analyzing and mining the data. In the second part, we focus on RCTs within ASSISTments. Firstly, we look at a pilot study of reassessment and relearning, which suggested a better system setting to improve students' robust learning. Secondly, we propose the idea of building an infrastructure for learning within ASSISTments, which provides opportunities to improve the whole educational environment.
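The contrast between binary and partial credit mentioned in the abstract can be sketched as follows. The specific penalty scheme (deducting for extra attempts and hint requests) is an invented illustration, not ASSISTments' actual partial-credit model.

```python
# Illustrative contrast between binary and partial credit for one problem.
# The penalty values below are assumptions made for this sketch; they are
# not ASSISTments' real scoring rule.

def binary_credit(first_try_correct: bool) -> float:
    """Classic binary scoring: 1 only if correct on the first attempt."""
    return 1.0 if first_try_correct else 0.0

def partial_credit(attempts: int, hints: int) -> float:
    """Deduct 0.2 per extra attempt and 0.1 per hint used, floored at 0."""
    score = 1.0 - 0.2 * (attempts - 1) - 0.1 * hints
    return round(max(score, 0.0), 2)

# A student who needed two attempts and one hint gets no credit under binary
# scoring, but partial credit still registers substantial knowledge:
print(binary_credit(False))   # 0.0
print(partial_credit(2, 1))   # 0.7
```

The point of partial credit as an RCT outcome measure is exactly this extra resolution: two students who both miss the first attempt can still be distinguished.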

    Automation and robotics human performance

    The scope of this report is limited to the following: (1) assessing the feasibility of the assumptions for crew productivity during the intra-vehicular activities and extra-vehicular activities; (2) estimating the appropriate level of automation and robotics to accomplish balanced man-machine, cost-effective operations in space; (3) identifying areas where conceptually different approaches to the use of people and machines can leverage the benefits of the scenarios; and (4) recommending modifications to scenarios or developing new scenarios that will improve the expected benefits. The FY89 special assessments are grouped into the five categories shown in the report. The high-level system analyses for Automation & Robotics (A&R) and Human Performance (HP) were performed under the Case Studies Technology Assessment category, whereas the detailed analyses for the critical systems and high-leverage development areas were performed under the appropriate operations categories (In-Space Vehicle Operations or Planetary Surface Operations). The analysis activities planned for the Science Operations technology areas were deferred to FY90 studies. The remaining activities, such as analytic tool development, graphics/video demonstrations, and intelligent communicating systems software architecture, were performed under the Simulation & Validations category.

    MOOClm: Learner Modelling for MOOCs

    Massive Open Online Courses, or MOOCs, generate enormous quantities of learning data. Analysis of this data has considerable potential benefits for learners, educators, teaching administrators, and educational researchers. How to realise this potential is still an open question. This thesis explores the use of such data to create a rich Open Learner Model (OLM). The OLM is designed to take account of the restrictions and goals of lifelong learner model usage. Towards this end, we structure the learner model around a standard curriculum-based ontology. Since such a learner model may be very large, we integrate a visualisation based on a highly scalable circular treemap representation. The visualisation allows the student either to drill down into increasingly detailed views of the learner model or to filter the model down to a smaller, selected subset. We introduce the notion of a set of reference learner models, such as an ideal student, a typical student, or a selected set of learning objectives within the curriculum. These provide a foundation for a learner to make a meaningful evaluation of their own model by comparing it against a reference model. To validate the work, we created MOOClm to implement this framework, then used it in the context of a Small Private Online Course (SPOC) run at the University of Sydney. We also report a qualitative usability study to gain insights into the ways a learner can make use of the OLM. Our contribution is the design and validation of MOOClm, a framework that harnesses MOOC data to create a learner model with an OLM interface for student and educator usage.
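The idea of structuring a learner model around a curriculum ontology can be sketched as a mastery roll-up over a topic tree, which is the kind of aggregate a circular-treemap OLM would display at each level. The curriculum, mastery values, and averaging rule below are invented for illustration; they are not MOOClm's actual ontology or algorithm.

```python
# Sketch: rolling per-topic mastery up a curriculum ontology so each node
# of an open learner model can show an aggregate score. All data here is
# hypothetical; the averaging rule is an assumption for illustration.

CURRICULUM = {
    "Programming": ["Control Flow", "Data Structures"],
    "Control Flow": ["Loops", "Conditionals"],
    "Data Structures": ["Lists", "Dictionaries"],
}

# Observed mastery estimates (0..1) exist only for leaf topics.
LEAF_MASTERY = {"Loops": 0.9, "Conditionals": 0.7, "Lists": 0.5, "Dictionaries": 0.3}

def mastery(topic: str) -> float:
    """Leaf topics report observed mastery; inner nodes average their children."""
    children = CURRICULUM.get(topic)
    if not children:
        return LEAF_MASTERY.get(topic, 0.0)
    return sum(mastery(c) for c in children) / len(children)

print(round(mastery("Control Flow"), 2))  # 0.8
print(round(mastery("Programming"), 2))   # 0.6
```

A reference model (an "ideal student", say, with all leaves at 1.0) could be run through the same roll-up, letting the learner compare aggregates node by node.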