887 research outputs found

    A robust methodology for automated essay grading

    Get PDF
    None of the available automated essay grading systems can be used to grade essays according to the National Assessment Program – Literacy and Numeracy (NAPLAN) analytic scoring rubric used in Australia. This thesis is a humble effort to address this limitation. The objective of this thesis is to develop a robust methodology for automatically grading essays against the NAPLAN rubric by combining heuristics and rules grounded in the English language with neural network modelling.

    An exploratory study into automated précis grading

    Get PDF
    Automated writing evaluation (AWE) is a popular research field, but the main focus has been on evaluating argumentative essays. In this paper, we consider a different genre, namely précis texts. A précis is a written text that provides a coherent summary of the main points of a spoken or written text. We present a corpus of English précis texts, each of which received a grade assigned by a highly experienced English language teacher and was subsequently annotated following an exhaustive error typology. With this corpus we trained a machine learning model that relies on a number of linguistic, automatic summarization and AWE features. Our results reveal that this model is able to predict the grade of précis texts with only a moderate error margin.
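    The abstract does not include the model itself; the following is a minimal, hypothetical sketch of a feature-based grader of the kind described, using a ridge regressor over a few hand-crafted features. The feature set, toy grades and CSV-free layout are assumptions for illustration, not the paper's.

    # Illustrative sketch: predict a precis grade from hand-crafted features
    # with a regularized linear regressor. Feature names are hypothetical.
    import numpy as np
    from sklearn.linear_model import Ridge
    from sklearn.model_selection import cross_val_score

    # Each row: [token_count, type_token_ratio, overlap_with_source, spelling_errors]
    X = np.array([
        [180, 0.52, 0.41, 3],
        [220, 0.48, 0.55, 1],
        [150, 0.60, 0.33, 6],
        [205, 0.50, 0.47, 2],
    ])
    y = np.array([6.5, 8.0, 5.0, 7.5])   # teacher-assigned grades (toy values)

    model = Ridge(alpha=1.0)
    scores = cross_val_score(model, X, y, cv=2, scoring="neg_mean_absolute_error")
    print("MAE per fold:", -scores)       # a rough view of the error margin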

    Exploring Automated Essay Scoring Models for Multiple Corpora and Topical Component Extraction from Student Essays

    Get PDF
    Since human essay grading is widely regarded as labor-intensive, automatic scoring methods have drawn increasing attention. Automatic scoring reduces reliance on human effort and subjectivity over time and has commercial benefits for standardized aptitude tests. Automated essay scoring can be defined as a method for grading student essays that achieves high agreement with human graders, where such graders exist, and that requires no human effort during the grading process. This research mainly focuses on improving existing Automated Essay Scoring (AES) models with different technologies. We present three scoring models for grading two corpora: the Response to Text Assessment (RTA) and the Automated Student Assessment Prize (ASAP). First, a traditional machine learning model that extracts features based on semantic similarity measurement is employed for grading the RTA task. Second, a neural network model with a co-attention mechanism is used for grading source-based writing tasks. Third, we propose a hybrid model integrating the neural network model with hand-crafted features. Experiments show that the feature-based model outperforms its baseline, but a stand-alone neural network model significantly outperforms the feature-based model. The hybrid model in turn outperforms its baselines, especially in a cross-prompt experimental setting. In addition, we present two investigations of using the intermediate output of the neural network model to extract keywords and key phrases from student essays and from the source article. Experiments show that keywords and key phrases extracted by our models support the feature-based AES model, and that human effort during training can be reduced by using automated essay-quality signals.
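    As an illustration of the hybrid idea described above (not the authors' implementation), the sketch below concatenates a learned essay representation with hand-crafted features before the scoring layer; all layer sizes, names and toy inputs are assumptions.

    # Minimal sketch of a "hybrid" scorer: fuse a neural essay vector with
    # hand-crafted features in the final regression layer.
    import torch
    import torch.nn as nn

    class HybridScorer(nn.Module):
        def __init__(self, vocab_size, emb_dim=50, hidden=64, n_handcrafted=4):
            super().__init__()
            self.emb = nn.Embedding(vocab_size, emb_dim, padding_idx=0)
            self.lstm = nn.LSTM(emb_dim, hidden, batch_first=True)
            self.head = nn.Linear(hidden + n_handcrafted, 1)  # fuse both views

        def forward(self, token_ids, handcrafted):
            h, _ = self.lstm(self.emb(token_ids))   # (batch, time, hidden)
            essay_vec = h.mean(dim=1)               # average over time steps
            fused = torch.cat([essay_vec, handcrafted], dim=1)
            return self.head(fused).squeeze(-1)     # predicted score per essay

    # Toy usage: 2 essays of 30 tokens each, 4 hand-crafted features per essay.
    model = HybridScorer(vocab_size=1000)
    tokens = torch.randint(1, 1000, (2, 30))
    feats = torch.rand(2, 4)
    print(model(tokens, feats).shape)               # torch.Size([2])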

    DeepEval: An Integrated Framework for the Evaluation of Student Responses in Dialogue Based Intelligent Tutoring Systems

    Get PDF
    The automatic assessment of student answers is one of the critical components of an Intelligent Tutoring System (ITS), because accurate assessment of student input is needed in order to provide effective feedback that leads to learning. It is a very challenging task because it requires natural language understanding capabilities, and the process involves various components such as concept identification, co-reference resolution, and ellipsis handling. As part of this thesis, we thoroughly analyzed a set of student responses obtained from an experiment with the intelligent tutoring system DeepTutor, in which college students interacted with the tutor to solve conceptual physics problems, designed an automatic answer assessment framework (DeepEval), and evaluated the framework after implementing several important components. To evaluate our system, we annotated 618 responses from 41 students for correctness. Our system performs better than the typical similarity-calculation method. We also discuss various issues in automatic answer evaluation.
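    For context, a sketch of the kind of "typical similarity calculation" baseline such a framework is compared against might look like the following; the example sentences and the 0.5 threshold are illustrative assumptions, not taken from the thesis.

    # Baseline sketch: judge a student response by its cosine similarity to a
    # reference answer over TF-IDF vectors.
    from sklearn.feature_extraction.text import TfidfVectorizer
    from sklearn.metrics.pairwise import cosine_similarity

    reference = "the net force on the ball is gravity acting downward"
    student = "gravity pulls the ball down, so the net force points downward"

    vec = TfidfVectorizer().fit([reference, student])
    sim = cosine_similarity(vec.transform([reference]), vec.transform([student]))[0, 0]
    label = "correct" if sim >= 0.5 else "incorrect"   # hypothetical threshold
    print(f"similarity={sim:.2f} -> {label}")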

    Automatic text scoring using neural networks

    Get PDF
    Automated Text Scoring (ATS) provides a cost-effective and consistent alternative to human marking. However, in order to achieve good performance, the predictive features of such a system typically need to be manually engineered by human experts. We introduce a model that forms word representations by learning the extent to which specific words contribute to the text's score. Using Long Short-Term Memory (LSTM) networks to represent the meaning of texts, we demonstrate that a fully automated framework is able to achieve excellent results over similar approaches. In an attempt to make our results more interpretable, and inspired by recent advances in visualizing neural networks, we introduce a novel method for identifying the regions of the text that the model has found most discriminative. This is the accepted manuscript; it is currently embargoed pending publication.
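    A minimal sketch in the spirit of the abstract, not the authors' model or code: an LSTM over word embeddings regressing a single score, trained with mean squared error. Vocabulary size, dimensions and the toy training data are assumptions.

    # Sketch: LSTM text scorer trained end to end on (essay, score) pairs.
    import torch
    import torch.nn as nn

    class LSTMScorer(nn.Module):
        def __init__(self, vocab_size, emb_dim=50, hidden=64):
            super().__init__()
            self.emb = nn.Embedding(vocab_size, emb_dim, padding_idx=0)
            self.lstm = nn.LSTM(emb_dim, hidden, batch_first=True)
            self.out = nn.Linear(hidden, 1)

        def forward(self, token_ids):
            _, (h_n, _) = self.lstm(self.emb(token_ids))  # final hidden state
            return self.out(h_n[-1]).squeeze(-1)          # one score per text

    model = LSTMScorer(vocab_size=2000)
    opt = torch.optim.Adam(model.parameters(), lr=1e-3)
    loss_fn = nn.MSELoss()

    tokens = torch.randint(1, 2000, (8, 40))   # 8 toy essays, 40 tokens each
    gold = torch.rand(8) * 10                  # toy scores on a 0-10 scale
    for _ in range(3):                         # a few illustrative steps
        opt.zero_grad()
        loss = loss_fn(model(tokens), gold)
        loss.backward()
        opt.step()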

    Innovation in Pedagogy and Technology Symposium: University of Nebraska, May 8, 2018

    Get PDF
    Selected Conference Proceedings, presented by University of Nebraska Online and University of Nebraska Information Technology Services. University of Nebraska Information Technology Services (NU ITS) and University of Nebraska Online (NU Online) present an education and technology symposium each spring. The Innovation in Pedagogy and Technology Symposium provides University of Nebraska (NU) faculty and staff the opportunity to learn from nationally recognized experts, share their experiences and learn from the initiatives of colleagues from across the system. The event is offered to NU administrators, faculty and staff free of charge. Tuesday, May 8, 2018, The Cornhusker Marriott, Lincoln, NE.
    Technology has forever changed the landscape of higher education and continues to do so, often at a rapid pace. At the University of Nebraska, we strive to embrace technology to enhance both teaching and learning, to provide key support systems and to meet institutional goals. The Innovation in Pedagogy and Technology Symposium is designed for any NU administrator, faculty or staff member who is involved in the use of technology in education at any level. Past events have drawn over 500 NU faculty, staff and IT professionals from across the four campuses for a day of discovery and networking. The 2018 event was held in downtown Lincoln. The schedule included:
    • Presentations by University of Nebraska faculty, staff and administrators
    • Concurrent sessions focused on pedagogy/instructional design, support and administrative strategies, and emerging technologies
    • Panel discussions
    • Roundtable discussions and networking time
    • Sponsor exhibits
    • Continental breakfast and lunch
    Keynote Presentation: Learning How to Learn: Powerful Mental Tools to Help You Master Tough Subjects • Barbara Oakley, Ph.D., Oakland University
    Fostering Quality by Identifying & Evaluating Effective Practices through Rigorous Research • Tanya Joosten, University of Wisconsin-Milwaukee
    Synchronous Online & In Person Classrooms: Challenges & Rewards Five Years Into Practice • Elsbeth Magilton
    We Nudge and You Can Too: Improving Outcomes with an Emailed Nudge • Ben Smith
    It Takes a System to Build an Affordable Content Program • Brad Severa, Jane Petersen, Kimberly Carlson, Betty Jacques, Brian Moore, Andrew Cano, Michael Jolley
    Five Generations: Preparing Multiple Generations of Learners for a Multi-Generational Workforce • Olimpia Leite-Trambly, Sharon Obasi, Toni Hill
    Schedule NU! Schedule SC! • Cheri Polenske, Jean Padrnos, Corrie Svehla
    See It & Believe It (Assessing Professional Behaviors & Clinical Reasoning with Video Assignments) • Grace Johnson, Megan Frazee
    Group Portfolios as a Gateway to Creativity, Collaboration & Synergy in an Environment Course • Katherine Nashleanas
    Learning to Learn Online: Helping Online Students Navigate Online Learning • Suzanne Withem
    Beyond Closed Captioning: The Other ADA Accessibility Requirements • Analisa McMillan, Peggy Moore (UNMC)
    Using Interactive Digital Wall (iWall) Technology to Promote Active Learning • Cheryl Thompson, Suhasini Kotcherlakota, Patrick Rejda, Paul Dye
    Cybersecurity Threats & Challenges • JR Noble
    Digital Badges: A Focus on Skill Acquisition • Benjamin Malczyk
    Creating a Student Success Center: Transitioning Graduate Students to an Online Community • Brian Wilson, Christina Yao, Erica DeFrain, Andrew Cano
    Male Allies: Supporting an Inclusive Environment in ITS • Heath Tuttle, Wes Juranek
    Featured Extended Presentation: Broaden Your Passion! Encouraging Women in STEM • Barbara Oakley, Oakland University in Rochester, Michigan
    Students as Creative Forces to Enhance Curriculum via E-Learning • Betsy Becker, Peggy Moore, Dele Davies
    Rethinking Visual Communication Curriculum: The Success of Emporium Style • Adam Wagler (UNL), Katie Krcmarik, Alan Eno
    A Course Delivery Evolution: Moving from Lecture to Online to a Flipped Classroom • Kim Michael, Tanya Custer
    Enhancing the Quality of Online Teaching via Collaborative Course Development • B. Jean Mandernach, Steve McGahan
    Collaborating Across NU for Accessible Video • Heath Tuttle, Jane Petersen, Jaci Lindburg
    Structuring Security for Success • Matt Morton, Rick Haugerud
    Future Directions for University of Nebraska Wireless Networking • Brian Cox, Jay Wilmes
    Using Learning Analytics in Canvas to Improve Online Learning • Martonia Gaskill, Phu Vu
    Broaden Your Passion! Encouraging Women in STEM • Featured Speaker: Barbara Oakley, Oakland University in Rochester, MI
    Translating Studio Courses Online • Claire Amy Schultz
    Hidden Treasures: Lesser Known Secrets of Canvas • Julie Gregg, Melissa Diers, Analisa McMillan
    Your Learners, Their Devices & You: Incorporating BYOD Technology into Your Didactics • Tedd Welniak
    Extending the Conversation about Teaching with Technology • Marlina Davidson, Timi Barone, Dana Richter-Egger, Schuetzler, Jaci Lindburg
    Scaling up Student Assessment: Issues and Solutions • Paul van Vliet
    Closing Keynote: Navigating Change: It's a Whitewater Adventure • Marjorie J. Kostelnik, Professor and Senior Associate to the President
    doi: 10.13014/K2Q23XFD

    A rubric based approach towards Automated Essay Grading: focusing on high level content issues and ideas

    Get PDF
    Assessment of a student’s work is by no means an easy task. Even if the student response is in the form of multiple choice answers, manually marking those answer sheets is a task that most teachers regard as rather tedious. The development of an automated method to grade these essays was thus an inevitable step. This thesis proposes a novel approach towards Automated Essay Grading (AEG) through the use of various concepts found within the field of Narratology. Through a review of the literature, several methods by which essays are graded were identified, together with their problems. Chiefly, systems following a statistical approach needed a way to deal with the more implicit features of free text, while systems that did manage this were highly dependent on the type of student response, required pre-knowledge of the subject domain, and demanded more computational power. It was also found that while narrative essays are one of the main ways in which a student can showcase his or her mastery of the English language, no system thus far has attempted to incorporate narrative concepts into analysing this type of free-text response. It was decided that the proposed solution would be centred on the detection of Events, which were in turn used to determine the score an essay receives under the criteria of Audience, Ideas, Character and Setting, and Cohesion, as defined by the NAPLAN rubric. From the results of experiments conducted on the four criteria mentioned above, it was concluded that the concept of detecting Events, as they occur within a narrative story, when applied to essay grading does bear a relation to the score the essay receives. All experiments achieved an average F-measure of 0.65 or above, while exact agreement rates were no lower than 70%. Chi-squared and paired t-test values all indicated that there was insufficient evidence of any significant difference between the scores generated by the computer and those of the human markers.
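    The agreement statistics mentioned above can be reproduced in outline as follows; the score arrays are toy values, not the thesis data, and the exact test setup in the thesis may differ.

    # Sketch: exact agreement, F-measure and a paired t-test between human and
    # system scores on one rubric criterion.
    import numpy as np
    from scipy.stats import ttest_rel
    from sklearn.metrics import f1_score

    human = np.array([3, 2, 4, 3, 1, 2, 3, 4])    # e.g. human "Ideas" scores
    system = np.array([3, 2, 3, 3, 1, 2, 3, 4])   # system-assigned scores

    exact_agreement = np.mean(human == system)
    weighted_f1 = f1_score(human, system, average="weighted")
    t_stat, p_value = ttest_rel(human, system)    # paired comparison of means
    print(f"agreement={exact_agreement:.0%}, F1={weighted_f1:.2f}, p={p_value:.3f}")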

    Final Master's Portfolio

    Get PDF
    This portfolio is the final project for my master's degree at BGSU in English with a specialization in English Teaching. With multimodality as the overarching theme of the portfolio, my goal for each project was to combine research and argument to exemplify the efficient integration of multimodality into instruction.
    • …