2,044 research outputs found
Strategic Error as Style: Finessing the Grammar Checker
Composition studies lacks a comprehensive theory of error, one which successfully defines error in writing and offers a pedagogical response to ostensible errors that neither ignores nor pathologizes them. Electronic text-critiquing technologies offer some promise of helping writers notice and correct errors, but they are under-researched in composition and rarely well integrated into pedagogical praxis. This research on the grammar and style checker in Microsoft Word considers the program as an electronic checklist for making decisions about what counts as an error in a given rhetorical situation. The study also offers a theory of error grounded in the idea of attention, or cognitive load, some of which an electronic checker can relieve in the areas of its greatest effectiveness, which this research quantifies. The proposed theory of error forms the basis for a pedagogy of register, understood as typified style, and establishes that error itself can be a strategic style move.
Ellogon: A New Text Engineering Platform
This paper presents Ellogon, a multi-lingual, cross-platform, general-purpose
text engineering environment. Ellogon was designed in order to aid both
researchers in natural language processing, as well as companies that produce
language engineering systems for the end-user. Ellogon provides a powerful
TIPSTER-based infrastructure for managing, storing and exchanging textual data,
embedding and managing text processing components as well as visualising
textual data and their associated linguistic information. Among its key
features are full Unicode support, an extensive multi-lingual graphical user
interface, its modular architecture and its reduced hardware requirements.
Comment: 7 pages, 9 figures. Will be presented to the Third International
Conference on Language Resources and Evaluation - LREC 200
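The TIPSTER-style infrastructure Ellogon builds on stores linguistic information as stand-off annotations over immutable text. A minimal sketch of that data model follows; the class and method names are hypothetical illustrations, not Ellogon's actual API:

```python
from dataclasses import dataclass, field

@dataclass
class Annotation:
    """A stand-off annotation: a typed span over the document text plus attributes."""
    kind: str
    start: int
    end: int
    attributes: dict = field(default_factory=dict)

@dataclass
class Document:
    """Immutable text with a growing list of stand-off annotations."""
    text: str
    annotations: list = field(default_factory=list)

    def annotate(self, kind, start, end, **attrs):
        ann = Annotation(kind, start, end, attrs)
        self.annotations.append(ann)
        return ann

    def spans(self, kind):
        """Return the text covered by every annotation of the given type."""
        return [self.text[a.start:a.end]
                for a in self.annotations if a.kind == kind]
```

Because annotations only reference character offsets, components can add layers (tokens, named entities, parses) without ever rewriting the underlying text, which is what makes storage and exchange between processing components straightforward.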
Computer based writing support for dyslexic adults using language constraints
Computers have been used effectively to provide support for people with a variety of
special needs. One such group is adults with dyslexia. Dyslexia is commonly recognised
as a learning disorder characterised by reading, writing and spelling difficulties. It inhibits
recognition and processing of graphic symbols, particularly those pertaining to language.
Computers are a useful aid for dyslexic adults, especially word processors and their
associated spelling tools. However, there are still areas where improvements are needed.
Creating an environment that minimises the visual discomfort associated with
proofreading and with making selections from lists would be of benefit.
Furthermore, providing the correct type and level of support for spelling,
grammar and sentence construction may result in higher standards being
achieved.
A survey of 250 dyslexic adults established their requirements and enabled the
development of a specialist word processing system and associated spelling support tools.
The hypothesis that using a language with enforced structure and rigid
constraints has a positive effect for dyslexic adults was also tested. From
this, a support tool was developed that provided a controlled environment to
assist dyslexic adults with sentence construction. Three environments were
created using the word processing system:
environment 1 used the basic system with no support, environment 2 provided spelling
support suggested by the survey subjects and environment 3 used the sentence
constructing tool providing support and control. Using these environments in controlled
experiments indicated that although environment 2 achieved high academic standards,
environment 3 produced written work to an even higher standard and at the same time,
the subjects derived greater satisfaction in using it.
This research demonstrates that working in a controlled, rigid environment,
where structure is enforced, substantially benefits dyslexic adults performing
computer-based writing tasks.
Applying Formal Methods to Networking: Theory, Techniques and Applications
Despite its great importance, modern network infrastructure is remarkable for
the lack of rigor in its engineering. The Internet, which began as a research
experiment, was never designed to handle the users and applications it hosts
today. The lack of formalization of the Internet architecture meant limited
abstractions and modularity, especially for the control and management planes,
thus requiring a new protocol to be built from scratch for every new need.
This led to an unwieldy, ossified Internet architecture resistant to any
attempts at formal verification, and an Internet culture where expediency and
pragmatism are favored over formal correctness. Fortunately, recent work in
the space of
are favored over formal correctness. Fortunately, recent work in the space of
clean slate Internet design---especially, the software defined networking (SDN)
paradigm---offers the Internet community another chance to develop the right
kind of architecture and abstractions. This has also led to a great resurgence
in interest of applying formal methods to specification, verification, and
synthesis of networking protocols and applications. In this paper, we present a
self-contained tutorial of the formidable amount of work that has been done in
formal methods, and present a survey of its applications to networking.
Comment: 30 pages, submitted to IEEE Communications Surveys and Tutorial
PLATICA: Personalized Language Acquisition Training & Instruction Chatbot Assistant
English is immensely important and useful in our society; however, many people across the world are learning English as a second language and have limited options to practice. Casual English conversation with native speakers is one of the most proven and immersive ways to practice a language. However, not everyone has those opportunities or the resources to attend ESL classes. We aim to solve this issue with our project PLATICA, a robust, low-cost mobile application that anyone can use to build experience conversing in English. PLATICA takes advantage of state-of-the-art deep learning and natural language processing techniques to emulate real conversations while providing real-time grammar feedback to assist the user in improving their English skills. As an end-to-end learning pipeline, PLATICA could also be adapted to other languages in the future.
The Use of Online Automated Writing Checkers among EFL Learners
Writing is regarded as a vital learning tool for all subject areas. However, it is tough for EFL students in college programmes to grasp and possess excellent writing skills. This paper describes the findings of a study conducted to better understand EFL learners’ perceptions of using online automated writing checkers (OAWCs). The study aims to elicit learners’ perspectives on enhancing their writing skills with OAWCs. A questionnaire was administered to sixty Saudi female students in the College of Science and Arts, Unizah, Qassim University. The results demonstrate the learners’ positive perceptions of the use of these technologies. Based on the findings, educational implications are proposed for this descriptive study and future research.
Soylent: A Word Processor with a Crowd Inside
This paper introduces architectural and interaction patterns for integrating crowdsourced human contributions directly into user interfaces. We focus on writing and editing, complex endeavors that span many levels of conceptual and pragmatic activity. Authoring tools offer help with pragmatics, but for higher-level help, writers commonly turn to other people. We thus present Soylent, a word processing interface that enables writers to call on Mechanical Turk workers to shorten, proofread, and otherwise edit parts of their documents on demand. To improve worker quality, we introduce the Find-Fix-Verify crowd programming pattern, which splits tasks into a series of generation and review stages. Evaluation studies demonstrate the feasibility of crowdsourced editing and investigate questions of reliability, cost, wait time, and work time for edits.
National Science Foundation (U.S.) (Grant No. IIS-0712793)
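The Find-Fix-Verify pattern described above can be sketched as three staged pools of workers: independent workers flag candidate spans, a second pool proposes rewrites for the spans that enough finders agreed on, and a third pool votes among the proposals. A minimal sketch follows, with crowd workers modelled as plain Python functions; the names `finders`, `fixers`, `verifiers`, and the `min_flags` agreement threshold are illustrative assumptions, not Soylent's actual task design:

```python
from collections import Counter

def find_fix_verify(text, finders, fixers, verifiers, min_flags=2):
    """Run a Find-Fix-Verify pipeline over text; workers are plain functions.

    finders:   each returns a set of (start, end) spans it thinks need editing
    fixers:    each returns a rewrite for a flagged snippet
    verifiers: each votes for one candidate rewrite from a list
    """
    # Find: keep only spans flagged by at least min_flags independent workers,
    # which filters out idiosyncratic or lazy flags.
    flags = Counter()
    for find in finders:
        flags.update(find(text))
    patches = [span for span, n in flags.items() if n >= min_flags]

    results = {}
    for start, end in patches:
        snippet = text[start:end]
        # Fix: gather independent candidate rewrites for the snippet.
        candidates = [fix(snippet) for fix in fixers]
        # Verify: a separate pool votes; the plurality winner is accepted.
        votes = Counter(verify(snippet, candidates) for verify in verifiers)
        results[snippet] = votes.most_common(1)[0][0]
    return results
```

Separating generation (Fix) from review (Verify) is the key design choice: the workers who propose a rewrite never judge their own output, which is what lets the pattern trade a little extra latency for much higher quality.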
An exploratory research on grammar checking of Bangla sentences using statistical language models
N-gram based language models are very popular and extensively used statistical methods for solving various natural language processing problems including grammar checking. Smoothing is one of the most effective techniques used in building a language model to deal with the data sparsity problem. Kneser-Ney is one of the most prominently used and successful smoothing techniques for language modelling. In our previous work, we presented a Witten-Bell smoothing based language modelling technique for checking the grammatical correctness of Bangla sentences, which showed promising results outperforming previous methods. In this work, we propose an improved method using a Kneser-Ney smoothing based n-gram language model for grammar checking and perform a comparative performance analysis between the Kneser-Ney and Witten-Bell smoothing techniques for the same purpose. We also provide an improved technique for calculating the optimum threshold, which further enhances the results. Our experimental results show that Kneser-Ney outperforms Witten-Bell as a smoothing technique when used with n-gram LMs for checking the grammatical correctness of Bangla sentences.
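The threshold-based n-gram approach described above can be sketched as follows: train a smoothed bigram model, score a sentence by its average per-token log-probability, and flag it as ungrammatical when the score falls below a threshold. This is a minimal interpolated Kneser-Ney bigram model; the toy corpus, the discount of 0.75, and the threshold value are illustrative assumptions, not the paper's actual configuration or data:

```python
import math
from collections import Counter

def train_kn_bigram(sentences, discount=0.75):
    """Train an interpolated Kneser-Ney bigram model; returns P(w | u)."""
    unigrams, bigrams = Counter(), Counter()
    for s in sentences:
        toks = ["<s>"] + s.split() + ["</s>"]
        unigrams.update(toks)
        bigrams.update(zip(toks, toks[1:]))
    continuation = Counter(w for _, w in bigrams)  # distinct left-contexts of w
    followers = Counter(u for u, _ in bigrams)     # distinct right-contexts of u
    n_types = len(bigrams)

    def prob(u, w):
        # Continuation probability: how "promiscuous" w is across contexts.
        p_cont = continuation[w] / n_types
        if unigrams[u] == 0:            # unseen history: back off entirely
            return max(p_cont, 1e-12)
        # Absolute discounting on the observed bigram count ...
        discounted = max(bigrams[(u, w)] - discount, 0.0) / unigrams[u]
        # ... with the freed mass redistributed via the continuation model.
        backoff_mass = discount * followers[u] / unigrams[u]
        return max(discounted + backoff_mass * p_cont, 1e-12)

    return prob

def avg_logprob(prob, sentence):
    """Average per-bigram log-probability of a sentence under the model."""
    toks = ["<s>"] + sentence.split() + ["</s>"]
    scores = [math.log(prob(u, w)) for u, w in zip(toks, toks[1:])]
    return sum(scores) / len(scores)

def is_grammatical(prob, sentence, threshold=-2.0):
    """Flag a sentence as grammatical if its score clears the threshold."""
    return avg_logprob(prob, sentence) >= threshold
```

A Witten-Bell variant would differ only inside `prob` (its interpolation weight comes from the number of distinct followers relative to the total count, rather than a fixed discount), which is what makes a side-by-side comparison of the two smoothing techniques straightforward in this setup.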