
    The effectiveness of Neuro-Linguistic Programming (NLP) on students’ motivation and attitude

    Neuro-Linguistic Programming (NLP) was developed at the University of California at Santa Cruz in the 1970s by its founders and principal authors, Richard Bandler and John Grinder (Tosey & Mathison, 2003). Richard Bandler was a mathematics and computer science student, while John Grinder was a professor of linguistics. NLP supports the teaching and learning process, especially classroom management. NLP was first introduced in Malaysia in the 21st century, and since then many training centres have emerged, such as Akademi NLP Malaysia, which trains Malaysian university and polytechnic lecturers, and NLP Malaysia Centre of Excellence (NLPMC), which was the first to offer NLP training in the Malay language.

    Natural language processing

    Beginning with the basic issues of NLP, this chapter aims to chart the major research activities in this area since the last ARIST chapter in 1996 (Haas, 1996), including: (i) natural language text processing systems - text summarization, information extraction, information retrieval, etc., including domain-specific applications; (ii) natural language interfaces; (iii) NLP in the context of the WWW and digital libraries; and (iv) evaluation of NLP systems

    The effects of clinical hypnosis versus Neurolinguistic Programming (NLP) before External Cephalic Version (ECV): a prospective off-centre randomised, double-blind, controlled trial

    Objective. To examine the effects of clinical hypnosis versus NLP intervention on the success rate of ECV procedures in comparison to a control group. Methods. A prospective off-centre randomised trial of a clinical hypnosis intervention against NLP in women with a singleton breech fetus at or after 37 0/7 weeks (259 days) of gestation and a normal amniotic fluid index. All 80 participants heard a 20-minute recorded intervention via headphones. The main outcome assessed was the success rate of ECV. The intervention groups were compared with a control group receiving standard medical care alone (n = 122). Results. A total of 42 women who received a hypnosis intervention prior to ECV had a successful ECV in 40.5% (n = 17) of cases, whereas 38 women who received NLP had a successful ECV in 44.7% (n = 17) of cases (P > 0.05). The control group had similar patient characteristics compared to the intervention groups (P > 0.05). The control group's success rate of 27.3% (n = 33 of 122) was statistically significantly lower than that of NLP (P = 0.05) and of hypnosis and NLP combined (P = 0.03). Conclusions. These findings suggest that prior clinical hypnosis and NLP have similar success rates of ECV procedures and are both superior to standard medical care alone
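
    The group comparison above can be sketched numerically. The snippet below reproduces the reported success rates and applies a two-proportion z-test as an illustration; the abstract does not state which statistical test the trial actually used, so the test choice here is an assumption.

    ```python
    # Hedged sketch: recompute the reported ECV success rates and compare
    # proportions with a normal-approximation two-proportion z-test
    # (illustrative only; not necessarily the trial's actual method).
    from math import sqrt, erf

    def success_rate(successes, n):
        """Success rate as a percentage, rounded to one decimal place."""
        return round(100 * successes / n, 1)

    def two_proportion_p(x1, n1, x2, n2):
        """Two-sided p-value for a pooled two-proportion z-test."""
        p1, p2 = x1 / n1, x2 / n2
        pooled = (x1 + x2) / (n1 + n2)
        se = sqrt(pooled * (1 - pooled) * (1 / n1 + 1 / n2))
        z = abs(p1 - p2) / se
        return 2 * (1 - 0.5 * (1 + erf(z / sqrt(2))))  # normal CDF via erf

    print(success_rate(17, 42))  # hypnosis group: 40.5
    print(success_rate(17, 38))  # NLP group: 44.7
    # NLP (17/38) vs. control (33/122):
    print(round(two_proportion_p(17, 38, 33, 122), 2))
    ```

    The normal approximation yields a p-value close to the abstract's reported P = 0.05 for NLP versus control; small differences are expected depending on the exact test (pooled z, chi-square with or without continuity correction) the authors used.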

    Identifying beneficial task relations for multi-task learning in deep neural networks

    Multi-task learning (MTL) in deep neural networks for NLP has recently received increasing interest due to some compelling benefits, including its potential to efficiently regularize models and to reduce the need for labeled data. While it has brought significant improvements in a number of NLP tasks, mixed results have been reported, and little is known about the conditions under which MTL leads to gains in NLP. This paper sheds light on the specific task relations that can lead to gains from MTL models over single-task setups. Comment: Accepted for publication at EACL 201
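
    The regularization benefit described above typically comes from hard parameter sharing: several task heads train the same shared encoder. The toy below is a forward-only, pure-Python sketch of that layout (the function names, dimensions, and ReLU encoder are all illustrative assumptions, not the paper's architecture).

    ```python
    # Minimal sketch of hard parameter sharing for multi-task learning:
    # one shared encoder feeds several task-specific output heads, so
    # every task's gradient would update the same shared weights.
    import random

    random.seed(0)

    def linear(weights, x):
        """Apply a weight matrix (list of rows) to a vector x."""
        return [sum(w * xi for w, xi in zip(row, x)) for row in weights]

    # Shared encoder parameters, used by every task (the core MTL idea).
    shared = [[random.uniform(-1, 1) for _ in range(4)] for _ in range(3)]

    # Task-specific heads, e.g. a 2-way and a 5-way classification task.
    head_a = [[random.uniform(-1, 1) for _ in range(3)] for _ in range(2)]
    head_b = [[random.uniform(-1, 1) for _ in range(3)] for _ in range(5)]

    def forward(x, head):
        h = [max(0.0, v) for v in linear(shared, x)]  # shared ReLU features
        return linear(head, h)                        # task-specific logits

    x = [0.5, -0.2, 0.1, 0.9]
    print(len(forward(x, head_a)))  # 2 logits for task A
    print(len(forward(x, head_b)))  # 5 logits for task B
    ```

    Because both heads read the same hidden representation, supervision from one task constrains the features available to the other, which is one mechanism behind the regularization effect the abstract mentions.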

    Software Infrastructure for Natural Language Processing

    We classify and review current approaches to software infrastructure for research, development and delivery of NLP systems. The task is motivated by a discussion of current trends in the field of NLP and Language Engineering. We describe a system called GATE (a General Architecture for Text Engineering) that provides a software infrastructure on top of which heterogeneous NLP processing modules may be evaluated and refined individually, or may be combined into larger application systems. GATE aims to support both researchers and developers working on component technologies (e.g. parsing, tagging, morphological analysis) and those working on developing end-user applications (e.g. information extraction, text summarisation, document generation, machine translation, and second language learning). GATE promotes reuse of component technology, permits specialisation and collaboration in large-scale projects, and allows for the comparison and evaluation of alternative technologies. The first release of GATE is now available - see http://www.dcs.shef.ac.uk/research/groups/nlp/gate/ Comment: LaTeX, uses aclap.sty, 8 pages
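
    The component-pipeline idea behind this kind of infrastructure can be sketched in a few lines. This is not GATE's real API; it is a toy illustration of heterogeneous components sharing a common document interface so they can be tested individually or chained together.

    ```python
    # Illustrative sketch (not GATE's actual API): NLP components that all
    # consume and produce a common document structure, composed into a
    # pipeline so each stage can also be run and evaluated on its own.
    from typing import Callable

    Document = dict  # toy document: {"text": ..., "tokens": ..., "tags": ...}

    def tokenizer(doc: Document) -> Document:
        doc["tokens"] = doc["text"].split()
        return doc

    def tagger(doc: Document) -> Document:
        # Placeholder tagger: mark capitalized tokens as proper nouns.
        doc["tags"] = ["NNP" if t[:1].isupper() else "NN" for t in doc["tokens"]]
        return doc

    def pipeline(*components: Callable[[Document], Document]):
        """Chain components left to right into one callable system."""
        def run(doc: Document) -> Document:
            for component in components:
                doc = component(doc)
            return doc
        return run

    run = pipeline(tokenizer, tagger)
    doc = run({"text": "GATE supports Text Engineering"})
    print(doc["tags"])  # ['NNP', 'NN', 'NNP', 'NNP']
    ```

    Swapping the placeholder tagger for an alternative implementation leaves the rest of the pipeline untouched, which is the comparison-and-reuse property the abstract attributes to GATE.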