Feasibility report: Delivering case-study based learning using artificial intelligence and gaming technologies
This document describes an investigation into the technical feasibility of a game to support learning based on case studies. Information systems students using the game will conduct fact-finding interviews with virtual characters. We survey relevant technologies in computational linguistics and games, assess the applicability of the various approaches, and propose an architecture for the game based on existing techniques. We also propose a phased plan for the development of the game.
An empirically-based debugging system for novice programmers
The research described here concerns the design and construction of an empirically-based debugging aid for first-time computer users, integrated into the Open University's SOLO programming environment. Its basis is an account of the processes involved as human experts debug faulty code, an account later supported by empirical tests on human experts. The account implies that an understanding of the intentions of the programmer is not essential to successful debugging of a certain class of programs. That class comprises programs written in a database-dependent language by users who are initially completely computer-naive and who during their course become competent to write simple programs which embody one or more basic AI techniques such as recursive inference. The debugging system, called AURAC, incorporates an explicit model of the debugging strategies used by human experts. Its understanding, therefore, is of programming in general and of the SOLO environment in particular. In the process we present a broad taxonomy of naive users' errors, showing that they can be divided into types, each type requiring a different debugging approach and indicating a different degree of expertise on the part of the perpetrator. SOLO is a conveniently delimited though nonetheless rich problem domain.
Also described is a new version of SOLO itself (MacSOLO) which incorporates a large number of traps for the simple errors which plague novices, thus enabling AURAC to concentrate on the more interesting programming mistakes. AURAC is intended to operate after the event rather than while a program is actually being written, and is able, via analysis of programming clichés and of data flows, to isolate errors in the user's code. Where AURAC cannot analyse, or where its analysis yields nothing useful, it describes the corresponding section of code instead, so that the user receives a coherent output.
MacSOLO and AURAC together form a unified system, based upon the principles of Simplicity, Consistency and Transparency. We show how these principles were applied during the design and construction phases.
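AURAC's internal representations are not given in this abstract; as a rough illustration only, a data-flow check of the kind described above (flagging variables that are used but never assigned) might be sketched as follows. The statement encoding is a hypothetical simplification, not AURAC's actual design.

```python
def dataflow_check(program):
    """Flag uses of variables that are never assigned before use,
    a simple data-flow check of the kind AURAC-style debuggers apply.

    program: list of (assigned_vars, used_vars) set pairs, one per
    statement, in execution order.
    Returns a list of (statement_number, variable) suspected errors.
    """
    defined = set()
    errors = []
    for stmt_no, (assigned, used) in enumerate(program, start=1):
        for var in sorted(used):
            if var not in defined:
                # Variable consumed before any statement produced it.
                errors.append((stmt_no, var))
        defined.update(assigned)
    return errors
```

For example, a two-statement program that assigns `x` and then uses both `x` and `y` would have the use of `y` flagged as a suspected error at statement 2.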
AN EASY ACCESS INTERACTIVE STATISTICAL SYSTEM FOR USE AND TRAINING IN BIOMETRY
One of the most important tools of the applied statistician is the digital computer. It is natural, therefore, for the instructor in applied statistics to want his students to become familiar with the use of computers. If his students are going to get actual experience in using a computer for statistical analysis, he often has only two alternatives. The students can be required to write their own statistical programs or they can use programs already available through a computer facility.
If the course is to be taught such that each student is responsible for his own programs, the instructor must either require that the students have previous programming experience or he must be prepared to spend a portion of his class time teaching a programming language. Neither of these seems to be satisfactory. First, to make knowledge of programming a prerequisite will often reduce the number of people interested in the course. Many students, who would otherwise enroll, might be completely unfamiliar with programming and have no real interest in becoming programmers. To spend a portion of the class time in teaching a programming language and associated programming techniques would often mean that the emphasis of the class could easily shift from the statistical methods to computer programming. This would result in a significant reduction in the amount of material the class could cover.
The alternative to having each student write his own programs is to use prepared programs available through a computer facility. In most instances, this would mean that each time a student wished to use the computer for a statistical analysis he would have to prepare the data for card input, send the cards to the computer facility, wait, and finally have his results returned. Again, either the instructor would have to assign a particular program and lead the class through the data preparation, or he would expect each student to be responsible for reading the program documentation and preparing the data for himself. In many statistical analyses the investigator might wish to run several different programs. For each of these the student might have to review the relevant documentation, punch a new set of data cards, and wait. Unfortunately, rather than repeat this procedure several times, a student may become satisfied with running only the primary analysis without spending time, for instance, verifying the underlying assumptions.
An example of the type of situation which might indicate several computer runs would be data on which an Analysis of Variance is to be performed. Consider the problem of a student who has data from patients being treated with several different drugs. He wishes to test the null hypothesis of no significant differences between the treatment means. He might first wish to run Bartlett's test for homogeneity of variances. If transforms are necessary on the data he will wish to try them. If he is satisfied that the variances are not significantly different, he will compute the Analysis of Variance, possibly following that with Duncan's multiple range test. Since each method is probably done by a different program, the data might have to be completely repunched three or four different times. Rather than doing all the extra work, the student might simply run the Analysis of Variance and be satisfied with a less than complete data analysis.
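The thesis implemented this workflow in APL; as a modern, self-contained illustration of the central step, the one-way Analysis of Variance F statistic for several treatment groups can be sketched in Python. The function name and data layout are this sketch's own, not taken from the APL Statistical System.

```python
from statistics import mean

def one_way_anova(groups):
    """Compute the F statistic for a one-way Analysis of Variance.

    groups: a list of lists of observations, one inner list per
    treatment (e.g. per drug).
    Returns (F, df_between, df_within).
    """
    k = len(groups)                      # number of treatments
    n = sum(len(g) for g in groups)      # total number of observations
    grand = mean(x for g in groups for x in g)
    # Between-treatments sum of squares
    ss_between = sum(len(g) * (mean(g) - grand) ** 2 for g in groups)
    # Within-treatments (error) sum of squares
    ss_within = sum((x - mean(g)) ** 2 for g in groups for x in g)
    df_between, df_within = k - 1, n - k
    f_stat = (ss_between / df_between) / (ss_within / df_within)
    return f_stat, df_between, df_within
```

For three treatment groups `[1, 2, 3]`, `[2, 3, 4]` and `[4, 5, 6]`, this yields F = 7.0 on (2, 6) degrees of freedom, which would then be compared against an F table at the chosen significance level.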
The problems introduced here give the necessary background for the discussion of the APL Statistical System which follows. This discussion is divided into three sections. The first section includes two chapters and broadly discusses the characteristics of the APL Statistical System which help overcome some of the problems involved in utilizing a computer in statistical instruction. The second chapter describes two basic utilizations of the Statistical System.
The second section describes the computer hardware configuration on which the system is currently being implemented. It also describes some of the important characteristics of the programming language used. A description of the actual Statistical System, with a list of the statistical methods which are available to the user, is also included in the third chapter.
The third section is actually a user's manual giving the operating procedures for the system, an explanation of the keyboard, data entry, and a few of the basic APL operators. To make it an independent part of the thesis so that it may be used alone as a manual, a more complete description of how to use each of the statistical methods is given. For each method an example is shown which can be verified in most cases by the reference source listed in the example. A complete program listing of all the programs, or functions, used in this system can be found in the Appendix.