Intelligent tutoring systems for systems engineering methodologies
The general goal is to provide the technology required to build systems that can provide intelligent tutoring in IDEF (Integrated Computer Aided Manufacturing Definition Method) modeling. The following subject areas are covered: intelligent tutoring systems for systems analysis methodologies; IDEF tutor architecture and components; developing cognitive skills for IDEF modeling; experimental software; and a PC-based prototype.
Knowledge Based Systems: A Critical Survey of Major Concepts, Issues, and Techniques
This Working Paper Series entry presents a detailed survey of knowledge-based systems. After many years in a relatively dormant state, Artificial Intelligence (AI) - the branch of computer science that attempts to have machines emulate intelligent behavior - is only recently accomplishing practical results. Most of these results can be attributed to the design and use of Knowledge-Based Systems, KBSs (or expert systems) - problem-solving computer programs that can reach a level of performance comparable to that of a human expert in some specialized problem domain. These systems can act as consultants for tasks such as medical diagnosis, military threat analysis, and project risk assessment. They possess knowledge that enables them to make intelligent decisions; they are, however, not meant to replace the human specialists in any particular domain. A critical survey of recent work in interactive KBSs is reported. A case study (MYCIN) of a KBS, a list of existing KBSs, and an introduction to the Japanese Fifth Generation Computer Project are provided as appendices. Finally, an extensive set of KBS-related references is provided at the end of the report.
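The rule-based problem solving that the survey attributes to systems like MYCIN can be sketched as simple forward chaining over IF-THEN rules. The rules and fact names below are invented for illustration and are not taken from the survey or from MYCIN itself:

```python
# A minimal sketch of KBS-style inference: IF-THEN rules are fired
# repeatedly until no new conclusions can be drawn (forward chaining).
# Rule contents are hypothetical, not medical advice.

rules = [
    ({"fever", "stiff_neck"}, "possible_meningitis"),
    ({"possible_meningitis", "gram_negative"}, "suggest_antibiotic_A"),
]

def forward_chain(facts, rules):
    """Fire every rule whose conditions are all satisfied; repeat
    until a full pass adds no new facts."""
    facts = set(facts)
    changed = True
    while changed:
        changed = False
        for conditions, conclusion in rules:
            if conditions <= facts and conclusion not in facts:
                facts.add(conclusion)
                changed = True
    return facts

derived = forward_chain({"fever", "stiff_neck", "gram_negative"}, rules)
```

Real KBSs of the period added certainty factors and an explanation facility on top of this loop; the sketch shows only the core inference cycle.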
The effect of multiple knowledge sources on learning and teaching
Current paradigms for machine-based learning and teaching tend to perform their task in isolation from a rich context of existing knowledge. In contrast, the research project presented here takes the view that bringing multiple sources of knowledge to bear is of central importance to learning in complex domains. As a consequence, teaching must both take advantage of and beware of interactions between new and existing knowledge. The central process which connects learning to its context is reasoning by analogy, a primary concern of this research. In teaching, the connection is provided by the explicit use of a learning model to reason about the choice of teaching actions. In this learning paradigm, new concepts are incrementally refined and integrated into a body of expertise, rather than being evaluated against a static notion of correctness. The domain chosen for this experimentation is that of learning to solve "algebra story problems." A model of acquiring problem-solving skills in this domain is described, including: representational structures for background knowledge, a problem-solving architecture, learning mechanisms, and the role of analogies in applying existing problem-solving abilities to novel problems. Examples of learning are given for representative instances of algebra story problems. After relating our views to the psychological literature, we outline the design of a teaching system. Finally, we stress the interdependence of learning and teaching and the synergistic effects of conducting both research efforts in parallel.
Integration of Abductive and Deductive Inference Diagnosis Model and Its Application in Intelligent Tutoring System
This dissertation presents a diagnosis model, the Integration of Abductive and Deductive Inference (IADI) diagnosis model, in the light of the cognitive processes of human diagnosticians. In contrast with other diagnosis models, which are based on enumerating, tracking, and classifying approaches, the IADI diagnosis model relies on different inferences to solve diagnosis problems. Studies of human diagnosticians' processes show that a diagnosis process is actually a hypothesizing process followed by a verification process. The IADI diagnosis model integrates abduction and deduction to simulate these processes: the abductive inference captures the plausible features of the hypothesizing process, while the deductive inference reflects the nature of the verification process. The IADI diagnosis model combines the two inference mechanisms with a structure analysis to form the three steps of diagnosis: mistake detection by structure analysis, misconception hypothesizing by abductive inference, and misconception verification by deductive inference. An intelligent tutoring system, the Recursive Programming Tutor (RPT), has been designed and developed to teach students the basic concepts of recursive programming. The RPT prototype illustrates the basic features of the IADI diagnosis approach, and also shows a hypertext-based tutoring environment and tutoring strategies such as concentrating diagnosis on the key steps of problem solving, organizing explanations by design plans, and incorporating the process of tutoring into diagnosis.
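The three diagnosis steps the abstract describes can be sketched as a small pipeline. The step decomposition follows the abstract; the answer format, misconception names, and predicate functions below are invented for the example and are not taken from the RPT system:

```python
# Hypothetical sketch of the three IADI steps: structure analysis,
# abductive hypothesizing, and deductive verification.

def detect_mistakes(solution, reference):
    """Step 1: structure analysis - align the student's solution steps
    against a reference solution and collect the mismatches."""
    return [step for step, ok in zip(solution, reference) if step != ok]

def hypothesize(mistakes, misconception_rules):
    """Step 2: abductive inference - propose every misconception whose
    known effects could explain at least one observed mistake."""
    return {cause for cause, effects in misconception_rules.items()
            if any(m in effects for m in mistakes)}

def verify(hypotheses, solution, predicts):
    """Step 3: deductive inference - keep only hypotheses whose
    predicted consequences actually appear in the solution."""
    return {h for h in hypotheses if predicts(h, solution)}

# Invented example: a recursion exercise with a missing base case.
student = ["no_base_case", "recursive_step"]
mistakes = detect_mistakes(student, ["base_case", "recursive_step"])
candidates = hypothesize(mistakes, {"forgets_termination": {"no_base_case"}})
confirmed = verify(candidates, student, lambda h, s: "no_base_case" in s)
```

Abduction over-generates by design; the deductive pass is what prunes implausible hypotheses, mirroring the hypothesize-then-verify behavior the abstract attributes to human diagnosticians.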
Modelling medical diagnostic processes
The thesis investigates the development of medical reasoning processes and how student modelling of such processes can be achieved in intelligent tutoring systems. The domain of orthopaedics was chosen for the research. The literature has shown that medical reasoning has been modelled mainly from an expert point of view. The research problem addressed is to model explicitly various levels of medical expertise in terms of reasoning strategies. The thesis reports on a system, DEMEREST (DEvelopment of MEdical REasoning STrategies), a developmental user model component which describes successive stages of medical reasoning and which could ultimately be part of a medical tutor. The system diagnoses physicians' reasoning strategies, determines the level of expertise and produces a plan corresponding to the application of these strategies. As a basis for doing so, a set of seven reasoning strategies was identified in the medical problem solving literature. These strategies are based on generalisation, specialisation, confirmation, elimination, problem refinement, hypothesis generation and anatomy. An empirical study was carried out to examine the development of these strategies. Protocols of ten physicians at various levels of expertise were collected and analysed. A number of interactions of strategies at different levels of expertise were identified in half of these protocols, and this information was used to construct a model of changes of strategies over time. Planning in artificial intelligence was used as a means of decomposing medical problem solving into a set of goals, the goals being associated with the reasoning strategies. By taking this approach, medical reasoning is viewed as a planning process. The remaining protocols from the empirical study were used to evaluate DEMEREST. The system was tested for its ability to determine a level of expertise for each protocol, model the reasoning strategies applied and their interactions, and generate a plan for each protocol.
The assessment of the overall performance of the system showed that it was successful. This assessment also helped to identify conceptual as well as implementation constraints of the prototype system. The main result of the research undertaken in this thesis is that the design of DEMEREST demonstrates the feasibility of modelling the development of medical reasoning strategies and its usefulness for student modelling.
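The idea of treating reasoning strategies as planning goals can be sketched as labelling each protocol event with the first strategy whose recogniser matches. The seven strategy names come from the abstract; the event format and recogniser functions are invented for illustration and are not DEMEREST's actual representation:

```python
# A minimal sketch of strategy attribution over a think-aloud protocol:
# each strategy is a goal, and a recogniser decides whether a protocol
# event can be attributed to it. Recognisers here are hypothetical.

STRATEGIES = ["generalisation", "specialisation", "confirmation",
              "elimination", "problem_refinement",
              "hypothesis_generation", "anatomy"]

def build_plan(protocol_events, recognisers):
    """Label each event with the first matching strategy; events no
    recogniser claims are left unlabelled (None)."""
    plan = []
    for event in protocol_events:
        for strategy in STRATEGIES:
            if recognisers.get(strategy, lambda e: False)(event):
                plan.append((event, strategy))
                break
        else:
            plan.append((event, None))
    return plan

# Invented orthopaedics-flavoured example.
recognisers = {"elimination": lambda e: "rule out" in e}
plan = build_plan(["rule out fracture", "order an x-ray"], recognisers)
```

A sequence of such labels approximates a plan in the thesis's sense: the distribution and interaction of strategies over the protocol is what distinguishes levels of expertise.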
Understanding analogical reasoning: viewpoints from psychology and related disciplines
Analogy and metaphor have a long history of study in linguistics, education, philosophy and psychology. Consensus over what analogy is or how analogy functions in language and thought, however, has been elusive. This paper, the first in a two-part series, examines these various research traditions, attempting to bring out major lines of agreement over the role of analogy in individual human experience. As well as being a general literature review which may be helpful for newcomers to the study of analogy, this paper attempts to extract from these literatures existing theories, models and concepts which may be interesting or useful for computational studies of analogical reasoning.
OFMTutor: An operator function model intelligent tutoring system
The design, implementation, and evaluation of an Operator Function Model intelligent tutoring system (OFMTutor) is presented. OFMTutor is intended to provide intelligent tutoring in the context of complex dynamic systems for which an operator function model (OFM) can be constructed. The human operator's role in such complex, dynamic, and highly automated systems is that of a supervisory controller whose primary responsibilities are routine monitoring and fine-tuning of system parameters and occasional compensation for system abnormalities. The automated systems must support the human operator. One potentially useful form of support is the use of intelligent tutoring systems to teach the operator about the system and how to function within that system. Previous research on intelligent tutoring systems (ITS) is considered. The proposed design for OFMTutor is presented, and an experimental evaluation is described.
3. Toward a Cognitive Theory for the Measurement of Achievement
INTRODUCTION
Given the demands for higher levels of learning in our schools and the press for education in the skilled trades, the professions, and the sciences, we must develop more powerful and specific methods for assessing achievement. We need forms of assessment that educators can use to improve educational practice and to diagnose individual progress by monitoring the outcomes of learning and training. Compared to the well-developed technology for aptitude measurement and selection testing, however, the measurement of achievement and diagnosis of learning problems is underdeveloped. This is because the correlational models that support prediction are insufficient for the task of prescribing remediation or other instructional interventions. Tests can predict failure without a theory of what causes success, but intervening to prevent failure and enhance competence requires deeper understanding.
The study of the nature of learning is therefore integral to the assessment of achievement. We must use what we know about the cognitive properties of acquired proficiency and about the structures and processes that develop as a student becomes competent in a domain. We know that learning is not simply a matter of the accretion of subject-matter concepts and procedures; rather, it consists of organizing and restructuring this information to enable skillful procedures and processes of problem representation and solution. Somehow, tests must be sensitive to how well this structuring has proceeded in the student being tested.
The usual forms of achievement tests are not effective diagnostic aids. In order for tests to become usefully prescriptive, they must identify performance components that facilitate or interfere with current proficiency and the attainment of eventual higher levels of achievement. Curriculum analysis of the content and skill to be learned in a subject matter does not automatically provide information about how students attain competence or about the difficulties they meet in attaining it. An array of subject-matter subtests differing in difficulty is not enough for useful diagnosis. Rather, qualitative indicators of specific properties of performance that influence learning and characterize levels of competence need to be identified.
In order to ascertain the critical differences between successful and unsuccessful student performance, we need to appraise the knowledge structures and cognitive processes that reveal degrees of competence in a field of study. We need a fuller understanding of what to test and how test items relate to target knowledge. In contrast, most current testing technology is post hoc and has focused on what to do after test items are constructed. Analysis of item difficulty, development of discrimination indices, scaling and norming procedures, and analysis of test dimensions and factorial composition take place after the item is written. A theory of acquisition and performance is needed before and during item design.
Designing high fidelity simulation to maximize student registered nursing decision-making ability
The current healthcare environment is a complex system of patients, procedures, and equipment that strives to deliver safe and effective medical care. High fidelity simulation (HFS) provides healthcare educators with a tool to create safety-conscious practitioners utilizing an environment that replicates practice without risk to patients. Using HFS learning opportunities to refine a learner's clinical decision-making skills under time pressure and high-stakes outcomes could provide new opportunities for training the healthcare workforce of the future. This design-based research project explored how to structure HFS training to facilitate the development of decision-making in second-semester Registered Nursing learners. Borrowing from the research base of aviation and the military, a framework of Situation Awareness was used to define decision-making skills. Using a naturalistic decision-making approach, the research sought to understand how the design of the HFS learning event impacted the ability of participants to demonstrate behaviors of Situation Awareness. Findings of this study demonstrated that design-based research is a powerful tool to create a rich understanding of the high fidelity simulation learning experience. The results also supported the work of Jeffries (2005), reiterating that HFS design must be created using strong pedagogical principles that support specific learning outcomes. Particular attention should be focused on maintenance of fidelity, understanding complexity, and scaffolding learning opportunities through a multi-phased approach that minimally includes debriefing. The research related to this small group suggests that the briefing stage of HFS learning should be further explored for its influence on learning in HFS. The influence of the facilitator/faculty on the HFS was emphasized in this research, suggesting that faculty development would be important for use of this new tool.
Additional implications of the research suggest that high fidelity simulation has a role in team training and in the development of communication skills.