294 research outputs found

    Simultaneous optimization of decisions using a linear utility function

    The purpose of this paper is to simultaneously optimize decision rules for combinations of elementary decisions. This approach yields rules that make more efficient use of the data than optimizing those decisions separately. The framework for the approach is derived from empirical Bayesian theory. To illustrate the approach, two elementary decisions, selection and mastery decisions, are combined into a simple decision network. A linear utility structure is assumed. Decision rules are derived for both quota-free and quota-restricted selection-mastery decisions for several subpopulations. An empirical example of instructional decision making in an individual study system concludes the paper. The example involves 43 freshman medical students (27 disadvantaged and 16 advantaged with respect to elementary medical knowledge). Both the selection and mastery tests consisted of 17 free-response items on elementary medical knowledge, with test scores ranging from 0 to 100. The treatment consisted of a computer-aided instructional program.
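A consequence of the linear utility structure described above is that expected utility depends only on the posterior mean of the true score, so each elementary decision reduces to a simple cutoff rule. A minimal sketch in Python; the slope and intercept values are purely illustrative assumptions, not the paper's values:

```python
# Hedged sketch of a mastery decision under a linear utility structure.
# Because utility is linear in the true score T, E[a*T + b] = a*E[T] + b,
# so the rule only needs the posterior mean of T.

def expected_utility(posterior_mean, slope, intercept):
    """Expected linear utility: E[a*T + b] = a*E[T] + b."""
    return slope * posterior_mean + intercept

def mastery_decision(posterior_mean,
                     advance=(1.0, -60.0),   # hypothetical (slope, intercept)
                     retain=(0.0, 0.0)):     # retaining yields a constant utility
    """Advance the student whenever advancing has the higher expected utility."""
    eu_advance = expected_utility(posterior_mean, *advance)
    eu_retain = expected_utility(posterior_mean, *retain)
    return "advance" if eu_advance >= eu_retain else "retain"
```

With these illustrative coefficients the two expected utilities cross at a posterior mean of 60, which becomes the effective cutoff score on the 0-100 scale.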

    The use of decision theory in the Minnesota Adaptive Instructional System

    The application of the Minnesota Adaptive Instructional System (MAIS) decision procedure of R. D. Tennyson et al. (1975, 1977) is examined. The MAIS is a computer-based adaptive instructional system. The problem of determining the optimal number of interrogatory examples in the MAIS can be formalized as a problem of Bayesian decision making. Two features of the MAIS decision procedure can be improved by using other results from this decision-theoretic approach. The first feature concerns the determination of the loss ratio "R"; a lottery method for assessing this ratio empirically is discussed. The second feature concerns the choice of the loss function. It is argued that in many situations the threshold loss function assumed in the MAIS is an unrealistic representation of the loss actually incurred, and a linear utility function is proposed to meet the objections to threshold loss. Whether these two innovations really improve the present decision component of the MAIS, in terms of student performance on posttests, learning time, and amount of instruction, must be decided on the basis of experiments; research projects in these areas have already been planned. One table and one figure illustrate the decision theory approach. A 38-item list of references is included.
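The contrast between threshold loss and the proposed linear alternative can be sketched as follows. The cutoff and slope values are hypothetical, chosen only to show why threshold loss penalizes a near-miss as heavily as a gross misclassification while linear loss scales with the size of the error:

```python
# Hedged sketch comparing two loss functions for a mastery decision.
# All numeric constants are illustrative assumptions, not the paper's values.

def threshold_loss(true_score, decision, cutoff, l_fp=1.0, l_fn=2.0):
    """Constant penalty for any misclassification, regardless of how
    close the true score is to the cutoff."""
    if decision == "advance" and true_score < cutoff:
        return l_fp   # false advance
    if decision == "retain" and true_score >= cutoff:
        return l_fn   # false retain
    return 0.0

def linear_loss(true_score, decision, cutoff, slope=0.1):
    """Loss grows linearly with the distance between true score and cutoff,
    so a near-miss is penalized far less than a gross error."""
    if decision == "advance" and true_score < cutoff:
        return slope * (cutoff - true_score)
    if decision == "retain" and true_score >= cutoff:
        return slope * (true_score - cutoff)
    return 0.0
```

Under threshold loss, advancing a student with a true score of 59 against a cutoff of 60 costs exactly as much as advancing one with a true score of 30; under linear loss the former costs 0.1 and the latter 3.0.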

    A minimax procedure in the context of sequential mastery testing


    Applications of decision theory to computer-based adaptive instructional systems

    This paper considers applications of decision theory to the problem of instructional decision making in computer-based adaptive instructional systems, using the Minnesota Adaptive Instructional System (MAIS) as an example. The first section indicates how the problem of selecting the appropriate amount of instruction in the MAIS can be situated within the general framework of empirical Bayesian decision theory. The linear loss model and the classical test model are discussed in this context. The second section describes six characteristics essential to effective computerized adaptive instructional systems: (1) initial diagnosis and prescription; (2) the sequential character of the instructional decision-making process; (3) an appropriate amount of instruction for each student; (4) the sequence of instruction; (5) instructional time control; and (6) advisement of learning need. It is shown that all but the sequence of instruction could be improved in the MAIS with the proposed extensions. Several new lines of research arising from the application of psychometric theory to the decision component of the MAIS are reviewed.

    A simultaneous approach to optimizing treatment assignments with mastery scores

    An approach to the simultaneous optimization of assignments of subjects to treatments, each followed by a mastery test, is presented within the framework of Bayesian decision theory. The focus is on demonstrating how rules for the simultaneous optimization of sequences of decisions can be found. The main advantages of the simultaneous approach over the separate approach are the more efficient use of data and the possibility of using more realistic utility structures. The utility structure used in this combined decision problem is a linear utility function. Decision rules are derived for quota-free as well as quota-restricted assignment situations when several culturally biased subpopulations of subjects are to be distinguished. The procedures are demonstrated with an empirical example of instructional decision making in an individualized study system that combines two elementary decisions.
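For the quota-restricted case, a generic selection rule admits the subjects with the highest expected utilities until the quota is filled. A minimal sketch of that idea; it is not the paper's derived rule, which additionally accounts for subpopulation membership and the downstream mastery decision:

```python
# Hedged sketch of quota-restricted selection: rank subjects by expected
# utility of admission and keep at most `quota` of them.

def quota_restricted_selection(expected_utilities, quota):
    """Return the (sorted) indices of the at most `quota` subjects with the
    highest expected utility. Ties are broken by original order, since
    Python's sort is stable."""
    ranked = sorted(range(len(expected_utilities)),
                    key=lambda i: expected_utilities[i],
                    reverse=True)
    return sorted(ranked[:quota])
```

In the quota-free case the same expected utilities would instead be compared against a fixed cutoff, admitting every subject whose expected utility is non-negative.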

    Adaptive mastery testing using the Rasch model and Bayesian sequential decision theory

    A version of sequential mastery testing is studied in which response behavior is modeled by an item response theory (IRT) model. First, a general theoretical framework is sketched that combines Bayesian sequential decision theory with item response theory. A discussion follows on how IRT-based sequential mastery testing can be generalized to adaptive item and testlet selection rules, i.e., to a situation in which the choice of the next item or testlet to be administered is optimized using the information from previous responses. The performance of IRT-based sequential and adaptive sequential mastery testing is studied in a number of simulations using the Rasch model. The possibilities and difficulties of applying the approach within the framework of the two-parameter and three-parameter logistic models are also discussed.

    A Comparison of Parallelism for Computer-Based Learning Environments

    In this paper we discuss an experiment carried out with a prototype designed in conformity with the concept of parallelism and the Parallel Instruction (PI) theory. We built this prototype with five different interfaces and ran an empirical study in which 18 participants completed an abstract task. The five basic designs were based on the PI-theory hypothesis that, for solving tasks on screens, all task-relevant information must be in view on the computer monitor, as clearly as possible. The condition with two parallel frames and the condition with one long web page turned out to be the best designs for this type of task, better than the window versions we normally use for our computer simulations on the web. We not only describe the results of the abstract task in the five conditions, but also discuss the results from the perspective of concrete, realistic tasks with computer simulations. The interface with two parallel frames is the best solution there, but the interface with long web pages ('virtual parallelism') is also a popular choice in practice for realistic tasks.

    Prototype 3D Real-Time Imaging System Based on a Sparse PZT Spiral Array


    Automatic Max-Likelihood Envelope Detection Algorithm for Quantitative High-Frame-Rate Ultrasound for Neonatal Brain Monitoring

    Objective: Post-operative brain injury in neonates may result from disturbed cerebral perfusion, but accurate peri-operative monitoring is lacking. High-frame-rate (HFR) cerebral ultrasound can visualize and quantify flow in all detectable vessels using spectral Doppler; however, automated quantification in small vessels is challenging because of low signal amplitude. We developed an automatic envelope detection algorithm for HFR pulsed wave spectral Doppler signals, enabling quantitative parameter maps of the neonatal brain during and after surgery. Methods: HFR ultrasound data from high-risk neonatal surgeries were recorded with a custom HFR mode (frame rate = 1000 Hz) on a Zonare ZS3 system. A pulsed wave Doppler spectrogram was calculated for each pixel containing blood flow in the image, and the spectral peak velocity was tracked using a max-likelihood estimation algorithm over signal and noise regions in the spectrogram, where the most likely cross-over point marks the blood flow velocity. The resulting peak systolic velocity (PSV), end-diastolic velocity (EDV) and resistivity index (RI) were compared with other detection schemes, manual tracking, and RIs from regular pulsed wave Doppler measurements in 10 neonates. Results: Envelope detection was successful in both high- and low-quality arterial and venous flow spectrograms. Our technique had the lowest root mean square error for EDV, PSV and RI (0.46 cm/s, 0.53 cm/s and 0.15, respectively) when compared with manual tracking. There was good agreement between the clinical pulsed wave Doppler RI and the HFR measurement, with a mean difference of 0.07. Conclusion: The max-likelihood algorithm is a promising approach to accurate, automated cerebral blood flow monitoring with HFR imaging in neonates.
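The cross-over idea can be sketched as a change-point search in a single spectral column: under a Gaussian model with equal variances, the maximum-likelihood split between the signal region and the noise region is the one that minimizes the total squared error around the two segment means. This toy version is an assumption-laden illustration of that principle, not the authors' implementation:

```python
# Hedged sketch: find the bin where one column of a Doppler spectrogram
# most likely switches from signal (high power) to noise (low power).
# Under a Gaussian model with equal variances, the max-likelihood split
# is the one minimizing total squared error around the two segment means.

def ml_crossover(column):
    """Return the index k (1 <= k < len(column)) splitting the column into
    a signal segment column[:k] and a noise segment column[k:] with the
    smallest combined squared error around each segment's mean."""
    best_k, best_err = 1, float("inf")
    n = len(column)
    for k in range(1, n):
        signal, noise = column[:k], column[k:]
        mean_s = sum(signal) / len(signal)
        mean_n = sum(noise) / len(noise)
        err = (sum((v - mean_s) ** 2 for v in signal)
               + sum((v - mean_n) ** 2 for v in noise))
        if err < best_err:
            best_k, best_err = k, err
    return best_k
```

In an actual envelope detector this split index, converted to a velocity via the Doppler equation, would be the tracked peak velocity for that time instant; repeating it per column yields the envelope from which PSV, EDV and RI are read off.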

    Ultrasound-triggered local release of lipophilic drugs from a novel polymeric ultrasound contrast agent

    The advantage of ultrasound contrast agents (UCAs) as drug delivery systems is the ability to non-invasively control the local, triggered release of a drug or gene. In this study we designed and characterized a novel UCA-based drug delivery system, built on polymer-shelled microcapsules filled with a mixture of gas and oil, for the local delivery of lipophilic drugs.