Architecture and Design of Medical Processor Units for Medical Networks
This paper introduces analogical and deductive methodologies for the design of
medical processor units (MPUs). From a study of the evolution of numerous
earlier processors, we derive the basis for the architecture of MPUs. These
specialized processors perform unique medical functions encoded as medical
operational codes (mopcs). From a pragmatic perspective, MPUs function much
like CPUs. Both processors have unique operation codes that command the
hardware to perform a distinct chain of sub-processes upon operands and
generate a specific result unique to the opcode and the operand(s). In medical
environments, the MPU decodes the mopcs, executes a series of medical
sub-processes, and sends secondary commands to the medical machine. Whereas
the operands in a typical computer system are numerical and logical entities,
the operands in a medical machine are objects such as patients, blood samples,
tissues, operating rooms, medical staff, medical bills, and patient payments.
We follow the functional overlap between the two processors and evolve the
design of medical computer systems and networks.
Factors shaping the evolution of electronic documentation systems
The main goal is to prepare the space station technical and managerial structure for likely changes in the creation, capture, transfer, and utilization of knowledge. By anticipating advances, the design of Space Station Project (SSP) information systems can be tailored to facilitate a progression of increasingly sophisticated strategies as the space station evolves. Future generations of advanced information systems will use increases in power to deliver environmentally meaningful, contextually targeted, interconnected data (knowledge). The concept of a Knowledge Base Management System emerges when the problem is focused on how information systems can perform such a conversion of raw data. Such a system would include traditional management functions for large space databases. Added artificial intelligence features might encompass co-existing knowledge representation schemes; effective control structures for deductive, plausible, and inductive reasoning; means for knowledge acquisition, refinement, and validation; explanation facilities; and dynamic human intervention. The major areas covered include: alternative knowledge representation approaches; advanced user interface capabilities; computer-supported cooperative work; the evolution of information system hardware; standardization, compatibility, and connectivity; and organizational impacts of information-intensive environments.
Predictions for Scientific Computing Fifty Years from Now
This essay is adapted from a talk given June 17, 1998 at the conference "Numerical Analysis and Computers - 50 Years of Progress", held at the University of Manchester, England, in commemoration of the 50th anniversary of the Mark 1 computer.
From SpaceStat to CyberGIS: Twenty Years of Spatial Data Analysis Software
This essay assesses the evolution of the way in which spatial data analytical methods have been incorporated into software tools over the past two decades. It is part retrospective and part prospective, going beyond a historical review to outline some ideas about important factors that drove the software development, such as methodological advances, the open source movement, and the advent of the internet and cyberinfrastructure. The review highlights activities carried out by the author and his collaborators and uses SpaceStat, GeoDa, PySAL, and recent spatial analytical web services developed at the ASU GeoDa Center as illustrative examples. It outlines a vision for a spatial econometrics workbench as an example of the incorporation of spatial analytical functionality in a cyberGIS.
Sail intelligent terminal evaluation
Engineering assessments, recommendations, and the equipment necessary to solve the operational problems are described, and the operational flexibility of the intelligent terminal facility is extended. The following capabilities were considered: (1) the operation of at least two D/D stations and one remote graphics terminal simultaneously; (2) the capability to run plotter, AIDS, and FORTRAN programs simultaneously; (3) simultaneous use of system utility routines by D/D stations and the remote graphics terminal; (4) the capability to provide large-volume hardcopy of data and graphics; and (5) the capability to eliminate, or at least ease, the current operation/programming problems and related labor costs. The overall intelligent terminal development and the plans guiding the analysis and equipment acquisitions were studied, and the assessments and analyses performed are also summarized.
Microcomputer Intelligence for Technical Training (MITT): The evolution of an intelligent tutoring system
Microcomputer Intelligence for Technical Training (MITT) uses Intelligent Tutoring System (ITS) technology to deliver diagnostic training in a variety of complex technical domains. Over the past six years, MITT technology has been used to develop training systems for nuclear power plant diesel generator diagnosis, Space Shuttle fuel cell diagnosis, and message processing diagnosis for the Minuteman missile. Presented here is an overview of the MITT system, describing the evolution of the MITT software and the benefits of using the MITT system.
A review of High Performance Computing foundations for scientists
The increase of existing computational capabilities has made simulation
emerge as a third discipline of science, lying midway between the experimental
and purely theoretical branches [1, 2]. Simulation enables the evaluation of
quantities which would otherwise be inaccessible, helps to improve
experiments, and provides new insights into the systems being analysed [3-6].
Knowing the fundamentals of computation can be very useful for scientists, for
it can help them to improve the performance of their theoretical models and
simulations. This review includes some technical essentials useful to this
end, and it is devised as a complement for researchers whose education is
focused on scientific issues rather than on technological aspects. In this
document we attempt to discuss the fundamentals of High Performance Computing
(HPC) [7] in a way that is easy to understand without much prior background.
We sketch the way standard computers and supercomputers work, discuss
distributed computing, and cover essential aspects to take into account when
running scientific calculations on computers.
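One performance fundamental of the kind such a review covers (assumed here as an illustration; the abstract does not name it) is Amdahl's law, which bounds the speedup of a parallel program by its serial fraction:

```python
# Amdahl's law: if a fraction s of a program's runtime is inherently serial,
# the speedup achievable on n processors is at most 1 / (s + (1 - s) / n).

def amdahl_speedup(serial_fraction, n_procs):
    """Upper bound on parallel speedup for a given serial fraction."""
    return 1.0 / (serial_fraction + (1.0 - serial_fraction) / n_procs)

# Even with 1024 processors, a 5% serial fraction caps speedup below 20x.
print(round(amdahl_speedup(0.05, 1024), 2))  # -> 19.64
```

This is the sort of result that explains why, when running scientific calculations, profiling and removing serial bottlenecks often matters more than adding processors.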
Design activities: how to analyze cognitive effort associated to cognitive treatments?
Working memory issues are important in many real activities. Measuring cognitive effort (or mental load) has therefore been a main research topic in cognitive ergonomics for years, though no consensus method for studying this aspect has been proposed. In addition, we argue that cognitive effort has to be related to an analysis of the evolution of cognitive processes, which has been called "time processing". Towards this end, we present and discuss paradigms that have been used for years to study writing activities and, in the experiments reported in this paper, for studying design activities such as computer-graphic tasks or web site design.
Multimedia Technologies and Virtual Organizing of Learning
This paper addresses the question of which uses of multimedia technologies in teaching processes are efficient, and of the conditions under which these technologies are required in online learning. The background of this paper is an exploration of online master's programs, given the burgeoning interest in this emerging form of distance learning. Rigorous analysis and careful measurement of the communication involved are supported by empirical data. This analysis provides an early window into several communication processes and tasks that occur in the virtual context of learning using multimedia technologies. Keywords: multimedia technologies, virtual organization, online learning, communication, effectiveness of learning.