13,169 research outputs found

    Study of nonstationary random process theory Final report, 1 Jul. 1966 - 30 Apr. 1967

    Nonstationary random processes in nonreal-time applications - theories for nonreal-time correlation and spectrum analysis
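
    As a rough illustration of what offline (nonreal-time) correlation analysis of a nonstationary record can involve, the Python sketch below estimates a windowed autocorrelation on a chirp signal. It is a generic example under assumed parameters, not taken from the report itself.

```python
# Illustrative only: a windowed (short-time) autocorrelation estimate, one common
# way to analyze a nonstationary record offline; not taken from the report above.
import numpy as np

def short_time_autocorr(x, win_len, hop, max_lag):
    """Estimate the lagged autocovariance R(t, tau) on sliding windows of x."""
    frames = []
    for start in range(0, len(x) - win_len + 1, hop):
        seg = x[start:start + win_len]
        seg = seg - seg.mean()                      # remove the local mean
        r = [np.dot(seg[:win_len - k], seg[k:]) / (win_len - k)
             for k in range(max_lag + 1)]           # per-lag sample autocovariance
        frames.append(r)
    return np.array(frames)                         # shape: (n_windows, max_lag + 1)

# Example: a chirp is nonstationary; its local correlation structure drifts with time.
t = np.arange(0, 4, 1e-3)
x = np.sin(2 * np.pi * (1 + 2 * t) * t) + 0.1 * np.random.randn(t.size)
R = short_time_autocorr(x, win_len=512, hop=256, max_lag=64)
```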

    Synthesis methods for manual aerospace control systems with applications to SST design

    Synthesis methods for manual aerospace control systems using digital programming and man-machine performance data with application to supersonic transport design

    Study of Nonstationary Random Process Theory

    Nonstationary random process theory

    Correlation of spray dropsize distribution and injector variables Interim report

    Correlation of spray drop size distribution and injector variables

    PASCAL/48 reference manual

    PASCAL/48 is a programming language for the Intel MCS-48 series of microcomputers. In particular, it can be used with the Intel 8748. It is designed to allow the programmer to control most of the instructions being generated and the allocation of storage. The language can be used instead of assembly language in most applications while allowing the user the necessary degree of control over hardware resources. Although it is called PASCAL/48, the language differs in many ways from PASCAL. The program structure and statements of the two languages are similar, but the expression mechanism and data types are different. The PASCAL/48 cross-compiler is written in PASCAL and runs on the CDC CYBER NOS system. It generates object code in Intel hexadecimal format that can be used to program the MCS-48 series of microcomputers. This reference manual defines the language, describes the predeclared procedures, lists error messages, illustrates its use, and includes language syntax diagrams.
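
    The abstract notes that the cross-compiler emits object code in Intel hexadecimal format. As a hedged illustration of that output format only (the sample bytes below are made up, not produced by PASCAL/48), here is a minimal Python sketch of how one Intel HEX record is laid out:

```python
# A minimal sketch of the Intel HEX record layout (the object format the
# PASCAL/48 cross-compiler targets). The sample data bytes below are made up.

def intel_hex_record(address, data, record_type=0x00):
    """Build one Intel HEX record: ':' + count + address + type + data + checksum."""
    body = [len(data), (address >> 8) & 0xFF, address & 0xFF, record_type, *data]
    checksum = (-sum(body)) & 0xFF            # two's complement of the byte sum
    return ":" + "".join(f"{b:02X}" for b in body) + f"{checksum:02X}"

print(intel_hex_record(0x0000, [0x23, 0x55, 0xB8]))   # hypothetical code bytes
print(intel_hex_record(0x0000, [], record_type=0x01)) # end-of-file record
```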

    A translator writing system for microcomputer high-level languages and assemblers

    In order to implement high-level languages whenever possible, a translator writing system of advanced design was developed. It is intended for routine production use by many programmers working on different projects. As well as a fairly conventional parser generator, it includes a system for the rapid generation of table-driven code generators. The parser generator was developed from a prototype version. The translator writing system includes various tools for the management of the source text of a compiler under construction. In addition, it supplies various default source code sections so that its output is always compilable and executable. The system thereby encourages iterative enhancement as a development methodology by ensuring an executable program from the earliest stages of a compiler development project. The translator writing system includes the PASCAL/48 compiler, three assemblers, and two compilers for a subset of HAL/S.
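
    The abstract mentions the rapid generation of table-driven code generators. The toy Python sketch below is not the system described here; it only illustrates the general table-driven idea, with simplified 8048-style mnemonics as placeholder templates:

```python
# Not the system described above; just a toy illustration of the table-driven
# idea: each intermediate-code operator indexes a template that expands to
# simplified, 8048-style assembly text.

CODE_TABLE = {
    "load":  ["MOV A, {src}"],
    "add":   ["ADD A, {src}"],
    "store": ["MOV {dst}, A"],
}

def generate(ir):
    """Expand a list of (op, operands) tuples using the template table."""
    out = []
    for op, operands in ir:
        for template in CODE_TABLE[op]:
            out.append(template.format(**operands))
    return out

ir = [("load", {"src": "R0"}), ("add", {"src": "R1"}), ("store", {"dst": "R2"})]
print("\n".join(generate(ir)))
```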

    Disentanglement and Decoherence without dissipation at non-zero temperatures

    Decoherence is well understood, in contrast to disentanglement. According to common lore, irreversible coupling to a dissipative environment is the mechanism for loss of entanglement. Here, we show that, on the contrary, disentanglement can in fact occur at large enough temperatures T even for vanishingly small dissipation (as we have shown previously for decoherence). However, whereas the effect of T on decoherence increases exponentially with time, the effect of T on disentanglement is constant for all times, reflecting a fundamental difference between the two phenomena. Also, the possibility of disentanglement at a particular T increases with decreasing initial entanglement. Comment: 3 pages

    System balance analysis for vector computers

    The availability of vector processors capable of sustaining computing rates of 10^8 arithmetic results per second raised the question of whether peripheral storage devices representing current technology can keep such processors supplied with data. By examining the solution of a large banded linear system on these computers, it was found that, even under ideal conditions, the processors will frequently be waiting for problem data.
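
    The kind of balance check the paper performs can be illustrated with a back-of-the-envelope comparison of arithmetic demand against storage bandwidth; the numbers below are placeholders, not the paper's data:

```python
# Placeholder numbers, not the paper's data: a back-of-the-envelope balance check
# comparing arithmetic demand against peripheral-storage supply.

results_per_second = 1e8          # sustained arithmetic results per second
bytes_per_result   = 8            # one 64-bit operand fetched per result (assumed)
io_bandwidth       = 5e6          # bytes per second from peripheral storage (assumed)

demand = results_per_second * bytes_per_result   # bytes/s the pipeline wants
busy_fraction = min(1.0, io_bandwidth / demand)  # fraction of time the CPU can be fed

print(f"data demand: {demand:.2e} B/s, supply: {io_bandwidth:.2e} B/s")
print(f"processor busy at most {busy_fraction:.1%} of the time; idle otherwise")
```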

    A Research-Based Curriculum for Teaching the Photoelectric Effect

    Physics faculty consider the photoelectric effect important, but many erroneously believe it is easy for students to understand. We have developed a curriculum on this topic including an interactive computer simulation, interactive lectures with peer instruction, and conceptual and mathematical homework problems. Our curriculum addresses established student difficulties and is designed to achieve two learning goals: for students to be able to (1) correctly predict the results of photoelectric effect experiments, and (2) describe how these results lead to the photon model of light. We designed two exam questions to test these learning goals. Our instruction leads to better student mastery of the first goal than either traditional instruction or previous reformed instruction, with approximately 85% of students correctly predicting the results of changes to the experimental conditions. On the question designed to test the second goal, most students are able to correctly state both the observations made in the photoelectric effect experiment and the inferences that can be made from these observations, but are less successful in drawing a clear logical connection between the observations and inferences. This is likely a symptom of a more general lack of the reasoning skills needed to logically draw inferences from observations. Comment: submitted to American Journal of Physics
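
    The first learning goal, predicting experimental outcomes, rests on the standard relation K_max = hf - phi. The short sketch below is a generic worked example of that relation, not material from the curriculum itself; the work-function value is an assumed textbook figure.

```python
# A generic worked example of the standard photoelectric relation
# K_max = h*f - phi (not taken from the curriculum described above).

H = 6.626e-34          # Planck constant, J*s
E_CHARGE = 1.602e-19   # elementary charge, C

def max_kinetic_energy_eV(wavelength_nm, work_function_eV):
    """Return K_max in eV, or 0 if the photon energy is below the work function."""
    freq = 3.0e8 / (wavelength_nm * 1e-9)          # photon frequency, Hz
    photon_eV = H * freq / E_CHARGE
    return max(0.0, photon_eV - work_function_eV)

# Example: 400 nm light on a sodium surface (work function ~2.28 eV, assumed value).
k_max = max_kinetic_energy_eV(400, 2.28)
print(f"K_max = {k_max:.2f} eV, stopping potential = {k_max:.2f} V")
```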

    Renosterveld Conservation in South Africa: A Case Study for Handling Uncertainty in Knowledge-Based Neural Networks for Environmental Management

    This work presents an artificial intelligence method for the development of decision support systems for environmental management and demonstrates its strengths using an example from the domain of biodiversity and conservation biology. The approach takes into account local expert knowledge together with collected field data about plant habitats in order to identify areas which show potential for conserving thriving areas of Renosterveld vegetation and areas that are best suited for agriculture. The available data is limited and cannot be adequately explained by expert knowledge alone. The paradigm combines expert knowledge about the local conditions with the collected ground truth in a knowledge-based neural network. The integration of symbolic knowledge with artificial neural networks is becoming an increasingly popular paradigm for solving real-world applications. The paradigm provides means for using prior knowledge to determine the network architecture, to program a subset of weights to induce a learning bias which guides network training, and to extract knowledge from trained networks; it thus provides a methodology for dealing with uncertainty in the prior knowledge. The role of neural networks then becomes that of knowledge refinement. The open question of how to determine the strength of the inductive bias of programmed weights is addressed by presenting a heuristic which takes the network architecture and training algorithm, the prior knowledge, and the training data into consideration.
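
    As a rough sketch of the weight-programming idea described above (not the paper's actual heuristic, network, or data), the numpy example below seeds one hidden unit's weights from a symbolic rule with a chosen bias strength and then refines all weights by gradient descent:

```python
# Illustrative only (numpy, made-up data): seed part of a small network's weights
# from a symbolic rule, then refine everything by gradient descent. The strength
# `omega` of the programmed weights plays the role of the inductive bias above.
import numpy as np

rng = np.random.default_rng(0)
X = rng.random((200, 4))                                 # hypothetical habitat features
y = ((X[:, 0] > 0.5) & (X[:, 2] < 0.4)).astype(float)    # stand-in ground truth

omega = 3.0                                  # bias strength for rule-derived weights
W1 = rng.normal(0, 0.1, (4, 3))
W1[0, 0], W1[2, 0] = omega, -omega           # crude encoding of "x0 high AND x2 low"
b1 = np.zeros(3)
W2 = rng.normal(0, 0.1, (3, 1))
b2 = np.zeros(1)

sigmoid = lambda z: 1.0 / (1.0 + np.exp(-z))
lr = 0.5
for _ in range(2000):                        # plain batch gradient descent
    h = sigmoid(X @ W1 + b1)
    p = sigmoid(h @ W2 + b2).ravel()
    grad_out = (p - y)[:, None] / len(y)     # cross-entropy gradient at the output
    grad_h = grad_out @ W2.T * h * (1 - h)   # backpropagate to the hidden layer
    W2 -= lr * (h.T @ grad_out); b2 -= lr * grad_out.sum(0)
    W1 -= lr * (X.T @ grad_h);   b1 -= lr * grad_h.sum(0)

print("training accuracy:", ((p > 0.5) == y).mean())
```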