
    A pascal compiler for PDP 11 minicomputers

    In this paper the development of a cross-compiler running on the central computing facility is described. The compiler transforms PASCAL source code into object code for the PDP 11 family. The arguments for higher-level languages on minicomputers and the choice made for PASCAL are discussed. It is shown that only a minor effort in terms of manpower is required if such a development is based on an existing compiler that is suited to the purpose of adaptation. Even without large amounts of optimization, the code produced is both compact and efficient. Some attention is paid to requirements that should be fulfilled in portable compilers. The paper ends with a discussion of some strong and weak points of the PDP 11 architecture.

    A study of systems implementation languages for the POCCNET system

    The results are presented of a study of systems implementation languages for the Payload Operations Control Center Network (POCCNET). Criteria are developed for evaluating the languages, and fifteen existing languages are evaluated on the basis of these criteria.

    In the Beginning... A Legacy of Computing at Marshall University

    This book provides a brief history of the early computing technology at Marshall University, Huntington, W.Va., over the forty years 1959-1999, before the move to Intel- and Windows-based servers. After installation of an IBM Accounting Machine in 1959, which arguably does not fit the modern definition of a computer, the first true computer arrived in 1963 and was installed in a room below the Registrar’s office. For the next twenty years several departments ordered their own midrange standalone systems to fit their individual departmental requirements. These represented different platforms from different vendors, and were not connected to each other. At the same time, the Marshall Computer Center developed an interconnected, multi-processor environment. With the software problems of the year 2000, and the IT move to the new Drinko Library, several systems were scrapped. New systems were installed on PC server platforms. This book includes images of the various systems, several comments from users, and hardware and software descriptions.

    Implications of Structured Programming for Machine Architecture

    Based on an empirical study of more than 10,000 lines of program text written in a GOTO-less language, a machine architecture specifically designed for structured programs is proposed. Since assignment, CALL, RETURN, and IF statements together account for 93 percent of all executable statements, special care is given to ensure that these statements can be implemented efficiently. A highly compact instruction encoding scheme is presented, which can reduce program size by a factor of 3. Unlike a Huffman code, which utilizes variable-length fields, this method uses only fixed-length (1-byte) opcode and address fields. The most frequent instructions consist of a single 1-byte field. As a consequence, instruction decoding time is minimized, and the machine is efficient with respect to both space and time.
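    The core idea of the abstract, giving the shortest fixed-length encodings to the statistically dominant statement types, can be sketched in a few lines. This is our own illustrative Python sketch, not the paper's actual scheme: the statement names, frequency counts, and the 3-byte baseline format are invented for the example.

```python
from collections import Counter

def build_opcode_table(stmt_counts, one_byte_slots):
    """Assign 1-byte encodings to the most frequent statement types;
    all remaining types fall back to a 2-byte escape encoding.
    Returns {statement: encoded length in bytes}."""
    table = {}
    for rank, (stmt, _) in enumerate(stmt_counts.most_common()):
        table[stmt] = 1 if rank < one_byte_slots else 2
    return table

# Hypothetical static frequencies, loosely echoing the abstract's point that
# assignment, CALL, RETURN, and IF dominate executable statements.
counts = Counter({"assign": 470, "call": 250, "if": 170, "return": 40, "goto_like": 5})

table = build_opcode_table(counts, one_byte_slots=3)
total = sum(table[s] * n for s, n in counts.items())   # frequency-aware encoding
naive = sum(3 * n for s, n in counts.items())          # assumed fixed 3-byte format
print(total, naive)  # → 980 2805
```

    With these invented frequencies the frequency-aware encoding comes out roughly a factor of 3 smaller than the uniform baseline, while still using only fixed-length fields that decode without bit-level parsing.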

    Linden World, September 8, 1988

    Student Newspaper of Lindenwood College
    https://digitalcommons.lindenwood.edu/linden_world/1082/thumbnail.jp

    Microcomputer Based Simulation

    Digital simulation is a useful tool in many scientific areas. Interactive simulation can provide the user with a better appreciation of a problem area. With the introduction of large-scale integrated circuits, and in particular the advent of the microprocessor, a large amount of computing power is available at low cost. The aim of this project, therefore, was to investigate the feasibility of producing a minimum-cost, easy-to-use, interactive digital simulation system. A hardware microcomputer system was constructed to test simulation program concepts, and an interactive program was designed and developed for this system. By the use of a set of commands and subsequent interactive dialogue, the program allows the user to enter and perform simulation tasks. The simulation program is unusual in that it does not require a sophisticated operating system or other system programs such as compilers. The program does not require any backup memory devices such as magnetic disc or tape, and indeed could be stored in ROM or EPROM. The program is designed to be flexible and extendable, and could be easily modified to run with a variety of hardware configurations. The highly interactive nature of the system means that its operation requires very little programming experience. The microcomputer hardware system uses two microprocessors together with specially designed interfaces. One was dedicated to the implementation of the simulation equations, and the other provided an input/output capability including a low-cost CRT display.

    Design of a real-time wind turbine simulator using a custom parallel architecture

    The design of a new parallel-processing digital simulator is described. The new simulator has been developed specifically for analysis of wind energy systems in real time. The new processor has been named the Wind Energy System Time-domain simulator, version 3 (WEST-3). Like previous WEST versions, WEST-3 performs many computations in parallel. The modules in WEST-3 are pure digital processors, however. These digital processors can be programmed individually and operated in concert to achieve real-time simulation of wind turbine systems. Because of this programmability, WEST-3 is much more flexible and general than its two predecessors. The design features of WEST-3 are described to show how the system produces high-speed solutions of nonlinear time-domain equations. WEST-3 has two very fast Computational Units (CUs) that use minicomputer technology plus special architectural features that make them many times faster than a microcomputer. These CUs are needed to perform the complex computations associated with the wind turbine rotor system in real time. The parallel architecture of the CU allows several tasks to be done in each cycle, including an I/O operation and the combination of a multiply, add, and store. The WEST-3 simulator can be expanded at any time for additional computational power. This is possible because the CUs are interfaced to each other, and to other portions of the simulation, using special serial buses. These buses can be 'patched' together in essentially any configuration (in a manner very similar to the programming methods used in analog computation) to balance the input/output requirements. CUs can be added in any number to share a given computational load. This flexible bus feature is very different from many other parallel processors, which usually have a throughput limit because of rigid bus architecture.

    A brief history of software engineering

    Abstract: We present a personal perspective of the Art of Programming. We start with its state around 1960 and follow its development to the present day. The term Software Engineering became known after a conference in 1968, when the difficulties and pitfalls of designing complex systems were frankly discussed. A search for solutions began. It concentrated on better methodologies and tools. The most prominent were programming languages reflecting the procedural, modular, and then object-oriented styles. Software engineering is intimately tied to their emergence and improvement. Also of significance were efforts at systematizing, even automating, program documentation and testing. Ultimately, analytic verification and correctness proofs were supposed to replace testing. More recently, the rapid growth of computing power made it possible to apply computing to ever more complicated tasks. This trend dramatically increased the demands on software engineers. Programs and systems became complex and almost impossible to fully understand. The sinking cost and the abundance of computing resources inevitably reduced the care for good design. Quality seemed extravagant, a loser in the race for profit. But we should be concerned about the resulting deterioration in quality. Our limitations are no longer given by slow hardware, but by our own intellectual capability. From experience we know that most programs could be significantly improved, made more reliable, economical, and comfortable to use.

The 1960s and the Origin of Software Engineering

It is unfortunate that people dealing with computers often have little interest in the history of their subject. As a result, many concepts and ideas that existed decades ago, perhaps under a different terminology, are propagated and advertised as being new. I believe it worthwhile to occasionally spend some time considering the past and investigating how terms and concepts originated.
I consider the late 1950s as an essential period of the era of computing. Large computers became available to research institutions and universities. Their presence was noticed mainly in engineering and the natural sciences, but in business too they soon became indispensable. The time when they were accessible only to a few insiders in laboratories, when they broke down every time one wanted to use them, belonged to the past. Their emergence from the closed laboratory of electrical engineers into the public domain meant that their use, in particular their programming, became an activity of many. A new profession was born; but the large computers themselves became hidden within closely guarded cellars. Programmers brought their programs to the counter, where a dispatcher would pick them up, queue them, and where the results could be fetched hours or days later. There was no interactivity between man and computer.

Programming was known to be a sophisticated task requiring devotion and scrutiny, and a love for obscure codes and tricks. In order to facilitate this coding, formal notations were created. We now call them programming languages. The primary idea was to replace sequences of special instruction codes by mathematical formulas. The first widely known language, Fortran, was issued by IBM (Backus, 1957), soon followed by Algol (1958) and its official successor in 1960. As computers were then used for computing rather than for storing and communicating, these languages catered mainly to numerical mathematics. In 1962 the language Cobol was issued by the US Department of Defense for business applications. But as computing capacity grew, so did the demands on programs and on programmers: tasks became more and more intricate. It was slowly recognized that programming was a difficult task, and that mastering complex problems was non-trivial, even when, or because, computers were so powerful.
Salvation was sought in "better" programming languages, in more "tools", even in automation. A better language should be useful in a wider area of application, be more like a "natural" language, offer more facilities. PL/1 was designed to unify the scientific and commercial worlds. It was advertised under the slogan "Everybody can program thanks to PL/1". Programming languages and their compilers became a principal cornerstone of computing science. But they fitted into neither mathematics nor electronics, the two traditional sectors where computers were used. A new discipline emerged, called Computer Science in America, Informatics in Europe.

In 1963 the first time-sharing system appeared (MIT, Stanford, McCarthy, DEC PDP-1). It brought back the missing interactivity. Computer manufacturers picked the idea up and announced time-sharing systems for their large mainframes (IBM 360/67, GE 645). It turned out that the transition from batch processing systems to time-sharing systems, or in fact their merger, was vastly more difficult than anticipated. Systems were announced and could not be delivered on time. The problems were too complex. Research was to be conducted "on the job". The new, hot topics were multiprocessing and concurrent programming. The difficulties brought big companies to the brink of collapse. In 1968 a conference sponsored by NATO (at Garmisch-Partenkirchen, Germany) was dedicated to the topic [1], although critical comments had occasionally been voiced earlier.

Programming as a Discipline

In the academic world it was mainly E. W. Dijkstra and C. A. R. Hoare who recognized the problems and offered new ideas. In 1965 Dijkstra wrote his famous Notes on Structured Programming. Furthermore, in 1966 Dijkstra wrote a seminal paper about harmoniously cooperating processes. Of course, all this did not change the situation, nor dispel all difficulties overnight. Industry could change neither policies nor tools rapidly.
Nevertheless, intensive training courses on structured programming were organized, notably by H. D. Mills at IBM. No less than the US Department of Defense realized that the problems were urgent and growing. It started a project that ultimately led to the programming language Ada, a highly structured language suitable for a wide variety of applications. Software development within the DoD would then be based exclusively on Ada.

UNIX and C

Yet another trend started to pervade the programming arena, notably academia, pointing in the opposite direction. It was spawned by the spread of the UNIX operating system, contrasting with MIT's MULTICS and used on the quickly emerging minicomputers. UNIX was a highly welcome relief from the large operating systems established on mainframe computers. In its tow UNIX carried the language C [8], which had been explicitly designed to support the development of UNIX. Evidently, it was therefore at least attractive, if not even mandatory, to use C for the development of applications running under UNIX, which acted like a Trojan horse for C. From the point of view of software engineering, the rapid spread of C represented a great leap backward. It revealed that the community at large had hardly grasped the true meaning of the term "high-level language", which became an ill-understood buzzword. What, if anything, was to be "high-level"? As this issue lies at the core of software engineering, we need to elaborate.

Abstraction

Computer systems are machines of large complexity. This complexity can be mastered intellectually by one tool only: abstraction. A language represents an abstract computer whose objects and constructs lie closer (higher) to the problem to be represented than to the concrete machine. For example, in a high-level language we deal with numbers, indexed arrays, data types, conditional and repetitive statements, rather than with bits and bytes, addressed words, jumps, and condition codes.
However, these abstractions are beneficial only if they are consistently and completely defined in terms of their own properties. If this is not so, if the abstractions can be understood only in terms of the facilities of an underlying computer, then the benefits are marginal, almost given away. If debugging a program (undoubtedly the most pervasive activity in software engineering) requires a "hexadecimal dump", then the language is hardly worth the trouble.
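The abstraction gap described here can be made concrete with a small sketch of our own (not from the text): the same summation written once against the high-level abstraction (indexed arrays, a repetitive statement) and once against a simulated machine with an explicit instruction pointer and conditional jumps.

```python
data = [3, 1, 4, 1, 5]

# High-level: the program is stated in terms of the problem.
high = sum(data)

# Machine-level sketch: explicit "registers", an instruction pointer (pc),
# and conditional jumps simulated by a dispatch loop.
acc, idx = 0, 0
pc = 0
while True:
    if pc == 0:      # LOAD/ADD: accumulate mem[idx], advance index
        acc += data[idx]
        idx += 1
        pc = 1
    elif pc == 1:    # CMP + conditional jump back to the loop body
        pc = 0 if idx < len(data) else 2
    elif pc == 2:    # HALT
        break

print(high, acc)  # → 14 14
```

Both fragments compute the same value, but only the first can be understood without tracing the state of the underlying machine, which is precisely the author's point about what "high-level" should mean.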