
    FrameWorks 3D: composition in the third dimension

    Music composition on computer is a challenging task, involving a range of data types that must be managed within a single software tool. A composition typically comprises a complex arrangement of material, with many internal relationships between data in different locations: repetition, inversion, retrograde, reversal, and more sophisticated transformations. The creation of such complex artefacts is labour intensive, and current systems typically place a significant cognitive burden on the composer in maintaining a work as a coherent whole. FrameWorks 3D is an attempt to improve support for composition tasks within a Digital Audio Workstation (DAW) style environment via a novel three-dimensional (3D) user interface. In addition to the standard paradigm of tracks, regions, and the tape-recording analogy, FrameWorks displays hierarchical and transformational information in a single, fully navigable workspace. The implementation combines Java with Max/MSP to create a cross-platform, user-extensible package and will be used to assess the viability of such a tool and to develop the ideas further.
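    The transformations the abstract names are straightforward to make concrete. Below is a minimal, hypothetical sketch of retrograde, inversion, and transposition over a pitch sequence; it is not FrameWorks 3D code (the actual system combines Java with Max/MSP), and the MIDI note representation is an assumption of this sketch.

        # Hypothetical illustration of the transformations named above;
        # pitches are MIDI note numbers, an assumption made for this sketch.

        def retrograde(pitches):
            """Play the material backwards in time."""
            return list(reversed(pitches))

        def inversion(pitches, axis=None):
            """Mirror each pitch around an axis (default: the first note)."""
            if axis is None:
                axis = pitches[0]
            return [2 * axis - p for p in pitches]

        def transpose(pitches, interval):
            """Restate the material shifted by a fixed interval."""
            return [p + interval for p in pitches]

        theme = [60, 62, 64, 67]      # C4 D4 E4 G4
        print(retrograde(theme))      # [67, 64, 62, 60]
        print(inversion(theme))       # [60, 58, 56, 53]
        print(transpose(theme, 12))   # the theme an octave higher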

    Factors shaping the evolution of electronic documentation systems

    The main goal is to prepare the space station's technical and managerial structure for likely changes in the creation, capture, transfer, and utilization of knowledge. By anticipating advances, the design of Space Station Project (SSP) information systems can be tailored to facilitate a progression of increasingly sophisticated strategies as the space station evolves. Future generations of advanced information systems will use increases in computing power to deliver environmentally meaningful, contextually targeted, interconnected data (knowledge). The concept of a Knowledge Base Management System emerges when the problem is framed as how information systems can perform such a conversion of raw data. Such a system would include traditional management functions for large space databases. Added artificial intelligence features might encompass co-existing knowledge representation schemes; effective control structures for deductive, plausible, and inductive reasoning; means for knowledge acquisition, refinement, and validation; explanation facilities; and dynamic human intervention. The major areas covered include: alternative knowledge representation approaches; advanced user interface capabilities; computer-supported cooperative work; the evolution of information system hardware; standardization, compatibility, and connectivity; and organizational impacts of information-intensive environments.

    Design and integrity of deterministic system architectures

    Architectures represented by system construction 'building block' components and their interrelationships provide the structural form. This thesis addresses the processes, procedures, and methods that support system design synthesis, and specifically the determination of the integrity of candidate architectural structures. Particular emphasis is given to the structural representation of system architectures, their consistency, and their functional quantification. It is a design imperative that a hierarchically decomposed structure maintain compatibility and consistency between the functional and realisation solutions. Complex systems are normally simplified by hierarchical decomposition, so that lower-level components are precisely defined and simpler than higher-level components. To enable such systems to be reconstructed from their components, the hierarchical construction must provide vertical intra-relationship consistency, horizontal interrelationship consistency, and inter-component functional consistency. Firstly, a modified process design model is proposed that incorporates the generic structural representation of system architectures. Secondly, a system architecture design knowledge domain is proposed that enables viewpoint evaluations to be aggregated into a coherent set of domains that are both necessary and sufficient to determine the integrity of system architectures. Thirdly, four methods of structural analysis are proposed to assure the integrity of the architecture. The first enables the structural compatibility between the 'building blocks' that provide the emergent functional properties and the implementation solution properties to be determined. The second enables the compatibility of the functional causality structure and the implementation causality structure to be determined. The third method provides a graphical representation of architectural structures. The fourth method uses the graphical form of structural representation to provide a technique for quantitative estimation of the performance of emergent properties for large-scale or complex architectural structures. These methods have been combined into a procedure of formal design: a design process that, if rigorously executed, meets the requirements for reconstructability.
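    Consistency conditions of this kind lend themselves to mechanical checking. The following is a minimal sketch, not the thesis's formal method; the toy component model and the two checks are assumptions made purely for illustration.

        # Toy hierarchical decomposition: name -> (parent, provides, requires).
        # Invented for illustration; the thesis's representation is richer.
        components = {
            "system":  (None,     {"navigate"}, set()),
            "sensor":  ("system", {"position"}, set()),
            "planner": ("system", {"navigate"}, {"position"}),
        }

        def vertical_consistency(comps):
            """Every function a parent claims must be provided by some child."""
            ok = True
            for name, (_, provides, _) in comps.items():
                children = [c for c, (p, _, _) in comps.items() if p == name]
                if children:
                    offered = set().union(*(comps[c][1] for c in children))
                    if provides - offered:
                        print(f"{name}: no child provides {provides - offered}")
                        ok = False
            return ok

        def horizontal_consistency(comps):
            """Every function a component requires must come from a sibling."""
            ok = True
            for name, (parent, _, requires) in comps.items():
                siblings = [comps[c][1] for c, (p, _, _) in comps.items()
                            if p == parent and c != name]
                offered = set().union(*siblings) if siblings else set()
                if requires - offered:
                    print(f"{name}: unmet requirement {requires - offered}")
                    ok = False
            return ok

        print(vertical_consistency(components) and
              horizontal_consistency(components))   # True for this toy model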

    The role of the host in a cooperating mainframe and workstation environment, volumes 1 and 2

    In recent years, advances in computer systems have prompted a move from centralized computing, based on timesharing a large mainframe computer, to distributed computing, based on a connected set of engineering workstations. A major factor in this advance is the increased performance and lower cost of engineering workstations. The shift from centralized to distributed computing has led to challenges associated with the residency of application programs within the system. In a combined system of multiple engineering workstations attached to a mainframe host, the question arises of how a system designer should assign applications between the larger mainframe host and the smaller, yet powerful, workstations. Concepts related to real-time data processing are analyzed, and systems are presented that use a host mainframe and a number of engineering workstations interconnected by a local area network. In most cases, distributed systems can be classified as having a single function or multiple functions and as executing programs in real time or non-real time. In a system of multiple computers, the degree of autonomy of the computers is important; a system with one master control computer generally differs in reliability, performance, and complexity from a system in which all computers share control. This research is concerned with generating general criteria and principles for software residency decisions (host or workstation) for a diverse yet coupled group of users (the clustered workstations) which may need the use of a shared resource (the mainframe) to perform their functions.
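    Residency criteria of this kind can be phrased as a simple decision rule. The heuristic below is a hypothetical sketch with invented attribute names and thresholds; the research itself derives its criteria far more systematically.

        # Hypothetical residency heuristic; attributes and thresholds are
        # invented for this sketch, not taken from the research itself.
        def assign_residency(app):
            """Decide whether an application runs on the host or a workstation."""
            if app["shared_data_mb"] > 100:     # heavy use of the shared resource
                return "host"
            if app["realtime"] and app["interactive"]:
                return "workstation"            # keep latency off the network
            if app["cpu_hours"] > 10:           # long batch jobs suit the mainframe
                return "host"
            return "workstation"

        editor = {"shared_data_mb": 5, "realtime": True,
                  "interactive": True, "cpu_hours": 0.1}
        print(assign_residency(editor))         # workstation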

    Collaborative Control: A Robot-Centric Model for Vehicle Teleoperation


    Proceedings of the NSSDC Conference on Mass Storage Systems and Technologies for Space and Earth Science Applications

    The proceedings of the National Space Science Data Center Conference on Mass Storage Systems and Technologies for Space and Earth Science Applications, held July 23 through 25, 1991, at the NASA/Goddard Space Flight Center, are presented. The program includes a keynote address, invited technical papers, and selected technical presentations to provide a broad forum for the discussion of a number of important issues in the field of mass storage systems. Topics include magnetic disk and tape technologies, optical disk and tape, software storage and file management systems, and experiences with the use of a large, distributed storage system. The technical presentations describe integrated mass storage systems that are expected to become available commercially. Also included is a series of presentations from Federal Government organizations and research institutions covering their mass storage requirements for the 1990s.

    Scenario-based system architecting: a systematic approach to developing future-proof system architectures

    This thesis summarizes the research results of Mugurel T. Ionita, based on work conducted in the context of the STW - AIMES project. The work presented in this thesis was conducted at Philips Research and coordinated by Eindhoven University of Technology. It resulted in six externally available publications and ten internal reports that are company confidential. The research concerned the methodology of developing system architectures, focusing in particular on two aspects of the early architecting phases: first, the generation of multiple architectural options, to account for the most likely changes in the business environment; and second, the quantitative assessment of these options with respect to how well they contribute to the overall quality attributes of the future system, including cost and risk analysis. The main reason for looking at these two aspects of the architecting process is that architectures usually have to live for long periods of time, up to five years, which requires that they are able to deal successfully with the uncertainty of the future business environment. A second reason is that the quality attributes, costs, and risks of a future system are usually dictated by its architecture, so an early quantitative estimate of these attributes can prevent system redesign. The research results of this project were two methods: a method for designing architecture options that are more future-proof, that is, more resilient to future changes (the SODA method), and, within SODA, a method for the quantitative assessment of the proposed architectural options (the SQUASH method). The validation of the two methods was performed in the area of professional systems, where they were applied in a concrete case study from the medical domain. The SODA method is an innovative solution to the problem of developing system architectures that are designed to survive the most likely changes foreseen in the future business environment of the system. On the one hand, the method enables the business stakeholders of a system to provide the architects with their knowledge and insight about the future when new systems are created. On the other hand, it enables the architects to take a long view and think strategically in terms of different plausible futures and unexpected surprises when designing the high-level structure of their systems. The SQUASH method is a systematic way of assessing, in a quantitative manner, the proposed architectural options with respect to how well they deal with quality aspects, costs, and risks, before the architecture is actually implemented. The method enables the architects to reason about the most relevant attributes of the future system and to make more informed design decisions based on the quantitative data. Both methods, SODA and SQUASH, are descriptive in nature and rooted in the best industrial practices, and hence propose better ways of developing system architectures.
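    Quantitative assessment of architectural options is often reduced to a weighted aggregation across attributes. The sketch below illustrates only that general idea; the attributes, weights, and scores are invented, and the actual SQUASH method is considerably richer, covering cost and risk analysis explicitly.

        # Invented example of weighted scoring across quality attributes;
        # not the SQUASH method itself, just the general shape of the idea.
        options = {
            "option_a": {"performance": 8, "modifiability": 6, "cost": 4, "risk": 7},
            "option_b": {"performance": 6, "modifiability": 9, "cost": 7, "risk": 5},
        }
        weights = {"performance": 0.4, "modifiability": 0.3, "cost": 0.2, "risk": 0.1}

        def score(option):
            """Aggregate per-attribute scores into one comparable figure."""
            return sum(weights[attr] * value for attr, value in option.items())

        for name, attrs in options.items():
            print(name, round(score(attrs), 2))   # option_a 6.5, option_b 7.0
        print("preferred:", max(options, key=lambda n: score(options[n])))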

    NSSDC Conference on Mass Storage Systems and Technologies for Space and Earth Science Applications, volume 2

    This report contains copies of nearly all of the technical papers and viewgraphs presented at the NSSDC Conference on Mass Storage Systems and Technologies for Space and Earth Science Applications. This conference served as a broad forum for the discussion of a number of important issues in the field of mass storage systems. Topics include the following: magnetic disk and tape technologies; optical disk and tape; software storage and file management systems; and experiences with the use of a large, distributed storage system. The technical presentations describe, among other things, integrated mass storage systems that are expected to become available commercially. Also included is a series of presentations from Federal Government organizations and research institutions covering their mass storage requirements for the 1990s.