2,332 research outputs found

    Modelling adaptive routing in Wide Area Networks

    Bibliography: leaves 132-138. This study investigates the modelling of adaptive routing algorithms with specific reference to the algorithm of an existing Wide Area Network (WAN). Packets in the network are routed at each node on the basis of routing tables which contain internal and external delays for each route from the node. The internal delay on a route represents the time that packets queued for transmission will have to wait before being transmitted, while the external delay on a route represents the delay to other nodes via that route. Several modelling methods are investigated and compared for the purpose of identifying the most appropriate and applicable technique. A model of routing in the WAN using an analytic technique is described. The hypothesis of this study is that dynamic routing can be modelled as a sequence of models exhibiting fixed routing. The modelling rationale is that a series of analytic models is run and solved. The routing algorithm of the WAN studied is such that, if viewed at any time instant, the network is one with static routing and no buffer overflow. This characteristic, together with a real-time modelling requirement, influences the modelling technique which is applied. Each model represents a routing update interval, and a multiclass open queueing network is used to solve the model during a particular interval. Descriptions of the design and implementation of X wan, an X Window based modelling system, are provided. A feature of the modelling system is that it provides a Graphical User Interface (GUI), allowing interactive network specification and the direct observation of network routing through the medium of this interface. Various applications of the modelling system are presented, and overall network behaviour is examined. Experimentation with the routing algorithm is conducted, and (tentative) recommendations are made on ways in which network performance could be improved. A different routing algorithm is also implemented, for the purpose of comparison and to demonstrate the ease with which this can be effected.
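    The route-selection rule described in this abstract can be illustrated compactly. The snippet below is only a minimal sketch of table-based selection, not the thesis implementation: the table layout, node names and delay values are hypothetical, and the only idea taken from the abstract is choosing the route with the smallest sum of internal (local queueing) and external (downstream) delay.

    ```python
    # Hypothetical sketch of the table-based route selection described above.
    # Each candidate route to a destination carries an internal delay (time packets
    # wait in the local transmission queue) and an external delay (estimated delay
    # from the neighbour onwards); the next hop minimising their sum is chosen.

    def select_route(routing_table, destination):
        """routing_table: {destination: [(next_hop, internal_delay, external_delay), ...]}"""
        candidates = routing_table[destination]
        return min(candidates, key=lambda route: route[1] + route[2])[0]

    # Hypothetical example: two candidate routes from this node towards node "D".
    table = {"D": [("B", 4.0, 10.0), ("C", 1.5, 14.0)]}
    print(select_route(table, "D"))  # -> "B", since 4.0 + 10.0 < 1.5 + 14.0
    ```

    Re-running such a selection after every routing update interval mirrors the abstract's view of dynamic routing as a sequence of fixed-routing snapshots.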

    Automated Bidding in Computing Service Markets. Strategies, Architectures, Protocols

    This dissertation contributes to the research on Computational Mechanism Design by providing novel theoretical and software models: a bidding strategy called Q-Strategy, which automates bidding processes in imperfect-information markets; a software framework for realizing agents and bidding strategies called BidGenerator; and a communication protocol called MX/CS for expressing and exchanging economic and technical information in a market-based scheduling system.

    Brain-wave measures of workload in advanced cockpits: The transition of technology from laboratory to cockpit simulator, phase 2

    The present Phase 2 Small Business Innovation Research study was designed to address issues related to scalp-recorded event-related potential (ERP) indices of mental workload and to transition this technology from the laboratory to cockpit simulator environments for use as a systems engineering tool. The project involved five main tasks: (1) Two laboratory studies confirmed the generality of the ERP indices of workload obtained in the Phase 1 study and revealed two additional ERP components related to workload. (2) A task analysis of flight scenarios and pilot tasks in the Advanced Concepts Flight Simulator (ACFS) defined cockpit events (i.e., displays, messages, alarms) that would be expected to elicit ERPs related to workload. (3) Software was developed to support ERP data analysis. An existing ARD-proprietary package of ERP data analysis routines was upgraded, new graphics routines were developed to enhance interactive data analysis, and routines were developed to compare alternative single-trial analysis techniques using simulated ERP data. (4) Working in conjunction with NASA Langley research scientists and simulator engineers, preparations were made for an ACFS validation study of ERP measures of workload. (5) A design specification was developed for a general-purpose, computerized workload assessment system that can function in simulators such as the ACFS.
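    One of the listed tasks, comparing single-trial analysis techniques on simulated ERP data, can be sketched in a few lines. The snippet below is not the ARD analysis package described above; the waveform shape, noise level and trial count are assumptions chosen only to illustrate how simulated single trials let an analysis method be scored against a known ground-truth component.

    ```python
    # Hedged sketch: simulated single-trial ERPs = known template + noise, with the
    # benefit of trial averaging scored by correlation against that template.
    import numpy as np

    rng = np.random.default_rng(0)
    t = np.linspace(0.0, 0.8, 200)                   # 800 ms epoch (assumed)
    template = np.exp(-((t - 0.3) ** 2) / 0.002)     # assumed P300-like bump near 300 ms

    trials = template + rng.normal(scale=1.0, size=(50, t.size))   # 50 noisy single trials

    single_r = np.mean([np.corrcoef(trial, template)[0, 1] for trial in trials])
    average_r = np.corrcoef(trials.mean(axis=0), template)[0, 1]
    print(f"mean single-trial correlation {single_r:.2f} vs averaged-ERP correlation {average_r:.2f}")
    ```

    Because the underlying component is known exactly, any candidate single-trial technique can be ranked by how well it recovers it, which is the role simulated ERP data plays in the task described above.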

    Cognitive computing: algorithm design in the intersection of cognitive science and emerging computer architectures

    For the first time in decades computers are evolving into a fundamentally new class of machine. Transistors are still getting smaller, more economical, and more power-efficient, but operating frequencies leveled off in the mid-2000s. Today, improving performance requires placing a larger number of slower processing cores on each of many chips. Software written for such machines must scale out over many cores rather than scaling up with a faster single core. Biological computation is an extreme manifestation of such a many-slow-core architecture and therefore offers a potential source of ideas for leveraging new hardware. This dissertation addresses several problems in the intersection of emerging computer architectures and biological computation, termed Cognitive Computing: What mechanisms are necessary to maintain stable representations in a large distributed learning system? How should complex biologically-inspired algorithms be tested? How do visual sensing limitations like occlusion influence the performance of classification algorithms? Neurons have a limited dynamic output range, but must process real-world signals over a wide dynamic range without saturating or succumbing to endogenous noise. Many existing neural network models leverage spatial competition to address this issue, but require hand-tuning of several parameters for a specific, fixed distribution of inputs. Integrating spatial competition with a stabilizing learning process produces a neural network model capable of autonomously adapting to a non-stationary distribution of inputs. Human-engineered complex systems typically include a number of architectural features to curtail complexity and simplify testing. Biological systems do not obey these constraints. Biologically-inspired algorithms are thus dramatically more difficult to engineer. Augmenting standard tools from the software engineering community with features targeted towards biologically-inspired systems is an effective mitigation. Natural visual environments contain objects that are occluded by other objects. Such occlusions are under-represented in the standard benchmark datasets for testing classification algorithms. This bias masks the negative effect of occlusion on performance. Correcting the bias with a new dataset demonstrates that occlusion is a dominant variable in classification performance. Modifying a state-of-the-art algorithm with mechanisms for occlusion resistance doubles classification performance in high-occlusion cases without penalty for unoccluded objects.
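    The pairing of spatial competition with a stabilizing learning rule can be illustrated with a deliberately simplified single-unit sketch. This is not the dissertation's model: divisive normalisation is assumed as the competition mechanism and Oja's rule as the stabilising update, chosen only because both are standard and show how a unit can keep its response bounded while the input distribution shifts.

    ```python
    # Hedged sketch: divisive normalisation (spatial competition among inputs) plus
    # an Oja-style update (stabilising learning) for one unit facing a non-stationary
    # input distribution. Mechanisms and parameter values are illustrative assumptions.
    import numpy as np

    rng = np.random.default_rng(1)
    w = rng.random(10)          # feed-forward weights of a single unit
    eta = 0.01                  # learning rate

    def normalise(x, sigma=1.0):
        return x / (sigma + x.sum())    # each input competes with the pooled activity

    for step in range(20000):
        scale = 1.0 if step < 10000 else 5.0        # input statistics shift halfway through
        x = normalise(rng.exponential(scale, size=10))
        y = float(w @ x)
        w += eta * y * (x - y * w)                  # Oja's rule keeps the weight norm bounded

    print("final weight norm:", np.linalg.norm(w))  # remains close to 1 despite the shift
    ```

    The normalisation step bounds the instantaneous drive while the learning rule bounds the long-run weights, which together play the roles the abstract assigns to spatial competition and stabilising learning.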

    Design requirements for SRB production control system. Volume 5: Appendices

    A questionnaire for screening candidate production control software packages is presented.

    Estimation of the demand for public services communications

    Market analyses and economic studies are presented to support NASA planning for a communications satellite system to provide public services in health, education, mobile communications, data transfer, and teleconferencing.

    Modelling material mass balances over wastewater treatment plants

    Includes bibliographical references. The overall objective of whole wastewater treatment plant (WWTP) modelling is to develop mass balance models for COD (electron), carbon (C), nitrogen (N), phosphorus (P), alkalinity (proton), calcium (Ca), magnesium (Mg) and inorganic suspended solids (ISS) concentrations over the unit operations of municipal WWTPs. The development of such a model, for both steady state and dynamic simulation conditions, is an objective greater than the scope of this thesis project; however, this thesis makes a number of significant steps towards it.
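    The bookkeeping behind such mass balance models can be shown with a deliberately simple steady-state check. The snippet below is a minimal sketch under assumed figures, not the thesis model: for any tracked material (COD, N, P, Mg, ...), whatever enters a unit operation must leave via the effluent, the wasted sludge, or an explicitly tracked transformation.

    ```python
    # Hypothetical steady-state mass balance check over one unit operation.
    # All fluxes in kg/d; a non-zero result flags unaccounted-for material.

    def balance_error(inflows, outflows, transformations):
        return sum(inflows) - sum(outflows) - sum(transformations)

    # Assumed COD balance over an activated sludge reactor (kg COD/d):
    influent = [12000.0]
    effluent_and_waste = [800.0, 5200.0]   # effluent COD, waste sludge COD
    oxidised = [6000.0]                    # COD passed as electrons to oxygen
    print(balance_error(influent, effluent_and_waste, oxidised))   # 0.0 -> the balance closes
    ```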

    Generation of (synthetic) influent data for performing wastewater treatment modelling studies

    The success of many modelling studies strongly depends on the availability of sufficiently long influent time series - the main disturbance of a typical wastewater treatment plant (WWTP) - representing the inherent natural variability at the plant inlet as accurately as possible. This is an important point since most modelling projects suffer from a lack of realistic data representing the influent wastewater dynamics. The objective of this paper is to show the advantages of creating synthetic data when performing modelling studies for WWTPs. This study reviews the different principles that influent generators can be based on, in order to create realistic influent time series. In addition, the paper summarizes the variables that those models can describe: influent flow rate, temperature and traditional/emerging pollution compounds, weather conditions (dry/wet), as well as their temporal resolution (from minutes to years). The importance of calibration/validation is addressed and the authors critically analyse the pros and cons of manual versus automatic and frequentist versus Bayesian methods. The presentation will focus on potential engineering applications of influent generators, illustrating the different model concepts with case studies. The authors have significant experience using these types of tools and have worked on interesting case studies that they will share with the audience. Discussion with experts at the WWTmod seminar should help identify critical knowledge gaps in current WWTP influent disturbance models. Finally, the outcome of these discussions will be used to define specific tasks that should be tackled in the near future to achieve more general acceptance and use of WWTP influent generators.
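    The kind of phenomenological influent generator reviewed above can be sketched very compactly. The snippet below is an assumed illustrative form, not any specific published generator: a diurnal harmonic around a mean dry-weather flow, multiplicative noise, and sparse rain events routed through a simple decay to mimic wet-weather flow.

    ```python
    # Hedged sketch of a synthetic influent flow generator; all parameter values
    # (mean flow, diurnal amplitude, rain probability, runoff dynamics) are assumptions.
    import numpy as np

    rng = np.random.default_rng(42)
    dt_h = 0.25                                   # 15-minute resolution
    t = np.arange(0.0, 7 * 24, dt_h)              # one week of influent flow, in hours

    q_dry = 1000.0                                              # mean dry-weather flow, m3/h
    diurnal = 1.0 + 0.3 * np.sin(2 * np.pi * (t - 10) / 24)     # daily peak pattern
    noise = rng.lognormal(mean=0.0, sigma=0.05, size=t.size)    # short-term variability

    rain = (rng.random(t.size) < 0.01).astype(float)            # sparse rain event onsets
    kernel = np.exp(-np.arange(0.0, 12.0, dt_h) / 3.0)          # slowly decaying runoff response
    runoff = np.convolve(rain, kernel)[: t.size]

    q = q_dry * diurnal * noise + 2000.0 * runoff               # total influent flow, m3/h
    print(f"min {q.min():.0f}, mean {q.mean():.0f}, max {q.max():.0f} m3/h")
    ```

    Extending such a generator with temperature and pollutant concentration sub-models, and then calibrating it against measured inlet data, is where the manual/automatic and frequentist/Bayesian choices discussed above come into play.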