12 research outputs found

    Optimal scheduling for UET-UCT generalized n-dimensional grid task graphs

    No full text
    The n-dimensional grid is one of the most representative patterns of data flow in parallel computation. The most frequently used scheduling model for grids is the unit execution time - unit communication time (UET-UCT) model. In this paper we enhance the model of the n-dimensional grid by adding extra diagonal edges. First, we calculate the optimal makespan for the generalized UET-UCT grid topology and then establish the minimum number of processors required to achieve that optimal makespan. Furthermore, we solve the scheduling problem for generalized n-dimensional grids by proposing an optimal time and space scheduling strategy. We thus prove that UET-UCT scheduling of generalized n-dimensional grids is tractable with low complexity. 1. Introduction Task scheduling is one of the most important and difficult problems in parallel systems. Since the general scheduling problem is known to be NP-complete (see Ullman [13]), researchers have given attention to other methods such as heuristics, approximation..
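    The critical-path reasoning behind a unit-execution-time grid can be sketched as a toy calculation (a generic illustration of makespan lower bounds in a plain grid without the paper's diagonal edges; the function name `grid_makespan` and the communication-free assumption are mine, not the authors'):

    ```python
    from itertools import product

    def grid_makespan(dims):
        """Critical-path makespan of an n-dimensional grid task graph under
        unit execution times, ignoring communication delays. Each node
        (i1, ..., in) depends on the nodes one step back in every dimension,
        so its earliest start is its coordinate sum, and the makespan is
        1 + sum(d - 1 for d in dims) time steps."""
        levels = {p: sum(p) for p in product(*(range(d) for d in dims))}
        return 1 + max(levels.values())

    # A 3 x 4 grid: the critical path visits 3 + 4 - 1 = 6 tasks.
    print(grid_makespan((3, 4)))     # 6
    print(grid_makespan((2, 2, 2)))  # 4
    ```

    Adding diagonal edges, as the paper does, only lengthens dependency chains, so this communication-free figure is a lower bound for the generalized topology.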

    Dynamic task scheduling and binding for many-core systems through stream rewriting

    Get PDF
    This thesis proposes a novel model of computation, called stream rewriting, for the specification and implementation of highly concurrent applications. Basically, the active tasks of an application and their dependencies are encoded as a token stream, which is iteratively modified by a set of rewriting rules at runtime. In order to estimate the performance and scalability of stream rewriting, a large number of experiments were evaluated on many-core systems, and the task management was implemented in software and hardware.
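    The token-stream idea can be sketched in a few lines (a minimal illustration assuming naive pattern matching over a Python list; this is my own toy model, not the thesis's runtime, which matches and replaces in hardware-friendly passes):

    ```python
    def rewrite_stream(stream, rules):
        """One round of stream rewriting: repeatedly scan the token stream,
        replace the first subsequence matching a rule's pattern with that
        rule's replacement, and stop when no rule applies any longer."""
        changed = True
        while changed:
            changed = False
            for pattern, replacement in rules:
                for i in range(len(stream) - len(pattern) + 1):
                    if tuple(stream[i:i + len(pattern)]) == pattern:
                        stream[i:i + len(pattern)] = replacement
                        changed = True
                        break
                if changed:
                    break
        return stream

    # A "task" token spawns two subtasks, each of which then completes:
    rules = [(("task",), ["sub", "sub"]), (("sub",), ["done"])]
    print(rewrite_stream(["task"], rules))  # ['done', 'done']
    ```

    Dynamic task creation falls out naturally: a rule's right-hand side may emit new task tokens, which later passes pick up again.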

    AVATAR - Machine Learning Pipeline Evaluation Using Surrogate Model

    Get PDF
    © 2020, The Author(s). The evaluation of machine learning (ML) pipelines is essential during automatic ML pipeline composition and optimisation. Previous methods, such as the Bayesian-based and genetic-based optimisation implemented in Auto-Weka, Auto-sklearn and TPOT, evaluate pipelines by executing them. The pipeline composition and optimisation of these methods therefore require a tremendous amount of time, which prevents them from exploring complex pipelines to find better predictive models. To further explore this research challenge, we have conducted experiments showing that many of the generated pipelines are invalid, and that it is unnecessary to execute them to find out whether they are good pipelines. To address this issue, we propose a novel method to evaluate the validity of ML pipelines using a surrogate model (AVATAR). The AVATAR accelerates automatic ML pipeline composition and optimisation by quickly discarding invalid pipelines. Our experiments show that the AVATAR is more efficient at evaluating complex pipelines than traditional evaluation approaches that require execution.
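    The surrogate idea of rejecting invalid pipelines without executing them can be illustrated with a toy type-compatibility check (a sketch under my own simplifying assumptions — single input/output types per step — not the AVATAR implementation, which reasons over richer capability models):

    ```python
    def pipeline_is_valid(steps):
        """Surrogate-style validity check: instead of running the pipeline,
        verify that each step's declared input type matches the previous
        step's declared output type. A step is (name, input_type, output_type)."""
        for (_, _, out_t), (_, in_t, _) in zip(steps, steps[1:]):
            if out_t != in_t:
                return False
        return True

    steps_ok = [("imputer", "table", "table"),
                ("encoder", "table", "matrix"),
                ("svm", "matrix", "model")]
    steps_bad = [("encoder", "table", "matrix"),
                 ("imputer", "table", "table")]  # expects a table, gets a matrix
    print(pipeline_is_valid(steps_ok), pipeline_is_valid(steps_bad))  # True False
    ```

    Because the check is a linear scan over declared types rather than a training run, invalid candidates can be pruned in microseconds during pipeline search.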

    2001-2002 Graduate Catalog

    Get PDF

    Efficient algorithms for new computational models

    Get PDF
    Thesis (Ph. D.)--Massachusetts Institute of Technology, Dept. of Electrical Engineering and Computer Science, 2003. Includes bibliographical references (p. 155-163). This electronic version was submitted by the student author. The certified thesis is available in the Institute Archives and Special Collections. Advances in hardware design and manufacturing often lead to new ways in which problems can be solved computationally. In this thesis we explore fundamental problems in three computational models that are based on such recent advances. The first model is based on new chip architectures, where multiple independent processing units are placed on one chip, allowing for an unprecedented parallelism in hardware. We provide new scheduling algorithms for this computational model. The second model is motivated by peer-to-peer networks, where countless (often inexpensive) computing devices cooperate in distributed applications without any central control. We state and analyze new algorithms for load balancing and for locality-aware distributed data storage in peer-to-peer networks. The last model is based on extensions of the streaming model. It is an attempt to capture the class of problems that can be efficiently solved on massive data sets. We give a number of algorithms for this model, and compare it to other models that have been proposed for massive data set computations. Our algorithms and complexity results for these computational models follow the central thesis that it is an important part of theoretical computer science to model real-world computational structures, and that such effort is richly rewarded by a plethora of interesting and challenging problems. by Jan Matthias Ruhl. Ph.D.

    The Development of a bi-level geographic information systems (GIS) database model for informal settlement upgrading

    Get PDF
    Bibliography: leaves 348-369. Existing urban GIS models are faced with several limitations. Firstly, these models tend to be single-scale in nature: they are usually designed to operate at either the metropolitan or the local level. Secondly, they are generally designed to cater only for the needs of the formal and environmental sectors of the city system. These models do not cater for the "gaps" of data that exist in digital cadastres throughout the world. In the developed countries, these gaps correspond to areas of physical decay or economic decline; in the developing countries, they correspond to informal settlement areas. In this thesis, a new two-scale urban GIS database model, termed the "Bi-level model", is proposed. This model has been specifically designed to address these gaps in the digital cadastre. Furthermore, the model addresses the shortcomings facing current informal settlement upgrading models by providing mechanisms for community participation and project management, creating linkages to formal and environmental sectoral models, and co-ordinating initiatives at a global level. The Bi-level model comprises a metropolitan-level and a series of local-level database components. These components are inter-linked through bi-directional database warehouse connections. While the model requires Internet connectivity to achieve its full potential across a metropolitan region, it recognises the need for community participation-based methods at a local level. Members of the community are actually involved in capturing and entering informal settlement data into the local-level database.

    HUMAN CONTROL OF ROBOTIC MECHANISMS: MODELLING AND ASSESSMENT OF ASSISTIVE DEVICES

    Get PDF
    The prescription and use of Assistive Technology, particularly teleprostheses, may be enhanced by the use of standard assessment techniques. For input devices in particular, existing assessment studies, most of which are based on Fitts' Law, have produced contradictory results. This thesis has made contributions to these and related fields, particularly in the following four areas. Fitts' Law (and the background information theory) is examined. The inability of this paradigm to match experimental results is noted and explained. Following a review of the contributing fields, a new method of assessing input devices is proposed, based on Fitts' Law, classical control and the concept of 'profiling'. To determine the suitability of the proposed method, it is applied to the results of over 2000 trials. The resulting analysis emphasises the importance of interaction effects and their influence on general comparison techniques for input devices. The process of verification has highlighted gain susceptibility as a performance criterion which reflects user susceptibility; a technique which may be particularly applicable to Assistive Technology. Dept. of Mechanical and Marine Engineering