    Tools for Real-Time Control Systems Co-Design: A Survey

    This report presents a survey of current simulation tools in the area of integrated control and real-time systems design. Each tool is presented with a quick overview followed by a more detailed section describing comparative aspects of the tool. These aspects describe the context and purpose of the tool (scenarios, development stages, activities, and qualities/constraints being addressed) and the actual tool technology (tool architecture, inputs, outputs, modeling content, extensibility, and availability). The tools presented in the survey are the following: Jitterbug and TrueTime from the Department of Automatic Control at Lund University, Sweden; AIDA and XILO from the Department of Machine Design at the Royal Institute of Technology, Sweden; Ptolemy II from the Department of Electrical Engineering and Computer Sciences at Berkeley, California; RTSIM from the RETIS Laboratory, Pisa, Italy; and SynDEx and Orccad from INRIA, France. The survey also briefly describes some existing commercial tools related to the area of real-time control systems.

    Control and Embedded Computing: Survey of Research Directions

    This paper provides a survey of the role of feedback control in embedded real-time systems, presented in the context of a new EU/IST Network of Excellence, ARTIST2. The survey highlights recent research efforts and future research directions in the areas of codesign of computer-based control systems, implementation-aware embedded control systems, and control of real-time computing systems.

    GRASP/Ada (Graphical Representations of Algorithms, Structures, and Processes for Ada): The development of a program analysis environment for Ada. Reverse engineering tools for Ada, task 1, phase 2

    The study, formulation, and generation of graphical representations of algorithms, structures, and processes for Ada (GRASP/Ada) are discussed in this second phase report of a three-phase effort. Various graphical representations that can be extracted or generated from source code are described and categorized, with focus on reverse engineering. The overall goal is to provide the foundation for a CASE (computer-aided software engineering) environment in which reverse engineering and forward engineering (development) are tightly coupled. Emphasis is on a subset of architectural diagrams that can be generated automatically from source code, with the control structure diagram (CSD) included for completeness.

    Probabilistic grid scheduling based on job statistics and monitoring information

    This transfer thesis presents a novel, probabilistic approach to scheduling applications on computational Grids based on their historical behaviour, the current state of the Grid, and predictions of the future execution times and resource utilisation of such applications. The work lays a foundation for enabling a more intuitive, user-friendly and effective scheduling technique termed deadline scheduling. Initial work has established the motivation and requirements for a more efficient Grid scheduler, able to adaptively handle the dynamic nature of Grid resources and the submitted workload. Preliminary scheduler research identified the need for detailed monitoring of Grid resources at the process level, and for a tool to simulate the non-deterministic behaviour and statistical properties of Grid applications. A simulation tool, GridLoader, has been developed to enable modelling of application loads similar to a number of typical Grid applications. GridLoader is able to simulate CPU utilisation, memory allocation and network transfers according to limits set through command-line parameters or a configuration file. Its specific strength is in achieving set resource utilisation targets in a probabilistic manner, thus creating a dynamic environment suitable for testing the scheduler's adaptability and its prediction algorithm. To enable highly granular monitoring of Grid applications, a monitoring framework based on the Ganglia Toolkit was developed and tested. The suite is able to collect resource usage information of individual Grid applications, integrate it into a standard XML-based information flow, provide visualisation through a Web portal, and export data into a format suitable for off-line analysis. The thesis also presents an initial investigation of the utilisation of the University College London Central Computing Cluster facility running Sun Grid Engine middleware. The feasibility of basic prediction concepts based on historical information and process meta-data has been successfully established, and possible scheduling improvements using such predictions identified. The thesis is structured as follows: Section 1 introduces Grid computing and its major concepts; Section 2 presents open research issues and the specific focus of the author's research; Section 3 gives a survey of the related literature, schedulers, monitoring tools and simulation packages; Section 4 presents the platform for the author's work, the Self-Organising Grid Resource management project; Sections 5 and 6 give detailed accounts of the monitoring framework and simulation tool developed; Section 7 presents the initial data analysis, while Section 8.4 concludes the thesis with appendices and references.
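
    The core idea behind a GridLoader-style probabilistic load generator can be sketched as follows. This is a hypothetical minimal version, assuming a slice-based busy/idle model; the function name `simulate_cpu_load` and its parameters are illustrative and not GridLoader's actual interface:

```python
import random
import time

def simulate_cpu_load(target_utilisation, duration_s, slice_s=0.1, seed=None):
    """Approach a target CPU utilisation probabilistically: time is cut
    into short slices, and each slice is spent busy-spinning with
    probability equal to the target, otherwise sleeping. Over many
    slices the achieved utilisation converges to the target, while the
    instantaneous load stays non-deterministic."""
    rng = random.Random(seed)
    total_slices = int(duration_s / slice_s)
    busy_slices = 0
    for _ in range(total_slices):
        if rng.random() < target_utilisation:
            end = time.perf_counter() + slice_s
            while time.perf_counter() < end:
                pass  # busy-spin to consume one slice of CPU time
            busy_slices += 1
        else:
            time.sleep(slice_s)  # idle slice
    return busy_slices / total_slices  # achieved utilisation fraction
```

    The probabilistic per-slice decision is what makes the resulting load dynamic rather than a fixed duty cycle, which is the property the thesis exploits when stress-testing the scheduler's prediction algorithm.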

    Secure and Privacy Driven Energy Data Analytics

    PhD thesis in Information Technology. Renewable resources are the main energy sources in a smart grid project. In order to ensure the smooth functioning of the smart grid, Information and Communication Technologies (ICT) need to be utilised efficiently. The objective of the SmartNEM project is to effectively utilise technologies such as Machine Learning, Blockchain and Data Hubs for the aforementioned purpose and, at the same time, to ensure a secure and privacy-preserving solution. The data involved in smart grids require high security and can be sensitive, since household data contain personal information. Individuals can be reluctant to share these data due to mistrust and to avoid unnecessary manipulation of the data they provide. To overcome this, it is necessary to build a trust-based framework that ensures data security and data privacy, so that data owners are willing to open up their data for analysis. To achieve this we have proposed an architecture called TOTEM, Token for Controlled Computation, which integrates Blockchain and Big Data technologies. The conventional method of data analysis demands that data be moved across the network to the location where the execution happens; in the TOTEM architecture, however, computational code is moved to the data owner's environment, where the data are located. TOTEM is a three-layer architecture (blockchain consortium layer, storage layer and computational layer) with two main actors: the data provider and the data consumer. Data providers supply metadata about the data they own and provide resources for executing code on that data. Data consumers get the opportunity to execute their own code on the data provider's data. For controlled computation, and to avoid malicious functions, an entity called the totem is introduced in the architecture. Authorised users must meet the totem-value requirements to execute their code on the requested data. Live monitoring of the totem value throughout run time is achieved with components such as the totem manager and updaters in the computational layer. The code must follow a specific format and undergoes preliminary checks with the TOTEM-defined SDK and the smart contracts deployed by the data providers in the blockchain network. An Extended TOTEM architecture is also proposed to address the additional features needed to combine results from multiple data providers without sharing the data. This research work focused on the design of the TOTEM architecture and its implementation as a proof of concept for the newly introduced components. We have also introduced artificial intelligence in the framework to improve the functionality of core features. In the present research, the TOTEM architecture is proposed for the SmartNEM project to utilise energy data for decision making and to identify trends or patterns, while maintaining data privacy, data ownership, accountability and traceability. Moreover, the architecture can be extended to other domains, such as health and education, where data security and privacy are the key concerns in sharing data.
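
    The metering role of the totem manager can be sketched as below. This is a loose, hypothetical illustration of token-budgeted computation, assuming a simple charge-per-step model; the class and function names (`TotemManager`, `run_metered`) are invented for this sketch and do not reflect TOTEM's actual components or interfaces:

```python
class TotemExhausted(Exception):
    """Raised when a consumer's computation exceeds its totem budget."""

class TotemManager:
    """Meter a data consumer's code against a token ('totem') budget.
    An updater charges the budget per unit of work; when the budget
    cannot cover the next charge, the run is aborted, so the data
    provider bounds what may run in its environment."""
    def __init__(self, totem_value):
        self.totem_value = totem_value

    def charge(self, cost):
        if cost > self.totem_value:
            raise TotemExhausted("totem value exhausted; aborting computation")
        self.totem_value -= cost
        return self.totem_value  # remaining budget, for live monitoring

def run_metered(manager, steps, cost_per_step=1):
    """Execute a sequence of work items, charging the totem before each."""
    results = []
    for step in steps:
        manager.charge(cost_per_step)
        results.append(step())
    return results
```

    In the real architecture the budget check is enforced via smart contracts and the TOTEM SDK rather than an in-process exception; the sketch only shows the control-flow idea of aborting once the totem value runs out.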

    Optimizing energy efficiency in operating built environment assets through building Information modeling: a case study

    Reducing carbon emissions and addressing environmental policies in the construction domain has been intensively explored, with solutions ranging from energy efficiency techniques and building informatics to user behaviour modelling and monitoring. Such strategies have managed to improve current practices in managing buildings; however, decarbonizing the built environment and reducing the energy performance gap remain a complex undertaking that requires more comprehensive and sustainable solutions. In this context, building information modelling (BIM) can help the sustainability agenda, as the digitalization of product and process information provides a unique opportunity to optimize energy-efficiency-related decisions across the entire lifecycle and supply chain. BIM is foreseen as a means to waste and emissions reduction, performance gap minimization, in-use energy enhancements, and total lifecycle assessment. It also targets the whole supply chain related to design, construction, and the management and use of facilities, at different qualification levels (including blue-collar workers). In this paper, we present how building information modelling can be utilized to address energy efficiency in buildings in the operation phase, greatly contributing to achieving carbon emissions targets. We provide two main contributions: (i) we present a BIM-oriented methodology for supporting building energy optimization, based on which we identify a few training directions with regard to BIM, and (ii) we provide an application use case, as identified in the European research project "Sporte2", to demonstrate the advantages of BIM in energy efficiency with respect to several energy metrics.

    A method for the architectural design of distributed control systems for large, civil jet engines: a systems engineering approach

    The design of distributed control systems (DCSs) for large, civil gas turbine engines is a complex architectural challenge. To date, the majority of research into DCSs has focused on the contributing technologies and high-temperature electronics rather than the architecture of the system itself. This thesis proposes a method for the architectural design of distributed systems using a genetic algorithm to generate, evaluate and refine designs. The proposed designs are analysed for their architectural quality, lifecycle value and commercial benefit. The method is presented along with results proving the concept. Whilst the method described here is applied exclusively to DCSs for jet engines, the principles and methods could be adapted for a broad range of complex systems.
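
    The generate-evaluate-refine loop such a genetic algorithm performs can be sketched generically. This is a minimal illustrative sketch, not the thesis's actual algorithm: candidate architectures are encoded as bit-strings (e.g. which distributed nodes host which control functions), and the fitness function standing in for architectural quality, lifecycle value and commercial benefit is supplied by the caller:

```python
import random

def genetic_search(fitness, genome_len, pop_size=30, generations=50,
                   mutation_rate=0.05, seed=0):
    """Evolve bit-string genomes: keep the fitter half each generation
    (selection with elitism), fill the rest by one-point crossover of
    two surviving parents, and flip each child bit with a small
    mutation probability."""
    rng = random.Random(seed)
    pop = [[rng.randint(0, 1) for _ in range(genome_len)]
           for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=fitness, reverse=True)
        parents = pop[: pop_size // 2]            # selection: fitter half survives
        children = []
        while len(parents) + len(children) < pop_size:
            a, b = rng.sample(parents, 2)
            cut = rng.randrange(1, genome_len)    # one-point crossover
            child = a[:cut] + b[cut:]
            child = [1 - g if rng.random() < mutation_rate else g
                     for g in child]              # bit-flip mutation
            children.append(child)
        pop = parents + children
    return max(pop, key=fitness)

# Toy fitness: maximise the number of functions placed on smart nodes.
best = genetic_search(fitness=sum, genome_len=16)
```

    A real architecture search would replace the toy fitness with the multi-criteria evaluation the thesis describes, typically combining the quality, lifecycle and commercial scores into a single objective or a Pareto ranking.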

    EFFICIENT IMPLEMENTATION OF BRANCH-AND-BOUND METHOD ON DESKTOP GRIDS

    The Berkeley Open Infrastructure for Network Computing (BOINC) is an open-source middleware system for volunteer and desktop grid computing. In this paper we propose BNBTEST, a BOINC version of the distributed branch-and-bound method. The crucial issues of the distributed branch-and-bound method are traversing the search tree and load balancing. We developed a subtask-packaging method and three different subtask-distribution strategies to solve these issues.
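
    The general idea of packaging a branch-and-bound search tree into independent subtasks can be sketched on a 0/1 knapsack problem. This is a hypothetical illustration, not BNBTEST's implementation: the tree is split at a fixed depth, each partial assignment becomes one work unit a volunteer node could search on its own, and the master folds results back while sharing the incumbent bound:

```python
from itertools import product

def package_subtasks(split_depth):
    """Enumerate all partial assignments down to split_depth; each one
    roots an independent subtree, i.e. one distributable work unit."""
    return [list(p) for p in product([0, 1], repeat=split_depth)]

def solve_subtask(prefix, n_vars, value, weight, capacity, best_so_far):
    """Depth-first branch-and-bound over one subtree of a 0/1 knapsack:
    prune when the optimistic bound (take all remaining items) cannot
    beat the incumbent."""
    def dfs(i, v, w, best):
        if w > capacity:
            return best                    # infeasible branch
        if i == n_vars:
            return max(best, v)
        if v + sum(value[i:]) <= best:
            return best                    # prune: cannot improve incumbent
        best = dfs(i + 1, v + value[i], w + weight[i], best)  # take item i
        return dfs(i + 1, v, w, best)                         # skip item i
    v0 = sum(value[i] for i, b in enumerate(prefix) if b)
    w0 = sum(weight[i] for i, b in enumerate(prefix) if b)
    return dfs(len(prefix), v0, w0, best_so_far)

def distributed_bnb(value, weight, capacity, split_depth=2):
    """Master loop: farm out each package (here, sequentially) and fold
    results back, passing the incumbent bound into each subtask so
    later packages can prune more aggressively."""
    best = 0
    for prefix in package_subtasks(split_depth):
        best = solve_subtask(prefix, len(value), value, weight,
                             capacity, best)
    return best
```

    In a real BOINC deployment the subtasks would be shipped to workers as work units and the incumbent bound shared asynchronously, which is exactly where the choice of distribution strategy matters.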