
    System Architecture for Distributed Control Systems and Electricity Market Infrastructures

    M.S. Thesis, University of Hawaiʻi at Mānoa, 2018.

    An Implementation for Transforming a Home Energy Management System to a Multi-agent System

    In the United States, 41% of produced energy is consumed by the building sector, i.e., residential and commercial buildings (Building Energy Data Book, Buildings Sector, US Department of Energy, Office of Energy Efficiency and Renewable Energy, Building Technologies Office, 2012). Home Energy Management Systems (HEMS) are anticipated to support energy-efficiency gains by controlling devices in an optimal fashion. New opportunities are emerging to integrate grid-type controls, and a multi-agent system (MAS) is the most suitable way to perform these controls. In this paper, approaches to integrating a HEMS with a MAS are discussed.
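
    The paper's own design is not reproduced here, but the core idea of recasting HEMS device control as cooperating agents can be sketched. In the toy Python below, every name (DeviceAgent, CoordinatorAgent, the Bid message) is a hypothetical illustration of one common MAS pattern, a coordinator curtailing flexible loads to respect a grid limit, not the paper's architecture:

```python
# Minimal sketch of a HEMS recast as a multi-agent system.
# All class names and the message format are illustrative assumptions,
# not the paper's actual design.
from dataclasses import dataclass

@dataclass
class Bid:
    device: str
    demand_kw: float       # power the device would like to draw
    curtailable_kw: float  # portion the device agent is willing to shed

class DeviceAgent:
    def __init__(self, name, demand_kw, curtailable_kw):
        self.name = name
        self.demand_kw = demand_kw
        self.curtailable_kw = curtailable_kw

    def bid(self) -> Bid:
        return Bid(self.name, self.demand_kw, self.curtailable_kw)

class CoordinatorAgent:
    """Collects bids and curtails flexible devices to respect a grid limit."""
    def __init__(self, limit_kw):
        self.limit_kw = limit_kw

    def dispatch(self, bids):
        total = sum(b.demand_kw for b in bids)
        excess = max(0.0, total - self.limit_kw)
        schedule = {}
        for b in bids:
            shed = min(b.curtailable_kw, excess)
            schedule[b.device] = b.demand_kw - shed
            excess -= shed
        return schedule

agents = [DeviceAgent("hvac", 3.0, 1.5),
          DeviceAgent("water_heater", 4.5, 4.5),
          DeviceAgent("fridge", 0.2, 0.0)]
coordinator = CoordinatorAgent(limit_kw=5.0)
print(coordinator.dispatch([a.bid() for a in agents]))
# -> total demand 7.7 kW is curtailed to exactly the 5.0 kW limit
```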

    The HELP Hospital Information System: Update 1998

    Journal article, Biomedical Informatics.

    Supporting distributed computation over wide area gigabit networks

    The advent of high-bandwidth fibre-optic links that may be used over very large distances has led to much research and development in the field of wide area gigabit networking. One problem that needs to be addressed is how loosely coupled distributed systems may be built over these links, allowing many computers worldwide to take part in complex calculations in order to solve "Grand Challenge" problems. The research conducted as part of this PhD has looked at the practicality of implementing a communication mechanism proposed by Craig Partridge called Late-binding Remote Procedure Calls (LbRPC). LbRPC is intended to export both code and data over the network to remote machines for evaluation, as opposed to traditional RPC mechanisms that only send parameters to pre-existing remote procedures. The ability to send code as well as data means that LbRPC requests can overcome one of the biggest problems in Wide Area Distributed Computer Systems (WADCS): the fixed latency due to the speed of light. As machines get faster, the fixed multi-millisecond round-trip delay equates to ever-increasing numbers of CPU cycles. For a WADCS to be efficient, programs should minimise the number of network transits they incur; by allowing the application programmer to export arbitrary code to the remote machine, this may be achieved. This research has looked at the feasibility of supporting secure exportation of arbitrary code and data in heterogeneous, loosely coupled, distributed computing environments. It has investigated techniques for making placement decisions for the code in cases where there are a large number of widely dispersed remote servers that could be used. The latter has resulted in the development of a novel prototype LbRPC using multicast IP for implicit placement and a sequenced, multi-packet saturation multicast transport protocol. These prototypes show that it is possible to export code and data to multiple remote hosts, thereby removing the need to perform complex and error-prone explicit process placement decisions.
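
    To make the contrast concrete, here is a minimal in-process Python sketch of the idea: a traditional RPC ships only parameters to a pre-existing remote procedure, while the late-binding variant ships the code itself, so a chain of dependent calls costs one network transit instead of several. The function names and the use of exec are illustrative assumptions; this is not Partridge's wire protocol or the thesis's prototype.

```python
# Conceptual contrast between traditional RPC and Late-binding RPC (LbRPC).
# This in-process sketch only illustrates the idea of shipping code along
# with the data; it is not the actual protocol or wire format.
import textwrap

def traditional_rpc(server_procs, proc_name, *args):
    """Classic RPC: only parameters travel; the procedure must pre-exist."""
    return server_procs[proc_name](*args)

def late_binding_rpc(source, entry, *args):
    """LbRPC idea: the code itself travels and is evaluated remotely,
    so a chain of dependent calls runs in a single network transit."""
    namespace = {}
    exec(textwrap.dedent(source), namespace)  # stand-in for remote evaluation
    return namespace[entry](*args)

# Three dependent calls would cost three round trips with traditional RPC,
# but a single transit when the composed code is exported instead.
exported = """
def compute(x):
    y = x * x        # would be RPC round trip 1
    z = y + 10       # round trip 2
    return z // 3    # round trip 3
"""
print(late_binding_rpc(exported, "compute", 7))  # -> 19
```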

    Co-Simulation of distributed flexibility coordination schemes

    The goal of this thesis is to implement and test a co-simulation environment that makes it possible to connect simulators of different types. The environment is applied to the simulation of coordinated optimal control of the energy consumption of 20 households with different consumption requirements and energy-storage capabilities. The results show that coordinated control of the energy consumption of multiple households can achieve considerable savings in comparison with controlling each household individually, without regard to the others.
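
    A toy Python sketch can illustrate the comparison the thesis draws, under assumed models: a convex cost on aggregate per-slot load (standing in for peak-driven network costs) and a greedy coordinator. Neither the cost function nor the scheduling scheme is taken from the thesis.

```python
# Toy comparison: coordinated vs. individual scheduling of household loads.
# The quadratic cost and the greedy scheme are illustrative assumptions.
def cost(slot_loads):
    # Convex cost penalizes aggregate peaks (e.g., losses, peak tariffs).
    return sum(load ** 2 for load in slot_loads)

def individual(households, n_slots, prices):
    """Each household picks its cheapest slot, ignoring the others."""
    loads = [0.0] * n_slots
    for demand in households:
        loads[min(range(n_slots), key=lambda s: prices[s])] += demand
    return loads

def coordinated(households, n_slots):
    """A coordinator greedily places each load in the least-loaded slot."""
    loads = [0.0] * n_slots
    for demand in sorted(households, reverse=True):
        loads[min(range(n_slots), key=lambda s: loads[s])] += demand
    return loads

households = [2.0, 1.5, 1.0, 0.5] * 5   # 20 households' shiftable demand (kW)
prices = [1.0, 1.2, 1.1, 1.3]           # everyone sees slot 0 as cheapest
print("individual :", cost(individual(households, 4, prices)))   # 625.0
print("coordinated:", cost(coordinated(households, 4)))          # 156.5
```

    Because every uncoordinated household piles into the same cheap slot, the aggregate peak (and hence the convex cost) is far higher than under coordination, which is the qualitative result the thesis reports.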

    Secure Configuration and Management of Linux Systems using a Network Service Orchestrator

    Manual management of the configuration of network devices and computing devices (hosts) is an error-prone task. Centralized automation of these tasks can lower the costs of management, but can also introduce unknown or unanticipated security risks. Misconfiguration, whether deliberate (by outsiders) or inadvertent (by insiders), can expose a system to significant risk. Centralized network management has seen significant progress in recent years, resulting in model-driven approaches that are clearly superior to previous "craft" methods. Host management has seen less development, and the available tools have evolved in separate, task-specific ways. This thesis explores two aspects of the configuration management problem for hosts: (1) implementing host management using model-driven (network) management tools; and (2) establishing the relative security of traditional methods and the proposed model-driven host management. It is shown that the model-driven approach is feasible, and that its security is significantly higher than that of existing approaches.
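
    The thesis's orchestrator tooling is not reproduced here, but the model-driven idea it builds on can be sketched: a declarative, validated model of intent from which the concrete host configuration is derived, so that misconfiguration is caught before anything is deployed. The schema and rendering below are illustrative assumptions, not the orchestrator's actual data model.

```python
# Minimal sketch of model-driven host management: a declarative, validated
# model is the single source of truth, and the concrete configuration is
# derived from it. The schema and rendering are illustrative assumptions.
from dataclasses import dataclass

@dataclass(frozen=True)
class SshService:
    port: int
    permit_root_login: bool
    allowed_users: tuple

    def validate(self):
        # Reject insecure or malformed intent before deployment; this is
        # where a model-driven approach catches misconfiguration early.
        if not (1 <= self.port <= 65535):
            raise ValueError("port out of range")
        if self.permit_root_login:
            raise ValueError("policy forbids root login")
        if not self.allowed_users:
            raise ValueError("at least one user must be allowed")

def render_sshd_config(model: SshService) -> str:
    model.validate()
    return "\n".join([
        f"Port {model.port}",
        "PermitRootLogin no",
        f"AllowUsers {' '.join(model.allowed_users)}",
    ])

print(render_sshd_config(SshService(22, False, ("ops", "backup"))))
```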

    Minutes of the CD-ROM Workshop

    The workshop described in this document had two goals: (1) to establish guidelines for the CD-ROM as a tool to distribute datasets; and (2) to evaluate current scientific CD-ROM projects as archives. Workshop attendees were urged to coordinate with European groups to develop CD-ROM, already available at low cost in the U.S., as a distribution medium for astronomical datasets. It was noted that NASA has made the CD Publisher at the National Space Science Data Center (NSSDC) available to the scientific community when the Publisher is not needed for NASA work. NSSDC's goal is to provide the Publisher's users with the hardware and software tools needed to design a dataset for distribution, including producing a master CD and copies. The prerequisite premastering process is described, as well as guidelines for CD-ROM construction. The production of discs was evaluated, and CD-ROM projects, guidelines, and problems of the technology were discussed.

    Design of a graphic user interface for a network management protocol


    Strategies for long-term monitoring of tide gauges using GPS

    Changes in mean sea level (MSL) recorded relative to tide gauge benchmarks (TGBMs) are corrupted by vertical land movements. Accurate estimates of changes in absolute sea level require these MSL records to be corrected for ground-level changes at tide gauge sites. For more than a decade, the Global Positioning System (GPS) has been used to determine positions of TGBMs and to monitor their position changes, i.e. station velocities, over time in the International Terrestrial Reference System (ITRS). This was initially carried out by episodic GPS campaigns and later by continuous GPS (CGPS), or a combination of both. Highly accurate realizations of the ITRS, satellite orbits and models for the mitigation of systematic effects currently enable the determination of station positions using GPS at the centimetre or even millimetre level. It is argued, however, that accurate long-term estimates of changes in the vertical component at the 1 mm/yr level cannot be achieved, making intercomparisons between GPS estimates and other techniques necessary. Daily processing and analysis of continuous GPS networks requires automated procedures; the modifications and improvements to the existing procedures at the IESSG are described. The newly developed tools include the monitoring and quality control of daily archived GPS observations and of processing results. A special focus is on coordinate time series analysis and the methodologies used to obtain the best possible estimates of vertical station velocities and their associated uncertainties. The coordinate time series of 21 CGPS stations in the UK and France are analysed, eight of which are co-located with tide gauges. The effects of two processing strategies and two realizations of the ITRS on the coordinate time series are investigated. Filtered coordinate time series are obtained by application of a regional filtering technique. Station velocity estimates are obtained by fitting a model including a linear term, an annual term and offsets to the unfiltered and filtered coordinate time series. Realistic uncertainties for these velocities are obtained from the application of two empirical methods which account for coloured noise in the coordinate time series; results from these are compared to Maximum Likelihood Estimation (MLE), which allows a more rigorous and accurate simultaneous estimation of the model parameters and their uncertainties. Strategies for coordinate time series analysis on a daily or monthly, and annual or bi-annual, basis are defined. At two CGPS stations the dual-CGPS station concept is tested and compared to single-baseline analysis and the application of an adaptive filter, and an empirical method to obtain filter parameters specific to a coordinate time series is described. This investigation shows that reliable relative vertical station velocity estimates can be obtained after much shorter observation spans than absolute vertical station velocity estimates. The availability of dual-CGPS station pairs allows a simplified processing strategy and a multitude of coordinate time series analysis methods, all contributing to a better understanding of the variations in the positions of CGPS stations. Vertical station velocity estimates for the unfiltered and filtered coordinate time series and different analysis strategies are compared for 17 of the CGPS stations and show disagreements of up to 2 mm/yr.

    At the eight CGPS stations co-located with or close to tide gauges, alternative estimates of vertical land/crustal movements from absolute gravimetry, geological information and glacial isostatic adjustment models are compared to the GPS estimates, and it is suggested that the latter are systematically offset. An alignment procedure is demonstrated, correcting the vertical station velocity estimates of all 17 CGPS stations for this offset. The correlation of the geology-aligned vertical station velocity estimates and the MSL records from eight tide gauges suggests changes in absolute sea level of approximately +1 mm/yr around the UK.
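
    The velocity-estimation step described above, fitting a model with a linear term, an annual term and offsets to a coordinate time series, can be sketched as an ordinary least-squares problem. The Python below uses synthetic daily heights with white noise only; as the abstract notes, realistic velocity uncertainties additionally require modelling the coloured noise in the series.

```python
# Least-squares fit of trend + annual signal + one offset to a synthetic
# daily height series, mimicking the velocity-estimation step described
# in the abstract. Synthetic data; white noise only, which is optimistic.
import numpy as np

rng = np.random.default_rng(42)
t = np.arange(0, 8, 1 / 365.25)                  # 8 years of daily epochs (yr)
truth_mm = 1.8 * t + 3.0 * np.sin(2 * np.pi * t + 0.4)
truth_mm += np.where(t > 4.0, 5.0, 0.0)          # 5 mm step (e.g. antenna change)
heights = truth_mm + rng.normal(0.0, 3.0, t.size)

# Design matrix: intercept, velocity, annual sine/cosine, offset step.
A = np.column_stack([
    np.ones_like(t), t,
    np.sin(2 * np.pi * t), np.cos(2 * np.pi * t),
    (t > 4.0).astype(float),
])
params, *_ = np.linalg.lstsq(A, heights, rcond=None)
print(f"estimated vertical velocity: {params[1]:.2f} mm/yr (truth 1.8)")
```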

    Information Security in Smart Grid Demonstration Environment

    The ever-growing population and need for energy have culminated in an energy crisis. Old, traditional energy sources are running low, and the transition to renewable ones has begun. The electric grid, however, is very old, inefficient and incapable of meeting today's needs. One solution to these problems is to utilize a two-way flow of electricity and information, also known as the Smart Grid. As the Smart Grid relies on information and communications technology, it is exposed to information security threats. The Smart Grid comprises many systems, creating a complex automation environment; even though securing it is troublesome, it is essential, since the consequences of successful attacks can be disastrous. This thesis is part of the CLEEN SHOK Smart Grids and Energy Markets project and studies the information security of the Smart Grid demonstration environment. The main goals are to analyze and test the information security of the Smart Grid implementation, and to generate a best-practice information security checklist for different players in the Smart Grid environment. The thesis is divided into four phases. The literature study focuses on the information security landscape and features, as well as the Smart Grid at a general level, and presents the conceptual model of the Smart Grid and the demonstration environment. In the analysis phase, the demonstration environment is analyzed through threat modelling and closer examination of the demonstration equipment. The threat model takes the customer's point of view, concentrating on the home energy management system and providing analysis at a high level of abstraction, whereas the examination of the equipment provides more specific analysis. In the testing phase, the demonstration environment is tested and the results are presented; this phase also includes the testing layout and introduces the software used for the testing. The final section generates the best-practice security list: a checklist of the top 10 critical information security controls for the Smart Grid environment, especially for a home automation environment. In the course of the study, it is shown that the information security of the demonstration environment has shortcomings. The most common vulnerabilities are due to incorrect software configurations and the use of vulnerable software versions. The most critical part of the demonstration environment is the end user's device, which in this study was ThereGate. This equipment has many security issues that need to be addressed; securing ThereGate is essential to the entire system's dependability and security. To secure a dependable Smart Grid, stronger methods such as strong client authentication are required. As long as standards only recommend, and do not require, information security methods such as encryption, those methods will not be used, leaving the system more vulnerable. As a result, it can be said that more security research is required in order to secure a dependable Smart Grid.
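
    A best-practice checklist of this kind lends itself to automation: each control becomes a predicate evaluated against an inventory of the environment. The toy Python below is a hypothetical illustration of that pattern; the control names, the inventory format, and the vulnerable-version table are assumptions, not the thesis's top-10 list.

```python
# Toy checklist runner: each control is a predicate over an inventory of
# the environment. Control names and inventory format are hypothetical.
inventory = {
    "tls_everywhere": True,
    "default_credentials_changed": False,
    "software_versions": {"openssl": "1.0.1f"},  # known-vulnerable example
    "client_authentication": "password",
}

KNOWN_VULNERABLE = {("openssl", "1.0.1f")}  # Heartbleed-era release

controls = {
    "Encrypt all communication links":
        lambda inv: inv["tls_everywhere"],
    "Replace default credentials":
        lambda inv: inv["default_credentials_changed"],
    "Patch known-vulnerable software":
        lambda inv: not any((n, v) in KNOWN_VULNERABLE
                            for n, v in inv["software_versions"].items()),
    "Use strong client authentication":
        lambda inv: inv["client_authentication"] in ("mutual_tls", "certificate"),
}

for name, check in controls.items():
    print(("PASS" if check(inventory) else "FAIL"), "-", name)
```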