
    Performance assessment of real-time data management on wireless sensor networks

    Technological advances in recent years have allowed Wireless Sensor Networks (WSNs), which aim at environmental monitoring and data collection, to reach maturity. This sort of network is composed of hundreds, thousands or even millions of tiny smart computers known as wireless sensor nodes, which may be battery powered and equipped with sensors, a radio transceiver, a Central Processing Unit (CPU) and some memory. However, due to their small size and the requirement for low-cost nodes, sensor node resources such as processing power, storage and especially energy are very limited. Once the sensors perform their measurements of the environment, the problem of storing and querying the data arises. In fact, the sensors have restricted storage capacity and the ongoing interaction between sensors and environment results in huge amounts of data. Techniques for data storage and querying in WSNs can be based on either external or local storage. External storage, called the warehousing approach, is a centralized scheme in which the data gathered by the sensors are periodically sent to a central database server where user queries are processed. Local storage, on the other hand, called the distributed approach, exploits the computational capabilities of the sensors, which act as local databases; data may then reside both in a central database server and in the devices themselves, enabling one to query both. WSNs are used in a wide variety of applications, which may perform various operations on the collected sensor data. For certain applications, such as real-time applications, the sensor data must closely reflect the current state of the targeted environment. However, the environment changes constantly and the data is collected at discrete moments in time. As such, the collected data has a temporal validity: as time advances it becomes less accurate, until it no longer reflects the state of the environment. Thus, applications such as industrial automation, aviation and sensor networks must query and analyze the data within a bounded time in order to make decisions and react efficiently. In this context, the design of efficient real-time data management solutions is necessary to deal with both time constraints and energy consumption. This thesis studies real-time data management techniques for WSNs. In particular, it focuses on the challenges of handling real-time data storage and querying in WSNs and on efficient real-time data management solutions for WSNs. First, the main specifications of real-time data management are identified and the real-time data management solutions for WSNs available in the literature are presented. Secondly, in order to provide an energy-efficient real-time data management solution, the techniques used to manage data and queries in WSNs based on the distributed paradigm are studied in depth. Many research works argue that the distributed approach is the most energy-efficient way of managing data and queries in WSNs, rather than warehousing; in addition, it can provide quasi real-time query processing because the most current data is retrieved from the network. Thirdly, based on these two studies and considering the complexity of developing, testing and debugging this kind of complex system, a model for a simulation framework for real-time database management on WSNs using the distributed approach, together with its implementation, is proposed. This helps to explore various real-time database techniques for WSNs before deployment, saving both time and money. Moreover, the proposed model can be improved by adding the simulation of protocols or by porting part of this simulator to another available simulator. To validate the model, a case study considering both real-time and energy constraints is discussed. Fourth, a new architecture that combines statistical modeling techniques with the distributed approach, together with a query processing algorithm that optimizes real-time user query processing, is proposed. This combination enables a query processing algorithm based on admission control that uses the error tolerance and the probabilistic confidence interval as admission parameters. Experiments on real-world as well as synthetic data sets demonstrate that the proposed solution optimizes real-time query processing, saving energy while keeping latency low.
    Fundação para a Ciência e Tecnologia
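    As an illustrative sketch only (not code from the thesis), the admission-control idea above can be expressed as follows: a query is answered from a per-sensor statistical model whenever the model's confidence interval fits the user's error tolerance, and otherwise falls back to an energy-costly in-network query. All class, function and parameter names are assumptions.

    import math

    class ModelBasedQueryProcessor:
        """Sketch: answer a query from a per-sensor statistical model when the
        model's confidence interval fits the error tolerance; otherwise fall
        back to an in-network query (hypothetical names throughout)."""

        def __init__(self, z=1.96):
            self.z = z            # z-score for the confidence level (1.96 ~ 95%)
            self.models = {}      # sensor_id -> (mean, variance, sample_count)

        def update_model(self, sensor_id, readings):
            n = len(readings)
            mean = sum(readings) / n
            var = sum((r - mean) ** 2 for r in readings) / max(n - 1, 1)
            self.models[sensor_id] = (mean, var, n)

        def answer_query(self, sensor_id, error_tolerance, probe_sensor):
            mean, var, n = self.models[sensor_id]
            half_width = self.z * math.sqrt(var / n)  # CI half-width of the model
            if half_width <= error_tolerance:
                # Admitted: the model answer is precise enough, no radio traffic.
                return mean, half_width
            # Rejected: spend energy on an in-network query for fresh data.
            return probe_sensor(sensor_id), 0.0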

    Evaluating ITER remote handling middleware concepts

    Remote maintenance activities in ITER will be performed by a unique set of hardware systems, supported by an extensive software kit. A layer of middleware will manage and control a complex set of interconnections between teams of operators, hardware devices in various operating theatres, and databases managing tool and task logistics. The middleware is driven by constraints on the volume and timing of data such as real-time control loops, camera images, and database access. The Remote Handling Study Centre (RHSC), located at FOM institute DIFFER, has a 4-operator work cell in an ITER-relevant RH Control Room setup which connects to a virtual hot cell back-end. The centre is developing and testing flexible integration of the Control Room components, resulting in proof-of-concept tests of this middleware layer. Software components studied include generic human-machine interface software, a prototype of an RH operations management system, and a distributed virtual reality system supporting multi-screen, multi-actor, and multiple independent views. Real-time rigid body dynamics and contact interaction simulation software supports simulation of structural deformation, “augmented reality” operations and operator training. The paper presents generic requirements and conceptual design of the middleware components and the Operations Management System in the context of an RH Control Room work cell. The simulation software is analyzed for real-time performance and it is argued that it is critical for the middleware to have complete control over the physical network in order to guarantee bandwidth and latency to the components.
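    As a hedged illustration of the kind of deadline-aware scheduling such a middleware layer implies (not the actual ITER or RHSC design), the sketch below assigns each traffic class a latency budget and dispatches messages earliest-deadline-first; the class names and budget values are assumptions.

    import heapq
    import time

    # Latency budgets per traffic class (assumed values, not ITER figures).
    LATENCY_BUDGET_S = {"control": 0.002, "camera": 0.040, "database": 1.0}

    class Dispatcher:
        """Earliest-deadline-first dispatch of middleware messages."""

        def __init__(self):
            self._queue = []   # (absolute deadline, sequence number, payload)
            self._seq = 0

        def submit(self, traffic_class, payload):
            deadline = time.monotonic() + LATENCY_BUDGET_S[traffic_class]
            heapq.heappush(self._queue, (deadline, self._seq, payload))
            self._seq += 1

        def next_message(self):
            # Control-loop traffic naturally preempts bulk database traffic
            # because its deadline is always nearer, without starving it.
            return heapq.heappop(self._queue)[2] if self._queue else None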

    CERN Storage Systems for Large-Scale Wireless

    The project aims at evaluating the use of the CERN computing infrastructure for next-generation sensor network data analysis. The proposed system allows the simulation of a large-scale sensor array for traffic analysis, streaming data to CERN storage systems in an efficient way. The data are made available for offline and quasi-online analysis, enabling both long-term planning and fast reaction to changes in the environment.
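    A minimal, generic sketch of the buffering pattern described above (not the actual CERN pipeline): simulated sensor readings are accumulated and flushed in batches, so the same data can serve quasi-online consumers promptly while remaining available for offline analysis. The file layout and names are assumptions.

    import json
    import pathlib
    import time

    class BatchWriter:
        """Buffer readings and flush them in batches to a storage directory
        (generic sketch; any JSON-serializable reading works)."""

        def __init__(self, out_dir, batch_size=1000):
            self.out_dir = pathlib.Path(out_dir)
            self.out_dir.mkdir(parents=True, exist_ok=True)
            self.batch_size = batch_size
            self.buffer = []

        def append(self, reading):
            self.buffer.append(reading)
            if len(self.buffer) >= self.batch_size:
                self.flush()

        def flush(self):
            if not self.buffer:
                return
            name = "batch-%d.json" % int(time.time() * 1000)
            (self.out_dir / name).write_text(json.dumps(self.buffer))
            self.buffer = []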

    INDEMICS: An Interactive High-Performance Computing Framework for Data Intensive Epidemic Modeling

    We describe the design and prototype implementation of Indemics (Interactive Epidemic Simulation), a modeling environment utilizing high-performance computing technologies for supporting complex epidemic simulations. Indemics can support policy analysts and epidemiologists interested in planning and control of pandemics. Indemics goes beyond traditional epidemic simulations by providing a simple and powerful way to represent and analyze policy-based as well as individual-based adaptive interventions. Users can also stop the simulation at any point, assess the state of the simulated system, and add additional interventions. Indemics is available to end-users via a web-based interface. Detailed performance analysis shows that Indemics greatly enhances the capability and productivity of simulating complex intervention strategies with a marginal decrease in performance. We also demonstrate how Indemics was applied in some real case studies where complex interventions were implemented.
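    The interaction pattern described above can be sketched as follows; the SIR-style update and the transmission-dampening intervention rule are illustrative assumptions, not the Indemics engine or its API.

    def sir_step(s):
        # Discrete-time SIR update (illustrative, not the Indemics model).
        new_inf = s["beta"] * s["susceptible"] * s["infected"] / s["population"]
        new_rec = s["gamma"] * s["infected"]
        return {**s,
                "susceptible": s["susceptible"] - new_inf,
                "infected": s["infected"] + new_inf - new_rec,
                "recovered": s["recovered"] + new_rec}

    def run_interactive(state, days, interventions):
        history = [dict(state)]
        for day in range(days):
            state = sir_step(state)
            # Adaptive interventions: state-dependent rules applied between steps.
            for trigger, action in interventions:
                if trigger(state, day):
                    state = action(state)
            history.append(dict(state))
        return history

    # Example rule: dampen transmission once prevalence exceeds 5 per cent.
    interventions = [(lambda s, d: s["infected"] / s["population"] > 0.05,
                      lambda s: {**s, "beta": s["beta"] * 0.6})]
    state = {"population": 10000, "susceptible": 9990, "infected": 10,
             "recovered": 0, "beta": 0.3, "gamma": 0.1}
    history = run_interactive(state, 120, interventions)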

    Delivering building simulation information via new communication media

    Often, the goal of understanding how the building works and the impact of design decisions is hampered by limitations in the presentation of performance data. Contemporary results displays are often constrained to what was considered good practice some decades ago rather than presented in ways that preserve the richness of the underlying data. This paper reviews a framework for building simulation support that addresses these presentation limitations as well as making a start on issues related to distributed team working. The framework uses tools and communication protocols that enable concurrent information sharing and provide a richer set of options for understanding complex performance relationships.

    ERIGrid Holistic Test Description for Validating Cyber-Physical Energy Systems

    Smart energy solutions aim to modify and optimise the operation of existing energy infrastructure. Such cyber-physical technology must be mature before deployment to the actual infrastructure, and competitive solutions will have to be compliant with standards still under development. Achieving this technology readiness and harmonisation requires reproducible experiments and appropriately realistic testing environments. Such testbeds for multi-domain cyber-physical experiments are complex in and of themselves. This work addresses a method for the scoping and design of experiments where both the testbed and the solution require detailed expertise. This empirical work first revisited present test description approaches, developed a new description method for cyber-physical energy systems testing, and matured it by means of user involvement. The new Holistic Test Description (HTD) method facilitates the conception, deconstruction and reproduction of complex experimental designs in the domains of cyber-physical energy systems. This work develops the background and motivation, offers a guideline and examples for the proposed approach, and summarises experience from three years of its application. This work received funding from the European Community’s Horizon 2020 Program (H2020/2014–2020) under project “ERIGrid” (Grant Agreement No. 654113).
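    As an illustrative sketch (not the normative HTD template), a holistic test description can be captured as a small machine-readable structure holding the test objective, the object under test, the domains involved, and the test criteria; all field names and values below are assumptions.

    from dataclasses import dataclass, field

    @dataclass
    class TestCase:
        name: str
        test_objective: str
        object_under_test: str
        domains: list = field(default_factory=list)        # e.g. electrical, ICT
        test_criteria: list = field(default_factory=list)  # pass/fail conditions

    @dataclass
    class Experiment:
        test_case: TestCase
        testbed: str
        parameters: dict = field(default_factory=dict)

    tc = TestCase(
        name="Voltage control interoperability",
        test_objective="Controller keeps feeder voltage within the allowed band",
        object_under_test="Distributed voltage controller",
        domains=["electrical power", "ICT"],
        test_criteria=["voltage within 0.95-1.05 p.u. for 95% of samples"],
    )
    exp = Experiment(test_case=tc, testbed="example laboratory", parameters={"feeders": 3})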