
    Efficient Data Compression with Error Bound Guarantee in Wireless Sensor Networks

    We present a data compression and dimensionality reduction scheme for data fusion and aggregation applications to prevent data congestion and reduce energy consumption at network connecting points such as cluster heads and gateways. Our in-network approach can be easily tuned to analyze the temporal or spatial correlation of the data using an unsupervised neural network scheme, namely autoencoders. In particular, our algorithm extracts intrinsic data features from previously collected historical samples to transform the raw data into a low-dimensional representation. Moreover, the proposed framework provides an error bound guarantee mechanism. We evaluate the proposed solution on real-world data sets and compare it with traditional methods for temporal and spatial data compression. The experimental validation reveals that our approach outperforms several existing wireless sensor network data compression methods in terms of compression efficiency and signal reconstruction. Comment: ACM MSWiM 201
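The error-bound mechanism described in the abstract can be sketched as a sender-side check: encode the sample, decode it locally, and fall back to transmitting the raw reading whenever the reconstruction error would exceed the bound. This is an illustrative assumption, not the authors' implementation; the linear weight matrices stand in for a trained autoencoder layer, and all names (`compress_with_error_bound`, `eps`) are hypothetical.

```python
import numpy as np

def compress_with_error_bound(x, W_enc, b_enc, W_dec, b_dec, eps):
    """Transmit the low-dimensional code only if the reconstruction
    error stays within the bound eps; otherwise send the raw sample.
    W_enc/b_enc and W_dec/b_dec stand in for trained autoencoder
    encoder/decoder parameters (illustrative, not the paper's model)."""
    z = np.maximum(W_enc @ x + b_enc, 0.0)   # encode (ReLU code)
    x_hat = W_dec @ z + b_dec                # local reconstruction
    err = np.max(np.abs(x - x_hat))          # worst-case per-element error
    if err <= eps:
        return "code", z                     # compressed transmission
    return "raw", x                          # bound violated: send raw
```

The fallback to raw transmission is what makes the error bound a hard guarantee rather than a statistical one: the sink never sees a reconstruction worse than `eps`.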

    Finding next-to-shortest paths in a graph

    We study the problem of finding next-to-shortest paths in a graph. A next-to-shortest (u,v)-path is a shortest (u,v)-path amongst (u,v)-paths with length strictly greater than the length of the shortest (u,v)-path. In contrast to the situation in directed graphs, where the problem has been shown to be NP-hard provided edges of length zero are allowed, we prove the somewhat surprising result that there is a polynomial time algorithm for the undirected version of the problem.
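The definition can be illustrated by brute force on a toy graph: enumerate all simple (u,v)-paths, take the shortest length, then the minimum length strictly greater than it. This exhaustive sketch is exponential and is emphatically not the paper's algorithm; the paper's contribution is precisely that the undirected problem admits a polynomial-time solution without such enumeration.

```python
def all_simple_path_lengths(adj, u, v):
    """Lengths of all simple u-v paths in an undirected weighted
    graph given as adj[node] = {neighbour: edge_length, ...}."""
    lengths = []
    def dfs(node, seen, dist):
        if node == v:
            lengths.append(dist)
            return
        for nbr, w in adj[node].items():
            if nbr not in seen:
                dfs(nbr, seen | {nbr}, dist + w)
    dfs(u, {u}, 0)
    return lengths

def next_to_shortest(adj, u, v):
    """Shortest u-v path length strictly greater than the minimum,
    by exhaustive enumeration (illustrating the definition only)."""
    lengths = all_simple_path_lengths(adj, u, v)
    shortest = min(lengths)
    longer = [l for l in lengths if l > shortest]
    return min(longer) if longer else None
```

On the triangle a-b-c with unit edges, the shortest (a,b)-path has length 1 and the next-to-shortest (a-c-b) has length 2.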

    Fast Gr\"obner Basis Computation for Boolean Polynomials

    We introduce the Macaulay2 package BooleanGB, which computes a Gr\"obner basis for Boolean polynomials using a binary representation rather than a symbolic one. We compare the runtime on several Boolean models from systems biology and give an application to Sudoku.
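One way to see why a binary representation pays off: in the Boolean ring, x_i^2 = x_i, so a monomial is just a set of variables (a bitmask, with OR as monomial product) and coefficients live in GF(2) (so equal monomials cancel under addition). The sketch below shows polynomial arithmetic under those rules; it is an illustrative assumption about the general technique, not the BooleanGB package's actual data structures or API.

```python
def mono_mul(m1, m2):
    # With x_i^2 = x_i, multiplying monomials is the union of
    # their variable sets: a single bitwise OR on the masks.
    return m1 | m2

def poly_add(p, q):
    # Coefficients are in GF(2): equal monomials cancel, so
    # addition is the symmetric difference of the mask sets.
    return p ^ q

def poly_mul(p, q):
    """Product of two Boolean polynomials, each represented as a
    frozenset of monomial bitmasks."""
    out = set()
    for a in p:
        for b in q:
            out ^= {mono_mul(a, b)}   # XOR-insert handles cancellation
    return frozenset(out)
```

For example, with x = 0b01 and y = 0b10, (x + y)^2 = x + y: the cross terms xy appear twice and cancel over GF(2), and the squares collapse back to x and y.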

    Tracing Execution of Software for Design Coverage

    Test suites are designed to validate the operation of a system against requirements. One important aspect of test suite design is to ensure that the system's operation logic is tested completely. A test suite should drive a system through all abstract states to exercise all possible cases of its operation. This is a difficult task. Code coverage tools support test suite designers by providing information about which parts of the source code are covered during system execution. Unfortunately, code coverage tools produce only source code coverage information. For a test engineer it is often hard to understand what the noncovered parts of the source code do and how they relate to requirements. We propose a generic approach that provides design coverage of the executed software, simplifying the development of new test suites. We demonstrate our approach on common design abstractions such as statecharts, activity diagrams, message sequence charts and structure diagrams. We implement design coverage using the Third Eye tracing and trace analysis framework. Using design coverage, test suites can be created faster by focusing on untested design elements. Comment: Short version of this paper to be published in Proceedings of the 16th IEEE International Conference on Automated Software Engineering (ASE 2001). 13 pages, 9 figures
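The core of design coverage as described here is a diff between design elements and trace events: which transitions declared in the design model were actually exercised during test execution. A minimal sketch, assuming a statechart model flattened to transition tuples (the tuple format and function name are hypothetical; the abstract does not show the Third Eye framework's API):

```python
def design_coverage(design_transitions, trace):
    """Diff a statechart-level design model against an execution trace.

    design_transitions: {(src_state, event, dst_state), ...} declared
    in the design model.
    trace: ordered list of transition events recorded during test
    execution, e.g. by an instrumented system (illustrative format).
    Returns (covered, uncovered) sets of design transitions.
    """
    covered = set(trace) & design_transitions
    uncovered = design_transitions - covered
    return covered, uncovered
```

Reporting `uncovered` in design terms (states and events) rather than source lines is what lets a test engineer map gaps back to requirements and target new test cases.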

    Synthesis of Suffix Trees

    The implications of probabilistic communication have been far-reaching and pervasive. Given the current status of flexible models, leading analysts particularly desire the simulation of sensor networks, which embodies the intuitive principles of cryptography. We propose a novel method for the evaluation of Smalltalk, which we call WAE.

    State-of-the-Art and Comparative Review of Adaptive Sampling Methods for Kriging

    Metamodels aim to approximate characteristics of functions or systems from the knowledge extracted from only a finite number of samples. In recent years, kriging has emerged as a widely applied metamodeling technique for resource-intensive computational experiments. However, its prediction quality is highly dependent on the size and distribution of the given training points. Hence, in order to build proficient kriging models with as few samples as possible, adaptive sampling strategies have gained considerable attention. These techniques aim to find pertinent points in an iterative manner based on information extracted from the current metamodel. A review of adaptive schemes for kriging proposed in the literature is presented in this article. The objective is to provide the reader with an overview of the main principles of adaptive techniques, and insightful details to pertinently employ available tools depending on the application at hand. In this context, commonly applied strategies are compared with regard to their characteristics and approximation capabilities. In light of these experiments, it is found that the success of a scheme depends on the features of a specific problem and the goal of the analysis. In order to facilitate the entry into adaptive sampling, a guide is provided. All experiments described herein are replicable using a provided open source toolbox. © 2020, The Author(s).
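The iterative loop common to the reviewed schemes can be sketched with one of the simplest acquisition rules, maximum predictive variance: fit the kriging model on the current samples, evaluate the expensive function where the model is most uncertain, and repeat. This is a minimal 1-D simple-kriging sketch with an RBF kernel and a fixed length scale; it is an assumed illustration of the general principle, not one of the review's benchmarked methods (which also cover acquisition criteria beyond variance, such as expected improvement or cross-validation error).

```python
import numpy as np

def rbf(a, b, length=0.3):
    """Unit-variance RBF kernel between two 1-D point sets."""
    return np.exp(-0.5 * (a[:, None] - b[None, :])**2 / length**2)

def kriging_predict(X, y, Xs, noise=1e-8):
    """Simple-kriging (zero-mean GP) mean and variance at points Xs."""
    K = rbf(X, X) + noise * np.eye(len(X))
    Ks = rbf(Xs, X)
    alpha = np.linalg.solve(K, y)
    mean = Ks @ alpha
    var = 1.0 - np.sum(Ks * np.linalg.solve(K, Ks.T).T, axis=1)
    return mean, var

def adaptive_sample(f, X, y, candidates, n_new):
    """Variance-driven adaptive sampling: repeatedly evaluate f where
    the current metamodel is most uncertain."""
    for _ in range(n_new):
        _, var = kriging_predict(X, y, candidates)
        x_new = candidates[np.argmax(var)]
        X = np.append(X, x_new)
        y = np.append(y, f(x_new))
    return X, y
```

Because sampling a point drives the predictive variance near zero in its neighbourhood, successive iterations naturally spread new points into the least-explored regions of the domain.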