
    Energy Efficient Data Acquisition in Wireless Sensor Network


    Context for Ubiquitous Data Management

    In response to the advance of ubiquitous computing technologies, we believe that for computer systems to be ubiquitous, they must be context-aware. In this paper, we address the impact of context-awareness on ubiquitous data management. To do this, we survey the characteristics of context in order to develop a clear understanding of it, as well as its implications and requirements for context-aware data management. References to recent research activities and applicable techniques are also provided.

    Decentralised Control of Adaptive Sampling in Wireless Sensor Networks

    The efficient allocation of the limited energy resources of a wireless sensor network in a way that maximises the information value of the data collected is a significant research challenge. Within this context, this paper concentrates on adaptive sampling as a means of focusing a sensor's energy consumption on obtaining the most important data. Specifically, we develop a principled information metric based upon Fisher information and Gaussian process regression that allows the information content of a sensor's observations to be expressed. We then use this metric to derive three novel decentralised control algorithms for information-based adaptive sampling which represent a trade-off between computational cost and optimality. These algorithms are evaluated in the context of a deployed sensor network in the domain of flood monitoring. The most computationally efficient of the three is shown to increase the value of information gathered by approximately 83%, 27%, and 8% per day compared to benchmarks that sample in a naive non-adaptive manner, in a uniform non-adaptive manner, and using a state-of-the-art adaptive sampling heuristic (USAC), respectively. Moreover, our algorithm collects information whose total value is approximately 75% of the optimal solution (which requires an exponential, and thus impractical, amount of time to compute).
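
    The abstract above summarises rather than specifies the algorithms, but the underlying idea (score candidate observation times by how much they shrink a Gaussian process's predictive uncertainty, then spend a limited sampling budget on the highest-scoring times) can be sketched. Below is a minimal, illustrative Python sketch: the RBF kernel, the greedy variance-reduction heuristic, and every parameter value are assumptions for the example, not the paper's decentralised algorithms.

```python
import numpy as np

def rbf_kernel(a, b, length_scale=1.0, sigma_f=1.0):
    """Squared-exponential covariance between two sets of 1-D inputs."""
    d = a[:, None] - b[None, :]
    return sigma_f ** 2 * np.exp(-0.5 * (d / length_scale) ** 2)

def gp_posterior_variance(x_obs, x_query, noise=0.1, **kern):
    """GP predictive variance at x_query after observing at x_obs.
    Only the sampling locations matter for the variance, not the values."""
    K = rbf_kernel(x_obs, x_obs, **kern) + noise ** 2 * np.eye(len(x_obs))
    Ks = rbf_kernel(x_query, x_obs, **kern)
    Kss = rbf_kernel(x_query, x_query, **kern)
    return np.diag(Kss - Ks @ np.linalg.solve(K, Ks.T))

def greedy_schedule(candidates, budget, **kern):
    """Greedily pick `budget` sample times that most reduce total
    posterior variance -- a crude stand-in for an information metric."""
    chosen = []
    for _ in range(budget):
        best_t, best_gain = None, -np.inf
        for t in candidates:
            if t in chosen:
                continue
            var = gp_posterior_variance(np.array(chosen + [t]), candidates, **kern)
            gain = -var.sum()          # less remaining variance = more information
            if gain > best_gain:
                best_t, best_gain = t, gain
        chosen.append(best_t)
    return sorted(chosen)

day = np.linspace(0.0, 24.0, 49)       # candidate sample times (hours)
print(greedy_schedule(day, budget=6, length_scale=3.0))
```

    For a Gaussian observation model, reducing posterior variance is closely related to increasing Fisher information about the latent field, which is why this crude greedy scheduler serves as a single-node analogue of the kind of metric the paper describes.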

    Certainty of outlier and boundary points processing in data mining

    Data certainty is an issue in real-world applications, caused by unwanted noise in data, and it has recently received growing attention. We propose a new method based on neutrosophic set (NS) theory to detect boundary and outlier points, the challenging points for clustering methods. First, a certainty value is assigned to each data point based on the proposed definition in NS. Then, a certainty set is constructed for the proposed cost function in the NS domain by considering a set of main clusters and a noise cluster. The proposed cost function is then minimised by gradient descent, and data points are clustered based on their membership degrees: outlier points are assigned to the noise cluster, while boundary points are assigned to main clusters with almost equal membership degrees. To show the effectiveness of the proposed method, two types of datasets are used: three scatter-type datasets and four UCI datasets. Results demonstrate that the proposed cost function handles boundary and outlier points with more accurate membership degrees and outperforms existing state-of-the-art clustering methods.
    Comment: conference paper, 6 pages.
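
    The neutrosophic cost function itself is not reproduced in the abstract, so the sketch below substitutes a classical stand-in with the same mechanics: fuzzy c-means memberships augmented with one noise cluster held at a fixed distance (Dave-style noise clustering), after which points are labelled outliers when the noise membership dominates and boundary points when their two largest main-cluster memberships are nearly equal. All names, the closed-form membership update (used here in place of the paper's gradient descent), and parameter values are assumptions.

```python
import numpy as np

def noise_clustering(X, k, delta=2.0, m=2.0, iters=50, seed=0):
    """Fuzzy c-means with one extra 'noise' cluster at fixed distance delta.
    Returns cluster centers and an (n, k+1) membership matrix whose last
    column is the noise cluster."""
    rng = np.random.default_rng(seed)
    centers = X[rng.choice(len(X), k, replace=False)]
    for _ in range(iters):
        d = np.linalg.norm(X[:, None] - centers[None], axis=2)   # (n, k)
        d = np.hstack([d, np.full((len(X), 1), delta)])          # noise column
        inv = (1.0 / np.maximum(d, 1e-9)) ** (2.0 / (m - 1.0))
        u = inv / inv.sum(axis=1, keepdims=True)                 # memberships
        w = u[:, :k] ** m
        centers = (w.T @ X) / w.sum(axis=0)[:, None]             # update centers
    return centers, u

def label_points(u, boundary_gap=0.1):
    """Outlier if the noise membership dominates; boundary if the two
    largest main-cluster memberships are nearly equal; core otherwise."""
    labels = []
    for row in u:
        main, noise = row[:-1], row[-1]
        top2 = np.sort(main)[-2:]
        if noise > main.max():
            labels.append("outlier")
        elif top2[1] - top2[0] < boundary_gap:
            labels.append("boundary")
        else:
            labels.append("core")
    return labels

rng = np.random.default_rng(1)
X = np.vstack([rng.normal((0, 0), 0.3, (40, 2)),
               rng.normal((3, 3), 0.3, (40, 2)),
               [[10.0, -10.0]]])                  # one obvious outlier
centers, u = noise_clustering(X, k=2)
print(label_points(u)[-1])                        # -> outlier
```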

    The design and implementation of fuzzy query processing on sensor networks

    Sensor nodes and Wireless Sensor Networks (WSN) enable observation of the physical world at unprecedented levels of granularity. A growing number of environmental monitoring applications are being designed to leverage the data collection features of WSN, increasing the need for efficient data management techniques and for comparative analysis of various data management techniques. My research leverages aspects of fuzzy databases, specifically fuzzy data representation and fuzzy (flexible) queries, to improve upon the efficiency of existing data management techniques by exploiting the inherent uncertainty of the data collected by WSN. Herein I present my research contributions. I provide a classification of WSN middleware to illustrate varying approaches to data management for WSN, and I identify a need to better handle the uncertainty inherent in data collected from physical environments and to take advantage of the imprecision of the data to increase the efficiency of WSN by requiring that less information be transmitted to adequately answer queries posed by WSN monitoring applications. In this dissertation, I present a novel approach to querying WSN in which semantic knowledge about sensor attributes is represented as fuzzy terms. I present an enhanced simulation environment that supports more flexible and realistic analysis by using cellular automata models to separately model the deployed WSN and the underlying physical environment. Simulation experiments are used to evaluate my fuzzy query approach for environmental monitoring applications. My analysis shows that fuzzy queries improve upon other data management techniques by reducing the amount of data that must be collected to accurately satisfy application requests. This reduction in data transmission results in increased battery life within sensors, an important measure of cost and performance for WSN applications.
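
    As an illustration of how fuzzy terms can cut transmissions, the sketch below gives linguistic terms over a temperature attribute trapezoidal membership functions and has a node evaluate the query locally, reporting only readings that clear an alpha-cut. The term vocabulary, trapezoid shapes, and threshold are hypothetical, not the representation used in the dissertation.

```python
import numpy as np

def trapezoid(x, a, b, c, d):
    """Trapezoidal membership function: rises over [a, b], is 1 on
    [b, c], and falls over [c, d]."""
    return float(np.clip(min((x - a) / (b - a), (d - x) / (d - c)), 0.0, 1.0))

# Hypothetical fuzzy vocabulary for a temperature attribute (degrees C).
TERMS = {
    "cold": lambda t: trapezoid(t, -50, -40, 5, 12),
    "warm": lambda t: trapezoid(t, 8, 15, 24, 30),
    "hot":  lambda t: trapezoid(t, 25, 32, 60, 70),
}

def should_transmit(reading, term, alpha=0.5):
    """A node answers the fuzzy query 'temperature IS <term>' locally and
    reports only readings whose membership clears the alpha-cut, so
    irrelevant samples never touch the radio."""
    return TERMS[term](reading) >= alpha

readings = [3.0, 18.0, 27.0, 34.0]
print([r for r in readings if should_transmit(r, "hot")])   # -> [34.0]
```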

    Wireless Data Acquisition for Edge Learning: Data-Importance Aware Retransmission

    By deploying machine-learning algorithms at the network edge, edge learning can leverage the enormous real-time data generated by billions of mobile devices to train AI models that enable intelligent mobile applications. In this emerging research area, one key direction is to efficiently utilize radio resources for wireless data acquisition so as to minimize the latency of executing a learning task at an edge server. Along this direction, we consider the specific problem of deciding on retransmission in each communication round so as to ensure both the reliability and the quantity of the training data, thereby accelerating model convergence. To solve the problem, a new retransmission protocol called data-importance aware automatic-repeat-request (importance ARQ) is proposed. Unlike classic ARQ, which focuses merely on reliability, importance ARQ selectively retransmits a data sample based on its uncertainty, which reflects how much the sample helps learning and can be measured using the model under training. Underpinning the proposed protocol is an elegant communication-learning relation derived between two corresponding metrics: signal-to-noise ratio (SNR) and data uncertainty. This relation facilitates the design of a simple threshold-based policy for importance ARQ. The policy is first derived for the classic classifier model of the support vector machine (SVM), where the uncertainty of a data sample is measured by its distance to the decision boundary. The policy is then extended to the more complex model of convolutional neural networks (CNN), where data uncertainty is measured by entropy. Extensive experiments have been conducted for both the SVM and CNN using real datasets with balanced and imbalanced distributions. Experimental results demonstrate that importance ARQ effectively copes with channel fading and noise in wireless data acquisition to achieve faster model convergence than conventional channel-aware ARQ.
    Comment: this is an updated version: 1) extension to general classifiers; 2) consideration of imbalanced classification in the experiments. Submitted to an IEEE journal for possible publication.
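
    A rough sketch of a threshold-based importance-ARQ policy follows. It assumes, purely for illustration, that the required SNR grows linearly with the sample's measured uncertainty; the SVM-margin and entropy uncertainty measures follow the abstract, while the threshold form, the channel model, and all constants are invented for the example and are not the paper's derived relation.

```python
import numpy as np

def svm_uncertainty(x, w, b):
    """Uncertainty under a linear SVM: inversely proportional to the
    sample's distance from the decision boundary w.x + b = 0."""
    margin = abs(np.dot(w, x) + b) / np.linalg.norm(w)
    return 1.0 / (margin + 1e-9)

def entropy_uncertainty(probs):
    """Uncertainty under a soft classifier (e.g. a CNN): the entropy of
    its predictive distribution, in nats."""
    p = np.clip(np.asarray(probs), 1e-12, 1.0)
    return float(-(p * np.log(p)).sum())

def importance_arq(receive, uncertainty, snr_floor=2.0, max_tx=8):
    """Retransmit until the received SNR clears a threshold that grows
    with the sample's uncertainty: uncertain (important) samples demand
    a cleaner copy, unimportant ones are accepted early. Returns the
    number of transmissions used."""
    threshold = snr_floor * uncertainty          # assumed linear relation
    for attempt in range(1, max_tx + 1):
        if receive() >= threshold:               # one (re)transmission
            return attempt
    return max_tx

rng = np.random.default_rng(0)
borderline = entropy_uncertainty([0.55, 0.45])   # near-ambiguous prediction
confident = entropy_uncertainty([0.99, 0.01])    # confident prediction
channel = lambda: rng.exponential(1.0)           # toy fading-channel SNR draw
print(importance_arq(channel, borderline), importance_arq(channel, confident))
```

    The intended behaviour is visible in the last line: the borderline sample typically consumes several transmissions before its higher threshold is met, while the confident one is usually accepted on the first attempt.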