
    Anomaly Detection in Streaming Sensor Data

    In this chapter we consider a cell phone network as a set of automatically deployed sensors that record the movement and interaction patterns of the population. We discuss methods for detecting anomalies in the streaming data produced by the cell phone network, motivating this discussion with the Wireless Phone Based Emergency Response (WIPER) system, a proof-of-concept decision support system for emergency response managers. We also discuss some of the scientific work enabled by this type of sensor data and the related privacy issues: we describe scientific studies that use the cell phone data set and the steps we have taken to ensure the security of the data. We then describe the overall decision support system and discuss three methods of anomaly detection that we have applied to the data.
    Comment: 35 pages. Book chapter to appear in "Intelligent Techniques for Warehousing and Mining Sensor Network Data" (IGI Global), edited by A. Cuzzocrea

    Towards Aggregating Time-Discounted Information in Sensor Networks

    Sensor networks are deployed to monitor a seemingly endless list of events in a multitude of application domains. Through data collection and aggregation, enhanced with data mining and machine learning techniques, sensor networks can uncover many static and dynamic patterns. The aggregation problem is complicated by the fact that the perceived value of the data collected by the sensors is affected by many factors, such as time, location, and user valuation. In addition, the value of information often deteriorates dramatically over time. Our research has already produced several results: a formal algebraic analysis of information discounting, especially discounting driven by time; a general model and two specific models of information discounting, the specific models formalizing exponential and linear time-discounting; and an algebraic analysis of the aggregation of values that decay exponentially with time, in which three types of aggregators that offset discounting effects are formalized and analyzed, and a natural synthesis of these three aggregators is discovered and modeled. We apply our theoretical models to emergency response with thresholding and confirm the results with extensive simulation. For long-term monitoring tasks, we lay a theoretical foundation for discovering an emergency through successive generations of sensors, analyse the achievability of a long-term task, and find an optimal way to distribute sensors over a monitored area to maximize that achievability. We also propose an implementation of our alert system using state-of-the-art wireless microcontrollers, sensors, real-time operating systems, and embedded Internet protocols.
By allowing aggregation of time-discounted information to proceed in an arbitrary, not necessarily pairwise, manner, our results also apply to other homeland security and military domains where there is a strong need to model not only the timely aggregation of data collected by individual sensors but also the dynamics of that aggregation. Our research can be applied to many real-world scenarios. A typical scenario is monitoring wildfire in a forest: a batch of first-generation sensors is deployed by UAVs to monitor the forest for possible wildfire. The sensors monitor various weather quantities and identify the area with the highest probability of producing a fire, the so-called area of interest (AoI). Because the environment changes dynamically, the sensors re-identify the AoI after a certain time. The value of the knowledge learned about the previous AoI decays quickly with time, and our methods for aggregating time-discounted information can be applied to obtain updated knowledge. When the current generation of sensors is close to depleting its energy, a new generation of sensors is deployed and inherits the knowledge of the current generation; in this way, long-term monitoring tasks become feasible. At the end of this thesis, we propose some extensions and directions for our current research: generalize and extend the special classes of Type 1 and Type 2 aggregation operators; analyze aggregation operators of Type 3 and Type 4 and find special applicable candidates; aggregate data across consecutive generations of sensors in order to learn about events, subject to discounting, that take a long time to manifest themselves; study the network implications of various aggregation strategies; develop algorithms implementing some special classes of aggregators; and implement a wireless sensor network that can autonomously learn and recognize patterns of emergencies, predict incidents, and trigger alarms through machine learning.
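The exponential time-discount model with threshold-based alerting described above can be sketched as follows. This is a minimal illustration, not the thesis's exact operators: the decay rate, the additive (order-independent) aggregator, and the alert threshold are all assumptions made for the sketch.

```python
import math

def discounted(value, age, rate):
    """Exponential time-discount: a reading observed `age` time units ago
    is worth value * exp(-rate * age) now."""
    return value * math.exp(-rate * age)

def aggregate(readings, now, rate):
    """Aggregate time-stamped readings, discounting each by its age.

    readings: iterable of (value, timestamp) pairs, in any order; with
    exponential discounting the sum does not depend on pairing order,
    matching the 'arbitrary, not necessarily pairwise' aggregation above.
    """
    return sum(discounted(v, now - t, rate) for v, t in readings)

def alert(readings, now, rate, threshold):
    """Emergency-response thresholding: raise an alarm when the
    discounted evidence mass is still high enough."""
    return aggregate(readings, now, rate) >= threshold
```

For example, a reading of 10.0 taken 4 time units ago contributes only 10.0 * exp(-2.0) ≈ 1.35 at rate 0.5, so stale observations of a previous AoI quickly stop influencing the alarm decision.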

    Optimisation of Mobile Communication Networks - OMCO NET

    The mini-conference “Optimisation of Mobile Communication Networks” focuses on advanced methods for search and optimisation applied to wireless communication networks. It is sponsored by the Research & Enterprise Fund of Southampton Solent University. The conference strives to widen knowledge of advanced search methods capable of optimising wireless communication networks, and aims to provide a forum for the exchange of recent knowledge, new ideas, and trends in this progressive and challenging area. It will popularise successful new approaches to resolving hard tasks such as minimisation of transmit power and cooperative and optimal routing.

    Energieeffiziente und rechtzeitige Ereignismeldung mittels drahtloser Sensornetze (Energy-Efficient and Timely Event Reporting Using Wireless Sensor Networks)

    This thesis investigates the suitability of state-of-the-art protocols for large-scale and long-term environmental event monitoring using wireless sensor networks, based on the application scenario of early forest-fire detection. By a suitable combination of energy-efficient protocol mechanisms, a novel communication protocol, referred to as the cross-layer message-merging protocol (XLMMP), is developed. Qualitative and quantitative protocol analyses confirm that XLMMP is particularly suitable for this application area. The quantitative analysis is mainly based on finite-source retrial queues with multiple unreliable servers. While this queueing model is widely applicable in various research areas even beyond communication networks, this thesis is the first to determine the distribution of the response time in this model. The model evaluation is mainly carried out using Markovian analysis and the method of phases. The quantitative results show that XLMMP is a feasible basis for designing scalable wireless sensor networks that (1) may comprise hundreds of thousands of tiny sensor nodes with reduced node complexity, (2) are suitable for monitoring an area of tens of square kilometers, and (3) achieve a lifetime of several years. The deduced quantifiable relationships between key network parameters (e.g., node size, node density, size of the monitored area, desired lifetime, and the maximum end-to-end communication delay) enable application-specific optimization of the protocol.
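As a rough illustration of the finite-source retrial queue underlying the quantitative analysis, the following Monte-Carlo sketch estimates the long-run fraction of time all servers are busy. It is a simplification under stated assumptions: all rates are exponential, and the servers are reliable, whereas the thesis additionally models multiple unreliable servers and derives the response-time distribution analytically.

```python
import random

def retrial_busy_fraction(N=10, c=2, lam=1.0, mu=2.0, theta=1.5,
                          horizon=10_000.0, seed=42):
    """Gillespie-style simulation of a finite-source retrial queue.

    N sources each generate a request at rate lam while idle; c servers
    serve at rate mu each; a request that finds all servers busy joins
    an 'orbit' and retries at rate theta.  Returns the fraction of time
    all c servers are busy (servers assumed reliable in this sketch).
    """
    rng = random.Random(seed)
    busy, orbit = 0, 0
    t, all_busy_time = 0.0, 0.0
    while t < horizon:
        idle = N - busy - orbit                 # sources free to generate
        rates = (idle * lam, busy * mu, orbit * theta)
        total = sum(rates)                      # always > 0 for N >= 1
        dt = rng.expovariate(total)
        if busy == c:
            all_busy_time += min(dt, horizon - t)
        t += dt
        u = rng.random() * total
        if u < rates[0]:                        # fresh arrival
            if busy < c:
                busy += 1
            else:
                orbit += 1
        elif u < rates[0] + rates[1]:           # service completion
            busy -= 1
        else:                                   # retry from orbit
            if busy < c:                        # retry succeeds...
                busy += 1
                orbit -= 1                      # ...otherwise it stays
    return all_busy_time / horizon
```

Such a simulation is useful mainly as a sanity check on the Markovian analysis: the analytical model gives exact distributions, while the simulation confirms them at specific parameter points.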

    Modelling of extreme data in a wireless sensor network through the application of random field theory

    Wireless Sensor Networks (WSNs) consist of a large number of small, simple sensor nodes with sensing, processing, and wireless transmission capabilities that monitor some physical environment. The data they collect is transmitted to an information sink, where it can be accessed by the user. Due to the vast number of nodes expected in many WSN applications, at certain times the network may have potentially huge amounts of data to transmit to the sink. Because the nodes have limited energy resources, when a huge amount of data is generated the nodes' batteries are depleted at aggressive rates as the nodes try to forward this data to the sink. This problem is particularly severe in regions of the network close to the sink, as nodes in these regions are responsible for routing data from large areas of the network. This phenomenon is often described as a "data implosion" around the sink. We develop a model of the node data in a wireless sensor network based on a stochastic model of the underlying phenomenon being observed by the network. The model is based on a stationary Gaussian random field, and we use it to study the size and spatial distribution of the sets of nodes that observe statistically high data. This knowledge is exploited to ameliorate the data-implosion problem: effectively, we implement a data suppression scheme in which only nodes that sense statistically high data attempt to transmit to the sink. Further, we use our model to study network data that belongs to a given contour level and show that we can achieve additional suppression by transmitting node data only if it belongs to some predefined contour level. Finally, we show how knowledge of the size and spatial distribution of statistically high node data in a WSN can be used to study the traffic in both schedule-based and contention-based MAC protocols.
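The suppression idea above can be sketched in a few lines. This is a deliberately simplified stand-in for the random-field model: "statistically high" is taken to mean more than k standard deviations above the field mean, and the mean and standard deviation are estimated from the current readings rather than from the Gaussian random field's known parameters.

```python
import statistics

def suppress(readings, k=2.0):
    """Data-suppression sketch: keep only nodes whose reading is
    'statistically high', i.e. above mean + k * stdev of the sensed
    field.  readings: dict mapping node id -> sensed value.
    Returns the subset of readings that would be transmitted to the sink.
    """
    values = list(readings.values())
    mu = statistics.mean(values)
    sigma = statistics.stdev(values)            # needs >= 2 readings
    threshold = mu + k * sigma
    return {node: v for node, v in readings.items() if v > threshold}
```

Because most nodes observe typical values, only the few nodes in the high-excursion sets of the field transmit, which directly reduces the traffic converging on the sink.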