
    Congestion control in wireless sensor networks

    Information-sensing and data-forwarding in Wireless Sensor Networks (WSN) often incur high traffic demands, especially during event detection and concurrent transmissions. Managing such large amounts of data remains a considerable challenge in resource-limited systems like WSN, which typically follow a many-to-one transmission model. The result is often persistent buffer overload, or congestion, which degrades performance to the point of collapsing an entire network. The work herein seeks to circumvent congestion and its negative effects in WSN and derivative platforms such as Body Sensor Networks (BSN). The recent proliferation of WSN has emphasized the need for high Quality-of-Service (QoS) in applications involving real-time and remote monitoring systems such as home automation, military surveillance, environmental hazard detection, as well as BSN-based healthcare and assisted-living systems. Nevertheless, nodes in WSN are often resource-starved as data converges and causes congestion at critical points in such networks. Although this has been a primary concern within the WSN field, elementary issues such as fairness and reliability that relate directly to congestion are still under-served. Moreover, preventing the loss of important packets and avoiding packet entrapment in certain network areas remain salient avenues of research. Such issues motivate this thesis and lead to four research concerns: (i) reduction of high traffic volumes; (ii) optimization of selective packet discarding; (iii) avoidance of infected areas; and (iv) collision avoidance with packet-size optimization. Addressing these areas would provide high QoS levels and pave the way for seamless transmissions in WSN. Accordingly, the first chapter attempts to reduce the amount of network traffic during simultaneous data transmissions, using a rate-limiting technique known as Relaxation Theory (RT). The goal is a substantial reduction of the otherwise large data streams that cause buffer overflows. Experimentation and analysis with Network Simulator 2 (NS-2) show substantial improvements in performance, leading to our belief that RT-MMF can cope with high incoming traffic and thus avoid congestion. Whilst limiting congestion is a primary objective, this thesis also addresses subsequent issues, especially in worst-case scenarios where congestion is inevitable. The second research question aims at minimizing the loss of important packets crucial to data interpretation at end-systems. This is achieved by integrating selective packet discarding with a Multi-Objective Optimization (MOO) function, contributing to effective resource usage and an optimized system. A scheme was also developed to detour packet transmissions when nodes become infected. Extensive evaluations demonstrate that incoming packets are successfully delivered to their destinations despite the presence of infected nodes. The final research question addresses packet collisions in a shared wireless medium using distributed collision control that takes packet sizes into consideration. Performance evaluation and analysis reveal desirable performance resulting from a strong consideration of packet sizes and the effect of different Bit Error Rates (BERs)
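
    As a rough illustration of the rate-limiting idea, the sketch below computes a max-min fair allocation of a bottleneck's capacity among converging flows. It is a generic "water-filling" sketch under assumed capacity and demand values, not the thesis's RT-MMF algorithm.

```python
# Generic max-min fair rate allocation ("water-filling") over a shared
# bottleneck, sketched to illustrate rate limiting at a convergence node.
# Not the thesis's RT-MMF scheme; capacity and demands are assumed values.

def max_min_fair(capacity, demands):
    """Allocate `capacity` among sources so that no source can gain rate
    without reducing the rate of a source that already receives less."""
    allocation = {src: 0.0 for src in demands}
    remaining = dict(demands)               # unsatisfied demand per source
    spare = float(capacity)
    while remaining and spare > 1e-9:
        share = spare / len(remaining)      # equal split of what is left
        spare = 0.0
        for src, need in list(remaining.items()):
            grant = min(share, need)
            allocation[src] += grant
            spare += share - grant          # unused share is redistributed
            if need - grant <= 1e-9:
                del remaining[src]          # source fully satisfied
            else:
                remaining[src] = need - grant
    return allocation

if __name__ == "__main__":
    # Example: four sensor flows converging on a 10-unit bottleneck link.
    demands = {"n1": 1.0, "n2": 2.0, "n3": 6.0, "n4": 8.0}
    print(max_min_fair(10.0, demands))
    # n1 and n2 get their full demand; n3 and n4 share the remaining capacity.
```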

    Critical Management Issues for Implementing RFID in Supply Chain Management

    The benefits of radio frequency identification (RFID) technology in the supply chain are fairly compelling. It has the potential to revolutionise the efficiency, accuracy and security of the supply chain, with a significant impact on overall profitability. A number of companies are actively involved in testing and adopting this technology, and it is estimated that the market for RFID products and services will increase significantly in the next few years. Despite this trend, there are major impediments to RFID adoption in the supply chain. While RFID systems have been around for several decades, the technology for supply chain management is still emerging. We describe many of the challenges, setbacks and barriers facing RFID implementations in supply chains, discuss the critical issues for management and offer some suggestions. In the process, we take an in-depth look at the cost, technology, standards, privacy and security, and business process reengineering issues surrounding RFID technology in supply chains

    A Survey on Energy Efficient Routing Protocols in Wireless Sensor Networks

    Energy efficiency is one of the critical issues in Wireless Sensor Networks (WSNs), since sensor devices are tiny and equipped with a limited-capacity battery. In most advanced applications, WSNs operate in very harsh areas without human supervision. Routing protocols play a significant role in energy balancing by incorporating techniques that reduce control overhead, apply proper data aggregation methods, and select feasible paths. WSNs also impose unique requirements due to their frequent topology changes and distributed nature, so one of the major concerns in the design of routing protocols for WSNs is efficient energy usage and a prolonged network lifetime. This paper discusses the issues related to energy efficiency in routing protocols of all categories. It covers the most recent routing protocols that improve energy efficiency in various application environments, and provides comprehensive details of each protocol, emphasizing its principles and exploring its advantages and limitations. These protocols are classified by network structure, communication model, topology, and QoS parameters. The paper also includes relevant comparisons with recent state-of-the-art work
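
    To make the energy-balancing idea concrete, here is a minimal sketch of residual-energy-aware next-hop selection. It is a generic illustration rather than any specific protocol from the survey; the first-order radio-cost constants and the energy-per-cost metric are assumptions.

```python
import math

# Minimal illustration of energy-aware next-hop selection: among neighbours
# that make progress toward the sink, prefer the one offering the most
# residual energy per unit of transmission cost. A generic sketch, not a
# specific protocol from the survey; the radio-cost constants are assumptions.

def tx_cost(distance, bits=1024, e_elec=50e-9, e_amp=100e-12):
    """First-order radio model: energy (J) to send `bits` over `distance` m."""
    return bits * (e_elec + e_amp * distance ** 2)

def choose_next_hop(node_pos, sink_pos, neighbours):
    """neighbours: list of (id, position, residual_energy_joules)."""
    best_id, best_score = None, -math.inf
    dist_to_sink = math.dist(node_pos, sink_pos)
    for nid, pos, energy in neighbours:
        if math.dist(pos, sink_pos) >= dist_to_sink:
            continue                            # only forward progress counts
        score = energy / tx_cost(math.dist(node_pos, pos))
        if score > best_score:
            best_id, best_score = nid, score
    return best_id

if __name__ == "__main__":
    neighbours = [("a", (10, 0), 0.8), ("b", (8, 3), 1.9), ("c", (25, 0), 2.0)]
    print(choose_next_hop((0, 0), (30, 0), neighbours))  # picks "b" here
```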

    Selecting Root Exploit Features Using Flying Animal-Inspired Decision

    Malware is an application that executes malicious activities on a computer system, including mobile devices. Among all types of malware, root exploits cause the most damage because they can run in stealth mode. A root exploit compromises the core of the operating system, known as the kernel, to bypass the Android security mechanisms. Once it resides in the kernel, it can install other types of malware on the Android device. To detect root exploits, it is important to investigate their features so that machine learning can predict them accurately. This study proposes flying-animal-inspired methods ((1) bat, (2) firefly, and (3) bee) to search automatically for the exclusive features, and then utilizes these flying-animal-inspired decision features to improve machine learning prediction. Furthermore, a boosting method (AdaBoost) strengthens the multilayer perceptron (MLP) into a stronger classifier. The evaluation shows that the best result comes from the bee search, which recorded 91.48 percent accuracy, an 82.2 percent true positive rate, and a 0.1 percent false positive rate
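
    As an illustration of how a bee-inspired search over feature subsets might look, here is a minimal sketch in which scout and recruit "bees" explore binary feature masks scored by a user-supplied fitness function (for example, classifier accuracy). The population sizes, neighbourhood moves, and toy fitness are assumptions, not the study's exact method.

```python
import random

# Illustrative sketch only: a bee-style neighbourhood search over binary
# feature masks, scored by a user-supplied fitness function. Parameters and
# the toy fitness below are assumptions, not the study's configuration.

def neighbour(mask, flips=1):
    """Return a copy of the feature mask with a few bits flipped."""
    m = mask[:]
    for i in random.sample(range(len(m)), flips):
        m[i] = 1 - m[i]
    return m

def bee_feature_search(n_features, fitness, n_scouts=10, n_elite=3,
                       n_recruits=5, iterations=30, seed=0):
    """Search for a binary feature mask maximising `fitness(mask)`."""
    random.seed(seed)
    # Scout bees start from random feature subsets.
    scouts = [[random.randint(0, 1) for _ in range(n_features)]
              for _ in range(n_scouts)]
    best_mask, best_score = None, float("-inf")
    for _ in range(iterations):
        ranked = sorted(scouts, key=fitness, reverse=True)
        new_scouts = []
        # Recruit bees search the neighbourhood of the elite sites.
        for site in ranked[:n_elite]:
            candidates = [site] + [neighbour(site) for _ in range(n_recruits)]
            new_scouts.append(max(candidates, key=fitness))
        # Remaining scouts explore fresh random subsets.
        while len(new_scouts) < n_scouts:
            new_scouts.append([random.randint(0, 1) for _ in range(n_features)])
        scouts = new_scouts
        top = max(scouts, key=fitness)
        if fitness(top) > best_score:
            best_mask, best_score = top, fitness(top)
    return best_mask, best_score

if __name__ == "__main__":
    # Toy fitness: reward masks matching a hidden "relevant" feature pattern,
    # with a small penalty for selecting many features.
    relevant = [1, 0, 1, 1, 0, 0, 1, 0]
    fit = lambda m: sum(a == b for a, b in zip(m, relevant)) - 0.1 * sum(m)
    print(bee_feature_search(len(relevant), fit))
```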

    Proceedings, MSVSCC 2012

    Proceedings of the 6th Annual Modeling, Simulation & Visualization Student Capstone Conference held on April 19, 2012 at VMASC in Suffolk, Virginia

    Cyber-Security Solutions for Ensuring Smart Grid Distribution Automation Functions

    The future generation of the electrical network is known as the smart grid. The distribution domain of the smart grid intelligently supplies electricity to end-users with the aid of decentralized Distribution Automation (DA), in which intelligent control functions are distributed and accomplished via real-time communication between the DA components. Internet-based communication via open protocols is the latest trend for decentralized DA communication. Internet communication has many benefits, but it exposes the critical infrastructure's data to cyber-security threats. Security attacks may not only make DA services unreachable but may also result in undesirable physical consequences and serious damage to the distribution network environment. Therefore, it is essential to protect DA communication against such attacks. There is no single model for securing DA communication; in fact, the security level depends on several factors such as application requirements, communication media and, of course, cost. Several smart grid security frameworks and standards are under development by different organizations, but the smart grid cyber-security field has not yet reached full maturity and is still in an early phase of its progress. Security protocols from IT and computer networks can be utilized to secure DA communication because industrial ICT standards have been designed in accordance with the Open Systems Interconnection model. Furthermore, state-of-the-art DA concepts such as the active distribution network tend to integrate process data into IT systems. This dissertation addresses cyber-security issues in the following DA functions: substation automation, feeder automation, Logic Selectivity, customer automation and Smart Metering. Real-time simulation of the distribution network, together with actual automation and data networking devices, is used to create a hardware-in-the-loop simulation and to test the aforementioned DA functions over Internet communication. This communication is secured with the following proposed cyber-security solutions. For substation automation, the dissertation develops an IEC 61850-TLS proxy and adds an OPC Unified Architecture (OPC UA) Wrapper to the Station Gateway; messages secured by Transport Layer Security (TLS) and OPC UA security protect local and remote substation communications, and data availability, the main remaining concern, is addressed by designing redundant networks. The dissertation also proposes cyber-security solutions for feeder automation and Logic Selectivity. In feeder automation, a Centralized Protection System (CPS) is proposed as the place for making decentralized feeder automation decisions, and IP security (IPsec) in tunnel mode is applied to establish a secure communication path for feeder automation messages. In Logic Selectivity, Generic Object Oriented Substation Events (GOOSE) are exchanged between the substations. First, the functional characteristics of Logic Selectivity are analyzed. Then, Layer 2 Tunneling over IPsec in transport mode is proposed to create a secure communication path for exchanging GOOSE over the Internet. Next, the communication impact on Logic Selectivity performance is investigated by measuring the jitter and latency of the GOOSE communication. Lastly, the reliability improvement achieved by Logic Selectivity is evaluated by calculating reliability indices. Customer automation is an additional extension of smart grid DA. This dissertation proposes an integration solution for the heterogeneous communication parties (TCP/IP and Controller Area Network) in the Home Area Network; the developed solution applies Secure Socket Layer (SSL) to create secured messages. The dissertation also proposes a Secondary Substation Automation Unit (SSAU) for real-time communication of low-voltage data to the metering database, with the Point-to-Point Tunneling Protocol proposed to create a secure communication path for Smart Metering data. The security analysis shows that the proposed solutions provide the security requirements (confidentiality, integrity and availability) for DA communication. Thus, communication is protected against security attacks and DA functions are ensured. In addition, CPS and SSAU are proposed to distribute intelligence at the substation level
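
    To illustrate the TLS-proxy idea for substation traffic, the sketch below wraps a plain TCP stream (such as IEC 61850/MMS traffic) in TLS on the station side and forwards it to a local device. It is only an assumed, minimal sketch; the ports, addresses, and certificate files are placeholders, not the dissertation's actual implementation.

```python
import socket
import ssl
import threading

# Minimal sketch of a station-side TLS proxy: accept TLS connections from
# remote clients and forward the decrypted byte stream to a local, plain-TCP
# automation device. Certificate paths, ports and addresses are placeholders.
LISTEN_ADDR = ("0.0.0.0", 8102)       # TLS-facing side (assumed port)
BACKEND_ADDR = ("127.0.0.1", 102)     # plain IEC 61850/MMS device (assumed)

def pump(src, dst):
    """Copy bytes one way until either side closes."""
    try:
        while True:
            data = src.recv(4096)
            if not data:
                break
            dst.sendall(data)
    finally:
        dst.close()

def handle(tls_conn):
    backend = socket.create_connection(BACKEND_ADDR)
    threading.Thread(target=pump, args=(tls_conn, backend), daemon=True).start()
    pump(backend, tls_conn)

def main():
    ctx = ssl.SSLContext(ssl.PROTOCOL_TLS_SERVER)
    ctx.load_cert_chain(certfile="proxy.crt", keyfile="proxy.key")  # assumed files
    with socket.create_server(LISTEN_ADDR) as server:
        with ctx.wrap_socket(server, server_side=True) as tls_server:
            while True:
                conn, _ = tls_server.accept()   # TLS handshake happens here
                threading.Thread(target=handle, args=(conn,), daemon=True).start()

if __name__ == "__main__":
    main()
```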

    Distributed anomaly detection models for industrial wireless sensor networks

    Wireless Sensor Networks (WSNs) are firmly established as an integral technology that enables automation and control through pervasive monitoring in many industrial applications. These range from environmental and healthcare applications to major industrial monitoring applications such as infrastructure and structural monitoring. The key features common to such applications are large amounts of data, dynamic observation environments, non-homogeneous data distributions with evolving patterns, and sensing functionality that drives data-driven control. In most industrial applications, a major requirement is also near real-time decision support. Accordingly, there is a vital need for a secure, continuous, and reliable sensing mechanism in integrated WSNs in which the integrity of the data is assured. In practice, however, WSNs are vulnerable to security attacks, faults and malfunctions due to inherent resource constraints, the openly commoditised wireless technologies employed, and naive modes of implementation. Misbehaviour resulting from such threats manifests as anomalies in the sensed data streams, critically compromising these systems. It is therefore vital that effective techniques are introduced to accurately detect anomalies and assure the integrity of the data. This research investigates such models for large-scale industrial wireless sensor networks. To achieve an anomaly detection framework that is adaptable and scalable, a hierarchical data partitioning approach with fuzzy data modelling is introduced first. In this model, unsupervised data partitioning is performed in a distributed manner by adapting fuzzy c-means clustering in an incremental model over a hierarchical node topology. It is found that non-parametric and non-probabilistic determination of anomalies can be achieved by evaluating the fuzzy membership scores and inter-cluster distances adaptively over the node hierarchy. Considering heterogeneous data distributions with evolving patterns, a granular anomaly detection model that uses an entropy criterion to dynamically partition the data is proposed next. This overcomes the issue of determining the proper number of expected clusters in a dynamic manner. In this approach the data is partitioned into cohesive regions using cumulative point-wise entropy directly. The effect of differential density distributions when relying on an entropy criterion is mitigated by introducing an average relative density measure to segregate isolated outliers prior to the partitioning. The combination of these two factors is shown to be significantly successful in determining anomalies adaptively in a fully dynamic manner. The thesis then focuses on the need for near real-time anomaly evaluation. Building upon the proposed entropy-based data partitioning model, a Point-of-View (PoV) entropy evaluation model is developed, which employs incremental rather than batch-wise data processing. Three unique points of view are introduced as the reference points over which point-wise entropy is computed, evaluating its relative change as the data streams evolve. Overall, this thesis proposes efficient unsupervised anomaly detection models that employ distributed in-network data processing for accurate determination of anomalies. The resource-constrained environment is taken into account in each of the models, with innovations made to achieve non-parametric and non-probabilistic detection
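
    As a simplified, centralised illustration of membership-based anomaly scoring with fuzzy c-means (the distributed, hierarchical, and incremental aspects of the thesis are omitted; the cluster count, fuzzifier, and flagging thresholds are assumed values):

```python
import numpy as np

# Simplified, centralised sketch of fuzzy c-means (FCM) based anomaly
# detection. The distributed, hierarchical and incremental aspects of the
# thesis are omitted; c, m, and the flagging thresholds are assumed values.

def fuzzy_c_means(X, c=2, m=2.0, iters=100, seed=0):
    """Return cluster centres and the membership matrix U (n_samples x c)."""
    rng = np.random.default_rng(seed)
    U = rng.random((len(X), c))
    U /= U.sum(axis=1, keepdims=True)          # memberships sum to 1 per point
    for _ in range(iters):
        W = U ** m
        centres = (W.T @ X) / W.sum(axis=0)[:, None]
        d = np.linalg.norm(X[:, None, :] - centres[None, :, :], axis=2) + 1e-9
        U = 1.0 / d ** (2.0 / (m - 1.0))
        U /= U.sum(axis=1, keepdims=True)
    return centres, U

def anomaly_flags(X, centres, U, dist_factor=3.0, min_membership=0.6):
    """Flag points far from every centre or fitting no cluster well, a crude
    stand-in for the membership-and-distance criteria described above."""
    d_min = np.linalg.norm(X[:, None, :] - centres[None, :, :], axis=2).min(axis=1)
    far = d_min > dist_factor * np.median(d_min)
    weak = U.max(axis=1) < min_membership
    return far | weak

if __name__ == "__main__":
    rng = np.random.default_rng(1)
    X = np.vstack([rng.normal((0, 0), 1, (200, 2)),          # two normal clusters
                   rng.normal((6, 6), 1, (200, 2)),
                   np.array([[20.0, 20.0], [22.0, -5.0]])])  # injected outliers
    centres, U = fuzzy_c_means(X, c=2)
    print("anomalous indices:", np.where(anomaly_flags(X, centres, U))[0])
```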

    Pervasive computing reference architecture from a software engineering perspective (PervCompRA-SE)

    Pervasive computing (PervComp) is one of the most challenging research topics nowadays. Its complexity exceeds that of the outdated mainframe and client-server computation models. Its systems are highly volatile, mobile, and resource-limited, and they stream large amounts of data from different sensors. In spite of these challenges, the domain entails, by default, a lengthy list of desired quality features like context sensitivity, adaptable behavior, concurrency, service omnipresence, and invisibility. Fortunately, device manufacturers have improved the enabling technologies, such as sensors, network bandwidth, and batteries, paving the road for pervasive systems with high capabilities. On the other hand, this domain has gained an enormous amount of attention from researchers ever since it was first introduced in the early 1990s. Yet, pervasive systems are still classified as visionary systems that are expected to be woven into people's daily lives. At present, PervComp systems still have no unified architecture, offer only a limited scope of context sensitivity and adaptability, and leave many essential quality features insufficiently addressed in their architectures. The reference architecture (RA), which we call PervCompRA-SE in this research, provides solutions for these problems through a comprehensive and innovative pair of business and technical architectural reference models. Both models were based on deep analytical activities and were evaluated using different qualitative and quantitative methods. In this thesis we surveyed a wide range of research projects in PervComp across various subdomains to specify our methodological approach and to identify the quality features of the PervComp domain that are most commonly found in these areas. The thesis presents a novel approach that utilizes theories from sociology, psychology, and process engineering. It analyzes the business and architectural problems in two separate chapters covering the business reference architecture (BRA) and the technical reference architecture (TRA), and the solutions to these problems are also introduced in the BRA and TRA chapters. We devised an associated comprehensive ontology with semantic meanings and measurement scales. Both the BRA and the TRA were validated throughout the course of the research and evaluated as a whole using traceability, benchmark, survey, and simulation methods. The thesis introduces a new reference architecture in the PervComp domain, developed using a novel requirements engineering method, and a novel statistical method for trade-off analysis and conflict resolution between requirements. The adaptation of activity theory, human perception theory, and process re-engineering methods to develop the BRA and the TRA proved to be very successful. Our approach of reusing the ontological dictionary to monitor system performance was also innovative. Finally, the thesis evaluation methods represent a role model for researchers on how to use both qualitative and quantitative methods to evaluate a reference architecture. Our results show that the requirements engineering process, along with the trade-off analysis, was very important in delivering PervCompRA-SE. We discovered that the invisibility feature, one of the envisioned quality features for PervComp, is demolished, and that the qualitative evaluation methods were just as important as the quantitative ones in recognizing the overall quality of the RA by machines as well as by human beings