Multihop clustering algorithm for load balancing in wireless sensor networks
The paper presents a new cluster-based routing algorithm that exploits the redundancy properties of sensor networks to address the traditional problems of load balancing and energy efficiency in WSNs. The algorithm identifies nodes whose sensing area is entirely covered by their neighbours and marks them as temporary cluster heads. It then forms two layers of multi-hop communication: a bottom layer for intra-cluster communication and a top layer for inter-cluster communication involving the temporary cluster heads. Performance studies indicate that the proposed algorithm effectively solves the load-balancing problem and is also more energy-efficient than LEACH and the enhanced version of LEACH
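The coverage-redundancy test described above can be sketched as follows: a node whose sensing disk is (approximately) covered by its neighbours' disks is redundant and may be marked as a temporary cluster head. This is an illustrative reconstruction, not the authors' implementation; the names `Node`, `covered_by_neighbours` and `mark_temporary_heads`, and the boundary-sampling approximation, are all assumptions.

```python
import math
from dataclasses import dataclass, field

@dataclass
class Node:
    nid: int
    x: float
    y: float
    radius: float
    neighbours: list = field(default_factory=list)

def sample_points(node, n=16):
    """Sample n points on the boundary of the node's sensing disk."""
    return [(node.x + node.radius * math.cos(2 * math.pi * k / n),
             node.y + node.radius * math.sin(2 * math.pi * k / n))
            for k in range(n)]

def covered_by_neighbours(node):
    """Approximate coverage test: every boundary sample of this node's
    sensing disk must lie inside at least one neighbour's disk."""
    for px, py in sample_points(node):
        if not any(math.hypot(px - m.x, py - m.y) <= m.radius
                   for m in node.neighbours):
            return False
    return True

def mark_temporary_heads(nodes):
    """Return ids of nodes whose area is covered by their neighbours."""
    return [n.nid for n in nodes if n.neighbours and covered_by_neighbours(n)]
```

Sampling the disk boundary is a common cheap stand-in for an exact geometric coverage check; an exact test would intersect the neighbour disks analytically.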
Improving the network transmission cost of differentiated web services
This paper investigates the transmission cost of Web-service messages as affected by network latency. Web services enable seamless interaction and integration of e-business applications; a Web service exposes a collection of operations that interact with the outside world over the Internet through XML messaging. Although XML describes message-related information effectively and is fairly human-readable, it significantly degrades the performance of Web services in terms of transmission cost, processing cost, and so on. This paper aims to minimise the network latency of Web-service message communication by employing pre-emptive resume scheduling. The fundamental principle of this approach is to give some messages preferential treatment over others: different priorities are assigned to distinct classes of messages, given that some messages can tolerate longer delays than others. For instance, shorter messages may be given higher priority than longer ones, or the Web service provider may give higher priority to the messages of paying subscribers
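The pre-emptive resume discipline described above can be sketched as a small event-driven scheduler: a higher-priority message interrupts the one in service, and the interrupted message later resumes from where it stopped rather than restarting. The function name, message representation and event loop below are illustrative assumptions, not the paper's model.

```python
import heapq

def schedule(messages):
    """messages: list of (arrival_time, priority, size); lower priority
    value = more urgent. Returns {message index: completion time} under
    pre-emptive resume scheduling on a single server of unit speed."""
    pending = sorted(range(len(messages)), key=lambda i: messages[i][0])
    ready = []                      # heap of (priority, arrival, index, remaining work)
    t, done, k = 0.0, {}, 0
    while k < len(pending) or ready:
        if not ready:               # server idle: jump to the next arrival
            t = max(t, messages[pending[k]][0])
        while k < len(pending) and messages[pending[k]][0] <= t:
            i = pending[k]
            heapq.heappush(ready, (messages[i][1], messages[i][0], i, messages[i][2]))
            k += 1
        prio, arr, i, rem = heapq.heappop(ready)
        # serve until completion or until the next (possibly pre-empting) arrival
        nxt = messages[pending[k]][0] if k < len(pending) else float("inf")
        run = min(rem, nxt - t)
        t += run
        if rem - run > 1e-12:
            heapq.heappush(ready, (prio, arr, i, rem - run))  # resume later
        else:
            done[i] = t
    return done
```

For example, a long low-priority message arriving at time 0 is interrupted by a short high-priority message arriving at time 1, and finishes only after the short one completes: exactly the preferential treatment the abstract describes.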
A discrete-time performance model for congestion control mechanism using queue thresholds with QoS constraints
This paper presents a new analytical framework for the congestion control of Internet traffic using a queue-threshold scheme. The framework includes two discrete-time analytical models for the performance evaluation of a threshold-based congestion control mechanism and compares performance measurements through typical numerical results. To achieve low delay along with high throughput, model I incorporates one threshold, so that the arrival rate steps down directly from λ1 to λ2 once the number of packets in the system reaches the threshold value L1; the source operates normally otherwise. Model II incorporates two thresholds, so that the arrival rate decreases linearly from λ1 to λ2 with the system contents when the number of packets in the system lies between the two thresholds L1 and L2; the source operates normally with arrival rate λ1 below threshold L1, and with arrival rate λ2 above threshold L2. In both performance models, the mean packet delay W, the probability of packet loss PL and the throughput S are obtained as functions of the thresholds and the maximum drop probability. Performance comparisons between the two models are also made through typical numerical results. The results clearly demonstrate how different load settings can provide different trade-offs between throughput, loss probability and delay to suit different service requirements
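The two threshold schemes described above can be sketched directly from the abstract: model I steps the arrival rate from λ1 to λ2 at threshold L1, while model II interpolates linearly between λ1 and λ2 for queue contents between L1 and L2. The symbols follow the abstract; the function forms are an illustrative reconstruction, not the paper's analytical models themselves.

```python
def arrival_rate_model1(n, L1, lam1, lam2):
    """Model I: one threshold. The arrival rate steps from lam1 down to
    lam2 once the number of packets n in the system reaches L1."""
    return lam1 if n < L1 else lam2

def arrival_rate_model2(n, L1, L2, lam1, lam2):
    """Model II: two thresholds. The arrival rate decreases linearly
    from lam1 to lam2 as n moves from L1 to L2; lam1 below L1, lam2
    at or above L2."""
    if n < L1:
        return lam1
    if n >= L2:
        return lam2
    frac = (n - L1) / (L2 - L1)
    return lam1 + frac * (lam2 - lam1)
```

The linear scheme gives the source an early, graduated back-off signal, which is the mechanism behind the delay/throughput trade-offs the results explore.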
Performance modelling of a multiple threshold RED mechanism for bursty and correlated Internet traffic with MMPP arrival process
Access to the large volume of web content hosted all over the world engages many hosts, routers/switches and fast links, challenging the Internet backbone to operate at capacity to assure efficient content access. This can result in congestion and raises concerns over various Quality of Service (QoS) issues such as high delay, high packet loss and low throughput for various Internet applications. Thus, there is a need to develop effective congestion control mechanisms in order to meet the QoS-related performance requirements. In this paper, our emphasis is on Active Queue Management (AQM) mechanisms, particularly Random Early Detection (RED). We propose a novel threshold-based analytical model built on the standard RED mechanism. Various numerical examples are presented for Internet traffic scenarios exhibiting both the burstiness and the correlation properties of network traffic
The case for validating ADDIE model as a digital forensic model for peer to peer network investigation
Rapid technological advancement can substantially impact the processes of digital forensic investigation and presents a myriad of challenges to the investigator. Given these challenges, it is necessary to have a standard digital forensic framework as the foundation of any digital investigation. State-of-the-art digital forensic models assume that it is safe to move from one investigation stage to the next, guiding investigators through the required steps and procedures. This makes it a significant step to validate a non-specific framework for use in most digital investigation procedures. This paper considers a new technique for detecting active peers that participate in a peer-to-peer (P2P) network. As part of our study, we crawled the μTorrent P2P client over ten days in different instances while logging all participating peers. We then employed digital forensic techniques to analyse the popular users and generate evidence from them with high accuracy. We evaluated our approach against the standard Analysis, Design, Development, Implementation, and Evaluation (ADDIE) model for digital investigation to achieve the credible digital evidence presented in this paper. Finally, we presented a validation case for the ADDIE model using the United States Daubert Test and the United Kingdom's Forensic Science Regulator Guidance 218 (FSR-G-218) and Guidance 201 (FSR-G-201), to formulate it as a standard digital forensic model
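The peer-analysis step above (logging participating peers over repeated crawls, then isolating the most active ones) can be sketched as a frequency count over per-crawl peer lists. The log representation, function name and sighting threshold are illustrative assumptions, not the authors' tooling.

```python
from collections import Counter

def popular_peers(crawl_logs, min_sightings=3):
    """crawl_logs: iterable of per-crawl peer lists (e.g. 'ip:port' strings).
    Returns (peer, sighting_count) pairs for peers observed in at least
    min_sightings distinct crawls, most frequently seen first."""
    sightings = Counter()
    for peers in crawl_logs:
        sightings.update(set(peers))   # count each peer at most once per crawl
    return [(p, c) for p, c in sightings.most_common() if c >= min_sightings]
```

Counting each peer once per crawl, rather than per observation, distinguishes persistently active peers from ones that merely appeared many times within a single crawl.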
A note on “Taylor–Couette flow of a generalized second grade fluid due to a constant couple”
In this brief note, we show that the unsteady flow of a generalized second grade fluid due to a constant couple, as well as the analogous flows of Newtonian and ordinary second grade fluids, ultimately becomes steady. To this end, a new form of the exact solution for the velocity is established, presented as a sum of steady and transient components. The time required to reach the steady state is determined from graphical illustrations
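The decomposition described above can be written schematically as follows; the symbols are illustrative (velocity v in the radial coordinate r), since the note's explicit solution is not reproduced here.

```latex
% Velocity split into a steady part and a transient part that decays in time:
v(r,t) = v_s(r) + v_t(r,t), \qquad \lim_{t \to \infty} v_t(r,t) = 0 ,
% so the steady-state time is the time after which |v_t| is negligible.
```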
Empirical study of cultural dimensions and cybersecurity development
This study seeks to investigate how the development of e-government services impacts cybersecurity. The study uses correlation and multiple regression to analyse two sets of global data: the e-government development index from the 2015 United Nations e-government survey and the 2015 International Telecommunication Union global cybersecurity development index (GCI 2015). After analysing the various contextual factors affecting e-government development, the study found that various composite measures of e-government development are significantly correlated with cybersecurity development. The study therefore contributes to the understanding of the relationship between e-government and cybersecurity development. The authors developed a model to highlight this relationship and validated it using empirical data. This is expected to provide guidance on the specific dimensions of e-government services that will stimulate the development of cybersecurity. The study provides a basis for understanding patterns in cybersecurity development and has implications for policy makers in developing trust and confidence for the adoption of e-government services.
National Information Technology Development Agency, Nigeria
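The correlation step described above amounts to computing a Pearson correlation between two country-level development indices. Below is a minimal stdlib sketch; the example values in the usage are made up for illustration, while the actual study uses the UN e-government survey and ITU GCI datasets.

```python
import math

def pearson(x, y):
    """Pearson correlation coefficient of two equal-length numeric
    sequences (e.g. per-country e-government and cybersecurity indices)."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = math.sqrt(sum((a - mx) ** 2 for a in x))
    sy = math.sqrt(sum((b - my) ** 2 for b in y))
    return cov / (sx * sy)
```

A coefficient near +1 would indicate that countries with more developed e-government services also tend to score higher on cybersecurity development, which is the pattern the study reports; significance testing and the multiple-regression model are beyond this sketch.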