Opportunistic Use of Client Repeaters to Improve Performance of WLANs
Currently deployed IEEE 802.11 WLANs (Wi-Fi networks) share access point (AP) bandwidth on a per-packet basis. However, the stations communicating with the AP often have different signal qualities, and therefore different transmission rates. This induces a phenomenon known as the rate anomaly problem: stations with lower signal quality transmit at lower rates and consume the majority of the airtime, dramatically reducing the throughput of stations transmitting at high rates. We propose a practical, deployable system, called SoftRepeater, in which stations cooperatively address the rate anomaly problem. Specifically, higher-rate Wi-Fi stations opportunistically transform themselves into repeaters for low-rate stations when transmitting to or from the AP. The key challenge is determining when it is beneficial to enable the repeater functionality. In this paper, we propose an initiation protocol that ensures repeater functionality is enabled only when appropriate, and our system runs directly on top of today's 802.11 infrastructure networks. We also describe a novel, zero-overhead network coding scheme that further alleviates the symptoms of the rate anomaly problem. We evaluate our system using simulation and a testbed implementation, and find that SoftRepeater can improve cumulative throughput by up to 200%.
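The rate anomaly the abstract describes can be illustrated with a small back-of-the-envelope calculation (this sketch is not from the paper; the packet size and rates are assumed for illustration). Under per-packet fairness, each station's airtime per packet is packet size divided by its rate, so one slow station inflates every round-robin cycle:

```python
# Illustrative sketch of the 802.11 rate anomaly under per-packet fairness.
# Assumed values: 1500-byte packets, rates in Mbit/s; MAC overheads ignored.

PACKET_BITS = 12_000  # 1500 bytes

def per_station_throughput(rates_mbps):
    """Throughput (Mbit/s) each station sees when packets alternate round-robin."""
    # Airtime per packet for each station, in microseconds (bits / Mbit/s).
    airtimes = [PACKET_BITS / r for r in rates_mbps]
    cycle = sum(airtimes)  # one packet per station per cycle
    # Every station delivers exactly one packet per cycle, so all stations
    # end up with the same throughput, dictated by the slowest sender.
    return [PACKET_BITS / cycle for _ in rates_mbps]

# A lone 54 Mbit/s station gets the full rate:
print(per_station_throughput([54]))
# Adding a single 1 Mbit/s station drags both below 1 Mbit/s each:
print(per_station_throughput([54, 1]))
```

This is the effect SoftRepeater targets: relaying the slow station's traffic through a fast station at two high-rate hops shrinks the cycle time for everyone.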
Subjective Audio Quality over a Secure IEEE 802.11n Draft 2.0 Wireless Local Area Network
This thesis investigates the quality of audio generated by a G.711 codec and transmitted over an IEEE 802.11n draft 2.0 wireless local area network (WLAN). The decline in audio quality due to additional calls, or due to securing the WLAN with transport mode Internet Protocol Security (IPsec), is quantified. Audio quality over an IEEE 802.11n draft 2.0 WLAN is also compared to that of IEEE 802.11b and IEEE 802.11g WLANs under the same conditions. Audio quality is evaluated following International Telecommunication Union Telecommunication Standardization Sector (ITU-T) Recommendation P.800, in which human subjects rate audio clips recorded under various WLAN configurations. The Mean Opinion Score (MOS) is calculated as the average audio quality score given for each WLAN configuration. An 85% confidence interval is calculated for each MOS. Results suggest that audio quality over an IEEE 802.11n draft 2.0 WLAN is not higher than over an IEEE 802.11b WLAN when up to 10 simultaneous G.711 calls occur. A linear regression of the subjective scores also suggests that an IEEE 802.11n draft 2.0 WLAN can sustain an MOS greater than 3.0 (fair quality) for up to 75 simultaneous G.711 calls secured with WPA2, or up to 40 calls secured with both WPA2 and transport mode IPsec. The data strongly suggest that toll quality audio (MOS ≥ 4.0) is not currently practical over IEEE 802.11 WLANs secured with WPA2, even with the G.711 codec.
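The MOS-and-confidence-interval computation the thesis describes can be sketched as follows (the ratings are hypothetical, and a normal approximation is assumed for the interval; the thesis does not specify its interval method here):

```python
# Minimal sketch of computing a Mean Opinion Score and an 85% confidence
# interval from P.800-style subjective ratings (1 = bad ... 5 = excellent).
# Uses a normal approximation for the interval; ratings below are invented.
from statistics import mean, stdev, NormalDist

def mos_with_ci(scores, confidence=0.85):
    """Return (MOS, (lower, upper)) for a list of listener scores."""
    m = mean(scores)
    # Standard error of the mean.
    se = stdev(scores) / len(scores) ** 0.5
    # Two-sided critical value from the standard normal distribution.
    z = NormalDist().inv_cdf(0.5 + confidence / 2)
    return m, (m - z * se, m + z * se)

ratings = [4, 3, 4, 5, 3, 4, 4, 2, 3, 4]  # hypothetical listener scores
mos, (low, high) = mos_with_ci(ratings)
print(f"MOS = {mos:.2f}, 85% CI = ({low:.2f}, {high:.2f})")
```

With small listener panels a Student-t critical value would widen the interval slightly; the structure of the calculation is the same.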
Evaluation of Wireless Sensor Networks Technologies
Wireless sensor networks represent a new technology that has emerged from developments in ultra-low-power microcontrollers and sophisticated low-cost wireless data devices. Their small size and low power consumption allow a number of independent nodes (known as Motes) to be distributed in the field, all capable of ad hoc networking and multihop message transmission. New routing algorithms allow remote data to be passed reliably through the network to a final control point. This occurs within the constraints of low-power RF transmission in a congested 2.4 GHz radio spectrum. Wireless sensor network nodes are suitable for applications requiring long-term autonomous operation away from mains power supplies, such as environmental or health monitoring. To achieve this, sophisticated power management techniques must be used, with the units remaining "asleep" in an ultra-low-power mode for long periods of time.
The main aim of the research described in this thesis is first to review the area and then to evaluate one of the current hardware platforms together with the popular software used with it, TinyOS. The research uses a hardware platform designed at the University of California, Berkeley, called the Tmote Sky. Practical work has been carried out in different scenarios. Using Java tools running on a PC and customized applications running on the Motes, data has been captured, together with information showing the topology configuration and adaptive routing of the network and radio link quality. Results show that the technology is promising for distributed data acquisition applications, although time-critical monitoring systems will require new power management schemes and networking protocols to improve latency.
Multimedia delivery in the future internet
The term "Networked Media" implies that all kinds of media, including text, images, 3D graphics, audio and video, are produced, distributed, shared, managed and consumed online through various networks, such as the Internet, fibre, WiFi, WiMAX, GPRS, 3G and so on, in a convergent manner [1]. This white paper is the contribution of the Media Delivery Platform (MDP) cluster and aims to cover the challenges of Networked Media in the transition to the Future Internet.
The Internet has evolved and changed the way we work and live. End users of the Internet are confronted with a bewildering range of media, services and applications, and with technological innovations concerning media formats, wireless networks, and terminal types and capabilities, and there is little evidence that the pace of this innovation is slowing. Today, over one billion users access the Internet on a regular basis, more than 100 million users have downloaded at least one multimedia file, and over 47 million of them do so regularly, searching across more than 160 Exabytes of content. In the near future these numbers are expected to rise exponentially: Internet content is expected to grow by at least a factor of six, to more than 990 Exabytes, before 2012, fuelled mainly by the users themselves. Moreover, it is envisaged that in the near to mid term the Internet will provide the means to share and distribute new multimedia content and services with superior quality and striking flexibility, in a trusted and personalized way, improving citizens' quality of life, working conditions, edutainment and safety.
In this evolving environment, new transport protocols, new multimedia encoding schemes, cross-layer in-network adaptation, machine-to-machine communication (including RFID), rich 3D content, community networks and the use of peer-to-peer (P2P) overlays are expected to generate new models of interaction and cooperation, and to support enhanced perceived quality of experience (PQoE) and innovative applications "on the move", such as virtual collaboration environments, personalised services and media, virtual sport groups, online gaming and edutainment. In this context, interaction with content, combined with interactive multimedia search across distributed repositories, opportunistic P2P networks and dynamic adaptation to the characteristics of diverse mobile terminals, is expected to contribute towards such a vision.
Based on work carried out in a number of EC co-funded projects under Framework Programme 6 (FP6) and Framework Programme 7 (FP7), a group of experts and technology visionaries have voluntarily contributed to this white paper, which aims to describe the status, the state of the art, the challenges and the way ahead in the area of content-aware media delivery platforms.
Cross-layer design and optimization of medium access control protocols for WLANs
This thesis provides a contribution to the field of Medium Access Control (MAC) layer protocol design for wireless networks by proposing and evaluating mechanisms that enhance different aspects of the network performance. These enhancements are achieved through the exchange of information between different layers of the traditional protocol stack, a concept known as Cross-Layer (CL) design. The main thesis contributions are divided into two parts.
The first part of the thesis introduces a novel MAC layer protocol named Distributed Queuing Collision Avoidance (DQCA). DQCA behaves as a reservation scheme that ensures collision-free data transmissions most of the time, and switches automatically to an Aloha-like random access mechanism when the traffic load is low. DQCA can be enriched with more advanced scheduling algorithms, based on a CL dialogue between the MAC and other protocol layers, to provide higher throughput and Quality of Service (QoS) guarantees.
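The dual-mode behaviour described above can be caricatured in a few lines (this is an assumed simplification for illustration, not the DQCA specification; the rule names and queue model are invented):

```python
# Illustrative sketch of a DQCA-like access rule: when the distributed
# reservation queue is empty the station transmits immediately, Aloha-style;
# otherwise it waits for its collision-free turn at the head of the queue.
# Positions are 1-based; position 1 is the head of the queue.

def next_action(my_position, queue_length):
    """Decide what a station does in the next frame (assumed logic)."""
    if queue_length == 0:
        # Light load: no reservations pending, so send right away.
        # Collisions are possible and would be resolved by re-queuing.
        return "random-access transmit"
    if my_position == 1:
        # Head of the distributed queue: guaranteed collision-free slot.
        return "transmit"
    # Otherwise wait while stations ahead of us drain the queue.
    return f"wait ({my_position - 1} stations ahead)"
```

The point of the design is that the protocol degrades gracefully: under load it behaves like a reservation scheme, and at low load it avoids paying the reservation overhead.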
The second part of the thesis explores a different challenge in MAC layer design, related to the ability of multiple-antenna systems to offer point-to-multipoint communications. Modifications to the recently approved IEEE 802.11n standard are proposed in order to handle simultaneous multiuser downlink transmissions. A number of multiuser MAC schemes that handle channel access and scheduling issues and provide mechanisms for feedback acquisition are presented and evaluated. The obtained performance enhancements are demonstrated through both theoretical analysis and simulation results.
A Systematic Framework for Radio Frequency Identification (RFID) Hazard Mitigation in the Blood Transfusion Supply Chain from Donation to Distribution
The RFID Consortium is developing what will be the first FDA-approved use of radio frequency identification (RFID) technology to identify, track, manage, and monitor blood throughout the entire blood transfusion supply chain. The iTrace™ is an innovative technological system designed to optimize the procedures currently employed when tracing blood from the donor to the recipient. As with all novel technologies, it is essential to consider not only the advantages but also the potential harms that may arise from using the system. The deployment of the iTrace™ consists of two phases: 1) Phase One, application of the iTrace™ from the donor to blood center distribution; and 2) Phase Two, application of the iTrace™ from blood center distribution to transfusion. This dissertation seeks to identify the possible hazards that may occur when utilizing the iTrace™ during Phase One, and to assess the mitigation and correction processes that combat these hazards. A thorough examination of verification and validation tests, as well as of the system design, requirements, and standard operating procedures, was performed to qualify and quantify each hazard into specific categories of severity and likelihood. A traceability matrix was also established to link each hazard with its associated tests and/or features. Furthermore, a series of analyses was conducted to determine whether the benefits of implementing the iTrace™ outweighed the risks, and whether the mitigation and correction strategies for the hazards were effective. Ultimately, this dissertation serves as a usable, generalizable framework for the management of RFID-related hazards in the blood transfusion supply chain from donor to blood center distribution.
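The severity-and-likelihood classification plus traceability matrix described above can be sketched as a small data structure (the category names, thresholds, and test IDs below are assumptions for illustration, not taken from the iTrace hazard analysis):

```python
# Hypothetical sketch of a severity x likelihood risk classification and a
# traceability matrix linking hazards to verification tests. All names,
# thresholds, and IDs are invented for illustration.

SEVERITY = ["negligible", "minor", "serious", "critical", "catastrophic"]
LIKELIHOOD = ["improbable", "remote", "occasional", "probable", "frequent"]

def risk_class(severity, likelihood):
    """Map a (severity, likelihood) pair to an overall risk class."""
    score = SEVERITY.index(severity) + LIKELIHOOD.index(likelihood)
    if score <= 2:
        return "acceptable"
    if score <= 5:
        return "acceptable with mitigation"
    return "unacceptable"

# Traceability matrix: each hazard is linked to the tests that cover it,
# so every classified risk can be traced to evidence of mitigation.
hazards = {
    "RFID tag read failure": {
        "class": risk_class("serious", "occasional"),
        "tests": ["VV-012", "VV-013"],  # hypothetical test IDs
    },
}
print(hazards)
```

A real FDA-oriented analysis would draw its categories and acceptance thresholds from the documented risk management plan; the point here is only the shape of the mapping and the hazard-to-test linkage.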
- …