    Modeling the relationship between network operators and venue owners in public Wi-Fi deployment using non-cooperative game theory

    Wireless data demands keep rising at a fast rate. In 2016, Cisco measured a global mobile data traffic volume of 7.2 exabytes per month and projected growth to 49 exabytes per month in 2021. Wi-Fi plays an important role in this as well: up to 60% of total mobile traffic was offloaded via Wi-Fi (and femtocells) in 2016, and this share is expected to increase to 63% by 2021. In this publication, we look into the roll-out of public Wi-Fi networks, public meaning in a public or semi-public place (pubs, restaurants, sport stadiums, etc.). More concretely, we look into the collaboration between two parties, a technical party and a venue owner, for the roll-out of a new Wi-Fi network. The technical party is interested in reducing the load on its mobile network and generating additional direct revenues, while the venue owner wants to improve the attractiveness of the venue and consequently generate additional indirect revenues. Three Wi-Fi pricing models are considered: entirely free, slow access with ads or fast access via paid access (freemium), and paid access only (premium). The technical party prefers a premium model with high direct revenues, the venue owner a free/freemium model which is attractive to its customers, meaning the two parties have conflicting interests. This conflict has been modeled using non-cooperative game theory, incorporating detailed cost and revenue models for all three Wi-Fi pricing models. The initial outcome of the game is a premium Wi-Fi network, which is not the optimal solution from an outsider's perspective, as a freemium network yields the highest total payoffs. By introducing an additional compensation scheme, which corresponds to negotiation in real life, the outcome of the game is steered toward a freemium solution.
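
    To make the conflict and the compensation mechanism concrete, the minimal sketch below reproduces the abstract's storyline with hypothetical payoff numbers; the payoff values, and the assumption that the technical party picks the pricing model while the venue owner may offer a freemium-contingent side payment, are illustrative, not taken from the paper's detailed cost/revenue models.

    MODELS = ["free", "freemium", "premium"]

    # (technical party payoff, venue owner payoff) per model -- illustrative only.
    PAYOFFS = {
        "free":     (1.0, 7.0),   # no direct revenue, very attractive venue
        "freemium": (5.0, 6.0),   # ads + paid tier, still attractive
        "premium":  (8.0, 1.0),   # high direct revenue, unattractive venue
    }

    def outcome(compensation: float = 0.0) -> str:
        """The technical party operates the network and picks the model that
        maximizes its own payoff; the venue owner may promise a side payment
        (the abstract's 'compensation scheme') paid only under freemium."""
        def tech_payoff(m):
            return PAYOFFS[m][0] + (compensation if m == "freemium" else 0.0)
        return max(MODELS, key=tech_payoff)

    print(outcome(0.0))                 # -> 'premium' (the initial game outcome)
    totals = {m: sum(PAYOFFS[m]) for m in MODELS}
    print(max(totals, key=totals.get))  # -> 'freemium' (highest total payoff)
    # Any transfer above the technical party's payoff gap (8 - 5 = 3) but below
    # the venue owner's gain from freemium over premium (6 - 1 = 5) makes
    # freemium the equilibrium and leaves both parties better off.
    print(outcome(3.5))                 # -> 'freemium'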

    Impact of EU duty cycle and transmission power limitations for sub-GHz LPWAN SRDs : an overview and future challenges

    Long-range sub-GHz technologies such as LoRaWAN, SigFox, IEEE 802.15.4, and DASH7 are increasingly popular for academic research and daily life applications. However, especially in the European Union (EU), the use of their corresponding frequency bands is tightly regulated, since they must conform to the short-range device (SRD) regulations. Regulations and standards for SRDs exist on various levels, from global to national, but are often a source of confusion. Not only are multiple institutes responsible for drafting legislation and regulations, but depending on the type of document these rules can be informational or mandatory. Regulations also vary from region to region; for example, regulations in the United States of America (USA) rely on electrical field strength and harmonic strength, while EU regulations are based on duty cycle and maximum transmission power. A common misconception is the existence of a single, universal 1% duty cycle, while in fact the duty cycle is frequency-band-specific and can be relaxed under certain circumstances. This paper clarifies the various regulations for the European region, the parties involved in drafting and enforcing regulation, and the impact on recent technologies such as SigFox, LoRaWAN, and DASH7. Furthermore, an overview is given of potential mitigation approaches to cope with the duty cycle constraints, as well as future research directions.
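
    As a rough illustration of what a duty-cycle limit means for a device, the sketch below computes the hourly airtime budget and the silent period a device must respect after each transmission; the 1% duty cycle and the time-on-air figure are example inputs (duty cycles are sub-band-specific under the EU SRD rules), not a normative list from the regulations.

    def airtime_budget_s(duty_cycle: float, window_s: float = 3600.0) -> float:
        """Maximum cumulative transmit time allowed in the observation window."""
        return duty_cycle * window_s

    def min_off_time_s(time_on_air_s: float, duty_cycle: float) -> float:
        """Silent period after one transmission so that the long-run duty
        cycle stays within the limit."""
        return time_on_air_s * (1.0 / duty_cycle - 1.0)

    toa = 1.5    # example time-on-air in seconds (order of magnitude for a long
                 # LoRa frame; actual ToA depends on SF, bandwidth, and payload)
    dc = 0.01    # example 1% duty-cycle sub-band
    print(airtime_budget_s(dc))            # 36.0 s of airtime per hour
    print(airtime_budget_s(dc) / toa)      # 24 messages per hour at this ToA
    print(min_off_time_s(toa, dc))         # 148.5 s enforced silence per message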

    Performance analysis of feedback-free collision resolution NDMA protocol

    To support communications of a large number of deployed devices while guaranteeing limited signaling load, low energy consumption, and high reliability, future cellular systems require efficient random access protocols. However, collision resolution at the receiver remains the main bottleneck of these protocols. The network-assisted diversity multiple access (NDMA) protocol solves this issue and attains the highest potential throughput, at the cost of keeping devices active to acquire feedback and repeating transmissions until successful decoding. In contrast, another potential approach is the feedback-free NDMA (FF-NDMA) protocol, in which devices repeat packets in a pre-defined number of consecutive time slots without waiting for feedback associated with the repetitions. Here, we investigate the FF-NDMA protocol from a cellular network perspective in order to elucidate under what circumstances this scheme is more energy efficient than NDMA. We characterize the FF-NDMA protocol analytically using a multipacket reception model and a finite Markov chain. Analytic expressions for throughput, delay, capture probability, energy, and energy efficiency are derived. Then, clues for system design are established according to the different trade-offs studied. Simulation results show that FF-NDMA is more energy efficient than classical NDMA and HARQ-NDMA at low signal-to-noise ratio (SNR), and at medium SNR when the load increases.
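
    The energy trade-off described here can be previewed with a deliberately simplified model, sketched below: per-slot decoding success is treated as i.i.d. and feedback listening has a fixed energy cost. The energy constants and success model are assumptions for illustration, not the paper's multipacket-reception Markov-chain analysis.

    def ffndma_energy(p_slot: float, k: int, e_tx: float = 1.0) -> float:
        """FF-NDMA: always transmit in k consecutive slots, no feedback
        listening. Energy spent per successfully delivered packet."""
        p_success = 1.0 - (1.0 - p_slot) ** k      # at least one slot decodes
        return k * e_tx / p_success

    def ndma_energy(p_slot: float, e_tx: float = 1.0, e_rx: float = 0.5) -> float:
        """Classical NDMA: transmit, then stay awake for feedback (cost e_rx),
        repeating until decoded; geometric number of rounds."""
        return (e_tx + e_rx) / p_slot

    for p in (0.2, 0.5, 0.9):                      # low / medium / high SNR proxy
        print(f"p={p}: FF-NDMA(k=3) {ffndma_energy(p, 3):.2f}  "
              f"NDMA {ndma_energy(p):.2f}")
    # Even this toy model reproduces the reported trend: at p=0.2 FF-NDMA wins
    # (6.15 vs 7.50), while at higher per-slot success NDMA becomes cheaper.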

    An optimal data service providing framework in cloud radio access network

    Much work has been conducted in the past several years to design effective and efficient algorithms for quality of service (QoS)-aware service computing. Wireless mobile computing and cloud computing environments have brought many challenges to QoS-aware service provisioning. Mobile cloud computing (MCC) and cloud radio access networks (C-RANs) are new paradigms that have arisen in recent years. This work proposes a wireless data service providing framework that aims to deliver data service in C-RAN in a more efficient way, where efficiency is measured by cost under a time constraint. An abstract formal model is built on the proposed framework, and the corresponding optimal solution is derived theoretically using queuing theory and convex optimization. Simulation results show that the proposed optimal strategy works well and performs better than the compared one.
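
    As a minimal stand-in for this kind of cost minimization under a time constraint, the sketch below uses a single M/M/1 queue, where the convex problem has a closed-form solution; the queue model and the assumption that cost is proportional to provisioned capacity are placeholders for the paper's richer C-RAN service model.

    def optimal_service_rate(arrival_rate: float, max_delay_s: float) -> float:
        """Minimize provisioned capacity (cost assumed proportional to the
        service rate mu) subject to the M/M/1 mean sojourn-time constraint
        1/(mu - lambda) <= T. Cost is increasing in mu, so the constraint
        is tight at the optimum and mu* = lambda + 1/T."""
        return arrival_rate + 1.0 / max_delay_s

    lam, T = 120.0, 0.05          # 120 requests/s, 50 ms mean-delay target
    mu = optimal_service_rate(lam, T)
    print(mu)                     # 140.0 -- capacity to provision
    print(1.0 / (mu - lam))       # 0.05 -- delay constraint met with equality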

    Will 5G See its Blind Side? Evolving 5G for Universal Internet Access

    The Internet has shown itself to be a catalyst for economic growth and social equity, but its potency is thwarted by the fact that it remains off limits for the vast majority of human beings. Mobile phones, the fastest growing technology in the world and one that now reaches around 80% of humanity, can enable universal Internet access if the coverage problems that have historically plagued previous cellular architectures (2G, 3G, and 4G) are resolved. These conventional architectures have not been able to sustain universal service provisioning since they depend on having enough users per cell for economic viability and are thus not well suited to rural areas, which are by definition sparsely populated. The new generation of mobile cellular technology (5G), currently in a formative phase and expected to be finalized around 2020, is aimed at orders-of-magnitude performance enhancement. 5G offers a clean slate to network designers and can be molded into an architecture also amenable to universal Internet provisioning. Keeping in mind the great social benefits of democratizing Internet connectivity, we believe that the time is ripe for emphasizing universal Internet provisioning as an important goal on the 5G research agenda. In this paper, we investigate the opportunities and challenges in utilizing 5G for global access to the Internet for all (GAIA). We also identify the major technical issues involved in a 5G-based GAIA solution and set up a future research agenda by defining open research problems.
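
    The users-per-cell viability argument can be made concrete with simple break-even arithmetic, sketched below; every number is a hypothetical placeholder, not a figure from the paper.

    import math

    def breakeven_users(monthly_cell_cost: float, arpu: float) -> float:
        """Subscribers needed for per-cell revenue to cover per-cell cost."""
        return monthly_cell_cost / arpu

    def covered_subscribers(density_per_km2: float, radius_km: float,
                            take_rate: float = 0.3) -> float:
        """Expected subscribers inside a circular cell of the given radius."""
        return density_per_km2 * math.pi * radius_km ** 2 * take_rate

    need = breakeven_users(monthly_cell_cost=3000.0, arpu=5.0)          # 600 users
    urban = covered_subscribers(density_per_km2=5000.0, radius_km=1.0)  # ~4712
    rural = covered_subscribers(density_per_km2=15.0, radius_km=5.0)    # ~353
    print(need, urban, rural)  # the rural cell falls short even at a 5 km radius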

    Resource-aware task scheduling by an adversarial bandit solver method in wireless sensor networks

    This article was published in the EURASIP Journal on Wireless Communications and Networking (© 2016 Springer International Publishing); the definitive version is available at http://dx.doi.org/10.1186/s13638-015-0515-y. A wireless sensor network (WSN) is composed of a large number of tiny sensor nodes. Sensor nodes are very resource-constrained, since they are often battery-operated and energy is a scarce resource. In this paper, a resource-aware task scheduling (RATS) method is proposed that offers a better performance/resource-consumption trade-off in a WSN. In particular, RATS exploits an adversarial bandit solver method called exponential weight for exploration and exploitation (Exp3) for the target tracking application of a WSN. The proposed RATS method is compared and evaluated against existing scheduling methods that exploit online learning, namely distributed independent reinforcement learning (DIRL), reinforcement learning (RL), and cooperative reinforcement learning (CRL), in terms of the tracking quality/energy consumption trade-off in a target tracking application. The communication overhead and computational effort of these methods are also computed. Simulation results show that the proposed RATS outperforms the existing methods DIRL and RL in terms of achieved tracking performance.
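
    The core of Exp3 is an exponential-weights update with an importance-weighted reward estimate for the played arm. The compact sketch below shows that standard update on a toy task set; the reward values and the three-action set are chosen purely for illustration, not taken from the RATS formulation.

    import math, random

    def exp3(n_actions: int, gamma: float, reward_fn, rounds: int):
        """Exp3: exponential weights with gamma-uniform exploration and an
        unbiased (importance-weighted) reward estimate for the chosen arm."""
        w = [1.0] * n_actions
        for _ in range(rounds):
            total = sum(w)
            probs = [(1 - gamma) * wi / total + gamma / n_actions for wi in w]
            a = random.choices(range(n_actions), weights=probs)[0]
            r = reward_fn(a)                       # reward assumed in [0, 1]
            w[a] *= math.exp(gamma * (r / probs[a]) / n_actions)
        return w

    # Toy usage: three candidate tasks (e.g. sense / transmit / sleep) whose
    # hypothetical rewards trade tracking quality against energy cost.
    rewards = [0.2, 0.8, 0.5]
    w = exp3(3, gamma=0.1, reward_fn=lambda a: rewards[a], rounds=2000)
    print(max(range(3), key=lambda a: w[a]))       # usually 1, the best task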