Survey of Spectrum Sharing for Inter-Technology Coexistence
Increasing capacity demands in emerging wireless technologies are expected to
be met by network densification and spectrum bands open to multiple
technologies. These will, in turn, increase the level of interference and also
result in more complex inter-technology interactions, which will need to be
managed through spectrum sharing mechanisms. Consequently, novel spectrum
sharing mechanisms should be designed to allow spectrum access for multiple
technologies, while efficiently utilizing the spectrum resources overall.
Importantly, designing such efficient mechanisms is not trivial, not only due
to technical aspects, but also due to regulatory and business-model
constraints. In this survey we address spectrum sharing mechanisms for wireless
inter-technology coexistence by means of a technology circle that incorporates
in a unified, system-level view the technical and non-technical aspects. We
thus systematically explore the spectrum sharing design space consisting of
parameters at different layers. Using this framework, we present a literature
review on inter-technology coexistence with a focus on wireless technologies
with equal spectrum access rights, i.e. (i) primary/primary, (ii)
secondary/secondary, and (iii) technologies operating in a spectrum commons.
Moreover, we reflect on our literature review to identify possible spectrum
sharing design solutions and performance evaluation approaches useful for
future coexistence cases. Finally, we discuss spectrum sharing design
challenges and suggest future research directions.
Learning and Management for Internet-of-Things: Accounting for Adaptivity and Scalability
Internet-of-Things (IoT) envisions an intelligent infrastructure of networked
smart devices offering task-specific monitoring and control services. The
unique features of IoT include extreme heterogeneity, a massive number of
devices, and unpredictable dynamics partially due to human interaction. These
call for foundational innovations in network design and management. Ideally, it
should allow efficient adaptation to changing environments and low-cost
implementation scalable to a massive number of devices, subject to stringent
latency constraints. To this end, the overarching goal of this paper is to
outline a unified framework for online learning and management policies in IoT
through joint advances in communication, networking, learning, and
optimization. From the network architecture vantage point, the unified
framework leverages a promising fog architecture that enables smart devices to
have proximity access to cloud functionalities at the network edge, along the
cloud-to-things continuum. From the algorithmic perspective, key innovations
target online approaches adaptive to different degrees of nonstationarity in
IoT dynamics, and their scalable model-free implementation under limited
feedback that motivates blind or bandit approaches. The proposed framework
aspires to offer a stepping stone towards the systematic design and analysis
of task-specific learning and management schemes for IoT, along with a host of
new research directions to build on.
Comment: Submitted on June 15 to Proceedings of IEEE Special Issue on Adaptive
and Scalable Communication Network
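The "blind or bandit" approaches the abstract alludes to can be illustrated with a minimal epsilon-greedy sketch. The channel-selection setting, reward model, and all parameters below are assumptions for illustration, not the paper's framework: a device learns which resource to use from observed rewards alone, with no model of the environment.

```python
import random

def epsilon_greedy(rewards_fn, n_arms, rounds=10000, eps=0.1, seed=0):
    """Epsilon-greedy bandit: rewards_fn(arm, rng) -> stochastic reward in [0, 1]."""
    rng = random.Random(seed)
    counts = [0] * n_arms
    values = [0.0] * n_arms  # running mean reward per arm
    for _ in range(rounds):
        if rng.random() < eps:
            arm = rng.randrange(n_arms)                       # explore
        else:
            arm = max(range(n_arms), key=values.__getitem__)  # exploit
        r = rewards_fn(arm, rng)
        counts[arm] += 1
        values[arm] += (r - values[arm]) / counts[arm]        # incremental mean
    return values, counts

# Usage: three hypothetical channels with success probabilities 0.2, 0.5, 0.8;
# the learner concentrates its plays on the best channel without knowing probs.
probs = [0.2, 0.5, 0.8]
values, counts = epsilon_greedy(lambda a, rng: float(rng.random() < probs[a]), 3)
```

The incremental-mean update keeps per-arm state to two numbers, which matches the low-cost, limited-feedback regime the abstract targets.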
Architectures and Key Technical Challenges for 5G Systems Incorporating Satellites
Satellite Communication systems are a promising solution to extend and
complement terrestrial networks in unserved or under-served areas. This aspect
is reflected by recent commercial and standardisation endeavours. In
particular, 3GPP recently initiated a Study Item for New Radio-based, i.e., 5G,
Non-Terrestrial Networks aimed at deploying satellite systems either as a
stand-alone solution or as an integration to terrestrial networks in mobile
broadband and machine-type communication scenarios. However, typical satellite
channel impairments, such as large path losses, delays, and Doppler shifts, pose
severe challenges to the realisation of a satellite-based NR network. In this
paper, based on the architecture options currently being discussed in the
standardisation fora, we discuss and assess the impact of the satellite channel
characteristics on the physical and Medium Access Control layers, both in terms
of transmitted waveforms and procedures for enhanced Mobile BroadBand (eMBB)
and NarrowBand-Internet of Things (NB-IoT) applications. The proposed analysis
shows that the main technical challenges are related to the PHY/MAC procedures,
in particular Random Access (RA), Timing Advance (TA), and Hybrid Automatic
Repeat reQuest (HARQ); depending on the considered service and architecture,
different solutions are proposed.
Comment: Submitted to Transactions on Vehicular Technologies, April 201
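The scale of the impairments driving those PHY/MAC challenges can be sanity-checked with simple geometry. The sketch below is illustrative only (nadir/zenith slant ranges, a transparent payload with two hops each way); it is not the paper's analysis, but it shows why Timing Advance and HARQ budgets break down over satellite links.

```python
C = 299_792_458.0  # speed of light, m/s

def one_way_delay_s(slant_range_m):
    """Free-space propagation delay over one UE-satellite hop."""
    return slant_range_m / C

def round_trip_s(slant_range_m):
    """UE <-> satellite <-> gateway with a transparent payload:
    two hops out, two hops back."""
    return 4 * one_way_delay_s(slant_range_m)

GEO_M = 35_786_000.0  # GEO altitude, slant range at nadir (illustrative)
LEO_M = 600_000.0     # a typical LEO altitude, slant range at zenith
```

A transparent GEO round trip comes out near half a second, several orders of magnitude beyond the millisecond-scale timing terrestrial NR procedures assume, which is why RA, TA, and HARQ need per-architecture adaptation.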
Achieving Max-Min Throughput in LoRa Networks
With growing popularity, LoRa networks are pivotally enabling Long Range
connectivity to low-cost and power-constrained user equipments (UEs). Due to
its wide coverage area, a critical issue is to effectively allocate wireless
resources to support potentially massive UEs in the cell while resolving the
prominent near-far fairness problem for cell-edge UEs, which is challenging to
address due to the lack of a tractable analytical model for the LoRa network and
its practical requirement for a low-complexity and low-overhead design. To
achieve massive connectivity with fairness, we investigate the problem of
maximizing the minimum throughput of all UEs in the LoRa network, by jointly
designing high-level policies of spreading factor (SF) allocation, power
control, and duty cycle adjustment based only on average channel statistics and
spatial UE distribution. By leveraging the Poisson rain model along with
tailored modifications to our considered LoRa network, we are able to account
for channel fading, aggregate interference and accurate packet overlapping, and
still obtain a tractable yet accurate closed-form formula for the packet
success probability and hence throughput. We further propose an iterative
balancing (IB) method to allocate the SFs in the cell such that the overall
max-min throughput can be achieved within the considered time period and cell
area. Numerical results show that the proposed scheme with optimized design
greatly alleviates the near-far fairness issue, and significantly improves the
cell-edge throughput.
Comment: 6 pages, 4 figures, published in Proc. International Conference on
Computing, Networking and Communications (ICNC), 2020. This paper proposes a
stochastic-geometry-based analytical framework for a single-cell LoRa
network, with joint optimization to achieve max-min throughput for the users.
Extended journal version for a large-scale multi-cell LoRa network:
arXiv:2008.0743
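The iterative balancing idea can be shown with a toy sketch. The rate and reach tables and the greedy re-assignment rule below are assumptions for illustration, not the paper's actual IB method or channel model: higher spreading factors reach farther but carry less data, and the worst-off UE is repeatedly moved to whichever SF serves it best.

```python
# Illustrative LoRa figures: data rate roughly halves per SF step (125 kHz BW);
# reach values are made up for the example.
RATE = {7: 5470, 8: 3125, 9: 1760, 10: 980, 11: 440, 12: 250}  # bps
REACH = {7: 2.0, 8: 3.0, 9: 4.5, 10: 6.5, 11: 9.0, 12: 12.0}   # km

def throughput(sf, distance_km, load):
    """Equal share of the SF's rate if the UE is in reach, else zero."""
    if distance_km > REACH[sf]:
        return 0.0
    return RATE[sf] / max(load, 1)

def iterative_balancing(distances_km, max_iters=1000):
    """Greedy max-min balancing: repeatedly re-assign the worst-off UE
    to whichever SF currently gives it the highest throughput."""
    sfs = sorted(RATE)
    # start every UE on the slowest, longest-reach SF so all are feasible
    assign = [12] * len(distances_km)
    for _ in range(max_iters):
        load = {s: assign.count(s) for s in sfs}
        thr = [throughput(assign[i], d, load[assign[i]])
               for i, d in enumerate(distances_km)]
        worst = min(range(len(thr)), key=lambda i: thr[i])

        def gain(s):
            extra = 0 if s == assign[worst] else 1  # count the UE's own load
            return throughput(s, distances_km[worst], load[s] + extra)

        best_sf = max(sfs, key=gain)
        if gain(best_sf) <= thr[worst]:
            break  # no move improves the minimum: balanced
        assign[worst] = best_sf
    return assign
```

Nearby UEs drain off to fast, short-reach SFs, freeing airtime on the slow SFs that cell-edge UEs are stuck with, which is exactly the near-far relief the abstract describes.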
A Modelling and Experimental Framework for Battery Lifetime Estimation in NB-IoT and LTE-M
To enable large-scale Internet of Things (IoT) deployment, low-power
wide-area networking (LPWAN) has attracted substantial research attention, with
the design objectives of low power consumption, wide-area coverage, and low cost.
In particular, long battery lifetime is central to these technologies since
many of the IoT devices will be deployed in hard-to-access locations. Prediction
of the battery lifetime depends on the accurate modelling of power consumption.
This paper presents detailed power consumption models for two cellular IoT
technologies: Narrowband Internet of Things (NB-IoT) and Long Term Evolution
for Machines (LTE-M). A comprehensive power consumption model based on User
Equipment (UE) states and procedures for device battery lifetime estimation is
presented. An IoT device power measurement testbed has been set up, and the
proposed model has been validated via measurements with different coverage
scenarios and traffic configurations, keeping the modelling inaccuracy within
5%. The resulting estimated battery lifetime is promising, showing that the
10-year battery lifetime requirement specified by 3GPP can be met with proper
configuration of traffic profile, transmission, and network parameters.
Comment: submitted to IEEE Internet of Things Journal, 12 pages, 10 figure
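A state-based lifetime estimate of the kind the abstract describes can be sketched as follows. All currents, durations, and the battery capacity are illustrative assumptions (a ~5 Wh reference battery at 3.7 V), not the paper's measured values: average the per-state current draw over one reporting cycle, then divide capacity by that average.

```python
STATES = {                     # (current_mA, seconds per reporting cycle)
    "tx":    (120.0,   2.0),   # uplink transmission
    "rx":    ( 50.0,   1.0),   # downlink reception / monitoring
    "idle":  (  3.0,  20.0),   # connected idle (e.g. eDRX)
    "sleep": (  0.005, None),  # PSM deep sleep fills the rest of the cycle
}

def lifetime_years(capacity_mah=1350, cycle_s=24 * 3600):
    """Battery lifetime = capacity / average current over one cycle."""
    active_s = sum(t for _, t in STATES.values() if t is not None)
    charge_mas = sum(i * t for i, t in STATES.values() if t is not None)
    charge_mas += STATES["sleep"][0] * (cycle_s - active_s)  # sleep remainder
    avg_ma = charge_mas / cycle_s
    hours = capacity_mah / avg_ma
    return hours / (24 * 365)
```

With one report per day these toy numbers clear the 10-year mark, while hourly reporting collapses the lifetime to under two years, mirroring the abstract's point that the requirement is met only with a proper traffic and parameter configuration.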