Millimetre wave frequency band as a candidate spectrum for 5G network architecture : a survey
In order to meet the huge growth in global mobile data traffic in 2020 and beyond, the development of the 5th Generation (5G) system is required, as the current 4G system is expected to fall short of the provision needed for such growth. 5G is anticipated to use a higher carrier frequency in the millimetre wave (mm-wave) band, within the 20 to 90 GHz range, due to the availability of a vast amount of unexploited bandwidth. Using these bands is a revolutionary step because of their different propagation characteristics, severe atmospheric attenuation, and hardware constraints. In this paper, we carry out a survey of 5G research contributions and proposed design architectures based on mm-wave communications. We present and discuss the use of mm-wave as indoor and outdoor mobile access, as a wireless backhaul solution, and as a key enabler for higher order sectorisation. Wireless standards such as IEEE 802.11ad, which operate in the mm-wave band, are also presented. These standards have been designed for short-range, ultra-high-data-throughput systems in the 60 GHz band. Furthermore, this survey provides new insights regarding relevant and open issues in adopting mm-wave for 5G networks, including the increased handoff rate and interference in Ultra-Dense Networks (UDN), waveform considerations with higher spectral efficiency, and support for spatial multiplexing in mm-wave line of sight. This survey also introduces a distributed base station architecture in mm-wave as an approach to address the increased handoff rate in UDN, and to provide an alternative way of densifying the network in a time- and cost-effective manner.
Modeling and Analysis of sub-Terahertz Communication Channel via Mixture of Gamma Distribution
With the recent developments on opening the terahertz (THz) spectrum for
experimental purposes by the Federal Communications Commission, transceivers
operating in the range of 0.1THz-10THz, which are known as THz bands, will
enable ultra-high-throughput wireless communications. However, practical
implementation of high-speed, high-reliability THz band communication
systems must start from extensive knowledge of the
propagation channel characteristics. Considering the huge bandwidth and the
rapid changes in the characteristics of THz wireless channels, ray tracing and
one-shot statistical modeling are not adequate to define an accurate channel
model. In this work, we propose Gamma mixture-based channel modeling for the
THz band via the expectation-maximization (EM) algorithm. First, maximum
likelihood estimation (MLE) is applied to characterize the Gamma mixture model
parameters, and then the EM algorithm is used to compute MLEs of the unknown
parameters of the measurement data. The accuracy of the proposed model is
investigated by using the weighted mean relative difference (WMRD) error
metric, Kullback-Leibler (KL) divergence, and the Kolmogorov-Smirnov (KS) test to show
the difference between the proposed model and the actual probability density
functions (PDFs) that are obtained via the designed test environment. According
to WMRD error metrics, KL-divergence, and KS test results, PDFs generated by
the mixture of Gamma distributions fit the actual histogram of the measurement
data. It is shown that, instead of taking pseudo-average characteristics of
sub-bands in the wideband, using the mixture models allows channel
parameters to be determined more precisely. This paper has been accepted for
publication in IEEE Transactions on Vehicular Technology.
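The pipeline the abstract describes (fit a Gamma mixture with EM, then check the fit with a KS test) can be sketched as follows. This is a minimal illustration, not the paper's implementation: it uses synthetic data in place of the measurement campaign, and a weighted moment-matching M-step as a simple stand-in for the exact MLE update of the Gamma parameters.

```python
import numpy as np
from scipy import stats

def fit_gamma_mixture(x, k=2, iters=200):
    """EM for a k-component Gamma mixture.

    M-step uses weighted moment matching (shape = m^2/v, scale = v/m)
    as a simple approximation of the exact MLE update."""
    w = np.full(k, 1.0 / k)
    # initialize components by splitting the sorted data into k chunks
    parts = np.array_split(np.sort(x), k)
    shape = np.array([p.mean() ** 2 / p.var() for p in parts])
    scale = np.array([p.var() / p.mean() for p in parts])
    for _ in range(iters):
        # E-step: responsibility of each component for each sample
        dens = np.stack([w[j] * stats.gamma.pdf(x, shape[j], scale=scale[j])
                         for j in range(k)])
        r = dens / dens.sum(axis=0)
        # M-step: mixture weights, then per-component moment matching
        w = r.mean(axis=1)
        m = (r * x).sum(axis=1) / r.sum(axis=1)
        v = (r * (x - m[:, None]) ** 2).sum(axis=1) / r.sum(axis=1)
        shape, scale = m ** 2 / v, v / m
    return w, shape, scale

# synthetic "measurement" data from a known two-component mixture
rng = np.random.default_rng(1)
x = np.concatenate([rng.gamma(2.0, 1.0, 4000), rng.gamma(9.0, 0.5, 6000)])
w, shape, scale = fit_gamma_mixture(x, k=2)

# goodness of fit: KS test of the data against the fitted mixture CDF
mix_cdf = lambda t: sum(w[j] * stats.gamma.cdf(t, shape[j], scale=scale[j])
                        for j in range(2))
ks = stats.kstest(x, mix_cdf)
print(w, shape, scale, ks.statistic)
```

With the model family matching the data, the KS statistic comes out small, mirroring the paper's observation that a mixture fits the measured histogram better than a single pseudo-averaged distribution would.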
Network Management and Control for mmWave Communications
Millimeter-wave (mmWave) is one of the key technologies that enable the next wireless
generation. mmWave offers much higher bandwidth than sub-6GHz communications,
which allows multi-gigabit-per-second rates. This also alleviates the scarcity of spectrum
at lower frequencies, where most devices connect through sub-6GHz bands. However, new
techniques are necessary to overcome the challenges associated with such high frequencies.
Most of these challenges come from the high spatial attenuation at the mmWave band,
which requires new paradigms that differ from sub-6GHz communications. Most notably,
mmWave telecommunications are characterized by the need to be directional in order to
extend the operational range. This is achieved by using electronically steerable antenna
arrays that focus the energy towards the desired direction by combining each antenna
element constructively or destructively. Additionally, most of the energy comes from
the Line Of Sight (LOS) component, which gives mmWave a quasi-optical behaviour
where signals can reflect off walls and still be used for communication. Some other
challenges that directional communications bring are mobility tracking, blockages and
misalignments due to device rotation. The IEEE 802.11ad amendment introduced wireless
telecommunications in the unlicensed 60 GHz band. It is the first standard to address
the limitations of mmWave. It does so by introducing new mechanisms at the Medium
Access Control (MAC) and Physical (PHY) layers. It introduces multi-band operation, a
relay operation mode, a hybrid channel access scheme, beam tracking, and beamforming,
among others.
In this thesis we present a series of works that aim to improve mmWave
telecommunications. First we give an overview of the intrinsic challenges of mmWave
telecommunications, by explaining the modifications to the MAC and PHY layers. This
sets the base for the rest of the thesis. Then we conduct a comprehensive study of how
mmWave behaves with existing technologies, namely TCP. TCP is unable to distinguish
losses caused by congestion from losses caused by transmission errors due to channel
degradation. Since mmWave is affected by blockages more than sub-6GHz technologies, we
propose a set of parameters that improves channel quality even in mobile scenarios. The
next work focuses on reducing the initial access overhead of mmWave by using sub-6GHz
information to steer towards the desired direction. We start this work with a
comprehensive High Frequency (HF) and Low Frequency (LF) correlation study, analyzing
the similarity of the existing paths between the two selected frequencies. Then we
propose a beam steering algorithm that reduces the overhead to one third of the original
time. Once
we have studied how to reduce the initial access overhead, we propose a mechanism
to reduce the beam tracking overhead. For this, we propose an open platform based
on a Field Programmable Gate Array (FPGA) on which we implement an algorithm that
completely removes the need to train on the Station (STA) side. This is achieved by
changing beam patterns on the STA side while the Access Point (AP) is sending the
preamble. We can change up to 10 beam patterns without losing connection and we reduce
the overhead by a factor of 8.8 with respect to the IEEE 802.11ad standard. Finally
we present a dual-band location system based on Commercial-Off-The-Shelf (COTS)
devices. Locating the STA can improve the channel quality significantly, since the
AP can predict and react to possible blockages. First, we reverse engineer existing 60
GHz-enabled COTS devices to extract Channel State Information (CSI) and Fine Timing
Measurement (FTM) data, from which we can estimate angle and distance.
Then we develop an algorithm that chooses between HF and LF in order to
improve the overall accuracy of the system. We achieve less than 17 cm of median error
in indoor environments, even when some areas are Non Line Of Sight (NLOS).
This work has been supported by IMDEA Networks Institute. PhD Programme in Telematics Engineering, Universidad Carlos III de Madrid. Committee: Matthias Hollick (Chair), Vincenzo Mancuso (Secretary), Paolo Casari (Member).
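The localization step above combines a CSI-derived angle of arrival with an FTM range. The geometry, and a naive version of the HF/LF selection, can be sketched as below; the numbers, function names, and error inputs are hypothetical illustrations, not the thesis's actual algorithm.

```python
import math

def position_from_aoa_ftm(ap_xy, aoa_deg, dist_m):
    """Place the STA at the AP plus the FTM range along the CSI-derived
    angle of arrival (2-D sketch; a real system adds bias calibration)."""
    th = math.radians(aoa_deg)
    return (ap_xy[0] + dist_m * math.cos(th),
            ap_xy[1] + dist_m * math.sin(th))

def fuse(hf_est, lf_est, hf_err, lf_err):
    """Pick the band whose current error estimate is lower, mimicking
    the HF/LF selection step (hypothetical error inputs)."""
    return hf_est if hf_err <= lf_err else lf_est

ap = (0.0, 0.0)
hf = position_from_aoa_ftm(ap, 30.0, 4.0)   # 60 GHz: sharp beams, finer angle
lf = position_from_aoa_ftm(ap, 33.0, 4.3)   # sub-6 GHz: coarser estimate
print(fuse(hf, lf, hf_err=0.15, lf_err=0.60))
```

When the 60 GHz link is LOS its estimate wins; the point of the dual-band design is that the LF fallback keeps the system working when the HF path is blocked (NLOS).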
Exploring Wireless Data Center Networks: Can They Reduce Energy Consumption While Providing Secure Connections?
Data centers have become the digital backbone of the modern world. To support the growing demands on bandwidth, data centers consume an increasing amount of power. A significant portion of that power is consumed by information technology (IT) equipment, including servers and networking components. Additionally, the complex cabling in traditional data centers poses design and maintenance challenges and increases the energy cost of the cooling infrastructure by obstructing the flow of chilled air. Hence, to reduce the power consumption of data centers, we propose a wireless server-to-server data center network architecture using millimeter-wave links to eliminate the need for the power-hungry switching fabric of traditional fat-tree-based data center networks. The server-to-server wireless data center network (S2S-WiDCN) architecture requires Line-of-Sight (LoS) between servers to establish direct communication links. However, in the presence of interference from internal or external sources, or of an obstruction such as an IT technician, the LoS may be blocked. To address this issue, we also propose a novel obstruction-aware adaptive routing algorithm for S2S-WiDCN.
S2S-WiDCN can reduce the power consumption of the network portion of the data center while leaving the power consumption of the servers unaffected, even though the servers contribute significantly to the total power consumption of the data center. Moreover, servers in data centers are almost always underutilized due to over-provisioning, which contributes heavily to their high power consumption. To address the high power consumption of the servers, we propose a network-aware, bandwidth-constrained server consolidation algorithm called Network-Aware Server Consolidation (NASCon) for wireless data centers that can reduce power consumption by up to 37% while improving network performance. However, due to the arrival of new tasks and the completion of existing tasks, the consolidated utilization profile of the servers changes over time, which may have an adverse effect on overall power consumption. To overcome this, the NASCon algorithm needs to be executed periodically. We have proposed a mathematical model to estimate the optimal inter-consolidation time, which the data center resource management unit can use to schedule NASCon consolidation operations in real time and leverage the benefits of server consolidation.
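The consolidation idea (packing underutilized workloads onto fewer servers without exceeding compute or network bandwidth limits) can be illustrated with a generic first-fit-decreasing heuristic. This is a stand-in sketch under assumed capacities, not the NASCon algorithm itself, whose network-awareness and 37% result come from the dissertation.

```python
def consolidate(vms, cpu_cap=1.0, bw_cap=1.0):
    """Greedy first-fit-decreasing consolidation: pack (cpu, bw) demands
    onto as few servers as possible without exceeding either capacity.
    A generic stand-in for a network-aware consolidation heuristic."""
    servers = []  # each entry is [cpu_used, bw_used]
    for cpu, bw in sorted(vms, reverse=True):  # largest demands first
        for s in servers:
            if s[0] + cpu <= cpu_cap and s[1] + bw <= bw_cap:
                s[0] += cpu
                s[1] += bw
                break
        else:  # no existing server fits: power on a new one
            servers.append([cpu, bw])
    return len(servers)

# ten half-utilized workloads fit on far fewer machines once consolidated
demands = [(0.25, 0.2)] * 8 + [(0.5, 0.4)] * 2
print(consolidate(demands))  # 3 servers instead of 10
```

The bandwidth cap is what makes the problem "network-aware" in spirit: a placement that fits on CPU alone may still be rejected because it would saturate a server's wireless links.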
However, in any data center environment, ensuring security is one of the highest design priorities. Hence, for S2S-WiDCN to become a practical and viable solution for data center network design, the security of the network has to be ensured. An S2S-WiDCN data center can be vulnerable to a variety of attacks, as it uses wireless links over an unguided channel for communication. Being a wireless system, the network has to be secured against threats common to any wireless network, such as eavesdropping, denial-of-service, and jamming attacks. In parallel, other security threats, such as attacks on the control plane and side-channel attacks through traffic analysis, are also possible. We have conducted an extensive study elaborating the scope of these attacks and exploring probable solutions. We also propose viable defenses against eavesdropping, denial-of-service, jamming, and control-plane attacks. To address the traffic analysis attack, we propose a simulated-annealing-based random routing mechanism that can be adopted instead of default routing in the wireless data center.
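Randomized routing resists traffic analysis because the path taken varies even when demands do not. A minimal simulated-annealing sketch of that idea is below; the candidate paths, cost function, and cooling schedule are illustrative assumptions, not the dissertation's exact scheme.

```python
import math
import random

def sa_route(paths, cost, iters=200, t0=1.0, cooling=0.98, seed=7):
    """Simulated-annealing path selection: wander among candidate paths,
    accepting a worse path with probability exp(-delta/T), so the route
    actually used varies rather than always being the single cheapest one.
    A generic sketch of randomized routing, not the thesis's exact scheme."""
    rng = random.Random(seed)
    cur = rng.choice(paths)
    t = t0
    for _ in range(iters):
        cand = rng.choice(paths)
        delta = cost(cand) - cost(cur)
        # always accept improvements; sometimes accept worse paths
        if delta <= 0 or rng.random() < math.exp(-delta / t):
            cur = cand
        t *= cooling  # cool down: late in the run, fewer bad moves accepted
    return cur  # the (randomized) route actually used

paths = [("A", "B", "D"), ("A", "C", "D"), ("A", "B", "C", "D")]
hop_cost = lambda p: len(p) - 1  # hop count as a simple cost proxy
print(sa_route(paths, hop_cost))
```

Because acceptance is probabilistic, repeated runs with different seeds spread traffic across near-optimal paths, which is the property that frustrates an eavesdropper trying to infer the traffic matrix from any one link.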