
    Relay assisted device-to-device communication with channel uncertainty

    The gains of direct communication between user equipment in a network may not be fully realised due to the separation between the user equipment and the fading experienced by the channel between them. To fully realise the gains that direct (device-to-device) communication promises, idle user equipment can be exploited to serve as relays that enable device-to-device communication. The availability of multiple potential relay user equipment creates a problem: how to select the relay user equipment. Moreover, unlike infrastructure relays, user equipment are carried around by people, and these users are self-interested. The problem of relay selection therefore goes beyond choosing which device should assist in relayed communication; it must also cater for user self-interest. Another problem in wireless communication is the unavailability of perfect channel state information. This reality creates uncertainty in the channel, so channel uncertainty awareness needs to be a consideration in the design of selection algorithms. The work in this thesis therefore considers the design of relay user equipment selection algorithms that are not only device centric but also relay user equipment centric. Furthermore, the designed algorithms are channel uncertainty aware. Firstly, a stable-matching-based relay user equipment selection algorithm is put forward for underlay device-to-device communication. A channel-uncertainty-aware approach is proposed to cater for imperfect channel state information at the devices, and the algorithm is combined with a rate-based mode selection algorithm. Next, to cater for the queue state at the relay user equipment, a cross-layer selection algorithm is proposed for a two-way decode-and-forward relay set-up. The proposed algorithm employs a deterministic uncertainty constraint on the interference channel and solves the selection problem in a heuristic fashion. Then a cluster head selection algorithm is proposed for device-to-device group communication constrained by channel uncertainty in the interference channel. The formulated rate maximization problem is solved for deterministic and probabilistic constraint scenarios, and the problem is extended to a multiple-input single-output scenario for which robust beamforming is designed. Finally, relay utility and social distance based selection algorithms are proposed for a full-duplex decode-and-forward device-to-device communication set-up, with a worst-case approach proposed for the full channel uncertainty scenario. The results from computer simulations indicate that the proposed algorithms offer spectral efficiency, fairness and energy efficiency gains. The results also clearly show the deterioration in network performance when perfect channel state information is assumed.
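    As a rough illustration of the stable-matching idea in the first contribution, the sketch below runs a deferred-acceptance (Gale-Shapley style) matching between D2D pairs and candidate relay user equipment. The preference lists, toy identifiers and one-to-one matching assumption are illustrative only and are not taken from the thesis.

```python
# Minimal deferred-acceptance (Gale-Shapley) sketch for relay UE selection.
# Hypothetical inputs: each D2D pair ranks relay UEs (e.g. by estimated rate),
# each relay UE ranks D2D pairs by its own utility (e.g. incentive offered).

def stable_relay_matching(pair_prefs, relay_prefs):
    """pair_prefs: {pair: [relay, ...]} ordered best-first.
       relay_prefs: {relay: [pair, ...]} ordered best-first.
       Returns a stable one-to-one matching {pair: relay}."""
    free_pairs = list(pair_prefs)            # pairs not yet matched
    next_idx = {p: 0 for p in pair_prefs}    # next relay each pair will propose to
    engaged = {}                             # relay -> pair currently holding it

    while free_pairs:
        pair = free_pairs.pop(0)
        if next_idx[pair] >= len(pair_prefs[pair]):
            continue                         # preference list exhausted: stays unmatched
        relay = pair_prefs[pair][next_idx[pair]]
        next_idx[pair] += 1
        if relay not in engaged:
            engaged[relay] = pair
        else:
            rival = engaged[relay]
            ranking = relay_prefs[relay]
            # the relay keeps whichever proposer it ranks higher
            if ranking.index(pair) < ranking.index(rival):
                engaged[relay] = pair
                free_pairs.append(rival)
            else:
                free_pairs.append(pair)
    return {p: r for r, p in engaged.items()}

# Toy example: two D2D pairs, two candidate relay UEs
pair_prefs = {"d2d1": ["r1", "r2"], "d2d2": ["r1", "r2"]}
relay_prefs = {"r1": ["d2d2", "d2d1"], "r2": ["d2d1", "d2d2"]}
print(stable_relay_matching(pair_prefs, relay_prefs))   # {'d2d2': 'r1', 'd2d1': 'r2'}
```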

    Centralized and partial decentralized design for the Fog Radio Access Network

    The Fog Radio Access Network (F-RAN) has been shown to be a promising network architecture for the 5G network. With F-RAN, a certain amount of the signal processing functionality is pushed from the Base Stations (BSs) at the network edge to the BaseBand Unit (BBU) pool located remotely in the cloud. Hence, partially centralized network operation and management can be achieved, which can greatly improve the energy and spectral efficiency of the network in order to meet the requirements of 5G. In this work, the optimal design of both the uplink and the downlink of F-RAN is investigated in depth.

    Towards reliable communication in LTE-A connected heterogeneous machine to machine network

    Machine-to-machine (M2M) communication is an emerging technology that enables heterogeneous devices to communicate with each other without human intervention, thus forming the so-called Internet of Things (IoT). Wireless cellular networks (WCNs) play a significant role in the successful deployment of M2M communication. In particular, the ongoing massive deployment of Long Term Evolution Advanced (LTE-A) makes it possible to establish machine-type communication (MTC) in most urban and remote areas, and by using the LTE-A backhaul network, seamless communication is being established between MTC devices and applications. However, extensive network coverage alone does not ensure a successful implementation of M2M communication in LTE-A, and several challenges remain. Energy-efficient, reliable transmission is perhaps the most compelling demand of various M2M applications. Among the factors affecting the reliability of M2M communication are high end-to-end delay and high bit error rate. The objective of this thesis is to provide reliable M2M communication in the LTE-A network. To alleviate signalling congestion on the air interface and enable efficient data aggregation, we consider a cluster-based architecture in which the MTC devices are grouped into a number of clusters and traffic is forwarded through special nodes called cluster heads (CHs) to the base station (BS) using single- or multi-hop transmissions. In many deployment scenarios, some machines are allowed to move and change their location in the deployment area with very low mobility. In practice, the performance of data transmission often degrades as the distance between neighbouring CHs increases, and the CH then needs to be re-selected. However, frequent re-selection of CHs has a counter-effect on routing and on the reconfiguration of resource allocation associated with CH-dependent protocols. In addition, the link quality between CH and CH and between CH and BS is often affected by dynamic environmental factors such as heat and humidity, obstacles and RF interference. Since a CH aggregates the traffic of all cluster members, failure of the CH means that the whole cluster fails. Many solutions have been proposed to combat the error-prone wireless channel, such as automatic repeat request (ARQ) and multipath routing. Although these techniques improve communication reliability, they compromise communication efficiency: in the former scheme, the transmitter retransmits the whole packet even though part of the packet has been received correctly, and in the latter, the receiver may receive the same information over multiple paths; both techniques are therefore bandwidth and energy inefficient. In addition, with retransmission the overall end-to-end delay may exceed the maximum allowable delay budget. Based on these observations, we identify the CH-to-CH channel as one of the bottlenecks to providing reliable communication in a cluster-based multihop M2M network and present a complete solution to support fountain-coded cooperative communications. Our solution covers many aspects, from relay selection to cooperative formation, to meet the user's QoS requirements. In the first part of the thesis, we design a rateless-coded incremental relay selection (RCIRS) algorithm based on greedy techniques to guarantee the required data rate at minimum cost. We then develop fountain-coded cooperative communication protocols to facilitate data transmission between two neighbouring CHs.
    In the second part, we propose joint network and fountain coding schemes for reliable communication. By coupling channel coding and network coding in the physical layer, joint network and fountain coding schemes efficiently exploit the redundancy of both codes and effectively combat the detrimental effects of fading in wireless channels. In the proposed scheme, after correctly decoding the information from different sources, a relay node applies network and fountain coding to the received signals and then transmits to the destination in a single transmission. The proposed schemes therefore exploit diversity and coding gain to improve system performance. In the third part, we focus on reliable uplink transmission between CHs and the BS, where CHs transmit to the BS directly or with the help of LTE-A relay nodes (RNs). We investigate both type-I and type-II enhanced LTE-A networks and propose a set of joint network and fountain coding schemes to enhance link robustness. Finally, the proposed solutions are evaluated through extensive numerical simulations, and the numerical results are compared with related work found in the literature.
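    The greedy flavour of the RCIRS step can be illustrated with a minimal sketch: candidate relays are added one at a time, each time picking the relay with the best rate-gain-per-cost ratio, until a required rate between two cluster heads is met. The additive rate model, cost units and function names below are simplifying assumptions, not the algorithm from the thesis.

```python
# Hedged sketch of greedy incremental relay selection in the spirit of RCIRS:
# keep adding the candidate relay with the best rate-gain-per-cost ratio until
# the required end-to-end rate between two cluster heads is met.
# The additive per-relay rate model below is a simplifying assumption.

def greedy_relay_selection(candidates, direct_rate, target_rate):
    """candidates: {relay_id: (rate_contribution_bps, cost)}.
       Returns (selected_relays, achieved_rate, total_cost) or None if infeasible."""
    selected, rate, cost = [], direct_rate, 0.0
    remaining = dict(candidates)
    while rate < target_rate and remaining:
        # pick the relay with the largest rate gain per unit cost
        best = max(remaining, key=lambda r: remaining[r][0] / remaining[r][1])
        gain, c = remaining.pop(best)
        selected.append(best)
        rate += gain
        cost += c
    return (selected, rate, cost) if rate >= target_rate else None

# Toy example with made-up rates (bit/s) and abstract cost units
candidates = {"relay_a": (2.0e6, 1.0), "relay_b": (1.5e6, 0.5), "relay_c": (0.8e6, 0.4)}
print(greedy_relay_selection(candidates, direct_rate=1.0e6, target_rate=4.0e6))
```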

    Video transport optimization techniques design and evaluation for next generation cellular networks

    Video is foreseen to be the dominant type of data traffic in the Internet. This vision is supported by a number of studies forecasting that video traffic will increase drastically in the coming years, surpassing peer-to-peer traffic in volume already within the current year. Current infrastructures are not prepared to deal with this traffic increase. The current Internet, and in particular the mobile Internet, was not designed with video requirements in mind and, as a consequence, its architecture is very inefficient at handling this volume of video traffic. When a large part of the traffic is associated with multimedia entertainment, most of the mobile infrastructure is used very inefficiently to provide such a simple service, saturating the whole cellular network and leading to perceived quality levels that are not adequate to support widespread end-user acceptance. The main goal of the research in this thesis is to evolve the mobile Internet architecture for efficient video traffic support. As video is expected to represent the majority of the traffic, the future architecture should efficiently support the requirements of this data type, and specific enhancements for video should be introduced at all layers of the protocol stack where needed. These enhancements need to cater for improved quality of experience, improved reliability in a mobile world (anywhere, anytime), lower exploitation cost, and increased flexibility. In this thesis a set of video delivery mechanisms is designed to optimize video transmission at different layers of the protocol stack and at different levels of the cellular network. Building on these architectural choices, resource allocation schemes are implemented to support a range of video applications, covering video broadcast/multicast streaming, video on demand, real-time streaming, video progressive download and video upstreaming. By means of simulation, the benefits of the designed mechanisms in terms of perceived video quality and network resource savings are shown and compared to existing solutions. Furthermore, selected modules are implemented in a real testbed, and experimental results are provided to support the development of such transport mechanisms in practice.

    Smart Sensor Technologies for IoT

    The recent development of wireless networks and devices has led to novel services that will utilize wireless communication on a new level. Much effort and many resources have been dedicated to establishing new communication networks that will support machine-to-machine communication and the Internet of Things (IoT). In these systems, various smart and sensory devices are deployed and connected, enabling large amounts of data to be streamed. Smart services represent a new trend in mobile services, i.e., a completely new spectrum of context-aware, personalized, and intelligent services and applications. A variety of existing services utilize information about the position of the user or mobile device. The position of mobile devices is often obtained using the Global Navigation Satellite System (GNSS) chips that are integrated into all modern mobile devices (smartphones). However, GNSS is not always a reliable source of position estimates due to multipath propagation and signal blockage. Moreover, integrating GNSS chips into all devices might have a negative impact on the battery life of future IoT applications. Therefore, alternative solutions for position estimation should be investigated and implemented in IoT applications. This Special Issue, "Smart Sensor Technologies for IoT", aims to report on some of the recent research efforts on this increasingly important topic. The twelve accepted papers in this issue cover various aspects of smart sensor technologies for IoT.
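    One family of GNSS alternatives alluded to above is range-based positioning against anchors at known locations. The sketch below converts RSSI readings to distances with a log-distance path-loss model and solves a linearised least-squares multilateration; the path-loss parameters, anchor layout and measurements are made up for illustration and are not taken from the Special Issue papers.

```python
# Minimal sketch of one GNSS alternative: RSSI-based multilateration against
# anchors at known positions. Path-loss parameters are illustrative assumptions.
import numpy as np

def rssi_to_distance(rssi_dbm, tx_power_dbm=-40.0, path_loss_exp=2.5):
    """Invert a log-distance path-loss model: RSSI = P0 - 10*n*log10(d)."""
    return 10 ** ((tx_power_dbm - rssi_dbm) / (10.0 * path_loss_exp))

def multilaterate(anchors, distances):
    """Linearised least-squares 2-D position fix from >= 3 anchors.
       anchors: (k, 2) array of known positions; distances: length-k array."""
    anchors, d = np.asarray(anchors, float), np.asarray(distances, float)
    x0, y0, d0 = anchors[0, 0], anchors[0, 1], d[0]
    # Subtract the first circle equation from the others -> linear system A p = b
    A = 2.0 * (anchors[1:] - anchors[0])
    b = (d0**2 - d[1:]**2
         + np.sum(anchors[1:]**2, axis=1) - (x0**2 + y0**2))
    pos, *_ = np.linalg.lstsq(A, b, rcond=None)
    return pos

anchors = [(0, 0), (10, 0), (0, 10), (10, 10)]
rssi = [-62.0, -58.0, -66.0, -64.0]              # made-up measurements (dBm)
dists = [rssi_to_distance(r) for r in rssi]
print(multilaterate(anchors, dists))             # estimated (x, y)
```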

    Interference as an Issue and a Resource in Wireless Networks

    This dissertation focuses on the phenomenon of interference in wireless networks. On the one hand, interference is viewed as a negative factor that one should mitigate in order to improve the performance of a wireless network in terms of achievable rate; on the other hand, it is viewed as an asset for increasing the performance of a network in terms of security. The problems investigated are, first, the characterisation of the performance of a communication network modelled as an interference channel (IC) when interference alignment (IA) is used to mitigate the interference with imperfect knowledge of the channel state and, second, the characterisation of secrecy in the Internet-of-Things (IoT) framework, where some devices may use artificial noise to generate interference to potential eavesdroppers. Two scenarios are studied in the case where interference is unwanted. The first is when the channel error is bounded: a lower bound on the capacity achievable in this case is provided, and a new performance metric, namely the saturating SNR, is derived. The derived lower bound is studied with respect to parameters of the estimation strategy when Least-Squares estimation is used to estimate the channel matrices. The second scenario deals with unbounded Gaussian estimation errors: here the statistical distribution of the achievable rate is given, along with a new performance metric called the outage probability that simplifies the study of the IC with IA under imperfect CSI. The results are used to optimise the network parameters, and the analysis is further extended to the case of cellular networks. In the wanted-interference situation, the secrecy of the worst-case communication is studied and the conditions for secrecy are provided. Furthermore, the average number of secure links achievable in the network is studied according to a theoretical model developed for the IoT case.
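    The outage-probability metric mentioned above can be illustrated with a small Monte Carlo experiment: under imperfect CSI, residual interference left after alignment caps the SINR, and the probability that the achievable rate falls below a target can be estimated by simulation. The single-link leakage model below is a deliberate simplification of the interference-channel setting studied in the dissertation; the leakage-proportional-to-error-variance assumption and all numbers are illustrative.

```python
# Hedged Monte Carlo sketch of the "outage probability" idea: estimate
# P(achievable rate < target) when imperfect channel estimates leave residual
# interference after alignment. Single link for simplicity; the dissertation
# treats the full interference channel with interference alignment.
import numpy as np

rng = np.random.default_rng(0)

def outage_probability(snr_db, target_rate, est_error_var, trials=100_000):
    snr = 10 ** (snr_db / 10.0)
    # Rayleigh-fading direct channel h and interfering channel g
    h = (rng.standard_normal(trials) + 1j * rng.standard_normal(trials)) / np.sqrt(2)
    g = (rng.standard_normal(trials) + 1j * rng.standard_normal(trials)) / np.sqrt(2)
    # Imperfect alignment leaves residual interference proportional to the
    # channel-estimation error variance (simplified leakage model).
    sinr = snr * np.abs(h) ** 2 / (1.0 + snr * est_error_var * np.abs(g) ** 2)
    rate = np.log2(1.0 + sinr)
    return np.mean(rate < target_rate)

for var in (0.0, 0.05, 0.2):   # growing estimation error
    print(var, outage_probability(snr_db=10, target_rate=2.0, est_error_var=var))
```

    A side effect of this toy model also hints at the saturating-SNR idea: as the transmit SNR grows, the simulated SINR is bounded by |h|^2 / (est_error_var * |g|^2), so the rate stops improving beyond a point determined by the estimation error.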

    Prediction-based techniques for the optimization of mobile networks

    Mobile cellular networks are complex systems whose behavior is characterized by the superposition of several random phenomena, most of them related to human activities such as mobility, communications and network usage. However, when observed in their totality, the many individual components merge into more deterministic patterns, and trends start to be identifiable and predictable. In this thesis we analyze a recent branch of network optimization that is commonly referred to as anticipatory networking and that entails the combination of prediction solutions and network optimization schemes. The main intuition behind anticipatory networking is that knowing in advance what is going on in the network can help in understanding potentially severe problems and in mitigating their impact by applying solutions while they are still in their initial state. Conversely, the network forecast might also indicate a future improvement in the overall network condition (e.g., a load reduction or better signal quality reported by users). In such a case, resources can be assigned more sparingly, requiring users to rely on buffered information while waiting for the better condition, when it will be more convenient to grant more resources. At the beginning of this thesis we survey the current anticipatory networking panorama and the many prediction and optimization solutions proposed so far. In the main body of the work, we propose our novel solutions to the problem, together with the tools and methodologies we designed to evaluate them and to perform a real-world evaluation of our schemes. By the end of this work it will be clear that not only is anticipatory networking a very promising theoretical framework, but also that it is feasible and can deliver substantial benefits to current and next-generation mobile networks. In fact, both our theoretical and practical results show evidence that more than one third of the resources can be saved, with even larger gains achievable for data rate enhancements.
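    A toy sketch of the anticipatory idea follows: forecast near-future link quality from recent samples and, when an improvement is predicted and the user has enough buffered media, grant resources sparingly now. The linear-trend forecaster, thresholds and throughput samples are illustrative assumptions, not the schemes proposed in the thesis.

```python
# Toy sketch of anticipatory networking: forecast near-future link quality and,
# if an improvement is predicted, serve the user sparingly now and let its
# playout buffer bridge the gap. The linear-trend forecast is an illustrative choice.

def linear_forecast(samples, horizon=3):
    """Least-squares linear trend over the samples, extrapolated `horizon` steps ahead."""
    n = len(samples)
    xs = range(n)
    x_mean, y_mean = (n - 1) / 2.0, sum(samples) / n
    cov = sum((x - x_mean) * (y - y_mean) for x, y in zip(xs, samples))
    var = sum((x - x_mean) ** 2 for x in xs)
    slope = cov / var if var else 0.0
    return y_mean + slope * ((n - 1 + horizon) - x_mean)

def allocation_decision(rate_history_mbps, buffer_s, playout_rate_mbps, min_buffer_s=10):
    predicted = linear_forecast(rate_history_mbps)
    current = rate_history_mbps[-1]
    if predicted > current and buffer_s > min_buffer_s:
        # Better conditions ahead and enough buffered media: allocate just enough now
        return max(playout_rate_mbps, 0.5 * current)
    return current                      # otherwise grant what the channel supports now

history = [8.0, 7.5, 9.0, 10.5, 12.0]   # made-up throughput samples (Mbit/s)
print(allocation_decision(history, buffer_s=25, playout_rate_mbps=4.0))
```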

    Optimization Modeling and Machine Learning Techniques Towards Smarter Systems and Processes

    The continued penetration of technology into our daily lives has led to the emergence of the concept of Internet-of-Things (IoT) systems and networks. An increasing number of enterprises and businesses are adopting IoT-based initiatives in the expectation that they will result in a higher return on investment (ROI) [1]. However, adopting such technologies poses many challenges. One challenge is improving the performance and efficiency of such systems by properly allocating the available and scarce resources [2, 3]. A second challenge is making use of the massive amount of data generated to help make smarter and more informed decisions [4]. A third challenge is protecting such devices and systems, given the surge in security breaches and attacks in recent times [5]. To that end, this thesis proposes the use of various optimization modeling and machine learning techniques in three different systems, namely wireless communication systems, learning management systems (LMSs), and computer network systems. In particular, the first part of the thesis posits optimization modeling techniques to improve the aggregate throughput and power efficiency of a wireless communication network. The second part proposes the use of unsupervised machine learning clustering techniques, integrated into LMSs, to identify unengaged students based on their engagement with the material in an e-learning environment. Lastly, the third part suggests the use of exploratory data analytics, unsupervised machine learning clustering, and supervised machine learning classification techniques to identify malicious/suspicious domain names in a computer network setting. The main contributions of this thesis can be divided into three broad parts. The first is developing optimal and heuristic scheduling algorithms that improve the performance of wireless systems in terms of throughput and power by combining wireless resource virtualization with device-to-device and machine-to-machine communications. The second is using unsupervised machine learning clustering and association algorithms to determine an appropriate engagement-level model for blended e-learning environments and to study the relationship between engagement and academic performance in such environments. The third is developing a supervised ensemble learning classifier that detects malicious/suspicious domain names with high accuracy and precision.
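    As a rough illustration of the second part, the sketch below clusters students by a few engagement features with k-means and flags the lowest-engagement cluster. The feature set, the choice of three clusters and the data are illustrative assumptions rather than the engagement model developed in the thesis.

```python
# Hedged sketch of the unsupervised-clustering idea: group students by simple
# engagement features (logins, resources viewed, forum posts) with k-means and
# flag the lowest-engagement cluster. Features and k=3 are illustrative choices.
import numpy as np
from sklearn.cluster import KMeans
from sklearn.preprocessing import StandardScaler

# Rows: students; columns: [logins, resources_viewed, forum_posts] (made-up data)
X = np.array([[40, 120, 15], [35, 110, 9], [5, 12, 0],
              [8, 20, 1], [22, 60, 4], [18, 55, 3]], dtype=float)

X_std = StandardScaler().fit_transform(X)
km = KMeans(n_clusters=3, n_init=10, random_state=0).fit(X_std)

# The cluster whose centroid has the smallest mean (standardised) engagement
# is treated as the "unengaged" group.
unengaged_cluster = np.argmin(km.cluster_centers_.mean(axis=1))
print("unengaged students:", np.where(km.labels_ == unengaged_cluster)[0])
```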

    Recent Advances in Indoor Localization Systems and Technologies

    Despite the enormous technical progress seen in the past few years, the maturity of indoor localization technologies has not yet reached the level of GNSS solutions. The 23 selected papers in this book present the recent advances and new developments in indoor localization systems and technologies, propose novel or improved methods with increased performance, provide insight into various aspects of quality control, and also introduce some unorthodox positioning methods

    Modelling, Dimensioning and Optimization of 5G Communication Networks, Resources and Services

    This reprint collects state-of-the-art research contributions that address challenges in the design, dimensioning and optimization of emerging 5G networks. The design, dimensioning and optimization of communication network resources and services have long been an inseparable part of telecom network development. Such networks must convey a large volume of traffic, providing service to traffic streams with highly differentiated requirements in terms of bit rate, service time, and required quality of service and quality of experience parameters. This communication infrastructure presents many important challenges, such as the study of necessary multi-layer cooperation, new protocols, performance evaluation of different network parts, low-layer network design, network management and security issues, and new technologies in general, which are discussed in this book.