
    Towards Massive Machine Type Communications in Ultra-Dense Cellular IoT Networks: Current Issues and Machine Learning-Assisted Solutions

    The ever-increasing number of resource-constrained Machine-Type Communication (MTC) devices is leading to the critical challenge of fulfilling diverse communication requirements in dynamic and ultra-dense wireless environments. Among the different application scenarios that the upcoming 5G and beyond cellular networks are expected to support, such as eMBB, mMTC and URLLC, mMTC brings the unique technical challenge of supporting a huge number of MTC devices, which is the main focus of this paper. The related challenges include QoS provisioning, handling highly dynamic and sporadic MTC traffic, huge signalling overhead and Radio Access Network (RAN) congestion. In this regard, this paper aims to identify and analyze the involved technical issues, to review recent advances, to highlight potential solutions and to propose new research directions. First, starting with an overview of mMTC features and QoS provisioning issues, we present the key enablers for mMTC in cellular networks. Along with the highlights on the inefficiency of the legacy Random Access (RA) procedure in the mMTC scenario, we then present the key features and channel access mechanisms in the emerging cellular IoT standards, namely, LTE-M and NB-IoT. Subsequently, we present a framework for the performance analysis of transmission scheduling with QoS support, along with the issues involved in short data packet transmission. Next, we provide a detailed overview of the existing and emerging solutions towards addressing the RAN congestion problem, and then identify potential advantages, challenges and use cases for the application of emerging Machine Learning (ML) techniques in ultra-dense cellular networks. Among several ML techniques, we focus on the application of a low-complexity Q-learning approach in mMTC scenarios. Finally, we discuss some open research challenges and promising future research directions. Comment: 37 pages, 8 figures, 7 tables, submitted for possible future publication in IEEE Communications Surveys and Tutorials
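
    The survey's emphasis on low-complexity Q-learning for the RA procedure can be illustrated with a small, self-contained sketch. The code below is not the paper's algorithm; the device count, slot count, reward values and learning rate are assumptions chosen only to show how each device can learn, from collision feedback alone, which RA slot to contend on.

```python
# Minimal tabular Q-learning sketch for random-access (RA) slot selection by
# contending MTC devices. All parameters here (device/slot counts, rewards,
# learning rate) are illustrative assumptions, not values from the paper.
import random

NUM_DEVICES = 8        # contending MTC devices (assumed)
NUM_SLOTS = 10         # RA slots available per frame (assumed)
ALPHA, EPS = 0.1, 0.1  # learning rate and exploration probability

# One Q-table row per device: Q[d][s] = learned value of device d picking slot s.
Q = [[0.0] * NUM_SLOTS for _ in range(NUM_DEVICES)]

def choose_slot(dev: int) -> int:
    """Epsilon-greedy selection of an RA slot for one device."""
    if random.random() < EPS:
        return random.randrange(NUM_SLOTS)
    return max(range(NUM_SLOTS), key=Q[dev].__getitem__)

def run_frame() -> int:
    """Simulate one RA frame; reward +1 for a collision-free slot, -1 for a collision."""
    choices = [choose_slot(d) for d in range(NUM_DEVICES)]
    collisions = 0
    for d, s in enumerate(choices):
        success = choices.count(s) == 1
        reward = 1.0 if success else -1.0
        # Stateless (single-state) Q-learning update.
        Q[d][s] += ALPHA * (reward - Q[d][s])
        collisions += 0 if success else 1
    return collisions

if __name__ == "__main__":
    history = [run_frame() for _ in range(2000)]
    print("collisions: first 10 frames", sum(history[:10]),
          "vs last 10 frames", sum(history[-10:]))
```

    Run over a few thousand frames, the collision count in the later frames typically drops well below that of the early frames, which is the intuition behind applying Q-learning to relieve RAN congestion in dense MTC deployments.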

    An intelligent approach to quality of service for MPEG-4 video transmission in IEEE 802.15.1

    Nowadays, wireless connectivity is becoming ubiquitous, spreading to companies and domestic areas alike. IEEE 802.15.1, commonly known as Bluetooth, is a high-quality, high-security, high-speed and low-cost radio technology. This wireless technology allows a maximum access range of 100 meters yet needs as little as 1 mW of power. Regrettably, IEEE 802.15.1 has very limited bandwidth. This limitation can become a real problem if the user wishes to transmit a large amount of data in a very short time. Version 1.2, which is used in this project, can only carry a maximum download rate of 724 Kbps and an upload rate of 54 Kbps in its asynchronous mode. Video, however, needs a very large bandwidth to be transmitted with a sufficient level of quality. Video transmission over IEEE 802.15.1 networks would therefore be difficult to achieve, due to the limited bandwidth. Hence, a solution is required to transmit digital video so that a sufficient picture quality arrives at the receiving end. A hybrid scheme has been developed in this thesis, comprising a fuzzy logic rule set and an artificial neural network algorithm. MPEG-4 video compression has been used in this work to optimise the transmission. This research further utilises an ‘added-buffer’ to prevent excessive data loss of MPEG-4 video over IEEE 802.15.1 transmission and subsequently increase picture quality. The neural-fuzzy scheme regulates the output rate of the added-buffer to ensure that the MPEG-4 video stream conforms to the traffic conditions of the IEEE 802.15.1 channel during the transmission period, that is, it sends more data when the bandwidth is not fully used and keeps the data in the buffer when the bandwidth is overused. Computer simulation results confirm that the intelligent techniques and the added-buffer improve picture quality and reduce data loss and communication delay, compared with conventional MPEG video transmission over IEEE 802.15.1.
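
    The core control idea, sending more data when the 802.15.1 channel has spare capacity and holding data in the added-buffer when it is congested, can be sketched with a simple crisp-rule controller. This is only an illustration under assumed thresholds; the thesis itself uses a neural-fuzzy scheme, and apart from the 724 Kbps rate quoted above the figures below are not taken from it.

```python
# Crisp-rule sketch of added-buffer output regulation for MPEG-4 over IEEE 802.15.1.
# The 0.8 thresholds and the 0.2 / 0.1 factors are illustrative assumptions; the
# thesis replaces such hard rules with a neural-fuzzy controller.

LINK_CAPACITY_KBPS = 724.0  # asynchronous downlink rate of Bluetooth 1.2, as cited above

def buffer_output_rate(buffer_fill: float, channel_load: float) -> float:
    """Return the sending rate (kbps) granted to the added-buffer for the next interval.

    buffer_fill  -- occupancy of the added-buffer, in [0, 1]
    channel_load -- fraction of the 802.15.1 bandwidth already in use, in [0, 1]
    """
    headroom = max(0.0, 1.0 - channel_load)
    if buffer_fill > 0.8:        # buffer nearly full: drain aggressively to avoid loss
        share = min(1.0, headroom + 0.2)
    elif channel_load > 0.8:     # channel congested: keep data buffered, trickle only
        share = 0.1 * headroom
    else:                        # normal case: use whatever bandwidth is free
        share = headroom
    return share * LINK_CAPACITY_KBPS

# Example: half-full buffer while 30% of the channel is in use
print(round(buffer_output_rate(0.5, 0.3), 1), "kbps")
```

    A fuzzy controller would replace the hard 0.8 thresholds with overlapping membership functions, and a neural component could tune those memberships from observed loss and delay, which is the general shape of the neural-fuzzy scheme described above.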

    Security and the smart city: A systematic review

    The implementation of smart technology in cities is often hailed as the solution to many urban challenges such as transportation, waste management, and environmental protection. Issues of security and crime prevention, however, are in many cases neglected. Moreover, when researchers do introduce new smart security technologies, they rarely discuss their implementation or question how new smart city security might affect traditional policing and urban planning processes. This systematic review explores the recent literature concerned with new ‘smart city’ security technologies and aims to investigate to what extent these new interventions correspond with traditional functions of security interventions. Through an extensive literature search we compiled a list of security interventions for smart cities and suggest several changes to the conceptual status quo in the field. Ultimately, we propose three clear categories for classifying security interventions in smart cities: those that use new sensors but traditional actuators, those that seek to make old systems smart, and those that introduce entirely new functions. These themes are then discussed in detail, and the importance of each group of interventions for the overall field of urban security and governance is assessed.

    TSU Faculty Research Database-Jan 2017

    Research interests and selected publications from 230 Texas Southern University faculty were updated in January 2017. Faculty from Public Affairs, the College of Science, Engineering and Technology, the College of Pharmacy and Health Sciences, the College of Education, the College of Business, the College of Liberal Arts and Behavioral Sciences, the Law School and the School of Communications are included.

    Cooperative Radio Resource Management for Next Generation Systems


    An investigation into dynamical bandwidth management and bandwidth redistribution using a pool of cooperating interfacing gateways and a packet sniffer in mobile cloud computing

    Mobile communication devices are increasingly becoming an essential part of almost every aspect of our daily life. However, compared to conventional communication devices such as laptops, notebooks, and personal computers, mobile devices still lack resources such as processing power, storage and network bandwidth. Mobile Cloud Computing is intended to augment the capabilities of mobile devices by moving selected workloads away from resource-limited mobile devices to resource-intensive servers hosted in the cloud. Services hosted in the cloud are accessed by mobile users on demand via the Internet using standard thick or thin applications installed on their devices. Nowadays, users of mobile devices are no longer satisfied with best-effort service and demand QoS when accessing and using applications and services hosted in the cloud. The Internet was originally designed to provide best-effort delivery of data packets, with no guarantee on packet delivery. Quality of Service has been implemented successfully in provider and private networks since the Internet Engineering Task Force introduced the Integrated Services and Differentiated Services models. These models have their legacy but do not adequately address the Quality of Service needs of Mobile Cloud Computing, where users are mobile, traffic differentiation is required per user, device or application, and packets are routed across several independently administered network domains. This study investigates QoS and bandwidth management in Mobile Cloud Computing and considers a scenario where a virtual test-bed, made up of the GNS3 network software emulator, a Cisco IOS image, the Wireshark packet sniffer, Solar-Putty, and a Firefox web browser appliance, is set up on a laptop virtualized with VMware Workstation 15 Pro. The virtual test-bed is in turn connected to the real-world Internet via the host laptop's Ethernet Network Interface Card. Several virtual Firefox appliances are set up as end-users and generate traffic by launching web applications such as video streaming, file download and Internet browsing. The traffic generated by the end-users and the bandwidth used are measured, monitored, and tracked using a Wireshark packet sniffer installed on all interfacing gateways that connect the end-users to the cloud. Each gateway aggregates the demand of connected hosts and delivers Quality of Service to connected users based on the Quality of Service policies and mechanisms embedded in the gateway. Analysis of the results shows that a packet sniffer deployed at a suitable point in the network can identify, measure and track traffic usage per user, device or application in real time. The study has also demonstrated that, when deployed in the gateway connecting users to the cloud, the sniffer provides network-wide monitoring, and the traffic statistics collected can be fed to other functional components of the gateway, where a dynamical bandwidth management scheme can instantaneously allocate and redistribute bandwidth to target users as they roam around the network from one location to another. This approach is, however, limited: ensuring end-to-end Quality of Service requires mechanisms and policies to be extended across all network layers along the traffic path between the user and the cloud in order to guarantee consistent treatment of traffic.
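
    As an illustration of the per-user accounting that a gateway-resident sniffer performs, the sketch below aggregates bytes per source IP over one-second windows and flags hosts that exceed an assumed quota. It stands in for the Wireshark-based measurement described above; the quota, addresses and packet sizes are invented for the example, and a real deployment would feed live capture records (for instance exported from tshark) into the same structure.

```python
# Minimal sketch of per-user bandwidth accounting at an interfacing gateway.
# In the study this role is played by Wireshark on each gateway; here the packet
# records are synthetic tuples, and the per-user quota is an assumed value.
from collections import defaultdict

QUOTA_BYTES_PER_SEC = 500_000   # assumed per-user allocation

class UsageTracker:
    """Aggregates bytes per source IP over fixed one-second windows."""
    def __init__(self):
        self.window = defaultdict(int)   # src_ip -> bytes seen in current window

    def observe(self, src_ip: str, length: int) -> None:
        self.window[src_ip] += length

    def end_window(self) -> dict:
        """Return hosts exceeding their quota, then reset for the next second."""
        over = {ip: b for ip, b in self.window.items() if b > QUOTA_BYTES_PER_SEC}
        self.window.clear()
        return over

# Example: synthetic packets (src_ip, length in bytes) captured during one second
packets = [("10.0.0.2", 1500)] * 400 + [("10.0.0.3", 1500)] * 100
tracker = UsageTracker()
for ip, length in packets:
    tracker.observe(ip, length)
print(tracker.end_window())   # {'10.0.0.2': 600000} -> candidate for rate-limiting
```

    The hosts flagged at the end of each window are the ones a dynamical bandwidth management component in the gateway could throttle or re-allocate, which is the feedback loop the study describes.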

    Training of Crisis Mappers and Map Production from Multi-sensor Data: Vernazza Case Study (Cinque Terre National Park, Italy)

    The aim of this paper is to present the development of a multidisciplinary project carried out through the cooperation between Politecnico di Torino and ITHACA (Information Technology for Humanitarian Assistance, Cooperation and Action). The goal of the project was training in geospatial data acquisition and processing for students attending Architecture and Engineering courses, in order to start up a team of "volunteer mappers". Indeed, the project aims to document the environmental and built heritage subject to disaster; the purpose is to improve the capabilities of the actors involved in the activities connected with geospatial data collection, integration and sharing. The proposed area for testing the training activities is the Cinque Terre National Park, registered in the World Heritage List since 1997. The area was affected by a flood on 25 October 2011. In line with other international experiences, the group is expected to be active after emergencies in order to upgrade maps, using data acquired by typical geomatic methods and techniques such as terrestrial and aerial Lidar, close-range and aerial photogrammetry, topographic and GNSS instruments, etc., or by non-conventional systems and instruments such as UAVs, mobile mapping, etc. The ultimate goal is to implement a WebGIS platform to share all the data collected with local authorities and the Civil Protection.

    A Survey on the Security and the Evolution of Osmotic and Catalytic Computing for 5G Networks

    The 5G networks have the capability to provide high compatibility for new applications, industries, and business models. These networks can tremendously improve the quality of life by enabling various use cases that require high data rate, low latency, and continuous connectivity, for applications pertaining to eHealth, automatic vehicles, smart cities, the smart grid, and the Internet of Things (IoT). However, these applications need secure servicing as well as resource policing for effective network formations. There have been many studies that emphasized the security aspects of 5G networks while focusing only on the adaptability features of these networks. However, there is a gap in the literature on following recent computing paradigms as alternative mechanisms for the enhancement of security. To cover this, a detailed description of security for 5G networks is presented in this article, along with a discussion of the evolution of osmotic and catalytic computing-based security modules. A taxonomy on the basis of security requirements is presented, which also includes a comparison of the existing state-of-the-art solutions. This article also provides a security model, "CATMOSIS", which idealizes the incorporation of security features on the basis of catalytic and osmotic computing in 5G networks. Finally, various security challenges and open issues are discussed to emphasize the work to follow in this direction of research. Comment: 34 pages, 7 tables, 7 figures, published in 5G Enabled Secure Wireless Networks, pp. 69-102, Springer, Cham, 201