164 research outputs found

    Congestion Control for Massive Machine-Type Communications: Distributed and Learning-Based Approaches

    The Internet of Things (IoT) is going to shape the future of wireless communications by allowing seamless connections among a wide range of everyday objects. Machine-to-machine (M2M) communication is known to be the enabling technology for the development of the IoT. With M2M, devices are allowed to interact and exchange data with little or no human intervention. Recently, M2M communication, also referred to as machine-type communication (MTC), has received increased attention due to its potential to support diverse applications including eHealth, industrial automation, intelligent transportation systems, and smart grids. M2M communication has specific features and requirements that differ from those of traditional human-to-human (H2H) communication. As specified by the Third Generation Partnership Project (3GPP), MTC devices are inexpensive, low-power, and mostly low-mobility devices. Furthermore, MTC devices are usually characterized by infrequent transmissions, small amounts of data, and mainly uplink traffic. Most importantly, the number of MTC devices is expected to far surpass that of H2H devices; smart cities are an example of such mass-scale deployment. These features impose various challenges related to efficient energy management, enhanced coverage, and diverse quality of service (QoS) provisioning, among others. The diverse applications of M2M are going to lead to exponential growth in M2M traffic. With massive M2M deployment, an enormous number of devices are expected to access the wireless network concurrently, so network congestion is likely to occur. Cellular networks have been recognized as excellent candidates for M2M support. Indeed, cellular networks are mature, well-established networks with ubiquitous coverage and reliability, which allows cost-effective deployment of M2M communications. However, cellular networks were originally designed for human-centric services with high-cost devices and ever-increasing rate requirements. Additionally, the conventional random access (RA) mechanism used in Long Term Evolution-Advanced (LTE-A) networks lacks the capability to handle the enormous number of access attempts expected from massive MTC. In particular, this RA technique acts as a performance bottleneck due to frequent collisions that lead to excessive delay and resource wastage. Also, the lengthy handshaking process of the conventional RA technique results in highly expensive signaling, especially for M2M devices with small payloads. Therefore, designing efficient medium access schemes is critical for the survival of M2M networks. In this thesis, we study the uplink access of M2M devices with a focus on overload control and congestion handling. In this regard, we provide two different access techniques, keeping in mind the distinct features and requirements of MTC, including massive connectivity, latency reduction, and energy management. Full information gathering is impractical for such massive networks with a tremendous number of devices; hence, we preserve low complexity and limited information exchange among network entities by introducing distributed techniques. Furthermore, machine learning is employed to enhance performance with no or limited information exchange at the decision maker. The proposed techniques are assessed via extensive simulations as well as rigorous analytical frameworks.
First, we propose an efficient distributed overload control algorithm for M2M with massive access, referred to as M2M-OSA. The proposed algorithm can efficiently allocate the available network resources to a massive number of devices within a relatively small, bounded contention time and with reduced overhead. By resolving collisions, the proposed algorithm achieves full resource utilization along with reduced average access delay and energy savings. For Beta-distributed traffic, we provide an analytical evaluation of the proposed algorithm's performance in terms of access delay, total service time, energy consumption, and blocking probability. This performance assessment accounts for various scenarios, including slightly and seriously congested cases, in addition to finite and infinite retransmission limits for the devices. Moreover, we discuss the non-ideal situations that could be encountered in real-life deployment of the proposed algorithm, supported by possible solutions. For further energy saving, we introduce a modified version of M2M-OSA with a traffic regulation mechanism. In the second part of the thesis, we adopt a promising alternative to the conventional random access mechanism, namely the fast uplink grant. The fast uplink grant was first proposed by the 3GPP for latency reduction; it allows the base station (BS) to directly schedule MTC devices (MTDs) without receiving any scheduling requests. In our work, to handle the major challenges associated with the fast uplink grant, namely active set prediction and optimal scheduling, both non-orthogonal multiple access (NOMA) and learning techniques are utilized. Particularly, we propose a two-stage NOMA-based fast uplink grant scheme that first employs multi-armed bandit (MAB) learning to schedule the fast-grant devices with no prior information about their QoS requirements or channel conditions at the BS. Afterwards, NOMA facilitates grant sharing, where pairing is done in a distributed manner to reduce signaling overhead. In the proposed scheme, NOMA plays a major role in decoupling the two major challenges of fast-grant schemes by permitting pairing with only active MTDs. Consequently, the wastage of resources due to traffic prediction errors can be significantly reduced. We devise an abstraction model for the source traffic predictor needed for the fast grant such that the prediction error can be evaluated. Accordingly, the performance of the proposed scheme is analyzed in terms of average resource wastage and outage probability. The simulation results show the effectiveness of the proposed method in saving scarce resources while verifying the accuracy of the analysis. In addition, the ability of the proposed scheme to pick quality MTDs under strict latency requirements is demonstrated.
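As a rough illustration of the flavor of MAB-based scheduling (not the thesis's actual algorithm), the following Python sketch uses the classic UCB1 index to pick which MTD to grant next; the binary reward model and the device activity probabilities are invented for the example.

```python
# Minimal UCB1 multi-armed bandit sketch for fast uplink grant scheduling.
# Hypothetical reward model: 1 if the scheduled MTD was active, else 0.
import math
import random

class UCB1Scheduler:
    def __init__(self, n_devices):
        self.counts = [0] * n_devices    # times each MTD has been scheduled
        self.values = [0.0] * n_devices  # empirical mean reward per MTD
        self.t = 0                       # total scheduling rounds

    def select(self):
        self.t += 1
        # Schedule each device once before applying the UCB index.
        for d, c in enumerate(self.counts):
            if c == 0:
                return d
        ucb = [v + math.sqrt(2 * math.log(self.t) / c)
               for v, c in zip(self.values, self.counts)]
        return max(range(len(ucb)), key=ucb.__getitem__)

    def update(self, device, reward):
        self.counts[device] += 1
        # Incremental mean update.
        self.values[device] += (reward - self.values[device]) / self.counts[device]

# Toy usage: three MTDs with unknown activity probabilities.
random.seed(0)
activity = [0.1, 0.6, 0.3]
sched = UCB1Scheduler(len(activity))
for _ in range(1000):
    d = sched.select()
    reward = 1.0 if random.random() < activity[d] else 0.0
    sched.update(d, reward)
print(sched.values)  # should roughly recover the activity probabilities
```

The bandit view matches the setting described above: the BS learns which devices yield useful grants purely from feedback, with no prior QoS or channel information.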

    Let Opportunistic Crowdsensors Work Together for Resource-efficient, Quality-aware Observations

    Opportunistic crowdsensing empowers citizens carrying hand-held devices to sense physical phenomena of common interest at a large and fine-grained scale without requiring the citizens' active involvement. However, the resulting uncontrolled collection and upload of a massive amount of contributed raw data incur significant resource consumption, from the end device to the server, and challenge the quality of the collected observations. This paper tackles both challenges raised by opportunistic crowdsensing, that is, enabling the resource-efficient gathering of relevant observations. To this end, we introduce the BeTogether middleware, which fosters context-aware, collaborative crowdsensing at the edge so that co-located crowdsensors operating in the same context group together to share the workload in a cost- and quality-effective way. We evaluate the proposed solution using an implementation-driven evaluation that leverages a dataset embedding nearly 1 million entries contributed by 550 crowdsensors over a year. Results show that BeTogether increases the quality of the collected data while reducing the overall resource cost compared to the cloud-centric approach.
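As a loose illustration of the grouping idea (not the BeTogether API), the sketch below clusters crowdsensors by a hypothetical context key and rotates the sensing duty within each group, so one member works per round instead of every co-located device uploading redundantly.

```python
# Illustrative sketch: group co-located crowdsensors by a shared context key
# and rotate the sensing duty within each group. Field names are invented.
from collections import defaultdict

def group_by_context(sensors):
    """sensors: list of dicts with hypothetical 'id', 'cell', 'activity' fields."""
    groups = defaultdict(list)
    for s in sensors:
        groups[(s["cell"], s["activity"])].append(s["id"])
    return groups

def pick_workers(groups, round_no):
    # One worker per group per round, rotated round-robin for fairness,
    # so the group's sensing cost is shared instead of duplicated.
    return {key: members[round_no % len(members)]
            for key, members in groups.items()}

sensors = [
    {"id": "a", "cell": "tile-12", "activity": "walking"},
    {"id": "b", "cell": "tile-12", "activity": "walking"},
    {"id": "c", "cell": "tile-40", "activity": "still"},
]
groups = group_by_context(sensors)
for r in range(3):
    print(r, pick_workers(groups, r))
```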

    Modelling and optimisation of resource usage in an IoT enabled smart campus

    University campuses are essentially a microcosm of a city. They comprise diverse facilities such as residences, sport centres, lecture theatres, parking spaces, and public transport stops. Universities are under constant pressure to improve efficiencies while offering a better experience to various stakeholders including students, staff, and visitors. Nonetheless, anecdotal evidence indicates that campus assets are not being utilised efficiently, often due to the lack of data collection and analysis, thereby limiting the ability to make informed decisions on the allocation and management of resources. Advances in Internet of Things (IoT) technologies that can sense and communicate data from the physical world, coupled with data analytics and artificial intelligence (AI) that can predict usage patterns, have opened up new opportunities for organisations to lower cost and improve user experience. This thesis explores this opportunity via theory and experimentation using UNSW Sydney as a living laboratory. The building blocks of this thesis consist of three pillars of execution, namely IoT deployment, predictive modelling, and optimisation. Together, these components create an end-to-end framework that provides informed decisions to estate managers regarding the optimal allocation of campus resources. The main contributions of this thesis are three application domains, which lie on top of the execution pillars, defining campus resources as classrooms, car parks, and transit buses. Specifically, our contributions are: i) We evaluate several IoT occupancy sensing technologies and instrument 9 lecture halls of varying capacities with the most appropriate sensing solution. The collected data provides us with insights into attendance patterns, such as cancelled lectures and class tests, of over 250 courses. We then develop predictive models using machine learning algorithms and quantile regression techniques to predict future attendance patterns. Finally, we propose an intelligent optimisation model that allocates classes to rooms based on the dynamics of predicted attendance as opposed to static enrolment numbers. We show that the data-driven assignment of classroom resources can achieve a potential saving in room cost of over 10% over the course of a semester, while incurring a very low risk of disrupting student experience due to classroom overflow; ii) We instrument a car park with IoT sensors for real-time monitoring of parking demand and comprehensively analyse the usage data spanning over 15 months. We then develop machine learning models to forecast future parking demand at multiple forecast horizons ranging from 1 day to 10 weeks; our models achieve a mean absolute error (MAE) of 4.58 cars per hour. Finally, we propose a novel optimal allocation framework that allows campus managers to re-dimension the car park to accommodate new paradigms of car use while minimising the risk of rejecting users and maintaining a certain level of revenue from the parking infrastructure; iii) We develop sensing technology for measuring an outdoor orderly queue using an ultrasonic sensor and LoRaWAN, and deploy the solution at an on-campus bus stop. Our solution yields a reasonable accuracy, with an MAE of 10.7 people for queue lengths of up to 100 people. We then develop an optimisation model to reschedule bus dispatch times based on the actual dynamics of passenger demand. The results suggest that a potential wait time reduction of 42.93% can be achieved with demand-driven bus scheduling. Taken together, our contributions demonstrate that there are significant resource efficiency gains to be realised in a smart campus that employs IoT sensing coupled with predictive modelling and dynamic optimisation algorithms.
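The attendance models above rely on quantile regression, which is what lets the room assignment bound overflow risk: a room sized to the predicted 90th percentile overflows roughly 10% of the time. A minimal sketch of that idea follows, using scikit-learn's quantile-loss gradient boosting on synthetic data; the features, data, and quantile level are placeholders, not the thesis's model.

```python
# Hedged sketch of quantile-regression attendance forecasting.
import numpy as np
from sklearn.ensemble import GradientBoostingRegressor

rng = np.random.default_rng(7)
# Hypothetical features: [week_of_semester, enrolment]; target: attendance.
X = np.column_stack([rng.integers(1, 13, 500), rng.integers(50, 300, 500)])
y = X[:, 1] * (0.9 - 0.03 * X[:, 0]) + rng.normal(0, 10, 500)

# 90th-percentile model: attendance exceeds this prediction ~10% of the time.
q90 = GradientBoostingRegressor(loss="quantile", alpha=0.9).fit(X, y)
print(q90.predict([[3, 200]]))  # predicted 90th-percentile attendance
```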

    Hybrid mobile computing for connected autonomous vehicles

    With increasing urbanisation and the growing number of cars on the road, modern transport systems face many global issues. Autonomous driving and connected vehicles are the most promising technologies to tackle these issues. The integrated technology, connected autonomous vehicles (CAV), can provide a wide range of safety applications for safer, greener, and more efficient intelligent transport systems (ITS). As computing is an essential component of CAV systems, various mobile computing models have been proposed, including mobile local computing, mobile edge computing, and mobile cloud computing. However, it is believed that none of these models fits all CAV applications, which have highly diverse quality of service (QoS) requirements such as communication delay, data rate, accuracy, reliability, and/or computing latency. In this thesis, we are motivated to propose a hybrid mobile computing model with the objective of overcoming the limitations of the individual models and maximising performance for CAV applications. In the proposed hybrid mobile computing model, the three basic computing models and/or their combinations are chosen and applied to different CAV applications according to the QoS requirements of those applications. Following this idea, we first investigate job offloading and the allocation of computing and communication resources at local hosts and external computing centres with QoS and resource awareness. Distributed admission control and resource allocation algorithms are proposed, including two baseline non-cooperative algorithms and a matching-theory-based cooperative algorithm. Experiment results demonstrate the feasibility of the hybrid mobile computing model and show a large improvement in service quality and capacity over existing individual computing models. The matching algorithm also largely outperforms the baseline non-cooperative algorithms. In addition, two specific use cases of hybrid mobile computing for CAV applications are investigated: object detection with mobile local computing, where only local computing resources are used, and movie recommendation with mobile cloud computing, where remote cloud resources are used. For object detection, we focus on the challenges of detecting vehicles, pedestrians, and cyclists in a driving environment and propose three improvements to an existing CNN-based object detector. A large detection performance improvement is obtained on the KITTI benchmark test dataset. For movie recommendation, we propose two recommendation models based on a general framework that integrates machine learning and collaborative filtering approaches. The experiment results on the Netflix movie dataset show that our models are very effective for cold-start item recommendation.
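To give a feel for matching-theory-based offloading, here is a minimal deferred-acceptance (Gale-Shapley style) sketch of many-to-one matching between jobs and computing sites; the preference lists and capacities are invented for illustration and do not reproduce the thesis's actual utilities.

```python
# Illustrative deferred-acceptance matching between offloaded jobs and
# computing sites (local, edge, cloud); preferences and capacities are made up.
def match(job_prefs, site_prefs, capacity):
    rank = {s: {j: r for r, j in enumerate(p)} for s, p in site_prefs.items()}
    assigned = {s: [] for s in site_prefs}   # site -> currently accepted jobs
    nxt = {j: 0 for j in job_prefs}          # next site index each job tries
    free = list(job_prefs)
    while free:
        j = free.pop()
        if nxt[j] >= len(job_prefs[j]):
            continue                         # job exhausted its list, unmatched
        s = job_prefs[j][nxt[j]]
        nxt[j] += 1
        assigned[s].append(j)
        if len(assigned[s]) > capacity[s]:
            # Site evicts its least-preferred job, which proposes again later.
            worst = max(assigned[s], key=lambda x: rank[s][x])
            assigned[s].remove(worst)
            free.append(worst)
    return assigned

job_prefs = {"j1": ["edge", "cloud"], "j2": ["edge", "cloud"], "j3": ["edge", "local"]}
site_prefs = {"edge": ["j3", "j1", "j2"], "cloud": ["j1", "j2", "j3"], "local": ["j3"]}
print(match(job_prefs, site_prefs, {"edge": 1, "cloud": 2, "local": 1}))
```

The appeal of such matching schemes in this setting is that they are distributed and stable: no job-site pair would both prefer to deviate from the final assignment.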

    How technology can advance port operations and address supply chain disruptions

    Supply chain disruptions continue to be a significant challenge as the world economy recovers from the pandemic-related shutdowns that have strained global supply chains. Shocks challenge the adaptability and resilience of maritime ports. The reaction of automated container terminals to supply chain disruptions has attracted renewed interest, given the dramatic scenes of ships anchored for weeks. In this dissertation, I provide a vision of how technology can enhance a port's ability to anticipate and handle shocks by improving coordination, cooperation, and information exchange across port stakeholders. The vision will be helpful for academics and practitioners performing research that advances theory and practice on the use of advanced technologies to improve port operations. I use complex adaptive systems theory to develop a qualitative cross-case study of the ports of Los Angeles, Vancouver, and Rotterdam. I examine the effect that automation and other technologies have had on the efficiency of these ports, both in daily operations and during the disruption caused by the COVID-19 pandemic. Using critical tenets of complexity theory and a rigorous application of the case study method, I develop theoretical propositions and practical insights to ground the vision of the port of the future in current practices. The findings from the cross-case study suggest that automated terminals were more efficient during the pandemic than non-automated terminals. I propose that transitioning to higher levels of automation, supported by emerging technologies like blockchain and the Internet of Things, will make ports more resilient to supply chain disruptions when those systems are coordinated through Port Community Systems.

    A Unified And Green Platform For Smartphone Sensing

    Smartphones have become key communication and entertainment devices in people's daily life. Sensors on (or attached to) smartphones can enable attractive sensing applications in different domains, including environmental monitoring, social networking, healthcare, transportation, etc. Most existing smartphone sensing systems are application-specific. How to leverage smartphones' sensing capability to make them unified information providers for various applications has not yet been fully explored. This dissertation presents a unified and green platform for smartphone sensing, which has the following desirable features: 1) it can support various smartphone sensing applications; 2) it is personalizable; 3) it is energy-efficient; and 4) it can be easily extended to support new sensors. Two novel sensing applications are built and integrated into this unified platform: SOR and LIPS. SOR is a smartphone Sensing based Objective Ranking (SOR) system. Unlike subjective online review and recommendation systems (such as Yelp and TripAdvisor), SOR ranks a target place based on data collected via smartphone sensing. LIPS is a system that learns the LIfestyles of mobile users via smartPhone Sensing (LIPS). Combining both unsupervised and supervised learning, a hybrid scheme is proposed to characterize the lifestyles and predict the future activities of mobile users. This dissertation also studies how to use the cloud as a coordinator to assist smartphones in sensing collaboratively, with the objective of reducing sensing energy consumption. A novel probabilistic model is built to address the GPS-less energy-efficient crowd sensing problem. Provably good approximation algorithms are presented to enable smartphones to sense collaboratively without accurate locations, such that sensing coverage requirements can be met with limited energy consumption.
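To convey the flavor of such coverage-versus-energy approximation algorithms, here is a hedged greedy sketch that selects phones to maximize expected coverage gain per unit of energy under a budget; the coverage probabilities, energy costs, and stopping rule are assumptions for illustration, not the dissertation's algorithm.

```python
# Illustrative greedy sketch of energy-aware collaborative sensing selection.
def greedy_select(cover_prob, energy, budget):
    """cover_prob[phone][region] = probability the phone senses the region."""
    regions = {r for probs in cover_prob.values() for r in probs}
    miss = {r: 1.0 for r in regions}          # P(region still uncovered)
    chosen, spent = [], 0.0
    remaining = set(cover_prob)
    while remaining:
        def gain(ph):  # expected newly covered mass per unit energy
            g = sum(miss[r] * p for r, p in cover_prob[ph].items())
            return g / energy[ph]
        best = max(remaining, key=gain)
        if spent + energy[best] > budget:
            break
        chosen.append(best)
        spent += energy[best]
        remaining.remove(best)
        for r, p in cover_prob[best].items():
            miss[r] *= (1.0 - p)              # update uncovered probability
    return chosen, {r: 1.0 - m for r, m in miss.items()}

cover = {"p1": {"A": 0.8, "B": 0.2}, "p2": {"B": 0.9}, "p3": {"A": 0.5, "B": 0.5}}
print(greedy_select(cover, {"p1": 2.0, "p2": 1.0, "p3": 1.5}, budget=3.0))
```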

    Unmanned Aerial Vehicle (UAV)-Enabled Wireless Communications and Networking

    The emerging massive density of human-held and machine-type nodes implies larger traffic deviations in the future than we face today. The future network will be characterized by a high degree of flexibility, allowing it to adapt smoothly, autonomously, and efficiently to quickly changing traffic demands in both time and space. This flexibility cannot be achieved while the network's infrastructure remains static. To this end, the topic of UAV (unmanned aerial vehicle)-enabled wireless communications and networking has received increased attention. As mentioned above, the network must serve a massive density of nodes that can be either human-held (user devices) or machine-type nodes (sensors). If we wish to properly serve these nodes and optimize their data delivery, a proper wireless connection is fundamental. This can be achieved using UAV-enabled communications and networks. This Special Issue addresses the many open issues that remain before UAV-enabled wireless communications and networking can be properly rolled out.

    Direct communication radio interface for new radio multicasting and cooperative positioning

    Recently, the popularity of millimeter wave (mmWave) wireless networks has increased due to their capability to cope with the escalation of mobile data demands caused by the unprecedented proliferation of smart devices in the fifth generation (5G). The extremely high frequency, or mmWave, band is a fundamental pillar in the provision of the expected gigabit data rates. Hence, according to both the academic and industrial communities, mmWave technology, e.g., 5G New Radio (NR) and WiGig (60 GHz), is considered one of the main components of 5G and beyond networks. Particularly, the 3rd Generation Partnership Project (3GPP) provides for the use of licensed mmWave sub-bands for 5G mmWave cellular networks, whereas the IEEE actively explores the unlicensed band at 60 GHz for next-generation wireless local area networks. In this regard, mmWave has been envisaged as a new technology layout for real-time, heavy-traffic, and wearable applications. This work is devoted to solving the problems of mmWave band communication systems while enhancing their advantages through utilizing the direct communication radio interface for NR multicasting, cooperative positioning, and mission-critical applications. The main contributions presented in this work include: (i) a set of mathematical frameworks and simulation tools to characterize multicast traffic delivery in mmWave directional systems; (ii) exploitation of the sidelink relaying concept to deal with the channel condition deterioration of dynamic multicast systems and to ensure mission-critical and ultra-reliable low-latency communications; (iii) analysis of cooperative positioning techniques for enhancing cellular positioning accuracy for 5G+ emerging applications that require not only improved communication characteristics but also precise localization. Our study indicates the need for additional mechanisms/research: (i) to further improve multicasting performance in 5G/6G systems; (ii) to investigate sidelink aspects, including, but not limited to, the standardization perspective and relay selection strategies; and (iii) to design cooperative positioning systems based on Device-to-Device (D2D) technology.
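As a toy illustration of D2D cooperative positioning (not the thesis's scheme), the sketch below refines a device's 2D position from noisy range estimates to neighboring devices at known positions using Gauss-Newton least squares; the geometry and noise model are illustrative assumptions.

```python
# Toy sketch: refine a device's 2D position from ranges to known neighbors.
import numpy as np

def refine_position(x0, anchors, ranges, iters=10):
    x = np.asarray(x0, dtype=float)
    for _ in range(iters):
        diffs = x - anchors                    # (n, 2) vectors to anchors
        dists = np.linalg.norm(diffs, axis=1)  # predicted ranges
        J = diffs / dists[:, None]             # Jacobian of the range model
        residual = ranges - dists
        # Solve J * dx = residual in the least-squares sense.
        dx, *_ = np.linalg.lstsq(J, residual, rcond=None)
        x += dx
    return x

anchors = np.array([[0.0, 0.0], [10.0, 0.0], [0.0, 10.0]])
true_pos = np.array([3.0, 4.0])
rng = np.random.default_rng(1)
ranges = np.linalg.norm(true_pos - anchors, axis=1) + rng.normal(0, 0.1, 3)
print(refine_position([5.0, 5.0], anchors, ranges))  # close to (3, 4)
```

In the cooperative setting, the "anchors" would be nearby devices whose positions were themselves estimated over D2D links, which is what makes the localization cooperative rather than purely infrastructure-based.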