10 research outputs found

    Goal-Oriented Scheduling in Sensor Networks With Application Timing Awareness

    Taking inspiration from linguistics, the communications theory community has recently shown significant interest in pragmatic, or goal-oriented, communication. In this paper, we tackle the problem of pragmatic communication with multiple clients with different, and potentially conflicting, objectives. We capture the goal-oriented aspect through the metric of Value of Information (VoI), which considers the estimation of the remote process as well as the timing constraints. However, the most common definition of VoI is simply the Mean Square Error (MSE) of the whole system state, regardless of its relevance for a specific client. Our work overcomes this limitation by including different summary statistics, i.e., value functions of the state, for separate clients, and a diversified query process on the client side, expressed through the fact that different applications may request different functions of the process state at different times. A query-aware Deep Reinforcement Learning (DRL) solution based on statically defined VoI can outperform naive approaches by 15-20%.
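    As an illustration of the per-client notion of VoI described above, the sketch below compares whole-state MSE with summary-statistic error sampled only at each client's query instants. The state process, update period, query periods, and summary functions are invented for the example, not taken from the paper:

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy 3-dimensional Gauss-Markov state process (hypothetical example).
T, dim = 1000, 3
state = np.zeros((T, dim))
for t in range(1, T):
    state[t] = 0.95 * state[t - 1] + rng.normal(0, 1, dim)

# Receiver-side estimate: last received sample (updates arrive every 10 slots).
est = np.zeros_like(state)
last = state[0]
for t in range(T):
    if t % 10 == 0:
        last = state[t]
    est[t] = last

# Classic VoI: MSE of the whole state, regardless of client or timing.
mse_voi = np.mean((state - est) ** 2)

# Per-client VoI: error of each client's summary statistic, sampled only
# at that client's query instants (both clients are made up here).
clients = {
    "max_monitor": (lambda x: x.max(axis=1), np.arange(0, T, 7)),
    "mean_monitor": (lambda x: x.mean(axis=1), np.arange(0, T, 13)),
}
for name, (fn, queries) in clients.items():
    voi = np.mean((fn(state)[queries] - fn(est)[queries]) ** 2)
    print(name, round(float(voi), 3))
print("whole-state MSE:", round(float(mse_voi), 3))
```

    The two per-client values generally differ from each other and from the whole-state MSE, which is the limitation of the plain MSE-based VoI that the paper targets.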

    A Decentralized Policy for Minimization of Age of Incorrect Information in Slotted ALOHA Systems

    The Age of Incorrect Information (AoII) is a metric that can combine the freshness of the information available to a gateway in an Internet of Things (IoT) network with the accuracy of that information. As such, minimizing the AoII can allow the operators of IoT systems to have a more precise and up-to-date picture of the environment in which the sensors are deployed. However, most IoT systems do not allow for centralized scheduling or explicit coordination, as sensors need to be extremely simple and consume as little power as possible. Finding a decentralized policy to minimize the AoII can be extremely challenging in this setting. This paper presents a heuristic to optimize AoII for a slotted ALOHA system, starting from a threshold-based policy and using dual methods to converge to a better solution. This method can significantly outperform state-independent policies, finding an efficient balance between frequent updates and a low number of packet collisions.
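    The AoII dynamics and the threshold-based starting policy mentioned above can be sketched in a toy slotted ALOHA simulation. All parameters (number of sources, flip probability, threshold) are assumptions for the example, and the sources are assumed to know the gateway's estimate, as is standard in the AoII literature:

```python
import numpy as np

rng = np.random.default_rng(1)

# Toy slotted ALOHA with N two-state (0/1) Markov sources (hypothetical setup).
N, T, p_flip, thr = 10, 5000, 0.01, 3
state = np.zeros(N, dtype=int)   # true source states
est = np.zeros(N, dtype=int)     # gateway's estimates
aoii = np.zeros(N, dtype=int)    # per-source Age of Incorrect Information
total_aoii = 0

for t in range(T):
    # Sources evolve.
    flips = rng.random(N) < p_flip
    state[flips] ^= 1
    # Threshold-based decentralized policy: transmit once AoII reaches thr.
    tx = np.flatnonzero(aoii >= thr)
    if len(tx) == 1:             # ALOHA: success only without collision
        i = tx[0]
        est[i] = state[i]
    # AoII grows while the estimate is wrong, resets to 0 when correct.
    wrong = state != est
    aoii = np.where(wrong, aoii + 1, 0)
    total_aoii += aoii.sum()

print("average AoII per source:", total_aoii / (T * N))
```

    The paper's heuristic starts from such a threshold policy and refines it with dual methods; the sketch only shows why the threshold matters, balancing update frequency against collisions.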

    RAN Slicing Performance Tradeoffs: Timing Versus Throughput Requirements

    The coexistence of diverse services with heterogeneous requirements is a fundamental feature of 5G. This necessitates efficient Radio Access Network (RAN) slicing, defined as sharing of the wireless resources among diverse services while guaranteeing their throughput, timing, and/or reliability requirements. In this paper, we investigate RAN slicing for an uplink scenario in the form of multiple access schemes for two user types: (1) broadband users with throughput requirements and (2) intermittently active users with timing requirements, expressed as either Latency-Reliability (LR) or Peak Age of Information (PAoI). Broadband users transmit data continuously and are hence allocated non-overlapping parts of the spectrum. We evaluate the trade-offs between the achievable throughput of a broadband user and the timing requirements of an intermittent user under Orthogonal Multiple Access (OMA) and Non-Orthogonal Multiple Access (NOMA), considering the capture effect. Our analysis shows that NOMA, in combination with packet-level coding, is a superior strategy in most cases for both LR and PAoI, achieving a similar LR with only a 2% decrease in throughput with respect to the case where an independent channel is allocated to each user. The latter solution leads to the upper bound in performance but requires twice the resources of the considered OMA and NOMA schemes. However, there are extreme cases where OMA achieves a slightly greater throughput than NOMA at the expense of an increased PAoI.

    Spectrum slicing for multiple access channels with heterogeneous services

    Wireless mobile networks from the fifth generation (5G) and beyond serve as platforms for flexible support of heterogeneous traffic types with diverse performance requirements. In particular, broadband services aim for the traditional rate optimization, while time-sensitive services aim for the optimization of latency, reliability, and some novel metrics such as Age of Information (AoI). In such settings, the key question is the one of spectrum slicing: how these services share the same chunk of available spectrum while meeting the heterogeneous requirements. In this work, we investigate the two canonical frameworks for spectrum sharing, Orthogonal Multiple Access (OMA) and Non-Orthogonal Multiple Access (NOMA), in a simple but insightful setup with a single time-slotted shared frequency channel, involving one broadband user, aiming to maximize throughput and using packet-level coding to protect its transmissions from noise and interference, and several intermittent users, aiming either to improve their latency-reliability performance or to minimize their AoI. We analytically assess the performance of Time Division Multiple Access (TDMA) and ALOHA-based schemes in both OMA and NOMA frameworks by deriving their Pareto regions and the corresponding optimal values of their parameters. Our results show that NOMA can outperform traditional OMA in latency-reliability oriented systems in most conditions, but OMA performs slightly better in age-oriented systems.
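    The core throughput-versus-timeliness trade-off studied in the two papers above can be made concrete with a minimal OMA (TDMA-like) sketch. The model here is an assumption for illustration, much simpler than the papers' analysis: a fraction alpha of slots goes to the broadband user, and in the remaining slots an always-backlogged intermittent user delivers a fresh sample, resetting its AoI to 1:

```python
import numpy as np

rng = np.random.default_rng(2)

# Toy TDMA-style slicing of a single slotted channel (hypothetical parameters).
T = 100_000
for alpha in (0.5, 0.8, 0.95):
    aoi, aoi_sum, bb_slots = 0, 0, 0
    for _ in range(T):
        if rng.random() < alpha:
            bb_slots += 1        # broadband user transmits; the other's age grows
            aoi += 1
        else:
            aoi = 1              # intermittent user's update is received
        aoi_sum += aoi
    print(f"alpha={alpha}: broadband throughput={bb_slots / T:.2f}, "
          f"avg AoI={aoi_sum / T:.2f}")
```

    As alpha grows, broadband throughput approaches 1 while the intermittent user's average AoI grows roughly as 1/(1-alpha), which is the Pareto-style trade-off the papers characterize analytically (and which NOMA can improve on by letting the users overlap).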

    Remote Anomaly Detection in Industry 4.0 Using Resource-Constrained Devices

    A central use case for the Internet of Things (IoT) is the adoption of sensors to monitor physical processes, such as the environment and industrial manufacturing processes, where they provide data for predictive maintenance, anomaly detection, or similar applications. The sensor devices are typically resource-constrained in terms of computation and power, and need to rely on cloud or edge computing for data processing. However, the capacity of the wireless link and their power constraints limit the amount of data that can be transmitted to the cloud. While this is not problematic for the monitoring of slowly varying processes such as temperature, it is more problematic for complex signals such as those captured by vibration and acoustic sensors. In this paper, we consider the specific problem of remote anomaly detection based on signals that fall into the latter category over wireless channels with resource-constrained sensors. We study the impact of source coding on the detection accuracy with both an anomaly detector based on Principal Component Analysis (PCA) and one based on an autoencoder. We show that coded transmission is beneficial when the signal-to-noise ratio (SNR) of the channel is low, while uncoded transmission performs best in the high SNR regime.
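    The PCA-based detector mentioned above typically scores a sample by its reconstruction error against the principal subspace of normal data. The sketch below shows that idea on synthetic data; the dimensions, noise level, and data are assumptions for the example, not the paper's setup (which also studies an autoencoder and channel effects):

```python
import numpy as np

rng = np.random.default_rng(3)

# Synthetic "normal" windows living near a k-dimensional subspace plus noise.
d, k, n = 64, 8, 500
basis = rng.normal(size=(d, k))
normal = rng.normal(size=(n, k)) @ basis.T + 0.1 * rng.normal(size=(n, d))

# Fit: top-k principal directions of the normal training data.
mean = normal.mean(axis=0)
_, _, Vt = np.linalg.svd(normal - mean, full_matrices=False)
P = Vt[:k]

def score(x):
    """Anomaly score: squared distance from the learned normal subspace."""
    c = (x - mean) @ P.T
    return float(np.sum((x - mean - c @ P) ** 2))

typical = score(normal[0])
anomaly = score(3.0 * rng.normal(size=d))   # off-subspace sample
print(f"typical={typical:.3f}  anomaly={anomaly:.3f}")
```

    A sample is flagged when its score exceeds a threshold calibrated on normal data; channel noise and source coding both perturb these scores, which is the effect the paper quantifies.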

    Slicing a single wireless collision channel among throughput- and timeliness-sensitive services

    The fifth generation (5G) of wireless systems has a platform-driven approach, aiming to support heterogeneous connections with very diverse requirements. The shared wireless resources should be sliced in a way that each user perceives that its requirements have been met. Heterogeneity challenges the traditional notion of resource efficiency, as the resource usage has to cater for, e.g., rate maximization for one user and a timeliness requirement for another user. This paper treats a model for the radio access network (RAN) uplink, where a throughput-demanding broadband user shares wireless resources with an intermittently active user that wants to optimize timeliness, expressed in terms of latency-reliability or Age of Information (AoI). We evaluate the trade-offs between throughput and timeliness for Orthogonal Multiple Access (OMA) as well as Non-Orthogonal Multiple Access (NOMA) with successive interference cancellation (SIC). We observe that NOMA with SIC, in a conservative scenario with destructive collisions, performs only slightly worse than OMA, which indicates that it may offer significant benefits in practical deployments where the capture effect is frequently encountered. On the other hand, finding the optimal configuration of NOMA with SIC depends on the activity pattern of the intermittent user, to which OMA is insensitive.

    Query Timing Analysis for Content-Based Wake-Up Realizing Informative IoT Data Collection

    Information freshness and high energy efficiency are key requirements for sensor nodes serving Industrial Internet of Things (IIoT) applications, where a sink node must collect informative data before a deadline to control an external element. Pull-based communication is an interesting approach for optimizing information freshness and avoiding wasteful energy consumption. To this end, we apply Content-based Wake-up (CoWu), in which the sink can activate a subset of nodes observing informative data at the time the wake-up signal is received. In this case, the timing of the wake-up signal plays an important role: early transmission leads to high reliability in data collection, but the received data may become obsolete by the deadline, while later transmission ensures higher timeliness of the sensed data, but some nodes might not manage to communicate their data before the deadline. This letter investigates the timing for data collection using CoWu and characterizes the gain of CoWu. The obtained numerical results show that CoWu improves accuracy, while reducing energy consumption by about 75% with respect to round-robin scheduling.
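    The timing trade-off described above can be reduced to a one-line calculation in a deliberately simplified model (an assumption for illustration, not the letter's analysis): sending the wake-up signal k slots before the deadline allows k delivery attempts, each succeeding with some probability p, but the collected sample is then k slots old at the deadline:

```python
# Toy CoWu timing trade-off: reliability grows with the lead time k,
# while the sample's age at the deadline also grows with k.
p = 0.6  # assumed per-attempt delivery probability
for k in range(1, 8):
    reliability = 1 - (1 - p) ** k   # at least one of k attempts succeeds
    print(f"lead time k={k}: delivery prob={reliability:.3f}, sample age={k}")
```

    Choosing k then means balancing delivery probability against staleness, which is exactly the wake-up timing question the letter optimizes under its full system model.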

    A Perspective on Time Toward Wireless 6G

    With the advent of 5G technology, the notion of latency has gained a prominent role in wireless connectivity, serving as a proxy term for the requirements of real-time communication. As wireless systems evolve toward 6G, the ambition to immerse the digital into the physical reality will increase. Besides making the real-time requirements more stringent, this immersion will bring the notions of time, simultaneity, presence, and causality to a new level of complexity. A growing body of research points out that latency is insufficient to parameterize all real-time requirements. Notably, one such requirement that has received significant attention is information freshness, defined through the Age of Information (AoI) and its derivatives. In general, the metrics derived from a conventional black-box approach to communication network design are not representative of new distributed paradigms, such as sensing, learning, or distributed consensus. The objective of this article is to investigate the general notion of timing in wireless communication systems and networks, and its relation to effective information generation, processing, transmission, and reconstruction at the senders and receivers. We establish a general statistical framework of timing requirements in wireless communication systems, which subsumes both latency and AoI. The framework is built by associating a timing component with the two basic statistical operations: decision and estimation. We first use the framework to present a representative sample of the existing works that deal with timing in wireless communication. Next, we show how the framework can be used with different communication models of increasing complexity, starting from the basic Shannon one-way communication model and arriving at communication models for consensus, distributed learning, and inference. Overall, this article fills an important gap in the literature by providing a systematic treatment of various timing measures in wireless communication and sets the basis for the design and optimization of next-generation real-time systems.

    Query Age of Information: Freshness in Pull-Based Communication

    Age of Information (AoI) has become an important concept in communications, as it allows system designers to measure the freshness of the information available to remote monitoring or control processes. However, its definition tacitly assumes that new information is used at any time, which is not always the case: the instants at which information is collected and used may depend on a certain query process, and resource-constrained environments such as most Internet of Things (IoT) use cases require precise timing to fully exploit the limited available transmissions. In this work, we consider a pull-based communication model in which the freshness of information is only important when the receiver generates a query: if the monitoring process is not using the value, the age of the last update is irrelevant. We optimize the Age of Information at Query (QAoI), a metric that samples the AoI at relevant instants, better fitting the pull-based resource-constrained scenario, and show how this can lead to very different choices. Our results show that QAoI-aware optimization can significantly reduce the average and worst-case perceived age for both periodic and stochastic queries.
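    The difference between time-averaged AoI and QAoI is easy to see on a fixed toy trace. The delivery and query instants below are invented for the example; QAoI simply samples the same age process only at the query slots:

```python
import numpy as np

# Toy trace (illustrative numbers, not from the paper).
T = 30
deliveries = {3, 11, 12, 25}   # slots in which a fresh update is received
queries = [10, 20]             # slots in which the monitor issues a query

# Age process: resets to 0 on delivery, otherwise grows by 1 per slot.
aoi = np.zeros(T, dtype=int)
for t in range(1, T):
    aoi[t] = 0 if t in deliveries else aoi[t - 1] + 1

avg_aoi = aoi.mean()                        # classic time-average AoI
qaoi = np.mean([aoi[t] for t in queries])   # AoI sampled at query instants
print(f"average AoI = {avg_aoi:.2f}, QAoI = {qaoi:.2f}")
```

    Here the queries happen to land late in each inter-update gap, so QAoI is much larger than the time average; a QAoI-aware scheduler would instead push updates to arrive just before the queries.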

    Freshness on Demand: Optimizing Age of Information for the Query Process

    Age of Information (AoI) has become an important concept in communications, as it allows system designers to measure the freshness of the information available to remote monitoring or control processes. However, its definition tacitly assumes that new information is used at any time, which is not always the case. Instead, the instants at which information is collected and used depend on a certain query process. We propose a model that accounts for the discrete-time nature of many monitoring processes, by considering a pull-based communication model in which the freshness of information is only important when the receiver generates a query. We then define the Age of Information at Query (QAoI), a more general metric that fits the pull-based scenario, and show how its optimization can lead to very different choices from traditional push-based AoI optimization when using a Packet Erasure Channel (PEC).