
    On-demand Bandwidth and Stability Based Unicast Routing in Mobile Adhoc Networks

    Characteristics of mobile ad hoc networks (MANETs) such as the lack of central coordination, dynamic topology, and limited resources pose a challenging problem for quality of service (QoS) routing. Providing an efficient, robust, and low-overhead QoS unicast route from source to destination is a critical issue. Bandwidth and route stability are the most important QoS parameters for applications, such as multimedia, that require long-duration connections with stringent bandwidth demands. This paper proposes an On-demand Bandwidth and Stability based Unicast Routing scheme (OBSUR) for MANETs by adding QoS features to the existing Dynamic Source Routing (DSR) protocol. The objective of OBSUR is to provide a QoS-satisfying, reliable, and robust route for communicating nodes. The scheme works in the following steps: (1) each node periodically (at small regular intervals) estimates bandwidth availability, node and link stability, buffer availability, and the stability factor between nodes; (2) each node constructs a neighbor stability and QoS database that is used in the route establishment process; (3) the unicast path is constructed using route request and route reply packets with the help of a route information cache; and (4) routes are maintained in case of node mobility and route failures. Simulation results show improvements in traffic admission ratio, control overhead, packet delivery ratio, end-to-end delay, and throughput compared with Route Stability Based QoS Routing (RSQR) in MANETs.
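    The abstract describes route admission based on per-link bandwidth and stability estimates. The following is a minimal Python sketch of such a selection step under assumed semantics: the Link fields, the admission thresholds, and the scoring weights are hypothetical illustrations, not OBSUR's actual formulas or packet formats.

```python
# Illustrative sketch only: selecting a QoS-satisfying route from candidates
# returned by route replies, using bottleneck bandwidth and weakest-link
# stability. Field names, thresholds, and weights are hypothetical.
from dataclasses import dataclass

@dataclass
class Link:
    available_bw: float   # estimated residual bandwidth on the link (kbps)
    stability: float      # estimated link stability factor in [0, 1]

def route_is_admissible(route, bw_required, min_stability):
    """Admit a route only if every link meets the bandwidth demand and the
    stability threshold (a hypothetical admission rule)."""
    return all(link.available_bw >= bw_required and link.stability >= min_stability
               for link in route)

def route_score(route, bw_required, w_bw=0.5, w_stab=0.5):
    """Score a route by normalized bottleneck bandwidth plus weakest-link
    stability; higher is better. The weights are illustrative."""
    bottleneck = min(link.available_bw for link in route) / bw_required
    weakest = min(link.stability for link in route)
    return w_bw * bottleneck + w_stab * weakest

def select_route(candidates, bw_required, min_stability=0.6):
    """Return the best admissible candidate route, or None if none qualifies."""
    admissible = [r for r in candidates
                  if route_is_admissible(r, bw_required, min_stability)]
    return max(admissible, key=lambda r: route_score(r, bw_required), default=None)

# Example: two candidate routes discovered during route request/reply
route_a = [Link(800, 0.9), Link(600, 0.7)]
route_b = [Link(1200, 0.5), Link(900, 0.8)]
best = select_route([route_a, route_b], bw_required=500)   # route_a (route_b fails stability)
```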

    An Overview on Application of Machine Learning Techniques in Optical Networks

    Today's telecommunication networks have become sources of enormous amounts of widely heterogeneous data. This information can be retrieved from network traffic traces, network alarms, signal quality indicators, users' behavioral data, etc. Advanced mathematical tools are required to extract meaningful information from these data and to make decisions about the proper functioning of the networks. Among these tools, Machine Learning (ML) is regarded as one of the most promising methodological approaches for performing network-data analysis and enabling automated network self-configuration and fault management. The adoption of ML techniques in optical communication networks is motivated by the unprecedented growth in network complexity that optical networks have faced in the last few years. This increase in complexity is due to the introduction of a huge number of adjustable and interdependent system parameters (e.g., routing configurations, modulation format, symbol rate, coding schemes) enabled by coherent transmission/reception technologies, advanced digital signal processing, and compensation of nonlinear effects in optical fiber propagation. In this paper we provide an overview of the application of ML to optical communications and networking. We classify and survey the relevant literature, and we also provide an introductory tutorial on ML for researchers and practitioners interested in this field. Although a good number of research papers have recently appeared, the application of ML to optical networks is still in its infancy; to stimulate further work in this area, we conclude the paper by proposing possible new research directions.
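    As a concrete illustration of the kind of supervised-learning use case such surveys cover, the sketch below trains a classifier to predict whether a candidate lightpath meets a quality-of-transmission (QoT) target from simple path features. The features, the synthetic labeling rule, and the choice of a random forest are assumptions for illustration only and are not taken from the paper.

```python
# Illustrative sketch only: a supervised classifier that predicts whether a
# candidate lightpath meets its quality-of-transmission (QoT) target from
# simple path features. Data, features, and labels below are synthetic.
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)
n = 2000

# Hypothetical features: total path length (km), number of spans, bits/symbol
length_km = rng.uniform(50, 2000, n)
num_spans = np.ceil(length_km / 80.0)               # assume ~80 km spans
bits_per_symbol = rng.choice([2, 4, 6], n)          # e.g. QPSK, 16QAM, 64QAM
X = np.column_stack([length_km, num_spans, bits_per_symbol])

# Synthetic label: higher-order formats tolerate shorter reach (toy rule)
reach_limit_km = 3000.0 / bits_per_symbol
y = (length_km < reach_limit_km).astype(int)        # 1 = QoT acceptable

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.25, random_state=0)
clf = RandomForestClassifier(n_estimators=100, random_state=0).fit(X_tr, y_tr)
print("toy QoT classification accuracy:", clf.score(X_te, y_te))
```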

    QAmplifyNet: Pushing the Boundaries of Supply Chain Backorder Prediction Using Interpretable Hybrid Quantum-Classical Neural Network

    Supply chain management relies on accurate backorder prediction for optimizing inventory control, reducing costs, and enhancing customer satisfaction. However, traditional machine-learning models struggle with large-scale datasets and complex relationships, and real-world data collection is often limited. This research introduces a novel methodological framework for supply chain backorder prediction, addressing the challenge of handling large datasets. Our proposed model, QAmplifyNet, employs quantum-inspired techniques within a quantum-classical neural network to predict backorders effectively on short and imbalanced datasets. Experimental evaluations on a benchmark dataset demonstrate QAmplifyNet's superiority over classical models, quantum ensembles, quantum neural networks, and deep reinforcement learning. Its proficiency in handling short, imbalanced datasets makes it an ideal solution for supply chain management. To enhance model interpretability, we use Explainable Artificial Intelligence techniques. Practical implications include improved inventory control, reduced backorders, and enhanced operational efficiency. QAmplifyNet integrates seamlessly into real-world supply chain management systems, enabling proactive decision-making and efficient resource allocation. Future work involves exploring additional quantum-inspired techniques, expanding the dataset, and investigating other supply chain applications. This research unlocks the potential of quantum computing in supply chain optimization and paves the way for further exploration of quantum-inspired machine learning models in supply chain management. Our framework and the QAmplifyNet model offer a breakthrough approach to supply chain backorder prediction, providing superior performance and opening new avenues for leveraging quantum-inspired techniques in supply chain management.
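    The abstract does not specify QAmplifyNet's architecture, so the sketch below shows only a generic hybrid quantum-classical binary classifier in the same spirit, written with PennyLane: classical features are angle-encoded into a small variational circuit whose measured expectation value is read out as a backorder probability. The circuit layout, feature count, optimizer, and toy data are all assumptions.

```python
# Illustrative sketch only (not the QAmplifyNet architecture): a generic
# hybrid quantum-classical binary classifier written with PennyLane.
# Features are angle-encoded into a small variational circuit and the
# Pauli-Z expectation of the first qubit is read out as a probability.
import numpy as np
import pennylane as qml
from pennylane import numpy as pnp   # autograd-aware NumPy for training

n_qubits = 4                          # assume 4 scaled inventory features per item
dev = qml.device("default.qubit", wires=n_qubits)

@qml.qnode(dev)
def circuit(weights, features):
    qml.AngleEmbedding(features, wires=range(n_qubits))
    qml.StronglyEntanglingLayers(weights, wires=range(n_qubits))
    return qml.expval(qml.PauliZ(0))

def predict_proba(weights, features):
    # Map the expectation value from [-1, 1] to a backorder probability in [0, 1]
    return (circuit(weights, features) + 1.0) / 2.0

def loss(weights, X, y):
    # Binary cross-entropy over the toy batch
    eps = 1e-7
    total = 0.0
    for features, label in zip(X, y):
        p = predict_proba(weights, features)
        total = total - (label * pnp.log(p + eps) + (1.0 - label) * pnp.log(1.0 - p + eps))
    return total / len(X)

# Toy synthetic data: 16 items, 4 features each, binary backorder labels
X = np.random.default_rng(1).random((16, n_qubits))
y = [0.0, 1.0] * 8

shape = qml.StronglyEntanglingLayers.shape(n_layers=2, n_qubits=n_qubits)
weights = pnp.array(np.random.default_rng(0).random(shape), requires_grad=True)

opt = qml.GradientDescentOptimizer(stepsize=0.2)
for _ in range(20):
    weights = opt.step(lambda w: loss(w, X, y), weights)
```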

    Thirty Years of Machine Learning: The Road to Pareto-Optimal Wireless Networks

    Future wireless networks have substantial potential to support a broad range of complex and compelling applications in both military and civilian fields, where users can enjoy high-rate, low-latency, low-cost, and reliable information services. Achieving this ambitious goal requires new radio techniques for adaptive learning and intelligent decision making because of the complex, heterogeneous nature of the network structures and wireless services. Machine learning (ML) algorithms have had great success in supporting big data analytics, efficient parameter estimation, and interactive decision making. Hence, in this article we review the thirty-year history of ML, elaborating on supervised learning, unsupervised learning, reinforcement learning, and deep learning. Furthermore, we investigate their employment in compelling wireless-network applications, including heterogeneous networks (HetNets), cognitive radio (CR), the Internet of Things (IoT), machine-to-machine (M2M) networks, and so on. This article aims to assist readers in clarifying the motivation and methodology of the various ML algorithms, so that they can be invoked for hitherto unexplored services and scenarios of future wireless networks.
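    As one small, self-contained example of the reinforcement-learning ideas such reviews survey for cognitive radio, the sketch below uses bandit-style Q-learning to learn which of several channels is most likely to be free. The environment, reward model, and hyperparameters are hypothetical.

```python
# Illustrative sketch only: bandit-style Q-learning for a toy channel-selection
# task of the kind discussed for cognitive radio. The environment, reward
# model, and hyperparameters are hypothetical.
import numpy as np

rng = np.random.default_rng(1)
n_channels = 4
p_free = np.array([0.2, 0.5, 0.7, 0.9])   # hypothetical chance each channel is idle

Q = np.zeros(n_channels)                   # one value estimate per channel
alpha, epsilon = 0.1, 0.1                  # learning rate, exploration rate

for step in range(5000):
    # Epsilon-greedy channel choice
    if rng.random() < epsilon:
        action = rng.integers(n_channels)
    else:
        action = int(np.argmax(Q))
    # Reward 1 if the transmission succeeds (channel idle), else 0
    reward = float(rng.random() < p_free[action])
    Q[action] += alpha * (reward - Q[action])

print("learned channel values:", np.round(Q, 2))   # should approach p_free
```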

    Integrated High-Resolution Modeling for Operational Hydrologic Forecasting

    Current advances in Earth-sensing technologies, physically based modeling, and computational processing offer the promise of a major revolution in hydrologic forecasting, with profound implications for the management of water resources and protection from related disasters. However, access to the capabilities needed to manage information from heterogeneous sources, and to deploy it in sufficiently robust modeling engines, remains the province of large governmental agencies. Moreover, even within this type of centralized operation, success is still challenged by the sheer computational complexity of overcoming uncertainty in the estimation of parameters and initial conditions in large-scale or high-resolution models. In this dissertation we seek to facilitate access to hydrometeorological data products from various U.S. agencies and to advanced watershed modeling tools through the implementation of a lightweight GIS-based software package. Accessible data products currently include gauge, radar, and satellite precipitation; stream discharge; distributed soil moisture and snow cover; and multi-resolution weather forecasts. Additionally, we introduce a suite of open-source methods aimed at the efficient parameterization and initialization of complex geophysical models in contexts of high uncertainty, scarce information, and limited computational resources. The methods in this suite include: 1) model calibration based on state-of-the-art ensemble evolutionary Pareto optimization, 2) automatic parameter estimation boosted through the incorporation of expert criteria, 3) data assimilation that hybridizes particle smoothing and variational strategies, 4) model state compression by means of optimized clustering, 5) high-dimensional stochastic approximation of watershed conditions through a novel lightweight Gaussian graphical model, and 6) simultaneous estimation of model parameters and states for hydrologic forecasting applications. Each of these methods was tested using established distributed, physically based hydrologic modeling engines (VIC and the DHSVM) applied to U.S. watersheds of different sizes, from a small, highly instrumented catchment in Pennsylvania to the basin of the Blue River in Oklahoma. A series of experiments demonstrated statistically significant improvements in the predictive accuracy of the proposed methods compared with traditional approaches. Taken together, these accessible and efficient tools can be integrated within various model-based workflows for complex operational applications in water resources and beyond.
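    To make the data-assimilation idea concrete, the sketch below runs a plain bootstrap particle filter that assimilates noisy discharge observations into a one-bucket rainfall-runoff model. This is a generic filter for intuition only, not the hybrid particle-smoothing/variational scheme developed in the dissertation; the toy model, noise levels, and parameter values are assumptions.

```python
# Illustrative sketch only: a bootstrap particle filter assimilating noisy
# discharge observations into a one-bucket (linear reservoir) runoff model.
# This is a generic filter for intuition, not the dissertation's hybrid
# particle-smoothing/variational scheme; all values below are assumptions.
import numpy as np

rng = np.random.default_rng(42)
n_particles, n_steps = 500, 50
k = 0.1                                    # hypothetical storage-discharge coefficient

def step_model(storage, rain):
    """Toy linear-reservoir step: discharge is proportional to storage."""
    discharge = k * storage
    return storage + rain - discharge, discharge

# Synthetic "truth" and noisy discharge observations
rain = rng.gamma(2.0, 2.0, n_steps)
true_storage, observations = 50.0, []
for t in range(n_steps):
    true_storage, q = step_model(true_storage, rain[t])
    observations.append(q + rng.normal(0.0, 0.5))

# Particle filter: propagate, weight by observation likelihood, resample
particles = rng.normal(50.0, 10.0, n_particles)    # initial storage ensemble
estimates = []
for t in range(n_steps):
    particles, q_pred = step_model(particles, rain[t])
    particles += rng.normal(0.0, 1.0, n_particles)             # process noise
    weights = np.exp(-0.5 * ((observations[t] - q_pred) / 0.5) ** 2)
    weights /= weights.sum()
    keep = rng.choice(n_particles, n_particles, p=weights)     # multinomial resampling
    particles = particles[keep]
    estimates.append(particles.mean())                          # storage estimate at t
```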