3,360 research outputs found
An Overview on Application of Machine Learning Techniques in Optical Networks
Today's telecommunication networks have become sources of enormous amounts of
widely heterogeneous data. This information can be retrieved from network
traffic traces, network alarms, signal quality indicators, users' behavioral
data, etc. Advanced mathematical tools are required to extract meaningful
information from these data and to make decisions about the proper
functioning of the networks. Among these
mathematical tools, Machine Learning (ML) is regarded as one of the most
promising methodological approaches to perform network-data analysis and enable
automated network self-configuration and fault management. The adoption of ML
techniques in the field of optical communication networks is motivated by the
unprecedented growth of network complexity faced by optical networks in the
last few years. Such complexity increase is due to the introduction of a huge
number of adjustable and interdependent system parameters (e.g., routing
configurations, modulation format, symbol rate, coding schemes, etc.) that are
enabled by the usage of coherent transmission/reception technologies, advanced
digital signal processing and compensation of nonlinear effects in optical
fiber propagation. In this paper we provide an overview of the application of
ML to optical communications and networking. We classify and survey relevant
literature dealing with the topic, and we also provide an introductory tutorial
on ML for researchers and practitioners interested in this field. Although a
good number of research papers have recently appeared, the application of ML to
optical networks is still in its infancy: to stimulate further work in this
area, we conclude the paper by proposing possible new research directions.
Emerging Routing Method Using Path Arbitrator in Web Sensor Networks
Routing has a major impact on wireless sensor network (WSN) performance and data delivery. Because nodes may join and leave the network unpredictably, routing in a WSN is not a simple task, and the fact that most WSN devices are resource-constrained further restricts how routing can be implemented. A variety of routing protocols are used in WSNs. The primary goal of this research, however, is to determine the best route from source to destination in a wireless sensor network using a machine learning technique, namely Particle Swarm Optimization (PSO). The study develops a new routing mechanism around an innovative, intelligent component dubbed the Path Arbitrator (or selector), which stores all sensor data and applies machine learning methods to route selection.
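The abstract above does not give the authors' actual algorithm; as a rough illustration of how PSO could drive route selection, the sketch below optimizes a continuous priority value per node and greedily decodes a path from those priorities. The topology, the decode rule, and all parameter values are illustrative assumptions, not details from the paper.

```python
import random

# Hypothetical toy topology: adjacency list with link costs (assumed data).
GRAPH = {
    "S": {"A": 2.0, "B": 5.0},
    "A": {"S": 2.0, "B": 1.0, "D": 6.0},
    "B": {"S": 5.0, "A": 1.0, "D": 2.0},
    "D": {"A": 6.0, "B": 2.0},
}
NODES = list(GRAPH)

def decode_path(priorities, src="S", dst="D"):
    """Greedy decode: hop to the unvisited neighbor with the highest priority."""
    path, node, visited = [src], src, {src}
    while node != dst:
        cands = [n for n in GRAPH[node] if n not in visited]
        if not cands:
            return None  # dead end in the decode
        node = max(cands, key=lambda n: priorities[NODES.index(n)])
        visited.add(node)
        path.append(node)
    return path

def path_cost(path):
    if path is None:
        return 1e9  # penalty for an infeasible decode
    return sum(GRAPH[a][b] for a, b in zip(path, path[1:]))

def pso(n_particles=20, iters=60, w=0.7, c1=1.5, c2=1.5, seed=1):
    """Standard global-best PSO over one priority value per node."""
    rng = random.Random(seed)
    dim = len(NODES)
    xs = [[rng.uniform(0, 1) for _ in range(dim)] for _ in range(n_particles)]
    vs = [[0.0] * dim for _ in range(n_particles)]
    pbest = [x[:] for x in xs]
    pcost = [path_cost(decode_path(x)) for x in xs]
    g = min(range(n_particles), key=lambda i: pcost[i])
    gbest, gcost = pbest[g][:], pcost[g]
    for _ in range(iters):
        for i in range(n_particles):
            for d in range(dim):
                # Velocity update: inertia + cognitive + social terms.
                vs[i][d] = (w * vs[i][d]
                            + c1 * rng.random() * (pbest[i][d] - xs[i][d])
                            + c2 * rng.random() * (gbest[d] - xs[i][d]))
                xs[i][d] += vs[i][d]
            c = path_cost(decode_path(xs[i]))
            if c < pcost[i]:
                pbest[i], pcost[i] = xs[i][:], c
                if c < gcost:
                    gbest, gcost = xs[i][:], c
    return decode_path(gbest), gcost
```

In this toy graph the decoded route always terminates at the destination, so the swarm only has to trade off the feasible routes' total link costs; a real Path Arbitrator would additionally fold energy and congestion metrics into the cost function.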
Balanced Order Batching with Task-Oriented Graph Clustering
Balanced order batching problem (BOBP) arises from the process of warehouse
picking in Cainiao, the largest logistics platform in China. Batching orders
together in the picking process to form a single picking route reduces travel
distance. This matters because order picking is a labor-intensive process, and
good batching methods can yield substantial savings. The BOBP is an NP-hard
combinatorial optimization problem, and
designing a good problem-specific heuristic under the quasi-real-time system
response requirement is non-trivial. In this paper, rather than designing
heuristics, we propose an end-to-end learning and optimization framework named
Balanced Task-oriented Graph Clustering Network (BTOGCN) to solve the BOBP by
reducing it to a balanced graph clustering optimization problem. In BTOGCN, a
task-oriented estimator network is introduced to guide the type-aware
heterogeneous graph clustering networks to find a better clustering result
related to the BOBP objective. Through comprehensive experiments on
single-graph and multi-graph settings, we show that: 1) our balanced
task-oriented graph clustering network can directly utilize the guidance of the
target signal and outperforms the two-stage deep embedding and deep clustering
method; and 2) our method reduces the average picking distance by 4.57 m and
0.13 m compared with the expert-designed algorithm on the single- and
multi-graph sets, respectively, and generalizes well to practical scenarios.
Comment: 10 pages, 6 figures
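To make the balancing objective concrete, here is a minimal sketch of the kind of greedy baseline a learned approach like BTOGCN would be compared against: a longest-processing-time (LPT) heuristic that assigns the largest remaining order to the currently lightest batch. The order data and batch count are illustrative, not taken from the paper.

```python
# Toy orders: (order_id, number of items to pick). Data are illustrative.
orders = [("o1", 8), ("o2", 5), ("o3", 7), ("o4", 3), ("o5", 6), ("o6", 4)]
n_batches = 3

def balanced_batch(orders, n_batches):
    """LPT-style greedy: largest order first, into the lightest batch."""
    batches = [[] for _ in range(n_batches)]
    loads = [0] * n_batches
    for oid, size in sorted(orders, key=lambda o: -o[1]):
        i = loads.index(min(loads))  # pick the lightest batch so far
        batches[i].append(oid)
        loads[i] += size
    return batches, loads
```

Such a heuristic balances batch sizes but ignores the picking route entirely; the point of BTOGCN is to cluster orders so that the batches are balanced *and* the induced picking routes are short.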
Artificial intelligence (AI) methods in optical networks: A comprehensive survey
Artificial intelligence (AI) is an extensive scientific discipline which enables computer systems to solve problems by emulating complex biological processes such as learning, reasoning and self-correction. This paper presents a comprehensive review of the application of AI techniques for improving the performance of optical communication systems and networks. The use of AI-based techniques is first studied in applications related to optical transmission, ranging from the characterization and operation of network components to performance monitoring, mitigation of nonlinearities, and quality of transmission estimation. Then, applications related to optical network control and management are also reviewed, including topics like optical network planning and operation in both transport and access networks. Finally, the paper also presents a summary of opportunities and challenges in optical networking where AI is expected to play a key role in the near future. Funding: Ministerio de Economía, Industria y Competitividad (Projects EC2014-53071-C3-2-P and TEC2015-71932-REDT).
Improving Computer Network Operations Through Automated Interpretation of State
Networked systems today are hyper-scaled entities that provide core functionality for distributed services and applications spanning personal, business, and government use. It is critical to maintain correct operation of these networks to avoid adverse business outcomes. The advent of programmable networks has provided much-needed fine-grained network control, enabling providers and operators alike to build innovative networking architectures and solutions. At the same time, they have given rise to new challenges in network management. These architectures, coupled with a multitude of devices, protocols, virtual overlays on top of the physical data plane, etc., make network management a highly challenging task. Existing network management methodologies have not evolved at the same pace as the technologies and architectures. Current network management practices do not provide adequate solutions for highly dynamic, programmable environments. We have a long way to go in developing management methodologies that can meaningfully contribute to networks becoming self-healing entities. The goal of my research is to contribute to the design and development of networks towards transforming them into self-healing entities.
Network management includes a multitude of tasks, not limited to diagnosis and troubleshooting, but also performance engineering and tuning, security analysis, etc. This research explores novel methods of utilizing network state to enhance networking capabilities. It is constructed around hypotheses based on careful analysis of practical deficiencies in the field. I try to generate real-world impact with my research by tackling problems that are prevalent in deployed networks and that bear practical relevance to the current state of networking. The overarching goal of this body of work is to examine various approaches that could help enhance network management paradigms, providing administrators with a better understanding of the underlying state of the network and thus leading to more informed decision-making. The research looks into two distinct areas of network management, troubleshooting and routing, presenting novel approaches to accomplishing certain goals in each of these areas and demonstrating that they can indeed enhance the network management experience.
Survey of Routing Algorithms for Computer Networks
This thesis gives a general discussion of routing for computer networks, followed by an overview of a number of typical routing algorithms used or reported in the past few years. Attention is mainly focused on distributed adaptive routing algorithms for packet-switching (or message-switching) networks. Algorithms for major commercial networks (or network architectures) are reviewed as well, for ease of comparison.
Machine Learning Meets Communication Networks: Current Trends and Future Challenges
The growing network density and unprecedented increase in network traffic, caused by the massively expanding number of connected devices and online services, require intelligent network operations. Machine Learning (ML) has been applied in this regard in different types of networks and networking technologies to meet the requirements of future communicating devices and services. In this article, we provide a detailed account of current research on the application of ML in communication networks and shed light on future research challenges. Research on the application of ML in communication networks is described in: i) the three layers, i.e., physical, access, and network layers; and ii) novel computing and networking concepts such as Multi-access Edge Computing (MEC), Software Defined Networking (SDN), and Network Functions Virtualization (NFV), along with a brief overview of ML-based network security. Important future research challenges are identified and presented to help spur further research in key areas in this direction.