5,000 research outputs found

    An efficient mapping algorithm with novel node-ranking approach for embedding virtual networks

    The virtual network embedding (VNE) problem is widely accepted as a central question in network virtualization (NV): how to efficiently embed virtual networks, with node and link resource demands, onto a shared substrate network that has finite resources. Previous VNE heuristic algorithms, which consider only a single network topology attribute and the local resources of each node, may lead to inefficient long-term utilization of substrate resources. To address this issue, this paper proposes a topology-attribute and global-resource-driven VNE algorithm (VNE-TAGRD) that adopts a novel node-ranking approach. The node-ranking approach, developed from the well-known Google PageRank algorithm, considers three essential topology attributes together with global network resource information before embedding a given virtual network request (VNR). Numerical simulation results reveal that VNE-TAGRD outperforms five typical and recent heuristic algorithms that consider only a single topology attribute and local node resources, in terms of long-term average VNR acceptance ratio and average revenue-to-cost ratio.
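
    A minimal sketch of what such a PageRank-style, resource-biased node ranking might look like is given below; the specific attributes combined here (free CPU and incident link bandwidth) and the toy substrate graph are assumptions for illustration, not the actual attribute set or weights used by VNE-TAGRD.

```python
# Illustrative, resource-biased PageRank-style node ranking (not VNE-TAGRD itself).
# adj maps each substrate node to {neighbour: link bandwidth}; cpu holds free CPU.
def node_rank(adj, cpu, damping=0.85, iters=100, tol=1e-9):
    nodes = list(adj)
    # Local score: free CPU times total bandwidth of incident links
    # (one plausible "topology attribute + resource" combination).
    local = {v: cpu[v] * sum(adj[v].values()) for v in nodes}
    total = sum(local.values()) or 1.0
    restart = {v: local[v] / total for v in nodes}   # resource-biased teleport vector
    rank = dict(restart)
    for _ in range(iters):
        new = {}
        for v in nodes:
            # Rank mass flowing in from neighbours, weighted by link bandwidth.
            inflow = sum(rank[u] * adj[u][v] / sum(adj[u].values())
                         for u in nodes if v in adj[u])
            new[v] = (1 - damping) * restart[v] + damping * inflow
        if max(abs(new[v] - rank[v]) for v in nodes) < tol:
            rank = new
            break
        rank = new
    return sorted(nodes, key=rank.get, reverse=True)

# Toy substrate: three nodes with symmetric link bandwidths and free CPU.
adj = {"a": {"b": 10, "c": 5}, "b": {"a": 10, "c": 20}, "c": {"a": 5, "b": 20}}
cpu = {"a": 8, "b": 4, "c": 16}
print(node_rank(adj, cpu))   # substrate nodes ordered from highest to lowest rank
```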

    Machine learning requirements for energy-efficient virtual network embedding

    Network virtualization is a proven key enabler for a family of strategies aimed at different targets, such as energy efficiency, economic revenue, network usage, adaptability, or failure protection. Network virtualization allows us to adapt a network to new circumstances, resulting in greater flexibility. The decisions that allocate virtual demands onto physical network resources determine both the costs and the benefits; this allocation problem, known as virtual network embedding (VNE), is therefore one of the major current problems. Many algorithms have recently been proposed in the literature to solve the VNE problem for different targets. Given the current rise of artificial intelligence, it has been widely applied to technological problems. In this context, this paper investigates the requirements for, and analyses the use of, the Q-learning algorithm for energy-efficient VNE. The results validate the strategy and show clear improvements in cost/revenue and energy savings compared with traditional algorithms. This work has been supported by the Agencia Estatal de Investigación of Ministerio de Ciencia e Innovación of Spain under project PID2019-108713RB-C51 MCIN/AEI/10.13039/501100011033.
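
    The paper's state, action, and reward design is not described in this abstract, so the following is only a generic tabular Q-learning skeleton for a VNE-style decision (choosing a substrate node for the next virtual node), with a made-up energy-aware reward; all names and the reward shape are illustrative assumptions.

```python
import random
from collections import defaultdict

# Minimal tabular Q-learning skeleton for an energy-aware embedding decision.
# State, action, and reward definitions here are illustrative placeholders.
alpha, gamma, epsilon = 0.1, 0.9, 0.2     # learning rate, discount, exploration
Q = defaultdict(float)                     # Q[(state, action)] -> value

def choose_action(state, actions):
    """Epsilon-greedy selection over candidate substrate nodes."""
    if random.random() < epsilon:
        return random.choice(actions)
    return max(actions, key=lambda a: Q[(state, a)])

def update(state, action, reward, next_state, next_actions):
    """Standard Q-learning temporal-difference update."""
    best_next = max((Q[(next_state, a)] for a in next_actions), default=0.0)
    Q[(state, action)] += alpha * (reward + gamma * best_next - Q[(state, action)])

# Hypothetical reward: favour embeddings that reuse already-powered substrate
# nodes (energy saving) while keeping enough residual CPU headroom.
def energy_aware_reward(node_already_active, residual_cpu_after):
    return (1.0 if node_already_active else -0.5) + 0.01 * residual_cpu_after
```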

    edge2vec: Representation learning using edge semantics for biomedical knowledge discovery

    Representation learning provides new and powerful graph analytical approaches and tools for the highly valued data science challenge of mining knowledge graphs. Since previous graph analytical methods have mostly focused on homogeneous graphs, an important current challenge is extending this methodology to richly heterogeneous graphs and knowledge domains. The biomedical sciences are such a domain, reflecting the complexity of biology, with entities such as genes, proteins, drugs, diseases, and phenotypes, and relationships such as gene co-expression, biochemical regulation, and biomolecular inhibition or activation. The semantics of edges and nodes are therefore critical for representation learning and knowledge discovery in real-world biomedical problems. In this paper, we propose the edge2vec model, which represents graphs while taking edge semantics into account. An edge-type transition matrix is trained by an Expectation-Maximization approach, and a stochastic gradient descent model is employed to learn node embeddings on a heterogeneous graph via the trained transition matrix. edge2vec is validated on three biomedical domain tasks: biomedical entity classification, compound-gene bioactivity prediction, and biomedical information retrieval. The results show that by incorporating edge types into node embedding learning on heterogeneous graphs, edge2vec significantly outperforms state-of-the-art models on all three tasks. We propose this method for its added value relative to existing graph analytical methodology and for its applicability in the real-world context of biomedical knowledge discovery.
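
    As a rough illustration of how a trained edge-type transition matrix can bias the random walks that feed node-embedding learning, the sketch below performs an edge-type-aware walk on a toy heterogeneous graph; the EM training of the matrix and the skip-gram/SGD embedding step used by edge2vec are omitted, and the graph, edge types, and matrix values are placeholders.

```python
import random

# Edge-type-aware random walk: a (given) edge-type transition matrix biases
# which edge type the walk prefers to traverse next.  Toy data throughout.
graph = {  # node -> list of (neighbour, edge_type)
    "geneA": [("proteinX", "encodes"), ("geneB", "co_expression")],
    "geneB": [("geneA", "co_expression"), ("drugY", "target")],
    "proteinX": [("geneA", "encodes"), ("drugY", "inhibits")],
    "drugY": [("geneB", "target"), ("proteinX", "inhibits")],
}
types = ["encodes", "co_expression", "target", "inhibits"]
# trans[t1][t2]: preference for following an edge of type t2 after type t1.
trans = {t1: {t2: (2.0 if t1 == t2 else 1.0) for t2 in types} for t1 in types}

def edge_biased_walk(start, length):
    walk, prev_type = [start], None
    for _ in range(length):
        nbrs = graph[walk[-1]]
        weights = [1.0 if prev_type is None else trans[prev_type][t] for _, t in nbrs]
        nxt, prev_type = random.choices(nbrs, weights=weights, k=1)[0]
        walk.append(nxt)
    return walk

# Walks like this would normally be fed to a skip-gram model to learn embeddings.
print(edge_biased_walk("geneA", 5))
```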

    Efficient and Secure 5G Core Network Slice Provisioning Based on VIKOR Approach

    Network slicing in 5G is expected to fundamentally change the way network operators deploy and manage vertical services with different performance requirements. Efficient and secure slice provisioning algorithms are important because network slices share the limited resources of the physical network. In this article, we first analyze the security issues in network slicing and formulate an Integer Linear Programming (ILP) model for secure 5G core network slice provisioning. We then propose a heuristic 5G core network slice provisioning algorithm, VIKOR-CNSP, based on VIKOR, a multi-criteria decision making (MCDM) method. In the slice node provisioning stage, node importance is ranked with the VIKOR approach by considering node resource and topology attributes, and slice nodes are provisioned according to the ranking results. In the slice link provisioning stage, a k-shortest-path algorithm is used to obtain candidate physical paths for each slice link, and a strategy for selecting among them is proposed to increase the slice acceptance ratio: it first calculates a path factor P, the product of the maximum link bandwidth utilization of the candidate physical path and its hop count, and then chooses the candidate path with the smallest P to host the slice link. Extensive simulations show that the proposed algorithm achieves the highest slice acceptance ratio and the largest provisioning revenue-to-cost ratio while satisfying the security constraints of 5G core network slice requests.
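
    The path-selection rule is concrete enough to sketch: among the candidate physical paths returned by the k-shortest-path search, choose the one minimizing P = (maximum link bandwidth utilization along the path) × (hop count). The data structures and toy values below are illustrative, not taken from the paper.

```python
# Illustrative path-selection step from the slice-link provisioning stage:
# choose the candidate physical path with the smallest factor
#   P = (max link bandwidth utilisation on the path) * (hop count).
# Candidate paths would come from a k-shortest-path search (not shown).

def path_factor(path, used_bw, capacity):
    """path: list of links, each link a (u, v) tuple."""
    max_util = max(used_bw[link] / capacity[link] for link in path)
    return max_util * len(path)           # len(path) = hop count

def select_path(candidates, used_bw, capacity):
    return min(candidates, key=lambda p: path_factor(p, used_bw, capacity))

# Toy example with two candidate paths between the same slice-node pair.
capacity = {("a", "b"): 100, ("b", "c"): 100, ("a", "d"): 100, ("d", "c"): 100}
used_bw  = {("a", "b"): 80,  ("b", "c"): 10, ("a", "d"): 40,  ("d", "c"): 40}
candidates = [[("a", "b"), ("b", "c")], [("a", "d"), ("d", "c")]]
print(select_path(candidates, used_bw, capacity))   # the a-d-c path (P = 0.8)
```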

    Efficient Virtual Network Embedding Via Exploring Periodic Resource Demands

    Cloud computing built on virtualization technologies promises elastic provisioning of computing and communication resources to enterprise users. To share cloud resources efficiently, embedding the virtual networks of different users into a distributed cloud consisting of multiple data centers (a substrate network) poses great challenges. Motivated by the fact that most enterprise virtual networks operate on a long-term basis and exhibit periodic resource demands, in this paper we study the virtual network embedding problem of embedding as many virtual networks as possible into a substrate network so that the revenue of the substrate network's service provider is maximized while the Service Level Agreements (SLAs) between enterprise users and the cloud service provider are met. We propose an efficient embedding algorithm that exploits the periodic resource demands of virtual networks and employs a novel embedding metric that models the workloads on both substrate nodes and communication links when the periodic resource demands are given; otherwise, we propose a prediction model that infers the periodic resource demands of these virtual networks from their historic resource demands. We evaluate the performance of the proposed algorithms by simulation. Experimental results demonstrate that the proposed algorithms outperform existing algorithms, improving revenue by 10% to 31%.
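
    The abstract does not specify the prediction model, so the following is only a stand-in under the assumption of periodic demands: a seasonal average that predicts each slot of the next period from the same slot in previous periods of the demand history.

```python
# Stand-in predictor for periodic resource demands (not the paper's model):
# average each time slot over all previous periods of the demand history.

def predict_next_period(history, period):
    """history: per-slot resource demands; period: slots per cycle (e.g. 24 hourly slots)."""
    if len(history) < period:
        raise ValueError("need at least one full period of history")
    prediction = []
    for slot in range(period):
        samples = history[slot::period]           # same slot across past periods
        prediction.append(sum(samples) / len(samples))
    return prediction

# Two periods of a virtual node's CPU demand, four slots per period.
history = [10, 40, 35, 12,   11, 42, 33, 14]
print(predict_next_period(history, period=4))     # [10.5, 41.0, 34.0, 13.0]
```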

    Management of Spectral Resources in Elastic Optical Networks

    Recent developments in mobile technologies, data center networks, cloud computing, and social networks have triggered the growth of a wide range of network applications. The data rates of these applications vary from a few megabits per second (Mbps) to several gigabits per second (Gbps), increasing the burden on the Internet. To support this growth in Internet traffic, one foremost solution is to exploit advances in optical networks. With technologies such as wavelength division multiplexing (WDM), bandwidth of up to 100 Gbps can be extracted from an optical fiber in an energy-efficient manner. However, WDM networks are not efficient when traffic demands vary frequently. Elastic Optical Networks (EONs), also known as Spectrum-Sliced Elastic Optical Path Networks (SLICE) or Flex-Grid, have recently been proposed as a long-term solution to handle the ever-increasing data traffic and its diverse demand range. EONs provide abundant bandwidth by managing spectrum resources as fine-granular orthogonal sub-carriers, which makes them suitable for accommodating varying traffic demands. However, the Routing and Spectrum Allocation (RSA) algorithm in EONs must satisfy additional constraints while allocating sub-carriers to demands. These constraints increase the complexity of RSA and also make EONs prone to fragmentation of spectral resources, thereby decreasing spectral efficiency. The major objective of this dissertation is to study the problem of spectrum allocation in EONs under various network conditions. With this objective, this dissertation presents the author's study and research on multiple aspects of spectrum allocation in EONs: how to allocate sub-carriers to traffic demands, how to accommodate traffic demands that vary with time, how to minimize the fragmentation of spectral resources, and how to efficiently integrate the predictability of user demands into spectrum assignment. Another important contribution of this dissertation is the application of EONs as a substrate technology for network virtualization.
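
    The dissertation's specific allocation schemes are not detailed in this abstract; as background, the sketch below shows a minimal first-fit spectrum assignment under the standard EON constraints of spectrum contiguity on a link and continuity across the links of a route, with toy data and no guard bands or modulation choices.

```python
# Minimal first-fit sketch of the spectrum-allocation step in RSA: find the
# lowest-indexed block of contiguous sub-carriers that is free on every link
# of the chosen route (contiguity + continuity constraints).

def first_fit(route, link_spectrum, demand_slots):
    """link_spectrum: {link: list of bools, True = slot free}."""
    n_slots = len(next(iter(link_spectrum.values())))
    for start in range(n_slots - demand_slots + 1):
        block = range(start, start + demand_slots)
        if all(link_spectrum[l][s] for l in route for s in block):
            for l in route:                      # allocate the block on every link
                for s in block:
                    link_spectrum[l][s] = False
            return start                         # first slot index of the block
    return None                                  # demand is blocked

spectrum = {
    ("a", "b"): [False, False, True, True, True, True],
    ("b", "c"): [True,  False, False, True, True, True],
}
print(first_fit([("a", "b"), ("b", "c")], spectrum, demand_slots=2))   # -> 3
```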

    Masked deep reinforcement learning for virtual network embedding on elastic optical networks

    Deep reinforcement learning (DRL) with invalid action masking is applied to the optimization problem of virtual optical network embedding (VONE) over elastic optical networks (EONs). Separate DRL agents are trained on the node-mapping task, the link-mapping task, and the overall VONE task. Their blocking probability performance is compared with a spectral-fragmentation-aware VONE heuristic. All three DRL agents achieve lower blocking probability than the heuristic across low and high traffic loads.
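
    The agents' architecture is not reproduced here; the sketch below only illustrates the invalid-action-masking idea itself, in which the logits of infeasible actions (for example substrate nodes or spectrum blocks that cannot host the request) are pushed to negative infinity before the softmax so the policy never selects them. The logits and mask values are arbitrary.

```python
import numpy as np

# Invalid-action masking: infeasible actions get -inf logits before the softmax,
# so they receive exactly zero probability under the resulting policy.
def masked_policy(logits, valid_mask):
    masked = np.where(valid_mask, logits, -np.inf)
    exp = np.exp(masked - masked.max())          # subtract max for numerical stability
    return exp / exp.sum()

logits = np.array([1.2, 0.3, -0.5, 2.0])
valid  = np.array([True, False, True, False])    # actions 1 and 3 are infeasible
print(masked_policy(logits, valid))              # zero probability on masked actions
```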