20 research outputs found

    Tensor decomposition techniques for analysing time-varying networks

    The aim of this Ph.D. thesis is the study of time-varying networks via theoretical and data-driven approaches. Networks are natural objects for representing a vast variety of systems in nature, e.g., communication networks (phone calls and e-mails), online social networks (Facebook, Twitter), infrastructural networks, etc. Considering the temporal dimension of networks helps to better understand and predict complex phenomena, by taking into account both the fact that links in the network are not continuously active over time and the potential relation between multiple dimensions, such as space and time. A fundamental challenge in this area is the definition of mathematical models and tools able to capture topological and dynamical aspects and to reproduce properties observed in the real dynamics of networks. Thus, the purpose of this thesis is threefold: 1) we will focus on the analysis of the complex mesoscale patterns, such as community-like structures and their evolution in time, that characterize time-varying networks; 2) we will study how these patterns impact dynamical processes that occur over the network; 3) we will sketch a generative model to study the interplay between topological and temporal patterns of time-varying networks and dynamical processes occurring over the network, e.g., disease spreading. To tackle these problems, we adopt and extend an approach at the intersection of multi-linear algebra and machine learning: the decomposition of time-varying networks represented as tensors (multi-dimensional arrays). In particular, we focus on the study of Non-negative Tensor Factorization (NTF) techniques to detect complex topological and temporal patterns in the network. We first extend the NTF framework to tackle the problem of detecting anomalies in time-varying networks. Then, we propose a technique to approximate and reconstruct time-varying networks affected by missing information, both to recover the missing values and to reproduce dynamical processes on top of the network. Finally, we focus on the analysis of the interplay between the discovered patterns and dynamical processes. To this aim, we use the NTF as a hint to devise a generative model of time-varying networks, in which we can control both the topological and temporal patterns, to identify which of them has the major impact on the dynamics.
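
    As a rough illustration of the representation described above (and not the thesis code), the sketch below builds a toy node-by-node-by-time tensor from random adjacency snapshots and applies a non-negative PARAFAC/CP decomposition using the TensorLy library; the tensor shape, the rank, and the toy data are assumptions made purely for this example.

        # A minimal, hypothetical sketch of NTF on a time-varying network:
        # stack adjacency snapshots into a (nodes x nodes x time) tensor and
        # factorize it; each component couples two node factors (a community-like
        # structure) with a temporal activity profile.
        import numpy as np
        import tensorly as tl
        from tensorly.decomposition import non_negative_parafac

        N, T, rank = 50, 30, 4                                     # nodes, snapshots, components (assumed)
        rng = np.random.default_rng(0)
        snapshots = (rng.random((T, N, N)) < 0.05).astype(float)   # toy adjacency matrices over time
        tensor = tl.tensor(np.moveaxis(snapshots, 0, -1))          # shape (N, N, T)

        cp = non_negative_parafac(tensor, rank=rank, n_iter_max=200)
        out_nodes, in_nodes, activity = cp.factors                 # (N, rank), (N, rank), (T, rank)

        # Nodes with large loadings in component r form a community-like pattern;
        # activity[:, r] shows when that pattern is active in time.
        print(out_nodes.shape, in_nodes.shape, activity.shape)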

    Rethinking Broadband Internet Access

    The emergence of broadband Internet technologies, such as cable modem and digital subscriber line (DSL) systems, has reopened debates over how the Internet should be regulated. Advocates of network neutrality and open access to cable modem systems have proposed extending the regulatory regime developed to govern conventional telephone and narrowband Internet service to broadband. A critical analysis of the rationales traditionally invoked to justify the regulation of telecommunications networks, such as natural monopoly, network economic effects, vertical exclusion, and the dangers of ruinous competition, reveals that those rationales depend on empirical and theoretical preconditions that do not apply to broadband. In addition, the current policy debate treats access to networks as a unitary phenomenon and fails to take into account how different types of access requirements can affect network performance in widely divergent ways. The current debate also fails to capture how individual network elements can interact in ways that can be quite unpredictable. In this Article, Professors Spulber and Yoo analyze broadband access using a theory of network configuration based on a branch of mathematics known as graph theory, which captures the interactions between individual components that cause networks to behave as complex systems. This theory yields a five-part classification system that provides insights into the effect of different types of access on network cost, capacity, reliability, and transaction costs.

    Sustaining Interdisciplinary Research: A Multilayer Perspective


    Mobile Ad-Hoc Networks

    Being infrastructure-less and free of central administrative control, wireless ad-hoc networking is playing an increasingly important role in extending the coverage of traditional wireless infrastructure (cellular networks, wireless LANs, etc.). This book includes state-of-the-art techniques and solutions for wireless ad-hoc networks. It focuses on the following topics in ad-hoc networks: quality-of-service and video communication, routing protocols, and cross-layer design. A few interesting problems concerning security and delay-tolerant networks are also discussed. The book aims to provide network engineers and researchers with design guidelines for large-scale wireless ad-hoc networks.

    Knowledge Modelling and Learning through Cognitive Networks

    One of the most promising developments in modelling knowledge is cognitive network science, which aims to investigate cognitive phenomena driven by the networked, associative organization of knowledge. For example, investigating the structure of semantic memory via semantic networks has illuminated how memory recall patterns influence phenomena such as creativity, memory search, learning, and, more generally, knowledge acquisition, exploration, and exploitation. In parallel, neural network models for artificial intelligence (AI) are also becoming more widespread as inferential models for understanding which features drive language-related phenomena such as meaning reconstruction, stance detection, and emotional profiling. Whereas cognitive networks map explicitly which entities engage in associative relationships, neural networks perform an implicit mapping of correlations in cognitive data as weights, obtained after training over labelled data, whose interpretation is not immediately evident to the experimenter. This book aims to bring together quantitative, innovative research that focuses on modelling knowledge through cognitive and neural networks to gain insight into the mechanisms driving cognitive processes related to knowledge structuring, exploration, and learning. The book comprises a variety of publication types, including reviews and theoretical papers, empirical research, computational modelling, and big data analysis. All papers here share a commonality: they demonstrate how the application of network science and AI can extend and broaden cognitive science in ways that traditional approaches cannot.
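
    As a toy illustration of the "explicit mapping" point above (the word associations and the library choice are invented for this example, not taken from the book), the sketch below stores free associations as edges of a small semantic network, so the associative structure of knowledge can be read directly from the graph, unlike the trained weights of a neural network.

        # A minimal, hypothetical semantic network built with networkx:
        # nodes are concepts, edges are explicit associative relationships.
        import networkx as nx

        associations = [
            ("dog", "cat"), ("cat", "mouse"), ("dog", "bone"),
            ("bone", "skeleton"), ("mouse", "cheese"), ("cheese", "milk"),
        ]
        G = nx.Graph(associations)

        # Recall and memory search can be framed as walks over this structure,
        # e.g. the associative path linking two concepts, or the network's hubs.
        print(nx.shortest_path(G, "dog", "cheese"))  # ['dog', 'cat', 'mouse', 'cheese']
        print(nx.degree_centrality(G))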

    Supporting distributed computation over wide area gigabit networks

    The advent of high-bandwidth fibre optic links that may be used over very large distances has led to much research and development in the field of wide area gigabit networking. One problem that needs to be addressed is how loosely coupled distributed systems may be built over these links, allowing many computers worldwide to take part in complex calculations in order to solve "Grand Challenge" problems. The research conducted as part of this PhD has looked at the practicality of implementing a communication mechanism proposed by Craig Partridge called Late-binding Remote Procedure Calls (LbRPC). LbRPC is intended to export both code and data over the network to remote machines for evaluation, as opposed to traditional RPC mechanisms that only send parameters to pre-existing remote procedures. The ability to send code as well as data means that LbRPC requests can overcome one of the biggest problems in Wide Area Distributed Computer Systems (WADCS): the fixed latency due to the speed of light. As machines get faster, the fixed multi-millisecond round-trip delay equates to ever increasing numbers of CPU cycles. For a WADCS to be efficient, programs should minimise the number of network transits they incur; by allowing the application programmer to export arbitrary code to the remote machine, this may be achieved. This research has looked at the feasibility of supporting secure exportation of arbitrary code and data in heterogeneous, loosely coupled, distributed computing environments. It has investigated techniques for making placement decisions for the code in cases where there are a large number of widely dispersed remote servers that could be used. The latter has resulted in the development of a novel prototype LbRPC using multicast IP for implicit placement and a sequenced, multi-packet saturation multicast transport protocol. These prototypes show that it is possible to export code and data to multiple remote hosts, thereby removing the need to perform complex and error-prone explicit process placement decisions.
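
    To make the late-binding idea concrete, the toy sketch below ships both code and data in a single message and evaluates them on the "remote" side, so a multi-step computation costs one network transit rather than one per call. This is neither Partridge's wire protocol nor the thesis prototype; the JSON framing, the function names, and the absence of any security or sandboxing are assumptions made purely for illustration.

        # A hypothetical, minimal illustration of late-binding RPC: the request
        # carries the procedure's source code as well as its arguments.
        import json

        def lbrpc_request(source: str, entry: str, args: list) -> str:
            """Package code plus data into one message (one network transit)."""
            return json.dumps({"source": source, "entry": entry, "args": args})

        def lbrpc_evaluate(message: str):
            """What a remote server would do: compile the shipped code, then run it."""
            req = json.loads(message)
            namespace = {}
            exec(compile(req["source"], "<lbrpc>", "exec"), namespace)  # no sandboxing here
            return namespace[req["entry"]](*req["args"])

        # The shipped code can iterate locally on the server; a conventional RPC
        # would pay the speed-of-light round-trip delay for every call instead.
        code = "def run(xs):\n    return sum(x * x for x in xs)\n"
        print(lbrpc_evaluate(lbrpc_request(code, "run", [[1, 2, 3, 4]])))  # 30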

    Net Neutrality

    This book is available as open access through the Bloomsbury Open Access programme on www.bloomsburycollections.com. 'Chris Marsden maneuvers through the hype articulated by Network Neutrality advocates and opponents. He offers a clear-headed analysis of the high stakes in this debate about the Internet's future, and fearlessly refutes the misinformation and misconceptions that abound' (Professor Rob Freiden, Penn State University). Net Neutrality is a very heated and contested policy principle regarding access for content providers to the Internet end-user, and potential discrimination in that access where the end-user's ISP (or another ISP) blocks that access in part or in whole. The suggestion has been that the problem can be resolved either by introducing greater competition or by closely policing conditions for vertically integrated services, such as VOIP. However, that is not the whole story, and ISPs as a whole have incentives to discriminate between content for matters such as network management of spam, to secure and maintain customer experience at current levels, and for economic benefit from new Quality of Service standards. This includes offering a ‘priority lane' on the network for premium content types such as video and voice services. The author considers market developments and policy responses in Europe and the United States, draws conclusions and proposes regulatory recommendations.