406 research outputs found

    A Survey on Centrality Metrics and Their Implications in Network Resilience

    Centrality metrics have been used in various networks, such as communication, social, biological, geographic, or contact networks. In particular, they have been used to study and analyze targeted attack behaviors and to investigate their effect on network resilience. Although a rich set of centrality metrics has been developed over decades, only a limited subset is in common use. This paper introduces a wide range of existing centrality metrics and discusses their applicability and performance based on results from extensive simulation experiments, in order to encourage their use in solving various computing and engineering problems in networks. Comment: Main paper: 36 pages, 2 figures. Appendix: 23 pages, 45 figures.
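
    A minimal sketch of the kind of comparison the survey describes, not the authors' simulation code: it computes several common centrality metrics with networkx on an illustrative scale-free graph and checks how much the metrics agree on which nodes a targeted attack would remove first. The graph model and all parameters below are assumptions made for illustration.

        # Hedged sketch, not the survey's code: compare which nodes several
        # centrality metrics would single out for a targeted attack.
        import networkx as nx

        G = nx.barabasi_albert_graph(200, 2, seed=42)  # illustrative scale-free graph

        metrics = {
            "degree": nx.degree_centrality(G),
            "betweenness": nx.betweenness_centrality(G),
            "closeness": nx.closeness_centrality(G),
            "eigenvector": nx.eigenvector_centrality(G, max_iter=1000),
        }

        # Top-10 nodes under each metric; overlap with the degree ranking shows
        # how much the metrics agree on which nodes are "critical".
        top = {name: set(sorted(s, key=s.get, reverse=True)[:10])
               for name, s in metrics.items()}
        for name, nodes in top.items():
            overlap = len(nodes & top["degree"]) / 10
            print(f"{name:12s} overlap with degree ranking: {overlap:.1f}")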

    Resilience study applied in eco-industrial Parks

    An Eco-Industrial Park (EIP) is a community of businesses that seeks to reduce its overall environmental impact through material sharing. Even though an EIP is an environmental improvement over a set of stand-alone industrial plants, the connections established among the industrial participants can propagate failures and become a source of risk. For this reason, this work proposes an indicator to track the resilience of EIPs, constructed to be applied in the design phase of eco-industrial parks by means of an optimization problem. The indicator is based on two aspects of an industrial network: its topology and its operative flexibility. These aspects are measured by two respective sub-indicators, the Network Connectivity Index (NCI) and the Flow Adaptability Index (ϕ), which are combined into a global resilience indicator. Finally, we apply the resilience indicator to five illustrative cases in order to analyze its applicability, obtaining consistent results.
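
    The abstract does not reproduce the formulas for NCI and ϕ, so the sketch below uses stand-in definitions, graph density for the topological part and average spare capacity for the flexibility part, combined by a weighted average. It is only meant to illustrate how two sub-indicators can be folded into one global resilience score; every function and number here is a hypothetical placeholder, not the paper's method.

        # Hedged sketch: both sub-indicators below are illustrative stand-ins,
        # since the paper's exact NCI and phi formulas are not given in the abstract.
        import networkx as nx

        def network_connectivity_index(G):
            # Stand-in for NCI: density of the material-exchange network, in [0, 1].
            return nx.density(G)

        def flow_adaptability_index(flows, capacities):
            # Stand-in for phi: average spare capacity across exchange streams,
            # a rough proxy for operative flexibility.
            spare = [(cap - f) / cap for f, cap in zip(flows, capacities) if cap > 0]
            return sum(spare) / len(spare) if spare else 0.0

        def resilience_indicator(G, flows, capacities, w_topology=0.5):
            # Global indicator as a convex combination of the two sub-indicators.
            nci = network_connectivity_index(G)
            phi = flow_adaptability_index(flows, capacities)
            return w_topology * nci + (1.0 - w_topology) * phi

        # Toy eco-industrial park: four plants linked by three exchange streams.
        eip = nx.Graph([("A", "B"), ("B", "C"), ("C", "D")])
        print(resilience_indicator(eip, flows=[10, 6, 3], capacities=[15, 10, 9]))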

    Mining and analysis of real-world graphs

    Networked systems are everywhere, such as the Internet, social networks, biological networks, transportation networks, and power grid networks. They can be very large yet enormously complex, and they can contain a great deal of information, whether openly visible or hidden and encoded. Such real-world systems can be modeled as graphs and mined and analyzed through the lens of network analysis. Network analysis can be applied to recognize frequent patterns among the connected components of a large graph, such as a social network, where visual analysis is almost impossible. Frequent patterns illuminate statistically important subgraphs that are usually small enough to analyze visually. Graph mining has practical applications in fraud detection, outlier detection, analysis of chemical molecules, and more, depending on the information to be extracted and understood. Network analysis can also be used to quantitatively evaluate and improve the resilience of infrastructure networks such as the Internet or power grids. Infrastructure networks directly affect the quality of people's lives, and a disastrous incident in these networks may lead to a cascading breakdown of the whole network with serious economic consequences. In essence, network analysis can help us gain actionable insights and make better data-driven decisions based on the networks. On that note, the objective of this dissertation is to improve upon existing tools for more accurate mining and analysis of real-world networks --Abstract, page iv
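
    As a small illustration of the measurements such mining typically starts from (the dissertation's own tooling is not shown in the abstract), the snippet below loads a small real-world social network bundled with networkx and reports basic structure plus triangle counts, the simplest recurring subgraph pattern.

        # Illustrative only: basic structural statistics and a simple motif count
        # on a small real-world network shipped with networkx.
        import networkx as nx

        G = nx.karate_club_graph()

        print("nodes:", G.number_of_nodes(), "edges:", G.number_of_edges())
        print("connected components:", nx.number_connected_components(G))
        print("average clustering:", round(nx.average_clustering(G), 3))

        # Triangles as a minimal stand-in for frequent subgraph patterns.
        triangles = nx.triangles(G)
        top = sorted(triangles, key=triangles.get, reverse=True)[:5]
        print("nodes in the most triangles:", [(n, triangles[n]) for n in top])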

    Mining complex trees for hidden fruit : a graph–based computational solution to detect latent criminal networks : a thesis presented in partial fulfilment of the requirements for the degree of Doctor of Philosophy in Information Technology at Massey University, Albany, New Zealand.

    The detection of crime is a complex and difficult endeavour. Public and private organisations – focusing on law enforcement, intelligence, and compliance – commonly apply the rational isolated actor approach premised on observability and materiality. This is manifested largely as conducting entity-level risk management, sourcing ‘leads’ from reactive covert human intelligence sources and/or proactive sources by applying simple rules-based models. Focusing on discrete observable and material actors simply ignores that criminal activity exists within a complex system deriving its fundamental structural fabric from the complex interactions between actors, with those that are most unobservable likely to be both criminally proficient and influential. The graph-based computational solution developed to detect latent criminal networks is a response to the inadequacy of the rational isolated actor approach that ignores the connectedness and complexity of criminality. The core computational solution, written in the R language, consists of novel entity resolution, link discovery, and knowledge discovery technology. Entity resolution enables the fusion of multiple datasets with high accuracy (mean F-measure of 0.986 versus competitors' 0.872), generating a graph-based expressive view of the problem. Link discovery comprises link prediction and link inference, enabling the high-performance detection (accuracy of ~0.8 versus ~0.45 for relevant published models) of unobserved relationships such as identity fraud. Knowledge discovery uses the fused graph and applies the “GraphExtract” algorithm to create a set of subgraphs representing latent functional criminal groups, and a mesoscopic graph representing how these criminal groups are interconnected. Latent knowledge is generated from a range of metrics including the “Super-broker” metric and attitude prediction. The computational solution has been evaluated on a range of datasets that mimic an applied setting, demonstrating a scalable (tested on graphs of ~18 million nodes) and performant (~33 hours runtime on a non-distributed platform) solution that successfully detects relevant latent functional criminal groups in around 90% of cases sampled and enables contextual understanding of the broader criminal system through the mesoscopic graph and associated metadata. The augmented data assets generated provide a multi-perspective systems view of criminal activity that enables advanced, informed decision making across the microscopic, mesoscopic, and macroscopic spectrum.
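
    The thesis implements its pipeline in R, with unpublished components such as “GraphExtract” and the “Super-broker” metric, none of which are reproduced here. The Python sketch below is only a generic stand-in that mirrors two of the stages on a public networkx dataset: link discovery via a neighbourhood-overlap score, and knowledge discovery via off-the-shelf community detection followed by a mesoscopic graph of inter-group contacts.

        # Generic stand-in, not the thesis code: Jaccard scores for link discovery
        # and modularity communities plus a mesoscopic graph for knowledge discovery.
        import networkx as nx
        from networkx.algorithms.community import greedy_modularity_communities

        G = nx.les_miserables_graph()  # placeholder for a fused entity graph

        # Link discovery: score unobserved pairs by neighbourhood overlap.
        candidates = sorted(nx.jaccard_coefficient(G), key=lambda t: t[2], reverse=True)
        print("top predicted links:", [(u, v, round(s, 2)) for u, v, s in candidates[:3]])

        # Knowledge discovery: partition into groups, then build a mesoscopic
        # graph whose nodes are groups and whose edges are inter-group contacts.
        groups = list(greedy_modularity_communities(G))
        membership = {n: i for i, grp in enumerate(groups) for n in grp}
        meso = nx.Graph()
        meso.add_nodes_from(range(len(groups)))
        for u, v in G.edges():
            if membership[u] != membership[v]:
                meso.add_edge(membership[u], membership[v])
        print("groups:", len(groups), "inter-group links:", meso.number_of_edges())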

    Calling The Dead: Resilience In The WTC Communication Networks

    Organizations in emergency settings must cope with various sources of disruption, most notably personnel loss. Death, incapacitation, or isolation of individuals within an organizational communication network can impair information passing, coordination, and connectivity, and may drive maladaptive responses such as repeated attempts to contact lost personnel ("calling the dead") that themselves consume scarce resources. At the same time, organizations may respond to such disruption by reorganizing to restore function, a behavior that is fundamental to organizational resilience. Here, we use empirically calibrated models of communication for 17 groups of responders to the World Trade Center Disaster to examine the impact of exogenous removal of personnel on communication activity and network resilience. We find that removal of high-degree personnel and those in institutionally coordinative roles is particularly damaging to these organizations, with specialist responders being slower to adapt to losses. However, all organizations show adaptations to disruption, in some cases becoming better connected and making more complete use of personnel relative to control after experiencing losses.
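
    The paper's models are empirically calibrated to WTC responder communications and are not reproduced here; the toy sketch below only illustrates the underlying intuition, comparing connectivity loss after removing the highest-degree nodes versus an equal number of randomly chosen nodes from a synthetic network.

        # Toy illustration, not the paper's calibrated model: targeted versus
        # random removal of "personnel" from a synthetic communication network.
        import random
        import networkx as nx

        random.seed(1)
        G = nx.erdos_renyi_graph(100, 0.05, seed=1)

        def largest_component_fraction(graph, removed):
            # Fraction of remaining nodes still in the largest connected component.
            H = graph.copy()
            H.remove_nodes_from(removed)
            return max(len(c) for c in nx.connected_components(H)) / H.number_of_nodes()

        k = 10
        by_degree = sorted(G.degree, key=lambda nd: nd[1], reverse=True)
        targeted = [n for n, _ in by_degree[:k]]
        random_pick = random.sample(list(G.nodes), k)

        print("after losing high-degree nodes:", round(largest_component_fraction(G, targeted), 2))
        print("after losing random nodes:     ", round(largest_component_fraction(G, random_pick), 2))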

    A Tutorial on Clique Problems in Communications and Signal Processing

    Since its first use by Euler on the problem of the seven bridges of Königsberg, graph theory has shown excellent abilities in solving and unveiling the properties of multiple discrete optimization problems. The study of the structure of some integer programs reveals equivalence with graph theory problems, making a large body of the literature readily available for solving and characterizing the complexity of these problems. This tutorial presents a framework for utilizing a particular graph theory problem, known as the clique problem, for solving communications and signal processing problems. In particular, the paper aims to illustrate the structural properties of integer programs that can be formulated as clique problems through multiple examples in communications and signal processing. To that end, the first part of the tutorial provides various optimal and heuristic solutions for the maximum clique, maximum weight clique, and k-clique problems. The tutorial, further, illustrates the use of the clique formulation through numerous contemporary examples in communications and signal processing, mainly in maximum access for non-orthogonal multiple access networks, throughput maximization using index and instantly decodable network coding, collision-free radio frequency identification networks, and resource allocation in cloud-radio access networks. Finally, the tutorial sheds light on the recent advances of such applications, and provides technical insights on ways of dealing with mixed discrete-continuous optimization problems.
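
    As a small, hedged companion to the tutorial (not its own solvers), the snippet below solves the maximum clique and maximum weight clique problems exactly on a toy conflict graph using networkx's built-in routines; the node weights are hypothetical utilities chosen for illustration.

        # Toy example: exact maximum clique and maximum weight clique with networkx.
        import networkx as nx

        # An edge means the two items are mutually compatible (e.g. can be scheduled together).
        G = nx.Graph([(1, 2), (1, 3), (2, 3), (3, 4), (4, 5), (3, 5), (2, 4)])
        for n in G.nodes:
            G.nodes[n]["w"] = n  # hypothetical integer utility per node

        # Maximum clique by enumerating maximal cliques (exponential in general,
        # which is why the tutorial also surveys heuristics).
        max_clique = max(nx.find_cliques(G), key=len)
        print("maximum clique:", max_clique)

        # Maximum weight clique under the node weights above.
        clique, weight = nx.max_weight_clique(G, weight="w")
        print("maximum weight clique:", clique, "weight:", weight)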
