10 research outputs found

    ELASTICITY: Topological Characterization of Robustness in Complex Networks

    Just as a herd of animals relies on its robust social structure to survive in the wild, robustness is a crucial characteristic for the survival of a complex network under attack. The capacity to measure robustness in complex networks reflects a network's ability to maintain functionality in the event of classical component failures and at the onset of hidden malicious attacks. To date, robustness metrics are deficient, and the following dilemma exists: accurate models necessitate complex analysis, while simple models lack applicability to our definition of robustness. In this paper, we define robustness and present a novel metric, elasticity, as a bridge between accuracy and complexity. Additionally, we explore the performance of elasticity on Internet topologies and online social networks, and present our results.

    On the influence of topological characteristics on robustness of complex networks

    In this paper, we explore the relationship between the topological characteristics of a complex network and its robustness to sustained targeted attacks. Using synthesised scale-free, small-world and random networks, we look at a number of network measures, including assortativity, modularity, average path length, clustering coefficient, rich-club profiles and scale-free exponent (where applicable), and how each of these influences the robustness of a network under targeted attacks. We use an established robustness coefficient to measure topological robustness, and consider sustained targeted attacks by order of node degree. With respect to scale-free networks, we show that assortativity, modularity and average path length have a positive correlation with network robustness, whereas clustering coefficient has a negative correlation. We did not find any correlation between scale-free exponent and robustness, or between rich-club profiles and robustness. The robustness of small-world networks, on the other hand, shows substantial positive correlations with assortativity, modularity, clustering coefficient and average path length. In comparison, the robustness of Erdős–Rényi random networks did not have any significant correlation with any of the network properties considered. A significant observation is that high clustering decreases topological robustness in scale-free networks, yet increases topological robustness in small-world networks. Our results highlight the importance of topological characteristics in influencing network robustness, and illustrate design strategies network designers can use to increase the robustness of scale-free and small-world networks under sustained targeted attacks.
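
    The abstract does not reproduce the paper's robustness coefficient, but a closely related and widely used measure (Schneider et al.'s R: the average fraction of nodes remaining in the giant component over a degree-targeted attack sequence) can be sketched in plain Python. The adjacency encoding and the toy network below are illustrative assumptions, not the paper's data:

```python
from collections import deque

def largest_component(adj, removed):
    # BFS over the surviving nodes; return the size of the largest component
    seen, best = set(removed), 0
    for start in adj:
        if start in seen:
            continue
        q, size = deque([start]), 0
        seen.add(start)
        while q:
            u = q.popleft()
            size += 1
            for v in adj[u]:
                if v not in seen:
                    seen.add(v)
                    q.append(v)
        best = max(best, size)
    return best

def robustness_coefficient(adj):
    # R = (1/N) * sum over attack steps of the fraction of nodes left in the
    # giant component; nodes are removed by descending degree (targeted attack).
    # This is a stand-in for the paper's coefficient, which the abstract omits.
    n = len(adj)
    order = sorted(adj, key=lambda u: len(adj[u]), reverse=True)
    removed, total = set(), 0.0
    for u in order:
        removed.add(u)
        total += largest_component(adj, removed) / n
    return total / n

# toy network: a hub (node 0) with spokes, plus a short chain through node 4
adj = {0: {1, 2, 3, 4}, 1: {0}, 2: {0}, 3: {0}, 4: {0, 5}, 5: {4}}
print(round(robustness_coefficient(adj), 3))  # → 0.167
```

    The low value reflects how quickly a hub-dominated graph fragments once its highest-degree node is attacked.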

    Network robustness improvement via long-range links

    Many systems are today modelled as complex networks, since this representation has proven to be an effective approach for understanding and controlling many real-world phenomena. A significant area of interest and research is network robustness, which explores to what extent a network keeps working when failures occur in its structure and how disruptions can be avoided. In this paper, we introduce the idea of exploiting long-range links to improve the robustness of Scale-Free (SF) networks. Several experiments are carried out by attacking the networks before and after the addition of links between the farthest nodes, and the results show that this approach preserves the correct functioning of SF networks better than other commonly used strategies.
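
    The strategy described, adding links between the farthest node pairs, can be sketched as follows. The greedy farthest-pair search and the toy path graph are illustrative assumptions, not the paper's exact procedure:

```python
from collections import deque

def bfs_dist(adj, src):
    # hop distances from src to every reachable node
    dist = {src: 0}
    q = deque([src])
    while q:
        u = q.popleft()
        for v in adj[u]:
            if v not in dist:
                dist[v] = dist[u] + 1
                q.append(v)
    return dist

def add_long_range_links(adj, k):
    # greedily connect the currently farthest non-adjacent pair, k times
    for _ in range(k):
        best, pair = -1, None
        for u in adj:
            for v, d in bfs_dist(adj, u).items():
                if d > best and v != u and v not in adj[u]:
                    best, pair = d, (u, v)
        if pair is None:
            break
        u, v = pair
        adj[u].add(v)
        adj[v].add(u)

# toy path graph 0-1-2-3-4-5: the farthest pair is (0, 5)
adj = {i: set() for i in range(6)}
for i in range(5):
    adj[i].add(i + 1)
    adj[i + 1].add(i)
add_long_range_links(adj, 1)
print(sorted(adj[0]))  # → [1, 5]
```

    The added shortcut turns the path into a ring, halving the diameter and removing the single points of fragmentation at the chain's middle.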

    Robustness optimization via link additions

    Robustness is the ability of networks to avoid malfunction. Networks can be subject to failures, viruses, link removals and other attacks. Several studies have examined how to measure or improve robustness in large networks through small modifications such as adding new links or nodes. Small modifications are required because setting up links in real-world networks is expensive. Different metrics can be used to measure robustness; this project focuses on algebraic connectivity. The main statement is that the larger the algebraic connectivity, the more difficult it is to disconnect the network. The goal of this project is to design strategies for adding a number (or a certain percentage) of links to a network such that the algebraic connectivity increases the most. Insights may come from results in mathematics, or from strategies for adding a single link. In summary, the idea is to optimize a network's algebraic connectivity by means of link additions. Finally, different strategies are evaluated on different types of networks, such as Barabási–Albert power-law graphs and Erdős–Rényi random graphs.
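
    Algebraic connectivity is the second-smallest eigenvalue of the graph Laplacian L = D - A. A minimal self-contained sketch (using classical Jacobi rotations for the eigenvalues; the toy graphs and the sweep count are assumptions chosen for small matrices) shows how it can be computed:

```python
import math

def laplacian(adj, nodes):
    # L = D - A for an undirected graph given as adjacency sets
    n = len(nodes)
    idx = {u: i for i, u in enumerate(nodes)}
    L = [[0.0] * n for _ in range(n)]
    for u in adj:
        L[idx[u]][idx[u]] = float(len(adj[u]))
        for v in adj[u]:
            L[idx[u]][idx[v]] = -1.0
    return L

def eigenvalues_sym(A, sweeps=50):
    # cyclic Jacobi rotations for a symmetric matrix (fine at this scale)
    n = len(A)
    A = [row[:] for row in A]
    for _ in range(sweeps):
        for p in range(n - 1):
            for q in range(p + 1, n):
                if abs(A[p][q]) < 1e-12:
                    continue
                theta = 0.5 * math.atan2(2 * A[p][q], A[q][q] - A[p][p])
                c, s = math.cos(theta), math.sin(theta)
                for k in range(n):  # rows p and q
                    apk, aqk = A[p][k], A[q][k]
                    A[p][k] = c * apk - s * aqk
                    A[q][k] = s * apk + c * aqk
                for k in range(n):  # columns p and q
                    akp, akq = A[k][p], A[k][q]
                    A[k][p] = c * akp - s * akq
                    A[k][q] = s * akp + c * akq
    return sorted(A[i][i] for i in range(n))

def algebraic_connectivity(adj):
    # second-smallest Laplacian eigenvalue (the Fiedler value)
    return eigenvalues_sym(laplacian(adj, sorted(adj)))[1]

# path graph 0-1-2: eigenvalues of L are 0, 1, 3
print(round(algebraic_connectivity({0: {1}, 1: {0, 2}, 2: {1}}), 6))  # → 1.0
```

    Adding the link (0, 2) turns the path into a triangle and raises the Fiedler value from 1 to 3, which is the kind of improvement the link-addition strategies aim to maximize.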

    A Critical Review of Robustness in Power Grids using Complex Networks Concepts

    This paper reviews the most relevant works that have investigated robustness in power grids using Complex Networks (CN) concepts. In this broad field there are two different approaches. The first is based solely on topological concepts, and uses metrics such as mean path length, clustering coefficient, efficiency and betweenness centrality, among many others. The second, hybrid approach consists of introducing (into the CN framework) some concepts from Electrical Engineering (EE) in an effort to enhance the topological approach, and uses novel, more efficient electrical metrics such as electrical betweenness, net-ability, and others. There is however a controversy about whether these approaches are able to provide insights into all aspects of real power grids. The CN community argues that the topological approach does not aim to focus on the detailed operation, but to discover the unexpected emergence of collective behavior, while part of the EE community asserts that this leads to an excessive simplification. Beyond this open debate, there seems to be no predominant structure (scale-free, small-world) in high-voltage transmission power grids, which constitute the vast majority of power grids studied so far. Most of them have in common that they are vulnerable to targeted attacks on the most connected nodes and robust to random failure. In this respect there are only a few works that propose strategies to improve robustness, such as intentional islanding, restricted link addition, microgrids and smart grids, for which novel studies suggest that small-world networks seem to be the best topology. This work has been partially supported by the project TIN2014-54583-C2-2-R from the Spanish Ministerial Commission of Science and Technology (MICYT), by the project S2013/ICE-2933 from Comunidad de Madrid and by the project FUTURE GRIDS-2020 from the Basque Government.
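
    The recurring finding that grids are vulnerable to targeted attacks on the most connected nodes yet robust to random failure can be illustrated on a toy hub-and-spoke graph. This caricature of a high-voltage hub is an assumption for illustration, not a real transmission topology:

```python
import random
from collections import deque

def giant(adj, removed):
    # size of the largest connected component after deleting `removed`
    seen, best = set(removed), 0
    for start in adj:
        if start in seen:
            continue
        q, size = deque([start]), 0
        seen.add(start)
        while q:
            u = q.popleft()
            size += 1
            for v in adj[u]:
                if v not in seen:
                    seen.add(v)
                    q.append(v)
        best = max(best, size)
    return best

# hub-and-spoke toy "grid": node 0 feeds nine spokes
adj = {0: set(range(1, 10))}
for i in range(1, 10):
    adj[i] = {0}

targeted = giant(adj, {0})  # attack the most connected node: total collapse
random_fail = giant(adj, {random.Random(0).choice(range(1, 10))})  # lose one spoke
print(targeted, random_fail)  # → 1 9
```

    Removing the hub isolates every spoke, while a random failure almost always hits a spoke and leaves the rest of the grid connected.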

    Soft Computing approaches in ocean wave height prediction for marine energy applications

    The goal of this thesis is to investigate the use of Soft Computing (SC) techniques applied to wave energy. Among all available marine energies, wave energy exhibits the greatest future potential because, in addition to being technically efficient, it does not cause significant environmental problems. Its practical importance rests on two facts: 1) it is roughly 1000 times denser than wind energy, and 2) many ocean regions with abundant wave resources lie close to populated areas that demand electricity. The drawback is that waves are harder to characterize than tides because of their stochastic nature. SC techniques achieve results similar to, and even better than, those of other statistical methods for short-term estimates (up to 24 h), with the additional advantage of requiring much less computational effort than numerical-physical methods. This is one of the reasons we decided to explore the use of SC techniques for wave energy. The other lies in the fact that its intermittency can affect how the electricity it generates is integrated into the power grid. These two reasons have driven us to explore the feasibility of new SC approaches along two novel research lines. The first is a new approach that combines a Genetic Algorithm (GA) with an Extreme Learning Machine (ELM), applied to the problem of reconstructing the significant wave height (at a buoy whose data have been lost, for example, due to a storm) using data from other nearby buoys. Our GA-ELM algorithm is able to select a reduced set of wave parameters that maximize the reconstruction of the significant wave height at the buoy whose data were lost, using data from neighboring buoys.
    The method and results of this research have been published in: Alexandre, E., Cuadra, L., Nieto-Borge, J. C., Candil-García, G., Del Pino, M., & Salcedo-Sanz, S. (2015). A hybrid genetic algorithm—extreme learning machine approach for accurate significant wave height reconstruction. Ocean Modelling, 92, 115-123. The second contribution combines concepts from SC, Smart Grids (SGs) and Complex Networks (CNs). It is motivated by two important, mutually interrelated aspects: 1) how wave energy converters (WECs) are electrically interconnected to form a wave farm, and 2) how to connect the farm to the onshore power grid. Both are related to the random and intermittent character of the electricity produced by waves. To integrate it better without affecting grid stability, one should resort to the Smart Wave Farm (SWF) concept. Like an SG, an SWF uses sensors and algorithms to forecast the sea state and to control the production and/or storage of the generated electricity and how it is injected into the grid. In our approach, an SWF and its connection to the power grid can be viewed as an SG, which in turn can be modeled as a complex network. With this formulation, which can be generalized to any network formed by renewable generators and nodes that consume and/or store energy, we have proposed an evolutionary algorithm that optimizes the robustness of such an SG, modeled as a complex network, against random failures or abnormal operating conditions. The model and results have been published in: Cuadra, L., Pino, M. D., Nieto-Borge, J. C., & Salcedo-Sanz, S. (2017). Optimizing the Structure of Distribution Smart Grids with Renewable Generation against Abnormal Conditions: A Complex Networks Approach with Evolutionary Algorithms. Energies, 10(8), 1097.
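
    The GA wrapper in the first contribution evolves binary masks that select a subset of wave parameters. A minimal sketch of such a genetic algorithm follows; the Hamming-distance fitness is a stand-in for the actual ELM reconstruction error, and all parameter values (population size, mutation rate, tournament size) are assumptions:

```python
import random

def evolve(fitness, n_bits, pop_size=20, gens=40, p_mut=0.05, seed=1):
    # Minimal generational GA: tournament selection, one-point crossover,
    # bit-flip mutation, with the best-ever individual kept (elitism).
    rng = random.Random(seed)
    pop = [[rng.randint(0, 1) for _ in range(n_bits)] for _ in range(pop_size)]
    best = max(pop, key=fitness)
    for _ in range(gens):
        nxt = []
        while len(nxt) < pop_size:
            a = max(rng.sample(pop, 3), key=fitness)  # tournament of 3
            b = max(rng.sample(pop, 3), key=fitness)
            cut = rng.randrange(1, n_bits)            # one-point crossover
            child = [bit ^ (rng.random() < p_mut) for bit in a[:cut] + b[cut:]]
            nxt.append(child)
        pop = nxt
        best = max(pop + [best], key=fitness)
    return best

# stand-in fitness: reward masks close to a known "informative feature" pattern;
# in the thesis this slot is filled by the ELM's reconstruction accuracy
TARGET = [1, 0, 1, 0, 0, 1, 0, 0]
def fitness(mask):
    return -sum(m != t for m, t in zip(mask, TARGET))

best = evolve(fitness, len(TARGET))
print(best)
```

    In the GA-ELM setting, evaluating `fitness` would mean training an ELM on the buoy data restricted to the selected parameters and scoring the reconstruction, which is why the GA only needs the mask-to-score interface shown here.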

    Designing Cross-Company Business Intelligence Networks

    Business Intelligence (BI) is a well-established term for methods, concepts and tools to retrieve, store, deliver and analyze data for management and business purposes. Although collaboration across company borders has substantially increased over the past decades, little research has been conducted specifically on Cross-Company BI (CCBI). In this thesis, a working definition of CCBI and a distinction from general collaborative decision making are proposed. Based on a reference model that takes existing research and related approaches of adjacent fields into account, a peer-to-peer network design is created.
    Extensive simulation and parameter testing show that the design proves valuable and competitive with existing centralized approaches, and that obtaining a critical mass of participants improves the usefulness of the network. To quantify the observations, appropriate quality measures, rigorously derived from respected concepts on data and information quality and on multidimensional data models, are introduced and validated.