Correlation between centrality metrics and their application to the opinion model
In recent decades, a number of centrality metrics describing the network
properties of nodes have been proposed to rank their importance. In
order to understand the correlations between centrality metrics and to
approximate a high-complexity centrality metric by a strongly correlated
low-complexity metric, we first study the correlation between centrality
metrics in terms of their Pearson correlation coefficient and their similarity
in ranking of nodes. In addition to considering the widely used centrality
metrics, we introduce a new centrality measure, the degree mass. The m-th order
degree mass of a node is the sum of the weighted degrees of the node and of its
neighbors no further than m hops away. We find that the B_{n}, the closeness,
and the components of x_{1} are strongly correlated with the degree, the
1st-order degree mass and the 2nd-order degree mass, respectively, in both
network models and real-world networks. We then theoretically prove that the
Pearson correlation coefficient between x_{1} and the 2nd-order degree mass is
larger than that between x_{1} and a lower order degree mass. Finally, we
investigate the effect of the inflexible antagonists selected based on
different centrality metrics in helping one opinion to compete with another in
the inflexible antagonists opinion model. Interestingly, we find that selecting
the inflexible antagonists based on the leverage, the B_{n}, or the degree is
more effective in opinion competition than using other centrality metrics in
all types of networks. This observation is supported by our earlier findings,
i.e., that there is a strong linear correlation between the degree and the
B_{n}, as well as a high centrality similarity between the leverage and the
degree.
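The m-th order degree mass lends itself to a short illustration. Below is a minimal Python sketch, on a small hypothetical graph (the toy adjacency list and function names are not from the paper, and plain unweighted degrees are used for simplicity), that computes degree masses by breadth-first expansion and the Pearson correlation between the degree and the 1st-order degree mass:

```python
# Toy undirected graph as an adjacency list (hypothetical example network).
adj = {
    0: {1, 2, 3},
    1: {0, 2},
    2: {0, 1, 4},
    3: {0},
    4: {2},
}

def degree_mass(adj, node, m):
    """m-th order degree mass: sum of the degrees of `node` and of all
    neighbors no further than m hops away (m=0 gives the plain degree)."""
    seen = {node}
    frontier = {node}
    for _ in range(m):
        frontier = {v for u in frontier for v in adj[u]} - seen
        seen |= frontier
    return sum(len(adj[u]) for u in seen)

def pearson(xs, ys):
    """Pearson correlation coefficient of two equal-length sequences."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = sum((x - mx) ** 2 for x in xs) ** 0.5
    sy = sum((y - my) ** 2 for y in ys) ** 0.5
    return cov / (sx * sy)

deg = [degree_mass(adj, v, 0) for v in adj]   # plain degrees
dm1 = [degree_mass(adj, v, 1) for v in adj]   # 1st-order degree masses
print(deg, dm1, round(pearson(deg, dm1), 3))
```

Even on this tiny graph the degree and the 1st-order degree mass are almost perfectly correlated, which is the kind of relationship the abstract exploits to approximate expensive metrics with cheap ones.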
Power Grid Network Evolutions for Local Energy Trading
The shift towards an energy grid dominated by prosumers (consumers and
producers of energy) will inevitably have repercussions on the distribution
infrastructure. Today it is a hierarchical one designed to deliver energy from
large-scale facilities to end-users. Tomorrow it will be a capillary
infrastructure at the medium- and low-voltage levels that will support local
energy trading among prosumers. In our previous work, we analyzed the Dutch
Power Grid and made an initial analysis of the economic impact topological
properties have on decentralized energy trading. In this paper, we go one step
further and investigate how different network topologies and growth models
facilitate the emergence of a decentralized market. In particular, we show how
connectivity plays an important role in improving reliability and reducing
path costs. From the economic point of view, we estimate how these topological
evolutions facilitate local electricity distribution, taking into account the
main cost ingredient of increasing network connectivity, i.e., the price of
cabling.
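The reliability/path-cost trade-off can be illustrated with a minimal sketch. The pure-Python snippet below uses a hypothetical six-node feeder (not the Dutch grid data) and compares a radial topology with a lightly meshed one, taking the edge count as a crude proxy for cabling cost:

```python
from collections import deque

def build(edges):
    """Undirected adjacency list from an edge list."""
    adj = {}
    for u, v in edges:
        adj.setdefault(u, set()).add(v)
        adj.setdefault(v, set()).add(u)
    return adj

def avg_path_length(adj):
    """Mean shortest-path length over all node pairs (unweighted BFS)."""
    nodes = list(adj)
    total, pairs = 0, 0
    for s in nodes:
        dist = {s: 0}
        q = deque([s])
        while q:
            u = q.popleft()
            for v in adj[u]:
                if v not in dist:
                    dist[v] = dist[u] + 1
                    q.append(v)
        for t in nodes:
            if t != s:
                total += dist[t]
                pairs += 1
    return total / pairs

# Hypothetical radial feeder vs. the same feeder with two extra cables.
radial = [(0, 1), (1, 2), (2, 3), (3, 4), (4, 5)]
meshed = radial + [(0, 5), (1, 4)]

for name, edges in [("radial", radial), ("meshed", meshed)]:
    print(name, "cables:", len(edges),
          "avg path:", round(avg_path_length(build(edges)), 2))
```

Two extra cables (a ~40% cabling-cost increase in this toy case) cut the average path length substantially and remove the single points of failure of the radial design, which is the qualitative trade-off the abstract quantifies.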
Market Structure and Communicable Diseases
Communicable diseases pose a formidable challenge for public policy. Using numerical simulations, we show under which scenarios a monopolist's price and prevalence paths converge to a nonzero steady state. In contrast, a planner typically eradicates the disease. If eradication is impossible, the planner subsidizes treatments as long as the prevalence can be controlled. Drug resistance exacerbates the welfare difference between monopoly and first-best outcomes. Nevertheless, because the negative externalities from resistance compete with the positive externalities of treatment, a mixed competition/monopoly regime may perform better than competition alone. This result has important implications for the design of many drug patents.
Keywords: communicable disease, resistance, epidemiology, patent
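The paper's epidemic-economic model is not reproduced here, but the qualitative contrast between an endemic steady state and eradication can be sketched with a textbook SIS model in which treatment adds to the recovery rate. The uptake rates `tau_monopoly` and `tau_planner` below are illustrative assumptions, not the paper's calibration:

```python
def sis_steady_state(beta, recovery):
    """Endemic prevalence of the SIS model I' = beta*I*(1-I) - recovery*I.
    The endemic equilibrium is I* = 1 - recovery/beta, clipped at zero."""
    return max(0.0, 1 - recovery / beta)

beta  = 0.6   # transmission rate (assumed)
gamma = 0.1   # natural recovery rate (assumed)
tau_monopoly = 0.2   # partial treatment uptake under monopoly pricing (assumed)
tau_planner  = 0.6   # near-universal subsidized treatment (assumed)

print(sis_steady_state(beta, gamma + tau_monopoly))  # nonzero: endemic
print(sis_steady_state(beta, gamma + tau_planner))   # zero: eradicated
```

With the assumed parameters, the high monopoly price keeps treatment uptake low and prevalence settles at a positive level, while the planner's subsidy pushes the effective recovery rate above the transmission rate and drives prevalence to zero.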
Matroids are Immune to Braess Paradox
The famous Braess paradox describes the following phenomenon: It might happen
that the improvement of resources, like building a new street within a
congested network, may in fact lead to larger costs for the players in an
equilibrium. In this paper we consider general nonatomic congestion games and
give a characterization of the maximal combinatorial property of strategy
spaces for which Braess paradox does not occur. In a nutshell, bases of
matroids are exactly this maximal structure. We prove our characterization by
two novel sensitivity results for convex separable optimization problems over
polymatroid base polyhedra which may be of independent interest.
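The phenomenon itself is easy to reproduce numerically. The snippet below works through the textbook four-node Braess instance with unit demand and latencies x and 1; this classic example is standard and not specific to this paper's matroid characterization:

```python
# Classic Braess network: unit demand from s to t, two parallel paths.
# Latencies: s->A costs x (flow-dependent), A->t costs 1,
#            s->B costs 1,                  B->t costs x.

def cost_without_shortcut():
    """Equilibrium per-user cost before the new edge: by symmetry,
    half the flow uses each path, so each user pays 0.5 + 1."""
    x = 0.5
    return x + 1

def cost_with_shortcut():
    """Add a zero-cost A->B edge. In equilibrium all flow takes
    s->A->B->t: with x = 1 on both variable edges its cost is
    1 + 0 + 1 = 2, and the alternatives also cost 2 (1 + x = 2),
    so no user can improve by deviating."""
    return 1 + 0 + 1

print(cost_without_shortcut())  # before the "improvement"
print(cost_with_shortcut())     # after: everyone is worse off
```

Per-user cost rises from 1.5 to 2 after the free edge is added, which is exactly the paradox; the paper's result says that when every player's strategy space is the set of bases of a matroid, no such construction is possible.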
Making and breaking power laws in evolutionary algorithm population dynamics
Deepening our understanding of the characteristics and behaviors of population-based search algorithms remains an important ongoing challenge in Evolutionary Computation. To date, however, most studies of Evolutionary Algorithms have only been able to take place within tightly restricted experimental conditions. For instance, many analytical methods can only be applied to canonical algorithmic forms or can only evaluate evolution over simple test functions. Analysis of EA behavior under more complex conditions is needed to broaden our understanding of this population-based search process. This paper presents an approach to analyzing EA behavior that can be applied to a diverse range of algorithm designs and environmental conditions. The approach is based on evaluating an individual's impact on population dynamics using metrics derived from genealogical graphs.
From experiments conducted over a broad range of conditions, some important conclusions are drawn in this study. First, it is determined that very few individuals in an EA population have a significant influence on future population dynamics, with the impact size fitting a power law distribution. The power law distribution indicates there is a non-negligible probability that single individuals will dominate the entire population, irrespective of population size. Two EA design features are however found to cause strong changes to this aspect of EA behavior: i) the population topology and ii) the introduction of completely new individuals. If the EA population topology has a long path length, or if new (i.e. historically uncoupled) individuals are continually inserted into the population, then power law deviations are observed for large impact sizes. It is concluded that such EA designs cannot be dominated by a small number of individuals and hence should theoretically be capable of exhibiting higher degrees of parallel search behavior.
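As a sketch of the kind of power-law analysis involved, the snippet below draws synthetic "impact sizes" from a power law and recovers the exponent with the standard maximum-likelihood (Hill-type) estimator. The data are simulated, not genealogical-graph measurements from the paper:

```python
import math
import random

random.seed(42)

# Sample "impact sizes" from a continuous power law p(x) ~ x^(-alpha), x >= 1,
# via inverse-transform sampling: x = (1 - u)^(-1/(alpha - 1)) for u ~ U[0, 1).
alpha_true = 2.5
xs = [(1 - random.random()) ** (-1 / (alpha_true - 1)) for _ in range(50_000)]

# Maximum-likelihood estimate of the exponent for x >= xmin:
# alpha_hat = 1 + n / sum(ln(x_i / xmin)).
xmin = 1.0
alpha_hat = 1 + len(xs) / sum(math.log(x / xmin) for x in xs)
print(round(alpha_hat, 2))
```

A fitted exponent close to the true value, together with goodness-of-fit checks on the tail, is what supports claims like "impact sizes follow a power law"; deviations in the tail, as the abstract notes for long-path topologies, show up as a shortfall of very large impacts relative to this fit.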
Comparing the writing style of real and artificial papers
Recent years have witnessed increasing competition in science. While it
promotes the quality of research in many cases, intense competition among
scientists can also trigger unethical scientific behaviors. To increase the
total number of published papers, some authors even resort to software tools
that are able to produce grammatical, but meaningless scientific manuscripts.
Because automatically generated papers can be misunderstood as real papers, it
becomes of paramount importance to develop means to identify these scientific
frauds. In this paper, I devise a methodology to distinguish real manuscripts
from those generated with SCIGen, an automatic paper generator. Upon modeling
texts as complex networks (CN), it was possible to discriminate real from fake
papers with at least 89\% accuracy. A systematic analysis of feature
relevance revealed that accessibility and betweenness were useful in
particular cases, even though their relevance depended upon the dataset. The
successful application of the methods described here shows, as a proof of
principle, that network features can be used to identify scientific gibberish
papers. In addition, the CN-based approach can be combined in a straightforward
fashion with traditional statistical language processing methods to improve the
performance in identifying artificially generated papers. (To appear in Scientometrics, 2015.)
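A common way to model texts as complex networks is a word-adjacency (co-occurrence) graph. The sketch below builds one and computes a simple global feature (average degree); it is a generic illustration of the modeling step, not the exact feature set (accessibility, betweenness) used in the paper:

```python
from collections import defaultdict

def cooccurrence_network(text, window=1):
    """Word-adjacency network: link words appearing within `window`
    positions of each other (a common 'text as complex network' model)."""
    words = text.lower().split()
    adj = defaultdict(set)
    for i, w in enumerate(words):
        for j in range(i + 1, min(i + 1 + window, len(words))):
            if words[j] != w:          # no self-loops
                adj[w].add(words[j])
                adj[words[j]].add(w)
    return adj

def avg_degree(adj):
    """Mean number of distinct neighbors per word."""
    return sum(len(nbrs) for nbrs in adj.values()) / len(adj)

sample = "the cat sat on the mat and the cat slept"
adj = cooccurrence_network(sample)
print(len(adj), round(avg_degree(adj), 2))
```

Network measures computed on such graphs (degree statistics, betweenness, accessibility, and so on) become the feature vector fed to a classifier, which is how structural regularities of human writing can be separated from SCIGen-style gibberish.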