On the Stability of Community Detection Algorithms on Longitudinal Citation Data
There are fundamental differences between citation networks and other classes
of graphs. In particular, given that citation networks are directed and
acyclic, methods developed primarily for use with undirected social network
data may face obstacles. This is particularly true for the dynamic development
of community structure in citation networks. Namely, it is neither clear when
it is appropriate to employ existing community detection approaches nor is it
clear how to choose among existing approaches. Using simulated data, we attempt
to clarify the conditions under which one should use existing methods and which
of these algorithms is appropriate in a given context. We hope this paper will
serve as both a useful guidepost and an encouragement to those interested in
the development of more targeted approaches for use with longitudinal citation
data.
Comment: 17 pages, 7 figures; presented at Applications of Social Network Analysis 2009, ETH Zurich. Edited August 17, 2009: updated abstract, figures, text clarification.
Learning Bayesian Networks with the bnlearn R Package
bnlearn is an R package which includes several algorithms for learning the
structure of Bayesian networks with either discrete or continuous variables.
Both constraint-based and score-based algorithms are implemented, and can use
the functionality provided by the snow package to improve their performance via
parallel computing. Several network scores and conditional independence
algorithms are available for both the learning algorithms and independent use.
Advanced plotting options are provided by the Rgraphviz package.
Comment: 22 pages, 4 pictures
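bnlearn itself is an R package, but the score-based approach it implements can be illustrated in a language-neutral way. The sketch below scores two candidate DAG structures on toy discrete data with the BIC criterion, the core step of score-based structure learning; all names, data, and the tiny search (just comparing two structures) are invented for illustration and are not bnlearn's API.

```python
import math
import random
from collections import Counter

def bic_term(data, child, parents):
    """BIC contribution of one node given its parent set.

    data: list of dicts mapping variable name -> discrete value.
    Log-likelihood of child | parents minus a complexity penalty."""
    n = len(data)
    child_vals = sorted({row[child] for row in data})
    joint = Counter((tuple(row[p] for p in parents), row[child]) for row in data)
    marg = Counter(tuple(row[p] for p in parents) for row in data)
    ll = sum(c * math.log(c / marg[pa]) for (pa, _), c in joint.items())
    # penalty: (r_child - 1) free parameters per observed parent configuration
    penalty = 0.5 * math.log(n) * (len(child_vals) - 1) * len(marg)
    return ll - penalty

def bic(data, dag):
    """dag: dict node -> tuple of parents; BIC decomposes over nodes."""
    return sum(bic_term(data, v, ps) for v, ps in dag.items())

# Toy data: B tends to copy A, C is independent noise.
random.seed(0)
rows = []
for _ in range(500):
    a = random.randint(0, 1)
    b = a if random.random() < 0.9 else 1 - a
    c = random.randint(0, 1)
    rows.append({"A": a, "B": b, "C": c})

with_edge = {"A": (), "B": ("A",), "C": ()}   # A -> B
no_edges = {"A": (), "B": (), "C": ()}
print(bic(rows, with_edge) > bic(rows, no_edges))  # the dependent structure wins
```

A score-based learner such as hill climbing simply repeats this comparison over single-edge additions, deletions, and reversals until no move improves the score.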
Statistical Reliability Estimation of Microprocessor-Based Systems
What is the probability that the execution state of a given microprocessor running a given application is correct, in a certain working environment with a given soft-error rate? Trying to answer this question using fault injection can be very expensive and time consuming. This paper proposes the baseline for a new methodology, based on microprocessor error probability profiling, that aims at estimating fault injection results without the need for a typical fault injection setup. The proposed methodology is based on two main ideas: a one-time fault-injection analysis of the microprocessor architecture to characterize the probability of successful execution of each of its instructions in the presence of a soft error, and a static and very fast analysis of the control and data flow of the target software application to compute its probability of success. The presented work goes beyond the dependability evaluation problem; it also has the potential to become the backbone for new tools able to help engineers choose the best hardware and software architecture to structurally maximize the probability of a correct execution of the target software.
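The composition step described above can be sketched in a few lines: assuming per-instruction success probabilities from the one-time fault-injection characterization and independent faults, the program's success probability is the product over all dynamically executed instructions. All probabilities and instruction counts below are invented placeholders, not measured values from the paper.

```python
# Hypothetical per-instruction success probabilities, as a one-time
# fault-injection characterization of the ISA might produce (values invented).
p_success = {"add": 0.9999, "mul": 0.9997, "load": 0.9990,
             "store": 0.9992, "branch": 0.9995}

def program_success_probability(instruction_counts):
    """Static estimate: assuming independent soft errors, multiply the
    success probability of every dynamically executed instruction."""
    p = 1.0
    for opcode, count in instruction_counts.items():
        p *= p_success[opcode] ** count
    return p

# Counts as a (hypothetical) static control/data-flow analysis might yield.
counts = {"add": 1200, "mul": 300, "load": 800, "store": 400, "branch": 500}
print(program_success_probability(counts))
```

Because the estimate is a product over a static profile, re-evaluating it for a different compiler setting or algorithmic variant is essentially free, which is what enables the architecture-comparison use case the abstract mentions.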
Optimization of Free Space Optical Wireless Network for Cellular Backhauling
With densification of nodes in cellular networks, free space optic (FSO)
connections are becoming an appealing low cost and high rate alternative to
copper and fiber as the backhaul solution for wireless communication systems.
To ensure a reliable cellular backhaul, provisions for redundant, disjoint
paths between the nodes must be made in the design phase. This paper aims at
finding a cost-effective solution to upgrade the cellular backhaul with
pre-deployed optical fibers using FSO links and mirror components. Since the
quality of the FSO links depends on several factors, such as transmission
distance, power, and weather conditions, we adopt an elaborate formulation to
calculate link reliability. We present a novel integer linear programming model
to approach optimal FSO backhaul design, guaranteeing K-disjoint paths
connecting each node pair. Next, we derive a column generation method for a
path-oriented mathematical formulation. Applying the method in a sequential
manner enables high computational scalability. We use realistic scenarios to
demonstrate that our approaches efficiently provide optimal or near-optimal
solutions, and thereby allow for accurately dealing with the trade-off between
cost and reliability.
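The reliability side of this trade-off can be sketched as follows: an FSO link's reliability degrades with distance and weather-dependent attenuation, a path works only if all its hops work, and disjoint paths fail only if every path fails. The attenuation coefficients and the simple Beer-Lambert-style mapping below are illustrative assumptions, not the paper's elaborate formulation.

```python
# Illustrative atmospheric attenuation in dB/km (typical orders of magnitude
# for clear air, haze, and fog; assumed values, not from the paper).
ALPHA = {"clear": 0.2, "haze": 4.0, "fog": 40.0}

def link_reliability(distance_km, weather):
    """Toy model: the received-power fraction under exponential (Beer-Lambert)
    attenuation, used directly as a reliability proxy."""
    loss_db = ALPHA[weather] * distance_km
    return 10.0 ** (-loss_db / 10.0)

def path_reliability(links):
    """A path works only if every FSO hop works (independence assumed)."""
    r = 1.0
    for distance_km, weather in links:
        r *= link_reliability(distance_km, weather)
    return r

def disjoint_paths_reliability(paths):
    """With disjoint paths, the connection fails only if all paths fail."""
    fail = 1.0
    for p in paths:
        fail *= 1.0 - path_reliability(p)
    return 1.0 - fail

primary = [(1.0, "clear"), (1.5, "clear")]   # two short clear-air hops
backup = [(2.0, "haze")]                     # one longer, hazier hop
print(disjoint_paths_reliability([primary, backup]))
```

Even a low-reliability backup path measurably raises end-to-end availability, which is why the design problem trades redundant-path cost against reliability rather than simply minimizing cost.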
Towards the fast and robust optimal design of Wireless Body Area Networks
Wireless body area networks are wireless sensor networks whose adoption has
recently emerged and spread in important healthcare applications, such as the
remote monitoring of health conditions of patients. A major issue associated
with the deployment of such networks is represented by energy consumption: in
general, the batteries of the sensors cannot be easily replaced and recharged,
so containing the usage of energy by a rational design of the network and of
the routing is crucial. Another issue is represented by traffic uncertainty:
body sensors may produce data at a variable rate that is not exactly known in
advance, for example because the generation of data is event-driven. Neglecting
traffic uncertainty may lead to wrong design and routing decisions, which may
compromise the functionality of the network and have very bad effects on the
health of the patients. In order to address these issues, in this work we
propose the first robust optimization model for jointly optimizing the topology
and the routing in body area networks under traffic uncertainty. Since the
problem may result challenging even for a state-of-the-art optimization solver,
we propose an original optimization algorithm that exploits suitable linear
relaxations to guide a randomized fixing of the variables, supported by an
exact large variable neighborhood search. Experiments on realistic instances
indicate that our algorithm performs better than a state-of-the-art solver,
quickly producing solutions associated with improved optimality gaps.
Comment: Authors' manuscript version of the paper that was published in Applied Soft Computing
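The relaxation-guided randomized fixing described above can be sketched generically: each binary variable is fixed to 1 with probability equal to its value in the linear relaxation, and the best feasible draw is kept. The toy knapsack instance standing in for the network design problem, and the pretend relaxation values, are invented for illustration; the paper's actual algorithm additionally refines the result with an exact large variable neighborhood search, which is omitted here.

```python
import random

def randomized_fixing(frac, feasible, objective, rounds=200, seed=42):
    """Relaxation-guided randomized fixing (generic sketch): fix each binary
    variable to 1 with probability equal to its LP-relaxation value and keep
    the best feasible draw."""
    rng = random.Random(seed)
    best, best_val = None, float("-inf")
    for _ in range(rounds):
        x = {j: 1 if rng.random() < v else 0 for j, v in frac.items()}
        if feasible(x):
            val = objective(x)
            if val > best_val:
                best, best_val = x, val
    return best, best_val

# Toy knapsack standing in for the design problem: maximize profit, capacity 10.
profit = {0: 10, 1: 7, 2: 5, 3: 3}
weight = {0: 6, 1: 5, 2: 4, 3: 3}
frac = {0: 1.0, 1: 0.8, 2: 0.2, 3: 0.1}   # pretend LP-relaxation values
ok = lambda x: sum(weight[j] * x[j] for j in x) <= 10
obj = lambda x: sum(profit[j] * x[j] for j in x)
best, val = randomized_fixing(frac, ok, obj)
print(val)
```

Biasing the fixing by the relaxation concentrates the random search around the fractional optimum, which is what makes the approach competitive with exact solvers on hard instances.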
A Scalable Null Model for Directed Graphs Matching All Degree Distributions: In, Out, and Reciprocal
Degree distributions are arguably the most important property of real world
networks. The classic edge configuration model or Chung-Lu model can generate
an undirected graph with any desired degree distribution. This serves as a good
null model to compare algorithms or perform experimental studies. Furthermore,
there are scalable algorithms that implement these models and they are
invaluable in the study of graphs. However, networks in the real-world are
often directed, and have a significant proportion of reciprocal edges. A
stronger relation exists between two nodes when they each point to one another
(reciprocal edge) as compared to when only one points to the other (one-way
edge). Despite their importance, reciprocal edges have been disregarded by most
directed graph models.
We propose a null model for directed graphs inspired by the Chung-Lu model
that matches the in-, out-, and reciprocal-degree distributions of the real
graphs. Our algorithm is scalable and requires O(m) random numbers to
generate a graph with m edges. We perform a series of experiments on real
datasets and compare with existing graph models.
Comment: Camera-ready version for IEEE Workshop on Network Science; fixed some typos in tables
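The idea can be sketched as a reciprocal-aware variant of Chung-Lu sampling: one-way edges pick a source and a target proportionally to out- and in-degrees, while reciprocal edges pick a pair proportionally to reciprocal degrees and add both directions. This is a simplified illustration of the model class, not the paper's exact algorithm; like Chung-Lu, it discards self-loops and duplicates, and it draws a constant number of random values per edge.

```python
import random

def reciprocal_chung_lu(d_out, d_in, d_rec, seed=0):
    """Sketch of a Chung-Lu-style null model with reciprocal edges.

    One-way edges: (source, target) drawn proportionally to out-/in-degrees.
    Reciprocal edges: an unordered pair drawn proportionally to reciprocal
    degrees, inserted in both directions."""
    rng = random.Random(seed)
    nodes = range(len(d_out))
    m_one_way = sum(d_out)        # one-way edges to place
    m_rec = sum(d_rec) // 2       # reciprocal pairs to place
    edges = set()
    srcs = rng.choices(nodes, weights=d_out, k=m_one_way)
    dsts = rng.choices(nodes, weights=d_in, k=m_one_way)
    for u, v in zip(srcs, dsts):
        if u != v:                # discard self-loops, as in Chung-Lu
            edges.add((u, v))
    us = rng.choices(nodes, weights=d_rec, k=m_rec)
    vs = rng.choices(nodes, weights=d_rec, k=m_rec)
    for u, v in zip(us, vs):
        if u != v:
            edges.add((u, v))
            edges.add((v, u))     # reciprocal pair: both directions
    return edges

# Toy degree sequences for 5 nodes (invented for illustration).
edges = reciprocal_chung_lu(d_out=[3, 2, 1, 0, 0],
                            d_in=[0, 1, 1, 2, 2],
                            d_rec=[2, 0, 0, 1, 1])
print(len(edges))
```

Because each edge consumes a fixed number of weighted draws, the generator scales linearly in the edge count, matching the O(m) behavior the abstract highlights.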
Using the general link transmission model in a dynamic traffic assignment to simulate congestion on urban networks
This article presents two new models of Dynamic User Equilibrium that are particularly suited for ITS applications, where the evolution of vehicle flows and travel times must be simulated on large road networks, possibly in real time. The key feature of the proposed models is the detailed representation of the main congestion phenomena occurring at the nodes of urban networks, such as vehicle queues and their spillback, as well as flow conflicts at merges and diversions. Compared to the simple world of static assignment, where only the congestion along the arc is typically reproduced through a separable relation between vehicle flow and travel time, this type of DTA model is much more complex, as the above relation becomes non-separable, both in time and in space.
Traffic simulation is here attained through a macroscopic flow model that extends the theory of kinematic waves to urban networks and non-linear fundamental diagrams: the General Link Transmission Model. The sub-models of the GLTM, namely the Node Intersection Model, the Forward Propagation Model of vehicles, and the Backward Propagation Model of spaces, can be combined in two different ways to produce arc travel times starting from turn flows. The first approach is to consider short time intervals of a few seconds and to process all nodes for each temporal layer in chronological order. The second approach allows for long time intervals of a few minutes but requires each sub-model to process the whole temporal profile of the involved variables. The two resulting DTA models are here analyzed and compared with the aim of identifying their possible use cases.
A rigorous mathematical formulation is out of the scope of this paper, as well as a detailed explanation of the solution algorithm.
The dynamic equilibrium is nevertheless sought through a new method based on Gradient Projection, which can solve both proposed models with any desired precision in a reasonable number of iterations. Its fast convergence is essential to show that the two proposed models for network congestion actually converge at equilibrium to nearly identical solutions in terms of arc flows and travel times, despite their diametrically opposed approaches with respect to the dynamic nature of the problem, as shown in the numerical tests presented here.
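The forward/backward propagation idea at the heart of the link transmission approach can be illustrated on a single link with Newell-style cumulative counts: the sending flow is limited by vehicles whose free-flow arrival time has passed, and the receiving flow by the space released by the backward wave plus the jam storage. The parameter values and the discretization below are illustrative assumptions, not the GLTM's general formulation.

```python
DT = 1.0  # time step [s]

def simulate_link(demand, length=200.0, v=15.0, w=5.0, kj=0.15, qmax=0.5):
    """Single-link sketch of a Link Transmission Model with a triangular
    fundamental diagram (free-flow speed v, backward wave speed w, jam
    density kj, capacity qmax; illustrative values). N_up / N_down are
    Newell cumulative vehicle counts at the link's upstream/downstream ends."""
    steps = len(demand)
    N_up = [0.0] * (steps + 1)
    N_down = [0.0] * (steps + 1)
    lag_v = int(length / v / DT)   # forward kinematic wave lag, in steps
    lag_w = int(length / w / DT)   # backward kinematic wave lag, in steps
    for t in range(steps):
        # receiving flow: space released by the backward wave plus jam storage
        R = min(qmax * DT, N_down[max(t - lag_w, 0)] + kj * length - N_up[t])
        N_up[t + 1] = N_up[t] + min(demand[t] * DT, max(R, 0.0))
        # sending flow: vehicles whose free-flow arrival time has passed
        S = min(qmax * DT, N_up[max(t - lag_v, 0)] - N_down[t])
        N_down[t + 1] = N_down[t] + max(S, 0.0)
    return N_up, N_down

# 60 s of demand below capacity: the link passes traffic without queuing.
inflow, outflow = simulate_link([0.4] * 60)
print(round(inflow[-1], 1), round(outflow[-1], 1))
```

The two solution strategies contrasted in the abstract correspond to how such updates are orchestrated on a network: either all nodes are processed per short time step in chronological order, or each sub-model processes the whole temporal profile over long intervals.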