Trade-Offs in Distributed Interactive Proofs
The study of interactive proofs in the context of distributed network computing is a novel topic, recently introduced by Kol, Oshman, and Saxena [PODC 2018]. In the spirit of the theory of sequential interactive proofs, we study the power of distributed interactive proofs. This is achieved via a series of results establishing trade-offs between various parameters impacting the power of interactive proofs, including the number of interactions, the certificate size, the communication complexity, and the form of randomness used. Our results also connect distributed interactive proofs with the established field of distributed verification. In general, our results contribute to providing structure to the landscape of distributed interactive proofs.
Synchronous Context-Free Grammars and Optimal Linear Parsing Strategies
Synchronous Context-Free Grammars (SCFGs), also known as syntax-directed translation schemata, are unlike context-free grammars in that they do not have a binary normal form. In general, parsing with SCFGs takes space and time polynomial in the length of the input strings, but with the degree of the polynomial depending on the permutations of the SCFG rules. We consider linear parsing strategies, which add one nonterminal at a time. We show that, for a given input permutation, the problems of finding the linear parsing strategy with the minimum space complexity and with the minimum time complexity are both NP-hard.
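To give a feel for what a linear parsing strategy costs, here is a minimal sketch under a simplified cost model (an assumption, not the paper's exact measure): the cost of a step is the number of maximal covered intervals on the source side plus on the target side, and the cost of a strategy is the worst step.

```python
import itertools

def boundaries(positions):
    """Number of maximal runs of consecutive integers in a set."""
    s = sorted(positions)
    return sum(1 for i, p in enumerate(s) if i == 0 or s[i - 1] != p - 1)

def strategy_cost(perm, order):
    """Cost of a linear parsing strategy for the permutation perm
    (perm[i] = target position of source nonterminal i) when source
    positions are added one at a time in the given order.  Step cost =
    covered intervals on the source side + on the target side; the
    maximum over all steps serves as a proxy for the exponent of the
    parsing complexity.  (Illustrative cost model, not the paper's.)"""
    covered, worst = set(), 0
    for i in order:
        covered.add(i)
        worst = max(worst,
                    boundaries(covered) +
                    boundaries({perm[j] for j in covered}))
    return worst

# The permutation (2,4,1,3) -- 0-based [1,3,0,2] -- is the classic
# example with no binary parse; every linear order pays an extra
# interval at some step, so the best achievable worst-step cost is 3.
perm = [1, 3, 0, 2]
best = min(strategy_cost(perm, o)
           for o in itertools.permutations(range(4)))
print(best)  # → 3
```

Minimising this quantity over all orders by brute force is exponential; the NP-hardness result above says that, in general, no efficient shortcut is expected.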
Analyzing and Comparing On-Line News Sources via (Two-Layer) Incremental Clustering
In this paper, we analyse the contents of the web sites of two Italian press agencies and of four of the most popular Italian newspapers, in order to answer questions such as: what are the most relevant news items, what is the average life of a news item, and how different are the various sites. To this aim, we have developed a web-based application which hourly collects the articles in the main column of the six web sites, implements an incremental clustering algorithm for grouping the articles into news items, and finally allows the user to see the answers to the above questions. We have also designed and implemented a two-layer modification of the incremental clustering algorithm and carried out a preliminary experimental evaluation of this modification: it turns out that the two-layer clustering is extremely efficient in terms of running time, and achieves quite good precision and recall.
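The abstract does not spell out the clustering algorithm, but the general shape of incremental clustering of articles into news items can be sketched as follows (the threshold, the bag-of-words representation, and the centroid update are all illustrative assumptions):

```python
import math
from collections import Counter

def cosine(a, b):
    """Cosine similarity between two bag-of-words Counters."""
    dot = sum(a[w] * b[w] for w in a if w in b)
    na = math.sqrt(sum(v * v for v in a.values()))
    nb = math.sqrt(sum(v * v for v in b.values()))
    return dot / (na * nb) if na and nb else 0.0

def incremental_cluster(articles, threshold=0.3):
    """Assign each incoming article to the most similar existing news
    cluster, or open a new cluster if no centroid reaches the
    threshold.  Returns a list of clusters as lists of article indices.
    (Sketch only; the paper's actual algorithm may differ.)"""
    centroids, clusters = [], []
    for idx, text in enumerate(articles):
        vec = Counter(text.lower().split())
        best, best_sim = None, threshold
        for c, centroid in enumerate(centroids):
            sim = cosine(vec, centroid)
            if sim >= best_sim:
                best, best_sim = c, sim
        if best is None:
            centroids.append(vec)
            clusters.append([idx])
        else:
            centroids[best] += vec   # centroid = sum of member vectors
            clusters[best].append(idx)
    return clusters

docs = ["earthquake hits central italy",
        "strong earthquake in central italy kills dozens",
        "serie a juventus wins derby"]
print(incremental_cluster(docs))  # → [[0, 1], [2]]
```

Because each article is compared only against cluster centroids, not against all previous articles, such a scheme processes an hourly stream cheaply, which is consistent with the efficiency reported above.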
Into the Square: On the Complexity of Some Quadratic-time Solvable Problems
We analyze several quadratic-time solvable problems, and we show that these problems are not solvable in truly subquadratic time (that is, in time O(n^(2−ε)) for some ε > 0), unless the well-known Strong Exponential Time Hypothesis (in short, SETH) is false. In particular, we start from an artificial quadratic-time solvable variation of the k-SAT problem (already introduced and used in the literature) and we construct a web of Karp reductions, proving that a truly subquadratic-time algorithm for any of the problems in the web falsifies SETH. Some of these results were already known, while others are, as far as we know, new. The new problems considered are: computing the betweenness centrality of a vertex (the same result was proved independently by Abboud et al.), computing the minimum closeness centrality in a graph, computing the hyperbolicity of a graph, and computing the subset graph of a collection of sets. On the other hand, we show that testing if a directed graph is transitive and testing if a graph is a comparability graph are subquadratic-time solvable (our algorithms are practical, since they are not based on intricate matrix multiplication algorithms).
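For concreteness, here is the standard quadratic-time baseline that the hardness result targets for one of these problems: computing the betweenness centrality of a single vertex via Brandes-style dependency accumulation, one BFS per source (so O(nm) time on sparse unweighted graphs). This is a textbook sketch, not the paper's reduction machinery.

```python
from collections import deque

def betweenness_of(graph, v):
    """Betweenness centrality of vertex v in an unweighted graph given
    as a dict node -> list of neighbours, counted over ordered (s, t)
    pairs (halve the result for undirected graphs).  One BFS per
    source: the quadratic baseline that, under SETH, cannot be
    improved to truly subquadratic time."""
    bc = 0.0
    for s in graph:
        if s == v:
            continue
        # BFS from s, recording shortest-path counts and predecessors.
        sigma = {u: 0 for u in graph}; sigma[s] = 1
        dist = {u: -1 for u in graph}; dist[s] = 0
        preds = {u: [] for u in graph}
        order, queue = [], deque([s])
        while queue:
            u = queue.popleft()
            order.append(u)
            for w in graph[u]:
                if dist[w] < 0:
                    dist[w] = dist[u] + 1
                    queue.append(w)
                if dist[w] == dist[u] + 1:
                    sigma[w] += sigma[u]
                    preds[w].append(u)
        # Back-propagate dependencies from the farthest nodes inward.
        delta = {u: 0.0 for u in graph}
        for w in reversed(order):
            for u in preds[w]:
                delta[u] += sigma[u] / sigma[w] * (1 + delta[w])
        bc += delta[v]
    return bc

# On the path a - v - b, v lies on the two shortest paths a->b and b->a.
path = {"a": ["v"], "v": ["a", "b"], "b": ["v"]}
print(betweenness_of(path, "v"))  # → 2.0
```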
Flooding in Dynamic Networks
This note summarises our work on flooding in dynamic networks. These networks are defined from a Markovian process with parameters p and q, generating sequences of graphs G_1, G_2, ... over a same set of vertices, such that G_{t+1} is obtained from G_t as follows: if an edge is absent from G_t then it appears in G_{t+1} with probability p, and if it is present in G_t then it disappears from G_{t+1} with probability q. Clementi et al. (PODC 2008) analysed several information dissemination processes in such networks, and in particular established a set of bounds on the performance of flooding. Flooding is an elementary protocol in which every node that learns a piece of information at some time step retransmits it to all its neighbours at every subsequent step. Obviously, despite its advantages in terms of simplicity and robustness, the flooding protocol suffers from an excessive use of bandwidth resources. In this note, we show that flooding in dynamic networks can be implemented so as to limit the number of retransmissions of a same piece of information, while preserving its performance in terms of the time needed for a piece of information to reach all the nodes of the network. The main difficulty of our study lies in the temporal dependencies between the network's connections at different time steps.
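The edge-Markovian model and the flooding protocol described above are easy to simulate directly. The sketch below (parameter values and the choice of starting from the empty graph are illustrative assumptions) measures how many steps plain flooding needs to inform every node:

```python
import random

def flood_time(n, p, q, seed=0):
    """Simulate flooding on an edge-Markovian dynamic graph with n
    nodes: at each step, a missing edge appears with probability p and
    an existing edge disappears with probability q.  Every informed
    node transmits to all its current neighbours at each step; returns
    the number of steps until all nodes are informed.  (Plain flooding
    only; the note's point is that retransmissions can be limited
    without hurting this completion time.)"""
    rng = random.Random(seed)
    edges = set()      # start from the empty graph
    informed = {0}     # node 0 holds the information initially
    steps = 0
    while len(informed) < n:
        # One Markovian transition of the whole edge set.
        nxt = set()
        for u in range(n):
            for v in range(u + 1, n):
                alive = (u, v) in edges
                if (alive and rng.random() >= q) or \
                   (not alive and rng.random() < p):
                    nxt.add((u, v))
        edges = nxt
        # One flooding round: snapshot, then inform across live edges.
        snapshot = informed
        new = set()
        for (u, v) in edges:
            if u in snapshot:
                new.add(v)
            if v in snapshot:
                new.add(u)
        informed = snapshot | new
        steps += 1
    return steps

print(flood_time(50, p=0.05, q=0.3))
```

Counting, in the same loop, how many (edge, step) transmissions actually deliver new information versus how many are wasted illustrates precisely the bandwidth overuse the note sets out to curb.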
On Computing the Diameter of (Weighted) Link Streams
A weighted link stream is a pair (V, E) comprising V, the set of nodes, and E, the list of temporal edges (u, v, t, λ), where u and v are two nodes in V, t is the starting time of the temporal edge, and λ is its travel time. By making use of this model, different notions of diameter can be defined, which refer to the following distances: earliest arrival time, latest departure time, fastest time, and shortest time. After proving that none of these diameters can be computed in time subquadratic with respect to the number of temporal edges, we propose different algorithms (inspired by the approach used for computing the diameter of graphs) which allow us to compute, very efficiently in practice, the diameter of quite large real-world weighted link streams for several definitions of the diameter. Indeed, all the proposed algorithms very often require a very low number of single-source (or single-target) best path computations. We verify the effectiveness of our approach by means of an extensive set of experiments on real-world link streams. We also experimentally show that the temporal version of the well-known 2-sweep technique for computing a lower bound on the diameter of a graph is quite effective in the case of weighted link streams, very often returning tight bounds.
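The single-source best path computations mentioned above can be sketched for the earliest-arrival-time distance: a single pass over the temporal edges sorted by starting time suffices. (A sketch under simplifying assumptions: links are traversable in both directions, and an edge starting exactly at the arrival time may be taken.)

```python
def earliest_arrival(edges, source, t0=0):
    """Single-source earliest-arrival times on a weighted link stream.
    edges: list of temporal edges (u, v, t, lam) -- at time t one can
    cross between u and v, arriving at t + lam.  One pass over the
    edges in order of starting time computes, for every reachable
    node, the earliest time it can be reached from source after t0."""
    INF = float("inf")
    arr = {source: t0}
    for u, v, t, lam in sorted(edges, key=lambda e: e[2]):
        for a, b in ((u, v), (v, u)):
            # The edge is usable from a if we are already there by time t.
            if arr.get(a, INF) <= t and t + lam < arr.get(b, INF):
                arr[b] = t + lam
    return arr

# a reaches c at time 4 via b; the later direct link (t=10) is worse.
stream = [("a", "b", 1, 2), ("b", "c", 3, 1), ("a", "c", 10, 1)]
print(earliest_arrival(stream, "a"))  # → {'a': 0, 'b': 3, 'c': 4}
```

Running such a computation from a few well-chosen sources, rather than from all of them, is exactly what makes the diameter algorithms fast in practice despite the subquadratic-time lower bound.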