
    On the Complexity of Distributed Splitting Problems

    One of the fundamental open problems in the area of distributed graph algorithms is the question of whether randomization is needed for efficient symmetry breaking. While there are fast, $\text{poly}\log n$-time randomized distributed algorithms for all of the classic symmetry breaking problems, for many of them, the best deterministic algorithms are almost exponentially slower. The following basic local splitting problem, known as the \emph{weak splitting} problem, takes a central role in this context: Each node of a graph $G=(V,E)$ has to be colored red or blue such that each node of sufficiently large degree has at least one node of each color among its neighbors. Ghaffari, Kuhn, and Maus [STOC '17] showed that this seemingly simple problem is complete w.r.t. the above fundamental open question in the following sense: If there is an efficient $\text{poly}\log n$-time deterministic distributed algorithm for weak splitting, then there is such an algorithm for all locally checkable graph problems for which an efficient randomized algorithm exists. In this paper, we investigate the distributed complexity of weak splitting and some closely related problems. For example, we obtain efficient algorithms for special cases of weak splitting where the graph is nearly regular. In particular, we show that if $\delta$ and $\Delta$ are the minimum and maximum degrees of $G$ and if $\delta=\Omega(\log n)$, weak splitting can be solved deterministically in time $O\big(\frac{\Delta}{\delta}\cdot\text{poly}(\log n)\big)$. Further, if $\delta = \Omega(\log\log n)$ and $\Delta\leq 2^{\varepsilon\delta}$, there is a randomized algorithm with time complexity $O\big(\frac{\Delta}{\delta}\cdot\text{poly}(\log\log n)\big)$.
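    To make the problem statement concrete, here is a minimal sketch (hypothetical names, and a plain dictionary-of-sets graph representation chosen for illustration) that checks whether a red/blue coloring is a weak splitting for a given degree threshold, together with the naive uniformly random coloring that explains why randomization handles this problem easily when degrees are large.

```python
import random

def is_weak_splitting(adj, color, d0):
    """Verify the weak splitting condition: every node whose degree is
    at least d0 ("sufficiently large") must see at least one red and
    one blue neighbor. `adj` maps each node to its set of neighbors."""
    for v, nbrs in adj.items():
        if len(nbrs) >= d0 and {color[u] for u in nbrs} != {"red", "blue"}:
            return False
    return True

def uniform_random_coloring(adj):
    """Color each node red or blue independently, uniformly at random.
    A node of degree d sees only a single color among its neighbors
    with probability 2**(1 - d), so if all constrained degrees are
    Omega(log n), a union bound over the n nodes yields a valid weak
    splitting with high probability."""
    return {v: random.choice(("red", "blue")) for v in adj}

# Example: a path on four nodes; with d0 = 2 only the two interior
# nodes are constrained, so the coloring is valid exactly when each
# of them sees both colors.
adj = {0: {1}, 1: {0, 2}, 2: {1, 3}, 3: {2}}
print(is_weak_splitting(adj, uniform_random_coloring(adj), d0=2))
```

    This one-shot random choice is the standard argument that weak splitting is easy for randomized algorithms; the difficulty the paper addresses is matching it deterministically.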

    A randomised primal-dual algorithm for distributed radio-interferometric imaging

    Next generation radio telescopes, like the Square Kilometre Array, will acquire an unprecedented amount of data for radio astronomy. The development of fast, parallelisable or distributed algorithms for handling such large-scale data sets is of prime importance. Motivated by this, we investigate herein a convex optimisation algorithmic structure, based on primal-dual forward-backward iterations, for solving the radio interferometric imaging problem. It can encompass any convex prior of interest. It allows for the distributed processing of the measured data and introduces further flexibility by employing a probabilistic approach for the selection of the data blocks used at a given iteration. We study the reconstruction performance with respect to the data distribution and we propose the use of nonuniform probabilities for the randomised updates. Our simulations show the feasibility of the randomisation given a limited computing infrastructure as well as important computational advantages when compared to state-of-the-art algorithmic structures.
    Comment: 5 pages, 3 figures, Proceedings of the European Signal Processing Conference (EUSIPCO) 2016. Related journal publication available at https://arxiv.org/abs/1601.0402
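    As a rough illustration of this algorithmic structure (a sketch only, not the authors' implementation: the l1-regularised least-squares model, the block partition, the probabilities, and the step sizes below are all assumptions), the following code runs primal-dual forward-backward iterations in which each data block's dual variable is updated at a given iteration only with its own probability, which is what allows nonuniform randomised updates.

```python
import numpy as np

def soft_threshold(x, t):
    """Proximity operator of t * ||.||_1 (the l1 prior's prox)."""
    return np.sign(x) * np.maximum(np.abs(x) - t, 0.0)

def randomised_primal_dual(Phi_blocks, y_blocks, probs, lam,
                           n_iter=500, seed=0):
    """Primal-dual forward-backward iterations for
        min_x  lam * ||x||_1 + sum_i 0.5 * ||Phi_i @ x - y_i||^2,
    where block i's dual variable is updated at each iteration only
    with probability probs[i] (the randomised block selection)."""
    rng = np.random.default_rng(seed)
    n = Phi_blocks[0].shape[1]
    x = np.zeros(n)
    v = [np.zeros(Phi.shape[0]) for Phi in Phi_blocks]
    # Step sizes must satisfy tau * sigma * ||Phi||^2 < 1 for the
    # stacked measurement operator Phi.
    L = np.linalg.norm(np.vstack(Phi_blocks), 2)
    tau = sigma = 0.9 / L
    for _ in range(n_iter):
        # Forward (gradient-like) step through the adjoint operators,
        # then backward (prox) step on the l1 prior.
        grad = sum(Phi.T @ vi for Phi, vi in zip(Phi_blocks, v))
        x_new = soft_threshold(x - tau * grad, tau * lam)
        x_bar = 2.0 * x_new - x  # extrapolated primal point
        # Dual updates on a randomly selected subset of blocks only;
        # unselected blocks keep their previous dual variable.
        for i, (Phi, yi) in enumerate(zip(Phi_blocks, y_blocks)):
            if rng.random() < probs[i]:
                u = v[i] + sigma * (Phi @ x_bar)
                # Prox of sigma times the conjugate of 0.5*||. - y_i||^2.
                v[i] = (u - sigma * yi) / (1.0 + sigma)
        x = x_new
    return x
```

    Blocks held on slower or busier compute nodes can be given smaller probabilities, which is the kind of nonuniform selection under a limited computing infrastructure that the paper studies.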

    Multi-criteria scheduling of pipeline workflows

    Mapping workflow applications onto parallel platforms is a challenging problem, even for simple application patterns such as pipeline graphs. Several antagonistic criteria should be optimized, such as throughput and latency (or a combination of both). In this paper, we study the complexity of the bi-criteria mapping problem for pipeline graphs on communication-homogeneous platforms. In particular, we assess the complexity of the well-known chains-to-chains problem for different-speed processors, which turns out to be NP-hard. We provide several efficient polynomial bi-criteria heuristics, and their relative performance is evaluated through extensive simulations.
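    For context on the chains-to-chains problem: a chain of tasks must be split into consecutive intervals, one per processor, and on same-speed processors minimising the largest interval load (which fixes the period, the inverse of throughput) is polynomial. The sketch below solves that homogeneous variant by binary search on the period with a greedy feasibility check (illustrative only, assuming integer task weights; it is not the paper's heuristics, and the different-speed variant shown NP-hard there admits no such simple scheme unless P = NP).

```python
def chains_to_chains_homogeneous(weights, p):
    """Split a chain of integer task weights into at most p consecutive
    intervals, minimising the largest interval sum (the period).
    Binary search on the period, with a greedy feasibility check."""
    def fits(period):
        # Greedily open a new interval whenever the current one would
        # overflow; this uses the fewest intervals for a given period.
        intervals, load = 1, 0
        for w in weights:
            if load + w > period:
                intervals, load = intervals + 1, w
            else:
                load += w
        return intervals <= p
    lo, hi = max(weights), sum(weights)
    while lo < hi:
        mid = (lo + hi) // 2
        if fits(mid):
            hi = mid       # feasible: try a smaller period
        else:
            lo = mid + 1   # infeasible: the period must grow
    return lo

# Example: 8 tasks on 3 same-speed processors.
print(chains_to_chains_homogeneous([2, 5, 1, 4, 3, 6, 2, 2], 3))  # -> 10
```

    With different-speed processors the interval loads must additionally be matched to processor speeds, and it is this assignment that makes the problem NP-hard.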