
    Power assignment problems in wireless communication

    A fundamental class of problems in wireless communication is concerned with the assignment of suitable transmission powers to wireless devices/stations such that the resulting communication graph satisfies certain desired properties and the overall energy consumed is minimized. Many concrete communication tasks in a wireless network, like broadcast, multicast, point-to-point routing, and creation of a communication backbone, can be regarded as such a power assignment problem. This paper considers several problems of that kind; for example, one problem studied before by Bilò et al. (Geometric Clustering to Minimize the Sum of Cluster Sizes, ESA 2005) and Alt et al. (Minimum-cost coverage of point sets by disks, SCG 2006) aims to select and assign powers to $k$ of the stations such that all other stations are within reach of at least one of the selected stations. We improve the running time for obtaining a $(1+\epsilon)$-approximate solution for this problem from $n^{(\alpha/\epsilon)^{O(d)}}$, as reported by Bilò et al., to $O\left(n + \left(\frac{k^{2d+1}}{\epsilon^d}\right)^{\min\{2k,\ (\alpha/\epsilon)^{O(d)}\}}\right)$; that is, we obtain a running time that is linear in the network size. Further results include a constant-factor approximation algorithm for the TSP problem under squared (non-metric!) edge costs, which can be employed to implement a novel data aggregation protocol, as well as efficient schemes to perform $k$-hop multicasts.
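    To make the covering objective concrete, the sketch below is a brute-force reference for the $k$-station selection problem described above: it enumerates every choice of $k$ senders and every assignment of the remaining stations to a sender, charging each sender the $\alpha$-th power of the distance to its farthest assigned station. The function name is ours and the search is exponential, so this is only a sanity check for tiny inputs, not the paper's $(1+\epsilon)$-approximation.

```python
# Brute-force sketch of the k-station power assignment problem (hypothetical
# helper, exponential time): choose k senders and give each a radius so that
# every remaining station is covered; minimize sum(radius ** alpha).
from itertools import combinations, product
import math

def power_assignment_cost(points, k, alpha=2.0):
    """Exact minimum of sum(r_i ** alpha) over all choices of k senders."""
    n = len(points)
    best = math.inf
    for senders in combinations(range(n), k):
        clients = [i for i in range(n) if i not in senders]
        # Try every way of assigning each client to one sender; a sender's
        # radius is then the distance to its farthest assigned client.
        for assign in product(range(k), repeat=len(clients)):
            radii = [0.0] * k
            for c, s in zip(clients, assign):
                radii[s] = max(radii[s], math.dist(points[c], points[senders[s]]))
            best = min(best, sum(r ** alpha for r in radii))
    return best

# Example: 5 points on a line, 2 senders, alpha = 2 (energy ~ r^2).
pts = [(0, 0), (1, 0), (2, 0), (10, 0), (11, 0)]
print(power_assignment_cost(pts, k=2))  # -> 2.0 (one radius-1 ball per group)
```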

    Optimal interval clustering: Application to Bregman clustering and statistical mixture learning

    We present a generic dynamic programming method to compute the optimal clustering of $n$ scalar elements into $k$ pairwise disjoint intervals. This includes 1D Euclidean $k$-means, $k$-medoids, $k$-medians, $k$-centers, etc. We extend the method to incorporate cluster size constraints and show how to choose the appropriate $k$ by model selection. Finally, we illustrate and refine the method on two case studies: Bregman clustering and statistical mixture learning maximizing the complete likelihood.
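    The core of such a method can be written down directly for the 1D Euclidean $k$-means cost: after sorting, every cluster of an optimal solution is a contiguous interval, and prefix sums give each interval's cost in constant time. The sketch below is an assumed $O(n^2 k)$ textbook formulation of that dynamic program, not the paper's refined generic version.

```python
# Dynamic program for optimal 1D interval clustering under the k-means cost:
# dp[j][i] = best cost of splitting the first i sorted elements into j intervals.
def interval_kmeans(values, k):
    x = sorted(values)
    n = len(x)
    # Prefix sums of values and squared values let us evaluate the within-
    # interval sum of squared deviations cost(a, b) for x[a:b] in O(1).
    pre = [0.0] * (n + 1)
    pre2 = [0.0] * (n + 1)
    for i, v in enumerate(x):
        pre[i + 1] = pre[i] + v
        pre2[i + 1] = pre2[i] + v * v

    def cost(a, b):  # sum of squared distances of x[a:b] to its mean
        s, s2, m = pre[b] - pre[a], pre2[b] - pre2[a], b - a
        return s2 - s * s / m

    INF = float("inf")
    dp = [[INF] * (n + 1) for _ in range(k + 1)]
    dp[0][0] = 0.0
    for j in range(1, k + 1):
        for i in range(j, n + 1):
            # Last interval is x[t:i]; the first t elements form j-1 intervals.
            dp[j][i] = min(dp[j - 1][t] + cost(t, i) for t in range(j - 1, i))
    return dp[k][n]

print(interval_kmeans([1.0, 1.1, 5.0, 5.2, 9.9], k=3))  # ~0.025
```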

    Parallel Graph Partitioning for Complex Networks

    Processing large complex networks like social networks or web graphs has recently attracted considerable interest. In order to do this in parallel, we need to partition them into pieces of about equal size. Unfortunately, previous parallel graph partitioners originally developed for more regular mesh-like networks do not work well for these networks. This paper addresses this problem by parallelizing and adapting the label propagation technique originally developed for graph clustering. By introducing size constraints, label propagation becomes applicable for both the coarsening and the refinement phase of multilevel graph partitioning. We obtain very high quality by applying a highly parallel evolutionary algorithm to the coarsened graph. The resulting system is both more scalable and achieves higher quality than state-of-the-art systems like ParMetis or PT-Scotch. For large complex networks the performance differences are substantial. For example, our algorithm can partition a web graph with 3.3 billion edges in less than sixteen seconds using 512 cores of a high-performance cluster while producing a high-quality partition; none of the competing systems can handle this graph on our system.
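    As a rough illustration of size-constrained label propagation (our own sequential simplification, not the paper's parallel implementation), each vertex repeatedly adopts the most frequent label among its neighbours, provided the receiving block stays within a size cap; otherwise it keeps its current label.

```python
# Sequential sketch of size-constrained label propagation: a vertex moves to
# the heaviest neighbouring label whose block still has room, keeping its own
# label when that is already (tied for) the most frequent among neighbours.
from collections import Counter

def size_constrained_label_propagation(adj, max_block_size, rounds=10):
    """adj: dict mapping vertex -> iterable of neighbour vertices."""
    label = {v: v for v in adj}          # every vertex starts as its own block
    size = Counter(label.values())
    for _ in range(rounds):
        moved = False
        for v in adj:
            counts = Counter(label[u] for u in adj[v])
            for lab, _ in counts.most_common():
                if lab == label[v]:
                    break                 # current label is already best
                if size[lab] + 1 <= max_block_size:
                    size[label[v]] -= 1
                    size[lab] += 1
                    label[v] = lab
                    moved = True
                    break
        if not moved:
            break
    return label

# Toy graph: two triangles joined by one edge; blocks capped at 3 vertices.
g = {0: [1, 2], 1: [0, 2], 2: [0, 1, 3], 3: [2, 4, 5], 4: [3, 5], 5: [3, 4]}
print(size_constrained_label_propagation(g, max_block_size=3))
```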

    Approximate Clustering via Metric Partitioning

    In this paper we consider two metric covering/clustering problems: the Minimum Cost Covering Problem (MCC) and $k$-clustering. In the MCC problem, we are given two point sets $X$ (clients) and $Y$ (servers), and a metric on $X \cup Y$. We would like to cover the clients by balls centered at the servers. The objective function to minimize is the sum of the $\alpha$-th powers of the radii of the balls. Here $\alpha \geq 1$ is a parameter of the problem (but not of a problem instance). MCC is closely related to the $k$-clustering problem; the main difference is that in $k$-clustering one needs to select $k$ balls to cover the clients. For any $\epsilon > 0$, we describe quasi-polynomial time $(1+\epsilon)$-approximation algorithms for both problems. However, in the case of $k$-clustering the algorithm uses $(1+\epsilon)k$ balls. Prior to our work, a $3^{\alpha}$- and a $c^{\alpha}$-approximation were achieved by polynomial-time algorithms for MCC and $k$-clustering, respectively, where $c > 1$ is an absolute constant. These two problems are thus interesting examples of metric covering/clustering problems that admit a $(1+\epsilon)$-approximation (using $(1+\epsilon)k$ balls in the case of $k$-clustering) if one is willing to settle for quasi-polynomial time. In contrast, for the variant of MCC where $\alpha$ is part of the input, we show under standard assumptions that no polynomial-time algorithm can achieve an approximation factor better than $O(\log |X|)$ for $\alpha \geq \log |X|$.
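    For contrast with the quasi-polynomial scheme, a logarithmic factor of the kind mentioned at the end is what a standard set-cover-style greedy obtains for MCC: treat every (server, radius-to-a-client) pair as a candidate ball of cost $r^\alpha$ and repeatedly take the ball covering the most uncovered clients per unit cost. The sketch below is that generic baseline (names and structure are ours), not the paper's algorithm.

```python
# Set-cover-style greedy baseline for MCC: pick, at each step, the candidate
# ball (server, radius equal to the distance to some uncovered client) with
# the smallest cost-per-newly-covered-client ratio.
import math

def greedy_mcc(clients, servers, alpha=1.0):
    uncovered = set(range(len(clients)))
    solution, total = [], 0.0
    while uncovered:
        best = None  # (cost per newly covered client, server, radius, covered)
        for s in servers:
            for c in uncovered:
                r = math.dist(s, clients[c])
                covered = {i for i in uncovered
                           if math.dist(s, clients[i]) <= r}
                ratio = (r ** alpha) / len(covered)  # covered always holds c
                if best is None or ratio < best[0]:
                    best = (ratio, s, r, covered)
        _, s, r, covered = best
        solution.append((s, r))
        total += r ** alpha
        uncovered -= covered
    return solution, total

clients = [(0, 0), (1, 0), (8, 0)]
servers = [(0.5, 0), (8, 0)]
# Picks a radius-0 ball at (8, 0) and a radius-0.5 ball at (0.5, 0); cost 0.25.
print(greedy_mcc(clients, servers, alpha=2.0))
```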