121,665 research outputs found

    Bipartite Perfect Matching in Pseudo-Deterministic NC

    We present a pseudo-deterministic NC algorithm for finding perfect matchings in bipartite graphs. Specifically, our algorithm is a randomized parallel algorithm which uses poly(n) processors, poly(log n) depth, poly(log n) random bits, and outputs for each bipartite input graph a unique perfect matching with high probability. That is, on the same graph it returns the same matching for almost all choices of randomness. As an immediate consequence we also obtain a pseudo-deterministic NC algorithm for constructing a depth first search (DFS) tree. We introduce a method for computing the union of all min-weight perfect matchings of a weighted graph in RNC, and a novel set of weight assignments which, in combination, enable isolating a unique matching in a graph. We then show a way to use pseudo-deterministic algorithms to reduce the number of random bits used by general randomized algorithms. The main idea is that random bits can be reused by successive invocations of pseudo-deterministic randomized algorithms. We use this technique to give an RNC algorithm for constructing a DFS tree using only O(log^2 n) random bits, whereas the previous best randomized algorithm used O(log^7 n), and a new sequential randomized algorithm for the set-maxima problem which uses fewer random bits than the previous state of the art. Furthermore, we prove that resolving the decision question NC = RNC would imply NC algorithms for finding a bipartite perfect matching and for constructing a DFS tree. This is not implied by previous randomized NC search algorithms for bipartite perfect matching, but is implied by the existence of a pseudo-deterministic NC search algorithm.
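    The isolation step can be illustrated with the classical isolation lemma (a minimal sketch, not the paper's novel weight assignment; the helper names `perfect_matchings` and `isolate` below are hypothetical): drawing independent random edge weights from a range polynomially larger than the number of edges makes the minimum-weight perfect matching unique with constant probability, after which that unique matching is a canonical output for that choice of weights.

```python
# Minimal illustration of the classical isolation-lemma idea (NOT the paper's
# novel weight assignment): random edge weights drawn from {1, ..., 2m} make
# the minimum-weight perfect matching unique with probability at least 1/2.
import itertools
import random

def perfect_matchings(adj, n):
    """Yield all perfect matchings of a balanced bipartite graph.

    adj[u] is the set of right-vertices adjacent to left-vertex u.
    A matching is returned as a tuple M with M[u] = matched right vertex.
    (Brute force; only sensible for tiny n.)
    """
    for perm in itertools.permutations(range(n)):
        if all(perm[u] in adj[u] for u in range(n)):
            yield perm

def isolate(adj, n, seed=None):
    """Pick random edge weights and return all min-weight perfect matchings."""
    rng = random.Random(seed)
    m = sum(len(a) for a in adj)
    w = {(u, v): rng.randint(1, 2 * m) for u in range(n) for v in adj[u]}
    best, best_cost = [], float("inf")
    for match in perfect_matchings(adj, n):
        cost = sum(w[(u, match[u])] for u in range(n))
        if cost < best_cost:
            best, best_cost = [match], cost
        elif cost == best_cost:
            best.append(match)
    return best  # length 1 means the weights isolated a unique matching

if __name__ == "__main__":
    # Complete bipartite graph K_{3,3}: six perfect matchings before weighting.
    adj = [{0, 1, 2} for _ in range(3)]
    unique = sum(1 for s in range(200) if len(isolate(adj, 3, seed=s)) == 1)
    print(f"weights isolated a unique matching in {unique}/200 trials")
```

    The pseudo-deterministic contribution of the paper is, roughly, a weight scheme under which the isolated matching is the same for almost all choices of randomness, so repeated runs agree on a single canonical matching.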

    Biconnectivity, Chain Decomposition and st-Numbering Using O(n) Bits

    Recent work by Elmasry et al. (STACS 2015) and Asano et al. (ISAAC 2014) reconsidered classical fundamental graph algorithms with a focus on improving the space complexity. Elmasry et al. gave, among others, an implementation of depth first search (DFS) of a graph on n vertices and m edges taking O(m lg lg n) time and O(n) bits of space, improving on the O(m lg n) time bound due to Asano et al. Subsequently, Banerjee et al. (COCOON 2016) gave an O(m + n) time implementation using O(m + n) bits for DFS and its classical applications (including testing for biconnectivity and finding cut vertices and cut edges). Recently, Kammer et al. (MFCS 2016) gave an algorithm for testing biconnectivity using O(n + min{m, n lg lg n}) bits in linear time. In this paper, we consider O(n)-bit implementations of the classical applications of DFS, including finding cut vertices and biconnected components, chain decomposition, and st-numbering. Classical algorithms for these problems typically use DFS together with Omega(lg n) bits of information at each node. Our O(n)-bit implementations take O(m lg^c n lg lg n) time for some small constant c (c ≤ 3). Central to our implementations is a succinct representation of the DFS tree and a space-efficient partitioning of the DFS tree into connected subtrees, which may be of independent interest for space-efficient graph algorithms.
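    For context, the sketch below shows the classical lowpoint-based computation of cut vertices from a DFS tree (Hopcroft-Tarjan style). It is the textbook baseline that stores Omega(lg n) bits per vertex (discovery times and lowpoints), i.e., exactly the kind of per-node information the paper's O(n)-bit implementations avoid; the function `cut_vertices` and the example graph are illustrative only.

```python
# Classical lowpoint-based cut-vertex computation (Hopcroft-Tarjan style).
# This is the O(n lg n)-bit textbook baseline, shown only to make the role of
# the DFS tree concrete; it is not the space-efficient algorithm of the paper.
def cut_vertices(adj):
    """adj: dict mapping vertex -> iterable of neighbours (undirected graph)."""
    disc, low = {}, {}          # discovery time and lowpoint of each vertex
    cuts, timer = set(), [0]

    def dfs(u, parent):
        disc[u] = low[u] = timer[0]
        timer[0] += 1
        children = 0
        for v in adj[u]:
            if v not in disc:                 # tree edge u -> v
                children += 1
                dfs(v, u)
                low[u] = min(low[u], low[v])
                # A non-root u is a cut vertex if some child's subtree
                # cannot reach strictly above u via a back edge.
                if parent is not None and low[v] >= disc[u]:
                    cuts.add(u)
            elif v != parent:                 # back edge
                low[u] = min(low[u], disc[v])
        # The root is a cut vertex iff it has at least two DFS children.
        if parent is None and children >= 2:
            cuts.add(u)

    for s in adj:
        if s not in disc:
            dfs(s, None)
    return cuts

# Example: two triangles sharing vertex 2, so vertex 2 is the only cut vertex.
adj = {0: [1, 2], 1: [0, 2], 2: [0, 1, 3, 4], 3: [2, 4], 4: [2, 3]}
print(cut_vertices(adj))   # {2}
```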

    Fast and Compact Distributed Verification and Self-Stabilization of a DFS Tree

    We present algorithms for distributed verification and silent stabilization of a DFS (Depth First Search) spanning tree of a connected network. Computing and maintaining such a DFS tree is an important task, e.g., for constructing efficient routing schemes. Our algorithm improves upon previous work in various ways. Comparable previous work has space and time complexities of O(n log Δ) bits per node and O(nD) respectively, where Δ is the highest degree of a node, n is the number of nodes, and D is the diameter of the network. In contrast, our algorithm has a space complexity of O(log n) bits per node, which is optimal for silent-stabilizing spanning trees, and runs in O(n) time. In addition, our solution is modular, since it utilizes the distributed verification algorithm as an independent subtask of the overall solution. It is possible to use the verification algorithm as a stand-alone task or as a subtask in another algorithm. To demonstrate the simplicity of constructing efficient DFS algorithms using this modular approach, we also present a (non-silent) self-stabilizing DFS token circulation algorithm for general networks based on our silent-stabilizing DFS tree. The complexities of this token circulation algorithm are comparable to the known ones.
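    As a point of reference, the centralized sketch below spells out the property that any verification procedure for a DFS tree must check: a rooted spanning tree of a connected undirected graph is a DFS tree if and only if every non-tree edge joins an ancestor and a descendant in the tree. This is not the distributed, self-stabilizing algorithm of the paper; the function `is_dfs_tree` and its Euler-tour intervals are illustrative assumptions.

```python
# Centralized sketch of what "verifying a DFS tree" means: a rooted spanning
# tree T of an undirected graph G is a DFS tree iff every non-tree edge of G
# joins an ancestor-descendant pair in T (undirected DFS has no cross edges).
# Assumes the given parent edges are edges of G.
def is_dfs_tree(n, edges, parent, root):
    """parent: dict child -> parent for every vertex except root."""
    children = {v: [] for v in range(n)}
    for v, p in parent.items():
        children[p].append(v)

    # Euler-tour intervals: u is an ancestor of v iff tin[u] <= tin[v] < tout[u].
    tin, tout, t = {}, {}, 0
    stack = [(root, False)]
    while stack:
        u, done = stack.pop()
        if done:
            tout[u] = t
            continue
        tin[u] = t
        t += 1
        stack.append((u, True))
        for c in children[u]:
            stack.append((c, False))

    if len(tin) != n:                       # parent pointers must span the graph
        return False

    def ancestor(a, b):
        return tin[a] <= tin[b] < tout[a]

    tree = {frozenset((v, p)) for v, p in parent.items()}
    return all(ancestor(u, v) or ancestor(v, u)
               for u, v in edges if frozenset((u, v)) not in tree)

# Example: path-shaped tree 0-1-2-3 with back edge (3, 0) is a valid DFS tree.
edges = [(0, 1), (1, 2), (2, 3), (3, 0)]
print(is_dfs_tree(4, edges, parent={1: 0, 2: 1, 3: 2}, root=0))  # True
```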

    Indexing Graph Search Trees and Applications

    We consider the problem of compactly representing the Depth First Search (DFS) tree of a given undirected or directed graph having n vertices and m edges, while supporting various DFS-related queries efficiently in the RAM model with logarithmic word size. We study this problem in two well-known models: the indexing and encoding models. While most of these queries can be supported easily in constant time using O(n lg n) bits of extra space, our goal here is, more specifically, to beat this trivial O(n lg n)-bit space bound without compromising too much on the running time of these queries. In the indexing model, the space bound of our solution involves the quantity m, hence we obtain different bounds for sparse and dense graphs respectively. In the encoding model, we first give a space lower bound, followed by an almost optimal data structure with extremely fast query time. Central to our algorithm is a partitioning of the DFS tree into connected subtrees, and a compact way to store these connections. Finally, we also apply these techniques to compactly index shortest path structures and biconnectivity structures, among others.
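    To make the encoding-model setting concrete, the toy sketch below writes a rooted DFS tree on n nodes as a 2n-bit balanced-parentheses string and answers ancestor queries by interval containment. The naive linear scans stand in for the o(n)-bit auxiliary indexes that real succinct tree representations use to answer such queries in constant time; the functions `encode`, `find_close`, and `is_ancestor` are illustrative, not the paper's data structure.

```python
# Toy illustration of the encoding-model idea: a rooted tree on n nodes can be
# written as a 2n-bit balanced-parentheses string (preorder: '(' on entry,
# ')' on exit).  Ancestor queries reduce to interval containment on the string.
def encode(children, root):
    """Return the balanced-parentheses string and each node's '(' position."""
    bp, open_pos = [], {}
    stack = [(root, False)]
    while stack:
        u, closing = stack.pop()
        if closing:
            bp.append(")")
            continue
        open_pos[u] = len(bp)
        bp.append("(")
        stack.append((u, True))
        for c in reversed(children.get(u, [])):
            stack.append((c, False))
    return "".join(bp), open_pos

def find_close(bp, i):
    """Position of the ')' matching the '(' at position i (naive linear scan)."""
    depth = 0
    for j in range(i, len(bp)):
        depth += 1 if bp[j] == "(" else -1
        if depth == 0:
            return j

def is_ancestor(bp, open_pos, u, v):
    """u is an ancestor of v iff v's '(' lies inside u's parenthesis pair."""
    return open_pos[u] <= open_pos[v] <= find_close(bp, open_pos[u])

children = {0: [1, 4], 1: [2, 3]}          # a small DFS tree rooted at 0
bp, pos = encode(children, 0)
print(bp)                                   # ((()())())  -- 2n = 10 bits
print(is_ancestor(bp, pos, 1, 3), is_ancestor(bp, pos, 4, 2))   # True False
```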

    Performance Analysis of Space-Efficient Graph Algorithms

    Master's thesis, Department of Computer Science and Engineering, College of Engineering, Seoul National University, February 2019. Advisor: Satti, Srinivasa Rao. Graphs arising from social networks or other big-data sources can contain gigantic amounts of data, and searching such a graph requires memory that scales with the size of the graph. Asano et al. (ISAAC 2014) initiated the study of space-efficient graph algorithms and proposed algorithms for DFS and some of its applications that use sub-linear space while taking slightly more than linear time. Banerjee et al. (ToCS 62(8), 1736-1762 (2018)) proposed space-efficient graph algorithms in the read-only memory (ROM) model. Given a graph G with n vertices and m edges, their BFS algorithm runs in O(m + n) time using 2n + o(n) bits. The space usage is further improved to n lg 3 + o(n) bits at the cost of O(m lg n f(n)) time, where f(n) is an extremely slowly growing function of n. For DFS, their algorithm takes O(m + n) time using O(m lg(m/n)) bits. Chakraborty et al. (ESA 2018) introduced the in-place model, which relaxes the read-only restriction of the ROM model in order to improve its space usage. Algorithms in the in-place model reduce the space usage exponentially, to O(lg n) bits, at the expense of slower running time. In this thesis, we explore the proposed space-efficient graph algorithms in the ROM and in-place models in detail and evaluate their performance. We implemented almost all the best-known space-efficient algorithms for BFS and DFS and evaluated their performance. Along the way, we also implemented several space-efficient data structures for representing bit vectors, strings, dictionaries, etc.
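    A minimal sketch of the flavor of these space-efficient traversals, under a simplified setting: the BFS below keeps only a 2-bit color per vertex instead of an O(n lg n)-bit queue and rescans the vertex set to find each frontier, paying an extra factor of the graph's diameter in time. The algorithms evaluated in the thesis are considerably more refined and achieve the bounds stated above; `bfs_levels` and the example graph are illustrative only.

```python
# Queue-free BFS sketch: only a 2-bit color per vertex is kept (the dist dict
# is just for reporting the result).  Each level is found by rescanning all
# vertices, which trades time for space compared to a standard queue-based BFS.
WHITE, CURRENT, NEXT, DONE = 0, 1, 2, 3

def bfs_levels(adj, source):
    n = len(adj)
    color = [WHITE] * n            # conceptually 2 bits per vertex
    color[source] = CURRENT
    level, dist = 0, {source: 0}
    while True:
        advanced = False
        for u in range(n):         # rescan instead of dequeuing
            if color[u] != CURRENT:
                continue
            for v in adj[u]:
                if color[v] == WHITE:
                    color[v] = NEXT
                    dist[v] = level + 1
                    advanced = True
            color[u] = DONE
        if not advanced:
            break
        for u in range(n):         # promote the next frontier
            if color[u] == NEXT:
                color[u] = CURRENT
        level += 1
    return dist

adj = [[1, 2], [0, 3], [0, 3], [1, 2, 4], [3]]   # a small undirected graph
print(bfs_levels(adj, 0))   # {0: 0, 1: 1, 2: 1, 3: 2, 4: 3}
```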