4 research outputs found
A Note on Hardness of Diameter Approximation
We revisit the hardness of approximating the diameter of a network. In the
CONGEST model of distributed computing, ~Omega(n) rounds are
necessary to compute the diameter [Frischknecht et al. SODA'12], where ~Omega hides polylogarithmic factors. Abboud et al. [DISC 2016]
extended this result to sparse graphs and, at a more fine-grained level, showed
that, for any integer 1 <= l <= polylog(n),
distinguishing between networks of diameter 4l + 2 and 6l + 1
requires ~Omega(n) rounds. We slightly tighten this result by
showing that even distinguishing between diameter 2l + 1 and 3l + 1 requires ~Omega(n) rounds. The reduction of Abboud et al. is
inspired by recent conditional lower bounds in the RAM model, where the
orthogonal vectors problem plays a pivotal role. In our new lower bound, we
make the connection to orthogonal vectors explicit, leading to a conceptually
more streamlined exposition.
Comment: Accepted to Information Processing Letters
Brief Announcement: A Note on Hardness of Diameter Approximation
We revisit the hardness of approximating the diameter of a network. In the CONGEST model, ~Omega(n) rounds are necessary to compute the diameter [Frischknecht et al. SODA'12]. Abboud et al. [DISC 2016] extended this result to sparse graphs and, at a more fine-grained level, showed that, for any integer 1 <= l <= polylog(n), distinguishing between networks of diameter 4l + 2 and 6l + 1 requires ~Omega(n) rounds. We slightly tighten this result by showing that even distinguishing between diameter 2l + 1 and 3l + 1 requires ~Omega(n) rounds. The reduction of Abboud et al. is inspired by recent conditional lower bounds in the RAM model, where the orthogonal vectors problem plays a pivotal role. In our new lower bound, we make the connection to orthogonal vectors explicit, leading to a conceptually more streamlined exposition. This is suited for teaching both the lower bound in the CONGEST model and the conditional lower bound in the RAM model.