Item Details


Released

Report

A Note on Hardness of Diameter Approximation

MPS-Authors

Bringmann, Karl
Algorithms and Complexity, MPI for Informatics, Max Planck Society;

Fulltext (public)

arXiv:1705.02127.pdf
(Preprint), 582KB

Citation

Bringmann, K., & Krinninger, S. (2017). A Note on Hardness of Diameter Approximation. Retrieved from http://arxiv.org/abs/1705.02127.


Cite as: https://hdl.handle.net/11858/00-001M-0000-002D-89B7-D
Abstract
We revisit the hardness of approximating the diameter of a network. In the CONGEST model, $\tilde{\Omega}(n)$ rounds are necessary to compute the diameter [Frischknecht et al., SODA'12]. Abboud et al. [DISC'16] extended this result to sparse graphs and, at a more fine-grained level, showed that, for any integer $1 \leq \ell \leq \operatorname{polylog}(n)$, distinguishing between networks of diameter $4\ell + 2$ and $6\ell + 1$ requires $\tilde{\Omega}(n)$ rounds. We slightly tighten this result by showing that even distinguishing between diameter $2\ell + 1$ and $3\ell + 1$ requires $\tilde{\Omega}(n)$ rounds. The reduction of Abboud et al. is inspired by recent conditional lower bounds in the RAM model, where the orthogonal vectors problem plays a pivotal role. In our new lower bound, we make the connection to orthogonal vectors explicit, leading to a conceptually more streamlined exposition that is suitable for teaching both the lower bound in the CONGEST model and the conditional lower bound in the RAM model.
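
For reference, the orthogonal vectors problem mentioned in the abstract asks: given two sets $A, B$ of $n$ Boolean vectors in $\{0,1\}^d$, decide whether some $a \in A$ and $b \in B$ satisfy $\langle a, b \rangle = 0$. The sketch below is a plain quadratic-time brute force, added here purely as an illustration (the function name and example data are not from the paper); the conditional RAM lower bounds referred to above rest on the conjecture that no $O(n^{2-\varepsilon})$-time algorithm exists when $d = \omega(\log n)$.

from typing import Sequence

def has_orthogonal_pair(A: Sequence[Sequence[int]], B: Sequence[Sequence[int]]) -> bool:
    """Return True iff some a in A and b in B are orthogonal, i.e. share no common 1-entry."""
    for a in A:
        for b in B:
            # a and b are orthogonal when no coordinate is 1 in both vectors.
            if all(x * y == 0 for x, y in zip(a, b)):
                return True
    return False

# Example (hypothetical data): [1, 0, 1] and [0, 1, 0] are orthogonal, so this prints True.
print(has_orthogonal_pair([[1, 0, 1], [0, 1, 1]], [[1, 1, 0], [0, 1, 0]]))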