Problem 31: Gap-Hamming Information Cost
Suggested by | Amit Chakrabarti
---|---
Source | Kanpur 2009
Short link | https://sublinear.info/31
In the Gap-Hamming problem, two players, Alice and Bob, hold vectors $x,y\in \{0,1\}^n$ respectively and wish to compute the function $f$
\[ f(x,y)= \begin{cases} 0 & \mbox{ if } \Delta(x,y) \leq n/2-\sqrt{n} \\ 1 & \mbox{ if } \Delta(x,y) \geq n/2+\sqrt{n} \end{cases} \]
where $\Delta(x,y)=|\{i:x_i\neq y_i\}|$ is the Hamming distance between the vectors and we are promised that $|\Delta(x,y)-n/2|\geq \sqrt{n}$. The problem became interesting in the streaming community because a lower bound on the communication complexity of evaluating $f$ yields a lower bound on the space required by a streaming algorithm to estimate the number of distinct elements or the entropy of a stream. After a series of papers, it is known that evaluating $f$ requires $\Omega(n)$ communication [IndykW-03, Woodruff-04, BrodyC-09, BrodyCRVW-10, ChakrabartiR-11], even when an unlimited number of rounds of communication is allowed.
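For concreteness, here is a minimal Python sketch of the predicate $f$; the function name `gap_hamming` and the convention of returning `None` on a promise violation are our own illustrative choices, not part of the problem statement:

```python
import random

def gap_hamming(x, y, n):
    """Evaluate the Gap-Hamming predicate f(x, y).

    Returns 0 if the Hamming distance is at most n/2 - sqrt(n),
    1 if it is at least n/2 + sqrt(n), and None when the promise
    |Delta(x, y) - n/2| >= sqrt(n) is violated.
    """
    delta = sum(xi != yi for xi, yi in zip(x, y))  # Hamming distance Delta(x, y)
    gap = n ** 0.5
    if delta <= n / 2 - gap:
        return 0
    if delta >= n / 2 + gap:
        return 1
    return None  # promise violated: input lies outside the domain of f

# Usage: uniformly random inputs typically have Delta close to n/2,
# so they often violate the promise; that is expected.
n = 100
x = [random.randint(0, 1) for _ in range(n)]
y = [random.randint(0, 1) for _ in range(n)]
print(gap_hamming(x, y, n))
```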
An increasingly popular technique in communication complexity is to prove lower bounds by bounding the information cost [ChakrabartiSWY-01, BarYossefJKS-04]. Here the input $(X,Y)$ is drawn at random, and we consider the mutual information between the input and the (random) transcript $\Pi(X,Y)$ of the protocol:
\[ I(XY;\Pi(X,Y))=H(XY)-H(XY|\Pi(X,Y)) \ . \]
It would be interesting to prove a lower bound on the information cost of the Gap-Hamming problem for some natural input distribution.
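As a sanity check on the scale of this quantity (our own observation, not from the source), note that mutual information is bounded by the entropy of the transcript, which in turn is bounded by its length. For the trivial protocol in which Alice sends $x$ and Bob replies with $f(x,y)$, the transcript has $n+1$ bits, so
\[
I(XY;\Pi(X,Y)) \le H(\Pi(X,Y)) \le n+1 \ .
\]
Hence an $\Omega(n)$ lower bound on the information cost, as asked for here, would be tight up to constant factors.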
Update
A linear lower bound on the information cost for a certain distribution was shown by [KerenidisLLRX-12]. Later, [BravermanGPW-13] showed that a linear lower bound also holds for the uniform distribution.