Open Problems:83

Given the full description of a fixed distribution $q$ over a discrete domain (say $[n]=\{1,\dots,n\}$), as well as access to i.i.d. samples from an unknown probability distribution $p$ over $[n]$ and a distance parameter $\varepsilon\in(0,1]$, the identity testing problem asks to distinguish w.h.p. between (i) $p=q$ and (ii) $\operatorname{d}_{\rm TV}(p,q)>\varepsilon$.
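
To make the task concrete, here is a minimal Python sketch of a chi-square-style identity tester, in the spirit of statistics used in the testing literature. The threshold $2m^2\varepsilon^2$, the full-support assumption on $q$, and the sample sizes in the toy usage are illustrative choices, not a faithful rendition of any particular tester from the cited work.

<pre>
import numpy as np

rng = np.random.default_rng(0)

def identity_test(q, samples, eps):
    """Chi-square-style identity tester (illustrative sketch).

    Returns True ("p = q") or False ("d_TV(p, q) > eps").
    Assumes q has full support; the acceptance threshold is a
    back-of-the-envelope choice, not a calibrated constant.
    """
    q = np.asarray(q, dtype=float)
    m = len(samples)
    counts = np.bincount(np.asarray(samples), minlength=len(q)).astype(float)
    # With Poissonized sampling, E[z] = m^2 * chi^2(p, q): zero when p = q,
    # and at least 4 * m^2 * eps^2 when d_TV(p, q) > eps (by Cauchy-Schwarz).
    # With fixed m, z is only approximately centered, which suffices here.
    z = np.sum(((counts - m * q) ** 2 - counts) / q)
    return z <= 2 * m**2 * eps**2  # accept below the midpoint of the two means

# Toy usage: uniform q, samples from q vs. from a (1.5*eps)-far distribution.
n, eps = 1000, 0.1
q = np.ones(n) / n
m = int(10 * np.sqrt(n) / eps**2)              # Theta(sqrt(n)/eps^2) samples
print(identity_test(q, rng.choice(n, size=m, p=q), eps))    # True (w.h.p.)
p = q.copy(); p[: n // 2] *= 1 - 3 * eps; p[n // 2 :] *= 1 + 3 * eps
print(identity_test(q, rng.choice(n, size=m, p=p), eps))    # False (w.h.p.)
</pre>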
  
The sample complexity of this question as a function of $n$ and $\varepsilon$ is fully understood by now: $\Theta(\sqrt{n}/\varepsilon^2)$ samples are necessary and sufficient, the worst-case lower bound following from taking $q$ to be the uniform distribution on $[n]$. Valiant and Valiant {{cite|ValiantV-14}} showed an ''instance-specific'' bound for this problem, where the sample complexity $\Psi_{\rm TV}$ depends only on $\varepsilon$ and the (massive) parameter $q$ instead of $n$: namely, that
 
$$\Psi_{\rm TV}(q,\varepsilon) = \Theta\left(\max\left( \frac{\Phi(q,\Theta(\varepsilon))}{\varepsilon^2}, \frac{1}{\varepsilon}\right)\right)$$
 
samples are necessary and sufficient, where $\Phi$ is the functional defined by taking the $2/3$-pseudonorm of the vector of probabilities of $q$, once both the biggest element and $\varepsilon$ total mass of the smallest elements have been removed:

$$\Phi(q,\varepsilon) = \left\lVert q_{-\max}^{-\varepsilon} \right\rVert_{2/3}, \qquad \text{where } \lVert x\rVert_{2/3} = \Big(\sum_i x_i^{2/3}\Big)^{3/2}.$$
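
For concreteness, the Python sketch below evaluates this functional and the resulting bound; the greedy removal of tail mass and the suppression of all $\Theta(\cdot)$ constants (including the $\Theta(\varepsilon)$ argument inside $\Phi$) are simplifications for illustration. The sanity check at the end recovers the worst-case $\Theta(\sqrt{n}/\varepsilon^2)$ bound for the uniform distribution.

<pre>
import numpy as np

def phi(q, eps):
    """Sketch of Phi(q, eps): the 2/3-pseudonorm of q's probability vector
    after removing the single largest entry and (greedily) up to eps total
    mass from the smallest entries."""
    q = np.sort(np.asarray(q, dtype=float))[::-1]  # descending order
    q = q[1:]                                      # drop the biggest element
    keep, mass = len(q), 0.0
    while keep > 0 and mass + q[keep - 1] <= eps:
        mass += q[keep - 1]                        # shave eps mass off the tail
        keep -= 1
    q = q[:keep]
    return float(np.sum(q ** (2.0 / 3.0)) ** 1.5)  # ||.||_{2/3}

def psi(q, eps):
    """Instance-specific sample complexity, with the Theta(.) constants
    (including the Theta(eps) inside Phi) suppressed."""
    return max(phi(q, eps) / eps**2, 1.0 / eps)

# Sanity check: for uniform q the bound recovers Theta(sqrt(n)/eps^2).
n, eps = 10_000, 0.1
u = np.ones(n) / n
print(phi(u, eps))   # ~ 0.9**1.5 * sqrt(n) ~ 85
print(psi(u, eps))   # ~ sqrt(n)/eps^2, up to constants (~ 8.5e3 here)
</pre>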
