Open Problems:81

From Open Problems in Sublinear Algorithms

Suggested by Jayadev Acharya
Source FOCS 2017
Short link https://sublinear.info/81

For any $\alpha \geq 0$, the Rényi entropy of order $\alpha$ of a probability distribution $p$ over a discrete domain $\Omega$ is defined as
$$
H_\alpha(p) = \frac{1}{1-\alpha}\log\sum_{x\in\Omega} p(x)^\alpha
$$
for $\alpha\neq 1$, and $H_1(p) = \lim_{\alpha\to 1} H_\alpha(p)$. In particular, $H_1$ corresponds to the Shannon entropy $H(p)=-\sum_{x\in\Omega} p(x) \log p(x)$.

The problem of estimating the Rényi entropy of a distribution $p$, given i.i.d. samples from it, was studied in [AcharyaOST-17], where the authors obtain tight bounds on the sample complexity for every integer $\alpha\neq 1$, and nearly tight ones for every non-integer $\alpha \geq 0$. In the latter case, however, the sample complexity is only known up to a subpolynomial factor: resolving the exact dependence on $\alpha$ and the alphabet size would be interesting.
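As a concrete illustration, below is a minimal sketch of the standard bias-corrected empirical estimator for integer $\alpha \geq 2$, of the kind analyzed in [AcharyaOST-17]; it is not claimed to reproduce the paper's exact procedure, and the function name and interface are illustrative. It rests on the identity that, for a binomial count $N_x \sim \mathrm{Bin}(n, p(x))$, the falling factorial satisfies $\mathbb{E}[(N_x)_\alpha] = (n)_\alpha\, p(x)^\alpha$, so $\sum_x (N_x)_\alpha / (n)_\alpha$ is an unbiased estimate of the power sum $\sum_x p(x)^\alpha$.

```python
import math
from collections import Counter

def renyi_entropy_estimate(samples, alpha):
    """Estimate H_alpha(p) from i.i.d. samples, for an integer alpha >= 2.

    Uses the bias-corrected power-sum estimate based on falling factorials:
    if N_x ~ Bin(n, p(x)), then E[(N_x)_alpha] = (n)_alpha * p(x)^alpha,
    so sum_x (N_x)_alpha / (n)_alpha is unbiased for sum_x p(x)^alpha.
    Logarithms are natural, matching the generic log in the definition.
    """
    assert isinstance(alpha, int) and alpha >= 2
    n = len(samples)
    counts = Counter(samples)

    def falling(m, a):
        # Falling factorial (m)_a = m * (m - 1) * ... * (m - a + 1).
        prod = 1
        for i in range(a):
            prod *= m - i
        return prod

    power_sum = sum(falling(c, alpha) for c in counts.values()) / falling(n, alpha)
    # power_sum is 0 if no symbol appears at least alpha times,
    # so n must be large enough for the log below to be defined.
    return math.log(power_sum) / (1 - alpha)
```

For instance, on $10^5$ samples drawn uniformly from $\{1,\dots,6\}$ with $\alpha = 2$, the output should be close to $H_2 = \log 6 \approx 1.79$ nats.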