Open Problems:28
Latest revision as of 01:50, 7 March 2013
Suggested by: Sudipto Guha
Many streaming algorithms are designed for worst-case inputs, and the first step of the analysis assumes truly random hash functions, which in a second step are replaced by hash functions that can be described using a small number of truly random bits. In practice, the input stream is often the result of some random process. Mitzenmacher and Vadhan [MitzenmacherV-08] show that, as long as the stream has sufficiently high entropy, hash functions from a restricted family are essentially as good as truly random hash functions. On a related note, Gabizon and Hassidim [GabizonH-10] show that algorithms for random-order streams need essentially no additional entropy beyond what can be extracted from the input.
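As a concrete illustration of such a restricted family (not part of the problem statement), a pairwise-independent family in the Carter–Wegman style can be sampled using only two random seeds instead of a full random function table. The prime and table size below are illustrative assumptions:

```python
import random

# Sketch of a 2-universal (pairwise-independent) hash family:
#   h(x) = ((a*x + b) mod p) mod m,
# where p is a prime assumed larger than any key in the universe.
# Describing h requires only the two seeds (a, b), i.e. O(log p) bits,
# in contrast to a truly random function over the whole universe.

P = 2_147_483_647  # Mersenne prime 2^31 - 1 (illustrative choice)

def make_hash(m, rng=random):
    """Draw one function from the family using two random seeds."""
    a = rng.randrange(1, P)  # a != 0
    b = rng.randrange(0, P)
    return lambda x: ((a * x + b) % P) % m

h = make_hash(1024)
bucket = h(42)  # a bucket index in [0, 1024)
assert 0 <= bucket < 1024
```

For worst-case inputs, guarantees for such families are proved directly; the Mitzenmacher–Vadhan result says that on sufficiently high-entropy streams they behave essentially like truly random functions even where the worst-case analysis would demand more independence.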
In these two cases, the input can be seen as a source of randomness for the algorithm. How can one quantify the randomness of the stream in a natural way? For instance, Mitzenmacher and Vadhan require a lower bound on the Rényi entropy of each element of the stream, conditioned on the previous elements of the stream. Are there measures that are likely to be useful in practice and that can be verified?
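To make the entropy measure concrete, the following sketch computes the Rényi entropy of order 2 (collision entropy) of the empirical distribution of stream elements, H_2 = -log2(sum_x p(x)^2). Note this is a simplification of the actual requirement: Mitzenmacher and Vadhan bound the entropy of each element conditioned on the stream prefix, whereas this estimates only the marginal distribution:

```python
import math
from collections import Counter

def renyi_2_entropy(stream):
    """Collision entropy H_2 (in bits) of the empirical distribution."""
    counts = Counter(stream)
    n = len(stream)
    # Probability that two independent draws from the empirical
    # distribution collide; H_2 = -log2 of this probability.
    collision_prob = sum((c / n) ** 2 for c in counts.values())
    return -math.log2(collision_prob)

# A stream uniform over 8 symbols has H_2 = 3 bits:
print(renyi_2_entropy(list(range(8))))  # -> 3.0
# A constant stream carries no randomness (H_2 = 0):
assert renyi_2_entropy([7] * 100) == 0
```

A measure of this empirical flavor is at least checkable on observed data; whether such estimable quantities can certify the conditional entropy bounds the theory needs is part of the question above.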
Once we fix a measure of randomness, how much randomness according to this measure is necessary to derandomize or simplify specific streaming algorithms?