Problem 28: Randomness of Partially Random Streams
Proposed by: Sudipto Guha
Source: Kanpur 2009
Short link: http://sublinear.info/28
Many streaming algorithms are designed for worst-case inputs, and their analysis typically proceeds in two steps: first assuming truly random hash functions, which are then replaced by hash functions that can be described using a small number of truly random bits. In practice, however, the input stream is often the result of some random process. Mitzenmacher and Vadhan [MitzenmacherV08] show that as long as the input has sufficiently high entropy, hash functions from a restricted family are essentially as good as truly random hash functions. On a related note, Gabizon and Hassidim [GabizonH10] show that algorithms for random-order streams need essentially no additional entropy beyond what can be extracted from the input itself.
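The following toy sketch (not part of the problem statement; the weak identity "hash" and the parameters are illustrative assumptions) shows the phenomenon behind the Mitzenmacher–Vadhan result: even a very simple hash function spreads a high-entropy stream evenly across buckets, while a structured, low-entropy stream can concentrate all its mass in one bucket.

```python
import random
from collections import Counter

def max_load(stream, h, m):
    """Largest bucket load when `stream` is hashed into m buckets by h."""
    return max(Counter(h(x) % m for x in stream).values())

m = 64
h = lambda x: x  # a deliberately weak "hash": identity modulo m

# Low-entropy, structured input: multiples of m all collide in bucket 0.
structured = [m * i for i in range(1000)]
# High-entropy input: uniform random keys spread out even under this h.
random.seed(0)
noisy = [random.randrange(10**9) for _ in range(1000)]

print(max_load(structured, h, m))  # 1000: every key lands in one bucket
print(max_load(noisy, h, m))      # much smaller, near the 1000/64 ≈ 16 average
```

Here the randomness of the input substitutes for the randomness of the hash family, which is exactly the perspective the problem asks to quantify.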
In both cases, the input can be seen as a source of randomness for the algorithm. How can one quantify the randomness of a stream in a natural way? For instance, Mitzenmacher and Vadhan assume a lower bound on the Rényi entropy of each element of the stream, conditioned on the previous elements. Are there measures that are likely to be useful in practice and that can actually be verified?
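As a concrete point of reference, one candidate measure is the order-2 (collision) Rényi entropy, H_2(X) = -log_2 Σ_x p(x)^2, which is the quantity underlying the Mitzenmacher–Vadhan bound. A minimal sketch of estimating it from the empirical distribution of a stream (the sample streams are illustrative, not from the source):

```python
import math
from collections import Counter

def renyi2_entropy(samples):
    """Order-2 (collision) Rényi entropy, in bits, of the empirical
    distribution of `samples`: H_2 = -log2(sum over x of p(x)^2)."""
    counts = Counter(samples)
    n = len(samples)
    collision_prob = sum((c / n) ** 2 for c in counts.values())
    return -math.log2(collision_prob)

# A stream uniform over 8 values has H_2 = 3 bits; a constant stream has 0.
uniform = list(range(8)) * 100
constant = [0] * 800
print(renyi2_entropy(uniform))
print(renyi2_entropy(constant))
```

Note that the problem asks for more than this: a per-element entropy bound conditioned on the prefix of the stream, which cannot be read off the marginal distribution alone.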
Once we fix a measure of randomness, how much randomness according to this measure is necessary to derandomize or simplify specific streaming algorithms?