{{DISPLAYTITLE:Problem 6: Filtering Irrelevant Data}}
{{Header
|source=kanpur06
|who=Sariel Har-Peled
}}
For many problems, most of the stream is irrelevant, and a good use of a streaming algorithm would be to filter out the irrelevant parts of the stream so that the remaining data is small enough to be processed by an I/O-efficient algorithm. How effective can a small-space algorithm be at such filtering for a given problem? An alternative idea that addresses similar issues is to allow a data-stream algorithm to delete elements from, and annotate, the stream over multiple passes, as in {{cite|DemetrescuFR-06}}. If the deletion of irrelevant elements were a large component of the algorithm, then it would make more sense to measure not the total number of passes taken by the algorithm but, rather, the total number of elements processed.
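
The following is a minimal sketch of the filter-then-solve pattern described above; it is not taken from the cited work, and the function names and the toy relevance test are hypothetical. Each pass applies a small-space test, deletes the elements it rejects, and the cost is accounted as the total number of elements processed across passes rather than the number of passes; once the residue is small enough, it is handed to an offline (I/O-efficient) solver.

<syntaxhighlight lang="python">
from typing import Callable, Iterable, List, Tuple


def filter_then_solve(
    stream: Iterable[int],
    make_pass_filter: Callable[[], Callable[[int], bool]],
    offline_solve: Callable[[List[int]], int],
    memory_budget: int,
    max_passes: int = 10,
) -> Tuple[int, int]:
    """Return (answer, total_elements_processed)."""
    # The residue list stands in for rewriting the (annotated) stream between
    # passes; the small-space part is the per-element test, not this buffer.
    residue = list(stream)
    elements_processed = 0

    for _ in range(max_passes):
        if len(residue) <= memory_budget:
            break
        keep = make_pass_filter()      # fresh small-space relevance test for this pass
        next_residue = []
        for x in residue:
            elements_processed += 1    # cost = elements touched, not passes taken
            if keep(x):
                next_residue.append(x)
        residue = next_residue         # rejected elements are deleted from the stream

    # The surviving data is now small enough to solve exactly offline.
    return offline_solve(residue), elements_processed


if __name__ == "__main__":
    # Toy relevance test: each pass keeps only multiples of a growing modulus.
    state = {"mod": 10}

    def make_filter():
        m = state["mod"]
        state["mod"] *= 10
        return lambda x: x % m == 0

    answer, cost = filter_then_solve(range(5000), make_filter, max, memory_budget=100)
    print(answer, cost)   # -> 4900 5500: two passes, 5000 + 500 elements processed
</syntaxhighlight>

Under this accounting, the first full-length pass dominates the cost whenever later passes run over much shorter residual streams, which is exactly the regime the problem asks about.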
 