<?xml version="1.0"?>
<feed xmlns="http://www.w3.org/2005/Atom" xml:lang="en">
	<id>https://sublinear.info/api.php?action=feedcontributions&amp;feedformat=atom&amp;user=132.76.50.6</id>
	<title>Open Problems in Sublinear Algorithms - User contributions [en]</title>
	<link rel="self" type="application/atom+xml" href="https://sublinear.info/api.php?action=feedcontributions&amp;feedformat=atom&amp;user=132.76.50.6"/>
	<link rel="alternate" type="text/html" href="https://sublinear.info/index.php?title=Special:Contributions/132.76.50.6"/>
	<updated>2026-04-22T18:37:54Z</updated>
	<subtitle>User contributions</subtitle>
	<generator>MediaWiki 1.31.10</generator>
	<entry>
		<id>https://sublinear.info/index.php?title=Open_Problems:67&amp;diff=786</id>
		<title>Open Problems:67</title>
		<link rel="alternate" type="text/html" href="https://sublinear.info/index.php?title=Open_Problems:67&amp;diff=786"/>
		<updated>2014-06-05T05:58:27Z</updated>

		<summary type="html">&lt;p&gt;132.76.50.6: &lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;{{Header&lt;br /&gt;
|title=Difficult Instance for Max-Cut in the Streaming Model&lt;br /&gt;
|source=bertinoro14&lt;br /&gt;
|who=Robert Krauthgamer&lt;br /&gt;
}}&lt;br /&gt;
&lt;br /&gt;
We are interested in Max-Cut in the streaming model, and specifically in the tradeoff between approximation and space (storage) complexity. &lt;br /&gt;
Formally, in the '''Max-Cut''' problem, the input is a graph $G$, and the goal is to compute the maximum number of edges that cross any single cut $(V_1,V_2)$. This is equivalent to computing the least number of edges that need to be removed to make the graph bipartite, since the Max-Cut value is exactly $|E(G)|$ minus this quantity.&lt;br /&gt;
In the streaming model, we assume that the input graph is seen as a sequence of edges, in an arbitrary order, and the goal is to compute (or approximate) the Max-Cut value, i.e., the number of edges in a maximum cut. There is no need to report the cut itself. For instance, it is easy to approximate Max-Cut within factor $1/2$ using $O(\log n)$ space, by simply counting the total number of edges in the input and reporting $|E(G)|/2$. &lt;br /&gt;
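As an illustration (not part of the original problem statement), here is a minimal sketch of the factor-$1/2$ approximation just mentioned; the only state maintained is a single edge counter, i.e. $O(\log n)$ bits.

```python
# A minimal sketch, assuming edges arrive as an arbitrary-order stream.
# The only state kept is one edge counter, i.e. O(log n) bits of space.
def estimate_max_cut(edge_stream):
    """Return |E|/2, which 1/2-approximates the Max-Cut value.

    A uniformly random bipartition cuts each edge with probability 1/2,
    so some cut has at least |E|/2 edges; and no cut exceeds |E|.
    """
    m = 0
    for _edge in edge_stream:
        m += 1
    return m / 2
```

For a triangle, for example, the estimate is $1.5$ while the true Max-Cut value is $2$, within the promised factor $1/2$.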
&lt;br /&gt;
Here is a concrete suggestion for a hard input distribution, which is known to be a hard instance for bipartiteness testing in sparse graphs {{cite|GoldreichR-02}}. &lt;br /&gt;
Let $G'$ be a graph consisting of a cycle of length $n$ (where $n$ is even) and a random matching. It is known that with high probability, $G'$ is an expander and at least $0.01 n$ edges have to be removed to turn it into a bipartite graph. Let $G''$ be a graph consisting of a cycle of length $n$ and a random matching, with the constraint that the matching must consist only of ''even chords'': chords with an even number of vertices strictly between their endpoints on the cycle, so that each chord closes an even cycle with either arc. It is easy to see that $G''$ is always bipartite. &lt;br /&gt;
&lt;br /&gt;
Each of $G'$ and $G''$ has exactly $3n/2$ edges. It is easy to see that&lt;br /&gt;
* $G''$ has a cut of size $3n/2$,&lt;br /&gt;
* with high probability, $G'$ has no cut of size greater than $(3/2 - 0.01)n$.&lt;br /&gt;
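The two distributions above can be sketched as follows. This is a hedged illustration: it interprets an ''even chord'' as one with an even number of vertices strictly between its endpoints (i.e., joining vertices of opposite parity on the cycle), and the helper names are invented for this sketch.

```python
import random

def cycle_plus_matching(n, even_chords, seed=0):
    # n must be even; vertices are 0..n-1. Returns the edge list of a
    # cycle plus a random perfect matching (3n/2 edges in total).
    rng = random.Random(seed)
    edges = [(i, (i + 1) % n) for i in range(n)]
    if even_chords:
        # "Even chords" (our interpretation): match even-parity vertices
        # to odd-parity ones, so every chord joins the two sides of the
        # natural 2-coloring of the cycle -- this is the G'' family.
        evens = list(range(0, n, 2))
        odds = list(range(1, n, 2))
        rng.shuffle(odds)
        edges += list(zip(evens, odds))
    else:
        # Unconstrained random matching -- this is the G' family.
        verts = list(range(n))
        rng.shuffle(verts)
        edges += [(verts[i], verts[i + 1]) for i in range(0, n, 2)]
    return edges

def is_bipartite(n, edges):
    # Standard 2-coloring by depth-first search.
    adj = [[] for _ in range(n)]
    for u, v in edges:
        adj[u].append(v)
        adj[v].append(u)
    color = [-1] * n
    for s in range(n):
        if color[s] != -1:
            continue
        color[s] = 0
        stack = [s]
        while stack:
            u = stack.pop()
            for v in adj[u]:
                if color[v] == -1:
                    color[v] = 1 - color[u]
                    stack.append(v)
                elif color[v] == color[u]:
                    return False
    return True
```

Under this interpretation, the $G''$ samples are bipartite (the parity coloring of the cycle is proper for every chord), so all $3n/2$ edges cross the corresponding cut; the $G'$ samples are, with high probability, far from bipartite.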
How much space is required to distinguish between these two graphs in the streaming model? Is it $\Omega(n)$?&lt;br /&gt;
And what about the (multi-round) communication complexity of the problem, where the edges of the input graph are split between two parties, Alice and Bob, who need to jointly estimate the Max-Cut value?&lt;/div&gt;</summary>
		<author><name>132.76.50.6</name></author>
		
	</entry>
</feed>