intuition of metric methods
parent 32ae555c29
commit 9722f3e187
main.tex (+1)
@@ -181,6 +181,7 @@ To get a $O(\sqrt{\log n})$ (randomized) approximation algorithm we need to firs
\]
This is the framework of the proof in \cite{arora_expander_2004}.
I think the intuition behind this SDP relaxation is almost the same as for \metric{}: $\ell_1$ metrics are good because they lie in the cut cone. If we further require the metric in \metric{} to be an $\ell_1$ metric in $\R^d$, the resulting LP is still a relaxation of \nonuscut{}, and its integrality gap is upper-bounded by the metric embedding theorem. \cite{leighton_multicommodity_1999} showed that the $\Theta(\log n)$ gap is tight for \metric{}, but adding extra constraints to \metric{} (while keeping it a relaxation of \scut{} and polynomially solvable) may yield a better gap. The SDP relaxation is in fact trying to enforce that the metric be $\ell_2^2$ in $\R^n$.
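Concretely (a sketch in my own notation; the document's exact SDP may differ), enforcing an $\ell_2^2$ metric means assigning a vector $v_i \in \R^n$ to each vertex $i$ and requiring the squared Euclidean distances to satisfy the triangle inequality:
\[
\|v_i - v_j\|_2^2 + \|v_j - v_k\|_2^2 \;\ge\; \|v_i - v_k\|_2^2 \qquad \text{for all } i, j, k,
\]
so that $d(i,j) = \|v_i - v_j\|_2^2$ is a metric of negative type. These constraints are linear in the inner products $\langle v_i, v_j \rangle$, so the program remains a polynomially solvable SDP.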
\bibliographystyle{alpha}
\bibliography{ref}