what's next?...

Yu Cong 2025-05-21 23:28:36 +08:00
parent 56a0c84bce
commit 4e8f62a293
3 changed files with 25 additions and 1 deletion

main.pdf
Binary file not shown.


@@ -184,6 +184,14 @@ This is the framework of the proof in \cite{arora_expander_2004}.
I think the intuition behind this SDP relaxation is almost the same as \metric{}. $\ell_1$ metrics are good since they are in the cut cone. However, if we further require the metric in \metric{} to be an $\ell_1$ metric in $\R^d$, then the resulting program is NP-hard to solve, since the integrality gap becomes 1.
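As a reminder of why the cut cone matters (a standard fact, not spelled out above): a finite metric embeds isometrically into $\ell_1$ iff it is a nonnegative combination of cut metrics,
\[
d(u,v)=\sum_{S\subseteq V}\lambda_S\,\delta_S(u,v),\qquad \lambda_S\ge 0,\qquad \delta_S(u,v)=|\mathbf{1}_S(u)-\mathbf{1}_S(v)|,
\]
and, assuming \metric{} minimizes a ratio of linear functions of $d$, the minimum over such combinations is attained at a single cut metric $\delta_S$; this is why restricting to $\ell_1$ metrics closes the gap.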
\cite{leighton_multicommodity_1999} showed that the $\Theta(\log n)$ gap is tight for \metric{}, but adding extra constraints to \metric{} (while keeping it a polynomially solvable relaxation of \scut{}) may provide a better gap. The SDP relaxation is in fact trying to enforce the metric to be $\ell_2^2$ in $\R^n$.
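For concreteness, a minimal sketch of this relaxation, written here for uniform \scut{} with unit capacities (the normalization is my choice and may differ from the one in \cite{arora_expander_2004}):
\[
\min \sum_{uv\in E}\|x_u-x_v\|^2 \quad\text{s.t.}\quad \sum_{u<v}\|x_u-x_v\|^2=1,\quad \|x_u-x_w\|^2\le\|x_u-x_v\|^2+\|x_v-x_w\|^2\ \ \forall u,v,w,\quad x_u\in\R^n.
\]
The squared triangle inequalities are exactly the $\ell_2^2$ (negative type) condition, and the program is an SDP in the Gram matrix of the vectors $x_u$.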
\cite{arora_euclidean_2005} proved that there is an embedding from $\ell_2^2$ to $\ell_1$ with distortion $O(\sqrt{\log n}\log \log n)$. This implies an approximation for \nonuscut{} with the same ratio. $O(\sqrt{\log n})$ is likely to be the optimal bound for the above SDP. To get a better gap, one can stay with the SDP and add more constraints (like the Sherali-Adams, Lovász-Schrijver and Lasserre relaxations), or treat the distances as variables in an LP and force feasible solutions to be a certain kind of metric. \cite{arora_towards_2013} follows the former method and considers Lasserre relaxations. For the latter method, getting a cut from the optimal metric is the same as embedding it into $\ell_1$, so it still relies on progress in metric embedding theory. Note that both methods need to satisfy the following:
\begin{enumerate}
\item the further constrained program is polynomially solvable,
\item it remains a relaxation of \scut{},
\item the gap is better.
\end{enumerate}
The Lasserre relaxation of the SDP automatically satisfies 1 and 2. But I believe there may be some very strange kind of metric that still embeds into $\ell_1$ well?
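To spell out why the latter method reduces to embedding (standard reasoning, written for the uniform-demand case): if every metric $d$ feasible for the relaxation admits an $\ell_1$ metric $f(d)$ with $d\le f(d)\le D\cdot d$, then
\[
\min_{S\subseteq V}\frac{\sum_{uv\in E}\delta_S(u,v)}{\sum_{u<v}\delta_S(u,v)}
\le \frac{\sum_{uv\in E}f(d)(u,v)}{\sum_{u<v}f(d)(u,v)}
\le D\cdot\frac{\sum_{uv\in E}d(u,v)}{\sum_{u<v}d(u,v)},
\]
where the first inequality uses the cut-cone decomposition of $f(d)$; hence such a relaxation yields a $D$-approximation.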
\bibliographystyle{alpha}
\bibliography{ref}
\end{document}

ref.bib

@@ -317,4 +317,20 @@ series = {SODA '95}
year = {2013},
pages = {295--305},
}
@book{Williamson_Shmoys_2011,
address = {Cambridge},
title = {The Design of Approximation Algorithms},
publisher = {Cambridge University Press},
author = {Williamson, David P. and Shmoys, David B.},
year = {2011},
}
@inproceedings{arora_towards_2013,
address = {Berkeley, CA, USA},
title = {Towards a {Better} {Approximation} for {Sparsest} {Cut}?},
isbn = {978-0-7695-5135-7},
url = {http://ieeexplore.ieee.org/document/6686163/},
doi = {10.1109/FOCS.2013.37},
language = {en},
urldate = {2025-05-09},
booktitle = {2013 {IEEE} 54th {Annual} {Symposium} on {Foundations} of {Computer} {Science}},
publisher = {IEEE},
author = {Arora, Sanjeev and Ge, Rong and Sinop, Ali Kemal},
month = oct,
year = {2013},
pages = {270--279},
}