18 log n lemma

2025-07-22 00:17:32 +08:00
parent 54439684d9
commit 5502e7d20d


@@ -39,11 +39,11 @@ We first ignore the outlier condition and see if stochastic embeddings break the
For any metric space $(X,d)$ and for any $p$, there is an embedding of $(X,d)$ into $\ell_p^{O(\log^2 n)}$ with distortion $O(\log n)$.
\end{theorem}
Bourgain develops a randomized algorithm that finds a desired embedding.\footnote{The expansion bound always holds. The contraction bound holds with probability at least $1/2$. See \url{https://home.ttic.edu/~harry/teaching/pdf/lecture3.pdf}} For the $\ell_2$ case, the embedding has the following bounds:
\begin{enumerate}
\item Expansion. $\|f(x)-f(y)\|_2\leq O(\log n)\, d(x,y)$
\item Contraction. $\|f(x)-f(y)\|_2 \geq \frac{d(x,y)}{O(1)}$
\end{enumerate}
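To make the construction concrete, the following is a minimal Python sketch of the randomized embedding; the scale range, the \texttt{dist} interface, and the empty-sample convention are illustrative assumptions, while the coordinate form $f_i(x)=\min_{s\in S_i} d(x,s)$ and the repetition count $m=576\log n$ are recalled below.
\begin{verbatim}
import math
import random

def bourgain_embedding(points, dist, seed=0):
    """Sketch: one coordinate per sampled set S, with
    f_S(x) = min over s in S of dist(x, s)."""
    rng = random.Random(seed)
    n = len(points)
    m = max(1, round(576 * math.log(n)))   # repetitions per scale
    top = max(1, math.ceil(math.log2(n)))  # largest scale (assumption)
    coords = {x: [] for x in points}
    for j in range(1, top + 1):            # sampling probability 2^-j
        for _ in range(m):
            S = [v for v in points if rng.random() < 2.0 ** -j]
            for x in points:
                # empty-sample convention: distance 0 (illustrative
                # choice; the analysis conditions on nonempty S)
                coords[x].append(min((dist(x, s) for s in S),
                                     default=0.0))
    return coords
\end{verbatim}
With $|X|=n$ this produces $O(\log^2 n)$ coordinates, matching the target dimension in the theorem.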
The contraction bound is almost tight. Let $K$ be the dimension of the target space. For the expansion bound, we have
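(a sketch of the standard computation, assuming the coordinates $f_i(x)=\min_{s\in S_i} d(x,s)$ recalled below, each of which is $1$-Lipschitz by the triangle inequality:)
\begin{align*}
\|f(x)-f(y)\|_2^2 &= \sum_{i=1}^{K} |f_i(x)-f_i(y)|^2\\
&\leq \sum_{i=1}^{K} d(x,y)^2 = K\, d(x,y)^2,
\end{align*}
so $\|f(x)-f(y)\|_2 \leq \sqrt{K}\, d(x,y) = O(\log n)\, d(x,y)$ once $K=O(\log^2 n)$; the inequality in the second line is presumably the step referred to next.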
@@ -58,8 +58,16 @@ The contraction bound is almost tight. Let $K$ be the dimension of the target sp
One thing we can try is to tighten the second line.
Recall that for each dimension $i$ a random subset $S_i\subset X$ is selected and the value of $f_i(x)$ is $\min_{s\in S_i} d(x,s)$.
We want to show that for any fixed $x,y\in X$ and any dimension $i$, the event that $|f_i(x)-f_i(y)|^2$ is much smaller than $d(x,y)^2$ happens with high probability.
Now consider a subset $S_j$ obtained by sampling each node in $X$ i.i.d.\ with probability $2^{-j}$. We independently repeat this process $m=576\log n$ times and denote by $S_{ij}$ the sampled set for $i\in [m]$. A~free lemma is the following.
\begin{lemma}
For fixed $x,y\in X$ and $j$,
\[
\Pr[\text{for at least $18\log n$ values of $i$, $|f_{ij}(x)-f_{ij}(y)|\geq (\rho_j -\rho_{j-1})$}]\geq 1-\frac{1}{n^3},
\]
where $\rho_j$ is the smallest radius for which $|B(x,\rho_j)|\geq 2^j$ and $|B(y,\rho_j)|\geq 2^j$.
\end{lemma}
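The lemma is free in the sense that it follows from a Chernoff bound alone. A sketch, under the assumption (the constant is taken from the standard analysis and is not stated above) that a single trial achieves $|f_{ij}(x)-f_{ij}(y)|\geq \rho_j-\rho_{j-1}$ with probability at least $1/12$: the number $Z$ of successful trials among the $m=576\log n$ independent repetitions has mean $\mu\geq 48\log n$, so taking $\delta=5/8$,
\[
\Pr[Z < 18\log n] \leq \Pr[Z < (1-\delta)\mu] \leq e^{-\delta^2\mu/2} \leq e^{-\frac{75}{8}\log n} \leq \frac{1}{n^3},
\]
reading $\log$ as the natural logarithm.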
\end{document}