For $\ell_2$ the lower bound is still $\Omega(\log n)$.
Recall that we want to find an $(O(k),(1+\e)c)$-outlier embedding into $\ell_2$ for any metric space $(X,d)$ that admits a $(k,c)$-outlier embedding into $\ell_2$. If we could do this deterministically, we would actually find an embedding of the outlier points into $\ell_2$ \sout{with distortion $O(k)$, which contradicts the lower bound}. This is not true! The $\log k$ factor is required by the SDP, and only the expansion bound is needed; we do not have to bound the contraction. However, maybe we can achieve $O(k)$ via embedding into some distribution over $\ell_2$ metrics.
\begin{definition}[Expected distortion]
Let $(X,d)$ be the original metric space and let $\mathcal Y=\{ (Y_1,d_1),\ldots, (Y_h,d_h) \}$ be a set of target spaces. Let $\pi$ be a distribution over embeddings into $\mathcal Y$: for each target space $(Y_i,d_i)$ we fix an embedding $\alpha_i:X\to Y_i$ and choose it with probability $p_i$. The metric space $(X,d)$ embeds into $\pi$ with distortion $D$ if there is an $r>0$ such that for all $x,y\in X$,
\[r\leq \frac{\E_{i\from \pi} [d_i(\alpha_i(x),\alpha_i(y))]}{d(x,y)}\leq Dr.\]
\end{definition}
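
For intuition, here is a standard example (not from the paper): let $C_n$ be the $n$-cycle with its shortest-path metric. Embedding $C_n$ into any single tree metric requires distortion $\Omega(n)$, yet there is a stochastic embedding into path metrics with expected distortion below $2$: delete an edge $e$ of the cycle chosen uniformly at random and embed $C_n$ identically into the resulting path. No distance contracts, and a pair $x,y$ at cycle distance $\ell$ is stretched only when $e$ falls on its shorter arc (probability $\ell/n$), in which case the distance becomes $n-\ell$, so
\[\E_e\big[d_{C_n\setminus e}(x,y)\big]=\frac{\ell}{n}(n-\ell)+\Big(1-\frac{\ell}{n}\Big)\ell=\frac{2\ell(n-\ell)}{n}<2\,d(x,y).\]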
Note that if we compute the distortion $D$ separately for each pair $x,y$ and take the average over all pairs, the resulting value is called the average distortion.\footnote{\url{https://www.cs.huji.ac.il/w~ittaia/papers/ABN-STOC06.pdf}} There is an embedding into $\ell_p$ with constant average distortion for arbitrary metric spaces, while maintaining the same worst-case bound provided by Bourgain's theorem.
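
As a concrete convention (one common formalization; the exact definition in the linked paper may differ): for a non-contracting embedding $f:(X,d)\to(Y,d_Y)$, the average distortion is
\[\frac{1}{\binom{|X|}{2}}\sum_{\substack{\{x,y\}\subseteq X\\ x\neq y}}\frac{d_Y(f(x),f(y))}{d(x,y)},\]
whereas the worst-case distortion is the maximum of the same ratio over all pairs.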

The outlier paper (SODA23) also embeds $(X,d)$ into a distribution; we call this kind of embedding a stochastic embedding.
\begin{lemma}
Let $\pi$ be a stochastic embedding into $\ell_p$ with expected expansion bound $\E_{i\from \pi}\|\alpha_i(x)-\alpha_i(y)\|_p\leq c_{\E}\,d(x,y)$ for all $x,y\in X$. Then there is a deterministic embedding into $\ell_p$ with the same expansion bound.
\end{lemma}
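
One natural way to prove this is by weighted concatenation; the following is a sketch (for finite $p$) and may differ from the paper's argument. Write $\pi=\{(\alpha_i,p_i)\}_{i=1}^h$ and define $\beta(x)=\bigoplus_{i=1}^h p_i\,\alpha_i(x)$, the direct sum of the scaled embeddings. Since $\|v\|_p\leq\|v\|_1$ for $p\geq 1$, for all $x,y\in X$,
\[\|\beta(x)-\beta(y)\|_p=\Big(\sum_{i=1}^h p_i^p\,\|\alpha_i(x)-\alpha_i(y)\|_p^p\Big)^{1/p}\leq\sum_{i=1}^h p_i\,\|\alpha_i(x)-\alpha_i(y)\|_p=\E_{i\from\pi}\|\alpha_i(x)-\alpha_i(y)\|_p\leq c_{\E}\,d(x,y).\]
Note that this preserves only the expansion bound; the contraction of $\beta$ is not controlled by the expected contraction of $\pi$.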