From 8627ece41afd1f9a279b8d3ddc9e04955989ca7c Mon Sep 17 00:00:00 2001
From: Yu Cong
Date: Tue, 22 Jul 2025 14:17:07 +0800
Subject: [PATCH] better definition

---
 distribution.tex | 8 ++++++--
 1 file changed, 6 insertions(+), 2 deletions(-)

diff --git a/distribution.tex b/distribution.tex
index 1182e42..85a3b55 100644
--- a/distribution.tex
+++ b/distribution.tex
@@ -18,8 +18,10 @@ For $\ell_2$ the lowerbound is still $\Omega(\log n)$
 Recall that we want to find a $(O(k),(1+\e)c)$-outlier embedding into $\ell_2$ for any metric space $(X,d)$ which admits a $(k,c)$-outlier embedding into $\ell_2$. If we can do this deterministically, we actually find an embedding of the outlier points into $\ell_2$ with distortion $O(k)$, which contradicts the lower bound. However, maybe we can do $O(k)$ via embedding into some distribution of $\ell_2$ metrics.

-Let $(X,d)$ be a finite metric space and let $\mathcal Y=\{ (Y_1,d_1),\ldots (Y_h,d_h) \}$ be a set of metric spaces. Let $\pi$ be a distribution of embeddings into $\mathcal Y$. The original metric space $(X,d)$ embeds into $\pi$ with distortion $D$ if there is an $r>0$ such that for all $x,y\in X$,
+\begin{definition}[Expected distortion]
+Let $(X,d)$ be the original metric space and let $\mathcal Y=\{ (Y_1,d_1),\ldots, (Y_h,d_h) \}$ be a set of target spaces. Let $\pi$ be a distribution over embeddings into $\mathcal Y$: for each target space $(Y_i,d_i)$ we fix an embedding $\alpha_i:X\to Y_i$ and let $p_i$ be the probability of choosing it. The original metric space $(X,d)$ embeds into $\pi$ with distortion $D$ if there is an $r>0$ such that for all $x,y\in X$,
 \[r\leq \frac{\E_{i\from \pi} [d_i(\alpha_i(x),\alpha_i(y))]}{d(x,y)}\leq Dr.\]
+\end{definition}

 SODA23 paper also embeds $(X,d)$ into distribution. We call this kind of embeddings stochastic embedding.
@@ -27,9 +29,11 @@ SODA23 paper also embeds $(X,d)$ into distribution. We call this kind of embeddi
 Consider the problem of embedding some finite metric into a tree metric. We can get an $O(n)$ lower bound via the unit edge length cycle $C_n$. However, if embedding into distributions is allowed, we can do $O(\log n)$.

 \begin{theorem}[Bartal]
-Let $(X,d)$ be a metric space on $n$ points with diameter $\Delta$, let $\mathcal D T$ be the set of tree metrics that dominate $d$, there is a distribution $\pi$ on $\mathcal D T$ such that $(X,d)$ embeds into $\pi$ with distortion $O(\log n)$.
+Let $(X,d)$ be a metric space on $n$ points and let $\mathcal D T$ be the set of tree metrics that dominate $d$. There is a distribution $\pi$ on $\mathcal D T$ such that $(X,d)$ embeds into $\pi$ with distortion $O(\log n)$.
 \end{theorem}

+Is there any other known result on expected distortion of embeddings besides Bartal's theorem?
+
 % A kind of embedding problems which are closely related to outlier embeddings is Ramsey type embedding. Let $(X,d_X)$ be the original metric space and let $(Y,d_Y)$ be the target space. Given a fixed distortion $c$, Ramsey type embedding asks for the largest subset $Z$ of $X$ such that $(Z,d_X)$ embeds into $(Y,d_Y)$ with distortion at most $c$. This is the same as computing the smallest outlier set.

 \section{Stochastic Embedding into \texorpdfstring{$\ell_2$}{l2}}
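
A worked example may help ground the new definition; the following is a sketch (not part of the patch), using the folklore random-spanning-path embedding of the cycle, and assuming the \E and \from macros from distribution.tex and an example theorem environment:

% Assumes \E and \from are defined in distribution.tex, and that an
% {example} theorem environment is available; both are hypothetical here.
\begin{example}[$C_n$ into a distribution of dominating tree metrics]
Let $(X,d)$ be the unit edge length cycle $C_n$. For $i=1,\ldots,n$ let
$(Y_i,d_i)$ be the path obtained by deleting the $i$-th edge of the cycle,
let $\alpha_i$ be the identity map on the vertices, and let $p_i=1/n$.
Each $d_i$ dominates $d$, so the ratio in the definition is always at
least $1$ and we may take $r=1$. For a pair $x,y$ with $d(x,y)=k$, the
deleted edge lies on the shorter arc between $x$ and $y$ with probability
$k/n$, in which case $d_i(\alpha_i(x),\alpha_i(y))=n-k$; otherwise the
distance stays $k$. Hence
\[\E_{i\from \pi}[d_i(\alpha_i(x),\alpha_i(y))]
 =\frac{k}{n}(n-k)+\frac{n-k}{n}\,k
 =\frac{2k(n-k)}{n}\leq 2\,d(x,y),\]
so $C_n$ embeds into this distribution $\pi$ with expected distortion
$D\leq 2$, even though embedding $C_n$ into any single tree metric
requires distortion $\Omega(n)$.
\end{example}

This also illustrates why the question after Bartal's theorem is natural: the gap between a single target space and a distribution over target spaces can be as large as $\Omega(n)$ versus $O(1)$.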