diff --git a/distribution.pdf b/distribution.pdf
index d9f41da..e8f9c93 100644
Binary files a/distribution.pdf and b/distribution.pdf differ
diff --git a/distribution.tex b/distribution.tex
index 5792c09..a3a5ac6 100644
--- a/distribution.tex
+++ b/distribution.tex
@@ -6,20 +6,20 @@ \begin{document}

 \section{Better Distortion with Distribution}

-There is a well known lowerbound for the distortion of embedding a finite metric $(X,d)$ into $\ell_1$.
+There is a well-known lower bound for the distortion of embedding a metric space $(X,d)$ into $\ell_1$.

 \begin{theorem}
-For any finite metric $(X,d)$ on $n$ points, one has
+For any metric space $(X,d)$ on $n$ points, one has
 \[(X,d) \lhook\joinrel\xrightarrow{\Omega(\log n)} \ell_1. \]
 \end{theorem}

 For $\ell_2$ the lower bound is still $\Omega(\log n)$\footnote{\url{https://web.stanford.edu/class/cs369m/cs369mlecture1.pdf}}.

-Recall that we want to find a $(O(k),(1+\e)c)$-outlier embedding into $\ell_2$ for any finite metric $(X,d)$ which admits a $(k,c)$-outlier embedding into $\ell_2$. If we can do this deterministically, we actually find an embedding of the outlier points into $\ell_2$ with distortion $O(k)$, which contradicts the lowerbound. However, maybe we can do $O(k)$ via embedding into some distribution of $\ell_2$ metrics.
+Recall that we want to find an $(O(k),(1+\e)c)$-outlier embedding into $\ell_2$ for any metric space $(X,d)$ that admits a $(k,c)$-outlier embedding into $\ell_2$. If we could do this deterministically, we would obtain an embedding of the outlier points into $\ell_2$ with distortion $O(k)$, contradicting the lower bound. However, perhaps we can achieve $O(k)$ by embedding into a distribution of $\ell_2$ metrics.

-Let $(X,d)$ be a finite metric and let $\mathcal Y=\{ (Y_1,d_1),\ldots (Y_h,d_h) \}$ be a set of metrics where $|X|=|Y|=n$. Let $\pi$ be a distribution on $\mathcal Y$. The original metric $(X,d)$ embeds into $\pi$ with distortion $D$ if there is an $r>0$ such that for all $x,y\in X$,
-\[r\leq \frac{\E[d_i(x,y)]}{d(x,y)}\leq Dr.\]
+Let $(X,d)$ be a finite metric space, let $\mathcal Y=\{ (Y_1,d_1),\ldots, (Y_h,d_h) \}$ be a set of metric spaces, and let $\alpha_i\colon X\to Y_i$ be a map for each $i$. Let $\pi$ be a distribution on $\mathcal Y$. The original metric space $(X,d)$ embeds into $\pi$ with distortion $D$ if there is an $r>0$ such that for all $x,y\in X$,
+\[r\leq \frac{\E_{i\from \pi} [d_i(\alpha_i(x),\alpha_i(y))]}{d(x,y)}\leq Dr.\]

 The SODA23 paper also embeds $(X,d)$ into a distribution.

@@ -27,7 +27,7 @@ SODA23 paper also embeds $(X,d)$ into distribution.

 Consider the problem of embedding a finite metric into a tree metric. We can get an $\Omega(n)$ lower bound via the unit edge length cycle $C_n$. However, if embedding into a distribution of tree metrics is allowed, we can achieve $O(\log n)$.

 \begin{theorem}[Bartal]
-Let $(X,d)$ be a metric on $n$ points with diameter $\Delta$, let $\mathcal D T$ be the set of tree metrics that dominate $d$, there is a distribution $\pi$ on $\mathcal D T$ such that $(X,d)$ embeds into $pi$ with distortion $O(\log n\log \Delta)$.
+Let $(X,d)$ be a metric space on $n$ points with diameter $\Delta$, and let $\mathcal{DT}$ be the set of tree metrics that dominate $d$. Then there is a distribution $\pi$ on $\mathcal{DT}$ such that $(X,d)$ embeds into $\pi$ with distortion $O(\log n\log \Delta)$.
 \end{theorem}
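
To illustrate the distortion-into-a-distribution definition above (and why the cycle obstruction disappears once distributions of tree metrics are allowed), here is a standard worked example sketched in the same LaTeX notation as the notes. It assumes the preamble macro \E from the diff above; it is an illustration added for intuition, not a claim from the SODA23 paper or Bartal's theorem.

% Worked example (illustration only): the cycle into random spanning paths.
Let $(X,d)$ be the unit edge length cycle $C_n$, and let $\pi$ be the uniform
distribution over the $n$ spanning paths of $C_n$ (delete one edge uniformly at
random), each equipped with its shortest path metric $d_T$ and the identity map
$\alpha_T=\mathrm{id}$. Every such $d_T$ dominates $d$, since deleting an edge can
only increase distances. Fix $x,y\in X$ with $d(x,y)=k\leq n/2$ and fix a shortest
arc between them (it has $k$ edges). The deleted edge lies on this arc with
probability $k/n$, in which case $d_T(x,y)=n-k$; otherwise $d_T(x,y)=k$. Hence
\[\E_{T\sim\pi}[d_T(x,y)]=\frac{k}{n}(n-k)+\frac{n-k}{n}\,k=\frac{2k(n-k)}{n},\]
so for every pair $x,y$,
\[1\leq \frac{\E_{T\sim\pi}[d_T(x,y)]}{d(x,y)}\leq 2,\]
i.e.\ $C_n$ embeds into $\pi$ with distortion at most $2$ (taking $r=1$), even though
embedding $C_n$ into any single tree metric requires distortion $\Omega(n)$.

This matches the pattern in Bartal's theorem: every tree metric in the support dominates $d$, and only the expectation is required to stay close to $d$.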