first commit

2025-04-24 13:11:28 +08:00
commit ff9c54d5e4
5960 changed files with 834111 additions and 0 deletions

View File

@@ -0,0 +1,404 @@
% Choose pra, prb, prc, prd, pre, prl, prstab, or rmp for journal
% Add 'draft' option to mark overfull boxes with black boxes
% Add 'showpacs' option to make PACS codes appear
% for review and submission
%\documentclass[aps,preprint,showpacs,superscriptaddress,groupedaddress]{revtex4} % for double-spaced preprint
% needed for figures
% needed for some tables
% for math
% for math
% for crossing out text
% for coloring text
%\input{tcilatex}
\documentclass[aps,prl,twocolumn,showpacs,superscriptaddress,groupedaddress]{revtex4}
\usepackage{graphicx}
\usepackage{dcolumn}
\usepackage{bm}
\usepackage{amssymb}
\usepackage{soul}
\usepackage{color}
%TCIDATA{OutputFilter=LATEX.DLL}
%TCIDATA{Version=5.50.0.2960}
%TCIDATA{<META NAME="SaveForMode" CONTENT="1">}
%TCIDATA{BibliographyScheme=BibTeX}
%TCIDATA{LastRevised=Tuesday, May 20, 2014 03:06:00}
%TCIDATA{<META NAME="GraphicsSave" CONTENT="32">}
\hyphenation{ALPGEN}
\hyphenation{EVTGEN}
\hyphenation{PYTHIA}
\def\be{\begin{equation}}
\def\ee{\end{equation}}
\def\bea{\begin{eqnarray}}
\def\eea{\end{eqnarray}}
%\input{tcilatex}
\begin{document}
\title{Transport measurements of the spin wave gap of Mn}
\input author_list.tex
\date{\today}
\begin{abstract}
Temperature dependent transport measurements on ultrathin antiferromagnetic
Mn films reveal a heretofore unknown non-universal weak localization
correction to the conductivity which extends to disorder strengths greater than
100~k$\Omega$ per square. The inelastic scattering of electrons off of
gapped antiferromagnetic spin waves gives rise to an inelastic scattering
length which is short enough to place the system in the 3D regime. The
extracted fitting parameters provide estimates of the energy gap ($\Delta
\approx$~16~K) and exchange energy ($\bar{J} \approx$~320~K). %\st{which are in
%agreement with values obtained with other techniques}.
\end{abstract}
\pacs{75}
\maketitle
Hello world
Thin-film transition metal ferromagnets (Fe, Co, Ni, Gd) and
antiferromagnets (Mn, Cr) and their alloys are not only ubiquitous in
present day technologies but are also expected to play an important role in
future developments~\cite{thompson_2008}. Understanding magnetism in these
materials, especially when the films are thin enough so that disorder plays
an important role, is complicated by the long standing controversy about the
relative importance of itinerant and local moments~\cite%
{slater_1936,van_vleck_1953,aharoni_2000}. For the itinerant transition
metal magnets, a related fundamental issue centers on the question of how
itinerancy is compromised by disorder. Clearly with sufficient disorder the
charge carriers become localized, but questions arise as to what happens to
the spins and associated spin waves and whether the outcome depends on the
ferro/antiferro alignment of spins in the itinerant parent. Ferromagnets
which have magnetization as the order parameter are fundamentally different
than antiferromagnets which have staggered magnetization (i.e., difference
between the magnetization on each sublattice) as the order parameter~\cite%
{blundell_2001}. Ferromagnetism thus distinguishes itself by having soft
modes at zero wave number whereas antiferromagnets have soft modes at finite
wave number~\cite{belitz_2005}. Accordingly, the respective spin wave
spectra are radically different. These distinctions are particularly
important when comparing quantum corrections to the conductivity near
quantum critical points for ferromagnets~\cite{paul_2005} and
antiferromagnets~\cite{syzranov_2012}.
Surprisingly, although there have been systematic studies of the effect of
disorder on the longitudinal $\sigma_{xx}$ and transverse $\sigma_{xy}$
conductivity of ferromagnetic films~\cite%
{bergmann_1978,bergmann_1991,mitra_2007,misra_2009,kurzweil_2009}, there
have been few if any such studies on antiferromagnetic films. In this paper
we remedy this situation by presenting transport data on systematically
disordered Mn films that are sputter deposited in a custom designed vacuum
chamber and then transferred without exposure to air into an adjacent
cryostat for transport studies to low temperature. The experimental
procedures are similar to those reported previously: disorder, characterized
by the sheet resistance $R_0$ measured at $T=$~5~K, can be changed either by
growing separate samples or by gentle annealing of a given sample through
incremental stages of disorder~\cite{misra_2011}. Using these same
procedures, we find that our results for antiferromagnets are decidedly
different. The data are well
described over a large range of disorder strengths by a non-universal three
dimensional (3d) quantum correction that applies only to spin wave gapped
antiferromagnets. This finding implies the presence of strong inelastic
electron scattering off of antiferromagnetic spin waves. The theory is
validated not only by good fits to the data but also by extraction from the
fitting parameters of a value for the spin wave gap $\Delta$ that is in
agreement with the value expected for Mn. On the other hand, the
exchange energy $\bar{J}$ could be sensitive to the high disorder in our
ultrathin films, and it turns out to be much smaller than the known values.
In previous work the inelastic scattering of electrons off of spin waves has
been an essential ingredient in understanding disordered ferromagnets. For
example, to explain the occurrence of weak-localization corrections to the
anomalous Hall effect in polycrystalline Fe films~\cite{mitra_2007}, it was
necessary to invoke a contribution to the inelastic phase breaking rate $%
\tau_{\varphi}^{-1}$ due to spin-conserving inelastic scattering off
spin-wave excitations. This phase breaking rate, anticipated by theory~\cite%
{tatara_2004} and seen experimentally in spin polarized electron energy loss
spectroscopy (SPEELS) measurements of ultrathin Fe films~\cite%
{plihal_1999,zhang_2010}, is linear in temperature and significantly larger
than the phase breaking rate due to electron-electron interactions, thus
allowing a wide temperature range to observe weak localization corrections~%
\cite{mitra_2007}. The effect of a high $\tau_{\varphi}^{-1}$ due to
inelastic scattering off spin-wave excitations is also seen in Gd films
where in addition to a localizing log($T$) quantum correction to the
conductance, a localizing linear-in-$T$ quantum correction is present and is
interpreted as a spin-wave mediated Altshuler-Aronov type correction to the
conductivity~\cite{misra_2009}.
Interestingly, this high rate of inelastic spin-wave scattering becomes even
more important for the thinnest films as shown in theoretical calculations
on Fe and Ni which point to extremely short spin-dependent inelastic mean
free paths~\cite{hong_2000} and in SPEELS measurements on
few-monolayer-thick Fe/W(110) films in
which a strong nonmonotonic enhancement of localized spin wave energies is
found on the thinnest films~\cite{zhang_2010}.
Inelastic spin wave scattering in highly disordered ferromagnetic films can
be strong enough to assure that the associated $T$-dependent dephasing
length $L_{\varphi }(T)=\sqrt{D\tau _{\varphi }}$ (with $D$ the diffusion
constant)~\cite{lee_1985} is less than the film thickness $t$, thus putting
thin films into the 3d limit where a metal-insulator
transition is observed~\cite{misra_2011}. Recognizing that similarly high
inelastic scattering rates must apply to highly disordered antiferromagnetic
films, we first proceed with a theoretical approach that takes into account
the scattering of antiferromagnetic spin waves on the phase relaxation rate
and find a heretofore unrecognized non-universal 3d weak localization
correction to the conductivity that allows an interpretation of our experimental
results.
We mention in passing that the 3d interaction-induced quantum correction,
found to be dominant in the case of ferromagnetic Gd films which undergo a
metal-insulator transition~\cite{misra_2011}, is much smaller in the present
case and will not be considered further (for an estimate of this
contribution, see Ref.~\cite{muttalib_unpub}).
As discussed in detail in Ref.~[\onlinecite{wm10}], the phase relaxation
time $\tau _{\varphi }$ limits the phase coherence in a particle-particle
diffusion propagator $C(q,\omega )$ (Cooperon) in the form
\begin{equation}
C(q,\omega _{l})=\frac{1}{2\pi N_{0}\tau ^{2}}\frac{1}{Dq^{2}+|\omega
_{l}|+1/\tau _{\varphi }},
\end{equation}
where $N_{0}$ is the density of states at the Fermi level, $\tau $ is the
elastic scattering time and $\omega _{l}=2\pi lT$ is the Matsubara
frequency. Labeling the Cooperon propagator in the absence of interactions
as $C_{0}$, we can write
\begin{equation}
\frac{1}{\tau _{\varphi }}=\frac{1}{2\pi N_{0}\tau ^{2}}[C^{-1}-C_{0}^{-1}].
\end{equation}
In general, $C(q,\omega )$ can be evaluated diagrammatically in the presence
of interactions and disorder in a ladder approximation \cite{fa} that can be
symbolically written as $C=C_{0}+C_{0}KC$ where the interaction vertex $K$
contains self-energy as well as vertex corrections due to both interactions
and disorder. Since $C=C_{0}+C_{0}KC$ implies $C^{-1}=C_{0}^{-1}-K$, it
follows that $1/\tau _{\varphi }$ is given by
\begin{equation}
\frac{1}{\tau _{\varphi }}=-\frac{1}{2\pi N_{0}\tau ^{2}}K.
\end{equation}%
In Ref.~[\onlinecite{wm10}], the leading temperature and disorder dependence
of the inelastic diffusion propagator was evaluated diagrammatically, in the
presence of ferromagnetic spin-wave mediated electron-electron interactions.
Here we consider the antiferromagnetic case. We only consider large
spin-wave gap where the damping can be ignored. Using the antiferromagnetic
dispersion relation $\omega _{q}=\Delta +Aq$, where $A$ is the spin
stiffness, the inelastic lifetime is given by
\be
\frac{\hbar }{\tau _{\varphi }}=\frac{4}{\pi \hbar }nJ^{2}\int_{0}^{1/l}%
\frac{q^{d-1}dq}{\sinh \beta \omega _{q}}\frac{Dq^{2}+1/\tau _{\varphi }}{%
(Dq^{2}+1/\tau _{\varphi })^{2}+\omega _{q}^{2}}
\ee%
where $n=k_{F}^{3}/3\pi ^{2}$ is the 3d density, $J$ is the effective
spin-exchange interaction and $\beta =1/k_{B}T$. Here we will consider the
limit $\hbar /\tau _{\varphi }\ll \Delta $, relevant for our experiment on
Mn. In this limit we can neglect the $1/\tau _{\varphi }$ terms inside the
integral. The upper limit should be restricted to $\Delta /A$ in the limit $%
\Delta /A<1/l$. For large disorder, we expect the parameter $x\equiv
\hbar Dk_{F}^{2}\Delta / \bar{J}^{2}\ll 1$, where the spin-exchange energy
is given by $\bar{J}=Ak_{F}$. In this limit, $L_{\varphi }$ can be
simplified as
\be
k_{F}L_{\varphi }\approx \left( \frac{\bar{J}}{\Delta }\right) ^{3/2}\left(
\frac{5\sinh \frac{\Delta }{T}}{12\pi }\right) ^{1/2},\;\;\;x\ll 1
\label{L-phi-3d}
\ee%
which is independent of $x$, and therefore, independent of disorder.
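For orientation, anticipating the values $\Delta \approx $~16~K and
$\bar{J}\approx $~320~K extracted below, evaluating Eq.~(\ref{L-phi-3d}) at
$T=\Delta $ gives $k_{F}L_{\varphi }\approx 20^{3/2}\left( 5\sinh 1/12\pi
\right) ^{1/2}\approx 35$, i.e., $L_{\varphi }\approx $~2~nm for a
free-electron estimate $k_{F}\approx 1.7\times 10^{10}$~m$^{-1}$, already
comparable to the thickness of our films.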
Given the inelastic lifetime, the weak localization correction in 3d is
usually given by \cite{lee_1985} $\delta \sigma _{3d}=\frac{e^{2}}{\hbar \pi
^{3}}\frac{1}{L_{\varphi }},$ where the prefactor to the inverse inelastic
length is a universal number, independent of disorder. However, at large
enough disorder, we show that there exists a disorder dependent correction,
due to the scale dependent diffusion coefficient near the Anderson
metal-insulator transition. In fact, the diffusion coefficient obeys the
self consistent equation \cite{WV}
\begin{equation}
\frac{D_{0}}{D(\omega )}=1+\frac{k_{F}^{2-d}}{\pi m}\int_{0}^{1/l}dQ\frac{%
Q^{d-1}}{-i\omega +D(\omega )Q^{2}}
\end{equation}%
where $D_{0}=v_{F}l/d$ is the diffusion coefficient at weak disorder. While
the significance of the prefactor to the integral is not clear, the above
equation remains qualitatively accurate over a wide range near the Anderson
transition. Setting $\omega =i/\tau _{\varphi }$ and doing the $Q$-integral
in 3d,
\bea
\frac{D_{0}}{D} &\approx & 1+\frac{1}{\pi mk_{F}}\int_{1/L_{\varphi }}^{1/l}dQ\frac{%
Q^{2}}{DQ^{2}}\cr
&=& 1+\frac{D_{0}}{D}\frac{3}{\pi k_{F}^{2}l^{2}}-\delta
\left( \frac{D_{0}}{D}\right) ,
\label{delta}
\eea%
where
\bea
\delta \equiv \frac{D_{0}}{D}\frac{3}{\pi k_{F}^{2}l^{2}}\frac{l}{%
L_{\varphi }}
\eea
is assumed to be a small correction, and Eq.~(\ref{delta})
should not be solved self-consistently. This follows from the fact that the
diffusion coefficient of electrons at fixed energy entering the Cooperon
expression is that of non-interacting electrons, and is given by the limit $%
T\rightarrow 0$, $L_{\varphi }\rightarrow \infty $ and therefore $\delta
\rightarrow 0$. Then the correction at finite $T$ is given by
\bea
\frac{D}{D_{0}} &=& \frac{1}{\left( \frac{D_{0}}{D}\right) _{0}-\delta \left(
\frac{D_{0}}{D}\right) }\cr
&\approx & \left( \frac{D}{D_{0}}\right) _{0}+\left( \frac{D}{D_{0}}\right) _{0}
\frac{3}{\pi k_{F}^{2}l^{2}}\frac{l}{L_{\varphi }}%
\eea%
where
\be
\lim_{T\rightarrow 0}\frac{D}{D_{0}}\equiv \left( \frac{D}{D_{0}}\right)
_{0}.
\ee%
Using the relation $\sigma _{3d}=(e^{2}/\hbar )nD$ where the longitudinal
sheet conductance $\sigma _{\square }=\sigma _{3d}t$, with $t$ being the
film thickness, we finally get the temperature dependent weak localization
correction term
\bea
\frac{\delta \sigma _{\square }}{L_{00}} &=& \left( \frac{D}{D_{0}}\right) _{0}%
\frac{2}{\pi }\frac{t}{L_{\varphi }}\cr
\left( \frac{D}{D_{0}}\right)_{0} &\approx &\frac{2}{1+\sqrt{1+\frac{4R_{0}^{2}}{a^{2}}}}
\label{WL}
\eea%
where $R_{0}=L_{00}/\sigma _{\square }(T$=$0)$, $L_{00}=e^{2}/\pi h$, $%
a=3\pi /(2k_{F}tb_{0})$, $b_{0}$ is a number of order unity, and we
have solved the self-consistent equation for $D$ in order to express $D_{0}$
in terms of $D$ and finally $R_{0}$. Thus in this case, the weak
localization correction has a prefactor which is not universal. While this
reduces to the well-known universal result at weak disorder $R_{0}\ll a$, it
becomes dependent on disorder characterized by the sheet resistance $R_{0}$
at strong disorder and at the same time substantially extends the 3d regime
near the transition.
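As a consistency check, in the weak-disorder limit $R_{0}\ll a$ the
prefactor $\left( D/D_{0}\right) _{0}\rightarrow 1$, and the first line of
Eq.~(\ref{WL}) together with $L_{00}=e^{2}/\pi h$ recovers the universal
result $\delta \sigma _{3d}=\delta \sigma _{\square }/t=(e^{2}/\hbar \pi
^{3})(1/L_{\varphi })$ quoted above.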
Substituting the expression for $L_{\varphi }$ (Eq.~(\ref{L-phi-3d})) into Eq.~(\ref%
{WL}), we finally obtain the total conductivity, including the quantum
correction to the conductivity due to weak localization in 3d arising from
scattering of electrons off antiferromagnetic spin waves in Mn,
\begin{equation}
\frac{\sigma _{\square }}{L_{00}}=A+\frac{B}{\sqrt{\sinh [\Delta /T]}},
\label{sigmaWL}
\end{equation}%
where the parameter $A$ is temperature independent and the parameter
\bea
B &\equiv & \left( \frac{D}{D_{0}}\right) _{0}\frac{2}{\pi ^{2}}\left( \frac{%
12\pi }{5}\right) ^{1/2}\left( \frac{\Delta }{\bar{J}}\right) ^{3/2}tk_{F}\cr%
&=&\frac{2c}{1+\sqrt{1+\frac{4R_{0}^{2}}{a^{2}}}},
\label{BFit}
\eea%
where
\be
c\equiv \left( \frac{\Delta }{\bar{J}}\right) ^{3/2}\left( \frac{%
48t^{2}k_{F}^{2}}{5\pi}\right) ^{1/2}.
\label{cFit}
\ee
The data presented here are for a single film prepared with an initial $R_0
\approx$~6~k$\Omega$. Disorder was subsequently increased in incremental
stages up to 180~k$\Omega$ by annealing at approximately 280~K~\cite%
{misra_2011}. Additional samples were grown at intermediate disorder and
measured to check reproducibility.
Figure~\ref{fig:cond} shows the conductivity data for two samples with
disorder $R_{0}=$~17573~$\Omega $ and 63903~$\Omega $ with corresponding
fits to the expression (\ref{sigmaWL}), where $A$ and $B$ are taken as
fitting parameters and $\Delta =$~16~K is the spin wave gap. The fits are
sensitive to the parameters $A$ and $B$ but relatively insensitive to $%
\Delta $. We find that $\Delta =$~16~$\pm $~4~K provides good fits over
the whole range of disorder (from 6 to 180~k$\Omega $).
\begin{figure}[tbp]
\begin{center}
\includegraphics[width=9cm]{fig_1_16.eps}
\end{center}
\caption{The temperature-dependent normalized conductivity (open squares)
for two samples with the indicated disorder strengths of $R_0 =$~17573~$%
\Omega$ and 63903~$\Omega$ show good agreement with theory (solid lines).
The fitting parameters $A$ and $B$ are indicated for each curve with the
error in the least significant digit indicated in parentheses.}
\label{fig:cond}
\end{figure}
Figure~\ref{fig:parb} shows the dependence of the parameter $B$ on the
disorder strength $R_0$ (open squares) and a theoretical fit (solid line)
using Eq.~(\ref{BFit}), where $c$ and $a$ are fitting parameters. The solid
line for this two-parameter fit is drawn for the best-fit values $c=0.67 \pm
0.04$ and $a= 28 \pm 3$~k$\Omega$. We note that the fit is of reasonable
quality over most of the disorder range except for the film with the least
disorder ($R_0 = 6$~k$\Omega$) where $B = 0.77$,
somewhat below the saturated value
$B = c = 0.67$ evaluated from Eq.~(\ref{BFit}) at $R_0 = 0$. Using higher
values of $c$ (e.g., $c=0.8$) and lower values of $a$ (e.g., $a = 22$~k$\Omega$)
improves the fit at low disorder strengths but
increases the discrepancy at higher disorder strengths.
%L_phi/t = 2/pi*2/(1+sqrt(1+16))/0.5, 2/pi*2/(1+sqrt(1+1))/0.25
%http://hyperphysics.phy-astr.gsu.edu/hbase/tables/fermi.html , k_F = sqrt(2*m_e*(10.9 eV))/(hbar) = 1.7E10 1/m
% (bar(J) / \Delta) ^ 3/2 = (48*(2e-9)^2*(2.7e9)^2/5/pi/(0.65)^2) ^0.5 = 8360 = 20 ^ 3
%A = \bar{J} / k_F , \bar{J} = nJ
Substituting the Fermi energy for bulk Mn~\cite{ashcroft_1976},
a thickness $t=2$~nm known to 20\% accuracy, together with the best-fit
value for $c$ into Eq.~(\ref{cFit}), we calculate the value $\bar{J} =$~320~$%
\pm$~93~K. Gao et al.~\cite{gao_2008} performed inelastic scanning tunneling
spectroscopy (ISTS) on thin Mn films and reported $\Delta$ in the range from
30 to 60~K and $\bar{J}=vk_F=$~3150~$\pm$~200~K. The agreement of the energy
gaps is good; however, our significantly lower value of $\bar{J}$ is probably
due to the high disorder in our ultrathin films.
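Explicitly, our value follows from inverting Eq.~(\ref{cFit}),
$\bar{J}=\Delta \left( 48t^{2}k_{F}^{2}/5\pi \right) ^{1/3}c^{-2/3}$: with
$\Delta =$~16~K, $t=$~2~nm, $c=0.67$, and the free-electron estimate
$k_{F}\approx 1.7\times 10^{10}$~m$^{-1}$, this gives $\bar{J}\approx
16~\mathrm{K}\times 15.2\times 1.31\approx 320$~K.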
Since the temperature-dependent correction $B/\sqrt{\sinh (\Delta /T)}$ in
Eq.~(\ref{sigmaWL}) is small compared to the parameter $A$, we can write
$\sigma_{\square} \approx 1/R_0$, so that Eq.~(\ref{sigmaWL}) reduces to
$A \approx 1/(L_{00}R_0)$. Taking the logarithm of both sides of this
approximation yields the plot shown in the inset of Fig.~\ref{fig:parb}. The
slope of $-1$ confirms the linear dependence of $A$ on $1/R_0$, and the
intercept of 5.01 (10$^{5.01}\approx$~102~k$\Omega$) is within 20\% of the
expected theoretical value $1/L_{00}=$~81~k$\Omega$ for the normalization
constant. Accordingly, the conductivity corrections in Eq.~(\ref{sigmaWL})
are small compared to the zero-temperature conductivity, and the
normalization constant $L_{00}$ is close to its expected theoretical value.
Using Eq.~(\ref{WL}) and the obtained value for $a\approx $~28~k$\Omega $ we can
compare the dephasing length ($L_{\varphi }$) with the thickness ($t\approx $%
~2~nm) at 16~K. For the sample with $R_{0}=$~63903~$\Omega $ the ratio $%
L_{\varphi }/t\approx $~0.5 and for the sample with $R_{0}=$~17573~$\Omega $
$L_{\varphi }/t\approx $~2. The latter estimate assumes no spin
polarization, while a full polarization would imply $L_{\varphi }/t\approx $%
~1. Thus $L_{\varphi }$ is smaller than or close to the thickness of the
film, which keeps the film in the three-dimensional regime for almost all
temperatures and disorder strengths considered.
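These ratios follow from combining Eqs.~(\ref{WL}) and (\ref{sigmaWL}),
which give $L_{\varphi }/t=(2/\pi )\left( D/D_{0}\right) _{0}\sqrt{\sinh
(\Delta /T)}/B$, evaluated at $T=$~16~K with the fitted $B$ of each sample;
this is a consistency estimate rather than an independent measurement.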
\begin{figure}[tbp]
\begin{center}
\includegraphics[width=9cm]{fig_2_16.eps}
\end{center}
\caption{Dependence of the fitting parameters $B$ and $A$ (inset) on
disorder $R_0$ for $\Delta=$~16~K. The fitting parameters are indicated for
each curve with the error in the least significant digit indicated in
parentheses.}
\label{fig:parb}
\end{figure}
In conclusion, we have performed \textit{in situ} transport measurements on
ultrathin Mn films, systematically varying the disorder ($R_{0}=R_{xx}$($T=$%
~5~K)). The obtained data were analyzed within a weak localization theory in
3d generalized to strong disorder. In the temperature range considered
inelastic scattering off spin waves is found to be strong giving rise to a
dephasing length shorter than the film thickness, which places these systems
into the 3d regime. The obtained value for the spin wave gap was close to
the one measured by Gao et al.~\cite{gao_2008} using ISTS, while the
exchange energy was much smaller.
This work has been supported by the NSF under Grant No.~1305783 (AFH).
PW thanks A.~M.~Finkel'stein for useful discussions and acknowledges
partial support through the DFG research unit ``Quantum phase transitions''.
\bibliographystyle{apsrev}
\bibliography{bibl}
\end{document}

View File

@@ -0,0 +1,3 @@
Hello world
One two three

View File

@@ -0,0 +1,5 @@
Hello world
One two three
Four five six

View File

@@ -0,0 +1,7 @@
Hello world
One two three
Four five six
Seven eight nine

View File

@@ -0,0 +1,404 @@
% Choose pra, prb, prc, prd, pre, prl, prstab, or rmp for journal
% Add 'draft' option to mark overfull boxes with black boxes
% Add 'showpacs' option to make PACS codes appear
% for review and submission
%\documentclass[aps,preprint,showpacs,superscriptaddress,groupedaddress]{revtex4} % for double-spaced preprint
% needed for figures
% needed for some tables
% for math
% for math
% for crossing out text
% for coloring text
%\input{tcilatex}
\documentclass[aps,prl,twocolumn,showpacs,superscriptaddress,groupedaddress]{revtex4}
\usepackage{graphicx}
\usepackage{dcolumn}
\usepackage{bm}
\usepackage{amssymb}
\usepackage{soul}
\usepackage{color}
%TCIDATA{OutputFilter=LATEX.DLL}
%TCIDATA{Version=5.50.0.2960}
%TCIDATA{<META NAME="SaveForMode" CONTENT="1">}
%TCIDATA{BibliographyScheme=BibTeX}
%TCIDATA{LastRevised=Tuesday, May 20, 2014 03:06:00}
%TCIDATA{<META NAME="GraphicsSave" CONTENT="32">}
\hyphenation{ALPGEN}
\hyphenation{EVTGEN}
\hyphenation{PYTHIA}
\def\be{\begin{equation}}
\def\ee{\end{equation}}
\def\bea{\begin{eqnarray}}
\def\eea{\end{eqnarray}}
%\input{tcilatex}
\begin{document}
\title{Transport measurements of the spin wave gap of Mn}
\input author_list.tex
\date{\today}
\begin{abstract}
Temperature dependent transport measurements on ultrathin antiferromagnetic
Mn films reveal a heretofore unknown non-universal weak localization
correction to the conductivity which extends to disorder strengths greater than
100~k$\Omega$ per square. The inelastic scattering of electrons off of
gapped antiferromagnetic spin waves gives rise to an inelastic scattering
length which is short enough to place the system in the 3D regime. The
extracted fitting parameters provide estimates of the energy gap ($\Delta
\approx$~16~K) and exchange energy ($\bar{J} \approx$~320~K). %\st{which are in
%agreement with values obtained with other techniques}.
\end{abstract}
\pacs{75}
\maketitle
Thin-film transition metal ferromagnets (Fe, Co, Ni, Gd) and
antiferromagnets (Mn, Cr) and their alloys are not only ubiquitous in
present day technologies but are also expected to play an important role in
future developments~\cite{thompson_2008}. Understanding magnetism in these
materials, especially when the films are thin enough so that disorder plays
an important role, is complicated by the long standing controversy about the
relative importance of itinerant and local moments~\cite%
{slater_1936,van_vleck_1953,aharoni_2000}. For the itinerant transition
metal magnets, a related fundamental issue centers on the question of how
itinerancy is compromised by disorder. Clearly with sufficient disorder the
charge carriers become localized, but questions arise as to what happens to
the spins and associated spin waves and whether the outcome depends on the
ferro/antiferro alignment of spins in the itinerant parent. Ferromagnets
which have magnetization as the order parameter are fundamentally different
than antiferromagnets which have staggered magnetization (i.e., difference
between the magnetization on each sublattice) as the order parameter~\cite%
{blundell_2001}. Ferromagnetism thus distinguishes itself by having soft
modes at zero wave number whereas antiferromagnets have soft modes at finite
wave number~\cite{belitz_2005}. Accordingly, the respective spin wave
spectra are radically different. These distinctions are particularly
important when comparing quantum corrections to the conductivity near
quantum critical points for ferromagnets~\cite{paul_2005} and
antiferromagnets~\cite{syzranov_2012}.
Surprisingly, although there have been systematic studies of the effect of
disorder on the longitudinal $\sigma_{xx}$ and transverse $\sigma_{xy}$
conductivity of ferromagnetic films~\cite%
{bergmann_1978,bergmann_1991,mitra_2007,misra_2009,kurzweil_2009}, there
have been few if any such studies on antiferromagnetic films. In this paper
we remedy this situation by presenting transport data on systematically
disordered Mn films that are sputter deposited in a custom designed vacuum
chamber and then transferred without exposure to air into an adjacent
cryostat for transport studies to low temperature. The experimental
procedures are similar to those reported previously: disorder, characterized
by the sheet resistance $R_0$ measured at $T=$~5~K, can be changed either by
growing separate samples or by gentle annealing of a given sample through
incremental stages of disorder~\cite{misra_2011}. Using these same
procedures, we find that our results for antiferromagnets are decidedly
different. The data are well
described over a large range of disorder strengths by a non-universal three
dimensional (3d) quantum correction that applies only to spin wave gapped
antiferromagnets. This finding implies the presence of strong inelastic
electron scattering off of antiferromagnetic spin waves. The theory is
validated not only by good fits to the data but also by extraction from the
fitting parameters of a value for the spin wave gap $\Delta$ that is in
agreement with the value expected for Mn. On the other hand, the
exchange energy $\bar{J}$ could be sensitive to the high disorder in our
ultrathin films, and it turns out to be much smaller than the known values.
In previous work the inelastic scattering of electrons off of spin waves has
been an essential ingredient in understanding disordered ferromagnets. For
example, to explain the occurrence of weak-localization corrections to the
anomalous Hall effect in polycrystalline Fe films~\cite{mitra_2007}, it was
necessary to invoke a contribution to the inelastic phase breaking rate $%
\tau_{\varphi}^{-1}$ due to spin-conserving inelastic scattering off
spin-wave excitations. This phase breaking rate, anticipated by theory~\cite%
{tatara_2004} and seen experimentally in spin polarized electron energy loss
spectroscopy (SPEELS) measurements of ultrathin Fe films~\cite%
{plihal_1999,zhang_2010}, is linear in temperature and significantly larger
than the phase breaking rate due to electron-electron interactions, thus
allowing a wide temperature range to observe weak localization corrections~%
\cite{mitra_2007}. The effect of a high $\tau_{\varphi}^{-1}$ due to
inelastic scattering off spin-wave excitations is also seen in Gd films
where in addition to a localizing log($T$) quantum correction to the
conductance, a localizing linear-in-$T$ quantum correction is present and is
interpreted as a spin-wave mediated Altshuler-Aronov type correction to the
conductivity~\cite{misra_2009}.
Interestingly, this high rate of inelastic spin-wave scattering becomes even
more important for the thinnest films as shown in theoretical calculations
on Fe and Ni which point to extremely short spin-dependent inelastic mean
free paths~\cite{hong_2000} and in SPEELS measurements on
few-monolayer-thick Fe/W(110) films in
which a strong nonmonotonic enhancement of localized spin wave energies is
found on the thinnest films~\cite{zhang_2010}.
Inelastic spin wave scattering in highly disordered ferromagnetic films can
be strong enough to assure that the associated $T$-dependent dephasing
length $L_{\varphi }(T)=\sqrt{D\tau _{\varphi }}$ (with $D$ the diffusion
constant)~\cite{lee_1985} is less than the film thickness $t$, thus putting
thin films into the 3d limit where a metal-insulator
transition is observed~\cite{misra_2011}. Recognizing that similarly high
inelastic scattering rates must apply to highly disordered antiferromagnetic
films, we first proceed with a theoretical approach that takes into account
the scattering of antiferromagnetic spin waves on the phase relaxation rate
and find a heretofore unrecognized non-universal 3d weak localization
correction to the conductivity that allows an interpretation of our experimental
results.
We mention in passing that the 3d interaction-induced quantum correction,
found to be dominant in the case of ferromagnetic Gd films which undergo a
metal-insulator transition~\cite{misra_2011}, is much smaller in the present
case and will not be considered further (for an estimate of this
contribution, see Ref.~\cite{muttalib_unpub}).
As discussed in detail in Ref.~[\onlinecite{wm10}], the phase relaxation
time $\tau _{\varphi }$ limits the phase coherence in a particle-particle
diffusion propagator $C(q,\omega )$ (Cooperon) in the form
\begin{equation}
C(q,\omega _{l})=\frac{1}{2\pi N_{0}\tau ^{2}}\frac{1}{Dq^{2}+|\omega
_{l}|+1/\tau _{\varphi }},
\end{equation}
where $N_{0}$ is the density of states at the Fermi level, $\tau $ is the
elastic scattering time and $\omega _{l}=2\pi lT$ is the Matsubara
frequency. Labeling the Cooperon propagator in the absence of interactions
as $C_{0}$, we can write
\begin{equation}
\frac{1}{\tau _{\varphi }}=\frac{1}{2\pi N_{0}\tau ^{2}}[C^{-1}-C_{0}^{-1}].
\end{equation}
In general, $C(q,\omega )$ can be evaluated diagrammatically in the presence
of interactions and disorder in a ladder approximation \cite{fa} that can be
symbolically written as $C=C_{0}+C_{0}KC$ where the interaction vertex $K$
contains self-energy as well as vertex corrections due to both interactions
and disorder. Since $C=C_{0}+C_{0}KC$ implies $C^{-1}=C_{0}^{-1}-K$, it
follows that $1/\tau _{\varphi }$ is given by
\begin{equation}
\frac{1}{\tau _{\varphi }}=-\frac{1}{2\pi N_{0}\tau ^{2}}K.
\end{equation}%
In Ref.~[\onlinecite{wm10}], the leading temperature and disorder dependence
of the inelastic diffusion propagator was evaluated diagrammatically, in the
presence of ferromagnetic spin-wave mediated electron-electron interactions.
Here we consider the antiferromagnetic case. We only consider large
spin-wave gap where the damping can be ignored. Using the antiferromagnetic
dispersion relation $\omega _{q}=\Delta +Aq$, where $A$ is the spin
stiffness, the inelastic lifetime is given by
\be
\frac{\hbar }{\tau _{\varphi }}=\frac{4}{\pi \hbar }nJ^{2}\int_{0}^{1/l}%
\frac{q^{d-1}dq}{\sinh \beta \omega _{q}}\frac{Dq^{2}+1/\tau _{\varphi }}{%
(Dq^{2}+1/\tau _{\varphi })^{2}+\omega _{q}^{2}}
\ee%
where $n=k_{F}^{3}/3\pi ^{2}$ is the 3d density, $J$ is the effective
spin-exchange interaction and $\beta =1/k_{B}T$. Here we will consider the
limit $\hbar /\tau _{\varphi }\ll \Delta $, relevant for our experiment on
Mn. In this limit we can neglect the $1/\tau _{\varphi }$ terms inside the
integral. The upper limit should be restricted to $\Delta /A$ in the limit $%
\Delta /A<1/l$. For large disorder, we expect the parameter $x\equiv
\hbar Dk_{F}^{2}\Delta / \bar{J}^{2}\ll 1$, where the spin-exchange energy
is given by $\bar{J}=Ak_{F}$. In this limit, $L_{\varphi }$ can be
simplified as
\be
k_{F}L_{\varphi }\approx \left( \frac{\bar{J}}{\Delta }\right) ^{3/2}\left(
\frac{5\sinh \frac{\Delta }{T}}{12\pi }\right) ^{1/2},\;\;\;x\ll 1
\label{L-phi-3d}
\ee%
which is independent of $x$, and therefore, independent of disorder.
Given the inelastic lifetime, the weak localization correction in 3d is
usually given by \cite{lee_1985} $\delta \sigma _{3d}=\frac{e^{2}}{\hbar \pi
^{3}}\frac{1}{L_{\varphi }},$ where the prefactor to the inverse inelastic
length is a universal number, independent of disorder. However, at large
enough disorder, we show that there exists a disorder dependent correction,
due to the scale dependent diffusion coefficient near the Anderson
metal-insulator transition. In fact, the diffusion coefficient obeys the
self consistent equation \cite{WV}
\begin{equation}
\frac{D_{0}}{D(\omega )}=1+\frac{k_{F}^{2-d}}{\pi m}\int_{0}^{1/l}dQ\frac{%
Q^{d-1}}{-i\omega +D(\omega )Q^{2}}
\end{equation}%
where $D_{0}=v_{F}l/d$ is the diffusion coefficient at weak disorder. While
the significance of the prefactor to the integral is not clear, the above
equation remains qualitatively accurate over a wide range near the Anderson
transition. Setting $\omega =i/\tau _{\varphi }$ and doing the $Q$-integral
in 3d,
\bea
\frac{D_{0}}{D} &\approx & 1+\frac{1}{\pi mk_{F}}\int_{1/L_{\varphi }}^{1/l}dQ\frac{%
Q^{2}}{DQ^{2}}\cr
&=& 1+\frac{D_{0}}{D}\frac{3}{\pi k_{F}^{2}l^{2}}-\delta
\left( \frac{D_{0}}{D}\right) ,
\label{delta}
\eea%
where
\bea
\delta \equiv \frac{D_{0}}{D}\frac{3}{\pi k_{F}^{2}l^{2}}\frac{l}{%
L_{\varphi }}
\eea
is assumed to be a small correction, and Eq.~(\ref{delta})
should not be solved self-consistently. This follows from the fact that the
diffusion coefficient of electrons at fixed energy entering the Cooperon
expression is that of non-interacting electrons, and is given by the limit $%
T\rightarrow 0$, $L_{\varphi }\rightarrow \infty $ and therefore $\delta
\rightarrow 0$. Then the correction at finite $T$ is given by
\bea
\frac{D}{D_{0}} &=& \frac{1}{\left( \frac{D_{0}}{D}\right) _{0}-\delta \left(
\frac{D_{0}}{D}\right) }\cr
&\approx & \left( \frac{D}{D_{0}}\right) _{0}+\left( \frac{D}{D_{0}}\right) _{0}
\frac{3}{\pi k_{F}^{2}l^{2}}\frac{l}{L_{\varphi }}%
\eea%
where
\be
\lim_{T\rightarrow 0}\frac{D}{D_{0}}\equiv \left( \frac{D}{D_{0}}\right)
_{0}.
\ee%
Using the relation $\sigma _{3d}=(e^{2}/\hbar )nD$ where the longitudinal
sheet conductance $\sigma _{\square }=\sigma _{3d}t$, with $t$ being the
film thickness, we finally get the temperature dependent weak localization
correction term
\bea
\frac{\delta \sigma _{\square }}{L_{00}} &=& \left( \frac{D}{D_{0}}\right) _{0}%
\frac{2}{\pi }\frac{t}{L_{\varphi }}\cr
\left( \frac{D}{D_{0}}\right)_{0} &\approx &\frac{2}{1+\sqrt{1+\frac{4R_{0}^{2}}{a^{2}}}}
\label{WL}
\eea%
where $R_{0}=L_{00}/\sigma _{\square }(T$=$0)$, $L_{00}=e^{2}/\pi h$, $%
a=3\pi /(2k_{F}tb_{0})$, $b_{0}$ is a number of order unity, and we
have solved the self-consistent equation for $D$ in order to express $D_{0}$
in terms of $D$ and finally $R_{0}$. Thus in this case, the weak
localization correction has a prefactor which is not universal. While this
reduces to the well-known universal result at weak disorder $R_{0}\ll a$, it
becomes dependent on disorder characterized by the sheet resistance $R_{0}$
at strong disorder and at the same time substantially extends the 3d regime
near the transition.
Substituting the expression for $L_{\varphi }$ (Eq.~(\ref{L-phi-3d})) into Eq.~(\ref%
{WL}), we finally obtain the total conductivity, including the quantum
correction to the conductivity due to weak localization in 3d arising from
scattering of electrons off antiferromagnetic spin waves in Mn,
\begin{equation}
\frac{\sigma _{\square }}{L_{00}}=A+\frac{B}{\sqrt{\sinh [\Delta /T]}},
\label{sigmaWL}
\end{equation}%
where the parameter $A$ is temperature independent and the parameter
\bea
B &\equiv & \left( \frac{D}{D_{0}}\right) _{0}\frac{2}{\pi ^{2}}\left( \frac{%
12\pi }{5}\right) ^{1/2}\left( \frac{\Delta }{\bar{J}}\right) ^{3/2}tk_{F}\cr%
&=&\frac{2c}{1+\sqrt{1+\frac{4R_{0}^{2}}{a^{2}}}},
\label{BFit}
\eea%
where
\be
c\equiv \left( \frac{\Delta }{\bar{J}}\right) ^{3/2}\left( \frac{%
48t^{2}k_{F}^{2}}{5\pi}\right) ^{1/2}.
\label{cFit}
\ee
The data presented here are for a single film prepared with an initial $R_0
\approx$~6~k$\Omega$. Disorder was subsequently increased in incremental
stages up to 180~k$\Omega$ by annealing at approximately 280~K~\cite%
{misra_2011}. Additional samples were grown at intermediate disorder and
measured to check reproducibility.
Figure~\ref{fig:cond} shows the conductivity data for two samples with
disorder $R_{0}=$~17573~$\Omega $ and 63903~$\Omega $ with corresponding
fits to the expression (\ref{sigmaWL}), where $A$ and $B$ are taken as
fitting parameters and $\Delta =$~16~K is the spin wave gap. The fits are
sensitive to the parameters $A$ and $B$ but relatively insensitive to $%
\Delta $. We find that $\Delta =$~16~$\pm $~4~K provides good fits over
the whole range of disorder (from 6 to 180~k$\Omega $).
\begin{figure}[tbp]
\begin{center}
\includegraphics[width=9cm]{fig_1_16.eps}
\end{center}
\caption{The temperature-dependent normalized conductivity (open squares)
for two samples with the indicated disorder strengths of $R_0 =$~17573~$%
\Omega$ and 63903~$\Omega$ show good agreement with theory (solid lines).
The fitting parameters $A$ and $B$ are indicated for each curve with the
error in the least significant digit indicated in parentheses.}
\label{fig:cond}
\end{figure}
Figure~\ref{fig:parb} shows the dependence of the parameter $B$ on the
disorder strength $R_0$ (open squares) and a theoretical fit (solid line)
using Eq.~(\ref{BFit}), where $c$ and $a$ are fitting parameters. The solid
line for this two-parameter fit is drawn for the best-fit values $c=0.67 \pm
0.04$ and $a= 28 \pm 3$~k$\Omega$. We note that the fit is of reasonable
quality over most of the disorder range except for the film with the least
disorder ($R_0 = 6$~k$\Omega$) where $B = 0.77$,
somewhat below the saturated value
$B = c = 0.67$ evaluated from Eq.~(\ref{BFit}) at $R_0 = 0$. Using higher
values of $c$ (e.g., $c=0.8$) and lower values of $a$ (e.g., $a = 22$~k$\Omega$)
improves the fit at low disorder strengths but
increases the discrepancy at higher disorder strengths.
%L_phi/t = 2/pi*2/(1+sqrt(1+16))/0.5, 2/pi*2/(1+sqrt(1+1))/0.25
%http://hyperphysics.phy-astr.gsu.edu/hbase/tables/fermi.html , k_F = sqrt(2*m_e*(10.9 eV))/(hbar) = 1.7E10 1/m
% (bar(J) / \Delta) ^ 3/2 = (48*(2e-9)^2*(2.7e9)^2/5/pi/(0.65)^2) ^0.5 = 8360 = 20 ^ 3
%A = \bar{J} / k_F , \bar{J} = nJ
Substituting the Fermi energy for bulk Mn~\cite{ashcroft_1976},
a thickness $t=2$~nm known to 20\% accuracy, together with the best-fit
value for $c$ into Eq.~(\ref{cFit}), we calculate the value $\bar{J} =$~320~$%
\pm$~93~K. Gao et al.~\cite{gao_2008} performed inelastic scanning tunneling
spectroscopy (ISTS) on thin Mn films and reported $\Delta$ in the range from
30 to 60~K and $\bar{J}=vk_F=$~3150~$\pm$~200~K. The agreement of the energy
gaps is good; however, our significantly lower value of $\bar{J}$ is probably
due to the high disorder in our ultrathin films.
Since the temperature-dependent correction $B/\sqrt{\sinh (\Delta /T)}$ in
Eq.~(\ref{sigmaWL}) is small compared to the parameter $A$, we can write
$\sigma_{\square} \approx 1/R_0$, so that Eq.~(\ref{sigmaWL}) reduces to
$A \approx 1/(L_{00}R_0)$. Taking the logarithm of both sides of this
approximation yields the plot shown in the inset of Fig.~\ref{fig:parb}. The
slope of $-1$ confirms the linear dependence of $A$ on $1/R_0$, and the
intercept of 5.01 (10$^{5.01}\approx$~102~k$\Omega$) is within 20\% of the
expected theoretical value $1/L_{00}=$~81~k$\Omega$ for the normalization
constant. Accordingly, the conductivity corrections in Eq.~(\ref{sigmaWL})
are small compared to the zero-temperature conductivity, and the
normalization constant $L_{00}$ is close to its expected theoretical value.
Using Eq.~(\ref{WL}) and the obtained value for $a\approx $~28~k$\Omega $ we can
compare the dephasing length ($L_{\varphi }$) with the thickness ($t\approx $%
~2~nm) at 16~K. For the sample with $R_{0}=$~63903~$\Omega $ the ratio $%
L_{\varphi }/t\approx $~0.5 and for the sample with $R_{0}=$~17573~$\Omega $
$L_{\varphi }/t\approx $~2. The latter estimate assumes no spin
polarization, while a full polarization would imply $L_{\varphi }/t\approx $%
~1. Thus $L_{\varphi }$ is smaller than or close to the thickness of the
film, which keeps the film in the three-dimensional regime for almost all
temperatures and disorder strengths considered.
\begin{figure}[tbp]
\begin{center}
\includegraphics[width=9cm]{fig_2_16.eps}
\end{center}
\caption{Dependence of the fitting parameters $B$ and $A$ (inset) on
disorder $R_0$ for $\Delta=$~16~K. The fitting parameters are indicated for
each curve with the error in the least significant digit indicated in
parentheses.}
\label{fig:parb}
\end{figure}
In conclusion, we have performed \textit{in situ} transport measurements on
ultrathin Mn films, systematically varying the disorder ($R_{0}=R_{xx}$($T=$%
~5~K)). The obtained data were analyzed within a weak localization theory in
3d generalized to strong disorder. In the temperature range considered
inelastic scattering off spin waves is found to be strong giving rise to a
dephasing length shorter than the film thickness, which places these systems
into the 3d regime. The obtained value for the spin wave gap was close to
the one measured by Gao et al.~\cite{gao_2008} using ISTS, while the
exchange energy was much smaller.
This work has been supported by the NSF under Grant No.~1305783 (AFH).
PW thanks A.~M.~Finkel'stein for useful discussions and acknowledges
partial support through the DFG research unit ``Quantum phase transitions''.
\bibliographystyle{apsrev}
\bibliography{bibl}
\end{document}

View File

@@ -0,0 +1,74 @@
{
"chunk": {
"history": {
"snapshot": {
"files": {
"bar.tex": {
"hash": "4f785a4c192155b240e3042b3a7388b47603f423",
"stringLength": 26
},
"main.tex": {
"hash": "f28571f561d198b87c24cc6a98b78e87b665e22d",
"stringLength": 20638,
"metadata": {
"main": true
}
}
}
},
"changes": [
{
"operations": [
{
"pathname": "main.tex",
"textOperation": [
1912,
"Hello world",
18726
]
}
],
"timestamp": "2017-12-04T10:23:35.633Z",
"authors": [
31
]
},
{
"operations": [
{
"pathname": "bar.tex",
"newPathname": "foo.tex"
}
],
"timestamp": "2017-12-04T10:27:26.874Z",
"authors": [
31
]
},
{
"operations": [
{
"pathname": "foo.tex",
"textOperation": [
26,
"\n\nFour five six"
]
}
],
"timestamp": "2017-12-04T10:28:33.724Z",
"authors": [
31
]
}
]
},
"startVersion": 0
},
"authors": [
{
"id": 31,
"email": "james.allen@overleaf.com",
"name": "James"
}
]
}

View File

@@ -0,0 +1,74 @@
{
"chunk": {
"history": {
"snapshot": {
"files": {
"main.tex": {
"hash": "35c9bd86574d61dcadbce2fdd3d4a0684272c6ea",
"stringLength": 20649,
"metadata": {
"main": true
}
},
"foo.tex": {
"hash": "c6654ea913979e13e22022653d284444f284a172",
"stringLength": 41
}
}
},
"changes": [
{
"operations": [
{
"pathname": "foo.tex",
"textOperation": [
41,
"\n\nSeven eight nince"
]
}
],
"timestamp": "2017-12-04T10:29:17.786Z",
"authors": [
31
]
},
{
"operations": [
{
"pathname": "foo.tex",
"textOperation": [
58,
-1,
1
]
}
],
"timestamp": "2017-12-04T10:29:22.905Z",
"authors": [
31
]
},
{
"operations": [
{
"pathname": "foo.tex",
"newPathname": "bar.tex"
}
],
"timestamp": "2017-12-04T10:29:26.120Z",
"authors": [
31
]
}
]
},
"startVersion": 3
},
"authors": [
{
"id": 31,
"email": "james.allen@overleaf.com",
"name": "James"
}
]
}
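The textOperation arrays in these chunks use the usual operational-transform encoding: a positive integer retains that many characters, a string inserts its text, and a negative integer deletes that many characters, while file-level operations instead carry fields such as pathname/newPathname for renames. A minimal sketch of an applier for the text case (applyTextOperation is a hypothetical helper for illustration, not part of the service):

// Apply a history-style textOperation to a document string.
// Encoding: positive int = retain, string = insert, negative int = delete.
function applyTextOperation(doc, textOperation) {
  let cursor = 0
  let result = ''
  for (const op of textOperation) {
    if (typeof op === 'string') {
      result += op // insert the given text
    } else if (op > 0) {
      result += doc.slice(cursor, cursor + op) // retain `op` characters
      cursor += op
    } else {
      cursor -= op // delete |op| characters by skipping over them
    }
  }
  if (cursor !== doc.length) {
    throw new Error('textOperation does not span the whole document')
  }
  return result
}

// For example, the [58, -1, 1] operation above retains 58 characters,
// deletes the stray 'c' in "nince", and retains the final 'e' -> "nine".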

View File

@@ -0,0 +1,63 @@
{
"chunk": {
"history": {
"snapshot": {
"files": {
"main.tex": {
"hash": "35c9bd86574d61dcadbce2fdd3d4a0684272c6ea",
"stringLength": 20649,
"metadata": {
"main": true
}
},
"bar.tex": {
"hash": "e13c315d53aaef3aa34550a86b09cff091ace220",
"stringLength": 59
}
}
},
"changes": [
{
"operations": [
{
"pathname": "main.tex",
"textOperation": [
1923,
" also updated",
18726
]
}
],
"timestamp": "2017-12-04T10:32:47.277Z",
"authors": [
31
]
},
{
"operations": [
{
"pathname": "bar.tex",
"textOperation": [
28,
-15,
16
]
}
],
"timestamp": "2017-12-04T10:32:52.877Z",
"v2Authors": [
"5a5637efdac84e81b71014c4"
]
}
]
},
"startVersion": 6
},
"authors": [
{
"id": 31,
"email": "james.allen@overleaf.com",
"name": "James"
}
]
}
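Note that the last change above is attributed via a string user id in v2Authors, while earlier changes use numeric ids in authors, so consumers have to accept either form (as the diff expectations in the tests below do). A minimal sketch (changeAuthors is a hypothetical helper, assuming these are the only two attribution fields):

// Resolve a change's attribution: newer chunks carry string ids in
// `v2Authors`, older ones carry numeric ids in `authors`.
function changeAuthors(change) {
  return change.v2Authors ?? change.authors ?? []
}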

View File

@@ -0,0 +1,83 @@
import { expect } from 'chai'
import nock from 'nock'
import mongodb from 'mongodb-legacy'
import * as ProjectHistoryApp from './helpers/ProjectHistoryApp.js'
import * as ProjectHistoryClient from './helpers/ProjectHistoryClient.js'
const { ObjectId } = mongodb
const MockHistoryStore = () => nock('http://127.0.0.1:3100')
const MockWeb = () => nock('http://127.0.0.1:3000')
const fixture = path => new URL(`../fixtures/${path}`, import.meta.url)
describe('Deleting project', function () {
beforeEach(function (done) {
this.projectId = new ObjectId().toString()
this.historyId = new ObjectId().toString()
MockWeb()
.get(`/project/${this.projectId}/details`)
.reply(200, {
name: 'Test Project',
overleaf: { history: { id: this.historyId } },
})
MockHistoryStore()
.get(`/api/projects/${this.historyId}/latest/history`)
.replyWithFile(200, fixture('chunks/0-3.json'))
MockHistoryStore().delete(`/api/projects/${this.historyId}`).reply(204)
ProjectHistoryApp.ensureRunning(done)
})
describe('when the project has no pending updates', function () {
it('successfully deletes the project', function (done) {
ProjectHistoryClient.deleteProject(this.projectId, done)
})
})
describe('when the project has pending updates', function () {
beforeEach(function (done) {
ProjectHistoryClient.pushRawUpdate(
this.projectId,
{
pathname: '/main.tex',
docLines: 'hello',
doc: this.docId,
meta: { userId: this.userId, ts: new Date() },
},
err => {
if (err) {
return done(err)
}
ProjectHistoryClient.setFirstOpTimestamp(
this.projectId,
Date.now(),
err => {
if (err) {
return done(err)
}
ProjectHistoryClient.deleteProject(this.projectId, done)
}
)
}
)
})
it('clears pending updates', function (done) {
ProjectHistoryClient.getDump(this.projectId, (err, dump) => {
if (err) {
return done(err)
}
expect(dump.updates).to.deep.equal([])
done()
})
})
it('clears the first op timestamp', function (done) {
ProjectHistoryClient.getFirstOpTimestamp(this.projectId, (err, ts) => {
if (err) {
return done(err)
}
expect(ts).to.be.null
done()
})
})
})
})

View File

@@ -0,0 +1,415 @@
import { expect } from 'chai'
import request from 'request'
import crypto from 'node:crypto'
import mongodb from 'mongodb-legacy'
import nock from 'nock'
import * as ProjectHistoryClient from './helpers/ProjectHistoryClient.js'
import * as ProjectHistoryApp from './helpers/ProjectHistoryApp.js'
const { ObjectId } = mongodb
const MockHistoryStore = () => nock('http://127.0.0.1:3100')
const MockWeb = () => nock('http://127.0.0.1:3000')
function createMockBlob(historyId, content) {
const sha = crypto.createHash('sha1').update(content).digest('hex')
MockHistoryStore()
.get(`/api/projects/${historyId}/blobs/${sha}`)
.reply(200, content)
.persist()
return sha
}
describe('Diffs', function () {
beforeEach(function (done) {
ProjectHistoryApp.ensureRunning(error => {
if (error) {
throw error
}
this.historyId = new ObjectId().toString()
this.projectId = new ObjectId().toString()
MockHistoryStore().post('/api/projects').reply(200, {
projectId: this.historyId,
})
MockWeb()
.get(`/project/${this.projectId}/details`)
.reply(200, {
name: 'Test Project',
overleaf: { history: { id: this.historyId } },
})
ProjectHistoryClient.initializeProject(this.historyId, error => {
if (error) {
return done(error)
}
done()
})
})
})
afterEach(function () {
nock.cleanAll()
})
it('should return a diff of the updates to a doc from a single chunk', function (done) {
this.blob = 'one two three five'
this.sha = createMockBlob(this.historyId, this.blob)
this.v2AuthorId = '123456789'
MockHistoryStore()
.get(`/api/projects/${this.historyId}/versions/6/history`)
.reply(200, {
chunk: {
history: {
snapshot: {
files: {
'foo.tex': {
hash: this.sha,
stringLength: this.blob.length,
},
},
},
changes: [
{
operations: [
{
pathname: 'foo.tex',
textOperation: [13, ' four', 5],
},
],
timestamp: '2017-12-04T10:29:17.786Z',
authors: [31],
},
{
operations: [
{
pathname: 'foo.tex',
textOperation: [4, -4, 15],
},
],
timestamp: '2017-12-04T10:29:22.905Z',
authors: [31],
},
{
operations: [
{
pathname: 'foo.tex',
textOperation: [19, ' six'],
},
],
timestamp: '2017-12-04T10:29:26.120Z',
v2Authors: [this.v2AuthorId],
},
],
},
startVersion: 3,
},
authors: [31],
})
ProjectHistoryClient.getDiff(
this.projectId,
'foo.tex',
3,
6,
(error, diff) => {
if (error) {
throw error
}
expect(diff).to.deep.equal({
diff: [
{
u: 'one ',
},
{
d: 'two ',
meta: {
users: [31],
start_ts: 1512383362905,
end_ts: 1512383362905,
},
},
{
u: 'three',
},
{
i: ' four',
meta: {
users: [31],
start_ts: 1512383357786,
end_ts: 1512383357786,
},
},
{
u: ' five',
},
{
i: ' six',
meta: {
users: [this.v2AuthorId],
start_ts: 1512383366120,
end_ts: 1512383366120,
},
},
],
})
done()
}
)
})
it('should return a diff of the updates to a doc across multiple chunks', function (done) {
MockHistoryStore()
.get(`/api/projects/${this.historyId}/versions/5/history`)
.reply(200, {
chunk: {
history: {
snapshot: {
files: {
'foo.tex': {
hash: createMockBlob(this.historyId, 'one two three five'),
stringLength: 'one two three five'.length,
},
},
},
changes: [
{
operations: [
{
pathname: 'foo.tex',
textOperation: [13, ' four', 5],
},
],
timestamp: '2017-12-04T10:29:17.786Z',
authors: [31],
},
{
operations: [
{
pathname: 'foo.tex',
textOperation: [4, -4, 15],
},
],
timestamp: '2017-12-04T10:29:22.905Z',
authors: [31],
},
],
},
startVersion: 3,
},
authors: [{ id: 31, email: 'james.allen@overleaf.com', name: 'James' }],
})
MockHistoryStore()
.get(`/api/projects/${this.historyId}/versions/6/history`)
.reply(200, {
chunk: {
history: {
snapshot: {
files: {
'foo.tex': {
hash: createMockBlob(this.historyId, 'one three four five'),
stringLength: 'one three four five'.length,
},
},
},
changes: [
{
operations: [
{
pathname: 'foo.tex',
textOperation: [19, ' six'],
},
],
timestamp: '2017-12-04T10:29:26.120Z',
authors: [31],
},
{
operations: [
{
pathname: 'foo.tex',
textOperation: [23, ' seven'],
},
],
timestamp: '2017-12-04T10:29:26.120Z',
authors: [31],
},
],
},
startVersion: 5,
},
authors: [{ id: 31, email: 'james.allen@overleaf.com', name: 'James' }],
})
ProjectHistoryClient.getDiff(
this.projectId,
'foo.tex',
4,
6,
(error, diff) => {
if (error) {
throw error
}
expect(diff).to.deep.equal({
diff: [
{
u: 'one ',
},
{
d: 'two ',
meta: {
users: [31],
start_ts: 1512383362905,
end_ts: 1512383362905,
},
},
{
u: 'three four five',
},
{
i: ' six',
meta: {
users: [31],
start_ts: 1512383366120,
end_ts: 1512383366120,
},
},
],
})
done()
}
)
})
it('should return a 404 when there are no changes for the file in the range', function (done) {
this.blob = 'one two three five'
this.sha = createMockBlob(this.historyId, this.blob)
MockHistoryStore()
.get(`/api/projects/${this.historyId}/versions/6/history`)
.reply(200, {
chunk: {
history: {
snapshot: {
files: {
'foo.tex': {
hash: this.sha,
stringLength: this.blob.length,
},
},
},
changes: [
{
operations: [
{
pathname: 'foo.tex',
textOperation: [13, ' four', 5],
},
],
timestamp: '2017-12-04T10:29:17.786Z',
authors: [31],
},
],
},
startVersion: 3,
},
authors: [31],
})
request.get(
{
url: `http://127.0.0.1:3054/project/${this.projectId}/diff`,
qs: {
pathname: 'not_here.tex',
from: 3,
to: 6,
},
json: true,
},
(error, res, body) => {
if (error) {
throw error
}
expect(res.statusCode).to.equal(404)
done()
}
)
})
it('should return a binary flag with a diff of a binary file', function (done) {
this.blob = 'one two three five'
this.sha = createMockBlob(this.historyId, this.blob)
this.binaryBlob = Buffer.from([1, 2, 3, 4])
this.binarySha = createMockBlob(this.historyId, this.binaryBlob)
MockHistoryStore()
.get(`/api/projects/${this.historyId}/versions/6/history`)
.reply(200, {
chunk: {
history: {
snapshot: {
files: {
'binary.tex': {
hash: this.binarySha,
byteLength: this.binaryBlob.length, // Indicates binary
},
'foo.tex': {
hash: this.sha,
stringLength: this.blob.length, // Indicates text, not binary
},
},
},
changes: [
{
operations: [
{
pathname: 'foo.tex',
textOperation: [13, ' four', 5],
},
],
timestamp: '2017-12-04T10:29:17.786Z',
authors: [31],
},
{
operations: [
{
pathname: 'foo.tex',
textOperation: [4, -4, 15],
},
],
timestamp: '2017-12-04T10:29:22.905Z',
authors: [31],
},
{
operations: [
{
pathname: 'foo.tex',
textOperation: [19, ' six'],
},
],
timestamp: '2017-12-04T10:29:26.120Z',
authors: [31],
},
],
},
startVersion: 3,
},
authors: [{ id: 31, email: 'james.allen@overleaf.com', name: 'James' }],
})
ProjectHistoryClient.getDiff(
this.projectId,
'binary.tex',
3,
6,
(error, diff) => {
if (error) {
throw error
}
expect(diff).to.deep.equal({
diff: {
binary: true,
},
})
done()
}
)
})
})

View File

@@ -0,0 +1,73 @@
import async from 'async'
import mongodb from 'mongodb-legacy'
import nock from 'nock'
import * as ProjectHistoryClient from './helpers/ProjectHistoryClient.js'
import * as ProjectHistoryApp from './helpers/ProjectHistoryApp.js'
const { ObjectId } = mongodb
const MockHistoryStore = () => nock('http://127.0.0.1:3100')
const MockWeb = () => nock('http://127.0.0.1:3000')
describe('DiscardingUpdates', function () {
  beforeEach(function (done) {
    this.timestamp = new Date()
    ProjectHistoryApp.ensureRunning(error => {
      if (error != null) {
        throw error
      }
      this.user_id = new ObjectId().toString()
      this.project_id = new ObjectId().toString()
      this.doc_id = new ObjectId().toString()
      MockHistoryStore().post('/api/projects').reply(200, {
        projectId: 0,
      })
      MockWeb()
        .get(`/project/${this.project_id}/details`)
        .reply(200, { name: 'Test Project' })
      ProjectHistoryClient.initializeProject(this.project_id, done)
    })
  })
  it('should discard updates', function (done) {
    async.series(
      [
        cb => {
          const update = {
            pathname: '/main.tex',
            docLines: 'a\nb',
            doc: this.doc_id,
            meta: { user_id: this.user_id, ts: new Date() },
          }
          ProjectHistoryClient.pushRawUpdate(this.project_id, update, cb)
        },
        cb => ProjectHistoryClient.flushProject(this.project_id, cb),
      ],
      error => {
        if (error != null) {
          throw error
        }
        done()
      }
    )
  })
})

View File

@@ -0,0 +1,880 @@
/* eslint-disable
no-undef,
no-unused-vars,
*/
// TODO: This file was created by bulk-decaffeinate.
// Fix any style issues and re-enable lint.
/*
* decaffeinate suggestions:
* DS102: Remove unnecessary code created because of implicit returns
* DS207: Consider shorter variations of null checks
* Full docs: https://github.com/decaffeinate/decaffeinate/blob/master/docs/suggestions.md
*/
import sinon from 'sinon'
import { expect } from 'chai'
import Settings from '@overleaf/settings'
import request from 'request'
import assert from 'node:assert'
import Path from 'node:path'
import crypto from 'node:crypto'
import mongodb from 'mongodb-legacy'
import nock from 'nock'
import * as ProjectHistoryClient from './helpers/ProjectHistoryClient.js'
import * as ProjectHistoryApp from './helpers/ProjectHistoryApp.js'
import * as HistoryId from './helpers/HistoryId.js'
const { ObjectId } = mongodb
const MockHistoryStore = () => nock('http://127.0.0.1:3100')
const MockFileStore = () => nock('http://127.0.0.1:3009')
const MockWeb = () => nock('http://127.0.0.1:3000')
const sha = data => crypto.createHash('sha1').update(data).digest('hex')
describe('FileTree Diffs', function () {
beforeEach(function (done) {
return ProjectHistoryApp.ensureRunning(error => {
if (error != null) {
throw error
}
this.historyId = new ObjectId().toString()
this.projectId = new ObjectId().toString()
MockHistoryStore().post('/api/projects').reply(200, {
projectId: this.historyId,
})
MockWeb()
.get(`/project/${this.projectId}/details`)
.reply(200, {
name: 'Test Project',
overleaf: { history: { id: this.historyId } },
})
return ProjectHistoryClient.initializeProject(
this.historyId,
(error, olProject) => {
if (error != null) {
throw error
}
return done()
}
)
})
})
afterEach(function () {
return nock.cleanAll()
})
it('should return a diff of the updates to a doc from a single chunk', function (done) {
MockHistoryStore()
.get(`/api/projects/${this.historyId}/versions/7/history`)
.reply(200, {
chunk: {
history: {
snapshot: {
files: {
'foo.tex': {
hash: sha('mock-sha-foo'),
stringLength: 42,
},
'renamed.tex': {
hash: sha('mock-sha-renamed'),
stringLength: 42,
},
'deleted.tex': {
hash: sha('mock-sha-deleted'),
stringLength: 42,
},
},
},
changes: [
{
operations: [
{
pathname: 'renamed.tex',
newPathname: 'newName.tex',
},
],
timestamp: '2017-12-04T10:29:17.786Z',
authors: [31],
},
{
operations: [
{
pathname: 'foo.tex',
textOperation: ['lorem ipsum'],
},
],
timestamp: '2017-12-04T10:29:17.786Z',
authors: [31],
},
{
operations: [
{
pathname: 'deleted.tex',
newPathname: '',
},
],
timestamp: '2017-12-04T10:29:22.905Z',
authors: [31],
},
{
operations: [
{
file: {
hash: sha('new-sha'),
stringLength: 42,
},
pathname: 'added.tex',
},
],
timestamp: '2017-12-04T10:29:22.905Z',
authors: [31],
},
],
},
startVersion: 3,
},
authors: [{ id: 31, email: 'james.allen@overleaf.com', name: 'James' }],
})
return ProjectHistoryClient.getFileTreeDiff(
this.projectId,
3,
7,
(error, diff) => {
if (error != null) {
throw error
}
expect(diff).to.deep.equal({
diff: [
{
pathname: 'foo.tex',
operation: 'edited',
},
{
pathname: 'deleted.tex',
operation: 'removed',
deletedAtV: 5,
editable: true,
},
{
newPathname: 'newName.tex',
pathname: 'renamed.tex',
operation: 'renamed',
editable: true,
},
{
pathname: 'added.tex',
operation: 'added',
editable: true,
},
],
})
return done()
}
)
})
it('should return a diff of the updates to a doc across multiple chunks', function (done) {
MockHistoryStore()
.get(`/api/projects/${this.historyId}/versions/5/history`)
.reply(200, {
chunk: {
history: {
snapshot: {
files: {
'foo.tex': {
// Updated in this chunk
hash: sha('mock-sha-foo'),
stringLength: 42,
},
'bar.tex': {
// Updated in the next chunk
hash: sha('mock-sha-bar'),
stringLength: 42,
},
'baz.tex': {
// Not updated
hash: sha('mock-sha-bar'),
stringLength: 42,
},
'renamed.tex': {
hash: sha('mock-sha-renamed'),
stringLength: 42,
},
'deleted.tex': {
hash: sha('mock-sha-deleted'),
stringLength: 42,
},
},
},
changes: [
{
operations: [
{
pathname: 'renamed.tex',
newPathname: 'newName.tex',
},
],
timestamp: '2017-12-04T10:29:17.786Z',
authors: [31],
},
{
operations: [
{
pathname: 'foo.tex',
textOperation: ['lorem ipsum'],
},
],
timestamp: '2017-12-04T10:29:19.786Z',
authors: [31],
},
{
operations: [
{
pathname: 'deleted.tex',
newPathname: '',
},
],
timestamp: '2017-12-04T10:29:22.905Z',
authors: [31],
},
],
},
startVersion: 2,
},
authors: [{ id: 31, email: 'james.allen@overleaf.com', name: 'James' }],
})
MockHistoryStore()
.get(`/api/projects/${this.historyId}/versions/7/history`)
.reply(200, {
chunk: {
history: {
snapshot: {
files: {
'foo.tex': {
hash: sha('mock-sha-foo'),
stringLength: 42,
},
'baz.tex': {
hash: sha('mock-sha-bar'),
stringLength: 42,
},
'newName.tex': {
hash: sha('mock-sha-renamed'),
stringLength: 42,
},
},
},
changes: [
{
operations: [
{
file: {
hash: sha('new-sha'),
stringLength: 42,
},
pathname: 'added.tex',
},
],
timestamp: '2017-12-04T10:29:22.905Z',
authors: [31],
},
{
operations: [
{
pathname: 'bar.tex',
textOperation: ['lorem ipsum'],
},
],
timestamp: '2017-12-04T10:29:23.786Z',
authors: [31],
},
],
},
startVersion: 5,
},
authors: [{ id: 31, email: 'james.allen@overleaf.com', name: 'James' }],
})
return ProjectHistoryClient.getFileTreeDiff(
this.projectId,
2,
7,
(error, diff) => {
if (error != null) {
throw error
}
expect(diff).to.deep.equal({
diff: [
{
pathname: 'foo.tex',
operation: 'edited',
},
{
pathname: 'bar.tex',
operation: 'edited',
},
{
pathname: 'baz.tex',
editable: true,
},
{
pathname: 'deleted.tex',
operation: 'removed',
deletedAtV: 4,
editable: true,
},
{
newPathname: 'newName.tex',
pathname: 'renamed.tex',
operation: 'renamed',
editable: true,
},
{
pathname: 'added.tex',
operation: 'added',
editable: true,
},
],
})
return done()
}
)
})
it('should return a diff that includes multiple renames', function (done) {
MockHistoryStore()
.get(`/api/projects/${this.historyId}/versions/5/history`)
.reply(200, {
chunk: {
history: {
snapshot: {
files: {
'one.tex': {
hash: sha('mock-sha'),
stringLength: 42,
},
},
},
changes: [
{
operations: [
{
pathname: 'one.tex',
newPathname: 'two.tex',
},
],
timestamp: '2017-12-04T10:29:17.786Z',
authors: [31],
},
{
operations: [
{
pathname: 'two.tex',
newPathname: 'three.tex',
},
],
timestamp: '2017-12-04T10:29:22.905Z',
authors: [31],
},
],
},
startVersion: 3,
},
authors: [{ id: 31, email: 'james.allen@overleaf.com', name: 'James' }],
})
return ProjectHistoryClient.getFileTreeDiff(
this.projectId,
3,
5,
(error, diff) => {
if (error != null) {
throw error
}
expect(diff).to.deep.equal({
diff: [
{
newPathname: 'three.tex',
pathname: 'one.tex',
operation: 'renamed',
editable: true,
},
],
})
return done()
}
)
})
it('should handle deleting then re-adding a file', function (done) {
MockHistoryStore()
.get(`/api/projects/${this.historyId}/versions/5/history`)
.reply(200, {
chunk: {
history: {
snapshot: {
files: {
'one.tex': {
hash: sha('mock-sha'),
stringLength: 42,
},
},
},
changes: [
{
operations: [
{
pathname: 'one.tex',
newPathname: '',
},
],
timestamp: '2017-12-04T10:29:17.786Z',
authors: [31],
},
{
operations: [
{
pathname: 'one.tex',
file: {
hash: sha('mock-sha'),
},
},
],
timestamp: '2017-12-04T10:29:22.905Z',
authors: [31],
},
],
},
startVersion: 3,
},
authors: [{ id: 31, email: 'james.allen@overleaf.com', name: 'James' }],
})
return ProjectHistoryClient.getFileTreeDiff(
this.projectId,
3,
5,
(error, diff) => {
if (error != null) {
throw error
}
expect(diff).to.deep.equal({
diff: [
{
pathname: 'one.tex',
operation: 'added',
editable: null,
},
],
})
return done()
}
)
})
  it('should handle deleting then renaming a file to the same place', function (done) {
MockHistoryStore()
.get(`/api/projects/${this.historyId}/versions/5/history`)
.reply(200, {
chunk: {
history: {
snapshot: {
files: {
'one.tex': {
hash: sha('mock-sha-one'),
stringLength: 42,
},
'two.tex': {
hash: sha('mock-sha-two'),
stringLength: 42,
},
},
},
changes: [
{
operations: [
{
pathname: 'one.tex',
newPathname: '',
},
],
timestamp: '2017-12-04T10:29:17.786Z',
authors: [31],
},
{
operations: [
{
pathname: 'two.tex',
newPathname: 'one.tex',
},
],
timestamp: '2017-12-04T10:29:22.905Z',
authors: [31],
},
],
},
startVersion: 3,
},
authors: [{ id: 31, email: 'james.allen@overleaf.com', name: 'James' }],
})
return ProjectHistoryClient.getFileTreeDiff(
this.projectId,
3,
5,
(error, diff) => {
if (error != null) {
throw error
}
expect(diff).to.deep.equal({
diff: [
{
pathname: 'two.tex',
newPathname: 'one.tex',
operation: 'renamed',
editable: true,
},
],
})
return done()
}
)
})
it('should handle adding then renaming a file', function (done) {
MockHistoryStore()
.get(`/api/projects/${this.historyId}/versions/5/history`)
.reply(200, {
chunk: {
history: {
snapshot: {
files: {},
},
changes: [
{
operations: [
{
pathname: 'one.tex',
file: {
hash: sha('mock-sha'),
stringLength: 42,
},
},
],
timestamp: '2017-12-04T10:29:17.786Z',
authors: [31],
},
{
operations: [
{
pathname: 'one.tex',
newPathname: 'two.tex',
},
],
timestamp: '2017-12-04T10:29:22.905Z',
authors: [31],
},
],
},
startVersion: 3,
},
authors: [{ id: 31, email: 'james.allen@overleaf.com', name: 'James' }],
})
return ProjectHistoryClient.getFileTreeDiff(
this.projectId,
3,
5,
(error, diff) => {
if (error != null) {
throw error
}
expect(diff).to.deep.equal({
diff: [
{
pathname: 'two.tex',
operation: 'added',
editable: true,
},
],
})
return done()
}
)
})
it('should return 422 with a chunk with an invalid rename', function (done) {
MockHistoryStore()
.get(`/api/projects/${this.historyId}/versions/6/history`)
.reply(200, {
chunk: {
history: {
snapshot: {
files: {
'foo.tex': {
hash: sha('mock-sha-foo'),
stringLength: 42,
},
'bar.tex': {
hash: sha('mock-sha-bar'),
stringLength: 42,
},
},
},
changes: [
{
operations: [
{
pathname: 'foo.tex',
newPathname: 'bar.tex',
},
],
timestamp: '2017-12-04T10:29:17.786Z',
authors: [31],
},
],
},
startVersion: 5,
},
authors: [{ id: 31, email: 'james.allen@overleaf.com', name: 'James' }],
})
return ProjectHistoryClient.getFileTreeDiff(
this.projectId,
5,
6,
(error, diff, statusCode) => {
if (error != null) {
throw error
}
expect(statusCode).to.equal(422)
return done()
}
)
})
it('should return 200 with a chunk with an invalid add', function (done) {
MockHistoryStore()
.get(`/api/projects/${this.historyId}/versions/6/history`)
.reply(200, {
chunk: {
history: {
snapshot: {
files: {
'foo.tex': {
hash: sha('mock-sha-foo'),
stringLength: 42,
},
},
},
changes: [
{
operations: [
{
file: {
hash: sha('new-sha'),
},
pathname: 'foo.tex',
},
],
timestamp: '2017-12-04T10:29:17.786Z',
authors: [31],
},
],
},
startVersion: 5,
},
authors: [{ id: 31, email: 'james.allen@overleaf.com', name: 'James' }],
})
return ProjectHistoryClient.getFileTreeDiff(
this.projectId,
5,
6,
(error, diff, statusCode) => {
if (error != null) {
throw error
}
expect(diff).to.deep.equal({
diff: [
{
pathname: 'foo.tex',
operation: 'added',
editable: null,
},
],
})
expect(statusCode).to.equal(200)
return done()
}
)
})
  it('should handle edits of missing/invalid files', function (done) {
MockHistoryStore()
.get(`/api/projects/${this.historyId}/versions/5/history`)
.reply(200, {
chunk: {
history: {
snapshot: {
files: {},
},
changes: [
{
operations: [
{
pathname: 'new.tex',
textOperation: ['lorem ipsum'],
},
],
timestamp: '2017-12-04T10:29:18.786Z',
authors: [31],
},
{
operations: [
{
pathname: '',
textOperation: ['lorem ipsum'],
},
],
timestamp: '2017-12-04T10:29:17.786Z',
authors: [31],
},
],
},
startVersion: 3,
},
authors: [{ id: 31, email: 'james.allen@overleaf.com', name: 'James' }],
})
return ProjectHistoryClient.getFileTreeDiff(
this.projectId,
3,
5,
(error, diff) => {
if (error != null) {
throw error
}
expect(diff).to.deep.equal({
diff: [
{
operation: 'edited',
pathname: 'new.tex',
},
],
})
return done()
}
)
})
  it('should handle deletions of missing/invalid files', function (done) {
MockHistoryStore()
.get(`/api/projects/${this.historyId}/versions/5/history`)
.reply(200, {
chunk: {
history: {
snapshot: {
files: {},
},
changes: [
{
operations: [
{
pathname: 'missing.tex',
newPathname: '',
},
],
timestamp: '2017-12-04T10:29:17.786Z',
authors: [31],
},
{
operations: [
{
pathname: '',
newPathname: '',
},
],
timestamp: '2017-12-04T10:29:17.786Z',
authors: [31],
},
],
},
startVersion: 3,
},
authors: [{ id: 31, email: 'james.allen@overleaf.com', name: 'James' }],
})
return ProjectHistoryClient.getFileTreeDiff(
this.projectId,
3,
5,
(error, diff) => {
if (error != null) {
throw error
}
expect(diff).to.deep.equal({
diff: [],
})
return done()
}
)
})
  return it('should handle renames of missing/invalid files', function (done) {
MockHistoryStore()
.get(`/api/projects/${this.historyId}/versions/5/history`)
.reply(200, {
chunk: {
history: {
snapshot: {
files: {},
},
changes: [
{
operations: [
{
pathname: 'missing.tex',
newPathname: 'missing-renamed.tex',
},
],
timestamp: '2017-12-04T10:29:17.786Z',
authors: [31],
},
{
operations: [
{
pathname: '',
newPathname: 'missing-renamed-other.tex',
},
],
timestamp: '2017-12-04T10:29:17.786Z',
authors: [31],
},
],
},
startVersion: 3,
},
authors: [{ id: 31, email: 'james.allen@overleaf.com', name: 'James' }],
})
return ProjectHistoryClient.getFileTreeDiff(
this.projectId,
3,
5,
(error, diff) => {
if (error != null) {
throw error
}
expect(diff).to.deep.equal({
diff: [],
})
return done()
}
)
})
})
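Taken together, the assertions above pin down the shape of a file-tree diff entry. A reconstructed JSDoc typedef, inferred from these expectations rather than from the service source:

/**
 * One entry in a file-tree diff, as asserted in the tests above.
 * @typedef {Object} FileTreeDiffEntry
 * @property {string} pathname
 * @property {string} [newPathname] - present on renames
 * @property {'added'|'edited'|'removed'|'renamed'} [operation] - absent for untouched files (see baz.tex)
 * @property {boolean|null} [editable] - null when the string length is unknown (see the invalid-add case)
 * @property {number} [deletedAtV] - version at which a removal took effect
 */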


@@ -0,0 +1,242 @@
import async from 'async'
import nock from 'nock'
import { expect } from 'chai'
import request from 'request'
import assert from 'node:assert'
import mongodb from 'mongodb-legacy'
import * as ProjectHistoryClient from './helpers/ProjectHistoryClient.js'
import * as ProjectHistoryApp from './helpers/ProjectHistoryApp.js'
const { ObjectId } = mongodb
const MockHistoryStore = () => nock('http://127.0.0.1:3100')
const MockWeb = () => nock('http://127.0.0.1:3000')
describe('Flushing old queues', function () {
const historyId = new ObjectId().toString()
beforeEach(function (done) {
this.timestamp = new Date()
ProjectHistoryApp.ensureRunning(error => {
if (error) {
throw error
}
this.projectId = new ObjectId().toString()
this.docId = new ObjectId().toString()
this.fileId = new ObjectId().toString()
MockHistoryStore().post('/api/projects').reply(200, {
projectId: historyId,
})
MockWeb()
.get(`/project/${this.projectId}/details`)
.reply(200, {
name: 'Test Project',
overleaf: {
history: {
id: historyId,
},
},
})
MockHistoryStore()
.get(`/api/projects/${historyId}/latest/history`)
.reply(200, {
chunk: {
startVersion: 0,
history: {
changes: [],
},
},
})
ProjectHistoryClient.initializeProject(historyId, done)
})
})
afterEach(function () {
nock.cleanAll()
})
describe('retrying an unflushed project', function () {
describe('when the update is older than the cutoff', function () {
beforeEach(function (done) {
this.flushCall = MockHistoryStore()
.put(
`/api/projects/${historyId}/blobs/0a207c060e61f3b88eaee0a8cd0696f46fb155eb`
)
.reply(201)
.post(`/api/projects/${historyId}/legacy_changes?end_version=0`)
.reply(200)
const update = {
pathname: '/main.tex',
docLines: 'a\nb',
doc: this.docId,
meta: { user_id: this.user_id, ts: new Date() },
}
async.series(
[
cb =>
ProjectHistoryClient.pushRawUpdate(this.projectId, update, cb),
cb =>
ProjectHistoryClient.setFirstOpTimestamp(
this.projectId,
Date.now() - 24 * 3600 * 1000,
cb
),
],
done
)
})
it('flushes the project history queue', function (done) {
request.post(
{
url: 'http://127.0.0.1:3054/flush/old?maxAge=10800',
},
(error, res, body) => {
if (error) {
return done(error)
}
expect(res.statusCode).to.equal(200)
assert(
this.flushCall.isDone(),
'made calls to history service to store updates'
)
done()
}
)
})
it('flushes the project history queue in the background when requested', function (done) {
request.post(
{
url: 'http://127.0.0.1:3054/flush/old?maxAge=10800&background=1',
},
(error, res, body) => {
if (error) {
return done(error)
}
expect(res.statusCode).to.equal(200)
expect(body).to.equal('{"message":"running flush in background"}')
assert(
!this.flushCall.isDone(),
'did not make calls to history service to store updates in the foreground'
)
setTimeout(() => {
assert(
this.flushCall.isDone(),
'made calls to history service to store updates in the background'
)
done()
}, 100)
}
)
})
})
describe('when the update is newer than the cutoff', function () {
beforeEach(function (done) {
this.flushCall = MockHistoryStore()
.put(
`/api/projects/${historyId}/blobs/0a207c060e61f3b88eaee0a8cd0696f46fb155eb`
)
.reply(201)
.post(`/api/projects/${historyId}/legacy_changes?end_version=0`)
.reply(200)
const update = {
pathname: '/main.tex',
docLines: 'a\nb',
doc: this.docId,
meta: { user_id: this.user_id, ts: new Date() },
}
async.series(
[
cb =>
ProjectHistoryClient.pushRawUpdate(this.projectId, update, cb),
cb =>
ProjectHistoryClient.setFirstOpTimestamp(
this.projectId,
Date.now() - 60 * 1000,
cb
),
],
done
)
})
it('does not flush the project history queue', function (done) {
request.post(
{
url: `http://127.0.0.1:3054/flush/old?maxAge=${3 * 3600}`,
},
(error, res, body) => {
if (error) {
return done(error)
}
expect(res.statusCode).to.equal(200)
assert(
!this.flushCall.isDone(),
'did not make calls to history service to store updates'
)
done()
}
)
})
})
describe('when the update does not have a timestamp', function () {
beforeEach(function (done) {
this.flushCall = MockHistoryStore()
.put(
`/api/projects/${historyId}/blobs/0a207c060e61f3b88eaee0a8cd0696f46fb155eb`
)
.reply(201)
.post(`/api/projects/${historyId}/legacy_changes?end_version=0`)
.reply(200)
const update = {
pathname: '/main.tex',
docLines: 'a\nb',
doc: this.docId,
meta: { user_id: this.user_id, ts: new Date() },
}
this.startDate = Date.now()
async.series(
[
cb =>
ProjectHistoryClient.pushRawUpdate(this.projectId, update, cb),
cb =>
ProjectHistoryClient.clearFirstOpTimestamp(this.projectId, cb),
],
done
)
})
it('flushes the project history queue anyway', function (done) {
request.post(
{
url: `http://127.0.0.1:3054/flush/old?maxAge=${3 * 3600}`,
},
(error, res, body) => {
if (error) {
return done(error)
}
expect(res.statusCode).to.equal(200)
assert(
this.flushCall.isDone(),
'made calls to history service to store updates'
)
ProjectHistoryClient.getFirstOpTimestamp(
this.projectId,
(err, result) => {
if (err) {
return done(err)
}
expect(result).to.be.null
done()
}
)
}
)
})
})
})
})
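The three cases above imply a single cutoff rule for /flush/old. A minimal sketch of that rule, inferred from the tests rather than from the service implementation:

// flush when the queue's first-op timestamp is older than maxAge, or missing
function shouldFlush(firstOpTimestamp, maxAgeSeconds, now = Date.now()) {
  if (firstOpTimestamp == null) {
    return true // no timestamp recorded: flush anyway
  }
  return firstOpTimestamp < now - maxAgeSeconds * 1000
}

shouldFlush(Date.now() - 24 * 3600 * 1000, 10800) // day-old update, 3h cutoff => true
shouldFlush(Date.now() - 60 * 1000, 3 * 3600) // minute-old update, 3h cutoff => false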


@@ -0,0 +1,158 @@
import { expect } from 'chai'
import mongodb from 'mongodb-legacy'
import nock from 'nock'
import Core from 'overleaf-editor-core'
import * as ProjectHistoryClient from './helpers/ProjectHistoryClient.js'
import * as ProjectHistoryApp from './helpers/ProjectHistoryApp.js'
import latestChunk from '../fixtures/chunks/7-8.json' with { type: 'json' }
import previousChunk from '../fixtures/chunks/4-6.json' with { type: 'json' }
import firstChunk from '../fixtures/chunks/0-3.json' with { type: 'json' }
const { ObjectId } = mongodb
const MockHistoryStore = () => nock('http://127.0.0.1:3100')
const MockWeb = () => nock('http://127.0.0.1:3000')
const fixture = path => new URL(`../fixtures/${path}`, import.meta.url)
describe('GetChangesInChunkSince', function () {
let projectId, historyId
beforeEach(function (done) {
projectId = new ObjectId().toString()
historyId = new ObjectId().toString()
ProjectHistoryApp.ensureRunning(error => {
if (error) throw error
MockHistoryStore().post('/api/projects').reply(200, {
projectId: historyId,
})
ProjectHistoryClient.initializeProject(historyId, (error, olProject) => {
if (error) throw error
MockWeb()
.get(`/project/${projectId}/details`)
.reply(200, {
name: 'Test Project',
overleaf: { history: { id: olProject.id } },
})
MockHistoryStore()
.get(`/api/projects/${historyId}/latest/history`)
.replyWithFile(200, fixture('chunks/7-8.json'))
MockHistoryStore()
.get(`/api/projects/${historyId}/versions/7/history`)
.replyWithFile(200, fixture('chunks/7-8.json'))
MockHistoryStore()
.get(`/api/projects/${historyId}/versions/6/history`)
.replyWithFile(200, fixture('chunks/7-8.json'))
MockHistoryStore()
.get(`/api/projects/${historyId}/versions/5/history`)
.replyWithFile(200, fixture('chunks/4-6.json'))
MockHistoryStore()
.get(`/api/projects/${historyId}/versions/4/history`)
.replyWithFile(200, fixture('chunks/4-6.json'))
MockHistoryStore()
.get(`/api/projects/${historyId}/versions/3/history`)
.replyWithFile(200, fixture('chunks/4-6.json'))
MockHistoryStore()
.get(`/api/projects/${historyId}/versions/2/history`)
.replyWithFile(200, fixture('chunks/0-3.json'))
MockHistoryStore()
.get(`/api/projects/${historyId}/versions/1/history`)
.replyWithFile(200, fixture('chunks/0-3.json'))
MockHistoryStore()
.get(`/api/projects/${historyId}/versions/0/history`)
.replyWithFile(200, fixture('chunks/0-3.json'))
done()
})
})
})
afterEach(function () {
nock.cleanAll()
})
function expectChangesSince(version, n, changes, done) {
ProjectHistoryClient.getChangesInChunkSince(
projectId,
version,
{},
(error, got) => {
if (error) throw error
expect(got.latestStartVersion).to.equal(6)
expect(got.changes).to.have.length(n)
expect(got.changes.map(c => Core.Change.fromRaw(c))).to.deep.equal(
changes.map(c => Core.Change.fromRaw(c))
)
done()
}
)
}
const cases = {
8: {
name: 'when up-to-date, return zero changes',
n: 0,
changes: [],
},
7: {
name: 'when one version behind, return one change',
n: 1,
changes: latestChunk.chunk.history.changes.slice(1),
},
6: {
name: 'when at current chunk boundary, return latest chunk in full',
n: 2,
changes: latestChunk.chunk.history.changes,
},
5: {
name: 'when one version behind last chunk, return one change',
n: 1,
changes: previousChunk.chunk.history.changes.slice(2),
},
4: {
name: 'when in last chunk, return two changes',
n: 2,
changes: previousChunk.chunk.history.changes.slice(1),
},
3: {
name: 'when at previous chunk boundary, return just the previous chunk',
n: 3,
changes: previousChunk.chunk.history.changes,
},
2: {
name: 'when at end of first chunk, return one change',
n: 1,
changes: firstChunk.chunk.history.changes.slice(2),
},
1: {
name: 'when in first chunk, return two changes',
n: 2,
changes: firstChunk.chunk.history.changes.slice(1),
},
0: {
name: 'when from zero, return just the first chunk',
n: 3,
changes: firstChunk.chunk.history.changes,
},
}
for (const [since, { name, n, changes }] of Object.entries(cases)) {
it(name, function (done) {
expectChangesSince(since, n, changes, done)
})
}
it('should return an error when past the end version', function (done) {
ProjectHistoryClient.getChangesInChunkSince(
projectId,
9,
{ allowErrors: true },
(error, _body, statusCode) => {
if (error) throw error
expect(statusCode).to.equal(400)
done()
}
)
})
})
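All nine table-driven cases follow one rule over the fixture chunks (0-3, 4-6, 7-8): find the chunk whose version range contains `since`, then return its changes from offset `since - startVersion` onward. A sketch of that slicing, inferred from the cases rather than from the route's source:

function changesInChunkSince(chunk, since) {
  // e.g. the 7-8 fixture has startVersion 6, so since=7 keeps one change
  return chunk.history.changes.slice(since - chunk.startVersion)
}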


@@ -0,0 +1,76 @@
/* eslint-disable
no-undef,
*/
// TODO: This file was created by bulk-decaffeinate.
// Fix any style issues and re-enable lint.
/*
* decaffeinate suggestions:
* DS102: Remove unnecessary code created because of implicit returns
* DS207: Consider shorter variations of null checks
* Full docs: https://github.com/decaffeinate/decaffeinate/blob/master/docs/suggestions.md
*/
import { expect } from 'chai'
import settings from '@overleaf/settings'
import request from 'request'
import mongodb from 'mongodb-legacy'
import nock from 'nock'
import * as ProjectHistoryClient from './helpers/ProjectHistoryClient.js'
import * as ProjectHistoryApp from './helpers/ProjectHistoryApp.js'
const { ObjectId } = mongodb
const MockHistoryStore = () => nock('http://127.0.0.1:3100')
const MockWeb = () => nock('http://127.0.0.1:3000')
describe('Health Check', function () {
beforeEach(function (done) {
const projectId = new ObjectId()
const historyId = new ObjectId().toString()
settings.history.healthCheck = { project_id: projectId }
return ProjectHistoryApp.ensureRunning(error => {
if (error != null) {
throw error
}
MockHistoryStore().post('/api/projects').reply(200, {
projectId: historyId,
})
MockHistoryStore()
.get(`/api/projects/${historyId}/latest/history`)
.reply(200, {
chunk: {
startVersion: 0,
history: {
snapshot: {},
changes: [],
},
},
})
MockWeb()
.get(`/project/${projectId}/details`)
.reply(200, {
name: 'Test Project',
overleaf: {
history: {
id: historyId,
},
},
})
return ProjectHistoryClient.initializeProject(historyId, done)
})
})
return it('should respond to the health check', function (done) {
return request.get(
{
url: 'http://127.0.0.1:3054/health_check',
},
(error, res, body) => {
        if (error != null) {
          return done(error)
        }
expect(res.statusCode).to.equal(200)
return done()
}
)
})
})


@@ -0,0 +1,282 @@
import { expect } from 'chai'
import mongodb from 'mongodb-legacy'
import nock from 'nock'
import * as ProjectHistoryClient from './helpers/ProjectHistoryClient.js'
import * as ProjectHistoryApp from './helpers/ProjectHistoryApp.js'
const { ObjectId } = mongodb
const MockHistoryStore = () => nock('http://127.0.0.1:3100')
const MockWeb = () => nock('http://127.0.0.1:3000')
const fixture = path => new URL(`../fixtures/${path}`, import.meta.url)
describe('Labels', function () {
beforeEach(function (done) {
ProjectHistoryApp.ensureRunning(error => {
if (error != null) {
throw error
}
this.historyId = new ObjectId().toString()
MockHistoryStore().post('/api/projects').reply(200, {
projectId: this.historyId,
})
ProjectHistoryClient.initializeProject(
this.historyId,
(error, olProject) => {
if (error != null) {
throw error
}
this.project_id = new ObjectId().toString()
MockWeb()
.get(`/project/${this.project_id}/details`)
.reply(200, {
name: 'Test Project',
overleaf: { history: { id: olProject.id } },
})
MockHistoryStore()
.get(`/api/projects/${this.historyId}/latest/history`)
.replyWithFile(200, fixture('chunks/7-8.json'))
MockHistoryStore()
.get(`/api/projects/${this.historyId}/versions/7/history`)
.replyWithFile(200, fixture('chunks/7-8.json'))
.persist()
MockHistoryStore()
.get(`/api/projects/${this.historyId}/versions/8/history`)
.replyWithFile(200, fixture('chunks/7-8.json'))
.persist()
this.comment = 'a saved version comment'
this.comment2 = 'another saved version comment'
this.user_id = new ObjectId().toString()
this.created_at = new Date(1)
done()
}
)
})
})
afterEach(function () {
nock.cleanAll()
})
it('can create and get labels', function (done) {
ProjectHistoryClient.createLabel(
this.project_id,
this.user_id,
7,
this.comment,
this.created_at,
(error, label) => {
if (error != null) {
throw error
}
ProjectHistoryClient.getLabels(this.project_id, (error, labels) => {
if (error != null) {
throw error
}
expect(labels).to.deep.equal([label])
done()
})
}
)
})
it('can create and get labels with no user id', function (done) {
const userId = undefined
ProjectHistoryClient.createLabel(
this.project_id,
userId,
7,
this.comment,
this.created_at,
(error, label) => {
if (error != null) {
throw error
}
ProjectHistoryClient.getLabels(this.project_id, (error, labels) => {
if (error != null) {
throw error
}
expect(labels).to.deep.equal([label])
done()
})
}
)
})
it('can delete labels', function (done) {
ProjectHistoryClient.createLabel(
this.project_id,
this.user_id,
7,
this.comment,
this.created_at,
(error, label) => {
if (error != null) {
throw error
}
ProjectHistoryClient.deleteLabel(this.project_id, label.id, error => {
if (error != null) {
throw error
}
ProjectHistoryClient.getLabels(this.project_id, (error, labels) => {
if (error != null) {
throw error
}
expect(labels).to.deep.equal([])
done()
})
})
}
)
})
it('can delete labels for the current user', function (done) {
ProjectHistoryClient.createLabel(
this.project_id,
this.user_id,
7,
this.comment,
this.created_at,
(error, label) => {
if (error != null) {
throw error
}
ProjectHistoryClient.deleteLabelForUser(
this.project_id,
this.user_id,
label.id,
error => {
if (error != null) {
throw error
}
ProjectHistoryClient.getLabels(this.project_id, (error, labels) => {
if (error != null) {
throw error
}
expect(labels).to.deep.equal([])
done()
})
}
)
}
)
})
it('can transfer ownership of labels', function (done) {
const fromUser = new ObjectId().toString()
const toUser = new ObjectId().toString()
ProjectHistoryClient.createLabel(
this.project_id,
fromUser,
7,
this.comment,
this.created_at,
(error, label) => {
if (error != null) {
throw error
}
ProjectHistoryClient.createLabel(
this.project_id,
fromUser,
7,
this.comment2,
this.created_at,
(error, label2) => {
if (error != null) {
throw error
}
ProjectHistoryClient.transferLabelOwnership(
fromUser,
toUser,
error => {
if (error != null) {
throw error
}
ProjectHistoryClient.getLabels(
this.project_id,
(error, labels) => {
if (error != null) {
throw error
}
expect(labels).to.deep.equal([
{
id: label.id,
comment: label.comment,
version: label.version,
created_at: label.created_at,
user_id: toUser,
},
{
id: label2.id,
comment: label2.comment,
version: label2.version,
created_at: label2.created_at,
user_id: toUser,
},
])
done()
}
)
}
)
}
)
}
)
})
it('should return labels with summarized updates', function (done) {
ProjectHistoryClient.createLabel(
this.project_id,
this.user_id,
8,
this.comment,
this.created_at,
(error, label) => {
if (error != null) {
throw error
}
ProjectHistoryClient.getSummarizedUpdates(
this.project_id,
{ min_count: 1 },
(error, updates) => {
if (error != null) {
throw error
}
expect(updates).to.deep.equal({
nextBeforeTimestamp: 6,
updates: [
{
fromV: 6,
toV: 8,
meta: {
users: ['5a5637efdac84e81b71014c4', 31],
start_ts: 1512383567277,
end_ts: 1512383572877,
},
pathnames: ['bar.tex', 'main.tex'],
project_ops: [],
labels: [
{
id: label.id.toString(),
comment: this.comment,
version: 8,
user_id: this.user_id,
created_at: this.created_at.toISOString(),
},
],
},
],
})
done()
}
)
}
)
})
})
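The transfer-ownership expectation above also fixes the label shape the service returns. A reconstructed typedef, based only on these assertions:

/**
 * @typedef {Object} Label
 * @property {string} id
 * @property {string} comment
 * @property {number} version
 * @property {string} [user_id] - undefined for labels created with no user
 * @property {string} created_at - serialized as an ISO 8601 string in responses
 */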


@@ -0,0 +1,78 @@
import { expect } from 'chai'
import mongodb from 'mongodb-legacy'
import nock from 'nock'
import * as ProjectHistoryClient from './helpers/ProjectHistoryClient.js'
import * as ProjectHistoryApp from './helpers/ProjectHistoryApp.js'
const { ObjectId } = mongodb
const MockHistoryStore = () => nock('http://127.0.0.1:3100')
const MockWeb = () => nock('http://127.0.0.1:3000')
const fixture = path => new URL(`../fixtures/${path}`, import.meta.url)
describe('LatestSnapshot', function () {
beforeEach(function (done) {
ProjectHistoryApp.ensureRunning(error => {
if (error) {
throw error
}
this.historyId = new ObjectId().toString()
MockHistoryStore().post('/api/projects').reply(200, {
projectId: this.historyId,
})
ProjectHistoryClient.initializeProject(
this.historyId,
(error, v1Project) => {
if (error) {
throw error
}
this.projectId = new ObjectId().toString()
MockWeb()
.get(`/project/${this.projectId}/details`)
.reply(200, {
name: 'Test Project',
overleaf: { history: { id: v1Project.id } },
})
done()
}
)
})
})
afterEach(function () {
nock.cleanAll()
})
it('should return the snapshot with applied changes, metadata and without full content', function (done) {
MockHistoryStore()
.get(`/api/projects/${this.historyId}/latest/history`)
.replyWithFile(200, fixture('chunks/0-3.json'))
ProjectHistoryClient.getLatestSnapshot(this.projectId, (error, body) => {
if (error) {
throw error
}
expect(body).to.deep.equal({
snapshot: {
files: {
'main.tex': {
hash: 'f28571f561d198b87c24cc6a98b78e87b665e22d',
stringLength: 20649,
operations: [{ textOperation: [1912, 'Hello world', 18726] }],
metadata: { main: true },
},
'foo.tex': {
hash: '4f785a4c192155b240e3042b3a7388b47603f423',
stringLength: 41,
operations: [{ textOperation: [26, '\n\nFour five six'] }],
},
},
},
version: 3,
})
done()
})
})
})
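Note that the latest-snapshot endpoint returns each file's pending operations rather than fully expanded content. As orientation, a hedged sketch of how a textOperation such as [1912, 'Hello world', 18726] applies (numbers retain, strings insert; deletions, which Overleaf's history format encodes as negative numbers, are omitted here, and this is an illustration rather than overleaf-editor-core's API):

function applyTextOperation(doc, textOperation) {
  let out = ''
  let pos = 0
  for (const op of textOperation) {
    if (typeof op === 'number') {
      out += doc.slice(pos, pos + op) // retain `op` characters
      pos += op
    } else {
      out += op // insert literal text
    }
  }
  return out
}
// [1912, 'Hello world', 18726] retains 1912 chars, inserts 11, retains 18726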


@@ -0,0 +1,298 @@
import { expect } from 'chai'
import mongodb from 'mongodb-legacy'
import nock from 'nock'
import * as ProjectHistoryClient from './helpers/ProjectHistoryClient.js'
import * as ProjectHistoryApp from './helpers/ProjectHistoryApp.js'
const { ObjectId } = mongodb
const MockHistoryStore = () => nock('http://127.0.0.1:3100')
const MockWeb = () => nock('http://127.0.0.1:3000')
const fixture = path => new URL(`../fixtures/${path}`, import.meta.url)
describe('ReadSnapshot', function () {
beforeEach(function (done) {
ProjectHistoryApp.ensureRunning(error => {
if (error) {
throw error
}
this.historyId = new ObjectId().toString()
MockHistoryStore().post('/api/projects').reply(200, {
projectId: this.historyId,
})
ProjectHistoryClient.initializeProject(
this.historyId,
(error, v1Project) => {
if (error) {
throw error
}
this.projectId = new ObjectId().toString()
MockWeb()
.get(`/project/${this.projectId}/details`)
.reply(200, {
name: 'Test Project',
overleaf: { history: { id: v1Project.id } },
})
done()
}
)
})
})
afterEach(function () {
nock.cleanAll()
})
describe('of a text file', function () {
it('should return the snapshot of a doc at the given version', function (done) {
MockHistoryStore()
.get(`/api/projects/${this.historyId}/versions/5/history`)
.replyWithFile(200, fixture('chunks/4-6.json'))
MockHistoryStore()
.get(
`/api/projects/${this.historyId}/blobs/c6654ea913979e13e22022653d284444f284a172`
)
.replyWithFile(
200,
fixture('blobs/c6654ea913979e13e22022653d284444f284a172')
)
ProjectHistoryClient.getSnapshot(
this.projectId,
'foo.tex',
5,
(error, body) => {
if (error) {
throw error
}
expect(body).to.deep.equal(
`\
Hello world
One two three
Four five six
Seven eight nine\
`.replace(/^\t/g, '')
)
done()
}
)
})
it('should return the snapshot of a doc at a different version', function (done) {
MockHistoryStore()
.get(`/api/projects/${this.historyId}/versions/4/history`)
.replyWithFile(200, fixture('chunks/4-6.json'))
MockHistoryStore()
.get(
`/api/projects/${this.historyId}/blobs/c6654ea913979e13e22022653d284444f284a172`
)
.replyWithFile(
200,
fixture('blobs/c6654ea913979e13e22022653d284444f284a172')
)
ProjectHistoryClient.getSnapshot(
this.projectId,
'foo.tex',
4,
(error, body) => {
if (error) {
throw error
}
expect(body).to.deep.equal(
`\
Hello world
One two three
Four five six
Seven eight nince\
`.replace(/^\t/g, '')
)
done()
}
)
})
it('should return the snapshot of a doc after a rename version', function (done) {
MockHistoryStore()
.get(`/api/projects/${this.historyId}/versions/6/history`)
.replyWithFile(200, fixture('chunks/4-6.json'))
MockHistoryStore()
.get(
`/api/projects/${this.historyId}/blobs/c6654ea913979e13e22022653d284444f284a172`
)
.replyWithFile(
200,
fixture('blobs/c6654ea913979e13e22022653d284444f284a172')
)
ProjectHistoryClient.getSnapshot(
this.projectId,
'bar.tex',
6,
(error, body) => {
if (error) {
throw error
}
expect(body).to.deep.equal(
`\
Hello world
One two three
Four five six
Seven eight nine\
`.replace(/^\t/g, '')
)
done()
}
)
})
})
describe('of a binary file', function () {
beforeEach(function () {
MockHistoryStore()
.get(`/api/projects/${this.historyId}/versions/4/history`)
.reply(200, {
chunk: {
history: {
snapshot: {
files: {
binary_file: {
hash: 'c6654ea913979e13e22022653d284444f284a172',
byteLength: 41,
},
},
},
changes: [],
},
startVersion: 3,
},
authors: [],
})
})
it('should return the snapshot of the file at the given version', function (done) {
MockHistoryStore()
.get(
`/api/projects/${this.historyId}/blobs/c6654ea913979e13e22022653d284444f284a172`
)
.replyWithFile(
200,
fixture('blobs/c6654ea913979e13e22022653d284444f284a172')
)
ProjectHistoryClient.getSnapshot(
this.projectId,
'binary_file',
4,
(error, body) => {
if (error) {
throw error
}
expect(body).to.deep.equal(
`\
Hello world
One two three
Four five six\
`.replace(/^\t/g, '')
)
done()
}
)
})
it("should return an error when the blob doesn't exist", function (done) {
MockHistoryStore()
.get(`/api/projects/${this.historyId}/versions/4/history`)
.reply(200, {
chunk: {
history: {
snapshot: {
files: {
binary_file: {
hash: 'c6654ea913979e13e22022653d284444f284a172',
byteLength: 41,
},
},
},
changes: [],
},
startVersion: 3,
},
authors: [],
})
MockHistoryStore()
.get(
`/api/projects/${this.historyId}/blobs/c6654ea913979e13e22022653d284444f284a172`
)
.reply(404)
ProjectHistoryClient.getSnapshot(
this.projectId,
'binary_file',
4,
{ allowErrors: true },
(error, body, statusCode) => {
if (error) {
throw error
}
expect(statusCode).to.equal(500)
done()
}
)
})
it('should return an error when the blob request errors', function (done) {
MockHistoryStore()
.get(`/api/projects/${this.historyId}/versions/4/history`)
.reply(200, {
chunk: {
history: {
snapshot: {
files: {
binary_file: {
hash: 'c6654ea913979e13e22022653d284444f284a172',
byteLength: 41,
},
},
},
changes: [],
},
startVersion: 3,
},
authors: [],
})
MockHistoryStore()
.get(
`/api/projects/${this.historyId}/blobs/c6654ea913979e13e22022653d284444f284a172`
)
.replyWithError('oh no!')
ProjectHistoryClient.getSnapshot(
this.projectId,
'binary_file',
4,
{ allowErrors: true },
(error, body, statusCode) => {
if (error) {
throw error
}
expect(statusCode).to.equal(500)
done()
}
)
})
})
})


@@ -0,0 +1,194 @@
import async from 'async'
import nock from 'nock'
import { expect } from 'chai'
import request from 'request'
import assert from 'node:assert'
import mongodb from 'mongodb-legacy'
import * as ProjectHistoryClient from './helpers/ProjectHistoryClient.js'
import * as ProjectHistoryApp from './helpers/ProjectHistoryApp.js'
const { ObjectId } = mongodb
const MockHistoryStore = () => nock('http://127.0.0.1:3100')
const MockWeb = () => nock('http://127.0.0.1:3000')
const MockCallback = () => nock('http://127.0.0.1')
describe('Retrying failed projects', function () {
const historyId = new ObjectId().toString()
beforeEach(function (done) {
this.timestamp = new Date()
ProjectHistoryApp.ensureRunning(error => {
if (error) {
throw error
}
this.project_id = new ObjectId().toString()
this.doc_id = new ObjectId().toString()
this.file_id = new ObjectId().toString()
MockHistoryStore().post('/api/projects').reply(200, {
projectId: historyId,
})
MockWeb()
.get(`/project/${this.project_id}/details`)
.reply(200, {
name: 'Test Project',
overleaf: {
history: {
id: historyId,
},
},
})
MockHistoryStore()
.get(`/api/projects/${historyId}/latest/history`)
.reply(200, {
chunk: {
startVersion: 0,
history: {
changes: [],
},
},
})
ProjectHistoryClient.initializeProject(historyId, done)
})
})
afterEach(function () {
nock.cleanAll()
})
describe('retrying project history', function () {
describe('when there is a soft failure', function () {
beforeEach(function (done) {
this.flushCall = MockHistoryStore()
.put(
`/api/projects/${historyId}/blobs/0a207c060e61f3b88eaee0a8cd0696f46fb155eb`
)
.reply(201)
.post(`/api/projects/${historyId}/legacy_changes?end_version=0`)
.reply(200)
const update = {
pathname: '/main.tex',
docLines: 'a\nb',
doc: this.doc_id,
meta: { user_id: this.user_id, ts: new Date() },
}
async.series(
[
cb =>
ProjectHistoryClient.pushRawUpdate(this.project_id, update, cb),
cb =>
ProjectHistoryClient.setFailure(
{
project_id: this.project_id,
attempts: 1,
error: 'soft-error',
},
cb
),
],
done
)
})
it('flushes the project history queue', function (done) {
request.post(
{
url: 'http://127.0.0.1:3054/retry/failures?failureType=soft&limit=1&timeout=10000',
},
(error, res, body) => {
if (error) {
return done(error)
}
expect(res.statusCode).to.equal(200)
assert(
this.flushCall.isDone(),
'made calls to history service to store updates'
)
done()
}
)
})
it('retries in the background when requested', function (done) {
this.callback = MockCallback()
.matchHeader('Authorization', '123')
.get('/ping')
.reply(200)
request.post(
{
url: 'http://127.0.0.1:3054/retry/failures?failureType=soft&limit=1&timeout=10000&callbackUrl=http%3A%2F%2F127.0.0.1%2Fping',
headers: {
'X-CALLBACK-Authorization': '123',
},
},
(error, res, body) => {
if (error) {
return done(error)
}
expect(res.statusCode).to.equal(200)
expect(body).to.equal(
'{"retryStatus":"running retryFailures in background"}'
)
assert(
!this.flushCall.isDone(),
'did not make calls to history service to store updates in the foreground'
)
setTimeout(() => {
assert(
this.flushCall.isDone(),
'made calls to history service to store updates in the background'
)
assert(this.callback.isDone(), 'hit the callback url')
done()
}, 100)
}
)
})
})
describe('when there is a hard failure', function () {
beforeEach(function (done) {
MockWeb()
.get(`/project/${this.project_id}/details`)
.reply(200, {
name: 'Test Project',
overleaf: {
history: {
id: historyId,
},
},
})
ProjectHistoryClient.setFailure(
{
project_id: this.project_id,
attempts: 100,
error: 'hard-error',
},
done
)
})
it('calls web to resync the project', function (done) {
const resyncCall = MockWeb()
.post(`/project/${this.project_id}/history/resync`)
.reply(200)
request.post(
{
url: 'http://127.0.0.1:3054/retry/failures?failureType=hard&limit=1&timeout=10000',
},
(error, res, body) => {
if (error) {
return done(error)
}
expect(res.statusCode).to.equal(200)
assert(resyncCall.isDone(), 'made a call to web to resync project')
done()
}
)
})
})
})
})
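The two branches above outline how retries are routed: failure records have the shape { project_id, attempts, error }, soft failures are re-flushed to the history store, and hard failures trigger a full resync via web. A deliberately rough sketch of the classification (the tests vary both the attempts count and the error string, so the real discriminator is not pinned down here):

function isHardFailure(failure) {
  // hedged guess: these tests use attempts: 1 for soft and attempts: 100 for hard
  return failure.attempts >= 100
}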

File diff suppressed because it is too large


@@ -0,0 +1,249 @@
/* eslint-disable
no-undef,
no-unused-vars,
*/
// TODO: This file was created by bulk-decaffeinate.
// Fix any style issues and re-enable lint.
/*
* decaffeinate suggestions:
* DS102: Remove unnecessary code created because of implicit returns
* DS207: Consider shorter variations of null checks
* Full docs: https://github.com/decaffeinate/decaffeinate/blob/master/docs/suggestions.md
*/
import sinon from 'sinon'
import { expect } from 'chai'
import Settings from '@overleaf/settings'
import request from 'request'
import assert from 'node:assert'
import mongodb from 'mongodb-legacy'
import nock from 'nock'
import * as ProjectHistoryClient from './helpers/ProjectHistoryClient.js'
import * as ProjectHistoryApp from './helpers/ProjectHistoryApp.js'
const { ObjectId } = mongodb
const MockHistoryStore = () => nock('http://127.0.0.1:3100')
const MockFileStore = () => nock('http://127.0.0.1:3009')
const MockWeb = () => nock('http://127.0.0.1:3000')
const fixture = path => new URL(`../fixtures/${path}`, import.meta.url)
describe('Summarized updates', function () {
beforeEach(function (done) {
this.projectId = new ObjectId().toString()
this.historyId = new ObjectId().toString()
return ProjectHistoryApp.ensureRunning(error => {
if (error != null) {
throw error
}
MockHistoryStore().post('/api/projects').reply(200, {
projectId: this.historyId,
})
return ProjectHistoryClient.initializeProject(
this.historyId,
(error, olProject) => {
if (error != null) {
throw error
}
MockWeb()
.get(`/project/${this.projectId}/details`)
.reply(200, {
name: 'Test Project',
overleaf: { history: { id: olProject.id } },
})
MockHistoryStore()
.get(`/api/projects/${this.historyId}/latest/history`)
.replyWithFile(200, fixture('chunks/7-8.json'))
MockHistoryStore()
.get(`/api/projects/${this.historyId}/versions/6/history`)
.replyWithFile(200, fixture('chunks/4-6.json'))
MockHistoryStore()
.get(`/api/projects/${this.historyId}/versions/3/history`)
.replyWithFile(200, fixture('chunks/0-3.json'))
return done()
}
)
})
})
afterEach(function () {
return nock.cleanAll()
})
it('should return the latest summarized updates from a single chunk', function (done) {
return ProjectHistoryClient.getSummarizedUpdates(
this.projectId,
{ min_count: 1 },
(error, updates) => {
if (error != null) {
throw error
}
expect(updates).to.deep.equal({
nextBeforeTimestamp: 6,
updates: [
{
fromV: 6,
toV: 8,
meta: {
users: ['5a5637efdac84e81b71014c4', 31],
start_ts: 1512383567277,
end_ts: 1512383572877,
},
pathnames: ['bar.tex', 'main.tex'],
project_ops: [],
labels: [],
},
],
})
return done()
}
)
})
it('should return the latest summarized updates, with min_count spanning multiple chunks', function (done) {
return ProjectHistoryClient.getSummarizedUpdates(
this.projectId,
{ min_count: 5 },
(error, updates) => {
if (error != null) {
throw error
}
expect(updates).to.deep.equal({
updates: [
{
fromV: 6,
toV: 8,
meta: {
users: ['5a5637efdac84e81b71014c4', 31],
start_ts: 1512383567277,
end_ts: 1512383572877,
},
pathnames: ['bar.tex', 'main.tex'],
project_ops: [],
labels: [],
},
{
fromV: 5,
toV: 6,
meta: {
users: [31],
start_ts: 1512383366120,
end_ts: 1512383366120,
},
pathnames: [],
project_ops: [
{
atV: 5,
rename: {
pathname: 'foo.tex',
newPathname: 'bar.tex',
},
},
],
labels: [],
},
{
fromV: 2,
toV: 5,
meta: {
users: [31],
start_ts: 1512383313724,
end_ts: 1512383362905,
},
pathnames: ['foo.tex'],
project_ops: [],
labels: [],
},
{
fromV: 1,
toV: 2,
meta: {
users: [31],
start_ts: 1512383246874,
end_ts: 1512383246874,
},
pathnames: [],
project_ops: [
{
atV: 1,
rename: {
pathname: 'bar.tex',
newPathname: 'foo.tex',
},
},
],
labels: [],
},
{
fromV: 0,
toV: 1,
meta: {
users: [31],
start_ts: 1512383015633,
end_ts: 1512383015633,
},
pathnames: ['main.tex'],
project_ops: [],
labels: [],
},
],
})
return done()
}
)
})
it('should return the summarized updates from a before version at the start of a chunk', function (done) {
MockHistoryStore()
.get(`/api/projects/${this.historyId}/versions/4/history`)
.replyWithFile(200, fixture('chunks/4-6.json'))
return ProjectHistoryClient.getSummarizedUpdates(
this.projectId,
{ before: 4 },
(error, updates) => {
if (error != null) {
throw error
}
expect(updates.updates[0].toV).to.equal(4)
return done()
}
)
})
it('should return the summarized updates from a before version in the middle of a chunk', function (done) {
MockHistoryStore()
.get(`/api/projects/${this.historyId}/versions/5/history`)
.replyWithFile(200, fixture('chunks/4-6.json'))
return ProjectHistoryClient.getSummarizedUpdates(
this.projectId,
{ before: 5 },
(error, updates) => {
if (error != null) {
throw error
}
expect(updates.updates[0].toV).to.equal(5)
return done()
}
)
})
return it('should return the summarized updates from a before version at the end of a chunk', function (done) {
MockHistoryStore()
.get(`/api/projects/${this.historyId}/versions/6/history`)
.replyWithFile(200, fixture('chunks/4-6.json'))
return ProjectHistoryClient.getSummarizedUpdates(
this.projectId,
{ before: 6 },
(error, updates) => {
if (error != null) {
throw error
}
expect(updates.updates[0].toV).to.equal(6)
return done()
}
)
})
})

File diff suppressed because it is too large


@@ -0,0 +1,7 @@
// TODO: This file was created by bulk-decaffeinate.
// Sanity-check the conversion and remove this comment.
let id = 0
export function nextId() {
return id++
}


@@ -0,0 +1,41 @@
/* eslint-disable
no-unused-vars,
*/
// TODO: This file was created by bulk-decaffeinate.
// Fix any style issues and re-enable lint.
/*
* decaffeinate suggestions:
* DS102: Remove unnecessary code created because of implicit returns
* DS207: Consider shorter variations of null checks
* Full docs: https://github.com/decaffeinate/decaffeinate/blob/master/docs/suggestions.md
*/
import { expect } from 'chai'
import request from 'request'
import Settings from '@overleaf/settings'
export function getLatestContent(olProjectId, callback) {
if (callback == null) {
callback = function () {}
}
return request.get(
{
url: `${Settings.overleaf.history.host}/projects/${olProjectId}/latest/content`,
auth: {
user: Settings.overleaf.history.user,
pass: Settings.overleaf.history.pass,
sendImmediately: true,
},
},
    (error, res, body) => {
      if (error) {
        return callback(error)
      }
      if (res.statusCode < 200 || res.statusCode >= 300) {
        return callback(
          new Error(
            `history store returned a non-success status code: ${res.statusCode}`
          )
        )
      }
      return callback(null, JSON.parse(body))
    }
)
}


@@ -0,0 +1,41 @@
// TODO: This file was created by bulk-decaffeinate.
// Fix any style issues and re-enable lint.
/*
* decaffeinate suggestions:
* DS101: Remove unnecessary use of Array.from
* DS102: Remove unnecessary code created because of implicit returns
* DS205: Consider reworking code to avoid use of IIFEs
* DS207: Consider shorter variations of null checks
* Full docs: https://github.com/decaffeinate/decaffeinate/blob/master/docs/suggestions.md
*/
import { app } from '../../../../app/js/server.js'
let running = false
let initing = false
const callbacks = []
export function ensureRunning(callback) {
if (callback == null) {
callback = function () {}
}
if (running) {
return callback()
} else if (initing) {
return callbacks.push(callback)
}
initing = true
callbacks.push(callback)
app.listen(3054, '127.0.0.1', error => {
if (error != null) {
throw error
}
running = true
    // invoke every callback queued while the server was starting
    for (const cb of callbacks) {
      cb()
    }
})
}


@@ -0,0 +1,354 @@
import { expect } from 'chai'
import request from 'request'
import Settings from '@overleaf/settings'
import RedisWrapper from '@overleaf/redis-wrapper'
import { db } from '../../../../app/js/mongodb.js'
const rclient = RedisWrapper.createClient(Settings.redis.project_history)
const Keys = Settings.redis.project_history.key_schema
export function resetDatabase(callback) {
rclient.flushdb(callback)
}
export function initializeProject(historyId, callback) {
request.post(
{
url: 'http://127.0.0.1:3054/project',
json: { historyId },
},
(error, res, body) => {
if (error) {
return callback(error)
}
expect(res.statusCode).to.equal(200)
callback(null, body.project)
}
)
}
export function flushProject(projectId, options, callback) {
if (typeof options === 'function') {
callback = options
options = null
}
if (!options) {
options = { allowErrors: false }
}
request.post(
{
url: `http://127.0.0.1:3054/project/${projectId}/flush`,
},
(error, res, body) => {
if (error) {
return callback(error)
}
if (!options.allowErrors) {
expect(res.statusCode).to.equal(204)
}
callback(error, res)
}
)
}
export function getSummarizedUpdates(projectId, query, callback) {
request.get(
{
url: `http://127.0.0.1:3054/project/${projectId}/updates`,
qs: query,
json: true,
},
(error, res, body) => {
if (error) {
return callback(error)
}
expect(res.statusCode).to.equal(200)
callback(error, body)
}
)
}
export function getDiff(projectId, pathname, from, to, callback) {
request.get(
{
url: `http://127.0.0.1:3054/project/${projectId}/diff`,
qs: {
pathname,
from,
to,
},
json: true,
},
(error, res, body) => {
if (error) {
return callback(error)
}
expect(res.statusCode).to.equal(200)
callback(error, body)
}
)
}
export function getFileTreeDiff(projectId, from, to, callback) {
request.get(
{
url: `http://127.0.0.1:3054/project/${projectId}/filetree/diff`,
qs: {
from,
to,
},
json: true,
},
(error, res, body) => {
if (error) {
return callback(error)
}
callback(error, body, res.statusCode)
}
)
}
export function getChangesInChunkSince(projectId, since, options, callback) {
request.get(
{
url: `http://127.0.0.1:3054/project/${projectId}/changes-in-chunk`,
qs: {
since,
},
json: true,
},
(error, res, body) => {
if (error) return callback(error)
if (!options.allowErrors) {
expect(res.statusCode).to.equal(200)
}
callback(null, body, res.statusCode)
}
)
}
export function getLatestSnapshot(projectId, callback) {
request.get(
{
url: `http://127.0.0.1:3054/project/${projectId}/snapshot`,
json: true,
},
(error, res, body) => {
if (error) {
return callback(error)
}
expect(res.statusCode).to.equal(200)
callback(null, body)
}
)
}
export function getSnapshot(projectId, pathname, version, options, callback) {
if (typeof options === 'function') {
callback = options
options = null
}
if (!options) {
options = { allowErrors: false }
}
request.get(
{
url: `http://127.0.0.1:3054/project/${projectId}/version/${version}/${encodeURIComponent(
pathname
)}`,
},
(error, res, body) => {
if (error) {
return callback(error)
}
if (!options.allowErrors) {
expect(res.statusCode).to.equal(200)
}
callback(error, body, res.statusCode)
}
)
}
export function pushRawUpdate(projectId, update, callback) {
rclient.rpush(
Keys.projectHistoryOps({ project_id: projectId }),
JSON.stringify(update),
callback
)
}
export function setFirstOpTimestamp(projectId, timestamp, callback) {
rclient.set(
Keys.projectHistoryFirstOpTimestamp({ project_id: projectId }),
timestamp,
callback
)
}
export function getFirstOpTimestamp(projectId, callback) {
rclient.get(
Keys.projectHistoryFirstOpTimestamp({ project_id: projectId }),
callback
)
}
export function clearFirstOpTimestamp(projectId, callback) {
rclient.del(
Keys.projectHistoryFirstOpTimestamp({ project_id: projectId }),
callback
)
}
export function getQueueLength(projectId, callback) {
rclient.llen(Keys.projectHistoryOps({ project_id: projectId }), callback)
}
export function getQueueCounts(callback) {
return request.get(
{
url: 'http://127.0.0.1:3054/status/queue',
json: true,
},
callback
)
}
export function resyncHistory(projectId, callback) {
request.post(
{
url: `http://127.0.0.1:3054/project/${projectId}/resync`,
json: true,
body: { origin: { kind: 'test-origin' } },
},
(error, res, body) => {
if (error) {
return callback(error)
}
expect(res.statusCode).to.equal(204)
callback(error)
}
)
}
export function createLabel(
projectId,
userId,
version,
comment,
createdAt,
callback
) {
request.post(
{
url: `http://127.0.0.1:3054/project/${projectId}/labels`,
json: { comment, version, created_at: createdAt, user_id: userId },
},
(error, res, body) => {
if (error) {
return callback(error)
}
expect(res.statusCode).to.equal(200)
callback(null, body)
}
)
}
export function getLabels(projectId, callback) {
request.get(
{
url: `http://127.0.0.1:3054/project/${projectId}/labels`,
json: true,
},
(error, res, body) => {
if (error) {
return callback(error)
}
expect(res.statusCode).to.equal(200)
callback(null, body)
}
)
}
export function deleteLabelForUser(projectId, userId, labelId, callback) {
request.delete(
{
url: `http://127.0.0.1:3054/project/${projectId}/user/${userId}/labels/${labelId}`,
},
(error, res, body) => {
if (error) {
return callback(error)
}
expect(res.statusCode).to.equal(204)
callback(null, body)
}
)
}
export function deleteLabel(projectId, labelId, callback) {
request.delete(
{
url: `http://127.0.0.1:3054/project/${projectId}/labels/${labelId}`,
},
(error, res, body) => {
if (error) {
return callback(error)
}
expect(res.statusCode).to.equal(204)
callback(null, body)
}
)
}
export function setFailure(failureEntry, callback) {
db.projectHistoryFailures.deleteOne(
{ project_id: { $exists: true } },
(err, result) => {
if (err) {
return callback(err)
}
db.projectHistoryFailures.insertOne(failureEntry, callback)
}
)
}
export function getFailure(projectId, callback) {
db.projectHistoryFailures.findOne({ project_id: projectId }, callback)
}
export function transferLabelOwnership(fromUser, toUser, callback) {
request.post(
{
url: `http://127.0.0.1:3054/user/${fromUser}/labels/transfer/${toUser}`,
},
(error, res, body) => {
if (error) {
return callback(error)
}
expect(res.statusCode).to.equal(204)
callback(null, body)
}
)
}
export function getDump(projectId, callback) {
request.get(
`http://127.0.0.1:3054/project/${projectId}/dump`,
(err, res, body) => {
if (err) {
return callback(err)
}
expect(res.statusCode).to.equal(200)
callback(null, JSON.parse(body))
}
)
}
export function deleteProject(projectId, callback) {
request.delete(`http://127.0.0.1:3054/project/${projectId}`, (err, res) => {
if (err) {
return callback(err)
}
expect(res.statusCode).to.equal(204)
callback()
})
}
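Several helpers above (flushProject, getSnapshot) accept an optional options argument via the typeof options === 'function' shim, so call sites may use either arity; projectId and done below are placeholders:

// both forms are supported by the shim
ProjectHistoryClient.flushProject(projectId, done)
ProjectHistoryClient.flushProject(projectId, { allowErrors: true }, (err, res) => {
  // with allowErrors set, the 204 assertion is skipped and res can be inspected
})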


@@ -0,0 +1,13 @@
import chai from 'chai'
import sinonChai from 'sinon-chai'
import chaiAsPromised from 'chai-as-promised'
import mongodb from 'mongodb-legacy'
const { ObjectId } = mongodb
// ensure every ObjectId has the id string as a property for correct comparisons
ObjectId.cacheHexString = true
// Chai configuration
chai.should()
chai.use(sinonChai)
chai.use(chaiAsPromised)


@@ -0,0 +1,160 @@
import sinon from 'sinon'
import { strict as esmock } from 'esmock'
const MODULE_PATH = '../../../../app/js/BlobManager.js'
describe('BlobManager', function () {
beforeEach(async function () {
this.callback = sinon.stub()
this.extendLock = sinon.stub().yields()
this.project_id = 'project-1'
this.historyId = 12345
this.HistoryStoreManager = {
createBlobForUpdate: sinon.stub(),
}
this.UpdateTranslator = {
isAddUpdate: sinon.stub().returns(false),
}
this.BlobManager = await esmock(MODULE_PATH, {
'../../../../app/js/HistoryStoreManager.js': this.HistoryStoreManager,
'../../../../app/js/UpdateTranslator.js': this.UpdateTranslator,
})
this.updates = ['update-1', 'update-2']
})
describe('createBlobsForUpdates', function () {
describe('when there are no blobs to create', function () {
beforeEach(function (done) {
this.BlobManager.createBlobsForUpdates(
this.project_id,
this.historyId,
this.updates,
this.extendLock,
(error, updatesWithBlobs) => {
this.callback(error, updatesWithBlobs)
done()
}
)
})
it('should not create any blobs', function () {
this.HistoryStoreManager.createBlobForUpdate.called.should.equal(false)
})
it('should call the callback with the updates', function () {
const updatesWithBlobs = this.updates.map(update => ({
update,
}))
this.callback.calledWith(null, updatesWithBlobs).should.equal(true)
})
})
describe('when there are blobs to create', function () {
beforeEach(function (done) {
this.UpdateTranslator.isAddUpdate.returns(true)
this.blobHash = 'test hash'
this.HistoryStoreManager.createBlobForUpdate.yields(null, {
file: this.blobHash,
})
this.BlobManager.createBlobsForUpdates(
this.project_id,
this.historyId,
this.updates,
this.extendLock,
(error, updatesWithBlobs) => {
this.callback(error, updatesWithBlobs)
done()
}
)
})
it('should create blobs', function () {
this.HistoryStoreManager.createBlobForUpdate
.calledWith(this.project_id, this.historyId, this.updates[0])
.should.equal(true)
})
it('should extend the lock', function () {
this.extendLock.called.should.equal(true)
})
it('should call the callback with the updates', function () {
const updatesWithBlobs = this.updates.map(update => ({
update,
blobHashes: { file: this.blobHash },
}))
this.callback.calledWith(null, updatesWithBlobs).should.equal(true)
})
})
describe('when there are blobs to create and there is a single network error', function () {
beforeEach(function (done) {
this.UpdateTranslator.isAddUpdate.returns(true)
this.blobHash = 'test hash'
this.HistoryStoreManager.createBlobForUpdate
.onFirstCall()
.yields(new Error('random failure'))
this.HistoryStoreManager.createBlobForUpdate.yields(null, {
file: this.blobHash,
})
this.BlobManager.createBlobsForUpdates(
this.project_id,
this.historyId,
this.updates,
this.extendLock,
(error, updatesWithBlobs) => {
this.callback(error, updatesWithBlobs)
done()
}
)
})
it('should create blobs', function () {
this.HistoryStoreManager.createBlobForUpdate
.calledWith(this.project_id, this.historyId, this.updates[0])
.should.equal(true)
})
it('should extend the lock', function () {
this.extendLock.called.should.equal(true)
})
it('should call the callback with the updates', function () {
const updatesWithBlobs = this.updates.map(update => ({
update,
blobHashes: { file: this.blobHash },
}))
this.callback.calledWith(null, updatesWithBlobs).should.equal(true)
})
})
describe('when there are blobs to create and there are multiple network errors', function () {
beforeEach(function (done) {
this.UpdateTranslator.isAddUpdate.returns(true)
this.blobHash = 'test hash'
this.error = new Error('random failure')
this.HistoryStoreManager.createBlobForUpdate.yields(this.error)
this.BlobManager.createBlobsForUpdates(
this.project_id,
this.historyId,
this.updates,
this.extendLock,
(error, updatesWithBlobs) => {
this.callback(error, updatesWithBlobs)
done()
}
)
})
it('should try to create blobs', function () {
this.HistoryStoreManager.createBlobForUpdate
.calledWith(this.project_id, this.historyId, this.updates[0])
.should.equal(true)
})
it('should call the callback with an error', function () {
this.callback.calledWith(this.error).should.equal(true)
})
})
})
})
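// ---------------------------------------------------------------------------
// A minimal sketch of the retry behaviour exercised above, assuming a
// callback-style createBlob(update, cb) and a budget of one extra attempt.
// Names and the retry limit are illustrative, not the shipped implementation.
function createBlobWithRetrySketch(createBlob, update, callback, attempts = 2) {
  createBlob(update, (error, blob) => {
    if (error && attempts > 1) {
      // a single transient network failure is retried...
      return createBlobWithRetrySketch(createBlob, update, callback, attempts - 1)
    }
    // ...but repeated failures surface the error to the caller
    callback(error, blob)
  })
}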

File diff suppressed because it is too large

View File

@@ -0,0 +1,395 @@
import sinon from 'sinon'
import { expect } from 'chai'
import { strict as esmock } from 'esmock'
const MODULE_PATH = '../../../../app/js/DiffGenerator.js'
describe('DiffGenerator', function () {
beforeEach(async function () {
this.DiffGenerator = await esmock(MODULE_PATH, {})
this.ts = Date.now()
this.user_id = 'mock-user-id'
this.user_id_2 = 'mock-user-id-2'
this.meta = {
start_ts: this.ts,
end_ts: this.ts,
user_id: this.user_id,
}
})
describe('buildDiff', function () {
beforeEach(function () {
this.diff = [{ u: 'mock-diff' }]
this.content = 'Hello world'
this.updates = [
{ i: 'mock-update-1' },
{ i: 'mock-update-2' },
{ i: 'mock-update-3' },
]
this.DiffGenerator._mocks.applyUpdateToDiff = sinon
.stub()
.returns(this.diff)
this.DiffGenerator._mocks.compressDiff = sinon.stub().returns(this.diff)
this.result = this.DiffGenerator.buildDiff(this.content, this.updates)
})
it('should return the diff', function () {
this.result.should.deep.equal(this.diff)
})
it('should build the content into an initial diff', function () {
this.DiffGenerator._mocks.applyUpdateToDiff
.calledWith(
[
{
u: this.content,
},
],
this.updates[0]
)
.should.equal(true)
})
it('should apply each update', function () {
this.updates.map(update =>
this.DiffGenerator._mocks.applyUpdateToDiff
.calledWith(sinon.match.any, update)
.should.equal(true)
)
})
it('should compress the diff', function () {
this.DiffGenerator._mocks.compressDiff
.calledWith(this.diff)
.should.equal(true)
})
})
describe('compressDiff', function () {
describe('with adjacent inserts with the same user id', function () {
it('should create one update with combined meta data and min/max timestamps', function () {
const diff = this.DiffGenerator.compressDiff([
{
i: 'foo',
meta: { start_ts: 10, end_ts: 20, users: [this.user_id] },
},
{
i: 'bar',
meta: { start_ts: 5, end_ts: 15, users: [this.user_id] },
},
])
expect(diff).to.deep.equal([
{
i: 'foobar',
meta: { start_ts: 5, end_ts: 20, users: [this.user_id] },
},
])
})
})
describe('with adjacent inserts with different user ids', function () {
it('should leave the inserts unchanged', function () {
const input = [
{
i: 'foo',
meta: { start_ts: 10, end_ts: 20, users: [this.user_id] },
},
{
i: 'bar',
meta: { start_ts: 5, end_ts: 15, users: [this.user_id_2] },
},
]
const output = this.DiffGenerator.compressDiff(input)
expect(output).to.deep.equal(input)
})
})
describe('with adjacent deletes with the same user id', function () {
it('should create one update with combined meta data and min/max timestamps', function () {
const diff = this.DiffGenerator.compressDiff([
{
d: 'foo',
meta: { start_ts: 10, end_ts: 20, users: [this.user_id] },
},
{
d: 'bar',
meta: { start_ts: 5, end_ts: 15, users: [this.user_id] },
},
])
expect(diff).to.deep.equal([
{
d: 'foobar',
meta: { start_ts: 5, end_ts: 20, users: [this.user_id] },
},
])
})
})
describe('with adjacent deletes with different user ids', function () {
it('should leave the deletes unchanged', function () {
const input = [
{
d: 'foo',
meta: { start_ts: 10, end_ts: 20, users: [this.user_id] },
},
{
d: 'bar',
meta: { start_ts: 5, end_ts: 15, users: [this.user_id_2] },
},
]
const output = this.DiffGenerator.compressDiff(input)
expect(output).to.deep.equal(input)
})
})
describe('with history resync updates', function () {
it('should keep only inserts and mark them as unchanged text', function () {
const input = [
{ u: 'untracked text' },
{
i: 'inserted anonymously',
meta: { origin: { kind: 'history-resync' } },
},
{
d: 'deleted anonymously',
meta: { origin: { kind: 'history-resync' } },
},
]
const output = this.DiffGenerator.compressDiff(input)
expect(output).to.deep.equal([
{ u: 'untracked text' },
{ u: 'inserted anonymously' },
])
})
})
})
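// A rough sketch of the compression rules asserted above: adjacent inserts
// (or deletes) from the same users merge, concatenating text and taking the
// min start_ts / max end_ts, while history-resync parts are reduced to plain
// unchanged text (inserts) or dropped (deletes). Illustrative only; it
// mutates merged parts for brevity.
function compressDiffSketch(diff) {
  const output = []
  for (let part of diff) {
    if (part.meta?.origin?.kind === 'history-resync') {
      if (part.i == null) continue // resync deletes are dropped
      part = { u: part.i } // resync inserts become unchanged text
    }
    const key = part.i != null ? 'i' : part.d != null ? 'd' : null
    const last = output[output.length - 1]
    const sameUsers =
      last != null && last.meta != null && part.meta != null &&
      String(last.meta.users) === String(part.meta.users)
    if (key != null && last != null && last[key] != null && sameUsers) {
      last[key] += part[key]
      last.meta.start_ts = Math.min(last.meta.start_ts, part.meta.start_ts)
      last.meta.end_ts = Math.max(last.meta.end_ts, part.meta.end_ts)
    } else {
      output.push(part)
    }
  }
  return output
}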
describe('applyUpdateToDiff', function () {
describe('an insert', function () {
it('should insert into the middle of (u)nchanged text', function () {
const diff = this.DiffGenerator.applyUpdateToDiff([{ u: 'foobar' }], {
op: [{ p: 3, i: 'baz' }],
meta: this.meta,
})
expect(diff).to.deep.equal([
{ u: 'foo' },
{ i: 'baz', meta: this.meta },
{ u: 'bar' },
])
})
it('should insert into the start of (u)nchanged text', function () {
const diff = this.DiffGenerator.applyUpdateToDiff([{ u: 'foobar' }], {
op: [{ p: 0, i: 'baz' }],
meta: this.meta,
})
expect(diff).to.deep.equal([
{ i: 'baz', meta: this.meta },
{ u: 'foobar' },
])
})
it('should insert into the end of (u)nchanged text', function () {
const diff = this.DiffGenerator.applyUpdateToDiff([{ u: 'foobar' }], {
op: [{ p: 6, i: 'baz' }],
meta: this.meta,
})
expect(diff).to.deep.equal([
{ u: 'foobar' },
{ i: 'baz', meta: this.meta },
])
})
it('should insert into the middle of (i)nserted text', function () {
const diff = this.DiffGenerator.applyUpdateToDiff(
[{ i: 'foobar', meta: this.meta }],
{ op: [{ p: 3, i: 'baz' }], meta: this.meta }
)
expect(diff).to.deep.equal([
{ i: 'foo', meta: this.meta },
{ i: 'baz', meta: this.meta },
{ i: 'bar', meta: this.meta },
])
})
it('should not count deletes in the running length total', function () {
const diff = this.DiffGenerator.applyUpdateToDiff(
[{ d: 'deleted', meta: this.meta }, { u: 'foobar' }],
{ op: [{ p: 3, i: 'baz' }], meta: this.meta }
)
expect(diff).to.deep.equal([
{ d: 'deleted', meta: this.meta },
{ u: 'foo' },
{ i: 'baz', meta: this.meta },
{ u: 'bar' },
])
})
})
describe('a delete', function () {
describe('deleting unchanged text', function () {
it('should delete from the middle of (u)nchanged text', function () {
const diff = this.DiffGenerator.applyUpdateToDiff(
[{ u: 'foobazbar' }],
{ op: [{ p: 3, d: 'baz' }], meta: this.meta }
)
expect(diff).to.deep.equal([
{ u: 'foo' },
{ d: 'baz', meta: this.meta },
{ u: 'bar' },
])
})
it('should delete from the start of (u)nchanged text', function () {
const diff = this.DiffGenerator.applyUpdateToDiff(
[{ u: 'foobazbar' }],
{ op: [{ p: 0, d: 'foo' }], meta: this.meta }
)
expect(diff).to.deep.equal([
{ d: 'foo', meta: this.meta },
{ u: 'bazbar' },
])
})
it('should delete from the end of (u)nchanged text', function () {
const diff = this.DiffGenerator.applyUpdateToDiff(
[{ u: 'foobazbar' }],
{ op: [{ p: 6, d: 'bar' }], meta: this.meta }
)
expect(diff).to.deep.equal([
{ u: 'foobaz' },
{ d: 'bar', meta: this.meta },
])
})
it('should delete across multiple (u)nchanged text parts', function () {
const diff = this.DiffGenerator.applyUpdateToDiff(
[{ u: 'foo' }, { u: 'baz' }, { u: 'bar' }],
{ op: [{ p: 2, d: 'obazb' }], meta: this.meta }
)
expect(diff).to.deep.equal([
{ u: 'fo' },
{ d: 'o', meta: this.meta },
{ d: 'baz', meta: this.meta },
{ d: 'b', meta: this.meta },
{ u: 'ar' },
])
})
})
describe('deleting inserts', function () {
it('should delete from the middle of (i)nserted text', function () {
const diff = this.DiffGenerator.applyUpdateToDiff(
[{ i: 'foobazbar', meta: this.meta }],
{ op: [{ p: 3, d: 'baz' }], meta: this.meta }
)
expect(diff).to.deep.equal([
{ i: 'foo', meta: this.meta },
{ i: 'bar', meta: this.meta },
])
})
it('should delete from the start of (i)nserted text', function () {
const diff = this.DiffGenerator.applyUpdateToDiff(
[{ i: 'foobazbar', meta: this.meta }],
{ op: [{ p: 0, d: 'foo' }], meta: this.meta }
)
expect(diff).to.deep.equal([{ i: 'bazbar', meta: this.meta }])
})
it('should delete from the end of (i)nserted text', function () {
const diff = this.DiffGenerator.applyUpdateToDiff(
[{ i: 'foobazbar', meta: this.meta }],
{ op: [{ p: 6, d: 'bar' }], meta: this.meta }
)
expect(diff).to.deep.equal([{ i: 'foobaz', meta: this.meta }])
})
it('should delete across multiple (u)nchanged and (i)nserted text parts', function () {
const diff = this.DiffGenerator.applyUpdateToDiff(
[{ u: 'foo' }, { i: 'baz', meta: this.meta }, { u: 'bar' }],
{ op: [{ p: 2, d: 'obazb' }], meta: this.meta }
)
expect(diff).to.deep.equal([
{ u: 'fo' },
{ d: 'o', meta: this.meta },
{ d: 'b', meta: this.meta },
{ u: 'ar' },
])
})
})
describe('deleting over existing deletes', function () {
it('should delete across multiple (u)nchanged and (d)eleted text parts', function () {
const diff = this.DiffGenerator.applyUpdateToDiff(
[{ u: 'foo' }, { d: 'baz', meta: this.meta }, { u: 'bar' }],
{ op: [{ p: 2, d: 'ob' }], meta: this.meta }
)
expect(diff).to.deep.equal([
{ u: 'fo' },
{ d: 'o', meta: this.meta },
{ d: 'baz', meta: this.meta },
{ d: 'b', meta: this.meta },
{ u: 'ar' },
])
})
})
describe("deleting when the text doesn't match", function () {
it('should throw an error when deleting from the middle of (u)nchanged text', function () {
expect(() =>
this.DiffGenerator.applyUpdateToDiff([{ u: 'foobazbar' }], {
op: [{ p: 3, d: 'xxx' }],
meta: this.meta,
})
).to.throw(this.DiffGenerator.ConsistencyError)
})
it('should throw an error when deleting from the start of (u)nchanged text', function () {
expect(() =>
this.DiffGenerator.applyUpdateToDiff([{ u: 'foobazbar' }], {
op: [{ p: 0, d: 'xxx' }],
meta: this.meta,
})
).to.throw(this.DiffGenerator.ConsistencyError)
})
it('should throw an error when deleting from the end of (u)nchanged text', function () {
expect(() =>
this.DiffGenerator.applyUpdateToDiff([{ u: 'foobazbar' }], {
op: [{ p: 6, d: 'xxx' }],
meta: this.meta,
})
).to.throw(this.DiffGenerator.ConsistencyError)
})
})
describe('when the last update in the existing diff is a delete', function () {
it('should insert the new update before the delete', function () {
const diff = this.DiffGenerator.applyUpdateToDiff(
[{ u: 'foo' }, { d: 'bar', meta: this.meta }],
{ op: [{ p: 3, i: 'baz' }], meta: this.meta }
)
expect(diff).to.deep.equal([
{ u: 'foo' },
{ i: 'baz', meta: this.meta },
{ d: 'bar', meta: this.meta },
])
})
})
describe('when the only update in the existing diff is a delete', function () {
it('should insert the new update after the delete', function () {
const diff = this.DiffGenerator.applyUpdateToDiff(
[{ d: 'bar', meta: this.meta }],
{ op: [{ p: 0, i: 'baz' }], meta: this.meta }
)
expect(diff).to.deep.equal([
{ d: 'bar', meta: this.meta },
{ i: 'baz', meta: this.meta },
])
})
})
})
})
})
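// Sketch of the insert-position bookkeeping asserted above: (d)eleted parts
// consume no length in the running position, and the (u)nchanged or
// (i)nserted part containing position p is split around the new insert. The
// delete path follows the same walk and is omitted. Hypothetical helper, not
// the shipped applyUpdateToDiff.
function applyInsertSketch(diff, op, meta) {
  const result = []
  let position = 0
  let inserted = false
  for (const part of diff) {
    const content = part.u ?? part.i
    if (part.d != null || content == null) {
      result.push(part) // deletes don't advance the running position
      continue
    }
    if (!inserted && op.p <= position + content.length) {
      const offset = op.p - position
      const key = part.u != null ? 'u' : 'i'
      if (offset > 0) result.push({ ...part, [key]: content.slice(0, offset) })
      result.push({ i: op.i, meta })
      if (offset < content.length) {
        result.push({ ...part, [key]: content.slice(offset) })
      }
      inserted = true
    } else {
      result.push(part)
    }
    position += content.length
  }
  // an insert past all visible text lands after any trailing deletes
  if (!inserted) result.push({ i: op.i, meta })
  return result
}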

View File

@@ -0,0 +1,523 @@
import sinon from 'sinon'
import { expect } from 'chai'
import { strict as esmock } from 'esmock'
const MODULE_PATH = '../../../../app/js/DiffManager.js'
describe('DiffManager', function () {
beforeEach(async function () {
this.DocumentUpdaterManager = {}
this.DiffGenerator = {
buildDiff: sinon.stub(),
}
this.UpdatesProcessor = {
processUpdatesForProject: sinon.stub(),
}
this.HistoryStoreManager = {
getChunkAtVersion: sinon.stub(),
}
this.WebApiManager = {
getHistoryId: sinon.stub(),
}
this.ChunkTranslator = {
convertToDiffUpdates: sinon.stub(),
}
this.FileTreeDiffGenerator = {}
this.DiffManager = await esmock(MODULE_PATH, {
'../../../../app/js/DocumentUpdaterManager.js':
this.DocumentUpdaterManager,
'../../../../app/js/DiffGenerator.js': this.DiffGenerator,
'../../../../app/js/UpdatesProcessor.js': this.UpdatesProcessor,
'../../../../app/js/HistoryStoreManager.js': this.HistoryStoreManager,
'../../../../app/js/WebApiManager.js': this.WebApiManager,
'../../../../app/js/ChunkTranslator.js': this.ChunkTranslator,
'../../../../app/js/FileTreeDiffGenerator.js': this.FileTreeDiffGenerator,
})
this.projectId = 'mock-project-id'
this.callback = sinon.stub()
})
describe('getDiff', function () {
beforeEach(function () {
this.pathname = 'main.tex'
this.fromVersion = 4
this.toVersion = 8
this.initialContent = 'foo bar baz'
this.updates = ['mock-updates']
this.diff = { mock: 'diff' }
this.UpdatesProcessor.processUpdatesForProject
.withArgs(this.projectId)
.yields()
this.DiffGenerator.buildDiff
.withArgs(this.initialContent, this.updates)
.returns(this.diff)
})
describe('with a text file', function () {
beforeEach(function () {
this.DiffManager._mocks._getProjectUpdatesBetweenVersions = sinon.stub()
this.DiffManager._mocks._getProjectUpdatesBetweenVersions
.withArgs(
this.projectId,
this.pathname,
this.fromVersion,
this.toVersion
)
.yields(null, {
initialContent: this.initialContent,
updates: this.updates,
})
this.DiffManager.getDiff(
this.projectId,
this.pathname,
this.fromVersion,
this.toVersion,
this.callback
)
})
it('should make sure all pending updates have been processed', function () {
this.UpdatesProcessor.processUpdatesForProject
.calledWith(this.projectId)
.should.equal(true)
})
it('should get the updates from the history backend', function () {
this.DiffManager._mocks._getProjectUpdatesBetweenVersions
.calledWith(
this.projectId,
this.pathname,
this.fromVersion,
this.toVersion
)
.should.equal(true)
})
it('should convert the updates to a diff', function () {
this.DiffGenerator.buildDiff
.calledWith(this.initialContent, this.updates)
.should.equal(true)
})
it('should return the diff', function () {
this.callback.calledWith(null, this.diff).should.equal(true)
})
})
describe('with a binary file', function () {
beforeEach(function () {
this.DiffManager._mocks._getProjectUpdatesBetweenVersions = sinon.stub()
this.DiffManager._mocks._getProjectUpdatesBetweenVersions
.withArgs(
this.projectId,
this.pathname,
this.fromVersion,
this.toVersion
)
.yields(null, { binary: true })
this.DiffManager.getDiff(
this.projectId,
this.pathname,
this.fromVersion,
this.toVersion,
this.callback
)
})
it('should make sure all pending updates have been processed', function () {
this.UpdatesProcessor.processUpdatesForProject
.calledWith(this.projectId)
.should.equal(true)
})
it('should get the updates from the history backend', function () {
this.DiffManager._mocks._getProjectUpdatesBetweenVersions
.calledWith(
this.projectId,
this.pathname,
this.fromVersion,
this.toVersion
)
.should.equal(true)
})
it('should not try to convert any updates to a diff', function () {
this.DiffGenerator.buildDiff.called.should.equal(false)
})
it('should return the binary diff', function () {
this.callback.calledWith(null, { binary: true }).should.equal(true)
})
})
})
describe('_getProjectUpdatesBetweenVersions', function () {
beforeEach(function () {
this.pathname = 'main.tex'
this.fromVersion = 4
this.toVersion = 8
this.chunks = ['mock-chunk-1', 'mock-chunk-2']
this.concatted_chunk = 'mock-chunk'
this.DiffManager._mocks._concatChunks = sinon.stub()
this.DiffManager._mocks._concatChunks
.withArgs(this.chunks)
.returns(this.concatted_chunk)
this.updates = ['mock-updates']
this.initialContent = 'foo bar baz'
this.ChunkTranslator.convertToDiffUpdates
.withArgs(
this.projectId,
this.concatted_chunk,
this.pathname,
this.fromVersion,
this.toVersion
)
.yields(null, {
initialContent: this.initialContent,
updates: this.updates,
})
})
describe('for the normal case', function () {
beforeEach(function () {
this.DiffManager._mocks._getChunks = sinon.stub()
this.DiffManager._mocks._getChunks
.withArgs(this.projectId, this.fromVersion, this.toVersion)
.yields(null, this.chunks)
this.DiffManager._getProjectUpdatesBetweenVersions(
this.projectId,
this.pathname,
this.fromVersion,
this.toVersion,
this.callback
)
})
it('should get the relevant chunks', function () {
this.DiffManager._mocks._getChunks
.calledWith(this.projectId, this.fromVersion, this.toVersion)
.should.equal(true)
})
it('should concat the chunks', function () {
this.DiffManager._mocks._concatChunks
.calledWith(this.chunks)
.should.equal(true)
})
it('should convert the chunks to an initial version and updates', function () {
this.ChunkTranslator.convertToDiffUpdates
.calledWith(
this.projectId,
this.concatted_chunk,
this.pathname,
this.fromVersion,
this.toVersion
)
.should.equal(true)
})
it('should return the initialContent and updates', function () {
this.callback
.calledWith(null, {
initialContent: this.initialContent,
updates: this.updates,
})
.should.equal(true)
})
})
describe('for the error case', function () {
beforeEach(function () {
this.DiffManager._mocks._getChunks = sinon.stub()
this.DiffManager._mocks._getChunks
.withArgs(this.projectId, this.fromVersion, this.toVersion)
.yields(new Error('failed to load chunk'))
this.DiffManager._getProjectUpdatesBetweenVersions(
this.projectId,
this.pathname,
this.fromVersion,
this.toVersion,
this.callback
)
})
it('should call the callback with an error', function () {
this.callback
.calledWith(sinon.match.instanceOf(Error))
.should.equal(true)
})
})
})
describe('_getChunks', function () {
beforeEach(function () {
this.historyId = 'mock-overleaf-id'
this.WebApiManager.getHistoryId.yields(null, this.historyId)
})
describe('where only one chunk is needed', function () {
beforeEach(function (done) {
this.fromVersion = 4
this.toVersion = 8
this.chunk = {
chunk: {
startVersion: 2,
}, // before fromVersion
}
this.HistoryStoreManager.getChunkAtVersion
.withArgs(this.projectId, this.historyId, this.toVersion)
.yields(null, this.chunk)
this.DiffManager._getChunks(
this.projectId,
this.fromVersion,
this.toVersion,
(error, chunks) => {
this.error = error
this.chunks = chunks
done()
}
)
})
it("should the project's overleaf id", function () {
this.WebApiManager.getHistoryId
.calledWith(this.projectId)
.should.equal(true)
})
it('should request the first chunk', function () {
this.HistoryStoreManager.getChunkAtVersion
.calledWith(this.projectId, this.historyId, this.toVersion)
.should.equal(true)
})
it('should return an array of chunks', function () {
expect(this.chunks).to.deep.equal([this.chunk])
})
})
describe('where multiple chunks are needed', function () {
beforeEach(function (done) {
this.fromVersion = 4
this.toVersion = 8
this.chunk1 = {
chunk: {
startVersion: 6,
},
}
this.chunk2 = {
chunk: {
startVersion: 2,
},
}
this.HistoryStoreManager.getChunkAtVersion
.withArgs(this.projectId, this.historyId, this.toVersion)
.yields(null, this.chunk1)
this.HistoryStoreManager.getChunkAtVersion
.withArgs(
this.projectId,
this.historyId,
this.chunk1.chunk.startVersion
)
.yields(null, this.chunk2)
this.DiffManager._mocks._getChunks(
this.projectId,
this.fromVersion,
this.toVersion,
(error, chunks) => {
this.error = error
this.chunks = chunks
done()
}
)
})
it('should request the first chunk', function () {
this.HistoryStoreManager.getChunkAtVersion
.calledWith(this.projectId, this.historyId, this.toVersion)
.should.equal(true)
})
it('should request the second chunk, from where the first one started', function () {
this.HistoryStoreManager.getChunkAtVersion
.calledWith(
this.projectId,
this.historyId,
this.chunk1.chunk.startVersion
)
.should.equal(true)
})
it('should return an array of chunks', function () {
expect(this.chunks).to.deep.equal([this.chunk1, this.chunk2])
})
})
describe('where more than MAX_CHUNKS are requested', function () {
beforeEach(function (done) {
this.fromVersion = 0
this.toVersion = 8
this.chunk1 = {
chunk: {
startVersion: 6,
},
}
this.chunk2 = {
chunk: {
startVersion: 4,
},
}
this.chunk3 = {
chunk: {
startVersion: 2,
},
}
this.DiffManager.setMaxChunkRequests(2)
this.HistoryStoreManager.getChunkAtVersion
.withArgs(this.projectId, this.historyId, this.toVersion)
.yields(null, this.chunk1)
this.HistoryStoreManager.getChunkAtVersion
.withArgs(
this.projectId,
this.historyId,
this.chunk1.chunk.startVersion
)
.yields(null, this.chunk2)
this.DiffManager._mocks._getChunks(
this.projectId,
this.fromVersion,
this.toVersion,
(error, chunks) => {
this.error = error
this.chunks = chunks
done()
}
)
})
it('should request the first chunk', function () {
this.HistoryStoreManager.getChunkAtVersion
.calledWith(this.projectId, this.historyId, this.toVersion)
.should.equal(true)
})
it('should request the second chunk, from where the first one started', function () {
this.HistoryStoreManager.getChunkAtVersion
.calledWith(
this.projectId,
this.historyId,
this.chunk1.chunk.startVersion
)
.should.equal(true)
})
it('should not request the third chunk', function () {
this.HistoryStoreManager.getChunkAtVersion
.calledWith(
this.projectId,
this.historyId,
this.chunk2.chunk.startVersion
)
.should.equal(false)
})
it('should return an error', function () {
expect(this.error).to.exist
expect(this.error.message).to.equal('Diff spans too many chunks')
expect(this.error.name).to.equal('BadRequestError')
})
})
describe('where fromVersion == toVersion', function () {
beforeEach(function (done) {
this.fromVersion = 4
this.toVersion = 4
this.chunk = {
chunk: {
startVersion: 2,
}, // before fromVersion
}
this.HistoryStoreManager.getChunkAtVersion
.withArgs(this.projectId, this.historyId, this.toVersion)
.yields(null, this.chunk)
this.DiffManager._mocks._getChunks(
this.projectId,
this.fromVersion,
this.toVersion,
(error, chunks) => {
this.error = error
this.chunks = chunks
done()
}
)
})
it('should still request the first chunk (because we need the file contents)', function () {
this.HistoryStoreManager.getChunkAtVersion
.calledWith(this.projectId, this.historyId, this.toVersion)
.should.equal(true)
})
it('should return an array of chunks', function () {
expect(this.chunks).to.deep.equal([this.chunk])
})
})
})
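// Sketch of the chunk walk-back asserted above: fetch the chunk containing
// toVersion, then keep fetching the chunk ending at the previous chunk's
// startVersion until fromVersion is covered, bailing out after a maximum
// number of requests (the real module raises a BadRequestError and resolves
// the history id via WebApiManager first; names here are illustrative).
async function getChunksSketch(fetchChunkAt, fromVersion, toVersion, maxRequests = 5) {
  const chunks = []
  let version = toVersion
  // do-while: even when fromVersion === toVersion we need one chunk,
  // because it carries the file contents
  do {
    if (chunks.length >= maxRequests) {
      throw new Error('Diff spans too many chunks')
    }
    const chunk = await fetchChunkAt(version)
    chunks.push(chunk)
    version = chunk.chunk.startVersion
  } while (version > fromVersion)
  return chunks
}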
describe('_concatChunks', function () {
it('should concat the chunks in reverse order', function () {
const result = this.DiffManager._mocks._concatChunks([
{
chunk: {
history: {
snapshot: {
files: {
mock: 'files-updated-2',
},
},
changes: [7, 8, 9],
},
},
},
{
chunk: {
history: {
snapshot: {
files: {
mock: 'files-updated',
},
},
changes: [4, 5, 6],
},
},
},
{
chunk: {
history: {
snapshot: {
files: {
mock: 'files-original',
},
},
changes: [1, 2, 3],
},
},
},
])
expect(result).to.deep.equal({
chunk: {
history: {
snapshot: {
files: {
mock: 'files-original',
},
},
changes: [1, 2, 3, 4, 5, 6, 7, 8, 9],
},
},
})
})
})
})
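// Sketch of the reverse-order concat checked above: chunks arrive newest
// first, so the last-fetched (oldest) chunk supplies the snapshot and the
// change lists are stitched back together oldest to newest. Illustrative only.
function concatChunksSketch(chunks) {
  const ordered = chunks.slice().reverse() // oldest chunk first
  const [oldest, ...rest] = ordered
  const result = JSON.parse(JSON.stringify(oldest)) // avoid mutating inputs
  for (const { chunk } of rest) {
    result.chunk.history.changes.push(...chunk.history.changes)
  }
  return result
}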

View File

@@ -0,0 +1,184 @@
/* eslint-disable
no-return-assign,
no-undef,
no-unused-vars,
*/
// TODO: This file was created by bulk-decaffeinate.
// Fix any style issues and re-enable lint.
/*
* decaffeinate suggestions:
* DS102: Remove unnecessary code created because of implicit returns
* Full docs: https://github.com/decaffeinate/decaffeinate/blob/master/docs/suggestions.md
*/
import sinon from 'sinon'
import { expect } from 'chai'
import { strict as esmock } from 'esmock'
const MODULE_PATH = '../../../../app/js/DocumentUpdaterManager.js'
describe('DocumentUpdaterManager', function () {
beforeEach(async function () {
this.settings = {
apis: { documentupdater: { url: 'http://example.com' } },
}
this.request = {
get: sinon.stub(),
post: sinon.stub(),
}
this.DocumentUpdaterManager = await esmock(MODULE_PATH, {
request: this.request,
'@overleaf/settings': this.settings,
})
this.callback = sinon.stub()
this.lines = ['one', 'two', 'three']
return (this.version = 42)
})
describe('getDocument', function () {
describe('successfully', function () {
beforeEach(function () {
this.body = JSON.stringify({
lines: this.lines,
version: this.version,
ops: [],
})
this.request.get.yields(null, { statusCode: 200 }, this.body)
return this.DocumentUpdaterManager.getDocument(
this.project_id,
this.doc_id,
this.callback
)
})
it('should get the document from the document updater', function () {
const url = `${this.settings.apis.documentupdater.url}/project/${this.project_id}/doc/${this.doc_id}`
return this.request.get.calledWith(url).should.equal(true)
})
return it('should call the callback with the content and version', function () {
return this.callback
.calledWith(null, this.lines.join('\n'), this.version)
.should.equal(true)
})
})
describe('when the document updater API returns an error', function () {
beforeEach(function () {
this.error = new Error('something went wrong')
this.request.get.yields(this.error, null, null)
return this.DocumentUpdaterManager.getDocument(
this.project_id,
this.doc_id,
this.callback
)
})
return it('should return an error to the callback', function () {
return this.callback.calledWith(this.error).should.equal(true)
})
})
return describe('when the document updater returns a failure error code', function () {
beforeEach(function () {
this.request.get.yields(null, { statusCode: 500 }, '')
return this.DocumentUpdaterManager.getDocument(
this.project_id,
this.doc_id,
this.callback
)
})
return it('should return the callback with an error', function () {
return this.callback
.calledWith(
sinon.match.has(
'message',
'doc updater returned a non-success status code: 500'
)
)
.should.equal(true)
})
})
})
return describe('setDocument', function () {
beforeEach(function () {
this.content = 'mock content'
return (this.user_id = 'user-id-123')
})
describe('successfully', function () {
beforeEach(function () {
this.request.post.yields(null, { statusCode: 200 })
return this.DocumentUpdaterManager.setDocument(
this.project_id,
this.doc_id,
this.content,
this.user_id,
this.callback
)
})
it('should set the document in the document updater', function () {
const url = `${this.settings.apis.documentupdater.url}/project/${this.project_id}/doc/${this.doc_id}`
return this.request.post
.calledWith({
url,
json: {
lines: this.content.split('\n'),
source: 'restore',
user_id: this.user_id,
undoing: true,
},
})
.should.equal(true)
})
return it('should call the callback', function () {
return this.callback.calledWith(null).should.equal(true)
})
})
describe('when the document updater API returns an error', function () {
beforeEach(function () {
this.error = new Error('something went wrong')
this.request.post.yields(this.error, null, null)
return this.DocumentUpdaterManager.setDocument(
this.project_id,
this.doc_id,
this.content,
this.user_id,
this.callback
)
})
return it('should return an error to the callback', function () {
return this.callback.calledWith(this.error).should.equal(true)
})
})
return describe('when the document updater returns a failure error code', function () {
beforeEach(function () {
this.request.post.yields(null, { statusCode: 500 }, '')
return this.DocumentUpdaterManager.setDocument(
this.project_id,
this.doc_id,
this.content,
this.user_id,
this.callback
)
})
return it('should return the callback with an error', function () {
return this.callback
.calledWith(
sinon.match.has(
'message',
'doc updater returned a non-success status code: 500'
)
)
.should.equal(true)
})
})
})
})
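// Sketch of the getDocument call exercised above, assuming the callback-style
// `request` client stubbed in these tests; the URL shape mirrors the
// assertions and error handling is condensed. setDocument is the mirror
// image: a POST of { lines, source: 'restore', user_id, undoing: true }.
function getDocumentSketch(request, settings, projectId, docId, callback) {
  const url = `${settings.apis.documentupdater.url}/project/${projectId}/doc/${docId}`
  request.get(url, (error, res, body) => {
    if (error) return callback(error)
    if (res.statusCode < 200 || res.statusCode >= 300) {
      return callback(
        new Error(`doc updater returned a non-success status code: ${res.statusCode}`)
      )
    }
    const doc = JSON.parse(body) // { lines, version, ops }
    // the document content is handed back as one newline-joined string
    callback(null, doc.lines.join('\n'), doc.version)
  })
}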

View File

@@ -0,0 +1,96 @@
import sinon from 'sinon'
import { strict as esmock } from 'esmock'
import tk from 'timekeeper'
const MODULE_PATH = '../../../../app/js/ErrorRecorder.js'
describe('ErrorRecorder', function () {
beforeEach(async function () {
this.now = new Date()
tk.freeze(this.now)
this.db = {
projectHistoryFailures: {
deleteOne: sinon.stub().resolves(),
findOneAndUpdate: sinon
.stub()
.resolves({ value: { failure: 'record' } }),
},
}
this.mongodb = { db: this.db }
this.metrics = { gauge: sinon.stub() }
this.ErrorRecorder = await esmock(MODULE_PATH, {
'../../../../app/js/mongodb.js': this.mongodb,
'@overleaf/metrics': this.metrics,
})
this.project_id = 'project-id-123'
this.queueSize = 445
})
afterEach(function () {
tk.reset()
})
describe('record', function () {
beforeEach(async function () {
this.error = new Error('something bad')
await this.ErrorRecorder.promises.record(
this.project_id,
this.queueSize,
this.error
)
})
it('should record the error to mongo', function () {
this.db.projectHistoryFailures.findOneAndUpdate
.calledWithMatch(
{
project_id: this.project_id,
},
{
$set: {
queueSize: this.queueSize,
error: this.error.toString(),
stack: this.error.stack,
ts: this.now,
},
$inc: {
attempts: 1,
},
$push: {
history: {
$each: [
{
queueSize: this.queueSize,
error: this.error.toString(),
stack: this.error.stack,
ts: this.now,
},
],
$position: 0,
$slice: 10,
},
},
},
{
upsert: true,
}
)
.should.equal(true)
})
})
describe('clearError', function () {
beforeEach(async function () {
this.result = await this.ErrorRecorder.promises.clearError(
this.project_id
)
})
it('should remove any error from mongo', function () {
this.db.projectHistoryFailures.deleteOne
.calledWithMatch({ project_id: this.project_id })
.should.equal(true)
})
})
})
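// Sketch of the upsert asserted above: one failure record per project in
// projectHistoryFailures, with the latest error set in place, an attempt
// counter, and a capped history of the ten most recent failures. Field names
// follow the assertions; collection access is illustrative.
async function recordFailureSketch(db, projectId, queueSize, error) {
  const entry = {
    queueSize,
    error: error.toString(),
    stack: error.stack,
    ts: new Date(),
  }
  return db.projectHistoryFailures.findOneAndUpdate(
    { project_id: projectId },
    {
      $set: entry,
      $inc: { attempts: 1 },
      // prepend to the history array and keep only the newest 10 entries
      $push: { history: { $each: [entry], $position: 0, $slice: 10 } },
    },
    { upsert: true }
  )
}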

View File

@@ -0,0 +1,497 @@
import { expect } from 'chai'
import { createRangeBlobDataFromUpdate } from '../../../../app/js/HistoryBlobTranslator.js'
/**
* @import { AddDocUpdate } from "../../../../app/js/types"
*/
/**
*
* @param {string} pathname
* @param {string} docLines
* @param {AddDocUpdate["ranges"]} ranges
* @returns {AddDocUpdate}
*/
const update = (pathname, docLines, ranges) => {
return {
pathname,
docLines,
ranges,
version: 'version-1',
projectHistoryId: 'project-id',
doc: 'doc',
meta: {
user_id: 'user-id',
ts: 0,
},
}
}
describe('HistoryBlobTranslator', function () {
describe('createBlobDataFromUpdate', function () {
beforeEach(function () {
this.text = 'the quick brown fox jumps over the lazy dog'
})
describe('for update with no ranges', function () {
beforeEach(function () {
this.result = createRangeBlobDataFromUpdate(
update('pathname', this.text, undefined)
)
})
it('should not return ranges', function () {
expect(this.result).to.be.undefined
})
})
describe('for update with empty ranges object', function () {
beforeEach(function () {
this.result = createRangeBlobDataFromUpdate(
update('pathname', this.text, {})
)
})
it('should not return ranges', function () {
expect(this.result).to.be.undefined
})
})
describe('for update with ranges object with empty lists', function () {
beforeEach(function () {
this.result = createRangeBlobDataFromUpdate(
update('pathname', this.text, { changes: [], comments: [] })
)
})
it('should not return ranges', function () {
expect(this.result).to.be.undefined
})
})
describe('for update with zero length comments', function () {
beforeEach(function () {
this.result = createRangeBlobDataFromUpdate(
update('pathname', this.text, {
changes: [],
comments: [
{ op: { c: '', p: 4, t: 'comment-1', resolved: false } },
],
})
)
})
it('should treat them as detached comments', function () {
expect(this.result).to.deep.equal({
comments: [{ id: 'comment-1', ranges: [] }],
trackedChanges: [],
})
})
})
describe('for update with ranges object with only comments', function () {
it('should return unmoved ranges', function () {
const result = createRangeBlobDataFromUpdate(
update('pathname', this.text, {
comments: [
{
op: { c: 'quick', p: 4, t: 'comment-1', resolved: false },
},
],
})
)
expect(result).to.deep.equal({
comments: [
{
id: 'comment-1',
ranges: [{ pos: 4, length: 5 }],
},
],
trackedChanges: [],
})
})
it('should merge comments ranges into a single comment by id', function () {
const result = createRangeBlobDataFromUpdate(
update('pathname', this.text, {
comments: [
{
op: { c: 'quick', p: 4, t: 'comment-1', resolved: false },
},
{
op: { c: 'jumps', p: 20, t: 'comment-1', resolved: false },
},
],
})
)
expect(result).to.deep.equal({
comments: [
{
id: 'comment-1',
ranges: [
{ pos: 4, length: 5 },
{ pos: 20, length: 5 },
],
},
],
trackedChanges: [],
})
})
it('should not merge ranges into a single comment if id differs', function () {
const result = createRangeBlobDataFromUpdate(
update('pathname', this.text, {
comments: [
{
op: { c: 'quick', p: 4, t: 'comment-1', resolved: false },
},
{
op: { c: 'jumps', p: 20, t: 'comment-2', resolved: false },
},
],
})
)
expect(result).to.deep.equal({
comments: [
{
id: 'comment-1',
ranges: [{ pos: 4, length: 5 }],
},
{
id: 'comment-2',
ranges: [{ pos: 20, length: 5 }],
},
],
trackedChanges: [],
})
})
})
describe('for update with ranges object with only tracked insertions', function () {
it('should translate into history tracked insertions', function () {
const result = createRangeBlobDataFromUpdate(
update('pathname', this.text, {
changes: [
{
op: { p: 4, i: 'quick' },
metadata: {
ts: '2024-01-01T00:00:00.000Z',
user_id: 'user-1',
},
},
{
op: { p: 10, i: 'brown' },
metadata: {
ts: '2023-01-01T00:00:00.000Z',
user_id: 'user-2',
},
},
],
})
)
expect(result).to.deep.equal({
comments: [],
trackedChanges: [
{
range: { pos: 4, length: 5 },
tracking: {
type: 'insert',
userId: 'user-1',
ts: '2024-01-01T00:00:00.000Z',
},
},
{
range: { pos: 10, length: 5 },
tracking: {
type: 'insert',
userId: 'user-2',
ts: '2023-01-01T00:00:00.000Z',
},
},
],
})
})
})
describe('for update with ranges object with mixed tracked changes', function () {
describe('with tracked deletions before insertions', function () {
it('should insert tracked deletions before insertions', function () {
const text = 'the quickrapid brown fox jumps over the lazy dog'
const result = createRangeBlobDataFromUpdate(
update('pathname', text, {
changes: [
{
op: { p: 4, d: 'quick' },
metadata: {
ts: '2024-01-01T00:00:00.000Z',
user_id: 'user-1',
},
},
{
op: { p: 4, hpos: 9, i: 'rapid' },
metadata: {
ts: '2023-01-01T00:00:00.000Z',
user_id: 'user-2',
},
},
],
})
)
expect(result).to.deep.equal({
comments: [],
trackedChanges: [
{
range: { pos: 4, length: 5 },
tracking: {
type: 'delete',
userId: 'user-1',
ts: '2024-01-01T00:00:00.000Z',
},
},
{
range: { pos: 9, length: 5 },
tracking: {
type: 'insert',
userId: 'user-2',
ts: '2023-01-01T00:00:00.000Z',
},
},
],
})
})
})
describe('with tracked insertions before deletions', function () {
it('should insert tracked deletions before insertions', function () {
const text = 'the quickrapid brown fox jumps over the lazy dog'
const result = createRangeBlobDataFromUpdate(
update('pathname', text, {
changes: [
{
op: { p: 4, hpos: 9, i: 'rapid' },
metadata: {
ts: '2023-01-01T00:00:00.000Z',
user_id: 'user-2',
},
},
{
op: { p: 4, d: 'quick' },
metadata: {
ts: '2024-01-01T00:00:00.000Z',
user_id: 'user-1',
},
},
],
})
)
expect(result).to.deep.equal({
comments: [],
trackedChanges: [
{
range: { pos: 4, length: 5 },
tracking: {
type: 'delete',
userId: 'user-1',
ts: '2024-01-01T00:00:00.000Z',
},
},
{
range: { pos: 9, length: 5 },
tracking: {
type: 'insert',
userId: 'user-2',
ts: '2023-01-01T00:00:00.000Z',
},
},
],
})
})
})
it('should adjust positions', function () {
const text = 'the quick brown fox jumps over the lazy dog'
const result = createRangeBlobDataFromUpdate(
update('pathname', text, {
changes: [
{
op: { p: 4, i: 'quick' },
metadata: {
ts: '2024-01-01T00:00:00.000Z',
user_id: 'user-1',
},
},
{
op: { p: 10, d: 'brown' },
metadata: {
ts: '2023-01-01T00:00:00.000Z',
user_id: 'user-2',
},
},
{
op: { p: 30, hpos: 35, i: 'lazy' },
metadata: {
ts: '2022-01-01T00:00:00.000Z',
user_id: 'user-2',
},
},
],
})
)
expect(result).to.deep.equal({
comments: [],
trackedChanges: [
{
range: { pos: 4, length: 5 },
tracking: {
type: 'insert',
userId: 'user-1',
ts: '2024-01-01T00:00:00.000Z',
},
},
{
range: { pos: 10, length: 5 },
tracking: {
type: 'delete',
userId: 'user-2',
ts: '2023-01-01T00:00:00.000Z',
},
},
{
range: { pos: 35, length: 4 },
tracking: {
type: 'insert',
userId: 'user-2',
ts: '2022-01-01T00:00:00.000Z',
},
},
],
})
})
})
describe('for update with ranges object with mixed tracked changes and comments', function () {
it('should adjust positions', function () {
const text = 'the quick brown fox jumps over the lazy dog'
const result = createRangeBlobDataFromUpdate(
update('pathname', text, {
comments: [
{
op: { c: 'quick', p: 4, t: 'comment-1', resolved: false },
},
{
op: {
c: 'fox',
p: 11,
hpos: 16,
t: 'comment-2',
resolved: false,
},
},
],
changes: [
{
op: { p: 4, i: 'quick' },
metadata: {
ts: '2024-01-01T00:00:00.000Z',
user_id: 'user-1',
},
},
{
op: { p: 10, d: 'brown' },
metadata: {
ts: '2023-01-01T00:00:00.000Z',
user_id: 'user-2',
},
},
{
op: { p: 30, hpos: 35, i: 'lazy' },
metadata: {
ts: '2022-01-01T00:00:00.000Z',
user_id: 'user-2',
},
},
],
})
)
expect(result).to.deep.equal({
comments: [
{
ranges: [{ pos: 4, length: 5 }],
id: 'comment-1',
},
{
ranges: [{ pos: 16, length: 3 }],
id: 'comment-2',
},
],
trackedChanges: [
{
range: { pos: 4, length: 5 },
tracking: {
type: 'insert',
userId: 'user-1',
ts: '2024-01-01T00:00:00.000Z',
},
},
{
range: { pos: 10, length: 5 },
tracking: {
type: 'delete',
userId: 'user-2',
ts: '2023-01-01T00:00:00.000Z',
},
},
{
range: { pos: 35, length: 4 },
tracking: {
type: 'insert',
userId: 'user-2',
ts: '2022-01-01T00:00:00.000Z',
},
},
],
})
})
it('should adjust comment length', function () {
const text = 'the quick brown fox jumps over the lazy dog'
const result = createRangeBlobDataFromUpdate(
update('pathname', text, {
comments: [
{
op: { c: 'quick fox', p: 4, t: 'comment-1', resolved: false },
},
],
changes: [
{
op: { p: 10, d: 'brown ' },
metadata: {
ts: '2023-01-01T00:00:00.000Z',
user_id: 'user-2',
},
},
],
})
)
expect(result).to.deep.equal({
comments: [
{
ranges: [{ pos: 4, length: 9 }],
id: 'comment-1',
},
],
trackedChanges: [
{
range: { pos: 10, length: 6 },
tracking: {
type: 'delete',
userId: 'user-2',
ts: '2023-01-01T00:00:00.000Z',
},
},
],
})
})
})
})
})
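// Sketch of the coordinate handling verified above: ops carry editor-space
// positions in `p`, plus an optional history-space `hpos` that accounts for
// tracked-deleted text still present in the history document, and `hpos`
// wins when present. The tests also pin down that tracked deletions are
// emitted before insertions at the same position. A hypothetical helper:
function historyRangeSketch(op) {
  const text = op.c ?? op.i ?? op.d // comment, insert, or delete payload
  return { pos: op.hpos ?? op.p, length: text.length }
}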

View File

@@ -0,0 +1,727 @@
import sinon from 'sinon'
import { expect } from 'chai'
import { strict as esmock } from 'esmock'
import EventEmitter from 'node:events'
import { RequestFailedError } from '@overleaf/fetch-utils'
import * as Errors from '../../../../app/js/Errors.js'
const MODULE_PATH = '../../../../app/js/HistoryStoreManager.js'
describe('HistoryStoreManager', function () {
beforeEach(async function () {
this.projectId = '123456789012345678901234'
this.historyId = 'mock-ol-project-id'
this.settings = {
overleaf: {
history: {
host: 'http://example.com',
user: 'overleaf',
pass: 'password',
requestTimeout: 123,
},
},
apis: {
filestore: {
enabled: true,
url: 'http://filestore.overleaf.production',
},
},
}
this.latestChunkRequestArgs = sinon.match({
method: 'GET',
url: `${this.settings.overleaf.history.host}/projects/${this.historyId}/latest/history`,
json: true,
auth: {
user: this.settings.overleaf.history.user,
pass: this.settings.overleaf.history.pass,
sendImmediately: true,
},
})
this.callback = sinon.stub()
this.LocalFileWriter = {
bufferOnDisk: sinon.stub(),
}
this.WebApiManager = {
getHistoryId: sinon.stub(),
}
this.WebApiManager.getHistoryId
.withArgs(this.projectId)
.yields(null, this.historyId)
this.FetchUtils = {
fetchStream: sinon.stub(),
fetchNothing: sinon.stub().resolves(),
RequestFailedError,
}
this.request = sinon.stub()
this.logger = {
debug: sinon.stub(),
warn: sinon.stub(),
}
this.HistoryStoreManager = await esmock(MODULE_PATH, {
'@overleaf/fetch-utils': this.FetchUtils,
request: this.request,
'@overleaf/settings': this.settings,
'../../../../app/js/LocalFileWriter.js': this.LocalFileWriter,
'../../../../app/js/WebApiManager.js': this.WebApiManager,
'../../../../app/js/Errors.js': Errors,
'@overleaf/logger': this.logger,
})
})
describe('getMostRecentChunk', function () {
describe('successfully', function () {
beforeEach(function () {
this.chunk = {
chunk: {
startVersion: 0,
history: {
snapshot: {
files: {},
},
changes: [],
},
},
}
this.request
.withArgs(this.latestChunkRequestArgs)
.yields(null, { statusCode: 200 }, this.chunk)
this.HistoryStoreManager.getMostRecentChunk(
this.projectId,
this.historyId,
this.callback
)
})
it('should call the callback with the chunk', function () {
expect(this.callback).to.have.been.calledWith(null, this.chunk)
})
})
})
describe('getMostRecentVersion', function () {
describe('successfully', function () {
beforeEach(function () {
this.chunk = {
chunk: {
startVersion: 5,
history: {
snapshot: {
files: {},
},
changes: [
{ v2Authors: ['5678'], timestamp: '2017-10-17T10:44:40.227Z' },
{ v2Authors: ['1234'], timestamp: '2017-10-16T10:44:40.227Z' },
],
},
},
}
this.request
.withArgs(this.latestChunkRequestArgs)
.yields(null, { statusCode: 200 }, this.chunk)
this.HistoryStoreManager.getMostRecentVersion(
this.projectId,
this.historyId,
this.callback
)
})
it('should call the callback with the latest version information', function () {
expect(this.callback).to.have.been.calledWith(
null,
7,
{ project: undefined, docs: {} },
{ v2Authors: ['5678'], timestamp: '2017-10-17T10:44:40.227Z' }
)
})
})
describe('out of order doc ops', function () {
beforeEach(function () {
this.chunk = {
chunk: {
startVersion: 5,
history: {
snapshot: {
v2DocVersions: {
mock_doc_id: {
pathname: '/main.tex',
v: 2,
},
},
},
changes: [
{
operations: [],
v2DocVersions: {
mock_doc_id: {
pathname: '/main.tex',
v: 1,
},
},
},
],
},
},
}
this.request
.withArgs(this.latestChunkRequestArgs)
.yields(null, { statusCode: 200 }, this.chunk)
this.HistoryStoreManager.getMostRecentVersion(
this.projectId,
this.historyId,
this.callback
)
})
it('should return an error', function () {
expect(this.callback).to.have.been.calledWith(
sinon.match
.instanceOf(Errors.OpsOutOfOrderError)
.and(sinon.match.has('message', 'doc version out of order'))
)
})
it('should call the callback with the latest version information', function () {
expect(this.callback).to.have.been.calledWith(
sinon.match.instanceOf(Errors.OpsOutOfOrderError),
6,
{
project: undefined,
docs: { mock_doc_id: { pathname: '/main.tex', v: 2 } },
},
this.chunk.chunk.history.changes[0]
)
})
})
describe('out of order project structure versions', function () {
beforeEach(function () {
this.chunk = {
chunk: {
startVersion: 5,
history: {
snapshot: {
projectVersion: 2,
},
changes: [
{
operations: [{ pathname: 'main.tex', newPathname: '' }],
projectVersion: 1,
},
],
},
},
}
this.request
.withArgs(this.latestChunkRequestArgs)
.yields(null, { statusCode: 200 }, this.chunk)
this.HistoryStoreManager.getMostRecentVersion(
this.projectId,
this.historyId,
this.callback
)
})
it('should return an error', function () {
expect(this.callback).to.have.been.calledWith(
sinon.match
.instanceOf(Errors.OpsOutOfOrderError)
.and(
sinon.match.has(
'message',
'project structure version out of order'
)
)
)
})
it('should call the callback with the latest version information', function () {
expect(this.callback).to.have.been.calledWith(
sinon.match.instanceOf(Errors.OpsOutOfOrderError),
6,
{ project: 2, docs: {} },
this.chunk.chunk.history.changes[0]
)
})
})
describe('out of order project structure and doc versions', function () {
beforeEach(function () {
this.chunk = {
chunk: {
startVersion: 5,
history: {
snapshot: {
projectVersion: 1,
},
changes: [
{
operations: [{ pathname: 'main.tex', newPathname: '' }],
projectVersion: 1,
},
{
operations: [{ pathname: 'main.tex', newPathname: '' }],
projectVersion: 2,
},
{
operations: [{ pathname: 'main.tex', newPathname: '' }],
projectVersion: 3,
},
{
operations: [{ pathname: 'main.tex', newPathname: '' }],
projectVersion: 1,
},
{
operations: [],
v2DocVersions: {
mock_doc_id: {
pathname: '/main.tex',
v: 1,
},
},
},
{
operations: [],
v2DocVersions: {
mock_doc_id: {
pathname: '/main.tex',
v: 2,
},
},
},
{
operations: [],
v2DocVersions: {
mock_doc_id: {
pathname: '/main.tex',
v: 1,
},
},
},
],
},
},
}
this.request
.withArgs(this.latestChunkRequestArgs)
.yields(null, { statusCode: 200 }, this.chunk)
this.HistoryStoreManager.getMostRecentVersion(
this.projectId,
this.historyId,
this.callback
)
})
it('should return an error', function () {
expect(this.callback).to.have.been.calledWith(
sinon.match
.instanceOf(Errors.OpsOutOfOrderError)
.and(
sinon.match.has(
'message',
'project structure version out of order'
)
)
)
})
it('should call the callback with the latest version information', function () {
expect(this.callback).to.have.been.calledWith(
sinon.match.instanceOf(Errors.OpsOutOfOrderError),
12,
{
project: 3,
docs: { mock_doc_id: { pathname: '/main.tex', v: 2 } },
},
this.chunk.chunk.history.changes[6]
)
})
})
describe('with an unexpected response', function () {
beforeEach(function () {
this.badChunk = {
chunk: {
foo: 123, // a valid chunk would have a startVersion property
bar: 456,
},
}
this.request
.withArgs(this.latestChunkRequestArgs)
.yields(null, { statusCode: 200 }, this.badChunk)
this.HistoryStoreManager.getMostRecentVersion(
this.projectId,
this.historyId,
this.callback
)
})
it('should return an error', function () {
expect(this.callback).to.have.been.calledWith(
sinon.match
.instanceOf(Error)
.and(sinon.match.has('message', 'unexpected response'))
)
})
})
})
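// Sketch of the version arithmetic asserted above: the most recent version is
// the chunk's start version plus its number of changes. The consistency scan
// over projectVersion / v2DocVersions, which raises OpsOutOfOrderError while
// still reporting this version, is omitted here.
function mostRecentVersionSketch(chunk) {
  const { startVersion, history } = chunk.chunk
  return startVersion + history.changes.length
}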
describe('createBlobForUpdate', function () {
beforeEach(function () {
this.fileStream = {}
this.hash = 'random-hash'
this.LocalFileWriter.bufferOnDisk.callsArgWith(4, null, this.hash)
this.FetchUtils.fetchNothing.rejects(
new RequestFailedError('', {}, { status: 404 })
)
this.FetchUtils.fetchStream.resolves(this.fileStream)
})
describe('for a file update with any filestore location', function () {
beforeEach(function (done) {
this.file_id = '012345678901234567890123'
this.update = {
file: true,
url: `http://filestore.other.cloud.provider/project/${this.projectId}/file/${this.file_id}`,
hash: this.hash,
}
this.HistoryStoreManager.createBlobForUpdate(
this.projectId,
this.historyId,
this.update,
(err, { file: hash }) => {
if (err) {
return done(err)
}
this.actualHash = hash
done()
}
)
})
it('should not log any warnings', function () {
expect(this.logger.warn).to.not.have.been.called
})
it('should request the file from the filestore in settings', function () {
expect(this.FetchUtils.fetchStream).to.have.been.calledWithMatch(
`${this.settings.apis.filestore.url}/project/${this.projectId}/file/${this.file_id}`
)
})
it('should call the callback with the blob', function () {
expect(this.actualHash).to.equal(this.hash)
})
})
describe('with filestore disabled', function () {
beforeEach(function (done) {
this.settings.apis.filestore.enabled = false
this.file_id = '012345678901234567890123'
this.update = {
file: true,
url: `http://filestore.other.cloud.provider/project/${this.projectId}/file/${this.file_id}`,
hash: this.hash,
}
this.HistoryStoreManager.createBlobForUpdate(
this.projectId,
this.historyId,
this.update,
err => {
expect(err).to.match(/blocking filestore read/)
done()
}
)
})
it('should not request the file', function () {
expect(this.FetchUtils.fetchStream).to.not.have.been.called
})
})
describe('for a file update with an invalid filestore location', function () {
beforeEach(function (done) {
this.invalid_id = '000000000000000000000000'
this.file_id = '012345678901234567890123'
this.update = {
file: true,
url: `http://filestore.other.cloud.provider/project/${this.invalid_id}/file/${this.file_id}`,
hash: this.hash,
}
this.HistoryStoreManager.createBlobForUpdate(
this.projectId,
this.historyId,
this.update,
err => {
expect(err).to.exist
done()
}
)
})
it('should not request the file from the filestore', function () {
expect(this.FetchUtils.fetchStream).to.not.have.been.called
})
})
describe('when the hash mismatches', function () {
beforeEach(function (done) {
this.file_id = '012345678901234567890123'
this.update = {
file: true,
url: `http://filestore.other.cloud.provider/project/${this.projectId}/file/${this.file_id}`,
hash: 'another-hash-from-web',
}
this.HistoryStoreManager.createBlobForUpdate(
this.projectId,
this.historyId,
this.update,
(err, { file: hash }) => {
if (err) {
return done(err)
}
this.actualHash = hash
done()
}
)
})
it('should log a warning', function () {
expect(this.logger.warn).to.have.been.calledWith(
{
projectId: this.projectId,
fileId: this.file_id,
webHash: 'another-hash-from-web',
fileHash: this.hash,
},
'hash mismatch between web and project-history'
)
})
it('should request the file from the filestore in settings', function () {
expect(this.FetchUtils.fetchStream).to.have.been.calledWithMatch(
`${this.settings.apis.filestore.url}/project/${this.projectId}/file/${this.file_id}`
)
})
it('should call the callback with the blob', function () {
expect(this.actualHash).to.equal(this.hash)
})
})
describe('when the createdBlob flag is set on the update', function () {
beforeEach(function () {
this.file_id = '012345678901234567890123'
this.update = {
file: true,
createdBlob: true,
url: `http://filestore.other.cloud.provider/project/${this.projectId}/file/${this.file_id}`,
hash: this.hash,
}
})
describe('when history-v1 confirms that the blob exists', function () {
beforeEach(function (done) {
this.FetchUtils.fetchNothing.resolves()
this.HistoryStoreManager.createBlobForUpdate(
this.projectId,
this.historyId,
this.update,
(err, { file: hash }) => {
if (err) {
return done(err)
}
this.actualHash = hash
done()
}
)
})
it('should call the callback with the existing hash', function () {
expect(this.actualHash).to.equal(this.hash)
})
it('should not request the file from the filestore', function () {
expect(this.FetchUtils.fetchStream).to.not.have.been.called
})
it('should log a debug level message', function () {
expect(this.logger.debug).to.have.been.calledWith(
{
projectId: this.projectId,
fileId: this.file_id,
update: this.update,
},
'Skipping blob creation as it has already been created'
)
})
})
describe('when history-v1 does not confirm that the blob exists', function () {
beforeEach(function (done) {
this.FetchUtils.fetchNothing.rejects(
new RequestFailedError(
`${this.settings.overleaf.history.host}/project/${this.projectId}/file/${this.file_id}`,
{ method: 'HEAD' },
{ status: 404 }
)
)
this.HistoryStoreManager.createBlobForUpdate(
this.projectId,
this.historyId,
this.update,
(err, { file: hash }) => {
if (err) {
return done(err)
}
this.actualHash = hash
done()
}
)
})
it('should warn that we will use the filestore', function () {
expect(this.logger.warn).to.have.been.calledWithMatch(
{
fileId: this.file_id,
projectId: this.projectId,
update: this.update,
},
'created blob does not exist, reading from filestore'
)
})
it('should request the file from the filestore in settings', function () {
expect(this.FetchUtils.fetchStream).to.have.been.calledWithMatch(
`${this.settings.apis.filestore.url}/project/${this.projectId}/file/${this.file_id}`
)
})
it('should call the callback with the blob', function () {
expect(this.actualHash).to.equal(this.hash)
})
})
})
})
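// Sketch of the createdBlob fast path exercised above. When web marks the
// blob as already created, a HEAD request to history-v1 confirms it exists
// and the filestore read is skipped; a failed check logs a warning and falls
// back to the filestore, whose URL is always rebuilt from settings rather
// than trusted from the update. `blobUrl` and the ctx bundle are hypothetical
// stand-ins, and error handling is condensed.
async function createBlobSketch(update, ctx) {
  const { projectId, fileId, blobUrl, fetchNothing, fetchStream, settings, logger } = ctx
  if (update.createdBlob) {
    try {
      await fetchNothing(blobUrl(update.hash), { method: 'HEAD' })
      logger.debug({ projectId, fileId, update }, 'Skipping blob creation as it has already been created')
      return update.hash
    } catch (err) {
      logger.warn({ projectId, fileId, update }, 'created blob does not exist, reading from filestore')
    }
  }
  if (!settings.apis.filestore.enabled) {
    throw new Error('blocking filestore read')
  }
  // the real code buffers this stream to disk, hashes it, and uploads the blob
  return fetchStream(`${settings.apis.filestore.url}/project/${projectId}/file/${fileId}`)
}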
describe('getProjectBlob', function () {
describe('successfully', function () {
beforeEach(function () {
this.blobContent = 'test content'
this.blobHash = 'test hash'
this.request.yields(null, { statusCode: 200 }, this.blobContent)
this.HistoryStoreManager.getProjectBlob(
this.historyId,
this.blobHash,
this.callback
)
})
it('should get the blob from the overleaf history service', function () {
expect(this.request).to.have.been.calledWithMatch({
method: 'GET',
url: `${this.settings.overleaf.history.host}/projects/${this.historyId}/blobs/${this.blobHash}`,
auth: {
user: this.settings.overleaf.history.user,
pass: this.settings.overleaf.history.pass,
sendImmediately: true,
},
})
})
it('should call the callback with the blob', function () {
expect(this.callback).to.have.been.calledWith(null, this.blobContent)
})
})
})
describe('getProjectBlobStream', function () {
describe('successfully', function () {
beforeEach(function (done) {
this.historyResponse = new EventEmitter()
this.blobHash = 'test hash'
this.FetchUtils.fetchStream.resolves(this.historyResponse)
this.HistoryStoreManager.getProjectBlobStream(
this.historyId,
this.blobHash,
(err, stream) => {
if (err) {
return done(err)
}
this.stream = stream
done()
}
)
})
it('should get the blob from the overleaf history service', function () {
expect(this.FetchUtils.fetchStream).to.have.been.calledWithMatch(
`${this.settings.overleaf.history.host}/projects/${this.historyId}/blobs/${this.blobHash}`
)
})
it('should return a stream of the blob contents', function () {
expect(this.stream).to.equal(this.historyResponse)
})
})
})
describe('initializeProject', function () {
describe('successfully', function () {
beforeEach(function () {
this.response_body = { projectId: this.historyId }
this.request.callsArgWith(
1,
null,
{ statusCode: 200 },
this.response_body
)
this.HistoryStoreManager.initializeProject(
this.historyId,
this.callback
)
})
it('should send the change to the history store', function () {
expect(this.request).to.have.been.calledWithMatch({
method: 'POST',
url: `${this.settings.overleaf.history.host}/projects`,
auth: {
user: this.settings.overleaf.history.user,
pass: this.settings.overleaf.history.pass,
sendImmediately: true,
},
json: { projectId: this.historyId },
})
})
it('should call the callback with the new overleaf id', function () {
expect(this.callback).to.have.been.calledWith(null, this.historyId)
})
})
})
describe('deleteProject', function () {
beforeEach(function (done) {
this.request.yields(null, { statusCode: 204 }, '')
this.HistoryStoreManager.deleteProject(this.historyId, done)
})
it('should ask the history store to delete the project', function () {
expect(this.request).to.have.been.calledWithMatch({
method: 'DELETE',
url: `${this.settings.overleaf.history.host}/projects/${this.historyId}`,
})
})
})
})

View File

@@ -0,0 +1,573 @@
import sinon from 'sinon'
import { strict as esmock } from 'esmock'
import mongodb from 'mongodb-legacy'
const { ObjectId } = mongodb
const MODULE_PATH = '../../../../app/js/HttpController.js'
describe('HttpController', function () {
beforeEach(async function () {
this.UpdatesProcessor = {
processUpdatesForProject: sinon.stub().yields(),
}
this.SummarizedUpdatesManager = {
getSummarizedProjectUpdates: sinon.stub(),
}
this.DiffManager = {
getDiff: sinon.stub(),
}
this.HistoryStoreManager = {
deleteProject: sinon.stub().yields(),
getMostRecentVersion: sinon.stub(),
getProjectBlobStream: sinon.stub(),
initializeProject: sinon.stub(),
}
this.SnapshotManager = {
getFileSnapshotStream: sinon.stub(),
getProjectSnapshot: sinon.stub(),
}
this.HealthChecker = {}
this.SyncManager = {
clearResyncState: sinon.stub().yields(),
startResync: sinon.stub().yields(),
}
this.WebApiManager = {
getHistoryId: sinon.stub(),
}
this.RedisManager = {
destroyDocUpdatesQueue: sinon.stub().yields(),
clearFirstOpTimestamp: sinon.stub().yields(),
clearCachedHistoryId: sinon.stub().yields(),
}
this.ErrorRecorder = {
clearError: sinon.stub().yields(),
}
this.LabelsManager = {
createLabel: sinon.stub(),
deleteLabel: sinon.stub().yields(),
deleteLabelForUser: sinon.stub().yields(),
getLabels: sinon.stub(),
}
this.HistoryApiManager = {
shouldUseProjectHistory: sinon.stub(),
}
this.RetryManager = {}
this.FlushManager = {}
this.request = {}
this.pipeline = sinon.stub()
this.HttpController = await esmock(MODULE_PATH, {
request: this.request,
stream: { pipeline: this.pipeline },
'../../../../app/js/UpdatesProcessor.js': this.UpdatesProcessor,
'../../../../app/js/SummarizedUpdatesManager.js':
this.SummarizedUpdatesManager,
'../../../../app/js/DiffManager.js': this.DiffManager,
'../../../../app/js/HistoryStoreManager.js': this.HistoryStoreManager,
'../../../../app/js/SnapshotManager.js': this.SnapshotManager,
'../../../../app/js/HealthChecker.js': this.HealthChecker,
'../../../../app/js/SyncManager.js': this.SyncManager,
'../../../../app/js/WebApiManager.js': this.WebApiManager,
'../../../../app/js/RedisManager.js': this.RedisManager,
'../../../../app/js/ErrorRecorder.js': this.ErrorRecorder,
'../../../../app/js/LabelsManager.js': this.LabelsManager,
'../../../../app/js/HistoryApiManager.js': this.HistoryApiManager,
'../../../../app/js/RetryManager.js': this.RetryManager,
'../../../../app/js/FlushManager.js': this.FlushManager,
})
this.pathname = 'doc-id-123'
this.projectId = new ObjectId().toString()
this.projectOwnerId = new ObjectId().toString()
this.next = sinon.stub()
this.userId = new ObjectId().toString()
this.now = Date.now()
this.res = {
json: sinon.stub(),
send: sinon.stub(),
sendStatus: sinon.stub(),
setHeader: sinon.stub(),
}
})
describe('getProjectBlob', function () {
beforeEach(function () {
this.blobHash = 'abcd'
this.stream = {}
this.historyId = 1337
this.HistoryStoreManager.getProjectBlobStream.yields(null, this.stream)
this.HttpController.getProjectBlob(
{ params: { history_id: this.historyId, hash: this.blobHash } },
this.res,
this.next
)
})
it('should get a blob stream', function () {
this.HistoryStoreManager.getProjectBlobStream
.calledWith(this.historyId, this.blobHash)
.should.equal(true)
this.pipeline.should.have.been.calledWith(this.stream, this.res)
})
it('should set caching header', function () {
this.res.setHeader.should.have.been.calledWith(
'Cache-Control',
'private, max-age=86400'
)
})
})
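// Sketch of the handler shape verified above: stream the blob through the
// response with a private one-day cache header. Express-style signature
// assumed; error handling condensed.
function getProjectBlobSketch(HistoryStoreManager, pipeline) {
  return (req, res, next) => {
    const { history_id: historyId, hash } = req.params
    HistoryStoreManager.getProjectBlobStream(historyId, hash, (err, stream) => {
      if (err) return next(err)
      res.setHeader('Cache-Control', 'private, max-age=86400')
      pipeline(stream, res, err => {
        if (err) next(err)
      })
    })
  }
}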
describe('initializeProject', function () {
beforeEach(function () {
this.historyId = new ObjectId().toString()
this.req = { body: { historyId: this.historyId } }
this.HistoryStoreManager.initializeProject.yields(null, this.historyId)
this.HttpController.initializeProject(this.req, this.res, this.next)
})
it('should initialize the project', function () {
this.HistoryStoreManager.initializeProject.calledWith().should.equal(true)
})
it('should return the new overleaf id', function () {
this.res.json
.calledWith({ project: { id: this.historyId } })
.should.equal(true)
})
})
describe('flushProject', function () {
beforeEach(function () {
this.req = {
params: {
project_id: this.projectId,
},
query: {},
}
this.HttpController.flushProject(this.req, this.res, this.next)
})
it('should process the updates', function () {
this.UpdatesProcessor.processUpdatesForProject
.calledWith(this.projectId)
.should.equal(true)
})
it('should return a success code', function () {
this.res.sendStatus.calledWith(204).should.equal(true)
})
})
describe('getDiff', function () {
beforeEach(function () {
this.from = 42
this.to = 45
this.req = {
params: {
project_id: this.projectId,
},
query: {
pathname: this.pathname,
from: this.from,
to: this.to,
},
}
this.diff = [{ u: 'mock-diff' }]
this.DiffManager.getDiff.yields(null, this.diff)
this.HttpController.getDiff(this.req, this.res, this.next)
})
it('should get the diff', function () {
this.DiffManager.getDiff.should.have.been.calledWith(
this.projectId,
this.pathname,
this.from,
this.to
)
})
it('should return the diff', function () {
this.res.json.calledWith({ diff: this.diff }).should.equal(true)
})
})
describe('getUpdates', function () {
beforeEach(function () {
this.before = Date.now()
this.nextBeforeTimestamp = this.before - 100
this.min_count = 10
this.req = {
params: {
project_id: this.projectId,
},
query: {
before: this.before,
min_count: this.min_count,
},
}
this.updates = [{ i: 'mock-summarized-updates', p: 10 }]
this.SummarizedUpdatesManager.getSummarizedProjectUpdates.yields(
null,
this.updates,
this.nextBeforeTimestamp
)
this.HttpController.getUpdates(this.req, this.res, this.next)
})
it('should get the updates', function () {
this.SummarizedUpdatesManager.getSummarizedProjectUpdates.should.have.been.calledWith(
this.projectId,
{
before: this.before,
min_count: this.min_count,
}
)
})
it('should return the formatted updates', function () {
this.res.json.should.have.been.calledWith({
updates: this.updates,
nextBeforeTimestamp: this.nextBeforeTimestamp,
})
})
})
describe('latestVersion', function () {
beforeEach(function () {
this.historyId = 1234
this.req = {
params: {
project_id: this.projectId,
},
}
this.version = 99
this.lastChange = {
v2Authors: ['1234'],
timestamp: '2016-08-16T10:44:40.227Z',
}
this.versionInfo = {
version: this.version,
v2Authors: ['1234'],
timestamp: '2016-08-16T10:44:40.227Z',
}
this.WebApiManager.getHistoryId.yields(null, this.historyId)
this.HistoryStoreManager.getMostRecentVersion.yields(
null,
this.version,
{},
this.lastChange
)
this.HttpController.latestVersion(this.req, this.res, this.next)
})
it('should process the updates', function () {
this.UpdatesProcessor.processUpdatesForProject
.calledWith(this.projectId)
.should.equal(true)
})
it('should get the overleaf project id', function () {
this.WebApiManager.getHistoryId
.calledWith(this.projectId)
.should.equal(true)
})
it('should get the latest version', function () {
this.HistoryStoreManager.getMostRecentVersion
.calledWith(this.projectId, this.historyId)
.should.equal(true)
})
it('should return version number', function () {
this.res.json.calledWith(this.versionInfo).should.equal(true)
})
})
describe('resyncProject', function () {
beforeEach(function () {
this.req = {
params: {
project_id: this.projectId,
},
query: {},
body: {},
}
this.HttpController.resyncProject(this.req, this.res, this.next)
})
it('should resync the project', function () {
this.SyncManager.startResync.calledWith(this.projectId).should.equal(true)
})
it('should flush the queue', function () {
this.UpdatesProcessor.processUpdatesForProject
.calledWith(this.projectId)
.should.equal(true)
})
it('should return 204', function () {
this.res.sendStatus.calledWith(204).should.equal(true)
})
})
describe('getFileSnapshot', function () {
beforeEach(function () {
this.version = 42
this.pathname = 'foo.tex'
this.req = {
params: {
project_id: this.projectId,
version: this.version,
pathname: this.pathname,
},
}
this.res = { mock: 'res' }
this.stream = {}
this.SnapshotManager.getFileSnapshotStream.yields(null, this.stream)
this.HttpController.getFileSnapshot(this.req, this.res, this.next)
})
it('should get the snapshot', function () {
this.SnapshotManager.getFileSnapshotStream.should.have.been.calledWith(
this.projectId,
this.version,
this.pathname
)
})
it('should pipe the returned stream into the response', function () {
this.pipeline.should.have.been.calledWith(this.stream, this.res)
})
})
describe('getProjectSnapshot', function () {
beforeEach(function () {
this.version = 42
this.req = {
params: {
project_id: this.projectId,
version: this.version,
},
}
this.res = { json: sinon.stub() }
this.snapshotData = { one: 1 }
this.SnapshotManager.getProjectSnapshot.yields(null, this.snapshotData)
this.HttpController.getProjectSnapshot(this.req, this.res, this.next)
})
it('should get the snapshot', function () {
this.SnapshotManager.getProjectSnapshot.should.have.been.calledWith(
this.projectId,
this.version
)
})
it('should send json response', function () {
this.res.json.calledWith(this.snapshotData).should.equal(true)
})
})
describe('getLabels', function () {
beforeEach(function () {
this.req = {
params: {
project_id: this.projectId,
},
}
this.labels = ['label-1', 'label-2']
this.LabelsManager.getLabels.yields(null, this.labels)
})
describe('project history is enabled', function () {
beforeEach(function () {
this.HistoryApiManager.shouldUseProjectHistory.yields(null, true)
this.HttpController.getLabels(this.req, this.res, this.next)
})
it('should get the labels for a project', function () {
this.LabelsManager.getLabels
.calledWith(this.projectId)
.should.equal(true)
})
it('should return the labels', function () {
this.res.json.calledWith(this.labels).should.equal(true)
})
})
describe('project history is not enabled', function () {
beforeEach(function () {
this.HistoryApiManager.shouldUseProjectHistory.yields(null, false)
this.HttpController.getLabels(this.req, this.res, this.next)
})
it('should return 409', function () {
this.res.sendStatus.calledWith(409).should.equal(true)
})
})
})
describe('createLabel', function () {
beforeEach(function () {
this.req = {
params: {
project_id: this.projectId,
},
body: {
version: (this.version = 'label-1'),
comment: (this.comment = 'a comment'),
created_at: (this.created_at = Date.now().toString()),
validate_exists: true,
user_id: this.userId,
},
}
this.label = { _id: new ObjectId() }
this.LabelsManager.createLabel.yields(null, this.label)
})
describe('project history is enabled', function () {
beforeEach(function () {
this.HistoryApiManager.shouldUseProjectHistory.yields(null, true)
this.HttpController.createLabel(this.req, this.res, this.next)
})
it('should create a label for a project', function () {
this.LabelsManager.createLabel.should.have.been.calledWith(
this.projectId,
this.userId,
this.version,
this.comment,
this.created_at,
true
)
})
it('should return the label', function () {
this.res.json.calledWith(this.label).should.equal(true)
})
})
describe('validate_exists = false is passed', function () {
beforeEach(function () {
this.req.body.validate_exists = false
this.HistoryApiManager.shouldUseProjectHistory.yields(null, true)
this.HttpController.createLabel(this.req, this.res, this.next)
})
it('should create a label for a project', function () {
this.LabelsManager.createLabel
.calledWith(
this.projectId,
this.userId,
this.version,
this.comment,
this.created_at,
false
)
.should.equal(true)
})
it('should return the label', function () {
this.res.json.calledWith(this.label).should.equal(true)
})
})
describe('project history is not enabled', function () {
beforeEach(function () {
this.HistoryApiManager.shouldUseProjectHistory.yields(null, false)
this.HttpController.createLabel(this.req, this.res, this.next)
})
it('should return 409', function () {
this.res.sendStatus.calledWith(409).should.equal(true)
})
})
})
describe('deleteLabelForUser', function () {
beforeEach(function () {
this.req = {
params: {
project_id: this.projectId,
user_id: this.userId,
label_id: (this.label_id = new ObjectId()),
},
}
this.HttpController.deleteLabelForUser(this.req, this.res, this.next)
})
it('should delete a label for a project', function () {
this.LabelsManager.deleteLabelForUser
.calledWith(this.projectId, this.userId, this.label_id)
.should.equal(true)
})
it('should return 204', function () {
this.res.sendStatus.calledWith(204).should.equal(true)
})
})
describe('deleteLabel', function () {
beforeEach(function () {
this.req = {
params: {
project_id: this.projectId,
label_id: (this.label_id = new ObjectId()),
},
}
this.HttpController.deleteLabel(this.req, this.res, this.next)
})
it('should delete a label for a project', function () {
this.LabelsManager.deleteLabel
.calledWith(this.projectId, this.label_id)
.should.equal(true)
})
it('should return 204', function () {
this.res.sendStatus.calledWith(204).should.equal(true)
})
})
describe('deleteProject', function () {
beforeEach(function () {
this.req = {
params: {
project_id: this.projectId,
},
}
this.WebApiManager.getHistoryId
.withArgs(this.projectId)
.yields(null, this.historyId)
this.HttpController.deleteProject(this.req, this.res, this.next)
})
it('should delete the updates queue', function () {
this.RedisManager.destroyDocUpdatesQueue.should.have.been.calledWith(
this.projectId
)
})
it('should clear the first op timestamp', function () {
this.RedisManager.clearFirstOpTimestamp.should.have.been.calledWith(
this.projectId
)
})
it('should clear the cached history id', function () {
this.RedisManager.clearCachedHistoryId.should.have.been.calledWith(
this.projectId
)
})
it('should clear the resync state', function () {
this.SyncManager.clearResyncState.should.have.been.calledWith(
this.projectId
)
})
it('should clear any failure record', function () {
this.ErrorRecorder.clearError.should.have.been.calledWith(this.projectId)
})
})
})


@@ -0,0 +1,293 @@
import sinon from 'sinon'
import { expect } from 'chai'
import mongodb from 'mongodb-legacy'
import tk from 'timekeeper'
import { strict as esmock } from 'esmock'
const { ObjectId } = mongodb
const MODULE_PATH = '../../../../app/js/LabelsManager.js'
describe('LabelsManager', function () {
beforeEach(async function () {
this.now = new Date()
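// Freeze the clock so labels created without an explicit createdAt (see the
// 'without createdAt' case below) get this deterministic timestamp.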
tk.freeze(this.now)
this.db = {
projectHistoryLabels: {
deleteOne: sinon.stub(),
find: sinon.stub(),
insertOne: sinon.stub(),
},
}
this.mongodb = {
ObjectId,
db: this.db,
}
this.HistoryStoreManager = {
getChunkAtVersion: sinon.stub().yields(),
}
this.UpdatesProcessor = {
processUpdatesForProject: sinon.stub().yields(),
}
this.WebApiManager = {
getHistoryId: sinon.stub(),
}
this.LabelsManager = await esmock(MODULE_PATH, {
'../../../../app/js/mongodb.js': this.mongodb,
'../../../../app/js/HistoryStoreManager.js': this.HistoryStoreManager,
'../../../../app/js/UpdatesProcessor.js': this.UpdatesProcessor,
'../../../../app/js/WebApiManager.js': this.WebApiManager,
})
this.project_id = new ObjectId().toString()
this.historyId = 123
this.user_id = new ObjectId().toString()
this.label_id = new ObjectId().toString()
this.callback = sinon.stub()
})
afterEach(function () {
tk.reset()
})
describe('getLabels', function () {
beforeEach(function () {
this.label = {
_id: new ObjectId(),
comment: 'some comment',
version: 123,
user_id: new ObjectId(),
created_at: new Date(),
}
this.db.projectHistoryLabels.find.returns({
toArray: sinon.stub().yields(null, [this.label]),
})
})
describe('with valid project id', function () {
beforeEach(function () {
this.LabelsManager.getLabels(this.project_id, this.callback)
})
it('gets the labels state from mongo', function () {
expect(this.db.projectHistoryLabels.find).to.have.been.calledWith({
project_id: new ObjectId(this.project_id),
})
})
it('returns formatted labels', function () {
expect(this.callback).to.have.been.calledWith(null, [
sinon.match({
id: this.label._id,
comment: this.label.comment,
version: this.label.version,
user_id: this.label.user_id,
created_at: this.label.created_at,
}),
])
})
})
describe('with invalid project id', function () {
it('returns an error', function (done) {
this.LabelsManager.getLabels('invalid id', error => {
expect(error).to.exist
done()
})
})
})
})
describe('createLabel', function () {
beforeEach(function () {
this.version = 123
this.comment = 'a comment'
this.WebApiManager.getHistoryId.yields(null, this.historyId)
})
describe('with createdAt', function () {
beforeEach(function () {
this.createdAt = new Date(1)
this.db.projectHistoryLabels.insertOne.yields(null, {
insertedId: new ObjectId(this.label_id),
})
this.LabelsManager.createLabel(
this.project_id,
this.user_id,
this.version,
this.comment,
this.createdAt,
true,
this.callback
)
})
it('flushes unprocessed updates', function () {
expect(
this.UpdatesProcessor.processUpdatesForProject
).to.have.been.calledWith(this.project_id)
})
it('finds the V1 project id', function () {
expect(this.WebApiManager.getHistoryId).to.have.been.calledWith(
this.project_id
)
})
it('checks there is a chunk for the project + version', function () {
expect(
this.HistoryStoreManager.getChunkAtVersion
).to.have.been.calledWith(this.project_id, this.historyId, this.version)
})
it('creates the label in mongo', function () {
expect(this.db.projectHistoryLabels.insertOne).to.have.been.calledWith(
sinon.match({
project_id: new ObjectId(this.project_id),
comment: this.comment,
version: this.version,
user_id: new ObjectId(this.user_id),
created_at: this.createdAt,
}),
sinon.match.any
)
})
it('returns the label', function () {
expect(this.callback).to.have.been.calledWith(null, {
id: new ObjectId(this.label_id),
comment: this.comment,
version: this.version,
user_id: new ObjectId(this.user_id),
created_at: this.createdAt,
})
})
})
describe('without createdAt', function () {
beforeEach(function () {
this.db.projectHistoryLabels.insertOne.yields(null, {
insertedId: new ObjectId(this.label_id),
})
this.LabelsManager.createLabel(
this.project_id,
this.user_id,
this.version,
this.comment,
undefined,
true,
this.callback
)
})
it('creates the label with the current date', function () {
expect(this.db.projectHistoryLabels.insertOne).to.have.been.calledWith(
sinon.match({
project_id: new ObjectId(this.project_id),
comment: this.comment,
version: this.version,
user_id: new ObjectId(this.user_id),
created_at: this.now,
})
)
})
})
describe('with shouldValidateExists = false', function () {
beforeEach(function () {
this.createdAt = new Date(1)
this.db.projectHistoryLabels.insertOne.yields(null, {
insertedId: new ObjectId(this.label_id),
})
this.LabelsManager.createLabel(
this.project_id,
this.user_id,
this.version,
this.comment,
this.createdAt,
false,
this.callback
)
})
it('does not check that there is a chunk for the project + version', function () {
expect(this.HistoryStoreManager.getChunkAtVersion).to.not.have.been
.called
})
})
describe('with no userId', function () {
beforeEach(function () {
this.db.projectHistoryLabels.insertOne.yields(null, {
insertedId: new ObjectId(this.label_id),
})
const userId = undefined
this.LabelsManager.createLabel(
this.project_id,
userId,
this.version,
this.comment,
this.createdAt,
false,
this.callback
)
})
it('creates the label without user_id', function () {
expect(this.db.projectHistoryLabels.insertOne).to.have.been.calledWith(
sinon.match({
project_id: new ObjectId(this.project_id),
comment: this.comment,
version: this.version,
user_id: undefined,
created_at: this.now,
})
)
})
})
})
describe('deleteLabelForUser', function () {
beforeEach(function () {
this.db.projectHistoryLabels.deleteOne.yields()
this.LabelsManager.deleteLabelForUser(
this.project_id,
this.user_id,
this.label_id,
this.callback
)
})
it('removes the label from the database', function () {
expect(this.db.projectHistoryLabels.deleteOne).to.have.been.calledWith(
{
_id: new ObjectId(this.label_id),
project_id: new ObjectId(this.project_id),
user_id: new ObjectId(this.user_id),
},
this.callback
)
})
})
describe('deleteLabel', function () {
beforeEach(function () {
this.db.projectHistoryLabels.deleteOne.yields()
this.LabelsManager.deleteLabel(
this.project_id,
this.label_id,
this.callback
)
})
it('removes the label from the database', function () {
expect(this.db.projectHistoryLabels.deleteOne).to.have.been.calledWith(
{
_id: new ObjectId(this.label_id),
project_id: new ObjectId(this.project_id),
},
this.callback
)
})
})
})


@@ -0,0 +1,422 @@
/* eslint-disable
mocha/no-nested-tests,
no-return-assign,
no-undef,
no-unused-vars,
*/
// TODO: This file was created by bulk-decaffeinate.
// Fix any style issues and re-enable lint.
/*
* decaffeinate suggestions:
* DS101: Remove unnecessary use of Array.from
* DS102: Remove unnecessary code created because of implicit returns
* DS206: Consider reworking classes to avoid initClass
* DS207: Consider shorter variations of null checks
* Full docs: https://github.com/decaffeinate/decaffeinate/blob/master/docs/suggestions.md
*/
import async from 'async'
import sinon from 'sinon'
import { expect } from 'chai'
import { strict as esmock } from 'esmock'
const MODULE_PATH = '../../../../app/js/LockManager.js'
describe('LockManager', function () {
beforeEach(async function () {
let Timer
this.Settings = {
redis: {
lock: {},
},
}
this.rclient = {
auth: sinon.stub(),
del: sinon.stub().yields(),
eval: sinon.stub(),
exists: sinon.stub(),
set: sinon.stub(),
}
this.RedisWrapper = {
createClient: sinon.stub().returns(this.rclient),
}
this.Metrics = {
inc: sinon.stub(),
gauge: sinon.stub(),
Timer: (Timer = class Timer {}),
}
// All Timer instances share a single done() stub via the prototype.
Timer.prototype.done = sinon.stub()
this.logger = {
debug: sinon.stub(),
}
this.LockManager = await esmock(MODULE_PATH, {
'@overleaf/redis-wrapper': this.RedisWrapper,
'@overleaf/settings': this.Settings,
'@overleaf/metrics': this.Metrics,
'@overleaf/logger': this.logger,
})
this.key = 'lock-key'
this.callback = sinon.stub()
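// Fake timers let the retry and TTL tests below advance time
// deterministically with this.clock.tick()/runAll() instead of real waits.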
this.clock = sinon.useFakeTimers()
})
afterEach(function () {
this.clock.restore()
})
describe('checkLock', function () {
describe('when the lock is taken', function () {
beforeEach(function () {
this.rclient.exists.yields(null, '1')
return this.LockManager.checkLock(this.key, this.callback)
})
it('should check the lock in redis', function () {
return this.rclient.exists.calledWith(this.key).should.equal(true)
})
return it('should return the callback with false', function () {
return this.callback.calledWith(null, false).should.equal(true)
})
})
return describe('when the lock is free', function () {
beforeEach(function () {
this.rclient.exists.yields(null, '0')
return this.LockManager.checkLock(this.key, this.callback)
})
return it('should return the callback with true', function () {
return this.callback.calledWith(null, true).should.equal(true)
})
})
})
describe('tryLock', function () {
describe('when the lock is taken', function () {
beforeEach(function () {
this.rclient.set.yields(null, null)
this.LockManager._mocks.randomLock = sinon
.stub()
.returns('locked-random-value')
return this.LockManager.tryLock(this.key, this.callback)
})
it('should check the lock in redis', function () {
return this.rclient.set.should.have.been.calledWith(
this.key,
'locked-random-value',
'EX',
this.LockManager.LOCK_TTL,
'NX'
)
})
return it('should return the callback with false', function () {
return this.callback.calledWith(null, false).should.equal(true)
})
})
return describe('when the lock is free', function () {
beforeEach(function () {
this.rclient.set.yields(null, 'OK')
return this.LockManager.tryLock(this.key, this.callback)
})
return it('should return the callback with true', function () {
return this.callback.calledWith(null, true).should.equal(true)
})
})
})
describe('deleteLock', function () {
beforeEach(function () {
return this.LockManager.deleteLock(this.key, this.callback)
})
it('should delete the lock in redis', function () {
return this.rclient.del.calledWith(this.key).should.equal(true)
})
return it('should call the callback', function () {
return this.callback.called.should.equal(true)
})
})
describe('getLock', function () {
describe('when the lock is not taken', function () {
beforeEach(function (done) {
this.LockManager._mocks.tryLock = sinon.stub().yields(null, true)
return this.LockManager.getLock(this.key, (...args) => {
this.callback(...args)
return done()
})
})
it('should try to get the lock', function () {
return this.LockManager._mocks.tryLock
.calledWith(this.key)
.should.equal(true)
})
it('should only need to try once', function () {
return this.LockManager._mocks.tryLock.callCount.should.equal(1)
})
return it('should return the callback', function () {
return this.callback.calledWith(null).should.equal(true)
})
})
describe('when the lock is initially set', function () {
beforeEach(function (done) {
this.LockManager._mocks.tryLock = sinon.stub()
this.LockManager._mocks.tryLock.onCall(0).yields(null, false)
this.LockManager._mocks.tryLock.onCall(1).yields(null, false)
this.LockManager._mocks.tryLock.onCall(2).yields(null, false)
this.LockManager._mocks.tryLock.onCall(3).yields(null, true)
this.LockManager.getLock(this.key, (...args) => {
this.callback(...args)
return done()
})
this.clock.runAll()
})
it('should call tryLock multiple times until free', function () {
this.LockManager._mocks.tryLock.callCount.should.equal(4)
})
return it('should return the callback', function () {
return this.callback.calledWith(null).should.equal(true)
})
})
return describe('when the lock times out', function () {
beforeEach(function (done) {
this.LockManager._mocks.tryLock = sinon.stub().yields(null, false)
this.LockManager.getLock(this.key, (...args) => {
this.callback(...args)
return done()
})
this.clock.runAll()
})
return it('should return the callback with an error', function () {
return this.callback
.calledWith(sinon.match.instanceOf(Error))
.should.equal(true)
})
})
})
return describe('runWithLock', function () {
describe('with successful run', function () {
beforeEach(function () {
this.result = 'mock-result'
this.runner = sinon.stub().callsFake((extendLock, releaseLock) => {
return releaseLock(null, this.result)
})
this.LockManager._mocks.getLock = sinon.stub().yields()
this.LockManager._mocks.releaseLock = sinon.stub().yields()
return this.LockManager.runWithLock(
this.key,
this.runner,
this.callback
)
})
it('should get the lock', function () {
return this.LockManager._mocks.getLock
.calledWith(this.key)
.should.equal(true)
})
it('should run the passed function', function () {
return this.runner.called.should.equal(true)
})
it('should release the lock', function () {
return this.LockManager._mocks.releaseLock
.calledWith(this.key)
.should.equal(true)
})
return it('should call the callback', function () {
return this.callback.calledWith(null, this.result).should.equal(true)
})
})
describe('when the runner function returns an error', function () {
beforeEach(function () {
this.error = new Error('oops')
this.result = 'mock-result'
this.runner = sinon.stub().callsFake((extendLock, releaseLock) => {
return releaseLock(this.error, this.result)
})
this.LockManager._mocks.getLock = sinon.stub().yields()
this.LockManager._mocks.releaseLock = sinon.stub().yields()
return this.LockManager.runWithLock(
this.key,
this.runner,
this.callback
)
})
it('should release the lock', function () {
return this.LockManager._mocks.releaseLock
.calledWith(this.key)
.should.equal(true)
})
return it('should call the callback with the error', function () {
return this.callback
.calledWith(this.error, this.result)
.should.equal(true)
})
})
describe('extending the lock whilst running', function () {
beforeEach(function () {
this.lockValue = 'lock-value'
this.LockManager._mocks.getLock = sinon
.stub()
.yields(null, this.lockValue)
this.LockManager._mocks.extendLock = sinon.stub().callsArg(2)
this.LockManager._mocks.releaseLock = sinon.stub().callsArg(2)
})
it('should extend the lock if the minimum interval has been passed', function (done) {
const runner = (extendLock, releaseLock) => {
this.clock.tick(this.LockManager.MIN_LOCK_EXTENSION_INTERVAL + 1)
return extendLock(releaseLock)
}
return this.LockManager.runWithLock(this.key, runner, () => {
this.LockManager._mocks.extendLock
.calledWith(this.key, this.lockValue)
.should.equal(true)
return done()
})
})
return it('should not extend the lock if the minimum interval has not been passed', function (done) {
const runner = (extendLock, releaseLock) => {
this.clock.tick(this.LockManager.MIN_LOCK_EXTENSION_INTERVAL - 1)
return extendLock(releaseLock)
}
return this.LockManager.runWithLock(this.key, runner, () => {
this.LockManager._mocks.extendLock.callCount.should.equal(0)
return done()
})
})
})
describe('exceeding the lock ttl', function () {
beforeEach(function () {
this.lockValue = 'lock-value'
this.LockManager._mocks.getLock = sinon
.stub()
.yields(null, this.lockValue)
this.LockManager._mocks.extendLock = sinon.stub().yields()
this.LockManager._mocks.releaseLock = sinon.stub().yields()
this.LOCK_TTL_MS = this.LockManager.LOCK_TTL * 1000
})
it("doesn't log if the ttl wasn't exceeded", function (done) {
const runner = (extendLock, releaseLock) => {
this.clock.tick(this.LOCK_TTL_MS - 1)
return releaseLock()
}
return this.LockManager.runWithLock(this.key, runner, () => {
this.logger.debug.callCount.should.equal(0)
return done()
})
})
it("doesn't log if the lock was extended", function (done) {
const runner = (extendLock, releaseLock) => {
this.clock.tick(this.LOCK_TTL_MS - 1)
return extendLock(() => {
this.clock.tick(2)
return releaseLock()
})
}
return this.LockManager.runWithLock(this.key, runner, () => {
this.logger.debug.callCount.should.equal(0)
return done()
})
})
return it('logs that the execution exceeded the lock', function (done) {
const runner = (extendLock, releaseLock) => {
this.clock.tick(this.LOCK_TTL_MS + 1)
return releaseLock()
}
return this.LockManager.runWithLock(this.key, runner, () => {
this.logger.debug
.calledWithMatch('exceeded lock timeout', { key: this.key })
.should.equal(true)
return done()
})
})
})
return describe('releaseLock', function () {
describe('when the lock is current', function () {
beforeEach(function () {
this.rclient.eval.yields(null, 1)
return this.LockManager.releaseLock(
this.key,
this.lockValue,
this.callback
)
})
it('should clear the data from redis', function () {
return this.rclient.eval
.calledWith(
this.LockManager.UNLOCK_SCRIPT,
1,
this.key,
this.lockValue
)
.should.equal(true)
})
return it('should call the callback', function () {
return this.callback.called.should.equal(true)
})
})
return describe('when the lock has expired', function () {
beforeEach(function () {
this.rclient.eval.yields(null, 0)
return this.LockManager.releaseLock(
this.key,
this.lockValue,
this.callback
)
})
return it('should return an error if the lock has expired', function () {
return this.callback
.calledWith(
sinon.match.has('message', 'tried to release timed out lock')
)
.should.equal(true)
})
})
})
})
})


@@ -0,0 +1,76 @@
import { expect } from 'chai'
import Core from 'overleaf-editor-core'
import * as OperationsCompressor from '../../../../app/js/OperationsCompressor.js'
describe('OperationsCompressor', function () {
function edit(pathname, textOperationJsonObject) {
return Core.Operation.editFile(
pathname,
Core.TextOperation.fromJSON({ textOperation: textOperationJsonObject })
)
}
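// Reading note: these fixtures use TextOperation's JSON encoding, where a
// positive integer retains that many characters, a string inserts it, and a
// negative integer deletes that many characters. So [3, 'foo', 17] retains 3
// characters, inserts 'foo', then retains 17, and composing it with
// [10, -5, 8] gives [3, 'foo', 4, -5, 8], as asserted below.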
it('collapses edit operations', function () {
const compressedOperations = OperationsCompressor.compressOperations([
edit('main.tex', [3, 'foo', 17]),
edit('main.tex', [10, -5, 8]),
])
expect(compressedOperations).to.have.length(1)
expect(compressedOperations[0]).to.deep.equal(
edit('main.tex', [3, 'foo', 4, -5, 8])
)
})
it('only collapses consecutive composable edit operations', function () {
const compressedOperations = OperationsCompressor.compressOperations([
edit('main.tex', [3, 'foo', 17]),
edit('main.tex', [10, -5, 8]),
edit('not-main.tex', [3, 'foo', 17]),
edit('not-main.tex', [10, -5, 8]),
])
expect(compressedOperations).to.have.length(2)
expect(compressedOperations[0]).to.deep.equal(
edit('main.tex', [3, 'foo', 4, -5, 8])
)
expect(compressedOperations[1]).to.deep.equal(
edit('not-main.tex', [3, 'foo', 4, -5, 8])
)
})
it("don't collapses text operations around non-composable operations", function () {
const compressedOperations = OperationsCompressor.compressOperations([
edit('main.tex', [3, 'foo', 17]),
Core.Operation.moveFile('main.tex', 'new-main.tex'),
edit('new-main.tex', [10, -5, 8]),
edit('new-main.tex', [6, 'bar', 12]),
])
expect(compressedOperations).to.have.length(3)
expect(compressedOperations[0]).to.deep.equal(
edit('main.tex', [3, 'foo', 17])
)
expect(compressedOperations[1].newPathname).to.deep.equal('new-main.tex')
expect(compressedOperations[2]).to.deep.equal(
edit('new-main.tex', [6, 'bar', 4, -5, 8])
)
})
it('handles empty operations', function () {
const compressedOperations = OperationsCompressor.compressOperations([])
expect(compressedOperations).to.have.length(0)
})
it('handles single operations', function () {
const compressedOperations = OperationsCompressor.compressOperations([
edit('main.tex', [3, 'foo', 17]),
])
expect(compressedOperations).to.have.length(1)
expect(compressedOperations[0]).to.deep.equal(
edit('main.tex', [3, 'foo', 17])
)
})
})


@@ -0,0 +1,556 @@
import { expect } from 'chai'
import sinon from 'sinon'
import { strict as esmock } from 'esmock'
const MODULE_PATH = '../../../../app/js/RedisManager.js'
describe('RedisManager', function () {
beforeEach(async function () {
this.rclient = new FakeRedis()
this.RedisWrapper = {
createClient: sinon.stub().returns(this.rclient),
}
this.Settings = {
redis: {
project_history: {
key_schema: {
projectHistoryOps({ project_id: projectId }) {
return `Project:HistoryOps:{${projectId}}`
},
projectHistoryFirstOpTimestamp({ project_id: projectId }) {
return `ProjectHistory:FirstOpTimestamp:{${projectId}}`
},
},
},
},
}
this.Metrics = {
timing: sinon.stub(),
summary: sinon.stub(),
globalGauge: sinon.stub(),
}
this.RedisManager = await esmock(MODULE_PATH, {
'@overleaf/redis-wrapper': this.RedisWrapper,
'@overleaf/settings': this.Settings,
'@overleaf/metrics': this.Metrics,
})
this.projectId = 'project-id-123'
this.batchSize = 100
this.historyOpsKey = `Project:HistoryOps:{${this.projectId}}`
this.firstOpTimestampKey = `ProjectHistory:FirstOpTimestamp:{${this.projectId}}`
this.updates = [
{ v: 42, op: ['a', 'b', 'c', 'd'] },
{ v: 45, op: ['e', 'f', 'g', 'h'] },
]
this.extraUpdates = [{ v: 100, op: ['i', 'j', 'k'] }]
this.rawUpdates = this.updates.map(update => JSON.stringify(update))
this.extraRawUpdates = this.extraUpdates.map(update =>
JSON.stringify(update)
)
})
describe('getRawUpdatesBatch', function () {
it('gets a small number of updates in one batch', async function () {
const updates = makeUpdates(2)
const rawUpdates = makeRawUpdates(updates)
this.rclient.setList(this.historyOpsKey, rawUpdates)
const result = await this.RedisManager.promises.getRawUpdatesBatch(
this.projectId,
100
)
expect(result).to.deep.equal({ rawUpdates, hasMore: false })
})
it('gets a larger number of updates in several batches', async function () {
const updates = makeUpdates(
this.RedisManager.RAW_UPDATES_BATCH_SIZE * 2 + 12
)
const rawUpdates = makeRawUpdates(updates)
this.rclient.setList(this.historyOpsKey, rawUpdates)
const result = await this.RedisManager.promises.getRawUpdatesBatch(
this.projectId,
5000
)
expect(result).to.deep.equal({ rawUpdates, hasMore: false })
})
it("doesn't return more than the number of updates requested", async function () {
const updates = makeUpdates(100)
const rawUpdates = makeRawUpdates(updates)
this.rclient.setList(this.historyOpsKey, rawUpdates)
const result = await this.RedisManager.promises.getRawUpdatesBatch(
this.projectId,
75
)
expect(result).to.deep.equal({
rawUpdates: rawUpdates.slice(0, 75),
hasMore: true,
})
})
})
describe('parseDocUpdates', function () {
it('should return the parsed ops', function () {
const updates = makeUpdates(12)
const rawUpdates = makeRawUpdates(updates)
this.RedisManager.parseDocUpdates(rawUpdates).should.deep.equal(updates)
})
})
describe('getUpdatesInBatches', function () {
beforeEach(function () {
this.runner = sinon.stub().resolves()
})
describe('single batch smaller than batch size', function () {
beforeEach(async function () {
this.updates = makeUpdates(2)
this.rawUpdates = makeRawUpdates(this.updates)
this.rclient.setList(this.historyOpsKey, this.rawUpdates)
await this.RedisManager.promises.getUpdatesInBatches(
this.projectId,
3,
this.runner
)
})
it('calls the runner once', function () {
this.runner.callCount.should.equal(1)
})
it('calls the runner with the updates', function () {
this.runner.should.have.been.calledWith(this.updates)
})
it('deletes the applied updates', function () {
expect(this.rclient.getList(this.historyOpsKey)).to.deep.equal([])
})
it('deletes the first op timestamp', function () {
expect(this.rclient.del).to.have.been.calledWith(
this.firstOpTimestampKey
)
})
})
describe('single batch at batch size', function () {
beforeEach(async function () {
this.updates = makeUpdates(123)
this.rawUpdates = makeRawUpdates(this.updates)
this.rclient.setList(this.historyOpsKey, this.rawUpdates)
await this.RedisManager.promises.getUpdatesInBatches(
this.projectId,
123,
this.runner
)
})
it('calls the runner once', function () {
this.runner.callCount.should.equal(1)
})
it('calls the runner with the updates', function () {
this.runner.should.have.been.calledWith(this.updates)
})
it('deletes the applied updates', function () {
expect(this.rclient.getList(this.historyOpsKey)).to.deep.equal([])
})
it('deletes the first op timestamp', function () {
expect(this.rclient.del).to.have.been.calledWith(
this.firstOpTimestampKey
)
})
})
describe('single batch exceeding size limit on updates', function () {
beforeEach(async function () {
this.updates = makeUpdates(2, [
'x'.repeat(this.RedisManager.RAW_UPDATE_SIZE_THRESHOLD),
])
this.rawUpdates = makeRawUpdates(this.updates)
this.rclient.setList(this.historyOpsKey, this.rawUpdates)
await this.RedisManager.promises.getUpdatesInBatches(
this.projectId,
123,
this.runner
)
})
it('calls the runner twice', function () {
this.runner.callCount.should.equal(2)
})
it('calls the runner with the first update', function () {
this.runner
.getCall(0)
.should.have.been.calledWith(this.updates.slice(0, 1))
})
it('calls the runner with the second update', function () {
this.runner
.getCall(1)
.should.have.been.calledWith(this.updates.slice(1))
})
it('deletes the applied updates', function () {
expect(this.rclient.getList(this.historyOpsKey)).to.deep.equal([])
})
})
describe('two batches with first update below and second update above the size limit on updates', function () {
beforeEach(async function () {
this.updates = makeUpdates(2, [
'x'.repeat(this.RedisManager.RAW_UPDATE_SIZE_THRESHOLD / 2),
])
this.rawUpdates = makeRawUpdates(this.updates)
this.rclient.setList(this.historyOpsKey, this.rawUpdates)
await this.RedisManager.promises.getUpdatesInBatches(
this.projectId,
123,
this.runner
)
})
it('calls the runner twice', function () {
this.runner.callCount.should.equal(2)
})
it('calls the runner with the first update', function () {
this.runner
.getCall(0)
.should.have.been.calledWith(this.updates.slice(0, 1))
})
it('calls the runner with the second update', function () {
this.runner
.getCall(1)
.should.have.been.calledWith(this.updates.slice(1))
})
it('deletes the applied updates', function () {
expect(this.rclient.getList(this.historyOpsKey)).to.deep.equal([])
})
})
describe('single batch exceeding op count limit on updates', function () {
beforeEach(async function () {
const ops = Array(this.RedisManager.MAX_UPDATE_OP_LENGTH + 1).fill('op')
this.updates = makeUpdates(2, { op: ops })
this.rawUpdates = makeRawUpdates(this.updates)
this.rclient.setList(this.historyOpsKey, this.rawUpdates)
await this.RedisManager.promises.getUpdatesInBatches(
this.projectId,
123,
this.runner
)
})
it('calls the runner twice', function () {
this.runner.callCount.should.equal(2)
})
it('calls the runner with the first update', function () {
this.runner
.getCall(0)
.should.have.been.calledWith(this.updates.slice(0, 1))
})
it('calls the runner with the second update', function () {
this.runner
.getCall(1)
.should.have.been.calledWith(this.updates.slice(1))
})
it('deletes the applied updates', function () {
expect(this.rclient.getList(this.historyOpsKey)).to.deep.equal([])
})
})
describe('single batch exceeding doc content count', function () {
beforeEach(async function () {
this.updates = makeUpdates(
this.RedisManager.MAX_NEW_DOC_CONTENT_COUNT + 3,
{ resyncDocContent: 123 }
)
this.rawUpdates = makeRawUpdates(this.updates)
this.rclient.setList(this.historyOpsKey, this.rawUpdates)
await this.RedisManager.promises.getUpdatesInBatches(
this.projectId,
123,
this.runner
)
})
it('calls the runner twice', function () {
this.runner.callCount.should.equal(2)
})
it('calls the runner with the first batch of updates', function () {
this.runner.should.have.been.calledWith(
this.updates.slice(0, this.RedisManager.MAX_NEW_DOC_CONTENT_COUNT)
)
})
it('calls the runner with the second batch of updates', function () {
this.runner.should.have.been.calledWith(
this.updates.slice(this.RedisManager.MAX_NEW_DOC_CONTENT_COUNT)
)
})
it('deletes the applied updates', function () {
expect(this.rclient.getList(this.historyOpsKey)).to.deep.equal([])
})
})
describe('two batches with first update below and second update above the ops length limit on updates', function () {
beforeEach(async function () {
// make the second update exceed the op count limit while the first stays below it
this.updates = makeUpdates(2, { op: ['op1', 'op2'] })
this.updates[1].op = Array(
this.RedisManager.MAX_UPDATE_OP_LENGTH + 2
).fill('op')
this.rawUpdates = makeRawUpdates(this.updates)
this.rclient.setList(this.historyOpsKey, this.rawUpdates)
await this.RedisManager.promises.getUpdatesInBatches(
this.projectId,
123,
this.runner
)
})
it('calls the runner twice', function () {
this.runner.callCount.should.equal(2)
})
it('calls the runner with the first update', function () {
this.runner.should.have.been.calledWith(this.updates.slice(0, 1))
})
it('calls the runner with the second update', function () {
this.runner.should.have.been.calledWith(this.updates.slice(1))
})
it('deletes the applied updates', function () {
expect(this.rclient.getList(this.historyOpsKey)).to.deep.equal([])
})
})
describe('two batches, one partial', function () {
beforeEach(async function () {
this.updates = makeUpdates(15)
this.rawUpdates = makeRawUpdates(this.updates)
this.rclient.setList(this.historyOpsKey, this.rawUpdates)
await this.RedisManager.promises.getUpdatesInBatches(
this.projectId,
10,
this.runner
)
})
it('calls the runner twice', function () {
this.runner.callCount.should.equal(2)
})
it('calls the runner with the updates', function () {
this.runner
.getCall(0)
.should.have.been.calledWith(this.updates.slice(0, 10))
this.runner
.getCall(1)
.should.have.been.calledWith(this.updates.slice(10))
})
it('deletes the applied updates', function () {
expect(this.rclient.getList(this.historyOpsKey)).to.deep.equal([])
})
})
describe('two full batches', function () {
beforeEach(async function () {
this.updates = makeUpdates(20)
this.rawUpdates = makeRawUpdates(this.updates)
this.rclient.setList(this.historyOpsKey, this.rawUpdates)
await this.RedisManager.promises.getUpdatesInBatches(
this.projectId,
10,
this.runner
)
})
it('calls the runner twice', function () {
this.runner.callCount.should.equal(2)
})
it('calls the runner with the updates', function () {
this.runner
.getCall(0)
.should.have.been.calledWith(this.updates.slice(0, 10))
this.runner
.getCall(1)
.should.have.been.calledWith(this.updates.slice(10))
})
it('deletes the applied updates', function () {
expect(this.rclient.getList(this.historyOpsKey)).to.deep.equal([])
})
})
describe('three full batches, bigger than the Redis read batch size', function () {
beforeEach(async function () {
this.batchSize = this.RedisManager.RAW_UPDATES_BATCH_SIZE * 2
this.updates = makeUpdates(this.batchSize * 3)
this.rawUpdates = makeRawUpdates(this.updates)
this.rclient.setList(this.historyOpsKey, this.rawUpdates)
await this.RedisManager.promises.getUpdatesInBatches(
this.projectId,
this.batchSize,
this.runner
)
})
it('calls the runner three times', function () {
this.runner.callCount.should.equal(3)
})
it('calls the runner with the updates', function () {
this.runner
.getCall(0)
.should.have.been.calledWith(this.updates.slice(0, this.batchSize))
this.runner
.getCall(1)
.should.have.been.calledWith(
this.updates.slice(this.batchSize, this.batchSize * 2)
)
this.runner
.getCall(2)
.should.have.been.calledWith(this.updates.slice(this.batchSize * 2))
})
it('deletes the applied updates', function () {
expect(this.rclient.getList(this.historyOpsKey)).to.deep.equal([])
})
})
describe('error when first reading updates', function () {
beforeEach(async function () {
this.updates = makeUpdates(10)
this.rawUpdates = makeRawUpdates(this.updates)
this.rclient.setList(this.historyOpsKey, this.rawUpdates)
this.rclient.throwErrorOnLrangeCall(0)
await expect(
this.RedisManager.promises.getUpdatesInBatches(
this.projectId,
2,
this.runner
)
).to.be.rejected
})
it('does not delete any updates', function () {
expect(this.rclient.getList(this.historyOpsKey)).to.deep.equal(
this.rawUpdates
)
})
})
describe('error when reading updates for a second batch', function () {
beforeEach(async function () {
this.batchSize = this.RedisManager.RAW_UPDATES_BATCH_SIZE - 1
this.updates = makeUpdates(this.RedisManager.RAW_UPDATES_BATCH_SIZE * 2)
this.rawUpdates = makeRawUpdates(this.updates)
this.rclient.setList(this.historyOpsKey, this.rawUpdates)
this.rclient.throwErrorOnLrangeCall(1)
await expect(
this.RedisManager.promises.getUpdatesInBatches(
this.projectId,
this.batchSize,
this.runner
)
).to.be.rejected
})
it('calls the runner with the first batch of updates', function () {
this.runner.should.have.been.calledOnce
this.runner
.getCall(0)
.should.have.been.calledWith(this.updates.slice(0, this.batchSize))
})
it('deletes only the first batch of applied updates', function () {
expect(this.rclient.getList(this.historyOpsKey)).to.deep.equal(
this.rawUpdates.slice(this.batchSize)
)
})
})
})
})
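// Minimal in-memory stand-in for the redis client used above. It implements
// just what RedisManager needs here: lrange returns a slice of the stored
// list (with an optional injected failure for the error-path tests), lrem
// mimics redis LREM with a positive count by removing up to `count` matching
// elements from the head, and multi()/exec() are no-ops so batched deletes
// resolve immediately.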
class FakeRedis {
constructor() {
this.data = new Map()
this.del = sinon.stub()
this.lrangeCallCount = -1
}
setList(key, list) {
this.data.set(key, list)
}
getList(key) {
return this.data.get(key)
}
throwErrorOnLrangeCall(callNum) {
this.lrangeCallThrowingError = callNum
}
async lrange(key, start, stop) {
this.lrangeCallCount += 1
if (
this.lrangeCallThrowingError != null &&
this.lrangeCallThrowingError === this.lrangeCallCount
) {
throw new Error('LRANGE failed!')
}
const list = this.data.get(key) ?? []
return list.slice(start, stop + 1)
}
async lrem(key, count, elementToRemove) {
expect(count).to.be.greaterThan(0)
const original = this.data.get(key) ?? []
const filtered = original.filter(element => {
if (count > 0 && element === elementToRemove) {
count--
return false
}
return true
})
this.data.set(key, filtered)
}
async exec() {
// Nothing to do
}
multi() {
return this
}
}
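// Fixture helpers: makeUpdates builds `updateCount` updates with increasing
// version numbers plus any extra fields, and makeRawUpdates serializes them
// to JSON strings as they would sit in the redis queue.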
function makeUpdates(updateCount, extraFields = {}) {
const updates = []
for (let i = 0; i < updateCount; i++) {
updates.push({ v: i, ...extraFields })
}
return updates
}
function makeRawUpdates(updates) {
return updates.map(JSON.stringify)
}


@@ -0,0 +1,145 @@
import sinon from 'sinon'
import { expect } from 'chai'
import mongodb from 'mongodb-legacy'
import { strict as esmock } from 'esmock'
const { ObjectId } = mongodb
const MODULE_PATH = '../../../../app/js/RetryManager.js'
describe('RetryManager', function () {
beforeEach(async function () {
this.projectId1 = new ObjectId().toString()
this.projectId2 = new ObjectId().toString()
this.projectId3 = new ObjectId().toString()
this.projectId4 = new ObjectId().toString()
this.historyId = 12345
this.WebApiManager = {
promises: {
getHistoryId: sinon.stub().resolves(this.historyId),
},
}
this.RedisManager = {
promises: {
countUnprocessedUpdates: sinon.stub().resolves(0),
},
}
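// Four failure records driving the scenarios below: a fresh soft failure, a
// hard failure with no resync attempted yet, one where a soft resync has
// already been tried (resyncAttempts: 1), and one where a hard resync has
// also failed (resyncAttempts: 2) so no further resync should be started.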
this.ErrorRecorder = {
promises: {
getFailedProjects: sinon.stub().resolves([
{
project_id: this.projectId1,
error: 'Error: Timeout',
attempts: 1,
},
{
project_id: this.projectId2,
error: 'Error: Timeout',
attempts: 25,
},
{
project_id: this.projectId3,
error: 'sync ongoing',
attempts: 10,
resyncAttempts: 1,
},
{
project_id: this.projectId4,
error: 'sync ongoing',
attempts: 10,
resyncAttempts: 2,
},
]),
getFailureRecord: sinon.stub().resolves(),
},
}
this.SyncManager = {
promises: {
startResync: sinon.stub().resolves(),
startHardResync: sinon.stub().resolves(),
},
}
this.UpdatesProcessor = {
promises: {
processUpdatesForProject: sinon.stub().resolves(),
},
}
this.settings = {
redis: {
lock: {
key_schema: {
projectHistoryLock({ projectId }) {
return `ProjectHistoryLock:${projectId}`
},
},
},
},
}
this.request = {}
this.RetryManager = await esmock(MODULE_PATH, {
'../../../../app/js/WebApiManager.js': this.WebApiManager,
'../../../../app/js/RedisManager.js': this.RedisManager,
'../../../../app/js/ErrorRecorder.js': this.ErrorRecorder,
'../../../../app/js/SyncManager.js': this.SyncManager,
'../../../../app/js/UpdatesProcessor.js': this.UpdatesProcessor,
'@overleaf/settings': this.settings,
request: this.request,
})
})
describe('RetryManager', function () {
describe('for a soft failure', function () {
beforeEach(async function () {
await this.RetryManager.promises.retryFailures({ failureType: 'soft' })
})
it('should flush the queue', function () {
expect(
this.UpdatesProcessor.promises.processUpdatesForProject
).to.have.been.calledWith(this.projectId1)
})
})
describe('for a hard failure', function () {
beforeEach(async function () {
await this.RetryManager.promises.retryFailures({ failureType: 'hard' })
})
it('should check the overleaf project id', function () {
expect(
this.WebApiManager.promises.getHistoryId
).to.have.been.calledWith(this.projectId2)
})
it("should start a soft resync when a resync hasn't been tried yet", function () {
expect(this.SyncManager.promises.startResync).to.have.been.calledWith(
this.projectId2
)
})
it('should start a hard resync when a resync has already been tried', function () {
expect(
this.SyncManager.promises.startHardResync
).to.have.been.calledWith(this.projectId3)
})
it("shouldn't try a resync after a hard resync attempt failed", function () {
expect(
this.SyncManager.promises.startHardResync
).not.to.have.been.calledWith(this.projectId4)
})
it('should count the unprocessed updates', function () {
expect(
this.RedisManager.promises.countUnprocessedUpdates
).to.have.been.calledWith(this.projectId2)
})
it('should check the failure record', function () {
expect(
this.ErrorRecorder.promises.getFailureRecord
).to.have.been.calledWith(this.projectId2)
})
})
})
})

File diff suppressed because it is too large


@@ -0,0 +1,874 @@
import sinon from 'sinon'
import { expect } from 'chai'
import { strict as esmock } from 'esmock'
const MODULE_PATH = '../../../../app/js/SummarizedUpdatesManager.js'
// A sufficiently large amount of time to make the algorithm process updates
// separately
const LATER = 1000000
describe('SummarizedUpdatesManager', function () {
beforeEach(async function () {
this.historyId = 'history-id-123'
this.projectId = 'project-id-123'
this.firstChunk = { chunk: { startVersion: 0 } }
this.secondChunk = { chunk: { startVersion: 1 } }
this.ChunkTranslator = {
convertToSummarizedUpdates: sinon.stub(),
}
this.HistoryApiManager = {
shouldUseProjectHistory: sinon.stub().yields(null, true),
}
this.HistoryStoreManager = {
getMostRecentChunk: sinon.stub(),
getChunkAtVersion: sinon.stub(),
}
this.UpdatesProcessor = {
processUpdatesForProject: sinon.stub().withArgs(this.projectId).yields(),
}
this.WebApiManager = {
getHistoryId: sinon.stub().yields(null, this.historyId),
}
this.LabelsManager = {
getLabels: sinon.stub().yields(null, []),
}
this.SummarizedUpdatesManager = await esmock(MODULE_PATH, {
'../../../../app/js/ChunkTranslator.js': this.ChunkTranslator,
'../../../../app/js/HistoryApiManager.js': this.HistoryApiManager,
'../../../../app/js/HistoryStoreManager.js': this.HistoryStoreManager,
'../../../../app/js/UpdatesProcessor.js': this.UpdatesProcessor,
'../../../../app/js/WebApiManager.js': this.WebApiManager,
'../../../../app/js/LabelsManager.js': this.LabelsManager,
})
this.callback = sinon.stub()
})
describe('getSummarizedProjectUpdates', function () {
describe('chunk management', function () {
describe('when there is a single empty chunk', function () {
setupChunks([[]])
expectSummaries('returns an empty list of updates', {}, [])
})
describe('when there is a single non-empty chunk', function () {
setupChunks([[makeUpdate()]])
expectSummaries('returns summarized updates', {}, [makeSummary()])
})
describe('when there are multiple chunks', function () {
setupChunks([
[makeUpdate({ startTs: 0, v: 1 })],
[makeUpdate({ startTs: LATER, v: 2 })],
])
describe('and requesting many summaries', function () {
expectSummaries('returns many update summaries', {}, [
makeSummary({ startTs: LATER, fromV: 2 }),
makeSummary({ startTs: 0, fromV: 1 }),
])
})
describe('and requesting a single summary', function () {
expectSummaries('returns a single update summary', { min_count: 1 }, [
makeSummary({ startTs: LATER, fromV: 2 }),
])
})
})
describe('when there are too many chunks', function () {
// Set up 10 chunks
const chunks = []
for (let v = 1; v <= 10; v++) {
chunks.push([
makeUpdate({
startTs: v * 100, // values: 100 - 1000
v, // values: 1 - 10
}),
])
}
setupChunks(chunks)
// Verify that we stop summarizing after 5 chunks
expectSummaries('summarizes the 5 latest chunks', {}, [
makeSummary({ startTs: 600, endTs: 1010, fromV: 6, toV: 11 }),
])
})
describe('when requesting updates before a specific version', function () {
// Chunk 1 contains 5 updates that were made close to each other and 5
// other updates that were made later.
const chunk1 = []
for (let v = 1; v <= 5; v++) {
chunk1.push(
makeUpdate({
startTs: v * 100, // values: 100 - 500
v, // values: 1 - 5
})
)
}
for (let v = 6; v <= 10; v++) {
chunk1.push(
makeUpdate({
startTs: LATER + v * 100, // values: 1000600 - 1001000
v, // values: 6 - 10
})
)
}
// Chunk 2 contains 5 updates that were made close to the latest updates in
// chunk 1.
const chunk2 = []
for (let v = 11; v <= 15; v++) {
chunk2.push(
makeUpdate({
startTs: LATER + v * 100, // values: 1001100 - 1001500
v, // values: 11 - 15
})
)
}
setupChunks([chunk1, chunk2])
expectSummaries(
'summarizes the updates in a single chunk if the chunk is sufficient',
{ before: 14, min_count: 1 },
[
makeSummary({
startTs: LATER + 1100,
endTs: LATER + 1310,
fromV: 11,
toV: 14,
}),
]
)
expectSummaries(
'summarizes the updates in many chunks otherwise',
{ before: 14, min_count: 2 },
[
makeSummary({
startTs: LATER + 600,
endTs: LATER + 1310,
fromV: 6,
toV: 14,
}),
makeSummary({
startTs: 100,
endTs: 510,
fromV: 1,
toV: 6,
}),
]
)
})
})
describe('update summarization', function () {
describe('updates that are close in time', function () {
setupChunks([
[
makeUpdate({
users: ['user1'],
startTs: 0,
v: 4,
}),
makeUpdate({
users: ['user2'],
startTs: 20,
v: 5,
}),
],
])
expectSummaries('should merge the updates', {}, [
makeSummary({
users: ['user1', 'user2'],
startTs: 0,
endTs: 30,
fromV: 4,
toV: 6,
}),
])
})
describe('updates that are far apart in time', function () {
setupChunks([
[
makeUpdate({
users: ['user1'],
startTs: 100,
v: 4,
}),
makeUpdate({
users: ['user2'],
startTs: LATER,
v: 5,
}),
],
])
expectSummaries('should not merge the updates', {}, [
makeSummary({
users: ['user2'],
startTs: LATER,
endTs: LATER + 10,
fromV: 5,
toV: 6,
}),
makeSummary({
users: ['user1'],
startTs: 100,
endTs: 110,
fromV: 4,
toV: 5,
}),
])
})
describe('mergeable updates in different chunks', function () {
setupChunks([
[
makeUpdate({
pathnames: ['main.tex'],
users: ['user1'],
startTs: 10,
v: 4,
}),
makeUpdate({
pathnames: ['main.tex'],
users: ['user2'],
startTs: 30,
v: 5,
}),
],
[
makeUpdate({
pathnames: ['chapter.tex'],
users: ['user1'],
startTs: 40,
v: 6,
}),
makeUpdate({
pathnames: ['chapter.tex'],
users: ['user1'],
startTs: 50,
v: 7,
}),
],
])
expectSummaries('should merge the updates', {}, [
makeSummary({
pathnames: ['main.tex', 'chapter.tex'],
users: ['user1', 'user2'],
startTs: 10,
endTs: 60,
fromV: 4,
toV: 8,
}),
])
})
describe('null user values after regular users', function () {
setupChunks([
[
makeUpdate({
users: ['user1'],
startTs: 0,
v: 4,
}),
makeUpdate({
users: [null],
startTs: 20,
v: 5,
}),
],
])
expectSummaries('should include the null values', {}, [
makeSummary({
users: [null, 'user1'],
startTs: 0,
endTs: 30,
fromV: 4,
toV: 6,
}),
])
})
describe('null user values before regular users', function () {
setupChunks([
[
makeUpdate({
users: [null],
startTs: 0,
v: 4,
}),
makeUpdate({
users: ['user1'],
startTs: 20,
v: 5,
}),
],
])
expectSummaries('should include the null values', {}, [
makeSummary({
users: [null, 'user1'],
startTs: 0,
endTs: 30,
fromV: 4,
toV: 6,
}),
])
})
describe('multiple null user values', function () {
setupChunks([
[
makeUpdate({
users: ['user1'],
startTs: 10,
v: 4,
}),
makeUpdate({
users: [null],
startTs: 20,
v: 5,
}),
makeUpdate({
users: [null],
startTs: 70,
v: 6,
}),
],
])
expectSummaries('should merge the null values', {}, [
makeSummary({
users: [null, 'user1'],
startTs: 10,
endTs: 80,
fromV: 4,
toV: 7,
}),
])
})
describe('multiple users', function () {
setupChunks([
[
makeUpdate({
users: ['user1'],
startTs: 0,
v: 4,
}),
makeUpdate({
users: ['user2'],
startTs: 20,
v: 5,
}),
],
])
expectSummaries('should merge the users', {}, [
makeSummary({
users: ['user1', 'user2'],
startTs: 0,
endTs: 30,
fromV: 4,
toV: 6,
}),
])
})
describe('duplicate updates with the same v1 user', function () {
setupChunks([
[
makeUpdate({
users: [{ id: 'user1' }],
startTs: 0,
v: 4,
}),
makeUpdate({
users: [{ id: 'user1' }],
startTs: 20,
v: 5,
}),
],
])
expectSummaries('should deduplicate the users', {}, [
makeSummary({
users: [{ id: 'user1' }],
startTs: 0,
endTs: 30,
fromV: 4,
toV: 6,
}),
])
})
describe('duplicate updates with the same v2 user', function () {
setupChunks([
[
makeUpdate({
users: ['user1'],
startTs: 0,
v: 4,
}),
makeUpdate({
users: ['user1'],
startTs: 20,
v: 5,
}),
],
])
expectSummaries('should deduplicate the users', {}, [
makeSummary({
users: ['user1'],
startTs: 0,
endTs: 30,
fromV: 4,
toV: 6,
}),
])
})
describe('mixed v1 and v2 users with the same id', function () {
setupChunks([
[
makeUpdate({
users: ['user1'],
startTs: 0,
v: 4,
}),
makeUpdate({
users: [{ id: 'user1' }],
startTs: 20,
v: 5,
}),
],
])
expectSummaries('should deduplicate the users', {}, [
makeSummary({
users: [{ id: 'user1' }],
startTs: 0,
endTs: 30,
fromV: 4,
toV: 6,
}),
])
})
describe('project ops in mergeable updates', function () {
setupChunks([
[
makeUpdate({
pathnames: [],
projectOps: [
{ rename: { pathname: 'C.tex', newPathname: 'D.tex' } },
],
users: ['user2'],
startTs: 0,
v: 4,
}),
makeUpdate({
pathnames: [],
projectOps: [
{ rename: { pathname: 'A.tex', newPathname: 'B.tex' } },
],
users: ['user1'],
startTs: 20,
v: 5,
}),
],
])
expectSummaries('should merge project ops', {}, [
makeSummary({
pathnames: [],
projectOps: [
{
atV: 5,
rename: {
pathname: 'A.tex',
newPathname: 'B.tex',
},
},
{
atV: 4,
rename: {
pathname: 'C.tex',
newPathname: 'D.tex',
},
},
],
users: ['user1', 'user2'],
startTs: 0,
endTs: 30,
fromV: 4,
toV: 6,
}),
])
})
describe('mergeable updates with a mix of project ops and doc ops', function () {
setupChunks([
[
makeUpdate({
pathnames: ['main.tex'],
users: ['user1'],
startTs: 0,
v: 4,
}),
makeUpdate({
pathnames: [],
users: ['user2'],
projectOps: [
{ rename: { pathname: 'A.tex', newPathname: 'B.tex' } },
],
startTs: 20,
v: 5,
}),
makeUpdate({
pathnames: ['chapter.tex'],
users: ['user2'],
startTs: 40,
v: 6,
}),
],
])
expectSummaries('should keep updates separate', {}, [
makeSummary({
pathnames: ['chapter.tex'],
users: ['user2'],
startTs: 40,
fromV: 6,
}),
makeSummary({
pathnames: [],
users: ['user2'],
projectOps: [
{ atV: 5, rename: { pathname: 'A.tex', newPathname: 'B.tex' } },
],
startTs: 20,
fromV: 5,
}),
makeSummary({
pathnames: ['main.tex'],
users: ['user1'],
startTs: 0,
fromV: 4,
}),
])
})
describe('label on an update', function () {
const label = {
id: 'mock-id',
comment: 'an example comment',
version: 5,
}
setupChunks([
[
makeUpdate({ startTs: 0, v: 3 }),
makeUpdate({ startTs: 20, v: 4 }),
makeUpdate({ startTs: 40, v: 5 }),
makeUpdate({ startTs: 60, v: 6 }),
],
])
setupLabels([label])
expectSummaries('should split the updates at the label', {}, [
makeSummary({ startTs: 40, endTs: 70, fromV: 5, toV: 7 }),
makeSummary({
startTs: 0,
endTs: 30,
fromV: 3,
toV: 5,
labels: [label],
}),
])
})
describe('updates with origin', function () {
setupChunks([
[
makeUpdate({ startTs: 0, v: 1 }),
makeUpdate({ startTs: 10, v: 2 }),
makeUpdate({
startTs: 20,
v: 3,
origin: { kind: 'history-resync' },
}),
makeUpdate({
startTs: 30,
v: 4,
origin: { kind: 'history-resync' },
}),
makeUpdate({ startTs: 40, v: 5 }),
makeUpdate({ startTs: 50, v: 6 }),
],
])
expectSummaries(
'should split the updates where the origin appears or disappears',
{},
[
makeSummary({ startTs: 40, endTs: 60, fromV: 5, toV: 7 }),
makeSummary({
startTs: 20,
endTs: 40,
fromV: 3,
toV: 5,
origin: { kind: 'history-resync' },
}),
makeSummary({ startTs: 0, endTs: 20, fromV: 1, toV: 3 }),
]
)
})
describe('updates with different origins', function () {
setupChunks([
[
makeUpdate({ startTs: 0, v: 1, origin: { kind: 'origin-a' } }),
makeUpdate({ startTs: 10, v: 2, origin: { kind: 'origin-a' } }),
makeUpdate({ startTs: 20, v: 3, origin: { kind: 'origin-b' } }),
makeUpdate({ startTs: 30, v: 4, origin: { kind: 'origin-b' } }),
],
])
expectSummaries(
'should split the updates when the origin kind changes',
{},
[
makeSummary({
startTs: 20,
endTs: 40,
fromV: 3,
toV: 5,
origin: { kind: 'origin-b' },
}),
makeSummary({
startTs: 0,
endTs: 20,
fromV: 1,
toV: 3,
origin: { kind: 'origin-a' },
}),
]
)
})
describe('empty updates', function () {
setupChunks([
[
makeUpdate({ startTs: 0, v: 1, pathnames: ['main.tex'] }),
makeUpdate({ startTs: 10, v: 2, pathnames: [] }),
makeUpdate({ startTs: 20, v: 3, pathnames: ['main.tex'] }),
makeUpdate({ startTs: 30, v: 4, pathnames: [] }),
makeUpdate({ startTs: 40, v: 5, pathnames: [] }),
],
[
makeUpdate({ startTs: 50, v: 6, pathnames: [] }),
makeUpdate({ startTs: LATER, v: 7, pathnames: [] }),
makeUpdate({ startTs: LATER + 10, v: 8, pathnames: ['main.tex'] }),
makeUpdate({ startTs: LATER + 20, v: 9, pathnames: ['main.tex'] }),
makeUpdate({ startTs: LATER + 30, v: 10, pathnames: [] }),
],
])
expectSummaries('should skip empty updates', {}, [
makeSummary({
startTs: LATER + 10,
endTs: LATER + 30,
fromV: 8,
toV: 11,
}),
makeSummary({ startTs: 0, endTs: 30, fromV: 1, toV: 8 }),
])
})
describe('history resync updates', function () {
setupChunks([
[
makeUpdate({
startTs: 0,
v: 1,
origin: { kind: 'history-resync' },
projectOps: [{ add: { pathname: 'file1.tex' } }],
pathnames: [],
}),
makeUpdate({
startTs: 20,
v: 2,
origin: { kind: 'history-resync' },
projectOps: [
{ add: { pathname: 'file2.tex' } },
{ add: { pathname: 'file3.tex' } },
],
pathnames: [],
}),
makeUpdate({
startTs: 40,
v: 3,
origin: { kind: 'history-resync' },
projectOps: [{ add: { pathname: 'file4.tex' } }],
pathnames: [],
}),
makeUpdate({
startTs: 60,
v: 4,
origin: { kind: 'history-resync' },
projectOps: [],
pathnames: ['file1.tex', 'file2.tex', 'file5.tex'],
}),
makeUpdate({
startTs: 80,
v: 5,
origin: { kind: 'history-resync' },
projectOps: [],
pathnames: ['file4.tex'],
}),
makeUpdate({ startTs: 100, v: 6, pathnames: ['file1.tex'] }),
],
])
expectSummaries('should merge creates and edits', {}, [
makeSummary({
startTs: 100,
endTs: 110,
fromV: 6,
toV: 7,
pathnames: ['file1.tex'],
}),
makeSummary({
startTs: 0,
endTs: 90,
fromV: 1,
toV: 6,
origin: { kind: 'history-resync' },
pathnames: ['file5.tex'],
projectOps: [
{ add: { pathname: 'file4.tex' }, atV: 3 },
{ add: { pathname: 'file2.tex' }, atV: 2 },
{ add: { pathname: 'file3.tex' }, atV: 2 },
{ add: { pathname: 'file1.tex' }, atV: 1 },
],
}),
])
})
})
})
})
/**
* Set up mocks as if the project had a number of chunks.
*
 * Each element of `updatesByChunk` represents a chunk, and its value is the
 * list of updates in that chunk.
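 *
 * @example
 * // Two chunks: the first with two updates, the second with one
 * // (illustrative only; mirrors how the tests above call it).
 * setupChunks([
 *   [makeUpdate({ startTs: 0, v: 1 }), makeUpdate({ startTs: 10, v: 2 })],
 *   [makeUpdate({ startTs: 20, v: 3 })],
 * ])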
*/
function setupChunks(updatesByChunk) {
beforeEach('set up chunks', function () {
let startVersion = 0
for (let i = 0; i < updatesByChunk.length; i++) {
const updates = updatesByChunk[i]
const chunk = { chunk: { startVersion } }
// Find the chunk by any update version
for (const update of updates) {
this.HistoryStoreManager.getChunkAtVersion
.withArgs(this.projectId, this.historyId, update.v)
.yields(null, chunk)
startVersion = update.v
}
if (i === updatesByChunk.length - 1) {
this.HistoryStoreManager.getMostRecentChunk
.withArgs(this.projectId, this.historyId)
.yields(null, chunk)
}
this.ChunkTranslator.convertToSummarizedUpdates
.withArgs(chunk)
.yields(null, updates)
}
})
}
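/**
 * Stub LabelsManager.getLabels so that it returns the given labels for the
 * project under test.
 */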
function setupLabels(labels) {
beforeEach('set up labels', function () {
this.LabelsManager.getLabels.withArgs(this.projectId).yields(null, labels)
})
}
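/**
 * Register a test case that fetches the summarized project updates and
 * deep-compares them with the expected summaries. The users arrays are
 * sorted on both sides because their order is not significant.
 */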
function expectSummaries(description, options, expectedSummaries) {
it(`${description}`, function (done) {
this.SummarizedUpdatesManager.getSummarizedProjectUpdates(
this.projectId,
options,
(err, summaries) => {
if (err) {
return done(err)
}
// The order of the users array is not significant
for (const summary of summaries) {
summary.meta.users.sort()
}
for (const summary of expectedSummaries) {
summary.meta.users.sort()
}
expect(summaries).to.deep.equal(expectedSummaries)
done()
}
)
})
}
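/**
 * Build a raw update fixture. With the default options it produces:
 *
 *   {
 *     pathnames: ['main.tex'],
 *     project_ops: [],
 *     meta: { users: ['user1'], start_ts: 0, end_ts: 10 },
 *     v: 1,
 *   }
 */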
function makeUpdate(options = {}) {
const {
pathnames = ['main.tex'],
users = ['user1'],
projectOps = [],
startTs = 0,
endTs = startTs + 10,
v = 1,
origin,
} = options
const update = {
pathnames,
project_ops: projectOps,
meta: { users, start_ts: startTs, end_ts: endTs },
v,
}
if (origin) {
update.meta.origin = origin
}
return update
}
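/**
 * Build an expected summary fixture. With the default options it produces:
 *
 *   {
 *     pathnames: new Set(['main.tex']),
 *     meta: { users: ['user1'], start_ts: 0, end_ts: 10 },
 *     fromV: 1,
 *     toV: 2,
 *     labels: [],
 *     project_ops: [],
 *   }
 */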
function makeSummary(options = {}) {
const {
pathnames = ['main.tex'],
users = ['user1'],
startTs = 0,
endTs = startTs + 10,
fromV = 1,
toV = fromV + 1,
labels = [],
projectOps = [],
origin,
} = options
const summary = {
pathnames: new Set(pathnames),
meta: {
users,
start_ts: startTs,
end_ts: endTs,
},
fromV,
toV,
labels,
project_ops: projectOps,
}
if (origin) {
summary.meta.origin = origin
}
return summary
}

File diff suppressed because it is too large

File diff suppressed because it is too large

File diff suppressed because it is too large

View File

@@ -0,0 +1,552 @@
import sinon from 'sinon'
import { expect } from 'chai'
import { strict as esmock } from 'esmock'
import * as Errors from '../../../../app/js/Errors.js'
const MODULE_PATH = '../../../../app/js/UpdatesProcessor.js'
describe('UpdatesProcessor', function () {
before(async function () {
this.extendLock = sinon.stub()
this.BlobManager = {
createBlobsForUpdates: sinon.stub(),
}
this.HistoryStoreManager = {
getMostRecentVersion: sinon.stub(),
sendChanges: sinon.stub().yields(),
}
this.LockManager = {
runWithLock: sinon.spy((key, runner, callback) =>
runner(this.extendLock, callback)
),
}
this.RedisManager = {}
this.UpdateCompressor = {
compressRawUpdates: sinon.stub(),
}
this.UpdateTranslator = {
convertToChanges: sinon.stub(),
isProjectStructureUpdate: sinon.stub(),
isTextUpdate: sinon.stub(),
}
this.WebApiManager = {
getHistoryId: sinon.stub(),
}
this.SyncManager = {
expandSyncUpdates: sinon.stub(),
setResyncState: sinon.stub().yields(),
skipUpdatesDuringSync: sinon.stub(),
}
this.ErrorRecorder = {
getLastFailure: sinon.stub(),
record: sinon.stub().yields(null, { attempts: 1 }),
}
this.RetryManager = {
isFirstFailure: sinon.stub().returns(true),
isHardFailure: sinon.stub().returns(false),
}
this.Profiler = {
Profiler: class {
log() {
return this
}
wrap(label, cb) {
return cb
}
getTimeDelta() {
return 0
}
end() {
return 0
}
},
}
this.Metrics = {
gauge: sinon.stub(),
inc: sinon.stub(),
timing: sinon.stub(),
}
this.Settings = {
redis: {
lock: {
key_schema: {
projectHistoryLock({ project_id: projectId }) {
return `ProjectHistoryLock:${projectId}`
},
},
},
},
}
this.UpdatesProcessor = await esmock(MODULE_PATH, {
'../../../../app/js/BlobManager.js': this.BlobManager,
'../../../../app/js/HistoryStoreManager.js': this.HistoryStoreManager,
'../../../../app/js/LockManager.js': this.LockManager,
'../../../../app/js/RedisManager.js': this.RedisManager,
'../../../../app/js/UpdateCompressor.js': this.UpdateCompressor,
'../../../../app/js/UpdateTranslator.js': this.UpdateTranslator,
'../../../../app/js/WebApiManager.js': this.WebApiManager,
'../../../../app/js/SyncManager.js': this.SyncManager,
'../../../../app/js/ErrorRecorder.js': this.ErrorRecorder,
'../../../../app/js/Profiler.js': this.Profiler,
'../../../../app/js/RetryManager.js': this.RetryManager,
'../../../../app/js/Errors.js': Errors,
'@overleaf/metrics': this.Metrics,
'@overleaf/settings': this.Settings,
})
this.doc_id = 'doc-id-123'
this.project_id = 'project-id-123'
this.ol_project_id = 'ol-project-id-234'
this.callback = sinon.stub()
this.temporary = 'temp-mock'
})
describe('processUpdatesForProject', function () {
beforeEach(function () {
this.error = new Error('error')
this.queueSize = 445
this.UpdatesProcessor._mocks._countAndProcessUpdates = sinon
.stub()
.callsArgWith(3, this.error, this.queueSize)
})
describe('when there is no existing error', function () {
beforeEach(function (done) {
this.ErrorRecorder.getLastFailure.yields()
this.UpdatesProcessor.processUpdatesForProject(this.project_id, err => {
expect(err).to.equal(this.error)
done()
})
})
it('processes updates', function () {
this.UpdatesProcessor._mocks._countAndProcessUpdates
.calledWith(this.project_id)
.should.equal(true)
})
it('records errors', function () {
this.ErrorRecorder.record
.calledWith(this.project_id, this.queueSize, this.error)
.should.equal(true)
})
})
})
describe('_getHistoryId', function () {
    describe('projectHistoryId is not present in updates or web', function () {
beforeEach(function () {
this.updates = [
{ p: 0, i: 'a' },
{ p: 1, i: 's' },
]
this.WebApiManager.getHistoryId.yields(null)
})
it('returns null', function (done) {
this.UpdatesProcessor._getHistoryId(
this.project_id,
this.updates,
(error, projectHistoryId) => {
expect(error).to.be.null
expect(projectHistoryId).to.be.null
done()
}
)
})
})
describe('projectHistoryId is not present in updates', function () {
beforeEach(function () {
this.updates = [
{ p: 0, i: 'a' },
{ p: 1, i: 's' },
]
})
it('returns the id from web', function (done) {
this.projectHistoryId = '1234'
this.WebApiManager.getHistoryId.yields(null, this.projectHistoryId)
this.UpdatesProcessor._getHistoryId(
this.project_id,
this.updates,
(error, projectHistoryId) => {
expect(error).to.be.null
expect(projectHistoryId).equal(this.projectHistoryId)
done()
}
)
})
it('returns errors from web', function (done) {
this.error = new Error('oh no!')
this.WebApiManager.getHistoryId.yields(this.error)
this.UpdatesProcessor._getHistoryId(
this.project_id,
this.updates,
error => {
expect(error).to.equal(this.error)
done()
}
)
})
})
describe('projectHistoryId is present in some updates', function () {
beforeEach(function () {
this.projectHistoryId = '1234'
this.updates = [
{ p: 0, i: 'a' },
{ p: 1, i: 's', projectHistoryId: this.projectHistoryId },
{ p: 2, i: 'd', projectHistoryId: this.projectHistoryId },
]
})
it('returns an error if the id is inconsistent between updates', function (done) {
this.updates[1].projectHistoryId = 2345
this.UpdatesProcessor._getHistoryId(
this.project_id,
this.updates,
error => {
expect(error.message).to.equal(
'inconsistent project history id between updates'
)
done()
}
)
})
it('returns an error if the id is inconsistent between updates and web', function (done) {
this.WebApiManager.getHistoryId.yields(null, 2345)
this.UpdatesProcessor._getHistoryId(
this.project_id,
this.updates,
error => {
expect(error.message).to.equal(
'inconsistent project history id between updates and web'
)
done()
}
)
})
it('returns the id if it is consistent between updates and web', function (done) {
this.WebApiManager.getHistoryId.yields(null, this.projectHistoryId)
this.UpdatesProcessor._getHistoryId(
this.project_id,
this.updates,
(error, projectHistoryId) => {
expect(error).to.be.null
expect(projectHistoryId).equal(this.projectHistoryId)
done()
}
)
})
      it('returns the id if it is consistent between updates but unavailable in web', function (done) {
this.WebApiManager.getHistoryId.yields(new Error('oh no!'))
this.UpdatesProcessor._getHistoryId(
this.project_id,
this.updates,
(error, projectHistoryId) => {
expect(error).to.be.null
expect(projectHistoryId).equal(this.projectHistoryId)
done()
}
)
})
})
})
describe('_processUpdates', function () {
beforeEach(function () {
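      // Stub out the pipeline that _processUpdates drives, in call order:
      // getMostRecentVersion -> skipUpdatesDuringSync -> expandSyncUpdates ->
      // compressRawUpdates -> createBlobsForUpdates -> convertToChanges ->
      // sendChanges, then setResyncState.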
this.mostRecentVersionInfo = { version: 1 }
this.rawUpdates = ['raw updates']
this.expandedUpdates = ['expanded updates']
this.filteredUpdates = ['filtered updates']
this.compressedUpdates = ['compressed updates']
this.updatesWithBlobs = ['updates with blob']
this.changes = [
{
toRaw() {
return 'change'
},
},
]
this.newSyncState = { resyncProjectStructure: false }
this.extendLock = sinon.stub().yields()
this.mostRecentChunk = 'fake-chunk'
this.HistoryStoreManager.getMostRecentVersion.yields(
null,
this.mostRecentVersionInfo,
null,
'_lastChange',
this.mostRecentChunk
)
this.SyncManager.skipUpdatesDuringSync.yields(
null,
this.filteredUpdates,
this.newSyncState
)
this.SyncManager.expandSyncUpdates.callsArgWith(
5,
null,
this.expandedUpdates
)
this.UpdateCompressor.compressRawUpdates.returns(this.compressedUpdates)
this.BlobManager.createBlobsForUpdates.callsArgWith(
4,
null,
this.updatesWithBlobs
)
this.UpdateTranslator.convertToChanges.returns(this.changes)
})
describe('happy path', function () {
beforeEach(function (done) {
this.UpdatesProcessor._processUpdates(
this.project_id,
this.ol_project_id,
this.rawUpdates,
this.extendLock,
err => {
this.callback(err)
done()
}
)
})
it('should get the latest version id', function () {
this.HistoryStoreManager.getMostRecentVersion.should.have.been.calledWith(
this.project_id,
this.ol_project_id
)
})
it('should skip updates when resyncing', function () {
this.SyncManager.skipUpdatesDuringSync.should.have.been.calledWith(
this.project_id,
this.rawUpdates
)
})
it('should expand sync updates', function () {
this.SyncManager.expandSyncUpdates.should.have.been.calledWith(
this.project_id,
this.ol_project_id,
this.mostRecentChunk,
this.filteredUpdates,
this.extendLock
)
})
it('should compress updates', function () {
this.UpdateCompressor.compressRawUpdates.should.have.been.calledWith(
this.expandedUpdates
)
})
it('should create any blobs for the updates', function () {
this.BlobManager.createBlobsForUpdates.should.have.been.calledWith(
this.project_id,
this.ol_project_id,
this.compressedUpdates
)
})
      it('should convert the updates into change requests', function () {
this.UpdateTranslator.convertToChanges.should.have.been.calledWith(
this.project_id,
this.updatesWithBlobs
)
})
it('should send the change request to the history store', function () {
this.HistoryStoreManager.sendChanges.should.have.been.calledWith(
this.project_id,
this.ol_project_id,
['change']
)
})
it('should set the sync state', function () {
this.SyncManager.setResyncState.should.have.been.calledWith(
this.project_id,
this.newSyncState
)
})
it('should call the callback with no error', function () {
this.callback.should.have.been.called
})
})
describe('with an error converting changes', function () {
beforeEach(function (done) {
this.err = new Error()
this.UpdateTranslator.convertToChanges.throws(this.err)
this.callback = sinon.stub()
this.UpdatesProcessor._processUpdates(
this.project_id,
this.ol_project_id,
this.rawUpdates,
this.extendLock,
err => {
this.callback(err)
done()
}
)
})
it('should call the callback with the error', function () {
this.callback.should.have.been.calledWith(this.err)
})
})
})
describe('_skipAlreadyAppliedUpdates', function () {
before(function () {
this.UpdateTranslator.isProjectStructureUpdate.callsFake(
update => update.version != null
)
this.UpdateTranslator.isTextUpdate.callsFake(update => update.v != null)
})
describe('with all doc ops in order', function () {
before(function () {
this.updates = [
{ doc: 'id', v: 1 },
{ doc: 'id', v: 2 },
{ doc: 'id', v: 3 },
{ doc: 'id', v: 4 },
]
this.updatesToApply = this.UpdatesProcessor._skipAlreadyAppliedUpdates(
this.project_id,
this.updates,
{ docs: {} }
)
})
it('should return the original updates', function () {
expect(this.updatesToApply).to.eql(this.updates)
})
})
describe('with all project ops in order', function () {
before(function () {
this.updates = [
{ version: 1 },
{ version: 2 },
{ version: 3 },
{ version: 4 },
]
this.updatesToApply = this.UpdatesProcessor._skipAlreadyAppliedUpdates(
this.project_id,
this.updates,
{ docs: {} }
)
})
it('should return the original updates', function () {
expect(this.updatesToApply).to.eql(this.updates)
})
})
    describe('with multiple docs and project ops in order', function () {
before(function () {
this.updates = [
{ doc: 'id1', v: 1 },
{ doc: 'id1', v: 2 },
{ doc: 'id1', v: 3 },
{ doc: 'id1', v: 4 },
{ doc: 'id2', v: 1 },
{ doc: 'id2', v: 2 },
{ doc: 'id2', v: 3 },
{ doc: 'id2', v: 4 },
{ version: 1 },
{ version: 2 },
{ version: 3 },
{ version: 4 },
]
this.updatesToApply = this.UpdatesProcessor._skipAlreadyAppliedUpdates(
this.project_id,
this.updates,
{ docs: {} }
)
})
it('should return the original updates', function () {
expect(this.updatesToApply).to.eql(this.updates)
})
})
describe('with doc ops out of order', function () {
before(function () {
this.updates = [
{ doc: 'id', v: 1 },
{ doc: 'id', v: 2 },
{ doc: 'id', v: 4 },
{ doc: 'id', v: 3 },
]
this.skipFn = sinon.spy(
this.UpdatesProcessor._mocks,
'_skipAlreadyAppliedUpdates'
)
try {
this.updatesToApply =
this.UpdatesProcessor._skipAlreadyAppliedUpdates(
this.project_id,
this.updates,
{ docs: {} }
)
} catch (error) {}
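        // the expected OpsOutOfOrderError is swallowed here and asserted via
        // this.skipFn.threw() in the test below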
})
after(function () {
this.skipFn.restore()
})
it('should throw an exception', function () {
this.skipFn.threw('OpsOutOfOrderError').should.equal(true)
})
})
describe('with project ops out of order', function () {
before(function () {
this.updates = [
{ version: 1 },
{ version: 2 },
{ version: 4 },
{ version: 3 },
]
this.skipFn = sinon.spy(
this.UpdatesProcessor._mocks,
'_skipAlreadyAppliedUpdates'
)
try {
this.updatesToApply =
this.UpdatesProcessor._skipAlreadyAppliedUpdates(
this.project_id,
this.updates,
{ docs: {} }
)
} catch (error) {}
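        // the expected OpsOutOfOrderError is swallowed here and asserted via
        // this.skipFn.threw() in the test below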
})
after(function () {
this.skipFn.restore()
})
it('should throw an exception', function () {
this.skipFn.threw('OpsOutOfOrderError').should.equal(true)
})
})
})
})

View File

@@ -0,0 +1,170 @@
import { strict as esmock } from 'esmock'
const MODULE_PATH = '../../../../app/js/Versions.js'
describe('Versions', function () {
  beforeEach(async function () {
    this.Versions = await esmock(MODULE_PATH)
  })
  describe('compare', function () {
    describe('for greater major version', function () {
      it('should return +1', function () {
        this.Versions.compare('2.1', '1.1').should.equal(+1)
      })
    })
    describe('for lesser major version', function () {
      it('should return -1', function () {
        this.Versions.compare('1.1', '2.1').should.equal(-1)
      })
    })
    describe('for equal major versions with no minor version', function () {
      it('should return 0', function () {
        this.Versions.compare('2', '2').should.equal(0)
      })
    })
    describe('for equal major versions with greater minor version', function () {
      it('should return +1', function () {
        this.Versions.compare('2.3', '2.1').should.equal(+1)
      })
    })
    describe('for equal major versions with lesser minor version', function () {
      it('should return -1', function () {
        this.Versions.compare('2.1', '2.3').should.equal(-1)
      })
    })
    describe('for equal major versions with greater minor version (non lexical)', function () {
      it('should return +1', function () {
        this.Versions.compare('2.10', '2.9').should.equal(+1)
      })
    })
    describe('for equal major versions with lesser minor version (non lexical)', function () {
      it('should return -1', function () {
        this.Versions.compare('2.9', '2.10').should.equal(-1)
      })
    })
    describe('for a single major version vs a major+minor version', function () {
      it('should return +1', function () {
        this.Versions.compare('2.1', '1').should.equal(+1)
      })
    })
    describe('for a major+minor version vs a single major version', function () {
      it('should return -1', function () {
        this.Versions.compare('1', '2.1').should.equal(-1)
      })
    })
    describe('for equal major versions with greater minor version vs zero', function () {
      it('should return +1', function () {
        this.Versions.compare('2.3', '2.0').should.equal(+1)
      })
    })
    describe('for equal major versions with lesser minor version of zero', function () {
      it('should return -1', function () {
        this.Versions.compare('2.0', '2.3').should.equal(-1)
      })
    })
  })
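  // For reference, a minimal compare() consistent with the cases above.
  // This is an illustrative sketch only, not the actual implementation in
  // app/js/Versions.js:
  //
  //   function compare(a, b) {
  //     const [majorA = 0, minorA = 0] = a.split('.').map(Number)
  //     const [majorB = 0, minorB = 0] = b.split('.').map(Number)
  //     if (majorA !== majorB) return majorA > majorB ? +1 : -1
  //     if (minorA !== minorB) return minorA > minorB ? +1 : -1
  //     return 0
  //   }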
  describe('gt', function () {
    describe('for greater major version', function () {
      it('should return true', function () {
        this.Versions.gt('2.1', '1.1').should.equal(true)
      })
    })
    describe('for lesser major version', function () {
      it('should return false', function () {
        this.Versions.gt('1.1', '2.1').should.equal(false)
      })
    })
    describe('for equal major versions with no minor version', function () {
      it('should return false', function () {
        this.Versions.gt('2', '2').should.equal(false)
      })
    })
  })
  describe('gte', function () {
    describe('for greater major version', function () {
      it('should return true', function () {
        this.Versions.gte('2.1', '1.1').should.equal(true)
      })
    })
    describe('for lesser major version', function () {
      it('should return false', function () {
        this.Versions.gte('1.1', '2.1').should.equal(false)
      })
    })
    describe('for equal major versions with no minor version', function () {
      it('should return true', function () {
        this.Versions.gte('2', '2').should.equal(true)
      })
    })
  })
  describe('lt', function () {
    describe('for greater major version', function () {
      it('should return false', function () {
        this.Versions.lt('2.1', '1.1').should.equal(false)
      })
    })
    describe('for lesser major version', function () {
      it('should return true', function () {
        this.Versions.lt('1.1', '2.1').should.equal(true)
      })
    })
    describe('for equal major versions with no minor version', function () {
      it('should return false', function () {
        this.Versions.lt('2', '2').should.equal(false)
      })
    })
  })
  describe('lte', function () {
    describe('for greater major version', function () {
      it('should return false', function () {
        this.Versions.lte('2.1', '1.1').should.equal(false)
      })
    })
    describe('for lesser major version', function () {
      it('should return true', function () {
        this.Versions.lte('1.1', '2.1').should.equal(true)
      })
    })
    describe('for equal major versions with no minor version', function () {
      it('should return true', function () {
        this.Versions.lte('2', '2').should.equal(true)
      })
    })
  })
})

View File

@@ -0,0 +1,153 @@
import sinon from 'sinon'
import { expect } from 'chai'
import { strict as esmock } from 'esmock'
import { RequestFailedError } from '@overleaf/fetch-utils'
const MODULE_PATH = '../../../../app/js/WebApiManager.js'
describe('WebApiManager', function () {
beforeEach(async function () {
this.settings = {
apis: {
web: {
url: 'http://example.com',
user: 'overleaf',
pass: 'password',
},
},
}
this.userId = 'mock-user-id'
this.projectId = 'mock-project-id'
this.project = { features: 'mock-features' }
this.olProjectId = 12345
this.Metrics = { inc: sinon.stub() }
this.RedisManager = {
promises: {
getCachedHistoryId: sinon.stub(),
setCachedHistoryId: sinon.stub().resolves(),
},
}
this.FetchUtils = {
fetchNothing: sinon.stub().resolves(),
fetchJson: sinon.stub(),
RequestFailedError,
}
this.WebApiManager = await esmock(MODULE_PATH, {
'@overleaf/fetch-utils': this.FetchUtils,
'@overleaf/settings': this.settings,
'@overleaf/metrics': this.Metrics,
'../../../../app/js/RedisManager.js': this.RedisManager,
})
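    // Shorten the retry delay (assumed to be a test-only hook) so the
    // failure cases below run quickly.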
this.WebApiManager.setRetryTimeoutMs(100)
})
describe('getHistoryId', function () {
describe('when there is no cached value and the web request is successful', function () {
beforeEach(function () {
this.RedisManager.promises.getCachedHistoryId
.withArgs(this.projectId) // first call, no cached value returned
.onCall(0)
.resolves(null)
this.RedisManager.promises.getCachedHistoryId
.withArgs(this.projectId) // subsequent calls, return cached value
.resolves(this.olProjectId)
this.RedisManager.promises.getCachedHistoryId
.withArgs('mock-project-id-2') // no cached value for other project
.resolves(null)
this.FetchUtils.fetchJson.resolves({
overleaf: { history: { id: this.olProjectId } },
})
})
it('should only request project details once per project', async function () {
for (let i = 0; i < 5; i++) {
await this.WebApiManager.promises.getHistoryId(this.projectId)
}
this.FetchUtils.fetchJson.should.have.been.calledOnce
await this.WebApiManager.promises.getHistoryId('mock-project-id-2')
this.FetchUtils.fetchJson.should.have.been.calledTwice
})
it('should cache the history id', async function () {
const olProjectId = await this.WebApiManager.promises.getHistoryId(
this.projectId
)
this.RedisManager.promises.setCachedHistoryId
.calledWith(this.projectId, olProjectId)
.should.equal(true)
})
it("should return the project's history id", async function () {
const olProjectId = await this.WebApiManager.promises.getHistoryId(
this.projectId
)
expect(this.FetchUtils.fetchJson).to.have.been.calledWithMatch(
`${this.settings.apis.web.url}/project/${this.projectId}/details`,
{
basicAuth: {
user: this.settings.apis.web.user,
password: this.settings.apis.web.pass,
},
}
)
expect(olProjectId).to.equal(this.olProjectId)
})
})
describe('when the web API returns an error', function () {
beforeEach(function () {
this.error = new Error('something went wrong')
this.FetchUtils.fetchJson.rejects(this.error)
this.RedisManager.promises.getCachedHistoryId.resolves(null)
})
it('should throw an error', async function () {
await expect(
this.WebApiManager.promises.getHistoryId(this.projectId)
).to.be.rejectedWith(this.error)
})
})
describe('when web returns a 404', function () {
beforeEach(function () {
this.FetchUtils.fetchJson.rejects(
new RequestFailedError(
'http://some-url',
{},
{ status: 404 },
'Not found'
)
)
this.RedisManager.promises.getCachedHistoryId.resolves(null)
})
it('should throw an error', async function () {
await expect(
this.WebApiManager.promises.getHistoryId(this.projectId)
).to.be.rejectedWith('got a 404 from web api')
})
})
describe('when web returns a failure error code', function () {
beforeEach(function () {
this.RedisManager.promises.getCachedHistoryId.resolves(null)
this.FetchUtils.fetchJson.rejects(
new RequestFailedError(
'http://some-url',
{},
{ status: 500 },
'Error'
)
)
})
it('should throw an error', async function () {
await expect(
this.WebApiManager.promises.getHistoryId(this.projectId)
).to.be.rejectedWith(RequestFailedError)
})
})
})
})