\documentclass[11pt]{article}
\usepackage{fullpage}
\usepackage{graphicx}
\usepackage{amsmath,amsfonts,amsthm,amssymb,amscd,verbatim,graphicx,fancyhdr,tikz}
\usetikzlibrary{arrows}
\usetikzlibrary{shapes.geometric}
\usetikzlibrary{decorations.pathmorphing}
\usepackage[capitalise,nameinlink]{cleveref}
\usepackage{mathtools} %for minus plus sign alignment in matrices and vectors, added 13 Feb 2017
\tikzset{squiggle/.style={decorate, decoration=snake}}
\usepackage{algorithm}
\usepackage{algpseudocode}
% \usepackage{algorithmicx}
\usepackage{listings}
\crefname{lemma}{Lemma}{Lemmas}
\crefname{fact}{Fact}{Facts}
\crefname{theorem}{Theorem}{Theorems}
\crefname{corollary}{Corollary}{Corollaries}
\crefname{claim}{Claim}{Claims}
\crefname{example}{Example}{Examples}
\crefname{problem}{Problem}{Problems}
\crefname{setting}{Setting}{Settings}
\crefname{definition}{Definition}{Definitions}
\crefname{assumption}{Assumption}{Assumptions}
\crefname{subsection}{Subsection}{Subsections}
\crefname{section}{Section}{Sections}
\DeclareMathOperator*{\E}{\mathbb{E}}
\let\Pr\relax
\DeclareMathOperator*{\Pr}{\mathbb{P}}
\newcommand{\eps}{\varepsilon}
\newcommand{\inprod}[1]{\left\langle #1 \right\rangle}
\newcommand{\R}{\mathbb{R}}
\newcommand{\handout}[5]{
\noindent
\begin{center}
\framebox{
\vbox{
\hbox to 5.78in { {\bf CS 270: Combinatorial Algorithms and Data Structures
} \hfill #2 }
\vspace{4mm}
\hbox to 5.78in { {\Large \hfill #5 \hfill} }
\vspace{2mm}
\hbox to 5.78in { {\em #3 \hfill #4} }
}
}
\end{center}
\vspace*{4mm}
}
\newcommand{\lecture}[4]{\handout{#1}{#2}{#3}{Scribe: #4}{Lecture #1}}
\newtheorem{theorem}{Theorem}[section]
\newtheorem*{theorem*}{Theorem}
\newtheorem{itheorem}{Theorem}
\newtheorem{subclaim}{Claim}[theorem]
\newtheorem{proposition}[theorem]{Proposition}
\newtheorem*{proposition*}{Proposition}
\newtheorem{lemma}[theorem]{Lemma}
\newtheorem*{lemma*}{Lemma}
\newtheorem{corollary}[theorem]{Corollary}
\newtheorem*{conjecture*}{Conjecture}
\newtheorem{fact}[theorem]{Fact}
\newtheorem*{fact*}{Fact}
\newtheorem{exercise}[theorem]{Exercise}
\newtheorem*{exercise*}{Exercise}
\newtheorem{hypothesis}[theorem]{Hypothesis}
\newtheorem*{hypothesis*}{Hypothesis}
\newtheorem{conjecture}[theorem]{Conjecture}
\theoremstyle{definition}
\newtheorem{definition}[theorem]{Definition}
\newtheorem{setting}[theorem]{Setting}
\newtheorem{construction}[theorem]{Construction}
\newtheorem{example}[theorem]{Example}
\newtheorem{question}[theorem]{Question}
\newtheorem{openquestion}[theorem]{Open Question}
% \newtheorem{algorithm}[theorem]{Algorithm}
\newtheorem{problem}[theorem]{Problem}
\newtheorem{protocol}[theorem]{Protocol}
\newtheorem{assumption}[theorem]{Assumption}
\newtheorem{exercise-easy}[theorem]{Exercise}
\newtheorem{exercise-med}[theorem]{Exercise}
\newtheorem{exercise-hard}[theorem]{Exercise$^\star$}
\newtheorem{claim}[theorem]{Claim}
\newtheorem*{claim*}{Claim}
\newtheorem{remark}[theorem]{Remark}
\newtheorem*{remark*}{Remark}
\newtheorem{observation}[theorem]{Observation}
\newtheorem*{observation*}{Observation}
% 1-inch margins, from fullpage.sty by H.Partl, Version 2, Dec. 15, 1988.
% \topmargin 0pt
% \advace \topmargin by -\headheight
% \advance \topmargin by -\headsep
% \textheight 8.9in
% \oddsidemargin 0pt
% \evensidemargin \oddsidemargin
% \marginparwidth 0.5in
% \textwidth 6.5in
% \parindent 0in
% \parskip 1.5ex
\newcommand{\blockflow}{\textsc{BlockingFlow}}
\begin{document}
\lecture{5 --- January 31, 2023}{Spring 2023}{Prof.\ Jelani Nelson}{Jonathan Pei and Vibhav Athreya}
\section{Overview}
Today, we are going to finally talk about link-cut trees \cite{sleatoretal}: the data-structural problem they solve, how we can use them to implement blocking flow, and how to analyze their runtime.
The main things we're covering in this lecture are how to implement link-cut trees and how to analyze them to show that they are fast. There is a way to implement link-cut trees such that each operation takes worst-case $O(\log n)$ time, but today we'll be talking about how to use splay trees to implement these operations in $O(\log^2 n)$ amortized time. You will show how to optimize this to $O(\log n)$ time in a later problem set.
\section{The Problem that Link-Cut Trees Solve}
Let's first discuss the problem that link-cut trees solve. We visualize each iteration of \blockflow{} as a level graph as seen in Figure \ref{fig:levelgraph} below:
\begin{figure}[htb]
\centering
\begin{tikzpicture}[->,>=stealth',shorten >=1pt,auto,node distance=3cm,
thick,main node/.style={circle,draw,font=\sffamily\Large\bfseries}]
\node[shape=circle, draw=black] (s) at (0,3) {$s$};
\node[shape=circle, draw=black] (1) at (2,5) {$v_1$};
\node[shape=circle, draw=black] (2) at (2,3) {$v_2$};
\node[shape=circle, draw=black] (3) at (2,1) {$v_3$};
\node[shape=circle, draw=black] (4) at (4,4) {$v_4$};
\node[shape=circle, draw=black] (5) at (4,2) {$v_5$};
\node[shape=circle, draw=black] (6) at (6,5) {$v_6$};
\node[shape=circle, draw=black] (7) at (6,3) {$v_7$};
\node[shape=circle, draw=black] (8) at (6,1) {$v_8$};
\node[shape=circle, draw=black] (t) at (8,3) {$t$};
\draw [->, color=yellow] (s) to node [above] {} (1);
\draw [->, color=black] (s) to node [above] {} (2);
\draw [->, color=black] (s) to node [above] {} (3);
\draw [->, color=yellow] (1) to node [above] {} (4);
\draw [->, color=black] (2) to node [above] {} (4);
\draw [->, color=black] (3) to node [above] {} (4);
\draw [->, color=black] (3) to node [above] {} (5);
\draw [->, color=black] (4) to node [above] {} (6);
\draw [->, color=black] (4) to node [above] {} (7);
\draw [->, color=yellow] (4) to node [above] {} (8);
\draw [->, color=black] (5) to node [above] {} (8);
\draw [->, color=black] (6) to node [above] {} (t);
\draw [->, color=black] (7) to node [above] {} (t);
\draw [->, color=yellow] (8) to node [above] {} (t);
\draw [-, color=red] (4.9, 2.1) node{} -- (5.2, 2.9);
\draw [-, color=red] (4.6, 2.5) node{} -- (5.5, 2.5);
\end{tikzpicture}
\caption{Level Graph with 1 Augmented Path}
\label{fig:levelgraph}
\end{figure}
Recall that in \blockflow{}, we iteratively make a sequence of advances and retreats to augment paths, leaving behind saturated edges. An example of an augmented path is depicted in the above graph, where the advances are indicated in \textcolor{yellow}{yellow}, and the saturated edge is crossed out with a \textcolor{red}{red X}. In future augmentations, we may end up revisiting the same vertex many times. Thus, to optimize path augmentations, we want to somehow memoize the explored path fragments (e.g. $\{(s, v_1), (v_1, v_4)\}, \{(v_8, t)\}$). This way, we can decrease the cost of advances by jumping ``up'' the graph through these explored path fragments.
For example, suppose for our next augmentation, we want to take the path
\[s \rightarrow v_1 \rightarrow v_4 \rightarrow v_7 \rightarrow t.\]
In doing so, we would want to be able to jump from $s$ directly to $v_4$ when advancing, instead of having to traverse through $v_1$. We indicate this optimized augmenting traversal in \textcolor{green}{green} in Figure \ref{fig:levelgraph2}.
\begin{figure}
\centering
\begin{tikzpicture}[->,>=stealth',shorten >=1pt,auto,node distance=3cm,
thick,main node/.style={circle,draw,font=\sffamily\Large\bfseries}]
\node[shape=circle, draw=black] (s) at (0,3) {$s$};
\node[shape=circle, draw=black] (1) at (2,5) {$v_1$};
\node[shape=circle, draw=black] (2) at (2,3) {$v_2$};
\node[shape=circle, draw=black] (3) at (2,1) {$v_3$};
\node[shape=circle, draw=black] (4) at (4,4) {$v_4$};
\node[shape=circle, draw=black] (5) at (4,2) {$v_5$};
\node[shape=circle, draw=black] (6) at (6,5) {$v_6$};
\node[shape=circle, draw=black] (7) at (6,3) {$v_7$};
\node[shape=circle, draw=black] (8) at (6,1) {$v_8$};
\node[shape=circle, draw=black] (t) at (8,3) {$t$};
\draw [->, color=yellow] (s) to node [above] {} (1);
\draw [->, color=black] (s) to node [above] {} (2);
\draw [->, color=black] (s) to node [above] {} (3);
\draw [->, color=green] (s) to node [above] {} (4);
\draw [->, color=yellow] (1) to node [above] {} (4);
\draw [->, color=black] (2) to node [above] {} (4);
\draw [->, color=black] (3) to node [above] {} (4);
\draw [->, color=black] (3) to node [above] {} (5);
\draw [->, color=black] (4) to node [above] {} (6);
\draw [->, color=green] (4) to node [above] {} (7);
\draw [->, color=yellow] (4) to node [above] {} (8);
\draw [->, color=black] (5) to node [above] {} (8);
\draw [->, color=black] (6) to node [above] {} (t);
\draw [->, color=green] (7) to node [above] {} (t);
\draw [->, color=yellow] (8) to node [above] {} (t);
\draw [-, color=red] (4.9, 2.1) node{} -- (5.2, 2.9);
\draw [-, color=red] (4.6, 2.5) node{} -- (5.5, 2.5);
\draw [-, color=red] (6.75, 2.6) node{} -- (7.25, 3.4);
\draw [-, color=red] (7.25, 2.6) node{} -- (6.75, 3.4);
\end{tikzpicture}
\caption{Level Graph with 2 Augmented Paths}
\label{fig:levelgraph2}
\end{figure}
In the later parts of this lecture, we discuss how to implement an efficient data structure that memoizes the explored tree fragments and allows us to make these jumps.
\section{Blocking Flow}
When implementing \blockflow{}, we maintain a rooted forest (using a link-cut tree data structure) subject to the following operations:
\begin{itemize}
\item \texttt{maketree($v$)}: make a singleton tree containing just the node $v$
\item \texttt{link($v, w, x$)}: make $w$ the parent of $v$ with residual capacity $x$ on $(v, w)$
\item \texttt{cut($v$)}: remove edge $(v, \text{parent}(v))$ and return the residual capacity
\item \texttt{findRoot($v$)}: return root of $v$'s tree
\item \texttt{findMin($v$)}: return edge of min capacity from $v$ to its root, breaking ties by returning the one furthest from the root
\item \texttt{addFlow($v, x$)}: subtract $x$ from $u_e$ for all $e$ on the path from $v$ to its root
\end{itemize}
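To make this interface concrete, here is a naive reference implementation (a sketch of ours, not from the lecture) in which every node just stores a parent pointer together with the residual capacity of its parent edge. Every operation simply walks the path to the root, so each call costs $O(\text{depth})$; the whole point of link-cut trees is to bring this down to (amortized) polylogarithmic time.

```python
class NaiveForest:
    """Naive rooted forest: each operation walks parent pointers,
    costing O(depth) per call instead of the polylog amortized
    bounds achieved by link-cut trees."""

    def __init__(self):
        self.parent = {}  # v -> parent(v), or None for roots
        self.cap = {}     # v -> residual capacity of edge (v, parent(v))

    def maketree(self, v):
        self.parent[v] = None

    def link(self, v, w, x):
        self.parent[v] = w
        self.cap[v] = x

    def cut(self, v):
        x = self.cap.pop(v)
        self.parent[v] = None
        return x

    def findroot(self, v):
        while self.parent[v] is not None:
            v = self.parent[v]
        return v

    def findmin(self, v):
        # min-capacity edge (z, parent(z)) on the v-to-root path;
        # strict '<' keeps the first (deepest) minimum seen while
        # walking up, i.e. the tie furthest from the root
        best, best_cap = None, float('inf')
        while self.parent[v] is not None:
            if self.cap[v] < best_cap:
                best, best_cap = v, self.cap[v]
            v = self.parent[v]
        return best, best_cap

    def addflow(self, v, x):
        # subtract x from every edge on the v-to-root path
        while self.parent[v] is not None:
            self.cap[v] -= x
            v = self.parent[v]
```

Here \texttt{findmin} returns the lower endpoint $z$ of the min-capacity edge $(z, \text{parent}(z))$ together with that capacity.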
Now that we've defined these operations, we can write out \blockflow{} in Algorithm \ref{alg:bf}.
\begin{algorithm}
\caption{Perform a single iteration of \blockflow{}}\label{alg:bf}
\begin{algorithmic}
\Procedure{\blockflow{}}{$G=(V, E), s, t$}
\ForAll{$v \in V$}
\State \texttt{maketree}($v$)
\EndFor
\While{True}
\State $v \gets$ \texttt{findRoot}($s$)
\If{$v = t$}\Comment{Augment}
\State $(e, x) \gets \texttt{findMin}(s)$\Comment{$x$ is the residual capacity of $e$}
\State \texttt{addFlow}($s$, $x$)
\While{the edge $(z, \text{parent}(z)) \gets \texttt{findMin}(s)$ has remaining capacity $0$}
\State \texttt{cut}($z$)
\State \texttt{del}($(z, \text{parent}(z))$)
\EndWhile
\Else \Comment{Advance?}
\If{$v$ has outgoing edge $(v, w) \in E$}
\State \texttt{link}$\left(v, w, u_{(v, w)}\right)$
\Else
\If{$v=s$} \Comment{Done}
\State \texttt{break}
\Else \Comment{Mass Retreat}
\ForAll{child $w$ of $v$}
\State \texttt{cut}($w$)
\State \texttt{del}($(w, v)$)
\EndFor
\EndIf
\EndIf
\EndIf
\EndWhile
\EndProcedure
\end{algorithmic}
\end{algorithm}
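To see the control flow of Algorithm \ref{alg:bf} without the data structure, here is a deliberately naive Python sketch of ours that performs the same advance/augment/retreat loop by walking an explicit path. Each advance and retreat here costs time proportional to the path length, which is exactly the cost link-cut trees eliminate; also, for simplicity a retreat deletes only the single edge it backed out of, rather than the mass retreat of Algorithm \ref{alg:bf}.

```python
def blocking_flow(adj, s, t):
    # adj: {v: {w: residual capacity}} over the edges of the level graph.
    # Repeatedly advance from s; augment when we reach t; retreat at dead ends.
    flow = 0
    while True:
        path, v = [s], s
        while v != t:
            # advance along any outgoing edge with remaining capacity
            nxt = next((w for w, c in adj[v].items() if c > 0), None)
            if nxt is not None:
                path.append(nxt)
                v = nxt
            elif v == s:
                return flow              # done: s is cut off from t
            else:
                path.pop()               # retreat: delete the dead edge
                del adj[path[-1]][v]
                v = path[-1]
        # augment: push the bottleneck, saturating at least one edge
        x = min(adj[a][b] for a, b in zip(path, path[1:]))
        flow += x
        for a, b in zip(path, path[1:]):
            adj[a][b] -= x
```

Since every augmentation saturates an edge and every retreat deletes one, the loop terminates; on a level graph the result is a blocking flow.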
\newpage
\section{Implementing Link-Cut Trees Using Splay Trees}
Now, we discuss how to actually implement link-cut trees using splay trees for \blockflow{}. We first introduce the idea of accessing a vertex, which we can currently think of as ``touching a vertex $v$.'' We denote the operation as \texttt{access(v)}.
\subsection{Preferred Path Decomposition}
\begin{definition}[Preferred Child]
Every vertex $v$ will have at most one \textbf{preferred child}, defined as follows:
\[\texttt{prefChild}(v) = \begin{cases}
\text{null, if $v$ was the most recent access in its subtree} \\
\text{child towards most recent access, otherwise}
\end{cases}\]
\end{definition}
\begin{definition}[Preferred Edge]
A \textbf{preferred edge} is defined as any edge that leads to a preferred child.
\end{definition}
\begin{definition}[Preferred Path]
A \textbf{preferred path} is defined as a maximal contiguous sequence of preferred edges.
\end{definition}
Using a preferred path decomposition, we can visualize a tree fragment in the level graph below:
\begin{figure}[htb]
\centering
\begin{tikzpicture}[node distance={25mm}, thick, main/.style = {draw, circle}]
\node[main] (1) {$v_1$};
\node[main] (2) [below left of=1] {$v_2$};
\node[main] (3) [left of=2] {$v_3$};
\node[main] (4) [below right of=1] {$v_4$};
\node[main] (5) [right of=4] {$v_5$};
\node[main] (6) [below of=3] {$v_6$};
\node[main] (7) [below left of=2] {$v_7$};
\node[main] (8) [below of=2] {$v_8$};
\node[main] (9) [below right of=2] {$v_9$};
\node[main] (10) [below of=8] {$v_{10}$};
\node[main] (11) [below right of=5] {$v_{11}$};
\node[main] (12) [below left of=5] {$v_{12}$};
\node[main] (13) [below left of=12] {$v_{13}$};
\node[main] (14) [below of=12] {$v_{14}$};
\node[main] (15) [below right of=12] {$v_{15}$};
\node[main] (16) [below left of=14] {$v_{16}$};
\node[main] (17) [below right of=14] {$v_{17}$};
\draw[yellow] (3) -- (6);
\draw[yellow] (2) -- (8);
\draw[yellow] (8) -- (10);
\draw[yellow] (5) -- (12);
\draw[yellow] (12) -- (15);
\draw[yellow] (14) -- (17);
\draw (1) -- (2);
\draw (1) -- (3);
\draw (1) -- (4);
\draw (1) -- (5);
\draw (2) -- (7);
\draw (2) -- (9);
\draw (5) -- (11);
\draw (12) -- (13);
\draw (12) -- (14);
\draw (14) -- (16);
\end{tikzpicture}
\caption{Represented Tree (i.e. Tree Fragment of Level Graph)}
\label{fig:representedtree}
\end{figure}
In this tree fragment, we indicate 4 preferred paths (in \textcolor{yellow}{yellow})
\[\{v_3, v_6\}, \ \{v_2, v_8, v_{10}\}, \ \{v_5, v_{12}, v_{15}\},\ \{v_{14}, v_{17}\}\]
and as part of the link-cut data structure, we store each of these paths as a splay tree. Note that preferred paths consisting of one vertex are stored as singletons.
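The preferred-child rule is easy to simulate directly. The sketch below (our own helper for intuition, not part of the data structure) maintains \texttt{prefChild} pointers under a sequence of accesses on a tree given by parent pointers, then reads off the preferred path decomposition; in the real structure, these paths are exactly the contents of the auxiliary splay trees.

```python
def preferred_paths(parent, accesses):
    # parent: {v: parent of v, or None for the root}.
    # After access(u): u has no preferred child, and every vertex on the
    # u-to-root path has its preferred child pointing toward u.
    pref = {}
    for u in accesses:
        pref[u] = None
        while parent[u] is not None:
            pref[parent[u]] = u      # edge (u, parent(u)) becomes preferred
            u = parent[u]
    # each maximal chain of preferred edges is one preferred path;
    # a vertex starts a path iff it is not a preferred child itself
    is_pref_child = set(c for c in pref.values() if c is not None)
    paths = []
    for v in parent:
        if v in is_pref_child:
            continue
        path = [v]
        while pref.get(v) is not None:
            v = pref[v]
            path.append(v)
        paths.append(path)           # singleton paths are kept too
    return paths
```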
\subsection{Storing Preferred Paths}
Each preferred path will be stored as a splay tree with the following specifications:
\begin{itemize}
\item We will store preferred path decomposition explicitly.
\item The items in the splay tree will be the vertices in the preferred path.
\item The key corresponding to each vertex will be its depth in the represented tree, ordered by increasing depth (i.e. vertices higher up in the represented tree have smaller keys, so an in-order traversal of the splay tree lists the path from top to bottom)
\end{itemize}
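As a sanity check on this representation, here is a minimal splay tree sketch (a standard recursive splay, with names of our choosing): we insert the vertices of a preferred path such as $(v_5, v_{12}, v_{15})$ keyed by depth, so an in-order traversal recovers the path from top to bottom, and splaying brings the most recently accessed vertex to the root.

```python
class Node:
    def __init__(self, key, val):
        self.key, self.val = key, val   # key = depth in the represented tree
        self.left = self.right = None

def rotate_right(x):
    y = x.left
    x.left, y.right = y.right, x
    return y

def rotate_left(x):
    y = x.right
    x.right, y.left = y.left, x
    return y

def splay(root, key):
    # standard recursive splay: bring the node with `key` to the root
    if root is None or root.key == key:
        return root
    if key < root.key:
        if root.left is None:
            return root
        if key < root.left.key:                          # zig-zig
            root.left.left = splay(root.left.left, key)
            root = rotate_right(root)
        elif key > root.left.key:                        # zig-zag
            root.left.right = splay(root.left.right, key)
            if root.left.right is not None:
                root.left = rotate_left(root.left)
        return root if root.left is None else rotate_right(root)
    else:
        if root.right is None:
            return root
        if key > root.right.key:                         # zig-zig
            root.right.right = splay(root.right.right, key)
            root = rotate_left(root)
        elif key < root.right.key:                       # zig-zag
            root.right.left = splay(root.right.left, key)
            if root.right.left is not None:
                root.right = rotate_right(root.right)
        return root if root.right is None else rotate_left(root)

def insert(root, key, val):
    # splay-based insert: the new node ends up as the root
    if root is None:
        return Node(key, val)
    root = splay(root, key)
    new = Node(key, val)
    if key < root.key:
        new.left, new.right, root.left = root.left, root, None
    else:
        new.right, new.left, root.right = root.right, root, None
    return new

def inorder(root):
    # in-order traversal = path vertices ordered by increasing depth
    if root is None:
        return []
    return inorder(root.left) + [root.val] + inorder(root.right)
```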
Note that Figure \ref{fig:representedtree} depicts just a single tree (decomposed into a number of preferred paths) in the forest of trees represented by the link-cut data structure.
Next, to distinguish between a preferred path tree and its corresponding splay tree, we define the following terms:
\begin{definition}[Represented Tree]
A \textbf{represented tree} is defined as an entire tree fragment from the level graph (e.g. $(s, v_1, v_4)$ from Figure \ref{fig:levelgraph}, or the entire tree in Figure \ref{fig:representedtree}).
\end{definition}
\begin{definition}[Auxiliary Tree]
An \textbf{auxiliary tree} is defined as a splay tree that represents a preferred path within the represented tree. Note that a represented tree corresponds to a tree of auxiliary trees.
\end{definition}
Thus, we may represent the represented tree in Figure \ref{fig:representedtree} as a forest of auxiliary trees, as shown in Figure \ref{fig:treeofauxtrees} below:
\begin{figure}[htb]
\centering
\begin{tikzpicture}[node distance={25mm}, color=purple, thick, main/.style = {draw, isosceles triangle}, shape border rotate=+90]
\node[main] (1) {$A$};
\node[main] (2) [below left of=1] {$C$};
\node[main] (3) [left of=2] {$B$};
\node[main] (4) [below right of=1] {$D$};
\node[main] (5) [right of=4] {$E$};
\node[main] (6) [below left of=2] {$G$};
\node[main] (7) [below right of=2] {$H$};
\node[main] (8) [below left of=5] {$I$};
\node[main] (9) [below of=5] {$K$};
\node[main] (10) [below right of=5] {$F$};
\node[main] (11) [below left of=9] {$J$};
\draw (1) -- (2);
\draw (1) -- (3);
\draw (1) -- (4);
\draw (1) -- (5);
\draw (2) -- (6);
\draw (2) -- (7);
\draw (5) -- (8);
\draw (5) -- (9);
\draw (5) -- (10);
\draw (9) -- (11);
\end{tikzpicture}
\caption{Auxiliary Tree Forest}
\label{fig:treeofauxtrees}
\end{figure}
Here, the preferred paths are
\begin{center}
\textcolor{purple}{
\begin{tabular}{llll}
$A: \{v_1\},$ &
$B: \{v_3, v_6\},$ &
$C: \{v_2, v_8, v_{10}\},$ &
$D: \{v_4\},$ \\
$E: \{v_5, v_{12}, v_{15}\},$ &
$F: \{v_{11}\}$, &
$G: \{v_7\},$&
$H: \{v_9\},$\\
$I: \{v_{13}\},$&
$J: \{v_{16}\},$&
$K: \{v_{14}, v_{17}\}$
\end{tabular}
}
\end{center}
Now that we've defined the preferred path decomposition and the auxiliary tree forest representation, we discuss what \texttt{access} actually does. Anytime we call \texttt{access} on a vertex $v$, the path from the root of the represented tree down to $v$ becomes the preferred path, and $v$ is moved into the root auxiliary tree. Then, as part of this \texttt{access}, we splay $v$ so that it becomes the root of the root auxiliary tree. This way, we indicate that $v$ is the most recently accessed node.
Note that calling \texttt{access} does not change the represented tree.
\subsection{Implementing Link and Cut}
Before we describe how to implement \texttt{access} in detail, we first walk through how to implement \texttt{link} and \texttt{cut}.
\subsubsection{Link}
\begin{algorithm}
\caption{Link $v$ and $w$ with edge weight $x$}
\begin{algorithmic}
\Procedure{link}{$v, w, x$}
\State \texttt{access}($v$)
\State \texttt{access}($w$)
\State $v$\texttt{.pathparent} $\gets w$
\State $w$\texttt{.right} $\gets v$
\EndProcedure
\end{algorithmic}
\end{algorithm}
Note that a \texttt{.pathparent} pointer points from the root of a given auxiliary tree to some node in its parent auxiliary tree (e.g. from the root of $B$ to a node of $A$ in Figure \ref{fig:treeofauxtrees}).
Let's visualize what we're doing here. After its \texttt{access}, $v$ is the root of its own auxiliary tree and has nothing to its right; $w$ is in a similar situation, higher up in the auxiliary tree forest than $v$.
\begin{figure}[htb]
\centering
\begin{tikzpicture}[node distance={10mm}, color=purple, thick, main/.style = {draw, isosceles triangle}, shape border rotate=+90]
\node[main] (2) at (0, 1) {};
\filldraw [purple] (2.5,1) circle (2pt);
\node[draw, color=white, text=purple] at (2.5,1.3) {$v$};
\filldraw [purple] (0.5,2) circle (2pt);
\node[draw, color=white, text=purple] at (0.5,2.3) {$w$};
\node[draw, color=white] (6) [below left of=2] {};
\node[draw, color=white] (7) [below of=2] {};
\node[draw, color=white] (8) [below right of=2] {};
\draw (2) -- (6);
\draw (2) -- (7);
\draw (2) -- (8);
\draw[color=purple, dashed] (2.5,1) -- (0.5, 2);
\draw (0.5,2) -- (0, 1.45);
\end{tikzpicture}
\caption{Auxiliary Tree Forest after Link Operation}
\label{fig:linkauxtrees}
\end{figure}
\begin{figure}[htb]
\centering
\begin{tikzpicture}[node distance={10mm}, color=black!30!yellow, thick, main/.style = {draw, isosceles triangle}, shape border rotate=+90]
\node[main, minimum size=2cm] (1) at (2,2) {};
\node[main, minimum size=2cm] (2) at (0,0) {};
\filldraw [purple] (0,1.8) circle (2pt);
\node[draw, color=white, text=black!30!yellow] at (0,2.1) {$v$};
\filldraw [black!30!yellow] (2,2.3) circle (2pt);
\node[draw, color=white, text=black!30!yellow] at (2.3,2.3) {$w$};
\draw[color=black!30!yellow, dashed] (0, 1.8) -- (2, 2.3);
\draw[color=purple, squiggle] (2, 2.3) -- (2, 3.7);
\end{tikzpicture}
\caption{Represented Tree after Link Operation}
\label{fig:linkreptree}
\end{figure}
Then, in the represented trees, $w$ is somewhere in a represented tree, and $v$ is the root of its own represented tree (we only call \texttt{link} from a root).
Note that we first call \texttt{access} on $v$ and then on $w$ so that $w$ remains in a separate auxiliary tree because it has become the most recently accessed node. Now, the path from $w$ up to its root (indicated in Figure \ref{fig:linkreptree}) is preferred. Also, since $v$ has no preferred child, it becomes a singleton auxiliary tree.
\subsubsection{Cut}
\begin{algorithm}
\caption{Cut the edge $(v, \text{parent}(v))$}
\begin{algorithmic}
\Procedure{cut}{$v$}
\State \texttt{access}($v$)
\State $v$\texttt{.left.parent} $\gets$ null
\State $v$\texttt{.left} $\gets$ null
\EndProcedure
\end{algorithmic}
\end{algorithm}
Note that the \texttt{.left.parent} and \texttt{.left} pointers are within an auxiliary tree.
Let's visualize what we're doing here. Cutting $v$ severs $v$ and all of its descendants in the represented tree from the rest of the tree. After \texttt{access}($v$), the vertices above $v$ on its preferred path are exactly the vertices to the left of $v$ in its auxiliary tree, so we detach $v$'s left subtree.
\begin{figure}[htb]
\centering
\begin{tikzpicture}[node distance={10mm}, color=purple, thick, main/.style = {draw, isosceles triangle}, shape border rotate=+90]
\node[main, minimum size=3cm] (1) at (0,0) {};
\node[main] (1) at (-0.25,0.5) {$A$};
\filldraw [purple] (0,1.7) circle (2pt);
\node[draw, color=white, text=purple] at (0,2.1) {$v$};
\draw[-stealth] (1, 0.5) -- (2, 0.5);
\draw (0,1.7) -- (-0.25, 1.35);
\draw[squiggle] (0,1.7) -- (0.5, 0.5);
\draw[squiggle] (-0.5, 0.25) -- (-0.75, -0.5);
\draw[squiggle] (-0.2, 0.25) -- (0.75, -0.5);
\node[main] (2) at (3.5,0) {$A$};
\filldraw [purple] (4,2) circle (2pt);
\node[draw, color=white, text=purple] at (4,2.3) {$v$};
\draw[squiggle, color=black] (4,2) -- (4.55, 1);
\draw[squiggle] (3.3,-0.3) -- (2.7, -1);
\draw[squiggle] (3.6,-0.3) -- (4.2, -1);
\end{tikzpicture}
\caption{$v$'s Auxiliary Tree Pre-Cut / Auxiliary Forest Post-Cut}
\label{fig:cutauxtree}
\end{figure}
\subsection{Implementing Access}
Finally, we implement \texttt{access(v)}. We start with our forest of auxiliary trees with $v$ somewhere down below the root auxiliary tree. This is depicted in Figure \ref{fig:accessauxtree} below:
\begin{figure}[htb]
\centering
\begin{tikzpicture}[node distance={10mm}, color=purple, thick, main/.style = {draw, isosceles triangle}, shape border rotate=+90]
\node[main] (1) at (0,0) {};
\node[main] (2) at (1.5,1.5) {};
\filldraw [purple] (0,0.1) circle (2pt);
\node[draw, color=white, text=purple] at (0.4,0.1) {$v$};
\draw (1) -- (2);
\end{tikzpicture}
\caption{Original tree of aux trees (some aux trees may be omitted)}
\label{fig:accessauxtree}
\end{figure}
\newpage
First, we splay $v$ to put it at the root of its auxiliary tree. Then, since $v$ has no preferred child anymore, everything on the right of $v$ in its auxiliary tree becomes a separate auxiliary tree with a parent pointer to $v$. $v$ used to be part of some preferred path, and now it's part of another preferred path.
Then, we look at the highest vertex in the preferred path represented by $v$'s aux tree (the bottom-left node in the aux tree), which we shall call $t$. Now, $t$ has a path-parent pointer to the deepest node in the parent auxiliary tree (the bottom-right node in that aux tree), which we will call $w$. This entire process is depicted in Figure \ref{fig:accesssplayv} below.
\begin{figure}[htb]
\centering
\begin{tikzpicture}[node distance={10mm}, color=purple, thick, main/.style = {draw, isosceles triangle}, shape border rotate=+90]
\node[main, minimum size=2cm] (1) at (0,0) {};
\node[main, minimum size=2cm] (2) at (2.5,3) {};
\filldraw [purple] (0,1.8) circle (2pt);
\node[draw=none, color=white, text=purple] at (0, 2.1) {$v$};
\filldraw [purple] (-0.7,-0.4) circle (2pt);
\node[draw=none, text=purple] at (-0.4,-0.4) {$t$};
\filldraw [purple] (3.2, 2.5) circle (2pt);
\node[draw=none, color=white, text=purple] at (3,2.8) {$w$};
\node[main] (3) at (0.5, -1.5) {};
\draw (1) -- (3);
\draw (1) -- (2);
\draw[dotted] (-0.7,-0.4) -- (3.2,2.5);
\end{tikzpicture}
\caption{After splaying $v$}
\label{fig:accesssplayv}
\end{figure}
Continuing the \texttt{access}, we splay $w$ so that it moves to the root of its auxiliary tree. Now, $t$ becomes the preferred child of $w$, so $w$'s old right subtree (the previous preferred path below $w$) gets forked off into its own auxiliary tree, and $v$'s auxiliary tree is merged in as $w$'s right child. The auxiliary forest now looks as follows:
\begin{figure}[htb]
\centering
\begin{tikzpicture}[node distance={10mm}, color=purple, thick, main/.style = {draw, isosceles triangle}, shape border rotate=+90]
\node[main, minimum size=2cm] (1) at (0,0) {};
\node[main] (2) at (-0.5, -1.5) {};
\node[main] (3) at (0.5, -1.5) {};
\filldraw [purple] (0,1.8) circle (2pt);
\node[draw, color=white, text=purple] at (0,2.1) {$w$};
% \filldraw [purple] (0.1,-0.25) circle (2pt);
% \node[draw, color=white, text=purple] at (-0.2,-0.25) {$t$};
\filldraw [purple] (0.3, 0.5) circle (2pt);
\node[draw, color=white, text=purple] at (0,0.5) {$v$};
\draw (1) -- (2);
\draw (1) -- (3);
\end{tikzpicture}
\caption{After splaying $w$, cutting off right tree, and merging $v$}
\label{fig:accesssplayvend}
\end{figure}
\newpage
Now, $v$ is in its parent auxiliary tree with $w$. We splay $v$, and it gets moved to the root of its current auxiliary tree. Essentially, we just keep on repeating this process until we get $v$ at the very top. We provide an overview of what \texttt{access}($v$) does in Algorithm \ref{alg:access}.
\begin{algorithm}
\caption{Accessing a node $v$}\label{alg:access}
\begin{algorithmic}
\Procedure{access}{$v$}
\While{$v$ is not the root of the root auxiliary tree}
\begin{enumerate}
\item Splay $v$, cutting off its right subtree as needed
\item Splay the appropriate node in $v$'s parent auxiliary tree, cutting off its right subtree as needed
\item Merge $v$ into its parent auxiliary tree
\end{enumerate}
\EndWhile
\EndProcedure
\end{algorithmic}
\end{algorithm}
\subsection{Runtime of Access}
The cost of an \texttt{access} can be computed by finding the total cost of doing the preferred child change (PCC) splays.
Thus, the runtime of a single \texttt{access} is \[O\left(\text{(cost of splay)} \cdot (1 + (\text{\# of PCC}))\right)\]
and the runtime of $m$ of them is
\[O\left(\text{(cost of splay)} \cdot (m + (\text{\# of PCC}))\right)\]
We know that a splay takes amortized $O(\log n)$ time, and in the next lecture we will show that the number of PCCs is $O(m \log n)$. Thus, our overall runtime is:
\begin{equation}
O\left(\log n \cdot (m + m \log n)\right) = O(m \log^2 n)
\end{equation}
To prove this runtime, we use the analysis method called \textbf{Heavy-Light Decomposition}.
\section{Heavy-Light Decomposition (HLD)}
\begin{definition}[size]
We define \textbf{size}($v$) to be the number of nodes in $v$'s subtree, including $v$.
\end{definition}
In the HLD scheme, we categorize edge $(v, w)$ from $v$ to its parent $w$ as \textbf{Heavy} or \textbf{Light} according to the following rule:
\[(v, w) \text{ is } \begin{cases}
\text{heavy} & \text{if size($v$) $> \frac{1}{2}$size($w$)} \\
\text{light} & \text{if size($v$) $\leq \frac{1}{2}$size($w$)}
\end{cases}\]
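This rule is short enough to state in code; the sketch below (our own helper, assuming the tree is given by parent pointers) computes subtree sizes and classifies every parent edge. Note that by the rule, each node has at most one heavy child, and any leaf-to-root path crosses at most $\log_2 n$ light edges, since crossing a light edge upward at least doubles the subtree size.

```python
def heavy_light(parent):
    # parent: {v: parent of v, or None for the root}
    children = {v: [] for v in parent}
    for v, p in parent.items():
        if p is not None:
            children[p].append(v)
    size = {}
    def subtree_size(v):                 # size(v) counts v itself
        size[v] = 1 + sum(subtree_size(c) for c in children[v])
        return size[v]
    for v, p in parent.items():
        if p is None:
            subtree_size(v)
    # edge (v, parent(v)) is heavy iff size(v) > size(parent(v)) / 2
    return {v: ("heavy" if size[v] > size[p] / 2 else "light")
            for v, p in parent.items() if p is not None}
```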
Then, HLD allows us to perform the following runtime analysis.
Suppose the preferred child of $w$ switches from $v$ to $v^{\prime}$. We call the removal of the preferred edge $(v, w)$ a preferred child destruction, and the addition of the preferred edge $(v^{\prime}, w)$ a preferred child creation (PCC).
Since we want to bound the number of preferred child creations (PCC), we want to prove the following two theorems:
\begin{theorem}[]
$\mathrm{(\# \ PCC)} \leq 2(\mathrm{\# \ LPCC}) + n + m $
\end{theorem}
\begin{theorem}[]
$(\mathrm{\# \ LPCC}) \leq O(m \log n)$
\end{theorem}
where LPCC stands for ``Light Preferred Child Creation'', i.e. a PCC in which the newly created preferred edge is light.
Combining these two theorems together, we are able to show that
\[\mathrm{(\# \ PCC)} \leq O(m \log n)\]
and thus the runtime in Eq (1) directly follows.
Proofs for these theorems will be discussed in the next lecture. Stay tuned!
\bibliographystyle{alpha}
\begin{thebibliography}{42}
\bibitem{sleatoretal}
Daniel Dominic Sleator and Robert Endre Tarjan.
\newblock A Data Structure for Dynamic Trees.
\newblock {\em J. Comput. Syst. Sci.}, 26(3):362--391, 1983.
\end{thebibliography}
\end{document}