# Trevisan’s L^1-style Cheeger inequality

My expander class at ETH continues, and for the moment the course notes are kept up to date (in fact, the notes contain a bit more material than I covered in class).

Last week, I presented the inequalities relating the definition of expanders based on the expansion ratio with the “spectral” definition. In fact, I used the random walk definition first, which is only very slightly different for general families of graphs (with bounded valency), and is entirely equivalent for regular graphs. For a connected $k$-regular graph $\Gamma$, the inequalities state that
$\frac{2}{k}\lambda_1(\Gamma)\leq h(\Gamma)\leq k\sqrt{2\lambda_1(\Gamma)},$
where:

• $\lambda_1(\Gamma)$ is the smallest non-zero eigenvalue of the normalized Laplace operator $\Delta$, i.e., the operator acting on functions on the vertex set by
$\Delta\varphi(x)=\varphi(x)-\frac{1}{k}\sum_{y\text{ adjacent to }x}{\varphi(y)}$.
• $h(\Gamma)$ is the expansion ratio of $\Gamma$, given by $h(\Gamma)=\min_{W\,:\,1\leq |W|\leq |\Gamma|/2}\frac{|E(W)|}{|W|},$ where $W$ ranges over subsets of the vertex set, the set $E(W)$ in the numerator being the set of edges in $\Gamma$ with one extremity in $W$ and one outside.
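To make the two quantities concrete, here is a small sanity check (the example and code are mine, not from the notes): for the cycle $C_6$, which is $2$-regular, one can brute-force $h(\Gamma)$ and compare with the known eigenvalues $1-\cos(2\pi j/n)$ of the normalized Laplace operator.

```python
import itertools, math

# Brute-force the expansion ratio h of the cycle C_6 (2-regular) and
# check both sides of the stated inequality, using the known spectral
# gap lambda_1 = 1 - cos(2*pi/n) of the cycle.
n, k = 6, 2
edges = [(x, (x + 1) % n) for x in range(n)]

def boundary(W):
    # |E(W)|: edges with exactly one extremity in W
    return sum((x in W) != (y in W) for x, y in edges)

h = min(boundary(set(W)) / len(W)
        for m in range(1, n // 2 + 1)
        for W in itertools.combinations(range(n), m))

lam1 = 1 - math.cos(2 * math.pi / n)      # spectral gap of C_6: 1/2
assert (2 / k) * lam1 <= h <= k * math.sqrt(2 * lam1)
print(h)  # 2/3, attained by an arc of 3 consecutive vertices
```

The minimizing set is an arc of three consecutive vertices, with two boundary edges, giving $h(C_6)=2/3$, comfortably between the two spectral bounds.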

The left-hand inequality is rather simple, once one knows (or notices) the variational characterization
$\lambda_1(\Gamma)=\min_{\varphi\perp 1}\frac{\langle \Delta\varphi,\varphi\rangle}{\|\varphi\|^2}$
of the spectral gap: basically, for a characteristic function $\varphi$ of a set $W$ of vertices, the right-hand side of this expression reduces (almost) to the corresponding ratio $|E(W)|/|W|$. The second inequality, which is called the discrete Cheeger inequality, is rather more subtle.
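The mechanism behind the left-hand inequality can be seen numerically (a toy check of my own, not from the notes): plug the characteristic function of a set $W$, shifted by its mean so that it is orthogonal to constants, into the Rayleigh quotient, and observe that the result is comparable to $|E(W)|/|W|$ while bounding $\lambda_1$ from above.

```python
import math

# On C_6, evaluate the Rayleigh quotient <Delta phi, phi>/||phi||^2 at
# the mean-shifted characteristic function of an arc W, and compare it
# with the known spectral gap lambda_1 = 1 - cos(2*pi/6) = 1/2.
n, k = 6, 2
W = {0, 1, 2}
mean = len(W) / n
phi = [1 - mean if x in W else -mean for x in range(n)]   # orthogonal to 1

edges = [(x, (x + 1) % n) for x in range(n)]
# For a k-regular graph, <Delta phi, phi> = (1/k) * sum over edges of
# (phi(x) - phi(y))^2.
num = sum((phi[x] - phi[y]) ** 2 for x, y in edges) / k
rq = num / sum(v * v for v in phi)

lam1 = 1 - math.cos(2 * math.pi / n)
assert lam1 <= rq + 1e-12   # variational characterization: lam1 is a min
print(rq)  # 2/3, an upper bound for lambda_1 = 1/2
```

Here the quotient equals $2/3$, which is exactly $|E(W)|/|W|$ for this arc, matching the "reduces (almost)" remark above.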

The proof I presented is the one given by L. Trevisan in his recent lectures on expanders (see, specifically, Lecture 4). Here, the idea is to construct a test set for $h(\Gamma)$ of the form
$W_t=\{x\in V_{\Gamma}\,\mid\, \varphi(x)\leq t\},$
for a suitable real-valued function $\varphi$ on the vertex set. The only minor difference is that, in order to motivate the distribution of the random “threshold” $t$ that Trevisan uses to construct these test sets, I first stated a general lemma where the threshold is picked according to a fairly general probability measure, and then showed what happens for a uniform choice in the interval $[\min \varphi,\max\varphi]$, to motivate the “better” choice that leads to Cheeger’s inequality.
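The deterministic core of the threshold idea can be sketched as follows (a toy example of my own, not from the lectures): sweep $t$ over the values of a function $\varphi$ and keep the best cut $W_t$, which by definition of $h(\Gamma)$ is an upper bound for it.

```python
# Threshold sweep on C_12 with phi(x) = cyclic distance to the vertex 0:
# every W_t = {x : phi(x) <= t} is a candidate test set for h.
n = 12
edges = [(x, (x + 1) % n) for x in range(n)]
phi = [min(x, n - x) for x in range(n)]

def ratio(W):
    e = sum((x in W) != (y in W) for x, y in edges)   # |E(W)|
    return e / len(W)

cuts = []
for t in sorted(set(phi)):
    W = {x for x in range(n) if phi[x] <= t}
    if 1 <= len(W) <= n // 2:     # thresholds below the median keep |W| <= n/2
        cuts.append(ratio(W))
best = min(cuts)
print(best)  # 0.4, while h(C_12) = 1/3: the sweep nearly finds the optimum
```

The point of the probabilistic argument is precisely to show that some threshold in the sweep must do well, by comparing the expectations of $|E(W_t)|$ and $|W_t|$ for a random $t$.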

As it turns out, I just found an older blog post of Trevisan explaining his proof, where he also first discusses the computations based on the uniform measure supported on $[\min \varphi,\max\varphi]$, and explains why that choice is problematic.

But this intermediate inequality is of independent interest. First, the precise form it takes (which I didn’t find elsewhere, and will therefore name Trevisan’s inequality) is
$h(\Gamma)\leq \frac{A}{B}$
where, for any non-constant real-valued function $\varphi$ with minimum $a$, maximum $b$ and median $t_0$, we have
$A=\frac{1}{2}\sum_{x,y\in V}{a(x,y)|\varphi(x)-\varphi(y)|},$
(where $a(x,y)$ is the number of edges between $x$ and $y$) and
$B=\sum_{x\in V}{|\varphi(x)-t_0|}.$
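As a numerical sanity check (the example is mine, not from the post), one can compute $A$ and $B$ for a concrete test function and compare $A/B$ with the brute-forced expansion ratio.

```python
import itertools
from statistics import median

# Trevisan's inequality h <= A/B on C_12 with phi(x) = cyclic distance
# to the vertex 0; here a(x,y) is 1 on edges of the cycle and 0 otherwise.
n = 12
edges = [(x, (x + 1) % n) for x in range(n)]
phi = [min(x, n - x) for x in range(n)]

A = sum(abs(phi[x] - phi[y]) for x, y in edges)   # each edge counted once
t0 = median(phi)                                  # median of phi
B = sum(abs(v - t0) for v in phi)

def boundary(W):
    return sum((x in W) != (y in W) for x, y in edges)

h = min(boundary(set(W)) / len(W)
        for m in range(1, n // 2 + 1)
        for W in itertools.combinations(range(n), m))

assert h <= A / B
print(h, A / B)  # 1/3 <= 2/3: the bound is off by only a factor 2 here
```

For this function the bound $A/B=2/3$ misses $h(C_{12})=1/3$ by only a factor $2$.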

The interesting fact is that this can be sharper than Cheeger’s inequality. Indeed, this happens at least for the family of graphs $\Gamma_n$ which are the Cayley graphs of $\mathfrak{S}_n$ with respect to the generating set
$S_n=\{(1\ 2),(1\ 2\ \cdots\ n)^{\pm 1}\}.$
From an analysis of the random walks by Diaconis and Saloff-Coste (see Example 1 in Section 3 of their paper), it is known that
$\lambda_1(\Gamma_n)\asymp \frac{1}{n^3},$
$h(\Gamma_n)\ll n^{-3/2}.$
However (unless I completely messed up my computations…), one sees that the test function used by Diaconis and Saloff-Coste, namely the cyclic distance between $\sigma^{-1}(1)$ and $\sigma^{-1}(2)$, satisfies
$\frac{A}{B}\ll \frac{1}{n^2},$
and therefore we get a better upper-bound
$h(\Gamma_n)\ll \frac{1}{n^2}$
using Trevisan’s inequality.
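The ratio $A/B$ for this test function is easy to evaluate exactly for small $n$. Here is a sketch for $n=5$ (code mine; I assume the Cayley graph is defined by right multiplication by the generators, so that multiplying by the long cycle shifts both positions $\sigma^{-1}(1)$ and $\sigma^{-1}(2)$ and leaves $\varphi$ unchanged, while the transposition only changes $\varphi$ when one of these positions lies in $\{1,2\}$):

```python
from fractions import Fraction
from itertools import permutations
from statistics import median

# A/B for the Diaconis--Saloff-Coste test function on the Cayley graph
# of S_5 with generators (1 2) and (1 2 ... n)^{+-1}, edges sigma -- sigma*g.
n = 5
tau = (2, 1) + tuple(range(3, n + 1))          # transposition (1 2)
cyc = tuple(range(2, n + 1)) + (1,)            # n-cycle (1 2 ... n)

def mul(s, t):
    """Composition s o t on {1..n}: (s o t)(i) = s(t(i))."""
    return tuple(s[t[i] - 1] for i in range(n))

def inv(s):
    out = [0] * n
    for i, v in enumerate(s):
        out[v - 1] = i + 1
    return tuple(out)

def phi(s):
    """Cyclic distance between s^{-1}(1) and s^{-1}(2)."""
    si = inv(s)
    d = abs(si[0] - si[1])
    return min(d, n - d)

verts = list(permutations(range(1, n + 1)))
gens = [tau, cyc, inv(cyc)]
# The directed sum over (vertex, generator) counts each edge twice,
# matching A = (1/2) * sum_{x,y} a(x,y) |phi(x) - phi(y)|.
A = Fraction(sum(abs(phi(s) - phi(mul(s, g))) for s in verts for g in gens), 2)
t0 = median(phi(s) for s in verts)
B = sum(Fraction(abs(phi(s) - t0)) for s in verts)
print(A / B)  # 2/5 for n = 5
```

Only the transposition edges where $1$ or $2$ occupies position $1$ or $2$ contribute to $A$, which is the source of the $1/n$ saving over $B$, itself of size about $n!\cdot n/8$.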

(Incidentally, we see here an easy example of a sequence of graphs which forms an esperantist family, but is not an expander.)