
Estimating graph parameters via random walks
Roberto I. Oliveira (IMPA)
Joint with Anna Ben-Hamou (Sorbonne) & Yuval Peres
Mathematical Statistics and Learning 1 (2018), 375-399.
CIRM Workshop on Spectra, Algorithms and Random Walks on Random Networks, Luminy, 2020.
Motivation: lightweight algorithms for very large networks
(Figures: visualizations of very large networks. Sources: Facebook, 2011; OPTE Project, 2005.)

Reasons to use random walks
– Practical: already used (see e.g. Das Sarma et al., JACM'13).
– Mathematical: nice, well-understood objects (though we need new ideas).
How can one estimate the size of a graph using local information and in sublinear time?
Our results
Estimators for the # of vertices, edges & mixing time with nearly optimal time complexity.
Preliminaries
Graph
Definition: a graph is a pair of sets $G = (V_G, E_G)$ with:
1. $V_G \neq \emptyset$ (set of vertices);
2. $E_G \subset \binom{V_G}{2}$ (set of edges).
(We'll often drop the subscript $G$.)
Finite graph: $n_G := |V_G| < +\infty$ and $m_G := |E_G| < +\infty$.
Neighbors and degrees
Notation: given $G = (V_G, E_G)$ and $x, y \in V_G$,
1. $x, y$ are neighbors ($x \sim_G y$) if $\{x, y\} \in E_G$;
2. $\deg_G(x) :=$ # of neighbors of $x$.
What is a (lazy) random walk?
From the current vertex $x$: stay put with probability $1/2$; otherwise move to a uniformly chosen neighbor, i.e. to each neighbor with probability $1/(2\deg(x))$.

(Figure sequence: a lazy random walk illustrated step by step at times 0 through 7, with the probability of each transition, e.g. $1/6$, $1/2$, $1/10$, shown at each step.)
Multiple lazy random walks
Notation: given $G = (V, E)$ with $V \subset \mathbb{N}$ and $i \in \mathbb{N}$,
– $X^{(i)}_t :=$ position of the $i$-th random walker on $G$ at time $t$;
– $\deg^{(i)}_t :=$ degree of $X^{(i)}_t$;
– $\mathbb{P}^G_x :=$ probabilities for RWs on $G$ that all start from $X^{(i)}_0 = x$.
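As a concrete reference for the later estimators, here is a minimal simulation sketch (my own, not from the paper) of $K$ independent lazy random walks on a graph stored as a neighbor-list dictionary; the representation and helper names are illustrative assumptions.

```python
# Minimal sketch: K independent lazy random walks of length T on a graph.
import random

def lazy_walk_path(adj, x, T):
    """Trajectory (X_0, ..., X_T) of one lazy RW started at x.
    adj: dict mapping each vertex to a list of its neighbors."""
    path = [x]
    for _ in range(T):
        if random.random() >= 0.5:          # with prob. 1/2 move, else stay put
            x = random.choice(adj[x])
        path.append(x)
    return path

def run_walkers(adj, x, K, T):
    """K i.i.d. lazy walks from the same start x, plus the observed degrees."""
    paths = [lazy_walk_path(adj, x, T) for _ in range(K)]
    degrees = [[len(adj[v]) for v in p] for p in paths]
    return paths, degrees
```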
The problem
An estimator
An estimator is a map
$$EST: \bigcup_{K,T} (\mathbb{N}^2)^{K \times T} \to \mathbb{R}.$$
We write
$$\widehat{EST}_{K,T} := EST\big(X^{(i)}_t, \deg^{(i)}_t : i \le K,\ t \le T\big)$$
(estimate with $K$ RWs run for time $T$).
A good estimator
Given: a family $\mathcal{G}$ of graphs, $\gamma: \mathcal{G} \to \mathbb{R}$, $\epsilon > 0$ (e.g. $\gamma$ = # of vertices).
Want: for all $G \in \mathcal{G}$ there exist $K_0(G)$ and $T_0(G)$ such that for all $x \in V_G$, $K \ge K_0$, $T \ge T_0$,
$$\mathbb{P}^G_x\Big(\big|\widehat{EST}_{K,T} - \gamma(G)\big| \le \epsilon\,\gamma(G)\Big) \ge \frac{2}{3}.$$
Time complexity
$K_0(G)\,T_0(G)$ = minimal total # of RW steps for which the guarantee holds (want this as small as possible): for all $G \in \mathcal{G}$, all $x \in V_G$, $K \ge K_0$, $T \ge T_0$,
$$\mathbb{P}^G_x\Big(\big|\widehat{EST}_{K,T} - \gamma(G)\big| \le \epsilon\,\gamma(G)\Big) \ge \frac{2}{3}.$$
Best guess given a time complexity budget
"Given what my group of $K$ walkers has seen so far, our best guess is that $G$ has 2 billion vertices."
Natural questions
1. Do sublinear estimators always exist?
2. Why not ask that the estimator be "self-stopping", i.e. that it decides on its own when it has seen enough to estimate the parameter?
Two special cases
that illustrate intrinsic limitations
The case of cycles
(Figure: a cycle with 14 vertices.)
Find the size: consider many RWs started from the same vertex on $C_n$; same for $C_{2n}$.
Before wrapping around, both graphs look like $\mathbb{Z}$!
Wrapping around takes $\Omega(n^2)$ RW steps!

No sublinear estimator for cycles
Theorem: let $\mathcal{C} := \{\text{all cycles } C_n,\ n \in \mathbb{N}\}$. There exist $c, \epsilon > 0$ such that, if $\gamma(G) = n_G$ = # of vertices, then for every good estimator for $\mathcal{C}$ and every $n \in \mathbb{N}$:
either $K_0(C_n)\,T_0(C_n) \ge c\,n^2$, or $K_0(C_{2n})\,T_0(C_{2n}) \ge c\,n^2$.
The case of expanders
Cheeger constant or bottleneck ratio: for $d$-regular $G$, define
$$h_G := \min_{S \subset V_G,\ 0 < |S| \le n_G/2} \frac{\sum_{(x,y) \in S \times S^c} \mathbf{1}[x \sim_G y]}{d\,|S|}.$$
Thm (Pinsker): when $n \gg 1$, nearly all $d$-regular graphs $G$ on $n$ vertices have $h_G \ge h(d) > 0$.
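For intuition, a brute-force sketch (my own, purely illustrative) of the bottleneck ratio defined above for a small $d$-regular graph, enumerating every subset $S$ with $0 < |S| \le n/2$:

```python
# Brute-force bottleneck ratio h_G of a small d-regular graph (exponential in n).
from itertools import combinations

def bottleneck_ratio(adj, d):
    """adj: dict vertex -> set of neighbors of a d-regular graph."""
    V = list(adj)
    best = float("inf")
    for size in range(1, len(V) // 2 + 1):
        for S in combinations(V, size):
            S = set(S)
            boundary = sum(1 for x in S for y in adj[x] if y not in S)
            best = min(best, boundary / (d * len(S)))
    return best

# Example: the 3-regular complete bipartite graph K_{3,3}.
adj = {0: {3, 4, 5}, 1: {3, 4, 5}, 2: {3, 4, 5},
       3: {0, 1, 2}, 4: {0, 1, 2}, 5: {0, 1, 2}}
print(bottleneck_ratio(adj, d=3))  # 5/9, attained e.g. by S = {0, 1, 3}
```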
The case of expanders
Theorem: $\mathcal{G}_{d,h} := \{d\text{-regular } G \text{ with } h_G \ge h\}$ has good sublinear estimators with
$$K_0(G)\,T_0(G) \le C(h)\,\sqrt{n_G}$$
(which is optimal), but no "self-stopping" sublinear estimator.


Why no sublinear self-stopping estimator?
(Figure: a small 6-regular graph on $c$ vertices next to a huge 6-regular graph on $n - c$ vertices; one can keep $h_G \ge h > 0$ both before and after connecting them, so a walker in the small part cannot tell when it has seen enough. Source: Wikipedia, Paley graph.)


Takeaways from these examples
– Time complexity cannot be sublinear in general.
– For certain families, sublinear algorithms exist, but no self-stopping algorithm can be sublinear.
Lower bound for expanders
Theorem: let $\mathcal{G}_{d,h}$ be as above. If $\gamma(G) = n_G$ = # of vertices, then for every good estimator for $\mathcal{G}_{d,h}$ and every $n \in \mathbb{N}$:
either $K_0(G_n)\,T_0(G_n) \ge c\sqrt{n}$, or $K_0(G_{2n})\,T_0(G_{2n}) \ge c\sqrt{n}$,
where $G_n, G_{2n} \in \mathcal{G}_{d,h}$ have $n$ and $2n$ vertices (resp.).
Lower bound for expanders
Main idea:
– Generate a random $d$-regular graph on $n$ vertices one vertex at a time, along with the random walks.
– The graph is an expander w.h.p.
A lower bound for expanders
Coupling:
– Until a cycle is seen, the RWs see the infinite $d$-regular tree.
– A cycle takes $\Omega(\sqrt{n})$ steps to appear.
– Up to that time, one can couple with RWs on a graph of size $2n$.
General tools and ideas
What we need: methods relying on general properties of random walks, namely mixing and spectral properties of the RW.
Some theory for LRW
Transition matrix: for $x, y \in V_G$,
$$P_G(x,y) := \mathbb{P}^G_x(X_1 = y) = \begin{cases} \frac{1}{2} & \text{if } x = y;\\[2pt] \frac{1}{2\deg_G(x)} & \text{if } x \sim y;\\[2pt] 0 & \text{otherwise.} \end{cases}$$
Some theory for LRW
Transition matrix: if $G$ is $d$-regular (all degrees equal to $d$),
$$P_G = \frac{I}{2} + \frac{A}{2d},$$
where $A$ is the adjacency matrix.
Matrix powers
Fact I: for all times $t \ge 0$ and $x, y \in V_G$,
$$\mathbb{P}^G_x(X_t = y) = P_G^t(x,y),$$
so the spectrum of $P_G$ is important:
$$1 \ge \lambda^G_2 \ge \lambda^G_3 \ge \dots \ge \lambda^G_n \ge 0.$$
Spectral gap and relaxation time
Fact II: if $G$ is connected, $P_G$ has a spectral gap, meaning that the relaxation time
$$t^G_{\mathrm{rel}} := \frac{1}{1 - \lambda^G_2}$$
is well defined, and $t^G_{\mathrm{rel}} \le 2\,h_G^{-2}$ (Cheeger's inequality).
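A small numerical sketch (my own, not from the slides) that builds the lazy transition matrix $P = I/2 + A/(2d)$ of a $d$-regular graph and reads off $\lambda_2$ and $t_{\mathrm{rel}} = 1/(1-\lambda_2)$ from its spectrum:

```python
# Relaxation time of the lazy RW on a d-regular graph from its adjacency matrix.
import numpy as np

def lazy_relaxation_time(A):
    """A: symmetric 0/1 adjacency matrix (numpy array) of a connected d-regular graph."""
    d = int(A[0].sum())                       # common degree
    n = A.shape[0]
    P = 0.5 * np.eye(n) + A / (2.0 * d)       # lazy transition matrix, symmetric here
    eig = np.sort(np.linalg.eigvalsh(P))
    lam2 = eig[-2]                            # second-largest eigenvalue (largest is 1)
    return 1.0 / (1.0 - lam2)

# Example: the cycle C_8 (2-regular); for cycles t_rel grows like n^2.
n = 8
A = np.zeros((n, n))
for i in range(n):
    A[i, (i + 1) % n] = A[(i + 1) % n, i] = 1.0
print(lazy_relaxation_time(A))
```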
Ergodicity and the uniform mixing time
Theorem:
– If $G$ is connected, then as $t \to +\infty$,
$$P^t(x,y) \to \pi(y) := \frac{\deg(y)}{2m}.$$
– Moreover, there is a $t_{\mathrm{unif}} = O(t_{\mathrm{rel}} \log n)$ such that for all $t \ge t_{\mathrm{unif}}$ and $x, y \in V_G$:
$$\frac{3}{4} \le \frac{P^t(x,y)}{\pi(y)} \le \frac{5}{4}.$$
Real-life social networks
– Leskovec, Lang, Dasgupta & Mahoney (Internet Mathematics 2009): evidence of polylog relaxation time in real-life social networks.
The case of regular graphs: collisions vs. intersections
Regular graphs
Corollary: assume $G$ is regular. Then
$$P^t(x,y) \to \frac{1}{n}, \qquad\text{and}\qquad t \gg t_{\mathrm{unif}} \;\Rightarrow\; P^t(x,y) \sim \frac{1}{n}.$$
Estimating the number of vertices
Idea #1: collision counts.
Take i.i.d. LRWs $X^{(1)}_t, \dots, X^{(K)}_t$ and set
$$C^K_t := \sum_{i < j \le K} \mathbf{1}\big[X^{(i)}_t = X^{(j)}_t\big],$$
which counts collisions at time $t$.
Estimating the number of vertices
Idea #1: collision counts.
$$C^K_t := \sum_{i < j \le K} \mathbf{1}\big[X^{(i)}_t = X^{(j)}_t\big].$$
For $t \gg t_{\mathrm{unif}}$ and $K \gg \sqrt{n}$:
$$\mathbb{E}^G_x\big[C^K_t\big] \approx \binom{K}{2}\frac{1}{n} \qquad\text{and}\qquad \mathrm{Var}^G_x\big(C^K_t\big) \ll \Big(\mathbb{E}^G_x\big[C^K_t\big]\Big)^2.$$
Estimating the number of vertices
Idea #1: collision counts.
Upshot: once $K$ and $t$ are large enough,
$$\widehat{N}_t := \frac{\binom{K}{2}}{C^K_t} \sim n$$
with high probability.
About $t_{\mathrm{unif}}\,\sqrt{n}$ total RW steps.
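A hedged simulation sketch (my own, not from the paper) of the collision estimator $\widehat{N}_t = \binom{K}{2}/C^K_t$: run $K$ independent lazy walks for $t \gg t_{\mathrm{unif}}$ steps from the same start vertex and count pairwise collisions at time $t$. The graph and parameter choices below are illustrative.

```python
# Idea #1 (collision counts): assumes t >> t_unif and K >> sqrt(n).
import random
from math import comb

def lazy_walk_endpoint(adj, x, t):
    """Position X_t of a lazy random walk started at x; adj: vertex -> list of neighbors."""
    for _ in range(t):
        if random.random() >= 0.5:
            x = random.choice(adj[x])
    return x

def estimate_n_collisions(adj, x, K, t):
    ends = [lazy_walk_endpoint(adj, x, t) for _ in range(K)]
    collisions = sum(1 for i in range(K) for j in range(i + 1, K) if ends[i] == ends[j])
    return comb(K, 2) / collisions if collisions else float("inf")

# Example: the 10-dimensional hypercube (1024 vertices, fast mixing); K is a few sqrt(n).
d = 10
n = 1 << d
adj = {v: [v ^ (1 << j) for j in range(d)] for v in range(n)}
print(estimate_n_collisions(adj, x=0, K=200, t=200))  # rough estimate of n = 1024
```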


Collisions in examples

              Collisions        Lower bnd
  Cycles      $n^{5/2}$         $n^2$
  Expanders   $\sqrt{n}$        $\sqrt{n}$
Estimating the number of vertices
Idea #1: collision counts.
– The general idea has been explored by several groups; see e.g. Katzir, Liberty, Somekh & Cosma (Internet Math'13).
– It can be extended to non-regular graphs.
– Can one do better?
Estimating the number of vertices
Idea #2: intersection counts (our paper).
LRWs $X^{(1)}, \dots, X^{(K)}$ with $K = 2k$. Set
$$I^{(i)}_t := \sum_{0 \le s_1, s_2 \le t-1} \mathbf{1}\big[X^{(2i-1)}_{s_1} = X^{(2i)}_{s_2}\big], \qquad I_t := \frac{1}{k} \sum_{1 \le i \le k} I^{(i)}_t.$$
A theorem on intersection counts
Theorem (Ben-Hamou, O., Peres): for regular graphs,
$$\frac{t^2}{n} \le \mathbb{E}^G_x\big[I^{(i)}_t\big] \le \frac{t^2}{n} + C\,t_{\mathrm{rel}}^{3/2}$$
and
$$\mathrm{Var}^G_x\big(I^{(i)}_t\big) \le 4 \max_{y \in V_G} \Big(\mathbb{E}^G_y\big[I^{(i)}_t\big]\Big)^2.$$
(Improves Peres, Sauerwald, Sousi & Stauffer.)
Estimating the number of vertices
Theorem (Ben-Hamou, O., Peres): for a regular graph, define $I_t$ as above and let $\widehat{EST}_{K,t} := t^2/I_t$. If $t \ge C_\epsilon\,\sqrt{n}\;t_{\mathrm{rel}}^{3/4}$ and $K \ge C_\epsilon$, then
$$\mathbb{P}^G_x\Big((1-\epsilon)\,n \le \frac{t^2}{I_t} \le (1+\epsilon)\,n\Big) \ge \frac{2}{3}.$$
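A hedged simulation sketch (my own, not from the paper) of the intersection estimator $t^2/I_t$: pair up $2k$ lazy walks, count the coincidences between the full trajectories of each pair, average over pairs, and invert. The graph and parameters below are illustrative.

```python
# Idea #2 (intersection counts) on a regular graph; assumes t >= C * sqrt(n) * t_rel^(3/4).
import random
from collections import Counter

def lazy_path(adj, x, t):
    """Trajectory (X_0, ..., X_{t-1}) of a lazy random walk started at x."""
    path = [x]
    for _ in range(t - 1):
        x = x if random.random() < 0.5 else random.choice(adj[x])
        path.append(x)
    return path

def intersections(path1, path2):
    """I_t^(i): number of pairs (s1, s2) with path1[s1] == path2[s2]."""
    counts = Counter(path1)
    return sum(counts[v] for v in path2)

def estimate_n_intersections(adj, x, k, t):
    I = [intersections(lazy_path(adj, x, t), lazy_path(adj, x, t)) for _ in range(k)]
    I_bar = sum(I) / k
    return t * t / I_bar if I_bar else float("inf")

# Example: the 10-dimensional hypercube (10-regular, n = 1024, t_rel of order d).
d = 10
n = 1 << d
adj = {v: [v ^ (1 << j) for j in range(d)] for v in range(n)}
print(estimate_n_intersections(adj, x=0, k=10, t=2000))  # rough estimate of n = 1024
```

Note that, unlike the collision estimator, the number of walkers here stays constant and only the trajectory length grows with $n$.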
Comparison for general regular graphs

                      Collisions                       Intersections
  Steps per RW        $t_{\mathrm{unif}}$              $\sqrt{n}\;t_{\mathrm{rel}}^{3/4}$
  # of walkers        $\sqrt{n}$                       const.
  Total # of steps    $\sqrt{n}\,t_{\mathrm{unif}}$    $\sqrt{n}\;t_{\mathrm{rel}}^{3/4}$

(sublinear time estimator when $t_{\mathrm{rel}} \ll n^{2/3}$)


Intersections in examples

              Intersections     Lower bnd
  Cycles      $n^2$             $n^2$
  Expanders   $\sqrt{n}$        $\sqrt{n}$
Lower bounds for "all" relaxation times (regular case)
Construction (sketch): start from a 3-regular expander on $a$ (or $2a$) vertices; in place of each edge, put a path of length $k$; add edges ("handles"/stars) so that the graph stays 3-regular. For the resulting graphs $G_a$ and $G_{2a}$:
$$n_G = \Theta(ak), \qquad t^G_{\mathrm{rel}} = \Theta(k^2)$$
(the paths slow the RW down by a $k^2$ factor), and no RW-based estimator can distinguish $G_a$ from $G_{2a}$ in fewer than
$$\mathbf{time} = \Omega\big(k^2\sqrt{a}\big) = \Omega\big((k^2)^{3/4}\,(ka)^{1/2}\big) = \Omega\big(t_{\mathrm{rel}}^{3/4}\,\sqrt{n_G}\big)$$
steps, matching the upper bound.
Sketch of first moment upper bounds
– See the appendix slides at the end of the deck.
Takeaways from the regular case
– Collisions give suboptimal bounds (and also require adding more RWs over time).
– Intersections use whole paths and are optimal for "all" values of the relaxation time.
Missing: results for non-regular graphs

Source: OPTE Project (2005)


Non-regular graphs
Estimating the number of edges
Weighted intersection counts:
LRWs $X^{(1)}, \dots, X^{(K)}$ with $K = 2k$. Set
$$\mathcal{I}^{(i)}_t := \sum_{0 \le s_1, s_2 \le t-1} \frac{\mathbf{1}\big[X^{(2i-1)}_{s_1} = X^{(2i)}_{s_2}\big]}{\deg_G\big(X^{(2i)}_{s_2}\big)}, \qquad \mathcal{I}_t := \frac{1}{k} \sum_{1 \le i \le k} \mathcal{I}^{(i)}_t.$$
A theorem on weighted intersection counts
Theorem (Ben-Hamou, O., Peres): for a graph with minimal degree $d$,
$$\frac{t^2}{2m} \le \mathbb{E}^G_x\big[\mathcal{I}^{(i)}_t\big] \le \frac{t^2}{2m} + \frac{C\,t_{\mathrm{rel}}^{3/2}}{d}$$
and
$$\mathrm{Var}^G_x\big(\mathcal{I}^{(i)}_t\big) \le 4 \max_{y \in V_G} \Big(\mathbb{E}^G_y\big[\mathcal{I}^{(i)}_t\big]\Big)^2.$$
Estimating the number of edges
Theorem (Ben-Hamou, O., Peres): for a graph with minimal degree $d$, define $\mathcal{I}_t$ as above and let $\widehat{EST}_{K,t} := t^2/(2\mathcal{I}_t)$. If $t \ge C_\epsilon\,\sqrt{m/d}\;t_{\mathrm{rel}}^{3/4}$ and $K \ge C_\epsilon$, then
$$\mathbb{P}^G_x\Big((1-\epsilon)\,m \le \frac{t^2}{2\mathcal{I}_t} \le (1+\epsilon)\,m\Big) \ge \frac{2}{3}.$$
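A hedged simulation sketch (my own, not from the paper) of the edge estimator $t^2/(2\mathcal{I}_t)$, where each coincidence between paired trajectories is weighted by $1/\deg$ of the vertex at which it occurs. The test graph and parameters are illustrative.

```python
# Edge count via weighted intersections on a (possibly non-regular) graph.
import random
from collections import Counter

def lazy_path(adj, x, t):
    path = [x]
    for _ in range(t - 1):
        x = x if random.random() < 0.5 else random.choice(adj[x])
        path.append(x)
    return path

def weighted_intersections(adj, path1, path2):
    """Sum over coincidences (s1, s2), each weighted by 1/deg of the vertex on path2."""
    counts = Counter(path1)
    return sum(counts[v] / len(adj[v]) for v in path2)

def estimate_m(adj, x, k, t):
    I = [weighted_intersections(adj, lazy_path(adj, x, t), lazy_path(adj, x, t))
         for _ in range(k)]
    I_bar = sum(I) / k
    return t * t / (2 * I_bar) if I_bar else float("inf")

# Example: a non-regular graph -- the cycle C_200 plus a clique on its 100 even vertices.
n = 200
adj = {i: {(i - 1) % n, (i + 1) % n} for i in range(n)}
for u in range(0, n, 2):
    for v in range(u + 2, n, 2):
        adj[u].add(v); adj[v].add(u)
adj = {v: sorted(nbrs) for v, nbrs in adj.items()}
m = sum(len(nbrs) for nbrs in adj.values()) // 2   # true m = 5150 here
print(estimate_m(adj, x=0, k=10, t=3000), "vs true m =", m)
```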
Estimating the number of edges
Theorem (Ben-Hamou, O., Peres): using return probability bounds by Lyons & Oveis Gharan and a slightly different $\mathcal{I}_t$:
if $t \ge C_\epsilon\,\sqrt{n}\;t_{\mathrm{unif}}^{5/6}$ and $K \ge C_\epsilon$, then
$$\mathbb{P}^G_x\Big((1-\epsilon)\,m \le \frac{t^2}{2\mathcal{I}_t} \le (1+\epsilon)\,m\Big) \ge \frac{2}{3}.$$
Complexity for the number of edges
Corollary (loosely stated): there is a good estimator for $m_G$ with $K \sim C_\epsilon$ and
$$T \sim C_\epsilon\,\sqrt{n}\;t_{\mathrm{unif}}^{5/6} \;\wedge\; C_\epsilon\,\sqrt{\tfrac{m}{d}}\;t_{\mathrm{rel}}^{3/4}.$$
The second bound is attained by regular graphs.


Lower bound for each possible mixing time $t_{\mathrm{unif}}$
1. Start with a 3-regular expander $E_k$ of size $k$;
2. Replace each node of $E_k$ by a clique $K_q$ of size $q$;
3. Replace each edge of $E_k$ by a path of length $q$.
Then
$$n_G = \Theta(qk), \qquad t^G_{\mathrm{rel}} = \Theta(q^3)$$
(the cliques and paths slow the RW down), and no RW is able to distinguish $G_{k,q}$ and $G_{2k,q}$ ($G_k$ or $G_{2k}$: based on an expander of size $k$ or $2k$) in fewer than
$$\mathbf{time} = \Omega\big(q^3\sqrt{k}\big) = \Omega\big((q^3)^{5/6}\,(qk)^{1/2}\big)$$
steps.
What about the number of vertices?
Can do it with extra steps: the ratio
$$\frac{n}{2m} = \sum_y \frac{\pi(y)}{\deg(y)}$$
can be estimated from a single trajectory via Markov chain Chernoff bounds.
About $t_{\mathrm{unif}}\,m/n$ additional steps.
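A minimal sketch (my own, not from the paper) of this extra step: since $n/(2m) = \sum_y \pi(y)/\deg(y) = \mathbb{E}_\pi[1/\deg]$, average $1/\deg(X_s)$ along one long lazy walk and multiply by $2\widehat{m}$, where $\widehat{m}$ comes from the edge estimator above. The burn-in and trajectory length are illustrative stand-ins for the Chernoff-bound analysis.

```python
# Estimate n = 2m * E_pi[1/deg] from a single long lazy walk and an edge estimate m_hat.
import random

def estimate_n_from_m(adj, x, m_hat, burn_in, length):
    # Burn-in of order t_unif so the walk is roughly stationary (assumption).
    for _ in range(burn_in):
        x = x if random.random() < 0.5 else random.choice(adj[x])
    acc = 0.0
    for _ in range(length):
        acc += 1.0 / len(adj[x])            # running sum of 1/deg(X_s)
        x = x if random.random() < 0.5 else random.choice(adj[x])
    return 2.0 * m_hat * acc / length       # n ~ 2m * E_pi[1/deg]
```

Here adj is the same neighbor-list representation used in the earlier sketches.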
Extra time is necessary for vertices
Time $\sqrt{n}\;t_{\mathrm{unif}}^{5/6}$ is not enough.
$G_q$ or $G_{2q}$: a clique of size $k$ with paths of length $q$ or $2q$ attached to each vertex ($k \gg q$). Then
$$n_G = \Theta(kq), \qquad m_G = \Theta(k^2), \qquad t^G_{\mathrm{unif}} = \Theta(q^2),$$
and distinguishing the two requires
$$\mathbf{time} = \Omega(qk) = \Omega\Big(\frac{t_{\mathrm{unif}}\,m}{n}\Big)$$
(the walk has to travel along the length-$q$ paths).
Complexity of the number of vertices
Corollary (loosely stated): there is a good estimator for $n_G$ with $K \sim C_\epsilon$ and
$$T \sim C_\epsilon\,\sqrt{n}\;t_{\mathrm{unif}}^{5/6} \;\wedge\; C_\epsilon\,\sqrt{\tfrac{m}{d}}\;t_{\mathrm{rel}}^{3/4} \;+\; C_\epsilon\,\frac{t_{\mathrm{unif}}\,m}{n},$$
sharp up to logarithmic factors.
Estimating the mixing time
$L^2$ distance to equilibrium from $x$:
$$d^2_x(t) := \sum_y \Big(\frac{P^t(x,y)}{\pi(y)} - 1\Big)^2 \pi(y).$$
After suitable manipulations, this becomes
$$\frac{d^2_x(t)}{2m} = \frac{P^{2t}(x,x)}{\deg(x)} - \frac{1}{2m}$$
(which can be related to intersections).
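The manipulation uses reversibility, $\pi(x)P^t(x,y) = \pi(y)P^t(y,x)$, and $\pi(x) = \deg(x)/(2m)$; a short reconstruction of the computation:

\begin{align*}
d^2_x(t) &= \sum_y \Big(\tfrac{P^t(x,y)}{\pi(y)} - 1\Big)^2 \pi(y)
          = \sum_y \frac{P^t(x,y)^2}{\pi(y)} - 1 \\
         &= \frac{1}{\pi(x)} \sum_y P^t(x,y)\,P^t(y,x) - 1
          = \frac{P^{2t}(x,x)}{\pi(x)} - 1
          = 2m\,\Big(\frac{P^{2t}(x,x)}{\deg(x)} - \frac{1}{2m}\Big).
\end{align*}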
Conclusion
and open problems
Summary of results
We can estimate the # of vertices, edges and mixing time via multiple RWs:
1. no self-stopping in general;
2. sometimes (but not always) sublinear.
Main open problems
– What about other parameters?
– What other models of local access to the graph? (e.g. "cluster growth")
Sketch of first moment
intersection bound
for regular graphs
Sketch of first moment bound
Recall: $X^{(1)}, \dots, X^{(K)}$ on a regular graph with $K = 2k$, and
$$\mathbb{E}^G_x\big[I^{(i)}_t\big] = \sum_{0 \le s_1, s_2 \le t-1} \mathbb{P}^G_x\big(X^{(2i-1)}_{s_1} = X^{(2i)}_{s_2}\big).$$
Claim:
$$\frac{t^2}{n} \le \mathbb{E}^G_x\big[I^{(i)}_t\big] \le \frac{t^2}{n} + C\,t_{\mathrm{rel}}^{3/2}.$$
Sketch of first moment bound
Intersection and return probabilities:
$$\mathbb{P}^G_x\big(X^{(1)}_{s_1} = X^{(2)}_{s_2}\big) = \sum_y P^{s_1}(x,y)\,P^{s_2}(x,y) \quad\text{(meet at some $y$)}$$
$$= \sum_y P^{s_1}(x,y)\,P^{s_2}(y,x) \quad\text{(symmetry of $P$)}$$
$$= P^{s_1+s_2}(x,x) \quad\text{(decompose returns).}$$
Sketch of first moment bound
Spectral decomposition of $P$:
$$P^s(x,x) = \sum_i \lambda_i^s\,\psi_i^2(x), \quad\text{where}$$
– $\lambda^G_1 = 1 > \lambda^G_2 \ge \lambda^G_3 \ge \dots \ge \lambda^G_n \ge 0$;
– the $\psi_i$ are eigenvectors, with $\psi_1^2(x) = 1/n$.
$$\Rightarrow\quad P^s(x,x) - \frac{1}{n} = \sum_{i \ge 2} \lambda_i^s\,\psi_i^2(x) \ge 0.$$
Sketch of first moment bound
Intersection and return probabilities:
$$\mathbb{E}^G_x\big[I^{(1)}_t\big] = \sum_{0 \le s_1, s_2 \le t-1} \mathbb{P}^G_x\big(X^{(1)}_{s_1} = X^{(2)}_{s_2}\big) = \frac{t^2}{n} + \sum_{s_1, s_2 \le t-1} \Big(P^{s_1+s_2}(x,x) - \frac{1}{n}\Big)$$
$$\le \frac{t^2}{n} + \sum_{s \ge 0} (s+1)\Big(P^s(x,x) - \frac{1}{n}\Big).$$
Sketch of first moment bound
Back to the spectral representation:
$$\sum_{s \ge 0} (s+1)\Big(P^s(x,x) - \frac{1}{n}\Big) = \sum_{s \ge 0} (s+1) \sum_{i \ge 2} \lambda_i^s\,\psi_i^2(x),$$
with all $0 \le \lambda_i \le \lambda_2 = 1 - t_{\mathrm{rel}}^{-1}$
(so the terms in the sum decay).
Sketch of first moment bound
Truncation argument:
$$\sum_{s \ge 0} (s+1)\Big(P^s(x,x) - \frac{1}{n}\Big) = \sum_{s \ge 0} (s+1) \sum_{i \ge 2} \lambda_i^s\,\psi_i^2(x)$$
(with all $0 \le \lambda_i \le \lambda_2 = 1 - t_{\mathrm{rel}}^{-1}$)
$$\le C \sum_{0 \le s \le t_{\mathrm{rel}}} (s+1) \sum_{i \ge 2} \lambda_i^s\,\psi_i^2(x) = C \sum_{0 \le s \le t_{\mathrm{rel}}} (s+1)\Big(P^s(x,x) - \frac{1}{n}\Big).$$
Sketch of first moment bound
Return probability estimate:
$$P^s(x,x) - \frac{1}{n} \le \frac{C}{\sqrt{s+1}} \quad\text{(Aldous-Fill '94)}$$
$$\Rightarrow\quad \sum_{0 \le s \le t_{\mathrm{rel}}} (s+1)\Big(P^s(x,x) - \frac{1}{n}\Big) \le \sum_{0 \le s \le t_{\mathrm{rel}}} C\,\sqrt{s+1} \le C\,t_{\mathrm{rel}}^{3/2}.$$
Sketch of first moment bound
End of sketch:
$$\mathbb{E}^G_x\big[I^{(1)}_t\big] \le \frac{t^2}{n} + \sum_{s \ge 0}(s+1)\Big(P^s(x,x) - \frac{1}{n}\Big)$$
$$\le \frac{t^2}{n} + C \sum_{0 \le s \le t_{\mathrm{rel}}}(s+1)\Big(P^s(x,x) - \frac{1}{n}\Big) \le \frac{t^2}{n} + C\,t_{\mathrm{rel}}^{3/2}.$$
Non-regular graphs
Weighted intersections: for minimal degree $d$,
$$\mathcal{I}^{(i)}_t := \sum_{0 \le s_1, s_2 \le t-1} \frac{\mathbf{1}\big[X^{(2i-1)}_{s_1} = X^{(2i)}_{s_2}\big]}{\deg_G\big(X^{(2i)}_{s_2}\big)},$$
and the return probability bound
$$P^s(x,x) - \pi(x) \le \frac{C\,\deg(x)}{d\,\sqrt{s+1}} \quad\text{(Peres-O. '19)}$$
plays the role of the Aldous-Fill bound in the sketch above.
Thank you!
