Anthony Bonato, Paweł Prałat, Andrei Raigorodskii (Eds.)

LNCS 10836

Algorithms and Models for the Web Graph
15th International Workshop, WAW 2018
Moscow, Russia, May 17–18, 2018
Proceedings
Lecture Notes in Computer Science 10836
Commenced Publication in 1973
Founding and Former Series Editors:
Gerhard Goos, Juris Hartmanis, and Jan van Leeuwen

Editorial Board
David Hutchison
Lancaster University, Lancaster, UK
Takeo Kanade
Carnegie Mellon University, Pittsburgh, PA, USA
Josef Kittler
University of Surrey, Guildford, UK
Jon M. Kleinberg
Cornell University, Ithaca, NY, USA
Friedemann Mattern
ETH Zurich, Zurich, Switzerland
John C. Mitchell
Stanford University, Stanford, CA, USA
Moni Naor
Weizmann Institute of Science, Rehovot, Israel
C. Pandu Rangan
Indian Institute of Technology Madras, Chennai, India
Bernhard Steffen
TU Dortmund University, Dortmund, Germany
Demetri Terzopoulos
University of California, Los Angeles, CA, USA
Doug Tygar
University of California, Berkeley, CA, USA
Gerhard Weikum
Max Planck Institute for Informatics, Saarbrücken, Germany
More information about this series at http://www.springer.com/series/7407
Editors

Anthony Bonato
Department of Mathematics, Ryerson University, Toronto, ON, Canada

Paweł Prałat
Department of Mathematics, Ryerson University, Toronto, ON, Canada

Andrei Raigorodskii
Department of Discrete Mathematics, Moscow Institute of Physics and Technology, Dolgoprudny, Russia

ISSN 0302-9743 ISSN 1611-3349 (electronic)


Lecture Notes in Computer Science
ISBN 978-3-319-92870-8 ISBN 978-3-319-92871-5 (eBook)
https://doi.org/10.1007/978-3-319-92871-5

Library of Congress Control Number: 2018944417

LNCS Sublibrary: SL1 – Theoretical Computer Science and General Issues

© Springer International Publishing AG, part of Springer Nature 2018


This work is subject to copyright. All rights are reserved by the Publisher, whether the whole or part of the
material is concerned, specifically the rights of translation, reprinting, reuse of illustrations, recitation,
broadcasting, reproduction on microfilms or in any other physical way, and transmission or information
storage and retrieval, electronic adaptation, computer software, or by similar or dissimilar methodology now
known or hereafter developed.
The use of general descriptive names, registered names, trademarks, service marks, etc. in this publication
does not imply, even in the absence of a specific statement, that such names are exempt from the relevant
protective laws and regulations and therefore free for general use.
The publisher, the authors and the editors are safe to assume that the advice and information in this book are
believed to be true and accurate at the date of publication. Neither the publisher nor the authors or the editors
give a warranty, express or implied, with respect to the material contained herein or for any errors or
omissions that may have been made. The publisher remains neutral with regard to jurisdictional claims in
published maps and institutional affiliations.

Printed on acid-free paper

This Springer imprint is published by the registered company Springer International Publishing AG
part of Springer Nature
The registered company address is: Gewerbestrasse 11, 6330 Cham, Switzerland
Preface

The 15th Workshop on Algorithms and Models for the Web Graph (WAW 2018) took
place at the Moscow Institute of Physics and Technology, Russia, May 17–18, 2018.
This is an annual meeting, which is traditionally co-located with another, related,
conference. WAW 2018 was co-located with the Workshop on Graphs, Networks, and
Their Applications. The co-location of the two workshops provided opportunities for
researchers in two different but interrelated areas to interact and to exchange research
ideas. It was an effective venue for the dissemination of new results and for fostering
research collaboration.
The World Wide Web has become part of our everyday life, and information
retrieval and data mining on the Web are now of enormous practical interest. The
algorithms supporting these activities combine the view of the Web as a text repository
and as a graph, induced in various ways by links among pages, hosts and users. The
aim of the workshop was to further the understanding of graphs that arise from the Web
and various user activities on the Web, and stimulate the development of
high-performance algorithms and applications that exploit these graphs. The workshop
gathered together researchers working on graph-theoretic and algorithmic aspects of
related complex networks, including social networks, citation networks, biological
networks, molecular networks, and other networks arising from the Internet.
This volume contains the papers presented during the workshop. Each submission
was reviewed by Program Committee members. Papers were submitted and reviewed
using the EasyChair online system. The committee members accepted 11 papers.

May 2018

Anthony Bonato
Paweł Prałat
Andrei Raigorodskii
Organization

General Chairs
Andrei Z. Broder Google Research, USA
Fan Chung Graham University of California San Diego, USA

Organizing Committee
Anthony Bonato Ryerson University, Canada
Paweł Prałat Ryerson University, Canada
Andrei Raigorodskii MIPT, Russia

Program Committee
Konstantin Avratchenkov Inria, France
Paolo Boldi University of Milan, Italy
Anthony Bonato Ryerson University, Canada
Milan Bradonjic Bell Labs, USA
Fan Chung Graham UC San Diego, USA
Colin Cooper King’s College London, UK
Andrzej Dudek Western Michigan University, USA
Alan Frieze Carnegie Mellon University, USA
Aristides Gionis Aalto University, Finland
David Gleich Purdue University, USA
Jeannette Janssen Dalhousie University, Canada
Bogumil Kaminski Warsaw School of Economics, Poland
Ravi Kumar Google Research, USA
Silvio Lattanzi Google Research, USA
Marc Lelarge Inria, France
Stefano Leonardi Sapienza University of Rome, Italy
Nelly Litvak University of Twente, The Netherlands
Michael Mahoney UC Berkeley, USA
Oliver Mason NUI Maynooth, Ireland
Dieter Mitsche Université de Nice Sophia-Antipolis, France
Peter Morters University of Bath, UK
Tobias Mueller Utrecht University, The Netherlands
Liudmila Ostroumova Yandex, Russia
Pan Peng TU Dortmund, Germany
Xavier Perez-Gimenez University of Nebraska-Lincoln, USA
Pawel Pralat Ryerson University, Canada
Yana Volkovich AppNexus, USA
Stephen Young Pacific Northwest National Laboratory, USA

Sponsoring Institutions

Microsoft Research New England, USA


Google Research, USA
Moscow Institute of Physics and Technology, Russia
Yandex, Russia
Internet Mathematics
Contents

Finding Induced Subgraphs in Scale-Free Inhomogeneous Random Graphs . . . . 1
Ellen Cardinaels, Johan S. H. van Leeuwaarden, and Clara Stegehuis

The Asymptotic Normality of the Global Clustering Coefficient in Sparse Random Intersection Graphs . . . . 16
Mindaugas Bloznelis and Jerzy Jaworski

Clustering Properties of Spatial Preferential Attachment Model . . . . 30
Lenar Iskhakov, Bogumił Kamiński, Maksim Mironov, Paweł Prałat, and Liudmila Prokhorenkova

Parameter Estimators of Sparse Random Intersection Graphs with Thinned Communities . . . . 44
Joona Karjalainen, Johan S. H. van Leeuwaarden, and Lasse Leskelä

Joint Alignment from Pairwise Differences with a Noisy Oracle . . . . 59
Michael Mitzenmacher and Charalampos E. Tsourakakis

Analysis of Relaxation Time in Random Walk with Jumps . . . . 70
Konstantin Avrachenkov and Ilya Bogdanov

QAP Analysis of Company Co-mention Network . . . . 83
S. P. Sidorov, A. R. Faizliev, V. A. Balash, A. A. Gudkov, A. Z. Chekmareva, M. Levshunov, and S. V. Mironov

Towards a Systematic Evaluation of Generative Network Models . . . . 99
Thomas Bläsius, Tobias Friedrich, Maximilian Katzmann, Anton Krohmer, and Jonathan Striebel

Dynamic Competition Networks: Detecting Alliances and Leaders . . . . 115
Anthony Bonato, Nicole Eikmeier, David F. Gleich, and Rehan Malik

An Experimental Study of the k-MXT Algorithm with Applications to Clustering Geo-Tagged Data . . . . 145
Colin Cooper and Ngoc Vu

A Statistical Performance Analysis of Graph Clustering Algorithms . . . . 170
Pierre Miasnikof, Alexander Y. Shestopaloff, Anthony J. Bonner, and Yuri Lawryshyn

Author Index . . . . 185


Finding Induced Subgraphs in Scale-Free Inhomogeneous Random Graphs

Ellen Cardinaels, Johan S. H. van Leeuwaarden, and Clara Stegehuis

Eindhoven University of Technology, Eindhoven, The Netherlands
C.Stegehuis@tue.nl

Abstract. We study the induced subgraph isomorphism problem on inhomogeneous random graphs with infinite-variance power-law degrees. We provide a fast algorithm that determines, for any connected graph H on k vertices, whether H occurs as an induced subgraph in a random graph with n vertices. By exploiting the scale-free graph structure, the algorithm runs in O(nk) time for small values of k. We test our algorithm on several real-world data sets.

1 Introduction
The induced subgraph isomorphism problem asks whether a large graph G contains a connected graph H as an induced subgraph. When k is allowed to grow with the graph size n, this problem is NP-hard in general. For example, k-clique and k-induced cycle, special cases of H, are known to be NP-hard [13,20]. For fixed k, this problem can be solved in polynomial time O(n^k) by searching for H on all possible combinations of k vertices. Several randomized and non-randomized algorithms exist that improve upon this trivial way of finding H [14,25,27,29].
On real-world networks, many algorithms have been observed to run much faster than their worst-case running times predict. This may be ascribed to some of the properties that many real-world networks share [4], such as the power-law degree distribution found in many networks [1,8,19,28]. One way of exploiting these power-law degree distributions is to design algorithms that work well on random graphs with power-law degree distributions. For example, finding the largest clique in a network is NP-complete for general networks [20]. However, in random graph models such as the Erdős–Rényi random graph and the inhomogeneous random graph, the specific structure can be exploited to design fixed-parameter tractable (FPT) algorithms that efficiently find a clique of size k [10,12] or the largest independent set [15].
In this paper, we study algorithms that are designed to perform well on the inhomogeneous random graph, a random graph model that can generate graphs with a power-law degree distribution [2,3,5,6,24,26]. The inhomogeneous random graph has a densely connected core containing many cliques, consisting of vertices with degrees √(n log(n)) and larger. In this densely connected core, the probability of an edge being present is close to one, so that the core contains many complete graphs [18]. This observation was exploited in [11] to efficiently determine whether a clique of size k occurs as a subgraph of an inhomogeneous random graph. When searching for induced subgraphs, however, some edges are required to be absent. Therefore, searching for induced subgraphs in the entire core is not efficient. We show that a connected subgraph H can be found as an induced subgraph by scanning only vertices that are on the boundary of the core: vertices with degrees proportional to √n.

© Springer International Publishing AG, part of Springer Nature 2018
A. Bonato et al. (Eds.): WAW 2018, LNCS 10836, pp. 1–15, 2018.
https://doi.org/10.1007/978-3-319-92871-5_1
We present an algorithm that first selects the set of vertices with degrees proportional to √n, and then randomly searches for H as an induced subgraph on a subset of k of those vertices. The first algorithm we present does not depend on the specific structure of H. For general sparse graphs, the best known algorithms that solve subgraph isomorphism on 3 or 4 vertices run in O(n^1.41) or O(n^1.51) time with high probability [29]. For small values of k, our algorithm solves subgraph isomorphism on k nodes in linear time with high probability on inhomogeneous random graphs. However, the graph size needs to be very large for our algorithm to perform well. We therefore present a second algorithm that again selects the vertices with degrees proportional to √n, and then searches for induced subgraph H in a more efficient way. This algorithm has the same performance guarantee as our first algorithm, but performs much better in simulations.

We test our algorithm on large inhomogeneous random graphs, where it indeed efficiently finds induced subgraphs. We also test our algorithm on real-world network data with power-law degrees. There our algorithm does not perform well, probably because the densely connected core of some real-world networks does not consist of the vertices of degrees at least proportional to √n. We then show that a slight modification of our algorithm, which looks for induced subgraphs on vertices of degrees proportional to n^γ for some other value of γ, performs better on real-world networks, where the value of γ depends on the specific network.
Notation. We say that a sequence of events (E_n)_{n≥1} happens with high probability (w.h.p.) if lim_{n→∞} P(E_n) = 1. Furthermore, we write f(n) = o(g(n)) if lim_{n→∞} f(n)/g(n) = 0, and f(n) = O(g(n)) if |f(n)|/g(n) is uniformly bounded, where (g(n))_{n≥1} is nonnegative. Similarly, if lim sup_{n→∞} |f(n)|/g(n) > 0, we say that f(n) = Ω(g(n)) for nonnegative (g(n))_{n≥1}. We write f(n) = Θ(g(n)) if f(n) = O(g(n)) as well as f(n) = Ω(g(n)).

1.1 Model
As a random graph null model, we use the inhomogeneous random graph or
hidden variable model [2,3,5,6,24,26]. Every vertex is equipped with a weight.
We assume that the weights are i.i.d. samples from the power-law distribution
P (wi > k) = Ck 1−τ (1.1)
for some constant C and for τ ∈ (2, 3). Two vertices with weights w and w are
connected with probability
 
 ww
p(w, w ) = min ,1 , (1.2)
μn
Finding Induced Subgraphs in Scale-Free Inhomogeneous Random Graphs 3

where μ denotes the mean value of the power-law distribution (1.1). Choosing
the connection probability in this way ensures that the expected degree of a
vertex with weight w is w.
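The model of Section 1.1 is straightforward to simulate; the following is a minimal sketch, assuming Pareto-distributed weights with minimum weight `w_min` (the function name and the O(n²) pair loop are illustrative choices, not part of the paper):

```python
import random, math

def inhomogeneous_random_graph(n, tau=2.5, w_min=1.0, seed=0):
    """Sketch of the hidden-variable model: weights are i.i.d. Pareto
    samples with P(w > k) ~ C k^(1 - tau), and vertices i, j are joined
    independently with probability min(w_i * w_j / (mu * n), 1), so that
    the expected degree of a vertex with weight w is roughly w."""
    rng = random.Random(seed)
    # Inverse-transform sampling of the Pareto tail.
    w = [w_min * rng.random() ** (-1.0 / (tau - 1)) for _ in range(n)]
    mu = (tau - 1) / (tau - 2) * w_min  # mean of the Pareto distribution
    edges = set()
    for i in range(n):
        for j in range(i + 1, n):
            if rng.random() < min(w[i] * w[j] / (mu * n), 1.0):
                edges.add((i, j))
    return w, edges
```

For large n one would replace the quadratic pair loop with the usual trick of only testing pairs whose weight product can exceed the threshold.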

1.2 Algorithms

We now describe two randomized algorithms that determine whether a connected


graph H is an induced subgraph in an inhomogeneous random graph and find
the location of such a subgraph if it exists. Algorithm 1 selects the vertices in
the inhomogeneous random graph that are on the boundary of the core of the

graph: vertices with degrees slightly below μn. Then, the algorithm randomly
divides these vertices into sets of k vertices. If one of these sets contains H as
an induced subgraph, the algorithm terminates and returns the location of H. If
this is not the case, then the algorithm fails. In the next section, we show that
for k small enough, the probability that the algorithm fails is small. This means
that H is present as an induced subgraph on vertices that are on the boundary
of the core with high probability.
Algorithm 1 is similar to the algorithm in [12] designed to find cliques in
random graphs. The major difference is that the algorithm
√ to find cliques looks
for cliques on all vertices with degrees larger than f1 μn for some function f1 .
This algorithm is not efficient for detecting other subgraphs than cliques, since
vertices with high degrees will be connected with probability close to one.

Algorithm 1. Finding induced subgraph H (random search)

Input: H, G = (V, E), μ, f₁ = f₁(n), f₂ = f₂(n).
Output: Location of H in G, or fail.
1  Define n = |V|, I_n = [√(f₁μn), √(f₂μn)] and set V' = ∅.
2  for i ∈ V do
3      if D_i ∈ I_n then V' = V' ∪ {i}
4  end
5  Divide the vertices in V' randomly into |V'|/k sets S_1, …, S_{|V'|/k}.
6  for j = 1, …, |V'|/k do
7      if H is an induced subgraph on S_j then return location of H
8  end
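The steps above can be sketched in Python, assuming the graph is given as an adjacency dict and H as an edge set on labels 0..k−1; the brute-force permutation check stands in for any induced-subgraph test (function names are illustrative):

```python
import itertools, math, random

def is_induced_copy(adj, verts, h_edges, k):
    """Check whether some ordering of `verts` induces exactly the edge set
    of H, given as pairs (a, b) with a < b on labels 0..k-1."""
    for perm in itertools.permutations(verts):
        ok = True
        for a in range(k):
            for b in range(a + 1, k):
                present = perm[b] in adj[perm[a]]
                if present != ((a, b) in h_edges):
                    ok = False
                    break
            if not ok:
                break
        if ok:
            return True
    return False

def algorithm1(adj, h_edges, k, mu, f1, f2, seed=0):
    """Sketch of Algorithm 1 (random search): keep vertices whose degree
    lies in [sqrt(f1*mu*n), sqrt(f2*mu*n)], split them into disjoint
    groups of k, and test each group once."""
    n = len(adj)
    lo, hi = math.sqrt(f1 * mu * n), math.sqrt(f2 * mu * n)
    vprime = [v for v in adj if lo <= len(adj[v]) <= hi]  # Steps 1-4
    random.Random(seed).shuffle(vprime)
    for j in range(len(vprime) // k):                     # Steps 5-8
        group = vprime[j * k:(j + 1) * k]
        if is_induced_copy(adj, group, h_edges, k):
            return group
    return None  # fail
```

The k!-time permutation check is acceptable here because the theorem only concerns k < log^{1/3}(n).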

The following theorem gives a bound on the performance of Algorithm 1 for small values of k.

Theorem 1. Choose f₁ = f₁(n) ≥ 1/log(n) and f₁ < f₂ < 1, and let k < log^{1/3}(n). Then, with high probability, Algorithm 1 detects induced subgraph H on k vertices in an inhomogeneous random graph with n vertices and weights distributed as in (1.1) in time O(nk).

Thus, for small values of k, Algorithm 1 finds an instance of H in linear time.

A problem with parameter k is called fixed-parameter tractable (FPT) if it can be solved in f(k)n^{O(1)} time for some function f(k), and it is called typical FPT (typFPT) if it can be solved in f(k)n^{g(n)} time for some function g(n) = O(1) with high probability [9]. As a corollary of Theorem 1, we obtain that the induced subgraph problem on the inhomogeneous random graph is in typFPT for any subgraph H, similarly to the k-clique problem on inhomogeneous random graphs [12].

Corollary 1. The induced subgraph problem on the inhomogeneous random graph is in typFPT.

In theory, Algorithm 1 detects any motif on k vertices in linear time for small k. However, this only holds for large values of n, which can be understood as follows. In Lemma 2, we show that |V'| = Θ(n^{(3−τ)/2}), thus tending to infinity as n grows large. However, when n = 10^7 and τ = 2.5, this means that the size of the set V' is only proportional to 10^{1.75} ≈ 56 vertices. Therefore, the number of sets S_j constructed in Algorithm 1 is also small. Even though the probability of finding motif H in any such set is proportional to a constant, this constant may be small, so that for finite n the algorithm almost always fails. Thus, for Algorithm 1 to work, n needs to be large enough that n^{(3−τ)/2} is large as well.
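A quick computation illustrates this finite-size effect; the function below simply evaluates the order n^{(3−τ)/2} of |V'|:

```python
def boundary_size(n, tau):
    """Order of the number of boundary vertices, |V'| = Theta(n^((3-tau)/2))."""
    return n ** ((3 - tau) / 2)

print(round(boundary_size(10**7, 2.5)))   # 56: only a handful of sets S_j
print(round(boundary_size(10**12, 2.5)))  # 1000
```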
The algorithm can be significantly improved by changing the search for H on the vertices in set V'. In Algorithm 2 we propose a search for motif H similar to the Kashtan motif-sampling algorithm [21]. Rather than sampling k vertices randomly, it samples one vertex randomly, and then randomly increases the set S_j by adding vertices from its neighborhood. This already guarantees that the vertices in S_j are connected, making it more likely for them to form a specific connected motif together. In particular, we expand the set S_j in such a way that the vertices in S_j are guaranteed to contain a spanning tree of H as a subgraph. This is ensured by choosing the list T_H, which specifies at which vertex of S_j we expand S_j by adding a new vertex. For example, if k = 4 and we set T_H = [1, 2, 3], we first add an edge to the first vertex, then we look for a random neighbor of the previously added vertex, and then we add a random neighbor of the third added vertex. Thus, setting T_H = [1, 2, 3] ensures that the set S_j contains a path of length three, whereas setting T_H = [1, 1, 1] ensures that the set S_j contains a star-shaped subgraph. Depending on which subgraph H we are looking for, we can define T_H in such a way that the set S_j at least contains a spanning tree of motif H in Step 6 of the algorithm.

The selection on the degrees ensures that the degrees are sufficiently high that the probability of finding such a connected set on k vertices is high, and at the same time sufficiently low that we do not only find complete graphs because of the densely connected core of the inhomogeneous random graph. The probability that Algorithm 2 indeed finds the desired motif H in any single check is of constant order of magnitude, similar to Algorithm 1. Therefore, the performance guarantees of both algorithms are similar. However, in practice Algorithm 2 performs much better, since for finite n, k connected vertices are more likely to form a motif than k randomly chosen vertices.

Algorithm 2. Finding induced subgraph H (neighborhood search)

Input: H, G = (V, E), μ, f₁ = f₁(n), f₂ = f₂(n), s.
Output: Location of H in G, or fail.
1   Define n = |V|, I_n = [√(f₁μn), √(f₂μn)] and set V' = ∅.
2   for i ∈ V do
3       if D_i ∈ I_n then V' = V' ∪ {i}
4   end
5   Let G' be the induced subgraph of G on the vertices V'.
6   Set T_H consistently with motif H.
7   for j = 1, …, s do
8       Pick a random vertex v ∈ V' and set S_j = [v].
9       while |S_j| < k do
10          Pick a random v' ∈ N_{G'}(S_j[T_H[|S_j|]]) with v' ∉ S_j.
11          Add v' to S_j.
12      end
13      if H is an induced subgraph on S_j then return location of H
14  end
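The neighborhood growth of Algorithm 2 can be sketched as follows, again assuming an adjacency-dict graph and H as an edge set on labels 0..k−1; `t_h` uses the paper's 1-based indexing, so `t_h = [1, 2, 3]` grows a path and `t_h = [1, 1, 1]` grows a star (a brute-force induced-copy check is kept local so the sketch is self-contained):

```python
import itertools, math, random

def algorithm2(adj, h_edges, k, t_h, mu, f1, f2, s, seed=0):
    """Sketch of Algorithm 2 (neighborhood search): select boundary
    vertices, then repeatedly grow a connected set S_j of k vertices by
    sampling neighbors at the positions prescribed by t_h."""
    rng = random.Random(seed)
    n = len(adj)
    lo, hi = math.sqrt(f1 * mu * n), math.sqrt(f2 * mu * n)
    vprime = {v for v in adj if lo <= len(adj[v]) <= hi}   # Steps 1-4
    if not vprime:
        return None
    gprime = {v: adj[v] & vprime for v in vprime}          # Step 5: G'

    def induced(verts):
        # Brute-force check that some labeling of verts induces exactly H.
        for perm in itertools.permutations(verts):
            if all((perm[b] in adj[perm[a]]) == ((a, b) in h_edges)
                   for a in range(k) for b in range(a + 1, k)):
                return True
        return False

    for _ in range(s):                                     # Steps 7-14
        sj = [rng.choice(sorted(vprime))]
        while len(sj) < k:
            anchor = sj[t_h[len(sj) - 1] - 1]
            candidates = [u for u in sorted(gprime[anchor]) if u not in sj]
            if not candidates:
                break  # dead end: restart from a fresh random vertex
            sj.append(rng.choice(candidates))
        if len(sj) == k and induced(sj):
            return sj
    return None  # fail
```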

The following theorem shows that Algorithm 2 indeed has performance guarantees similar to those of Algorithm 1.

Theorem 2. Choose f₁ = f₁(n) ≥ 1/log(n) and f₁ < f₂ < 1. Choose s = Ω(n^α) for some 0 < α < 1, such that s ≤ n/k. Then, Algorithm 2 detects induced subgraph H on k < log^{1/3}(n) vertices in an inhomogeneous random graph with n vertices and weights distributed as in (1.1) in time O(nk) with high probability.
The proofs of Theorems 1 and 2 rely on the fact that for small k, any subgraph on k vertices is present in G' with high probability. This means that after the degree-selection step of Algorithms 1 and 2, for small k, any motif-finding algorithm can be used to find motif H on the remaining graph G', such as the Grochow–Kellis algorithm [14], the MAVisto algorithm [27] or the MODA algorithm [25]. In the proofs of Theorems 1 and 2, we show that G' has Θ(n^{(3−τ)/2}) vertices with high probability. Thus, the degree-selection step reduces the problem of finding a motif H on n vertices to finding a motif on a graph with Θ(n^{(3−τ)/2}) vertices, significantly reducing the running time of the algorithms.

2 Proof of Theorems 1 and 2

We prove Theorem 1 using two lemmas. The first lemma relates the degrees of
the vertices to their weights. The connection probabilities in the inhomogeneous
random graph depend on the weights of the vertices. In Algorithm 1, we select
vertices based on their degrees instead of their unknown weights. The following
lemma shows that the weights of the vertices in V  are close to their degrees.


Lemma 1 (Degrees and weights). Fix ε > 0, and define J_n = [(1−ε)√(f₁μn), (1+ε)√(f₂μn)]. Then, for some K > 0,

    P(∃i ∈ V' : w_i ∉ J_n) ≤ K n exp(−(ε²(1−ε))/(2(1+ε)) √(f₁μn)).   (2.1)

Proof. Fix a vertex i ∈ V. Conditionally on the weight w_i of vertex i, D_i ∼ Poi(w_i) [5,16]. Then,

    P(w_i < (1−ε)√(f₁μn), D_i ∈ I_n)
        ≤ P(D_i > √(f₁μn) | w_i = (1−ε)√(f₁μn)) / (1 − C((1−ε)√(f₁μn))^{1−τ})
        ≤ K₁ P(D_i > √(f₁μn) | w_i = (1−ε)√(f₁μn)),   (2.2)

for some K₁ > 0. Here the first inequality follows because for Poisson random variables P(Poi(λ₁) > k) ≤ P(Poi(λ₂) > k) for λ₁ < λ₂. We use that, by the Chernoff bound for Poisson random variables,

    P(X > λ(1+δ)) ≤ exp(−h(δ)δ²λ/2),   (2.3)

where h(δ) = 2((1+δ)ln(1+δ) − δ)/δ². Therefore, using that h(δ) ≥ 1/(1+δ) for δ ≥ 0 results in

    P(D_i > √(f₁μn) | w_i = (1−ε)√(f₁μn)) ≤ exp(−(ε²(1−ε))/(2(1+ε)) √(f₁μn)).   (2.4)

Combining this with (2.2) and taking the union bound over all vertices then results in

    P(∃i : D_i ∈ I_n, w_i < (1−ε)√(f₁μn)) ≤ K₁ n exp(−(ε²(1−ε))/(2(1+ε)) √(f₁μn)).   (2.5)

The bound for w_i > (1+ε)√(f₂μn) follows similarly. Combining this with the fact that f₁ < f₂ then proves the lemma. □
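The Poisson Chernoff bound (2.3) used in this proof can be checked numerically; a minimal sketch (the parameter values λ = 50, δ = 0.3 are purely illustrative):

```python
import math

def poisson_tail(lam, m):
    """P(Poi(lam) > m), computed by summing the pmf up to m."""
    term, cdf = math.exp(-lam), math.exp(-lam)
    for i in range(1, m + 1):
        term *= lam / i
        cdf += term
    return 1.0 - cdf

def chernoff_bound(lam, delta):
    """Right-hand side of (2.3): exp(-h(delta) * delta^2 * lam / 2)."""
    h = 2 * ((1 + delta) * math.log(1 + delta) - delta) / delta**2
    return math.exp(-h * delta**2 * lam / 2)

lam, delta = 50.0, 0.3
exact = poisson_tail(lam, int(lam * (1 + delta)))
bound = chernoff_bound(lam, delta)
assert exact <= bound  # the bound dominates the true tail probability
```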
The second lemma shows that after deleting all vertices with degrees outside of I_n, defined in Step 1 of Algorithm 1, polynomially many vertices still remain with high probability.
Lemma 2 (Polynomially many nodes remain). There exists γ > 0 such that

    P(|V'| < γn^{(3−τ)/2}) ≤ 2 exp(−Θ(n^{(3−τ)/2})).   (2.6)

Proof. Let E denote the event that all vertices i ∈ V' satisfy w_i ∈ J_n for some ε > 0, with J_n as in Lemma 1. Let W' be the set of vertices with weights in J_n. Under the event E, |V'| ≤ |W'|. Then, by Lemma 1,

    P(|V'| < γn^{(3−τ)/2}) ≤ P(|W'| < γn^{(3−τ)/2}) + K n exp(−(ε²(1−ε))/(2(1+ε)) √(f₁μn)).   (2.7)

Furthermore,

    P(w_i ∈ J_n) = C((1−ε)√(f₁μn))^{1−τ} − C((1+ε)√(f₂μn))^{1−τ} ≥ c₁(√(μn))^{1−τ}   (2.8)

for some constant c₁ > 0, because f₁ < f₂. Thus, each of the n vertices is in the set W' independently with probability at least c₁(√(μn))^{1−τ}. Choose 0 < γ < c₁. Applying the multiplicative Chernoff bound then shows that

    P(|W'| < γn^{(3−τ)/2}) ≤ exp(−((c₁−γ)²)/(2c₁) n^{(3−τ)/2}),   (2.9)

which proves the lemma together with (2.7) and the fact that √(f₁μn) = Ω(n^{(3−τ)/2}) for τ ∈ (2, 3). □

We now use these lemmas to prove Theorem 1.

Proof of Theorem 1. We condition on the event that V' is of polynomial size (Lemma 2) and that the weights are within the constructed lower and upper bounds (Lemma 1), since both events occur with high probability. This bounds the edge probability between any pair of nodes i and j in V' as

    p_ij < min((1+ε)√(f₂μn) · (1+ε)√(f₂μn) / (μn), 1) = f₂(1+ε)²,   (2.10)

so that p_ij ≤ p₊ = c₁ < 1 if we choose ε small enough. Similarly,

    p_ij > min(((1−ε)√(f₁μn))² / (μn), 1) = Θ(1/log(n))   (2.11)

by our choice of f₁, so that p_ij ≥ p₋ = c₂/log(n). Let E := |E_H| be the number of edges in H. We upper bound the probability of not finding H in one of the partitions of size k of V' as 1 − p₋^E (1−p₊)^{(k choose 2)−E}. Since all partitions are disjoint, we can upper bound the probability of not finding H in any of the partitions as

    P(H not in the partitions) ≤ (1 − p₋^E (1−p₊)^{(k choose 2)−E})^{|V'|/k}.   (2.12)

Using that E ≤ k², (k choose 2) − E ≤ k² and that 1 − x ≤ e^{−x} results in

    P(H not in the partitions) ≤ exp(−p₋^{k²} (1−p₊)^{k²} |V'|/k).   (2.13)

Since |V'| = Θ(n^{(3−τ)/2}), |V'|/k ≥ d n^{(3−τ)/2}/k for some constant d > 0. We fill in the expressions for p₋ and p₊, with c₃ > 0 a constant:

    P(H not in the partitions) ≤ exp(−(d n^{(3−τ)/2}/k)(c₃/log n)^{k²}).   (2.14)

Now apply that k ≤ log^{1/3}(n). Then

    P(H not in the partitions) ≤ exp(−(d n^{(3−τ)/2}/log^{1/3}(n))(c₃/log n)^{log^{2/3}(n)}) ≤ exp(−d n^{(3−τ)/2−o(1)}).   (2.15)

Hence, the inner expression grows polynomially, so that the probability of not finding H in one of the partitions is negligibly small. The running time of the partial search is given by

    (|V'|/k)(k choose 2) ≤ (n/k)(k²/2) ≤ nk ≤ n e^{k⁴},   (2.16)

which concludes the proof for k ≤ log^{1/3}(n). □
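The finite-size behavior discussed in Sect. 1 can be read off by plugging concrete numbers into the right-hand side of (2.14); a rough evaluation, with the constants d and c₃ set to 1 purely for illustration:

```python
import math

def failure_bound(n, tau, k, d=1.0, c3=1.0):
    """Rough evaluation of the bound (2.14):
    exp(-(d * n^((3-tau)/2) / k) * (c3 / log n)^(k^2))."""
    groups = d * n ** ((3 - tau) / 2) / k
    per_group = (c3 / math.log(n)) ** (k * k)
    return math.exp(-groups * per_group)

# For n = 10^7 and tau = 2.5 the bound is essentially 1 (the random
# search may well fail), while for astronomically large n it vanishes.
print(failure_bound(10**7, 2.5, 4))
print(failure_bound(10**200, 2.5, 4))
```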


Proof of Corollary 1. If k > log^{1/3}(n), we can determine whether H is an induced subgraph by exhaustive search in time

    (n choose k)(k choose 2) ≤ (n^k/k)(k(k−1)/2) ≤ k n^k ≤ k e^{k⁴} ≤ n e^{k⁴},   (2.17)

since for all sets of k vertices, the presence or absence of (k choose 2) edges needs to be checked. For k ≤ log^{1/3}(n), Theorem 1 shows that the induced subgraph isomorphism problem can be solved in time nk ≤ n e^{k⁴}. Thus, with high probability the induced subgraph isomorphism problem can be solved in n e^{k⁴} time, which proves that it is in typFPT. □

Proof of Theorem 2. The proof of Theorem 2 is very similar to the proof of


Theorem 1. The only way Algorithm 2 differs from Algorithm 1 is in the selection
of the sets Sj . As in the previous theorem, we condition on the event that
|V  | = Θ(n(3−τ )/2 ) (Lemma 2) and that the weights of the vertices in G are
bounded as in Lemma 1.
The graph G constructed in Step 5 of Algorithm 2 then consists of
Θ(n(3−τ )/2 ) vertices. Furthermore, by the bound (2.11) on the connection prob-
abilities of all vertices in G , the expected degree of a vertex i in G satisfies
E [Di,G ] = Ω(n(3−τ )/2 / log(n)). We can use similar arguments as in Lemma 1 to
show that Di,G = Ω(n(3−τ )/2 / log(n)) with high probability for all vertices in
G . Since G consists of Θ(n(3−τ )/2 ) vertices, Di,G = O(n(3−τ )/2 ) as well. This
1
means that for k < log 3 (n), Steps 8–11 are able to find a connected subgraph
on k vertices with high probability.
We now compute the probability that Sj is disjoint with the previous j − 1
constructed sets. The probability that the first vertex does not overlap with the
previous sets is given by 1 − jk/ |V  |, since that vertex is chosen uniformly at
random. The second vertex is chosen in a size-biased manner, since it is chosen
Finding Induced Subgraphs in Scale-Free Inhomogeneous Random Graphs 9

by following a random edge. The probability that vertex i is added can therefore
be bounded as
Di,G M log(n)
P (vertex i is added) = ≤ (2.18)
|V  | |V  |
s=1 Ds,G


for some constant M > 0 by the conditions on the degrees. Therefore, the prob-
ability that Sj does not overlap with one of the previously chose jk vertices can
be bounded from below by
  
kj M kj log(n) k−1
P (Sj does not overlap with previous sets) ≥ 1− 1− . (2.19)
|V  | |V  |

Thus, the probability that all j sets do not overlap can be bounded as
 jk
M kj log(n)
P (Sj ∩ Sj−1 · · · ∩ S1 = ∅) ≥ 1− , (2.20)
|V  |

which tends to one when jk = o(n(3−τ )/4 ). Let sdis denote the number of disjoint
sets out of the s sets constructed in Algorithm 2. Then, when s = Ω(nα ) for some
α > 0, sdis > nβ for some β > 0 with high probability, because k < log1/3 (n).
The probability that H is present as an induced subgraph is bounded similarly to Theorem 1. We already know that k − 1 edges are present. For all other E − (k − 1) edges of H, and all \binom{k}{2} − E edges that are not present in H, we can again use (2.10) and (2.11) to bound the probability of edges being present or not being present between vertices in V′. Therefore, we can bound the probability that H is not found, similarly to (2.13), as

P(H not in the partitions) ≤ P(H not in the disjoint partitions) ≤ exp(−p_−^{k²}(1 − p_+)^{k²} s_dis).

Because s_dis > n^β for some β > 0, this term tends to zero exponentially. The running time of the partial search can be bounded similarly to (2.16) as

s\binom{k}{2} ≤ sk² = O(nk),   (2.21)

where we used that s ≤ n/k. □
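The sampling step analyzed in this proof (a uniform first vertex among the candidates, then size-biased growth by following a uniformly random edge out of the current set) can be sketched as follows; this is an illustrative sketch rather than the paper's Algorithm 2 itself, and the function name and adjacency representation are assumptions:

```python
import random

def sample_connected_set(adj, candidates, k, rng=random):
    """Grow a connected set of k vertices inside `candidates`: the first
    vertex is chosen uniformly at random, and every further vertex by
    following a uniformly random edge out of the current set (so vertices
    are picked in a size-biased manner, as in the analysis above)."""
    vset = set(candidates)
    current = {rng.choice(candidates)}
    while len(current) < k:
        # one entry per edge from the current set to an outside candidate,
        # so choosing uniformly from this list follows a random edge
        frontier = [w for v in current for w in adj[v]
                    if w in vset and w not in current]
        if not frontier:
            return None  # growth got stuck; the caller simply retries
        current.add(rng.choice(frontier))
    return current
```

A full search would repeat this up to s times and test each returned set for an induced copy of H.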

3 Experimental Results
Fig. 1 shows the fraction of times Algorithm 1 succeeds in finding a cycle of size k in an inhomogeneous random graph on 10^7 vertices. Even though for large n Algorithm 1 should find an instance of a cycle of size k in step 7 of the algorithm with high probability, we see that Algorithm 1 never succeeds in finding one. This is because of the finite-size effects discussed before.
10 E. Cardinaels et al.

Fig. 1. The fraction of times step 7 in Algorithm 1 succeeds in finding a cycle of length k on an inhomogeneous random graph with n = 10^7, averaged over 500 network samples with f1 = 1/log(n) and f2 = 0.9.

Figure 2a also plots the fraction of times Algorithm 2 succeeds in finding a cycle. We set the parameter s = 10000, so that the algorithm fails if it does not detect the motif H after executing step 13 of Algorithm 2 10000 times. Because s gives the number of attempts to find H, increasing s may increase the success probability of Algorithm 2 at the cost of a higher running time. However, in Fig. 2b we see that for small values of k, the mean number of times step 13 is executed when the algorithm succeeds is much lower than 10000, so that increasing s in this experiment probably only has a small effect on the
success probability. We see that Algorithm 2 outperforms Algorithm 1. Figure 2b
also shows that the number of attempts needed to detect a cycle of length k is
small for k ≤ 6. For larger values of k the number of attempts increases. This can again be ascribed to the finite-size effects that cause the set V′ to be small, so that large motifs may not be present on the vertices in V′. We also plot the
success probability when using different values of the functions f1 and f2 . When
only the lower bound f1 on the vertex degrees is used, as in [11], the success
probability of the algorithm decreases. This is because the set V  now contains
many high degree vertices that are much more likely to form clique motifs than
cycles or other connected motifs on k vertices. This makes f2 = ∞ a very efficient
bound for detecting clique motifs [11]. For the cycle motif however, we see in
Fig. 2b that more checks are needed before a cycle is detected, and in some cases
the cycle is not detected at all.
Setting f1 = 0 and f2 = ∞ is also less efficient, as Fig. 2a shows. In this
situation, the number of attempts needed to find a cycle of length k is larger
than for Algorithm 2 for k ≤ 6.

3.1 Real Network Data


We now check Algorithm 2 on four real-world networks with power-law degrees:
a Wikipedia communication network [22], the Gowalla social network [22], the
Baidu online encyclopedia [23] and the Internet on the autonomous systems
level [22]. Table 1 presents several statistics of these scale-free data sets. Fig. 3

Fig. 2. Results of Algorithm 2 on an inhomogeneous random graph with n = 10^7 for detecting cycles of length k. The parameters are chosen as s = 10000, f1 = 1/log(n), f2 = 0.9. The values are averaged over 500 generated networks.

shows the fraction of runs where Algorithm 2 finds a cycle as an induced sub-
graph. We see that for the Wikipedia social network in Fig. 3a, Algorithm 2 is
more efficient than looking for cycles among all vertices in the network. For the
Baidu online encyclopedia in Fig. 3c however, we see that Algorithm 2 performs
much worse than looking for cycles among all possible vertices. In the other two
network data sets in Figs. 3b and d the performance on the reduced vertex set
and the original vertex set is almost the same. Figure 4 shows that in general,
Algorithm 2 indeed seems to finish in fewer steps than when using the full vertex
set. However, as Fig. 4c shows, for larger values of k the algorithm fails almost
always.

Table 1. Statistics of the data sets: the number of vertices n, the number of edges E,
and the power-law exponent τ fitted by the method of [7].

n E τ
Wikipedia 2,394,385 5,021,410 2.46
Gowalla 196,591 950,327 2.65
Baidu 2,141,300 17,794,839 2.29
AS-Skitter 1,696,415 11,095,298 2.35

These results show that while Algorithm 2 is efficient on inhomogeneous random graphs, it may not always be efficient on real-world data sets. This is not surprising, because there is no reason why the vertices of degrees proportional to √n should behave like an Erdős–Rényi random graph, as they do in the inhomogeneous random graph. We therefore investigate whether selecting vertices with degrees in I_n = [(μn)^γ / log(n), (μn)^γ] for some other value of γ in Algorithm 2 leads to better performance. Figures 3 and 4 show for every data set one particular

Fig. 3. The fraction of times Algorithm 2 succeeds in finding a cycle of length k on four large network data sets. The parameters are chosen as s = 10000, f1 = 1/log(n), f2 = 0.9. The black line uses Algorithm 2 on vertices of degrees in I_n = [(μn)^γ / log(n), (μn)^γ]. The values are averaged over 500 runs of Algorithm 2.

value of γ that works well. For the Gowalla, Wikipedia, and autonomous systems networks, this leads to a faster algorithm for detecting cycles. Only for the Baidu network do other values of γ fail to improve upon randomly selecting from all vertices. This indicates that for most networks, cycles appear mostly on degrees of specific orders of magnitude, making it possible to sample these cycles faster. Unfortunately, these orders of magnitude may differ between networks. Across all four networks, the best value of γ seems to be smaller than the value of 0.5 that is optimal for the inhomogeneous random graph.
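Selecting the candidate set by the degree band I_n = [(μn)^γ / log(n), (μn)^γ] is a simple filter; a minimal sketch (hypothetical helper, assuming the observed degrees are given as a dict) is:

```python
import math

def degree_band(degrees, mu_n, gamma):
    """Return the vertices whose degree lies in
    I_n = [(mu*n)^gamma / log(n), (mu*n)^gamma].
    `degrees` maps vertex id -> degree; `mu_n` is the product mu * n."""
    n = len(degrees)
    upper = mu_n ** gamma
    lower = upper / math.log(n)
    return [v for v, d in degrees.items() if lower <= d <= upper]
```

Setting gamma = 0.5 recovers the band that is optimal for the inhomogeneous random graph; smaller values match the real-world data sets better, as discussed above.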

Fig. 4. The number of times step 12 of Algorithm 2 is invoked when the algorithm does not fail, on four large network data sets, for detecting cycles of length k. The parameters are chosen as s = 10000, f1 = 1/log(n), f2 = 0.9. The black line uses Algorithm 2 on vertices of degrees in I_n = [(μn)^γ / log(n), (μn)^γ]. The values are averaged over 500 runs of Algorithm 2.

4 Conclusion
We presented an algorithm which solves the induced subgraph problem on inhomogeneous random graphs with infinite-variance power-law degrees in time O(ne^{k⁴}) with high probability as n grows large. This algorithm is based on the observation that for fixed k, any subgraph is present on k vertices with degrees slightly smaller than √(μn) with positive probability. Therefore, the algorithm first selects vertices with those degrees, and then uses a random search method to look for the induced subgraph on those vertices.
We show that this algorithm performs well on simulations of inhomogeneous random graphs. Its performance on real-world data sets varies between data sets. This indicates that the degrees that contain the most induced subgraphs of size k in real-world networks may not be close to √n. We then show that on these data sets, it may be more efficient to find induced subgraphs on degrees proportional to n^γ for some other value of γ. The value of γ may differ between networks.


Our algorithm exploits that induced subgraphs are likely formed among μn-
degree vertices. However, certain subgraphs may occur more frequently on ver-
tices of other degrees [17]. For example, star-shaped subgraphs on k vertices

appear more often on one vertex with degree much higher than μn corre-
sponding to the middle vertex of the star, and k − 1 lower-degree vertices cor-
responding to the leafs of the star [17]. An interesting open question is whether
there exist better degree-selection steps for specific subgraphs than the one used
in Algorithms 1 and 2.

Acknowledgements. The work of JvL and CS was supported by NWO TOP grant
613.001.451. The work of JvL was further supported by the NWO Gravitation Networks
grant 024.002.003, an NWO TOP-GO grant and by an ERC Starting Grant.

References
1. Albert, R., Jeong, H., Barabási, A.L.: Internet: diameter of the world-wide web.
Nature 401(6749), 130–131 (1999)
2. Boguñá, M., Pastor-Satorras, R.: Class of correlated random networks with hidden
variables. Phys. Rev. E 68, 036112 (2003)
3. Bollobás, B., Janson, S., Riordan, O.: The phase transition in inhomogeneous ran-
dom graphs. Random Struct. Algorithms 31(1), 3–122 (2007)
4. Brach, P., Cygan, M., Łącki, J., Sankowski, P.: Algorithmic complexity of power law
networks. In: Proceedings of the Twenty-Seventh Annual ACM-SIAM Symposium
on Discrete Algorithms, SODA 2016, pp. 1306–1325. Society for Industrial and
Applied Mathematics, Philadelphia (2016)
5. Britton, T., Deijfen, M., Martin-Löf, A.: Generating simple random graphs with
prescribed degree distribution. J. Stat. Phys. 124(6), 1377–1397 (2006)
6. Chung, F., Lu, L.: The average distances in random graphs with given expected
degrees. Proc. Natl. Acad. Sci. USA 99(25), 15879–15882 (2002) (electronic)
7. Clauset, A., Shalizi, C.R., Newman, M.E.J.: Power-law distributions in empirical
data. SIAM Rev. 51(4), 661–703 (2009)
8. Faloutsos, M., Faloutsos, P., Faloutsos, C.: On power-law relationships of the inter-
net topology. ACM SIGCOMM Comput. Commun. Rev. 29, 251–262 (1999)
9. Fountoulakis, N., Friedrich, T., Hermelin, D.: On the average-case complexity of
parameterized clique. arXiv:1410.6400v1 (2014)
10. Fountoulakis, N., Friedrich, T., Hermelin, D.: On the average-case complexity of
parameterized clique. Theor. Comput. Sci. 576, 18–29 (2015)
11. Friedrich, T., Krohmer, A.: Cliques in hyperbolic random graphs. In: INFOCOM
Proceedings 2015, pp. 1544–1552. IEEE (2015)
12. Friedrich, T., Krohmer, A.: Parameterized clique on inhomogeneous random
graphs. Disc. Appl. Math. 184, 130–138 (2015)
13. Garey, M.R., Johnson, D.S.: Computers and Intractability: A Guide to the Theory of NP-Completeness. W. H. Freeman & Co. (2011)
14. Grochow, J.A., Kellis, M.: Network motif discovery using subgraph enumeration and symmetry-breaking. In: RECOMB, pp. 92–106 (2007)
15. Heydari, H., Taheri, S.M.: Distributed maximal independent set on inhomogeneous
random graphs. In: 2017 2nd Conference on Swarm Intelligence and Evolutionary
Computation (CSIEC). IEEE, March 2017

16. van der Hofstad, R.: Random Graphs and Complex Networks, vol. 1. Cambridge
University Press, Cambridge (2017)
17. van der Hofstad, R., van Leeuwaarden, J.S.H., Stegehuis, C.: Optimal subgraph
structures in scale-free networks. arXiv:1709.03466 (2017)
18. Janson, S., Łuczak, T., Norros, I.: Large cliques in a power-law random graph. J. Appl. Probab. 47(04), 1124–1135 (2010)
19. Jeong, H., Tombor, B., Albert, R., Oltvai, Z.N., Barabási, A.L.: The large-scale
organization of metabolic networks. Nature 407(6804), 651–654 (2000)
20. Karp, R.M.: Reducibility among combinatorial problems. In: Miller, R.E.,
Thatcher, J.W., Bohlinger, J.D. (eds.) Complexity of Computer Computations.
The IBM Research Symposia Series, pp. 85–103. Springer, Boston (1972). https://doi.org/10.1007/978-1-4684-2001-2_9
21. Kashtan, N., Itzkovitz, S., Milo, R., Alon, U.: Efficient sampling algorithm for
estimating subgraph concentrations and detecting network motifs. Bioinformatics
20(11), 1746–1758 (2004)
22. Leskovec, J., Krevl, A.: SNAP Datasets: Stanford large network dataset collection
(2014). http://snap.stanford.edu/data. Accessed 14 Mar 2017
23. Niu, X., Sun, X., Wang, H., Rong, S., Qi, G., Yu, Y.: Zhishi.me - weaving Chinese linking open data. In: Aroyo, L., Welty, C., Alani, H., Taylor, J., Bernstein, A., Kagal, L., Noy, N., Blomqvist, E. (eds.) ISWC 2011. LNCS, vol. 7032, pp. 205–220. Springer, Heidelberg (2011). https://doi.org/10.1007/978-3-642-25093-4_14
24. Norros, I., Reittu, H.: On a conditionally Poissonian graph process. Adv. Appl. Probab. 38(01), 59–75 (2006)
25. Omidi, S., Schreiber, F., Masoudi-Nejad, A.: MODA: an efficient algorithm for
network motif discovery in biological networks. Genes Genetic Syst. 84(5), 385–
395 (2009)
26. Park, J., Newman, M.E.J.: Statistical mechanics of networks. Phys. Rev. E 70,
066117 (2004)
27. Schreiber, F., Schwobbermeyer, H.: MAVisto: a tool for the exploration of network
motifs. Bioinformatics 21(17), 3572–3574 (2005)
28. Vázquez, A., Pastor-Satorras, R., Vespignani, A.: Large-scale topological and
dynamical properties of the internet. Phys. Rev. E 65, 066130 (2002)
29. Williams, V.V., Wang, J.R., Williams, R., Yu, H.: Finding four-node subgraphs in
triangle time. In: Proceedings of the Twenty-Sixth Annual ACM-SIAM Symposium
on Discrete Algorithms, SODA 2015, pp. 1671–1680. Society for Industrial and
Applied Mathematics, Philadelphia (2015)
The Asymptotic Normality of the Global
Clustering Coefficient in Sparse Random
Intersection Graphs

Mindaugas Bloznelis¹(B) and Jerzy Jaworski²

¹ Institute of Computer Science, Vilnius University, 03225 Vilnius, Lithuania
mindaugas.bloznelis@mif.vu.lt
² Faculty of Mathematics and Computer Science, Adam Mickiewicz University,
61-614 Poznań, Poland
jaworski@amu.edu.pl

Abstract. We establish the asymptotic normality of the global clustering coefficient in sparse uniform random intersection graphs.

Keywords: Clustering coefficient · Asymptotic normality · Random intersection graph

1 Introduction
The global clustering coefficient of a finite graph G is the ratio C_G = 3N_Δ/N_∨, where N_Δ is the number of triangles and N_∨ is the number of paths of length 2. Equivalently, C_G represents the probability that a randomly selected path of length 2 induces a triangle in G. The global clustering coefficient is a commonly used network characteristic, assessing the strength of the statistical association between neighboring adjacency relations. For example, in a social network the tendency of linking actors which have a common neighbor is reflected by a non-negligible value of the global clustering coefficient.
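The ratio C_G = 3N_Δ/N_∨ can be computed directly from an adjacency structure; the following is a minimal sketch (hypothetical helper, assuming an undirected simple graph given as neighbour sets):

```python
from itertools import combinations

def global_clustering(adj):
    """Global clustering coefficient C_G = 3 * N_triangle / N_cherry,
    where N_cherry is the number of paths of length 2.
    `adj` maps each vertex to the set of its neighbours."""
    n_cherry = 0
    corner_count = 0  # counts each triangle once per corner vertex
    for v, nbrs in adj.items():
        d = len(nbrs)
        n_cherry += d * (d - 1) // 2  # cherries centred at v
        for u, w in combinations(nbrs, 2):
            if w in adj[u]:
                corner_count += 1
    n_triangle = corner_count // 3
    return 3 * n_triangle / n_cherry if n_cherry else 0.0
```

On a triangle this returns 1, and on a path of length 2 it returns 0, matching the interpretation of C_G as a conditional probability.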
Clustering in a social network can be explained by an auxiliary bipartite
structure: each actor is prescribed a collection of attributes and any two actors
sharing a common attribute have high chances of being adjacent, cf. [8]. The
respective random intersection graph (RIG) on the vertex set V = {v1 , . . . , vn }
and with the auxiliary attribute set W = {w1 , . . . , wm } defines adjacency rela-
tions with the help of a random bipartite graph H linking actors (=vertices) to
attributes: two actors are adjacent in RIG if they have a common neighbour in
H. We mention that RIG admits non-vanishing tunable global clustering coeffi-
cient, power-law degrees and short typical distances, see e.g., [4].
In this note we consider the uniform random intersection graph G(n, m, r), where every vertex v_i ∈ V is prescribed a random subset S_i = S(v_i) ⊂ W of size r, and two vertices v_i, v_j are declared adjacent (denoted v_i ∼ v_j) whenever S_i ∩ S_j ≠ ∅. We assume that the sets S_1, . . . , S_n are independent. (The respective random
bipartite graph H is drawn uniformly at random from the class of bipartite
© Springer International Publishing AG, part of Springer Nature 2018
A. Bonato et al. (Eds.): WAW 2018, LNCS 10836, pp. 16–29, 2018.
https://doi.org/10.1007/978-3-319-92871-5_2

graphs with the property that each actor vi ∈ V has exactly r neighbours in
W .) The uniform random intersection graph has been widely studied in the
literature mainly as a model of secure wireless sensor network that uses random
predistribution of keys, see [5,14]. We denote for short G = G(n, m, r) and by G
we denote the instance (realization) of the random graph G.
We consider large random intersection graphs, where r² = o(m) as m, n → +∞. In this case the edge probability is, see (53),

p_e = P(v_i ∼ v_j) = r²m⁻¹ + O(r⁴m⁻²).   (1)

For us the most interesting range of the parameters n, m, r is defined by the approximate relation

m ≈ cnr²,   (2)

where c > 0 is an arbitrary constant. In this case we obtain a sparse random graph, where the expected number of edges \binom{n}{2} p_e ≈ n/(2c) scales as n.
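Sampling G(n, m, r) directly from its definition, and checking the sparse regime (2), can be sketched as follows (illustrative code, not from the paper):

```python
import random
from itertools import combinations

def uniform_rig(n, m, r, rng=random):
    """Sample the uniform random intersection graph G(n, m, r): every vertex
    receives an independent uniform random r-subset of the m attributes, and
    two vertices are adjacent iff their attribute sets intersect."""
    attrs = [frozenset(rng.sample(range(m), r)) for _ in range(n)]
    edges = {(i, j) for i, j in combinations(range(n), 2)
             if attrs[i] & attrs[j]}
    return attrs, edges
```

In the regime m ≈ cnr² of (2), the edge probability is ≈ r²/m by (1), so the expected number of edges is roughly n/(2c).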
Before formulating our results we introduce some notation. Given a vertex
triple vi , vj , vk , let Δi,j,k and pΔ denote the indicator and the probability of the
event that the vertex triple induces a triangle in G. Similarly, ∨ijk and p∨ denote
the indicator and probability that G contains the path vi ∼ vj ∼ vk (we call
such a path a cherry). The total number of triangles NΔ and cherries N∨ are

N_Δ = N_Δ(S_1, . . . , S_n) = ∑_{{i,j,k}⊂[n]} Δ_{i,j,k},   (3)

N_∨ = N_∨(S_1, . . . , S_n) = ∑_{{i,j,k}⊂[n]} (∨_{ijk} + ∨_{jki} + ∨_{kij}).

Denote

N̄_Δ = N_Δ − EN_Δ,  N̄_∨ = N_∨ − EN_∨,  σ_Δ² = E N̄_Δ²,  σ_∨² = E N̄_∨²,  σ_{Δ∨} = E[N̄_Δ N̄_∨].

We start our analysis with an evaluation of the first and second moments of
the subgraph counts NΔ and N∨ .

Lemma 1. Let m, n → +∞. Assume that r ≥ 2 and r³ = O(m). We have

EN_Δ = \binom{n}{3} p_Δ,   p_Δ = r³/m² + r⁶/m³ + O(r⁵/m³),   (4)

EN_∨ = 3\binom{n}{3} p_∨,   p_∨ = p_e² = r⁴/m² − r⁴(r−1)²/m³ + r⁴(r−1)⁴/(4m⁴) + O(r⁸/m⁴),   (5)

σ_Δ² = \binom{n}{2}(n − 2)² E g_{Δ1,2}² + \binom{n}{3} E h_{Δ1,2,3}²,   (6)

σ_∨² = \binom{n}{2}(n − 2)² E g_{∨1,2}² + \binom{n}{3} E h_{∨1,2,3}²,   (7)

σ_{Δ∨} = \binom{n}{2}(n − 2)² E[g_{Δ1,2} g_{∨1,2}] + \binom{n}{3} E[h_{Δ1,2,3} h_{∨1,2,3}].   (8)

The random variables g_{Δ1,2}, h_{Δ1,2,3} and g_{∨1,2}, h_{∨1,2,3} define the Hoeffding decomposition of N̄_Δ and N̄_∨, see (12). Their second moments, entering (6), (7), (8), are evaluated in (25), (26), in (31), (32), and in (39), (40), respectively.

We note that (4) and (5) imply that the “theoretical clustering coefficient” satisfies

P(Δ_{i,j,k} = 1 | ∨_{ijk} = 1) = p_Δ/p_∨ = E(3N_Δ)/EN_∨ ≈ 1/r  as n, m → +∞.
Therefore, in order to have a non-vanishing global clustering coefficient we need r to be bounded as n, m → +∞, cf. [3,13]. But we may still expect the asymptotic normality of σ_Δ⁻¹N̄_Δ and σ_∨⁻¹N̄_∨ even for r → ∞ as n, m → +∞.
Indeed, assuming (2) we obtain from (4) for r³ = o(m) that

EN_Δ ≈ (m/(6c³r³)) (1 + r³/m) → +∞  as n, m → +∞.   (9)
Hence, for r³ = o(m) we can expect the asymptotic normality of σ_Δ⁻¹N̄_Δ. For larger r, such that m = O(r³) and r² = o(m), the identity EN_Δ = \binom{n}{3} p_Δ combined with (2) and the bound p_Δ = O(r³m⁻² + r⁶m⁻³) implies EN_Δ = O(1). The latter bound rules out the asymptotic normality of σ_Δ⁻¹N̄_Δ. We refer to Lemma 4 and the remark following it for various bounds on p_Δ.
Our main result, Theorem 2 below, gives sufficient conditions for the asymptotic normality of C_G as n, m → +∞. We derive the asymptotic normality of C_G from a related asymptotic normality result for the bivariate vector of subgraph counts (N_Δ, N_∨).

Theorem 1. Let α, β > 0. Let m, n → +∞. Assume that α ≤ m/n ≤ β. Assume that r ≥ 2 and r = O(1). Suppose that the ratio σ_{Δ∨}/(σ_Δσ_∨) converges to a limit; we denote the limit κ. The random vector (σ_Δ⁻¹N̄_Δ, σ_∨⁻¹N̄_∨) converges in distribution to a Gaussian random vector (η₁, η₂), where Eη_j = 0, Eη_j² = 1, j = 1, 2, and Eη₁η₂ = κ.

An immediate consequence of Theorem 1 is the asymptotic normality of the


global clustering coefficient CG .

Theorem 2. Let r ≥ 2 and β > 0. Let m, n → +∞. Assume that m/n → β. Then the ratio σ_{Δ∨}/(σ_Δσ_∨) converges to a limit; we denote the limit κ. The random variable

σ⁻¹ (C_G − 3EN_Δ/EN_∨)

converges in distribution to the standard normal random variable. Here

σ² = 9 (EN_Δ/EN_∨)² ((σ_Δ/EN_Δ)² + (σ_∨/EN_∨)² − 2κ σ_Δσ_∨/(EN_Δ EN_∨)).

We remark that the asymptotic normality of subgraph counts like N_Δ, N_∨, and of derived quantities such as C_G, provides a useful tool for statistical inference in network analysis, see e.g. [12]. The results of Theorems 1 and 2 seem to be new.
We are not aware of an earlier work on the asymptotic normality of the global
clustering coefficient in sparse random graphs. A related problem of Poissonian
approximation of the number of cliques in random intersection graphs has been
addressed in [9].

Future Work. We envisage the extension of the techniques developed in the


present paper to more general sparse random intersection graphs and to the
counts of subgraphs of arbitrary, but finite size.

2 Proofs

In the proof we combine Hoeffding's decomposition and Stein's method. A similar approach has been used, in a somewhat different context, in [2]; see also [7].
The section is organized as follows. We first collect necessary notation. Then
we construct Hoeffding decompositions of N̄Δ , N̄∨ and evaluate variances of
various parts of the decompositions. Next we briefly outline our approach to
the asymptotic normality via Stein’s method. At the very end of the section we
prove Lemma 1, Theorem 2 and sketch the proof of Theorem 1.

Notation. The adjacency relation between vertices vi and vj is denoted vi ∼ vj .


The indicator of an event A is denoted IA . In particular, we have

∨ijk = I{vi ∼vj } I{vj ∼vk } , Δi,j,k = I{vi ∼vj } I{vj ∼vk } I{vk ∼vi } .

Introduce the random variables s_{[j,k]} = |S_j ∩ S_k| and s_{[i,j,k]} = |S_i ∩ S_j ∩ S_k|, and the probabilities

p_t = P(Δ_{i,j,k} = 1 | s_{[j,k]} = t),   q_t = P(∨_{kij} = 1 | s_{[j,k]} = t),
p̄_t = P(s_{[j,k]} = t),   p′_t = P(s_{[i,j,k]} ≥ 1 | s_{[j,k]} = t),
p″_t = P(Δ_{i,j,k} = 1 | s_{[i,j,k]} = 0, s_{[j,k]} = t).

We observe that p_t = q_t for t ≥ 1. Furthermore, we have for t ≥ 0 that

p_t = p′_t + p″_t (1 − p′_t).   (10)



We denote p_e = P(v_i ∼ v_j) and observe that E(I_{v_i∼v_j} | S_i) = EI_{v_i∼v_j} = p_e. In particular, we have p_∨ = p_e². Indeed,

p_∨ = E[I_{v_i∼v_j} I_{v_j∼v_k}] = E[I_{v_i∼v_j} E(I_{v_j∼v_k} | S_i, S_j)] = E[I_{v_i∼v_j} p_e] = p_e².   (11)
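The identity p_∨ = p_e² in (11) is easy to check numerically; a Monte Carlo sketch (hypothetical helper name) is:

```python
import random

def estimate_edge_and_cherry_prob(m, r, trials=100000, rng=random):
    """Monte Carlo estimates of p_e = P(v_i ~ v_j) and p_vee = P(v_i ~ v_j ~ v_k)
    for independent uniform r-subsets of an m-set; by (11), p_vee = p_e**2."""
    hits_e = hits_vee = 0
    for _ in range(trials):
        si = set(rng.sample(range(m), r))
        sj = set(rng.sample(range(m), r))
        sk = set(rng.sample(range(m), r))
        edge_ij = bool(si & sj)
        edge_jk = bool(sj & sk)
        hits_e += edge_ij
        hits_vee += edge_ij and edge_jk
    return hits_e / trials, hits_vee / trials
```

The two edge indicators are conditionally independent given S_j, which is exactly the conditioning step used in (11).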

Hoeffding’s Decomposition. Let ψ be a real function defined on 3-tuples of subsets of W, symmetric in its arguments, and assume that Eψ(S_1, S_2, S_3) = 0. Hoeffding’s decomposition [1,6] expands T = ∑_{{i,j,k}⊂[n]} ψ(S_i, S_j, S_k) into a series of uncorrelated U-statistics,

T = \binom{n−1}{2} U_1 + (n − 2) U_2 + U_3,   (12)

U_1 = ∑_{i∈[n]} f_i,   f_i = E(ψ(S_i, S_j, S_k) | S_i),
U_2 = ∑_{{i,j}⊂[n]} g_{i,j},   g_{i,j} = E(ψ(S_i, S_j, S_k) − f_i − f_j | S_i, S_j),
U_3 = ∑_{{i,j,k}⊂[n]} h_{i,j,k},   h_{i,j,k} = ψ(S_i, S_j, S_k) − f_i − f_j − f_k − g_{i,j} − g_{i,k} − g_{j,k}.

We note that g(S_i, S_j) := g_{i,j} and h(S_i, S_j, S_k) := h_{i,j,k} are symmetric functions of their arguments and have the orthogonality property

E(g(S_i, S_j) | S_i) = 0,   E(h(S_i, S_j, S_k) | S_i, S_j) = 0.   (13)

In particular, (13) implies that all distinct summands f_i, g_{j_1,j_2}, h_{k_1,k_2,k_3} are uncorrelated, whatever the indices i, j_1, j_2, k_1, k_2, k_3. A simple consequence of (13) is the variance formula

Var T = ET² = n \binom{n−1}{2}² E f_1² + \binom{n}{2}(n − 2)² E g_{1,2}² + \binom{n}{3} E h_{1,2,3}².   (14)

We construct decomposition (12) for T = N̄Δ and T = N̄∨ and use subscripts
Δ and ∨ to distinguish the respective terms ψΔ , fΔ j , gΔ i,j , hΔ i,j,k and ψ∨ , f∨ j ,
g∨ i,j , h∨ i,j,k .
Decomposition of N̄_Δ. We put ψ_Δ(S_i, S_j, S_k) = Δ_{i,j,k} − p_Δ and apply (12) to T = N̄_Δ. We shall show that for any j and k ≠ j,

f_{Δ j} ≡ 0,   (15)

g_{Δ j,k} = ∑_{t=1}^{r} (I_{\{s_{[j,k]}=t\}} − p̄_t) p_t.   (16)

To show (15), we observe that, given S_j, the conditional probability of the triangle induced by v_i, v_j, v_k (the quantity E(Δ_{i,j,k} | S_j)) does not depend on S_j. Hence E(Δ_{i,j,k} | S_j) = p_Δ and, consequently, f_{Δ j} ≡ 0. To show (16) we observe that, given the pair (S_j, S_k), the conditional probability of the triangle (the quantity E(Δ_{i,j,k} | S_j, S_k)) only depends on the number s_{[j,k]}. In particular, the following random variables are equal:

E(Δ_{i,j,k} | S_j, S_k) = E(Δ_{i,j,k} | s_{[j,k]}) = ∑_{t=1}^{r} I_{\{s_{[j,k]}=t\}} p_t.   (17)
flow of the fountain itself.
Was any one in distress, in perplexity, in trouble; there was no
counselor so wise, discreet, trustworthy, as Mrs. Blossom, who held
half the village secrets, and had served as a peacemaker times
without number. Was there a bride to be dressed; no one could do it
so well as Miss Silence or Mrs. Patience. Was any one sick; no
nurses were as tender and skillful and tireless as they. Did the
shadow of death rest over a home; no voices could speak words of
sweeter comfort to the dying, no other’s presence was so
unobtrusive, so helpful in the house of bereavement. Indeed, few
were the families in that little community to whom they were not
bound by the cords of a common sympathy in some hour of joy or
grief. And Rose was not the only one who often wondered how with
all the calls upon them they still managed to accomplish so much,
and with a manner so unhurried.
“I don’t see how you ever do it,” Rose exclaimed one day.
“It’s the busy people who find not only the most time but the most
happiness,” was Silence Blossom’s cheery answer.
And realizing, as she well did, how much more of real happiness
there was in the modest Blossom home than in the big Fifield house,
where no one ever thought of going to ask a service, and every life
was wholly self-centered, Rose could not but admit that this was
true.
“I don’t see what happiness you could find in sitting up all night
with Aunt Polly Brown,” she protested. “I’m sure I never want to go
where there’s sick people. I hope I’ll never be asked to.”
Already in that home where thoughtfulness for others was part of
the daily life, and interest in any who were suffering a matter of
course, it had come about naturally that Rose should be sent with a
handful of flowers, or some dainty for a sick neighbor, or was asked
to call at the door with a message of inquiry. So the next day she
took it as a matter of course when Miss Silence asked her to take a
bowl of chicken broth to Aunt Polly Brown.
“Take it right in to Aunt Polly,” said the young woman who opened
the door. “She’s in the bedroom right off the sitting-room.”
Rose hesitated. She would have refused if she had known exactly
how to do so. As it was, the bowl trembled a little as she walked
through into the bedroom, where on a high four-post bedstead,
under a “blazing star” quilt, Aunt Polly lay, a ruffled night cap
surrounding her shrunken face.
“Well, now,” as Rose told her errand, “it was reel kind of Silence
Blossom to send the broth. I was just thinkin’ that a taste o’ chicken
broth would relish. Sit down, won’t ye,” with a wistful accent, “and tell
me what’s goin’ on? Mary Jane never knows nothin’. Mebby I ain’t
goin’ to get well, but ’tany rate I like to know what folks is doin’.”
“I was standing on one foot wondering how quick I could get out,”
Rose said, relating it all to Miss Silence, on her return. “But when
she spoke that way I just thought that if I were old and sick I’d be
glad to have somebody come in; and I sat down and racked my brain
to tell everything I could think of. She seemed real cheered up when
I came away, and I promised her I’d come again.”
“I thought you never wanted to go where there were sick people,”
and Silence Blossom’s eyes twinkled.
“Well, it wasn’t so bad as I thought it was going to be, though her
hands are kind of skinny. And I don’t think I feel quite as I did about
sick folks now. Besides, it must be dreadful to lie in bed day after
day, and if I can make a little of the time pass, why I’m glad to.”
“There is where the gladness comes in,” said Mrs. Patience. “It is
making the hours of suffering a little brighter, a little easier. And now
you have learned this I think you will never forget it.”
“And I also remember that I promised to come down to Helen
Green’s to get out my Latin with her,” and gathering an armful of
books Rose hurried away.
“I am glad that Rose went in to see Aunt Polly; she is such a bit of
sunshine that she could not help but do her good. Besides, she has
always had such a morbid dread of a sick room,” Silence remarked
as she watched her away.
“I am glad, too,” agreed Mrs. Blossom, “for Rose can gain as well
as give. Of course I would not want her to go where there was any
danger, but her exuberant young nature will be made the deeper and
richer for being stirred and lifted out of itself.”
So among the threads of interest running from the Blossom home
Rose knit her threads. The people of Farmdale became her friends,
and because they were her friends she loved them, and so it was not
strange that she won love in return. With the Fifields her relations
through the years continued of the friendliest. On her part the
painfulness of being falsely accused had faded away; and on their
part the fact that it had been an unjust charge had not only made
them one and all feel that they owed her something in return, but had
awakened an interest in her that otherwise they might never have
felt. Miss Eudora regarded her in the light of a romance; Miss Jane
Fifield commended the fact that she was neither vain, nor, as she
was pleased to put it, “silly”; while Mr. Nathan, in his pride at Rose’s
persistence, and the quality he called her “grit,” went so far as to
freshen up the languages of his college days, that he might the more
help her.
At their time of life it was not to be expected that the Fifield nature
would greatly change; still their friendship for Rose, inexperienced
young girl though she was, brought a new and wholesome
atmosphere into the old house. Her flitting in and out, bright, breezy,
vivacious, was a welcome break in their old formality. A part of
Rose’s nature was her overflowing enthusiasm on the subject then in
mind; her studies, her school pleasures, whatever part was hers in
the life of the village, was all shared with her friends. So when she
came in beaming with excitement over the prettiness of the newest
Banby baby, Miss Fifield and Miss Eudora became conscious that
Mrs. Banby was a neighbor. Or if it were anxiety how little Mrs.
Mather, whose husband had just died and left her with five children,
was ever going to get through the winter; or rejoicing that Fanny
Barber, who had been so low with inflammatory rheumatism was
really improving, almost before they were aware, they would find
themselves becoming interested, an interest that could easily take
the form of a bundle of warm clothing for the widow, or a glass of
Miss Fifield’s famous quince jelly for the invalid. And so by the slight
touches of Rose’s hands they found themselves drawn gradually
from their cold isolation, and nearer to those about them.
CHAPTER XXIII
A VISIT FROM AN OLD FRIEND

Through Cousin Allen Gloin’s wife’s sister, who lived in Horsham,
Rose occasionally heard of the Hagoods, and the year after she left
there was surprised by the news of Mrs. Hagood’s death.
there was surprised by the news of Mrs. Hagood’s death.
“Mr. Hagood takes it real hard,” added her informant, “and says he
don’t know how he’s ever going to get along without Almiry. Some
folks thinks it’s put on, but for my part I don’t.”
“No, indeed,” had been Rose’s answer, “I think he had grown so
used to her ordering him around that now he does feel lost without
it.”
It was not quite two years later when one day, returning from
school, Rose found a horse and buggy standing at the Blossom
gate. This of itself was nothing unusual, for the business of Mrs.
Patience and Miss Silence brought a large share of the Farmdale
people, as well as those outside its limits, to their door. But as Rose
gave a second look in passing at the fat old horse and stout buggy,
she suddenly realized that she had known both before, and
quickening her steps she rushed into the house to find Mr. Hagood,
with Rover sitting upright beside him, waiting her coming. His was
the same familiar figure she remembered so well—thin, grizzled,
slightly stooping; but Rose saw almost in the first glance, that his
motions were brisker than in the days when she had known him, that
his whiskers had been trimmed, that his hat brim had taken an
upward tendency, and his eyes had lost their furtive, timid glance; in
short, that there had been a change in the whole man, slight but still
palpable, in the direction of cheerful, self-assertive manhood.
“Well, now, Posey,” was his greeting, as he held both her hands
and smiled till his face was all a-crinkle, “if it don’t beat natur’ how
you’ve growed! An’ prettier than ever, I declare! I tell you I was reel
tickled when I heerd how well you was fixed, an’ that you’d found out
your reel name, an’ your ma’s relations. You don’t look much like the
little girl Almiry brought home with her from the Refuge.”
“And that you gave the russet apples to?” Rose’s eyes were
twinkling, but the tears were very near them as she recalled that day
of her arrival at the Hagood home.
“So I did, to be sure. Well, Posey—if you hev got another name
you’ll always be Posey to me—we did hev some good times
together, didn’t we?”
Then they talked over the pleasant memories of their
companionship, with a mutual care avoiding those whose
suggestiveness might be the opposite. The only allusion he made to
her leaving was, “Rover an’ me did miss you dreadfully when you
went away, we just did. An’ so to-day, as I had to come over this way,
I said to Rover, ‘We’ll stop an’ see Posey, we will.’ I’m glad we did,
too, an’ I just believe Rover knows you.” And Rover, with his head on
Rose’s knee and her hand smoothing his silky ears, gently thumped
his tail on the floor, as if in affirmative.
Then, after a moment’s hesitation, “I was sorry you an’ Almiry
couldn’t fit together better; she meant well, Almiry did, but you know
she’d never had any little girls of her own.” And as if fearful that he
had cast some reflection on her memory he hastened to add, “Almiry
was a wonderful woman. I tell you I met with a big loss when I lost
her, I just did, an’ for a spell I was about broke up.” He paused with
the query, “I s’pose you’d heard she was dead?”
“Yes, but I never heard the particulars. Was she sick long?”
“No; it come so onexpected it just about floored me, it did. You see
she was taken with a chill, an’ she kep’ a gettin’ colder’n colder, in
spite o’ everythin’ we could giv’ her, an’ do for her. Why, it did seem
that what with the hot things we give her to drink, an’ the hot things
we kep’ around her, that if she’d been a stone image ’twould a
warmed her through; but they didn’t do a mite o’ good, not one mite.
She was took early one morning, an’ late the next night I was
warmin’ a flannel to lay on her. I het it so ’twas all a-smokin’, but she
couldn’t feel nothin’, an’ she give it a fling, an’ riz half up in bed an’
spoke, just as natural as she ever did, ‘Elnathan Hagood, I don’t
believe you’ve hed that nigh the stove; what ails you that you can’t
half do a thing? I’ve a good mind to get up and heat some flannel as
it ought to be done. I won’t hev any till I do.’ An’ with that she fell
right back on her piller, an’ never breathed ag’in. I tell you I was all
broke up.”
Rose did not know what she ought to say, so she said nothing.
Mr. Hagood hesitated, cleared his throat, and remarked in an
inquiring tone, “Mebby you’ve heard that I was married again?”
It was Rose’s turn to be surprised. “No, indeed, I’ve heard nothing
from Horsham since Mrs. Gloin’s sister left there. But I’m glad if you
have.”
“Be you really?” his face brightening. “Well, now, you see,” with the
confidential tone Rose remembered so well, “mebby some folks’ld
think I hadn’t orter done such a thing. But I tell you after a man has
had a home as many years as I had it’s kinder tough to be without
one. I couldn’t live alone; Rover an’ I tried that, an’ everything got
messed up dreadful; keepin’ a hired girl wasn’t much better; an’ to
eat my victuals at somebody else’s table didn’t seem reel natural,
now it didn’t.
“I thought if Almiry knew all the circumstances she wouldn’t blame
me none ef I did marry. An’ there was Mirandy Fraser, Jim Fraser’s
widow—don’t know as you ever knew her, a mighty pretty little
woman—she was havin’ a hard time to get along with her two little
girls, for Jim never was noways forehanded. So I figured it out that
she needed a home, an’ I needed some one to make a home; an’
the long an’ short of it is I married her. An’ the plan’s worked first
rate, well now it has. She ain’t such a manager,” he admitted, “as
Almiry was; but then,” with a touch of pride, “I don’t suppose it would
be easy to find Almiry’s equal there. But I’ll say this, I never did see
Mirandy’s match for bein’ pleasant. I don’t believe anybody ever
heerd her speak cross, I really don’t. She’s so contented, too, with
everything; hasn’t given me the first fault-findin’ word yet, not the first
one.”
“How nice that is!” Rose rejoined heartily.
“An’ the little girls,” all the lines on Mr. Hagood’s face deepened
into a tender smile as he spoke of them, “Susy an’ Ruth, I just wish
you could see them; there never were two prettier-behaved children,
if I do say it. They like to come out an’ sit in the shop when I’m at
work there, just as you used to, an’, well, they an’ Rover an’ me has
some pretty good times together.”
Rose smiled. “I don’t believe they enjoy it any more than I did.”
“I don’t work so much in the shop, though,” he added, “for I’ve a
good deal to look after. I’m over this way now on business. The fact
of the matter is,” an accent of dejection creeping into his tone, “I’ve
made a bad bargain. Ever since Almiry went I’ve kept everything up
straight as a string, an’ haven’t lost a dollar till now. I s’pose she’d
say it was all my fault, an’ so it is,” growing more and more
depressed; “for I suppose I ought to hev known better than to hev
ever lent Tom Hodges a hundred dollars. When he moved away from
Horsham he couldn’t pay me, but he’d got a good place as foreman
in a mill, an’ promised it all right. That was eight months ago, an’ I’ve
never seen a single cent, so I made up my mind I’d go over there an’
look him up, an’ I found Tom to-day down with the rheumatism, not
able to do a stroke o’ work, an’ they looked in pretty bad shape—
well, now they did. Of course he couldn’t pay me, said he hadn’t but
two dollars in money, but there was a cow, I could take that towards
it ef I wanted to. But bless you, there was four little children who
would hev to go without milk ef I took the cow, an’ I told Tom I’d wait
on him till he could earn the money, which just the same as meant
that I’d give it to him, for crippled up as he is he can’t more’n take
care of his family. An’ when I come away I handed his wife five
dollars; she looked as though she needed it, an’ they’ve both always
done as well as they could. I don’t know what Almiry’d say ef she
could know it. But hang it all!” giving his hat a slap on his knee,
“Mirandy said not to be hard on ’em, an’ it won’t kill me ef I do lose it.
“No, I can’t stay all night,” in answer to Rose’s invitation. “I brought
Mirandy an’ the little girls to my Cousin Em’ly’s, ten mile from here,
an’ they’ll be lookin’ for me back. But I wish you’d come an’ see us,
Posey,” as he rose to go. “I’ve told Mirandy about you, an’ she’d do
everything to make it pleasant. We haven’t changed things any to
speak of since you was there, only we live more in the front part o’
the house. I couldn’t help feelin’ at first that Almiry wouldn’t like it, but
I wanted to make it pleasant for Mirandy an’ the children, an’ you
know it wasn’t what you could call reel cheerful in that back kitchen.”
“And can Rover come in the house now?” asked Rose.
“Yes, Rover comes in, an’ we hev the front blinds open, an’
evenin’s last winter we’d hev apples an’ nuts an’ popcorn, ’most as
though it was a party. You know,” with a broad smile, “I never had
any children o’ my own before, an’ I sort o’ enjoy havin’ some little
girls to call me ‘Pa.’”
Rose had come out along the walk with Mr. Hagood. As they
paused at the gate he glanced around to be sure that no one but her
could hear him, then lowering his voice as though fearing it might
reach the ears of the departed Mrs. Hagood, he added confidentially,
“An’ to tell the truth, Posey, just betwixt you and me, I never was so
happy before in my life as I be now.”
CHAPTER XXIV
AND COLLEGE NEXT

It was the third May that Rose had been in Farmdale. The turf on
the open green was emerald velvet, the orchards were drifts of pink
and white, the lilacs by Mrs. Blossom’s gate were lifting spikes of
lavender, and shrubs and roses were heavy with the weight of bud or
bloom. In a swift rush Rose came down the walk, the white gate
clashed behind her, and she dashed into the house, rosy and
breathless with haste, waving a long envelope over her head.
“What do you think that is?” she cried.
Miss Silence glanced up from her sewing machine. “It looks to me
like an envelope.”
“And what do you think is inside it?” pursued Rose.
“A letter is usually inside an envelope,” answered Mrs. Patience.
“You won’t guess,” pouted Rose, “so I shall have to tell you, for I
couldn’t possibly keep it. This is my certificate that I have passed the
teachers’ examination I went to last week, and am duly qualified to
teach. Wish me joy!”
“But I thought thee went to the examination simply for the
practice,” said Grandmother Sweet.
“So I did. But all the same I wanted to pass, and was so afraid I
wouldn’t pass. That’s why I didn’t say more about it. And now that I
have a really, truly certificate to teach! I’m sure I’ve grown an inch
since I took it out of the post-office.”
“We are very glad you succeeded,” and Mrs. Patience held off a
hat to see if the bunch of flowers was in the right place.
“And that isn’t all,” Rose went on blithely. “You need sixteen points
to graduate from the high school, I have fourteen already, because
I’ve taken extra studies; to pass the teachers’ examination counts
two points, so now I can graduate this year.”
“But why do you want to graduate this year? I supposed of course
you were going one more,” and Silence looked her surprise.
“I want to get to teaching. I’m just crazy to begin.”
“Rose, Rose,” Mrs. Blossom in the next room had heard the
conversation, and now stepped to the doorway, “you are too young
to think of teaching; even if you are qualified you have not the self-
control a teacher needs.”
“Oh, don’t say that!” groaned Rose, “when I have struggled with
my temper, and prayed over it, and counted a hundred before I
spoke, and bitten my tongue till it bled, and did all the things I ever
heard of to hold on to myself.”
“And you have done very well,” commended Mrs. Blossom. “You
have overcome much, and learned some hard lessons in the bridling
of your quick tongue, and holding in check your temper. But you
have still more to learn, especially if you are going to teach. I know,
for I was a teacher myself, and while text-books and methods
change, boys and girls, as far as I can see, remain about the same.”
“All I ask is the chance to try some boys and girls.”
“Besides,” Mrs. Blossom’s voice was calmly even, “I do not think
you can teach, that any school board would hire a girl of seventeen.”
“But I know people who have taught when no older than that,”
persisted Rose.
“That might have been once but it is not now. Indeed I am quite
sure that a law has been passed in Ohio that a teacher cannot draw
pay unless she is over eighteen.”
“It is a mean old law,” scorned Rose.
“Another thing,” continued Mrs. Blossom, “your Uncle Samuel is
your guardian, and he did not expect, any more than we did, that you
would leave school till next year; and before taking such a step you
must consult him.”
“Great-Uncle Samuel won’t care,” urged Rose, “and I’ve set my
heart on getting through this year. Besides if I can’t teach I can go to
school another year, and take Latin and German, and review the
common branches.”
“You write to Mr. Jarvis first, and see what he says,” and Rose
knew further argument was useless.
Rose waited and fretted for two weeks before an answer to her
letter came, and when she read it she gave a gasp of surprise.
“What do you think?” she exclaimed. “Great-Uncle Samuel says I
have been a very prudent girl, while from my marks—you know I
have sent them to him every quarter—I seem to have made good
use of my opportunities; so if I will continue to be prudent he thinks
there will be money enough for me to go to college for four years.
This is what he writes: ‘Of course not to a big expensive college, that
would be quite beyond your means, the Fairville Woman’s College is
the one I have chosen for you. I am told that it is an excellent school,
that the location is healthy, and the moral tone excellent. That you
will make good use of its benefits I shall expect. Of course your Aunt
Sarah Hartly ought to have seen to this for you, but as long as she
wouldn’t I have done what seemed to me the best.’”
“Four years in college, will not that be fine?” Silence Blossom’s
own eyes were bright with pleasure.
“Yes, I suppose it will,” Rose spoke slowly. “But, you know, I never
had thought of such a thing as college being possible for me; I did
not think that there was money enough for that. Of course I shall like
it, the only thing is it will make me so old before I get to teaching.”
The older women looked at Rose’s face, that had never lost its
child expression, and laughed at her words.
“It may be though,” she went on, “that I can put in extra studies
and shorten the time.”
“No, no,” protested Mrs. Patience, “to do your best work you do
not want to hurry it.”
Grandmother Sweet stopped her knitting. “Rose, my husband
while a lad served five years as apprentice to a carpenter. His own
work was of the best, and he often said that time spent learning to
use one’s tools was time saved. Now, thee is planning to use books
as tools, and the better thee understands them the better work thee
will do.”
“Oh, of course,” Rose hastened to say, “now the chance has come
to me I wouldn’t miss it for anything. And I will make the best of it,
too. I’m going to send right away and get a prospectus of the college
to see what the entrance requirements are. I’m not going to be
conditioned, and I’d rather be a little ahead. I had planned anyway to
read Virgil this summer with Mr. Fifield, and I can study up whatever
else is needed.”
“I think if you are going to college this fall you will need to do some
sewing as well as studying,” suggested Miss Silence.
“Of course I shall. I know I can’t spend money for a great deal;
what I do have I want neat and in good shape. I’m so glad to know
about it now, for I can plan the dresses I will need when I graduate
from the high school so I can use them then.”
“How many will you need?” asked Silence Blossom.
“The other girls say three; a suit for the Baccalaureate sermon,
another for the senior reception, and the graduating dress.”
“That last will be white, and will answer for your best white dress
all the year, and if you get a pretty grey for your suit that will do for
fall wear.”
“That makes two new dresses,” reflected Rose. “I can’t afford any
more, and one other still to be evolved. I wish the waist wasn’t so
badly worn to the lavender and white striped silk Great-Aunt Sarah
sent in the last box; it would make a pretty dress, and I could mend
up the cream lace to trim it.”
Before Rose had ceased speaking Miss Silence was turning the
leaves of a fashion book. “There is a dress in this last number that I
believe we can copy, and use the purple silk she sent you once to
combine with it. The solid color will give it character, and the lace will
soften and keep it girlish.”
Rose was looking at the plate. “Yes, that will be pretty. You are the
very Wizard of Old Clothes. And if there are scraps enough of silk
and lace left I will make a little hat with purple violets for trimming to
wear with it.”
She paused and lifted an impressive finger. “But mind this, when I
get to earning for myself I will have some pretty dresses, and never
will I wear any more of Great-Aunt Sarah’s cast-offs!”
Mrs. Patience smiled indulgently. “You are young, Rose, it is only
natural you should feel so. But you know you are denying yourself
now so that day may come.”
“I know it,” Rose nodded. “When I have had to go without things I
wanted and that other girls did have, I’ve said, ‘Never mind, you are
having an education.’ I expect to have to say that pretty often when I
get to college—it’s hard to realize that I am going—but I’m not going
to forget that I’m working for a purpose.”
“And that’s better than fine clothes.”
Rose twisted her face. “I wouldn’t object to the fine clothes if I
could have them. But I suppose I shall need some dresses for
everyday wear; the blue dress I had last year will do for that, won’t
it?”
“Yes, and there is your green and red plaid. You can have some
separate waists, too. I’m sure, Rose, we can have your wardrobe in
shape, that if not fine, it will be neat and tasty.”
“What could I ever have done without you all?” Rose paused and
sighed. “I am glad that I can go to college. I shall be gladder the
longer I realize it. But I feel that it will just break my heart to leave
here. If I could only take you all with me or bring the college to
Farmdale.”
“We are glad that you can go to college, Rose,” Mrs. Blossom’s
voice had not quite its usual firmness, “but you may be sure of one
thing, we shall miss you more than you will us. But it is a long time till
September; we will not begin the parting yet.”
“And of course I shall come back in vacations; everybody goes
home then, and this is my home.”
“Do you think a college freshman will remember how to gather
eggs?” asked Mrs. Patience.
“This one will, you may be sure,” laughed Rose, “and how to make
omelet, and custard, and cake with them when they are gathered.
It’s a pity Great-Uncle Samuel never comes so I can show him how
you have taught me to cook.”
It was a busy summer for Rose; she went over all the studies in
which she would be examined for entrance to college, she sewed
and gathered and tucked and hemmed, and when the September
days came she packed her modest wardrobe in her new trunk with a
curious mingling of dread and delight; dread at leaving the life she
knew, the friends she had proved; delight in the new and wider world
opening before her.
There had been talk of Mrs. Patience going with Rose, but it had
not proved possible, so when one sunny September day the stage—
the same stage that had brought her to Farmdale—stopped at the
white gate, and her trunk was strapped on, with a mixture of tears
and smiles the good-bys were said, and Rose settled herself in the
same corner of the back seat she had occupied on that day which
now seemed so far, far in the past, no longer a forlorn little figure,
dingy, travel worn and friendless; but a trim young girl in a pretty grey
suit, leaning out and waving her handkerchief in answer to those
waved to her from nearly every house. For Rose’s friends included
almost every one in Farmdale, and all her friends were interested in
her start for college.

THE END