Jaideep Vaidya
Jin Li (Eds.)

LNCS 11334

Algorithms and Architectures for Parallel Processing

18th International Conference, ICA3PP 2018
Guangzhou, China, November 15–17, 2018
Proceedings, Part I
Lecture Notes in Computer Science 11334
Commenced Publication in 1973
Founding and Former Series Editors:
Gerhard Goos, Juris Hartmanis, and Jan van Leeuwen

Editorial Board
David Hutchison
Lancaster University, Lancaster, UK
Takeo Kanade
Carnegie Mellon University, Pittsburgh, PA, USA
Josef Kittler
University of Surrey, Guildford, UK
Jon M. Kleinberg
Cornell University, Ithaca, NY, USA
Friedemann Mattern
ETH Zurich, Zurich, Switzerland
John C. Mitchell
Stanford University, Stanford, CA, USA
Moni Naor
Weizmann Institute of Science, Rehovot, Israel
C. Pandu Rangan
Indian Institute of Technology Madras, Chennai, India
Bernhard Steffen
TU Dortmund University, Dortmund, Germany
Demetri Terzopoulos
University of California, Los Angeles, CA, USA
Doug Tygar
University of California, Berkeley, CA, USA
Gerhard Weikum
Max Planck Institute for Informatics, Saarbrücken, Germany
More information about this series at http://www.springer.com/series/7407
Editors
Jaideep Vaidya, Rutgers University, Newark, NJ, USA
Jin Li, Guangzhou University, Guangzhou, China

ISSN 0302-9743 ISSN 1611-3349 (electronic)


Lecture Notes in Computer Science
ISBN 978-3-030-05050-4 ISBN 978-3-030-05051-1 (eBook)
https://doi.org/10.1007/978-3-030-05051-1

Library of Congress Control Number: 2018962485

LNCS Sublibrary: SL1 – Theoretical Computer Science and General Issues

© Springer Nature Switzerland AG 2018


This work is subject to copyright. All rights are reserved by the Publisher, whether the whole or part of the
material is concerned, specifically the rights of translation, reprinting, reuse of illustrations, recitation,
broadcasting, reproduction on microfilms or in any other physical way, and transmission or information
storage and retrieval, electronic adaptation, computer software, or by similar or dissimilar methodology now
known or hereafter developed.
The use of general descriptive names, registered names, trademarks, service marks, etc. in this publication
does not imply, even in the absence of a specific statement, that such names are exempt from the relevant
protective laws and regulations and therefore free for general use.
The publisher, the authors, and the editors are safe to assume that the advice and information in this book are
believed to be true and accurate at the date of publication. Neither the publisher nor the authors or the editors
give a warranty, express or implied, with respect to the material contained herein or for any errors or
omissions that may have been made. The publisher remains neutral with regard to jurisdictional claims in
published maps and institutional affiliations.

This Springer imprint is published by the registered company Springer Nature Switzerland AG
The registered company address is: Gewerbestrasse 11, 6330 Cham, Switzerland
Preface

Welcome to the proceedings of the 18th International Conference on Algorithms and


Architectures for Parallel Processing (ICA3PP 2018), which was organized by
Guangzhou University and held in Guangzhou, China, during November 15–17, 2018.
ICA3PP 2018 was the 18th event in a series of conferences devoted to research on
algorithms and architectures for parallel processing. Previous iterations of the confer-
ence include ICA3PP 2017 (Helsinki, Finland, November 2017), ICA3PP 2016
(Granada, Spain, December 2016), ICA3PP 2015 (Zhangjiajie, China, November
2015), ICA3PP 2014 (Dalian, China, August 2014), ICA3PP 2013 (Vietri sul Mare,
Italy, December 2013), ICA3PP 2012 (Fukuoka, Japan, September 2012), ICA3PP
2011 (Melbourne, Australia, October 2011), ICA3PP 2010 (Busan, Korea, May 2010),
ICA3PP 2009 (Taipei, Taiwan, June 2009), ICA3PP 2008 (Cyprus, June 2008),
ICA3PP 2007 (Hangzhou, China, June 2007), ICA3PP 2005 (Melbourne, Australia,
October 2005), ICA3PP 2002 (Beijing, China, October 2002), ICA3PP 2000 (Hong
Kong, China, December 2000), ICA3PP 1997 (Melbourne, Australia, December 1997),
ICA3PP 1996 (Singapore, June 1996), and ICA3PP 1995 (Brisbane, Australia, April
1995).
ICA3PP is now recognized as the main regular event in the area of parallel algo-
rithms and architectures, which covers many dimensions including fundamental the-
oretical approaches, practical experimental projects, and commercial and industry
applications. This conference provides a forum for academics and practitioners from
countries and regions around the world to exchange ideas for improving the efficiency,
performance, reliability, security, and interoperability of computing systems and
applications.
ICA3PP 2018 attracted over 400 high-quality research papers highlighting the
foundational work that strives to push beyond the limits of existing technologies,
including experimental efforts, innovative systems, and investigations that identify
weaknesses in existing parallel processing technology. Each submission was reviewed
by at least two experts in the relevant areas, on the basis of their significance, novelty,
technical quality, presentation, and practical impact. According to the review results,
141 full papers were selected to be presented at the conference, giving an acceptance
rate of 35%. In addition, we accepted 50 short papers and 24 workshop papers. In
addition to the paper presentations, the program of the conference included four key-
note speeches and two invited talks from esteemed scholars in the area, namely: Prof.
Xuemin (Sherman) Shen, University of Waterloo, Canada; Prof. Wenjing Lou, Virginia
Tech, USA; Prof. Witold Pedrycz, University of Alberta, Canada; Prof. Xiaohua Jia,
City University of Hong Kong, Hong Kong; Prof. Xiaofeng Chen, Xidian University,
China; Prof. Xinyi Huang, Fujian Normal University, China. We were extremely
honored to have them as the conference keynote speakers and invited speakers.
ICA3PP 2018 was made possible by the behind-the-scenes efforts of selfless indi-
viduals and organizations who volunteered their time and energy to ensure the success
of this conference. We would like to express our special appreciation to Prof. Yang
Xiang, Prof. Weijia Jia, Prof. Yi Pan, Prof. Laurence T. Yang, and Prof. Wanlei Zhou,
the Steering Committee members, for giving us the opportunity to host this prestigious
conference and for their guidance with the conference organization. We would like to
emphasize our gratitude to the general chairs, Prof. Albert Zomaya and Prof. Minyi
Guo, for their outstanding support in organizing the event. Thanks also to the publicity
chairs, Prof. Zheli Liu and Dr Weizhi Meng, for the great job in publicizing this event.
We would like to give our thanks to all the members of the Organizing Committee and
Program Committee for their efforts and support.
The ICA3PP 2018 program included two workshops, namely, the ICA3PP 2018
Workshop on Intelligent Algorithms for Large-Scale Complex Optimization Problems
and the ICA3PP 2018 Workshop on Security and Privacy in Data Processing. We
would like to express our sincere appreciation to the workshop chairs: Prof. Ting Hu,
Prof. Feng Wang, Prof. Hongwei Li and Prof. Qian Wang.
Last but not least, we would like to thank all the contributing authors and all
conference attendees, as well as the great team at Springer that assisted in producing the
conference proceedings, and the developers and maintainers of EasyChair.

November 2018

Jaideep Vaidya
Jin Li
Organization

General Chairs
Albert Zomaya University of Sydney, Australia
Minyi Guo Shanghai Jiao Tong University, China

Program Chairs
Jaideep Vaidya Rutgers University, USA
Jin Li Guangzhou University, China

Publication Chair
Yu Wang Guangzhou University, China

Publicity Chairs
Zheli Liu Nankai University, China
Weizhi Meng Technical University of Denmark, Denmark

Steering Committee
Yang Xiang (Chair) Swinburne University of Technology, Australia
Weijia Jia Shanghai Jiaotong University, China
Yi Pan Georgia State University, USA
Laurence T. Yang St. Francis Xavier University, Canada
Wanlei Zhou Deakin University, Australia

Program Committee
Pedro Alonso Universitat Politècnica de València, Spain
Daniel Andresen Kansas State University, USA
Cosimo Anglano Universitá del Piemonte Orientale, Italy
Danilo Ardagna Politecnico di Milano, Italy
Kapil Arya Northeastern University, USA
Marcos Assuncao Inria, France
Joonsang Baek University of Wollongong, Australia
Anirban Basu KDDI Research Inc., Japan
Ladjel Bellatreche LIAS/ENSMA, France
Jorge Bernal Bernabe University of Murcia, Spain
Thomas Boenisch High-Performance Computing Center Stuttgart,
Germany
George Bosilca University of Tennessee, USA


Massimo Cafaro University of Salento, Italy
Philip Carns Argonne National Laboratory, USA
Alexandra Carpen-Amarie Vienna University of Technology, Austria
Aparicio Carranza City University of New York, USA
Aniello Castiglione University of Salerno, Italy
Arcangelo Castiglione University of Salerno, Italy
Pedro Castillo University of Granada, Spain
Tzung-Shi Chen National University of Tainan, Taiwan
Kim-Kwang Raymond Choo The University of Texas at San Antonio, USA
Mauro Conti University of Padua, Italy
Jose Alfredo Ferreira Costa Federal University, UFRN, Brazil
Raphaël Couturier University Bourgogne Franche-Comté, France
Miguel Cárdenas Montes CIEMAT, Spain
Masoud Daneshtalab Mälardalen University and Royal Institute
of Technology, Sweden
Casimer Decusatis Marist College, USA
Eugen Dedu University of Bourgogne Franche-Comté, France
Juan-Carlos Díaz-Martín University of Extremadura, Spain
Matthieu Dorier Argonne National Laboratory, USA
Avgoustinos Filippoupolitis University of Greenwich, UK
Ugo Fiore Federico II University, Italy
Franco Frattolillo University of Sannio, Italy
Marc Frincu West University of Timisoara, Romania
Jorge G. Barbosa University of Porto, Portugal
Chongzhi Gao Guangzhou University, China
Jose Daniel García University Carlos III of Madrid, Spain
Luis Javier García Villalba Universidad Complutense de Madrid, Spain
Paolo Gasti New York Institute of Technology, USA
Vladimir Getov University of Westminster, UK
Olivier Gluck Université de Lyon, France
Jing Gong KTH Royal Institute of Technology, Sweden
Amina Guermouche Telecom Sud-Paris, France
Jeff Hammond Intel, USA
Feng Hao Newcastle University, UK
Houcine Hassan Universitat Politècnica de València, Spain
Sun-Yuan Hsieh National Cheng Kung University, Taiwan
Chengyu Hu Shandong University, China
Xinyi Huang Fujian Normal University, China
Mauro Iacono University of Campania Luigi Vanvitelli, Italy
Shadi Ibrahim Inria, France
Yasuaki Ito Hiroshima University, Japan
Mathias Jacquelin Lawrence Berkeley National Laboratory, USA
Nan Jiang East China Jiaotong University, China
Lu Jiaxin Jiangxi Normal University, China
Edward Jung Kennesaw State University, USA


Georgios Kambourakis University of the Aegean, Greece
Gabor Kecskemeti Liverpool John Moores University, UK
Muhammad Khurram Khan King Saud University, Saudi Arabia
Dieter Kranzlmüller Ludwig Maximilian University of Munich, Germany
Michael Kuhn University of Hamburg, Germany
Julian Kunkel German Climate Computing Center, Germany
Algirdas Lančinskas Vilnius University, Lithuania
Patrick P. C. Lee The Chinese University of Hong Kong, SAR China
Laurent Lefevre Inria, France
Hui Li University of Electronic Science and Technology
of China, China
Kenli Li Hunan University, China
Dan Liao University of Electronic Science and Technology
of China, China
Jingyu Liu Hebei University of Technology, China
Joseph Liu Monash University, Australia
Yunan Liu Jiangxi Normal University, China
Zheli Liu Nankai University, China
Jay Lofstead Sandia National Laboratories, USA
Paul Lu University of Alberta, Canada
Amit Majumdar University of California San Diego, USA
Tomas Margalef Universitat Autonoma de Barcelona, Spain
Stefano Markidis KTH Royal Institute of Technology, Sweden
Alejandro Masrur Chemnitz University of Technology, Germany
Susumu Matsumae Saga University, Japan
Raffaele Montella University of Naples Parthenope, Italy
Francesco Moscato University of Campania Luigi Vanvitelli, Italy
Bogdan Nicolae Argonne National Laboratory, USA
Francesco Palmieri University of Salerno, Italy
Swann Perarnau Argonne National Laboratory, USA
Dana Petcu West University of Timisoara, Romania
Salvador Petit Universitat Politècnica de València, Spain
Riccardo Petrolo Rice University, USA
Florin Pop University Politehnica of Bucharest, Romania
Radu Prodan University of Klagenfurt, Austria
Zhang Qikun Beijing Institute of Technology, China
Thomas Rauber University Bayreuth, Germany
Khaled Riad Zagazig University, Egypt
Suzanne Rivoire Sonoma State University, USA
Ivan Rodero Rutgers University, USA
Romain Rouvoy University of Lille, France
Antonio Ruiz-Martínez University of Murcia, Spain
Françoise Sailhan CNAM, France
Sherif Sakr The University of New South Wales, Australia
Giandomenico Spezzano ICAR-CNR and University of Calabria, Italy
Patricia Stolf IRIT, France


John Stone University of Illinois at Urbana-Champaign, USA
Peter Strazdins The Australian National University, Australia
Hari Subramoni The Ohio State University, USA
Gang Sun University of Science and Technology of China, China
Zhizhuo Sun Beijing Institute of Technology, China
Frederic Suter CNRS, France
Yu-An Tan Beijing Institute of Technology, China
Ming Tao Dongguan University of Technology, China
Andrei Tchernykh CICESE Research Center, Mexico
Massimo Torquati University of Pisa, Italy
Tomoaki Tsumura Nagoya Institute of Technology, Japan
Didem Unat Koç University, Turkey
Vladimir Voevodin Moscow University, Russia
Feng Wang Wuhan University, China
Hao Wang Shandong Normal University, China
Yu Wei Nankai University, China
Sheng Wen Swinburne University of Technology, Australia
Jigang Wu Guangdong University of Technology, China
Roman Wyrzykowski Czestochowa University of Technology, Poland
Yu Xiao Shandong University of Technology, China
Ramin Yahyapour University of Göttingen, Germany
Fang Yan Beijing Wuzi University, China
Zheng Yan Xidian University, China
Laurence T. Yang St. Francis Xavier University, Canada
Wun-She Yap Universiti Tunku Abdul Rahman, Malaysia
Contents – Part I

Distributed and Parallel Computing

Network-Aware Grouping in Distributed Stream Processing Systems . . . . . . . 3


Fei Chen, Song Wu, and Hai Jin

vPlacer: A Co-scheduler for Optimizing the Performance of Parallel


Jobs in Xen . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 19
Peng Jiang, Ligang He, Shenyuan Ren, Zhiyan Chen, and Rui Mao

Document Nearest Neighbors Query Based on Pairwise Similarity


with MapReduce. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 34
Peipei Lv, Peng Yang, Yong-Qiang Dong, and Liang Gu

Accurate Identification of Internet Video Traffic Using Byte Code


Distribution Features . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 46
Yuxi Xie, Hanbo Deng, Lizhi Peng, and Zhenxiang Chen

RISC: Risk Assessment of Instance Selection in Cloud Markets . . . . . . . . . . 59


Jingyun Gu, Zichen Xu, and Cuiying Gao

Real-Time Data Stream Partitioning over a Sliding Window


in Real-Time Spatial Big Data . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 75
Sana Hamdi, Emna Bouazizi, and Sami Faiz

A Priority and Fairness Mixed Compaction Scheduling Mechanism


for LSM-tree Based KV-Stores . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 89
Lidong Chen, Yinliang Yue, Haobo Wang, and Jianhua Wu

PruX: Communication Pruning of Parallel BFS in the


Graph 500 Benchmark . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 106
Menghan Jia, Yiming Zhang, Dongsheng Li, and Songzhu Mei

Comparative Study of Distributed Deep Learning Tools


on Supercomputers . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 122
Xin Du, Di Kuang, Yan Ye, Xinxin Li, Mengqiang Chen,
Yunfei Du, and Weigang Wu

Noncooperative Optimization of Multi-user Request Strategy


in Cloud Service Composition Reservation . . . . . . . . . . . . . . . . . . . . . . . . . 138
Zheng Xiao, Yang Guo, Gang Liu, and Jiayi Du

Most Memory Efficient Distributed Super Points Detection


on Core Networks . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 153
Jie Xu, Wei Ding, and Xiaoyan Hu

Parallel Implementation and Optimizations of Visibility Computing of 3D


Scene on Tianhe-2 Supercomputer . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 168
Zhengwei Xu, Xiaodong Wang, Congpin Zhang, and Changmao Wu

Efficient Algorithms of Parallel Skyline Join over Data Streams . . . . . . . . . . 184


Jinchao Zhang, JingZi Gu, Shuai Cheng, Bo Li, Weiping Wang,
and Dan Meng

Air Flow Based Failure Model for Data Centers . . . . . . . . . . . . . . . . . . . . . 200


Hao Feng, Yuhui Deng, and Liang Yu

Adaptive Load Balancing on Multi-core IPsec Gateway . . . . . . . . . . . . . . . . 215


Wei Li, Shengjie Hu, Guanchao Sun, and Yunchun Li

An Active Learning Based on Uncertainty and Density Method


for Positive and Unlabeled Data . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 229
Jun Luo, Wenan Zhou, and Yu Du

TAMM: A New Topology-Aware Mapping Method for Parallel


Applications on the Tianhe-2A Supercomputer . . . . . . . . . . . . . . . . . . . . . . 242
Xinhai Chen, Jie Liu, Shengguo Li, Peizhen Xie, Lihua Chi,
and Qinglin Wang

Adaptive Data Sampling Mechanism for Process Object . . . . . . . . . . . . . . . 257


Yongzheng Lin, Hong Liu, Zhenxiang Chen, Kun Zhang, and Kun Ma

MedusaVM: Decentralizing Virtual Memory System for Multithreaded


Applications on Many-core . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 267
Miao Cai, Shenming Liu, Weiyong Yang, and Hao Huang

An Efficient Retrieval Method for Astronomical Catalog


Time Series Data . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 284
Bingyao Li, Ce Yu, Xiaoteng Hu, Jian Xiao, Shanjiang Tang,
Lianmeng Li, and Bin Ma

Maintaining Root via Custom Android Kernel Across


Over-The-Air Upgrade. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 299
Huang Zucheng, Liu Lu, Li Yuanzhang, Zhang Yu, and Zhang Qikun

Accelerating Pattern Matching with CPU-GPU Collaborative Computing . . . . 310


Victoria Sanz, Adrián Pousa, Marcelo Naiouf, and Armando De Giusti

An Incremental Map Matching Algorithm Based on Weighted


Shortest Path . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 323
Jixiao Chen, Yongjian Yang, Liping Huang, Zhuo Zhu, and Funing Yang

Asynchronous Parallel Dijkstra’s Algorithm on Intel Xeon Phi Processor:


How to Accelerate Irregular Memory Access Algorithm. . . . . . . . . . . . . . . . 337
Weidong Zhang, Lei Zhang, and Yifeng Chen

DA Placement: A Dual-Aware Data Placement in a Deduplicated


and Erasure-Coded Storage System . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 358
Mingzhu Deng, Ming Zhao, Fang Liu, Zhiguang Chen, and Nong Xiao

Improving Restore Performance of Deduplication Systems by Leveraging


the Chunk Sequence in Backup Stream . . . . . . . . . . . . . . . . . . . . . . . . . . . 378
Ru Yang, Yuhui Deng, Cheng Hu, and Lei Si

Blockchain-Based Secure and Reliable Distributed Deduplication Scheme . . . 393


Jingyi Li, Jigang Wu, Long Chen, and Jiaxing Li

Controlled Channel Attack Detection Based on Hardware Virtualization . . . . 406


Chenyi Qiang, Weijie Liu, Lina Wang, and Rongwei Yu

CuAPSS: A Hybrid CUDA Solution for AllPairs Similarity Search . . . . . . . . 421


Yilin Feng, Jie Tang, Chongjun Wang, and Junyuan Xie

A Parallel Branch and Bound Algorithm for the Probabilistic TSP . . . . . . . . 437
Mohamed Abdellahi Amar, Walid Khaznaji, and Monia Bellalouna

Accelerating Artificial Bee Colony Algorithm with Elite


Neighborhood Learning . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 449
Xinyu Zhou, Yunan Liu, Yong Ma, Mingwen Wang,
and Jianyi Wan

Distributed Parallel Simulation of Primary Sample Space Metropolis


Light Transport. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 465
Changmao Wu, Changyou Zhang, and Qiao Sun

Parallel Statistical and Machine Learning Methods for Estimation


of Physical Load . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 483
Sergii Stirenko, Peng Gang, Wei Zeng, Yuri Gordienko, Oleg Alienin,
Oleksandr Rokovyi, Nikita Gordienko, Ivan Pavliuchenko,
and Anis Rojbi

Parallel Communication Mechanisms in Solving Integral Equations


for Electromagnetic Scattering Based on the Method of Moments . . . . . . . . . 498
Lan Song, Dennis K. Peters, Weimin Huang, Zhiwei Liu, Lixia Lei,
and Tangliu Wen

POWER: A Parallel-Optimization-Based Framework Towards Edge


Intelligent Image Recognition and a Case Study . . . . . . . . . . . . . . . . . . . . . 508
Yingyi Yang, Xiaoming Mai, Hao Wu, Ming Nie, and Hui Wu

SP-TSRM: A Data Grouping Strategy in Distributed Storage System. . . . . . . 524


Dongjie Zhu, Haiwen Du, Ning Cao, Xueming Qiao, and Yanyan Liu

Abstract Parallel Array Types and Ghost Cell Update Implementation . . . . . . 532
Shuang Zhang, Bei Wang, and Yifeng Chen

High Performance Computing

Accelerating Low-End Edge Computing with Cross-Kernel


Functionality Abstraction . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 545
Chao Wu, Yaoxue Zhang, Yuezhi Zhou, and Qiushi Li

A High-Performance and High-Reliability RAIS5 Storage Architecture


with Adaptive Stripe . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 562
Linjun Mei, Dan Feng, Lingfang Zeng, Jianxi Chen, and Jingning Liu

ADAM: An Adaptive Directory Accelerating Mechanism for NVM-Based


File Systems. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 578
Xin Cui, Linpeng Huang, and Shengan Zheng

A Parallel Method for All-Pair SimRank Similarity Computation . . . . . . . . . 593


Xuan Huang, Xingkun Gao, Jie Tang, and Gangshan Wu

CLDM: A Cache Cleaning Algorithm for Host Aware SMR Drives . . . . . . . 608
Wenguo Liu, Lingfang Zeng, and Dan Feng

HyGrid: A CPU-GPU Hybrid Convolution-Based Gridding Algorithm


in Radio Astronomy . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 621
Qi Luo, Jian Xiao, Ce Yu, Chongke Bi, Yiming Ji, Jizhou Sun, Bo Zhang,
and Hao Wang

COUSTIC: Combinatorial Double Auction for Crowd Sensing


Task Assignment in Device-to-Device Clouds. . . . . . . . . . . . . . . . . . . . . . . 636
Yutong Zhai, Liusheng Huang, Long Chen, Ning Xiao,
and Yangyang Geng

Author Index . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 653


Distributed and Parallel Computing
Network-Aware Grouping in Distributed
Stream Processing Systems

Fei Chen, Song Wu(B) , and Hai Jin

Services Computing Technology and System Lab, Cluster and Grid Computing Lab,
School of Computer Science and Technology,
Huazhong University of Science and Technology, Wuhan 430074, China
{fei chen 2013,wusong,hjin}@hust.edu.cn

Abstract. Distributed Stream Processing (DSP) systems have recently


attracted much attention because of their ability to process huge volumes
of real-time stream data with very low latency on clusters of commodity
hardware. Existing workload grouping strategies in a DSP system can
be classified into four categories (i.e. raw and blind, data skewness, clus-
ter heterogeneity, and dynamic load-aware). However, these traditional
stream grouping strategies do not consider network distance between
two communicating operators. In fact, the traffic from different network
channels makes a significant impact on performance. How to group
tuples according to network distances to improve performance has become
a critical problem.
In this paper, we propose a network-aware grouping framework called
Squirrel to improve the performance under different network distances.
Identifying the network location of two communicating operators, Squir-
rel sets a weight and priority for each network channel. It introduces
Weight Grouping to assign different numbers of tuples to each network
channel according to channel’s weight and priority. In order to adapt
to changes in network conditions, input load, resources and other fac-
tors, Squirrel uses Dynamic Weight Control to adjust network channel’s
weight and priority online by analyzing runtime information. Experi-
mental results prove Squirrel’s effectiveness and show that Squirrel can
achieve 1.67x improvement in terms of throughput and reduce the latency
by 47%.

Keywords: Stream processing · Load balancing · Grouping · Network distance

1 Introduction
DSP systems such as Storm [18], Naiad [9], Flink [4], Heron [8] model a streaming
application as a directed acyclic graph (DAG), where vertices are called process-
ing elements (i.e. operator) and edges are called streams. A processing element
(or operator) receives data tuples from input queue, performs computation logics
on the tuples, and emits the output to output queue. Edges represent the data
© Springer Nature Switzerland AG 2018
J. Vaidya and J. Li (Eds.): ICA3PP 2018, LNCS 11334, pp. 3–18, 2018.
https://doi.org/10.1007/978-3-030-05051-1_1
streaming directions. For scalability, each processing element can have a set of
instances running in parallel called processing element instance. Upstream pro-
cessing element instances partition the streams into sub-streams using different
grouping strategies and balance these workloads among multiple downstream
instances.
Existing workload grouping strategies in a DSP system can be classified
into four categories (i.e. raw and blind, data skewness, cluster heterogeneity,
and dynamic load-aware). Shuffle grouping, which forwards messages typically
in a round-robin way to achieve load balancing, is a raw and blind method.
The second kind of grouping is developed for dealing with data skewness. For
example, PKG [11–13] adopts the power of two choices to dispatch tuples in the
presence of bounded skew. The third category targets cluster heterogeneity:
Consistent Grouping [10] partitions tuples to each downstream operator according
to each node's capacity in heterogeneous clusters. The last category is dynamic
load-aware grouping. OSG [15] keeps track of
tuple execution times and adopts a greedy online scheduling mechanism to assign
tuples to operators dynamically. We show an example of shuffle grouping in
Fig. 1.

Fig. 1. An example of shuffle grouping
Fig. 2. Architecture of Flink

However, these traditional stream grouping strategies do not consider net-


work distance between two communicating operators. In fact, the traffic from
different network channels makes a significant impact on performance [2,19].
Intra-process communication (i.e. Process Local), where two operators locate in
the same Java Virtual Machine (JVM) process, delivers messages in shared-
memory way. Inter-process communication (i.e. Node Local), where two oper-
ators locate in different JVM processes but in the same host, uses sockets over the
loopback device to pass messages. Inter-node communication (i.e. Rack Local),
where two operators locate in different hosts but in the same rack, forwards
messages via Network Interface Card (NIC) and one-hop router. Inter-rack
communication (i.e. Any), where two operators locate in different racks, sends
and receives messages through at least two-hop routers. The performance of
different communication modes differs greatly. How to group messages according
to network distances to improve performance has become a critical problem.
In this paper, we propose a network-aware grouping framework called Squirrel
to assign tuples to different network channels. By locating the resource location
(TaskManager, Host, Rack) of two communicating operators, we set a weight
and priority for each network channel. Weight Grouping decides the number of
tuples sent to each network channel according to channel’s weight and priority. In
order to adapt to changes in network conditions, input load, resources, and other
factors, Dynamic Weight Control adjusts network channel’s weight and priority
online by periodically analyzing runtime information such as tuple execution
time. In summary, this paper makes the following contributions:
– We analyze distinctions of different network distances and identify differen-
tiated performance for stream applications under different network channels.
We find that performance degrades as the network distance between two
communicating operators increases.
– We present Squirrel, a network-aware grouping framework for assigning tuples
to different network channels. By locating the network distances between two
communicating operators, we introduce weight grouping to distribute tuples
to downstream operator instances. We adopt dynamic weight control to adjust
the weight of each network channel online based on runtime information.
– We have implemented Squirrel on Flink and conducted a comprehensive eval-
uation with various benchmarks. Experimental results show that Squirrel can
achieve 1.67x improvement in terms of throughput and reduce the latency by
47%.
The rest of this paper is organized as follows. In Sect. 2, we introduce the
architecture of Flink and clarify traditional stream grouping strategies. Section 3
explains the motivation of network-aware grouping. In Sect. 4, we describe our
network-aware grouping strategy and the design of Squirrel. Section 4.1 presents
the implementation of Squirrel based on Flink. We evaluate the performance of
our system in Sect. 5. Section 6 surveys the related works briefly. Finally, Sect. 7
concludes this paper.

2 Background
In this section, we first introduce the architecture of Flink. Then we classify
traditional stream grouping strategies into four categories and describe their
representative strategy respectively.

2.1 Architecture of Flink


Flink is a typical open-source stream processing framework for low-latency
streaming applications [1,4]. As depicted in Fig. 2, a Flink cluster consists of
three kinds of processes: the Client, the JobManager, and the TaskManager.
The Client receives and parses the program code, transforms it to a dataflow
graph (i.e. DAG), and submits this graph to the JobManager. The JobManager
is responsible for the distributed execution of the dataflow. It monitors the state
and progress of each operator and stream, schedules and deploys new operators,
and conducts checkpointing and recovery when failures occur. The TaskMan-
ager launches each operator as a task instance, and reports operator’s status
and progress to the JobManager. It also manages the buffer pools to buffer
the streams data, and maintains the network connections to transmit messages
between operators.

2.2 Stream Grouping Strategies

We classify traditional stream grouping strategies into four categories.

Raw and Blind. Many default grouping strategies of open-source distributed


stream processing systems are raw and blind. They adopt either a round-robin
fashion or a hashing manner. Shuffle grouping forwards messages typically in a
round-robin way. Messages are distributed across the downstream tasks one by
one such that each task can receive the same number of messages. Key grouping
is usually implemented through hashing and allows each upstream task to route
each message via its key. This ensures that messages which have the same key are
processed by the same downstream task.
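
To make the two blind strategies concrete, the following minimal Java sketch shows how an upstream task might pick a downstream channel; the class and method names are illustrative assumptions, not the API of any particular DSP system.

// Illustrative sketch only: channel selection for shuffle and key grouping.
public final class BlindGrouping {
    private int nextChannel = 0; // round-robin cursor used by shuffle grouping

    // Shuffle grouping: distribute tuples across downstream channels one by one.
    public int shuffleSelect(int numChannels) {
        int channel = nextChannel;
        nextChannel = (nextChannel + 1) % numChannels;
        return channel;
    }

    // Key grouping: hash the key so that tuples with the same key always reach the same channel.
    public int keySelect(Object key, int numChannels) {
        return Math.floorMod(key.hashCode(), numChannels);
    }
}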

Data Skewness. Real-world stream workloads are usually with skewed dis-
tribution. Partial Key Grouping (PKG) [11–13] uses the classical power of two
choices to achieve nearly perfect load balance when data skew occurs. DStream
[5] adopts different partitioning mechanisms for different keys. It groups the pop-
ular keys using shuffle grouping and handles unpopular keys with key grouping.
DKG [16] monitors the incoming stream to identify skewed keys and estimate
their frequency. It uses these estimated frequencies and the buckets’ size as
parameters to run a greedy scheduling algorithm in order to obtain a one-to-one
mapping of elements to target instances.

Cluster Heterogeneity. Heterogeneities are very common in current clusters.


[17] uses the TCP blocking rate per connection as the metric of heterogeneous
workers’ service rate and models the load balancing problem as a minimax sep-
arable resource allocation problem. [10] proposes Consistent Grouping (CG) to
group tuples to each processing element instance according to its capacity. CG
borrows the concept of virtual workers from the traditional consistent hashing
and can handle both the potential skewness in input data distribution, as well
as the heterogeneity in resources.

Dynamic Load-Aware. [7] proposes to partition the workload by using a mixed
strategy. It first assigns some keys to the specified target worker threads and
handles all other keys with the hash function. The system can update the rout-
ing table which has the chosen keys adaptively when incoming data stream
has short-term distribution fluctuations. OSG [15] leverages sketches to monitor
tuple execution time at operator instances, assign tuples to operator instances by
applying a greedy online multiprocessor scheduling algorithm at runtime. [3] col-
lects and analyzes the application’s traces online to identify correlations between
the keys. It distributes correlated keys to the instances which are located on the
same server to reduce network traffic.

3 Motivation

In this section, we first introduce distinctions of four different network dis-


tances, and then conduct experimental analysis under different network dis-
tances. Finally, we present the key idea of Squirrel.

3.1 Network Distances

In current deployment of stream application topology, there are four network


distances between two communicating operators. They are intra-process (i.e.
Process Local), inter-process (i.e. Node Local), inter-node (i.e. Rack Local), and
inter-rack (i.e. Any). Different network distances have different communication
modes, accompany with different performance impacts. Table 1 summarizes the
characteristics of each network distance.

Table 1. Comparison of different network distances

Location       Communication   Manner           Serialization   Latency   Bandwidth
Process Local  Inter-thread    Shared memory    No              100 ns    20 GB/s
Node Local     Inter-process   Loopback device  Yes             20 us     2.5 GB/s
Rack Local     Inter-node      1-hop router     Yes             200 us    100 MB/s
Any            Inter-rack      N-hop router     Yes             350 us    10 MB/s

Intra-process communication, where two operators locate in the same JVM


process, delivers messages in shared-memory way. These two operators work in
the same memory address space of one JVM. Upstream task only needs to pass
a pointer of messages to downstream task. Messages do not need to be serialized
and deserialized. Intra-process communication is the fastest with the latency of
memory access (about 100 ns) and has the largest bandwidth as high as 20 GB/s.
Inter-process communication, where two operators locate in different JVM pro-
cesses but in the same host, uses sockets over the loopback device to pass messages.
Network packets are forwarded by the loopback device in the network layer and do
not need to go through a NIC or a router. Messages need to be serialized and
deserialized. Inter-process communication is fast with the latency of 20 us and
has large bandwidth of 2.5 GB/s. Inter-node communication, where two opera-
tors locate in different hosts but in the same rack, forwards messages via NIC
and one-hop router. Messages need to be serialized and deserialized. Inter-node
communication is slow with the latency of 200 us and has small bandwidth of
100 MB/s. Inter-rack communication, where two operators locate in different
racks, sends and receives messages through at least two-hop routers. Messages
need to be serialized and deserialized. Inter-rack communication is the slowest
with the latency of 350 us and has the smallest bandwidth of 10 MB/s.

3.2 Experimental Analysis

In order to clarify the motivation, we conduct some experiments to show differ-


entiated performance under different network distances. Experimental setup is
listed in Sect. 5. We use classic wordcount as the workload.

Performance Under Different Network Distances. Figure 3 shows the per-


formance when running wordcount under different network distances. We mea-
sure the tuples’ average processing time in Fig. 3(a) and throughput in Fig. 3(b)
respectively. Process Local where two slot sharing groups are deployed in the
same JVM (i.e. taskManager in Flink) can achieve average processing time as low
as 0.47 ms and throughput as high as 33000 tuples/s. The performance gets worse
as the network distance between the two operators increases. Average processing time in
Node Local increases by 3.5x compared with Process Local. Throughput in Rack Local
decreases by about 2.5x compared with Process Local. Performance under inter-rack com-
munication is the worst of all, where average processing time increases by 8.8x
and throughput decreases by up to 3.6x compared with Process Local.
Fig. 3. Performance under different network distances: (a) Average Processing Time; (b) Throughput

Performance Between Two Communicating Operators. We measure the


performance between two communicating operators in Fig. 4. Average transmis-
sion time is measured from the time when a tuple enters FlatMap to the time when
the tuple reaches Keyed Aggregation. From Fig. 4, we can see that different
network channels show great performance differences. Tuples’ average transmis-


sion time in Process Local can be as low as 0.032ms and throughput can be
as high as 180000 tuples/s. However, performance degrades significantly when
two operators are further apart in network distance. It shows a similar tendency
to the whole topology's performance in Fig. 3. Average transmission time in
Node Local increases by 5.8x compared with Process Local. Throughput in Rack Local
decreases by about 2.3x compared with Process Local. Performance under inter-rack com-
munication is the worst of all, where average transmission time increases by 19.5x
and throughput decreases by up to 3.8x compared with Process Local.

Fig. 4. Performance between two communicating operators: (a) Average Transmission Time; (b) Throughput

In summary, different network channels can achieve differentiated perfor-


mance. When two operators are further apart in network distance, application
performance gets much worse. How tuples are grouped according to network
distance therefore has a great impact on application performance.

4 System Design

In this section, we describe the system design of Squirrel. We begin with an


overview of Squirrel’s architecture, followed by introducing the system details.

4.1 System Overview

Figure 5 overviews the architecture of Squirrel. Aware of the resource location
of each task, the Network Locator identifies the network distance between two communicat-
ing tasks. The Load Monitor collects and analyzes runtime information to adapt to
changes of network conditions and other factors. Squirrel centers on the design
of weight grouping and dynamic weight control to enable efficient network-aware
grouping.

– Weight Grouping assigns different weights and priorities for different net-
work channels. Weight Grouping chooses a target channel for tuples according
to channel’s weight and priority.
Fig. 5. Architecture of Squirrel
Fig. 6. Actions of Squirrel

– Dynamic Weight Control analyzes tuple execution information from all


downstream tasks and allows Squirrel to adjust each channel’s weight dynam-
ically according to runtime information.
We present Squirrel’s actions in Fig. 6. Squirrel works as follows: (1) when a
task is deployed and begins running, Network Locator in JobManager records
the resource location (such as TaskManagerID, HostID, RackID) and broadcasts
the location to all upstream tasks. Upstream tasks compare this location with
its own location and decide the network distances of these two tasks. (2) Squirrel
sets initial weight and priority for each network channel according to network
distance. The closer the network distance is, the greater the weight and the higher
priority (Process Local > Node Local > Rack Local > Any). (3) Squirrel sends
tuples to downstream tasks by weight grouping. (4) Load Monitor collects and
analyzes runtime information to be aware of the changes of network conditions,
input load, resources. (5) Dynamic weight control adjusts each network channel’s
weight according to runtime information from downstream tasks such as tuple
waiting time, tuple execution time.

4.2 Network and Load Aware


Network Aware. Considering the pipelined deployment mode in distributed
stream processing systems, a task instance of an upstream operator is likely to be
deployed and scheduled after downstream tasks. If there is one downstream task
deployed and running, JobManager records its resource location and broadcasts
the location to all alive upstream tasks. If there is one upstream task deployed
after downstream tasks, it will not receive this location information. However,
it notifies pipelined consumers (i.e. downstream tasks) when its result partition
is ready. This will trigger the deployment of consuming tasks. If one downstream
task has been deployed and is running, JobManager sends this downstream task’s
location to the upstream task. Upstream tasks compare these locations with their
Network-Aware Grouping in Distributed Stream Processing Systems 11

own locations and decide the network distances of these two tasks. If they have
the same TaskManagerID, they are Process Local. Otherwise, if they have the
same HostID, they are Node Local. If they have the same RackID, they are
Rack Local. If they have completely different resource locations, they are Any.
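
A minimal sketch of this comparison is given below; the Location fields and the Distance enum are our own illustrative names, not Squirrel's or Flink's actual classes.

// Sketch: classify the network distance between two communicating tasks.
enum Distance { PROCESS_LOCAL, NODE_LOCAL, RACK_LOCAL, ANY }

final class Location {
    final String taskManagerId;
    final String hostId;
    final String rackId;

    Location(String taskManagerId, String hostId, String rackId) {
        this.taskManagerId = taskManagerId;
        this.hostId = hostId;
        this.rackId = rackId;
    }
}

final class NetworkLocator {
    static Distance classify(Location upstream, Location downstream) {
        if (upstream.taskManagerId.equals(downstream.taskManagerId)) return Distance.PROCESS_LOCAL;
        if (upstream.hostId.equals(downstream.hostId)) return Distance.NODE_LOCAL;
        if (upstream.rackId.equals(downstream.rackId)) return Distance.RACK_LOCAL;
        return Distance.ANY;
    }
}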

Load Aware. Squirrel collects runtime information dynamically to be aware


of the changes in network conditions, input load, and resource contention. Flink
outputs metric reports to log file or web console periodically. In Squirrel, these
metric reports including latency, length of input queue, and other metrics are
sent to JobManager through heartbeat messages. Squirrel tracks the latency of
records traveling through the system by emitting a special record, called Latency
Marker. The marker contains a timestamp from the time when the record has
been emitted at the sources and does not include any logical operations. Squirrel
analyzes these metrics to decide whether load is balanced among downstream
tasks.
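
The principle behind the latency marker can be sketched as follows; this is a simplified illustration of the idea, not Flink's internal LatencyMarker implementation.

// Sketch: a marker record carries the timestamp of its emission at the source;
// each operator instance that receives it can derive the travel latency.
final class LatencyMarkerSketch {
    final long emittedAtMillis;
    LatencyMarkerSketch(long emittedAtMillis) { this.emittedAtMillis = emittedAtMillis; }
}

final class LatencyProbe {
    // Called when a marker reaches an operator instance; returns travel time in milliseconds.
    long observe(LatencyMarkerSketch marker) {
        return System.currentTimeMillis() - marker.emittedAtMillis;
    }
}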

4.3 Weight Grouping


Identifying the network distances between two communicating tasks, Squirrel
assigns an initial weight and priority for each network channel according to
different network distances. The closer the network distance, the greater the
weight and the higher priority. Squirrel sends tuples to downstream tasks by
weight grouping. Algorithm 1 shows the procedure of weight grouping.
Lines 1–2 assign an initial weight and priority to each network channel.
When choosing a channel for a tuple, Squirrel prefers the channel selected by
the default channel selection strategy (such as shuffle grouping or key grouping).
If the default selected channel has sufficient weight share (emitTimes[channel]
< weight[channel]), the tuple is sent to this channel (lines 3–6). Otherwise,
if the default selected channel has used up all its weight share, Squirrel tra-
verses from the channel of highest priority and looks for a channel with enough
weight share (lines 7–15). If all channels have used up their weight shares, Squir-
rel begins to assign a new round of weight shares to each channel (lines 16–18).
Squirrel adjusts each channel's weight dynamically according to tuple execution
information (line 19).
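
The procedure can also be rendered compactly in Java as below; the field names follow Algorithm 1, but the class is only a sketch under the assumptions that channels are pre-sorted by priority and that the default grouping's choice is passed in as defaultChannel.

// Sketch of Algorithm 1: pick a target channel given the default grouping's choice.
final class WeightGrouping {
    private final int[] weight;      // weight share per channel for the current round
    private final int[] emitTimes;   // tuples already sent to each channel in this round
    private final int[] byPriority;  // channel ids ordered from highest to lowest priority

    WeightGrouping(int[] initialWeight, int[] byPriority) {
        this.weight = initialWeight.clone();
        this.emitTimes = new int[initialWeight.length];
        this.byPriority = byPriority.clone();
    }

    int select(int defaultChannel) {
        int target = -1;
        if (emitTimes[defaultChannel] < weight[defaultChannel]) {
            target = defaultChannel;                     // default channel still has weight share
        } else {
            for (int channel : byPriority) {             // fall back to the closest channel with share left
                if (emitTimes[channel] < weight[channel]) {
                    target = channel;
                    break;
                }
            }
        }
        if (target < 0) {                                // all shares used up: start a new round
            java.util.Arrays.fill(emitTimes, 0);
            target = defaultChannel;
        }
        emitTimes[target]++;
        return target;
    }
}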

4.4 Dynamic Weight Control

Although weight grouping can distribute tuples among downstream tasks in a


static manner efficiently, it is not robust enough in case of data skewness and
cluster heterogeneity. Besides, since setting an ideal initial weight for each
channel depends on the application developer's experience, Squirrel needs dynamic
weight control to correct errors in the initial weights. Squirrel adopts a mechanism
similar to TCP congestion control (i.e., AIMD, additive increase/multiplicative
decrease) to adjust each channel’s weight dynamically. By comparing average
tuple execution time in each downstream task, Squirrel checks whether the
Algorithm 1. Weight Grouping

Input: t: input tuple, n: total channels, locations: network locations of channels,
       emitTimes[channel]: times of tuples through each channel
Output: targetChannel
 1: weight[channel] ⇐ initializeWeight(n, locations)
 2: priority[channel] ⇐ sortChannel(n, locations)
 3: selectedChannel ⇐ defaultGrouping(t, n)
 4: if emitTimes[selectedChannel] < weight[selectedChannel] then
 5:     targetChannel ⇐ selectedChannel
 6:     emitTimes[targetChannel]++
 7: else
 8:     for priorityChannel ⇐ priority[channel] do
 9:         if emitTimes[priorityChannel] < weight[priorityChannel] then
10:             targetChannel ⇐ priorityChannel
11:             emitTimes[targetChannel]++
12:             break
13:         end if
14:     end for
15: end if
16: if usedUpWeight(weight, emitTimes[], n) then
17:     reassignWeight(weight, emitTimes[], n)
18: end if
19: dynamicWeightControl(n, locations)
20: return targetChannel

weight share of a channel is appropriate. If average tuple execution time in


one channel is significantly longer than in other channels, the channel is
congested. Squirrel lowers the weight of the congested channel with an exponential
reduction. If the average tuple execution time in one channel is significantly shorter than
in other channels, the channel is underloaded. Squirrel grows the
weight of the underloaded channel linearly. Algorithm 2 presents the procedure of
dynamic weight control.
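
A sketch of this AIMD-style adjustment, mirroring the procedure in Algorithm 2, is given here; alpha (> 1) and delta are tuning parameters whose concrete values are assumptions, and executionTime holds the average tuple execution time observed on each channel.

// Sketch of the AIMD weight adjustment: halve the weight of congested channels,
// linearly grow the others. alpha and delta are assumed tuning parameters.
final class DynamicWeightControl {
    static int[] adjust(int[] weight, double[] executionTime, double alpha, int delta) {
        int[] adjusted = weight.clone();
        double[] sorted = executionTime.clone();
        java.util.Arrays.sort(sorted);
        double medianTime = sorted[sorted.length / 2];
        for (int i = 0; i < adjusted.length; i++) {
            if (executionTime[i] > alpha * medianTime) {
                adjusted[i] = Math.max(1, adjusted[i] / 2);  // multiplicative decrease for a congested channel
            } else {
                adjusted[i] = adjusted[i] + delta;           // additive increase otherwise
            }
        }
        return adjusted;
    }
}

The floor of 1 in the decrease branch is our own guard to keep every channel usable; the pseudocode in Algorithm 2 simply halves the weight.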

5 Experimental Evaluation

In this section, we evaluate Squirrel’s performance under various workloads.


First, we describe the experimental setup and workloads, then present the exper-
imental results.

5.1 Experimental Setup

Our evaluations are conducted on a cluster of 5 commodity machines. Each


machine is comprised of two dual-core Intel Xeon E5-2670 2.6 GHz CPUs, 16 GB
Algorithm 2. Dynamic Weight Control

Input: initialWeight[channel]: initial weight of each channel, executionTime[channel]:
       average tuple execution time in each channel, n: total channels
Output: dynamicWeight[channel]
 1: dynamicWeight[channel] ⇐ initialWeight[channel]
 2: timeList ⇐ sort(executionTime[channel])
 3: medianTime ⇐ median(timeList)
 4: for channel i = 1 → n do
 5:     if executionTime[i] > α ∗ medianTime (α > 1) then
 6:         dynamicWeight[i] ⇐ dynamicWeight[i]/2
 7:     else if executionTime[i] < α ∗ medianTime then
 8:         dynamicWeight[i] ⇐ dynamicWeight[i] + δ
 9:     end if
10: end for
11: return dynamicWeight[channel]

memory, and 210 GB disks. All computers are interconnected with 1 Gbps Ether-
net cards. We use Flink-1.4.0 as the distributed computing platform and Redhat
Enterprise Linux 6.2 with the kernel version 2.6.32 as the OS. One machine in the
cluster serves as the master node to host the JobManager. The other machines
run Flink TaskManagers.

5.2 Workloads

We use word count topology (stream version) and advertising topology to eval-
uate Squirrel’s performance. Wordcount is a classic application for benchmark-
ing distributed engines. It splits a sentence into words and counts the number
of each word in the input stream. Advertising topology is a typical streaming
benchmark which simulates an advertisement analytics pipeline [6]. This pipeline
consists of six operations including receive, parse, filter, join, window count, and
store back. We set buffer timeout to control latency and throughput. setBuffer-
Timeout(0) triggers buffer flushing after every record thus minimizing latency.
setBufferTimeout(-1) triggers buffer flushing only when the output buffer is full
thus maximizing throughput.
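
For reference, the buffer timeout is configured on Flink's StreamExecutionEnvironment; the snippet below is a minimal illustration of the two settings, not the full benchmark driver.

import org.apache.flink.streaming.api.environment.StreamExecutionEnvironment;

public class BufferTimeoutExample {
    public static void main(String[] args) {
        StreamExecutionEnvironment env = StreamExecutionEnvironment.getExecutionEnvironment();
        env.setBufferTimeout(0);    // flush the output buffer after every record: minimal latency
        // env.setBufferTimeout(-1); // flush only when the buffer is full: maximal throughput
    }
}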

5.3 Latency

In performance test (including latency and throughput), for wordcount, we use


a large-scale trace collected from real-world systems as the dataset. This trace
is a set of records which represent the usage of each machine and contains 1
billion records associated with 300 thousand unique keys. For advertising, the
source dataset is generated from its own EventGeneratorSource. In all group-
ing phases, Squirrel adopts Weight Grouping regardless of whether the default grouping is
shuffle grouping or key grouping. In contrast, Flink uses a combination of shuffle
grouping and key grouping. For example, in wordcount, the map stage which
rebalances words among downstream tasks uses shuffle grouping. The reduce
stage which aggregates keys by a hash operation uses key grouping.

Fig. 7. Processing latency in wordcount and advertising: (a) WordCount; (b) AdvertisingTopology

Figure 7 shows the processing latency (i.e., average tuple execution time)
in wordcount and advertising. We compare Squirrel with the original Flink. In
Squirrel, the average tuple execution time of wordcount and advertising is 1.03
ms and 1.62 ms, respectively. However, in Flink, the average tuple execution
time of wordcount and advertising is 1.91 ms and 3.26 ms, respectively. The
experimental results show that Squirrel can improve average tuple execution
time by up to 47%.

5.4 Throughput

Figure 8 compares the throughput of Squirrel with Flink in wordcount and


advertising. In Squirrel, the average throughput of wordcount and advertising is
91624 tuples/s and 127209 tuples/s, respectively. However, in Flink, the average
throughput of wordcount and advertising is 57162 tuples/s and 76173 tuples/s,
respectively. The experimental results show that Squirrel achieves 1.67x improve-
ment of the average system throughput compared to Flink.
We present the performance improvements for batch processing in Fig. 9.
Although Flink is a streaming computation engine, it embraces the batch process-
ing model by treating a bounded dataset as a special case of an unbounded
one. Optimizations on the Flink engine can also improve the performance of batch
processing. We compare the performance of Squirrel with Flink when running
wordcount as a batch job on them. We vary the size of input files from 160 MB
to 2560 MB. The experimental results show that Squirrel improves the job com-
pletion time by 29%–41%.

5.5 Dynamic Weight Control


In order to evaluate the performance of Dynamic Weight Control, we conduct
extended experiments with a generated synthetic dataset of 100 million tuples

Fig. 8. Throughput test in wordcount and advertising: (a) WordCount; (b) AdvertisingTopology

Fig. 9. Improvements for batch processing running WordCount

and 1 million keys at different levels of skewness using the Uniform distribution
and Zipfian distributions. We vary the Zipf coefficients of the dataset from 0.5 to
2.0. With a higher Zipf coefficient, the data distribution is more skewed. We also
test Squirrel's performance under cluster heterogeneity. We turn off some CPU
cores and lower the frequency of the remaining cores to produce heterogeneous
machines which have different computational capabilities. The ratio of physical
resources (represented by cores∗frequency) between the fastest machine and the
slowest machine varies from 1 to 8.
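
The paper does not show its data generator, but as an illustration of how a key
stream with this kind of tunable skew can be produced, the following sketch draws
keys from a Zipf distribution using Apache Commons Math. The key format, the
emit() sink, and the particular exponent are our assumptions, not details from
the paper.

import org.apache.commons.math3.distribution.ZipfDistribution;

public class SkewedKeyGenerator {
    public static void main(String[] args) {
        int numKeys = 1_000_000;        // 1 million unique keys
        double zipfExponent = 1.5;      // varied from 0.5 to 2.0 in the experiments
        long numTuples = 100_000_000L;  // 100 million tuples

        ZipfDistribution zipf = new ZipfDistribution(numKeys, zipfExponent);

        for (long i = 0; i < numTuples; i++) {
            // sample() returns a rank in [1, numKeys]; low ranks occur far more
            // often as the exponent grows, i.e. the stream becomes more skewed.
            String key = "key-" + zipf.sample();
            emit(key);
        }
    }

    private static void emit(String key) {
        // Placeholder sink: a real setup would write tuples to a file or feed a
        // source operator instead of printing.
        System.out.println(key);
    }
}

A uniform workload is obtained in the same way by drawing the rank uniformly at
random instead of from the Zipf distribution.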

Data Skewness. Figure 10 plots the system performance when running wordcount
under data skewness. The results show that Flink's performance decreases as the
Zipf coefficient increases; the default key grouping cannot balance tuples well
under high levels of skewness. WG (Weight Grouping) amortizes the impact of
skewed keys by limiting the number of tuples in each channel according to its
weight share. DWC (Dynamic Weight Control) detects overloaded downstream tasks
and adjusts their weight shares. As a result, the performance of Squirrel
(WG+DWC) does not decrease significantly.
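
The paper gives no pseudocode for these mechanisms, so the following is only our
minimal illustration of the idea as described above: each downstream channel owns
a weight share, every tuple is routed to the channel whose observed share lags
its weight the most, and Dynamic Weight Control is approximated by a method that
shifts weight away from an overloaded task. The class, the deficit-based
selection rule, and the rebalancing step are assumptions, not the Squirrel
implementation.

public class WeightGroupingSketch {
    private final double[] weights;   // weight share per downstream channel
    private final long[] sent;        // tuples sent per channel so far
    private long total;

    public WeightGroupingSketch(double[] initialWeights) {
        this.weights = initialWeights.clone();
        this.sent = new long[initialWeights.length];
    }

    /** Weight Grouping: pick the channel whose observed share lags its weight most. */
    public int selectChannel() {
        int best = 0;
        double bestDeficit = Double.NEGATIVE_INFINITY;
        for (int i = 0; i < weights.length; i++) {
            double observedShare = (total == 0) ? 0.0 : (double) sent[i] / total;
            double deficit = weights[i] - observedShare;
            if (deficit > bestDeficit) {
                bestDeficit = deficit;
                best = i;
            }
        }
        sent[best]++;
        total++;
        return best;
    }

    /** Dynamic Weight Control (simplified): move weight away from an overloaded task. */
    public void penalizeOverloaded(int channel, double delta) {
        if (weights.length < 2) {
            return; // nothing to redistribute to
        }
        double removed = Math.min(delta, weights[channel]);
        weights[channel] -= removed;
        double share = removed / (weights.length - 1);
        for (int i = 0; i < weights.length; i++) {
            if (i != channel) {
                weights[i] += share;
            }
        }
    }
}

In this simplified form the weights always sum to one, so no single channel
receives more than its weight share of the stream, which is how the load caused
by hot keys is amortized across tasks.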

Cluster Heterogeneity. Figure 11 evaluates the performance of Flink and
Squirrel in the case of heterogeneous machines. With the exacerbation of
Fred, of course, had to tell Janet that there was no hope of help from
grandma.

"And I think you are right, dear, and that I had better get away at once. I
will take just what will pay travelling expenses, and keep me for a few
days. I will write and tell you where to join me. You must settle
everything here, and come as soon as I send for you. I could not stand
the—the disgrace, Janet. Every one will know to-morrow that I am
dismissed, and Henley won't be silent."

Poor, selfish Fred! He desired nothing so much as to get away before his
disgrace was known, and poor Janet, in her unselfish love, was as anxious
about it as he could be. Fred had always held his head high, and whatever
private discontent he felt with his situation, he had always been
considered a very fortunate young man, much better off than others of his
years. To meet those who had always admired and looked up to him, in
his new character, as a dismissed man and a defaulting speculator, he felt
would drive him mad. So, having kissed his two boys as they slept
sweetly in their little beds, he bade farewell to Janet, telling her to come
to Liverpool, to the Ship Hotel, Guelph Street, where he would write to
her; he could not say where he would be, as that would depend upon the
boats he might be in time for. And then he was gone; and poor Janet
crept off to bed, cold and stunned, and almost heartbroken.

CHAPTER III.
IN LIVERPOOL.
NEXT day, Janet sent for a broker, and pointed out to him such articles of
furniture as Fred had told her belonged to him. She was in the midst of
making her bargain when, to her surprise, in came Mrs. Rayburn, who had
not left her room for many days.

"Betty told me, my dear, that Mr. Pitman was here; so I guessed it was
about your piano, and I crept in, for I may as well dispose of my few
things at the same time."

And, turning to Mr. Pitman, she proceeded to point out what she claimed.
It was not very much, but it was nearly all that Janet had been speaking
about, and the poor girl reddened when she found Pitman looking
doubtfully at her. She said—

"I did not know, grandma, that these things were yours. We thought they
were ours."

"Oh, my dear, the things my kind husband bought for me? But say no
more; I know you have always had them as your own, and it was stupid
of me to—"

"No, no; I only spoke lest Mr. Pitman might think I had known it before."

The business was soon settled, and a van carried off all the Rayburns'
share of the furniture. Very bare the parlour looked, and Fred cried for his
"pitty cot," when laid to sleep in his mother's bed. Frank, old enough to be
frightened at his mother's sad face, made no plaint about anything, but
ran with messages and helped her with all his might.

Their few belongings were soon packed, and all, save one box, sent off to
Liverpool by goods train. Janet paid up her household bills for the last
week, and then everything was done. She had no one to say farewell to
save Mrs. Rayburn; her own father and mother had died since her
marriage, and her only brother had emigrated. Janet had always been a
home-keeping woman, and had no very intimate friends.

"What shall you do, grandma?" she asked. "Where will you live till you
hear of a place?"

"I shall stay here, dear, until I'm turned out; then my sister-in-law will
take me in for a few days."

"You'll write and tell me what you hear from Lord Beaucourt, won't you?
Indeed I hope he will be kind to you. I have been so hurried that I hardly
seem to feel things yet; but indeed I am very, very sorry for you. It is so
hard on you."

"It is indeed. But Lord Beaucourt is one who never forgets them who have
served him well, and my mother was his confidential housekeeper—no
common servant; more like a friend, you know—for many years, and his
lordship was always most kind to me. Of course, I shall write, and you
must write to me. How I shall miss you, dear, and my darling boys!
There's some one at the door, Janet."

"Come in," Janet called out wearily.

And in came Mr. Frank Hopper.

"Good evening, Mrs. Rayburn," he said, as she rose to meet him. The
elder woman was sitting in the shadow of the window curtain, and he did
not see her.

"This is a sad business, Mrs. Rayburn. I am sorry to hear that Rayburn


has gone away. It struck me that you might be in difficulties, and that I
had better see you."

"Of course we are in difficulties, Mr. Frank."

"Yes—but I want—I meant—in a word, Mrs. Rayburn, do you know where


your husband has gone, and—are you to join him?"

"Why, certainly I am," Janet said angrily. "Oh, Mr. Frank, how could you
think that Fred would desert me and the children?"

"It looked so bad, his going off in this way. I was afraid there might be—
debts, you know. I wished, if possible, to help you."

"No, Mr. Hopper, you cannot help me. I have money to keep us until we go
out to Fred; I could not take help from you. I think you have been very
hard on my husband. I am an ignorant woman, and perhaps ought not to
say this; but it seems to me that you have been very, very hard on him."

"You mean in dismissing him? But he knew the rules, and knew that we
never depart from them. But I don't want to talk of that. Where's my little
namesake? I have a present here for him."

"Not for him; we will not accept help from you under any names, Mr.
Frank."
"Well, I would help you if I could," the young man said quietly; "but I
understand and respect your feelings. Business men have to be guided by
rules that seem harsh to women, I am sure. Only remember, if you ever
feel that there is anything I can do, you have my address. It will give me
real pleasure to help you, Mrs. Rayburn."

He bowed and withdrew, and old Mrs. Rayburn gave young Mrs. Rayburn
a lecture for being so proud and so foolish.

"I cannot help it," Janet said. "Did you remark how it was all that he
wished to help me; not a word of kindness for Fred, who has worked
under him so long? No, I will not put up with that sort of kindness."

Janet had never left Hemsborough except to go to school in a small town not
ten miles off, so the journey to Liverpool was quite a formidable
undertaking for her. But she had plenty of common sense, and managed
very well. She and her boys reached the Ship Hotel in safety. And now
there was nothing to do but to wait.

Waiting is always weary work, and poor Janet was anxious about her
husband and uneasy about her boys. Accustomed to play about the big
brewery yards and sheds, where every one knew them and took more or
less care of them, the boys fretted if she kept them in her room in the
hotel; and yet the street and the adjoining quays did not seem a safe
playground for them. The hotel was very small, very crowded and noisy,
and by no means cheap. However, Janet lived as cheaply as she could;
and at last a letter came.

Fred wrote from New York, not, as she had hoped, from Halifax, for she
had wished him to go there to be nearer to her brother. He had as yet
failed to get permanent employment. He could just live, and that was all.
People told him that he was not likely to get good employment in New
York. Yet what could he do? He had not funds now to go to her brother in
British Columbia, and he feared it would be some time before he could
save enough. She must husband her money, and stay in England for a
while, for if she came to him now, what he feared was, that they would
sink into the class that just lives from hand to mouth, and that the boys
would get no education. She was to write to him at once, for he longed to
hear of her and the boys. Frank and Fred must not forget him.

Janet thought long and deeply over this letter, and the immediate result of
her meditations was that she wrote to Mr. Frank Hopper. Poor Janet! She
felt very reluctant to do it.
"Ship Hotel, Guelph Street,
Liverpool.

"DEAR SIR,

"You were kind enough to say that you would help me


if you could. Will you give me a few lines which I may show
to any one here to whom I may apply for work? I am quite
unknown here, and my husband and I have decided that it
is better for me not to join him just yet. I think he will
most likely go after a time to my brother in British
Columbia, and there is no use in my going out till I can go
there direct. I am a very good dressmaker, and wish to find
work in order to help my husband.

"Your obedient servant,

"JANET
RAYBURN."

Mr. Hopper at once sent her a letter which answered her purpose. She
was fortunate enough to get employment in the cutting-out department of
a great shop in Bold Street, where she gave such satisfaction that she was
told that she should be the head of the department when the lady now
over it married, which she was about to do soon. She was free at about
seven o'clock, and might be free rather earlier in winter.

She sent the boys to a little preparatory school in the street in which she
now lodged, Frank to learn, and Fred to be safe; and the servant at the
lodgings undertook to give them their dinner when they came home, and
on fine days to let them play in what she (perhaps satirically) called the
garden, and generally to keep a watch upon them. Then she was able to
write to Fred to say that she had got employment which, with the few
pounds she kept, would support her and the boys for a time; and she sent
him all the rest of the money she still had, urging him to go to her brother
Gilbert, and "not to be longer about sending for her than he could help,
for she felt very sad without him."

Poor Janet! She would not have admitted to any one, even to herself, that
she in the least distrusted her husband. Yet, in doing this, she was
unconsciously influenced by a touch of distrust. She felt that if she kept
money enough to take her and the boys out, maybe Fred would go on just
keeping himself; he had never taken kindly to steady, dull work, and this
kind of life might have some strange attraction for him. Whereas, if he
knew that she was working hard, and that he must send her the passage-
money, he would certainly feel quite differently. As for herself and the
children, she had no fears. God would take care of them.

But God's ways are not our ways; and Janet's simple faith was to be
sorely tried. And it stood the trial, because it was simple and humble.
When things happened that she did not expect, Janet did not forthwith
conclude that God had forgotten His promises; she concluded that she
had not fully understood them.

The summer was now past, and the winter was a severe one. Liverpool is
a very cold place, too, and Janet felt it herself, though she did not actually
suffer in health. But the children caught cold again and again. They would
creep back to their rather dreary home when school was over, with their
little overcoats unbuttoned, and their warm comforters forgotten. After a
time, Janet succeeded in teaching Frank that it was his duty to take care
of Fred, and of himself too, because it made poor "muddie" so wretched
to see them ill. From that time, Frank remembered; and it was touching
to see the tender, protecting care he took of little Fred, who really
suffered far less from cold than did Frank himself. Frank grew tall and thin
and white, but he never complained, for "poor muddie would be sorry, if
she knew how his bones pained him."

Looking back upon that time in Liverpool, it always seemed to Janet very
long, yet it really lasted but a few months. She heard regularly from her
husband, and he wrote in good spirits. He had set out to join Gilbert Gray,
but, having reached a town called New Durham, in British Columbia, he
fell in with an acquaintance who was in business there, and who had put
him up to one or two very good things; he would soon be quite
independent. In sending him that money she had, he thought, laid the
foundation of a fine fortune; but he would send her the passage-money
very soon now.

All this made Janet uneasy, she knew not why. She felt a little uneasy,
too, about grandma, as Mrs. Rayburn had for years been called in the old
Gatehouse, for she had never heard from her since they parted, though
she had written to her. However, in the spring she had a letter from her.

"Kelmersdale Castle, near


Rugeley.

"MY DEAR JANET,


"I am afraid you are thinking me very unkind, getting
letters from you and never writing to you; but you will
understand how this happened, when I tell you how I have
been knocked about. I am glad you have found work that
suits you, but, in your place, I would have gone after Fred
at once. I love him like a mother, but, after what
happened, I think him weak, and I hardly expect now that
you will ever get out to him. You ought to have left the
children in England and gone after Fred. No risk in leaving
them for a time; any one would be kind to those darlings.
But I suppose it is too late now, as you parted with the
money.

"As for me, a letter from my lord came the day after
you went away, offering me my choice of two situations,
matron of a big orphanage near Stafford, or housekeeper
at Kelmersdale Castle. The matronship was the best pay,
so I took it. But, my health being so feeble, I found the
work too much, and after my little darlings, Frank and
Fred, the children seemed a dreadful lot, and after a few
weeks, I wrote to my lord to say my health would not
stand it, and that if the other place was still open, I would
prefer it. I am thankful to say my lord had not been able to
suit himself, so I came to the Castle, and I just wish I had
the dear boys here, with such places to play about, and
every comfort.

"The place is very old, and was once besieged. I am


learning all the history off by heart, for many a shilling can
be got by showing visitors over the Castle in summer. My
lord never lives here long, but comes on business or for the
shooting. The living rooms are small, with thick walls and
little windows high up. My rooms are very comfortable, and
I have servants under me, and am to see all kept in order;
the armour, and old pictures and furniture. But, except just
when the earl is here, I have little to do but to amuse
myself like any grand lady. The salary is small, or I should
send you a present, as well I might after all your kindness
to me, which I can never forget. I hope the darling boys
have not forgotten grandma. I seem to hear Frank calling
me, dear little rogue. Some day you must all come and pay
me a visit; I know my lord will give leave.
"You say the boys go to school, but in the summer
holidays they might be glad of a change. But take my
advice and get after Fred as soon as you can; don't lose
your hold over him, it will be the ruin of him.

"Ever your affectionate mother,

"LYDIA RAYBURN."

"Oh, I wish people would not say things like this to me!" cried Janet,
unconscious that no one had said anything to her about her husband
except Mrs. Rayburn and her own anxious, loving heart.

Mrs. Lydia Rayburn little thought when she penned that letter, so full of
patronizing kindness, what the effect of her words would be; for
simpleminded Janet believed that she meant every word most sincerely.

She sent the letter on to Fred, and said to the two children, "How fond
grandma is of you both, boys!"

To which Frank replied thoughtfully, "Is she, muddie?"

Soon after this Fred failed to write for some weeks, and Janet was getting
seriously uneasy, when she received a letter from her brother; not one of
his usual brief epistles, but a long, closely written letter with a money
order enclosed.

"Old Man's Ferry Farm, Gattigo,


British Columbia.

"MY DEAR SISTER,

"I do not know whether what I must now tell you will
be a surprise and a shock to you or not. Of course, I could
see that you were not speaking out quite frankly about
your husband and the loss of his good place; not that I am
blaming you, for he is your husband, and you are bound to
stick to him. You wrote me word that he was on his way to
me, and I laid out my plans for giving him and you a start
here if I could. But he did not turn up, and only ten days
ago I got a letter from a lawyer in New Durham—a rising
new town a good way from us—enclosing a letter from
Rayburn.
"Not to make too long a story of it, your husband told
me that he had been on his way to me, five months ago,
when at New Durham, he fell in with an old friend, and
went into some kind of business with him, putting all the
money he possessed into it. They seemed prosperous for a
time, and Rayburn declares that he did not know that the
articles they were selling were regular cheap locks and
stoves and such things, with good English names on them,
which this fellow Turner had got made out here, and not
even of good materials. Of course, this could only go on for
a time—people here are no fools—and Turner must have
found out that he was suspected, for he made off with all
he could get hold of, and left Rayburn to bear the
consequences. Rayburn had a narrow escape of being
roughly handled by a lot of fellows who had come to the
town together to have it out with Turner. In these half-
settled places people have a very short kind of justice, but
he got away out of the shop with a whole skin, and was
taken up for the cheating. Then he told the lawyer that he
was my brother-in-law, and that I could speak for him, and
so they sent for me. I went, of course, and found him very
ill, I really think from fretting, for there is no doubt that he
was badly treated by Turner.

"They have not caught Turner, and now they will hardly
do so; and I think Rayburn will get off for want of evidence
against him. I would get him out on bail, but that he is so
ill, that it is better for him to keep quiet. When he is free, I
shall take him home with me, and Aimée will nurse him till
he is all right again. And, if I find it possible, I may still do
what I was thinking of—start an hotel in Gattigo to be
supplied from my farm, and you and Rayburn to manage it.
If he had come direct to me, all would be easy. Now I fear
it may be a feeling against him, and in that case it would
be risking money in setting up the hotel, and it is a great
pity, for he is the very man for the place; he has such a
pleasant manner. But there is no use in crying over spilt
milk. I wish he was not your husband, for, truth to say, I do
not like this business, though I cannot help liking him. And
I will do what I can for your husband.

"Now, Janet, the fact is, if you have as much good


sense and good principle as I believe you to have, you
ought to come out by the next boat, and join Rayburn, and
not part with him again. In some way we will find an
opening for him, and with you beside him, and me at your
back—particularly now you are on your guard—he may yet
do very well. He is feather-headed, easily taken with
anything new, and impatient of slow gains. Rayburn says
you are to send the boys to their grandmother, who is sure
to take good care of them, being very fond of them. He
desired me to say that on no account are you to bring
them with you, if Mrs. Rayburn will take them for a while,
as he is very anxious that they should never hear of his
being in prison here. I think myself that it would be better
for you to come alone, and we will get them out as soon as
we know what you and Rayburn will do; but there may be
no use in your settling here, and it will be better to get
them to the place you finally decide upon, direct.

"I enclose money for your journey, and, on the other


side of this sheet, I will put down your exact best way, and
all particulars I can think of. Do not lose a boat, and come
direct to Gattigo. Rayburn will probably be here before you.
The hotel plan, if still possible, would be the best for him
and for you; but, unless you are here, I will not risk it.
Besides, having you with him will make him seem more
dependable and respectable, and, you see, he has made a
bad start, and has a prejudice to get over. Do not think me
unkind, though I know I may seem so, because I am not
used to much letter-writing, and do not know how to wrap
things up.

"My wife is just longing for you to come; there is not a


woman she cares about within many miles of us.

"Your affectionate brother,

"GILBERT GRAY."

Then followed the directions for her journey, which were so clear and
minute that a child could have followed them.

A year ago the idea of such a long lonely journey would have reduced
Janet to tears and misery; but she had learned to know her strength, and
it was not her own part in the matter that frightened her. Nor was it the
leaving the boys at Kelmersdale, for she had no doubt of their well-being
there, and had been thinking of asking grandma to take them for a
fortnight or so, as Frank would be the better for a change of air. She had a
brave heart and a childlike faith, and thought but little of herself; but oh,
what bitter tears she shed over that letter! But she lost no time; in half an
hour after the letter came, she was in the office of the line of boats Gilbert
had named, inquiring when the next left Liverpool.

CHAPTER IV.
KELMERSDALE.

JANET found that the next boat would sail in four days; so, if she could be
set free from her engagement at Gair and Co.'s, she could well be ready
in time, even if she had to take the children with her. For, of course, if
Mrs. Rayburn either could not or would not keep the little ones, they must
needs go with her.

The first thing to be done now was to telegraph to Mrs. Rayburn. She
passed an office on her way to Gair's. She sent her message, but only
said, "Can you send to meet us at Rugeley to-morrow?"

"I can explain much better when I am with her," she thought; "and if she
cannot take the boys, the expense is not very great, after all."

Having arranged for the answer to be sent to Gair's, she went thither
herself, arriving five minutes late, for the first time.
"Has Mr. Simmons come yet?" she asked a young man who was arranging
the window.

"He's in the office, Mrs. Rayburn."

And to the office Janet repaired. There she told her story, with certain
reservations. Her brother, she said, had sent her money to go out to
Canada to her husband, who was ill. When he recovered, her brother
knew of a promising opening for him, in which her help would be
necessary. Her month's salary was nearly due, but she was willing to
forfeit it, if she might go at once. There was no press of work, and Miss
Green was a very capable cutter-out. Mr. Simmons, a slow and solemn
man, rather thought that such an abrupt departure was impossible, but
would speak to Mr. Gair. Luckily for Janet, it was kind old Mr. Gair who was
in the office, and he came out to speak to her himself.

"We're sorry to lose you, Mrs. Rayburn, but we will not stand in your way,
as the matter seems of consequence. Pay Mrs. Rayburn up to the first of
July, Simmons; she has been a steady and useful worker."

Finally, the old gentleman sat down and wrote her a regular discharge.

"Keep that, Mrs. Rayburn," he said, looking kindly at the anxious young
face. "It may prove useful, though I hope your husband will do well. Do
you take your children with you?"

"No, sir; I shall take them to Kelmersdale Castle, near Rugeley, where
their grandmother is the housekeeper. If she can keep them, I am to
leave them with her for a time."

"Well, good-bye, good-bye," said Mr. Gair, retreating hastily towards his
private room, for his sons were wont to laugh at him for being always
ready to interest himself in any one. But he took a parting glance at
Janet, and something in the youth, sweetness, and determination in her
face touched his heart. Muttering, "I will now; they may say what they
like, just for once I will please myself," back he came.

"Are you sure you can manage all this for yourself, Mrs. Rayburn? Is there
anything that I can do for you?"

Janet looked up in his face with a somewhat tremulous smile.

"I have been so afraid," she said, "but if every one is as kind as you are—
but, indeed, that is not likely. I don't know how to thank you, sir; your
kindness gives me such courage."
"I think you have plenty of courage," the old man answered, "and a better
Friend than I can be. One who can go with you, and yet be with the
children at home. Is it not so?"

"Oh, it is—it is indeed! Yes, I can say that sincerely."

"Then you serve my Master, and so you need never be afraid, for you will
be cared for. God bless you, child."

Janet left the shop with that blessing warm at her heart. She went home,
and busied herself in getting the boys' clothes together and packing them.
She took a cabinet photograph of her husband and cut away the edges, to
make it fit into a little miniature case she had among her few ornaments:
this she meant to give to Frank. She made a list of the things in the trunk,
which she carefully packed for the children. While thus employed, the
answer to her telegram was sent on from Bold Street. It was brief, but
said that a vehicle should be at the station to meet the 12 a.m. train.

Then the boys came home from school, and Janet nearly broke down
when she heard their shout of rejoicing when they saw her at that unusual
hour. When she had given them their dinner, she took Fred on her knee
and put her arm round Frank, as he stood beside her.

"Now, listen to me, my little boys. I have something to tell you which you
will not like, and neither do I; but it cannot be helped, and I want you
both to be good—very good—and so help me to bear it. For I must go
away and leave you for a time, and—and—it nearly breaks my heart."

"Leave us—here, muddie?" Frank said, fixing his blue eyes on her face,
and growing white in the endeavour to "be good."

"Won't be left," said Fred, sturdily; "we go wif you."

"Not here, Frank, and not alone. To-morrow, I shall take you to a beautiful
place in the country, where I hope to leave you with grandma. There you
will have green fields to run about in, and grandma to take care of you.
You remember grandma, Fred, don't you?"

Frank had slipped down, and sat on the floor at his mother's feet, staring
up at her, and keeping unnaturally still, with every trace of colour gone
from his face. And there he still sat, when Fred had forgotten all about
this terrible parting and was playing merrily about the room, and Janet
was completing the packing of the box.
"Why must you go, muddie?" he cried at last, catching at her dress as she
passed him.

"My darling, my little Frank, don't look like that. I would not leave you if—
if I could help it. Father is ill and, wants me. When he is well, you shall
both come to us."

She sat down and lifted him upon her knee.

"The time will pass quickly, Frank. See, here is father's picture—I give it to
you; keep it safe, and show it to Fred, that he may remember him. And
you will be good, and not make poor muddie fret. And you will take care
of Fred, and try to keep him from being troublesome to grandma."

"I will try," Frank said. "May I go to bed, muddie? I'm tired, and don't
want any supper to-day."

Janet was rather frightened, he looked so white and weak. She put him to
bed, and brought him some bread and milk, which he took to please her.
When she woke him next morning he seemed quite himself again, and,
having said his prayers, he came and stood before her, saying earnestly—

"I will try to be good, muddie, and I promise to take care of Fred all I
can."

And he was good, poor little fellow, giving no trouble whatever, and trying
to keep Fred quiet during the journey. But Fred had bound himself by no
such promise, and was in uproarious spirits, making noise enough for half
a dozen.

At Rugeley she left the train and looked about for some one from
Kelmersdale. Presently a short, square-built, awkward young man came
up to her, making a clumsy bow, which he accompanied by a curious
movement of one foot, like the pawing of an impatient horse. But it was
shyness, not impatience, that made him paw.

"Be you Mrs. Fred Rayburn?"

"Yes; is Mrs. Rayburn here?"

"No, but I have the taxcart here for you and the children. Be this your
box? Come along, then."

With a final paw, which sent the gravel flying, he picked up the box and
led the way to where he had left the taxcart. Janet sat in front beside the
driver, with Fred in her arms, for she could not trust the excited child out
of her sight. Frank and the box kept each other company, and Frank was
glad, for he wanted to cry just a little without "making muddie cry." It was
a lovely drive, but none of them saw much of it.

At last they drove through a great heavy gate into a paved court, walled
on three sides, and with a large pillared porch on the fourth, with a great
broad flight of stone steps leading to a large iron-studded door. This was
wide open, and just inside stood Mrs. Rayburn, and with her a young
servant, in white cap and apron and blue satin bows.

"Well, Janet dear, here you are, and here are my darling boys," Mrs.
Rayburn cried. "It was a surprise—your telegraph saying that you were
coming. Why, Frank looks but poorly; a little country air will do him good.
Jacob, bring in that box. Fred's grown a little; as to Frank, he's run up far
too fast for strength. Come to my sitting-room—isn't this cosy? Maria,
we'll have dinner as soon as 'tis ready."

Maria departed, and Mrs. Rayburn went on.

"So you have a holiday—how long is it, Janet? I hope you can stay some
time. My lord is never here except during the shooting season, when he
has a party for the sport; so I can do just as I like. And I advise you to
leave the children with me just for a bit—just till Frank picks up a colour
and a little flesh. He looks very peaky."

"Yes; Liverpool does not agree with him. May the boys run out and play in
the court, Mrs. Rayburn? I want to talk to you alone."

"I'll just send and get the gate shut, and then they'll be as safe as
possible."

She left the room, and soon a man crossed the yard and shut the gate.
The two boys went out, but only into the porch. Fred was so sleepy that
he was glad to sit on the stone steps with his head on his brother's
shoulder. Frank, white and weary, knowing the whereabouts of every bone
in his body by a separate ache, yet manfully held the little one in his
arms, and sat gazing at the paved court and the high walls. Somehow he
felt like a bird in a cage.

"Now, Janet, we're alone. Let's have a talk till dinner is ready."

"Mrs. Rayburn, do you think Lord Beaucourt would be annoyed if you had
my boys here for a time?"
Having just asked them to stay, Mrs. Rayburn could not very well tell a
different story now; but when she made that request, she had no idea
that Janet would part with her darlings for so much as a week. But, after
all, the boys could not be in her way. The house was large and the
weather warm; they could be out for the greater part of the day, and they
would not cost her a penny. So, after an almost imperceptible pause, she
said—

"My lord annoyed? Oh, dear, not at all. My mother, you know, was a
confidential servant—almost a friend; and he is just as kind to me. If you
like to let Frank stay here, I'll take the best of care of him—you know
that."

"Yes; so you said in the kind letter I sent on to Fred. And he has sent me
word by my brother to leave them with you if you really can have them
without being troubled about it afterwards."

"To leave them both?"

"Yes; Fred is ill and in trouble, and Gilbert says I had better go at once.
Gilbert has plans for us, but it is not quite certain yet where we shall be. I
am to go to Gattigo to my brother, and Fred will meet me there, and when
we know where we shall settle, we will get the boys out. It will not be for
long. Gilbert thinks of setting up an hotel in Gattigo, with Fred and me to
manage it. And when we are quite settled, and can make you
comfortable, you must come out to us, grandma. However pleasant things
may be made for you here, it is not like being in your own house with
your own people."

"No, indeed, Janet, it's a deadly dull life here for one used to sociability
and a large town. I often think of Hemsborough and the dear old
gatehouse. I might be of use, too, in an hotel. Well, Maria is a good girl,
and will help me willingly, and, as you say, it will not be for long. And
what trouble is Fred in, poor dear fellow?"

"He went into partnership with a man he had known before, and this man,
Turner, was not dealing fairly, and he had to run away, and Fred's money
was all lost."

"If this Turner is the man who broke some years ago in Hemsborough,
Fred ought to have known better than to have dealings with him. So he
lost all he had?"

"Yes—but it was not much. Gilbert has got on very well, and seems sure
that this hotel will succeed. But Fred was ill when the letter was written,
and Gilbert says I ought to be there. They both wish me to come without
the boys, but if you cannot have them, I shall take them with me." And
Janet's face brightened a little, for oh! How much rather would she take
them than leave them!

But Mrs. Rayburn was determined not to say anything which could make
Janet think that her position at Kelmersdale was not as independent and
pleasant in every way as she had represented it, so she declared that my
lord would be quite pleased to know that she had the darling boys for
company.

"For he knows it is a lonely life here, and he is so kind-hearted. But, you


see, things were going all wrong for want of a really trustworthy
confidential person at the head of the household. He will not be here till
the middle of August, and perhaps not till September. Of course, they
might be in the way then. But there's time enough, and you know, Janet,
I'd do anything for you and Fred."

"I knew you would do this, if you could, so I have brought all their
clothes. I must get back to Liverpool; the steamer sails on Thursday,
early."

"Then you can stay here to-night. Do you think I'm going to let you travel
back to-night, and you looking so tired and worn? No, no, stay for the
night, and you'll see where the little darlings are to sleep, and how
comfortable I shall make them; as well I may, remembering all your
kindness to me, and how you nursed me when I was ill."

Her cordiality increased as she thought over the hotel project, and
considered how pleasant it would be, when all was comfortably settled, to
rejoin her stepson in Gattigo. Life at Kelmersdale was very dull to a
woman whose idea of enjoying herself meant much gossip and many
sociable tea-parties.

"I will stay, as you are so kind," Janet said, yet in her heart she wished
she had the courage to go, and have the parting over.

Maria, a good-natured girl, with very little to do, seemed rather pleased at
the prospect of a visit from the children, and said that the last
housekeeper had a niece who used to stay with her for months at a time.
There was a turret-room, six-sided, at the end of the passage on which
Mrs. Rayburn's rooms opened, and this was got ready for the boys. Janet
unpacked and arranged their clothes herself; and at night she tucked
them up in an old-fashioned little bedstead, with a high back of carved
wood. Conspicuous among the carving was an earl's coronet, which had
once been gilded; I suppose some baby Earl of Beaucourt had once slept
in the bed which now held poor Janet's boys. They slept as sweetly as any
earl, and even Janet slept, worn out.

Next day, Janet said she must catch the train for Liverpool, which was due
at Rugeley at a little after eleven. She had still a good deal of packing to
do, some things to buy which she would want on the passage, and she
must go to the school the boys had attended and pay what was due there.

She would not take the boys to Rugeley with her. When Jacob and the
taxcart came to the door, she kissed Mrs. Rayburn, and whispered—

"Be—be tender with them. They have never had a harsh word. Frank will
give you no trouble, and if Fred is not quite so good, oh! Have patience
with him, he is but a baby. Good-bye, and thank you for all your
kindness."

Then she knelt down on the stone floor of the hall, and held her boys to
her heart for a few moments. Fred set up a lamentable howl, but Frank
only gazed at his mother with wide eyes and a pale face. Janet rose, and
walked hurriedly out into the porch; Jacob helped her into the cart, and in
a moment they were gone.

"Come back, come back, muddie!" shouted Fred; "Take me wif you. I
won't stay here."

"Nonsense, child!" said Mrs. Rayburn, catching him as he broke away from
Frank and ran towards the door. "You've got to stay here. Come along to
my room and watch the cart; you can see it from the window there."

When the cart had passed the last turn in the long road through the park
at which it could be seen, Fred set up another roar. Mrs. Rayburn lifted
him up, and went to where her spacious easy-chair stood, where she sat
down.

"Stop that, Fred. Come here, Frank. Now, listen to me, both of you. You
are to stay here for a time, and if you're good, you'll have a pleasant time
of it. And I dare say you will be good, after a time, but you're both just a
bit spoiled, because your mother is too soft in her ways with you. Now,
I'm not like her."

"No, grandma," said Frank, with conviction.

"And if you're naughty or noisy or mischievous or troublesome in any way,


I'll give you a right good whipping. If you'll be good, you'll find yourself
very well off. And when you've had a whipping or two, I've no doubt I
shall have no more trouble with you. Come, now, get your hats, and I'll
show you a place where you may run about and play."

She took them out into the paved court, and across to a small iron gate,
and, when she had unlocked and opened this gate, Frank cried out with
surprise and delight—

"Oh, muddie, muddie, if you could just see this!"

On hearing this imprudent mention of "muddie," Fred began to roar; but he
received a very prompt cuff on the side of his curly head, and ceased,
staring hard at grandma.

To confess the truth, Fred had been quite spoiled by being the pet and
plaything of the school he attended with Frank—and, indeed, of the house
where his mother lodged also. He was a very handsome child, being like
his father, and he was also a self-willed little monkey, who liked his own
way, and was but little used to contradiction. Seeing "muddie" but for a
short time each day, he was always very happy and tolerably good with
her, so that poor Janet had little idea that her son had learned to get his
own way—entirely with Frank, and to a great extent with others—by
howling loudly when not pleased. Thus I may say that I do not altogether
grudge him a little discipline, though a box on the ears is not a safe way
to apply it.

Frank took his brother's hand, and drew him through the little gate into a
large, old-fashioned garden, primly and stiffly laid out, and full of various
flowers, though there was nothing very fine or rare. But to a child a flower
is a flower, and there were walks to run up and down, little thickets of
evergreen to explore, and, in the middle, a marble basin full of gold-fish.
In fact, it was a Paradise, and in this Paradise, these two little Adams
were to be left to their own devices.

"I shall come for you at two—that's my dinner-time, children. You must
not walk on the beds nor pick flowers nor do any mischief, but play about
and amuse yourselves. And I do hope, Frank, that you'll pick up a little
colour, for at present, you're a show. I shall lock you in. Now mind, if you
do any mischief, I shall whip you soundly."

To leave two boys, one not quite seven and the other only four, alone in a
garden full of flowers, and to expect them to gather none is to expect too
much of such very young human nature. Frank would never have done it,
but Fred did; and Frank, though he disapproved, did not actually interfere
to stop him.
Mrs. Rayburn spent the rest of the morning in writing to Lord Beaucourt,
telling him what had happened (in her own way), and asking leave to
keep the boys with her until their parents sent for them. As she had
before given Lord Beaucourt to understand that Fred Rayburn was a ne'er-
do-weel, who had ruined her, and his wife a silly, shiftless body, who
never saw what mischief was going on, while she herself was a most
amiable, trustful being, whose little all had been made away with by this
thriftless pair, the earl was quite ready to pity her. He wrote that he was
sorry that she had new difficulties with her stepson, but that the children
would be in no one's way at Kelmersdale, and she could keep them, if she
liked. This answer, of course, did not come for two or three days.

At two, Mrs. Rayburn went to the garden for the two boys, caught them
red-handed—that is to say, Fred had his hand full of some gaudy tulips
and china roses—and proceeded to administer what she called justice at
once. She had found them near the marble basin, and on the edge of this
she sat down.

"Did you hear me say that you were not to touch the flowers? Yes, you
certainly did. And I said that if you did, I should whip you soundly."

"If we did any mischief, you said, grandma," answered Frank.

"And what do you call that?" pointing to the flowers in Fred's hand. "And
what do you suppose Mr. Ross, the gardener, will say when he misses
them? And the beds all trampled on, I suppose?"

"No, indeed, grandma, we never went on the beds at all. We did no


mischief—flowers don't mind being picked."

"Don't you stand arguing there, sir; you were always one for arguing. It
was all your fault, for Fred's only a baby, so I shall let him off this time."

And seizing Frank, she proceeded to lay him across her knees, and gave
him a smart whipping. Then she set him on his feet, all flushed and giddy.
The first thing he saw was the row of windows that overlooked the
garden, and I think that the shame of having possibly been seen
undergoing such disgrace was worse than the whipping.

Fred, hitherto staring, open-mouthed and terrified, now began to whimper.

"Oh, Fwank, was it for the f'owers? You said she'd be angwy. Beat me too,
you bad woman. 'Twas me took 'em; Fwank begged me not."
