
Discovering Relational Intelligence In Online Social Networks

Leonard Tan¹, Thuan Pham¹, Hang Kei Ho², and Tan Seng Kok³

¹ Engineering and Sciences, The University of Southern Queensland, Australia
  Leonard.Tan@usq.edu.au, Thuan.Pham@usq.edu.au
² Faculty of Social Sciences, The University of Helsinki, Finland
  hang.kei.ho@helsinki.fi
³ Construction AI Research Labs, Applipro Services, Singapore
  aps rudi@yahoo.com.sg

Abstract. Information networks are pivotal to the operational utility of key industries like medicine, finance and government. However, applications in this area are not adequate in representing relationships between nodes [34]. Trending graph learning methodologies [9], [16] like Graph Convolutional Networks (GCNs) [6] lack both the representational power and the accuracy to perform abstract computational tasks like prediction, classification and recommendation on real-time social networks. Furthermore, most such approaches known to date rely on learning temporal adjacency matrices to describe shallow attributes [16], [9] like word co-occurrence PMI [3] changes [6] and are unable to capture the complex evolving entity relationships found in real life for applications like event prediction, link prediction and topic tracking [34]. Importantly, such models ignore knowledge information geometry [32], [1], [24] completely and sacrifice fidelity for speed of convergence. To address these challenges, a novel Relational Flux Turbulence (RFT) model was developed in this study - to identify relational turbulence in Online Social Networks (OSNs). Very good correlations between relational turbulence and the sentiments exchanged within social transactions show promise in achieving these objectives.

Keywords: Relational Turbulence, Social Recognition, Deep Learning

1 Introduction
Online Social Network (OSN) behavior has always been a topic of interest within various fields of social applications in artificial intelligence. These include link detection, security threat identification, pattern recognition, recommendation, topic modeling and event prediction tasks. Key relational behavior arises from manifolds of dynamic communication patterns which evolve over a temporal space of constant inceptions. Recent research includes the use of directional dyads and signed reciprocity as a special representation of link "strength" [22].
Challenges. Many relational approaches in this area of study, however, lack depth and representative power [35]. The drawback of these techniques is that important correlational attributes shared between actors are ignored, resulting in shallow representations of relational states [35]. Methods based on feature similarities throughout the literature have shown a lack of representational efficacy to model real-life social structures effectively [2], [28]. Generally speaking, several critical questions in this field of study remain unanswered. In an unstructured social network with an evolving construct of dynamic relationships [35]: firstly, how can we accurately represent generalizations of evolutionary behavior within these social transactions? Secondly, how can we recognize dynamic relational profiles which correlate to different social communication patterns? Finally, how can we quantify the dynamic errors arising from social disruptions (outliers) in our representations?
Data Models. We address these questions with the use of Fractal Neural Networks (FNNs). FNNs are used within the Relational Turbulence Model (RTM) framework to describe structures of chaos [25]. FNNs leverage the dynamic structure of fractals as the lowest-principle decompositions of never-ending patterns. They are driven by a recursive process and are adaptable enough to describe highly dynamic system representations [21]. In our approach, we define Relational Turbulence as probabilistic measures of Relational Intensity P(γrl), Relational Interference P(ϑrl) and Relational Uncertainty P(ϕrl) [30]. RTM characterizes an artificial construct which predicts communication behaviors. These behaviors are observed during relationship transitions in an environment of constant social disruptions [30]. We choose this model because alternative data models compromise accuracy and performance for simplicity in representation. Examples include node-based, neighbor-based, path-based and random walk-based measures [11]. These representations capture relational structures from a time-static perspective and are not adaptable to real-life dynamic evolutions of relational states [31]. In this work, we focus on discovering relational intelligence through identifying relational profiles on three major social platforms: Twitter, Google and Enron email datasets.
Technical Model. In this paper, we introduce RFT to tackle the problem
of misrepresentations as a time evolving flow of relational attributes. The model
evolves into a multi-stage Deep Neural Network (DNN) from atomic fractal
hybrid architectures [5]. The atom structure is morphed from standard con-
catenations of Restricted Boltzmann Machines (RBMs) and Recursive Neural
Nets (RNNs). RFT accepts as inputs, key relational feature states fi between
actors aj and global events E from past and present social transactions to de-
termine the likelihood of relational turbulence τij within an identified social flux
F . Turbulence broadly corresponds to disruptive social communication patterns
within various topic and event contexts. For example, passive negative sentiments
transacted through discussions on major topics like trade wars, drive relational
breakdowns in many aspects like trust, influence, status, etc. We develop a novel
architecture from RTM to identify social disruptions by estimating relational tur-
bulence profiles, within a given social context describing the state of flux. Then,
we evaluate and demonstrate that our methods outperform similarity based fea-
ture and flat structural approaches in detecting social flux and turbulence.
Contributions. Our scientific contributions are presented as follows:
1. Our method adaptively learns from real-time online streaming data to iden-
tify key turbulent relationships within a given OSN.
2. An innovative RFT model was developed to capture key relational features
which were used to detect and profile social communication patterns of event-
ful states within a given OSN.
3. Experiment results show that RFT is able to offer a good modeling of rela-
tional ground truths, while FNN efficiently and accurately represents evolv-
ing relational turbulence and flux profiles within a given OSN.

The remaining part of the paper is organized as follows: Section II presents a brief
overview of related works drawn from social theories and relational structures.
Section III introduces key concepts, theories and preliminaries of our proposed
model. Section IV discusses the methods and models we have developed for
profiling relational turbulence in OSNs. Section V introduces our experimental
design, implementation, results and presents our discussion. Section VI leads to
a conclusion and potential future directions.

2 Related Literature

Relational Turbulence. Relational Turbulence was first studied in [13]. It is characterized as a resultant state of conflicting interests between two or more actors. Conflict correlates to both a stimulus for communication and detrimental event occurrences [13]. Therefore, relational altering events are important discriminators for conflict detection and turbulence profiling. These events, if found to be in large negative violation of the expectancies between relational reciprocates of actors, can lead to instability in a relational flux [13]. The RTM [30] builds upon the core principles of relational state shifts and conflict management in an environment of continuous online social disruptions. The process of turbulent relationship development can be described as a continuous and communicative state of flux [30]. This state defines a consistent exchange of sentimental and affective information between the actor(s) involved. Each transition to another state (e.g. professional colleagues to friendship) has the probability to cause friction (conflict), which may lead to a polarization of sentiments and affective communication flux in OSNs [30]. Two key features of the RTM are actor interference and relational uncertainty [30]. They enable effective detection and prediction of conflicting events in sentimental and affective computing.
Neural Network Architectures. In [18], the authors present a minimalist neural network architecture for reliably and accurately estimating emotional states based on EEG-captured data. Their model, however, suffers from a lack of representation for more deeply complex emotional states (e.g. an in-betweenness in quantization across valence and arousal). Additionally, their reinforced gradient coefficient augments the errors calculated between expected-weighted and actual outputs, which are then used to update the layered weights of their shallow Artificial Neural Network (ANN) model. This approach alleviates diminishing gradients at the expense of performance. In the same vein, [26] deals with social role recognition through the use of a Conditional Random Field (CRF) layered model architecture. However, for video image frames in which latent social role-based semantics exist, CRF architectures are ill-adapted to handle complex representations of the depth of these roles in the identification process. This leads to poor performance measures for their full model method. Building on the principles of Role Theory, the authors in [20] propose a deeper hierarchical model for human activity recognition based on identified actor roles within an eventful context. Their model's performance suffers when scaling to larger event frameworks due to problems of overfitting and error gradient saddle points.

3 Preliminaries

Our RFT model leverages two key concepts. The logical aspect is derived from the Relational Turbulence Model, and the structural design is evolved from the Fractal Neural Network (FNN). The core idea of RFT is to iteratively adapt the structure of the neural network model, producing changing outputs (relational turbulence) in response to the changing complexity of the data at its inputs. A detailed architecture of the FNN used in our design is given in Figure 1a.

Fig. 1: The RFT logical architecture. (a) The RFT architecture design. (b) The RFT system model design.

Relational Turbulence. From the RTM approach [29], we define Relational Intensity P(γrl), Relational Interference P(ϑrl) and Relational Uncertainty P(ϕrl) to be the three key probabilistic outputs of the RFT model which represent the relational turbulence P(τrl) of a given link in an OSN. The key element types we have identified as contributing features between the duration of the turning point and relationship development (as an unstable/turbulent process) are the Confidence ρij, Salience ξij and Sentiment λij scores in the actor-actor relationship of the social transaction in question.
Expectancy Violation. It is noteworthy that the ground truth reciprocities of these element types shared within a relational flux violate expectancies - E(ρij), E(ξij) and E(λij) respectively [29]. These violations are a contributing factor to temporal representations of relational turbulence - γrl, ϑrl and ϕrl. Negative expectancy is defined as a polar mismatch between expected reciprocates and actual reciprocates (e.g. actor i expecting a somewhat positive reciprocation of an egress sentiment stream but instead receiving a negative ingress sentiment stream from actor j). Positive expectancy is defined as a strong cosine-similar vector alignment between these reciprocates. Both expectancy violation (EV) extremes are characterized by sharp gradient changes of their weighted feature scores. This is given mathematically as:
$$\frac{\partial E_{rl}}{\partial \tau_{rl}} = \sum_{i,j=1}^{n} \prod_{\eta=\rho,\xi,\lambda} \frac{E(\eta_{ji})}{\partial \eta_{ji}} \times \frac{\partial \eta_{ij}}{\partial \tau_{ij}} \qquad (1)$$

Where τij is the relational turbulence between node i and its surrounding neighbors j, and ηij, ηji are the reciprocated scores from node i to j and from j to i respectively.
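The gradient in Eq. (1) can be approximated numerically once the per-pair feature expectancies and their finite-difference changes over a time window are available. The sketch below is illustrative only; the dictionary keys, array shapes and the finite-difference treatment of the partial derivatives are our own assumptions, not the authors' implementation.

```python
import numpy as np

def ev_gradient(E_eta, d_eta_ji, d_eta_ij, d_tau_ij):
    """Illustrative finite-difference evaluation of Eq. (1).

    Each argument is a dict keyed by the three relational feature types
    ('rho', 'xi', 'lam'), holding (n x n) arrays over all actor pairs (i, j):
      E_eta[k]    -- expected reciprocated score E(eta_ji)
      d_eta_ji[k] -- change in the received score eta_ji over the window
      d_eta_ij[k] -- change in the sent score eta_ij over the window
      d_tau_ij[k] -- change in estimated turbulence tau_ij over the window
    Names, shapes and the finite-difference estimates are assumptions.
    """
    prod = None
    for k in ("rho", "xi", "lam"):
        term = (E_eta[k] / d_eta_ji[k]) * (d_eta_ij[k] / d_tau_ij[k])
        prod = term if prod is None else prod * term
    return float(np.sum(prod))  # sum over all actor pairs (i, j)
```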

Relational change or transition - also known as a turning point - defines a state-based critical threshold beyond which relational turbulence and negative communication are irrevocable [19]. This critical threshold is specific to actor-actor relationships and is learned through our model as a conflict escalation minimization function [27]. Conflict escalation is defined as the gradual increase in negative flux -∇F/∇t over time within a classified context area LF of interest [27]. The critical threshold parameter is then derived mathematically as:
$$T = \inf_{t\to\infty} \begin{cases} \dfrac{1}{2m}\left(-\dfrac{\nabla F}{\nabla t} \times \log_2\!\left(\dfrac{\nabla F}{\nabla L}\right)\right) & \forall \left|\dfrac{\partial E_{rl}}{\partial \tau_{rl}}\right| > 1 \\[6pt] \log_2\!\left(\left|1 - \dfrac{\nabla F}{\nabla L}\right|\right) & \forall \left|\dfrac{\partial E_{rl}}{\partial \tau_{rl}}\right| < 1 \end{cases} \qquad (2)$$

Where T is the threshold of interest and m is the total number of training data
over the time window t. The equation states simply that the relational transition
threshold decreases drastically for strong EVs and gradually for weak EVs.
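As a loose numerical sketch (not the authors' code), the two-branch rule in Eq. (2) can be evaluated per time window as follows; the variable names and the caller tracking the running infimum over windows are assumptions.

```python
import numpy as np

def transition_threshold(dF_dt, dF_dL, dE_dtau, m):
    """Evaluate the two-branch rule of Eq. (2) for one time window.

    dF_dt   -- flux gradient over time within the window
    dF_dL   -- flux gradient over the classified context area L_F
    dE_dtau -- expectancy-violation gradient from Eq. (1)
    m       -- number of training samples in the window
    """
    if abs(dE_dtau) > 1:   # strong expectancy violation: threshold drops sharply
        return (1.0 / (2 * m)) * (-dF_dt * np.log2(dF_dL))
    # weak expectancy violation: threshold drops gradually
    return np.log2(abs(1.0 - dF_dL))

# Example of tracking the infimum across windows (hypothetical gradients)
windows = [(0.4, 1.6, 1.8), (0.2, 1.2, 0.6), (0.5, 2.1, 2.4)]
T = min(transition_threshold(a, b, c, m=100) for a, b, c in windows)
```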
Problem Formulation. The problem statement which our work addresses can be summarised as follows: given an OSN within an environment of constant social shocks, we wish to minimize inaccuracies in the representations of the time-evolving flow of relational attributes (time-realistic relationships) between actors. Furthermore, although DNNs are very powerful tools designed for both classification and recognition tasks, they are computationally demanding [14]. A drawback of a generative architectural approach is its use of stochastic gradient descent methods during training, which do not scale well to high dimensionalities [23], [17]. Still, generative DBNs offer many benefits, such as a supply of good initialization points and the efficient use of unlabeled data, making their use in deep network architectures indispensable [5].
The Model Solution. To tackle the problem of computational efficiency and learning scalability to large data sets, we have adopted the DSN model framework for our study. Central to the concept of such an architecture is the relational use of stacking to learn complex distributions from simple core belief modules, functions and classifiers. Our approach leverages the temporal transitions of stages in the relational evolution between nodes of an OSN [34]. It determines profiles of relational turbulence and encodes knowledge dimensionality into a highly volatile shallow fractal ANN architecture. This is used to either generate or collapse depth complexity during active learning - in response to random "anytime-sequenced" fluctuating data.

4 Model and Methods

A high-level system architecture of RFT is given in Figure 1b. Specifically, in our design, data is fed into our model from two distinct sources. The first is batch processed from a repository of social data (Google and Enron emails). The second is actively learned from live streaming tweet data (Twitter) pulled from multiple server sources using the Twitter firehose API. It is then pushed through the model in stages. During pre-processing, data is first broken down into key relational features - Category confidence, Entity salience, Entity sentiment, Mentions sentiment and Context sentiment - using the Google NLP API. Then, in the next stage, these features are accepted as inputs into our RFT model (Figure 1b) to estimate the output relational turbulence profiles. The input features of our RFT model are concatenated with the truth values of relational turbulence calculated from (6), (7) and (8) and synchronously fed back recursively into the intermediate confabulations of our FNN architecture (Figure 1a). Errors in output expectations are backpropagated and corrected with inter-layer activity weight adjustments until they fall within pre-defined tolerance levels.

4.1 The Hybrid RFT Fractal Architecture

We begin with the definition of a soft kernel used to discover a Markovian structure which we then encode into confabulations of fractal sub-structures. For a given set of data observables as inputs χ ∈ X and outputs ξ ∈ Ξ, we wish to loosely define a mapping such that the source space (X, α) maps onto a target space (Ξ, ω). The conditional P(χ | ω) assigns a probability from each source input χ to the final output space in ω. Each posterior state-space from input to output is generated and sampled through a random walk process. An indicator function which we have chosen to describe the state transition rule is:
$$\Theta_{t+1} = \min_{c=1}^{q_n} \left\{\, 0,\; \frac{\delta E^{c}_{t+1}}{\delta \chi^{c}_{t}} \,\right\} \qquad (3)$$

Where $\delta E^{c}_{t+1}$ is the error change from one hidden feature activity state ht ∈ H onto higher posterior confabulations. The objective function at each transition seeks to minimize error gradients. For a general finite-state-space Markovian process, the Markov kernel is thus defined as:
Discovering Relational Intelligence In Online Social Networks 7

$$\mathrm{Kern}(M) = \begin{cases} p : X \times \omega \to [0,1] \\ p(\chi \mid \omega) = \displaystyle\int_{\omega} q(\chi, \xi)\, \nu(\delta \xi) \end{cases} \qquad (4)$$
Once a unique Markovian neural network has been discovered, a Single Layer Convolutional Perceptron (SLCP) is proposed as a baseline structure to learn the fractal sub-network from pre-existing posterior confabulations. The SLCP baseline structure changes as discovered knowledge is progressively encoded during the learning process.
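A minimal sketch of the transition rule in Eq. (3) is given below, assuming the error and input-activity changes for the q candidate confabulations have already been measured; the function and argument names are hypothetical, and mapping the '0' branch to a "stay put" decision is our own reading of the rule.

```python
import numpy as np

def next_confabulation(delta_E, delta_chi):
    """Pick the candidate hidden confabulation whose error change per unit
    input-activity change is smallest, per our reading of Eq. (3).

    delta_E   -- array of error changes dE^c_{t+1} for the q candidate states
    delta_chi -- array of matching input-activity changes dchi^c_t
    Returns the index of the chosen state, or -1 (the '0' branch) when no
    candidate reduces the error. Names and the -1 convention are assumptions.
    """
    ratios = np.asarray(delta_E, float) / np.asarray(delta_chi, float)
    best = int(np.argmin(ratios))
    return best if ratios[best] < 0 else -1
```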

4.2 The Fractal Neural Network


The model design we have chosen to address the dynamic profiling of relational turbulence is the Fractal Neural Network (FNN) [21]. The FNN adopts a hybrid architecture which incorporates the use of both generative and discriminative deep networks [5]. In our architecture, the generative DBN is used to initialize the DNN weights. Fine-tuning from the backpropagation process is then subsequently carried out sequentially, layer by layer.
Generative Framework. In our learning model, the FNN generative framework is developed from the Restricted Boltzmann Machine (RBM) [12] layer stack. A Boltzmann Machine is architecturally defined as a stochastically coupled pair of binary units. These units contain a visible layer given as V ∈ {0, 1}^D and a hidden layer vector H ∈ {0, 1}^P. The coupling between the visible and hidden layers V, H is driven by an energy state of layered interactivity, expressed as:
$$E(V, H, \theta) = -\tfrac{1}{2} V^{T} L V - \tfrac{1}{2} H^{T} J H - V^{T} W H \qquad (5)$$
Where θ = {W, J, L} are the Boltzmann Machine model weights for the visible-to-hidden, hidden-to-hidden and visible-to-visible couplings respectively.
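For concreteness, Eq. (5) can be evaluated directly. The sketch below is a small NumPy illustration under our own assumptions (for a restricted machine, J and L are set to zero); it is not the paper's training code.

```python
import numpy as np

def boltzmann_energy(v, h, W, J, L):
    """Energy of the coupled visible/hidden state per Eq. (5).
    v: (D,) visible binary vector, h: (P,) hidden binary vector,
    W: (D, P) visible-hidden, J: (P, P) hidden-hidden, L: (D, D) visible-visible.
    For a *restricted* Boltzmann Machine, J and L are zero matrices."""
    return -0.5 * v @ L @ v - 0.5 * h @ J @ h - v @ W @ h

# Small hypothetical example
rng = np.random.default_rng(0)
D, P = 6, 4
v = rng.integers(0, 2, D).astype(float)
h = rng.integers(0, 2, P).astype(float)
W = rng.normal(0.0, 0.1, (D, P))
J = np.zeros((P, P))   # restricted: no hidden-hidden coupling
L = np.zeros((D, D))   # restricted: no visible-visible coupling
print(boltzmann_energy(v, h, W, J, L))
```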
Discriminative Framework. The discriminative architecture of our FNN model is built from the Tensorized Deep Stacking Recursive Neural Network (TDSN-RNN) model framework [5]. All deep architectures (from Contrastive Divergence or per-layer RBM pre-training to the supervised backpropagation perceptron golden architecture) rely on a back-and-forth recursive process through three core stages of their learning process. Stage 1 involves a forward pass which sequentially processes stacked training layers from input to output. Stage 2 backpropagates this layer-wise sequence from output to input using gradient descent. Stage 3 adjusts weights between layers to minimize output errors. This process is repeated in cycles until the final expectation is reached.
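The three stages above correspond to the familiar forward/backward/update cycle. The toy loop below illustrates that cycle for a plain stack of sigmoid layers with a squared-error objective; it is a simplified stand-in of our own, not the TDSN-RNN used in the paper.

```python
import numpy as np

def train_stack(layers, X, Y, lr=0.01, epochs=100):
    """Toy illustration of the three-stage training cycle.

    layers : list of weight matrices, e.g. initialised from a generative DBN
    X, Y   : training inputs (n, d0) and targets (n, d_out)
    """
    sigmoid = lambda z: 1.0 / (1.0 + np.exp(-z))
    for _ in range(epochs):
        # Stage 1: forward pass, input to output
        acts = [X]
        for W in layers:
            acts.append(sigmoid(acts[-1] @ W))
        # Stage 2: backpropagate errors, output to input (gradient descent)
        delta = (acts[-1] - Y) * acts[-1] * (1 - acts[-1])
        grads = []
        for i in reversed(range(len(layers))):
            grads.append(acts[i].T @ delta)
            if i > 0:
                delta = (delta @ layers[i].T) * acts[i] * (1 - acts[i])
        # Stage 3: adjust inter-layer weights to reduce the output error
        for W, g in zip(layers, reversed(grads)):
            W -= lr * g
    return layers
```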

4.3 The Relational Turbulence Model


In our model, the probabilistic Relational Turbulence P (τrl ) of a given link in an
OSN is determined by key features of an established relationship in any instance.
They are the confidence ρij , salience ξij and sentiment λij scores in a dyadic
link. We define relational intensity as the continuous integration of sentimental
transactions F per context (event topic) LF area, the relational uncertainty
as the likelihood from opposing sentiment mentions and relational interference
as the probabilistic deviations in expectancies from predicted uncertainties and
flux intensities. Mathematically, these are given as:
For Relational Intensity:

$$\gamma_{rl} = \sum_{i,j=1}^{n} \frac{\beta_{ij} \left| -\frac{\nabla F_{j}}{\nabla t} \right|}{L_{F}} + \chi_{rl} + \dot{\theta}_{rl} \qquad (6)$$

Where βij is defined as the temporal derivative of the latent topic (context) oscillation phase, χrl is the reciprocal bias and θ̇rl is the gradient of social influence from one actor to another across a relational link.
For Relational Uncertainty:
$$\phi_{rl} = \frac{\sum_{i,j=1}^{n} S_{i} S_{j}}{\sqrt{\sum_{i=1}^{n} S_{i}} \, \sqrt{\sum_{j=1}^{n} S_{j}}} \qquad (7)$$

Where Si and Sj are sentiments transacted from nodes i to j and from nodes j
to i respectively.
For Relational Interference:

$$\begin{aligned} \vartheta_{rl} &= E\big(F(\gamma_{rl}, \phi_{rl} : \mu_{\gamma\phi}, \omega^{2}_{\gamma\phi})\big) \\ &= \sum_{\gamma_{rl},\phi_{rl}=0}^{n} \left[\tfrac{1}{2} + \tfrac{1}{2}\,\mathrm{erf}\!\left(\frac{\gamma_{rl},\phi_{rl} - \mu}{\sqrt{2}\,\omega}\right)\right] \frac{1}{\sqrt{2\pi}\,\omega}\, \exp\!\left(-\frac{(\gamma_{rl},\phi_{rl} - \mu)^{2}}{2\omega^{2}}\right) \end{aligned} \qquad (8)$$

Where
$$F(\gamma_{rl}, \phi_{rl} : \mu_{\gamma\phi}, \omega^{2}_{\gamma\phi}) = \frac{1}{\sqrt{2\pi}\,\omega} \sum_{t=-\infty}^{\gamma_{rl},\phi_{rl}} \exp\!\left(-\frac{(t-\mu)^{2}}{2\omega^{2}}\right) dt \qquad (9)$$

Here, F(γrl, ϕrl : μγϕ, ω²γϕ) is the Cumulative Distribution Function (CDF), and erf(x) is the error function of the predicted outcomes γrl and ϕrl.
Finally, Relational Turbulence was calculated from the conditional posteriors of γrl, ϑrl and ϕrl as:
$$P(\tau_{rl}) = \sum_{i=1}^{n} \frac{P(\gamma_{i} \mid \vartheta_{i})\, P(\vartheta_{i} \mid \phi_{i})\, P(\phi_{i} \mid \gamma_{i})}{N_{i}\, P(\gamma_{i})\, P(\vartheta_{i})\, P(\phi_{i})} \qquad (10)$$

Here, Ni is the conditional scaling factor. The inputs were tested across the RFT
dynamically stacked Fractal Neural Network (FNN) and the chosen baseline
models.
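To make Eqs. (7), (8) and (10) concrete, the sketch below computes the three quantities from already-extracted sentiment scores and estimated probabilities. It is a loose numerical illustration under our own assumptions (SciPy's normal distribution stands in for the erf expansion, and all inputs are hypothetical placeholders), not the authors' pipeline.

```python
import numpy as np
from scipy.stats import norm

def relational_uncertainty(S_out, S_in):
    """Eq. (7): ratio of pairwise sentiment products to root sentiment sums.
    S_out, S_in -- sentiment scores sent from i to j and received from j to i."""
    num = np.sum(np.outer(S_out, S_in))               # sum over all pairs (i, j)
    return num / (np.sqrt(np.sum(S_out)) * np.sqrt(np.sum(S_in)))

def relational_interference(gamma, phi, mu, omega):
    """Loose stand-in for Eq. (8): expected Gaussian CDF of the predicted
    intensity/uncertainty values, using SciPy's normal in place of erf."""
    z = np.concatenate([np.atleast_1d(gamma), np.atleast_1d(phi)])
    return float(np.sum(norm.cdf(z, mu, omega) * norm.pdf(z, mu, omega)))

def relational_turbulence(p_cond, p_marg, N):
    """Eq. (10): sum of scaled conditional-to-marginal posterior ratios.
    p_cond -- (n, 3) conditionals P(gamma|theta), P(theta|phi), P(phi|gamma)
    p_marg -- (n, 3) marginals    P(gamma), P(theta), P(phi)
    N      -- (n,)  conditional scaling factors (all assumed estimated upstream)."""
    return float(np.sum(np.prod(p_cond, axis=1) / (N * np.prod(p_marg, axis=1))))
```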
5 Experiments

5.1 Dataset

The experiments were conducted on three datasets using RFT and five different baseline algorithms. The datasets are Twitter, Google and Enron emails. These three datasets were chosen because they are widely benchmarked in the literature for studies in sentiment computing and can be easily understood by the audience of this paper. They are detailed in Table 1.

Table 1: Statistics of Datasets

Dataset         #Entities   #Dyads       Size             Avg. text len
Enron email^1   162         1.5 mil      500000 emails    1000
Googles^2       3566224     436994489    279 mil crawls   5000
TwitterAPI^3,4  50 mil      1 bil        100 mil tweets   283

^1 https://archive.ics.uci.edu/ml/datasets/bag+of+words
^2 http://commoncrawl.org/2014/07/april-2014-crawl-data-available/
^3 https://developer.twitter.com/en/docs.html
^4 http://help.sentiment140.com/api

5.2 Baselines

Several state-of-the-art methods were considered for comparison with the pro-
posed RFT model. Since the model is the first in line for this type of adaptive
online active learning approach, modified versions of similar methods were used
along with the baselines, developed earlier for comparison. Another notable point
is although many prediction models exist, not all methods have the same goal or
data features as this study. Therefore consideration is given only to the models
which use similar data for comparison. It should be mentioned that not all the
methods can both predict relational turbulence and profile communication pat-
terns together. Therefore we compare only the profiles of relational turbulence
outputs between each other. Descriptions of the competing methods are given in
Table 2. The key difference between DCN and RFT is that in DCN the number
of layers are fixed at 45 while in RFT, the layers are allowed to grow and collapse
as new feature complexity representations are learned over time.

Table 2: Baseline Models

Baseline          Class     Data                   Modalities
SLFN [15]         ANN       feature parameters     parametric inputs
DCN [14]          DNN       feature parameters     45-layer DCN
IMPALA [7]        RL        feature parameters     Reinforcement Learning
MVVA [35]         VAR       endogenous variables   Vector Auto-Regression
EnsemDT [8]       Ensemble  group learners         SLP, DCN, IMPALA and MVVA
RTM (True Value)  RTT       feature parameters     (6) - (10)

Tuning Parameters. In the experiments, system model parameters were chosen based on the combined effect of several factors - including errors in observational data, choices of calibration methods and Design Of Experiment (DOE) criteria [10]. A hybrid of both global and local Sensitivity Analysis (SA) approaches was used to determine and specify the best performing parameters for experimentation, based on a predefined behavior threshold for the model. The experiments were conducted on the training model with a learning rate set to 1.1, a sliding window set to 3, an error tolerance set to 0.1 (10%), a data outlier threshold set to 1.0 with scaling set to 10, a vanishing gradient error threshold set to 0 and an exploding gradient error threshold set to 100. Finally, the trust region radius parameter was set to 5 and the softmax temperature regularization parameter was set to 1.2.
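For reference, the reported settings can be collected into a single configuration object; the sketch below is only a hedged summary of the values listed above, with key names of our own choosing rather than the authors' code.

```python
# Hedged summary of the tuning values reported above as a plain config dict;
# the key names are our own and do not come from the authors' code.
rft_config = {
    "learning_rate": 1.1,
    "sliding_window": 3,
    "error_tolerance": 0.1,              # 10%
    "outlier_threshold": 1.0,
    "outlier_scaling": 10,
    "vanishing_gradient_threshold": 0,
    "exploding_gradient_threshold": 100,
    "trust_region_radius": 5,
    "softmax_temperature": 1.2,
}
```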

5.3 Performance Measurements

Kendall Coefficient. The Kendall (tau-b coefficient) was used to measure the
strength of associations between predicted and expected outputs of the learning
models. It is given as:

$$\tau_{b} = \frac{N_{c} - N_{d}}{\sqrt{(N_{0} - N_{x})(N_{0} - N_{y})}} \qquad (11)$$

Where Nc, Nd are the number of concordant and discordant pairs respectively, N0 = n(n-1)/2, Nx = Σi ui(ui-1)/2 and Ny = Σj vj(vj-1)/2, with ui the number of tied values in the i-th group of ties for the first quantity and vj the number of tied values in the j-th group of ties for the second quantity.
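In practice the tau-b statistic of Eq. (11) can be obtained directly from SciPy rather than computed by hand; the data values below are hypothetical.

```python
from scipy.stats import kendalltau

# Hypothetical expected vs. predicted turbulence values for one model
expected  = [0.2, 0.5, 0.1, 0.9, 0.4]
predicted = [0.3, 0.4, 0.1, 0.8, 0.5]

# SciPy computes the ties-corrected tau-b of Eq. (11) by default
tau_b, p_value = kendalltau(expected, predicted)
print(tau_b, p_value)
```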
Spearman Coefficient. The Spearman (rho coefficient) was used to measure the monotonic relationship between the independent variables (Category confidence Ci, Entity salience Ji, Entity sentiment magnitude and score (Ξi, ℶi), Mention sentiment magnitude and score (Λi, ℷi), Context sentiment magnitude and score (Πi, ℸi)) and the dependent variables (Relational Intensity γrl, Relational Interference ϑrl and Relational Uncertainty ϕrl). It is calculated as:
$$\Gamma_{S} = 1 - \frac{6 \sum D_{i}^{2}}{N(N^{2} - 1)} \qquad (12)$$
Where Di = rank(Xi )−rank(Yi ) is the difference in ranks between the observed
independent variable Xi and dependent variable Yi and N is the number of
predictions to input data sets for all three sources.
K-Fold Validation. Finally, during the experimentation, the full datasets obtained from the different sources (Twitter, Google and Enron) were partitioned into k subsamples. K-fold validation [33] was performed over all deep learning models using the Mean Absolute Percentage Error (MAPE) [4] measurement of each run. Mathematically, MAPE can be expressed as:


$$\delta_{MAPE} = \frac{1}{N} \sum_{i=1}^{N} \left| \frac{E_{i}(x) - Y_{i}(t)}{E_{i}(x)} \right| \qquad (13)$$

Where Ei(x) is the expectation at the output of data input set i and Yi(t) is the corresponding prediction over N total subsamples. δMAPE is the average measure of errors in expectations at the output.
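A compact sketch of this evaluation protocol is given below, assuming a generic `fit_predict` callable as a placeholder for each compared model; scikit-learn's KFold and SciPy's spearmanr are used for convenience and are not necessarily the authors' tooling.

```python
import numpy as np
from scipy.stats import spearmanr
from sklearn.model_selection import KFold

def mape(expected, predicted):
    """Eq. (13): mean absolute percentage error of predictions vs. expectations."""
    expected, predicted = np.asarray(expected, float), np.asarray(predicted, float)
    return float(np.mean(np.abs((expected - predicted) / expected)))

def kfold_evaluate(X, y, fit_predict, k=5):
    """K-fold protocol sketch: fit_predict(X_tr, y_tr, X_te) is a placeholder
    standing in for training any compared model on a fold and predicting on
    the held-out part."""
    mapes, rhos = [], []
    for tr, te in KFold(n_splits=k, shuffle=True, random_state=0).split(X):
        y_pred = fit_predict(X[tr], y[tr], X[te])
        mapes.append(mape(y[te], y_pred))
        rho, _ = spearmanr(y[te], y_pred)   # Eq. (12) on the held-out fold
        rhos.append(rho)
    return float(np.mean(mapes)), float(np.mean(rhos))
```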

5.4 Results

The tests were run across the baselines and our RFT model. For clarity and simplicity of explanation, only every 10th data point from a chosen output sample set is plotted on each graph and displayed for discussion purposes. The line of best fit was used to draw the curve through the points. Additionally, because of space constraints, only the table of Kendall correlations for the chosen datasets is displayed. The results are shown in Table 3 and Figures 2a - c.

Fig. 2: Graphs of relational turbulence across the three datasets. (a) Enron relational turbulence profile. (b) Google relational turbulence profile. (c) Twitter relational turbulence profile.
Table 3: Spearman, Kendall and K-fold results. (a) Kendall coefficient. (b) K-fold MAPE. (c) Spearman coefficient for RFT. (d) Spearman coefficient for DCN. (e) Spearman coefficient for IMPALA. (f) Spearman coefficient for EnsemDT. (g) Spearman coefficient for SLP. (h) Spearman coefficient for MVVA.
5.5 Investigation
As can be seen from the graphs, SLP models consistently underperform where prediction accuracy is concerned: the Kendall (tau-b coefficient) test shows a lower (positive) correlation between expected and predicted outputs across the test data set for SLP models, and a much higher (positive) association for the other baselines and RFT. Furthermore, from the results of the Spearman (rho coefficient) test on the independent and dependent variables, it can be seen from Tables 3c - h that the Spearman coefficient indicates strongly positive monotonic correlations between the turbulence measures (γrl, ϑrl and ϕrl) and the sentiment scores [(Ξi, ℶi), (Λi, ℷi), (Πi, ℸi)], and moderately positive correlations between the same turbulence measures and both category confidence and entity salience (Ci, Ji).

Additionally, from Table 3a, it is observed that across all models, the strength of associations between predicted and expected outputs tends to be weaker in specifically directed communications. This is observed in the Enron email dataset as opposed to the Twitter and Google results. We attribute this to high relational interference scores, which tend to correlate fairly well with entity salience scores. In this scenario, entity salience plays an important role in determining relational turbulence, as opposed to the contexts over which the sentiments were expressed. This means that an actor with a higher social status of influence may more readily interfere with other relationships in directed communications. Generally, however, it can be observed from Table 3b that as the number of subsample windows over the dataset increases, the MAPE of DCN, EnsemDT and RFT decreases, whereas the MAPE of SLP, IMPALA and MVVA tends to fluctuate about a fixed error. This behavior is attributed to overfitting and to gradient saddle points arising from poor initializations. RFT remains the clear winner across the measured baselines in all k-fold validation experiments.

6 Conclusion
In conclusion, it has been shown that RFT is capable of predicting relational turbulence profiles between actors within a given OSN acquired from anytime data. The results show superior accuracy and performance of the FNN model in comparison to well-known baseline models. The feasibility of the learning model has been demonstrated through its implementation on three large-scale networks: Twitter, Google Plus and Enron emails. The study uncovers three pivotal long-term objectives from a relational perspective. Firstly, relational features can be used to strengthen medical, cyber security and social applications, where the constant challenges of detection, recommendation, prediction, data utility and privacy are being continually addressed. Secondly, in fintech applications, relational predicates (e.g. turbulence) are determinants of market movements - closely modeled after a system of constant shocks. Thirdly, in artificial intelligence applications like computer cognition and robotics, learning relational features between social actors enables machines to recognize and evolve.
References

1. Amari, S.i., Nagaoka, H.: Methods of information geometry, vol. 191. American
Mathematical Soc. (2007)
2. Backstrom, L., Leskovec, J.: Link prediction in social networks using computation-
ally efficient topological features. In: Privacy, Security, Risk and Trust (PASSAT)
and 2011 IEEE Third Inernational Conference on Social Computing (SocialCom),
2011 IEEE Third International Conference on. pp. 73–80. IEEE (2011)
3. Church, K.W., Hanks, P.: Word association norms, mutual information, and lexi-
cography. Computational linguistics 16(1), 22–29 (1990)
4. De Myttenaere, A., Golden, B., Le Grand, B., Rossi, F.: Mean absolute percentage
error for regression models. Neurocomputing 192, 38–48 (2016)
5. Deng, L., Yu, D., et al.: Deep learning: methods and applications. Foundations and Trends in Signal Processing 7(3–4), 197–387 (2014)
6. Deng, S., Rangwala, H., Ning, Y.: Learning dynamic context graphs for predicting
social events. In: Proceedings of the 25th ACM SIGKDD International Conference
on Knowledge Discovery & Data Mining. pp. 1007–1016 (2019)
7. Espeholt, L., Soyer, H., Munos, R., Simonyan, K., Mnih, V., Ward, T., Doron, Y.,
Firoiu, V., Harley, T., Dunning, I., et al.: Impala: Scalable distributed deep-rl with
importance weighted actor-learner architectures. arXiv preprint arXiv:1802.01561
(2018)
8. Ezzat, A., Wu, M., Li, X., Kwoh, C.K.: Computational prediction of drug-target
interactions via ensemble learning. In: Computational Methods for Drug Repur-
posing, pp. 239–254. Springer (2019)
9. Feng, K., Cong, G., Jensen, C.S., Guo, T.: Finding attribute-aware similar regions
for data analysis. Proceedings of the VLDB Endowment 12(11), 1414–1426 (2019)
10. Gan, Y., Duan, Q., Gong, W., Tong, C., Sun, Y., Chu, W., Ye, A., Miao, C., Di, Z.: A comprehensive evaluation of various sensitivity analysis methods: A case study with a hydrological model. Environmental Modelling & Software 51, 269–285 (2014)
11. Gao, F., Musial, K., Cooper, C., Tsoka, S.: Link prediction methods and their
accuracy for different social networks and network metrics. Scientific Programming
2015, 1 (2015)
12. Han, Z., Liu, Z., Han, J., Vong, C.M., Bu, S., Chen, C.L.P.: Mesh convolutional
restricted boltzmann machines for unsupervised learning of features with structure
preservation on 3-d meshes. IEEE transactions on neural networks and learning
systems 28(10), 2268–2281 (2017)
13. Solomon, D.H., Theiss, J.: A longitudinal test of the relational turbulence model of romantic relationship development. Personal Relationships 15, 339–357 (2008). https://doi.org/10.1111/j.1475-6811.2008.00202.x
14. Huang, G., Sun, Y., Liu, Z., Sedra, D., Weinberger, K.Q.: Deep networks with
stochastic depth. In: European Conference on Computer Vision. pp. 646–661.
Springer (2016)
15. Huang, G.B., Chen, Y.Q., Babri, H.A.: Classification ability of single hidden layer
feedforward neural networks. IEEE Transactions on Neural Networks 11(3), 799–
801 (2000)
16. Huang, X., Song, Q., Li, Y., Hu, X.: Graph recurrent networks with attributed
random walks. In: Proceedings of the 25th ACM SIGKDD International Conference
on Knowledge Discovery & Data Mining. pp. 732–740 (2019)
17. Hutchinson, B., Deng, L., Yu, D.: Tensor deep stacking networks. IEEE Transac-
tions on Pattern Analysis and Machine Intelligence 35(8), 1944–1957 (2013)
18. Keshmiri, S., Sumioka, H., Nakanishi, J., Ishiguro, H.: Emotional state estimation
using a modified gradient-based neural architecture with weighted estimates. In:
2017 International Joint Conference on Neural Networks (IJCNN). pp. 4371–4378.
IEEE (2017)
19. Knobloch, L.K., Theiss, J.A.: Relational turbulence theory applied to the transition
from deployment to reintegration. Journal of Family Theory & Review 10(3), 535–
549 (2018)
20. Lan, T., Sigal, L., Mori, G.: Social roles in hierarchical models for human activity
recognition. In: 2012 IEEE Conference on Computer Vision and Pattern Recogni-
tion. pp. 1354–1361. IEEE (2012)
21. Larsson, G., Maire, M., Shakhnarovich, G.: Fractalnet: Ultra-deep neural networks
without residuals. arXiv preprint arXiv:1605.07648 (2016)
22. Li, Y., Zhang, Z.L., Bao, J.: Mutual or unrequited love: Identifying stable clusters
in social networks with uni-and bi-directional links. In: International Workshop on
Algorithms and Models for the Web-Graph. pp. 113–125. Springer (2012)
23. Miikkulainen, R., Liang, J., Meyerson, E., Rawal, A., Fink, D., Francon, O., Raju,
B., Navruzyan, A., Duffy, N., Hodjat, B.: Evolving deep neural networks. arXiv
preprint arXiv:1703.00548 (2017)
24. Nielsen, F., Barbaresco, F.: Geometric science of information. Springer Interna-
tional Publishing (2015)
25. Peitgen, H.O., Jürgens, H., Saupe, D.: Chaos and fractals: new frontiers of science.
Springer Science & Business Media (2006)
26. Ramanathan, V., Yao, B., Fei-Fei, L.: Social role discovery in human events. In:
Proceedings of the IEEE Conference on Computer Vision and Pattern Recognition.
pp. 2475–2482 (2013)
27. Simeonova, L.: Gradient emotional analysis (2017)
28. Snijders, T.A.: Markov chain monte carlo estimation of exponential random graph
models. Journal of Social Structure 3(2), 1–40 (2002)
29. Solomon, D.H., Knobloch, L.K., Theiss, J.A., McLaren, R.M.: Relational turbu-
lence theory: Variation in subjective experiences and communication within ro-
mantic relationships. Human Communication Research 42(4), 507–532 (2016)
30. Theiss, J.A., Solomon, D.H.: A relational turbulence model of communi-
cation about irritations in romantic relationships. Communication Re-
search 33(5), 391–418 (2006). https://doi.org/10.1177/0093650206291482,
https://doi.org/10.1177/0093650206291482
31. Wang, P., Xu, B., Wu, Y., Zhou, X.: Link prediction in social networks: the state-
of-the-art. Science China Information Sciences 58(1), 1–38 (2015)
32. Watters, N.: Information Geometric Approaches for Neural Network Algorithms.
Ph.D. thesis (2016)
33. Wong, T.T.: Performance evaluation of classification algorithms by k-fold and
leave-one-out cross validation. Pattern Recognition 48(9), 2839–2846 (2015)
34. Zhang, J., Tan, L., Tao, X., Pham, T., Zhu, X., Li, H., Chang, L.: Detecting relational states in online social networks. In: 2018 5th International Conference on Behavioral, Economic and Socio-Cultural Computing (BESC). pp. 38–43. IEEE (2018)
35. Zhang, J., Tao, X., Tan, L., Lin, J.C.W., Li, H., Chang, L.: On link stability
detection for online social networks. In: International Conference on Database and
Expert Systems Applications. pp. 320–335. Springer (2018)
