
Chapter 1


This chapter introduces fog computing and outlines the advantages and disadvantages of
cloud computing; fog computing was introduced because of the disadvantages of cloud
computing.

IoT environments generate unprecedented amounts of data that can be useful in many ways,
particularly if analyzed for insights. However, the data volume can overwhelm today's
storage systems and analytics applications. The Internet of Things (IoT) will be the Internet
of the future, as we have seen a huge increase in wearable technology, smart grids, smart
homes/cities and smart connected vehicles. Fog computing usually cooperates with cloud
computing.

As a result, end users, fog and cloud together form a three-layer service delivery model. Fog
computing also shows a strong connection to cloud computing in terms of characterization.
For example, elastic resources (computation, storage and networking) are the building blocks
of both of them, indicating that most cloud computing technologies can be directly applied to
fog computing. However, fog computing has several unique properties that distinguish it
from other existing computing architectures. The most important is its close proximity to end
users. It is vital to keep computing resources at the edge of the network to support latency-
sensitive applications and services. Another interesting property is location-awareness.
In a Fog Computing environment, a considerable amount of processing may occur in a data
hub on a smart mobile device or on the edge of the network in a smart router or other
gateway device. This distributed approach is rising in popularity due to the Internet of Things
(IoT) and the immense amount of data that sensors generate.
1.1. Cloud Computing

Cloud computing is a type of Internet-based computing that provides shared computer
processing resources and data to computers and other devices on demand. It is a model
for enabling ubiquitous, on-demand access to a shared pool of configurable computing
resources (e.g., computer networks, servers, storage, applications and services) which
can be rapidly provisioned and released with minimal management effort. Cloud
computing and storage solutions provide users and enterprises with various capabilities
to store and process their data in either privately owned or third-party data centers that
may be located far from the user, ranging in distance from across a city to across the
world. Cloud computing relies on sharing of resources to achieve coherence
and economy of scale, similar to a utility (like the electricity grid) over an electricity
network.

Seminar Report on fog Computing

Fig 1.1: Cloud computing.

1.1.1. Advantages of Cloud Computing

Cost Savings

The most significant cloud computing benefit is in terms of IT cost savings. Businesses, no
matter what their type or size, exist to earn money while keeping capital and operational
expenses to a minimum. With cloud computing, you can save substantial capital costs with
zero in-house server storage and application requirements. The lack of on-premises
infrastructure also removes their associated operational costs in the form of power, air
conditioning and administration costs. You pay for what is used and disengage whenever you
like; there is no invested IT capital to worry about. It's a common misconception that only
large businesses can afford to use the cloud when, in fact, cloud services are extremely
affordable for smaller businesses.


Reliability

With a managed service platform, cloud computing is much more reliable and consistent than
in-house IT infrastructure. Most providers offer a Service Level Agreement which guarantees
24/7/365 service and 99.99% availability. Your organization can benefit from a massive pool of


redundant IT resources, as well as a quick failover mechanism: if a server fails, hosted
applications and services can easily be transitioned to any of the available servers.


Manageability

Cloud computing provides enhanced and simplified IT management and maintenance
capabilities through central administration of resources, vendor-managed infrastructure and
SLA-backed agreements. IT infrastructure updates and maintenance are eliminated, as all
resources are maintained by the service provider. You enjoy a simple web-based user
interface for accessing software, applications and services without the need for installation -
and an SLA ensures the timely and guaranteed delivery, management and maintenance of
your IT services.

Strategic Edge

Ever-increasing computing resources give you an edge over competitors, as the time you
require for IT procurement is virtually nil. Your company can deploy mission-critical
applications that deliver significant business benefits, without any upfront costs and with
minimal provisioning time. Cloud computing allows you to forget about technology and
focus on your key business activities and objectives. It can also help you reduce the time
to market for newer applications and services.

1.1.2. Disadvantages of Cloud Computing


Downtime

As cloud service providers take care of a number of clients each day, they can become
overwhelmed and may even come up against technical outages. This can lead to your
business processes being temporarily suspended. Additionally, if your internet connection is
offline, you will not be able to access any of your applications, server or data from the cloud.


Security

Although cloud service providers implement the best security standards and industry
certifications, storing data and important files on external service providers always opens up
risks. Using cloud-powered technologies means you need to provide your service provider
with access to important business data. Meanwhile, being a public service opens up cloud


service providers to security challenges on a routine basis. The ease in procuring and
accessing cloud services can also give nefarious users the ability to scan, identify and exploit
loopholes and vulnerabilities within a system. For instance, in a multi-tenant cloud
architecture where multiple users are hosted on the same server, a hacker might try to break
into the data of other users hosted and stored on the same server. However, such exploits and
loopholes are not likely to surface, and the likelihood of a compromise is not great.

Limited Control

Since the cloud infrastructure is entirely owned, managed and monitored by the service
provider, it transfers minimal control over to the customer. The customer can only control and
manage the applications, data and services operated on top of that, not the backend
infrastructure itself. Key administrative tasks such as server shell access, updating and
firmware management may not be passed to the customer or end user.

1.2. History of the OpenFog Consortium

On November 19, 2015, Cisco Systems, ARM Holdings, Dell, Intel, Microsoft, and Princeton
University founded the OpenFog Consortium to promote interest and development in fog
computing. Cisco Sr. Managing-Director Helder Antunes became the consortium's first
chairman and Intel's Chief IoT Strategist Jeff Fedders became its first president.

1.3. Fog Computing

The term Fog Computing was introduced by Cisco Systems as a new model to ease
wireless data transfer to distributed devices in the Internet of Things (IoT) network
paradigm. Cisco defines Fog Computing as a paradigm that extends Cloud computing and
services to the edge of the network. Similar to Cloud, Fog provides data, compute, storage,
and application services to end-users. The distinguishing Fog characteristics are its proximity
to end-users, its dense geographical distribution, and its support for mobility. Services are
hosted at the network edge or even end devices such as set-top-boxes or access points. By
doing so, Fog reduces service latency and improves QoS, resulting in a superior user
experience. Fog Computing supports emerging Internet of Everything (IoE) applications that
demand real-time/predictable latency (industrial automation, transportation, networks of
sensors and actuators). Thanks to its wide geographical distribution the Fog paradigm is well


positioned for real time big data and real time analytics. Fog supports densely distributed data
collection points, hence adding a fourth axis to the often mentioned Big Data dimensions [4].

Fig 1.2: Fog Computing.

Fog computing, also known as fog networking, is a kind of decentralized computing
infrastructure in which computing resources and application services are distributed in a
logical and efficient place at any point, along the continuum from the data source to the
cloud. Although this is mostly done for efficiency reasons, it can also be done for security and
compliance reasons [4].

Fog Computing enables a new breed of applications and services, and there is a fruitful
interplay between the Cloud and the Fog, particularly when it comes to data management and
analytics. The Fog vision was conceived to address applications and services that do not fit
well into the paradigm of the Cloud [6]. They include:

Applications that require very low and predictable latency: the Cloud frees the user from
many implementation details, including precise knowledge of where the computation or
storage takes place. This freedom from choice, welcome in many circumstances, becomes a
liability when latency is at a premium (gaming, video conferencing).

Geo-distributed applications (pipeline monitoring, sensor networks that monitor the
environment).

Fast mobile applications (smart connected vehicles, connected rail).


Large-scale distributed control systems (smart grid, connected rail, smart traffic light
systems).

Table 1.1: Cloud vs Fog Computing

Processing location: In cloud computing, data and applications are processed in the cloud,
which is a time-consuming task for large data. In fog computing, rather than working from a
centralized cloud, processing happens at the network edge, so it consumes less time.

Bandwidth: The cloud suffers from bandwidth problems, since every bit of data is sent over
cloud channels. Fog places less demand on bandwidth, as data is aggregated at certain
access points instead of being sent over the cloud.

Response time and scalability: The cloud shows slow response times and scalability
problems, as it depends on servers located at remote sites. By setting up small servers,
called edge servers, in the vicinity of users, the fog computing platform avoids
response-time and scalability issues.


Chapter 2


In the previous chapter we introduced fog computing; this chapter describes the role of fog
computing in the IoT (Internet of Things), its design goals, and the system design and
components of fog computing.

2.1. Role of Fog Computing in IoT

1. Connected Vehicles: The Connected Vehicle deployment displays a rich set of
connectivity and interactions: cars to cars, cars to access points (Wi-Fi, 3G, smart traffic
lights), and access points to access points [3].

2. Wireless Sensor and Actuator Networks: The original Wireless Sensor Networks (WSNs)
were designed to operate at particularly low power in order to extend battery life or even to
make energy harvesting achievable. Most of these WSNs involve a large number of low-
bandwidth, low-energy, very-low-processing-power, small-memory motes, operating as
sources for a sink (collector) in a unidirectional fashion [3].

3. IoT and Cyber-Physical Systems (CPSs): Fogging-based systems are becoming a
significant class of IoT and CPSs. The IoT is a network that can interrelate ordinary physical
objects with identified addresses. CPSs feature a tight combination of a system's
computational and physical elements. CPSs also organize the incorporation of computer- and
data-centric physical engineered systems [3].

4. Software-Defined Networks (SDN): The SDN concept, along with fogging, will address
the main problems in vehicular networks (irregular connectivity, collisions and high packet
loss) by supplementing vehicle-to-vehicle with vehicle-to-infrastructure communication and
unified control [3].

5. Decentralized Smart Building Control: The applications of this development are enabled
by wireless sensors positioned to monitor the building atmosphere. In this case information
can be exchanged among all sensors on a floor, and their analyses can be combined to form
reliable measurements [3].


2.2. Design Goals
There are several design goals for an adequate fog computing platform.
1. Latency.
It is fundamental for a fog computing platform to offer end users low-latency-guaranteed
applications and services. The latency comes from the execution time of a task, the
task offloading time, the time for cyber foraging, the speed of decision making, etc.
2. Efficiency.
While at first glance efficiency may have its own impact on latency, it is more
related to the efficient utilization of resources and energy [3]. The reasons are
obvious and quite different from their counterparts in cloud computing scenarios:
Not all fog nodes are resource-rich; some of them have limited computation
power, memory and storage.
Most fog nodes and clients are battery-powered, such as handheld devices,
wearables, and wireless sensor units.
3. Generality.
Due to the heterogeneity of fog nodes and clients, we need to provide the same abstraction
to top-layer applications and services for fog clients. General application programming
interfaces (APIs) should be provided to cope with existing protocols and APIs (e.g.
machine-to-machine protocols, smart vehicle/smart appliance APIs, etc.) [3].
2.3. Characteristics of Fog Computing
1. Heterogeneity:
Fog Computing is a highly virtualized platform that provides compute, storage, and
networking services between end devices and traditional Cloud Computing Data
Centers, typically, but not exclusively, located at the edge of the network. Compute,
storage, and networking resources are the building blocks of both the Cloud and the
Fog. The edge of the network, however, implies a number of characteristics that make
the Fog a non-trivial extension of the Cloud. Let us list them with pointers to
motivating examples [5].
2. Edge location, location awareness, and low latency:
The origins of the Fog can be traced to early proposals to support endpoints with rich
services at the edge of the network, including applications with low latency
requirements (e.g. gaming, video streaming, and augmented reality) [5].

3. Geographical distribution:


In sharp contrast to the more centralized Cloud, the services and applications targeted
by the Fog demand widely distributed deployments. The Fog, for instance, will play
an active role in delivering high quality streaming to moving vehicles, through
proxies and access points positioned along highways and tracks [5].
4. Large-scale sensor networks:
Sensor networks that monitor the environment, and the Smart Grid itself, are other
examples of inherently distributed systems, requiring distributed computing and storage
resources [5].
5. Very large number of nodes:
The Fog comprises a very large number of nodes, as a consequence of its wide geo-
distribution, as evidenced in sensor networks in general, and the Smart Grid in
particular [5].
6. Support for mobility:
It is essential for many Fog applications to communicate directly with mobile devices,
and therefore support mobility techniques, such as the LISP protocol, that decouple
host identity from location identity, and require a distributed directory system [5].
7. Real-time interactions:
Important Fog applications involve real-time interactions rather than batch processing [5].
8. Interoperability and federation:
Seamless support of certain services (streaming is a good example) requires the
cooperation of different providers. Hence, Fog components must be able to
interoperate, and services must be federated across domains [5].
2.4. Components in Fog Architecture
2.4.1. IoT (Internet of Things) Services Verticals:

Fog nodes are heterogeneous in nature and deployed in a variety of environments,
including core, edge, access networks and endpoints. The Fog architecture should
facilitate seamless resource management across the diverse set of platforms.
The Fog platform hosts a diverse set of applications belonging to various verticals, from
smart connected vehicles to smart cities, oil and gas, smart grid etc. The Fog architecture
should expose generic APIs that can be used by the diverse set of applications to leverage
the Fog platform.
The Fog platform should provide the necessary means for distributed, policy-based
orchestration, resulting in scalable management of individual subsystems and the
overall service [4].


2.4.2. Heterogeneous Physical Resources
Fog nodes are heterogeneous in nature. They range from high-end servers, edge routers,
access points and set-top boxes to end devices such as vehicles, sensors and mobile phones.
The different hardware platforms have varying levels of RAM, secondary storage, and
real estate to support new functionalities. The platforms run various kinds of OSes and
software applications, resulting in a wide variety of hardware and software capabilities.
The Fog network infrastructure is also heterogeneous, ranging from high-speed links
connecting enterprise data centers and the core to multiple wireless access technologies
(e.g. 3G/4G, LTE, Wi-Fi) towards the edge [4].

Fig 2.1: Components in Fog Architecture.

2.4.3. Fog Abstraction Layer

The Fog abstraction layer hides the platform heterogeneity and exposes a uniform and
programmable interface for seamless resource management and control.
The layer provides generic APIs for monitoring, provisioning and controlling physical
resources such as CPU, memory, network and energy. The layer also exposes generic APIs to
monitor and manage various hypervisors, OSes, service containers, and service instances on a
physical machine (discussed more later) [4].
The layer includes the necessary techniques to support virtualization, specifically
the ability to run multiple OSes or service containers on a physical machine to
improve resource utilization. Virtualization enables the abstraction layer to support multi-
tenancy. The layer exposes generic APIs to specify security, privacy and isolation
policies for OSes or containers belonging to different tenants on the same
physical machine. Specifically, the following multi-tenancy features are supported:
Data and resource isolation guarantees for the different tenants on the same physical
infrastructure [4].
The capability to inflict no collateral damage on the different parties, at a minimum.
The exposure of a single, consistent model across physical machines to provide these
isolation services [4].
The abstraction layer exposes both the physical and the logical (per-tenant) network
to administrators, as well as the resource usage per tenant [4].
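As a rough illustration of what such a uniform interface could look like, the sketch below models a fog node with generic monitoring and provisioning calls and per-tenant container tracking. The `FogNode` class and its method names are assumptions made for this example; the abstraction layer described above does not define a concrete public API.

```python
# Hypothetical sketch of a fog abstraction layer: one uniform interface over
# heterogeneous nodes (routers, set-top boxes, servers). Names are illustrative.

class FogNode:
    """Uniform view of a physical node, hiding its hardware/OS specifics."""

    def __init__(self, name: str, cpu_cores: int, mem_mb: int):
        self.name = name
        self.cpu_cores = cpu_cores
        self.mem_mb = mem_mb
        self.containers = {}  # tenant -> list of service containers (isolation)

    def monitor(self) -> dict:
        """Generic monitoring API: the same shape for every node type."""
        return {"name": self.name,
                "cpu_cores": self.cpu_cores,
                "mem_mb": self.mem_mb,
                "tenants": {t: len(c) for t, c in self.containers.items()}}

    def provision(self, tenant: str, service: str) -> None:
        """Start a service container for a tenant; each tenant's containers
        are tracked separately, mirroring the multi-tenancy isolation above."""
        self.containers.setdefault(tenant, []).append(service)

edge_router = FogNode("edge-router-1", cpu_cores=4, mem_mb=2048)
edge_router.provision("tenant-a", "video-cache")
edge_router.provision("tenant-b", "sensor-aggregator")
print(edge_router.monitor())
```

The point of the design is that an orchestrator can call `monitor()` and `provision()` identically on an edge router or a rack server, without knowing which one it is talking to.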
2.4.4. Fog Service Orchestration Layer
The service orchestration layer provides dynamic, policy-based life-cycle management of
Fog services. The orchestration functionality is as distributed as the underlying Fog
infrastructure and services. Managing services on a large volume of Fog nodes with a wide
range of capabilities is achieved with the following technology and components:
A software agent with a reasonably small footprint, yet capable of bearing the
orchestration functionality and performance requirements, that can be embedded in
various edge devices.
A distributed, persistent storage to store policies and resource meta-data (capability,
performance, etc.) that supports high-transaction-rate updates and retrieval.
A scalable messaging bus to carry control messages for service orchestration and
resource management.
A distributed policy engine with a single global view and local enforcement [4].

Software Agent
The distributed Fog orchestration framework consists of several Foglet software agents, one
running on every node in the Fog platform. The Foglet agent uses abstraction layer APIs to
monitor the health and state associated with the physical machine and services deployed on
the machine. This information is both locally analyzed and also pushed to the distributed
storage for global processing [4].
Foglet is also responsible for performing life-cycle management activities such as standing
up/down guest OSes and service containers, and provisioning and tearing down service
instances. Thus, a Foglet's interactions on a Fog node span a range of entities, from the


physical machine, hypervisor, guest OSes, service containers, and service instances. Each of
these entities implements the necessary functions for programmatic management and control;
Foglet invokes these functions via the abstraction layer APIs [4].

Distributed Database
A distributed database, while complex to implement, is ideal for increasing the Fog's
scalability and fault-tolerance. It provides faster (than centralized) storage and retrieval of
data, and is used to store both application data and the meta-data necessary to aid Fog
service orchestration. Sample meta-data include:
Fog nodes' hardware and software capabilities, to enable service instantiation on a
platform with matching capabilities.
Health and other state information of Fog nodes and running service instances, for
load balancing and generating performance reports.
Business policies that should be enforced throughout a service's life cycle, such as
those related to security, configuration etc.

North-Bound APIs for Applications
The Fog software framework exposes northbound APIs that applications use to effectively
leverage the Fog platform. These APIs are broadly classified into data and control APIs. Data
APIs allow an application to leverage the Fog distributed data store. Control APIs allow an
application to specify how the application should be deployed on the Fog platform [4].
A few example APIs:
Put_data(): To store/update application-specific data and meta-data on the Fog
distributed data store.
Get_data(): To retrieve application-specific data/meta-data from the Fog distributed
data store.
Request_service(): To request a service instance that matches some criteria.
Setup_service(): To set up a new service instance that matches some criteria.
Install_policy(): To install a specific set of policies for a provider or subscriber in the
orchestration framework.
Update_policy(): To configure/re-configure a policy with a specific set of parameters
(e.g. thresholds for a load-balancing policy).
Get_stats(): To generate reports of Fog node health and other status.

Policy-Based Service Orchestration


The orchestration framework provides policy-based service routing, i.e., it routes an
incoming service request to the appropriate service instance that conforms to the relevant
business policies [4].
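The north-bound data and control APIs listed earlier might be exercised as in the following sketch, where a plain in-memory dictionary stands in for the Fog distributed data store. The `FogClient` class is hypothetical and only meant to show the call pattern; just the API names (Put_data, Get_data, Install_policy, Get_stats) come from the text above.

```python
# Hypothetical sketch of the north-bound APIs from the list above, backed by
# an in-memory dict in place of the real Fog distributed data store.

class FogClient:
    def __init__(self):
        self._store = {}      # application data and meta-data
        self._policies = {}   # policy name -> parameters

    def put_data(self, key: str, value) -> None:
        """Store/update application-specific data on the distributed store."""
        self._store[key] = value

    def get_data(self, key: str):
        """Retrieve application-specific data/meta-data (None if absent)."""
        return self._store.get(key)

    def install_policy(self, name: str, params: dict) -> None:
        """Install a policy (e.g. load-balancing thresholds) in the
        orchestration framework."""
        self._policies[name] = params

    def get_stats(self) -> dict:
        """Report simple health/status figures for this client's view."""
        return {"keys": len(self._store), "policies": len(self._policies)}

client = FogClient()
client.put_data("sensor-42/temperature", 21.5)
client.install_policy("load-balance", {"threshold": 0.8})
print(client.get_data("sensor-42/temperature"), client.get_stats())
```

In a real deployment the data calls would hit the distributed database described above, and the policy calls would feed the distributed policy engine; the separation of data APIs from control APIs is the part carried over from the text.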

Fig 2.2: Policy-based orchestration framework.

2.5. Fog Computing and Data Management
The IoT is going to be a big driver for distributed (Fog) computing. It is simply
unproductive to transmit all the data a bundle of sensors generates to the Cloud for
processing and analysis; doing so needs a great deal of bandwidth, and all the back-
and-forth communication between the sensors and the cloud can adversely impact
performance [4].
The IoT will create enormous amounts of data, driving a need for distributed intelligence,
distributed data management and so-called 'fast' Big Data processing. Companies like
ParStream (acquired by Cisco) recognize this and are building solutions to support ESP and
fast processing [4].


Fig 2.3: Data Management in Fog Computing.
The above figure illustrates the notion of some data being pre-processed and potentially used
in real-time whereas other data is stored or even archived for much later use in a more
centralized cloud infrastructure or platform environment [4].
Every IoT deployment is unique. However, there are four basic stages that are common to
just about every IoT application: data collection, data transmission, data assessment, and
response to the available information. Successful data management is therefore very
important to the success of the IoT [4].
Data management for the IoT can be viewed as a two-part system: an online/real-time
front-end (e.g. distributed nodes) and an off-line back-end (centralized Cloud storage). The
online/real-time portion of the system is concerned with data management associated with
distributed objects/assets/devices and their associated sensors. As we discuss later in this
report, there are issues pertaining to the need for fast data and distributed intelligence to
deal with this data. The front-end also passes data (in the form of proactive pushes and
responses to queries) from the objects/devices/sensors to the back-end. The frequent
communication between front-end and back-end is why it is termed 'online'. The back-end
is storage-intensive; it stores select data produced from disparate sources and also supports
in-depth queries and analysis over the long term, as well as data archival needs [4].


There will also be a need for advanced Data Virtualization techniques for IoT Data. Data
virtualization is any approach to data management that allows an application to retrieve and
manipulate data without requiring technical details about the data, such as how it is formatted
or where it is physically located. An example of a leading company in this area is Cisco,
whose Data Virtualization offering represents an agile data integration software solution that
makes it easy to abstract and view data, regardless of where it resides. With their integrated
data platform, a business can query various types of data across the network as if it were in a
single place [4].
There are also data infrastructure issues to consider with IoT Data. Three important
DB/infrastructure issues to consider for IoT Data Management are:
Hybrid Database Support: an IoT database with the flexibility to handle semi-structured,
unstructured, geo-spatial and traditional relational data. The varied types of data can
co-exist within one single database [4].
Embedded Deployment Database: IoT databases often need to be embeddable, for
processing and compressing data and transmitting it over and between networks. Good
features to have are little or no configuration at run-time, self-tuning and automatic
recovery from failure [4].
Cloud Migration: IoT networks can store and process data in scalable, flexible Cloud
infrastructure. The platform can be accessed using web-based interfaces and API calls [4].




Chapter 3

In the previous chapter we described the system design and components of fog computing;
this chapter describes the overall working of fog computing for data processing, data
storage, data transmission and compute.

3.1. Distributed data processing in a fog-computing environment:

Distributed data processing in a fog-computing environment works as follows: based on the
desired functionality of a system, users can deploy Internet of Things sensors in different
environments, including roads, medical centers, and farms. Once the system collects
information from the sensors, fog devices (including nearby gateways and private clouds)
dynamically conduct data analytics [1].

Fig 3.1 : Distributed data processing in a fog-computing environment.

Fog computing is a distributed paradigm that provides cloud-like services to the network
edge. It leverages cloud and edge resources along with its own infrastructure, as Figure 3.1
shows. In essence, the technology deals with IoT data locally by utilizing clients or edge
devices near users to carry out a substantial amount of storage, communication, control,
configuration, and management. The approach benefits from edge devices' close proximity
to sensors, while leveraging the on-demand scalability of cloud resources [1].
Fog computing involves the components of data-processing or analytics applications running
in distributed cloud and edge devices. It also facilitates the management and programming of
computing, networking, and storage services between data centers and end devices [1].

It supports user mobility, resource and interface heterogeneity, and distributed data analytics
to address the requirements of widely distributed applications that need low latency [1].

3.2. Fog Computing Working

Developers either port or write IoT applications for fog nodes at the network edge. The fog
nodes closest to the network edge ingest the data from IoT devices [2]. Then (and this is
crucial) the fog IoT application directs different types of data to the optimal place for analysis:
The most time-sensitive data is analyzed on the fog node closest to the things
generating the data. In a Cisco Smart Grid distribution network, for example, the most
time-sensitive requirement is to verify that protection and control loops are operating
properly. Therefore, the fog nodes closest to the grid sensors can look for signs of
problems and then prevent them by sending control commands to actuators [2].
Data that can wait seconds or minutes for action is passed along to an aggregation
node for analysis and action. In the Smart Grid example, each substation might have
its own aggregation node that reports the operational status of each downstream
feeder and lateral [2].
Data that is less time sensitive is sent to the cloud for historical analysis, big data
analytics, and long-term storage. For example, each of thousands or hundreds of
thousands of fog nodes might send periodic summaries of grid data to the cloud for
historical analysis and storage.
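The three-tier routing rule above can be sketched as a simple classifier that maps each reading's latency budget to a processing tier. The thresholds (one second, five minutes) and the tier names are illustrative assumptions, not values prescribed by Cisco.

```python
# Hypothetical sketch of the routing rule described above: the fog IoT
# application directs each reading to the nearest tier that can tolerate
# its deadline. Thresholds and tier names are illustrative assumptions.

def route(reading: dict) -> str:
    """Map a reading's latency budget (seconds) to a processing tier."""
    budget = reading["deadline_s"]
    if budget < 1:        # most time-sensitive: act at the nearest fog node
        return "edge-fog-node"
    elif budget < 300:    # seconds-to-minutes: substation aggregation node
        return "aggregation-node"
    else:                 # historical analysis and long-term storage
        return "cloud"

readings = [
    {"sensor": "protection-loop", "deadline_s": 0.05},   # control loop check
    {"sensor": "feeder-status",   "deadline_s": 30},     # operational report
    {"sensor": "daily-summary",   "deadline_s": 86400},  # historical record
]
print([route(r) for r in readings])
```

The sensor names mirror the Smart Grid example in the text: protection-loop checks stay at the edge, feeder status goes to the substation's aggregation node, and periodic summaries travel to the cloud.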
In fog computing, much of the processing takes place in a data hub on a smart
mobile device or on the edge of the network in a smart router or other gateway
device. This technique is especially advantageous for Internet of Things as the
amount of data generated by the sensors is immense. The amount of data is so
huge that it is simply inefficient to transmit all the data a bunch of sensors
produce to the cloud for processing and analysis. A great deal of bandwidth is
needed, and the back-and-forth communication between the sensors and the cloud
can also negatively impact performance. The latency issue can be simply
annoying in some cases, such as gaming, but delays in data transmission might become
life-threatening in the case of a vehicle-to-vehicle communication system or a large-scale
distributed control system for rail travel [2].
Fog computing was introduced to meet three primary goals:
To improve efficiency and trim the amount of data that needs to be transmitted for
processing, analysis and storage.
To place the data close to the end user.
To provide security and compliance for data transmission over the cloud.

Fog Networking consists of a control plane and a data plane, where most of the processing
takes place in the data plane of a smart mobile or on the edge of the network in a gateway
device [2].

While edge devices and sensors are where data is generated and collected, they don't have
the compute and storage resources to perform advanced analytics and machine learning
tasks [2].

Though cloud servers have the power to do these, they are often too far away to process the
data and respond in a timely manner [2].

In addition, having all endpoints connecting to and sending raw data to the cloud over the
internet can have privacy, security and legal implications, especially when dealing with
sensitive data subject to regulations in different countries.

In a fog environment, the processing takes place in a data hub on a smart device, or in a smart
router or gateway, thus reducing the amount of data sent to the cloud. It is important to note
that fog networking complements, not replaces, cloud computing: fogging allows for short-term
analytics at the edge, while the cloud performs resource-intensive, longer-term analytics.

Fog computing can be perceived both in large cloud systems and big data structures, making
reference to the growing difficulties in accessing information objectively. This results in a
lack of quality of the obtained content. The effects of fog computing on cloud computing and
big data systems may vary; yet, a common aspect that can be extracted is a limitation in
accurate content distribution, an issue that has been tackled with the creation of metrics that
attempt to improve accuracy [2].

Fig 3.2 : UML diagram of fog computing.

Fog networking consists of a control plane and a data plane. For example, on the data plane,
fog computing enables computing services to reside at the edge of the network as opposed to
servers in a data-center. Compared to cloud computing, fog computing emphasizes proximity
to end-users and client objectives, dense geographical distribution and local resource pooling,
latency reduction for quality of service and edge analytics/stream mining, resulting in
superior user-experience and redundancy in case of failure [2].


Fig 3.3 : Working of Fog Computing.

Fog nodes receive feeds from IoT devices, using any protocol, in real time; run IoT-enabled
applications for real-time control and analytics, with millisecond response times; provide
transient storage, often for 1-2 hours; and send periodic data summaries to the cloud. The
cloud platform then receives and aggregates data summaries from many fog nodes, performs
analysis on the IoT data and data from other sources to gain business insight, and can send
new application rules to the fog nodes based on these insights.
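The fog-node workflow just described (ingest readings, react in real time, keep transient local storage, send periodic summaries to the cloud) can be sketched as follows. The class, thresholds, and retention window are hypothetical illustrations, not the API of any specific fog platform.

```python
import time
from collections import deque

class FogNode:
    """Minimal sketch of the fog-node workflow: ingest readings, react
    locally in (near) real time, keep transient storage, and produce
    compact summaries for the cloud instead of the raw stream."""

    def __init__(self, retention_seconds=2 * 3600, alert_threshold=100.0):
        self.retention_seconds = retention_seconds   # transient storage window (~1-2 h)
        self.alert_threshold = alert_threshold       # hypothetical control threshold
        self.buffer = deque()                        # (timestamp, value) pairs

    def ingest(self, value, now=None):
        now = time.time() if now is None else now
        self.buffer.append((now, value))
        # Drop readings older than the retention window (transient storage).
        while self.buffer and now - self.buffer[0][0] > self.retention_seconds:
            self.buffer.popleft()
        # Real-time control decision made locally, with no cloud round trip.
        return "ALERT" if value > self.alert_threshold else "OK"

    def summarize(self):
        """Compact summary sent periodically to the cloud."""
        values = [v for _, v in self.buffer]
        if not values:
            return None
        return {"count": len(values),
                "min": min(values),
                "max": max(values),
                "mean": sum(values) / len(values)}

node = FogNode()
statuses = [node.ingest(v, now=t) for t, v in enumerate([10.0, 50.0, 120.0])]
summary = node.summarize()
```

Only `summary` would travel upstream; the raw readings stay (briefly) at the edge, which is exactly the bandwidth saving the text describes.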

3.3. How Will Fog Computing Help to Control Traffic?


Fig 3.4: Block diagram of Implementation of Traffic light system
3.3.1. Traffic Control:
- These systems will communicate with each other, say, every 15 minutes.
- The DM (decision maker) or local server will communicate with the other local
servers every 10 minutes.
- If traffic is detected in an area, the system attached to that area will
communicate with the other systems with the help of the communicator; this is
how the other systems get information about the heavy traffic in that area.
- The sensors will detect the number of vehicles at the zebra crossing.
- If the number of vehicles is high, the system will not allow pedestrians to
cross the zebra crossing until there is a red signal.
- If the number of vehicles is low, it will give the vehicles a red signal and
then allow the pedestrians to cross the road.
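The crossing rule above can be expressed as a simple decision function evaluated locally at the fog node. The vehicle threshold and signal names below are hypothetical, chosen only to illustrate the rule.

```python
def crossing_decision(vehicle_count, vehicle_threshold=5):
    """Local fog decision for the zebra crossing described above.

    If many vehicles are queued, pedestrians must wait for the regular red
    phase; if few vehicles are present, vehicles get a red signal and
    pedestrians may cross immediately. The threshold is hypothetical.
    """
    if vehicle_count > vehicle_threshold:
        return {"vehicle_signal": "green", "pedestrians_may_cross": False}
    return {"vehicle_signal": "red", "pedestrians_may_cross": True}

busy = crossing_decision(12)   # heavy traffic: pedestrians wait
quiet = crossing_decision(2)   # light traffic: pedestrians may cross
```

Because the function runs on the local system rather than in a distant cloud, the signal can change within the real-time bounds the next subsection calls for.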
3.3.2. Role of Fog Computing in this Example
- If the decision makers were in the cloud, far away from the system location,
taking a decision would consume a great deal of time and introduce delay.
- A smart traffic light needs to act in real time.
- Therefore, the fog computing concept resolves this issue.
- As mentioned earlier, the benefits of fog computing help this smart traffic
light system work efficiently in real time [2].
Table 3.1: Attributes of the smart traffic light system

Geo-distribution: Wide (across the region) and dense.
Low/predictable latency: Tight, within the scope of the interaction.
Fog-cloud interplay: Data at different time scales (sensors/vehicles at an
intersection; traffic information at diverse collection points).
Multi-agency orchestration: Agencies that run the system must coordinate
control-law policies in real time.
Consistency: Getting the traffic landscape demands a degree of consistency
between collections of policies.

3.4. Modelling and Simulation

To enable real-time analytics in fog computing, we must investigate various resource-
management and scheduling techniques, including the placement, migration, and
consolidation of stream-processing operators, application modules, and tasks. These
choices significantly impact processing latency and decision-making times [6].
However, constructing a real IoT environment as a test bed for evaluating such techniques is
costly and doesn't provide a controllable environment for conducting repeatable experiments.
To overcome this limitation, an open source simulator called iFogSim was developed. iFogSim
enables the modelling and simulation of fog computing environments for the evaluation
of resource-management and scheduling policies across edge and cloud resources under
multiple scenarios, based on their impact on latency, energy consumption, network
congestion, and operational costs. It measures performance metrics and simulates edge
devices, cloud data centres, and sensors [6].
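iFogSim itself is a Java toolkit; as a rough illustration of the kind of question such a simulator answers, the back-of-the-envelope model below compares end-to-end latency for processing at a nearby fog node versus a distant cloud data centre. All link speeds, hop counts, and data sizes are made-up example numbers, not measurements.

```python
def end_to_end_latency_ms(data_mb, hops):
    """Toy latency model: per-hop propagation delay plus transmission
    time over a 100 Mbit/s link at every hop. Illustrative only."""
    per_hop_propagation_ms = 5.0
    link_mbit_per_s = 100.0
    transmission_ms = (data_mb * 8.0 / link_mbit_per_s) * 1000.0
    return hops * (per_hop_propagation_ms + transmission_ms)

fog_latency = end_to_end_latency_ms(data_mb=1.0, hops=1)    # one hop to the edge
cloud_latency = end_to_end_latency_ms(data_mb=1.0, hops=8)  # many hops to a data centre
```

Even this crude model shows why placement matters: latency grows with every hop the data must traverse, which is precisely the effect a simulator like iFogSim quantifies under realistic workloads.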


Chapter 4


In the previous chapters we have seen the fog architecture and the working of fog computing.
In this chapter, the advantages and applications are discussed, along with the difference
between fog computing and cloud computing.

4.1. Quality of Service (QoS)


QoS is an important metric for fog services and can be divided into four aspects:
1) connectivity, 2) reliability, 3) capacity, and 4) delay.
In a heterogeneous fog network, relaying, partitioning, and clustering provide
new opportunities for reducing cost, trimming data, and expanding connectivity.
For example, an ad hoc wireless sensor network can be partitioned into several
clusters according to the coverage of resource-rich fog nodes (a cloudlet, a
sink node, a powerful smartphone, etc.). Similarly, an end user's selection of
a fog node heavily impacts performance. We can dynamically select a subset of
fog nodes as relay nodes to maximize the availability of fog services for a
certain area or a single user, under constraints such as delay, throughput,
connectivity, and energy consumption.
Normally, reliability can be improved through periodic checkpointing to resume
after failure, rescheduling of failed tasks, or replication to exploit parallel
execution. But checkpointing and rescheduling may not suit the highly dynamic
fog computing environment, since they introduce latency and cannot adapt to
changes. Replication seems more promising, but it relies on multiple fog nodes
working together [7].
Capacity has two aspects: 1) network bandwidth and 2) storage capacity. To
achieve high bandwidth and efficient storage utilization, it is important to
investigate how data are placed in the fog network, since data locality is very
important for computation. Similar work exists in the context of the cloud and
sensor networks; however, this problem faces new challenges in fog computing.
For example, a fog node may need to compute on data distributed across several
nearby nodes. Data placement in a federation of fog and cloud also needs
careful thought; the challenge is how to design the interplay between fog and
cloud to accommodate different workloads. Due to dynamic data placement and the
large overall capacity in fog computing, we may also need to redesign search
engines so they can process queries over content scattered across fog
nodes [7].
Delay: latency-sensitive applications, such as stream mining or complex event
processing, are typical applications that need fog computing to provide
real-time stream processing rather than batch processing. One proposal is a
fog-based opportunistic spatiotemporal event processing system that meets this
latency requirement [7].
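The relay-node selection mentioned above can be sketched as a simple greedy heuristic: repeatedly pick the candidate fog node that most improves combined availability, subject to a delay constraint. The node data, availability model, and budget below are invented for illustration.

```python
def select_relays(candidates, max_delay_ms, budget):
    """Greedy sketch: choose up to `budget` fog nodes whose delay meets the
    constraint, maximizing combined availability. The availability of a set
    is 1 - prod(1 - a_i): the service is up if any chosen relay is up."""
    def combined(nodes):
        p_fail = 1.0
        for node in nodes:
            p_fail *= 1.0 - node["availability"]
        return 1.0 - p_fail

    eligible = [c for c in candidates if c["delay_ms"] <= max_delay_ms]
    chosen = []
    while eligible and len(chosen) < budget:
        best = max(eligible, key=lambda c: combined(chosen + [c]))
        chosen.append(best)
        eligible.remove(best)
    return chosen

nodes = [
    {"name": "cloudlet", "availability": 0.95, "delay_ms": 10},
    {"name": "sink",     "availability": 0.80, "delay_ms": 30},
    {"name": "phone",    "availability": 0.60, "delay_ms": 5},
    {"name": "far-node", "availability": 0.99, "delay_ms": 200},  # violates delay bound
]
relays = select_relays(nodes, max_delay_ms=50, budget=2)
```

Note how the highly available but distant node is excluded by the delay constraint: in fog computing, proximity can outrank raw reliability.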


4.2. Difference between Cloud Computing and Fog Computing
From Table 4.1, it can be seen that the characteristics of cloud computing impose severe
limitations on the quality of service demanded by real-time applications that require almost
immediate action by the server.

Table 4.1: Difference between Cloud Computing and Fog Computing

Requirement                        | Cloud Computing      | Fog Computing
Latency                            | High                 | Low
Location of service                | Within the Internet  | At the edge of the local network
Distance between client and server | Multiple hops        | One hop
Security                           | Undefined            | Can be defined
Attack on data en route            | High probability     | Very low probability
Location awareness                 | No                   | Yes
Geo-distribution                   | Centralized          | Distributed
Number of server nodes             | Few                  | Very large
Support for mobility               | Limited              | Supported
Real-time interactions             | Supported            | Supported

4.3. Fog Computing Advantages

1. A significant reduction in data movement across the network, resulting in reduced
congestion, cost, and latency; elimination of the bottlenecks that result from centralized
computing systems; improved security of encrypted data, as it stays closer to the end user,
reducing exposure to hostile elements; and improved scalability arising from virtualized
systems [3].
2. Eliminates the core computing environment, thereby removing a major bottleneck and a
single point of failure [3].
3. Improves security, as data are encoded as they move towards the network edge.
4. In addition to providing sub-second responses to end users, edge computing also
provides high levels of scalability, reliability, and fault tolerance [3].
5. Consumes less bandwidth [3].

4.4. Fog Computing Applications

Various applications could benefit from fog computing.

Healthcare and activity tracking

Fog computing could be useful in healthcare, where real-time processing and event
response are critical. One proposed system uses fog computing to detect, predict,
and prevent falls by stroke patients: the fall-detection learning algorithms are
dynamically deployed across edge devices and cloud resources. Experiments showed
that this system had a lower response time and consumed less energy than
cloud-only approaches.
Another proposed fog-based smart-healthcare system enables low latency, mobility
support, and location and privacy awareness [2].
Smart Grids
The smart grid is another application where fog computing is being used. Based on
energy demand, availability, and cost, smart devices can switch to alternative
energy sources such as solar and wind. Edge devices process the data collected by
fog collectors and generate control commands for the actuators. The filtered data
are consumed locally, and the balance is sent to the higher tiers for
visualization, real-time reports, and transactional analytics. Fog supports
semi-permanent storage at the highest tier and momentary storage at the lowest
tier [2].
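The demand-based switching described above can be sketched as a local rule evaluated on the fog device. The source names, capacities, and prices below are invented for illustration.

```python
def pick_energy_source(sources, demand_kw):
    """Pick the cheapest source that can currently cover the demand.
    Falls back to the grid if no alternative source has enough capacity."""
    viable = [s for s in sources if s["available_kw"] >= demand_kw]
    if not viable:
        return "grid"
    return min(viable, key=lambda s: s["price_per_kwh"])["name"]

sources = [
    {"name": "solar", "available_kw": 3.0,   "price_per_kwh": 0.05},
    {"name": "wind",  "available_kw": 1.0,   "price_per_kwh": 0.04},
    {"name": "grid",  "available_kw": 100.0, "price_per_kwh": 0.15},
]
daytime = pick_energy_source(sources, demand_kw=2.5)  # solar covers it cheaply
peak = pick_energy_source(sources, demand_kw=50.0)    # only the grid can cover it
```

Running this rule at the edge rather than in the cloud lets the device react to local price and availability changes without a round trip.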

Smart utility services

Fog computing can be used with smart utility services, whose focus is improving
energy generation, delivery, and billing. In such environments, edge devices can
report more fine-grained energy-consumption details (for example, hourly and daily,
rather than monthly, readings) to users' mobile devices than traditional smart utility
services can. These edge devices can also calculate the cost of power consumption
throughout the day and suggest which energy source is most economical at any given
time, or when home appliances should be turned on to minimize utility use [2].
Connected car

Autonomous vehicles are the new trend on the road. Tesla is working on software to
add automatic steering, enabling literally hands-free operation of the vehicle,
starting out by testing and releasing self-parking features that don't require a
person behind the wheel. It was predicted that by 2017 all new cars on the road
would have the capability to connect to nearby cars and to the internet. Fog
computing is well suited to internet-connected vehicles because it provides
real-time interaction: cars, access points, and traffic lights will be able to
interact with each other, making the road safer for everyone. At some point, the
connected car will start saving lives by reducing automobile accidents [2].

Augmented reality, cognitive systems, and gaming

Fog computing plays a major role in augmented-reality applications, which are
latency sensitive. For example, the EEG Tractor Beam, an augmented multiplayer
online brain-computer-interaction game, performs continuous real-time brain-state
classification on fog devices and then tunes the classification models on cloud
servers, based on electroencephalogram readings that sensors collect [2].
A wearable cognitive-assistance system that uses Google Glass devices helps people
with reduced mental acuity perform various tasks, including telling them the names
of people they meet but don't remember. In this application, devices communicate
with the cloud for delay-tolerant jobs such as error reporting and logging. For
time-sensitive tasks, the system streams video from the Glass camera to the fog
devices for processing. The system demonstrates how using nearby fog devices
greatly decreases end-to-end latency [2].

Mobile Big Data Analytics:

Big data processing is a hot topic for big data architecture in the cloud and
mobile cloud. Fog computing can provide elastic resources to large-scale data
processing systems without suffering from the cloud's main drawback: high latency.
In the cloud computing paradigm, events or data are transmitted to a data center
inside the core network, and results are sent back to the end user after a series
of processing steps. A federation of fog and cloud can handle big data
acquisition, aggregation, and pre-processing, reducing data transportation and
storage and balancing computation power for data processing. For example, in a
large-scale environment-monitoring system, local and regional data can be
aggregated and mined at fog nodes, providing timely feedback, especially in
emergency cases such as a toxic-pollution alert, while detailed and thorough
analysis, as computation-intensive tasks, can be scheduled on the cloud side. We
believe data processing in the fog will be the key technique for tackling
analytics on the large scale of data generated by IoT applications [2].
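The fog/cloud split described above, raising alerts at the fog and shipping only a compact summary upstream for heavy analysis, can be sketched as follows. The pollution threshold and record format are hypothetical.

```python
def fog_preprocess(readings, alert_threshold=300.0):
    """Fog-side step for an environment-monitoring system: raise immediate
    alerts locally and reduce the raw stream to a compact summary, so only
    the summary (not every reading) travels to the cloud for deep analysis."""
    alerts = [r for r in readings if r["pollution"] > alert_threshold]
    summary = {
        "n_readings": len(readings),
        "max_pollution": max(r["pollution"] for r in readings),
        "mean_pollution": sum(r["pollution"] for r in readings) / len(readings),
    }
    return alerts, summary

readings = [{"sensor": i, "pollution": p}
            for i, p in enumerate([120.0, 310.0, 95.0])]
alerts, summary = fog_preprocess(readings)
# `alerts` are acted on locally and immediately;
# `summary` is what gets shipped to the cloud for longer-term analytics.
```

The timely local alert and the reduced upstream traffic are exactly the two benefits the paragraph claims for the fog-cloud federation.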

Big data has emerged in earnest over the past couple of years, and with this
emergence the cloud became the architecture of choice. For all but the most
well-financed organizations, the virtual resources of the cloud, with its nearly
infinite scalability and on-demand pay structure, are the only feasible way to
access massive quantities of big data [2].

4.5. Future of Fog Computing

Just as the cloud has created new business models, growth, and industries, fog can
eventually do the same, with new vendors, new industries, and new business models
emerging as industry works together with academia to address the challenges and
solve real business problems with these new architectures.

Fog computing will provide ample opportunities for creating new applications and
services that cannot be easily supported by the current host-based and cloud-based
application platforms. For example, new fog-based security services will be able
to help address many of the challenges we face in securing the Internet of Things.

Developing these services at the edge through fog computing will lead to new business
models and opportunities for network operators.


We have analyzed fog computing and its real-time applications.

Fog computing performs better than cloud computing for latency-sensitive
workloads, processing data closer to where it is produced and needed; it also
protects sensitive IoT data. Fog computing will grow to support emerging network
paradigms that require faster processing with less delay. By applying the
concepts of fog computing, the same edge device can perform this kind of
processing, so the data generated can be put to immediate use, delivering a much
better user experience.


[1]. Cisco RFP-2013-078, "Fog Computing, Ecosystem, Architecture and Applications."
[2]. R. Buyya and A. V. Dastjerdi, Internet of Things, 1st ed.
[3]. 2014 Federated Conference on Computer Science and Information Systems, 7-10 Sept. 2014.
[4]. F. Bonomi, "Connected vehicles, the internet of things, and fog computing," in The
Eighth ACM International Workshop on Vehicular Inter-Networking (VANET), Las Vegas,
USA, 2011.
[5]. International Journal of Research in Engineering and Technology, eISSN: 2319-1163,
pISSN: 2321-7308.
[6]. A. M. Rahmani and P. Liljeberg, Fog Computing in the Internet of Things.
[7]. H. Madsen, G. Albeanu, B. Burtschy, and F. Popentiu-Vladicescu, "Reliability in the
utility computing era: Towards reliable fog computing," in IWSSIP, IEEE.
