
Search Engine Optimisation

(FOR PRIVATE CIRCULATION ONLY)


2019
PROGRAMME COORDINATOR
Dr. Narendra Parchure

COURSE DESIGN AND REVIEW COMMITTEE


Dr. Padmpriya Irabatti Dr. Narendra Parchure
Dr. Sudeep Limaye Ms. Sonali Karale
Dr. Pallavi Soman

COURSE WRITERS
Dr. Kishori Kasat Dr. Deepali Dhainje

EDITOR
Ms. Neha Mule

Published by Symbiosis Centre for Distance Learning (SCDL), Pune


2019

Copyright © 2019 Symbiosis Open Education Society


All rights reserved. No part of this book may be reproduced, transmitted or utilised in any
form or by any means, electronic or mechanical, including photocopying, recording or by
any information storage or retrieval system without written permission from the publisher.
Acknowledgement
Every attempt has been made to trace the copyright holders of materials reproduced in this
book. Should any infringement have occurred, SCDL apologises for the same and will be
pleased to make necessary corrections in future editions of this book.

PREFACE
We are glad to write this SLM on "Search Engine Optimisation" for the students of SCDL. With
the growth of globalisation, the complexities of the business world have increased immensely.
Today, all organisations are interlinked more than ever before; we can no longer think of
isolating ourselves from others in the business world. This leads to an increasing need for trust
and transparency in the system. The idea behind search engine optimisation is simple: well-run
businesses produce better results.
This SLM covers the basic fundamentals of search engine optimisation. Each unit comprises
the objectives, a detailed presentation of concepts and generalisations to give students a clear
understanding of the topic, and a summary followed by keywords and a list of questions for
self-assessment. Special stress has been laid on simplicity of language. Every effort has been
made to acknowledge the sources of information. We sincerely hope that this SLM will be
interesting and useful and will help students and readers learn this subject in a meaningful
manner.
We take this opportunity to sincerely thank the SCDL staff for believing in us and giving us
the opportunity to write this SLM. We also thank all those who, directly or indirectly, helped
in completing this work.
Dr. Kishori Kasat
Dr. Deepali Dhainje

ABOUT THE AUTHORS
Dr. Kishori Kasat completed her PhD in Electronic Science at the University of Pune, India.
She is an enthusiastic educator whose broad areas of interest include Electronic Science,
Management and Technology, Digital Signal Processing, Speech Recognition, Decision
Science, Operations Research, Education Technology, Statistics and Quantitative Techniques,
and Advanced Research Analysis Techniques (SPSS, AMOS). She has presented a number of
papers at Scopus-indexed national and international conferences. Dr. Kasat is presently
working on a minor research project sanctioned by BCUD, titled "Development of Speaker
Independent Automatic Speech Recognition System for Marathi Language".
Dr. Deepali Dhainje is a faculty member at Fergusson College (postgraduate section) and
visiting faculty for an embedded systems course, with a total of 15 years of teaching
experience. Her areas of expertise are databases, distributed databases, operating systems,
object-oriented languages and software engineering. She has participated in and conducted
many Rational Testing Tools courses. Dr. Dhainje has guided many postgraduate students on
various projects. She also serves as a subject expert on interview panels at the University of
Pune.

CONTENTS

Unit No. TITLE Page No.


1 Search Engine Optimisation 1–10
1.1 Introduction to Search Engines
1.2 History of Search Engines
1.3 Types of Search Engines
1.4 Categories of Search Engines
Summary
Key Words
Self-Assessment Questions
Answers to Check your Progress
Suggested Reading

2 On Page Optimisation 11–18


2.1 Introduction
2.2 Modus Operandi – On Page Optimisation
2.3 Factors Affecting the On Page Optimisation
2.4 Common Problems in On Page Optimisation
Summary
Self-Assessment Questions
Answers to Check your Progress
3 Advanced SEO 19–29
3.1 Introduction
3.2 Learning Objective
3.3 Need and Basic Requisites for Learning
3.4 Advanced SEO Course Content
3.5 Challenges in Designing the Content
Summary
Self-Assessment Questions
Answers to Check your Progress
Suggested Reading
4 Off Page Optimisation 30–36
4.1 Introduction
4.2 Importance of Backlinks
4.3 Key Techniques for Off Page Optimisation
4.4 Benefits of Off-Page Optimisation
Summary
Self-Assessment Questions
Answers to Check your Progress
Suggested Readings

Unit 1

Search Engine Optimisation

Structure
1.1 Introduction to Search Engines

1.2 History of Search Engines

1.3 Types of Search Engines

1.4 Categories of Search Engines

Summary
Key Words
Self-Assessment Questions
Answers to Check your Progress
Suggested Reading

Objectives
After going through this unit, you will be able to:

 Understand the basics of Search Engines

 Understand the History of Search Engines

 Learn about various approaches to Search Engines

 Know about the types of Search Engines

 Study about Search Engine Categories

1.1 INTRODUCTION TO SEARCH ENGINES


The function of a search engine is to mine requested information from the enormous
database of resources available on the internet. Search engines have become an
indispensable day-to-day means of discovering essential information without knowing
precisely where it is stored. The use of the internet has grown enormously in recent
times thanks to easy-to-use search engines like Google.

Search Engines
A search engine is an information-retrieval software program that discovers, crawls,
converts and stores information for retrieval and management in reply to the queries
given by the user.

In general, a search engine consists of four parts: a search interface, a crawler, an
indexer and a database. The crawler navigates through a collection of documents,
deconstructs the text of each document and allots surrogates for storage in the
search engine index. Online search engines can also store images and can
connect data and metadata for each document.

A search engine used on the web is a kind of website that enables a user to
discover information on the internet. It accomplishes this by looking through
various other web pages for the text the user wants to find. The software that
carries out this type of task is known as a search engine. Instead of the user
having to go to each webpage in turn, this task can be accomplished with the help
of a web browser and a search engine.

To use a search engine, it is necessary to enter at least a single keyword in the
search box. In general, an on-screen button will be present, which has to be clicked to
submit the search query. The search engine then looks for matches between
the entered keyword(s) and its database of websites and words.

Once a search is submitted, the results promptly become visible on the screen. The
web page that shows the results is called the search engine results page (SERP). The SERP
is a list of web pages that match the keywords that were searched.

The SERP typically displays the name of each matching web page, a short description
and a hyperlink. When the user clicks on any of the links, the user
navigates to that website.

In general, search engines can be considered, to some extent, the most advanced
websites on the web. Search engines use unique computer code to arrange
the web pages on SERPs. In general, the most popular or highest-quality web pages
will be close to the top of the list.

When a user types words into the search engine, it looks for web pages containing those
words. There might be thousands, or millions, of web pages
with those words. Hence, search engines assist users by ranking the
web pages according to what they assume the user desires first.

1.2 HISTORY OF SEARCH ENGINES


Internet search engines themselves predate the debut of the Web in December 1990.
The earliest well-documented search engine was Archie, which searched FTP file
listings; it was launched on 10 September 1990.

Prior to September 1993, the World Wide Web (WWW) was indexed entirely by
hand. A list of webservers edited by Tim Berners-Lee was hosted on
the CERN webserver.

The Archie program downloaded the directory listings of all files located on
public anonymous FTP (File Transfer Protocol) sites, creating a database of file
names that could be searched. Archie did not, however, index the contents of these
sites, since the amount of data was so limited that it could be readily searched
manually.

Gopher, created in 1991 by Mark McCahill at the University of
Minnesota, led to two innovative and popular search programs:
Veronica (Very Easy Rodent-Oriented Net-wide Index to Computerized Archives) and
Jughead (Jonzy's Universal Gopher Hierarchy Excavation and Display). Veronica and
Jughead searched the file names and titles stored in Gopher index systems.
Veronica also provided a keyword search of the majority of Gopher menu titles
in the entire Gopher listings. Jughead was a tool for obtaining menu information
from particular Gopher servers.

In 1993, no search engine yet existed for the web, although several
specialised catalogues were maintained by hand. Oscar Nierstrasz at the University
of Geneva wrote a series of Perl scripts that periodically mirrored these pages
and rewrote them into a standard format. This formed the foundation for W3Catalog,
the web's first primitive search engine, released on September 2, 1993.

In June 1993, Matthew Gray, then at MIT, produced the earliest web robot, the Perl-based
World Wide Web Wanderer, and used it to generate an index called 'Wandex'. The
main purpose of the Wanderer was to measure the size of the World Wide Web. In
November 1993, Aliweb, the second web search engine, came into view. Aliweb did
not use a web robot; instead, it depended on site administrators to notify it of the
existence at each site of an index file in a particular format.

NCSA's Mosaic was not the first web browser, but it was the
first to make a major splash. In November 1993, Mosaic version 1.0 introduced
a variety of features such as bookmarks, icons, a more eye-catching interface and
pictures, all of which made the software easy to use and attractive.

JumpStation, created in December 1993 by Jonathon Fletcher, used a web
robot to locate web pages and build its index, and used a web form as the
interface to its query program. It was thus the first resource-discovery tool of the
World Wide Web to combine the three essential features of a web search engine:
crawling, indexing and searching.

WebCrawler, which appeared in 1994, was one of the first crawler-based
"all text" search engines. Unlike its predecessors, it allowed users to search for
any word on any webpage, which has become the standard for all major search
engines since. It was also the first search engine widely known to the public.
Also in 1994, Lycos, which began at Carnegie Mellon University, was launched and
became a major commercial endeavour. Many search engines appeared afterwards
and gained popularity, among them Excite, Magellan, Infoseek and Yahoo!.
Yahoo! was among the most popular ways for people to locate web pages of
interest, but its search function operated on its web directory rather than on
full-text copies of web pages. Instead of performing a keyword-based search,
information seekers could look through the directory.

In 1996, Netscape was looking to give a single search engine an exclusive deal as the
featured search engine in the Netscape web browser. There was so much interest that
Netscape instead struck deals with five of the major search engines: for $5 million a
year, each search engine would be in rotation on the Netscape search engine page.
The five search engines were Yahoo!, Magellan, Lycos, Infoseek and
Excite.

Google adopted the idea of selling search terms in 1998 from a small
search engine company named goto.com. This move had a momentous
effect on the search engine business, which went on to become one of the most
profitable businesses on the internet.

A number of companies entered the market spectacularly, making record gains
through their initial public offerings. Some later took down their public
search engines and marketed enterprise-only editions,
such as Northern Light. Many search engine companies were caught up in the dot-com
bubble, a speculation-driven market boom that peaked in 1999 and
ended in 2001.

Around 2000, Google's search engine rose to prominence. The company achieved
better results for many searches with an innovation called PageRank, an iterative
algorithm that ranks web pages based on the number and PageRank of the other
websites and pages that link to them, on the premise that good or desirable pages
are linked to more than others (a toy sketch follows this paragraph). Google also
maintained a minimalist interface to its search engine, whereas many of its
competitors embedded their search engines in web portals. As a result, the Google
search engine became enormously popular.
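
To make the idea concrete, the following is a minimal Python sketch of the iterative
ranking scheme described above, not Google's actual implementation; the link graph,
damping factor and iteration count are illustrative assumptions.

```python
# A toy PageRank iteration: each page repeatedly shares its rank with
# the pages it links to, so well-linked pages accumulate higher rank.
def pagerank(links, damping=0.85, iterations=20):
    """links maps each page to the list of pages it links to."""
    pages = list(links)
    rank = {p: 1.0 / len(pages) for p in pages}
    for _ in range(iterations):
        new_rank = {p: (1 - damping) / len(pages) for p in pages}
        for page, outlinks in links.items():
            for target in outlinks:
                # a page inherits rank from every page that links to it
                new_rank[target] += damping * rank[page] / len(outlinks)
        rank = new_rank
    return rank

toy_web = {"a": ["b", "c"], "b": ["c"], "c": ["a"]}
print(pagerank(toy_web))  # prints the computed rank of each page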

By 2000, Yahoo! was providing search services based on Inktomi's search
engine. Yahoo! then switched to Google's search engine and used it until 2004, when it
launched its own search engine based on the combined technologies of its acquisitions.

Microsoft first launched MSN Search in the fall of 1998, using search results from
Inktomi. In 2004, Microsoft began a transition to its own search technology,
powered by its own web crawler, called msnbot.

Microsoft's rebranded search engine, Bing, was launched on June 1, 2009.
On July 29, 2009, Yahoo! and Microsoft concluded an agreement under which Yahoo!
Search would be powered by Microsoft Bing technology.

1.3 TYPES OF SEARCH ENGINES


In general, search engines are categorised on the basis of how they work. They are
categorised as crawler-based search engines, human-powered directories, hybrid
search engines and other special search engines.

I) Crawler Based Search Engines

Crawler-based search engines use a crawler, bot or
spider to crawl and index content that is new to the search database. The
four basic steps which every crawler-based search engine performs before showing any
site in the search results are crawling, indexing, calculating relevancy and retrieving
the results.

a) Crawling

Search engines search the whole web to fetch the obtainable web pages. A
piece of software called a crawler, bot or spider carries out the process of
crawling (a toy illustration follows below). The frequency of crawling depends on
the search engine, and there may be a gap of a few days between crawls. This is
why search results sometimes show the content of an old or deleted page. Once the
search engines crawl the site another time, the search results show the latest content.
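
As a toy illustration of this crawl loop (not a production crawler), the Python
sketch below fetches a page, extracts its links and queues unseen URLs; the seed
URL is a placeholder, and a real crawler would also respect robots.txt, throttle
its requests and store what it fetches.

```python
from collections import deque
from html.parser import HTMLParser
from urllib.parse import urljoin
from urllib.request import urlopen

class LinkParser(HTMLParser):
    """Collects the href targets of all anchor tags on a page."""
    def __init__(self):
        super().__init__()
        self.links = []
    def handle_starttag(self, tag, attrs):
        if tag == "a":
            for name, value in attrs:
                if name == "href" and value:
                    self.links.append(value)

def crawl(seed_url, max_pages=10):
    seen = {seed_url}
    queue = deque([seed_url])
    fetched = 0
    while queue and fetched < max_pages:
        url = queue.popleft()
        fetched += 1
        try:
            html = urlopen(url, timeout=5).read().decode("utf-8", "ignore")
        except Exception:
            continue                       # skip pages that fail to download
        parser = LinkParser()
        parser.feed(html)
        for link in parser.links:
            absolute = urljoin(url, link)  # resolve relative links
            if absolute.startswith("http") and absolute not in seen:
                seen.add(absolute)
                queue.append(absolute)
    return seen

print(crawl("https://example.com"))        # placeholder seed URL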

b) Indexing

Indexing is the step subsequent to the crawling process. It is a
method of recognising the words and expressions that best describe
the page. The recognised words are called keywords, and the page is allotted
to the recognised keywords. In some cases, when the crawler does not recognise the
meaning of a page, the site might be given a lower rank in the search
results. Once the crawlers pick up the right keywords, the page is allocated
to those keywords and a higher rank is given in the search results. A minimal
sketch of such an index follows.
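
A minimal Python sketch of this allotment of pages to keywords (an inverted
index); the documents are made up for illustration.

```python
from collections import defaultdict

# Two made-up pages and their visible text.
documents = {
    "page1.html": "search engines crawl and index web pages",
    "page2.html": "an index maps keywords to pages",
}

# Map every word to the set of pages that contain it.
index = defaultdict(set)
for page, text in documents.items():
    for word in text.lower().split():
        index[word].add(page)        # the page is allotted to this keyword

print(sorted(index["index"]))        # -> ['page1.html', 'page2.html']
print(sorted(index["crawl"]))        # -> ['page1.html']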

c) Calculating Relevancy

The search engine starts comparing the search string in the search request with
the pages that are indexed in the database. Since it is possible that more than
one page may hold the search string, the search engine begins computing the
relevancy of each page in its index to the search string. There are numerous
algorithms available to determine relevancy, and each algorithm assigns different
relative weights to general factors like keyword density, links or meta tags (a toy
example follows below). Therefore, different search engines offer different search
results pages for the identical search string. It is a well-known fact that every
major search engine periodically modifies its algorithms.
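
The toy Python sketch below shows what such a weighted relevancy calculation
could look like; the factors, weights and page data are invented, since real
engines keep their formulas secret.

```python
def keyword_density(text, keyword):
    """Fraction of the page's words that equal the keyword."""
    words = text.lower().split()
    return words.count(keyword.lower()) / len(words) if words else 0.0

def relevancy(page, keyword, weights):
    score = weights["density"] * keyword_density(page["text"], keyword)
    score += weights["links"] * page["inbound_links"]
    if keyword.lower() in page["meta"].lower():
        score += weights["meta"]       # bonus when the meta tags mention it
    return score

page = {"text": "cheap shoes and more shoes", "meta": "shoes", "inbound_links": 12}
weights = {"density": 100, "links": 0.5, "meta": 5}   # engine-specific weights
print(relevancy(page, "shoes", weights))              # -> 51.0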

d) Retrieving Results

Retrieving results is the last step performed by search engines: it consists of
displaying the results in the browser in order. The search engines arrange the
successive pages of search results in order, from the most appropriate to the least
appropriate sites. The majority of popular search engines are crawler based and use
the above technology to display search results. Some examples of crawler-based
search engines are Google, Yahoo! and Bing; other popular crawler-based search
engines are AOL, DuckDuckGo and Ask.

II) Human Powered Directories

Human-powered directories, also called open directory systems, depend on
human activity for listings. Indexing in human-powered directories works as
follows:

 The owner of the site submits a short description of the site to the directory,
along with the category under which it is to be listed.

 The submitted site is manually evaluated and then either added to the suitable
category or rejected for listing.

 Keywords entered in a search box are matched with the
descriptions of the sites. Changes made to the content
of web pages are therefore not taken into consideration; only the description
matters.

In general, a good site with good-quality content is more likely to be reviewed for
free than a site with poor content. Some examples of human-powered
directories are DMOZ and the Yahoo! Directory.

III) Hybrid Search Engines

Hybrid search engines use both crawler-based and manual
indexing for listing web sites in search results. Most crawler-based
search engines, like Google, use crawlers as the
primary mechanism and human-powered directories as a secondary mechanism. In
view of the fact that human-powered directories are becoming extinct,
hybrid engines are becoming more and more crawler-based.
However, manual filtering of search results still happens, to
remove copied and spam websites. When a website is identified as spam, it
is the website owner's duty to take the necessary corrective action and resubmit
the website to the search engines. In that case, specialists manually evaluate
the submitted website before including it in the search results again. In this
way, even though the crawlers manage the processes, the control is manual, to
observe and show the search results as expected.

IV) Other Types of Search Engines

These search engines may be classified into a variety of other types
depending upon their usage. Some search engines use different types of bots
to surface only videos, images, news, products or local listings. One
example is the Google News page, which can be used to search only for
news from different newspapers.

Search engines such as Dogpile collect meta information about web
pages from other search engines and directories to compose their search results;
they are therefore known as metasearch engines.

Swoogle is a semantic search engine, which tries to produce precise search
results in a specific field by taking into account the contextual meaning of the
search queries.

Check your Progress 1


Fill in the Blanks.

1. Human powered directories are also called ______________.

2. ______________ use both crawler-based and manual indexing for listing
websites in search results.

1.4 CATEGORIES OF SEARCH ENGINES


a) Web Search Engines

Search engines specifically designed to search web pages, images and
documents were developed to facilitate searching through a large, nebulous
collection of unstructured resources. They are engineered to follow a multi-stage
process: crawling the endless accumulation of pages and documents to skim
the salient terms from their contents; indexing those terms in a
semi-structured form; and, at the end, resolving the entries or queries of the user
to return the most relevant results and links to those scanned pages or
documents from the inventory.

b) Crawl

In the case of a purely textual search, the first step in classifying web
pages is to find an 'index item' that relates explicitly to the 'search
term'. In the earlier period, search engines started with a small list of Uniform
Resource Locators (URLs), also called a seed list. The crawler fetched the
content of the seed pages and parsed the links on those pages for related
information, which in turn provided new links. This procedure was highly
cyclical and continued until a sufficient number of pages had been found for the
searcher's use. At present, a continuous crawl technique is in use, in contrast to
incidental discovery based on a seed list. The crawl method is an extension of
the aforesaid discovery method, except that it has no seed list, because the system
never stops crawling.

The majority of search engines use complicated scheduling algorithms to
decide when to re-examine a particular page, according to its
importance. These algorithms range from a constant visit interval, with top priority for
more frequently changing pages, to an adaptive visit interval based on
numerous criteria such as frequency of change, popularity and overall
quality of the site. The speed of the web server that runs the page, as well as
resource constraints such as the quantity of hardware or bandwidth, also
figures in.

c) Link map

The pages that are revealed by web crawls are regularly distributed and fed into
another computer that generates a complete map of the resources that have been
uncovered. The cluster-like mass resembles a graph, on which the different pages are
represented as small nodes connected by the links between the pages. The
mass of data is stored in numerous data structures that allow quick access to
the data by certain algorithms. These algorithms compute the popularity score
of pages based on how many links point to a given web page. In general, search
engines distinguish between internal links and external links. Link map data
structures usually store the anchor text embedded in the links as well, since
anchor text can often offer a very good summary of the content of a web page.
A minimal sketch of such a structure follows.
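
A minimal Python sketch of such a link map, using made-up crawl data: for each
target page it stores the linking pages and the anchor text of each inbound link.

```python
from collections import defaultdict

# (source page, target page, anchor text) triples found during a crawl.
crawled_links = [
    ("a.html", "b.html", "best shoe guide"),
    ("c.html", "b.html", "shoe guide"),
    ("b.html", "a.html", "home"),
]

link_map = defaultdict(list)
for source, target, anchor in crawled_links:
    link_map[target].append((source, anchor))

# A crude popularity signal: the number of links pointing at each page.
for page, inlinks in sorted(link_map.items()):
    print(page, "inbound links:", len(inlinks), [a for _, a in inlinks])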

d) Database Search Engines

Specialised search engines arose because searching for text-based content in
databases presents a few special challenges. Databases can
be slow when resolving complex queries. Crawling is not necessary for
a database, because the data is already structured. However, it is frequently
necessary to index the data in a more economised form to permit a speedier search.

e) Mixed Search Engines

At times, the data searched includes both database content and web pages or documents.
Search engine technology has therefore developed to respond to both
sets of requirements. The majority of mixed search engines are very large web
search engines, like Google, that search through both structured and unstructured data
sources. Documents are crawled and indexed in a separate index, while the
databases are indexed from a variety of sources. The results of a search are then
generated for users by querying these numerous indices in parallel and
combining the results according to "rules".

Check your Progress 2

State True or False.

1. Crawling is not necessary for a database, because the data is
already structured.

Summary
 Swoogle is a semantic search engine which tries to produce precise
search results in a specific field by taking into account the contextual meaning of
the search queries.

 At present, a continuous crawl technique is in use, in contrast to incidental
discovery based on a seed list.

 Data search includes both database content and web pages or documents.

Keywords
 URL: Uniform Resource Locator. In the earlier period, search engines started
with a small list of URLs, also called a seed list. The seed list was used to fetch
content and to parse the links on those pages for related information, which in
turn provided new links.

Self-Assessment Questions
1. Explain the categories of Search Engines.

2. Explain the types of Search Engines in detail.

Answers to Check your Progress


Check your Progress 1

Fill in the Blanks.

1. Human powered directories are also called open directory systems.

2. Hybrid search engines use both crawler-based and manual indexing for listing
websites in search results.

Check your Progress 2

State True or False.

1. True

Suggested Reading
1. Peter Kent, SEO for Dummies, 6th Edition, John Wiley & Sons.

2. Jason McDonald, SEO Toolbook: 2018 Directory of Free Search Engine Optimization
Tools, Kindle Edition.

3. W. Bruce Croft, Donald Metzler, Trevor Strohman, Search Engines: Information
Retrieval in Practice, Pearson Education, Inc.

4. Aaron Matthew Wall, Search Engine Optimization book.

Unit 2

On Page Optimisation

Structure
2.1 Introduction

2.2 Modus Operandi – On Page Optimisation

2.3 Factors Affecting the On Page Optimisation

2.4 Common Problems in On Page Optimisation

Summary

Self-Assessment Questions

Answers to Check your Progress

Objectives
After going through this unit, you will be able to:

 Understand the concept of on page optimisation

 Study the important on page factors which influence your search engine
rating

2.1 INTRODUCTION
The first step which every SEO web designer or webmaster should take into
consideration is on page optimisation. This means the webmaster needs to manage
tasks such as selecting the correct name for the domain, formation of title tags and
meta tags, enhancement of hyperlinks, images and heading tags, and many more. It
does not take much time to understand and execute the procedures required for on
page optimisation. It is very crucial for any portal webmaster to implement
appropriate on page optimisation to get a good rating in the search engines and also
enhance the overall reading experience for the visitors of your webpage.

It is not very easy to adopt a general and effective procedure for enhancing a page,
because the set of rules given by Google keeps changing
continuously and the number of different ways of getting visitors or web
traffic to your webpage is increasing day by day.

The essence of on page optimisation is to pay attention to what search engines
respond to and design the webpage and content accordingly. When it comes to a high
rating for your webpage, on page optimisation is more important than off page
optimisation, and it takes the lead even when compared with portals having a good
domain name and the many backlinks that are considered for a high rating.

2.2 MODUS OPERANDI – ON PAGE OPTIMISATION


The following are the methods listed under the on page optimisation:

1. Configuration of URL

It is necessary that you have a URL that is noticeable and has value. It is
recommended that you keep a brief URL which is easy to recollect, describes the
purpose of the webpage and maintains clarity. When a visitor wants to understand
the content of a page, he/she can interpret the URL from the status bar in the browser.
In order to have a dynamic URL, you need to maintain a database-driven portal or a
portal URL that has its own running script. Static URLs are generally rated well in
search engine results pages and indexed faster than dynamic URLs. The moment the
URL is in accordance with the query and with the title and description on the search
engine, visitors will surely connect with that URL.

2. Configuration of internal and external linking: Link Optimisation

In order to help your visitors with proper direction and to support the search engines,
it is crucial that you look at the optimisation of your internal and external outbound links.
The factors required for this purpose are as follows:

a. When connecting to others, make use of good anchor text and correct keywords
to produce a meaningful outbound link.

b. The configuration of internal links should be neat and clear, with correct use
of anchor text.

c. Make use of permalinks in the case of a CMS so that you can retain the keywords/post
title in your link and get greater value from search engines.

3. Analysis of Keywords and Projecting the URL Based on Keywords

Generally, if you load your portal with lots of keywords and text, it will become
difficult for the visitor to read the content. Therefore, you are required to maintain
a balance between your content and keywords. It is better to investigate before
you place your keywords, using tools that are free of charge (e.g. the Google AdWords
Keyword Tool). Keep the keyword density moderate so that your webpage is found
appropriate and the visitor can relate to it easily. Make use of synonyms and associated
keywords so that your content sounds natural and helps search engine
optimisation. Also try to use long-tail keywords, as they help the rating of your webpage.

4. Structural design of URL

The structural design of the URL should reflect the flow of data. It should also make
the webpage clear, so that search engines can look for the relevant data searched
for by the visitor. The significance of the URL structure is that it helps the search
engines understand the requirements and assign a relevancy metric to the page.
From the anchor text point of view it is also supportive, as visitors generally
connect with a relevant keyword, text or content contained in the URL.

5. Configuration of Title Tags

The most important factor in your portal's optimisation is your portal's title tag. It is
recommended that you don't keep your title too lengthy; it should carry precise
information so that users can reach your webpage, recognise your products
and services and continue their engagement on your site. It also helps your portal
get a better rating compared with other websites that are similar to yours.

The title tag may include the following: your brand name, your name and the
name of the portal, keywords, and your contact number (toll free).

6. Formation and Optimisation of Metadata

Even though it might not seem very significant, the meta description cannot be
overlooked. It should consist of a short explanation of your portal, your
business objectives and your expertise.

The meta tags may include: selling points, keywords and contact numbers.

Meta tags are a very crucial portion of the HTML code and the design of your portal.
The description tag is created to deliver a short narration of your portal so that it
can be utilised by search engines or directories. It also supports the indexing of a
portal by search engines so that visitors can find your portal. Meta tags are
considered an essential part of web page design; examples are the title,
description, keyword and robots tags. An illustrative example follows.
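
For illustration, here is a hypothetical head section putting the above together; the
brand name, phone number and wording are placeholders, not recommendations.

```html
<head>
  <!-- Title tag: brand, portal name, keyword, toll-free number -->
  <title>Acme Shoes | Handmade Leather Shoes | 1-800-000-0000</title>
  <!-- Description tag: the short narration shown on the SERP -->
  <meta name="description"
        content="Acme Shoes sells handmade leather shoes with free delivery.">
  <meta name="keywords" content="leather shoes, handmade shoes">
  <!-- Robots tag: tells crawlers to index the page and follow its links -->
  <meta name="robots" content="index, follow">
</head>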

7. HTML Tags: H1, H2 and Strong Tags

You need to emphasise the portions of your portal that you wish your visitors
to glance at.

There are many tags, such as the header tags H1 (for the main header content), H2 and
H3 (page/post headers or crucial parts of your pages), bold (strong keywords), italic
(some words) and quote (when referring to somebody's quote). The content inside
these tags helps the search engines connect easily; a short example follows.
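
A short illustrative snippet using these tags (the content is a placeholder):

```html
<h1>Leather Shoe Care Guide</h1>            <!-- main header content -->
<h2>Cleaning Your Shoes</h2>                <!-- section header -->
<p>Use a <strong>soft brush</strong> and work in <em>small circles</em>,
   as the maker advises: <q>clean little and often</q>.</p>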

8. Optimisation of Content

In order to put the right information on your webpage, you are required to optimise
your content: explore, collect the right information, write it up and make the
changes that are suitable for your portal. Content plays a very important role in the
optimisation process and in increasing the prominence of your portal, your rating
and your search engine optimisation.

Examples: track your heading tags, keyword density and inline text links.

9. Optimisation of Images

The more images on your portal, the more you will need to track their optimisation;
otherwise they will lose readability by the search engines. A human can easily read an
image, but the web crawler's procedure is not similar to a human's. Search engine
spiders cannot read images, as they are built only for text.

The following are some of the crucial factors of image optimisation (an illustrative
snippet follows the list):

a) Alt text: the text that describes the image you have supplied. If the image is
not shown by the browser, the alt text is displayed in its place, and it explains
the image to the search engines.

b) Name of file: give names to your files that carry significance, and remember to
keep them consistent with the alt text.

c) Title of image: see that your images carry title tags, so that when a visitor
places the mouse on your image he/she can see the title as a tooltip.

d) Linking of your image: image linking can be done by making the image itself a link.
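
A hypothetical snippet combining the four factors above: a descriptive file name,
matching alt text, a title shown as a tooltip, and the image itself used as a link.

```html
<a href="/leather-shoes.html">
  <img src="/images/brown-leather-shoes.jpg"
       alt="Brown leather shoes"
       title="Brown leather shoes">
</a>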

10. Robots.txt
Robots.txt is a text file on your portal which tells search robots which pages they
should not visit. In the same way, the robots meta tags instruct search engines to
avoid certain files and folders on the portal. With the help of the robots.txt file you
can direct search engines through your portal. The robots.txt should be placed in
the root directory so that user agents can find it easily. An illustrative file follows.
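
An illustrative robots.txt, assuming hypothetical /admin/ and /private/ folders that
should not be crawled:

```text
# Placed at the site root, e.g. https://example.com/robots.txt
User-agent: *
Disallow: /admin/
Disallow: /private/
Sitemap: https://example.com/sitemap.xml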

Other techniques:

1) Analysis of Challengers or Competitors

2) Investigation of Internal Search

3) Making and Updating the Sitemap

4) Making a Sitemap for Video

5) Google Maps with Insertion of Images

6) Revision and Validation of HTML Coding

2.3 FACTORS AFFECTING THE ON PAGE OPTIMISATION


The ranking of your page is interlinked with the system of optimisation. If the
optimisation is done in the right manner, it will have a greater effect on the rating of
your web page.

The most important on page factors that influence your search engine rating are as
follows:

1. Page Content

The outlook and outcome of the search is based on the content given on your page.
The content is what the visitor is looking at while searching for your web page
through search engines. You need to work on the content in such a way that it attracts
the attention of users, is relevant to their search and lets them connect easily.

While designing the content, you have to keep in mind that your web page will create
the necessary demand for that page through relevant content only. The content
also needs to be linkable; otherwise you will lose your web page rating and it will not
connect visitors to your site.

2. Being Unique and Valuable

The moment you add value and uniqueness to your page, visitors will remain
engaged and will not look for different pages. The content used should be
unique and relevant in describing your products and services so that visitors are
attracted to your page. The video, text, images and multimedia used should be
noteworthy.

3. Targeted Keywords

Your URL should display the primary keyword phrase. This is the main element of your
title as well. You need to use terms and phrases that are relevant, and your keywords
should be valid and complete.

4. Support for Social Media Networks

A well-written URL will display your web page title. Visitors should be able to share
your pages to as many social networks as possible.

5. Support for Connecting from Multiple Devices

Visitors to your webpage should be able to connect through all devices and load
your URL to get the required information. You need to work this out and make your
page ready to connect from multiple devices.

6. Interlinking

Tactfully linking a single web page to other similar pages on your portal
gives a framework to the search engine and to the visitor as well. You can go through
the following exercise:

a. Add links to the foremost content of every single page
b. Give weightage to linking paragraphs
c. Make use of anchor text that has valuable keywords in links
d. Do not have many links from an individual page to the same page
e. Make use of a minimum of links so that you can retain full control

2.4 COMMON PROBLEMS IN ON PAGE OPTIMISATION


The most common problems related to on page optimisation are as follows:

1. Optimisation of Images

These are among the most common problems. Generally, they are of two types:

1. Fragmented (broken) images: these look like a tiny block or piece of paper on the
page. They appear like this because of a wrong file path in the code, a wrong name
given to the image, or because the image no longer exists.

2. Omitted alt attributes: alt attributes are the textual substitutes for
images. They provide an explanation of the images for the search engines.

If you want to fix the above problems, it is necessary that you conduct regular SEO
audits to check for the image problems that are influencing your portal. Create
substitutes for your fragmented images or remove them completely. You need to
add alt and title attributes to all significant images where they are absent.

2. Content Duplication

This is one of the major problems noticed in on page optimisation. Content does
not have to be word-for-word the same to count as duplicate. As you are
dealing with search engines and crawl bots, if your content is duplicated, the
duplicate pages will conflict with one another for the flow of traffic. This will end up
affecting the rating of your web page as well.

Conduct periodical site audits to check the amount of content that is identical, and
minimise duplication of content. Ensure that any identical content which is under
your control has been redirected to the most authoritative page.

3. Omission of Meta Descriptions

It is noted that many pages still have a missing meta description. Generally, the
meta description should have 130 to 155 characters and not more than this. It
should contain the primary keyword to support optimisation of the page, keeping your
visitors engaged and attracting more visitors to connect with your page.

You can modify the meta description with the help of the WordPress Yoast SEO plugin.
SEO audit software can also be used.

4. Problems Related to Well-organised (Structured) Data

With the help of schema.org microdata, you can help the search engines keep a
check on your data, and accordingly they will show richer information in search
results. Generally, however, it is found that only 20% of pages make use of schema.org
microdata.

With the help of a free schema creator tool, you can form HTML with schema.org
microdata and then add the custom code in the required parts inside your pages; a
small example follows.
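
A small illustrative example of schema.org Product microdata with placeholder values:

```html
<div itemscope itemtype="https://schema.org/Product">
  <span itemprop="name">Brown Leather Shoes</span>
  <div itemprop="offers" itemscope itemtype="https://schema.org/Offer">
    <span itemprop="price">49.99</span>
    <meta itemprop="priceCurrency" content="USD">
  </div>
</div>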

5. Problems Related to Links

You can indicate to Google that your portal is of greater value when portals of great
excellence link to your pages. This is complemented by linking your own pages
to one another. Links that are broken will affect the experience of the visitor and will
indicate to the search engines that your page is of low grade.

Fix your links that are fragmented or broken: delete them or update them. You
can also run site audit software and crawl for the problems related to the links in
your portal.

Check your Progress 1

State True or False.

1. Off page optimisation is more important than on page optimisation for a high
rating of a webpage.

2. The essence of on page optimisation is to pay attention to what search engines
respond to and design the webpage and content accordingly.

Summary
 The outlook and outcome of the search is based on the content given on your page.
Content duplication is one of the major problems noticed in on page
optimisation. It is noted that many pages still have missing meta descriptions.
Generally, it is found that only 20% of pages make use of schema.org microdata.

Self-Assessment Questions
1. Explain in brief the methods under on page optimisation.

2. List factors affecting the on page optimisation.

Answers to Check your Progress


Check your Progress 1

State True or False.


1. False

2. True

Unit 3
Advanced SEO

Structure
3.1 Introduction
3.2 Learning Objective
3.3 Need and Basic Requisites for Learning
3.4 Advanced SEO Course Content
3.5 Challenges in Designing the Content
Summary
Self-Assessment Questions
Answers to Check your Progress
Suggested Reading

Objectives
After going through this unit, you will be able to:
• Understand the importance of SEO
• Study the challenges in designing the content

3.1 INTRODUCTION
Search Engine Optimisation is a very fast-moving and vigorous field. It has become
necessary that you are required to have complete information about this field and
acquire knowledge and study endlessly. Just depending on techniques or strategies that
are redundant and is not operational will only upset you and lead you nowhere.
Having an understanding about basic SEO will only support you for the tasks like creating
the links, designing old content, accumulating some keywords to increase your search
rates and this might add little prominence to your brand or business. SEO does not mean
that you only see your presence on search engines and direct visitors to your portal.
Knowing basic SEO is to have a following basic methods or tactics to get your SEO
equipped for your website. It generally covers some of the following aspects only:

 Heading and description meta tags

 Design that can be accessed with the help of social networks
 Accessibility to crawler bots
 Rich snippets for local SEO and commodities
 Powerful and exclusive content with keywords emphasised
But in today's high-speed world, it has become necessary to learn how SEO is
multifaceted and complicated, requires a lot of time and is more technical than the
basics. There is a need to know the advanced level of SEO, which not only follows the
basic requirements but also considers all the possible variables related to SEO and
tackles problems and developments in the on-page and off-page dynamics involved.
In today's digital world, it has become crucial that you keep track of and realise the
following factors of advanced SEO requirements:
1. Upcoming styles (e.g. voice search)
2. Variations in algorithms
3. High-tech evolutions
4. Your viewers: wonderful experience, conversions and customers
5. Structured data for the accessibility of your webpage
6. Leading the competition with the best SEO strategies
7. Indexing and crawlability
8. Redirects
9. Encryption and technical features
10. Links on your portal

3.2 LEARNING OBJECTIVE


It is crucial to have a wide understanding and an overview of everything related to SEO,
and to learn the need for advanced SEO, which is the key to your success.
The learning objectives for advanced SEO are as follows:
1. Concentration on advanced keywords and search engines
2. Achieving a greater SEO position through the promotion of content marketing
3. Understanding the social media environments
4. Building your prominence and reputation through influencer marketing
5. Analysing, recognising and developing your own global content
6. Understanding the behaviour of your ultimate target audience
7. Becoming proficient in technical factors and strategies related to SEO
8. Creating your line of defence to guard your website against adverse SEO
9. Driving websites generated through databases
10. Understanding the tools and sitemaps
11. Understanding the strengths and weaknesses of SEO
12. Understanding the strength of editorial writing and its positive effects
13. Methods for optimising for search engines in a professional way
14. Competitive intelligence and multi-dimensional evaluation
15. Methods of getting acquainted with sources like Google, Yahoo, etc.

3.3 NEED AND BASIC REQUISITES FOR LEARNING


The advanced SEO course content is structured to prepare you to be a successful
industry professional in SEO. It will help you become a master of keyword
management and research and of on-page and off-page optimisation, develop links and
design your portal for high traffic and conversions. You will learn the crucial tactics
through thorough preparation and engagement.
This course is generally designed for the following persons:
1. Persons who need to understand upcoming and advanced SEO tactics
2. Real-time marketers
3. Businessmen or entrepreneurs
4. Webmasters, web designers, web developers, bloggers and content writers
5. Anyone who needs to create, develop and arrange a self-hosted, user-friendly
WordPress website
6. Persons who want greater website rankings on search engines
7. Persons who understand marketing and communication
8. Persons who are general managers of SMEs

The basic requisites are as follows:
1. A person who understands basic SEO content, i.e., the necessities and
prerequisites
2. A person who has knowledge of HTML tagging
3. A person who has matching hands-on information and understanding
4. Understands how to explore keywords in SEO
5. Understands how pages get indexed by search engines in a short period of time
6. A professional who has knowledge of marketing and portal creation

3.4 ADVANCED SEO COURSE CONTENT


1. Names of the Search Engine Spiders
A search engine spider is an intelligent program, but it is known as a spider because of
the manner in which it works its way across the Web. The spider knits a web of
indexed web pages by scrutinising the HTML and other factors on every page.
The names of the spiders and their associated search engines are given below:

 Spider 1: Googlebot: It belongs to Google.com search engine


 Spider 2: MSNbot: It belongs to Search.msn.com search engine
 Spider 3: Ask Jeeves: It belongs to the Teoma (Ask.com) search engine
 Spider 4: Architext: It belongs to Excite.com search engine
 Spider 5: Yahoo Slurp: It belongs to Yahoo Web Search
 Spider 6: ia_archiver: It belongs to Alexa.com search engine
Search engine spiders are known by another name, 'crawlers', and are generally used
by internet search engines to gather data about portals and specific web pages.
The search engine spiders have four essential ways of collecting data. The first is to
form queues of web pages that can then be searched by other spiders; this is the
so-called 'selection' mode.
The second is a mode in which a spider is created so that it can skip pages that have
already been crawled by a given spider; this is known as the 're-visitation' mode.
The third is called the 'politeness' mode: search engines worry about a page
being over-crawled by many different spiders, so this mode limits the crawling of such
overused pages.
The fourth is called the 'parallelisation' mode, which permits a spider to
synchronise the information it gathers with other search engine spiders that are
crawling the same page.
2. Google Analytics Cookies
A cookie is a text file that stores data about client choices, location and additional
information. It restricts unauthorised access and safeguards the client's information. It
also controls some functions of portals. It supports remarketing and also gathers Google
Analytics information and other tracking information.
There are two kinds of cookies:
First-party cookies: these cookies are released by the portal being visited, and only
the portal which released a first-party cookie can read it.
Third-party cookies: these cookies are released by portals other than the
portal which you are connected to or visiting.
Google has established several kinds of cookies for various purposes. Given below are the
kinds of cookies established on a client's hard disk:
1. Preference cookie (named PREF): this cookie stores the client's
preferences (for example, the language preferred by the client, or other customisation).
2. Security cookies (e.g. SID and HSID): these safeguard the client's
information from unknown or illegal sources.
3. Process cookies (e.g. lbcs): these preserve some functions
of portals.
4. Advertising cookies (e.g. 'id'): these support efficient
advertising by assisting with customised ads for clients.
5. Conversion cookies: these check the pathway of the client's
engagement and communication with ads.
6. Analytics cookies (e.g. _utma, _ga): these gather Google
Analytics information.
It is important to note that cookies may be established with or without an expiry date:
1. First- and third-party cookies can be established either with an expiry date
or without one.
2. Cookies established without an expiry date are called
temporary cookies. Such cookies are terminated immediately after the
web session finishes or the browser window is shut.
3. Cookies established with an expiry date are called
persistent cookies. These cookies are terminated only on the expiry
date and can remain on your PC even after you have closed your web session or
your browser window.
It is important to understand that all Google Analytics cookies are persistent
cookies except the _utmc cookie, which is a temporary cookie. A small sketch of
the distinction follows.
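
The session-versus-persistent distinction can be sketched in a few lines of Python
with the standard http.cookies module; the cookie names and values below are
illustrative, not real Google cookies.

```python
from http.cookies import SimpleCookie

cookies = SimpleCookie()
cookies["session_pref"] = "en"   # no expiry date -> temporary (session) cookie
cookies["visitor_id"] = "GA1.2.1234567890.1600000000"  # made-up value
cookies["visitor_id"]["expires"] = "Fri, 31 Dec 2027 23:59:59 GMT"  # persistent

# Prints the Set-Cookie headers; only the second carries an expiry date.
print(cookies.output())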
3. What is Google Panda?
Google Panda is a change to the ranking algorithm behind Google's search results,
first released in February 2011. The objective of the change was to lower the rating
of "low-quality sites" or "thin sites", in particular "content farms", and return
higher-quality sites near the top of the search results. The name 'Panda' is derived
from Google engineer Navneet Panda, who developed the technology that allowed
Google to create and execute the algorithm.
Google Panda is a series of continual algorithm updates and data refreshes for the
Google search engine, rolled out by the firm to enhance its search algorithm and
improve the quality of search queries and their outcomes for clients.
These updates precisely tweak the algorithm as part of a constant endeavour by Google
to lift higher-quality portals and web pages to the top of the natural search
results, while reducing or penalising the rating of low-quality or thin portals and
pages that carry a great quantity of ads but no good, high-quality content or
information.
Businesses involved in search engine optimisation, as well as organisations, companies
and web developers around the world, follow Google Panda updates because the
modifications made by Google Panda substantially affect the volume of traffic a given
portal receives from natural or organic search results.
Google Panda updates focus on portals which carry a lot of duplication across web pages
or whose page quality is poor. It is crucial for you to track whether your portal has
been knocked down by Google Panda, whether it has received any other penalty, and
whether your visitors or quantity of traffic have reduced. You can then find out the
reasons and recover from a penalty imposed by Google Panda.
You need to check your Google Analytics, especially your Google traffic, and track it,
since you have paid campaigns and have worked hard on marketing your content. You
need to track your Google organic search. If you observe that the traffic has reduced
drastically within a day or two, it may be a Google Panda penalty. In order to prevent
Panda problems, and to recover from Panda, you need to run a lean site with no
additional or extra pages and avoid duplication. It is better to audit the complete
portal and track down these thin and duplicate pages.
4. Keyword Effectiveness Index
This is a very significant index in the area of search engine optimisation. It sets a
criterion for measuring the likely success of using particular keywords.
The Keyword Effectiveness Index (KEI) is a numerical illustration of the popularity of a
keyword among searchers, matched against its popularity measured as the number of
pages in a search engine's index.
The KEI is a numerical interpretation that reveals the keyword phrases and terms that
are most efficient when used for enhancing your web pages. When keywords are
searched by numerous users but contested by only a few competing pages of the web,
they are worth enhancing on your pages.

24
The calculation of KEI relates a keyword's popularity to its competition (a sketch
follows below). A high KEI indicates keywords that have few competitors and are
well known; such keywords will support you in getting the highest ranking in search
engines and receiving a huge amount of traffic or visitors to your portal. In short,
the keywords that get a lot of searches and face little competition in the search
outcomes are considered elite keywords.
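
Since the unit does not spell the formula out, the sketch below assumes the
commonly cited form KEI = popularity squared over competing pages; the numbers
are invented.

```python
def kei(monthly_searches, competing_pages):
    """Keyword Effectiveness Index: popularity squared over competition."""
    if competing_pages == 0:
        return float("inf")          # no competition at all
    return monthly_searches ** 2 / competing_pages

# The same popularity scores higher when fewer pages compete for it.
print(kei(5000, 200_000))    # -> 125.0
print(kei(5000, 2_000_000))  # -> 12.5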
Advantages of the Keyword Effectiveness Index
Webmasters and experts in search engine optimisation have found KEI useful in
selecting valuable keywords. The following are some of the advantages of using KEI for
enhancing your portal:
It supports you in recognising efficient keywords so that you can increase your traffic
easily, especially the organic traffic.
KEI also helps you track your actual contenders, especially those who have
enhanced their pages with keywords similar to yours. Tracking the actual contenders
gives you information about all the sites that are seen in a regular search, the websites
that appear in a phrase search and those that carry the keyword in the title page. This
in turn helps you understand the correct outcome for KEI.
You can attract users to a webpage which has appropriate information about your
business and draw the relevant clients to your page.
5. Google Exact Match Domain
To gain additional search engine optimisation benefits, we generally keep track of our
keywords in our domain, and the same thing happens while deciding to purchase a
domain: one would desire to utilise keywords. Names like makemytrip.com,
youtubekids.com and makemoneyonline.com generally help to speed up the rating of
your portal. At the same time, names with words like 'online success', 'magazine' or
'laptop guy' fall into the category of partial match domain names.
A domain whose name exactly matches the search query, and which thereby helps
attract users and connect traffic to your portal, is referred to as an exact match
domain. E.g. buycheapshoesonline.com might drive traffic, but because it is an exact
match it is better to keep away from such names, as a domain name that exactly
matches searches can be regarded as the mark of a spammy portal.
In order to get greater ratings in search engines without the support of good-quality
content on the portal, webmasters have utilised exact match domain names when
purchasing their domains. Google's update in this area has the main objective of
focusing on web portals that maintain low quality and yet rank highly in search
engines just because of the benefit of the keyword used in their domain name.
Google's algorithm tracks exact-match-domain search engine optimisation and
penalises such portals by name.

Recovering from an exact match domain penalty can take a long time, so instead of aiming for a domain identical to your keywords, opt for a partial match domain name by adding a prefix or suffix to the name. Put your efforts into improving the content and promoting your domain with the help of content writers.
Google introduced a filter called the Exact Match Domain (EMD) Update to stop low-quality portals from ranking highly simply because their domain names exactly match search keywords. Every time the EMD filter is refreshed, more portals with exact match domain names and low-quality information get caught.
6. Google Penguin
Google Penguin is a set of algorithm and data updates to the Google search engine, released at regular intervals, that refine how queries are matched to search results. Google introduced the Penguin Update as an effort to reward higher-quality portals and to reduce the search engine result pages (SERP) presence of portals that practised keyword stuffing and manipulative link schemes. It is an essential part of Google's algorithm: Penguin acts as an index-wide filter that applies to all portals.
Although Google Penguin resembles Google Panda and Google Hummingbird, which are also Google algorithm updates, Penguin is specifically aimed at penalising companies and web developers that intentionally try to inflate their search engine ratings through devious SEO strategies.
The Penguin updates mainly aim to stop various kinds of search engine spamming, known as spamdexing or Black Hat SEO, from earning lucrative rewards in the form of top placement in search results. Search engine spam includes practices like link spamming, keyword stuffing, hidden content on web pages, and copyrighted material replicated from higher-rated portals.
The two specific practices targeted by Google Penguin are as follows:
1. Link Schemes
This refers to buying or otherwise procuring backlinks from irrelevant websites to paint a false picture of your popularity, thereby manipulating Google's algorithm just to obtain higher ratings for your portal in searches.
2. Keyword Stuffing
This refers to overloading a page on your portal with a huge number of keywords, or repeated keywords, to manipulate the rating and make the page look relevant to certain search phrases. A simple density check is sketched after this list.
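The sketch below estimates keyword density, the share of a page's words taken up by a target phrase. It is a minimal illustration: the sample text is invented, and the 5% warning threshold is an assumption for demonstration, not a figure published by Google.

    # A minimal keyword-density check. The 5% threshold below is an
    # illustrative assumption, not an official Google figure.
    import re

    def keyword_density(text, phrase):
        """Return the share of the text's words accounted for by the phrase."""
        words = re.findall(r"[a-z']+", text.lower())
        phrase_words = phrase.lower().split()
        if not words:
            return 0.0
        hits = sum(
            words[i:i + len(phrase_words)] == phrase_words
            for i in range(len(words) - len(phrase_words) + 1)
        )
        return hits * len(phrase_words) / len(words)

    page = ("Cheap shoes here. Buy cheap shoes now, because cheap shoes "
            "from our cheap shoes shop are the cheapest cheap shoes.")
    density = keyword_density(page, "cheap shoes")
    print(f"Keyword density: {density:.0%}")
    if density > 0.05:  # hypothetical warning threshold
        print("Warning: this page may look keyword-stuffed.")

On this deliberately stuffed sample the phrase accounts for half of all words, far beyond anything that reads naturally.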

If your ratings or traffic drop on a date associated with a Penguin update, your portal has most likely been hit by Penguin. To recover from Penguin, you can take the following steps:
1. Eliminate unnatural links, whether you built them yourself or arranged for them to be placed on third-party portals.
2. Disavow spammy links that you cannot control; an example disavow file is shown after this list.
3. Review and revise your portal's content so that you do not overdo the SEO: ensure your keywords are not repeated excessively, are not duplicated, and read naturally on the page.
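For step 2, Google accepts a plain-text disavow file uploaded through Search Console. The entries below are illustrative placeholders; the format itself, with comment lines beginning with '#', whole domains given via a 'domain:' prefix, and individual URLs on their own lines, follows Google's documented convention.

    # Disavow file example (illustrative entries only)
    # Ignore every link from this spammy domain:
    domain:spammy-link-farm.example
    # Ignore one specific linking page only:
    http://blog.example.net/cheap-links-page.html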
In short, Google Penguin is Google's answer to persistent vulnerabilities in its algorithm that allowed rankings to be gamed with low-quality links and over-optimised page keywords.

3.5 CHALLENGES IN DESIGNING THE CONTENT


The topics covered in an Advanced SEO course should help you overcome the challenges of SEO so that you can manage it effectively. Some of these challenges are as follows:
1. Approach to SEO
There are different types of SEO approach: the right one depends on the website, or on the products and services of the business, that must be highlighted for competition and conversion. The approach can be content-driven, technically focused or brand-based. Therefore, create your content according to the chosen SEO approach.
2. Improve overall value of brand or business website
An advanced SEO approach is valuable only when the activities and tasks it comprises share the final objective of a substantial increase in the overall quality of the portal or website. Your content should be oriented towards quality and brand building.
3. Analyse the gaps in content
Take stock of your website's content gaps: missing cornerstone content, outdated or unrealistic content, relevant subject areas that have not been covered, and required content that your competitors also lack, since filling these gaps gives you an opportunity to lead the competition. Analyse the gaps, then develop your content to make it competitive and to establish yourself as an industry professional.
4. Smart Solutions
In today's competitive technological world, it is crucial to come up with quick, smart solutions to issues affecting your website and traffic so that you keep pace. Give importance to planning, growth and development, with effective solutions to the issues arising in your Advanced SEO approach. Design content that offers solutions to web SEO problems.
5. Adapt to continuous change in SEO strategies
The SEO service industry develops continuously, so you must keep revising and upgrading your knowledge of the latest trends and approaches in SEO. Learn the various SEO alert tools available, such as SEORadar, which reports the changes made to your portal. Another option is a changelog, a record book maintained by the portal development team. Effective SEO requires you to understand these rapid changes and adapt to them.
6. Relevant Content
Content has a direct relationship with SEO. To generate effective SEO, you need genuine, trustworthy and appealing content that attracts users and increases your traffic and conversion. Incorrect information will cost you website traffic and give your brand and business a bad name. Well-written content will drive your SEO ecosystem.
7. Understand the target audience
When writing content, it is important to understand the type of audience you intend to attract and convert into customers. Complete knowledge of your target audience will help you generate greater traffic and a higher rating for your website. Concentrate on content writing and on the appropriate keywords needed to keep your audience engaged.

Check your Progress 1

Fill in the Blanks.


1. The two specific practices targeted by Google Penguin are ________ and ________.

Summary
• The spider builds a web of indexed pages by scrutinising the HTML and other factors on every page. Search engine spiders, also known as 'crawlers', are generally used by Internet search engines to gather data about portals and specific web pages.

Self-Assessment Questions
1. List the factors of Advanced SEO requirements.
2. What are the challenges in designing the content?

Answers for Check your Progress
Check your Progress 1
Fill in the Blanks.
1. The two specific practices targeted by Google Penguin are Link Schemes and Keyword Stuffing.

Suggested Reading
1. The Truth About Search Engine Optimization by Rebecca Lieb, Que Publishing.
2. Search Engine Visibility (Second Edition) by Shari Thurow, New Riders. ISBN-13: 978-0-321-50324-4.

Unit 4

Off Page Optimisation


Structure
4.1 Introduction
4.2 Importance of Backlinks
4.3 Key Techniques for Off Page Optimisation
4.4 Benefits of Off-Page Optimisation
Summary
Self-Assessment Questions
Answers to Check your Progress
Suggested Readings

Objectives
After going through this unit, you will be able to:
• Understand the importance of Backlinks
• Understand the benefits of off page optimisation

4.1 INTRODUCTION
Off-page optimisation is crucial to your search ranking. Search engines treat it as strong evidence of how well your portal deserves to appear in the results. The overall authority of your portal is determined by the 'opinions' other portals express about it through their links, and this is what off-page optimisation represents. The term also covers all the measures taken outside your actual website to enhance and develop your portal's ranking position.
Off-page optimisation is generally time-consuming where the development and enhancement of a portal is concerned, and it is a long-term process. It consists of obtaining backlinks to your portal's pages from authoritative web portals in your niche. Backlinks can be regarded as the currency of the off-page approach.
Compared with on-page optimisation, the strengths of off-page optimisation are not directly visible on the portal's pages: it performs background work that improves and develops your search results.
The latest Penguin updates and Google Panda have radically changed the landscape of off-page optimisation, which in turn has affected many web portals and their high page ratings. Many old-school practices have become outdated as a result.

4.2 IMPORTANCE OF BACKLINKS


Links pointing back from outside portals to your portal, specifically web pages that link to a page on your portal, are called backlinks. The overall strength, consistency and relevance of your portal's backlink profile is determined by the backlinks received from outside portals. The profile can consist of many links from the same referring portal or links from many different referring portals. Generally, referring portals will link to your portal if your content is significant, authoritative or helpful to them; this is how backlinks accumulate around significant content that other portals want to be associated with.
Backlinks from irrelevant portals will damage your standing in search engine results and can even lead to a penalty for your portal. It is therefore essential to obtain backlinks from portals that are relevant to yours.
Backlinks matter both to search engines and to users. From the user's point of view, backlinks are useful because they lead to different sources of information on similar or related topics. For search engines, they help establish the relevance and worth of a page. The quality of the portals linking to yours matters more than the sheer number of backlinks you receive, and links from different portals generally carry different weight.
For search engines today, link reputation is the key element behind a good search experience and a good portal rating.
Off-page optimisation is connected to link building, but it also covers promotional activity beyond your portal's design that is required for a higher rating in the search results.
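To see how links are discovered in practice, the short sketch below fetches a page and lists its outbound links, which is essentially how a crawler finds the backlinks pointing from one portal to another. The URL is a placeholder, and the sketch assumes the third-party requests and beautifulsoup4 packages are installed.

    # A minimal sketch of how a crawler discovers links on a page.
    # Assumes: pip install requests beautifulsoup4. The URL is a placeholder.
    import requests
    from bs4 import BeautifulSoup
    from urllib.parse import urljoin, urlparse

    page_url = "https://www.example.com/"  # placeholder portal
    html = requests.get(page_url, timeout=10).text
    soup = BeautifulSoup(html, "html.parser")

    for a in soup.find_all("a", href=True):
        target = urljoin(page_url, a["href"])  # resolve relative links
        if urlparse(target).netloc != urlparse(page_url).netloc:
            # An outbound link: from the target site's point of view,
            # this is a backlink received from page_url.
            print(target)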

4.3 KEY TECHNIQUES FOR OFF PAGE OPTIMISATION


The most crucial activities under off-page SEO that need to be carried out for your portal are as follows:
1. Building links
2. Social Media Marketing
3. Social bookmarking
1. Building links
Today, link building is a widely accepted and efficient method. By building external links to your portal you can outperform other portals and competitors and earn higher rankings.
For example, when a user likes the content of a particular page on your portal, he/she may mention it on a website or blog, and in turn the search engines learn that the page carries relevant information.
Some of the well-known ways of generating a greater number of links are:
a. Submitting to blog directories, which include a link back to your portal.
b. Writing on forums and using forum signatures to link back to your portal.
c. Commenting on other portals to get a link back.
d. Exchanging links with other webmasters, including the three-way exchange, e.g. your portal links to a second portal while a third portal links back to yours.
e. Publishing articles in article directories.
f. Publishing your material in other shared directories that permit a link back to your portal.
g. Guest blogging, which adds value when it earns you a link back.
2. Publicise content through Social Media Marketing
Social media is a form of link building and is generally considered 'off-site' optimisation. Even though links from social media portals are usually 'nofollow', they still carry weight for link building.
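In HTML, the 'nofollow' marker is simply a rel attribute on the anchor tag; a link like the one below (the URL is a placeholder) asks search engines not to pass ranking credit through it.

    <!-- A nofollow link: the href is a placeholder. Search engines are
         asked not to pass ranking credit through this link. -->
    <a href="https://www.example.com/" rel="nofollow">Visit our portal</a>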
Link building through social media marketing can strengthen your off-page profile. It is necessary to build durable and active social media profiles to enhance your page rating.
Today, everyone is inclined to use social metrics and social engagement to boost their portal's rating. Although Google gives great importance to backlinks, it also pays attention to the social signals circulating around your content. Your portal's rating is directly connected to its social footprint: your search engine optimisation benefits from social activity, whether measured by the number of people who tweet, socially bookmark or vote for your content through Google Plus.
Ensure you publicise your content on different social media channels to gain from them. Keep your customers connected through multiple platforms so that you reach the maximum number of users.
The various platforms you can make use of are as follows:
a. Facebook
Facebook is one of the foremost social media platforms, with the highest count of committed users. The platform is generally not meant for direct endorsement; rather, it is an easy way of attracting users' attention so they understand and learn about your product. Another useful method is to form a group on Facebook.
You can design a page for your business and share portal URLs relevant to that page. Users come to Facebook to socialise informally, so general topics and photos get the most attention and can be liked or shared.
b. Twitter and LinkedIn
Content can be circulated on both Twitter and LinkedIn, two well-known social platforms for online broadcasting. Only registered users of these portals can post content or message each other.
Content is circulated as articles, videos, images or simple status updates. On LinkedIn you can join appropriate groups and post your articles.
On Twitter you can create a branded profile for your business. Since LinkedIn is more focused on business-oriented social networking, you can design an effective profile there for your off-page optimisation.
c. Google Plus
As Google is the leading search engine provider, it is important for you to publish well-drafted content on Google Plus. Google introduced a feature called Google Authorship, which displays the writer's name and photo alongside his/her articles in Google's search results.
Articles that become popular earn more credit for relevance and achieve higher rankings.
d. Pinterest
Pinterest has become the most modern social media phenomenon: an image-sharing platform built on the social network model. With this image-sharing portal you can connect with targeted users and attract their attention. It permits you to share the pictures from your posts, and if your portal is based on e-commerce, Pinterest can increase your sales.
Question-and-answer platforms serve a similar engagement purpose: users visit them looking for clarification of various problems and questions. You can contribute your information or views to any conversation appropriate to you, and readers can vote your contribution up or down. This can help you establish your expertise in your subject area.
e. Sharing Video
You can create videos on relevant topics of your choice and post them on portals that offer free video hosting, such as YouTube and Vimeo. By sharing videos on these portals, you can reach thousands of users every day.
3. Connect through Social Bookmarking
This is one of the traditional forms of search engine optimisation; it was of great help even before the Panda and Penguin era. People have long used renowned sites like Tumblr, StumbleUpon, Reddit, Delicious and Digg to find content of various kinds. All the content you create can be shared by adding a social bookmarking panel to your portal. You can also contribute to discussions and regularly show that you are engaged and connected.
Social bookmarking sites are online services that permit users to add, edit and share bookmarks of web documents; they help you organise and store information online. If you bookmark in the right manner, you can see successful outcomes within a day of indexing. Users are likely to revisit your portal if you add useful information and attractive offers.

4.4 BENEFITS OF OFF PAGE OPTIMISATION


The benefits of off-page optimisation are as follows:
a. Off-page optimisation is a fundamental application of SEO techniques outside your site; its main objective is to get the best ranking for your keywords and attract the maximum number of users who will remain engaged with your portal.
b. It aims at earning the maximum number of links back, through different methods of link building, for a given set of keywords.
c. To promote your products and services, your portal must publicise information about the business effectively.
d. Visitors must find all the information related to your business so that they are positively influenced to connect with you, whether by purchasing your product or by engaging with your portal's services through online marketing.
e. Off-page optimisation enhances your portal's ratings, gives quick results in search and adds fame to your business.
f. Off-page factors drive your referral traffic and your prominence in search engines.
g. Google updates such as Panda, which collate user behaviour, feedback and satisfaction levels, are crucial in helping search engines understand visitors more precisely.

Check your Progress 1

State True or False.


1. Off-page optimisation performs background activities so as to improve and develop your search results.
2. Off-page optimisation aims at getting the minimum number of links back through different methods of link building for a given set of keywords.

Summary
• Your portal's rating is directly connected to its social footprint: your search engine optimisation benefits from social activity, whether measured by the number of people who tweet, socially bookmark or vote for your content through Google Plus.

Self-Assessment Questions
1. Explain the importance of Backlinks.
2. What are the benefits of off page optimisation?

Answers for Check your Progress


Check your Progress 1
State True or False.
1. True
2. False

Suggested Readings
1. The Truth About Search Engine Optimization by Rebecca Lieb, Que Publishing.
2. Search Engine Visibility (Second Edition) by Shari Thurow, New Riders. ISBN-13: 978-0-321-50324-4.
