
We will be talking about technical SEO next.

You are going to learn about everything related to technical SEO: crawling and
indexing, XML sitemaps, duplicate content, structured data, hreflang and lots more.

Technical SEO is one of the most important parts of search engine optimisation, and you
need to make sure that your website is fully optimised so that Google can crawl it.

You need to make sure that your technical SEO is up to speed, and this guide will help you
achieve that.
We are going to start with technical SEO fundamentals: specifically, we will cover what
technical SEO is, why it is so important in 2020, and what is and isn't considered
technical SEO.
So what exactly is technical SEO? To summarise: technical SEO is the process of ensuring
that your website meets the technical requirements of search engines in order to improve
organic traffic. Some important elements of technical SEO include crawling and indexing,
rendering and website architecture.

So why is technical SEO important? You can have the best website with the best content, but
if your technical SEO is messed up, you are not going to perform well in the search rankings.
To put it simply, Google needs to be able to find, crawl and index the pages of your website
before it can rank them, which is why technical SEO is so necessary.
But that's just scratching the surface. Even if your site's pages are indexed, that doesn't
mean your work is done. A lot of other factors go into improving the technical SEO of your
website: your site needs to be fully optimised, free of duplicate content, fast loading,
mobile friendly, and much more besides.

You don't have to go into excessive detail, but the easier you make it for Google to access
your content, the better your chance to rank.

So how can you improve your technical SEO? Some of the factors you can work on include
JavaScript, XML sitemaps, site architecture, URL structure, structured data, duplicate
content, hreflang, canonical tags, 404 pages and 301 redirects, and even that is not all.
Fortunately, you are going to read about all of them and more in this guide.

Let's start with site architecture. Architecture should be the first step of your
technical SEO campaign, even before crawling and indexing.
This is important because most crawling and indexing issues happen due to poorly
structured websites, so by improving your site architecture you can reduce your site's
error rate and largely avoid indexing issues.
Site architecture also influences almost everything you do on your site, from your URLs to
your sitemap, so it should be the thing you start your campaign with.
Focusing on site architecture will make the rest of your SEO tasks a lot easier.

So how do we start?
The first thing is to use a flat, organised site structure.
So what exactly is site structure?
It is how all of the pages on your website are organised, and it is always better to have a
flat site architecture. To put it simply, all of your site's pages should be only a few links
away from one another.
This is important because, structured this way, it becomes far easier for Google to crawl
your site and index your pages.
You also have to keep your site organised rather than messy. A messy structure usually
creates orphan pages, which are pages with no internal links pointing to them, and those
pages are hard for search engines to find and index.
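Both ideas above, click depth and orphan pages, are easy to check once you think of your site as an internal-link graph. Here is a minimal sketch in Python, assuming you have already collected each page's internal links (for example from a crawl); the function names and the graph format are my own for illustration:

```python
from collections import deque

def click_depths(links, home="/"):
    """Breadth-first search over an internal-link graph
    ({page: set of pages it links to}). Returns the click depth
    of every page reachable from the home page."""
    depths = {home: 0}
    queue = deque([home])
    while queue:
        page = queue.popleft()
        for target in sorted(links.get(page, ())):
            if target not in depths:
                depths[target] = depths[page] + 1
                queue.append(target)
    return depths

def orphan_pages(all_pages, links, home="/"):
    """Pages that exist (e.g. listed in your sitemap) but cannot be
    reached by following internal links from the home page."""
    return set(all_pages) - set(click_depths(links, home))
```

With a flat architecture, every value returned by `click_depths` should stay small (three to four at most), and `orphan_pages` should come back empty.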

The second thing is to have a consistent URL structure.

You don't have to overthink it; you just have to follow a consistent, logical URL structure,
for example example.com/category/subcategory/page. It helps people make sense of where they
are on your site, and the extra information in the URL gives Google additional context,
which helps your site.

One very important thing is breadcrumb navigation. Breadcrumbs are super SEO friendly
because they automatically add internal links to category and sub-pages on your website,
which helps organise your site structure.
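Breadcrumbs can also be marked up as schema.org structured data so search engines understand the trail. As a sketch, this hypothetical helper builds BreadcrumbList JSON-LD from (name, url) pairs, ready to drop into a `<script type="application/ld+json">` tag:

```python
import json

def breadcrumb_jsonld(trail):
    """Build schema.org BreadcrumbList JSON-LD from (name, url) pairs.
    Illustrative helper; positions are 1-based per the schema.org spec."""
    return json.dumps({
        "@context": "https://schema.org",
        "@type": "BreadcrumbList",
        "itemListElement": [
            {"@type": "ListItem", "position": i, "name": name, "item": url}
            for i, (name, url) in enumerate(trail, start=1)
        ],
    }, indent=2)
```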

The next topic is crawling, rendering and indexing.

You will learn how to fix crawl errors and how to send search engine spiders to deep
pages on your website.

First, you have to spot the indexing issues on your website.

You can do this in three ways.
The first is the Coverage report in Google Search Console. This report lets you know if
Google is unable to fully index the pages which you want indexed.
Another option is a tool called Screaming Frog, one of the best-known crawlers around,
which you can use to crawl your whole website.
You can also use Ahrefs' Site Audit tool, which reports on the overall technical health of
your website.

The biggest indexing problem in technical SEO is getting the deep pages of your website
indexed.
A flat site architecture usually prevents this issue, because with a flat architecture your
deepest page will only be three to four clicks away from your home page.
Google has stated that XML sitemaps are its second most important source for finding URLs,
so you should double-check that your sitemap is in good shape. You can do this by going to
Google Search Console and looking at Sitemaps.
You can also use Google Search Console to inspect whether the URLs on your website are
getting indexed or not.
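A quick sanity check on a sitemap is to pull out every URL it lists and compare that list against what you expect to be indexed. The sitemap XML format is standardised (sitemaps.org), so the Python standard library is enough; this sketch takes the sitemap as a string you have already downloaded:

```python
import xml.etree.ElementTree as ET

# Namespace from the sitemaps.org 0.9 protocol.
SITEMAP_NS = "{http://www.sitemaps.org/schemas/sitemap/0.9}"

def sitemap_urls(xml_text):
    """Return every <loc> URL listed in an XML sitemap string."""
    root = ET.fromstring(xml_text)
    return [loc.text.strip() for loc in root.iter(SITEMAP_NS + "loc")]
```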

Another really important thing is to avoid thin and duplicate content on your website. If
your website is filled with unique content, you probably don't need to worry about this;
however, duplicate content can be present on your site without you realising it, and the
same goes for thin content. Both can hurt your site's rankings overall, so they are worth
finding and fixing.
You are going to learn exactly how to fix this issue in this section.

You can use a tool to check your website for duplicate content, and there are two that are
really great.
The first is Raven Tools' Site Auditor, which can scan your website for duplicate or thin
content.
You can also use Ahrefs' Site Audit tool to check your site for duplicate content.

You'll also have to check that your site is not reusing content from other sites, which you
can do with a tool called Copyscape.

Most sites are going to have pages with some amount of duplicate content in them, which is
okay.

However, if those pages are getting indexed, that can be problematic.
You can solve this by adding a noindex tag to those pages, which tells Google and other
search engines not to index them.
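The noindex directive usually lives in a `<meta name="robots">` tag in the page's `<head>`. A small sketch, using only the standard library, for checking whether a page's HTML carries it (the class and function names are mine):

```python
from html.parser import HTMLParser

class RobotsMetaParser(HTMLParser):
    """Collect the directives from any <meta name="robots"> tag."""
    def __init__(self):
        super().__init__()
        self.directives = []

    def handle_starttag(self, tag, attrs):
        attr = dict(attrs)
        if tag == "meta" and attr.get("name", "").lower() == "robots":
            self.directives += [d.strip().lower()
                                for d in attr.get("content", "").split(",")]

def is_noindexed(html):
    """True if the page asks search engines not to index it."""
    parser = RobotsMetaParser()
    parser.feed(html)
    return "noindex" in parser.directives
```

Note that noindex can also be sent as an `X-Robots-Tag` HTTP header, which this HTML-only check would miss.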

There is one more option for solving duplicate content, which is canonical URLs.
Canonical tags are used for pages which have very similar content with only minor
differences, as often happens on e-commerce sites.
A canonical tag helps Google index only the main version of a page and ignore the
near-copies.
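The classic e-commerce case is product variants (colour, size) that each get their own URL. A sketch of the idea, with hypothetical helper names: group the variant URLs, pick one canonical per group, and emit the corresponding `<link rel="canonical">` element for each page:

```python
def canonical_map(variant_groups):
    """Map every URL variant to its group's canonical URL.
    variant_groups: a list of URL lists, canonical version listed first."""
    mapping = {}
    for group in variant_groups:
        for url in group:
            mapping[url] = group[0]
    return mapping

def canonical_tag(url, mapping):
    """The <link> element that points search engines at the canonical page."""
    return f'<link rel="canonical" href="{mapping.get(url, url)}">'
```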

Next we'll talk about one really important issue: website speed.
By improving your page speed you can directly impact your website's rankings, and it can
make a significant difference to your organic traffic.

There are three ways in which you can do that.

The first is reducing your webpage size.
In one study, we found that a page's total size correlated with load times more than any
other factor, which means that compressing images and caching your site will do little if
your pages are huge.
So reduce the size of your webpages to improve site speed. You also have to set up your
CDN correctly, which you can check at webpagetest.org.

Each third-party script adds an average of 34 ms to a page's load time, so eliminating
unnecessary third-party scripts helps as well.
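Since total page size is the dominant factor, it is worth tracking your page weight against a budget. A minimal sketch, assuming you have the raw bytes of each asset on hand; the 500 KB budget is a made-up figure for illustration, not a Google threshold:

```python
import gzip

PAGE_WEIGHT_BUDGET = 500 * 1024  # hypothetical 500 KB compressed budget

def page_weight(assets):
    """assets: {filename: raw bytes}. Report raw and gzipped totals and
    whether the page fits the assumed weight budget (most servers gzip
    text responses, so the compressed size approximates transfer size)."""
    raw = sum(len(data) for data in assets.values())
    gzipped = sum(len(gzip.compress(data)) for data in assets.values())
    return {"raw": raw, "gzipped": gzipped,
            "within_budget": gzipped <= PAGE_WEIGHT_BUDGET}
```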

Now we will share some extra technical SEO tips.


If your site has different versions of a page for different countries and languages, then
the hreflang tag can be a huge help.
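hreflang annotations are `<link rel="alternate">` elements in each page's `<head>`, one per language version. A sketch of generating them from a language-to-URL mapping (the function name is mine; note that real hreflang tags must be reciprocal, i.e. every listed version links back to all the others):

```python
def hreflang_tags(alternates, x_default=None):
    """Build <link rel="alternate" hreflang="..."> tags from a
    {language_code: url} mapping, optionally with an x-default fallback."""
    tags = [f'<link rel="alternate" hreflang="{lang}" href="{url}">'
            for lang, url in sorted(alternates.items())]
    if x_default:
        tags.append(
            f'<link rel="alternate" hreflang="x-default" href="{x_default}">')
    return "\n".join(tags)
```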
Also, having a few dead links on your website is not as bad as having broken internal
links. Broken internal links make it harder for Googlebot to find and crawl your website's
pages.
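If you already have a crawl of your site, broken internal links fall out of the same link-graph data used earlier: any link whose target is not a known page. A sketch (graph format and function name are mine):

```python
def broken_internal_links(pages):
    """pages: {url: set of internal URLs that page links to}.
    Returns (source, target) pairs where the target page doesn't exist."""
    known = set(pages)
    return [(source, target)
            for source, targets in pages.items()
            for target in sorted(targets)
            if target not in known]
```

In practice you would also want to check HTTP status codes, since a page can exist but return a 404 or redirect chain.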
Structured data is another trend: setting it up can increase your click-through rate and
give some of your pages rich snippets, which will indirectly help your SEO efforts.
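As one example of markup that can earn a rich result, here is a sketch that builds schema.org FAQPage JSON-LD, one of the structured-data types eligible for rich snippets (the helper name is mine, and eligibility is ultimately up to Google):

```python
import json

def faq_jsonld(qa_pairs):
    """schema.org FAQPage JSON-LD built from (question, answer) pairs,
    for embedding in a <script type="application/ld+json"> tag."""
    return json.dumps({
        "@context": "https://schema.org",
        "@type": "FAQPage",
        "mainEntity": [
            {"@type": "Question", "name": question,
             "acceptedAnswer": {"@type": "Answer", "text": answer}}
            for question, answer in qa_pairs
        ],
    })
```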

Being mobile friendly is a really important aspect of technical SEO, and your website's
rankings can decrease if it is not mobile friendly.

That has been a summary of technical SEO. Now we will look at some case studies so you can
see how it can help your SEO efforts.

Case study 1
When Felix Norton audited one of his clients' websites for technical issues, he found that
there were no internal links.
The client was publishing a ton of high-quality content, but traffic and rankings were not
improving. So Felix decided to add internal links to their high-priority pages, content
pieces, product pages and pieces of related content.
This produced a 250% traffic boost within a week of adding those internal links.

You can read more case studies in the blogs listed in the resource section, or by watching
the videos.
