
SEO Platform migration checklist (to M2)

DEVELOPING THE NEW WEBSITE

This checklist applies to all e-commerce stores being migrated to M2. The weight given to these guidelines depends on the store's share of organic
traffic (can be found in GA):

60-100% - very important. If SEO guidelines and best practices are not followed, the client's e-commerce store performance will be
significantly affected after go-live.
20-50% - important. If SEO guidelines and best practices are not followed, the client's e-commerce store performance will be affected
after go-live.
0-20% - beneficial. The migration will benefit from following these guidelines. If they are not followed, the performance of the store will
most probably be partly affected.

Please note! Following these guidelines doesn't guarantee an organic traffic boost after go-live. These are mainly precautions to avoid significant
traffic loss after go-live.

Suggestions listed here are similar to the ones included in the SEO audit offer (sample proposal and sample audit). By following these guidelines
from the beginning of the project, you can avoid implementing changes close to go-live, for example changing the URL structure at the
last minute.

There might be exceptions where specific points don't apply to a particular project, or there might be other issues affecting SEO present on the
website which are not listed here. Therefore, if the website relies on organic traffic and the client doesn't already work with an SEO agency, it's
still suggested to involve the MAAS team to cross-check the SEO setup.

DATA MIGRATION

high importance

What data to migrate from SEO perspective?

If no in-depth SEO analysis (keyword research) was done before the start of the project, then the safest practice from an SEO perspective is to
migrate all content (as many products as possible and as much actual text as possible). Pay extra attention to the following elements:

Meta titles (product pages, CMS pages, homepage, categories etc.)

Meta descriptions (product pages, CMS pages, homepage, categories etc.)

Product descriptions - cutting product descriptions may lead to ranking and traffic loss

Category descriptions

Reviews (Google values reviews)

Note! It makes sense to migrate such attributes if they are present and optimized in the previous website version (this especially applies to meta
titles and meta descriptions). To evaluate the overall state of the current website's on-page SEO, run a crawl using Screaming Frog or
another SEO spider. That's how you will learn whether unique meta titles and descriptions are present.

How to check: To check the SEO setup of the current website in bulk, use Screaming Frog SEO Spider. After crawling the website you will
get the full list of URLs and the corresponding meta titles and descriptions. Ask the MAAS team to perform the crawl for large websites (the SF
free version crawls only 500 URLs).
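The attributes a crawler extracts per page can also be pulled ad hoc. Below is a minimal sketch, using only the Python standard library, of extracting the meta title and meta description from a page's HTML - the two fields to review in bulk. The HTML sample and store names are illustrative, not from a real project.

```python
from html.parser import HTMLParser

class MetaExtractor(HTMLParser):
    """Collects the <title> text and the meta description of one page."""

    def __init__(self):
        super().__init__()
        self.title = ""
        self.description = ""
        self._in_title = False

    def handle_starttag(self, tag, attrs):
        attrs = dict(attrs)
        if tag == "title":
            self._in_title = True
        elif tag == "meta" and attrs.get("name", "").lower() == "description":
            self.description = attrs.get("content", "")

    def handle_endtag(self, tag):
        if tag == "title":
            self._in_title = False

    def handle_data(self, data):
        if self._in_title:
            self.title += data

# Illustrative page HTML (in practice, fetched per URL from the crawl list)
html = """<html><head>
<title>Green Key - Example Store</title>
<meta name="description" content="Buy the green key online.">
</head><body></body></html>"""

parser = MetaExtractor()
parser.feed(html)
print(parser.title)
print(parser.description)
```

Pages with an empty title or description in the output are the ones missing optimized attributes.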

URL STRUCTURE

high importance

What to take into consideration when developing the URL structure for the new website?

First of all, if the older website version has a good URL structure matching the requirements below, try to use the same approach on the new website.
That way, we will avoid setting a lot of 301 redirects and Google rankings will fluctuate less.

Following approach is suggested when developing URL structure:

It should be consistent - either all URLs have .html at the end or none do; all URLs should end with /, or all should end without /.
Product URLs should be descriptive and represent the name of the product - www.example.com/139230 as a product URL is not
acceptable; www.example.com/green-key is clear and descriptive from both an SEO and a UX perspective.

Subcategories should have separate pages instead of filters unless decided otherwise, e.g., greenrabbit.com/meat/rabbit instead of
greenrabbit.com/meat?cat=4

URLs should be short, with no unnecessary directories, e.g., greenrabbit.com/categories/products/brown-sugar should become greenrabbit.com
/brown-sugar. For Magento it's typical to have an /index.php/ directory in URLs - it should be removed.

Hyphens are the preferred word separator

How to check: To check all URLs in bulk, use Screaming Frog SEO Spider. It's suggested to crawl both the current and the staging website. Ask
the MAAS team to perform the crawl for large websites (the SF free version crawls only 500 URLs).
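The consistency rules above can also be checked automatically over a crawl export. This is a sketch under the assumption that the URL list comes from a Screaming Frog export; the example URLs are made up.

```python
from urllib.parse import urlparse

def url_issues(urls):
    """Flags the URL-structure issues from the checklist: mixed .html usage,
    mixed trailing slashes, and the default Magento /index.php/ directory."""
    issues = []
    paths = [urlparse(u).path for u in urls]
    # Do all non-root paths agree on ending (or not ending) with .html?
    html_endings = {p.endswith(".html") for p in paths if p not in ("", "/")}
    # Among non-.html paths, do all agree on the trailing slash?
    slash_endings = {p.endswith("/") for p in paths
                     if p not in ("", "/") and not p.endswith(".html")}
    if len(html_endings) > 1:
        issues.append("inconsistent .html suffix")
    if len(slash_endings) > 1:
        issues.append("inconsistent trailing slash")
    if any("/index.php/" in p for p in paths):
        issues.append("/index.php/ present in URLs")
    return issues

urls = [
    "https://greenrabbit.com/brown-sugar.html",
    "https://greenrabbit.com/meat/rabbit",          # no .html -> inconsistent
    "https://greenrabbit.com/index.php/green-key",  # default Magento directory
]
print(url_issues(urls))
```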

SERVER SIDE REDIRECTS

high importance

The website should have server-side redirects in place. For example, if the website's preferred version is https://www.example.com, the
following 301 redirects should be set:
- https://example.com to https://www.example.com
- http://www.example.com to https://www.example.com
- http://example.com to https://www.example.com

To avoid creating an additional 301 redirects mapping, it's suggested to stick to the same preferred domain as the old website, as overall it
doesn't matter whether the preferred version is https://www.example.com or https://example.com.

Note: ensure there is no duplicated content due to two versions of URLs being available, with and without a trailing slash - force one
of them (based on the setup of the previous website version).

Additionally, make sure that all alternative home page URLs are redirected to the preferred domain, e.g., https://greenrabbit.com/homepage,
https://greenrabbit.com/index

How to check: For checking redirect chains on homepages, use the Ayima Redirect Path Chrome extension - free and user-friendly. Alternatively,
use the Chrome developer console - Network tab.
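The variant check above can be scripted: derive the origin variants that must 301 to the preferred version, then compare them against the redirect targets observed with curl or a crawler. A sketch with illustrative observed data (in practice the `observed` dict would be filled from real responses):

```python
from urllib.parse import urlparse

def required_variants(preferred):
    """All http/https, www/non-www origin variants that must 301 to `preferred`."""
    host = urlparse(preferred).netloc
    other_host = host[4:] if host.startswith("www.") else "www." + host
    return sorted({
        f"http://{host}",
        f"http://{other_host}",
        f"https://{other_host}",
    } - {preferred})

def misconfigured(preferred, observed):
    # observed maps variant origin -> final URL it redirects to
    return [v for v in required_variants(preferred) if observed.get(v) != preferred]

preferred = "https://www.example.com"
observed = {
    "http://www.example.com": "https://www.example.com",
    "http://example.com": "https://www.example.com",
    "https://example.com": "https://example.com",  # missing redirect
}
print(misconfigured(preferred, observed))
```

Any origin in the output needs a 301 added before go-live.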

301 REDIRECTS MAPPING

high importance

301 redirect mapping is a must-do when the URL structure is changing. A separate guide on 301 redirects
mapping is available.
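Before the map goes into the BE, it's worth sanity-checking it for chains and self-redirects, since every extra hop dilutes the redirect. A minimal sketch over an old-URL -> new-URL map; the paths are illustrative.

```python
def map_problems(redirects):
    """Flags self-redirects and chains (a target that is itself redirected)
    in a 301 redirect map of old URL -> new URL."""
    problems = []
    for old, new in redirects.items():
        if old == new:
            problems.append(f"self-redirect: {old}")
        elif new in redirects:
            problems.append(f"chain: {old} -> {new} -> {redirects[new]}")
    return problems

redirects = {
    "/old-category/green-key": "/green-key",
    "/green-key": "/keys/green-key",   # makes the first entry a chain
    "/brown-sugar": "/brown-sugar",    # self-redirect
}
print(map_problems(redirects))
```

Chains should be collapsed so every old URL points directly at its final destination.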

SEO ATTRIBUTES

high importance

Prev/Next tag: For paginated pages, URL parameters should be implemented (?p=2, ?p=3 etc.) and the Prev/Next meta tags should be
implemented for all paginated pages. See the Confluence article Infinite scroll and SEO for more details.

Canonicals: Unique valuable URLs without parameters have to have rel="canonical" pointing to themselves; URLs with parameters
should point to clean URLs (except paginated pages - e.g., the canonical of greenrabbit.com/meat?p=2 should point to itself).

Noindex/nofollow: Customer account and login related pages should be set to noindex, nofollow. If there will be many filters on category
pages, they should also be set to noindex, nofollow, but the final setup will need to be reviewed once the overall structure of the website is more
or less clear.

H1: There should be an H1 title implemented on each page - 1 per page, not more; H2-H6 can appear multiple times per page.

How to check: Screaming Frog reports canonicals, meta robots and H1s for every crawled URL, so these attributes can be reviewed in bulk after a crawl.
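The canonical rules above can be expressed as a small function for QA: a clean URL canonicalises to itself, a parameterised URL points to its clean version, and pagination (?p=N) stays self-referencing. A sketch under the assumption that `p` is the only pagination parameter:

```python
from urllib.parse import urlparse, parse_qs

def canonical_target(url):
    """Returns the URL that rel="canonical" should point to, per the rules above."""
    parts = urlparse(url)
    params = parse_qs(parts.query)
    if not params or set(params) == {"p"}:
        return url  # clean URL or paginated page: self-referencing canonical
    return parts._replace(query="").geturl()  # parameterised: point to clean URL

print(canonical_target("https://greenrabbit.com/meat"))            # itself
print(canonical_target("https://greenrabbit.com/meat?p=2"))        # itself
print(canonical_target("https://greenrabbit.com/meat?color=red"))  # clean URL
```

During QA, compare the canonical extracted from each crawled page against `canonical_target` of its URL.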

INTERNATIONAL SEO

high importance
If we are dealing with an international website which has several languages, or different store views for different countries speaking the same
language, special precautions should be taken to ensure that organic traffic doesn't drop after the migration.

REQUIREMENTS TO BE ADDED

MOBILE VERSUS DESKTOP

high importance

It's preferred that the content for mobile and desktop matches 100%. Google might choose which version to index. If mobile-first indexing is
enabled and there is content missing on mobile, that content won't be indexed at all.

If we have different pages for mobile and desktop (similar content, different URLs):

On the desktop URL, implement a link tag with a rel="alternate" attribute that points to the mobile version in the href attribute.

On the mobile URL, implement a link tag with a rel="canonical" attribute that points to the desktop version in the href attribute.

Also, the Vary HTTP header should be implemented (more info: https://developers.google.com/web/fundamentals/discovery/search-optimization/)
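To make the annotation pair concrete, here is a sketch that generates the two link tags for a given page. The m. subdomain and the 640px media query are assumptions commonly used in separate-mobile-URL setups; adjust them to the project's actual mobile URLs.

```python
def desktop_alternate_tag(mobile_url):
    """Tag for the desktop page, pointing Google at its mobile counterpart."""
    media = "only screen and (max-width: 640px)"
    return f'<link rel="alternate" media="{media}" href="{mobile_url}">'

def mobile_canonical_tag(desktop_url):
    """Tag for the mobile page, pointing back at the desktop version."""
    return f'<link rel="canonical" href="{desktop_url}">'

print(desktop_alternate_tag("https://m.example.com/green-key"))
print(mobile_canonical_tag("https://www.example.com/green-key"))
```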

DATA MARKUP

medium importance

Structured data markup has to be implemented, and OG (Open Graph) tags are suggested. The structured data markup will be defined after the designs and PDP fields
are confirmed by MAAS. Overall guidelines are available on a separate Confluence page:

Structured data implementation guidelines: Structured Data Markup

How to check: Use Google Structured Data Testing Tool for debugging
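For orientation, this is a minimal Product object serialised as JSON-LD, the format the testing tool validates when embedded in a script tag of type application/ld+json. All field values are placeholders - the actual schema is defined only after the PDP fields are confirmed by MAAS.

```python
import json

# Placeholder Product markup; real values come from the PDP once confirmed
product = {
    "@context": "https://schema.org",
    "@type": "Product",
    "name": "Green Key",
    "description": "Buy the green key online.",
    "offers": {
        "@type": "Offer",
        "price": "9.99",
        "priceCurrency": "EUR",
        "availability": "https://schema.org/InStock",
    },
}
json_ld = json.dumps(product, indent=2)
print(json_ld)
```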

RESPONSE CODES

medium importance

Avoid internal 301 redirects and 404 errors - links should point directly to URLs with a 200 status code.

How to check: Within Screaming Frog it's possible to generate reports marking the errors and source URL where errors appeared

XML SITEMAP

medium importance

Available at a separate URL, e.g., greenrabbit.com/sitemap.xml.

Generated daily.

Consisting only of canonical URLs which are set to index, follow and are not blocked in robots.txt.

Should include PDPs, categories and subcategories (to be defined), CMS pages, the homepage, and images (optional but preferred).

Built according to the sitemap protocol: https://www.sitemaps.org/index.html

How to check: Any crucial errors will be reported by Google Search Console. Screaming Frog can be used to crawl the sitemap and debug 301
redirects included in it.
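The sitemap URLs can be extracted with the standard library and then fed to a crawler (e.g. Screaming Frog in list mode) to catch 301s and noindexed pages inside the sitemap. The XML below is a minimal illustrative example, not a real store's sitemap.

```python
import xml.etree.ElementTree as ET

SITEMAP_XML = """<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url><loc>https://greenrabbit.com/</loc></url>
  <url><loc>https://greenrabbit.com/green-key</loc></url>
</urlset>"""

# The sitemap protocol namespace must be given explicitly to findall
ns = {"sm": "http://www.sitemaps.org/schemas/sitemap/0.9"}
root = ET.fromstring(SITEMAP_XML)
urls = [loc.text for loc in root.findall("sm:url/sm:loc", ns)]
print(urls)
```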

Note! The default Magento XML sitemap is not SEO friendly. To ensure at least minimal flexibility, a custom solution should be developed or an
SEO extension implemented to make sure that it's possible to include and exclude URLs from the sitemap. For more advanced SEO
projects it's suggested to look for an advanced extension.

ROBOTS.TXT
high importance

A super detailed robots.txt doesn't always mean it's a good robots.txt

Should be adjusted individually for a specific project, based on the meta robots setup

Should include a link to a valid XML sitemap

Should not block parameter pages (a typical mistake in M2 robots.txt examples found on the internet)

Should not block JS or CSS files

Should always be placed in the root directory, e.g., https://www.elekcig.de/robots.txt

Should be formatted with line breaks. Sometimes there are issues with this when editing robots.txt directly in the M2 BE. If
robots.txt appears without line breaks, upload it to the root directory as a plain file

Examples of robots.txt for M2 https://www.elekcig.de/robots.txt; https://www.fitnessguru.com/robots.txt; https://vapo.es/robots.txt

Example of robots.txt for M1 https://www.uropenn.fi/robots.txt

How to check: QA of robots.txt can be done by crawling the website (applicable to staging, prelive and live environments). For a live website, QA is
done by reviewing reports in Google Search Console - Blocked Resources, Indexed despite robots.txt, Robots.txt

Note: don't use robots.txt to keep staging websites out of Google's index. Here is an article about ways to exclude staging from the index:
https://searchengineland.com/how-to-keep-staging-development-site-index-286987
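A robots.txt draft can also be QA'd before it ships, using the standard library parser to confirm that CSS/JS and parameter pages stay crawlable while the intended paths are blocked. The rules below are examples, not a recommended robots.txt.

```python
from urllib.robotparser import RobotFileParser

ROBOTS_TXT = """User-agent: *
Disallow: /checkout/
Disallow: /customer/
Sitemap: https://greenrabbit.com/sitemap.xml"""

rp = RobotFileParser()
rp.parse(ROBOTS_TXT.splitlines())

# CSS/JS and parameter pages must remain fetchable; account pages blocked
print(rp.can_fetch("*", "https://greenrabbit.com/styles.css"))  # crawlable
print(rp.can_fetch("*", "https://greenrabbit.com/meat?p=2"))    # crawlable
print(rp.can_fetch("*", "https://greenrabbit.com/checkout/"))   # blocked
```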

UX AND SEO

medium importance

Paginated pages. If infinite scrolling is implemented (an example of such functionality: https://www.lafayette148ny.com/shop/new-arrivals?p=2),
then QA whether the content of the next pages can be accessed by Google. From an SEO perspective the safe choice
is to implement pagination - buttons to switch pages (example: https://www.elekcig.de/aromen) or preload (toggle CSS visibility)

Category pages. Each category page (and subcategory page, if they have unique URLs) should have a 1-2 sentence description,
preferably including the category name and additional keywords

Pop-ups. Preferably allow around 5 seconds before loading any pop-up. Google doesn't like pop-ups which appear right after a page
loads

Cloaking. Users should see the same information the Google bot sees - no hidden titles or body text, no white text on a white background.
That is considered cloaking and might result in penalties

GO-LIVE

Task - Details

Home page redirect test - Does the home page have a rel=canonical tag to its www version? Test homepage redirects (to www.). Test http => https redirects.

Update robots.txt file - The robots.txt file must match the approved doc and NOT include Disallow: /

Crawl the website - Test robots.txt by running the crawl. Let the crawl run in the background and continue with other actions. Meta robots on Home/Category/Subcategory/Product detail pages should be Index, Follow.

Re-generate sitemap - Marketing -> SEO & Search -> Site Map -> Click "Generate"

Test the sitemap - Crawl using SF to check for broken links, 301 redirects etc. Are there CMS pages included? Is the homepage listed as "/" with daily changefreq and 1.0 priority?

Add sitemap.xml to robots file - Include the full URL to sitemap.xml in robots.txt

Verify Bing and SC properties - Add the tag in the Head section of the website

Submit sitemap to SC and Bing

301 manual redirects - Crawl the 301 redirects map. Fetch as Googlebot on 10 random pages from the 301 redirect list.

Analysis of crawled site - Are there any major bugs: 404 errors, 301 redirects, on-page SEO, indexation, canonicals?

AFTER MIGRATION

TO BE UPDATED

TESTS TO RUN

404 errors in Google analytics

Make sure that there is no significant increase in 404 errors. That might be the first sign that the 301 redirect mapping is not working properly and
should be adjusted.

Which pages are reported with 404? GA - All Pages > Page Title > filter by the page title of the 404 page (check it on the live site)

Are there any pages receiving a significant amount of traffic? If yes, check:
Are they linked internally? If the Entrances tab shows 0, then most probably the page is linked internally. Crawl the website to
check where the URL is linked from.

If the number of entrances matches the page views, then most probably the broken URL is linked externally. Check if there is a correct
301 redirect set for this page in the BE. If not, add the redirect.
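The triage above can be sketched as a simple rule over the GA columns. The thresholds are a simplification (real GA numbers rarely match exactly), and the figures below are illustrative.

```python
def classify_404(pageviews, entrances):
    """Classifies a 404 page from its GA metrics, per the rules above."""
    if entrances == 0:
        return "likely linked internally - crawl the site to find the source"
    if entrances == pageviews:
        return "likely linked externally - add a 301 redirect in the BE"
    return "mixed - check both internal links and redirects"

print(classify_404(pageviews=120, entrances=0))
print(classify_404(pageviews=80, entrances=80))
```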
