
Check

Screaming Frog

GWT
Crawl Test

Find out keywords site is ranking for


Tool
Potential custom filters to include:
snippet of GATC to make sure they're tracking every page
YouTube embed snippet to see what videos they're embedding
</iframe> to see how they're using iframes
pageTracker._trackPageview to see if they have remnants of GA's traditional tracking code
All errors
Mozbar > SEOmoz Tools (wrench icon) > Crawl Test or
http://pro.seomoz.org/tools/crawl-test
SEMRush or Keyword Spy
Sterling Staffing Solutions Site Audit
http://www.sterlingstaffingjobs.com/index.php
How many pages are indexed by Google?

When you do a site: search, does the homepage come up first?


Does the site have index bloat?
Do they have any specific crawl issues?

Does the site have mirror sites?


If the site uses mirror sites to reduce server load, are the mirrors noindexed?
Does the site have an xml sitemap (or sitemaps with an index)?

Does the xml sitemap follow proper xml protocol?


Do they need a sitemap generation recommendation?

Are their sitemaps clean?

Are URLs duplicated in the sitemap?

Is a link to the site's xml sitemap or sitemap index in their robots.txt?

Have they submitted sitemaps to Google and Bing?


Do they have all sitemaps they should? Web, mobile, image, video, news

Does their sitemap have more than 50,000 URLs or exceed 10MB unzipped?

Does the site have separate sitemaps for the main categories/sections of the site?

Does the site have an RSS feed?


Have there been dips in crawl?

Do they have query parameters being indexed?


Are there errors in their robots.txt file?

Are they excluding pages they shouldn't in robots.txt?

Are there pages/directories they should include in their robots.txt?

Are they excluding pages they shouldn't w/ noindex?


Are there any pages with a nofollow tag?

Does the site block login and cart pages?

Does the site offer print pages?


If the site offers print pages, do the print pages use CSS or unique URLs?
If the site offers print pages with unique URLs are these pages blocked by the
search engines?
Does the site use a revisit-after tag?

If the site is a blog, does it use a plugin to ping the search engines to let it know it's
updated?
See if your site is redirecting Google IPs

Does the site have page titles that are too long?

Are conversion (thank you) pages noindexed?


Instructions
Search for [site:yoursite.com] in Google
GWT: Google Index > Index Status

Check ratio of pages indexed/pages getting org traffic


Mozbar > SEOmoz Tools (wrench icon) > Crawl Test or
http://pro.seomoz.org/tools/crawl-test
reverseinternet.com
Mozbar >Analyze Page > Page Attributes
Four quick ways to check whether an XML sitemap exists:

1. Check the robots.txt file (located at http://domain.com/robots.txt)


2. Do a site:domain.com inurl:sitemap.xml search in Google. If nothing
is returned, then no sitemap exists.
3. Look at /sitemap.xml or use the SEO Site Tools Chrome extension.
4. Paid: http://www.xml-sitemaps.com/standalone-google-sitemap-
generator.html

http://www.sitemaps.org/protocol.html
Tools:
http://gsitecrawler.com/ (Windows)
http://www.intelli-mapper.com/index.php/purchase (Windows)
http://www.xml-sitemaps.com/ (Online)
http://peacockmedia.co.uk/integrity/ (Mac)
http://www.xml-sitemaps.com/standalone-google-sitemap-generator.html
(Paid but cheap)
Screaming Frog: Screenshot of steps -
http://screencast.com/t/MaDq1Eyn

Run through Screaming Frog: Mode > List > File Format: SiteMap
(*.xml)

Sitemaps should only have URLs that resolve to 200 status code and
shouldn't contain low-quality pages, like paginated content, search
results pages, etc.
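The cleanup described above can be scripted. A minimal sketch in Python (the URL patterns and example URLs are illustrative assumptions; tune them to the site's actual URL scheme):

```python
import re

# Patterns that often mark low-quality sitemap entries (paginated
# content, internal search results). These are illustrative; adjust
# them to the site's actual URL scheme.
LOW_QUALITY_PATTERNS = [r"/page/\d+", r"[?&]s=", r"/search"]

def clean_sitemap(urls):
    """Drop URLs matching any low-quality pattern."""
    return [u for u in urls
            if not any(re.search(p, u) for p in LOW_QUALITY_PATTERNS)]

urls = [
    "http://example.com/widgets",
    "http://example.com/blog/page/2",
    "http://example.com/?s=widgets",
]
print(clean_sitemap(urls))  # ['http://example.com/widgets']
```

Checking that each surviving URL actually resolves to a 200 still needs a crawler (Screaming Frog, as above); this only handles the pattern-based filtering.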

Screaming Frog: Mode > List > File Format: SiteMap (*.xml)

If the number of URLs crawled is less than the number of URLs you
entered into Screaming Frog, the sitemap contains duplicates. SF tells
you how many you entered when you browse to the sitemap.xml file and
load it into SF, but SF automatically dedupes a sitemap, so if you end
up with fewer URLs you have duplication in the sitemap and should
export a new sitemap from the final list of clean URLs (Export > Sitemap).
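The dedupe check can also be done in a few lines of Python before re-exporting (the sample URLs are hypothetical):

```python
def dedupe_sitemap(urls):
    """Return the unique URLs in original order plus a duplicate count."""
    seen, unique = set(), []
    for url in urls:
        if url not in seen:
            seen.add(url)
            unique.append(url)
    return unique, len(urls) - len(unique)

# Hypothetical sitemap URL list with one duplicate:
urls = ["http://example.com/a", "http://example.com/b", "http://example.com/a"]
print(dedupe_sitemap(urls))
# (['http://example.com/a', 'http://example.com/b'], 1)
```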

Add as a line item to robots.txt file:


Sitemap: http://www.your-site.com/your-sitemap-or-sitemap-index.xml
GWT: Crawl > Sitemaps
BWT: Crawl > Sitemaps
site:yoursite.com inurl:sitemap.xml
GWT: Crawl > Sitemaps
BWT: Configure my site > Sitemaps

Google resources about the five different sitemaps:


http://support.google.com/webmasters/bin/topic.py?
hl=en&topic=20986&parent=8476&ctx=topic

Open the sitemap in Chrome and search the page (Ctrl/Command-F) for
"http" - this will show you the number of both http and https URLs in the
sitemap. Look in the upper-right corner of the browser.
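The same count can be scripted; a quick sketch that tallies http vs https <loc> entries from a sitemap string (the sample sitemap fragment is made up):

```python
import re

def count_schemes(sitemap_xml):
    """Count http:// vs https:// URLs inside <loc> tags."""
    locs = re.findall(r"<loc>(.*?)</loc>", sitemap_xml)
    http = sum(u.startswith("http://") for u in locs)
    https = sum(u.startswith("https://") for u in locs)
    return http, https

# Made-up sitemap fragment:
sample = ("<urlset>"
          "<url><loc>http://example.com/</loc></url>"
          "<url><loc>https://example.com/cart</loc></url>"
          "</urlset>")
print(count_schemes(sample))  # (1, 1)
```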
The advantage here is the dramatic level of detail you suddenly get on
indexation. You could instantly find, for example, that TVs were indexed
at a 95% rate while video games were indexed at a 56% rate. This is
information you can use and act on. Read more about this from an
awesome post on AJ Kohn's blog: http://bit.ly/fRMXKs

Recommended file names:


* sitemap.tv.xml
* sitemap.digital-cameras.xml
* sitemap.video-games.xml

Can also break down category and product pages:

* sitemap.tv.category.xml
* sitemap.tv.product.xml
* sitemap.digital-camera.category.xml
* sitemap.digital-camera.product.xml

GWT: Crawl > Crawl stats


BWT: Crawl > Crawl summary
GWT: Crawl > URL parameters
http://tool.motoricerca.info/robots-checker.phtml

A great resource on robots.txt: http://tools.seobook.com/robots-txt/


Use "Is this page blocked by robots.txt?" bookmarklet by Tom Critchlow,
which you can get here: http://bit.ly/UbwfpW
http://tool.motoricerca.info/robots-checker.phtml
SEOmoz Crawl Test (Checking Blocking G, B, Y columns)
Look for weird directories in SEOmoz Crawl Test, Screaming Frog, or
GA: Content > Site Content > Content Drilldown
SEOmoz Crawl Test (http://pro.seomoz.org/tools/crawl-test)
Isolate pages with nofollow tags in SEOmoz Crawl Test or Screaming
Frog
Look at robots.txt and use MozBar to check for noindex tags (Analyze
Page > Page Attributes)

Pull up a couple and look at MozBar: Analyze Page > Page Attributes
Screaming Frog: Meta & Canonical tab

This is fictional. None of the big search engines have ever honored it. It
won't hurt the site, but it will make you look like an SEO neophyte.

http://pingomatic.com/
https://seesmic.com/ (formerly ping.fm)
Enter the URL into translate.google.com and see if it redirects. More
from Googler John Mueller here: http://bit.ly/google-translate-trick

GWT: Crawl > Fetch as Google

Use this tool (http://www.seomofo.com/snippet-optimizer.html) or, in
Excel, set column width to 520px, set columns to wrap text, and font to
Arial 12pt. Type in your title, and bold the main keyword. If the line
breaks, your title tag will truncate.
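As a rough first pass before the pixel-accurate methods above, you can flag titles over the 70-character mark in Python (character count is only a proxy; Google truncates by pixel width, so borderline titles still need the visual check):

```python
def flag_long_titles(titles, max_chars=70):
    """Flag titles likely to truncate in SERPs. Character count is a
    proxy for pixel width, so treat results as candidates to review."""
    return [t for t in titles if len(t) > max_chars]

titles = [
    "Short title | Brand",
    "An extremely long page title stuffed with every keyword the writer could think of | Brand",
]
print(flag_long_titles(titles))  # flags only the second title
```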
If not, this could create tracking issues.
Observations Recommendation

70
Yes
No

N/A
No
No

No
No

Yes

No

No

No

No
No

No

No
Yes

No
No

No

No

No
No

No

No
No
No

No
No

No

N/A

Yes
N/A
Priority (1–3)
Check
Are URLs search engine friendly (SEF)?
Do the URLs use query parameters?

How does the site (both content and navigation) look/function when you turn off
CSS, JavaScript, and cookies?
Do all of the navigation links work when Javascript is turned off?
When you view the cached version of the homepage, does all the content show
up?
When you view the cached version of the homepage, are navigation links present?

When you view the cached version of the homepage, do links show up that aren't
visible on the page?
Is any content being pulled in with iframes or from an external source?

How is the site's overall speed performance?

What are the page speeds of their top 10 landing pages?

How does their homepage's page speed compare to their top competitors?

Are images optimized for fast page load?

Are JavaScript and CSS linked to external files?

Are JavaScript and CSS minified to reduce load time?

Does the site leverage caching well to minimize page load time and reduce HTTP
requests?
Is Flash used for important elements?
Do images have ALT text?
Are there 404 pages?

Trend of 404 pages?


Are 404 pages serving the correct header response?
Trend of soft 404 pages?

Are any of the 404 pages significant or indexed?

Are pages in the site pointing to 404 pages?

Do they use a custom 404 page to reel visitors back?


Are there 500 error pages?

Trend of 500 errors?

Are there errors for mobile?


Is the site using 302 redirects?

Are 301 redirects set up properly?

Do any pages have meta refreshes?

Does the site use AJAX pages?

If site is using AJAX, is content in AJAX being indexed?

Are headers images?


Is page copy in images?
Is the site cloaking by cookie detection?

Is site currently being developed?


If so, is the dev server excluded via robots.txt or password-protected?
Are there PDFs on the site?
Is content in PDFs also in HTML?
Does the site have malware?

Has the site been hacked?

Is the website in a bad neighborhood?


Does the site use infinite scroll?
Instructions

Query parameters follow a ? in a URL and are separated by &'s if


there's more than one (e.g., www.yoursite.com/your-landing-
page?src=google+local&sort=low-to-high&index=no). These can
create duplicate content.
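Python's standard library can pull these parameters out for you, which is handy when auditing a long list of URLs (the example URL mirrors the hypothetical one above):

```python
from urllib.parse import urlparse, parse_qs

def query_params(url):
    """Return a URL's query parameters as a dict of value lists."""
    return parse_qs(urlparse(url).query)

url = "http://www.yoursite.com/your-landing-page?src=google+local&sort=low-to-high&index=no"
print(query_params(url))
```

URLs returning more than an empty dict here are candidates for duplicate-content review (canonical tags or GWT URL Parameters handling).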
Web Developer Toolbar

cache:url

Screaming Frog: Create a custom filter that contains "<iframe".


Don't include closing bracket or quotation marks.
http://gtmetrix.com/
https://developers.google.com/pagespeed/ - lots of
recommendations (also recommended by DistilledU)

Kissmetrics infographic on page load time: http://bit.ly/UjcDAn

Speed Tracer Chrome plugin, Yslow Chrome plugin


(recommended by DistilledU b/c it includes server response time
and download time and makes recommendations)
GA: Use this custom report to look at page load times of top
organic landing pages: http://bit.ly/15g530d (H/t to @Dan_Shure
for his collaboration in coming up with this custom report.)

"Amazon study that showed a 1% decrease in sales for every 0.1s


decrease in response times." http://bit.ly/UjazZ6

Create videos comparing site against competitors:


http://www.webpagetest.org/video/
Page speed tools, Web Dev Toolbar
Do a Screaming Frog crawl of just images (Configuration > Spider
> Check Images)
Should avoid inline and embedded CSS as much as
possible. Also, JS files are pulled sequentially, so these should be
combined where possible.
Desktop apps: JSMin, YUI Compressor, Google Closure Compiler
Web: Google Closure Compiler (http://closure-
compiler.appspot.com/home)

https://developers.google.com/speed/docs/best-practices/caching

ScreamingFrog: Advanced Export > Images Missing Alt Text


ScreamingFrog: Response Codes
GWT: Crawl > Crawl Errors > Not found
GWT: Crawl > Crawl Errors > Not found
GWT: Crawl > Crawl Errors > Soft 404
GWT: Crawl > Crawl errors > Soft 404 (A soft 404 is a not found
page that delivers a 200 response code instead of directing
visitors to a 404 page.)
Pull top x organic landing pages into Excel (I recommend the
Excellent Analytics Excel add-in, but you can also export from the
GA site), then either pull into Screaming Frog (Mode > List) or use
SEOTools Excel plugin to get status codes for each page. Note:
Will need to add hostname (domain) to URIs from GA first b/c
both SF and SEOTools need a full URL, e.g.,
http://www.yoursite.com/blog/, not /blog/.
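The hostname-prefixing step can be scripted instead of done by hand in Excel (the hostname here is a placeholder for the client's domain):

```python
def to_full_urls(paths, hostname="http://www.yoursite.com"):
    """Prefix a hostname onto GA's root-relative URIs so tools like
    Screaming Frog get full URLs. The hostname is a placeholder."""
    return [hostname + p if p.startswith("/") else p for p in paths]

print(to_full_urls(["/blog/", "/about/"]))
# ['http://www.yoursite.com/blog/', 'http://www.yoursite.com/about/']
```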

Screaming Frog: Mode > List > Enter 404 pages > Look at Inlinks
column

GWT: Crawl > Crawl Errors > Server error


Screaming Frog
SEOmoz Crawl Test
GWT: Crawl > Crawl Errors > Server error
Screaming Frog
SEOmoz Crawl Test
GWT: Crawl > Crawl errors > Mobile tab
Screaming Frog: Response Codes
SEOmoz Crawl Test
Screaming Frog: Response Codes
SEOmoz Crawl Test
http://www.seologic.com/webmaster-tools/url-redirect
Screaming Frog: Meta & Canonical tab > Meta Refresh
SEOmoz Crawl Test
Currently, if you implement the window.history.pushState()
JavaScript function (part of the 'HTML5 History API'), you will be
able to create an AJAX site that is crawlable. pushState()
accomplishes this by changing the path of the URL that appears
in the user's address bar. This makes it so search engines are
able to execute links and read content encoded in JavaScript.
Cf:http://www.distilled.net/store/u/my/technical/ > Crawling &
Indexing

Search for snippets of text from the AJAX content wrapped in


quotes in Google/Bing.

Change user agent to Googlebot using MozBar (Firefox only -


Settings > Set User Agent > Googlebot) or use the User-Agent
Switcher for Chrome extension: http://bit.ly/VzEFZg.
Is the page different from before?
site:yoursite.com filetype:pdf
Search for snippets of text from the PDF.
GWT: Malware
http://www.google.com/safebrowsing/diagnostic?site=mysite.com

Replace "mysite.com" with your domain in the URL.

SEMRush or Keyword Spy

Run a scan and look to see if site is suddenly ranking for spammy
keywords
https://www.majesticseo.com/reports/neighbourhood-checker
Turn off JavaScript. There should be static links to paginated
content. Example of it done well: Mashable. http://bit.ly/11j0d0g
Observations Recommendation Priority (1–3)
Could be better

Yes

A little sketchy
Most of them do

Switches to another page

Yes, but switches to another page

Yes, a whole other page

N/A

Bad

No

N/A

N/A

N/A
N/A
No
Yes
Yes
No

Yes

Yes

Yes
No

No

No
N/A

Yes

No

No

N/A

N/A
No
N/A

N/A
Yes
No
No
No

No

No
No

No
Check
Is the site organization intuitive?

Does every category and subcategory have a clear purpose?


Does the URL structure follow the category/subcategory structure as you drill
down?
Are key organic landing pages close to the homepage?

How many category pages does the site have?

Are there too many category pages? Not enough?

Are pages targeting competitive head terms more than two clicks from homepage?

Does the site use faceted navigation?


If it's not using faceted navigation, should it?

If the site uses faceted navigation, does it create duplicate content issues?

Does site use breadcrumbs?


If site uses breadcrumbs, do links point to canonical versions of the page (i.e.,
ideally, pages w/o query parameters)?
If site uses breadcrumbs, do breadcrumbs follow organization of the site?
Does the navigation have a reasonable number of options?

Do category pages have followed links to sub-category pages?


Do sub-category pages have followed links to product pages?
Instructions Observations
Bing Webmaster Tools' Index Explorer is a
great way to visualize a site's architecture
(Dashboard > Reports & Data > Index
Explorer). You can see the root-level pages
and directories and then drill down to see
subdirectories.

Yes, but could be a lot better


Yes

No
1. Pull organic landing pages from GA:
Traffic Sources > Sources > Search >
Organic > Landing Page (or pull using
Excellent Analytics plugin)
2. Run Screaming Frog and export crawl
(screenshot:
http://screencast.com/t/ATKBFL3tZHKw)
3. Pull both tabs into the same Excel
document and do vlookups to pull in Level
(which tells you how many clicks the page
is from the homepage, if you stated the
crawl with the homepage). Great video from
Mr. Excel on how to do vlookups if you're a
newbie: http://bit.ly/VEd8Wy.

Top landing pages should ideally be three


or fewer clicks from the homepage.

No
GA: Content > Site Content > Content
Drilldown
BWT: Reports & Data > Index Explorer 5 Main pages
GA: Content > Site Content > Content
Drilldown
BWT: Reports & Data > Index Explorer

A good litmus test is to see how many


pages are in each directory. You don't want
directories that are too thin or too bulky.

Not Enough
Pull SEMRush, GA Organic Keywords, and
GWT Search Queries reports (Search
Traffic > Search Queries) - with landing
pages and search volumes, then filter for
terms with fewer than three words
Yes
Examples of faceted navigation:
http://bit.ly/XMCfu5 No
If a site has a lot of product pages
paginated, this could be an indicator that
the site should use faceted navigation. Doesn't need it
Look for query parameters used by faceted
navigation (if it's parameter driven) and see
if they're listed in GWT's URL Parameters
report (under Crawl) N/A
No

N/A
N/A
There is anecdotal evidence that having
more than 7 options in the main navigation
is counter-productive. N/A
N/A
N/A
Recommendation Priority (1–3)
Check
Have they experienced a decline in inbound links?

Are they overoptimized?


What is ratio of homepage to deep page linking?
Does it appear they've been buying links?
Where have most of their links come from?
Does the site have links in their xml sitemap that weren't captured in the Screaming
Frog crawl?

Do they have a DMOZ link?

Do they have a Yahoo Directory link?


Do they have any links from Wikipedia?
What linkable assets do they have?
What are their most linked to pages?

Does that coordinate with top organic pages?

Do they have a strong internal linking structure?


Do blog posts and content pages interlink a lot?

Does the logo point to the canonical version of the homepage?


Are there any pages with too many links?
How many links do they have on their homepage and top pages?

Do they have broken links throughout the site?

Are there broken links on the page?

Do they have too many links on a page?

Does the site have an HTML sitemap?

What internal pages do they link to the most?


Does logo link to homepage?
What is the quality of outbound linking?
Do they have microsites?
If they have microsites, are they interlinking with the same terms the main site is
ranking for?

Do they have a link network?

Does the site have a resources/directory/links page that's really a link exchange?
Instructions Observations
ahrefs.com: Enter URL, scroll down to Backlinks graph No
(screenshot: http://www.screencast.com/t/FEA8vmPREC)
OSE, Majestic No
N/A
Dr Pete's link profile doc: http://mz.cm/Pt19gr No
Link Detective: http://www.linkdetective.com/ N/A
Pages found in the sitemap that aren't in the Screaming Frog No
crawl are most likely orphaned pages b/c the spider couldn't get
to it.

Very Important: You shouldn't select "Ignore robots.txt" under


Configuration > Spider in SF to preserve the restrictions a
search engine spider will face.

Steps to compare the two lists (with a screenshot that explains


everything):
1. Import xml sitemap into Excel (PC only). If on a Mac, run the
sitemap through Screaming Frog (Mode > List > Select File >
Sitemap > Navigate to xml sitemap file), run the crawl, and
export the list of URLs into an Excel document.
2. Move the sitemap csv into the same Excel workbook as the
Screaming Frog crawl (right-click on tab at bottom of worksheet
> Move > Select SF workbook > choose where you want to
move it).
3. In the cell next to the first URL from the sitemap, enter this
formula: =IF(ISNA(MATCH(B2,'SF Crawl'!
A:A,0)),"Orphan","OK"). Screenshot with thorough instructions:
http://www.screencast.com/t/V6z7bphD
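If you'd rather skip the Excel formula, the same orphan check is a one-line set difference in Python (the example URL lists are made up):

```python
def find_orphans(sitemap_urls, crawl_urls):
    """URLs present in the sitemap but absent from the crawl are
    likely orphaned pages the spider couldn't reach."""
    return sorted(set(sitemap_urls) - set(crawl_urls))

# Hypothetical lists exported from the sitemap and the SF crawl:
sitemap = ["http://example.com/", "http://example.com/a", "http://example.com/old"]
crawl = ["http://example.com/", "http://example.com/a"]
print(find_orphans(sitemap, crawl))  # ['http://example.com/old']
```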

Search for yourdomain.com site:dmoz.org No


SEO Site Tools Chrome extension: External Page Data
http://www.domaintools.com/: Site Profile tab No
site:wikipedia.org link:client.com -filetype:jpg No
None
OSE: Top Pages tab Their Home Page
Majestic: Pages tab
GWT: Search Traffic > Links to Your Site > Your most linked
content
Searchmetrics: Social > Social Links > Social Spread
GA: Traffic Sources > Sources > Search > Organic > Landing No
Page
GWT: Search Traffic > Internal Links No
Good examples: SEL, seoroundtable.com, outspokenmedia.com No

No
SEOmoz Crawl Test (Too Many On-Page Links column) No
SEO Site Tools > Suggestions AND Page Elements
Mozbar Analyze Page > Page Attributes (Warning: They
produce very different results.)
Screaming Frog: Internal tab > Outlinks
http://rapid.searchmetrics.com/en/seo-tools/link-tools/outbound-
links,50.html
Xenu Link Sleuth
http://www.domaintools.com/: Site Profile tab > Links (Internal
and Oubound)
http://urivalet.com/ (Internal and outbound - under HTTP header
info)

Screaming Frog: External tab > Status Code = 404, 410, 500
Xenu Link Sleuth

Check My Links Chrome add-in:


https://chrome.google.com/webstore/detail/ojkcdipcgfaekbeaelaa
pakgnjflfglf
http://www.iwebtool.com/broken_link_checker
Link Checker Firefox plugin: https://addons.mozilla.org/en-
US/firefox/addon/linkchecker/

BEST: SEOmoz Crawl Test - actually tells you which pages have
too many
SEO Site Tools > Suggestions AND Page Elements
Mozbar Analyze Page > Page Attributes (Warning: They
produce very different results.)
http://www.xml-sitemaps.com/standalone-google-sitemap-
generator.html
GWT: Search Traffic > Internal links
Screaming Frog
Screaming Frog: External tab
http://rapid.searchmetrics.com/en/seo-tools/link-tools/outbound-
links,50.html
Xenu Link Sleuth
http://www.domaintools.com/: Site Profile tab > Links (Internal
and Outbound)
http://urivalet.com/ (Internal and outbound - under HTTP header
info)
Check reverseinternet.com, if you're not sure.
Pull backlinks for each from your tool of choice (GWT, OSE,
Majestic, Ahrefs), and use a pivot table to evaluate backlinks
Pull backlinks for each site, pull into a pivot table, filter to only
look at links to and from their sites. If you are unfamiliar with how
to use pivot tables in Excel, check out this blog post I wrote:
http://selnd.com/VVK2q9.
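The pivot-table idea for spotting a link network can also be sketched in code: count how many of the audited sites each external domain links to (the backlink data below is fabricated for illustration):

```python
from collections import Counter

def shared_link_domains(backlinks_by_site):
    """Count how many audited sites each external domain links to.
    Domains hitting most or all of them are link-network candidates."""
    counts = Counter()
    for domains in backlinks_by_site.values():
        counts.update(set(domains))  # dedupe within each site first
    return {d: n for d, n in counts.items() if n > 1}

# Fabricated backlink data, one list of linking domains per site:
backlinks = {
    "site-a.com": ["blog1.com", "dir2.com"],
    "site-b.com": ["blog1.com", "news3.com"],
}
print(shared_link_domains(backlinks))  # {'blog1.com': 2}
```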
Recommendation Priority (1–3)
Check
Are their images optimized for SEO?

Are their images optimized for download?

When you search for brands they rank for in Google web search in Google Images
do they come up?
Do they host their images on another domain?
Do they have visibility for image searches?

Are there page titles that surpass 70 characters?

Are there descriptions that surpass 155 characters?

Are page titles optimized?

Do descriptions have powerful CTAs?

Are they missing page titles?

Are they missing descriptions?

What is the quality of their calls to action?

Does the site have ads?


If the site has ads, how many are above the fold?
If the site has ads, are they relevant to the site?
Do any of the blink, move, or play videos/sound automatically?
Is the page you're checking adequately optimized for the keyword(s) you're going
after?

Are pages keyword stuffed?


Instructions
Check alt tags in Screaming Frog crawl or SEOmoz crawl

Check file sizes (Screaming Frog: Images tab, export, sort


by size)

GWT: Search Traffic > Search Queries > Filters > Filter by
images (screenshot:
http://www.screencast.com/t/rH9vfqpAlnyK)
GWT: Search Appearance > HTML Improvements
Screaming Frog
SEOmoz Crawl Test
GWT: Search Appearance > HTML Improvements
Screaming Frog
SEOmoz Crawl Test
GWT: Search Appearance > HTML Improvements
ScreamingFrog
SEOmoz Crawl Test
GWT: Search Appearance > HTML Improvements
ScreamingFrog
SEOmoz Crawl Test
GWT: Search Appearance > HTML Improvements
ScreamingFrog
SEOmoz Crawl Test
GWT: Search Appearance > HTML Improvements
ScreamingFrog
SEOmoz Crawl Test
Evaluate homepage, category pages, and product/content
pages

SEOmoz's On-Page Analysis Tool:


http://pro.seomoz.org/tools/on-page-keyword-
optimization/new
Observations Recommendation
Priority (1–3)
Check
Does the site have paginated content (multi-page articles, results pages)?
How does site handle pagination? Noindex, canonical tag, rel prev next?

Are paginated results getting organic traffic?


Do they have content duplicated over their different sites?
Are there canonical issues with the homepage?

Are there canonical issues with the rest of the site?

Are https pages indexed? Are they dupes of the non-secure versions?
Are trailing slashes creating duplicates?
Is there duplicate content within the site?
Is there duplicate content with other sites?

Has the site been scraped?

Is there duplicate content between subdomains?


Do they have duplicate page titles?

Do they have duplicate meta descriptions?

If ecommerce site, are product descriptions unique?

Does the site offer user-generated content (e.g., reviews, ratings)?


If the site uses ratings, is it getting rich snippets?

Is content organized, using headings properly?

Is content well written?


Does content use proper grammar?
Are titles well written?
Are there misspelled words?
Do pages contain ads?

Do they have a blog?


If they have a blog, do they have an RSS button?
If they have a blog, have they noindexed category and tag pages or provided
unique content to these pages, like an intro paragraph?

If site has an RSS feed(s), is link to RSS feed nofollowed?


Instructions Observations

Should not be using canonical tags for


pagination, unless it's a view all page. And
if you're going to use a view all page, make
sure it doesn't suffer significant latency
(geek speak for it's slow).

This usually isn't ideal.

GA: Content > Site Content > Pages >


Page Title, then click on homepage title to
see if there are multiple URLs receiving
significant traffic for the same title from
different URLs. (Ignore URLs with low
visits.)

ScreamingFrog: Directives > Filter:


Canonical. Export to Excel and check for
rows where the Canonical 1 != Address
column. Filter out all of the matching
canonicals b/c they are just self-referencing
canonical tags. Pay attention to rows where
they don't match to make sure these
canonical assignments make sense.
Content > Site Content > Pages > Page
Title, then click on a bunch of titles to see if
there are multiple URLs receiving
significant traffic for the same title from
different URLs. (Ignore URLs with low
visits.)
GA: Search content for /index|/default

site:yourdomain.com inurl:https

Screaming Frog: Hash (Apply conditional


formatting to column to catch duplicates)
SEOmoz Crawl Test
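The same hash-comparison idea behind Screaming Frog's Hash column, sketched in Python (the page bodies here are toy examples; whitespace is normalized before hashing so trivial formatting differences don't hide duplicates):

```python
import hashlib
from collections import defaultdict

def duplicate_groups(pages):
    """Group URLs whose whitespace-normalized bodies hash identically,
    mirroring the Hash-column technique for finding duplicate content."""
    by_hash = defaultdict(list)
    for url, body in pages.items():
        normalized = " ".join(body.split())
        by_hash[hashlib.md5(normalized.encode()).hexdigest()].append(url)
    return [urls for urls in by_hash.values() if len(urls) > 1]

# Toy page bodies keyed by URL:
pages = {
    "/a": "Widget page content",
    "/a?sort=asc": "Widget   page content",
    "/b": "Unique content",
}
print(duplicate_groups(pages))  # [['/a', '/a?sort=asc']]
```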
Search for snippets.
Ask the client if they have registered any
other domains and check for duplicate
content there.
Use http://reverseinternet.com/ to find
related domains.
Use your favourite domain registrar to
search for domain names that contain the
client's brand name or common
misspellings of their name under various
TLDs. Check the content there and the
WHOIS record for those domains. Let the
client know if they should register any of
these domains.

If the content has been scraped, you


should file a content removal request with
Google: http://bit.ly/yB77Dd

GWT: Search Appearance > HTML


Improvements
GWT: Search Appearance > HTML
Improvements
Search for snippets of text from product
descriptions wrapped in quotation marks

GWT: Search Appearance > Structured


Data or do a search for a term the page
gets traffic for

You can find out the keywords by searching


for your landing page and then clicking on it
to see the keywords in this custom report:
http://bit.ly/SwC2di. It's already filtered for
organic.

A site should use H1, H2, etc. over custom


CSS classes to format content.

You want to make sure a significant


proportion of content above the fold is
unique (not ads, not template and
boilerplate content)

An RSS button makes it easy for visitors to


connect with you - esp useful for sites with
tech-savvy target audience.
It's standard fare to noindex, follow
category and tag pages. However, adding a
solid paragraph to them and only including
snippets of posts can make them effective
landing pages.
Recommendation Priority (1–3)
Check
Are they tracking rankings?
If so, are they missing keywords?

What percentage of organic traffic is (not provided)?

Do the keywords they're ranking for and getting traffic for make sense by category?

How many keywords are they ranking for compared to their competitors?

How do their rankings compare to their top competitors?

Do homepage and money pages show up for branded searches?

Do they have landing pages ranking for the same keywords?

If the site offers site search, are there opportunities they should be going after?
Do you need to manually check rankings?

Are they bidding on keywords?


If they're bidding on keywords, are there any keywords that convert like wildfire that
they should consider competing on for organic?
Instructions Observations

Can find new words to track from GA,


GWT, SEMRush, and Open Site Explorer
or Majestic anchors
GA: Traffic Sources > Sources > Search >
Organic
Take keywords from GA, SEMRush (or
Keyword Spy), and GWT and filter by
category, subcategory - look at keywords
OR copy URL and do text to columns OR
tease out categories and subcategories
and look at keywords in giant pivot table
that looks at keyword, cat/subcat, keyword
volume, ranking (if applicable), visits (if
applicable)

Run all through SEMRush, Keyword Spy,


or Searchmetrics
If they're using a rank tool, use that. If not,
pull keywords from SEMRush (or Keyword
Spy), create a pivot table to compare

If not, the homepage or money page(s)


might have been penalized, like if a weaker
page starts coming up instead of the page
that used to get the traffic.
1. Do a site: search Google for important
keyword phrases
2. Check GWT: Search Traffic > Search
Queries
3. Check this custom report from GA:
http://bit.ly/SdmffR. Click on keywords to
see if more than one landing page is
attracting traffic. Just keep in mind some
visits are lost to the dreaded (not provided).

Compare keywords from Site Search


(Content > Site Search > Search Terms)
against organic keywords and keywords
from SEMRush/GWT
http://hidemyass.com/

Not the perfect proxy solution but an


extremely good one at the price. Great for
seeing exactly how rankings appear from
different locations on the fly or running
systems as though they're from another IP
address in a different location. Source of
analysis:
http://searchenginewatch.com/article/21562
35/78-Resources-For-Every-Internet-
Marketers-Toolkit
Recommendation Priority (1–3)
Check
Has the site had any significant drops in organic traffic?
If the site has experienced drops, are they seasonal?

If the site has experienced drops, do they correlate with any major algorithm
changes?

What tools are they using for tracking?

Are they tracking conversions?


Are they using ecommerce tracking?
Is their analytics tracking code missing from any pages?

Do other sites have their GA code on them?


Do they have subdomains?

If the site has subdomains, are they including hostname in content reports?

If so, is GATC set up properly?

Do they have PPC campaigns showing up in organic results?

If the site offers site search, are they tracking it in their analytics?
If so, does their site search appear to be effective?

Are they using asynch?


If site is using asynch, are pages on the site still using traditional?

Are they using annotations?


Have they set their homepage in GA?
If running Google AdWords campaigns, is your AdWords account linked w/ GA?

If running Google AdSense campaigns, is your AdSense account linked w/ GA?

Do content reports contain utm parameters?


Instructions
GA: Traffic Sources > Sources > Search > Organic
GA: Traffic Sources > Sources > Search > Organic -->
Compare to Previous Year
My SEO gcal: https://www.google.com/calendar/embed?
src=hv5ke5r6teb3272qtqohepsa0c
%40group.calendar.google.com&ctz=America/New_York

BuiltWith Chrome Extension:


https://chrome.google.com/webstore/detail/dapjbgnjinbpoi
ndlpdmhochffioedbn

Screaming Frog: Configuration > Custom > Enter GA


code (e.g., UA-335449-7)
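The same check - does every page contain the tracking ID? - is easy to script once you have page HTML (the UA number echoes the example above and is not a real account; the HTML snippets are toys):

```python
def pages_missing_ga(pages, tracking_id="UA-335449-7"):
    """Flag pages whose HTML lacks the GA tracking ID. Substitute the
    client's real account number for the placeholder default."""
    return [url for url, html in pages.items() if tracking_id not in html]

# Toy HTML keyed by URL:
pages = {
    "/": "<script>_gaq.push(['_setAccount', 'UA-335449-7']);</script>",
    "/old": "<html>no tracking snippet here</html>",
}
print(pages_missing_ga(pages))  # ['/old']
```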
GA: Technology > Network > Hostname
reverseinternet.com
Search in Google: site:clientdomain.com -inurl:www
GA: Technology > Network > Hostname
If not, the site should include hostname in the content
reports (so content reports read www.mysite.com/my-
page instead of /my-page). Screenshot:
http://www.screencast.com/t/juJ4RnFk
No dot before domain in _setDomainName method,
hashing isn't turned off

ROI Revolution post:


http://www.roirevolution.com/blog/2011/01/google_analyti
cs_subdomain_tracking.php

Check for campaign parameters in organic landing pages


(Traffic Sources > Sources > Search > Organic > Landing
Pages), i.e., s_kwcid, ppc, and cpc.
GA: Content > Site Search > Usage
GA: Content > Site Search > Overview > Results
Pageviews/Search (Is it taking too many pages to find
what visitors are looking for?) and Time After Search (Are
they sticking around after searching?)
Search page source code for "gaq"
Screaming Frog: Configuration > Custom > Enter snippet
of traditional GATC, e.g. "pageTracker._trackPageview"

Admin > Click on profile link > Assets > Annotations


Admin > Click on profile link > Profile Settings > General Information > Default page
You don't need to set this if the default (home) page is /. If it's something like /default.php or index.html, you should enter that, like you see in this screenshot: http://screencast.com/t/gd5Idd6u
Instructions: https://support.google.com/analytics/answer/1033961?hl=en&topic=1308612&ctx=topic
Instructions: http://support.google.com/adsense/bin/answer.py?hl=en&answer=2495976&rd=2
If using GA, navigate to Content > Site Content > All Pages and apply this advanced segment: http://bit.ly/1dkmO6M. GA strips utm parameters out of content reports. So if you see them in your content reports, it means that Google isn't recognizing them (or giving credit to the campaign/source/medium you've assigned). This is usually b/c the person tagging the links didn't include, at minimum, utm_source and utm_medium. These two parameters are required. All others are optional in that tagging will still work even if you don't use them, contrary to what Google's URL Builder (http://bit.ly/17ShSlO) says. That said, I would always also include utm_campaign to assign a campaign name.
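As an illustration (hypothetical URL and values), a link tagged with the two required parameters plus utm_campaign:

```text
http://www.mysite.com/landing-page?utm_source=newsletter&utm_medium=email&utm_campaign=spring-sale
```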
Observations Recommendation
Priority (1–3)
Check
Does the site use shopping feeds?
If they're using shopping feeds, what engines have they submitted to?
Do their feeds have all the required fields filled out?
Does the shopping feed use existing categories?
Does the feed include images?
Is the feed in one of the acceptable formats?

Is the feed being updated regularly?


Does the feed use custom fields?
Do product descriptions have unique copy?

Do product variations exist on unique URLs?


If so, are canonical tags used to prevent duplicate content?
Are there any problems with their conversion cycle?
Instructions Observations

Google:
http://support.google.com/merchants/bin/an
swer.py?hl=en&answer=160567

Search Google or Bing for snippets of copy in quotations
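For the product-variation check, a sketch of what a canonical tag looks like (URLs are placeholders, not from any specific client site):

```html
<!-- On a variation page like /widget?color=blue, point
     search engines at the main product URL. -->
<link rel="canonical" href="http://www.mysite.com/widget" />
```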
Recommendation Priority (1–3)
Check
Is the site language oriented or geographically oriented? Directories (e.g., yourbrand.com/es/ or yourbrand.com/de/) are a better solution for language targeting (with no geographic constraint), and ccTLDs (yourbrand.es or yourbrand.de) for geographically segmented users.

Is the content of the international version of the site in the relevant language, correctly localized? (E.g., the content of the French version is in French, featuring prices in euros; the Brazilian version is in Portuguese, featuring prices in reais; etc.)

Is the URL structure of the international version in the relevant language of the
shown content?

If the site is targeted to a specific country, is this specified in Google Webmaster Tools?

If the site is targeted to a specific country, does it have a local IP of the country?

Are the pages or sitemaps of the international sites including the rel="alternate" hreflang="x" annotations to specify their languages and geographic regions?
Is the language of the international pages' content specified in the HTML language declaration?
If the business is targeting multiple countries with the same language (e.g., yourbrand.com.mx for Mexican users and yourbrand.es for Spanish users), is each of the different site versions featuring original content?

Are the different international versions of the site prominently linking to each
other, giving users and search engines the option to find them?

Does the site suggest its appropriate version according to users' language and location? If the site is geographically targeted, are users prompted to visit the relevant site version for their location? If the site is language targeted, are users prompted to visit the relevant site version for their language?

Do the international site versions have links from their relevant locations? (eg:
yourbrand.fr should have relevant links from France sites in French).

Which are the most popular search engines in each of your international target markets? For most of the Western world it might be Google, but if you're targeting Russia you might also need to start focusing on Yandex, and for China you would need to work with Baidu.

Are you targeting China and Russia?

BIG thanks to Aleyda Solis for pulling together this list!


Instructions
Verify whether the site is targeted to users speaking a specific language or to a market restricted to a geographic location. This will help you define the best organization for the different versions. For example, if the criterion is only the language, with no geographic constraint, you may want to enable directories: yourbrand.com/es/ (for any Spanish speaker, who could be in any Latin American country, Spain, etc.) or yourbrand.com/de/ (for any German speaker, who could be in Germany, Austria, Switzerland, etc.) instead of yourbrand.es (which would be targeted specifically to Spain) or yourbrand.de (which would be targeted specifically to Germany). If the site is geotargeted and using generic domains, subdomains, or subdirectories, keep in mind that ccTLDs are the ideal alternative, followed by subdirectories. Avoid subdomains or generic domains with parameters. Please read this for the pros and cons of the alternatives: http://www.stateofsearch.com/international-multilingual-sites-criteria-to-establish-seo-friendly-structure/

Content should be featured in the language of the target market. Keep in mind that a language may vary depending on the geographic area being targeted; for example, Mexican Spanish is not exactly the same as Argentinian Spanish, and Brazilian Portuguese is not the same as Portuguese in Portugal. Because of this, make sure the content is not only translated but also localized and well targeted to its geographic market, with local terms, currency, etc.

Make sure that the URLs of the international versions are in the appropriate
language and not in English by default. For example, the URLs for the Spain
oriented site version should be in Spanish.
Geotarget the site version to its relevant geographic market in Google Webmaster Tools (http://support.google.com/webmasters/bin/answer.py?hl=en&answer=62399)

To verify the IP of a site you can use the Flagfox extension for Firefox (https://addons.mozilla.org/en-US/firefox/addon/flagfox/) or the "Server Stats" tab in http://whois.domaintools.com/
They should be used to specify the different versions' languages and locations according to: http://support.google.com/webmasters/bin/answer.py?hl=en&answer=189077
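A sketch of what these annotations look like in the head of each page (domains are placeholders; every version should list all alternates, including itself):

```html
<link rel="alternate" hreflang="es" href="http://yourbrand.com/es/" />
<link rel="alternate" hreflang="de" href="http://yourbrand.com/de/" />
<link rel="alternate" hreflang="x-default" href="http://yourbrand.com/" />
```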
The language of the content should be specified as described in http://www.w3.org/International/tutorials/new-language-decl/qa-html-language-declarations
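In practice the declaration is an attribute on the html element, e.g. for a page in Mexican Spanish:

```html
<html lang="es-MX">
```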
If the products or services are the same make sure to differentiate them by
localizing the descriptions, specifying areas of services, using titles with the
location names, etc.

Give search engines and users a way to find all of the international versions of the
site. Make sure that the links to the different international versions are crawlable.

For example, if the site is geographically targeted and users in Spain are accessing the UK site version (yourdomain.co.uk), a non-intrusive message (not a pop-up) should inform them that another site version targeted to Spain (yourdomain.es) exists that might be better suited for them. This can be done by detecting users' IP addresses. On the other hand, if the site is language targeted, visitors using a browser in Spanish who are accessing the English site version should receive a non-intrusive message inviting them to switch to the Spanish version if they prefer. This should be displayed only the first time they access the site, and their preference should be saved for future visits. There should also always be the option to switch to any other version of the site, as specified in a previous point.

Verify and make sure to build links to each international site version from local
sites (blogs, news sites, local communities, etc.) in their same language.

Verify which search engines are the most popular in the international locations you're targeting and make sure to align your optimization practices to make the most out of them; use their keyword, webmaster, and tracking tools too. You can use http://www.alexa.com/topsites/countries to check the top sites per country.
In this case, choosing the .cn and .ru ccTLDs is practically an obligation, as both Baidu and Yandex are strongly biased toward their respective country-code top-level domains.
Observations Recommendation
Observation: Always remember to decide which International SEO formula (ccTLD, subfolders, or subdomains) to use based on the general business objectives, while being aware of the technical needs of the site in general. For instance, if the site is an eCommerce site with thousands of products, the subfolder option may not be optimal, because of the replication of the product databases or the risk of dealing with a too-big and complex database. In that case a ccTLD or a mix (part subfolders, with stores in subdomains) is a better solution. Moreover, if the company has or is going to have a physical presence in the targeted country, it is always suggested to use the local ccTLD, especially because of the "French Syndrome," the unconscious preference people have for sites with their own country's domain ending.

Recommendation: If the site uses a subfolder architecture for International SEO, avoid IP-lookup redirection. It is a deprecated practice for multilingual/multicountry SEO. The main reasons are: 1) it is not at all certain that users from a country want to use the country mirror of your site; 2) some CMSs (e.g., Joomla) tend to present the main domain URL (www.domain.com) as the home page of every language/country version of your site, which can lead to unnatural link building, as all the links would go to the home page, which is theoretically targeting just one language/country; 3) Googlebot usually crawls from a US IP, so with IP-lookup redirection it would "see" only one version of the site.

Observation: Remember that a good translator translates into his own mother language, not the contrary (e.g., an Italian translates from English to Italian, not vice versa). PRO TIP: also use the translators to identify and localize the keywords you are targeting. This way you will discover linguistic peculiarities, like the love Italians have for using English wording for tech (e.g., "web hosting" in Italian is "web hosting" and not "alloggiamento web," its literal translation).

Recommendation: Never use an automatic translator, nor the Google Translate APIs. Apart from offering very bad translations, that is a tactic deprecated by Google.

Remember that in Google Webmaster Tools subdomains and subfolders can also be geotargeted. NOTE: only generic domain names can be geotargeted; country-code top-level domains are automatically geotargeted to their corresponding country. This is important because it is not so rare to see sites with ccTLDs trying to expand into International SEO by creating language/country subfolders or subdomains, which cannot be geotargeted via GWT.

You can use Screaming Frog's custom filters to see whether rel="alternate" hreflang="x" is used on the site, and whether it's on every page.
See recommendation in D2

This is important also in Social Media, considered here as an outreach tool.
Priority (1–3)
Check
Is the site using structured markup (e.g., schema.org, Facebook Open Graph,
GoodRelations, microformats.org, XFN, etc)?

Does the site have content that's rich snippet-worthy that schema.org supports
(e.g., video, reviews, ratings, recipes, author, event, product, offer, etc)?

Is the markup validated?

Does the site need microdata for local business?


Are competitors using structured markup?

Do the rich snippets follow Google's guidelines?


Instructions Observations
GWT: Search Appearance > Structured Data
Screenshot of sample report: http://www.screencast.com/t/YzptQHgEame
Schema.org:
http://schema.org/docs/gs.html#schemaorg_types
Here's a set of commonly used item types:
Creative works: CreativeWork, Book, Movie,
MusicRecording, Recipe, TVSeries ...
Embedded non-text objects: AudioObject,
ImageObject, VideoObject
Event
Organization
Person
Place, LocalBusiness, Restaurant ...
Product, Offer, AggregateOffer
Review, AggregateRating
Reference: http://bit.ly/rc08C3
How to use: http://selnd.com/vXRHlC
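As an illustration of the Review/AggregateRating item types listed above (the product name and values are made up), schema.org microdata might look like:

```html
<!-- Hypothetical product with an aggregate rating -->
<div itemscope itemtype="http://schema.org/Product">
  <span itemprop="name">Acme Widget</span>
  <div itemprop="aggregateRating" itemscope
       itemtype="http://schema.org/AggregateRating">
    <span itemprop="ratingValue">4.5</span> stars from
    <span itemprop="reviewCount">87</span> reviews
  </div>
</div>
```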

Google's Validator:
http://www.google.com/webmasters/tools/richsnippet
s
BWT: Diagnostics & Tools > Markup Validator
Facebook Debugger:
http://developers.facebook.com/tools/debug

http://microdatagenerator.org/
http://www.sindice.com/

Search for your keywords. Only sites that are using structured markup will show up in results, and the type of markup they're using is listed. (RDFa = FB Open Graph)

http://bit.ly/SypKRq
Recommendation Priority (1–3)
Check
What's your monthly budget?
How many active campaigns are there?
How many active ads are there?

How many active keywords are there?


Is conversion tracking enabled?
Is the account connected to Google Analytics Account?

Are Search and Display separated?


Are you using negative keywords?

Are there mobile campaigns?

Are ads A/B tested?

Are AdGroups themed?


How many destination URLs take visitors to homepage?

What is your Cost to Sales ratio?

What is your average conversion rate?

How good is average Cost Per Click?

How do your Impression Share reports look?

How many keywords have a Quality Score (QS) less than 4?

What percent of keywords have a QS less than 4?


How many keywords have a QS of 4, 5, or 6?
What percent of keywords have a QS of 4, 5, or 6?
How many keywords have a QS of 7, 8, 9, or 10?
What percentage of keywords have a QS of 7, 8, 9, or 10?
BIG thanks to Lyena Solomon for pulling together this list!
Instructions

Number of campaigns should be proportional to the budget.


Number of active ads should be appropriate for the number of ad groups running

Number of keywords needs to be appropriate for the daily budget


Has to be enabled
AdWords and GA track differently. It is very helpful to have both to understand the
user and the success of PPC advertising.
They should be.
Negative keywords are an absolute must, especially if you have broad and modified broad match keywords. Use negative lists to apply them to campaigns and adGroups in bulk.
Always separate mobile devices. Depending on your mobile traffic (consult
analytics), you might want to start creating campaigns just for mobile devices.

Every adGroup should have at least 2 ads running at all times to test and improve click-through rate and conversions.
All keywords should be united in adGroups under a very tight theme.
Do not send paid traffic to your home page. Home pages are too generic and, as a
rule, do not answer the query question. That said, test your home page
performance for ads you think make sense to connect with your home page. You
might be an exception to the rule.
Take your cost, divide by sales, and you get a percentage. 30% is good. Higher is not very good and is one of the quick indicators that you need to make more money from your advertising. Be careful with this number if your PPC goal is brand recognition and awareness; when your goal does not translate directly into revenue, this metric is not as helpful.
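A quick worked example with made-up numbers:

```text
Cost to Sales = ad spend / sales revenue
$3,000 spend / $10,000 sales = 0.30, i.e., 30% (at the target; lower is better)
```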
Average conversion rate is a quick indicator of the overall account performance.
You will need to look at conversion rates for the campaigns and individual
adGroups. If the average conversion rate is below 1%, you know that you have to
investigate further and as quickly as possible.
Average Cost Per Click (CPC) should be tolerable from the business perspective.
You need to take into consideration cost per acquisition, goals of the campaign,
etc. Divide CPC by average conversion rate and you will get cost per lead. Then,
decide if it is acceptable.
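Worked example with made-up numbers:

```text
Cost per lead = average CPC / average conversion rate
$2.00 CPC / 0.02 (2% conversion rate) = $100 per lead
```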
Figure out if you are losing impressions based on bad quality score or low budget.
If you are, improve quality score and revise your budget allocations.
Has to be low. Ideally, you should not have low quality score keywords. In real life,
you might be in a niche (payday loans, for example) where all keywords have low
quality score.

Areas of improvement

Your goal is to have all keywords in this range


Best if it is 70% and above.
Observations Recommendation
Priority (1–3)
Check
Do the videos on the company's website and YouTube channel come up in a brand
video search in Google and Bing?
Do any video rich snippets come up in Google in a search for the company's brand
name?
Do they have video markups from schema.org on all of their videos?

Has the website submitted a video xml sitemap to GWT and BWT?

If the site has submitted video sitemaps, are all of the site's videos indexed?

Is the site's video sitemap formatted correctly? (Must-read guide that covers all
platforms)

Are all of their videos on one page?

Are videos embedded?

If videos are embedded, have they been transcribed?


Is there keyword-filled content surrounding their videos?
Does the company host their videos on a video-sharing site, like YouTube or
Vimeo?
Does the company use a video-hosting service like Wistia or Brightcove?
If the company uses a video-sharing service, like YouTube or Vimeo, are all of the
videos from the company's video channel on their website, and vice-versa?
If the company uses a video-sharing site, do they have a link or button on their
website pointing to their channel?
If the company is using a video-sharing service, like YouTube or Vimeo, do all of their videos have optimized titles and descriptions?
If the company is using a video-sharing service, are video titles compelling?
Are the videos tagged with relevant keywords?

Are the tags mixed between common and specific keywords?


Do tagged phrases use quotes?

If the company is using a video-sharing service, are video descriptions thorough?

Does the video or description have a call to action for people to comment on the
video?
Are there links to other videos in the annotations?

Are the videos getting favorited and liked? Are the videos providing any incentive to
like or favorite them?
Are videos being published regularly?

Is there a link at the front of the video's description?

If the site uses a video-sharing site, do its descriptions also have a link to their
channel page, a subscription link, links to related content, or to
sites/videos/channels/users referenced in the video, and links for social media?

Are the videos embedded with iframes?

Are the promotional videos that are focused on converting customer purchases longer than 30 sec, but still short and to the point?

Are the product videos coming off as overly promotional ads?

Are product videos on YouTube?

Are there keyword-optimized captions in the video?


Do the videos have CNAME aliases, to make sure links are going back to a
branded subdomain?

Are there social share buttons next to the video player?

Credits:
Most of the tips in this tab come from these two amazing resources:
An SEO's Guide to Video Hosting & Embedding
YouTube Creator Playbook V. 2

The checkpoints for this tab were pulled together by my amazing daughters:
Destinee Cushing
Tori Cushing
Instructions Observations

Check GWT: Search Appearance > Structured Data
GWT: Crawl > Sitemaps
BWT: Configure my site > Sitemaps
GWT: Crawl > Sitemaps
BWT: Configure my site > Sitemaps
http://www.distilled.net/blog/video/creating-video-sitemaps-for-each-video-hosting-platform/
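A sketch of one url entry in a video sitemap (all URLs and text are placeholders; see the guide above for per-platform specifics):

```xml
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9"
        xmlns:video="http://www.google.com/schemas/sitemap-video/1.1">
  <url>
    <loc>http://www.mysite.com/videos/widget-demo</loc>
    <video:video>
      <video:thumbnail_loc>http://www.mysite.com/thumbs/widget-demo.jpg</video:thumbnail_loc>
      <video:title>Widget Demo</video:title>
      <video:description>A two-minute demo of the Acme widget.</video:description>
      <video:content_loc>http://www.mysite.com/media/widget-demo.mp4</video:content_loc>
    </video:video>
  </url>
</urlset>
```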

This is sub-optimal. With rare exception, videos should be on separate pages with text on the page.
Screaming Frog: Configuration > Custom > Enter
a snippet of code from video embed (e.g.,
src="http://www.youtube.com/ for YouTube)

https://www.speechpad.com/pricing

YouTube's Keyword Suggestion Tool: https://ads.youtube.com/keyword_tool

The site should use these generously. Put the most important tags first.
This is ideal.
If phrases such as "video seo" aren't in quotes,
they will be tagged as video and seo.

Video responses show YouTube that your video is popular and relevant.
There's anecdotal evidence that YouTube favors
videos that link to other YouTube assets within
annotations.
Getting Liked and favorited helps with video SEO
in YouTube.
YouTube's algorithm favors video channels that publish content regularly. It also gives you more opportunities to have high-ranking videos.

It's best to start the video description with a link back to your website. Read more here: http://mashable.com/2010/04/16/boost-seo-youtube/.

In order to be indexed, it's best to embed content with HTML5, JavaScript, or Flash. Search engines aren't very good at crawling and indexing iframes.
If you’re aiming to convert consumer purchases
off the back of your video, keeping it as short
and to the point as possible is preferable.

If you're building commercially focused content, then it's important to remember that product videos are not ads.
Product and commercially oriented videos tend to have a better chance of getting rich snippets on video-hosting sites, like Wistia and Brightcove.
Make sure your videos have captions
This way you can ensure any embedded content links back to you twice: once to reference the video file and once with an attribution link to a page which you can specify. You can read more about what Phil Nottingham has to say about this here: http://www.seomoz.org/blog/hosting-and-embedding-for-video-seo

Video-hosting sites like Brightcove or Wistia provide social sharing buttons as overlays or icons next to the player. Otherwise, just include social buttons next to the player you're using.
Recommendation Priority (1–3)
Check
Have they claimed their local pages?
Do they list a local number or a toll free one?
Is address in content?
Does the business have multiple locations?
If it has multiple locations, does the site have a page for each location?

Is the business address correct in Google and Apple Maps?

Is business listed in reputable web directories?


Is business listed in reputable business directories?
Is business listed with relevant associations?
Does the business have a Google News feed?
Does the business have brick and mortar(s)?
Does the business have reviews on local?
Do their competitors use affiliates as a business model?
Do they have an affiliate program?
Any ideas to get the brand more visibility?
What are the main calls to action for the site?
Instructions Observations
http://getlisted.org/
Better to have a local one

The site should have a page for each location and submit each of those to Google and Bing. Or submit a data file w/ all of them listed. They can be submitted here for Google: https://places.google.com/manage/?hl=en-US&gl=US#upload. Here's a template Google provides: http://bit.ly/WD293o. And submit here for Bing: http://www.bing.com/businessportal

Use Map Maker (www.google.com/mapmaker) if incorrect in Google. It's more effective than clicking "Report a Problem."

reverseinternet.com
Recommendation Priority (1–3)
Check
Are they on social?
If they're using social networks, what networks are they using?
What is the quality of social interaction?
How many social mentions have their top organic landing pages had?

Are they tagging links?

Are they tagging email?

See last two weeks of tweets w/ BackTweets

Do content pages have social buttons?


Can visitors leave comments and easily subscribe to comments?
Does the site offer different ways to log in to add comments (e.g., Twitter, Facebook, Disqus, etc.)?

Does the site use a social plugin that can be tracked in GA?

Are they on Pinterest?

If so, are they using it effectively?


Are they using a video-sharing site like YouTube?
If so, are they optimizing their video titles, meta descriptions, and tags?
If so, do they include a link to their site in the description?
If so, are they using annotations to increase engagement?
If so, what are their most-viewed videos?
If so, are they allowing their video stats to be viewed?

If so, where are they being embedded?


Are they embedding their videos in their own site?
Instructions Observations

Too commercial? One way? Interactive?


Pull top organic pages from Traffic Sources
> Sources > Search > Organic > Landing
Page (adjust # of results), add domain with
concatenation in Excel, run SEOTools
Excel add-in: Social
Look for parameters like utm_medium,
utm_source, and utm_campaign.
Search Traffic Sources > Sources >
Referrals > Source for "mail" (Screenshot:
http://www.screencast.com/t/hrjI15TQtp)

Mozbar > Tools > Twitter > Backtweets


SEO Site Tools > Social Media > Twitter

Disqus does this best, imo: http://disqus.com/admin/register/

Search Engine Land does this really well: http://screencast.com/t/4IsneNjvrmC6

ShareThis and AddThis plugins can be tracked automatically in GA.
Search Google for: mysite.com
site:pinterest.com

Click the little bar chart below the video (bottom-right corner).

Use Screaming Frog to find the YouTube embed code.
Recommendation Priority (1–3)
Check
What CMS does the site use?

Does the site use a favicon?


Does the site have images > 20kb?

Are there any design elements that are distracting?


Does the site offer site search?
Is the site search field in the upper-right corner of each page?

If the site uses site search, does the site search use a query parameter?

How is the user experience for homepage?


How is the user experience for category pages?
How is the user experience for product/content pages?
Does the site use smiling heads for its images?

Are there pages on the site that seem to be problematic?

Does the site have anything on it that auto-plays (video, audio, ads)?
Instructions Observations
BuiltWith Chrome plugin
http://www.woorank.com/en/ (get 1/week)

Screaming Frog > Configuration > Spider > Check Images or, from a full scan, go to the Images tab

This is where users expect to find site search.
See an illustration of a query parameter:
http://www.screencast.com/t/PI85pzjXxB

Using stock photography of good-looking, smiling models can make a site look generic and sterile.
You can use this custom GA report
(http://bit.ly/TvtS4B) to sort by organic
landing pages with high bounce rates and
exit rates. (The report is already filtered to
only include organic traffic.)
Recommendation Priority (1–3)
Check
Does every page contain a distinguishing logo in the header?
If the site has a logo, does it link to the canonical version of the homepage?
Does the site have a tagline?
If the site has a tagline, is it compelling?
Does the site have an about page?
If the site has an about page, is it compelling and written in plain English?
Does the page have contact information that's easy to get to?
If the site has contact information, does it include a phone number?
Does the site have sitelinks for brand searches?

Are competitors bidding on the site's brand terms?


Do they have reputation management issues?

If the site has gotten negative reviews, has the client responded to them? Offered
compensation?
Does the business get regular mentions in the blogosphere?
Does business have mentions in industry online publications?
Do experts from business get interviews from publications?
Does anyone from the business participate in Q&A websites?
Does the domain itself get mentions online?
Do they have press releases pointing back to their site?
If so, do the press releases contain good links?
Instructions Observations

Example:
http://www.screencast.com/t/lpuT5LMDp

Search for the brand on review sites. I created a CSE for review sites: http://bit.ly/review-cse
Recommendation Priority (1–3)
