
Get to the top of Google

Make your sites load faster
CLIMB THE RANKINGS Give your sites a performance boost today!

The future of SEO
ESSENTIAL! The dos and don'ts for better Google rankings in 2014 and beyond

164 pages of advice from leading SEO experts
Revised & updated for 2014

Google's guide to Analytics
INSIDER TIPS Google reveals how to best use Analytics to measure your business

The SEO Handbook

From the makers of

Future Publishing, 30 Monmouth Street, Bath BA1 2BW


Telephone +44 (0)1225 442244 Website net.creativebloq.com
Editorial
Editor Rob Carney, rob.carney@futurenet.com
Art editor Carlton Hibbert, carlton.hibbert@futurenet.com
Production assistant Adam Hurrell, adam.hurrell@futurenet.com
Contributors
Michael Abitbol, Will Aitken, Matt Ballek, Dan Barker, Christian Bauman, Richard
Baxter, Joshua Bixby, Tyson Braun, James Brockbank, Mark Buckingham, Ben
Callahan, Mike Chipperfield, Michelle Craw, Sam Crocker, Justin Cutroni, Matt Davies,
David Deutsch, Steve Durr, Hugh Gage, Gabe Gayhart, Tim Grice, Colin Grieves, Tom
Gullen, Aaron Gustafson, Jaime Hall, Susan Hallam, Luke Hardiman, Dean Hume,
Glenn Alan Jacobs, Steve Kuncewicz, Tim Leighton-Boyce, Anna Lewis, Stephen Lock,
Rory Lofthouse, Tom Mason, Dave McAnally, Karen McGrane, Bryson Meunier, Jason
Michael, David Mihm, Robert Mills, Katie Moffat, Luis Navarrete, Chelsea Neuman,
Kelvin Newman, Peter O'Neill, Emily Pope, Julian Ranger, Paul Roberts, Mark Roden,
Danny Sullivan, James Swinarski, Justin Taylor, Pete Wailes, Sam Williams, Ben Wood
Advertising
Advertising Sales Director Charlie Said, 020 7042 4142, charlie.said@futurenet.com
Advertising Sales Manager Jas Rai, 020 7042 4219, jas.rai@futurenet.com
Account Manager Suzanne Smith, 020 7042 4122, suzanne.smith@futurenet.com
Marketing
Group Marketing Manager Philippa Newman, philippa.newman@futurenet.com
Circulation
Trade Marketing Director Daniel Foley, daniel.foley@futurenet.com
Print & Production
Production Coordinator Vivienne Turner
Licensing
Senior Licensing & Syndication Manager Regina Erak, regina.erak@futurenet.com
Tel + 44 (0)1225 732359
Future Publishing Limited
Editor in chief Dan Oliver
Group Art Director Steve Gotobed
Head of Creative and Design Declan Gough
Managing Director, Future UK Nial Ferguson
Creative Director Robin Abbott
Editorial Director Jim Douglas
Subscriptions
Phone our UK hotline 0844 848 2852; international (+44) (0) 1604 251045
Subscribe to net magazine online at www.myfavouritemagazines.co.uk
Printed in the UK by William Gibbons.
Distributed in the UK by Seymour Distribution Ltd,
2 East Poultry Avenue, London EC1A 9PT. Tel: 0207 429 4000

As Danny Sullivan, founding editor of Search Engine Land, once said, search is the second most important thing we do on the web, after email. And it continues to reshape itself every few months, which is why you need to be on the ball when it comes to the fine art of SEO.
To help you with this, over the following pages we bring you the essential advice, tips, techniques and tutorials you need to boost your ranking and move your site or business in the right direction.

© Future Publishing Limited 2014. All rights reserved. No part of this magazine may be
used or reproduced without the written permission of the publisher. Future Publishing
Limited (company number 2008885) is registered in England and Wales. The registered
office of Future Publishing Limited is at Beauford Court, 30 Monmouth Street, Bath
BA1 2BW. All information contained in this magazine is for information only and is,
as far as we are aware, correct at the time of going to press. Future cannot accept any
responsibility for errors or inaccuracies in such information. Readers are advised to
contact manufacturers and retailers directly with regard to the price of products/
services referred to in this magazine. If you submit unsolicited material to us, you
automatically grant Future a licence to publish your submission in whole or in part in all
editions of the magazine, including licensed editions worldwide and in any physical or
digital format throughout the world. Any material you submit is sent at your risk and,
although every care is taken, neither Future nor its employees, agents or subcontractors
shall be liable for loss or damage.

We cover everything from avoiding getting blacklisted to speeding up your site and getting the best out of Google Analytics. Read on to discover top search, marketing and social tips from industry leaders.

The text paper in this magazine is totally chlorine free. The paper manufacturer and Future Publishing have been independently certified in accordance with the rules of the Forest Stewardship Council.

Contents

Getting started in SEO
30 best new SEO tools: Build the perfect toolkit for SEO (page 8)
Top SEO myths: Ten SEO myths destroyed (page 14)
Search marketing trends: Top five trends (page 18)
SEO for startups: A primer on SEO for startups (page 20)
20 best Drupal modules: An SEO toolkit for Drupal (page 22)
Blacklisted tricks: SEO tricks that get you blacklisted (page 24)
Localising content: Improve rankings and traffic (page 26)
The future of SEO: Discover what's next for SEO (page 30)

Features and insight
Get to the top of Google: Bryson Meunier has the details (page 38)
Reduce your bounce rate: Keep your visitors longer (page 46)
Google's Analytics guide: The insider's guide to Analytics (page 50)
Optimise for mobile: Twelve mobile techniques (page 56)
Master mobile navigation: Content for mobile devices (page 62)
Understand your audience: Techniques to understand users (page 70)
Beat Google link penalties: Top techniques to tackle penalties (page 74)
Content strategy in-depth: Sandi Wassmer's content series (page 78)
Interview: Karen McGrane, the content strategist in profile (page 92)

Expert tutorials
Serve faster web pages: Use Nginx to serve static content (page 98)
10 top SEO tips: Glenn Alan Jacobs shares top tips (page 101)
Utilise structured data: Schema for ecommerce products (page 102)
Boost page performance: Provide a faster page load time (page 108)
Make sites load faster: How to make sites render faster (page 112)
Improve page load times: Use content delivery networks (page 116)
Build a responsive site: A simple responsive website (page 120)
Retrofit older sites: Use responsive techniques (page 126)

Become a web pro (pages 134-160)

Search
Semantic search: Optimising web pages
What does 'SEO' mean?: A definition for SEO
15 post-Penguin tips: Post-Penguin link building, backlinking top tips
Seven SEO tools for your toolbox
SEO is the glue: SEO in the web design process
Conversational SEO: Less type, more talk
Website migration: Move sites with renewed ease

Analytics
Inconsistent data?: Aiming for data reliability
Keywords driving sales: Which keywords are best?
What's the point?: The purpose of web analytics
Testing times: Ensuring successful CRO, conversions are changing
Customer conversion journeys: Getting the balance right

Marketing
Remarketing: Google's remarketing option, a powerful marketing tool
Blogging: Using infographics, your marketing secret weapon
Google Adwords: Enhanced Campaigns analysis
Inbound marketing: Why the term isn't relevant
Targeting the customer: The SoLoMo trend
'S' stands for success: Techniques and opportunities

Social
Good social content: Optimising best practice, creating content to engage
Don't game the system: Why it's not possible
Social data and search: The rise of social search
Speedy social marketing: Seven essential tools
Make content shareable: Spark audience engagement

Getting started

30 best SEO tools

Inbound marketer Richard Baxter shares his top tools to build the perfect toolkit for SEO and digital marketing

You've heard the saying, 'A poor craftsman blames his tools'. In our world, it's more like 'really awesome tools make me look good to my clients', or similar. Making sure you use the best tools is critical to being thorough, competitive and exceptional at your craft, but staying up to date is pretty difficult.
I've been working in the trenches of the SEO industry for 10 years, watching different tools come and go, all gaining in complexity and usefulness as time goes on. Our industry evolves extremely rapidly, as does your need to keep an eye on what's out there to help you be more effective and agile, especially when carrying out essential, but often mundane, digital marketing tasks.
Let's take a look at the new, useful or downright awesome tools in inbound marketing, focusing on some of the key stages of the SEO process: research, technical, link building and content marketing outreach.
To write this feature, I enlisted the input of some good friends in my industry and, of course, my team of over 20 SEOs at Builtvisible (http://builtvisible.com), who work with these tools all day, every day.

Keyword research and audience profiling

Among the classic keyword research tools has always been the Google Adwords Keyword Tool. Unfortunately, the Google Keyword Tool is due to close very soon. In its place, Google announced the Keyword Planner (http://netm.ag/adwords-bz92), which has most of the data available from the original keyword tool and more to come.
If you're looking for even more keyword ideas, try tools based on the Google Suggest API like Ubersuggest (http://ubersuggest.org). You'll get far by simply creating a list of keywords and prioritising them by search volume. Type in your keywords and see what appears in the Google autocomplete box to get an idea of how people are searching around those words.
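Whichever tool supplies the numbers, that first pass is usually just this: dedupe the suggested terms and rank them by volume. A minimal sketch in Python; the keywords and volume figures below are invented for illustration, standing in for whatever your chosen tool reports.

```python
# Rank deduplicated keyword suggestions by monthly search volume.
# The volumes are made-up placeholders; in practice they would come
# from the Keyword Planner, SEMrush or a similar source.

def prioritise(suggestions):
    """suggestions: iterable of (keyword, monthly_volume) pairs,
    possibly with duplicates from different sources."""
    best = {}
    for keyword, volume in suggestions:
        key = keyword.strip().lower()
        best[key] = max(best.get(key, 0), volume)  # keep the highest estimate
    # Highest-volume terms first
    return sorted(best.items(), key=lambda kv: kv[1], reverse=True)

if __name__ == "__main__":
    raw = [
        ("seo tools", 4400),
        ("SEO tools ", 3600),       # duplicate with different casing/source
        ("keyword research", 2900),
        ("link building", 1900),
    ]
    for keyword, volume in prioritise(raw):
        print(f"{volume:>6}  {keyword}")
```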
Keyword planning Google's Keyword Planner replaces its Keyword Tool

Searchmetrics The ability to calculate search volume based on traffic algorithms

If you're building a serious data set, the smart money is in combining different data points for extra validation. Finding low-competition, high-volume keywords is every search marketer's Holy Grail. Moz's Keyword Difficulty Tool (http://moz.com/tools/keyword-difficulty) can estimate search volumes and combines data aggregated from Bing rankings for keywords by location and Moz's own link data source, Mozscape (http://moz.com/products/api).
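That cross-validation step can be sketched in a few lines: given volume estimates from any two of the sources mentioned, flag the keywords where the tools disagree badly. The dictionaries and the 2x tolerance here are illustrative assumptions, not anything a particular tool prescribes.

```python
# Cross-check search-volume estimates from two tools and flag keywords
# where the figures disagree badly. Source data and the ratio threshold
# are illustrative.

def flag_discrepancies(source_a, source_b, ratio=2.0):
    """Return keywords present in both dicts whose estimates differ
    by more than `ratio` (e.g. one tool reporting double the other)."""
    flagged = []
    for keyword in source_a.keys() & source_b.keys():
        a, b = source_a[keyword], source_b[keyword]
        if a == 0 or b == 0:
            continue  # a zero estimate can't be ratio-compared
        if max(a, b) / min(a, b) > ratio:
            flagged.append(keyword)
    return sorted(flagged)

planner = {"seo tools": 4400, "link building": 1900}
other = {"seo tools": 4100, "link building": 9800}
print(flag_discrepancies(planner, other))  # 'link building' needs a closer look
```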
Bing Webmaster Tools (www.bing.com/toolbox/webmaster) has a nifty keyword tool, showing your average ranking position and the number of clicks and impressions for a particular keyword.

Bing tools A tool to show your average ranking position

If you have an Adwords API key, you could consider extracting keyword search volumes via its API by working with your development team (or using an Excel tool like our Adwords API Extension for Excel: http://builtvisible.com/seogadget-for-excel). SEMrush (www.semrush.com) have a powerful API and present search volumes as reported by Google. An article by Russ Jones (http://netm.ag/russ-bz92) found that SEMrush's data had the lowest error rate (compared to its own index) and a high level of coverage.
What about the new stuff? Keyword research can move slowly at times. Because there's no direct return for your efforts (just because you've done some keyword research hardly means your traffic will grow), I suspect lower levels of investment find their way into this corner of the SEM universe.
With that said, we're excited about Grepwords (www.grepwords.com), currently in beta, as a newcomer to the keyword research tool space, as well as Searchmetrics (http://suite.searchmetrics.com/auth/login), which calculates a search volume based on its own traffic algorithms.

Search engine visibility monitoring

When it comes to your organic rankings, there are lots of interesting tools that are handy for a quick health check or larger-scale monitoring challenges. If you're working in multiple locations, and you'd just like a little data, small web apps like Search Latte (http://searchlatte.com) help you check rankings in different countries quickly.
With that said, some of us want to see all of the data! We use a few tools for rank checking on a day-to-day basis. Getstat (http://getstat.com) is an excellent, enterprise-level keyword tracking platform, with detailed reports, clear data presentation and a useful alerts service. It's also able to collect ranking data at the regional level, which is really useful for tracking rankings by US state.
Advanced Web Ranking (www.advancedwebranking.com) is a powerful solution for scheduled, localised ranking, link monitoring and keyword research. It's also a powerful, site-crawl-based search engine accessibility monitoring platform. Combined with proxy services like Trusted Proxies (www.trustedproxies.com), it's fast and scalable enough for most in-house SME SEO teams and agencies. Usefully, it can be configured to run on a server, with AWR clients connecting to a single data source across your network.

The data Sometimes you want to be able to see all of the data. There are plenty of tools available to help you on a daily basis

Trusted Proxies Suitable for most in-house SME SEO teams and agencies, Trusted Proxies can be configured to run on a server

Technical SEO and search engine accessibility

I've always thought Bing SEO Analyzer (www.bing.com/toolbox/seo-analyzer) in Bing Webmaster Tools is a really good tool for quickly identifying on-page issues, like malformed containers, missing H1 elements and the like. Its real power comes from a simple-to-interpret user interface, often lacking in so many 'technical' SEO tools. The tool visibly renders the web page you're analysing, and highlights any issues it finds during the analysis process.
Moz's PRO toolset (http://moz.com) comes with a deep site crawler (lovingly referred to by its team as Roger Mozbot). Approximately once a week, you receive an update to your crawl data, with a user interface that updates you on crawler-discovered errors, warnings and notices. Moz have a very simple to use, visual interface that's ideal for newcomers to SEO. Its data export, API services, link analysis and social monitoring make for a well-rounded advanced SEO campaign solution. Export data from its tools includes advanced, technical SEO features like the contents of the X-Robots field in your server header response. Hardcore!
Lately, Screaming Frog's SEO Spider (www.screamingfrog.co.uk/seo-spider) has become the 'go to' site crawler. Able to highlight SEO accessibility issues, SEO Spider comes with powerful filtering to weed out specific issues, like missing Google Analytics tracking code. It also has a nifty sitemap generator.

SEO Spider Screaming Frog's SEO Spider has become the 'go to' site crawler

I'm very excited about the premium service, DeepCrawl (www.deepcrawl.co.uk). It's a great deal more pricey than annual subscription tools like Screaming Frog, and free tools like IIS SEO Toolkit (http://netm.ag/iis-bz92). Follow the handy installation guidelines (http://netm.ag/guidelines-bz92): DeepCrawl has the capacity to crawl industrial-size websites with millions of pages. This is something the others simply can't do.
Log analysis has taught me more about SEO than any other single activity in the last

SEO Analyzer Bing's tool is great for quickly identifying page issues

decade. You learn so much about SEO simply by looking at the resources Googlebot requests on your website. On that note, we recommend you try the free edition of Splunk (www.splunk.com/download) with a recent log file export, to see what you can find.

Ahrefs A new tool to the scene, this link data monitoring tool is fast and has a powerful API
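Before reaching for Splunk, a few lines of Python will already show which URLs Googlebot requests most often. This sketch assumes the Apache/Nginx combined log format and matches on the user-agent string alone; a production version should verify the crawler (for example via reverse DNS), since the user agent is trivially faked.

```python
import re
from collections import Counter

# Combined log format: ip ident user [date] "METHOD path HTTP/x" status size "referer" "user-agent"
LINE = re.compile(
    r'"(?:GET|POST|HEAD) (?P<path>\S+) HTTP/[^"]*" (?P<status>\d{3}) \S+ "[^"]*" "(?P<agent>[^"]*)"'
)

def googlebot_hits(log_lines):
    """Count requests per path for lines whose user agent mentions Googlebot."""
    hits = Counter()
    for line in log_lines:
        m = LINE.search(line)
        if m and "Googlebot" in m.group("agent"):
            hits[m.group("path")] += 1
    return hits

sample = [
    '66.249.66.1 - - [01/Feb/2014:10:00:00 +0000] "GET /products HTTP/1.1" 200 5120 "-" '
    '"Mozilla/5.0 (compatible; Googlebot/2.1; +http://www.google.com/bot.html)"',
    '10.0.0.5 - - [01/Feb/2014:10:00:01 +0000] "GET /products HTTP/1.1" 200 5120 "-" '
    '"Mozilla/5.0 (Windows NT 6.1)"',
]
print(googlebot_hits(sample).most_common())
```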

Link analysis, monitoring and reporting

Link analysis has always been a rapidly evolving area of our industry. In light of Google's very recent Penguin algorithm updates, that evolutionary rate of change has increased exponentially. Chrome extensions like Check My Links (http://netm.ag/check-bz92) are extremely useful for everyday broken link building and general on-page link checking. The rather wonderful Scraper (http://netm.ag/scraper-bz92) makes light work of fetching URLs in batches from web pages.

Redirect Path Cleverly logs each redirect step

The Web Developer extension for Chrome and Firefox has long been a staple of any SEO interested in technical health.
Redirect Path (http://netm.ag/redirect-bz92) from Ayima cleverly logs each redirect step taken when a URL is requested by the browser, frequently highlighting when SEO-unfriendly multiple hops are made or, worse, where 302 redirects are lurking in the chain.
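The same audit is easy to script once you have the chain of hops. A hedged sketch: the (status, URL) chain format is this example's own convention, not Ayima's, but it reports the two problems Redirect Path highlights, temporary 302s and multi-hop chains.

```python
# Audit a redirect chain for SEO problems: temporary (302) redirects
# and chains of more than one hop before the final URL.

def audit_chain(chain):
    """chain: list of (http_status, url) hops, final destination last."""
    issues = []
    redirects = [(s, u) for s, u in chain if s in (301, 302, 303, 307, 308)]
    if len(redirects) > 1:
        issues.append(f"{len(redirects)} hops before the final URL")
    for status, url in redirects:
        if status == 302:
            issues.append(f"temporary 302 redirect at {url}")
    return issues

chain = [
    (301, "http://example.com/old"),
    (302, "http://example.com/interim"),
    (200, "http://example.com/new"),
]
for issue in audit_chain(chain):
    print(issue)
```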
There are some well-known players in the link data industry. Majestic SEO (http://developer-support.majesticseo.com) and Moz's Mozscape (http://moz.com/products/api) both have a vast reach into the link graph (our agency uses the API services offered by both companies for our in-house tools). Probably the most frequently used tool in-house at Builtvisible for fast link appraisal would be Open Site Explorer (www.opensiteexplorer.org). For really deep-dive stuff we consolidate data from all sources, including Google's Webmaster Tools.
If you're an Excel junkie, managing all of these data sources gets a lot easier with Builtvisible's own Links API Extension for Excel (http://builtvisible.com/seogadget-api-available-links-api-extension). The Excel plug-in talks to API services from Majestic, Moz, Builtvisible's own Links Contact API (http://netm.ag/links-bz92) and, soon, the Ahrefs API (https://ahrefs.com/api).
If you're into deep SEO auditing with Excel, and you'd like a few new tools, install Niels Bosma's SEO Tools for Excel (http://nielsbosma.se/projects/seotools).
Relatively new to the link data scene is Ahrefs (https://ahrefs.com). The link data monitoring is extremely fast (new and lost link discovery seems to be a real strength for these guys). We rate the toolset in the 'hardcore' category for link data mining. It has a very powerful API, too.

Pyscape Pyscape (http://netm.ag/pyscape-bz92) solves the problem of getting data from the Mozscape API in bulk

Fresh Web Explorer The new darling of the real-time mentions monitoring scene, you can compare mentions of your favourite terms found on the internet up to four weeks ago

For the Python-minded, Benjamin Estes's Pyscape (http://netm.ag/pyscape-bz92) is for you. It solves the problem of getting data from the Mozscape API in bulk. Anyone who can run a Python script in Google App Engine should be up and running with this in minutes.
For those times when you think you may have been working with the wrong SEO agency, and your links could be to blame for a recent drop in your organic rankings, we're excited about LinkRisk (http://linkrisk.com) as a fast and powerful link audit tool. It identifies suspect links that may need removal, and it's a useful tool to base some of your outreach for link building on, too.

Social monitoring and metrics

Social Crawlytics (https://socialcrawlytics.com) is a site-crawl-based competitive social analytics tool that (among other useful reports) provides page-by-page social metrics, author popularity and a breakdown of page-level shares by social network via a solid UI or API interface. It's free, which is nice!
On the subject of social, my favourite tool on the web is Topsy (http://topsy.com). Topsy's a powerful real-time social search engine, allowing you to search by URL or search term, delivering mentions by social profiles on Twitter and Google+. See the example search result for 'SEOgadget.com' here: http://netm.ag/topsyeg-bz92. Note the ability to filter for 'influential only' results.
The new darling of the real-time mentions monitoring scene is Fresh Web Explorer (http://freshwebexplorer.moz.com). You can compare mentions of your favourite terms found on the internet up to four weeks ago, export the data and combine it with other information from your tools. My favourite feature is the ability to find mentions of your site that don't currently link. Very useful.
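That unlinked-mentions check can be approximated offline, too: given the HTML of pages where a monitoring tool found your brand mentioned, keep the ones that never link back to your domain. A rough sketch; the brand, domain and page URLs here are placeholders.

```python
import re

def unlinked_mentions(pages, brand, domain):
    """pages: dict of {page_url: html}. Return page URLs that mention
    `brand` in their markup but contain no link to `domain`."""
    results = []
    for url, html in pages.items():
        if brand.lower() not in html.lower():
            continue  # no mention at all; nothing to chase
        links_home = re.search(
            r'href=["\'][^"\']*' + re.escape(domain), html, re.I)
        if not links_home:
            results.append(url)  # outreach candidate: mention, no link
    return sorted(results)

pages = {
    "http://blog.example/post1": "<p>I like Builtvisible.</p>",
    "http://blog.example/post2": '<a href="http://builtvisible.com">Builtvisible</a>',
}
print(unlinked_mentions(pages, "Builtvisible", "builtvisible.com"))
```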
Richard Baxter is a regular contributor to the SEO
and inbound marketing industry while running a
busy, technology-based SEO and inbound marketing
agency, Builtvisible (http://builtvisible.com).

Social Crawlytics A great site-crawl-based competitive social analytics tool that provides page-by-page social metrics, and it's free

The top 10 SEO myths

Search marketing consultant Mark Buckingham destroys his top 10 favourite search engine optimisation myths
Chances are, if you're reading this, you've dabbled with SEO to varying levels of success, from minor frustration to Google gratification. Whatever your experience, myths still abound on the oft-laboured carousel of search engine optimisation: at its best, a well-planned, ongoing strategy in the pursuit of visibility and primed accessibility, underpinning, informing and complementing exemplary design, usability and content. At its worst, a dark art, misunderstood, an unwieldy afterthought, a quick fix that does you more harm than good.
For truly effective SEO, working in tandem with good design, content creation and general business practice, look beyond the dashboard for the route to success. Drawing on leading experts in the search engine optimisation industry, such as Google's Matt Cutts and Search Engine Land's Matt McGee, let's look at 10 of my personal perennial favourite SEO myths.

1. Satisfaction, guaranteed

Let's start with the bedrock of search marketing: there is really no such thing as guaranteed rankings when it comes to organic, or natural, search results. Any company or specialist proffering such should be treated warily; ask whether they're referring specifically to organic search terms or paid search. Whilst it is possible to speculate on long-tail niche keyword searches, for all but the most niche key terms, results will vary and can take weeks, if not months. A good search marketer will set realistic expectations, using SEO to prime all areas of your website and content, rather than offer empty promises.
There are one or two hundred factors that influence your ranking with the search engines; no company or individual can control all of these. SEO might be best achieved with great skill, but there are myriad external factors, dependent on the success of your products or services, not to mention a slice of luck, involved in determining whether or not you achieve good visibility on the internet mantelpiece.
Search Engine Land's (http://searchengineland.com) editor-in-chief, Matt McGee, says: "The only way to even possibly come close to guaranteeing rankings is if you're doing it on the paid side and happen to have a term that you're willing to bid high enough on and to get high enough clickthrough to sustain top spot. Also, personalisation comes into play: what you see might be different to what I see, so there's absolutely no way to guarantee a number one ranking on Google."

2. A high Google PageRank = high ranking

Despite popular belief, Google PageRank does not equal your ranking. "The idea that a high PR means you're going to rank across the board for everything is a myth. For certain keywords a lower PR page might outrank a higher PR page, but the rankings don't specifically go in exact PR order," says Matt McGee.

SEO land Search Engine Land is a new hub for everything related to SEO

"Having a high PageRank is nice but it doesn't automatically mean high rankings for everything, and it certainly doesn't automatically mean you're going to get tons of traffic and sales."
McGee adds: "It's still often seen as the number one factor in Google's algorithm when it's actually one of a couple of hundred factors. It's a very visible symbol for a lot of webmasters and business owners, but the more time you spend in the search world, the sooner you realise it's not the be-all and end-all."

3. Endorsed by Google

Put simply, if you're dealing with a firm who make any allusion that they're endorsed or approved by Google for optimisation purposes, it's likely they're a fraud. The reality is that Google does not endorse any SEO company. "They do have Analytics and AdWords certification, so providers in these areas can take tests for accreditation. Google definitely does not put their stamp of approval on any individual consultant or company," affirms Matt McGee.
Personally, I'm not opposed to the idea of some accreditation or regulatory standards, given this very subject matter and the unregulated nature of the search world, but I just can't see it happening any time soon. Google's Webmaster Guidelines (https://support.google.com/webmasters/answer/35769) and its beginners' guide to SEO (http://netm.ag/starter-bz92), as well as various

Beginner's guide Google's SEO starter guide (http://netm.ag/starter-bz92) covers around a dozen common areas that webmasters could consider optimising

esteemed resources on the web, should be consulted when undertaking any SEO or hiring a professional, but many professionals note that what they teach you is very vanilla. McGee adds: "It gets you in the door but it's not always going to be everything that you need."

4. Meta tag keywords matter

A perennial favourite myth concerns the keywords meta tag. Google's head of webspam and all-round search sage, Matt Cutts, says: "Google doesn't use the keywords meta tag in our scoring at all. It's just a waste of time to throw a lot of phrases into the keywords meta tag. It would be a better use of your effort to do things like speed up your website, because that can directly improve the usability of your site even independently of SEO." Meta tag descriptions, and certainly titles, matter, but it's true the keywords tag is generally completely redundant across the board.
David Mihm, president of GetListed.org, agrees: "'Can you help me optimise my meta keywords?' This is probably the number one phrase I hear from small business owners who call and want me to help them with website optimisation. The fact is that no search engine uses them any more. Google, which rarely discloses ANYTHING important about its algorithm, formally declared it does not use meta keywords via its search quality guru Matt Cutts nearly two years ago. The two metas that site owners should still worry about are including keywords in the <title> tag (extremely important for optimisation), and the meta description, which, although it does not seem to affect ranking, can be used to increase clickthrough rates from the search result pages."
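Auditing those two metas programmatically is straightforward. A small sketch using Python's standard html.parser; the checks are this example's own, not a Google-endorsed audit.

```python
from html.parser import HTMLParser

class MetaAudit(HTMLParser):
    """Collect the <title> text and meta description from an HTML page."""
    def __init__(self):
        super().__init__()
        self.in_title = False
        self.title = ""
        self.description = None

    def handle_starttag(self, tag, attrs):
        if tag == "title":
            self.in_title = True
        elif tag == "meta":
            a = dict(attrs)
            if (a.get("name") or "").lower() == "description":
                self.description = a.get("content") or ""

    def handle_endtag(self, tag):
        if tag == "title":
            self.in_title = False

    def handle_data(self, data):
        if self.in_title:
            self.title += data

def audit(html):
    """Return (title, description, notes) for a page's HTML."""
    p = MetaAudit()
    p.feed(html)
    notes = []
    if not p.title.strip():
        notes.append("missing <title>")
    if p.description is None:
        notes.append("missing meta description")
    return p.title.strip(), p.description, notes
```

Feeding it a page with both metas present returns an empty notes list; a bare page is flagged for both.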

5. Cheat your way to the top

Attempting to trick Google, Bing et al and manipulate search results is a bad idea, and even if you succeed, if and when the search engines discover your site's deception, you risk your site being removed from the index, with potentially disastrous business consequences.
It's arguable that Google et al might miss the odd page with a few sneaky invisible keywords; after all, this might be the work of an errant (but potentially well-meaning) assistant and not your own work. But a trend or consistency of black-hat SEO is likely to do you much more harm than good as the search engines get better and better at sniffing out sites, from dubious redirects to affiliate link farms, that simply don't deserve to be there. The basic adage is: if it works for the user, it's likely to have a place on Google. How far up you climb is dependent on myriad factors, and those sites that cheat aren't just risking their credibility, but usually reek of over-optimisation, which in some cases can be a by-product of a site that was never really designed to please its audience first and foremost. Being gung-ho in your quest for high rankings at the expense of your content is nearly always a futile process.

6. Keywords? Cram 'em in

The notion that every page needs a certain keyword percentage to outrank the competition is a fallacy. Says Matt McGee: "I've always said you do have to use the keywords; you need to have pages that talk about the products and services you sell. There's no perfect number: it's not that if you mention the keyword seven times on this page I'm automatically going to rank well. It doesn't work that way: there are so many other factors, and a page that gets a lot of inbound links with the right anchor text can rank for terms that don't even appear on the page. The notion that there's a perfect percentage for keywords simply isn't true."
Furthermore, your copy should be persuasive, informative and punchy: you'll only serve to limit your copy's punch by simply clawing keywords into the text. Be verbose, create opportunities to talk about your company, products and niche verticals, but never be repetitive.
David Mihm adds: "It's a myth to say 'I will optimise your website's keyword density'. It is important to include keywords on your pages but there is no magic number of times to use a keyword. Write your text for humans!"
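For what it's worth, the 'density' this myth obsesses over is a trivial calculation, which may be part of why it persists; a metric this crude could never capture how rankings actually work. A quick sketch (single-word keywords only, and the sample copy is invented):

```python
import re

def keyword_density(text, keyword):
    """Occurrences of a single-word `keyword` as a fraction of total
    words, case-insensitive."""
    words = re.findall(r"[a-z']+", text.lower())
    if not words:
        return 0.0
    hits = sum(1 for w in words if w == keyword.lower())
    return hits / len(words)

copy = "Widgets for every budget. Our widgets ship worldwide."
print(keyword_density(copy, "widgets"))  # 2 of 8 words
```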

7. Spending money on Google AdWords boosts your rankings

The assumption that spending money on AdWords will somehow endear you to Google and thus advantage your organic search listings is an understandable, but untrue, belief. Google has said so many times over the years, but the myth never seems to go away. It's no mistake, however, to identify some correlation between targeted ad spend and your site's organic coverage. Search expert Matt McGee says: "I've seen studies over the years that suggest that when you have good visibility on both your paid and organic it increases clickthrough on your pages, and thus traffic, increasing awareness, which leads to more links, etc.
"I certainly think there's nothing wrong with spending money on AdWords. But it's definitely a myth that there's a direct impact on your rankings."

Webspam head Google's head of webspam, Matt Cutts

Adwords myth Don't expect Google to boost rankings just because you bought Adwords

Moz director David Mihm of Moz's local search strategy team

Check listings Moz's Getlisted.org (https://getlisted.org) checks your listings on Google, Bing and other local search engines

"What really matters is the speed, depth and richness of the content you deliver" Mark Buckingham

8. Land here

Every page on your site should be treated as a potential landing page; you can't assume a visitor is going to land on your homepage or your products overview page. The idea that you have one special search landing page is not helpful. All pages are possible landing pages.

9. Set it and forget it

It's true that continually jostling for higher rankings, making incessant iterations and tweaks, doesn't give you time to sit back and monitor the success of your hard work, and can be a fruitless process. It's also inadvisable to go to the other extreme and assume SEO is a one-off project. Good SEO never really ends, just as a successful company wouldn't settle for one single marketing investment.
If you think you've achieved all your SEO, I'll bet you're not making the most of your website and your offline marketing activities. There's always more that can be done, and even if your rankings don't immediately benefit, your site will. Even with limited resources, adding or improving a single page every month is better than leaving a static site to flounder, which may, in time, be superseded by your competition and afforded less currency by your users and engines alike.

10. Rankings aren't the only fruit


A lot of people come in to SEO thinking that the end goal is to get rankings; but
the end goal is to make money.
"If a number one ranking for a certain keyword isn't making you money, it's
worthless. If a number three or number four ranking is getting you clicks and
you're converting your traffic into customers, then that ranking is much more
valuable," says Matt McGee.
This is my favourite myth of all. Being on top is great but, in my opinion,
it isn't the be-all and end-all, and it won't necessarily yield your site maximum
conversions. Naturally you need prominence, but the quality of the site and
your content also matters. I'd wager a site in fourth place on the first page,
above the fold, that fulfils the visitor's requirements is, by and large, going to
be more successful than one that belies its pole position through lacklustre
content, relying more on inbound links and other good fortune to supplant its
superior competition.
What really matters is the speed, depth and richness of the content you
deliver; whether your audience buys into it, you, your brand, services or
products; and how consistent that message is across the web and in the real
world. SEO should be a laboured but fluid process, priming your good work
and ensuring it's tweaked, organically, for maximum accessibility; not just
an afterthought. Good SEO is about putting your best foot forward and
continually developing the site to be simply as good as it can be. Place your
visitors first, and the search engines will follow.
Remember that rankings are a means to an end; they are not the end itself.

The SEO Handbook 17

Getting started


Five search
marketing trends
Kelvin Newman, director at SiteVisibility, shares his top
five trends for search marketing
It's been an interesting couple of years for the digital marketing
industry: Facebook continued to rise despite a shaky initial public
offering (IPO) and a fightback from Microsoft's Bing. Let's not forget
the interesting new platform, Google+, which celebrated its first birthday
in June 2012. While Google's social network may be struggling to catch the
public's attention, it promises to be very influential in the future of search.
The continued impact of Google's Penguin and Panda updates has
re-shaped the search and SEO industry. Brands of all shapes and sizes have had
to learn how to adapt to more white-hat tactics to prevent being penalised
by Google's algorithm updates, which target webspam.
Pleasingly for most honest SEOs, the decrease in ranking for some sites has
actually opened up opportunities for those who have played by the rules in
the past.
So, with Penguins and Pandas aside, here's what we think will shape the
industry throughout the year and beyond:

1. Structured data
Google and other search engines are pulling more structured data into search
result pages. Therefore, it will be vital for digital marketers to mark up data in
search-friendly ways, such as using schema.org, microformats or microdata.
The advantage of structured data is that it allows users to refine their searches
using concepts rather than just individual keywords or phrases.
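As a sketch of what this markup can look like, here is a product snippet marked up with schema.org microdata; the item names and values are illustrative, not taken from any real site:

```html
<!-- A product marked up with schema.org microdata (illustrative values) -->
<div itemscope itemtype="http://schema.org/Product">
  <span itemprop="name">Widget Pro</span>
  <div itemprop="aggregateRating" itemscope
       itemtype="http://schema.org/AggregateRating">
    Rated <span itemprop="ratingValue">4.2</span> out of 5,
    based on <span itemprop="reviewCount">38</span> reviews
  </div>
</div>
```

A search engine that understands schema.org can use the name and rating here to enrich the result listing, rather than guessing at the page's meaning from raw text.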

"The continued impact of Google's updates has re-shaped the search and SEO industry" Kelvin Newman
2. Social signals as a ranking factor
Studies show that more widely shared content tends to rank better on search
engines. At the moment most believe the relationship isn't causal, but this may
change in the future. Understanding the measurement of social signals is
imperative. Ultimately, there are many reasons to embrace social media in
your marketing endeavours: do it because it makes sense for your business, not
because it has significant ranking power; any SEO benefit is a bonus.

3. Siri and APIs for discovery


The methods by which people discover content online are moving away from
traditional search and navigation. Users are now finding content away from
traditional search interfaces, via apps on a great variety of devices. The key will
be giving these people the ability to access your database and eventually
convert them into customers. APIs, of course, allow data from one site to flow
outside of it through an app or a mashup with another internet service. All
businesses should have one, and I expect this to be part of most digital
marketing strategies in the rest of 2013 and beyond.

Apple's Siri Voice recognition technology provides new ways to discover content online

4. Mobile search and responsive web design


More and more searches are being completed on mobile devices. In fact,
Google has stated its preferred mobile solution is responsive design, which
re-orders and shapes the page for different-sized devices rather than serving
different URLs. I suspect many businesses are clued up on this now, and that
most specs for site redesigns will include a responsive layout.
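The mechanics can be sketched in a few lines; the breakpoint and class name here are illustrative:

```html
<!-- One URL serves all devices; CSS re-flows the layout -->
<meta name="viewport" content="width=device-width, initial-scale=1">
<style>
  .sidebar { float: right; width: 30%; }
  /* Below 600px, the sidebar drops under the main content */
  @media (max-width: 600px) {
    .sidebar { float: none; width: 100%; }
  }
</style>
```

The same HTML is sent to every device, which is exactly why Google favours this approach: there are no duplicate mobile URLs to crawl or consolidate.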

5. Shifting line in acceptable links


Google isn't the enemy; Google is the referee. Google has, rightly, been
cracking down on manipulative link-building. As these parameters change,
what was acceptable in the past may no longer be; you need to fully
understand where your existing links are coming from and have a sensible risk
assessment for the future.
If the last 18 months are anything to go by, then we're in for an eventful
time. Some of the changes I discuss above seem very likely, but those with
the biggest impact will probably be Black Swans: events that have a major
effect and are often inappropriately rationalised with the benefit of hindsight.
There's very little we can do to prepare for these events other than keep our
eyes and ears open, and stay agile.



20 best Drupal
modules for SEO
Mark Roden, editor of WebCommune, assembles
a comprehensive SEO toolkit for the Drupal CMS
Drupal is the best CMS for search engine optimisation (SEO). The
community has contributed a ton of modules to ensure webmasters
are adhering to best practices and are equipped for the future. In
fact, the wealth of CMS tools provides users with the ability to control all
elements of their campaign.
Due to the impending final release of Drupal 8 at the time of writing,
the following list of modules is mainly for Drupal 7. (Drupal typically only
supports two versions, so it's a safe bet to focus on the middle ground of 7 in
the meantime.) With the proper combination of modules, Drupal morphs into a
flexible platform equipped for the ever-changing world of SEO.

1. SEO Compliance Checker


This module (https://drupal.org/project/seo_checker) delivers feedback on
compliance with a preset series of user-defined rules. Upon creation or
modification of nodes, the module runs a set of checks and displays the
results in a table. Tested elements include title attributes on <a href> links,
keyword density/usage, and alt attributes on <img> tags.

2. SEO Checklist
A vital download. This (https://drupal.org/project/seo_checklist) ensures you've
dotted the i's and crossed the t's in your quest to be the best. The module
doesn't actually enforce any function, but does provide a comprehensive
checklist for on-site SEO, marking off each item as you complete it. This can
prove highly valuable for those who aren't so familiar with the logistics of SEO.

3. Page Title
This module (https://drupal.org/project/page_title) provides the ability to
control individual nodes, setting content in the <title> tag. The page title is one
of the most important elements in a successful SEO campaign, and this is a
vital module.

4. Path Redirect
With Path Redirect (https://drupal.org/project/path_redirect), you can redirect
URLs from one path to another path. Using this module is important for
maintaining the integrity of your site, and preventing search engine crawls from
resulting in error. Additionally, links existing on external sites are preserved, and
wont result in a 404.


5. Global Redirect
Global Redirect (https://drupal.org/project/globalredirect) fixes duplicate URL
bugs that tend to occur with clean URLs and the Pathauto module. Although
aliases for URLs may appear different, they are in fact duplicates. With the Global
Redirect module, a 301 redirect is created from the old node URL to the alias.

6. Metatag
The concept of meta tags is still a source of comedy. A great deal of weight used
to be put on them; hysteria ensued. Their importance has decreased, though use
of the module can't hurt for providing structured metadata. In reference to SEO,
meta tags (https://drupal.org/project/metatag) include the meta description tag,
which search engines often display, and the now largely ignored meta
keywords tag.
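For illustration, a description tag of the kind the Metatag module manages might look like this; the wording is invented:

```html
<head>
  <!-- Often shown as the snippet under your listing in search results -->
  <meta name="description" content="Hand-built oak furniture, delivered across the UK. Browse tables, chairs and bookcases.">
</head>
```

Treat the description as ad copy for the click, not as a place to stack keywords.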

7. Search 404
To ensure "not all who wander are lost", Search 404 performs a search on the
keywords within a non-existent URL. The module does a great job of keeping
visitors on your site using search engine keyword detection and expression-based
filtering from the URL. Search 404 (https://drupal.org/project/search404)
is recommended to keep bounce rates down and user engagement up.

8. XML Sitemap
XML Sitemap (https://drupal.org/project/xmlsitemap) generates a dynamic
sitemap built for search engines, adhering to the specification of sitemaps.org.
It enables you to configure content types, taxonomy terms and more for
intelligent crawls from search engines.

9. Site Verification
This module (https://drupal.org/project/site_verify) helps with the verification
process of your website. Supported methods of verification include adding
meta tags or uploading a file. Once set up, a bi-directional check is run, and
search engines can then verify that you are the site owner.

10. Google Authorship


This is a standard for obtaining authorship information, and not quite a module.
Implementing Google Authorship (https://plus.google.com/authorship) can

SEO Checklist Great for beginner SEO, this provides a handy checklist for on-site SEO

"Search engines reward websites designed for the best user experience" Mark Roden
provide great ranking signals for authors with high credibility. All you have to
do is get the authors on your site to create Google+ profiles, then add a
rel="author" attribute. Details on adding user names: http://netm.ag/rel-bz92.
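In practice that meant a byline link along these lines; the Google+ profile ID below is a placeholder, and the author name is invented:

```html
<!-- Byline linking the article to the author's Google+ profile -->
Written by
<a href="https://plus.google.com/112233445566778899000?rel=author">Jane Doe</a>
```

The `?rel=author` parameter tells Google which Google+ profile claims authorship of the page.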

11. Links Checker


This module (https://drupal.org/project/linkchecker) extracts links from content
when saved, providing reports on failed responses. Quality issues such as
broken links can be a negative signal to search engines.

12. Taxonomy Title


This module (https://drupal.org/project/taxonomy_title) enables administrators
to update the H1 for taxonomy term pages. Keyword-rich and accurate heading
tags (eg H1, H2, H3) provide structure that's thought to carry some SEO weight.

13. HTML Purifier


This module (https://drupal.org/project/htmlpurifier) filters HTML to ensure
rendered output is standards-compliant, removing malicious code in the
process. Without HTML Purifier, users leave themselves open to XSS attacks.

14. Menu Attributes


The Menu Attributes (https://drupal.org/project/menu_attributes) module
enables the administrator to target specific attributes, including id, name, class
and styles. The module comes in handy when setting items to "nofollow",
allowing for an effective linking strategy for your website.


Paths Use Pathauto and Sub-Pathauto to create new alias and keyword-rich paths for URLs

15. Footermap
Search engines reward websites designed for the best user experience.
Footermap (https://drupal.org/project/footermap) generates a site map block in
the footer region to improve navigation. Use links sparingly and efficiently.

16. Pathauto
The Pathauto module (https://drupal.org/project/pathauto) is a staple of Drupal
SEO, enabling keyword-rich paths for URLs. It ensures search engines and site
visitors can gather information about content through its URLs.

17. Sub-Pathauto
Sub-Pathauto (https://drupal.org/project/subpathauto) automatically creates a
new alias for URLs based on, or extending, an existing alias, allowing for
further generation of user-friendly URLs.

18. Content Optimizer


In tandem with Content Analysis, Content Optimizer (https://drupal.org/project/
contentoptimizer) provides statistics and recommended actions to boost search
engine rankings. Consider it an integrated version of Google Analytics.

19. Site map


The Site Map module (https://drupal.org/project/site_map) will give site visitors
a complete view of the contents of your website, including blogs, categories,
and more. Similar to XML sitemap, but more useful as an index page for the site.
Drupal generates RSS feeds automatically.

20. Google Analytics


Once you've created an account in Google Analytics, you can go ahead and
download the Google Analytics module. From there, you'll have access to key
insights on traffic, page logistics and more. This is highly recommended for any
SEO campaign.
Mark Roden is an editor at WebCommune (www.webcommune.com)



Localising content
for better SEO
Michelle Craw shows you how to improve rankings
and increase traffic by localising online content
For any business operating in or selling to multiple international
territories, the localisation of your online content is a crucial tool
in your attempts to increase traffic, improve rankings and boost
conversion rates.
Google, which is still by far the most commonly used search engine in the
western world, is constantly refining the algorithms that underpin its searches in
order to provide high-quality, relevant, useful results for its users. Several recent
Google updates have focused on giving web developers the opportunity
to provide localised versions of the same website in different territories.

Mind your language


For example, late in 2011 Google introduced a new markup attribute: hreflang.
This attribute can be used to identify translated content on multinational sites,
and helps search engines distinguish between language variations. It enables
webmasters not only to increase the quantity of web traffic but, more
importantly, the quality of that traffic.
A common problem with many multinational sites is that largely identical
content, for example in the case of a UK and a US version, creates mass
duplication across the site. The hreflang attribute solves the issue of
multinational duplication by telling search engines, for example, that this US
version is essentially a copy of this UK version, intended for a different
audience. Successfully implementing this markup will help ensure that it's the
US version which performs for the US audience.

In April of this year, Google launched an addition to this markup: the x-default
hreflang value, intended for use on multinational sites that also have a global
page or version of the site. It ensures users in countries that do not have
localised content are sent to a site's default version or page.
Let's presume www.example.com is the global version of our site, which also
has versions localised to the UK, US and Canada. Adding the x-default
annotation to the global version of the site would ensure that users from, say,
Australia are sent to the global page:

<link rel="alternate" href="http://uk.example.com/en" hreflang="en-gb"/>
<link rel="alternate" href="http://us.example.com/en" hreflang="en-us"/>
<link rel="alternate" href="http://ca.example.com/en" hreflang="en-ca"/>
<link rel="alternate" href="http://example.com/en" hreflang="x-default"/>

Rich snippets
Enhanced search results listings, known among SEO practitioners as rich
snippets, can capture users' attention and encourage them to click through
from the search engine results page.
There are multiple different types of rich markup, and it's becoming an
increasingly important feature of SEO. Essentially, it's a way to structure code on
your site so that search engines can make better use of it. The type of rich
markup we always recommend, and that is recommended by Google, is that
found at schema.org (http://schema.org), also known as microdata. Marking up
your content with these tags provides users with additional information and
makes your content look more interesting on the results page.
For example, a few results for a popular recipe will show additional
information in the search results, such as star ratings, reviews, cooking time and
calorie information, which helps attract the attention of users.

Mobilise for success

"Mobile device usage continues to grow" Michelle Craw

Page views The data shows one third of UK page views were from mobiles and tablets

As mobile device usage continues to grow across the world, international
webmasters have to tailor their strategies to incorporate new means of
searching.
The image (shown above) from late last year shows statistics from a
comScore UK Digital Future report. The data shows that almost one third of all
UK web-based page views were from mobiles and tablets. We can see that this
varies widely across countries, with the UK, the Netherlands and Russia being
particularly low in this regard.
There is an undeniable global trend in terms of users consuming content on
the move, but the recent growth in tablets is particularly interesting.
Last year Adobe released a research paper covering tablet sales and traffic to
more than 300 brands across North America, Western Europe and Asia-Pacific.
The paper predicted that tablet traffic would exceed smartphone traffic in 2013.
Just two months ago, Adobe released its latest Digital Index, confirming its
predictions. The report shows that last month, for the first time, tablets took a
greater share of page views than smartphones. This growth is particularly
impressive, given that the first iPad only launched in April 2010.
The overall message is clear: the ever-increasing sophistication of online
consumers means that, in the near future, only websites that use responsive
design to deliver content optimised to a user's particular location and device will
deliver the desired results for brands.



Google SEO tricks that will get you blacklisted
Glenn Alan Jacobs, managing director of consultancy SpeedySEO,
rounds up the top 10 SEO tips you should never follow
Black hats were used to identify the bad guys in old Wild West
movies. When it comes to search engine optimisation (SEO), the
term is also associated with unethical cowboys. The white hats
were the good guys in Westerns, just as they are in SEO.
Search Engine Law
The web has a sheriff. He's big, he's mean and he's quick on the draw. His
name is Google and if he was a character in a Western he'd be played by John
Wayne. Sheriff Google keeps the internet frontier safe for law-abiding citizens
and white hat content creators alike.
While white hat websites work within the law, search engines are locked
in an ever-escalating shoot-out with black hat practitioners. Internet users
get caught in the crossfire on a regular basis, unable to differentiate between
reputable sites and those with harmful, spam-filled content.
This guide describes the best ways to get on sheriff Google's bad side ...

Bad medicine
1) Keyword stuffing is bad medicine. Proper keyword use is not the concern of
this article, so for now well focus only on the improper kind. Keyword overuse
leads to synonym underuse, and makes for content that's inaccessible to the
average human user. Though people might not be able to read your content,
search engine robots still will. Oversaturated pages will get you penalised.
The average safe density of keywords should be between two and eight
per cent of your total word count. When creating copy you should think of
your audience, not of your page ranking.
2) Hidden text is invisible to human eyes. Keywords or links can be
camouflaged by colour-matching text to background, leaving them unreadable
to human visitors but perfectly readable to search engine bots.
More complex methods employ cascading style sheets (CSS) or layering to
hide text beneath surface copy. Such text is also readable to a search engine


spider and not a human user. Black hat operatives attempt to fill their sites
with hidden content for the express purpose of achieving higher rankings in
search lists, regardless of whether their pages are relevant to a user's initial
search request. Google law basically states that you should build your website
for users, not for search engines. Ignoring this advice by including hidden text
or links is one of the quickest ways to get your site blacklist-bound.
3) Doorway/gateway/bridge/portal pages are created specifically for search
engine bots. They are designed to target particular keywords or phrases and
will usually be extremely user-unfriendly and/or difficult to read. Because they
are simple devices used to trick people towards actual websites, they rarely
contain anything useful (other than any prominent CLICK HERE links through
to the real destinations). Black hat webmasters create portal or bridge pages
that bypass the need to click on a link completely, using fast meta refresh
commands that whisk you to another site (without so much as a by-yourleave). For this reason, many search engines now refuse to accept pages that
use fast meta refresh.
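For the avoidance of doubt, a fast meta refresh is markup of the kind below; it is shown here purely as something not to ship, and the destination URL is a made-up example:

```html
<!-- A zero-second meta refresh: the visitor is whisked away instantly -->
<meta http-equiv="refresh" content="0; url=http://real-destination.example.com/">
```

The `0` means the redirect fires immediately, which is exactly the behaviour search engines treat as a doorway-page signal.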
4) Cloaking can be achieved either through IP address delivery or agent
delivery. As with people, bots are identified by their user agent or their IP
addresses. Two sets of content are created, one delivered to the Google-bot,
the other to human visitors. The bot is deceived by the fake pages (the content
of which is usually saturated with targeted keywords) and grants the website
a higher relevancy ranking. When the user clicks on what they perceive to be
a promising link, they're promptly forwarded to a page that's nothing
to do with their original search.
5) Mirror websites are two or more separate sites that use identical content,
but employ different keywords and meta tags to obtain rankings on
alternative searches. This technique violates the rules of many search engines,
and will probably get one or all of your mirrored sites banned.

"When attaching backlinks to blog or forum posts, you should always keep your content relevant" Glenn Alan Jacobs

Bad neighbourhoods
6) Link farms, specifically free-for-all link farms (FFAs), are to be avoided like
the plague. When Google inevitably identifies an FFA link farm as a 'bad
neighbourhood', it will also infect any linked pages and eventually deflate
their values.
Link scheme participants obtain links from farm sites in exchange for fees
or backlinks, but in either case it's almost certainly an unsound investment.

7) Independent Backlinking Networks (IBLNs) are an entirely different kettle
of fish. Black hatters with cash to burn and time to waste might choose to
use IBLNs. A network of sites is set up solely to provide backlinks to the
pages you wish to promote, in such a way as to increase your standing in
search engine rankings. The process is costly as well as time consuming and, if
Google finds out, can lead to you getting your entire network dropped from
the index (including the site that you're optimising).

8) Backlink generation is a good thing. However, generating backlinks too
quickly is a bad thing. A new website that suddenly surfaces with an inordinate
number of backlinks looks suspicious, and spamming will be suspected by
Google. Therefore, you should build backlinks at a natural pace to avoid
incurring penalties.
When attaching backlinks to blog or forum posts, you should always keep
your content relevant and attempt to bring something to the conversation.
If you don't do this, you will be recognised as the spammer you are and
rightfully punished.

9) Scraper sites are the online equivalent of a chop-shop. They are spam
websites that steal existing content using web scraping, often to generate
advertising revenue or to manipulate search engine ranks. Web scraping
works in a similar way to web indexing, which is the process employed by
most search engines to create ranking lists.
Unscrupulous black hat webmasters use scraping to gather content, before
repackaging it for their own purposes. Using someone else's content (even
open content) can constitute copyright violation if you fail to adequately
reference it.

10) Phishing pages are (according to Google) pages "designed to look like
another page in an attempt to steal users' personal information". The reasons
why phishing will get you blacklisted should be obvious, so don't even think
about doing it.


SEO for startups


Tom Gullen presents a primer on SEO for startups, explains
common mistakes and stresses the importance of accessibility
SEO is an industry that sparks frequent heated debate and
passionate responses. It's an industry that is often misunderstood
and even dismissed. Yet for startups a basic SEO foundation and
understanding of it is likely to be of crucial importance, and can really help
them on their path to success.
So how do we go about beginning to optimise our startup's website for
search engines?

Accessibility and SEO
Accessibility should be a primary concern for websites, not only because it
makes your website accessible for less able people, but also because a search
crawler bot should be considered your least able user. Developing your website
in a highly accessible manner comes with the additional benefit of making it
highly accessible for search engine crawlers. Basic accessibility for websites
isn't difficult to achieve.

Accessibility with dynamic content
Disabled JavaScript is not exclusively the domain of archaic browsers; plug-ins
such as NoScript (http://netm.ag/noscript-bz92) have millions of users. The
Google crawler is able to execute some JavaScript when crawling. However,
it's a risky proposition to rely on this if your content is inaccessible to agents
that have disabled JavaScript.
When developing a website I firmly believe that progressive enhancement
is an essential principle that should be adhered to. If, for example, we are
building an online store for music CDs, we would first build it without any
JavaScript. A user will click on an artist and a new page will load. Once the site
functions and displays perfectly in this manner, we can then add layers on top,
such as dynamic content loading with Ajax.
This has two advantages. If your JavaScript fails, there's a good chance
the site will still function and display properly; but most importantly, you've
created a highly accessible website. Although you're using the latest
techniques, you haven't sacrificed accessibility in the process, and as a result
crawlers are free to roam your website.

NoScript NoScript is a popular extension for Mozilla-based browsers that only allows JavaScript to be executed on websites the user chooses

Page title
Page titles are an extremely important part of your page, because they are
very frequently relayed in search engine results and carry weight in search
engine ranking algorithms. It's important to keep the title as concise and
contextually rich as possible: 65 characters is a good rule of thumb.
A frequent mistake is to incorrectly format the title by placing the name of
the website at the start of the title tag. It's highly recommended to place the
website name at the end of the title: in a search engine result the name of the
website is generally of little interest to the people searching, and you may
sacrifice a lot of your clickthrough rate for that page.

Bad form(at) An example of a badly formatted page title. As the website name is
presented first it can create less accessible search results

Correctly formatting images
As mentioned previously, we should consider search crawlers our least able
users. Although a search crawler can fetch and render an image,
understanding the content of the image is, for a computer, an extraordinarily
complex problem. We need to let the crawler know more about the content of
the image, and we achieve this through use of the alt attribute.
The alt attribute is used by user agents that can't display images. It should
be concise and describe the content of the image. If you have a photo of
an oak tree in a park you might give it alt text of "Oak tree in Richmond
Park". Descriptively naming the image file can also have a positive effect; for
example, oak-tree.jpg would be better than myphoto.jpg. Not only does the
search engine now have a much better grasp of the content of the image, it
also has a better idea about the content of your website in general. A well
formatted image optimised for accessibility may look as follows:

<img src="images/oak-tree.jpg" alt="Oak Tree in Richmond Park" />

This is far more accessible and gives more clues to search crawlers than:

<img src="images/dcm0000013.jpg" />

Formatting links properly
If you are linking to a page on your website that goes into great depth about
oak tree leaves, the worst way this could be linked to on your page is:

If you want to read about Oak Leaves <a href="leaves.html">click here</a>.

This is not only awful from a usability point of view, but also from an SEO
point of view. One highly important part of a link's anatomy is the text within
the link; this provides a very strong clue to search crawlers about what the
page being linked to is about. If you've ever heard of Google Bombing, this is
the underlying reason why it works.
A better way to present the link would be as follows:

Read more about <a href="oak-leaves.html">Oak Leaves</a>.

This principle extends to hyperlinks on other websites that link back to you.
A related website linking to your website in this way:

Want to learn more about Oak Trees? <a href="http://www.oak-trees.com">Visit this site!</a>

Is inferior to being linked to this way:

Learn more about <a href="http://www.oak-trees.com">Oak Trees</a>.

In the second example search crawlers are being given a big hint to the
content of the website being linked to. You obviously have limited control over
how third party websites format the links. However, it's important to keep in
mind that opportunities for suggestions to third party website owners may
present themselves in the future.

"Sitemaps have evolved and now are commonly used with XML" Tom Gullen

Meta tags
It's common knowledge now that the meta keywords tag should be
considered redundant for SEO purposes. Not only is it a waste of markup, it
also gives your competitors strong clues about the terms you are targeting!
There are, however, other very useful meta tags which should be utilised
on your website.

1 Description meta tag: The description meta tag should be a concise
overview of the page. It is often displayed in search engine results, so not
only is it important to make it as concise and descriptive as possible, but
also to think about how appealing it is for a potential visitor to click on.
Descriptions shouldn't extend beyond around 160 characters in length.
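For an oak-themed page like the earlier link examples, a description meta tag might look like the following (the wording here is purely illustrative):

```html
<!-- Placed inside <head>; kept under roughly 160 characters -->
<meta name="description"
      content="Learn how to identify oak trees by their leaves, bark and acorns, with photos of common species.">
```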
2 Canonical meta tag: The canonical meta tag is an important one that is often
overlooked by web developers. To understand why we need it, we have to
understand that search engines can treat pages with slight variations in their
URLs as separate and distinct pages. As an example, take these two URLs:

http://www.example.com/shop/widget.html
http://www.example.com/shop/widget.html?visitID=123

They could be treated as distinct URLs even though they display exactly the
same content. This could impact on your site negatively, because you ideally
want the search engines to only index the first URL and ignore the second.
The canonical meta tag solves this issue:

<link href="http://www.example.com/shop/widget.html" rel="canonical" />

Placing the canonical meta tag on the widget.html page lets crawlers
know your preferred version of the page.

Sitemaps
Sitemaps should be kept up to date and contain every URL you want to be
indexed. You might not realise that some pages are buried deeply in your
website and hard to access – a search crawler may not explore that deeply.
By listing every page on your site in a sitemap you've made your site far more
accessible to the search crawlers, and you can be sure that the search engines
will know about all your content.
Sitemaps have evolved, and are now commonly used with XML. The XML
schema for sitemaps comes with a few options, such as the last modification
date, how frequently the page is changed, and its relative priority.
If you are not completely confident in your usage of the more advanced
attributes, such as the change frequency and priority, it's best to ignore them.

The SEO Handbook 27

Getting started

The search engines are going to be intelligent enough to determine these
values for themselves more often than not. The absolutely essential thing you
need your sitemap to contain is a full directory of the URLs on your website
that you wish to be indexed.
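Following the sitemaps.org protocol, a minimal XML sitemap might look like the following (the URLs are illustrative, and the optional elements can simply be left out if you are unsure of them):

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>http://www.oak-trees.com/</loc>
    <lastmod>2013-11-04</lastmod>   <!-- optional: last modification date -->
    <changefreq>weekly</changefreq> <!-- optional: how often the page changes -->
    <priority>0.8</priority>        <!-- optional: relative priority -->
  </url>
  <url>
    <!-- loc is the only required element for each URL -->
    <loc>http://www.oak-trees.com/oak-leaves.html</loc>
  </url>
</urlset>
```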

Common SEO mistakes


Paying for link building
Link building is the process of increasing the number of links on other sites to
your website. One way this is gamed is to manufacture links to your website
en masse in an effort to feign authority. It's a quantity-over-quality
methodology that may have been successful in Google's earlier days, but as
Google's algorithms have intelligently evolved, it offers increasingly
diminishing returns.
What happens when you pay an SEO firm to link build for you? More often
than not, it will spam other websites on your behalf with automated tools.
It's a selfish tactic – you're receiving negligible benefits (if any at all) at the
expense of honest webmasters' time, as they have to clean it up off their sites.
Earlier this year Google released an algorithmic change named Penguin.
The Penguin update's intention is to devalue websites that engage in
underhand tactics such as spam link building. A highly unethical tactic in the
SEO world called negative SEO has come into the limelight since the
Penguin update. Negative SEO is the act of engaging in black-hat SEO tactics
on behalf of your competitors with the objective of getting them penalised.
It's unlikely your startup will be negatively SEO'd: it takes a concerted effort,
money and a distinct lack of ethics. However, if you're paying for a sustained
link-building campaign for your startup, you're running the risk of shooting
yourself in the foot and being devalued by Google's algorithms. Repairing
the damage of a bad-quality link-building campaign can be extremely costly,
difficult and time consuming.
Paying for link-building packages should be a huge turn-off. There are
negligible benefits (if any at all), and a huge amount of downside. It's often
the hallmark of an unethical, poor-quality SEO firm.

Keyword density
Reading up on SEO, you have probably come across terms such as 'keyword
density', referring to the percentage of words in a particular body of text that
are relevant to the search terms you are interested in. The theory is that if you
hit a specific density of keywords in a body of text you will be ranked higher in
search results.
Keyword density is an oversimplification of a numerical statistic called
tf*idf. tf*idf reflects the importance of a word in a body of text, or a collection
of bodies of text, in a far more accurate way than rudimentary keyword
density measurements. The way it's described mathematically probably isn't
the end of the story: it's likely search engines have modified this statistic and
weighted it differently in different cases to improve the quality of returned results.
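As a rough illustration of the statistic, here is a minimal sketch of the textbook tf*idf calculation (not what any search engine actually runs):

```python
import math

def tf_idf(term, doc, corpus):
    """Textbook tf*idf for a term in one document of a corpus."""
    tf = doc.count(term) / len(doc)                      # term frequency
    n_containing = sum(1 for d in corpus if term in d)   # documents containing the term
    idf = math.log(len(corpus) / n_containing)           # inverse document frequency
    return tf * idf

corpus = [
    ["oak", "trees", "grow", "slowly"],
    ["oak", "leaves", "turn", "brown"],
    ["pine", "trees", "stay", "green"],
]

# "leaves" appears in only one document, so in doc 1 it scores higher
# than "oak", which appears at the same frequency but in two documents
print(tf_idf("oak", corpus[1], corpus))
print(tf_idf("leaves", corpus[1], corpus))
```

The rarer word scores higher at the same frequency, which is exactly the property a crude keyword density measurement misses.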

28

The SEO Handbook

What conclusion should we draw from this as a new startup? You should
probably ignore it all. When you're writing content such as a new blog post,
you need to remind yourself of your objectives – you're trying to write content
that people will want to read. Text tuned to specific keyword densities has a
potentially large downside, which is that the text becomes increasingly obscure.
A well-written body of text will likely attract more good-quality links and social
shares, which in turn will increase the value of your website in the search
engines' eyes. Don't worry about keyword densities. Instead, worry about the
quality of your writing.

Ignoring clickability of search engine results


When designing your page's title and meta description, it's easy to
over-engineer them specifically for the search engines. It's important to
remember that your actual end objective is to get real people to click on
search results. If you're ranked on the first page of a search result, the text
extracted by the
search engine in the result needs to be concise, descriptive and appealing for
the visitor to click on. When designing these aspects of a page, which are
likely to be relayed into the search engine result, it's important to strike a
balance between the benefit of potentially increased rankings and the
user-friendliness and clickability of that content.

Play by the rules


Following good-quality SEO tactics that play by the rules (known as white-hat)
is the safest bet for your long-term strategy. It may be tempting at times to
engage in black-hat tactics – certainly the arguments presented by the
black-hats can be seductive and promise quick results – yet you are risking
alienating what is inevitably going to be one of your major and free sources of
traffic. Risking this channel of potential customers is not a price a startup
should be willing to pay.
Playing by the rules is a sustainable, long-term strategy. And this should
align itself perfectly with your ambitions as a new startup.
Tom Gullen is a founder of Scirra (www.scirra.com). Scirra is a startup that builds
game creation tools


Features

The future of SEO


We vet the dos and don'ts for better Google rankings



Over the past decade, I've worked
as a developer, designer and search
marketer. In my day-to-day role as
CTO of Builtvisible, I spend my time
working alongside the search agency
side of the business, creating tools, educating the
team on technology and design trends, and pushing
for greater inventiveness and innovation in the
content we produce as part of our clients' marketing
campaigns. However, I'm aware of the reputation
that the industry has of producing content purely
for links and rankings, rather than to give amazing
experiences to engage users. It shouldn't have to be
like this though. Too many people think the best way
to market their businesses is to buy the worst-quality
links from the lowest-traffic sites on the web. So
what's the industry doing about it, and where does it
go from here?

How did it come to this?

Author
Pete Wailes
is the CTO at Builtvisible
(builtvisible.com), an
international creative
search marketing
agency, and developer
of the CSS library
OpenDAWS
Illustration
Linzie Hunter
is a Scottish illustrator
and hand-lettering
artist based in
Peckham, South
London. Her clients
include: Time Magazine,
The Guardian, Hallmark,
VH1, Nike, BBC, Orange
and Marks & Spencer
www.linziehunter.co.uk

SEO as an industry began as a group of technical
people who watched the search engines and how
they were operating, deconstructing how they
worked and reverse engineering their technologies.
Therefore, it's no surprise that, with those people
being more technically inclined (and not marketers),
the practice of SEO developed in a fairly uncreative
manner. While it's certainly been useful for the
web that it exists (much of what search engines do
nowadays wouldn't be possible without the better
side of the SEO industry ensuring sites are properly
crawlable, with semantically marked-up data), it's
not all roses.
Over the years, we've seen SEOs engage in a
variety of tactics, some more effective than others,
in an attempt to game the engines. Thankfully, these
have slowly become less effective, although there are
still holes if you know where to look, or you're
willing to brute-force them.
The happy result of this, though, is that the industry
has been slowly bent towards a different path,
looking at more traditional marketing methodologies
to create the content required to get a site to rank.
This is an improvement, as marketing based on
more traditional principles, with an understanding
of messaging, branding and targeting, will deliver
results that can be measured far more tangibly.
Rather than evaluating work on quantity of links
or on PageRank, we can talk about revenue, goal
completions, customer lifetime value and so on.


The challenge we now face is that, as an industry,
we lack a deep understanding of marketing. This is not
surprising, as most of the people who make up the
industry don't come from a marketing background.
In part thanks to this, over the last three or four
years we've seen a lot of cargo-cult creativity, with
people copying tactics they've seen others employ.
They've not understood, though, why those tactics
worked, or what the strategic objectives of those
campaigns were.
One result of this has been the proliferation and abuse
of mediums and methods of content delivery, namely
guest posts and infographics, both of which Google
has now called out explicitly for abuse. If you read
between the lines, though, it's not the practice of
producing infographics, or of writing for publications,
that's the issue – it's when these are used as a
method of creating links, rather than because
they're genuinely useful, or as a result of a desire to
connect with a publication's audience.
So how do we up our game as marketers,
produce better ideas, and work with designers and
developers to produce seriously interesting content?

Redesign with SEO in mind
During 2013, we decided to refresh the
SEOgadget brand (now Builtvisible). In doing this,
we've followed the processes outlined here:
l Researching existing examples of agency sites
to understand what a good agency website
looks like today
l Prototyping first, enabling stakeholders to
better understand the concepts
l Taking content concepts to external figures to
understand the impact and gain feedback
l Crafting new and legacy content, including
video and presentations
l Creating a complete demo site, functionally
accurate, but without all the final assets in
place, to allow for final revisions
This process has saved weeks of time and
refinement as, at every stage, something that
truly represents the end product is being built
around and showcased. It's also allowed for a
far more flexible creative process: when we've
required revisions, we've been able to make
them in the browser and interact with those
changes live.
It's also allowed us to adapt the design
rapidly and sign off amends as the final content
has been produced, where that content has
necessitated changes that hadn't been foreseen
previously. The speed of change testing and
revision development has therefore been roughly
halved, versus the previous PSD-to-HTML-to-final-version
method of development.


A Better Way

Below Followerwonk is a social analytics tool for
mining Twitter's user graph

Well, the first thing to note is that it doesn't have
to be like this. As the digital industry in general has
slowly started to acquire traditional marketing talent
(and vice versa), we're seeing it start to produce some
truly compelling work. Pieces like Beats by Dr. Dre's
#showyourcolour (statigr.am/tag/showyourcolour),
The Feed by Getty Images (www.gettyimages.co.uk/
editorial/frontdoor/thefeed) and Rexona: DO MORE
(domore.rexona.com/en-GB/adrenaline/home), not
to mention more experimental design/content
forms like the oft-cited Snow Fall by The New York
Times (www.nytimes.com/projects/2012/snow-fall)
and Serengeti Lion by National Geographic (ngm.
nationalgeographic.com/serengeti-lion/index.html),
show real promise.

Further, there are campaigns like Imaginate by
Red Bull (imaginate.redbull.com) that couldn't exist
anywhere other than digital, which reach millions
through really creative storytelling, combined with
the inherent shareability that digital content can
have. These show a wonderful understanding of
the way the consumer mindset works in 2014.
However, while these have all won multiple prizes
and serious awards, as well as huge traffic and
mindshare for their clients/publications, each has
areas where it falls short.

A regular check-up is
required to make sure
that content continues
to perform
These issues range from failures of cross-browser
compatibility to a lack of specificity around the
message, failing to ensure the content is findable
from search engines and so on. Each would have
been easily fixable. If you're creating something
that's tied to an event, acquire the domains around
the main campaign terms. For #showyourcolour,
that would've been showyourcolour.com and
showyourcolour.co.uk. Set up a microsite talking about
the project, linking to events, providing an official
source for the campaign, and linking to the various
domains where social activity is happening. With
content that's less differentiated from the main
site, host it in a subfolder rather than a subdomain.
Combined with good copy and good internal
architecture, you can ensure that the main campaign
area outranks announcement pages and related
pieces, avoiding the issue Getty Images has with The
Feed, where the campaign ranks number two.
Similarly, a regular check-up is required to make
sure content continues performing. Snow Fall
currently fails to render well on most modern mobile
devices. Getting this fixed probably wouldn't take
that much development time.
So how do we, as an industry, improve the
situation, to deliver the right message to the
right consumer, through the right medium, at
the right time, whilst ticking all the right technical
boxes too? The answer is both simple and, at the
same time, frustratingly complex. To quote Dieter
Rams: "Good design is thorough down to the last
detail." Equally, our work must be designed and
engineered to completion.
Doing that is becoming increasingly hard.
The technical and creative disciplines involved
in producing cutting-edge digital work are only
bifurcating further, leaving behind creatives who
don't understand the technical side of how search
engines crawl, index and rank content, let alone
the deeper technical issues involved in producing
content at this scale. The technical people who do

Above Serengeti Lion was


created over two years, and
uses incredible video, audio
and images to present
compelling journalism

understand those things become less understanding
of the language that creatives use to describe the
outputs they need, and how they work. Enter the SEO
industry, which is perfectly placed to act as a third
component to unify these two vital elements.

How the SEO's role is changing

Above left MMM3000


(Saatchi & Saatchi working
for Mattessons) gained
huge social traction, but a
lack of search consideration
limited its success
Above right Thanks to

CSS and JS frameworks,


designers and developers
can build rapid prototypes
to better represent creative
concepts to the client


At Builtvisible, we view the SEO of today as a
technical project manager. SEOs need to be
specialised enough technically to be able to go
through server log files and unpick how a site
is crawled; they need to be able to work with
developers on frontend code to implement markup,
analytics tracking code and so on, as well as a host
of other things besides. Equally though, today's SEOs
have to have a broad range of skills in order to be
capable of working with designers and other creative
production teams on commissioning and refining a
variety of creative materials.
Whilst they don't do this work themselves, they
absolutely need to be able to converse with those
teams in the language that they use. That role is
something we see other, more specialist professions
struggle with. It's understandable too, as they're not
as constantly exposed to all the types of content that
we encourage our people to seek out. And nor should
they be; as specialists, they'd be less effective in
their chosen discipline if they did. The result, though,
is a lack of the breadth of understanding needed to
ensure a consistently high quality of output across
such a broad range of material types.
It's fair to say that we're trying to make our staff


into a hybrid project manager/technical SEO. The


closest analogous position in a traditional production
role would be a marketing coordinator, although they
tend to lack the specific technical depth we train our
SEOs to have, and instead have a deeper knowledge
of creative media and production, specialising in
fewer areas.

Mining social data to inform prototypes
Looking at the industry for a moment, over the last
year or so we've seen the future of what SEO could become.
Between the emergence of Universal Analytics,
which allows for truly complete, campaign-oriented
tracking, a much better understanding of outreach
and PR, and strong tech knowledge, the leading SEO
agencies are starting to produce really compelling
work. For two interesting demos of Universal
Analytics, search YouTube for 'WeMo Switch Universal
Analytics' and 'Measuring Dance Moves'.
At the core of this is the proliferation and
manipulation of social data across the web. It's
become far easier to mine the social web to
understand which content items and production
houses are being featured most often by the main
influencers and thought leaders in any given
industry. Thanks to APIs from Twitter and its ilk,
and at-scale scraping technologies like 80legs and
DeepCrawl, it's possible to monitor topics across the
web. This enables brands to track in real time the
ways that consumers are talking about them and the
subjects they care about.
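As a sketch of the idea, monitoring mentions of tracked phrases across a set of posts can be as simple as the following (the posts here are hypothetical; a real implementation would pull them from a social API or an at-scale crawler):

```python
from collections import Counter

# Hypothetical posts: in practice these would come from an API or crawler
posts = [
    "Loving the new #showyourcolour campaign from Beats",
    "Has anyone tried the Beats #showyourcolour thing yet?",
    "Snow Fall by the NYT is stunning journalism",
]

# Phrases the brand wants to track (kept lowercase for matching)
tracked = ["#showyourcolour", "snow fall"]

mentions = Counter()
for post in posts:
    text = post.lower()
    for phrase in tracked:
        if phrase in text:
            mentions[phrase] += 1

# Most-mentioned phrase first
print(mentions.most_common())
```

Real monitoring adds streaming, deduplication and influence weighting on top, but the core is exactly this kind of phrase counting over a firehose of posts.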


Turning to process now, let's look at how this
plays out in the real world. Armed with social data,
search volumes and site analytics, the creative SEO
team can then work to analyse what forms of content
are resonating with the specific target market the
client wishes to engage with, and what messages are
getting across most effectively.
This content can then be used in liaison with
designers and developers to quickly prototype
concepts for page layouts, complete with usable

Above Polygon's Xbox One review was timely,
beautifully built, and nailed the targeting to a
single group of passionate users: gamers

The technical and creative industries are slowly having to
adapt to ever-increasing complexities in their respective
areas online. With the proliferation of technologies
like SVG and WebGL, we're seeing a shift: the code editor
is no longer the only game in town on the web. With sites
like The Verge, Polygon, The Guardian, Airbnb and so on
taking ever more creative approaches to their content,
there's a developing need for people with
communication skills.
However, the more things change, the more they stay
the same. The content that's being created still relies on
the same marketing principles that content creators have
always used. As a result, if I had to give everyone
one recommendation, it'd be to check out the work of the
best marketers of the 20th century, and of what we've had
of the 21st so far. Start analysing why those campaigns
worked, and what it was they were trying to do. Understand
the purpose and the context of that content, what the
message was and why it worked with the group it was
aimed at.
I'd begin by reading Predatory Thinking by Dave Trott;
Hegarty on Advertising: Turning Intelligence into Magic by
John Hegarty; and Confessions of an Advertising Man by
David Ogilvy. Between those three, you'll get a pretty solid
foundation in the type of thinking and doing required to
produce great marketing, whatever the medium.

The creative SEO team


can then work to
analyse what forms of
content are resonating
functionality that allows for a basic, but accurate,
demo. Those prototypes can be used to show the
client the intended result. The process is starting to
acquire the name 'designing in browser', and in a world
where designs have to be responsive, it delivers a
vastly simpler, quicker workflow.
These mockups don't necessarily need to use
the correct data or content, but they do need to be
a realistic representation of the end product – and
obviously, if the actual content required is available,
that only makes the representation closer to the


final product. For example, if there's supposed to
be a chart, it can be rendered with Highcharts using
template data, or if there will be an HTML5 video
running in the background, a sample video will
suffice to make the demo workable.
The advantages of this are obvious: during the
pitching and refinement process, the client, design
and technical teams are all looking at and working
with something that works as the user will finally
see it. This vastly reduces the level of perceived
change on the part of the client, compared with the
old Photoshop-to-production method of building work.
After a basic prototype is realised and the client
is happy with the basic concept, this can be taken to
influential individuals in the target community to
garner their feedback – ideally around three to five
people at most (to ensure a consistent tone for the
piece, whatever it may be). This ensures that the
people involved in the initial promotion and
distribution feel a strong attachment to the work,
as well as having a sense of ownership in the
material itself.
With a working, finalised prototype of the creative
piece, the actual process of completing the design
and finishing the project now becomes vastly
simpler, as the constraints imposed by the visual
design are in evidence in the system itself. Thus all
the images, video, copy and other assets should be
simpler to commission and add to the work.
Also, during the final content production process,
the creative media teams involved can play with
the prototype to ensure the best fit for their work
in the piece it will end up in. This helps provide a
framework for ensuring producers know where they
have more creative freedom and scope flexibility,
and where they're more constrained, reducing the
time required for revisions and cutting-room time.

Bringing it All Together

Left Content pieces like the Local Food Guide
pieces from HouseTrip.com are fundamentally
a search content play, but provide genuinely
useful information


Now that the piece is built, it can seem that the job is
finished. However, there's a key component missing
from all this at the moment, which we've seen time
and time again: no dedicated area for marketing
the content. Looking at two recent examples: firstly,
Every Shot Imaginable (www.youtube.com/user/
Everyshotimaginable) was launched with a YouTube
channel, but without a dedicated area on the
European Tour website for that content. As a result,
the site doesn't rank well for the names of its videos,
or of the campaign. Nor was the campaign name
particularly picked up on by the target market. If it
had set up a dedicated section on the site, talking
about the campaign, it would have had a far more
compelling place to drive traffic to. This would also
have likely produced better social engagement,
as they would have been able to tailor the copy

Resources
If you're looking to get started with modern SEO best practice,
here are a few blogs and journals to get you started:

Blogs

If, on the other hand, you'd prefer a more lively environment to learn
from, and somewhere to network with people, try these events:

Conferences
MozCon moz.com/mozcon
SearchLove www.distilled.net/events
Future of Web Design futureofwebdesign.com/london-2014

specifically around the content, as well as being able
to be more innovative with the interface, testing and
improving the page over time, rather than relying on
the generic videos page on their site.
Secondly, the MMM3000 campaign (mattessons.
outsideline.co.uk) for Mattessons resulted in a Tumblr
account outranking the Mattessons site for the key
terms created specifically for the campaign. Given


the particularly tech-savvy nature of the group being
targeted for this (gamers under 18), you can't help
but believe that this was a massive oversight.
Furthermore, putting the content on a subdomain
reduces the ability of the content to pass weight to
the main domain. There are 69,000 results in Google
for the term 'MMM3000', all related to the campaign
in question, from thousands of sites, all of which
could have been engaged and brought in to link to
the main site, where all the videos and information
are hosted. Instead, they link to YouTube, Tumblr,
Facebook, Twitter and so on, creating less benefit
for the brand, and no search impact. Again, this
wouldn't be tricky to fix. Simply monitoring
mentions of key phrases and following up with
bloggers to ensure they linked to the right places
would have yielded great search benefit, as well as
creating connections with the most engaged
members of the audience. Having a diverse social
presence is fantastic, but it needs to be managed and
corralled to derive the most benefit.
Going back to the beginning, it's all about paying
attention to the details, and knowing those things
exist and matter.
By bearing in mind good site architecture, and
ensuring basic SEO essentials like title tags, server
headers, page copy optimisation and spiderable
pages, you can avoid 95 per cent of the most
common issues.

Above The Telegraph's UCAS Calculator currently
ranks first for 'UCAS calculator', as a result of
strong domain authority and link weight

Conclusion
In the SEO industry there exists an army of people
who are passionate about creating amazing
experiences for consumers, and who want to build
amazing content for their clients, pushing to create
great work.
We believe that SEO has a chance to really help not
just the agencies involved, but the consumers and
brands too, in enabling discovery and rediscovery of
the great content produced.
The search optimisation industry may not be
perfect but, at its best, it's helping to develop better
websites and create more engaging content for
clients of all sizes.
It's not quite the industry we want yet, but we can
see it from here.


Moz Blog moz.com/blog


Builtvisible Blog builtvisible.com/blog
Search Engine Land searchengineland.com


Get to the top of Google


Get to the
top of Google!
SEO is a shapeshifter: its current,
grown-up incarnation is audience-driven,
engine- and user-friendly.
Bryson Meunier has the details

Words Bryson Meunier
(@brysonmeunier) is
director of SEO strategy at
Resolution Media, and a
primary architect of the
agency's natural search
product and ClearTarget
Digital Behavior Analysis
www.resolutionmedia.com

Image Mike Brennan
is art editor of .net
www.twitter.com/mike_brennan01

Sure, you know about SEO. You
might not be an expert, per se, but
you have a good understanding of
the basics: title tags, clean URLs, text-based
design and so on.
Even if you're more of an expert than most,
just as often it's what you don't know about
SEO that will hurt you.
Many webmasters found this out the hard
way in February 2011, when Google's Panda
update was released. Many of these same
webmasters were hit again little more than
a year later, when the search giant's Penguin
update followed, targeting low-quality,
spammy link-building tactics. What they
thought was SEO worked for a little while, and
then turned out to be less than optimal. Sites
were penalised; traffic and revenue lost; and
so-called SEOs fired.
Since then there's been a new tone in the
SEO industry. Not all of us were creating
low-quality content and links; but for many of those
that were, these two updates were a wake-up
call. And for those of us who have always been
focused on high-quality, relevant content and
links, it was something like redemption. Our sites
soared while so many fell.
This is the new normal for SEO. Yes, there are
still some who call themselves SEOs but focus
on manipulative tactics with short-term revenue
goals; yet there are also many who are part of
a large and growing industry of specialists in a
highly complex discipline that requires marketing,
technical, research and communication skills.
So just how big is SEO? Believe it or not, it's
bigger in the minds of Google searchers than
web design. Once considered a subset of web
design, searches for 'SEO' now eclipse those of
'web design' worldwide (see http://netm.ag/
trends-238).
The total projected value of the North
American search marketing industry (SEO and
paid search) in 2013 is $26.8 billion, according
to industry trade organisation SEMPO (www.
econsultancy.com/us/reports/sempo-state-of-search),
and 13 per cent of companies have SEO
budgets of half a million to more than three
million dollars a year (up from 8 per cent in 2011).
As budgets increase, there's more to


Changing trends Google Trends data shows that the number of searches for 'web design' has declined over time, to the point where it is now eclipsed by searches for 'SEO'

SEO mission control Google Webmasters has many valuable reports on crawling and indexing of content, as well as who links to you and what queries your site appears for

Multimedia SEO case study


Background: a client came to us looking to
increase natural search traffic to its car
image-sharing site.

Challenge:
l Images were hidden behind JavaScript and
not indexed.
l Images were hosted on another domain (a
common CDN), making it impossible to create
image sitemaps.
l Title tags were branded, making it difficult for
the engines to understand what keywords the
site was relevant for.
l Most common search phrases were not used
in content.
l As a new site, there was a lack of authority.

Strategy:
l Rewrote the URLs so that they would appear
to the engines to be hosted on clients site to
get more content indexed.
l Made images more accessible by adding
noscript tags to the download page.
l Used sitemaps and image sitemaps to ensure
that the engines were aware of our content
and the structure of our site.

l Changed the title tags of the major pages so


they included relevant keywords, and changed
the copy in these pages when necessary to
include popular relevant keywords.
l Followed Googles image search best practices
(http://netm.ag/imagesearch-238), and added
structured markup to all images to ensure that
the engines had as much information about
what the images were relevant for as possible.
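A couple of the tactics above can be sketched in markup. In this hypothetical fragment (the file name and alt text are invented, not taken from the client's site), a noscript fallback gives crawlers a plain <img> to index even though the page loads images with JavaScript, and schema.org ImageObject microdata describes the image along the lines of Google's image search best practices:

```html
<!-- Hypothetical download page: the JavaScript viewer stays,
     but the noscript fallback keeps the image crawlable -->
<div id="viewer" data-src="/images/red-coupe-1024.jpg"></div>
<noscript>
  <img src="/images/red-coupe-1024.jpg" alt="Red coupe parked at sunset" />
</noscript>

<!-- schema.org ImageObject microdata describing the same image -->
<figure itemscope itemtype="http://schema.org/ImageObject">
  <img itemprop="contentUrl" src="/images/red-coupe-1024.jpg"
       alt="Red coupe parked at sunset" />
  <figcaption itemprop="caption">Red coupe parked at sunset</figcaption>
</figure>
```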

lose, and many companies have become more risk-averse, forgoing the shady tactics they may have pursued in the past.
In cutting out the garbage, we start to see what SEO is really good for (and has always been good for): connecting relevant content with relevant searchers, and making content discoverable through accessibility and marketing.
For those of you who still think of SEOs as greasy algorithm-chasers in cheap suits or parents' basements, consider the new reality.

Results

Engines are not enemies

- In the first four weeks after implementation, the number of images indexed went from 10,000 thumbnail images to 54,000 images. In six months, 403,000 images were indexed.
- Doubled relevant organic search traffic in four weeks, taking organic traffic from 11 per cent to 22 per cent of the total visits. Organic search was 41 per cent of visits in six months.
- Clickthrough rate from organic search increased 460 per cent in the first four weeks.
- Monthly organic visits grew 4,198.61 per cent, from 1,083 to 46,554, in six months.
- In a year, the site sat on Google page one for relevant high-volume terms such as 'pictures of cars', 'pics of cars' and 'car wallpapers'.

When I started doing SEO in-house for a Fortune


50 corporation 10 years ago, there were many in
the organisation who were a little nervous about
what we were doing. Nothing was against the
Google guidelines because there were no such
guidelines in existence. At that time there were a
few books, but SEO was largely something that
was spoken of covertly, and certainly never to
search engines, which, it was thought, would
likely think of it as manipulation.
Today we know better. Google and Bing have
both published extensive webmaster guidelines,
and Google has even published a guide to SEO
for beginners (http://netm.ag/seostarter-238).


On the rise SEO can increase your traffic significantly if you do it correctly. In the case study shown above, organic
traffic grew by more than 4,000 per cent in the first six months
Way forward Detailed, integrated SEO plans can be crucial




In-depth
Danny Sullivan

Big hitters Google offers more than 570 videos for webmasters, which, in four years, have had over 10 million views

In August 2011, Matt Cutts, Google's head of webspam, released a video statement (http://netm.ag/spam-238) saying that Google does not consider SEO by itself to be spam. This sentiment now appears in Google's definition of search engine optimisation (http://netm.ag/seodef-238), in which it says: "Many SEOs and other agencies and consultants provide useful services for website owners."
Still, because of a few spammers who call themselves SEOs, SEOs in general have the reputation of being charlatans, and have been portrayed as such on television shows such as The Good Wife and Dexter.
"SEO has unfortunately got a bad rap, and it's due mainly to questionable SEO practitioners who perpetuate the snake oil stereotype by making customers believe there's some magic black box that tricks the search engines," says Gord Hotchkiss, chief strategy officer for Montreal-based Mediative (www.mediative.com) and regular columnist for Search Insider. Hotchkiss, and the other experts I reach out to for this article, explain that SEO is simply about getting relevant content indexed, and making sure it's visible to the search engines.
All of the veteran SEOs that I speak to understand why SEO still has the reputation in some circles of being snake oil. But they insist that it has, at this point, become much more mainstream and credible.
Rand Fishkin, founder of Seattle-based SEO software company SEOmoz (www.seomoz.org), discusses with me a few of his favourite reasons for SEO being something other than snake oil, including that SEOmoz itself has more than 2 million monthly visits, nearly all from web marketers looking to learn more about the practice. "And our software, which bills monthly, has more than 18,000 subscribers as of today. If SEO were just snake oil, I strongly suspect folks would stop paying."
SEO, in its legitimate form, is now a more accepted part of the web design process, and in many organisations is finally getting a seat at the table when it comes to designing professional, search engine-friendly websites.

A process, not a project

In my decade-plus doing enterprise SEO, there have been many instances in which the SEO team is brought in after the website is already complete, and told to magically make it search engine friendly. This isn't ideal. As Google says in its guide to SEO: "If you're thinking about hiring an SEO, the earlier the better."
The really competitive sites that I've worked
with over the years understand this, and integrate
SEO into every stage of the planning process,
from information architecture to content strategy
to design, development, launch and post-launch.
"A lot of web designers and developers are hesitant about integrating SEO further into the process, because doing so effectively produces extra work. But the rewards can be great," reminds Vanessa Fox, founder and CEO of Nine by Blue and author of Marketing in the Age of Google. "Organisations are losing 1) tremendous insight into their customers and potential customers if they don't take advantage of the free search data that's available from the millions of searches we do each day; 2) the opportunity to reach a significantly larger audience through being visible in search results."
Would you put the Mona Lisa in a
closet? Would you spend hours cooking

How would you convince doubters that SEO is a legitimate marketing strategy?
SEO has been around for nearly two decades now and is recommended by Google, which even provides its own guide for it. The world's largest search engine isn't going to be pushing snake oil; it has every reason not to. The fact that it does is probably the best reason, beyond all the many out there, why SEO is important. Ignore it, and you're ignoring what the actual search engines you want to be listed in are telling you to do to improve your chances.

Many designers see SEO as extra work and may add it at the end of the web design process, if they've added it at all. Why should these people take SEO more seriously (if indeed they should)?
If you build it, they don't necessarily come. Creating a shiny, wonderful new website doesn't matter if you've built it in a way to make it invisible to search engines. Designers often test to ensure their sites work well in different browsers like IE, Firefox and Chrome. Well, I've long written about search engines as being the most important browser of all, because everyone uses them to find sites. But if you've not designed your site for the unique things the search engine browser wants, people won't locate them.

A lot of software has been introduced in the past couple of years that seeks to automate aspects of the SEO process. Will SEO ever be completely automated?
Software can't automatically tell the type of content your readers are interested in, or the type of words they'll use to seek that content, or create the quality content itself to serve them, and that's just the foundational part of SEO. Tools are nice. But we've had tools for years to build houses, yet we haven't completely automated house building. Never say never, but I think humans will long be involved in SEO.

What would you say are some promising trends in SEO today and why?
The use of more social data as a potential additional signal, beyond looking at links, to identify quality content is most promising to me. The greater use of structured data is also encouraging, as are the new possibilities opened up as people search on mobile devices and with mobile apps.


Company Search Engine Land
Role Founding editor
Web www.searchengineland.com

On target Vanessa Fox of Nine by Blue (www.ninebyblue.com) created this searcher persona in order to connect audience goals with relevant content from the business

beef wellington, and as it emerges perfect from the oven, throw it in the trash? Then why would you build a website without considering how it will be found?
There's another reason for making SEO a priority in the web design process, advises Hotchkiss: it forces you to create a better website! "Good SEO should be baked into your information architecture. It will force you to think about common content themes. It requires you to consider how all digital assets (such as videos and user-generated content) will be integrated into the overall user experience. It helps eliminate user experience dead ends such as gratuitous Flash interfaces and, my personal pet peeve, content locked in PDFs. It extends your perception of your online footprint beyond the bounds of your website, including things like social media. It will also instil a healthy rigour when it comes to thinking about how your site links together. Good SEO practice means a better user experience."
From my experience, more organisations than
ever are learning these lessons, and are no longer

thinking of SEO as a project, but as an ongoing
process that ensures a website will be as visible
in search as possible. This is good for web design
because it gives it a larger audience, but also
good for business.

It's not just about links: emerging SEO trends

While commentators have been claiming that SEO is dead ever since the discipline began around 1997, the truth is that it doesn't die; it evolves with the search engines. While SEO is constantly evolving, at the moment it seems focused on mobility, utility, the audience and automation, among other things.
One of these trends is the dissolving distinction between SEO, user experience and

Readymade people Resolution Media's ClearTarget Behavioral Analysis takes keyword research to another level by harnessing the power of big data and automation to create actionable searcher personas


content strategy. In one recent Webmaster Tools YouTube video (http://netm.ag/cutts-238), Matt Cutts even suggested that those looking to change the name might consider 'searcher experience optimisation' to differentiate from the 'snake oil' variety of SEO.
Some, such as Vanessa Fox, have suggested that SEO need not proceed as a separate activity from UX and content strategy: "I think that both disciplines should incorporate best practices from search rather than thinking of it as something tacked on later," she says. "Particularly, the data available from search is extremely valuable. Also, understanding that many visitors begin with a major search engine, and that any page of the site can therefore become the homepage

Get going Google's SEO Starter Guide defines tactics to help search engines and webmasters display relevant content

In-depth Implementing optimised Facebook sharing

Learning curve Blogs and Twitter help SEOs stay on-trend

of the site, can shift how we look at both page design and content."
At the same time, Fox and all of the other SEOs I asked recognise that content strategy and usability, while essential for reputable SEO, need technical and other elements from SEO to be useful as a way of getting incremental search engine traffic.
"When SEO is done the right way, usability and content is a huge part of the plan," opines Eric Enge, founder and CEO of Massachusetts-based Stone Temple Consulting and co-author of The Art of SEO. "This is something that the snake-oil SEO people don't worry about. For long-term success as a web publisher, the user must come first. However, for success as a business, you need to do more."
With this concentration on content strategy and usability comes a focus on the audience as well. For Hotchkiss, this is a shift from word-matching to utility, and follows the search engines' own evolution. "Today, good SEO is about making sure that when a prospect uses a word (or words) to search for something, you match that as best as possible," he says. "But in the future, SEO will be about ensuring that when your prospect wants something, you deliver it. It may not be content. It may be a movie ticket, a hotel booking, a restaurant reservation or a downloaded TV show."

- og:title: the title of your object as it should appear within the graph, for example "SEO Mentioned Again on The Good Wife".
- og:type: the type of your object, such as "article". Depending on the type you specify, other properties may also be required. In the case of an article, additional information can be included in the graph*.
- og:image: an image URL that should represent your object within the graph. These should be consistent with the article (for instance www.brysonmeunier.com/wp-content/uploads/2012/10/seo-from-the-good-wife.jpg).
- og:url: the canonical URL of your object that will be used as its permanent ID in the graph, such as www.brysonmeunier.com/seo-mentioned-again-on-the-good-wife.

*When these pieces of data are known or available publicly (note: never push information in an Open Graph tag that is not visible on the page in question; this could be flagged as a form of cloaking), include these pieces as part of the lines in the Facebook meta tags:
- article:published_time (datetime): when the article was first published.
- article:modified_time (datetime): when the article was last changed.
- article:expiration_time (datetime): when the article is going out of date.
- article:author (profile array): the writers of the article.

Live example
For the example, this is what should be in the <head> section of the URL (this format should be included for all pages on the site; the additions are the og: meta tags). The content title and description should carry through to the Open Graph tags:

<html prefix="og: http://ogp.me/ns#">
<head profile="http://www.w3.org/1999/xhtml/vocab">
<meta http-equiv="Content-Type" content="text/html; charset=utf-8" />
<link rel="shortcut icon" href="http://www.brysonmeunier.com/sites/all/themes/spirezen/favicon.ico" type="image/vnd.microsoft.icon" />
<meta content="SEO Mentioned Again on The Good Wife" about="/seo-mentioned-again-on-the-good-wife/" property="dc:title" />
<link rel="shortlink" href="/node/398" />
<link rel="canonical" href="/seo-mentioned-again-on-the-good-wife/" />
<title>SEO Mentioned Again on The Good Wife | BrysonMeunier.com</title>
<meta name="description" content="CBS Television show &quot;The Good Wife&quot; mentions SEO again in a fictional trial. Read more." />
<meta property="og:title" content="SEO Mentioned Again on The Good Wife" />
<meta property="og:description" content="CBS Television show &quot;The Good Wife&quot; mentions SEO again in a fictional trial. Read more." />
<meta property="og:type" content="article" />
<meta property="og:url" content="http://www.brysonmeunier.com/seo-mentioned-again-on-the-good-wife/" />
<meta property="og:image" content="http://www.brysonmeunier.com/wp-content/uploads/2012/10/seo-from-the-good-wife.jpg" />
<style type="text/css" media="all">@import url("http://www.brysonmeunier.com/modules/system/system.base.css?mbovli");</style>

A change in analysis
Delivering on this promise frequently requires a
new type of analysis. In the past, marketers have
done keyword research to uncover keywords as
proxies for user intent. In Marketing in the Age
of Google, Vanessa Fox describes the process
of creating searcher personas that get beyond
simple keyword matching and search volume
exercises. And still others, such as iCrossing's Core Audience (www.coreaudience.com) and Resolution Media's ClearTarget (http://netm.ag/cleartarget-238), try to understand characteristics
of audiences, including but not limited to the
keywords that they use.
For some businesses, mobility will not change
user intent. For example, news is not going
to be rewritten for a separate platform, as

Up next Open Graph tags and other structured data help power search innovations like last year's Knowledge Graph


Facebook's Open Graph (the overarching name of the algorithm that handles third-party content on Facebook) relies on a series of meta tags, which need to be present in the <head> section of every web page to guide how content should be displayed when shared. There are specific parameters that these meta tags need to adhere to in order to be activated in search results. There are essentially four lines of code that need to be present wherever there's sharing functionality:

To the point Google's Matt Cutts is active in the webmaster community, answering key questions in videos such as 'How important is it to have keywords in a domain name?'

Karen McGrane and other adaptive content advocates frequently point out. However, for some businesses it does; and if marketers want to get the most traffic and conversions from the mobile platform (in other words, be optimised for it), the devil is in the detail, and understanding potential differences between mobile and desktop audiences is key.
There are technical considerations as well, which make a type of SEO geared towards these differences (what is commonly called mobile SEO) what it is. Many people in the SEO world, as with their counterparts in the design world, believe that responsive design is the answer to these differences, and Google stated a preference for responsive design in June. However, as I said in issue 232 of .net, responsive design is not always best for the user, and Google wouldn't prefer it in those cases (http://netm.ag/googlerwd-238). If mobile and desktop search behaviour is significantly different, Google supports using dynamic serving or switchboard tags as well.
Primary source Matt Cutts is one of the most popular personalities in SEO. As Google's head of webspam, his blog has been required reading for SEOs since 2005

Cindy Krum, CEO of Denver-based mobile marketing agency Mobile Moxie and author of Mobile Marketing: Finding Your Customers Wherever They Are, agrees that responsive design is one of many solutions for mobile SEO. "I have been recommending a mixed solution for most of my clients," she says, "leveraging responsive design when it makes sense, and special mobile-only landing pages when keywords or use-cases cannot be appropriately addressed with a responsive design approach."

Tools of the trade


Another big shift in SEO has been the introduction
of more automation to the process. SEO software
has been around as long as SEO has (remember
WebPosition Gold?), but the breadth of tools and
level of sophistication has increased considerably
in the last year or two.
Now there are tools to help you optimise the long tail through semantic relevance (BloomReach (www.bloomreach.com)), reporting tools (Conductor (www.conductor.com), BrightEdge (www.brightedge.com), SEOmoz), link building tools (Ontolo (www.ontolo.com), Ahrefs (www.ahrefs.com), Open Site Explorer (www.opensiteexplorer.org), Majestic SEO (www.majesticseo.com)) and more, all geared toward automating aspects of the SEO process (see our top 20 at http://netm.ag/seotools-238).
When this happens, inevitably someone in the press will claim that the tool will allow you to fire your redundant SEO; but none of the SEO experts or software providers I talk to agree. "Great SEO starts with human beings who are creative, tenacious, and empathetic to the needs of searchers," says Fishkin. "No software can ever automate those processes."
There are other trends in SEO that are important, among them the integration of social signals into ranking algorithms, and SEO that is not just text-based but about understanding and optimising images and videos, perhaps eventually for wearable computing purposes (Google Glass, for example). All in all, legitimate SEO has evolved with the search engines, and it continues to do so. As I've said, understanding and applying this new information requires a new type of SEO practitioner, and a different kind of user-focused SEO. The next time someone tells you something different, you now have the knowledge to set them straight.

Resources
Link building means earning
hard links, not easy links
Danny Sullivan explains the rationale behind his
rant from SMX Advanced 2012, going into detail
about the concept of link earning, as opposed
to link building for the sake of acquiring links en
masse (http://netm.ag/sullivan-236).

Google Webmasters YouTube channel
Since 2009, Matt Cutts, Maile Ohye and other
members of the Google Webmaster team
have been posting videos aimed at helping
webmasters increase the visibility of their
sites. Among the 500-plus videos uploaded


are mythbusting efforts ('Google does not use the keywords meta tag in web ranking'), advice ('5 common mistakes in SEO (and 6 good ideas!)') and general tutorials ('How does Google search work?'), but all of them represent essential guidance from Google in doing legitimate search engine optimisation (http://netm.ag/webmasters-238).

Matt Cutts' blog

Matt Cutts is the head of webspam for Google, and a key liaison between Google and webmasters, including SEOs. This is his official blog, which has been an essential resource for SEOs since July 2005 (www.mattcutts.com/blog).

Bing Webmaster Help

Duane Forrester is Bing's equivalent of Matt Cutts, and the information he provides to webmasters is equally impressive. This is Bing's webmaster help center, which includes its own webmaster guidelines (www.bing.com/webmaster/help).

Search blogs I follow


Feed of posts from the 150 search blogs I read
regularly to stay up to date on emerging trends
in SEO. Contains posts from SEOMoz, Nine by
Blue, Search Engine Land, Mediative, Stone
Temple Consulting Blog, Mobile Moxie and
more (http://netm.ag/searchblogs-238).



Reduce your bounce rate

How do you keep visitors on your site longer once they've clicked through from a search result? David Deutsch gives the lowdown

Words David Deutsch is Director of SEO and SEM at SEO Brand. His previous roles include Director of Online Marketing at New Epic Media.
www.seobrand.com

Image Mike Chipperfield is co-founder and creative director of Brighton-based collective Magictorch.

We all want to get our site to the top of Google. But that's only half the story: what happens when people click through to your site?
Do they hang around a while and check out what you have to offer, or quickly move on to the next result? Obviously we want the former to happen, so how can we make sure it does?
When visitors find nothing of interest on your site at first glance and leave immediately, this is known as a bounce.
A high bounce rate from good-quality traffic sources is an indicator that the website isn't performing up to its visitors' expectations.
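It's worth knowing that you can influence what Analytics counts as a bounce. A common tweak (a sketch, assuming the analytics.js ga() tracker from Google's standard snippet is already on the page, and an arbitrary 15-second engagement threshold) is to fire an event once a visitor has stayed a while; events count as interactions by default, so those visitors no longer register as bounces:

```html
<script>
  // After 15 seconds, record an engagement event. Because events count
  // as interactions, visitors who reach this point stop being bounces.
  setTimeout(function () {
    ga('send', 'event', 'engagement', 'time-on-page', '15 seconds');
  }, 15000);
</script>
```

With the older ga.js snippet, the equivalent call would be _gaq.push(['_trackEvent', ...]) instead.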

Sources of traffic

Reducing your website's bounce rate is a great first step in improving its overall performance and conversion rate.
First you need to analyse the bounce rates of the different traffic sources. Focus your efforts on improving the bounce rates of the highest converting traffic sources, such as:
- search engine traffic
- email marketing campaigns
- affiliate campaigns
Ignore the bounce rate from unqualified sources such as:
- unknown referrer sites
- social networking sites
- random directory sites

Landing page design

Considering how you design your landing pages can really make a difference to your bounce rate. Take the example of Real Gap (www.realgap.com), which provides gap year and travelling ideas in over 35 countries for those youngsters wishing to take a year off before, during or after university.
The website uses specially designed landing pages for its pay-per-click (PPC) accounts. As you can see below, the PPC page (left) has a stronger call to action, whereas the SEO page (right) just contains general gap year information. Not surprisingly, the bounce rate for the PPC page is much lower than for the organic page: 18.24 per cent for the former compared

Be specific Real Gap's SEO landing page is more generic, so the bounce rate is higher

Getting attention A strong call to action helps to lower your site's bounce rate, as on Real Gap's PPC landing page



Case study Jamster.com


Bare bones Testing showed this static page had only middling success at lowering Jamster's bounce rate

Attention grabbing The interactivity of this Flash page kept users on the page and lowered the bounce rate

with 25.44 per cent for the latter. It's a clear demonstration of how the design of the page people land on influences your bounce rate.
So what's the secret to designing a page with a low bounce rate? Here are some helpful tips:
Headlines: Make sure your headline refers directly to the place where your visitor came from, or the ad copy that drove the click.
Calls to action: It's vital to provide a clear call to action. Here's a good example: "Only 99 per night at the Hilton Hotel in Paris. Book now." Here's a bad example: "Cheap hotels in Paris, the most romantic city in the world. Choose from 300 hotels in Paris, France."
Be clear and concise: You need to write clear, specific and targeted content that's geared specifically towards your visitors. Don't write content that is vague or general. Your visitors are here to read about a specific subject, so they won't thank you for wasting their time.
Hierarchy: Place the most important information at the beginning of paragraphs and bullets.
Keep it simple: Remove all extraneous matter from your landing page. Ask for only enough information to complete the desired action.

Testing
How do you know what's working and what isn't in terms of keeping visitors on your site? Answer: you don't. So assume nothing, test everything.
We're privileged to work in an industry where we can test every idea we have without spending any money. Thanks to Google's Content Experiments (in Analytics), we can create, test and monitor the performance of landing pages. The purpose of testing your landing pages is to reduce bounce rates and increase conversion rates.
Setting up an A/B split test in Google Content Experiments is very easy, and free. Here's a basic overview of the steps involved.
- Create a campaign to test landing pages against each other.
- Upload the URL of the original page that you're testing.
- Upload the URLs of the other landing pages you want to test against the original.
- Copy the JavaScript code provided by Google and paste it into the relevant pages you're trying to test. (This will require access to the HTML code of the pages.)
- Send at least 500 visitors to the primary URL, and Google will separate those visitors randomly for you to test the performance of the pages in a non-biased way. You can do this by using AdWords, email campaigns or even affiliates to send traffic from specific keywords through the funnel.
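As an illustration of the JavaScript step, Google's Content Experiments API (cx/api.js) can pick a variation for each visitor client-side. Treat the snippet below as a hedged sketch: the experiment ID and page URLs are placeholders, and the exact code Google generates for your experiment may differ:

```html
<!-- On the original (primary) page, before other scripts -->
<script src="//www.google-analytics.com/cx/api.js?experiment=YOUR_EXPERIMENT_ID"></script>
<script>
  // Ask Content Experiments which variation this visitor should see.
  // 0 is the original page; 1, 2, ... are the alternative landing pages.
  var variation = cxApi.chooseVariation();
  var pages = ['/landing-original.html',
               '/landing-test-a.html',
               '/landing-test-b.html'];
  if (variation > 0) {
    // Send the visitor to their assigned variation page
    location.replace(pages[variation]);
  }
</script>
```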

Golden rules

Stats amazing The bounce rates for Real Gap's PPC pages are much lower than for the organic pages


Finally, here are some golden rules to remember when designing and testing landing pages:
- No idea is a bad idea until you've seen it in action. If you get an idea for a landing page, then try it! Try them all.
- Make a different landing page for every keyword that gets at least 500 clicks per

Jamster (www.jamster.com) is aimed primarily at a young audience and provides them with ringtones, dial tones, mobile games, screensavers and music for mobile phones. Left, far left and below you can see three different landing page designs: an interactive page powered by Flash (left), a rock-themed page (far left) and a generic landing page (below). Using PPC as the only source of traffic, we were able to determine that Flash interactivity was the key to a very low bounce rate and high conversion rate.
The Flash landing page used a great call to action and Flash interactivity. Providing a fun and exciting Flash widget alongside a strong call to action delivered a very high return on investment.
The rock-themed page not surprisingly appealed to rock fans, but was a static page without Flash interactivity. The generic page was designed with no specific genres of music and used light, friendly colours.
The results (from PPC sources) indicate that neither of these was the right approach for Jamster consumers. The Flash page had a 17.4 per cent bounce rate, compared to a bounce rate of 22.1 per cent for the rock page and one of 49.8 per cent for the generic landing page.

Dullsville This generic static page did little to keep users interested and on the site

month. This ensures your highest converting keywords get all the attention they deserve.
- Send at least 500 visitors from the same traffic source/keyword to each landing page, to accurately gauge its performance.
- Once you've tested several landing pages, try improving them by testing different versions of the best performing pages.
- The best landing pages can then be tested to varying degrees using multivariate testing. This is where you test specific elements of the page rather than entirely new pages.
Creating landing pages that encourage visitors to stay longer can sometimes be time-consuming and expensive, but it's absolutely essential to high converting, high performing websites. In the words of Samuel Beckett: "Ever tried. Ever failed. No matter. Try again. Fail again. Fail better."



Google's guide to Analytics

With the emergence of new web technologies, make sure you're making best use of Google Analytics to measure your business. Google's Justin Cutroni presents his pro's guide

Words Justin Cutroni is a blogger, author and the analytics advocate at Google. He's responsible for user and community education and global enthusiasm surrounding digital analytics. He publishes the blog Analytics Talk (http://cutroni.com/blog) and has authored or co-authored three books about Google Analytics.

Image Mike Brennan is art editor of net
www.twitter.com/mike_brennan01

The world of digital analytics is


changing fast. With the onset of
mobile, and the rapid growth of
connected devices, traditional web analytics is
no longer adequate.
To meet the growing needs of businesses,
Google Analytics (http://www.google.co.uk/
analytics) has changed from a web analytics
solution to a business measurement platform.
From websites to mobile apps, as well as for
almost any other internet-connected device,
you can use Google Analytics to measure your
entire business.

Where data comes from


Let's start at the beginning with how Google
Analytics collects and organises data. All data is
sent to Google Analytics via a data hit. Included
in this request is information about the visitor
and their behaviour.
Once the hit is collected, Google Analytics
processes the hits about every three hours and
turns them into dimensions and metrics, which
are the building blocks of reports.
Dimensions represent information about

your users and the sessions they create. Some


common dimensions include Page Title (the title of an HTML page), the visitor's geographic location (Country, Region, City, etc), or the type of device the visitor is using (tablet, smartphone, etc).
The second type of information is metrics.
Metrics are the numerical data collected by Google
Analytics. Metrics can be simple integers, like the
number of visitors to a website, or calculated
values, like the average amount of time a visitor
spends on a site.
You can think about Google Analytics metrics as
a hierarchy. At the top of the hierarchy you have
visitors, a count of the number of people that
interact with the site. Visitors create visits, which is
a defined period of interaction. Within that visit
are PageViews. Finally, at the bottom of the
hierarchy, we have something called Actions, which
are detailed interactions within pages.
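As a toy illustration of this hierarchy (the hit objects and their fields below are invented for the sketch, not Google Analytics' real storage format), rolling a batch of hits up into visitors, visits and PageViews might look like this:

```javascript
// Illustrative only: a toy aggregation showing how raw hits roll up
// into the visitor > visit > PageView hierarchy described above.
// The hit fields (visitorId, visitId, type) are invented for this sketch.
var hits = [
  { visitorId: 'A', visitId: 'A-1', type: 'pageview' },
  { visitorId: 'A', visitId: 'A-1', type: 'pageview' },
  { visitorId: 'A', visitId: 'A-2', type: 'pageview' },
  { visitorId: 'B', visitId: 'B-1', type: 'pageview' }
];

function summarise(hits) {
  var visitors = {}, visits = {}, pageViews = 0;
  hits.forEach(function (hit) {
    visitors[hit.visitorId] = true;  // unique people
    visits[hit.visitId] = true;      // unique sessions
    if (hit.type === 'pageview') pageViews++;
  });
  return {
    visitors: Object.keys(visitors).length,
    visits: Object.keys(visits).length,
    pageViews: pageViews
  };
}

console.log(summarise(hits)); // { visitors: 2, visits: 3, pageViews: 4 }
```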
There's other data as well. Some of the most important data is conversion data.



Conversions represent important business
actions, like completing a transaction or
submitting a lead form. You measure these actions
with a feature called Goals. If you're an ecommerce company, you can also measure ecommerce data including total number of transactions, products purchased and total revenue, etc.
Once the data has been collected and processed, it's turned into reports. Almost all of the reports in Google Analytics are built using Dimensions and Metrics. In most cases, all the values for a single dimension are shown in a table
along with metrics for each value. For example, the
Countries report shows all the values for the
Countries dimension along with a number of
different metrics.
This is a fairly simplistic overview of how the system works, but it should give you a good foundation for some of the advanced topics that we'll discuss later. Let's move into more detail and talk about measuring a website.

Case study LaTienda


Basic website implementation


To measure a website with Google Analytics, you
must add a snippet of JavaScript to every page on
your site. This JavaScript collects basic information,
like the URL of the page the visitor is looking at or
where the visitor came from (another website,
search results, etc) and then automatically sends
the data back to Googles collection servers via a
data hit.
Here's what the basic tracking code looks like:

LaTienda (www.tienda.com) is an award-winning, family-owned business supporting


artisanal firms in Spain. The firm works with
small family-run businesses, many of which
are dedicated to centuries-old food-making
traditions. With warehouses in Williamsburg,
Virginia and Alicante, Spain, the company ships
hundreds of thousands of orders throughout the
United States, Canada and Europe.
LaTienda's brand equity is built on its
fundamental commitment to the customer
experience. The company guarantees a positive
experience for its customers by promising to
replace or refund any products delivered in
anything less than an excellent condition.
Overall, it had been seeing great success with
its online orders. But the company wanted to
continue looking for opportunities to grow sales.
To assist with this, the company worked with
WebStrategies (www.webstrategiesinc.com),
located nearby in Virginia.
In particular, a key product category required
more expensive shipping methods if the
shipping address was too far from LaTienda's
Virginia warehouse. The challenge was to
understand the impact on sales of varying
shipping rates for this subset of products.
LaTienda grouped visitors into two regions:
Region A visitors were close enough to the
warehouse to always get reasonable shipping
costs. Region B visitors were everywhere else,


and had to use a more expensive shipping


method for the key product category.
WebStrategies wanted to measure the impact
on sales whenever one of the key products was
placed in the cart. To measure this, it installed Event Tracking on the Add To Cart buttons on
every product page.
Advanced Segments and Custom Reports were used to separate visitors in Region A from Region B, and to drill down to view performance
by product category. Sure enough, visitors from
Region B were found to be 48 per cent less likely
to make a purchase if they placed an item from
the key product category in their cart, which
raised total shipping costs.

<!-- Google Analytics -->
<script>
(function(i,s,o,g,r,a,m){i['GoogleAnalyticsObject']=r;i[r]=i[r]||function(){
(i[r].q=i[r].q||[]).push(arguments)},i[r].l=1*new Date();a=s.createElement(o),
m=s.getElementsByTagName(o)[0];a.async=1;a.src=g;m.parentNode.insertBefore(a,m)
})(window,document,'script','//www.google-analytics.com/analytics.js','ga');
ga('create', 'UA-XXXX-Y');
ga('send', 'pageview');
</script>
<!-- End Google Analytics -->
This code does a few things. First, it
asynchronously loads a library named analytics.js.
This library contains all the information that
Google Analytics needs to operate. Then it sends

A recipe for success


To combat this, LaTienda.com implemented a
less expensive, flat rate shipping model in
Region B and monitored sales. After the test, the
rate at which Region B visitors completed the
shopping cart was found to have increased by
nearly 70 per cent.
Just to be sure, the company checked to see if
there was a similar increase in conversion rate
for Region A visitors, and found that it didn't
fluctuate more than 3.4 per cent over the same
time period. The analysis confirmed that product
shipping rates greatly impacted shopping cart
behaviour, and used data to measure the results
of a key business decision.

Hierarchy Google Analytics has a defined data hierarchy of visitors, visits, PageViews and actions


Views Use the Dimension selection links to view your event categories, actions and labels

an invisible image request back to Google's analytics servers. This type of hit is called a PageView. When the hit is received, the system will calculate other metrics, like visits and visitors.
Also notice the UA number in the previous code snippet. This is your account number. This is how Google Analytics connects your account to the property that's collecting the data.
This JavaScript should be placed in the <HEAD>
tag on every page of your site. If you use some
type of CMS, you can add the JavaScript directly to
your template.
You'll get a lot of data from the standard

Comparing price Rasmussen College was able to see price interest by school of study

behaviour. That's where event tracking comes in.
Event tracking is a flexible data collection tool that can be used to measure button clicks, video players or any other user interaction with content. Events count anything you want. The neat thing about event tracking is that you define the data and when it's collected.
An event has four components:
Category: The category is a way to group certain
events together, like all of the events that happen
in a video player.
Action: The action is the actual action that a user

Google Analytics page tag. But at some point you
will discover that you need a more detailed
measurement as well.

Measuring visitor interactions


Today's modern website is complex. Ajax,
responsive design and HTML5 are quickly
abstracting the traditional analytics model. Users
can do many different things on a single page,
almost rendering the PageView metric useless.
From an analytics perspective, we need more
in-page measurement to better understand user

takes. It could be clicking a button, clicking on an HTML link, or changing a setting. Almost anything.
Label: The label provides more information about the action. For example, if your action is a Video Play Button, then the label may be the name of the video. Or, if you're measuring clicks on links, the label could simply be the URL of the link that the user clicked.
Value: A numeric value for a particular event. This could be the number of seconds, or a score in a game. The only requirement is that it's an integer.
The value actually applies to the combination of

the Category, Action and Label. Google Analytics


will sum the value in the reports.
So how do you generate event data? Unlike
PageViews, which are automatically generated by
the Google Analytics JavaScript, the event
JavaScript must be placed inside your code. You
need to attach it to the appropriate code that
represents a user action.
For example, if you want to measure a click on a
button, you could create a category called
UserActions, an action called ButtonClick and a
label that is the name of the button. Add the following code to the onClick event handler:
<button id="button" onClick="ga('send', 'event', 'UserActions', 'ButtonClick', '[BUTTON NAME]');">Please click</button>
When the above code executes, a data hit containing all of the event information is sent back to the Google Analytics collection servers.
While the previous code will work, it's always a
good idea to abstract your event tracking code,
and any analytics code for that matter, from the
presentation layer.
Here's a very common example of abstracting the event code while also making it cross-browser compatible. This code will measure PDF downloads when a user clicks on an HTML button:
var linkClick = document.getElementById('button');

Case study Rasmussen College


Rasmussen College (www.rasmussen.edu)
wanted to know the importance of tuition costs
to prospective students, and if that information
was a factor in their higher education selection
process. Although Rasmussen College had
created a tuition cost estimator for prospective
students to use on the site, there was no way to
measure engagement. This lack of knowledge
regarding a prospective student's use of the
tuition estimator tool limited the ability of
Rasmussen College to accurately position the
value of an investment in a college education.
Rasmussen wanted to answer questions like:

l What schools of study had the most price-interested students?


l Which regions of the US contained the most
price-interested students?
l What is the relationship between tuition
estimation and lead generation?
l Are students able to find the information they
want regarding tuition?
Rasmussen College integrated Google
Analytics events and custom variables with the
tuition estimator to capture geographic and
programmatic information as prospective
students interacted with the widget.

Rasmussen College exported custom event


and variable information from Google Analytics
and analysed the data. Price interest metrics
were developed across program and geography
using the Google Analytics data. For example,
students interested in some schools of study
were more than twice as interested in price as
students interested in other schools.
In addition, tuition estimator users were four
times more likely to complete the website
inquiry form. Those who went on to use a separate ROI calculator were 7.3 times more
likely to complete the website inquiry form.


Dimensions The example above shows the Countries dimension and standard metrics for
each value of the dimension


Events In Content > Events reports, view event data based on Category, Action or Label

addListener(linkClick, 'click', function() {
  ga('send', 'event', 'UserActions', 'ButtonClick', '[BUTTON NAME]');
});

function addListener(element, type, callback) {
  if (element.addEventListener) element.addEventListener(type, callback);
  else if (element.attachEvent) element.attachEvent('on' + type, callback);
}
Now let's take a look at the data. Event data is found in the Content Reports, Events section, Top Events report. It's organised based on the Category, Action and Label values that you create in your code. Notice that, by default, you're viewing the data by category. You can switch to view the data
Dimension links at the top of the table. You can
also select an individual Category and view all the
actions and labels associated with that category by
clicking on a category name.
So how can you use events? Here are some of
the most common reasons events are used in
order to measure:

Custom dimensions Create custom dimensions in the Google Analytics Admin section

l PDF and other types of file downloads


l Clicks on links that point to other domains
l Audio players, calculators, video players
l Product configuration tools. For example, an
automobile configurator
There are opportunities to use events
everywhere on a website. But, before you
undertake a massive implementation, make sure
the data is useful and actionable.

Adding custom dimensions and metrics

Another way to customise the data in Google
Analytics is through Custom Dimensions. If you've been using Google Analytics for a while, you may already be familiar with the Custom Variables feature. Custom Dimensions are the new version of Custom Variables.
As the name implies, Custom Dimensions are
dimensions of data that you manually create.
To create a Custom Dimension, you start by
defining the dimension in the Google Analytics
Admin section. There you name your Dimension
and specify the Scope.
The Scope defines what data hits a Custom

Dimension should be applied to. There are three


types of scope:
Hit: Just apply the Custom Dimension to a single
hit of data
Session: Apply the Custom Dimension to all the
hits contained within a session. This would include
all PageViews, events and transactions
User: Apply the Custom Dimension to all of the hits
from all of the sessions for an individual user. This
is useful when you want to identify a trait of your
users that will not change over time
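As a toy illustration of scope (the data shapes below are invented purely for this sketch): at processing time, a session-scoped Custom Dimension is effectively applied to every hit in the session, which you can picture like this:

```javascript
// Invented data shapes, purely to illustrate Scope: processing copies a
// session-scoped Custom Dimension onto every hit in that session,
// while a hit-scoped one would stay on a single hit only.
function applySessionDimension(hits, name, value) {
  return hits.map(function (hit) {
    var copy = {};
    for (var key in hit) copy[key] = hit[key]; // shallow copy of the hit
    copy[name] = value;                        // attach the dimension
    return copy;
  });
}

var session = [
  { type: 'pageview', page: '/home' },
  { type: 'event', category: 'UserActions' }
];

var tagged = applySessionDimension(session, 'dimension3', 'member');
console.log(tagged[0].dimension3, tagged[1].dimension3); // member member
```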
Once you create the Dimension in Google
Analytics, the system will display the code you
need to add to your website.
Here's the trick: Custom Dimensions can only be sent with an existing Google Analytics hit. So far in this article we've covered the most common types of hits: a PageView hit and an event hit. So, if you want to set a Custom Dimension, you must send that data with an existing hit.
For example:
ga('send', 'pageview', {'dimension15': 'Event Value'});
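The same pattern works for event hits. The sketch below can run stand-alone because the stub mimics analytics.js's command queue (on a real page, the tracking snippet defines ga() in much the same way); the dimension index and values are placeholders:

```javascript
// Minimal stand-in for the analytics.js command queue so the call can
// run outside a browser; on a real page the GA snippet provides ga().
// The dimension index (15) and the values are placeholders.
var ga = function () {
  (ga.q = ga.q || []).push(Array.prototype.slice.call(arguments));
};

// Send a Custom Dimension along with an event hit instead of a pageview:
ga('send', 'event', 'UserActions', 'ButtonClick', { dimension15: 'member' });

console.log(ga.q[0]);
// ['send', 'event', 'UserActions', 'ButtonClick', { dimension15: 'member' }]
```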

The future of measurement


Many critical business activities take place in
other locations, like kiosks and point-of-sale
systems. Data from these systems is often stored
in a different database, completely separate from
app or web data. This can lead to a partial view
of customer behaviour. In reality, what a business
needs is the ability to understand the customer
no matter where they interact with the business.
To help solve this problem, Google Analytics is introducing a number of features that collectively are known as Universal Analytics (http://netm.ag/ua-244). The goal of Universal Analytics is to make Google Analytics more customer-centric by measuring all customer interactions, no matter where they happen. Let's look at some of the most important features.
The first feature, called the Measurement
Protocol, is a method to send data to Google
Analytics from any network-enabled device.
Google has clearly defined the format of the data
so you can integrate the collection directly into


your specific systems. All you need to do is send an HTTP POST request to the Google Analytics collection servers. Like the other data collection that we've discussed (web and mobile), the actual data appears as query string parameters in the POST request.
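As a rough sketch of what one of these hits can contain (the parameter names v, tid, cid, t and dp come from the Measurement Protocol documentation, while the property ID, client ID and path are placeholders; the code only builds the payload rather than POSTing it):

```javascript
// Build a Measurement Protocol payload for a pageview hit.
// v = protocol version, tid = property ID, cid = anonymous client ID,
// t = hit type, dp = document path. The IDs here are placeholders.
function buildHit(params) {
  return Object.keys(params).map(function (key) {
    return encodeURIComponent(key) + '=' + encodeURIComponent(params[key]);
  }).join('&');
}

var payload = buildHit({
  v: '1',
  tid: 'UA-XXXX-Y',
  cid: '555',          // anonymous client identifier
  t: 'pageview',
  dp: '/kiosk/welcome' // the "page" this device is reporting
});

console.log(payload);
// v=1&tid=UA-XXXX-Y&cid=555&t=pageview&dp=%2Fkiosk%2Fwelcome
// POST this body to https://www.google-analytics.com/collect
```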

Cross-device measurement
The next feature in Universal Analytics is the
ability to measure how your customers connect
with your business across devices. The feature
will be able to measure a user as they navigate
across all of these devices. In order to do this, you
as a business must be able to provide some type
of primary key.
For example, if you're an airline and have a frequent flyer ID for each of your users, you can
specify that Google Analytics use this as the
unique identifier on each platform. (This feature
will be released soon.)
Finally, in addition to collecting data from any

network-enabled device using the measurement


protocol, you will be able to import data directly
into Google Analytics. This is particularly useful as
there are other business-related data sources that
help provide more context to the data. For
example, a ski resort may want to import daily
snowfall totals to view that information along
with other data to provide context for its overall
business performance.
As you've probably noticed, the focus of Universal Analytics is primarily in the infrastructure and collection of data. It's important to note that all of the data is just like other data in Google Analytics. You can use it and manipulate it just like any other data.
Once it's in Google Analytics, you can use all
of the standard analysis tools, like segmentation,
custom reports, custom dashboards, real-time
reports and attribution models. The value is that
all of these tools take on more significance when
they can analyse data across business interactions.


Google Analytics resources

Custom dimensions Once you create the Custom Dimension, you must then add the necessary code to collect the data

For instance, a few common ways to use
Custom Dimensions include:
l Separating members from non-members
l Separating logged in users from non-logged
in users
l Measuring repeat customers
In addition to creating Custom Dimensions, you
can also create your own Custom Metrics. See our support documentation for more details (https://support.google.com/analytics/answer/2709829?hl=en-GB).

Measuring mobile applications


Now let's step away from measuring websites and talk about measuring mobile applications. Many people don't know that Google Analytics can measure Android and iOS apps. From a conceptual level, everything we've discussed so far (events, custom dimensions and metrics, etc) can be applied to mobile applications as well.
In general, the mobile application data model is the same as the web data model. But there are a few slight differences. For example, there aren't any visitors or PageViews in the mobile application world. Instead, there are users and screens, but the hierarchical structure of the data is still the same.
This data is collected in a similar manner to website data. It's sent to Google Analytics via a request to the Google Analytics collection servers. The primary difference between a mobile data hit and a web data hit is that the mobile hit isn't created using JavaScript code. Instead, Google Analytics provides an SDK for both Android and iOS to generate the data hits.
From an implementation perspective, it's a bit different to implement app tracking. Unlike a website, where you just add some code to the site and you instantly have data, you need to integrate the app tracking code within the app. The exact integration depends on the platform.
One measurement concept specific to mobile
apps is dispatching hits. Unlike the JavaScript
tracking code, which sends hits immediately to
Google Analytics, mobile apps queue the hits prior
to sending them. This feature is designed to
mitigate the challenges of unreliable network
access and limited battery life.
By default, Google Analytics will send the
queued hits every two minutes. But you can
control the periodic dispatch using the following
code (iOS):
[[GAI sharedInstance] setDispatchPeriod:60];

Data model Similar to the web analytics data model, this is the mobile analytics data model

In Android, the default dispatch is 30 minutes. But, like iOS, you can customise the interval by using the following code:

l Goals: Goals (http://netm.ag/goals-244), or conversion tracking, is an absolutely critical feature. Make sure you define goals for your business.
l Implementing ecommerce: If you sell a product online then you should be using the ecommerce module (http://netm.ag/ecomm-244) to measure your transactions.
l Tracking campaigns: Like goals, campaign tracking (http://netm.ag/track-244) is a critical feature.
l Mobile app tracking: Everything you ever wanted to know about the Android SDK (http://netm.ag/android-244) and the iOS SDK (http://netm.ag/ios-244).
l Universal Analytics: Learn more about the next generation of Google Analytics, Universal Analytics (http://netm.ag/uni244).
l Measurement Protocol: The foundation of Universal Analytics, the Measurement Protocol (http://netm.ag/proto-244), lets you collect data from any network-connected device.

GAServiceManager.getInstance().setDispatchPeriod(60);
In addition to standard data, you can also measure information that's specific to the app world, like app crashes and exceptions. This information is particularly useful as you look to improve app performance and the user experience. It only takes a single line of code to collect this information. In Android, the code would look something like this:
<bool name="ga_reportUncaughtExceptions">true</bool>
And, in iOS, expect the code to look something
like this:
[GAI sharedInstance].trackUncaughtExceptions =
YES;

Wrapping up
Hopefully, we've been able to broaden your perspective of what's possible with Google Analytics. It's important to remember that, in order to be actionable, data needs to relate directly to business strategies and tactics. Using the full breadth of features that are available in Google Analytics, you'll be able to better align your data with your business. l


There are many, many Google Analytics help resources. The Developer Site (https://developers.google.com/analytics) is a must-visit resource for anyone that's implementing Google Analytics. The Help Centre (https://support.google.com/analytics/?hl=en) is also very good if you're looking to understand features from a business perspective. Here are some specific resources that you may want to check out:

Features

Optimise your site for mobile

From keepalives to HTTP compression, follow Joshua Bixby's 12 mobile techniques to make your mobile site 70 per cent faster
As mobile use is poised to overtake
desktop use within the next year,
there's a serious disconnect between users' performance expectations and
performance reality. Three out of four mobile
users say they're willing to wait five seconds
or less for a page to load (Gomez, 2011), yet a
recent survey of top ecommerce sites found
that the average site took 8.9 seconds to load
over an LTE network and 11.5 seconds over 3G
(Strangeloop Networks, 2012).
Site owners have tried to offset this
disconnect by offering mobile-only versions of
their websites, with limited success. Up to one
third of mobile users will opt to visit the full
site when given the option. Shoppers who stay
on the mobile site generate less revenue than
those who visit the full site: just 14 per cent
compared to 79 per cent generated by mobile
shoppers on the full site, and seven per cent via
the mobile app (Strangeloop Networks, 2011).
Clearly, the onus is on site owners to ensure
a speedy user experience across the board,
despite performance constraints that are outside
their control, such as inconsistent networks and
low-powered devices.

l Too many connections


l Too many bytes
l Too many server round trips
l Poor resource caching in the browser
l Third-party calls (marketing tags, analytics, ads)
These arent just mobile problems. Desktop
users suffer, too. But the impact on mobile is felt
much more deeply. While many of the solutions
outlined in this article are good practices for
any site served to desktop and mobile, some
are unique (and hard-won) solutions developed
specifically for mobile. All will deliver
significant benefit to mobile visitors.

Five performance culprits


To understand how to fix performance, you first need to understand the five primary performance culprits:
[Diagram: key mobile optimisation techniques: defer resources, simplify pages, HTTP compression, minify JavaScript, resize images, convert events]


Four key performance terms


Response time


What it means: Response time causes a lot of confusion. There's no single definition. It can refer to any number of things: server-side response time, end-user response time, HTML response time, time to last byte with no bandwidth/latency, and on and on.
When it's useful: Different types of response time measurements tell you different things, from the health of your backend to when content starts to populate the browser. You need to know what you're measuring and why. If user experience matters to you, ask how the type of response time you're looking at relates to what the end user actually sees.

Time to first byte


What it means: Time to first byte is measured from the time the request is made to the host server to the time the first byte of the response is received by the browser. Time to first byte doesn't really mean anything when it comes to understanding the user experience, because the user still isn't seeing anything in the browser.
When it's useful: For detecting backend problems. If your website's time to first byte is more than 100 milliseconds or so, it means you have some backend issues that need to be looked at.

Start render
What it means: As its name suggests, start render indicates when content begins to display in the user's browser. However, it doesn't indicate whether the first content to populate the browser is useful or important, or is simply ads and widgets. This term seems to have evolved as an alternative to end-user response time, but it's not yet widely used outside of hardcore performance circles.
When it's useful: When measuring large batches of pages or the performance of the same page over time, it's good to keep an eye on this number. Ideally, visitors should start seeing usable content within two seconds. If your start render times are higher than this, you need to take a closer look.

Load time
What it means: The time it takes for all page resources to render in the browser, from those you can see, such as text and images, to those you can't, such as third-party analytics scripts. Load time needs to be taken with a grain of salt, because it isn't an indicator of when a site begins to be interactive. A site with a load time of 10 seconds can be almost fully interactive in the first five seconds. That's because load time can be inflated by third-party scripts, such as analytics, which users can't even see.
When it's useful: Load time is handy when measuring and analysing large batches of websites, because it can give you a sense of larger performance trends.
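In the browser, several of these milestones can be read from the Navigation Timing API (window.performance.timing at the time of writing). Here is a sketch with made-up milestone values; the subtractions shown are a common way to derive the durations:

```javascript
// Derive two of the durations described above from Navigation Timing
// milestones. In a real page you would pass window.performance.timing;
// the numbers below are made up for illustration (ms since epoch).
function timingSummary(t) {
  return {
    timeToFirstByte: t.responseStart - t.requestStart,
    loadTime: t.loadEventEnd - t.navigationStart
  };
}

console.log(timingSummary({
  navigationStart: 950,
  requestStart: 1000,
  responseStart: 1090,
  loadEventEnd: 4950
})); // { timeToFirstByte: 90, loadTime: 4000 }
```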


Survey An average ecommerce site takes 8.9 seconds to load over an LTE network and 11.5 seconds over 3G

Core optimisation techniques


Don't do anything until you've addressed these two best practices. These are the low-hanging fruit on the performance tree, but it's astonishing how often they're neglected. In a recent survey of 2,000 top ecommerce sites, 13 per cent of sites did not enable keepalives and 22 per cent failed to implement text compression (Radware, 2013).
1 Enable keepalives
The problem: A TCP connection involves a process by which both the user and the server send and receive acknowledgment that a connection has been made and data can begin to be transferred.
Too many TCP connections will slow sites down.
To find out if your site has this problem, run a
URL from your site through a performance testing
tool and get its keepalive score. Try WebPagetest
(www.webpagetest.org), which is a free online
tool supported by Google. If the keepalives are

anything less than an A, then take a look at your


site's waterfall chart. If you see a lot of orange
bars, you have a problem that could be fixed by
using keepalives.
The solution: While it's not easy to speed up a
TCP connection, you can control how many times
the connection takes place. Make sure you have
the proper configuration on your servers and
load balancer. Also, a number of content delivery
networks don't do keepalives properly, so keep
your eyes open for lots of orange bars on content
coming from your CDN.
2 Add HTTP compression
The problem: According to the HTTP Archive, the
average web page is now well over 1MB in size.
That's a massive payload, especially for mobile.
The solution: Compressing resources can reduce
the number of bytes sent over the network.
Compressing text-based content (HTML, CSS,
JavaScript) isnt the only way to reduce your
payload, but its one of the easiest. As with
keepalives, first test your site and get its compress
text score, then check out its waterfall chart. If
your page scores lower than an A, or you see a lot
of bright blue bars on your waterfall, you have a
problem that could be fixed through compression.
Make sure your site follows Google's best practices
for compression (http://netm.ag/payload-242).

Intermediate optimisation
techniques
Once you've nailed the low-hanging fruit, this set
of techniques should be next on your list.

Delayed A one second delay resulted in a 3.5 per cent drop in conversions and a 9.4 per cent decrease in page views

3 Compress images
The problem: Images account for a full 60 per cent
of the average web page's payload. In my travels,


Payload The average web page is 1400KB in size, with 60 per cent of the payload coming from images alone

After compression, you can typically expect a 10 to 40 per cent decrease in file size
I regularly see sites that use unoptimised and
unnecessarily bulky images. Combating this bulk is
a huge step toward making pages faster.
The solution: I can't stress enough the importance
of ensuring your images are saved in the optimal
compressed format.
These formats are:
l Photos: JPEG, PNG-24
l Low complexity (few colours): GIF, PNG-8
l Low complexity with transparency: GIF, PNG-8
l High complexity with transparency: PNG-24
l Line art: SVG
But it's not enough to use the right format. Whatever tool was used to create the graphic won't save the file in the most efficient way, which
is why you need to pass all images through a
compression tool. (See the sidebar for a list of tools
to consider.) After compression, you can typically
expect to see a 10-40 per cent decrease in file size,
without any noticeable sacrifice to image quality.
4 Minify JavaScript and CSS
The problem: A page's source code can contain a


Orange bars Each orange bar on this waterfall chart represents a brand new TCP connection

Waterfall chart In a case study involving an un-optimised version of the Velocity conference homepage (http://
velocityconf.com), keepalives and HTTP compression shaved more than five seconds from the pages load time

Case study Velocity Conference website optimisation


In June 2012, at the Velocity Web Performance
and Operations Conference in Santa Clara,
California, we conducted a workshop (http://
netm.ag/velocity-242), demonstrating a series
of core and advanced mobile optimisation
techniques in action.
Taking an unoptimised version of the Velocity
homepage and iterating through many of the
optimisation techniques discussed in this article,
we reduced the page's mobile load time from
20.5 seconds to 5.6 seconds, which is a 73 per
cent improvement. By cutting resources by
two thirds and reducing the total payload from
954KB to 497KB, we took the page's start render
time from almost eight seconds down to just
over two seconds.

lot of unnecessary characters, such as spaces,
newline characters and comments, and these can
consume bandwidth and cause additional latency.
The solution: Minification, which is usually
applied only to scripts and stylesheets, eliminates
inessential characters. On average, minified files
are reduced by about 20 per cent. Minification can
also mean the difference between a cacheable
object and one that is too big for the cache on a
particular mobile device.
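To make the saving concrete, here is a toy sketch of the kind of work a minifier does. Real tools (UglifyJS for JavaScript, for example) are far more thorough, and the function name here is our own:

```javascript
// Toy illustration of minification: strip /* ... */ comments,
// indentation and blank lines. Joining lines with '' is only safe
// for CSS-like input, which is one reason real minifiers are
// language-aware rather than this naive.
function naiveMinify(source) {
  return source
    .replace(/\/\*[\s\S]*?\*\//g, '')                   // remove comments
    .split('\n')
    .map(function (line) { return line.trim(); })        // drop indentation
    .filter(function (line) { return line.length > 0; }) // drop blank lines
    .join('');
}

var css = '/* main styles */\nbody {\n  margin: 0;\n}\n';
var min = naiveMinify(css); // min === 'body {margin: 0;}'
```

Even this crude pass shrinks the sample noticeably; production minifiers also shorten identifiers and rewrite syntax, which is where the typical 20 per cent saving comes from.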

5 Consolidate resources (images, CSS, JavaScript)
The problem: Latency can kill a page's performance
right out of the gate. In web performance circles,
latency is the amount of time it takes for the host
server to receive and process a request for a page
object. The amount of latency depends largely on
how far away the user is from the server, but it's
typically 75-140 milliseconds per object for desktop
users and up to a full second per object for mobile
users over 3G. When you consider that a typical
web page contains around 100 objects, you can
see how these milliseconds pile up.
The solution: Consolidating similar page resources
into bundles is an effective way to battle latency:
fewer bundles equals fewer server round trips.
As an example, take a before-and-after look at
a page with 92 objects. Before consolidation,
the page had a start render time of more than
six seconds, meaning that it took that long
for content to begin displaying in the mobile
browser. After consolidation, the number of
objects was reduced to 34, and the start render
time was cut almost in half to 3.656 seconds.
Warning: Resource consolidation can be a
double-edged sword for mobile browsers.
Reducing requests speeds up page load the first
time, but larger consolidated resources may not
be cached efficiently for repeat visitors. If you're
using this technique, make sure to balance it
with techniques to optimise local storage.

6 Optimise localStorage
The problem: Caching is an essential technique
for improving load times for repeat visitors, or
for visitors who view multiple pages during a
single visit, but desktop and mobile caches are
not created equal.
Traditional browser caching doesn't work
well for mobile devices. Mobile browser

The SEO Handbook 59

caches are usually much smaller than their
desktop counterparts. Page resources are
flushed quickly, sometimes within a single visit.
The solution: While mobile browsers are getting
better, they're still not where we want them to
be. One positive development is the HTML5 Web
Storage specification, which has been implemented
in all major desktop and mobile browsers,
providing a great alternative to relying solely on
browser caching. Because HTML5 localStorage
is programmable and scriptable, it can be used
for whatever you want, including acting as a
programmable object cache. localStorage is
typically good for caching CSS and JavaScript,
especially site-wide files. It can also be used
for small images using the data URI scheme.
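A minimal sketch of such a programmable object cache follows. The storage object is passed in so the same function works against window.localStorage in the browser; the key naming and function names are our own conventions, not part of any API:

```javascript
// Programmable object cache on top of Web Storage: return the cached
// copy if present, otherwise fetch, store and return it.
function cachedFetch(storage, key, fetchFn) {
  var hit = storage.getItem(key);
  if (hit !== null) return hit;   // cache hit: no network round trip
  var value = fetchFn(key);       // cache miss: fetch the resource
  try {
    storage.setItem(key, value);  // quota errors are common on mobile
  } catch (e) {
    // storage full: serve the value uncached this time
  }
  return value;
}

// In the browser you might call:
// cachedFetch(window.localStorage, 'css:/main.css', downloadStylesheet);
```

Wrapping setItem in try/catch matters on mobile, where Web Storage quotas are small and an over-quota write throws rather than failing silently.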
7 Defer non-essential resources
(For example, third-party scripts for ads, analytics,
social widgets.)
The problem: Parsing JavaScript can take up to
100 milliseconds per KB of code on some mobile
devices. Worse, poorly optimised scripts for ads,
social media widgets or analytics support can add
precious seconds to load times, and sometimes
completely block a page from rendering.
The solution: Defer as much as possible until after
onload. The scripts to defer could be your own,
or they could be scripts from third parties. Many
script libraries (such as those that support
interactive user behaviour like drag and drop)
aren't needed until after a page has finished
rendering. Downloading and parsing these scripts
can be deferred until after the onload event. The
same logic applies to script execution.
You will also want to carefully evaluate the use
of large script frameworks (for example, jQuery) for
mobile sites. This is especially important if you are
only using a couple of objects in the framework.
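One common deferral pattern is sketched below: inject non-essential scripts only once the load event has fired, so they can never block first render. The function name and script path are illustrative, and the document is passed in explicitly here only so the logic is easy to follow:

```javascript
// Inject a script tag programmatically; called after onload, this
// keeps third-party code off the critical rendering path.
function deferScript(doc, src) {
  var s = doc.createElement('script');
  s.src = src;
  s.async = true;            // don't block parsing wherever it lands
  doc.body.appendChild(s);
  return s;
}

// In the browser:
// window.addEventListener('load', function () {
//   deferScript(document, '/js/analytics.js'); // hypothetical path
// });
```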

Velocity homepage Intermediate performance techniques shaved an additional four seconds off the mobile load time

Advanced optimisation
techniques
After you have successfully implemented the core
and intermediate best practices mentioned
previously in this article, there are still a few
remaining things you can do to ensure that you're
squeezing every last drop of performance from
your pages.

Additional reading and tools


● Programming the Mobile Web by Maximiliano Firtman
● WebPagetest www.webpagetest.org
● iWebInspector (web debugging tool for the iOS simulator) www.iwebinspector.com
● Mobile Perf Bookmarklet http://netm.ag/perf-242
● Google Web Performance Best Practices http://netm.ag/google-242
● HTML5 Web Storage specification http://netm.ag/storage-242
● HTML5 semantics http://netm.ag/html5-242
● Table: which mobile and desktop browsers support which HTML5 features? http://mobilehtml5.org
● The best image compression tools http://netm.ag/compression-242


8 Convert click events to touch events
The problem: Have you ever noticed the delay
that occurs between the time you tap your screen
and page activity? On touchscreen devices, the
onclick event doesn't fire immediately. Instead, the
device waits up to half a second (300 milliseconds
on most devices). This provides the user with an
opportunity to initiate some other gesture instead
of a click.
The solution: To fix this, use the touchend event
instead, which fires immediately when the user
taps the screen. You may still want to handle the
onclick event to ensure that the browser changes
the appearance of the button to show a clicked
state, and to support browsers that don't handle
touch events.
To prevent duplicate code execution when
both touchend and onclick code fire, add a click
event handler that calls preventDefault and
stopPropagation if the click was the result of a
user tap already handled by touchend.
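The pairing described above can be sketched like this. The helper name and the flag property are our own; a production version would also clear the flag on a timer:

```javascript
// Bind an action to both touchend and click. A click generated by
// the browser after a handled tap (the 'ghost click') is swallowed.
function bindTap(el, action) {
  el.addEventListener('touchend', function (e) {
    el._tapped = true;          // remember we already handled this gesture
    action(e);
  });
  el.addEventListener('click', function (e) {
    if (el._tapped) {           // ghost click following the touchend
      el._tapped = false;
      e.preventDefault();
      e.stopPropagation();
      return;
    }
    action(e);                  // mouse, keyboard, or non-touch browser
  });
}
```

On a touch device the action fires once, immediately on touchend; in a desktop browser with no touch events, the click handler alone fires it.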
9 Suppress redirects
The problem: When users try to navigate to a
standard desktop site from a mobile device, it
often generates an extra round trip: a redirect to
the client and then back to the mobile site. This
ends up consuming several hundred milliseconds
over mobile networks.
The solution: For obvious reasons, it's faster to
deliver the mobile web page directly in response
to the original request, rather than delivering a
redirect message that then requests the mobile
page. As a courtesy to users who prefer to view
the desktop site on their mobile device, you can
provide a link on the mobile site that signals your
application to suppress this behaviour.
10 Implement network-sensitive resource
preloading (3G vs Wi-Fi)
The problem: Preloading resources in the user's
browser, in anticipation of their visiting additional
pages on your site, is a great technique for desktop
users, where bandwidth isn't an issue. But for
mobile users, preloading can eat up precious (and
expensive) bandwidth.
The solution: Preloading should be used only
when combined with code to detect the type
of mobile connection. On Android 2.2+, the
navigator.connection.type property returns values
that allow you to differentiate Wi-Fi from 2G, 3G
and 4G connections, and on BlackBerry, you can
check the value of blackberry.network to access
similar information.
In addition, server-side detection of user-agent
header data or other information embedded in
requests can alert your application to the type of
connection in use.
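A sketch of the gate described above follows. The string values mirror the W3C Network Information API draft; older Android builds expose numeric constants on navigator.connection.type instead, so treat this mapping as illustrative rather than universal:

```javascript
// Only preload on connection types where bandwidth is effectively
// free; anything cellular (or unknown) skips preloading.
function shouldPreload(connectionType) {
  return connectionType === 'wifi' || connectionType === 'ethernet';
}

// In the browser (guarded, because navigator.connection is not
// available everywhere):
// var conn = navigator.connection || {};
// if (shouldPreload(conn.type)) {
//   /* warm the cache for the pages the user is likely to visit next */
// }
```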


Responsive site Global Islands Vulnerability Research Adaptation Policy and Development
(GIVRAPD) is a responsive WordPress website built with LESS & SMACSS
Revenue Less revenue is generated via mobile sites

11 Resize images
The problem: I'm filing this under advanced
techniques because it's tricky to implement,
especially for large, complex, dynamic sites, but
it's a critical performance challenge. As I've already
mentioned, images account for a huge portion
of a typical page's payload, which is crippling for
mobile, not to mention completely unnecessary for
smaller screens.
The solution: Don't waste bandwidth by relying
on the browser to scale a high-resolution image
into a smaller width and height. Instead, it's best
to dynamically resize images in your application,
or even replace images with smaller versions for
mobile sites.
Another option is to load a very low-resolution
version of an image initially to get the page up as
quickly as possible, and then replace that with a
higher-resolution version on the onload or ready
event, after the user has had a chance to begin
interacting with the page.
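The low-res-first swap can be sketched like this. The data attribute name is our own convention (any data-* attribute would do), and a real implementation would also handle load errors:

```javascript
// Upgrade images after load: each img starts with a small placeholder
// in src and carries its high-resolution URL in a data attribute.
function upgradeImages(images) {
  images.forEach(function (img) {
    var full = img.getAttribute('data-fullsrc');
    if (full) img.src = full;   // browser fetches the big file only now
  });
}

// Markup sketch: <img src="photo-low.jpg" data-fullsrc="photo-full.jpg">
// In the browser:
// window.addEventListener('load', function () {
//   var imgs = document.querySelectorAll('img[data-fullsrc]');
//   upgradeImages([].slice.call(imgs));
// });
```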
12 Simplify pages with HTML5 and CSS3
The problem: There's no real problem here, per se.
This technique is a pure optimisation play.
The solution: The HTML5 specification includes
new structural elements, such as header, nav,
article, and footer. Using these semantic elements
yields a simpler and more efficiently parsed page
than using generic nested div and span tags. A
simpler page is smaller and loads faster, and a
simpler DOM means faster JavaScript execution.
The new tags are quickly being adopted in new
browser versions, including mobile browsers.
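By way of illustration, the swap looks like this (element contents elided):

```html
<!-- Generic wrappers: every hook needs an id or class -->
<div id="header">...</div>
<div id="nav">...</div>
<div class="article">...</div>
<div id="footer">...</div>

<!-- HTML5 structural elements: less markup, clearer semantics -->
<header>...</header>
<nav>...</nav>
<article>...</article>
<footer>...</footer>
```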
Similarly, new CSS3 features can help create
lightweight pages by providing built-in support for
things like gradients, rounded borders, shadows,
animations, transitions and other graphical effects
that previously required you to load images.
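For example, a single rule can replace what once needed a sliced background image (the colours and sizes here are purely illustrative):

```css
/* Image-free button styling: gradient, rounded corners and shadow
   are all drawn by the browser, saving an HTTP request each */
.button {
  background: linear-gradient(#4a90d9, #2a6bb0);
  border-radius: 6px;
  box-shadow: 0 1px 3px rgba(0, 0, 0, 0.4);
}
```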

Do CDNs help mobile sites?
The short answer is: yes. The long answer is: yes,
but perhaps not as much as you think they do.
Content delivery networks (CDNs) have
emerged as an excellent tool for mitigating web
latency. In web performance circles, latency is
the amount of time it takes for the host server to
receive and process a request for a page object.
The amount of latency depends largely on how
far away the user is from the server.
To put this in real-world terms, say you visit a
web page and that page contains 100 resources.
Your browser has to make 100 individual
requests to the site's host server(s) in order to
retrieve those objects. Each of those requests
experiences at least 20-30ms of latency. (More
typically, latency is in the 75-140ms range.) This
adds up to two or three seconds, which is pretty
significant when you consider it as just one
factor that can slow your pages down.
When you also consider that a page can have
upwards of 300 or 400 objects, and that latency
can reach a full second for some mobile users,
you can easily see where latency becomes a
major problem.
CDNs cache content in distributed edge
servers across a region or worldwide, bringing
content closer to users and thereby reducing
round-trip times.
In our step-by-step optimisation of the
Velocity homepage (http://velocityconf.com)
(also see the case study embedded in this
article), we were intrigued to notice that, while
CDNs did help performance somewhat (shaving
approximately 250 milliseconds off the page's
time to first byte and 1.5 seconds off load time),
the impact wasn't as dramatic as we would
expect to see on a desktop site. This led us to
wonder about the overall effectiveness of edge
selection for mobile devices.
While these findings shouldn't be taken to
mean that CDNs aren't effective for mobile, they
do warrant further investigation.

Take away
No matter what evolutionary leaps we make in
mobile technology, web pages are only going to
grow bigger and more complex. To keep pace and
maintain some semblance of control, we need to
continue to innovate our practices for optimising
directly at the page level.

Final reduction In the final reveal, implementing some advanced optimisation techniques brought the mobile load time
for the Velocity conference's content-rich homepage down to a satisfactory 5.56 seconds

Master
mobile
navigation
Users want content. But, argues
Aaron Gustafson, you first need
to ensure they can locate it,
whatever their device type

Words Aaron Gustafson


is the founder of the web
consultancy Easy Designs,
group manager of the
Web Standards Project
(WaSP), and author of
Adaptive Web Design
http://aaron-gustafson.com

Image Mike Brennan


is art editor of net
www.twitter.com/mike_brennan01


The success of a site almost always
hinges on its content. Content, as
we are frequently reminded, is king.
But if that's the case, then navigation is the
establishment that keeps him on the throne.
Navigation does a lot more than move
someone from location to location within a site;
it shapes a user's experience. Smart navigation
can enhance a user's understanding of your site's
content, services and offerings, while obtuse
language, unfamiliar acronyms and jargon can
confuse and distract your users from their goals.
While well-executed navigation reflects your
brand and enhances its credibility, inappropriate
navigation creates dissonance and breeds
mistrust. Simple, straightforward navigation can
increase your sales by ensuring users easily find
your goods and can make a purchase; confusing
navigation can instead ramp up your customer
service and support costs.
Sure, it's not always sexy, but navigation is
critical to the success or failure of your website.
It doesn't matter how good your content,
product or service is: if users can't find it,
they'll move on.

Evolution
When the mobile web first became a reality, the
few companies that felt the need to venture into
this uncharted territory (mostly airlines and
financial institutions) did so by creating
completely separate mobile websites, often with
unique content and navigation. Frequently, these
sites amounted to a 'lite' version of the parent
website, with content and navigation focused
on the tasks most likely to


be needed on the go. The vast majority of these
sites (especially the homepages) were entirely
navigation: lists upon lists of links and simple forms
for accomplishing what the business interests or UX
teams deemed to be key tasks.
It was a decent strategy at the time. Not many
mobile web browsers were powerful enough to
handle CSS-based layouts and few supported more
than the most basic elements of HTML in the form
of WAP or XHTML MP. Browsing on these devices
was painful, often accomplished by means of the
phone's number pad, a rocker button, or a scroll
wheel. If you were lucky enough to have a Treo or a
similar device, you could use your finger or a stylus
to click the tiny links on your minuscule screen.
All of this is to say that navigating the web
on a mobile device was not an enjoyable
activity. Given the incapable browsers and

Real deal Authentic Jobs (www.authenticjobs.com) hides nav unless a screen is 768px wide and supports media queries

the accompanying slow-as-molasses data
connections, it was impossible to hop onto
the web to look something up. There was definitely
no hopping. Just waiting and frustration.
Given the state of browsing on a mobile
device, simple navigation and task-focused mobile
sites were a breath of fresh air. Trying to browse
(or simply load) a traditional desktop site was a
nightmare. Consequently, if you used the web on
a mobile, it was usually because you really needed
access to information immediately and didn't have a
better device handy. And when you did so, you went
right to the site, got the information you needed,
and left. You rarely lingered and certainly never
surfed around just for fun. Unless, of course, you
were incredibly bored.
With the advent of the iPhone in 2007, our
concept of what it meant to access the web from
a mobile device changed dramatically. Out of
nowhere, Apple came along and challenged the
common wisdom by enabling browsing on mobile
to be every bit as full-featured as it was on a
desktop or laptop. Not only that, it was fun!
(Fast? Well, not so much. At least not back in '07.)

The first iPhone made it possible (nay, enjoyable)
to tap and swipe your way across the web on a
small screen. In the coming months and years,
users began to shift away from the focused,
task-oriented 'traditionally mobile' browsing
paradigm to a more 'traditionally desktop-ish'
one. And with the advent of CSS3 media queries
in 2010, it became possible to craft adaptive
layouts that tailored themselves to the screen
on which they were displayed without relying on
JavaScript-based trickery and alternate style sheets.

The balancing act


With the distinction between mobile and desktop
dissolving, managing a breadth of web experiences
from a single code base became an intriguing
reality, and made it more difficult to justify
stripping away content from small-screen devices.
After all, the 'mobile context' that was pretty much
the norm on the old mobile web was no longer
ensured. Study after study proved that people were
surfing the web on their phones while plopped on
the couch in front of the television. That's about as
far from 'on the go' as you can get.

Clarity is key
Whether designing for desktop or mobile, the
clarity of your navigation labels is key.
As Jared Spool pointed out in Stop Hiding
Behind Products (http://webstandardssherpa.
com/reviews/stop-hiding-behind-products),
we should avoid using generic labels (such as
'products') in our navigation when we can use
more specific terms such as 'snow melters' or
'DVD players'. Meaningful labels will help our
users more quickly (and accurately) suss out
what lies behind our navigation links. It also
sets an expectation for what they will find when
they click, while reducing the likelihood that
they will need to bounce in and out of several
sub-pages to find what they're looking for.
Similarly, it is important to realise when your
chosen label may be inappropriate for your
audience. For instance, avoid using internal
corporate jargon in your navigation (or copy,
for that matter). Jargon (unless specific to your
industry) is not only meaningless to your users;
it can make them feel alienated or unwilling to
trust your business if they feel you are hiding
something by using it.
Finally, you should avoid using SKUs, part
numbers or other non-descriptive labels in your
navigation. Few users will have memorised your
entire catalogue. If you have multiple products
that fall into a category, include that category
('clock radios', for instance) in your navigation,
but don't make sub-navigation out of the SKUs.
Link to those product pages from the category
landing page. No one wants to guess what's
behind navigation item #1.
With clear, straightforward labels to guide
them, your users will be able to find their way
around your site quickly and easily, no matter
which platform they're on.

The truth is, users are no longer satisfied with a
'lite' mobile experience, nor do they want to hunt
for the 'View Full Site' link only to be greeted by a
hard-to-use layout meant for a large monitor. They
want to be able to do anything they could do on
traditional desktop web platforms (and potentially
more). It's in our best interests to support a single,
adaptive web experience: Opera recently found 59
per cent of its US user base is mobile-only. That
number is higher in places such as Egypt, Brazil and
South Africa, and it's growing worldwide.
If we assume users are just as likely to browse
on a mobile device as they are on the desktop, we
stop thinking about the small screen as needing
less and focus on making content, layouts and
navigation appropriate to the real estate and
capabilities available. And, at the same time, we
should revisit our assumptions about desktop users:
should wading through extra cruft in the pursuit of
our content become a thing of web design's past?
We need to seek out that happy medium that
balances the needs of desktop web users with those
using mobile devices.

Navigation strategies
I'm going to take you on a whirlwind tour of mobile
navigation strategies. Each has its pros and cons
but, more importantly, each has its own set of
dependencies. Most, as you'd expect, rely on media
queries. Some have source order requirements. And
a few rely on JavaScript, the absence of which can
make for an awkward interface.

Hide it
Our first strategy takes its cue from the old mobile
web camp and only enables users to accomplish
key tasks (as identified by the UX team, upper
management or user testing of an existing mobile
site). Users are offered little (or no) navigation and
can only access a subset of a website's features. In
some cases, the decision to reduce or remove
navigation is made to conserve real estate. Users
who only experience such a website on a mobile
device may never know they're missing features, but
users who visit it on multiple platforms (which is an
increasing trend) are likely to become frustrated
when they can't see items they're used to accessing.



Three little lines
Tim Kadlec

On the go US Airways' mobile site's traditional approach

Easy access Prioritised nav helps you get at everything

Authentic Jobs (www.authenticjobs.com) is a
good example of this strategy. The main navigation
of the site is hidden by default and then displayed if
the browser supports media queries and is at least
768px wide:

/* Hide the nav by default */
body > header nav {
	display: none;
}
/* >= 768px */
@media only screen and (min-width: 768px) {
	/* Show the nav again */
	body > header nav {
		display: block;
	}
}

While this strategy may serve its users well,
it is important to realise that mobile users are still
being forced to download the HTML. True, it's only a
few navigation links in this instance, but when you
begin to use display: none for more of your content
(especially images) it can have a tremendous effect
on the speed of the site, as well as how much of a
user's bandwidth your site consumes.
As I mentioned, this strategy can be frustrating
for frequent users who want to complete a task
(such as posting a job) from a small-screen device.
Incidentally, most of the pages on Authentic Jobs
do look and work well on mobile; you just can't
navigate between them.

There's a host of interesting navigation
options available when you're looking
to adapt your designs to smaller screens

Trim it
If you are struggling to find a reasonable layout
for a large navigation menu comprised of several
tiers of navigation items, you might want to
consider reducing the number of links to only the
primary ones for mobile devices. This, of course,
assumes that each of those navigation items has a
corresponding landing page that provides access
to those sub-pages (unfortunately, this is not
always the case).
As with the previous example, this strategy
would place a tax on your small-screen users by
forcing them to download more markup than
they require. On the plus side, unlike the previous
example, you are not likely to frustrate a
cross-platform user because the primary navigation
options will be consistent.
If you are considering this strategy, you should
also ask yourself whether the sub-navigation is
really necessary on any platform. After all, if you are
comfortable getting rid of it on smaller screens, do
you really need it on larger ones?
It's worth noting that this concept can be used in
concert with most of the other navigation strategies.
As part of an experiment (www.justmarkup.com/
lab/juma/nav/example2), Michael Scharnagl crafted
a navigation scheme wherein he classified the main
navigation links into three distinct groups according
to priority, progressively adding more items as more
real estate became available. On smaller screens, he
displays a 'more' link that enables users to toggle
the additional navigation items into view without
relying on JavaScript (akin to the JavaScript-less
drop-down I will discuss shortly).

Shrink it
If your site's navigation is relatively succinct, it's
possible you'll be able to get away with simply
adjusting the layout and size of your navigation
items. 'Succinct' is obviously a matter of opinion, so
you should be cognisant of how much space
your navigation occupies even when

When a responsive site is loaded on a small
screen, the navigation is often collapsed
or positioned at the bottom of the page to
provide more room for the actual content. In
place of the full menu, a button is included
that will control the display of the navigation
items. The button typically sports a
sharp-looking icon of some sort.
Icons are a nice touch and add visual
appeal, but we need to be careful not to be
too clever at the expense of our visitors.
Icons should be intuitive and reflect their
functionality. Unfortunately, many icons used
to represent navigation today, such as the
plus sign or an arrow, fail on both counts.
There is one, the trigram for heaven
(recognisable as three horizontal lines), that
has started to emerge as a de facto standard,
but it still has a way to go before it's as easily
identifiable to people as, say, an icon of
a phone for a 'contact us' link.
It will depend on your target audience but,
for now, you'll probably find it's often best to
pair the icon with some text to help make its
purpose more obvious. You could achieve this
by using CSS-generated content and inserting
the Unicode symbol for the three lines, but
Android struggles with Unicode symbols and
won't display the icon as intended.
You can get the same general effect by
positioning the generated content absolutely
in relation to the links parent element and
applying a couple of borders. For example:
<li id="menu"><a href="#">Menu</a></li>

#menu {
	position: relative;
}
#menu a {
	padding-left: 20px;
}
#menu a:before {
	content: "";
	position: absolute;
	top: 30%;
	left: 0;
	width: 12px;
	height: 2px;
	border-top: 6px double #000;
	border-bottom: 2px solid #000;
}
It's not quite as tidy as Unicode, but it gives
the same result, works in Android and means
you avoid the extra HTTP request that would
come from using an image. The result is an
intuitive, appealing interface for your visitors.


Author Implementing
Responsive Design
(New Riders, 2012)
URL www.timkadlec.com

Baby steps Confab 2012 (www.confab2012.com) makes minor accommodations for small-screen users, but clusters its nav away at the top of the display

diminished. Unless they're looking for
inspiration, or writing a piece on mobile nav,
users don't generally come to your site with a view
to checking out the navigation.
As with all of these strategies, maintaining a
clickable size is key on touchscreens because fingers
are much less precise than a mouse pointer. Be
sure to provide ample targets (44px square at a
minimum) and give your users a little breathing
room between navigation items in order to reduce
the possibility of mis-taps.
/* <= 720px */
@media (max-width: 720px) {
	/* Linearize the list & space out the items */
	.mainnav li {
		display: inline-block;
		line-height: 1em;
		margin: 0 .5em .5em 0;
	}
	/* Make the links large & tappable */
	.mainnav li a {
		border-radius: 1em;
		font-size: 1em;
		padding: .5em .75em;
	}
}

Bottoms up Nichols College's nav (http://goto.nichols.edu) stays at the base of the page, rearranging as layout shifts

Rearrange it
For a number of years now, web standards
advocates, SEO consultants and accessibility experts
have been arguing in favour of putting the content
first in terms of source order. After all, if you're
using CSS, it's a breeze to move your navigation
to the top of the page.
The benefit of this approach is that it provides
immediate access to the meat of your page for
your users and for search engine spiders alike.

Source order independence can be very useful
for increasing the usability and accessibility of
your site. As long as you're comfortable with
absolute positioning, it's not much of a challenge
to implement either. Just be sure you also include a
'back to top' link at the end of your navigation list,
so that users can easily move back to the top
of the page too.
Jumping down the page to an anchor reference
can be a little jarring. But with the help of some
JavaScript, you can hijack a click on 'skip to' links
to smoothly scroll the user up or down the page.
Karl Swedberg's Smooth Scroll plug-in for jQuery

Incidentally, it's also incredibly helpful for mobile
users, because they don't need to wade through all
of the navigation options for your site before they
can get to the content. Contents magazine (http://
contentsmagazine.com) uses this simple approach
in concert with a 'skip to navigation' link at the top
of the page to provide immediate access to the nav
when a user wants it.
/* >= 768px */
@media screen and (min-width: 48em) {
	/* Hide the skip to */
	.go-nav {
		left: -1000em;
	}
	/* Move the nav up */
	#site-nav {
		position: absolute;
		top: -5em;
		width: 100%;
		z-index: 5;
	}
}

(https://github.com/kswedberg/jquery-smoothscroll) has served me reliably for this purpose.

Collapse it
One of the more popular mechanisms for managing
larger navigation lists on mobile is the drop-down
menu. This particular UI construct can be
accomplished in a few different ways, each of
which has its own set of dependencies. To choose
the right one, you must first determine whether or
not the menu should push the page content down
when it expands.
Starbucks (www.starbucks.com) is probably
the most popular example of a drop-down menu
that pushes the content down the page rather
than sliding over it. In order to accomplish this,
the navigation list must appear at the top of the
document so that when it expands, it pushes all
subsequent content down the page.
/* Hide the nav by default */
#nav .nav_menu {
	display: block;
	height: 0;
	overflow: hidden;
}
/* Open it when JS adds the .open class */
#nav.open .nav_menu {
	height: auto;
}

Position is everything
James Kalbach

Natural approach Contents (http://contentsmagazine.com) is well named: its content is placed first in source order

a) converting a list-based nav to a select using


JavaScript, or b) maintaining a hard-coded select in
addition to a list for your navigation.
Five Simple Steps (www.fivesimplesteps.com)
uses the latter approach for this technique. The
style sheet manages which one is displayed based
on media queries.

Interestingly, without JavaScript, Starbucks leaves


the nav open despite using a skip to link as the tab.
It could have accomplished the same UI treatment
without JavaScript using the :target pseudoclass selector, as was done on Nichols Colleges
prospective student site (http://goto.nichols.edu).
For more on Nichols Colleges pure CSS technique,
visit www.netmagazine.com/tutorials/build-smartmobile-navigation-without-hacks.
The Nichols College site, as well as being
an excellent example of a JavaScript-less dropdown, shows the other option when it comes to
this UI component: source order independence.
Like Contents magazine, this site maintains all
of its navigation at the bottom of the page and
rearranges it as the layout adjusts. When the user
is on a small screen device that supports :target,
the primary navigation links move to the top and
are hidden until the user clicks the skip to link that
triggers the reveal.

/* <= 767px */
@media only screen and (max-width: 767px) {
/* Hide the list */
nav ul {
display: none;
}
/* Show the select */
nav select {
display: block;
}
}
/* >= 768px */
@media only screen and (min-width: 768px) {
/* Show the list */
nav ul {
display: block;
}
/* Hide the select */
nav select {
display: none;
}
}

body:not(:target) #nav {
/* these styles are only applied if :target and :not()
are understood (and the body is not targeted, of course) */
}

Convert it
Another popular mechanism for shrinking
navigation on mobile devices is to convert it into a
select element. The benefit of this approach is that
it drastically reduces the space required for
navigation, while maintaining a look and feel that is
familiar to the user. It also places no limitation on
the number or depth of navigation items. The
downside is that using this scheme requires either

If you decide to include a select in your markup
rather than dynamically creating it with JavaScript,
you will need to include a button to submit the
form containing it (and have some server-side code
to manage the redirection), because you can't rely
on JS being available to trigger the page change.
If you don't want that kind of hassle, you're
better off generating the select dynamically from a
list in the markup using JavaScript. At least that way
you know that JavaScript is available to handle the
change event. TinyNav (http://tinynav.viljamis.com)
and Responsive Menu (https://github.com/
mattkersley/Responsive-Menu) are two (of

Position is everything
James Kalbach
Deciding where to position navigation
menus is an important consideration for
screen designers.
In desktop applications, main navigation
options are typically located horizontally along
the top. This convention is nearly universal,
but with exceptions. An advantage is that the
functionality and tools of the program are
given the full screen width.
The first websites with static navigation
menus positioned them vertically on the left.
For a hierarchical, content-based website, this
shows the structure of the pages well. The
menu is also always visible when the page
loads, even if there is horizontal scrolling.
But web navigation is not as standardised as
desktop applications': menus also appear
along the top and even on the right side
in certain cases.
With the design of mobile applications,
we can see a different set of conventions
starting to emerge. On smartphones, menus
are frequently positioned horizontally along
the bottom of the screen. Not only does this
provide easy access with the thumb when
the phone is being held in the palm of the
hand; it also avoids having to reach across
the screen obscuring it in order to touch
an option, as is likely to happen with a top-aligned menu.
What's more, hidden menu options are a
common feature with mobile devices. These
can typically either be accessed by swiping
across a small tab to open a drawer on
screen, or via a hard key on the device itself.
Discoverability and memorability of these
options is lower, since they mostly remain
out of sight, but this tactic saves valuable
screen real estate.
Tablets frequently have menu bars at the
bottom of the screen, but also along the
vertical axis on the left side. The devices'
slightly larger screens allow for this. A left-hand
position also mirrors how people hold
tablets on the sides of the device. Overall,
we find even more variation in location of
navigation menus with mobile devices than
with websites.
So what's the best screen location for
mobile navigation? The situation, not
some arbitrary guideline, should be what
ultimately drives your decision. There are
potentially many viable options, and for
designers this means it's imperative to
thoroughly understand user behaviour, the
context of use, and the limitations of the
device to make informed decisions.

The SEO Handbook 67


James Kalbach is the author of Designing Web
Navigation (O'Reilly, 2007)
URL http://experiencinginformation.wordpress.com

Swap out Here a list nav is switched for a select element

the many) JavaScript options out there for


managing list-to-select conversion.
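As a sketch of what these list-to-select scripts do under the hood (the names here are illustrative, not TinyNav's actual API), you can walk the nav's links and build the markup for an equivalent select, then navigate on change:

```javascript
// Build the markup for a <select> equivalent to a list-based nav.
// `links` is an array of {href, text} objects scraped from the list.
function buildSelect(links) {
  var options = links.map(function (link) {
    return '<option value="' + link.href + '">' + link.text + '</option>';
  });
  return '<select>' + options.join('') + '</select>';
}

// In the browser, a real script would insert this next to the list and
// trigger the page change itself, e.g.:
//   select.onchange = function () { window.location = this.value; };
```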

Reveal it
The final mobile navigation paradigm showing
potential is the 'slide to reveal' nav treatment
popularised by apps such as Path and Sparrow on
iOS, and Facebook on the web.
On Facebook's mobile site, the navigation is
contained in a div with the class mSideArea and
the main content in a div with the id page. These
two divs are contained within an outer div with the
id viewport. This outer div is relatively positioned
to create a positioning context for .mSideArea,
which is absolutely positioned, given a width of
260px and a negative left offset of the same
amount to move it out of view; #page is
positioned relatively with no offsets.
#viewport,
#page {
position: relative;
}
.mSideArea {
width: 260px;
position: absolute;
left: -260px;
}

Facebook mobile The site uses a common menu icon, then slides the page to the right to reveal the navigation behind it
The reveal is accomplished by adding a class of
sideShowing to the body element. The addition of
this class triggers #page to receive a left offset of
260px (shifting the page content to the right) and
sets .mSideArea's left offset back to 0, moving it into
the 260px of empty space to the left of #page.
.sideShowing #page {
left: 260px;
}
.sideShowing .mSideArea {
left: 0;
}
This is a pretty clever piece of code, to be sure.
And, should you want to make it a little slinkier, add
a CSS3 transition:
#page,
.mSideArea {
/* Insert prefixed versions here */

transition: left .5s;


}
As Facebook's approach relies on toggling a
class, it is JavaScript dependent. But there's no
reason you couldn't use the :target pseudo-class
(like the one employed by Nichols College) to
accomplish a similar effect.
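A rough sketch of that JavaScript-free alternative, reusing Facebook's selector names purely for illustration (the menu link would point at the wrapper, for example <a href="#viewport">Menu</a>):

```css
/* When #viewport is the URL fragment target, slide the page over
   and bring the nav into view: no JavaScript required */
#viewport:target #page {
  left: 260px;
}
#viewport:target .mSideArea {
  left: 0;
}
```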

The decision is yours


There's a host of interesting navigation options
available to you when you are looking to adapt your
designs to smaller screens. Each strategy has its own
set of pros and cons, as well as dependencies in the
form of source order, markup, CSS, and/or JavaScript
support. Just keep in mind that there is no silver
bullet; every project is different and every project's
needs are different. By taking a hard look at your
content and your user goals, however, you should
be able to identify the most suitable strategy. l
Author's note: I'd like to say thanks to Brad Frost for
creating and maintaining a living compendium of
mobile navigation strategies. You can check it out at
http://netm.ag/frost-232.

Reading list
There are tons of smart people experimenting
with different mobile navigation schemes and
concepts. I've mentioned a few in this article,
but I thought it would be good to point out
some other gems I found in my research.

Responsive Navigation Patterns:


http://netm.ag/frost-232
As mentioned above, Brad Frost's compendium
of adaptive navigation strategies, including his
own perspectives, is an invaluable resource.

A Responsive Design Approach


for Navigation, Part 1:
http://netm.ag/wachs-232
In this article, Maggie Costello Wachs does a
great write-up on adapting navigation for a
wide range of circumstances based on her
experience working on The Boston Globe site.
I can't wait for Part 2.

Pull Down for Navigation:
http://netm.ag/kenny-232
I'm intrigued by Tom Kenny's approach with this
technique. I don't think it's quite ready for prime
time, but it is an interesting idea. Keep an eye
on this one.

An alternative to select elements
as navigation in narrow viewports:
http://netm.ag/johansson-232
Doing exactly what it says on the tin, in this
piece Roger Johansson explores an alternative
to converting your menu into a select using CSS
and a bit of JavaScript.

Responsive Multi Level Navigation (let's try):
http://netm.ag/scharnagl-232
In this experiment, Michael Scharnagl explores
prioritised navigation schemes in responsive
navigation. I like this concept because it allows
for prioritising key tasks, while also offering
users easy access to everything else.

Essential considerations for
crafting quality media queries:
http://netm.ag/gillenwater-232
In this bit of required reading, Zoe Gillenwater
runs through a ton of media query configuration
options, giving pros and cons for each approach.


Features

Understand your audience


Understand
your audience
Good research isn't just about finding out how many unique
visitors you have. Rob Mills sets out techniques to help you
get to know your users and make more informed decisions
Before I entered the design world I
was an audience research executive at
BBC Wales. This gave me a valuable
grounding in the importance of audiences,
research and having data to support any
decisions you make. It also taught me that
there's a big difference between knowing your
audience and understanding your audience.
Knowing is associated with top-level data such
as having 23,000 unique visitors a month to a site.
Understanding your audience is about finding out
as much as possible about the people behind those
numbers, including your users' social and cultural
situations and media consumption as well as their
likes, dislikes and needs.

Why understand?

Words Rob Mills is studio


manager at Bluegg,
conference speaker and
author of A Practical Guide
to Designing the Invisible,
from Five Simple Steps.
www.bluegg.co.uk

Image Sam Williams is a


member of Brighton-based
collective Magictorch
www.magictorch.com

Understanding means that you can make informed


decisions when designing for a target audience.
Doing research to establish who the audience is
and then digging deeper to understand them is
always time well spent. Content strategist Relly
Annett-Baker nicely summarised this on the
Creating Web Content course I recently completed:
'I do not trust stakeholders to know their audience
unless they have conducted some sort of research.
They either talk about the audience they had when
they were still largely in contact with them day-to-day,
or the one they'd like.' Even if the research
you undertake confirms what you already assumed
about the audience, it's definitely still worth having
that validation.
By understanding the audience you can make
informed decisions with regard to colours, tone of

Audiences are now


spread more thinly
and you need to
work to keep them
voice, typefaces and imagery. You'll understand the
cultural situation of your audience, so if a certain
colour has a different connotation in one culture
than in another, you can choose the best palette so
as not to offend.

Targeting content
Targeting is vital: the web is evolving quickly and
marketplaces are increasingly competitive. Greater
choice has resulted in fragmentation, so audiences
are now spread more thinly and you need to work
hard to get them and keep them. Users are
selective about where they spend their time online,
and this is a concern if they are consumers with
money to spend.

Knowing vs understanding
Top-level tricks of the trade such as Google
Analytics and the Jetpack WordPress plug-in
(http://netm.ag/jetpack-233) provide an overview
of your audience, but address knowing more than
understanding. With Analytics you can find
out your users' language, location,

Interpretation A handy grid featuring the key research methods, together with their main pros and cons

device, how they came to your site, their
user journey and how long they stayed: all
useful information, but specific to a single visit, and
offering nothing more about the people behind
the numbers.

Establishing a framework
To dig deeper you need to choose the research
methods that best offer the level of detail that you
need. Then you need to derive meaning from the
data to find the stories within. The first step is to
establish your objectives. It might be best to
categorise the audience information you aim to
obtain, perhaps in the following way:
l Basics
l Lifestyle
l Media

Basics could cover overall user numbers,


gender splits, age segmentation, nationalities
and locations. Lifestyle data is concerned with
education, employment, socioeconomic status,
health, finance and religion. Finally, Media
provides insight into your users consumption of
newspapers and magazines, technology, television
and radio and, of course, the internet.
Find out more about your users' media
consumption. What newspapers and magazines do
members of your audience read? What sort of TV
programmes do they watch? Which radio stations
do they listen to? How much time do they spend
online, and on what particular devices? Answers
to these questions start to help build a picture of
your audience as people rather than numbers. Add
some Basics and Lifestyle data and youll have a
thorough understanding of your audience.

Colour coded Understanding your audience means you can make informed decisions about design elements


Choosing your tools


The research methods available can include
questionnaires, focus groups, persona generation,
social media and interviews. In addition, you can't
beat some simple audience observation. There are
pros and cons for each method, so you need
to determine which is best suited to your budget
and objectives.
Questionnaires can be a cost-effective way to
gather data, but the design of the questionnaire
(what you ask and how you ask it) is vital in
getting valid data that you can analyse. Open-ended
questions enable the participants to share
whatever they like, but can be harder to derive
meaning from. Services like SurveyMonkey let you
design surveys, and even help you target them
at the right audience.
Interviews can offer a large amount of
information as you are free to explore topics in
detail. Thought needs to be given to the questions,
but there is also scope to react to something that
has been said and digress as needed. But owing to
the time involved, interviews can be an expensive
research method.
It seems focus groups divide opinion down the
middle. Some love them and others see no value in
them. They can provide insightful information on
attitudes and perceptions, but their value is largely
dependent on the facilitator and moderator. They
must draw out the information, make sure it
isn't prompted or biased, and be experienced at
observing a focus group environment.
Social media isn't a research method as such,
and shouldn't be used for this sole purpose: it can
be difficult to find any insight among the high
volume of noise. That said, with saved searches

Brand identity Use the data from your research about your audience to provide insight and help define your designs

l Define your objectives
l Determine your audience through research
l Deconstruct: find the stories, analyse the data,
and extract the insights
l Refine as needed, at suitable intervals
If you do find that the audience profile has
changed since the last time you carried out
research, simply adapt your site in line with this
new insight.

Once you have gathered the data,


you need to extract the meaning
from it and gather insights
Once you have gathered the data, you need to
extract the meaning from it and gather insights.
If you are starting on a new website, you should
make all your subsequent design decisions in
the light of these insights. If you have an existing
website, you should review it in light of your
research results.

Repeating the process


Audiences don't change overnight, but their
behaviour can fluctuate as rapidly as anything
else on the web does. For this reason, it's worth
carrying out research at regular intervals to make
sure you are still targeting the right people.
A good process here is Define, Determine,
Deconstruct and Refine:

Considering the client


Finally, remember that the objectives and research
framework you establish should be related to
your client's business goals. Don't just think about
what you want to know as a developer; instead,
consider what you need to find out about the
target audience in order to help your client achieve
those goals.
Understanding your audience in this way will
help you make informed design decisions, which
are backed up by hard data. This should make
selling your designs to the client a lot easier,
because youll be designing with their business
goals in mind.
Just remember: knowing is useful, but
understanding is essential. l 

Don't trust research blindly


What audiences say they do and what they
actually do can be two very different things.
One of my favourite anecdotes about this
comes from Steve Mulder, author of The User
Is Always Right.
In one of his webinars, Mulder told a story
of an electronics company that was carrying
out research to better understand consumers
in the marketplace. As part of this research,
the company conducted a focus group to
determine what colour it should make its
new boombox.
Ahead of the focus group, the company
had narrowed the choice down to two
possibilities: yellow or black. The participants
were all part of the target audience that this
product would be aimed at. All they had to
do was choose between the two.
The overall consensus was for yellow, since
it was seen as a vibrant, energetic colour. It
seemed that the company had its answer.
But as the participants left the focus
group, they were each rewarded with one
of the new boomboxes. They could choose
between a yellow one and a black one. They
all chose black.
You might argue that this example simply
shows that focus groups are flawed, but the
moral of the story remains that audiences
often say what they think researchers want
to hear. All your research findings need to
be validated before you commit to that all-important
business or design decision.


Images/Gareth Strange www.strangelove.me

and lists you can find out what a wide audience


are saying about a certain product, event, service,
brand or company.
Another Marmite-style approach is personas.
Some people swear by them and others wish they
were banished for good. If created effectively they
can help guide functionality and design. They need
to be communicated to the client well so they can
use them accurately, because there's always a risk
that they offer a 'one size fits all' solution.

Features

Beat Google link penalties


Beat Google
link penalties
Whether it's a problem with Penguin or a
manual links penalty, Tim Grice says there is a
way back into Google's good graces

Words Tim Grice is director


of search at Branded3. He
oversees the development
of search strategies and
campaigns for some of
the UK's biggest brands
www.branded3.com

Image Mike Chipperfield


is co-founder and creative
director of Brighton-based
collective Magictorch
www.magictorch.com

The last 18 months have seen some of the
biggest changes to Google's algorithms,
which help decide how websites appear
in search results. This has had a dramatic impact on
the SEO industry and website rankings on search
engines. One of the biggest updates is known as
the Penguin update.
The changes were intended to reduce the
impact of unnatural or spammy links, which
helped some websites rank. In some cases Google
sent messages out via Webmaster Tools informing
sites of unnatural links. In other cases, Google's
updates caused manual spam actions against the sites
involved. While most people in the search world
welcomed these changes, there were a few issues.
Google failed to confirm what an unnatural link is.
This left webmasters searching through link profiles
trying to guess which links could cause problems.
Typically, low quality blog networks and article
syndication sites were being targeted. However,
anything with aggressive commercial anchor text
also seemed to be getting picked up by Googles
new algorithms.
Whether you've had an algorithmic problem
with Google's Penguin update or a
manual links penalty, there is a way back into
Google's good graces. Links will need removing, and
links that you're unable to remove can be added to
the disavow tool.

After five months of testing, digital marketing
agency Branded3 (www.branded3.com) has
compelling evidence as to its effectiveness.
Just so we're all clear: it does work. If you have
been hit by a manual links penalty, following a few
simple rules and using this tool is all you need to
do to recover your rankings.
However, before I dive into the evidence I
thought it would be useful to go over some of the
reasons people are dubious about the disavow tool.
We'll then talk through how to sort this mess out.

Admission of guilt
A lot of people believe adding links into the
disavow tool is an admission of guilt and
submitting a file will cause further trust issues
with your website. This simply isn't true. I've never
seen a site react negatively to the submission of a
disavow file.

Google and bad links


I'm 100 per cent sure that if Google could just
ignore bad links, it wouldn't have sent unnatural
links messages, rolled out the Penguin update
or launched the disavow tool. Put simply,
Google cannot tackle link spam

Branded3 Author Tim Grice


helps large UK brands
achieve significant increases
in visibility and ROI

The disavow tool


Google launched the disavow tool to aid
webmasters struggling with unnatural links and
provide a route to recovery. There's been mixed
opinion as to the extent to which the tool works, why
Google launched it, and what impact it has had
on search results in general.

Be honest about your links


The fact is, if your link isn't editorial then
it's advertising. As much as we may love a
link, if it's advertising, Google doesn't want
to count it. If you've had an unnatural links
message, then Google knows about your
bad links. There's no point making excuses
for obvious bought or manipulated links.
Just remove the links because, chances are,
they aren't helping you anyway.

Anchor text

If you've been struggling to get a penalty
revoked, and there are obvious keywords
you no longer rank for, remove or disavow
every single link with that anchor text. Who
links with commercial anchor text anyway?

Timescales
Typically, you will get a response to a
reconsideration request within two weeks. If you are
successful and have a penalty revoked, you
may have up to four weeks to wait before
any rankings come back.

A word of warning
If you haven't had an unnatural links
message warning, you need to be very
careful when using the disavow and
reconsideration process. It's likely Google
hasn't found your bad links and is still
counting them. Disavowing and sending
in a reconsideration request will cause a
full evaluation of your profile, and you may
have added links to the disavow tool that
were still counting in your favour.

Negative signals
There seems to be a genuine fear of using
this tool around the SEO community.
However, if you've had the unnatural links
message, you really shouldn't worry. I have
yet to see even one negative consequence
when using the tool to remedy an unnatural
links message. Likewise, I have yet to see any
negative results through the submission of
multiple reconsideration requests. If you've
had a manual penalty, you simply need to go
through with this process. Don't worry either
about another penalty hitting as a result of
being transparent.

Reconsideration requests
Even though I would still recommend
sending in a detailed reconsideration request, I'm
95 per cent sure Google aren't reading
them or delving into any Google Docs sent.
However, I would continue to write a good
reconsideration request and send all data,
just to show willing.

Disavowed You need to have a Google Webmaster Tools account set up to be able to use the disavow tool to disavow links

with algorithms. It may be able to identify certain
types and devalue certain anchor text signals,
but ultimately it's fighting a losing battle. The
reason that we have the disavow tool and all these
penalties is that Google wants us to clean up the
web for them.
Now you may feel crowd-sourcing from Google
is wrong and a way of trapping SEOs. That's fine,
but don't be naive enough to believe that Google
could just ignore bad links.

Reasons to use the tool

Good sites could get hurt

I've heard people worrying that good websites may
be wrongly reported and prevented from passing
value. Honestly, I don't think this will be the case.
If you've been hit by a penalty, you're not going to
disavow your better links. Plus, if those links are
on good sites, it's straightforward enough just to
email and ask them to remove or change the link.
It's the spam sites without any contact details or
maintenance that will get reported and potentially
de-indexed. Basically, if your link profile is made up
of bad links, expect to lose rankings soon.
In addition, anyone who's trying to tackle lost
rankings and completely disregards the disavow
tool based on some sort of moral stance against
Google obviously isn't making any money from
natural search. Any agency that advises against
link removals, link disavowing or sending in
reconsideration requests is naive and doesn't fully
understand the updates rolled out last year.

Example 2 manual link spam penalty
l Unnatural links message received


l 80 per cent of the links are pulled down within
three months
l Multiple reconsideration rejections
l Disavow tool used
l Filed reconsideration
l Message received advising that the manual spam
penalty is removed
l Rankings back within seven days

If used correctly, the tool works. For example:

Example 1 manual link penalty


l Unnatural links message received
l Removed 95 per cent of link spam a month later
l Reconsideration rejected
l Used link disavow for remaining links
l Filed reconsideration
l Rankings came back within 10 days

Example 3 algorithmic anchor text filter


l Unnatural links message received
l 60 per cent of bad links removed
l Multiple reconsiderations rejected
l Disavow tool used
l Filed reconsideration
l Message received advising that there were no
manual penalties
l Rankings recovered three weeks later

Start again?
Is it time to just give up and start again? I've
yet to come across a hopeless case. We've
had sites where we have had to remove over
5,000 linking domains and still managed to
secure a positive result.
Reinclusion After filing re-inclusion requests, many sites see dramatic recoveries in traffic (credit: www.johnfdoherty.com)


Site explorer The search engine for links, this tool by SEOmoz allows you to perform
competitive website research and explore backlinks, anchor text (and more) for free

So if you've been hit with any kind of links
penalty, you're going to want to know how to deal
with it. There are a few pieces of advice I would
recommend you follow:

Link audit
Make sure you undertake a thorough link audit.
Combine Open Site Explorer (www.opensiteexplorer.
org), Majestic SEO (www.majesticseo.com) and
Webmaster Tools (www.google.com/webmasters)

Majestic A link intelligence tool for SEO and internet PR and marketing. Majestic SEO's Site
Explorer shows inbound link and site summary data, as well as your site's backlink history

As well as disavowing the links, you should also
send in a reconsideration request. This is the only
way a manual penalty can be removed and, until
you get a response, you won't be able to find out
which type of penalty you have.

Two penalties at play


In the last two years, Google handed out two
types of link penalty for either manual actions or
algorithmic penalties. For example:

Any agency that advises against link


removals, link disavowing or sending
reconsideration requests is naive
links to ensure you have the biggest sample
possible. You'll then need to work through them
and classify your links. Split them into three groups:
l Good link
l Good site, aggressive anchor text
l Low quality website and link
Obviously, leave the good links alone, contact
the sites with aggressive anchor text and request
removal, and add all spam links into a text file ready
for the disavow tool.
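The disavow file itself is just a plain text file: one URL or domain: entry per line, with lines beginning with # treated as comments. The entries below are hypothetical examples of the format, not real sites:

```text
# Asked the owners of these pages to remove the links; no response
http://articledirectory.example/seo-links/page1.html
http://articledirectory.example/seo-links/page2.html
# Disavow every link from this blog network
domain:spamblognetwork.example
```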
You must do a link audit before using the tool.
The last thing you want to do is disavow a link that
is genuine and passing value. On the other hand,
you need to make sure you collect as many of the
bad links as possible.
If you do the audit and find you have very few
or no good links, then don't expect your rankings
to return. At best you'll have a clean sheet to start
working from again.
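Combining and de-duplicating the link exports before classifying them is mostly mechanical. A minimal sketch (the function name and input shape are my own, not part of any of the tools mentioned):

```javascript
// Merge link exports from several tools (arrays of URL strings) into
// one de-duplicated list, so each link is audited exactly once.
function mergeLinkExports(sources) {
  var seen = {};
  var merged = [];
  sources.forEach(function (urls) {
    urls.forEach(function (url) {
      // Normalise trivially: strip a trailing slash, lower-case the key
      var key = url.replace(/\/$/, '').toLowerCase();
      if (!seen[key]) {
        seen[key] = true;
        merged.push(url);
      }
    });
  });
  return merged;
}
```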

1 Manual penalty for unnatural links


2 Algorithmic anchor text-based penalties
You may have both. If you follow my
previous advice, you will receive a response
from Google advising which you have. This advice
comes in the form of two messages:

Manual penalty
If you get a manual penalty response, happy days!
You will recover within 10 days.

Algorithmic issue
You may get one of two messages about
algorithmic issues. If your issue is algorithmic,
adding the suspect links into the disavow tool will
help you overcome it. You may be suffering from a
Panda penalty, which will need to be investigated
separately. Again, this message means there aren't any
manual actions and the issue is algorithmic. If it's
down to links, disavowing them will alleviate the issues.

Remember, you may have an algorithmic penalty
and a manual action! You have to take care of both
and send a reconsideration request.
So there you have it: the tool does work, and it
will help you sort out issues with penalties.
It amazes me when I see people advising against
the use of the tool. Some businesses are losing
millions in income because of these updates. It's
ridiculous not to use a tool that could prevent this
by killing bad links.
Before I finish up, let me add a couple of
warnings for you to consider:
l Building good links and doing lots of social,
content or viral marketing is not going to sort
out your penalty.
However, on the other hand:
l If all your links are low quality, then Google isn't
going to reward you for removing them. You
need good links in the first place.

In summary
The best plan of action:
1 Carry out a link audit and classify links
2 Manually remove aggressive anchor text
3 Add spam links to a text file
4 Disavow spam links
5 File reconsideration
6 Await response
7 Recover rankings
Throughout, you should build great links
through real outreach and marketing, too. l
Tim Grice speaks regularly about the search industry.
For more information about upcoming events, visit:
www.branded3.com/events

The SEO Handbook 77

Features

Content strategy


Giving context
to your content
In the first of a four-part series,
Sandi Wassmer establishes a
framework for content strategy
Content is defined in the Oxford English Dictionary as 'the things that are held or included in something', which is more than a tad ambiguous, so it's not surprising that interpretations of the word in the context of web design and development vary so widely.
If you ask 10 web designers or devs what content is and what role they play in its creation, you may get 10 different answers, ranging from 'Content? What? I'm a designer, not a copywriter' to 'I'm glad you asked. Sit down and let me tell you about the importance of taxonomies and metadata management.' But they will all know one thing: the success of their site, web app, mobile app, or other platform that content is delivered on, relies on it.

Content, and why it isn't optional

Words Sandi Wassmer, MD of Copious and CEO of NoBlah, is a digital technologist, inclusive design radical, Government adviser and occasional sleeper. Her zeal for UX is inspired by good design and architecture. www.sandiwassmer.co.uk

Image Mike Chipperfield is co-founder and creative director of Brighton-based collective Magictorch. www.magictorch.com

Put simply, content is anything that conveys information. It has many forms, such as text, images, graphics, illustrations, icons, logos, buttons, audio, video, metadata and file downloads, to name a few, and unless it's 1998 and your website is a single page of text-based static HTML, content should be the centre of your universe, because it's all you've got. When a website is broken down into its discrete elements, it's all just content.
And it's not just websites you're creating content for: long gone are the days when website copy was simply repurposed for use in mobile apps or social media. It's now just one big, dynamic, mobile and interactive digital platform with one thing in common: content. Ignore it at your peril.

The digital media marketplace

The landscape for digital content is ever changing, so everyone involved in its creation needs to consider the different channels in which it will be distributed, the environments that people will be in when interacting with it, and the multitude of devices that they will be using when doing so.
Within this varied landscape, for your website or app to be successful, meeting users' needs alone will make for a very unbalanced brand relationship. So your content must also meet your site's objectives, whether these are to sell a million rubber ducks or to get a hundred people to join your online petition. Of course this means your content has to be outstanding, but for it to do all of the things it needs to do for your users, your brand and your organisation, you must get strategic. Otherwise your content will be a tiny needle in a very large haystack.

Strategy and the creative process

Being strategic does not require a suit, a boardroom or any other metaphors you may conjure up. It's about having purpose and, crucially, about having design goals aligned with those of other disciplines within your digital team and the wider marketing and business objectives of your organisation.
Playing an active part in the content process gives web designers context, and for those who find the design briefs they receive all too brief, content strategy provides structure and direction, which in turn will help clarify your design goals. For those of you who already have structure and direction, approaching content this way will ensure that your goals are aligned, but also provide the opportunity to learn and hone your skills.
Just like web design and development, content strategy has its process; it has rules that you follow, or know how to break properly, and it too is evolving at breakneck speed. Content strategy requires creativity, passion, a keen sense of judgement and plenty of tenacity, just like web design. Make friends with content. I promise you will find you have a lot in common.

Bases covered Content structure throughout user interaction is vital; organise the flow and design for all potential outcomes

In a word Findable

Fringe benefits Digital agency Fringe has optimised its site's content to provide the best UX on different devices

Team responsibility A simple Google search for recipes shows how vital it is to get down to the detail
Content creation and design are interdependent, but in organisations that aren't integrating content strategy into their processes, the two remain at loggerheads. In advertising and marketing, the creative team model, which has art director and copywriter conceptualising together, is longstanding. Creative teams bring brands to life with consistent, cohesive messaging, but web design has not followed suit.
The internet was never intended to be the marketing tool it is today; it was born of a practical requirement by technically-minded folk, and design was nowhere to be found. Designers then came along and gave it style, with marketers joining in once they realised what an incredible tool they had at their disposal. However, in web design, creative teams are multidisciplinary, a lot more flexible and have developers as part of their core.
From brand managers to developers, designers to copywriters, user experience folk to technical project managers to online marketers and more, all play pivotal roles in the life of your content.
In large organisations, content strategy is becoming an integral part of marketing and business strategy, with clear delineation of roles and responsibilities, but in smaller ones content strategy, like many other aspects of web design, is shared among those involved in its delivery. However you slice it, thinking strategically is essential.

'When a website is broken down into discrete elements, it's all just content'

What will make your content great?

Great content has certain qualities and characteristics that have been identified as essential in order to meet your website's objectives and users' needs. Although there is no disputing them, as technology evolves these will change: some characteristics will exit, new ones will arrive and their order of importance will move around. However, for now and for the foreseeable future, they should be your mantra. But if there is one characteristic that must be embedded in your brain, it's findable (see In a word, above). If it isn't findable, it can't be anything else.
Content must be: findable; meaningful; useful; on-brand; memorable; usable; discoverable; on-message; timely; accessible; accurate; relevant.
If you're thinking making this happen is the responsibility of a copywriter, then think again. Copywriting is a major deliverable in the discipline of content strategy, but it is just that, a deliverable, and should not be mistaken for strategy. Still, without copywriters, users will have little to consume; even video content starts with copy. In whichever medium, copywriting is at the centre of getting a brand's message across, but the best copy in the world can't overcome poor technology, an inconsistent UI, garish iconography or an unattractive visual design.

Digital teams working in harmony


Content strategy is all about seeing your website as
a whole, with everyone involved in its life working
in collaboration and for the same purpose. Content
strategy requires all of these to be successful:
l Engaging, informative and on-message copy
l Beautiful, welcoming and on-brand design
l Intuitive, straightforward, memorable interfaces
l Effortless, meaningful and accommodating UX
l Solid, consistent, yet malleable architecture

Development and creation


The creation, distribution and management of
content involves all of the digital team at various
stages in its life. For those fortunate enough to work
for a forward-thinking organisation, content strategy
and governance will be an integral part of business
operations and will involve managers from other
departments, not just those involved in its creation.
Content strategy has been viewed as a subset of user experience, but that's just overcomplicating matters. They are both actually derivatives of traditional marketing communications disciplines and should reside in digital marketing, and be rooted in insight and understanding.

Digital content creators are tasked with meeting users' increasingly voracious appetites, and decreasing patience, for finding, getting and digesting the content they want, when and how they want it.
Content strategy is as much about thinking and planning as it is about delivering; in the overcrowded and unstructured digital world, it is usually via search that users first engage with your brand and make that all-important decision as to whether or not they will interact with your content.
There is no dress rehearsal; your content has about a nanosecond to grab users' attention. That's why SEO folk have embraced content strategy with open arms.
Search engines continually tinker with how content is displayed, and they even display the same page content differently depending on search criteria. Gone are the days when matching your <h1> and <title> copy got you to the top.
Google uses clever algorithms to determine whether or not content should be displayed and how best to prioritise it; although the clickable link is always the <title>, what is presented to the user can be the full title element or only partial copy, depending on things such as keyword density and relevance, as well as the relationships between textual content and HTML.
And if your content gets thus far, and
what lies within and beneath the <title> copy
is compelling enough to get users to click,
then you have another nanosecond to sustain
their attention.
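To make the <title> point concrete, here is a hypothetical product page (the shop, wording and URL are invented for illustration; exactly how much of the title a search engine displays will vary):

```html
<!-- Hypothetical page: title and h1 share key terms rather than matching
     word for word, and the meta description offers snippet copy that the
     engine may or may not use -->
<head>
  <title>Rubber ducks | Classic yellow bath ducks from The Duck Shop</title>
  <meta name="description" content="Hand-finished rubber ducks in classic yellow, delivered worldwide.">
</head>
<body>
  <h1>Classic yellow rubber ducks</h1>
</body>
```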

The more you understand your audience, the better you will be able to serve them. Being a part of content strategy gives you the space and time to learn, think and plan, so that your content can do its job. Content strategy must be at the heart of your design and development process.

The content life cycle


The stages in the content life cycle may not always
be discrete and your content may not always stick to
a strict calendar, but your content will always have a
beginning, a middle and an end; this process is the
content life cycle.
Sometimes stages merge, sometimes they expand,
and sometimes you may not have all the resources
you want or need and you will have to make some
difficult decisions. But the content life cycle just
keeps on moving and goes like this: Learn; Analyse;
Report; Think; Plan; Create; Distribute; Manage;
Monitor; Measure; Evaluate.
The content life cycle provides a great framework
but participating in content strategy is not just about
joining in and throwing your content into the mix. All
of the different content elements are interdependent
and interconnected, with great structure and
information architecture maturing nicely. l
Turn to page 82 for part two: bridging the gap from
data to meaningful information


Content strategy: part 2

Making data meaningful

In the second of a four-part series, Sandi Wassmer looks at bridging the gap from data to meaningful information
Content's purpose is to convey information. But its form is data, which is why information architecture is so important. Working with content strategists and information architects, and getting to know your data intimately, will ensure content is impeccably organised, appropriately categorised, well structured, safely stored and carefully managed.
As content strategy is still finding its feet, you may find digital teams in different places within organisations. You may even find that classification, metadata and taxonomies across your organisation's digital content are inconsistent; the design process is the perfect time to put that right.

Content management
Content management and content management
systems are not the same thing, and this little
misunderstanding can be the source of great
frustration for those involved in content strategy.
Content management is a process, something you
plan and do; a content management system is a
piece of software and is something you utilise.
Content management systems, from a five-page WordPress blog to an enterprise-level information management system, are what most digital teams use to create, publish, manage, distribute and store all manner of web content, and for those not versed in the merits of content strategy, it's easy to simply rely on what the system provides. Although you may get lucky and find a CMS that is perfectly in line with your content strategy, it's improbable. However, as content management systems are not created by designers or content strategists, it is important to understand the inherent systems and structures that exist within your CMS: these may impose restrictions on your strategy and approach to design, particularly if the inherent systems are not flexible, or you don't have the resources to modify them.
You also need to consider that once your wonderfully designed website is finished, it will be handed over to marketing to manage the day-to-day processes. If you're not part of the content strategy, your relationship with the content will end when the design ends: you'll have no control over how it's used and, when the next round of design and development work is required, your beloved content will most certainly be returned to you in a very different state to the one you left it in.

Befriend your brand


The content you create must be able to strike a magic balance: it has to achieve business objectives and fulfil users' needs simultaneously. Many websites put their brand's objectives first and then wonder why users are unhappy. Achieving and maintaining this fine balance is not easy at the best of times. But if you are not equipped with the essentials of brand and user, it's just a stab in the dark. The insights and understanding gained when you stop and learn will facilitate this coveted balance, and what you learn is the foundation on which great content is built.
Get hold of as much information as possible about your brand: working to brand guidelines is a good place to start, but you'll need more. The breadth and depth of what web designers and developers learn at this early stage depends on the role digital plays in an organisation's marketing mix, how large it is, how mature the brand is and how much importance is placed on market research.
The brand embodies business and marketing objectives, and delivers these through your brand promise.

Marketing medicine The NHS knows the importance of brand guidelines, and has developed a website (www.nhsidentity.nhs.uk)


Classification, taxonomies and metadata

Organisational skills may not be sexy, and these terms can be daunting and confusing, but they are the backbone of your web content:

Classification
The process of naming and arranging content so your users can find it quickly. It involves analysing content, associating metadata and determining where in a taxonomy each item belongs.

Taxonomy
The hierarchical semantic structure of your content and its relationships. The system for managing content and metadata within your CMS, and the language together with the structure of your HTML, are both taxonomies.

Metadata
Metadata is data about other data, and there are three key types:
l Administrative metadata is used to improve your CMS or other method of managing content for administrative purposes.
l Structural metadata is the data inherent in your CMS database. It's unseen by users, but is key to keeping your content well managed.
l Descriptive metadata (such as tags used to categorise and filter blog or news content) is what frontend designers and devs need to consider. Managed well, it can greatly improve the findability of content; used badly, it is a hindrance and not fit for purpose.
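Of the three types, descriptive metadata is the kind designers and devs handle directly in markup. A hedged sketch (the values are invented; the article:section and article:tag properties come from the Open Graph protocol, and rel="tag" marks a link's destination as a tag page):

```html
<!-- Descriptive metadata on a hypothetical blog post -->
<head>
  <title>Ten-minute weeknight recipes</title>
  <meta name="description" content="Quick, reliable recipes for busy weeknights.">
  <meta property="article:section" content="Food">
  <meta property="article:tag" content="recipes">
</head>
<body>
  <!-- the same tag surfaced to users, for filtering and findability -->
  <a rel="tag" href="/tags/recipes">recipes</a>
</body>
```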
Online research Among others, SurveyMonkey does surveys and CrazyEgg offers heatmaps and eye-tracking

It's a simple equation: if you buy A product, B will happen and you will feel C. As web designers and developers, you must do all you can to ensure your users feel C.
The brand is something that should be at the heart of every organisation's culture, and you should ask for whatever brand documentation is available, including the brand mission, vision and values, its aims, goals and objectives, brand guidelines, and marketing communications.
Most organisations determine their marketing strategy annually, and everything else will stem from here. Ask for information about marketing strategy, objectives, campaigns and key messaging.

Understanding users is key


If your organisation has conducted market research,
then the subsequent insights into the needs of the
websites target audience and segmentation of
users based on demographics, needs, behaviours
or other criteria can all prove invaluable. If your
organisation doesnt do market research, ask why;
after all, by now youll understand the importance of
understanding users needs in creating great online
experiences and interactions.
Every web designer Ive included in the content
strategy process has benefited hugely, particularly at
the learning stage. At first they may have rolled their
eyes and tolerated what they considered marketing
fluff, but once theyve started to understand its
relevance to them and how the upfront investment
will pay dividends for the future, their attitude has
changed. You too will find that your approach to web
design evolves, and as it does you will wonder how

you ever designed websites without deference to content, and to your intertwined relationship with it.
In addition to the brand information your marketing department provides, understanding users is about two things: statistical data from quantitative research and analytics, and behavioural data from qualitative research.

User intelligence
Amassing user intelligence is an evolving area,
but the variously named methodologies are all
derivatives of either quantitative or qualitative
research. Traditional market research processes still
stand us in good stead, and technology provides
enormously improved ways of collecting, collating
and analysing the collected data.

Market research's insights into a site's target audience can be invaluable

Quantitative evaluation is objective and is about statistics; analytics measure performance against set targets. In contrast, qualitative evaluation looks at the quality of interactions and is more subjective. Analytics data and reports from your existing website should be used, where available, because these provide the context for deeper understanding, and can also be used to compare against analytics data post-launch.

Quantitative research
Statistical quantitative research starts with the collection of data, and there are a number of tools available for this, such as questionnaires, online surveys, qualitative research (the whys and wherefores), interviews, focus groups, random sampling, projective groups, heatmaps, eye-tracking and product testing.

Ethnography
Ethnographic studies attempt to understand behaviour and culture and, while ostensibly qualitative, are not generally adopted by marketers. However, if you have the resources, they're a worthwhile tool. It is one thing to have a snapshot of your users at set points, but a deep level of understanding gained over a longer period is clearly preferable, and can provide both sectional and longitudinal observations that other methodologies will not uncover.

Admin CMSes offer many ways to manage online content

Users and their many devices


With convergence around the corner, cloud
computing hovering above and everyone always on
the go, people are constantly accessing web content.
The requirement to deliver content in many different
configurations can be overwhelming, but as web
design matures there are some conventions and
consistencies emerging.
The knowledge you gain from looking at how best to deploy and deliver content to the different distribution channels will ensure your content is in an optimal form from the start, but you must also think about the users, their similarities, and their differences. Unlike the disciplines of accessibility or UX, the role of content strategy is focused on the brand's relationship with its target audience, and so requires a different mindset. Remember, content strategy is the child of marketing: your design goals would be very different if your purpose wasn't marketing-led or user-centred.
At the same time, you need to be thinking about the devices your target audience will use to access and interact with your content. Desktop PCs and laptops are where the internet matured, but nothing in technology stands still, and while platforms such as game consoles and TVs are currently underutilised, this will not be for long. It's clear that mobile devices will be key in the future, but smartphones and tablets are not yet fully featured computers, and this poses one of the biggest challenges.
As if this were not enough, you will also need to ensure that your content is planned and built so that it renders in the way you intend on different browsers, operating systems, native mobile interfaces, mobile applications, media players and access technologies. l
Turn to page 85 for the third instalment of our content strategy series, which moves beyond theory and discusses putting your knowledge into action




Put knowledge into action

In the third of a four-part series, Sandi Wassmer argues that it pays to plan before embarking on any project

On commencing any project, you should give full consideration to the different places your content needs to get to, and remember that it frequently needs to do so simultaneously. So if you don't think strategically and for the different outcomes, you'll find things coming unstuck pretty quickly.
You will need to use whatever resources you have to make this process more efficient. If you are creating a news story that needs to populate different channels, in different formats, at different times and in different languages, the way that you create, store, distribute, manage, maintain and eventually archive the different versions of a single source of content (and all of its associated files and metadata) is key.
Even if your brief says that you are only designing a website, you must nonetheless consider the different distribution channels, which can be categorised as follows:
l Websites, web applications and mobile apps
l Social media, blogs, news feeds and aggregators
l Photo, video and media sharing
l IPTV
l Gaming
l Communications

But it's not just about redeployment: a lot of brands are using social media as the hub of their online advertising activity. Facebook campaigns that started out as extensions of website content are now becoming almost standalone, and with online video advertising increasing rapidly, and its polar opposite Twitter doing the same, delivering social media content is constantly being redefined. So even if it's not in the brief, think about how your web content can be deployed on other channels in the future.
The proliferation of mobile apps provides a whole new set of rules. With limited real estate, and a much more task-driven focus, new approaches to design have emerged, and the role that content strategy plays will become more important. Just think of the possibilities of bringing the disciplines of responsive design and content strategy together.

Made-to-measure The iPlayer on the PS3 and Xbox delivers the same content to distinctly different environments

Defining design goals

When defining your goals, the understanding you have gained about your target audiences will be invaluable. Considering how best to align messaging with visual output will become second nature.
Delving into people's behaviour and preferences enables you to optimise design and content for better UX. Since users are now expected to find, interact with and respond to your content in a predefined way, your design goals must ensure that these expectations are consistently met.
This predefined behaviour is instigated by a call to action, which elicits user responses. This must deliver your website's objectives, placing increasing demands on web designers. Users respond to the call to action, and user interaction is more important than ever. But the content users generate when they interact with your website (whether via a one-line comment on a blog post or an upload of a 20-minute video of their dad singing his favourite songs), and the quality of the metadata that they generate, is frequently overlooked. You, however, won't do that, because you know how interdependent content strategy and web design really are.
If you want to include the facility for people to comment on blogs in your design, you need to think about users and their behaviour first. Should the facility be for registered users
Plan content before you build

Assess and decide
You need to be brutally honest about what will work and what won't. Set out clear criteria before you assess, and stick to them.

Mind the gap
Once you know what you've got, you need to figure out what you haven't got. What content is missing? What do you need to fulfil your goals and objectives? What do your users really want? See this as an opportunity to innovate.


On message The importance of ensuring that brand messaging and visual design are aligned is evident

Metadata matters Flickr and Stackoverflow both use tags, but the end results are distinctly different

only? Should you use preset categories for tagging? Or should you allow users to define their own tags? If you open it up to users, will they run amok, or will you find great insights that result in modifications to your system of classification?
Content strategists should not be left to make such decisions without designers or developers, and you should follow suit.
Similar considerations need to be made for other user-generated content (and particularly when providing the facility for users to upload their own content), because this content will not only come with its very own family of metadata, it will also include whatever metadata your system applies or enables during the upload process. If you don't get down to the detail of metadata when you define, scope and create your functionality, you will erode the UX and forego the opportunity for future insights. PDF downloads entitled 'Document 1' and photos tagged with the file number created by the user's camera are of no use to anyone.

'Before you take stock of your content, take stock of your time'

Take stock: content and competitors

It is crucial at this stage that before you take stock of your content, you take stock of your time. You must be realistic about how much time and energy you put into this stage, and what the benefits will be. If you do not have very sophisticated tools and you find getting information out to be very labour intensive, but a new CMS will make it a doddle, then this will have to be a consideration. Whatever tools you employ, you must assess two things simultaneously: the quantity and quality of your content, and the quantity and quality of your competitors' content.
Once all of this information has been assimilated and analysed, it is collated into a digestible report for the digital team to consider. Although it is true that too many cooks spoil the broth, getting as many different opinions as possible at this stage can be very useful. Rewards can be reaped by involving those in the digital team who aren't directly involved in content strategy, as well as those strategic decision-makers from other areas of your organisation who aren't involved in content delivery.
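Those 'Document 1' PDFs and camera-numbered photos can be intercepted at the point of upload. A minimal sketch in Python (the heuristics, function name and threshold are my own invention, not taken from any particular CMS):

```python
import re

# Patterns that suggest an auto-generated, meaningless name:
# camera counters (IMG_1234, DSC0001) or editor defaults (Document 1)
AUTO_NAMES = re.compile(r"^(?:img|dsc|dcim)[_-]?\d+$|^document ?\d*$", re.IGNORECASE)

def needs_better_title(title):
    """Return True if an uploaded file's title should be sent back to the
    user with a prompt for something descriptive."""
    cleaned = title.rsplit(".", 1)[0].strip()  # drop any file extension
    return len(cleaned) < 3 or bool(AUTO_NAMES.match(cleaned))

print(needs_better_title("IMG_4032.jpg"))          # True
print(needs_better_title("Duck pond at dusk.jpg")) # False
```

Rejected uploads can then prompt the user for a real title and tags, so the metadata entering your CMS stays fit for purpose.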


From learning to decision making to editorial strategy

The next stage is referred to as editorial strategy. If you are thinking, 'Hang on a minute. Aren't content strategy and editorial strategy the same thing?', the short answer is no.
Editorial strategy is not about editing, in the same way that content strategy isn't just about copy. Editorial strategy sets out how your content strategy will be implemented. This is where it all comes together, where insights and understanding, statistics and analysis are given due consideration and decisions are made. Your editorial strategy documents how your digital team will deliver, manage and maintain the best content possible.

Checking out the competition
Go for variety and select the market leader, an up-and-comer and one that's nearing insolvency. This will yield more useful results.

How do you stack up?
Before you start to plan, you will need to review how your research and analysis will inform your plan. You must now:
l Compare your content analysis with your design goals
l Compare your content analysis with your content strategy
l Compare your content analysis with your brand's objectives
l Compare your content analysis with your users' needs
l Compare it all with competitors' content
Now ask yourself whether there are any risks to achieving your design goals. Rethink? Full steam ahead? Proceed with caution?

Designing with purpose


How this will happen is dependent on any number
of factors, but the intersection of design, technology
and marketing is the key to successful content.

Scoping
Just as you scope a web design project, you will have to determine the resources available to deliver against your design goals and website's objectives, whether they're people, content assets or budget, and there will be hard decisions to make. Resources are always finite, so attention to detail at this stage is a must. Your scope will need to include details of:
l The different distinct types of content that need to be created
l How much of each of the different types of content is required
l How often the different types of content will need to be updated
l Which distribution channels will utilise the content
l What the associated costs are
What you determine here will provide the framework for the work to come. You may be surprised at the results, because it is usually this stage that serves as a bit of a reality check. It is incredible how quickly requirements that were absolutes fall by the wayside when it becomes evident just how much work goes into getting content right. It is not that content doesn't matter; it's just that the discipline of content strategy on the web is still in its infancy and, as with anything new, it will take time before people appreciate just how valuable content is.

The editorial calendar


The concept of the editorial calendar is not new
to those who have worked in publishing, and the
principles on the web are much the same. It is a
way of looking at content output and determining
what content needs to be delivered and when. But
its not just a matter of plugging holes. The editorial
calendar is where consideration is given to content
as a whole, and the ebb and flow of how users
interact and respond to it. SEO folk talk a lot about
fresh content and consistency, and Google is certainly
watching, so the thinking behind the timing and
volume of output can be make-or-break.
A companys address rarely changes, but it may
publish news daily, blog weekly and tweet on the
hour. The editorial calendar is the hub of all content
creation activity once the design process is over.
Fulfilling the need to produce content will influence
your design decisions.
Now that you understand the context, you can get
on with the planning and implementation. l
Turn to page 88 for the final part of our content
strategy, examining bringing your content to life

Content strategy

Features

The SEO Handbook 87

Content strategy: part 4

Bringing your content to life

In the last of a four-part series of articles, Sandi Wassmer finds that interdisciplinary collaboration makes the most of your content
In the previous articles, I looked at why
you need a strategy for managing the
content that goes to make up a website,
and how to begin developing one. In the final part, I'd like to look at the people you will need to involve in that process. However digital teams are configured (internal, external or a combination of both), the core disciplines of design, development
and marketing remain. All three play key roles
in each stage of the process, but the balance will
differ from team to team and project to project.
When deciding who will be on your team, look
beyond job titles: roles and responsibilities vary
widely. The titles listed below seem to be commonly
understood but there are no absolute conventions
quite yet. As the industry matures, job roles will
morph and mutate, so this list is likely to evolve.

Marketing, communications and PR


Digital marketing now includes a range of discrete
disciplines. Here are some of the most common:
l Digital marketing
l User experience
l Content strategy
l Online PR
l Social media marketing
l Copywriting
l Search marketing
l Market research

Designers of many disciplines


It has been generally accepted that web designers must have a solid foundation in graphic design in order to create on-brand content. But as websites have evolved from veritable online brochures towards fully fledged interactive products, web design has developed greater affinity with industrial design or architecture. Not only do you need to consider what a website looks like but also how site visitors, using a near-endless range of platforms and devices, will consume its content.
As such, the field of web design has been split up into a larger set of individual disciplines:
l Visual design
l User interface design
l Information architecture design
l Interaction design

Devs of all shapes and sizes

The days when web developers were banished to the backroom to code are long gone. While it may not be their natural habitat, developers are front and centre when it comes to content. If developers are not involved in planning content strategy, our beloved content may never make its way to a particular browser or mobile app. Although to many developers content may just be data, everything that happens to, within and around that data is ultimately their responsibility.
Key disciplines within the wider field of web development include:
l CMS development
l Database development
l Client-side scripting
l Server-side programming
l Web applications development
l Mobile applications development
But let's not get too hung up on job titles. What matters is that you are reading this article, which means that there is already a move towards integrated, interdisciplinary collaboration. The transition may not always be smooth: marketers and designers regard one another with a degree of mutual suspicion, and bringing developers to the party may complicate things further. But, hopefully, an increasingly multidisciplinary approach will lead to an environment in which all members of digital teams approach content together rather than as separate entities. It is both the meat and the bones, after all.
And remember: creating great content requires lots of communication. If interdisciplinary teams understand each other, and you don't run for the hills when marketing folk start talking about economies of scale and cost efficiencies, you will be able to create wonderful content together.

Stages of interaction Users are content editors too, so make it easy for them. The Flickr Uploadr guides users through the process of uploading images to the site

Multiple devices Content strategy must evolve in line with access patterns. IMDb (www.imdb.com) is a good example of how a complex set of data can be organised to ensure that users can access it as intended using a range of devices

Golden rule It's not just how content looks on different devices that counts but also how users interact with it

Information architecture: an aside

The structure and management of content, data, information, or however you choose to refer to it, is the key to your digital offering's sustainability. Handle it with care.
I will leave the debate about exactly who is responsible for structuring the data to one side here. Instead, let's just note that the term information architecture covers a wide range of activities, and that there is considerable crossover, not only between individual disciplines, but with design and development and even the role of content strategists themselves.
A preliminary roll call of information architecture disciplines could include the following:
l User interface design
l Content classification
l Content management
l Asset management
l Metadata management
l Analytics management

Plan and schedule activity

Before we conclude, I want to look at some of the logistical issues that relate to implementing an effective content strategy. The workflow and scheduling processes are the same as those of design. You will need to use whatever resources you have, and source whatever you need.

Monitor. Measure. Evaluate. Improve.

Once your content has been distributed, it's time to start monitoring and measuring. All the systems and structures that you set up during the content strategy process will now be put to the test. These will provide the means to capture all sorts of usage data, and enable you to evaluate how successful your strategy has been.
You will use the same tools to measure your results here as during your research phase. The criteria that you determine during scoping should be tracked and measured, and the results should be shared with your team. Whether you have been successful or not, it's essential that you analyse and assess your results.
Your team will need to evaluate both statistical and behavioural data, if available. You will need to know not only who is using your site, but what they are doing, why, where, when and how. You should analyse metadata and look at metrics such as keywords vs bounce rate.
You should monitor how people navigate, what content they land on, how long they linger and where and when they exit. You should observe your users in order to find out if they are mesmerised by those shiny buttons, or why your fabulous new way of managing blog comments has led to attrition.
But you don't just need to know whether or not you met goals or objectives; you need to know why. You should look at things such as comments, or the relevance of retweets, to find insights into why you did or didn't meet certain criteria. Sometimes you just don't know why, and you will need to test sample content before it can be put right. You may determine that copy changes will do the trick, or that an amendment to some of the metadata around form functionality may be the answer. Or you may find that a simple colour change draws users to the content like bees to honey.

Image: www.strangelove.me

In depth Find out all you can about your users: not only who they are, but why they use your content

Best fit A good example of content aligned to structure, games website Thunderbolt (www.thunderboltgames.com) uses responsive design techniques to provide elegant experiences across different platforms
You will need to determine dependencies and
interdependencies, and make sure you have
made contingencies and can accommodate change.

Goodbye to old content (for now)


Just because you've been brutally honest when
assessing your content, this doesn't mean it should
be put out to pasture. Make sure that you archive
old content with the same care and attention you
pay to new content. Make it easy to bring it out
of retirement; it might not be in line with current
strategy but it could be in the future.

Make architecture work for you


If you were to bring content and user interaction to life and put them in a house together, it would be like living with an interior designer: someone constantly buying new ornaments and moving the furniture around. The design of the building itself wouldn't be affected, and nor would its structural integrity, but the interior would constantly change.
You should apply the same concepts to the web.
Users should be free to interact with content without
disturbing design or structure.
Fortunately, a clean semantic structure is now a
given for modern web projects. Integrating metadata
and taxonomies helps make it possible for everyone
involved in creating content to maintain it, enabling
users to find what they are looking for and complete
tasks quickly and efficiently.
It is vital that each and every user interaction
is valuable for both parties in the relationship.
If, for example, you decide to open up tagging
on user-generated content, make sure that this is
effortless for the user and that your system can
process new metadata within its taxonomy in a
meaningful way.


Creativity from constraint

Once all the planning has been done, it's time to get creative. Although it may seem that the requirements of content strategy are constraints, they are actually opportunities. Gone are the days of pixel perfection. Instead, making content fitting, intuitive and responsive is the order of the day.
The rapid maturation of technologies such as JavaScript makes it possible to access content in new ways, while the features being implemented in HTML5 will do a lot of the thinking and decision making for you. Some pretty awesome techniques such as CSS3 media queries are coming into their own, so the technology is there for you to explore.

Content is here to stay

When the web hit the ground running back in 1999, the phrase 'content is king' was used far too often by folk who didn't really understand its meaning. Despite that, the internet boomers were correct: content really is what matters. But this doesn't mean that web design or development should play second fiddle. Integrating content strategy into the design and development process presents the perfect opportunity for everyone to get involved.
With mobile platforms starting to take centre stage, making users more time-constrained and task-driven than ever before, there is a smorgasbord of opportunities for creating the right content to meet your organisation's multifaceted objectives, and to serve users' complex and ever-changing needs. But don't bite off more than you can chew: the complexity of each project you create will only increase in future.
And, of course, the life cycle of content is circular: you will always be part of its continued development, and you will always be learning. If you take only one thing from this guide, I hope it's that you can always improve the way in which you work. Please take the time to reflect, and share what you have learnt with your team.

Now it's over to you

So now it is over to you to take all of the information that has been presented in this series of articles and do something inspiring with it.
Effective creation, collation, distribution and maintenance of content requires the combined might of the full digital team and then some. But the results are worth it.
Embracing content strategy will give your work more depth and relevance, and if you do it right, new meaning will emerge from the process.
So design a findable, beautiful, usable and accessible website for your target users and don't forget to have fun. l

Launch a website

We know web design can be intimidating, packed with jargon and technical detail. But this 180-page guide, revised and updated for 2014, sheds light on the art and science of developing a website and will help you go through the process from start to finish.
On sale now at WHSmith, Barnes & Noble, www.myfavouritemagazines.co.uk and for iPad at http://netm.ag/itunesuk-248

Interview: Karen McGrane

"I can't even ride an elevator without trying to work out the logic"
Karen McGrane

http://karenmcgrane.com
Photography/Daniel Byrne

User experience expert and content strategist Karen McGrane is a self-confessed systems person. She chats to Martin Cooper about Google Glass, lightning strikes and the rarity of using the web industry's front door

"Can you imagine if the ship's computer in Star Trek was like open tag, b, close tag? That's not the future! I want to make the Star Trek computer work really well. That's my goal in life!" And that pretty much sets the tone for our chat with Karen McGrane.
A renowned speaker, author and practitioner in the field of content strategy, Karen is a high-voltage geek. What's more, she has an utterly infectious love of technology and the web.
In the space of just over 30 minutes she laughs, waves and gallops through a truly amazing menu of technologies. One minute she's talking Star Trek, the next Bletchley Park. Moments later she's charged through speech recognition, Solaris workstations, punched cards, VAX, Apple, Gutenberg's press and AOL. It's like being led at warp speed through the world's biggest tech museum by the world's most excitable guide.
"Google Glass? No I haven't tried it," she grins. "I think it's going to be the Segway of mobile."
Such quickfire quips come at you with complete spontaneity. It's not for comedic effect, though. Every point she throws out there ties elegantly into her lifelong passions for information architecture and content strategy.


As such, she completes her take on Google Glass thus: "Looking at some of the UI guidelines and the publishing specs they released a couple of weeks ago, they're going to give you these HTML templates and you're going to funnel your content into them. Great! Is your content going to be structured to render appropriately on those screens? I'm not suggesting we all go out and restructure our content for Google Glass. What I am saying is: if you have that underlying content already, great! You're better set up for the future."

Bolt from the blue

So does McGrane believe that there's a secret to content strategy?
"The idea that content is being published out of a database and it can go anywhere [adaptive content] is a massive transformation," she says. "You know, so much of what we do is trying to make the web behave like print. Content strategy is really helping the world understand what makes the web different from print and how we fully take advantage of this new medium. It's exciting! It's Gutenberg-level stuff."
With that cleared up, we move to more biographical pastures: McGrane works as an independent consultant and has been running her own company since 2006.
"I focus on assisting companies that need help with mobile, of which there are many," she explains. "I work from home, it's delightful. I'd say I'm in New York two weeks out of every month. Beyond that I do quite a bit of client travel and conference speaking."
A decade before starting up her business, McGrane can recall first hearing the words 'information architecture', which in effect, she says, kickstarted her career.
"It was in 1996. It was like I was hit by a lightning bolt," she reminisces. "What I loved about it was the idea that it [information architecture] combined the linguistic with the categorical, the structural nature of findability with the physical, and the tactile with the look and feel side of design. It was right there in the name: information architecture."
So after this epiphany, did McGrane follow the self-taught pathway, or go seeking more traditional training? "It's kind of a truism in this industry: everybody started out doing something else and came in through a side entrance," she says. "I'm one of the rare birds who came in right through the front door and I have never done anything else. I have a graduate degree in technical communication and human computer interaction from Rensselaer Polytechnic Institute. It's an engineering school in upstate New York. Essentially, I have a graduate degree in being a UX designer, with a focus on information architecture and content strategy."


With information architecture arriving in her life like a force of nature, we wondered if technology itself made a similarly dramatic entrance. "You know, I don't think there was one big moment. I think there was one long stretch of moments throughout my life," she says. "My school district, I grew up outside Minneapolis, Minnesota, had a focus on technology. We had computers early on. I remember working on the old Apple II and Apple IIe machines when I was quite young. I think they had a mainframe in the closet and stacks of punched cards. I was exposed to the traditional green screens and command line way of computing. I feel like I've used computers my whole life. They were always there."
And the first time she encountered the internet? "Wow! I remember poking around on some of the early systems: CompuServe, Prodigy and AOL. We all had an AOL account."
When she started graduate school in 1995, she remembers the new beta of Netscape came out for the Solaris workstation. "The computer lab would light up. We'd all run and download it! Looking back it was exciting. You were watching the web explode. I only regret not buying a million domain names!"

Content vs structure

After leaving the cloistered world of academia, McGrane landed her first job as an information architect with Razorfish. "I was the first IA. I was the first person with any kind of UX background, hired when Razorfish was like 30 people. And I have never done anything else!
"I think I'm so lucky in that this job [matches] how my brain works," she adds. "I feel really lucky to [have found] a career that allows me to do what I do best. I was very lucky the internet came along right when it did. It seemed like fate."
All this sets us to wondering: when it comes to books, films, anything with a narrative framework, which does she find most alluring: content or structure? "Easily the structure and the management of it, the engineering system behind it," is her response. "I love a good story. I love a solid narrative. I love a good turn of phrase and I enjoy the act of writing. What's really interesting is trying to reverse engineer something. It's pattern recognition really. You can let [structure] just wash over you or you can be the kind of person who is constantly trying to figure out how the system works. So yeah, I'm definitely a systems person. If I go to Disneyland, am I letting it wash over me? No. I'm like: why is this so great? What are they doing to make this a great experience? I'd say that's a common trait in UX professionals. You can't just enjoy something. You have to try and work out why you're enjoying it!"
Darting back to her compulsive reverse engineering of life, the universe and everything, she quips: "I can't even ride an elevator without trying to work out the logic. How do they work out which floor to rest on?"
With her love of content proved and her geek credentials underlined, we're keen to explore a contrasting, steely side of McGrane's character. Head over to her website www.karenmcgrane.com and hunt down the posting Uncle Sam Wants You (January 2013). There you'll find a link to her An Event Apart DC 2012 talk and also a full written transcription. The standout quote is: "Do you really believe that America is a meritocracy if that information isn't available on mobile? Because it's not."
"I'm deeply idealistic," McGrane asserts. "I'm very lucky to have a platform to speak from and my job is to make the web, to make the world, a better place. One of the things that's really electrified me about mobile is that there are mobile-only users. It's so easy for people like us to fall into the trap of thinking that every user is like us, that every user has multiple computers and a broadband connection. [It's easy to believe] that the internet is just part of the fabric of everyday life. I've had an internet connection at home since 1994.
"Traditionally," she continues, "it's populations who are disadvantaged who are less likely to have an internet connection. Think about all of the things that are not available to you if you can't get online. Something like 80 per cent of Fortune 500 companies only advertise their jobs online. You can't find a job posting if you're not on the internet. Think about all the resources that are available for education, for housing, for government services and affairs."

The quips are gone now and her sentences become short and staccato. Does everybody in America have access to the same information? To the same services? To the same opportunities? You don't if you lack access to the internet.
Pausing to draw breath, McGrane drives home her point: "I believe we have a responsibility to get this information out there where everybody can access it. For education, housing, jobs, banking, healthcare. Mobile isn't a nice-to-have. Mobile is essential if you want to reach the people who need that information the most."
Then she nods and stops talking, using silence as amplification.

You can let structure wash over you, or be a person who constantly tries to figure out how the system works

So who is actually getting content strategy right? "The United States released a Digital Government Strategy that is very forward looking. The recommendations are the sorts of thing I'd recommend to a client if they were a large enterprise. GOV.UK has been doing amazing things in the UK space, too."
That begs the question, of course, of what smaller companies, unable to afford to hire either a full-time content strategist or a part-time consultant, should do. "On any given project it comes down to your team's makeup, on their backgrounds, interests and aptitudes. I know a lot of developers who are doing content modelling and they're figuring out how the content should be structured. They are the people who have to cope with the eleventh-hour shitstorm, when somebody needs to figure out the content at the end of a project."
Changing gear and direction, McGrane expands her idea: "Somebody has got to do content strategy but I don't think it needs to be somebody with the title content strategist. It's rather a controversial view in UX circles, but I think we have a responsibility to teach people the basic principles of user experience design, user research and information architecture. We have a responsibility to get those ideas out there to the broader design and development community. We shouldn't just mint people who have our special UX or content strategy title. I want to be sure that whoever looks after content has the tools to do so."
As we wrap up, it's impossible to resist invoking McGrane's geekiness again. Just for one last time. Which technology does she think will change the world when touch has become passé?
"I'm very interested in when speech-based interfaces are a reality. Everybody pokes fun at Siri. But, you know, people [laughed] at touchscreens for the last 20 years and then, one day, touchscreens turned up and they worked perfectly and transformed how we interact with technology in ways we never anticipated. The same thing is going to happen with speech."
Barely pausing for breath, she continues: "Speech is going to change everything. It's an example of why adaptive content is so important. If you think about it: what happens now if your content needs to be read when somebody's driving a car? Think about all the presentation information built into our content to describe how it should be styled. It needs to be somehow stripped out for audio interfaces. I like to give the example of italic versus emphasis as semantic tags. People say, what's the difference? Every major browser is going to render emphasis as italics. So, who cares? What happens when it's an audio interface? Suddenly emphasis does mean something very different from italics. It's a simple example if you can wrap your head around that."
She closes with a simple summary of her mission: "Our content really does have to go everywhere. If you start trying to solve that problem now, you'll be better off in the future." l

Karen McGrane
Job UX and content strategist
Online http://karenmcgrane.com
Twitter @karenmcgrane
Recent projects Hearst
Corporation, American Express
Publishing, Celebrity Cruises


Tutorials

SEO techniques

Expert tutorials

Step-by-step skills to boost your page rankings from leading industry experts

Contents
l Serve faster web pages: use Nginx to serve static content (page 98)
l 10 top SEO tips: Glenn Alan Jacobs shares top tips (page 101)
l Utilise structured data: Schema for ecommerce products (page 102)
l Boost page performance: provide a faster page load time (page 108)
l Make sites load faster: how to make sites render faster (page 112)
l Improve page load times: use content delivery networks (page 116)
l Build a responsive site: a simple responsive website (page 120)
l Retrofit older sites: use responsive techniques (page 126)

Serve faster web pages

Nginx: serve faster web pages

Download the files! All the files you need for this tutorial can be found at http://netm.ag/nginx-241

Nginx is a high performance and open source web server. Jaime Hall explains
how to use it to serve all static content and speed up page loading times
Knowledge needed Intermediate command line, basic server knowledge
Requires Linux (uses Debian for the example)
Project time 30 minutes

Nginx (http://wiki.nginx.org) makes up 13 per cent of the web servers currently being used to serve web pages across the world. The open source, lightweight web server can serve all static content while acting as a proxy to Apache (www.apache.org) to serve the non-static content. Follow this tutorial for a guide to installing and setting up Nginx to make your web pages load faster.
The commands performed within this tutorial are applicable for Linux-based platforms, specifically Debian and Ubuntu, so a few adjustments would need to be made for it to work on other flavours of Linux, such as file locations and package installation commands. Start with an existing Linux platform installed without any other web servers running. However, if you already have Apache running, simply skip the installation part.
Let's begin by updating our list of packages so we have an up-to-date list. To get the most up-to-date version of Nginx, we will be using the Dotdeb repository. These need to be added to our source list. Also, add the GnuPG key:

nano /etc/apt/sources.list
deb http://packages.dotdeb.org squeeze all
deb-src http://packages.dotdeb.org squeeze all
wget http://www.dotdeb.org/dotdeb.gpg
cat dotdeb.gpg | apt-key add -
apt-get update

Next we install Nginx:

apt-get install nginx

If you don't have Apache already installed, let's install it, along with PHP:

apt-get install apache2 apache2.2-common apache2-mpm-prefork apache2-utils libapache2-mod-php5 php5 php5-common

At this point, you can install any other additional modules that you may require for PHP, like the GD library, cURL, ImageMagick, MySQL etc.
If your server is currently restricting port access via IPTables or similar, then you will need to open up port 8080 so that any non-static content can proxy through to Apache on that port, as well as the standard port 80 for Nginx.

iptables -A INPUT -p tcp --dport 8080 -j ACCEPT
iptables -A INPUT -p tcp --dport 80 -j ACCEPT

Next, update a few of the settings in Nginx to improve performance:

nano /etc/nginx/nginx.conf
# Typically 1 per core
worker_processes 1;
# Number of connections each worker can run
worker_connections 2048;
# Specifies how many open file handles allowed per worker
worker_rlimit_nofile 8192;
# Relieves HTTP connection creation
keepalive_timeout 10;
Need to know Nginx has a full list of all available configuration settings

Load time Use Pingdom's toolkit to test the load time of web pages


What does proxy mean?


A proxy server is a process that acts as a middleman between a request and a resource. This can be used in many different styles, which include Reverse Proxy, Open Proxy and Forward Proxy. These allow you to use a proxy to: load balance between multiple resources; mask the final resource; or, in our case, forward a request onto a different process without the user knowing, if certain criteria have or have not been met. Fundamentally, a proxy server acts as a gatekeeper, deciding who shall pass and which path they can take.

Why use Nginx and Proxy?


The second most visited site in the world (according to Netcraft), Taobao (www.taobao.com), runs on a modified version of Nginx. So how is it better than Apache? Nginx works as an event-driven asynchronous request handler, as opposed to Apache, which uses process- or thread-oriented request handling. This means Nginx is able to utilise a single thread to handle multiple requests, whereas Apache has to spawn new processes or threads for each request it receives. This uses less memory, as each process has a certain memory overhead every time one is spawned. Because of this, a much more predictable memory usage can be achieved under large traffic loads.
Because Nginx is only serving static files, the process itself uses very little memory and doesn't need the overhead of additional modules that Apache may require (such as mod_rewrite, mod_php, mod_deflate and so on). Below are some example memory usages for Apache and Nginx as seen on a couple of live web servers using fairly default installations. These will vary between servers but they give a fairly good representation: Nginx, 4.62MB and Apache, 29.12MB.
There's a massive difference in overhead between the two, which will increase with traffic. You can see why spawning multiple Apache processes can make memory usage fluctuate greatly.
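Figures like these are easy to reproduce on your own servers. A rough sketch (assuming a procps-style `ps` that supports `-C`; the process names are whatever your distribution uses):

```shell
# Sum the resident memory (RSS, in MB) of all processes with a given
# command name; prints 0.00 if no matching process is running
mem_mb() {
  ps -C "$1" -o rss= | awk '{ sum += $1 } END { printf "%.2f\n", sum / 1024 }'
}

mem_mb nginx
mem_mb apache2
```

Run it before and after moving static files over to Nginx to see the difference under your own traffic.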

# Specifies the client request body buffer size
client_body_buffer_size 8K;
# Header buffer size for the request header from client
client_header_buffer_size 1k;
# Maximum accepted body size of client request
client_max_body_size 2m;

# Enable compression to reduce transfer times.
gzip on;
gzip_http_version 1.0;
gzip_vary on;
gzip_comp_level 6;
gzip_proxied any;
gzip_types text/plain text/html text/css application/json application/x-javascript text/xml application/xml application/xml+rss text/javascript;
gzip_buffers 16 8k;
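You can approximate the effect of `gzip_comp_level 6` locally with the command line `gzip` tool, which uses the same compression levels as the Nginx module. A quick sketch (the exact ratio depends entirely on the content; text compresses far better than images):

```shell
# Build a repetitive ~9KB text file and compress it at level 6,
# the same level set in nginx.conf above
printf 'The quick brown fox jumps over the lazy dog. %.0s' $(seq 1 200) > /tmp/sample.txt
gzip -c -6 /tmp/sample.txt > /tmp/sample.txt.gz

echo "raw: $(wc -c < /tmp/sample.txt) bytes"
echo "gzipped: $(wc -c < /tmp/sample.txt.gz) bytes"
```

This is also why the gzip_types list above contains only text formats: JPEG and PNG payloads are already compressed, so gzipping them costs CPU for little gain.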

# Maximum number and size of buffers for large headers to read from client request
large_client_header_buffers 2 1k;

Create an example site within Nginx:

nano /etc/nginx/sites-available/example-site
server {
# Listen on the standard web port
listen 80;
# Host names to respond to.
server_name www.example-site.com;
# Extensions to check for to be served by Nginx
location ~* ^.+\.(jpg|jpeg|gif|png|ico|html|css|js|ttf|eot)$ {
# Root directory for Nginx site files
root /var/www/example-site/httpdocs/;
# Disable access logs to improve performance
access_log off;
# Set expiration headers
expires 30d;
}
# All other extensions not in the above
location / {
# Root directory for Apache site files
root /var/www/example-site/httpdocs/;
# Index file to look for
index index.html index.htm index.php;
# Disable access logs to improve performance
access_log off;
# How long Nginx will wait to get the response to a request
proxy_read_timeout 60;
# Timeout for the connection to the upstream server
proxy_connect_timeout 30;
# Url to proxy to
proxy_pass http://www.example-site.com:8080;
# Add the remote address in the Apache header
proxy_set_header X-Real-IP $remote_addr;
# Keep the Host header in Apache
proxy_set_header Host $host;
# Contains client request-header X-Forwarded-For
proxy_set_header X-Forwarded-For $proxy_add_x_forwarded_for;
}
}

Network flow Resources as they're requested from the server via debug toolbar
Symbolically link the available site within the enabled sites:
ln -s /etc/nginx/sites-available/example-site /etc/nginx/sites-enabled/example-site
Update the port that Apache listens to from 80 to 8080, so Nginx can listen
to the default web port 80 and proxy to Apache on the 8080 port:
nano /etc/apache2/ports.conf
NameVirtualHost *:8080
Listen 8080
Create an example site within Apache:
nano /etc/apache2/sites-available/example
<VirtualHost *:8080>
# Email for the server admin
ServerAdmin jaime@strawberrysoup.co.uk
# Url for the website
ServerName www.example-site.com
# Any additional urls
ServerAlias example-site.com
# Root for the site where the files are stored
DocumentRoot /var/www/example/httpdocs/

# Directory permissions
<Directory /var/www/example/httpdocs/>
Options FollowSymLinks
AllowOverride All
Order allow,deny
allow from all
</Directory>
# Location for the error log
ErrorLog /var/log/apache2/error.log
# Possible values include: debug, info, notice, warn, error, crit,
# alert, emerg.
LogLevel warn
# Location for the access log
CustomLog /var/log/apache2/access.log combined
</VirtualHost>

By proxy An example diagram with communication via a third intermediary machine

How much faster is Nginx?


With so many different factors at play, such as the number of
resources on a site or the amount of PHP code to be processed, the
page loading time to the user can vary significantly. Below are some
example test load times.
I used Pingdom Tools' (http://tools.pingdom.com) Full Page Test to
perform four test runs and take the average. These results aren't 100 per
cent accurate, but they give a fairly good comparison.

Test 1
Site: Irregular Choice list page
Total requests: 55
Page size: 960KB
Proxy through Nginx: 3.05s
Apache: 4.52s

Test 2
Site: Oasis Overland homepage
Total requests: 139
Page size: 1.5MB
Proxy through Nginx: 3.57s
Apache: 5.08s
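Taking those averages at face value, the relative improvement works out as follows (Test 1 comes out at 32.5 per cent, Test 2 nearer 30 per cent):

```shell
# Percentage drop in load time: (Apache - Nginx proxy) / Apache * 100
awk 'BEGIN { printf "Test 1: %.1f%%\n", (4.52 - 3.05) / 4.52 * 100 }'
awk 'BEGIN { printf "Test 2: %.1f%%\n", (5.08 - 3.57) / 5.08 * 100 }'
```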

There is almost a 33 per cent drop in load time between using just Apache and
using Nginx to proxy to Apache. The next test was a concurrency stress test.
This should show more of the benefit of Nginx, and how event-driven
asynchronous request handling works better for static content.
Before and between each test, Nginx and Apache were restarted and left
for five minutes to allow the server to settle. To run the stress test, AB
(Apache Benchmark) was used against the Oasis Overland site.

Test 1
Concurrent connections: 250
Number of requests: 1500
Apache result
Requests per sec achieved: 56.55
Total time to complete: 26.52s
Time per request: 17.68ms mean
Nginx result
Requests per sec achieved: 67.55
Total time to complete: 22.20s
Time per request: 14.80ms mean

Test 2
Concurrent connections: 500
Number of requests: 1500
Apache result
Failed. Connection reset after 1012 requests.
Nginx result
Requests per sec achieved: 60.74
Total time to complete: 24.70s
Time per request: 16.46ms mean
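As a quick sanity check on those figures, requests per second multiplied by the total elapsed time should roughly reproduce the 1,500 requests issued in each completed run:

```shell
# req/s * elapsed seconds should come back to ~1500 total requests
awk 'BEGIN { printf "Apache, 250 concurrent: %.0f\n", 56.55 * 26.52 }'
awk 'BEGIN { printf "Nginx, 250 concurrent: %.0f\n", 67.55 * 22.20 }'
awk 'BEGIN { printf "Nginx, 500 concurrent: %.0f\n", 60.74 * 24.70 }'
```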

Symbolically link the available site within the enabled sites:
ln -s /etc/apache2/sites-available/example /etc/apache2/sites-enabled/example
Restart both the Apache server and the Nginx server:
/etc/init.d/apache2 restart
/etc/init.d/nginx restart
Now, if you go to your site, everything should be working correctly: all static
content such as images, CSS and JavaScript will be served by Nginx, and all
PHP processing will be proxied to Apache. If you want to bypass Nginx for any
reason, you can simply add :8080 to the end of the domain name to go direct to
Apache. For example: www.example-site.com:8080.
More information on specific commands can be found on Nginx's wiki, along
with example usage: http://wiki.nginx.org.


It's clear that Nginx is able to handle a much higher number of simultaneous
connections in a quicker time. Even with concurrent connections increased to
1000 with 3000 requests, it still held up fine, with very similar results. Also,
the load on the server from Apache was higher, with it taking longer to release
its memory usage than Nginx.

About the author
Name Jaime Hall
Site www.strawberrysoup.co.uk
Clients Animal, Bodyshop, Ironfist, Irregular Choice
Areas of expertise HTML5, CSS3, LAMP, server management
What's your pet hate? People who don't take pride in their work

SEO 10 top tips

01 Unique tags
Use unique title tags and meta descriptions for each page; obvious, but very
important. Try to include your SEO keywords while creating text that will
attract users to click on your page. Keep title tags to a maximum of 70
characters and meta descriptions to roughly 155 characters. Don't use meta
keywords: they give no benefit and help your competitors find your
important keywords.
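A quick way to check lengths while you write tags; the title and description below are made-up examples, and 70/155 are the display limits suggested above:

```shell
title='Handmade Oak Furniture | Free UK Delivery | Example Co'
description='Browse our range of handmade oak tables, chairs and dressers, built to order in our Somerset workshop and delivered free across the UK.'

# ${#var} expands to the string length in characters
echo "title: ${#title} chars (limit 70)"
echo "description: ${#description} chars (limit 155)"
```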

02 Natural selection
Don't over-optimise. With the release of the Penguin update, Google is
rewarding sites that are more natural. So avoid stuffing keywords into your
title tags and content. Don't have hidden text or links, and don't use cloaking
or sneaky redirects. You get the picture. Keep it natural and concentrate on
user experience.

03 Manage 301 redirects
If moving to a new website or URL, make sure you 301 redirect the old
website's links to your new pages. The same goes for if you have multiple
domain names: use one as the primary domain and then 301 redirect the
others. This way you will retain all backlinks and avoid any duplicate content
issues. If you want to find where your backlinks currently point to, use
something such as Open Site Explorer or MajesticSEO.
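On an Apache server, the old-domain redirect can be handled with a few mod_rewrite lines in the old site's .htaccess. This is a sketch only; old-site.com and new-site.com are placeholders for your own domains:

```apache
# Send everything on the old domain to the same path on the new one, permanently (301)
RewriteEngine On
RewriteCond %{HTTP_HOST} ^(www\.)?old-site\.com$ [NC]
RewriteRule ^(.*)$ http://www.new-site.com/$1 [R=301,L]
```

The R=301 flag is what tells search engines the move is permanent, so link equity passes to the new URLs.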

04 Internal linking
Make good use of internal linking. When designing a great-looking site we
sometimes forget about which pages are the most important and how they're
linked to. Make sure your internal links support your most important pages,
and use good descriptive anchor text to let search engine spiders understand
what the page is about. You can use a website crawler such as the SEO Spider
Tool from Screaming Frog, or Xenu's Link Sleuth (both of which are free to
use) to see how your internal linking affects spidering.

05 Be careful
Be careful of free open source themes and plug-ins. Many free WordPress
themes include hidden code that will place spammy external links on your
website or, even worse, malware. Don't trust themes and plug-ins: make sure
you check them first before installing, or get them from a trusted source.
Usually these hidden links are concealed in Base64-encoded PHP code, so you
can check for that before installing. To easily check WordPress themes, install
and run TAC (Theme Authenticity Checker).
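Checking for this by hand is a one-liner. The sketch below fabricates a theme directory with a suspicious file purely for demonstration; point grep at your real theme folder instead:

```shell
# Fabricate a demo theme containing an obfuscated call (for illustration only)
mkdir -p demo-theme
printf '<?php eval(base64_decode("ZWNobyAnaGknOw==")); ?>' > demo-theme/footer.php
printf '<?php get_header(); ?>' > demo-theme/index.php

# List any theme files that mention base64_decode - a common hiding spot for spam links
grep -rl 'base64_decode' demo-theme
```

A hit isn't proof of malice on its own, but any eval(base64_decode(...)) in a theme deserves close inspection before you install it.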

Place to be Create listings in directories such as Google Places to increase your visibility

When building your website, perform keyword research at an early stage so that it can be built into the design
06 Don't cheat
Don't create spammy links. Don't buy links. The latest Google updates hit those
using linking schemes and blog networks hard. If you get caught, your site can
easily be pushed from page one to page five in an instant.
So don't be lazy and cheat; build links the proper way by creating great
content and building an audience and relationships online. If a linkbuilding
package looks too good to be true, it probably is.

07 Research keywords early
When building your website, perform keyword research at an early stage so
that it can be built into the design of your site. If you have an existing site,
consider running a test using Google AdWords to see what converts for you.
This way, you can easily identify the required landing pages (money pages) to
ensure that your design caters for them, without horrible bolt-on afterthoughts.

08 Use microdata
Take advantage of Google's rich snippets by using microdata in your web build.
Ever seen review rating stars in an organic listing? That's a rich snippet, and it
can greatly improve your clickthrough ratios (CTR). You can mark up ratings,
videos, people, products and many other things. Visit http://schema.org to find
out more. Also, Google has recently released the rel="author" attribute, which
allows you to connect content to a specific author, which Google can then use
in the search results.

09 Clear information
Optimise your website for local search results. Create a listing in local search
engines and directories, for example Google Places and Yahoo Local. Display
your business information clearly on your website, for example in the footer.
Check your whois data and make it the same as the business's if it's different.
Consider local business directories, businesses you work with and your local
chamber of commerce.

10 Think social
Search engines are placing great weighting on social signals, so you want to
encourage the use of networks such as Facebook and Twitter. Make sure your
content is easy to share, and promote content through your own social profiles.
Concentrate on getting shares and retweets, rather than just increasing
followers and Likes. It's the sharing of content that we want; the followers just
give us the audience to do so.
Glenn Alan Jacobs is managing director of SpeedySEO, an online marketing
agency based in Essex www.speedyseo.com


Schema: utilise structured data

Download the files! All the files you need for this tutorial can be found at http://netm.ag/schema-241

There's a shift towards the semantic web from Google and other search providers.
Luke Hardiman demonstrates how to use structured data for ecommerce products
Knowledge needed Basic HTML
Requires Text editor, Google Structured Data Testing Tool
Project time 30 minutes
There's never been a better time for small online businesses and
startups with great merchandise to get their wares in front of new
customers, and all without having to rely on paid clicks, banners or
any other annoying, expensive ad campaigns. In many ways, content is king
again, and not just the editorial kind. Surround your online products with
quality, contextually relevant content, mark it up in a semantically sound way
and you'll be taking the high road to commercial success.

How did we arrive here?


Obsessing over semantic markup used to be more or less the exclusive domain
of web standardistas, many of whom were driven by anything but a commercial
agenda. Some, such as web designer Jeffrey Zeldman, did in fact present the
business case (www.zeldman.com/dwws) for web standards many years ago.
Arguments for big business adopting web standards were met with varying
degrees of success and no shortage of deaf ears.
Structured data now makes that case more emphatically than ever before,
with visual evidence in the form of rich snippets and money in the bank,
delivered on the back of improved search engine performance. These benefits
will extend beyond the big search engines in the near future.

Wrath of Penguin
Major Google updates (Panda in 2011, http://netm.ag/panda-241, and Penguin
in 2012, http://netm.ag/penguin-241) took a far-reaching swing at spammy and
poor quality content. Sites judged to have a poor user experience, too many
ads, or deemed to have benefited from dodgy link building campaigns suffered
massive drops in organic traffic. Many of them paid a high price in lost revenue.
So what does this SEO smackdown have to do with frontend development?
Well, it's more about the increased focus on semantics and what that means
for nearly anybody producing and marking up online content, from your

Schema.org Google, Bing, Yahoo and Yandex showed unanimous support by declaring
Schema.org's microdata vocabulary as the preferred standard in 2011


average blogger maintaining an audience, to online stores, local businesses and
beyond. Anyone investing in content and looking to get some kind of return on
investment, monetary or otherwise, should now be paying attention to how
they present it at the data level.
Rewarding structured data-powered sites is, in many ways, dependent
upon knowing the origin of that content. Google's Authorship looks to
solve this problem, extending the concept of provenance (http://netm.ag/authorship-241)
to the web. It enables the search giant to make value
judgements about the person or organisation behind a site's content. Many
believe this is the new frontier for reaping organic search success.
This has come about in part because Google is looking to unload some of
its reliance (http://netm.ag/anthony-241) on easily manipulated search ranking
signals such as the link graph (http://netm.ag/dejan-241). In the near future,
it's thought it will look to semantic markup, Author Rank and social signals
to do this. Some would argue it's already happening. We saw Google launch
the Knowledge Graph (http://netm.ag/graph-241) with the emphasis on
'things not strings': a clear shift towards semantic data mining. Larry Page
stated that the Knowledge Graph is only at one per cent of the level of detail
Google is aiming for.

Enter Schema.org
In June 2011, Google, Bing, Yahoo and Yandex all announced a rare show of
unanimous support and declared Schema.org's (http://schema.org) microdata
vocabulary the preferred standard. This provoked no small amount of protest
from the microformats and RDFa people, who had been working on their own
data vocabularies for a considerable amount of time. Nevertheless, pragmatists
have appreciated the across-the-board adoption by the big search players, the
freedom to extend the vocabulary (there'll be more on this later), the relative
ease-of-use, the testing tools and the plentiful documentation that Schema.org
brings to the table.
The unilateral declaration of support by the big search players succeeded
in lighting the intended fire. In the recent past, business-minded frontend
developers have set about implementing microdata wherever a site's content
fits the available structured data vocabulary. The rewards are beginning to

Semanticweb.com This reports on the unanimous show of support for microdata
from Google, Yahoo and Bing

Q&A Ask Google a question, and it's now trying to deliver you an answer, rather than
just list a bunch of links to sites that may or may not do that

pay dividends, as with the author rich snippets, in the form of increased
clickthrough rates from the search results.
There's no question that highly relevant search results that stand out from
the competition are going to bring you more traffic. Not only that, but it's
becoming clear that rich snippet traffic is more likely to convert into revenue. A
user that clicks on a rich snippet result invariably has a better understanding of
the content they're going to find at the end of that click. They're making a more
informed choice, which could mean they'll be more likely to place an order at
the end of their session.

Rich snippet gold rush
Right now, across the ecommerce and inbound marketing world, there is a big
drive afoot to mark up content as semantically as possible. Naturally, any
product manager would want to beat competitors with handsome, semantically
rich search results. Frontend developers have found themselves at the business
end of this commercially driven markup trend.

It's becoming clear that rich snippet traffic is more likely to convert into revenue

In this tutorial we're going to look at how you can take your product markup
to the next level and take aim at sexy consumer confidence-building rich
snippets. Better still, many would say that, by implementing semantically rich
product templates today, you are readying your site for all manner of indexation
in the semantic web applications of tomorrow. Mobile apps and traditional
ecommerce product aggregators will surely rely more heavily on semantic
markup as HTML5 and microdata implementation takes off across the web.

Provenance
The concept of provenance is about establishing ownership and
authenticity for a given entity, in our case web pages. One standout
example of an algorithmic attempt to validate and evaluate online
content can be seen in Google's Authorship (http://netm.ag/author-241)
initiative. One way to look at it is as a new kind of insurance policy
against future web spam, which includes the semantic web. We've already
seen spammers souring the semantic petri dish with entire sites of
rubbish created solely for gaming the structured data crawlers.
In simple terms, Authorship, powered by the rel='author' and
rel='publisher' attributes, enables Google to do a quick background
check on the originator of any content those attributes pertain to.
In the new search economy of content valuation, there is a growing
belief (originating among inbound marketers) that online business
owners need to pay attention to building trust and cultivating their
online reputation. Their efforts will be, and are to some degree already
being, judged algorithmically, based on content quality, position on
the social graph and other aspects of the digital footprint.
Google has moved to set up Google+ as a nexus for personal and
professional online profiles. Where many have dismissed the service as
another failed attempt to take on Facebook, others have recognised that
there is something more forward-thinking going on there.
By adding <link rel="author" href="[google plus profile url]">
and <link rel="publisher" href="[google plus business page url]"> to an
HTML document's head, you can direct Google to a point of reference
that serves to expose who produced the content it's crawling. This is how
author thumbnails come to appear in the Google search results, among
other more far-reaching benefits. Tests have shown this leads to better
clickthrough rates. Users are more inclined to trust, or show interest in, a
search result that Google has been able to put a face to.
Matt Cutts's personal site is sufficiently established to define his online
presence, and so his rel="author" links point back to http://mattcutts.com.
He's also not using the official Google microdata recommendation.
Microformats are still working fine for his blogging needs.

Building in the semantics
By far and away, the best approach is to build microdata into the CMS's
templates. No frontend developer wants an inbox full of Word documents from
the content team, and to have to set about marking up the contents of each one
on a case-by-case basis.
Personally, I mostly work with ExpressionEngine (http://ellislab.com/expressionengine),
which is known for being a very flexible publishing system.
I always try to start each project with solid information architecture and at least
a basic data dictionary. These are the two main ingredients necessary for setting
up the CMS. If you're fortunate enough to already have your data entities
well-defined in your CMS, you're off to a flying start and maybe you are ready
to skip ahead to implementation at the template level.

Testing times The Google Structured Data Testing Tool provides feedback to review
your changes as you work, as well as any nesting or vocabulary errors

Google's tool This will kick out a list of all of the identified items on your page, as
well as any errors or entities which do not fit within your chosen schemas

Preparing new sites for structured data
In the planning phase of your site, the challenge is to identify all of
the structured data entities and attributes relevant to the business
at hand. Once you have your data dictionary, you can use it to set up
your publishing system so that these entities each have a separate
field assigned to them in the CMS.
This level of separation gives you the flexibility to output the content
in a granular way, wherever it is needed. So, on your product detail
page, you can output everything at once, while also having the option
to rearrange the page's components whenever you want or need to test
a new layout.
On listing templates or search results pages, you can output product
content piecemeal, showing only the quick view attributes, like pricing
and a summary. This makes it easy to scan a list of options. Need
geo-coordinates for maps? A single image for the homepage slider?
No problem. All of your site content is neatly segmented in a way
that allows you to build and progressively tailor your site towards the
perfect user experience.

Data integrity
For microdata purposes, data integrity allows for coding up templates
where each type of content can be outputted into the relevant HTML5
element, which in turn contains the specific microdata attributes to
define exactly what type of content lives inside.
Get your HTML and itemtypes ready to receive whatever new
products the content team throws at them. Once you've achieved this,
you've won significant victories for your web platform on three
different fronts:
• At the business level, you have a highly flexible and product-specific
publishing system for your client
• At the user level, you have well-defined data that can be searched
or outputted wherever needed, putting your content to work in the
service of a great user experience
• At the findability level, you have detailed product data and attributes
that describe semantically rich entities for search engines as well as
for aggregators

Content creators and copywriters
Within your CMS, a well-structured set of custom fields is assembled for
content authors into a publish form, awaiting a blog post, new travel
destination, hotel or whatever the content strategy requires. Copywriters won't
need (or want) to think of it in these terms though. We don't want to waste
their creative juices with thoughts of data structure and the semantic web.
However, it benefits everybody (users, the content team, search engines and
ultimately the business's bottom line) to structure the system in this way. We
want the site to have the capability to output product and related data in both a
flexible and highly specific format.

A practical use case
Let's take a travel website as a practical example. As the base for your site
content, you may have a set of location entities: countries, travel regions, cities
and towns. The products are most likely things like accommodations and tours,
which are each related to one or more locations. Within each tour product,
there are various product attributes:
• a gallery of imagery
• locations visited, perhaps ordered sequentially
• locations with their own geo-coordinates
• the resulting route map
• activities on offer at each waypoint
• accommodations stayed at
• the tour price
• a list of scheduled departure dates
• a special offer for last-minute departure dates
• a video overview of the trip
• a tour category or travel style
• traveller reviews of the experience
From these attributes we can extract a set of data entities based upon the
microdata vocabulary available to us at Schema.org (http://netm.ag/docs-241).
Better yet, as of November, Schema.org is officially extensible via the
GoodRelations vocabulary (http://netm.ag/gr-241), which is an ecommerce-specific
structured data vocabulary. Think of GoodRelations as a common
language for defining ecommerce content across the semantic web.
All of the big search players (Google, Yahoo, Yandex and Bing) have again
come out in support of this move, so you can be pretty confident that using
GoodRelations with Schema.org microdata will translate to increased visibility
for online merchandise.

Marking up a tour
In terms of the immediate benefits available to us, and based on our tour's
product attributes as listed above, there are a few different types of rich snippet
which we can target with our product content.
The following code snippets are taken from the sample HTML file provided,
which marks up an African overland tour with some of the entities listed above.
Go ahead and customise this to your own needs.

Authorship
To start, we'll add <link> tags for the rel="author" and rel="publisher"
attributes. For the example below, I have these pointing to the Google+ profile
of the page's author. You may or may not want to do this, based on how
connected your author is within their industry. An author who is a respected
authority on the page's subject matter, with a strong social media footprint or a
considerable weight of content published elsewhere online, is going to be a
valuable person to link up here.

Express yourself The ExpressionEngine product publish form makes it simple to
output entity-specific content to structured data templates

Our tour The detail template contains many aspects which can be formatted as
structured entities, from pricing to events and a considerable amount of geodata

Pricing An 'In Stock' message pops up next to the price in your search result. This is
achieved by creating an itemscope for the product

<!DOCTYPE html>
<html>
<head>
<title>Multi-country Overland Tour</title>
<!-- we're specifying authorship here, which gets us an author thumbnail in the Google SERPs -->
<link rel="author" href="https://plus.google.com/u/0/117466339632939110111/posts" />
<!-- we're specifying the publisher here, which ties the content to our Google+ Profile -->
<link rel="publisher" href="https://plus.google.com/101112670412082074285/posts">
With the current state of play, I've found that authorship tends to override
other structured data Google may have found, and leave you with a rich
snippet dominated by your author's photo. This may or may not work for you,
depending on what type of content you want to expose in the search results
(or how good looking your content team is!).

Breadcrumb navigation
Here we add a breadcrumb (http://netm.ag/crumbs-241) itemprop attribute to
the nav element. All you need to tell Google is that this set of links contains
information about your content architecture. Using breadcrumbs will not only
give you a rich snippet that reflects your site's structure, but consistent use of
breadcrumbs will also serve to expose your information architecture to bots and
users alike. Double win.

<!-- The WebPage itemtype is implied, but I'm declaring it below for the sake of clarity -->
<body itemscope itemtype="http://schema.org/WebPage">
<!-- The breadcrumb is not part of the Product schema and must sit outside of the Product itemtype below -->
<nav itemprop="breadcrumb">
You are here
<a href="/">Home</a> >
<a href="/budget-safaris/">Budget Safaris</a> >
<a href="/budget-camping-safaris/">Budget Camping Safaris</a> >
Multi-country Overland Tour
</nav>

Canon EOS This is the search result detail you're aiming for. There's breadcrumb, star
rating, review date, author name and thumbnail and links to more of his content

Pricing
Here we create a new offer (http://netm.ag/offer-241) itemscope for the
product, contained within the <div> that wraps our pricing information. This
specifies currency and amount (price), and I'm also using the new GoodRelations
businessFunction attribute to specify that we are selling this tour, as opposed to
leasing it or any other type of trading.

<!-- the 'offer' - pricing details. Using GoodRelations there is huge scope for additional detail here -->
<div itemscope itemprop="offers" itemtype="http://schema.org/Offer">
Priced from <abbr title="Per Person Sharing">pps</abbr>
<meta itemprop="priceCurrency" content="GBP">
<span itemprop="price">999</span>
<link itemprop="businessFunction" href="http://purl.org/goodRelations/v1#Sell">
</div>

Specifying <link itemprop="availability" href="http://schema.org/InStock">
can also get you a little 'In stock' message that pops up next to the price in your
search result. I'm not sure how well this fits with the selling of tours, and there
may be a better way to handle this for products that are not off-the-shelf items.
(I've left it out of the example code.)

Events
Because events aren't catered for at the moment as properties of the product
vocabulary, I haven't used an itemprop attribute on any of the <li>s. I'm still
going to include them on the page as structured data entities though. This is in
the hope that the search engines and aggregators will make use of the clearly
defined content.

<!-- we're marking up the scheduled tour departures as events.
Note: events are not valid properties of the Product schema, hence no itemprop attribute -->
<h3>Departure Dates</h3>
<ul>
<li itemscope itemtype="http://schema.org/Event">
<meta itemprop="name" content="Scheduled Tour Departure">
<meta itemprop="location" content="Cape Town City Centre">
<meta itemprop="duration" content="10 Days">
<time itemprop="startDate" datetime="2013-03-08">Fri, 08 Mar 2013</time> -
<time itemprop="endDate" datetime="2013-03-17">Sun, 17 Mar 2013</time>
</li>
<li itemscope itemtype="http://schema.org/Event">
<meta itemprop="name" content="Scheduled Tour Departure">
<meta itemprop="location" content="Cape Town City Centre">
<meta itemprop="duration" content="10 Days">
<time itemprop="startDate" datetime="2013-03-22">Fri, 22 Mar 2013</time> -
<time itemprop="endDate" datetime="2013-03-31">Sun, 31 Mar 2013</time>
</li>
<li itemscope itemtype="http://schema.org/Event">
<meta itemprop="name" content="Scheduled Tour Departure">
<meta itemprop="location" content="Cape Town City Centre">
<meta itemprop="duration" content="10 Days">
<time itemprop="startDate" datetime="2013-04-05">Fri, 05 Apr 2013</time> -
<time itemprop="endDate" datetime="2013-04-14">Sun, 14 Apr 2013</time>
</li>
</ul>

Vicky Rowe Here's a rich snippet for a blog post, showing verified authorship and a
blog category breadcrumb trail
In an event-specific move targeted at the non-coders out there, Google
recently launched a WYSIWYG tool that allows you to show them where on
your site the event data can be found. The Data Highlighter (http://netm.ag/data-241)
currently resides inside Google Webmaster Tools and seems a bit
limited at the moment. Perhaps it will be refined over time. Another thing to be
aware of is that you'll only be helping Google to understand your content, and
not disseminating it anywhere other than within Google's range of services.

Reviews and video
I've included my VideoObject as an associatedMedia child of the Review entity,
but feel free to add yours anywhere you like. Just be aware that associatedMedia
isn't currently a valid property for the product schema, so you wouldn't be able
to semantically tie a video to a product in this particular way.

<!-- let's add a review, and include a video with it -->
<h3>Reviews of this Trip</h3>
<div itemprop="review" itemscope itemtype="http://schema.org/Review">
<span itemprop="name">Not a happy camper</span> -
by <span itemprop="author">Frank Chickens</span>,
<meta itemprop="datePublished" content="2011-04-01">March 1, 2013
<div itemprop="reviewRating" itemscope itemtype="http://schema.org/Rating">
<meta itemprop="worstRating" content="1">
<span itemprop="ratingValue">1</span>/
<span itemprop="bestRating">5</span> stars
</div>
<span itemprop="description">The truck broke down and lions ate our guide.</span>
<div itemprop="associatedMedia" itemscope itemtype="http://schema.org/VideoObject">
<h5>Video: <span itemprop="name">Lions Eating Safari Guide</span></h5>
<meta itemprop="duration" content="T1M33S" />
<meta itemprop="thumbnail" content="lion-chomp-thumb.jpg" />
<video>
<source src="lion-lunch.mp4" type="video/mp4">
</video>
<span itemprop="description">Not for sensitive viewers - includes graphic footage of lion eating bus driver.</span>
</div>
</div>

106

The SEO Handbook

Webmaster Tools Be aware that entities won't show up immediately. The time scale depends upon how often and how deeply Googlebot is crawling your content
That wraps up the example code. One thing to bear in mind with this particular example is that we have a lot of entities all competing for attention on one product page. You may find you get better results (particularly with Google rich snippets and long tail searches) by adding more content to each entity (for example, longer reviews) and then breaking things out into separate URLs, something like this:
.../my-tour/reviews/
.../my-tour/locations-and-map/
.../my-tour/departure-dates/ 
This way you don't compromise your chances of getting a separate rich snippet for each entity.
Separate URLs also mean you're more likely to rank for long tail search queries: for example, location+product, product+review and so forth. The long tail can be a conversion gold mine. The specificity of the query often means the user is quite far down the decision-making process, and is more likely to convert to a sale or a booking.

Tracking progress
Once you have some structured data entities live on your site, Google allows you to monitor the indexation progress via Google Webmaster Tools.
Under the Labs section in Webmaster Tools there's also an Authorship report, showing how often rich snippets were presented in the search results and the percentage of clickthroughs that resulted.
Similar to its policy on cloaking, Google's guidelines on rich snippets (http://netm.ag/rich-241) specify that any content marked up with structured data must also be shown to the user. Don't be tempted to present one set of entity-stuffed structured data to the engines without making it visible on the page. At best you'll get no benefit, and at worst you'll be penalised and your site could disappear from the search results.

About the author


Name Luke Hardiman
Web www.africanbudgetsafaris.com and
http://southernafricatravel.com
Areas of expertise Frontend, design, UX, content strategy
Clients Barclays, Ogilvy Interactive, Volkswagen
What's your pet hate? Sites that kick things off with
'Welcome to our website!'


ASP.NET: boost page performance

Download the files! All the files you need for this tutorial can be found at http://netm.ag/asp-242

Bundling and minification are two very effective techniques to provide a faster page
load time and browsing experience for your website. Dean Hume explains how
Knowledge needed ASP.NET, basic CSS and JavaScript
Requires Visual Studio 2012 or Visual Studio Express 2012 (free)
Project time 1 hour


Most CSS and JavaScript that's written by developers includes comments, line breaks and spaces. This makes the code a lot easier to understand and easier on the human eye. However, while it may make your code readable and great for the next person to review, it can have a negative effect on your web page performance.
If we take the jQuery library, for example, the file is around 225KB in size. If we were to strip out all the spaces, comments and line breaks from the file using a technique called minification, the file size comes in at around 93KB. That is almost a 60 per cent saving on the original file size! Imagine the impact that this will have on your network traffic and hosting costs. Not to mention that if you have a high-traffic website, every user that visits your site will have less to download and will experience faster load times. If you were to save on the file size of every JavaScript and CSS file on your website, you may be surprised by the impact that this can have on the page load speed of your sites.
So what exactly is minification? Let's take a look at the snippet of CSS below:
input[type="search"] {
-webkit-box-sizing: content-box;
-moz-box-sizing: content-box;
box-sizing: content-box;
-webkit-appearance: textfield;
}
input[type="search"]::-webkit-search-decoration,
input[type="search"]::-webkit-search-cancel-button {
-webkit-appearance: none;
}
textarea {
overflow: auto;
vertical-align: top;
}

Visual studio Visual Studio 2012 offers a whole host of great features designed to help develop interactive web applications, with new tools for JavaScript and jQuery
If we take this same file and minify the contents, it will look a little
something like this:
input[type="search"]{-webkit-box-sizing:content-box;-moz-box-sizing:content-
box;box-sizing:content-box;-webkit-appearance:textfield}input[type="search"]::-
webkit-search-decoration,input[type="search"]::-webkit-search-cancel-button{-
webkit-appearance:none}textarea{overflow:auto;vertical-align:top}
The contents of the file have had all the spaces, comments and line breaks stripped out, which has made a significant difference to the overall size of the CSS. The code above isn't exactly easy to read and will be a bit of a pain to debug. However, your browser doesn't care what the code looks like; it will process it in exactly the same way. The key difference here is that the minified code will download a lot faster and allow your users to begin interacting with your web pages sooner.
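As a rough illustration of what a minifier does, here is a deliberately naive Python sketch. It handles only the simple cases shown in this article; real minifiers also deal with strings, `calc()` expressions and many other edge cases:

```python
import re

def minify_css(css: str) -> str:
    """Naive CSS minifier: strips comments and collapses whitespace.
    Illustrative only; production minifiers handle far more edge cases."""
    css = re.sub(r"/\*.*?\*/", "", css, flags=re.S)  # remove /* comments */
    css = re.sub(r"\s+", " ", css)                   # collapse runs of whitespace
    css = re.sub(r"\s*([{};:,])\s*", r"\1", css)     # trim space around punctuation
    css = css.replace(";}", "}")                     # drop the last semicolon in a block
    return css.strip()

original = """
/* text areas */
textarea {
    overflow: auto;
    vertical-align: top;
}
"""
minified = minify_css(original)
print(minified)  # textarea{overflow:auto;vertical-align:top}
print(len(original), "->", len(minified))
```

Even on this tiny snippet the byte count drops sharply; on a full stylesheet the whitespace and comments add up to the kind of savings shown in the table below.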
In order to see the size differences between minified and unminified files, I
compared the following popular frontend frameworks:
File name               File size before   File size after   File size saving
Twitter Bootstrap CSS   98KB               80KB              19%
Twitter Bootstrap JS    60KB               27.8KB            53.7%
Zurb Foundation         99.3KB             74.9KB            25%
jQuery                  225.78KB           93.28KB           58.68%
jQuery Mobile           240KB              91KB              62%

As you can see from the table above, there are considerable file size savings
to be made by simply minifying your CSS and JavaScript files. The amount of

Getting started Begin by creating a new project in Visual Studio. In this example, we
are going to use an ASP.NET MVC 4 application


Our JavaScript is getting bigger


It's surprising to know that every year the size of the JavaScript files that developers are writing is growing. According to the HTTP Archive (www.stevesouders.com/blog/2012/02/01/http-archive-2011-recap), between January 2011 and January 2012 the average size of JavaScript files grew by about 44 per cent. Between January 2012 and April 2013 the average size of JavaScript files increased by 23 per cent. That is incredible! If we continue at this rate, the JavaScript files that we as developers serve to our users are going to become huge.
With the introduction of JavaScript MVC frameworks, coding entire sites
in JavaScript and HTML is now becoming the norm. As developers, we need
to be mindful of the effect that this can have on the performance of our
websites. When a browser needs to parse and execute JavaScript, it blocks
the UI thread and makes websites slower.
However, this growth isn't just limited to JavaScript. CSS files are also growing in size: the average size of CSS files between January 2011 and January 2012 grew by 19 per cent. As we know, the bigger the files, the longer they take for our users to download. This is especially relevant for users who access your website on a mobile device or tablet, and it's important to keep this in mind when creating your web pages.

The chart above shows the growth of JavaScript file sizes between January 2011 and January 2013. It's on a steady incline and reflects an interesting trend. While having a lot of JavaScript on your website may be inevitable, keep in mind that it may block the UI thread and make your website slower. By ensuring that we stick to best practices, such as bundling and minifying CSS/script files, we give our users a better and faster browsing experience.

savings that you will be able to make will differ depending on the contents and
file type. These changes can make a big difference to overall page load times.

What is bundling? Well, bundling is the act of combining all the JavaScript files in a web page into one single file, and similarly all the CSS files into one. Web pages can contain multiple script tags and style tags referencing different files on your server. The more CSS or script files you have in your web page, the more HTTP requests your users need to make every time they visit. HTTP requests are expensive because each one means a round trip from the browser to the server. If this can be reduced, then page load times can be improved. Since minifying a file doesn't affect the way it's rendered by a browser, why not combine the two techniques? By minifying and bundling your script and style files you can get even better results.
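At its core, bundling is just concatenation at build time. Here is a minimal Python sketch of the idea, with hypothetical file names; ASP.NET's bundling framework, covered next, does the equivalent for you automatically at application start-up:

```python
from pathlib import Path

def bundle(paths, out_path):
    """Concatenate several text assets into a single file, so the page
    makes one HTTP request for them instead of one request per file."""
    parts = [Path(p).read_text() for p in paths]
    Path(out_path).write_text("\n".join(parts))

# Hypothetical file names, for illustration only:
# bundle(["bootstrap.css", "bootstrap-responsive.css"], "bundle.css")
```

Run the bundled file through a minifier as well and you get both benefits at once: fewer requests and fewer bytes per request.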
In this article, we're going to run through an example in ASP.NET MVC that will allow you to automatically minify and bundle all your resource files. The new features that have been built into ASP.NET 4.5 make this a very easy process and take the pain out of your day-to-day development. The code will allow you to easily switch between the minified and unminified versions of your code depending on whether or not you are in release mode.

Getting started
Let's get started with the code example. Start by firing up Visual Studio and creating a new project. Choose an ASP.NET MVC 4 application and give your project a name. I've chosen to name mine NetMagBundling. Click OK.
Next, you will be presented with a screen similar to the one pictured on the right. Choose the Basic template and click OK. This will create a new project and provide you with the default ASP.NET MVC solution files.
In this example, I'm going to use the Twitter Bootstrap framework along with a copy of the latest jQuery library in order to get a simple design up and running. The Twitter Bootstrap framework is available to download at http://twitter.github.io/bootstrap. Once you've downloaded the Twitter Bootstrap files, add them to your project. I have added my CSS to the Content folder and JavaScript to the Scripts folder. Ignore the minified versions of the files that come with the Twitter Bootstrap download, as we are going to use the ASP.NET bundling framework to handle this for us automatically.

Basic template Next, choose the Basic project template. This will create all the
resource files that we are going to need to get started

In your Solution Explorer, you will also notice a folder called App_Start.
The App_Start folder contains a file called BundleConfig.cs, which enables
you to set and manage the bundling and minification for the project. Open
the BundleConfig.cs file and you will notice a method called RegisterBundles.
We will use this method to notify the ASP.NET framework which files to minify
and how to bundle them together. When you create a basic project in Visual
Studio, it will provide you with common libraries and you may notice that the
RegisterBundles method already contains some code inside it. Remove this code
as we are going to write one that will only bundle the files that we need.
Update the code inside the RegisterBundles method to represent:
public class BundleConfig
{
public static void RegisterBundles(BundleCollection bundles)
{
// Minify the CSS
bundles.Add(new StyleBundle("~/css/minify").Include(
"~/Content/css/bootstrap.css",
"~/Content/css/bootstrap-responsive.css"));
// Minify the JavaScript
bundles.Add(new ScriptBundle("~/js/minify").Include(
"~/Scripts/bootstrap.js"));
}
}
In the snippet of code above, you will notice that there's a reference to the Bootstrap CSS and JavaScript files. We are creating a new


Compression +
minification = a boost
If you're looking to squeeze even more speed out of your page load times, then you should be looking at HTTP compression. It's simply an algorithm that eliminates redundancy from a file in order to create a file that's smaller in size than the original representation. In the same way that you may zip a file on your hard drive and then unzip it again when needed, HTTP compression does the same thing with your web page components. When a user requests a file from the server, the server can return a compressed version of the file. The user's browser then understands that it needs to decompress the file that came back from the server. By passing a smaller file between the server and browser, the download becomes a lot quicker, shaving off a considerable number of bytes that need to be transferred.
Most web servers already have the ability to compress files; they simply need to have this feature enabled. Depending on the web server that you are using, check the settings and don't forget to enable compression.
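You can get a feel for the kind of saving on offer by gzipping some markup yourself. This Python sketch uses the same DEFLATE-based compression that web servers typically apply to HTTP responses; the sample markup is made up for illustration:

```python
import gzip

# Repetitive markup compresses extremely well, which is why HTML,
# CSS and JavaScript are ideal candidates for HTTP compression.
html = b"<li class='item'>entry</li>" * 500

compressed = gzip.compress(html)
saving = 100 - 100 * len(compressed) / len(html)
print(f"{len(html)} bytes -> {len(compressed)} bytes ({saving:.0f}% saving)")
```

Real pages are less repetitive than this synthetic example, but text content routinely compresses by 60 to 80 per cent.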


The chart above is taken from the httparchive.org website and shows the number of compressed responses between January 2012 and January 2013. This chart was produced by analysing around 290,000 URLs. Surprisingly, only around 70 per cent of all content analysed was actually compressed. Considering how easy it is to enable compression on your web server, you would think that figure would be a lot higher.
In this article, we're discussing bundling and minification, which reduce the size of your script and CSS files. By combining these with HTTP compression, you're able to squeeze even more bytes out of these files. Together, these techniques add up and are simple but effective ways of improving the speed of your web pages.

StyleBundle and a ScriptBundle that will minify and bundle these files together respectively. The paths inside the Include method point to the location of the Twitter Bootstrap files. If we take the StyleBundle, for example, there is a path that points to "~/css/minify". This path doesn't exist in our file structure, but is in fact an endpoint. We will point to this endpoint in the HTML, and the ASP.NET bundling framework will create our new bundle on application start-up.
This new bundle will take the CSS files, minify them and combine them into one, and do the same with the JavaScript files. This means less to download and fewer HTTP requests being made.
In order to use the bundles that we've just created in your web page, create a new view and add the following code:
@{
Layout = null;
}

<!DOCTYPE html>
<html>
<head>
<title>Bundled Bootstrap template</title>
@Styles.Render("~/css/minify")
</head>
<body>
<div class="container">
<h1>Minifying and Bundled CSS/JavaScript</h1>
</div>
@Scripts.Render("~/js/minify")
</body>
</html> 
I've created a simple HTML layout that uses the Twitter Bootstrap classes. You will notice a method called Styles.Render. This method will use the bundle that we defined and write out the style tags for us. Similarly, there's also a Scripts.Render method that will bundle the scripts and write out the script tags onto the page.
Let's fire up the application and compare the differences. If you run the application and browse to the source of the page, you will notice HTML similar to the following:
<!DOCTYPE html>
<html>
<head>
<title>Bundled Bootstrap template</title>
<link href="/Content/css/bootstrap.css" rel="stylesheet"/>
<link href="/Content/css/bootstrap-responsive.css" rel="stylesheet"/>
</head>
<body>
<div class="container">
<h1>Minifying and Bundled CSS/JavaScript</h1>
</div>
<script src="/Scripts/bootstrap.js"></script>
</body>
</html>
OK, so the HTML contains all the script tags and style tags that we added into the bundles. However, they don't appear to be minified or combined together. This is because we're running the application in debug mode. In order for the minification and bundling to work, we need to ensure that the application is in release mode. Bundling and minification are enabled or disabled by setting the value of the debug attribute in the compilation element in the Web.config file.
In your Solution Explorer, navigate to your Web.config file and update the following line:

Starting line In your Solution Explorer is a folder called App_Start containing a file
called BundleConfig.cs. This enables configuration of bundling and minification settings


<system.web>
<compilation debug="false" />
<!-- Lines removed for clarity. -->
</system.web>


Bootstrap In this example, we are going to use the Twitter Bootstrap framework. Download the framework and add the files to the CSS and Scripts folders accordingly

Using frameworks If you haven't already heard of Twitter Bootstrap, it's a powerful frontend framework that has a set of pre-rolled responsive templates

Set the debug attribute of the compilation element in the Web.config file to false and it will ensure that you are no longer developing in debug mode. If you start the application again and view the HTML source of the page, you'll notice the bundles in action.

applications, you can process and apply the custom transforms yourself. The
custom transforms will allow the framework to interpret the files and apply
bundling accordingly. Although the example in this tutorial ran through
bundling in ASP.NET MVC, the bundling framework also works in ASP.NET
Webforms (www.asp.net/web-forms).

The bundles now point to the virtual path that we created in the BundleConfig.cs file. These files are minified and combined, making them significantly smaller in size and resulting in a faster page load time. Bundling also sets an HTTP Expires header one year from when the bundle is created. This means that if a user hits your web page, the CSS and JavaScript that they have downloaded will stay in their cache for one year. Every time they revisit your site, they won't need to download the files again because they already have a copy in the cache. The loading time of your web page will be significantly reduced for returning users. Depending on the frequency of your returning users, you may also notice a significant reduction in the amount of network traffic that you generate.
However, the framework is clever enough to recognise that the contents of your files may change, which could affect what your users have stored in their cache. You may have noticed that the virtual path for the bundles has a query string value of v with a unique hash identifier appended. As long as the contents of the bundle do not change, the ASP.NET application will request the bundle using this token. However, if any file in the bundle changes, the ASP.NET optimisation framework will generate a new token, guaranteeing that any browser requests for the bundle will get the latest version.
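The idea behind that token can be sketched in a few lines of Python: derive a version string from the bundle's bytes, so the URL changes only when the contents change. This illustrates the concept and is not ASP.NET's exact algorithm:

```python
import base64
import hashlib

def bundle_token(contents: bytes) -> str:
    """Derive a URL-safe version token from bundle contents, so the
    bundle URL changes whenever any file in the bundle changes."""
    digest = hashlib.sha256(contents).digest()
    return base64.urlsafe_b64encode(digest).rstrip(b"=").decode()

css = b"body{margin:0}"
print(f"/css/minify?v={bundle_token(css)}")

# A one-byte change to the bundle yields a completely different token,
# so browsers holding the old URL in cache fetch the new version.
assert bundle_token(css) != bundle_token(css + b" ")
```

Because the token is derived from content rather than from a timestamp, unchanged bundles keep the same URL and stay cached indefinitely.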
The ASP.NET bundling framework also allows you to write your own custom
bundles. If you prefer to use LESS, CoffeeScript, SCSS or Sass bundling in your

Summary
By using this new feature of ASP.NET, you get three benefits in one: your files
are minified, combined together and will be cacheable. This means that your
users download less, make fewer HTTP requests and wont need to download
the files more than once unless the contents change.
If we take a look at the file sizes before and after bundling, there is a
noticeable difference:
File type             File size before   File size after   File size saving
Overall page weight   206.3KB            147.41KB          30%
JavaScript            60.3KB             27.6KB            55%
CSS                   124 + 21.6KB       119.9KB           17.66%

While this is a very simple page, we have still managed to reduce the overall page weight by around 30 per cent. That is a significant reduction on the original weight of the page and will certainly benefit your users.
The average connection speed for internet users worldwide is around 1.8Mbit/s. In the UK, our connection speeds are around 6Mbit/s, which is significantly higher than much of the rest of the world. However, as web developers, it's important that we think about users globally. Many users may be accessing your website from locations around the world with poor connection speeds. These performance techniques will make a big difference to those users and their overall browsing experience on your website. As we know, the bigger the files, the longer they take for users to download. This is especially relevant for users accessing websites on mobile devices or tablets. 3G speeds can vary wildly depending on a number of factors and can be flaky at the best of times. Think about performance best practices when developing your website; all your users will benefit from your work!
If you are using ASP.NET MVC in your next application, give bundling a go. For less than an hour's worth of work, your users will notice the results! The source files for this project are available to download at http://github.com/deanhume/NetMagBundling.

About the author


Name Dean Hume
Web www.deanhume.com
Areas of expertise ASP.NET, web performance
Clients Michelin, Jamie's Italian and Fuller's
What's the most trouble you've ever been in?
What happens in Kazakhstan stays in Kazakhstan!



<!DOCTYPE html>
<html>
<head>
<title>Bundled Bootstrap template</title>
<link href="/css/minify?v=x2tjtoRj4l9AoKFwO-qVI5gpFTC7fxSWPa0gEL-BrNY1"
rel="stylesheet"/>
</head>
<body>
<div class="container">
<h1>Minifying and Bundled CSS/JavaScript</h1>
</div>
<script src="/js/minify?v=ii4SomVnNME7Iq1GL51nWqk9KnL_D13MXTIn-
0yYx6I1"></script>
</body>
</html>

Speed: make your sites load faster
Your website's visitors care whether or not it loads quickly. Tom Gullen shows you how to make sites render faster, and runs down why you should be doing this
Knowledge needed Intermediate CSS and JavaScript, basic HTML
Requires A website to speed up
Project time Highly dependent on website
Speed should be important to every website. It's a well-known fact that Google uses site speed as a ranking metric for search results. This tells us that visitors prefer fast websites (no surprise there!).
Jakob Nielsen wrote in 1993 about the three limits of response times (see www.useit.com/papers/responsetime.html); although the research is old by internet standards, our psychology hasn't changed much in the intervening 19 years. He states that if a system responds in under 0.1 seconds it will be perceived as instantaneous, while responses faster than one second enable the user's thought flow to remain uninterrupted. Having a web page load in 0.1 seconds is probably impossible; around 0.34 seconds represents Google UK's best load time, so this serves as a more realistic (albeit ambitious) benchmark. A page load in the region of 0.34 to 1 second is achievable and important.


The price of slowing down

One of the first things to look at is the size of your HTML code. This is probably one of the most overlooked areas, perhaps because people assume it's no longer so relevant with modern broadband connections. Some content management systems are fairly liberal with the amount of code they churn out, which is one reason why it can be better to handcraft your own sites.
As a guideline you should easily be able to fit most pages in under 50KB of HTML code, and if you're under 20KB then you're doing very well. There are obviously exceptions, but this is a fairly good rule of thumb.
It's also important to bear in mind that people are browsing full websites more frequently on mobile devices now. Speed differences between sites viewed from a mobile are often more noticeable, owing to them having


These sorts of targets have real world implications for your website and business. Google's Marissa Mayer spoke in 2006 about an experiment in which the number of results returned by the search engine was increased to 30. This slowed down the page load time by around 500ms, with a 20 per cent drop in traffic being attributed to this. Amazon, meanwhile, artificially delayed its page load in 100ms increments and found that even very small delays resulted in substantial and costly drops in revenue.
Other adverse associations linked with slow websites include lessened credibility, lower perceived quality and the site being seen as less interesting and attractive (see http://netm.ag/webpsychology-231). Increased user frustration and increased blood pressure are two other effects we have probably all experienced at some point! But how can we make sure our websites load speedily enough to avoid these issues?

slower transfer rates than wired connections. Two competing websites with a 100KB size difference per page can mean more than a one-second difference in load time on some slow mobile networks, well into the interrupted thought flow region specified by Jakob Nielsen. The trimmer, faster website is going to be a lot less frustrating to browse, giving it a distinct competitive edge over fatter websites and going a long way towards encouraging repeat visits.
One important feature of most web servers is the ability to serve HTML in a compressed format. As HTML by nature contains a lot of repeating data, it makes a perfect candidate for compression. For example, one homepage's 18.1KB of HTML is reduced to 6.3KB when served in compressed format. That's a 65 per cent saving! Compression algorithms increase in efficiency the larger the body of text they have to work from, so you will see larger savings

Free speed Open source web page performance grading browser plug-in YSlow (yslow.org) is based on the Yahoo Developer Network's website performance recommendations

Tooled up There are alternative high quality resources for measuring performance, such as Google's free web-based PageSpeed Online tool



with larger HTML pages. A 138.1KB page on a popular forum is reduced to 25.7KB when served compressed, a saving of over 80 per cent, which can significantly improve total transfer times.
There are virtually no negatives to serving HTML in this form; everyone should be enabling it for all their HTML content. Some web servers have different settings for compressing static and dynamically generated content, so it's worth ensuring you're serving compressed content for both if possible.

Content delivery networks


Content delivery networks (known as CDNs) can also significantly improve
load times for your website. CDNs are a collection of servers distributed
across the globe that all hold copies of your content. When a user requests
an image from your website thats hosted on a CDN, the server in the CDN
geographically closest to the user will be used to serve the image.
There are a lot of CDN services available. Some of these are very costly but
advertise that they will offer better performance than cheaper CDNs. Free CDN
services have also started cropping up, and may be worth experimenting with
to see if they can improve performance on your website.
One important consideration when using a CDN is to ensure that you set it up correctly so you don't lose any SEO value. Depending on the nature of your website, you may be receiving a lot of traffic from images hosted on your domain, and moving them to an external domain might adversely affect that traffic. The Amazon S3 service enables you to point a subdomain to its CDN, which is a highly preferable feature in a CDN.
Serving content on a different domain (such as a CDN), or on a subdomain of your own domain name that doesn't set cookies, has another key benefit. When a cookie is set on a domain, the browser sends cookie data with each request to every resource on that same domain. More often than not, cookie data is not required for static content such as images, CSS or JavaScript files. Web users' upload rates are often much slower than their available download rates, which in some cases can cause significant slowdown in page load times. By using a different domain name to serve your static content, browsers will not send this unnecessary cookie data, because they have strict cross-domain policies. This can speed up the request times significantly for each resource.
Cookies can also take up most of an HTTP request; 1,500 bytes is around the most commonly used single-packet limit for large networks, so if you are able to keep your HTTP requests under this limit, the entire request should be sent in one packet. This can offer improvements in page load times. Google recommends that your cookies should be less than 400 bytes in size; this goes a long way towards keeping your website's HTTP requests under the one-packet/1,500-byte limit.
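To get a rough feel for whether a request fits in a single packet, you can total up the request line and header lines yourself. A Python sketch with made-up header values (real browsers send more headers than this):

```python
def request_size(method, path, headers):
    """Approximate the on-the-wire size of an HTTP/1.1 request:
    request line + header lines + the final blank line, CRLF-terminated."""
    size = len(f"{method} {path} HTTP/1.1\r\n")
    for name, value in headers.items():
        size += len(f"{name}: {value}\r\n")
    return size + 2  # the blank line that ends the header block

headers = {
    "Host": "static.example.com",
    "User-Agent": "Mozilla/5.0",
    "Accept": "image/png,image/*;q=0.8",
    "Cookie": "session=abc123",  # hypothetical cookie value
}
# Well under the 1,500-byte single-packet limit with a small cookie;
# a few kilobytes of cookie data would push it over.
print(request_size("GET", "/img/logo.png", headers))
```

Serving static assets from a cookie-less domain drops the Cookie line entirely, which is often the largest single header.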

One-piece Sprite sheets are easy to implement and can offer significant improvements
on page performance by reducing the total number of HTTP requests

Pingdom Chasing waterfalls


The free Pingdom tool (www.pingdom.com) for measuring website load times is a valuable resource for gauging the impact your changes have on your site. Access it at http://tools.pingdom.com.
The tool calculates your page's waterfall. This is simply the list of requests made to load the page you are testing. It helps you to quickly identify the most costly resources on your page in terms of loading time. Every individual resource in the waterfall is broken down into segments showing you exactly how long it takes to fetch. The resource load times are broken down into the sections DNS, SSL, Connect, Send, Wait and Receive, which represent a chronological overview of how each resource is resolved.
DNS describes how long it takes the browser to look up the DNS information needed to begin resolving the request. If your page uses SSL, the SSL segment shows you how long it takes to perform an SSL handshake. After this information is obtained, the browser will attempt to establish a connection to the web server, which is represented by the Connect segment. With many requests, information is sent from the browser to the server (such as cookie information), and the time it takes to do this is represented by the Send segment.
After all these actions, all that's left is for the web server to send the data to the browser, represented by the Receive segment. The Wait segment is simply the gap between Send and Receive. Requests for resources on web servers are a lot more involved than they might seem, and a tool like Pingdom is useful for identifying specific bottlenecks, such as a slow DNS lookup.
Each time you run the performance test Pingdom will save the result so it can be viewed historically, which is extremely useful for observing the effect of your changes over time.

Further techniques
There are other, easier to implement techniques that can offer great benefits to your site's speed. One is to put your JavaScript files at the end of your HTML document, just before the closing body tag, because browsers have limits on how many resources they can download in parallel from the same host.
The original HTTP 1.1 specification, written in 1999, recommends that browsers should only download up to two resources in parallel from each hostname, but modern browsers by default have a limit of around six.
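A quick back-of-envelope illustration in Python of why that per-host limit matters. This is a simplification that ignores connection reuse and varying response times:

```python
import math

def download_rounds(resources: int, parallel_limit: int) -> int:
    """Rounds of requests needed when a browser fetches at most
    `parallel_limit` resources from one host at a time."""
    return math.ceil(resources / parallel_limit)

# 24 images on one host, with a typical modern limit of six connections:
print(download_rounds(24, 6))   # 4 rounds
# The same images split across two static domains: two rounds per domain,
# and the domains are fetched concurrently.
print(download_rounds(12, 6))   # 2 rounds
```

This is also why bundling helps: fewer resources means fewer rounds of queued requests, regardless of the per-host limit.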

On your marks Pingdom's speed-testing interface couldn't be simpler, and results are stored so that you can review changes in loading time

The SEO Handbook 113

Tutorials

Testing time Pingdom's free tool for analysing the waterfall of your web page helps
break down each resource's load time, which can help point out bottlenecks

Image: http://en.wikipedia.org/wiki/File:NCDN_-_CDN.png

Break it down Google Analytics has several useful tools and reports inside it that can
help you identify the slowest pages on your website

YSlow: open source analysis

YSlow is an open source plug-in for all major web browsers (except
Internet Explorer) that can help analyse the speed of your web pages.
It uses Yahoo's rules for high performance websites to suggest ways in
which you can improve the page's performance, and can be downloaded
for free from http://yslow.org.
Yahoo's rules for high performance websites are comprehensive
and regarded as one of the best sources of information on how
to speed up your website, so YSlow is an exceptionally useful tool. It
will quickly point out issues that you may have assumed were set up
properly but were not. For example, you may assume that all your text
content is served gzipped when in fact some elements may not be, or you
may not be using correct Expires headers on some resources. YSlow will
also highlight any non-minified CSS/JavaScript files and duplicate CSS/
JavaScript files, as well as a host of other useful and often easily fixed
problems that can have noticeable effects on your website's performance.
Installing and running YSlow is a quick and easy process, and it will
immediately highlight problem areas. However, sometimes you will come
across recommendations from YSlow that are out of your control. For
instance, on our website (www.scirra.com) it recommends we need to
configure our entity tags (ETags). Unfortunately on IIS it isn't possible to
configure these to meet YSlow's recommendation (and configuring the
ETags would provide such a minor performance improvement that it
shouldn't be of too much concern to you anyway).
If you are using third-party plug-ins and scripts, you may also find they
do not fully satisfy YSlow. You will sometimes have to make judgement
calls on whether finding alternatives or removing such plug-ins
will actually yield performance improvements that make your efforts
worthwhile. More often than not, the answer is no.

Attention please YSlow's website performance scorecard, which includes a
colour-coded breakdown showing areas that rate well, and those needing work

Spread the load The right-hand image shows how content is distributed on a CDN,
compared with a traditional one-server setup on the left side
If your web page has more than six external resources (such as images,
JavaScript and CSS files) it may improve performance to serve
them from multiple domains (such as a subdomain on your main domain
name, or a CDN) to ensure the browser does not hit its limit on
parallel downloads.
Rather than splitting multiple requests onto different domains, you may
consider combining them. Every HTTP request has an overhead associated
with it. Dozens of images such as icons on your website served as separate
resources will create a lot of wasteful overhead and cause a slowdown on
your website, often a significant one. By combining your images into one
image, known as a sprite sheet, you can reduce the number of requests
required. To display an image you define it in CSS by setting an element's
width and height to those of the image you want to display, then setting the
background to the sprite sheet. By using the background-position property
we can move the background sprite sheet into position so it appears on your
website as the intended image.
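As a sketch of the CSS side (the class names and the icons.png file are invented for illustration, assuming a 64x32px sheet holding two 32x32px icons side by side):

```css
.icon {
  width: 32px;
  height: 32px;
  background-image: url(img/icons.png); /* one request covers every icon */
}
.icon-home {
  background-position: 0 0;       /* first icon: top-left of the sheet */
}
.icon-search {
  background-position: -32px 0;   /* shift the sheet left to reveal the second icon */
}
```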
Sprite sheets also offer other benefits. If you're using mouseover images,
storing them on the same sprite sheet means that when the mouseover is
initiated there is no delay, because the mouseover image has already been
loaded in the sprite sheet. This can significantly improve the user's perceived
loading time and create a much more responsive-feeling website.
Specifying the dimensions of any other images in <img /> tags is also an
important factor in improving your web page's perceived loading time. It's
common for developers not to explicitly set width and height for images on
pages. This can cause the page's layout to expand in jumps as each image
(partially) loads, making things feel sluggish. If explicit dimensions are set,
the browser can reserve space for the image as it loads, stopping the page
layout from changing and often significantly improving the user's perceived
loading time.
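For example (the file name and dimensions here are illustrative):

```html
<!-- Explicit dimensions let the browser reserve 300x200px straight away,
     so the layout doesn't jump around as the image downloads -->
<img src="img/photo.jpg" width="300" height="200" alt="A photo" />
```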
So what else can we do to improve this? Prefetching is one such feature
available in HTML5. Prefetching enables loading of pages and resources before
the user has actually requested them. Its support is currently limited to Firefox
and Chrome (with an alternative syntax). However, its ease of implementation
and usefulness in improving the perceived loading time of your web page are
so great that it's something to consider implementing.
<!-- Firefox prefetching -->
<link rel="prefetch" href="http://www.example.com/page2.html">
<!-- Chrome prerender -->
<link rel="prerender" href="http://www.example.com/page2.html">
<!-- Both in one line -->
<link rel="prefetch prerender" href="http://www.example.com/page2.html">


There is a behavioural difference between prefetch and prerender. Mozilla's
prefetch will load the top-level resource for a given URL, commonly the HTML
page itself, and that's where the loading stops. Google's prerender loads child
resources as well and, in Google's words, 'does all of the work necessary to
show the page to the user, without actually showing it until the user clicks'.

Prefetching and prerendering considerations


But using this feature also comes with important considerations. If you
prerender or prefetch too many assets or pages then the user's entire browsing
experience may suffer, and if you have any server-side statistics these can become
heavily skewed. If the user doesn't click the preloaded resource and exits your
website, your stats tracker may count the visit as two page views, not the
actual one. This can be misleading for important metrics such as bounce rates.
Chrome's prerender has another caveat developers need to be aware
of, in that the prerendered page will execute JavaScript. The prerender will
load the page almost exactly as if the link had been clicked on
by the user. No special HTTP headers are sent by Chrome with a prerender;
however, the Page Visibility API enables you to distinguish whether the page is
being prerendered. This is crucially important for any third-party scripts
that you're using, such as advertising scripts and statistics trackers (Google
Analytics already makes use of the Page Visibility API, so you don't have to
worry about that). Improperly handling these scripts with the Page Visibility
API again runs the risk of skewing important metrics.
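A minimal sketch of that check might look like this; shouldTrackNow is an invented helper name, and in the browser the state would come from document.visibilityState (or the prefixed webkitVisibilityState in Chrome builds of the time) rather than being passed in:

```javascript
// Only fire analytics once the page is actually visible, not while
// Chrome is still prerendering it in the background.
function shouldTrackNow(visibilityState) {
  return visibilityState !== 'prerender';
}

// Browser wiring (illustrative; trackPageView is a hypothetical tracker call):
// if (shouldTrackNow(document.visibilityState)) {
//   trackPageView();
// } else {
//   document.addEventListener('visibilitychange', function () {
//     if (shouldTrackNow(document.visibilityState)) trackPageView();
//   });
// }
```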
Using prefetch and prerender on paginated content is probably a safe and
useful implementation, for example on a tutorial web page that is split into
multiple sections. Especially on content like tutorials, it's probably important to
keep within Nielsen's 'uninterrupted thought flow' boundaries.
Google Analytics can also give valuable clues as to which pages you may
want to prerender or prefetch. Using its In-Page Analytics you can determine
which link on your homepage is most likely to be clicked. In some cases, with
highly defined calls to action, this percentage might be extremely high, which
makes the target page an excellent candidate for preloading.
Both prefetching and prerendering work cross-domain, an unusually
liberal stance for browsers, which are usually extremely strict on cross-domain
access. However, this probably works in Google's and Mozilla's favour because
they are able to create a faster browsing experience for their users in several
ways, offering a significant competitive edge over other browsers that don't
yet support such features.
Prefetching and especially prerendering are powerful tools that can
significantly improve the perceived load times of web pages. But it's
important to understand how they work so your users' browsing experience
is not directly and negatively affected.

Ajax content loading

Another way to improve loading times is to use Ajax to load content, as
opposed to loading the entire page again. This is more efficient because it's
only loading the changes, not the boilerplate surrounding the content each time.
The problem with a lot of Ajax loading is that it can feel like an unnatural
browsing experience. If not executed properly, the back and forward buttons
won't work as the user expects, and performing actions such as bookmarking
pages or refreshing the page also behave in unexpected ways. When
designing websites it's advisable not to interfere with low-level behaviours
such as these: it's very disconcerting and unfriendly to users. A prime example
of this would be the efforts some websites go to to disable right-clicking on
their web pages, in a futile attempt to prevent copyright violations. Although
implementing Ajax doesn't affect the operation of the browser with the same
intention as disabling right-clicking, the effects are similar.

Situation normal The Chrome Web Store loads a lot of content with Ajax in a way
that feels like a fast, natural browsing experience
HTML5 goes some way to addressing these issues with the History API. It is
well supported in browsers (apart from Internet Explorer, though support is
planned for IE10). Working with the HTML5 History API, we can load
content with Ajax while at the same time simulating a normal browsing
experience for users. When used properly, the back, forward and refresh
buttons all work as expected. The address bar URL can also be updated,
meaning that bookmarking now works properly again. If implemented
correctly you can strip away a lot of repeated loading of resources, as well as
having graceful fallbacks for browsers with JavaScript disabled.
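A rough sketch of that wiring, with every helper name invented for illustration (fragmentUrl assumes the server can serve a content-only fragment alongside each full page):

```javascript
// Map a page URL to the URL of its hypothetical content-only fragment.
function fragmentUrl(pageUrl) {
  return pageUrl.replace(/\.html$/, '.fragment.html');
}

// Browser wiring (illustrative; loadFragment would fetch the fragment with
// Ajax and swap it into the content area):
// document.addEventListener('click', function (e) {
//   var link = e.target.closest('a[data-ajax]');
//   if (!link) return;
//   e.preventDefault();
//   loadFragment(fragmentUrl(link.href));
//   history.pushState({ url: link.href }, '', link.href); // address bar stays honest
// });
// window.addEventListener('popstate', function (e) {
//   if (e.state) loadFragment(fragmentUrl(e.state.url)); // back/forward work again
// });
```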
There is a big downside, however: depending on the complexity and
function of the site you are trying to build, implementing Ajax content
loading with the History API in a way that is invisible to the user is difficult. If
the site uses server-side scripting as well, you may also find yourself writing
things twice, once in JavaScript and again on the server, which can lead
to maintenance problems and inconsistencies. It can be difficult and time-consuming
to perfect, but if it does work as intended you can significantly
reduce actual as well as perceived load times for the user.
When attempting to improve the speed of your site you may run into some
unsolvable problems. As mentioned at the start of this article, it's no secret
that Google uses page speed as a ranking metric. This should be a significant
motivation to improve your site's speed. However, you may notice that
resources such as Google Webmaster Tools' page speed reports
will report slower load times than you would expect.
The cause can be third-party scripts such as Facebook Like buttons or
Tweet buttons. These can often have wait times in the region of hundreds of
milliseconds, which can drag your entire website load time down significantly.
But this isn't an argument to remove these scripts; it's probably more
important to have the social media buttons on your website. These buttons
usually occupy relatively small spaces on your page, so they will not significantly
affect the visitor's perceived loading time, which is what we should primarily
be catering for when making speed optimisations.
For more ways to speed up your site visit: www.netmagazine.com/features/15-surefire-ways-speed-your-site

About the author


Name Thomas Gullen
Site www.scirra.com
Twitter @scirra
Areas of expertise Web development, ASP.NET (C#),
JavaScript, SEO
How do you like your eggs? Grown up into a chicken


Downsizing A screenshot from the IIS7 web server showing how easy it is to enable
compression of both static and dynamic content

CDNs: improve page load times
Dean Hume explains how content delivery networks can drastically increase
the performance of your website by serving data thats closer to your users
Knowledge needed HTTP web concepts, FTP tools
Requires Web browser for profiling (Firebug, Chrome Dev Tools)
Project time 30 mins

Content delivery networks (CDNs), put simply, are collections of
web servers that are distributed across multiple locations around
the world in order to deliver content more efficiently to users. The
goal of a CDN is to serve content to end users with high availability and
high performance. So what does this mean for us as developers, and why
should you use a CDN for your files?
Whenever a user makes a request to your website that is hosted in, say,
New York, while they are based in London, the internet tubes must connect
from the user's location in London to the datacentre in New York. This
means that your users' requests will need to make a round trip across the
world in order to retrieve a file from the server. Imagine not having to make
such a journey to fetch these files: this is where a content delivery network
comes in. Since static files such as images, JavaScript and CSS don't change
very often, there is no reason that they can't be served to the user by another
server that is geographically closer to them. It's a shorter distance to travel,
and this means quicker response times.
The benefits of using a CDN extend far beyond just brilliant response
times; using a CDN additionally reduces the amount of bandwidth and
requests that are served from your website. You get all the benefits of
caching, Gzipping and a wider network that reduces the amount of
bandwidth that is consumed by your website. A CDN also increases the
number of files that a browser can download in parallel. Most browsers only
allow you to download three or four files at a time from one domain. Using
a CDN will enable the user's browser to download more files in parallel,
improving their response times.

OK, so the benefits are apparent, but are large companies the only parties
that can afford to use a content delivery network? The answer's no: CDN
technology is commercially available to all developers and it is highly
affordable. You'll pay only for the file storage space and outgoing bandwidth
that you actually use. I use a CDN for my personal blog that receives a few
thousand hits a month and I only pay 30p a month.

Performance gains
The most important part developers can play in enhancing the browsing
experience for users is improving the speed and response times of our
applications. At Yahoo a test was conducted, and the sites that moved static
content off their application web servers and onto a CDN improved end-user
response times by 20 per cent or more (see http://developer.yahoo.com/performance/rules.html#cdn).
Now, you may not get this level of improvement, but even a performance
increase that is close to that is worth it in my opinion.
Steve Souders is the head performance engineer at Google and originally
coined the term 'the performance golden rule'. This states that developers
should 'optimise frontend performance first [because] that's where 80 per
cent or more of the end-user response time is spent'. Just think about all
the static components that are in your web pages: images, style sheets,
JavaScript and so on.
If we can improve the performance of these static files, we can
make big gains in terms of users' perception of our sites.

Commercial content delivery networks

In this article I aim to compare three of the market's leading commercial
CDNs: Amazon Cloudfront, Windows Azure CDN and Rackspace CDN. I will
be putting these cloud products head to head on three key features: CDN
response times, price and ease of use. Although I have only profiled three,

Feeling blue The Windows Azure management portal is an intuitive interface that's a
great dashboard for managing your CDN and cloud instances

Canine control This image was used as a benchmark across all the content delivery
networks tested


Business impact
In 2006, Google discovered that shifting from a 10-result page loading
in 0.4 seconds to a 30-result page loading in 0.9 seconds reduced traffic
and ad revenues by 20 per cent. When the Google Maps homepage was
shrunk from 100KB to 70-80KB, traffic rose 10 per cent in the first week,
and an additional 25 per cent in the following three weeks.
Over at Amazon it was found that every 100ms increase in load time
of the www.amazon.com homepage decreased sales by 1 per cent (for
more see www.websiteoptimization.com/speed/tweak/psychology-webperformance). Other major online players have revealed similar results:

there are still some other great CDNs out there worth looking into: check
out CacheFly (www.cachefly.com), EdgeCast (www.edgecast.com), GoGrid
CDN (http://netm.ag/gogrid-232) and Google App Engine
(https://appengine.google.com).

The testing process

To compare the different response times, I uploaded a small image (8KB) and
used this as a benchmark across all the CDNs. I tested over the period of a
week, at different times of the day and using different servers across the
world; these included ones in London, New York, Tokyo and Sydney. I
noticed that the response times can vary wildly between fresh visits, so in
order to get the most accurate picture I needed to gather a lot of data.
Response times are also all relative to your geographic location and
internet connection speed. To check your response times against a few
different CDNs, head to www.cloudclimate.com/cdn-speed-test: the site
offers a graph that details the average response time for your connection
as well as the global average.
It's important to remember that when testing for response times you
need to use the non-cached version of the image. CDNs add an expiry
header, and simply refreshing your browser might give you the cached
version instead of hitting the server to fetch the image again. By hitting
Ctrl+F5 in your browser, you will request a fresh version of the file.

Website performance can have a big impact on business success. By
reducing page load times, users will instantly see the benefits, and
hopefully so will your conversion rates.
If we look at it from a time-based point of view, every second that a
user spends waiting for a resource to download means fewer search
queries on your site, fewer clicks and therefore less browsing on your site.
Time is money!
In terms of sustainability, improved website page speeds also
decrease operating costs by reducing hardware requirements and
bandwidth, which in turn reduces carbon footprint. Either way you look
at it, it's a win-win situation.
Google has also introduced page speed as a determining factor in
search rankings; if other search engines follow, it makes it more
important than ever to improve our websites (see http://netm.ag/souders-232).

Amazon Cloudfront
Amazon is by far the most popular CDN option out there. The company has
also been creating cloud services for many years and has built up a great set
of products. I have been using Amazon Cloudfront for a few months now
and so far it has been fast, cheap and relatively easy to set up. As with
most CDNs in the market today, Amazon charges only for the content that
you deliver through the network and there's no monthly fee.
Prices compared with the rest of the CDN services are very competitive,
and it seems that by default the cheapest storage solutions are in both the
US and Europe. I do feel that setting up an Amazon Cloudfront solution
was a little more difficult compared to the other CDNs; it seemed aimed more
at developers than junior webmasters. However, once set up it was easy to use
and very efficient.
One downside that I noted about Amazon Cloudfront is that there isn't
native support for Gzip. It can be enabled using custom scripts, but it would
be nice if it were offered as part of the package. Customer service also isn't
included by default, and you need to pay a bit in order to receive this facility.
This could make things tricky if you are a new developer starting out with
cloud tools.

Shining example Chrome Dev Tools come bundled with the Chrome browser
and offer developers a great way to investigate and develop applications

Rackspace CDN
Rackspace comes across as a bit of a dark horse in the CDN department.
Pricing was really attractive (similar to Amazon's Cloudfront), as well as being
simple to calculate and flexible. Rackspace uses the Akamai network as a
base for its CDN service.
Akamai has been around for a long time now, and has a massive global
network with servers deployed in 72 countries. Akamai is also the world
leader in content distribution, boasting 73,000 servers around the world, so
by using this option you get all the benefits of a world-class network
without prohibitive costs.


Functional fowl Cyberduck is a cloud storage browser for Mac and Windows. It
enables you to upload files to Amazon, Rackspace and Google Storage with ease

- Yahoo found a 400ms delay caused a 5-9 per cent decrease in traffic.
- Bing discovered that a two-second delay caused a 4.3 per cent drop in revenue per user.
- Mozilla made its download page 2.2 seconds faster and was rewarded with an increase of 15.4 per cent in downloads.
- Netflix enabled Gzip on the server; by this single action pages became 13-25 per cent faster and it saved 50 per cent of traffic volume!


Profiling tools
Most modern browsers come with built-in developer tools enabling you
to see the network usage of your site. I find it extremely useful to use
the dev tools that Chrome has on offer; to fire them up, simply hit F12
in your browser and they will pop up. There is also a Google Page Speed
add-on available that will integrate with Chrome Developer Tools. It
is very useful for determining a rating for your site's page speed, and
also gives you a list of areas that you can improve on.
Alternatively, Google offers a web page that enables you to profile
your site without having to install any plug-ins. Try it out at
https://developers.google.com/speed/pagespeed/insights. The site will also
allow you to profile your site for mobile devices.
My other favourite tool is Firebug, a plug-in available for
Firefox and more recently Chrome. Firebug has an advanced JavaScript
debugger as well as the ability to accurately analyse network usage and
performance. Head to http://getfirebug.com to download it.

The set-up was easy and even included an online tool that I could
use to upload files. I also came across a great utility online
called Cyberduck (http://cyberduck.ch). This offers an FTP-esque file explorer
that you can quickly connect to your Rackspace storage account. This tool
can also be used with any Amazon storage account.
In contrast to Amazon, with Rackspace Gzip compression is automatically
enabled for any static file that you upload; you receive this feature
automatically. The client support for Rackspace also seems quite impressive:
it offers a permanent phone service for client queries.

Windows Azure CDN

Windows Azure is a relative newcomer to the cloud marketplace. There has
been a big marketing push recently by Microsoft to promote its cloud
services. I think the company has done a great job in providing an easy-to-use
service that is available to 'all languages and tools and any framework'.
That's right: all languages. If you are using node.js, Java, PHP or .NET you
will be able to use the Windows Azure CDN. Microsoft has gone all out to
make this an open cloud platform that is available to all developers.
I found the service really easy to set up, with loads of great tutorials
available on the Windows Azure (www.windowsazure.com) website. I
managed to get up and running in 15 minutes. There is also a free 90-day
trial that enables you to test out the services before deciding whether to buy
them, which is a great option.
When it came to trying to upload files to the Azure CDN, however, I found
things a little less user-friendly than expected. There weren't many tools
online that enabled me to simply upload files to my instance in an FTP-like
manner. Every time, I needed to publish my entire application in order to
upload the files.

Money management The Rackspace application gives a breakdown of your billing
and upcoming expenses. It has a friendly UI that allows for easy upload of resources

I would like to have seen an open source tool, as has been developed for
the other services, but hopefully this will come in time. Overall, Windows
Azure is a great CDN that's easy and fuss-free to set up.

Response times (London)

                     Average response    Fastest response    Slowest response
Rackspace CDN        18ms                8ms                 47ms
Amazon Cloudfront    37ms                17ms                57ms
Azure CDN            17ms                8ms                 59ms

Pricing

                     Storage                      Bandwidth out
Rackspace CDN        $0.10 / GB                   $0.12 / GB
Amazon Cloudfront    $0.0075 / 100 transactions   $0.12 Europe & N America; $0.19 other (both variable)
Azure CDN            $0.1252 / 100 transactions   $0.12 Europe & N America; $0.19 other (both variable)

Comparing
As you can see from the tables above, there is little between the three CDNs
and all offered a superb service. But if I were to choose a winner it would
have to be Rackspace: it consistently offered the best response times, the
customer support has been great and the price is very competitive.

Conclusion
It is really easy to get set up with a CDN, and if you only made one change
to your site today, serving your static files from a CDN would improve your
performance significantly. You could even have one CDN account serving
loads of different websites that you work on. In this article I've reviewed a
few of the different CDN services out there, but whichever one you decide to
go with, your users can only benefit!

About the author

Global improvements No matter which CDN you choose, your customers will
benefit from having closer access to static files

Name Dean Hume
Site www.deanhume.com
Areas of expertise C#, ASP.NET-MVC, jQuery, HTML5, NUnit, MOQ
Job Software developer
What do you wish you'd invented? Cheese. I love the stuff!


CSS: build a basic responsive site

Download the files! The files you need for this tutorial can be found at http://netm.ag/resp-231

Responsive design is much misunderstood. Jason Michael lays to rest some
myths, and then walks us through building a simple responsive website

Knowledge needed Basic CSS and HTML
Requires Text editor
Project time 1-2 hours

Everyone's talking about responsive web design. But does everyone
understand what it's for? I'm not sure. Many web designers and
developers seem to me to have misunderstood the problem
it's trying to solve.
Put simply, it's not about making sites for mobile devices; it's about adapting
layouts to viewport sizes.
Over the next few pages, I'll look at the principles behind responsive web
design in detail, so we're sure to understand the concepts correctly. Once we've
got that out of the way, I'll walk you through building a website that scales
perfectly on both large and small screens.
Responsive web design has mainly become a hot topic because more and
more people are using mobile devices such as iPhones, iPads and BlackBerrys
to access the internet.
So it's becoming increasingly important to understand that a website should
not be specifically about either the desktop or the mobile device, but about
building in such a way that its layout adapts to varying viewport sizes.
If you think about the new inventions we will inevitably see in the future,
then an adaptive layout that can automatically respond to the user's preference
becomes an indispensable and highly valuable commodity.
One of the main reasons media queries have become more popular is the
fact that websites are unusable on devices they weren't considered for during
the design and build phases. They become fiddly to navigate around, or maybe
the fixed width is wider than the user's viewport, making it difficult to zoom in,
pan, zoom out and find what they are looking for.
Frustrating? For sure. But more frustrating as a developer is that these
websites should have been built in such a fashion that they scale down to fit
any viewport size.
Many sites using media queries strip out information, hiding certain aspects
of the site that they deem less important. So the user with a smaller device gets
an easier-to-use website, but with stripped-down content.
But why should I, on a mobile device, not get the same benefits from a
website as a desktop user?
With the help of media queries we can completely customise the layout of
our website dependent on screen size. Which is great, but do we really need to
supply several adaptations of our site? And why should we settle for a site
that's so badly designed or built that it can't scale gracefully?

User frustration
Some people believe that it's okay to cut features and eliminate content they
believe is non-essential to the user. But how can you be sure that the
information you are cutting or pushing to a secondary page is not the content
that is most important to me? You can't.
As an example, I was on the Nike Football website on my MacBook,
reading about the football academy the company is running with the Premier
League, which I found really interesting; it's one of the main features as you
arrive at the website.

Desktop delights Nike Football's full site features main navigation offering all available options, including the feature on
the company's football academy that's visible in the image above

Scaled back But view the site on an iPhone and the academy features nowhere to be seen

Further reading (1)

There's a wealth of great articles and tutorials out there for anyone
wishing to learn more about responsive design. Here are just a few.

Responsive Web Design
The article by Ethan Marcotte for A List Apart that started it all. 'Fluid
grids, flexible images, and media queries are the three technical
ingredients for responsive web design,' he writes, 'but it also requires
a different way of thinking. Rather than quarantining our content into
disparate, device-specific experiences, we can use media queries to
progressively enhance our work within different viewing contexts.'
- www.alistapart.com/articles/responsive-web-design

5 really useful responsive web design patterns

Just padding The iPad version of Nike's site says 'train like a pro', but the desktop
version's football academy article can't be found here at all

However, when I tried to show a friend of mine on my iPhone, I discovered
Nike has its own mobile site, and Nike Football consists of just two options: one
about the latest Mercurial Vapor boots (not interested), and one about the new
technology used in Nike's football shirts (not interested).
I tried my iPad and it was completely different again, and still no sign of the
academy information I was looking for.
It's not just Nike that's guilty of this; it's hundreds of sites. And I find it
highly frustrating that I should get penalised for using a different device. I feel
that if content isn't worth showing to smaller-device users, then it probably isn't
worth showing to anybody.
The first thing we need to understand is that responsive web design isn't just
about mobile; it considers all viewport sizes. And secondly, developing a good

Some interesting responsive design patterns that are being


implemented by talented designers all over the web.
l www.designshack.net/articles/css/5-really-useful-responsive-webdesign-patterns

CSS Regions
Adobe's proposal seeks to enable magazine-style layouts to be created
on the web. Find out more at:
l http://dev.w3.org/csswg/css3-regions
l www.css-tricks.com/content-folding
l www.broken-links.com/2011/11/07/introducing-css-regions

responsive website requires more time and effort than just using media queries.
With a vast and growing number of web-enabled devices, it's important to give
your website the best possible chance to facilitate a solid user experience.
We know that by having a responsive site we can use a single codebase. This
is great in that it means we needn't adjust our content for each device. But
many websites hide content deemed unnecessary to mobile users, and there
are two issues with this.
Firstly, it effectively penalises mobile users browsing the website. And
secondly, including a hidden style in our CSS doesn't mean the content doesn't
get downloaded. This can massively affect performance, especially for those on
poor connections.
So perhaps the best way to go about designing a website is to consider
mobile, or smaller devices, first. This way you can focus on the most important
information your site needs to give. And then, if necessary, you can use
conditional loading techniques where your layout grid, large images and media
queries are applied on top of the pre-existing small-screen design.
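As a rough sketch of that conditional, mobile-first layering (the selector, breakpoint and image path here are hypothetical, not part of the tutorial's demo site):

```css
/* Base styles: the small-screen design every device receives */
.feature {
  width: 100%;       /* single column by default, no heavy assets */
}

/* Grid and heavier assets layered on for wider viewports only */
@media screen and (min-width: 600px) {
  .feature {
    float: left;
    width: 60%;
    /* browsers generally only request this image once the query matches */
    background: url(../images/feature-large.jpg) no-repeat;
  }
}
```

The small-screen design is the default, so nothing needs to be undone for mobile; the desktop layout is the enhancement.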
The real reason many full websites are unusable on mobile devices is
because they are unusable on any device. If it's designed well enough in the
first place and built correctly, then it should scale up or down gracefully and
effectively. A responsive site doesn't necessarily have to be targeted at mobile
devices; if it's built correctly it doesn't have to be targeted at any particular
device. It will just work. And Ethan Marcotte sums it up well in his article
'Responsive Web Design' for A List Apart: "Rather than tailoring disconnected
designs to each of an ever-increasing number of web devices, we can treat
them as facets of the same experience," he writes. "We can design for an
optimal viewing experience, but embed standards-based technologies into our

Why should I, on a mobile device, not get the same benefits as a desktop user?
New tricks Chris Coyier's article (www.css-tricks.com/content-folding) explores
CSS Regions, which enables content to flow through multiple elements
designs to make them not only more flexible, but more adaptive to the media
that renders them." In short, we need to practice responsive web design.
With the vast evolution of devices, responsive web design won't fully prevent
us from making changes for new devices, but it should eliminate the need to
make viewport-related changes. We've all been through it: building websites
that don't quite work in IE6. It's an issue that drove us all crazy, and we spent
hours applying hacks to fix it. However, there has never really been that much
of an issue with IE6; it's just that we were building our sites wrong. With a vastly
growing number of web-enabled devices it is important that we build our sites
in a way that allows them to adapt to change.

The walkthrough
For the purpose of this tutorial I have put together a website that scales
beautifully between large and small screens. You keep all the content on all
sizes. And with the use of media queries I have switched the navigation from
a horizontal display to vertical display for smaller devices, and given the user
enough padding on the realigned adaptation to work well on touch screens.
One thing that I especially like, when you view smaller-screen versions of
sites where the main navigation fills the screen area, is the ability to skip to the
content you really want using page anchors. Having this appear at the top of
the page helps prevent mobile users from having to scroll down to get to
the main body of content.


Further reading (2)


Responsive questions
A great post on Jeremy Keith's Adactio blog. "We've always had the
ability to create fluid layouts," writes Keith. "And yet web designers and
developers have wilfully ignored that fact, choosing instead to create
un-webby fixed-width layouts... It was never specifically about mobile
devices or users in a mobile context; it was always about adapting
layout to varying viewport sizes."
l http://adactio.com/journal/5351

The future of CSS layouts


Peter Gasston's article for the .net website takes a detailed look at the
different options for creating rich, dynamic pages.
l www.netmagazine.com/features/future-css-layouts

Why we shouldnt make separate mobile websites


"The reasons many full websites are unusable on mobile devices are
because many full websites are unusable on any device," says Bruce
Lawson in this article for Smashing Magazine. In other words, if it's
designed well enough in the first place and built correctly, then it should
scale down gracefully and effectively.
l http://mobile.smashingmagazine.com/2012/04/19/why-we-shouldntmake-separate-mobile-websites

Responsive layouts using CSS Media Queries


A great blog post from Kyle Schaeffer explaining the basics of
responsive web design layouts.
l http://kyleschaeffer.com/best-practices/responsive-layouts-using-cssmedia-queries

Ethan Marcotte answers your responsive web design questions

The guru of responsive web design answers questions posed by readers of
.net magazine.
l www.netmagazine.com/interviews/ethan-marcotte-answers-yourresponsive-web-design-questions

CSS3 Flexible Box Layout Module


FlexBox offers exciting possibilities for responsive web design.
Find out more details in this excellent article by Peter Gasston.
l www.netmagazine.com/tutorials/css3-flexible-box-model-explained

15 Detailed Responsive Web Design Tutorials


l http://designwoop.com/2012/03/15-detailed-responsive-webdesign-tutorials

From the top Designwoop's 15 detailed responsive web design tutorials offers
plenty for newcomers to the subject to chew on


Degrade beautifully Mediaqueries.es is a nice site that is perfect for inspiration, as it
shows you versions of many sites degrading into smaller screen sizes
The key element of flexibility in responsive design is a fluid layout width.
All you need to do is create a wrapper, content, and column widths
that will adapt to different device widths. It's nothing new, but is now more
important than ever. To keep things simple, I'm going to show you how to
create a fluid page consisting of navigation, feature image and two columns,
which takes into consideration the layout on various sized devices. You'll notice
I've included respond.min.js, which is a lightweight polyfill that enables media
queries to work in IE6-8. Here is the basic HTML structure:

Many sites are unusable on mobile devices because they are unusable on any device
<!DOCTYPE html>
<html lang="en">
<head>
<meta charset="utf-8"/>


<title> Demo | Responsive Web</title>

<meta name="viewport" content="width=device-width, minimum- 
scale=1.0, maximum-scale=1.0" />

<link href="styles/main.css" type="text/css" rel="stylesheet">

<!--[if lt IE 9]>
<script src="//html5shiv.googlecode.com/svn/trunk/html5.js"></script>


<![endif]-->

<script type='text/javascript' src='scripts/respond.min.js'></script>
</head>
<body>
<div id="wrapper">


<header>

<nav id="skipTo">

<ul>

<li>
<a href="#main" title="Skip to Main Content">Skip to Main Content</a>
</li>

</ul>

</nav>
<h1>Demo</h1>


<nav>
<ul>
<li><a href="#" title="Home">Home</a></li>
<li><a href="#" title="About">About</a></li>
<li><a href="#" title="Work">Work</a></li>
<li><a href="#" title="Contact">Contact</a></li>

</ul>



Big guns Viewing the tutorial layout on a large
screen (left), max-width restrains it from expanding too far


</nav>
<div id="banner">

<img src="images/kaws.jpg" alt="banner" />
</div>


</header> 
<section id="main">


<h1>Main section</h1>
<p>Lorem</p>


</section>
<aside>


<h1>Sub-section</h1>
<p>Lorem</p>

</aside>

</div>

</body>
</html>
When it comes to the CSS, setting a max-width is a good idea in order to
stop the site scaling across enormous screens, and this won't prevent the
page from shrinking. One main issue when switching from fixed widths to fluid
is images. And there is a simple fix for this in your CSS. Just set your image's
width to 100%:
/* Structure */
#wrapper {
width: 96%; 
max-width: 920px;
margin: auto;
padding: 2%;
}
#main {

width: 60%;

margin-right: 5%;
float: left;


} 
aside {

width: 35%;
float: right;

}

/* Logo H1 */

header h1 {
height: 70px;
width: 160px;
float: left;
display: block;
background: url(../images/demo.gif) 0 0 no-repeat;
text-indent: -9999px;
}
/* Nav */
header nav {
float: right;
margin-top: 40px; 
}
header nav li {
display: inline;

margin-left: 15px;


}
#skipTo {
display: none;
}
#skipTo li {
background: #b1fffc;

}

/* Banner */ 
#banner {
float: left;
margin-bottom: 15px;
width: 100%;
}
#banner img {

width: 100%;

}
Your image will now display at its parent element's full width and will
contract along with it. Just be sure your image's max-width doesn't exceed the
max-width of its container, otherwise it may pop outside. Remember, to use
this method effectively the image must be large enough to scale up to the
size of your largest set viewport.
Using large images can affect load time, so on smaller viewports
where they are unnecessary there is a responsive image method where


Perfectly formed Reducing screen size (below), the site remains in view and elements contract accordingly


Small victory As screen size reduces, our media queries code shows the skip to content link

Fluid movements Combining a series of grabs (above) enables the impact of transitions between screen sizes to be appreciated fully
you would detect the user's screen size and pull in a smaller/larger image
depending on what was necessary. There are still a few major challenges
with this method, but it is still worth looking into. Mat Marquis, a member of the
jQuery Mobile team, has written a great article on this method in which he explains
the pros and cons: http://netm.ag/respimage-231.
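The article above covers markup- and script-based techniques; for purely decorative images, a CSS-only sketch of the same idea is to serve differently sized background images per viewport (the file names and breakpoint below are hypothetical):

```css
/* Small screens get the lightweight asset */
.banner {
  background: url(../images/banner-small.jpg) no-repeat;
  background-size: 100%;
}

/* Wider viewports swap in the full-resolution version */
@media screen and (min-width: 768px) {
  .banner {
    background-image: url(../images/banner-large.jpg);
  }
}
```

Browsers generally avoid downloading a background image declared inside a media query that never matches, which is what makes this worth considering for performance.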

Main navigation switch

The main reason you may want to switch the navigation is that, scaled down,
it could become unreadable and hard to click. By using this method,
you are enabling the user to access it more easily. You will also notice in the
code that we have made some changes to the #main and aside sections to
switch them to one column.
/* Media Queries */
@media screen and (max-width: 480px) {
#skipTo {

display: block;
}

header nav, #main, aside {

float: left;

clear: left;

margin: 0 0 10px; 

width: 100%;

} 

header nav li {

margin: 0;

background: #efefef;

display: block;

margin-bottom: 3px;

}

header nav a {

display: block;

padding: 10px;

text-align: center;

} 
}
You will notice on some mobile devices that your website automatically
shrinks itself to fit the screen, which is where we get the issues of having to
zoom in to navigate through fiddly content.
To allow your media queries to take full effect, a typical mobile-optimised site
contains something like the following:
<meta name="viewport" content="width=device-width, minimum-scale=1.0, 
maximum-scale=1.0" />


The width property controls the size of the viewport. It can be set to a
specific number of pixels, like width=960, or to the device-width value, which is
the width of the screen in pixels at a scale of 100%. The initial-scale property
controls the zoom level when the page is first loaded. The maximum-scale,
minimum-scale, and user-scalable properties control how users are allowed to
zoom the page in or out.
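Worth noting: locking maximum-scale, as the tag above does, also stops users from pinch-zooming. A commonly recommended, more accessible variant (an alternative, not the tutorial's own markup) still triggers media queries while leaving zoom available:

```html
<!-- Sets the layout viewport to the device width but leaves pinch-zoom enabled -->
<meta name="viewport" content="width=device-width, initial-scale=1.0" />
```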
As I said before, responsive web design has never been about making sites
for mobile devices. It's about adapting layouts to viewport sizes. Having a
responsive site that adjusts to varying viewports should be the default option. If
you wish to create a mobile version that looks completely different and shows

The challenge is about what technologies and methods we use to develop websites

only important content then go ahead, but at least allow the user the choice
to see the full website too. We should concentrate on using the technologies
sitting under the responsive design umbrella to create a better web.
Something that will help us tremendously with fluid layout, and which
I'm very excited about, is the Flexible Box Layout Module. FlexBox, as it's
also known, provides a method of automatically resizing elements within
their parent without having to calculate height and width values. As well
as dynamically changing an element's size, FlexBox can also apply
properties to a parent that control where any empty space is
distributed. If you're not aware of FlexBox then check Peter Gasston's articles
at http://netm.ag/flex1-web.
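As a brief sketch of what that buys us (this uses property names from the modern Flexible Box spec; the drafts current when this was written used older syntax such as display: box, so treat it as illustrative):

```css
/* Parent establishes a flex formatting context */
.page {
  display: flex;
}

/* Columns share the row; spare space is distributed by the flex ratios */
.main {
  flex: 2;   /* grows at twice the rate of the sidebar */
}
aside {
  flex: 1;
}

/* Stack the columns on narrow viewports, no width maths required */
@media screen and (max-width: 480px) {
  .page {
    flex-direction: column;
  }
}
```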
Developers must consider having the same content, organised in the same
manner, for everyone. The minimum standard we should set ourselves
as developers is to create websites that work for everyone, everywhere. l
This tutorial has received a technical review from Stephanie Rieger (@stephanierieger)

About the author


Name Jason Thomas Michael
URL http://jasonthomasmichael.com
Areas of expertise CSS, HTML
Twitter @thejasonmichael
How do you like your eggs?
Poached with rock salt and stacks of pepper!

The Responsive Web Design Handbook

Want to take your responsive design skills further? Then this is
the book for you. Packed with tips, techniques and tutorials, The
Responsive Web Design Handbook will enable you to design and
build brilliant sites that work on multiple devices. Pick it up today!
On sale now at WHSmith, Barnes & Noble, www.myfavouritemagazines.co.uk
and for iPad at http://netm.ag/itunesuk-248

Responsively retrofit older sites

RWD responsively
retrofit older sites

Download the files!

All the files you need for this tutorial can be found at http://netm.ag/rwd-235

You can use responsive techniques on older sites as a first step toward better
small-screen experiences. Check your idealism at the door, says Ben Callahan
Knowledge needed Intermediate CSS and HTML, understanding of
responsive techniques
 Requires Text editor, web browser, inspector, patience!
Project time Less than an hour

A word of warning: the next few pages may be a bit controversial.


Especially if you, like me, are a firm believer in web standards,
semantics, and just plain doing things right. But if you also share a
desire to build great experiences for the users of the sites you work on, you
may find these techniques useful when you just can't start from scratch.
Most of us probably agree that the web is never really done. The real-time
nature of the beast is what makes our medium unique, yet we often choose
File > New over a steady evolution of our sites. The truth is, we don't always
get to start over. And, as Kristopher Layon argues in Mobilizing Web Sites:
Strategies for Mobile Web Implementation (Develop and Design) (http://netm.ag/
layon-235), doing something is better than doing nothing.

Responsive retrofitting defined


For the sake of this article, I'd like to define responsive retrofitting as finding
the fastest and lowest-risk approach to creating a better experience for users of
any size screen.
The key words here are fastest, lowest-risk, and any. We are trying to do
this quickly and efficiently, with minimal risk to the existing desktop-resolution
(whatever that is) site. And these techniques could be used to provide a better
experience at smaller resolutions or at larger ones.

Let's take a look at an existing site (we're going to use the Responsive
Design Twitter account, @RWD) and start to experiment with the grid itself.
Fire up your browser (in my case this is Chrome), head over to
www.twitter.com/rwd, and open the inspector. You should now see something
along the lines of Figure A.
Next, let's drill into the markup a bit. In the body tag, you'll see a div with
an ID of doc. Inside that are two divs, one with an ID of page-outer. Generally, I
start off looking for fixed-width containers.
The #page-outer element doesn't have a width specified in the CSS, so drill
down one level further to the div with an ID of page-container. You'll notice
that this has a width of 837 pixels set in the CSS. We're going to change it to
100% simply by clicking on 837px in the inspector and replacing that with 100%

Despite the web's real-time nature, we often choose File > New over steady evolution
(see Figure B). Immediately, the two columns of content shift to the left and
right of the screen. However, because they are both fixed-width columns, there
is a bit of space left between them (see Figure C).
Also, because this element has padding set:

When we're approached by someone interested in a retrofit, we almost always
start in the browser, playing with the inspector. The very nature of a retrofit
implies that the site already exists somewhere; the inspector gives you an
opportunity to experiment with current markup and styles.

.wrapper, .wrapper-narrow, .wrapper-permalink {



width: 100%; /* was 837px */
padding: 54px 14px 15px;

}

Figure A Here the Responsive Design Twitter account (www.twitter.com/rwd) is being


inspected in Chrome

Figure B Changing the width of the page-container div from 837px to 100% using the
inspector in Chrome

Retrofitting a fluid grid


Retrofits: when to say yes


After tackling a few retrofit projects, we've learned pretty quickly that
they can be like a drug for clients. We've often been able to complete
these projects in less than half the time, for less than half the cost of
starting over. However, not every site can be retrofitted, and not every
site should be. Here's a list of questions we ask to evaluate whether a
site is right for this approach:

its total width is now more than 100% (28px more, to be precise). This
introduces a bit of left/right scrolling in the browser window. We can alleviate
this by adding a new style with the inspector. If you're using Chrome, click the +
symbol (the New Style Rule button) at the top of the Styles palette. If you still
have the #page-container element selected, it will pre-populate the new style
rule's selector with that ID. We're just going to add the box-sizing property and
set it to border-box.
#page-container {
box-sizing: border-box;
}
The box-sizing property (http://dev.w3.org/csswg/css3-ui/#box-sizing) forces any
padding or borders of the element to be laid out and drawn inside the specified
width and height. See Paul Irish's article (http://netm.ag/irish-235) on this
property for cross-browser compatibility and performance concerns.
With this rule applied, you'll see that the browser no longer requires any
horizontal scroll. The 28 pixels of padding (14 pixels on each side of #page-container)
is now counted inside the 100% width, exactly what we need in this
case. Now, let's get those columns flexing a bit.
Inside the #page-container div is a div with a class of dashboard, which
contains the entire left-hand column. Upon inspection, youll see that it has a
width of 302 pixels specified. 302 divided by 837 gives the relative width that
the .dashboard element took up when the layout was locked at 837 pixels. Its
approximately 36%, so well set that in the inspector.
.dashboard {
width: 36%; /* 302/837 = 36ish% */
float: left;
}

Does the existing site have solid UX at higher widths?
This may sound obvious, but if the site doesn't function well for users
in its current state, it will be very difficult, probably impossible, to fix
without solving the core UX problems. If you suspect this to be the case,
I'd recommend starting the project off with a round of usability tests. It's
hard for clients to disagree with real data.
Is the existing markup semantic?
While this isn't always a critical component, especially because this is a
short-term step, it will make your job much easier. We've taken retrofit
projects with very clean and very ugly HTML. I'd always recommend that
you experiment in the inspector or with some static files before signing
a contract.
Does the real timeline require very fast action?
This is always difficult to determine, but you want to make sure that
there is a real reason this is necessary. Is your client responding to
something in the market that requires an immediate change? Did they
just redesign with someone who didn't consider small or very large
resolution devices/displays? It's challenging to really know, but you
need to make sure this is the best thing for them. Remember, our clients
should pay us to offer our honest expertise about what's best for them.
Try to honour that.
I can't stress enough that retrofitting is a step in the right direction.
There are very valid cases where an approach like this will create a much
better experience for our users on smaller or larger devices. Try to keep
the focus on the user and you'll make good decisions, both for your
client and the visitors of their sites!

Taking the same approach with the right column (with a class of content-main),
which has a width of 522 pixels, gives us about 63%. This leaves us a 1% gutter
between the columns, which looks about right in this layout (see Figure D).
.content-main {
width: 63%; /* 522/837 = 63ish% */
float: right;
}
I love this kind of experimentation because it gives you a very good feel for
what's actually possible. As you can see, within just a few minutes and with
only a handful of styles, we're able to get the Twitter site flexing pretty well.
Obviously, I selected Twitter because it's a fairly simple layout and has clean
markup to work with. You may not be so lucky on your project. Remember, this
doesn't mean you can't be successful! However, you will want to try this kind of
in-browser experimentation before you sign a contract.
We don't generally use grid systems in our HTML/CSS work. However,
there are cases where it makes sense, particularly when you're handing

Do it right Performance must be part of the retrofitting process. Here we test
The Boston Globe site on Akamai's Mobitest (http://mobitest.akamai.com)


Figure C The primary container on the Responsive Design Twitter account page,
adjusted from 837px to 100%
Can you overcome the performance challenges?
There's been a lot of talk about performance concerns in RWD lately, and
rightly so. Many of the responsive sites that exist have been built without
page weight, number of requests or the speed of the experience as a
priority. When we take a retrofitting project, this is of primary concern
to us. Our retrofitting efforts are always paired with performance
evaluations and recommendations. It could be a red flag if your client
won't consider this as part of the project. Creating a site with a great
layout for small viewports, but that performs terribly, could do more
damage than good.

Tutorials

Reponsively retrofit older sites

Basic linking of retrofit CSS


The most common approach to linking CSS in retrofitting is to use max-width
media queries to apply a set of styles to viewport widths less than
a certain resolution.
<link rel="stylesheet" href="original.css">
<link rel="stylesheet" href="rwd-lte-60em.css" media="(max-width:
60em)">
<link rel="stylesheet" href="rwd-lte-30em.css" media="(max-width:
30em)">
These links enable you to scope your CSS changes to any viewports that
are less than or equal to 60em in width, and then any that are less than
or equal to 30em. You could also add media query blocks at the bottom
of your existing CSS file:
/* original styles */
@media (max-width: 60em) {
/* styles for 60em or lower */
}
@media (max-width: 30em) {
/* styles for 30em or lower */
}
This will give the same effect, but with only a single request. If you
can't modify your original CSS file(s), you can also combine these two
techniques to limit the maximum number of requests to two for CSS:

<!-- HTML -->


<link rel="stylesheet" href="original.css">
<link rel="stylesheet" href="rwd-lte-60em.css" media="(max-width: 
60em)">
/* CSS */
/* styles for 60em or lower */
@media (max-width: 30em) {
/* styles for 30em or lower */
}
Obviously, it's critical to consider the number of requests, especially on
mobile devices. I'd encourage you to make this a primary factor in the
way you link CSS for your next retrofitting project.

Figure D Here the Responsive Design Twitter page has been made fluid with the
Chrome inspector
templates off to be managed by another organisation. In scenarios
where you need to use a grid system and you already have a CSS
preprocessor in your workflow, you may be able to use one of the semantic
grid systems out there. Two that we've used and had great success with are
The Semantic Grid System (www.semantic.gs) and Susy (http://susy.oddbird.
net). These tools don't require non-semantic class names (which break down in
responsive web design). Generally, they use mixins or functions to re-define the
widths at various breakpoints.
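A hypothetical preprocessor sketch in the spirit of those tools (the mixin name and column counts are invented for illustration; see each project's documentation for its real API):

```scss
/* Widths are computed from ratios, not hard-coded in class names */
@mixin column($span, $total: 12) {
  float: left;
  width: percentage($span / $total);
}

/* Semantic selectors keep the grid maths out of the markup */
.dashboard    { @include column(4); }
.content-main { @include column(8); }

/* Widths re-defined at a breakpoint without touching the HTML */
@media (max-width: 30em) {
  .dashboard,
  .content-main {
    float: none;
    width: 100%;
  }
}
```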

Retrofitting flexible content


In the same way you would approach a greenfield responsive project, once
we have the foundation of our site flexing (using ratios instead of fixed-width
declarations) we need to consider how the content that lives inside that fluid

It's not until you try something new with CSS that you realise how powerful it can be

grid will respond. Text generally does this without much issue, but other
content types can be a bit more of a hassle, especially when you're retrofitting.
Oftentimes, we find that a legacy CMS is writing width and/or height attributes
on the img tag itself. Sometimes we even see inline styles being set. How are
we supposed to handle these kinds of challenges?

Images
After spending some time fighting with inline styles, I've landed on a few tricks
to help you out. In cases where width and height attributes are specified, you
can actually override these with a simple width or height declaration in your
CSS. If you are dealing with inline styles, you can always use the !important
keyword in CSS to override these declarations. Obviously, be careful where
and when you do this, but it does work.
Another trick is to use min-width and/or max-width instead of !important.
Perhaps you want to set an image to fill 100% of its container and to maintain
its aspect ratio. You can do this even if there are inline styles specified, by
setting the min-width and/or max-width to 100%.
<div class="column">
<img src="/i/image.png" alt="alt text" style="width: 200px; height: 100px;">
</div>
In a scenario like this, you could force the image to be flexible by using both
min-width and max-width, like so:

Take it further For more on how to apply media queries in any project check out
the W3C Recommendation for Media Queries (www.w3.org/TR/css3-mediaqueries)


.column {
width: 50%;
}


Figure E Data tables can prove a headache in RWD. Above is the starting point for
our table example, as viewed in Chrome

Essentially, the min-width and max-width rules combine to force the image
to 100%, regardless of the inline widths. Because the height is also specified
inline, we still need to override that with the !important keyword to maintain
the aspect ratio.
This seems to work really well in all modern browsers. IE8 and older give a
bit of trouble with the auto rule on the height. The real point of this exercise is
to encourage the level of exploration that's needed for retrofitting. It's not until
you attempt something you've never done with CSS that you begin to realise
how powerful it can be.
Check out the /Images folder to see some tests I've been running to override
inline styles with CSS. These are by no means exhaustive, so make sure you
combine overriding efforts like this with a healthy dose of testing.
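For reference, a compact summary of which declarations win against an inline style (assuming hypothetical markup like img with style="width: 200px; height: 100px;"):

```css
img {
  width: 100%;             /* loses: the inline width has higher precedence */
  min-width: 100%;         /* wins: the inline style sets width, not min-width */
  max-width: 100%;         /* wins for the same reason */
  height: auto !important; /* wins: !important beats inline declarations */
}
```

The min-width/max-width pair constrains the used width regardless of the inline value, which is why it needs no !important.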

Tables
Tables of data are always a challenge in responsive web design. Particularly
in retrofitting, where you often can't touch the markup, they can make for a
difficult time. Let's look at an example. Here is a pretty standard table, with
some attributes specified to apply style (see Figure E):
<table border="0" bgcolor="#eeeeee">
<thead bgcolor="#000000" style="color: #fff">
<tr>
<th width="100"></th>
<th width="81">Today</th>
<th width="81">Sep 28</th>
...
<th width="81">Jun 28</th>
</tr>
</thead>
<tr>
<th width="100">11 Payments</th>
<td width="81">$27.00</td>
<td width="81">$18.00</td>
...
<td width="81">$18.00</td>
</tr>
...
</table>
This table represents a list of payment schedules, and it's based on an actual
retrofit project that we worked on at Sparkbox. At the end of a series of
questions, this was presented to the user in a modal dialog. Initially, I thought
there was no way. After a few minutes in the inspector, I was able to get this
responding fairly easily.

Fun with retrofitting


After a few retrofitting projects, we started tossing ideas around in the
Sparkbox office. It felt to us that a lot of folks were commenting about
poor mobile experiences, many of which could be made (at least a bit)
more tolerable with some CSS. So, we created the Responsive Retrofitting
Bookmarklet (https://github.com/sparkbox/Responsive-Retrofitting).
The basic idea is that we can put a new style sheet for any site we'd
like in this repository on GitHub, and executing the bookmarklet on that
domain will check for the CSS and inject it onto the page. What this
enables you to do is create a retrofit CSS file for any site you'd like!
I asked a few folks on Twitter if they had any free time for a geek
project one weekend. The multi-talented Phillip Zastrow (@zastrow)
accepted the challenge and created a retrofit CSS file for www.apple.
com. You can check it out by installing the bookmarklet, going to Apple's
homepage*, and running the bookmarklet. Take your browser width
down and watch how the navigation responds! (See image above.)
Zastrow spent just six hours playing with this CSS; read his write-up
for more info on the process (http://netm.ag/phanza-235). One final
note: this only works in WebKit browsers at the time of writing.
*Since the launch of iPhone 5, the homepage content has changed
significantly. The navigation still responds well, but you'll see the photo of
the iPhone itself hasn't been accounted for in Zastrow's retrofit CSS

Here are just a few styles that shift the table around to make it much more
digestible on small screens:
/* make browsers include padding and border inside the width */
* {
-moz-box-sizing: border-box;
-webkit-box-sizing: border-box;
box-sizing: border-box;
}
/* table cells get "display: block" and "float: left" */
th, td {
display: block;
float: left;
text-align: center;
border: 0;
border-bottom: 1px solid #aaa;
}
/* the far left column will be full-width and called out */
th {
width: 100%;
background-color: #000;
color: #fff;
}


.column img {
min-width: 100%;
max-width: 100%;
height: auto !important;
}


Advanced CSS linking


Occasionally, we come across a site where the existing large-res styles
are so complex they would be incredibly difficult to undo for smaller
layouts. In such cases you may want to serve mobile-first styles to
viewport widths leading up to the current site's fixed-width resolution.
The goal with this approach is to have zero impact on the original CSS
for larger viewport widths, but to use a mobile-first CSS structure leading
up to that resolution, without having to undo overly complex styles from
the original CSS. We call this small resolution first, capped:

Figure F After applying a few styles, we have a table that is much more manageable at
smaller viewport widths

/* each inner table cell will fit four-across (25%) */
td {
  width: 25%;
  min-height: 3em;
  border-right: 1px solid #aaa;
}
/* hide the header row */
thead {
  display: none;
}
/* the last option only has one inner cell, make it 100% */
tr:nth-child(5) td {
  width: 100%;
}
Just these few styles enable us to turn this table on its head and produce a
much more usable experience for those viewing on small screens. In this case,
I'm essentially just setting the individual table cells to display: block and floating
them left. Giving the first cell in a row 100% width and each of the remaining
cells 25% stacks them nicely together (see Figure F).
If you really paid attention, you'll notice that I've just completely hidden the
top row of data. In the actual project, these column headers were generated
dynamically on the server based on the current day. Obviously, we can't do that
with CSS. Instead, I actually used nth-child selectors and CSS generated content
to create some more generic headers ('Today', 'Month 2', 'Month 3', and so on).
This isn't an optimal approach, but in a retrofit project it may represent a
valid temporary solution (see the /Tables folder in the example files). We're
doing everything we can to improve the experience as quickly as possible. Then,
we're helping our client understand what the next, long-term step should be.
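The generated-content trick might look something like the following sketch. The selectors and row indices are assumptions (they depend on the retrofitted table's actual markup), and the header strings mirror the generic labels mentioned above:

```css
/* Sketch only: hide the dynamic header row, then label each data
   row's first cell with generated content instead. The nth-child
   indices are hypothetical – adjust them to the real markup. */
thead {
  display: none;
}
td:first-child::before {
  content: "Today";
  display: block;
  font-weight: bold;
}
tr:nth-child(2) td:first-child::before { content: "Month 2"; }
tr:nth-child(3) td:first-child::before { content: "Month 3"; }
```

Because the labels live in the stylesheet rather than the markup, they're invisible to some assistive technologies; that's part of why this is a temporary retrofit measure rather than a long-term fix.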

Wrap up
Remember, our focus with a retrofit is really on the user. We're using the power
of responsive CSS techniques to quickly create a better experience. This isn't a
long-term solution, but there can be real benefits to a phased approach.
Thanks to Stephanie Rieger (@stephanierieger) for her peer review of this tutorial


<!-- HTML -->
<head>
  <script src="/js/modernizr.js"></script>
  <script>
  yepnope({
    test : Modernizr.mq('(min-width: 0px)'),
    yep  : '/css/rwd.css',
    nope : '/css/original.css'
  });
  </script>
  <noscript>
    <link rel="stylesheet" href="/css/original.css">
  </noscript>
</head>

/* rwd.css file */
@media (max-width: 959px) {
  /* styles for smallest viewport widths */
}
@media (min-width: 600px) and (max-width: 959px) {
  /* styles for mid-tier viewport widths */
}
@media (min-width: 960px) {
  /* original CSS styles */
}
We're using yepnope.js (www.yepnopejs.com) and Modernizr's
(www.modernizr.com) media query test to check if the UA supports media
queries (and has JS enabled). Where it does, we serve the responsive
CSS. In the cases where it doesn't, or when we don't have JS enabled,
we serve the original CSS. You can see this in action (also, using SCSS
aggregate and partials) in the example files under the /Linking folder.

Project partner Modernizr is useful because it enables you to determine
design based on feature support instead of just viewport or device width

About the author


Name Ben Callahan
URL www.seesparkbox.com
Areas of expertise HTML, CSS, beach volleyball
Twitter @bencallahan
What's your ideal fancy dress costume?
I've been known to dress as [US volleyball legend]
Karch Kiraly (pink hat and all!)


Web pro

Insider tips from industry leaders

Become a web pro
Industry experts offer tips and opinion on search, marketing and social techniques.

Search: Semantic search · 15 post-Penguin backlink tips · Which keywords are driving your sales? · Post-Penguin link building · What does 'SEO' mean anyway? · Don't call me an inbound marketer · Don't game the system, optimise · Seven SEO tools for your toolbox · Conversational search · SEO is the glue · Site migration
Analytics: Inconsistent data? · What's the point? · Conversions are changing · Testing times
Social: Good social content · The SoLoMo trend · Social data and search · Speedy social marketing · Make content shareable
Marketing: Remarketing · Your marketing secret weapon · Google AdWords · Using infographics
Blogging: Creating content to engage

Search

Recommended: good examples of semantics
Name Siri
URL http://netm.ag/siri-230
Info Talk to Siri as you would to a person. Say something like, 'Tell my wife I'm running late', 'Remind me to call the vet' or 'Do I need an umbrella?', and Siri answers you. This is a perfect example of semantics. Use it to optimise your site!

Semantic search
One of the hardest things for most website optimisers to come to grips
with is the idea of using semantics when optimising their pages.
In normal everyday conversation we all skip over the details of what
other people say and assume that they mean one thing when they
actually say another.
Semantic search tries to understand the searcher's intent and the
meaning of the query rather than parsing through keywords like a
dictionary. Currently the search engines give you results based solely on
the text and the keywords that you put in that search. Essentially, they
give you their best guess.
Semantic search will dive into the relationship between those words,
how they work together, and attempt to understand what they mean.
When people search, they aim to answer a question. They just search
in the truncated version of that question. So far keyword research has
been largely data-driven around the popularity of the keywords/phrases
in their question. Keyword research in semantic search focuses on what
that person actually means when searching for that keyword.

What could people mean when they search 'car'?
1 What is a car?
2 Where can I buy a new car? A used car?
3 How do I drive a car?
4 What are the latest cars?

Anyone optimising their website for specific keywords needs to ask:
what is the searcher looking for when they type in this keyword?
When the answer is many different things, then you know your
conversion rate will be very low, because a lot of the people searching for
that keyword were never looking for your product. But when the answer
to this question is clear and simple, you know your click-through rate and
conversion rate will be extremely high.
For example, if I'm optimising for Europcar (www.europcar.co.uk),
I know that the keyword 'Audi car rental London' will receive a high
click-through rate and a high conversion rate because that is exactly what
Europcar does.
So, to improve your sales, think about what your visitors mean rather
than what they say.

Name Search Engine Land
URL http://searchengineland.com/library/google/google-penguin-update
Info Great article collection on the latest Google update.

Name SEOmoz
URL http://moz.com/blog/how-wpmuorg-recovered-from-the-penguin-update
Info An article about how to recover from the Penguin update.

David Deutsch is CEO at New Epic Media

How to: Better your business for better SEO

Expert advice
Name Christian Bauman
Job title Content manager
Company SEO Brand
URL www.seobrand.com

Google's Penguin update has seen the use of traditional SEO shortcuts take a huge hit. Originally, when search engines were just beginning to take off, you could simply hammer in the right keywords and generate loads of backlink traffic for your website, ranking above others regardless of the quality of the service on offer.
The purpose of the Penguin update is to develop a better means of determining relevancy for every search. By recognising organic content, real sites get rewards, affirming the golden rule of SEO: content is king.
The importance of unique visits to your sites, and people going places knowing what they want and how they can get it from you, resulting in a low bounce rate, goes beyond any SEO-related shortcut.
The time to look into branding your business and getting people to know it on a first-name basis couldn't be more critical. Much like Panda, getting around the Penguin update isn't about finding new rabbits to pull out of the black hat of SEO. Businesses that have people coming to them are beginning to rank above those who just specialise in spamming. This is more of a return to the old school.
Organic traffic showcasing users who go to your site and stay on your site, and exposure from links generated by users other than yourself, will always make for smoother sailing for your business. This has proven true even before the Pandas and Penguins were let out of the zoo.
The top listings on the Google index mark the world champion of that specific industry. You can try to do what you want behind the referee's back, but a true champion will have to punch their way to the top like Pacquiao. May the better business get the best business.

15 post-Penguin backlink tips

The Penguin update is one of the most significant Google algorithm updates
I have seen in my career to date. The update focused on off-page optimisation,
aka link building, and was code-named Penguin. It is one of the first updates
I have seen that has affected some of the top agencies and practitioners, who
traditionally have been untouched for many years by Google updates.
For a number of years it had been possible (and sometimes surprisingly
easy!) for sites to rank well using high volumes of low-quality links and anchor
text optimisation.
This backlink profile checklist is designed to help identify low-quality links
that may be pointing to your website, and help you to make informed decisions
about the quality of websites that link to your site. I would recommend you
gain a good knowledge of your backlink profile before trying to remove any
bad backlinks:

01 Is the linking site reasonably well designed?
02 Is the content of the site high quality and grammatically correct?
03 Are you happy having your brand associated with this website?
04 Does the domain have any traffic at all when reviewing in Alexa, Compete.com or any similar services?
05 Is the domain (or at least the page) relevant to your website? Relevant, high-quality industry blogs with traffic seem to be the safest links to build since Google Penguin.
06 Is the link sitewide? If so, it's probably best for it to be branded (for instance, does it have the company or website name?), and make sure you don't have too many sitewide links, especially if part of an exchange or network, or if they have been paid for, as these are all against Google Webmaster Guidelines.
07 Is the site indexed? Simply Google site:example.com and check that there are pages indexed.
08 What is the PageRank of the website's homepage? If there are a lot of links pointing to the domain, but the homepage has a PageRank of zero, there could be a penalty on the website. Sites that don't have any pages indexed are exactly the kind of links you need to be really careful of.
09 If you aren't sure about the quality of the link, make sure it links to a deep page to prevent a penalty on the homepage.
10 Does Googling site:example viagra return any results? If so, it is often a red flag that the site has been spammed, and as a result, links are less likely to pass any value to your website.
11 Are there many domains hosted on the same server? This can sometimes also be a red flag. Using this tool (www.yougetsignal.com/tools/web-sites-on-web-server) can help you to identify lower-quality websites if there are many hosted on the same address.
12 Emerging search engines like DuckDuckGo (https://duckduckgo.com) do a fantastic job at helping you find higher-quality link prospects.
13 If you are trying to remove low-quality links, save and screengrab emails to webmasters. This can be used as evidence to major search engines that you are trying to improve the quality of your backlink profile.
14 Start to consider authorship (Google+ and linked social accounts) as a quality signal for your backlink profile. You can bet your ass that Google is!
15 Consider backlinks and social signals together as a ratio. Are you in an industry with a lot of social activity? If so, you should be attracting both links and social engagement. Not doing so will also appear unnatural.
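Several of these checks come down to simple ratios you can compute from a backlink export. Here's a minimal sketch, assuming a hypothetical array of {url, anchor} records exported from whatever backlink tool you use (the data and brand terms are invented for illustration):

```javascript
// Sketch: summarise branded vs non-branded anchor text from a
// hypothetical backlink export. A heavily non-branded profile is
// one of the patterns the checklist above warns about.
function anchorProfile(rows, brandTerms) {
  const counts = { branded: 0, other: 0 };
  for (const row of rows) {
    const anchor = row.anchor.toLowerCase();
    const isBranded = brandTerms.some(term => anchor.includes(term));
    counts[isBranded ? "branded" : "other"] += 1;
  }
  return counts;
}

const sample = [
  { url: "http://blog-a.example", anchor: "Example Ltd" },
  { url: "http://blog-b.example", anchor: "cheap widgets" },
  { url: "http://blog-c.example", anchor: "example.com" },
];
console.log(anchorProfile(sample, ["example"])); // { branded: 2, other: 1 }
```

This is only a first pass: a real audit would also weigh the checks above (indexation, relevance, sitewide placement) per linking domain.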

Useful SEO resources
Name Google Conversion Optimizer
URL http://netm.ag/googleoptimiser-246
Info A fantastic tool for increasing the performance of your landing pages and your keywords.

Name Maxymiser
URL www.maxymiser.com/landing-page-optimization
Info If you want a more robust A/B testing software to use, then Maxymiser is a fantastic service and software that will increase the performance of your website by some way.

Name You Get Signal
URL www.yougetsignal.com/tools/web-sites-on-web-server
Info A reverse IP domain check takes a domain name or IP address pointing to a web server and searches for other sites known to be hosted on that same web server.

Expert advice
Name Stephen Lock
Job title Former SEO product marketing manager
Company Analytics SEO
URL www.analyticsseo.com


Useful SEO
resources

Which keywords are driving your sales?
When advertising on Google AdWords it's very easy to gauge which keywords
are driving sales for an ecommerce site. All you have to do is log in to your
AdWords account and follow some simple steps:
1 Start by heading to the Menu > Tools and Analysis > Conversions
2 Click New Conversion and follow the steps indicated
From this point on, any time one of your keywords triggers a sale, you will
see it in the AdWords interface itself.
The big question I get, however, is how you can gauge the performance of
your SEO keywords in Google Analytics.
There are two ways to track conversions/sales in Google Analytics. One is by
tracking your ecommerce site performance. The other is by tracking goals.
If you have an ecommerce site and Google Analytics is installed, follow
these steps:
1 Using the main menu, go to Traffic Sources > Sources > Organic
2 Near the top of the report you'll see these options: Site Usage, Goal set 1
and Ecommerce
3 Click on Ecommerce
This will provide you with the quantity of sales, average order value, revenue
per keyword, conversion rate and more for each SEO keyword.
If you don't have an ecommerce site but you still want to track specific
actions on your site, set up goals to see actions (Admin > Profiles > Goals).
Always work to improve your keywords' conversion rates. The more clicks a
keyword can deliver, the more important it is to increase that keyword's
conversion rate.

Name Useful Metrics
Info Simple, fresh new web analytics service. Users can rebrand the service in line with their clients via CMS. The service is free to use on two sites, and there's a 30-day trial (http://usefulmetrics.com).

Name Google May Be Crawling Ajax Now: How To Best Take Advantage Of It
Info Search Engine Land's Vanessa Fox discusses Google's policy for crawling Ajax sites in this 2010 article (http://netm.ag/google-ajax-233).

Name Searchable Dynamic Content With Ajax Crawling
Info Zack Grossbart explains how to use Ajax without hurting your SEO in this informative article from Smashing Magazine, published in 2011 (http://netm.ag/searchableajax-233).

David Deutsch is CEO at New Epic Media

How to: Search engine optimise an Ajax website

Expert advice
Name Michael Abitbol
Job title Senior SEO specialist
Company SEO Brand
URL http://seobrand.net

Asynchronous JavaScript and XML (Ajax) is used to create more dynamic websites. It makes object request calls back to the server to update the content while any visitor is viewing it, so it doesn't require the browser to be refreshed. However, it does pose some problems for search engine optimisation.

The problem
When a search engine robot visits any web page to index the content, it doesn't click on the links and buttons like a common customer. Instead, it notes the URLs associated with each page, visits them individually and then indexes them.
Ajax wants its pages to be minimal, which is an approach opposite to that of the search engine. Therefore, it's possible that any content changed by Ajax that's not hard-coded onto the pages will be missed by search engines, diverting your traffic to other pages. There are two solutions to this problem:

Solution number one
First, you should degrade your pages to normal flat mark-up language. This is important for non-JavaScript-capable browsers and search engines. When you use the Ajax call, make sure that the page has the same content.

Solution number two
Second, you can use Ajax in minimalist fashion, so search engines can see the optimised content, while at the same time, a user can have updated content. This approach may be used to remove the content from any web pages that are affecting your website in a negative manner, such as testimonials, which Ajax/JavaScript enables you to insert without affecting your keyword density.
Ajax is also good for creating member sections, forums, slide shows, and so on.


Useful SEO
resources
Useful SEO resources
Name My Blog Guest
Info Guest blogging is one of the most natural methods of obtaining backlinks. Join My Blog Guest and submit your posts to other blog owners. www.myblogguest.com

Post-Penguin link building
As most of you will know all too well, Google's Penguin update took the
online world by storm. Websites that had historically ranked well suddenly
found themselves nowhere at all on SERPs, despite having never touched
black-hat SEO techniques. With this in mind, many webmasters and SEOs
are unsure as to which SEO techniques are safe, and what needs to be avoided.
Here are some techniques and opportunities that are safe to use in
post-Penguin times, which won't see webmasters receiving unnatural link
warning messages from Google.
Always seek out natural linking opportunities. Ask yourself whether the
link you're placing exists simply to pass on PageRank or whether it adds to
the purpose of where it's placed. For example, an informative guest blog post
on a relevant website with a link to your website is considered natural and
safe to use, as it's adding related resources within a useful article. If a link

Name PR Web
Info Submit and distribute your press release through PR Web for your website to obtain authoritative backlinks from a range of trusted sources. It's also searchable. http://uk.prweb.com

was placed in a completely unrelated blog post, however, you must always
consider whether its placement looks natural.
In addition to the above, the submission of press releases is another
completely safe and white-hat method of obtaining links. Always be sure
to use branded anchor text within press releases, and write a newsworthy
release which will generate pickup from major online news outlets. The
links obtained from such pickups pass both authority and trust to your
website, and are acknowledged as completely natural.
Lastly: link bait! Write useful and informative articles on your own
website, or do something completely out of the ordinary to promote your
site, and people will naturally link to it. Just as YouTube videos regularly
go viral, a well-written and useful piece of content will be picked up by a
wealth of online resources.

Name Google SERP Snippet Optimisation Tool
Info Ensure your title tags and meta descriptions not only look great but display properly on the search engine results pages with this free tool that simulates Google's SERPs. www.seomofo.com/snippet-optimizer.html

James Brockbank is a search consultant

Expert advice
Name Rory Lofthouse
Job title SEO consultant
Company Bronco
URL www.bronco.co.uk

How to: Get the SEO basics right

Get the foundations right and you'll have a great base to start with for the future.

Title tags
Title tags are now limited by pixel width and not the traditional character limitation. So think carefully before making the first letter of each word upper case. Title tags need to have a call to action, to be descriptive and include keywords, yet without being keyword-stuffed. If Google doesn't like your title tag, then it will change it: perhaps to your <H1> tag.

Heading tags
Optimising <H> tags is often overlooked or wrongly implemented. Ideally you should only have one <H1> tag per page, but Google will value your site equally if you have multiple <H1> tags, as long as it is justified and they aren't there just to over-optimise the page. <H1> is usually given the most SEO value, but this doesn't mean you should ignore or not optimise <H2> to <H6>.

Descriptions
Your meta description is a chance to promote your company. Google will fill in your meta description for you, but it's better for users if you customise each one. Meta descriptions are not used as a ranking factor as they once were, but it's valuable advertising space on Google.

Content
Content is king and everyone knows it, but people still feel the need to copy content from other websites. Not only could this land you in trouble, it could affect your website's rankings. You need plenty of high-quality content, rather than multiple pages of poor content. If your writing skills are not the best, or you don't want to write content, you can always outsource the work: it will be worth the effort.
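Pulled together, the advice above might look like this minimal sketch (the business name, title and description are invented for illustration):

```html
<head>
  <!-- descriptive title with a call to action and keywords, not stuffed -->
  <title>Van Hire in Leeds from £20 a Day | Example Vans</title>
  <!-- a custom description per page: not a ranking factor any more,
       but valuable advertising space on the results page -->
  <meta name="description"
        content="Well-maintained hire vans, flexible periods and free delivery across Leeds. Book online in minutes.">
</head>
<body>
  <h1>Van hire in Leeds</h1> <!-- ideally a single h1 per page -->
  <h2>Why choose Example Vans?</h2>
  <p>Original, high-quality content rather than copied text...</p>
</body>
```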

Search

SEO troubleshooting resources
Name Common Technical SEO Problems and How to Solve Them
Info A good overview of common technical problems SEOs face, including canonicalisation problems, indexed 404s and improper redirects. http://netm.ag/common-235

What does SEO mean anyway?
If you ask 10 different people what SEO is, you're likely to get 10 different
answers. They will all probably include the process of getting traffic from
organic listings in search engines, but what that is can differ depending on
the experience and philosophy of who you ask.
I'd like to share the definition that I've learned in the dozen years I've
been an SEO expert.
SEO is not spam. The best SEOs are trying to understand what is most
relevant to a searcher's query so they can give it to them, which is exactly
what the search engines are trying to do. By definition, search engine
optimisation is about the optimal case, and if you optimise something
it doesn't stop being successful because of search engine penalties or
algorithm shifts. Because of this, there is no grey- and black-hat SEO, only
SEO and spam.
SEO is hard graft. If you spend most of your day tweeting about SEO,
then you're not an SEO. SEOs spend their days doing research to understand
their prospects, testing and implementing theories, and communicating the
value and details of SEO to various stakeholders within an organisation.
SEO is strategic as much as it is tactical. You can be an expert in using
regular expressions to rewrite URLs, but if you waste time and resources
rewriting URLs when changing title tags will be more effective and efficient,
then you lose at SEO.
SEO is not dead and will never die. SEO evolves with the search engines.
Techniques may become irrelevant, but as long as people use search engines
and there are unpaid listings, there will be SEO. In future columns I'll be
aiming to offer some deeper insights into how to make this type of SEO
work for you.

Name SEMPO State of Search Marketing Report 2012
Info The Search Engine Marketing Professionals Organization (SEMPO) released its annual survey of pros, which shows the issues they care about most. http://netm.ag/sempo-235

Name Ask Matt Cutts webmaster questions
Info Google's head of webspam is once again taking SEO and webmaster questions for the Google Webmaster Central YouTube channel. http://netm.ag/cutts-235

Bryson Meunier is director of SEO strategy at Resolution Media

How to: Optimise images

Expert advice
Name Chelsea Neuman
Job title SEO supervisor
Company Resolution Media
URL www.resolutionmedia.com

While metadata and header tags have historically held high importance for SEOs, images should be moving up the priority list: they hold more significance and have a higher presence in universal search than ever before. Done right, optimising images can be fairly quick and easy, and significantly affect site rankings, too.
Things you need to consider include:
1 Image alt tags: Every image on a site should have an alt tag that is short and descriptive (typically around 140 characters). The tags should also include your desired targeted keywords.
2 Filenames: Before uploading an image, pick a descriptive filename (different from the alt tag) containing keywords you wish to rank for in SERPs. Don't use stop words (a, be, of, the, and so on) in the filename; they are usually ignored by search engines. Separate each word with dashes and do not include any spaces.
3 Anchor text: Linking to images with text can play a large role in how they rank for keywords. When linking to an image, use descriptive anchor text. Avoid generic terms such as 'image' or basic filenames that do not provide search engines with information that's meaningful.
4 Surrounding content: A page that contains at least 200 words of quality, relevant content surrounding an image helps define it and is a key component in getting it ranked. Captions describing it are also beneficial: they're one of the most-read pieces of content on a site and offer search engines further information.
Making images a bigger part of your SEO strategy is a wise investment that can produce significant results, without major commitment of time.
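As a quick illustration of points 1-4, an optimised image might be marked up along these lines (the filename, alt text and caption are invented):

```html
<!-- hyphen-separated, keyword-bearing filename with no stop words -->
<figure>
  <img src="/images/red-tree-frog-costa-rica.jpg"
       alt="Red tree frog resting on a leaf in Costa Rica">
  <!-- captions are among the most-read content on a page -->
  <figcaption>A red tree frog photographed in Costa Rica.</figcaption>
</figure>
<!-- descriptive anchor text, not 'image' or the raw filename -->
<a href="/images/red-tree-frog-costa-rica.jpg">Photo of a red tree frog in Costa Rica</a>
```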

Recommended SEO resources
Name Inbound Marketing is Incomplete Marketing
Info Raven Tools co-founder Jon Henshaw says that good marketing requires marketing to all stages of the consumer decision journey. http://squawk.im/industry-news/inbound-marketing-incomplete

Don't call me an inbound marketer
Someone recently asked me if Resolution will be adopting the term
'inbound marketing' to describe what we do. SEOmoz and others have
popularised the term, with people like Rand Fishkin (http://moz.com/rand)
arguing that we can't just be SEOs anymore. Here's why I won't be adopting it:
We already have a word for marketing with explicit or implicit permission.
It's called permission marketing, and Seth Godin (http://www.sethgodin.com/sg)
popularised it as far back as 1999. Inbound marketing is a synonym with one
less syllable. There doesn't seem to be any advantage to anyone but HubSpot
(www.hubspot.com) and SEOmoz in calling it inbound marketing rather than
permission marketing.
I agree with Rand Fishkin's recent video that SEO touches a number of
different fields (http://netm.ag/seo-243). I don't agree that this is a new
phenomenon. SEOs, especially enterprise SEOs, have always had to understand
related fields in order to cause positive change within an organisation.
However, there are also many tasks that content marketers, social media
marketers, designers, developers, paid search marketers etc, don't do. Good
SEOs better do these things extremely well or they'll never be hired by large
search firms like mine. In addition to these tasks, there are also niche
disciplines within SEO (video SEO, mobile SEO, social SEO and so on) that
too many companies ignore but that could make them ultra-competitive in
SEO. As an industry, we should be focusing on creating more of these.
As an SEO, I won't resist change if it will cost me business, so I will adopt
the moniker if the rest of the industry does. But, so far, they haven't, and
I applaud others like Jon Henshaw of Raven Tools (http://raventools.com/about)
who are speaking out against it as well.

Name Matt Cutts Says No to Renaming SEO
Info Google's Matt Cutts explains why he thinks changing the term SEO to inbound marketing is unnecessary. http://youtu.be/ZStQhWx8YPc

Name AJ Kohn on SEO vs Inbound Marketing
Info 'I'm an SEO and proud of it,' says AJ Kohn in this detailed explanation of what SEO is, including why he's not comfortable calling it inbound marketing. www.blindfiveyearold.com/what-is-seo

Bryson Meunier is director of content solutions at Resolution Media

Expert advice
Name Matt Ballek
Job title YouTube strategist and video SEO
Company VidiSEO
URL www.vidiseo.com

In 2012, Google added the ability to associate your website with your YouTube channel. The real perk is the ability to add annotation links to your associated website (http://netm.ag/google-243). Annotations are overlaid on top of your video and previously could only link to YouTube.com URLs. Adding annotation links to an associated website opens up a whole new world of possibilities, including the ability to track clicks back to your website. Here's how you do it:

1 Video selection
Choose a video that has a call to action and is annotation friendly. It's best to have annotations in mind when creating the video to maximise interaction.

2 Tracking string
Choose a destination URL from your website and create a custom analytics string. Creating a Google Analytics campaign tracking string (https://support.google.com/analytics/answer/1033863) is easy. I set the source as YouTube, the medium as video, the campaign content as annotation and the campaign name as the video's title. All of the large analytics providers offer custom campaign tracking products. However, the process of creating a tracking string will vary.

3 Annotation
On YouTube, add an annotation to your selected video and set the link type to Associated Website. Then, simply paste your URL complete with tracking string. Now any clicks on that annotation will appear in Google Analytics, so you can see how these visitors interact with your site or track their movement towards conversion goals.

Once you have set this up on multiple videos, it's easy to identify which of your videos are driving the most value for your business. Arming yourself with this data also makes it easier to eliminate low-quality view sources. Are you running a paid promotion of your video? See how many of those paid views hit your website, rather than just your view count.
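The tracking string from step 2 is just a set of utm_* query parameters appended to the destination URL. A small sketch of building one (the landing URL and video title are invented; the parameter values follow the choices described above):

```javascript
// Sketch: build a campaign-tagged URL for a video annotation,
// using Google Analytics' utm_* campaign parameter conventions.
function campaignUrl(base, params) {
  const query = Object.entries(params)
    .map(([k, v]) => `${k}=${encodeURIComponent(v)}`)
    .join("&");
  return `${base}?${query}`;
}

const url = campaignUrl("http://www.example.com/landing", {
  utm_source: "YouTube",       // where the visitor came from
  utm_medium: "video",         // the channel type
  utm_content: "annotation",   // which element was clicked
  utm_campaign: "My Video Title",
});
console.log(url);
// http://www.example.com/landing?utm_source=YouTube&utm_medium=video&utm_content=annotation&utm_campaign=My%20Video%20Title
```

Paste the resulting URL into the annotation, and the clicks will show up under that campaign in your reports.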

How to: Track YouTube conversions

Search

Optimisation resources
Name Disavow Links Tool
Info Google's tool for cleaning up a spammy link profile should only be used by those websites that are having problems getting bad links taken down. http://netm.ag/disavow-bz92

Don't game the system, optimise
The Merriam-Webster Dictionary defines optimisation as: 'an act, process
or methodology of making something (as a design, system or decision) as
fully perfect, functional or effective as possible.'
I think black-hat SEO is an oxymoron: black-hat techniques are by nature
short-lived. Part of being as fully perfect, functional or effective as possible
when it comes to visibility in the search engines is retaining that visibility
for long periods of time, regardless of algorithm changes. Perfection isn't
short-lived: it's sustainable.
When Google had just announced its own Disavow Links tool at Pubcon,
people in the SEO industry were busy dissecting it and analysing its merits.
The utility is Google's version of a tool Bing released, and enables webmasters
to tell search engines they don't want to be held responsible for certain links
that point to their sites, and that these links should not count toward their
sites' overall authority or credibility.

Name SEO Should Be Invisible
Info RKG President Adam Audette argues over-optimisation is not SEO: "SEO should be an invisible layer beneath a strong user experience, a beautiful site, and a clear, coherent message." http://netm.ag/invisible-236

It's a useful tool if you're taking on a new client who is trying to recover from having hired a bad SEO, and this is what Google promotes it as today. But it's sad that such a tool is even necessary if we really are optimisers, and not just spammers flying under the radar until Google finds our loophole.
So I would like to propose that the SEO industry, and all of us who define it, start living up to our names. We're optimisers. We compete with one another in the search results by making content relevant, accessible and findable, which is not an easy task. Focusing on the sustainable, and not just what works right now, will elevate everyone in our industry, and ensure that our discipline of gaining visibility for our clients' content in search results truly is as fully perfect, functional or effective as possible.

Name Search and the Age of Usefulness
Info Search veteran Gord Hotchkiss argues that SEOs today need to be both relevant and contextually relevant, or useful, in order to keep up with Google. http://netm.ag/usefulness-236

Bryson Meunier is director of content solutions at Resolution Media

How to Earn links by building relationships


Expert advice

Name James Swinarski
Job title SEO supervisor
Company Resolution Media
Web www.resolutionmedia.com

Manual link building is hard work. As Danny Sullivan said on Search Engine Land (http://netm.ag/sullivan-236): "The harder it is to earn a link, the more likely [it] will help you with Google." Corner-cutting link builders didn't survive recent algorithm changes; if you want to build links, you have to build relationships. Here's how we do it at Resolution Media:
1 Use tools such as Open Site Explorer (www.opensiteexplorer.org), BrightEdge (www.brightedge.com) or Majestic SEO (www.majesticseo.com) to identify relevant websites that might be interested in linking to your site. Use brand keywords to find who's already talking about you or your industry, or find sites related to the ones that are already linking to you.

2 When you start reaching out to the webmasters of the relevant sites you've found, don't do it just to get a link. Find what you like about the site and let the webmaster know. If you really like the site, share it over social media.

3 Aim for trust, not links. You shouldn't expect to get a link with just one or two compliments on the site. Make sure you provide a consistent level of engagement with your new contact. It may take a few weeks or even more to gain their trust, but as we said earlier, the best links to help you with Google are never the easiest links.

4 Provide value or go away. Ask yourself what you can offer to the webmaster that would be of value to them. If the answer is nothing or money, you're wasting not only your time, but the webmaster's too. If you provide value over time, you will most likely get a link from the site owner without having to request it.



Search

SEO integration
resources

SEO is the glue in the web design process
The other day I heard someone in our team say: "there's no place in our process for that SEO task right now, so we just have to add it to our process".
This seems simple, and it is; but so many organisations get it wrong. They think of SEO as one step in a process (usually post-launch), rather than as checkpoints throughout the process to ensure that everything the individual teams do is search-engine-friendly.
In some discussions of the design process, as in the article that appears first in Google for the phrase "web design process" (http://netm.ag/process-238), SEO is absent entirely. But we can still use the framework it provides for a sense of how SEO should be integrated: during planning, design, development, launch and post-launch.
In the planning phase, web designers, information architects and content strategists do a requirements analysis to understand client goals, the target audience, and so on, then

Name Structuring an SEO Project
Info OMD UK SEO director Sam Crocker explains how to integrate SEO into the web design process, proposing ways to structure SEO beyond the retainer model. http://netm.ag/crocker-238
Name How SEO Can Work with Content Strategy
Info SEO veteran Lee Odden explains the common ground between SEOs and content strategists. http://netm.ag/odden238

determine software requirements. Often, the groups involved follow their own disciplines, and SEO is a low priority, if it is a consideration at all.
Often, none of these groups consider what people are actually looking for online when deciding what content to build. Likewise, software is evaluated on its aesthetic merits, and not on its accessibility to search engines.
In this and the design phases that follow, SEOs can be valuable in that they represent a different, search-engine-user-centric viewpoint to temper aspects of each discipline that cause problems with search engine visibility. By presenting this viewpoint throughout the process, good SEOs can become the glue that holds your web design process together. The result will be a website that is not only more friendly to search engines but to your users as well.

Name SEO Webmaster Tools Help
Info Google's official SEO guide says: "A great time to hire [an SEO] is when you're considering a redesign, or planning to launch a new site", and lists specific points SEOs can help with. http://netm.ag/seodef-238

Bryson Meunier is director of SEO strategy at Resolution Media

How to Optimise Flash with the SWFObject technique


Expert advice

Name Gabe Gayhart
Job title SEO supervisor, content solutions
Company Resolution Media
URL www.resolutionmedia.com

The SWFObject method (http://code.google.com/p/swfobject) makes it possible to embed Flash content in a standards-compliant way. But improper copying, or failure to update the alternative content in the container div, can mean this becomes out of line with content within the Flash movie.

The following method should ensure the two remain in sync. The key to success is to use an external XML file. This makes it easy to update Flash content, since it enables the site owner to edit the XML document and have the Flash pull that content into the SWF and display it.
When this method is applied, the content that is used in the Flash is also available for populating the SWFObject alternative content div. This can be done through the use of XSLT: effectively a style sheet for an XML document that can be used to transform the XML into HTML. (For more information, Google "XSLT tutorial".)
This way, updating the XML file will also update both the Flash and the alternative content in HTML. Bear in mind that for this approach to work, three separate pieces of code will be needed:
1 XML
The XML document that the Flash is using for its content.
2 XSLT
An XSLT document that lays out your template for transforming XML to HTML.
3 A server-side script
Client-based scripting solutions typically transform the code after the page is loaded, thus preventing that content from being crawled by spiders.
Find more details at http://netm.ag/swfobject-238.
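As a sketch of step 2, a minimal XSLT stylesheet might look like the following, assuming (for illustration only) that the Flash content file has a <content> root holding <item> elements with <title> and <body> children:

```xml
<?xml version="1.0" encoding="UTF-8"?>
<xsl:stylesheet version="1.0"
    xmlns:xsl="http://www.w3.org/1999/XSL/Transform">
  <!-- Turn each <item> in the Flash content file into plain HTML
       for the SWFObject alternative-content div -->
  <xsl:template match="/content">
    <div id="flash-alternative">
      <xsl:for-each select="item">
        <h2><xsl:value-of select="title"/></h2>
        <p><xsl:value-of select="body"/></p>
      </xsl:for-each>
    </div>
  </xsl:template>
</xsl:stylesheet>
```

A server-side script (step 3) applies this stylesheet to the XML before the page is served, so spiders receive the generated HTML rather than relying on client-side transformation.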


SEO resources
Name Tools for Pulling Rank from SMX Advanced
Info Michael King's tools presentation from SMX Advanced 2012 is almost a year old, but the tools he presents are still useful to SEOs. http://netm.ag/king-240

Seven essential tools for SEOs
There are so many essential tools for the practice of SEO. Here are seven more tools to add to your SEO toolbox:
URI Valet (http://urivalet.com) Checking server headers is essential for diagnosing a number of crawling and indexing issues. This is the best server header checker that I've seen. This one has more user agents than most, including mobile user agents for mobile SEO.
Feedly (http://feedly.com) Knowledge management is a task in SEO that can separate the casual practitioner from the SEO expert. Many SEOs use social networks such as Twitter for knowledge sharing. I personally find it too noisy. As Google Reader is discontinued, Feedly can handle RSS and social feeds.
Screaming Frog (www.screamingfrog.co.uk/seo-spider) If Xenu were created with SEO in mind, Screaming Frog would be the result. Easy exclusions, greater flexibility and sitemap

Name The Brain As The Original SEO Instrument
Info This is a good reminder from Todd Mintz that tools are only as useful as the person using them. http://netm.ag/mintz240

creation make this tool a great addition to any SEO's toolbox.
Bing Webmaster Tools (http://bing.com/webmaster) Duane Forrester and the rest of the team at Bing Webmaster Tools have put a lot of work into its version of Google Webmaster Tools over the years, and it shows. The SEO Analyzer and Index Explorer are two features you'll wish Google had too.
Bright Edge (http://brightedge.com) If you represent an enterprise, Bright Edge is among the leaders in the group. SEOs deal daily with big data, and Bright Edge's recommendations feature helps us prioritise what's most important.
SEO Browser (http://seo-browser.com) Like the Google cache, it lets you see your site how spiders do, but with more actionable data.

Name 78 Resources for Every Internet Marketer's Toolkit
Info These aren't exclusively SEO tools, but this big list is fairly exhaustive and is organised by function and clearly labelled as free or paid for, easily getting you to the most valuable tools. http://netm.ag/davies-240

Bryson Meunier is director of content solutions at Resolution Media

Expert advice
Name Sam Crocker
Job title Digital director
Company OMD UK
URL www.samuelcrocker.com

The assumption holds that the higher up the search engine results pages (SERPs) you appear, the higher the clickthrough rate (CTR) you will see to your website. There are a number of models used to predict CTR based upon ranking. These models don't account for the advantage that using structured data can give your site: because search engines have begun displaying rich snippets, this can help your site stand out.
Although microformats and RDFa may be supported, Google recommends using microdata, so this would be the best long-term implementation from a rich snippet perspective.
Before you implement this mark-up, check which content types are currently supported by rich snippets (updated list from Google: http://netm.ag/richsnippets-240), plus the additional requirements to get Google to display authorship information (http://netm.ag/authorinfo-240).
Visit http://schema.org to see the hierarchy, schema types and other documentation. Implement microdata on-site, or use Google's Data Highlighter (http://netm.ag/highlighter-240) if on-site HTML implementation is not feasible.
Test the implementation with the Rich Snippet Testing Tool (http://netm.ag/richtesting-240).
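For example, a product review block marked up with schema.org microdata might look like this (the product name and figures are invented for illustration):

```html
<!-- Illustrative only: a product with an aggregate review rating,
     marked up with schema.org microdata -->
<div itemscope itemtype="http://schema.org/Product">
  <span itemprop="name">Example Widget</span>
  <div itemprop="aggregateRating" itemscope
       itemtype="http://schema.org/AggregateRating">
    Rated <span itemprop="ratingValue">4.2</span>/5
    based on <span itemprop="reviewCount">87</span> reviews
  </div>
</div>
```

With mark-up like this in place, the testing tool shows how the rating stars could appear alongside your listing in the SERPs.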

Top tips
Results may vary. For many schema types (such as reviews), this can lead to a considerable uplift in traffic compared with ranking in a similar position without rich snippets displayed.
Displaying some information (such as price or stock) in the SERPs may actually reduce CTR but increase conversion. Always consider the user experience.
If you are showing reviews on your site, try to implement the mark-up on a product page from which a purchase can be made.
Keep information up to date to keep users happy.

How to Use microdata to improve CTR

Search

Conversational search links
Name Google's Impressive Conversational Search Goes Live on Chrome
Info Danny Sullivan explains what it is and why he thinks it really is one of those significant changes. http://netm.ag/search-244

Conversational search and SEO
If you use Chrome, you now have the ability to speak your search terms instead of typing them. What, if anything, does conversational search mean for SEO?
Now Google has the ability to answer more conversational queries, searchers may begin speaking to it with more natural language queries (eg "Tell me the name of the best restaurant in Hell's Kitchen in New York"), which are generally longer and more complex. If you're relying on a few high-volume keywords for the bulk of your revenue, consider long-tail keywords. Companies like Bloomreach (www.bloomreach.com) create pages for long-tail queries to help with conversational search.
What keywords bring up direct answers? If you're depending on exact match domains like WhatTimeIsItInLondon.com to drive your business, this is another nail in your coffin. Where possible, Google's conversational search continues the trend of providing users with the

Name Google announcement on conversational search
Info The official announcement on conversational search by Google's Amit Singhal. http://netm.ag/amit-244

best single answer rather than lots of search results. Find keywords that are too ambiguous to bring up direct answers and then optimise accordingly. Good for users and your business.

Help Google understand
Does Google understand what you're relevant for? There are now many more ways searchers can ask for content, and Google will need more help than ever figuring out what content provides the best answer to those queries. If you're not already using Schema or other structured markup, today's a good day to start. Google's made it easier than ever with enhancements to the Data Highlighter tool in Google Webmaster Tools. Also keep an eye on Wikipedia and Freebase (www.freebase.com) for inaccuracies around your brand.

Name Implications of conversational search on SEO from 360i
Info 360i's group director of SEO, Mike Dobbs, provides his take on how conversational search will affect SEO, noting how conversational search may further complicate the analytics blackout that secure search has given marketers. http://netm.ag/360i-244

Bryson Meunier is director of content solutions at Resolution Media

How to Increase your local Google+ visibility


Expert advice


Name David Mihm
Job title Director of local search strategy
Company Moz
URL http://moz.com

Google+ has been widely criticised by mainstream and tech press alike for its lack of user engagement. Nonetheless, helping Google establish the three-way connection between location, website and author is one of the easiest SEO opportunities you'll ever come across. Engaging with the network on all three fronts can yield tremendous gains in brand exposure and increase your clickthrough rate across a variety of search results.

1 Claim a location
To ensure you show up for relevant local search queries, make sure that the full Name, Address and Phone number (NAP) of your business appears together in static HTML. See the coding information at Schema.org/LocalBusiness. Claim your local Google+ page at www.google.com/business/placesforbusiness. If you operate a multi-location business, be sure that each location has its own dedicated location page on which this NAP appears. Then claim all of your locations at Google Places for Business. During the claiming process, use the individual location pages as the URLs associated with each.
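As a sketch of what such a static NAP block might look like with LocalBusiness mark-up (the business details below are invented for illustration):

```html
<!-- Hypothetical business details; substitute your own NAP -->
<div itemscope itemtype="http://schema.org/LocalBusiness">
  <span itemprop="name">Example Bakery</span>
  <div itemprop="address" itemscope
       itemtype="http://schema.org/PostalAddress">
    <span itemprop="streetAddress">1 High Street</span>,
    <span itemprop="addressLocality">Bath</span>
  </div>
  Tel: <span itemprop="telephone">01225 000000</span>
</div>
```

Keeping the name, address and phone number together in one crawlable block like this makes the location easy for Google to associate with your listing.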
2 Be a local brand
To increase the likelihood that Google will show a Knowledge Graph result for your brand name, show you are a local brand rather than an anonymous spammer by adding the rel=publisher tag to your homepage. If you're a multi-location business, link your rel=publisher tag from your homepage to your company Google+ page.
3 Link to authors
Link to authors' personal Google+ profiles by adding the rel=author tag to each page they have authored, like so:
<a href="https://plus.google.com/youruniqueID" rel="author">Google</a>
Include your website in the Links section of your personal Google+ profile. Author photos in search results for your business should increase clickthroughs.


SEO
resources
Name How to use the Bing site move tool
Info We need to give Bing a lot of credit for providing a useful tool in its webmaster tools before Google. This one allows you to let Bing know which new URLs are associated with which old ones, even without 301s in place. http://netm.ag/bing-242

Site migration without migraines
I was fortunate to be one of the experts at SES New York (http://sesconference.com/newyork), which is part of a series of events for search and social marketing. I was part of the conference's Meet the Experts: Roundtable Forum, talking webmasters through the perils of site migration along with Eric Enge of Stone Temple Consulting.
If you don't account for SEO when developing a new site, the worst-case scenario is losing all of the historical data that the search engines depend on to rank your site properly in search results. Over time, links and shares accumulate to pages' URLs on the web, and if you don't have a plan for how that equity will be transferred, you could find yourself starting over.
However, there's an easier way, and the easiest way is to keep your URLs. So many times, URLs are changed arbitrarily: for a new CMS, or for tracking purposes, or because the new developers want to start over entirely and don't consider SEO or the business consequences of their actions. These arbitrary decisions can be disastrous for your site.
If you do have to change URLs, expect a drop in traffic as the search engines catch up, but it can be less painful if you do the following first:
Get a list of your current URLs. In the Traffic > Links to Your Site report in Google Webmaster Tools, there's a list of your most linked pages.
Permanently redirect these URLs to equivalent pages. Temporary (302) redirects don't transfer link equity, so it's important to use 301s. If there isn't an equivalent page, then redirect to the most relevant category page on your site or serve a 404.
Bing also offers a site move tool (www.bing.com/webmaster/help/how-to-use-the-site-move-tool-bb8f5112) within Webmaster Tools to assist with the process. Whatever you do, don't expect the engines to take care of it for you.
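On an Apache server, for instance, the permanent redirects could be declared along these lines in an .htaccess file (the paths and domain are illustrative):

```apache
# Map old URLs to their new equivalents with permanent (301)
# redirects so link equity is transferred; paths are illustrative
Redirect 301 /old-category/old-page.html http://www.example.com/new-category/new-page/
Redirect 301 /old-about.html http://www.example.com/about/
# Avoid temporary redirects for a migration:
# a 302 will not pass link equity
```

For large sites, generating these rules from a spreadsheet mapping of old to new URLs keeps the migration auditable.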

Name Website migration tips for SEOs
Info An article by iCrossing's Modesto Siotos on site migration, including a full process for webmasters and SEOs to follow. http://netm.ag/seomoz-242
Name How to avoid SEO disaster during a website redesign
Info An article by Glenn Gabe includes case studies of companies that didn't take SEO into account before launching. http://netm.ag/disaster-242

Bryson Meunier is director of content solutions at Resolution Media

Expert advice
Name Tyson Braun
Job title SEO supervisor
Company Lowes
URL www.lowes.com

Working with SEOs can sometimes be a challenge; "all text" mantras can come off as restrictive to creative projects. However, what we SEOs really mean to say is that creating a down-level experience through progressive enhancement (PE) or graceful degradation tactics should be implemented to ensure web content can be understood (and promoted) by search engines. Here are five ways to integrate PE into your next web development project:

1 Unobtrusive JavaScript
If you're using JavaScript, separate it into behaviour and content layers so text-only browsers and search engines can read the content presented without disrupting the basic HTML.

2 Semantic HTML
Wherever possible, utilise HTML markup that provides semantic meaning over presentation-only signals, such as <strong> over <b>. Keep up with the latest options at schema.org to codify web content, such as for events, so Google can slice and dice the web to improve search results.

3 The noscript tag
I recommend using this catch-all tag to provide a graceful degradation option for browsers that are not running JavaScript at all.

4 External CSS files
Place CSS styling into an external file to allow content to display without complex CSS styling. This also has the added benefit of improving your text-to-code ratio, which permits search engine bots to crawl your pages very quickly.

5 Flash as an enhancement
If you're utilising Flash, build it into your page as an enhancement to an HTML or HTML5 experience. This will allow the rich experience of Flash without clouding the experience of search engines as well as less-capable browsers.

The yet-unimagined presentation of the web shouldn't be restricted at all, but it works best if it's built on a foundation that can be understood by search engines. By adopting these, as well as other tactics, you'll foster a successful partnership with your SEO teams in creating a rich and findable internet.
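A minimal page skeleton combining several of these tips (the file names are placeholders) might look like this:

```html
<!-- Sketch of tips 1, 3 and 4: content in plain crawlable HTML,
     with behaviour and styling layered on externally -->
<link rel="stylesheet" href="styles.css">  <!-- tip 4: external CSS -->
<div id="gallery">
  <h2>Product gallery</h2>                 <!-- content readable without JS -->
  <p>Our full range, as plain HTML that any spider can crawl.</p>
</div>
<script src="gallery.js" async></script>   <!-- tip 1: unobtrusive JavaScript -->
<noscript>                                 <!-- tip 3: graceful fallback -->
  <p>JavaScript is disabled; the gallery above works as a simple list.</p>
</noscript>
```

The enhancement script can then upgrade the #gallery div in place, leaving the underlying content untouched for search engines.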

How to Embrace progressive enhancement for SEO

Analytics

Q&A
Name Peter O'Neill
Job title Founder
Company L3 Analytics
Web www.l3analytics.com

Inconsistent data in Google Analytics?
When Google introduced the premium version of Google Analytics (GA), it was aimed more at improving reliability than at providing an upgraded feature set. It's no secret that one of the problems plaguing the web analytics community is the lack of human resource (decent web analytics practitioners). In bringing out the premium version of GA, Google was aiming to alleviate some of this anxiety by providing paying customers with, among other things, a service that guaranteed 24-hour support to deal with issues relating to the product. (A short trawl around some of the forums will confirm that GA, like pretty much all other web analytics tools, is not without its faults.) An additional part of the service included what appears to be a more robust data set as a result of zero sampling.
Sampling is an issue that presents the rest of the non-premium GA user population with a bit of a problem relating to accuracy and consistency. Google sells the benefits of sampling in GA from the standpoint of speed, although for long-term users this may be a debatable point. Users are now able to adjust the level of sampling themselves, but the degree to which they can access a full data set varies depending on the report being run and

.net: What do you think are the main issues vexing web analysts at practitioner level?
PO: There are technical questions on campaign attribution and member segmentation, but I believe the main issue is how we can break through and really make a difference to organisations. We know we have access to incredibly valuable data and insights, but we need the resources to focus on the insights (not on running reports). Most importantly, we need the buy-in and support of senior management.

factors such as date range: the longer the date range, the greater the level of sampling needed.
The issue with sampling is that it is not uncommon for numbers to change in any given data set if that data set is re-run; in other words, there can be a lack of consistency when rechecking figures. For example, if an analyst wants to look at the last 52 weeks' worth of data against total weekly visits and associated bounce rate, it is a simple matter of setting the required report, date range and interval. When the report has been run, if sampling has been applied there will be the option to adjust the data set for a greater level of accuracy (vs speed). If the report is subsequently re-run, it is quite possible that the numbers will differ to some extent from the original report.
In order to have confidence in numbers, consistency is important. It can be disconcerting when exactly the same report produces noticeably different output, but analysts need to hold their nerve. Here as much as anywhere, the old adage of looking at the trend and not obsessing over single data points holds strong. It just helps to remember that.

.net: How is the web analytics community evolving to tackle new challenges as they arise?
PO: It is starting to band together to share knowledge and experiences. The recent MeasureCamp (a digital analytics unconference) was really exciting: a lot of people discovered they're not alone, and it was amazing how freely individuals were willing to share.

Hugh Gage is owner of Engage Digital


Statistics: Conversion rate optimisation


Tools, strategies and processes employed by companies to improve conversion rates
Methods used (percentage of companies):
A/B testing: 46%
Copy optimisation: 42%
Customer journey analysis: 40%
Online surveys / customer feedback: 40%
Usability testing: 30%
Cart abandonment analysis: 29%
Competitor benchmarking: 27%
Segmentation: 22%
Event-triggered / behavioural email: 21%
Abandonment email: 20%
Multivariate testing: 17%
Expert usability reviews: 15%

Source RedEye / Econsultancy Conversion Rate Optimization Report 2012


.net: What advice would you give to web analysts just starting out?
PO: To ask a lot of questions, of as many people as possible. Also, to practise questioning data and interpreting what it means. These web analysts have the potential to be at the centre of organisations in the future, influencing every other department.

Q&A
Name Dan Barker
Job title Independent
ecommerce consultant
Web www.barker.dj

What's the point of web analytics?
With all the past and present talk of big data (a now hackneyed expression) and how it's "the new oil", you may be forgiven for thinking that, as long as you have access to it, all will be well.
Even if you have a good statistician, there's no guarantee of success. The idea that data can be manipulated and interpreted to give desired answers isn't new. So what comes first, the data or the questions? There's really no doubt that the questions should, and do, come first. Whether you have access to big data or a clairvoyant doesn't really matter so long as you get insight that, when put into practice, yields the most cost-efficient results. In spite of the industry rhetoric, [web] analytics is not an end in itself; it's a means to find (one of many) ways in which performance can be improved in the digital environment.
Now that the US presidential election is well and truly over, the covers have been lifted on Obama's email fundraising efforts. Much of the funds Obama's campaign raised online were small donations levied from the barrage of emails sent out by his campaign staff. The process involved incessant A/B testing and analysis of all aspects of the email campaign, from subject lines, body copy, tone of voice (including mild profanity such as "Hell yeah...") to font type and size, colour of the Donate buttons, as well as recency and frequency, and no doubt much more. Even when they had a winning formula, they would continue to test and analyse performance.

Keep asking questions
The campaign staff found that what once worked did not always continue to do so. In other words, every format appeared to have a use-by date. The point is they never stopped asking questions that would help them answer the most important question of all: what do we have to do to raise as much campaign funding as possible using email? While a robust testing programme no doubt generated big data, it was the perpetually enquiring and analytical mindset of the digital analytics team that led to the big money. In the end, bigger even than Mitt's cash pile.
Hugh Gage is owner of Engage Digital

.net: Without access to web analytics data, what would you use instead?
DB: Sometimes I use other systems to get an abstract picture of what's going on: links, shares and search volume over time, volume of search interest in comparison to the site rankings, back office order numbers in comparison to marketing, and so on. From there you can test hypotheses: if we change this input, this output should occur.

.net: What is the most common misconception about web analytics?
DB: Occasionally people fall into the trap of thinking of it purely as measuring the current numbers rather than about the information needed to help change future numbers. It's also important to consider if the numbers are correct and give a true picture of reality by asking how the data was measured.

.net: What's more important, stats muscle or an analytical mindset?
DB: Stats tools can give you a good picture of how your website or your marketing activities are performing. If you're analytical enough, you can figure out what's going on and how to improve it even if stats aren't perfect. If you've got the mindset, you can often build the muscle.

Statistics: Digital opportunities
What are the most exciting digital-related opportunities for your organisation in 2013? (2012 vs 2013)
Mobile optimisation: 38% / 43%
Social media engagement: 54% / 35%
Targeting and personalisation: NA / 35%
Content marketing: 18% / 30%
Content optimisation: 37% / 27%
Conversion rate optimisation: 31% / 25%
Marketing automation: 11% / 23%
Joining up online and offline data: 23% / 23%
Brand building / viral marketing: 27% / 18%
Video marketing: 24% / 18%
Social media analytics: 19% / 14%
Connected TV: NA / 3%
Source: Econsultancy Quarterly Digital Intelligence Briefing: Digital Trends for 2013 in association with Adobe

Analytics

Q&A
Name Tim Leighton-Boyce
Job title Google Analytics and ecommerce consultant

Remarketing with
Google Analytics
Imagine a person walks into your store, has a good look around, picks out an item, gets to the cash till and is all ready to pay, but decides they want to do some last-minute shopping around first and walks out; that would be disappointing. The first thing you'd want to do is get them back in. If you were the owner of an online store, getting that customer back would be much easier using Remarketing with Google Analytics (http://netm.ag/ga-bz92).
Last year, Google Analytics (GA) announced its new remarketing option, which is like an enhanced version of AdWords remarketing. Making it work is pretty simple. First, ensure your AdWords and Google Analytics accounts are linked. If you're, in the parlance, a joined-up marketer, then this should already be done. Second, add a line of code to the standard Google Analytics tracking code on every page of your site. This brings in the DoubleClick cookie that's needed for remarketing. You can find the requisite line of code in GA's help centre (http://netm.ag/help-bz92). Third, amend your privacy policy to reflect the use of GA remarketing. Find guidelines in the help centre.
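At the time, the documented change was a one-line swap in the classic asynchronous snippet: the script is fetched from DoubleClick's dc.js host instead of www.google-analytics.com/ga.js. A sketch of the amended snippet is below (UA-XXXXX-Y is a placeholder for your own web property ID; check the help centre for the current code):

```javascript
var _gaq = _gaq || [];
_gaq.push(['_setAccount', 'UA-XXXXX-Y']); // your web property ID
_gaq.push(['_trackPageview']);

(function() {
  var ga = document.createElement('script');
  ga.type = 'text/javascript';
  ga.async = true;
  // The remarketing change: load dc.js from DoubleClick rather than
  // www.google-analytics.com/ga.js, which sets the DoubleClick cookie.
  ga.src = ('https:' == document.location.protocol ? 'https://' : 'http://') +
      'stats.g.doubleclick.net/dc.js';
  var s = document.getElementsByTagName('script')[0];
  s.parentNode.insertBefore(ga, s);
})();
```

Everything else about your tracking stays the same; only the script source changes.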
Now, for the interesting bit. In the admin menu for your Google Analytics account, you'll find a tab
.net: What are your top three tricks in GA?
TLB: Explore the reporting API. This may sound scary, but it's easy to get started and then a whole new world will open up. The best starting place is using Google Drive and the GA magic script (http://netm.ag/magic-242). Also, make sure to track errors to improve usability and conversion rates.

labelled Remarketing Lists. This is where you can start building your remarketing list. Once built, it should automatically show up in your Google AdWords account, where you can activate it and start running ads on Google's display ad network. The real clincher in all this is that setting up Google Analytics remarketing lists can involve as many filters, in as great a variety, as you find in your GA Advanced Segment configuration.
Also, because remarketing is carried out at the visitor level and not the session level, lists can be built using sequential filters and even cohorts. For example, you should in theory be able to retarget a list of people that saw a particular product or group of products, added them to their shopping basket, created an account, or perhaps just got to the payment stage but then abandoned the checkout process and left the site, all in that order. This group could be considered a bunch of pretty hot leads and, by creating this list, you can subsequently remarket to them using highly targeted creative treatments from your product listing ads to reflect both product and price of the item your potential customer had in the basket. l
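The sequential idea can be sketched in a few lines of code. This is an illustration of the logic only, not the GA interface: the event names, visitor records and matchesSequence helper are all invented for the example.

```javascript
// Illustrative only: checks whether a visitor's event stream contains the
// given steps in order, the way a sequential remarketing segment would.
function matchesSequence(events, steps) {
  let i = 0;
  for (const event of events) {
    if (i < steps.length && event === steps[i]) i++;
  }
  return i === steps.length; // every step seen, in order
}

const visitors = [
  { id: 'v1', events: ['view_product', 'add_to_basket', 'begin_checkout', 'exit'] },
  { id: 'v2', events: ['view_product', 'exit'] },
];

// Visitors who viewed, added to basket, reached checkout, then left - in order.
const hotLeads = visitors
  .filter(v => matchesSequence(v.events,
      ['view_product', 'add_to_basket', 'begin_checkout', 'exit']))
  .map(v => v.id);
console.log(hotLeads); // ['v1']
```

The key point the sketch makes is that order matters: a visitor who abandoned before adding to basket would not qualify for the list.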

.net: What are the most undervalued features?
TLB: There's so much you can do with custom dashboards and reports to make your everyday working life easier. Custom segments are even more valuable, since they can add so much power to your analysis.

Hugh Gage is owner of Engage Digital

Statistics: Digital opportunities

(Key challenge | On the radar | Not an issue)

Marketing doesn't have the technical skills to utilise our technology to the full extent
2012: 29% | 49% | 23%
2013: 21% | 45% | 34%

Marketing doesn't have the mathematical skills to fully analyse and optimise our programs
2012: 26% | 33% | 42%
2013: 21% | 37% | 43%

148

The SEO Handbook

Source: Econsultancy Quarterly Digital Intelligence Briefing: Digital Trends for 2013, in association with Adobe


How does your organisation regard the following challenges for 2013?

.net: What's the single most important metric that no analyst should be without?
TLB: That depends on the purpose of the site, but my generic answer is that it would probably be goal conversion rates or funnel abandonment rates, which are segmented to compare performance for different groups of visitors. The team working on the site should be able to recognise the numbers as ones that they have direct control over. The numbers are there to help make changes.


Conversions are changing

Q&A
Name Anna Lewis
Job title Digital
marketing executive
Company Koozai
Web www.koozai.com
Twitter @Koozai_Anna

Conversions
are changing
which is a legitimate thing to do; and four if you consider one and three as part of the whole.
It wouldn't be unreasonable to work on the basis that if each step in the conversion journey were optimised to its maximum using a rigorous testing schedule (at least for the digital steps in the journey), then this will act as the rising tide that floats all [customer] boats in the context of driving improved performance. In addition, the issue of visitor analytics still has a slight European-shaped cloud hanging over it, so it's not yet clear how much we will really be able to rely on it.
In truth, it doesn't matter how smart the analysis and insight is: if the resource to act on it isn't available, internally or externally via agencies and consultants, then progress won't be made. It's surprising how many organisations aren't set up to move quickly to take advantage of the output they get from their web analytics and other data. This is where a startup mentality must be adopted in organisations of all sizes. As everybody proclaims, the pace of change online is both rapid and relentless. To know that and not structure human resources accordingly is baffling. l

.net: What's in your analytics tool box?
AL: Right now, my tool box is mainly Google Analytics, a bit of Tag Manager to sort the code out, some juicy API export spreadsheets, good old Excel and the best tool for any analyst: my brain! I'm also a fan of heatmap tools Crazy Egg (www.crazyegg.com) and ClickTale (www.clicktale.com).

Hugh Gage is owner of Engage Digital

Statistics: How mobile ads are used for brand awareness

Almost half of advertisers are using mobile advertising to increase brand awareness as opposed to driving in-store footfall. [Chart: responses spanned brand awareness (46%), increased footfall, product launch/release, registrations, site traffic and sustained in-market presence.]

Source: Econsultancy's 10 interesting digital marketing stats report

.net: If you had one wish, what would it be?
AL: My wish is for all the data I see to be usable. It needs to be accurate, not sampled, have enough volume, be unbiased and not based on a flawed collection method. I hate making decisions on data that doesn't give the whole picture!


These days everybody's familiar with conversion rates and conversion funnels. One gives you a top-level overview of how well your site is performing. The other gives you a rough idea of where the problem is in terms of the on-site customer conversion journey. Of course, there are other ways in which your site's performance, and indeed the performance of your online business as a whole, can be evaluated.
Now, with the customer journey shifting further towards multi-channel and visitor-centric analysis (although this latter aspect may yet be stymied in Europe by the EU position on privacy), the way in which we think of conversion is changing. As potential customers rely more heavily on some form of mobile device during the research phase of their buying cycle, as well as the physical offline experience in the form of the high street retail outlet (the infamous issue of showrooming), and then perhaps come back to a laptop or desktop to make the final purchase, it could be argued that [in this example] there are one, three and four different conversion aspects to consider. One being the customer conversion journey across all of these touch points on the purchase cycle; three if you count each of the different touch points separately,

.net: What's exciting you most in the world of analytics right now?
AL: There's a lot of exciting stuff going on in the analytics world! Google are releasing lots of new functionality for people using Google Analytics, which is great. Universal Analytics is the big one right now, but real time and social reports have also both seen good improvements recently. I'm also loving the additional data we keep getting in GA.


Your marketing secret weapon: blogging

Marketing

Q&A

Your marketing secret weapon: blogging

Developing a new website? Then make sure you don't forget to build a blog into your marketing plan. It's a powerful way to help visitors discover your website, is great at helping to build trust, and makes keeping in touch with customers much easier.
Getting found in Google is a top marketing priority. Blogging brings a number of SEO benefits and plays an essential part in the drive to get high search engine rankings. New content being placed on the website makes Google happy in its quest to find fresh content. The blog content tends to stick to a central theme, which means there is going to be topical keyword density and relevance.
Blogs also tend to acquire links from other blogging websites, whether in the form of explicit links from other sites, trackbacks or links from blog directories. Of course, RSS feeds can ping Google the moment new content is published, triggering rapid-fire indexing of the content.
One key benefit to a blog is the ability to keep in touch with your customers. Blog comments are a great two-way communication channel. And unlike other forms of social media, you can moderate and edit comments before they go live, which appeals to many businesses worried about negative or libellous comments. Or do you want to ensure that your content can reach customer iPads? Then the Flipboard iPad app gives a fantastic interface to blog

content in tools such as Google Reader.
There is a new marketing trick up the collective sleeve of the blogger: the principle of authorship. Photographs of authors appearing in the search results next to the articles they have written create a more prominent, and trustworthy, search result. Google is looking for great content from great authors, and it knows there is lots of great content in blogs. By linking to your Google+ profile, you can gain the additional benefit of a highly visible search result.
A blog hosted on your own domain gives all the key SEO benefits: content on your domain, inbound links to it, and a customer experience where visitors are right in the mix of your own site's marketing messages. An externally hosted blog loses that SEO power and gives visitors a less strong experience.
Other development activities could include integration with email marketing systems, with LinkedIn profiles or with other social platforms. l
Susan Hallam is CEO at Hallam Internet

Blogging plays an essential part in the drive to get high search engine rankings


Statistics: Social media marketers' blogging activities

Blogs remain a strong area of focus for social media marketers, with 68% indicating they'll increase their blogging activities.
The self-employed are more likely to step up their blogging, with 76% reporting increased activity (down from 79% in 2011).
Of those who work 11 hours a week or more with social media, at least 73% of marketers plan to increase their blogging.

Increase: 68% | Decrease: 2% | Stay the same: 18% | No plans to utilise: 12%
In 2010, 81% of marketers planned on increasing their blogging, and 75% said the same in 2011.

Web analytics expenditure 2009: technology 39%, consulting and services 19%, internal staff 42%

Source: www.socialmediaexaminer.com/SocialMediaMarketingIndustryReport2012.pdf

Name Matt Davies
Role Creative director
Company Attitude Design
URL www.attitudedesign.co.uk

.net: What's your top priority for a new blog?
MD: Each element of text needs to be styled in a legible way, which is easy on the eye and clearly differentiates each area of text. Think about using fonts, line spacing, sizes and colours effectively.

.net: How should developers work alongside designers when building a blog?
MD: My number one tip is for developers and designers to go through a wireframe process together before any designing or coding begins. After wireframes are in place, the designer can get creative on styling the elements and the developer can sit back and be content to know that what the designer comes back with will be possible.

.net: What is the future for blog design?
MD: Technology is moving us on into live microblogging, which is like Twitter but through blogs, with users exchanging small snippets of information quickly. This new technology will mean that layouts will probably become more conversation-like as opposed to walls of text, which is the norm at the moment.


Using infographics as a marketing tool

Q&A
Name Paul Roberts
Role Motion/graphic designer
Company Harmonix Graphics
URL www.behance.net/harmonix

Using infographics as a marketing tool

Start by submitting your infographic to free showcase websites like Visual.ly, Infographic File, Infographics Only and Visual Loop. Getting featured on these services will get you a high-quality link to your website, and will also serve to drive visitors.
Next, publicise it on social sharing sites like Tumblr, Pinterest, Reddit and StumbleUpon, and get ready to tweet about it and put it on your Facebook page. Now you are going to really pump up the marketing action. Have a look around, and find other infographics that have covered a similar topic. Use a backlink research tool like Open Site Explorer, and discover sites that have referred to these infographics. Get in touch with these site owners, and let them know about your new infographic.
And finally, make sure your influential contacts know about your infographic. Drop them an email and invite them to use it on their own blogs! l

.net: What tools do you use?
PR: I use Photoshop to create the layout and illustrations. If the design requires a 3D-style illustration I'll use LightWave to create a basic isometric object, which is then imported into Photoshop and used as a template to draw over, or coloured if required.

Susan Hallam is CEO at Hallam Internet

A powerful tool in
your marketing arsenal

Statistics: Chart style

Percentage of infographics with the following charts:
Bar chart: 32%
Line chart: 24%
Pictorial chart: 24%
Pie chart: 22%

*Source: http://ivancash.com/Infographic-Infographic

.net: Where do you get your inspiration?
PR: There are so many amazing infographics being created at the moment that inspiration is not hard to find. Of course, the internet has a wealth of infographic sites that keep me up to date with the latest trends. And I also have an ever-growing collection of infographic design books that cover the main design principles.


Infographics are a powerful tool in your marketing arsenal. These graphic visual representations of information or data can breathe new life into your web marketing activities if you follow a few basic steps.
One of the main online marketing benefits of infographics is the power they have to generate inbound links for your SEO, and to increase the sharing of the content on your website. And of course they're a great way to get your marketing messages across without lots of boring text.
To get started, you are going to need some interesting research data that catches your audience's attention. Your data then needs to be converted into a great graphic. Data visualisation is a fascinating discipline, but don't make the mistake of assuming that general graphic design is the same thing as infographic design.
Once you have your infographic, you need a plan to promote it in order to generate links and start to get some marketing traction. Embedding the infographic on your own website means your current visitors will get a chance to see it, but given the amount of time you will take designing it, you will need to make it work hard promoting it to new audiences to get a good return on your investment.

.net: What makes a great infographic?
PR: Successful infographics tell a story that the user wants to engage with. They work best if they surprise, or amaze, or even inspire the user. And they have to have great data to start with. Even the best design can't make a boring story interesting by slapping on a few graphs and charts.



Testing times

Q&A
Name Steve Durr
Role MD
Company This Is Kode Limited
URL http://thisiskode.com

Testing times

.net: What single design aspect makes the biggest difference to website performance?
SD: The search bar. Providing a prominent visual to the search has been proven to help increase sales. Helping customers find what they are looking for is the key for securing quick conversions.

help, or introduce an online survey or customer feedback tool to identify barriers or areas of concern.
Usually the most successful CRO boils down to simplifying things. What you do not want to do is test something silly like just changing the colour of a button. That is simply daft. You need substantial changes that will make a big impact on the number of conversions the site delivers.
Your next job as a web developer is to create two or more variations of a page, let them go live concurrently and wait to see what happens. There are free services, such as Google's Content Experiments, that will manage this process for you, delivering the variations of your pages to visitors from the search engines.
In order to deliver a successful experiment, pick a page that is a busy one on the site, and get sufficient unique visitors to know if the test has worked or not. You also need to be patient, and let the test run for at least a couple of weeks to be sure you are getting representative visitor traffic. Most of all, you need to make big, bold changes that will deliver a substantial change in your visitor behaviour.
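How many visitors are "sufficient"? One quick back-of-the-envelope check you can run on the raw counts is a two-proportion z-test. This sketch is purely illustrative (the function name and the figures are invented for the example), not part of Content Experiments itself:

```javascript
// Rough significance check for an A/B test: compares conversion counts for
// control (A) and variation (B) using a pooled two-proportion z-test.
function abTestZScore(convA, totalA, convB, totalB) {
  const pA = convA / totalA;
  const pB = convB / totalB;
  const pooled = (convA + convB) / (totalA + totalB);
  const se = Math.sqrt(pooled * (1 - pooled) * (1 / totalA + 1 / totalB));
  return (pB - pA) / se;
}

// |z| > 1.96 is roughly significant at the 95% level (two-tailed).
const z = abTestZScore(120, 2400, 156, 2400);
console.log(z.toFixed(2), Math.abs(z) > 1.96 ? 'significant' : 'keep testing');
```

If the score hovers near zero after a fortnight, that is usually a sign your change wasn't bold enough, which is exactly the point made above.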

.net: How important are images?
SD: How many of us would buy what we can't see? Product images are essential. It's great images that make the sales.

Successful CRO boils down to simplifying things

Statistics: What changes are most tested on a website? (2011 vs 2012)

[Bar chart comparing 2011 and 2012 testing rates across: call to action buttons, page layout, copy, navigation, checkout process, promotion and offers, images, product selection process and security fields.]

.net: Does the way forms are designed improve conversions?
SD: Forms can be the bane of the online shopping experience. Having multiple confusing forms, replicating information or just not being clear is enough for anyone to abandon the basket.

.net: How do you persuade a client to amend their design?
SD: Large online retailers invest untold time and money to establish online best practices. Using recognisable high street names as examples helps to reassure the client that the amends will improve conversion.


Susan Hallam is CEO of Hallam Internet

Source: http://econsultancy.com/uk/reports/conversion-rate-optimization-report

Web developers want to build the best possible sites for their customers, creating a site that meets the client's requirements. But what happens if a website isn't meeting its goals or delivering what the client expected? Or, what if a client wants to further improve a well-performing site in order to make it work even harder for the business?
We're now entering the territory of conversion rate optimisation (CRO). When a website visitor completes a goal on a website, this is known as a conversion. A conversion could be filling in an enquiry form, making a purchase, downloading a whitepaper, or signing up for a newsletter. CRO helps your website perform better and gets more visitors to do what you want them to do. CRO is the process of guessing what the problem may be, and testing different solutions.
The web development team plays an integral role in a CRO project, but having a firm steer from the client is an essential starting point. A solid understanding of customers, their needs, and what could be stopping them from converting on the website will give you a solid base to work from.
As a web developer, you'll be asked to provide support in a number of different ways. A typical first step would be to implement a visitor feedback solution on the site that enables the customer to let you know what is going wrong at the precise moment they are having difficulty. You could embed a live chat function that prompts users to ask for

Make sure 'S' stands for success

Marketing

Q&A
Name Ben Wood
Role Online marketer
Company Hallam Internet
URL www.hallaminternet.com

How to make sure 'S' stands for success

Great web design is like walking a tightrope. Getting the balance wrong sees a high bounce rate from your site, with users looking at one page before leaving. While every web design project is unique, why not try using this simple marketing framework to keep your focus on those all-important web visitors. Called the Five Ss, it defines five universal characteristics of your web visitors: Selfish, Sceptical, Stubborn, Stuck and, most importantly, Stupid.


Your visitors are stupid A contentious statement maybe, but I bet I've got your attention now! First-time visitors don't know about your business yet; existing ones may not know all you have to offer. Write a strong headline that says in plain language what you do. Have a prominent logo with a clear, compelling strapline. Remove unnecessary text and ensure visitors can work out what you do in a few seconds. Business owners are often so close to their offering that they can't see the obvious; as a designer your job is to help the stupid visitors.
Your visitors are selfish My visit to your website is about me and my needs. I need to understand why choosing you will help me out. Show me relevant pictures. Use words like 'you' rather than 'us' or 'we' or 'I'. And reassure me just how easy it is for me to proceed to the next step of your buying process.
Your visitors are sceptical They are bombarded with internet marketing messages and have learned

.net: How useful is bounce rate in assessing site performance?
BW: It's a simple but effective measure of user engagement, and if visitors are leaving after one page view then you know you have problems. Don't focus on overall site bounce rate; drill down and look at specific landing pages, the traffic source and that specific bounce rate.

to be cautious. Do you include trust signals such as partners' or accrediting bodies' logos, or reviews and testimonials? Do you set out benefits I'll get if I offer you my email address? Do you offer multiple contact points? Your design must make trust easy.
Your visitors are stubborn You want me to buy that blue widget. So give me alternative calls to action in case I'm interested, but not ready to take the plunge. Want me to follow you on social networks? Sign up for your newsletter? Bookmark your site? These micro-conversions are steps along the path to encouraging visitors to buy that widget.
Your visitors are stuck Even keen visitors may need the push a compelling call to action (CTA) can provide. Are your CTAs big, prominent and colourful enough, and positioned within enough white space that they're easily recognisable? Are you using visual clues: arrows, fingers pointing or eyes looking the right way? Have you removed distractions so that when I'm ready to go for it you don't sidetrack me? l
Susan Hallam is CEO of Hallam Internet

Site visitors are bombarded with marketing messages and have learned caution

Statistics: Bounce rate by industry

The metrics of an average website:
Average time on site: 190.4 seconds
Average page views: 4.6
Bounce rate: 40.5%
New visits: 62.9%

Industry | Bounce rate
Retail sites driving well-targeted traffic | 20-40%
Simple landing pages with one call to action, such as 'add to cart' | 70-90%
Portals such as MSN, Yahoo groups etc | 10-30%
Service sites: self-service or FAQ sites | 10-30%
Content websites with high search visibility (often for irrelevant terms) | 40-60%
Lead generation: service for sale | 30-50%

Source: http://blog.kissmetrics.com/bounce-rate

.net: Are there


other ways to measure
user satisfaction?
BW: User experience
feedback tools are a
great starting point,
and users leaving
feedback often convert
to sale. There are
a number of free
feedback tools that
can be easily integrated
into any website.
Take a look at services
such as 4Q (http://4q.
iperceptions.com)
or Kampyle (www.
kampyle.com).
.net: Is Google
Analytics the best
choice for web metrics?
BW: Its the market
leader, and has the
additional benefit of
being free. The Adobe
and IBM offerings may
be more powerful, but
for most web analysts
the Google offering
does the trick, and it
now also comes in an
Enterprise version.

Google AdWords

Google's AdWords Enhanced Campaigns presents pros and cons. Start migrating campaigns now to embrace the new PPC possibilities, says Emily Pope

The cons
l Reporting of performance by device will now be a lot more time consuming.
l We'll be able to view tablet performance individually, but we won't be able to optimise it. If we want to increase visibility on tablet devices we will also be increasing our visibility on desktop. And vice versa.
l Tablet-specific ad copy will now be redundant.
l Assigning marketing budgets to each device will be a thing of the past.
l Advertisers may see volume/click-through rates/cost-per-click/cost and other key performance indicators affected, as now every keyword targeted to desktop will also target tablet too.

The pros
l Fewer campaigns to manage; three device
campaigns will become one.
l We still have the ability to adjust mobile bids.
l We can now adjust bids by location, allowing us
to be more visible to customers nearer to our
offline business.
l We can now see performance by each individual
site link in AdWords, allowing PPC managers to
use these insights to improve engagement.
l We can schedule site links by time and day, and
assign specific site links to mobile devices.

Test migrating one campaign over first and play around until you feel more comfortable
Emily Pope

l Site links will be available at ad group level and campaign level, allowing for more control over which site links appear with which ads.
l Call extensions with a Google forwarding phone number will now be free on desktop and tablet. Previously this was charged at £1 per call.
l Although not available at the moment, Google has said we will eventually be able to track conversions across devices. For example, seeing the contribution of mobile to desktop.
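The bid adjustments mentioned above are simple percentage multipliers on your base max CPC. A sketch of the arithmetic follows; the function name and the figures are illustrative, not part of any AdWords API:

```javascript
// Enhanced Campaigns bid adjustments are percentage modifiers (-100% to +300%)
// applied on top of the campaign's base max CPC.
function effectiveBid(baseCpc, adjustmentPercent) {
  return baseCpc * (1 + adjustmentPercent / 100);
}

console.log(effectiveBid(1.00, 25));    // +25% mobile modifier on a £1 bid -> 1.25
console.log(effectiveBid(1.00, -100));  // -100% opts the device out entirely -> 0
```

A -100% mobile adjustment is the only way to exclude mobile under the new model, which is why there are no mobile-only campaigns any more.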
Google is changing and adapting in line with technology and trends, and our job is to move with them. Despite there being a number of downsides to these changes, there are also a lot of positives to come from this update, and we are already starting to hear of Enhanced Campaign success stories.
My top tip? Start transitioning your campaigns today. Although you don't have to fully migrate until 22 July, the sooner you dip your toe in the water, the sooner you can understand the impact of the changes and react to your learning.
Still nervous? Roll out your campaigns gradually. Test migrating one campaign over first and play around with it until you feel more comfortable with the settings and bidding options. Change isn't always easy but, thankfully, with the new AdWords Editor 10.0, you can upgrade in bulk and set all your bid adjustments in no time. Start now and start thinking about all the great new PPC possibilities. l
Emily Pope is senior PPC manager at Amaze (www.amaze.com)


In February, Google announced the single biggest update in the history of AdWords. Enhanced Campaigns, a controversial update that's got pay per click (PPC) managers talking all over the world, waves goodbye to the days of separate mobile, desktop and tablet campaigns.
You may ask: why? Google's answer: future-proofing. The line between tablet and desktop is becoming increasingly blurred, and this will only worsen with the emergence of new devices, such as those with detachable keypads. Google suggests that there is now little distinction between the two.
So what does this all mean exactly? In a nutshell, it means that tablet and desktop will be rolled into one group, with mobile still considered separate to an extent. For each PPC campaign created you will be able to target desktop and tablet, with the option to then either include or exclude mobile. Therefore, all campaigns must now target either all devices or tablet/desktop only; there will be no mobile-only campaigns.
So, what does this mean for PPC campaigns? If we dig a little deeper into these changes and what they mean, we can set out some clear pros and cons of these changes. I'll start with the cons, as I always prefer to end on a positive note.

Creating good social content

Social

Social influence measuring tools

Name Klout
Info Loved and hated in equal measure, Klout frequently elicits negative comments due to being less than transparent about its scoring method. It can be useful as a research starting point, but isn't to be taken too seriously. (www.klout.com)

Creating good social content

As content is so vital to successful social media engagement, you want to make it more likely that your content is engaged with and shared online. Here's how to effectively create and use visual and audio content.
It's clear that we live in the era of the visual web. Human beings are hardwired to respond to visual cues. We can specifically see this at work on Facebook, where its algorithm, EdgeRank, gives more weight (and therefore priority in the news feed) to photographs over pretty much any other type of content. So how do you go about creating photos that people will be more inclined to share? Well, assuming you're not a brilliant professional photographer, there are a couple of points to think about. Firstly, try to develop your own style; on Instagram, the most popular users tend to have a style that doesn't change wildly from photo to photo. Secondly, there has been an explosion in popularity of photo memes. Come up with, or contribute to, a good one and you could find yourself with some great social content.
Moving away from visual imagery, an area where there is a huge opportunity for individuals and businesses to create social content is audio. Interesting audio is quicker and easier to produce than video, and it can be fantastic at drawing people into your story. As experienced audio blogger Christian Payne (@Documentally) explains, audio can create instant rapport between you and the listener. He notes that with audio, the individual's mind fills in the spaces and the content becomes more personal in the mind of the listener. Tools such as Audioboo (www.audioboo.fm) and ipadio (www.ipadio.com) mean that you don't even have to master podcasting: just record, upload and you're done.
Of course you still need an overall strategy for your content. But play around with creating different types of content, and you may find your audience engaging with you much more.
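Facebook's public description of EdgeRank at the time was a sum, over the edges attached to a story, of affinity × edge weight × time decay. The toy sketch below illustrates that idea only; the weights, affinity values and decay function are invented for the example, not Facebook's real figures:

```javascript
// Toy EdgeRank-style score: sum over a story's edges of
// affinity x edge weight x time decay (all values here are illustrative).
const EDGE_WEIGHTS = { photo_post: 3, comment: 2, like: 1 };

function edgeRankScore(edges) {
  return edges.reduce(function (sum, edge) {
    const decay = 1 / (1 + edge.ageHours); // newer edges count for more
    return sum + edge.affinity * EDGE_WEIGHTS[edge.type] * decay;
  }, 0);
}

// A fresh photo from a close friend outranks a day-old like from a stranger.
const photoScore = edgeRankScore([{ type: 'photo_post', affinity: 0.9, ageHours: 1 }]);
const likeScore = edgeRankScore([{ type: 'like', affinity: 0.1, ageHours: 24 }]);
console.log(photoScore > likeScore); // true
```

The sketch shows why photo-heavy content tends to surface: the heavier edge weight compounds with affinity and freshness rather than merely adding to them.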

Name Peerindex
Info Does a similar job to Klout. Includes useful lists of people within different topics. Like Klout, they also run perks for those with scores that measure up. (www.peerindex.com)

Name Kred
Info Created by the team that run social analytics tool PeopleBrowsr, Kred is more open about the factors it takes into account with its scoring. (www.kred.com)

Katie Moffat is an online PR/social media specialist

How to stay legal in social media


Expert advice
Web pro

Name Steve Kuncewicz


Job title Intellectual property,
media and social media lawyer
URL www.linkedin.com/in/
stevekuncewicz

After the first true social Olympics, social media and its use has undeniably cemented itself into the mainstream. Some tweets made headlines for all the wrong reasons, including one to diver Tom Daley saying he had let down his father. A subsequent threatening tweet to another user led to an arrest. So, how far can you go online?

The SEO Handbook

1 Anything you say online (eg tweet, blog or Facebook) that 'the man in the street' could view as negatively affecting a reputation could be defamatory. That is unless you can rely on a defence of it being true, honest comment as a statement of opinion based on true fact, or protected by privilege. A recent case set the total cost in damages of each defamatory word at £3,750.

2 If two or more tweets or posts are deemed to have caused fear, alarm or distress, this could lead to a claim in damages or even a criminal conviction. A menacing, obscene or indecent tweet could result in a prosecution under the Communications Act 2003. Jokes in bad taste will probably not lead to a prison sentence. Threats, however, will be taken seriously.

3 Impersonation is not always flattering. Ask the tweeter who impersonated Wendi Deng. You could face a libel action, or one of passing-off, and could even face a fraud claim.
4 Content can be shared on the terms of use of most platforms, but content that contains an extract from a copyright work could lead to a claim for copyright infringement. It's best to always ask for permission to quote extracts, to avoid having to ask for forgiveness.
5 Finally, any post, message or tweet containing information that's either confidential, or about which the subject of your post could have a reasonable expectation of privacy, could lead to a claim under the Data Protection Act or for breach of privacy.


Tools to travel
back in social
media time
Name Timehop
Info Timehop collates
updates from various
social media platforms
from one year ago and
sends them to you in a
daily email. It's
surprisingly rewarding to
be reminded of what you
were doing.
www.timehop.com

Targeting the customer
Very few people would argue with the assertion that smartphones have revolutionised our behaviour as individuals and consumers. By 2016, the number of UK users is expected to double to 41.9m. Marketers have long understood the importance of optimising websites for mobile, but the Social Local Mobile (or SoLoMo) phenomenon means there are a whole host of other opportunities to connect with customers.
On one level SoLoMo is about serving up useful information, offers or discounts based on the physical location of an individual. This may be as a result of local search: for instance, I search for the nearest drycleaners on my mobile and the site gives me the details of the nearest branch to my physical location. Or I check in on Foursquare and am notified of a nearby restaurant offering a special discount that day, a restaurant that also happens to have been recommended by several of my friends on Foursquare. That's SoLoMo in action. The concept has been around for a few years now, but the technology and associated apps are growing in sophistication. Add in geofencing and it becomes really interesting. Using a service such as Geoloqi (https://geoloqi.com), developers can add geofencing to apps so that when an individual enters a particular area they're sent a message.
"Imagine travelling to Los Angeles, and when you land, your smartphone instantly alerts you with information about the address that you're heading to," says Geoloqi's co-founder Amber Case. "More than that: Geoloqi has cross-referenced your arrival time with the bus schedule, and alerts you to the departure time of the next bus to get you to your destination. Once you're on the bus, the phone alerts you when you're approaching your stop." You can see how this concept will work brilliantly with retailers and brands. I may choose to be notified when one of my friends checks into a particular venue, or alerted if a theatre I'm near has some last-minute tickets available.
When it comes to SoLoMo, in the future, being rewarded by becoming the mayor just won't cut it.
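The location check at the heart of geofencing is straightforward to sketch. The following is a minimal, hypothetical example (not Geoloqi's actual API): it tests whether a device's coordinates fall within a circular fence, using the haversine great-circle distance.

```python
import math

def haversine_m(lat1, lon1, lat2, lon2):
    """Great-circle distance between two lat/lon points, in metres."""
    r = 6371000  # mean Earth radius in metres
    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    dphi = math.radians(lat2 - lat1)
    dlam = math.radians(lon2 - lon1)
    a = math.sin(dphi / 2) ** 2 + math.cos(phi1) * math.cos(phi2) * math.sin(dlam / 2) ** 2
    return 2 * r * math.asin(math.sqrt(a))

def inside_geofence(position, centre, radius_m):
    """True when position (lat, lon) lies within radius_m of the fence centre."""
    return haversine_m(position[0], position[1], centre[0], centre[1]) <= radius_m

# A 2km fence around LAX: a nearby point triggers the alert, downtown LA doesn't
lax = (33.9416, -118.4085)
print(inside_geofence((33.9450, -118.4000), lax, 2000))  # → True
print(inside_geofence((34.0500, -118.2500), lax, 2000))  # → False
```

A real service would run this check against a stream of location updates and push the message on the transition from outside to inside the fence, rather than on every reading.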

Name OhLife
Info Appealingly simple
to use, OhLife sends a
daily email asking 'how did your day go?'. Reply
with as much or as little
as you like, and you end
up with a neat life diary.
www.ohlife.com
Name Momento
Info This beautifully
executed iPhone app
helps capture your life as
you go along. It collates
updates from many social
networks, you can add
your own updates, and
it's also a useful way to
search back through old
tweets by date.
www.momentoapp.com

Katie Moffat is an online PR/social media specialist

Expert advice
Name Julian Ranger
Job title CEO
Company SocialSafe
Web www.socialsafe.net

As internet users, we are increasingly pouring our lives into online profiles that we don't own. We may be in possession of the keys, but the truth is, we're really driving a rental car. It makes sense to consider how you'd feel if you lost all the data: all those memories gone in an instant, thanks to a server failure or a successful hack.

Social data backup is an emerging market, with several companies offering a wide range of applications or services that extract your data from a variety of different social networks and online profiles. Most of these store this copy elsewhere in the cloud, while a smaller number allow you to save it locally, which has its obvious privacy benefits.
However, the social networks often limit the amount of data that can legally be extracted by these apps. For example, Twitter only allows your last 3,200 tweets to be accessed. So it's probably best to start backing up your content now, in case you're unable to access the data at a later date.
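The paging behind such a backup tool can be sketched in a few lines. Everything here is illustrative — `fetch_older_than` is a hypothetical stand-in for whichever network client you use — but it shows the pattern: walk backwards through your timeline, saving as you go, until the service's cap (roughly 3,200 tweets in Twitter's case) cuts you off.

```python
import json

TWEET_CAP = 3200  # Twitter exposes only about this many recent tweets

def backup_tweets(fetch_older_than, path):
    """Page backwards through a timeline, saving everything reachable to a JSON file.

    fetch_older_than(max_id) is a hypothetical client call returning a
    newest-first list of tweets with ids below max_id (all of them if None).
    """
    saved, oldest_id = [], None
    while len(saved) < TWEET_CAP:
        page = fetch_older_than(oldest_id)
        if not page:
            break  # nothing older is reachable
        saved.extend(page)
        oldest_id = page[-1]["id"]  # continue below the oldest tweet seen
    with open(path, "w") as f:
        json.dump(saved, f)
    return len(saved)
```

Because the loop stops at the cap, anything older than the service allows is simply unreachable — which is exactly why starting the backup early matters.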
While not all services provide a comprehensive backup, or indeed one that allows you to actually interact with your data, a simple Google search along the lines of 'Facebook backup' or 'download my tweets' will return several options in this area, allowing you to make your own decision. Whatever option you choose, do it now or you may live to regret it. We accept the need to back up our professional existence, so why not our personal lives?



How to Take control of your online social data


Tools to create
social media
infographics

Why you can't speed up social marketing
Organisations new to social media often bemoan the amount of time it takes to get any kind of payback from the effort required to run an effective marketing campaign. "But how am I supposed to do all this on top of my day job?" they ask. "There are barely enough hours in the day as it is." This complaint is exacerbated by the myth, often propagated by social media gurus, that using social networks for marketing doesn't require any additional budget. While on the face of it marketing via social media costs nothing (unless, increasingly, you're using Facebook, but that's another story), there is a hidden cost involved: that of the resources necessary.
Building an effective presence on social media takes time, effort and patience. In the same way that you can't rush new friendships offline, it's virtually impossible to speed up the formation of your online community artificially. This means that you need to accept several things. First, that you're in it for the long game and social media is not a quick-fix marketing solution. To convince your boss, have a look at the online profiles of a couple of businesses that use social media effectively. You'll undoubtedly find that they've been doing it for quite some time. Use these as examples when putting your case for taking a more long-term approach.
Second, it makes much more sense to focus on one or two networks and use them really well than set up a profile on Twitter, Facebook, Google+, Pinterest, Tumblr et al and attempt to keep them all updated with relevant content.
Third, devote some time (I'd suggest at least 20 minutes) every day to contributing to online communities, by sharing content and, importantly, joining in and contributing to other conversations. Success will always be related to what you have to say and how you say it: bland, boring or corporate marketing messages will be met with a resounding wall of silence.

Name Visual.ly
Info One of the better-known tools to help you
produce infographics,
Visual.ly plugs into your
Facebook or Twitter data.
You can use existing
templates or tap into the
Visual.ly marketplace for
a more custom and
costly solution.
www.visual.ly
Name Easel.ly
Info An infographic-creation tool similar to
Visual.ly, Easel.ly offers a
few more options for
customising its output,
using its bank of available
icons and graphics.
www.easel.ly
Name Infogr.am
Info Infogr.am is a
simple-to-use tool that
enables you to create not
just infographics but also
charts and diagrams. It
costs nothing to sign up.
www.infogr.am

Katie Moffat is an online PR/social media specialist

How to Promote your product or service on Twitter


Expert advice

Name Robert Mills


Job title Studio manager
Company Bluegg
Web www.bluegg.co.uk

When promoting your product or service on Twitter, there's a fine line between effective marketing and annoying behaviour. We all follow someone who has a message to share, but it's frustrating when blatant self-promotion appears constantly in your timeline. Here are a few simple ways to promote your services more effectively.

1 Timed repetition
Information can be lost quickly on Twitter as your tweet slides down your followers' timelines. Mentioning something more than once is fine, but do so at intervals, with other tweets in between: if it's all you tweet about, people will just see your content as noise, which will have the opposite effect to the one you intend. Consider time zones, too, so that your tweets reach as many people as possible.
2 Provide context
How you say something is as important as what you say. For example, don't just tweet, 'Buy my book', and then link to it. Instead, if you say something about the book and who might benefit from reading it, it will be easier to get the attention of the audience. Give people a reason to look: if there are associated blog posts, sample chapters or anything else linked to your product, tweet those too.
3 Check your feedback
Set up saved keyword searches so you can easily see what people are saying about your product, campaign, or whatever. Whether good or bad, it's worth knowing. And, if you do spot a tweet, reply to it to say thanks for buying your product or service, and ask for feedback.
4 Retweet with care
Be very careful about retweeting every mention or compliment you receive. This can seem egotistical and annoying to your followers. Favourite the tweets instead and preserve them for your own reference, or use them to create other forms of promotional material later.

Colin Grieves on
Social data and search

With the release of Facebook's Graph Search, the inevitable rise of social search will increasingly be at the core of marketing budgets in 2013, says Colin Grieves

Social search
With Graph Search, marketers will be faced with the same issues, as they look to use it effectively for their business. For a long time, Facebook's search bar has been uncomfortably inadequate. Therefore, the Graph Search announcement was a relief, but, as always, advertisers are already working to identify the associated opportunities and challenges. In order to ensure they feature in search results, brands will need to engage with their social communities, and this will propel creative and effective content to the top of the social media manager's mind. Clever marketers will use insights gleaned from social listening and applications to inform the type of content used within their owned properties.
So, here are some tips on how to prepare for the inevitable rise of social search:
1 Get your house in order
It's important to ensure that your Facebook page is as complete as possible (as is already the case in SEO best practice). Every single field must be filled out. If you aren't categorising yourself, Facebook will make decisions for you, or you won't feature at all.
2 Start thinking in pictures and minimise links
Only directly shared photos and videos will show in photo and video searches, so minimise Facebook links for this type of post.


3 Keep building your community
According to recent reports, advertisers recently took issue with the News Feed algorithm changes that caused a decline in organic reach. Those who spent their budgets acquiring fans were beginning to ask what their pages were worth if they weren't able to reach their fans with a simple page post. Graph Search will boost the value of this community; the more fans you have, the more likely you are to show up in a user's search.
4 Encourage sharing
Search results will be highly personalised, meaning that an identical search will yield different results for you and me. Strong connections between a user and an object will likely be given priority over weak connections. Therefore, brands need to think about how they can strengthen the relationship with their fans by encouraging sharing of content from their page.
In the first rollout, Graph Search will only include results for people, pages, apps, places and groups, but brands should expect this to branch out to include posts and comments in the near future.
With large-scale user take-up, Graph Search provides another way for brands to distinguish themselves from their competition. The key will be to get the basics right (don't let admin slip) and continue to build your fan base. After all, it's a brand's fans that will provide its biggest advantage when Graph Search becomes mainstream.
Colin Grieves is director of propositions/strategy,
data and analytics at Experian Marketing Services


As digital living took hold in 2012, social media became a significant part of everyday life for millions across the UK. Indeed, social media makes up 12 per cent of all internet visits and a remarkable 23 per cent of the total time spent online in the UK.
With more than a billion Facebook accounts and half a billion Twitter users, this year will undoubtedly see even more of marketing budgets being spent on reaching social media users. The obsession with blindly chasing fans or followers has thankfully lost credence (though it sadly remains prevalent in some organisations).
The challenge is to monetise and measure social investments by understanding the value of the data, insights and conversions that these social channels create.
I believe 2013 will see the evolution and more mainstream adoption of tools that provide straightforward ways to measure the value to the business of this social sharing. In turn, this will help businesses return to the challenge of creating the unique and provoking social media campaigns that are necessary for word of mouth to flourish.
Adding to the social challenge is Facebook's recently launched Graph Search.


Online
professional
profiles
Name LinkedIn
Info The original and most well-known professional social network. Often mocked, but worth keeping up to date and seeking recommendations. LI can be a great source of work for freelancers.
www.linkedin.com

Make your content more shareable
Whether it's a tweet, a blog post, a video or a status update, every individual and organisation wants their content to result in engagement with their target audience.
On an individual psychological level, feedback from our network validates our contributions; it makes us feel worthwhile and appreciated. As an organisation, or as a professional with something to sell, that engagement also has a more practical value: it's how your brand spreads to potential new customers and how awareness grows of whatever it is you have to offer. So I thought I'd mention five points to consider if you want your content to be shared online.
First, humans relate to content that connects on an emotional level. If the content evokes a strong emotion, it's much more likely that your audience will notice it and choose to share it. Second, we all view life through our own lens. This means that in order for me to connect with you or your product, I have to understand how it relates to me. In a nutshell, you need to make your content about the target audience, not about your product or service.
The third common element in popular online content is storytelling. Of course, not every tweet or update has to have its own narrative, but generally all of your social content should help to generate a bigger story about the organisation, product or service. Tell us about the people behind the logo, the ethos of the business, the quirks and snippets of information that give your company a unique personality.
Another common trait of successful social content is that it makes you want to become a part of the world it inhabits. This may be because it's aspirational, or because it makes you feel better about yourself. This is often the case with great campaigns from charities.
Finally, and more prosaically, research has shown that we talk the most about the things that surround us, such as work, food, family and friends. If you can tap into this with your content in a way that surprises or delights, you're more likely to hit the mark and encourage sharing.

Name Publicate
Info The idea behind this
newer platform is that by
allowing people to share
a wide variety of content
about not just their work
history but also their
interests and passions, it
will provide a better
snapshot of the individual.
http://publicate.it
Name Pinterest
Info Admittedly, Pinterest isn't a professional profile tool per se, but there are some brilliant examples of creatives using Pinterest as a visual CV of work. It's a good way to get yourself noticed.
http://pinterest.com

Katie Moffat is an online PR/social media specialist

How to Keep your blog regularly updated



Expert advice
Name Tom Mason
Job title Head of social media
Company Delineo
Web www.delineo.com
Twitter @totmac
Content may be king, but maintaining a blog takes a bit of effort. Yet blogging doesn't have to take up all of your time, and there are a number of ways you can minimise the legwork. Here are a few techniques to reduce the stress of blogging and provide quality content for your readers.

Put time in the diary
A decent blog post isn't a 10-minute job. Set aside some quality time each month for your blog. Putting a date in the diary is a useful way to ensure that you give yourself enough time to put words onto screen.
Be my guest
Guest posts are a great way to get new material and open your site to a new audience. Contact bloggers and individuals who have something relevant to say on your industry and see if they'd be interested in offering some thoughts.
Find your muse
Get inspiration for your posts by using an RSS reader or curation tool such as Feedly (http://feedly.com). Using the Favourite option on Twitter is also a really useful and quick way to bookmark any content you come across that you may want to come back to as a source of inspiration. Keeping an eye out for commentary on event hashtags is a useful technique to get creative juices and blog ideas flowing.

Any questions
Don't be afraid to
ask for ideas from
your social media
connections. Some of
the most interesting,
relevant posts come
from questions posed
by followers. Address a
specific need or concern
and youre providing
readers with a valuable
resource theyll want to
visit again and again.

BE A BETTER DESIGNER!
Ever wanted to master Adobe Illustrator but didn't know where to start? Well, now you can easily add the illustration tool to your creative arsenal, all thanks to this in-depth guide from the makers of Computer Arts magazine. Pick it up today!
On sale now at WHSmith, Barnes & Noble, www.myfavouritemagazines.co.uk
and for iPad at http://goo.gl/sMcPj


10 things you
may have missed

Subscribe to net magazine today and save!
See page 69 for more details

Hopefully by now you'll be an SEO expert, but just in case you're still hungry for more, here are 10 things in this issue you may have missed
01 DON'T PAY FOR LINK BUILDING
What happens when you pay an SEO firm to link build for you? More often than not, it will spam other websites on your behalf with automated tools. It's a selfish tactic: you're receiving negligible benefits (if any at all) at the expense of honest webmasters' time, as they have to clean it up off their sites.
This and more SEO tips for startups on page 26

03 RESPONSIVELY
RETROFITTING OLDER SITES
Most of us probably agree that the web is never
really done. The real-time nature of the beast is
what makes our medium unique, yet we often
choose File > New over a steady evolution of our
sites. The truth is that we do not always get to
start over.
Ben Callahan on the first steps towards better
small-screen experiences. See page 126

04 FASTER LOADING SITES
One of the first things to look at is the size of your HTML code. This is probably one of the most overlooked areas, perhaps because people assume it's no longer so relevant with modern broadband connections. Some content management systems are fairly liberal with the amount they churn out, one reason why it can be better to handcraft your own sites.
Tom Gullen shows you how to make your sites load ultra-quick. See page 112
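A quick way to put a number on the point above is to compare a page's raw size with its gzip-compressed size (closer to what most servers actually transfer). This is an illustrative sketch, assuming you already have the page's HTML as a string; the helper name is our own.

```python
import gzip

def page_weight(html: str):
    """Return (raw_bytes, gzipped_bytes) for an HTML document."""
    raw = html.encode("utf-8")
    return len(raw), len(gzip.compress(raw))

# Repetitive, machine-generated markup compresses well, but the raw figure
# is still what the browser has to parse once it's decompressed.
bloated = "<div class='wrapper'><span>item</span></div>\n" * 500
raw, zipped = page_weight(bloated)
print(raw, zipped)
```

If the raw figure dwarfs the compressed one, that's a hint your markup is full of repetition a leaner template could remove.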

05 A GOOD CONTENT STRATEGY
"You know, so much of what we do [as an industry] is trying to make the web behave like print. Content strategy is really helping the world understand what makes the web different from print and how we fully take advantage of this new medium. It's very exciting! It's Gutenberg-level stuff."
Check out more of Karen McGrane's content strategy advice in our interview on page 92

02 BOUNCE RATE
A website with a high bounce rate from good quality traffic sources is an indicator that the website isn't performing up to its visitors' expectations.
Reduce your bounce rate with our expert guide beginning on page 46

06 STRAIGHT FROM THE SOURCE
To meet the growing needs of businesses, Google Analytics (www.google.co.uk/analytics) has changed from a web analytics solution to a business measurement platform. From websites to mobile apps, as well as for almost any other internet-connected device, you can use Google Analytics to measure your entire business.
Measure your business's success with Google's own guide to Analytics on page 50

07 FORGET THE CHEAP SUITS
In cutting out the garbage, we start to see what SEO is really good for (and has always been good for): connecting relevant content with relevant searchers, and making content discoverable through accessibility and marketing. For those of you who still think of SEOs as greasy algorithm-chasers in cheap suits or parents' basements, consider the new reality.
Get to the top of Google with our feature on page 38

08 GOOGLE DOESN'T ENDORSE SEO COMPANIES
Put simply, if you're dealing with a firm who make any allusion that they're endorsed or approved by Google for optimisation purposes, it's likely they're a fraud. The reality is that Google does not endorse any SEO company. However, Google does have Analytics and AdWords certification, so providers in these areas can take tests for accreditation.
This and other SEO myths debunked on page 14

09 THE FUTURE?
Looking at the industry a moment, over the last year or so we've seen the future of what SEO could become. Between the emergence of Universal Analytics, which allows for truly complete, campaign-oriented tracking, a much better understanding of outreach and PR, and strong tech knowledge, the leading SEO agencies are starting to produce really compelling work.
Pete Wailes on the future of SEO, page 30

10 OPTIMISE!
"I would like to propose that the SEO industry, and all of us who define it, start living up to our names. We're optimisers," says James Swinarski. "We compete with one another in the search results by making content relevant, accessible and findable. It's not an easy task."
More from James Swinarski on page 140

In 164 packed pages, we show you how to get to the top of Google. From making your site load faster and climbing the rankings to the latest SEO tricks and techniques, this handbook contains everything you need to know to become a master of SEO. You'll also learn all about Analytics from a wide range of industry experts and Google itself, and key marketing techniques to drive your site or business to the top of the world's favourite search engine.

Like this? Then you'll also love

Visit www.myfavouritemagazines.co.uk/design
for more information