
MIST 7500 Midterm
Chris Thorn
10/15/2009

Web Analytics

Web Analytics are numbers, stats and metrics that measure the activity on a website.
They are a statistical rabbit hole for web administrators and site owners to fall down
when analyzing the performance of their site. The available metrics range from the
obvious, headline-grabbing Number of Daily Visitors to the esoteric, subjective
Conversion Rate. Understanding the meaning, context and level of accuracy of web
analytics is important.
Depending on their needs, users can choose from a number of web analytics
services and programs. Corporations can purchase or build enterprise-level software for
large-scale platforms. Smaller users can choose from various free analytics programs to
monitor their websites. The key is finding an analytics program that captures the
important information and displays it in a way that lets the user effectively analyze and
process the data.
To find the right program, users must understand how the metrics are defined and
how they function. Key metrics include Unique Visitors, Repeat Visits and Session
Duration, among others. A Unique Visitor, as defined by Wikipedia, is a uniquely
identified visitor to the site, whether identified by IP address, visitor log-in or cookie,
during a certain timeframe set by the analytics software or administrator. This metric is
often confused with the layman's notion of hits, which are the number of requests for
files on a page. A single page view can generate many hits depending on the number of
files, such as images, text, ads and other assets, hosted on that page. The number is
incredibly misleading but still haunts the industry today. For example, the sales team at
my office is constantly asked how many hits our network of sites receives on a daily
basis. The metric is meaningless. Unique Visitors, on the other hand, is more accurate
but still has flaws.
Unique Visitors works better because it counts each visitor only once during a set
time period. If a user visits a site in the morning and returns in the evening, depending on
the set time constraints, they are likely counted as one unique visitor for the day. This
metric is built on a few assumptions. First, it assumes one user per measurement method:
the analytics program assumes the same visitor used the same computer (if identified by a
first-party cookie) or came from the same IP address (if using web log analysis) that day.
But if a home computer is used by multiple family members to visit CNN.com during a
single day, the analytics software will count only one Unique Visitor despite the different
users at the computer. IP address analysis can also be inaccurate because some corporate
users share one external IP address or use dynamic IP assignment, so multiple users might
be hitting the site from one IP address.
Another metric administrators focus on is New vs. Returning Visitors. At first
glance, this appears to be an amazingly powerful metric. In theory, as the analytics
software tracks site activity, it differentiates Unique Visitors who are new to the site from
those returning to view the content. When I first started tracking visitors to my web
comic, I took this metric at face value in just that way. As my understanding of site
analytics grew more nuanced, I realized that New/Returning is also flawed. When users
clear their cookies or visit from different IP addresses, they are classified as new.
Considering how many users clear their cookies after each online session, this can be an
inaccurate measurement tool.
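To illustrate why both metrics lean so heavily on that cookie, here is a simplified sketch
of how a first-party cookie separates new from returning visitors. This is my own
illustration for this paper, not the code either service actually runs: clear the cookie or
switch machines and the count simply starts over.

<script type="text/javascript">
// Simplified illustration only: real analytics packages do far more than this.
function classifyVisitor() {
  if (document.cookie.indexOf("visitor_id=") == -1) {
    // No identifier found, so this browser is counted as a new visitor.
    var expires = new Date();
    expires.setFullYear(expires.getFullYear() + 1);
    document.cookie = "visitor_id=" + new Date().getTime() +
                      "; path=/; expires=" + expires.toUTCString();
    return "new";
  }
  // An identifier is present, so this browser is counted as returning.
  return "returning";
}
</script>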
Obviously web analytics are not 100 percent accurate. Some metrics are better
gauges than others. Daily visitor rates or other stats can be frustrating when viewed as
just numbers because they can’t be trusted as absolutes. But where I’ve found web
analytics to be incredibly useful is in understanding visitor trends and site effectiveness
by putting the metrics in context.
My MIT blog has generated no traffic since I installed the Google Analytics and
Sitemeter software on the blog (in full disclosure, the site has counted six visits, but they
are all from me). But I have used both programs to track and develop my web comic,
http://www.registered-weapon.com, and I will discuss one method used to derive meaning
from the captured analytics.
Both analytics programs are installed by inserting a piece of JavaScript code into
the HTML of the targeted site's homepage. Simply create an account with the tracking
service, copy the code from the service's site and paste it into the HTML of your site.
Once enabled, the service starts tracking immediately, but it may take some time to
display the results.
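As a rough illustration, the Google Analytics tracking code of that era looked something
like the snippet below, pasted just before the closing </body> tag. The exact code must be
copied from your own account, and the UA-XXXXXXX-X profile ID here is only a
placeholder.

<script type="text/javascript">
// Load the ga.js tracking library, over HTTPS when the page itself is secure.
var gaJsHost = (("https:" == document.location.protocol) ? "https://ssl." : "http://www.");
document.write(unescape("%3Cscript src='" + gaJsHost + "google-analytics.com/ga.js' type='text/javascript'%3E%3C/script%3E"));
</script>
<script type="text/javascript">
try {
  // Record a pageview against the account; replace the placeholder with your own profile ID.
  var pageTracker = _gat._getTracker("UA-XXXXXXX-X");
  pageTracker._trackPageview();
} catch(err) {}
</script>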
First, Google Analytics is the more sophisticated of the two programs I've
mentioned.

[Screenshot: Google Analytics account overview]
Above is the analytics overview page. With a quick glance, I can see the general
stats for all of my sites during the last 30 days. For deeper analytics, I can click on the
View Report link.
Now I'm viewing the dashboard for Registered Weapon. The metrics summarized
on the overview page are now graphically displayed for easy consumption. I can drag my
mouse across the line graph to view the number of Unique Visitors for any day along the
30-day timeline, which is an efficient way to catch up on site traffic. For more detailed
analysis, I can choose from a menu how I'd like to view the analytics. The options
include Visitors, Traffic Sources, Content and Goals. I will not explain every feature or
view in detail, but I will provide a good example of using the analytics to determine site
or campaign success.
On October 12, we ran an ad campaign on various blogs and comics sites. The
sites were chosen for past performance (determined by the method I'm about to describe)
or for their traffic potential based on similar content or audience.

[Screenshot: Registered Weapon traffic sources, October 12]
Looking at the stats for that day in the image above, the dominant traffic driver
was still direct traffic: bookmarks, Google Reader or visitors typing the site address into
the browser. The other traffic sources listed are either sites that linked to us (for reviews,
recommendations or some other reason) or sites where we placed the advertising. Those
sites included Overcompensating.com (46 visits), BasicInstructions.net (21 visits),
Wigu.com (13 visits) and needforbushiod.com (9 visits). Assuming all advertising costs
were equal, Overcompensating.com is the clear traffic driver and best advertising
investment, right? Not necessarily. The visit numbers need to be put in context with the
other analytics listed, which are either averages or percentages. See the table below.

[Table: visits, pages per visit, average time on site, percent new visits and bounce rate by
referring site]
The pages-per-visit column shows Wigu.com delivered the most page views per
visitor on average. For a web comic, this is extremely important because it means
someone is reading the content in large chunks. The Average Time on Site column
shows that readers from BasicInstructions.net spent a lot of time on the site, which
suggests we hit a sweet spot with that audience and skewed the average higher. The
Percentage of New Visits is above 75 percent for all sites, which I consider a success. A
second inference, since only one site produced 100 percent new visitors, is that some
regular readers of those sites had visited our site before and enjoyed the content but
needed a prompt to remember it. Maybe we convinced them to bookmark our site this
time. Maybe that will happen next time. Finally, the Bounce Rate is possibly the most
important metric when weighed against the total number of visitors each individual site
generated. While Overcompensating.com delivered the highest number of visitors, its
Bounce Rate is 50 percent. Wigu.com, on the other hand, delivered only 13 visitors, but
its Bounce Rate is 7 percent. To me, a much higher percentage of Wigu.com users are the
right demographic for our content. As long as Wigu ad rates are reasonable, we stand a
high chance of converting their users into our readers at maximum value to us.
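To make that comparison concrete, here is a rough sketch, using only the October 12
figures quoted above, of how I weigh each referrer by the share of its visits that stuck
around past the landing page:

<script type="text/javascript">
// Visits and bounce rates for two of the October 12 referrers (figures quoted above).
var referrers = [
  { site: "Overcompensating.com", visits: 46, bounceRate: 0.50 },
  { site: "Wigu.com", visits: 13, bounceRate: 0.07 }
];
for (var i = 0; i < referrers.length; i++) {
  var r = referrers[i];
  // Visitors who did not bounce, i.e. viewed more than the landing page.
  var engaged = Math.round(r.visits * (1 - r.bounceRate));
  var engagedShare = Math.round((1 - r.bounceRate) * 100);
  document.write(r.site + ": " + engaged + " engaged visits (" + engagedShare +
                 "% of the traffic it sent)<br>");
}
// Output: Overcompensating.com: 23 engaged visits (50% of the traffic it sent)
//         Wigu.com: 12 engaged visits (93% of the traffic it sent)
</script>

Overcompensating.com still produces more engaged visits in absolute terms, but nearly
every visit Wigu.com sends turns into a real reading session, and that ratio is what matters
when deciding where the next advertising dollar goes.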
Sitemeter is more limited in functionality than Google Analytics, but it offers a
neat advantage. When content is updated on Registered Weapon, an admin will tweet
about it with a link, post the link on the Facebook page and send an email blast to
newsletter subscribers. With Sitemeter, we can track who is on the site at any given
moment, which helps determine the effectiveness of each of those communication tools.
This method, while hardly scientific, is similar in spirit to the way news organizations
monitor web traffic in real time to decide how to position headlines and stories for
maximum exposure and traffic generation.
In general, web analytics are powerful tools for measuring your site but they must
be understood and properly interpreted for companies to use the data in the right way.

Web Site Attractiveness

When discussing web site attractiveness, there are two separate points of taste to
take into account: the human eye and the search engine. A beautiful, intricate site design
that flows content in an organized way for the reader means nothing to the text-hungry
search engine that hunts on the strength of keywords. The trick is building a site that
satisfies both needs.
Depending on the site's content or purpose, either the visual design or the coding
side might be more important. For most sites, driving traffic is the initial problem, so
search engine optimization (SEO) and search engine marketing (SEM) will be the first
topic in this paper. SEO differs from SEM in a few fundamental ways.
Most prominently, SEO is considered the organic way to improve search results
in various engines, while SEM is a paid model for site owners to appear alongside search
results, according to Wikipedia. Search engines have taken an active role in promoting
SEO techniques that help web developers and content creators increase their search
relevancy in an ethical, content-driven way. For example, Google provides a webmaster
central site for developers to reference when building sites. They can follow useful SEO
methods, learn how to implement a Google sitemap and submit their site for indexing in
the search engine. Yahoo! Site Explorer provides similar guidance.
SEM is monetarily driven. Search engines are often paid by advertisers on a
pay-per-click (PPC) basis, according to Wikipedia. The links, bought or bid on by
keyword, appear as sponsored ads surrounding the search results, or the advertiser's
pages are indexed in the database based on keywords with no promises made about
rankings or placement. Although not as organic as SEO, SEM offers an advantage to
advertisers: control (Wise, Pasternack, 2005). By using SEM, sites can define the landing
page for visitors, which provides control over the first contact between site and
prospective customer. In contrast, SEO landing pages are dictated by the generated
search results.
SEO methods are often labeled either White Hat, which follows the straight and
ethical path of optimization, or Black Hat, which artificially manipulates search results
and can get offending sites banned. Jill Whalen, an expert in SEO, sees the differing
methods in an alternate way (Whalen, 2004). She suggests viewing the techniques as
either sustainable or temporary. Companies and developers all want the same thing:
traffic. But if companies aren't willing to put in, or pay for, the effort to design a
sustainable site over time, they may resort to questionable methods for a quick buck if
that fits their business model.
If visitors successfully navigate the search engine waters to a site, the second
definition of web site attractiveness becomes important. Aesthetic appeal can make or
break a site for a visitor in just a few seconds. Design is subject to individual tastes, but
there are a few agreed-upon do's and don'ts. The first is clarity, or effective
communication (Snell, 2009). Snell warns designers against letting visual flair trump the
clear delivery of information. He also offers seven areas of focus for site attractiveness:
text, images, titles and headers, icons, design styles, colors, and audio/visual content.
Each of these topics can be discussed individually at length without reaching a
consensus on the right approach. The most successful strategy is likely to design your site
within the context of your business as it relates to those seven core areas. If you operate a
design studio, your site's appearance should be more visually engaging and intricate than
a news portal that is built around efficiently communicating the headlines.
An interesting survey conducted by Smashing Magazine (Friedman, 2008) uses
50 popular blogs to analyze 30 blog design problems or questions. It breaks the blogs
down into percentages according to how they approached each problem. The survey is a
great source of information for blog designers.

Sasquatch Planet

Admittedly, Sasquatch Planet is still in the starting blocks with respect to SEO and site
design. I've installed the analytics services on the site, but the traffic has yet to surface.
Configuring the SEO material will be difficult, as I've yet to define a purpose for the blog
outside of the MIT assignments. A good approach might be to focus on keywords that
emphasize my nascent stage in IT and my participation in the UGA MIT program.
Prospective readers include prospective students in the program. When I was searching
for information on the program, my research led to the Terry MIT pages. A current
student's perspective may help others decide if this is the right approach for them.
Integrating the SEO keywords should be done in an organic manner. Due to
spamming and abuse, meta tags have lost their prominence as search result generators
(Wise, 2008). The search engines use complex algorithms to rank content. Current SEO
methods include emphasizing keywords or phrases frequently in your content. For
Sasquatch Planet, I should write more headlines that feature words like "UGA", "MIT"
and "Program", and I should repeat my core words in the content.
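As a hypothetical sketch of what that might look like in the blog's markup (the title,
heading and text below are placeholders I made up for this paper, not live pages):

<title>Sasquatch Planet | Notes from a UGA MIT Program Student</title>
<h1>Starting Out in IT: Another Week in the UGA MIT Program</h1>
<p>This week in the MIT program at UGA we covered web analytics and SEO ...</p>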
Since I'm using Blogger as my host, I'm at a disadvantage in regard to SEO
compared to other services that offer theme-related categories that can be tagged by
search engines (Hurlbert, 2005). While that strategy is out, I can play tag with every site I
know. The more links I create to other sites, and the more other sites link to mine, the
more relevancy and activity I create for search engines to home in on. Also, incoming and
outgoing links are often laden with keywords relating to a topic.
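For example, an outgoing link carries more weight for a search engine when its anchor
text names the topic instead of saying "click here". The URL below is a placeholder, not
a real page:

<!-- Keyword-laden anchor text; the href is a placeholder. -->
<a href="http://www.example.com/uga-mit-program">an overview of the UGA MIT program</a>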
For the visual side, I plan on introducing a new header that clearly explains the
blog to both reader and search engine alike. The template, font and color scheme will be
customized with more thought than when I initially launched the site. During the coming
weeks, I’m interested to see if these changes drive more traffic to the blog than me,
myself and I.
Bibliography

"Web Analytics." Wikipedia, The Free Encyclopedia. 10 October 2009, 11:34.


Wikimedia Foundation, Inc. 12 Oct. 2009. <
http://en.wikipedia.org/wiki/Web_analytics>.

Wise, Bill, and Pasternack, Dave. "SEO isn’t SEM”. DMNews., 5 Dec. 2005. Web. 11
Oct 2009. < http://www.dmnews.com/SEO-Isnt-SEM/article/89604/>

Whalen, Jill. "Black Hat/White Hat Search Engine Optimization." Search Engine Guide,
16 Nov. 2004. Web. 11 Oct. 2009.
<http://www.searchengineguide.com/jill-whalen/black-hatwhite-hat-search-engine-optimization.php>.

Snell, Steven. "Clear and Effective Communication in Web Design." Smashing Magazine,
3 Feb. 2009. Web. 12 Oct. 2009.
<http://www.smashingmagazine.com/2009/02/03/clear-and-effective-communication-in-web-design/>.

Friedman, Vitaly. "A Small Design Study of Big Blogs." Smashing Magazine, 24 July
2008. Web. 12 Oct. 2009.
<http://www.smashingmagazine.com/2008/07/24/a-small-study-of-big-blogs/>.

Hurlbert, Wayne. "SEO Tips for Blogs Hosted on Blogger." WebProNews, 11 Sept. 2005.
Web. 11 Oct. 2009.
<http://www.webpronews.com/topnews/2005/09/11/seo-tips-for-blogs-hosted-on-blogger>.
