

GROWTH OF THE INTERNET BETWEEN 1999 AND 2001 (page 203)

[Figure: The Internet global network as of September 1999. Source: Courtesy of UUNET Technologies.]
BUILDING BLOCKS FOR THE NEXT GENERATION INTERNET (NGI) (page 205)

Collaboration and Communication: video-enabled Web, real-time language translation, Internet telephony, instant messaging, Web browser, electronic mail
Knowledge Management: XML, third-generation search engines, directory services
Seamless Integration: Web-desktop integration, Jini, Java
Pervasive Computing: third-generation cellular, wireless application protocol, Bluetooth wireless data standard
Security, Privacy, Authentication: public key infrastructure, platforms for privacy preference
Reliability and Scalability: IPv6 Internet protocol
Speed: differentiated services, cable modem, digital subscriber line

Table W7.1 The Major Categories of the USENET (page 209)


comp  Computer hardware, software, and protocol discussions (example: comp.protocols.tcp-ip)
news  Groups that deal with USENET software, network administration, and informative documents and announcements (example: news.announce.newsgroups)
rec   Recreational subjects and hobbies, such as aviation, games, music, and cooking (example: rec.arts.books)
sci   Topics in the established sciences, such as space research, logic, mathematics, and physics (example: sci.space.space-station)
soc   Groups for socializing or discussing social issues or world culture (example: soc.women)
talk  Debates and discussions on various current events and issues, such as politics, religion, and the environment (example: talk.politics.mideast)
biz   Business-related groups (example: biz.comp.services)
alt   Alternative discussion groups that are not carried by all USENET sites; some are controversial, others are not; not considered one of the major USENET categories (example: alt.fan.dave_barry or alt.fishing)
misc  Topics that do not fit anywhere else, such as job hunting, investments, real estate, and fitness (example: misc.education)

EXAMPLE (page 213)

The Potential of Medical Services on the Web
Suppose that you have been injured
and you are taken to a health clinic for treatment. Your identification card contains a
code that checks your insurance and gives the clinic permission to view your medical
information on the Internet. This process triggers a program that pages your doctor.
Before the doctor can access your medical history, another Web site is alerted; it verifies that he or she is authorized to see your information. The doctor exam-
ines you, orders X-rays, and dictates notes into a handheld computer. The computer
sends the notes to your online file, the doctor’s office PC, and the clinic’s server, and
then creates a link to your new X-rays in the clinic’s imaging computer system. The
doctor types a prescription for pain medication into the handheld. That process trig-
gers a query to your records to see whether you are taking any other medications. The
record site says you have another prescription. That site checks with the drugmaker to
see whether the two drugs can be taken simultaneously. The doctor’s handheld also
activates a link to your insurance company to see if you are covered for the medica-
tion, and then sends a bill for the appropriate amount. The prescription request trig-
gers an order that is sent to the pharmacy closest to your house. The pharmacist fills
the prescription, and when finished, signals a delivery service to take the medication
to you. ●
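
The scenario above is, at bottom, a chain of event-triggered lookups: each action queries another system before the next step proceeds. Below is a minimal sketch of the prescription step, assuming invented data and function names (no real medical, insurer, or pharmacy API is implied):

```python
# Hypothetical sketch of the prescription-triggered checks described above.
# All services, names, and data here are invented for illustration.

CURRENT_PRESCRIPTIONS = {"patient-123": ["warfarin"]}   # stand-in for the record site
KNOWN_INTERACTIONS = {("warfarin", "ibuprofen")}        # stand-in for drugmaker data
COVERED_DRUGS = {"ibuprofen": 12.50}                    # stand-in for insurer data

def prescribe(patient_id: str, drug: str) -> str:
    # 1. Query the patient's records for other medications.
    for existing in CURRENT_PRESCRIPTIONS.get(patient_id, []):
        # 2. Check with the drugmaker whether the two drugs can be taken together.
        if (existing, drug) in KNOWN_INTERACTIONS or (drug, existing) in KNOWN_INTERACTIONS:
            return f"Rejected: {drug} interacts with {existing}"
    # 3. Check insurance coverage and bill the appropriate amount.
    if drug not in COVERED_DRUGS:
        return f"Rejected: {drug} is not covered"
    # 4. Trigger the order to the pharmacy closest to the patient (stubbed here).
    return f"Ordered {drug}; insurer billed ${COVERED_DRUGS[drug]:.2f}"

print(prescribe("patient-123", "ibuprofen"))   # Rejected: ibuprofen interacts with warfarin
```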

MICROSOFT’S .NET AND HAILSTORM (page 213)

.NET is Microsoft’s platform for XML Web services. XML Web services allow appli-
cations to communicate and share data over the Internet, regardless of operating sys-
tem or programming language. XML Web services can be implemented on any
platform and are defined through public standards organizations such as the W3C.
And with XML Web services, not only can applications share data, but they can also
invoke capabilities from other applications without regard to how other applications
were built. Sharing data through XML allows them to be independent of each other
while simultaneously giving them the ability to loosely link themselves into a collabo-
rating group that performs a particular task.
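
Concretely, an XML Web service exchange is just XML carried over HTTP. The sketch below shows what such a call might look like from Python; the endpoint URL and the GetQuote message format are assumptions made up for illustration, not a real service.

```python
# Sketch of calling a hypothetical XML Web service over HTTP.
# The endpoint and message schema are assumptions, not a real API.
import urllib.request
import xml.etree.ElementTree as ET

request_xml = b"""<?xml version="1.0"?>
<soap:Envelope xmlns:soap="http://schemas.xmlsoap.org/soap/envelope/">
  <soap:Body>
    <GetQuote><Symbol>MSFT</Symbol></GetQuote>
  </soap:Body>
</soap:Envelope>"""

req = urllib.request.Request(
    "http://example.com/quote",                 # hypothetical endpoint
    data=request_xml,
    headers={"Content-Type": "text/xml; charset=utf-8"},
)
with urllib.request.urlopen(req) as resp:
    reply = ET.fromstring(resp.read())          # parse the XML reply
print(reply.tag)
```

Because both sides exchange plain XML, neither needs to know what operating system or language the other was built with, which is the loose coupling described above.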
The Microsoft .NET platform includes a comprehensive family of products, built
on XML and Internet industry standards, that provide for each aspect of developing,
managing, using, and experiencing XML Web services. XML Web services will be-
come part of the Microsoft applications, tools, and servers you already use today—
and will be built into new products to meet all of your business needs. More
specifically, there are five areas where Microsoft is building the .NET platform today,
namely: Tools, Servers, XML Web Services, Clients, and User Experiences.
.NET clients are PCs, laptops, workstations, phones, handheld computers, Tablet
PCs, game consoles, and other devices. What will make these clients part of the
“.NET” system is their ability to access XML Web services, which makes them
“smart.” Smart clients run software that supports XML Web services, enabling you to access your data regardless of the location, type, and number of clients you use.
Hailstorm technology will enable developers to build user-centric services that
offer a new level of personalization and privacy protection for both consumers and
business users. Hailstorm will provide management of basic elements such as calen-
dar, location, and profile information, saving developers from having to create their
own systems for these capabilities. Microsoft hopes that with Hailstorm, instant mes-
saging will expand beyond chat to become the infrastructure for a range of services
such as Web-based e-mail, real-time stock quotations, and calendar functions.
Hailstorm will make it easier to integrate the silos of information that exist today.
Hailstorm services are oriented around people, instead of around a specific device, ap-
plication, service, or network. Hailstorm puts users in control of their own data and
information, protecting personal information and providing a new level of ease of use
and personalization.
For example, with Hailstorm services, booking a flight using an online travel
reservation service becomes much simpler because with the user’s consent, the travel
service automatically accesses the user’s preferences and payment information. If the traveler is
traveling on business, and her company has travel policies she must adhere to, her in-
dividual affiliation with her company’s Hailstorm group identity makes it possible for
the travel service to automatically show her only the choices that meet both her pref-
erences and her company’s requirements. Once she has chosen her flight, the travel
service can use Hailstorm, with her explicit permission, to figure out which calendar-
ing service she uses and automatically schedule the itinerary onto her calendar, auto-
matically updating that itinerary and notifying her if her flight will be late. And
through Hailstorm, she can share that live flight itinerary with whomever she is going
to visit so that they will also know when and where to expect her. The information in
her Hailstorm-enabled calendar can then be accessed through her PC, someone else’s
PC, a smart phone, a PDA, or any other smart connected device.

C# OVERVIEW (page 213)

For the past two decades, C and C++ have been the most widely used languages for
developing commercial and business software. While both languages provide the pro-
grammer with fine-grained control, this flexibility comes at a cost to productivity.
Compared with a language such as Microsoft Visual Basic, equivalent C and C++ ap-
plications often take longer to develop. Due to the complexity and long cycle times as-
sociated with these languages, many C and C++ programmers have been searching
for a language offering better balance between power and productivity.
The ideal solution for C and C++ programmers would be rapid development
combined with the power to access all the functionality of the underlying platform.
They want an environment that is completely in sync with emerging Web standards
and one that provides easy integration with existing applications. Additionally, C and C++ developers would like the ability to code at a low level when and if the need
arises.
The Microsoft solution to this problem is a language called C# (pronounced
C sharp). C# is a modern, object-oriented language that enables programmers to
quickly build a wide range of applications for the new Microsoft .NET platform, which
provides tools and services that fully exploit both computing and communications.
The new Web economy—where competitors are just one click away—is forcing
businesses to respond to competitive threats faster than ever before. Developers are
called upon to shorten cycle times and produce more incremental revisions of a pro-
gram, rather than a single monumental version. C# is designed with these considera-
tions in mind. The language is designed to help developers do more with fewer lines
of code and fewer opportunities for error.

METRICS FOR EVALUATING WEB SITES (page 214)

Your Web site has been live for some time, and it is time to tell your executives how
the site is doing. Without much trouble, you can tell them how many people show up
and how many pages they view. You can show them some pie charts indicating which
pages are the most popular, and what time of day most users scrutinize your site.
Now is the time to examine how well the Web site is serving your company. The
difficulty is in performing that analysis. There is no standard practice for this “online
analysis.”
The goal is to measure the effectiveness of moving people through the customer
life cycle: from suspect, to prospect, to novice customer, to seasoned customer, to
company advocate. To measure this progression, think in terms of these concepts:
• Reach: The percentage of the possible audience you are able to touch with your
attention-getting promotion. What is the cost per impression and, more important,
the cost per lasting impression?
• Acquisition: That point when you have engaged a prospective customer in a dia-
logue as opposed to merely getting his or her attention for the moment. How much
do you have to spend to get a qualified prospect on your site?
• Conversion: Refers to the moment customers buy, but it can also refer to signing
up for a seminar, joining a discussion, subscribing to a newsletter, etc. Cost per con-
version should top a list of important numbers to track.

• Abandonment: Customers were part way through the buying process but went
away without hitting the “submit” button. Why did this happen? Is it your price?
Your navigation? Your sluggish server?
• Retention: You want customers to buy from you again and again. If it costs a lot of
money to attract customers, why aren’t you tracking what makes current customers
leave, and focusing on ways to keep them coming back?
• Stickiness: The length of time that prospects and customers spend on your Web
site.

Formulas for Measuring Web Site Effectiveness


Here are a few formulas for measuring the above concepts:
Acquisition cost = Advertising and other promotional costs ÷ Number of clickthroughs
As an example, say you spend $30,000 on an ad campaign buying 1 million impressions. The campaign achieves a 0.5 percent clickthrough rate, resulting in 5,000 visits. So, your acquisition cost is $6 per person ($30,000 ÷ 5,000 = $6.00).
Conversion cost = Advertising and other promotional costs ÷ Number of sales
Using the same example, if 5 percent of your 5,000 visitors buy something, you have made 250 sales, at a conversion cost of $120 per sale ($30,000 ÷ 250 = $120).
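
Both cost formulas are simple ratios, so they are easy to script. Here is a quick sketch (the function names are ours) that reproduces the worked numbers above:

```python
# Sketch of the acquisition-cost and conversion-cost formulas above.
def acquisition_cost(promo_costs: float, clickthroughs: int) -> float:
    return promo_costs / clickthroughs

def conversion_cost(promo_costs: float, sales: int) -> float:
    return promo_costs / sales

print(acquisition_cost(30_000, 5_000))   # 6.0   -> $6 per visitor acquired
print(conversion_cost(30_000, 250))      # 120.0 -> $120 per sale
```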
Freshness factor = Average content area refresh rate ÷ Average session visit frequency

A freshness factor of less than 1.0 means customers see stale content. They visit
more often than you change your content. For example, if a company refreshes the
content of its Web site 15 times per month (average content area refresh rate) and if a
customer visits that company’s Web site 30 times per month (average session visit
frequency), then the freshness factor is 0.5, indicating that the company’s customers are
seeing stale content.
A freshness factor greater than 1.5 means your content changes more often than
visitors show up to view it. That means you are wasting resources. If a company re-
freshes the content of its Web site 30 times per month, and a customer visits that com-
pany’s Web site 15 times per month, then the freshness factor is 2.0, indicating that the
company is wasting resources.
Freshness can be calculated across your entire site and user base or it can be fo-
cused on a specific customer segment or content area.
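
A sketch of the same calculation (the function name is ours), reproducing both worked examples:

```python
# Sketch of the freshness-factor formula with the two examples above.
def freshness_factor(refreshes_per_month: float, visits_per_month: float) -> float:
    return refreshes_per_month / visits_per_month

print(freshness_factor(15, 30))   # 0.5 -> customers see stale content
print(freshness_factor(30, 15))   # 2.0 -> content changes faster than it is viewed
```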
First-purchase momentum = Required clicks to first purchase ÷ Actual clicks to first purchase
You can count the exact number of clicks required to make a purchase, starting
from your homepage. But visitors do not know your site as well as you do, and are likely
to make mistakes. Counting up the actual number of clicks-to-purchase, and comparing
it to the minimum requirement to buy something, gives you valuable information about
the clarity of your navigation. Raising first-purchase momentum toward 1.0, by trimming the number of actual clicks, will increase sales. Every click added to the purchasing process increases abandonment.
For example, if a company’s Web site requires 5 clicks to make a purchase and a customer takes 8 clicks to complete the purchase (a momentum of 5 ÷ 8 = 0.625), then that customer may become frustrated and not make as many purchases. In this example, the company should assess the difficulty customers have navigating its Web site.
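
The same ratio in code (the function name is ours):

```python
# Sketch of the first-purchase-momentum formula from the example above.
def first_purchase_momentum(required_clicks: int, actual_clicks: int) -> float:
    return required_clicks / actual_clicks

print(first_purchase_momentum(5, 8))   # 0.625 -> navigation is costing clicks
print(first_purchase_momentum(5, 5))   # 1.0   -> visitors take the shortest path
```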

Why It Is Difficult to Get “True” Site Statistics


Despite sophisticated and expensive traffic-analysis software, it is nearly impossible to know even basic site traffic statistics with certainty. First, there are many technical problems that
can either create phantom pageviews and users, or obscure real ones. Second, the
most widely cited third-party sources for traffic and user numbers—such as Media
Metrix and Nielsen/NetRatings—get their numbers the way that TV ratings are gen-
erated, by tracking panels of Web users. The technical problems related to traffic
analysis can cause many discrepancies between what your company thinks and what
the third-party companies show about your Web site. These discrepancies can range
from 20 to 40 percent.
For executives, these numbers cause other problems. Without precise data, it is
difficult not only to measure overall progress but also to develop a picture of what vis-
itors do on a site—which features they like and how long they use those features. On-
line advertisers demand figures that can be independently verified. So, why is it so
difficult to get the numbers straight?
We can begin with the Web server. Every request for information from a server is recorded in a log file, and that is where the trouble begins. For in-
stance, log files record hits, which amount to calls for discrete pieces of data—such as
a logo or chunk of text—stored somewhere in the system. It usually takes many hits
(or certain kinds of hits) to make up a pageview. So how many hits equal a pageview?
It depends on the page. Most sites just estimate how many hits amount to a page and
use that number to calculate traffic. Obviously, with sites estimating these numbers,
there is no consistency of data among sites, and evaluating sites based on these incon-
sistent data is impossible.
Then there are external factors that further distort numbers. Foremost are spi-
ders—also known as crawlers or bots. Spiders are automated software programs that
follow hypertext links from Web page to Web page to find documents and index their
contents. Their visits generate what look like hits to a server log, and often end up in-
flating traffic numbers reported by Web site managers. Spiders’ visits are easy to iden-
tify in the raw server logs, but it is time-consuming to comb through huge amounts of
data. Top-end site-metrics software, from firms such as Accrue, NetAcumen, and SAS
Institute, usually does a good job of filtering out the spiders’ hits, but even those help-
ful programs require lots of careful tuning.
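
As a rough illustration of that filtering step, once a spider’s IP address has been identified it can simply be excluded from future counts. Below is a sketch over a common-log-format file; the file name and the spider list are assumptions:

```python
# Rough sketch: drop hits from known spider IPs in a common-log-format file.
# The log file name and the spider IP list are assumptions for illustration.
KNOWN_SPIDER_IPS = {"192.0.2.10", "192.0.2.11"}

def human_hits(log_path: str):
    with open(log_path) as log:
        for line in log:
            client_ip = line.split(" ", 1)[0]   # first field of a common-log line
            if client_ip not in KNOWN_SPIDER_IPS:
                yield line

hit_count = sum(1 for _ in human_hits("access.log"))
print(hit_count)
```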
Another confounding influence is Web cache technology, which stores frequently
requested Web pages so that users can get to them more quickly. America Online, for
example, caches millions of commonly requested Web pages on its servers. When an
AOL visitor types in a URL for a cached page, the visit typically does not register on
the original site’s server logs.
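
One complementary countermeasure, beyond the dynamic-page approach discussed later in this section, is for the server to mark its responses as uncacheable via the HTTP Cache-Control header. A purely illustrative sketch using Python’s standard HTTP server:

```python
# Sketch: serve a page with a header asking caches not to store a copy.
from http.server import BaseHTTPRequestHandler, HTTPServer

class NoCacheHandler(BaseHTTPRequestHandler):
    def do_GET(self):
        self.send_response(200)
        self.send_header("Content-Type", "text/html")
        self.send_header("Cache-Control", "no-store")  # ask caches not to keep a copy
        self.end_headers()
        self.wfile.write(b"<html><body>Always served fresh</body></html>")

# Uncomment to run locally on port 8000:
# HTTPServer(("", 8000), NoCacheHandler).serve_forever()
```

Whether a given intermediary honors such headers is up to the cache, which is why dynamically generated pages are the more reliable tactic.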
When all else fails, one crude way of checking your site traffic is to compare
pageview statistics with the number of ads served (assuming you serve ads). For ex-
ample, if you serve one banner per page, the number of pageviews should, in theory,
equal the number of banners served. Of course, this is rarely the case; nevertheless, it
is an important variable to watch because big discrepancies suggest something is
wrong.
There are independent auditors who will study your site logs and draw their own
conclusions about your numbers. For fees starting at $1,000 per month, firms like
ABC Interactive, BPA International, and Engage I/Pro will review a site’s traffic
numbers to ensure that they are reported honestly and to make sure the methodology
behind tabulating the figures meets industry standards. One expert says that most
sites will find their unaudited results differing by 5 to 10 percent from their audited
figures.

Beyond calculating your pageviews, there is another metrics problem: counting unique visitors to your site. Web servers record hits from IP addresses, which com-
puters use to identify each other on the Internet. Unfortunately, an IP address often
does not correspond to a single computer, let alone a single individual. Corporate
users usually go through a companywide server called a proxy server, which masks
their IP addresses for security reasons. Dial-up users are assigned a different IP ad-
dress at random by their ISP every time they log on. And, of course, there is no way
to account for multiple users of the same computer.
A partial solution is the cookie, a small file that most commercial sites install in
the browser of visiting computers. Cookies help servers record visits from a particular
browser used on a particular computer. But, that approach is not perfect. Machines in
places like libraries or computer labs may be shared by many people each day, which
results in undercounting. Alternatively, a single user might visit the same site from
home and from work, which produces overcounting. To make matters worse, an un-
known percentage of Web surfers set their browsers to reject cookies, which means
these visitors are not counted at all.
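
For reference, the cookie mechanism itself is just a pair of HTTP headers. A small sketch with Python’s standard library, subject to all the limitations just described (the visitor-ID scheme is our invention):

```python
# Sketch: issue a visitor-ID cookie and parse it on a later request.
from http.cookies import SimpleCookie
import uuid

# Server side: issue a cookie identifying this browser.
outgoing = SimpleCookie()
outgoing["visitor_id"] = str(uuid.uuid4())
print(outgoing.output())                 # e.g. Set-Cookie: visitor_id=...

# On a later request, parse the Cookie header the browser sends back.
incoming = SimpleCookie("visitor_id=abc123")
print(incoming["visitor_id"].value)      # abc123
```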

How to Get the Best Site Numbers


• Eliminate spiders. To distinguish spiders’ hits from those created by real users,
look for unusual activity on your logs. Spiders do things no normal person would,
like visit every single page on your site in an hour. Once you think you have found
a spider, comb through Web logs to locate its IP address, then direct your analytical
software to ignore future hits originating from that address.
• Watch out for masked IP addresses. Not every address represents an individual
user. Corporations and dial-up ISPs (notably AOL) can show your server a single
IP address for many actual users. Look for high traffic from a single address. It may
indicate that you have more users than your data suggests.
• Avoid cookies. Do not expect accurate visitor counts from cookies. An unknown
number of Web users set their browsers not to accept cookies. Cookies also cannot
distinguish between multiple people using the same computer.
• Defeat caches. The most common way to defeat caches is to generate as many
pages as possible on the fly, using scripts to assemble them from a database. Dy-
namic pages are extremely difficult to cache.
• Know your audience. Since Media Metrix and Nielsen/NetRatings track users only
in homes and at work, ask your IT department to filter out users coming from li-
braries and schools before comparing trends in your site’s traffic with Media
Metrix’s figures.

Table W7.2 URLs of Popular Metasearch Engines (page 217)


MetaCrawler   metacrawler.com
Dogpile       dogpile.com
Ixquick       ixquick.com
All 4 one     all4one.com
Vivisimo      vivisimo.com
Search.com    search.com
ProFusion     profusion.com
