
Interop 2011 Strategy Guide:

Cloud Computing
What's Inside: Key insights, strategies and best practices for cloud computing, including:
Data storage considerations
Tips to negotiate cloud contracts
Statistics on cloud adoption
Security services put to the test
And much more.

2011

Introduction
By Lenny Heymann, Executive Vice President, Interop
Interop has compiled a strategy guide delivering key insights, important strategies and best practices
for cloud computing.
The days of cloud computing services being used only for seasonal capacity are gone. Organizations are
incorporating cloud services into their IT practices on a permanent basis. The capital and operational cost savings are
compelling. Ensuring adequate application performance and having redress defined before you commit to a
service will set expectations and spell out everyone's responsibility.
Successful cloud computing programs are integrated with your IT initiatives and your business, and as with any outsourcing arrangement,
you need to determine if and when you can safely put sensitive information into the cloud without exposing yourself to more risk.
Partitioning strategies can help, but they require forethought.
Fortunately, standards such as CloudAudit are being developed and adopted that define how cloud providers can describe their
security practices to customers.
Cloud computing is an evolving space, and successfully incorporating cloud services into your organization means addressing many needs. A
thoughtful approach to selecting a cloud service and to designing applications that take advantage of cloud services is key.

Table of Contents
APM Vs. The Cloud
Managing Export-Controlled Data In The Cloud
Supply Chain In The Cloud: Enabled Just In Time
Negotiating Cloud Computing Contracts
CloudAudit Gets Real
The Cloud's Proliferating Open Source APIs
Wholesaler Builds A Private Cloud With Virtual I/O
Web Security Services Put to the Test
Top 20 Government Cloud Service Providers


APM Vs. The Cloud


By Dave Stodder
IT must rethink its approach to managing the performance of cloud applications. It's
not just security that's keeping CIOs from moving critical business applications into the
cloud. When any downtime can negatively affect the bottom line, IT organizations want to
make sure they can deploy application performance management (APM) tools to monitor
availability. But that's not yet easy, or sometimes even possible, when applications are
housed off-site. If a cloud provider doesn't offer sufficient metrics or allow customers to
deploy APM instrumentation on its infrastructure, how can you know when performance is
suffering, and why?


You can't. So we were surprised when a recent InformationWeek Analytics APM survey revealed scant use of monitoring when it
comes to software as a service or apps on public cloud services. Just 28% of respondents use APM tools to monitor most of the
cloud applications that they use, while 70% monitor only a few or none at all.
The problem is, the application architecture transformation brought about by the adoption of cloud services requires an equally
transformational approach to performance monitoring. One example: Some APM vendors are embracing the cloud as part of the
solution via the concept of monitoring and management as a service, or MaaS. MaaS platforms from companies like AppDynamics,
BlueStripe, and Coradiant can automate tasks typically involved in setting up APM software, including agent installation and
component relationship mapping. The monitoring service can be used for both on-premises applications and those in a public cloud
environment.
In other cases, when it comes to SaaS and infrastructure-as-a-service platforms, IT teams are using synthetic transaction tools that
simulate real application traffic and data payloads; they help test the user experience and discover bottlenecks and other problems
that affect speed, transaction completeness, and
availability. Whether you adopt APM as a service or adapt your in-house techniques, metrics
will need to change to suit evolving application and business service priorities with regard to
transaction processing, Web page load times, and so on. For example, companies may need to
tweak data collection and analysis steps to give managers a more coherent and up-to-date picture
of application performance. Rather than being concerned about each component individually, the
collection perspective must shift to the overall user experience: in most cases, the transaction view.
Going to the cloud won't save money if you lose business because of poor performance, so make
the APM transition a priority, not an afterthought. Decide whether your current toolsets can extend
visibility over both on-premises and cloud or virtualized applications, so that you can have confidence and certainty about performance.

Do You Use APM Tools to Monitor SaaS or Public Cloud Apps?
Yes, most of them: 28%
Yes, to monitor a few of them: 32%
No: 38%
We don't use SaaS or public cloud applications: 2%
Data: InformationWeek Analytics Application Performance Management Survey of 100 business technology professionals using APM, August 2010
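The synthetic transaction approach mentioned above can be approximated with very little tooling. Below is a minimal sketch, assuming a hypothetical transaction endpoint, expected response marker and response-time budget (none of which come from the article), of the kind of scripted probe such tools automate; a real APM or MaaS product layers scheduling, alerting and component correlation on top of measurements like these.

# Minimal sketch: a synthetic transaction probe of the sort described above.
# The URL, check string and threshold are hypothetical placeholders.
import time
from urllib.request import urlopen

APP_URL = "http://app.example.com/checkout/health"   # hypothetical transaction endpoint
EXPECTED_TEXT = b"order-confirmation"                 # marks a complete transaction
MAX_SECONDS = 2.0                                     # hypothetical response-time budget

start = time.perf_counter()
try:
    with urlopen(APP_URL, timeout=10) as resp:
        body = resp.read()
    elapsed = time.perf_counter() - start
    complete = EXPECTED_TEXT in body
    ok = complete and elapsed <= MAX_SECONDS
    print(f"status={resp.status} elapsed={elapsed:.2f}s complete={complete} ok={ok}")
except OSError as err:
    print(f"transaction failed after {time.perf_counter() - start:.2f}s: {err}")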
David Stodder is chief analyst at Perceptive Information Strategies. Write to us at iwletters@techweb.com.


Managing Export-Controlled Data In The Cloud
As IT pros evaluate cloud computing services, they must be aware of federal
regulations that restrict where certain data gets stored, or potentially face
serious penalties.
By Marsha McIntyre, InformationWeek
Companies evaluating cloud computing must consider the regulatory compliance
implications of this new approach to computing. One area of concern is whether any of
your company's data is controlled under U.S. export control rules, including whether use of
cloud services could lead to the disclosure of controlled technical data without the required
export authorization.


It is important to consider export control implications of IT decisions early in the process
because U.S. export control rules have a strict liability standard, meaning that a violation
occurs whether the unauthorized disclosure was accidental, negligent, or intentional.
Individuals, as well as companies, may be held responsible for export violations. The penalties for non-compliance are severe,
ranging from $250,000 to $1,000,000 per violation. Individuals could face up to 20 years imprisonment.

The most popular cloud computing option is public cloud computing. A common example is Web-based e-mail like Google's Gmail.
In the public cloud scenario, the customer generally has no control or knowledge over the exact location of the provided resources.
Usually the customer is presented with a standard service level agreement with limited or no ability to tailor the terms of use.
Without the ability to tailor the service parameters to a company's business, it is likely that public cloud solutions will not meet export
compliance standards, if such needs exist.
Recently, some cloud service providers have been marketing their services as export control compliant. Knowing the basic U.S.
export control rules governing technical data should help companies decide whether cloud computing services being offered to them
meet their export compliance needs for all their systems and applications.
IT departments must determine whether export-controlled data may be contained on their systems and work with their legal
department to formulate a plan for handling such data inside or outside of the cloud. For the purposes of this discussion, controlled
technical data is data controlled under the International Traffic in Arms Regulations (ITAR) or the Export Administration Regulations
(EAR). Typically, this information is in the form of blueprints, drawings, models, formulae, specifications, photographs, plans,
instructions, or documentation regarding an export-controlled item or service.
U.S. companies are prohibited from exporting controlled technical data to certain foreign countries without an export license. For
example, sending an e-mail with export-controlled technical data to a customer in India would be an export of the data to India and
could require export authorization.
The rules also restrict the release of export-controlled technical data to certain foreign nationals, inside or outside the U.S., without an
export authorization. (To do so would be considered an export to that person's country of citizenship.) Companies are often surprised
by this rule. For example, if an American engineer in the U.S. walks blueprints for the manufacture of an export-controlled item down
the hall to his colleague who happens to be an Indian citizen, or e-mails them to him, this would be considered an export to India and
could require export authorization.
Companies in the defense industry should also be aware that, under ITAR, merely giving foreign nationals access to defense technical
data, whether or not the foreign national actually views it, is considered an export that requires authorization.


In the public cloud scenario, the customer generally has no control over or knowledge of the exact location of its data, and in fact,
there could be multiple copies of its data in multiple locations. Providing export-controlled data to a data center located outside the
U.S. could be considered an export to the data center location, which could require export authorization.
Additionally, once the company hands over its data to the service provider, the customer has limited control over who has access to
the data. From a security perspective, that is no doubt of great concern. In addition to requiring strong security controls, companies
with export-controlled data must implement measures to prohibit foreign nationals from having access to their export-controlled data.
Companies wary of turning over their data to public clouds have been considering private cloud models, in which the cloud service
provider constructs a cloud solely for one organization, or hybrid clouds, which enable data and application portability between a
private cloud and a public cloud (so more sensitive data can be kept in the private environment).
Any scenario in which a third-party service provider has access to your company's export-controlled data introduces risk of improper
disclosure to that third party for which your company could be liable.
To minimize the risk of improper disclosure of your export-controlled data, following are some key questions to ask the cloud
provider:
How is the cloud service set up to comply with U.S. export controls?
Where in the world will your data be stored?
How is sensitive data segregated and controlled?
Would any foreign nationals have access to your data?
Does an auditable trail exist?
It's important for IT departments to have answers to these questions as they evaluate cloud services.
Marsha McIntyre is an attorney at Hughes Hubbard & Reed LLP who focuses on export controls and sanctions. Prior to joining Hughes Hubbard & Reed,
McIntyre worked at the U.S. Department of State, Office of the Legal Adviser, providing guidance on international trade issues.

Adoption Status for Public and Private Clouds

Public clouds:
66% are using, planning and considering public clouds
17% are already using
12% are already using and plan to expand usage
11% are planning to use within the next 12 months
26% are interested in/considering

Private clouds:
72% are using, planning and considering private clouds
18% are already using
13% are already using and plan to expand usage
13% are planning to use within the next 12 months
28% are interested in/considering

Based on a survey of Interop New York attendees, October 2010


Supply Chain In The Cloud: Enabled Just In Time
By Mary Shacklett
Research firm IDC estimated that $7.7 billion will be spent worldwide on cloud
services, with most of it going to enterprise resource planning (ERP) and customer relationship
management (CRM) cloud service providers, while supply chain management lags
behind. Now, however, there are signs that these predictions are changing. Much of the
credit can go to the recent recession, when many companies outsourced manufacturing,
scaled down inventories, and avoided major product commitments in an environment
where it was virtually impossible to tell which items cost-conscious customers were going
to buy.


This combination of keeping inventories scaled back, relying more than ever on just-in-time (JIT) inventory ordering and then having
to blend ordering and production systems with systems of thousands of suppliers around the world finally created enough critical
mass to overturn traditional industry wisdom about keeping supply chain systems in-house and protecting against leakages of
production information and intellectual property.
An added pressure from outsourcing to suppliers around the world for least-cost manufacturing was certifying all of these suppliers
to communicate in secure environments with corporate IT systems. Supplier certification is a painful, iterative process capable of
overwhelming an entire IT staff. The fix was obvious: Why not go to a supply chain cloud service provider that already has 80 percent
of your supplier base certified and ready to plug in to your supply chain?
"There are two aspects to the benefits of cloud computing models for manufacturing and logistics organizations," said John Brand,
Research Director at Hydrasight, an IT research and analysis firm. "One is to remove the internal costs associated with running your
own IT infrastructure. The second is the benefit of increased visibility across organizational boundaries, particularly if a third party
is involved. In fact, when you consider what cloud-based e-mail services can do for the control and removal of spam and viruses,
cloud-based supply systems can similarly reduce the noise within the supply chain to simplify and speed up data exchange."
Perhaps the most pivotal question for companies seeking the cloud, however, is the degree of integration their businesses and
systems require with their supplier bases. There are two fundamental cloud computing approaches to be considered: either a
Web portal that provides real time communications and collaboration capabilities between companies and their suppliers, or a fully
integrated business-to-business (B2B) solution that not only provides real time communications and collaborations between all
parties, but that also performs transaction processing and database updates in real time.
"The reality is that organizations are often better served by data intermediaries who aggregate and value add to the data that passes
through the supply chain. Most often this data is made anonymous or heavily obscured to ensure privacy and integrity, while giving
organizations greater insight and intelligence into data which can be reasonably shared between parties, with the right security
policies and protocols in place. These data hubs can provide very rich services beyond simple data aggregation, reporting and
analytics," Brand said.
Many companies adopt the Web portal approach to a cloud-based supply chain and achieve fast results, getting their entire
supplier bases online in a matter of several weeks. Other companies, mostly large enterprises with robust supply chain integration
requirements, require considerable B2B integration with suppliers and with cloud-based supply chain solutions, just as they would
with an internal supply chain system. In these cases, integration can be tricky, and is a project that gets into months rather than
weeks.
"The biggest obstacles are usually inflated expectations for business users, obstructionary IT departments, and poor alignment in IT
infrastructure," said Brand. "Getting the balance right between the level of investment required to move into cloud-based services
and the execution of a migration and management plan is still a significant challenge."


Negotiating Cloud Computing Contracts


By Christopher C. Cain, Foley & Lardner LLP
Cloud computing continues to grow apace, with more businesses each year considering some form of a cloud solution. This is not
to say that IT departments are abandoning traditional software solutions, but they are picking and choosing the business functions they
are willing to push to the cloud. With one foot increasingly in the cloud and the other remaining in the business, IT personnel need
to keep in mind the difference between a traditional software solution and a cloud offering. With those differences in mind, you can
focus on the key aspects of the cloud computing agreement.
Cloud computing involves scalable and elastic IT-enabled capabilities delivered as a service. The vendor hosts the software and
data, often your data and other customers' data held in a shared environment. In contrast, traditional software licensing involves
the delivery of a good, the software, installed locally in your environment. The software is usually highly configurable so it can meet
particular business needs and you retain control over the data used by the software. So going from the ground to the cloud means
your focus must shift from installing and configuring software to making sure the cloud service is available when needed and secure.
Let's consider availability and security, each in turn.
The cloud service needs to be available for use in your business, but you are relying on the vendor, not your own IT personnel, for
that availability. Pay close attention, therefore, to the vendor's service levels, response times for issues and remedies for unavailability.
Any reputable cloud vendor should have a very high uptime warranty, guaranteeing that the cloud service will have an uptime of a
certain percentage, during certain hours, measured over an agreed-upon period. Carefully consider the agreed-upon measurement
period (e.g., daily, monthly, quarterly), as vendors want longer measurement periods because they dilute the effects of downtime.
Then ensure the vendor provides latency warranties, because a service with untimely or delayed responses is effectively unavailable. The
agreement needs to include a matrix of estimated resolution times for reported problems based on the severity of the issue. Finally,
the vendor should provide adequate service credits as a remedy for excessive downtime. The remedy should start out as modest
credits toward future services and scale to larger credits, and if repeated failures occur, you should have the right to terminate the
agreement without penalty.
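To see why the measurement period matters, here is a minimal sketch in Python, using a hypothetical 99.9% uptime warranty and a hypothetical 90-minute outage (figures chosen for illustration, not taken from any vendor's SLA):

# Minimal sketch (hypothetical figures): how the SLA measurement period
# dilutes the effect of a single outage under a 99.9% uptime warranty.

UPTIME_TARGET = 0.999          # hypothetical 99.9% uptime warranty
MINUTES = {"day": 24 * 60, "month": 30 * 24 * 60, "quarter": 90 * 24 * 60}

for period, total in MINUTES.items():
    allowed = total * (1 - UPTIME_TARGET)
    print(f"{period:8s}: {allowed:6.1f} minutes of downtime allowed")

# A single 90-minute outage, measured over each window:
outage = 90
for period, total in MINUTES.items():
    measured = (total - outage) / total
    breached = measured < UPTIME_TARGET
    print(f"{period:8s}: measured uptime {measured:.4%} -> "
          f"{'SLA breached' if breached else 'SLA met'}")

Under these assumed numbers, the same outage breaches a daily or monthly measurement but slips under a quarterly one, which is exactly the dilution effect to negotiate against.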
Data security is important to protect sensitive data, both your company's and your customers'. You are accountable for complying
with security and privacy laws, regardless of whether you or a cloud vendor holds the relevant data. And data breaches are
expensive: a recent study, "Cost of a Data Breach" by the Ponemon Institute, examined the costs of dealing with a data breach and
revealed an average total cost of $6.75 million. At a minimum, if a breach of security or confidentiality requires notification to your
customers under any privacy law, then you should have sole control over the timing, content and method of such notification.
The cloud agreement needs to have specific details regarding the vendor's security measures, security incident management, and
hardware, software and security policies. These should all be reviewed by someone competent in data security. Compare such
policies with your own. More customers of cloud vendors are demanding that the vendor match the customer's policies and provide
copies of annual SAS 70 audits.
Vendor data centers located in a foreign country are a big potential problem because no opportunity exists to inspect the foreign
location and the location of the data may determine the jurisdiction and the law governing it. There is no global privacy law or
standard and thus protections vary widely. Moreover, vendor help-desk personnel accessing your data could be located in a foreign
country with limited security and privacy laws. Consider requiring the vendor's data center be located and the services be performed
in the United States, and that no data be made available to those located outside the United States. If you cannot obtain these
warranties, find another vendor or at least consider very carefully the data you send to that cloud.
Data format, insurance and fee escalators are three key issues to address in minimizing your cloud risk. Avoid the hidden costs of
being locked in to the vendor's solution because of its proprietary file format. The agreement should require that, upon expiration or
termination, the vendor return your data both in the vendor's data format and in a platform-agnostic format, and thereafter destroy all of
the customer's information on the vendor's servers. Also related to data formats is the issue of deduplication and whether your vendor
uses it. Deduplication removes redundant data from your files to save storage space in the vendor's network. This process may remove
metadata from the file, which can result in many issues in the event of litigation. Companies have found themselves subject to sanctions
in litigation because metadata is missing from data relevant to the litigation. Accordingly, you need to consider requiring the vendor to
keep a full copy of the data, with all metadata, or you need to retain full copies.
Also, don't overlook your ability to help self-insure against risks associated with a cloud agreement. While the vendor should have
technology errors & omissions insurance, consider getting a cyber-liability policy for your business. Cyber-liability insurance can
protect you against unauthorized access to a computer system, theft or destruction of data, hacker attacks, denial of service attacks
and malicious code or violations of privacy regulations. To avoid sticker shock from escalating prices, you should attempt to lock in
any recurring fees for a period of time (one to three years) and thereafter an escalator based on CPI or other third-party index should
apply.
If you are considering moving some business functions into the cloud, keep in mind the difference between the cloud and traditional
software and protect your business accordingly.
Christopher C. Cain is a partner with the law firm of Foley & Lardner LLP, practicing in the firm's Information Technology & Outsourcing and Transactional &
Securities practices. He routinely counsels clients on the legal, technical and transactional issues arising in technology transactions.

Vendors Perceived as Cloud Computing Providers

Which of the following do you consider cloud computing providers? (2010 / 2009)

Platform providers like Google Apps, iCloud: 73% / 70%
Infrastructure providers like Amazon EC2, GoGrid: 53% / 51%
SaaS providers like Salesforce.com, Sage: 48% / 45%
Virtualization technology providers like VMware, Xen: 38% / 43%
Other: 9% / 7%

Note: Multiple responses allowed
Base: 607 respondents in October 2010 and 547 in February 2009
Data: InformationWeek Analytics State of Cloud Computing Survey of business technology professionals


CloudAudit Gets Real


By George Hulme
For enterprises, one of the biggest challenges with cloud computing is gaining transparency
into the operational, policy and regulatory, and security controls of cloud providers.
For cloud providers, one of the most pressing challenges is answering all of the audit and
information-gathering requests from customers and prospects. CloudAudit aims to change
that.


Not being able to assess and validate compliance and security efforts within various
cloud computing models is one of the biggest challenges cloud computing now faces. First, when a business tries to query a cloud
provider, there may be lots of misunderstanding about what is really being asked for. For instance, when a business asks whether the
provider conducts periodic vulnerability assessments and the provider responds in the affirmative, it could be acknowledging an annual
review, a quarterly review, or a daily vulnerability assessment. Perhaps it checks yes when really all it performs is an annual
penetration test. There is too much ambiguity.
Additionally, cloud providers can't spend all of their time fielding questions about how they manage their infrastructure. And,
regrettably, not many public cloud providers offer much transparency into their controls. And no, SAS 70 audits don't really account
for much of anything when it comes to security.
To help clear the fog, CloudAudit.org, an organization that formed just this year and is moving fast in the area of cloud management,
has emerged with what it hopes will be part of the solution. The group is developing a common way for Infrastructure-, Platform-
and Software-as-a-Service providers to automate how their services can be audited and assessed, and how assertions about their
environments are provided. Consumers of these services would also have an open, secure, and extensible way to use
CloudAudit with their service providers.
The group currently boasts about 250 participants in the effort, including end users, auditors, system integrators, and cloud providers
from companies such as Akamai, Amazon Web Services, enStratus, Google, Microsoft, Rackspace, VMware, and many
others.
Last week the group released its first specification to the IETF as a draft, as well as CompliancePacks that map control objectives to
common regulatory mandates, such as HIPAA, PCI DSS, and ISO27002 and COBIT compliance frameworks.
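To make the idea concrete, here is a minimal sketch of how a consumer might pull a provider's assertions automatically; the host, namespace root and compliance-pack paths below are hypothetical placeholders, not the actual directory layout defined in the CloudAudit draft:

# Minimal sketch: fetching compliance assertions from a hypothetical
# CloudAudit-style namespace. The host, namespace and paths are
# assumptions for illustration only; consult the CloudAudit draft for
# the actual directory and naming conventions.
import json
from urllib.request import urlopen

PROVIDER = "https://cloud.example.com"                 # hypothetical provider
NAMESPACE = "/.well-known/cloudaudit"                  # assumed root namespace
CONTROLS = ["compliance/pcidss", "compliance/hipaa"]   # hypothetical packs

for control in CONTROLS:
    url = f"{PROVIDER}{NAMESPACE}/{control}/manifest.json"
    try:
        with urlopen(url, timeout=10) as resp:
            manifest = json.load(resp)
        print(control, "->", manifest.get("assertions", "no assertions listed"))
    except OSError as err:
        print(control, "-> not published:", err)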
As (or if) CloudAudit is embraced by cloud providers, businesses should be able to shop and compare services much more intelligently.
Also, it could help some cloud business users feel more comfortable moving regulated data (where it's permitted) to a public provider.
For cloud service providers, CloudAudit can help them handle the volume of audit requests they field each year more cost-effectively. And,
who knows, such transparency may even be a boost to business.
Building a standard is one thing; getting it adopted, working, and embraced by industry is quite another. In my next post, I'll bring you a
discussion with a cloud management provider that has already begun putting CloudAudit to use.


The Cloud's Proliferating Open Source APIs


By Charles Babcock
In June, Red Hat moved its Deltacloud open source project into the Apache Software
Foundation's incubator. In July, Rackspace made its Cloud Files code open source and will
collaborate with partners in the OpenStack project. The Open Cloud Standards Incubator
at the DMTF is producing another set. Isn't this just too much open source?


Both OpenStack and Apache Deltacloud are projects with similar goals. They seek to build
out a set of Representational State Transfer, or lightweight REST, APIs that allow outsiders
to tap into the services of a cloud provider over HTTP. Red Hat CTO Brian Stevens, not wishing to sound unrealistic,
said that yes, two sets of open source APIs coming into existence at roughly the same time will compete with each other. In fact, they will
be a little different and used for different purposes.
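As a rough illustration of what tapping into such a service looks like, here is a minimal sketch that lists a provider's instances over HTTP; the endpoint, path and credentials are hypothetical placeholders rather than the exact resources defined by Deltacloud or OpenStack:

# Minimal sketch: listing a cloud provider's instances through a REST API
# of the Deltacloud/OpenStack sort. The endpoint, path and credentials
# are hypothetical placeholders for illustration.
import base64
from urllib.request import Request, urlopen

API_URL = "http://deltacloud.example.com:3001/api"     # hypothetical endpoint
USER, PASSWORD = "demo", "secret"                       # hypothetical credentials

req = Request(f"{API_URL}/instances")
token = base64.b64encode(f"{USER}:{PASSWORD}".encode()).decode()
req.add_header("Authorization", f"Basic {token}")       # HTTP Basic authentication
req.add_header("Accept", "application/xml")             # ask for an XML resource listing

with urlopen(req, timeout=10) as resp:
    print(resp.status, resp.read()[:500])               # status code and first bytes of the list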
The DMTF has a set of already suggested APIs in hand from Oracle, Fujitsu, VMware and Telefonica, and its goal is to produce a set
of cloud APIs that will work for those vendors and the customers they seek to supply.
The Rackspace/NASA Nebula-based OpenStack backers are looking to provide systems that could be used by cloud services
suppliers who want to manage up to a million servers. Its software will be aimed at the service provider and allow many service
providers to look the same and be dealt with the same way by their customers. RightScale CTO Thorsten von Eicken went
somewhat out of his way to say that this is a "true open source project," meaning it will include a variety of vendor participants and
form a community around the resulting code.
But he could have just said "an open source project." The use of "true open source project" tells me that this group is a little nervous
about its open source standing. It is, after all, a group of vendors who each have a direct commercial interest in the outcome. Looked
at from that perspective, the Deltacloud project in the Apache incubator looks like an even more true open source project, open to
developers from around the world, each of whom will have a minimal direct commercial interest in the outcome.
I don't really care about the hair-splitting. The OpenStack project reminds me of XenSource, the company formed behind the open
source hypervisor that was backed by IBM, Sun, Oracle, HP and others. But I view XenSource as less successful in attracting
multitudes of developers to its cause than some other projects because of that vendor domination. If the agenda is being set by
Oracle and IBM, how many independent developers, working for nothing, are going to spend time on the project or choose its output
for their next project? That worry doesn't affect Oracle et al. too much because they have thousands of existing customers ready to
work with the alternative they provide.
The Apache Deltacloud project is more of a grassroots project, possibly more likely to be picked up and used by a variety of
grassroots developers and enterprise developers seeking to build an internal cloud. If enough of these implementations come into
being, then the cloud suppliers will take notice. Perhaps they've already implemented OpenStack as the means to get to a rapidly
scalable infrastructure quickly. There's no reason why they couldn't dedicate part of that infrastructure to being activated by a set of
APIs already in use inside the enterprise.
In effect, we need all three of these open source API efforts to accomplish different goals and to allow the cloud to become a form
of computing that connects to many different customers and implements varied styles of computing. We are well on our way.
Emerging technology always comes with a learning curve. Here are some real-world lessons about cloud computing from early adopters. Download the
latest all-digital issue of InformationWeek for that story and more. (Free registration required.)


Wholesaler Builds A Private Cloud With Virtual I/O
By Beth Bacheldor
Supplies Network, a privately-owned wholesaler of IT products, decided to roll out a virtual
I/O product as part of its ongoing efforts to modernize its data center. The company has
virtualized all its servers and is building out a private cloud infrastructure. The goal of this
modernization is to boost scalability and performance while cutting costs and complexity,
so the last thing the St. Louis-based company wanted was a complicated, expensive
cabling infrastructure to connect its private cloud, explains Dan Shipley, data center architect for Supplies Network.
The company considered several options, including alternatives based on Fibre Channel over Ethernet (FCoE), but went with a
virtual I/O package from Xsigo Systems. "Traditionally, you need different cables for different networks on the Ethernet side, with ten
to twelve Gigabit Ethernet or Fibre Channel connections on the back of each server," says Shipley. "All of these cables, switches,
ports, etc. are complex and expensive." He says that when the company looked at how it wanted to build out a cluster of virtualized
servers, a key design element was a consolidated wiring structure.
Xsigo offers its I/O Director, a switch-like device that's designed to make a single network connection appear to be multiple virtual
NICs or Host Bus Adapters (HBAs). Xsigo's technology offloads NIC emulation to the I/O Director. Each server has one physical
InfiniBand card (or two for redundancy), which the I/O Director can make appear to be multiple InfiniBand, Ethernet or Fibre Channel
cards. Each InfiniBand card supplies 10Gb/s of bandwidth that can be dynamically allocated between network and storage
requirements, according to Xsigo. Meanwhile, multiple Ethernet and Fibre Channel connections are replaced by a thin InfiniBand
cable. The I/O Director can connect to the SAN, LAN, and IPC networks.
Supplies Network's virtualized server deployment consists of 16 Hewlett-Packard DL380 and DL360 servers, VMware's vSphere
software, and a NetApp SAN for storage. Because NetApp's SAN does not offer native InfiniBand support, Supplies Network relies
on an external InfiniBand interface to connect it to the private cloud. Shipley says he and his staff considered the pros and cons of
different solutions. "InfiniBand has been around for a while; it has been used in high-performance computing and super data centers.
Its standardized protocol stack is mature, and runs at 40Gbit speeds today," he says. "FCoE, however, is still in a state of flux and
we are worried about vendor interoperability and vendor lock-in." There were other concerns, too. In particular, Shipley says that with
an FCoE link, storage traffic always takes precedence. "You can't say on an FCoE link, use only 4Gbs of fiber for storage and use
the rest for streaming video," he says. "We were concerned about that." Shipley points to other pros of InfiniBand: native support of
remote direct memory access (RDMA), and its lower latency compared with FCoE. Shipley also believes the security is stronger.
Bandwidth, scalability and flexibility are important to Supplies Network, which has more than 5,000 products in its catalog, runs four
distribution centers to supply products nationally, and offers managed print services, among other things. Xsigo's InfiniBand fabric
can scale beyond 2,000 nodes and provides performance up to 40Gb/s per link, according to Xsigo. Xsigo's open fabric connects to
widely-available server I/O cards and blade components from leading vendors including Dell, Hitachi, HP, IBM, and Supermicro. Xsigo
says its technology also includes the flexibility to change and expand I/O configurations on the fly in response to new requirements.


Web Security Services Put to the Test


By Randy George
The Web is a dangerous place. Web-based malware can cripple a PC in seconds and
take hours to remove--if it can be removed at all. Enterprises have turned to on-premises
products, both gateway-based and client-based, to filter inbound malware and stop users
from surfing to compromised or outright malicious sites. Now enterprises have another
option: A growing number of providers offer Web security as a service. These software as
a service (SaaS) offerings promise protection similar to what you'd get with an on-premises
product, but without the capital outlay or ongoing operational costs. We tested Web
security services from Barracuda, McAfee, Symantec, Webroot and Zscaler.


You can find the full report, complete with a detailed analysis of each service reviewed,
here. Generally speaking, SaaS-based Web security works like this: Proxy your outbound Internet traffic through the closest point of
presence that your Web security vendor provides, and the provider skims out the malware mixed in with legitimate Web traffic.
As we prepared to launch this review, we were skeptical about the whole concept of Web security in the cloud. The idea of routing
your outbound Web traffic through a third-party proxy seemed like mayonnaise on a hamburger: just not appealing. We can now say
that after completing this review, we'll eat that mayo burger.
But that doesn't mean we're giving up ketchup. The fact is, premises-based Web malware products from the likes of Bluecoat,
Finjan, Websense, McAfee and others are still our first choice for protecting users in corporate offices. They have proven themselves
to be scalable, reliable and effective. We can't yet say the same for SaaS Web security products--the market is simply too immature.
We need more assurance that these services will scale sufficiently before we're ready to recommend wholesale adoption.
However, there's never been an efficient way to extend on-premises protection to remote users. Bluecoat and others offer halfhearted
products via proxy clients, but routing traffic from an employee on the West Coast to a Web gateway filter on the East Coast
isn't efficient. This is where a Web security service can make a difference. We think that at present, midsize and large enterprises
that need to protect road warriors and small branch and remote offices should consider supplementing an on-premises product with
a SaaS-based offering. A service is easy to deploy, particularly for a subset of your entire employee base, and has low capital and
operational costs. In addition, user groups will get a similar level of protection as with an on-premises product.
Without fail, everyone we speak with about Web security in the cloud asks about the overhead added to the browsing experience,
so latency testing was a key element of our reviews. As we report our latency findings, keep these points in mind: First, many factors
go into the latency equation, so your mileage may vary from ours. Second, each provider's cloud may have served us content out of
cache, potentially making one provider's latency better than another's. In addition, cloud providers have different geographical points
of presence; those with sites closer to our Boston-based lab had an advantage. Third, latency measurements should not be used
to rate one vendor over another, but rather should be used to broadly illustrate the impact of routing traffic through any provider's
service.
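Readers who want a rough sense of the overhead on their own links can reproduce the basic measurement with a short script; this minimal sketch assumes a hypothetical test page and proxy address (substitute your provider's point of presence) and simply averages fetch times with and without the proxy:

# Minimal sketch: comparing page-fetch time with and without a Web
# security proxy. The test URL and proxy address are hypothetical
# placeholders; real results depend on cache state and geography.
import time
from urllib.request import ProxyHandler, build_opener

TEST_URL = "http://www.example.com/"                          # hypothetical test page
PROXY = {"http": "http://proxy.example-provider.net:8080"}    # hypothetical point of presence

def timed_fetch(opener, url, runs=5):
    """Average wall-clock seconds to fetch url over several runs."""
    total = 0.0
    for _ in range(runs):
        start = time.perf_counter()
        with opener.open(url, timeout=30) as resp:
            resp.read()
        total += time.perf_counter() - start
    return total / runs

direct = timed_fetch(build_opener(), TEST_URL)
proxied = timed_fetch(build_opener(ProxyHandler(PROXY)), TEST_URL)
print(f"direct:  {direct:.2f}s")
print(f"proxied: {proxied:.2f}s (added latency {proxied - direct:+.2f}s)")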
The key takeaway for us in the labs is that the average user will probably never know that he or she is using a third party for Web
security. We generally found that most sites added only 0.1 or 0.2 seconds of latency. For sites that had many objects to fetch, we
saw a few extra seconds of latency, but that was at the absolute highest end of the spectrum.
The accuracy of each provider's URL and malware filter was another key component of our testing. There were no curveballs in this
section of the testing: The goal was simply to see how well each vendor appropriately categorized and blocked our attempts to
access a set of well-known spyware/malware domains.
We randomly selected 10 URLs from a huge database of known malware domains maintained at malwaredomains.com, and we
ran each vendor through the testing simultaneously. Because of the random nature of our domain selection, it's fair to argue that
our results may not tell the entire story in terms of just how accurately each vendor can filter malware. For the most part, we were
extremely pleased with how well each malware filter worked. With so many sites compromised on the global Internet, it's impossible
to expect any URL filter to stop each and every potential threat. However, if nine out of 10 sites that are known to host malware can
be filtered right from the get-go, IT managers have a tiered security capability when the service is combined with endpoint antivirus
and malware protection.
We also set out to see how well each vendor implemented Web security above and beyond simple URL filtering. We stuffed the EICAR
test virus, which antivirus vendors support for testing signature-based detection systems, into an encrypted zip file, a self-extracting
zip, and a zip file that was recursively zipped multiple times. All the providers in our roundup were able to detect the virus stuffed
inside the self-extracting and recursively zipped files.
We were initially able to slip the encrypted zip file past everyone. However, Barracuda, Zscaler, and Webroot have administrative
controls that can be configured to disallow password-protected and encrypted zip files. Symantec did not natively provide this
functionality, and McAfee had no administratively configurable attachment policy at all.
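For those who want to recreate the archive tests, here is a minimal sketch that builds the recursively zipped sample from the industry-standard EICAR test string; note that Python's standard zipfile module cannot write password-protected archives, so the encrypted-zip case would need an external zip utility:

# Minimal sketch: building the recursively zipped EICAR test sample used
# to probe a gateway's archive inspection. The EICAR string is the
# industry-standard, harmless antivirus test file.
import io
import zipfile

EICAR = (b"X5O!P%@AP[4\\PZX54(P^)7CC)7}$"
         b"EICAR-STANDARD-ANTIVIRUS-TEST-FILE!$H+H*")

def zip_bytes(name: str, payload: bytes) -> bytes:
    """Return a zip archive (as bytes) containing a single member."""
    buf = io.BytesIO()
    with zipfile.ZipFile(buf, "w", zipfile.ZIP_DEFLATED) as zf:
        zf.writestr(name, payload)
    return buf.getvalue()

# Zip the test file, then re-zip the resulting archive several times over.
payload, name = EICAR, "eicar.com"
for level in range(1, 6):                      # five levels of nesting
    payload = zip_bytes(name, payload)
    name = f"eicar_level{level}.zip"

with open("eicar_nested.zip", "wb") as out:
    out.write(payload)
print(f"wrote eicar_nested.zip ({len(payload)} bytes, 5 levels deep)")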
Next, we tried to slip a booby-trapped PDF through each provider's security scanners. Our infected PDF contained active code that
could be used to install a downloader for distributing malware to infected clients. Every vendor detected and filtered the exploit.
We also threw a booby-trapped JPG and SWF file at each provider, with the same results. While not every service was extremely
configurable from a management perspective, all of the providers performed well at the core function of detecting nefarious
attachments and filtering them in the cloud.
For other attacks, we felt Zscaler had the most robust security capability. For example, Zscaler was the only provider that was able to
prevent cookie theft via cross-site scripting attacks. Zscaler accomplished this by applying a watermark to each cookie dropped onto
the computer. If those cookie contents were accessed by a third-party site, the Zscaler cloud assumed a cookie theft was underway
and blocked the transaction. Zscaler can also enforce policy at the cloud level against other XSS attacks. The Zscaler service offers
application, P2P, and file control features, bringing the most impressive array of security features to the cloud.
The SaaS Web security providers offered good out-of-the-box reporting capabilities. In fact, there are some big name on-premises
appliances that can learn a lesson from how the cloud vendors are doing it. Most of the vendors in our lineup have very useful
dashboards that quickly displayed a range of pertinent information. You can get snapshots of top URLs, bandwidth consumed by
user and spyware/virus activity from all the players in our roundup. On the whole, Barracuda and Zscaler offered the most robust and
elegant reporting engines in the group. Both services offer a good selection of pre-canned reports, and you can drill down for more
detailed usage. All of the vendors, except for McAfee, offered the ability to download PDF-based reports, as well as the ability to
schedule the creation of new reports for quick access when needed.
In terms of value-added features, Zscaler includes basic data loss prevention functions as part of its core offering. Customers can
turn on dictionaries and enforce DLP policies for the Health Insurance Portability and Accountability Act (HIPAA), the Gramm-Leach-Bliley
Act (GLBA), the Payment Card Industry (PCI) Data Security Standard and others through the management interface. Zscaler can
also apply those DLP policies to attachments and instant messages, so long as the transaction traverses the Zscaler proxy.
Webroot impressed us with its ability to execute proactive vulnerability scans against any system protected by its cloud. The
vulnerability assessment results were linked to explanations of each vulnerability, along with remediation instructions.
Compared with the cost of on-premises protection, cloud-based Web security is a relatively reasonable proposition. And, if cash is
tight, the Web services model may be a much more appealing option for smaller organizations. Prices are relatively similar across the
board, ranging from $1.50 to $5.00 per user per month.
If you're not ready to do Web security in the cloud, you're not alone. But based on our experience in the labs, the protection
technology is robust. Organizations with no on-premises Web security should consider adopting a provider. The sell will be tougher
for organizations with significant investments in on-premises equipment. As mentioned before, we see remote access and protection
of small branch/home offices as the best use of these services for midsize and large organizations that already have Web security
gateways in place at the main office. Meanwhile, as the market and the providers mature, we expect Web security services to grow
into a viable option across the board.


Interop's Cloud Computing Events


ABOUT INTEROP
Interop Las Vegas is the biggest IT event of the year with 200+ sessions and 350+ exhibitors covering the most important business
technology trends, May 8-12, 2011. Learn about the latest innovations that drive business value including cloud, virtualization,
mobility, networking, security and more. Register with code WHITEPAPER to get a free expo pass or save $100 on the current price of
conference passes.
www.interop.com/lasvegas

ABOUT ENTERPRISE CLOUD SUMMIT


Interop Las Vegas hosts Enterprise Cloud Summit, where you will learn about practical cloud computing designs, as well as the
standards, infrastructure decisions, and economics to understand as you transform your organization's IT. Focus on public clouds on
May 8 and private clouds on May 9, 2011. Register with code WHITEPAPER to get a free expo pass or save $100 on the current price
of passes.
www.interop.com/lasvegas/enterprise-cloud-summit

