Joseph J. Raffa
Introduction

This paper introduces the concept of technology and distribution discontinuities related to desktop applications in the enterprise. It examines a number of alternative approaches to the traditional method of installing applications on personal computer client hardware, and considers each in light of the major issues CIOs face today in application deployment, namely the cost and complexity of provisioning, maintaining adequate performance for users, total cost of ownership, and security. Finally, the paper looks at a number of functional areas within the world of IT and poses a series of questions that can be used as a filter to determine whether one of the new application platform architectures is applicable in that area and whether it represents a possible investment opportunity.

Abstract

Over the past 40 years, the computing paradigm and architecture employed to deliver applications has evolved from centralized mainframe systems through departmental minicomputers and eventually to highly decentralized personal computers. The personal computer brought unprecedented interactivity and functionality to each user's desktop, based on application software installed and maintained locally on each machine. During the last decade, the revolution of the World Wide Web brought users access to networked resources delivered via a browser installed on each personal computer. The nature of that access has become continuously richer over time, starting with static data access in the form of web page viewing, evolving to internet commerce and eventually to Software as a Service (SaaS). SaaS replaces installed server software with web-based functionality and has enabled software vendors to serve enterprise customers with an incremental usage model sold and deployed in a distributed fashion.

However, there has always been a significant difference in functionality between web-based applications and natively-installed desktop applications, hence the need for both types. In addition, because software vendors must invest considerable resources to support multiple desktop operating systems and hardware platforms, the personal computer industry rapidly consolidated in its early years to a dominant systems architecture based on the Microsoft Windows operating system and Intel/AMD microprocessors (aka the WinTel architecture). As long as the requirement for pervasive installed desktop software remains, the economies of scale associated with the WinTel architecture will effectively dominate client computing economics and application architectures.

In the past twelve months, however, there has been a dramatic acceleration of efforts by vendors, led by Google, to close the functionality gap between web applications and native applications by giving the browser new capabilities. For example, Google has developed browser functionality, known as Gears, which enables web applications to persist and function in the browser, even when the browser is disconnected from the Internet. Upon reconnection, the Gears-based application synchronizes with the web service to present the user with a seamless experience. Similar technologies are becoming available from Adobe (AIR) and Microsoft (Silverlight). In addition, the virtualization technology suppliers, led by VMware and Citrix/Xen, have augmented their traditional server-side virtualization offerings to include fully-functional virtual clients which can be pooled, centrally managed and allocated to users on demand.
This is, in effect, starting to blur the traditional lines between installed client applications and server-based applications. On the mobile platform front, advanced browser functionality and increased bandwidth in products such as the iPhone (which includes the Safari browser) and the Google G1 phone (whose Android browser includes Gears) are closing the functionality and performance gap with traditional personal computers. On the server side, virtualization technology is maturing and enabling vendors to offer cloud computing services and web-based resources, such as Amazon's EC2 for computing and S3 for storage, which provide application vendors with a broad palette of functionality they can tap to improve the overall user value proposition. Similar cloud-based resources are also now straightforward for enterprises to provide, based on equipment in datacenters they own or control, in pursuit of that same user experience.

Hence, with the gap closing, it is now possible for developers to build compelling and competitive applications based on these emerging technologies which do not require natively installed applications. This opens up the possibility of future client application platforms based on non-WinTel systems. As this phenomenon develops, the locus of application development will inevitably shift toward this persistent web application model, further enhancing the value of a web-based platform, marginalizing the need for natively-installed applications and opening up the industry to the next application platform beyond the PC. The winners will be the application vendors
who figure out how to take market share throughout this transition. The losers will be the companies that are deeply tied to the traditional WinTel architecture and installed application paradigm.

The Major Issues for IT Desktop Operations

IT managers are handcuffed by all of the processes they must follow as workers become more mobile and compliance requirements more difficult to enforce. Specifically, the desktop operations group is responsible for the productivity of its users while making sure that their PCs are locked down to the required spec. However, current client management and security tools tackle only part of the problem; for example, tools can assure that users have access to necessary applications but cannot guarantee that machines are healthy enough to perform sufficiently. This adds up to five universal challenges:

Reining in the costs of PC management. Managing the day-to-day operations associated with supporting a PC environment is no easy assignment. IT departments are working overtime to deploy additional applications to users on a one-off basis while at the same time making sure all machines and applications are patched in a timely manner. All of this hurts IT's bottom line while only keeping machines at a bare minimum of acceptable standards.

Securing devices and data regardless of location. An OS will always require patching, but increasingly the challenge is securing the data. More than three-quarters of North American and European enterprises stated that protecting customer data and intellectual property were top business objectives in 2008. To do this, however, they are forced to deploy multiple technologies, such as antimalware, host intrusion prevention, information leak prevention, and encryption, in silos, making it difficult for IT to keep up with the latest threats.

Remaining compliant with regulations and mandates. In today's world, IT must be able to ensure that machines are compliant with the necessary regulatory and corporate requirements, such as the Office of Management and Budget (OMB) mandate to have all PCs configured with 300 specific settings. Whether the requirement concerns energy efficiency or data protection, organizations must have full control of and visibility into their PCs, and be able to prove to auditors at a moment's notice that they are in compliance.

Supporting a changing workforce. The number of typical nine-to-five, office-bound workers decreases every year. IT therefore faces a two-fold challenge: 1) an increasing number of mobile and remote employees, and 2) a younger, more demanding generation of workers who are used to greater choice and access from their experiences in the consumer world. IT must still provide a consistent level of service to these users, who now require anytime, anywhere access to data and applications.

Planning for disasters or workforce disruption. Another top priority for today's enterprises is protecting the organization's information assets from the next 9/11, Hurricane Katrina or even a simple laptop theft. The traditional computing environment is not equipped to handle such disasters, so IT must find a way to ensure workforce continuity in the event of a disruption.

Market Size

As of June 2008, the number of personal computers in use worldwide hit one billion, and about 75 percent of those are used by enterprises. Mature markets like the United States, Western Europe and Japan accounted for 58 percent of the worldwide installed PCs.
About 180 million PCs (16 percent of the existing installed base) were expected to be replaced in 2008, and the installed base as a whole was growing 12 percent annually. This replacement cycle naturally creates many possible insertion points for new client technology.

History of Client/Server Computing

The mainframe computer was the first product category in the new age of computer technology, and was considered the best of what was available: big, expensive machines used by large organizations to support their core business operations.
Application Streaming. The basic concept of application streaming has its foundation in the way modern computer programming languages and operating systems produce and run application code. Only specific parts of a computer program need to be available at any given moment for the end user to perform a particular function. This means that a program need not be fully installed on a client computer; parts of it can be delivered over a low-bandwidth network as and when they are required. The diagram below shows three basic models of application execution:
Firstly, distributed applications are installed directly on the client hardware and executed locally. This is the traditional application architecture for desktop computing. The second model illustrates the concept of application streaming: the application is stored and maintained on the streaming server, then downloaded on demand to client personal computers, where it is executed locally. Only those portions of the application which are required are downloaded on the fly, and the version of the streamed application must precisely match the version of the operating system on the personal computer; it may be necessary to maintain multiple versions of the application targeted at various client operating systems, such as Windows XP and Vista. This architecture greatly reduces the time and cost required to maintain applications, as they are stored centrally, and does not require the applications to be rewritten, but it needs to be carefully designed so that users do not incur a performance penalty.
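Before turning to the third model, here is a minimal sketch of the on-demand delivery idea behind the second, streamed model. All class and block names are hypothetical; a real streaming product fetches binary code pages over the network rather than Python callables:

```python
# Minimal sketch of application streaming: the server stores an application
# as named blocks, versioned per client OS; the client downloads a block only
# the first time the corresponding feature is used, then runs it from a
# local cache. All names are hypothetical.

class StreamingServer:
    def __init__(self):
        self.packages = {}  # (app, os_version) -> {block_name: callable}

    def publish(self, app, os_version, blocks):
        self.packages[(app, os_version)] = blocks

    def fetch_block(self, app, os_version, block_name):
        # The streamed version must match the client OS exactly.
        return self.packages[(app, os_version)][block_name]


class StreamingClient:
    def __init__(self, server, os_version):
        self.server = server
        self.os_version = os_version
        self.cache = {}  # locally cached portions of each application

    def run(self, app, block_name):
        key = (app, block_name)
        if key not in self.cache:  # fetched on demand, on first use only
            self.cache[key] = self.server.fetch_block(
                app, self.os_version, block_name)
        return self.cache[key]()   # executed locally on the client


server = StreamingServer()
server.publish("office", "winxp", {"spellcheck": lambda: "spellcheck ran"})
client = StreamingClient(server, "winxp")
print(client.run("office", "spellcheck"))  # streamed once, cached thereafter
```

Subsequent calls to the same feature incur no network traffic at all, which is why a well-designed streaming deployment can avoid a visible performance penalty after first use.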
Finally, in the third model, the applications are stored and executed on the server, and the results are presented to the client. This model encompasses a wide range of application categories, from simple browser-based applications through Rich Internet Applications, and generally requires the applications to be specifically architected or re-architected to run on the server rather than on the client.

Virtualization. Virtualization is a technique that utilizes a software implementation of a computer: one that provides a complete simulation of the underlying hardware. The result is a system in which all software capable of execution on the raw hardware can be run in the virtual machine; in particular, this includes all operating systems. Implemented on the server side, this technique allows multiple operating systems to share conventional hardware in a safe and resource-managed fashion, without sacrificing either performance or functionality. Each of these operating systems runs in its own virtual machine, which is under the control of the virtualization environment. This provides a high degree of control in managing a datacenter infrastructure, in that these virtual machines may be started, stopped or relocated to different physical server hardware so as to maximize the utilization of the server hardware in the datacenter. The resulting improvement in server utilization and cost reduction from adopting virtualization products from suppliers such as VMware and Citrix/Xen has fueled the growth of this category.

Virtual Desktop Infrastructure (VDI) is a server-centric computing model that borrows from the traditional thin-client model but is designed to give system administrators and end users the best of both worlds: the ability to host and centrally manage desktop virtual machines in the data center while giving end users a full PC desktop experience. The user experience is intended to be identical to that of a standard PC, whether accessed from a thin client device or similar hardware, in the same office or remotely. Many commercial solutions also add the ability to switch some incoming client sessions (using connection broker software) toward traditional shared desktop systems such as Microsoft's Terminal Services or Citrix's application servers, blade servers, or even individual unused physical desktop computers.
The diagram below illustrates how a pool of virtual desktops, stored and executed in a centralized environment, may be accessed and controlled by clients, delivering a user experience similar to actually running the code on the client hardware. The virtual desktops are assigned to each client on demand by the virtual desktop manager, which also manages the authentication, security and resources assigned to each virtual desktop. There may be several different types of virtual desktops, containing different operating systems, applications and resources, available to users and under the control of the IT department.
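As a rough illustration of the manager's brokering role, consider the sketch below. All names are hypothetical, and a production broker would also handle authentication tokens, display protocols and resource quotas:

```python
# Sketch of a virtual desktop manager assigning pooled desktops on demand.
# Desktops of several types (different OS/application mixes) sit in a pool;
# each authorized user is handed a free desktop of the appropriate type,
# which returns to the pool at logoff. All names are hypothetical.

class VirtualDesktopManager:
    def __init__(self, entitlements):
        self.entitlements = entitlements  # user -> desktop type allowed
        self.pool = {}                    # desktop type -> free VM ids
        self.sessions = {}                # user -> (desktop type, VM id)

    def add_desktop(self, desktop_type, vm_id):
        self.pool.setdefault(desktop_type, []).append(vm_id)

    def connect(self, user):
        desktop_type = self.entitlements.get(user)
        if desktop_type is None:
            raise PermissionError(f"{user} is not authorized")
        free = self.pool.get(desktop_type, [])
        if not free:
            raise RuntimeError(f"no free '{desktop_type}' desktops")
        vm_id = free.pop()                # assigned on demand
        self.sessions[user] = (desktop_type, vm_id)
        return vm_id

    def disconnect(self, user):
        desktop_type, vm_id = self.sessions.pop(user)
        self.pool[desktop_type].append(vm_id)  # back into the pool


mgr = VirtualDesktopManager({"alice": "xp-office", "bob": "vista-cad"})
mgr.add_desktop("xp-office", "vm-101")
print(mgr.connect("alice"))  # -> vm-101
mgr.disconnect("alice")      # vm-101 is available for the next user
```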
Desktop virtualization provides many of the advantages of a terminal server but (if so desired and configured by system administrators) can give users much more flexibility. Each user, for instance, might be allowed to install and configure their own applications. Users also gain the ability to access their server-based virtual desktop from other locations; it is now even possible to access a virtual desktop from a web browser. Coming full circle in the balance between centralized and distributed execution, a new category of virtual desktop is emerging which allows the virtual desktop to run on the client PC in addition to the server. The virtual desktop can be "checked out" of the central repository, used on the client hardware (either online or offline) and then checked back in to the central pool for later use on another client PC. This approach maintains the benefits of centralized administration without compromising responsiveness and performance from the user's perspective.
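A minimal sketch of that check-out/check-in cycle follows. The names are hypothetical, and real products move multi-gigabyte disk images and enforce policy on them rather than simple strings:

```python
# Sketch of checking a virtual desktop out of a central repository for
# local (possibly offline) use, then checking it back in so it can be
# used later from another client PC. All names are hypothetical.

class DesktopRepository:
    def __init__(self):
        self.images = {}       # image id -> desktop image contents
        self.checked_out = {}  # image id -> user currently holding it

    def check_out(self, image_id, user):
        if image_id in self.checked_out:
            raise RuntimeError(f"{image_id} is already checked out")
        self.checked_out[image_id] = user
        return self.images[image_id]   # copied down to the client PC

    def check_in(self, image_id, user, updated_image):
        if self.checked_out.get(image_id) != user:
            raise PermissionError(f"{image_id} is not held by {user}")
        self.images[image_id] = updated_image  # changes flow back centrally
        del self.checked_out[image_id]


repo = DesktopRepository()
repo.images["alice-desktop"] = "image-v1"
local = repo.check_out("alice-desktop", "alice")    # runs on the laptop
repo.check_in("alice-desktop", "alice", "image-v2")
print(repo.images["alice-desktop"])  # -> image-v2, ready for another PC
```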
To summarize, there are multiple architectural choices available to CIOs today for delivering applications to the desktop. The table below shows the full matrix of possibilities and highlights the areas which are most promising and reasonable, considering the performance tradeoffs for the end user.
Application Development vs. Application Delivery

It is important to draw a distinction between the application development environment and the application delivery infrastructure. This discontinuity involves changes in both of these processes, with consequent impact on the application vendors and infrastructure providers involved in serving the user. There are some subtleties here which incumbent and upstart software vendors need to consider. For example, think about the snappy features of Microsoft Word: if the user highlights a block of text, then opens the font menu and rolls over the various fonts, the highlighted text immediately responds to reflect the prospective choice. However, if the user tries this on a server-based virtual desktop over RDP, the look and feel is very different and can be unresponsive and frustrating. What's the lesson here? Even though the development environments for traditional Windows XP and virtual Windows XP are identical, the implementation undertaken by a vendor will differ depending on the target delivery model. Vendors want their applications to look good. The relevant question is: will they release two versions, one for the native desktop and one for the virtual desktop, in order to maintain user satisfaction? The development environments are very similar, making it easier for incumbent application vendors to exploit this new method, but the deployment is completely different, which presents possible costs and tradeoffs for the IT department and for users, impacting adoption. On the other hand, consider web access to a virtual desktop, such as is available from VMware VDM Web Access, contrasted with online apps such as Google Apps. From a user's perspective, these might look somewhat similar: usage consists of opening the browser, logging in, and using the word processor, spreadsheet and presentation applications over the web. In this case, the development environments are radically different, but the deployment and user experience may be quite similar, making it easier for users to switch vendors.
We will consider the needs of multiple constituents as we discern the opportunities to exploit this discontinuity:

Application developers (Independent Software Vendors) – there is an 80/20 phenomenon and a long tail of fringe applications. It is easier to pick off mainstream applications like Word, Excel and PowerPoint in a new development environment, but what about the long tail? At the same time, does a web-based approach offer a better distribution opportunity than fighting for shelf space with packaged software?

The IT department – the IT department needs to provide the infrastructure and maintenance for enterprise deployment. Interestingly, the server-based virtual desktop and the RIA approach may look very similar from an infrastructural, administrative and resource perspective.

The end users – end users can also be segmented into power users, rote data-entry workers, and others; their switching costs and the impact on their daily work processes are fundamental considerations.
The best opportunities to exploit the new application platform will be those which deliver the maximum amount of user value while simultaneously best leveraging the existing investments in infrastructure and application development from legacy implementations. Although the increasing number of architectural choices is exciting from a technologist's point of view, the uncertainty, complexity and potential risks of these various approaches can be overwhelming and confusing to IT management. Each of these approaches can be shown to be more efficient and cost-effective than the traditional installed desktop software method, but there is no perfect universal replacement presently available which does not result in some sort of compromised performance or functionality. However, this confused environment opens up possibilities for new entrants, including startups.

Architectural Alternatives for Application Platforms

As mentioned above, CIOs have multiple architectural choices available today for delivering applications to the desktop. The four primary options are:

Traditional Installed PC Software
Streamed Applications
Virtual Desktop Infrastructure
Web-based Applications
The dominant approach to date has been traditional installed PC software, with the operating system and the applications installed and run directly on the client hardware. In the enterprise today, this consists mostly of Windows XP running Microsoft Office. This method of deployment has become increasingly expensive over the years; specifically, the total cost of ownership of a typically managed Windows desktop has reached more than $5,000 per year. In fact, the high cost of PC management and support now offsets the relatively low cost of PC hardware. Desktop hardware and software acquisition costs typically account for only 20-30% of the total cost to the enterprise over the life of the device, while the remaining 70-80% consists of IT maintenance costs, such as moves/adds/changes (MACs) of employees, repairs and fixes, and upgrades. Ongoing PC management, including deployment of applications, software updates, and security patches, can be labor-intensive because of the need to test and validate deployment for a wide variety of PC configurations.

Application streaming solutions, such as Softricity's SoftGrid, Altiris SVS, or Citrix Systems' Streaming Server, enable the dynamic delivery of applications to end users' desktops from centralized servers. While similar to server-based computing, the fundamental difference is that the application is not actually run on the central server. Instead, when a user launches an application from the client in an application streaming environment, the server streams the necessary application files to the user's PC and the application launches. The key point is that not all of the files are sent to the PC; only what is required to run that particular application is streamed (i.e., when launching Excel, the whole 300 MB Office suite will not be streamed down). Application streaming allows quicker deployment of applications by providing on-demand access to needed applications from any computer. Since each application executes inside its own virtual space, applications will
not conflict with existing applications on a user's desktop and, thus, regression testing will not be needed. Additionally, it is no longer necessary to install all applications up front, which reduces the size and complexity of system images as well as the number of images that must be maintained across the environment for different user communities. On the other hand, for an application to be available in a streaming environment, it must first be sequenced and then packaged using a tool from one of the streaming suppliers, which adds to the cost and overhead associated with application adoption and maintenance.

The Virtual Desktop Infrastructure's form of desktop delivery, called network-centric management, differs from traditional client management architectures in that the authoritative copies of an OS and its application packages are created and maintained centrally. When a user logs in, the centralized system hosting the end user's session accesses the OS and/or application packages over the network, executes them on the central hardware, and presents the resulting computing platform to the host device. This enables a single-instance servicing model, where OS and application package configuration changes made in one central location can be made available to all users in a deterministic manner. The result is an unprecedented level of agility and manageability: end users can access their OS and applications from anywhere, and IT can deploy software and desktop environments simply and quickly.

The ability to pool desktop computing resources, eliminating the one-user-per-Windows-image paradigm, represents one of desktop virtualization's key differentiators. Decoupling the user from a specific desktop environment requires the infrastructure to figure out which VM server has capacity for a new end user attempting to log in; discover the user's virtual disk files; start up a VM using those files; and then connect the user to that VM desktop image. In many ways, the concept of a Virtual Desktop Infrastructure (VDI) parallels the theme of utility computing and combines many of the individual advantages of the legacy desktop management models previously described into one desktop delivery platform.

Although VDI offers the broadest set of advantages relative to the other models, some disadvantages exist depending on the end user type. For example, the VDI model requires clients to be online and connected to the server with reliable network bandwidth, which is not always practical for mobile knowledge workers. Furthermore, even when the required connectivity is in place, the performance of graphics-intensive applications is compromised relative to natively installed applications, which may be unacceptable to power users. To address these issues and provide a complete desktop virtualization solution, a number of new entrants are offering the ability to run corporately managed applications and PC images offline or, more accurately, when infrequently connected. Combining the paradigm of a Virtual Desktop Infrastructure with client-hosted hypervisors and streaming/provisioning technology, enabling users to seamlessly switch between online desktops and cached desktop images, represents an ideal way to offer centrally-managed desktop environments to local users with LAN connectivity, as well as to disconnected mobile and non-corporate remote users, using a single architecture.
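Returning to the pooled-login sequence described above, the four steps can be sketched as follows. The server names, catalog structure and capacity fields are assumptions for illustration only:

```python
# Sketch of the pooled-login sequence: find a VM server with spare
# capacity, discover the user's virtual disk files, start a VM from
# them, and connect the user. All names are hypothetical.

def login(user, vm_servers, disk_catalog):
    # 1. Figure out which VM server has capacity for one more session.
    host = next((s for s in vm_servers if s["active"] < s["capacity"]), None)
    if host is None:
        raise RuntimeError("no VM server has spare capacity")
    # 2. Discover the user's virtual disk files.
    disks = disk_catalog[user]
    # 3. Start up a VM using those files (stubbed here).
    host["active"] += 1
    vm = {"host": host["name"], "disks": disks, "state": "running"}
    # 4. Connect the user to that VM desktop image.
    return {"user": user, "desktop": vm}


servers = [{"name": "vmhost1", "capacity": 2, "active": 2},
           {"name": "vmhost2", "capacity": 8, "active": 3}]
catalog = {"alice": ["/vmfs/alice/system.vmdk", "/vmfs/alice/data.vmdk"]}
print(login("alice", servers, catalog))  # placed on vmhost2
```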
Virtualization solutions enabling hybrid online/offline modes of operation are especially useful for mobile employees, allowing them to work with their applications and PC images even when disconnected from the backend server. For example, if a mobile employee is about to leave the corporate environment and fly across the country to a meeting, a cached image of that user's virtual desktop, including the operating system, applications, and files, is copied onto a virtual machine created by a hypervisor on the employee's notebook before it is disconnected. The virtual desktop image then runs locally inside that virtual machine; the desktop environment is hosted entirely on top of the notebook's operating system and does not require a constant connection to a backend server.

A major concern with offline deployment models is the security of the offline image. To address this concern, both the local VM image and any files extracted from the image to the local host environment must be encrypted. Advanced Encryption Standard (AES) encryption keys can be generated by the server and stored locally on the client. Once connectivity to the backend server infrastructure is restored, active syncing with the server image, along with versioning support for changes to user data and files, is required to create a seamless end user experience with client-side caching of a centrally-managed virtual desktop image and its associated data.
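As a sketch of the encryption step, the snippet below uses AES-GCM from the third-party Python "cryptography" package. Key distribution, file layout and the sync step are simplified assumptions; here everything runs in one process for illustration:

```python
# Sketch of protecting an offline desktop image with AES. The key is
# generated server-side and handed to the client for local storage.
# Requires the third-party "cryptography" package (pip install cryptography).
import os
from cryptography.hazmat.primitives.ciphers.aead import AESGCM

key = AESGCM.generate_key(bit_length=256)  # generated by the server
aesgcm = AESGCM(key)

image = b"...contents of the cached virtual desktop image..."
nonce = os.urandom(12)                     # fresh 96-bit nonce per message
ciphertext = aesgcm.encrypt(nonce, image, None)

# Only nonce + ciphertext are stored on the client's disk; any files
# extracted from the image would be wrapped the same way. On reconnect,
# the decrypted image is synced back against the server copy.
assert aesgcm.decrypt(nonce, ciphertext, None) == image
```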
For example, Kidaro, acquired by Microsoft in March 2008, offered a desktop computing solution for enterprise desktops and laptops that encapsulated an entire desktop operating system, applications, tools, and data into a virtual machine that operated as an isolated workspace, whether or not the local machine was connected to the network. Kidaro's platform used client-hosted desktop virtualization, which has the benefit of supporting mobile users and disconnected use. Kidaro also offered a virtual desktop on a USB drive, named Kidaro ToGo, providing users access to corporate applications and data anywhere and from any device. There are a large number of new entrants and startup companies in this product category.

Traditional web applications center all activity around a client-server architecture with a thin client. Under this system, all processing is done on the server, and the client is used only to display static (in this case HTML) content. The biggest drawback of this system is that all interaction with the application must pass through the server, which requires data to be sent to the server, the server to respond, and the page to be reloaded on the client with the response. By using a client-side technology which can execute instructions on the client's computer, RIAs can circumvent this slow and synchronous loop for many user interactions. The difference is somewhat analogous to the difference between the "terminal and mainframe" approach and the client-server/fat-client approach. Internet standards have evolved slowly and continually over time to accommodate these techniques, so it is hard to draw a strict line between what constitutes an RIA and what does not. But all RIAs share one characteristic: they introduce an intermediate layer of code, often called a client engine, between the user and the server. This client engine is usually downloaded as part of the instantiation of the application, and may be supplemented by further code downloads as use of the application progresses. The client engine acts as an extension of the browser, and usually takes over responsibility for rendering the application's user interface and for server communication.

Although developing applications to run in a web browser is a more limiting, difficult, and intricate process than developing a regular desktop application, the effort is often justified because:

The installation footprint is smaller – the overhead for updating and distributing the application is trivial, or significantly reduced, compared with a desktop or OS-native application.

Updates and upgrades to new versions can be automatic or transparent to the end user.

Users can use the application from any computer or mobile device with an internet connection.

Many tools exist to allow offline use of applications, such as Adobe AIR, Google Gears, Curl and other technologies.

Most RIA technologies allow the user experience to be consistent, regardless of which operating system the user chooses.

Web-based applications are generally less prone to viral infection than running an actual executable.

Areas for Evaluation

The data center is being transformed under the influence of virtualization and externally hosted applications, and the idea of a "standard" corporate computing platform will vary based on organizational, departmental and user requirements. The key questions we ask should be motivated by the search for opportunities for startups to deliver technology components and/or vertical solutions that can exploit these emerging platform options.
Some areas to consider include:

Web-Based Applications – Startups like Zoho have been able to create broad suites of web-based applications in a highly capital-efficient manner, offering enterprise customers an easy path toward replacement of installed Microsoft Office apps.

Virtual Desktop Infrastructure – The propagation of virtualization-ready microprocessors into desktop, laptop and mobile devices is opening up a plethora of options for new virtualization techniques which combine the traditional benefits of centralized efficiency with the flexibility of decentralized and offline execution.

Platform as a Service (PaaS) – This is an outgrowth of the Software as a Service delivery model which makes all of the facilities required to support the complete lifecycle of building and delivering web applications and services available entirely from the Internet. As end users become more familiar and comfortable with using consumer platforms such as Facebook to host their personal applications and
communications, the possibilities for introducing a similar type of platform into the corporate environment are emerging.

Thin Clients, Netbooks and Nettops – The common denominator among these hardware platforms is that they are network-centric, offering the user the minimum amount of functionality required to access web-based and virtualized environments. Startups such as DeviceVM are developing novel approaches to providing users with always-on, fast and functional clients.

Mobile Applications – The surge in capability and popularity of smart mobile devices such as the BlackBerry and the iPhone is increasing the options for mobile application developers, and new investment funds such as the BlackBerry Partners Fund and the iFund have been set up to target these startups.
Questions To Help Identify Investment Opportunities

Categorizing these new VDI, RIA and PaaS paradigms can be tricky: any device that can run an Ajax-, Java- or Flash-enabled web browser can deliver a rich desktop experience. What's certain is that the "virtual office" may now be taken to the extreme. Enterprises can design and implement their entire corporate IT infrastructures as a hosted service, with users tying in via the internet with a minimum of hardware. The following questions are an initial attempt to define a set of criteria that can be used to test various investment opportunities:

1. Where is the most value (and thus the area best suited for a startup investment) to the business customer – in the virtualized infrastructure or mobile computing platform, or in the applications that reside within or on top of them?

2. What are the likely impacts on the existing network and storage infrastructure? Are major upgrades and investments going to be necessary before the full power of the "virtual office" can be realized?

3. Are there likely to be investment opportunities in areas that are pre-conditions for a successful "virtual office" world? For example, what areas of security and compliance might be worth investigating?

4. How do regulations in industries such as healthcare, financial services, and consumer privacy impact the "virtual office" concept? Are some areas just going to remain off limits?

5. Are there going to be geographic or regional differences with regard to either acceptance of the "virtual office" or regulatory requirements that need to be considered?

6. With regard to enterprise adoption, when are the natural "buying windows"? Is the decision-making centralized or decentralized?

7. Will "virtual office" solutions be priced and licensed like traditional applications (i.e., perpetual license), as per-user subscriptions such as SaaS, or on time-based or usage-driven (i.e., metered) criteria? Is one more likely to dominate than the others?

8. What role will Linux play on the desktop? Will it ever become a mainstream platform? Can open-source desktop applications, such as OpenOffice, be used in a virtualized environment as a viable alternative to Microsoft Office?

9. Is it practical to provide a virtual client-as-a-service offering for the small- and medium-business (SMB) market? What would be the go-to-market strategy?
10. Now that it is possible and practical to virtualize the client environment, in addition to the server environment, is there an opportunity for a new enterprise application delivery platform to emerge which is
not encumbered by the traditional middleware architectures? Perhaps architectures focused on human organizations and processes, rather than on computers?

11. What are the alternatives to a hosted service if the enterprise customer insists on keeping their data within their firewall?

12. With regard to the reality of existing licensing agreements in large enterprises and within the SMB markets, who is in control? What openings will present themselves in the next three years for new entrants?

Conclusion

Enhanced browser functionality, a rich palette of virtualization options and dramatic improvements in mobile devices are opening up the possibilities of future client application platforms to non-Windows systems. As this phenomenon develops, the locus of application development will inevitably shift, marginalizing the need for natively-installed applications and opening up the industry to the next application platform beyond the PC. This discontinuity raises the prospect of exciting investment opportunities for investors to consider in the coming years.