
1993 Gartner Symposium Overview

CAUSE INFORMATION RESOURCES LIBRARY

The attached document is provided through the CAUSE Information Resources Library. As part of the CAUSE Information Resources Program, the Library provides CAUSE members access to a collection of information related to the development, use, management, and evaluation of information resources -- technology, services, and information -- in higher education. Most of the documents have not been formally published and thus are not in general distribution. Statements of fact or opinion in the attached document are made on the responsibility of the author(s) alone and do not imply an opinion on the part of the CAUSE Board of Directors, officers, staff, or membership.

This document was contributed by the named organization to the CAUSE Information Resources Library. It is the intellectual property of the author(s). Permission to copy or disseminate all or part of this material is granted provided that the copies are not made or distributed for commercial advantage, that the title and organization that submitted the document appear, and that notice is given that this document was obtained from the CAUSE Information Resources Library. To copy or disseminate otherwise, or to republish in any form, requires written permission from the contributing organization.

For further information: CAUSE, 4840 Pearl East Circle, Suite 302E, Boulder, CO 80301; 303-449-4430; e-mail info@cause.colorado.edu. To order a hard copy of this document contact CAUSE or send e-mail to orders@cause.colorado.edu.

OVERVIEW OF GARTNER GROUP SYMPOSIUM 93
October 4-8, 1993
Lake Buena Vista, Florida

CAUSE Exchange Library Document CSD0791

OVERVIEW OF CAUSE AND GARTNER GROUP RELATIONSHIP

The Gartner Group is a worldwide provider of tactical information, strategic analysis, and data on the information industry. Gartner offers a broad range of decision-support services to executives in corporations, government agencies, and higher education that produce or use information technology products and services. CAUSE and the Gartner Group have established a program that provides the following benefits to CAUSE member institutions:

1. CAUSE member campuses receive a discount for Gartner Group services, referred to as CAUSE Association Pricing for ITM-A services. All CAUSE members with an E&G (Educational and General) budget above $200 million or with student enrollment of 10,000 or more will receive ITM-A (two recipients of all materials) for $16,500 per year, a discount of $3,500 from the current list price of $20,000 per year. In addition, a new pricing structure has been offered to smaller CAUSE members: ITM-A (one recipient) for institutions with an annual E&G budget of less than $200 million and fewer than 10,000 students. This one-recipient offering is available only to CAUSE member institutions, for $10,500 per year.

2. CAUSE is able to use Gartner data and information for white papers, newsletters, CAUSE/EFFECT articles, and other CAUSE publications. A CAUSE Task Force has been established to help find the most productive means of distributing information from Gartner Research Notes to CAUSE members. Two white papers are currently available through the CAUSE Exchange Library: "Client/Server Issues: A CAUSE/Gartner Group White Paper" (CSD0768) and "Telecommunications Issues: A CAUSE/Gartner Group White Paper" (CSD0769). Additional reports will be made available in 1994. CAUSE Information, the CAUSE bi-monthly newsletter, also has a regular feature using research from the Gartner Group.

3. CAUSE members who sign up for ITM-A services will also receive one ticket to the annual Gartner Industry Symposium.

QUESTIONS CONCERNING THE 1993 GARTNER SYMPOSIUM WITH GARTNER GROUP RESPONSES

Note: These questions are from a Gartner Group brochure announcing the 1993 Symposium and are included to provide an overview of Symposium 93.

1. Why should I attend Symposium 93?

Because you need the information necessary to make wise choices in today's volatile IT arena. You need answers to key issues such as:

* Beyond client/server - Which architecture will emerge as the leader?
* Technologies to depend on - Which ones will position you for the year 2000?
* An architectural and financial leadership vacuum - What does the crystal ball hold for IBM's future?
* Desktop supremacy - Microsoft, Novell or ...?
* Changing organizational structure - What are the roles of IS, top managers, and the whole tier of end users?

2. What is new in the program lineup?

This year we are introducing Exploratorium 93, a demonstration forum designed to give attendees a hands-on look at the newest information technologies in an educational environment. Only selected vendors have been invited to participate. They will also be offering one-hour product education sessions to attendees in a small-group interactive format. In addition, new program formats include head-to-head sessions, multimedia presentations, and interactive surveys. Keynote presentations will be given by William H. Davidow, general partner, Mohr, Davidow Ventures, and John P. Imlay, Jr., chairman, Dun & Bradstreet Software Services Inc.

3. How can I maximize the experience of attending Symposium 93?

Symposium 93 offers many different learning formats and more than 100 sessions to choose from. You can customize the entire agenda to your own specific needs, or you may want to follow one of the 12 focused mini-conferences. This format presents the ideal situation for multiple members of an organization to gain a thorough grounding in current issues and trends in all areas of the IT industry.

COMMENTS AND CHOSEN HIGHLIGHTS FROM SELECTED CAUSE MEMBER ATTENDEES OF SYMPOSIUM 93

SAM PLICE, CHIEF OPERATING OFFICER, INFORMATION TECHNOLOGY DIVISION AT THE UNIVERSITY OF MICHIGAN, OFFERED THE FOLLOWING LIST OF HIGHLIGHTS AND GARTNER PROJECTIONS:

* By 1996 all vendors will make the transition to core platform technology based on CMOS. (0.8 probability)
* By 1996, more than 60% of network enterprise-level data will be backed up and managed by large central servers. (0.7 probability)
* Large systems and enterprise MIS will remain the key manager, operator, and administrator of enterprise applications and storage during the rest of the decade. (0.7 probability)
* Average mainframe selling prices ($ per MIP) will decline to near workstation levels by 1998. (0.9 probability)
* The need for increased LAN bandwidth and microsegmentation will drive mid- to high-end hub products to support virtual LAN capabilities by 1996. (0.8 probability)
* Switching hub products will extend the life of existing Ethernet NICs, slowing the deployment of ATM to the desktop. (0.9 probability)

* The NOS will play the most prominent role in transforming today's infrastructure into a network computing environment. (0.8 probability)
* Mobile computing will introduce at least five new operating systems, and new pen-based extensions will seriously complicate development efforts.
* Security issues will be exacerbated by mobile computing, and user discipline problems will slow adoption until a catastrophic failure occurs. (0.7 probability)
* By 1995, document management will become the most important service on the LAN after messaging. (0.7 probability)
* Lotus will leverage its work-group strategy and architecture to maintain the lead Notes has established through 1994. (0.8 probability)
* The four fundamental factors driving the adoption of open systems--functionality, availability, complexity, and cost--will favorably intersect beginning in 1995. (0.7 probability)
* The widespread implementation of a single open systems environment on both UNIX and proprietary systems will not occur before 1997. (0.6 probability)
* OSF Motif and DCE will emerge quickly (by 1994) as de facto standards. (0.8 probability)
* Lotus and Apple will offer successful alternatives to a Microsoft-defined groupware environment. (0.7 probability)
* IBM will evolve MVS to try to forestall migrations to alternative mainframes, but will not be able to attract the majority of new workloads. (0.8 probability)
* MIS organizations that do not hedge the risk of vendor failure will lose their jobs. (0.9 probability)
* Moore's Law (technology capability doubles every 18 months) will hold through the middle of the next century. (Kurzweil)
* Lotus Notes is best positioned to be the object storage and replication standard across multiple object models and multiple server operating systems.
* Leading e-mail vendors will deliver products supporting the CMC interface by the end of 1993. (0.7 probability)
* The messaging services access module and the associated API will become an integral function of the leading operating systems by 1995. (0.7 probability)
* Multiple tools will be required to satisfy most organizations' critical applications requirements through 1998. (0.8 probability)

* By 1998, all leading 4GL vendors will incorporate object-oriented features. (0.9 probability)
* Monolithic environments will be required for highly secure, highly available systems through 1995. (0.8 probability)
* Despite mounting competitive threats, Intel will hold on to its dominant position in the market. (0.7 probability)
* OSF DCE will be the leading example of "open systems" standards at the desktop, yet it will not even be installed on 25 percent of all desktop systems. (0.7 probability)
* The most successful organizations will balance investments in technology with investments in reshaping the attitudes of the IS staff. (0.7 probability)
* Horizontal organization will leverage scarce skills across the organization.
* By 1996, Oracle, Sybase, and IBM will dominate the RDBMS market. (0.8 probability)
* As the NOS role in network computing shifts to a "services emphasis," NOS leaders Novell, Banyan, and Microsoft will offer vast supersets of DCE services. DCE will be positioned as a useful subset for "network plumbing." (0.85 probability)

JOHN STUCKEY, DIRECTOR OF UNIVERSITY COMPUTING AT WASHINGTON AND LEE UNIVERSITY, PROVIDED THE FOLLOWING GENERAL COMMENTS:

"I found the two most interesting sessions (of the four on the only day I attended) to be those by Rita Terdiman (Business Process Re-engineering: Hype or Hope?) and Alexander Pasik (Advanced Technologies--A Ten-Year Perspective). Terdiman managed to steer a course between naive fascination with the newest fad in organizational reformation/revitalization and dismissal of the excess hype accompanying that new fad. The principles and strategies she suggested were universal and made sense, and she usually illustrated them with examples from several environments and/or corporations. Net conclusion: BPR can be profitably applied to IS processes, with the right involvement, support, and follow-up. The most significant hurdle may be in freeing ourselves to think of using IS in new ways, rather than merely using it to streamline old processes.

Pasik spoke of the organizing principle we are moving toward, of an architecture based on an "enterprise served platform" providing location transparency, ubiquitous information access, and scalability. Various hardware and software technologies will help implement (indeed, will be essential to implementing) such an architecture, but the architecture is more significant than any of the technologies themselves."

ROBYN RENDER, ASSISTANT VICE PRESIDENT FOR ADMINISTRATIVE SERVICES AND INFORMATION TECHNOLOGIES, AND DIRECTOR OF THE CENTER FOR INFORMATION TECHNOLOGY AT THE UNIVERSITY OF CINCINNATI, OFFERED THE FOLLOWING COMMENTS:

"I attended the Gartner Symposium this year with a particular focus. I am involved in planning a retreat with our Cabinet to address IT strategic planning. This will be the first time in many years that we have been able to get dedicated time at the highest administrative levels of our institution to participate in an IT planning process. Recognizing the paradigm shift that higher education is facing, it is imperative that we re-evaluate our vision and goals and re-engineer our processes for teaching, health care, research, and public service. Therefore the Gartner sessions on IT and IS roles in business process re-engineering, IS management, and IT funding implications were important to me. I learned that the IS organization should play a leadership role in facilitating re-engineering efforts. The sessions I attended offered good advice and realistic approaches to assuming this role. I also gained insight on models for costing the migration to client/server architecture and managing the transition. Preparing and implementing a new support environment in this era of rapid change and instability was also well emphasized."

FRANK THOMAS, ASSOCIATE VICE PRESIDENT FOR INFORMATION SERVICES, DICK SEIVERT, DIRECTOR OF COMPUTER SERVICES, AND DAVE WASIK, ASSISTANT DIRECTOR FOR ADMINISTRATIVE SYSTEMS AT THE UNIVERSITY OF AKRON, OFFERED THE FOLLOWING INSIGHTS FROM SYMPOSIUM 93:

Large Mainframes

By 1996, all the mainframe vendors will make the transition to a new mainframe based on CMOS technology rather than ECL logic and bipolar memory. The architecture will be highly parallel and have the potential for many processors. Minimum CPU size will be 40 MIPS, selling for $10K to $12K per MIP, with delivery starting in late 1995. The mainframe will not go away, but will become the largest server as more graphical, voice, image, and full-motion video data are made available for multimedia applications.
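To put the projected pricing in concrete terms, here is a trivial worked example in Python. The figures simply restate the projection above; the script itself is illustrative only.

    # Illustrative arithmetic for the projected CMOS mainframe pricing:
    # $10K-$12K per MIP at a 40-MIPS minimum CPU size (figures from the
    # Gartner projection quoted above; nothing here is new data).

    MIN_MIPS = 40                      # projected minimum CPU size
    PRICE_PER_MIP = (10_000, 12_000)   # projected dollars per MIP (low, high)

    low = MIN_MIPS * PRICE_PER_MIP[0]
    high = MIN_MIPS * PRICE_PER_MIP[1]
    print(f"Entry configuration: ${low:,} to ${high:,}")
    # Entry configuration: $400,000 to $480,000

So an entry-level CMOS machine would land roughly in the $400,000 to $480,000 range.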

Client/Server

Client/server architecture provides empowered end users with easy-to-use graphical interfaces; however, cost will still be a major problem, estimated at 1.5 to 2.5 times that of the central mainframe. In spite of the added costs, the benefits will be worth the investment over the long haul. Applications should be targeted for client/server using the following scheme:

- Personal and departmental applications should store information at the LAN server or desktop.
- Mission-critical applications should be stored on an enterprise server.

The client/server architecture will not be ready to support open-systems mission-critical applications for a number of years because of two-phase commit and response times. A two-phase commit occurs when an integrated application must update data residing on separate hardware platforms in a single unit of work; if a failure occurs before the unit of work is completed, the transaction must be rolled back, and this process is what keeps the data synchronized. Proprietary solutions such as Sybase provide two-phase commit, but many open-systems environments are not willing to convert all of their data to yet another database management system.
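A minimal sketch of the two-phase commit idea may make the mechanics clearer. This is illustrative Python, not any vendor's actual protocol or API; the class and method names are invented for the example.

    # Minimal two-phase commit sketch. Each ResourceManager stands in for
    # a database on a separate hardware platform; all names are invented.

    class ResourceManager:
        def __init__(self, name):
            self.name = name
            self.staged = None

        def prepare(self, update):
            # Phase 1: stage the update and vote on whether it can commit.
            self.staged = update
            return True  # a real manager could vote "no" here

        def commit(self):
            # Phase 2a: make the staged update permanent.
            print(f"{self.name}: committed {self.staged}")

        def rollback(self):
            # Phase 2b: discard the staged update.
            print(f"{self.name}: rolled back")
            self.staged = None

    def two_phase_commit(managers, update):
        # Commit only if every participant votes yes in phase 1;
        # otherwise roll everyone back.
        if all(rm.prepare(update) for rm in managers):
            for rm in managers:
                rm.commit()
        else:
            for rm in managers:
                rm.rollback()

    two_phase_commit(
        [ResourceManager("student_db"), ResourceManager("finance_db")],
        {"student_id": 42, "tuition_paid": True})

Either both platforms apply the unit of work or neither does; that is how the data stays synchronized, and why immature open-systems support for this protocol makes mission-critical client/server work risky.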

Security

To extend security to the network environment, you should acquire a network security package that supports Kerberos technology. Kerberos security requires a dedicated server on the network; all network traffic would be routed to that server for authentication. Enterprise servers with their own proprietary security (e.g., Top Secret on MVS) must have a Kerberos exit to permit authorization. This type of software will provide a single security sign-on across the enterprise.

Kerberos is available in private-key and public-key formats. The private-key format is supported by OSF's DCE standard. It works by providing each client or microcomputer a ticket; when the microcomputer requests a service, the server asks it for the ticket and authenticates it. The public-key format is supported by Novell and Microsoft. The key used to encrypt is different from the key used to decrypt: the encryption key is published and the decryption key is private.
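As a toy illustration of the private-key ticket idea, the Python sketch below shows only the shape of the exchange. It is a drastic simplification, not the real Kerberos protocol; the shared secrets, service names, and use of an HMAC are all invented for the example.

    # Toy ticket-based authentication in the spirit of the private-key
    # (shared-secret) format described above. Real Kerberos issues
    # encrypted, time-stamped tickets; every name and key here is made up.

    import hashlib
    import hmac

    USER_SECRETS = {"alice": b"alice-secret"}  # known to the auth server
    SERVICE_KEY = b"registrar-service-secret"  # shared by auth server/service

    def issue_ticket(user):
        # The dedicated authentication server vouches for a known user by
        # signing the user's name with the service's shared secret.
        if user not in USER_SECRETS:
            return None
        tag = hmac.new(SERVICE_KEY, user.encode(), hashlib.sha256).hexdigest()
        return (user, tag)

    def service_accepts(ticket):
        # When the client requests a service, the service asks for the
        # ticket and authenticates it without ever seeing the user's secret.
        user, tag = ticket
        expected = hmac.new(SERVICE_KEY, user.encode(),
                            hashlib.sha256).hexdigest()
        return hmac.compare_digest(tag, expected)

    print(service_accepts(issue_ticket("alice")))  # True: single sign-on

In the public-key format, the service would instead verify a signature made with the user's private key against the user's published key, so no secret would need to be shared between the parties at all.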

Application Development and Management

Over the next five years, 60 percent of new applications will be built under the direction and control of user departments. Centralized IS will shift its role from application developer to facilitator and provider of infrastructure, architecture, specialty skills, technical support, training, and enterprise data management.

Applications must begin to be developed on the client/server platform. To do this, new software development tools such as PowerBuilder or Gupta must be purchased. The minimum hardware these tools will run on adequately is an Intel 486, 33 MHz processor with at least 8 MB of RAM, which means fitting users and programmers with the hardware necessary to support those applications.

Future applications development will not be successful until a methodology is established. Gartner Group recommends that a methodology be purchased, citing the experiences of many of the Fortune 500 companies that tried to develop their own methodology and failed.

By 1998, at least 80 percent of all new multi-user applications will use some form of client/server computing. By 1998, at least 80 percent of prototyping and construction tools and at least 60 percent of surviving analysis and design tools will incorporate object-oriented technology. Through 1998, IS budgets for personnel, training, and support will increase relative to overall IS budgets, as a result of tool proliferation and fragmented skill sets.

Software Asset Management

Long-term client/server software costs will represent users' biggest IS cost through 1998; client/server software costs can be minimized by focusing on desktop discounts and by using larger servers. Cost-per-unit-of-work measurements will become a key competitive platform weapon by 1996. Long-term S/390 budget savings require short-term tactics and implementation; do not commit to long-term "fixed-price" licensing arrangements in this time of chaos and change. By 1995, client/server will force maintenance and support models to unbundle, providing users with new service and upgrade paths.

Software Management

To keep its legacy applications installed base intact, IBM will be forced to drastically drop CPU and storage prices. IBM will evolve MVS to try to forestall migrations to alternative mainframes, but will not be able to attract new applications.

Transition Strategies

One method for migration is to use the following rule of thumb (sketched in code after this section):

- Applications with a high need for GUIs and low transaction volume are ideal for client/server, using client/server 4GLs or frontware. Applications with a high need for GUIs and high transaction volume should also be considered, but the technology is immature.
- Applications with a low need for GUIs should be developed as traditional rather than client/server applications. These could be form-based, high-volume data entry functions.

What client/server does best is improve the end-user interface and enable integration with desktop decision support applications.
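Here is a minimal sketch of that migration rule of thumb as a decision function. The inputs are the qualitative "high"/"low" ratings used in the text; the function name, example applications, and recommendation strings are invented for illustration.

    # Hedged sketch of the migration rule of thumb above; the qualitative
    # ratings and the wording of the advice are paraphrases, not Gartner's.

    def migration_advice(gui_need: str, tx_volume: str) -> str:
        if gui_need == "high" and tx_volume == "low":
            return "ideal for client/server (client/server 4GL or frontware)"
        if gui_need == "high" and tx_volume == "high":
            return "consider client/server, but the technology is immature"
        return "develop as a traditional application (e.g., form-based entry)"

    for app, gui, tx in [("executive information system", "high", "low"),
                         ("order entry", "high", "high"),
                         ("batch data entry", "low", "high")]:
        print(f"{app}: {migration_advice(gui, tx)}")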

By 1996, nearly all large users will offload some (up to 25 percent) of their mainframe MIPS, but few (less than 5 percent) will replace their mainframes.

Miscellaneous Comments

A successful organization should be planning a new office information strategy. A new class of network middleware will enable organizations to provide a common set of tools to support the operation of office management in the university. This strategy is referred to as workgroup computing, the buzzword for helping individuals put all their microcomputer-based tools together into an integrated environment capable of communicating with others through a homogeneous set of tools. Typically these tools include electronic mail, calendaring, document management, full-text retrieval, and electronic forms.

Software that distributes applications over the network from a central processor should also be considered; it is available today, and one package that appears to be a good fit for universities is Microsoft Hermes.

Windows NT is not a good client/server platform. Microsoft has made announcements about the Chicago and Cairo systems, and those products may be a better choice than NT.

Response time for an update in a centralized processing environment is acceptable to an end user when it is less than four seconds. In a client/server environment with integrated applications, response times are significantly longer and not acceptable to the end user.

MICHAEL ZASTROCKY, CAUSE VICE PRESIDENT FOR INFORMATION RESOURCES, PROVIDED THE FOLLOWING INSIGHTS AND COMMENTS:

"I tried to spend time attending sessions that dealt with the desktop, multimedia, advanced technologies (including voice, video, and virtual reality), and networking. Several of the sessions utilized technology to do on-the-spot surveys, and we were able to register our own responses and immediately see the break-out of those responses. One of the sessions was on Desktop Asset Management, and when we were asked what our organizations did with outdated desktop computers, the following responses were recorded:

sell to employees: 18%
sell to used equipment vendors: 21%
trade-in on new equipment: 7%
donate to charitable organizations: 31%
still have first desktop computer: 22%

The presenter suggested the following reasons for not selling to employees:

1. They may bring the machines back to the office (because they don't like the new equipment as well, bypassing new standards for equipment).
2. They may have FCC problems in the home (the machines could cause interference with a neighbor's TV).
3. It is better to sell to a middle person, who can then sell to an employee and take on the complaints and problems (but be careful about employees still bringing the machines back to the office).

The presenter also suggested that in the future, companies need to push for trade-in agreements. He mentioned that laptops have a "shorter life" than desktop machines because changes are more frequent in that environment and the increments of change are smaller. He also suggested that within two years, Personal Digital Assistants (PDAs) will be ubiquitous.

Another question asked of the audience was, "Who is responsible for asset management of PCs in your organization?"

Information services: 53%
End users: 30%
Finance: 3%
No one: 14%

In a session on client/server applications, the following were given as positive reasons to move to client/server:

1. To preserve existing host applications, you might put a GUI in front of your legacy system.
2. Downsizing (downcosting is often a more legitimate reason, where you rehost from a more expensive to a less expensive architecture).
3. New application requirements.
4. Upsizing (moving an application from the desktop to client/server to make the application available to more users).

In another session, the presenter suggested that prices for desktop machines may have "bottomed out," with multimedia now being added "free" to new systems. She suggested that processing power is moving in the same direction: the price is not going to get much lower, but the processing power will continue to increase.

Another presenter talked about the need for people to buy at least 15" monitors in order to take advantage of the graphics and multimedia applications that are now becoming available. One analyst pointed out that the labor vs. capital ratios were as follows:

Mainframes: labor 20%, capital 80%
Desktop: labor 85%, capital 15%

One caution was made involving the use of GUIs and concurrent licenses: with GUIs, people will leave windows open, and concurrent license use will go up.

In the desktop videoconferencing session, the analyst suggested that there would be significant growth in the level of services available in the next five years, and the cost would go down. However, the analyst suggested that audiographics might provide a less expensive means for higher education today. It was also suggested that this technology will replace driving time rather than flying time for business travel.

Overall, the general consensus from the analysts I listened to was that we are going to see more change in the desktop area in the next five years than in the past fifteen. Several analysts suggested that in the months ahead we will be confronted with so much confusion in the marketplace that users will often want to respond like the ostrich, and we will have to work hard as we deal with the natural resistance to change."