
EMERGING TRENDS IN TECHNOLOGY MANAGEMENT

UNIT I

NOTES

INTRODUCTION
In this era of knowledge management, with technology ever changing and new technologies emerging, it is essential to keep pace with change and with changing policies, to stay alert to opportunities for leveraging knowledge, and to understand how world-class organizations have succeeded. This unit provides insight into such issues.

Learning Objectives

To know about:
- The principles of a knowledge-leveraging community infrastructure
- Supporting technologies
- Why you should be a learning organisation
- How to create a learning organisation
- Facets of a world-class organisation
- How to build a world-class organisation

1.1 SCIENCE AND TECHNOLOGY POLICY SYSTEMS

Science and technology have profoundly influenced the course of human civilization. Science has provided us with remarkable insights into the world we live in. The scientific revolutions of the 20th century have led to many technologies, which promise to herald wholly new eras in many fields. As we stand today at the beginning of a new century, we have to ensure the fullest use of these developments for the well-being of our people. Science and technology have been an integral part of Indian civilization and culture over the past several millennia. Few are aware that India was the fountainhead of important foundational scientific developments and approaches. These cover many great scientific discoveries and technological achievements in mathematics, astronomy, architecture, chemistry, metallurgy, medicine, natural philosophy and other areas. A great deal of this knowledge traveled outwards from India. Equally, India also assimilated scientific ideas and techniques from elsewhere, with an open-mindedness and a rational attitude characteristic of a scientific ethos. India's traditions have been founded on the principles of universal harmony, respect
1 ANNA UNIVERSITY CHENNAI

DBA 1736


for all creation and an integrated holistic approach. This background is likely to provide valuable insights for future scientific advances. During the century prior to Independence, there was an awakening of modern science in India through the efforts of a number of outstanding scientists, who were responsible for great scientific advances of the highest international caliber. In the half century since Independence, India has been committed to the task of promoting the spread of science, and the key role of technology as an important element of national development is also well recognised. The Scientific Policy Resolution of 1958 and the Technology Policy Statement of 1983 enunciated the principles on which the growth of science and technology in India has been based over the past several decades. These policies have emphasized self-reliance, as also sustainable and equitable development. They embody a vision and strategy that are applicable today, and will continue to inspire our endeavors. With the encouragement and support that has been provided, there is today a sound infrastructural base for science and technology, including research laboratories, higher educational institutions and a highly skilled human resource pool. Indian capabilities in science and technology cover an impressive range of diverse disciplines, areas of competence and applications. India's strength in basic research is recognized internationally. Successes in agriculture, health care, chemicals and pharmaceuticals, nuclear energy, astronomy and astrophysics, space technology and applications, defense research, biotechnology, electronics, information technology and oceanography are widely acknowledged. Major national achievements include a very significant increase in food production, the eradication or control of several diseases, and increased life expectancy of our citizens.
While these developments have been highly satisfying, one is also aware of the dramatic changes that have taken place, and continue to do so, in the practice of science, in technology development, and in their relationships with, and impact on, society. Particularly striking is the rapidity with which science and technology are moving ahead. Science is becoming increasingly inter- and multi-disciplinary, and calls for multi-institutional and, in several cases, multi-country participation. Major experimental facilities, even in several areas of basic research, require very large material, human and intellectual resources. Science and technology have become so closely intertwined, and reinforce each other so strongly, that, to be effective, any policy needs to view them together. The continuing revolutions in the field of information and communication technology have had a profound impact on the manner and speed with which scientific information becomes available and scientific interactions take place. Science and technology have had an unprecedented impact on economic growth and social development. Knowledge has become a source of economic might and power. This has led to increased restrictions on the sharing of knowledge, to new norms of intellectual
property rights, and to global trade and technology control regimes. Scientific and technological developments today also have deep ethical, legal and social implications, and there are deep concerns in society about these. The ongoing globalisation and the intensely competitive environment have a significant impact on the production and services sectors. Because of all this, our science and technology system has to be infused with new vitality if it is to play a decisive and beneficial role in advancing the well-being of all sections of our society. The nation continues to be firm in its resolve to support science and technology in all its facets. It recognizes its central role in raising the quality of life of the people of the country, particularly of the disadvantaged sections of society, in creating wealth for all, in making India globally competitive, in utilizing natural resources in a sustainable manner, in protecting the environment and in ensuring national security.

1.2 POLICY OBJECTIVES

Recognizing the changing context of the scientific enterprise, and to meet present national needs in the new era of globalisation, Government enunciates the following objectives of its Science and Technology Policy:

To ensure that the message of science reaches every citizen of India, man and woman, young and old, so that we advance scientific temper, emerge as a progressive and enlightened society, and make it possible for all our people to participate fully in the development of science and technology and its application for human welfare. Indeed, science and technology will be fully integrated with all spheres of national activity.

To ensure food, agricultural, nutritional, environmental, water, health and energy security of the people on a sustainable basis.
To mount a direct and sustained effort on the alleviation of poverty, enhancing livelihood security, removal of hunger and malnutrition, reduction of drudgery and regional imbalances, both rural and urban, and generation of employment, by using scientific and technological capabilities along with our traditional knowledge pool. This will call for the generation and screening of all relevant technologies, their widespread dissemination through networking, and support for the vast unorganized sector of our economy.

To vigorously foster scientific research in universities and other academic, scientific and engineering institutions; to attract the brightest young persons to careers in science and technology, by conveying a sense of excitement concerning the advancing frontiers and by creating suitable employment opportunities for them; and to build and maintain centres of excellence, which will raise the level of work in selected areas to the highest international standards.

To promote the empowerment of women in all science and technology activities and ensure their full and equal participation.

To provide necessary autonomy and freedom of functioning for all academic and R&D institutions so that an ambience for truly creative work is encouraged, while ensuring at the same time that the science and technology enterprise in the country is fully committed to its social responsibilities and commitments.

To use the full potential of modern science and technology to protect, preserve, evaluate, update, add value to, and utilize the extensive knowledge acquired over the long civilizational experience of India.

To accomplish national strategic and security-related objectives, by using the latest advances in science and technology.

To encourage research and innovation in areas of relevance for the economy and society, particularly by promoting close and productive interaction between private and public institutions in science and technology. Sectors such as agriculture (particularly soil and water management, human and animal nutrition, fisheries), water, health, education, industry, energy including renewable energy, communication and transportation would be accorded highest priority. Key leverage technologies such as information technology, biotechnology and materials science and technology would be given special importance.

To substantially strengthen enabling mechanisms that relate to technology development, evaluation, absorption and upgradation from concept to utilization.

To establish an Intellectual Property Rights (IPR) regime which maximises the incentives for the generation and protection of intellectual property by all types of inventors. The regime would also provide a strong, supportive and comprehensive policy environment for speedy and effective domestic commercialisation of such inventions so as to be maximal in the public interest.
To ensure, in an era in which information is key to the development of science and technology, that all efforts are made to have high-speed access to information, both in quality and quantity, at affordable costs, and to create digitized, valid and usable content of Indian origin.

To encourage research and application for the forecasting, prevention and mitigation of natural hazards, particularly floods, cyclones, earthquakes, drought and landslides.

To promote international science and technology cooperation towards achieving the goals of national development and security, and make it a key element of our international relations.

To integrate scientific knowledge with insights from other disciplines, and ensure fullest involvement of scientists and technologists in national governance so that the spirit and methods of scientific enquiry permeate deeply into all areas of public policy making.


It is recognized that these objectives will be best realized by a dynamic and flexible Science and Technology Policy, which can readily adapt to the rapidly changing world order. This Policy reiterates India's commitment to participate as an equal and vigorous global player in generating and harnessing advances in science and technology for the benefit of all humankind.

1.3 LEVERAGING KNOWLEDGE

Knowledge evolves through knowledge-leveraging practices and the communities that embody them. Knowledge-leveraging practices and communities, where practitioners think and act together to transform information and experience into insights, and insights into products, services and competencies, enhance an organization's ability to live in change and thus to continue to deliver value in the midst of uncertainty, paradox, complexity and the unknown. Knowledge-leveraging practices and communities engage the fullness of our human ability to learn, create and change. Thus, e-Knowledge adds value to knowledge-leveraging initiatives primarily by participating as co-learner and empathic provocateur in the journey of optimizing organizational performance. Specific services include:

- assessing an organization's knowledge base (the common sense shaping its decisions and practices);
- identifying and seeding communities that upgrade and leverage knowledge strategic to business strategy and core competencies;
- creating and implementing online collaboration environments to support communities of practice, e-learning and virtual teams;
- developing database-driven solutions to complement face-to-face services as well as administrative, fundraising, marketing and evaluative functions;
- offering the full range of Internet presence services: web hosting, domain name registration, e-commerce, SSL certificates and Internet marketing.


Principles of a Knowledge Leveraging Community Infrastructure

Community implies a common interest, and it is the pursuit of this common interest that the knowledge-leveraging infrastructure must support. Whether the common interest is to deal with a situation, avoid something, maintain something or accomplish something, it serves as the basis for the purpose and vision of the community. A community, however, does not exist in isolation; it is part of a larger body or system. The system is made up of the community and those with whom the community interacts. These participants in the system may be temporary or ongoing and are defined as follows:

Community Members - Those individuals with a common interest who will benefit from employing the leveragable body of knowledge. It is expected that the community members are not entirely capable of creating the leveragable body of knowledge on their own.

External Contributors - Those individuals outside the community who possess relevant knowledge that could be leveraged by community members, and which must therefore become part of the leveragable body of knowledge. Community members must interact with external contributors to clarify and crystallize their purpose, vision and values; the needs of the community members guide the external contributors.

Facilitators - It is expected that neither the community members nor the external contributors have the capacity to manage the leveragable body of knowledge. Thus, facilitators are responsible for managing the interactions which create and maintain the leveragable body of knowledge, and for maintaining the infrastructure itself.

Figure 1 depicts the flow of interactions within the system. Note that there are no half-loops; every participant is able to interact with every other participant. The participants interact with each other and with the leveragable body of knowledge through various forms of input, and receive feedback produced by each of the other participants and by the system.
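The three roles and the full-mesh interaction flow just described can be sketched as a small data model. This is an illustrative sketch only: the role names come from the text above, while the class names, method names and example participants are hypothetical.

```python
from dataclasses import dataclass, field
from enum import Enum


class Role(Enum):
    """The three participant roles defined in the text."""
    COMMUNITY_MEMBER = "community member"          # benefits from the knowledge
    EXTERNAL_CONTRIBUTOR = "external contributor"  # supplies knowledge from outside
    FACILITATOR = "facilitator"                    # manages interactions and infrastructure


@dataclass
class Participant:
    name: str
    role: Role


@dataclass
class CommunitySystem:
    """The community plus everyone it interacts with.

    There are no half-loops: any participant may interact with any other
    participant, and all of them with the leveragable body of knowledge.
    """
    participants: list = field(default_factory=list)
    interactions: list = field(default_factory=list)  # (sender, receiver, message) records

    def interact(self, sender: Participant, receiver: Participant, message: str) -> None:
        # Full mesh: no role-based restriction on who may talk to whom.
        self.interactions.append((sender.name, receiver.name, message))


member = Participant("Asha", Role.COMMUNITY_MEMBER)
facilitator = Participant("Ravi", Role.FACILITATOR)
system = CommunitySystem(participants=[member, facilitator])
system.interact(member, facilitator, "Please clarify our shared purpose.")
```

The point of the sketch is only that the routing layer stays role-agnostic; role distinctions matter for responsibilities (who creates, who manages), not for who may interact with whom.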

Figure 1: Flow of interactions within the system

Interaction Principles

The extent to which knowledge leveraging can occur within a community depends on the nature of the interactions within the community and within the larger system. Certain principles must be at the core of these interactions. These principles relate to the following aspects of knowledge leveraging.

Geographic Distribution - Participants in the system are likely to be geographically distributed, which means face-to-face interaction will be difficult for most. The system must support multiple modes of interaction to accommodate the preferences and learning styles of individual members.

Purpose, Mission, Vision and Values - The community is not likely to begin with a clear and precise shared definition of its purpose, mission, vision and values. You might say that the community knows only that it has needs, but lacks clarity as to just what it needs. The
infrastructure must facilitate interactions between members of the community, external contributors, and facilitators to develop a clear and consistent understanding of the purpose, mission, vision, and values of the community.

Changing Participants - Community members, external contributors, and facilitators will change over time. New participants will enter and existing participants will depart. To help new participants ramp up to the current state of community evolution, the infrastructure must provide concise documentation of the agreements and decisions the community has made to date. This will allow new participants to come up to speed without impeding seasoned participants from continuing to move forward. The intent is to avoid continuous rehashing of past decisions because of issues raised by new participants who are unfamiliar with those decisions and simply don't know what they don't know.

Purpose Challenge - Once the community has established and documented its purpose, mission, vision and values, there must be a mechanism for challenging the established doctrine on a recurring basis. For the doctrine to remain valid and avoid becoming dogma, it must evolve over time.

Personal Development - To support the evolution of the body of knowledge, individual members of the community must develop personally. The infrastructure must enable individuals to assess their capacity to contribute to the effort and provide a basis for personal development. This will enable individuals to develop their capacity to support the evolution of the leveragable body of knowledge.

Roles and Responsibilities - Facilitated interaction between community members, facilitators, and external contributors serves as the basis for defining the roles and contributions of the facilitators and external contributors. These definitions also need to be documented for future reference by all participants in the infrastructure.
Feedback - Community members interacting with the leveragable body of knowledge must be able to provide feedback in several critical areas, and the feedback mechanism must be supported by the infrastructure:

Content - Feedback on accessed knowledge must be submitted for review to the appropriate individuals to act on. This is a basis for the continuing evolution of the leveragable body of knowledge.

Participants - The value of facilitator and external contributor contributions must be evaluated by community members on an ongoing basis.

Subgroups - Because subgroups of community members are expected to work in a project capacity, the community needs to provide feedback to each subgroup regarding the value of its contribution. Subgroups must also evaluate their own perceptions of the value of their contributions, as well as the level of contribution by their participating members.

Return on Investment - Facilitators and external contributors must be able to continually reflect on their perceived return on investment from supporting the infrastructure. This provides a basis for determining whether alterations are appropriate to adjust the return on investment and the facilitators' and external contributors' interactions with the system.

Support Facilities - Subgroups working together must have multiple support facilities to enhance their interactions. At present there is no known single technology that will accommodate the myriad of interactions required. The interactions of subgroups essentially represent a microcosm of the interactions of the whole system. The infrastructure must facilitate the establishment of subgroup objectives, facilitate their ongoing interactions, provide a repository for what the subgroup produces, enable the group to evaluate itself, and allow the community to evaluate the contributions of the subgroup.

Supporting Technologies

As stated, no single technology exists which will facilitate all the interactions required for a community to develop, maintain and evolve a leveragable body of knowledge. It is believed that there are sufficient technology components available which, when integrated, will produce an infrastructure that supports the community in the manner described. Because there are multiple types of interactions with differing intended contributions, it seems best to describe the technologies from the perspective of the interactions they must support. In this manner it should be possible to evaluate a technology based on its capacity to enable and deliver value to the interaction it is supposed to support. The following provides some perspectives on particular technology components and their role in the infrastructure.

The Leveragable Body of Knowledge

The leveragable body of knowledge is all the knowledge available to the community via all participants in the system.
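As an illustration of how such a knowledgebase might behave in software — answering queries where content exists, and turning unanswered questions into a content-development backlog for facilitators and external contributors — here is a minimal sketch. All identifiers are hypothetical assumptions; the text prescribes no particular implementation.

```python
class Knowledgebase:
    """Minimal sketch: stores contributed content and records unanswered questions.

    Questions with no matching content become a backlog that facilitators
    and external contributors can turn into new content later.
    """

    def __init__(self):
        self.content = {}   # topic -> {"item": ..., "contributor": ...}
        self.backlog = []   # topics asked about but not yet covered

    def contribute(self, topic, item, contributor):
        self.content[topic] = {"item": item, "contributor": contributor}

    def ask(self, topic):
        entry = self.content.get(topic)
        if entry is None:
            self.backlog.append(topic)  # the gap drives future content development
            return None
        return entry["item"]


kb = Knowledgebase()
kb.contribute("onboarding", "Read the community charter first.", "external contributor")
assert kb.ask("onboarding") == "Read the community charter first."
kb.ask("budgeting")                     # topic not yet covered
assert kb.backlog == ["budgeting"]
```

The design choice worth noting is that an unanswered query is not a failure but an input: the backlog is exactly the "basis for additional content development" that the feedback principles below call for.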
The repository for captured knowledge, the knowledgebase, must provide feedback in support of its own continued development and evolution. It must also support the following types of interactions from each of the participants within the system.

Participant Feedback - All participants interacting with the body of knowledge must be able to provide feedback regarding the perceived quality of the knowledge they access. The most important dimensions are perceived to be:

Findability - Was the user able to find what they were looking for in a timely manner? If what they were looking for didn't exist within the body of knowledge, that gap forms the basis for additional content development. If it did exist, did they find it in an acceptable time-frame?


Usability - The extent to which the knowledge could be used to serve the user's intent.

Relevance - Was the knowledge found appropriate to what the user was looking for?

Accuracy - Was the knowledge found correct, and did it solve the user's problem?

Precision - Was the knowledge found at the appropriate level of detail? Was it too general, too specific, or just right?

Content Evolution - All participants interacting with the leveragable body of knowledge must be able to provide foundations for additional content. This may be in terms of:

Questions - Questions for which appropriate answers were not found in the knowledgebase should serve as the basis for the development of additional content by the facilitators and external contributors.

Perspectives - As members of the community gain insights from employing facets of the leveragable body of knowledge, the infrastructure must provide a way for these insights to form the basis of new content for others to access.

Contributions - As members develop new learning, it must serve as the basis for new contributions to the knowledgebase.

System Feedback - All participants must receive feedback from the body of knowledge on an ongoing basis. This feedback serves as a basis for corrections to the modes and methods of interaction, as well as for the continued development of the content of the body of knowledge. Some of the most relevant components of this feedback are:

Value - Feedback must be established regarding the perceived quality or value of the body of knowledge. This feedback is based on some combination of the number and frequency of community member interactions with the body of knowledge, in conjunction with the feedback that participants have provided on the knowledge accessed. This feedback is considered valuable to community members, facilitators, and external contributors.
Knowledge Quality - Based on the comments submitted by community members, feedback should be provided to the facilitators and external contributors as to the perceived quality of the content they have developed. This feedback also provides the basis for developing new content and revising existing content. Note that, in a composite sense, this feedback serves to establish the community members' perceived value of the interactions by the facilitators and external contributors. The infrastructure should also support the community members' qualitative evaluation of facilitators and external contributors via blind survey. The idea is to balance direct and indirect feedback about the value of interactions.
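The five quality dimensions listed under Participant Feedback (findability, usability, relevance, accuracy, precision) lend themselves to a simple structured feedback record. The sketch below averages ratings per knowledge item to produce the kind of perceived-quality signal described above; the field and function names are illustrative assumptions, not taken from the source.

```python
from dataclasses import dataclass


@dataclass
class Feedback:
    """One participant's rating (1-5) of an accessed knowledge item."""
    item_id: str
    findability: int   # found in an acceptable time-frame?
    usability: int     # served the user's intent?
    relevance: int     # appropriate to what was sought?
    accuracy: int      # correct, and solved the problem?
    precision: int     # right level of detail?


def perceived_quality(feedback):
    """Average every dimension of every rating, grouped per content item."""
    scores = {}
    for f in feedback:
        scores.setdefault(f.item_id, []).extend(
            [f.findability, f.usability, f.relevance, f.accuracy, f.precision]
        )
    return {item: sum(vals) / len(vals) for item, vals in scores.items()}


ratings = [
    Feedback("faq-12", 5, 4, 5, 4, 2),  # low precision: item judged too general
    Feedback("faq-12", 4, 4, 5, 5, 2),
]
quality = perceived_quality(ratings)    # {'faq-12': 4.0}
```

In practice a facilitator would look past the single average at the per-dimension pattern — here the consistently low precision scores, not the healthy overall mean, are what signal the revision needed.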

Facilitating Distributed Interaction

It is assumed that, for the most part, the members of the community will be distributed worldwide. There may be small, co-located groups of community members, yet this will be the exception rather than the rule. In addition to being geographically distributed, individual community members are expected to have different preferences as to when and how they interact. Therefore, it is essential that the infrastructure facilitate the interaction dynamics in such a way as to accommodate the time and space differentials of community members.

Personal Profiling - We seem to interact more comfortably with individuals we think we know, and we develop this sense of knowing from various interactions with them. The system must provide a profiling facility to develop a reference background for the participants. This should include personality type (Myers-Briggs, Adizes PAEI, Human Dynamics MEP, etc.), background, desires and aspirations, and special interests. Profiles must be developed online and be readily accessible to anyone who chooses to use them as a basis for better understanding those they are interacting with.

Developmental Profiles - The foundation of the system is the common interest of the community, yet this cannot be pursued at the expense of individual aspirations. There is nothing more important to each of us than what we personally desire to accomplish. Therefore, the system must support individual development profiling in a manner which integrates individual development and community development.

Strategy and Implementation Plan

Keeping in view these broad objectives, it is essential to spell out an implementation strategy that will enable identification of specific plans, programmes and projects, with clearly defined tasks, estimates of necessary resources, and time targets. Some of the key elements of the implementation strategy are as follows:
1. Science and Technology Governance and Investments

Suitable mechanisms will be evolved by which independent inputs on science and technology policy and planning are obtained on a continuous basis from a wide cross-section of scientists and technologists. Government will utilize the academies and specialized professional bodies for this purpose. These inputs will form an integral part of the planning and implementation of all programmes relating to science and technology, as also of government decision making and the formulation of policies in socio-economic sectors. A greater integration of the programmes in socio-economic sectors with R&D activities will go a long way in ensuring a wider, more visible and tangible impact. This will call for a certain percentage of the overall allocation of each of the socio-economic ministries to be devoted to relevant programmes and activities in science and technology. The States will
also be encouraged and assisted in the use of science and technology for developmental purposes through mechanisms set up for this, and in establishing linkages with national institutions for solving their regional and locale-specific problems. A concerted strategy is necessary to infuse a new sense of dynamism into our science and technology institutions. The science departments, agencies and other academic institutions, including universities, i.e. the science and technology system as a whole, will be substantially strengthened, given full autonomy and flexibility, and de-bureaucratized. Mechanisms will be established to review on a continuous basis the academic and administrative structures and procedures in the science and technology system at all levels, so that reforms can be effected to meet the challenges of changing needs. It will be ensured that all highly science-based Ministries/Departments of Government are run by scientists and technologists. All the major socio-economic Ministries will have high-level scientific advisory mechanisms. Government will ensure the continued existence of an Apex S&T Advisory Body, which will assist in formulating and implementing various programmes and policies, and which will have appropriate representation of industry leaders, leading scientists and technologists, and the various scientific departments. Government will make the necessary budgetary commitments for higher education and science and technology. It will, through its own resources and also through contributions by industry, raise the level of investment in science and technology to at least 2% of GDP by the end of the Tenth Plan. For this, it is essential for industry to steeply increase its investment in R&D. This will enable it to be competitive, achieve greater self-reliance and self-confidence, and fulfill national goals.
2. Optimal Utilization of Existing Infrastructure and Competence

Science and technology is advancing at a very fast pace, and obsolescence of physical infrastructure, as also of skills and competence, takes place rapidly. Steps will be taken to network the existing infrastructure, investments and intellectual strengths, wherever they exist, to achieve effective and optimal utilization, and to constantly upgrade them to meet changing needs.

3. Strengthening of the Infrastructure for Science and Technology in Academic Institutions

A major initiative to modernize the infrastructure for science and engineering in academic institutions will be undertaken. It will be ensured that all middle and high schools, vocational and other colleges have appropriately sized science laboratories. Science, engineering and medical departments in academic institutions, universities and colleges will be selected
for special support to raise the standard of teaching and research. To begin with, a significant number of academic institutions, especially the universities, as also engineering and medical institutions, will be selected for this support so as to make an impact. Flexible mechanisms for the induction of new faculty in key areas of science will be developed. Constancy of support and attention will be ensured over at least a ten-year period.

4. New Funding Mechanisms for Basic Research

The setting up of more efficient funding mechanisms will be examined, either by creating new structures or by strengthening or restructuring the existing ones, for the promotion of basic research in science, medical and engineering institutions. In particular, administrative and financial procedures will be simplified to permit efficient operation of research programmes in diverse institutions across the country. Creation of world-class facilities in carefully selected and nationally relevant fields will be undertaken, to enhance our international competitiveness in areas where we have strengths, opportunities or natural advantages. Indigenous expertise will be used to the maximum extent possible. This will help in nurturing high-quality talent and expertise in experimental science and engineering.

5. Human Resource Development

The number of scientists and technologists, while large in absolute terms, is not commensurate with the requirements in quality or when measured on a per capita basis. The demand is bound to increase in the coming years with more intensive activities involving science and technology. There is a need to progressively increase the rate of generation of high-quality skilled human resources at all levels.
This process will naturally entail reversing the present flow of talent away from science, by initiating new and innovative schemes to attract and nurture young talent with an aptitude for research, and by providing assured career opportunities in academia, industry, Government or other sectors. In order to encourage quality and productivity in science and technology, mobility of scientists and technologists between industry, academic institutions and research laboratories will be ensured. For building up the human resource base in relevant areas, the agencies and departments concerned with science and technology will make available substantial funding from their allocations. Flexible mechanisms will be put in place in academic and research institutions to enable researchers to change fields and bring new inputs into traditional disciplines, and also to develop inter-disciplinary areas. There will be emphasis on a continuing process of retraining and reskilling to keep pace with the rapid advances taking place. Wherever considered necessary, training abroad will be resorted to, so as to build up a skilled base rapidly.


Women constitute almost half the population of the country. They must be provided significantly greater opportunities for higher education and for the skills needed to take up R&D as a career. For this, new procedures, and flexibility in rules and regulations, will be introduced to meet their special needs. New mechanisms would be instituted to facilitate the return of scientists and technologists of Indian origin to India, as also their networking, to contribute to Indian science and technology. Schemes for continuing education and training of university and college teachers in contemporary research techniques and in emerging areas of science will be strengthened and new innovative programmes started. It will also be ensured that higher education is available to the widest possible section of creative students, transcending social and economic barriers.
6. Technology Development, Transfer and Diffusion
A strong base of science and engineering research provides a crucial foundation for a vibrant programme of technology development. Priority will be placed on the development of technologies which address the basic needs of the population; make Indian industries, whether small, medium or large, globally competitive; make the country economically strong; and address the security concerns of the nation. Special emphasis will be placed on equity in development, so that the benefits of technological growth reach the majority of the population, particularly the disadvantaged sections, leading to an improved quality of life for every citizen of the country. These aspects require technology foresight, which involves not only forecasting and assessment of technologies but also their social, economic and environmental consequences. The growth rate in productivity of the Indian economy has been below its true potential, and the contribution of technological factors to it is inadequate.
Similarly, Indian exports today derive their comparative advantage from resources and labour rather than from the power of technological innovation. The transformation of new ideas into commercial successes is of vital importance to the nation's ability to achieve high economic growth and global competitiveness. Accordingly, special emphasis will be given not only to R&D and the technological factors of innovation, but also to the other equally important social, institutional and market factors needed for adoption, diffusion and transfer of innovation to the productive sectors. Intensive efforts will be launched to develop innovative technologies of a breakthrough nature, and to increase our share of high-tech products. Aggressive international benchmarking will be carried out. Simultaneously, efforts will be made to strengthen traditional industry so as to meet the new requirements of competition through the use of appropriate science and technology. This industry is particularly important as it provides employment

at lower per capita investment, involves low energy inputs, and carries with it unique civilizational traditions and culture. Value addition, and the creation of wealth through reassessment, redistribution and repositioning of our intellectual, capital and material resources, will be achieved through effective use of science and technology. Deriving value from technology-led exports and the export of technologies will be facilitated through new policy initiatives, incentives and legislation. This will include intensive networking of capabilities and facilities within the country. Rigorous quality standards, and accreditation of testing and calibration laboratories according to international requirements, will be given an enhanced push to enable Indian industry to avoid non-tariff barriers in global trade. A comprehensive and well-orchestrated programme relating to education, R&D and training in all aspects of technology management will be launched. To begin with, Indian Institutes of Management (IIMs), Indian Institutes of Technology (IITs) and other selected institutions will be encouraged to initiate these programmes.
7. Promotion of Innovation
Innovation will be supported in all its aspects. A comprehensive national system of innovation will be created covering science and technology as also legal, financial and other related aspects. The ways in which society and the economy perform will need to change if innovation is to fructify.
8. Industry and Scientific R&D
Every effort will be made to achieve synergy between industry and scientific research. Autonomous Technology Transfer Organizations will be created as associate organizations of universities and national laboratories to facilitate transfer of the know-how generated to industry. Increased encouragement, and flexible mechanisms, will be provided to help scientists and technologists transfer the know-how generated by them to industry and be partners in receiving the financial returns.
Industry will be encouraged to financially adopt or support educational and research institutions, fund courses of interest to them, create professional chairs, etc., to help direct S&T endeavours towards tangible industrial goals. Industry will have to increase its investments in R&D, in its own interest, to achieve global competitiveness and to remain efficient and relevant. Efforts by industry to carry out R&D, either in-house or through outsourcing, will be supported by fiscal and other measures. Innovative mechanisms will be evolved to increase these investments in R&D.
9. Indigenous Resources and Traditional Knowledge
Indigenous knowledge, based on our long and rich tradition, would be further developed and harnessed for the purpose of wealth and employment generation. Innovative systems to document, protect, evaluate and learn from India's rich heritage of traditional knowledge

of the natural resources of land, water and bio-diversity will be strengthened and enlarged. Technologies that add value to India's indigenous resources, and that provide holistic and optimal solutions suited to the Indian social, cultural and economic ethos, will be developed. A concerted plan to intensify research on traditional systems of medicine, so as to contribute to fundamental advances in health care and lead to the commercialisation of effective products, will be undertaken; appropriate norms of validation and standardization will be enforced. A purposeful programme to enhance the Indian share of the global herbal product market will be initiated.
10. Technologies for Mitigation and Management of Natural Hazards
Science and technology have an important role in any general strategy to address the problems of mitigation and management of the impacts of natural hazards. A concerted action plan to enhance predictive capabilities and preparedness for meeting emergencies arising from floods, cyclones, earthquakes, drought, landslides and avalanches will be drawn up. Measures will be undertaken to promote research on natural phenomena that lead to disasters, and on human activities that aggravate them, with a view to developing practical technological solutions for pre-disaster preparedness and for mitigation and management of post-disaster situations.
11. Generation and Management of Intellectual Property
Intellectual Property Rights (IPR) have to be viewed not as a self-contained and distinct domain, but rather as an effective policy instrument relevant to wide-ranging socio-economic, technological and political concerns. The generation and fullest protection of competitive intellectual property from Indian R&D programmes will be encouraged and promoted.
The process of globalisation is leading to situations where the collective knowledge of societies, normally used for the common good, is converted into proprietary knowledge for the commercial profit of a few. Action will be taken to protect our indigenous knowledge systems, primarily through national policies, supplemented by supportive international action. For this purpose, IPR systems which specifically protect scientific discoveries and technological innovations arising out of such traditional knowledge will be designed and effectively implemented. Our legislation with regard to Patents, Copyrights and other forms of Intellectual Property will ensure that maximum incentives are provided to individual inventors, and to our scientific and technological community, to undertake large-scale and rapid commercialization, at home and abroad. The development of skills and competence to manage IPR, and to leverage its influence, will be given a major thrust. This is an area calling for significant technological insights and legal expertise; it will be handled differently from the present, and with high priority.

12. Public Awareness of Science and Technology
There is a growing need to enhance public awareness of the importance of science and technology in everyday life, and of the directions in which science and technology are taking us. People must be able to consider the implications of emerging science and technology options in areas which impinge directly upon their lives, including the ethical and moral, legal, social and economic aspects. In recent years, advances in biotechnology and information technology have dramatically increased public interest in technology options in wide-ranging areas. Scientific work, and policies arising from it, have to be highly transparent and widely understood. Support for wide dissemination of scientific knowledge, through science museums, planetaria, botanical gardens and the like, will be enhanced. Every effort will be made to convey to the young the excitement of scientific and technological advances and to instill a scientific temper in the population at large. Special support will be provided for programmes that seek to popularize and promote science and technology in all parts of the country. Programmes will also be developed to promote learning and dissemination of science through the various national languages, to enable effective science communication at all levels. A closer interaction of those involved in the natural sciences and technology, the social sciences, humanities and other scholarly pursuits will be facilitated to bring about mutual reinforcement, added value and impact.
13. International Science and Technology Cooperation
Scientific research and technology development can benefit greatly from international cooperation and collaboration. Common goals can be effectively addressed by pooling both material and intellectual resources.
International collaborative programmes, especially those contributing directly to our scientific development and security objectives, will be encouraged between academic institutions and national laboratories in India and their counterparts in all parts of the world, including participation in mega-science projects as equal partners. Special emphasis will be placed on collaborations with other developing countries, particularly neighbouring countries, with whom India shares many common problems. International collaboration in science and technology would be fully used to further national interests as an important component of foreign policy initiatives.
14. Fiscal Measures
Innovative fiscal measures are critical to ensure successful implementation of the policy objectives. New methods are required for incentivising R&D activities, particularly in industry. New strategies have to be formulated for attracting higher levels of public and private investment in scientific and technological development. A series of both tax and non-tax fiscal instruments have to be evolved to ensure a leap-frogging process of development.

The formulation of a focused strategy and the design of new methods and instruments require inputs from economists, financial experts, management experts and scientists. For this purpose, the apex S&T advisory body will constitute a dedicated task force to suggest appropriate fiscal measures to subserve the policy objectives.
15. Monitoring
Effective, expeditious, transparent and science-based monitoring and reviewing mechanisms will be significantly strengthened, and put in place wherever not available. It will be ensured that the scientific community is involved in, and responsible for, smooth and speedy implementation.
16. The New Vision
To build a new and resurgent India that continues to maintain its strong democratic and spiritual traditions, and that remains secure not only militarily but also socially and economically, it is important to draw on the many unique civilizational qualities that define the inner strength of India; these have been intrinsically based on an integrated and holistic view of nature and of life. The Science and Technology Policy 2003 will be implemented so as to be in harmony with our world view of the larger human family all around. It will ensure that science and technology truly uplift the Indian people and indeed all of humanity.
1.4 LEARNING ORGANISATION
The Learning Organisation is a concept that is becoming an increasingly widespread philosophy in modern companies, from the largest multinationals to the smallest ventures. What is achieved by this philosophy depends considerably on one's interpretation of it and commitment to it. The quote below gives a simple definition that we felt captures the true ideology behind the Learning Organisation: "A Learning Organisation is one in which people at all levels, individually and collectively, are continually increasing their capacity to produce results they really care about."
The Definition
An organisation that learns and encourages learning among its people.
It promotes the exchange of information between employees, hence creating a more knowledgeable workforce. This produces a very flexible organisation where people will accept and adapt to new ideas and changes through a shared vision.
Background and History
The importance of learning was first put forward by the Chinese philosopher Confucius (551-479 BC). He believed that everyone should benefit from learning.


"Without learning, the wise become foolish; by learning, the foolish become wise. Learn as if you could never have enough of learning, as if you might miss something."
The underlying cause of the recent emphasis on organisational learning is the increased pace of change. Classically, work has been thought of as conservative and difficult to change. Learning was something divorced from work, and innovation was seen as the necessary but disruptive way to change. A corporation which is able to learn quickly and then innovate will be able to change its work practices to perform better in a constantly changing environment. Change is now measured in terms of months, not years as it was in the past. Business re-engineering used to concentrate on eliminating waste, not on working smarter and learning.
History
Major research into the art of learning did not actually start until the 1900s. In the 1950s, the concept of Systems Thinking was introduced but never implemented. Gould-Kreutzer Associates, Inc. defined Systems Thinking as: "A framework for seeing interrelationships rather than things; to see the forest and the trees." This means that organisations need to be aware both of the company as a whole and of the individuals within the company. Up until the introduction of this concept, companies concentrated on their own needs, not the needs of their workers. Systems Thinking tries to change the managerial view so that it includes the ambitions of the individual workers, not just the business goals. One of the systems used was called Decision Support Systems (DSS). This was for the use of corporate executives, to help them make decisions for the future. It was in fact the building of the models which defined the systems that benefited the management, rather than the operation of the systems. This was because building the model focused attention on what the business really was and on the alternatives available for the future.
One benefit of DSS was that it made implicit knowledge explicit. This makes extra knowledge available to the organisation and helps the organisation learn better, because explicit knowledge spreads faster through an organisation. In this respect DSS can be considered an additional method of communication in organisations. This systems tool was predicted to become necessary on every executive's desktop. But this never happened.
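To make the point about model-building concrete, here is a minimal sketch of the kind of model an early DSS might have encoded: a weighted-criteria score for comparing strategic alternatives. The criteria, weights and scores below are hypothetical illustrations, not taken from any real system; the value of building such a model is precisely that the implicit assumptions (which criteria matter, and how much) must be stated explicitly.

```python
def weighted_score(scores, weights):
    """Combine per-criterion scores (0-10) using normalised weights."""
    total_weight = sum(weights.values())
    return sum(scores[c] * w for c, w in weights.items()) / total_weight

# Hypothetical decision criteria and their relative importance.
weights = {"market_fit": 0.5, "cost": 0.3, "risk": 0.2}

# Hypothetical alternatives an executive might be weighing.
alternatives = {
    "expand_existing_line": {"market_fit": 7, "cost": 8, "risk": 6},
    "enter_new_market":     {"market_fit": 9, "cost": 4, "risk": 3},
}

# Rank alternatives by their weighted score, best first.
ranked = sorted(alternatives,
                key=lambda a: weighted_score(alternatives[a], weights),
                reverse=True)
print(ranked[0])
```

The output matters less than the exercise: writing down the weights forces the executives to argue about, and make explicit, what the business really values.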


In the 1970s, the same idea was renamed Organisational Learning. One of the early researchers in this field was Chris Argyris from Harvard, who published a book on the subject in 1978. Even with this published information, the concept still wasn't physically taken on by any companies. In the 1980s, companies discovered time as a new source of competitive advantage. This led to capabilities-based competition, which included the capability of learning. Many other people have continued along this line of research, such as Peter Senge, one of the modern-day gurus. Information on the topic has been passed into various companies, which are now trying to become Learning Organisations. If the changeover to a Learning Organisation happens overnight, the environment around the workers will be complex and dynamic; the resulting agitation and confusion mean that learning may not take place because of the chaos caused. So it can only be introduced into a company that is prepared to reach a balance between change and stability, i.e. a balance between the old and the new. Organisations must interact with the environment around them, so the environment must be suitable for that interaction. Becoming a Learning Organisation seems a logical step for all companies to follow, and hopefully this document will give a clear understanding of why.
Why a Learning Organisation?
A company that performs badly is easily recognisable. The signs you need to spot are:
Do your employees seem unmotivated or uninterested in their work?
Does your workforce lack the skill and knowledge to adjust to new jobs?
Do you seem to be the only one to come up with all the ideas, while your workforce simply follows orders?
Do your teams argue constantly, lack real productivity, or lack communication with each other?
When the guru is away, do things get put on hold?
Are you always the last to hear about problems, or, worse still, the first to hear about customer complaints?
And do the same problems occur over and over?


Can you spot the signs? If any of these points sound familiar, the answer for you could be a Learning Organisation.


Creating a Learning Organisation
Before a Learning Organisation can be implemented, a solid foundation can be laid by taking into account the following:
Awareness
Environment
Leadership
Empowerment
Learning

Awareness
Organisations must be aware that learning is necessary before they can develop into a Learning Organisation. This may seem a strange statement, but this learning must take place at all levels, not just the management level. Once the company has accepted the need for change, it is then responsible for creating the appropriate environment for this change to occur in.
Environment
Centralised, mechanistic structures do not create a good environment. Individuals do not have a comprehensive picture of the whole organisation and its goals. This causes political and parochial systems to be set up which stifle the learning process. Therefore a more flexible, organic structure must be formed. By organic, we mean a flatter structure which encourages innovation. The flatter structure also promotes the passing of information between workers, and so creates a more informed workforce. It is necessary for management to take on a new philosophy: to encourage openness and reflectivity, and to accept error and uncertainty. Members need to be able to question decisions without fear of reprimand. This questioning can often highlight problems at an early stage and reduce time-consuming errors. One way of overcoming this fear is to introduce anonymity, so that questions can be asked or suggestions made without the source necessarily being known.
Leadership
Leaders should foster the Systems Thinking concept and encourage learning, to help both the individual and the organisation in learning. It is the leader's responsibility to help restructure the individual views of team members. For example, they need to help the teams understand that competition is a form of learning, not a hostile act. Management must provide commitment to long-term learning in the form of resources. The amount of resources available (money, personnel and time) determines the quantity and quality of learning, so the organisation must be prepared to support this.

Empowerment
The locus of control shifts from managers to workers; this is where the term Empowerment is introduced. The workers become responsible for their actions, but the managers do not lose their involvement: they still need to encourage, enthuse and coordinate the workers. Equal participation must be allowed at all levels so that members can learn from each other simultaneously. This is unlike traditional learning, which involves a time-consuming, top-down, classroom-type structure.
Learning
Companies can learn to achieve these aims in Learning Labs. These are small-scale models of real-life settings where management teams learn how to learn together through simulation games. They need to find out what failure is like so that they can learn from their mistakes in the future. These managers are then responsible for setting up an open, flexible atmosphere in their organisations to encourage their workers to follow their learning example. Anonymity has already been mentioned, and can be achieved through electronic conferencing. This type of conferencing can also encourage different sites to communicate and share knowledge, thus making a company truly a Learning Organisation.
Implementation Strategies
Any organisation that wants to implement a learning organisation philosophy requires an overall strategy with clear, well-defined goals. Once these have been established, the tools needed to facilitate the strategy must be identified. It is clear that everyone has their own interpretation of the Learning Organisation idea, so producing an action plan that will transform groups into Learning Organisations might seem impossible. However, it is possible to identify three generic strategies that highlight possible routes to developing Learning Organisations. The specific tools required to implement any of these depend on the strategy adopted, but the initiatives that they represent are generic throughout.
The three strategies are:
Accidental
For many companies, adopting a learning organisation philosophy is the second step towards achieving this Holy Grail. They may already be taking steps to achieve their business goals that, in hindsight, fit the framework for implementing a Learning Organisation. This is the accidental approach, in that it was not initiated through awareness of the Learning Organisation concept.
Subversive
Once an organisation has discovered the Learning Organisation philosophy, it must decide how it wants to proceed. This is a choice between a subversive and a declared strategy. The subversive strategy differs from an accidental one in the level

of awareness, but it is not secretive! Thus, while not openly endorsing the Learning Organisation ideal, the organisation is able to exploit its ideas and techniques.
Declared
The other option is the declared approach, which is self-explanatory: the principles of Learning Organisations are adopted as part of the company ethos, become company-speak, and are manifest openly in all company initiatives.
The Golden Rules
An organisation which learns, and wants its people to learn, must try to follow certain concepts in learning techniques and mould itself to accommodate a number of specific attributes. In particular:
Thrive on Change
Encourage Experimentation
Communicate Success and Failure
Facilitate Learning from the Surrounding Environment
Facilitate Learning from Employees
Reward Learning
A Proper Selfishness
A Sense of Caring

Thrive on Change
"In a fast-paced, continually shifting environment, resilience to change is often the single most important factor that distinguishes those who succeed from those who fail." - Tom Peters
The crux of this idea is that to achieve a Learning Organisation, many changes must be implemented. There can be no doubt that an organisation which enters such changes without full commitment to them will not succeed. Hence it is constantly re-framing: looking at problems from different angles, or developing and exercising skills. In short, it is never static. To comply with this, the people in the organisation must continually adapt to changing circumstances. It is vital that the changing process be driven from the very top levels of the organisation: the managers must lead the changes with a positive attitude and have a clear vision of what is to be achieved. It is crucial that the management all agree to the strategy and believe in it, so that they exude a sense of security and self-assurance.


Encourage Experimentation
Every change requires a certain degree of experimentation, and allowing this experimentation is the central concept behind a Learning Organisation. Giving employees opportunities and responsibilities is a risk and can be costly in terms of resources. However, for a company to learn it is a necessary risk and, approached in a positive manner, will bring many benefits. Innovation, after all, is what sets a company apart. A Learning Organisation needs to experiment by having both formal and informal ways of asking questions, seeking out theories, testing them, and reflecting upon them. It should try to predict events and plan to avoid mistakes: be active rather than passive. One way to do this is to review competitors' work and progress and try to learn from their experiences. A Japanese strategy is to send senior executives on study visits to other countries, raising questions and gathering ideas; the visits are then reviewed and lessons drawn from them. Just like the changing process, the learning process has to start from the top of the organisation and find its way throughout. However, there is a danger in delegating the questions and theories to lower groups, as the senior executives could feel no ownership of the process and are unlikely to take risks with the conclusions. When John Harvey-Jones became chairman of ICI, he gave a lot of time and attention to creating space for the top executives to question, think and learn.
Communicate Success and Failure
It is important for a company to learn from its mistakes and also to appreciate its successes. Discussion and contribution in a team framework are vital, followed by assessment and planning. Each member should be encouraged to self-assess their own performance. This requires continuous feedback and assessment, which is easy to implement as a Learning Cycle, shown in Figure 2:


Figure 2

The learning should not just stop at the team, however. Lateral spread of knowledge throughout the company can be achieved by a number of mechanisms. Oral, written and visual presentations; site visits and tours; personnel rotation programmes; and education and training programmes will all encourage the spread of knowledge and experience, along with a reduction of the hierarchy and red tape present in many stagnant companies. To learn from one's mistakes, one must be able to accept failure, analyse the reasons for the failure and take action. Disappointment and mistakes are part of the changing process and essential to learning. A true Learning Organisation will treat mistakes as case studies for discussion, thus learning, and ensuring the same mistake does not happen again. For this to be done without blame, and with implied forgiveness, the learning has to be guided by a neutral mentor or coach. This figure may be from inside or outside the organisation, and need not necessarily possess much authority. It is often beneficial to an organisation to form a list of mentors whose services it can rely on. If this is the case, then it is a pointer to the fact that the organisation has accepted the theory behind possessing negative capability.
Facilitate Learning from the Surrounding Environment
In order to keep a leading edge over its counterparts, the learning organisation has to keep abreast of happenings in its internal and external environment. Technical and political issues which may exert pressure on the organisation's current and future operations are identified and monitored. Internal sources of information can be work teams, departments or affiliated companies/institutes within the organisation. Outside consultants, other players in the same field and even customers are potential external sources. The value-added information must then be disseminated efficiently, so that it is easily accessed by everyone within the organisation.
One suggestion that stands out in the forthcoming age of the information highway is putting the computer database online with employee-only access. Joint ventures provide precious opportunities for actively observing how others' systems are run. In such cases, learning objectives should be clearly stated in the contractual agreements between the allies to avoid any future misunderstandings. Accusations of corporate spying are a serious matter, hence everything should be brought out into the open right from the start and nothing should be done on the sly. Customers represent the best research and development source, as they know exactly what they, and the market in general, want. Moreover, this invaluable resource is free! Hence, it is worthwhile to try to involve customers in product/service design.
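The "employee-only" knowledge database suggested above can be sketched in a few lines. This is a hypothetical illustration, not a real product or API: the class, method names and sample data are all assumptions, and a production system would sit behind real authentication rather than an in-memory set of IDs. The point it shows is simply that every read and write is gated on membership.

```python
class KnowledgeBase:
    """Toy internal knowledge base with employee-only access."""

    def __init__(self):
        self._employees = set()   # IDs allowed to read and write
        self._entries = {}        # topic -> list of entry texts

    def register_employee(self, employee_id):
        self._employees.add(employee_id)

    def add_entry(self, author_id, topic, text):
        # Only registered employees may contribute knowledge.
        if author_id not in self._employees:
            raise PermissionError("only employees may contribute")
        self._entries.setdefault(topic, []).append(text)

    def lookup(self, requester_id, topic):
        # Reads are also restricted to employees.
        if requester_id not in self._employees:
            raise PermissionError("employee-only access")
        return self._entries.get(topic, [])

kb = KnowledgeBase()
kb.register_employee("emp-001")
kb.add_entry("emp-001", "supplier-audit", "Checklist used in the 2003 audit")
print(kb.lookup("emp-001", "supplier-audit"))
```

An outsider ID would raise `PermissionError` on the same `lookup` call, which is the whole design: knowledge spreads freely inside the boundary and not past it.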


Facilitate Learning from Employees
"Some of the most effective consultants your organisation could ever hire are already working for you." - Jim Clemmer (Firing on All Cylinders)
"Employees themselves, more often than not, know what needs to be done to improve operations." - Rosabeth Moss Kanter (The Change Masters)
The above quotes are very true; however, it could also be said that in the past a company's employees were its most under-rated and under-used consultants. The importance of this point cannot be overemphasised. The financial implications of learning from within are an obvious long-term bonus. It is estimated that only 20% of an employee's skills are utilised. This inefficiency can be overcome by training and multi-skilling.
Reward Learning
"A learning culture rewards breakthroughs and initiative." - Al Flood (The Learning Organisation)
The performance appraisal is meant to reflect the organisation's commitment to creating a learning culture, that is, to promote the acquisition of new skills, teamwork as well as individual effort, openness and objectivity, and continuous personal development. The fragile human ego yearns for acknowledgement of one's work from superiors and fellow colleagues, in some form of reward or, simply, feedback. Everyone wants to feel that he or she is doing a real job and actively contributing to the proper functioning of the organisation. Caution should be taken when defining benchmarks for performance appraisal: no self-conscious member of the organisation should be left feeling neglected. When individuals lose confidence or give up hope, the learning organisation has failed. Therefore, the effort put in and the learning gained throughout the process should be recognised, as well as the end result. In addition, considerations taken in the performance appraisal should be incorporated into the criteria for hiring new employees and promoting current staff. Annual performance reviews for pay raises and promotion serve well for long-term feedback and reward.
However, it is also very important to have feedback and reward on a short term basis such as having ones mistake pointed out on-the-spot, and receiving appreciation and recognition there and then. Sometimes, being able to witness the overall accomplishment of ones work is self rewarding.


A Proper Selfishness

If the Learning Organisation is "properly selfish", it is clear about its role, its goals and its future, and is determined to reach them. This may sound extremely obvious, but does "to make profits" really suffice? Rather, it should be asking: What are the strengths, talents and weaknesses of the organisation? What sort of organisation does it want to be? What does it want to be known for? How will its success be measured, and by whom? How does it plan to achieve it?

The answers for most organisations must start with the customer or client: who are they? What do they really want and need? This is the essence of the phrase "a proper selfishness": it is right that the organisation think of itself in the ways outlined above, but it must remember why it is there. It exists for the sole purpose of serving customers and clients (otherwise how could it exist?). If an organisation neglects this fact, it is exhibiting "improper selfishness", and is ultimately set for failure.

A Sense of Caring

Learning Organisations want everyone to learn, and they go to great effort to make that possible. Apart from the points developed above, there are other initiatives:
- Tuition reimbursement schemes (as found in many American companies)
- Opportunities to sit in on higher-level management meetings (as in Japan)
- Projects to encourage personal development
- Horizontal careers to open up new possibilities
- Brainstorming parties around new problems
- Rewards tied to output, not to status; to performance, not age
- Public encouragement of questions at all levels
- The encouragement of initiative
- Constant celebration of achievement

These points can all be summed up in one phrase: care for the individual. People do not take risks with those they do not trust or genuinely care for. It follows that organisations which possess a friendly and trustworthy working environment are more likely to succeed in today's climate of change, when calculated risk-taking is part of getting ahead of the field.


People Behaviour

Behaviour to Encourage

There are five disciplines (as described by Peter Senge) which are essential to a learning organisation and should be encouraged at all times. These are:
- Team Learning
- Shared Visions
- Mental Models
- Personal Mastery
- Systems Thinking


Team Learning

Virtually all important decisions occur in groups. Teams, not individuals, are the fundamental learning units. Unless a team can learn, the organisation cannot learn. Team learning focuses on the learning ability of the group. Adults learn best from each other, by reflecting on how they are addressing problems, questioning assumptions, and receiving feedback from their team and from their results. With team learning, the learning ability of the group becomes greater than the learning ability of any individual in the group.

Shared Visions

To create a shared vision, large numbers of people within the organisation must draft it, empowering them to create a single image of the future. All members of the organisation must understand, share and contribute to the vision for it to become reality. With a shared vision, people will do things because they want to, not because they have to.

Mental Models

Each individual has an internal image of the world, with deeply ingrained assumptions. Individuals will act according to the mental model that they subconsciously hold, not according to the theories they claim to believe. If team members can constructively challenge each other's ideas and assumptions, they can begin to perceive their mental models, and to change them to create a shared mental model for the team. This is important because an individual's mental model controls what they think can or cannot be done.

Personal Mastery

Personal mastery is the process of continually clarifying and deepening an individual's personal vision. This is a matter of personal choice for the individual and involves continually assessing the gap between their current and desired proficiencies in an objective manner, and practising and refining skills until they are internalised. This develops self-esteem and creates the confidence to tackle new challenges.

The Fifth Discipline - Systems Thinking

The cornerstone of any learning organisation is the fifth discipline: systems thinking. This is the ability to see the bigger picture, to look at the interrelationships of a system as opposed to simple cause-effect chains, allowing continuous processes to be studied rather than single snapshots. The fifth discipline shows us that the essential properties of a system are determined not by the sum of its parts but by the process of interactions between those parts. This is the reason systems thinking is fundamental to any learning organisation: it is the discipline used to implement the other disciplines. Without systems thinking each of the disciplines would be isolated and therefore not achieve its objective. The fifth discipline integrates them to form the whole system, a system whose properties exceed the sum of its parts. However, the converse is also true: systems thinking cannot be achieved without the other core disciplines of personal mastery, team learning, mental models and shared vision. All of these disciplines are needed to successfully implement systems thinking, again illustrating the principle of the fifth discipline: systems should be viewed as interrelationships rather than isolated parts.

The Laws of the Fifth Discipline
- Today's problems come from yesterday's solutions. Solutions shift problems from one part of a system to another.
- The harder you push, the harder the system pushes back. Compensating feedback: well-intentioned interventions which eventually make matters worse.
- Behaviour grows better before it grows worse. The short-term benefits of compensating feedback are seen before the long-term disbenefits.
- The easy way out usually leads back in. Familiar solutions which are easy to implement usually do not solve the problem.
- The cure can be worse than the disease. Familiar solutions can be not only ineffective; sometimes they are addictive and dangerous.
- Faster is slower. The optimal rate of growth is much slower than the fastest growth possible.
- Cause and effect are not closely related in time and space. The area of a system which is generating the problems is usually distant from the area showing the symptoms.
- Small changes can produce big results, but the areas of highest leverage are often the least obvious. Problems can be solved by making small changes to an apparently unrelated part of the system.


- You can have your cake and eat it too - but not at once. Problems viewed from a systems point of view, as opposed to a single snapshot, can turn out not to be problems at all.
- Dividing an elephant in half does not produce two small elephants. A system's properties depend on the whole.
- There is no blame. The individual and the cause of their problems are part of a single system.

Behaviour to Discourage

An organisation which is not a learning one also displays characteristic behaviours; however, these should definitely not be encouraged. Rosabeth Moss Kanter studied a range of large American corporations and came up with "rules for stifling initiative":
- Regard any new idea from below with suspicion, because it is new and because it is from below.
- Express criticisms freely and withhold praise (that keeps people on their toes). Let them know they can be fired at any time.
- Treat problems as a sign of failure.
- Make decisions to reorganise or change policies in secret and spring them on people unexpectedly (that also keeps people on their toes).
- Above all, never forget that you, the higher-ups, already know everything important about the business.

These rules are expanded in her book The Change Masters. The Learning Organisation needs to break every one of these rules frequently.

Why Learning Organisations Work

The People Develop

A Learning Organisation encourages its members to improve their personal skills and qualities, so that they can learn and develop. They benefit from their own and other people's experience, whether it be positive or negative.

Greater motivation

People are appreciated for their own skills, values and work. All opinions are treated equally and with respect. By being aware of their role and importance in the whole organisation, workers are more motivated to add their bit. This encourages creativity and free thinking, leading to novel solutions to problems. All in all, there is an increase in job satisfaction.

The workforce is more flexible

People learn skills and acquire knowledge beyond their specific job requirements. This enables them to appreciate or perform other roles and tasks. Flexibility allows workers to move freely within the organisation, whilst at the same time it removes the barriers associated with a rigidly structured company. It also ensures that any individual will be able to cope with a rapidly changing environment, such as those that exist in modern times.

People are more creative

There are more opportunities to be creative in a learning organisation. There is also room for trying out new ideas without having to worry about mistakes. Employees' creative contributions are recognised and new ideas are free to flourish.

Improved social interaction

Learning requires social interaction and interpersonal communication skills. An organisation based on learning will ensure members become better at these activities. Teams will work better as a result.

Teams and Groups Work Better

Learning Organisations provide the perfect environment for high-performing teams to learn, grow and develop. In turn, these teams will perform efficiently for the organisation and produce positive results.

Knowledge sharing - Openness Creates Trust

A team is composed of highly specialised members who cannot, and are not expected to, know everything about a job. The sharing of common knowledge is therefore quite important for the completion of a job. Within learning organisations in general, and teams in particular, information and knowledge flow around more freely. This makes for higher productivity within and between teams as they build on each other's strengths. Trust between team members increases, and hence they value each other's opinions more.

Interdependency

In any organisation people depend on each other for the completion of their jobs. Learning Organisations will increase this awareness and improve relations between people at a personal level. By knowing more about other people's roles, needs and tasks, members can manage their time better and plan their work more efficiently. This dependency is decreased as learning is enhanced, letting people get on with their own job better as they rely less on others.


The Company Benefits

An active learning organisation will have at its heart the concept of continuous learning. Therefore it will always be improving its techniques, methods and technology.

Breakdown of traditional communication barriers

The old hierarchical communication barrier between manager and worker has devolved into more of a coach and team-member scenario. Leaders support the team, not dictate to it. The team appreciates this, which in turn helps keep them highly motivated. All workers have an increased awareness of the company's status and of all that goes on in other departments. Communication between and across all layers of the company gives a sense of coherence, making each individual a vital part of the whole system. Workers perform better as they feel more a part of the company; they are not just pawns in a game.

Customer relations

A company's first priority is its customers' needs. A Learning Organisation cuts the excess bureaucracy normally involved in customer relations, allowing greater contact between the two. If the customers' requirements change, learning organisations can adapt faster and cope more efficiently with this change.

Information resources

Over time a company builds up a pool of learning, in the form of libraries and human expertise. This pool of knowledge within learning organisations is larger than average. New problems and challenges can be met faster using this increased resource.

Innovation and creativity

As more people at every level of a company engage in continual learning, a valid contribution can come from any member of the company, and from any part of the company. Being innovative and creative is the responsibility of the whole workforce and allows learning organisations to adapt efficiently to changes in the state of the market, technology and competition. Moreover, this creativity gives rise to increased synergy: the interaction between high-performing teams produces a result which is higher than was planned or expected of them.

Facets of a World Class Organisation

Attitude

Our attitude with respect to our work and environment, and our attitude toward our built-in individual biases. How do we perceive our organization and its relevance to the ISU mission?


To develop attitudes for a world-class culture:
- Be Knowledgeable
- Share
- Network
- Collaborate
- Improve Products
- Be Flexible
- Be Innovative
- Change Your Orientation
- Keep the Proper Balance

Process

The processes within our individual unit function. Seek methodologies to adapt to meet changes, to speed delivery of our services, and to meet future challenges. Conduct internal benchmarking and transfer skill and knowledge to our people. Internal benchmarking is the process of identifying the best practices developed within an organization and creating a business case for their implementation. Transferring skill and knowledge means the process of identifying, demonstrating, and transferring a successfully demonstrated process or practice to other units.

Tools

What tools help us to do our jobs better, faster, and easier within budget and on time (quality, speed, cost, and delivery)? Seek both low-tech and high-tech solutions. Embrace technology and adopt new skills through training and continuous improvement.

World Class Organisations

To achieve world-class status, an organization must stimulate creative thinking, encourage dialogue and introspection, and promote understanding and new actions. Most important, it must give people - inside and outside the organization - something to care about. When people think of world-class organizations, chances are widely admired companies such as General Electric, Microsoft, British Airways, Hewlett-Packard, Coca-Cola and Disney spring to mind. Yet what elevates these and other companies from merely successful to the more desired status of world-class? A closer look at the best of the best reveals several shared characteristics. Besides being the premier organizations in their industries, world-class companies have talented
people, the latest technology, the best products and services, consistently high quality, a high stock price, and a truckload of awards and accolades acknowledging their greatness. Dig deeper and you'll also find that communication is practised as a strategic process within these companies, woven into their business planning, decision-making and organization-wide priorities. It defines their cultures by encouraging dialogue, feedback, interpretation and understanding.

The Secret Behind World Class

Something else also distinguishes world-class companies from all the others. World-class companies give people - their customers, employees, suppliers, even the people in the communities in which they operate - something to care about. While it may sound simple, a closer look at some of the world's most respected and most successful companies indicates it's true. Look at Disney, for example. Beginning with CEO Michael Eisner, everyone at Disney gives people a reason to care about the company, because everyone there takes great pains to make their guests believe in make-believe. All new hires at Disney experience a multi-step training program where they quickly learn the language: employees are "cast members", customers are "guests", a crowd is an "audience", a work shift is a "performance", a job is a "part", a job description is a "script", a uniform is a "costume", the personnel department is "casting", being on duty is being "on stage", and being off duty is "backstage". The special language, along with complete immersion in the company's history and mythology, reinforces the Disney frame of mind, starting with its new employees. All this acts to strengthen the sense of purpose and cult-like unity, ultimately intensifying the underlying ideology: to make people happy. These things, including unity of purpose and preservation of image and ideology, work together to make Disney world-class.
Building a World Class Organisation

Building Block One: Loyal Customers

Becoming a world-class organization starts by determining what kind of experience you want your customers to have. How would you like customers to be treated as they interact with every part of your company? Now compare your vision of ideal customer service to what is currently happening in your organization. Look and see where things don't quite match up. If your organization is like most, it will have quite a few bumps and warts that you will want to address. You will also want to consider your company's consistency factor. Delivering your product or service properly time after time, without fail, is the foundation of creating loyal customers. Consistency is critical because consistency creates credibility. Consistency is the key to creating loyal customers.


The worst thing you can do is meet expectations one time, fall short another, and exceed them every now and then. That will not only drive your customers crazy, but also send them running into the hands of your competition the first chance they get.

Building Block Two: Engaged Employees

It takes excited and passionate people to create a customer-focused culture. You cannot treat your people poorly and expect them to take good care of your customers. Empowering your people and permitting them to act as owners is essential. There are three things you can do to get, and keep, people excited about their jobs:
- Worthwhile work: People want to know that the work they are doing is important and makes a difference. Make sure that everyone understands the significance of their particular role in achieving the company's overall vision. Remind people of their significance on a regular basis.
- In control of achieving the goal: Let people have a say in how they do their jobs. It is increasingly important to place the responsibility for decision-making directly on employees themselves. The good news is that employees are more motivated when they know they are being counted on to use their own judgment, versus simply carrying out policies that allow for little, if any, individual discretion.
- Cheer each other on: Everyone loves to be recognized for a job well done. Create a collaborative climate where milestones and other measures of improvement are celebrated and people feel acknowledged. Reward and recognition focused on catching people doing things right is one of the best ways to positively reinforce a motivating work environment.

Building Block Three: Great Managers

Great managers hold everything together. They know that leadership is not about them; it is about serving the vision and the people who make it come alive. Great managers realize that they cannot do it all themselves, so they empower people to make decisions and then support them all the way. Today's manager must be a coach, facilitator and cheerleader for the employees they support. To be world-class, make sure your managers support their direct reports in four ways:
- Provide access to information and training that gives people the right start and helps them to grow.
- Let employees know why what they do is important to your company and how it provides value to your customers.
- Use performance management as a way to give people the direction and support they need, when they need it, so employees can accomplish their goals and the organization can succeed.
- Provide recognition frequently and celebrate performance over time. Catch direct reports doing things right, instead of wrong, to keep them inspired and focused on what's important.
- Keep employees growing through ongoing career planning. Give direct reports the opportunity to develop skills that will make them more valuable employees.

The process of building a world-class organization takes time, energy, and commitment. It all starts by developing a shared vision that is focused on the customer and the experience you want your customers to have when they interact with your company. It continues with engaged employees who are passionate about delivering that experience and who understand how their role fits into the overall picture. Finally, it is held together by great managers and leaders who recognize that their goal is to have the right people, in the right roles, fully engaged and growing, if their organization is going to succeed in the long term. At The Ken Blanchard Companies we don't think organizations set a goal for themselves to be "no worse than the competition". Instead, we think that people and organizations want to be world-class. It's the job of leaders to bring out that magnificence in people and create the environment in which employees feel safe, supported, and ready to do the best job possible in accomplishing key goals on behalf of their organization.

1.5 DUAL-USE TECHNOLOGY

Dual-use is a term often used in politics and diplomacy to refer to technology which can be used for both peaceful and military aims. It usually refers to the proliferation of nuclear weapons, but the proliferation of bioweapons is a growing concern. Many types of nuclear reactors produce fissile material, such as plutonium, as a by-product, which could be used in the development of a nuclear weapon. However, nuclear reactors can also be used for peaceful, civilian purposes: providing electricity to a city, for example.
As such, a nation which wanted to develop a nuclear weapon could build a reactor, claiming it would be used for civilian purposes, and then use its plutonium to build a nuclear weapon. During the Cold War, the United States and the Soviet Union spent billions of dollars developing rocket technology which could carry humans into space (and eventually to the moon). The knowledge gained from this peaceful rocket technology also served in the development of intercontinental ballistic missile technology. The International Atomic Energy Agency attempts to monitor dual-use technology in countries that are signatories of the Nuclear Non-Proliferation Treaty, to make sure that fissile material is not diverted to military functions. In recent events, both Iran and North Korea have been accused of having nuclear weapons programs based on dual-use technology.

Lax biosecurity at laboratories worries researchers and regulators that potential select agents may fall into the hands of malevolent parties. It may have been instrumental in the 2001 anthrax attacks in the United States, and unintentional SARS virus leaks led to lethal outbreaks in China, Taiwan and Singapore over 2003 and 2004. Universities may flout regulations, complacent about the dangers of doing so. Though the majority of breaches are benign, the hybridization of hepatitis C and dengue-fever viruses at Imperial College London in 1997 resulted in a fine when health and safety rules were not observed. A research program at Texas A&M was shut down when Brucella and Coxiella infections were not reported. That the July 2007 terrorist attacks in central London and at Glasgow airport may have involved medical professionals was a recent wake-up call that screening people with access to pathogens may be necessary. The challenge remains to maintain security without impairing the contributions to progress afforded by research. Most industrial countries have export controls on certain types of designated dual-use technologies, and such controls are required by a number of treaties as well. These controls restrict the export of certain commodities and technologies without the permission of the government. The principal agency for dual-use export controls in the United States is the Department of Commerce, Bureau of Industry and Security. More generally speaking, dual-use can also refer to any technology which can satisfy more than one goal at any given time. Thus, expensive technologies which would otherwise serve only military purposes can also be utilized to benefit civilian commercial interests when not otherwise engaged, such as the Global Positioning System.

1.6 ROAD MAP TO TECHNICAL PLANNING

Overview of the Roadmapping Process

Figure 3: The Technology Roadmapping phases.



The Technology Roadmapping process consists of 3 phases (see Figure 3): preliminary activities, the development of the roadmap, and follow-up activities. Because the process is too big for one model, the phases are modelled separately. Only the first two phases are considered here. In the models no distinct roles are defined, because everything is done by the participants as a group.

Phase 1: Preliminary phase

The first phase, the preliminary phase, consists of 3 steps: satisfy essential conditions, provide leadership/sponsorship, and define the scope and boundaries for the technology roadmap. In this phase the key decision makers must identify that they have a problem and that technology roadmapping can help them solve it.

Satisfy essential conditions

In this step it must become clear what the conditions are (they have to be identified) and, if they are not met, that somebody will take the actions necessary to meet the unmet conditions. These conditions include, for example, the following: there must be a need for the technology roadmap; there must be input and participation from several different parts of the organization (e.g. marketing, R&D, the strategic business units) with different planning horizons and different perspectives; and the process should be needs-driven. All the conditions should be satisfied (or someone must be going to take the actions necessary) in order to continue to the next step. The participants can have zero or more conditions of their own. All conditions share the attribute of being met or not.

Provide leadership / sponsorship

Committed leadership is needed because time and effort are involved in creating the technology roadmap. Additionally, the leadership should come from one of the participants: one of them provides leadership/sponsorship. This means that the line organization must drive the process and use the roadmap to make resource allocation decisions.
Define the scope and boundaries for the technology roadmap

In this step the context for the roadmap is specified. In the company a vision should exist, and it must be clear that the roadmap can support that vision. If the vision does not exist, one should be developed and clearly stated. When that is done, the boundaries and the scope of the roadmap should be specified. Furthermore, the planning horizon and the level of detail should be set. The scope can be further divided into the technology scope and the participation scope.
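The "met or not" attribute of the essential conditions described above can be sketched as a small data model. This is only an illustrative sketch: the names (Condition, preliminary_phase_ready) are invented for this example and are not part of any roadmapping standard or tool.

```python
from dataclasses import dataclass

@dataclass
class Condition:
    """An essential condition; every condition carries a met-or-not attribute."""
    description: str
    is_met: bool = False

def preliminary_phase_ready(conditions):
    """Phase 1 gate: all essential conditions must be met before
    moving on to roadmap development."""
    for c in conditions:
        if not c.is_met:
            # flag the unmet condition so somebody can take action on it
            print(f"Action required: {c.description}")
    return all(c.is_met for c in conditions)

# Hypothetical example conditions, taken from the ones named in the text
conds = [
    Condition("A need for the technology roadmap exists", is_met=True),
    Condition("Input and participation from marketing, R&D and the SBUs"),
    Condition("The process is needs-driven", is_met=True),
]
print(preliminary_phase_ready(conds))  # → False (one condition is unmet)
```

The sketch simply mirrors the rule stated above: the group either satisfies every condition or assigns someone to act on each unmet one before the development phase begins.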


In Table 1 all the different sub-activities of the preliminary activity phase can be seen. All the sub-activities have concepts as end products; these concepts form the actual meta-data model, which is an adjusted class diagram.

Table 1: Activity table for the preliminary activity phase.

Activity: Satisfy essential conditions
- Identify essential conditions: When all the participants come together, essential conditions - such as which groups should be involved and who the key customers and key suppliers are - can be identified.
- Take action to satisfy conditions: For technology roadmapping to succeed, conditions from the participants must be satisfied.

Activity: Provide leadership / sponsorship
- Leadership/sponsorship should be taken by the line organization; they must drive the roadmapping process and use the roadmap to make resource allocation decisions.

Activity: Define the scope and boundaries for the technology roadmap
- Develop vision: If no vision exists, one is developed and stated clearly.
- Clearly state vision: The already existing vision has to be clearly stated.
- Define scope: The scope of the project can further define the set of needs, the planning horizon and the level of detail. The scope can be further divided into the technology scope and the participation scope.
- Define boundaries: The boundaries should also be included.


Phase 2: Development phase

The second phase, the development of the technology roadmap, consists of 7 steps: identify the product that will be the focus of the roadmap; identify the critical system requirements and their targets; specify the major technology areas; specify the technology drivers and their targets; identify technology alternatives and their timelines; recommend the technology alternatives that should be pursued; and create the technology roadmap report. These steps create the actual roadmap.

Identify the product that will be the focus of the roadmap

In this step the common product needs are identified and should be agreed on by all the participants. This is important to gain the acceptance of all groups for the process. In case of uncertainty about the product needs, scenario-based planning can be used to determine the common product needs. The participants, and possibly scenario-based planning, provide the common product needs.

Identify the critical system requirements and their targets

Once it is decided what needs to be roadmapped, the critical system requirements can be identified; they provide the overall framework for the technology roadmap. The requirements can have targets (as an attribute in Figure 3) such as reliability and costs.

Specify the major technology areas

These are the areas which can help achieve the critical system requirements. For each technology area several technologies can be found. Example technology areas are: market assessment, crosscutting technology, component development and system development.

Specify the technology drivers and their targets

In this step the critical system requirements from the step "Identify the critical system requirements and their targets" are transformed into technology drivers (with targets) for the specific technology area. These drivers are the critical variables that will determine which technology alternatives are selected. The drivers depend on the technology areas, but they relate to how the technology addresses the critical system requirements.

Identify technology alternatives and their timelines

At this point the technology drivers and their targets are specified, and the technology alternatives that can satisfy those targets should be specified. For each of the alternatives, a timeline should be estimated for how it will mature with respect to the technology driver targets.
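The interplay of driver targets, alternatives and their timelines can be sketched in code. Every name and number below is hypothetical, invented purely to illustrate the selection logic; the drivers, alternatives and costs in a real roadmap come from the participants.

```python
# One technology driver with its target (hypothetical: lower is better here;
# a real roadmap defines the direction of improvement per driver).
drivers = {"switching_speed_ns": 5.0}

# Each alternative carries the driver value it is expected to reach, an
# estimated timeline (years to maturity), and a rough cost figure.
alternatives = [
    {"name": "Tech A", "switching_speed_ns": 4.0, "years_to_mature": 3, "cost": 10},
    {"name": "Tech B", "switching_speed_ns": 6.0, "years_to_mature": 1, "cost": 5},
    {"name": "Tech C", "switching_speed_ns": 3.0, "years_to_mature": 5, "cost": 20},
]

def meets_targets(alt, drivers):
    """An alternative is viable only if it satisfies every driver target."""
    return all(alt[d] <= target for d, target in drivers.items())

# Recommend: among the viable alternatives, trade off maturity time
# against cost (here: earliest to mature first, then cheapest).
viable = [a for a in alternatives if meets_targets(a, drivers)]
recommended = sorted(viable, key=lambda a: (a["years_to_mature"], a["cost"]))
print([a["name"] for a in recommended])  # → ['Tech A', 'Tech C']
```

Tech B is dropped because it misses the driver target; the remaining alternatives are ranked by the trade-off rule. In practice the trade-offs between targets, performance and costs described in the text are made by the participants, not by a fixed formula.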


Time
This factor can be adapted to suit the particular situation. The time horizons for e-commerce and software-related sectors are usually short. Other distinctions can be made on scale and intervals.

Recommend the technology alternatives that should be pursued
Because the alternatives may differ in cost, timeline and so on, a selection has to be made among them. These will be the alternatives to be pursued in figure 3. In this step many trade-offs have to be made between different alternatives for different targets: performance against cost, and even target against target.

Create the technology roadmap report
At this point the technology roadmap is finished. The technology roadmap report consists of five parts: the identification and description of each technology area, critical factors in the roadmap, unaddressed areas, implementation recommendations and technical recommendations. The report can also include additional information. Table 2 lists the different sub-activities of the development phase.

Table 2 Activity table for the Development phase.
Activity: Identify the product that will be the focus of the roadmap
- Identify needs: This critical step is to get the participants to identify and agree on the common product needs. This is important to get their buy-in and acceptance.
- Use scenario-based planning: If there is major uncertainty about the common product needs, scenario-based planning can be used. Each scenario must be reasonable, internally consistent and comparable with the other scenarios.
- State needs: These are the needs for the product.

Activity: Identify the critical system requirements and their targets
- Define critical system requirements: The critical system requirements provide the overall framework for the roadmap and are the high-level dimensions to which the technologies relate. They include things like reliability and costs.
- Define targets: For each of the system requirements, targets have to be defined.

Activity: Specify the major technology areas
- Transform requirements into technology-oriented drivers: The major technology areas should be specified to help achieve the critical system requirements for the product. The critical system requirements are then transformed into technology drivers for the specific technology areas.

Activity: Specify the technology drivers and their targets
- Select technology alternatives with their targets: Technology drivers and their targets are set based on the critical system requirement targets. They specify how viable technology alternatives must perform by a certain date. From the available technology alternatives a selection has to be made.

Activity: Identify technology alternatives and their timelines
- Identify alternatives and their timelines: The technology alternatives that can satisfy the targets must be identified, and the timeline for each alternative has to be estimated.

Activity: Recommend the technology alternatives that should be pursued
- Select the subset of technology alternatives to be pursued: Determine which technology alternative to pursue and when to shift to a different technology. Consolidate the best information and develop consensus among many experts.

Activity: Create the technology roadmap report
- Create the report: Here the actual technology roadmap report is created. The report includes: identification and description of each technology area, critical factors, unaddressed areas, implementation recommendations and technical recommendations.
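The trade-offs made when recommending which technology alternatives to pursue (performance against cost, and timeline against both) are often formalized as a simple weighted score. The sketch below is a hypothetical illustration, not a prescribed method; the attribute names and weights are invented for this example.

```python
# Hypothetical weighted-scoring sketch for selecting among technology
# alternatives; attribute names and weights are illustrative only.

def score(alternative, weights):
    """Higher performance is better; higher cost and a later
    target date are penalized."""
    return (weights["performance"] * alternative["performance"]
            - weights["cost"] * alternative["cost"]
            - weights["time"] * alternative["years_to_target"])

def recommend(alternatives, weights):
    """Return the alternative with the best trade-off score."""
    return max(alternatives, key=lambda a: score(a, weights))

candidates = [
    {"name": "A", "performance": 8.0, "cost": 5.0, "years_to_target": 2},
    {"name": "B", "performance": 9.5, "cost": 9.0, "years_to_target": 4},
]
weights = {"performance": 1.0, "cost": 0.5, "time": 0.5}
best = recommend(candidates, weights)
```

Changing the weights changes the recommendation, which is exactly the "target over target" trade-off described above: the weights encode which driver targets the group values most.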


Phase 3: Follow-up activity phase
This is the moment when the roadmap must be critiqued, validated and, hopefully, accepted by the group that will be involved in any implementation. For this, a plan needs to be developed using the technology roadmap. There must also be a periodic review and update point, because the needs of the participants and the technologies are evolving.

Planning and business development context for technology roadmapping
The process of technology roadmapping fits into corporate strategy, corporate strategic planning, technology planning and the business development context. Three critical elements should be connected: needs, products and technology.


Knowledge and skills required
Creating a technology roadmap requires a certain set of knowledge and skills. Some of the participants must know the process of technology roadmapping. Group-process and interpersonal skills are also required, since the process involves many discussions and finding out what the common need is. If the number of participants is very large, a consultant or facilitator may be needed.

The variety in technology roadmapping

The purpose of technology roadmapping
Product planning: This is the most common type of technology roadmap, linking the insertion of technologies into products. This can overlap between generations.
Programme planning: This type is more directed to the implementation of strategy and related to project planning. Figure 5 shows the relationships between technology development phases, programme phases and milestones.

The formats of technology roadmapping
Bars: Almost all roadmaps are (partly) expressed as bars for each layer. This makes the roadmaps very simple and uniform, which makes communication and integration easier.
Graphs: A technology roadmap can also be expressed as a graph, usually one for each of the sub-layers.

Summary
This unit gave some insight into the policies of technology management systems, methods to build and maintain a learning and a world class organisation, and dual-use technology, along with a roadmap to technical planning.

Questions
Have you understood?
1. Are policies a boon or a bane to technology management?
2. Why and how can knowledge be leveraged?
3. Elaborate on the steps in creating a learning organisation.
4. How can you differentiate a world class organisation from a mediocre organisation?
5. Prepare a roadmap towards technical planning for a project of your choice.


UNIT II


CRITICAL FACTORS IN MANAGING TECHNOLOGY


Introduction
The need for flexibility in management, opting for multiple technology choices and technology sourcing has long been felt. An overview of such topics is given in this unit. Issues relating to managing complexity and chaos are also discussed.

Learning Objectives
To know about:
How to bring about flexibility in management
The role of management in change
The choice of technology in SSEs
What technology sourcing is
The meaning of complexity, chaos and uncertainty

2.1 TOTAL FLEXIBILITY MANAGEMENT
Total flexibility management is a managerial approach for developing flexible resources. An extensive variety of management tools and approaches are available to achieve business success in today's competitive global environment. Management approaches such as just-in-time manufacturing (JIT), employee involvement, activity-based management, time-based competition and total quality management (TQM) all attempt to meet the needs of the customer cheaper, faster and better. However, many world-class companies are realizing that success in the future will go to the organization with the strategic advantage of flexibility. The flexibility of a resource, or the degree to which it may be used in multiple ways or as a substitute for other, less plentiful resources, is a crucial attribute of all resources. Today, a token effort at improving flexibility may be as futile as many initial halfhearted quality programs were in the early 1970s. The company-wide focus crucial to successful TQM programs is also needed in developing total organizational flexibility.

Flexibility is an important characteristic of all types of resources, not just materials or machine tools. These include traditional resources, such as people and machines, as well as less traditional resources, such as the organization's structure, information flows, culture and decision-making processes. Furthermore, a resource with a high degree of flexibility has increased utility as a potential substitute for other resources. By leveraging this characteristic, effective companies achieve more with less.

Defining resources and flexibility
Viewed broadly, a resource may be defined as anything, tangible or intangible, that is under an organization's control and that may be used in pursuit of its mission. Some obvious examples of resources include plant and equipment, raw materials, employees and financial resources. Within the above definition, a manufacturer's customers are not a resource. Although they do contribute to the accomplishment of the firm's mission, they are not under its control. However, the firm's relationships with its customers are under its control, at least in part. Thus, a comprehensive view of resource management must include the intangible possessions of the organization. These include its relationships (reputation, standing agreements, etc.), its existing knowledge or experience base and the possibly undiscovered and unused talents of its work force. Flexibility management is also gaining importance due to changes in demand management strategies. Fig. 2 shows various demand management strategies adopted by firms to meet the demands of dynamic markets for products and services. Companies operating in a make-to-stock environment produce items and stock them based on the demand forecast, and the focus of management is on maintaining the optimum level of stock. The next level is an assemble-to-order environment, where products are stocked in a ready-to-assemble condition and assembled to meet orders.
This environment provides a certain flexibility to build a limited and known variety of products using highly standardised and modular designs. Beyond this, items have to be produced using the available designs, which leads to a make-to-order situation. The emerging environment is leading towards an Engineer-to-Order (ETO) situation, where new products are engineered to order by modifying existing designs. The fundamental assumption of this approach is that the designs are readily available and variety can be accomplished simply by building flexibility into the manufacturing systems. However, the real competence for customisation lies beyond this, that is, in the ability to quickly and efficiently design new products using available competencies and to develop new competencies wherever required. We call this an Innovate-to-Order (ITO) environment. Future organisations will be required to operate and compete in this new environment. As we move from the make-to-stock situation towards the innovate-to-order level, the requirements for flexibility increase and there is greater need for flexibility management.
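The progression from make-to-stock to innovate-to-order can be read as an ordered scale of increasing flexibility requirements. A minimal sketch of that ordering follows; the numeric levels are our own illustration, not a measure from the source.

```python
from enum import IntEnum

class DemandStrategy(IntEnum):
    # Ordered by the degree of flexibility the firm needs (illustrative scale).
    MAKE_TO_STOCK = 1       # produce to forecast, manage stock levels
    ASSEMBLE_TO_ORDER = 2   # stock modules, assemble on order
    MAKE_TO_ORDER = 3       # produce from existing designs on order
    ENGINEER_TO_ORDER = 4   # modify existing designs per order
    INNOVATE_TO_ORDER = 5   # design new products per order

def needs_more_flexibility(a, b):
    """True if strategy a demands more flexibility than strategy b."""
    return a > b
```

Using IntEnum makes the ordering explicit: any strategy further along the continuum compares as greater, mirroring the text's claim that flexibility requirements increase along it.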


Emergence of Flexibility
The concept of flexibility in the operations of the firm seems to have originated in the later part of the 1930s in the economics literature and has ever since attracted growing attention from industry as well as academic researchers. However, in spite of intense research efforts and a large number of publications, especially during the last two decades, flexibility has remained a conundrum, a confusing puzzle, defying even a universally accepted definition. Flexibility is recognized as a complex, multi-dimensional and polymorphous concept, which means different things to different people and is highly context specific. Several attempts have been made in the literature to comprehend this complex concept and capture its essence with the help of unified frameworks, taxonomies, models and measures. However, in spite of all these efforts, there are many gaps in the understanding of flexibility, especially from a practitioner's point of view. It has been observed that, beyond the intuitive and rudimentary perception of the importance of flexibility, there exists little understanding of its nature and of its effect on manufacturing performance. Others have observed that, while the potential benefits of flexibility are familiar, the concept of flexibility itself is not well understood. There are many questions in the minds of practitioners, such as: What is flexibility? Why do we need it? How does it matter for business performance? How is it created and exploited? Hence, there is a need to advance the current understanding of flexibility, and this article is a step in this direction. Three areas are encompassed under the umbrella of management flexibility which allow both supervisors and employees to accomplish their goals, improve operating efficiency, take care of personal needs, and adapt to the changing needs of the University and the individual.
Those areas are Flexible Scheduling, Telecommuting, and Internal Promotion of Employees.

2.2 CHANGE MANAGEMENT
The change management process in systems engineering is the process of requesting, determining the attainability of, planning, implementing and evaluating changes to a system. It has two main goals: supporting the processing of changes, which is mainly discussed here, and enabling traceability of changes, which should be possible through proper execution of the process described here. There is considerable overlap and confusion between change management, change control and configuration management. Change management is an important process because it can deliver vast benefits (by improving the system and thereby satisfying customer needs), but can also cause enormous problems (by ruining the system and/or mixing up the change administration). Furthermore, at least in the Information Technology domain, more funds and work are put into system maintenance (which involves change management) than into the initial creation of a system. Typical investment by organizations during initial implementation of large ERP systems is 15-20% of the overall budget.

In the same vein, Hinley describes two of Lehman's laws of software evolution: the law of continuing change (i.e. systems that are used must change or automatically become less useful) and the law of increasing complexity (i.e. through changes the structure of a system becomes ever more complex and more resources are needed to simplify it). The field of manufacturing is nowadays also confronted with many changes due to increasing worldwide competition, technological advances and demanding customers. Therefore, (efficient and effective) change management is also of great importance in this area. These statements very likely hold for other domains as well, because systems generally tend to change and evolve as they are used. Below, a generic change management process and its deliverables are discussed, followed by some examples of instances of this process.

The process and its deliverables
For the description of the change management process, the meta-modeling technique is used.

Activities
There are six main activities, which jointly form the change management process: Identify potential change, Analyze change request, Evaluate change, Plan change, Implement change, and Review and close change. These activities are executed by four different roles, which are discussed in Table 1. The activities (and their sub-activities, where applicable) are described in Table 2.
Table 1 Role descriptions for the change management process

Customer: The customer is the role that requests a change due to problems encountered or new functionality requirements; this can be a person or an organizational entity and can be internal or external to the company that is asked to implement the change.
Project manager: The project manager is the owner of the project that the CHANGE REQUEST concerns. In some cases there is a distinct change manager, who in that case takes on this role.
Change committee: The change committee decides whether a CHANGE REQUEST will be implemented or not. Sometimes this task is performed by the project manager as well.
Change builder: The change builder is the person who plans and implements the change; it could be argued that the planning component is (partially) taken on by the project manager.


Table 2 Activity descriptions for the change management process

Activity: Identify potential change
- Require new functionality: A customer desires new functionality and formulates a REQUIREMENT.
- Encounter problem: A customer encounters a problem (e.g. a bug) in the system and this leads to a PROBLEM REPORT.
- Request change: A customer proposes a change through creation of a CHANGE REQUEST.

Activity: Analyze change request
- Determine technical feasibility: The project manager determines the technical feasibility of the proposed CHANGE REQUEST, leading to a CHANGE TECHNICAL FEASIBILITY.
- Determine costs and benefits: The project manager determines the costs and benefits of the proposed CHANGE REQUEST, resulting in CHANGE COSTS AND BENEFITS. This and the above sub-activity can be done in any order and are independent of each other, hence the modeling as unordered activities.

Activity: Evaluate change
- Based on the CHANGE REQUEST, its CHANGE TECHNICAL FEASIBILITY and CHANGE COSTS AND BENEFITS, the change committee makes the go/no-go decision. This is modeled as a separate activity because it is an important process step and has another role performing it. It is modeled as a sub-activity (without any activity containing it) as recommended by Remko Helms (personal communication).

Activity: Plan change
- Analyze change impact: The extent of the change (i.e. what other items the change affects) is determined in a CHANGE IMPACT ANALYSIS. It could be argued that this activity leads to another go/no-go decision, or that it even forms a part of the Analyze change request activity. It is modeled here as a planning task for the change builder because of its relationship with the activity Propagate change.
- Create planning: A CHANGE PLANNING is created for the implementation of the change. Some process descriptions (e.g. Mäkäräinen, 2000) illustrate that it is also possible to save changes and process them later in a batch. This activity could be viewed as a good point to do this.

Activity: Implement change
- Execute change: The change is programmed; this activity has a strong relationship with Propagate change, because sometimes the change has to be adapted to other parts of the system (or even other systems) as well.
- Propagate change: The changes resulting from Execute change have to be propagated to other system parts that are influenced by it. Because this and the above sub-activity are highly dependent on each other, they have been modeled as concurrent activities.
- Test change: The change builder tests whether what (s)he has built actually works and satisfies the CHANGE REQUEST. As depicted in the diagram, this can result in an iterative process together with the above two sub-activities.
- Update documentation: The DOCUMENTATION is updated to reflect the applied changes.
- Release change: A new SYSTEM RELEASE, which reflects the applied change, is made public.

Activity: Review and close change
- Verify change: The implementation of the change in the new SYSTEM RELEASE is verified for the last time, now by the project manager. This may have to happen before the release, but due to conflicting literature sources and diagram complexity considerations it was chosen to model it this way.
- Close change: The change cycle is completed, i.e. the CHANGE LOG ENTRY is wrapped up.
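The six activities above can be read as states that a change request passes through. The following sketch, with invented state names and a simplified go/no-go step, shows one way to encode that lifecycle; it is an illustration of the process described here, not a reference implementation.

```python
from enum import Enum

class State(Enum):
    IDENTIFIED = "identified"      # Identify potential change
    ANALYZED = "analyzed"          # Analyze change request
    APPROVED = "approved"          # Evaluate change: go
    REJECTED = "rejected"          # Evaluate change: no-go
    PLANNED = "planned"            # Plan change
    IMPLEMENTED = "implemented"    # Implement change
    CLOSED = "closed"              # Review and close change

class ChangeRequest:
    def __init__(self, description):
        self.description = description
        self.history = [State.IDENTIFIED]

    @property
    def state(self):
        return self.history[-1]

    def _move(self, new_state):
        self.history.append(new_state)

    def evaluate(self, feasible, benefits_exceed_costs):
        """Analyze the request, then record the committee's go/no-go decision."""
        self._move(State.ANALYZED)
        if feasible and benefits_exceed_costs:
            self._move(State.APPROVED)
        else:
            self._move(State.REJECTED)

    def plan(self):
        assert self.state is State.APPROVED
        self._move(State.PLANNED)

    def implement(self):
        assert self.state is State.PLANNED
        self._move(State.IMPLEMENTED)

    def close(self):
        assert self.state is State.IMPLEMENTED
        self._move(State.CLOSED)
```

A rejected request simply ends in the REJECTED state, mirroring a process that terminates early, while the history list plays the role of the change log for that request.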

Deliverables
Besides activities, the process-data diagram also shows the deliverables of each activity, i.e. the data. These deliverables or concepts are described in Table 3; in this context, the most important concepts are CHANGE REQUEST and CHANGE LOG ENTRY. A few concepts are defined by the author (i.e. lack a reference), because either no (good) definitions could be found, or they are the obvious result of an activity. These concepts are marked with an asterisk (*). Properties of concepts have been left out of

the model, because most of them are trivial and the diagram could otherwise quickly become too complex. Furthermore, some concepts (e.g. CHANGE REQUEST, SYSTEM RELEASE) lend themselves to the versioning approach proposed by Weerd [6], but this has also been left out due to diagram complexity constraints.
Table 3 Concept descriptions for the change management process

REQUIREMENT: A required functionality of a component (or item; NASA, 2005).
PROBLEM REPORT: Document describing a problem that cannot be solved by a level 1 help desk employee; contains items like date, contact info of the person reporting the problem, what is causing the problem, location and description of the problem, action taken and disposition, but this is not depicted in the diagram (Dennis et al., 2002).
CHANGE REQUEST: Document that describes the requested change and why it is important; can originate from PROBLEM REPORTS, system enhancements, other projects, changes in underlying systems and senior management, here summarized as REQUIREMENTS (Dennis et al., 2002). Important attribute: the go/no-go decision, i.e. is the change going to be executed or not?
CHANGE LOG ENTRY*: Distinct entry in the collection of all changes (e.g. for a project); consists of a CHANGE REQUEST, CHANGE TECHNICAL FEASIBILITY, CHANGE COSTS AND BENEFITS, CHANGE IMPACT ANALYSIS, CHANGE PLANNING, TEST REPORT and CHANGE VERIFICATION. Not all of these have to be included if the process is terminated earlier (i.e. if the change is not implemented).
CHANGE TECHNICAL FEASIBILITY: Concept that indicates whether or not reliable hardware and software, technical resources capable of meeting the needs of a proposed system [i.e. change request], can be acquired or developed by an organization in the required time (Vogl, 2004).
CHANGE COSTS AND BENEFITS: The expected effort required to implement the change and the advantages (e.g. cost savings, increased revenue) gained by implementing it. Also named economic feasibility.
CHANGE IMPACT ANALYSIS: An assessment of the extent of the change.
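The CHANGE LOG ENTRY described in Table 3 is essentially a record whose later fields remain empty when the process terminates early. One possible sketch follows; the field names paraphrase the concepts in the table, and the class itself is an illustration, not a schema from the source.

```python
from dataclasses import dataclass
from typing import Optional

@dataclass
class ChangeLogEntry:
    # Deliverables accumulate as the change moves through the process;
    # a rejected change leaves the later fields as None.
    change_request: str
    technical_feasibility: Optional[str] = None
    costs_and_benefits: Optional[str] = None
    impact_analysis: Optional[str] = None
    planning: Optional[str] = None
    test_report: Optional[str] = None
    verification: Optional[str] = None

    def is_complete(self):
        """True only if the change went through the whole process."""
        return None not in (
            self.technical_feasibility, self.costs_and_benefits,
            self.impact_analysis, self.planning,
            self.test_report, self.verification,
        )
```

This mirrors the table's remark that not all deliverables have to be present if the process is terminated early: an entry for a rejected change is valid but incomplete.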


Besides changes proper, one can also distinguish deviations and waivers. A deviation is an authorization (or a request for one) to depart from a requirement of an item prior to its creation. A waiver is essentially the same, but granted during or after creation of the item. These two approaches can be viewed as minimalistic change management (i.e. no real solution to the problem at hand).

Examples
A good example of the change management process in action can be found in software development. Users often report bugs or desire new functionality in their software programs, which leads to a change request. The product software company then looks into the technical and economic feasibility of implementing this change, and consequently decides whether the change will actually be realized. If it is, the change has to be planned, for example through the use of function points. The actual execution of the change leads to the creation and/or alteration of software code, and when this change is propagated it probably causes other code fragments to change as well. After the initial test results seem satisfactory, the documentation can be brought up to date and released together with the software. Finally, the project manager verifies the change and closes this entry in the change log.

Another typical area for change management as treated here is the manufacturing domain. Take for instance the design and production of a car. If, for example, the vehicle's air bags are found to fill with air spontaneously after driving long distances, this will without doubt lead to customer complaints (or, hopefully, problem reports during the testing phase). In turn, these produce a change request (see Figure 2), which will probably justify a change. Nevertheless, a (most likely simple) cost and benefit analysis has to be done, after which the change request can be approved. Following an analysis of the impact on the car design and production schedules, the planning for the implementation of the change can be created. According to this planning, the change can actually be realized, after which the new version of the car is hopefully thoroughly tested before it is released to the market.

Change management in industrial plants
Since complex processes can be very sensitive to even small changes, proper management of change to industrial facilities and processes is recognized as critical to safety. In the US, OSHA has regulations that govern how changes are to be made and documented. The main requirement is that a thorough review of a proposed change be performed by a multi-disciplinary team, to ensure that as many viewpoints as possible are used to minimize the chances of missing a hazard. In this context, change management is known as Management of Change, or MOC. It is just one of many components of Process Safety Management (section 1910.119(l)).


Change Management is a structured approach to transitioning individuals, teams, and organizations from a current state to a desired future state. The current definition of change management includes both organizational change management processes and individual change management models, which together are used to manage the people side of change.

Individual change management
A number of models are available for understanding the transitioning of individuals through the phases of change.

Unfreeze-Change-Refreeze
An early model of change developed by Kurt Lewin described change as a three-stage process [1]. The first stage he called "unfreezing": it involves overcoming inertia and dismantling the existing mindset, and defense mechanisms have to be bypassed. In the second stage the change occurs. This is typically a period of confusion and transition: we are aware that the old ways are being challenged, but we do not yet have a clear picture to replace them with. The third and final stage he called "freezing" (often called "refreezing" by others): the new mindset crystallizes and one's comfort level returns to previous levels. Rosch (2002) argues that this often quoted three-stage version of Lewin's approach is an oversimplification and that his theory was actually more complex and owed more to physics than to behavioural science. Later theorists have, however, remained resolute in their interpretation of the force field model. This three-stage approach to change is also adopted by Hughes (1991), who refers to exit (departing from an existing state), transit (crossing unknown territory) and entry (attaining a new equilibrium). Tannenbaum & Hanna (1985) suggest a change process where movement is from "homeostasis and holding on", through "dying and letting go", to "rebirth and moving on". Although elaborating the process to five stages, Judson (1991) still proposes a linear, staged model of implementing a change: (a) analysing and planning the change; (b) communicating the change; (c) gaining acceptance of new behaviours; (d) changing from the status quo to a desired state; and (e) consolidating and institutionalising the new state.

Kübler-Ross
Some change theories are based on derivatives of the Kübler-Ross model from Elisabeth Kübler-Ross's book, On Death and Dying. The stages of the Kübler-Ross model describe the personal and emotional states that a person typically encounters when dealing with the loss of a loved one. Derivatives of her model applied in other settings such as the workplace show that similar emotional states are encountered as individuals are confronted with change.

Formula for Change
A Formula for Change was developed by Richard Beckhard and David Gleicher and is sometimes referred to as Gleicher's Formula. The Formula illustrates that the combination of organisational dissatisfaction, vision for the future and the possibility of

immediate, tactical action must be stronger than the resistance within the organisation in order for meaningful change to occur.

ADKAR
The ADKAR model for individual change management was developed by Prosci with input from more than 1000 organizations in 59 countries. This model describes five required building blocks for change to be realized successfully at an individual level:
1. Awareness of why the change is needed
2. Desire to support and participate in the change
3. Knowledge of how to change
4. Ability to implement new skills and behaviors
5. Reinforcement to sustain the change

Organizational change management
Organizational change management includes processes and tools for managing the people side of change at an organizational level. These tools include a structured approach that can be used to effectively transition groups or organizations through change. When combined with an understanding of individual change management, these tools provide a framework for managing the people side of change. People who are confronted by change will experience a form of culture shock as established patterns of corporate life are altered, or viewed by people as being threatened. Employees will typically experience a form of grief or loss.

Dynamic conservatism
This model by Donald Schön explores the inherent tendency of organisations to be conservative and to protect themselves from constant change. Schön recognises the increasing need, due to the increasing pace of change, for this process to become far more flexible, this process being one of learning. Very early on, Schön recognised the need for what is now termed the "learning organization". These ideas are further expanded within his framework of "reflection-in-action", the mapping of a process by which this constant change can be coped with.
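Gleicher's Formula for Change described above is usually written as D x V x F > R, where D is dissatisfaction, V is vision, F is the possibility of immediate first steps, and R is resistance. A one-line sketch makes its key property visible: because the factors are multiplied, if any one of them is zero, no amount of the others overcomes resistance. The numeric scales here are arbitrary illustrations; only the comparison matters.

```python
def change_occurs(dissatisfaction, vision, first_steps, resistance):
    """Gleicher's Formula: D * V * F > R.
    If any factor is zero, the product is zero and the change fails."""
    return dissatisfaction * vision * first_steps > resistance

# Strong dissatisfaction and vision, but no concrete first steps:
change_occurs(9, 9, 0, 1)   # False: nothing happens without action
```

This is why the formula is read multiplicatively rather than additively: a brilliant vision cannot compensate for the complete absence of dissatisfaction or of first steps.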
The role of management
Management's responsibility (and that of administration in the case of political changes) is to detect trends in the macroenvironment as well as in the microenvironment so as to be able to identify changes and initiate programs. It is also important to estimate what impact a change will likely have on employee behaviour patterns, work processes, technological requirements and motivation. Management must assess what employee reactions will be

and craft a change program that will provide support as workers go through the process of accepting change. The program must then be implemented, disseminated throughout the organization, monitored for effectiveness and adjusted where necessary. Organizations exist within a dynamic environment that is subject to change due to the impact of various change triggers, such as evolving technologies. To continue to operate effectively within this environmental turbulence, organizations must be able to change themselves in response to internally and externally initiated change. However, change will also have an impact on the individuals within the organization. Effective change management requires an understanding of the possible effects of change upon people, and of how to manage potential sources of resistance to that change. Change can be said to occur where there is an imbalance between the current state and the environment.

Other approaches to managing change
Appreciative Inquiry: one of the most frequently applied approaches to organizational change, partly based on the assumption that change in a system is instantaneous ("Change at the Speed of Imagination").
Scenario planning: scenario planning provides a platform for change by asking management and employees to consider different future market possibilities in which their organizations might find themselves.
Theory U: Otto Scharmer describes a process in which change strategies are based on the emerging future rather than on lessons from the past.

2.3 CHOICE OF TECHNOLOGY
Technology choice has important implications for growth and productivity in industry. The use of technology is always tied to an objective. Because various types of technologies can be used to achieve an organization's objectives, the issue of choice arises. The concept of technology choice assumes access to information on alternative technologies and the ability to evaluate these effectively.
Moustafa (1990) asserted that effective choice is based on pre-selected criteria for a technology's meeting specified needs. Further, it depends on the ability to identify and recognize opportunities in different technologies. The expected outcome is that the firm will select the most suitable or appropriate technology (AT) for its circumstances. The concept of AT has been a subject of debate for many years. Stewart (1987) contrasted two general views. First, welfare economics defines AT as the set of techniques that makes optimum use of available resources in a given environment. Second, social scientists and those working in AT institutions associate AT with a specific set of characteristics. According to Stewart, the characteristics defining AT normally include being more labour-using, less capital-using, less skill-using, making more use of local materials and resources, and being smaller in scale.
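The criteria-based selection Moustafa describes can be sketched as a weighted scoring exercise. The following is a hypothetical illustration, not from the source: the criteria names (echoing Stewart's AT characteristics), weights, and scores are all invented for the example.

```python
# Hypothetical sketch of criteria-based technology choice: score each
# candidate technology against pre-selected, weighted criteria and rank
# the candidates. All names, weights, and scores are illustrative.

def rank_technologies(candidates, weights):
    """Return candidate names ranked by weighted score, best first."""
    def total(name):
        return sum(weights[c] * candidates[name][c] for c in weights)
    return sorted(candidates, key=total, reverse=True)

# Criteria scored 0-10, higher = better fit with the firm's circumstances.
weights = {"labour_using": 0.3, "local_materials": 0.3,
           "low_capital": 0.2, "skill_fit": 0.2}
candidates = {
    "manual_workshop": {"labour_using": 9, "local_materials": 8,
                        "low_capital": 9, "skill_fit": 7},
    "mechanized_line": {"labour_using": 3, "local_materials": 4,
                        "low_capital": 2, "skill_fit": 5},
}
ranking = rank_technologies(candidates, weights)
```

Under these assumed weights the labour-intensive option scores highest, which is consistent with the AT characteristics listed above; different weights would, of course, favour different technologies.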
It is also sometimes emphasized that AT should not affect the environment negatively and that it should fit in with the socioeconomic structures of the community. The suggested characteristics are numerous, which implies that a technology can be appropriate in some respects and inappropriate in others. Kaplinsky examined the trade-offs involved in the choice of technology and found that mechanized production can, at times, turn out an inexpensive, higher quality product for consumers, whereas normal production of a lower quality, higher cost product generates more employment (ATI 1987). This illustrates the dilemma involved in evaluating technology and raises the question, "Appropriate for whom?" This article is concerned with the gaps in knowledge, skills, or resources that hinder effective choice of technology at the enterprise level. In this context, the term "appropriate" is used loosely to mean technology that is most advantageous to the enterprise's purpose and circumstances.

Small enterprises

The heterogeneity of the SSE sector complicates the problem of defining it. The concept is defined in different ways, depending on the purpose of classifying firms as micro, small, medium sized, or large. Technologically, the sector is said to use low-level inputs and skills, to have much greater labour intensity, to produce lower priced products, and to operate on a small scale. The study on which this article is based focused on enterprises in the carpentry and hair-care subsectors employing fewer than 20 employees. It covered micro and small enterprises operating at various levels along the formality-informality continuum. The Private Sector Diagnosis Survey (USAID 1989) found that most small enterprises in Kenya had fewer than 20 employees.

Choice of Technology in SSEs

Private-sector development as a suitable alternative for promoting sustainable and balanced growth in India has attracted considerable attention.
Many governments and development organizations have focused on the promotion of small-scale enterprises (SSEs) as a way of encouraging broader participation in the private sector. The promotion of SSEs and, especially, of those in the informal sector is viewed as a viable approach to sustainable development because it suits the resources in India. A number of factors have helped to direct the attention of development agencies to the merits of SSEs. For instance, at the peak of the economic crisis in the early 1980s, the SSE sector grew tremendously and exhibited unique strengths in the face of recession. The sector continued to grow, despite hostile economic, regulatory, and political environments. The entrepreneurs in this sector came to be regarded as highly opportunistic and innovative. They emerged spontaneously to take advantage of opportunities that arose in the changing business environment. Moreover, they demonstrated great creativity in starting enterprises with minimal resources.


SSEs have characteristics that justify promoting them in a development strategy. They create employment at low levels of investment per job, lead to increased participation of indigenous people in the economy, use mainly local resources, promote the creation and use of local technologies, and provide skills training at a low cost to society (ILO 1989). It is generally recognized that SSEs face unique problems, which affect their growth and profitability and, hence, diminish their ability to contribute effectively to sustainable development. Many of the problems cited have implications for technology choice. These problems include lack of access to credit, inadequate managerial and technical skills, low levels of education, poor market information, inhibitive regulatory environments, and lack of access to technology.

Factors influencing the choice of technology by SSEs

Entrepreneurs decide at the enterprise level which technologies to use. The main factors influencing their choice of technology include the objectives of the firm, the resources available, the nature of the market, and their knowledge of available technologies (Stewart 1987). Moreover, entrepreneurs need technical and managerial skills to choose, adapt, and effectively use technology. Additionally, one would be in a better position to choose a technology if one were able to assess the demand for the firm's products, estimate the rate of change in the market that may call for a change in technology, gather information about alternative technologies, and estimate the potential return on investment for each alternative. However, many entrepreneurs in this sector lack the education, training, management experience, and other competencies needed to respond to these issues. Because of their economic and organizational characteristics, many SSEs lack information about technologies and have no way of gauging the appropriateness of those they are aware of (Neck and Nelson 1987).
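One simple way an entrepreneur could "estimate the potential return on investment for each alternative", as mentioned above, is to compare simple payback periods. The figures below are assumed for illustration and are not from the source.

```python
# Illustrative sketch (assumed figures): comparing technology alternatives
# by simple payback period, i.e. how many years of net cash flow are needed
# to recover the initial investment.

def payback_years(investment, annual_net_cash_flow):
    """Years of net cash flow needed to recover the initial investment."""
    return investment / annual_net_cash_flow

alternatives = {
    "second_hand_machine": payback_years(2000, 800),
    "new_imported_machine": payback_years(10000, 2500),
}
quickest = min(alternatives, key=alternatives.get)  # shortest payback
```

Payback is a crude measure (it ignores cash flows after the payback point), but it requires only information an entrepreneur is likely to have, which matters given the information constraints described above.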
Macropolicies also affect technology choice at the firm level through overall socioeconomic, political, and legal forces. It has been suggested that the general socioeconomic environment, industry-specific regulations, taxes, subsidies, trade and financing policies, science and technology research, and dissemination policies tend to favour large-scale enterprises (ATI 1987).

Problems hindering the effective choice of technology by SSEs

The literature indicates that SSEs face unique constraints that hinder the effective choice of technology. Many SSE owners or managers lack managerial training and experience. The typical owners or managers of small businesses develop their own approach to management through a process of trial and error. As a result, their management style is likely to be more intuitive than analytical, more concerned with day-to-day operations than long-term issues, and more opportunistic than strategic in concept (Hill 1987). Although
this attitude is a key strength at the start-up stage of the enterprise, because it provides the creativity needed, it may present problems when complex decisions have to be made. A consequence of poor managerial ability is that SSE owners are ill-prepared to face changes in the business environment and to plan appropriate changes in technology.

Lack of information is a key problem affecting SSEs' access to technology. Harper (1987) suggested that technologies used by SSEs in developing countries may be inappropriate because their choice is based on insufficient information and ineffective evaluation. Neck and Nelson (1987) suggested that ignorance is a key constraint affecting the choice of technology by SSEs. Further, the level of education is relevant, as it may determine the entrepreneur's access to information. Generally, the ability to read and write, exposure to a broader world, and training in the sciences enhance one's ability to understand, respond to, use, and control technologies (Anderson 1985).

Lack of access to credit is almost universally indicated as a key problem for SSEs. This affects technology choice by limiting the number of alternatives that can be considered. Many SSEs may use an inappropriate technology because it is the only one they can afford. In some cases, even where credit is available, the entrepreneur may lack freedom of choice because the lending conditions may force the purchase of heavy, immovable equipment that can serve as collateral for the loan. Another related problem is the lack of suitable premises and other infrastructure.

The national policy and regulatory environment has an important impact on technology decisions at the enterprise level. The structural adjustment programs (SAPs) currently implemented in many African countries are aimed at removing heavy policy distortions, which have been viewed as detrimental to the growth of the private sector.
However, much as these policies may in principle favour SSE growth in the long run, concern has been expressed about the ability of the SSE sector to increase production and create more jobs under conditions of declining demand (Henk et al. 1991). SAPs tend to severely affect vulnerable groups in the short run and have been associated with worsening living conditions in many African countries (USAID 1991). Furthermore, severe cutbacks in government services, such as health and education, force many small-business owners to draw more money from their businesses to meet these needs, thus hindering investment in technology and business expansion. In addition, the resulting reduction in employment and real wages leaves many potential customers without the ability to buy, thus reducing demand.

Some evidence from the field

This section highlights the findings of a study carried out on the SSE sector in Kenya. The survey used a random sample of 140 SSEs operating in the carpentry and hair-care subsectors in Kenya. The two subsectors are largely dominated by small and micro enterprises. Interviews were conducted with owners and managers of SSEs. The literature
survey included a review of policy documents outlining government policy objectives for SSE development and technology issues in Kenya (for a detailed report of this study, see Ngahu [1992]). The findings of the study correspond to those in the literature. Most of the SSEs (78%) were individually owned, and the others were partnerships. The SSEs had not grown much over the years. More than 51% had fewer than 5 workers, and only 22% had more than 10 employees. Sixty-three percent of the owners surveyed had secondary education. More than 60% had some kind of training in a technical area of business, but only 13% and 12% had any training in general business management and marketing, respectively.

Most tools and equipment used in the two subsectors were imported from Europe or Asia. In some cases, even simple tools, such as brushes, hammers, and tape measures, were imported. In the hair-care subsector, the chemicals, materials, and equipment were mainly imported. The tendency to rely on foreign sources and the large-scale industrial sector for the supply of equipment sometimes led to equipment that was incompatible with the needs and capacities of the SSEs. Wangwe (1993) suggested that SSEs try to avoid risk by avoiding unproven technologies.

To get information about products, tools, equipment, and processes to use in business, many SSEs rely heavily on friends, competitors, and training courses. More than 64% of the respondents indicated that friends were their main source of information on available technologies. Other sources include training courses, magazines, and salespeople. The heavy reliance on friends as a source of information may explain the similarities among products and services from this sector. Both subsectors serve markets that are clearly segmented, and technologies in enterprises serving the same market were very similar. The key method of technology choice in these enterprises seemed to be simple imitation based on observation.
Although imitation strategies have unique merits for small firms because they serve to minimize risks, imitation can be risky in the absence of adequate market information. Many SSEs lack information about consumer demand and competition. Moreover, they lack the skills and resources to conduct market research. As a result, many imitators find themselves in a congested market. The similarity of their products, coupled with the tendency to serve the same market segments, erodes any competitive advantage. This forces them to compete by reducing prices, which in turn reduces profits and opportunities for growth. Most SSE owners were influenced by customer expectations and tastes, current trends, and the technology that competitors were using. Generally, the technologies adopted in both subsectors were labour intensive. Most respondents expressed concern about high prices, inability to determine quality, lack of information about serviceability, and lack of alternatives. They also raised the issue of inadequate infrastructure, high taxation on equipment, lack of access to credit, and lack of appropriate training courses.
The government policy on the use of technology in the production of goods and services is to encourage the application of technologies that minimize waste and offer recycling possibilities; the use of local and renewable materials; the use of local talents and inputs wherever possible; and the active development of innovations and inventions (Government of Kenya 1989). Although the policy objectives appear explicit, it is not clear which policy measures or government interventions have been intended to affect the process of technology choice by SSEs.

Policy implications

SSEs are obviously incapable of sourcing, evaluating, and adapting technologies effectively. Government policy should, therefore, aim to develop these capabilities in SSEs through supportive institutions. Policy can encourage the development of assistance programs to facilitate SSEs' access to resources, information, training, and technology. Further, policy should promote the development of technologies appropriate for SSEs. Although it is possible to develop policies designed to improve the circumstances of SSEs, it may be more feasible to support the development of technologies compatible with those circumstances. Policies should aim to encourage and promote the development of local technologies. Emphasis should be on the promotion of the local tool industry to reduce reliance on imports.

SSEs are said to face a "liability of smallness". Because of their size and resource limitations, they are unable to develop new technologies or to make vital changes in existing ones. Still, there is evidence that SSEs have the potential to initiate minor technological innovations to suit their circumstances. However, for SSEs to fully develop and use this potential, they need specific policy measures to ensure that technology services and infrastructure are provided. Further, publicly funded research and development institutions should be encouraged to target the technology needs of SSEs.
The problem of access to information may be attributed to the inadequacy of SSE support institutions. This points to the need for a supportive policy to encourage the establishment of documentation centres and information networks that provide information to SSEs at an affordable price. Market characteristics significantly influence technology choice. The government can facilitate SSEs' choice of technology by creating an environment that is conducive to fair competition. The crucial focus of policy should be an enabling environment for technology decisions at the enterprise level. There is a need to go beyond statements of policy objectives and to take specific and consistent measures to ensure that the policy objectives will be achieved. There is also a need to address the overall policy framework to ensure that the policy instruments are consistent with key objectives. In some cases, there appears to be an obvious contradiction between policy and implementation.


2.4 TECHNOLOGY SOURCING

Today's global competition forces manufacturing companies to re-evaluate their existing processes and technologies in order to focus on strategic activities. This issue has created an awareness of the importance of the make-or-buy decision and its long-term impact on the organisation. Undertaking make-or-buy decisions requires an analysis of in-house and outside manufacturing technologies and capabilities. Therefore, companies should be able to understand and identify the way the technology portfolio should be built in order to balance in-house and outsourced technologies.

This paper discusses the different options for technology sourcing resulting from the importance/competitiveness matrix. This matrix indicates a range of sourcing options as a result of analysing each technology process in terms of: the importance of the technology to the business, that is, its ability to influence the business's key success factors; and the competitiveness with which the technology is deployed, which involves assessing the company's level of performance in the use of the technology against potential suppliers or competitors. In particular, a critical dimension for technology sourcing, the technology life cycle, is presented, emphasising the importance of understanding and monitoring the life cycle of technologies. The paper particularly shows the critical importance of technology life cycle considerations in the choice of technology sourcing options.

2.5 MANAGING UNCERTAINTY (RISK MANAGEMENT)

Managing change, particularly in the context of Extended Services, often requires school change teams to rely on other things falling into place and other people playing their part. In these situations, that is, when the outcome is not entirely under your control, you are faced with uncertainty and the risks that arise.
Rather than make an assumption and hope it all works out, change managers can use this tool to help reduce and eliminate the risks involved in their change projects by proactively and systematically managing the uncertainty from which all risks stem. The tool, Managing uncertainty (risk management), differs from customary risk management methods in that it focuses attention on the underlying uncertainty rather than the risk, and it proposes a way of effectively tracking the impact of actions taken so that you avoid managing a crisis.

What is it? Managing uncertainty (risk management) is a form-driven tool to ensure you identify the uncertainties that present risks to the success of your change project. The tool helps the team to understand the assumptions they have made in putting their project plan together. It enables you to raise confidence in assumptions, reduce uncertainty and, hence, reduce or eliminate risk to successful change.


When would you use it? During the deliver stage.

Are there any rules? Use it always when planning your implementation. As action is taken, revisit and reappraise confidence in assumptions and the criticality of risks.
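A form-driven record of this kind could be sketched as a simple risk register: each entry holds an assumption behind the plan, the team's confidence that it holds, and the criticality of the risk if it fails. The field names, rating scales, and review rule below are illustrative assumptions, not the tool's actual form.

```python
# Minimal sketch of an assumption/risk register (all details assumed).
# Each assumption is rated for confidence (that it holds) and criticality
# (impact if it fails); the review flags shaky, high-impact assumptions.

from dataclasses import dataclass

@dataclass
class Assumption:
    description: str
    confidence: int   # 1 (shaky) .. 5 (near-certain) that it holds
    criticality: int  # 1 (minor) .. 5 (project-threatening) if it fails

def needs_action(entry, min_confidence=3, max_criticality=3):
    """Flag assumptions that are both low-confidence and high-criticality."""
    return entry.confidence < min_confidence and entry.criticality > max_criticality

register = [
    Assumption("Partner agency will second two staff by May", confidence=2, criticality=5),
    Assumption("Hall is available every Thursday evening", confidence=4, criticality=2),
]
# As action is taken, update confidence and criticality and re-run the review.
flagged = [a.description for a in register if needs_action(a)]
```

The point of the sketch is the review loop: raising confidence in an assumption (or reducing its criticality) is what "reduces or eliminates risk" in the tool's terms.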

2.6 COMPLEXITY MANAGEMENT

A content management system (CMS) requires contributions from many different skill sets and coordination across diverse departments and roles. A CMS project can cost hundreds of thousands or even millions of dollars and require months or years to design and implement. Because of the high planning, purchasing, and design costs, there is a need to manage the complexity of CMS projects effectively. Here are ten lessons in managing complexity gleaned from real-world, successful CMS projects. Ideally you'll consider these at the beginning of a project, when they can have the most impact.

1. Keep the team small.

A big team usually requires a lot of coordination and communication, especially if it is spread across different departments, offices, or cities. This coordination increases the points of failure and quickly reaches a level of diminishing returns for systems that need close collaboration to be designed well. To overcome this problem, one financial services firm formed a multidisciplinary team of only five experienced people to create its content management system. The team included people who both had skills to contribute and could make executive decisions. This team consulted additional, specialized staff only when needed. In the end they succeeded in building a system in a few months, in a company where other efforts typically spent several months and failed.

2. Don't try to fix everything at once.

A CMS alone is complex enough; combining that effort with a site redesign, new workflow, new content, and more may be asking for trouble. An international retailer decided to expand their content management system in a way that required multiple new software packages. To reduce complexity, they swapped in the new CMS without changing the design of either their online or print catalogs.


3. Only build what you need.

It's important to remember that the main benefit of content management systems is efficiency. Anything done with a CMS can be done without one by people with the right skills, albeit in much less efficient ways. Websites often use content management when there is a large volume of content, frequent content updates, content distributed across several media, and so on: tasks that would be arduous if done manually. But if more effort is needed to implement a CMS than to manage the content manually, the return on investment is quickly lost. The potential features of CM software spiral out in all directions, so discipline is needed to decide which features are needed most. At the beginning of a project we can examine which features bring the most benefit compared with how difficult they are to implement, and choose the features with the most value. A new media group at a bank took this approach: it built the absolute simplest CMS that would serve its needs, then gradually added one feature at a time as the need became clear.

4. Create an efficient information architecture.

A content management system with a different template for every published page would not be very efficient. And if efficiency is the main benefit of a content management system, then it makes sense to use fewer templates. As designers, we must be very clever about how to arrange diverse information into a small number of templates while still retaining some flexibility in the presentation. A large technology company achieved this efficiency by creating templates as well as reusable modules of information, such as a list of related links, that fit inside those templates. By creating rules that determined how templates could use certain modules, the company struck a balance between CMS efficiency and display flexibility.

5. Show your content some love.
Of all the tasks in a content management project, the creation, editing, and migration of content are probably the most frequently underestimated on the project plan. The survey above reveals this void as the biggest problem with CMS. Amid much sexier design and technology issues, the creation and/or reformatting of content can be delayed until this eventual necessity delays the project. To counter this problem, one non-profit organization settled on an article layout at the beginning of a project so it could start preparing the content earlier, then continued the content work in parallel with the design and technology work.

6. Hire bouncers as project managers.

Perhaps this is going a little too far, but you do need rigorous project managers who understand CM issues and will babysit the team to make sure every little task is getting done. These project managers must do more than make sure documents are delivered on time; they must help connect the work that all team members do.

One large retail firm took this to heart by using two project managers: one to oversee the business and user interface issues and another to oversee the technology issues.

7. Tightly integrate design and technology.

Content management software involves certain components, such as content entry screens, that require a combination of interaction design, information architecture, writing, and database programming skills. Few people do all these things well, and having different people or groups design these components in isolation risks poor quality and consistency. My smoothest experience designing these components came when my desk was located right next to the programmer's desk and we constantly discussed the design as it evolved.

8. Buy the right size.

In the survey cited above, the number one problem with software is the expense. You might think the solution to this problem is to buy a less expensive software package, but I think a better solution is to buy the right-size software package.

Tips for choosing the right software:

Buy small software if you're a small organization. Organizations like Boxes and Arrows, the Asilomar Institute of Information Architecture, and Adaptive Path all use Movable Type, which was originally designed for weblogs, to manage content. As content management software it doesn't provide many basic functions, but it simplifies the publishing process enough for occasional publishing needs.

Buy big software if you're a large organization. One big CMS can actually be more efficient than many different, smaller packages. One financial services firm employs a federated model of CMS, using one software platform to publish many websites and avoiding the extra training and technical work needed to work with several different software packages.

Buy no software at all if you really don't need it. In the decision to buy vs. build, we can also avoid software altogether.

9. Design faster than business can change.

You must design and implement your system faster than your organization can change.

For example, a large computer networking company found that it required three years to roll out a new website design across all its departments and websites. Before they could finish, the organization had undergone significant changes that needed to be reflected in a new site design. Designing fast may mean keeping the scope small, but it can also mean finding innovative approaches to problems rather than simply following conventional methods. Building a Metadata-Based Website is an example that speeds design of very large sites by focusing management on the business concepts instead of the content.


10. Get a second opinion.

Content management is an elaborate, dynamic field, and there are several solutions to any problem. Just as in medicine, it's sometimes necessary to get a second or third opinion to hear approaches that arise from different philosophies. One international retail company brought in a CMS expert to consult with the team without doing any of the work herself. As an expert who didn't work for the company or any of the contracted vendors, she was in a good position to provide impartial guidance. As a disinterested third party, such an expert can help smooth interaction within the team while leveraging experience from previous projects.

Now you have a list of problems others have had and ten ways to address the complexity of content management. Will following this advice solve your CMS problems? Not entirely. If you've only heard the hype from software vendors, you may have very high expectations that need to be reconciled with reality. Content management systems are not a silver bullet, but they can make your most onerous tasks more efficient if you actively manage the inherent complexity.

2.7 WHAT IS CHAOS AND COMPLEXITY?

1. What is Chaos?

The first concept comes from chaos, which is defined as the irregular, unpredictable behavior of deterministic, non-linear dynamical systems. Chaos is fast replacing bureaucracy as the new science of organizations. The relevant generalization here is that we live in an uncertain and turbulent environment and, even with massive amounts of available information, it has become increasingly difficult for us to choose appropriate organizational survival behaviors. No one seems to disagree with the assertion that human systems exhibit chaotic behavior. However, management theorists have yet to acknowledge that the deterministic element of chaos can be beneficial in forming viable survival strategies. They have focused almost exclusively on preparing the organization to react quickly to changes in the external environment.
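The "deterministic yet unpredictable" behavior in the definition above can be seen in the logistic map, a standard textbook example of a chaotic system (the example is mine, not from this text): every step is fully deterministic, yet two starting points differing by one part in a billion soon produce completely different trajectories.

```python
# Logistic map x_{n+1} = r * x * (1 - x) at r = 4, a standard illustration
# of deterministic chaos: identical starts give identical orbits, but tiny
# differences in the start are amplified exponentially.

def logistic_orbit(x0, r=4.0, steps=50):
    """Iterate the logistic map from x0 and return the trajectory."""
    orbit, x = [], x0
    for _ in range(steps):
        x = r * x * (1 - x)
        orbit.append(x)
    return orbit

a = logistic_orbit(0.2)
b = logistic_orbit(0.2 + 1e-9)  # nearly identical starting point
divergence = max(abs(p - q) for p, q in zip(a, b))
# 'divergence' grows to order 1 within a few dozen steps, even though
# rerunning from exactly 0.2 reproduces orbit 'a' perfectly.
```

This is the sense in which chaotic systems are deterministic but effectively unpredictable: short-term behavior is computable, long-term behavior is exquisitely sensitive to initial conditions.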
What is Chaos Management?

The translation of chaos theory into management practice is, at best, a loose analogy built upon three generalizations of scientific concepts: chaos, complexity theory, and complex adaptive systems. It has always been somewhat problematic to apply a scientific theory, one intended to explain natural phenomena, to the affairs of human organizational systems. The relatively new science of chaos is one such application that has made inroads into the realm of management and organizational behavior.

Summary Point: Chaos has positive and negative features.

2. What is Complexity?

The second concept comes from complexity theory, which states that critically interacting components self-organize to form potentially evolving structures exhibiting a hierarchy of emergent system properties. A system normally has two choices of operational mode: stability or instability. In the stable mode, a disturbance will eventually converge back toward the system's initial

conditions. In the unstable mode, a disturbance will cause progressive divergence away from the initial conditions. Self-organizing systems operate in a third mode, between stability and instability, where optimal system performance can be achieved in a turbulent environment. This transition zone is known as the "edge of chaos", a region of bounded instability in which there is unpredictability of specific behavior within a predictable general structure of behavior. The relevant generalization here is that for a human social system to become self-organizing, it must become a learning organization. That is, survival strategies are developed continuously in response to changing environmental conditions. Recognition of rudimentary deterministic environmental patterns allows the organization to move beyond mere survival to the possibility of a thriving existence.

What is the Positive and Negative Side of Chaos?

Positive side of chaos: The new theory of organizations concerns how to create what are called "edge of chaos" patterns of organizing. In this approach individuals and units are given more flexibility and local control, and teams are expected to self-organize, under the assumption that this makes it possible to achieve greater adaptability to customer demands and other environmental shifts and flows. Daryl Conner, author of Managing at the Speed of Change and Leading at the Edge of Chaos: How to Create the Nimble Organization, asserts that change now breeds itself. "So the challenge is," he says, "how do we deal with perpetual unrest?" The concept of the nimble organization is key to Conner's work. In fact, the first line in his book Leading at the Edge of Chaos reads: "the focal point for this book is leadership's role in building resilient, nimble organizations." Organizations that are not nimble, Conner says, are constrained.
To build nimble organizations, he explains, leaders must bring to the human side of change the same level of rigor and discipline that is applied to the organization's financial assets. As Conner (2000: 18) puts it: "Running a corporation that survives and thrives at the edge of chaos has become almost a full-time job. Mergers and acquisitions are creating strange bedfellows, the market is becoming more sophisticated, and the very nature of our businesses is shifting. Some leaders are questioning their abilities to remain competitive in a market where disruption is the norm."

Negative side of chaos: In its popular usage, chaos is a negative. People say, "I hate chaos, let's get organized." While the theorists give us fractal, strange attractor, and edge-of-chaos metaphors, we have to work in the chaos soup. Of concern here is how it feels to stare into the abyss, or worse, to work in a chaos abyss. One definition of the chaos narrative comes from Frank (1995): "It is the story we tell when we are unable to tell a story; it is the anti-narrative of time without sequence, telling without mediation, and speaking about oneself without being fully able to reflect on oneself."
There is an obvious need to balance theories of chaos management with how people experience chaos as the void of buzzing confusion and being out of control.

How are Chaos and Complexity Inter-related? What are Complex Adaptive Systems?

The third concept is a characterization of the only known type of system that is capable of thriving at the edge of chaos. A Complex Adaptive System is defined as a system of individual agents, who have the freedom to act in ways that are not always totally predictable, and whose actions are interconnected such that one agent's actions change the context for other agents. The relevant generalization here is that to optimize system performance, managerial control must be loosened enough to allow uninhibited communication and interaction among all members of the organization. Creative and adaptive solutions to external constraints will emerge as the learning organization gains the mobility and freedom to actively navigate through uncertainty and turbulence. The behavior of a mature complex adaptive organization can even move into the realm of predictability.

How to Control Chaos?

Control chaos by applying these basic office management principles:

1. Establish office management routines and stick to them. Routine tasks need routine procedures if you want to stay organized and keep things running smoothly. Set up routines for handling paperwork and office systems. For instance, every piece of paper that comes into your office should be handled once, acted upon, and filed, not haphazardly piled on a desk. Office systems, such as computers, will need both administration and what I call "panic mode" procedures. When the system crashes or a computer-related piece of equipment fails, everyone in your office needs to know whom to call and what not to do (such as try to fix the problem themselves). These data management articles provide helpful tips for everything from office filing systems through computer backup procedures.

2. Set up clearly delineated responsibilities. Good office management depends on people knowing who is responsible for what; it's people who are accountable who get things done. What would happen, for example, if the purchasing for your small business was done by whoever, whenever? Would you be able to find a paper clip when you wanted one? Or print off a report when you needed to? Putting one person in charge of ordering all equipment and supplies solves the problem and keeps things running smoothly. It's the same with (computer) systems administration. You need to have one person responsible for the security of your computer system and for keeping track of things such as accounts, passwords and software. Otherwise, chaos will proliferate.
3. Keep records, and keep your business records updated. Keeping records sounds like the easiest part of good office management, until you consider the need to keep those records both accessible and updated. But my first rule for controlling chaos will help you get a grip on this: make updating records an office routine. When you get a new customer or client, for instance, it only takes a moment to enter him into your contacts database. Then it will only take another moment or two to update the record after you've spoken to him on the phone. Note that records of customer permissions will have to be kept, and customers need to have access to their records.

4. Take a walk through your office and have a seat. Is your office an example of space management or space mismanagement? When you walk through the office, do you have to detour around obstacles or run the risk of tripping over something? When you sit down at a desk, could you actually work comfortably there? Are things logically arranged, so that the things you would use most at the desk are closest to hand? There are a lot of things crammed into offices nowadays, from printer stands through filing cabinets. For good office management, you need to be sure that all the things in the office are arranged for maximum efficiency and maximum safety. The Basics of Small or Home Office Design provides tips for safely meeting the power, lighting and ventilation needs of your office space.

5. Schedule the scut work. It's too easy to put off things that you don't like doing, and I don't know very many people who enjoy scut work. Unfortunately, an office, like a kitchen, won't function well without a certain amount of scut work being done. If you are a small business owner who is not in a position to assign whatever you view as scut work to someone else, force yourself to get to it regularly by scheduling time each week for it. Take a morning or afternoon, for instance, and spend it making the cold calls or catching up on the accounting (or updating the records).

6. Delegate and outsource. In a perfect world, everyone would only be doing what he or she had time to do and did well. As the world is not perfect, a lot of people are instead doing things that they don't have the time or talent to do well. Delegating and outsourcing can not only improve your small business's office management, but also free you to focus on your talents, thereby improving your bottom line. Virtual assistants can handle many of your office or administrative tasks. For more on delegating, see Decide to Delegate.

7. Make business planning a priority. Many small business owners spend their days acting and reacting, and then wonder why they seem to be spinning their wheels. Business planning is an important component of good office management and needs to be part of your regular office management routine.
Successful small business owners spend time every week on business planning, and many use daily business planning sessions as a tool for goal setting and growth. If you have staff, involve them in business planning, either formally or informally. Don't let chaos interfere with doing business. Once you start applying these seven principles of good office management, you'll be amazed at the difference good office management makes and how much more business you do.

2.8 R & D PRODUCTIVITY

In recent years, both economists and policymakers have focused increased attention on the role that R and D plays in promoting economic growth. Although R and D activities exist in many countries, only a handful of nations consistently create leading-edge technologies, from communication advances to biomedical revolutions. American scientists, engineers, and other highly skilled professionals lead in generating new-to-the-world technologies; only Switzerland had a per capita patenting rate comparable to the United States in the 1970s and 1980s. However, Japan, Germany, and Sweden did join the top tier in the 1980s. Why do some nations excel at technological breakthroughs while others lag behind? Put somewhat differently, why does location matter for innovation when ideas easily cross borders, thanks to global communications networks, relatively open capital markets, and consistently increasing international trade in goods and services? The answers are more than intellectually intriguing. Governments and policymakers are concerned about which resources and policies are likely to be effective in improving their science and technology infrastructures. A better grasp of the complex links between broad public policies and a nation's ability to produce genuine high-tech innovations could lead to more effective strategies for improving economic growth. These are the ambitious issues motivating "The Determinants of National Innovative Capacity" (NBER Working Paper No. 7876) by Scott Stern, Michael Porter, and Jeffrey Furman, which evaluates the factors driving variation in R and D productivity among a sample of 17 OECD countries between 1973 and 1996. The key concept framing their analysis is national innovative capacity, defined by the authors as "the ability of a country as both a political and economic entity to produce and commercialize a flow of innovative technology over the long-term." The national innovative capacity concept is built on three distinct scholarly strands. First are the theories of ideas-driven growth, closely associated with the work of Paul Romer. Then there are the microeconomic models of national competitive advantage based on an understanding of industry clusters, a research agenda largely identified with Porter. Finally, the authors draw upon the rich national innovation systems literature, among whose most notable authors is Richard Nelson. The national innovative capacity framework
highlights three factors that drive a nation's ability to innovate at the world's technological frontier: 1) a common innovation infrastructure, which includes support for basic research and higher education, as well as a country's cumulative stock of technological knowledge; 2) the extent to which the conditions of a nation's industry clusters promote innovation-based competition; and 3) linkages between the common innovation infrastructure and the industry clusters that allow the resources broadly available for innovation in the economy to flow to their most competitive use. "The productivity of a strong national innovation infrastructure is higher when specific mechanisms or institutions, such as a strong domestic university system and funding mechanisms for new ventures, migrate ideas from the common infrastructure into commercial practice," write the authors. Porter, Stern, and Furman's quantitative analysis concentrates on uncovering the relationship between international patenting (patenting by foreign countries in the United States) and the variables making up the innovative capacity framework. Their results suggest that a number of factors are especially important in determining a nation's overall level of innovative outputs, including national policies, such as international patent protection and openness to international trade, and factors describing the composition of R and D effort in the economy, such as the share of research performed by the academic sector and the share funded by the private sector. In expanding their analysis to examine the relationship between innovativeness and competitiveness, the authors find that a country's level of national innovative capacity also has a substantial impact on commercial success in high-tech markets at home and abroad. The authors document a striking convergence in innovative capacity among the OECD countries over the past two decades. Whereas the United States and Switzerland were the world leaders in R and D productivity in the mid-1970s, Japan, Germany, and Sweden have become their peers in the innovation marketplace. The second tier of innovator nations also has expanded, with Denmark, Finland, and other countries making genuine strides in improving their commercial exploitation of frontier technologies. The trend toward convergence also may reflect a lessening of America's traditional dominance. Since the passing of the Cold War, the United States has been increasing its investments in its national innovation infrastructure at a lower rate. Consequently, the authors speculate, as a wider set of countries continue to invest substantial resources in national innovative capacity, the commercial development of emerging technologies may become less geographically concentrated in the next few decades than it was in the 50 years of the post-World War II era.

2.9 BUSINESS APPRAISAL OF TECHNOLOGY POTENTIALS

Aims

To provide manufacturing (and other) companies with the means to assess systematically the benefit of new technologies to their business. The objectives and outputs are:
- To review the tools and techniques currently available to managers in industry for the assessment of technology for business purposes.
- To identify the gaps and limitations related to the use of these tools and techniques.
- To develop selected new approaches which integrate and complement existing tools and techniques, filling significant gaps.
- To apply the developing tools in selected case examples.
- To provide guidance to potential users in the application of these new approaches.

Background

The issue of assessing technology for business application remains a foremost concern for managers in industry. Companies are pushed towards diversifying their portfolio of technology as well as accelerating commercial exploitation. They do this by increasing resources directed towards growth and by acquiring developed or developing technologies. This has increased the trading of technology between firms, and these technologies must be valued. Other reasons to value technology include obtaining finance and valuation for tax purposes. In practice, many managers know that there is something unsatisfactory about the standard use of Discounted Cash Flow (DCF) techniques, particularly when there is high uncertainty and flexibility. In recent years much progress has been made; however, many key questions remain, in particular that of estimating the value of a particular technology to a particular organisation, now and in the future. This is of central concern in the choice of development projects, and when considering the acquisition of technology external to the firm. Valuing technology is more of an art than a science, and methods have been developed from tools used to value tangible assets, so there is still a huge amount of scope for research in this area. Recent advances in options and hybrid-model thinking have opened up new paths, but the application of these ideas in practice is very limited.

Research approach

- Initial interviews in a range of companies to determine issues, current practice and future requirements.
- Multi-company workshops to firm up concepts for further development.
- Case studies or collaborative projects in companies to develop new techniques.

Deliverables

- Review of literature and practice, based on working papers and company interviews and cases
- Framework principles for technology evaluation
- T-VAL CD to raise awareness of the integrated nature of technology evaluation concepts, techniques and resources
- Value roadmapping guide to support the evaluation of early stage technologies
- Decision tree software tool to support more enlightened use of quantitative approaches, supported by two management guides ("Beyond DCF" and "Get better estimates for input into quantitative models")
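The contrast between standard DCF and the decision-tree approaches mentioned in the deliverables can be made concrete with a small numerical sketch. The cash flows, discount rate, pilot cost and success probability below are invented purely for illustration and are not data from the project described above:

```python
def npv(rate, cashflows):
    """Discounted cash flow: cashflows[0] occurs at time 0 (e.g. the investment)."""
    return sum(cf / (1 + rate) ** t for t, cf in enumerate(cashflows))

# Plain DCF view of a hypothetical technology project:
# invest 100 now, earn 40 per year for 4 years, discounted at 10%.
base_case = npv(0.10, [-100, 40, 40, 40, 40])

# Decision-tree view: first run a pilot costing 20, which reveals whether
# the technology works (assumed 60% chance). On failure we abandon, so the
# downside is capped at the pilot cost; on success the payoffs are larger.
p_success = 0.6
success_value = npv(0.10, [-100, 60, 60, 60, 60])
tree_value = -20 + p_success * max(success_value, 0) + (1 - p_success) * 0

print(round(base_case, 1))  # plain-DCF value of the project
print(round(tree_value, 1))  # value once the option to abandon is counted
```

This is why plain DCF is often called unsatisfactory under high uncertainty: it cannot represent the managerial flexibility (here, abandoning after a failed pilot) that the decision-tree framing values explicitly.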

2.10 DESIGN MANAGEMENT

Design management refers to an approach whereby organizations make design-relevant decisions in a market- and customer-oriented way, as well as optimizing design-relevant (enterprise) processes. It is a continuous, comprehensive activity on all levels of business performance. Design management acts at the interface of management and design and functions as a link between the platforms of technology, design, design thinking, management and marketing at the internal and external interfaces of the enterprise.

Historical development of design management

The roots of design management go back to the 1920s, and to the 1940s and 1950s. For a long time design management was used as a term but not understood correctly, since it could be attributed neither directly to design nor to management.

1940s - Design, as a function within corporations or as independent consultancies, has not always collaborated well with business. Clients and the market have traditionally viewed design as an expressive and production function, rather than a strategic asset. Designers have focused their skills and knowledge on the creation of designed artifacts, and indirectly addressed larger issues within this creative process. Designers have been uneasy about articulating their value to business in terms that business could understand. There were moves to bridge this gap. In England, the British Design Council was founded in 1944 by the British wartime government as the Council of Industrial Design, with the objective "to promote by all practicable means the improvement of design in the products of British industry".

1950s - Chicago industrialist Walter Paepcke of the Container Corporation of America founded the Aspen Design Conference in the United States after World War II as a way of bringing business and designers together to the benefit of both. In 1951, the first conference topic, "Design as a function of management", was chosen to ensure the participation of the business community. After several years, however, business leaders stopped attending, because the increased participation of designers changed the dialogue, focusing it not on the need for collaboration between business and design, but rather on the business community's failure to understand the value of design. While designers were trying to
make connections to the business community, there were business people trying to make connections to the design community. Individuals from both communities began making connections between the goals of business and how design could be a subject in the management suite. Design management's foundations are European in nature, and one of its strongest early advocates was Peter Gorb, former Director of the London Business School's Centre for Design Management.

1960s to 1970s - In 1966 the term design management was first mentioned in the Anglo-American literature, by Farr. Design management focused on how to define design as a business function and on providing the language and method for managing it effectively. In the late 1960s and into the 1970s, Gorb and others began to write articles addressed to designers, to help them learn about business, and to business professionals, to help them understand the untapped potential of design as a critical business function. "And what designers need to learn, and this is the most important thing, is the language of the business world. Only by learning that language can you effectively voice the arguments for design." (Peter Gorb) In 1975 the Design Management Institute was founded in Boston, developing in the orbit of the Harvard Business School. The DMI is an international nonprofit organization that seeks to heighten awareness of design as an essential part of business strategy and to become the leading resource and international authority on design management. Business faculties were the first to establish courses of study in design management, after several books on the topic were published. Design faculties slowly followed, taking design management into their academic curricula. Apart from the business-oriented and design-oriented courses, there are today also dedicated master's courses in design management (the University of Westminster was one of the first in Europe), as well as co-operation programmes such as the International Design Business Management Programme in Helsinki (a co-operation programme of universities from design, technology and management). In the late 1970s, design management referred to the movement in Great Britain, Europe and America which focused on design resources in corporate business.

1980s to today - In the beginning, design management was seen by many as only a short-lived fashion, but over time it has proved its worth (Design Council 2004), supported by the increasing role of design within the development of social, economic, ecological, technological and cultural processes. Design management grew in importance "through the change from a strategy of cost leadership, over quality leadership, to the strategy of performance leadership" (Koppelmann 1993). Today, one has to understand design in its entire contemporary spectrum, and not reduce it to isolated areas (product design,
communication design, industrial design, etc.). Any restriction of design to certain fields of work would not do justice to the social and economic task of design. Design management intervenes here; it organizes, mediates and structures design in an increasingly complex enterprise and economic world. 1986 saw the launch of the leading periodical devoted to design, Design Week.

Views on design management

Different views on design management - Design management is not a model that can be projected onto any enterprise, not an application with linear functionalities, and not a specific way that leads to success. Rather, design management processes are carried out by people with different authorities and training, who work in enterprises of different sizes, traditions and industries, and who have very different target groups and markets to serve. Design management is multifarious, and so are opinions about it. The design management topics give an overview of the spectrum of what design managers deal with. Many agencies limit themselves to subranges and thereby supplement their classical applied design range (see hands-on design).

Design management and marketing - Design management and marketing have many common intersections. Within marketing, which developed in the 1960s, design became ever more important. In the beginning, design was understood as a marketing instrument; it developed further, and today it can be seen on the same level as management. Today's management theories speak of an equal partnership between marketing management, product management and design management.

Design management versus design leadership - In day-to-day business, design managers often operate in the area of design leadership. But design management and design leadership are not interchangeable. Like the differences between management and leadership, they differ in their objectives, achievement of objectives, accomplishment and outcomes. Design leadership is proactive: it leads from a vision, through communication, the conveying of meaning, and collaboration, by means of motivation, enthusiasm and the fulfilment of needs, to change, innovation and creative solutions. Thereby it describes future needs and chooses a direction in order to get to that described future. In contrast, design management is reactive and responds to a given business situation by using specific skills, tools, methods and techniques. Design management and design leadership depend on each other: design management needs design leadership to know where to go, and design leadership needs design management to know how to get there.

Ranges of design management

Design management can be divided by its fields of application into three ranges: operational design management, tactical design management and strategic design management. Borja de Mozota additionally divides design management into the levels of strategy, planning, structure, finances, human resources, information, communication and research & development.

Operational design management - The goal of operational design management is to achieve the objectives set in the strategic design management part. It deals with personal leadership, emotional intelligence, and the co-operation with and management of internal communications. Table 4 shows what operational design management copes with at each function level:

Table 4: Operational design management

strategy: Translation of visions into strategies. Defining the role design plays in the brand. Translation of strategies into a design brief.
planning: Decisions about product quality and consumer experiences. Defining policies for design, products, communication and brands. Selection of external design agencies/individuals. Creation of alliances.
structure: Defining of design teams and of the people who are in touch with designers. Creation of an atmosphere for leadership and creativity.
finances: Managing of design project budgets. Estimating of design costs. Reducing of design costs, or shifting investments from cold spots to hot spots.
human resources: Developing of competences.
information: Advising of product managers and CEOs. Creating symbioses between universities and other companies.
communication: Creating an understanding of company goals among designers.
research & development: Creation of design criteria and standards of valuation for design.

Tactical design management - The goal of tactical design management is to create a structure for design in the company. It includes the managing of design departments and fills the gap between operational and strategic design management tasks. Table 5 shows what tactical design management copes with at each function level:

Table 5: Tactical design management

strategy: Coordination of the design strategy with the departments of marketing, communication and innovation. Defining quality policy.
planning: Structure of design (management) tools and language. Introducing and improving general design processes. Adaptation of design processes to innovation processes.
structure: Implementation of a design in-house service. Stabilization of the role of design in the innovation process.
finances: Managing to meet the budget plans.
human resources: Creation of an understanding of design among the company partners.
information: Creation of marketing, design and production plans. Organization of the design language across all design disciplines.
communication: Creation of an understanding of, and attention to, conscious design decisions on all levels of the enterprise.
research & development: Transformation of design theories into practical research tools.

Strategic design management - The goal of strategic design management is to support and strengthen the corporate strategy, and to create a relationship between design, strategy and the identity/culture of the company. It controls the consistency of design in the company, allows design to interact with the needs of corporate management, and focuses on design's long-term capabilities. Table 6 shows what strategic design management copes with at each function level:

Table 6: Strategic design management

strategy: Definition of a business strategy which includes design goals. Definition of design strategies which are linked to the enterprise strategy.
planning: Managing of design projects. Creation of design standards.
structure: Creation of an atmosphere for leadership, design and creativity. Support of the corporate strategy with design tools.
finances: Securing a budget high enough to be able to apply the design strategy.
human resources: Influencing the hiring and the managing of designers.
information: Informing about the design mission/vision in the company. Implementing design thinking at the top management level.
communication: Articulation of explicit and implicit communications which reflect the enterprise values. Planning, introduction and improvement of means of communication on all channels, shaping the total brand experience towards the customer.
research & development: Creating links between technology development and design strategy.
Commercializing and Managing Innovations

Characteristics of Innovations

What are the characteristics of innovations? Since there are many types of innovations and various ways to describe their dynamics, I need to simplify, even oversimplify, my description. But this simplification will help us see some patterns that innovations take.

A Taxonomy of Innovations

First, let's look at the types of innovation. I have classified innovations along three dimensions: discontinuous and incremental; product, process and conceptual; and replacement and enhancement.

Discontinuous and Incremental Innovations

Discontinuous innovations are not on a continuum with previous technologies; they involve the application of a new technology. The printing press, telegraph, telephone and computers form a series of discontinuous innovations. Such innovations cause a dramatic shift in the way people or firms perform some activity. Christensen, who prefers the term "disruptive innovations", emphasizes the risk that they pose for high-performing incumbent firms. Each discontinuous innovation disrupts the existing technology. The printing press, for example, put scribes out of business. Discontinuous innovations emerge for two primary reasons. The first occurs when a technology exceeds customers' needs and a simpler, cheaper, less powerful technology is adequate to meet them. Mainframe computers provided far more performance and functionality than most people needed, and the personal computer industry disrupted the mainframe industry. The second reason involves reaching technical limitations. If a technology is inadequate for customers' needs but has reached a technical limitation, then scientists may apply a different technology to replace the current one. During World War II, governments found the power and speed of propeller-driven aircraft to be inadequate; jet engines met their increased needs for air superiority. For hundreds of years, people communicated with ships at sea using visible mechanisms such as flags and lights and audible mechanisms such as bells and horns. In the nineteenth century, navies found this
technology inadequate; it had reached its limits. In the twentieth century, radio replaced the earlier modes of naval communication. Discontinuous innovations allowed progress. The commercial use of the Internet provides a clear and obvious example of a discontinuous innovation. The Internet has fundamentally changed the way that businesses communicate with each other. Discontinuous innovations typically disrupt an industry and occasionally disrupt the way consumers engage in some activity. Although an innovation disrupts existing systems and causes chaos, it still follows regular and predictable patterns. We will explore these patterns in the next section. On the other side of the spectrum are incremental innovations. Once a technology is commercially accepted, firms compete by incrementally adding functionality and improving performance. In the late 1980s thousands of people used email. Some of the incremental innovations added very useful functionality to email, such as compatibility among disparate email systems and the standardization of addressing conventions, so that users no longer had to type arcane address symbols, including % and $, to delimit address names and to direct one email system to communicate with a different system. Most innovations are incremental; people and firms continue to perform an activity in a familiar way. The innovations simply improve performance, functionality or ease of use.

Product, Process and Conceptual Innovations

Another way to classify innovations is to consider whether they are product, process or conceptual innovations. Product innovations include such things as the personal computer. Process innovations include Henry Ford's assembly line, Walmart's supply chain processes and Dell's build-to-order manufacturing processes. Conceptual innovations include Copernicus's theory of astronomy.

Replacement and Enhancement Innovations

Some innovations enhance the ways in which people perform some activity. Although Hollywood feared that the introduction of videotapes would mean the end of the movie industry, it in fact enhanced people's opportunities for entertainment. Citibank introduced ATMs in the early 1970s, enhancing the ways in which people can interact with their banking systems and extending the reach of a given bank worldwide.


Table 7 shows discontinuous and incremental product and process innovations.


            Discontinuous                          Incremental
Product     Calculator                             New versions of software
            Videotapes                             Different models of aircraft
            Jet-propelled aircraft
Process     Walmart's supply chain                 Continuous process improvement
            Dell's build-to-order manufacturing

Patterns and Phases of Innovation

A successful innovation typically follows a predictable pattern of behavior. It begins with a discontinuous innovation and proceeds through several phases until the technology is mature, when it is once again disrupted by a discontinuous innovation.

Foster introduces the notion of an S-curve comparing performance and effort: a graph of the relationship between the effort put into improving a product or process and the results one gets back for the investment. He is concerned with limits; reaching the limits of the technology spells the flattening out at the top of the S-curve. It becomes harder and harder to improve the performance of the technology. At the limit of the technology, no matter how hard you try, you cannot continue to make improvements.

Rogers also looks at an S-curve, but with different axes: percent of adoption vs. time. In his work on the diffusion of innovations, he characterizes innovations according to relative advantage, compatibility, complexity, trialability, and observability. These characteristics help explain the rate of adoption. Rogers describes the take-off phase for an innovation as being driven by social forces and interpersonal network exchanges. He describes the dissemination of the use of hybrid corn and the social network effect of farmers learning from each other.

Metcalfe's Law, named after Robert Metcalfe, the inventor of Ethernet networking technology, says that the value of a network grows in proportion to the square of the number of users. The network effect plays a significant role in the pattern and dynamics of innovations which involve communication, such as the fax machine, telephones and software. Once an innovation reaches a critical mass, its acceptance accelerates.

We can also see the S-curve as a graph of the total number of adopters on the Y-axis and time on the X-axis. Three inflection points divide the S-curve into four distinct phases.
The first inflection point represents the point in time at which entrepreneurs see the commercial value of the innovation. The second inflection point occurs when a standard emerges, and the third occurs when users' needs are met or exceeded, or when no further performance improvements can be expected.
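The S-curve and network-effect ideas above lend themselves to a small numerical sketch. The code below is purely illustrative: a logistic function stands in for the adoption S-curve, and the market size, growth rate and midpoint parameters are arbitrary assumptions, not figures from the text.

```python
import math

def adopters(t, market_size=100_000, growth_rate=0.8, midpoint=10.0):
    """Logistic S-curve: cumulative adopters at time t.

    Adoption starts slowly, accelerates around the midpoint (the steepest
    part of the curve), and flattens as the market saturates, mirroring
    the flattening Foster describes at the limits of a technology.
    """
    return market_size / (1.0 + math.exp(-growth_rate * (t - midpoint)))

def network_value(n):
    """Metcalfe's Law: network value grows with the square of the user count."""
    return n * n

# Adoption accelerates once a critical mass is reached, and the
# n-squared network value rises far faster than the user count itself.
for t in (0, 5, 10, 15, 20):
    n = adopters(t)
    print(f"t={t:2d}  adopters={n:9.0f}  relative network value={network_value(n):.2e}")
```

Note that a pure logistic curve has only one mathematical inflection point; the three inflection points in the text are qualitative breaks (commercialization, standard, saturation) overlaid on this general shape.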


Managers face different challenges at different phases of innovation. In order to understand the challenges, we must first understand the dynamics. In this section, I present the regular patterns exhibited throughout history. Each cycle begins with a discontinuous innovation. If the innovation is successful, it proceeds through subsequent phases and includes later incremental innovations.

The Innovation Phase

The first phase is the creation of the innovation. Communities tend to see the innovation as a toy with little or no commercial value, and to see the innovators as hobbyists or enthusiasts. 3M managers were skeptical about an adhesive that made only intermittent contact when their scientists invented the adhesive for Post-It Notes in the 1970s. When Samuel Morse presented the United States Congress with a prototype of his telegraph machine in 1838, his audience did not take him seriously. Even after Morse received government funding and set up a telegraph line between Washington, D.C. and Baltimore, people still saw little practical use for it. But Morse and his partners implicitly understood the network effect. They pushed ahead and built a network of telegraph lines between U.S. cities, hoping that customers would begin to appreciate the value of the telegraph.

Firms tend to see the innovation as inadequate to meet their customers' needs. In the 1970s the mainframe computer industry viewed personal computers as toys for hobbyists who purchased kits to build them. Few firms saw the personal computer as having any real commercial value.

Chaos and Commercialization Phase

The first inflection point on the S-curve occurs when entrepreneurs see the commercial value of the innovation and try to build a business around it. As entrepreneurs saw the commercial value of the telegraph, dozens of companies began building telegraph networks, stringing lines haphazardly across the United States.
To avoid patent infringements, companies developed unique telegraph systems, incompatible with each other. Creating the telegraph infrastructure was expensive, and dozens of companies failed before becoming profitable. In Europe and England, the telegraph systems grew with government sponsorship, but each country's system was incompatible with that of its neighbors. Standards did not yet exist.

As entrepreneurs saw the value of PCs, many companies emerged to give us the KayPro, the Commodore, the Apple Lisa, the DEC Rainbow and the Victor 9000, none of which was compatible with the others. Software written for one did not work on any of the others, and they did not communicate with one another. This is not surprising: just as we have no laws to govern something we have never imagined before, we have no standards to guide discontinuous innovations.


The characteristics of the Chaos and Commercialization Phase are hype, disappointments, fear, suspicion, many entrants, incompatible systems and no standards. An indicator that this phase is coming to a close and the next phase is about to begin is that governments see the innovation as important to national interests. Both governments and consumers call for standards and interoperability.

Standards Phase

The Standards Phase has three characteristics: the emergence of a standard or dominant design, rapid growth, and industry consolidation. The industry as a whole reaches a critical mass, grows rapidly, and all participants aligned with the standard benefit. During this phase, incremental innovations are important.

The 1865 international conference on the telegraph yielded the International Telegraph Union, still in existence today. This early standards body worked to unify the many disparate systems and marked a turning point in the telegraph industry. Its work helped to expand the telegraph throughout the world at a remarkable speed. The pace of adoption has accelerated since the mid-nineteenth century.

In the computer industry, IBM introduced the IBM PC in 1981. With its strong brand and high level of trust from the business world, it created a standard overnight. This marked the turning point for the personal computer industry. Those participating in the standard flourished: Compaq and Dell began building PCs; Microsoft, Lotus, Borland, Oracle and WordPerfect began building software; Intel's processor business grew rapidly. Service providers began offering custom software to run businesses. Once a standard emerges, industry consolidation follows quickly. Almost all the companies building personal computers (other than IBM PC-compatibles) failed in 1983. In general, at this phase, companies whose products are not aligned with the de facto standard fail; companies whose products are aligned with the standard grow.
Companies compete during this phase by adding functionality through continuous, incremental improvements to their products.

Market Maturity Phase

The final inflection point on the S-curve comes when products meet or exceed consumers' needs for functionality, or when the technology has reached its natural limits. Competition shifts to customer service and to production and distribution efficiencies. Process innovation is most important at this phase. The personal computer industry is in this phase now. Dell, whose strength is in just-in-time production rather than in enhancing functionality through R&D, has a competitive edge in this phase.


Capability Framework

[Figure: Early Phase Capabilities]

[Figure: Late Phase Capability]

The Capability Bridge

                        Early Capabilities                     Mature Capabilities
Types of innovations    Discontinuous                          Incremental, process
Customer involvement    Close relationship with customers      Good relations with customers,
                                                               but with more distance
Product focus           Features                               Cost
Product driver          Inventor/engineer drives               Product completion is a more
                        development to completion              routine team effort
Product breadth         Provide end-to-end solution            Focus on core competency and enable
                                                               partners for non-core areas
Scheduling              Flexible, adaptable                    Efficient, process-oriented
Posture                 Aggressive: take risks,                Defensive: defend present position;
                        attack existing technology             ultimately, need to attack yourself


Why is it so difficult for companies to have both kinds of capabilities? Clearly the capabilities are in tension with each other. Tushman et al. use the term "ambidextrous organization" to describe the approach managers must take to handle both the entrepreneurial and mature aspects of a firm. Managers understand that their firm needs innovations in order to grow, and they are often supportive of innovative efforts. But all firms deal with the reality of limited resources, and during debates on resource allocation, established managers will attempt to control resources even if that means denying them to the entrepreneurial units of the business.

In addition to the inherent tension between capabilities such as flexibility and efficiency, managers often undermine innovation because of what I call the Chronos Syndrome. In Greek mythology, Chronos feared that his children would overthrow him; he himself had defeated and overthrown his father. When Chronos' son Zeus grew up, he too defeated his father and became the king of the gods. Managers face this same issue: it is difficult to support the efforts that will lead to your own demise.

Ambidextrous organizations have the capabilities to support simultaneous discontinuous and incremental innovations. They are inherently unstable, just as Chronos' hold on the universe was unstable. They require leadership that can see the longer-term value that the ability to produce different kinds of innovation provides. Mastering the technology transfer from labs to business units is difficult, but it is aided by close relations between the innovators and the clients or customers. Defining workable solutions, as Raytheon and 3M have done, provides a long-term advantage. Acquisitions are notoriously perilous, but they provide another alternative for firms such as Cisco and IBM. Doing basic research and coming up with innovations is laudable, but firms that fail to take the innovation to the next stage lose the value of the innovation.
Summary

An overview of flexibility and change in management, along with design and innovation management, has been presented. An extensive discussion on business appraisal of technology potentials has also been included.

Questions

1. What is the effect of flexibility in management on productivity?
2. Is there a relationship between the need for sourcing technology and the choice of technology? Explain.
3. Elucidate the means of increasing productivity in research and design.
4. Perform a business appraisal of an emerging technology of your choice.
5. Define innovation. How does it help in the survival of a business? Explain.


UNIT III


BUSINESS STRATEGY AND TECHNOLOGY STRATEGY


Introduction

This unit throws light on technology planning, strategy and alliances. The objectives and advantages of joint ventures are also discussed, three cases on technology bridging are illustrated, and a brief discussion on corporate venturing is documented.

Learning Objectives

To have a fair idea about
- Variables in global competitiveness
- Basic principles of technology planning
- Consensus building and buy-in
- Typical structure of an (IT) technology strategy
- When joint ventures are used
- Common uses of corporate venturing

3.1 GLOBAL COMPETITIVENESS The Global Competitiveness Report is a yearly report published by the World Economic Forum. The first report was released in 1979. The 2007-2008 report covers 131 major and emerging economies.

The report assesses the ability of countries to provide high levels of prosperity to their citizens. This in turn depends on how productively a country uses available resources. Therefore, the Global Competitiveness Index measures the set of institutions, policies, and factors that set the sustainable current and medium-term levels of economic prosperity. It has been widely cited and used by many scholarly and peer-reviewed articles. Somewhat similar annual reports are the Ease of Doing Business Index and the Indices of Economic Freedom. They also look at factors that affect economic growth, but not as many as the Global Competitiveness Report.

One part of the report is the Executive Opinion Survey, a survey of a representative sample of business leaders in their respective countries. Respondent numbers have increased every year and currently total just over 11,000 across 125 countries.

The report ranks the world's nations according to the Global Competitiveness Index. The report states that it is based on the latest theoretical and empirical research. It is made up of over 90 variables, of which two thirds come from the Executive Opinion Survey and one third from publicly available sources such as the United Nations. The variables are organized into nine pillars, with each pillar representing an area considered an important determinant of competitiveness.

The report notes that as a nation develops, wages tend to increase, and that in order to sustain this higher income, labor productivity must improve for the nation to remain competitive. In addition, what creates productivity in Sweden is necessarily different from what drives it in Ghana. Thus, the GCI separates countries into three specific stages: factor-driven, efficiency-driven, and innovation-driven, each implying a growing degree of complexity in the operation of the economy.

In the factor-driven stage, countries compete based on their factor endowments, primarily unskilled labor and natural resources. Companies compete on the basis of prices and sell basic products or commodities, with their low productivity reflected in low wages. At this stage of development, competitiveness hinges mainly on well-functioning public and private institutions (pillar 1), appropriate infrastructure (pillar 2), a stable macroeconomic framework (pillar 3), and good health and primary education (pillar 4). As wages rise with advancing development, countries move into the efficiency-driven stage of development, when they must begin to develop more efficient production processes and increase product quality.
At this point, competitiveness becomes increasingly driven by higher education and training (pillar 5), efficient markets (pillar 6), and the ability to harness the benefits of existing technologies (pillar 7). Finally, as countries move into the innovation-driven stage, they are only able to sustain higher wages and the associated standard of living if their businesses are able to compete with new and unique products. At this stage, companies must compete by producing new and different goods using the most sophisticated production processes (pillar 8) and through innovation (pillar 9).

Thus, the impact of each pillar on competitiveness varies across countries as a function of their stage of economic development. Therefore, in the calculation of the GCI, pillars are given different weights depending on the per capita income of the nation. The weights used are the values that best explain growth in recent years. For example, the sophistication and innovation factors contribute 10% to the final score in factor- and efficiency-driven economies, but 30% in innovation-driven economies. Intermediate values are used for economies in transition between stages.
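The stage-dependent weighting described above can be sketched in code. This is an illustrative sketch only: the 10%/30% innovation weights come from the text, while the remaining weights, the three-way subindex grouping, and the function name `gci_score` are placeholder assumptions, not the Forum's published methodology.

```python
def gci_score(subindex_scores, stage):
    """Illustrative GCI-style weighted aggregate on the report's 1-7 scale.

    subindex_scores: dict with 'basic', 'efficiency' and 'innovation' keys.
    The 0.10 innovation weight for factor- and efficiency-driven economies
    and 0.30 for innovation-driven economies follows the text; the other
    weights are made-up placeholders chosen only to sum to 1.
    """
    weights_by_stage = {
        "factor-driven":     {"basic": 0.60, "efficiency": 0.30, "innovation": 0.10},
        "efficiency-driven": {"basic": 0.40, "efficiency": 0.50, "innovation": 0.10},
        "innovation-driven": {"basic": 0.20, "efficiency": 0.50, "innovation": 0.30},
    }
    weights = weights_by_stage[stage]
    return sum(weights[k] * subindex_scores[k] for k in weights)

scores = {"basic": 5.0, "efficiency": 4.0, "innovation": 3.0}
print(gci_score(scores, "factor-driven"))      # innovation weighs little here
print(gci_score(scores, "innovation-driven"))  # same scores, basics weigh less
```

The point of the sketch is that identical pillar scores can yield different aggregate scores depending on the economy's development stage.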

Variables

1. Institutions
   A. Public institutions
      1. Property rights
         1.01 Property rights
      2. Ethics and corruption
         1.02 Diversion of public funds
         1.03 Public trust of politicians
      3. Undue influence
         1.04 Judicial independence
         1.05 Favoritism in decisions of government officials
      4. Government inefficiency (red tape, bureaucracy and waste)
         1.06 Wastefulness of government spending
         1.07 Burden of government regulation
      5. Security
         1.08 Business costs of terrorism
         1.09 Reliability of police services
         1.10 Business costs of crime and violence
         1.11 Organized crime
   B. Private institutions
      1. Corporate ethics
         1.12 Ethical behavior of firms
      2. Accountability
         1.13 Efficacy of corporate boards
         1.14 Protection of minority shareholders' interests
         1.15 Strength of auditing and accounting standards

2. Infrastructure
   2.01 Overall infrastructure quality
   2.02 Railroad infrastructure development
   2.03 Quality of port infrastructure
   2.04 Quality of air transport infrastructure
   2.05 Quality of electricity supply
   2.06 Telephone lines (hard data)


3. Macroeconomy
   3.01 Government surplus/deficit (hard data)
   3.02 National savings rate (hard data)
   3.03 Inflation (hard data)
   3.04 Interest rate spread (hard data)
   3.05 Government debt (hard data)
   3.06 Real effective exchange rate (hard data)

4. Health and primary education
   A. Health
      4.01 Medium-term business impact of malaria
      4.02 Medium-term business impact of tuberculosis
      4.03 Medium-term business impact of HIV/AIDS
      4.04 Infant mortality (hard data)
      4.05 Life expectancy (hard data)
      4.06 Tuberculosis prevalence (hard data)
      4.07 Malaria prevalence (hard data)
      4.08 HIV prevalence (hard data)
   B. Primary education
      4.09 Primary enrolment (hard data)

5. Higher education and training
   A. Quantity of education
      5.01 Secondary enrolment ratio (hard data)
      5.02 Tertiary enrolment ratio (hard data)
   B. Quality of education
      5.03 Quality of the educational system
      5.04 Quality of math and science education
      5.05 Quality of management schools
   C. On-the-job training
      5.06 Local availability of specialized research and training services
      5.07 Extent of staff training

6. Market efficiency
   A. Good markets: distortions, competition, and size
      1. Distortions
         6.01 Agricultural policy costs
         6.02 Efficiency of legal framework
         6.03 Extent and effect of taxation
         6.04 Number of procedures required to start a business (hard data)
         6.05 Time required to start a business (hard data)
      2. Competition
         6.06 Intensity of local competition
         6.07 Effectiveness of antitrust policy
         6.08 Imports (hard data)
         6.09 Prevalence of trade barriers
         6.10 Foreign ownership restrictions
      3. Size
         0.00 GDP exports + imports (hard data)
         6.11 Exports (hard data)
   B. Labor markets: flexibility and efficiency
      1. Flexibility
         6.12 Hiring and firing practices
         6.13 Flexibility of wage determination
         6.14 Cooperation in labor-employer relations
      2. Efficiency
         6.15 Reliance on professional management
         6.16 Pay and productivity
         6.17 Brain drain
         6.18 Private sector employment of women
   C. Financial markets: sophistication and openness
      6.19 Financial market sophistication
      6.20 Ease of access to loans
      6.21 Venture capital availability
      6.22 Soundness of banks
      6.23 Local equity market access

7. Technological readiness
   7.01 Technological readiness
   7.02 Firm-level technology absorption
   7.03 Laws relating to ICT
   7.04 FDI and technology transfer
   7.05 Cellular telephones (hard data)
   7.06 Internet users (hard data)

   7.07 Personal computers (hard data)

8. Business sophistication
   A. Networks and supporting industries
      8.01 Local supplier quantity
      8.02 Local supplier quality
   B. Sophistication of firms' operations and strategy
      8.03 Production process sophistication
      8.04 Extent of marketing
      8.05 Control of international distribution
      8.06 Willingness to delegate authority
      8.07 Nature of competitive advantage
      8.08 Value-chain presence

9. Innovation
   9.01 Quality of scientific research institutions
   9.02 Company spending on research and development
   9.03 University/industry research collaboration
   9.04 Government procurement of advanced technology products
   9.05 Availability of scientists and engineers
   9.06 Utility patents (hard data)
   9.07 Intellectual property protection
   9.08 Capacity for innovation

3.2 TECHNOLOGY PLANNING The following is a working definition of technology planning established by the RTEC Technology Plan Task Force: a technology plan serves as a bridge between established standards and classroom practice. It articulates, organizes, and integrates the content and processes of education in a particular discipline with the integration of appropriate technologies. It facilitates multiple levels of policy and curriculum decision making, especially in school districts, schools, and educational organizations that allow for supportive resource allocations.

In general, planning is an ongoing process that translates organizational, public policy, and technology needs into concrete actions. It allows educational organizations to take advantage of technology innovations while minimizing the negative impact of unexpected challenges. Planning provides a road map for the implementation of technology and can result in more efficient expenditure of limited resources and an improvement in student achievement.

Technology plans reflect the policy and educational environment of a state or district. However, a technology plan by itself is not enough to ensure change. The RTEC Technology Plan Task Force believes that the processes of technology plan development, implementation, and evaluation are essential components of educational reform. A well-designed technology plan is a dynamic tool providing guidance for local innovation. Technology plans also represent opportunities for dialogue and professional development that encourage local decision making.

Basic Principles of Technology Planning

The Guiding Questions for Technology Planning, Version 1.0, tool is designed to help begin a technology planning process, select a planning model, and move the process forward. It is most useful within a larger planning process, not simply as an add-on or one-time discussion. A good technology planning process can be summed up in six or seven basic principles, adapted by Hopey and Harvey-Morgan (1995) and based in part on a model developed by Shirley (1988). Technology planning for education should:

- Be an organized and continuous process, use a simple, straightforward planning model, and result in a document that improves how technology is used for instruction, management, assessment, and communications.
- Take into account the mission and philosophy of the organization and be owned by that organization, its administrators, and instructors. (While outside assistance, such as that provided by a consultant, can bring a broad perspective and knowledgeable opinions to the technology planning process, the process must have the commitment of decision makers and staff.)
- Be broad but realistic in scope, with economical and technically feasible solutions.
- Involve all the stakeholders, including administrators, instructors, staff members, students, parents, community leaders, and technology experts with experience in education.
- Identify the strengths and weaknesses of the organization and how each will impact the implementation of technology.
- Formalize the procedures and methods for making technology decisions, including the setting of priorities and the purchase, evaluation, upgrading, and use of technology.


- Be driven by educational goals and objectives rather than by technological developments.

3.3 TECHNOLOGY STRATEGY A technology strategy (as in information technology) is a planning document that explains how information technology should be utilized as part of an organization's overall business strategy. The document is usually created by an organization's Chief Information Officer (CIO) or technology manager and should be designed to support the organization's overall business plan.

Consensus building and buy-in

One of the principal purposes of creating a technology strategy is to build consensus and stakeholder buy-in. There are many methods for this, such as the Delphi method. Organizations that have the option of using a non-biased outside facilitator frequently build consensus quickly using these processes. Successful strategies take into account the collective knowledge of many levels within an organization and attempt to remove the bias of one or more individuals. The use of anonymous feedback has been shown to prevent highly destructive passive-aggressive employee behavior.

Typical structure of a (IT) technology strategy

The following are typical sections of a technology strategy:

- Executive Summary: a single-page summary of the IT strategy
  - High-level organizational benefits
  - Relationship to overall business strategy
  - Resource summary: staffing, budgets, summary of key projects
- Internal Capabilities
  - IT Project Portfolio Management: an inventory of current projects being managed by the information technology department and their status. (Note: it is not common to report current project status inside a future-looking strategy document.)
  - Current IT departmental strengths and weaknesses
- External Forces
  - Summary of changes driven from outside the organization
  - Rising expectations of users (example: availability of open-source learning management systems)
  - List of new IT projects requested by the organization
- Opportunities
  - Description of new cost-reduction or efficiency-increase opportunities


  - Description of how Moore's Law (faster processors, networks or storage at lower costs) will impact the organization's return on investment (ROI) for technology
- Threats
  - Description of disruptive forces that could cause the organization to become less profitable or competitive
  - Analysis of IT usage by the competition
- IT Organization Structure and Governance
  - IT organization roles and responsibilities
  - IT role descriptions
  - IT governance
- Milestones
  - List of monthly, quarterly or mid-year milestones and review dates to indicate whether the strategy is on track
  - List of milestone names, deliverables and metrics

Audience

A technology strategy document is usually designed to be read by non-technical stakeholders involved in business planning within an organization. It should be free of technical jargon and information technology acronyms. The IT strategy should also be presented to or read by internal IT staff members. Many organizations circulate prior-year versions to internal IT staff members for feedback before new annual IT strategy plans are created.

One critical integration point is the interface with an organization's marketing plan. The marketing plan frequently requires the support of a web site to create an appropriate on-line presence. Large organizations frequently have complex web site requirements such as web content management.

Presentation

The CIO, CTO or IT manager frequently creates a high-level overview presentation designed to be presented to stakeholders. Many experienced managers try to summarize the strategy in 5-7 slides and present the plan in under 30 minutes to a board of directors. It is also common to produce a professionally bound booklet version of the strategy: something physical that IT teams can refer to, rather than the more disposable presentation slides.

Scope and size

Although many companies write an overall business plan each year, a technology strategy may cover developments somewhere between three and five years into the future.
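The Moore's Law line item in the strategy outline is essentially an arithmetic claim that a strategy author can make concrete. The sketch below assumes a fixed doubling period for price/performance; the two-year figure and the function name are illustrative assumptions, not figures from any report.

```python
def projected_cost(cost_today, years_out, doubling_period_years=2.0):
    """Project the cost of a fixed amount of computing capacity.

    If price/performance doubles every `doubling_period_years` (a Moore's
    Law-style assumption; 18-24 months is the commonly cited range), then
    the cost of the same capacity halves on the same schedule.
    """
    return cost_today * 0.5 ** (years_out / doubling_period_years)

# A $100,000 storage tier today, re-bought at constant capacity:
for year in (0, 2, 4, 6):
    print(f"year {year}: ${projected_cost(100_000, year):,.0f}")
```

A projection like this is how a strategy document can turn "hardware gets cheaper" into a concrete ROI assumption for budget planning, while flagging the doubling period itself as an assumption to revisit.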

Relationship between strategy and enterprise technology architecture

A technology strategy document typically refers to, but does not duplicate, an overall enterprise architecture. The technology strategy may refer to:
- a high-level view of the logical architecture of information technology systems
- a high-level view of the physical architecture of information technology systems

3.4 TECHNOLOGY ALLIANCES

Technology Alliances Extend the Value of Enterprise Reporting Applications

Actuate works with a select number of strategic partners to help organizations better utilize their information assets to create world-class enterprise reporting applications. Actuate's strategic partners develop applications that interface with Actuate products to enhance and extend the range of capabilities for joint customers. Together, Actuate and its partners build integrated solutions to enhance Actuate's interoperability with other applications. Actuate and its industry-leading partners encourage companies to improve corporate performance by creating enterprise reporting applications that are adopted by 100% of the targeted users.

Actuate and its Partners Improve Business or Industry

Customers leverage Actuate's tightly integrated partnerships in ERP, applications, database, development environment and other key technology areas to build applications that address fundamental business processes: streamlining financial management, improving sales tracking and management, providing better visibility into customer accounts, and delivering that same account information directly to end-customers for self-service. Organizations across industries have realized the business benefits that come from bringing customers, partners, and employees closer to the information that drives day-to-day business operations.

3.5 JOINT VENTURES A joint venture (often abbreviated JV) is an entity formed between two or more parties to undertake economic activity together. The parties agree to create a new entity by both contributing equity, and they then share in the revenues, expenses, and control of the enterprise. The venture can be for one specific project only, or a continuing business relationship such as the Sony Ericsson joint venture. This is in contrast to a strategic alliance, which involves no equity stake by the participants and is a much less rigid arrangement.


The phrase generally refers to the purpose of the entity and not to a type of entity. Therefore, a joint venture may be a corporation, limited liability company, partnership or other legal structure, depending on considerations such as tax and tort liability.

When are joint ventures used?

Joint ventures are common in the oil and gas industry, and are often cooperative ventures between a local and a foreign company (about three quarters are international). A joint venture is often seen as a very viable business alternative in this sector, as the companies can complement their skill sets while the arrangement offers the foreign company a geographic presence. Studies show a failure rate of 30-61%, and that 60% of joint ventures failed to start or faded away within five years (Osborn, 2003). It is also known that joint ventures in less-developed countries show greater instability, and that JVs involving government partners have a higher incidence of failure (private firms seem to be better equipped to supply key skills, marketing networks, etc.). Furthermore, JVs have been shown to fail badly under highly volatile demand and rapid changes in product technology.

Some countries, such as the People's Republic of China and to some extent India, require foreign companies to form joint ventures with domestic firms in order to enter a market. This requirement often forces technology transfers and managerial control to the domestic partner.

Another form joint ventures may take are the Joint Ventures (JVs) in the U.S., Canada, and Mexico dedicated to the conservation of priority bird species and their associated habitats. Each of these JVs differs in how it goes about its mission, but all try to follow the principles of Strategic Habitat Conservation (SHC), which combines biological planning, conservation design, conservation delivery, and evaluation and monitoring.
Gulf Coast Joint Venture, Lower Mississippi Valley Joint Venture and Prairie Pothole Joint Venture are just three of the 20-plus JVs found in North America.

Brokers

In addition, joint ventures are facilitated by joint venture brokers, people who put together the two parties that participate in a joint venture. A joint venture broker then often makes a percentage of the profit from the deal between the two parties.

Reasons for forming a joint venture

Internal reasons
1. Build on company's strengths
2. Spreading costs and risks
3. Improving access to financial resources

4. Economies of scale and advantages of size
5. Access to new technologies and customers
6. Access to innovative managerial practices

Competitive goals
1. Influencing structural evolution of the industry
2. Pre-empting competition
3. Defensive response to blurring industry boundaries
4. Creation of stronger competitive units
5. Speed to market
6. Improved agility

Strategic goals
1. Synergies
2. Transfer of technology/skills
3. Diversification

What is a Joint Venture?

Joint venture companies are the most preferred form of corporate entity for doing business in India. There are no separate laws for joint ventures in India. Companies incorporated in India, even with up to 100% foreign equity, are treated the same as domestic companies. A joint venture may be any of the business entities available in India.

A typical joint venture is one where:
1. Two parties (individuals or companies) incorporate a company in India. The business of one party is transferred to the company and, as consideration for such transfer, shares are issued by the company and subscribed by that party. The other party subscribes for the shares in cash.
2. The above two parties subscribe to the shares of the joint venture company in an agreed proportion, in cash, and start a new business.
3. The promoter shareholder of an existing Indian company and a third party, who/which may be an individual or a company, one of them non-resident or both residents, collaborate to jointly carry on the business of that company, and its shares are taken by the said third party through payment in cash.

Some practical aspects of the formation of joint venture companies in India, and the prerequisites which the parties should take into account, are enumerated hereinafter.

Foreign companies are also free to open branch offices in India. However, a branch of a foreign company attracts a higher rate of tax than a subsidiary or a joint venture company. The liability of the parent company is also greater in the case of a branch office.

Government Approvals for Joint Ventures

All joint ventures in India require governmental approval if a foreign partner or an NRI or PIO partner is involved. The approval can be obtained from either the RBI or the FIPB. If a joint venture is covered under the automatic route, then the approval of the Reserve Bank of India is required. In other cases, not covered under the automatic route, a special approval of the FIPB is required.

The Government has outlined 37 high-priority areas covering most of the industrial sectors. Investment proposals involving up to 74% foreign equity in these areas receive automatic approval within two weeks. An application to the Reserve Bank of India is required. Please see Foreign Investment in India - Sector wise Guide for sector-wise guidelines under the automatic route. Besides the 37 high-priority areas, automatic approval is available for 74% foreign equity holdings in setting up international trading companies engaged primarily in export activities.

Approval of foreign equity is not limited to 74% or to high-priority industries. Greater than 74% equity and areas outside the high-priority list are open to investment, but government approval is required. For these greater equity investments, or for areas of investment outside the high-priority list, an application in form FC (SIA) has to be filed with the Secretariat for Industrial Approvals. A response is given within 6 weeks. Full foreign ownership (100% equity) is readily allowed in power generation, coal washeries, electronics, Export Oriented Units (EOUs) and units in the Export Processing Zones (EPZs).
For major investment proposals, or for those that do not fit within the existing policy parameters, there is the high-powered Foreign Investment Promotion Board (FIPB). The FIPB is located in the office of the Prime Minister and can provide single-window clearance to proposals in their totality, without being restricted by any predetermined parameters.

Foreign investment is also welcomed in many infrastructure areas such as power, steel, coal washeries, luxury railways and telecommunications. The entire hydrocarbon sector, including exploration, production, refining and marketing of petroleum products, has now been opened to foreign participation. The Government has recently allowed foreign investment up to 51% in mining for commercial purposes and up to 49% in the telecommunication sector. The Government is also examining a proposal to do away with the stipulation that foreign equity should cover the foreign exchange needs for the import of capital goods. In view of the country's improved balance of payments position, this requirement may be eliminated.
How to Enter into a Joint Venture Agreement?

Selection of a good local partner is the key to the success of any joint venture. Once a partner is selected, generally a Memorandum of Understanding or a Letter of Intent is signed by the parties, highlighting the basis of the future joint venture agreement. A Memorandum of Understanding and a Joint Venture Agreement must be signed after consulting lawyers well versed in international laws and multi-jurisdictional laws and procedures. Before signing the joint venture agreement, the terms should be thoroughly discussed and negotiated to avoid any misunderstanding at a later stage. Negotiations require an understanding of the cultural and legal background of the parties.

Before signing a Joint Venture Agreement, the following must be properly addressed:
Dispute resolution agreements
Applicable law
Force majeure
Holding of shares
Transfer of shares
Board of Directors
General meeting
CEO/MD
Management Committee
Important decisions requiring the consent of partners
Dividend policy
Funding
Access
Change of control
Non-compete
Confidentiality
Indemnity
Assignment
Break of deadlock
Termination

The Joint Venture Agreement should be subject to obtaining all necessary governmental approvals and licences within a specified period.

Drafting International Joint Venture Agreements

Madaan & Co. has helped US companies and foreign companies in setting up their joint venture operations in India and other countries. Business joint ventures are more likely to be beneficial if joint venture entry strategies are carefully formulated. Negotiating joint ventures properly is very important for a win-win joint venture, and proper drafting of joint venture agreements is very important for the success of any joint venture. We can help you in setting up your joint venture: from entry strategies, to negotiations, to drafting agreements, to compliance programs.

Escorts Construction ties up with Altec

Escorts Construction Equipment Ltd (ECEL), a major player in the construction and material handling sectors, will enter the growing earth moving equipment business in the 2008-09 fiscal. Last week, the Faridabad-based company, part of the Escorts Group, signed up with Altec of the US, a global major, and is establishing a large manufacturing facility in Ballabgarh to give a push to these diversification plans. While the Ballabgarh plant, its fourth (the others are in Faridabad, Sahibabad and Bhiwadi) and largest integrated seamless construction facility, is expected to commence operations in April 2008, another small unit is also being set up in Rudrapur, according to Mr Rajesh Sharma, Associate Vice-President (Marketing & Sales).

ECEL, with a turnover of Rs 417 crore (2006-07), had earlier exited its joint venture with JCB for earth moving equipment by selling its stake. "The decision to go on our own is based on the huge opportunities the sector offered and also to turn ECEL into an integrated player with strengths in all segments of construction equipment," Mr Sharma told Business Line here. The agreement signed with Altec will allow ECEL to bring in a host of equipment to serve the power network maintenance business, which promises to grow fast in India.
"With the projected addition of 1,00,000 MW of power capacity, power utilities will find maintenance easier with these latest equipment, which do not need a power shutdown," he explained. ECEL, which has a total capacity of 5,000 machines per year, hopes to expand to a capacity of 15,000 machines per year once the two new plants are completed, said Mr Sharma, who was here to launch the company's first crawler crane. On the company's own growth, he said the Escorts TRX 2319, the largest PickNCarry crane in the world, would be exported soon.

"We are discussing with an Australian buyer (I cannot disclose the name now), who has shown interest in buying 100 units in the first year," he said. The company sold the first machine to the Lanco Group recently. ECEL is geared up to tap opportunities being thrown up by the growth of the infrastructure and real estate sectors, he added. According to industry estimates, the construction and earth moving equipment industry is at around Rs 9,000 crore and is expected to grow to about Rs 40,000 crore by the year 2015.

Tatas, Boeing to float joint venture for aerospace parts in India

The Tata group and the US aircraft major Boeing are forming a joint venture company for making defence-related aerospace components in India. The components are for export to Boeing and its international customers. The joint venture hopes to export components worth $500 million initially. Under the memorandum of agreement signed by Boeing and the Tata group, it is contemplated that the joint venture company will be established by June, a press release issued today said. A research and development centre for advanced manufacturing technologies is also contemplated, said a statement issued by the Tatas.

"This joint venture between Tata and Boeing is an important part of our strategy to build capabilities in defence and aerospace," the statement quoted Mr Ratan Tata, Chairman of the Tata Group, as saying. "I look forward to the joint venture becoming a world-class facility in India." "The joint venture will bring real and lasting value to India's aerospace industry, while making Boeing products more globally competitive," said Mr Jim Albaugh, President and CEO of Boeing Integrated Defence Systems.

Investment details

When asked about investment details, a Boeing spokesperson said this is still under negotiation. But sources close to the Tatas said that the Indian business conglomerate would hold the majority stake in the joint venture.

The Tata Group and Boeing signed the memorandum of agreement in December last year. Boeing, in December, had also signed a 10-year memorandum of understanding with state-owned Hindustan Aeronautics Ltd (HAL) to source sub-systems for fighter aircraft and helicopters. "The initial intention of the joint venture is making aerospace components for Boeing and its international customers. Production for the domestic market is not in the plan," Mr Brian Nelson, Global Director of Communication, Boeing Integrated Defence, told Business Line. The joint venture will utilise the existing manufacturing capability of the Tatas and develop new supply sources throughout the Indian manufacturing and engineering communities for both commercial and defence applications, the statement said. However, about manufacturing locations, Mr Nelson said there was no decision yet. The manufacturing capabilities established within the joint venture company would in later phases be leveraged across multiple Boeing programmes, including the Medium Multi-Role Combat Aircraft (MMRCA) competition. In the first phase of the agreement, Boeing would potentially issue contracts for work packages to the joint venture company involving defence-related component manufacturing on Boeing's F/A-18 Super Hornet for the US Navy and Royal Australian Air Force, the CH-47 Chinook and/or the P-8 Maritime Patrol Aircraft, the statement said.

BPCL in pact with Kenyan firm for LPG bottling plant

Bharat Petroleum Corporation Ltd (BPCL) has entered into an equal joint venture with the Kenya Pipeline Company Ltd for setting up an LPG bottling plant in Nairobi, Kenya. The plant will have a capacity of 2,000 tonnes, with LPG coming from Mombasa, Kenya.

Funding plans

The initial investment in the project is expected to be around $15 million and will be funded through a 70:30 debt-equity ratio, the Kenyan Energy Minister, Mr Kiraitu Murungi, told reporters here on Saturday. The Minister is in India with a delegation to discuss the details of the project with the Indian company. He said the Kenyan energy sector has enormous investment opportunities and Indian companies and investors can take advantage of them. With Kenya being a net importer of oil, it is looking for technical tie-ups in the energy sector. Kenya is seeking Indian collaboration for developing storage, bottling and handling facilities for oil and gas.
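The funding structure described above can be restated numerically. The sketch below uses only the figures reported in the article ($15 million initial investment, a 70:30 debt-equity ratio, two equal partners); the function name and breakdown logic are illustrative, not from the source.

```python
# Sketch of the BPCL-Kenya Pipeline JV funding split described above.
# Figures are from the article; the breakdown logic is a generic illustration.

def jv_funding_split(total_investment, debt_ratio, equity_stakes):
    """Split a project's funding into debt and per-partner equity.

    total_investment : total project cost
    debt_ratio       : fraction financed by debt (0.7 for a 70:30 ratio)
    equity_stakes    : mapping of partner name -> share of the equity
    """
    debt = total_investment * debt_ratio
    equity = total_investment - debt
    per_partner = {p: equity * s for p, s in equity_stakes.items()}
    return debt, equity, per_partner

# $15m project, 70:30 debt-equity, equal (50:50) joint venture partners
debt, equity, partners = jv_funding_split(
    15_000_000, 0.7, {"BPCL": 0.5, "Kenya Pipeline Co": 0.5}
)
print(debt)      # $10.5m financed by debt
print(equity)    # $4.5m financed by equity
print(partners)  # $2.25m contributed by each partner
```

On these numbers, each partner's equity cheque is a small fraction of the headline project cost, which is one reason highly leveraged funding structures are common in such ventures.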
The country is setting up a 500-km pipeline connecting Mombasa and Nairobi to transport petroleum products at a cost of $110 million.
It is being implemented by the China Petroleum Pipeline Engineering Corporation and supervised by Petroleum India International.

3.6 TECHNOLOGY BRIDGING

Illustrated below is a classic case of how technology can be bridged with society.

Case: There are an estimated 45 million PCs in Brazil, making it the world's fifth-biggest market for computers. The more striking number, however, is the fraction of the population that does not have access to technology. "Last year's figures showed that 59% of Brazilians have never accessed the internet or used a computer," said Rodrigo Assumpcao, head of a committee that advises President Lula's government on what they call digital inclusion. But measures are underway to change all that, Mr Assumpcao told the BBC's Gareth Mitchell. He feels that being technologically educated is just as important as the basics of numeracy and literacy.

A digital or social divide?

"When you think of Brazil, you think of a country that is extremely divided between rich and poor, and areas that are developed and under-developed," said Mr Assumpcao. He feels that the class divide within Brazilian society is to blame for the technological divide. "In the '50s there was a brilliant Brazilian educator who said that public schools were meant to provide for poor children everything that the rich children had in their homes." Most middle-class children are brought up with computers, so they become second nature to them, Mr Assumpcao asserts. "It's like a Swiss army knife, a tool with multiple uses that serves him; that's the experience of a middle-class child in Brazil." In contrast, a poor child may not gain access to a computer until his teenage years, by which time it is a necessity in the working world. "Only by the time he is twelve or fourteen, if he is lucky to live beside a neighbourhood association that has a computer, will he be taught some kind of word processing or web browsing."
"He will be taught that he needs to learn these skills in order to have some rights within the job market," added Mr Assumpcao. "He is taught that he has to comply with technology, and this perception, this difference between who commands this technology and who is commanded by technology, determines in our society who rules and who is ruled, who has access to money and who hasn't, and who has access to rights and who hasn't."
Online for change

Mr Assumpcao said that 56,000 public schools are presently being fitted with broadband internet, with an aim to have all of the urban public schools in the country connected by 2010. The Brazilian government is also involved with the One Laptop Per Child (OLPC) project, which provides a basic mobile computer for children in developing countries. Its makers were able to produce a low-cost machine by using a less powerful processor and stripping out expensive parts like the hard disk drive.

Roseli Lopes from the University of Sao Paulo has coordinated a trial of the OLPC project at the Ernani School, northwest of Sao Paulo, that is now in its second year. "It's a wonderful experience for the children as they love coming to school and don't want to stay at home," said Prof Lopes. The laptop project works well in classes with large numbers of children, as computers enable individuals to go at their own pace and level. "It's active learning; they take part in the search for information and they are not waiting for the teacher," said Prof Lopes. "They are having more fun using this technology, not only to read and write but to make videos and take pictures," she added.

Children in Brazil only spend four or five hours at school, so being able to take the laptop home extends the time that they have to learn. "That is the most important thing about this project: when they go home they can continue learning and include their families in the process," said Prof Lopes. "Even if the parents can't read and write, they can use the camera to take pictures and make the learning more rich."

Other solutions

The Brazilian government is also trialling a number of other laptop projects in five other cities, employing Intel's Classmate and Encore's Simputer. The main concern in using different laptops is that they need to be interoperable, so that is one issue that Prof Lopes and her colleagues are constantly evaluating.
However, the idea of children being able to access the technological hardware is only part of the solution in bridging this digital divide. "We thought it was a good idea, but immediately we decided that it could not be conducted as the search for the next gadget," said Mr Assumpcao.

"Also, the Brazilian government has a profound conviction that free software is the way to go, so we are demanding that there is a whole suite of free and open-source software installed in these computers. The whole idea of having closed software on public computers is something which strikes me as wrong," he added. With widely available broadband, laptops on the desks of many of Brazil's youth, and a culture of open-source, free software, Brazil's digital divide looks to be narrowing.

Case on WLL and bridging technology

WLL Threat: Cellular Companies in Direct Marketing Overdrive

In a bid to retain their existing customers, major cellular companies are now opting for direct marketing plans and educational programmes to combat the WLL threat. Incidentally, some cell firms are even educating subscribers about the benefits of GSM technology over WLL (wireless local loop). After the price war, cellular companies will shift their focus to aggressive marketing plans, predict market analysts.

For starters, Airtel has already started sending direct mailers to its existing customers across the country. According to Airtel customers in Mumbai, Bharti's direct mailers talk about the company's new offerings which are in the pipeline. In addition, the mailers also highlight the benefits of GSM over WLL. "As per the mailer, 80 per cent of the world's mobile population prefers GSM over WLL," informs an Airtel user. When contacted by FE, Bharti was reluctant to divulge further details on its direct marketing initiatives.

So, what is going to be Bharti's game plan to take on the new entrant, Reliance Infocomm? Says Bharti Cellular Ltd chief operating officer (Mumbai Circle) Atul Jhamb: "The entry of new operators will lead to a significant growth in the market. With a presence across 16 states in India, we are in a strong position to take full advantage of this growth. Also, we provide GSM that offers unlimited mobility and the freedom of choice," explains Mr Jhamb.
Industry sources say that Bharti has plans to step up its online marketing as it already has a strong database in place. With the entry of a new player in the over-crowded category, BPL Mobile is also stepping up its marketing plans to retain customers. Says BPL Mobile president and COO Deepak Varma: "Conventional wisdom suggests that all mobile technologies are the same, therefore all networks deliver the same products and services, with tariffs and prices being the only differentiators. But nothing is further from the truth." Mr Varma believes that service will be the key differentiator in the current scenario. And BPL Mobile is delivering service at the subscriber's doorstep by increasing customer
touch points to 75 outlets: this includes 21 BPL Mobile galleries and 54 exclusive BPL Mobile shops, he adds. As a part of its marketing strategy, BPL Mobile also sends direct mailers to its customers informing them about the latest services and applications in the cellular industry. "In addition, we clear myths about various technologies and services too. I think a mobile service or experience is the experiential sum of the brand experience, which includes anticipating and meeting consumer needs," elaborates Mr Varma.

As for Hutch's marketing plans, says Ogilvy & Mather India executive director Nishi Suri (the ad agency which handles the advertising account of Orange): "Orange will ensure that it has a competitive edge with effective marketing plans. Today, customers are smart enough to figure out the value-for-money equation between the GSM and WLL technology." With competition intensifying in the cellular industry, it is customers who will reap rich benefits in the new year.

Case on Nokia and GSM solution for WLL

AFTER making a hue and cry over the Government's decision to allow limited mobility services based on wireless in local loop (WLL), almost all the big cellular operators, including Bharti, Hutchison and Idea Cellular, are evaluating the feasibility of deploying the end-to-end GSM 800 WLL solutions introduced by Nokia Networks. According to Mr Sanjay Bhasin, Director - India Strategy, Nokia Networks, the company has had discussions with all the leading cellular operators to enable them to roll out cost-efficient WLL services. The operators will, however, take a decision only if they see a strong business sense, because it would mean that they would have to acquire a basic service licence too. As per the Government regulations, basic operators can offer limited mobility WLL services on the 800 MHz frequency band. The GSM operators have been allocated the 900/1800 MHz bands for their cellular services.
So, for the mobile operators to deploy WLL services, they will have to first acquire a basic licence if they do not have one (Bharti already offers basic services in many circles) and then deploy Nokia's GSM 800 solution. Mr Bhasin was confident that the operators would be interested in Nokia's offering, as, he said, it is an open standard enjoying huge economies of scale: "GSM allows significant savings in capex and opex for operators deploying and managing 800 MHz WLL service, making it ideal for providing economic mass-market service in India and other markets. In the infrastructure-sharing approach, existing GSM networks can be modified to support the 800 MHz WLL band by simple network upgrades, resulting in the most cost-efficient approach to deploying and managing WLL services," he said.

"Operators can extend their current GSM services to the 800 MHz WLL band to clearly benefit from synergies with their existing network deployment. The same standard GSM core and radio network technology, including billing and management solutions, is also used for GSM 800," Mr Bhasin noted. He said that this type of deployment will lower not only the overall capital expenditure even further, but also the operational costs of running the network, since one network is able to offer both mobile and WLL services. Such a convergent network deployment methodology will be a crucial advantage for mobile operators, since subscribers can also enjoy the wide choice of GSM handsets and features available in the market.

According to Mr Jussi Ware, Vice-President, GSM/EDGE Marketing and Sales, Nokia Networks, GSM is the natural technology for WLL, bringing a wide range of benefits both because of its competitive services and terminals and because it is highly cost-effective to deploy and operate. With more than 70 per cent of the global market, subscribers to GSM networks enjoy the widest choice of terminals, offering different styles, features and prices.

Based on the three cases, it is evident that technology bridging is concerned with making technology adoptable by society.

3.7 CORPORATE VENTURING

Corporate venturing is investment in a new or existing venture by another company. Usually corporate venturing is undertaken by large firms investing in start-ups or small, rapidly growing companies. Corporate venturing means that growing firms have access to more venture funding and are able to receive advice from the investing company. One disadvantage is that large companies can use corporate venturing as a means of stifling competition, through acquisition.

Description

Corporate venturing provides an alternative to traditional methods of growing a company.
A company invests in new products or technologies by funding businesses that have a reasonably autonomous management team and separate human resource policies. The goals can be to develop products to expand the core business, to enter new industries or markets, or to develop breakthrough technologies that could substantially change the industry. Corporate venturing can be done in one of four ways: by taking a passive, minority position in outside businesses (corporate venture capital); by taking an active interest in an outside company; by building a new business as a stand-alone unit; or by building a new business inside the existing firm with a structure allowing for management independence.

Methodology

Corporate ventures require managers to:
1. Establish strategic objectives. Venturing requires companies to create and screen new ideas identified in-house. It is best used for long-term projects that develop knowledge key to the core business. Managers should evaluate ventures based on strategic needs and ensure that they fit with the overall strategy.
2. Develop the correct approach. Managers must then decide which method to use to pursue the new idea. Corporate venture capital, which provides access (through investments) to breakthrough technologies being investigated by startups, can be an effective prelude to a decision to acquire or build a stand-alone business. In some instances, however, firms will want to build the new business themselves to either lock in the value created or leverage close linkages with an existing part of the business.
3. Establish a team. Once the approach is selected, a team can be created with the capabilities, resources and sufficient independence to manage the program.
4. Create processes to monitor progress and incorporate knowledge. Develop strict metrics and timetables to monitor the development process. In some instances, employ staged funding to ensure progress is on schedule. In all cases, look for means to transfer knowledge from the venture into the broader organization.

Common Uses

Corporate venturing may be initiated to:
1. Diversify
2. Foster relationships with companies key to a firm's growth
3. Access new technology, experts and research
4. Build businesses adjacent to the core


Business building may be initiated to:
1. Strengthen the core business
2. Provide new avenues for growth, or build adjacent businesses
3. Enter new and emerging markets
4. Shorten development cycles
5. Motivate employees to take calculated risks

What is corporate venturing?

The term corporate venturing covers a range of mutually beneficial relationships between companies. The relationships range from those between companies within the same group, through those between unrelated companies, to collective investment by companies in other companies through a fund. The companies involved may be of any size, but such relationships are commonly formed between a larger company and a smaller independent one, usually in a related line of business.
The larger company may invest in the smaller company, and so provide an alternative or supplementary source of finance. It may, instead or as well:
Make available particular skills or knowledge, perhaps in technical or management areas, which a smaller company would otherwise not have access to, and
Provide access to established marketing and distribution channels, or complementary technologies.

In addition to any financial return it receives from an investment, the larger company may gain a competitive advantage by being able to make better use of its own resources, and by gaining access to:
Research, development, or other work in an area it is interested in
New ideas
A more entrepreneurial culture

Forming corporate venturing relationships can be a way for large companies to develop and broaden their business without acquiring other companies, and a way for small companies to grow faster than they otherwise would. A typical outcome would be the development of a new product or process, perhaps involving an exclusive licensing deal between the two companies. Corporate venturing is well established as a growth strategy in the United States. In the United Kingdom (UK) it is currently more limited, being found mainly in areas such as biotechnology, telecommunications and information technology. The Corporate Venturing Scheme (CVS) is intended to encourage corporate venturing involving equity investment in the UK.

An overview of the Corporate Venturing Scheme

The CVS is aimed at companies considering direct investment, in the form of a minority shareholding, in small independent higher-risk trading companies or groups of such companies. It provides tax incentives for corporate equity investment in the same types of companies as those qualifying under the Enterprise Investment Scheme (EIS) and Venture Capital Trust (VCT) scheme. The incentives are available in respect of qualifying shares issued between 1 April 2000 and 31 March 2010. The aims of the CVS are to:
Increase the availability of venture capital to small higher-risk trading companies from corporate investors, and through this

Foster wider corporate venturing relationships between otherwise unconnected companies.

The tax reliefs available are:
Investment relief - relief against corporation tax of up to 20% of the amount subscribed for full-risk ordinary shares, provided that the shares are held throughout a qualification period
Deferral relief - deferral of tax on chargeable gains arising on the disposal of shares on which investment relief has been obtained and not withdrawn in full, if the gains are reinvested in new shares for which investment relief is obtained
Loss relief - relief against income for capital losses arising on most disposals of shares on which investment relief has been obtained and not withdrawn in full, net of the investment relief remaining after the disposal
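The investment relief described above lends itself to a simple worked example. The sketch below applies only the headline rate quoted in the text (relief against corporation tax of up to 20% of the amount subscribed); the function name and the cap-at-tax-due behaviour are illustrative assumptions, and a real CVS claim is subject to further conditions (qualification period, withdrawal rules) not modelled here.

```python
# Illustrative CVS investment-relief calculation, based only on the headline
# rule quoted above: relief against corporation tax of up to 20% of the
# amount subscribed for full-risk ordinary shares. Real claims depend on
# further conditions not modelled here.

CVS_RELIEF_RATE = 0.20

def cvs_investment_relief(amount_subscribed, corporation_tax_due):
    """Relief is 20% of the subscription, capped at the tax actually due."""
    relief = amount_subscribed * CVS_RELIEF_RATE
    return min(relief, corporation_tax_due)

# A company subscribes 100,000 for qualifying shares and owes 50,000
# corporation tax: relief is 20,000, leaving 30,000 tax to pay.
relief = cvs_investment_relief(100_000, 50_000)
print(relief)           # relief claimed
print(50_000 - relief)  # corporation tax still due
```

The cap reflects the phrase "relief against corporation tax": relief cannot usefully exceed the tax otherwise payable, which is why the second argument bounds the result.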


Creating a Technology Road Map
Does your business have a road map for addressing your business challenges and opportunities with new technologies? If you answered no, then you've come to the right place. With the new year comes new and emerging technologies to consider for your business. For instance, in 2006 you might want to take advantage of wireless Internet Protocol (IP) phones, unified messaging or videoconferencing. All three are now available for small businesses, and all three can improve operational efficiencies and employee productivity and provide substantial cost savings - crucial competitive advantages for any small business. Before you invest in any technology, however, you need a plan: a road map that matches short-term and long-term business goals with specific technology solutions to help you meet those goals. But small businesses often don't have a plan for technology acquisition. Instead, they traditionally add technology only as a means of addressing an immediate problem. That approach can set the stage for problems as companies evolve. In other words, without a road map, you may be investing in the wrong technology for your business at the wrong time. In addition to wasting money, you may be creating more problems than the technology was intended to solve in the first place. Here's how to create a technology road map for your business.

Step 1: Identify current and potential business challenges
Identifying what your most pressing obstacles are today - and what they'll likely be tomorrow - can help you accurately determine the best technology solutions for overcoming those challenges. Some common challenges small businesses face include: improving operational efficiencies, enhancing customer responsiveness, containing costs of doing business, and keeping data secure.

Step 2: Map the new technology solution to the biggest business challenge

For your 2006 (and beyond) technology plan, connect the dots between your biggest business challenge and the specific technology solution that addresses that challenge. For example, if improving operational efficiency is your biggest challenge, consider investing in a secure computer network foundation. Such a flexible communications platform supports wireless networks, virtual private networks (VPNs), and other communications tools. A secure computer network foundation goes a long way toward improving your business's operational efficiencies by enabling employees to communicate more easily wherever they go. Similarly, if reducing operating costs is your top priority, consider a converged network capable of carrying voice and video as well as data. You'll have only one network to manage, which reduces costs, and you can take advantage of voice over Internet Protocol (VoIP), a great way to cut telecommunications costs (read Should Your Business Switch to VoIP?).

Step 3: Determine what phase your business is in
Is your business in its foundation, growth or optimization phase? Knowing the answer can help you determine the core technology investment and road map your business is likely to need. In the foundation phase, a small business is seeking to get established. Communicating effectively - with employees, customers and suppliers - is especially critical. So your technology road map should take into account the need to provide the easiest possible access to information, offer the best service to customers, and keep information secure. Businesses in the growth phase are established and looking to be more efficient and cost-effective. Technology considerations might include offering workers the ability to work from home or on the go. You might also want to enhance your communications infrastructure to provide greater operational efficiencies and cost savings through IP telephony.

In the optimization phase, it's time to differentiate your business with customers and suppliers. To do so, implementing customer relationship management, sales-force automation and call-center applications to enhance information sharing may be your priority.

Step 4: Ensure that the immediate technology choice will help evolve your business over time
Whichever technologies you decide on, only invest in solutions that'll help your business achieve its goals today and also support - with minimal upgrades - the needs you'll have in the future as your business evolves.


Small-business buyers often go for the lowest-priced technology that meets their needs today. This generally means buying products that don't offer as many capabilities as others. But this approach can actually cost you money and time in the long run. For example, to save money, some small businesses purchase PCs with 512MB of memory or less. For about $200 more, they could have a PC with 1GB of memory or more. The more memory a PC has, the faster applications will run. And, by extension, the faster the PC's performance is, the less time you or an employee wastes. The goal is to get the best value over time, not the lowest upfront cost. To get the best long-term technology investment for your business, make sure any vendor you're considering offers easy financing for its technology solutions. Many technology vendors now provide flexible financing and leasing options especially tailored for small and medium-sized businesses. In addition, take into account what's available in terms of service and support for any solution you're considering. Look for vendors that can provide system design and ongoing support for both minor and major software upgrades. In some cases, such services are available from a technology company's local resellers. Also, make sure your solution vendors can help train your staff so they can handle routine maintenance.

A Few More Tips Before You Buy
Now you're ready to make solid investments in 2006 that'll help your business be more agile, efficient and competitive. But before you spend your hard-earned money, I'll leave you with a few more tips.

Minimize the number of vendors. It might save you money to buy network routers from vendor A, firewalls from vendor B, and network storage from vendor C. But you'll have three vendors to deal with if something goes wrong. And guess what? Vendor A will invariably point a finger at vendor B as the culprit, and vice versa, leaving you caught in the middle without a solution. Spare yourself the agony and the time (and remember, time is money) by getting as much of your technology as possible from one vendor.

Remember, you're not alone. Talk to trusted peers, partners, suppliers, friends - even competitors. Find out which technologies are working for them, and which ones aren't. Ask them about the specific benefits they've received. Find out if they experienced any unpleasant surprises, and if so, how they dealt with them.

Stay tuned. As for the new and emerging technologies mentioned earlier in this column, keep reading. In the coming months I'll explain 2006's hot new technologies and how they can help your small business thrive this year and beyond.

Summary
After reading this unit you must have had a fair idea about being globally competitive, corporate venturing and the advantages of joint ventures.

Questions
1. What parameters are significant for being globally competitive?
2. Elaborate on technology alliances.
3. Explain the objectives and methodology of joint ventures.
4. Using an example of your choice, explain corporate venturing.
5. How can business be mapped to technology? Explain.


UNIT IV


TECHNOLOGY MANAGEMENT IN EMERGING INDUSTRIES


Introduction
This unit deals with the globalisation of industry and contemporary technologies such as biotechnology, biopharmaceuticals, nanotechnology and biological engineering. The effects of globalisation are discussed along with the applications of the existing technologies.

Learning Objectives
Measuring modern globalization
Effects of globalization
Fundamental concepts of nanotechnology
History and applications of biotechnology
Genetic testing and gene therapy
Basic elements of telecommunication

4.1 GLOBALISATION OF INDUSTRY
Globalization (or globalisation) in its literal sense is the process of transformation of local or regional things or phenomena into global ones. It can also be used to describe a process by which the people of the world are unified into a single society and function together. This process is a combination of economic, technological, sociocultural and political forces. Globalization is often used to refer to economic globalization, that is, integration of national economies into the international economy through trade, foreign direct investment, capital flows, migration, and the spread of technology.

Modern globalization
Globalization in the era since World War II is largely the result of planning by economists, business interests, and politicians who recognized the costs associated with protectionism and declining international economic integration. Their work led to the Bretton Woods

conference and the founding of several international institutions intended to oversee the renewed processes of globalization, promoting growth and managing adverse consequences. These institutions include the International Bank for Reconstruction and Development (the World Bank) and the International Monetary Fund. Globalization has been facilitated by advances in technology which have reduced the costs of trade, and by trade negotiation rounds, originally under the auspices of the General Agreement on Tariffs and Trade (GATT), which led to a series of agreements to remove restrictions on free trade. Since World War II, barriers to international trade have been considerably lowered through international agreements such as GATT. Particular initiatives carried out as a result of GATT and the World Trade Organization (WTO), for which GATT is the foundation, have included:

Promotion of free trade:
- Reduction or elimination of tariffs; creation of free trade zones with small or no tariffs
- Reduced transportation costs, especially resulting from the development of containerization for ocean shipping
- Reduction or elimination of capital controls
- Reduction, elimination, or harmonization of subsidies for local businesses

Restriction of free trade:
- Harmonization of intellectual property laws across the majority of states, with more restrictions
- Supranational recognition of intellectual property restrictions (e.g. patents granted by China would be recognized in the United States)

The Uruguay Round (1986 to 1994) led to a treaty to create the WTO to mediate trade disputes and set up a uniform platform for trading. Other bilateral and multilateral trade agreements, including sections of Europe's Maastricht Treaty and the North American Free Trade Agreement (NAFTA), have also been signed in pursuit of the goal of reducing tariffs and barriers to trade.

Measuring globalization
Looking specifically at economic globalization, it can be measured in different ways.
These center around the four main economic flows that characterize globalization:
- Goods and services, e.g. exports plus imports as a proportion of national income or per capita of population
- Labor/people, e.g. net migration rates; inward or outward migration flows, weighted by population


- Capital, e.g. inward or outward direct investment as a proportion of national income or per head of population
- Technology, e.g. international research & development flows; proportion of populations (and rates of change thereof) using particular inventions (especially factor-neutral technological advances such as the telephone, motorcar, broadband)


As globalization is not only an economic phenomenon, a multivariate approach to measuring it is the recent index calculated by the Swiss think tank KOF. The index measures the three main dimensions of globalization: economic, social, and political. In addition to three indices measuring these dimensions, an overall index of globalization and sub-indices referring to actual economic flows, economic restrictions, data on personal contact, data on information flows, and data on cultural proximity are calculated. Data are available on a yearly basis for 122 countries, as detailed in Dreher, Gaston and Martens (2008). According to the index, the world's most globalized country is Belgium, followed by Austria, Sweden, the United Kingdom and the Netherlands. The least globalized countries according to the KOF index are Haiti, Myanmar, the Central African Republic and Burundi.
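At its core, an overall index of this kind is a weighted average of dimension scores. The sketch below is illustrative only: the country scores and the weights are made up for the example and are not the ones KOF actually uses.

```python
def composite_index(scores: dict, weights: dict) -> float:
    """Overall index as a weighted average of dimension scores (0-100 scale)."""
    total_weight = sum(weights[d] for d in scores)
    return sum(scores[d] * weights[d] for d in scores) / total_weight

# Hypothetical scores for one country on the three KOF dimensions.
scores = {"economic": 85.0, "social": 90.0, "political": 95.0}
weights = {"economic": 0.36, "social": 0.38, "political": 0.26}
print(round(composite_index(scores, weights), 2))  # 89.5
```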
With the influence of globalization and with the help of the United States' own economy, the People's Republic of China has experienced tremendous growth within the past decade. If China continues to grow at the rate projected by the trends, then it is very likely that in the next twenty years there will be a major reallocation of power among the world leaders. China will have enough wealth, industry, and technology to rival the United States for the position of leading world power.


- Informational - increase in information flows between geographically remote locations. Arguably this is a technological change associated with the advent of fibre optic communications, satellites, and increased availability of telephony and the Internet.
- Cultural - growth of cross-cultural contacts; advent of new categories of consciousness and identities which embody cultural diffusion, the desire to increase one's standard of living and enjoy foreign products and ideas, adopt new technology and practices, and participate in a world culture. Some bemoan the resulting consumerism and loss of languages. Also see Transformation of culture.
- Ecological - the advent of global environmental challenges that might be solved with international cooperation, such as climate change, cross-boundary water and air pollution, over-fishing of the ocean, and the spread of invasive species. Since many factories are built in developing countries with less environmental regulation, globalism and free trade may increase pollution. On the other hand, economic development historically required a dirty industrial stage, and it is argued that developing countries should not, via regulation, be prohibited from increasing their standard of living.
- Social (international cultural exchange) - increased circulation by people of all nations with fewer restrictions.
  o Spreading of multiculturalism, and better individual access to cultural diversity (e.g. through the export of Hollywood and Bollywood movies). Some consider such imported culture a danger, since it may supplant the local culture, causing reduction in diversity or even assimilation. Others consider multiculturalism to promote peace and understanding between peoples.
  o Greater international travel and tourism
  o Greater immigration, including illegal immigration
  o Spread of local consumer products (e.g. food) to other countries (often adapted to their culture)
  o World-wide fads and pop culture such as Pokémon, Sudoku, Numa Numa, Origami, Idol series, YouTube, Orkut, Facebook, and MySpace, accessible to those who have Internet or television, leaving out a substantial segment of the Earth's population
  o World-wide sporting events such as the FIFA World Cup and the Olympic Games

- Technical
  o Development of a global telecommunications infrastructure and greater transborder data flow, using such technologies as the Internet, communication satellites, submarine fiber optic cable, and wireless telephones
  o Increase in the number of standards applied globally; e.g. copyright laws, patents and world trade agreements

- Legal/Ethical
  o The creation of the International Criminal Court and international justice movements


  o Crime importation and raising awareness of global crime-fighting efforts and cooperation


Although critics of globalization complain of Westernization, a 2005 UNESCO report showed that cultural exchange is becoming mutual. In 2002, China was the third largest exporter of cultural goods, after the UK and US. Between 1994 and 2002, both North America's and the European Union's shares of cultural exports declined, while Asia's cultural exports grew to surpass North America's.

4.2 MANAGING TECHNOLOGY
Technology has changed the rules of business and provided more tools to capitalise on new opportunities. However, it has also brought a complexity that comes with managing computers, networks, websites and more. These articles can help you understand important technical issues that you will face in running your business.

How to Protect Your Computers
Computer security is one of the most important issues that any business faces. Learn about improving the safety and security of your business computers. As businesses rely more and more on technology to run a smooth operation, computer security has become a critical issue. These articles provide you with a sound starting point for setting up policies and instituting practices to help safeguard your company's computers.

5 Key Elements for Your PC Security Plan
Experts agree that you need to think carefully about computer security, whatever size business you run. Protecting your private business information from the outside world is one thing. But what about from prying eyes inside your own office? Without an internal PC security plan at your business, all of the files on your company PCs could be available for anyone in your office to see. That could include strategic documents, financial files, and employee records. That can't be what you want. Yet many small-business owners fail to devise such a plan, and end up paying the consequences. Not only is their business information at risk, but they also threaten confidentiality pledges made to employees and customers. You need a formal PC security plan that is simple to understand. Here are five basics that should be part of your security plan:

1. Use Password Protection - Protecting files with passwords ensures that only authorised users can open a data file. Your operating system most likely has a built-in password protection system, and most software applications, including Microsoft Office, let you password-protect files and folders.
2. Choose Creative Passwords - Your spouse's, child's or dog's name should be off-limits as a password. The reason: people in the office know them and could guess that one may be your password. The same rule applies to birthdates, street addresses, favourite bands or singers, and other terms or words that people are likely to associate with you. Also, keep in mind that it is harder to crack a password that is made up of a mixture of numbers and letters in upper and lower case, as well as one that is changed frequently. Facilitate the use of passwords by providing instructions to everyone in your company on how to create them, when to change them, and how to protect files and folders.
3. Use Encryption - One way to protect the valuable information on your business PCs is to encrypt data. Encryption software turns data into a string of gibberish that you need the correct software key to decipher. Encryption software is commonly used to limit access to highly confidential files such as financial records and customer lists, to safeguard laptop PCs that will be used outside of the office, and to protect top-secret emails.
4. Never Leave Data Unattended - Something as simple as encouraging your staff to close files before leaving their desks can limit PC security risks. Without this precaution in place, a break for lunch can leave PC files open to anyone who passes by. Support PC security by imposing rules that require staff to close all documents while not in use.
5. Limit Laptop Breaches - The use of laptop PCs enhances productivity, but it also threatens the security of your business if proper precautions are not taken. Encourage all remote workers to keep security in mind outside of the office by using small fonts when working on confidential documents in public places. If your staff members use public technology resources, show them how to ensure that documents remain on their laptop hard drives rather than on the resource's computers. Encryption can also protect laptop computers that are used outside of an office.

Clean the Hard Drive Before Dumping Your PC
Even when it comes time to get rid of your old computers, security is an issue. Learn how you can ensure that important company information doesn't fall into the wrong hands.

Defend Your Business with a Firewall
Learn what a firewall is and how it can help discourage unwanted intrusions into your business data and private information.
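Tips 2 and 3 above can be made concrete with Python's standard library: `secrets` for generating hard-to-guess passwords and `hashlib` for storing only a salted hash rather than the password itself. This is a minimal sketch under those assumptions, not a complete security solution; the function names are our own.

```python
import hashlib
import secrets
import string

def generate_password(length: int = 12) -> str:
    """Random mix of upper/lower-case letters and digits (tip 2: nothing guessable)."""
    alphabet = string.ascii_letters + string.digits
    return "".join(secrets.choice(alphabet) for _ in range(length))

def hash_password(password: str, salt: bytes = None) -> tuple:
    """Derive a salted PBKDF2 hash so the password itself is never stored (tip 3)."""
    if salt is None:
        salt = secrets.token_bytes(16)
    digest = hashlib.pbkdf2_hmac("sha256", password.encode(), salt, 100_000)
    return salt, digest

pw = generate_password()
salt, digest = hash_password(pw)
# To verify a login attempt later, recompute with the stored salt and compare:
assert hash_password(pw, salt)[1] == digest
```

A real deployment would add requirements such as minimum length and rotation policies, as the article suggests, and would use a vetted authentication library rather than hand-rolled code.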


Internet Use at Work Growing
The reality is that an increasing number of companies are making this investment, mostly because of increased business use of the internet. As more and more businesses provide high-speed internet access to their employees, they seek to stop employees accessing pornography or games or doing excessive personal business through the web and email. Worldwide, the number of employees who have their internet or email use at work under surveillance is estimated at 27 million, according to a 2001 Privacy Foundation study. Though still largely the domain of corporations, an increasing number of small businesses are monitoring. In-Stat/MDR found that as far back as 2000, 19% of the small businesses it surveyed were monitoring employee web use, with 10% of the respondents also taking action to block certain sites considered inappropriate. Monitoring products vary from piecemeal solutions to comprehensive ones. Websense, for example, is a frequently used product to monitor employee internet use; it can filter out websites as appropriate. Likewise, MIMEsweeper is a popular email monitoring product. Meanwhile, WinWhatWhere from TrueActive Software monitors every email, instant message and document sent and received, and also every keystroke typed on a PC where it is installed. The latest version even snaps pictures from a webcam, saves screenshots, and reads keystrokes in multiple languages. Company founder and CTO Richard Eaton says about 80% of its sales have been to businesses, and the remainder to government agencies, parents monitoring their kids' PC use, and men or women suspicious of their lovers. A word to employees: never send an email or instant message at work that you wouldn't be afraid to read the next day on the front page of a newspaper, Gartenberg warns. Likewise, don't visit websites at work with URLs you'd mind seeing posted, next to your name, in a public forum.
Performance and Reliability Find out how computers can help you cut down on travel costs, reduce your reliance on paper and keep you connected while you and your employees are on the road. As computing performance and reliability increase, computers provide more efficient and cost-effective ways to tackle everyday business tasks. Here are some approaches that will help you capitalise on your existing investment. Virtual Meetings Cut Travel Costs Face-to-face meetings will always be an important part of the business process. But there are alternatives, such as videoconferencing, Web conferencing and more.

Virtual Meetings Cut Travel Costs
Trying to cut back on business trips? Find out how new advances in technology can substitute for time-consuming face-to-face meetings. A majority of companies have higher travel expenses than they need, says Alisa Jenkins, senior director at Bredin Business Information, a business consulting firm. This doesn't mean you have to cut out all travel. There are still many cases where meeting face to face is best. But there are also good ways to meet virtually that can make many of your business trips unnecessary. Alternatives to business travel such as web conferencing with Microsoft Office Live Meeting or similar products continue to improve with advances in internet and related technologies, most agree. We'll address the options, including video conferencing, teleconferencing, online collaboration tools and web conferencing, in detail below. But first: when do you absolutely need to meet? Here are some scenarios mentioned by experts:
- You are meeting a new client.
- You are introducing new people - perhaps your replacement - to an ongoing but important business relationship.
- You are attempting to close a significant sale or cut an important deal.
- You are delivering a product that you must demonstrate.
- You need to resolve a controversial or complex problem, or discuss top-secret matters such as an acquisition or merger.
- You need to meet with an attorney to discuss legal matters.
- You need to solicit money from an investor.
- You are making sales or training presentations and your materials are best presented in person.
- Your competitors are meeting face to face with a client you want.

Perhaps you could add other scenarios specific to your company or industry. The point is, meetings remain critical to the success of your business. However, there are many meetings where technology can substitute for travel easily and effectively. Virtual meetings may not be as much fun, but they can allow you to get a lot of work done at less expense. Here's a rundown of the alternatives.

Video Conferencing
An interactive use of video, computing and communication technologies to allow people in two or more locations to meet, either one-on-one or in groups of up to a
dozen people or so, without being physically together. Video can be streamed over the Internet or broadcast over television monitors.
Pluses: High-end video conferencing systems (such as those owned by many larger corporations) can bring together large groups of people in disparate locales to hear speeches and presentations in a broadcast-quality setting. But video conferencing today can also be done on the cheap, with inexpensive webcams and free or low-cost software, such as Microsoft NetMeeting.
Minuses: Unless you go to a video conferencing centre, audio and video equipment must be purchased. (NetMeeting, for example, requires a PC sound card with a microphone and speakers, as well as a video capture card or camera for video support.) Most video conferencing providers charge by the hour, so you may feel pressured to end on the hour and leave business undone.

Web Conferencing
Video conferencing without the video, or, put another way, teleconferencing with the addition of the web for interactive presentations, using PowerPoint, Excel or other documents. Audio can be transmitted by telephone and/or PC microphones.
Pluses: All you need is Internet access and a phone. You can make presentations at once to as many as 2,500 people in different locations. You don't have to email the PowerPoint slides or other documents to your audience ahead of time; you use the visuals and highlight points in real time. Other participants can also use drawing tools to make points or take control of your presentation. NetMeeting works well for web conferencing too.
Minuses: It's certainly not the same as meeting in person, and you miss out on people's facial expressions and body language, unlike video conferencing. But for straightforward business plan reviews, sales meetings, software demonstrations and customer presentations, it works and brings a lot of people from far and wide together for one meeting.
Teleconferencing
Teleconferencing services are offered by long-distance carriers or independent service bureaus using sophisticated call connection bridges to join many different phone calls into a single conversation.
Pluses: Calls can be set up quickly and easily, at relatively low cost. All you need is a telephone. Accompanying documents can be faxed, emailed or shipped overnight to meeting participants in advance, if necessary.
Minuses: Teleconferences work well for simple information sharing and straightforward decision-making that require no visual presentation. But they are not a suitable way to
discuss more-complicated matters, which could be presented better via web conferencing. Teleconferencing also is not a desirable way to begin or even further an important business relationship. But, in a pinch, it can accomplish a lot.

Online Collaboration Tools
While email remains a key business tool, this discussion will focus on extranets: private websites that allow you to share files and documents and use message boards with selected customers or partners.
Pluses: Having an extranet won't take the place of a long-distance meeting using one of the alternatives above. But it can, over time, reduce the need for some meetings by allowing you to have ongoing communication and document-sharing.
Minuses: You can communicate in real time using chat or instant messaging, but most communication is not interactive. Extranets, however, can effectively turn a teleconferencing session into a web conferencing one if all of the participants have access to the private site.

4.3 NANOTECHNOLOGY & MATERIAL SCIENCE
Nanotechnology refers to a field of applied science whose theme is the control of matter on an atomic and molecular scale. Generally nanotechnology deals with structures 100 nanometers or smaller, and involves developing materials or devices within that size range. Nanotechnology is a highly diverse and multidisciplinary field, ranging from novel extensions of conventional device physics, to completely new approaches based upon molecular self-assembly, to developing new materials with dimensions on the nanoscale, even to speculation on whether we can directly control matter on the atomic scale. Nanotechnology has the potential to create many new materials and devices with wide-ranging applications, such as in medicine, electronics, and energy production.
On the other hand, nanotechnology raises many of the same issues as any introduction of new technology, including concerns about the toxicity and environmental impact of nanomaterials and their potential effects on global economics, as well as speculation about various doomsday scenarios. These concerns have led to a debate among advocacy groups and governments on whether special regulation of nanotechnology is warranted.

Fundamental concepts
One nanometer (nm) is one billionth, or 10^-9, of a meter. By comparison, typical carbon-carbon bond lengths, or the spacing between these atoms in a molecule, are in the range 0.12-0.15 nm, and a DNA double-helix has a diameter of around 2 nm. On the other hand, the smallest cellular lifeforms, the bacteria of the genus Mycoplasma, are around 200 nm in length.


To put that scale in context, a nanometer is to a meter as a marble is to the size of the Earth. Or, another way of putting it: a nanometer is the amount a man's beard grows in the time it takes him to raise the razor to his face.

Two main approaches are used in nanotechnology. In the bottom-up approach, materials and devices are built from molecular components which assemble themselves chemically by principles of molecular recognition. In the top-down approach, nano-objects are constructed from larger entities without atomic-level control.

A number of physical phenomena become pronounced as the size of the system decreases. These include statistical mechanical effects, as well as quantum mechanical effects, for example the quantum size effect, where the electronic properties of solids are altered by great reductions in particle size. This effect does not come into play in going from macro to micro dimensions; it becomes dominant, however, when the nanometer size range is reached. Additionally, a number of physical (mechanical, electrical, optical, etc.) properties change when compared to macroscopic systems. One example is the increase in the surface-area-to-volume ratio, which alters the mechanical, thermal and catalytic properties of materials. Novel mechanical properties of nanosystems are of interest in nanomechanics research. The catalytic activity of nanomaterials also opens potential risks in their interaction with biomaterials.

Materials reduced to the nanoscale can show properties different from those they exhibit on a macroscale, enabling unique applications. For instance, opaque substances become transparent (copper); stable materials turn combustible (aluminum); solids turn into liquids at room temperature (gold); insulators become conductors (silicon). A material such as gold, which is chemically inert at normal scales, can serve as a potent chemical catalyst at nanoscales.
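The surface-area-to-volume effect mentioned above is plain geometry: for a sphere, SA/V = 3/r, so shrinking a particle a million-fold multiplies its relative surface by the same factor. A minimal Python sketch (the radii chosen here are illustrative, not from the text):

```python
import math

def sa_to_v_ratio(radius_m: float) -> float:
    """Surface-area-to-volume ratio (per metre) of a sphere."""
    surface = 4 * math.pi * radius_m ** 2
    volume = (4 / 3) * math.pi * radius_m ** 3
    return surface / volume  # algebraically this is 3 / radius_m

marble = sa_to_v_ratio(1e-2)     # 1 cm radius marble: 300 per m
particle = sa_to_v_ratio(10e-9)  # 10 nm radius nanoparticle: 3e8 per m
print(f"increase: {particle / marble:.0e}x")  # 1e+06x
```

This million-fold gain in exposed surface per unit volume is why catalytic and thermal behaviour changes so sharply at the nanoscale.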
Much of the fascination with nanotechnology stems from these quantum and surface phenomena that matter exhibits at the nanoscale.

Molecular nanotechnology: a long-term view
Molecular nanotechnology, sometimes called molecular manufacturing, is a term given to the concept of engineered nanosystems (nanoscale machines) operating on the molecular scale. It is especially associated with the concept of a molecular assembler, a machine that can produce a desired structure or device atom by atom using the principles of mechanosynthesis. Manufacturing in the context of productive nanosystems is not related to, and should be clearly distinguished from, the conventional technologies used to manufacture nanomaterials such as carbon nanotubes and nanoparticles. It is hoped that developments in nanotechnology will make possible their construction by some other means, perhaps using biomimetic principles. However, Drexler and other researchers have proposed that advanced nanotechnology, although perhaps initially
implemented by biomimetic means, ultimately could be based on mechanical engineering principles: a manufacturing technology based on the mechanical functionality of components (such as gears, bearings, motors, and structural members) that would enable programmable, positional assembly to atomic specification (PNAS-1981). The physics and engineering performance of exemplar designs were analyzed in Drexler's book Nanosystems. In general it is very difficult to assemble devices on the atomic scale, as all one has with which to position atoms are other atoms of comparable size and stickiness. Another view, put forth by Carlo Montemagno, is that future nanosystems will be hybrids of silicon technology and biological molecular machines. Yet another view, put forward by the late Richard Smalley, is that mechanosynthesis is impossible due to the difficulties of mechanically manipulating individual molecules.

Nanomaterials
This includes subfields which develop or study materials having unique properties arising from their nanoscale dimensions. Interface and colloid science has given rise to many materials which may be useful in nanotechnology, such as carbon nanotubes and other fullerenes, and various nanoparticles and nanorods. Nanoscale materials can also be used for bulk applications; most present commercial applications of nanotechnology are of this flavor. Progress has been made in using these materials for medical applications; see nanomedicine.

Bottom-up approaches
These seek to arrange smaller components into more complex assemblies. DNA nanotechnology utilizes the specificity of Watson-Crick base pairing to construct well-defined structures out of DNA and other nucleic acids. Approaches from the field of classical chemical synthesis also aim at designing molecules with well-defined shape. More generally, molecular self-assembly seeks to use concepts of supramolecular chemistry, and molecular recognition in particular, to cause single-molecule components to arrange themselves automatically into some useful conformation.
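The Watson-Crick specificity that DNA nanotechnology exploits (A pairs with T, C with G, and strands bind antiparallel) is easy to state in code. A short Python sketch, with invented sequences for illustration:

```python
PAIR = {"A": "T", "T": "A", "C": "G", "G": "C"}  # Watson-Crick pairs

def reverse_complement(strand: str) -> str:
    """The strand that binds antiparallel to the input strand."""
    return "".join(PAIR[base] for base in reversed(strand))

def can_hybridize(a: str, b: str) -> bool:
    """True if the two strands are exact Watson-Crick complements."""
    return b == reverse_complement(a)

sticky_end = "ATGGCA"                       # illustrative sequence
print(reverse_complement(sticky_end))       # TGCCAT
print(can_hybridize(sticky_end, "TGCCAT"))  # True
print(can_hybridize(sticky_end, "TGCCAA"))  # False
```

Because only exact complements bind, a designer can program which components stick together simply by choosing sequences, which is the essence of DNA self-assembly.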

Top-down approaches
These seek to create smaller devices by using larger ones to direct their assembly. Many technologies descended from conventional solid-state silicon methods for fabricating microprocessors are now capable of creating features smaller than 100 nm, and so fall under the definition of nanotechnology. Giant-magnetoresistance-based hard drives already on the market fit this description, as do atomic layer deposition (ALD) techniques. Peter Grünberg and Albert Fert received the 2007 Nobel Prize in Physics for their discovery of giant magnetoresistance and contributions to the field of spintronics. Solid-state techniques can also be used to create devices known as nanoelectromechanical systems (NEMS), which are related to microelectromechanical systems (MEMS). Atomic force microscope tips can be used as a nanoscale write head to deposit a chemical upon a surface in a desired pattern, in a process called dip-pen nanolithography. This fits into the larger subfield of nanolithography.
Functional approaches
These seek to develop components of a desired functionality without regard to how they might be assembled. Molecular electronics seeks to develop molecules with useful electronic properties, which could then be used as single-molecule components in a nanoelectronic device; for an example, see rotaxane. Synthetic chemical methods can also be used to create synthetic molecular motors, such as in the so-called nanocar.

Speculative
These subfields seek to anticipate what inventions nanotechnology might yield, or attempt to propose an agenda along which inquiry might progress. They often take a big-picture view of nanotechnology, with more emphasis on its societal implications than on the details of how such inventions could actually be created. Molecular nanotechnology is a proposed approach which involves manipulating single molecules in finely controlled, deterministic ways. It is more theoretical than the other subfields and is beyond current capabilities. Nanorobotics centers on self-sufficient machines of some functionality operating at the nanoscale. There are hopes for applying nanorobots in medicine, though this may not be easy because of several drawbacks of such devices. Nevertheless, progress on innovative materials and methodologies has been demonstrated, with some patents granted for new nanomanufacturing devices for future commercial applications, which also progressively helps the development towards nanorobots with the use of embedded nanobioelectronics concepts. Programmable matter based on artificial atoms seeks to design materials whose properties can be easily, reversibly and externally controlled. Due to the popularity and media exposure of the term nanotechnology, the words picotechnology and femtotechnology have been coined in analogy to it, although these are used only rarely and informally.


Applications
As of April 24, 2008, the Project on Emerging Nanotechnologies claimed that over 609 nanotech products existed, with new ones hitting the market at a pace of 3-4 per week; the project lists all of the products in a database. Most applications are limited to the use of first-generation passive nanomaterials. These include titanium dioxide in sunscreen, cosmetics and some food products; carbon allotropes used to produce gecko tape; silver in food packaging, clothing, disinfectants and household appliances; zinc oxide in sunscreens and cosmetics, surface coatings, paints and outdoor furniture varnishes; and cerium oxide as a fuel catalyst. The National Science Foundation (a major source of funding for nanotechnology in the United States) funded researcher David Berube to study the field of nanotechnology. His findings are published in the monograph Nano-Hype: The Truth Behind the Nanotechnology Buzz. This study (with a foreword by Mihail Roco, Senior Advisor for Nanotechnology at the National Science Foundation) concludes that much of what is sold as nanotechnology is in fact a recasting of straightforward materials science, which is leading to a nanotech industry built solely on selling nanotubes, nanowires, and the like, and which will end up with a few suppliers selling low-margin products in huge volumes. Further applications which require actual manipulation or arrangement of nanoscale components await further research. Though technologies branded with the term nano are sometimes little related to, and fall far short of, the most ambitious and transformative technological goals of the sort found in molecular manufacturing proposals, the term still connotes such ideas. Thus there may be a danger that a nano bubble will form, or is forming already, from the use of the term by scientists and entrepreneurs to garner funding, regardless of interest in the transformative possibilities of more ambitious and far-sighted work.
Implications
Due to the far-ranging claims that have been made about potential applications of nanotechnology, a number of serious concerns have been raised about what effects these will have on our society if realized, and what action, if any, is appropriate to mitigate these risks. One area of concern is the effect that industrial-scale manufacturing and use of nanomaterials would have on human health and the environment, as suggested by nanotoxicology research. Groups such as the Center for Responsible Nanotechnology have advocated that nanotechnology be specially regulated by governments for these reasons. Others counter that overregulation would stifle scientific research and the development of innovations which could greatly benefit mankind. Other experts, including David Rejeski, director of the Woodrow Wilson Center's Project on Emerging Nanotechnologies, have testified that successful commercialization depends
on adequate oversight, risk research strategy, and public engagement. More recently, local municipalities have passed (Berkeley, CA) or are considering (Cambridge, MA) ordinances requiring nanomaterial manufacturers to disclose the known risks of their products. The National Institute for Occupational Safety and Health is conducting research on how nanoparticles interact with the body's systems and how workers might be exposed to nano-sized particles in the manufacturing or industrial use of nanomaterials. NIOSH offers interim guidelines for working with nanomaterials consistent with the best scientific knowledge. Longer-term concerns center on the implications that new technologies will have for society at large, and whether these could possibly lead to a post-scarcity economy or, alternatively, exacerbate the wealth gap between developed and developing nations. The effects of nanotechnology on society as a whole, on human health and the environment, on trade, on security, on food systems and even on the definition of the human have not been characterized or politicized.

Health and environmental concerns - Nanotoxicology
Some recently developed nanoparticle products may have unintended consequences. Researchers have discovered that silver nanoparticles used in socks to reduce foot odor are being released in the wash, with possible negative consequences. Silver nanoparticles, which are bacteriostatic, may then destroy beneficial bacteria which are important for breaking down organic matter in waste treatment plants or farms. A study at the University of Rochester found that when rats breathed in nanoparticles, the particles settled in the brain and lungs, which led to significant increases in biomarkers for inflammation and stress response.
A major study published more recently in Nature Nanotechnology suggests that some forms of carbon nanotubes, a poster child for the nanotechnology revolution, could be as harmful as asbestos if inhaled in sufficient quantities. Anthony Seaton of the Institute of Occupational Medicine in Edinburgh, Scotland, who contributed to the article on carbon nanotubes, said: "We know that some of them probably have the potential to cause mesothelioma. So those sorts of materials need to be handled very carefully." In the absence of specific nano-regulation forthcoming from governments, Paull and Lyons (2008) have called for an exclusion of engineered nanoparticles from organic food.

Regulation of Nanotechnology
Calls for tighter regulation of nanotechnology have occurred alongside a growing debate about the human health and safety risks associated with nanotechnology. Further, there is significant debate about who is responsible for regulating nanotechnology. While some non-nanotechnology-specific regulatory agencies currently cover some products
and processes (to varying degrees) by bolting nanotechnology onto existing regulations, there are clear gaps in these regimes. Stakeholders concerned by the lack of a regulatory framework to assess and control risks associated with the release of nanoparticles and nanotubes have drawn parallels with bovine spongiform encephalopathy (mad cow disease), thalidomide, genetically modified food, nuclear energy, reproductive technologies, biotechnology, and asbestosis. The Woodrow Wilson Center's Project on Emerging Technologies concludes that there is insufficient funding for human health and safety research, and that as a result there is currently limited understanding of the human health and safety risks associated with nanotechnology. The Royal Society report identified a risk of nanoparticles or nanotubes being released during disposal, destruction and recycling, and recommended that manufacturers of products that fall under extended producer responsibility regimes, such as end-of-life regulations, publish procedures outlining how these materials will be managed to minimize possible human and environmental exposure (p. xiii). Reflecting the challenges of ensuring responsible life-cycle regulation, the Institute for Food and Agricultural Standards has proposed that standards for nanotechnology research and development be integrated across consumer, worker and environmental standards. They also propose that NGOs and other citizen groups play a meaningful role in the development of these standards.

4.4 BIOTECHNOLOGY
Biotechnology is technology based on biology, especially when used in agriculture, food science, and medicine. The United Nations Convention on Biological Diversity defines biotechnology as: "Any technological application that uses biological systems, living organisms, or derivatives thereof, to make or modify products or processes for specific use."
Biotechnology is often used to refer to the genetic engineering technology of the 21st century; however, the term encompasses a wider range and history of procedures for modifying biological organisms according to the needs of humanity, going back to the initial modifications of native plants into improved food crops through artificial selection and hybridization. Bioengineering is the science upon which all biotechnological applications are based. With the development of new approaches and modern techniques, traditional biotechnology industries are also acquiring new horizons, enabling them to improve the quality of their products and increase the productivity of their systems. Before 1971 the term biotechnology was primarily used in the food processing and agriculture industries. Since the 1970s, it began to be used by the Western scientific establishment to refer to laboratory-based techniques being developed in biological research, such as recombinant DNA or tissue culture-based processes, or horizontal gene transfer in living plants using vectors such as the Agrobacterium bacteria to transfer DNA into a
host organism. In fact, the term should be used in a much broader sense to describe the whole range of methods, both ancient and modern, used to manipulate organic materials to meet the demands of food production. The term could thus be defined as: "The application of indigenous and/or scientific knowledge to the management of (parts of) microorganisms, or of cells and tissues of higher organisms, so that these supply goods and services of use to the food industry and its consumers." Biotechnology combines disciplines like genetics, molecular biology, biochemistry, embryology and cell biology, which are in turn linked to practical disciplines like chemical engineering, information technology, and robotics. Patho-biotechnology describes the exploitation of pathogens or pathogen-derived compounds for beneficial effect.

History of Biotechnology
Brewing was an early application of biotechnology. The most practical use of biotechnology, which is still present today, is the cultivation of plants to produce food suitable for humans. Agriculture is theorized to have become the dominant way of producing food with the Neolithic Revolution. The processes and methods of agriculture have been refined by other mechanical and biological sciences since its inception. Through early biotechnology, farmers were able to select the best-suited and highest-yield crops to produce enough food to support a growing population. Other uses of biotechnology were required as crops and fields became increasingly large and difficult to maintain. Specific organisms and organism by-products were used to fertilize, restore nitrogen, and control pests. Throughout the history of agriculture, farmers have inadvertently altered the genetics of their crops by introducing them to new environments and breeding them with other plants, one of the first forms of biotechnology. Cultures such as those in Mesopotamia, Egypt, and Pakistan developed the process of brewing beer.
Beer is still made by the same basic method: using malted grains (containing enzymes) to convert starch from grains into sugar and then adding specific yeasts to produce beer. In this process the carbohydrates in the grains are broken down into alcohols such as ethanol. Ancient Indians also used the juices of the plant Ephedra vulgaris, which they called Soma. Later, other cultures developed the process of lactic acid fermentation, which allowed the fermentation and preservation of other forms of food. Fermentation was also used in this time period to produce leavened bread. Although the process of fermentation was not fully understood until Louis Pasteur's work in 1857, it is still the first use of biotechnology to convert a food source into another form. Combinations of plants and other organisms were used as medications in many early civilizations. Since as early as 200 BC, people began to use disabled or minute amounts of infectious agents to immunize themselves against infections. These and similar processes have been refined in modern medicine and have led to many developments such as antibiotics, vaccines, and other methods of fighting sickness.

In the early twentieth century scientists gained a greater understanding of microbiology and explored ways of manufacturing specific products. In 1917, Chaim Weizmann first used a pure microbiological culture in an industrial process, using Clostridium acetobutylicum to produce from corn starch the acetone that the United Kingdom desperately needed to manufacture explosives during World War I. The field of modern biotechnology is thought to have largely begun on June 16, 1980, when the United States Supreme Court ruled, in the case of Diamond v. Chakrabarty, that a genetically modified microorganism could be patented. Indian-born Ananda Chakrabarty, working for General Electric, had developed a bacterium (derived from the Pseudomonas genus) capable of breaking down crude oil, which he proposed to use in treating oil spills. Revenue in the industry is expected to grow by 12.9% in 2008. Another factor influencing the biotechnology sector's success is improved intellectual property rights legislation and enforcement worldwide, as well as strengthened demand for medical and pharmaceutical products to cope with an ageing, and ailing, U.S. population. Rising demand for biofuels is expected to be good news for the biotechnology sector, with the Department of Energy estimating that ethanol usage could reduce U.S. petroleum-derived fuel consumption by up to 30% by 2030. The biotechnology sector has allowed the U.S. farming industry to rapidly increase its supply of corn and soybeans, the main inputs into biofuels, by developing genetically modified seeds which are resistant to pests and drought. By boosting farm productivity, biotechnology plays a crucial role in ensuring that biofuel production targets are met.

Applications
Biotechnology has applications in four major industrial areas: health care (medical); crop production and agriculture; non-food (industrial) uses of crops and other products (e.g.
biodegradable plastics, vegetable oil, biofuels); and environmental uses. For example, one application of biotechnology is the directed use of organisms for the manufacture of organic products (examples include beer and milk products). Another example is the use by the mining industry of naturally present bacteria in bioleaching. Biotechnology is also used to recycle, treat waste, clean up sites contaminated by industrial activities (bioremediation), and produce biological weapons. A series of derived terms have been coined to identify several branches of biotechnology, for example:

Red biotechnology is applied to medical processes. Some examples are the designing of organisms to produce antibiotics, and the engineering of genetic cures through genomic manipulation.


Green biotechnology is biotechnology applied to agricultural processes. An example would be the selection and domestication of plants via micropropagation. Another example is the designing of transgenic plants to grow under specific environmental conditions or in the presence (or absence) of certain agricultural chemicals. One hope is that green biotechnology might produce more environmentally friendly solutions than traditional industrial agriculture. An example of this is the engineering of a plant to express a pesticide, thereby eliminating the need for external application of pesticides; Bt corn is such a plant. Whether or not green biotechnology products such as this are ultimately more environmentally friendly is a topic of considerable debate.

White biotechnology, also known as industrial biotechnology, is biotechnology applied to industrial processes. An example is the designing of an organism to produce a useful chemical. Another example is the use of enzymes as industrial catalysts, either to produce valuable chemicals or to destroy hazardous/polluting chemicals. White biotechnology tends to consume fewer resources than the traditional processes used to produce industrial goods.

Blue biotechnology is a term that has been used to describe the marine and aquatic applications of biotechnology, but its use is relatively rare. The investments and economic output of all of these types of applied biotechnology form what has been described as the bioeconomy.

Bioinformatics is an interdisciplinary field which addresses biological problems using computational techniques, and makes the rapid organization and analysis of biological data possible. The field may also be referred to as computational biology, and can be defined as "conceptualizing biology in terms of molecules and then applying informatics techniques to understand and organize the information associated with these molecules, on a large scale."
Bioinformatics plays a key role in various areas, such as functional genomics, structural genomics, and proteomics, and forms a key component in the biotechnology and pharmaceutical sector.
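As a small taste of the sequence bookkeeping that bioinformatics automates at scale, consider computing the GC content of a sequence, the fraction of bases that are G or C, which influences DNA stability and is a routine genomic statistic. A minimal Python sketch (the sequence is invented):

```python
def gc_content(seq: str) -> float:
    """Fraction of bases in the sequence that are G or C."""
    seq = seq.upper()
    return sum(base in "GC" for base in seq) / len(seq)

print(gc_content("ATGCGCGCTA"))  # 0.6
```

Real genomic work applies calculations like this across millions of bases, which is exactly why computational tooling is central to the field.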


Application of Biotechnology in Medicine
In medicine, modern biotechnology finds promising applications in such areas as pharmacogenomics, drug production, genetic testing, and gene therapy.

Pharmacogenomics
[Figure: a DNA microarray chip; some can run as many as a million blood tests at once.]
Pharmacogenomics is the study of how an individual's genetic inheritance affects his or her body's response to drugs. It is a coined word derived from the words pharmacology and genomics. It is hence the study of the relationship between pharmaceuticals and
genetics. The vision of pharmacogenomics is to be able to design and produce drugs that are adapted to each person's genetic makeup. Pharmacogenomics promises the following benefits:
1. Development of tailor-made medicines. Using pharmacogenomics, pharmaceutical companies can create drugs based on the proteins, enzymes and RNA molecules that are associated with specific genes and diseases. These tailor-made drugs promise not only to maximize therapeutic effects but also to decrease damage to nearby healthy cells.
2. More accurate methods of determining appropriate drug dosages. Knowing a patient's genetics will enable doctors to determine how well his or her body can process and metabolize a medicine. This will maximize the value of the medicine and decrease the likelihood of overdose.
3. Improvements in the drug discovery and approval process. The discovery of potential therapies will be made easier using genome targets. Genes have been associated with numerous diseases and disorders. With modern biotechnology, these genes can be used as targets for the development of effective new therapies, which could significantly shorten the drug discovery process.
4. Better vaccines. Safer vaccines can be designed and produced by organisms transformed by means of genetic engineering. These vaccines will elicit the immune response without the attendant risks of infection. They will be inexpensive, stable, easy to store, and capable of being engineered to carry several strains of pathogen at once.

Bio-Pharmaceutical products
Most traditional pharmaceutical drugs are relatively simple molecules that have been found, primarily through trial and error, to treat the symptoms of a disease or illness.
Biopharmaceuticals are large biological molecules known as proteins, and they usually target the underlying mechanisms and pathways of a malady (but not always, as with the use of insulin to treat type 1 diabetes mellitus, where the treatment merely addresses the symptoms of the disease, not the underlying cause, which is autoimmunity); the biopharmaceutical industry is relatively young. Biopharmaceuticals can address targets in humans that may not be accessible with traditional medicines. A patient is typically dosed with a small molecule via a tablet, while a large molecule is typically injected. Small molecules are manufactured by chemistry, but larger molecules are created by living cells: for example, bacterial cells, yeast cells, or animal or plant cells. Modern biotechnology is often associated with the use of genetically altered microorganisms such as E. coli or yeast for the production of substances like synthetic insulin or antibiotics. It can also refer to transgenic animals or transgenic plants, such as Bt
corn. Genetically altered mammalian cells, such as Chinese hamster ovary (CHO) cells, are also used to manufacture certain pharmaceuticals. Another promising new biotechnology application is the development of plant-made pharmaceuticals. Biotechnology is also commonly associated with landmark breakthroughs in new medical therapies to treat hepatitis B, hepatitis C, cancers, arthritis, haemophilia, bone fractures, multiple sclerosis, and cardiovascular disorders. The biotechnology industry has also been instrumental in developing molecular diagnostic devices that can be used to define the target patient population for a given biopharmaceutical. Herceptin, for example, was the first drug approved for use with a matching diagnostic test; it is used to treat breast cancer in women whose cancer cells express the protein HER2. Modern biotechnology can be used to manufacture existing medicines relatively easily and cheaply. The first genetically engineered products were medicines designed to treat human diseases. To cite one example, in 1978 Genentech developed synthetic "humanized" insulin by joining its gene with a plasmid vector inserted into the bacterium Escherichia coli. Insulin, widely used for the treatment of diabetes, was previously extracted from the pancreases of abattoir animals (cattle and/or pigs). The resulting genetically engineered bacterium enabled the production of vast quantities of synthetic human insulin at relatively low cost, although the cost savings were used to increase profits for manufacturers rather than being passed on to consumers or their healthcare providers. According to a 2003 study undertaken by the International Diabetes Federation (IDF) on the access to and availability of insulin in its member countries, synthetic human insulin is considerably more expensive in most countries where both synthetic human and animal insulin are commercially available: for example,
within European countries the average price of synthetic human insulin was twice as high as the price of pork insulin. Yet in its position statement, the IDF writes that "there is no overwhelming evidence to prefer one species of insulin over another" and that "[modern, highly purified] animal insulins remain a perfectly acceptable alternative." Modern biotechnology has evolved, making it possible to produce more easily and relatively cheaply human growth hormone, clotting factors for hemophiliacs, fertility drugs, erythropoietin and other drugs. Most drugs today are based on about 500 molecular targets. Genomic knowledge of the genes involved in diseases, disease pathways, and drug-response sites is expected to lead to the discovery of thousands more new targets.

Genetic testing
[Figure: gel electrophoresis.]
Genetic testing involves the direct examination of the DNA molecule itself. A scientist scans a patient's DNA sample for mutated sequences. There are two major types of gene test. In the first type, a researcher may design short pieces of DNA (probes) whose sequences are complementary to the mutated
sequences. These probes will seek their complement among the base pairs of an individual's genome. If the mutated sequence is present in the patient's genome, the probe will bind to it and flag the mutation. In the second type, a researcher may conduct the gene test by comparing the sequence of DNA bases in a patient's gene to the sequence found in healthy individuals. Genetic testing is now used for:
- Determining sex
- Carrier screening, or the identification of unaffected individuals who carry one copy of a gene for a disease that requires two copies for the disease to manifest
- Prenatal diagnostic screening
- Newborn screening
- Presymptomatic testing for predicting adult-onset disorders
- Presymptomatic testing for estimating the risk of developing adult-onset cancers
- Confirmational diagnosis of symptomatic individuals
- Forensic/identity testing
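The first type of gene test described above amounts to a search problem: the probe binds wherever its reverse complement occurs in the sample, so finding a binding site flags the mutation. A toy Python sketch (all sequences here are invented for illustration):

```python
PAIR = {"A": "T", "T": "A", "C": "G", "G": "C"}  # Watson-Crick pairs

def probe_binding_sites(probe: str, sample: str) -> list:
    """Positions in the sample where the probe would hybridize."""
    # A probe binds the reverse complement of its own sequence.
    target = "".join(PAIR[b] for b in reversed(probe))
    width = len(target)
    return [i for i in range(len(sample) - width + 1)
            if sample[i:i + width] == target]

mutated = "GGTAC"                                    # known mutated sequence
probe = "".join(PAIR[b] for b in reversed(mutated))  # probe complementary to it
sample = "ATCGGTACTTGGTACA"                          # patient DNA fragment
print(probe_binding_sites(probe, sample))  # [3, 10] -> mutation flagged
```

A non-empty result corresponds to the probe binding and flagging the mutation; real assays, of course, work with fluorescent labels and far longer sequences rather than string matching.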

Some genetic tests are already available, although most of them are used in developed countries. The tests currently available can detect mutations associated with rare genetic disorders like cystic fibrosis, sickle cell anemia, and Huntington's disease. Recently, tests have been developed to detect mutations for a handful of more complex conditions such as breast, ovarian, and colon cancers. However, gene tests may not detect every mutation associated with a particular condition because many are as yet undiscovered, and the ones they do detect may present different risks to different people and populations. Several issues have been raised regarding the use of genetic testing:
1. Absence of cure. There is still a lack of effective treatment or preventive measures for many diseases and conditions now being diagnosed or predicted using gene tests. Thus, revealing information about the risk of a future disease that has no existing cure presents an ethical dilemma for medical practitioners.
2. Ownership and control of genetic information. Who will own and control genetic information, or information about genes, gene products, or inherited characteristics derived from an individual or a group of people such as indigenous communities? At the macro level, there is the possibility of a genetic divide, with developing countries that do not have access to medical applications of biotechnology being deprived of benefits accruing from products derived from genes obtained from their own people. Moreover, genetic information can pose a risk for minority population groups, as it can lead to group stigmatization.

3. At the individual level, the absence of privacy and anti-discrimination legal protections in most countries can lead to discrimination in employment or insurance, or other misuse of personal genetic information. This raises questions such as whether genetic privacy is different from medical privacy.
4. Reproductive issues. These include the use of genetic information in reproductive decision-making and the possibility of genetically altering reproductive cells that may be passed on to future generations. For example, germline therapy forever changes the genetic make-up of an individual's descendants. Thus, any error in technology or judgment may have far-reaching consequences. Ethical issues like designer babies and human cloning have also given rise to controversies among scientists and bioethicists, especially in the light of past abuses with eugenics.
5. Clinical issues. These center on the capabilities and limitations of doctors and other health-service providers, people identified with genetic conditions, and the general public in dealing with genetic information.
6. Effects on social institutions. Genetic tests reveal information about individuals and their families. Thus, test results can affect the dynamics within social institutions, particularly the family.
7. Conceptual and philosophical implications regarding human responsibility, free will vis-à-vis genetic determinism, and the concepts of health and disease.

Gene therapy

(Figure: Gene therapy using an adenovirus vector. A new gene is inserted into an adenovirus vector, which is used to introduce the modified DNA into a human cell. If the treatment is successful, the new gene will make a functional protein.)

Gene therapy may be used for treating, or even curing, genetic and acquired diseases like cancer and AIDS by using normal genes to supplement or replace defective genes or to bolster a normal function such as immunity.
It can be used to target somatic (i.e., body) or germ (i.e., egg and sperm) cells. In somatic gene therapy, the genome of the recipient is changed, but this change is not passed along to the next generation. In contrast, in germline gene therapy, the egg and sperm cells of the parents are changed for the purpose of passing on the changes to their offspring. There are basically two ways of implementing a gene therapy treatment:
1. Ex vivo, which means outside the body. Cells from the patient's blood or bone marrow are removed and grown in the laboratory. They are then exposed to a virus carrying the desired gene. The virus enters the cells, and the desired gene becomes part of the DNA of the cells. The cells are allowed to grow in the laboratory before being returned to the patient by injection into a vein.
2. In vivo, which means inside the body. No cells are removed from the patient's body. Instead, vectors are used to deliver the desired gene to cells in the patient's body.
Currently, the use of gene therapy is limited. Somatic gene therapy is primarily at the experimental stage. Germline therapy is the subject of much discussion, but it is not being actively investigated in larger animals and human beings. As of June 2001, more than 500 clinical gene-therapy trials involving about 3,500 patients had been identified worldwide. Around 78% of these were in the United States, with Europe having 18%. These trials focus on various types of cancer, although other multigenic diseases are being studied as well. Recently, two children born with severe combined immunodeficiency disorder (SCID) were reported to have been cured after being given genetically engineered cells. Gene therapy faces many obstacles before it can become a practical approach for treating disease. At least four of these obstacles are as follows:
1. Gene delivery tools. Genes are inserted into the body using gene carriers called vectors. The most common vectors now are viruses, which have evolved a way of encapsulating and delivering their genes to human cells in a pathogenic manner. Scientists manipulate the genome of the virus by removing the disease-causing genes and inserting the therapeutic genes. However, while viruses are effective, they can introduce problems like toxicity, immune and inflammatory responses, and gene control and targeting issues.
2. Limited knowledge of the functions of genes. Scientists currently know the functions of only a few genes. Hence, gene therapy can address only some of the genes that cause a particular disease. Worse, it is not known whether some genes have more than one function, which creates uncertainty as to whether replacing such genes is indeed desirable.
3. Multigene disorders and the effect of the environment. Most genetic disorders involve more than one gene. Moreover, most diseases involve the interaction of several genes and the environment.
For example, many people with cancer not only inherit the disease gene for the disorder, but may also have failed to inherit specific tumor suppressor genes. Diet, exercise, smoking and other environmental factors may also have contributed to their disease.
4. High costs. Since gene therapy is relatively new and at an experimental stage, it is an expensive treatment to undertake. This explains why current studies are focused on illnesses commonly found in developed countries, where more people can afford to pay for treatment. It may take decades before developing countries can take advantage of this technology.

Human Genome Project

The Human Genome Project is an initiative of the U.S. Department of Energy (DOE) that aims to generate a high-quality reference sequence for the entire human genome and identify all the human genes.

The DOE and its predecessor agencies were assigned by the U.S. Congress to develop new energy resources and technologies and to pursue a deeper understanding of potential health and environmental risks posed by their production and use. In 1986, the DOE announced its Human Genome Initiative. Shortly thereafter, the DOE and the National Institutes of Health developed a plan for a joint Human Genome Project (HGP), which officially began in 1990. The HGP was originally planned to last 15 years. However, rapid technological advances and worldwide participation accelerated the completion date to 2003, making it a 13-year project. It has already enabled gene hunters to pinpoint genes associated with more than 30 disorders.

Cloning

Cloning involves the removal of the nucleus from one cell and its placement in an unfertilized egg cell whose nucleus has either been deactivated or removed. There are two types of cloning:
1. Reproductive cloning. After a few divisions, the egg cell is placed into a uterus, where it is allowed to develop into a fetus that is genetically identical to the donor of the original nucleus.
2. Therapeutic cloning. The egg is placed into a Petri dish, where it develops into embryonic stem cells, which have shown potential for treating several ailments.
In February 1997, cloning became the focus of media attention when Ian Wilmut and his colleagues at the Roslin Institute announced the successful cloning of a sheep, named Dolly, from the mammary glands of an adult female. The cloning of Dolly made it apparent to many that the techniques used to produce her could someday be used to clone human beings. This stirred a lot of controversy because of its ethical implications.

Agriculture

Improved yield from crops

Using the techniques of modern biotechnology, one or two genes may be transferred to a highly developed crop variety to impart a new character that would increase its yield (30).
However, while increased crop yield is the most obvious application of modern biotechnology in agriculture, it is also the most difficult one to achieve. Current genetic engineering techniques work best for effects that are controlled by a single gene. Many of the genetic characteristics associated with yield (e.g., enhanced growth) are controlled by a large number of genes, each of which has a minimal effect on the overall yield (31). There is, therefore, much scientific work to be done in this area.

Reduced vulnerability of crops to environmental stresses

Crops containing genes that will enable them to withstand biotic and abiotic stresses may be developed. For example, drought and excessively salty soil are two important limiting factors in crop productivity. Biotechnologists are studying plants that can cope with
these extreme conditions in the hope of finding the genes that enable them to do so and eventually transferring these genes to more desirable crops. One of the latest developments is the identification of a plant gene, At-DBF2, from thale cress, a tiny weed that is often used for plant research because it is very easy to grow and its genetic code is well mapped out. When this gene was inserted into tomato and tobacco cells (see RNA interference), the cells were able to withstand environmental stresses like salt, drought, cold and heat far better than ordinary cells. If these preliminary results prove successful in larger trials, then At-DBF2 genes can help in engineering crops that can better withstand harsh environments (32). Researchers have also created transgenic rice plants that are resistant to rice yellow mottle virus (RYMV). In Africa, this virus destroys the majority of the rice crop and makes the surviving plants more susceptible to fungal infections (33).

Increased nutritional qualities of food crops

Proteins in foods may be modified to increase their nutritional qualities. Proteins in legumes and cereals may be transformed to provide the amino acids needed by human beings for a balanced diet (34). A good example is the work of Professors Ingo Potrykus and Peter Beyer on the so-called Golden Rice.

Improved taste, texture or appearance of food

Modern biotechnology can be used to slow down the process of spoilage so that fruit can ripen longer on the plant and then be transported to the consumer with a still reasonable shelf life. This improves the taste, texture and appearance of the fruit. More importantly, it could expand the market for farmers in developing countries due to the reduction in spoilage. The first genetically modified food product was a tomato which was transformed to delay its ripening (35).
Researchers in Indonesia, Malaysia, Thailand, the Philippines and Vietnam are currently working on delayed-ripening papaya in collaboration with the University of Nottingham and Zeneca (36). Biotechnology is also used in cheese production: enzymes produced by micro-organisms provide an alternative to animal rennet (a cheese coagulant) and an alternative supply for cheese makers. This also eliminates possible public concerns with animal-derived material, although there are currently no plans to develop synthetic milk, making this argument less compelling. Enzymes offer an animal-friendly alternative to animal rennet. While providing comparable quality, they are theoretically also less expensive. About 85 million tons of wheat flour is used every year to bake bread. By adding an enzyme called maltogenic amylase to the flour, bread stays fresh longer. Assuming that 10-15% of bread is thrown away, if bread could stay fresh another 5-7 days, then 2 million tons of flour per year would be saved. That corresponds to 40% of the bread consumed in a country such as the USA. This means more bread becomes available with no increase

in input. In combination with other enzymes, bread can also be made bigger, more appetizing and better in a range of ways.

Reduced dependence on fertilizers, pesticides and other agrochemicals

Most of the current commercial applications of modern biotechnology in agriculture are aimed at reducing the dependence of farmers on agrochemicals. For example, Bacillus thuringiensis (Bt) is a soil bacterium that produces a protein with insecticidal qualities. Traditionally, a fermentation process has been used to produce an insecticidal spray from these bacteria. In this form, the Bt toxin occurs as an inactive protoxin, which requires digestion by an insect to be effective. There are several Bt toxins, and each one is specific to certain target insects. Crop plants have now been engineered to contain and express the genes for Bt toxin, which they produce in its active form. When a susceptible insect ingests the transgenic crop cultivar expressing the Bt protein, it stops feeding and soon thereafter dies as a result of the Bt toxin binding to its gut wall. Bt corn is now commercially available in a number of countries to control corn borer (a lepidopteran insect), which is otherwise controlled by spraying (a more difficult process). Crops have also been genetically engineered to acquire tolerance to broad-spectrum herbicides. The lack of cost-effective herbicides with broad-spectrum activity and no crop injury was a consistent limitation in crop weed management. Multiple applications of numerous herbicides were routinely used to control a wide range of weed species detrimental to agronomic crops. Weed management tended to rely on preemergence treatment; that is, herbicides were sprayed in anticipation of expected weed infestations rather than in response to the actual weeds present. Mechanical cultivation and hand weeding were often necessary to control weeds not controlled by herbicide applications.
The introduction of herbicide-tolerant crops has the potential of reducing the number of herbicide active ingredients used for weed management, reducing the number of herbicide applications made during a season, and increasing yield due to improved weed management and less crop injury. Transgenic crops that express tolerance to glyphosate, glufosinate and bromoxynil have been developed. These herbicides can now be sprayed on transgenic crops without inflicting damage on the crops while killing nearby weeds (37). From 1996 to 2001, herbicide tolerance was the most dominant trait introduced to commercially available transgenic crops, followed by insect resistance. In 2001, herbicide tolerance deployed in soybean, corn and cotton accounted for 77% of the 626,000 square kilometres planted to transgenic crops; Bt crops accounted for 15%; and stacked genes for herbicide tolerance and insect resistance used in both cotton and corn accounted for 8% (38).

Production of novel substances in crop plants

Biotechnology is being applied for novel uses other than food. For example, oilseed can be modified to produce fatty acids for detergents, substitute fuels and petrochemicals. Potatoes, tomatoes, rice, tobacco, lettuce, safflowers, and other plants have been genetically
engineered to produce insulin and certain vaccines. If future clinical trials prove successful, the advantages of edible vaccines would be enormous, especially for developing countries. The transgenic plants could be grown locally and cheaply. Homegrown vaccines would also avoid the logistical and economic problems posed by having to transport traditional preparations over long distances and keeping them cold while in transit. And since they are edible, they would not need syringes, which are not only an additional expense in traditional vaccine preparations but also a source of infections if contaminated. In the case of insulin grown in transgenic plants, it is well established that the gastrointestinal system breaks the protein down, so it could not currently be administered as an edible protein. However, it might be produced at significantly lower cost than insulin produced in costly bioreactors. For example, Calgary, Canada-based SemBioSys Genetics, Inc. reports that its safflower-produced insulin will reduce unit costs by 25% or more and reduce the capital costs associated with building a commercial-scale insulin manufacturing facility by approximately $100 million compared to traditional biomanufacturing facilities.

Criticism

There is another side to the agricultural biotechnology issue, however. It includes increased herbicide usage and resultant herbicide resistance, super weeds, residues on and in food crops, genetic contamination of non-GM crops (which hurts organic and conventional farmers), damage to wildlife from glyphosate, etc.

4.5 BIOLOGICAL ENGINEERING

Biotechnological engineering or biological engineering is a branch of engineering that focuses on biotechnologies and biological science. It includes different disciplines such as biochemical engineering, biomedical engineering, bio-process engineering, biosystem engineering and so on. Because of the novelty of the field, the role of the bioengineer is still being defined.
However, in general, it is an integrated approach combining fundamental biological sciences with traditional engineering principles. Bioengineers are often employed to scale up bioprocesses from the laboratory scale to the manufacturing scale. Moreover, as with most engineers, they often deal with management, economic and legal issues. Since patents and regulation (e.g., FDA regulation in the U.S.) are very important issues for biotech enterprises, bioengineers are often required to have knowledge related to these issues. The increasing number of biotech enterprises is likely to create a need for bioengineers in the years to come. Many universities throughout the world now provide programs in bioengineering and biotechnology (as independent programs or specialty programs within more established engineering fields).

Bioremediation and Biodegradation

Biotechnology is being used to engineer and adapt organisms, especially microorganisms, in an effort to find sustainable ways to clean up contaminated environments. The elimination of a wide range of pollutants and wastes from the environment is an absolute requirement to promote the sustainable development of our society with low environmental impact. Biological processes play a major role in the removal of contaminants, and biotechnology is taking advantage of the astonishing catabolic versatility of microorganisms to degrade/convert such compounds. New methodological breakthroughs in sequencing, genomics, proteomics, bioinformatics and imaging are producing vast amounts of information. In the field of Environmental Microbiology, genome-based global studies open a new era, providing unprecedented in silico views of metabolic and regulatory networks, as well as clues to the evolution of degradation pathways and to the molecular adaptation strategies to changing environmental conditions. Functional genomic and metagenomic approaches are increasing our understanding of the relative importance of different pathways and regulatory networks to carbon flux in particular environments and for particular compounds, and they will certainly accelerate the development of bioremediation technologies and biotransformation processes. Marine environments are especially vulnerable since oil spills in coastal regions and the open sea are poorly containable and mitigation is difficult. In addition to pollution through human activities, millions of tons of petroleum enter the marine environment every year from natural seepages. Despite its toxicity, a considerable fraction of the petroleum oil entering marine systems is eliminated by the hydrocarbon-degrading activities of microbial communities, in particular by a remarkable recently discovered group of specialists, the so-called hydrocarbonoclastic bacteria (HCB).
4.6 TELECOMMUNICATIONS

Telecommunication is the assisted transmission of signals over a distance for the purpose of communication. In earlier times, this may have involved the use of smoke signals, drums, semaphore, flags, or heliograph. In modern times, telecommunication typically involves the use of electronic transmitters such as the telephone, television, radio or computer. Early inventors in the field of telecommunication include Alexander Graham Bell, Guglielmo Marconi and John Logie Baird. Telecommunication is an important part of the world economy; the telecommunication industry's revenue was estimated at $1.2 trillion in 2006.

Basic elements

A telecommunication system consists of three basic elements: a transmitter that takes information and converts it to a signal; a transmission medium that carries the signal; and a receiver that receives the signal and converts it back into usable information.
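The three basic elements above can be sketched as a toy pipeline; the function names and the byte-level "signal" are illustrative inventions, not part of any real telecommunication stack:

```python
# Toy model of the three basic elements: a transmitter encodes information
# into a signal (here, a list of byte values), the medium carries it, and
# the receiver converts it back into usable information. Illustrative only.

def transmitter(message: str) -> list[int]:
    """Convert information into a signal (a sequence of byte values)."""
    return list(message.encode("utf-8"))

def medium(signal: list[int]) -> list[int]:
    """An ideal transmission medium: carries the signal unchanged."""
    return signal

def receiver(signal: list[int]) -> str:
    """Convert the received signal back into information."""
    return bytes(signal).decode("utf-8")

received = receiver(medium(transmitter("hello")))
print(received)  # hello
```

A real medium would attenuate and add noise to the signal rather than pass it through unchanged, which is exactly the issue the next sections address.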
For example, in a radio broadcast the broadcast tower is the transmitter, free space is the transmission medium and the radio is the receiver. Often telecommunication systems are two-way, with a single device acting as both a transmitter and receiver, or transceiver. For example, a mobile phone is a transceiver. Telecommunication over a phone line is called point-to-point communication because it is between one transmitter and one receiver. Telecommunication through radio broadcasts is called broadcast communication because it is between one powerful transmitter and numerous receivers.

Analogue or digital

Signals can be either analogue or digital. In an analogue signal, the signal is varied continuously with respect to the information. In a digital signal, the information is encoded as a set of discrete values (for example, ones and zeros). During transmission, the information contained in analogue signals will be degraded by noise. Conversely, unless the noise exceeds a certain threshold, the information contained in digital signals will remain intact. This noise resistance represents a key advantage of digital signals over analogue signals.

Networks

A collection of transmitters, receivers or transceivers that communicate with each other is known as a network. Digital networks may consist of one or more routers that route information to the correct user. An analogue network may consist of one or more switches that establish a connection between two or more users. For both types of network, repeaters may be necessary to amplify or recreate the signal when it is being transmitted over long distances. This is to combat attenuation, which can render the signal indistinguishable from noise.

Channels

A channel is a division in a transmission medium so that it can be used to send multiple streams of information. For example, a radio station may broadcast at 96.1 MHz while another radio station may broadcast at 94.5 MHz.
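Returning to the analogue-versus-digital distinction above, the threshold effect can be shown with a small simulation. The noise levels and the 0.5 decision threshold below are made up for illustration; this is a sketch, not a channel model from any standard:

```python
# Illustration of why digital signals resist noise: an analogue value keeps
# whatever noise it picks up, while a noisy digital level is snapped back to
# 0 or 1 by thresholding, as long as the noise stays below the threshold.
import random

random.seed(42)  # reproducible noise for the demonstration

def transmit_analogue(value: float, noise: float) -> float:
    """Analogue: noise is added to the value and is never removed."""
    return value + random.uniform(-noise, noise)

def transmit_digital(bit: int, noise: float) -> int:
    """Digital: send 0.0 or 1.0, add noise, then threshold at 0.5."""
    level = float(bit) + random.uniform(-noise, noise)
    return 1 if level >= 0.5 else 0

bits = [1, 0, 1, 1, 0, 0, 1, 0]
recovered = [transmit_digital(b, noise=0.4) for b in bits]
print(recovered == bits)  # True: every bit survives, since 0.4 < 0.5

degraded = transmit_analogue(0.7, noise=0.4)
print(degraded)  # some random offset from 0.7 remains forever
```

If the noise amplitude were raised above 0.5, bits would start to flip, which is the "certain threshold" the text refers to.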
In this case, the medium has been divided by frequency, and each channel has received a separate frequency to broadcast on. Alternatively, one could allocate each channel a recurring segment of time over which to broadcast; this is known as time-division multiplexing and is sometimes used in digital communication.

Modulation

The shaping of a signal to convey information is known as modulation. Modulation can be used to represent a digital message as an analogue waveform. This is known as keying, and several keying techniques exist (these include phase-shift keying, frequency-shift keying and amplitude-shift keying). Bluetooth, for example, uses phase-shift keying to exchange information between devices. Modulation can also be used to transmit the information of analogue signals at higher frequencies. This is helpful because low-frequency analogue signals cannot be effectively transmitted over free space. Hence the information from a low-frequency analogue signal must be superimposed on a higher-frequency signal (known as a carrier wave) before transmission. There are several different modulation schemes available to achieve this (two of the most basic being amplitude modulation and frequency modulation). An example of this process is a DJ's voice being superimposed on a 96 MHz carrier wave using frequency modulation (the voice would then be received on a radio as the channel 96 FM).

Society and telecommunication

Telecommunication is an important part of modern society. In 2006, estimates placed the telecommunication industry's revenue at $1.2 trillion, or just under 3% of the gross world product (official exchange rate). It has several economic, social and sovereignty-related impacts.

Microeconomics

On the microeconomic scale, companies have used telecommunication to help build global empires. This is self-evident in the case of online retailer Amazon.com but, according to academic Edward Lenert, even the conventional retailer Wal-Mart has benefited from better telecommunication infrastructure compared to its competitors. In cities throughout the world, home owners use their telephones to organize many home services, ranging from pizza deliveries to electricians. Even relatively poor communities have been noted to use telecommunication to their advantage. In Bangladesh's Narshingdi district, isolated villagers use cell phones to speak directly to wholesalers and arrange a better price for their goods. In Côte d'Ivoire, coffee growers share mobile phones to follow hourly variations in coffee prices and sell at the best price.
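The amplitude-modulation scheme described in the Modulation section can be sketched numerically. The frequencies below are toy values chosen so the code runs instantly, not real broadcast parameters:

```python
# Sketch of amplitude modulation: the carrier's amplitude follows a
# low-frequency message tone, s(t) = (1 + m(t)) * cos(2*pi*fc*t).
# Frequencies and amplitudes here are illustrative toy values.
import math

def am_modulate(message_freq_hz: float, carrier_freq_hz: float,
                duration_s: float, sample_rate_hz: float) -> list[float]:
    """Superimpose a small message tone on a higher-frequency carrier."""
    n = int(duration_s * sample_rate_hz)
    samples = []
    for i in range(n):
        t = i / sample_rate_hz
        m = 0.5 * math.cos(2 * math.pi * message_freq_hz * t)  # message tone
        samples.append((1.0 + m) * math.cos(2 * math.pi * carrier_freq_hz * t))
    return samples

# A 2 Hz "voice" riding on a 50 Hz carrier, sampled at 1 kHz for 1 second.
signal = am_modulate(message_freq_hz=2, carrier_freq_hz=50,
                     duration_s=1.0, sample_rate_hz=1000)
print(len(signal))  # 1000
```

The receiver recovers the message by tracking the envelope of the carrier, which by construction never exceeds 1 + the message amplitude (here 1.5).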
Macroeconomics

On the macroeconomic scale, Lars-Hendrik Röller and Leonard Waverman suggested a causal link between good telecommunication infrastructure and economic growth. Few dispute the existence of a correlation, although some argue it is wrong to view the relationship as causal. Because of the economic benefits of good telecommunication infrastructure, there is increasing worry about the inequitable access to telecommunication services amongst various countries of the world; this is known as the digital divide.

SMS

In 2000, market research group Ipsos MORI reported that 81% of 15-to-24-year-old SMS users in the United Kingdom had used the service to coordinate social arrangements. The cellular telephone industry has had a significant impact on
telecommunications. A 2003 survey by the International Telecommunication Union (ITU) revealed that roughly one-third of countries have fewer than 1 mobile subscription for every 20 people and one-third of countries have fewer than 1 fixed-line subscription for every 20 people. In terms of Internet access, roughly half of all countries have fewer than 1 in 20 people with Internet access. From this information, as well as educational data, the ITU was able to compile an index that measures the overall ability of citizens to access and use information and communication technologies. Using this measure, Sweden, Denmark and Iceland received the highest ranking, while the African countries Niger, Burkina Faso and Mali received the lowest.

Telegraph and telephone

The first commercial electrical telegraph was constructed by Sir Charles Wheatstone and Sir William Fothergill Cooke and opened on 9 April 1839. Both Wheatstone and Cooke viewed their device as "an improvement to the [existing] electromagnetic telegraph", not as a new device. Samuel Morse independently developed a version of the electrical telegraph that he unsuccessfully demonstrated on 2 September 1837. His code was an important advance over Wheatstone's signaling method. The first transatlantic telegraph cable was successfully completed on 27 July 1866, allowing transatlantic telecommunication for the first time. The conventional telephone was invented independently by Alexander Bell and Elisha Gray in 1876. Antonio Meucci had invented a device that allowed the electrical transmission of voice over a line as early as 1849. However, Meucci's device was of little practical value because it relied upon the electrophonic effect and thus required users to place the receiver in their mouth to "hear" what was being said. The first commercial telephone services were set up in 1878 and 1879 on both sides of the Atlantic, in the cities of New Haven and London.
Radio and television

In 1832, James Lindsay gave a classroom demonstration of wireless telegraphy to his students. By 1854, he was able to demonstrate a transmission across the Firth of Tay from Dundee, Scotland to Woodhaven, a distance of two miles (3 km), using water as the transmission medium. In December 1901, Guglielmo Marconi established wireless communication between St. John's, Newfoundland (Canada) and Poldhu, Cornwall (England), earning him the 1909 Nobel Prize in Physics (which he shared with Karl Braun). However, small-scale radio communication had already been demonstrated in 1893 by Nikola Tesla in a presentation to the National Electric Light Association. On March 25, 1925, John Logie Baird was able to demonstrate the transmission of moving pictures at the London department store Selfridges. Baird's device relied upon the Nipkow disk and thus became known as the mechanical television. It formed the basis of
experimental broadcasts done by the British Broadcasting Corporation beginning September 30, 1929. However, for most of the twentieth century, televisions depended upon the cathode ray tube invented by Karl Braun. The first version of such a television to show promise was produced by Philo Farnsworth and demonstrated to his family on September 7, 1927.

Computer networks and the Internet

On September 11, 1940, George Stibitz was able to transmit problems using teletype to his Complex Number Calculator in New York and receive the computed results back at Dartmouth College in New Hampshire. This configuration of a centralized computer or mainframe with remote dumb terminals remained popular throughout the 1950s. However, it was not until the 1960s that researchers started to investigate packet switching, a technology that would allow chunks of data to be sent to different computers without first passing through a centralized mainframe. A four-node network emerged on December 5, 1969; this network would become ARPANET, which by 1981 would consist of 213 nodes. ARPANET's development centred around the Request for Comment process, and on April 7, 1969, RFC 1 was published. This process is important because ARPANET would eventually merge with other networks to form the Internet, and many of the protocols the Internet relies upon today were specified through the Request for Comment process. In September 1981, RFC 791 introduced the Internet Protocol v4 (IPv4) and RFC 793 introduced the Transmission Control Protocol (TCP), thus creating the TCP/IP protocol suite that much of the Internet relies upon today. However, not all important developments were made through the Request for Comment process. Two popular link protocols for local area networks (LANs) also appeared in the 1970s.
A patent for the token ring protocol was filed by Olof Soderblom on October 29, 1974, and a paper on the Ethernet protocol was published by Robert Metcalfe and David Boggs in the July 1976 issue of Communications of the ACM.

Modern operation

Telephone


Optical fiber provides cheaper bandwidth for long-distance communication.

In an analogue telephone network, the caller is connected to the person he wants to talk to by switches at various telephone exchanges. The switches form an electrical connection between the two users, and the setting of these switches is determined electronically when the caller dials the number. Once the connection is made, the caller's voice is transformed into an electrical signal using a small microphone in the caller's handset. This electrical signal is then sent through the network to the user at the other end, where it is transformed back into sound by a small speaker in that person's handset. There is a separate electrical connection that works in reverse, allowing the users to converse. The fixed-line telephones in most residential homes are analogue: the speaker's voice directly determines the signal's voltage. Although short-distance calls may be handled end-to-end as analogue signals, telephone service providers are increasingly and transparently converting the signals to digital for transmission before converting them back to analogue for reception. The advantage of this is that digitized voice data can travel side-by-side with data from the Internet and can be perfectly reproduced in long-distance communication (as opposed to analogue signals, which are inevitably impacted by noise). Mobile phones have had a significant impact on telephone networks. Mobile phone subscriptions now outnumber fixed-line subscriptions in many markets. Sales of mobile phones in 2005 totalled 816.6 million, with that figure being almost equally shared amongst the markets of Asia/Pacific (204 m), Western Europe (164 m), CEMEA (Central Europe, the Middle East and Africa) (153.5 m), North America (148 m) and Latin America (102 m). In terms of new subscriptions over the five years from 1999, Africa has outpaced other markets with 58.2% growth.
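The digitization of voice described above rests on sampling and quantization: the analogue signal is measured at regular intervals and each measurement is mapped to one of a fixed number of discrete levels. The sketch below uses an 8 kHz sampling rate and 8-bit samples, which are typical of classic telephony, but the test tone and error bound are chosen only for demonstration (real systems also apply companding such as mu-law or A-law).

```python
import math

SAMPLE_RATE = 8000   # samples per second, the classic telephony rate
TONE_HZ = 1000       # an illustrative 1 kHz test tone
LEVELS = 256         # 8-bit quantization: 256 discrete levels

# Sample eight points of the analogue tone:
samples = [math.sin(2 * math.pi * TONE_HZ * n / SAMPLE_RATE) for n in range(8)]

# Quantize: map the range [-1.0, 1.0] onto 256 integer levels:
quantized = [round((s + 1.0) / 2.0 * (LEVELS - 1)) for s in samples]

# Reconstruct the analogue values and measure the worst quantization error:
reconstructed = [q / (LEVELS - 1) * 2.0 - 1.0 for q in quantized]
max_error = max(abs(a - b) for a, b in zip(samples, reconstructed))
print(max_error < 2.0 / LEVELS)  # -> True: error stays within one level
```

The small, bounded quantization error is the price paid for a representation that, unlike an analogue voltage, can be regenerated perfectly at every hop.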
Increasingly, these phones are being serviced by systems in which the voice content is transmitted digitally, such as GSM or W-CDMA, with many markets choosing to deprecate analogue systems such as AMPS. There have also been dramatic changes in telephone communication behind the scenes. Starting with the operation of TAT-8 in 1988, the 1990s saw the widespread adoption of systems based on optic fibres. The benefit of communicating with optic fibres is that they offer a drastic increase in data capacity. TAT-8 itself was able to carry 10 times as many telephone calls as the last copper cable laid at that time, and today's optic fibre cables are able to carry 25 times as many telephone calls as TAT-8. This increase in data capacity is due to several factors. First, optic fibres are physically much smaller than competing technologies. Second, they do not suffer from crosstalk, which means several hundred of them can be easily bundled together in a single cable. Lastly, improvements in multiplexing have led to an exponential growth in the data capacity of a single fibre. Assisting communication across many modern optic fibre networks is a protocol known as Asynchronous Transfer Mode (ATM). The ATM protocol allows for the side-by-side data transmission mentioned above. It is suitable for public
telephone networks because it establishes a pathway for data through the network and associates a traffic contract with that pathway. The traffic contract is essentially an agreement between the client and the network about how the network is to handle the data; if the network cannot meet the conditions of the traffic contract, it does not accept the connection. This is important because telephone calls can negotiate a contract so as to guarantee themselves a constant bit rate, something that will ensure a caller's voice is not delayed in parts or cut off completely. There are competitors to ATM, such as Multiprotocol Label Switching (MPLS), that perform a similar task and are expected to supplant ATM in the future.

Radio and television

In a broadcast system, a central high-powered broadcast tower transmits a high-frequency electromagnetic wave to numerous low-powered receivers. The high-frequency wave sent by the tower is modulated with a signal containing visual or audio information. The antenna of the receiver is then tuned so as to pick up the high-frequency wave, and a demodulator is used to retrieve the signal containing the visual or audio information. The broadcast signal can be either analogue (the signal is varied continuously with respect to the information) or digital (the information is encoded as a set of discrete values). The broadcast media industry is at a critical turning point in its development, with many countries moving from analogue to digital broadcasts. This move is made possible by the production of cheaper, faster and more capable integrated circuits. The chief advantage of digital broadcasts is that they eliminate a number of the complaints associated with traditional analogue broadcasts. For television, this includes the elimination of problems such as snowy pictures, ghosting and other distortion. These occur because of the nature of analogue transmission, which means that perturbations due to noise will be evident in the final output.
Digital transmission overcomes this problem because digital signals are reduced to discrete values upon reception, and hence small perturbations do not affect the final output. In a simplified example, if a binary message 1011 was transmitted with signal amplitudes [1.0 0.0 1.0 1.0] and received with signal amplitudes [0.9 0.2 1.1 0.9], it would still decode to the binary message 1011, a perfect reproduction of what was sent. From this example, a problem with digital transmissions can also be seen: if the noise is great enough, it can significantly alter the decoded message. Using forward error correction, a receiver can correct a handful of bit errors in the resulting message, but too much noise will lead to incomprehensible output and hence a breakdown of the transmission. In digital television broadcasting, there are three competing standards that are likely to be adopted worldwide. These are the ATSC, DVB and ISDB standards; the adoption of these standards thus far is presented in the captioned map. All three standards use MPEG-2 for video compression. ATSC uses Dolby Digital AC-3 for audio compression, ISDB uses Advanced Audio Coding (MPEG-2 Part 7) and DVB has no standard for
audio compression but typically uses MPEG-1 Part 3 Layer 2. The choice of modulation also varies between the schemes. In digital audio broadcasting, standards are much more unified, with practically all countries choosing to adopt the Digital Audio Broadcasting standard (also known as the Eureka 147 standard). The exception is the United States, which has chosen to adopt HD Radio. HD Radio, unlike Eureka 147, is based upon a transmission method known as in-band on-channel transmission, which allows digital information to piggyback on normal AM or FM analogue transmissions. However, despite the pending switch to digital, analogue receivers still remain widespread. Analogue television is still transmitted in practically all countries. The United States had hoped to end analogue broadcasts on December 31, 2006; however, this was recently pushed back to February 17, 2009. For analogue television, there are three standards in use: PAL, NTSC and SECAM. For analogue radio, the switch to digital is made more difficult by the fact that analogue receivers are a fraction of the cost of digital receivers. The choice of modulation for analogue radio is typically between amplitude modulation (AM) and frequency modulation (FM). To achieve stereo playback, an amplitude-modulated subcarrier is used for stereo FM.

The Internet

The Internet is a worldwide network of computers and computer networks that can communicate with each other using the Internet Protocol. Any computer on the Internet has a unique IP address that can be used by other computers to route information to it. Hence, any computer on the Internet can send a message to any other computer using its IP address. These messages carry with them the originating computer's IP address, allowing for two-way communication. In this way, the Internet can be seen as an exchange of messages between computers.
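The structure of the IP addresses described above can be explored with Python's standard ipaddress module. The address below is the www.google.com example quoted later in this section; the /18 network prefix is an assumed example chosen only to illustrate prefix matching, which is what routers actually use when forwarding packets.

```python
import ipaddress

# An IPv4 address is a structured 32-bit number used for routing:
addr = ipaddress.ip_address("72.14.207.99")
print(addr.version)     # -> 4
print(int(addr))        # the same address as a single 32-bit integer
print(addr.is_private)  # -> False: a globally routable address

# Routers forward based on which network prefix an address falls inside:
net = ipaddress.ip_network("72.14.192.0/18")  # assumed example prefix
print(addr in net)      # -> True
```

A router does not need to know every individual address: one forwarding entry for the /18 prefix covers all 16,384 addresses that share those leading bits.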
As of 2008, an estimated 21.9% of the world population has access to the Internet, with the highest access rates (measured as a percentage of the population) in North America (73.6%), Oceania/Australia (59.5%) and Europe (48.1%). In terms of broadband access, Iceland (26.7%), South Korea (25.4%) and the Netherlands (25.3%) led the world in 2005. The Internet works in part because of protocols that govern how the computers and routers communicate with each other. The nature of computer network communication lends itself to a layered approach, where individual protocols in the protocol stack run more or less independently of other protocols. This allows lower-level protocols to be customized for the network situation while not changing the way higher-level protocols operate. A practical example of why this is important is that it allows an Internet browser to run the same code regardless of whether the computer it is running on is connected to the Internet through an Ethernet or Wi-Fi connection. Protocols are often talked about in terms of their place in the OSI reference model (pictured on the right), which emerged in
1983 as the first step in an unsuccessful attempt to build a universally adopted networking protocol suite. For the Internet, the physical medium and data link protocol can vary several times as packets traverse the globe. This is because the Internet places no constraints on what physical medium or data link protocol is used. This leads to the adoption of media and protocols that best suit the local network situation. In practice, most intercontinental communication will use the Asynchronous Transfer Mode (ATM) protocol (or a modern equivalent) on top of optic fibre. This is because for most intercontinental communication the Internet shares the same infrastructure as the public switched telephone network. At the network layer, things become standardized with the Internet Protocol (IP) being adopted for logical addressing. For the World Wide Web, these IP addresses are derived from the human-readable form using the Domain Name System (e.g. 72.14.207.99 is derived from www.google.com). At the moment, the most widely used version of the Internet Protocol is version four, but a move to version six is imminent. At the transport layer, most communication adopts either the Transmission Control Protocol (TCP) or the User Datagram Protocol (UDP). TCP is used when it is essential that every message sent is received by the other computer, whereas UDP is used when it is merely desirable. With TCP, packets are retransmitted if they are lost and placed in order before they are presented to higher layers. With UDP, packets are not ordered or retransmitted if lost. Both TCP and UDP packets carry port numbers with them to specify what application or process the packet should be handled by. Because certain application-level protocols use certain ports, network administrators can restrict Internet access by blocking the traffic destined for a particular port.
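The port-based addressing described above can be demonstrated with a short loopback experiment using Python's standard socket module: a datagram is delivered to whichever process is bound to the target port. The message and the use of an OS-assigned port are illustrative choices, not part of any protocol.

```python
import socket

# A "server" binds a UDP socket to a port; port 0 asks the OS for a free one:
receiver = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
receiver.bind(("127.0.0.1", 0))
port = receiver.getsockname()[1]

# A "client" addresses its datagram with (IP address, port number):
sender = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
sender.sendto(b"hello", ("127.0.0.1", port))

# The datagram arrives at the process bound to that port:
data, addr = receiver.recvfrom(1024)
print(data.decode())  # -> hello
sender.close()
receiver.close()
```

Because the port number, not the application, is what the transport layer sees, an administrator blocking that port would silently discard such datagrams, which is exactly the access-control mechanism described in the text.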
Above the transport layer, there are certain protocols that are sometimes used and loosely fit in the session and presentation layers, most notably the Secure Sockets Layer (SSL) and Transport Layer Security (TLS) protocols. These protocols ensure that the data transferred between two parties remains completely confidential, and one or the other is in use when a padlock appears at the bottom of your web browser. Finally, at the application layer are many of the protocols Internet users would be familiar with, such as HTTP (web browsing), POP3 (e-mail), FTP (file transfer), IRC (Internet chat), BitTorrent (file sharing) and OSCAR (instant messaging).

Local area networks

Despite the growth of the Internet, the characteristics of local area networks (computer networks that extend at most a few kilometres) remain distinct. This is because networks on this scale do not require all the features associated with larger networks and are often more cost-effective and efficient without them. In the mid-1980s, several protocol suites emerged to fill the gap between the data link and application layers of the OSI reference model. These were AppleTalk, IPX and NetBIOS, with the dominant protocol suite during the early 1990s being IPX due to its
popularity with MS-DOS users. TCP/IP existed at this point but was typically only used by large government and research facilities. As the Internet grew in popularity and a larger percentage of traffic became Internet-related, local area networks gradually moved towards TCP/IP, and today networks mostly dedicated to TCP/IP traffic are common. The move to TCP/IP was helped by technologies such as DHCP that allowed TCP/IP clients to discover their own network address, a functionality that came standard with the AppleTalk/IPX/NetBIOS protocol suites. It is at the data link layer, though, that most modern local area networks diverge from the Internet. Whereas Asynchronous Transfer Mode (ATM) or Multiprotocol Label Switching (MPLS) are typical data link protocols for larger networks, Ethernet and Token Ring are typical data link protocols for local area networks. These protocols differ from the former in that they are simpler (e.g. they omit features such as Quality of Service guarantees) and offer collision prevention. Both of these differences allow for more economical set-ups. Despite the modest popularity of Token Ring in the 1980s and 1990s, virtually all local area networks now use wired or wireless Ethernet. At the physical layer, most wired Ethernet implementations use copper twisted-pair cables (including the common 10BASE-T networks). However, some early implementations used coaxial cables, and some recent implementations (especially high-speed ones) use optic fibres. Optic fibres are also likely to feature prominently in the forthcoming 10-gigabit Ethernet implementations. Where optic fibre is used, the distinction must be made between multi-mode fibre and single-mode fibre. Multi-mode fibre can be thought of as thicker optical fibre that is cheaper to manufacture but that suffers from less usable bandwidth and greater attenuation (i.e. poorer long-distance performance).
Summary

This unit has given an insight into some of the contemporary technologies such as nanotechnology, biotechnology and bio-pharma, along with the fundamentals of telecommunications.

Questions

1. What do you understand by globalisation of the industry? Explain using examples.
2. Write briefly on:
a. Nano-technology
b. Nano-materials
c. Bio-technology
d. Bio-pharma industry
e. Gene therapy
3. Elaborate on the types of contemporary communication devices.


UNIT V


TECHNOLOGICAL COMPETITIVENESS IN COUNTRIES


Introduction

In this unit, some light is thrown on existing procedures such as BPR and TQM and on how these procedures enable global competitiveness. Additionally, a brief discussion of collaborative intelligence is also documented. The technological competitiveness of developed and developing countries is also discussed.

Learning Objectives

History and methodology of BPR
Quality management evolution
Examples of collaborative knowledge
The Importance of High-Technology Industries
Share of World Markets
Global Competitiveness of Individual Industries
Exports by High-Technology Industries
Foreign Markets
Industry Comparisons
Competition in the Home Market
National Demand for High-Technology Products
National Producers Supplying the Home Market
Global Business in Knowledge-Intensive Service Industries
Major Technology Areas
New Technological Frontiers in India
5.1 BUSINESS PROCESS REENGINEERING

Business process reengineering (BPR) is a management approach aiming at improvements by means of elevating the efficiency and effectiveness of the processes that exist within and across organizations. The key to BPR is for organizations to look at their business processes from a clean-slate perspective and determine how they can best construct these processes to improve how they conduct business. Business process reengineering is also known as BPR, Business Process Redesign, Business Transformation, or Business Process Change Management.

History

In 1990, Michael Hammer, a former professor of computer science at the Massachusetts Institute of Technology (MIT), published an article in the Harvard Business Review in which he claimed that the major challenge for managers is to obliterate non-value-adding work, rather than using technology for automating it. This statement implicitly accused managers of having focused on the wrong issues, namely that technology in general, and more specifically information technology, has been used primarily for automating existing work rather than as an enabler for making non-value-adding work obsolete. Most of the work being done does not add any value for customers, and this work should be removed, not accelerated through automation. Instead, companies should reconsider their processes in order to maximize customer value while minimizing the consumption of resources required for delivering their product or service. This idea of unbiasedly reviewing a company's business processes was rapidly adopted by a huge number of firms striving for renewed competitiveness, which they had lost due to the market entrance of foreign competitors, their inability to satisfy customer needs, and their inefficient cost structures. Even well-established management thinkers, such as Peter Drucker and Tom Peters, were accepting and advocating BPR as a new tool for (re-)achieving success in a dynamic world.
Hammer and Champy (1993) define BPR as "the fundamental rethinking and radical redesign of business processes to achieve dramatic improvements in critical contemporary measures of performance, such as cost, quality, service, and speed". In order to achieve the major improvements BPR is seeking, changing structural organizational variables and other ways of managing and performing work is often considered insufficient. To be able to reap the achievable benefits fully, the use of information technology (IT) is conceived as a major contributing factor. While IT has traditionally been used for supporting the existing business functions, i.e. for increasing organizational efficiency, it now plays a role as an enabler of new organizational forms and patterns of collaboration within and between organizations.
BPR derives its existence from different disciplines, and four major areas can be identified as being subjected to change in BPR - organization, technology, strategy, and people - where a process view is used as a common framework for considering these dimensions. Business strategy is the primary driver of BPR initiatives, and the other dimensions are governed by strategy's encompassing role. The organization dimension reflects the structural elements of the company, such as hierarchical levels, the composition of organizational units, and the distribution of work between them. Technology is concerned with the use of computer systems and other forms of communication technology in the business. In BPR, information technology is generally considered as playing a role as an enabler of new forms of organizing and collaborating, rather than supporting existing business functions. The people / human resources dimension deals with aspects such as education, training, motivation and reward systems. The concept of business processes - interrelated activities aiming at creating a value-added output for a customer - is the basic underlying idea of BPR. These processes are characterized by a number of attributes: process ownership, customer focus, value adding, and cross-functionality.

The role of information technology

Information technology (IT) has historically played an important role in the reengineering concept. It is considered by some as a major enabler for new forms of working and collaborating within an organization and across organizational borders. The early BPR literature identified several so-called disruptive technologies that were supposed to challenge traditional wisdom about how work should be performed.
Shared databases, making information available at many places
Expert systems, allowing generalists to perform specialist tasks
Telecommunication networks, allowing organizations to be centralized and decentralized at the same time
Decision-support tools, allowing decision-making to be a part of everybody's job
Wireless data communication and portable computers, allowing field personnel to work independently of the office
Interactive videodisk, to get in immediate contact with potential buyers
Automatic identification and tracking, allowing things to tell where they are, instead of requiring to be found
High-performance computing, allowing on-the-fly planning and revisioning

In the mid-1990s, workflow management systems in particular were considered a significant contributor to improved process efficiency. ERP (Enterprise Resource Planning)
155

NOTES

ANNA UNIVERSITY CHENNAI

DBA 1736

NOTES

vendors, such as SAP, JD Edwards, Oracle and PeopleSoft, also positioned their solutions as vehicles for business process redesign and improvement.

Methodology of BPR

Although the labels and steps differ slightly, the early methodologies that were rooted in IT-centric BPR solutions share many of the same basic principles and elements. The following outline is one such model, based on the PRLC (Process Reengineering Life Cycle) approach developed by Guha et al. (1993). A simplified schematic outline of using a business process approach, exemplified for pharmaceutical R&D:
1. Structural organization with functional units
2. Introduction of New Product Development as a cross-functional process
3. Re-structuring and streamlining of activities, removal of non-value-adding tasks

Envision new processes: secure management support; identify reengineering opportunities; identify enabling technologies; align with corporate strategy
Initiate change: set up the reengineering team; outline performance goals
Process diagnosis: describe existing processes; uncover pathologies in existing processes
Process redesign: develop alternative process scenarios; develop the new process design; design the HR architecture; select the IT platform; develop an overall blueprint and gather feedback
Reconstruction: develop/install the IT solution; establish process changes
Process monitoring: performance measurement, including time, quality, cost and IT performance
Link to continuous improvement; loop-back to diagnosis.

Benefiting from lessons learned from the early adopters, some BPR practitioners advocated a change in emphasis to a customer-centric, as opposed to an IT-centric, methodology. One such methodology, which also incorporated a Risk and Impact Assessment to account for the impact that BPR can have on jobs and operations, was described by Lon Roberts (1994). Roberts also stressed the use of change management tools to proactively address resistance to change, a factor linked to the demise of many reengineering initiatives that looked good on the drawing board. BPR, if implemented properly, can give huge returns. BPR has helped giants like the Procter and Gamble Corporation and the General Motors Corporation succeed after financial drawbacks due to competition. It also helped American Airlines recover somewhat from the bad debt haunting its business practice. BPR is about the proper method of implementation. General Motors Corporation implemented a 3-year plan to consolidate its multiple desktop systems into one, known internally as the Consistent Office Environment (Booker, 1994). This reengineering process involved replacing the numerous brands of desktop systems, network operating systems and application development tools with a more manageable number of vendors and technology platforms. According to Donald G. Hedeen, director of desktops and deployment at GM and manager of the upgrade program, the process lays the foundation for the implementation of a common business communication strategy across General Motors (Booker, 1994). Lotus Development Corporation and Hewlett-Packard Development Company, formerly Compaq Computer Corporation, received the single largest non-government sale ever from General Motors Corporation. GM also planned to use Novell NetWare as a security client, Microsoft Office and Hewlett-Packard printers. According to Donald G.
Hedeen, this saved GM 10% to 25% on support costs, 3% to 5% on hardware, and 40% to 60% on software licensing fees, and increased efficiency by overcoming incompatibility issues through the use of just one platform across the entire company. Ford reengineered its business and manufacturing process from simply manufacturing cars to manufacturing quality cars, where the number-one goal is quality. This helped Ford save millions on recalls and warranty repairs. Ford accomplished this goal by incorporating barcodes on all its parts and scanners to scan for any missing parts in a completed car coming off the assembly line. This helped the company guarantee a safe, quality car. Ford has also implemented Voice-over-IP (VoIP) to reduce the cost of meetings between branches.


The most frequent and harsh critique against BPR concerns its strict focus on efficiency and technology and its disregard of the people in the organization that is subjected to a reengineering initiative. Very often, the label BPR was used for major workforce reductions. Other criticisms brought forward against the BPR concept include:

Lack of management support for the initiative and thus poor acceptance in the organization.
Exaggerated expectations regarding the potential benefits from a BPR initiative, and consequently failure to achieve the expected results.
Underestimation of the resistance to change within the organization.
Implementation of generic so-called best-practice processes that do not fit specific company needs.
Over-reliance on technology solutions.
Performing BPR as a one-off project with limited strategy alignment and long-term perspective.
Poor project management.

Development after 1995

With the publication of critiques in 1995 and 1996 by some of the early BPR proponents, coupled with abuses and misuses of the concept by others, the reengineering fervor in the U.S. began to wane. Since then, considering business processes as a starting point for business analysis and redesign has become a widely accepted approach and is a standard part of the change methodology portfolio, but it is typically performed in a less radical way than originally proposed. More recently, the concept of Business Process Management (BPM) has gained major attention in the corporate world and can be considered a successor to the BPR wave of the 1990s, as it is equally driven by a striving for process efficiency supported by information technology. Equivalently to the critique brought forward against BPR, BPM is now accused of focusing on technology and disregarding the people aspects of change.
5.2 QUALITY MANAGEMENT

Quality management is a method for ensuring that all the activities necessary to design, develop and implement a product or service are effective and efficient with respect to the system and its performance. Quality management can be considered to have four main components: quality planning, quality control, quality assurance and quality improvement. Quality management is focused not only on product quality, but also on the means to achieve it. Quality management therefore uses quality assurance and control of processes, as well as products, to achieve more consistent quality. Quality management comprises all activities of the overall management function that determine the quality policy, objectives and responsibilities
and implement them by means such as quality control and quality improvement within a quality system.

Quality management evolution

Quality management is not a recent phenomenon. Advanced civilizations that supported the arts and crafts allowed clients to choose goods meeting higher quality standards than normal goods. In societies where art and craft (and craftsmanship) were valued, one of the responsibilities of a master craftsman (and similarly for artists) was to lead the studio and to train and supervise the work of craftsmen and apprentices. The master craftsman set standards, reviewed the work of others and ordered rework and revision as necessary. One of the limitations of the craft approach was that relatively few goods could be produced; on the other hand, an advantage was that each item produced could be individually shaped to suit the client. This craft-based approach to quality, and the practices used, were major inputs when quality management was created as a management science. During the industrial revolution, the importance of craftsmen diminished as mass production and repetitive work practices were instituted. The aim was to produce large numbers of the same goods. The first proponent of this approach in the US was Eli Whitney, who proposed (interchangeable) parts manufacture for muskets, hence producing identical components and creating a musket assembly line. The next step forward was promoted by several people, including Frederick Winslow Taylor, a mechanical engineer who sought to improve industrial efficiency. He is sometimes called the father of scientific management. He was one of the intellectual leaders of the Efficiency Movement, and part of his approach laid a further foundation for quality management, including aspects like standardization and the adoption of improved practices. Henry Ford was also important in bringing process and quality management practices into operation in his assembly lines.
In Germany, Karl Friedrich Benz, often called the inventor of the motor car, was pursuing similar assembly and production practices, although real mass production was properly initiated at Volkswagen after World War II. From this period onwards, North American companies focused predominantly upon production at lower cost with increased efficiency. Walter A. Shewhart made a major step in the evolution towards quality management by creating a method for quality control in production using statistical methods, first proposed in 1924. This became the foundation for his ongoing work on statistical quality control. W. Edwards Deming later applied statistical process control methods in the United States during World War II, thereby successfully improving quality in the manufacture of munitions and other strategically important products. Quality leadership from a national perspective has changed over the past five to six decades. After the Second World War, Japan decided to make quality improvement a national imperative as part of rebuilding its economy, and sought the help of Shewhart, Deming and Juran, amongst others. W. Edwards Deming championed Shewhart's ideas in
Japan from 1950 onwards. He is probably best known for his management philosophy of establishing quality, productivity, and competitive position. He formulated 14 points of attention for managers, which are a high-level abstraction of many of his deeper insights and should be interpreted by learning and understanding those insights. They include:
- Break down barriers between departments
- Management should learn their responsibilities, and take on leadership
- Improve constantly
- Institute a programme of education and self-improvement
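Shewhart's statistical approach to quality control, described above, rests on control limits: a process measurement is flagged when it falls outside roughly three standard deviations of the historical mean. The following is only an illustrative sketch of that idea; the measurement data and function names are hypothetical, not taken from any standard.

```python
from statistics import mean, pstdev

def control_limits(samples):
    """Return (lower, center, upper) three-sigma control limits from past measurements."""
    center = mean(samples)
    sigma = pstdev(samples)
    return center - 3 * sigma, center, center + 3 * sigma

def out_of_control(samples, value):
    """Flag a new measurement that falls outside the control limits."""
    lower, _, upper = control_limits(samples)
    return value < lower or value > upper

# Hypothetical shaft diameters (mm) from a stable process
history = [10.01, 9.98, 10.02, 10.00, 9.99, 10.01, 10.00, 9.97, 10.03, 9.99]
print(out_of_control(history, 10.15))  # prints True: a clearly aberrant reading
```

A real control chart adds run rules and separate charts for process mean and spread, but the three-sigma test above is the core signal Shewhart proposed.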

In the 1950s and 1960s, Japanese goods were synonymous with cheapness and low quality, but over time Japan's quality initiatives began to succeed, and Japan achieved very high levels of product quality from the 1970s onward. For example, Japanese cars regularly top the J.D. Power customer satisfaction ratings. In the 1980s, Deming was asked by Ford Motor Company to start a quality initiative after they realized that they were falling behind Japanese manufacturers. A number of highly successful quality initiatives have been invented by the Japanese (for example, Taguchi methods, QFD and the Toyota Production System). Many of these methods not only provide techniques but also have associated quality culture aspects (i.e. people factors). These methods are now adopted by the same western countries that decades earlier derided Japanese methods.

Customers recognize that quality is an important attribute in products and services. Suppliers recognize that quality can be an important differentiator between their own offerings and those of competitors (quality differentiation is also called the quality gap). In the past two decades this quality gap has been greatly reduced between competitive products and services. This is partly due to the contracting (also called outsourcing) of manufacture to countries like India and China, as well as the internationalization of trade and competition. These countries, amongst many others, have raised their own standards of quality in order to meet international standards and customer demands. The ISO 9000 series of standards are probably the best known international standards for quality management.

Quality improvement

There are many methods for quality improvement. These cover product improvement, process improvement and people-based improvement. The following methods of quality management and techniques incorporate and drive quality improvement:
- ISO 9004:2000 - Guidelines for performance improvement.
- ISO 15504-4:2005 - Information technology - Process assessment - Part 4: Guidance on use for process improvement and process capability determination.
- QFD - Quality Function Deployment, also known as the House of Quality approach.

- Kaizen - Japanese for "change for the better"; the common English usage is continual improvement.
- Zero Defect Program - created by NEC Corporation of Japan, based upon Statistical Process Control and one of the inputs for the inventors of Six Sigma.
- Six Sigma - based upon Statistical Process Control.
- PDCA - the Plan-Do-Check-Act cycle for quality control purposes. (Six Sigma's DMAIC method - Define, Measure, Analyze, Improve, Control - serves more general improvement purposes.)
- Quality circle - a group (people-oriented) approach to improvement.
- Taguchi methods - statistically oriented methods including quality robustness, the quality loss function and target specifications.
- The Toyota Production System - reworked in the west into Lean Manufacturing.
- Kansei Engineering - an approach that focuses on capturing customer emotional feedback about products to drive improvement.
- TQM - Total Quality Management, a management strategy aimed at embedding awareness of quality in all organizational processes. First promoted in Japan with the Deming Prize, which was adopted and adapted in the USA as the Malcolm Baldrige National Quality Award and in Europe as the European Foundation for Quality Management award (each with its own variations).
- TRIZ - the theory of inventive problem solving.
- BPR - Business Process Reengineering, a management approach aiming at clean-slate improvements (i.e. ignoring existing practices).

Proponents of each approach have sought to improve these methods as well as apply them to enterprise types not originally targeted. For example, Six Sigma was designed for manufacturing but has spread to service enterprises. Each of these approaches and methods has met with success but also with failure. Some of the common differentiators between success and failure include commitment, the knowledge and expertise available to guide improvement, the scope of change/improvement desired (big-bang changes tend to fail more often than smaller changes) and adaptation to enterprise cultures.
For example, quality circles do not work well in every enterprise (and are even discouraged by some managers), and relatively few TQM-participating enterprises have won the national quality awards. There have been well-publicized failures of BPR, as well as of Six Sigma. Enterprises therefore need to consider carefully which quality improvement methods to adopt, and certainly should not adopt all of those listed here. It is important not to underestimate the people
factors, such as culture, in selecting a quality improvement approach. Any improvement (change) takes time to implement, gain acceptance and stabilize as accepted practice. Improvement must allow pauses between changes so that each change can stabilize and be assessed as a real improvement before the next one is made (hence continual improvement, not continuous improvement). Improvements that change the culture take longer, as they have to overcome greater resistance to change. It is easier and often more effective to work within existing cultural boundaries and make small improvements (i.e. Kaizen) than to make major transformational changes. The use of Kaizen in Japan was a major reason for the creation of Japanese industrial and economic strength. On the other hand, transformational change works best when an enterprise faces a crisis and needs to make major changes in order to survive. In Japan, the land of Kaizen, Carlos Ghosn led a transformational change at Nissan Motor Company, which was in a financial and operational crisis. Well-organized quality improvement programs take all these factors into account when selecting quality improvement methods.

Quality standards

The International Organization for Standardization (ISO) created the Quality Management System (QMS) standards in 1987. These were the ISO 9000:1987 series of standards, comprising ISO 9001:1987, ISO 9002:1987 and ISO 9003:1987, which were applicable to different types of industries based on the type of activity or process: design, production or service delivery. The standards have been regularly reviewed every few years by the International Organization for Standardization. The version released in 1994 was called the ISO 9000:1994 series, comprising the ISO 9001:1994, 9002:1994 and 9003:1994 versions. The last revision was in the year 2000, and the series was called the ISO 9000:2000 series.
However, the ISO 9002 and 9003 standards were integrated, and one single certifiable standard was created under ISO 9001:2000. Since December 2003, the ISO 9002 and 9003 standards are no longer valid, and organizations previously holding them need to transition from the old standards to the new one. The ISO 9004:2000 document gives guidelines for performance improvement over and above the basic standard (i.e. ISO 9001:2000). This standard provides a measurement framework for improved quality management, similar to and based upon the measurement framework for process assessment. The Quality Management System standards created by ISO are meant to certify the processes and the system of an organization, not the product or service itself; ISO 9000 standards do not certify the quality of the product or service. Recently the International Organisation for Standardisation released a new standard, ISO 22000, meant for the food industry. This standard covers the values and principles of ISO 9000 and the HACCP standards. It provides one single integrated standard for the food industry and is expected to become more popular in that industry in the coming years.
ISO has a number of standards that support quality management: one group describes processes (including ISO 12207 and ISO 15288) and another describes process assessment and improvement (ISO 15504). The Software Engineering Institute has its own process assessment and improvement methods, called CMMI (Capability Maturity Model Integration) and IDEAL respectively.

Quality management terms

Quality improvement can be distinguished from quality control in that quality improvement is the purposeful change of a process to improve the reliability of achieving an outcome, whereas quality control is the ongoing effort to maintain the integrity of a process so that it reliably achieves an outcome. Quality assurance comprises the planned or systematic actions necessary to provide enough confidence that a product or service will satisfy the given requirements for quality.

5.3 COLLABORATIVE KNOWLEDGE

A Collaborative Innovation Network, or COIN, is a social construct used to describe innovative teams. It has been defined by the originator of the term, Peter Gloor (a research scientist at MIT Sloan's Center for Collective Intelligence), as a cyberteam of self-motivated people with a collective vision, enabled by the Web to collaborate in achieving a common goal by sharing ideas, information, and work. COINs feature internal transparency and direct communication. Members of a COIN collaborate and share knowledge directly with each other, rather than through hierarchies. They come together with a shared vision because they are intrinsically motivated to do so and seek to collaborate in some way to advance an idea. The five essential elements of collaborative innovation networks (what Gloor calls their genetic code) are that they evolve from learning networks, feature sound ethical principles, are based on trust and self-organization, make knowledge accessible to everyone, and operate with internal honesty and transparency.
COINs rely on modern technology such as the Internet, e-mail, and other communications vehicles for information sharing. Creativity, collaboration, and communication are their hallmarks. COINs existed well before modern communication technology enabled their creation and development. Peter Gloor and Scott Cooper, in their book, describe Benjamin Franklin's Junto organization in Philadelphia as a COIN paradigm. Franklin brought together people with diverse backgrounds, from varying occupations, but of like mind, to share knowledge and promulgate innovation.

Similar is the concept of the Self-Organizing Innovation Network, which has been described by Robert Rycroft of the Elliott School of International Affairs at George Washington University as follows: "The most valuable and complex technologies are increasingly innovated by networks that self-organize. Networks are those linked organizations (e.g., firms, universities, government agencies) that create, acquire, and integrate the diverse knowledge and skills required to innovate complex technologies (e.g., aircraft, telecommunications equipment). In other words, innovation networks are organized around constant learning. Self-organization refers to the capacity these networks have for combining and recombining these learned capabilities without centralized, detailed managerial guidance. The proliferation of self-organizing innovation networks may be linked to many factors, but a key one seems to be increasing globalization. Indeed, globalization and self-organizing networks may be coevolving. Changes in the organization of the innovation process appear to have facilitated the broadening geographical linkages of products, processes, and markets. At the same time, globalization seems to induce cooperation among innovative organizations." - Robert Rycroft

Examples

An example of the COIN idea at work may be SpineConnect, a community of spine surgeons interacting in a variety of ways, ultimately with the goal of producing innovation. It cannot be stated with certainty that the group had its genesis as a COIN, but it does illustrate some of the concepts. Starting out as a knowledge-sharing community enabling surgeons from around the world to share difficult and unusual cases, it quickly emerged as a community producing innovation collaboratively. Since its launch in October 2005, the surgeons have used SpineConnect to produce original research and to take their ideas forward and create patents.
As the community matures, more ambitious goals are being pursued, such as creating a better classification system of diseases of the spine. Isotelesis is a collaborative project which seeks to facilitate innovation and the semantic integration of the world's knowledge. This global network may change the way scientific and technological research is conducted, accelerating discoveries and enhancing interoperability. Since this multidimensional framework may be thought of as a global mind, Isotelesis seeks to use this intelligence for global projects. This would shift the paradigm from centralized and decentralized databases to a distributively integrated database, allowing humanity to contribute to its future development, working together with coherent intelligence towards common goals. (Iso = same, equal + telesis = attainment of desired ends through intelligence; isotelesis is the principle that any one function is served by several structures and processes.)

Collaborative intelligence is a measure of the collaborative ability of a group or entity. Knowledge derived from collaborative efforts is increasing in proportion to the reach of the World Wide Web and of collaborative groupware such as Skype, NetMeeting, WebEx, iPeerAdvisory and many others. IQ is a term readily used to describe or measure the intelligence quotient of a person, and EQ has been used to measure the emotional intelligence of a person, describing how a person handles emotions in a given situation. CQ, or collaborative intelligence, measures the collaborative ability of a group. CQ is a fairly new term arising from the visibility of the collaborative efforts of companies and entities. CQ describes a situation where the knowledge and problem-solving capability of a group is much greater than the knowledge possessed by any individual group member. As groups work together they develop a shared memory, which is accessible through the collaborative artifacts created by the group, including meeting minutes, transcripts from threaded discussions, and drawings. The shared memory (group memory) is also accessible through the memories of group members. Distributed collaborative intelligence is the act of a group collaborating within a virtual sphere of interaction. Group members can interact in real time or asynchronously even though they are not located within the same physical space. Technologies used to enhance distributed collaborative intelligence and to facilitate group problem solving include:
- Messaging: synchronous conferencing technologies such as instant messaging, online chat and shared whiteboards, and asynchronous messaging such as electronic mail, threaded or moderated discussion forums and web logs
- Stigmergy
- Wikis
- Social evolutionary computation

The ability of a group to solve a problem collectively is potentially directly proportional to the number of members in the group; however, an effective architecture of interaction is needed to achieve this. Critical success factors for a high collaborative intelligence quotient are:
- Group moderation and facilitation
- Adherence to a small set of fundamental rules relating to member interaction
- No limits to thinking, and the promotion of creative thinking
- Strong group membership feedback
- Quality control
- Nurturing of ideas, with solutions upheld only after critical peer review
- The construction of a deeply documented group memory or knowledge base

Most countries acknowledge a symbiotic relationship between investment in S&T and success in the marketplace: S&T supports competitiveness in international trade, and commercial success in the global marketplace provides the resources needed to support new S&T. Consequently, the nation's economic health is a performance measure for the national investment in R&D and in science and engineering (S&E). The Organisation for Economic Co-operation and Development (OECD) currently identifies four industries as high-technology (science-based industries whose products involve above-average levels of R&D): aerospace, computers and office machinery, communications equipment, and pharmaceuticals. High-technology industries are important to nations for several reasons:
- High-technology firms innovate, and firms that innovate tend to gain market share, create new product markets, and/or use resources more productively
- High-technology firms develop high value-added products and are successful in foreign markets, which results in greater compensation for their employees
- Industrial R&D performed by high-technology industries benefits other commercial sectors by generating new products and processes that increase productivity, expand business, and create high-wage jobs

5.4 TECHNOLOGY COMPETITIVENESS IN DEVELOPED COUNTRIES

The Importance of High-Technology Industries

The global market for high-technology goods is growing at a faster rate than that for other manufactured goods, and high-technology industries are driving economic growth around the world. During the 19-year period examined (1980-98), high-technology production grew at an inflation-adjusted average annual rate of nearly 6.0 percent, compared with 2.7 percent for other manufactured goods.
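Average annual rates like the 6.0 percent figure cited here are compound annual growth rates. As an illustration of the arithmetic only, the index values below are invented, chosen so the result lands near the cited rate:

```python
def cagr(start_value, end_value, years):
    """Compound annual growth rate, as a fraction, over the given number of years."""
    return (end_value / start_value) ** (1 / years) - 1

# Hypothetical output index: production roughly tripling between 1980 and 1998
growth = cagr(start_value=100.0, end_value=286.0, years=18)
print(f"{growth:.1%}")  # prints 6.0%
```

Note that an 18-year compounding window (1980 to 1998) is used; compounding is why output can nearly triple while the annual rate stays modest.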
Global economic activity was especially strong at the end of the period (1995-98), when high-technology industry output grew at 13.9 percent per year, more than three times the rate of growth of all other manufacturing industries. Output by the four high-technology industries, those identified as being the most research-intensive, represented 7.6 percent of global production of all manufactured goods in 1980; by 1998, this figure had risen to 12.7 percent. During the 1980s, the United States and other high-wage countries devoted increasing resources to the manufacture of higher-value, technology-intensive goods, often referred to as high-technology manufactures. During this period, Japan led the major industrialized countries in its concentration on high-technology manufactures. In 1980, high-technology

manufactures accounted for about 8 percent of total Japanese production, approaching 11 percent in 1984 and increasing to 11.6 percent in 1989. By contrast, high-technology manufactures represented nearly 11 percent of total U.S. production in 1989, up from 9.6 percent in 1980. European nations also saw high-technology manufactures account for a growing share of their total production, although to a lesser degree than in the United States and Japan. The one exception was the United Kingdom, where high-technology manufactures rose from 9 percent of total manufacturing output in 1980 to nearly 11 percent by 1989. The major industrialized countries continued to emphasize high-technology manufactures into the 1990s. In 1998, high-technology manufactures were estimated at 16.6 percent of manufacturing output in the United States, 16.0 percent in Japan, 14.9 percent in the United Kingdom, 11.0 percent in France, and 9.0 percent in Germany. Taiwan and South Korea typify how important R&D-intensive industries have become to newly industrialized economies. In 1980, high-technology manufactures accounted for less than 12 percent of Taiwan's total manufacturing output; this proportion jumped to 16.7 percent in 1989 and reached 25.6 percent in 1998. In 1998, high-technology manufacturing in South Korea (15.0 percent) accounted for about the same percentage of total output as in the United Kingdom (14.9 percent) and almost twice the percentage of total manufacturing output as in Germany (9.0 percent).

Share of World Markets

Throughout the 1980s, the United States was the world's leading producer of high-technology products, responsible for more than one-third of total world production from 1980 to 1987 and for about 30 percent from 1988 to 1995. U.S. world market share began to rise in 1996 and continued moving upward during the following two years.
In 1998, the U.S. high-technology industry accounted for 36 percent of world high-technology production, a level last reached in the 1980s. Although the United States struggled to maintain its high-technology market share during the 1980s, Asia's market share followed a path of steady gains. In 1989, Japan accounted for 24 percent of the world's production of high-technology products, up 4 percentage points from its 1980 share. Japan continued to gain market share through 1991. Since then, however, Japan's market share has dropped steadily, falling to 20 percent of world production in 1998 after accounting for nearly 26 percent in 1991. European nations' share of world high-technology production is much lower and has been declining. Germany's share of world high-technology production was about 8 percent in 1980, about 6.4 percent in 1989, and 5.4 percent in 1998. The United Kingdom's high-technology industry produced 6.7 percent of world output in 1980, dropping to about 6.0 percent in 1989 and 5.4 percent in 1998. In 1980, the French high-technology industry
produced 6.1 percent of world output; this dropped to 5.3 percent in 1989 and 3.9 percent in 1998. Italy's shares were the lowest among the four large European economies, ranging from a high of about 2.7 percent of world high-technology production in 1980 to a low of about 1.6 percent in 1998. Developing Asian nations made the most dramatic gains after 1980. South Korea's market share more than doubled during the 1980s, moving from 1.1 percent in 1980 to 2.6 percent in 1989. South Korea's share continued to increase during the early to mid-1990s, peaking at 4.1 percent in 1995. Since 1995, South Korea's market share has dropped each year, falling to 3.1 percent in 1998. Taiwan's high-technology industry also gained world market share during the 1980s and early 1990s before leveling off in the later 1990s. Taiwan's high-technology industry produced just 1.3 percent of the world's output in 1980. This figure rose to 2.4 percent in 1989 and leveled off at 3.3 percent in 1997 and 1998.

Global Competitiveness of Individual Industries

In each of the four industries that make up the high-technology group, the United States maintained strong, if not leading, market positions between 1981 and 1998. Competitive pressure from a growing cadre of high-technology-producing nations contributed to a decline in global market share for two U.S. high-technology industries during the 1980s: computers and office machinery, and communications equipment. Both of these U.S. industries reversed their downward trends and gained market share in the mid- to late 1990s, thanks to increased capital investment by U.S. businesses. For most of the 19-year period examined, Japan was the world's leading supplier of communications equipment, representing about one-third of total world output. Japan's production surpassed that of the United States in 1981 and held the top position for the next 14 years. In 1995, U.S.
manufacturers once again became the leading producers of communications equipment in the world, and they have retained that position ever since. In 1998, the latest year for which data are available, the United States accounted for 34.4 percent of world production of communications equipment, up from 31.5 percent in 1997. Aerospace, the U.S. high-technology industry with the largest world market share, was the only industry to lose market share in both the 1980s and the 1990s. For most of the 1980s, the U.S. aerospace industry supplied more than 60 percent of world demand. By the late 1980s, the U.S. share of the world aerospace market began an erratic decline, falling to 58.9 percent in 1989 and 52.1 percent by 1995. The United States recovered somewhat during the following three years, supplying about 55 percent of the world market from 1996 to 1998. European aerospace industries, particularly the British aerospace industry, made some gains during the period examined. After fluctuating between 8.5 and 10.5 percent during the 1980s, the United Kingdom's industry slowly gained market share

for much of the 1990s. In 1991, the United Kingdom supplied 9.7 percent of world aircraft shipments; by 1998, its share had increased to 13 percent. Of the four U.S. high-technology industries, only the aerospace and pharmaceutical industries managed to retain their number-one rankings throughout the 19-year period; of these two, only the pharmaceutical industry had a larger share of the global market in 1998 than in 1980. The United States is considered a large, open market. These characteristics benefit U.S. high-technology producers in two important ways. First, supplying a market with many domestic consumers provides scale effects to U.S. producers in the form of potentially large rewards for the production of new ideas and innovations. Second, the openness of the U.S. market to competing foreign-made technologies pressures U.S. producers to be more inventive and innovative in order to maintain domestic market share.

Exports by High-Technology Industries

Although U.S. producers benefit from having the world's largest home market as measured by gross domestic product (GDP), mounting trade deficits highlight the need to serve foreign markets as well. U.S. high-technology industries have traditionally been more successful exporters than other U.S. industries and play a key role in returning the United States to a more balanced trade position.

Foreign Markets

Despite its domestic focus, the United States was an important supplier of manufactured products to foreign markets throughout the 1980-98 period. From 1993 to 1998, the United States was the leading exporter of manufactured goods, accounting for about 13 percent of world exports. U.S. high-technology industries contributed to the strong export performance of the nation's manufacturing industries. During the same 19-year period, U.S. high-technology industries accounted for between 19 and 26 percent of world high-technology exports, which was at times twice the level achieved by all U.S. manufacturing industries.
In 1998, the latest year for which data are available, exports by U.S. high-technology industries accounted for 19.8 percent of world high-technology exports; Japan was second with 9.7 percent, followed by Germany with 6.5 percent. The gradual drop in the U.S. share during the 19-year period was in part the result of emerging high-technology industries in newly industrialized economies, especially in Asia. In 1980, high-technology industries in Singapore and Taiwan each accounted for about 2.0 percent of world high-technology exports. The latest data, for 1998, show Singapore's share reaching 6.4 percent and Taiwan's share reaching 5.0 percent.

Industry Comparisons

Throughout the 19-year period, U.S. high-technology industries ranked either first or second in exports in each of the four industries that make up the high-technology group. In 1998, the United States was the export leader in three industries and second in only one, pharmaceuticals. U.S. industries producing aerospace technologies, computers and office machinery, and pharmaceuticals all accounted for smaller shares of world exports in 1998 than in 1980; only the communications equipment industry improved its share during the period. By contrast, Japan's share of world exports of communications equipment dropped steadily after 1985, eventually falling to 12.5 percent by 1998 from a high of 36.0 percent just 13 years earlier. Several smaller Asian nations fared better: for example, in 1998, South Korea supplied 5.9 percent of world communication product exports, up from just 2.4 percent in 1980, and Singapore supplied 10.6 percent of world office and computer exports in 1998, up from 0.6 percent in 1980.

Competition in the Home Market

A country's home market is often considered the natural destination for the goods and services domestic firms produce. Proximity to the customer, as well as common language, customs, and currency, makes marketing at home easier than marketing abroad. With trade barriers falling, however, product origin may be only one factor among many influencing consumer choice. As the number of firms producing goods to world standards rises, price, quality, and product performance often become equally or more important criteria for selecting products. Thus, in the absence of trade barriers, the intensity of competition faced by producers in the domestic market can approach and, in some markets, exceed that faced in foreign markets. U.S.
competitiveness in foreign markets may be the result of two factors: the existence of tremendous domestic demand for the latest technology products, and the pressure of global competition, which spurs innovation.

National Demand for High-Technology Products

Demand for high-technology products in the United States far exceeds that in any other single country; in 1998, the U.S. market (approximately $768 billion) was larger than the combined markets of Japan and the four largest European nations (Germany, the United Kingdom, France, and Italy; about $749 billion). In 1991, Japan was the world's second largest market for high-technology products, although its percentage share of world consumption has generally declined since then. Even though economic problems across much of Asia have curtailed a long period of rapid growth, Asia continues to be a large market for the world's high-technology exports.

National Producers Supplying the Home Market

Throughout the 1980-95 period, the world's largest market for high-technology products, the United States, was served primarily by domestic producers, yet demand
was increasingly met by a growing number of foreign suppliers. In 1998, U.S. producers supplied about 75 percent of the home market for high-technology products; in 1995, their share was much lower, about 67 percent. Other countries, particularly those in Europe, have experienced increased foreign competition in their domestic markets. A more economically unified market has made Europe especially attractive to the rest of the world. Rapidly rising import penetration ratios in Germany, the United Kingdom, France, and Italy during the latter part of the 1980s and throughout much of the 1990s reflect these changing circumstances. These data also highlight greater trade activity in European high-technology markets compared with product markets for less technology-intensive manufactures. The Japanese home market, the second largest market for high-technology products and historically the most self-reliant of the major industrialized countries, also increased its purchases of foreign technologies over the 19-year period, although slowly. In 1998, imports of high-technology manufactures supplied about 12 percent of Japanese domestic consumption, up from about 7 percent in 1980.

Global Business in Knowledge-Intensive Service Industries

For several decades, revenues generated by U.S. service-sector industries have grown faster than those generated by the nation's manufacturing industries. Data collected by the Department of Commerce show that the service sector's share of U.S. GDP grew from 49 percent in 1959 to 64 percent in 1997. Service-sector growth has been fueled largely by knowledge-intensive industries, those incorporating science, engineering, and technology in their services or in the delivery of those services. Five of these knowledge-intensive industries are communications services, financial services, business services (including computer software development), educational services, and health services.
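The import penetration ratios cited above express imports as a share of apparent domestic consumption, that is, domestic production plus imports minus exports. A sketch of the calculation, with purely hypothetical figures:

```python
def import_penetration(production, imports, exports):
    """Imports as a fraction of apparent consumption (production + imports - exports)."""
    apparent_consumption = production + imports - exports
    return imports / apparent_consumption

# Hypothetical national high-technology market, in billions of dollars
ratio = import_penetration(production=500.0, imports=80.0, exports=120.0)
print(f"{ratio:.1%}")  # prints 17.4%
```

Subtracting exports matters: goods produced for foreign markets are not available for home consumption, so ignoring them would understate the share of the home market actually met by imports.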
These industries have been growing faster than the high-technology manufacturing sector discussed earlier. This section presents data tracking overall revenues earned by these industries in 68 countries. Combined sales in 1997 dollars in these five service-sector industries approached $8.4 trillion in 1998, up from $6.8 trillion in 1990 and $4.8 trillion in 1980. The United States was the leading provider of high-technology services, responsible for between 38 and 41 percent of total world service revenues during the entire 19-year period examined. The financial services industry is the largest of the five service industries examined, accounting for 31 percent of revenues in 1998. The U.S. financial services industry is the world's largest, with 52.9 percent of world revenues in 1998. Japan was second at 5.9 percent, followed by Germany at 4.1 percent. Business services, which includes computer and data processing and research and engineering services, is the second largest service sector, accounting for nearly 28 percent
of revenues in 1998. The U.S. business services industry is the largest in the world, with 36.0 percent of industry revenues in 1998. France is second with 17.1 percent, followed by Japan with 12.9 percent and the United Kingdom with 6.1 percent. Unfortunately, data on individual business services by country are not available. Communications services, which includes telecommunications and broadcast services, is the fourth-largest service industry examined, accounting for 12.3 percent of revenues in 1998. In what many consider the most technology-driven of the service industries, the United States has the dominant position. In 1998, U.S. communications firms generated revenues that accounted for 36.8 percent of world revenues, more than twice the share held by Japanese firms and six times that held by British firms. Because in many nations the government is the primary provider of the remaining two knowledge-intensive service industries (health services and educational services), and because the size of a country's population affects the delivery of these services, global comparisons are more difficult and less meaningful than those for other service industries. The United States, with the largest population and least government involvement, has the largest commercial industries in the world in both health services and educational services. Japan is second, followed by Germany. Educational services, the smallest of the five knowledge-intensive service industries, had about one-fourth of the revenues generated by the financial services industry worldwide.

U.S. Trade Balance in Technology Products

Although no single preferred methodology exists for identifying high-technology industries, most calculations rely on a comparison of R&D intensities.
R&D intensity, in turn, is typically determined by comparing industry R&D expenditures or the number of technical people employed (e.g., scientists, engineers, and technicians) with industry value added or the total value of its shipments. Classification systems based on R&D intensity, however, are often distorted by including all products produced by particular high-technology industries, regardless of the level of technology embodied in each product, and by the somewhat subjective process of assigning products to specific industries. In contrast, the classification system discussed here allows for a highly disaggregated, more focused examination of the technology embodied in traded goods. To minimize the impact of subjective classification, the judgments offered by government experts are reviewed by other experts.

Ten Major Technology Areas

The Bureau of the Census has developed a classification system for exports and imports that embody new or leading-edge technologies. This classification system allows trade to be examined in 10 major technology areas:

Biotechnology: the medical and industrial application of advanced genetic research to the creation of drugs, hormones, and other therapeutic items for both agricultural and human uses.

Life science technologies: the application of nonbiological scientific advances to medicine. For example, advances such as nuclear magnetic resonance imaging, echocardiography, and novel chemistry, coupled with new drug manufacturing, have led to new products that help control or eradicate disease.

Opto-electronics: the development of electronics and electronic components that emit or detect light, including optical scanners, optical disk players, solar cells, photosensitive semiconductors, and laser printers.

Information and communications: the development of products that process increasing amounts of information in shorter periods of time, including fax machines, telephone switching apparatus, radar apparatus, communications satellites, central processing units, and peripheral units such as disk drives, control units, modems, and computer software.

Electronics: the development of electronic components (other than opto-electronic components), including integrated circuits, multilayer printed circuit boards, and surface-mounted components, such as capacitors and resistors, that result in improved performance and capacity and, in many cases, reduced size.

Flexible manufacturing: the development of products for industrial automation, including robots, numerically controlled machine tools, and automated guided vehicles, that permit greater flexibility in the manufacturing process and reduce human intervention.

Advanced materials: the development of materials, including semiconductor materials, optical fiber cable, and videodisks, that enhance the application of other advanced technologies.
Aerospace: the development of aircraft technologies, such as most new military and civil airplanes, helicopters, spacecraft (with the exception of communication satellites), turbojet aircraft engines, flight simulators, and automatic pilots.

Weapons: the development of technologies with military applications, including guided missiles, bombs, torpedoes, mines, missile and rocket launchers, and some firearms.

Nuclear technology: the development of nuclear production apparatus, including nuclear reactors and parts, isotopic separation equipment, and fuel cartridges (nuclear medical apparatus is included in life sciences rather than this category).

To be included in a category, a product must contain a significant amount of one of the leading-edge technologies, and the technology must account for a significant portion of the product's value.
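The R&D-intensity test described before this list can be sketched in a few lines of code. The figures and the 5 percent cutoff below are purely illustrative, not official OECD or Census thresholds:

```python
def rd_intensity(rd_expenditure, value_added):
    """R&D intensity: industry R&D spending divided by value added
    (total shipment value is sometimes used in the denominator instead)."""
    return rd_expenditure / value_added

def is_high_tech(rd_expenditure, value_added, threshold=0.05):
    # threshold is an illustrative cutoff, not an official figure
    return rd_intensity(rd_expenditure, value_added) >= threshold

# Hypothetical industry: $6 billion of R&D on $80 billion of value added
print(round(rd_intensity(6.0, 80.0), 3))   # 0.075
print(is_high_tech(6.0, 80.0))             # True
```

Note how the same ratio with shipment value in the denominator would give a lower intensity, which is why classifications must state which base they use.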
Importance of Advanced Technology Product Trade to Overall U.S. Trade

Advanced technology products accounted for an increasing share of all U.S. trade (exports plus imports) in merchandise between 1990 and 1999. Total U.S. trade in merchandise exceeded $1.7 trillion in 1999; of that, $381 billion involved trade in advanced technology products. Trade in advanced technology products accounts for a much larger share of U.S. exports than of imports (29.2 percent versus 17.5 percent in 1999) and makes a positive contribution to the overall balance of trade. After several years in which the surplus generated by trade in advanced technology products declined, exports of U.S. advanced technology products outpaced imports in 1996 and 1997, producing larger surpluses in both years. In 1998 and 1999, the economic slowdown in Asia caused declines in exports and in the surplus generated from U.S. trade in advanced technology products.

Technologies Generating Trade Surpluses

Throughout the 1990s, U.S. exports of advanced technology products exceeded imports in 7 of the 10 technology areas. Trade in aerospace technologies consistently produced the largest surpluses for the United States. Those surpluses narrowed in the mid-1990s as competition from Europe's aerospace industry challenged U.S. companies' preeminence both at home and in foreign markets. Aerospace technologies generated a net inflow of $25 billion in 1990 and nearly $29 billion in 1991 and 1992; trade surpluses then declined 13 percent in 1993, 9 percent in 1994, and 4 percent in 1995. In 1998, U.S. trade in aerospace technologies produced a net inflow of $39 billion, the largest surplus of the decade, and 1999's surplus was only slightly smaller at $37 billion. Trade is more balanced in five other technology areas (biotechnology, flexible manufacturing technologies, advanced materials, weapons, and nuclear technology), with exports having only a slight edge over imports.
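The surpluses and deficits discussed in this section are simple export-minus-import balances computed per technology area. A minimal sketch, using hypothetical export/import splits (chosen so the aerospace balance matches the $37 billion 1999 surplus cited above; the other figures are invented for illustration):

```python
# Hypothetical (exports, imports) in $ billions by technology area
trade_1999 = {
    "aerospace": (50.0, 13.0),
    "electronics": (45.0, 35.6),
    "opto-electronics": (4.0, 6.5),
    "information and communications": (90.0, 120.0),
}

# Balance = exports - imports; positive means surplus, negative means deficit
balances = {area: exp - imp for area, (exp, imp) in trade_1999.items()}
surplus_areas = sorted(a for a, b in balances.items() if b > 0)
deficit_areas = sorted(a for a, b in balances.items() if b < 0)

print(surplus_areas)  # ['aerospace', 'electronics']
print(deficit_areas)  # ['information and communications', 'opto-electronics']
```

The same computation, run over the actual Census data for each of the 10 areas, yields the surplus and deficit groupings described in the text.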
Each of these areas showed trade surpluses of less than $3 billion in 1999. Although U.S. imports of electronics technologies exceeded exports for much of the decade, 1997 saw U.S. exports of electronics exceed imports by $1.1 billion, which jumped to $4.2 billion in 1998 and $9.4 billion in 1999. This turnaround may be attributed in part to Asia's economic problems in 1998 and a stronger U.S. dollar, which may have reduced the number of electronics products imported from Asia in 1998. Imports from Asia recovered to pre-1998 levels in 1999, with the largest jumps in imports coming not from Japan but from South Korea, the Philippines, and Malaysia.

Technologies Generating Trade Deficits

In 1999, trade deficits were recorded in three technology areas: information and communications, opto-electronics, and life science technologies. The trends for each of these technology areas are quite different. Only opto-electronics showed trade deficits in each of the 10 years examined. U.S. trade in life science technologies consistently generated annual trade surpluses until 1998. Life science exports were virtually flat in the last two
years of the decade, while imports jumped 24 percent in 1998 and 21 percent in 1999. Interestingly, in a technology area in which the United States is considered to be at the forefront (information and communications), annual U.S. imports have consistently exceeded exports since 1992. Nearly three-fourths of all U.S. imports in this technology area are produced in Asia.

Top Customers by Technology Area

Japan and Canada are the largest customers for a broad range of U.S. technology products, with each country accounting for about 11 percent of total U.S. technology exports. Japan ranks among the top three customers in 9 of the 10 technology areas, Canada in 7. European countries are also important consumers of U.S. technology products, particularly Germany (life science products, opto-electronics, and advanced materials), the United Kingdom (aerospace, weapons, and computer software), and the Netherlands (life science products and weapons). Although Europe, Japan, and Canada have long been important consumers of U.S. technology products, several newly industrialized and emerging Asian economies now also rank among the largest customers. South Korea is a leading consumer in three technology areas (electronics, flexible manufacturing, and nuclear technologies) and Taiwan in two (flexible manufacturing and nuclear technologies).

Top Suppliers by Technology Area

The United States is not only an important exporter of technologies to the world but also a consumer of imported technologies. The leading economies in Asia and Europe are important suppliers to the U.S. market in each of the 10 technology areas. Japan is a major supplier in six advanced technology categories; Canada, France, Germany, Taiwan, and the United Kingdom in three. Smaller European countries are also major suppliers of technology to the United States, although they tend to specialize.
Belgium was the top foreign supplier of biotechnology products to the United States in 1999, the source of 25.5 percent of imports in this category. Switzerland also was among the top three suppliers of biotechnology products, with 11.3 percent. Many technology products come from developing Asian economies, especially Malaysia, South Korea, and Singapore. Imports from these Asian economies and from other regions into one of the world's most demanding markets indicate that technological capabilities are expanding globally.

U.S. Royalties and Fees Generated From Intellectual Property

The United States has traditionally maintained a large trade surplus in intellectual property. Firms trade intellectual property when they license or franchise proprietary technologies, trademarks, and entertainment products to entities in other countries. These transactions generate net revenues in the form of royalties and licensing fees.
U.S. Royalties and Fees From All Transactions

Total U.S. receipts from all trade in intellectual property more than doubled between 1990 and 1999, reaching nearly $36.5 billion in 1999. During the 1987-96 period, U.S. receipts for transactions involving intellectual property were generally four to five times larger than U.S. payments to foreign firms. The gap narrowed in 1997 as U.S. payments increased by 20 percent over the previous year and U.S. receipts rose less than 3 percent. Despite the much larger increase in payments, annual receipts from total U.S. trade in intellectual property in 1997 were still more than 3.5 times greater than payments. This trend continued during the following two years, and by 1999, the ratio of receipts to payments had dropped to about 2.7:1. U.S. trade in intellectual property produced a surplus of $23.2 billion in 1999, down slightly from the nearly $24.5 billion surplus recorded a year earlier. About 75 percent of the transactions involved exchanges of intellectual property between U.S. firms and their foreign affiliates. Exchanges of intellectual property among affiliates have grown at about the same pace as those among unaffiliated firms, except during the late 1990s, when the growth in U.S. firm payments to affiliates exceeded receipts. These trends suggest both a growing internationalization of U.S. business and a growing reliance on intellectual property developed overseas.

U.S. Royalties and Fees From Trade in Technical Knowledge

Data on royalties and fees generated by trade in intellectual property can be further disaggregated to reveal U.S. trade in technical know-how. The data describe transactions between unaffiliated firms where prices are set through a market-based negotiation. Therefore, they may better reflect the exchange of technical know-how and its market value at a given time than do data on exchanges among affiliated firms.
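The receipts-to-payments ratio quoted above follows directly from the annual totals: receipts of roughly $36.5 billion against a $23.2 billion surplus imply payments of about $13.3 billion, giving a ratio near 2.7:1. A short sketch of that arithmetic:

```python
def ip_trade_ratio(receipts, surplus):
    """Derive payments and the receipts-to-payments ratio for
    intellectual-property trade from receipts and the trade surplus."""
    payments = receipts - surplus
    return payments, receipts / payments

# 1999 figures from the text, in $ billions
payments, ratio = ip_trade_ratio(receipts=36.5, surplus=23.2)
print(round(payments, 1))  # 13.3
print(round(ratio, 1))     # 2.7
```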
When receipts (sales of technical know-how) consistently exceed payments (purchases), these data may indicate a comparative advantage in the creation of industrial technology. The record of resulting receipts and payments also provides an indicator of the production and diffusion of technical knowledge. The United States is a net exporter of technology sold as intellectual property, although the gap between imports and exports narrowed during the late 1990s. During the first half of the 1990s, royalties and fees received from foreign firms averaged three times the amount U.S. firms paid foreigners to access their technology. Between 1996 and 1998, receipts plateaued at about $3.5 billion. In 1999, receipts totaled nearly $3.6 billion, little changed from the year before but still more than double that reported for 1987. Japan is the world's largest consumer of U.S. technology sold as intellectual property, although its share declined significantly during the 1990s. In 1999, Japan accounted for about 30 percent of all such receipts. At its peak in 1993, Japan's share was 51 percent.


Another Asian country, South Korea, is the second largest consumer of U.S. technology sold as intellectual property, accounting for nearly 14 percent of U.S. receipts in 1999. South Korea has been a major consumer of U.S. technological know-how since 1988, when it accounted for 5.5 percent of U.S. receipts. South Korea's share rose to 10.7 percent in 1990 and reached its highest level, 17.3 percent, in 1995. The U.S. trade surplus in intellectual property is driven largely by trade with Asia, but that surplus has narrowed recently. In 1995, U.S. receipts (exports) from technology licensing transactions were nearly seven times the U.S. firm payments (imports) to Asia. That ratio closed to just more than 4:1 by 1997, and the most recent data show U.S. receipts from technology licensing transactions at about 2.5 times the U.S. firm payments to Asia. As previously noted, Japan and South Korea were the biggest customers for U.S. technology sold as intellectual property; together, these countries accounted for more than 44 percent of total receipts in 1999. Until 1994, U.S. trade with Europe in intellectual property, unlike trade with Asia, fluctuated between surplus and deficit. In 1994, a sharp decline in U.S. purchases of European technical know-how led to a considerably larger surplus for the United States compared with earlier years. The following year showed another large surplus resulting from a jump in receipts from the larger European countries. In 1999, receipts from European Union (EU) countries represented about 35 percent of U.S. technology sold as intellectual property, more than double the share in 1993. Some of this increase is attributable to increased licensing by firms in Germany, the third largest consumer of U.S. technological know-how. In 1999, Germany's share rose to 9.3 percent, up from 6.9 percent in 1998 and more than double its share in 1993.
These latest data show receipts from France and Sweden rising sharply during the late 1990s, causing a considerably larger surplus from U.S. trade with Europe in intellectual property in 1998 and 1999. U.S. firms have purchased technical know-how from different foreign sources over the years, with increasing amounts coming from Japan, which since 1992 has been the single largest foreign supplier of technical know-how to U.S. firms. About one-third of U.S. payments in 1999 for technology sold as intellectual property were made to Japanese firms. Europe accounts for slightly more than 44 percent of the foreign technical know-how purchased by U.S. firms; the United Kingdom and Germany are the principal European suppliers.

Footnotes

In designating these high-technology industries, OECD took into account both direct and indirect R&D intensities for 10 countries: the United States, Japan, Germany, France, the United Kingdom, Canada, Italy, the Netherlands, Denmark, and Australia. Direct intensities were calculated by the ratio of R&D expenditure to output (production) in 22 industrial sectors. Each sector was given a weight according to its share in the total output
of the 10 countries, using purchasing power parities as exchange rates. Indirect intensity calculations were made using technical coefficients of industries on the basis of input-output matrices. OECD then assumed that, for a given type of input and for all groups of products, the proportions of R&D expenditure embodied in value added remained constant. The input-output coefficients were then multiplied by the direct R&D intensities.

5.5 SCIENCE AND TECHNOLOGY IN INDIA

A New Frontier

The tradition of science and technology (S&T) in India is over 5,000 years old. A renaissance was witnessed in the first half of the 20th century. The S&T infrastructure has grown from about Rs. 10 million at the time of independence in 1947 to Rs. 30 billion. Significant achievements have been made in the areas of nuclear and space science, electronics and defence. The government is committed to making S&T an integral part of the socio-economic development of the country. India has the third largest scientific and technical manpower in the world; 162 universities award 4,000 doctorates and 35,000 postgraduate degrees, and the Council of Scientific and Industrial Research runs 40 research laboratories that have made some significant achievements. In the field of missile launch technology, India is among the top five nations of the world. Science and technology is used as an effective instrument for growth and change. It is being brought into the mainstream of economic planning in the sectors of agriculture, industry and services. The country's resources are used to derive the maximum output for the benefit of society and improvement in the quality of life. About 85 per cent of the funds for S&T come directly or indirectly from the Government. The S&T infrastructure in the country accounts for more than one per cent of the GNP. S&T in India is entering a new frontier.
Atomic Energy

The prime objective of India's nuclear energy programme is the development and use of nuclear energy for peaceful purposes such as power generation, applications in agriculture, medicine, industry, research and other areas. India is today recognised as one of the most advanced countries in nuclear technology, including production of source materials. The country is self-reliant and has mastered the expertise covering the complete nuclear cycle from exploration and mining to power generation and waste management. Accelerators and research and power reactors are now designed and built indigenously. The sophisticated variable energy cyclotron at Kolkata and a medium-energy heavy ion accelerator (pelletron) set up recently at Mumbai are national research facilities in the frontier areas of science.
As part of its programme of peaceful uses of atomic energy, India has also embarked on a programme of nuclear power generation. Currently eight nuclear stations are producing eight billion kilowatt-hours of electricity. Four more nuclear power stations are planned. The new nuclear reactors are designed in India. The peaceful nuclear programme also includes producing radioisotopes for use in agriculture, medicine, industry and research.

Space

The Indian Space Research Organisation (ISRO), under the Department of Space (DOS), is responsible for research, development and operationalisation of space systems in the areas of satellite communications, remote sensing for resource survey, environmental monitoring, meteorological services, etc. DOS is also the nodal agency for the Physical Research Laboratory, which conducts research in the areas of space science, and for the National Remote Sensing Agency, which deploys modern remote-sensing techniques for natural resource surveys and provides operational services to user agencies. India is the only Third World country to have developed its own remote-sensing satellite.

Electronics

The Department of Electronics plays a promotional role in the development and use of electronics for socio-economic development. Many initiatives have been taken for a balanced growth of the electronics industry. The basic thrust has been towards a general rationalisation of the licensing policy, with an emphasis on promotion rather than regulation, besides achieving economy of scale with up-to-date technology. A multi-pronged approach has been evolved for result-oriented R&D, with special emphasis on microelectronics, telematics, and high-performance computing and software development. Application of electronics in areas such as agriculture, health and the service sectors has also been receiving special attention. For upgrading the quality of indigenously manufactured products, a series of test and development centres and regional laboratories have been set up.
These centres for electronic design and technology help small and medium electronics units. A number of R&D projects have been initiated to meet the growing requirements of the industry.

Oceanography

India has a coastline of more than 7,600 km and 1,250 islands, with its Exclusive Economic Zone covering over 2 million sq. km and its continental shelf extending up to 350 nautical miles. The Department of Ocean Development was established in 1981 to ensure optimum utilisation of living resources, exploitation of non-living resources such as hydrocarbons and minerals, and to harness ocean energy. Two research vessels, ORV Sagar Kanya and FROV Sagar Sampada, are assessing and evaluating the resource potential.


Survey and exploration efforts have been directed to assess sea bed topography, and the concentration and quality of mineral nodules. In August 1987, India was allotted a mine site of 150,000 sq. km in the central Indian Ocean for further exploration and development of resources. India is the only developing country to have qualified for Pioneer Status under the UN Conference on the Law of the Sea in 1982, and it is the first country in the world to have secured registration of a mine site. India has sent 13 scientific research expeditions to Antarctica since 1981, and has established a permanently manned base, Dakshin Gangotri. A second permanent station, an entirely indigenous effort, was completed by the eighth expedition. The objective is to study the ozone layer and other important constituents, optical aurora, geomagnetic pulsation and related phenomena. By virtue of its scientific research activities, India acquired Consultative Membership of the Antarctic Treaty in 1983 and acceded to the Convention on the Conservation of Antarctic Marine Living Resources in July 1985. India is also a member of the Scientific Committee on Antarctic Research, and has played a significant role in adopting a Minerals Regime for Antarctica in June 1988. A National Institute of Ocean Technology was set up for the development of ocean-related technologies. It is also responsible for harnessing resources of the coastal belts and islands.

Biotechnology

India has been the forerunner among the developing countries in promoting multidisciplinary activities in this area, recognising the practically unlimited possibilities of their application in increasing agricultural and industrial production, and in improving human and animal life. The nucleus of research in this area is the National Biotechnology Board, constituted in 1982. A Department of Biotechnology was created in 1986. Recently, the Biotechnology Consortium India Ltd. was set up.
It will play the role of a catalyst in bridging the gap between research and development and industrial and financial institutions. Some of the new initiatives taken include developing techniques for gene mapping, conservation of biodiversity and bioindicators research, special biotechnology programmes for the benefit of the scheduled castes and scheduled tribes, and activities in the area of plantation crops. The areas which have been receiving attention are cattle herd improvement through embryo transfer technology, in vitro propagation of disease-resistant plant varieties for obtaining higher yields, and development of vaccines for various diseases.

Council of Scientific and Industrial Research (CSIR)

CSIR was established in 1942, and is today the premier institution for scientific and industrial research. It has a network of 40 laboratories, two cooperative industrial research
institutions and more than 100 extension and field centres. The council's research programmes are directed towards effective utilisation of the country's natural resources and the development of new processes and products for economic progress. It is now playing a leading role in the fulfilment of the technology missions evolved by the Government.

5.6 TECHNOLOGY TRANSFER

Technology transfer is the process of sharing skills, knowledge, technologies, methods of manufacturing, samples of manufacturing and facilities among industries, universities, governments and other institutions, to ensure that scientific and technological developments are accessible to a wider range of users who can then further develop and exploit the technology into new products, processes, applications, materials or services. While conceptually the practice has been utilized for many years (in ancient times, Archimedes was notable for applying science to practical problems), the present-day volume of research, combined with high-profile failures at Xerox PARC and elsewhere, has led to a focus on the process itself.

Transfer Process

Many companies, universities and governmental organizations now have an Office of Technology Transfer (also known as Tech Transfer or TechXfer) dedicated to identifying research which has potential commercial interest and strategies for how to exploit it. For instance, a research result may be of scientific and commercial interest, but patents are normally only issued for practical processes, and so someone, not necessarily the researchers, must come up with a specific practical process. Another consideration is commercial value; for example, while there are many ways to accomplish nuclear fusion, the ones of commercial value are those that generate more energy than they require to operate. The process to commercially exploit research varies widely.
It can involve licensing agreements or setting up joint ventures and partnerships to share both the risks and rewards of bringing new technologies to market. Other corporate vehicles, e.g. spin-outs, are used where the host organization does not have the necessary will, resources or skills to develop a new technology. Often these approaches are associated with the raising of venture capital (VC) as a means of funding the development process, a practice more common in the US than in the EU, which has a more conservative approach to VC funding. In recent years, there has been a marked increase in technology transfer intermediaries specialized in their field. They work on behalf of research institutions, governments and even large multinationals. Where start-ups and spin-outs are the clients, commercial fees are sometimes waived in lieu of an equity stake in the business. As a result of the potential complexity of the technology transfer process, technology transfer organizations are often multidisciplinary, including economists, engineers, lawyers, marketers and scientists. The
dynamics of the technology transfer process has attracted attention in its own right, and there are several dedicated societies and journals.

Technology Assessment

Technology assessment (TA, German Technikfolgenabschätzung) is the study and evaluation of new technologies. It is based on the conviction that new developments within, and discoveries by, the scientific community are relevant for the world at large rather than just for the scientific experts themselves, and that technological progress can never be free of ethical implications. Also, technology assessment recognizes the fact that scientists normally are not trained ethicists themselves and accordingly ought to be very careful when passing ethical judgement on their own, or their colleagues', new findings, projects, or work in progress. Technology assessment assumes a global perspective and is future-oriented rather than backward-looking or anti-technological. (Scientific research and science-based technological innovation is an indispensable prerequisite of modern life and civilization. There is no alternative. For six or eight billion people there is no way back to a less sophisticated lifestyle.) TA considers its task to be an interdisciplinary approach to solving already existing problems and preventing potential damage caused by the uncritical application and commercialization of new technologies. Therefore any results of technology assessment studies must be published, and particular consideration must be given to communication with political decision-makers. The United States Department of Defense (DOD) assesses technology maturity using a measure called Technology Readiness Level. The ETC Group has proposed an international treaty for technology assessment entitled ICENT (International Convention for the Evaluation of New Technologies). Some of the major fields of TA are:

Information technology
Nuclear technology
Molecular nanotechnology
Pharmacology
Organ transplants
Gene technology
Health technology assessment (HTA)


Technology Readiness Level

Uses of Technology Readiness Levels

The primary purpose of Technology Readiness Levels is to help management make decisions concerning the development and transitioning of technology.

Advantages include:
1. Provides a common understanding of technology status
2. Supports risk management
3. Supports decisions concerning technology funding
4. Supports decisions concerning the transition of technology

Disadvantages include:
1. More reporting, paperwork and reviews
2. Relatively new, so it takes time to influence the system
3. Systems engineering is not addressed in the early TRLs

5.7 COLLABORATIVE INTELLIGENCE

Collaborative intelligence is a measure of the collaborative ability of a group or entity. According to Stephen James Joyce, author of Teaching An Anthill To Fetch: Developing Collaborative Intelligence @ Work, collaborative intelligence (CQ) is the ability to create, contribute to and harness the power within networks of people and relationships. Knowledge derived from collaborative efforts is increasing in proportion to the reach of the world wide web and of collaborative groupware such as Skype, NetMeeting, WebEx, iPeerAdvisory and many others.

IQ is the term commonly used to measure the intelligence quotient of a person, and EQ (emotional intelligence) describes how a person handles emotions in a given situation. CQ, or collaborative intelligence, measures the collaborative ability of a group. CQ is a fairly new term, arising from the growing visibility of collaborative efforts by companies and other entities. It describes a situation in which the knowledge and problem-solving capability of a group is much greater than the knowledge possessed by any individual group member.

As groups work together they develop a shared memory, which is accessible through the collaborative artifacts created by the group, including meeting minutes, transcripts from threaded discussions, and drawings. This shared memory (group memory) is also accessible through the memories of group members.

Distributed collaborative intelligence is the act of a group collaborating within a virtual sphere of interaction. Group members can interact in real time or asynchronously even though they are not located within the same physical space. Technologies used to enhance distributed collaborative intelligence and to facilitate group problem solving include:

Messaging
1. Synchronous conferencing technologies such as instant messaging, online chat and shared whiteboards.
2. Asynchronous messaging such as electronic mail, threaded moderated discussion forums and web logs.

Stigmergy
1. Wikis
2. Social evolutionary computation

The ability of a group to solve a problem collectively is potentially directly proportional to the number of members in the group; however, an effective architecture of interaction is needed to achieve this. Critical success factors for a high collaborative intelligence quotient are:
1. Group moderation and facilitation
2. Adherence to a small set of fundamental rules relating to member interaction
3. No limits to thinking; the promotion of creative thinking
4. Strong group membership feedback
5. Quality control. Ideas need to be nurtured, but solutions should be upheld only after a critical peer review.
6. The construction of a deeply documented group memory or knowledge base.

Summary

This unit has given insights into BPR, TQM, transferred technology, collaborative innovation, and technology in developed and developing countries.

Questions
1. Explain when BPR is to be done and what steps are to be followed.
2. Does quality play a role in technology upgradation? Explain.
3. How is technology used in developed countries?
4. How can technology speed up growth in developing countries?
5. Elaborate on collaborative intelligence.
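As a closing illustration of the collaborative intelligence idea from section 5.7 — that a group's pooled knowledge exceeds what any single member holds — here is a toy model in which each member's knowledge is a set of items and the group memory is their union. All names and data are purely illustrative, and this is not a real CQ metric:

```python
# Toy model of shared group memory: each member's knowledge is a set of
# items; pooling the sets shows the group "knows" more than any member.
from typing import Dict, Set

def group_memory(members: Dict[str, Set[str]]) -> Set[str]:
    """Pool every member's knowledge into a shared group memory."""
    shared: Set[str] = set()
    for knowledge in members.values():
        shared |= knowledge
    return shared

def collaboration_gain(members: Dict[str, Set[str]]) -> float:
    """Ratio of group knowledge to the best single member's knowledge."""
    shared = group_memory(members)
    best_individual = max(len(knowledge) for knowledge in members.values())
    return len(shared) / best_individual

# Hypothetical team whose members contribute partly overlapping expertise.
team = {
    "analyst":  {"requirements", "costing"},
    "engineer": {"design", "testing", "costing"},
    "manager":  {"scheduling", "requirements"},
}
print(sorted(group_memory(team)))  # pooled shared memory
print(collaboration_gain(team))    # > 1: the group knows more than anyone alone
```

The gain here is greater than one precisely because members' knowledge only partly overlaps; with identical members it would be exactly one, which echoes the text's point that an effective architecture of interaction, not headcount alone, drives the collaborative advantage.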
