
INNOTECH JOURNAL (Vol. XI No. 2 / July-December 1987)
Trends in Computer Education

FOREWORD

THIS ISSUE of the INNOTECH Journal incorporates some innovations, like a new type style,
layout and general makeup. It contains six articles on the theme of Computer Education. Each
explains the many uses of the computer in educational planning, administration and strategy.

"The Use of Microcomputers in Educational Administration" introduces educational administrators
to uses of microcomputers in the planning and management of educational systems and
institutions, including problems and limitations of their use. The article discusses some common
misconceptions about microcomputers and explains the configuration and uses of this
equipment.

"Achieving Curriculum-Integrated Computing" discusses the degree of integration of computer-


ization into the academic curriculum- in both quantitative and qualitative dimensions. It pre-
sents the uses of instructional documentation (as an aid to full implementation of a program)
and the diffusion model (or how innovations spread throughout a potential user population) as
a means of incorporating computing into coursework.

"Computer Technology and its Application to Teacher Training Programs" traces the develop-
ment, issues and applications of computer technology as it relates to teacher education. It
suggests a computer curriculum that involves such topics as 'hands-on' work with computers
(to cover basic operations and familiarity with a variety of software) to instruction on various
levels of computer application in instruction. It also covers the flexibility of computers for
teaching graphics, business, music, vocational, English and social science subjects.

"The Information Age: An Opportunity to Restructure Curriculum and Instruction" highlights
the use of educational software and hardware in restructuring curriculum and instruction.

"Training Teacher Trainers How to Make Use of the Computer" discusses the infusion of the
computer into education, especially as addressed to primary and secondary teachers and
administrators. This includes the organization and implementation of a workshop for teacher trainers.

"Computer Applications to Higher Education Institutions" traces the analogy between the
successive stages of human satisfaction in psychology and trends in meaning and expectations
associated with "Computer Literacy." It dwells on the impact of computer-from a proper
perspective of the needs and plans of action through Systems Approach to utilize resources to
best possible effect.-OSP

Innotech Journal 1
THE USE OF MICROCOMPUTERS IN EDUCATIONAL ADMINISTRATION

By PAUL HURST

Introduction

THIS MANUAL is intended to introduce educational administrators in developing countries to
the possible uses of microcomputers in the planning and management of educational systems
and institutions, and to the problems and limitations of their use. By "administrator" I mean
principals of schools, district, local or regional education officers, and planners of national
systems. Obviously the tasks of these various types of administrator vary considerably, and
the potential uses of microcomputers will vary accordingly. The manual is organized into
sections dealing with different applications rather than different types of administrators and
their needs.

Section 1 deals with a number of misconceptions about microcomputers and their use in educa-
tion in developing countries. Section 2 explains the principal features of microcomputers and
peripheral devices. The emphasis throughout is on the use of available programs, and no
programming skill is assumed or necessary.

1. Some Common Misconceptions

Misconception No. 1: Microtechnology is inappropriate

Some people believe that microcomputers are a form of technology that is inappropriate to
developing countries because it is too sophisticated. Of course, there have been instances in
the past of technology from industrialized countries being imported into low-income countries
and having a devastating effect because of its unsuitability for such an environment.

Microcomputers are not in themselves unsuitable for use in a developing country, even a very
poor one. First and foremost they are quite easy to use, provided that the operator has the
appropriate software available and a good quality manual which explains how to use it. Most
applications software can be understood from the manual, and the user can usually learn how
to use it with practice and without attending a course. Almost all of the software these days is
menu-driven, which means that a menu of choices or options is always displayed on the TV
screen (display or monitor). The user does not have to memorize the commands which instruct
the computer, although as one becomes familiar with the command syntax it is usually possible
to set the menu level down or cause it to disappear altogether, thus converting the program to
a command-driven mode.
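The menu-driven style described above can be sketched in modern terms. The function names and option labels below are purely illustrative, not taken from any particular package of the period:

```python
# A minimal sketch of a menu-driven program: the options are always
# displayed, so the user never has to memorize commands.

def show_menu():
    print("1) Edit document")
    print("2) Print document")
    print("3) Quit")

def run(inputs):
    """Drive the menu from a list of choices; return the actions performed."""
    actions = []
    for choice in inputs:
        show_menu()                  # the menu is redisplayed every cycle
        if choice == "1":
            actions.append("edit")
        elif choice == "2":
            actions.append("print")
        elif choice == "3":
            actions.append("quit")
            break
        else:
            print("Unknown option")  # bad input simply re-prompts

    return actions

print(run(["1", "2", "3"]))  # → ['edit', 'print', 'quit']
```

A command-driven mode, by contrast, would accept typed commands directly without redisplaying the option list.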

Moreover, programs invariably contain a help menu - that is to say, a means of summoning up
pages of advisory text from within the program which explains how to use it. Further, the more
advanced kinds of program are generally accompanied by tutorial disks, which contain
sequenced exercises to train the novice user in getting the best out of the program.

It is not necessary to learn programming languages and skills to operate a microcomputer,
provided you have bought a good program (which a programmer has written for you). The
command syntax which is used to instruct the computer via the program is usually written in
everyday language with obvious meanings like COPY, QUIT, DO, ERASE and the like. You
do need access to a service engineer since computers, although fairly reliable, are not always
trouble-free. And you do need a reliable power supply. Otherwise anyone anywhere in the
world can use a microcomputer, provided the machine is kept within its operating temperature
range (e.g. not in strong direct sunlight in Mauritania or outdoors in the Arctic Circle), within
its humidity range, and away from dust.

Misconception No. 2: Microcomputers Cause Unemployment

Strictly speaking there is nothing a microcomputer can do that a human being cannot do, but the
main reason for using them is because they are capable of performing routine operations a
great deal more quickly than humans. And they don't get bored. There are many tasks that
computers cannot do, particularly those which involve human relations and creativity. Essentially,
therefore, we should use micros only when it is cost-effective to do so, and when we are
thereby releasing the talents of humans to be devoted to more interesting and more challenging
work than routine tasks (assuming the machine can do them, and do them more cheaply).

The introduction of micros into educational administration on a considerable scale might con-
ceivably lead to some loss of jobs among low-grade clerical workers who are being employed
simply to carry on routine record-keeping chores. In practice, however, it seems to be nearly
always the case that micros are used to develop and extend the range of tasks that can be
undertaken by a group of people, since some of them will be to some extent liberated from
routine file-keeping or statistical calculations. It needs to be remembered that in some cases
the use of micros can actually entail more work because the machines must have data entered
into them and be operated by someone thereafter. Micros are interactive, and require an
operator to instruct them. This is unlike large mainframe computers, which are usually run in a
non-interactive batch mode; that is to say, the mainframe computer is given its instructions in a
batch which it then completes unattended.

In view of the foregoing, the use of micros is unlikely to lead to savings. First there is the ex-
pense of acquiring and maintaining them, and second there is unlikely to be much reduction in
personnel. If clerks are redeployed to more interesting work, this is making the system more
efficient (cost-effective) by extending its range at a tolerable cost rather than by making
savings. Reduction of existing costs through the use of micros is not the usual experience.

Misconception No. 3: Microcomputers are too expensive for the Third World

In recent years, the cost of computing has declined dramatically. The introduction of miniaturized
integrated circuits (called chips) not only makes it possible to put a great deal of computing
power and memory into a machine the size of a small suitcase or a sewing machine, where
previously a suite of rooms full of equipment was needed; it also means that industrialized
manufacturing and the economies of scale achieved through large-volume sales of these
machines have brought prices down drastically.

A machine suitable for use in a large school or a district education office or a ministry planning
unit can now be had for the equivalent of around US$1000, and the cheapest office machines
appearing on the market can be bought for about $600, including a printer. Prices may well
decline further. It is now more expensive to buy a fast letter-quality printer than the computer
to which it is attached. But even at these prices it may still be unfeasible or uneconomic for
low-income countries to deploy micros widely in education. Even putting one machine in every
secondary school might be beyond the resources of some countries, or an unsound investment
in the light of other priorities. However, it might make economic sense to put a micro into very
large secondary schools (at least to begin with) and one in every regional education office, for
example. In any event micros ought to be introduced into a national system in a gradual and
experimental way. It would be quite easy to place a bulk order and find that the wrong ma-
chine had been purchased in relation to requirements.

Another important consideration is the set of human problems involved in the acceptance and
implementation of the use of microcomputers; an experimental and phased sequence of
introduction should pay substantial attention to dealing with these problems.

Misconception No. 4: Micros in education means teaching machines

Many people associate microcomputers in education with computer-assisted learning (CAL).
However, there are some who consider that, despite the fashionable wave of enthusiasm for
these gadgets in the classroom, the role of microcomputers as a teaching tool is very limited. A
microcomputer cannot be anything more than a sophisticated programmed learning machine,
and experience often indicates that the kind of learning that takes place when a student interacts
with a machine is very limited. "The computer cannot be a tutor," as Neville Coghill put it.
"The teacher who can be replaced by a machine, should be" is another aphorism with the
same import. It is outside the scope of this manual to go into this controversy in detail, but the
automated classroom remains as much a myth in industrialized countries as it did 17 years ago
when Anthony G. Oettinger published his book Run, Computer, Run: The Mythology of Educational
Technology. Paradoxically, it is in administrative applications that the clearest advantages of the
micro lie. Most of the principal developments of the microcomputer and micro software that
have taken place in the industrialized countries, were intended for commercial users, and in
fact the applications software discussed in this manual has been developed almost entirely for
use in commercial environments. As we shall see, the effect of this is that much of this software
is readily adaptable to administrative purposes in an educational environment; some of it,
however, is unsuitable.

2. Microcomputers Explained
(This section is based on an original draft by David Turner)

Nobody who simply wants to use a computer really wants to be involved in understanding what
the computer does, or how it does it. After all, using a computer is primarily intended to make
our lives easier. We do not achieve this by getting involved with the inner workings of the
computer ourselves, nor by getting involved with the jargon that computer fanatics use. We
really want to know whether it will do the job, not whether it has ROM, RAM, EPROM, a
hard disk, floppy disks, 16 bit processing or whatever.

However, at least once, when we buy a machine, we need to understand the manufacturer's
jargon, so that we know that we are getting what we want. If we do not get it right, we will
start to come up against the limitations of the machine we have bought, and will become
uncomfortably aware of what the jargon means in practice.

What we set out to do in this section is to explain what a microcomputer is, what it does, and
how it does it. The main reason for this is not because it will make you any better as a com-
puter operator, or programmer, much less as a service engineer. It is simply that the way the
computer does its job places limitations on what it can do. At the most basic level, the com-
puter is made up of electronic circuits which are either on or off. The computer cannot handle
ambiguity at that level. The computer's answer to any question is either "Yes", or "No". Some
years ago, when there was perhaps more euphoria about the potential of computers than there
is today, computerization was seen as a sensible way of solving most problems. Among these
was the perennial educational administration task of writing a school timetable. Given full
information, a computer would be ideally suitable for solving such a problem. But the majority
of decisions in timetabling are vague in form; "Mrs. Jones teaches biology normally, but if we
put some pressure on her she might teach some chemistry. We could sweeten the pill by giving
her some advanced biology work". The computer is inherently unable to cope with such
ambiguity, which is involved in the ideas of persuading, using pressure, sweetening the pill and so
on. Programming for ambiguity is immensely difficult, and despite large investment it is proving
harder than originally expected to program computers for speech recognition, speech simulation,
translation, timetabling and so on.

This is the kind of description that we want to present in this section: what is a computer? what
does it do? how does it do it? and, most importantly of all, what does that mean in terms of
the kinds of tasks it can do well and the kinds of tasks it can hardly do at all?

Trying to answer any of these questions is rather like trying to hit a fast-moving target. The
technology, and with it the capability, of computers is changing quickly. In addition to that,
computers which use last year's obsolete technology are doing sterling work, and are still on
sale alongside the latest models. What is more, last year's technology can often be bought
relatively cheaply. The only thing that is really certain is that the detailed information in this
section is out of date before it is written. We will describe 8 bit technology, even though 16 bit
and 32 bit technology is on the market and we will devote a section to what 16 and 32 bit
operation means in terms of increased capability.

An explanation of "8 bit" and "16 bit" is in order. A bit, or binary digit, is the way a computer
stores numbers. It is represented as a voltage which is either switched on or off, and can
therefore only have one of two values, zero or one. Computers count in exactly the same way
as we do, except that any number which is not made up entirely of zeros and ones is
omitted. So they count: 0 (zero), 1 (one), 10 (two), 11 (three), 100 (four), 101 (five), 110
(six), 111 (seven), 1000 (eight) and so on. Each zero or one is a bit. The biggest number an 8
bit machine can handle is 11111111, or two hundred and fifty five. Obviously, 16 bit machines
can handle much bigger numbers, the largest being around sixty five thousand, and 32 bit
machines can handle numbers so big it hardly bears thinking about.
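The counting sequence and the word-size limits just described are easy to verify:

```python
# Print the binary counting sequence described above: 0, 1, 10, 11, 100, ...
for n in range(9):
    print(n, format(n, "b"))

# The largest value each word size can hold is 2**bits - 1.
print((1 << 8) - 1)    # 8 bit maximum: 255 (binary 11111111)
print((1 << 16) - 1)   # 16 bit maximum: 65535 ("around sixty five thousand")
print((1 << 32) - 1)   # 32 bit maximum: 4294967295
```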

First of all, then, what is a computer? We are dealing here with microcomputers which might
handle routine administration tasks in educational establishments. The prefix micro used to
contrast these machines with minicomputers and mainframe computers. The distinctions between
these types are largely differences in speed of operation and size of memory, but the areas in
between the types are becoming blurred by the introduction of new machines which are ever
more capable. The thing that really distinguishes mainframe computers from microcomputers is
a facility known as timesharing, which involves the computer starting several jobs at once, and
doing routine jobs on one while it is doing the main processing on another. This distinction
should become clearer as the operation of computers is explained, but even this clear-cut
difference between microcomputers and mainframes is disappearing with networking and
multitasking. Another difference is that the mainframe processes in batches whereas the
microcomputer is interactive with the operator. As a minimum, the microcomputer will have a
central processing unit (CPU), a keyboard and display (a cathode ray tube like a TV), and a
range of different ways of storing information in its memory. The memory will include
random access memory (RAM), read only memory (ROM) and floppy disks, and may
include a hard disk. Random access memory is volatile (the contents disappear when the power
is switched off); ROM and disks are more permanent records.

Central Processor Functions

The CPU is at the heart of the computer. The main component is a processor chip, usually the
8080 or Z80 on older machines and the 8086 or 8088 on more recent machines. It is this that
performs all the magic of the computer, except that what it can do is not very magical. It can
remember about 20 numbers between zero and 255, it can add two of those numbers
together, it can multiply one of those numbers by two, it can move one of those numbers out
to another place in its memory, and it can take a number from somewhere else in its memory.
In short, there are about thirty instructions it can respond to, which involve adding numbers up,
multiplying by two, or moving numbers in and out of memory. The whole problem of getting
the computer to do what we want is to persuade it to add up numbers and move them around
in the right order. To achieve this, we have to give it instructions, or a program.

A very important part of the computer is the random access memory. This is from 64 to 256
thousand electronic pigeon holes where it can store numbers. These numbers can quickly and
easily be moved into and out of the CPU as required. Some of those numbers will be the
numbers the CPU is working with, and others will be lists of instructions. Since the CPU can
only respond to about 30 instructions, these instructions can be stored as numbers between
one and thirty. If we send it the number twelve, it will know which numbers it is supposed to
add up or move.
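The idea that instructions are simply small numbers stored in memory can be illustrated with a toy interpreter. The opcodes below are invented for illustration and do not belong to any real chip's instruction set:

```python
# A toy CPU: one accumulator, a program counter, and four "instructions"
# stored as numbers in memory: 0 = HALT, 1 = LOAD, 2 = ADD, 3 = DOUBLE.

def run(memory):
    acc = 0    # a single accumulator "register"
    pc = 0     # program counter: which pigeon hole to read next
    while True:
        op = memory[pc]
        if op == 0:            # HALT: stop and return the result
            return acc
        elif op == 1:          # LOAD the next number into the accumulator
            acc = memory[pc + 1]
            pc += 2
        elif op == 2:          # ADD the next number to the accumulator
            acc += memory[pc + 1]
            pc += 2
        elif op == 3:          # DOUBLE the accumulator (multiply by two)
            acc *= 2
            pc += 1

# The "program" is just a list of numbers: LOAD 5, ADD 3, DOUBLE, HALT.
print(run([1, 5, 2, 3, 3, 0]))  # → 16
```

Sending the CPU the number 2 here plays the same role as sending the real chip "number twelve" in the text: the number itself tells it which operation to perform.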

Floppy disks give the computer a much bigger amount of memory. Again, all that is stored is
numbers, sometimes numbers the computer works with and other times instructions in the
form of numbers. Typically a double-sided, double-density 5 1/4 inch diameter floppy disk can
store about four hundred thousand numbers between zero and two hundred and fifty five. The
floppy disk is simply magnetic material, like recording tape, where the numbers can be
recorded. Each number has its own place on the disk, and these places are arranged in
concentric tracks on the disk. Other disk sizes are available. 8 inch disks used to be very common. A
number of manufacturers use a 3 1/2 inch disk. It is in a hard plastic case, but it is still called a
floppy disk. The sizes of disks, and the amount which can be stored on them, is one of the
areas where you can expect rapid changes in the near future. Indeed, disks themselves may
become obsolete quite soon. But we will return to the detailed operation of disks later.
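The "four hundred thousand numbers" figure can be reconstructed with rough capacity arithmetic. The geometry assumed below (40 tracks per side, 9 sectors per track, 512 bytes per sector) is one common double-density layout; actual formats varied by manufacturer:

```python
# Rough capacity of a double-sided, double-density 5 1/4 inch floppy disk,
# under one assumed geometry (formats differed between manufacturers).
sides        = 2
tracks       = 40    # concentric tracks per side
sectors      = 9     # sectors (number slots) per track
sector_bytes = 512   # numbers (bytes, each 0-255) per sector

capacity = sides * tracks * sectors * sector_bytes
print(capacity)  # 368640 bytes -- on the order of the "four hundred
                 # thousand numbers" mentioned above
```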

When the computer is turned on, all of the numbers stored in the RAM are zero. The
implications of this are important. If the only memory that the computer had was RAM, then it would
forget everything every time it was turned off. An instantaneous drop in voltage lasting only a
fraction of a second could be enough to wipe all the memory from RAM. A computer which
loses everything from memory every time the person in the next office turns the lights on would
be no sort of computer at all. A good electrical supply, free from interference, is essential for the
proper working of a computer.

Apart from being a warning not to turn a computer off without making sure it has saved
everything important in RAM to some other form of memory, this underlines the fact that a computer
with only RAM would be virtually useless. The CPU has no instructions and nothing to work
with, so it can do nothing at all. The only thing it can do is collect a few instructions from a
special ROM. These instructions are programmed in by the manufacturers of the computer,
and tell the CPU to do a small number of things. On our computer they tell it to send a
message to the screen. But more importantly, they tell the CPU to read one of the tracks from the
first disk drive. The exact way this is done varies from machine to machine: some wait for the
disk to be loaded, some wait for a key to be pressed, and some just automatically look for the
track on the disk. These differences are not very important. The important thing is that the
limited set of instructions permits the computer to take a set of instructions from the disk and
load it into the RAM. This process is called a cold boot. It is cold, because it is starting from
scratch, and it is a boot, because the computer is pulling itself up by its own bootstraps,
teaching itself the things it needs to know to run.

The instructions which are loaded in this first stage are very important; they are the system
programs for the computer. For some years the more or less standard 8 bit system program was
called CP/M, produced by Digital Research. More recently, 16 bit system programs, such as
MS-DOS and PC-DOS, have started to replace CP/M. CP/M is in two parts, BIOS and
BDOS. What should already be clear is that the names are fairly similar, frequently ending in
"DOS". So what is "DOS"?

BDOS is the system program known as the "basic disk operating system". The other part of
CP/M, BIOS, is the "basic input output system". Any system program must perform roughly
similar functions. The system program is a list of instructions which the computer often
performs, taking information from disk or from the console, or sending information to disk or to
the console, or controlling a printer or modem. This is important for a number of reasons.
Eventually a full program is going to be taken into the computer, which will need to handle
information from console and disks and so on. It will use the system. How does the system
help?

The first observation, of course, is that our application programs can be shorter, because
often-repeated instructions are already in the computer, in the system. But the operating system
performs some other important functions. There are a large number of different ways of getting
disks to work, or to handle printers and modems. Different manufacturers make different
pieces of equipment, different circuits, and so on. The programmer is really no more interested
in how these operations are actually done than we are. All he needs to know is what
instructions to give the system program. This allows a great degree of standardization on very
different equipment.

A program which has been written to work with a particular system can relatively easily be
converted to run on a very different machine using the same system. It will be much harder to
convert it to run on a different system. Again, none of this is important from the user's point of
view. If I buy a program to run on my computer, I expect the supplier to ensure that it will run
properly. This is the job of a specialist programmer, and I will not get involved in it. However,
if my machine runs a common system, programs will be cheaper, and I may have a wider
choice of programs. In selecting a computer in the first place, choosing a common system
could well be a wise choice in terms of keeping future options open. And it is in this area that
claims about "compatibility" are important. Full compatibility means using the same system,
and not all "compatible" machines are what they claim.

For these reasons, in choosing a computer system, the correct way of going about it is to decide
what you want to do, and then choose the program you want to run. Then you decide which
system or systems can support the main program. Finally, a computer must be chosen which
runs one of those systems. This important point is often not understood by people who are
unfamiliar with computers. Choose the machine first, and you may well choose the wrong
machine.

The CPU next checks through the directory to find a program with a specific name. The name
varies from computer to computer: our computer looks for a program called AUTOST.COM.
It is a command program which tells the computer how to start itself. The fact that it is a
command program is important, and is indicated by the suffix .COM in the file name. It means
that it is made up of strings of numbers which can be interpreted as instructions by the CPU
directly. More details about types of programs are given in the next section. Actually,
AUTOST.COM does not do very much. It sends a message to the screen which tells us who
made the computer, and what program is going to be loaded next, and then it loads the
program. In fact, the only good thing about AUTOST.COM is that it gives a convenient way
of telling the computer which program you would like to run.

To recap, to run any program in the computer, whether it is a word-processing program,
spreadsheet, or database, we need a disk with the program on it. Also on the disk must be the
system program, and a program which does the same job as AUTOST.COM. Then when we
put the disk in the machine (and possibly press a key), the computer will go through a
sequence of actions automatically. It reads the ROM to find out what to do next, and it loads
the program into the RAM, starting at a particular point in the memory.
Finally, the CPU goes to the beginning of the program and starts performing the instructions in
it. The computer will then follow the program through, sending messages to the screen and so
on, until it gets to the point where it needs the first instructions from the operator, and then it
will stop.

At this point, what has the computer got in its memory? Take a practical example: I am typing
on a machine with a 64k RAM which runs CP/M. This means it has roughly sixty four
thousand pigeon holes in its memory. (Roughly, because actually "64k" means "64 x 1024". I
shall stick to the rough notion that "k" means "1000".) Each one of the pigeon holes can
contain a number between zero and 255. Each one of those pigeon holes has an identity
number, or address, so that the computer can find numbers which it has stored there.

The first two hundred and fifty six addresses, or pigeon holes, in the RAM are a sort of scratch pad
where the computer leaves itself messages. These messages include the addresses where it
can find the system program when it needs it, where the printer is connected in case you ask it
to print something, and information that it needs to get anything from the disks. At the other
end of the memory, the top four thousand addresses are used to run the video display. Below
that is the system program, which takes up six thousand addresses. So, although we bought a
computer with a 64k RAM, more than 10k is already being used by the computer.

The computer has loaded the word processing program, starting at address number 256, just
above its scratch pad, which is where it expects any program to be. Now the word processor
takes up 75k, which means it does not fit. Fortunately, the people who wrote the program
were able to divide it up into pieces, and the piece it loads into RAM is only 17k. That means
that if the program needs to do something that is a little out of the ordinary, it has to get more
of the program from the disk. That makes it a little slow doing some operations, as it has to
wait while it finds its instructions. Between the word processing program and the system
program is a space of just less than 37k which the computer can use for storing what I type. It
can store one letter, space, or punctuation mark in each address, so if I type more than about
twelve pages it starts moving what I have written onto a disk. This can slow the computer
down, too.
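The memory budget described in the last two paragraphs can be checked with simple arithmetic. The figure of roughly 3000 characters per typed page is an assumption introduced here for illustration:

```python
K = 1024  # "k" in memory sizes really means 1024, as noted above

ram      = 64 * K   # total RAM
scratch  = 256      # low addresses: the computer's scratch pad
video    = 4 * K    # top of memory: memory-mapped video display
system   = 6 * K    # system program, below the video area
wordproc = 17 * K   # resident piece of the word processing program

free = ram - scratch - video - system - wordproc
print(free)          # 37632 bytes: "just less than 37k" of text space
print(free // 3000)  # about 12 pages, at an assumed ~3000 characters/page
```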

So, with only 64k of memory, the operation of programs can be a little slow, and there will
certainly be programs that cannot run in so small a memory space. This is particularly true now
that integrated programs which do word processing, spreadsheet and database operations are
becoming more common. The answer is more memory, but there is a problem in the way the
processor handles memory. An 8 bit processor cannot handle more than 64k of memory at
one time. With 16 bit machines, much larger RAM is possible, and 256k is now almost
universal as a minimum. But not all of that may be available to you, and you must make sure
that the machine can run the largest programs you are likely to want.

Now the top 4k are being used for memory mapped video. What does that mean? Quite
simply it means that whatever is in that piece of memory shows on the screen. If the computer
puts the number 65 in address number 63,000, then a capital 'A' will appear in a particular
place on the screen: 63,000 because that is the address that represents that particular spot on
the screen, 65 because that is how the computer stores a capital 'A'. The computer uses a
code to represent letters. The code is called ASCII; 65 is 'A', 66 is 'B' and so on. When it
gets to 97, it starts again with 'a'.
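The ASCII code and the memory-mapped display just described can be demonstrated directly; the ten-cell "screen" below is, of course, a toy model of the real video memory:

```python
# The ASCII code described above: each character is stored as a number.
print(ord("A"), ord("B"))   # 65 66
print(chr(65), chr(97))     # A a  -- lower case letters begin at 97

# A tiny model of memory-mapped video: writing a number into a cell of
# "screen memory" makes the corresponding character appear.
screen = [32] * 10                       # ten cells, initially spaces (32)
screen[0] = 65                           # put 65 in the first cell...
print("".join(chr(c) for c in screen))   # ...and an 'A' "appears"
```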

From the computer's point of view, writing on the screen is easy, and that means quick. All it has
to do is move a number to an address in memory, and writing appears on the screen. But it
does take up memory. The alternative is for the computer to treat the screen as if it were
something quite outside and apart from itself, like a printer. Then it would send messages to the
screen. This is slower, but does not use up valuable memory. In practice, when you buy a
computer, you do not actually choose between the different ways in which the computer could
send messages to the screen. The design engineer has weighed up all the pros and cons, and
you either take it or leave it. But when you buy, you ought to test several machines to see
how they perform, and whether you like them.

GLOSSARY OF MICROCOMPUTER JARGON

8 Bit: A form of arithmetic which uses eight bits, or binary digits. Handles numbers up to 255.
Also applied to CPUs which mainly use 8 bit arithmetic, though 8 bit processors can perform
some 16 bit operations for handling addresses. See Bit.

16 Bit: A form of arithmetic using 16 bits. Also applied to CPUs using mainly 16 bit arithmetic.
Largest number handled is 65,535. The most common form of arithmetic now used in
microcomputers. See 8 Bit.

32 Bit: A form of arithmetic using 32 bits. No practical limit to the size of the number which can
be handled. See 8 Bit.

Address: A number assigned to a location in memory, so that the location, and its contents, can
be identified.

ASCII: American Standard Code for Information Interchange, a standard code for passing
letters, punctuation marks and some other symbols between pieces of equipment.

Assembler: A program which can translate a program written in a high level language into
instructions which can be used by the computer directly.

Basic: A widely used high level computer language, especially used for teaching purposes.

Baud Rate: Rate at which information signals are passed from one machine to another.

Bit: (to rhyme with sit) Binary digit. A 1 (one) or 0 (zero), indicated by a voltage which is high or
low. All arithmetic functions are possible in numbers which use only 1s or 0s, although
multiplication and subtraction are considerably easier than in normal number systems.

Boot: The boot is the process by which the computer takes the system programs from the disk
and loads them into RAM. The process can be started completely from scratch when the
machine is turned on (a cold boot) or used to lift a modified system from a disk or to recover
from some program errors (a warm boot).

Byte: (to rhyme with kite) A group of eight bits. See Bit.

Centronics: A manufacturer of computer equipment. The name applied to one of the standard
layouts for a parallel interface. See Parallel Port.

Cobol: A high level language used mainly for business purposes.

Cold Boot: See Boot.

Command Program: A computer program prepared in numbers which can be directly inter-
preted by the CPU. Stored in files whose names end in the suffix .COM.

Communication Protocol: A system of signals used when information is passed from one ma-
chine to another, to ensure that the receiving machine can control the rate at which information
is passed.

Console: The screen and keyboard of the computer.

Continuous Stationery: Paper in which pages are joined top to bottom so that they can pass
continuously through a printer.

Control Characters: Characters produced on the keyboard by holding down the "Control" key
at the same time as pressing another key. Assigned values below 32 in ASCII, they cannot be
printed by a printer but indicate special functions including carriage return, line feed, back-
space, end of file and so on.

CP/M: Control Program/Monitor. A common set of system programs produced by Digital
Research, of whom CP/M is a registered trade mark.

CPU (Central Processing Unit): The heart of the computer which performs the arithmetic, can
remember about 30 numbers and can move numbers directly in and out of RAM.

Daisy Wheel: A metal or plastic wheel which has all the letters and punctuation marks pre-
formed on it, which prints letters by pressing a carbon or ink impregnated ribbon onto paper.
Also used to describe printers which use daisy wheels.

Dot Matrix: A system of forming letters by which around forty pins are pressed against the paper
through a carbon or ink impregnated ribbon. Also used to describe printers which use this
system of printing. Some of the better dot matrix printers permit the control of single dots,
which gives good graphics capability.

Data Bus: The internal wires which are used for passing information around inside the computer.
A bus or busbar is a general electrical engineering term for a supply wire which many pieces of
equipment can be joined to. The data bus is controlled by a data handling chip which is
separate from the main processor chip. The combination of the CPU and the data bus
controller gives the computer most of its characteristics.

EPROM: Erasable/Programmable Read Only Memory: Memory in which the computer engineer
can store instructions for the CPU. Can be erased and new information put in it, but this
can only be done with special equipment and knowledge. Contents of memory unaffected by
being switched off and on.

Escape Key: On the computer keyboard, used to recover from some types of errors. Assigned the
value 27 in the ASCII code. Frequently used in special instructions to the printer, e.g.
changing print style on dot matrix printers.

Fan Fold Paper: See Continuous Stationery.

Field: Each record in a database is divided into fields, which appear in every record. Each
field will be a single item of information, e.g. name, address, age.

File: Information is stored on disk in files. A file name is given to the file when the file is first
opened.

Floppy Disk: A disk of magnetic recording material on a plastic base. Numbers can be re-
corded on the surface of the disk and retrieved later on. Information is permanent, in the sense
that the information remains when the disk is removed from the machine or the machine is
turned off, but floppy disks are easily damaged if not handled correctly. Copies of important
information should be kept.

Formatting: The process of preparing a floppy disk for use by recording zeros all over the
working surfaces. The process destroys any information on the disk.

Fortran: A high level language usually used for technical and engineering purposes.

Friction Feed: System of passing paper through a printer where the paper is held firmly be-
tween two rubber rollers. Not suitable for precision positioning of paper as is required for
using continuous stationery.

Function Keys: Keys on the computer keyboard which are specially arranged to give se-
quences of instructions to the computer. Usually the exact function of the keys is automatically
set by the system program or the program being run, but it may be possible to change them.

Hard disk: A system for storing large amounts of information which is retained when the com-
puter is turned off and on. Originally similar to floppy disks, using hard plastic, larger surfaces
and higher speeds, they are now usually made from electronic components.

Hardware: The physical machinery and equipment of the computer as opposed to the
programs or software.

IEEE 488: One of the standard layouts for a parallel interface. See Parallel Port.

Input Port: Port for taking information into the computer. See Port.

Intel 8080: The most common low level language, used by those who want to program directly
in the instructions which the CPU of an 8 bit machine can interpret, and the
assembler for it. Each instruction in the language converts directly into a sequence of no more
than four numbers.

Interfacing: Arranging that two machines can communicate with each other. Normally involves
arranging the connections and modifying programs so that necessary control characters are
used appropriately.

Interpreter: A program which can take a program written in a high level language and translate it,
instruction by instruction, into a form which can be used by the CPU. Contrast with an assem-
bler, which translates the whole program before running any of it. Interpreted programs run
slower than assembled ones.

Keyboard: The main way for information and instructions to feed into a computer while it is
running a program. Computer keyboards are usually similar to typewriter keyboards, but with
extra keys. Note, for example, that the computer treats 1 (one) completely differently from l
(lower-case ell).

Language: A set of instructions which are more or less similar to English or standard mathemati-
cal notation, or which are in the form of mnemonics. Each instruction can then be translated
into instructions which the CPU can use by an assembler or interpreter. Languages make
programming easier, because the instructions make some kind of sense, so they can be
remembered, and because the assembler or interpreter can usually take over routine problems
such as allocating RAM for storing numbers during calculations. Languages are usually suited
to specific purposes.

Loader: A program which takes programs which have been assembled and adds the instruc-
tions so that the system programs will put the program in the correct place in RAM to enable
them to run. A program need only be "loaded" once, after which it can be run normally.

Mainframe Computer: A large computer with powerful CPU and large and rapid RAM. To
make full and economic use of this capacity the computer will normally have multiple ways of
entering programs and information. These may be "on line" consoles similar in appearance to
a microcomputer console, or "off line" such as punched cards and paper tape.

Memory Mapping: A system of handling the sending of information to the console or output
ports whereby placing a number in a particular memory address is equivalent to sending the
information. Similarly for inputting information. This is quicker than most other ways of
handling information.

Menu-Driven: A way of preparing programs so that the possible options are presented on the
screen to the operator, thus making it possible, in theory, to learn how to use a program from
the program itself. The alternative of non-menu driven programs involves the learning of simple
instructions, and may involve the writing of special files containing instructions. With non-
menu driven programs the user is completely reliant on written manuals.

Microcomputer: Computers of small capacity, comprising as a minimum a CPU and a few
thousand locations of RAM. This category extends upwards to include most desktop ma-
chines.

Minicomputer: Computers of medium capacity, between microcomputers and mainframe
computers. The boundaries of these groups are not precise.

Modem: A device which can be used for transferring information between the computer and a
telephone, and hence, via another modem, to another computer. Normally a relatively
straightforward way of passing information from one computer to another, it can be used to gain
access to remote databases and computer facilities.

Monitor: The screen on which the computer displays information.

Networking: A system of connecting microcomputers together so that they can all make use of
an expensive resource which would be underused by a single microcomputer. This is usually a
hard disk or ROM. It may also permit the transfer of information from one microcomputer to
another.

Output Port: Port for putting information out of the computer. See Port.

Paper Tape: Long strips of paper about two centimeters wide on which letters could be
punched, in ASCII, as rows of eight holes, for entering programs and information into comput-
ers; now rarely seen.

Paper Tape Punch: A machine, driven by the computer, for punching patterns of holes in paper tape.
See Paper Tape.

Parallel Port: A port in which eight wires can be used to transmit the information in one
8 bit number simultaneously, or in parallel. See Port.

Peripherals: Generic name for any piece of machinery which can be connected to the computer,
e.g. printers, modems, magnetic or optical text readers, and external hard and soft disks. In
some contexts may include things which would normally be considered to be part of the
computer, e.g. internal hard and soft disks, and console.

Port: A plug and socket through which information can be passed into or out of the computer.

Pascal: A high level language used mainly for mathematical purposes.

Printer: Any device which is capable of taking information and changing it into print on paper.
Printers can frequently do more, in terms of producing graphics. A wide range of qualities and
technologies is available in printers. Dot matrix printers are usually quicker and cheaper, while
daisy wheel printers usually produce better quality print, and are slower or more expensive.
See Dot Matrix and Daisy Wheel.

Print Spooler: Additional memory which enables the computer to run a printer, and possibly
more than one, while having the CPU available most of the time for other processing functions.
Print spoolers may be either internal to the computer, as they usually are now, or external.

Program: A sequence of instructions for the CPU to perform. May be in instructions which the
CPU can use directly, or may be in a high level language which needs to be assembled or
interpreted.

RAM (Random Access Memory): The main working memory of the computer. Contents
destroyed when the machine is turned off.

Record: In a database, information is divided into records; a record holds all the information on
a single case. Each record will contain all the fields which relate to that case.
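In modern terms a record maps naturally onto a dictionary whose keys are the field names. The field names below are the examples given under Field (name, address, age); the values are invented for illustration:

```python
# A tiny database: one record per case, every record carrying the
# same set of fields.
database = [
    {"name": "A. Smith", "address": "12 High St", "age": 34},
    {"name": "B. Jones", "address": "7 Low Rd", "age": 41},
]

# Each record contains all the fields relating to that case.
for record in database:
    print(record["name"], record["age"])
```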

ROM (Read Only Memory): Memory which can store fixed information or instructions for the
CPU. Since the memory contents are not destroyed by turning off and on, it is most impor-
tantly used for storing the instructions which the computer needs to start up.

RS232: The industry standard for serial ports.

Serial Port: A port for passing information into or out of the computer, whereby all eight bits of
an 8 bit number are transmitted down a single wire, one after the other.
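The contrast with the parallel port can be illustrated by taking an 8 bit number apart into the eight successive bits a serial port would send. The helper below is an illustrative sketch, not any real interface:

```python
def to_bits(value):
    """Split an 8 bit number into its eight individual bits, most
    significant bit first, as a serial port would clock them out
    one after the other down a single wire."""
    return [(value >> shift) & 1 for shift in range(7, -1, -1)]

# 65 (ASCII 'A') leaves a serial port as eight bits in succession;
# a parallel port would present all eight wires at once instead.
print(to_bits(65))   # prints: [0, 1, 0, 0, 0, 0, 0, 1]
```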

Sheet Feeder: A mechanism which is capable of feeding individual sheets of paper into a printer.

Soft Disk: See Floppy Disk.

Software: Generic term for programs, as opposed to hardware.

System: See System Programs.

System Programs: The first programs loaded into the computer from disk, which control the
basic functions of the computer, handling the input and output of information and controlling
the disk drives.

Tractor Feed: A mechanism which keeps continuous paper straight when feeding long runs of
paper through a printer, and ensures that the top and bottom margins do not change. Uses
rubber teeth fitting into holes down the edge of the paper.

VDU (Visual Display Unit): Another term for either a monitor or console, depending on con-
text. See Monitor or Console.

Warm Boot: See Boot.

ACHIEVING CURRICULUM-INTEGRATED COMPUTING

By SAMUEL MUDD AND WILLIAM WILSON

THE DEGREE of integration of computing into an academic curriculum has both a quantitative
and a qualitative dimension. Given the "course" as the standard unit of work in most academic
institutions, it is possible simply to count the number of courses and sections requiring comput-
ing to have a crude index of the extent to which computing has penetrated a particular curricu-
lum. A more complete index of integration from a quantitative point of view can be developed
by incorporating into the index the number of assignments in each section of a course. Compari-
sons of the course/section assignment percentages over time, across departments, divisions, or
institutions can then be made. Such percentages, as useful as they can be, do not reflect the
quality of the integration of the computing that is incorporated into the course work.
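The two quantitative indices described above can be computed mechanically from a course census. The section data and field names below are hypothetical, purely to show the arithmetic:

```python
# Hypothetical census: for each course section, how many assignments
# it carries and how many of those require computing.
sections = [
    {"course": "PSY101", "assignments": 10, "computing": 2},
    {"course": "PSY101", "assignments": 10, "computing": 0},
    {"course": "ECO201", "assignments": 8, "computing": 4},
]

# Crude index: fraction of sections requiring any computing at all.
crude = sum(1 for s in sections if s["computing"] > 0) / len(sections)

# Fuller index: share of all assignments that involve computing.
full = (sum(s["computing"] for s in sections)
        / sum(s["assignments"] for s in sections))

print(f"crude index: {crude:.2f}, assignment share: {full:.2f}")
```

Comparing such figures over time, or across departments and institutions, is then routine; as the text notes, neither number says anything about the quality of the integration.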

One important aspect of the quality question has to do with the complexity of the computing
material assigned, whether it be a relatively low level word processing or CAI program or a
conceptually more demanding simulation or programming assignment. This dimension of
quality is not dealt with in the following discussion, since quality in that sense depends upon the
needs of the students relative to the level of the course and its development over the semester.
Our concern is for the quality of the incorporation of the computing assignment into the course
regardless of the cognitive complexity characteristics of the work assigned.

Academic computing from the standpoint of incorporation quality depends on the extent to
which the computing assignments are meaningfully derived from, and meaningfully relatable
back to, the subject matter of the course involved. It is impossible to overestimate the impor-
tance of this qualitative aspect of curriculum-integrated computing (CIC). The assessment of
how well computing assignments relate to the rest of the course material requires a careful
examination of the instructional documentation associated with those assignments.

The management of computing integration into an undergraduate curriculum must, of course, be
concerned with both the quantity and the quality of work adopted. The availability of hard-
ware and off-the-shelf software does not guarantee that adoption will occur. Dartmouth,
probably the leader in academic computing in the U.S. during the 1970s, provides an instruc-
tive example. Cohen [1] reported a 6-year follow-up of a faculty user study by Nevison [2]
involving over 300 faculty respondents. The pertinent data were as follows:

Item                            1975    1981    Gain

Use computing in research        43%     56%     13%
Have written computer program    45%     56%     11%
Use computing in course          29%     34%      5%

Notice that faculty use in the research and program-writing activities increased at twice the rate
(2% per year) that course use increased (1%). The question is, "Why the adoption lag for
course integration?"

One interpretation of these data is that the adopter population was saturated, that few potential
adopters of computing were yet uncommitted to that teaching innovation. A second account
is that the most likely adopter pool of self-starters was exhausted, but that a substantial
reservoir of faculty remained who under appropriate conditions would undertake computing in
their courses; computing facilities, for example, may not be able to support the usage some
faculty would like to initiate. Finally, it could be that the easily converted computer applications
had already been done; if so, the remaining applications may require more thought and time,
resulting in a slower implementation pace.

It is the purpose of this paper to present two concepts the authors judge essential to planning the
conditions necessary to encourage and manage the systematic involvement of uncommitted
faculty: instructional documentation and the diffusion model.

INSTRUCTIONAL DOCUMENTATION

Materials prepared to structure student computing assignments have been a part of academic
computing from the beginning. Most of these materials have been inadequate. It is one of our
points that there is a fundamental connection between the inferior quality of the typical student
instructional "handout" and the reluctance of many faculty to use computing in their courses,
even when these same instructors are heavy users in their own research and scholarship. This
connection has two aspects.

Procedures

Most obviously, the time price that the instructor pays responding to the procedural questions of
the student user as he or she works dutifully through the handout can discourage the use of
computing work in a course. Few instructors can maintain a patient manner responding to an
identical question time after time. It is not the fault of the students, of course, if the instructional
documentation provided for the assignment is defective with respect to its procedural (cook-
book) component. It should be acknowledged, however, that a procedural deficiency is a
nuisance both to the student and to the instructor.

Subject matter context

There is another more common and more fundamental defect in the usual instructional docu-
ment, whether it be a screen presentation or a printed handout. The typical computing assign-
ment simply does not provide the student user with sufficient background material about the
computing exercise to tie it meaningfully to the content and substance of the course for which it
is assigned. Ideally a computing exercise handout would be sufficiently complete that it could
stand alone; that is, the student at his or her current point of development in the subject matter
of the course at the time of the assignment is provided enough information to appreciate fully
the point of the computing work.

Instructional documentation development cycle

Thorough documentation demands substantial "up front" development effort before the exercise is
assigned to students. The preparation of the material relating the exercise to course content
requires that the instructor not only know what the course is designed to accomplish, but also
how that course unfolds in the experience of the student. Normally the experienced teacher
can make an initial analytic determination of content adequacy after which the document can
be pilot tested on a few students before committing it to an entire class.

The pilot work is even more important for the procedural aspect of the instructional documenta-
tion. The vagaries of particular types of input devices, the syntax of the specific operating
system, the format of the final write-up of the exercise, etc. are all points at which the student
can be frustrated by ambiguous statements which distract from the primary instructional
function of the exercise.

Incessant procedural questions can overload the instructor to the point of rejection of computing
due to the lack of time (and patience) to service inadequate instructions. Fortunately the
procedural component of instructional documents can be piloted by one or two colleagues or
students, then field tested on a small class or one section of a large course before committing it
to multiple sections and large classes.

Illustrative case

At Gettysburg College, the authors developed a simple exercise to introduce beginning psychol-
ogy students to the College computer and the uses of computers in psychology. A short, three
paragraph text, Introduction to Computing in Psychology, describing such uses was prepared.
The text was deliberately flawed to provide students with material on which to learn several
commands used on our mainframe to edit text (several words misspelled, a duplicate line,
etc.). An instructional document of five pages was drafted to guide students through the
exercise from sign-on, through the editing problems, to listing the text (to be handed in), to
sign-off. The assignment assumed no previous experience with any computer. Over two
semesters (and one change in mainframe) the instructional document was piloted, revised,
field-tested, and revised again.

The final version of the exercise was implemented in spring, 1983 in one lower level psychology
class. The average time required by students to run the exercise is shown in the table below.

Type user       n    Time required (min)    Usefulness

Experienced    10            44                 6.1
New            18            67                 7.3

With respect to procedural adequacy it was reassuring to find that the difference in average time
to complete the exercise for those who had used editing functions before (44 min) and those
students who had never used it (67 min) was only 23 min.

The most positive factor of the full class implementation of the document was that only one
student-instructor troubleshooting interaction occurred. In that case the student failed to read
ahead to see that a keyboard input was required to continue the exercise. Equally encouraging
was the experience of a departmental colleague who used the document with two introductory
classes in the Fall, 1983 semester. He reported just one trouble-shooting contact concerning
the exercise, and that was an individual who wanted to know if he could do more with the full
editor.

To get a feel for the contextual adequacy of the document, students were also asked to rate the
usefulness of the exercise on a scale from 1 (waste of time) to 9 (valuable use of time). Expe-
rienced students (n =10) had an average rating of 6.1 while inexperienced students (n = 18)
averaged 7.3. Even though the exercise was old hat for experienced users they rated the
experience positively while the new users rated their computing introduction substantially
higher. Although positive ratings do not guarantee that the exercise made sound programmatic
sense to the students, it is certain that negative ratings would be a sign that the assignment
somehow did not fit student experience and expectation for the course in which they were
given that assignment. In this case the text on which students practiced a sample of commands
was prepared especially to introduce students in beginning psychology courses to three
general uses of computing in psychology: data analysis, control of experimental equipment,
and simulations. In other words the text on which word processing was introduced was
appropriate to the context of the course in which the introduction was made. The assignment
was context-relevant.

The experience with The Introduction to Word Processing and The Uses of Computing in
Psychology text materials demonstrates that a careful developmental process can produce a
meaningful instructional document that serves student learning effectively without swamping the
instructor with repetitious troubleshooting interactions. There are, of course, no free lunches.
Substantial instructor time and effort must go in up front in the development of the documents.
But once that time/effort price is paid, the document works autonomously to provide substan-
tial student involvement with minimal instructor servicing. Further, the document tends to be
durable over time to the extent that hardware and software changes do not occur.

In the case of the illustrative exercise, Introduction to Computing in Psychology, the assignment
provided a natural stepping-stone to bring the naive user to the system and to provide him or
her with some initial, guaranteed success. From that introduction the instructor can move to a
selection of content-oriented exercises tailored to the particular course.

THE DIFFUSION MODEL

Turning now to the quantitative aspect of CIC, the problem is how to foster adoptions of
computer work by non-user faculty. Framing the issue in terms of adoption calls into play a
powerful model of the diffusion process found to be valid over a wide range of technological
innovations in agriculture, medicine, education, and industry [3].

The formal diffusion model specifies the cumulative percentage of adoptions by the population to
follow an S-shaped, ogive function of time, the steepness of which varies from innovation to
innovation [4]. In addition to the invariance of the shape of the diffusion curve, a number of
robust findings have been shown in study after study (see Rogers and Shoemaker [5] for a
review of this literature). Several variables have been found to be useful for planning the
incorporation of an innovation throughout a user population. Three of those factors are de-
scribed briefly below, after which a plan will be described which capitalizes on these determin-
ing variables to promote the diffusion of computing into the course structure at Gettysburg
College.
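The S-shaped, ogive diffusion curve is commonly modelled with a logistic function, sketched below for illustration; the midpoint and steepness values are arbitrary, and as the text notes, steepness varies from innovation to innovation:

```python
import math

def cumulative_adoption(t, midpoint=5.0, steepness=1.0):
    """Logistic (ogive) curve: fraction of the potential adopter
    population that has adopted by time t."""
    return 1.0 / (1.0 + math.exp(-steepness * (t - midpoint)))

# Slow early growth, rapid adoption in the middle years, then
# flattening as the population saturates.
for year in range(0, 11, 2):
    print(year, round(cumulative_adoption(year), 3))
```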

A substantial amount of adoption behavior can be accounted for in terms of the interaction of
three main determinants: adoption type, adoption stage, and type of information/influence
source. The following description of each is taken from Zaltman [6], an excellent introductory
source.

Adopter types

Five adopter types have been identified. Innovators, constituting about 3% of the population,
are highly visible, successful, and respected by the community. They are not leaders but they
are watched carefully by the community. They have wide contacts outside the community.
These individuals are first to adopt, thereby bringing the innovation into the community. Early
Adopters, constituting the next 16% of the adopters, are the younger, formal leaders of the
community. They participate more in community activities and tend to "keep up" with more
technical sources of information. The Early Majority category (34%) includes the somewhat
older, less prosperous, but informed leadership class. They are less active in the community
and less likely to use formal (written) sources of technical information. The Late Majority,
consisting of about 34% of the target population, are older, less successful, less well educated,
and less active in the community. Laggards, the last 16% of the community to adopt, if they adopt
at all, are the oldest, least successful, and least inclined to participate in community activities.

Adoption stages

There are five more-or-less distinct stages of the individual adoption process. In the first stage
the potential adopter develops an Awareness of the innovation, but has only general informa-
tion that such exists. In the second stage, Interest, the individual begins to collect information
about the new product (technique, etc.). If interest holds, the stage of (mental) Evaluation
begins in which the potential adopter imagines using the innovation. During this stage the
expected problems and payoffs are weighed. The next stage, Trial, is the period when the
adopter tries out the innovation on a small scale. If all goes well during the trial stage, the stage
of Adoption is entered and the process of adoption for that innovation is completed for that
particular adopter.

Information/influence sources

Important early work by rural sociologists established that there were four general sources of
influence/information on the adopters of agricultural innovations: Informal/Personal Sources
(friends, neighbors, etc.), Government Agency Personnel and Publications (e.g. county
agents, etc.), Commercial Sources (e.g. farm supply houses, etc.), and Mass Media [4]. The
basic finding over dozens of studies was that the source of information and influence reported
by adopters varied systematically with adopter type, adoption stage, and type of innovation.

The findings concerning the interaction of these three variables are complex, but systematic. The
implication of the research is straightforward so far as its application is concerned. In any
given situation where a program of adoption is to be planned for a given innovation it is
necessary to take into account stage of adoption, type of adopter, and preferred mode of
information/influence.

Illustrative case

The diffusion model was used in the preparation of a plan to promote the adoption of CIC at
Gettysburg College, a typical liberal arts institution of 1850 students and a fulltime faculty of
130. The plan was organized around three levels of the curriculum: the course, the depart-
ment, and the overall institution.

At the level of the single course, where a single instructor desires unique applications for his or
her course, the following steps are recommended to staff responsible for integrating computing
into the curriculum.

(1) Locate software relevant to the application in a specific course.

(2) Alert course faculty to existence of software (or faculty alert Coordinator of Academic
Computing).

(3) Train course faculty in use of program.

(4) Help faculty develop instructional documentation for new exercise.

(5) Serve as consultant to the instructor on the following items:
(a) development of instructional support system for new exercise;
(b) installation of computer exercise in course syllabus;
(c) evaluation and modification of exercise as dictated by student experience with exercise and
computer center personnel recommendations with regard to procedural difficulties.

The involvement of a department which has shown little interest in the incorporation of comput-
ing into the departmental curriculum can proceed according to the following steps:

(l) "Cultivate" a computer advocate among faculty in the target department, possibly through the
course unit strategy outlined above.

(2) Acquire a sample of specimen software relevant to the content domain of the department.

(3) Have the departmental advocate demonstrate use of program to departmental staff.

(4) Obtain commitment by department to development over time of a library of


instructional programs.

(5) Prepare a departmental catalogue or handbook of instructional documentation for student


computer exercises.

In addition to the cumulative work going on at the course and department level, efforts can also
be made at an institutional level as follows:

(1) Integrate appropriate programs across departments. For example, International Futures
(CONDUIT) is applicable to economics, political science, energy studies, and sociology.
Models have been developed in many disciplines using EXPER SIM (CONDUIT).

(2) Employ developed department(s) (above) as model(s) for other departments in the College.

(3) Promote actively an awareness and understanding of domain-specific, curriculum-integrated
computing. At Gettysburg, a Practicum on Computing is held each year within a specific
discipline. This allows our faculty, as well as those from other institutions, to explore relevant
application software and to interact with colleagues who are actively working with computing
courseware.

Evaluation of CIC

The success of any attempt to integrate computer work fully into a curriculum is relatively easy
to establish with respect to the quantitative aspect of that integration. If departmental liaison
positions are filled, it will be a routine matter to establish from semester to semester the
number and type of computing exercises, the number of courses and sections using computing,
the number of exercises in each, and the number of students involved in each. Such course
census information can be compiled at the departmental level and cumulated across academic
divisions to represent the overall college status of CIC from semester to semester. In practice,
the matter-of-fact publication of such data typically operates as a powerful incentive for late
adopters to get involved in computing (taking advantage of the so-called "bandwagon effect").
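The census described above amounts to simple tallying per department, then cumulation across departments. A minimal sketch of that bookkeeping (in modern Python; the record fields and figures are invented purely for illustration, not drawn from the article):

```python
from collections import Counter

# Hypothetical course-census records: one per course section using computing.
sections = [
    {"dept": "Psychology", "course": "Psych 101", "exercises": 4, "students": 32},
    {"dept": "Psychology", "course": "Psych 205", "exercises": 2, "students": 18},
    {"dept": "Economics",  "course": "Econ 210",  "exercises": 3, "students": 25},
]

def census(sections):
    """Tally, per department, the quantities the article names:
    sections using computing, exercises, and students involved."""
    totals = {}
    for s in sections:
        dept = totals.setdefault(s["dept"], Counter())
        dept["sections"] += 1
        dept["exercises"] += s["exercises"]
        dept["students"] += s["students"]
    return totals

report = census(sections)
# Cumulate departmental counts for the college-wide, semester-level figures.
college = sum((c for c in report.values()), Counter())
```

Published each semester, a table built from such totals provides the matter-of-fact record of diffusion the article recommends.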

Evaluation of the qualitative aspect of CIC is not so convenient, but it is accessible. Much has
been made of the critical place of instructional documentation in the integration of computing
work into the flow of course meaning as it unfolds over the semester or term. The incorporation
of these course documents into a departmental computing handbook makes them readily
accessible for critical review by colleagues. Such reviews may be quantified in the form of
ratings, similar, for example, to NSF proposal review ratings, or may simply be in the form of
a narrative, as in a book review. Both formats are common in academic work and should,
therefore, be readily understood, if not eagerly embraced.

A second source of assessment of the qualitative (meaningfulness) aspect of CIC is student
ratings of the work (see the evaluation of The Use of Computers in Psychology above). It is our
experience that student evaluations are forthright, slightly positively biased, and valid represen-
tations of students' perceptions of their experience.

In combination, student evaluations and peer review of instructional documentation provide rich
sources of information about program quality. To the extent that document quality reflects
effective CIC, such information allows careful inferences to be made about the quality of CIC,
while course census data support more direct conclusions about the quantitative diffusion of
computing into the academic curriculum.

REFERENCES

1. Cohen P.A., Computing in college courses: the Dartmouth experience. Presentation at the
Annual Meeting of the American Educational Research Association, New York (1982).

2. Nevison J. M., Computing as a matter of course: the instructional use of computers at


Dartmouth College. Dartmouth College, Hanover, N. H. (1976).

3. Gersho A. and Mitra D., A simple growth model for the diffusion of a new communication
service. IEEE Trans. Syst. Man Cybernet. SMC-5, 209-216 (1975).

4. Ludington C., The Adoption of New Products: Process and Influence. Foundation for
Research on Human Behavior, Ann Arbor, Mich. (1959).

5. Rogers E. M. and Shoemaker F. F., Communication of Innovations: A Cross-cultural
Approach, 2nd edition. The Free Press, New York (1971).

6. Zaltman G., Marketing: Contributions from the Behavioral Sciences. Harcourt, Brace &
World, New York (1965).

COMPUTER TECHNOLOGY AND ITS APPLICATION TO TEACHER TRAINING
PROGRAMS

By DARIUS R. YOUNG

In a 1983 article in Omni magazine, Toffler brings to our attention that over the past 300
years, the Industrial Revolution gave rise to a chain of interconnected mass societies from
Europe and North America to East Asia. These societies were based upon mass production,
mass distribution, mass education, mass media, mass entertainment and mass political move-
ments, not to mention weapons of mass destruction. Now we have a deluge of data in the
form of mass information.

Without question, we are leaving the Industrial Age and entering the Information Age. Coupled
with this change is the introduction of information technology and high technology. By the year
2000, it is estimated that 80 percent of our workforce will be involved in information manage-
ment of some form or another. That workforce is in school today. Today's students need the
education and training to cope with the Information Age. However, teachers must be educated
and trained before they, in turn, can educate and train their students in the school (Agee,
1985, p. 97). The purpose of this paper is to discuss some of the issues and applications of
computer technology as it relates to teacher education.

Historical Background

Computers were initially introduced in American schools not by teachers, not by administrators,
not even by computer manufacturers, but by white, middle-class, mainly suburban parents. As
pointed out by the Carnegie Corporation, these were the parents who were going into the
newly opened computer stores and buying one device after another to take home and find out
what it could do. Many found how useful these devices could be and became the first to put
pressure on the schools to introduce computers into their children's education. Parents would
literally deliver a computer to the school and say to the school personnel, 'Do something with
it'. Meanwhile the parents had no clear idea of what it might mean for kids to be competent
users of the machine. It did appear to these parents that students who learned computing
would have a real advantage in life over children who did not (Carnegie Quarterly, 1985, p. 3).

This situation started somewhat of a chain reaction in that now inside the schools, administrators
were faced with a dilemma: parents were demanding that the schools teach their children how
to use computers, while on the other hand the majority of the teachers were skeptical, and
sometimes hostile, towards teaching students about computers.

Initially, teachers obtained their computer training through short workshops, seminars and other
inservice activities. Most of these activities were superficial and dealt with only the rudiments
of computer literacy. Frequently these workshops were conducted by vendors of computers
who obviously had increased sales in mind. These computer salespeople, often well-meaning
individuals, had no concept of educational or instructional techniques and methodology. The
teachers/students learned how to load and run a program, play some computer games under

the guise of computer-aided instruction and maybe did a little programming. At the conclusion
of the class, many teachers would buy themselves a computer and some software.

Were these teachers trained how to teach computing to their students? In most cases, no. The
teachers became frustrated, the students untaught and the schools out mega-dollars. As one
can imagine from the foregoing, the beginning approach to computer teacher education was
fractured and fragmented (Agee, 1985, p. 96).

Computer Literacy

Daniel Watt (1980) defined computer literacy as "the skills, knowledge, values, and relationships
that allow the teacher to comfortably use the computer as an instructional tool to prepare
students to be productive citizens in a computer-oriented society."

Those responsible for teacher education can no longer ignore the computer and its growing
impact upon education programs. However, there is much talk, and often disagreement, about
what computer education ought to be. There are those who say a computer is a high-technol-
ogy tool, a problem-solving tool. Most agree that the computer can be a useful tool, but
disagree on how it is to b used. Sandoval, 1984, says the distinction needs to be made
between the teacher who knows and can teach about computers and the teacher who can use
computers to facilitate the learning of any subject matter. He further states that until the present
training of teachers in the use of computers is refocused and redefined, the highly desired and
expected increase in educational quality and productivity which computers are predicted to
facilitate will not occur. More direction needs to be taken relating the uses of computers to the
total teaching process. This means integrating teaching strategies, application of learning
theories, classroom management, materials evaluation and the evaluation of learning with a
computer. The focus should be more on the use of the computer as a tool at the teacher's
disposal to help in the management of the environment to insure learning.

Much of the focus in computer education is on the computer itself and how it works. To illus-
trate this point, Marc Tucker of the Carnegie Corporation says that real computer literacy
should signify mastery over a powerful tool of intellectual and creative endeavor. He likens
pencil technology to computer technology. The pencil is the focus point in "courses in the
pencil". We study the structure of the pencil, how to sharpen pencil points, the history and
social impact of the pencil. All of this might be interesting, but what children really need to
know is how to use pencils to achieve something else. Courses in pencil use are questionable,
if students do not have pencils in their hands. An analogous situation has faced most
students with computers. By the fall of 1985 there was still only about one computer for every 45
students in America (Carnegie Quarterly, 1985, Summer). The widespread availability of
computers, or the lack of it, is a severe problem in computer literacy. Now a number of major
universities offer courses in computer literacy to teachers. Many of these computer courses
train teachers to use the computer as a teaching aid. This is a limited use of computers. Teachers
are left in a state of flux on how to teach using a computer and how to find quality educational
software. This situation comes about in that the teacher has not been trained in the evaluation
of educational software, let alone in the teaching methodology for computer-assisted instruction.
Computer training, as stated by James Dunne of Teachers College in New York, tends to
deal with people getting acquainted with low-level computer literacy activities. Just getting
acquainted with a microcomputer is not enough. Learning a computer language like BASIC,
LOGO or PASCAL may help a teacher get a raise in pay or even a job in the business
community, but it won't train the learner how to teach computers and computing.

Computer Curriculum

As one might suspect with debate over what is or is not computer literacy, the same problem
occurs with computer curriculum. Typically, a computer education program consists of 1) an
introduction to the history of computers and their impact in society; 2) knowledge of computer
application in instruction; and 3) computer programming in one or more languages (Hoth, 1985,
January). This type of program leaves much to be desired. Teachers don't need to know the
history of film in order to use films appropriately and effectively. It is difficult enough to maintain
adequate levels of knowledge about the current applications and potential of computers,
let alone discuss possible changes in society in the future. What teachers really need to know
are learner needs and the software which will meet those needs efficiently and effectively.

Hoth (1985), in view of inadequate computer curricula, suggests a literacy program that
involves: 1) "hands-on" work with computers, to cover basic operations and familiarity with a
variety of software; 2) instruction in a glossary of basic computer terminology; 3) instruction in
practices which protect software and defend against software piracy; 4) education in prin-
ciples of instruction design and discussion of subsequent criteria for evaluation of software for
their classes; and 5) instruction in various levels of computer application in instruction.

Basic level competencies attained by teachers participating in this type of program would be:

1) To operate microcomputers and hardware accessories correctly and to demonstrate their
correct operation to others;

2) To define certain basic terms like RAM, byte, load, boot, modem, etc. and to describe their
significance to the ability to use microcomputers;

3) To identify general types of problems amenable to solution by microcomputers and the tools
necessary for solving such problems;

4) To identify inappropriate uses of computers in problem-solving;

5) To distinguish between computer-aided and computer-managed instruction, and to describe
several examples of each;

6) To describe methods of using software which prevent computer piracy and promote ethical
computer uses;

7) To use a variety of instruction and classroom management software appropriate for classes;

8) To identify several sources of current information on computers in education and generally to
rate the sources; and

9) To cite the general criteria for evaluating instructional software, and to cite examples of
criteria as they relate to various software.

Computer literacy programs at teacher training institutions often ignore critical competencies.
The main emphasis of a teacher's training in microcomputer literacy ought to be on
instructional purposes. Very few teachers need to be trained to program their own instruction
as now there is such a wide variety of software available. Evaluating existing software for
instructional use is of major importance to the computer literacy curriculum.

Certification

Another issue of debate is the licensing of teachers in computer education. Many educators are
saying that all teachers should have a basic knowledge of computer use and that this should be
a requirement in teacher certification (Cameron F. and Craighead, 1984, p. 23).
A funded project of the Northwest Council for Computer Education has examined, as one of its
goals, the question of certification. Moore, in her article "Preparing Teachers to Teach
about Computers and Computing", suggested that "computer science must be recognized as a
discipline such as mathematics, biology, chemistry and physics. As a result of this recognition,
certification requirements of teachers in computer education must parallel those of other
disciplines." Moore's project developed and recommends four certification categories:

1. General: Certification for all teachers.

2. Specific/Elementary: Teachers capable of teaching about computers and computing at the
elementary school level.

3. Specific/Secondary: Teachers capable of teaching about computers and computing at the
secondary school level.

4. Specific/School Computer Curriculum Coordinator: Coordinators capable of planning,
developing, organizing, and directing computer programs for the schools.

As of 1984, standards for certification had not been set by states in the USA. Certification
appears to be hampered by the limited ability of colleges and universities to provide programs to
meet certification requirements. Meanwhile, some institutions require courses in computer
education within the teacher education program for general certification of teachers (Moore,
1984, p. 20).

Applications of Computer Technology

The number and type of applications of the computer for educational purposes increases day by
day. Probably, the most traditional and widely used application is that of Computer Assisted
Instruction (CAI). The most common approach using this method involves programs that
reproduce lessons of practice and drill. This type of instruction is beneficial in determining the
student's ability to learn new material or master previously learned material in a patient,
tolerant manner at the student's own rate of learning (Ruff, 1985, pp. 197-198).

Vargas (1986) writing in the Phi Delta Kappan, says that computers have the flexibility to teach
effectively, but only if the CAI programs adapt those features shown to be necessary for
learning. As it is now, many programs are filled with serious instructional flaws (p. 738).

Even though widely used, CAI programs are rarely assessed in terms of instructional effective-
ness or how much students learn using them. There is a dearth of research on this topic.

Education simulation is another popular use of the computer. Simulations can be a very effective
teaching tool. They encourage active responding and provide continual, immediate feedback.
Often the student is thrown into a situation that requires learning by trial and error. Because of
this, and because most simulations do not provide step-by-step learning, some students who
experience initial failure do not wish to continue with the program.

Subject matter content can be presented by tutorials. Most tutorial programs provide several
pages of text before asking anything of the student. They often are essentially traditional
workbooks displayed on a screen. After a student is presented with several screens of text in
the form of a mini-lecture, a quiz is usually given. The student's response to tutorial programs
is limited in terms of feedback.

Criticism has been leveled at CAI programs in terms of cost-effectiveness. Much of the
practice and drill of basic skills could be done more cheaply, or just as well, with paper and
pencil. A warning has been given by Levin and Meister in their article "Is CAI Cost-Effective?"
These authors say that educators should not blindly assume that CAI is best in terms of cost-
effective education. They cite peer tutoring in particular as not being cost-effective (Levin and
Meister, 1986, p. 745).

The authors also say that there is little or no evidence to support the notion of CAI being more
cost effective than other types of instruction. Educators should use caution when considering
CAI computer usage. The microcomputer may be an expensive electronic textbook if not
used properly.

A very viable application of computer use is networking and the use of data bases. The "elec-
tronic classroom" consisting of teachers and students communicating with each other, with the
library, and with data bases via computer networks can improve and enrich teaching. This
concept can be expanded to include students from other courses and even from other coun-
tries by satellite technology. Bugliarello (1984) indicates that global networks offer us an
expansion of our biological intelligence to that of hyper-intelligence as a result of computer
networking. Networking also lends itself to computer conferencing. This is a system linking
people together using personal computers, communication technology and conference software
connected through telephone lines to communications satellites. This concept has wide
possibilities for international education and allows participants to take and to add messages at
their own convenience.

Computer networks allow teachers and students to exchange ideas, techniques and a variety of
information for instructional purposes. The applications are endless for bringing people to-
gether electronically by means of computers and communication links (Grayson, 1984, p. 15).

Computer Aided Drafting (CAD) applications have numerous uses in teacher education for the
teaching of "graphics". The CAD system can be used as a tool for drafting and design. In
physical education, sports medicine and exercise physiology utilize a type of CAD for the
diagnosis of human performance. Computer programs can be used by teachers and students to
analyze the motion of every limb and muscle, giving information for improving or altering
movements to obtain optimal performance. The movements are digitally copied into a computer
and reproduced for analysis and study (Teich and Weintraub, 1985, p. 40).

Computer graphics can also be used in the teaching of art. Artistic compositions can be
produced using various types of CAD computer programs. This application could be appro-
priate for commercial as well as fine arts instruction.

In the training of business teachers, proficiency in the use of computer word processors is a
must. All business students must have competencies in computer keyboarding, accounting
procedures and the use of financial spreadsheets as part of their subject matter content.

Music teachers can compose a sonata using specialized computer programs. Music teacher
educators have great opportunities to use computer programs as a teaching tool.

Vocational teachers could be teaching their students job costing, project planning, and manage-
ment techniques using sophisticated computer based programs for the building trades. This
can be coupled with computer graphics for designing machine parts or industrial processes,
etc.

English teachers have a new tool in using the computer for writing. Computers make it much
easier for students to write and to edit what they have written. Teachers find it easier to
read and comment on the students' writing, which is essential in teaching and learning how
to write well.

Teachers of social science can use microcomputers to enable students to manipulate large
volumes of data and increase their capacity to analyze them. Social science teachers need to
be aware of the various data bases and how to access them by telecommunication.

Summary

The use and application of computer technology to teacher training programs is vast! Research
is limited on the use of this technology for teacher education. But fortunately, as pointed out by
De Vault and Harvey in their article "Teacher Education and Curriculum Development in
Computer Education," teachers are becoming increasingly sophisticated in their understanding
and aspirations for computer use in the curriculum and for the computer knowledge of their
students. More attention is now given to pre-service than in-service teacher education in
computer technology. Many teachers have taken it upon themselves to become computer
literate.

Despite all of the excitement generated, very few North Americans have computers, but their
numbers are increasing at a staggering rate. Naisbitt (1982) says the home computer explo-
sion is upon us, soon to be followed by a software implosion to fuel it. It is projected that by
the year 2000, a home computer system including printer, monitor, and so forth should cost only
about twice as much as the present telephone, radio, recorder and television system. If Naisbitt's
projections are correct this could well mean that a large portion of the population will have
access to home computers. This has enormous implications for children in the home, the
teachers of the children and particularly the teachers of these teachers at institutions of higher
learning.

We need models of computer education for teachers which have been proven effective. The
future is bright and endless opportunities prevail in computer teacher education.

REFERENCES

Agee, Roy, "Are We Really Training Computer Teachers?" Technological Horizons in Educa-
tion 1985, March, pp. 97-99.

Bugliarello, George, "Hyperintelligence: The Next Evolutionary Step", The Futurist, 1984,
December, pp. 6-11.

Cameron, Allan B. and Donna Craighead, "Teacher Training: Preparing for the Fifth Basic",
Association of Educational Data Systems Monitor, 1984, November/December pp. 23-24.

DeVault, M. Vere and John G. Harvey, "Teacher Education and Curriculum Development in
Computer Education," Technological Horizons in Education, 1985, March, pp. 83-86.

Grayson, C. Jackson, "Networking By Computer" The Futurist, 1984, June, pp. 14-17.

Hoth, Evelyn K., "Debunking Myths about Computer Literacy for Teachers", Educational
Technology, 1985, January pp. 37-39.

Levin, Henry M. and Gail R. Meister, "Is CAI Cost-Effective?", Phi Delta Kappan, 1986,
June, pp. 745-749.

Moore, M.L., "Preparing Teachers to Teach About Computers and Computing", Proceedings of
the NECC, 1984, pp. 286-290.

Moore Margaret L., "Preparing Teachers to Teach About Computers and Computing", Asso-
ciation of Educational Data Systems Monitor, November/December, 1984, pp. 19-22.

Naisbitt, John, Megatrends: New York, N.Y., Warner Books, Inc., 1982.

Ruff, Thomas P., "High Technology and Education", The Clearing House, 1985, January, pp.
197-198.

Sandoval, Hugo F., "Teacher Training in Computer Skills: A Call for Redefinition", Educa-
tional Technology, 1984, October, pp. 29-31.

Teich, Mark and Pamela Weintraub, "Ultra Sports", Omni, 1985, August, pp. 40-42.

Toffler, Alvin, "The Data Deluge - Artificial Intelligence", Omni, 1983, October, p. 42.

Vargas, Julie S., "Instructional Design Flaws in Computer Assisted Instruction", Phi Delta
Kappan, 1986, June, pp. 738-744.

Watt, Daniel, "Computer Literacy: What Should Schools Be Doing About It?", Classroom
Computer News, 1980, January, pp. 26-27.

THE INFORMATION AGE:
AN OPPORTUNITY TO RESTRUCTURE CURRICULUM AND INSTRUCTION
THROUGH TECHNOLOGY

By ELIZABETH S. MANERA

"THE PAST is prologue" - so reads the inscription over the U.S. Archives Building in Washington,
D.C. It is important to be reminded of the past so we can honor its accomplishments. However,
we must not hold to the past only to maintain the status quo. We must be cognizant of
the present and look expectantly to the future. This is especially true as educators try to
grapple with the issues that face them today. We have moved from a time when a person
could learn all there was to know about any given subject to a period where that's impossible
since knowledge doubles approximately every five years.

Even so, the purpose of school has generally remained the same. Teachers' primary task is to
help students learn how to think and how to use the basic skills - reading, math and writing - so
they can solve life's problems in the future. Until now, these three basic skills have been
adequate. However, with the knowledge explosion, we must take a look at our curriculum and
make some changes.

The past is prologue

Less than a hundred years ago, a student pursuing a graduate degree had to learn everything that
was known about his field. When tested, he had to be able to give the one right answer to that
question. That would be impossible today. Not only is there much more to know, but, fre-
quently, there are several appropriate answers.

Just as the amount of information has increased, the speed for recording such information has
also increased. It took the scribe months to laboriously write or copy a scroll or manuscript,
but today, the printing press mass produces newspapers on a daily basis that are larger than
most of the handwritten manuscripts. Prior to the printing press, only the church and the
wealthy possessed the few written documents. Therefore, only a few people had reason to
read or write. However, as Gutenberg's hand press and later the mechanical printing press
were developed, books became mass produced. Consequently, the need to read grew with
the expanding volume and reduced cost of the printed word.

Today few secretaries take shorthand because most executives now use a tape recorder dictat-
ing machine from which the secretary can type directly. This allows both people to work at
their own speed and at the same time in two different places. In the very recent past, further
reduction in time spent on written communication occurred as secretaries used word proces-
sor computer programs which allow editing, spelling check and major revisions to be made
without having to retype the entire paper. Such technological development has greatly en-
hanced the secretary's productivity.

Just as the printing press assisted the scribe, the computer has assisted the mathematician. One
hundred fifty years ago, Charles Babbage invented a mechanical calculating machine which
was the forerunner of the present handheld calculator (Wall, February 1988). It was a very
clumsy apparatus with little capability, but led to today's small hand-held calculators like
Hewlett Packard's Model 41. This hand-held calculator is more powerful than early comput-
ers which took up whole rooms. This type of advancement is occurring so rapidly it is mind
boggling. For instance, Intel's new microchip, the 80386, will allow the construction of a
microcomputer which is more powerful than today's minicomputers. It is projected that such a
microcomputer will be in the testing phase and may be in production in the latter part of 1986.
This illustrates that from the initial microcomputer developed in the early seventies to the
present time, a period of less than 15 years, we have advanced five generations in computer
technology. Although this trend may slow down, it will continue in an upward direction.

Each day we are given more and more information to add to our enormous knowledge base.
The voluminous amount of information is so great that an individual can only be proficient in a
very narrow area. An individual can broaden his knowledge base by sorting and retrieving
information both inside his area of expertise and in related
areas which he might not be able to find without computer technology. For example, in the field
of education, a person may not be able to retrieve an article on teaching techniques which was
written in a business or industrial training publication if the search had to be done manually. Searches
with the computer usually do pick up all cross references as well.
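The cross-referencing credited to computer searches here is, at bottom, a keyword match that ignores which field a record came from. A minimal sketch (in modern Python; the records and field names are invented for illustration):

```python
# Hypothetical bibliographic records; titles and subject entries are invented.
records = [
    {"title": "Teaching Techniques for Industrial Trainers",
     "subjects": ["training", "teaching techniques", "industry"]},
    {"title": "Classroom Management Basics",
     "subjects": ["education", "classroom management"]},
    {"title": "Computer-Assisted Instruction in Business Courses",
     "subjects": ["education", "CAI", "business"]},
]

def search(records, term):
    """Return every record whose title or subject entries mention the term,
    regardless of the field the item was published in -- the cross-reference
    pickup a manual search would likely miss."""
    term = term.lower()
    return [r for r in records
            if term in r["title"].lower()
            or any(term in s.lower() for s in r["subjects"])]
```

An educator searching for "teaching techniques" thus retrieves the industrial-training item alongside any education titles, which is the point of the paragraph above.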

Use of the computer outside the halls of academia is even more widespread. What this implies is
that almost all students graduating from any level of education will be required to use a computer
in his work area. This means he must be computer literate. In the context of this paper,
computer literate means a person can utilize the computer hardware and standardized soft-
ware with ease.

Computer eras

Carl Hammer (1985), a computer research consultant, classified the nature and use of comput-
ers into three eras. First was the Data Processing Era, which started in 1950 and continued
until 1980. Though computers are used to routinely process data for us today, such as payroll
or inventory control, Hammer felt that in 1980 we had begun to move into the second era,
which he called the Procedural Era. During this era, Hammer views the computer as a tool to
check for inconsistencies, remove contradictions, and bridge gaps in procedures primarily
developed for large organizations. Data flow machines and signal processors are meeting the
highly specialized needs of the scientific and engineering communities. In addition, the so-
called "expert" computer systems are making low level decisions like medical diagnostic
programs. This Procedural Era is expected to continue well into the 21st Century.

The third era visualized by Hammer is the Cognitive Era in which the computer and software will
be sufficiently advanced to make high-level decisions. We will move into the Cognitive Era
when computer software has been created which can make higher level decisions. In the next
century there will be computers which can learn from other computers and from interaction
with human users. This will require, in addition to heuristics, a new construction model to be
used as the base for constructing new models.

Issues in computer usage

While Hammer is contemplating the electronic advances to be accomplished in the 21st Century,
schools are dealing with the computers we have today. Educators need to address a number
of issues for the remainder of this century - quality of educational software, hardware
availability, teacher usage, and student use.

Educational Software

First, let us look at educational software. Originally, software was developed by people with
programming skills but no educational content background. As educators have become more
involved in computer usage, they have created more effective software by incorporating
psychological principles of learning. The market not only has better written software, but now
is getting better matches between curricular objectives and the programs- (Mace, April 14,
1986). Many textbook companies are developing software which can be used with their
textbooks. As programs are developed, more public domain "shareware" is available. The
present cost of good educational software is prohibitive - but somehow the producers need to
be rewarded and piracy curbed. Yet, schools must be able to afford the "good" programs. As
more color and animation is included in the interactive software, the taint of entertainment or

Innotech Journal 33
"learning can't be fun" syndrome will be overcome. Then the software will be accomplishing its
objective and there will truly be quality software available.

Hardware Availability

A second issue is hardware availability. Should there be enough computers so every student has
his own, or at least has one in those classes where they are used extensively? If there are only
20 or 30 computers per school, should the computers be housed in a lab or parceled out one
or two to a classroom? Or should the computers be moved into the classroom for a scheduled
period of time when needed by the students, just like movie projectors? If the computers are
in a lab, should they be networked for less costly software usage? Will teachers use comput-
ers when they only have one computer in the classroom?

Once the computers have been placed in classrooms, the problem of breakage rears its ugly head. Frequently, when the keys stick, the disk drive breaks, or wires are cut, no immediate provision is made for repair. Not until computers become as teacher-proof or student-proof as other teaching tools will they be continuously utilized in the classroom.

Teacher Usage

The third area of concern has to do with teacher usage. Though Maltoon (1982) was referring
to instructional television, his statement is applicable to educational computers and sums up
many teachers' intransigent feelings. He said:

It has been found that teachers reject or at least resist change because of failure to recognize the
need for improvement, fear of experimentation, unwillingness to give time and disillusion or
frustration with past experiences. In addition, teachers traditionally tend to be conservative
and usually will not be impressed by the results of investigations and research or new theories
of education.

Many teachers subscribe to the theory that teaching is an art, that rapport with students, timing,
creativity, imagination, improvisation and rhythm are subtle, imprecise and intangible factors
which pervade teaching. These "sacred cows" developed a "folklore" of teaching which
includes the occupational wisdom, the norms, and daily teaching practices of experienced
teachers who can see no real reason for changing what has "worked"; thus, a deep-seated
conservatism reaffirms, rather than challenges, the role of the schools. However, when teach-
ers see that change will help solve some instructional problems, they will incorporate whatever
new technology presents itself for their use. These tools must pass the tests of efficiency: Is it
versatile? Effective? Reliable? Simple? Durable? What will the cost be in personal energy and
finances versus the value the students will receive through its use? Once they decide that it is a good change, teachers have to learn to operate the computers and to create or search for
software which will match their instructional objectives. These teachers also believe that
computers motivate their students to learn content that is relevant and meaningful and that a
changing instructional strategy helps everyone's attitude toward learning and the classroom activities.

Student Usage

The fourth issue relates to how students use classroom computers. At present, students use computers in four major ways. First, computers are used for computer-assisted instruction (CAI), which allows the students to be tutored by preprogrammed software. Drill and practice, simulation, problem solving, and tutorial programs have been most helpful with special education students, gifted students who need enrichment, and youngsters learning basic skills. The Center for Research into Practice has said that research studies have indicated that computer drill and practice is not the most cost-effective way of providing children with drill and practice opportunities. However, the author has found that if the computers are available for multiple uses beyond drill and practice, then the drill and practice portion is still a useful tool, especially for the slow learner.
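
The flavor of such drill-and-practice software can be suggested with a short sketch (written here in modern Python rather than the BASIC or Logo of the period; the questions, scoring rule, and function name are invented for illustration):

```python
def check_answers(problems, answers):
    """Score one drill-and-practice session: compare a pupil's answers
    against (question, correct_answer) pairs and collect the items
    that should be re-presented for more practice."""
    missed = [q for (q, correct), given in zip(problems, answers)
              if given != correct]
    score = len(problems) - len(missed)
    return score, missed

# A tiny arithmetic drill of the kind used for basic-skills practice.
drill = [("7 x 8", 56), ("9 + 6", 15), ("12 - 5", 7)]
score, missed = check_answers(drill, [56, 14, 7])
# A real program would now loop, re-presenting the missed item "9 + 6".
```

A tutorial or simulation program adds branching and feedback on top of this same question-and-check loop.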

Secondly, students learn to program computers so they can control the computer and make it do what they want it to do. Programming may be taught at all ages with an appropriate
level of computer language, from Logo for elementary students to Pascal or Fortran for
advanced students.

The third usage is through computer-managed instruction, utilizing programmed teacher utilities, databases, spreadsheets, or word processing. All of these are designed to make teaching procedures more efficient and less time consuming, with fewer errors in student record keeping.
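
A few lines of modern Python (the pupils' names and marks are invented) can stand in for the record-keeping side of such teacher utilities:

```python
# A minimal computer-managed-instruction record: each pupil's marks are
# entered once, and the averages are computed without arithmetic slips.
records = {
    "Ana":   [78, 85, 90],
    "Ben":   [62, 70, 68],
    "Carla": [95, 88, 91],
}

def class_report(records):
    """Return each pupil's average mark, rounded to one decimal place."""
    return {name: round(sum(marks) / len(marks), 1)
            for name, marks in records.items()}

report = class_report(records)
```

The same records, held in a database or spreadsheet, also feed attendance lists and report cards without retyping.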

The fourth area includes problem solving, decision making and simulations which sometimes are
included in CAI, but which I prefer to treat separately. All these techniques require greater
skill, use higher level thinking skills, and generally challenge students to a greater degree.
However, the number of such programs has been limited, though software producers are
trying hard to keep up with the demand.

Computers in society

Electronic systems have invaded all walks of life. Computers are being used in the checkout line at the grocery store, where a mechanical voice tells you how much has been charged for each item; in burglar alarm systems in homes and offices; in synchronizing traffic lights in the city; and in making airline reservations. The greatest impact has probably been in the office, where clerical and other office personnel have to deal with information. It is estimated that by 1990 more than 90 percent of the entire United States' work force will effectively be knowledge workers who spend most of their working hours using some type of computer to create, store, retrieve and use data and information.

Therefore, if our white collar workers are going to be totally involved with computers, the effect
on blue collar workers will be to raise them into white collar jobs. Today, the auto mechanic
uses the computer to check the car's timing, spark plugs, and diagnose a number of other automotive ills. The tool and die maker no longer uses the lathe or milling machine but operates a computer which can cut metal to extraordinarily close tolerances with far greater consistency than even an expert machinist can. On factory assembly lines, robots assemble highly complicated articles more accurately, faster, and with fewer rejects than when assembled by human labor. In each of these examples, there must be a human present to
supervise the work of the computer, to check the activity of the assembly line, to reprogram
the computer, or to make repairs. This more technical responsibility for the blue collar worker
means he no longer has to do the tedious, repetitive manual labor previously demanded.

In a technically oriented society, if present day blue collar workers do not become computer
literate, they will be without jobs and live much as our functional illiterates do today. They will,
in fact, be the functional illiterates of the 21st Century. It is essential, then, for society, especially teachers, to begin to get ready for these changes.

The machine does not think for us; it is just a machine. But it will do what it is programmed to do. A computer will sort data and do mathematical problems that man could not do, not because he is not capable, but because of the time involved. As an example, sending a man to the
moon would never have occurred without the computer's mathematical ability. Another
example is the data base of Internal Revenue Service (IRS). The IRS computers have the
ability to sort through the entire population to check income, select specific items for study,
correlate individual returns with employer's returns, and then provide the data for an agent for
audit.

As business finds more help from computers, we find that more elaborate programs are created.
The Borland Optical laser reader has been developed to do a spelling check with a built-in dictionary, and it has a thesaurus which will sort through and select synonyms to strengthen a
document. A water processing plant uses a computer to check the quality of the water as it
enters and leaves, the flow rate and the efficiency of the membranes in separating salts from
the water. The model home uses the computer to control household functions. It will monitor
and adjust the environment: turning on the furnace or air conditioner for the heating and
cooling; opening and closing shades, drapes or shutters; setting the burglar alarm on at midnight and off from 6:00 a.m. to 8:30 a.m. when the family leaves, then on until 4:30 p.m. when the children arrive home from school. The sprinklers are set to water the grass either at a
prearranged time or according to the sensors in the ground which indicate to the computer that
the ground is too dry. The computer will turn on the oven or stove to cook dinner at a set time
as well as turn on the stereo or radio and the lights so all will be ready for the arrival of the
household members.
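
The alarm schedule just described is, at bottom, a simple time-based rule; a sketch (the function name and time encoding are my own) shows the logic a home computer would apply:

```python
def alarm_is_armed(hour, minute=0):
    """Follow the schedule in the text: armed from midnight, off from
    6:00 a.m. to 8:30 a.m. while the family is home, armed again until
    4:30 p.m. when the children return, then off for the evening."""
    t = hour * 60 + minute          # minutes since midnight
    if t < 6 * 60:                  # 12:00 a.m. - 6:00 a.m.
        return True
    if t < 8 * 60 + 30:             # 6:00 a.m. - 8:30 a.m.
        return False
    if t < 16 * 60 + 30:            # 8:30 a.m. - 4:30 p.m.
        return True
    return False                    # 4:30 p.m. - midnight
```

Sensor-driven tasks such as the sprinklers replace the clock test with a reading from the ground-moisture sensor.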

However, none of this can be done by the computer unless man sets it to do his bidding. Just as we have seen examples of computer usage in all walks of life, we will see teachers use computers in a variety of ways. But first we must look at computer literacy as a curricular concern. When computer mania took over in the early 80's, we found that sound educational practice and policy were preempted by political decisions based on media, vendor or legislative pressure. When deciding on curricular change, the primary educational goals need to be carefully identified and studied. I believe that goal is to help students function productively in our technological society when they leave school. Therefore, we need to consider what computer-related experiences
are necessary for people to be successful in the "real" world. Ronald Banwart (1986) has
identified such a list.

1. interaction - the ability to figure out how to use various information producing devices;

2. software use - the ability to adapt to the nuances of various programs as well as the ability to learn how to use a particular program;

3. a general comprehension of what a "computer" is;

4. ability to use the computer to access databases - and a comprehension of what a network is;

5. an awareness of the limitations of computers and how to be alert for their misuse;

6. ability to create programs - even if only within the context of an "authoring system" or a database query language;

7. ability to use word processing, database and spreadsheet programs;

8. awareness of ethical use/misuse of computers.

Many people feel there is no need to teach computer literacy as a separate curricular offering
with separate objectives for each grade, even though it has been the most widespread use of
computer instruction time in the elementary school (Manera, 1984). Others feel that if computer literacy is taught as a separate course, it should be designed as a short-term program to be phased out as rapidly as possible at upper grades and taught to primary students only until they are literate (Banwart, 1986).

The concept of a basic skill is that the skill is required for a person to grow within the confines of
his society or culture. Reading, writing and basic mathematics became basic skills with the
advent of the printing press and the beginning of the business "paper" of the 16th Century. The
advent of the information age is now upon us. To compete in this information age, individuals must now become proficient in information processing skills, i.e., become computer literate, so as to access, locate and sort available information.

If the primary purpose of education is to help students to learn to effectively use their basic
skills, so they can think as informed citizens capable of solving unforeseen problems of tomor-
row, then we must provide them the opportunity to develop information processing skills. This
means that the computer is an indispensable tool which provides equal access to information,
utilizing this information to help solve tomorrow's problems. At this point, computer literacy
becomes a "basic skill" needed by all citizens who expect to be contributing members of society.

Computer literacy must be added to the curriculum as the fourth basic skill if we expect the students to take their place in society.

TRAINING TEACHER TRAINERS HOW TO MAKE USE OF THE COMPUTER

By DENNIS HARPER and TAN FONG KHOW

1. Background at the Institute of Education

The Institute of Education (IE) is the only teacher training institution in Singapore. It is
responsible for the preservice training of all secondary and primary teachers and administrators.
In addition, IE has a Masters in Education program as well as extensive inservice courses.
Research on educational issues is also an integral activity of the Institute.

Singapore is very committed to improving human resources. The Ministry of Education sees the
infusion of the computer into education as a major priority if Singapore is to have a competi-
tive edge with the rapidly developing industrial nations such as Hong Kong, Taiwan and
Korea.

However, before the students in the schools could be exposed to computer usage on a vast scale, the teacher trainers must be "computer literate." IE staff consists of more than 160 lecturers and supervisors. The majority of the staff have never used a computer; only about twenty of them use a computer regularly, primarily for word processing. Because IE was expected to play a major role in bringing computers into the schools, it was obvious that an extensive staff development program was needed to fulfill this role.

In order to carry out this training, it would be necessary for staff to have extensive hands-on computer experiences. As the staff was too large to attend a single week-long workshop, it was decided that an in-depth workshop be given to one representative from each of IE's 13 departments and all members of the Pedagogical Studies (PS) department, which is responsible for all computer education at IE. Twenty-four staff members attended the workshop. These
departmental representatives would then make arrangements with the PS department to
conduct similar workshops for the remainder of their staff. PS staff arranged with representa-
tives of Apple Computer to have 12 Macintosh computers made available to users. The
Macintosh was chosen because it is easy to use and the software is generally user friendly.

2. The objectives of the workshop were:

• To create a core of experts among the teacher trainers at IE so they could multiply their skills.

• To increase the productivity of the participants. This includes lesson planning, overhead
transparency production, record keeping, writing of research papers and articles, etc.

• To enable teacher trainers to discuss with their student teachers ways in which the computer
can be integrated into the curriculum.

• To motivate participants to use the computer, and

• To promote cooperation and good feeling among the staff regarding the use of computers.

The workshop was not aimed at teaching the staff programming or computer-assisted instruction; the software used was mainly application programs.

3. Preparation for the Workshop

A letter of invitation was sent to each departmental head asking him or her to nominate one member of the staff who was willing not only to commit time during the week-long workshop but also to conduct workshops for other members of the department.

A course outline, timetable, lesson plans, overheads, and other materials were generated for the workshop. These materials were made available to all participants three weeks prior to the workshop so that they could prepare for it. The workshop emphasized having the lecturers work on real tasks that they had to do anyway, and assignments were given with this in mind. The software was chosen for its potential usefulness to the lecturers and its value toward meeting the course objectives.

The timetable was structured to have lessons and general hands-on activities during the morning
and time to do the assignments during the afternoon. Workshop leaders were available at all
times. The schedule is found in Figure 1.

(Pls. see Figure 1)

The amount of materials presented was vast but it was felt that this extensive overview of the
major types of application software would be most profitable. Accomplishing this much in one
week was possible because of extensive preparation, the enthusiasm of the participants and
the use of the Macintosh computer. The Macintosh uses the same basic commands for all its
software, so when the word processor was learned, the other applications became easier. All
Macintosh software used could also be integrated with each other. The Macintosh computers
were provided by the local Apple distributor at no cost. They also made the machines avail-
able on a loan basis until the time IE could set up its own Macintosh laboratory.

In addition to the computers and software, printers, paper, blank diskettes, ribbons, and other
supplies were also made available to participants.

4. Delivery of Workshop

The first two-hour session saw the group of 24 teacher trainers split into two groups. One group of 12 attended an on-line tutorial teaching the basic Macintosh features, which needed to be completed by individual participants. Nearly every computer make comes with such a tutorial program. Since none of the participants had used the Macintosh before, this
hands-on tutorial was essential. The remaining 12 participants met in a separate lecture hall
and discussed the objectives and outline of the workshop. This session included discussions
concerning the peculiarities of a computer workshop compared to other non-computer workshops they may have attended. These peculiarities include feelings of frustration, different
ways of thinking, changing ways of doing things, waiting for help, exploitation of the trial and
error method, eye and finger strain, etc. After one hour the groups switched places.

The morning's second two-hour session found the participants coming together and working in
pairs doing word processing. Features of word processing were demonstrated and practiced.
The afternoon was spent learning to use the MacPaint program which enables the user to
insert pictures and diagrams into articles or overhead transparencies. The participants then
worked on their assignment which was to word process a document of their choice.

A laser printer was made available for this workshop. This was a strong motivating factor for the
participants, as the quality of their work was like none they had ever seen or dreamed of before. Much effort was taken to ensure that the participants saw how easy computers could make their work, how much time they could save, and how stunning the results were in comparison to non-computer methods.

On the second day, participants learned how to use a thought processor effectively. The class
created a scholarly article using the Think Tank software. They were also involved in using a
software package to create charts and diagrams. As an assignment, participants had to
produce a chart or diagram on an overhead transparency.

Electronic publishing was the topic of day three. The participants learned how to insert text and
diagrams into a pleasing layout. Each participant then created a newsletter for his or her
department using this software.

On the fourth day participants learned how to use a powerful spreadsheet and graphics package. For the day's assignment each workshop member produced a computerized class record sheet.

The final day of this workshop was spent on getting acquainted with a data base program. In
addition, five other software packages were demonstrated. Demonstrations were available at
five workstations and the participants visited each station on rotation. The five applications
dealt with were (1) test bank and examination generation, (2) music composer, (3) digitizing
photographs, (4) color printing, and (5) visual-based data base system.

In addition to learning and using the software, discussions took place relating the application
software to the education process. Problems that arose during the afternoon hands-on ses-
sions were discussed in detail the following morning. The final day was highlighted by a
luncheon for all participants. A good feeling of camaraderie was very much in evidence at this
occasion.

5. Workshop Evaluation

A post-workshop evaluation questionnaire was completed by each participant. The questionnaire queried the participants about the workshop's content, presentation, organization, overall rating, and comments/suggestions.

The first question asked the participants to rate the usefulness of each software package. A scale of 1 (very useful) to 5 (not useful at all) was used. Figure 2 shows that all the software packages were found to be useful, but the word processor was clearly the favorite.

(Pls. see Figure 2)
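
The tabulation behind a figure like this is straightforward; with invented responses (the actual workshop data are in Figure 2), the mean rating on the 1-to-5 scale for each package can be computed as:

```python
# Hypothetical ratings (1 = very useful ... 5 = not useful at all).
ratings = {
    "MacWrite":    [1, 1, 2, 1, 1],
    "Think Tank":  [2, 1, 2, 3, 2],
    "Spreadsheet": [2, 2, 3, 2, 2],
}

def mean_ratings(ratings):
    """Average each package's ratings; a lower mean is more useful."""
    return {name: sum(r) / len(r) for name, r in ratings.items()}

means = mean_ratings(ratings)
favorite = min(means, key=means.get)   # lowest mean rating wins
```

On these invented numbers the word processor comes out ahead, as it did in the actual workshop.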

The participants were then asked to rank what they felt were the three most useful pieces of
software. Figure 3 shows that MacWrite was ranked number one by 55% of the teacher
trainers. Figure 4 indicates the number of times each piece of software was mentioned in the
top three.

(Pls. see Figure 3 & 4)

Implications of Figures 2-5 are that application programs are seen as valuable and useful. Word
processing, as would be expected, was found to be the most useful but thought processors,
electronic publishers, spreadsheets and forms/diagrams generators were also popular.

The participants were asked to rate various aspects of the workshop on a 1 (excellent) to 5
(poor) basis. The results are shown in Figure 5.

(Pls. see Figure 5)

The majority of the participants felt that the hands-on sessions were the most useful. For further
improvement to the workshop, participants suggested that:

• Free diskettes be provided to store personal data.

• Machines be made available during the lunch break.

• Individual machines be made available to each participant instead of having one machine for two participants.

• Some initial and simple background reading be made available.

Another question asked what other types of software participants would like to learn to use. Responses included statistical packages, typing tutors, and Logo. All participants said they enjoyed the
workshop and felt all academic and nonacademic staff should learn how to use similar appli-
cation tools.

6. Follow up

Participants wanted to immediately begin working on similar workshops for their own depart-
ments but geared to their departmental needs. IE is now slowly giving these workshops. They
take place during lunch and in the late afternoon. The departments of Early Childhood Educa-
tion and Educational Administration are particularly keen.

Additional follow-up activities include:

• Members of the pedagogical studies department are also available at set hours for staff
consultation on the use of computers.

• A 36 station computer laboratory is available to faculty members during the school operating
hours.

• A Macintosh Users group has been set up and meets monthly to discuss developments with
this machine. An IBM users group will be organized soon.

Conclusion

Colleges of education throughout the world are finding it necessary to integrate computer
education with other teacher education curricula. In order for the teacher-trainers to gain the
required skills and enthusiasm to do this, the research has shown that a well-planned, intensive
(40 hour) staff development workshop can succeed if the software used is seen as useful and
easy to use by participants.

COMPUTER APPLICATIONS TO HIGHER EDUCATION INSTITUTIONS

By NILAKANTAN NAGARAJAN

COMPUTERS HAVE come to occupy an important niche in our society, and have wielded a wide and significant impact on our present as well as future lives in diverse ways. They have already made measurable inroads into the campuses, and have catalyzed a change in the attitudes towards computers of all constituents of college life, in their curricula and careers. It may be too late to think, or wishful to imagine, that computers will go away, or that business will continue as usual without being affected by the introduction of computers on campus. As much as some segments of society and collegiate governance may disparage the compelling claims that computers are useful in education, it is undeniable that our present generation of students will suffer in competing for their job prospects, and in graduating from their colleges and universities, without more than basic computer literacy.

Trends in 'computer literacy'

In my introductory lecture on EDP Systems and MIS Concepts for undergraduate and graduate students in Business Administration, I emphasize the essential fact that the topics discussed at
present for their understanding of electronic and information technology will become totally
irrelevant and redundant in the next, say, five years, due to the tremendous advances in the
electronic and allied technologies. The very term 'Computer Literacy' is itself not definable in
absolute terms, but should slowly and steadily evolve a relative meaning and significance,
depending on the development and state-of-the-art of computer technology at the particular
point in time. Just like the successive stages of human satisfaction in psychology, there will be trends in the meaning and expectations associated with 'Computer Literacy'.

The following four tables have been prepared to summarize such trends over the four decades
from 1965 to 1995. (Information derived from the presentation of Dr. Kenneth M. King at the
1985 conference of EDUCOM, discussing the development of the concept of computer
literacy.)

Table 1. Trends from 1965

• Formal languages as the most powerful tools.
• Knowledge of computer languages essential for advancement in career.
• Choice between control over or by the computer.
• Is Computer Science a science?
• Will machines ultimately become more intelligent than humans?
• Will PL/1 replace COBOL and FORTRAN?

Table 2. Trends from 1975

• A working knowledge of Application Packages is as important as that of a language.
• Computing will play a vital role in a student's future career.
• Will decentralized computing become the wave of the future?
• Will minis replace mainframes?
• Will CAI work?
• Is Computer Science artificial intelligence?
• Will FORTRAN ever be replaced?

Table 3. Trends from 1985

• Development of new concepts and tools (4GLs, databases, graphics, and networks) necessitates mastery of many tools!
• Should every student be required to own a micro?
• Is one pixel worth 1000 words?
• Is parallel computing the pattern of the future?
• Is there a computer revolution?

Table 4. Trends from 1995

• Everyone will have the ability and access to use a vast variety of computer tools.
• Computer Science basics will form part of the school curriculum.
• As a result, there will be a need to change undergraduate and graduate curricula.
• Will integrated computing work?
• Is the Electronic University a viable possibility for education delivery?
• Will COBOL or FORTRAN ever be replaced?
• Changing technology will make lifelong education a growth industry!

This brief review of the patterns in the technological advances and tools made available for data collection, processing and retrieval by end users reveals the urgent and essential need to revamp our thinking process and planning strategies to keep up and cope with the changing technology and its resultant demands.

Such a radical revision in our attitudes and expectations is rendered even more relevant in the realms of higher education, mainly due to two disparities currently prevalent: a) the widening gap between the rapid strides of technological advances and the resources available in higher education for research, training and instruction, and b) the integral differences in available resources and anticipated roles between the developed and developing nations in the electronic future.

I wish to dwell upon the impact of the computer (r)evolution on campus and its imperative implications, for serious consideration by all concerned. This is not with a view to prescribing any instant panacea for our present problems and the future challenges that lie ahead, but to present a proper perspective of the needs and plans of action from an overall Systems Approach, in order to utilize our resources to the best possible effect, effectiveness, and optimum benefit.

Computer Equipment

Computers are now as essential as chalkboards, test tubes, and scholarly periodicals. . . Our entire economy is an increasingly information-based economy, in which nearly half of America's work force is processing information of some kind. Colleges and universities simply must have computer equipment, instruction, and research if they are not to be like Renaissance universities still teaching theology, Latin and feudalism when the society is moving into astronomy, the classics, and international trade and exploration. (6-19)

During my recent stay in India working for a month conducting EDP seminars, I found a tremen-
dous upsurge of awareness among young students and job aspirants about computers and
computer applications. I have observed many of them spend their hard-earned savings to learn computer languages and systems in institutes and schools which advertised such course offerings. This was commendable, but there was a real and regrettable pity: the lack of hands-on experience with computers in such courses. Many complain that they learn only the EDP concepts and program coding without even seeing a computer, let alone a micro, a mini or a mainframe, or debugging their work or running the job to get practical experience. We cannot teach chemistry by doing experiments on the blackboard! Neither can we still be teaching only Dalton's atomic theory in this nuclear era.
So there is a real need to complement and supplement computer courses with proper and
adequate computer hardware and requisite peripheral devices to impart instruction on com-
puters.

Under the same topic, even the provision of such facilities may become lopsided between the haves and the have-nots among the institutions. This disparity is a fact of life even in schools and colleges in the US and other western nations. While some campuses and school districts can boast of rich endowments and property-tax bases, enabling them to provide all the facilities to their students, there remain thousands of other colleges and schools who have to scrounge around their meager resources to acquire even the essential equipment!
A parallel scenario can be developed to imagine the plight of poor students in our third world
countries.

Computer Access Facilities

Many people believe that by the turn of the century, cities will be wired with communication grids linking most homes, corporations, and educational institutions with libraries, databases, and computer centres. Through these networks, electronic text will provide people with research information on required topics, and telecommunication links will connect students with professors, academic researchers with their associates, and employees with their offices. Correspondence courses, computer-aided instruction, laboratory simulations, dissemination of important information, and instant Delphi-method discussions among experts by teleconferencing are also touted as further practical services to end-users.

Many college campuses in the US provide ample terminal facilities for access to 24-hour computer center operations in libraries, instructional departments, lecture halls, and even student dormitories. Basic courses are offered in many institutions through CAI methods, with faculty advisors available for further assistance and guidance to students, if needed. Doctoral students and faculty can obtain supporting information for their projects through retrieval from several databases available for the purpose. Administrators can utilize sophisticated financial management models such as the EDUCOM Financial Planning Model (EFPM) or the WICHE planning model called RRPM for resource planning.

Several educational software packages are available to help students better their performance in competitive tests and specific subjects, and to improve their skills in decision-making, portfolio planning, and other management strategies through simulation games. As can be seen, opportunities are aplenty, limited only by the unavailability of resources and a lack of commitment on the part of campus management.

Electronic University

With the development of telecommuting concepts and open access to myriad forms of relevant study materials, can the electronic university be far behind? With each student progressing at his or her own pace through prescribed courses, with no need to assemble in the amphitheater or lecture room at scheduled periods, we can imagine the relegation into oblivion of the traditional type of classroom instruction, based mainly and merely on textbook materials. Further, the availability and unique capability of computer-mediated class conferencing systems are viewed as practical alternatives to traditional educational delivery systems, with immense and imaginative possibilities for higher education.

On the positive side, we can look forward to campuses functioning without the familiar 'strike' by either students or faculty, better vistas for slow and timid learners, and the accommodation of large student groups through such telecommuting and teleconferencing concepts. Yet there may be serious and significant limitations, depending on the educational environment in which such systems are ultimately utilized. However much machines may be used in our modern environment to relieve human intervention, the need for interpersonal relations is inherent in the field of education. So, just as computer applications can and should be used to the best possible benefit in easing traditional systems, the human touch should also be present to support, but not replace, human effort and endeavor, in education as in other fields. Just as managers and decision-makers use computer-generated information to support their actions, education can, and should, benefit from these concepts.

College Management

The traditional disclaimer that colleges and universities, like other nonprofit organizations, cannot be managed effectively because of the intangible nature of their outputs is steadily being supplanted by a growing appreciation of the need to apply proven management methods to such organizations.

The Carnegie report "Three Thousand Futures: The Next Twenty Years for Higher Education" has proposed a checklist of imperatives for colleges and universities to meet their problems in the next two decades. The following recommendations can easily be recognized as the essential characteristics of the systems approach followed in the efficient and effective management of a corporation (1):

• Analyze all factors that are likely to affect future enrollments.

• Insist on institution-wide or system-wide planning.

• Anticipate future problems and avoid moving from crisis to crisis.

• Encourage flexibility and INNOVATION (capitals provided to underscore its importance).

• Devise imaginative ways of seeking new sources of funds and support.

• Encourage strong leadership by the chief executive.

• Strive for the most effective use of the resources.

It is also encouraging and noteworthy that higher education is increasingly being referred to as a 'knowledge industry' and an 'educational enterprise.' A recent study sponsored by the Association Council for Policy Analysis and Research, representing leading Washington-based higher education associations, has concluded that higher education should be treated as a major industry! (8)

In recent times of fiscal crisis, caused by a combination of factors like spiraling inflation, competition among different social causes and programs, and growing insistence on better accountability for resources, there is a promising trend toward nonprofit organizations utilizing the management concepts, tools, and control procedures of the profit sector of the economy.

The contemporary college administrator faces a complex and challenging world.... He must make difficult and unpopular decisions as individuals and groups compete for the same scarce resources. . . . Thus, the modern collegiate institution is operated by a large administrative staff and its key administrators, like those in business and industry. . . (3)

What being businesslike usually means in a service institution is little more than control of
cost.... Because there is no competition in the service field, there is no outward and imposed
cost control of service institutions as there is for business. . . (3)

It is a healthy sign that several service institutions, including colleges and universities, have become 'management conscious' in recent times, which means, according to Drucker, that they begin to realize "that they are at present not being managed."

As indicated earlier, several management information systems packages are now available for college finance planning and management, student records, online registration, automated library cataloging, and other uses. With the implementation of such computer applications in campus management, enormous benefits will accrue to the entire campus: both tangible savings and intangible benefits such as productivity improvement, morale boosting, better control of assets, and improved decision-making by top management.

Cooperation

Currently there are only two major manufacturers of supercomputers in the world. There is considerable interest and enthusiasm in the expanding possibilities and positive benefits that may be opened up for campuses, industry, and others. But beyond exploiting their superprocessing capabilities to find the largest known prime number in about two hours, and their help in meteorological forecasts, research shows very limited use of these giant computers in universities.

This contention was revealed in a survey sponsored by the Higher Education Panel of the American Council on Education (ACE). The respondents ranked the following types of assistance as paramount to facilitate increased use of such supercomputers in the future:

a. Access time on supercomputers.

b. Opportunities to gain knowledge about their technical capabilities.

c. Access through telecommunication links to remote supercomputer centres. (5)

This essential need for assistance in the better utilization of vast computer capabilities further emphasizes the crying need for cooperation among industry, government, and academic institutions. US experts agree that without supercomputers their competitive edge will fall behind that of other nations adopting more aggressive policies on the availability and use of supercomputers.

Such cooperation, coordination of efforts, and strategic approach are of vital importance if we are to progress in a unified manner toward our ultimate objectives. A similar meeting of minds and pooling of talents and resources can also help individual institutions, as discussed earlier, acquire more and better computer facilities. Over the long term, it will prove a worthy return on their investment in human resources development.

Faculty Development

It is very important that the quality of instruction and research in schools and colleges be commensurate with the skills and aspirations of the incoming student population over the next decade and beyond.

According to an old axiom, "If you cannot do anything else, TEACH; and if you cannot teach
anything else, teach LATIN." This may be modified now to mean "If you cannot teach anything
else, teach COMPUTERS; and if you don't know about computers, teach EXPERT SYS-
TEMS!"

By dint of tradition, the wheels of the ivory tower move very slowly in keeping up with the winds of change. By human frailty, faculty caught up in the computer invasion of the campus are reluctant to face the stark reality of their inadequacy to meet the demands of coming generations of computers and students. There should, therefore, be a wartime-emergency approach to tackling this mammoth problem of far-reaching consequences. In the States, among the several bodies interested in faculty development, ACE and the AACSB have already launched programs for the benefit of faculty. These focus on new and refresher training on advancing trends in information technology.

Implications and Lessons

You will recall that, after the nation was caught unawares by the Sputnik episode, President Kennedy exhorted his people with his clarion call to 'put a man on the moon.' India's Prime Minister, Rajiv Gandhi, has the vision to set the nation ready for the '21st century.' Similar aspirations and urgent aims have been expressed by other leaders.

But much effort will have to be sincerely put forth to buttress such calls before the avowed aims can be accomplished. Without a target, one cannot proceed in the proper direction or evaluate one's progress and performance. We in the developing nations do not have the time or luxury to dissipate our energies or resources in wasteful ways. We have the benefit of the experiences of the developed countries in their exploits to achieve eminence and excellence in electronics. We can avoid the pitfalls and problems they faced on their path to progress, rather than blindly and apishly imitating them in every aspect. In this context, I wish to quote from the message of Philippine President Mrs. Corazon Aquino, conveyed at the recent ASEAN foreign ministers conference:

After 19 years of existence, the ASEAN should be evaluating the impact of regional economic cooperation instead of endlessly discussing how to get it off the ground.... If the developed countries are not buying as much from the ASEAN as the region wants, it is not the malevolence ... but their own survival, as rich countries also had to deal with recession.... Charity begins at home. Let us take that message to heart. (4)

Once such an initial step has been accomplished, further vistas should be explored to coordinate efforts among the nations of South Asia, encompassing, in my dream, India and Japan. In this progressive manner, the ultimate aim will be universal cooperation among the world community, for the overall and unfettered benefit of all people, through the computer chip that is common to us all!

A national agenda and plan for introducing facilities for computer awareness and literacy among the different segments of the population should be formulated and followed faithfully, without being dissected or diverted by political factions and interest groups. Courses should be developed to meet the present and upcoming needs of technology, with adequate support in material and human resources financed by the industry and government sectors. Failure to recognize this essential element of national progress and stability will inevitably cause a worse kind of brain drain to the 'Promised Land' among talented yet frustrated youth.

Increased use of computers does not, and should not be mistaken to, mean outright rejection of other values in our educational and societal mores. Research studies in the US point to a decline in the level of mathematics knowledge and logical reasoning, and serious projects have been planned to overcome such deficiencies in the quality of education, with the end in view of reviving and maintaining quality instruction at all levels of a student's life.

Those of you who dabble with computers will know how important it is to have a good com-
mand of mathematics. Since computers are going to dominate life in the future, we must ensure
that students have a good grounding in this subject.

The swan, the heavenly bird of Hindu scriptures, has the ingrained power to sift and sip only the pure milk from any container of diluted mixture, leaving behind the worthless water. We in the Orient can be like this swan in deriving valuable lessons from the expertise and exploits of the western nations in regard to computers!

REFERENCES

1. Carnegie Council on Policy Studies in Higher Education. Three Thousand Futures: The Next Twenty Years for Higher Education. San Francisco, CA: Jossey-Bass, 1980.

2. Dallaire, Gene. "American Universities Need Greater Access to Supercomputers." Communications of the ACM, April 1984 (Vol. 27, No. 4).

3. Drucker, Peter F. Management: Tasks, Responsibilities, Practices. New York: Harper and Row, 1974.

4. Hindu. "Aquino Asks ASEAN to Raise Economic Ties." Madras, India, June 25, 1986.

5. Holmstrom, Engin Inel. Access to Supercomputers. Washington, D.C.: American Council on Education, January 1986.

6. Keller, George. Academic Strategy. Baltimore, MD: Johns Hopkins Univ. Press, 1983.

7. King, Dr. Kenneth. "Evolution of the Concept of Computer Literacy." EDUCOM Bulletin, Fall 1985.

8. Nagarajan, Dr. Nilakantan. "Businesslike Management of Higher Education." Select Readings on Productivity. Bombay, India: Times Research Foundation, 1983.

9. Sprunger, Benjamin E. and William H. Berquist. Handbook for College Administration. Washington, D.C.: Council for Advancement of Small Colleges, 1978.
