Robert Plant obtained his Ph.D. in Computer Science at the University of Liverpool in 1987. He is currently
an associate professor in the School of Business Administration at the University of Miami, and specializes
in teaching MIS Strategy both there and at other universities and companies. His research interests focus
on the role of information systems in strategic management.
Stephen Murrell obtained his D.Phil. in Computation in 1986 from Oxford University’s Programming
Research Group. He is currently a lecturer in Computer Engineering at the University of Miami, where he
specializes in teaching programming, algorithms, and operating systems. His primary area of research is
in programming languages.
An Executive’s Guide to
Information Technology:
Principles, Business Models, and Terminology
Robert Plant
and Stephen Murrell
University of Miami, Florida
CAMBRIDGE UNIVERSITY PRESS
Cambridge, New York, Melbourne, Madrid, Cape Town, Singapore, São Paulo
Cambridge University Press has no responsibility for the persistence or accuracy of urls
for external or third-party internet websites referred to in this publication, and does not
guarantee that any content on such websites is, or will remain, accurate or appropriate.
Contents
Introduction 1
Denial-of-service attack 99
DHCP (Dynamic Host Configuration Protocol) 101
Digital 102
Digital cash 103
Digital certificate 105
Digital Millennium Copyright Act (DMCA) 107
Digital signature 107
Digital wallet 110
Disclosure and interception of communications laws 110
Disk 111
Distributed database 115
Domain name 117
Dvorak/QWERTY keyboards 119
Dynamic web pages 120
e-Commerce/e-business 123
Efficiency 124
Electronic data interchange (EDI) 126
Email 128
Encryption 131
End-user development 133
Enterprise information portal (EIP) 134
Enterprise resource planning (ERP) systems 135
Entity-relationship diagram (ERD) 137
Ethernet 138
ETL (extracting, transforming, and loading of data) 139
European Union Directive on Privacy and Electronic Commerce 2002 141
Fiber optics 143
File server 144
File system 146
Firewall 148
Flash memory 150
Formal methods 151
Fortran 155
Fourth generation 156
FTP (file transfer protocol) 158
Functional programming 159
Fuzzy logic 161
Global positioning system 163
Groupware 164
Hacker 166
Hardware, software, firmware, application, program 167
Spyware 308
Storage 310
Structured design methodologies 313
T-Carrier 315
TCP/IP (Transmission Control Protocol/Internet Protocol) 315
Telnet 317
Thirty-two bit 319
Transaction-processing system (TPS) 321
Trojan horse 322
UML (Unified Modeling Language) 324
Unix 324
URL (Uniform resource locator) 326
Value added network (VAN) 329
Video 329
Virtual machine 331
Virtual memory 333
Virtual organization 335
Virtual private network (VPN) 336
Virtual reality 338
Virus 339
Visual Basic 342
Voice over IP (VoIP) 343
W3C (the World Wide Web Consortium) 346
Walk-through review 346
Waterfall model 347
Web services 348
Wireless application protocol (WAP) 349
Wireless network 351
World Wide Web 352
WYSIWYG 356
X.12 358
X.25 359
XML 359
Y2K problem 362
Zip 363
Index 365
Introduction
In writing this book, we have drawn upon our experiences as professors, consultants, and technologists to provide a resource for executives, students, and other readers who want to achieve a rapid understanding of a technology, a computer-related process methodology, or a technology-related law.

The book provides not only the definitions for over 200 terms, but also a concise overview of each term, the associated business value proposition, and a summary of the positive and negative aspects of the technology or processes underlying the term.

The book addresses a problem faced by many executives working in the twenty-first-century organization: that of understanding technology without needing to become a technologist. Today's executives use or are responsible for technology in nearly every aspect of their organization's operations; however, the pace of change in technology, the misinformation provided by informal sources, and the general opaqueness of the terminology can be off-putting for executives and managers. To help executives overcome these problems, we have drawn upon over twenty years of teaching at the executive level and our backgrounds as technologists to provide clear, understandable descriptions of the most important technology and terminology in use within today's organizations.

The executives' need to understand technology is undeniable, but their role dictates that they must understand technology at two levels. First, they must understand what the technology actually is, and this is addressed in our text by the provision of an overview section for each term. This section aims to cut through the "technobabble" and the "spin" and to provide a solid basis for understanding the technology, such that executives can appraise the role of a technology within the IT organization and the organization as a whole.

A second aspect of an executive's role is to understand the business value proposition behind the technologies, and this is addressed in the second section presented for every term. This enables executives to come up to speed quickly on the major issues and on the larger role a technology plays in their enterprise and beyond; again, references are provided to enable further understanding to be gained.

A third aspect of today's executives' function is to understand their obligations with respect to legislation and regulatory frameworks such as the Sarbanes-Oxley and HIPAA acts. The book addresses this by including a set of UK and US legislative requirements under which organizations and their executives in those jurisdictions must operate.

Finally, each item concludes with concise summaries of the positive and negative aspects generally associated with the technology. This allows executives to ascertain very quickly the key points associated with the technology, law, or process model.
Please note that the content of all law-related articles in this book represents the views of the authors, who are not legal experts; the articles should not be read as presenting definitive legal advice.
Advertising
when the user clicked upon it, it acted as a hyperlink taking them to the new site and presenting more information on the product, and, of course, an opportunity to buy it. The advertiser would then pay the original web site for the traffic that "clicked through," and possibly pay more should a successful transaction occur.

Banner ads proved in many cases to be less effective than desired, and so the industry started to create personalization software based upon individuals' online web-surfing activities, the goal being to present specific products and services based upon users' past preferences. These personalization systems drew criticism from civil liberties groups concerned that they could profile individuals, and that selling the information would be an infringement of the right to personal privacy.

As the internet evolved, more sophisticated advertising developed, including the infamous Pop-up ad, an intrusive message that suddenly appears upon the screen in its own window. Pop-ups are often embedded Java applets or JavaScript programs that are initiated from the web page as it is viewed; usually the pop-up window can be closed, but some of the more persistent pop-up advertising cascades, so that when one window is closed another opens, slowly taking over the machine's resources. It may be necessary to shut the machine down completely to escape this situation. Careful control of the web browser's security settings can usually prevent the problem from occurring in the first place.

A form of advertising that is also intrusive, and in some instances criminal, is Adware (the term is used to denote a certain type of intrusive advertising but is also the registered and trademarked name of a company that sells anti-spyware and anti-spam software). Adware takes the form of pop-up advertisements or banner ads and comes from several sources. One source is certain "freeware" or "shareware" products that vendors give away without charge but into which they have embedded advertising, so that it suddenly appears upon the screen and may or may not be removable, or could be subject to a time-delayed closure. As a condition of use, some adware programs go on to install programs on the user's machine, known as Adbots, and these programs then act as a type of Spyware, sending back to their advertising source information pertaining to the user's behavior. Often these additional installations are undisclosed.

Illicitly installed software may redirect web browsers to access all pages through a proxy, which can add banner advertisements to totally non-commercial pages that in reality bear no such content. Home-page hijacking is also a common problem: a web browser's default or home page may be reset to a commercial site, so that, every time the web browser is started up, it shows advertisements or installs additional adware.

Business value proposition
Well-placed internet advertising can be of significant benefit to the company or individual sending it out. In the 2000 US presidential election, the Republican Party's use of email to spur its registered voters in the State of Florida to go out and vote has been acknowledged as influential in George W. Bush's closely contested victory over Al Gore, and it changed the nature of political campaigning. The internet can also be used as a non-intrusive distribution model for advertising that is driven by user demand, an example of which was the advertising campaign by BMW, which used top film directors, writers, and famous actors to create short films in which the vehicles were as much the stars as the actors. The "BMW films" series proved to be hugely popular and reinforced the brand. Many early dot-com-era (1995-2000) companies attempted to finance their business
Agent
a major increase in personal productivity, but requires that the natural-language problem be mostly solved first. The problem is that software must be capable of fully understanding human prose, not just recognizing the words but determining the intended semantics, and that is still the subject of much current research.

It is generally envisaged that agents will not be static presences on the human controller's own computer, but should be able to migrate through the network. This would enable an agent to run on the computer that contains the data it is reading, and would therefore make much better use of network bandwidth. It does, however, introduce a serious security and cost problem: agents must have some guarantee of harmlessness before they would be allowed to run remotely, and the expense of providing the computational resources required would have to be borne by somebody.

The idea of an agent as a meaningful representative of a person, acting on their behalf in business or personal transactions, is still very much a matter for science fiction. The problem of artificial intelligence, providing software with the ability to interact with humans in a human-like manner, is far from being solved. It is believed by some that a large number of relatively simple agents, in a hierarchy involving higher-level control agents to coordinate and combine efforts, may be the best way to create an artificial intelligence. This theory remains unproven.

Business value proposition
The use of agents in business has some degree of controversy attached to it. Simplistic agents have been used for some time to carry out stock market transactions. If it is desired to sell a particular stock as soon as it crosses a particular price threshold, constantly monitoring prices to catch exactly the right moment would require a lot of human effort. This is exactly the kind of condition that can easily and reliably be detected and acted upon by very simple software. However, it is widely believed that the extreme speed of these agents has a destabilizing effect on the markets, and may even have been responsible for some minor market crashes. The slowness of human reactions, and the need for a real person to make orders, allows common sense to slip in where it could not be provided by a simple automatic agent.

Agents are also used by some online financial services such as automobile-insurance consolidators, who send their bots out to the sites of other companies. The bots fill out the forms on those companies' sites and retrieve quotes, which are then all displayed on the consolidator's site. The unauthorized use of bots on third-party sites is prohibited by many organizations; however, preventing their access to a web site open on the internet can be almost impossible.

Summary of positive issues
Agents can be created and used to automate processes. Agents can be sent out over a network to collect and return information to their owner.

Summary of potentially negative issues
Agent technology is primitive, and only well-structured, relatively simplistic tasks can be performed safely. Bots and agents are very unwelcome on many web sites and networks, and web site providers should be aware that other organizations may make covert use of their online resources, presenting the results as their own; that can result in an unexpected increase in bandwidth and server load when a popular service is provided.

Reference
J. Bradshaw (1997). Software Agents (Cambridge, MA, MIT Press).

Associated terminology: Artificial intelligence, Natural language processing.
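The stock-selling scenario above can be sketched as a very simple software agent. This is an illustrative sketch only: `get_price` and `sell` are hypothetical stand-ins for whatever market data feed and order API an organization actually uses.

```python
import time

def price_agent(get_price, sell, threshold, poll_seconds=1.0):
    """Watch a price feed and sell the moment the threshold is crossed.

    get_price and sell are placeholders for a real market data feed
    and order API; nothing here is tied to an actual trading service.
    """
    while True:
        price = get_price()
        if price >= threshold:
            sell(price)               # act the instant the condition holds
            return price
        time.sleep(poll_seconds)      # polls far faster than any human could
```

Fed simulated quotes of 98.0, 99.5, and 101.2 with a threshold of 100, such an agent ignores the first two ticks and sells on the third; the destabilizing risk the entry describes comes precisely from this unreflective speed.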
Algorithm
called for inputs to be sorted. Repeated experience tells us that most programmers do not notice that the simple sorting method is too slow for large data sets, and, of those who do notice, very few indeed are able to work out a significantly faster alternative. One of the essential parts of a formal training in programming is a long and demanding study of the large collection of algorithms that have already been discovered and analyzed, together with the Data Structures (carefully tailored, seemingly unnatural ways of organizing data for effective access) that go with them. As with any other engineering profession, it is impossible to do a good job without a thorough knowledge of what has been tried before. If a programmer starts the job fully armed with what is already known, they will have some chance of finding something new. Inventiveness is important: not all problems have been seen before. A programmer who does not already know the standard algorithms and data structures is doomed to nothing more than rediscovering the basics.

Business value proposition
Many of the algorithms and data structures for standard and interesting problems are thoroughly documented and analyzed in sources such as those provided by the ACM and the standard textbooks well known to all trained programmers. Knowledge of established algorithms not only gives programmers the tools necessary for solving standard problems without having to "reinvent the wheel" at every step, but also provides an essential foundation for designing truly new solutions. As Isaac Newton explained his success in inventing calculus and understanding gravity in 1676, "If I have seen further it is by standing on the shoulders of giants."

Algorithms are an abstraction, a high-level view of a method, that enables programmers to construct and investigate a design to solve the computational problem independently of the final implementation language. This description may be written in a style that is easier to read than finished program code, and thus amendments to the design can be made in a more informed manner than would be possible by examining the code. Knowledge of algorithms and associated data structures also enables programmers to determine the efficiency of their solutions. A simple and obvious algorithm that works well in simple cases may be completely unsuitable for commercial use. A professional programmer must thus understand that alternative algorithms exist, and be able to select the appropriate one, perhaps even invent a new one, for any given situation. Further, the computability of an algorithm must also be considered, to ensure that the algorithm is theoretically sound.

Summary of positive issues
Algorithms are an abstraction of a problem to be solved; thinking of a general algorithm rather than a specific problem allows a programmer to write cleaner, more maintainable code that has a strong chance of being reusable in other projects. An enormous collection of existing, fully analyzed algorithms with well-tested implementations is freely available, and a vast range of books is also available on the subject. Design, analysis, and proof of new algorithms continues to be a major direction of research.

Summary of potentially negative issues
Algorithmic design needs qualified specialists in computer science and software engineering. Inappropriate algorithm design can result in inefficient or unworkable solutions being implemented. Lack of knowledge of algorithmic techniques by software professionals is a major cause of inefficiency and poor performance.
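The sorting example above can be made concrete. The sketch below contrasts the "simple and obvious" insertion sort, whose running time grows quadratically with input size, with merge sort, one of the classic faster alternatives that runs in O(n log n); for a million items the difference is several orders of magnitude.

```python
def insertion_sort(items):
    """The simple, obvious method: fine for small inputs, but the
    running time grows quadratically with the size of the input."""
    result = list(items)
    for i in range(1, len(result)):
        key = result[i]
        j = i - 1
        while j >= 0 and result[j] > key:   # shift larger items right
            result[j + 1] = result[j]
            j -= 1
        result[j + 1] = key
    return result

def merge_sort(items):
    """A well-known faster alternative: divide, sort halves, merge."""
    if len(items) <= 1:
        return list(items)
    mid = len(items) // 2
    left, right = merge_sort(items[:mid]), merge_sort(items[mid:])
    merged, i, j = [], 0, 0
    while i < len(left) and j < len(right):  # merge two sorted runs
        if left[i] <= right[j]:
            merged.append(left[i]); i += 1
        else:
            merged.append(right[j]); j += 1
    return merged + left[i:] + right[j:]
```

Both produce identical output; only the cost differs, which is exactly the distinction the entry says untrained programmers tend to miss.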
Analog
more easily and accurately with a digital system. The analog world does not generally impact modern business computing.

Summary of positive issues
Analog processing is a requirement when interfacing a computer system with real-world inputs and outputs.

Summary of potentially negative issues
Analog processing is inherently inaccurate and inconsistent, but all analog signals may be converted to digital form before processing, thus negating such issues.

Associated terminology: Digital, Audio, Video.

Anti-virus software

Foundation concepts: Virus, Security.
Definition: An application that detects and removes viruses and other illicit software.

Overview
Anti-virus software has the ability to recognize viruses that have already been investigated by the software authors. Generally, a "signature" for each known virus is created, which consists of certain recognizable key attributes. This means that slightly modified viruses, or viruses embedded within other files, will also be detectable.

A virus, beyond consisting of executable code, may have any form. There is nothing about a virus per se that can be detected. Only already discovered forms may be detected. Fortunately, most viruses are unimaginative copies of existent forms, and anti-virus software can be extremely effective.

However, when a truly new virus is released upon the world, there is nothing that any anti-virus software could possibly do to detect it. Protection relies upon the authors of the anti-virus software continually monitoring thousands of vulnerable systems, watching for unexpected activity, and catching new viruses as quickly as possible. Once a new virus has been caught, a new signature can be generated and uploaded to all operational anti-virus software.

It is essential that anti-virus software be continually updated. Reputable anti-virus producers provide frequent automatic updates through the internet. Anti-virus software can only detect a virus that was already known at the time of its last update. There is no protection against a new form of virus.

When anti-virus software detects a virus, it will generally attempt to remove it from the system cleanly, restoring affected files to their proper state. Sometimes this repair operation is not possible, and the last resort is to quarantine the damaged file. This means simply moving it to a protected location from which it can not be executed, thus rendering it harmless without destroying the potentially important file that it is attached to.

Anti-virus software generally works in two modes. Periodically it begins a "scan," actively inspecting every single file on disk or in memory, looking for signs of infection. This is a long process, and is typically configured to happen overnight. In addition to periodic scans, the software will also inspect all incoming (and possibly outgoing) data, including emails, downloaded web content, inserted floppy disks, etc. Anti-virus software can also be configured to monitor currently running software, as well as that resident on disk, and halt any active viral processes.

Business value proposition
Anti-virus software is a vital aspect of a corporate security framework. A plan must be in place to ensure that the latest anti-virus software is active and loaded onto all email servers to intercept emails with known viruses and take the appropriate action. Anti-virus software should also be
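The signature matching described in this entry can be sketched in a few lines. The signatures below are invented for illustration; real products use far larger databases and much more sophisticated pattern matching and heuristics.

```python
# Hypothetical signatures: short byte patterns taken from "known" viruses.
SIGNATURES = {
    "EICAR-like": b"X5O!P%@AP",
    "DemoVirus":  b"\xde\xad\xbe\xef",
}

def scan_bytes(data):
    """Return the names of all known signatures found in the data.

    An empty result only means no *known* virus matched; a truly new
    virus would pass through undetected, as the entry explains."""
    return [name for name, pattern in SIGNATURES.items() if pattern in data]

def scan_files(files):
    """Scan a mapping of filename -> raw bytes; report matches per file."""
    report = {}
    for name, data in files.items():
        hits = scan_bytes(data)
        if hits:
            report[name] = hits   # candidate for quarantine, not execution
    return report
```

The design choice mirrors the entry: detection is a lookup against past discoveries, which is why keeping the signature set updated matters more than the matching code itself.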
Application generator
languages and systems, such as Ada, Cobol, Java, C++, C, Visual Basic, PL/1, DB2, SQL, and DDL.

Business value proposition
The use of application-development environments enables organizations and individual developers to create executable code rapidly from a high-level design. Application generators provide a uniform development environment, e.g., a UML GUI, through which code of consistent quality is generated. The code that is generated can then be tested and maintained through other functional aspects of the system.

This style of development is especially suitable for situations in which repetitive, incremental designs that operate upon the same database and structures need to be developed. Development through application generators allows programmers to focus upon more important problems, testing, and add-on functionality.

Summary of positive issues
Application generators provide developers with the potential to create high-quality designs rapidly, using systems that enforce design rules through the developers' interface, from which consistent, documented source code is generated. The design methods available for support range from formal methods to traditional lifecycle methods. Similarly, the range of languages output by these systems is equally wide, ranging through Ada, Java, C++, Cobol, and real-time systems. Many tools are available, some from open-source vendors and others from commercial organizations, with corresponding levels of support.

Summary of potentially negative issues
Application generators are not intended to provide a complete substitute for human programmers, but can be used to relieve them of repetitive, unrewarding, and error-prone programming tasks. The developer will need to be trained in the methodologies used by the systems and must understand the implications and the limitations of the code being generated. The code generated will be consistently based upon the design, but the performance of that code will need to be assessed if performance is critical, since the code is often not optimized. Furthermore, it is important to determine whether designs and code from different systems can be shared across tools, architectures, and databases.

Reference
J. Herrington (2003). Code Generation in Action (Greenwich, CT, Manning Publications).

Associated terminology: UML, Object-oriented, Formal methods, Structured design methodologies, Cobol, Java, C++, C, Visual Basic.

Application server

Foundation concepts: Server, Client.
Definition: An application server is a computer upon which applications reside and that allows those applications to be accessed by client computers.

Overview
Network architectures are sometimes defined in terms of tiers, and the terms Two-tier and Three-tier architecture are frequently used. Each tier represents a layer of technology. In a two-tier architecture, Clients typically represent one layer of the architecture, and the Server or servers represent the other. In three-tier architectures, a middle tier is placed between the client and the server. This intermediate tier is sometimes referred to as the Application server; it controls the information flow and directs requests that are made upon other system resources, such as a Database server, which may be distributed across the third tier. The three-tier model allows greater numbers of clients to be connected to the
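The three-tier flow described above can be sketched with plain classes. The request format and the data are invented for illustration, standing in for whatever protocol (HTTP, message queue, remote procedure call) a real middle tier would speak.

```python
class DatabaseServer:
    """Third tier: the only place the data actually lives (toy data here)."""
    def __init__(self):
        self._rows = {"42": {"name": "Ada", "balance": 100}}

    def query(self, key):
        return self._rows.get(key)

class ApplicationServer:
    """Middle tier: validates requests and directs them to the database
    tier, so that clients never touch the data store directly."""
    def __init__(self, db):
        self.db = db

    def handle(self, request):
        # Control the information flow: reject anything malformed.
        if request.get("action") != "lookup" or "id" not in request:
            return {"error": "bad request"}
        row = self.db.query(request["id"])
        return {"result": row} if row is not None else {"error": "not found"}
```

A client sees only the middle tier, e.g. `ApplicationServer(DatabaseServer()).handle({"action": "lookup", "id": "42"})`; this indirection is what lets the third tier be replicated or distributed without the clients noticing.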
Architecture
efficient working CPU, not with how the registers and arithmetic units themselves are made.

Network architecture refers to the overall "map" of a network: which computers, routers, servers, switches, etc. are connected to which, in what local groupings, and through which protocols.

Software architecture is concerned with how the major components of an application, perhaps a database, a user interface, and a network service provider, are combined together to make a single, integrated, well-engineered whole.

Business value proposition
The term "architecture" is used by technologists and systems professionals in many ways: computer architecture, network architecture, software architecture. For CIOs, the term "systems architecture" is frequently used to describe the overall corporate computing resource, which contains the other architectures as subsystems. When designing systems architectures, many issues are considered, ranging from technical concerns, such as the limitations and capabilities of various competing architectures, to the ability of a particular architecture to support business processes.

The selection and implementation of an architecture can be further complicated in several ways, including by the nature of the organization's existing systems, which may or may not be compatible with the strategic and technical requirements of the envisioned architecture, and by the influence of external entities such as major customers, who may place pressure upon a supplier to use a particular technology, which then has to be integrated into the overall architectural design. The selection of a new architecture may also be influenced by factors such as regulatory frameworks that require a certain technology to be used, or the strategic decision to use open-source software as a core of the software architecture.

The CIO and the IT organization within an enterprise must develop, document, and maintain systems architectures with the aim of supporting and enabling the business processes of that enterprise. Intimate knowledge of the architecture at all levels enables technology and business decisions to be made in such a way as to facilitate the continued alignment of corporate strategy and the technological architecture that underpins it.

Reference
S. Spewak (1993). Enterprise Architecture Planning: Developing a Blueprint for Data, Applications, and Technology (New York, John Wiley and Sons).

Associated terminology: CPU, Enterprise resource planning.

Artificial intelligence (AI)

Definition: Computer-based technologies that simulate or exceed human levels of task-related performance or expertise, or that duplicate some aspect of intelligent human behavior.

Overview
The origins of artificial intelligence stem from the research of workers such as Alan Turing, who asked "Can a machine think?" and in 1950 devised a test, now known as the Turing test, as an understandable way to approach the question. The test basically consists of a human judge undertaking a conversation via a terminal and keyboard (as in a Telex system) with a selection of other entities. Some of those others may be real humans; others will be computer or robotic systems. If the human can not reliably tell which is which, Turing's claim is that we must accept that the non-human participants have achieved intelligence. The point of the test is to isolate judgment from irrelevant or "unfair" influences: many claim that machines can never really think because they have no souls, or are only
obeying their programming. The Turing test allows a purely objective assessment based on merit alone.

Since its inception, AI has developed many branches and can be thought of in a variety of ways. The branches of AI include Machine learning, Natural language understanding, Neural networks, general problem solving, Robotics, Expert systems, vision processing, and speech recognition. These sub-disciplines have evolved at varying rates, with some aspects still being confined to research laboratories, whilst others have become mainstream software design techniques.

In its modern sense, AI is not limited to attempts to duplicate or simulate intelligent human behavior. The basic techniques of AI are frequently used in solving problems for which no method of solution is known. Heuristic and Minimax searching allow a computer application to search effectively an infinite range of possible solutions and find one that is at least adequate. Neural networks can learn to extrapolate solutions to problems that have never arisen before by self-training based on samples of similar problems and their known solutions. These and other techniques follow the human pattern of learning from and adapting past experiences; they can not be relied upon to produce perfect or optimal solutions, but when no Algorithm is known, and no programmer has been capable of creating a direct solution, they may be the only alternative available.

Business value proposition
Artificial intelligence covers a wide variety of areas, and the formulation of a working solution frequently requires the incorporation of a wide variety of disciplines, techniques, and skills. While this can be an extremely difficult task to manage and incorporate into a working application, the incorporation of these AI techniques frequently provides the only possible solutions, traditional approaches being unsuitable or nonexistent. The incorporation of AI solutions into an application can therefore result in a distinctive, differentiated solution for an organization's products; for example, washing machines that incorporate fuzzy logic may be superior in performance to traditionally controlled washing machines, since they may be able to adapt better to unforeseen combinations of circumstances.

Summary of positive issues
Artificial intelligence has matured markedly since its early days, and many of the early techniques and applications that were experimental and cutting-edge have subsequently become part of a computer scientist's general "tool kit." For example, the development of expert "knowledge-based" systems may now be carried out through mature, established commercial applications and supported through well-researched development models and testing techniques. Artificial intelligence tends to refer to methods that are still in the research arena; when a technology becomes mature, it merges into the general set of techniques available to software engineers.

Summary of potentially negative issues
The newer techniques and applications available through this branch of computer science are advanced in nature and require specialized knowledge to design, implement, and test. Consequently they are expensive to develop. Additionally, the research nature of many aspects of AI requires careful management consideration before adoption, as does the suitability of AI systems in respect to high-risk implementations. The validation and verification of AI systems requires high degrees of specialized knowledge and remains an area of active research.
It is essential to be aware of the true nature of any AI technique built into a product; many produce generally very good results, but with no guarantees, and scenarios at the edge of a system's range may produce markedly poorer results than customers expect.

Reference
A. Barr and E. Feigenbaum (eds.) (1981). The Handbook of A.I., Volumes I, II, and III (London, Pitman).

Associated terminology: Application generator, Expert systems, Fuzzy logic, Machine learning, Natural-language processing, Robotics, Algorithm.

ASCII

Foundation concepts: Digital, Bit.
Definition: American Standard Code for Information Interchange: the standard encoding for text stored on computers.

Overview
All information stored on a computer has to be in numeric form. ASCII provides a numeric representation for every symbol that can be typed on a ‘‘standard” keyboard. Each character is represented by a unique small number as follows:

0--31      invisible characters such as ENTER and TAB
32         space
33--47     ! " # $ % & ' ( ) * + , - . /
48--57     the digits 0 1 2 3 4 5 6 7 8 9
58--64     : ; < = > ? @
65--90     the capital letters A B C D E F G . . . X Y Z
91--96     [ \ ] ^ _ `
97--122    the lower-case letters a b c d e f g . . . x y z
123--126   { | } ~

Thus any typed text may be represented by a sequence of small numbers. This code is still used as the primary representation in nearly all computers, but it is rather primitive in concept. Providing representations only for the keys that appeared on a teletype, ASCII is adequate only for standard American-English orthography. It does not even support complete British usage (words such as anæsthetic and œstrogen, and the £ sign), let alone the accents and diacritical marks required by European languages (élève, Würze, mañana), or the thousands of symbols required by Asian languages.

As a remedy, an international standard known as ISO-8859 has been introduced. This is an extension of ASCII, leaving all of the existing codes in place, but also assigning meaning to the 128 codes that ASCII (for historically valid technical reasons) did not use. This allows the more common accents and marks used in West European languages, but is still a source of great frustration for what it leaves out.

Another more significant extension is known as Unicode. Using more than a single eight-bit byte, Unicode can represent many thousands of characters and obscure symbols. Programmers are encouraged to use this Wide character representation for new development, but, due to compatibility issues, extra complexity in programming and processing, and simple human inertia, converted applications are still in a minority.

Business value proposition
The ASCII standard is universally known and almost universally used, and has been a key technology for computing since its invention. The character set provides a common representation for interactions between computers and other equipment, and thus improves reliability of communications, reducing the need for error-prone conversions, and lowering the cost of technology ownership. ISO-8859 is the extended standard used by web browsers.
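The numeric codes above can be checked directly in most programming languages; the short Python sketch below is purely illustrative and is not part of the standard itself.

```python
# ASCII assigns each typed character a small number; ord() and chr()
# convert between a character and its numeric code.
print(ord('A'))        # capital letters start at 65
print(ord('a'))        # lower-case letters start at 97
print(chr(48))         # code 48 is the digit '0'

# A string is stored as a sequence of these small numbers.
codes = [ord(c) for c in "YES"]
print(codes)           # [89, 69, 83]

# Characters outside ASCII need an extension such as Unicode (UTF-8),
# where an accented letter occupies more than one byte.
print("élève".encode("utf-8"))
```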
Audio
which means that there is a smallest representable change and nothing smaller than it will be detected, and the varying voltages can be sampled only so many times per second. However, the precision of the human ear provides a very useful cap on requirements; there is no point in recording aspects of a sound that nobody is capable of hearing. If the voltage produced by a microphone is measured only 50 000 times per second, it will capture everything within the range of human hearing, and that is well within technological capabilities.

Once an audio signal is in digital form, it can easily be manipulated according to complex mathematical formulæ to produce effects that would otherwise require very expensive signal processing hardware. Filtering to remove noise, or to enhance a weak voice, or even to remove one voice from a group; pitch changes to enhance a voice or modify one aspect of a sound; and a wide variety of other transformations can all be applied with nothing more than a standard personal computer. Editing operations, in which parts of a recording are cut out and pasted together in a different order, removing any audible discontinuities, are also easily performed. Scrambling, or audio encryption, is also much simplified after digitization. Software for Digital signal processing is widely available for most computing platforms, and is relatively easy to implement for a software engineer with some knowledge of the mathematics of Fourier transforms.

Digitized audio signals stored as is, just as a sequence of numbers directly representing measurements of the sound pressure, are known as WAV or Wave files, although the correct technical term is PCM or Pulse code modulation. This is the format used on audio CDs, and is the most convenient for digital signal processing and playback. Wave files are rather large; a high-fidelity stereo recording requires about 10 MB per minute. This means that long recordings occupy a huge amount of disk space, and any application that requires audio signals to be transmitted over a network will consume a large amount of bandwidth. For this reason, there has been a lot of research into the compression of audio files, and a number of very effective methods are available.

There is sufficient redundancy in typical sound recordings that lossless compression (i.e., perfect compression: after decompression the restored data will be identical to the original) may sometimes halve the file size. Standard applications for producing ‘‘zip files” approach this compression ratio, and special-purpose lossless audio-compression software may fare slightly better. However, it is rarely necessary to reconstruct digital audio with perfect accuracy, since the human ear is not perfectly precise. Lossy compression methods allow the user to select the appropriate compromise between high compression ratio and high fidelity for any given situation. The well-known MP3 (MPEG Audio Layer 3), AAC (Advanced Audio Coding), and WMA (Windows Media Audio) formats can reduce an audio recording to about 20% of its original size without great loss of quality, by progressively discarding the parts of the signal that are least likely to be detectable by human listeners. When the original audio contains no high-frequency components (e.g., spoken-word recordings) or when low quality is acceptable (e.g., cheap telephony), much greater compression ratios are possible.

Business value proposition
Since the advent of the internet, the use of sound on computers has radically departed from the past when computers were primarily silent devices. The primary use of audio is in entertainment and internet-related applications, and there are myriad applications, ranging from online instructional courses for employees, through
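The ‘‘about 10 MB per minute” figure quoted above for high-fidelity stereo can be verified with simple arithmetic. The calculation below assumes CD-quality parameters (44 100 samples per second, two channels, two bytes per sample); those specific parameters are an assumption for illustration, not stated in the entry.

```python
# Uncompressed PCM data rate at CD quality.
samples_per_second = 44_100   # CD sampling rate
channels = 2                  # stereo
bytes_per_sample = 2          # 16-bit samples

bytes_per_minute = samples_per_second * channels * bytes_per_sample * 60
print(bytes_per_minute / 1_000_000)        # roughly 10.6 MB per minute

# A lossy codec reducing the recording to ~20% of its original size:
print(0.2 * bytes_per_minute / 1_000_000)  # about 2.1 MB per minute
```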
Backup
must store its backup archives in a location that is both secure and distant. The same principle applies to non-networked computers: the only safe solution is to make copies of the backup archives on some removable storage medium, which should then be removed to another location.

Choosing a suitable removable storage medium is a serious problem. The simplest and cheapest solution is to use writable CDs (known as CD-R and CD-RW). High-speed drives typically cost less than $100, and, when bought in quantity, the blank disks cost only a few tens of cents. However, the capacity of a CD is only 650 MB. That can be stretched a little by the careful use of data Compression, but is only really adequate for the frequent Incremental backups that save only those files that have changed recently. It would take 250 CDs to perform a full backup on a typical well-used disk drive.

Another alternative is the writable DVD (known as DVD−R, DVD−RW, DVD+R, DVD+RW, DVD−RAM, DVD−DL, and DVD+DL). These are in effect simply CDs with higher capacities. They are a little slower, and rather more expensive, but from the user's perspective fundamentally the same. The capacity of a DVD is 4.7 GB, or, for the much more expensive dual-layer (−DL and +DL) versions, 8.5 GB: seven (or thirteen) times the capacity of a CD. So DVDs can hold more incremental backup material, but would still not be suitable for backing up entire disks.

Magnetic tapes in various forms provide another alternative. Until quite recently, magnetic tapes were seen as the perfect solution to the backup problem, since a single tape could store the contents of a few complete disk drives. Unfortunately, disk technology has improved far more rapidly than tape technology, and the ratio has been reversed. Currently, tape cartridge drives with capacities between 1 and 400 GB and greatly varying prices are available, and auto-loading units that can hold ten tapes at a time and automatically eject one and load another when necessary extend the capacity to genuinely useful sizes. Providing adequate backups of data is a technological problem that can not be ignored or solved by default.

There are also legal considerations. Backup archives containing copies of confidential data must be protected at least as carefully as the original. Old tapes that have reached the end of their working lifetime or that are no longer needed can not simply be thrown away; they must be properly destroyed and made unreadable, or the data must be completely overwritten first. In nearly all circumstances it is permissible to make a copy for backup purposes of purchased copyrighted software (commercial applications and operating systems, for example) because CDs, even those used by reputable companies to distribute expensive software, do sometimes fail or break. However, care must be taken that illicit installations are not performed from the backups, and, if the original software is sold or given to another party, the backups of it must be destroyed or erased.

Business value proposition
The correct and successful backup of data is vital to any organization, and thus the ultimate responsibility to ensure compliance falls on the CIO. The task itself may technically fall under the responsibilities of the network manager with input from the chief security officer. Typically the backing up of data is considered through a process known as Information lifecycle management (ILM), in which the data is guided all the way from creation to disposal through a series of processes and technologies that support the value and nature of the data at each point in time. For example, transactional data may be archived incrementally (only the changes are archived) to a backup tape server until the data is extracted from the transactional data, transformed and then
loaded into a data warehouse, at which point the data may be backed up incrementally onto tapes and these removed to another location for safe keeping. The mechanisms and forms of the data backup will also depend upon the scale of data being backed up. For a small company the capacity requirements may be such that a CD or DVD provides sufficient storage capability for a weekly or monthly backup. For those companies which require a complete 100%-redundant systems capability, there are specialist computers available from various manufacturers, which contain two or more of everything, with identical data processing and storage occurring in each of the two systems; thus, should one CPU or disk drive fail, the system can continue to run as normal in the other system while the problem is corrected. RAID (redundant array of inexpensive disks) techniques provide similar concurrent backups for disks only.

The legal issues facing many corporations, particularly those operating in the United States, dictate the need for strong backup policies; acts such as Sarbanes--Oxley and HIPAA require data not only to be stored correctly but in some instances also to be disposed of correctly. Again these issues need to be resolved by the organization in the form of processes and policies pertaining to data backup. Clearly legal issues are important, but the possibility of loss of intellectual property or even government secrets dictates that data must be stored in the most appropriate way (e.g., using encryption), and disposed of securely (e.g., by passing a large electromagnetic pulse through the system or at the very least writing to the data storage medium several times until the whole data-storage area has been covered and all prior data made unreadable and incapable of being extracted through any special forensic programs).

There are many third-party companies, vendors, and consultants to support the backup processes surrounding the information lifecycle.

Summary of positive issues
The backup requirements of organizations can be related to well-known and established models such as the information lifecycle. Technical issues surrounding the establishment of backup servers, use of tandem systems, off-site disaster recovery, regulatory compliance, and security are well understood, documented, and supported.

Summary of potentially negative issues
All data-storage media are fallible. Data storage and backup systems require the expenditure of resources and capital. The creation of local data that is not supported by a network server may not be subject to or adhere to policies that enable data to be recovered through backup procedures. Some of the technologies such as CD and DVD are limited in capacity and therefore unsuitable for large-scale backups at data-rich organizations. Some data storage media have a wide range of sub-categories and require care in the selection of devices. Failure to adhere to regulations pertaining to data backup and storage can lead to prosecution.

Reference
D. Cougias, E. L. Heiberger, and K. Koop (eds.) (2003). The Backup Book: Disaster Recovery from Desktop to Data Center, 3rd edn. (Lecanto, FL, Schaser-Vartan Books).

Associated terminology: Bug, RAID, Information lifecycle management.

Bandwidth

Foundation concept: Bit.
Definition: The maximum rate of information transfer.

Overview
The quantity of information is measured in bits. One bit is the smallest amount
Bandwidth
in after reception. Returning to the original example, the string ‘‘YES” is a 24-bit message, but if it is known in advance that the only possible strings are ‘‘YES” and ‘‘NO,” the transmission can be reduced to one bit, 1 for ‘‘YES” or 0 for ‘‘NO.” Just one bit is transmitted, but the receiver converts that one bit back to ‘‘YES” or ‘‘NO” before displaying the result. There was only one bit of real information in the original 24-bit message, so a compression ratio of 24 to 1 was possible.

Bandwidth limitation is one of the fundamental requirements of information theory; no reputable authority disputes it. If an invention claims to give you a transmission rate of 100 MBits per second on a 10 MBit line, it must be doing so by compressing the data, removing redundant information. If the data you are transmitting does not have a 90% redundancy rate, the invention can not work.

Business value proposition
Organizations need to determine both their internal and their external bandwidth requirements. The determination of internal bandwidth requirements is based upon the amount of data transmitted internally within the organization across the LAN. External bandwidth requirements are determined by considering the data requirements of the company in relation to any customers, vendors, or other entities with which it interacts electronically.

In reality, determination of bandwidth requirements for companies can be a complex task. Typically network managers and CIOs are forced to trade off performance against cost. The total cost of ownership will be calculated inclusive of issues such as the future bandwidth requirements for a growing organization, the scalability of the network connections, the type of data traffic, and the growth rates associated with the different types of data traffic placed upon the networks.

Organizations must also fully determine the drivers of the external demands for bandwidth requirements. A business-to-business (B2B) e-commerce marketplace, for example, wholly dependent upon its online customers, would typically be concerned with metrics such as the end-to-end response time experienced by its users when accessing and interacting with the marketplace (end-to-end response time refers to the time it takes to send a request to a site and then receive a response back). To ensure customer satisfaction the B2B company would need to invest in a network that has a bandwidth provision in excess of that needed to accommodate the demands of a peak user population interacting with the marketplace.

Network facilities are often purchased from third-party service providers such as ISPs or web-hosting services. Care needs to be taken when choosing a service provider to ensure that the bandwidths and services contracted are actually provided. For example, a web-hosting service may offer a ‘‘bandwidth” of 10 MBytes per day, but that does not mean that the access speed is high; it simply means that your customers may download up to 10 MBytes from that site per day without incurring extra costs for you. Hence contracts need careful management and examination.

Summary of positive issues
The cost of bandwidth and networking equipment is continuing to decrease in real terms. As the availability of bandwidth continues to grow, the number of devices that will be capable of connecting over a network will continue to grow.

Summary of potentially negative issues
Total cost of ownership needs to be completely understood in order to determine the system requirements correctly, since underestimation can have an extremely negative consequence for the organization.
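The ‘‘YES”/‘‘NO” example above can be made concrete; the sketch below simply restates the entry's example in runnable form, with the function names invented for illustration.

```python
# If both ends know in advance that the only possible messages are
# "YES" and "NO", the 24-bit ASCII string can be sent as a single bit.

def encode(message):
    assert message in ("YES", "NO")
    return 1 if message == "YES" else 0      # one bit on the wire

def decode(bit):
    return "YES" if bit == 1 else "NO"       # receiver restores the text

original = "YES"
print(len(original.encode("ascii")) * 8)     # 24 bits as plain ASCII
bit = encode(original)
print(decode(bit))                           # "YES" again after reception
```

The 24-to-1 ratio is only achievable because the redundancy is known in advance, which is exactly the point the entry makes about implausible compression claims.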
Biometrics
such as whorls and ridges, and their relative sizes and positions, can be quite successful.

Biometric systems are still very much in their infancy. There is no established accepted set of biometric parameters that may be used for reliable recognition, and the details of many existing systems are jealously guarded commercial secrets. A low-cost biometric system is unlikely to be of any use except as a gimmick or symbolic deterrent. Even high-cost ones need to be thoroughly tested before any investment is made.

Business value proposition
Biometric systems can be purchased and used to secure a variety of products and processes. These include fingerprint systems for doors, computers, and any device that can be loaded with software and connected to a USB fingerprint reader. The use of facial recognition biometric systems provides a convenient, cost-effective mechanism for the provision of a high level of security. Other biometric systems such as iris recognition, hand geometry, signatures, and voice prints are developing in their levels of reliability and acceptance.

Summary of positive issues
Biometric technologies have matured, and ‘‘smart cards” with photographic biometric and cryptographic technologies are now required by US government federal agencies for personnel identification (Homeland Security Presidential Directive 12). Technologies such as fingerprint readers have become accessible and low-cost solutions are available. The biometrics industry is supported by a large consulting base and academic literature.

Summary of potentially negative issues
Biometric security solutions can require a significant investment in ongoing administration, monitoring, and support. Biometric technologies and solutions are continuing to develop and are not infallible.

Reference
J. Woodward, M. Orlans, and P. Higgins (2002). Identity Assurance in the Information Age: Biometrics (New York, McGraw-Hill).

Associated terminology: Encryption, Security.

Bit

Foundation concepts: Binary, Digital.
Definition: A single binary digit, zero or one.

Overview
The smallest piece of information that can exist or be communicated is the bit: a single symbol, 0 or 1. The meaning of the symbol is completely context-dependent: it could represent ‘‘no/yes,” ‘‘off/on,” ‘‘the number I am thinking of is less than/not less than 1024,” or the resolution of any other dilemma. Usually a single bit is just part of a larger communication, but any information can be represented as a series of 0s and 1s (see ASCII, Binary). Information quantity is always measured in bits, and the usefulness or carrying capacity of any channel of information is known as its Bandwidth and is measured in bits per second. This is true whether the channel is a digital fiber-optic cable, a high-definition color-television broadcast, a person tapping on a morse-code key, or the human voice.

Business value proposition
Bits provide the common basis of information measurement and representation for all digital computing, networks, and computing devices.

Summary of potentially negative issues
Computing when considered at the individual bit level can be extremely data intense
and has necessitated the development of higher-level tools, representations, and languages in order to manipulate the data efficiently and effectively.

Associated terminology: ASCII, Bandwidth, Compression.

Bluetooth

Foundation concept: Wireless network.
Definition: Bluetooth is a specification (protocol) for wireless data transmission between devices that support that protocol.

Overview
Bluetooth is a protocol for wireless communication between devices of various kinds, and has been designed to be economic in terms of processing power requirements. The protocol has its origins in a Special Interest Group (SIG) created in 1998 that took its name from the tenth-century Danish king Harald Blatand (Harold Bluetooth in English). The SIG was focused upon the development of an open standard for wireless communication between devices operating within a short range of each other and having a low power-consumption requirement.

The system works by wirelessly connecting two or more devices in what is known as a Piconet, using a base-band frequency of 2.4 GHz. The system has been designed to work in noisy radio-frequency environments, where there is already a lot of radio-frequency energy being transmitted and less robust systems may be unable to work through the interference. It achieves this by employing a technique referred to as Frequency hopping, whereby the master device communicates to the other devices (known as slaves) in its piconet first at one frequency and then at another, hopping through the frequencies during the communication. If some of the frequencies are too noisy, and the communication fails, then it will still be successful on others.

Bluetooth occupies layers 1 and 2 of the OSI seven-layer model, and thus may be used in place of network cards and cables in a local area network. Normal TCP/IP network traffic may be carried over Bluetooth devices. In this capacity, Bluetooth is a popular means of adding mobile internet connectivity to a portable computer by interfacing it with a suitably enabled cellular telephone.

Business value proposition
Bluetooth is an open standard administered by the Bluetooth SIG (https://www.bluetooth.org/) that has been adopted by manufacturers and employed in a wide variety of devices, including wireless headsets, wireless-enabled telephones, computer mice and keyboards, printers, USB adapters, ‘‘Bluetooth access points” that allow connection to a wireless network, and wireless office equipment such as electronic whiteboards, as well as Bluetooth-enabled gaming consoles, medical equipment devices, and GPS devices.

A primary adopter of the technology has been the automobile industry, to facilitate device-to-device communication, focusing upon hands-free cell-phone operation through the vehicle's voice-recognition system.

Summary of positive issues
Bluetooth is an open standard that facilitates communication between devices at a short distance using a frequency-hopping technique that enables it to work in noisy radio-frequency environments such as an office or home. Bluetooth has three levels of security and authentication systems built into it. Multiple piconets can be combined into one Scatternet.

Summary of potentially negative issues
The Bluetooth system works at a close proximity (up to 100 m depending upon the power class of the system) and has a
limit on the number of devices that can communicate on one piconet (one master and seven slaves). The bandwidth of the specification is low (1 Mbps). There is a practical limit to the number of piconets and scatternets that can be in very close proximity due to interference from overlapping signals.

References
http://www.bluetooth.com
R. Morrow (2002). Bluetooth: Operation and Use (New York, McGraw-Hill Professional).

Associated terminology: OSI seven-layer model, TCP/IP.

The Bluetooth brand is owned by Telefonaktiebolaget LM Ericsson.

Broadband

Foundation concept: T-Carrier.
Definition: Broadband is a general term that refers to a high-speed data-transmission link.

Overview
The term broadband does not imply any specific minimum bandwidth for data transmission over a physical cable. Broadband is a term generally used to describe any high-speed link for the transmission of data or voice in a residential setting that has a bandwidth higher than a dial-up internet connection.

Dial-up internet connections are based on the use of analog modems to connect a computer to an internet service provider (ISP) over the normal cables owned by the telephone-service provider. Dial-up services have been available to the general public since before the internet was deregulated in 1995, at a variety of bandwidths depending upon the technology used. The base bandwidth for dial-up is 56 Kbps (Kbps = thousand bits per second).

Broadband technologies are based upon the transfer of data as a digital signal. Several technologies have evolved to provide this service. One such technology was termed an Integrated Services Digital Network (ISDN) and is an ITU-T standard. Two levels of ISDN have been defined: the basic rate interface (BRI) allows for two channels of 64 Kbps or one of 128 Kbps; the second level of ISDN is known as a primary rate interface (PRI) and can carry 30 channels of data at 64 Kbps per channel, providing a total bandwidth of 1.92 Mbps. These channels, known as ‘‘bearer” or ‘‘B” channels, can be leased wholly or in part.

A second broadband technology, known as the T system (T-1, T-2, T-3, etc.), is discussed in the T-Carrier article.

A third broadband technology is known as the digital subscriber line (DSL), a mechanism for transmitting digital data over the standard copper telephone wire found in most residential areas. Since DSL works over a telephone line, it has the advantage of being a dedicated (not shared) line. There are several varieties of DSL, based on the technology used and the distance of the end user from the service provider. The speed of DSL broadband connections is often asymmetrical (ADSL), meaning that transmissions to the user (downloads) have a different bandwidth from transmissions from the user (uploads). Typically, residential users download much more than they upload, and the asymmetry reflects this. Typical downstream rates are speeds of up to 1.5 Mbps, while upstream rates of 640 Kbps or less are common. At the other end of the DSL bandwidth spectrum is very-high DSL (VDSL), which uses fiber-optic cables to the end user, providing downstream bandwidths of 55 Mbps for lines of up to 300 m in length and 13 Mbps over lengths beyond 1500 m, with upstream bandwidths of 1.6--2.3 Mbps.

A broadband connection can typically also be obtained through residential cable television service providers. This requires the use of a cable modem rather than a telephone modem. While bandwidth typically
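The rates quoted in the Broadband entry can be put into perspective with a short calculation. The link speeds below are taken from the text (56 Kbps dial-up, 128 Kbps ISDN BRI, 30 × 64 Kbps ISDN PRI, 1.5 Mbps ADSL downstream); the 10-MByte file size is an arbitrary figure chosen for illustration.

```python
# Compare transfer times for a 10 MByte file at the bandwidths quoted
# in the entry (ignoring protocol overhead, which real links incur).
file_bits = 10 * 1_000_000 * 8        # 10 MBytes expressed in bits

links = {
    "dial-up 56 Kbps":       56_000,
    "ISDN BRI 128 Kbps":     128_000,
    "ISDN PRI 30 x 64 Kbps": 30 * 64_000,   # 1.92 Mbps total
    "ADSL 1.5 Mbps":         1_500_000,
}

for name, bits_per_second in links.items():
    seconds = file_bits / bits_per_second
    print(f"{name}: {seconds:.0f} s")
```

The same arithmetic makes the entry's point about contracts: a host advertising ‘‘10 MBytes per day” is quoting a transfer allowance, not a transfer rate.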
Bug, debugging
human being will actually be human and make a mistake, failing to follow the procedure and introducing the very errors that those procedures are supposedly guaranteed to prevent.

There is one approach that provides a theoretical chance of success: Formal methods together with Computer-aided design (CAD); for further details see those entries. Even those methods do not really guarantee success. Formal methods succeed only if someone has a correct understanding of the problem to be solved, and successfully represents that understanding in a mathematical formulation. CAD allows a programmer to convert the mathematical formulation into usable software without error, because the human programmer provides only the intelligence required to decide how to proceed; the computer itself performs the actual steps of programming, and refuses to perform any step that does not conform to provably correct pre-established rules. Unfortunately, somebody has to create the CAD software in the first place, and, if the CAD application has a single bug in it, then every program it helps to create could be wrong.

Human beings make mistakes, and can not be prevented from doing so. Computers can not understand the requirements for software projects, and can not program themselves. Testing is not an adequate solution to the problem. Of course, testing is essential, but it is not sufficient. Testing can not possibly cover every case that may come up. All major software manufacturers have extensive testing facilities; the fact that bugs are still common is proof that testing does not solve the problem. Good programming practice can certainly drastically reduce the number of bugs in programs, but their elimination is not a practical possibility.

It is claimed that the term ‘‘bug” originated in the early days of computing, when engineers investigating a computer failure found an actual dead bug (in the sense of an insect) stuck somewhere in the works and causing the problem. This is not necessarily a true story, but it is widely believed and often repeated.

2: Remote surveillance devices in computing take two forms.

(1) Software illicitly installed on a computer that accesses stored data or records activities, transmitting records to the installer or keeping them for later retrieval. Spyware (q.v.) is an example of a software bug, but much more sophisticated forms exist. Good anti-virus software will find many instances of software bugs, and Adware removal tools find some others, but in critical situations where high-value data is at stake, there are no tools that may be completely relied upon. A trusted technician, fully familiar with exactly what software is supposed to be on a computer, and performing frequent regular checks, is a good but expensive precaution. The strict use of a Firewall (connection blocking software or hardware) also helps a lot, but nothing offers complete protection. It is quite possible (although certainly unlikely) that spying utilities could be built into the operating system or even the firewall software itself.

(2) Physical devices (the ‘‘bugs” of spy films) attached to the computer, which may transmit information by radio or by subverting existing channels (such as a network connection), or simply store information internally until the device can be retrieved by the planter. Kits are available for ‘‘sweeping” work areas and detecting any radio-frequency emissions, but they are of limited reliability, since bugs do not have to transmit all of the time, there are many frequencies that carry untraceable electrical noise, and bugs are not limited to radio transmissions. Again, frequent visits from a trusted technician who knows exactly what should be inside the computer will help,
but only partly. It is quite possible for hardware manufacturers to build perhaps disk drives or even CPUs with recording equipment embedded in their design. That is not to say that it does happen, but it is certainly possible.

Business value proposition
1: The term bug has several uses within the technology community, the most common being the bugs or errors in program code. A famous example of a bug manifesting itself is the "blue screen of death" that occurs when a personal computer crashes. Errors in software code are made by the programmers during the development process. Errors occur anywhere the code does not satisfy the specification, or where the specification is in fact wrong and the code manifests the specification errors. Typically both cases are referred to as bugs. There are steps that can be taken to eradicate bugs and errors through the use of validation ("are we building the right product?") and verification ("are we building the product right?") techniques. Programmers sometimes have to examine thousands of lines of code to locate an error; however, software products known as debuggers are available to assist them in this task.

2: Bugs can also refer to surveillance devices. In situations where secrecy is required, countermeasures may need to be in place to overcome these issues. A bug may be of the cold-war variety with microphones in walls and telephones, but more likely in a modern context is invasive software known as Spyware. Key-stroke capturing software could be placed on a vulnerable computer, and the use of software to break into data repositories containing important intellectual property is always a threat. Careful counter-surveillance measures need to be put into place, covering both physical and software threats, and this is the responsibility of the company's chief security officer and the CIO.

Summary of positive issues
Development methodologies and techniques are available to minimize the occurrence of errors in a software system; formal methods are a prominent path. The use of verification and validation techniques also assists in the minimization of errors in systems. Software developers have tools such as debuggers available to assist them in locating errors in their code.

Summary of potentially negative issues
Errors and bugs are present in all but the most carefully developed systems, and even careful development is no guarantee against them. The use of formal methods throughout development would theoretically lead to perfect bug-free software, but formal methods require exceptional levels of specialist training, and available tools provide incomplete coverage of the software development lifecycle. Even with the most perfectly designed software, simple errors in typing (either of the code itself, or in later data entry) may cause an error that lies dormant and undetected for years.

Reference
S. Murrell and R. Plant (1997). "A survey of tools for validation and verification 1985–1995," Decision Support Systems, Volume 21, No. 4.

Associated terminology: Formal methods, Reliability, Programming language.

Bus

Foundation concept: CPU.
Definition: A common connector used to transfer data, commands, or signals between connected devices. A common pathway.

Overview
In computing, the word Bus has the same derivation as the behemoth of public transportation: omnibus, meaning "for all"
(Latin, dative plural of omnis). A bus is a single elongated pathway shared by a number of subsystems, and used by all for communications. The alternative design is to provide every subsystem with a direct connection to every other subsystem; this could produce a faster overall system, since different pairs of subsystems could communicate at the same time, but would be prohibitively expensive and complex to implement. A bus replaces a multitude of direct connections with one shared medium. Busses do require special design considerations, to make sure that two pairs of subsystems do not attempt to communicate on the same bus at the same time, but this is only of concern to computer hardware designers; the question of whether or not to have a bus is irrevocably pre-decided for everyone else.

Busses occur at three distinct levels in a computer system. Inside the CPU, busses are used to connect the individual parts (arithmetic units, sequencers, memory controllers, etc.) to make the whole CPU work as effectively as possible. The design of the bus inside a CPU is invisible to users of the computer, and it can be effectively judged only by measuring the actual delivered performance of the CPU in controlled tests or Benchmarks (q.v.).

The CPU and other major components are connected by another level of busses on the Motherboard. Again, there is very little choice available: motherboard designers provide the best bus they can, and it can not be changed. If a user takes a strong dislike to the currently favored bus design (PCI at the time of writing), they will have a hard job finding any alternatives. The favored bus design does change as technology improves. The original IBM PCs used an 8-bit (later extended to 16-bit) bus called ISA (Industry Standard Architecture), which was improved by EISA (Extended ISA) and the short-lived Micro-Channel Architecture. VESA (Video Electronics Standards Association) was a popular standard for a few years, supporting high-performance video cards, but now PCI (Peripheral Component Interconnect) is the almost universal standard for PCs. IDE, EIDE, and ATA (Advanced Technology Attachment) are the commonly used busses for connecting disk drives to the motherboard.

Off the motherboard, and usually outside the computer, there is a third level of bus use. Instead of having a socket for every possible device that might ever be connected to a computer, have one socket that connects to a bus, and many different devices may in turn be connected to that bus. USB (Universal Serial Bus) is the best-known current bus system, and, although many devices may be connected to a single USB bus, it is so popular that computers usually have two or four USB sockets. SCSI (Small Computer System Interface) is another common standard, now waning in popularity, for high-speed external devices. See Port for more information on USB and SCSI.

Minicomputer and mainframe manufacturers are usually less concerned with compatibility with third-party equipment, so larger computers tend to have proprietary internal bus systems (which are often called Backplanes), such as Unibus and Q-bus, but there are also independent standards such as the IEEE-1014 VMEbus. Mainframes often use proprietary busses for peripheral interconnection, such as IBM's FICON (Fiber Connectivity) and ESCON (Enterprise System Connection), which can provide communication at 100 Mbits per second over distances exceeding ten miles, and Digital's DSSI. There are also independent standards such as ANSI's FDDI (Fiber Distributed Data Interface).

Business continuity service provider

Foundation concepts: Information lifecycle management.
Definition: Business continuity service providers are organizations that perform technology-related disaster planning and recovery services.

Overview
Owing to the mission-critical nature of technology within organizations, it is imperative for them to undertake disaster recovery assessments and planning. Business continuity planning (BCP) commences with a risk assessment of the current implementation and covers all aspects of the IT organization, assessing potential threats to the integrity of the organization and its systems. Risks include power failures, security breaches, failure of data backup systems, hardware failure, environmental threats (e.g., hurricanes), and infrastructure threats. The second stage of BCP involves managing the risk and developing responses to the threats through such means as backup power systems, UPSs, firewalls, improved physical and software security measures, redundant hardware, duplicate facilities (e.g., in environments not threatened by extreme weather), and stronger infrastructures. The use of stress analysis upon systems is a useful mechanism for exposing potential weaknesses in the systems.

The assessment of risk levels and the planning processes surrounding this assessment may be attuned to the business risk assessment. Should the threat be deemed high, an assessment of critical systems and emergency-level responses may be made for all technologies, people, and processes critical to the functioning of the business.

Business value proposition
The use of BCP allows businesses to consider a variety of risks and threats that could impact their business. Business continuity service providers (BCSPs) are third-party specialists who develop contingency plans for businesses. As service providers they act independently to assess the risks associated with a wide variety of potential threats, and can work with entities to provide documentation, training, resources, and processes to ensure continuity of service.

Summary of positive issues
BCSPs provide independent assessments of risks and threats to an organization. They are capable of presenting current best-in-class process and technical solutions to corporations.

Summary of potentially negative issues
The development of BCP requires frequent updating as threats and risks change. The processes, people, knowledge, key documents, technology, infrastructure, customers, vendors, and other "critical resources" associated with corporations continuously change, and this forces BCP to be revisited on a continuous basis.

Reference
H. Wang (2004). "Contingency planning: emergency preparedness for terrorist attacks," IEEE Aerospace and Electronic Systems Magazine, Volume 19, Issue 3.

Associated terminology: Hosting, Backup, Information Technology Infrastructure Library.

Business intelligence (BI)

Foundation concept: ERP.
Definition: The use of analytic software applications to monitor business processes.

Overview
Business intelligence or Business analytics software helps corporations to perform analysis of their business processes. Business intelligence systems operate upon data that has been extracted, transformed, and loaded into specialist databases from online operational databases such as those found in ERP systems. These databases are typically referred to as Data warehouses and,
depending upon the architecture of the business intelligence system, the data warehouse may be a single Enterprise data warehouse that contains within it specialist Data marts (e.g., a finance data mart, a customer relationship management (CRM) data mart, and a human capital management (HCM) data mart), or alternatively individual specialist data marts may have been created from, and exist outside of, the main enterprise data warehouse.

Business intelligence systems create an environment in which a process or set of processes may be carefully monitored. Data is drawn from the operational databases as required (e.g., every day or every hour) and placed into the data marts. The software then uses the data from one or more data marts, depending upon the architectural design, to generate reports. For example, a business intelligence system may be used to monitor a call center, providing the management with a visual "dashboard" through which to monitor such parameters as call volume, response time, length of call, and average number of calls per hour. Business intelligence systems simplify the production of reports. A dashboard might, for example, display a map of the country, showing call volumes from each state. The user could then drill down to see the volumes from the counties within a state, then the cities, the zip codes, and so on. At each level, data relating to process metrics will be available to the user. Business intelligence systems also allow score cards to be created for processes and sets of processes, based upon performance targets and the actual results as they relate to those metrics. The aim of business intelligence software is to help managers to make faster, better decisions based upon accurate, easy-to-interpret, and timely data.

Summary of positive issues
Business intelligence has emerged as a mature branch of the software industry, and many vendors offer industry-specific solutions for a variety of processes within an industry or function. Business intelligence systems can usually be built onto an existing data mart or warehouse. The systems are intended to be highly graphical in nature and possess powerful user interface capabilities.
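The drill-down idea described above can be sketched in a few lines. This is a toy illustration with invented call-center records (the states, counties, and cities are made up, and no particular vendor's product is implied): each record carries a geographic hierarchy, and drilling down is simply aggregation at successively finer key prefixes.

```python
from collections import Counter

# Toy call-center records: (state, county, city) for each call. Invented data.
calls = [
    ("FL", "Miami-Dade", "Miami"),
    ("FL", "Miami-Dade", "Hialeah"),
    ("FL", "Broward", "Fort Lauderdale"),
    ("NY", "Kings", "Brooklyn"),
    ("NY", "Kings", "Brooklyn"),
]

def volumes(records, level):
    """Aggregate call volume at a hierarchy level: 1=state, 2=county, 3=city."""
    return Counter(rec[:level] for rec in records)

# "Drill down": start at the state level, then restrict to one state and go finer.
by_state = volumes(calls, 1)                        # {('FL',): 3, ('NY',): 2}
fl_only = [rec for rec in calls if rec[0] == "FL"]
by_county = volumes(fl_only, 2)                     # Miami-Dade: 2, Broward: 1
```

Commercial BI tools wrap this kind of grouping in graphical dashboards and score cards; the point here is only that drill-down reporting reduces to grouping on longer and longer key prefixes.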
Business process re-engineering
… importance of radical redesign rather than continuing to incrementally change processes that had originally been created decades ago for a different technological world. Re-evaluating and re-engineering processes can, if performed correctly, create smoother, faster, more flexible, lower-cost end-to-end business processes, such as order-to-cash, procure-to-pay, and plan-to-report procedures. These end-to-end process cycles are typically supported by ERP systems, and many large re-engineering efforts are thus centered on implementing an ERP and the best-in-class process model that it brings with it.

Summary of positive issues
BPR may be undertaken in conjunction with a systems re-engineering project, usually in the form of an ERP. ERP systems are mature and supported by consultants and vendors who offer solutions for companies of all sizes. BPR can enable organizations to determine what is core and what is non-core to their business, and transition to vendor-supported systems that manage their non-core processes. This allows an organization to focus upon its core activities.

Summary of potentially negative issues
BPR requires that organizations understand the implications of change. There is a potential for serious negative consequences if the BPR effort is performed badly. Unsuccessful ERP implementations that have been central to BPR efforts have in the past caused organizations to go out of business (FoxMeyer, a drug distribution company, suffered such a fate). BPR efforts require careful implementation because the workforce may be hostile to process change and work against the implementation of new systems and processes. Careful expectation management is required.

Reference
M. Hammer (1990). "Re-engineering work: don't automate, obliterate," Harvard Business Review, July–August.

Associated terminology: UML, ERP, Data-flow diagram.
C, C++, C#
In the 1980s, some improvements and additions were made to the design of C, which resulted in there being two versions of the language. The original is known as K&R C, in honor of the authors (Kernighan and Ritchie) of the standard reference. K&R C is now obsolete. The new version was eventually adopted as an international standard (ISO/IEC 9899), and is now generally known as ANSI C.

Once C had been adopted as a general-purpose programming language, it made sense to put back into the language some of the features that had been deliberately left out when it was designed as a systems programming language, and to include new features that were not then part of the established paradigm for programming. C++ (pronounced "See Plus Plus") is a major extension of C, which was designed by Bjarne Stroustrup starting as early as 1979. It too has been adopted as an international standard (ISO/IEC 14882). C++ adds object-oriented programming and a vast library of predefined algorithms (known as the STL, or Standard Template Library), as well as a number of less major features. It also reasserts many of the safety rules that are inherent in general-purpose languages but tend to be left out of systems languages. However, adherence to the rules remains optional.

C++ is an exceptionally large and complex language with many arcane rules. Even the most experienced of programmers are frequently surprised by the details. There was some negative reaction, founded on the reasonable notion that, if the language is so complex that it even has surprises for experienced experts, what chance have normal programmers of getting things right? But C++ continued undaunted, and has easily overtaken C in popularity.

Following on the popularity of Java (which provides the full power of object-oriented programming without all of the complexity of C++, and has proved to be another exceptionally popular programming language, today rivaling C++), the Microsoft corporation introduced C# (pronounced "See Sharp," like the musical note). It is essentially a fusion of C++ and Java. Like Java, it provides better safety both for programmers and for users than does C++, automates many of the difficult programming tasks (such as memory management), and provides an extensive library of predefined operations useful for general and network programming. C# makes many of the minor design decisions differently from Java, but is seen by programmers as being a very similar tool to work with.

Business value proposition
The major benefit of C is that it is a powerful language that can be used at a variety of levels, from systems programming to application development. For many major systems developments, C has become the de facto standard programming language. For example, several ERP-systems developers have re-written their code into C, allowing their systems to be more easily modified than would have been possible when the systems were written in a proprietary or older procedural programming language. Use of the C language in these large systems developments has also encouraged application developers to create complementary applications, since the programmers do not have to learn a new programming language, and the use of a common language between applications reduces integration issues and problems. The use of C in conjunction with Unix also provides positive benefits, due to their compatible design features.

The popularity of C has led to a large programmer base from which organizations can draw. Numerous compilers and tools have been created to support programmers, along with an enormous amount of …
Cable Communications Policy Act of 1984 (CCPA)
… acceptable terms of data protection in the subscriber contract, since all providers may require detailed personal information from those wishing to subscribe to their service, e.g., the individual's social-security number, telephone number, bank-account details, and driver's license details.

Reference
Cable Communications Policy Act, 47 USC §551.

Associated terminology: Security, Encryption.

Cables and connectors

Definition: A flexible physical connection that acts as a conduit for power, control signals, or data.

Overview
In the early days of computing and electronic communications, data processing and transmission speeds were so slow that the components of a computer could be connected together with simple insulated copper wires without any loss of performance. In modern computing, the design of cables adequate for high-speed data transfer has become a science in itself.

When a rapidly changing electrical signal flows along a conductor, such as a wire, some of its energy is actually transmitted into the environment as electromagnetic radiation (that is exactly how radio transmitters work). The more rapid the signal, the more of it gets transmitted. Conversely, when a conductor passes through an electromagnetic radiation field, part of that field is picked up and converted to electric current in the conductor (that is how radio receivers work). Modern office equipment, especially computers, emits electromagnetic radiation on a wide spectrum of frequencies, and those very same emissions are picked up by that same equipment as interference that can be strong enough to mask the true operating signals.

Inside a desktop computer, the shielding provided by the metal casing is usually enough to protect the internal wiring, so long as that wiring is not unduly long. For example, it is not unusual for technically minded computer owners to buy extra-length IDE (motherboard-to-disk) cables so that their disk drives may be placed more conveniently. The extra length picks up extra interference, and can have a serious effect on system reliability. Mainframe computers with larger enclosures usually have very carefully designed internal shielding, which should not be interfered with.

Outside of the computer, cables for high-speed communications must be properly shielded; that is (part of the reason) why SCSI and USB-2 connectors seem to be so unreasonably expensive. Shielding may take the form of a grounded metal sheath completely surrounding the conductors (as in coaxial cable), or may be provided by carefully pairing signal-carrying and grounded conductors (as in twisted pair), or some combination of the two. The connectors at the ends of these cables usually provide some kind of metal fixing mechanism; this is not just to hold the cables in place, but also to provide continuity in the grounding.

Non-electronic communications media do not suffer from electronic interference. The prime example is Fiber optics, in which the conducting wires are replaced by transparent filaments, and signals are sent along them as rapidly modulated flashes of light, somewhat in the style of Morse code flashed on an Aldis lamp. Complete immunity to electronic interference means that fiber optics may be used in environments where electrical connections will not work, but the equipment required to send and receive signals along fiber optics is much more expensive. Of course, light itself would be interference for a fiber-optic cable, but the cable is easily shielded by providing an opaque covering.
Breaks and kinks in high-speed cables are also much more trouble. At low speed, a break in a conductor simply results in no signal getting through, an unfortunate condition, but one that is easily detected. At high speeds, the effect of self-interference can allow some attenuated portion of the signal to cross the break, making detection and repair more difficult. A kink in a cable can cause a short-span change in impedance (the cable's resistance to the flow of electricity); at low speeds, this simply reduces the signal strength slightly; at high speeds, it can cause reflections and echoes of the signal that drastically increase the overall error rate, and reduce throughput to a fraction of its expected value. With coaxial and twisted-pair cables (as used in broadband network connections), and especially with fiber-optic cable, it is essential never to kink or spindle the cable or force it into sharp turns, and, unless communications speed is not important, cables that have been kinked but still seem to work should probably be replaced. Carelessness with cables can be an expensive mistake in the long term.

Business value proposition
A few extra dollars for a well-designed, carefully manufactured cable or connector can easily save thousands of dollars in "troubleshooting" and repair costs, and incalculable amounts in saved down-time. Flimsy-looking metal covers should never be removed from computer parts; they may seem too flimsy to provide any protection from physical damage, and removing them may seem a sensible way to increase ventilation, but metal covers provide essential electrical shielding.

Summary of positive issues
A wide range of cables with a variety of bandwidths and associated connectors is available.

Summary of potentially negative issues
The most common type of cable covering is PVC casing and, should it catch fire, the fumes can be hazardous. Many fire codes require that a special form of cable known as "Plenum rated" cable be used, which is fire-resistant. This is especially important if the cable is to be strung in the overhead spaces and recesses above offices and not enclosed in special ducting. Plenum cable is more expensive than PVC-cased cable, and budgets need to be adjusted to take account of this expense.

Associated terminology: Ethernet, Bandwidth.

Cache

Foundation concepts: Storage, Memory, Disk.
Definition: A small but frequently accessed part of a data set, kept conveniently at hand for rapid access.

Overview
With any kind of data retrieval system, there are likely to be some data items that can be identified as more likely to be needed than others. For example, a corporate web server may store thousands of pages, any of which could be accessed at any time, but the corporation's "home page" and its search page will probably be accessed far more frequently than any others.

When retrieval of data requires some time or expenditure of computational effort, it makes sense to set aside these most frequently accessed items in some special location that allows faster access, if such a place exists. The set-aside store of frequently accessed entries is called a Cache.

In a system where normal access is through a network connection, a cache may sensibly be kept on a local disk. For example, a web browser usually keeps a cache of recently visited pages, so that pressing the "back" button does not require
any network access. In a system where normal access is to data on a disk, a cache is likely to be kept in memory. For example, a file-processing system may keep copies of recently accessed files in memory, because it is a common pattern to keep working on the same file until some task has been completed.

Web search engines often keep a cache of the results of popular searches, keeping dynamic statistics so that changes in popularity (as with people frequently accessing web sites that have been in the news recently) can be taken into account. A database system will often keep a cache of recently accessed records, or important indexes.

Caches also play an important part in hardware design. A disk drive controller will usually contain a few megabytes of memory, which is used as a cache to store frequently and recently accessed data, and, because memory accesses are so much faster than disk accesses, the presence of such a cache speeds up many common tasks by significant factors. Computer CPUs and motherboards usually contain a cache made of exceptionally fast memory, so that a similar speed-up can be achieved for normal memory accesses.

Business value proposition
The use of caches is an important mechanism through which a computer's abilities can be extended. A web server that stores popular corporate web pages in a cache will be faster than one that has to read repeatedly from backing storage; similarly, a computer that stores a user's previous web page access in a cache will give faster access to it than will one that doesn't. The cache is built into the system's design and, for software systems, it is usually an easy matter to reconfigure the size. Hardware caches are not normally at all flexible. It is a mistake to compare cache sizes for one CPU or disk drive with another, and conclude that the larger one is superior. Only a complete measurement of the system's speed in use will be meaningful.

Summary of positive issues
Caches allow computers to act more quickly than would otherwise be possible.

Summary of potentially negative issues
The use of a cache adds slightly to a system's complexity, but it is a well-understood technology, and is unlikely to have any significant negative effects.

Reference
J. Handy (1998). The Cache Memory Book, 2nd edn. (San Diego, CA: Academic Press).

Associated terminology: CPU, Bus.

CAD/CAM (computer aided design/computer aided manufacturing)

Definition: Computer aided design and computer aided manufacturing are two computer-based systems that provide a basis for product modeling and automated manufacturing.

Overview
Computer aided manufacturing (CAM) as we know it today evolved from early computers that used paper tape to issue instructions to numerically controlled devices (NCDs). An NCD is controlled by a machine-control unit (MCU), which is composed of two parts: a data-processing unit (DPU), which reads and processes the instructions in the control program, passing the instructions on to the other part, the control unit (CU), which converts those instructions into electro-mechanical control commands for the machine itself. In the early days this required special languages for each device, because each manufacturer used a different instruction set. This problem was overcome when, in 1958, a language was
created by researchers at MIT to tackle it. The language, known as APT, allowed programmers to write in a standard code and then, through NCD-specific Compilers, transform the code into instructions that the NCD machine would understand.

The development of Computer aided design (CAD) is generally acknowledged to have started when Ivan Sutherland created a system known as Sketchpad at MIT in 1963. Sketchpad was an interactive graphics application that utilized one of the first graphical user interfaces and a Light pen to enable programmers to input designs. During the next twenty years the CAD research and vendor community developed more sophisticated applications to enable designers or programmers to create wire-frame representations of objects, view models as solid objects, rotate and scale objects, view cross sections of objects, etc.

As the CAD systems evolved, it was natural for the output from CAD systems to be funneled into existing CAM systems, allowing the production and design issues to be examined. CAD/CAM systems have been used by organizations to design and manufacture nearly every type of product, with automotive and aeronautic systems being amongst the early adopters and driving forces behind much of the technological progress.

Business value proposition
CAD/CAM has the major advantage of allowing designers and production engineers to examine their models in virtual space without the intense commitment of resources and time that traditional methods require. These technologies are also tied in with other types of systems, such as virtual reality (VR) modeling and computer aided styling (CAS), to enable designers to extend the design processes, both by viewing their systems in more realistic ways and with industry design-specific software. For example, the use of CAS in the automobile industry focuses on helping designers with the styling processes rather than the engineering processes that would underlie a design. These would be considered in the CAD stage of the development, which follows the CAS stage.

CAD/CAM systems allow organizations to capture their organizational knowledge within the systems and reproduce aspects of a design at low cost in future designs, thus avoiding the syndrome of "reinventing the wheel."

Summary of positive issues
CAD/CAM provides a fast, cost-effective mechanism to develop designs. The results of the system are precise and repeatable. The systems allow 3D images to be created, scaled, manipulated, and viewed. The systems can be linked to VR, CAS, and other tools used in the concept-to-production lifecycle.

Summary of potentially negative issues
The cost of the software itself, together with the required training, can be considerable. The integration of the components, the CAD and the CAM, needs to be carefully considered to ensure that the systems can communicate with each other using compatible, neutral data file formats such as the ANSI Initial Graphics Exchange Specification (IGES) standard.

Reference
D. Schodek, M. Bechthold, J. Griggs, K. Kao, and M. Steinberg (2004). Digital Design and Manufacturing: CAD/CAM Applications in Architecture and Design (New York, John Wiley and Sons).

Associated terminology: Virtual reality.

Capacity planning

Definition: Capacity planning is the process of determining current and future resource requirements and developing cost-effective solutions to meet those needs.
Cell computing
54
Cell computing
by parallel processing. However, many commercially useful computational tasks do have a very parallel nature. For example, if 1 000 000 000 un-indexed documents stored on disks are to be searched for those that contain some key phrase, the task would take a very long time, at least a day, and up to a month, depending on the sizes of the documents. Two computers could perform the task in exactly half the time; ten could do it in exactly one tenth of the time.

Some tasks have a very serial nature, some have a very parallel nature, and most are somewhere between the two extremes. Gene Amdahl, in 1967, analyzed the situation and the result is now known as "Amdahl's Law"; it is still the most frequently cited reference for the speed-up of processing obtainable through the use of multiple processors.

Cell computing really comes into its own at a different scale, not using a few dozen obsolete computers picked up cheaply, but using many thousands, perhaps even millions, of purpose-built computers. The Pentium 4 (introduced in 2001) has nearly 45 000 000 transistors on a single chip; the Intel 8080 microprocessor (introduced in 1974) had only 4500 transistors, but was still computationally useful. It is already possible to construct a single chip that contains 10 000 individual computationally useful computers.

Software development for cell computing would require major retooling in the industry. The ability to design a program for thousands of slow processors is a quite different skill from those normally possessed by software engineers. Unusually, the area has been very well investigated; C. A. R. Hoare of Oxford University thoroughly developed the computational theory in a series of papers on communicating sequential processes (CSP), culminating in his influential 1985 book. The programming language Occam implements the essential parts of CSP and is available for most programming platforms.

In the 1980s, the semiconductor manufacturer Inmos, in conjunction with the designers of CSP and Occam, produced and marketed a self-contained single-chip computer, called the Transputer, which was designed to support cell computing systems. Although it was technically successful, the Transputer never approached the low prices that full-scale cell computing requires, and was not a commercial success.

Wolfram claims to have developed a whole new basis for science, built on the ideas of cell computing, in his self-published 2002 book; the book is to say the least controversial, but does add to the development of the subject. These ideas grew originally from a computer game/simulation known as Conway's Game of Life, which was invented by the mathematician J. H. Conway in 1970.

The practical implementation of cell computing would have immediate application to high-power graphics production (one computing cell responsible for every pixel of the screen), and a brief investigation of this possibility reveals the important fact that some programming problems actually become much easier if they are designed for large-scale cell computing instead of the traditional uniprocessor computers. It is also expected that cell computing would be very helpful in modeling physical phenomena, especially the atmospheric models that support weather forecasting. It is equally clear that the full benefits of cell computing will not be fully known until cell-computing hardware is generally available.

It is known that the Neuron, the basic brain cell, has exceptionally simple functionality that is easy to duplicate with very inexpensive circuitry. A neuron is incapable of performing any calculation or harboring any thought alone; it is only through the vast number (over 100 000 000 000) of neurons in the brain, and their complex interconnectivity, that intelligent thought is
possible. The possibility of being able to fabricate a brain's worth of artificial neurons is still in the very distant future, but the undeniable success of real neurons fuels copious academic research toward what is perhaps the ultimate application of cell computing.

Recently there has been some revival of interest in cell computing in the mainstream of the computing industry, in the form of IBM's CELL project, or broadband processor architecture. This has been fueled by the desire for ever more realistic graphics in video game consoles, which are amongst the most computationally demanding computer applications in common use.

Business value proposition
The commercial potential for cell computing is undeniable, but the current reality of the technology is that it lags considerably behind theory. Industry has for the last few decades pursued the goal of putting more and more transistors onto a single chip rather than working on simpler chips that can communicate with each other in large numbers. As the limits of technology are reached, it will become more important for the extremely high density chips to communicate with each other and subdivide the tasks to which they are set in order to continue to make gains in performance. The move to cellular systems will also require a major change in the style of programming and the development of new software systems to support the technology and its environment.

Summary of positive issues
The theory of communicating sequential processing is well known and understood in the computer science research community. Cell computing has been developed in the past and applications based upon those technologies were developed. Amdahl's Law governs the processing performances obtainable through the use of multiple processors programmed in traditional ways. Several vendors are working on and deploying new forms of cell technology.

Summary of potentially negative issues
The use of large-scale communicating processes using a cell processing model is largely confined to the research laboratory and will require extensive research to bring the technologies into the commercial realm.

References
- G. Amdahl (1967). "The validity of the single processor approach to achieving large-scale computing capabilities," in Proceedings AFIPS National Computer Conference (Anaheim, CA, AFIPS), pp. 483–485.
- C. A. R. Hoare (1985). Communicating Sequential Processes (Englewood Cliffs, NJ, Prentice-Hall).
- G. Jones (1987). Programming in Occam (Upper Saddle River, NJ, Prentice-Hall).
- S. Wolfram (2002). A New Kind of Science (Champaign, IL, Wolfram Media).

Associated terminology: Parallel processing.

Chaos theory

Definition: A chaotic system is one that may have broad areas of stable predictable behavior, but also has areas in which immeasurably small changes in the initial conditions produce significant changes in the results, thus making long-term predictions impossible.

Overview
Contrary to beliefs current in popular culture (Jurassic Park to be precise), chaos theory does not predict that complex systems will fail. It has absolutely nothing to say on the subject.

The truth about chaos theory is much more interesting and surprising. Since
Newton postulated the basic laws of physics in the mid seventeenth century, people have generally believed that the universe works like a big machine: everything that happens is the direct result of other things that happened before. If you know how the machine is initially set up, you can predict exactly what it will do in the future.

Weather forecasting is a good example. The underlying science of the weather is well known: how a small volume of air, land, or sea behaves when temperature, pressure, or other attributes vary, and how neighboring small volumes of air, land, or sea interact with each other are all understood. Until quite recently it was believed that if data could be gathered, accurately measuring the current state of the entire atmosphere, a sufficiently powerful computer could use the known equations of physics to determine how the system would evolve over time, and produce accurate long-range weather forecasts.

It was realized from the beginning that imperfect measurements would produce imperfect forecasts, but it was quite reasonably expected that good-enough measurements would produce good-enough forecasts, and gradual improvements in measurement technology would produce corresponding improvements in forecast accuracy.

It is now known that this is not the case. In many scenarios it does work out quite well: small errors in measurements produce only small errors in the forecast, allowing rain and drought to be predicted; these are known as stable conditions. Unfortunately, unstable conditions abound. Consider the formation of a hurricane: there are conditions under which a hurricane will form, and there are conditions under which it won't. Between those two sets of conditions, what happens? Initial conditions are infinitely variable, but a hurricane either forms or it doesn't. There is some point at which the tiniest immeasurably small change in conditions makes the difference. These catastrophe points always exist; no matter how accurately conditions are measured, there are always transitional points at which that accuracy is not enough to predict correctly a major yes-or-no event. If a forecast says "no hurricane" when there is a hurricane, then, of course, everything else is going to be completely wrong too.

It is generally accepted that other systems of a similar nature, namely interacting components with conditions that can not be perfectly controlled or measured, can be expected to exhibit chaotic behavior and resist long-term forecasts. The behavior of human or animal populations and stock-market prices are prime examples.

The fine boundary between sets of conditions with drastically different outcomes is known to science as the Julia set (after Gaston Julia, French mathematician, 1893–1978), and is often a fractal (grossly simplified, a fractal is a complex shape whose length can not be measured). The most famous of these is the Mandelbrot set, which is produced from a very simple mathematical formula, x = x² + c, and demonstrates that even exceptionally simple, completely known mathematical equations can exhibit chaotic behavior.

Business value proposition
The study of chaos theory and fractals has led to the abandonment of projects now known to be infeasible, the ability to produce realistic graphical renderings of natural scenes from very little data, and the discovery of some exceptionally effective data compression methods.

Summary of positive issues
The study of chaos theory has revealed some fundamental truths of the universe and produced some useful techniques as a by-product, rather in the manner of NASA moon missions being credited with the
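The Mandelbrot iteration x = x² + c mentioned above is simple enough to reproduce in a few lines of Python. The escape test (|x| > 2) and the iteration limit are the conventional choices for this computation, not figures taken from the text:

```python
# Sketch of the Mandelbrot iteration x = x^2 + c using Python's built-in
# complex numbers. A point c is treated as inside the set if the iterates
# stay bounded (conventionally |x| <= 2) for a fixed number of steps.

def in_mandelbrot(c, steps=100):
    x = 0j
    for _ in range(steps):
        x = x * x + c
        if abs(x) > 2:          # iterates escape: c is outside the set
            return False
    return True                 # stayed bounded: c is (apparently) inside

print(in_mandelbrot(-1 + 0j))    # True  (settles into the cycle 0, -1, 0, -1, ...)
print(in_mandelbrot(0.25 + 0j))  # True  (iterates creep up toward 0.5)
print(in_mandelbrot(0.26 + 0j))  # False (iterates eventually blow up past 2)
```

Moving c from 0.25 to 0.26, an apparently negligible change in the initial conditions, flips the outcome entirely: exactly the kind of catastrophe point the entry describes.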
- M. Barnsley (2001). Fractals Everywhere (San Diego, CA, Elsevier).
- G. Williams (1997). Chaos Theory Tamed (Washington, DC, Joseph Henry Press).

Children's Online Privacy Protection Act of 1998 (COPPA)

Foundation concept: Security.

Definition: The US Children's Online Privacy Protection Act of 1998 is intended to "prohibit unfair or deceptive acts or practices in the collection, use, or disclosure of personally identifiable information from and about children on the internet."

Overview
The US Children's Online Privacy Protection Act of 1998 (COPPA) (Title XIII, Section 1301, 1998) defines the privacy policies to which "web sites or online services" aimed at children under 13 must adhere. Key provisions of the act are the placement of prominent policy notices on web sites; the notification of parents that a site is going to collect personal information; the mechanisms for obtaining parental consent prior to a site obtaining or collecting personal data; to disallow collection of data as an incentive for further involvement in the site by the child; to allow parents to review information collected on their child and have that information deleted if they wish; to allow parents to prohibit an online service or web site from collecting any further data on their child; the role of schools in providing permission; the dissemination of data to third parties; and the procedures by which online services maintain and delete data.

Business value proposition
The act requires the operators of web sites aimed at children of 12 and under to operate within a defined framework.

Summary of positive issues
The act is comprehensive, defining the responsibilities of the service providers, the web-site operators, and the parents, and their interactions. The act aims not to be onerous and has special provision for cases of limited interaction (such as a site receiving an email from a child to which only a single response is needed). In such a case parental notice and consent are not required as long as all personal information obtained from the child is deleted upon execution of the email.

Summary of potentially negative issues
A problematic aspect of the act has been the mechanism through which "verifiable parental consent" is obtained. At the outset of the act the final rule devised a "sliding scale" that allowed the methods used by web sites to "be reasonably calculated in light of available technology, to ensure that the person providing consent is the child's parent." These methods include postal mail, "print-and-send" faxes, credit-card usage, toll-free verification, and digital signatures.

References
- The Federal Register, March 15, 2006, Part III, Federal Trade Commission, 16 CFR Part 312, Children's Online Privacy Protection Rule; Final Rule.
- 15 USC §§6501–6505 (2006).

Associated terminology: Law cross-reference.

CIO (chief information officer)

Foundation concept: MIS.

Definition: The chief information officer is a senior-level executive responsible for all corporate information systems.

Overview
The role of a Chief Information Officer (CIO) is to ensure that the systems and technology within a company are aligned to the company's strategy and that all systems are compliant to government regulations. The
CIO is a member of the executive layer who typically reports to the CEO and the board of directors or through the office of the CFO.

The CIO as head of the IT organization is an integral part of the corporate strategic planning group and helps formulate corporate strategy by bringing an understanding of the role of technology within the company's marketplace, the impact that new technologies will have upon the industry in which the company operates, and the technology-related regulations that impact the environment in which the company operates (e.g., HIPAA, Sarbanes–Oxley).

CIOs manage departments that encompass complex business processes and technologies that support business processes. Issues with which the CIO has to contend include business process re-engineering, systems developments, maintaining legacy systems, training, technology assessment, developing contingency plans, ensuring legal compliance, technology deployments (ERP systems, robotic systems, etc.), technology outsourcing, and technologies that support specific functional aspects of the business (e.g., customer relationship management systems).

Business value proposition
The role of the CIO has changed considerably over time as IT organizations within companies have increased in visibility and value. MIS departments were, up until the mid 1980s, frequently considered as data processing operations, run as cost centers within the organization by data processing managers. Subsequently, MIS departments became recognized as value centers providing "enabling" technologies to the core business processes and offering the potential for competitive advantage to be gained through their strategic deployment. During this period data processing managers became replaced by CIOs, who began to view themselves not only as organizational enablers but also as organizational transformers. As such, a visionary CIO who can also execute technology solutions has become a major corporate asset as both technology and corporate strategies continue to evolve rapidly.

Reference
- M. Earl, M. Feeney, and D. Feeney (1994). "Is your CIO adding value?," Sloan Management Review, Volume 35, Issue 3.

Associated terminology: Sarbanes–Oxley Act.

Click-stream tracking

Foundation concept: e-Commerce.

Definition: Click-stream tracking systems are applications that monitor web site accesses and compile data based upon visitor activity.

Overview
A click stream defines a sequential list ("stream") of activities (also known as "clicks" or "hits") that an individual visitor performs on a web site. Click-stream tracking systems monitor the click streams, consolidating the data and generating reports.

The applications that monitor click streams range from those that come as part of an internet server package, to custom solutions from vendors who specialize in tracking and monitoring internet and network traffic. Systems typically offer a variety of data analysis tools to monitor a wide variety of parameters, including metrics such as the total number of visits to the site, total number of pages viewed, total number of hits to the site, total number of unique visitors, total number of new visitors, total number of repeat visitors, total usage number, average number of visits per day (week, month, etc.), time period of highest activity levels (hour, day, month, etc.), average length of visits, average number of times a site is visited by an individual (assuming that the IP address is known), most frequent visitor, domain, host, or user,
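A few of the metrics listed above (total hits, unique visitors, most frequent visitor) can be computed from a raw click stream with very little code. The log format below is an invented illustration, not any particular server's:

```python
# Computing a few click-stream metrics from a raw log. Each entry is an
# (ip_address, page) pair; the log format is an invented illustration.
from collections import Counter

log = [
    ("10.0.0.1", "/home"), ("10.0.0.1", "/products"),
    ("10.0.0.2", "/home"), ("10.0.0.1", "/checkout"),
    ("10.0.0.3", "/home"), ("10.0.0.2", "/products"),
]

total_hits = len(log)
unique_visitors = len({ip for ip, _ in log})
hits_per_visitor = Counter(ip for ip, _ in log)
most_frequent_visitor = hits_per_visitor.most_common(1)[0][0]

print(total_hits)              # 6
print(unique_visitors)         # 3
print(most_frequent_visitor)   # 10.0.0.1
```

Production tracking systems apply the same kind of aggregation at much larger scale, typically with sessionization and time-window reporting layered on top.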
Client–server
the security relations between computers within that network. Thus it became clear that the computers within an organization needed to be connected through a centralized computer, the server, or a group of servers acting in a coordinated manner yet controlled by a central primary computer. Then, when co-workers need to share a file, the file in question is copied from the file server, over the network, onto one of the client workstations where it is to be used. If any modifications are made, the changes are sent back to the file server so that all other workers will be able to access the new version.

Similarly, when email arrives from outside the organization, there needs to be a designated computer that is always online and ready to receive it. When an associate wishes to read their email, their client workstation makes contact with the email server, and retrieves all relevant unread messages.

Having one computer (or more) specially designated as "the server" means that important files can be kept safe in one central location that is carefully protected, and there is always an online presence to receive email and handle web services, regardless of what the various workstations may be doing. Although the server is probably the most important of the computers, its role is mostly passive. Client–server architectures are also known as three-tier architectures from the fact that the clients (tier one) request information from the primary server (tier two) and they in turn request the service from the specialized servers (tier three), which provide a service such as keeping files, emails, or web pages; they respond only to requests from the primary server.

This is by far the most common organization for network services. When communications are made, it is because one computer or system was passively waiting for requests, and another actively made such a request. This is in direct contrast to the peer-to-peer organization, in which the two communicating systems have the same status. With a client–server system, it is important that the server(s) have known, relatively static locations (usually IP addresses), but clients can be very mobile.

The client–server relationship does not apply only to whole computers. Any given application that has network access may act as a server or as a client, or even both. On any computer at any time, there could be a number of applications acting as servers, passively waiting for requests, and a number of applications acting as clients, driven by the user actively making connections with servers. FTP is an example of an application that acts both as a server and as a client concurrently.

Business value proposition
The three-tier client–server architecture is by far the most common computing configuration found within organizations. It is predominantly centralized in nature, with flexibility of client location and access through the networking protocols utilized to connect the client to the server (TCP/IP). The network response can be improved on large-scale global client–server networks by defining groups of clients and having mirror-image or secondary domain controllers administer the users of those clients. The primary and secondary controllers coordinate in order to remain synchronized in terms of user profiles, security levels, file structures, etc. This ability to group users into domains and have localized control yet global access to resources has made this the standard choice for the majority of organizations because it simultaneously facilitates security and flexibility of the network.

Several variations on the nature of client–server architectures exist, two major categories of which are Thin client and Fat client. In thin-client architectures the client itself
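The passive-server/active-client relationship described above can be sketched with Python's standard socket module. The loopback address, the one-line "protocol," and running both ends in one process are assumptions made purely for the example:

```python
# Minimal client-server sketch: the server waits passively for a request,
# the client actively initiates the connection. Address and message are
# arbitrary choices for illustration.
import socket
import threading

ready = threading.Event()
addr = {}

def server():
    with socket.socket() as s:
        s.bind(("127.0.0.1", 0))          # port 0: let the OS pick a free port
        s.listen()                        # passive: wait for a request
        addr["port"] = s.getsockname()[1]
        ready.set()
        conn, _ = s.accept()
        with conn:
            request = conn.recv(1024)
            conn.sendall(b"reply to " + request)

threading.Thread(target=server, daemon=True).start()
ready.wait()                              # ensure the server is listening first

with socket.socket() as c:                # active: the client initiates
    c.connect(("127.0.0.1", addr["port"]))
    c.sendall(b"hello")
    reply = c.recv(1024)

print(reply)  # b'reply to hello'
```

Note the asymmetry the entry describes: the server binds to a known address and blocks in accept(), while the client can connect from anywhere.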
Clone
platform: the IBM PC. IBM-PC-compatible software became as popular as IBM PCs.

This gave rise to the "Clone." A clone is a computer that is exactly compatible with an IBM PC, so completely compatible that it can run all software designed for an IBM PC without modification, and thus may be bought instead and substituted directly for the more well-known product.

The IBM-PC design is based on the Intel microprocessor architecture. Any CPU that behaves exactly as the corresponding Intel CPU would behave may be used in its place. Thus the second type of clone: a microprocessor CPU not made by Intel, but designed to be directly substitutable on PC motherboards. Many of the product lines that began as Intel clones have developed into totally independent designs that are no longer clones in the normal sense.

Business value proposition
The vast majority of computers today are clones, so much so that the term is losing its meaning. For a computer to be a clone is no longer a derogatory term. There are many variables to consider in selecting a CPU, and no single product or manufacturer can claim to be "the" best.

However, designing a whole computer is an expensive and complex undertaking. Poor design, especially of the motherboard, results in poor performance and unreliability. It is important to select a manufacturer that can put enough resources into proper development of their products. A good clue is to start reading the "user manual": if the text is of poor quality or incomprehensible this may be an indication of other quality issues associated with the computer itself, or its manufacturer.

Summary of positive issues
Competition provided by alternate manufacturers ensures healthy development budgets, continuing improvement of product lines, more options, and lower prices. For most desktop/workstation uses, the differences between manufacturers and CPU families are almost irrelevant, so, although the variety of choices may seem bewildering, it is not really a problem.

Summary of potentially negative issues
Many very cheaply made computers simply do not work properly.

Cluster

Foundation concepts: Server, Network, Disk.

Definitions:
1. A group of computers configured to act as one.
2. A group of blocks on a disk.
3. A group of similar data items that may be related.

Overviews
1. In situations where a single computer is powerful enough to support the needs of a moderate number of concurrent users, but not powerful enough to support the large number actually expected, it is common to use clustering. A Cluster is a group of computers, usually configured identically as servers. They will generally have the same software installed, and, either through disk-sharing or by providing each computer with its own copy, will have access to the same data.

The cluster appears to the outside world, under casual inspection, to be a single computer. A system of dynamic load balancing is used to redirect all user requests to the member of the cluster that is currently least occupied. Two users "logging in" at the same time may find themselves connected to different real computers, but still having access to all the same files. A cluster of servers is often called a Server farm.

With clusters, it is common to use a system of Load balancing. This is a software consideration that attempts to keep the processing load on each of the mem-
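The "least occupied member" rule used by the dynamic load balancing described above can be sketched in a few lines; the server names and the use of active-connection counts as the load measure are invented for illustration:

```python
# Sketch of least-occupied load balancing: each incoming request is
# routed to whichever cluster member currently has the fewest active
# connections. Server names are invented.

servers = {"node-a": 0, "node-b": 0, "node-c": 0}  # active connections

def route_request():
    """Pick the least-loaded server and assign the request to it."""
    target = min(servers, key=servers.get)
    servers[target] += 1
    return target

# Six incoming requests spread evenly across the three-member cluster:
assignments = [route_request() for _ in range(6)]
print(assignments)  # ['node-a', 'node-b', 'node-c', 'node-a', 'node-b', 'node-c']
```

Real load balancers track richer signals (CPU, memory, response latency) and must handle members joining, failing, and finishing requests, but the routing decision has this same shape.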
Cobol
they were amended by many generations of programmers. This leads to difficulties in understanding, modifying, and testing aged Cobol systems.

References
- D. McCracken (1970). Guide to Cobol Programming (New York, John Wiley and Sons).
- http://www.cobolstandard.info/wg4/standard.html.
- H. Hinman (2004). Microsoft.NET for COBOL Programmers, 2nd edn. (Sunnyvale, CA, Fujitsu Software Corporation).

Associated terminology: C and C++, Java, XML.

Collaborative commerce

Foundation concepts: ERP, Network.

Definition: Collaborative commerce is the sharing of information between two or more entities to improve the collaborators' service delivery or product development.

Overview
Collaborative commerce has fundamentally existed in one form or another since the beginning of modern manufacturing when, for example, manufacturers would share their schedules with a supplier. However, with the advent of more sophisticated technologies such as networking, CAD/CAM design applications, virtual reality models, materials requirements planning systems, ERP systems, and messaging technologies, collaborative commerce has evolved dramatically to the point where corporations link aspects of their value chains together into a value system. The ability for companies to share data across corporate boundaries has been facilitated by the adoption of common base technologies such as XML and TCP/IP protocols. These protocols allow information generated by ERP systems (or similar systems) to be sent across a network efficiently and effectively.

An example of collaborative commerce occurs when company A (e.g., a car manufacturer) sends to company B (a tier-1 component manufacturer) a data file (in XML) that was generated by their ERP system's "planning and forecasting module," which was itself generated in the ERP module that models customer demand. Upon receipt by company B the information is routed into their ERP system and their "planning and forecasting module" re-aligns company B's production levels, inventory levels, human resource requirements, and capital requirements, and notifies their tier-2 supplier of the changes. The majority of these changes and messages can occur without any human intervention at all if the process models are designed to allow such an event to occur.

The advent of internet technologies such as XML has allowed the development of networks that allow many companies to interconnect and share data in large quantities effectively and efficiently. The data-sharing concept that underpins collaborative commerce has developed the concept of a Data synchronization hub. Such a hub allows companies to place data pertaining to items in an accepted standard format such as EAN.UCC. The data is then placed in a "data pool" (a database where many companies place their data). This data pool is usually administered by a third party (but for large companies this can be done in-house) and they send information pertaining to the dataset to the global synchronization network (GSN) provider, which holds the information and the location of each item's Data pool (q.v.). Customers or other companies then search the GSN registry through their chosen data pool provider and notify the data pool provider of the items they wish to subscribe to. The GSN provider then uses a "synchronization engine" automatically and continuously to synchronize the data between the two companies. For example, a
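The kind of XML hand-off described above (company A emits a forecast file, company B's system parses it into its planning process) can be sketched with Python's standard library. The element names, SKU, and quantities are invented for illustration, not taken from any ERP product:

```python
# Sketch of an XML demand-forecast message passing between two ERP
# systems. Element names and quantities are invented for illustration.
import xml.etree.ElementTree as ET

# Company A's system generates the forecast message...
forecast = ET.Element("forecast")
item = ET.SubElement(forecast, "item", sku="BRAKE-123")
ET.SubElement(item, "quantity").text = "5000"
ET.SubElement(item, "week").text = "2024-W10"
message = ET.tostring(forecast, encoding="unicode")

# ...and company B's system parses it and adjusts its production plan.
received = ET.fromstring(message)
for node in received.iter("item"):
    print(node.get("sku"), int(node.findtext("quantity")))  # BRAKE-123 5000
```

Because both sides agree on the element vocabulary, the exchange can run without human intervention, which is the point the entry makes about process models.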
Compiler
being produced for each machine and frequently the language designers would not stick strictly to the ANSI definitions of the "standard" language, thus causing more variance.

In order to resolve this situation IT organizations within commercial settings have focused on the set of languages and their compilers that are portable or interoperable, that is they will work on any platform. This movement started with the marriage of Unix and C, which are often open source in nature, enabling the programmers to understand the constraints and workings of the systems in which they were programming. By the late 1980s C had become a favored and common development language and evolved into C++, an object-oriented programming (OOP) environment, which was more powerful than C yet still portable across platforms. While C++ is the predominant choice of development language for the majority of programmers, other compiled languages that are heavily intertwined with the platform upon which they operate are still widely used, e.g., Visual Basic and J++ are predominantly development languages for interfacing with Microsoft applications and for developing web interfaces.

Summary of positive issues
Compilers shorten the development time for programmers to write programs. The library features of compilers save programmers from needlessly duplicating previous standard functions every time they write a program. Open-source platform-independent compilers allow developers to write code that will run upon a variety of devices.

Summary of potentially negative issues
There exist millions of lines of code written in thousands of programming languages and their variants, few of which will compile and run on any other systems. Compilers are frequently proprietary in nature, being closely coupled to the hardware and operating systems of the computer being used.

References
- R. Wexelblatt (1981). History of Programming Languages (New York, Elsevier).
- T. Bergin and R. Gibson (1996). History of Programming Languages – II (New York, ACM Press–Addison-Wesley).

Associated terminology: C and C++, Java, Cobol.

Complexity

Foundation concept: Algorithm.

Definition: Complexity is a measure of the relationship between the amount of data to be processed and the time required to perform that processing. It is not a simple matter.

Overview
In computing, Complexity is a technical term that describes the fundamental speed of an algorithm. It is not directly connected with the common meaning of the word, "how complicated something is."

Take for example a simple searching application. It takes as its input a plain text file, perhaps a corporate report, perhaps a novel, it doesn't matter. The application's job is to search the entire file to see whether a particular word or sequence of symbols appears anywhere within it, a common task. Obviously, the longer the file is, the longer the search will take. We could perform a simple experiment and work out the relationship. If we make the application search a 10 MB file, and find that it takes 1 second to do so, we could reasonably conclude that it would take 2 seconds to search a 20 MB file, and 10 seconds to search a 100 MB file.

People naturally expect a linear relationship between the size of a problem and the time required to solve it. If the data size is
doubled, the time required is also doubled; if the size is multiplied by 100, then the time required is also multiplied by 100. This is so much a part of normal life that it is usually just taken for granted. Driving 200 miles takes twice as long as driving 100 miles; reading two reports takes twice as long as reading one report.

Interestingly, the relationship between data size and processing time is often very far from linear, and the assumption of linearity can lead to serious miscalculations. One simple example is "long multiplication," as learned in elementary school. To multiply together a pair of two-digit numbers, four individual simple multiplications are required (to work out 63 × 75, one calculates 6 × 7, 6 × 5, 3 × 7, and 3 × 5, then with adding, carrying, and luck, one produces the correct answer). It took me 12 seconds to multiply together those two-digit numbers. With the assumption of linearity, I would expect it to take me 24 seconds to multiply together a pair of four-digit numbers, 120 seconds to multiply together a pair of twenty-digit numbers, and so on.

However, such a prediction is completely invalid. Closer analysis reveals that multiplying together a pair of twenty-digit numbers requires 400 individual simple multiplications, not 40, so it would take me 1200 seconds, not 120. The amount of work required grows with the square of the data size. Different problems (and, sometimes, different solutions to the same problem) produce a different relationship between amount of data and required processing time. Linearity is possible, but must not be assumed.

Non-linear relationships between the amount of data and the time required to process it are very common in computing. A professional programmer must be aware of this, and must know how to perform a basic complexity analysis for an algorithm. This is a major component of a computing curriculum, and it is a very complex subject.

The magnitude of the problem is simple to illustrate. Consider the testing phase of a new software application. It is tested on a data set of 1000 records, and found to take just 1 second to perform its task. A naive developer would assume that processing 10 000 records would take 10 seconds; 1 000 000 records would take 1000 seconds (17 minutes) and so on. The table below shows what the true completion times would be for some common algorithm complexities.

Data size    Linear    Quadratic   Cubic      Logarithmic
1000         1 s       1 s         1 s        1.00 s
2000         2 s       4 s         8 s        1.10 s
3000         3 s       9 s         27 s       1.17 s
4000         4 s       16 s        1 min      1.20 s
5000         5 s       25 s        2 min      1.24 s
10 000       10 s      100 s       17 min     1.31 s
100 000      100 s     2½ hours    12 days    1.70 s
1 000 000    17 min    12 days     31 years   2.00 s

This is a dramatic illustration, but not a dramatized one. After finding that a new application works properly, and seeing that it takes 1 second to process 1000 data items, many programmers would not bother to test it with 1 000 000 data items, especially after deducing that the test would take 17 minutes. However, if the algorithm turned out to be cubic, it would in fact take 31 years, not 17 minutes, and large-scale customers would be justifiably angry. As the table shows, non-linear complexities can also work the other way, giving unexpectedly fast responses.

Business value proposition
Failure to understand the complexity of a problem makes it impossible to estimate even roughly how long it will take to solve a problem. This in turn makes it impossible to select the appropriate solution for any given circumstances.
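Extrapolations of this kind take only a few lines of arithmetic. The sketch below is illustrative only; the function name and the 1-second-per-1000-records baseline are assumptions taken from the worked example above, not a standard library facility. It reproduces the completion times in the table.

```python
import math

def predicted_time(n, growth, baseline_n=1000, baseline_secs=1.0):
    """Extrapolate running time from a measured baseline under a growth law."""
    ratio = n / baseline_n
    if growth == "linear":
        return baseline_secs * ratio
    if growth == "quadratic":
        return baseline_secs * ratio ** 2
    if growth == "cubic":
        return baseline_secs * ratio ** 3
    if growth == "logarithmic":
        return baseline_secs * math.log(n) / math.log(baseline_n)
    raise ValueError("unknown growth law: " + growth)

# A cubic algorithm that takes 1 s on 1000 records needs about 31 years on 1 000 000:
seconds = predicted_time(1_000_000, "cubic")
print(int(seconds / (60 * 60 * 24 * 365)))  # 31
```

A one-word change of growth law turns a 17-minute test run into a 31-year one, which is exactly the point the table makes.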
Examples of such problems include the algorithms used to search and compare databases. If a company were to develop a new product, say a search engine for the internet, the algorithmic complexity of the approach taken needs to be considered by the organization's programming team. This complexity would include the way the data is structured (the data structure) in the database, the way the data is placed on the server configuration, and the search algorithm that operates over the data. In order for the new product to be successful, it is thus imperative that all of the technical complexity issues be considered, and that these issues are well understood and resolved, so that platform and software purchases, such as the database upon which the data will reside, can be made in an informed manner. For example, a poorly selected database environment may lead to the programmer using a sub-optimal data structure to store the data, which forces the selection of a sub-optimal search algorithm and leads to a sub-optimal product being produced as a result. Thus, complexity can be directly tied to the value proposition of products as well as the total cost of ownership and return-on-investment decisions associated with a product.

Summary of positive issues
The mathematics of computational complexity, while not easy, is formally developed within the discipline of computation and computer science. The algorithmic complexities of many well-known functions are documented and understood by computer scientists.

Summary of potentially negative issues
Failure to determine the complexity of an algorithm can easily make a software product completely worthless. Not understanding the whole picture of product complexity can be fatal. Algorithmic complexity is constrained and determined by many factors, including budget, hardware, legacy software, and operating system constraints.

References
- D. Knuth (1975). Fundamental Algorithms (New York, Addison-Wesley).
- M. Hofri (1975). Analysis of Algorithms (Oxford, Oxford University Press).

Associated terminology: Algorithm, Database, Computability.

Compression

Foundation concepts: Bit, Binary, Digital.
Definition: Reducing the size of a data set (the amount of storage it requires) without losing any of the data.

Overview
Compression and Encryption are very similar operations. Both convert data into an alternative form, from which the original may be fully recovered later. With encryption, the aim is to make the recovery impossible for unauthorized third parties. With compression, the aim is to make the alternative form as small as possible.

Data in its Raw form, as it is collected, unprocessed, tends to contain a lot of redundancy. That is, with some analysis it would be possible to represent exactly the same information in a much less bulky form. Human speech is a clear example. To record one minute of speech faithfully, so that a play-back is indistinguishable from the original to the human ear, requires about 5 000 000 bytes of data. If the only real information in that speech is the words that were spoken, it will amount to not much more than 100 words, or a total of 500–1000 characters, which require one byte each. The same information can be recorded in one two-thousandth of the amount of data. It has lost all the sounds of human speech, but retained all of the useful information. That is one form of data
compression: extracting the true information from the surrounding raw data.

Another form is illustrated by the representation of graphical images in digital form. The most common representation for such images uses a triplet of small numbers (in the range 0–255) for each pixel; the first indicates the degree of redness in the pixel, the second greenness, and the third blueness. So a bright red pixel would be represented by (255,0,0), a greenish-blue one by (0,120,240), a black one by (0,0,0), and a white one by (255,255,255). These numbers are simply stored as they are, in top-to-bottom, left-to-right order, to make an uncompressed image file. A number in the range 0–255 requires exactly one byte of storage (that is why that particular range is used), so a small picture of size just 200 × 200 pixels would occupy 120 000 bytes uncompressed.

If it is observed that a particular picture involves just four different colors, each of those colors may be given its own binary encoding, perhaps 00 for white, 01 for red, 10 for blue, and 11 for green. Then a sequence of four neighboring pixels can be written in a single 8-bit byte: 00000110 for white, white, red, blue. That results in a total size of 10 000 bytes for the 200 × 200 image, compressing it to just over 8% of its raw size. Recoding is the simplest form of compression.

If it is further observed that, of the four colors that appear in the image, white is extremely common (perhaps 90% of all pixels are white), red is next (at 8% of all pixels), and blue and green are quite rare (at 1% each), a variable-length encoding will produce additional savings. In this example, a one-bit code, 0, could be used for white, a two-bit code, 10, for red, and three-bit codes, 110 and 111, for blue and green. The sequence white, white, red, white would be encoded as 00100. For the whole image, we would have a total size of 44 800 bits or 5600 bytes, less than 5% of the original size. This is the basic idea behind Huffman encoding, which is the basis of many commercial compression schemes.

Further compression may be achieved by noticing that whole areas of the image are the same color, and encoding a representation of that area together with just one copy of its color. Detecting large areas of a single color is very easy for human observers, but computationally difficult. A simpler version, a sequence of pixels within a single row that are all of the same color, is much easier, and is known as RLE, Run-length encoding, a popular compression scheme for some limited applications.

Far greater compression ratios may be achieved by noticing that certain patterns of pixel color recur over and over again, and complex encodings that refer back to previous parts of an image to reconstruct patterns can be devised. This is the basis of LZW (Lempel–Ziv–Welch) and GIF (CompuServe Graphics Interchange Format) compression. Another variation of this provides the logic behind "Zip files." Zip files are most commonly used under Windows, but are also compatible with other systems; they are single files that contain "archives" for whole folders or directories all compressed. Individual files may be accessed and decompressed or replaced either individually or in groups. Zip files are a popular and convenient means of distributing whole data collections and applications.

If perfect reconstruction of the original raw data is not required, then Lossy (as opposed to Lossless) compression methods may give even greater savings. In simple terms, it may be observed that a large region of an image consists not of exactly the same shade of blue, but of a small range of very similar shades. The whole region could be approximated by a single-shade region, and the departure from true might not be apparent to observers. More complex lossy methods can involve discovering mathematical functions that approximate the changes in pixel values within a region to any desired degree of accuracy.
Lossy methods may be very good for images, especially if they are only to be viewed by humans, but are not suitable for critical data, where approximating a customer's name or account balance would not be acceptable.

Since bandwidth limitations are a universal fact of life for the computing community, compression methods have been researched intensively. A great many very successful compression algorithms have already been fully worked out and implemented. A programmer faced with the challenge of data compression would be well advised to perform a careful literature search before reinventing the wheel. Compression is possible only when the original data contains some redundancy; compression works by removing or reducing that redundancy. Once data has been well compressed, further compression will not make it any smaller, and in some cases may produce a slight enlargement.

Business value proposition
The use of compression techniques for the transmission and storage of data provides organizations and individuals with the ability to leverage their infrastructure, incurring a relatively low overhead. A principal benefit of compression applications is the ability to store data at a reduced size; while this incurs overhead during compression and decompression, if the data is being stored for the long term (say 7 years for audit and compliance purposes) the storage savings can be considerable and the work to compress the data may be a one-time expense. Data compression on a user's local computer system also allows considerable savings in storage requirements. A popular use of such technology is for the local archival storage of emails in compressed format; this enables users to read old messages if they wish, after decompressing them, but reduces the local storage requirements. Similar processes can be associated with the storage of images or scanned documents. Scanners can encode their images in a bitmap format, producing extremely large but easy-to-process files, but, for storage after processing, a compressed form of the file should be acceptable. All popular image file formats (except BMP) can provide a great deal of compression.

A further aspect of compression that is extremely useful for organizations is the ability to send compressed files over networks, in essence increasing the availability of bandwidth for other tasks.

Network managers can specify processes and enforce policies for the application of compression methods pertaining to their network and its users. They can also ensure that applications whose main focus is the manipulation and storage of large, resource-intensive data files are optimized. These applications include document-management systems that facilitate the use, linkage, version archiving, and long-term storage of heterogeneous file types (image files, text files, web files, XML files, audio files, video, etc.).

Summary of positive issues
Compression allows large files to be stored and transmitted in a smaller size than their normal format. Decompression allows files to be restored to their original format. There are many well-researched and well-supported compression algorithms, tools, and packages available for the purpose of compression. Compression techniques are built into many software applications and automatically generate compressed files for storage. Most files can be significantly compressed with lossless methods if exact reconstruction of the original is needed, and even further compressed with lossy methods if a good approximation to the original is sufficient.

Summary of potentially negative issues
Compression and decompression impose a resource overhead on the system when the operations are performed. If compressed
files are to be transmitted to other users, the recipient must have the decompression software in order to read them. Repeated compression and decompression using high-ratio lossy methods results in Generation loss, in which the data is continuously degraded by each operation.

Reference
- K. Sayood (2000). Introduction to Data Compression, Morgan Kaufmann Series in Multimedia and Information Systems (San Francisco, CA, Morgan Kaufmann).

Associated terminology: Audio, Video, Information lifecycle management.

Computability

Foundation concept: Algorithm.
Definition: What tasks are really impossible to perform with a computer system?

Overview
The concept "impossible" is usually confused with the concepts "I don't know how to do that" and "that isn't within our technological grasp." It is often said that it is impossible for anything to travel faster than light, but the truth is simply that our current theories of physics do not allow anything to accelerate beyond the speed of light. We do not really know that these theories are correct, and human understanding of physics has been completely turned on its head any number of times in the past. People may say that it is impossible to fly to the moon for less than $300; we can't do it now, but we have no idea what technological advances the future may hold.

Similarly in programming, there are many things that we just don't know how to do. True artificial intelligence, a computer mimicking a human well enough that an impartial blind observer can not tell that it is not really a human, has not been achieved, and nobody knows how to do it, but it is not known to be impossible. In fact, we know that it is possible, we just don't know how. The ability might just be round the corner, or we might never find it out.

In the physical world, most things classified as impossible may well not be. With the speed-of-light example, it may be that new discoveries in physics will completely change our beliefs; that does not make impossible things become possible, it simply makes us realize that we were wrong in the first place.

In computer technology, things called impossible are usually merely beyond our reach. An expert may say that 2048-bit RSA encryption is safe, meaning that, with currently foreseeable technology, encrypted messages will remain unreadable. But nobody really knows how technology will develop in the future. Totally new, currently unimagined, designs for computers could render all these assumptions invalid. The idea of a Quantum computer is currently no more than science fiction, but, if a practical one were to be constructed, it would have exactly this effect.

Additionally, there is the possibility of new developments in the theory of computation. Continuing with the same example, one of the great unknowns is the "P equals NP" question, which many theoreticians have devoted their entire careers to trying (and failing) to answer. Nobody knows whether P equals NP or not, and currently it is a very dusty academic subject. But if the answer is ever found, the results will be striking. If it turns out that P does not equal NP, then RSA encryption really is safe. If it turns out that P equals NP, then immediately, without any further work, literally thousands of intractable problems suddenly become easy; cracking RSA encryption (and a lot of other things) instantly becomes viable. What was called impossible is now possible.

To make matters more complex, theoretical computing does know of some things
that are genuinely impossible. Things that, no matter what new technological leaps or theoretical insights occur, must remain completely impossible; problems that if solved would result in a contradiction in logic itself. The study of computability, that is, possibility and impossibility in computer processes, is deeply complex and very easy to misunderstand. As a general rule, only those with advanced degrees in computer science or engineering are even aware that the subject exists, and far fewer have any expertise in the matter.

Business value proposition
Any attempt to do the impossible is likely to be a financial disaster. A belief that something is impossible when it isn't can result in a technological disaster. Whenever a programmer claims that something is impossible, it is important to find out what they really mean by that, and whether they really have a valid basis for the claim. For example, the ability of a program to check itself for correctness is theoretically impossible, and this impossibility derives from Turing's thesis; thus it is important that, should claims such as "self-checking software" be made by vendors, this should be thoroughly examined and understood before a program is used in what could be a critical process, such as one involving a threat to human life.

Summary of positive issues
Computability is a theoretically sound field of study, and significant amounts of research literature exist to help computer scientists and professionals examine the underlying mathematical and logical basis of their programs and specifications.

Summary of potentially negative issues
Computability theory is an extremely specialized and complex field of endeavor, which is primarily studied by researchers and academics. Programmers without advanced course work in computation may have only a limited understanding of and exposure to the discipline and its impact upon their work.

References
- J. Hopcroft and J. Ullman (1979). Automata Theory, Languages, and Computation (New York, Addison-Wesley).
- J. Brady (1977). The Theory of Computer Science (London, Chapman and Hall).
- A. Turing (1936). "On computable numbers, with an application to the Entscheidungsproblem," Proceedings of the London Mathematical Society, 2 (42), 230–265.

Associated terminology: Algorithm, Complexity.

Computer

Definition: A calculating and data-manipulating device capable of storing its own program of instructions and following those instructions without intervention.

Overview
Everybody who is reading this will already have a fairly good idea of what a computer is. Instead of discussing unnecessary technical details, this article will present the major varieties of computer currently available.

Palmtop, PDA (personal digital assistant). These are computers so small that they can be held in the palm of one hand. Generally they have much lower processing power than any other kind of computer (because the very restrictive space requirements leave very little room for powerful batteries) and very restricted storage capacity. Some, but certainly not all, are restricted to running a small range of specially reduced software applications. They have small displays and either miniaturized keyboards or none at all (using a stylus instead), so they can be quite unsuitable for normal office work.
Notebook, laptop, portable. These terms have grown to mean almost exactly the same thing. Originally, a portable computer was one that could be carried around, but doing so was not necessarily a pleasant experience; such computers are generally not made any more. When shades of meaning are intended, a notebook is usually an extra-small laptop.

Desktop. A general term that now covers nearly all computers in common use. Any computer that can fit on a normal desk and still leave enough of the desktop available for other uses.

Server. The term "server" properly applies to a software configuration: a network application that provides some service to a set of client systems on demand. The word is often used to describe a desktop-style computer that is at the higher end of the performance range, and so might reasonably be used for running server applications. The term is not strictly defined, and servers range from a personal computer that runs applications that do not require especially high performance to clusters of high-performance computers that run enterprise-class systems such as ERPs.

Workstation. The counterpart to "server." The term is not strictly defined, but is generally taken to mean a desktop-style computer intended for use by one person at a time. Workstations are not at the top of the performance range.

Thin client. A very-low-power workstation, often one that is used only for display purposes, all the real work being done by a server that receives commands from, and transmits responses to, the thin client over a network connection. A thin client could be a reasonable means to provide restricted public access, perhaps including just web-browser capabilities.

Diskless workstation. A variety of thin client, which has no disks or other permanent storage media of its own, and so is totally reliant upon a server for all operations. Since disks are a significant component in the cost of a computer, diskless workstations can be financially very attractive. The performance of a diskless workstation can be very poor, because it is strictly bounded by available network bandwidth.

Microcomputer, minicomputer, mainframe, super-computer. These are vaguely defined points on the spectrum of overall processing power for computers. A microcomputer is normally taken to be any computer whose entire CPU is on a single integrated-circuit chip. This covers all desktop computers now, and the term has fallen out of use. Originally microcomputers were thought of as essentially toy computers that were not suitable for any serious work. Minicomputers were computers that were really suitable for use by only one user (or a very few concurrent users), but were certainly not just toys. This term has also fallen out of use, since the realms of microcomputers and minicomputers have completely merged to create the desktop computer. A mainframe is a high-power, high-price computer, usually very large and having a staffing and facilities requirement to keep it running (e.g., air-conditioned rooms, automated non-destructive fire extinguishers, and heavy-duty uninterruptible power supplies). Mainframes provide much faster processing and storage access, and much higher input and output bandwidth than other computers, typically being able to support thousands of concurrent users. A super-computer is simply a very fast mainframe, often supporting fewer users and having a lower access bandwidth than a typical mainframe, but providing even higher computing speed.

Embedded system. Not all computers are clearly identifiable objects intended for direct use by people. Often computers are built into other devices as part of the control or monitoring system. Modern cars, microwave ovens, and even washing machines are computer-controlled. An embedded system is a computer that is
Computer Fraud and Abuse Act, 1986
intent is to defraud, and it affects interstate or foreign commerce or a government computer.

Business value proposition
The legislation primarily covers intentional misuse or unauthorized access to federal (US) computers, equipment, and resources. The act also covers general misuse of computing resources (such as those at corporations) resulting in "damage." The legislation acts as a deterrent to misuse and provides the judiciary with the ability to impose penalties upon those found guilty.

Summary of positive issues
The act forbids intentional unauthorized access to a computer system and access that exceeds authorization. The act covers the misuse of access rights to inflict damage upon other systems, including medical and government systems, damage to individuals in the form of physical injury, and threats to public safety. Financial damage from misuse is set at a lower bound of $5000.

References
- 18 USC 1030; Public Law 99–474.
- http://www.usdoj.gov/criminal/cybercrime/1030NEW.htm.

Associated terminology: Virus, Trojan horse, Law cross-reference.

Computer Misuse Act of 1990 (UK only)

Foundation concept: Security.
Definition: The law of the UK which covers unauthorized computer use.

Overview
Section 1 of the act clearly states that a person is guilty of an offence if they cause a computer to perform any function with the intent of accessing any program or data on that computer, if they know that that access is unauthorized. The act explicitly states that the intent does not have to be to access any particular program or data, or even to access any particular computer, so simply releasing a virus or Trojan horse on the world at large is covered if it ever works. The offence can earn up to five years' imprisonment.

Section 3 of the act adds that it is an offence to do "any act which causes an unauthorized modification of the contents of any computer." Section 17 clarifies the position, stating that it is an offence just to cause a program to be run without authorization.

Spyware (q.v.) is, in its most common usage, software designed to provide information from a computer, without its owner's consent or knowledge, to a third party. This is fully covered by Section 1 of the act, which forbids simply accessing data without authorization, regardless of whether any fraud is committed or any harm is done, or even whether any files are modified.

Section 3 covers unauthorized modifications of the desktop setup, changing a web-browser's default "home page," and introducing viruses and worms into the system.

Business value proposition
The legislation primarily covers the intentional misuse of any computer within the UK and its jurisdiction, either directly or through remote means such as releasing a Trojan horse. The act also covers the unauthorized modification of a database or the contents of a computer. The act is also powerful in that it covers the situation when a user has accessed data to which they are not authorized.

The act provides a strong deterrent against the misuse of computing resources at corporate level by malicious or even over-inquisitive individuals.
business, or data need but are not usually considered "standards" in the universal sense of those supported by bodies such as ISO and ANSI. However, should the technology developed by a vendor become popular, indiscernible, or even forced upon customers, such protocols may be known as "industry standards," but should still be considered proprietary in nature.

There is a fourth set, termed "open standards," for which the specifications and protocols are explicitly available to anyone who wishes to adopt them or modify them.

Business value proposition
A wide variety of connectivity standards is available to facilitate connection between hardware, software, and network devices, through both wired and wireless media. The use of widely adopted standards allows organizations to connect with a wide spectrum of users who have also adopted that standard. International and national standards are typically adopted and used by technology vendors to enable their products to connect with other products, and are developed to maintain compatibility between vendors. Open standards allow organizations to amend and modify technology protocols to meet their particular requirements.

Summary of positive issues
There exists a wide range of connectivity standards that allow entities to communicate in a variety of ways and incur the overhead that is appropriate for their requirements. International, national, and industry-specific standards tend to have wide support, and established consulting resources are available to establish, maintain, and support them.

Summary of potentially negative issues
The adoption of proprietary connectivity standards imposes a degree of dependency upon the vendor, and it may be necessary to upgrade or change at the vendor's pace rather than independently. Such standards, not developed by authorizing bodies, might not become adopted by a wide user base and could lead to problems in connecting with other entities that adopted what turned out to be the "industry best practice" or de facto standard.

Some vendors do not always support all the connectivity standards, and a compromise may be required in order to enable communications. For example, the development of a middleware solution or simply the adoption of a second choice, which is a more supported standard, may be required.

Associated terminology: Protocol, Client–server, W3C, IEEE.

Cookies

Foundation concept: e-Commerce.
Definition: A cookie is a small packet of information created by a web server and stored on a web-browsing computer without the knowledge of the computer's owner. It contains information about recent transactions with the server, and is used to identify returning browsers and provide some context for their visit.

Overview
Browsing the web is a Stateless activity. This means that, when a browser visits a web site, an internet connection is made between the browser and the server, the browser transmits a request for a document or page of information, the server transmits the requested information, and the connection is terminated. If a second document is needed from the same server, a new connection may be required. No information survives from one request to the next. Even if a complex interaction is under way, in which the user is completing a multi-part Form to supply essential data for a transaction, each stage is completely disconnected. A web server works by listening for anonymous requests for data, servicing those requests, and immediately forgetting them.
those requests, and immediately forgetting them.
This would make reliable commercial transactions impossible. If the browser's order for a product, the server's response with the price, and the browser's provision of credit-card data and shipping address can not be kept together, the system is not going to work. Cookies are one possible solution to the problem.
A web server can create a cookie that contains within it essential details of a transaction, and send it to the web browser along with the information that is to be viewed. The browser will store the cookie, possibly permanently, on disk. Every time a connection is made to a server, the cookies previously sent by that server are sent back to it. Thus the server can reconstruct earlier parts of the transaction. The server must usually record all cookies that it has sent for validation and other purposes.
Cookies can persist on the browser's computer for an unlimited time, and create a record of the owner's web-browsing history that the owner may well rather not have exist. There is a widespread perception that cookies can extract private information from a computer and secretly transmit it back to a server. There can be no doubt that some insecure systems have allowed cookies to be released to unauthorized third parties, creating a potentially severe violation of privacy and contributing to identity theft. Cookies are not active; they can not "scan" a computer or erase its files. It is technically possible for a virus to be transported by a cookie, but some other means would be required to activate it.

Business value proposition
Bad publicity surrounding cookies generates dissatisfaction with sites that insist on using them. Other solutions to the stateless-server problem may be available and should be considered. However, there is nothing inherently wrong or privacy-eroding about cookies, and when they do provide the best solution, some clear statement of policy may calm users.
The "cookie" debate is centered upon third-party privacy issues, where companies use cookies to develop "one-to-one" marketing campaigns aimed at the individual computer and its user. Primarily, a company will contract with a marketing company to develop profiles of its customers in order to develop customized responses and advertising. Typically, the marketing company places cookies on their clients' sites and, since a cookie can be associated with any object (e.g., a banner advertisement, download icon, or web page), the profile for individuals' surfing habits can be developed for that specific company's web site.
Marketing companies also aggregate data by placing cookies on multiple sites; thus, when a customer visits any of these sites, the information pertaining to a particular cookie is not actually sent directly back to the company whose site is being visited, but rather the information is sent to the marketing company, allowing aggregated data to be stored. The aim of this practice is to create an overall score for an individual that can then be passed on to a client company, summarizing an individual customer's total web usage.
In light of the controversial practice of cookie aggregation, the Internet Engineering Task Force, an influential body through which internet standards and policies are developed, is examining the issue of "third-party" cookie requests and is developing policy positions that may limit or stop such practices being employed without the customers' consent.

Summary of positive issues
Cookies are a simple device that eases and facilitates internet and electronic-commerce transactions. They are universally available on all platforms and permit reliable transactions to take place.
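The exchange described above can be sketched in a few lines of Python using only its standard library. The HTTP header names are real; the session identifier and the server's session store are invented for illustration, and a production server would also set expiry and security attributes on the cookie.

```python
# A minimal sketch of the cookie mechanism: the server issues a cookie
# naming a transaction, the browser returns it on a later connection, and
# the server uses it to reconstruct the earlier part of the transaction.
from http.cookies import SimpleCookie

sessions = {}                              # the server's record of cookies it has sent

def server_first_response(order):
    session_id = "txn-001"                 # hypothetical identifier
    sessions[session_id] = {"order": order}
    cookie = SimpleCookie()
    cookie["session"] = session_id
    return cookie.output()                 # the "Set-Cookie: ..." header line

# The browser stores the cookie and sends it back on the next request.
set_cookie = server_first_response("75 brown cows")
stored = SimpleCookie()
stored.load(set_cookie.split(": ", 1)[1])
next_request = "Cookie: " + stored.output(attrs=[], header="", sep="; ").strip()

# On the new, otherwise anonymous connection, the server regains context.
returned = SimpleCookie()
returned.load(next_request.split(": ", 1)[1])
earlier = sessions[returned["session"].value]
print(earlier["order"])                    # prints: 75 brown cows
```

Each connection here is independent, exactly as in the stateless exchange the overview describes; only the returned cookie links them together.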
Summary of potentially negative issues
The negative public perception of cookies has grown from the use of cookies by third parties who aggregate the cookie data and profile the users, which can be perceived as an invasion of privacy by computer users, or even an unacceptable security risk for many users.

References
• D. Kristol and L. Montulli (2000). "RFC 2965: HTTP State Management Mechanism" (proposed standard), http://www.ietf.org/rfc/rfc2965.txt.
• http://www.cookiecentral.com/dsm.htm.

Associated terminology: Web server, Security.

CPU (central processing unit)

Foundation concepts: Hardware, Computer.
Definition: The functional unit in a computer responsible for performing arithmetic and logical operations, carrying out the execution of programmed instructions, and controlling the whole system.

Overview
In the days of mainframe computers, the circuitry that performed the arithmetic and control functions of a computer would usually occupy at least one quite large and impressive-looking box, adorned with flashing lights and switches. The memory would occupy another nearby box, and there would be a lot of other large boxes also connected to it, responsible for controlling disk drives, magnetic tapes, printers, and so on. These boxes had to be given names, and were called the Central processing unit (or CPU), Memory unit, and I/O (input and output) unit, respectively. These designations have survived as names for identifiable parts of modern computers, although they are much less significant, and serve more to give students something to memorize and be tested on than for any usefully descriptive purpose.
In a desktop or personal computer, the term CPU is now usually used for the single chip (integrated circuit) that provides the "brains" of the computer, such as a Pentium IV, a G4, or an Alpha, to name but a few, even though it also contains a significant amount of cache memory and I/O control. The term Microprocessor is usually used for any CPU that consists of a single integrated circuit.
There are three distinct schools of thought on overall CPU design philosophy. The traditional philosophy holds that CPUs should be designed with as many bells and whistles as possible, able to perform complex operations such as evaluating polynomials and rotating four-dimensional vectors with single instructions. This is known as the CISC (Complex Instruction Set Computer) design, and probably reached its acme in the now-discontinued DEC VAX line of processors. An alternate philosophy is that, if the processors are made as simple as possible, limiting their abilities to performing simple arithmetic and logical tests, they will be able to work much more quickly and efficiently. This is known as the RISC (Reduced Instruction Set Computer) design, and probably reached its acme in the now-discontinued Inmos Transputers. A third philosophy is that CPUs should not be powerful or complex or fast; a large number of very cheap processors cooperating on one task can do a better job than one very fast powerful processor. This is the CSP (Communicating Sequential Processes) design, and is currently re-emerging in IBM's Cell Computers.
Which of the three philosophies is chosen may have little effect on the end user, or even on applications programmers, but, for organizations involved in low-level (operating systems and languages), real-time, or video-game programming, the basic design of the CPUs to be used is a major consideration.
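The CISC/RISC contrast above can be shown in miniature with the entry's own example: evaluating a polynomial. What a CISC design might offer as a single complex instruction becomes, in the RISC style, a loop of simple multiply and add steps (Horner's method). The function names are invented; this illustrates the philosophies, not any real instruction set.

```python
def cisc_polyeval(coeffs, x):
    """One 'complex instruction': the whole polynomial in a single step."""
    return sum(c * x ** i for i, c in enumerate(coeffs))

def risc_polyeval(coeffs, x):
    """The same work as a sequence of simple operations: each loop pass
    performs just one multiply and one add (Horner's method)."""
    acc = 0
    for c in reversed(coeffs):
        acc = acc * x + c      # MUL, then ADD
    return acc

coeffs = [5, 0, 3]             # represents 5 + 0x + 3x^2
print(cisc_polyeval(coeffs, 2), risc_polyeval(coeffs, 2))   # prints: 17 17
```

The result is identical either way, which is why, as noted above, the choice of philosophy matters mainly to compiler writers and low-level programmers rather than to end users.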
Cracking
When an expensive software application is sold, the customer must have some way of installing it on their computer, and re-installing it after something goes wrong; this is easily handled when software packages are distributed on CDs. Since the early 1990s it has been very easy for anyone to make copies of any CD. This causes a significant problem for the software manufacturer, since one legitimate sale may result in hundreds of black-market copies entering into circulation. In the age of high-speed personal internet connections, it is very common to find the entire contents of popular CDs freely downloadable from public web sites. A common solution is for software manufacturers to design their products so that they will work only after a secret identification code has been entered; the product may communicate with headquarters (over the internet) to ensure that no two products have been activated with the same key. Manufacturers must design a system that allows them to generate a very large number of different keys, all of which will be accepted by the software, whilst keeping it impossible for customers to generate those keys for themselves.
There is a major underground industry devoted to cracking all of these applications of software security. Anyone with an internet connection can freely download cracking kits that do a surprisingly good job of discovering any poorly chosen passwords. For most software that requires an activation key, there is software available that will instantly generate new valid ones. Of course, every major government has a department devoted to cracking all of the major encryption algorithms (we may take some comfort from the fact that there is no evidence of success in this direction for any of the modern strong methods).
The distinction between a true crack, software that actively discovers passwords or generates activation keys, and simply publishing a known password or key, is vitally important. Many "cracks" web sites do simply provide a list of known keys, many or all of which have simply been copied from people who legitimately bought the software in question. This is not a crack, and it is a simple matter for a savvy software manufacturer to make each of the published keys invalid. Once a key-generation method has been discovered and made public, there is very little that a manufacturer can do about it without cutting off all of their legitimate customers.

Business value proposition
The "industry" of cracking is an aspect of the software culture that developers and network owners need to be aware of. Understanding the risks associated with password and access-control mechanisms is a vital aspect of controlling intellectual-property ownership and security.

Summary of positive issues
Security and control of passwords and access mechanisms is an aspect of systems development and ownership that is controllable by the use of carefully designed and conscientiously implemented formal processes to manage access, storage, and use of critical information.

Summary of potentially negative issues
Cracking is endemic and a part of the software culture; it is unlikely to go away.

Associated terminology: Hacker, Encryption.
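The activation-key arrangement described above can be sketched as follows: the manufacturer holds a secret and can generate a very large number of keys that the software will accept, while customers who see only published keys cannot produce fresh ones. The scheme shown (an HMAC tag over a serial number) is illustrative, not any real product's method.

```python
# Illustrative activation keys: serial number plus a tag that only the
# holder of SECRET can compute. The secret and formats are invented.
import hashlib
import hmac

SECRET = b"factory-secret"   # known only to the manufacturer

def make_key(serial: int) -> str:
    tag = hmac.new(SECRET, str(serial).encode(), hashlib.sha256).hexdigest()
    return f"{serial:06d}-{tag[:8].upper()}"

def accept_key(key: str) -> bool:
    serial = key.split("-")[0]
    if not serial.isdigit():
        return False
    # The software recomputes the canonical key and compares in full.
    return hmac.compare_digest(make_key(int(serial)), key)

genuine = make_key(4251)
assert accept_key(genuine)
# Tampering with even one character of a published key is detected:
tampered = genuine[:-1] + ("0" if genuine[-1] != "0" else "1")
assert not accept_key(tampered)
```

Publishing genuine keys, the common "cracks" site practice, works only until the manufacturer invalidates those serials; recovering the secret or the generation method defeats the scheme entirely, which is exactly the distinction the entry draws.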
Database
[…]in some electronic form, this will require some simple but specialized programming to perform the data conversion automatically. If the data does not exist in electronic form, then a long period of manual data entry will be required. Paper forms can be automatically scanned, and, although scanners now have very good resolution and high reliability, the written information must be converted into a computer-recognizable digital form. Optical character recognition (OCR) technology is not sufficiently advanced, especially for decoding hand-written characters, to be relied upon for commercially or legally essential data. The time spent verifying and correcting the results of automatic OCR can be as much as the time taken to enter the data manually.
Once the data is entered, a DBMS application usually provides some default user interface in the form of an interactive Query language, the most popular of which is SQL, the Structured Query Language (pronounced "Ess Cue Ell"; Sequel is a slightly different thing, a complete DBMS software package, rather than just the language). SQL is a standardized language for all database operations; it covers database creation, and record entry and deletion, not just queries. It can be mastered by nearly all after some technical training, but it can not be reliably used by unskilled, untrained employees, so most organizations also require some significant expenditure in software development to produce a safe and reliable user interface. Commercial DBMS software, especially that intended for use on desktop and similar systems, often provides a more intuitive and user-friendly graphical interface, but it still requires a fair amount of training before it can be used effectively.
Nearly all database systems in use today follow the Relational database model. That means that some real information is represented not by records in single tables, but by the relationship between data items in different tables. For example, to record the fact that Q. X. Smith sold 75 brown cows to Amalgamated Meats Incorporated for $15 000 on July 4, 1993, there might be an entry in the employees table indicating (amongst other things) that there is an employee named Q. X. Smith with ID number 3636; there might be an entry in the products table indicating that there is a product named Brown Cow with UPC 2135712; there might be an entry in the customers table showing a customer named Amalgamated Meats Incorporated with reference number 25008; and there might be a record in the sales table containing just the six numbers 3636, 25008, 2135712, 75, 15000, and 19930704. This organization saves a lot of space in the data files, and goes a long way toward ensuring data consistency (see Normalization for further explanation), but does mean that the procedure for answering simple queries like "What is the total value of sales made by Q. X. Smith?" becomes quite complex.
DBMSs, even light-weight single-user versions for personal computers, are often not stand-alone applications, but a client--server pair. A client provides the front end, interacting with the user and composing correct queries, perhaps even rendering the user's commands into SQL. The command is sent to the Database server, a second application, not necessarily on a different computer, for processing. The eventual responses are then reprocessed by the client for display in a suitable format. The use of a database server even when only one computer is involved simplifies design, and is a great aid to scalability and accessibility. It is a simple matter to change the connection to a remote server when conditions demand, but continue to work as before. When a database becomes too large to handle locally, multiple database servers are used, making a Distributed database.
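The four-table example above, and the "quite complex" query it leads to, can be made concrete in SQL using Python's built-in SQLite driver. The table and column names are invented for illustration.

```python
import sqlite3

db = sqlite3.connect(":memory:")
db.executescript("""
    CREATE TABLE employees (id INTEGER PRIMARY KEY, name TEXT);
    CREATE TABLE customers (ref INTEGER PRIMARY KEY, name TEXT);
    CREATE TABLE products  (upc INTEGER PRIMARY KEY, name TEXT);
    CREATE TABLE sales (emp INTEGER, cust INTEGER, upc INTEGER,
                        qty INTEGER, value INTEGER, day INTEGER);
""")
db.execute("INSERT INTO employees VALUES (3636, 'Q. X. Smith')")
db.execute("INSERT INTO customers VALUES (25008, 'Amalgamated Meats Incorporated')")
db.execute("INSERT INTO products VALUES (2135712, 'Brown Cow')")
# The sale itself is just six numbers relating the other tables:
db.execute("INSERT INTO sales VALUES (3636, 25008, 2135712, 75, 15000, 19930704)")

total = db.execute("""
    SELECT SUM(s.value)
    FROM sales s JOIN employees e ON e.id = s.emp
    WHERE e.name = 'Q. X. Smith'
""").fetchone()[0]
print(total)   # prints: 15000
```

The query must join the sales table back to the employees table to turn the bare ID 3636 into a name; that is the extra work the relational organization demands in exchange for its savings in space and consistency.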
Database administrator (DBA)
[…]database-related products, e.g., data warehouses, and products that interact with database systems, e.g., ERP systems. The DBA is responsible for the physical database design, including changing existing databases to accommodate modified or new applications. Other duties include the training of employees in database technologies and the development of contingency-planning activities with the CIO and chief security officer so that any data corruption or loss can be remedied quickly and effectively.

Business value proposition
The DBA performs a very important technical function within any IT organization, managing and developing the corporate database so that it runs efficiently and effectively. DBAs should be technically highly skilled individuals who understand the inner workings of database systems. This takes extensive training and knowledge of the system upon which they are working. It is essential that the training of the DBA is regularly enhanced, since the technologies of the systems upon which they work continue to evolve. Similarly, the regulatory environment in which the corporation and its systems operate also changes, and DBAs need to be educated in their regulatory requirements, e.g., Sarbanes--Oxley and HIPAA.
The role of the DBA is undergoing a change in terms of the nature of the databases upon which they operate. The advent of complex packages such as ERP systems requires specialized skill sets in order to be able to understand those systems and their database functionality. The databases of these systems are not intended to be managed in the same way as the traditional databases of the past. ERP databases are highly specialized and extremely complicated, so customization of their structures, even by experts, is highly ill-advised, because the systems are self-configuring, requiring little outside intervention.

Summary of positive issues
DBAs act as the controlling authority for any corporate data, database, or system relating to data. DBAs are highly trained and skilled in database technologies. DBAs act to ensure the security of their database systems and to enforce regulatory requirements.

Summary of potentially negative issues
DBAs are required to be highly skilled and must have on-going educational training. Employment of a weak or under-skilled DBA can lead to catastrophic database problems. Some systems require very specific skills not typically found in a DBA, e.g., knowledge of ERP systems.

Reference
• C. Mullins (2002). Database Administration: The Complete Guide to Practices and Procedures (New York, Addison-Wesley).

Associated terminology: Data Protection Act, Sarbanes--Oxley, HIPAA.

Data-flow diagram

Foundation concept: Software development lifecycle.
Definition: Data-flow diagrams are graphical models that show the flow of data through a system, the external sources and destinations of data, the processes that transform the data, and the locations where the data is stored.

Overview
The concept of a logical data-flow model, usually known simply as a data-flow diagram (DFD), is used in the analysis and design phases of the system's lifecycle as a communication tool to understand, verify, and validate the system requirements.
A DFD is constructed using the following symbols:
A process (e.g., check for stock availability, process credit-card payment):
[DFD symbol figures omitted]
[…]modeling are performed. Traditionally, using structured techniques, process modeling was performed prior to data modeling. However, today, due to the influence of information engineering, data modeling precedes process modeling.
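Although a DFD is drawn graphically, its content is just entities, processes, stores, and the flows between them, and can be captured as plain data for mechanical checking during analysis. The diagram below is invented for illustration, reusing the process examples from the text.

```python
# A toy data-flow model as (source, destination) pairs; entity and store
# names are invented, the processes come from the entry's examples.
flows = [
    ("Customer", "Check stock availability"),         # external entity -> process
    ("Check stock availability", "Orders store"),     # process -> data store
    ("Orders store", "Process credit-card payment"),  # data store -> process
    ("Process credit-card payment", "Customer"),      # process -> external entity
]

# One consistency check an analyst might automate when validating
# requirements: every node that receives data also sends data onward.
sources = {src for src, _ in flows}
targets = {dst for _, dst in flows}
black_holes = targets - sources
print(sorted(black_holes))   # prints: []
```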
[…]as Semantic data models, Object models, the Unified modeling language, which facilitate the mapping of process models into constructs that are more rigorous and are easier to map onto more modern programming environments such as Object-oriented programming.

Reference
• T. DeMarco (1978). Structured Analysis and System Specification (New York, Yourdon Press).

Associated terminology: UML, Normalization.

Data mining

[…]and the product consumed, or did they buy more of another product in association with the promoted product?
The theory and practice of data mining continue to develop and evolve. A relatively new branch of data mining is Text mining. Much of the data on the internet and in databases is actually textual in nature rather than numeric (e.g., web pages containing normal English prose). This has led to a wide variety of techniques being used to analyze the text, its patterns, and its content. The goals for undertaking text mining are the same as those for quantitative data mining, namely the identification of new and novel data patterns within seemingly heterogeneous data.
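The promotion question posed above is, at bottom, a counting exercise over transaction records. A toy version, with invented purchase data:

```python
# Did buyers of the promoted product also buy another product in
# association with it? Count what else appears in baskets containing it.
from collections import Counter

baskets = [
    {"promoted snack", "cola"},
    {"promoted snack", "cola", "bread"},
    {"bread", "milk"},
    {"promoted snack", "cola"},
]

with_promo = [b for b in baskets if "promoted snack" in b]
companions = Counter(item for b in with_promo
                     for item in b - {"promoted snack"})
print(companions.most_common(1))   # prints: [('cola', 3)]
```

Here "cola" appears in every basket containing the promotion; finding such association patterns across millions of transactions is the kind of work a real data-mining system automates.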
Data pool

[…]

Summary of potentially negative issues
The conversion of data to the UCC standards can be a challenging, resource-intensive problem.

References
• http://www.worldwideretailexchange.org/cs/en/index.htm.

Data Protection Act (UK only)

[…]
(1) the individual has consented to the processing;
(2) processing is necessary for the performance of a contract with the individual;
(3) processing is required under a legal obligation (other than the one imposed by the contract);
Data quality audit
[…]can be many terabytes in size and it would clearly be impracticable to survey the whole data set. Stage 3 involves the establishment of a base rule set to use in conjunction with the sample data set (e.g., the volume held in a tanker transporting fuel oil can not be less than zero or greater than the tanker's capacity). Stage 4 involves expanding the rule set, establishing confidence levels, and designing the output report structures. Stage 5 involves encouraging the user community to examine and modify rules, recalculate the confidence intervals, and adjust any reports that will result from the audit. Finally, in stage 6, the analysis is performed, and results are collected and examined.
A poor result in a DQA should lead an organization to re-examine the data-entry and process models around which the systems are based. It is possible to clean up the data and re-establish data integrity, but this can be a difficult and expensive proposition that may also involve cutting out data that is out of bounds and can not be corrected through knowledge available from other data sources.
An alternative to modifying a data set is to establish a new data set through a process known as ETL (extraction, transformation, and loading of data), a process that is usually related to the establishment of a Data warehouse.

Business value proposition
DQAs are essential components of a corporation's data management process. The older the data set, the higher the probability that it contains many errors. Modern information systems such as ERP systems, which have only one instance of a data item in their singular database, attempt to minimize the opportunity for data corruption to occur, both in the form of data-entry constraints and through detecting errors in the applications themselves.
The creation of a new data warehouse or the establishment of an ERP system allows the historical data to be dumped and written off, or to be scrubbed through the ETL process; this is a great opportunity to re-establish the integrity of the data set.

Summary of positive issues
DQAs facilitate the establishment of statistical data showing the quality of a data set. Tools, methodologies, and consultants are available to support the DQA process.

Summary of potentially negative issues
A large amount of resources may be needed to audit large old data sets and systems. Syntactic errors in data are easier to identify than semantic ones. Data sets are no longer just numerically based and may include heterogeneous data sets, e.g., XML files, pdf files, graphical images, and audio files, which are more difficult to audit.

Reference
• M. Gonzales (2004). "The data quality audit," Intelligent Enterprise, July.

Data warehouse

Foundation concepts: Database, ERP, File server.
Definition: A data warehouse is a file server that contains data that has been extracted, transformed, and loaded into it from other databases. The data is then accessible for analytic programs, commonly known as business intelligence software, to run upon it. The results are frequently presented in a graphical format.

Overview
The term Data warehouse is used to describe a dedicated facility for aggregation and storage of data. Data warehouses are repositories of data that has been extracted from corporate systems. The data undergoes a transformation process (or is Scrubbed, as it is sometimes called) in which the data is checked for type and consistency, before it is loaded into the relational database of the warehouse where the information
is stored. This process is known as ETL (extraction, transformation, and loading). Once the data has been loaded, it is available to be manipulated and examined so that analysis may be performed upon it.
Data warehouse analytic software allows the data to be examined from multiple perspectives, termed Multi-dimensional data views (or Slicing and dicing). This analysis is also frequently termed Cubical analysis, since the data is often manipulated into a three-dimensional cube, but more dimensions are possible. For example, a data warehouse for a computer manufacturer may be structured in three dimensions, of which the first is a dimension that is based upon the sales markets: north, west, south, and east. A second dimension may be based upon product type (PC, PDA, laptop, and server), while the third dimension may be sales by quarter. This would provide the basis for data analysis by "slicing and dicing" these three dimensions. The programs that run on the data warehouse allow the analyst to drill down on the data, perhaps starting with an analysis of sales for all the markets the company serves over one year, then drilling down to a quarterly sales period, then drilling down to a given month, and then again down to a given week or day.

Business value proposition
The use of a data warehouse allows a business to have a central repository for historical data. A data warehouse removes the need to keep the data in other online systems such as the organization's ERP. The data warehouse facilitates the backup and security of the data. The data after ETL will be consistent, verified, and archived. The executive information system software or Business intelligence software that runs on the data warehouse allows the data to be examined at multiple levels of detail and the results to be presented in a variety of formats, including graphical ones. The ability to quickly manipulate data in this way allows for focused one-time inquiries, with trends and exceptions being processed easily, and thus facilitates rapid decision making. This type of analysis would be expensive and very difficult in traditional databases.

Summary of positive issues
Data warehouses are secure data repositories whose data has been "scrubbed" and cleaned, resulting in a database with high integrity. The data structure used to store the data, namely the data cube, facilitates rapid data analysis and a highly flexible approach to business intelligence querying.

Summary of potentially negative issues
Data warehouses require dedicated servers if they are to be most effective, and the data extraction, transformation, and loading (into the data cube itself) can be difficult; thus a careful return-on-investment analysis needs to be undertaken when developing such systems.

References
• W. Inmon (1996). Building the Data Warehouse (New York, John Wiley and Sons).
• E. Vitt, M. Luckevich, and S. Misner (2002). Business Intelligence (Redmond, WA, Microsoft Press).

Associated terminology: Business intelligence, Data mining, Database, OLAP, ETL.

Decision support system (DSS)

Foundation concept: Business intelligence.
Definition: A decision support system is an application through which managers and executives can analyze corporate data.

Overview
Decision support systems (DSSs) are the predecessors of what has become known […]
Denial-of-service attack
[…]to prevent proper use of a system. Because of this, many self-righteous DoS attackers manage to convince themselves that they are not really doing anything illegal.
A DoS attack is often designed to exploit a known error in a popular operating system, and, since there are so many known errors in popular operating systems, there is a very wide variety of attack methods available to the attacker. A very popular attack a few years ago was known as the "ping of death," in which a deliberately invalid message was sent using the ICMP protocol, and, due to a bug in the operating system, receiving that message would immediately crash the target computer. No permanent harm done, just reboot and it's running again, but all it took was a very simple program, easily downloaded from the internet, to send the same message automatically every 5 minutes, rendering the target computer completely useless. To this day, many installations have disabled their "ping" service even though the bug has long since been fixed. Many responses to attacks seem to be based more on superstition and voodoo than on sense. Some attacks make use of known bugs in networked applications, where reception of a particular message might result in the target application filling a whole disk with bogus files, or embarking upon a non-terminating computation.
Another form of DoS attack requires no knowledge of any operating-system flaws at all. Simply flood the target computer with as many internet-connection requests as you can send; it will be unable to process them all, so normal requests from legitimate users will be crowded out.
A common means of mounting DoS attacks untraceably is to take over an unwitting third-party computer, or preferably a lot of them, through virus and Trojan-horse software, and use those computers to act as an army of Zombie computers sending the flood of messages. This is called a Distributed denial-of-service attack, and is especially effective because hundreds of computers can be used in unison to create an unsurvivable flood on a whole target network. The messages that cause a flood attack can be traced back to their source, but the source often turns out to be an innocent third party's unsecured home computer, which may have seemed to be running quite slowly, but would have shown no other sign of having been compromised.

Business value proposition
There is no positive aspect for businesses regarding a DoS attack; the only thing that organizations and individuals can do is guard against them. This entails having a strong security policy in place and understanding what the attacks are and how they break into an organization or take over a computer.
A strong and continuously updated firewall system is the best form of defense, which should be combined with regular systems checkups to ensure that the system has not been taken over and become a zombie controlled from outside the organization. While it is important that organizations maintain an understanding of the latest DoS strategies employed and defend themselves against them, it is also important that, when a technique has been defeated, the company assess whether systems "lockdowns" are still necessary. Such was the case when the "ping of death" was a peril to systems; organizations permanently closed down their "ping" service, denying access to a useful utility, even though the bug through which the DoS attack was being made has subsequently been resolved.

Summary of potentially negative issues
DoS attacks can be problematic for organizations and result in systems outages. These outages may be prolonged and persist until a fix has been found by the operating systems vendor, security software
vendor, or other security specialist, depending upon the type of attack.

Reference
• J. Mirkovic, S. Dietrich, D. Dittrich, and P. Reiher (2004). Internet Denial of Service: Attack and Defense Mechanisms (Englewood Cliffs, NJ, Prentice-Hall).

Associated terminology: Virus, Trojan horse.

DHCP (Dynamic Host Control Protocol)

Foundation concepts: Internet protocol, Network.
Definition: A system for temporarily allocating IP addresses to computers on an as-needed basis.

Overview
Every computer or network-enabled device connected to the internet must have an IP address (internet-protocol address). An IP address is four small numbers separated by dots (such as 221.39.107.82), which uniquely identify the computer, distinguishing it from all others. All information transmitted over the internet uses IP addresses to specify its destination; a computer without an IP address will be unable to receive or transmit anything.
Long runs of consecutive IP addresses are allocated by a central authority (see Domain name) to individual organizations, and those organizations are responsible for allocating the individual IP addresses to the individual computers within their domain. In the early days of the internet, this scheme worked very smoothly, but demand has reached such levels that there are not enough IP addresses to go round. IPv6 (the new version of the internet protocol, in which addresses are much longer) will eventually relieve some of the problems, but is still not in general use.
DHCP, the Dynamic Host Control Protocol, […] computers and other network devices that are normally connected at any one time, but not enough for every single device that they are responsible for. This is a very common situation for internet service providers (ISPs), for which only a fraction of their customers will be actually using the internet at any given time.
When a DHCP-enabled device first attempts to access the network, it must negotiate with a DHCP server for an address. If there is an IP address available, the DHCP server will allocate it temporarily to the requesting device, giving it a Lease on the number with a specified expiration time. The device may then use that IP address as its own. When the device disconnects, or when the lease expires, the DHCP server simply reclaims the allocated IP address, and the device's network communications stop working. A device may request a lease renewal at any time before expiration, and it is granted if demand is not too high.
DHCP allows a large set of computers to share a not-so-large set of IP addresses, but is successful only if demand for addresses does not exceed the number available. For end-user systems DHCP provides a generally satisfactory service, but it is not at all suitable for systems that play host to permanent or long-term network services, such as web servers and email servers. In order to be found by their clients, servers must have a known, and therefore non-variable, IP address.
Network address translation (NAT) is another technology that may be used either in conjunction with, or instead of, DHCP; it helps to solve the problem of having too few IP addresses available to meet demand.

Business value proposition
Network administrators use DHCP to overcome the problem of not having sufficient
col, is a very popular solution. An orga- IP addresses available to them to allocate
nization may have enough IP addresses one IP address per device. Thus, assum-
allocated to it to cover the number of com- ing that the network is to be connected
101
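The lease cycle described in the Overview (request, temporary allocation, renewal, and reclamation on expiry) can be sketched as a toy address pool. This is an illustration only, not how a real DHCP server is written: real servers follow RFC 2131 and negotiate through DISCOVER/OFFER/REQUEST/ACK messages, and every name and address below is invented for the sketch.

```python
import time

class LeasePool:
    """Toy DHCP-style allocator: a small pool of addresses handed out on short leases."""

    def __init__(self, addresses, lease_seconds):
        self.free = list(addresses)       # addresses not currently leased
        self.leases = {}                  # device id -> (address, expiry time)
        self.lease_seconds = lease_seconds

    def expire(self, now):
        """Reclaim any address whose lease has run out."""
        for device, (addr, expiry) in list(self.leases.items()):
            if expiry <= now:
                del self.leases[device]
                self.free.append(addr)

    def request(self, device, now=None):
        """Allocate (or renew) an address for a device; None if the pool is exhausted."""
        now = time.time() if now is None else now
        self.expire(now)
        if device in self.leases:         # renewal before expiry keeps the same address
            addr, _ = self.leases[device]
        elif self.free:
            addr = self.free.pop(0)
        else:
            return None                   # demand exceeds the number of addresses
        self.leases[device] = (addr, now + self.lease_seconds)
        return addr

pool = LeasePool(["10.0.0.2", "10.0.0.3"], lease_seconds=3600)
a = pool.request("laptop", now=0)     # gets "10.0.0.2"
b = pool.request("printer", now=0)    # gets "10.0.0.3"
c = pool.request("phone", now=10)     # None: every address is currently leased
d = pool.request("phone", now=4000)   # earlier leases have expired, so an address is reclaimed
```

The last two calls show both sides of the trade-off: while every address is leased out, new devices are refused, but as soon as leases lapse the same small pool serves a new population of devices.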
Digital cash
… things. Since copying can not be prevented, security is instead based on preventing double-spending. Somehow, when a vendor receives a digital cash payment, they must be able to check immediately that it has not already been spent, and at the same time ensure that it can not be spent again in the future. To be truly cash-like, it would still have to be possible for the receiving vendor to spend it legitimately, so illegitimate double-spending is an extremely difficult problem to overcome.

Also, to prevent double-spending, all pieces of digital cash must in some way be different from all others. If they were not unique, it would be logically impossible to distinguish genuinely different tokens from illicitly double-spent copies. But, if all tokens are unique and digital, it would be possible for the issuer to record all tokens issued together with the identity of the person to whom they were issued, thus introducing traceability and missing the point of digital cash.

Business value proposition
The processes associated with implementing digital cash have been in existence since the 1990s; however, they have received very little publicity due to their complexity and the need to change the payment behavior of a large group of the population. The two types of digital cash, anonymous and identified cash, both have many issues associated with them.

Anonymous digital cash allows people to transfer monies without a trace being left, which would be a problematic issue for governments that rely upon a paper trail to prevent unlawful activities. Physical cash makes it hard for a criminal to carry around millions of dollars, pounds, or yen without attracting attention.

Identified cash has problems associated with it that stem primarily from the fact that every transaction would be traceable, and complete profiles of an individual's spending habits could be created. While governments may enjoy the ability to identify all taxable incomes, the impact that such a transaction trail would have upon society has limited the political will to implement such systems.

The current limits of deployment of digital cash include the implementation of systems for electronic payment from bank to bank (e.g., to pay a water bill or the mortgage), which are in effect wire transfers; systems for electronic payment from bank to a credit card (e.g., this can be used to make payments related to online auctions); and digital-cash-based debit cards (e.g., in the United States welfare recipients use electronic benefits transfer (EBT) systems in which a debit card has replaced the use of paper "food stamps"). However, none of these systems approach true digital cash, and the probability of large-scale use of such systems in the near future is low.

Summary of positive issues
Digital cash provides the ability to transfer money electronically from one person to another. The use of identifiable digital cash allows governments and individuals to trace payment trails. Anonymous digital cash can reduce the complexity of transactions, and increase customer confidence.

Summary of potentially negative issues
With identified digital cash governments can monitor all digital transactions, and small payments may incur disproportionate transaction costs. With anonymous cash the lack of a transaction trail may facilitate illegal activities. There are significant problems in developing processes to prevent double-spending, ensuring proof of ownership, and backing up cash, and as a result the digital cash protocols are extremely complex.

References
- D. Chaum (1983). "Blind signatures for untraceable payments," in Advances in
Cryptology CRYPTO '82, Lecture Notes in Computer Science, ed. D. Chaum, R. Rivest, and A. Sherman (New York, Plenum Press).
- D. Chaum (1985). "Security without identification," Communications of the A.C.M., October.

Associated terminology: Cracking, Protocol.

Digital certificate

Foundation concepts: Digital signature, Public key, Encryption.
Definition: A public key, digitally signed by a trusted authority.

Overview
The only real problem with public-key encryption, once the strength of the algorithm has been established, is key management. As well as the obvious need to keep private keys utterly private, there is a correspondingly urgent need to make public keys utterly public.

If a criminal can trick anyone into believing that a public key generated by him is in fact your public key, then, in the digital world, he can be you. Encrypted messages intended for you will be readable by him and only him, and, possibly worse, he will be able to sign legal documents (contracts, bank drafts, confessions) as you.

This is why wide publication of public keys is essential. Everybody you do business with should ideally have a secure copy of your public key as provided directly by you; everyone you might do business with in the future should have instant access to an incorruptible public-key directory service. Of course, this ideal can not practically be realized. If you had to deliver your public key to everybody in person, electronic commerce would be pointless, and how would they verify your identity even then? With a public directory, it would always be possible for someone to subvert a communication channel, temporarily redirecting accesses to the public directory to a fraudulent one of their own.

Digital certificates are an attempt to solve this most serious problem. The underlying idea is that, if there is one entity anywhere that can be fully trusted (perhaps a saintly person with a lot of security and computing power, perhaps a trustworthy corporation!, or even a trustworthy government!), they would become a Certification authority (CA). Everybody, on first receiving their own public-key/private-key pair, would register their public key with the CA. The CA would very thoroughly investigate the individual to verify their identity, doing at least as much as modern governments do before issuing passports. The CA would then issue the applicant with a digital certificate, a simple and short document saying essentially "This . . . is the true public key for . . ." The certificate is encrypted using the CA's own private key.

There are two essential ingredients to this scheme. The first is that everybody in the world should know the CA's public key; it should be stored indelibly and immutably on all systems, and never need to be looked up from any directory service. The second is that the CA must be perfectly trustworthy and perfectly secure. A dishonest CA could get away with almost anything. If the CA's private key is ever revealed then all identities are open to theft.

If the CA is perfectly trustworthy and perfectly secure, then digital certificates are as reliable as the encryption system used. Nobody can ever present a fraudulent public key to steal another's identity, because nobody would ever accept a public key that does not come as part of a digital certificate. Nobody can make a false digital certificate without access to the CA's private key.

Secondary CAs may also be created, and would most probably be necessary since the
load on a single universal CA would be overwhelming. Each secondary CA would have its own digital certificate of identity signed by the one primary CA. Then every digital certificate issued by the secondary authority would contain an extra clause that is a copy of the secondary authority's own digital certificate. That means that those receiving a certificate issued by a secondary authority may still validate it with knowledge only of the primary CA's public key.

Currently, there are some organizations offering competing services as trusted CAs, and their public keys are not securely built into computer systems. Digital certificates are offered for costs in the range of hundreds of dollars. It is envisaged that each organization would obtain a single digital certificate from a central authority, and then act as its own secondary certification authority for internal purposes.

Business value proposition
Digital certificates are a means that allows companies or individuals to authenticate the entities (e.g., companies, people, and governments) that they interact with online. There are two primary options for businesses wishing to deploy digital certificates; they can administer their own certificates, or they can outsource the management of the certificates.

It would be difficult for most organizations to administer and issue their own digital certificate. Also they would have the problem that their certificate would be doubted by many, if not all, of those who received it, hence the advent of CAs. The existence of a CA allows organizations and individuals to make use of an established certification process. Further, these authorities usually provide a suite of applications that work in conjunction with software from other vendors (e.g., web-server vendors) to provide a secure, cost-effective mechanism for certificate management. The outsourcing of the certification process also results in more efficient certification management and a potential for reducing the total cost of ownership involved in the certification-related processes.

Digital certificates and associated applications can be used for a variety of purposes, including the certification of identities, adding encryption, confirming sources of data/information, and restricting access to information on the basis of certified identity.

Summary of positive issues
Certificates are easy to create and use through CAs. They provide positive identification of data authors, individuals, companies, or other entities.

Summary of potentially negative issues
Great care must be taken in selecting a CA: anybody can set up a web site, call themselves a CA, advertise on the internet, and sell digital certificates. A CA that is not completely trustworthy is much worse than no CA at all. A prospective customer must ask themselves what makes this private company so thoroughly worthy of trust, and how can they guarantee complete security of their encryption keys?

Digital certificates are currently a very expensive proposition, especially for individuals and small organizations. It may be hoped that the advent of secondary CAs could ease the situation, as could government regulation in this commercially vulnerable area, which impinges directly upon national-security concerns.

References
- B. Schneier (1996). Applied Cryptography (New York, John Wiley and Sons).
- J. Feghhi and P. Williams (1998). Digital Certificates: Applied Internet Security (New York, Addison-Wesley).

Associated terminology: Phishing, Cracking, Hacker.
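The core idea above (a certificate is simply the subject's public key signed with the CA's private key, checkable by anyone who knows the CA's public key) can be illustrated with textbook RSA on deliberately tiny numbers. This is a sketch only: the key values and names are invented for the example, there is no padding or hashing, and numbers this small offer no security whatsoever.

```python
# Tiny textbook-RSA illustration of a digital certificate: the subject's
# public key, signed with the CA's PRIVATE key, verifiable by anyone who
# holds the CA's PUBLIC key.  All numbers are toy values.

CA_N, CA_E, CA_D = 3233, 17, 2753    # CA key pair: (CA_N, CA_E) public, CA_D private

def ca_sign(value):
    """The CA signs a small integer with its private key."""
    return pow(value, CA_D, CA_N)

def verify(value, signature):
    """Anyone can check a signature knowing only the CA's public key."""
    return pow(signature, CA_E, CA_N) == value

subject_key = 1234                   # stand-in for the subject's real public key
certificate = {"subject": "company.com",
               "public_key": subject_key,
               "signature": ca_sign(subject_key)}

assert verify(certificate["public_key"], certificate["signature"])   # genuine
assert not verify(999, certificate["signature"])                     # substituted key fails
```

A secondary CA fits the same pattern: its own public key comes in a certificate signed by the primary CA, so a receiver can validate a whole chain of certificates while holding nothing but the one built-in primary public key.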
Digital signature
… may falsely claim that a true signature was forged.

The notion of a digital signature as a digitized version of a person's normal signature, which may be included as an image in a document, is patently absurd. Digital documents are even easier to modify than paper ones, and a signature graphic can be cut out of one document and added to another seamlessly in seconds. "Digital signature" should not be confused with "electronic signature," which is simply the system to capture digitally the pen strokes made when signing for a paperless credit-card purchase with a stylus and electronic pad.

The true concept of a digital signature relies upon two essential technologies: Public-key encryption (q.v.) and One-way-hash operations (q.v.). To summarize, in a public-key encryption system, every individual has two encryption keys, one of which is completely private and secret, whereas the other is totally open and public. Data encrypted with one of the keys can only be decrypted with the other. A one-way hash is an operation applied to any data that reduces it to a moderately sized number (a few dozen digits); this number acts as a fingerprint for the data, so that the slightest change to the data would produce a totally different fingerprint number, but the fingerprint does not contain enough information to reconstruct any part of the original data.

A digital signature for a document is created by applying a one-way hash to that document, encrypting the resultant fingerprint using your private key, and appending the result to the document. It is a simple, fast, and secure procedure.

Anyone receiving this signed document can easily calculate what the fingerprint should be: they have the document, and one-way hash procedures are always openly published. They can also easily decrypt the copy of the fingerprint appended by the sender: public keys are always openly published. If the two versions match, the document is genuine.

The sender can not later deny having signed the document. The fact that the fingerprint was successfully decrypted using their public key is proof that it was originally encrypted using their private key, and nobody else has access to that.

The document can not be modified after it has been signed, because any change to the document will result in a change to the fingerprint (one-way hash), and the fraudster can not substitute a new fingerprint, because it must be encrypted with the sender's private key.

Digital signatures are useful for resolving intellectual property and non-disclosure disputes, since they enable individuals to prove that they know something without revealing what it is until later. Essential information may be recorded in a document and digitally signed in the normal way. The digital signature alone, without the document that it applies to, is sent to the other party. They can not reconstruct the document from the signature alone, but the sender is also incapable of constructing another document later that would match that signature. This mechanism also allows a third party to witness a document with their own digital signature without being able to read any sensitive information it may contain.

The whole system of digital signatures relies upon complete openness. All of the procedures must be public, otherwise nobody will be able to verify signatures; every participant's public key must be published as widely as possible, otherwise it becomes possible for fraudsters to extend fake public keys for their victims. The only thing that must be kept secret is the private key. There must be no possibility of a private key ever being revealed under any circumstances. The advance of electronic commerce almost equates private key with personal identity.
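The sign-and-verify procedure just described (hash the document, encrypt the fingerprint with the private key, append it) can be sketched with a real one-way hash and the same kind of tiny textbook RSA used for illustration elsewhere. The key numbers are invented for the sketch and far too small for any real use; a production scheme would also pad the fingerprint rather than sign it raw.

```python
import hashlib

N, E, D = 3233, 17, 2753    # sender's toy key pair: (N, E) public, D private

def fingerprint(document: bytes) -> int:
    """One-way hash of the document, reduced modulo N so the toy RSA can sign it."""
    return int.from_bytes(hashlib.sha256(document).digest(), "big") % N

def sign(document: bytes) -> int:
    """Encrypt the fingerprint with the sender's PRIVATE key."""
    return pow(fingerprint(document), D, N)

def verify(document: bytes, signature: int) -> bool:
    """Recompute the fingerprint and compare it with the decrypted signature,
    using only the sender's PUBLIC key."""
    return pow(signature, E, N) == fingerprint(document)

doc = b"Pay the bearer $100"
sig = sign(doc)
assert verify(doc, sig)                 # the two fingerprints match: genuine
assert not verify(doc, (sig + 1) % N)   # any other signature fails
# Changing even one character of the document changes the fingerprint, so a
# modified document would (almost certainly) fail verification as well.
```

Note that `verify` uses only public information (the document, the hash procedure, and the public key), which is exactly the openness the entry insists on.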
The DSA (Digital Signature Algorithm) is an embodiment of the DSS (Digital Signature Standard) produced by the NIST (National Institute of Standards and Technology) under authority of the Computer Security Act of 1987. It is based on a variable key length of between 512 and 1024 bits, which would be very large for normal encryption, but has attracted criticism for not being long enough for such a critical purpose. Some cryptanalysts are suspicious of the DSA, because it was designed by the NSA (National Security Agency), and not everyone is totally willing to trust them.

Business value proposition
The United States enacted a variety of laws pertaining to digital signatures, including the Electronic Signatures in Global and National Commerce Act of 2000, which aims to "facilitate the use of electronic records and signatures in interstate and foreign commerce by ensuring the validity and legal effect of contracts entered into electronically" (www.ftc.gov). At the state level, the Uniform Electronic Transactions Act of 1999 is intended to "remove barriers to electronic commerce by validating and effectuating electronic records and signatures," and the Digital Signature and Electronic Authentication Act of 1998 (SEAL), which is an amendment to the Bank Protection Act of 1968, is intended primarily to enable the use of electronic authentication techniques by financial institutions.

In 1999 the European Union (EU) issued an "Electronic Signature Directive" (1999/93/EC), which defines two different electronic signatures, namely the Basic Electronic Signature, which it defines as "Data in electronic form which are attached to or logically associated with other electronic data and which serve as a method of authentication," and an Advanced Electronic Signature, which must meet the following requirements: "(a) it is uniquely linked to the signatory; (b) it is capable of identifying the signatory; (c) it is created using means that the signatory can maintain under his sole control; and (d) it is linked to the data to which it relates in such a manner that any subsequent change of the data is detectable." The Basic Electronic Signature is a general definition that covers digitally recorded hand-written signatures; it is the Advanced Electronic Signature that corresponds to the true digital signatures discussed above. Many EU countries have subsequently created their own legal frameworks in association with the EU directive, including the UK implementation of the directive, termed the "Electronic Signatures Regulations 2002."

The US and EU laws principally aim to ensure two things: that the signatory is who they say they are, and that the digitally signed document is authentic. These two aspects of signed digital documents form the basis of a "non-repudiation" service, a service that can provide proof of data origin, transmission, and delivery, protecting the sender and receiver from false claims.

Summary of positive issues
Digital signature technology is well established and backed by legal frameworks in many countries. Non-repudiation services exist to validate data transfer.

Summary of potentially negative issues
The terms "digital signature" and "electronic signature" are sometimes used interchangeably when they may in reality represent different levels of technology and security. Not all countries have legislation to cover the use of digital signatures, and country-specific contract law needs to be examined prior to commencing electronic transactions.

References
- B. Schneier (1996). Applied Cryptography (New York, John Wiley and Sons).
- Directive 1999/93/EC of the European Parliament and of the Council of 13th December 1999 on a Community framework for electronic signatures, Official Journal of the European Communities, L13/12, January 2000.

Associated terminology: Digital cash, Digital certificate.

Digital wallet

… about having their personal financial data held by a third party.

Associated terminology: Digital certificate, Digital cash.
… service to the public (such as web-hosting services and data processing bureaus), similarly forbidding the release of communications either carried or stored by that service. The third adds a prohibition against the providers of communications and remote computing services releasing any information about subscribers or customers to any government entity.

While these aspects of the act cover the privacy framework for electronic communications and the contents of those communications, there is an extensive set of exemptions built into the act, some of which are necessary for business to occur (for instance, an email provider must be able to send email out of a mail box to its intended recipient). Other exceptions allow federal, state, or local-government access in certain circumstances, such as the need for information in the case of life-threatening emergencies.

Wiretap Act, 18 USC 2511
The act is primarily intended to cover the intentional interception, use, or disclosure of any communication transmitted over a wire, orally, or electronically. The law also makes it an offense to use any device to intercept communications. This aspect of the act can be used to cover traditional "bugs," "wire taps," and eavesdropping, and to prosecute the malicious use of unauthorized "packet sniffers" (programs that can examine network traffic, looking for passwords or other information). The act also makes it illegal to disclose any information so found, for example placing a corporate password obtained through a packet sniffer on a "cracker" website. Subsequently, it is also illegal for information to be taken from that cracker web site and used, knowing that the information itself was obtained illegally.

The act does describe a large number of situations in which the interception, use, and disclosure of information from wire, oral, or electronic sources are permitted, such as the activities of an officer or agent of the US government in the normal course of his official duty to conduct electronic intelligence, e.g., the FBI's Carnivore system used from 1998 to January 2005, which sifted through email at ISPs, capturing emails for only those individuals named in a court order.

The Cyber Security Enhancement Act, Section 225 of the Homeland Security Act of 2002, which amends various sections of 18 USC
This act takes the form of a set of amendments to previous legislation related to security and privacy on electronic (computer) systems. The act strengthens existing laws in various ways, such as explicitly making it an offense to advertise online illegal devices, defined as "any electronic, mechanical, or other device knowing or having reason to know that the design of such device renders it primarily useful for the purpose of the surreptitious interception of wire, oral, or electronic communications" (18 USC 2512). Previously only "newspaper, magazine, handbill, or other publication" were specified.

Laws pertaining to ISPs and their ability to report issues concerning life-threatening behavior were amended to allow greater flexibility in to whom they can disclose the information (18 USC 2702). The laws pertaining to hacking were strengthened and now cover fatal injury as a result of hacking (18 USC 1030). Sentencing guidelines were substantially strengthened for some offences.

Associated terminology: Law cross-reference.

Disk

Foundation concept: Storage.
Definition: The primary device for long-term data storage.
… so gradual erosion must occur) accumulate, until inevitably the whole drive fails. It must be understood that any data stored on disks is subject to unpredictable loss; the only solution is to keep backup copies on different media of anything that is essential.

The parameters of a disk that determine its capacity are the number of surfaces, the number of tracks, and the amount of data that can be packed into one track. Stating the number of tracks can be ambiguous (does 100 000 tracks mean 100 000 tracks per surface, or a total of 100 000 overall?). Rather than simply resolving the ambiguity with a decisive definition, a different term is usually used. A Cylinder is the stack of tracks at the same radial position on all surfaces, so a disk with five surfaces and 100 000 cylinders has an unambiguous total of 500 000 tracks overall.

On any single track, the data is not stored as one long stream of bits, but is divided up into a number of Blocks. A block is almost universally 512 bytes or 4096 bits. Until recently it was almost universal for there to be 63 blocks on each track, but now the number varies considerably, with the outer (longer) tracks holding hundreds of blocks. For efficiency of organization and access, and to ensure sufficiently accurate positioning, a block is the smallest amount of data that can be written onto a disk at one time. If a single byte is to be changed, it is necessary to read the whole 512 bytes that surround it, modify that one byte, and then write the entire 512 back again. In modern times, blocks are often erroneously called Sectors, but the term sector properly refers to the entire wedge-shaped collection of blocks at the same position on every track.

Disk drives are often referred to as Hard disks to distinguish them from Floppy disks, an alternative technology that through cheapness became very popular in the 1970s, but is now disappearing. Floppy disks are made of a flexible material and are much less finely constructed; early floppy disks (in the 8 and 5¼ inch formats) had less than rigid cases and were indeed slightly floppy. The read-write heads do not hover microscopically above the surface, but actually make close contact, pushing into and bending the surface. For this reason, floppy disks must rotate much more slowly than hard disks, have a much smaller capacity (generally only just over 1 MB), and are not kept spinning full-time. They also have a much more finite lifetime, but are correspondingly more robust, and can be dropped from a great height without harm. Floppy disks are encased in a square protective sleeve, which is also used to hold them in place when they are inserted into the drive. Floppy disks are always exchangeable: any number of different disks may be inserted into a single drive, used, then removed and replaced by another. Hard disks on small computers are never exchangeable, but the high-performance disk drives used on mainframes do still sometimes have an exchangeable disk pack as an option.

The first working disk drive in commercial production was the IBM 350, which was introduced in 1956 as a component of the IBM 305 computer. It was the size of a large wardrobe, had a capacity of 4.5 MB, and was available for lease at $35 000 per year. The first hard-disk drive available for an IBM PC, 26 years later, was the Seagate ST412, with a capacity of only 10 MB. IBM also introduced the first floppy disk, in 1971. In the 8 inch format, it had a capacity of approximately 0.1 MB, and was originally a system engineer's tool, not a general data-storage medium.

The ZipDrive disk, Jaz disk, and similar devices are two kinds of compromise disk technology. A lower-capacity version (around 100 MB) uses the same technology as floppy disks, but is much more finely and expensively constructed, so it gives higher capacity, access speeds, and reliability. The larger-capacity version uses the essential technology of hard disks, but with …
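The capacity arithmetic and the read-modify-write rule described above can be checked directly, using the figures quoted in the text (five surfaces, 100 000 cylinders, the traditional 63 blocks per track, and 512-byte blocks); the byte-rewriting function below is a simulation of the rule, not an actual disk operation.

```python
# Capacity, using the figures quoted above: five surfaces and 100 000
# cylinders give 500 000 tracks; 63 blocks of 512 bytes on each track.
surfaces, cylinders = 5, 100_000
blocks_per_track, block_size = 63, 512

tracks = surfaces * cylinders                       # 500 000 tracks overall
capacity = tracks * blocks_per_track * block_size   # 16 128 000 000 bytes, about 16 GB

# Read-modify-write: a block is the smallest writable unit, so changing a
# single byte still means transferring the whole 512-byte block.
def change_byte(block: bytes, offset: int, value: int) -> bytes:
    data = bytearray(block)    # read the whole 512 bytes
    data[offset] = value       # modify that one byte
    return bytes(data)         # write the entire 512 back again

updated = change_byte(bytes(512), 100, 0xFF)
```

The same arithmetic explains why the term Cylinder removes the ambiguity the text mentions: surfaces times cylinders gives the total track count with no room for misreading.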
Distributed database
database being too large for one server to base system can be designed, its general
handle. modus operandi must be decided: is it
Horizontal partitioning does relieve the going to keep all of the data in a ‘‘live”
problem of too-large database tables. A transactional database that is immediately
database of 100 000 000 records could be accessible, or is the data going to be pre-
split amongst ten servers, each holding partitioned, with old data that is no longer
10 000 000 complete records. Any query will likely to be accessed by online systems
probably have to be sent to all ten servers, being placed in a Data warehouse, and only
and the individual partial results will then the live data remaining in the transactional
need to be combined before being returned database?
to the user, but that is one of the eas- When an organization’s overall require-
ier tasks. Horizontal partitioning usually ments have been defined, then the database
implies the three-tier model of application system can be designed. The design of a
servers, and often does little to relieve the database is a technically challenging task
problem of having too many queries to and for any non-trivial case it requires
answer quickly, since each query must be a highly skilled and trained database
processed by multiple servers. administrator (DBA). The DBA needs to be
Vertical partitioning may help to relieve both problems. With a database of 100 000 000 records, each of ten servers would hold the more closely related parts of every one of the 100 000 000 records. With careful planning, it can be arranged that the most common queries can be answered by a single server, but for those uncommon queries based on fields that never appear together on any one server, processing can become exceptionally complex and inefficient. With vertical partitioning, the problems of temporary inconsistencies immediately following insertions and deletions are greatly magnified.

The decisions on how the data should be distributed across multiple servers and the design of the algorithms for answering the various possible queries go hand-in-hand. One can not effectively be done before the other. The design of a distributed database is one of the most complex challenges facing an IT professional.

Business value proposition
As corporations grow, their databases grow with them. It is not unknown for some organizations to have databases of multiple terabytes, and sources have reported that some government agencies have databases of over 100 terabytes. Before a large database can be designed, a database administrator (DBA) is needed who is skilled not only in database technology but also in the specific environment in which they are to operate. Databases differ from vendor to vendor, and from version to version, so very specific knowledge is needed.

Central to the DBA's task is the determination of how and whether to partition the data across a set of computers. There are several methods of partitioning, and the choice of solution is complex, involving a careful examination of the options. This requires knowledge of the interactions that occur between the hardware, the operating system, the network, and the application software, as well as understanding the practical constraints surrounding the database system itself.

The overall design of the database needs to be carefully considered, and the trade-off amongst performance, ability to scale, and cost must be built into the technical business case developed to help identify a solution.

Summary of positive issues
Distributed databases and partitioning techniques allow for a variety of solutions to managing very large databases. Distributed databases allow DBAs to attempt to optimize their database systems.
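The vertical-partitioning trade-off described above can be sketched in a few lines. This is an illustration only: the "servers" are plain in-memory dictionaries, and all names (fields, helper functions) are invented.

```python
# Illustrative sketch of vertical partitioning (all names invented):
# each record's fields are split across servers, with the key repeated
# on every server so the pieces can be rejoined later.

billing_server = {}   # holds the billing-related columns
profile_server = {}   # holds the profile-related columns

def insert(customer_id, name, email, balance, credit_limit):
    """Both servers must be updated; between the two writes the
    database is briefly inconsistent, as the overview notes."""
    profile_server[customer_id] = {"name": name, "email": email}
    billing_server[customer_id] = {"balance": balance,
                                   "credit_limit": credit_limit}

def get_balance(customer_id):
    # A common query answered by a single server: cheap.
    return billing_server[customer_id]["balance"]

def full_record(customer_id):
    # A query needing fields from both partitions: requires a join
    # across servers, which is where cost and complexity appear.
    record = dict(profile_server[customer_id])
    record.update(billing_server[customer_id])
    return record

insert(42, "Ada", "ada@example.com", 125.50, 1000.00)
print(get_balance(42))          # touches one server only
print(full_record(42)["name"])  # touches both servers
```

With ten real servers, the second kind of query also pays network latency for every partition it touches, which is why query patterns and partitioning must be designed together.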
Summary of potentially negative issues
Creating distributed databases is extremely challenging technically. A poorly designed partition of the database can cause database failures, instability, and unacceptably poor performance.

Reference
M. T. Özsu and P. Valduriez (1999). Principles of Distributed Database Systems (Englewood Cliffs, NJ, Prentice-Hall).

Associated terminology: Database administrator, CIO.

Domain name

Foundation concept: Internet protocol, Network.
Definition: A readable and memorable name associated with an IP address, providing a convenient reference to internet-connected computers.

Overview
All computers connected to the internet must have a numeric IP address associated with them. This is the only way that communications are routed across the internet, so it is an absolute requirement. IP addresses currently consist of four smallish numbers (range 0-255) separated by dots, for example "192.168.45.234." These numbers are not at all memorable, and are an extremely unsatisfactory way to publicize the address of a web site. With the advent of IPv6, these addresses will become even longer sequences of numbers (see Internet protocol for details).

A Domain name is a more user-friendly, human-oriented name associated with an IP address, having a more familiar and memorable form such as "MyCompanyName.com" or "AnotherCompany.co.jp." Before a domain name can be used, it must be registered with the internet authorities; there are numerous organizations providing domain name registration services for a fee; ultimate responsibility for domain names rests with ICANN (the Internet Corporation for Assigned Names and Numbers).

Once a domain name has been assigned to an organization, that organization is free to create any number of sub-names to refer to individual servers or internal networks, by prefixing extra components to the beginning of the domain name. For example, the owner of "company.com" has complete control over all names that end with ".company.com," and can freely define "www.company.com," "sales.texas.company.com," and anything else of a similar nature.

When a web browser, or any other internet client, attempts to access a server using a name like "sales.texas.company.com," a system known as DNS (Domain Name Service) is invoked. This is a network of servers, embedded in the internet, which provides the necessary information for converting names back to the numeric IP addresses that are required for internet communications.

There is a "Top Level DNS Server" for all names ending in ".com"; this server is the ultimate authority for all ".com" addresses, which is why proper domain name registration is essential. The top-level server is asked what it knows about "sales.texas.company.com," and responds with a partial answer: the address of another DNS server that is responsible for all addresses ending in ".company.com." The question is then asked again of this server, and it may reply with the complete answer, or it may instead reply with the address of another DNS server that is responsible for all names in the ".texas.company.com" subdomain. Eventually, if everything has been set up correctly, the question will reach a DNS server that knows the complete address, and communications may begin.

When a domain name is registered, it is also necessary to make sure that there is at least one DNS server that can answer DNS queries about hosts within the domain.
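The chain of delegations described above follows directly from the structure of the name itself. A minimal sketch (the function name is ours; a real resolver performs a network query at each step to learn the next server's address):

```python
def delegation_chain(hostname):
    """Return the sequence of DNS zones a resolver consults for a
    name, from the top-level domain downward. Pure string handling;
    a real resolver asks a DNS server at each step for the address
    of the server responsible for the next, longer suffix."""
    labels = hostname.split(".")
    # Walk from the rightmost label inward: "com", "company.com", ...
    return [".".join(labels[i:]) for i in range(len(labels) - 1, -1, -1)]

print(delegation_chain("sales.texas.company.com"))
# ['com', 'company.com', 'texas.company.com', 'sales.texas.company.com']
```

Each element of that list corresponds to one question-and-partial-answer round trip in the lookup the entry describes.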
For an organization that does not have its own 24-hour computing service, this will probably involve additional fees. It is normally considered essential to have two DNS servers prepared to handle queries about any domain, since some down-time is inevitable, and without a working DNS, internet sites can not be found.

The last component of a domain name does have some significance, and can not be chosen freely. There are seven traditional top-level domains: ".com" was intended for multinational companies, but has come to be used by US companies; ".edu" is for US educational institutions; ".gov" is for US government agencies; ".int" is for international organizations; ".mil" is for the US military; ".net" is for networks; and ".org" is for other US organizations. In order to relieve demand on the exceptionally popular ".com" names, six new ones were created: ".biz," ".info," ".name," ".aero," ".coop," and ".museum." Non-US individuals and organizations are expected to use country-specific domain names; these end in two dotted components, one indicating the kind of organization, and the other (always two letters) indicating the country it is based in; so, for example, ".co.uk" is the UK-specific version of ".com," and ".co.jp" is the Japanese. There is a ".us" code for the United States, but it is rare for a US corporation to use ".co.us" instead of ".com."

The United States passed the Federal Anti-Cybersquatting Consumer Protection Act of 1999. However, for smaller organizations or individuals, obtaining a "meaningful" domain name can be problematic, since many popular names and terms have been used already, e.g., www.robertplant.com. This can cause the need for a re-branding effort.

The choice of domain name of course does not have to be limited to local names, e.g., a company in the UK can choose to have a .com address, or even a .co.us address. It is, of course, very difficult for people to guess a domain name, and even powerful search engines might not easily help a potential customer locate the company they are looking for without specific information. There is no domain-name-index equivalent of the "Yellow Pages," so, for example, finding a hardware shop in London that has not had much internet traffic or been in business long could be difficult.

Summary of positive issues
The domain name-space is regulated by ICANN. Domain names remove the need for people to remember IP addresses. Domain names are unique. The configuration of a DNS server is relatively easy and connections through ISPs are available almost universally.
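The naming conventions above (a single generic top-level domain versus a two-component country-specific ending) can be expressed as a small classifier. This is a hypothetical helper for illustration only, not part of any DNS standard or library:

```python
# Hypothetical helper illustrating the conventions described above:
# generic names end in a single TLD (e.g. "MyCompanyName.com"), while
# country-specific names end in two short components, the kind of
# organization plus a two-letter country code (e.g. "co" + "jp").

GENERIC_TLDS = {"com", "edu", "gov", "int", "mil", "net", "org",
                "biz", "info", "name", "aero", "coop", "museum"}

def classify(domain):
    labels = domain.lower().split(".")
    if labels[-1] in GENERIC_TLDS:
        return ("generic", labels[-1])
    if len(labels[-1]) == 2 and len(labels) >= 2:
        # two-letter country code, preceded by an organization-kind label
        return ("country", labels[-1], labels[-2])
    return ("unknown",)

print(classify("MyCompanyName.com"))     # ('generic', 'com')
print(classify("AnotherCompany.co.jp"))  # ('country', 'jp', 'co')
```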
Dvorak/QWERTY keyboards
expensive than the QWERTY standard keyboards. Typists need to be retrained (or at least to retrain themselves), and often resist the change. Switching to Dvorak keyboards incurs some expense, and a period of much reduced productivity while typists gain familiarity with them.

Reference
R. Soukoreff and I. MacKenzie (1995). "Theoretical upper and lower bounds on typing speed using a stylus and soft keyboard," Behaviour & Information Technology, 14, 370-379.

Dynamic web pages

Foundation concepts: Web page, Internet, HTML.
Definition: Web-page content that is not static, but can vary according to circumstances, and react to user inputs and requirements.

Overview
The most basic web pages are simple text files with a fixed HTML content. Every time a particular web page is accessed by a browser, the content shown will be exactly the same. This is, of course, perfect for providing fixed information that does not vary, but provides only limited possibilities for user control and interaction (no more than the ability to click on a link to another page), and makes truly interactive operations such as online registration, form submission, hit counters, shopping carts, and database queries impossible.

Dynamic web pages provide content that is not created until it is needed; with each browser access, the page is created on-the-fly, and thus may be different each time and can even react to browser input. There are five very commonly used technologies for dynamic web page creation: SHTML, CGI, JSP, ASP, and scripting, and many others that are just variations on a theme.

To understand dynamic web page generation, it is essential to understand the basic nature of web interactions. When a web browser wishes to access a particular page, it composes a simple command in the HTTP language. This command usually begins with the word "GET," and contains an identifier for the desired page (its URL), together with any user inputs that may have been provided in a form. This request is transmitted to the relevant server. When a web server receives a GET request, it translates the URL into a simple file name, and looks for that file. If the file contains simple HTML or plain text, the entire file is sent, unprocessed, to the requesting browser, which then displays the content on the screen.

If the server finds that the file actually contains Server-parsed HTML (SHTML), then the processing is slightly more complex. Instead of just transmitting the entire file as-is, the server reads the file looking for special tags. Most of the file is simply sent to the browser as-is, but Server-side includes, surrounded by "<!--#" and "-->", are replaced by the appropriate text. For example, if the text 'The file xxx.html is <!--#fsize file="xxx.html" --> bytes long' were to appear in an SHTML document, and the server had a file called xxx.html 7055 bytes in length, then the text "The file xxx.html is 7055 bytes long" would be sent to the browser. SHTML is of only limited power, but does increase the scope of simple web pages.

If the requested file is of the CGI (Common Gateway Interface) type, then the server assumes that it is an application program, and executes it. The running program receives any inputs that were sent as part of the request, and all output created by the running program is sent directly to the requesting browser. Anything that the server is capable of doing can be programmed into a CGI application; using CGI is just like running an application over the web. Programmers can use any language that the server supports, and have
access to all operating-system services. CGI is by far the most powerful and flexible of the dynamic generation methods, but that does have its drawbacks. Writing CGI code requires real programming knowledge, and is not a task for unskilled workers. There is also a security risk: CGI can do anything, so a careless programmer could accidentally open up the whole system to outside attack. In skilled hands, CGI is safe, simple, and powerful.

JSP (Java Server Pages) and ASP (Active Server Pages) are very similar technologies. Both are based on the idea of SHTML: the server processes the file as it is being sent to the browser, and substitutes information generated on-the-fly for special tags in the text. With both ASP and JSP, the special tags contain parts of (or even complete) programs that are executed to generate the new content. These program segments can do anything from providing the current date and time to substituting the results of complex cross-network database queries. In JSP the language of the program elements is based on Sun's Java; in ASP it is based on Microsoft's Visual Basic. The use of JSP requires Java support on the server; the use of ASP requires a Microsoft server.

Scripting is a different kind of system. When scripting is used, the file retrieved by the server contains program code, but is not processed by the server at all. The file, together with any code it contains, is sent directly back to the requesting browser, and the program code is run locally by the browser. This has the positive effect of relieving the server of a lot of the processing load, but at a very high price. It is the browser that has to execute the script, so it has no access to any information stored on the server. Of course, scripts may open communications channels back to the server, but that adds a great deal of complexity both to the script and to the server, and somewhat defeats the purpose of scripting. Scripting elements are commonly used to make the cursor or hypertext links change their form depending on the position of the mouse. Scripts are most commonly written in JavaScript and Visual Basic.

Business value proposition
Dynamic web pages created through the SHTML, CGI, JSP, and ASP web technologies allow businesses to build more sophisticated web sites that facilitate a variety of activities ranging from gaming to B2C e-commerce and B2B data exchanges. The web technologies allow adoption of a variety of processing models; for example, in the scripting model the processing is at the client, whereas the JSP model performs the processing at the server. This flexibility allows developers to create systems that are matched to their business need and their systems architecture, and to take into account the technologies deployed by their customers, partners, and vendors.

Summary of positive issues
Dynamic web content is necessary if web sites are to go beyond merely providing static information, and interact with users. e-Commerce can not be conducted with merely static web pages; the closest approach would be to provide customers with forms that they must print, complete by hand, and fax in to make an order, and such a procedure would certainly discourage many potential customers.

CGI has unlimited power, allowing anything a programmer is capable of coding, but is lacking in convenience, and does require some technical programming ability. JavaScript provides much greater protection against both accidents and malice, but has a much greater likelihood of arousing the suspicions of a user's security system and being blocked. Scripting allows simple user interactions without any load on the server.
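The server-parsed HTML step can be sketched in a few lines of Python. This mirrors only the fsize example discussed in the overview; a real server supports many more directives (echo, include, exec, ...) and may format the size differently:

```python
import os
import re
import tempfile

# Minimal sketch of server-parsed HTML: everything is copied through
# unchanged except "fsize" server-side includes, which are replaced
# by the named file's size in bytes.
SSI_TAG = re.compile(r'<!--#fsize\s+file\s*=\s*"([^"]+)"\s*-->')

def parse_shtml(text, directory):
    def substitute(match):
        path = os.path.join(directory, match.group(1))
        return str(os.path.getsize(path))
    return SSI_TAG.sub(substitute, text)

# Demonstration with a 7055-byte file, as in the entry's example.
workdir = tempfile.mkdtemp()
with open(os.path.join(workdir, "xxx.html"), "wb") as f:
    f.write(b"x" * 7055)

page = 'The file xxx.html is <!--#fsize file="xxx.html" --> bytes long'
print(parse_shtml(page, workdir))
# The file xxx.html is 7055 bytes long
```

The same substitute-on-the-way-out pattern, with program fragments instead of simple directives, is the basis of JSP and ASP.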
e-Commerce/e-business
data exchange. Data-synchronization B2B systems can be in the form of consortia, with the members benefiting from lowered development costs since they share the overhead associated with technical development. e-Procurement portals also allow members to reduce overall costs by avoiding the search effort involved in accessing multiple corporate sites, hosting a site themselves, or competing against each other with proprietary systems.

Summary of positive issues
Many successful B2C models have emerged, and traditional companies such as the UK supermarket chain Tesco (www.tesco.com) understand the importance of building upon established strengths and ensuring that the online business model is aligned to the overall corporate strategy. Tesco's business model is such that physical orders taken over the web (e.g., groceries, clothes, DVDs) are filled and delivered from the store nearest to the customer, while service products (e.g., car insurance, holidays, and flights) are provided via the web.

The technical costs associated with B2C and B2B systems developments have declined as tools, standards, support, and methodologies have evolved, both from commercial vendors and from organizations such as the World Wide Web Consortium (W3C.org).

B2B models such as data-synchronization centers provide companies with the ability to share and distribute data to a large number of businesses and entities at a lower cost than would be possible with a proprietary system. Membership of a B2B procurement exchange reduces the transaction costs associated with the procurement process and reduces costs for members because they do not have to build the procurement system themselves.

Summary of potentially negative issues
B2C models continue to evolve as emerging technologies change the customers' ability to locate and use alternative sources of supply. Organizations need to evolve their sites accordingly. Lack of alignment in corporate branding, service, or product offerings for bricks-and-clicks organizations can lead to the loss of customers or confusion on the part of customers as to their relationship with the company.

Data-synchronization B2B systems require a relatively high degree of technical, process, and organizational sophistication of active participants. B2B procurement hubs may evolve until a particular hub becomes dominant; selection of hub membership is an important strategic sourcing issue.

Reference
R. Plant (2000). e-Commerce: Formulation of Strategy (Englewood Cliffs, NJ, Prentice-Hall).

Associated terminology: Internet, Dynamic web pages, W3C.

Efficiency

Foundation concept: Algorithm.
Definition: A measure of how well resources are used by an application.

Overview
The general scientific definition of efficiency is the output actually produced by a system expressed as a fraction of the maximum output any such system could theoretically produce. An engine is 50% efficient if half of the energy put in (in the form of fuel) is converted into useful work (the other half being lost through friction, pollution, etc.).

While this definition can be useful in computing, it is not the definition normally used by computing personnel. In most cases there is no "theoretical best" performance that an application may be measured against. In computing, the term
"efficiency" is usually used as an umbrella term for absolute measures of performance.

For example, it may be necessary to process a large amount of information to produce a summarized report. Application A might load all of the information from disk into main memory before processing, and thereby produce faster results than application B, which might perform exactly the same task but take the data in smaller chunks, and therefore work much more slowly. A naive observer would conclude that application A is better because it produces the same results but faster. A deeper analysis is required.

For example, tests may be performed, comparing the two applications on a number of different data sets:

Data-set size   Time for A   Memory for A   Time for B   Memory for B
20 MB           0.15 s       21 MB           6.00 s      2 MB
40 MB           0.30 s       41 MB          12.00 s      2 MB
60 MB           0.45 s       61 MB          18.00 s      2 MB

From this (trivially simple) experiment, one might conclude that application A requires 0.0075 seconds per megabyte of data, whereas application B requires 0.3 seconds per megabyte, so application A is 40 times faster than application B. But the amount of memory occupied by application A grows alarmingly as the size of the data set grows, whereas application B uses only a constant amount of memory. Of course, there are many other factors to be taken into consideration, such as how much human intervention each application needs, how much network bandwidth they require, and how much memory the application processing the data requires.

This illustrates an important concept: there is often a trade-off between one form of efficiency and another. To make a program run faster, it may need more memory, but to make a program run with less memory, it may have to be made slower.

There is of course a fundamental difference between time efficiency and memory efficiency. Fast execution is usually a most desirable quality, but, if a program is slow, all you have to do is wait a little longer and the results will appear. If a program needs too much memory, there is nothing to be done about it.

The efficiency of human effort is another factor that must not be forgotten. If a programmer spends a week making one part of an application more efficient, so that it now takes one second to run instead of ten, is that an effective use of resources? The answer of course depends on circumstances. If this part of the application is run many times by customers, then the total improvement can be enormous.

Business value proposition
The methods used to determine the efficiency of normal engineering problems typically allow clear results to be determined. Computing systems do not always provide such clear-cut decisions. In reality there are trade-offs between the different ways of solving any given computing problem; for example, using more processing power against using more memory or vice versa.

A programmer who is aware of the constraints applicable to the system being developed can attempt to construct applications that maximize some quantifiable measure of efficiency. The determination of efficiency within large corporate systems environments is a very difficult problem to solve, due to the interaction of many systems-related variables. For example, a corporate web server servicing simple web-viewing requests involves many processes and sub-processes, including the network-data-traffic profile, the efficiency of the web-page design, the hardware that supports the system, and the efficiency of the
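The trade-off in the experiment above can be captured in a few lines. The scaling constants below are taken from the measurements in the table; the function names are invented:

```python
# Model of the two applications measured above: application A trades
# memory for speed, application B the reverse. Constants (seconds and
# megabytes per MB of data) come from the experiment's table.

def app_a(data_mb):
    return {"time_s": 0.0075 * data_mb,   # 40 times faster...
            "memory_mb": data_mb + 1}     # ...but memory grows with data

def app_b(data_mb):
    return {"time_s": 0.3 * data_mb,
            "memory_mb": 2}               # constant memory

for size in (20, 40, 60, 1000):
    print(size, app_a(size), app_b(size))

# At 1000 MB, A would need over a gigabyte of memory while B still
# needs 2 MB: the "deeper analysis" the entry calls for.
```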
Electronic data interchange (EDI)

Email
Users are normally able to select which is to be used in the options for their email-browsing client.

When the email-browsing client successfully connects to the POP or IMAP server, it identifies itself by sending the user-name and secret password, and is given a list of any unread emails waiting for it. Usually, it will then ask for each of those messages in turn, and request that they be deleted, then display the list of messages to the user, and allow him or her to do with them as they will. The whole procedure of connecting to the POP or IMAP server and retrieving the list of new messages is repeated every few minutes for as long as the browser continues to run. That is the only way that new messages are detected; the server has to be asked explicitly, it will never send a notification.

Email messages were originally nothing more than plain text, but modern systems allow any kind of data to be attached to the message. Attachments are sent to the SMTP server as a continuation of the main message, all in one unbroken transmission. It is the email-browsing client that picks out the attachments and treats them as separate entities. Modern systems allow whole web pages to be sent as the main message, so it is displayed to the recipient with all typesetting and images in place. If the email viewer has insufficient security settings, such a message may also contain an embedded program that will be run automatically as the message is displayed. This is a sufficient opening for a virus to be installed and activated.

Broadcasting is not possible with current email systems: every message must have a specific recipient stated, so sending to multiple recipients requires multiple sendings. There are products known as ListServs (no error: there is no "e" after the "v") that automate the sending of a single message to a long list of multiple recipients. Email clients generally understand the idea of "mailing lists" and will automatically send a message to all members of a predefined list.

Business value proposition
Email has become one of the most valuable of all corporate communication tools. Email is accessible almost universally on a wide variety of devices, and has become almost indispensable in today's organization.

The technology of email has evolved sufficiently that end users are unaware of the majority of technical issues involved, since these fall upon the network manager. Users do, however, expect a seamless service from their systems and demand as close to 100% up-time as possible, and, when the "email server" is down, they expect all unread emails to be available when the system comes back on line. While the creation of an email server is quite straightforward, and in an ideal world providing 100% reliability would be no problem, this is not the case in a real-world implementation.

The email systems of organizations are in essence the digital equivalent of a corporate "front door" and everyone's email comes and goes through it. This makes it both vital from a strategic communications perspective and simultaneously a key point of attack for those with evil intent. The network manager may employ a variety of techniques to prevent email from being attacked, including proxy servers and firewalls to prevent spam attacks and emails with dangerous attachments entering the system; Trojan horses, viruses, worms, and other programs that users unfortunately open by mistake are a primary concern. It is problematic that these malevolent system attacks continue to evolve and change, forcing the network manager and corporations (as well as individuals) to deploy ever-increasing resources to defend against them. One of the worst scenarios for network managers to consider is the complete loss of the email server with the discovery that there is
no backup. Many system managers have rightly lost their jobs over this neglect, and it is vital that a backup procedure is in place and checked frequently to ensure that it works; even a few hours' worth of lost emails can be considerably damaging to many organizations. The network manager needs to consider issues such as capacity planning for the server and the memory demands placed upon it by users. Many users save all their emails, wish to access them all from their inbox, and are reluctant to archive them, their email box acting as a "Rolodex." This can place significant storage demands upon the file server, and the network manager will usually put in place quotas and other procedures to prevent users being wasteful of storage.

Many companies have strict policies about the type of emails that can be sent (both internally and externally), since they are documents of the organization and as such will be archived by the organization, and are subject to discovery in legal proceedings. Workers need to understand this basic point and must not send internal emails saying "don't buy this stock, my research shows it is awful" and then send out external emails to customers saying "buy this stock, it is great," because these conflicting positions may one day be used in a lawsuit. For legal and other reasons some people simply prefer not to use email. President G. W. Bush stated "I don't email," adding "and there's a reason. I don't want you reading my personal stuff" (speech at the American Society of Newspaper Editors conference in Washington, DC, April 14, 2005).

For most people email is too critical to day-to-day life for it to be neglected, but email can become very intrusive and detract from workers' productivity. Again, procedures and policies need to be put in place to assist productivity. Executives are also frequently the subject of direct emails from customers and vendors, some legitimate, some not. It is advisable for executives to have private email addresses as well as their more public ones; the non-private emails can then be filtered through an assistant. For example, it is particularly easy for people to guess an executive's email address if everyone in the company has the same email address form (e.g., psmith@company.com, jsmith@company.com). Some system managers establish email address lists for pre-validated correspondents and filter incoming email for executives through that list.

Summary of positive issues
Email aids communication when used in a filtered, secure environment. Email allows users to access communication asynchronously (when they want to, rather than when the system or external forces decide it is time). Email servers are easy to establish and the technology protocols are well known. Third-party email systems are available and accessible via the internet. Email can carry attachments and can be sent out to many addresses at once via a ListServ. Email can provide an audit trail of correspondence and may be required to be archived for regulatory compliance. Email can be accessed through a wide variety of devices and services.

Summary of potentially negative issues
Email can facilitate attacks on corporate networks and individuals through spam and malicious attachments. Email can reduce productivity: the easier it is to ask a question, the more likely a person is to ask that question without thinking about it first; since the days when access to customer support required writing a letter or paying for a long-distance telephone call, the number of totally unnecessary enquiries has grown enormously. Employees will often spend an inordinate
amount of time sending and responding to personal emails if a proper policy is not enacted and enforced. Archiving requires the expenditure of resources by network managers to ensure regulatory compliance.

References
D. Wood and M. Stone (1999). Programming Internet Email (Sebastopol, CA, O'Reilly Press).
S. Cobb (2002). Privacy for Business: Web Sites and Email (Saint Augustine, FL, Dreva Hill LLC).

Associated terminology: Instant messaging.

Encryption

Foundation concept: Security.
Definition: Disguising information to make it inaccessible to certain others.

Overview
Obscurity is not security. Hiding information, or encoding it using a strange process that "nobody could ever guess," provides no security at all. One disgruntled employee, one inspired enemy, and all is lost. Since the first electronic computer was built in 1943 for the specific purpose of breaking enemy codes, data security has moved firmly into the realm of abstract mathematics. Anybody who tries making up their own encryption schemes without extensive knowledge of cryptanalysis is setting themselves up for disaster.

The simple encryption schemes that have been tried and tested for centuries, which are based on alphabetic substitution (for example, A → K, B → F, C → W, D → A, ...) and permutation (for example, hello → lhloe), can be cracked in a fraction of a second with the aid of a computer. A new encryption scheme may seem safe to its inventor, but could crumble quickly under attack from unsuspected techniques. The only way to be confident that an encryption method is safe is to use one that is well known and has already withstood years of attack by the entire cryptanalysis community.

Encryption always depends upon a Key, something like a secret password or number that plays an essential part in the encryption process. The same encryption scheme used on the same message, but with a different key, will yield completely different results. The safety of encryption is based on keeping the key secret, not the method. The simplest of cracking techniques, and in many cases the most successful, simply involves trying out every possible key. If, for example, encryption keys are four-digit PINs, then there are only 10 000 possible different keys. A computer could try them all out in turn in a fraction of a second, and be guaranteed to crack the message.

It is always essential that there should be so many possible keys that they could not possibly all be tried. This is clearly illustrated by the example of 56-bit DES (Data Encryption Standard), a well-known worldwide standard. The 56-bit DES encryption scheme supports 2^56 (which is about 70 000 000 000 000 000) different keys. A 4 GHz Pentium 4 could probably perform about 7 000 000 experimental decryptions per second, which means that it would take around 300 years to try every possible key. That may sound secure, but it means that any industrial rival could simply buy 300 computers, leave them running for a year, and have everything. Or 3000 computers could do it in about a month. Deploying 3000 computers would cost some money, but there are economies of scale, and the value of being able to read all rivals' secrets would certainly be much higher. To make matters worse, special-purpose integrated circuits have been fabricated that are many times faster. DES is expected to be superseded by AES (Advanced
Encryption
AES has a more open design, and supports 256-bit keys. AES is the first encryption method approved by the US government for "top secret" documents to be released to the public.

Increasing the key size has an inordinate effect on the security of the system: 56-bit DES keys are approximately 16-digit numbers; 112-bit DES keys are approximately 32-digit numbers, a fairly small increase in size, but a major increase in security. It increases the time required to try all possible keys on a farm of 3000 computers from one month to 2 000 000 000 000 000 000 years. Of course, there is no practical limit to the size of a key, except that legitimate encryption and decryption by those who do know the key will also take a little longer.

The need for large key sizes brings about its own problems. Nobody can be expected to remember a 32-digit number, but if ever it is written down or recorded in any way, simple theft becomes an attractive means of breaking the best encryption. Key security and key exchange are serious problems for the security-conscious organization.

There are two basic forms of encryption method: symmetric, in which the same key is used for encryption and decryption; and asymmetric, in which the key used to read a message must be different from the one used to encode it. Symmetric systems require that a different key is produced for each possible pairing of communicants: if four people A, B, C, and D wish to communicate securely, they could pick a single key for their group, but that would mean that they have no internal privacy. Instead six different keys are needed: one for when A and B communicate, one for when A and C communicate, one for A and D, one for B and C, one for B and D, and one for C and D. To give N people private communication, approximately ½N² keys are needed; the key-management problems can become substantial. With an asymmetric system, it is necessary to have just one key pair (an encryption key and a decryption key) for each individual involved, and all combinations of secure communication become possible. See Public key–private key for further information.

Another encryption scheme of value is the One-way hash. This is a kind of encryption that can not be decrypted even if the key is known. The original message is never recoverable even by legitimate readers. Although it sounds pointless, this technique is very valuable, and makes possible Digital signatures and Digital certificates.

All encryption schemes (with one exception, the One-time pad) are in principle crackable. The only security lies in the fact that the entire world's cryptanalysts are working on them and have been for a long time. It would be very difficult to keep a flaw secret, and what discoveries one lab does manage to keep secret will probably soon be repeated by others.

Business value proposition
There are many tried and tested encryption algorithms, and the choice of algorithm should not be made without competent research. Amongst the best known are DES, a symmetric algorithm that lies under a cloud of suspicion: it contains unexplained steps and there is some belief that it contains a Back door that would allow faster cracking by those in the know without needing a key search. AES will probably replace DES in the near future. RSA is a slower but highly trusted asymmetric algorithm. RC4 is simple, fast, and symmetric. PGP has become almost an internet standard, using a combination of symmetric and asymmetric techniques. Many encryption algorithms are subject to patents, the status of which changes as time progresses.

Summary of positive issues
A wide variety of encryption mechanisms and tools is available to implement encryption.
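One-way hashes of the kind described above are available in every modern standard library. A quick illustration, using Python's hashlib with SHA-256 (an arbitrary modern choice; the principle, not the particular algorithm, is the point):

```python
import hashlib

# A one-way hash: anyone can compute it, but nobody can run it backwards to
# recover the message; even a one-character change scrambles the digest.
digest1 = hashlib.sha256(b"Pay A. Smith $100").hexdigest()
digest2 = hashlib.sha256(b"Pay A. Smith $900").hexdigest()

print(digest1)
print(digest2)
print(digest1 == digest2)   # prints False: tampering is detectable
```

Identical inputs always produce identical digests, and different inputs almost certainly do not, which is what makes one-way hashes usable for digital signatures and certificates.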
A proactive encryption plan will result in greater data security, customer confidence, and corporate-data integrity.

Summary of potentially negative issues
In the United States there are restrictions on the export of some encryption technologies. In 1996 the US Department of Commerce relaxed export restrictions, stating that "Any encryption commodity or software, including components, of any key length can now be exported under a license exception after a technical review to any non-government end-user in any country except for the seven state supporters of terrorism." Encryption software can not be exported from the United States to Cuba, Iran, Iraq, Libya, North Korea, Sudan or Syria. However, in 2004 several restrictions were re-implemented: "mass-market" encryption products with more than 64 bits of symmetric key require review before they may be exported and re-exported without a license. The law covers specific products and technologies, and US-based companies need to understand their obligations under this legal framework. The UK and the EU have similar laws pertaining to symmetric algorithms over 64 bits, which require the granting of an Open General Export License (OGEL) and prohibit export to Afghanistan, Iraq, Libya, North Korea, and China.

Regardless of legal restrictions, strong encryption algorithms are universally known, and even published in books, so a moderately competent programmer anywhere in the world could recreate them easily.

The UK has also enacted the Regulation of Investigatory Powers Act 2000, which requires that anyone who has created an encryption key must be able to produce the key if required to do so by a legal authority. This is a very controversial act, since the safety of public-key encryption systems relies upon keys not being kept by any party but the one owner.

It is easy for a programmer to overestimate the security of a new encryption scheme they have invented. Assessing the security of an encryption algorithm is an exceptionally difficult task, requiring extensive training and experience well beyond that provided by even the most advanced technical degrees. Management would be well advised not to give full rein to amateur cryptographers designing corporate security procedures.

References
- R. Needham and M. Schroeder (1978). "Using encryption for authentication in large networks of computers," Communications of the A.C.M., Volume 21, No. 12.
- B. Schneier (1996). Applied Cryptography (New York, John Wiley and Sons).
- http://www.bxa.doc.gov/encryption/MassMarket Keys64bitsNUp.html.

Associated terminology: Public key–private key, One-way hash, Password.

End-user development

Definition: End-user development refers to any system that is either written or configured by an end user rather than being completely developed by a systems professional for that user.

Overview
The development of computing from the 1940s until the 1980s allowed users to obtain a wide variety of processing solutions and reports. These systems were generally built and supported by a professional staff of operators, systems analysts, systems programmers, and applications programmers. This led to delays in the production of systems and in the implementation of changes, even relatively insignificant changes on a report generated by the system.
As technology became more flexible and organizational pressures demanded more flexible reporting, the delay from request to delivery became difficult to justify. In order to overcome this, technologies were deployed to allow users to develop their own applications. Some of these ran on the corporate data systems while others were stand-alone application suites that allowed users to create their own databases and applications on the corporate client–server system. For example, this allows a user at a company that has an ERP system to create their own reports to their own individual specification. End-user computing also occurs when users design (or more typically configure) applications that are then run upon a personal computer. Spreadsheets are a very common example.

Business value proposition
The ability of users to develop their own data sets and then design and run applications to provide customized reports typically reduces the demands upon the IT professionals and the IT organization overall. Usually these applications are developed as configurable modules of commercial applications. This saves the end user from having to become a technologist in addition to doing their own job, while maintaining some degree of integrity in the system being generated and used. This is important because systems frequently need to download data and communicate with other users as well as with corporate offices, so a wide mix of data types, structures, and reporting would cause more problems than end-user development solves.

Summary of positive issues
Commercial applications typically offer end users the ability to configure internal systems to meet their own specific systems needs. Applications can usually be configured to form a common basis upon which all users develop their systems, thus avoiding the problems associated with every user developing disparate systems.

Summary of potentially negative issues
It needs to be remembered that end users are usually not computer scientists or information technologists. While they may be skilled in the processes around which they develop their systems, they might not be fully aware of the issues and limitations that can have significant effects. This requires that any end-user development be done in a controlled and carefully managed environment. A further issue for organizations is that, while the technology department of the organization is freed from the obligation to perform all systems development, end users now need to be trained and have their skill sets upgraded as the technology and the organization's use of it change.

Reference
- M. Mahmood (2005). Advanced Topics in End User Computing Series, 2nd edn. (Hershey, PA, Idea Group Publishing).

Associated terminology: ERP, Visual Basic.

Enterprise information portal (EIP)

Definition: An enterprise information portal provides users with access to data, applications, and reporting capabilities directly from their desktop, drawing from corporate ERP, business intelligence, and data warehouse systems.
Enterprise resource planning (ERP) systems
A new version of the data is immediately visible and accessible by all modules.

ERP systems enable end-to-end process cycles such as the procurement-to-payment cycle and are capable of interacting with ERP systems in external organizations through XML or other file types. ERP system modules are typically run on a single operating system and the modules have a consistent user interface.

Business value proposition
ERP systems are composed of a set of modules that perform a variety of business activities ranging from logistics to customer-relationship management. The primary benefit of the ERP systems philosophy is that the system modules are designed to integrate together, interacting through the single database. The modules are configurable to allow companies to align the processes contained in the module with their own needs.

ERP systems vendors have traditionally attempted to encode best industry process practices into their systems. Consequently, ERP vendors have created industry-specific versions of their systems, e.g., healthcare, manufacturing, and retail. ERP systems allow organizations to focus upon their core business processes and decommission old, hard-to-maintain, legacy systems.

Summary of positive issues
The design of ERP systems as integrated software environments whose modules communicate through a single database relieves organizations of the burden of maintaining multiple databases and modules that may have run on disparate operating systems and architectures. ERP vendors constantly revise their systems and work to incorporate not only best practice at the process level but also the latest legal and regulatory requirements. ERP vendors also work to incorporate the latest technologies into their environments (e.g., radio-frequency identity tags), relieving the corporate IT organizations of the task.

The ERP environment allows greater security and accountability as well as better corporate governance (for example, simplifying Sarbanes–Oxley and HIPAA compliance). ERP solutions, vendors, and consultants are available to support companies of all sizes from small businesses to the largest global enterprises. ERP systems facilitate the development of data warehouses since the single database resolves many of the data-conflict issues associated with extracting data from multiple sources.

Summary of potentially negative issues
ERP system implementations can be complex and expensive, depending upon the scale of the implementation, the number of modules, the number of users, and the number of critical functions being provided. Failure to implement them correctly can result in severe consequences, so the decision to implement an ERP is a board-level decision. An ERP system must be assessed with respect to the functionality that a company will require from it. The inability to meet the critical core needs of a company might tempt organizations to customize their ERP system, something that should generally be avoided.

Several vendors have written their ERP systems in proprietary specialized programming languages and this can affect the effort required to create any specialized modules or add-on functions. Some ERP systems do not support common operating systems and databases; this can affect the selection of the systems architecture.

Reference
- D. O'Leary (2000). Enterprise Resource Planning Systems (Cambridge, Cambridge University Press).

Associated terminology: XML, Operating system.
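The single-database principle, in which one module writes an update and every other module immediately sees the new version, can be sketched in a few lines. SQLite stands in for the ERP database here, and the table, column, and module names are invented purely for illustration:

```python
import sqlite3

# One shared database; the two "modules" below are simply two queries
# against the same store, as ERP modules share a single database.
db = sqlite3.connect(":memory:")
db.execute("CREATE TABLE purchase_orders (po_id INTEGER, status TEXT)")
db.execute("INSERT INTO purchase_orders VALUES (1001, 'ordered')")

# The procurement module records that the goods arrived...
db.execute("UPDATE purchase_orders SET status = 'received' WHERE po_id = 1001")

# ...and the payments module, reading the same database, sees the new
# version at once: no file transfer or reconciliation between modules.
status = db.execute(
    "SELECT status FROM purchase_orders WHERE po_id = 1001").fetchone()[0]
print(status)   # prints: received
```

Contrast this with pre-ERP architectures, where each module kept its own copy of the data and changes had to be batch-transferred and reconciled.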
Ethernet
The first component of the name gives the transmission speed in millions of bits per second; the second is always the word "base" (except in the rare case of 10-broad-36, where "broad" indicates that multiple channels of a cable-television system are used); the third indicates the type of cable used. When the third component of the name is a number, as in 10-base-2 or 10-base-5, it means that coaxial cable is used, and the number itself is the maximum length for a run of cable in hundreds of meters. So 10-base-2 means rather cheap narrow coaxial cable with no runs of more than 200 m (to be pedantic, the maximum is 185 m), and a transmission rate of 10 Mbps; 10-base-5, also known as Thickwire Ethernet, uses a thicker and more expensive coaxial cable to allow longer runs of up to 500 m.

When the third component of the name is a letter, it indicates a specific kind of non-coaxial cable. "T" refers to twisted-pair cable (telephone-like), and "F" refers to fiber-optic cable. A second letter or digit may be added to give more specific details: 100-base-F is Fast Ethernet using fiber-optic cables; 1000-base-T is Gigabit Ethernet using twisted-pair cables. The letter "X" is used informally to mean "anything," so 1000-base-X means all kinds of Gigabit Ethernet. "L" and "S" refer to light of long and short wavelength on fiber-optic cables.

When any kind of cable other than coaxial is used, it is no longer possible simply to connect multiple computers to the same strand of wire; additional items of hardware such as hubs and switches are needed to connect multiple systems. With twisted-pair and fiber-optic cables, each segment of cable connects only two devices, but it still works in fundamentally the same way as with coaxial cable. See Network devices for details.

Business value proposition
Several types of LAN technologies exist, including token ring, fiber-distributed data interface (FDDI), and ARCnet. By far the most extensive and popular network technology in use is that of ethernet, due to its low cost and the wide range of technical options for those deploying the system. The different options available to the network designer need to be carefully considered through a set of parameters, including the speed and bandwidth requirements for the organization, the budget available, and the possible future need of the organization for an increased bandwidth capacity.

Summary of positive issues
Ethernet is an almost universal network technology. The technology is easy to implement, scale, and adapt to meet organizational needs. Ethernet technology is used and universally understood by computer systems capable of being networked.

Summary of potentially negative issues
Network systems need to be monitored and stress tests must be performed. As network traffic grows, there is a potential for bottlenecks to be created and systems performance to be reduced.

References
- C. E. Spurgeon (2000). Ethernet: The Definitive Guide (Sebastopol, CA, O'Reilly Press).
- W. Stevens (1994). TCP/IP Illustrated (New York, Addison-Wesley).

ETL (extracting, transforming, and loading of data)

Foundation concepts: Database, Data warehouse, ERP.

Definition: A set of processes used to ensure that data is of a high quality, consistent, and uniform.

Overview
The effective use of a data warehouse or ERP system requires that the data in its database is of a high quality and uniform in nature. In order to achieve this, certain processes are undertaken when the data warehouse is initially created or when data is added to it.
These are the data-extraction, data-transformation, and data-loading processes of ETL.

The very nature of data warehouses, in that they are read-only data repositories intended to be used for analytic purposes and compliance archiving, requires that the data contained within them be of the highest quality. ERP systems similarly require data that is consistent and of a high quality. To ensure that these criteria are met, the data placed in the systems first has to be extracted from the source database, then examined for a variety of possible corruptions prior to incorporation, and finally placed into the database of the ERP or data warehouse.

The extraction process is usually performed by a software tool that is able to configure itself to adapt to the type (e.g., relational or hierarchical) and data structures of the originating database. The tool may need to be configured for a variety of types of legacy database that may be over 30 years old in some cases. Inside these databases there is an almost unlimited number of possible data structures in which the data could be encoded.

The data-extraction tool must also be capable of combining the data from a set of sources, each potentially different in nature (e.g., a relational database, a spreadsheet, an indexed sequential file, and a flat file). The process of data transformation (sometimes referred to as data cleaning or "scrubbing") ensures that the data warehouse contains just one unique data element for each data item in the database. This may be necessary because the originating databases may have used different data representations: for example, the warehousing database system may have used "US" to represent a country, the customer-relationship-management system may have used "U.S.A.," and the human-resources system may have used "United States," all meaning the same thing.

Once the data warehouse or ERP has been established, all future data that is written to the database needs to adhere to the data structures and conditions that have been established for the database, and hence must pass through a transformation process.

Business value proposition
A data warehouse that has not been populated with consistent data is useless, since the whole purpose of a data warehouse is to be able to use the consolidated data to perform Business intelligence (also known as business analytics). Without consistent data, the old adage "garbage in, garbage out" applies and may render any analysis useless.

ETL is also used when the database for an ERP system is being created. As with data warehouses, ERP systems require a cleansed, consistent database if they are to work as designed.

Summary of positive issues
The data-warehousing and ERP industries are mature, and this has led to a wide range of vendor-supported ETL products to manage the process. Vendors range from database vendors, data-warehouse vendors, and ERP vendors to specialty third-party vendors. There are many consultants and database specialists available to support the task. Once the data has been ported over to the new database, this facilitates future system-migration efforts since the data is already clean and consistent.

Summary of potentially negative issues
The creation of a data warehouse or ERP requires the data to pass through an ETL process; failure to perform this process correctly will lead to poor and possibly catastrophic systems performance. The ETL process can be a time-consuming activity and requires significant effort to ensure that
European Union Directive on Privacy and Electronic Commerce 2002
data set and then have to opt out if they desire. This prevents the merger of data sets within the business community without the prior consent of the individual.

The 2002 directive built on an earlier directive (95/46/EC) that discusses the controversial issue of inter-country data transfer. Article 25 of 95/46/EC states that "The Member States shall provide that the transfer to a third country of personal data which are undergoing processing or are intended for processing after transfer may take place only if, without prejudice to compliance with the national provisions adopted pursuant to the other provisions of this Directive, the third country in question ensures an adequate level of protection." So far the EU Commission has recognized five countries as meeting the provisions of the article: Switzerland, Canada, Argentina, Guernsey, and the Isle of Man, plus the US Department of Commerce (DOC) through its "Safe Harbor Privacy Principles," and the US Bureau of Customs and Border Protection for the transfer of air-passenger name records in order to provide adequate border protection.

The United States has proved a difficult market for the EU to ignore with respect to data traffic, since US corporations operate in a "sectoral" framework that "relies on a mix of legislation, regulation, and self regulation" (US DOC) and are clearly not subject to EU law. A "safe-harbor" compromise was developed, in which organizations can either join a self-regulatory privacy program that adheres to the safe harbor's requirements or develop their own self-regulatory privacy policy that conforms to the safe harbor (US DOC).

The safe-harbor policy includes components pertaining to the following issues.
- Notice: notifying and informing individuals about data collected and providing recourse for individuals who wish to remove or modify their own data.
- Choice: individuals must be given the opportunity to choose (opt out of) whether their personal information will be disclosed to a third party or used for a purpose incompatible with the purpose for which it was originally collected or subsequently authorized by the individual. For sensitive information, an explicit affirmative choice (opt in) must be given if the information is to be disclosed to a third party or used for a purpose other than its original purpose or the purpose authorized subsequently by the individual.
- Onward transfer pertains to the disclosure of information to a third party, stating that organizations must apply the notice and choice principles.
- Access covers the rights that individuals have pertaining to their ability to correct, amend, or delete information when it is inaccurate.
- Security pertains to the "reasonable precautions" safe harbors must take to protect personal information from loss, misuse, and unauthorized access, disclosure, alteration, and destruction.
- Data integrity pertains to the relevancy of the use of data in terms of the reason for which it was collected; and section viii details enforcement (US DOC).

References
- Official Journal of the European Communities, L 201/37, 2002.
- US Department of Commerce (2000). Safe Harbor Privacy Principles, Issues (Washington, DC, US Department of Commerce).

Associated terminology: Law cross-reference.
Fiber optics
The disadvantages of fiber optics are that the cables are more expensive, and the equipment needed at either end of a cable to convert between light and electronic signals is much more expensive. Fiber optics can only carry signals; optical computing devices, although of great theoretical interest, do not yet exist. Connecting together two fiber-optic cables is very difficult. It is also impossible to transmit any useful amount of power along a fiber-optic connection, so all connected devices must have their own power sources.

Business value proposition
Fiber-optic cables are advantageous for several compelling reasons. The first is their ability to carry a far greater bandwidth than is possible over any other medium. Second, they do not suffer from any electromagnetic interference (EMI) and hence can be incorporated into system designs where metal cables would either be impractical or require special and expensive treatment (e.g., automotive and aeronautic engine-management systems). Third, the cables permit data to be transmitted over large distances without the need for signal boosters. Fourth, the cables themselves are much lighter than metal wire, and thus are favored in weight-sensitive applications (e.g., in airliners).

Although the price of the cable is higher than that for metal-based cables, the versatility and advantages of the technology will increase its use and, as demand rises, costs of supply will drop. Eventually fiber optics will probably become the de facto standard medium for data transmission.

Summary of positive issues
Fiber-optic cables are light-weight, have a high-capacity bandwidth, are not subject to EMI, and can carry signals for up to 100 miles without the need for retransmission or a signal booster.

Summary of potentially negative issues
Fiber-optic cable technology (the cable itself and the light–electronic signal converters) is more expensive than traditional metal cable technologies. Connecting two cables together is difficult. All devices connected to the cable need their own power supplies because no useful power may be transmitted through a fiber-optic cable.

Reference
- D. Goff (2002). Fiber Optic Reference Guide, 3rd edn. (Woburn, MA, Focal Press).

Associated terminology: LAN, Ethernet.

File server

Foundation concepts: Network, Client–server.

Definition: A computer or group of computers set aside to act as a data repository, storing files for many users, and making them accessible over a network.

Overview
When it was normal for organizations to have one main computer, which was used by everyone who needed computer resources, all files were kept on the same system, and were, in principle at least, accessible to all. Now that everybody has their own computer, sharing files and accessing one's own files from a different location has become an important concern. Portable removable media (such as flash memory cards) are a partial solution, but require that the need for a particular file is anticipated. The various computers on a network could all be set up to run FTP (file transfer protocol) servers so that files can be retrieved over the network from anywhere, but that adds to the administrative load, and creates a significant security risk.

The other solution is to provide a File server. This is one or more specially designated computers on a network that are responsible for storing files and making them available to authorized users.
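The work cycle a file server imposes (download when work starts, edit locally, upload when finished) can be sketched with plain directory copies standing in for the server. The folder layout here is an invented stand-in; a real deployment would use NFS, a shared folder, or similar:

```python
import shutil
import tempfile
from pathlib import Path

# Two directories stand in for the file server and a user's workstation.
server = Path(tempfile.mkdtemp(prefix="server_"))
workstation = Path(tempfile.mkdtemp(prefix="local_"))
(server / "report.txt").write_text("draft v1")

# 1. Download the file when work starts.
shutil.copy(server / "report.txt", workstation / "report.txt")

# 2. Work on the local copy.
(workstation / "report.txt").write_text("draft v2")

# 3. Upload the result back to the server when finished.
shutil.copy(workstation / "report.txt", server / "report.txt")

print((server / "report.txt").read_text())   # prints: draft v2
```

A shared folder automates exactly these copy steps behind the scenes, which is why many users never notice a server is involved at all.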
Normally, when a file server is in use, the users' own computers cease to be the primary storage site for their own files; everything of importance resides on the file server: it is downloaded when work starts, worked on locally, then uploaded back to the file server when finished. Many users prefer to keep their own files on their own computers, perhaps wisely not trusting all their eggs to one basket; they must simply remember to upload up-to-date copies of their files to the server at reasonable intervals.

The use of a file server is a great aid to system maintenance: creating regular backups of all files is much easier if all the files are in one place. File servers can also provide financial gains: if all large files are kept on a file server, then the individual workstation computers can have much smaller and therefore cheaper disks than would normally be possible. Some organizations go all the way to using Diskless systems, in which the individual workstations have no disks at all, relying on network access to a file server for absolutely everything.

There are some negative aspects to the use of file servers. If all files reside on a network-accessible file server, then the security of the entire network, and of that server in particular, becomes a major concern. Network traffic will be greatly increased if every file access requires communication with a remote file server, so higher-bandwidth network infrastructure may be required. Even the fastest of networks can not provide the same access speed as that available from a local disk drive, so there may be an appreciable slowdown in some applications.

The term file server may be correctly used to refer either to the computer that holds the files, or to the software application that runs on that computer handling the requests. File server software is technologically fairly simple, so it is not difficult to find reliable implementations from a variety of sources. Using a file server can also be an equally simple procedure. In popular systems, a Shared folder is used. This appears as a perfectly normal folder on the desktop, which can be opened and accessed just like any other folder, but in fact represents an automated portal to a file server. For example, adding a file to the shared folder is translated behind the scenes into the correct sequence of network commands to send that file to the server. Many users happily use a network file server for years without ever realizing it. NFS (Network File System, not to be confused with NSF, the National Science Foundation), originally created by Sun in 1984, is one of the oldest and most popular of current systems. Versions of it are often freely available for most versions of Unix and many other platforms.

Business value proposition
File servers are used in organizations for a variety of reasons, including centralized security, user management, application license management, cost, backup processes, efficiency, and integration. Having all the files on one server allows that server to be at the center of all security efforts; all requests for data can be examined equally and verified, preventing localized security breaches and duplication of effort in securing many localized data storage facilities. An extension of centralized security is the centralized control of user access and the creation of user profiles (descriptions of what resources each user is permitted to access). This can help in the creation and enforcement of access policies and ensure compliance with regulatory requirements such as Sarbanes–Oxley.

The use of file servers allows network managers to ensure that the applications running on corporate systems are in accordance with the licenses held by the organization. The centralization of licenses also
File system
- FAT (File Allocation Table), which has three varieties: FAT-12 (used only for floppy disks), FAT-16, and FAT-32. The number 12, 16, or 32 in the name refers to the number of bits used in the unique identification number of each file. For example, FAT-12 uses only 12 bits, which can produce only 2^12 or 4096 different numbers, so, under the FAT-12 system (and therefore on any PC floppy disk), it is not possible to have more than 4096 files. The same logic imposes a limit of 65 536 files on any FAT-16 system. This restriction, together with the maximum of 11 characters in a file name and 4 GB maximum disk size, has rendered FAT-16 obsolete. FAT-32, introduced in 1996, relieves some of these limits, supporting disks of up to 2048 GB with individual files of up to 4 GB and the now-standard long file names. However, FAT systems provide hardly any support for recovery after disk crashes, and even the unexpected loss of electrical power can result in enormous loss of data. FAT systems also suffer from clustering, a technique that significantly raises the maximum supportable disk size at the expense of also raising the minimum possible file size. A normal FAT-32 system on a large disk will require every file, even if it contains only a single byte of data, to occupy at least 32 768 bytes of disk space. FAT-32 is generally available only under Windows.

- NTFS (the Windows-NT File System) is a Microsoft product introduced for Windows-NT, and supported by all modern Windows variants. It provides faster access to data in large files, automatic encryption, logging of file accesses at various levels, individualized file-access protection settings, and good recovery after disk, computer, or software failures. It is the file system of choice for Windows-XP.

- Unix. All versions of Unix (Linux, FreeBSD, Solaris, etc.) use minor variations on the same file-system design which, surprisingly, has no name (apart from UFS, which simply stands for Unix File System). Its most recognizable and namable feature is the I-Node, a small data structure used to represent an individual file. I-Nodes are probably also the most troublesome feature, since they must be created in advance when a disk is formatted; if the number of I-Nodes needed was underestimated and they are all used up, it is impossible to create any new files. Unix file systems generally allow for very long file names, and always provide individual access permissions on files and directories.

- ISO9660 is the file system used for data on compact disks (CDs). It has two forms: Level-1, which allows only 11-character file names; and Level-2, which allows up to 32-character file names. Both permit folders to be nested only eight deep. Joliet is a commonly used set of extensions that greatly relieves these restrictions. Micro-UDF is similar in spirit to ISO9660, and is used for data DVDs. UDF (the Universal Disk Format) is another newer extension to ISO9660, which is used only for optical disks (CDs, DVDs, etc.).

- HFS (Hierarchical File System) and HFS-Plus were the file systems used by Apple Macintosh computers before MacOS-X, which is based on Unix (the BSD version). They support long file names, file access permission setting, and the more complex file structures that Macintosh systems were famous for, but have other limitations similar to those of FAT systems.
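The arithmetic behind these FAT limits can be checked directly. A small illustrative sketch (ours, not the book's), in Python:

```python
# Illustration of the FAT limits described above (our sketch, not the book's).

def max_files(id_bits):
    # An n-bit file identifier can take 2**n distinct values,
    # so at most that many files can exist on the disk.
    return 2 ** id_bits

print(max_files(12))  # FAT-12: 4096
print(max_files(16))  # FAT-16: 65536

def space_occupied(file_size, cluster_size=32768):
    # Clustering: disk space is allocated in whole clusters, so even a
    # one-byte file consumes a full cluster.
    clusters = max(1, -(-file_size // cluster_size))  # ceiling division
    return clusters * cluster_size

print(space_occupied(1))  # a 1-byte file still occupies 32768 bytes
```

The same calculation explains the trade-off noted above: wider identifiers raise the file-count ceiling, while larger clusters raise the minimum footprint of every file.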
[…]

Firewall
…on a case-by-case basis for anything unexpected.

The primary purpose of firewalls is, of course, to restrict communications into or out of a controlled area. If used carefully, they can prevent attacks from getting in, and secrets, corporate or personal, from getting out. Firewalls can be a heavy-handed solution; employees often bemoan the fact that they can not use FTP and therefore can't download the latest version of some application, but that is just the firewall doing its job especially well. It is more often firewall-management policy that is at fault rather than the firewalls themselves.

Of course, a firewall is just another human construct, as fallible as any other. If a firewall system itself has security vulnerabilities through which it can be taken over, then all is lost. Single-purpose firewall hardware units are the least likely to have such vulnerabilities, but are by no means to be considered immune. Software firewalls are certainly the most vulnerable, since they are merely software applications themselves, and are just as likely to be replaced or modified by a successfully attacking virus. For any system holding critical data, the selection of a firewall is a critical stage in setting up security, and requires diligent research: some providers have expensive products and a long history of satisfied customers; some providers are very cheap, but have never been heard of before. A small saving at setup may have a very large cost later. A vulnerable firewall means an open computer system.

Business value proposition

The key to a secure IT operation is to ensure that unwelcome traffic stays outside the walls of the organization. If there is any possible doubt about the security of a computer system (and with closed-source systems how can there not be a doubt?), then the provision of a reliable firewall system is central to safe operation. Firewalls examine network traffic and resource requests, assess the threat level, and take the appropriate action. The application of firewall technology is the responsibility of the network manager and the chief security officer (CSO); in a small organization the CSO may well be the network manager. An appropriate firewall technology and vendor need to be selected and put in place. In the selection of a firewall, the network manager needs, amongst other things, to consider the reliability of the device, its cost relative to performance, whether a hardware or software device is appropriate, the scalability of the selected system, vendor or third-party support for the device, the historical record of the vendor, and the total cost of ownership.

The implementation of a firewall device needs to take into account the sensitivity requirements of the network to which it is attached and the requirements of its users. Should all traffic be prevented from entering the system, or just selected traffic? If the sensitivity is set too low, too much traffic will be allowed through, with the potential for breaches in security; if it is set too high, users will be unhappy because "normal" emails may be rejected. These aspects of deployment need to be contained within a firewall-management policy document.

Summary of positive issues

Firewall technology helps to prevent unwanted data traffic from entering a network. The technology is mature, well understood, and supported by vendors, consultants, and network specialists. Firewalls can be used to prevent access to specific resources.

Summary of potentially negative issues

Firewalls are not immune from attack themselves, and software firewalls in many instances may themselves be targets of hostile intent. Poor firewall-management policies can result in restricted access to the network both for internal and for external users.

Reference

- J. Wack, K. Cutler and J. Pole (2002). Guidelines on Firewalls and Firewall Policy: Recommendations of the National Institute of Standards and Technology, US Department of Commerce Special Publication 800-41 (Gaithersburg, MD, US Department of Commerce).

Associated terminology: Proxy, Security.

Flash memory

Foundation concepts: Storage, Memory, Disk.
Definition: A form of stable memory maintaining its contents indefinitely without any power consumption, and having the robustness of a solid-state device, with no moving parts.

Overview

In an ideal world, the memory used in a computer would have five essential characteristics: it would be
(1) of low cost,
(2) of high capacity,
(3) of high speed,
(4) non-volatile (meaning that it should not forget its contents when the power supply is turned off), and
(5) robust (both solid state, having no moving parts that will inevitably suffer from wear and fail, and having no delicate components).

Naturally, the real world is far from perfect, and no form of memory yet devised satisfies all five of the requirements. Traditionally, the problem has been solved by building computers with two forms of memory: one (usually called RAM) that satisfies requirements (3) and (5), and the other (usually in the form of a disk drive) that satisfies requirements (1), (2), and (4).

Flash memory is a form of EEPROM (electronically erasable programmable read-only memory) that satisfies requirements (4) and (5). It is a relatively new technology, and advances toward the first three requirements are being made, but there is still a long way to go before flash memory can replace either RAM or disks in general-purpose computers. At the time of writing, the best statistics for commercially available units are very approximately as follows:

Criterion     Units         RAM    Flash   Disk
Cost per GB   --            $100   $100    ½ c
Capacity      GB            2      4       500
Speed         ns per byte   0.5    50      250
Non-volatile  --            No     Yes     Yes
Robust        --            Yes    Yes     No

(In this table, each technology is given its best possible circumstances. For example, flash-memory cards can attain transfer rates of 20 MB per second, which gives 50 ns per byte, even though a single byte can not be accessed in 50 ns; similarly, disk drives can provide one byte in an average of 250 ns only as part of a large transfer.)

Currently, flash memory provides a useful portable medium. Data may be transferred to it very quickly and conveniently, and it is very light and hard to break, so it may be transported with great ease. Usually, flash memory is in the form of small cards or other portable devices. The cards must be connected to a computer through a special USB device; the popular (and mutually incompatible) forms are Compact flash (two different forms), Secure digital, and Memory stick (two forms). Other flash memory devices have a built-in USB plug, so they need no extra support.

There is one further disadvantage to current flash memory technologies. Although they are completely solid state, the procedure required to erase and write data is slightly destructive, and flash memory devices can not be expected to survive more than one million cycles (fewer for many kinds). Reading data has no destructive effect, only erasing and writing. This limitation is of no significance to current uses, but, if flash memory is ever to be used to replace RAM, for which a million read–write cycles occur in less than a second, this will be another hurdle to be overcome.

Business value proposition

Flash memory has many desirable properties for a memory device. It is capable of holding a meaningful amount of data, as opposed to the floppy disk, which has become almost redundant due to its small storage capability, which forces users to employ compression or other techniques to save even modest files. Flash memory has relatively fast data transfer rates, and most useful of all is that it is portable and can employ the convenience of a USB connection. These characteristics make it ideal for the business traveler or individual who wishes to carry their data with them on a trip.

Security concerns also need to be addressed by flash memory users; the devices have no inherent security built into them, and thus users need to ensure that the data stored is encrypted as necessary (e.g., to meet HIPAA requirements). Of course, the same security concerns apply equally to other portable media. The small size of the memory devices aids their portability and can enhance data security, because users can wear the device on a cord around their necks and thus reduce the potential for data theft; a laptop is a much more obvious target for a thief than is a hidden flash memory device.

Summary of positive issues

Flash memory is convenient, portable, easy to use, possesses a relatively high transmission rate, can hold more data than a floppy-disk device, and is solid state in nature.

Summary of potentially negative issues

The cost of flash memory is higher, per unit of capacity, than that of disk memory, and flash memory stores considerably less data. There is no built-in security associated with the devices.

Reference

- P. Cappelletti, C. Golla, P. Olivo, and E. Zanoni (1999). Flash Memories (Boston, MA, Kluwer Academic).

Formal methods

Foundation concept: Software development lifecycle.
Definition: Mathematically rigorous software development methodologies.

Overview

Software development is sometimes viewed as a science, sometimes as an art, but mostly is considered to lie in an ill-defined variable position somewhere between the two. The scientific method is based on a continuous cycle of observation, theorization, and experimentation. Observation corresponds to investigating what is needed in new software, or observing the results of tests of existing software. Theorization corresponds to working out what those observations mean, and thinking of a way to implement or improve the implementation of a solution. Experimentation corresponds to running or testing the implementation, and seeing what happens in that test brings the cycle back to observation.

The artistic method is based on experience and the intuitive expertise that comes with it: the experience of having implemented many software projects before, and knowing which techniques work well under which circumstances, and the academic experience of having learned the "right" way to do things and knowing how to apply that knowledge. Neither method can be successful alone; successful software developers slowly discover their preferred place on the art–science spectrum.
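The gap that formal methods must bridge is between a specification, which states what is required, and a program, which states how to compute it. A toy sketch of the distinction (ours, not the book's), in Python:

```python
# Toy illustration (ours, not the book's): a specification states WHAT the
# answer is; a program states HOW to find it.

def condition(n):
    # An example property: the square of n exceeds 2000.
    return n * n > 2000

# Specification (declarative): "the answer is the smallest positive integer
# satisfying the condition."  Nothing here says how to find it.

# Program (operational): consider the numbers 1, 2, and so on without
# limit, testing each in turn; when one is found, stop.
def smallest_satisfying(p):
    n = 1
    while not p(n):
        n += 1
    return n

print(smallest_satisfying(condition))  # 45, since 45*45 = 2025 but 44*44 = 1936
```

A formal development would derive the second form from the first by proven transformation steps, rather than by ad hoc coding.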
…"1, 2, and so on without limit, testing each in turn to see whether it satisfies [some condition]; when one is found, stop the search because you've got the answer." This describes the same result, but this time also says how to find it.

The well-understood method of mathematical proof supports exactly that kind of transformation. It repeatedly applies small transformations that are known to preserve correctness to an initial statement. If the initial statement was correct, the result will also be correct. The formal specification is transformed into an executable program by applying a carefully chosen sequence of simple changes that have already been proven to preserve correctness. This is also a very difficult stage. There are software development tools that will apply the chosen transformations to the guaranteed correct specifications, removing the drudgery and the possibility of transcription errors, but at each step there will be an enormous number of possible transformations that could be applied, and it is absolutely impossible for a software tool to know which is the "right" one.

Even formal methods rely completely on human intelligence. The difference is that any program produced by formal methods is absolutely guaranteed to be correct. What is not guaranteed is that any effectively executable program will be produced at all. Heuristic searches and other techniques of artificial intelligence can help to automate the search for the right transformations, but it is logically impossible for there to be any method that will always find the answer. Even human programmers can't guarantee to be able to program a solution to every problem they might be given.

The potential gains from making formal methods easier to use are enormous. The ability to produce guaranteed perfectly correct programs even just sometimes would be worth a lot. As a result there has been a lot of research into formal methods, and some variably useful tools have been produced to aid the process. The most promising of these are Z (pronounced "zed"), VDM (the Vienna Development Method), the Hoare Logic, the ACL2 Theorem Prover, the HOL Theorem Prover, CSP (Communicating Sequential Processes), and Extended ML, although there are many more. Some non-traditional programming language paradigms, particularly pure functional programming and logic programming, transform specifications that would normally seem to be completely abstract and unexecutable into directly executable programs.

Business value proposition

The need to create programs quickly and correctly is at the very center of computing. The promise of perfect programs, flawlessly working and optimized, first time and every time, is the ideal sought by programmers, IT-project managers, and everyone connected to an IT organization, including the end users. However, this utopia is rarely, if ever, reached by anyone, even those organizations certified as Level 5 on the Software Capability Maturity Model. Typically software is developed in a very ad hoc manner. For example, some code may be taken from a previous project, edited, tested, and then added to. The processes may follow a rough informal process model, e.g., a waterfall model or a spiral model, or be termed a "prototyping" process model and couched in JAD terminology. Fundamentally, many professional and trained computer scientists acknowledge that, if houses, bridges, and high-rise buildings were constructed to the same standards as the majority of software, they would either fall down during construction or be condemned as unsafe and unfit for occupation.

The rise of formal methods as a discipline grew from the recognition that the profession must move away from the position where anyone can claim to be a programmer and proceed to sell bad code to the public, toward becoming an engineering profession. Formal methods aim to provide the professional programmer with a set of tools and methodologies that, when used correctly, will result in systems that match their specifications, as agreed upon by the software engineer, the user, and, when appropriate, the corporate management.

The use of formal methods in IT organizations is very limited, and there are several reasons for this. The first is that the training to become a true software engineer is long, technically challenging, and requires intellectual rigor. Second, potential software engineers are put off from studying this topic because the demand from industry is low and the rewards consequently limited. Third, companies do not recruit software engineers because many IT organizations do not understand what true software engineering offers them. A mistaken view is that adhering to a process model such as MIL-STD-2167A, using the waterfall method, or using RAD methods equates to software engineering. This is simply wrong: weak development methods that do not encapsulate mathematical design principles are not formal methods of software engineering. Fourth, IT organizations are typically not willing to change their development model to this higher standard, since it demands a higher caliber of human resources, is perceived as taking longer to develop code, and is unintelligible to non-technically trained developers, vendors, and users.

Some organizations have attempted to use formal methods, with varying degrees of success. Formal methods are particularly suitable for problems in which failure may threaten human life, e.g., in the space program, the nuclear industry, and airlines. However, until the computer industry as a whole demands that its workers possess true professional qualifications in the way that professional engineers, chartered surveyors, and others do, with a requirement for continuous training and acknowledgement of liability for errors, the current ad hoc situation of random development will continue.

Summary of positive issues

Formal methods are the subject of much current research and great academic interest; successful use of formal methods would provide great benefits to an organization. The Software Capability Maturity Model is available to measure an organization's progress from the informal to the repeatable formal style of development. A wide range of formal methods exists, including functional programming and proof systems. Formal methods enable the user and the software engineer to agree on a specification of the product that will be delivered. Formal methods permit proof that the specification and the delivered code are the same.

Summary of potentially negative issues

Formal methods are very much an evolving technology. There is a large academic literature base, but few usable software development tools that truly adhere to formal methods. Formal methods require intense specialist training. Formal specifications are difficult for users and customers to read, and hence to comment upon or give informed assent to. Adoption of formal methods requires different development processes from those of non-formal styles of development.

References

- W. Gibbs (1994). "Software's chronic crisis," Scientific American, September.
- A. Harry (1996). Formal Methods Fact File, VDM and Z (New York, John Wiley and Sons).
- C. Jones (1990). Systematic Software Development Using VDM (Englewood Cliffs, NJ, Prentice-Hall).
Fortran

[…]

Fourth generation
…subject to continuous wear, which gradually destroys the device.

Fast reliable computing could only be built on a technology that allows machines without moving parts to be built. This seems to be a contradiction in terms, but the only thing that moves in a CPU is electricity: electrons flowing through a crystalline lattice of atoms. Other parts of a computer (disk, keyboard, mouse, CD drive) still have mechanical components, and are the only parts of a modern computer (apart from software) that are ever likely to fail.

The ability to use one electrical signal to control or switch another electrical component without any mechanical intervention was realized with the invention of the triode in 1906. Triodes and related devices, known collectively as vacuum tubes in the United States and valves in the UK, are extremely delicate and expensive. They consist of a fragile glass tube containing, in a vacuum, a small heating element that heats an electrode (the cathode) to around 4000 °F. Electrons are boiled from the surface, and fly through the vacuum to another electrode (the anode), completing an electrical circuit. A voltage applied to a third mesh electrode (the grid) interposed between the anode and cathode suppresses the flow of current, and thus allows one purely electrical signal to control another. With their thin glass, hard vacuum, and high temperatures, vacuum tubes have a very short operational lifetime, but they can operate many thousands of times faster than any mechanical device. They were in common use for decades, but it was not until the 1930s that anyone realized that they could be applied to digital signals to perform numerical computations. This was the first generation: electronic computers with logic circuitry built from vacuum tubes. The first generation of computers started with Colossus in 1943 and ENIAC in 1946, and continued until the late 1950s.

Vacuum tubes made computers possible, but transistors made them practical. A transistor is a very robust (compared with a vacuum tube, almost indestructible), small, cheap, low-power device built from semiconductor materials. Electricity can flow through semiconductors just as electrons can flow through a vacuum, and applying a voltage correctly to a third connection can disrupt or enhance a current flowing between two primary connections, allowing one signal to control another just as with a triode. Although the physics is very different, a transistor is conceptually similar to a triode. Second generation computers used transistors in their logic circuitry where the first used vacuum tubes. This greatly improved reliability, price, and computing power. A first generation computer with perhaps 1000 vacuum tubes might be expected to run for a couple of hours before a tube burns out and has to be replaced. Significantly increasing the number of tubes would significantly decrease the mean time between failures. A second generation computer with many more transistors would usually operate reliably until some auxiliary mechanical component failed. The second generation lasted until about 1970.

A transistor is essentially a tiny piece of silicon subtly modified ("doped") in strategic places to make different kinds of semiconductor material. It is possible to take a larger piece of silicon, make a more complex pattern of modifications to it, and construct multiple transistors on a single solid component. Individual transistors have to be individually mounted and soldered onto circuit boards along with wire connectors and other parts to make a useful electronic component. An integrated circuit is a single chip of silicon with many transistors and connectors etched onto it. One small silicon chip can replace a whole circuit board and be manufactured in large quantities completely automatically. Third-generation computers use integrated circuits for all of their major components. This makes another significant reduction in size, power …
[…]

[Figure: A triode, an equivalent transistor, and two integrated circuits, one SSI, containing about 25 transistors, and one VLSI, containing about 25 000 000 transistors. The scale on the left is in inches.]

FTP (file transfer protocol)

[…]
no special setup, but FTP servers are not availability on computers makes this app-
always provided for free, and need to be roach useful for data transfer anywhere a
installed and configured correctly. Installa- network connection is available.
tion and configuration is just a few min-
utes’ work for the right personnel, but must Summary of positive issues
be done correctly. A mis-configured FTP FTP is a cheap, efficient, and widely avail-
server can leave every piece of data on the able method for data transfer.
computer open to the entire internet.
In normal use, a user wishing to transfer Summary of potentially negative issues
data will start up the FTP client software It requires care in configuration. Security
and tell it to connect to the desired server issues in several areas need to be addressed,
system. The user provides a user-name and including the use of ports used to commu-
password, and, if they are accepted by the nicate through, the firewall configuration,
server, files and folders may be transferred and the data files that are allowed to be
in either direction, often with a single click accessed.
or drag-and-drop operation. Server admin-
istrators may control exactly which files References
r W. Stevens (1994). TCP/IP Illustrated (New
and folders any given user or group of
users may access, and what kind of accesses York, Addison-Wesley).
r http://www.ietf.org/rfc/rfc0959.txt?
(read, modify, delete) they may make.
FTP also has an anonymous mode, in number=959.
which certain files are made accessible to Associated terminology: File server,
all, without the need for a user-name or Server--client, Electronic data interchange,
password. This provides a very convenient X.12.
means for making files publicly available,
but is also a possible security flaw: acci-
dentally allowing anonymous FTP circum- Functional programming
vents file protections. Most web browsers
understand FTP, and provide a very intu- Foundation concepts: Programming language, For-
itive interface for downloading files. For mal methods.
large amounts of data, FTP is superior to Definition: A style of programming in which compu-
HTTP, the default web protocol. tations are expressed as pure mathematical functions
FTP can be problematical for firewalls, that can be directly executed.
since every FTP transfer requires the use of
an arbitrarily selected internet port, so sim- Overview
ply leaving ‘‘the FTP port” unblocked is not Functional programming is a widely mis-
enough. Modern firewalls are constructed understood concept. The description is
with complete knowledge of FTP and other simple -- pure mathematical functions are
commonly used protocols, and are usually used to express computations (programs) --
able to handle the situation, albeit with but the true significance of ‘‘pure math-
some complexity. ematical” is often missed. A function in
the mathematical sense is not what most
Business value proposition programmers think of as a function. A
FTP is a simple, efficient mechanism for mathematical function is a fixed mapping
data transfer between two computers con- from each of a set of possible inputs to a
nected over a network. The approach allows corresponding output. The same function
large amounts of data to be transferred applied to the same inputs must always
easily and effectively. FTP’s almost universal deliver the same results, regardless of
159
Functional programming
circumstances. In fact, the idea of ‘‘circum- major areas of computer science research.
stances” does not mean anything in this Popular implementations that either are
setting; mathematics does not vary accord- purely functional or at least support mathe-
ing to context. matically functional programming include
This very basic notion of requiring that LispKit, KRC (Kent Recursive Calculator),
the same function applied to the same Miranda, Haskell, AFL (A Functional Lan-
inputs always produces the same results guage), FP, Hope, and Clean, but there are
is essential to all of mathematical reason- many more. A theoretical construct known
ing. The standards of proof simply do not as Monads is a topic of current research,
work if you can write the same thing twice attempting to make the benefits of func-
but have it represent a different value each tional programming available to more tra-
time. If this absolute constancy is applied ditionally oriented programmers.
to the design of programming languages, it
becomes possible to produce perfect proofs
Business value proposition
of a program’s correctness. New techniques
Functional programming has a vast poten-
of program development become possible,
tial for the development of mathematically
in which a statement of the problem to be
correct programs. Functional programming
solved is simply transformed into a solution to that problem.

The difficulty is that standard programming languages do not behave at all like this. The vast majority of programmers would consider it impossible to write a program to perform any useful task if it is not allowed to change anything. True functional programming does not permit variables, loops, "print statements," or any of a multitude of features normally thought of as essential to programming. In order to take advantage of the vast benefits that functional programming has to offer, programmers have to almost relearn their trade right from the beginning. That is not an easy, or a cheap, thing to do.

There are some popular programming languages that are often called functional when they really are not. Lisp, the workhorse of artificial intelligence, is very frequently thought of as a functional language because in Lisp everything looks like a function. Appearances are irrelevant; Lisp is not a mathematically functional language, and hence does not provide the benefits of one. That is not to say that it is a bad language; Lisp is very useful, it is just not functional.

Many real functional languages are available, and their constructs also help programmers to create solutions in such a way that they are not constrained by the arcane rules of procedural programming. Unfortunately, many programmers are unaware of this branch of computer science, and hence the use of this style of programming in the "real world" has been limited.

It should be noted that the number of "lines of code" required to perform an operation in a functional language is frequently very small compared with that for a procedural language, and the functions themselves can also be easy to read with some training. The style of programming has the potential to provide cleaner, smaller (in terms of amount of code), partially self-documenting programs, which reduces the overhead on the developer. However, when programmer productivity is measured in terms of lines of code delivered, a very distorted picture is given.

Summary of positive issues
Functional programming is mathematically pure and is open to indisputable proof. A significant academic literature has developed on the topic. Many free functional programming languages are available. The programs themselves can be
transformed and reasoned about mathematically.

Summary of potentially negative issues
Functional programming needs an awareness and knowledge of pure mathematics. The approach is not widely understood or known amongst programmers. The interpreters are often relatively slow at executing the code.

References
- S. Peyton Jones (1987). The Implementation of Functional Programming Languages (Englewood Cliffs, NJ, Prentice-Hall).
- J. Darlington (1982). Functional Programming and Its Applications (Cambridge, Cambridge University Press).
- R. Bird (1998). Introduction to Functional Programming (Boston, MA, Pearson).
- P. Hudak (2000). The Haskell School of Expression (Cambridge, Cambridge University Press).

Associated terminology: Logic programming, Formal methods, Software development.

Fuzzy logic

Foundation concept: Knowledge-based systems.
Definition: A system of logical reasoning that handles imprecise knowledge or information.

Overview
Fuzzy logic is a reasoning system that allows truth values other than "yes" and "no." The validity of a statement is expressed as a number somewhere between 0.0 (for completely false) and 1.0 (for completely true). For example, the truth value of "two plus two is four" would probably be exactly 1.0, whereas the truth value of "the Empire State Building is tall" might be 0.85. This is not because we are uncertain about the height of the Empire State Building, and it does not mean that it is probably tall. It is because the idea of tallness is not precisely defined; it is a fuzzy concept. The 0.85 represents the judgment that it is quite tall. By most standards the Empire State Building is tall, as when compared with people, other skyscrapers, or red ants, but it is not tall by all standards: when compared with mountains or thunderclouds, it is distinctly short.

The use of fuzzy logic allows automatic decision-making processes to progress when an exact algorithm can not be found, and it is generally considered to be a technique of artificial intelligence and expert systems working in the domain of human descriptions. For example, a search for "big orange cats" would be hard to code with an exact algorithm because both "big" and "orange" are fuzzy, human-oriented descriptions. It would be pointless to ask the user for further details, exactly specifying how big in terms of inches or pounds, and it would be absurd to attempt to quantify how orange the cat must be. A fuzzy logic system would allow "big" to be determined as a fuzzy combination of "long" and "heavy," accepting candidates that are quite long and quite heavy, together with those that are very long but only moderately heavy, and those that are very heavy but only moderately long, thus encapsulating all reasonable meanings of "big." It could then similarly filter all candidate cats, having reduced requirements for the degree of orangeness for those that are exceptionally big, and reduced requirements for the degree of bigness for those that are startlingly orange.

Fuzzy logic is very difficult to work with effectively. The assignment of levels of tallness, bigness, and orangeness on the basis of measurable quantities is entirely arbitrary and a matter for the system designer to decide. This makes success almost impossible because all users will have different ideas of how these terms should be defined. Then there is the problem of how to
combine fuzzy values: if a cat is big to the tune of 0.74, and orange to the tune of 0.9, how should those numbers be combined to give a single measure of how "big orange" it is? Again, there are no clear answers; it is for the system designer to decide, and for the users to inevitably disagree.

For these reasons, fuzzy logic has not been a very commonly used tool. At first sight it seems to be an obvious and good idea, but it is very resistant to encoding in the precise language of computer logic.

Fuzzy logic was introduced by Lotfi Zadeh of the University of California at Berkeley in 1965. He and others have developed a properly mathematically rigorous basis for fuzzy logic, which is known as possibility theory. The notation and concepts of possibility theory can easily be confused with probability theory, but it is a completely different thing. Probability theory is entirely uncontroversial and admits no imprecision, only uncertainty. Fuzzy logic and standard probability theory may be applied to similar computational situations, but have very different semantics. Saying that the probability of X being tall is 0.85 means that we don't really know whether X is tall or not; it probably is, but we can't be sure. Saying that the fuzzy measure of X's tallness is 0.85 means that we are absolutely certain that it is quite tall. Probability deals with uncertainty over well-defined attributes; fuzzy logic deals with certainties about ill-defined attributes. Exactly how connectives like "and" and "or" combine probability values is well known; how they combine fuzzy logic values is inherently unknowable.

Business value proposition
Fuzzy logic was originally developed in the 1960s by Lotfi Zadeh as a mechanism through which "fuzzy" problems could be considered. While probability theory can be applied to a variety of problems, it is not suited to all, and it requires that the problem solver has sufficient historical data to make a determination. Fuzzy logic allows software engineers to create programs that work for a variety of situations based upon imprecise knowledge of inputs, the state of the system, and even the desired results. Fuzzy logic technologies have been incorporated into a wide variety of processes and products, including those found in automobiles, e.g., cruise-control systems, traction-control systems, anti-lock braking systems, and cornering systems; household goods such as toasters, microwave ovens, and refrigerators; and entertainment products such as video cameras with automatic exposure and focus control.

Summary of positive issues
Fuzzy logic allows programmers to use truth systems that are non-binary in nature. It allows determination of outcomes when probability theory is not applicable. Fuzzy logic is a well-researched academic discipline and can be applied to a wide variety of problems and processes.

Summary of potentially negative issues
Fuzzy logic mathematics can become complex and it requires some special training. Fuzzy logic is difficult to work with effectively. The combination of fuzzy values in logic is difficult to determine in many situations.

Reference
- G. Klir and B. Yuan (1995). Fuzzy Sets and Fuzzy Logic (Boston, MA, Pearson).

Associated terminology: Artificial intelligence.
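The "big orange cat" discussion above can be sketched in a few lines of code. The membership thresholds and the choice of combination operator below are illustrative assumptions, not part of any standard; as the entry notes, these choices are entirely up to the system designer.

```python
# A minimal sketch of fuzzy membership and combination.
# All thresholds and the averaging operator are the designer's arbitrary choices.

def ramp(value, low, high):
    """Map a measurement to a truth value between 0.0 and 1.0."""
    if value <= low:
        return 0.0
    if value >= high:
        return 1.0
    return (value - low) / (high - low)

def big(length_in, weight_lb):
    """'Big' as a fuzzy combination of 'long' and 'heavy'."""
    long_ness = ramp(length_in, 15, 40)   # inches; arbitrary thresholds
    heavy_ness = ramp(weight_lb, 8, 25)   # pounds; arbitrary thresholds
    # Zadeh's classic fuzzy AND is min(); an averaging operator is softer,
    # letting an exceptionally long cat compensate for moderate weight.
    return (long_ness + heavy_ness) / 2

cats = {"Tom": (38, 12), "Mittens": (18, 9), "Goliath": (36, 24)}
for name, (length, weight) in cats.items():
    print(name, round(big(length, weight), 2))
```

Replacing the averaging operator with `min()` or `max()` changes which cats qualify, which is exactly the designer's-choice problem the entry describes.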
Hypertext, HTML
with the original. The viewer may read the new document, then return to the original, or read the two together, or follow up on another reference in a limitless chain. The immediate and concurrent access to linked documents, without the requirement for any searching, is what puts the "hyper" into "hypertext."

The support for connected documents, or Hyperlinks, provided by HTML is surprisingly primitive. The author of a document must decide exactly which other documents should be linked with any given section of his or her composition, and explicitly add markup tags giving the location of the document as a URL (uniform resource locator, q.v.). There is no mechanism for specifying a general look-up rule for all words in a document, nor is there any mechanism for ensuring that the linked document has not changed and is still relevant. What we have today is barely more sophisticated than the Memex system conceived (but never built) by Vannevar Bush in the 1930s; he envisioned a library of documents on microfilm physically linked together with fine threads, and a special machine to display those documents and follow threads when the user pulls a lever.

The World Wide Web as we know it today is for the most part a widely distributed collection of hypertext documents. Most are written in HTML, some still in plain text, and some use proprietary systems (the most popular of which is Acrobat), for which viewers are available for free but composition requires commercially available software. Surprisingly, there are no LaTeX viewers available that can be integrated with the standard web browsers.

Business value proposition
HTML is universal in its use as the base for hypertext web documents. HTML is a non-proprietary format developed through the W3C.org consortium and the HTML Working Group. HTML continues to evolve, with a working draft of the Extensible HyperText Markup Language (XHTML) version 2.0 having been released on May 27, 2005. XHTML (HTML written in XML) advances certain aspects of earlier versions of HTML and is backwards compatible, such that all older HTML documents continue to display correctly on newer browsers.

Summary of positive issues
HTML is an open, non-proprietary system. Basic HTML is easy to program, flexible, universally used in hypertext development, and understood by all browsers. HTML continues to evolve through the activities of W3C.org. HTML provides a strong basis for web-site development, and tools are available to help developers improve their productivity.

Summary of potentially negative issues
HTML has become by default the standard vehicle for hypertext presentation. Organizations need to deploy resources to understand it fully and to understand its limitations. This may be in the form of training and/or being active in the W3C.org HTML Working Group. The flexibility of HTML can be problematic for developers wishing to develop a web page with fixed dimensions. HTML is the dominant web development language and will remain so for the foreseeable future.

References
- C. Musciano and B. Kennedy (2002). HTML and XHTML: The Definitive Guide (Sebastopol, CA, O'Reilly).
- P. Gralla (2004). How the Internet Works (Indianapolis, IN, Que).
- L. Lamport (1986). LaTeX User's Guide & Reference Manual (New York, Addison-Wesley).
- V. Bush (1945). "As We May Think," Atlantic Monthly, July.
- http://www.w3.org/TR/REC-html40/.
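The explicitly authored nature of hyperlinks described in this entry can be seen in a short sketch: each link is nothing more than an `<a>` markup tag whose `href` attribute carries the target URL. The sample page and its URLs are invented for illustration; Python's standard `html.parser` module is used to pull the links out.

```python
# Sketch: HTML hyperlinks are explicit markup tags carrying a URL.
# The page below is an invented example document.
from html.parser import HTMLParser

page = """
<html><body>
<p>See the <a href="http://www.w3.org/TR/REC-html40/">HTML 4.0 spec</a>
and a <a href="http://example.com/memex.html">note on the Memex</a>.</p>
</body></html>
"""

class LinkCollector(HTMLParser):
    """Collect the URL of every explicit hyperlink in a document."""
    def __init__(self):
        super().__init__()
        self.links = []

    def handle_starttag(self, tag, attrs):
        # Every link the author created appears as <a href="...">;
        # there is no other place link information can hide.
        if tag == "a":
            for name, value in attrs:
                if name == "href":
                    self.links.append(value)

collector = LinkCollector()
collector.feed(page)
print(collector.links)
```

Note that the parser can only report the URLs the author wrote in; nothing in the markup says whether those target documents still exist or are still relevant, which is exactly the limitation the entry describes.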
Information lifecycle management (ILM)
data from one type of storage medium to another has changed. Accordingly, HSM has evolved into what are termed Automated data migration (ADM) tools, which intelligently move data from one type of data storage to another. These tools are often built upon a Storage area network (SAN), a high-performance independent sub-network of a larger corporate network that specifically connects dedicated storage devices together, allowing high volumes of data to be stored, accessed, and moved around a network as required, without impacting the existing corporate network.

Business value proposition
ILM aims to provide complete control and management of the data lifecycle. The use of an ILM philosophy that is backed with SRM and ADM tools allows organizations to achieve several goals. Firstly, they create a system that balances data accessibility with value and need. Secondly, the ILM philosophy helps ensure regulatory compliance with regard to data storage. Thirdly, the systems allow efficient data storage across each data storage level. Fourthly, the systems allow much of the overhead to be automated, allowing the systems manager to focus upon other issues. Fifthly, the ILM philosophy and systems may be designed to map onto an organization's disaster-recovery and contingency plans.

ILM is supported by a variety of tools and vendors, including Storage service providers (SSP), which offer a wide range of services for the outsourcing of data storage. These include storing data, providing backup services, and remote data-continuity services via SANs, VPNs, or internet connections.

Summary of positive issues
ILM enables a data-value approach to data storage to be taken. ILM assists organizations with regulatory data compliance. Many vendors support the ILM philosophy through the provision of tools and services.

Summary of potentially negative issues
The development of an ILM initiative can be resource intensive, potentially requiring a company to re-engineer its systems to ensure alignment between the IT strategy and the ILM strategy. The risks associated with corporate data security, location, and ownership over the entire lifecycle need to be assessed.

Reference
- R. Maier, T. Hädrich and R. Peinl (2005). Enterprise Knowledge Infrastructures (Berlin, Springer-Verlag).

Associated terminology: ERP, Data warehouse, ETL.

Information Technology Infrastructure Library (ITIL)

Foundation concepts: CIO, MIS.
Definition: A collection of best practices for the provision of IT service management (ITSM).

Overview
The Information Technology Infrastructure Library† is a collection of IT service management best practices that aims to provide the leaders of IT organizations with a top-down, business-driven approach to the management of IT processes, people, and technology. The library was originally developed in the 1980s by the UK government's Central Computer and Telecommunications Agency (CCTA). Since then it has been refined and extended by the UK's Office of Government Commerce (OGC), which subsequently published Version 2 of the ITIL (1998–2004) and has commenced work on Version 3. A British Standard (BS 15000-1:2002) based upon ITIL has also been developed; this standard has subsequently been

† ITIL is a registered trade mark of OGC, the Office of Government Commerce.
ITIL is aimed at reducing the costs of IT deployment and maintenance while raising service levels. The standards-based approach allows consistent guidance to be provided by the IT organization and, through certification-based training, it creates a consistent knowledge level within the workforce, also leveraging the ability to recruit certified individuals to support ITIL-based procedures. The standards-based approach to IT service delivery can also be applied to contract vendors delivering outsourced services, in that a contractor can be required to be ITIL-certified. This provides a higher basis for IT service quality and for competitive bidding on service provision, due to the processes being fully understood by multiple vendors.

ITIL certification is sanctioned by the ITIL Certification Management Board (ICMB), which is composed of the OGC, the ITSMF (IT Service Management Forum), and the two examination institutes, EXIN and ISEB. The ICMB has created three levels of certification: foundation, practitioner, and manager. Foundation certification recognizes that the holder is familiar with best practices in ITSM, practitioner certification recognizes that the holder understands the theory of ITSM and how to apply that theory in practice, and manager certification recognizes that the holder can manage ITIL solutions across a range of service-management areas.

The ITIL is frequently used in combination with ISO/IEC 17799 and the COSO/CobiT frameworks. Typically, ITIL is used as the backbone framework for delivery and support processes in conjunction with ISO/IEC 17799, around which security controls are created, while COSO/CobiT is used for controls and metrics that pertain to financial systems and Sarbanes–Oxley compliance.

Summary of positive issues
ITIL is a widely used methodology for implementing best practices in IT service management. There is a substantial literature on ITIL and a growing body of consultants and practitioners to support the implementation of ITIL. Certification of ITIL practitioners is available, this being sanctioned by professional IT bodies such as the British Computer Society and the UK Government's Office of Government Commerce. The ITIL approach to service management is scalable and structured, and allows a standardized and more efficient and effective implementation of IT practices. It can be combined with other IT governance frameworks, including CobiT and ISO/IEC 17799.

Summary of potentially negative issues
ITIL provides guidance on how to achieve best-practice IT service management, but is not specific on the fine details of implementation. The implementation process can be slow and resource-intensive, and the associated process changes can be disruptive in the short term.

References
- http://www.bs15000.org.uk/.
- http://www.itsmf.com/.
- http://www.ogc.gov.uk/.
- B. Worthen (2005). "ITIL power," CIO Magazine, September 1, 2005.

Associated terminology: ISO/IEC 17799, British Computer Society, Sarbanes–Oxley.

Instant messaging (IM)

Foundation concepts: Email, Client–server.
Definition: An interactive internet communications method, akin to a typed telephone call.

Overview
Whereas email is strictly non-interactive (a message is typed and sent, and the recipient opens and reads it when they feel like it), Instant messaging (IM) makes the basic concept of email interactive. Communicating by email is the internet version of
answered by simply reading the manual, and may completely alienate customers.

operate. Instant messaging provides a low-cost communication option.

Internet
internal network if it is configured correctly, but are not valid for use on the whole internet. Computers arranged in this way are unable to communicate with the outside world except through Proxy servers (see Network Address Translation and Proxy for details), and are said to form a Private internet.

Internet communications: communications within one Local-area network (LAN) are referred to as intranet communications. Communications between systems that are not on the same LAN are referred to as internet communications. Internet communications can not be based on aspects of LAN protocols, such as ethernet hardware addresses (MAC addresses), and must use higher-level protocols (such as IP, the internet protocol) instead. This is merely a technical distinction.

The internet: the internet is a cover-all term for all of the computers, both servers and clients, and other network-enabled devices, together with the connections between them, that form the global digital communications network.

The internet is tied together by IP, the Internet protocol, a system that defines how individual devices are addressed, and how transmissions are routed through the network from one device to another. It provides a uniform format for transmissions that is capable of carrying any form of digital data over any kind of connection, with reasonable efficiency.

Every device must have an IP address allocated to it before it can participate in the internet. IP addresses may be static (permanently allocated to a particular device) or dynamic (allocated on demand and released after a short period). Blocks of IP addresses are allocated to organizations by an international organization known as ICANN (the Internet Corporation for Assigned Names and Numbers), and those organizations then allocate individual addresses within their blocks as they see fit. Individual IP addresses can be allocated directly by ICANN, but that is not usually a cost-effective solution.

Credit for inventing the internet has been awarded to many people. Three different groups, Paul Baran of the RAND Corporation, Leonard Kleinrock of MIT, and Donald Davies and Roger Scantlebury of the NPL (National Physical Laboratory, UK), seem to have developed the idea of Packet-switched networks (see Packet switching) independently and at about the same time, around 1962. In 1968, the Advanced Research Projects Agency (ARPA) of the US government funded the first internet-like computer network, known as ARPANET, which came on line in 1969, connecting UCLA (University of California, Los Angeles) and SRI (Stanford Research Institute), soon joined by UCSB (University of California, Santa Barbara) and the University of Utah. ARPANET gradually expanded over the years, linking with other national networks, until it grew to be what is today called the internet.

The TCP/IP protocol suite was invented by Vinton Cerf of Stanford and Bob Kahn of ARPA (by then renamed DARPA, the D standing for Defense) starting in 1973, and in 1974 the word "internet" was used for the first time in their published paper describing TCP. In 1983, the introduction of the Domain-name system completed the basic infrastructure of the modern internet. The system that is considered by many to have given life to the modern internet, the "World Wide Web," was not invented until 1990 (see World Wide Web for details).

The Backbone of the internet consists of a large number of major switching centers, each connected to a moderate number of others. Many smaller switching centers may be connected to each of the major ones, and many smaller yet to each of them, down to the level of the LAN. Each individual computer generally has one communications path available to one of the major switching centers, but the major
switching centers are very highly interconnected, and may communicate with one another over a wide variety of paths, choosing whichever is best at any given moment. This is what gives the internet its strength: any switching center could fail, and only those individual computers relying on it would become disconnected; the internet as a whole would be undamaged, since there will always be a variety of paths that avoid any out-of-service node. One of the original purposes behind packet-switched networks was to provide a military command structure that would still be operational after a disastrous nuclear attack.

Since the World Wide Web converted the internet from a governmental and academic research tool into a wildly popular public utility, and email became a mass communications (and miscommunication) medium, the original purpose of the internet has become compromised. This has led to the creation of the Internet2 consortium, devoted to the creation of an advanced internet, with the intent of fostering new technologies in both hardware and software that will improve the speed, reliability, safety, and bandwidth of the internet.

As the original internet became deregulated, the US military decided to create its own secure "internet," known as SIPRNET (the Secret Internet Protocol Router Network). It is intended to link all five branches of the military in one network and to provide greater connectivity for mobile forces, connecting ship to shore, and field troops to operational bases and to headquarters, through satellite and other mechanisms. A second military internet, NIPRNET (the Non-classified Internet Protocol Router Network), carries less sensitive traffic. A third military internet is the JWICS network (Joint Worldwide Intelligence Communications System); through it the US military securely transmits data classified as top-secret to specific designated recipients. Naturally, much of the technology, architecture, and cost of these systems is classified.

Business value proposition
The internet has enabled individuals, corporations, and entities within corporations to connect together in many new and productive ways. These include Online communities, in which virtual communities unite members to solve problems, discuss issues, and foster commerce. The internet has spawned a whole new mechanism for the development and delivery of business models and commerce, including Collaborative commerce, Business-to-business (B2B) commerce, and Business-to-consumer (B2C) commerce.

The technology has grown to include new mechanisms for connecting to the internet, and protocols such as WAP allow mobile devices using Micro-browsers to receive and send information over wireless internet connections. The technologies that run the internet, such as TCP/IP, have allowed companies to base their intranet and internet architectures upon one consistent set of protocols, and this relieves them of the burdens associated with managing multiple protocols and their interactions. Languages such as XML have revolutionized the way data is transmitted, releasing organizations from outdated fixed formats or proprietary data formats.

Summary of positive issues
The internet provides an open-standards-based architecture for inter-computer communication. The internet is available almost universally via landlines or wireless devices (including satellite phones). The internet protocols have simplified network management for CIOs and individuals.

Summary of potentially negative issues
The internet has morphed into a commercial, academic, and public-domain space that is becoming more congested, and its users are often subject to those with
malicious intent, e.g., viruses, worms, Trojan horses, phishing, and the use of network sniffers by those wishing to attempt internet fraud.

References
- K. Hafner (1998). Where Wizards Stay Up Late: The Origins of the Internet (New York, Simon and Schuster).
- D. Groth (2003). A+ Complete (Hoboken, NJ, Sybex–John Wiley and Sons).
- P. Gralla (2004). How the Internet Works (Indianapolis, IN, Que).

Associated terminology: Virtual private networks, World Wide Web, Internet protocol.

Internet protocol (IP)

Foundation concepts: Internet, Network, Protocol.
Definition: The internet is the worldwide community of computers interconnected by telephone lines, cable, satellite, etc. The internet protocol is the set of established rules and procedures that makes the internet work, by establishing a universal communications language and addressing scheme.

Overview
The internet exists as a large heterogeneous collection of interacting technologies, and is likely to continue to do so for the foreseeable future. When computers are physically close together, a simple, cheap copper wire connection between them gives excellent results. When computers are widely separated and their owners do not have large budgets (home users particularly), a connection through the public utilities (e.g., telephone lines) is often the only viable solution. When major businesses are spread over geographically large areas, no physical connection is viable, so wireless (satellite) connections are the method of choice. All of these different physical connection media work in different ways, and it would not be at all practical for every computer on the internet to "know" exactly how to communicate with all the others. To make communications practical, a layered system has been developed, and the Internet protocol (IP) is one of its fundamental components.

At the lowest level, every computer has the ability to communicate along the kind of connection that it has, and no other. Typically, a small-to-medium office will have a group of computers all connected together on a Local area network (LAN) that consists of cheap copper wires and small electronic components called Switches and Hubs. This collection of wires and hubs connects together only the local group: it does not extend to any great distance, and the major arteries of the internet certainly use much more sophisticated technology. Since all the computers in the local group have the same kind of connection, they can easily communicate with each other, but not with anything outside the group.

Every local group of computers has one special member, sometimes a normal computer with two network connections, sometimes a special-purpose piece of hardware. This device with two network connections forms a Bridge or Gateway between one LAN and another, or between a LAN and a larger internet artery. This one device must, of course, be capable of communicating on both of the networks it is connected to, and they may be using completely different technologies. The purpose of the IP is to provide a single uniform "language" that allows all of the other computers to make use of this gateway. A computer on one side of the gateway might be capable only of communicating along cheap copper wires; the computers on the other side may (for example) be capable of communicating by a high-bandwidth satellite link, perhaps even using technology that didn't exist when the other computers were made. When a "copper wire" computer needs to communicate with a "satellite" computer, they can not use their own built-in methods, because
they are totally incompatible. Instead, they use the IP.

The IP strictly defines message formats that are independent of the technology being used. A copper-wire-connected computer composes a message in the IP format, then sends it to the gateway of its own LAN over the copper wires that connect it. The gateway computer receives the message and, because the IP is universal and invariant, it can understand the message and see that it needs to be sent to one of the computers on the satellite side. It is capable of communicating both along copper wire and by satellite, so it simply "re-wraps" the IP message in the specialized satellite message format, and sends it on its way.

Typically, a message transmitted over the internet will make a few dozen such Hops through gateways between LANs and internet arteries before it reaches its destination. The IP completely defines the universal message format that allows this to happen, and also provides a universal addressing scheme.

It is essential that computers on the internet should have fixed, simple addresses that uniquely identify them, just as homes and businesses must have known addresses to receive their mail. The internet protocol uses simple large numbers as addresses. Usually, these numeric addresses are seen as four smallish numbers separated by dots (e.g., 127.35.101.98), but this is just a notational convenience. It is really just one big number split into four parts to make it easier (for humans) to handle without error. Every computer on the internet at any time has one of these numbers assigned to it and no other. The IP tells the Gateway computers how to correctly forward any message; given the IP address of the destination, it is easy for any gateway to work out which one of its neighbors should receive the message in order for it to eventually reach its intended recipient.

There is some logic to address assignment. A large company with many thousands of computers may be given all the IP addresses that begin with (for example) 127.35 (known as a Class B address) and allowed to allocate them as it sees fit. A smaller company may be given a Class C address, which would give it control over all the addresses beginning with (for example) 127.35.101. An individual computer owner may be given a single full address for their own personal use.

The current IP addressing scheme combines four small numbers (as in the example above) to make one numeric address. Because each of the small numbers has a restricted range, the total number of possible IP addresses is something under 4 300 000 000. That seemed an absurdly large number at the time, but it is much less than the current population of the world. If the average number of computers per person exceeds about two-thirds, there just won't be enough addresses to go round. Since many things that are not really computers are now internet-connected, the world has already nearly run out of IP addresses. The next generation of addresses, combining 16 small numbers to make one numeric address, is known as IPv6, and is already being implemented. This will provide an enormous number of different addresses (over 300 000 000 000 000 000 000 000 000 000 000 000 000).

ATM (Asynchronous transfer mode) is another routing system, which was originally intended as an alternative to, or even a replacement for, IP. Under ATM, data is divided up into much smaller packets, all with a fixed size (48 bytes of data plus 5 bytes of header information), called Cells. Before data is communicated between two sites, a Virtual circuit is set up, which specifies the communications path to be taken; then all data uses that virtual circuit, following the same path from source to destination. These two changes allow greater control over the overall transmission rate, and this
can significantly reduce Jitter (signals breaking up and becoming "choppy") in audio and video signals. However, improvements in WAN bandwidth made IP work much more smoothly, and ATM is a very complex protocol, so it did not achieve its proponents' goals, and IP still reigns.

Business value proposition
Access to the internet, through the IP, is almost a sine qua non for any modern business. The use of internet protocols allows a clear and simple technology strategy to be adopted by the whole organization. The acknowledgement of IP as the standard protocol for messaging within and between organizations allows the internal IT organization to focus its protocol monitoring on just one set of technology protocols. The IP will continue to evolve and advance in the future, and a corporate IT group needs to monitor these changes. IP working groups have been set up by major software and technology vendors and by bodies such as the Internet Engineering Task Force to discuss and mold the shape that future IPs will take. The IP is fundamental to all devices and software that operate over the internet; thus all vendors will need to comply with the standard as it evolves, and there are no seriously competing protocols in this area. In the future, a major area of concern for business users will be the transition from IP4 to IP6, ensuring that devices connect correctly and that all of the software is upgraded.

Summary of positive issues
The IP is what makes the internet work. All of the positive and negative issues for the internet itself are essentially issues for the IP. Specific to the protocol is the fact that it is very simple and universally established. This means that any two computers that can be connected together can be expected to be able to communicate in a meaningful way, without any specialist or non-standard tools or software.

Summary of potentially negative issues
The IP was designed at a time when nobody anticipated the vast number of computers that would eventually be interconnected. As a result, its universal addressing scheme is rapidly running out of addresses to use. IP4 (the current version) will have to be replaced by the already designed and implemented IP6, or some other alternative. Although both IP4 and IP6 are well known and stable, the potential for disastrous upheaval as one system is replaced by the other is as great as the potential for disaster was with the Y2K bug. It will probably turn out to be a problem-free transition (as Y2K turned out to be), but nothing is guaranteed until it is all over.

References
- http://www.ietf.org/.
- W. R. Stevens (1994). TCP/IP Illustrated (New York, Addison-Wesley).
- D. Groth (2003). A+ Complete (Hoboken, NJ, Sybex-John Wiley and Sons).
- P. Gralla (2004). How the Internet Works (Indianapolis, IN, Que).

Associated terminology: Internet, TCP, DHCP, Network.

ISO/IEC 17799

Foundation concept: Security.
Definition: ISO/IEC 17799 is an international standard intended to provide guidance for IT professionals in establishing a set of security processes and policies for their organization.

Overview
The ISO/IEC standard 17799 has its origins in the British Standard BS7799 originally developed by the UK's Department of Trade and Industry. The standard is really a code of practice that aims to provide
ISP (Internet service provider)
consideration for a business. Unfortunately, in some markets there is a limited choice, by regulation, because of limited backbone access due to incumbents with embedded ownership, or simply through the dynamics of supply and demand. When ISP options are available, the cheapest company might not always be the best; similarly, the most expensive might not offer a truly superior product, and thus metrics beyond the purely financial need to be employed. Frequently companies allow users a trial grace period during which they can connect for free. During this period system connectivity tests can be made, such as running a "speed-tester" application to ensure that the speed of the connection is as advertised.

Sometimes ISPs advertise the speed of their systems in asymmetric terms, such that downloads run at one speed while input from the user runs at another; if bi-directional speed is important, then the true speed of the system connection needs to be established. A second issue is the nature of the connection in terms of down-times. Some companies offering high-speed connections such as DSL offer a dial-up backup option for when DSL is unavailable. The nature and quality of customer service also needs to be established: if you're having problems with a connection, will the ISP be there 24 × 7 to assist? The contract with the ISP also needs to be examined to ensure that the ISP adheres to the level of privacy you expect. While privacy legislation exists in the United States, European Union, and many other countries, it is by no means universal, and, should corporate secrets or even what could be considered "free speech" be transmitted through VoIP, strong encryption is recommended.

Other issues that can influence the selection of an ISP include the scalability of the ISP's capacity, since a rapidly growing company may quickly outgrow a small ISP. What additional services are offered by the ISP (e.g., email server, firewalls, security) and how long the ISP has been operational are also reasonable considerations. It may be worthwhile for a company locating in an emergent market to ask for references from the ISP. However, in mature markets the costs of switching from one ISP to another are usually relatively low.

Summary of positive issues
ISPs offer an easy and relatively inexpensive mechanism for connecting to the internet. There are many ISPs in developed economies. Costs of switching from one ISP to another are low. Many ISPs offer additional services beyond connection services.

Summary of potentially negative issues
Some markets have limited ISP access. In some markets, ISPs are owned and operated by governments that regulate and monitor the traffic that can be passed through them. The speed of the connection offered by some ISPs may fluctuate depending upon load and conditions on the network.

Reference
- T. Casey (2002). ISP Liability Survival Guide: Strategies for Managing Copyright, Spam, Cache, and Privacy Regulations (New York, John Wiley and Sons).

Associated terminology: Hosting, Internet, Broadband, Network, Modem.
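The asymmetric-speed issue above can be made concrete with a small calculation. The advertised rates and file size below are invented for the sketch, not measurements from any particular ISP.

```python
# Illustrative transfer times over an asymmetric link.
# The rates below are assumptions, not real ISP figures.

def transfer_seconds(size_bytes: int, rate_bits_per_sec: float) -> float:
    """Time to move size_bytes over a link of the given bit rate."""
    return (size_bytes * 8) / rate_bits_per_sec

DOWNLOAD_BPS = 8_000_000   # "8 Mbps down" (assumed)
UPLOAD_BPS = 1_000_000     # "1 Mbps up" (assumed)

report = 25 * 1024 * 1024  # a 25 MB file

down = transfer_seconds(report, DOWNLOAD_BPS)
up = transfer_seconds(report, UPLOAD_BPS)

print(f"download: {down:.1f} s, upload: {up:.1f} s")
```

On these assumed figures the upload takes eight times as long as the download, which is exactly the kind of gap a "speed-tester" run during a trial period would reveal.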
Java
to do anything, or they can be run at any intermediate security setting.

The guaranteed cross-platform compatibility and control of security make Java programs very web-friendly. A Java program may be embedded into a web page and automatically run when that page is accessed. This allows much greater functionality than a normal passive web page could provide. The program itself is run through the JVM, so it will work equally well on all platforms, and the user retains complete control of all security settings, so they can happily let the program run, secure in the knowledge that it can't do anything they wouldn't want it to do. Smallish Java programs designed to be accessed from web pages in this way are called Applets, meaning "little applications," with the "-let" suffix as in "piglet."

The only serious widely upheld complaints about Java were that its libraries were produced perhaps too quickly, and for many years the standard libraries remained very "buggy" while new features were being cranked out at high speed. The major remaining complaint is that Java very heavy-handedly enforces some extremely arguable policy decisions: it reports as errors things that are common and accepted programming practice, and certainly not wrong; some of the essential libraries make decisions that put speed of execution above flexibility and clean design. These libraries can not in any practical way be replaced by the programmer, and force some very unfortunate implementation practices on experienced expert programmers.

Business value proposition
In 1995 the release of Java coincided with the deregulation of the internet, and these two events radically changed the way that organizations thought about their software development. Java and the need for programs to address the internet in essence marked the end of the old mainframe-era languages of Cobol and Fortran. Simultaneously it moved programmers toward a more structured and controllable methodology of program design.

Java also provided an open-source alternative to the traditional programming environments associated with C and C++, with which programmers had wrestled for years. One of Java's strengths is that it facilitates the creation of powerful and flexible code for internet-based applications. The strength of Java for CIOs is based upon the portability of the system: the code is compiled down to a form that is capable of being run on nearly any device (including some Java-enabled telephones). This style of development alleviated the platform-specific problems that in the past were so problematic for IT organizations. For many organizations, a lack of cross-platform operability was taken as sufficient reason for staying with an outdated software-hardware platform combination for far too long.

A key to the language's success is the ability of programmers to learn it quickly, especially if they have previous experience with C++. Additionally, programmers like the availability of software libraries, which keeps them from having to "reinvent the wheel" and enables them to focus upon more specific and important tasks. This clearly raises productivity and systems reliability, since many of the library functions have been improved over the years as programmers find and document bugs and problems.

Summary of positive issues
Java is free; the compiler, libraries, and virtual machine for a wide variety of platforms may be downloaded from Sun's web site. There are many online resources to help Java programmers, and very many books, including technical references, introductions, and academic texts, that
Joint application design (JAD)
is followed in the sessions; Subject matter experts, who understand the business function being addressed by the system; End users, who ultimately will use the system; Developers, software and hardware specialists who will create the system; a Scribe, who records all decisions and compiles the documentation; and a Tie breaker, who is the final arbiter and usually a member of senior management. Additionally, Observers are usually present but do not participate in active discussion. The sessions are intended to be peer-to-peer meetings where participants can express themselves freely regardless of rank.

Business value proposition
The JAD methodology has been a popular approach to systems development, especially among consultants. The approach can be used for development of new systems and of new procedures that will link to existing systems, and as a mechanism to perform systems maintenance. The approach is well known and a wide variety of resources is available to support the development approach.

Summary of positive issues
JAD is a well-known and well-supported method for systems development. It focuses on the problem and brings all stakeholders together. JAD sessions help to develop systems that are supported by good documentation.

Summary of potentially negative issues
The JAD approach works well for small applications, but can incur a significant resource overhead for the developer in development of larger applications.

References
- J. August (1991). Joint Application Design: The Group Session Approach to System Design (New York, Yourdon Press).
- J. Wood and D. Silver (1989). Joint Application Design (New York, John Wiley and Sons).
Knowledge-based systems
Knowledge Engineering Review, Volume 18, Issue 1.
- S. Murrell and R. Plant (1997). "A survey of tools for validation and verification 1985-1995," Decision Support Systems, Volume 21, No. 4.

Associated terminology: Artificial intelligence, Machine learning, Knowledge engineer, Logic programming.

Knowledge engineer

Foundation concept: Knowledge-based systems.
Definition: A knowledge engineer is an information systems professional who works in the area of artificial intelligence, eliciting knowledge from domain experts and representing that knowledge in a computer system.

Overview
Knowledge engineers perform the task of extracting information from domain experts. This process is known as Knowledge elicitation. A variety of elicitation techniques are used, including interviews, document reviews, and structured techniques such as the Repertory grid (a method for extracting knowledge in which experts compare the similarities and differences between two sets of task-related parameters within a grid structure). Having elicited the knowledge, the knowledge engineer then performs a careful analysis and structures that information so that it can be represented effectively in a data structure on a computer. These data structures are known as Knowledge representations and include Production rules, Frames, and Semantic networks. The knowledge engineer is also responsible for maintaining the system and verifying that the system performs correctly.

Business value proposition
Knowledge engineers perform the valuable task of compiling scarce knowledge and placing that knowledge into a computer system. The resultant systems, if well constructed, provide expert levels of performance that can be duplicated at multiple locations and applied to multiple problems. The systems can also act as corporate knowledge repositories.

Summary of positive issues
The knowledge-based systems community has a mature literature, and application tool sets have been developed to support the creation of such systems. Faster and more accessible hardware and software systems have facilitated the wider development and use of knowledge-based systems.

Summary of potentially negative issues
The acceptance of knowledge-based techniques remains limited due to the specialized systems requirements and the high cost of development. Knowledge engineering is an area of active research, and the theoretical basis of knowledge-based systems continues to develop as an aspect of artificial intelligence.

References
- A. Gonzalez and D. Dankel (1993). The Engineering of Knowledge-Based Systems: Theory and Practice (Englewood Cliffs, NJ, Prentice-Hall).
- B. Gaines and M. Shaw (1993). "Knowledge acquisition tools based on personal construct psychology," Knowledge Engineering Review, Volume 8, No. 1.
- G. Kelly (1955). Psychology of Personal Constructs (New York, Norton).

Associated terminology: Artificial intelligence.

Knowledge management

Definition: Knowledge management is the process of creating, locating, encoding, and utilizing corporate knowledge to support and enhance corporate processes.
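Production rules, one of the knowledge representations named in the Knowledge engineer overview above, can be given a minimal flavor in code. The rules and facts below are invented purely for illustration; a real knowledge-based system uses far richer representations and inference machinery.

```python
# A toy forward-chaining interpreter for production rules:
# if all of a rule's conditions hold, its conclusion is asserted.
# The medical-style facts and rules are invented for illustration only.

rules = [
    ({"fever", "cough"}, "flu-suspected"),
    ({"flu-suspected", "short-of-breath"}, "refer-to-doctor"),
]

def forward_chain(facts):
    """Repeatedly fire rules until no new facts are produced."""
    facts = set(facts)
    changed = True
    while changed:
        changed = False
        for conditions, conclusion in rules:
            if conditions <= facts and conclusion not in facts:
                facts.add(conclusion)
                changed = True
    return facts

print(forward_chain({"fever", "cough", "short-of-breath"}))
```

Starting from three elicited facts, the interpreter derives "flu-suspected" and then "refer-to-doctor", which is the chain-of-inference behavior a knowledge engineer encodes when translating an expert's rules of thumb.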
Overview
A Subnet is a subset of a larger network consisting of a number of computers and other network access devices that have related IP (internet protocol) addresses, and are all connected to the larger network through the same network bridging device, or Gateway. The existence of subnets is an artifact of the way the internet protocol works; the broadcast network messages that comprise ARP (the Address Resolution Protocol, which allows IP addresses to be associated with particular hardware) will not be transmitted beyond a local subnet. However, the organization of computers into smaller-than-necessary subnets can be an aid to network administration.

A Local-area network (LAN) is similar in concept to a subnet, but less rigidly defined. A LAN is an organizational device that may consist of part of a subnet or a number of combined subnets. It is simply the portion of a network that is devoted to one organization or sub-organization, and generally has some uniform management.

A Wide-area network (WAN) is even less firmly defined. A WAN can simply be a group of connected or related LANs, most commonly the collection of LANs belonging to a single organization, or it could be any grouping up to and including the whole internet.

The term Internet is generally used to refer to the whole world of connected computers, the World Wide Web. It is also used

Law cross-reference

In the United States, national or federal law relevant to computer systems is generally limited to protecting privacy, intellectual property rights, and computer systems owned by governments or financial institutions (although the Computer Fraud and Abuse Act does provide some protection to corporations). Other matters usually fall within the domain of state law, which of course means that there are more than 50 different versions currently in effect. Those responsible for or dependent upon computer systems should always bear in mind that self-protection in the form of security and vigilance is the only real protection. No matter how strong and fiercely enforced national and state laws may be, they provide no protection against abuses originating from other nations. The idea of enforceable global laws against spam, child pornography, fraud, and software piracy is completely out of the question for the foreseeable future; there will always be lawless jurisdictions in which computer criminals may hide. So long as the law only forbids the origination of illegal materials, it will provide very little protection. Only forbidding internet service and telecommunications companies from relaying illegal material would have a real effect, and with current technology that would be exceptionally difficult.
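The relationship between IP addresses and subnets described above can be explored with Python's standard ipaddress module. The address is the book's own example value (127.35.101.98), and the /24 subnet is an assumed illustration.

```python
# Exploring IP addresses and subnets with the standard library.
# 127.35.101.98 is the example address used earlier in the text.
import ipaddress

addr = ipaddress.IPv4Address("127.35.101.98")

# The dotted notation is just one 32-bit number split into four parts.
print(int(addr))

# A /24 subnet covers all addresses sharing the first three numbers,
# i.e. everything beginning 127.35.101.
subnet = ipaddress.ip_network("127.35.101.0/24")
print(addr in subnet)                                   # same subnet
print(ipaddress.IPv4Address("127.35.102.1") in subnet)  # different subnet

# The whole IPv4 space: something under 4 000 000 000 usable addresses.
print(2 ** 32)
```

The two membership tests mirror the ARP behavior described above: broadcast resolution reaches the first address but not the second, because only the first shares the subnet.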
computer systems, and introducing any "computer contaminant" (viruses, etc.) into a computer system a third-degree felony (5 years' imprisonment, $5000 fine), but if the illegal action results in damage valued at $5000 or over, furthers fraud, or interrupts a public service, it becomes a second-degree felony (15 years' imprisonment, $10 000 fine).

817.568, Criminal use of personal identification information.
This act makes the fraudulent use of another's personal identifying information without consent a third-degree felony, but that increases to a first-degree felony (30 years' imprisonment, $10 000 fine) if the fraud is valued at $50 000 or more, or if it involves 20 or more victims.

Legacy system

Foundation concepts: Software, Hardware, Architecture.
Definition: A legacy system is a systems architecture that does not support the functional requirements of an organization.

Overview
The term Legacy system comes from the fact that a great quantity of program code is handed down to programmers from previous generations of programmers; hence it is a legacy of the past. The term is generally used to indicate that a system is old, but this is an over-generalization. A system needs to be considered in terms of its ability to support the current and future processes of an organization; an inability to support changing process requirements is now taken as the definition of a legacy system. The total architecture needs to be considered with respect to the alignment of process requirements, and this requires an assessment of the components of the system's architecture: the software, the hardware, and the network components.

Business value proposition
Assessment of a corporate system's architecture with respect to its ability to support the present and future processes of a company presents an opportunity to assess the point at which systems will need to be replaced or upgraded. Legacy systems can incur high maintenance and support costs, and increased costs for the organization as a whole if the technologies employed cannot support changing process needs.

Summary of positive issues
There are few positive aspects to systems architectures that do not support changing processes, other than the fact that those systems may be well known and understood. For example, a Cobol system that supports the purchasing functions may, in a best-case scenario, be completely specified, documented, tested, and maintained. However, maintaining such a purchasing system for a very large organization can be extremely difficult and a resource drain upon the organization. The ability to run such a system with a deep knowledge and resource base facilitates an orderly transition to a more flexible environment such as an ERP system.

Summary of potentially negative issues
Legacy systems might not support the architectural or process requirements of an organization. At best they will be expensive to maintain and will potentially deny the company access to the processes necessary to support operations. At worst the system could fail or not adhere to legal requirements (such as HIPAA compliance), and has the potential to put a company out of business.

Reference
- W. Ulrich (2002). Legacy Systems: Transformation Strategies (Englewood Cliffs, NJ, Prentice-Hall).

Associated terminology: HIPAA, ERP, Cobol.
Logic programming
open-source implementations of logic languages available, and there are many tools and development environments that support this approach to programming. The documentation associated with logic programming is extensive.

Summary of potentially negative issues
Logic programs that solve traditional commercial problems can be difficult to write, and database handling can also be difficult due to the fact that many implementations have poor input and output facilities. Very few programmers are trained in the logic style of programming. Logic programs can be difficult to interface with other systems, and logic programs are usually much slower than equivalent programs written in the traditional style.

Reference
- W. Clocksin and C. Mellish (1984). Programming in Prolog (Heidelberg, Springer-Verlag).
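Prolog, cited in the reference above, is the usual vehicle for the logic style of programming. As a rough taste of that style only, the Python sketch below states facts and one recursive rule and lets a naive evaluator draw every conclusion; the names are invented, and this is far from a real logic-programming system.

```python
# A toy imitation of logic programming: facts plus a rule,
# evaluated by naive repeated application until nothing new follows.
# In Prolog this would be:
#   parent(tom, bob).  parent(bob, ann).
#   ancestor(X, Y) :- parent(X, Y).
#   ancestor(X, Z) :- parent(X, Y), ancestor(Y, Z).

parents = {("tom", "bob"), ("bob", "ann")}  # invented example facts

def ancestors(parent_facts):
    """Compute all ancestor(X, Z) facts implied by the two rules."""
    anc = set(parent_facts)  # ancestor(X, Y) :- parent(X, Y).
    changed = True
    while changed:
        changed = False
        for (x, y) in parent_facts:
            for (y2, z) in set(anc):
                # ancestor(X, Z) :- parent(X, Y), ancestor(Y, Z).
                if y == y2 and (x, z) not in anc:
                    anc.add((x, z))
                    changed = True
    return anc

print(ancestors(parents))
```

The programmer states what is true rather than how to compute it, which is the appeal of the style; the slowness the summary warns about is visible even here, in the repeated passes over the fact base.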
Maintenance
developed, since this will then help the IT organization not only to determine the cost and efforts required to perform maintenance but also to establish better processes and development methods to reduce code complexity in the future, and hence make the code more maintainable.

The establishment of a software quality assurance program helps to ensure that proactive maintenance prevents the need for reactive maintenance. Reactive maintenance is expensive and generally problematic because programmers and systems analysts are under pressure to perform a "quick fix" in order to re-establish services, and might not be allowed sufficient time to determine the true underlying cause of the problem. Endlessly fixing the symptoms of an undiscovered software or hardware fault can be very expensive.

With the move toward configurable systems such as ERP systems, there has been a decline in the amount of traditional maintenance activities, such as modifying legacy Cobol programs, and this has freed programmers to work on more critical core systems development.

Software maintenance will never disappear, since systems are always being modified or upgraded due to changes in the regulatory or technical frameworks within which they operate.

Summary of positive issues
Maintenance is a well-researched area of software engineering. Consultants and contract programmers are available to support a wide range of maintenance requirements, including the maintenance of legacy systems that are written in languages long thought to be extinct.

Summary of potentially negative issues
In order to prevent maintenance from becoming a never-ending and very expensive activity, it is necessary to undertake the expense of setting up a quality assurance program that establishes baselines and metrics. Companies can then look at processes by which to migrate their systems to environments that require less, easier, or cheaper maintenance.

Maintenance programming is not generally considered a popular career path for systems professionals, who would rather be creating new systems than modifying old ones. Some forms of maintenance programming are inherently dangerous: some systems that have been in existence for over 30 years will have been modified extensively during that period, and there might be no surviving overall structure, nobody who understands the existing system, and certainly nobody who understands how all the previous quick fixes might interact. A mistake in the maintenance process may lead to complete systems collapse.

Reference
- The Journal of Software Maintenance: Research and Practice (New York, John Wiley and Sons).

Associated terminology: Cobol, ERP, Software metrics.

Memory

Foundation concepts: Central processing unit, Storage.
Definition: The physical components of a computer system that support long- or short-term retention of information.

Overview
The need for memory in a computer system is absolute. Memory is needed in order to keep the sequence of commands that the computer is currently obeying (i.e., the "code" for the program or application itself) and the data that it is working on. All other programs that are available to the computer must also be stored so that they
can be executed on demand; similarly, all data that those applications could access must be kept available.

In human terms, the distinction between intellectual effort (processing or computing) and remembering (using memory) is difficult to see: we have a single organ, the brain, that performs both tasks. Indeed, according to current theories in neuroscience, there is no difference between the two tasks. In a computer system, there is a sharp distinction that must be understood.

If a computer is capable of performing 2 000 000 000 operations per second (which is not an unrealistic proposition), that means that it has a Central processing unit (CPU) that is capable of obeying 2 000 000 000 instructions per second. Those instructions have to come from somewhere: the CPU does not make its own decisions about what to do. Those instructions are the program or application that is being run, which was previously created by a human programmer, and is kept in the computer's memory. The computer's memory component must work at incredible speeds: 2 000 000 000 times per second, the CPU will effectively ask it "what shall I do next?," and expect to be given the next instruction to obey.

In addition to this requirement for exceptional speed, there are other conflicting demands. Absolute reliability is essential: a single wrong step in a program invalidates everything it is doing, and an error rate of just one in a billion would result in two failures every second.

Long-term retention of data is another essential. Corporate records and databases must be kept safely for an unlimited amount of time. Computers must have memory that continues to work even when the computer itself is turned off or damaged.

The fourth requirement is for capacity. If a company just has 25 000 customers, has an average of only 100 interactions with each customer (sales, purchases, enquiries, etc.), and records just a few lines of information for each interaction, the total amount of information that must be kept is already 100 times the size of the Bible. A large, active corporation could easily generate trillions of pieces of information.

Current technology is capable of satisfying all four of those requirements, but not at the same time. The main conflict is between speed and capacity. Small amounts of incredibly fast memory are affordable and reliable. Vast amounts of relatively slow memory are affordable and reliable. Vast amounts of incredibly fast memory are not just unaffordable, but also technologically infeasible.

To satisfy all of the requirements, a computer system has a complex memory structure, consisting of many layers of different technology working together to cover all of the requirements, as listed below.

CPU registers: a few dozen bytes of memory, physically part of the CPU, and working at the same speed as it, holding just the immediate details of the one step of the computation.

Cache: around a megabyte of exceptionally fast memory used to hold temporarily just the portion of an application that is currently running and the portion of the data that it is actively processing; this is copied from slower memory as and when it is needed. Part of the cache (called "L1") is usually part of the CPU chip, and another part (called "L2") is usually an extra chip soldered onto the motherboard.

Main memory: hundreds, or even thousands, of megabytes of fast memory. This usually holds the entire running application and all data it is likely to make use of in the near future. Applications are copied from slower memory into main memory when execution begins. Main memory, cache, and CPU registers are
Memory and disk size
data it can record (this corresponds to the length of the memory), and how big each of those pieces of data can be (corresponding to its width). A computer that can store a million ten-digit numbers has more capacity than does one that can store two million three-digit numbers.

Since the beginning of the computer era, the width of memory, the size of an individual data item, was always called a Word. Different kinds of computer would have different word sizes (e.g., ICL in the 1970s frequently used 24-bit words, giving 7-digit data items; DEC/PDP in the same period frequently used 36-bit words, giving 11-digit numbers). Thus memory size was always quoted in terms of how many words there are and how big a word is (e.g., 49 152 24-bit words for an ICL-1902). The word Byte was always used to refer to some portion smaller than a whole word of memory.

Since the supremacy of the desktop computer the terminology has changed. In common usage today, the word byte exclusively means exactly eight bits (enough to store a two-to-three-digit number) and word, if it is used at all, just means two bytes. The amount of memory in a computer is always given simply as the total number of bytes, with no reference being made to the width. This simplifies comparisons, but hides useful data: a computer with 64-bit-wide memory will generally be much faster than a similar computer with 8-bit-wide memory.

Since a computer usually has a very large amount of memory, special abbreviations are used for large multipliers; these are K, M, G, and occasionally T. There is a great deal of confusion and a certain amount of deceit involved in the use of these abbreviations.

The abbreviation "K" is pronounced simply as "kay"; it is not "kilo." In memory capacities, a capital "K" is always used, and the multiplier that it represents is 1024. A computer with 4 KB ("four kay bytes") has 4096 bytes of memory. The word kilobyte is almost invariably a mistake. The metric or S.I. prefix kilo (as in kilogram or kilometer) is always written as a small "k," so there should be no confusion.

From a design point of view, the idea of a computer with 1000, 1 000 000, or 1 000 000 000 bytes of memory is absurd, whereas 1024, 1 048 576, and 1 073 741 824 bytes would be perfectly reasonable sizes. In computer technology 1024 is a "nice round number," just as 1000 is a nice round number to humans. Memory capacity is always some multiple of 1024 bytes, and, with 1024 being so close to 1000, the letter "K," being so close to the metric prefix "k," was an obvious choice. 12K is an easy number to remember; 12 288 is not. Originally, when only technically proficient people ever discussed memory capacity, there was never any confusion.

The other prefixes, M, G, and T, have a less clear standing. They were "borrowed" directly from the metric system, where M for "mega" means 1 000 000, G for "giga" means 1 000 000 000, and T for "tera" means 1 000 000 000 000. When somebody talks about "one megabyte" or 64 MB of memory, what do they mean? Although "mega" very clearly means "one million," no computer ever had one million bytes of memory. What is meant in these cases is 1 048 576 (i.e., 1024 × 1024), which is a standard unit of memory size.

A "megabyte," 1 MB, is most commonly 1 048 576 bytes.
A "gigabyte," 1 GB, is most commonly 1 073 741 824 bytes.
A "terabyte," 1 TB, is most commonly 1 099 511 627 776 bytes.

Some manufacturers take advantage of this unfortunate naming to exaggerate the capacity of a disk drive by using prefixes from the metric system in a context in which the computer-technology prefixes are obviously expected. For example, if a disk has a capacity of 3 221 225 472 bytes, an honest label could say "3 GB,"
"3072 MB," or even "3 221 225 472 bytes." A deceptive label might say "3.22 GB."

The difference seems too small to worry about: 3.22 GB is only just over 7% more than 3 GB; but the problem can be significant. $10 is just one tenth of one percent more than $9.99, but the psychological effect of the price difference is well known. When confronted with a choice between a 3.22 GB disk drive from an unheard-of manufacturer and a 3 GB disk drive from a reputable one, many purchasers opt for the 3.22 GB drive because it seems to be better value for money, even though it is in fact exactly the same in terms of capacity, and may well be significantly worse value in terms of reliability.

Business value proposition
Read the label carefully. A difference of a few percent in disk capacity will have no significant effect, but a difference in reliability is the difference between successful productivity and ruinous disaster.

Middleware

Foundation concept: Software.
Definition: Middleware is code that is used to enable two or more other applications that would otherwise not be able to communicate to do so.

Overview
The term Middleware is used to denote software that enables two other software systems to communicate and pass data from one to the other. The need for middleware became pressing in the 1970s, when organizations commonly ran different applications on different machines made by different manufacturers, each of which had its own proprietary operating system, database, and network-communication protocols (a heterogeneous environment). Standardized, system-independent network protocols had not yet been developed, so these machines were not capable of communicating without additional software to act as an intermediary.

Another form of middleware is known as a Front-end processor. This is software, or even special-purpose hardware, that interfaces a system with the outside world, perhaps reformatting or condensing inputs, or perhaps converting output into a more human-friendly form. Front-end processors are a variety of middleware that makes a single system accessible, rather than interconnecting two systems.

Middleware requires specific information about the protocols used and the formats of the data being received from each system. The development of middleware has traditionally been a complex task, further complicated by the need to have different middleware solutions for each partner computer that might ever be connected. Then, when a data format, operating system, or communications protocol changed, the middleware would also have to be changed; if many different computers were involved, each change would represent a major expense.

Many of the problems associated with enabling computers and applications to communicate that arose during the pre-internet era have subsequently been eased or resolved. It should be noted that the term "middleware" was not commonly used until the late 1980s, when distributed corporate systems architectures started to evolve. Prior to this the term "systems integration" tended to be used.

A primary enabler of easier communication has been the adoption of standard internet protocols such as TCP/IP, which provide well-defined languages and media through which computers can exchange data. The development and adoption of XML, which defines a uniform format for all kinds of data, has freed developers from having to write customized data-conversion software for every entity with which their system communicates.
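The kind of format translation that such middleware performs can be illustrated with a small sketch. The example below is hypothetical: it assumes one system emits flat CSV records while its partner expects XML, and simply maps the one onto the other using Python's standard libraries.

```python
import csv
import io
import xml.etree.ElementTree as ET

def csv_record_to_xml(csv_line, field_names):
    """Translate one CSV record (the sending system's format) into an
    XML document (the receiving system's format)."""
    values = next(csv.reader(io.StringIO(csv_line)))
    record = ET.Element("order")
    for name, value in zip(field_names, values):
        ET.SubElement(record, name).text = value
    return ET.tostring(record, encoding="unicode")

# A record leaving the (hypothetical) legacy order-entry system:
xml_doc = csv_record_to_xml("Acme Corp,widget,12",
                            ["customer", "item", "quantity"])
print(xml_doc)  # <order><customer>Acme Corp</customer>...</order>
```

Real middleware must of course also handle transport protocols, queuing, and error recovery; this sketch shows only the data-conversion step that XML adoption has largely standardized.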
The problem of inter-operability has also been addressed by the Object Management Group (OMG), whose members focus upon producing and maintaining specifications for Inter-operable enterprise applications. Their best-known product is the Common Object Request Broker Architecture (CORBA) specification, which provides a specification (through an interface definition language, IDL) that allows developers to define interfaces for their programs using a standard language that can be understood and used by any IT organization.

Business value proposition
Middleware has always been used to enable computer systems to communicate with each other. The term Systems integrator is used for consultants who build middleware to interconnect disparate systems. An example of this is when an organization pursues a "best-of-breed" strategy for an ERP implementation. In such a project, a company may buy an HRM system from one company and a financial system from another. In order for the systems to communicate effectively with each other, some middleware would potentially have to be written. This would require, on the part of the system integrators, specific information about the packages, including the protocols used and the data requirements of each. The process of integrating modern systems is simplified when vendors adopt open standards and protocols such as TCP/IP and XML. It is further aided by vendors' adoption of the CORBA interface definition language, or a similar technology, so that a system's interface requirements can be clearly understood.

Summary of positive issues
Middleware techniques are well known and supported within the IT community. The Object Management Group has created the CORBA specification, which provides a specification (the interface definition language, IDL) that allows organizations to create middleware in line with a standard methodology and specification system.

Summary of potentially negative issues
The development of middleware can be a difficult and resource-intensive operation, and the standards, particularly CORBA, may be very complex.

Reference
- P. Bernstein (1996). "Middleware: a model for distributed system services," Communications of the A.C.M., Volume 39, No. 2.

Associated terminology: ERP.

MIS (management information systems) department

Definition: The management information systems department is the functional entity within a corporation that is responsible for information systems and the alignment of those systems with corporate strategy.

Overview
In the context of a business, the MIS department or "IT organization" is the functional area tasked with providing the business enterprise with the technology resources necessary to perform its functions effectively. The MIS department, through the office of the chief information officer (CIO), provides advice on technological strategy to the board of directors and senior executives of the company, while maintaining current systems, developing systems to meet future needs, ensuring regulatory compliance, developing disaster contingency plans, ensuring security, and supporting and enabling the business units and divisions.

In the context of academic study, the MIS department is usually located within the School of Business and focuses upon providing business-related information systems skills to its students. This differentiates
MIS departments from computer science departments, which focus upon the theoretical aspects of computing and programming systems, and computer engineering departments, which tend to focus upon the technological aspects of systems.

Business value proposition
Corporate MIS departments emerged in the mid 1980s, evolving from Data processing departments. The role of MIS departments has been transformed in many ways and can be considered to reflect what Venkatraman terms a Value center rather than a Cost center, as they were previously considered to be.

A value-centered MIS organization is based upon four concepts. Firstly, rather than being a cost center, the MIS department provides a technological environment that not only supports corporate processes but also enables those processes to achieve high operational efficiency. Secondly, the MIS department also acts as a service center that provides "drivers" of competitive advantage to the organization. A typical example is to think of the help desk as a cost center, but in the value-centered approach it is the contribution of the help desk to the overall performance of the enterprise that is considered, not its absolute cost. The third dimension of the value-centered MIS department is its ability to act as an investment center that attempts to derive new high-yielding business opportunities from existing IT resources and emerging technologies. The fourth aspect of the value-centered approach is to consider the MIS department as a profit center that provides services and products to external markets at a profit.

The MIS department is tasked with many responsibilities in the modern organization, ranging from providing advice on technological strategy to the CEO and Sarbanes–Oxley compliance information to the board of directors, to the development of mission-critical software systems that form the very core of the business' competitive strengths. The MIS department is also tasked with ensuring systems integrity and security, maintaining and upgrading current systems, assessing outsourcing options, working with software vendors, developing systems directly with customers and suppliers, complying with regulatory agencies, developing disaster contingency plans, and supporting and enabling the business units and divisions.

The study of MIS occurs in the MIS departments of Business Schools, where scholars undertake research into the multitude of activities associated with corporate systems. There are over fifty scholarly journals that publish the research of the academic community and several professional organizations and societies that promote scholarship, education, and standards in the professional IT community.

References
- N. Venkatraman (1997). "Beyond outsourcing: managing IT resources as a value center," Sloan Management Review, Spring.
- MIS Quarterly.
- Communications of the A.C.M.
- Information Systems Research.
- Journal of MIS.
- Information & Management.
- European Journal of Information Systems.
- Journal of Information Technology.

Associated terminology: Chief information officer, Enterprise resource planning.

Modem

Foundation concepts: Binary, Bit, Analog, Bandwidth.
Definition: A device that converts digital data signals into audio form (and vice versa), enabling transmission on telephone lines.

Overview
The word "modem" is a contraction of Modulator–demodulator. Modulation is the act
of modifying one signal, the Carrier, so that it carries another signal with it; Demodulation is the reverse: extracting a signal that was carried on the back of another.

With a modem, the carrier is an audio signal, an audible whistle, and the signal carried is any piece of digital data. All data may be encoded as a simple sequence of 1s and 0s: binary. A sequence of 1s and 0s can easily be used to control an audio signal by varying its pitch. A 1 is represented by one note, and a 0 by another. The electronic circuitry required to perform this kind of modulation is extremely simple and cheap. Demodulation, namely detecting which note is being received, and converting it back to a 1 or 0 as appropriate, is similarly simple and cheap.

In essence, that is all a modem is. Digital data is converted into a series of musical notes and back again. The reason for this is, of course, the desire to make computers communicate over long distances. Setting up a large-scale digital network is a very expensive and time-consuming task. The public telephone network was already in existence, and reached almost everywhere, so it provided an obvious solution. Telephone networks can transmit only audible signals; purely digital data will simply not get through, so the modem is the essential intermediary, converting digital data into a form that can be transmitted on a standard unmodified public telephone network.

Public telephone networks apply bandwidth limitation to all transmitted signals. The highest frequencies are cut off, which is why the letters S and F can be hard to distinguish (the difference is in the details of the high-pitched hiss), and dog whistles are completely cut off. The reason is simple economics: with bandwidth limitation, many signals can be multiplexed to share the same long-distance lines.

If the carrier signal has a limited bandwidth, then so does the carried signal. This is one of the most commonly misunderstood but most basic facts of information theory. If the carrier signal is cut off at 5 kHz (quite a high note really, in musical terms), then no clever inventions will get more than 5000 bits of information through per second. High-speed modems rely on the fact that data and information are not quite the same thing: the data that people want to transmit usually has a high degree of redundancy, and can easily be compressed into something smaller: 50 000 bits of personal data could be quite easy to compress into 5000 bits of real information.

The use of dial-up service over a normal telephone line requires a modem at both ends. Broadband connections (DSL, cable modem, etc.) use devices that are called modems, and they do modulate and demodulate signals, but they are not using a standard unmodified telephone line; so they are not subject to the same bandwidth limitations. Direct network connections (ethernet, fiber optics, etc.) use devices that are never called modems, even though they do perform the same tasks of modulation and demodulation outside the realm of audio signals.

Business value proposition
Modem technology has been an integral part of computer networks for decades, and today it provides high degrees of reliability at low cost. Modems range from external devices through which any computer may be connected to an external network to the built-in devices typical of laptops, and provide a series of options for systems managers to select from. Wireless and other network connections use a device that is technically a modem, but almost never so called.

While traditional modems are in general terms slower than broadband modems, they are still sometimes useful as a mechanism for interactive situations that require a consistently low degree of latency in the response. A common example occurs when connecting to a command-line system such
Motherboard

Multicast
message to that address, and all routers and gateways must know to forward messages with multicast addresses in multiple directions. This requires a lot of cross-network organization; most internet routers do not support multicasting. True multicasting can be relied upon only in a controlled network environment, perhaps a corporate network in which all routers are subject to the same corporate policy. Without such a controlled and friendly environment, out-of-LAN multicasting is a very complex and inefficient technology.

Business value proposition
Multicasting is a possible technique for distributing a communication amongst the members of an interacting network community. While the technology for internet multicasting beyond a LAN is still developing, multicasting within a LAN is a useful mechanism for distributing data. Multicasting provides an efficient and effective technique for corporate audio and video conferencing, and for online training over the same internal network.

Summary of positive issues
Multicasting is a useful, efficient, and effective technique for communicating amongst a group of users on a LAN.

Summary of potentially negative issues
Multicasting beyond the confines of a LAN is not at all well supported. It is still a developing research topic.

Associated terminology: LAN (local-area network), OSI seven-layer model, Ethernet, Internet protocol.
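The efficiency argument can be made concrete with a toy model (purely illustrative; real multicasting involves group-membership protocols and class-D addresses in the 224.x.x.x range): delivering one message to a group costs one transmission on the shared medium, where repeated unicasts cost one transmission per recipient.

```python
class SimulatedLan:
    """Toy model of one LAN segment: a single shared medium that
    every attached host can see, as with ethernet multicast frames."""

    def __init__(self):
        self.groups = {}       # host name -> set of multicast groups joined
        self.frames_sent = 0

    def join(self, host, group):
        self.groups.setdefault(host, set()).add(group)

    def multicast(self, group, message):
        """One frame on the wire reaches every member of the group."""
        self.frames_sent += 1
        return sorted(h for h, g in self.groups.items() if group in g)

    def unicast_to_group(self, group, message):
        """The alternative: one separate frame per recipient."""
        members = sorted(h for h, g in self.groups.items() if group in g)
        self.frames_sent += len(members)
        return members

lan = SimulatedLan()
for host in ("alice", "bob", "carol"):
    lan.join(host, "224.1.1.1")   # an address from the multicast range

lan.multicast("224.1.1.1", "video frame")
cost_multicast = lan.frames_sent
lan.unicast_to_group("224.1.1.1", "video frame")
cost_unicast = lan.frames_sent - cost_multicast
print(cost_multicast, cost_unicast)  # 1 3
```

With three conference participants the saving is small; with hundreds of training-video viewers, one frame instead of hundreds is the difference that makes LAN multicasting attractive.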
Natural language processing (NLP)
properly is to perform full NLP to understand the grammatical structure of the input, and then reverse the process to convert the structure back to speech. Automatic translation is still an incompletely solved problem.

The ability to capture context was studied by Noam Chomsky, who formulated a new theory for the basis of linguistic research and computational NLP systems. His theory is based upon the use of Generative grammars, constructs used to describe how a sentence is formed: e.g., a [sentence] may be decomposed into a [noun phrase] followed by a [verb phrase]; the [noun phrase] and the [verb phrase] may then be broken down into other constructs such as [determiner], [noun], [verb], [article], [adjective], [adverb], etc. These constructs may then be used to create formal grammars through which an input stream of words may be parsed as a first step toward extracting their meaning.

There have been many types of grammars used within natural language research, including Phrase-structured grammars, Transformational grammars, Case grammars, and Augmented transition networks, and parsing is a central task upon which NLP is based. Typically, commercial systems such as those used in customer-relationship management are built to cover dialogs in very limited areas, and the limited vocabulary helps simplify the processes of grammar construction, parsing, and response generation.

Business value proposition
Public perception of NLP has been very positive since the days of seeing Robby the Robot from Forbidden Planet and the tinny-voiced computer of Star Trek responding helpfully and in perfect natural language to every question posed by the crew. Even though reality still does not come close to those ideals, and shows no signs of doing so in the foreseeable future, the general public continues to believe otherwise.

One aspect of NLP technology that has been deployed in a variety of ways, with a variety of levels of success, is that of voice recognition. Voice recognition is an extension of NLP in which spoken words in audio format are taken as input, rather than typed text. This adds a whole new layer of difficulty to an already intractable problem: that of recognizing words, one of the many tasks that humans handle with ease but for which no reliable computational method has yet been discovered. This is not to say that the technologies associated with voice recognition have not been commercially exploited; indeed, specialist systems do exist. For example, word-processing systems that recognize the user's voice commands have been created, although considerable training of the system is frequently required when the system is to be used in domains with specialist vocabulary requirements.

Call centers use a technology known as Interactive voice response (IVR) to automate aspects of their customer service operations. These systems work in a dialog mode and, to be successful, are required to be very flexible in their voice recognition component due to the potential range of voices, accents, and dialects with which they have to interact. So far, they can successfully handle only the simplest of situations.

NLP is also used in systems that automate the reading of customer emails, attempting to interpret their desires and responding appropriately. An area of growth for NLP is that of computer-based training, for example for instruction in a foreign language, mathematics problems, or corporate processes.

Summary of positive issues
The academic literature on NLP and voice recognition systems is extensive. Many commercially supported systems are available for a wide variety of tasks. NLP systems are best deployed on automating routine
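The generative-grammar idea described above, a [sentence] decomposing into a [noun phrase] followed by a [verb phrase], can be turned directly into a toy parser. The grammar and vocabulary below are invented for illustration; real NLP grammars are vastly larger and far more ambiguous.

```python
# Tiny phrase-structure grammar: each symbol lists its possible expansions.
GRAMMAR = {
    "S":   [["NP", "VP"]],              # [sentence] -> [noun phrase][verb phrase]
    "NP":  [["Det", "N"]],              # [noun phrase] -> [determiner][noun]
    "VP":  [["V", "NP"], ["V"]],        # [verb phrase] -> [verb][noun phrase] | [verb]
    "Det": [["the"], ["a"]],
    "N":   [["customer"], ["order"]],
    "V":   [["places"], ["arrives"]],
}

def parse(symbol, words, pos=0):
    """Try to derive words starting at position pos from symbol.
    Returns the position after the match, or None on failure."""
    if symbol not in GRAMMAR:          # a terminal: must match the next word
        return pos + 1 if pos < len(words) and words[pos] == symbol else None
    for production in GRAMMAR[symbol]: # try each expansion in turn
        p = pos
        for part in production:
            p = parse(part, words, p)
            if p is None:
                break
        else:
            return p
    return None

def is_sentence(text):
    words = text.split()
    return parse("S", words) == len(words)

print(is_sentence("the customer places the order"))  # True
print(is_sentence("places the customer"))            # False
```

Limited-domain commercial dialog systems work on exactly this principle: a small grammar over a small vocabulary keeps parsing tractable, which is why they cope only with narrowly defined conversations.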
Network
not desired, each LAN will also incorporate a bridging device (sometimes a computer with two network cards, sometimes a special-purpose device such as a Switch, a Bridge, or a Router). The bridging device communicates both on the LAN and on the next level of network above. Any computer on the LAN wishing to communicate outside the LAN must have all communications forwarded by the bridging device. This simplifies network management (only one device needs to know how to deal with the internet as a whole), and provides a useful security bottleneck.

The bridging devices for a number of LANs may be similarly connected together on a larger-scale network, possibly built with higher-bandwidth components since it must carry all inter-LAN and internet traffic, and this larger network has its own bridging device that connects it to the next level of network. Ultimately, there will be a very-high-capacity connection to the entire internet backbone. LANs are most commonly built on ethernet technology, giving bandwidths from 10 to 1000 million bits per second. Higher-level connections use a much wider variety of technology to support the much higher capacities needed, including fiber optics, microwaves, and satellite communications.

The way in which computers are connected in a network is sometimes referred to as the "topology" of that network. These topologies take their names from the shape of the network and include Bus, Star, Ring, and Mesh.

A Bus topology consists of a single cable along which computers are connected in a long row. A computer communicates by putting on the cable a message that is read by all the computers, but acted upon only by the one with the correct address. This topology is effective only when there are only a few computers, since only one computer can put a message on the cable at any one time. The bus topology is, however, cheap and easy to build. This is typical of Thinwire ethernet.

A Star topology comprises a central hub or Switch into which a cable from each computer runs, so that a computer sends a message to the hub, which then relays the message to either all the computers in its network, or, in the case of a switch, only the correct one. This configuration is easy to build and is resilient unless the hub has a problem. Most modern LANs have a star topology.

A Ring network, sometimes called a Token ring, is a network of machines connected in a circle. Messages are sent in one direction around the ring, from one computer to the next, until a machine recognizes itself as the intended recipient. If the message makes its way all around the ring, then the original sender recognizes it and the message is recognized as undeliverable. To control proper sharing of network bandwidth, a dummy message called the Token is passed around the ring when there is no real message to be transmitted; a computer may initiate the passing of a new message only when it receives that token. Ring networks are not very robust, since a single machine failure will prevent the whole network from functioning.

In a pure physical Mesh topology every machine is connected to every other, offering redundancy and fault tolerance; however, this is impractical in reality and hence Incomplete-mesh topologies are usually built instead, in which only a selected few of the many possible connections are actually made. The backbone of the internet has an incomplete-mesh topology.

There are several hybrid models that have been created and these are typically Star bus and Star ring. A star bus uses a bus backbone to link hubs together and run star configurations from those hubs. A star ring uses a hub as a means of transmitting the message or token around a virtual star configuration linked into the hub.
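The practical difference between these topologies shows up in simple arithmetic on the number of links (cables) required to connect n machines; a back-of-the-envelope sketch:

```python
def mesh_links(n):
    """Pure mesh: every machine wired directly to every other."""
    return n * (n - 1) // 2

def star_links(n):
    """Star: one cable from each machine to the central hub or switch."""
    return n

def ring_links(n):
    """Ring: each machine wired to the next, closing the circle."""
    return n

for n in (4, 10, 100):
    print(n, mesh_links(n), star_links(n), ring_links(n))
# At 100 machines a pure mesh needs 4950 links against 100 for a star
# or ring, which is why incomplete meshes are built instead.
```

The quadratic growth of the mesh count is the whole argument: redundancy is bought with cabling that quickly becomes unaffordable, so designers reserve mesh-like wiring for backbones and use stars within buildings.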
Network-address translation (NAT)

Network devices
a network increases rapidly as the number of computers connected increases, until it reaches a point at which the network is saturated and performance degrades sharply. A switch breaks a network into smaller sub-networks with much less cross-traffic.

A Bridge is a particular form of switch that usually has just two connections, and is generally used to connect a Local area network (LAN) to a larger grouping. Instead of being pre-programmed with information on network structure, a bridge will often "learn" the network for itself by simply keeping note of the kind of traffic that occurs on either side.

A Router is similar to a switch, but works at a different level. Switches generally decide whether or not to repeat traffic on the basis of physical identification of the traffic's destination. A router typically looks at the IP address, and may use much more general rules (such as "all IP addresses beginning with '111.222' are found along cable A"). Many devices sold as switches for domestic or office use also perform some router functions.

A Gateway is a kind of router that very frequently has the form of a computer running special software, rather than a single-purpose smaller piece of hardware. A gateway is the bottleneck connection between a LAN and the network as a whole.

The terms "server" and "client" properly refer not to an item of hardware, but to a particular implementation style for networked software. A Server is a software application that provides some network service; once started, it "listens" for requests coming from other systems, processes them, and sends back responses. Servers are passive systems that wait until some other system specifically requests that they act. A Client is a software application that makes use of a server in that manner. Any particular computer may be running a large number of different servers concurrently. Commonly, FTP, HTTP (i.e., web), SMTP (email-sending), and IMAP (email-receiving) services are all provided together. Similarly, a computer may be both a server and a client (for different applications) at the same time. When the term "server" is used to refer to an actual computer, it means either a computer that runs some server software, or a computer that is supposedly more powerful than the average workstation and therefore would be more suitable for running heavily used server software.

Business value proposition
New network devices continue to be developed and planned: these include routers that can "understand" the type of traffic that they direct and facilitate faster, more efficient data routing; elevators (lifts) that can transmit their status back to the manufacturer, allowing proactive maintenance planning; and refrigerators with bar-code readers that can order groceries over the internet from preferred vendors.

The development of network devices that communicate over wireless networks also continues to evolve, involving devices embedded in cellular telephones, automobiles, PDAs, and laptop computers.

Summary of positive issues
There is a very large set of options for network developers to build their networks from, connecting "intelligent" devices, such as PDAs and cellular phones, together through routers, switches, and gateways.

Summary of potentially negative issues
As network device use grows, bandwidth demands will continue to grow and technology upgrades will continue to be required.

Reference
- L. Peterson and B. Davie (2003). Computer Networks: A Systems Approach (San Francisco, CA, Morgan Kaufmann).
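The server/client distinction described above can be seen in a minimal sketch using Python's standard socket library: the server listens passively for a request, while the client actively connects and asks. The port choice and "echo" behavior are arbitrary illustrations, not any real protocol.

```python
import socket
import threading

def serve_one(listener):
    """The server role: wait passively for one request, then respond."""
    conn, _addr = listener.accept()      # blocks until a client connects
    request = conn.recv(1024)
    conn.sendall(b"echo: " + request)    # process the request, send a reply
    conn.close()

# Set up the listening socket; port 0 lets the OS pick any free port.
listener = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
listener.bind(("127.0.0.1", 0))
listener.listen(1)
port = listener.getsockname()[1]
threading.Thread(target=serve_one, args=(listener,), daemon=True).start()

# The client role: actively connect, send a request, read the reply.
with socket.create_connection(("127.0.0.1", port)) as client:
    client.sendall(b"hello")
    reply = client.recv(1024)
print(reply)  # b'echo: hello'
```

Both roles run here on one machine, which illustrates the point in the text: "server" and "client" name software behaviors, not pieces of hardware.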
Neural network
greatly simplifies the development of applications and the training of neural nets.

Summary of potentially negative issues
Neural network technologies are applicable to small specific problems rather than to systems within large ill-defined problem areas. The training of a neural network can take a considerable amount of resources and time. A complete understanding of neural network technologies requires training in mathematics and computer science. The use of neural networks requires that the user understand that the system is not perfect and dependence upon the results implies an understanding of the risk level associated with that system; neural networks do not have the ability to explain their behavior.

References
- J. Freeman and D. Skapura (1992). Neural Networks (New York, Addison-Wesley).
- J. A. Anderson (1995). Introduction to Neural Networks (Cambridge, MA, MIT Press).

Associated terminology: Machine learning.

Normalization

Foundation concept: Database.
Definition: The conversion or reformatting of data into a standard, rationalized, uniform representation that still provides the same information.

Overview
Fractions all have multiple representations: 1/2, 2/4, 5/10, and 48/96 are all really the same thing. In general use, this does not present a problem, and sometimes has a benefit: the notion of "five out of ten" may be more accurately expressive in some circumstances. However, when a comparison is required, having many representations for the same thing does cause trouble. It is easy to see that 3/7 and 3/7 are the same, but noticing that 27/63 and 3/7 are equal requires some intellectual effort. Normalization provides a single standard representation for all possible values. If all data is represented by its normal form, checking for equality is as simple as noticing that two things look the same.

In computing, it can save a significant amount of processing time if all data is stored in a normal form. No loss of expressivity is incurred, since output filters may be applied to ensure that data is printed in whatever form is desired.

For data items more complex than fractions, the design of a normal form and the procedures for converting data into it can be a complex task, but the benefits are correspondingly greater. The most important commercial aspect is in the design of Multi-table relational databases. As an example, consider a small business that accepts orders from a number of regular customers. It would be perfectly sensible for them to arrange their database in (at least) two tables. One table would record full information on all of their customers (name, address, tax identification, account balance, etc.), and another table would record all the orders (customer name, item ordered, date, quantity, etc.). Although the division into multiple tables is a sound decision, the given design of those two tables is not.

What happens when a customer changes their name, an option commonly taken by people and corporations? Naturally the database will need to be updated. Changing the customer table is a simple operation, but the order table will also need to be changed to keep it consistent. Every single record of an order made by the customer will need to be updated, and that is a long procedure. While the update is in progress, the whole database will be in an inconsistent state and unsafe for use. The design is also unsafe; if the customer name for an order is mistyped, and that error
goes undetected, there will be no way for an automatic database system to associate that order with the originating customer.

A set of rules introduced by E. F. Codd in 1972, known as Boyce–Codd Normal Form (BCNF), provides a normal form for relational database design. If the tables of a database satisfy the BCNF rules, then the problems indicated above, and many others, will not occur, and the database will generally make more efficient use of disk space, provide faster response times, and be more robust.

Normal forms exist for many other domains. Disjunctive normal form is used for logical formulæ; it allows equivalence to be determined by simple visual inspection, common operations to be performed quickly, and direct conversion into hardware designs. Backus normal form is used to specify the syntax of programming languages, and permits automatic parser generation. Church's normal form for Lambda-expressions is an essential tool of theoretical computer science.

Business value proposition
Normalization enables technologists to develop database systems that are efficient and effective in their structure. This not only facilitates the development, maintenance, and growth of the database as needs require but also allows these activities to be carried out with precision. The high technical standards associated with normalized systems also help to ensure that the systems work to specification. This may require more effort on the part of the technology team, but ensures that the system will not have unexpected problems later and require troubleshooting, which is an expensive and unpredictable activity.

Summary of positive issues
The rules for database normalization have been developed and used for over 30 years, and they provide a formal basis for database development. In other areas, normalization of data may be expected to increase efficiency and reliability.

Summary of potentially negative issues
Normalization can require very specialized expertise or an increased training effort on the part of the organization and technologist.

Reference
- E. Codd (1972). "Further normalization of the data base relational model," in Data Base Systems, ed. R. Rustin (Englewood Cliffs, NJ, Prentice-Hall).

Associated terminology: UML, Data-flow diagrams, Entity-relationship diagram.
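The customer-and-orders example can be made concrete. In a normalized design, each order refers to the customer by a stable identifier rather than by name, so a change of name is a single update and the inconsistency described above cannot arise. The following is a minimal sketch in Python using SQLite; the table and column names are invented for the illustration:

```python
import sqlite3

# Normalized two-table design: orders reference customers by an
# immutable numeric key, so customer details are stored exactly once.
conn = sqlite3.connect(":memory:")
cur = conn.cursor()
cur.execute("CREATE TABLE customers (id INTEGER PRIMARY KEY, name TEXT, address TEXT)")
cur.execute("CREATE TABLE orders (id INTEGER PRIMARY KEY, "
            "customer_id INTEGER REFERENCES customers(id), item TEXT, quantity INTEGER)")

cur.execute("INSERT INTO customers VALUES (1, 'Smith & Co.', '1 High Street')")
cur.executemany("INSERT INTO orders VALUES (?, ?, ?, ?)",
                [(1, 1, 'widgets', 10), (2, 1, 'sprockets', 5)])

# A change of name touches exactly one row; no order record needs editing.
cur.execute("UPDATE customers SET name = 'Smith Holdings' WHERE id = 1")

# Every order automatically reflects the new name via the join.
rows = cur.execute(
    "SELECT c.name, o.item FROM orders o "
    "JOIN customers c ON c.id = o.customer_id ORDER BY o.id").fetchall()
print(rows)
```

Had the orders table stored the customer name directly, the same change would have required rewriting every order row, with the risks described above.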
Object-oriented
methods are dependent upon inheritance, and make it simpler to define data objects that contain within them diverse kinds of other data objects. This also increases code reusability and reduces testing and maintenance turn-around times. Unfortunately, inheritance and polymorphism, although simple concepts, bring with them a host of attendant concerns that seriously complicate programming languages: what happens when a new data type is defined in terms of two incompatible older data types? What happens when inherited code operates on data types that had not been defined when it was written? There are many other very technical questions that have to be resolved for any system that incorporates these ideas.

Java solved many of the arcane questions by simply forbidding the conditions that lead to them being relevant. Adherents of C++ claimed that this made the language far inferior to their favorite, but that claim has not been supported by the evidence of years of practice. Object-oriented programming can be a very simple technique, and, when it is, it is of great benefit to all concerned: programmers, their managers, and their customers. It is part of the logical progression of software technology, in the same direction as Structured programming, which also met with great resistance from industry for a very long time, but is now a sine qua non of software design.

Business value proposition
Object-oriented programming was intended to be a methodology through which programmers could produce readable, reliable, and reusable code, and was popularized by the C++ programming language. However, while the theory upon which it was based is sound, the practical use of the programming style has been clouded to varying degrees by misunderstandings and unbroken old programming habits. A primary problem is that many see object-oriented programming and the language C++ as merely different names for the same thing, and C++ is a language of almost impenetrable complexity. Object-oriented programming can be clean, clear, simple, and understandable, as the original object-oriented language, Smalltalk, showed.

The advent of the Java programming language from Sun Microsystems in the mid 1990s gave programmers a somewhat more manageable environment in which to develop their applications, since it prohibited many of the unsafe constructs freely available in C++. The availability and use of Java has helped to reduce the complexity associated with the object approach and has encouraged programmers to produce systems in a manner more closely associated with their original intent, that of facilitating correct, clearly structured, reusable software.

Summary of positive issues
Object-oriented programming allows for the development of readable, reliable, and reusable code, if it is used properly. The object-oriented style has an extensive literature and is supported by many major software vendors. Java is becoming more popular as a language and supports the object-oriented style of programming.

Summary of potentially negative issues
The use of C++ and the poor use of Java have in many cases caused complicated, impenetrable programs to be written, especially in those instances when programmers are allowed to believe that complication is acceptable.

References
- M. Weisfeld (2003). The Object Oriented Thought Process (Indianapolis, IN, Sams).
- J. Rumbaugh, M. Blaha, W. Premerlani, F. Eddy, and W. Lorensen (1991). Object
Oriented Modeling and Design (Englewood Cliffs, NJ, Prentice-Hall).
- A. Eliëns (1995). Principles of Object Oriented Software Development (New York, Addison-Wesley).
- A. Goldberg and D. Robson (1989). Smalltalk-80, 1st edn. (Boston, MA, Addison-Wesley Professional).

Associated terminology: C++, Java, Programming language, Algorithm.

OLAP (online analytical processing)

Foundation concept: Database.
Definition: Online analytical processing is a set of analytical tools that enable users to examine the relationships that exist within a structured data set, usually held in the form of a data warehouse.

Overview
OLAP is a collection of analysis techniques that are applied to data sets in order to determine relationships between the elements of that set. A well-known example is that a supermarket used OLAP to look for correlations in sales, and it identified that customers coming into its stores in the evenings just to purchase diapers typically also purchased beer at the same time, leaving with two items rather than one. Knowledge of this unusual combination of purchases allowed the store owners to achieve better stock modeling and demand forecasts.

Typically OLAP is used to manipulate and examine financial or sales data models (e.g., looking at sales at a variety of geographical levels: sales by country, region, district, or zone) and uses statistical and other tools to identify trends and anomalies. To help in the analysis, mathematical and statistical tools are built into the OLAP packages and accessed through a variety of graphical user interfaces that allow data to be presented as graphs, pie charts, linearity charts, and so on.

There are differing degrees of sophistication available in the OLAP applications and the mechanisms through which the data is stored. The term OLAP was originally used by Dr. Ted Codd in 1993, and he used his relational data model (he also created the term Relational database) as the basis of OLAP. This is still used in some instances and is known as ROLAP. Subsequent to the relational model, the use of multidimensional databases evolved, and the terms Data cube, Star model, and Snowflake model were coined to describe the data structures underlying the OLAP implementation. The OLAP model has been extended to be web-based or web-enabled (termed WOLAP), which allows users to access the OLAP system via a network connection. A fourth level of OLAP is the use of spreadsheets, which, through their built-in data-manipulation tool sets (e.g., pivot tables) and their graphical tools, can be useful for handling smaller data sets held in relational and non-relational structures.

Business value proposition
OLAP systems are beneficial to corporations wishing to analyze their data sets. The historical accumulation of corporate data provides a potentially rich source of information for companies. The data is typically divided into live transactional data and historical data, namely data that has been extracted from the live data set, transformed into a new data structure, and then loaded into a separate database (which can be a data warehouse). The OLAP systems then manipulate the data to allow for the identification of trends and examination of subsets of data, and present the data in a variety of formats. OLAP systems have several advantages, including the ability to add to revenues through better analysis of sales and marketing, improved customer satisfaction through enhanced product quality and service, provision of better, more accurate reporting, and the
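The geographical roll-up described in the Overview (sales summed by country, region, or district) can be sketched in a few lines. The sketch below groups a toy sales table at two levels of the geography dimension; the place names and figures are invented for the illustration:

```python
from collections import defaultdict

# Toy sales records: (country, region, district, sales amount).
sales = [
    ("USA", "Southeast", "Miami",   120),
    ("USA", "Southeast", "Orlando",  80),
    ("USA", "Northeast", "Boston",   95),
    ("UK",  "London",    "Camden",   60),
]

def roll_up(records, level):
    """Sum the sales measure at one level of the geography dimension:
    0 = country, 1 = region, 2 = district."""
    totals = defaultdict(int)
    for row in records:
        key = row[:level + 1]   # the dimension prefix to group by
        totals[key] += row[-1]  # accumulate the sales measure
    return dict(totals)

by_country = roll_up(sales, 0)
by_region = roll_up(sales, 1)
print(by_country)
print(by_region)
```

Commercial OLAP engines pre-compute and index such aggregates across many dimensions at once (the data cube), but the underlying grouping operation is the one shown here.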
Online communities
interested in games such as Dungeons and Dragons or Multi-User Dungeons; and Communities of transaction, e.g., online business-to-consumer (B2C) and business-to-business (B2B) communities such as Covisint, a procurement exchange focused upon the automotive industry.

Online communities use a variety of levels of technology to execute their interactions. These include "list servers" that distribute emails; chat rooms; web sites through which emails and messages are posted; electronic procurement software systems through which members buy, sell, and interact; and highly secure, highly reliable data-synchronization exchanges through which members distribute the data pertaining to their products and pricing. The control over access to communities can be unregulated (as with many communities of interest, for which activity and access are self-regulated by the membership), open but regulated by a moderator, or fully regulated by a controlling body.

Business value proposition
An online community can be a powerful binding force that can be harnessed to drive commerce. Commercial ventures such as eBay and Amazon started out in the form of online "communities" within which members regulated and monitored trades and recommended books to each other. List servers can be sponsored or augmented by the provision of advertising and product placement. Electronic procurement systems provide a focused industry- or product-specific location for participants to concentrate their efforts, reducing the need to build their own individual systems.

Summary of positive issues
There is a variety of mechanisms for providing a central location through which participants can interact. The range of technologies allows low-cost communities to be created quickly and spontaneously (e.g., for a specific event such as a marathon running race, or an event such as a planning meeting). At the other end of the technology spectrum are communities built around large-scale technology platforms that enable data synchronization, exchange, and procurement between the members.

Summary of potentially negative issues
Unregulated communities can lead to poor and unreliable data and information. The overhead required to monitor and maintain communities can be high. The costs associated with data-synchronization exchanges and communities can also be high. Competitive pressures to join procurement portals can force companies to join communities and be subject to their rules and norms.

Reference
- R. Plant (2004). "Online communities," Technology & Society, No. 26.

Associated terminology: e-Commerce/e-business.

Open source software

Definition: Open source software is any software whose source code is freely available, and is typically not subject to fees or royalties.

Overview
Since the origins of computing there has always been a culture of "free software" available within the computing community. The free software has included software from corporations and individual programmers who have placed tools, games, applications, and "fixes" (patches) on their systems for users to download via FTP or the internet. The Free Software Foundation (FSF) has been a strong advocate of free software since it was established in 1985; its members are "dedicated to promoting computer users' rights to use, study, copy, modify, and redistribute computer programs."
The FSF "promotes the development and use of free software" (http://www.fsf.org/) and is particularly well known for GNU, its collection of Unix utilities. Additionally, the Open Source Initiative is an organization that provides more structure to the provision of "free" code.

Open source code is typified by Linux, a variant of the Unix operating system that was created mostly by Linus Torvalds, who worked on the system in the early 1990s and finalized the first version in 1994. He made this freely available via the internet to other programmers, who subsequently added and contributed to the development of the system, which they made into a POSIX-compliant system (POSIX is a set of requirements for a standardized core of Unix). Linux has subsequently been distributed under the GNU General Public License from the Free Software Foundation. This allows its source code to be freely distributed and made available to the general public subject to certain conditions (see http://www.linux.org/info/gnu.html).

Open source became a well-known term as a consequence of the browser wars that occurred at the height of the internet revolution in the late 1990s. The battle of the web browsers gained a high profile during this period because it was a concern central to all internet users and developers. The battle culminated in 1998 when the Netscape Communications Corporation decided to make the client-side source code of its well-known eponymous browser available via the internet for free. Following the Linux and Netscape initiatives, the open source community became more formalized under the auspices of the Open Source Initiative (OSI) (http://www.opensource.org/), which was founded in 1998 as a non-profit corporation "dedicated to managing and promoting" what it has termed the "Open Source Definition," itself derived from the Debian "Free Software Guidelines." This definition states that, for software to be considered "open source" and recognized as such (the OSI offers certification signified through a trademarked "OSI Certified" logo), the software must satisfy ten criteria:

(1) Free redistribution,
(2) Source-code availability,
(3) Modification to and subsequent redistribution of the code must be permitted,
(4) Integrity of the author's source code,
(5) No discrimination against any persons or groups,
(6) No discrimination against any fields of endeavor,
(7) Distribution of license,
(8) License must not be specific to a product,
(9) License must not restrict other software, and
(10) License must be technology-neutral.

Interestingly, in the early days of computing, a large proportion of all software was free. Software would generally only work on one kind of computer, and, given the extraordinarily high cost of computers themselves, manufacturers would usually throw in the software as part of the package, not having to worry about their competitors' customers being able to benefit from it. Failure to provide and maintain an operating system at no extra charge would have made computers virtually unsellable. This all changed with the advent of cheap personal computers at a price that could not possibly support adequate software development, especially when desktop computers became relatively uniform, so one manufacturer's free software would equally benefit their competitors.

Business value proposition
Open source systems allow organizations to have access to free software for which the source code is available. The support and quality of the code can vary considerably from program to program and
organizations need to be aware of the risks associated with using such systems. However, it is also the case that some commercial vendors, rather than funding the continued support of an aging software package for a small (but sometimes dedicated) group of users, will place the source code in the public domain. This allows users and interested parties to work on the code, upgrade it, and maintain it. This has the effect of keeping the product's users happy (they no longer have to pay a license fee) and having the software supported for free by a user group, and as a consequence the vendor's brand is not weakened by having disenchanted ex-customers.

The open-source movement has developed and maintains products in a very wide range of product areas, including operating systems, word processing, spreadsheets, databases, and internet browsers. This has allowed some companies to use exclusively free-source software in their selection of products, and thus it is highly favored in organizations and countries that cannot afford proprietary vendor-managed software.

Summary of positive issues
The Open Source Initiative attempts to bring professionalism and standardization to the domain of open-source systems. Open-source software is of low cost and frequently under continuous development by user groups. Open source allows users to read, modify, and develop the source code of an application.

Summary of potentially negative issues
Open-source code can vary in quality and might not ever have been rigorously tested. The code might not have any documentation or specifications associated with it. Companies using open-source code need to ensure that the code is actually legally open-source code rather than just copied code that has been released under another name.

The existence of free software can put excessive pressure on traditional software companies, which may well have been producing a far superior product but may be unable to stay in business in the face of zero-cost "competition." There are enough users who value cheapness over quality, and enough uninformed corporate buyers who never have to use the free software that they recommend, to make this a serious concern in many areas.

It is not totally clear whether the free-software movement is really beneficial overall. There are situations in which reliable software is absolutely critical, and the typical free-software distribution agreement about providing no warranties and accepting no responsibility is inadequate. If skilled programmers are expected to give away the fruits of their labors for free, they will not be able to devote themselves professionally to the project.

References
- Free Software Foundation, 51 Franklin Street, Fifth Floor, Boston, MA 02110, USA.
- http://www.fsf.org/.
- http://www.opensource.org/.

Associated terminology: Linux (see Unix).

Operating system

Foundation concept: Software.
Definition: Essential, permanently running software that keeps a computer operating.

Overview
In the old days, running a computer required a full-time staff. Programs had to be "fed in" manually from decks of cards or magnetic tapes, then the input data had to be made available in a similar way, and everything had to be tended carefully until the final results could be extracted,
and another program could be started in its place.

As more convenient forms of storage, principally disks, became available, the requirement for human operators became an intolerable bottleneck, wasting far too much of the computer's still very expensive time. The solution was to make the computer operate itself. For each program run, a set of operating instructions would be written, stating exactly which program should be run, which files contain the data, what to do if anything goes wrong, and what to do with the results. These instructions were written in a form that could be understood and obeyed directly by the computer, usually called a Job-control language. The job-control instructions could even be stored in a disk file, completely automating the whole process, relegating human intervention to disaster recovery and paper loading.

In order for the computer to be able to understand the job-control instructions, some programming is required. Software must read and interpret the instructions, and carry out the actions indicated. This software was the earliest form of operating system. When a computer is first started up, some special action is required to load and start execution of the operating system's software. From then on, it is self-maintaining; the operating system never relinquishes control until the computer is turned off. Of course, all software can (and usually does) have design flaws; an error in the operating system can have a wide range of effects, from minor annoyances to introducing subtle and undetected errors into the results of programs and destroying data. Some kinds of error will simply cause the operating system to stop running, at which point the computer becomes unusable ("freezing up") until it is restarted; this is known as a System crash, and is sometimes visible as the infamous "blue screen of death."

Operating systems today often present two alternative faces to users. The traditional interface involves typed commands and responses, and is familiar to many in the form of DOS (Disk Operating System) or the "DOS shell." By modern standards, DOS is a very basic job-control language, capable of running only one application at a time, and arranging for its own execution to be continued, only if nothing goes wrong, after an application has finished. "Industrial-strength" operating systems usually have a similar typed command interface, known as a Shell, Monitor, or Command-line interface, that is generally preferred by technical users. The now-discontinued TOPS and VMS, together with all varieties of Unix, follow this pattern. The alternative interface is the now universal Windowing environment, or GUI (graphical user interface), preferred, or, more precisely, demanded, by non-technical users. Usually this windowing environment is just a secondary interface running on top of a traditional primary one, although the popular Windows operating systems have abandoned this model and reversed the situation.

Since these early beginnings, the scope, complexity, and error-proneness of operating systems have all grown enormously. Typical operating systems provide a large library of standard program components to simplify the software-implementation process. Multiprogramming allowed more than one application to be runnable concurrently, so that, when one is temporarily unable to continue, perhaps waiting for a slow input device, another could run in its place. Multiprogramming grew into time-sharing, the system under which multiple applications do in fact run concurrently, with control switching between them perhaps 100 times per second. Virtual memory, which allows more applications to run than would actually fit in the computer's memory, is another system that requires
constant operating-system supervision. Network access requires oversight to ensure that incoming messages are delivered to the right application. The ability to interpret the movement and clicks of a mouse and to provide a visual "desktop" requires much more computational effort than is needed with a traditional job-control language.

The full, efficient, and effective use of an operating system requires a significant degree of training. Many science-fiction books and films depict operating systems as intelligent agents (e.g., the HAL-9000 on the spaceship in the film 2001: A Space Odyssey and the computer on the Star Trek spaceship Enterprise). While it is possible to have natural-language voice interfaces, the "cognizant" systems in these films are still a long way off; their study is an ongoing aspect of research into artificial intelligence.

Business value proposition
The selection of an operating system depends upon the use to which it is to be put, the knowledge level and skill of the operator, and the resources available to support that system. There exists a range of operating systems to meet a user's requirements depending upon the configuration of these variables. One aspect of operating-system selection that may override the others is the maintenance of consistency across the organization. Hence an organization may choose to select a GUI-based operating system that a majority of the users will find easier to understand than a command-line interface. Versions of that system may be used across a range of platforms supporting a variety of applications. At one end of the spectrum one version may support the corporation's ERP system, while clients may run another version of the operating system, portable laptops a third, and hand-held devices such as PDAs and cellular telephones yet another version. The use of one "family" of operating system across all these devices benefits the organization by potentially reducing costs, through a reduction in maintenance, training, and license expenses.

A second strategy organizations may wish to pursue is the "best-of-breed" strategy, whereby one operating system is selected to work on the server cluster, another on the desktops, and yet others on the laptops, PDAs, and cellular telephones. Alternatively, a low-cost strategy may be appropriate, implying that freeware and open-source systems can be chosen with less regard to their applicability or usability.

A major concern in operating-system selection is the level of support associated with a particular system and vendor. The metrics associated with support level include the cost of the license and the associated maintenance levels (if any), the cost associated with service packs (upgrades and fixes), user groups and the user community, conferences, technical papers, and the level of support from application vendors.

It is also important to consider the relationship between the operating system and the applications that are to be supported by that system. This is important in the overall design of the systems architecture (the total combination of applications, operating systems, networks, etc.) because not all operating systems can run all applications or database systems. Equally, some applications such as ERP systems may run in a thin-client mode, making the operating system on the client almost irrelevant so long as the client can run a browser.

A further issue that is important in many purchasing decisions is the lifespan of the operating system, and the nature of any superseding operating system. The nature of the technology upon which an operating system works will be one factor that determines whether the operating system needs to be changed to stay in alignment with the technology in use (e.g., CPU, memory
systems, network requirements, monitor and peripheral devices). A second factor is the amount of support (and cost) that a vendor provides to transition to the next version. A third factor is the ability of new versions of the operating system to be backward-compatible so that all the previous applications will continue to run.

Simpler items of machinery, such as automobiles and washing machines, are often computer-controlled, and have software that controls their functions (e.g., the engine-management system, traction control, braking system, etc.). However, that software is not usually considered an operating system, since it is really just a piece of Firmware, and does not control the operations of other software. In some cases, onboard computers do interact with each other, and do run a number of different applications depending on changing circumstances, and many manufacturers use a special reduced-size version of a normal computer operating system to act as the basis of the onboard controller.

Summary of positive issues
There are several types of operating-system interface available, including command-line-style and GUI-style systems. Applications vendors can provide data regarding operating-system requirements for their systems. Operating systems are available in a variety of configurations to support a variety of applications and platforms. Many operating systems have both formal and informal support groups.

Summary of potentially negative issues
The types and variety of operating systems have been consolidating since the 1980s, resulting in three distinct primary camps: the Unix-like, the Windows family, and the others. Some operating systems lack sufficient support from vendors, have limited functional options, contain bugs, and have poor security. Some operating systems are not supported at all except through informal user groups and are considered legacy by their original manufacturer. Some operating systems are not backward-compatible and thus do not support the applications that ran on previous versions.

Reference
- A. Silberschatz, P. Galvin, and G. Gagne (2004). Operating System Concepts (New York, John Wiley and Sons).

Associated terminology: Unix, Compiler.

Optical character recognition (OCR)

Definition: Processing a digital scanned image of printed or written words, to isolate and identify the individual letters and the words they form.

Overview
It seems like a very simple problem: given a good digital image of some printed words, work out what they say. The shapes of characters are well known, and printed words provide a great deal of consistency. Naturally, understanding handwriting could be more difficult, but understanding printed characters seems to be a simple matter of seeing which of 26 already known patterns each letter matches.

Optical character recognition (OCR) is in fact an exceptionally difficult task. Even with the enormous commercial gains to be had from being able to process printed documents automatically, no system has yet been perfected. The problems are numerous. When scanned with enough resolution to give sufficient detail, no two printed characters ever look exactly the same. Variations in lighting, texture of paper and ink, slight imperfections in the image, imperfect alignment, and a host of other problems mean that looking for an exact match with predetermined letter shapes will never succeed. Added to that is the problem that different fonts, type sizes, and styles (bold, italic, etc.) provide a potentially
infinite variability to how any letter may look.

There are many commercially available software packages that do a creditable job of OCR, even down to properly setting paragraphs and diagrams in popular word-processors' file formats. None of them is totally reliable, or even comes close to the degree of accuracy that a capable and careful human reader could achieve. For the collection of essential data, for which accuracy is of some importance, it is always necessary to have a human operator compare the original printed document with the results produced by OCR software. This does not by any means mean that OCR is pointless. Human operators can compare the two versions much more quickly than they could possibly type the entire document, so OCR is a great time saver; it simply means that a totally automatic process can not be expected to produce error-free digital transcriptions of printed documents.
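The gap between "matching 26 known patterns" and real scans can be put in miniature (an invented toy example, not a real OCR system): a single stray pixel defeats exact template matching, while a similarity score merely drops slightly.

```python
# A toy illustration of why exact template matching fails for OCR:
# one speck of ink breaks equality, but a similarity score degrades
# gracefully. The 5x4 "bitmaps" below are invented for the example.
TEMPLATE_L = ("X....",
              "X....",
              "X....",
              "XXXXX")

scanned = ("X....",
           "X....",
           "X...X",   # a stray speck of ink picked up by the scanner
           "XXXXX")

def similarity(a, b):
    pairs = list(zip("".join(a), "".join(b)))
    return sum(x == y for x, y in pairs) / len(pairs)

exact_match = (scanned == TEMPLATE_L)     # False: one pixel differs
score = similarity(scanned, TEMPLATE_L)   # 19 of 20 cells agree: 0.95
```

Practical systems therefore rate candidate letters by closeness of fit rather than demanding identity, which is also why they occasionally guess wrongly.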
The similar problem of handwriting recognition is many times more difficult. Either users must write in a specially designed, highly stylized way (such as the strange alphabet familiar to users of the Apple Newton), or the system must be trained: hand-held through a selection of samples to learn a new writer's style. Even then, reliability is far lower than for printed-word OCR.

Business value proposition
The potential for OCR is vast, since the ability to capture electronic versions of whole collections of documents would move organizations closer to the envisioned paperless office. However, the technology has not yet reached the point at which this can be effortlessly and reliably achieved. OCR software is available to scan documents and store the results in a digital form, but the reliability of these systems is less than total, and it is necessary to cross check the electronic document against the paper original by eye to ensure correctness. Predominantly, OCR technology is effective in application areas that use special optical character sets, such as those found on checks. Another application of OCR is in computing devices that accept stylized handwritten input, but these systems also require extensive training of users if they are to work effectively.

Summary of positive issues
OCR can enable large quantities of documentation to be entered into a system (with varying degrees of accuracy). OCR systems that use specially designed characters and that are read in special-purpose readers are effective in automating processes (such as mail sorting). Some computing devices allow hand-written input through the use of special styli and screens.

Summary of potentially negative issues
OCR of documents does not provide 100% reliability of duplication when scanning "real-world" documents that were not specifically created to be processed electronically.

Reference
• H. Bunke and P. Wang (1997). Handbook of Character Recognition and Document Image Analysis (Singapore, World Scientific Publishing Company).

Associated terminology: Natural language processing, Neural networks, Machine learning.

Optical storage

Foundation concepts: Storage, Disk, Backup.
Definition: Data-storage systems based on reflection and refraction of light, rather than on the traditional magnetic properties of materials.

Overview
Primary storage in computer systems, also known as main memory or RAM, is now
exclusively built from solid-state electronics; data is stored as static electrical charges. Secondary storage, usually known as disk, is almost exclusively built from spinning disks on whose surfaces data is recorded as microscopic magnetic fields. Optical storage provides another alternative, in which data is stored as minute deflections or deliberate defects in a reflective or refractive medium (often just a thin layer of shiny aluminum). Data is read from optical media by shining an accurately focused laser light onto it, and measuring how the reflection of that light is affected.

Laser disks, or Disco-vision, were introduced in 1969, but were fundamentally analog systems, and could not have been used effectively for digital data storage. The first optical storage devices suitable for computer systems were the now universally familiar Compact discs or CDs. Although CDs were originally used exclusively for audio recordings, they store their contents in a completely digital form, and are ideally suited for all kinds of digital data. A CD consists of two layers of transparent plastic with a very thin layer of metal, usually aluminum, sandwiched between them. Microscopic bumps are printed onto the aluminum during manufacture, or may be "burned" in by laser light in a CD recorder. When a CD is played, another laser shines onto the surface, and the microscopic bumps cause detectable changes in its reflection.

CDs have a usual capacity of 650 MB, but that may be slightly increased in some models. Data may be read from a CD at rates of about 10 MB per second, and CD-recording hardware may fill a CD in about 5 minutes at best. Although CDs do provide good long-term data storage, they do not last for ever. Scratches on the plastic surface may often be repaired completely, or even ignored successfully, but deterioration of the aluminum layer can not be repaired. This deterioration is much accelerated by storage in poor conditions, and occurs more quickly in cheaply made CDs: when the glue holding the two layers of plastic tightly together starts to fail, air can get to the aluminum layer, and that is the beginning of the end. If CDs are to be used for archival or backup purposes, the lifetime of the chosen media must be carefully investigated.

DVDs (Digital video discs or Digital versatile discs) provided a large increase in capacity, and hardware for writing data to DVD blanks is now as widely available as that for CDs. A DVD uses the same basic technology as a CD, in the same way; improvements in technology between the introduction of the two simply allow for more to be done in the same space. DVDs use smaller bumps, read by a shorter-wavelength laser (visible red instead of infrared) from a thinner sheet of aluminum. The standard capacity DVD can store 4.7 GB of data, although more expensive dual-layer versions with a capacity of 8.5 GB are available. Dual-layer is not the same thing as double-sided; double-sided DVDs are not generally used in computers because of the need to turn them over or have two laser pickups.

The DVD market is confused by a profusion of standards. The most basic is DVD-ROM: this is a read-only format; data must be written during manufacture, and can not be modified later. DVD-R and DVD+R are two slightly different write-once formats: data may be written onto a blank disc once; after that it can not be modified. These are the cheapest and most useful for archival backups and software distribution. Most hardware can read both the -R and the +R versions, but some recorders can write onto only one format. Domestic DVD video players are also often capable of playing only videos recorded in their preferred format. DVD-RW and DVD+RW are re-writable formats; the discs may be erased and re-recorded many (but not unlimited) times. The difference between -RW and +RW is the same as the minor incompatibility between -R and +R.
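The capacities quoted above make backup planning a matter of simple arithmetic; a small sketch using the figures from the text (taking 1 GB as 1000 MB; the 10 GB archive size is an invented example):

```python
import math

# Nominal capacities quoted in the text, in gigabytes (1 GB = 1000 MB here).
CD_GB     = 0.65   # standard CD
DVD_GB    = 4.7    # single-layer DVD
DVD_DL_GB = 8.5    # dual-layer DVD

def discs_needed(archive_gb, capacity_gb):
    """How many blank discs a backup of the given size would consume."""
    return math.ceil(archive_gb / capacity_gb)

# A hypothetical 10 GB archive: 16 CDs, but only 3 single-layer DVDs.
cds  = discs_needed(10, CD_GB)
dvds = discs_needed(10, DVD_GB)
```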
DVD-RAM and DVD+RAM are extensions of the RW idea; instead of having to erase the whole disc before writing new data onto it, single blocks of data may be erased and re-written individually, so a DVD±RAM disc behaves like a normal but slow hard disk. The DL suffix indicates "dual layer"; dual-layer blanks are much more expensive than single-layer ones, and require special hardware for writing. Currently, only DVD-R DL and DVD+R DL are available.

Two further improvements to optical disc technology have recently become available. Both take the same basic technology even further, using a violet laser to detect even smaller, more closely packed bumps in the reflective layer. One is HDDVD (High-density DVD), with a standard capacity of 15 GB but a 30 GB dual-layer version; the other is called Blu-ray (because of the laser color), and has a standard capacity of 25 GB, but also dual- and quadruple-layer versions for capacities of 50 GB and 100 GB.

Magneto-optical discs, which were popular for a period, are not very widely seen now. In a magneto-optical system, data is stored as microscopic magnetic fields on the surface of a rotating disk, but a tightly focused laser light is also applied when data is written. This has the dual effect of allowing smaller magnetic fields to be written (thus increasing capacity) and rendering those fields optically detectable. When data is read back, only a beam of polarized light is needed, no magnetic pickups; the light interacts with the magnetic field when it is reflected, and the data is read from the reflected light beam. Magneto-optical discs are much more complex and expensive than traditional magnetic discs, and, once the price/capacity ratio for the latter had fallen significantly, the former fell out of favor.

Holographic storage, in which data is stored throughout a three-dimensional volume instead of merely on a two-dimensional surface, is the subject of some current research. If successful, it could result in a rapid increase in available storage capacities, and greater reliability, since it might not require any moving parts. No such products are commercially available, or promised for the near future; holographic memory currently lingers in the world of fiction.

Business value proposition
Optical storage offers organizations and individuals the ability to store and archive data in a semi-permanent medium at relatively low cost. For several decades tapes and "floppy" disks have been used to archive data; however, these media have the same inherent problem in that the data is encapsulated in a magnetic coating that can degrade over time. The same is true for audio and video cassette tapes, which frequently give very low-quality playback after a few years. While computer tapes are usually stored in a temperature-controlled environment and used infrequently, they suffer the same inherent limitations (in some systems the tapes need to be replaced as frequently as every ten backup cycles). CDs and DVDs solve this problem by not using magnetic materials that can degrade: instead their data is burnt onto the disc, although it must not be forgotten that these media do still degrade in their own ways. While CD and DVD technology has been evolving, its major limitation is the amount of data that can be written onto a disc. "Juke-box" technology for CDs has been developed, allowing many CDs to be created without human intervention or delays.

The Blu-ray optical technology has the potential to expand the options open to network users and managers, since the data storage capacity is much higher than for older optical storage devices, and the speed at which data may be saved is also much increased. Blu-ray technologies have been designed to accommodate the needs of
the high-definition television communities; however, the technologies will inevitably be adapted to computer-related uses.

Summary of positive issues
Optical storage technologies offer individual users an easy way to store relatively high amounts of data on almost non-volatile media. New technologies such as Blu-ray have been developed to provide higher capacity storage at higher data transfer rates.

Summary of potentially negative issues
Optical storage technologies have been limited in size, and over a long period of time the medium itself is subject to decay if not well maintained. The variety of acronyms for CD and DVD media is large and confusing, and care needs to be taken when selecting a disc type for use. Holographic storage is currently a research area, but no products have yet been demonstrated for commercial use.

Reference
• F. Yu and S. Jutamulia (1996). Optical Storage and Retrieval (New York, Marcel Dekker).

OSI seven-layer model

Foundation concepts: Network, Protocol.
Definition: A commonly used abstraction of network software architecture.

Overview
Network applications can be very complex pieces of software, because there are so many different levels of control that they need to exert: from correctly controlling the transmission of signals on the network hardware, through routing data over potentially long distances and complex network topologies, ensuring that all signals are received correctly as transmitted, defining the kinds of interactions permitted, all the way to providing a convenient end-user interface. Such complex systems can be implemented reliably only if they are divided into more manageable subsystems that may be developed independently. For most new development, the ISO's OSI (Open Systems Interconnection) seven-layer model is used.

1. The "physical layer" is concerned with the kinds of cables and connectors used, and the nature of the electrical (or other) signals transmitted.
2. The "data-link layer" describes the format of data packets sent on the first layer, and communications when a direct link between two systems exists.
3. The "network layer" controls long-distance communication, when a direct link does not exist and data packets have to take a number of "hops" to arrive.
4. The "transport layer" specifies how data flows between applications, and handles delivery receipts and retransmissions if necessary.
5. The "session layer" specifies the sequences of messages that must be sent and received in order to achieve particular goals, the language of request and response between connected applications.
6. The "presentation layer" describes the format and representation of the individual messages, and any data that they carry.
7. The "application layer" is the layer of interest to users, which provides the purpose behind using the other layers, namely the application that controls communications and provides, uses, or displays the transmitted data.
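The division of labor among the layers can be pictured as nested encapsulation: each layer wraps the data handed down from the layer above with its own header, and the receiving side unwraps in the opposite order. The fragment below is an illustrative toy, not a real protocol stack:

```python
# Layer 7 (application) down to layer 1 (physical).
LAYERS = ["application", "presentation", "session",
          "transport", "network", "data-link", "physical"]

def encapsulate(data):
    frame = data
    for layer in LAYERS:               # wrap downwards: layer 7 first, so
        frame = f"[{layer}|{frame}]"   # the physical layer ends up outermost
    return frame

def decapsulate(frame):
    for layer in reversed(LAYERS):     # unwrap upwards: physical layer first
        head = f"[{layer}|"
        assert frame.startswith(head) and frame.endswith("]")
        frame = frame[len(head):-1]
    return frame

wire = encapsulate("GET /index.html")  # what "travels" on the toy network
```

The outermost wrapping belongs to the physical layer, which is exactly the order in which real headers appear on a wire.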
Developers can expect the first four layers to be provided as part of any working computer system. If a totally new kind of
application is under development, the last three layers must all be created, but, if an interface to an existing kind of system is being developed (perhaps a new email service or a new web browser), then layers 5 and 6 will already exist, and only the final layer is needed.

At each layer of implementation, the developer may rely upon the previous layers already existing and working correctly. Each new layer simply adds a new level of functionality, making full use of the previous layer. For example, IP (the internet protocol) is a layer-3 protocol; it is responsible for delivering data from any computer to any other anywhere on the network. It uses a layer-2 protocol (such as ethernet) which is already capable of delivering data over a local area network (LAN), so all it has to be concerned with is the forwarding required when two computers are not on the same LAN.

Business value proposition
Protocol stacking based on the OSI seven-layer model is of great benefit to network software developers. The clean division of network tasks into a number of layers allows the developer to concentrate on just the one relevant aspect of design, knowing that surrounding layers are isolated and well specified. Within businesses that are not actively involved in technology development there is less need for the MIS staff to be concerned with all the minutiæ of the OSI model, since the majority of the software in use has been designed to be simply "plugged in" to a network, with all issues related to the protocols already resolved.

Summary of positive issues
The OSI seven-layer model provides a robust theoretical model for all systems developers to construct systems around.

Summary of negative issues
The OSI seven-layer model is frequently seen as the way the internet works. It is not. The TCP/IP protocol stack happily existed with four levels long before the OSI model was introduced. Insisting on seeing every system as consisting of seven layers can produce a distorted and confusing view.

References
• W. R. Stevens (1994). TCP/IP Illustrated (New York, Addison-Wesley).
• International Organization for Standardization (ISO) (1994). ISO/IEC 7498: Open Systems Interconnection, The Basic Model (Geneva, ISO).

Associated terminology: Ethernet, Internet protocol, TCP/IP, Client-server.

Outsourcing

Foundation concepts: Business process re-engineering.
Definition: Outsourcing is the use by a company of a third-party provider to perform a function or implement a process on its behalf.

Overview
Outsourcing has been used in business for decades. As organizations began to understand that they did not have to perform all the functions of business themselves, known as being Vertically integrated, they moved toward an outsourcing model, contracting with other vendors to supply components and provide services. For example, an automobile manufacturer may choose not to make brakes at all, but to have a company that specializes in brakes produce and supply them under contract.

The history of outsourcing in the context of information technology (IT) has a slightly different past and rationale from those of outsourcing in the manufacturing sector. In the 1960s not all companies could afford computers, and they utilized "computing service bureaus" to undertake their data processing. During the 1970s and 1980s, as computing became more pervasive, companies began to change their views on
software development. Rather than writing all their own programs, it became much easier to purchase packages, in effect outsourcing their development effort.

Outsourcing remained a useful option for many organizations that wished to supplement their own internal IT organization; in particular, the outsourcing of IT training for employees became a popular option. However, a seismic change occurred in 1989 when Kathy Hudson, the CIO of Eastman Kodak, decided to outsource the company's entire IT operation. This strategic decision changed the way that IT outsourcing was viewed by organizations; instead of considering it an expense, they saw the opportunity to "sell" their IT operations and then pay for a service operated by another company according to a contract.

In the twenty-first century, IT outsourcing has evolved significantly from the total outsourcing experienced by Kodak. While very large total outsourcing contracts still occur, companies now undertake selective process sourcing, which, due to advances in technologies, can potentially be performed at any place on the planet.

Business value proposition
The use of process and technology outsourcing has become a popular mechanism by which organizations can achieve a strategic goal. The goal may be to allow the company to focus on strategic IT initiatives, achieve a higher level of operational performance than could be developed in-house, achieve a level of performance at lower cost than could be achieved in-house, complete a short-term specialist project when the setup costs for in-house development would be prohibitive, move IT assets and staff off the balance sheet, or alleviate the effects of staffing shortages.

Summary of positive issues
The ability to source technology skills around the planet has made a wide variety of options available to organizations. While costs are frequently considered to be a primary driver of outsourcing, other issues (such as the ability to provide a call center that is located in a suitable time zone) also drive the decision to locate IT services in distant locations. Strategic outsourcing enables organizations to use IT to achieve a variety of strategic ends, including cost advantages, human capital advantages, and customer-service provision.

Summary of potentially negative issues
Outsourcing can be associated with a variety of positive attributes, but many of them may be positive in the short term and problematic in the long term. Outsourcing may provide a cash bonus from the sale of the IT function, but, if the outsourced functions are not performed as well as expected, it may be very difficult to bring the IT function back in-house (also known as Insourcing). While different locations have different costs associated with them, they also frequently have differing levels of security provision and legal regulations to work within, which may prove detrimental should a problem occur.

Reference
• S. Cullen and L. Willcocks (2004). Intelligent IT Outsourcing (Burlington, MA, Butterworth-Heinemann).

Associated terminology: Application service provider, Hosting, ISP.
Packet switching and circuit switching

Parallel processing
with other processors can be expected to grow directly with the number of processors involved. Viewing simple graphs showing (1) how processing time reduces with increasing number of processors sharing the load, (2) how communications time grows, and (3) the sum of processing time and communications time illustrates the point clearly: in cases like this, there is an optimal number of processors to use, beyond which adding more actually produces a slower final result.

The second essential consideration is the algorithmic Complexity of the problem at hand. If the algorithm chosen is significantly worse than linear in its time complexity, then parallel processing is very likely to be beneficial. If it is better than linear, the benefits are not so clear, and deeper analysis will be required. For example, consider the problem of sorting a large number of database records into alphabetical order. Naive sorting algorithms, the kinds of method that a person naturally uses, are quadratic in time. That means that the time taken to solve a problem grows with the square of the number of data items involved: twice as much data means four times as long to process; ten times as much data means a hundred times as long to process. Now suppose that it would take one person a whole year to sort a collection of one million records into order (that is not a totally unreasonable figure, if the records are small and easy to handle). Two people working on the problem would divide the records up into two halves, with 500 000 records each. Each person would sort their own half, but, because of the quadratic nature of the sorting method used, each person would take only a quarter of a year to sort their share of the records. They would then have to merge the two sorted piles into a single sorted pile, but that operation is an order of magnitude faster than the sorting. So two people could sort the records in just over one quarter of the time taken by one person. Ten people could do it in just over one hundredth of the time. The only reason why 250 000 people working on the problem couldn't do it in a few seconds is the communications overhead: merging 250 000 piles of four sorted records each would take quite some time.
likely to be beneficial. If it is better than lin- So, there are cases in which parallel pro-
ear, the benefits are not so clear, and deeper cessing pays off handsomely, cases in which
analysis will be required. For example, con- it has a negative effect, and, of course,
sider the problem of sorting a large num- everything between those two extremes.
ber of database records into alphabetical Parallel processing is not a simple matter,
order. Naive sorting algorithms, the kinds and needs thorough investigation for any
of method that a person naturally uses, given problem domain. Fortunately, this
are quadratic in time. That means that the is a well-researched area, and there is a
time taken to solve a problem grows with vast literature available to those who know
the square of the number of data items what to look for.
involved: twice as much data means four There are also choices to be made regard-
times as long to process; ten times as much ing the style of parallel processing to be
data means a hundred times as long to pro- used. Should there be a number of inde-
cess. Now suppose that it would take one pendent computers connected by a net-
person a whole year to sort a collection of work, or a single specially designed com-
one million records into order (that is not puter with multiple CPUs? The former,
a totally unreasonable figure, if the records known as Distributed computing, or a Loosely
are small and easy to handle). Two people coupled system, allows systems to be built
working on the problem would divide the from off-the-shelf components, but has far
records up into two halves, with 500 000 slower communications; it is the kind of
system that is often used when a problem is devolved into a number of quite large sub-problems, such as with queries on very large databases. The latter, known as a Tightly coupled system, is a much more powerful strategy, but very expensive. Very few vendors offer large multi-CPU computers; it may even require customized computer design.

Another possibility is to have a single powerful computer act like a larger number of smaller computers by dividing up its time amongst a number of separate tasks, rapidly switching its attention from one to another, maintaining the illusion that all are being processed concurrently. This is called Multi-processing, and is a technique used by all general-purpose operating systems to support multiple applications running at the same time. Multi-threading is an emerging technique that allows a single CPU to run more than one program at the same time without having to switch attention rapidly from one process to another. This is related to the concept of a Thread, which is a stream of execution within a single application, allowing one application to work on multiple problems without resorting to complex programming techniques.
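Time-sharing of this kind is directly available to application programmers; a minimal sketch using Python's standard thread pool (the letter-counting task and data are invented for the example):

```python
# Several tasks make progress "at the same time" on one machine: the
# operating system switches the pool's threads in and out transparently.
from concurrent.futures import ThreadPoolExecutor

def letters_in(text):
    return sum(c.isalpha() for c in text)

documents = ["alpha 1", "beta 22", "gamma 333"]

with ThreadPoolExecutor(max_workers=3) as pool:
    counts = list(pool.map(letters_in, documents))  # tasks run concurrently
```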
Business value proposition
Parallel processing is an area of computing that holds great potential for a variety of large and complex problems. It is unlikely that a simple office application such as word processing would benefit substantially, if at all, from implementation on a parallel-processing architecture at current levels of technology, but significant improvements are likely for computationally intensive tasks in the foreseeable future.

The use of a distributed, loosely coupled form of parallel processing is attractive for those with significant computational problems. Many corporate PCs spend as many as 17 hours a day doing nothing (24 hours a day on weekends), and could be assigned to some useful network-coordinated background task in their idle periods. For example, the SETI@home (Search for Extraterrestrial Intelligence at Home) research group is looking for a signal from outer space, and relies on vast numbers of volunteers via the internet to load a program that acts like a screensaver, running in the background only when the computer is idle, to perform signal processing. It is estimated that they have many millions of users on their network, contributing many teraflops of computing power (trillions of floating-point operations per second) to solve the problem. In 1999, a similar cooperative effort involving about 100 000 personal computers worldwide succeeded in cracking a secret message encrypted with 56-bit DES in less than 24 hours. The combined idle time of a large corporation's network of computers is a formidable resource waiting to be tapped.

The use of tightly coupled systems is currently limited to either very simple small systems or highly complex problems requiring extensive processing-time and data-throughput capabilities. One such system is IBM's massively parallel Blue Gene/L computer, which has been designed to scale to 65 536 dual-processor nodes with a peak performance of 360 teraflops (a teraflop is 1 000 000 000 000 arithmetical operations per second). This system is being used to investigate, amongst other things, the problems of protein folding in the biological sciences.

The adoption of loosely coupled and widely distributed processing in the style of SETI@home requires special software to be deployed, and it is clearly not suitable for applications in which security or response time is a major overriding concern, but its applicability to problems in the public domain and for the public good is clear. The adoption of tightly coupled processing is constrained to a certain class of major problems and is also constrained from a user's perspective by the limited number of
Password
or their own names. The reason this is a problem is that guessable passwords really can be guessed. Someone trying to break into a system can simply try a lot of common password choices for a large number of accounts, and can expect occasional successes. "Cracking kits" are readily and freely available for download: these are software packages that automatically scan through whole dictionaries, trying out an endless stream of possible passwords at very high speed. No real word or simple pattern is safe. Systems that require their users to pick four-digit PINs (personal identification numbers) as passwords could almost be considered pre-compromised.

There are solutions for each of these three problem areas. The first is a software solution called SSL (secure-sockets layer). If both computer systems have SSL installed and elect to use it, every byte transmitted is automatically encrypted using a strong crypto-system. Security is enhanced by the fact that a new, very long encryption key is automatically generated for each session, so long-term eavesdropping provides no additional leverage. Incredibly, it is possible for two systems to securely choose a new encryption key so that they both know what it is, but even an interloper who had complete access to everything transmitted (even knowing both parties' public and private keys) could not know it (the best known of these methods is Diffie-Hellman). SSL is available as a standard part of nearly all modern operating systems, and costs nothing to use beyond a small requirement for extra computation. It simply requires that the server is set up to use it. When a web browser warns that information is being sent insecurely, it usually means that the server is not using SSL.
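From a program's point of view, using SSL/TLS is little more than wrapping an ordinary socket; a client-side sketch using Python's standard library ("example.com" is just an illustrative host name, and no connection is actually opened here):

```python
import socket
import ssl

# A default context enables certificate checking and modern protocol
# versions; a fresh session key is negotiated on every connection.
context = ssl.create_default_context()

def fetch_securely(host, payload):
    """Illustrative only: wrap a TCP socket so every byte is encrypted."""
    with socket.create_connection((host, 443)) as raw:
        with context.wrap_socket(raw, server_hostname=host) as tls:
            tls.sendall(payload)       # transmitted encrypted on the wire
            return tls.recv(4096)
```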
The second problem can not be solved perfectly, but may be significantly reduced by simple standard software methods. It used to be thought enough to store passwords in a specially protected file that could not be accessed by normal users. The computer operating system's own file-protection services were used to ensure that only the system administrator and the password-verifying software could access the password file. In modern times, when new security flaws seem to be discovered every day, this is clearly not sufficient. One successful virus, worm, or Trojan-horse attack could reveal everything. The solution is that passwords are not stored at all. When a user sets or changes their password, it is encrypted using a One-way hash (a kind of encryption scheme that can not be reversed: it is impossible to recover the original message from the encrypted version, even if the key is known). Only the encrypted version is stored. Each time the user logs in, the password they enter is also encrypted in the same way, and the result is compared with the previously stored version. This means that the password file could be published in a newspaper without any passwords being revealed. Until recently, it was standard for all Unix systems to keep their password files in a universally readable location.

Unfortunately, increased computing power has made that system less than perfect. Since the one-way-hash encryption system is well known, it is possible for a programmer who has access to an encrypted password to create a program that simply tries out all possible passwords, encrypting them all, and waiting for a match. For systems that allow passwords to be of any length, it is not physically possible to try out all possible passwords because there will be an infinite number of them, but most users, if left to their own devices, will pick a simple memorable password of five or six letters, and that is an open invitation to automatic cracking software. Cracking kits commonly run through all words in a very large dictionary of many languages, try all short combinations of letters even if they don't make words, and even try adding digits in various places.
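The hash-and-compare scheme, and the dictionary attack against it, are easy to sketch with Python's standard library, using SHA-256 as the one-way hash (real systems also add a per-user salt and a deliberately slow hash function, both omitted here; the password and word list are invented):

```python
import hashlib

def one_way(password):
    # The digest can be computed from the password, but not reversed.
    return hashlib.sha256(password.encode()).hexdigest()

stored_digest = one_way("rosebud99")       # only this is kept on disk

def verify(attempt):
    # Hash the attempt the same way and compare digests.
    return one_way(attempt) == stored_digest

# A toy "cracking kit": hash every candidate and wait for a match.
def dictionary_attack(digest, wordlist):
    for word in wordlist:
        if one_way(word) == digest:
            return word
    return None
```

Note that the attack never decrypts anything: it simply guesses, hashes, and compares, which is why guessable passwords defeat the whole scheme.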
Cracking software can be left running in the background for many days until it happens upon a match.

The only way to make passwords at all safe is to use a combination of one-way encryption, plus putting the password file in an inaccessible place, plus keeping every possible security measure in place. Even then, it is essential to prevent users from picking easily guessable passwords (any software system can be constructed to automatically reject passwords that are not long enough, or that appear in some standard dictionary, or even ones that don't contain a sufficiently complex mixture of non-letters). However, if password memorization is rendered too difficult, then users must resort to writing down their passwords in supposedly secret places, and all hopes of security are then lost. If systems designers let users make a totally free choice of passwords (subject to their being long and complicated enough, of course), then users will be able to use the same password for each system, and if only one password has to be remembered, it can be quite long and complex. Systems that reject passwords for seeming offensive or politi-

password file has been released, because the cracking software does not then need to go through the official log-in procedure for each attempt.

Another good security measure is to require that the password verification procedure makes a note of each log-in attempt that fails because of an incorrect password. If each user who successfully logs in is informed of the number of recent failures, they will remember if they did not mistype their own password, and at least know that a break-in attempt has been made. It is also common for systems to disable accounts temporarily, making them inaccessible even with the correct password, if there have been some (typically three) attempts at access with incorrect passwords all within a moderate period (often ten minutes to an hour). This reduces the number of attempts that an automatic cracking system can make to such a degree that it can not be expected to succeed at all.
then users will be able to use the same
password for each system, and if only one Business value proposition
password has to be remembered, it can be The processes surrounding the assignment
quite long and complex. Systems that reject and management of passwords in an orga-
passwords for seeming offensive or politi- nization are primarily the responsibility
cally incorrect provide no benefit (nobody of the chief security officer; in smaller
but the user in question ever sees the offen- organizations this task usually falls to
sive password), and serve to reduce overall the network administrator. The approaches
security. taken to secure systems access must be
Another effective solution is to deliber- based upon best-practice principles from
ately slow down the password verification the assignment of passwords to the stor-
process. It should normally be possible to age of passwords as one-way hashes. It must
verify a password with less than a microsec- be remembered that the value of the infor-
ond of computing, but if the verification mation stored by the system and password
process includes deliberate delays, so that it ‘‘protected” is proportional to the amount
takes a whole second or two, then the crack- of effort that someone dedicated enough
ing kits that work by repeatedly attempt- will undertake to break the password and
ing to log in with different passwords will access the system. Even though one-way-
be slowed down to such an extent that hash algorithms are typically unbreakable,
they could never scan through all the pos- reports since 2004 have shown that small
sibilities. Genuine users will not be disad- bit-length algorithms are capable of being
vantaged, because logging in happens only broken and thus, if the data is valuable
once per session. This solution does not enough, even more secure hash algorithms
help in situations in which the encrypted need to be used.
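The measures described in this entry (storing only salted one-way hashes, and temporarily disabling an account after repeated failures) can be sketched as follows. This is a minimal illustration, not production code; the function names are invented, and the policy numbers follow the text's "typically three" failures within about ten minutes:

```python
# Sketch: salted one-way-hash storage plus temporary account lockout.
import hashlib
import os
import time

LOCKOUT_THRESHOLD = 3     # incorrect attempts allowed...
LOCKOUT_WINDOW = 600      # ...within this many seconds

accounts = {}             # username -> (salt, hash of salt + password)
failures = {}             # username -> timestamps of recent failed attempts

def enroll(user, password):
    salt = os.urandom(16)
    digest = hashlib.sha256(salt + password.encode()).hexdigest()
    accounts[user] = (salt, digest)   # the password itself is never stored

def verify(user, password, now=None):
    now = time.time() if now is None else now
    recent = [t for t in failures.get(user, []) if now - t < LOCKOUT_WINDOW]
    if len(recent) >= LOCKOUT_THRESHOLD:
        return False                  # temporarily disabled, correct or not
    salt, digest = accounts[user]
    if hashlib.sha256(salt + password.encode()).hexdigest() == digest:
        failures[user] = []
        return True
    failures[user] = recent + [now]   # record this failure
    return False

enroll("alice", "correct horse battery staple")
assert verify("alice", "correct horse battery staple")
assert not verify("alice", "wrong guess")
```

The deliberate verification delay discussed earlier could be added with a one-second pause before returning; it is omitted here so the sketch runs instantly.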
It is therefore imperative that all organizations and individuals follow the security procedures completely and revise those procedures on a regular basis. It is advantageous to have an independent security consultant examine the procedures and report directly to a senior member of the organization, such as the CIO. This avoids problems such as network administrators providing a log-in account under a false name so that, in the event that they get fired, they can return to the company's network and perform illicit activities. The security scan can cover all access points, remote-log-in practices, encryption methods to prevent the use of packet sniffers, the possibility of keyboard-monitoring programs that capture key-strokes including passwords, and spyware technologies that monitor users to capture password activity. Even the possibility of rogue hidden cameras in buildings being used to capture pictures of keyboard entries needs to be considered.

Summary of positive issues
By following best-practice principles, password security may be incorporated into an overall technology-security plan. Technologies exist to help ensure that the passwords selected conform to a pattern that is not simple to break. Most operating systems may be configured to ensure that passwords are changed on a regular basis and that multiple log-in attempts are not allowed, thus reducing the risk of robotic programs running through a dictionary. Large-bit-length one-way-hash algorithms may be used to store passwords centrally in a secure manner. Passwords may be multiple levels deep and strengthened by the use of a physical access device, such as a security swipe card, or a biometric factor, such as a fingerprint.

Summary of potentially negative issues
No system that uses passwords alone can be considered totally secure, and it is necessary to use supplemental technologies and processes to ensure higher levels of security. Security policies need to be established and independently verified. Policies need to reflect the real security risks that a violation may cause. Many users choose the same password for all of the systems that they access. While this may simplify their lives, it does increase the potential harm if ever that password is compromised. However, that is unlikely to be as dangerous as forcing users to write down their passwords because they have too many to remember.

Reference
• R. Smith (2001). Authentication: From Passwords to Public Keys (New York, Addison-Wesley).

Associated terminology: Encryption, Hacking, Cracking.

Patent

Definition: A patent is a legally enforceable property right granted to an inventor, allowing the inventor for a limited time to prevent or regulate others making, using, offering for sale, or selling the invention. This right is assigned in exchange for public disclosure of the invention when the patent is granted.

Overview
The establishment of a patent requires a government agency to acknowledge the validity of the patent and assign the rights of that patent to an individual or other legal entity. Most countries have their own patent system and intellectual-property (IP) laws and a large agency to administer them. The US Patent and Trademark Office, the European Patent Office, the UK Patent Office, the Japanese Patent Office, and the State Intellectual Property Office of the People's Republic of China are among the most influential. In addition there is the World Intellectual Property Organization (WIPO), which administers the Paris Convention of 1883, a treaty that allows patents
Person month
approximately six hours a day and takes into account holidays, average sick days, and weekends.

Business value proposition
A standard work period helps project managers to plan development schedules. The European Union has an upper limit to the number of hours an employee can work per month, and this number can also act as a basis for PM calculations.

Summary of positive issues
PMs standardize the process of software development and scheduling.

Summary of potentially negative issues
The use of a standard number in project scheduling and management to represent a group of programmers is dangerous. As with any project development, there is a standard distribution of programmer skills in any project team. The use of a single number that equates to programmer effort also might not take into account the variances across projects and the project teams that work together on different aspects of a development.

It is vitally important to take into account the fact that, when more than one person works on a project, a significant amount of time and effort is expended on communications. Forty people working for one hour can not achieve anything close to the same productivity as one person working for forty hours. This is a fundamental flaw in simplistic use of PMs to represent an amount of effort.

References
• F. Brooks (1995). The Mythical Man-Month: Essays on Software Engineering (New York, Addison-Wesley).
• B. Boehm (1981). Software Engineering Economics (Englewood Cliffs, NJ, Prentice-Hall).
• B. Boehm (2000). Software Cost Estimation with COCOMO-II (Englewood Cliffs, NJ, Prentice-Hall).

Associated terminology: Software development.

Phishing

Foundation concept: Security.

Definition: Attempting to trick others into revealing personal and financial information through fraudulent web sites or electronic mail.

Overview
Phishing has become a very widespread problem for users of the internet. A criminal organization or individual sends out emails that are an exact copy of legitimate emails from a well-known company that many random recipients are likely to have accounts with. The emails look exactly like official emails from that company, but direct the recipient to a web site for some vital purpose. Often the email will warn that an account is about to be involuntarily closed, or a large bill is about to be referred to a collection agency, and this is the customer's last opportunity to do something about it.

The web site that the customer is referred to will have an address (URL) that seems to be right for the company, and the content will be carefully set up to duplicate the company's real web sites. Of course, it will be completely controlled by the criminal "phishers." Victims may be made to feel more comfortable by the fact that they have to log in using their pre-established user-name and password, but, of course, the site is not checking the customer's identity; it is simply recording the user-names and passwords that are entered. That might be all that happens; stealing a customer's user-name and password is often all that is required for great financial gains, in which case the
customer will quite probably be redirected to the company's real web site with a message saying that the password entered was incorrect. Otherwise, the illicit site may continue to ask for an ever-widening variety of personal details, account numbers, and anything else the customer can be persuaded to divulge.

An alternative means of attack is to set up web sites with names very similar to those of major corporations, but with slightly misspelled names, and just wait for customers to fall upon those sites accidentally.

The key to safety from phishing attacks is to realize that email is an exceptionally unsafe medium, and that, with standard email clients, it is impossible to know where an email message really came from. The return address and all other information in an email header are exceptionally easy to fake. For this reason no reputable company would expect customers to trust email for anything vital. Legally required notices can not be delivered by email, as there is no possible proof of delivery. Any time an email message asks the recipient to visit a web site, for any reason, it must be viewed with grave suspicion; remember that visiting a web site can be achieved by as little as simply clicking on a link embedded in the message.

If an email recipient is for some reason tempted to visit a web site as instructed, they should first re-read the email carefully. A vast quantity of phishing emails now come from overseas locations, and the use of language is often poor. Incorrect grammar and punctuation are common give-aways, but, of course, correct usage of language is no evidence of legitimacy. As another check, look at the email "internet headers" (with the Outlook mail client, select "view" → "options" from the menu, and they appear at the bottom of the dialog; other mail readers provide the same information but with a different access sequence). The "internet headers" are not comprehensible to most readers, but they contain a list of the names of some of the email servers involved in transmitting the message. Look through them quickly, just the endings, which will normally show two-letter country codes for any foreign server. If an email that purports to come from a company that you frequently deal with went through a server in an unusual or remote location, it is unlikely to be legitimate. For example, if you are in the United States, it would be suspicious if the company you deal with to buy fruit in Washington State were routing a message through an Asian or West African site.

Of course, phishing does frequently originate from within one's home country, and the only way to be really safe is very simple: don't click on links in email messages unless you positively know the sender.

The word "phishing" is just a respelling of "fishing," which is really what it is: fishing for information. The change of "f" to "ph" is just a strange modern fad of no significance. The US Identity Theft and Assumption Deterrence Act of 1998 makes most phishing criminal, but is of little help when the perpetrators are untraceable or overseas.

Business value proposition
As a customer service it is vital that organizations educate and inform their customers about the phishing phenomenon. Organizations involved in internet commerce must at least give the appearance of taking a proactive position. Failing to respond when customers report suspicious emails can destroy customer confidence, and it takes very little effort to invite customers to report suspicions, and in turn warn them of any known form of attack, or to inform them that you would never send an email directing them to follow a link to a web site. Such a stance, and, if possible, taking legal action against the offenders,
helps to preserve a brand and maintain customer confidence in the organization.

Summary of positive issues
There is no positive business value from phishing unless your business is actively criminal.

Summary of potentially negative issues
In individual terms phishing can be extremely damaging for any victim of information, identity, or data theft. Phishing can weaken customers' confidence in an organization that does not have a policy to deal with phishing violations against its customers.

References
• M. Whitman and H. Mattord (2004). Principles of Information Security (Boston, MA, Course Technology).
• L. Peterson and B. Davie (2003). Computer Networks: A Systems Approach (San Francisco, CA, Morgan Kaufmann).

Associated terminology: Cracking, Hacking, Law cross-reference.

Port

Definitions:
1. Reconstruction of software designed for one kind of computer to work on another.
2. A communications connection point.

Overviews
1: Most software is originally designed to work on one particular kind of computer under one particular kind of operating system. Multi-platform software is a highly commendable target, and certainly is achievable, but it is unusual for software that was designed to operate under one kind of system to work perfectly on another without a significant outlay of additional development effort. It is very common for software to be developed without any special consideration for cross-platform compatibility; then, when it is working and has been well tested, and its value has been assessed, it is Ported to another system. For well-written code, and when the two systems are in some way similar (perhaps two varieties of Unix), the porting process can be almost trivial and does not require any great expertise in programming or even full understanding of the software itself. For applications that make use of system-dependent features, in contrast, porting can be a major project.

2: Every computer has a number of connectors at the back into which other things, such as keyboards, mice, printers, network cables, and USB devices, are plugged. These connectors are known as Ports.

Network ports, TCP/IP ports, and internet ports: The same word is also used to describe a Virtual socket, most commonly for network connections. Even though a computer normally has just a single network port, with a single network cable plugged into it, it can maintain a large number of simultaneous connections with other computers. This is made possible by the operating system, which creates tens of thousands of virtual ports or Network sockets.

The underlying internet protocols TCP/IP and UDP/IP allow messages to be given two-part destination addresses. The well-known IP address (q.v.) specifies which other computer the message should ultimately be forwarded to; the second part, the Port number, is used once the message arrives to decide which of the many concurrently running applications the message should be delivered to. Network-enabled applications obtain a port number from the operating system on start-up, and use that port number as part of their unique address for receiving messages. Although a computer may have up to 65 500 network ports, they are all just virtual constructs accessed through the one physical network port at the back of the computer.

Serial ports: nine or twenty-five individual connectors in two parallel rows. Serial ports
are almost obsolete and usually go unused on modern computers. Serial ports were used to connect to external dial-up modems or very old mice. Communications through a serial port occur as a series of single bits in a long stream; since only a single bit can be sent at once, serial port communications are usually quite slow. The speed and other details of bit transmission are not fixed, and must be set on a per-device basis. Some serial devices Auto sense the settings, and are almost plug-and-play, whereas others require an exact configuration to be pre-set, which causes trouble when the instructions are inevitably lost.

Parallel ports: 25 individual connectors in two parallel rows. Until recently, parallel ports were the only low-cost means of connecting external devices to a single computer with a fast transfer rate. Parallel ports were used to connect printers and scanners, but have now largely been replaced by USB.

PS/2 ports: small round sockets with six individual connectors and a small rectangular tongue, normally used for connecting a keyboard and mouse, but they are in the process of being replaced by USB. PS/2 ports are usually color-coded, mauve for the keyboard and green for the mouse.

USB ports: small rectangular connectors with a central tongue. USB (Universal Serial Bus) is a relatively new technology that adapts the old idea of serial ports to an age of "smart" devices. A USB device has some minimal processing power of its own, and can accurately identify itself when plugged into a system and recognize commands intended for it rather than for other devices. This means that a moderate number of USB devices may be connected to a single port (through a Hub). Improved technology means that USB devices can communicate at speeds that were impossible for serial devices. There are two standards in current use: USB 1.1, which allows communication at 12 000 000 bits per second; and USB 2.0, which allows communications at 40 times that speed. USB 1.1 (also called Full-Speed USB) is now used only on older or low-end equipment. USB 2.0 (also called High-Speed USB) is fully backward-compatible, so any USB 1.1 device may be plugged into a USB 2.0 socket. USB 2.0 is rapidly replacing all of the other ports normally found on the back of a computer (keyboard, mouse, serial, parallel, and SCSI). At such high signal rates, it is essential that good solid connections are made through properly constructed and undamaged cables; the only slight disadvantage of USB 2.0 is that the cables tend to cost a little more.

SCSI ports: SCSI (Small Computer Systems Interface) became an ANSI standard in 1986, and rapidly became the connection of choice for the high-speed connections needed for external disk drives, CD-ROM burners, high-resolution scanners, and other devices. Its original transfer speed of 40 000 000 bits per second was much faster than anything else available for PCs or any other small-to-medium-sized computers. SCSI now has many versions, which vary in their levels of compatibility: Fast, Wide, Ultra, Ultra-Wide, Ultra-2, Ultra-2-Wide, Ultra-3, and Ultra-320. The fastest of these are still faster than USB 2.0 (Ultra-2-Wide can transfer data at 640 000 000 bits per second), but SCSI is far more expensive, always requiring costly shielded multi-core cables, and usually requiring an additional interface card in the computer.

Business value proposition
The term port has two major meanings, as described above. The first refers to transplantation of software from one system to another. This may be highly desirable, and much depends upon the degree to which the original designer utilized functions specific to a particular computer or operating system. If the design is highly specific in its use of a unique hardware resource, then portability will be limited or may require significant effort. If the system uses only standard, non-customized
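The nominal bit rates quoted in this entry translate directly into transfer times. A quick back-of-the-envelope comparison (the 100-megabyte file size is invented for illustration, and real-world throughput is always somewhat below the nominal rate):

```python
# Nominal transfer times at the raw bit rates quoted in the text.

def seconds_to_transfer(size_bytes, bits_per_second):
    # A byte is eight bits; ignore protocol overhead for this sketch.
    return size_bytes * 8 / bits_per_second

USB_1_1 = 12_000_000              # bits per second
USB_2_0 = 40 * USB_1_1            # forty times faster: 480,000,000
ULTRA_2_WIDE_SCSI = 640_000_000

file_size = 100 * 1_000_000       # a 100-megabyte file

assert USB_2_0 == 480_000_000
assert round(seconds_to_transfer(file_size, USB_1_1), 1) == 66.7
assert round(seconds_to_transfer(file_size, USB_2_0), 2) == 1.67
assert seconds_to_transfer(file_size, ULTRA_2_WIDE_SCSI) == 1.25
```

The arithmetic makes the ordering in the text concrete: about a minute over USB 1.1, under two seconds over USB 2.0, and faster still over the fastest SCSI variants.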
Power protection
ensure that data integrity is maintained during a power failure.

An Uninterruptible power supply (UPS) is usually a rechargeable battery unit: it is plugged into the normal mains electricity supply, and the computer is plugged into it. Normally, the UPS remains passive, but, when the mains supply fails, the battery automatically takes over and temporarily provides power to the computer. The changeover is so fast that the computer is totally unaffected. The capacity of any battery is severely limited, so a UPS will keep a computer working for no longer than a few minutes. This is enough for most power failures, and in critical installations should be enough time to start up an emergency generator.

UPS hardware ranges from small low-cost devices that provide a few minutes' protection to a small computer, to industrial-strength units with their own electrical generators that automatically start before the batteries become discharged, and could keep a whole installation running for days without an external power source.

Surges. In areas with very primitive public utility infrastructure, electrical supply lines are often above ground, exposed to the weather. If lightning strikes the distribution line, there can be an enormous surge of power with the potential to destroy any connected electrical equipment. Lesser events (such as transformers switching) can cause lesser, but still damaging, surges. A Surge protector is a small unit connected in the same manner as a UPS. The most common design uses a simple semiconductor device, a Varistor, which shunts the excess voltage to the electrical ground. The varistors can be destroyed by a severe surge, leaving the computer unprotected from subsequent surges: the indicator light should be checked frequently. A UPS usually incorporates a surge protector.

Under- and over-voltage. It is very difficult for a public utility to maintain exactly the right voltage at all times. Particularly in industrial areas, the turning on and off of large inductive loads (essentially anything with a big electric motor) can result in voltage fluctuations, both over and under the nominal value. Modern computers tolerate a wide range of supply voltages, but there are always limits. A UPS constantly monitors the supply voltage, and should automatically switch to battery power not just for total failures, but also for the duration of excessive voltage swings.

Incorrect wiring. It is not unheard-of for the electrical system in a building to be incorrectly installed: the live and neutral conductors can be crossed over, the ground conductor can be left unconnected, etc. These errors can lead to safety problems themselves, and can prevent other safety equipment from functioning. A surge protector can not divert excess voltage to ground if the ground wire is disconnected. Inexpensive wiring checkers are widely available; when plugged into the electrical supply, they indicate via red and green lights whether the wiring is correct.

Electro-magnetic interference (EMI). Whenever wires pass through a varying magnetic field, stray currents are induced in them. All electrical equipment creates varying magnetic fields, so the modern office is something of a farm for induced currents in wires. The introduction of high-frequency signals into electronic equipment can have unfortunate effects, so power-protection equipment often includes EMI filters. Many power cables have EMI filters built in; they are the small thick tubes that appear to be clamped to the wire.

Business value proposition
UPS devices are used to prevent unexpected systems outages that can lead to the loss of data, hardware damage, and business downtime. The devices are universally available, in a variety of specifications, and are
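The "few minutes" figure above follows from simple arithmetic on battery capacity; the capacity and load values below are invented for illustration:

```python
# Estimated UPS runtime from battery capacity and load (illustrative numbers).

def runtime_minutes(battery_watt_hours, load_watts):
    # Watt-hours divided by watts gives hours; convert to minutes.
    return battery_watt_hours * 60 / load_watts

# A small consumer UPS (about 80 usable watt-hours) feeding one 400 W computer:
assert runtime_minutes(80, 400) == 12.0

# The same unit feeding a 2000 W equipment rack lasts only a couple of minutes:
assert runtime_minutes(80, 2000) == 2.4
```

Real runtimes are shorter still, since battery capacity falls with age and discharge rate, which is why critical installations hand over to a generator rather than rely on batteries alone.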
Programming language
sequence of ordered steps. Without special training, programmers tend to think that this is the only way programming could be. An alternative is the Functional style, in which a program is a set of pure mathematical functions that specify the correct mapping from inputs to outputs without saying what order the individual steps of the computation must follow. In the pure functional style everything involved in a program is strictly constant, but many languages follow the form of functional programming without following the rules. Lisp is an example of this: programs are thought of as functions, but in the mathematical sense they are not, and Lisp does not provide the full benefits of true functional programming. In the logic-programming style, programs are simply logical formulæ specifying the correct relationship between questions and answers, not stating how that relationship is to be realized. The non-procedural styles require great expertise and deep theoretical understanding on the part of a programmer, but, once mastered, they provide major advantages, including rapid prototyping and the possibility of Proving a program correct. The Object-oriented style is a view of how data and computations should be encapsulated in a program, and may equally be applied to all three of the major paradigms. The two popular languages, C++ and Java, are both imperative and object-oriented.

Business value proposition
Programming languages come in all styles and flavors. They can generally be classified in terms of their ages and "eras." For example, the mainframe era of the 1960s and 1970s was dominated by Algol-type languages and the two major commercial languages, Fortran and Cobol. The Algol style facilitated the development of creative programming techniques and led directly to the modern languages Pascal, Modula, BCPL, C, C++, and Java. Fortran became the de facto standard language for scientific programming and Cobol became the de facto standard for commercial systems development. Various attempts were made to produce a language that would be good for everything; PL/I is the most notable, combining features of Algol, Fortran, and Cobol into one big confusing whole. Programming was also performed in a variety of other languages that ranged from assembly language to system-command languages such as JCL (Job Control Language), which was used by systems programmers to ensure that programs were executed correctly, and that they had access to the correct resources of the system.

The weight of maintaining and developing the huge amount of Fortran and Cobol code, which was typically poorly designed, structured, and developed, was one of the factors that led to their decline. In the 1980s and 1990s programmers were influenced by the need to develop programs that were more in alignment with the needs of the end user rather than the technologist. They also needed to develop code that was able to operate on a range of systems that included PCs and workstations, and would work correctly on various operating systems. Managerial considerations were also becoming a key influence on which programming languages lived and which died. Project managers were keen to ensure that any code developed was reusable and structured, and could be maintained by any programmer versed in that language. The programming languages also needed to facilitate inter-program communication, since many systems were becoming either embedded inside other systems, or connected as part of a heterogeneous programming environment. This led to the emergence and adoption of several key current languages, chief amongst them C, C++, Java, and Visual Basic.

While the C programming language gives programmers the power to perform assembly-level types of operations within
higher-level programs and to access all aspects of the computer's hardware, it is too easy for inexperienced or untrained programmers to abuse or accidentally misuse the language. This was corrected to a large degree with C++, which retained the power of C but provided a more structured programming environment. C++ has proved to be a popular programming language; however, it is a very complex language to learn and to extract the full potential from. One answer to this was the creation of Java, which offers many of the features of C++ but simplifies the programmer's workload. Visual Basic provides non-technical developers with a mechanism through which they can develop simple programs, and is considered especially useful for creating systems with graphical user interfaces of the type familiar to office and commercial users.

In the 1980s there was a major movement toward developing one standard programming language that would be good for all purposes. The movement was led by the US government, and resulted in the programming language Ada. In 1987 the US Department of Defense mandated the use of Ada for nearly all projects under its control, with directive 3405.2, "Computer Programming Language Policy," which stated that "The Ada programming language shall be the single, common, computer programming language for Defense computer resources used in intelligence systems, for the command and control of military forces, or as an integral part of a weapon system." For various reasons, the Ada movement was unsuccessful, and in 1997 the policy was reversed. A new memorandum from the Assistant Secretary of Defense stated "I have directed my staff . . . to eliminate the mandatory requirement for use of the Ada programming language in favor of an engineering approach to selection of the language to be used." As a consequence of this there exists ten years' worth of Ada code in critical systems that is likely to become difficult to maintain, and will become a heavy legacy environment within the US Department of Defense, since very few programmers are being trained in Ada, and it is not an easy language to pick up.

In the 1980s and 1990s there was also a resurgence of interest in AI programming systems and languages, with Lisp and Prolog becoming popular for the creation of knowledge-based systems. The prominence of these systems has since declined, but many of them are still in operation. Some KBS programs have been translated into C or used through "shell" environments. Again, these legacy systems pose major maintenance problems for project managers because the support environments that were available at their creation have disappeared or not been updated. Lisp and Prolog require special training, since they do not work in the manner familiar to most programmers.

The use of programming within many organizations to create applications has been on the decline since the 1990s as CIOs began to understand the value of using commercially produced applications such as enterprise resource planning (ERP) systems to replace many home-developed programs. Interestingly, a leading ERP vendor, SAP A.G., chose to write their system in their proprietary language ABAP/4, which not only discourages customization of an SAP ERP system (typically a good thing) but also ensures that SAP A.G. has complete control over the development of the ABAP/4 language and, as a consequence, their ERP application.

The role of programming languages in the future is likely to continue along a development path that increasingly allows programmers to create systems that are able to interact easily, use the features of the web, provide high-quality graphical interfaces, and operate equally well upon a range of operating systems and hardware platforms.
Protocol

Definition: A set of rules governing the interactions between a number of systems or components.

Business value proposition
The ever-increasing ability for computing devices to connect together, to run
Proxy
future accesses to the same may be handled locally.

This second kind of proxy is also used, more controversially, to filter internet communications. Proxies may intercept web-page accesses, file transfers, and even email messages, and scan for content that is deemed somehow inappropriate. Applications of this range from corporations attempting to prevent leaks of industrial secrets, or to prevent employees from wasting time with web browsing, to shielding minors from explicit sexual material, or even to preventing customers from seeing rivals' materials.

An even more controversial use of proxies involves modifying material that is sent or received without the author's consent or even knowledge. If a user can access the internet only through a proxy, then that proxy has complete control over all of their communications. One application of this is adding Banner advertisements to email messages, but far more sinister uses are easy to imagine.

Business value proposition
Proxy servers provide a mechanism through which a network may be protected from unwanted external messages. They also add value by providing a mechanism for servicing requests without having to place unnecessary extra loads upon the corporate network. A proxy server, whether acting as a filter or providing redirection services, does not need to be physically located at the business, but may instead reside at the ISP or telecommunications company's site, reducing the bandwidth requirements on the corporate internet connection. The provision and maintenance of proxy servers may in many cases be sensibly outsourced to a web-hosting company.

While there are costs associated with the setup, maintenance, and operation of proxy servers, the cost is offset by the decrease in internal network traffic that results, the reduction in malicious data traffic entering the corporate network, and the potential for using third-party providers for specialist services.

Summary of positive issues
Proxy servers are strong mechanisms for protecting corporate networks. The technology is mature and widely understood by network managers. Proxy servers may act as a mechanism for servicing web requests quickly without having to route every request internally and thereby increase network traffic. Third-party providers are available to offer specialist services such as spam filtering.

Summary of potentially negative issues
Proxy servers that are used to filter spam and other traffic need to be maintained and monitored to keep them current and effective.

References
- P. Gralla (2004). How the Internet Works (Indianapolis, IN, Que).
- L. Peterson and B. Davie (2003). Computer Networks: A Systems Approach (San Francisco, CA, Morgan Kaufmann).

Associated terminology: Internet protocol, DHCP, Firewall, Network-address translation.

Public key–private key

Foundation concept: Security.

Definition: An encryption system using two keys, one for encryption and a different one for decryption; one key is made public, whereas the other is kept secret.

Overview
Public-key/private-key systems (often just called Public key) are based on asymmetric Encryption algorithms that require two separate keys (or pass-codes) for each communication. Data that was encrypted using one key can be decrypted only by using the other key, and vice versa; it is not possible to deduce one key from the other. RSA (named after Rivest, Shamir, and Adleman, the system's inventors) is the best-known and most respected of such systems.

For each individual participating in a public-key/private-key system, one matched key-pair is created. The keys are usually very large numbers, hundreds of digits long, so that a Brute-force attack, systematically trying every possible key to decrypt an intercepted message, is impossible. One of those keys is given in absolute secrecy to the individual; it is known as their Private key. The private key must never be revealed or left unsecured under any circumstances. The other key, known as their Public key, is published as widely as possible, and becomes part of the individual's public digital identity. Public keys must be well known and easily accessible. This arrangement permits a wide variety of secure communications.

Digital signature: if you wish to make a statement in such a way that everyone reading it can have absolute confidence that what they are reading is exactly what you wrote, simply encrypt it using your own private key. Everybody in the world has access to your public key, so they can easily decrypt it and read the message; the fact that your public key did successfully decrypt the message is proof that your private key must have been used to encrypt it. Only you have your private key, so you must have sent the message. Successful encryption methods are carefully designed so that the tiniest change to an encrypted message will result in absolute gibberish when the message is decrypted, so illicit modifications are impossible.

For your eyes only: if you wish to send a message and be sure that only person X can read it, simply encrypt it using person X's freely available public key. The encrypted message may be sent over a completely open and insecure channel; it doesn't matter who intercepts it, since only person X has the private key required to decrypt it.

Total secrecy: a message from person A to person B may be encrypted twice, once using person A's private key, then again using person B's public key. The result is that only person B can read the message, because only they have the required private key, and person B can be confident that it really did come unmodified from person A.

Private records: naturally, any participant may encrypt their own personal secrets using their own public key, and be secure in the knowledge that only they can decrypt them.

Key exchange: the asymmetric encryption systems needed for public-key systems are more thoroughly trusted, but much more computationally demanding, than the usual symmetric systems, so the transmission of large amounts of data can be very difficult. A valuable technique for large secure transmissions is to make up a totally new key for each transmission, and send that key only using the slow but very secure public-key system. Once the recipient has received and decrypted this one-time key (or Session key), it is used to encrypt the confidential data under a fast but perhaps less trustworthy symmetric encryption system, and never used again. Many cryptanalytic attacks rely on having a collection of intercepts all encrypted with the same key, so the ability to use a new key for each transmission, and be sure that that key cannot be intercepted, provides a significant increase in security.

Business value proposition
The use of public-key/private-key technologies has a wide range of applications. Primary amongst them is the encryption of data to be communicated between two or more parties and of data to be held within a company or by an individual. Emerging uses include the incorporation of encryption into a variety of processes such as voting and election mechanisms, smart cards, digital time stamps, authentication
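The ideas above (signing with a private key, sealing with a recipient's public key, and exchanging a one-time session key) can be sketched in a few lines of Python. This is textbook RSA with deliberately tiny primes so that the arithmetic stays visible; it is an illustration of the concepts only, not a secure implementation. Real systems use keys hundreds of digits long, padding schemes, and hardened cryptographic libraries, and they sign a hash of the message rather than the whole message.

```python
# Toy RSA sketch: one matched key-pair, where data encrypted with either
# key can be decrypted only with the other. The primes are absurdly small
# so that every number is readable; never do this in a real system.

def make_keypair():
    p, q = 61, 53                # two primes (real keys use enormous ones)
    n = p * q                    # the modulus, shared by both keys
    phi = (p - 1) * (q - 1)
    e = 17                       # public exponent
    d = pow(e, -1, phi)         # private exponent: modular inverse (Python 3.8+)
    return (e, n), (d, n)       # (public key, private key)

def apply_key(value, key):
    # Encryption and decryption are the same operation with different keys.
    exponent, n = key
    return pow(value, exponent, n)

public, private = make_keypair()
message = 42                     # a message, coded as a number below the modulus

# "For your eyes only": seal with the recipient's public key;
# only the matching private key recovers the message.
sealed = apply_key(message, public)
assert apply_key(sealed, private) == message

# "Digital signature": encrypt with your own private key; anyone holding
# the public key can decrypt it, which proves the private key was used.
signature = apply_key(message, private)
assert apply_key(signature, public) == message

# "Key exchange": transport a fresh one-time session key under the slow
# public-key system, then use it with a fast symmetric cipher for the bulk
# data (a stand-in XOR here, where a real system would use e.g. AES).
import secrets
session_key = secrets.randbelow(61 * 53)
transported = apply_key(session_key, public)
assert apply_key(transported, private) == session_key
bulk_data = 1234
ciphertext = bulk_data ^ session_key            # symmetric: one key encrypts...
assert (ciphertext ^ session_key) == bulk_data  # ...and the same key decrypts
```

Note how the signing and sealing steps are the same operation with the roles of the two keys swapped, which is exactly the symmetry the entry describes.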
Rapid application development (RAD)
modeling, process modeling, code generation, testing, and, finally, production. Ideally the RAD team members will have undertaken significant prior training and will be ready to go, allowing rapid commencement and execution of a project.

Business value proposition
The RAD methodology is well known amongst developers, and versions of it are incorporated into the development approaches used by many consulting organizations. The approach is well supported and tools are available to developers. It relies upon iterative prototyping, and this enables the development team to deliver prototype solutions to users more quickly than would be possible using stage-based methods (such as the waterfall method), which present solutions only at the end of the development cycle.

Summary of positive issues
RAD has the potential to deliver software solutions faster than stage-based models do. It allows the development of components in parallel and simplifies the amendment of components should the specification change during development. RAD has significant support in the form of software tools and consultants. RAD works well with other techniques for structured systems development such as Joint application design (JAD).

Summary of potentially negative issues
The RAD approach requires that the teams be trained in the RAD techniques and philosophy. The approach works best when the scope of a project is constrained and good use can be made of tools and Application program interfaces. Owing to the fragmented nature of the development, with the project development being divided amongst many groups, optimization issues need to be carefully considered if speed constraints are important. The fragmented aspect of the development, with user input to each segment, may result in a loss of overall consistency and deviation from the original specification.

Reference
- J. Kerr and R. Hunter (1994). Inside RAD (New York, McGraw-Hill).

Associated terminology: Joint application development, CAD/CAM.

Reliability

Foundation concept: Software development lifecycle.

Overview
People normally think of "one in a million" as being the archetypal statement of long odds, and a failure rate of one in a million describes a very reliable thing. However, a modern computer system can execute 2 000 000 000 operations or more per second, and an error rate of one in a million means that it will go wrong 2000 times every second. That would be totally useless. The reliability required of even cheap toy computer systems is higher even than that required of pacemakers and rocket-ships, not because of any critical need, but simply because the speed of operation magnifies the probability of error.

The reliability of electronic hardware is usually expressed in terms of the Mean time between failures (MTBF). If a large number of identical devices were all turned on at the same time, and used continuously until they were all broken, the MTBF would be the average time that each one survived.

The arithmetic of failure rates is not necessarily intuitive. For example, consider a system consisting of three components, A, B, and C. A has an MTBF of 1000 hours, B has an MTBF of 1000 hours, and C has an MTBF of 2000 hours. What is the overall MTBF of the system? The solution is to consider a long period of time, such as 100 000 hours. In 100 000 hours, we would expect component A to fail 100 times, B to fail 100 times, and C to fail 50 times, giving a total of 250 failures in 100 000 hours. This translates back to an MTBF of 400 hours for the whole system.
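The calculation just described is easy to mechanize: count the expected failures of every component over a long horizon, then divide the horizon by the total. A minimal Python sketch of the worked example above:

```python
# System MTBF from component MTBFs, using the method described above:
# expected failures of each component over a long period are summed,
# and the period divided by the total gives the overall MTBF.

def system_mtbf(component_mtbfs, horizon=100_000):
    expected_failures = sum(horizon / mtbf for mtbf in component_mtbfs)
    return horizon / expected_failures

# Components A, B, C with MTBFs of 1000, 1000, and 2000 hours:
# 100 + 100 + 50 = 250 expected failures in 100 000 hours,
# i.e. one failure every 400 hours on average.
print(system_mtbf([1000, 1000, 2000]))  # prints 400.0
```

This simple rate-addition assumes failures are independent and evenly distributed; as the battery-and-clocks example in this entry shows, real failure distributions are often far from even.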
Unfortunately the situation is not always so straightforward. Component failures are not usually evenly distributed. Consider a simple battery as a counter-example: a new battery put into an electric clock usually works for about a year; perhaps the MTBF for clocks with new batteries is 10 months. If you have 100 new clocks, and put a new battery in each, all at the same time, you will not find the clocks failing at an average of 10 per month. Many months will pass without a single clock failing; then, at around the 10-month mark, they will all fail at times very close together. The MTBF for the system of 100 clocks will be only slightly less than the MTBF for one of its component clocks. It is this non-linearity of the failure distribution that makes systems consisting of literally millions of components (such as computers) able to work at all.

The reliability of software is of a somewhat different nature. Unlike physical components, software that was constructed perfectly does not break after extended use and suddenly fail. No matter how heavily it is used, there is no wear and tear on software. If software ever fails, that means that it was faulty right from the very beginning, and the fault simply had not previously been observed. Expressions of the mean time between failures for software are somewhat deceptive; what they really express is how long the manufacturer expects you to be able to use the software before becoming aware of a fault that is already there. Manufacturers can only guess at how users will use their software, so the mean times are no better than guesses themselves. The only real measure of the reliability of software is the number of faults it already contains. If it has no faults, it is reliable; if it doesn't, it isn't. Sadly, most manufacturers can only guess at the number of faults contained in their software.

The fact that automated space probes manage to reach their destinations in the outer solar system after traveling for many years, and successfully beam back photographs of previously undiscovered moons, is a testament to the fact that people can make reliable software if they really try hard enough. One might find oneself asking why software for home and commercial use is so unreliable. Sadly, in most cases the answer is simply that it can be. Unlike all other engineering professions, there is absolutely no professional regulation for software engineers; expensive software is unguaranteed, and even comes with user agreements explicitly stating that it might not work and that the manufacturer will not take responsibility. It costs more to produce a good product than a bad one, people continue to purchase software from manufacturers whose other software they already know to be unreliable, and well-trained programmers cost more than untrained ones. Human nature and market forces ensure that general production software will remain unreliable for as long as it is allowed to.

Business value proposition
The ability to create reliable software is a function of the development methodology utilized to create that software. The range of software development styles and methodologies is wide, ranging from very informal styles in which the system requirements are expressed, if at all, in informal language, through to the use of mathematical equations to specify required behavior, a type of development known as Formal methods.

The discipline of software development has not yet evolved to the point at which the programmers, analysts, and other technologists involved in the development processes have all become "professional
engineers" in the sense that architects or civil engineers have in their professions. Professional bodies such as the ACM, IEEE, and BCS have developed professional codes and membership levels such as Member or Fellow, and further accreditation levels such as Chartered Engineer and Chartered Scientist have also been designated, but there is no requirement for practitioners to be members of any professional body, nor is there any enforcement of minimum standards of competence.

Programmers and other computer technologists do not need any accreditation, and any responsibility for product quality rests solely on the management team. Managers can demand that everyone in their development team should be members of a professional organization such as the IEEE and have Chartered Engineer status in order to work on a project, and they can further stipulate that the team use a formal methodology such as Z or VDM and that the specifications be refined mathematically until they are encoded in a program. Managers rarely make such demands, since the increase in costs would be significant.

An organization that is procuring software may also specify that the development team and the software vendor as a whole must meet Software Engineering Institute (SEI) Level 5 certification, the highest level of certification, which requires that code is produced under certifiably high levels of process control. The business value proposition for this style of development is that the systems will be of very high quality and extremely well documented, and that future changes to the system can be made with a complete understanding of the consequences. The problem is again that this style of development is extremely expensive and technologists who can perform at this level are extremely rare. A trade-off is inevitable; organizations would like to have all their systems rigorously developed but probably would not wish to incur the cost and make the effort required. Thus the management issue on software reliability comes down to questions of balancing the total cost of ownership of the software over its life, including liability issues, against return on investment for a project.

Summary of positive issues
The reliability of software and hardware within a computer system can be managed by developing software through the adoption of appropriate levels of formality in the development process, using the SEI's developmental guidelines.

Summary of potentially negative issues
Without strong management and process controls in place, the development of software is subject to large degrees of variance in terms of quality, methodological style, and programming technique, leading to a corresponding variability in reliability.

Reference
- D. Ince (1991). Software Quality and Reliability: Tools and Methods (London, Chapman and Hall).

Associated terminology: Formal methods, RAID, Y2K problem.

RFID (Radio-frequency identity) tags

Definition: Radio-frequency identity tags are small, relatively low-cost label devices that can be used to automatically identify and track objects.

Overview
The idea of automatically identifying an object from a distance gained prominence and became practical during the Second World War, when aircraft carried transponders tuned to a certain radio frequency, and, upon receipt of a particular signal,
they transmitted back an individual identifier to state that the aircraft was friendly. Subsequent to the war the concept of a reflected-signal device was discussed in the writings of Harry Stockman, but the technology available at the time was not adequate to implement his theories.

Since the 1940s many people have worked on remote-identification technologies and the concept has been refined considerably. The first patent was not awarded until 1973, when Mario Cardullo was awarded a US patent for an RFID device (the patent expired in 1990). Technical implementation problems remained for some considerable time, and it was not until the late 1980s that the technology was able to be put to practical use as a means of allowing drivers of cars to pay highway tolls without stopping at a toll booth. The tags used in cars are known as active tags because they have their own internal power source (usually a battery), and this enables them to transmit their data further and in a wider field than is possible with so-called passive tags, which have no internal power supply and can only reflect or interact with an externally generated signal.

Active tags hold significant advantages over passive tags in that the internal power source enables signal quality to be higher whilst providing a greater range of reliable operation. However, they are also much more expensive to manufacture and may require occasional human intervention, for example to change the batteries.

While active tags are useful, their cost and need for maintenance prohibit them from being used in mass commerce, especially in situations where they would be discarded after a brief period of use. Thus, passive tags have been the subject of intensive research and development, and, as the cost of their production has dropped, more commercial opportunities have been realized. A passive tag system works in a very similar way to the active tag system, but the passive system, having no energy source of its own, needs to convert the incoming radio waves into energy and use that energy to transmit back data.

The amount of data that can be held on passive RFID tags is growing, and products with 2000 bytes are available (the Fujitsu MB89R116 has 2048 bytes of total memory, 48 bytes of which are used by systems programs), with a "data-retention" period of 10 years under normal conditions.

There are current moves toward the creation of protocols and standards for RFID tags. Primary amongst these is the effort by EPCglobal Inc., a joint venture of GS1 (formerly the EAN), GS1-US (formerly the UCC), and a group of universities, whose focus is to "establish and support the EPCglobal Network as the global standard for real-time, automatic identification of information in the supply chain of any company, anywhere in the world." In 2004 EPCglobal approved a specification for an RFID air-interface protocol, known as the "Class-1 Generation-2 UHF RFID specification" or just as "RFID Gen 2." Other standards are being developed by ISO/IEC and the European Telecommunications Standards Institute. These protocols address problems in electromagnetic interference, sensitivity to nearby metallic objects, range of data transmission, and the use of multiple readers in close proximity.

Business value proposition
RFID tags are mechanisms for automatically detecting and identifying an object from a distance. The concept bears some similarity to that of bar codes, but RFID tags can be read through an opaque cover without being in direct line of sight, and can have the information upon them changed remotely. RFID tags may be read unobtrusively at a distance, so they could eventually lead to systems allowing shoppers just to wheel their purchases past a
checkout, still in their shopping cart, and get an accurate accounting. They are also an effective theft-prevention device, since tags may be hidden inside products, with readers placed at all exits.

RFID technologies have been used in a variety of ways. Active tags that contain their own power source are frequently used by motorists who wish to pass quickly through toll booths without stopping to pay with cash. The tag's identification number is collected by a reader on the toll booth, and the toll is automatically debited from an established account or charged to a credit card.

The decreasing cost associated with passive tags has led to their use in a growing number of areas; for example, the organizers of marathons frequently have the athletes attach RFID tags to their shoes so that their "real times" can be recorded (in large events such as the Boston and London Marathons some runners do not actually pass over the start line until several minutes after the starter's gun has been fired, due to the number of runners ahead of them in the crowd). Business applications using RFID are also growing, including embedding them in credit cards so that the card can simply be waved in front of an RFID reader. Casinos embed RFID tags inside their chips to prevent the use of counterfeit chips in their establishments, placing readers at the gambling tables.

Large-scale RFID use is dependent upon a dramatic decrease in their cost, and this is dependent upon large orders being placed with their manufacturers. Progress in this area is strong: in 2002 the Gillette Company ordered 500 million RFID tags, and Wal-Mart, the world's largest retailer, has moved toward 100% RFID compliance by vendors. Both of these initiatives indicate the strength of corporate confidence in this technology. Companies are placing RFID tags on pallets (with readers on forklifts) and on packing cases, as well as actually embedded within items.

Civil liberties organizations and political groups have raised concerns about the use of RFID. Central amongst these concerns is the ownership of personal information, and the ability of RFID readers to track and profile individuals. The United States and the European Union are investigating the possibility of embedding RFID tags into their citizens' passports and driving licenses.

Summary of positive issues
RFID technologies allow rapid remote identification of labeled objects. The data in the device can be changed as required. Passive tags do not require an internal power source. Tags are easy to place on objects. Protocols and standards are being developed. RFID tags can withstand a large range of ambient conditions. Unlike with bar-code systems, the readers do not have to be in line-of-sight proximity.

Summary of potentially negative issues
RFID tags contain a relatively small amount of data. Costs are relatively high compared with older solutions. It is difficult to ensure that a tag has been read, and especially that all tags have been read when processing collections of labeled objects; extra receipt acknowledgements may be required. Standards and protocols are still evolving. There are controversial social issues associated with the use of RFID in the public domain. It is difficult to ensure that tags are deactivated after they have served their legitimate purpose.

References
- H. Stockman (1948). "Communication by means of reflected power," Proceedings of the I.R.E., pp. 1196–1204.
- J. Landt (2005). "The history of RFID," IEEE Potentials, Volume 24, Issue 4, pp. 8–11.
Robotics
At the outset of the twenty-first century, robots may not be able to make our beds, cook our breakfast, and drive the children to school, but robotic help systems for the home are available from several vendors, including Honda's ASIMO P3, which can navigate around a home and supposedly perform basic household tasks such as fetching the newspaper from the front garden. However, those who expect robots to provide any real independent assistance in the near future are doomed to disappointment.

Much has also been written about a new class of robotic systems termed "nanotechnology," that is, technology whose size is measured in nanometers, or billionths of a meter, and which is hence built at the molecular level. While the theory holds great promise, the reality of nanotechnology robots being used to perform heart operations or other procedures in the human body remains a considerable way off.

economic issues such as operational costs incurred by the consumption of electricity and maintenance, and the expected useful lifetime before new technology renders an expensive robotic workforce obsolete.

The military use robotic systems to operate in high-risk environments such as examining potential bombs or hazardous materials. They have spent considerable amounts of money attempting to develop automated fighting machines to replace or assist their human soldiers.

Summary of positive issues
Robotic systems have been researched since the 1950s and a large literature has developed. Industrial robotic systems are widely available for manufacturing. Robots can possess the capabilities of walking, communicating, and vision processing. Robots are in use in hostile and inaccessible environments such as volcanic craters and Mars.
Sarbanes–Oxley Act of 2002 (SOX)
the compliance officer’s task because many retention of relevant records such as work
of the security barriers have been con- papers, documents that form the basis of
structed into the system by the vendor. an audit or review, memoranda, correspon-
Title IV -- Enhanced Financial Disclo- dence, communications, other documents,
sures, Section 404. Management Assessment and records (including electronic records)
of Internal Controls, ‘‘(1) State the respon- which are created, sent, or received in
sibility of management for establishing connection with an audit or review and
and maintaining an adequate internal con- contain conclusions, opinions, analyses or
trol structure and procedures for finan- financial data relating to such an audit or
cial reporting,” together with Section 409, review which is conducted by any accoun-
‘‘(1) Real Time Issuer Disclosures -- Each tant who conducts an audit . . .”
issuer reporting under section 13(a) or The retention of data and information by
15(d) shall disclose to the public on a the public company and the auditors is a
rapid and current basis such additional major aspect of Information-lifecycle manage-
information concerning material changes ment (ILM), in which the frequency of use
in the financial condition or operations of the data is related to the type and cost
of the issuer, in plain English, which of the storage medium utilized for stor-
may include trend and qualitative infor- ing that data. Included in this data set is
mation and graphic presentations, as the the email correspondence of the organi-
Commission determines, by rule, is nec- zation, the archiving of which needs spe-
essary or useful for the protection of cial concern since the volume of emails
investors and in the public interest.” The may be considerable, thus requiring spe-
development of policies and procedures to cialist storage services. Additionally, special
maintain internal controls typically falls techniques will be needed to capture other
under secure identity management (SIM), forms of electronic communication such as
whereby all security points for data and instant messaging and text messaging.
personnel are assessed and the required In order to comply with these and the
level of security is implemented. The pri- other sections of the act, CIOs have utilized
mary vehicle through which Section 409 (1) a series of frameworks. Primary amongst
is approached is the posting of information these are COSO, CobiT, Trust Services, and
upon a web site accessible via the internet. the ISO 17799 security standard.
Additionally, document-management sys- The Committee of Sponsoring Organiza-
tems are also valuable resources to provide tions of the Treadway Commission (COSO)
facilities such as versioning, archiving, and is an organization founded in 1985 to
managing heterogeneous file types (audio, examine the ‘‘causal factors that can lead
video, email, text, web documents, etc.). to fraudulent financial reporting” and to
Title VIII -- Corporate and Criminal Fraud ‘‘develop recommendations for public com-
Accountability, Section 802. Criminal Penal- panies and their auditors.” The COSO
ties for Altering Documents, ‘‘(a) (1) Any framework was developed in the 1990s to
accountant who conducts an audit of an manage the SEC’s demands for internal
issuer of securities . . . shall maintain all audit controls and has been adopted by
audit or review work papers for a period of SOX-compliance managers to meet the act’s
5 years from the end of the fiscal period in demands. The framework is composed of
which the audit or review was concluded. five dimensions,
(2) The SEC shall promulgate, within 180
days after adequate notice and an oppor- 1. Control environment
tunity for comment, such rules and regu- 2. Risk assessment
lations, as are reasonably necessary to the 3. Control activities
corporate resources and commitment. This has been especially hard on many smaller public companies and has resulted in companies delisting from the public markets and returning to being privately held entities. It has also been suggested that, as a cost of business, it has acted as a barrier to entry for companies wishing to locate in the United States or use the US public stock markets to raise capital.

References
- The IT Governance Institute (2004). IT Control Objectives for Sarbanes–Oxley (Rolling Meadows, IL, The IT Governance Institute), www.itgi.org.
- The IT Governance Institute (2000). CobiT Framework, 3rd edn. (Rolling Meadows, IL, The IT Governance Institute), www.itgi.org.
- The IT Governance Institute (2000). CobiT Executive Summary, 3rd edn. (Rolling Meadows, IL, The IT Governance Institute), www.itgi.org.
- Committee of Sponsoring Organizations of the Treadway Commission (2004). Integrated Control Framework, Volumes I & II, www.iacpa.org.

Associated terminology: Internet, ILM, Email, Instant messaging, ISO/IEC 17799.

Scalability

Foundation concept: Software development lifecycle.
Definition: The ability to expand a system without making significant changes. Getting more of the same results by using more of the same solution.

Overview
Scalability is a very simple concept. If a merchant needs to store 12 000 cubic feet of non-perishable merchandise, they might decide to build a 3000 square foot one-storey warehouse. With merchandise stacked 8 feet high, covering 50% of the floorspace (to allow access), the size matches perfectly. If that merchant expands, and needs to store 24 000 cubic feet of merchandise, simply building another 3000 square foot warehouse will solve the problem equally well. If they eventually need to store 120 000 cubic feet, the process of building a new 3000 square foot warehouse, applied ten times, will yield a working solution. The idea of building warehouses to store merchandise is a Scalable solution.

If a publisher decides to produce a small encyclopedia with 4000 entries, they may decide to print it as a single large book, with 1000 pages printed on very lightweight paper, in the style of a large dictionary. If the encyclopedia proves popular, and a second, extended edition with 5000 entries is planned, they could simply expand the book to 1250 pages and use a slightly stronger spine. However, the same expansion process can not be applied too many times. It is almost impossible to produce a 10 000-page book: the binding falls apart, and it is too heavy to lift. This is not a scalable process. Eventually a jump to new technology is required.

For book production, the jump to new technology is simple: just publish the book in more than one volume. The analog for computer systems, both hardware and software, is usually not so simple.

A single desktop PC, with a suitably robust operating system, when acting as an enterprise web server can support a surprisingly heavy load. A thousand gigabytes of data and a million hits per day are realistic expectations. However, there are limits. The standard IDE/EIDE disk system supports a total of four devices, and even wide SCSI is limited to 15, so disk storage can not be expanded without limit. A T1 internet connection provides 193 000 bytes per second of bandwidth: as usage increases, that provides an absolute upper limit to capacity. Simply using two computers instead of one does not provide a viable solution. How are accesses to be distributed between the two computers? How can one ensure that the two computers have consistent versions of the data at all times?

In a similar vein, the simple techniques of programming used by beginning programmers are also very often not scalable. The commonly understood methods for searching data sets, for example, are linear with time, which means that, as the amount of data to be searched grows, the time required to search it grows in direct proportion. This sounds like an obvious observation, but is in reality a critical problem that can be fatal to ill-conceived software projects. Under light load, testing with 1 MB of data might reveal that the average search takes one tenth of a second, and that sounds quite acceptable. But when the system goes live with perhaps 10 GB of data (10 000 times as much), that one tenth of a second grows to one thousand seconds, or a quarter of an hour, reducing the system to a maximum of 96 agonizingly slow accesses per day. (See Complexity, Algorithm, and Data structure for more depth.)

Ensuring scalability requires intimate knowledge of algorithm analysis, data-structure design, and network protocols, and requires familiarity with the vast corpus of human experience that 60 years of intense study has created. Of course, an intelligent programmer could in principle rediscover all of this knowledge for himself or herself, just as an intelligent blacksmith could in principle independently reinvent the Apollo lunar-landing module.

Business value proposition
Lack of scalability in software solutions is the hallmark of the untrained programmer. It is often possible to learn a little programming by reading a book or two, and then produce small-scale solutions, perhaps using Visual Basic or JavaScript, that stand up to simple test scenarios. When software has to support literally millions of accesses every day, with perhaps hundreds happening concurrently, and remain operational 24 hours a day, 7 days a week, more sophisticated design is needed. It takes many years for a programmer to gain sufficient expertise in such areas.

Summary of positive issues
There exists an extensive body of knowledge pertaining to the problems of scaling software and hardware systems. Commercial systems that will scale to significant levels are available for common problems and tasks.

Summary of potentially negative issues
Failure to understand the scale to which a task may grow can result in solutions that fall short, cause system failure, and be detrimental to the business. It is expensive to train IT personnel to the levels required in order for them to plan, design, build, and maintain systems that scale significantly.

Reference
- K. Hwang (1992). Advanced Computer Architecture: Parallelism, Scalability, Programmability (New York, McGraw-Hill).

Associated terminology: Reliability, Computability.

Secure

Foundation concept: Encryption.
Definition: Secure web pages and secure transactions are transmitted over the network with an encryption method that ensures complete confidentiality even if the entire interaction, including key exchange, is intercepted.

Overview
When reviewing personal records, or making an online purchase with a credit card, or when privacy is a concern for any other reason, the internet can be an alarming communications medium. There are so
are implemented and used correctly. No security system can be expected to be perfect, but processes and measures must be in place to identify immediately any breakdown in security that has occurred, as well as to report any unsuccessful security breaches so that weak spots may be identified before it is too late. Security should not only be reactive to incidents, but should also be a continuous background presence. It is important to assess carefully the true value of security policies in action, since some seemingly valid policies can have a negative effect; for example, the common requirement to change passwords every month results in users having meaningless unmemorable passwords that they need to write down, and a written password is much easier to find.

Reference
- K. Day (2003). Inside the Security Mind: Making the Tough Decisions (Englewood Cliffs, NJ, Prentice-Hall).

Associated terminology: Security, Internet, Protocol.

Security

Foundation concept: Secure.
Definition: An information system's security pertains to the task of understanding those aspects of an organization's information system and processes that may be vulnerable to unauthorized intrusion, abuse, or attack.

Overview
Security of information systems is primarily concerned with access, physical as well as through the network. Good security commences with an appraisal of access rights, followed by the enactment of physical and technological solutions to guarantee that security is achieved.

Physical security is frequently overlooked, but all access to secure and non-secure areas needs to be considered. For example, if a trader at an investment bank left their computer logged in to the trading system, a cleaner with legitimate access to the area at night could gain access to that system, send rogue emails, place trades, or acquire proprietary information about an IPO pricing strategy. Similarly, all technical correspondence should be shredded or disposed of in a secure manner, because even seemingly irrelevant information can be useful to someone with bad intent, e.g., the CEO's internal email address or the CFO's system ID could be a starting point for a hacker's attack. The physical security of the computers in the organization needs to be examined, since it is not unknown for people to load deliberate software bugs that copy log-in data or corporate information and then transmit it secretly to the attacker (or an accomplice) via the internet.

Technology clearly can play a large part in securing an organization's systems. For example, the use of encryption, thin clients, and password-protection systems, requiring the operating system to time-out and disconnect users after short periods of inactivity, having servers disable workstations and clients on the network after normal work hours, and using biometrics to log in to systems can all play a part in securing the systems from intrusion.

Security concerns also emanate from external entities that attempt to intrude upon the network and its contents. Attacks come in many forms, including attempts to hack into the network, or crack the security codes associated with a network. Alternatively, networks may come under attack from Viruses, Worms, Trojan horses, and Denial-of-service attacks. Defenses against each of these can be mounted, including the creation of software and hardware barriers such as Anti-virus software, Firewalls, Proxy servers, and the use of networks that are not capable of connecting to the internet unless Network address translation is used. Other plagues that are usually not fatal but are detrimental to corporate systems' performance include Spam, Pop-up ads, and Phishing. These can be addressed through the use of spam filters and of software that removes spyware and prevents pop-ups from being executed. Phishing requires corporations to be proactive and to help customers affected by such attacks, and to have e-Commerce policies that minimize their effectiveness.

While computers and networks can seem secure, frequently security measures are undermined by individual weaknesses, such as leaving a laptop computer with no password protection in a car that is stolen, downloading data to unauthorized portable computers that are taken out of secure premises, using wireless systems that beam the signal through office or hotel walls, and using laptop computers on airplanes where corporate information or client information can be surreptitiously viewed. Even cell phones and cell phone conversations need to be considered as part of the security umbrella of an organization and solutions found for these areas of concern.

Corporate IT security concerns are addressed by many laws, including the CAN-SPAM Act of 2003, which enacts requirements on commercially motivated email; HIPAA (the Health Insurance Portability and Accountability Act, 1996), which legislates the security requirements associated with digital medical systems; the UK Computer Misuse Act of 1990, which forbids unauthorized access to computers and the data held on them; and the US Identity Theft and Assumption Deterrence Act of 1998, which makes most phishing criminal. These and other laws come with strong penalties to deter potential offenders, but the nature of the internet makes many offenders difficult to track down, and many companies do not wish to prosecute every small attack because this would potentially have the negative effect of reducing customer confidence in their system and defenses, as well as making the company a potentially attractive and challenging target for other attacks. Laws are also not "practically" helpful in the event that an attack successfully crashes the corporate network or when private data has been stolen; the damage has already been done.

Business value proposition
Security is an aspect of a corporate IT organization that is usually regarded as a sunk operational cost; while the prevention of intrusion or attack is valuable, it does not actually help raise the top line of the business (revenue) unless a company uses it as a selling point. In the last century banks used to have large safes visible to customers entering the bank, and this encouraged their customers to think that their deposits would be safe. In the modern technological organization it is becoming more and more a corporate imperative to ensure the security of their systems and, more importantly, the security and privacy of their customers' data.

Many companies have created a position of Chief Security Officer (CSO) responsible for all security concerns, including physical and intangible assets. This is an important position, especially in a public company that has to adhere to corporate governance legislation such as Sarbanes–Oxley. CEOs, CIOs, and CSOs all need to be aware of technology-related security issues, and these may influence decisions regarding corporate strategy. For example, it is vital to understand the legal consequences of outsourcing information-based services to countries with weak IT laws and protection. One approach that has been used successfully has been to adhere to ISO 17799:2000, which is a code of practice for information-security management and offers guidelines on the development of policies and good practice in this area.
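The idle time-out defense mentioned in the overview can be sketched in a few lines of code. This is a minimal, hypothetical illustration (the `Session` class, the ten-minute limit, and the locking action are all invented for the example), not the interface of any particular operating system.

```python
import time

IDLE_LIMIT_SECONDS = 10 * 60  # policy choice: lock after ten minutes of inactivity

class Session:
    """Tracks the last moment of user activity and locks the session when idle."""

    def __init__(self):
        self.last_activity = time.monotonic()
        self.locked = False

    def record_activity(self):
        # Keystrokes, mouse movement, etc. would call this.
        if not self.locked:
            self.last_activity = time.monotonic()

    def check_idle(self):
        """Called periodically by the system; returns True once the session is locked."""
        if time.monotonic() - self.last_activity > IDLE_LIMIT_SECONDS:
            # A real system would now disconnect the user and demand re-authentication.
            self.locked = True
        return self.locked
```

The point of the design is that the trading-floor scenario described above, a terminal left logged in overnight, is closed off automatically rather than depending on the user remembering to log out.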
Server
(internet servers), and applications (application servers), and to relieve the load on the main server, which is interacting with its clients and determining how best to service requests (the main server in a Microsoft-based system is termed the primary domain controller).

Business value proposition
The server concept allows corporations to develop flexible, scalable computer systems that run across networks. The associated client–server technologies allow greater flexibility in terms of configuration, the variety of applications that can run on the platform, and control of user access compared with traditional mainframe dumb-terminal configurations. Server technology may be configured to support a wide array of applications, including specialist servers, such as web servers, that address the computational needs of particular groups of users.

Server technology allows companies to create and administer a strong security policy based on the centralization of data and applications in a small number of server locations. This allows the CIO, the network administrator, and the Chief Security Officer to focus resources on the problem of system security. Security policies may be implemented more efficiently on a server-based architecture than when corporate IT operations are distributed amongst the business units with no central control.

Server architecture also allows the IT organization to standardize and regulate its software systems rather than allowing individuals and business units to make their own software choices. Server technology is mature enough that good cost-of-ownership decisions can be made centrally by the IT organization. A key aspect of this calculation is centered upon the software license fees, which may be based on the number of users and the processing power of the system. It is therefore important for the CIO to ensure license productivity and that decisions affecting the return on investment are correctly considered.

Summary of positive issues
Servers facilitate strong centralized control of an organization's computing resources. Servers provide a flexible, scalable, secure architecture through which to deploy computing resources. There is a choice of operating systems upon which servers can run. Servers can run major applications such as ERP systems or act as file servers for a company's web-page service.

Summary of potentially negative issues
Large server architectures require substantial expertise to run effectively. Dependence upon a central system can leave the users vulnerable should that system fail.

References
- J. Chellis, C. Perkins, and M. Strebe (2000). MCSE: Networking Essentials, Study Guide (Alameda, CA, Sybex Press).
- P. Gralla (2004). How the Internet Works (Indianapolis, IN, Que).
- D. Groth (2003). A+ Complete (Hoboken, NJ, Sybex–John Wiley and Sons).

Associated terminology: Client, Client–server, Web services, Peer to peer.

SIIA (the Software and Information Industry Association)

Definition: The Software and Information Industry Association is a trade association representing members of the software and digital-content industry.

Overview
The Software and Information Industry Association (SIIA) is a trade association for the software and digital-content industry. The association provides a variety of services, including representing its members' interests at the governmental level, protecting the intellectual property of members, promoting business development, and providing educational services at a corporate level and as a public service.

The association is divided into four divisions: Software, Content, Education, and Financial Information Services, through which a common policy formulation is developed and put forth in the form of white papers, briefings, and other media. Each division develops the agendas that are important to its members, and, through working groups and studies, develops policy in each area. For example, the Software division has working groups that consider initiatives such as web services, open-source systems, and software as a service.

The SIIA is also active in the development and enforcement of anti-piracy policy, and has created an anti-piracy department to focus upon this area (the division was formerly known as the Software Publishers Association). The SIIA also offers a course that leads to the Certified Software Manager (CSM) qualification, which is aimed at network managers and technology professionals.

Business value proposition
The SIIA provides a forum through which common subject matter can be discussed and policy positions ultimately developed. These policy positions can then be put before legislators by the SIIA to help provide advice and education. The body also provides industry intelligence to members through briefings, reports, and conferences.

Summary of positive issues
The SIIA has a large membership with a wide range of interests. Membership fees are on a sliding scale depending upon the revenues of the company joining. The SIIA is influential in providing representation at the governmental level and provides members with educational resources. The SIIA is proactive on anti-piracy issues and provides certification for IT professionals pertaining to piracy issues.

Summary of potentially negative issues
Membership of an umbrella organization such as the SIIA is optional, and the association can not be seen as the only voice in the technology industry.

Reference
- SIIA Main Office, 1090 Vermont Ave NW, Sixth Floor, Washington, D.C. 20005, USA; http://www.siia.net/.

Associated terminology: ACM, BCS, IEEE, W3C.

Software development lifecycle

Foundation concept: Software.
Definition: The software development lifecycle is the series of processes through which a concept is transformed into a program.

Overview
The Software development lifecycle (SDLC) covers the transformation of a concept into a verified and validated software product. Verification is the process of ensuring that the system created matches its specification; Validation is the process of ensuring that the specification satisfies the customer's requirements.

An SDLC can be designed to consist of any combination of development methods as long as the resulting code matches the specification to a degree of rigor that satisfies the customer. The weakest form of lifecycle is one in which somebody verbally proposes an idea for a program and then just starts to code directly on the computer, running the resultant code and testing it until they are satisfied. This method is clearly not scalable and makes proper verification impossible, since there is no specification to measure the system against.
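The difference between verification and validation can be made concrete with a small invented example (the discount rule and the function below are hypothetical, not drawn from this entry). A fragment of a written specification is restated as executable checks, so that verification, confirming that the code matches the specification, becomes mechanical.

```python
from math import isclose

def order_total(unit_price: float, quantity: int) -> float:
    """Implementation under verification.

    Specification (agreed with the customer): orders of 100 units
    or more receive a 10% discount on the whole order.
    """
    total = unit_price * quantity
    if quantity >= 100:
        total *= 0.90
    return total

def verify() -> None:
    """Each check restates one clause of the specification."""
    assert isclose(order_total(2.00, 10), 20.00)    # below threshold: no discount
    assert isclose(order_total(2.00, 100), 180.00)  # at threshold: 10% off
    assert isclose(order_total(1.00, 0), 0.00)      # empty order costs nothing
```

Passing verify() says only that the code matches the specification. Validation is the separate question of whether the 10% rule is what the customer actually wanted, and no amount of testing of the code can answer that.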
Software package
be in existence during the entire expected to be examined from a process and tech-
lifespan of the product; and what levels nical perspective prior to purchase. This
of support will they be able to offer in includes such testing as stress testing to
regard to upgrades, maintenance, and tech- determine the limits of an application to
nical support? Other issues involve the abil- handle boundary problems, e.g., a sales-
ity of the software to support the com- entry system that is capable of handling
pany’s existing and future communications downloads of 1000 point-of-sale data entries
protocols (e.g., X.12, XML), and the cost per hour into its database might not be
per user to run the software. All of these scalable and hence would be unsuitable
issues and many others need to be consid- for a hypermarket dealing with 10 000
ered during the due-diligence period of ven- entries per hour. The purchase of an appli-
dor selection and should ultimately be cap- cation in a software package will come
tured in a legal contract. with a contract; however, all possible future
Organizations need to determine their issues can not be predicted in the con-
ultimate optimal business process require- tract. For example, in 1999 a contract
ments (business process re-engineering) for an accounting system would not have
prior to selecting the package, rather predicted the need to include provision
than letting the package completely dom- for Sarbanes--Oxley; hence the question of
inate the selection process. While there who pays for this upgrade would not have
might not be a complete one-to-one match been addressed, even a provision to ensure
between the organization’s desired best- that the vendor make available, for a fee,
practice model and the process model upgrades to cover such process require-
provided by the system, this sequence of ments might not have been included in the
actions will allow the selection of a package contract.
that comes closest to the ideal, and adjust-
ments may be subsequently made at the Reference
r N. Hollander (2000). A Guide to Software
system and organizational levels.
Package Evaluation and Selection: The R2 ISC
Method (New York, American
Summary of positive issues Management Association).
A wide variety of software packages is
available, covering requirements for mass Associated terminology: ERP, X.12, XML.
and niche markets. These include open
source, freeware, and commercial software
to which there is no code access; each type
of vendor and code offers different solution Spam
options based upon the process and sourc-
Foundation concepts: Email, Internet.
ing needs of the organizations.
Definition: Unsolicited, unwanted, bulk commercial
email.
Summary of potentially negative issues
While the range of vendors and solutions Overview
is wide, there is also a wide variety of Advertising is big business; direct mailing
solutions on the market. Some software to consumers or potential consumers is
is full of errors and bugs, some known, very popular, as is demonstrated by the
some unknown; and unfortunately price is quantity of junk mail delivered daily. Bulk
not an indicator of quality as one would mailing of small cards costs 24c/ per item
expect in, say, an automobile purchase. through the US Postal Service, and the
The quality of a software package needs cost is 39c/ per item for anything larger
304
Spam
than a postcard.† In addition to the cost of subject line for pre-chosen key sequences
postage, there are also printing, assembly, such as ‘‘Lotto Number Predictor” and the
and addressing costs, but still unsolicited trade names of certain little blue pills.
direct-mail advertising abounds. Email is Deliberate misspellings or false headings
free. Hence spam. easily defeat most systems.
The advantages to the advertiser are Spam exists only because sending email
manifold. Complex typesetting and high- is free. The decision that the cost of inter-
quality images may be used without cost; net access for most subscribers should
emails may be sent automatically to entire be charged as a monthly flat rate, with
easily purchasable mailing lists without no usage-dependent costs, was arrived at
any human processing; and, most impor- mostly by default, and certainly favors the
tantly of all, email is unregulated. In com- spam industry. If the origination of data-
parison with mail fraud, which is a well- bearing transmissions actually cost some-
understood crime, whose victims have lit- thing, some fractional amount, it would
tle difficulty finding out how to report it, not impact the average subscriber: the total
victims of electronic mail spamming have monthly bill would on average remain the
a more difficult time seeking redress, since same, it would just be calculated differ-
the spammers work hard to circumvent the ently. Only those who transmit inordinate
regulations and remain untraceable. amounts would find significant increases
Although email messages seem to have in costs. There is no fundamental right to
return addresses, they do not really mean free long-distance telephone calls; it may
anything. Email protocols do not require a seem surprising that internet traffic should
valid ‘‘from” address, and make no attempt be treated so differently, to the detriment
to verify those provided. Once an email has of all users. The TCP/IP protocol family as
reached the recipient, it is virtually impos- it currently exists has no mechanism for
sible to determine where it really came accounting and charging, but far greater
from, so unscrupulous retailers feel free design challenges have been met before.
to say anything they want. Illegal and con-
trolled substances, defective or nonexistent Business value proposition
merchandise, stolen goods, everything is on The spam debate has positives and neg-
offer. atives depending upon whether you are
Spam-prevention software generally a company that uses email as a direct-
attempts to flag unwanted email after it marketing channel or are an unhappy
has been delivered, telling the recipient recipient. Email is a cheap mechanism
that they probably won’t want to waste for pushing a direct-mail campaign out to
time reading it, but not preventing it potential consumers. Recipients, however,
from arriving. The effectiveness of even find spam annoying and defending against
this technique is severely limited by the spam to be a waste of corporate and indi-
fact that spam can be detected only by vidual resources.
reading and understanding the content of
a message. The natural-language problem Summary of positive issues
simply has not been solved: computers can Spamming has led to the development of
not understand free-form human commu- several laws to protect the consumer. The
nication. Current techniques are barely Controlling the Assault of Non-Solicited
more sophisticated than scanning the Pornography and Marketing (CAN-SPAM)
Act of 2003 was developed to regulate com-
†
USPS pricing, obtained from www.usps.gov, September merce by imposing limitations and penal-
2006. ties on the transmission of unsolicited
305
Spam
commercial electronic mail via the inter- prohibits deceptive subject headings and
net. The act covers and attempts to prevent requires the inclusion of a return address
spamming and the avoidance of account- or comparable mechanism in commercial
ability by spammers. Its more important electronic mail. The act also has a provision
sections forbid the following actions: to establish a ‘‘do not email” registry that
would allow recipients who object to receiv-
r the unauthorized use of a protected ing spam email to register, and requires
computer to trasmit multiple that the sender must stop sending the
commercial emails; email within ten days of the objection, but
r the use of a protected computer to in a Federal Trade Commission report to
relay or retransmit multiple messages the US Congress they stated that this could
with the intent to mislead the be counter-productive since spammers, pos-
…recipients;
- the use of materially false header information in multiple commercial emails, and the distribution of such emails;
- registering five or more email accounts or two domain names using a false identity with the intent to transmit multiple commercial messages;
- the acquisition of email addresses by "harvesting" web sites without the consent of the site's operator;
- the false registration of an internet domain for the purpose of setting up a web site to capture email addresses; and
- sending commercial messages to randomly generated email addresses.

To protect the public and allow spam filters to operate more effectively, the CAN-SPAM act stipulates requirements for the transmission of messages. The act requires that the header information be technically accurate: it must include a correct originating email address, domain name, or IP address. The act also requires that the "from" line in the email accurately identify the person sending the email, and that the "header information" accurately represent the computer used for transmission of the email and not hide the fact that the email has been relayed through a series of machines. A warning label must also be placed on commercial email containing sexually oriented materials. The act …

… possibly outside US jurisdiction, might use the list as a source of email addresses, and further that such a list "would raise serious security, privacy and enforcement difficulties."

Summary of potentially negative issues
The CAN-SPAM legislation has been criticized for having several gray areas and loopholes through which spammers can attempt to circumvent the law. Spammers continue to use a variety of mechanisms to place temporary or untraceable addresses in their email headers, and even place just one line of "news" in the text to ensure that the message contains "newsworthy" content, as the act demands. An additional problem is that a large volume of spam originates from nations beyond the jurisdiction of the CAN-SPAM law. The United States has not yet established a "National Do Not Email Registry."

References
- J. Zdziarski (2005). Ending Spam: Bayesian Content Filtering and the Art of Statistical Language Classification (San Francisco, CA, No Starch Press).
- T. Muris, M. Thompson, O. Swindle, T. Leary, and P. Harbour (2004). National Do Not Email Registry: A Report to Congress (Washington, DC, Federal Trade Commission).

Associated terminology: Email, Internet, Internet protocol, Law cross-reference.
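The header-accuracy requirements above can be inspected mechanically. The sketch below is a minimal illustration using Python's standard email library; the message text, names, addresses, and host names are all invented for the example. It parses a raw message and extracts the "from" line and the chain of "Received" headers that each relay adds, which together record the path the message took:

```python
import email

# A minimal invented message, for illustration only.
raw_message = """\
Received: from mail.example.net (mail.example.net [192.0.2.25])
Received: from sender.example.org (sender.example.org [198.51.100.7])
From: Alice Example <alice@example.org>
To: bob@example.com
Subject: Quarterly report

Hello Bob.
"""

msg = email.message_from_string(raw_message)

# The "from" line must accurately identify the sender.
print(msg["From"])            # Alice Example <alice@example.org>

# Each relay adds a "Received" header; reading them in reverse
# order reconstructs the path the message took.
for hop in reversed(msg.get_all("Received")):
    print(hop)
```

Spam filters apply exactly this kind of parsing at scale, penalizing messages whose header chain is inconsistent with the claimed origin.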
Spoofing
ensure that all adequate protections are in place.

Defending against spoofing is best achieved through physical security measures, ensuring that no unauthorized computer can access the network, and this policy should be enforced at all other network levels, including the ISP. This requires that organizations use ISPs and service providers that take spoofing seriously and actively enforce their policies. Concern in this area may lead to the decision to use only private networks and to adopt secure encryption-based protocols rather than relying upon the insecure ones.

Summary of positive issues
Most spoofing techniques are well known and may be defended against through the use of private networks, encryption, and other techniques.

Summary of potentially negative issues
New spoofing techniques are always being developed, and resources need to be deployed to keep track of threat levels and address any newly revealed security weaknesses.

Reference
- B. Schneier (2004). "Customers, passwords, and web sites," IEEE Security and Privacy, Volume 2, No. 4.

Associated terminology: Phishing, Virus, Trojan horse, Worm.

Spyware

Foundation concept: Security.
Definition: Illicit and usually illegal software that spies on a computer's stored data and its users' activities, reporting them through an internet connection or an autodialing modem to a third party.

Overview
Market research and advertising consume an enormous proportion of many corporations' budgets; however, some companies simply lack the funds to develop sophisticated market research and advertising campaigns. A system that would provide almost unlimited opportunities for market research and direct-to-consumer advertising would be very tempting indeed. That is exactly what Spyware is.

Simply visiting a web site once is enough to cause software from that site to be downloaded, installed on your own computer, and executed, if your web browser does not have sufficient security settings selected. Software can similarly be embedded in emails and automatically installed if the email client application is not secure. Very commonly, spyware takes the form of a Trojan horse: some piece of free software that is claimed to perform some useful task is made available for download through the internet. Many users will happily download it, install it, and try it out. Maybe it does perform its advertised task, and maybe it doesn't, but, if it is a spyware delivery vector, it will continue to run after it appears to have stopped, and will have added itself to the system's start-up list, so that it will automatically run every time the computer is started.

Once it is running, spyware has the unobstructed use of the entire computer. It can be designed to search out certain files (address books, password files, etc.), or to look for any file containing certain kinds of data (account numbers, PINs, etc.), and send the contents to the spyware's author. It can monitor the keyboard and record everything that is typed; it can record every application that is run, and every file that is accessed; it can record every web page that is viewed, and every email that is sent or received. Everything the computer does can be recorded and transmitted. It could even potentially turn on a web-cam and capture real-time pictures of the user in their home or office.

Traditionally, the term spyware is limited to software that passively spies, as detailed
above. Software is not limited by the term used to describe it, and spyware will commonly play a more active role. Spyware can be written to redirect web searches, so that the operators can insert their own preferred sites instead of the results provided by the genuine search engine. It can redirect simple web accesses so that the operators' advertisements are shown instead of the real pages. The most common non-passive act is to "pop up" unsolicited advertisements.

In many cases, spyware is illegal, but, in the United States, legislation to protect private parties (both individuals and corporations) is very weak at the federal level (government computers and those of financial institutions are protected, of course), and left to state legislatures. State laws are not at all uniform, and, since most use of spyware is an interstate attack, state laws can be difficult to enforce. Many spyware operators manage to convince themselves that their actions are neither legally nor morally wrong, and are even willing to believe that their victims are really beneficiaries.

Spyware is mainly a plague on users of personal computer systems. This is partly because spyware requires some effort to create, so that effort is directed toward the systems that most people have. It is also partly because of the large numbers of security flaws discovered in popular operating systems and third-party software made for them. The only way to maintain privacy is to avoid spyware, and the only way to avoid spyware is to run a secure computer system, in particular never running software that didn't originate from a completely trusted source. As with viruses, there are a few anti-spyware applications available, and in this case some of the best are also free.

Business value proposition
For corporations and individuals, spyware is generally an annoyance that is ignored until it builds to a critical level and prevents proper system functions. Much depends upon the type of spyware that is resident upon a system. Typically, spyware attaches itself to a computer through an access to a web site or through an email message, and then collects information on the computer or its user's activities. This information is then transmitted over an internet connection to a remote user. Clearly, in the worst instances spyware can run undetected for long periods of time, transmitting industrial secrets to a third party, or set a web-cam in action, monitoring an individual's activities in the "privacy" of their home. However, not all spyware is put on a system from an external source; spyware can be placed on a system by an employer to monitor employees' compliance with company policies, or to check against access to unauthorized sites. Parents may wish to place spyware on their children's computer for similar reasons.

The best way to avoid spyware is by not being connected to the internet; however, this is not always practical, and the next stage is to install a Firewall and software to filter out problem files. The computers themselves can also have their security levels raised to forbid the installation or running of web-resident software. Finally, there are many commercial vendors who provide anti-spyware software (some for free), and these act to sweep all aspects of a computer's memory, flagging and then removing undesired spyware and adware. Spyware itself evolves continually, and thus it is necessary for users to update their software's library of possible spyware sources and categories.

Summary of positive issues
Spyware can be used to actively monitor any computer. Companies may wish to monitor the online activity of an employee, whereas a parent may wish to monitor their child's online behavior. Anti-spyware software is available. Certain categories of spyware activity are illegal in some
jurisdictions. In the UK, all software installed on a computer without the owner's consent is illegal. In the US many states have similar laws, but there is no federal equivalent.

Summary of potentially negative issues
Spyware can cause serious damage to an organization by stealing intellectual property, passwords, or other corporate data. Lesser damage is caused by its occupation of valuable network bandwidth and slowing of infected computer systems, but this can become critical if left untreated for too long. Spyware can be created to monitor the personal behavior of a computer user, for example their online shopping activity and their web-surfing habits, or even to turn on computer equipment such as web-cams. Legal recourse against spyware is limited in many jurisdictions to certain categories of software and intrusion. Spyware may be impossible to detect without the aid of special software.

Reference
- E. Schultz (2003). "Pandora's box: spyware, adware, autoexecution, and NGSCB," Computers and Security, Volume 22, No. 5.

Associated terminology: Advertising, Virus, Trojan horse, Phishing.

Storage

Definition: Memory, disks, and other media used to retain data and applications both in the short and in the long term.

Overview
Memory or storage is one of the sine quibus non of any computer system. Unlike with humans, it is not possible for a computer with poor memory retention to get by. The central processing unit (CPU) of a computer, its brain in anthropomorphic terms, has no capacity for any sustained activity; hundreds of millions of times per second, it must access its memory to find out what it is supposed to be doing next. Without memory, it would grind to a halt in a few nanoseconds.

Computer storage is usually divided into two distinct categories, Primary storage and Secondary storage. Primary storage, often just called Memory, is the kind that is essential for the nanosecond-to-nanosecond operation of the processor. Every instruction to be obeyed by the computer, and every piece of data to be used, must be present in primary storage. This means, of course, that a computer must have as much primary storage as possible, but the importance of size is nothing when compared with the importance of speed and reliability. The processing speed of a computer is completely bound by the speed (or Access time) of its memory.

There is an essential balance to be struck. The faster memory is, the more it costs, and the more memory costs, the less of it a computer can have. Additionally, faster memory consumes more electrical power, and results in more heat that must be dissipated from the circuits, which is a major problem in modern computer design. The generally used solution is a complex compromise. A typical modern computer has a large amount of reasonably fast memory known as Main memory or "RAM," usually in the hundreds of millions of bytes, plus a much smaller amount of exceptionally fast memory, known as the Cache. Whole applications and data sets are loaded into main memory as usual, but, as an application runs, the portions of that data that are in immediate use are copied into the cache. This is a very complicated dynamic process performed by the CPU, but it is responsible for significant improvements in processor speed at minimal cost.

Another economic factor is the unsurprising fact that it is much cheaper to make Dynamic RAM, memory that can retain
information for just a few thousandths of a second, than it is to make Static RAM, which retains information for unlimited periods. The difference is so great that all modern computers use almost exclusively dynamic RAM, and are equipped with the extra circuitry required to Refresh, or remind, every memory cell of what it is supposed to be storing, hundreds of times per second automatically. So much research and development has been devoted to improving the performance of dynamic RAM that the overall effect is a benefit. Static RAM has been relegated to a very small unit that runs on battery power to keep essential system settings when a computer is turned off.

Another unfortunate economic aspect is that the only affordable memory is Volatile. That means that it works only when the power is on. Up until the mid-1970s, when all computer memory was expensive, the main memory technology in use was magnetic: vast arrays of minute magnetizable doughnuts, called Cores, that each stored one single bit in a magnetic field. These cores were Non-volatile; a computer could be turned off for a week, and still retain everything. Unfortunately, core memory takes up a lot of space, is much slower than semiconductor memory, and is vastly more expensive. The only part of core memory still surviving is the name; many still call main memory "the core," and the extensive status report produced when an application crashes is called a Core dump.

One cheap form of non-volatile memory is available; it is known as ROM, or Read-only memory. As the name suggests, ROM provides only half of the functionality normally required of computer memory: its content can be read (i.e., looked at), but can not be written (i.e., modified), so it is suitable only for data that absolutely never changes. Even essential basic applications and operating systems can not reasonably be stored in ROM because it would make upgrades impossible. ROM comes in many varieties. The basic kind must be programmed with its permanent content at the factory. PROM (Programmable ROM) comes from the factory blank, and can be programmed (once only) using special equipment. Another variety, EPROM (Erasable programmable ROM), can actually be erased and reprogrammed (erasure is either by exposure to ultraviolet light, or electronically), but erasure is a very slow process, and EPROM does not provide a viable form of general-purpose non-volatile memory. ROM is used primarily for controlling embedded processors and for storing the permanent system-start-up procedures (BIOS) in a modern computer.

The term "RAM," which is now used to mean general-purpose memory, is in fact an acronym for Random-access memory. Random access means that the various items stored in the memory may be accessed in any order at any time, rather than Serially. A good analogy is the difference between watching a film on a DVD and on a video tape: DVDs are random access, so you can skip to any point at any time; video tapes are serial, and the only way to get to a point half way through the film is to wind through half of the tape. There is no serial-memory technology in current use, and the meaning of RAM has changed slightly. Technically speaking, ROM is a form of RAM; its contents may be accessed in random order. In modern usage, however, RAM means strictly normal general-purpose memory.

Because non-volatile RAM is prohibitively expensive, and even the usual volatile RAM is by no means cheap, computers also need secondary storage. In the economic trade-offs of computer design, primary storage is chosen to be as fast as possible without becoming too small to use; secondary storage is chosen to be as large as possible without becoming too slow to use. Secondary storage currently uses totally different technology, almost invariably the Hard-disk drive. Disk drives use magnetic fields to store data, and so are non-volatile, retaining data for many years without use. The
magnetic fields are stored in concentric circular tracks on a rapidly spinning disk (hence the name). Typically, random access to an item of data stored on disk takes of the order of a million times longer than random access to a data item in primary storage, but capacities tend to be approximately one thousand times higher for similar prices. The significantly greater capacities and non-volatility of disks make the combination of semiconductor RAM for primary storage and disks for secondary storage a universal choice.

Business value proposition
The selection and use of storage in a corporate setting requires that a business case be developed for the whole system, taking into account the processes that are going to run upon it. In a system for a person whose only requirement is to use a word processor, memory speed and size will typically not be the bottleneck, but rather the person's typing speed. However, in the case of a file server that is used by corporate users, an assessment of memory requirements must consider issues of performance, loads, and network demands. Adding lots of memory to a corporate server that has a low-performance processor could still greatly increase performance.

RAM is available in a wide variety of types, each of which is available with a wide array of performance options. These choices facilitate the acquisition of a specific memory component for a specific purpose (e.g., cache for main memory), and allow systems to have their memory upgraded, potentially meeting new requirements without the user having to purchase a completely new system.

Commercial and personal computer users usually never change the ROM configurations of their systems, since these typically store system-specific applications, but newer systems use Flash ROM, which does allow the BIOS to be updated without any special equipment. Technologists can, however, place their software into a hardware memory device that is non-volatile in nature if desired. Various devices are available that allow developers to burn their software into a PROM, test the program in the PROM, and move the PROM onto production memory devices.

Secondary storage is available in a wide variety of configurations and needs to be considered as part of a company's Information lifecycle (ILC) and the management of the company's data. There are many companies offering data-storage services that can be used as outsourcing partners.

Summary of positive issues
A wide range of devices and specifications is available for both primary and secondary storage. The technology is well understood and supported by vendors. Software and hardware tools are available to help users to analyze and optimize their memory. New forms of memory are continually evolving (e.g., USB flash-memory devices). Primary storage is available in a wide variety of configurations.

Summary of potentially negative issues
The wide variety of memory types requires that the correct memory be selected for each situation. Adding memory to a computer system might not make the system more effective or faster, since the bottleneck may lie elsewhere in the system. Some memory specifications and types lose their manufacturer's support as they age and are superseded by newer specifications. While primary memory can last for ever, barring physical harm, secondary memory has a shorter lifespan because it contains moving parts, and an assessment of the condition of the memory needs to be undertaken regularly, so that the appropriate upgrades, maintenance, and adjustments can be made.
Structured design methodologies
to develop a concept into a program, and encourages the use of structured methods at each stage. The structured approach is also employed in commercial methodologies such as RAD. The structured approach continues to be used in various forms and hybrids, and has evolved into methods such as UML to accommodate the current Object-oriented paradigm. Structured models are generally not considered rigorous enough to develop mission-critical systems that may, for example, involve potential risk to human life; for such systems a more mathematically rigorous approach to design, known as Formal methods, was developed.

Business value proposition
The use of structured design techniques leads to higher-quality, more maintainable, and better-documented programs and systems than would be the case if no unifying design approach were taken. The adoption of structured methods allows companies to use a consistent approach to developing their systems, allowing consistency in training and more efficient planning, scheduling, recruitment, and management. The structured approach allows for the adoption of commercial tools and packages to support this style of development, as well as the adoption of management practices such as RAD to speed up development, and cost-modeling techniques such as COCOMO II to help determine system costs and effort requirements. The methods are well known and supported by consultants and software tool vendors.

Summary of positive issues
The structured approach to systems development is well established, with a long history and extensive literature. A wide variety of software tools is available to support structured methods and the associated management techniques that accompany this approach to systems design.

Summary of potentially negative issues
Structured design was developed in the 1970s to aid in the design of transaction processing systems in Cobol. Early methodologies may be less suitable for the development of object-oriented program designs, and care needs to be taken in methodology selection. Structured methods tend to be considered "less rigorous" in nature since they don't involve the mathematical formalism of formal methods; as a consequence they may be considered unsuitable for systems requiring absolute precision in the development process.

References
- E. Yourdon (1979). Classics in Software Engineering (New York, Yourdon Press).
- G. Myers (1974). Reliable Software through Composite Design (New York, Mason and Lipscomb Publishers).

Associated terminology: Formal methods, UML, RAD, Object-oriented, Cobol, ERD, DFD, Waterfall model.
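Cost-modeling techniques such as COCOMO, mentioned under the business value proposition, come down to simple published formulas. The sketch below uses the classic COCOMO 81 "organic mode" equation (effort in person-months = 2.4 × KLOC^1.05) purely as an illustration of how such models work; COCOMO II, the version named above, refines this with additional scale factors and cost drivers, and any such figure is an estimate rather than a measurement:

```python
def cocomo_organic_effort(kloc):
    """Classic COCOMO 81 organic-mode effort estimate, in person-months.

    A simple illustration of algorithmic cost modeling; COCOMO II
    applies many additional scale factors and cost drivers.
    """
    return 2.4 * kloc ** 1.05

# Doubling the code size more than doubles the estimated effort,
# reflecting the diseconomies of scale in software projects.
print(round(cocomo_organic_effort(10), 1))   # ~26.9 person-months
print(round(cocomo_organic_effort(20), 1))   # ~55.8 person-months
```

The exponent greater than 1 is the model's way of encoding that communication and coordination costs grow faster than the code itself, which is exactly the planning and scheduling concern that structured methods are intended to manage.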
TCP/IP (Transmission Control Protocol/Internet Protocol)
with directing it to the right application running on that computer.
Layer 4, "application": this is the "useful" layer, performing the high-level tasks that are actually wanted, such as delivering email or remotely controlling equipment. The other layers are really just services, whose sole purposes are to make this fourth layer possible.

Business value proposition
TCP in conjunction with IP is the most widely known protocol for general network communications. However, it should not be forgotten that there are alternatives. When deciding upon which protocol to choose, one must develop a business case that examines the future requirements of the systems under consideration. This can be done through the creation and evaluation of a set of performance indicators (such as reliability); systems metrics (such as performance requirements) then determine which protocol would be the most suitable. The business case will also consider the protocol requirements of other systems that the network would interact with, including those beyond the corporate boundary. Many organizations will also have technical staff capable of creating a proprietary protocol that exactly fits their specific needs; a proprietary layer-3 (transport) protocol may certainly be a viable solution, since it would ride upon IP, the lingua franca of the internet; a proprietary layer-2 (network) protocol is probably impractical, because it would not be recognized by existing network equipment.

Summary of positive issues
TCP is a well-known, highly reliable, and established protocol.

Summary of potentially negative issues
TCP is a complex protocol, which places a higher demand on networks and thus may require a higher network bandwidth capability than less convenient alternatives would need.

Reference
- W. R. Stevens (1994). TCP/IP Illustrated (New York, Addison-Wesley).

Associated terminology: Internet protocol.

Telnet

Foundation concepts: Client–server, Operating system.
Definition: A network application that gives remote access to any software that provides a command-line interface and is itself network enabled.

Overview
Before graphical user interfaces and windowing systems became affordable and popular, the regular way to control a computer was by typing verbal commands on a keyboard and reading a typed response. This kind of interaction is still familiar to anyone who uses Unix or "the DOS prompt," and is often still the preferred mode of operation for programmers and the technically minded. There is only so much information that can be conveyed by clicking and dragging icons, and, when more is needed, the use of complex dialogs to provide the extra information can be very tedious and inefficient.

The use of a Command-line interface, through which users type commands that may be as simple or as complex as is desired, can provide a great increase in efficiency. The only cost is that users must learn the language of the interface. Dialog windows do at least ask for specific information in specific places, so a user need only understand the questions in order to use them properly. When faced with a command-line prompt, there is generally no information on what the system will accept, and a user can not reasonably guess at the syntax and hope for the best. Use
Trojan horse
from a trusted source, and be very careful about whom you trust. It is very tempting to solve your budget woes by downloading free versions of software, but consider this: if a stranger walked up to you in the street and gave you a free pie, would you eat it?

Business value proposition
There is no legitimate business value to a Trojan horse.

Summary of positive issues
The software device known as a Trojan horse is known and understood, so vendors of anti-virus software can work to ensure that their software identifies and removes the offending program.

Summary of potentially negative issues
Vendors of anti-virus software need to continually modify their software to detect new and variant Trojan horses, which is a never-ending and nearly impossible task. Opening a malicious Trojan horse can lead to serious or even catastrophic consequences.

References
- M. Egan and T. Mather (2004). The Executive Guide to Information Security: Threats, Challenges, and Solutions (New York, Addison-Wesley).
- P. Szor (2005). The Art of Computer Virus Research and Defense (New York, Addison-Wesley).

Associated terminology: Virus, Spyware.
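The advice above, to run only software obtained from a trusted source, can be partly automated by verifying a download against a checksum published by its vendor. The sketch below is illustrative: the file name, its contents, and the digest are invented for the example, and in practice the published digest must be obtained over a channel independent of the download itself:

```python
import hashlib

def sha256_of(path):
    """Compute the SHA-256 digest of a file, reading it in chunks."""
    digest = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(65536), b""):
            digest.update(chunk)
    return digest.hexdigest()

# Invented stand-ins for the demonstration: a freshly written file,
# and the digest a hypothetical publisher is assumed to have
# announced for exactly this content.
downloaded_file = "installer.bin"
with open(downloaded_file, "wb") as f:
    f.write(b"test")
published_digest = "9f86d081884c7d659a2feaa0c55ad015a3bf4f1b2b0b822cd15d6c15b0f00a08"

if sha256_of(downloaded_file) == published_digest:
    print("checksum matches: the file is what the publisher released")
else:
    print("checksum mismatch: do not run this file")
```

A matching digest proves only that the file is the one the publisher released, not that the publisher is trustworthy; the free-pie question still applies to the source itself.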
Video
Fortunately, Nature took a short-cut when designing the human eye, and the color-sensing cells in the retina react to stimuli only in three broad wavelength ranges. This is why there are three primary colors: it has nothing to do with the physics of light; it just arises from the structure of the human eye. To accurately represent the color and intensity of any point of light from the human point of view, only three numbers are required, representing the intensities in each of the three bands: red, green, and blue. Thus the storage space required for any video signal may be computed as 3 × the number of frames scanned per second × the number of horizontal rows per scan × the number of resolved dots per line × the length of the video sequence in seconds × the amount of space required to store a single number. To make a picture smooth enough to please the average viewer, each number requires only one byte, and about 500 lines, 500 dots per line, and 30 scans per second are required as a minimum. This results in a total of 22 megabytes per second, or 120 gigabytes for a short movie; a very large quantity of data indeed, and an impossible bandwidth requirement for almost any network connection.

The figures of 500 rows and 500 dots are somewhat arbitrary; standard television signals are approximately 500 × 300, and "DVD quality" is, in the United States at least, 720 × 480. The resolution perceivable by the eye is much higher than that, but cameras could easily be designed with a higher resolution if needed. The figure of 30 scans per second is based soundly on biology: the eye can not react to changes quite that fast, so a higher scan rate would produce no improvement for human viewers; a scan rate as low as 20 per second would produce a very irritating flickering effect.

Twenty-two megabytes per second is an almost impossible standard to meet. Fortunately, there is a lot of redundancy in most real scenes, so compression can come to the rescue. Compression ratios of 30:1 may be achieved with generally acceptable results, although rapidly changing or subtly detailed scenes do suffer degradation. The most used format for digital video is now MPEG (Moving Picture Experts Group), which is currently available in four versions. (The popular audio format MP3 is an abbreviation for MPEG-1 Audio Layer 3.) MPEG is the standard used for domestic DVD recordings, most digital satellite television signals, and most streaming video systems. MPEG is closely related to the JPEG system for still images; it views the image as a series of 8 × 8-pixel blocks, which are individually compressed. When too high a compression ratio is demanded, these little squares become easily visible. MPEG also groups together small sequences of scanned images, taking the first whole (but compressed) as a Key frame, then recording only how subsequent frames in the group differ from the first. When the scene changes slowly, this produces great improvements in compression without noticeable loss of quality; when the scene changes quickly, it produces highly visible, strangely colored rectangular "MPEG artifacts."

This belies the deliberate myth of "digital quality." A digital signal is always an approximation to an original analog signal, and a compressed one more so. Digital video does have two advantages: it is in a conveniently computer-manipulable form, and perfect copies can be made without the "generation loss" familiar to those who have tried to copy VHS tapes. Digital video, like digital audio, has no innate lead over a corresponding analog signal in terms of quality; it is simply cheaper and more convenient for providers.

Other video formats abound. The AVI (Audio/Video Interleave) format introduced by Microsoft is popular in video production because it is easier to process and tends to produce better quality than MPEG, but as a consequence produces much larger
files, so it is not popular for distribution. Apple's QuickTime "MOV" format is also popular, and frequently used for short downloadable movie clips.

Business value proposition
There has been explosive growth in the use of digital video since low-cost video technologies became available during the 1990s. The newly affordable technologies include moderate-quality cameras costing only a few tens of dollars but requiring direct connection to a computer, video cameras with their own built-in recording mechanisms, and, of course, digital versions of traditional VCRs using computer disks instead of tapes. A significant increase in affordable network bandwidth and the availability of affordable CD- and DVD-burning hardware were also major influences.

The proliferation of video devices was paralleled by the development of technologies and software to allow the duplication, storage, and manipulation of the video data. These include software for the production of online videos capable of being downloaded through the internet or multicast through Local area networks, which has led to a large increase in the use of, if not in the quality of, on-demand instructional and training video.

The potential uses of video technologies are vast, and the delivery of video over the internet has become an area of commercial and academic interest. The long-awaited delivery of video-on-demand television is now possible over high-bandwidth networks. Wide availability of high-speed broadband internet connections is needed both to create a positive economic demand model and to facilitate the delivery of video signals with sufficient image resolution. Computer systems may be connected to large projection systems to provide a home or corporate in-house theater effect. There is great potential for future expansion in this direction; the delivery of radio over the internet and short video clips sent to cellular telephones are certainly just the first steps.

Summary of positive issues
There are several well-known and established standards associated with video. Video technologies and internet technologies are continually improving, allowing higher-resolution and larger video files to be transferred. Digital video delivered over a network is a cost-effective mechanism for the delivery of short or low-bandwidth transmissions, which is particularly valuable for training and educational purposes.

Summary of potentially negative issues
Digital video requires a large amount of storage capacity and bandwidth. Video compression, when taken too far, significantly reduces the quality of images.

Reference
- D. Austerberry (2002). Technology of Video and Audio Streaming (Woburn, MA, Focal Press).

Associated terminology: Compression, Audio, Digital, Bandwidth.

Virtual machine

Definition: A software application that perfectly imitates a real computer.

Overview
It is possible to design a program that provides a perfect imitation of the hardware of a real computer. By duplicating, as a software simulation, all the operations of a given piece of hardware, a purely software application can arrange to produce exactly the same effect. Naturally, a software simulation will be slower than the real thing, and generally less efficient in other ways, but in all important respects a piece of digital hardware that is fully understood can be
331
Virtual machine
perfectly rendered in software. The result is only harm the simulation, and the simu-
called a Virtual machine. lation can be restarted in a fraction of a
Virtual machines provide four valu- second.
able services, relating to experimentation, Software construction and portability:
hardware--software compatibility, system this is currently the most popular use of
security, and software construction and virtual machines. Sometimes, the basic way
portability. that modern computers work makes cer-
Experimentation: when the purchase of tain software solutions very difficult to
a new kind of computer system is being implement, and designers conclude that it
considered, buyers can get a strong idea of would be much easier if they could design
what the new system is like to use without a whole new kind of computer just for this
having to buy or borrow a demonstration one program. Virtual machines make this
model if they can run a software simula- into a viable solution. The new computer,
tion of it on their existing computer sys- idealized for the solution of this one prob-
tem. When a new computer system is first lem, and perhaps totally infeasible for real
designed, enormous savings can be realized construction, is designed and implemented
by first constructing a virtual machine to as a software simulation. The difficult prob-
test the design and iron out the major bugs lem is then solved with a program writ-
before constructing any new hardware. ten for that new computer, and the com-
Hardware–software compatibility: most puter simulation packaged together with
software runs exclusively on one par- that program is the product. This is an
ticular hardware platform: PC software especially valuable technique when new
does not run on Macintosh hardware. programming languages are designed and
Although some software manufacturers do first implemented. Once a product has been
produce versions for different platforms, developed by this method, it is much eas-
this is by no means universal. A virtual ier to produce versions for other hardware
machine allows one computer to behave platforms: only the virtual machine needs
like another, and therefore run software to be Ported to another kind of computer;
designed for that other system. For exam- the program that it supports, the major
ple, a Macintosh computer can run a vir- part of the product, remains unchanged.
tual machine that imitates a PC, and This is part of the philosophy of Java. All
that virtual PC can then run any PC Java programs run on a virtual machine
software. known as JVM, the Java Virtual Machine.
System security: virtual machines are All Java programs are automatically cross-
not limited to imitating another kind of platform because an equivalent JVM has
computer. It is quite possible, and common, already been written for all major hardware
for a PC with Windows to run a virtual platforms.
machine that imitates a PC with Windows.
The reason for this is that untrusted soft- Business value proposition
ware (perhaps it is suspected of being a Tro- Virtual machines are software programs
jan horse or bearing a virus, or perhaps it that simulate the hardware of a real or the-
just doesn’t work very well and keeps crash- oretical computer. The best known use of
ing the computer) can be run on the vir- virtual machines is in the Java program-
tual machine; it will behave exactly as it ming area, where Java programs run on
would on the real computer, but will be a Java Virtual Machine (JVM) and as such
incapable of doing any harm, because its are highly portable, since the JVM can be
environment is simulated. It can’t erase the re-written for any hardware platform. This
real disks or crash the real computer; it can prevents the problem associated with many
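The "whole new kind of computer, implemented as a software simulation" described in the Overview can be made concrete with a small sketch. This example is not from the book: the tiny three-instruction stack machine and its opcodes (PUSH, ADD, MUL) are invented purely for illustration of how a machine can exist entirely in software.

```python
# Illustrative sketch only: a minimal "virtual machine" for a hypothetical
# stack-based instruction set, showing how the behaviour of a (here imaginary)
# piece of hardware can be duplicated entirely in software.

def run(program):
    """Interpret a list of (opcode, argument) pairs on a simulated stack machine."""
    stack = []
    for op, arg in program:
        if op == "PUSH":       # place a constant on the stack
            stack.append(arg)
        elif op == "ADD":      # pop two values, push their sum
            b, a = stack.pop(), stack.pop()
            stack.append(a + b)
        elif op == "MUL":      # pop two values, push their product
            b, a = stack.pop(), stack.pop()
            stack.append(a * b)
        else:
            raise ValueError(f"unknown opcode: {op}")
    return stack.pop()         # the result is left on top of the stack

# (2 + 3) * 10, expressed as a program for the simulated machine
program = [("PUSH", 2), ("PUSH", 3), ("ADD", None), ("PUSH", 10), ("MUL", None)]
print(run(program))  # → 50
```

A program written for this invented machine runs unchanged anywhere the interpreter itself runs, which is the same portability argument the entry makes for the JVM.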
Virtual memory

Virtual organization
once an order has been received and full payment made, then assembling them and shipping the product out. This strategy may provide great savings in inventory and warehousing costs.

In practice, pure virtual organizations exist only in the service sector (e.g., consulting), while in the manufacturing sector a close approximation can be found in organizations that utilize techniques associated with mass customization, as advocated by researchers such as Joseph Pine. The term Agile has also been associated with discussions surrounding the concept of virtual organization. Steven Goldman and other pioneers in this area have promoted the agile organizational model, in which products and production resources are designed to allow continuous product evolution as market demands change.

Business value proposition
The concept of a virtual organization is in operation to some extent in the majority of organizations that use a third-party service provider for any activity and manage that relationship through technology. The pure virtual organization concept, in which every function is outsourced, has not been embraced by many organizations, but as the ability of an organization to monitor its outsourced functions through sophisticated systems increases, the adoption of this model will increase. The concept of virtual organization allows organizations to concentrate on deriving the best value from each service provider, reducing operating expenses through technology, and using managerial resources to execute the business plan effectively.

Summary of positive issues
The concept of a virtual organization allows greater flexibility and partnership with "world-class" service providers. The application of leading technologies allows greater agility in the company, rapidly adjusting the production of manufacturing or provision of service to align with demand or market fluctuations.

Summary of potentially negative issues
The concept of virtual organization incurs a high overhead in terms of command and control, information technology, and relationship management. Partners may find it difficult to provide services to the virtual organization with the flexibility desired. Virtual organizations are dependent upon their partners to perform core and non-core functions successfully.

References
- W. Davidow and M. Malone (1992). The Virtual Corporation (New York, Harper Business).
- J. Pine (1993). Mass Customization: The New Frontier in Business Competition (Boston, MA, Harvard Business School Press).
- S. Goldman, R. Nagel, and K. Preiss (1995). Agile Competitors and Virtual Organizations: Strategies for Enriching the Customer (New York, Van Nostrand Reinhold).

Associated terminology: Business process re-engineering.

Virtual private network (VPN)

Foundation concept: Network.
Definition: A network that has no independent physical existence, but exists only as a set of connections over a larger real network, usually the internet.

Overview
If an organization has offices in London, Sydney, Grytviken, and New York, the construction of a private network connecting the four would be prohibitively expensive. A worldwide network, the internet, already exists and provides public connections between any two points. A Virtual private network (VPN) uses the existing infrastructure of the internet, or any other large real network, to provide what behaves as a private network.
Virtual reality

Virus
or a web page contained a piece of software, just opening the email or browsing the page would be enough to start that software running. With more recent operating systems, user-selectable safety settings exist. The user can decide that no software ever runs automatically, or perhaps only software from trusted sources. The user must be aware of these settings, and deliberately activate them, but they do provide some amount of protection.

Unfortunately, a virus delivered through a "back door" can still often be activated automatically. There is a sadly common software fault called a "buffer over-run." In simple terms, a software application with an internet connection has to set aside some memory for incoming messages to be assembled in. If an incoming message is longer than it should be, and the programmer left out all the checks, the message can over-run the boundaries of the assembly area and become part of the application itself. The checks to prevent buffer over-runs are exceptionally simple, and the problem of buffer over-runs is well known. For a commercial programmer to allow a buffer over-run is an act of carelessness bordering on gross incompetence, but, sadly, buffer over-runs are still probably the most common means of entry for a virus.

Purpose: the general expectation is that a virus is intended to erase or modify essential data, or even destroy computers, to cause great mischief and financial loss. While this is definitely true in some cases, the most common virus is simply an "ego trip." Somebody wants to cause just enough annoyance to be noticed, and earn bragging rights amongst the community of "hackers," as they like to call themselves. The fact that great harm is not usually intended should be of little comfort. Virus software is often very poorly designed, and can cause a lot of unintended damage. Most importantly of all, a virus can be truly malicious. Once an illicit program is running on your computer, there are very few limits to its scope. It could erase every byte of data in the whole system; it could transmit all of your personal data over an internet connection to anywhere in the world; it could make your modem dial a very expensive 1-900 number; or it could just pop up a message saying "Got you!"

Quite frequently, when a virus gets into a computer, it will secretly take over that computer, allowing the virus creator to send commands and mount other attacks, or store illegal copies of copyrighted material on the computer without the owner ever knowing. Most of the seriously damaging attacks are mounted indirectly from computers that have been taken over by a virus, so that the originator of the attack is virtually untraceable. This can have unpleasant legal consequences for the innocent computer owner.

It is not impossible for a virus to cause physical harm to a computer, but it is very unlikely, and extremely rare. Just as with a real biological virus, if it kills its victim, it has no opportunity to spread itself to others.

Creation: virus creation can be a highly skilled and intellectually demanding job, but, as with most criminal enterprises, the miscreants are usually completely unskilled users of widely available "virus kits." Finding a security flaw in a commonly used application, and then working out a way to exploit it, is a very difficult task. Unfortunately, it rarely has to be done. Responsible software companies are always testing their own products, and, when they find a flaw, the only responsible action is to release a "patch" (a small program that automatically repairs the flaw) to their customers. Once such a patch has been released, it is much easier for a virus designer to work out where the flaw is, then release a tailor-made virus, knowing that most computer users will not install the patch in time to be safe.

By far the largest group of viruses consists of those created by people who either
blindly follow a script for virus creation, or take an existing virus and insert their own names into it. Thus all such viruses tend to do the same thing, and often don't work (either doing nothing, or being far more destructive than was intended).

Business value proposition
Even a virus that does no tangible harm has a negative impact on the bottom line. It takes time and resources to remove a virus from an infected computer, and viruses must always be removed, because seeming harmless is no guarantee of being harmless. A virus consumes resources, tapping the computer's processing power and memory, and uses up valuable communications bandwidth. Deliberately destructive viruses are another matter altogether, and can paralyze a company through its computing resources.

The only way to be completely safe from viruses is either not to have a computer, or to have a computer but no modem and no internet connection, and never to accept floppies, CDs, or DVDs from anyone. That is clearly not a practical approach. To be relatively safe from viruses, there are a few simple precautions that organizations and individuals can follow.

- Do not buy software from companies that have a history of exploitable software.
- Turn on all the user-selectable security levels.
- Disable JavaScript in web pages and other active software applications.
- Do not open an email attachment from somebody you do not know unless it has been scanned by an anti-virus software package.
- Do not access floppies or CDs that have not been scanned for viruses.
- Use a reliable Firewall.

No matter how good your security is, the only responsible attitude for corporations is to expect to be the victim of a disastrous attack and take appropriate steps in advance. These include encrypting all sensitive data stored on a network-accessible computer, and ensuring that all valuable data is effectively backed up: copied onto removable media (CD, DVD, tape, etc.), and physically removed from the computer and the site. The only data that cannot be destroyed by a virus is data that is not on a computer.

Most importantly, corporations and individuals should always install security patches as soon as they are released by the software company that created the operating system and other applications in use on the organization's computers. It is also important to make sure that the patch being installed is really from the original software company. (Another form of attack, known as the "Trojan horse," is to send out free software that claims to be something useful, perhaps even a security patch, but is in fact just a virus in different packaging.)

Special-purpose "anti-virus" software can be effective, but is not the silver bullet that it is often believed (or hyped) to be. Companies that create anti-virus software monitor the viruses that are flying about the internet. Whenever they detect a new one, they work out a way to detect and disable it, and then send out a (usually free) update to all of their customers. Except for a few rare cases, anti-virus software can protect only against a virus that has already been caught and analyzed. If a company or an individual is the victim of a new virus (and most victims do fall in the first few days after a virus's release), the anti-virus software will probably not be able to do anything until the "antidote" for the new virus has been released.

It should be noted that personal computing devices that are not "company issue," such as home PCs, need to be considered as part of the company's security assessment, since executives who take home sensitive data place it outside the organization's direct control.
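The "exceptionally simple" bounds check described in the Overview can be sketched in a few lines. This example is not from the book: the buffer size and message handling are invented for illustration, and a memory-safe language is used only to show the check itself (the real fault arises in languages such as C, where writing past the end of a buffer silently overwrites neighbouring memory).

```python
# Illustrative sketch only: the bounds check whose omission causes a
# "buffer over-run". An over-long incoming message is rejected instead of
# being allowed to spill past the end of the assembly buffer.

BUFFER_SIZE = 64  # bytes set aside for assembling an incoming message

def receive(buffer: bytearray, message: bytes) -> None:
    """Copy an incoming message into the assembly buffer, with the check."""
    if len(message) > len(buffer):      # the check careless programmers omit
        raise ValueError("message longer than buffer: rejected")
    buffer[:len(message)] = message     # safe: the message is known to fit

buf = bytearray(BUFFER_SIZE)
receive(buf, b"hello")                  # fits: accepted
try:
    receive(buf, b"A" * 1000)           # over-long: rejected, nothing copied
except ValueError as e:
    print(e)
```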
Visual Basic
Any operating system that expects to gain popular acclaim is going to need to provide some way for people to carry out this kind of programming easily. There are not enough technically trained software engineers in the world to meet the demands created by the ubiquity of computers, so a non-technical programming language is needed. Since one of Microsoft's earliest products was a popular version of Basic for microcomputers, it is hardly surprising that they chose Basic as the basis for this language. Visual Basic is a highly extended version of Basic, designed particularly to work with the Windows operating systems, performing data-processing and interface tasks for non-technical programmers.

Edsger Dijkstra, one of the universally recognized fathers of computer science, said of Basic (Basic in general; Visual Basic did not exist at the time): "It is practically impossible to teach good programming to students that have had a prior exposure to BASIC: as potential programmers they are mentally mutilated beyond hope of regeneration." He was partly joking. When used for its intended purpose, data transfer and the implementation of graphical user interfaces, Visual Basic is an immensely successful tool, and it is hard to argue with success. The only problems arise when dialects of Basic are used beyond their scope.

End-user programming of this kind also frees developers from having to create every program for every user. This is especially important because many Visual Basic programs are used to develop reports, and are applications that are used once only or for a short period of time.

Summary of positive issues
Visual Basic is easy to learn and widely available. It was designed to work in a Microsoft environment, but applications can be ported over to other operating systems through special software. Visual Basic programs can be developed by end users, which frees technical programmers and other resources to be used on other projects.

Summary of potentially negative issues
Visual Basic is not suitable for large, complex, or technically sophisticated projects. Visual Basic is primarily designed to run on Microsoft operating systems, and extra resources need to be deployed to run these applications in other environments.

Reference
- F. Balena (2002). Programming Microsoft Visual Basic .NET (Core Reference) (Redmond, WA, Microsoft Press).

Associated terminology: Fortran, Cobol, C++, Java.
Voice over IP (VoIP)
For most people with an internet connection, data may be transmitted to any point in the world quickly, simply, and without any additional cost. Putting those two facts together results in Internet telephony, or Voice over IP.

A computer with an internet connection, a microphone, and a loudspeaker effectively gives free long-distance and international telephone calls to any other similarly equipped computer. Because human speech remains completely intelligible under severe bandwidth limitations (3 kHz is quite good for a regular telephone call), there is no requirement for a fast internet connection; a bi-directional 3000 bytes per second, or 24 000 baud, is enough for normal telephone quality (dial-up modems usually provide the equivalent of 56 000 baud).

With even a standard-speed local area network, the bandwidth may be increased to give speech reproduction with much higher fidelity, and still carry a great many concurrent conversations, making intra- and inter-network conference calls a simple and cheap reality. With the data already in digital form, making a perfect recording of the entire conference is a trivial feature to add.

In recent years, "web phones" have emerged. These are small devices looking just like a traditional telephone, but containing a microprocessor and network-connectivity hardware. These devices may be plugged directly into a DSL, cable modem, or other connection, and provide internet-based telephone service without the need for a computer.

VoIP, an acronym for voice over internet protocol, refers to protocols that run on top of IP (internet protocol, the main transport for internet traffic) to implement internet telephony. This is a rapidly growing market, and new protocols are appearing very frequently. A currently popular direction is to combine internet telephony with Peer-to-peer technology.

Internet telephony is not the panacea it may seem to be. Unlike the Circuit-switched regular telephone network, internet traffic is Packet-switched. This means that various parts of a single transmission may be transmitted by different routes and at different speeds, and thus might not reach the receiver at the ideal rate. When this happens, the result is either an extremely annoying lag in reception, or the sound becomes "choppy" or stuttering, which can render speech incomprehensible. Another minor point for consideration is safety: the regular telephone network has its own power source; after natural disasters or during bad weather there may be power outages, but often the telephones still work. Computers need an operating power supply, so they cannot provide telephone service during power outages.

Business value proposition
The use of VoIP technologies is growing as they become more advanced, broadband reaches high degrees of penetration, and interface technologies become easier and more convenient to use.

The basic calculation of the total cost of ownership is based upon the need to have a physical connection, a computing device, and a "telephone" interface. For a large organization in which the majority of the work force uses a computer in performing everyday business tasks, the addition of VoIP carries a low overhead when only straightforward telephone connections are required. However, more complex requirements such as call waiting, voice-mail boxes, call forwarding, and teleconferencing make additional demands on network bandwidth and storage capacity. The major impediment to the adoption of VoIP for small organizations is the cost of the equipment relative to other options such as cellular telephones and land lines.
W3C (the World Wide Web Consortium)
Foundation concepts: Internet, World Wide Web.
Definition: In their own words, "The World Wide Web Consortium (W3C) is an international consortium where member organizations, a full-time staff, and the public work together to develop web standards" (http://www.w3.org/Consortium/).

Overview
The W3C was founded in 1994 by Tim Berners-Lee, the inventor of the World Wide Web, and MIT, in collaboration with CERN, DARPA, and the European Commission. The organization aims to develop inter-operable technologies and provide specifications, guidelines, software, and tools to help the web reach its full potential. A primary aim of the consortium is to develop non-proprietary standards and promote inter-operability. The consortium is international in scope, has offices in fourteen locations around the world, and draws its membership from around the globe (http://www.w3.org/).

The W3C is behind many of the technologies that are in common global use, such as HTML, SOAP/XMLP, URLs, and XML.

Business value proposition
The W3C is an organization that provides leadership in the area of web technologies. Individuals and corporations can become members and, through working committees, help to develop the formulations of future web-based technologies.

Summary of positive issues
W3C provides a forum through which web-based technologies are developed. Organizations are able to join the consortium and work with the committees to innovate and create new technologies.

References
- http://www.w3.org/.
- 06902 Sophia-Antipolis Cedex, France.

Associated terminology: Hypertext, URL, XML.

Walk-through review

Foundation concept: Software development lifecycle.
Definition: A walk-through review is a peer-based review of a software development process or product.

Overview
Walk-through reviews are performed on a software component, or on an aspect of the software development process, by a "peer" group. Group members cannot be from the team that developed the product or process, and are removed from the evaluation process. The aim of the review is to examine the process or product from multiple perspectives. An example review team might be composed of engineers representing different technical disciplines providing different perspectives: a software designer, a software interface designer, and a hardware engineer. Alternatively, a review team may be formed by end users, customers, and systems analysts, performing a review along business performance lines. The goal of walk-throughs is to assess the product or process and, through a moderator, provide critical positive feedback rather than criticize the process manager, developer, or software designer.

The review process is meant to be informal and not confrontational in nature. While there are many different ways of performing the review process, the approach typically involves the developer of the software presenting the specification and the system to the peer-review team. The team then acts as surrogate customers and traces the system requirements through the development to the system that customers or
end users would receive. The team also aims to be objective in determining the usability aspects of the system, including the human factors, the interface experience, and other aspects of the software as presented to the user. The third aspect of the team's role is to ensure that the developers followed the methodology and standards required by the vendor and the customer. Having documented the outcomes of these review categories, the results are presented back to the development team, together with an evaluation of whether the system needs to undergo another walk-through review before being released for use.

Business value proposition
The objective of walk-through reviews is to provide intellectual input to a process or product quickly and effectively. Walk-through review teams can be drawn from a variety of skill bases, and the mix changed according to the needs of the problem. The team may be put together rapidly, the assessment process executed in a timely manner, and the feedback quickly presented to the process owners, completing the review cycle.

Summary of positive issues
Walk-through reviews provide a rapid validation and verification tool for managers and organizations. The teams performing the review can be put together on an "as-needed" basis. Review cycle times can be very brief, and reviews may be performed by different teams for different aspects of the lifecycle.

Summary of potentially negative issues
If reviews are too informal, the results may be inconsistent. The team structure and skill sets might not be appropriate for the task, and thus managerial care is required in team selection. The review process should be semi-formal and requires a moderator who understands the technical issues and buffers the developers and their egos from direct criticism.

References
- E. Yourdon (1989). Structured Walkthroughs (New York, Yourdon Press).
- M. Deutsch and R. Willis (1988). Software Quality Engineering: A Total Technical and Management Approach, Prentice-Hall Series in Software Engineering (Englewood Cliffs, NJ, Prentice-Hall).

Associated terminology: Software development.

Waterfall model

Foundation concept: Software development lifecycle.
Definition: The waterfall model is a software development lifecycle model.

Overview
The Waterfall model, developed by Royce in 1970, was the first software development model to address all phases of the software lifecycle, capturing each phase before "cascading" its results into the next. The model is usually drawn as a series of descending steps, drawn as boxes, with the results of a higher box flowing into the next lower one, suggesting its name, "the waterfall model."

While there is no "official" number of phases in a waterfall model, a basic version begins with a Requirements definition phase, in which the goals of the system to be developed are considered by the technical staff, the ultimate users, and the management from whose budget the system is being paid for. The notation and the level of formality used to capture the requirements are selected according to the standards enforced by the organization. The second phase is Systems and software design, in which the hardware and software process requirements are determined and considered in relation to the overall IT architecture that the system is to be incorporated within. The software design aspect leads to
the specification and design of each individual software component, and again the degree of formality of specification is determined by the organization and its policies. Stage three is Implementation and unit testing, in which the programs are coded and tested as individual entities. Stage four is Integration and system testing, in which the individual programs are tested as complete systems. The final phase is the Operational and maintenance phase, generally the longest phase of any software development, involving modifying the code, continued testing, and integration with other software programs.

The waterfall model was originally considered a one-pass development, with each stage being undertaken once. However, it was soon realized that, when an error was found or a modification was required in any stage, it was important to return to earlier stages and amend the design there, so that all stages would reflect the same model of the system. Hence the waterfall model is sometimes referred to as the Iterative waterfall model.

Business value proposition
The waterfall model has a long history, and many organizations have incorporated a version of it into their application development model. Many other techniques, such as cost modeling systems and project management systems, have been built around the waterfall model and provide an established methodology for software development.

Summary of positive issues
The model is an established approach to software development; tools and techniques associated with the waterfall model have been developed, and there is a wide user base.

Summary of potentially negative issues
The approach generally lacks formality and is not a methodology that would achieve a high-level ranking on the Software Engineering Institute's (SEI) Software Capability Maturity Model (CMM). The waterfall model in its non-iterative version does not reveal problems until the end of the lifecycle, the point at which the cost to fix an error is highest. The iterative model does facilitate earlier detection of errors, but requires an update revision of the documents and processes at each level, a task that is frequently underperformed or not performed at all.

References
- http://www.sei.cmu.edu/sei-home.html.
- I. Sommerville (2004). Software Engineering (New York, Addison-Wesley).

Associated terminology: Formal methods.

Web services

Foundation concept: World Wide Web.
Definition
1. Any service provided by a web server and accessed through a web browser.
2. A particular protocol or group of protocols used by applications to exchange data over a network.

Overview
This is one of those troublesome phrases that for many years had a perfectly obvious and straightforward meaning (a "web service" is any service provided via the World Wide Web), but was later kidnapped and used for a different meaning by a small but demanding subset of society. Since the two meanings are closely related, both being kinds of internet software technology, but very different, it can be hard to tell which meaning any given speaker intends.

1: Any service provided through the World Wide Web, in other words anything that can be accessed over the internet through a web browser, is a Web service. This covers a great multitude of things, from simple static web pages, through information services, digital music and movies, to
348
Wireless Application Protocol (WAP)
2: When applications need to interact or share data across the internet without human assistance, some common language and protocol of communication must be provided. Web services refers to the entire collection of data languages and communications protocols used by a group of applications for this purpose. Most versions of web services insist that data is represented using the XML language. Various protocols are available, including the existing standards FTP and HTTP, and the special-purpose SOAP (Simple Object Access Protocol) and CORBA (Common Object Request Broker Architecture). Additional systems such as WSDL (Web Services Description Language) and UDDI (Universal Description Discovery and Integration) are available to allow systems and applications to identify themselves and others, and to communicate how a particular web service is to be used.

Business value proposition
The development of internet and web-based technologies has resulted in the development of many different models through which businesses can interact and communicate.

The most basic form of "web service" is typically any interaction with a remote web page, such as viewing its contents, using the web site to order a product, accessing a database, or other function. The sophistication of this form of web service leads to the creation of e-commerce business models through which commerce is transacted.

A second form of web service, sometimes referred to as a "web-enabled application," is just that, an application accessible through the use of internet technologies and through which data and/or services can be accessed. For example, an employee who travels extensively on company business may remotely connect through the internet with their company's systems and use a variety of applications. Web-enabled applications have been embraced by many organizations and government entities, and provide a flexible approach to systems access. For example, the US Navy-Marine Corps Intranet initiative, commenced in 2000, aims to allow over 300 000 users access to all of the US Navy's estimated 80 000 applications over the internet.

The technologies that underlie web services also facilitated the creation of business-to-business e-commerce and data exchanges, including the provision of data synchronization engines and data pools.

Summary of positive issues
Web technologies continue to mature and their development allows greater connectivity between entities. Web services allow access to remote applications.

Summary of potentially negative issues
The terminology associated with web services is used ambiguously. Web services can be technically demanding to implement.

Reference
- H. Deitel, P. Deitel, B. DuWaldt, and L. Trees (2002). Web Services: A Technical Introduction (Englewood Cliffs, NJ, Prentice-Hall).

Associated terminology: Data pool, Internet, Protocol.
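The second meaning can be made concrete with a short sketch. The envelope below follows the SOAP 1.1 convention of wrapping an application-level message in an Envelope and Body; the operation name `GetItemPrice` and the item code are invented for illustration, not taken from any real service.

```python
# Sketch of the XML-message style of "web service" (definition 2).
# The operation and item names are hypothetical; only the SOAP envelope
# namespace is the real published one.
import xml.etree.ElementTree as ET

SOAP_NS = "http://schemas.xmlsoap.org/soap/envelope/"

def build_request(item_id: str) -> str:
    """Wrap an application-level request in a minimal SOAP-style envelope."""
    envelope = ET.Element(f"{{{SOAP_NS}}}Envelope")
    body = ET.SubElement(envelope, f"{{{SOAP_NS}}}Body")
    request = ET.SubElement(body, "GetItemPrice")   # hypothetical operation
    ET.SubElement(request, "ItemId").text = item_id
    return ET.tostring(envelope, encoding="unicode")

def extract_item_id(xml_text: str) -> str:
    """A receiving application parses the envelope; no human ever sees it."""
    root = ET.fromstring(xml_text)
    return root.find(f"{{{SOAP_NS}}}Body/GetItemPrice/ItemId").text

message = build_request("W-1234")
```

The point of the exercise is that both ends are programs: the XML is built by one application and picked apart by another, with no browser or human in the loop.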
Wireless Application Protocol (WAP)

Foundation concept: Wireless network.
Definition: The Wireless Application Protocol is an open-specification communications protocol for providing information services on wireless devices.
Not to be confused with: WAP – wireless access point.

Overview
The Wireless Application Protocol (WAP) evolved out of a need for wireless providers to create a common platform through which their devices could inter-operate. WAP 1.0, which was developed under the auspices of the WAP Forum, was released in 1998 and was subsequently developed into the backwards-compatible WAP 2.0, released in January 2002.

WAP provides an environment in which web applications can run on hand-held devices using Micro-browsers (browsers designed to operate within the limited environment of a very small computer). As such, WAP is based on a six-layer protocol stack similar to the OSI standard Seven-layer model, and has been developed to support internet protocols including TCP and HTTP. Central to these layers is the Wireless Application Environment Layer, which provides the application-developer interface through which developers can write Extensible Hypertext Markup Language Mobile Profile (XHTML MP) documents that may be displayed on any device with a WAP micro-browser. WAP's WAE (Wireless Application Environment) also facilitates the development of more advanced "presentations" through other standards including WML (Wireless Markup Language), WMLScript, and WBMP (Wireless Bitmap image).

The basis of WAP operations is the standard client-server model, in which a request for service is sent over the network to the appropriate server identified by its IP address or URL. In this case, communications are wireless-oriented; the request commences as a wireless signal, and is responded to by the web server and sent back across the network.

The WAP system is capable of running on top of any operating system, including those specially written for hand-held devices with small amounts of memory and limited processing capabilities (these operating systems are often optimized, slimmed-down versions of the full operating system), as well as being optimized to maximize the relatively low data transfer rates associated with wireless systems.

Closely associated with WAP is WML (the Wireless Markup Language). WML is very closely related to HTML, the standard language for web pages, and is compatible with the XML standard, but is optimized for wireless use. Instead of using web pages, WML treats data for display as a Deck of Cards, each card containing a small amount of data, with the user expected to navigate through the deck at their own pace. Many wireless devices have severely limited display regions (such as the small screen on a cellular phone), and could not be used to view normal web pages. A common practice is to provide a WAP gateway, acting as a proxy between servers and viewers, converting standard HTML to a reduced-size WML document for convenient viewing.

Business value proposition
The WAP protocol was a major breakthrough in mobile computing since it unified vendors under a single set of protocols and avoided the frequent problem of a "land-grab" mentality under which different proprietary protocols battle it out for market dominance. WAP provides a stable protocol environment through which developers can create web presentations aimed at devices with micro-browsers, including PDAs, cell phones, and two-way radios.

WAP must be assessed from a security perspective, in that, when the system's full capabilities for strong encryption (similar to the Secure sockets layer) are used together with a secure server gateway at the recipient's side of a request, the security of the data is high. However, there is a possibility of using a weakened encryption system and disabling the security system. For organizations or even individuals using WAP-enabled devices to transfer sensitive data, careful consideration of end-to-end security needs to be made by a professional systems manager.
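The deck-of-cards model described in the overview can be illustrated with a small WML document. Because WML is XML-compatible, an ordinary XML parser can read it; the card contents below are invented for illustration, and the fragment omits the DOCTYPE prolog a real deck would carry.

```python
# A tiny WML "deck" containing two "cards". Each card holds one small
# screenful of content for a micro-browser. Content is invented; the
# prolog/DOCTYPE of a real WML file is omitted for brevity.
import xml.etree.ElementTree as ET

WML_DECK = """<wml>
  <card id="home" title="Quotes">
    <p>Welcome. Pick a stock.</p>
  </card>
  <card id="quote" title="ACME">
    <p>ACME: 101.25</p>
  </card>
</wml>"""

deck = ET.fromstring(WML_DECK)
cards = deck.findall("card")
titles = [c.get("title") for c in cards]  # one title per screenful
```

The user steps from card to card at their own pace, so the device never needs to hold or render more than one small card at a time.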
Summary of positive issues
WAP is an open standard supported by the WAP Forum. The standard is universal in hand-held devices that support micro-browsers. WAP allows developers to create presentations specifically for micro-browsers. WAP supports a form of Digital certificates and other security enhancements.

Summary of potentially negative issues
WAP offers a development and presentation environment that is limited by processing hardware and memory, and by the data transfer rates of the wireless system. WAP has some security concerns associated with the WTLS (Wireless Transport Layer Security) protocol, which encrypts the data transmitted, and governs the security at the WAP gateway where the server connects to the network.

References
- www.wapforum.org.
- C. Arehart and N. Chidambaram (eds.) (2000). Professional WAP (Chicago, IL, Wrox Press).

Associated terminology: Network, LAN, Protocol, HTML, XML, Security, Digital certificate.

Wireless network

Foundation concepts: Network, LAN.
Definition: A local area network in which the connections between computers use not wires, but radio-frequency broadcasts.

Overview
Electrical signals naturally flow along metal wires; they require no special transmitters or receivers, and are exceptionally cheap and reliable. They are also highly directional: a signal inserted at one end of a wire flows straight to the other end, and does not leak out in unwanted directions. This makes metal wires seem like the ideal medium for inter-computer data connections, and indeed most network connections are in the form of normal electrical cables.

Metal wires are not perfect; they get in the way, trip people up, and get broken and tangled. If the public is to be given access to a network, connectors and sockets have to be provided, and the public will quickly break them or fill them with foreign substances. Wires also prevent mobility; it is hard to move around freely while using a computer that is connected by a cord to the wall.

When mobility or public access is required, or under other circumstances when wires may be undesirable, a wireless network is of course the solution. On a wireless network, every device must be fitted with a transceiver (a transmitter and receiver in one) that converts between the native electronic signals of the computing device and the radio-frequency transmissions used by the wireless network. This additional equipment adds some small expense, but at least there is a single universally adopted standard, or at least a family of mutually compatible standards (in the IEEE 802 suite), so wireless network adapters are as standard as the ethernet adapters that they replace. Unless a wireless network is to be totally isolated and not part of the internet, there must also be a Base station or Wireless access point, to provide a gateway.

Wireless networks are not well suited to all circumstances. Wireless signals are not easily contained or directed. When two networks are close together, great confusion can be caused for transceivers in the area of overlap. If a base station is not properly secured, unauthorized users can "hitch a free ride" on the network, using an internet connection that someone else has paid for. Most wireless network base stations are easy to secure against unauthorized use, but it does require some user configuration, which is most often skipped. If any network traffic is not encrypted, it can easily be intercepted by an undetectable interloper. Inside buildings, walls and equipment, and even moving people, can interfere with network signals, and
World Wide Web
describes the worldwide collection of interconnected computers, the hardware that connects them, and the fundamental software and protocols that make communications between them possible. The "World Wide Web" refers to the web of interconnected documents, data, and applications that spans the internet.

The internet existed long before the World Wide Web that we know today (see Internet for details). Since the early days of ARPANET, it has been possible for anyone with internet connectivity to put any collection of data online, so that anyone else with the same connectivity may download and access them at will ("A file transfer protocol" was published by Abhay Bhushan of MIT in 1971). However, those data files were stand-alone documents: each could be viewed in any convenient way, and they may contain references to other online documents, but those references would be simple inert text as might be printed in a book. Any reader wishing to view a referenced document would have to find it, manually download it, and separately view it; not necessarily a difficult process, but not the smooth operation we are familiar with today.

The World Wide Web did not exist until the introduction of a practical Hypertext system by Tim Berners-Lee of CERN (Conseil Européen pour la Recherche Nucléaire, or the European Council for Nuclear Research) in 1990. This included HTML (the HyperText Markup Language) and HTTP (the HyperText Transfer Protocol). HTML (see Hypertext for details) is a simple language that allows the creation of documents that mix text, graphics, and other media, which includes Hyperlinks: references to other online documents that can be retrieved and displayed automatically, using the now familiar point-and-click technique. HTTP is the even simpler language that document display software (web browsers) and the remote document storage systems (web servers) use to communicate requests and their corresponding documents.

It is HTML and HTTP that make the seamless browsing of the web, or surfing of the net, possible. HTML and HTTP together allow the carefully designed graphical layouts with an almost infinite variety of fonts, text sizes, backgrounds, colors, images, and other graphical elements, and the online forms and their automatic processing on which e-commerce is founded. Of course, many other applications provide the same mixture of text and graphical elements, and many provide better control over the appearance of a page, but the fundamental thing that makes the web more than just a collection of files accessible through the internet is the ability to follow a reference to another online document anywhere in the world instantly and automatically. That is the difference between text and hypertext; it is also the difference between the internet and the World Wide Web.

In order to publish documents on the web, only one thing is required: a Web server. A web server is usually thought of as a networked computer with this designated task, but it is actually just a simple piece of software running on a networked computer. A single computer may in fact support a large number of different web servers. Web server software compatible with all popular computers and operating systems is freely available for download, but there are also commercial versions available that may provide more support. Once a web server has been installed and started on a networked computer, it is configured to make certain groups of files available for browsing, and it simply waits for requests composed in the HTTP language. Each request received is analyzed to discover which actual file or application is wanted, and, if that access is permitted, the results are transmitted back. The selection of files made available by a single web
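Just how simple the HTTP "language" is can be shown by composing a request and dissecting a response by hand. The sketch below uses only plain strings; the host name, path, and response are invented, and no network connection is made.

```python
# An HTTP/1.1 exchange reduced to its essentials: the browser's request is a
# few lines of text, and the server's reply is a status line, some headers,
# a blank line, and the document. Host, path, and reply are invented.

def build_get_request(host: str, path: str) -> str:
    """Compose the plain-text request a browser sends for one document."""
    return (f"GET {path} HTTP/1.1\r\n"
            f"Host: {host}\r\n"
            f"Connection: close\r\n"
            f"\r\n")

def parse_response(raw: str) -> tuple[int, str]:
    """Split a reply into its numeric status code and the document body."""
    head, _, body = raw.partition("\r\n\r\n")
    status_line = head.split("\r\n")[0]   # e.g. "HTTP/1.1 200 OK"
    code = int(status_line.split()[1])
    return code, body

request = build_get_request("www.example.com", "/index.html")
canned = "HTTP/1.1 200 OK\r\nContent-Type: text/html\r\n\r\n<html>Hello</html>"
status, document = parse_response(canned)
```

A web server's whole job is the mirror image: read a request like this one, find the permitted file or application it names, and send back such a response.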
access through a standard browser makes this form of collaboration tool a medium through which remote or distributed workers may contribute to a project in an asynchronous manner.

Summary of positive issues
The World Wide Web provides a popular mechanism for disseminating information across the internet. A web site can be internal to an organization, whereby content may be restricted to a workforce or group of workers. Externally facing web sites can be used to provide information or to sell products. The World Wide Web is based upon a set of open standards and protocols that continue to evolve to meet new challenges. The technologies and services associated with web development are widely available and have a wide range of capabilities; many tools are open-source and free to download. Domain names can be acquired to reflect the branding statement of the organization or individual.

Summary of potentially negative issues
The technology underpinning the World Wide Web continues to evolve and requires that the corporate or individual web master maintains a working knowledge of advances. Popular domain names may already have been acquired by others. Not all browsers have the same capabilities: micro-browsers on hand-held devices do not have the same capabilities as desktop computer-based browsers.

Associated terminology: Internet, Server-client, Hypertext, Internet protocol.

WYSIWYG

Definition: An acronym for "What You See Is What You Get," describing the visible form of a document while it is being developed. If the visible form the document takes while it is being created is the same as one produced on an output device such as a printer or through a browser then the document production system is a WYSIWYG system.

Overview
Word processors and other content-creation software may be classified as either WYSIWYG (pronounced whiz-ee-wig) or non-WYSIWYG. Word processors in the 1970s and early 1980s such as WordStar generally operated in one of two modes: Edit mode and View mode. When in edit mode, strange annotations would appear in the text (like the HTML markup tags of today, but far more arcane); users could edit the text, but it would often be hard to visualize how it would appear when printed. In view mode, the document would appear in an approximation of its final printed or display form, with roughly correct text layout, and perhaps even the right fonts, but it would not be possible to make any modifications to the document without switching back to edit mode. These were non-WYSIWYG systems. Under a WYSIWYG system there is only one mode of operation: editing is performed while viewing the document in its production form, and the document is shown in as close to its final form as the display hardware permits.

The first WYSIWYG word processor, BRAVO, was created at Xerox PARC in 1974 and this style of word processing has continued to grow in use, having been popularized by Microsoft's Word program and Adobe's FrameMaker. However, development of markup-language-based document preparation systems continues and systems such as LaTeX facilitate the production of high-quality documents and publications, especially when specialized notations and symbolic forms are required. Often a document processing system will provide two interfaces for working in HTML and similar markup languages, allowing the user to directly edit the markup tags, or to work in
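The edit-mode/view-mode distinction can be illustrated with modern HTML: the marked-up source is what a non-WYSIWYG editor displays, and the text left after removing the tags roughly approximates the viewed form. The fragment is invented, and stripping tags is only a crude stand-in for real rendering.

```python
# "Edit mode" shows the markup; "view mode" shows the rendered result.
# Tag-stripping is a deliberately crude approximation of rendering.
# The sentence is invented for illustration.
import re

edit_mode = "<p>Quarterly <b>revenue</b> rose by <i>12%</i>.</p>"

def view_mode(markup: str) -> str:
    """Remove markup tags, leaving roughly what the reader would see."""
    return re.sub(r"<[^>]+>", "", markup)

rendered = view_mode(edit_mode)
```

A WYSIWYG editor collapses the two representations into one: the author only ever works with something close to the rendered form.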
X.12
and move toward a global standard, since X.12 is widely used in the United States and EDIFACT is widely used in the European Union. The ASC X.12 Standards organization also began an initiative in 2001 to harmonize X.12 with XML-based messaging standards, but XML and X.12 remain separate approaches to document messaging.

Reference
- http://www.x12.org/x12org/index.cfm.

Associated terminology: XML.

X.25

Foundation concepts: Packet switching, Network.
Definition: X.25 is an ITU-T protocol that defines a set of standards for connecting user and network devices.

Overview
The X.25 standard was defined by the International Telecommunication Union (ITU), an organization that operates under the auspices of the United Nations to develop and coordinate global telecommunications standards. The ITU-T (the final "T" signifying the telecommunications sector group) uses a naming convention of a letter and a number separated by a period. The letter "X" signifies a group of related standards in a common area, namely data networks and open systems communications. The number "25" signifies that the standard covers the area of packet switching.

The X.25 standard applies to the first three layers (network, data, and physical) of the OSI Seven-layer model and is typically used on the networks of telecommunications companies (generally known as common carriers), and enables user devices and network devices to communicate with each other regardless of their type.

Communication over an X.25 network begins when a data terminal such as a personal computer calls another data terminal to request a joint communication session over the network. If the called terminal accepts the call, a full duplex (bi-directional) transfer of data can occur. The communication session can be terminated by either party at any time.

Business value proposition
The X.25 protocol was initially developed in 1976 to facilitate the creation of wide area networks over public telecommunications equipment. The technology is mature, having been revised in 1980, 1984, and 1988, resulting in a debugged and stable environment that provides high-quality data connectivity. X.25 has been implemented widely and products with X.25 certification are widely available.

Summary of positive issues
The technology is mature, stable, and provides high-quality connections. Products adhering to the standard are widely available. X.25 applies to dial-up connection methods.

Summary of potentially negative issues
The X.25 protocol incurs a turn-around delay on messaging between terminal devices. This can be detrimental to heavy bi-directional communication over an X.25 network. Line speeds of 64 kbps are too slow for many applications that require higher levels of bandwidth to perform their tasks usefully.

References
- http://www.itu.int/home/.
- M. Clark (2001). Networks and Telecommunications (New York, John Wiley and Sons).
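The call lifecycle described in the X.25 overview (call request, acceptance, full-duplex data transfer, clearing by either party) can be sketched as a toy state machine. The state names echo X.25's virtual-call terminology, but the code is only an illustration, not an implementation of the standard's packet formats.

```python
# Toy model of an X.25 virtual call: a terminal requests a call, the called
# terminal accepts, data flows both ways, and either party may clear.
# Illustrative only; real X.25 defines binary packet formats and much more.

class VirtualCall:
    def __init__(self):
        self.state = "READY"

    def call_request(self):           # calling terminal sends CALL REQUEST
        assert self.state == "READY"
        self.state = "CALL_SENT"

    def call_accepted(self):          # called terminal sends CALL ACCEPTED
        assert self.state == "CALL_SENT"
        self.state = "DATA_TRANSFER"  # full duplex: both sides may now send

    def send_data(self, payload: str) -> str:
        assert self.state == "DATA_TRANSFER"
        return payload                # stands in for a DATA packet

    def clear(self):                  # either party may send CLEAR REQUEST
        self.state = "READY"

call = VirtualCall()
call.call_request()
call.call_accepted()
echo = call.send_data("invoice 42")
call.clear()
```

The turn-around delay noted under the negative issues arises in exactly this exchange: each transition costs a round trip before data can flow.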
XML

Definition: The Extensible Markup Language (XML) is a standard language for describing and encapsulating data.

<?xml version="1.0"?>
<purchase_order>
  <description>
    <order_type>"Urgent Shipment"</order_type>
    <Destination_Address>"Head Office"</Destination_Address>
    <Item_Description>"Large Widget"</Item_Description>
  </description>
</purchase_order>

The names and placements of the tags are purely at the discretion of the individual designer. The structural details of the tags,

Summary of positive issues
The XML approach to data description and transfer is flexible, cross-platform, and widely utilized by software vendors. The language is supported by W3C and continues to evolve, supporting encryption, communicating with web services, and facilitating end-to-end transactions.

Summary of potentially negative issues
XML may not be used for the transmission of healthcare data in the United States, because HIPAA mandates the use of X.12N
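Because the purchase-order document above is well-formed XML, any standard parser can extract its fields regardless of what names the designer chose. The sketch below repeats the same invented order data and reads it with Python's standard library.

```python
# Reading the designer-defined purchase-order tags with a standard XML
# parser. The document repeats the invented example from the text.
import xml.etree.ElementTree as ET

ORDER = """<?xml version="1.0"?>
<purchase_order>
  <description>
    <order_type>"Urgent Shipment"</order_type>
    <Destination_Address>"Head Office"</Destination_Address>
    <Item_Description>"Large Widget"</Item_Description>
  </description>
</purchase_order>"""

root = ET.fromstring(ORDER)
item = root.find("description/Item_Description").text
dest = root.find("description/Destination_Address").text
```

This is XML's central bargain: the tag vocabulary is private to the designer, but the syntax is universal, so off-the-shelf tools can always read the structure.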