
Data Security*

DOROTHY E. DENNING AND PETER J. DENNING


Computer Science Department, Purdue University, West Lafayette, Indiana 47907

The rising abuse of computers and increasing threat to personal privacy through data banks have stimulated much interest in the technical safeguards for data. There are four kinds of safeguards, each related to but distinct from the others. Access controls regulate which users may enter the system and subsequently which data sets an active user may read or write. Flow controls regulate the dissemination of values among the data sets accessible to a user. Inference controls protect statistical databases by preventing questioners from deducing confidential information by posing carefully designed sequences of statistical queries and correlating the responses. Statistical data banks are much less secure than most people believe. Data encryption attempts to prevent unauthorized disclosure of confidential information in transit or in storage. This paper describes the general nature of controls of each type, the kinds of problems they can and cannot solve, and their inherent limitations and weaknesses. The paper is intended for a general audience with little background in the area.

Keywords and Phrases: security, data security, protection, access controls, information flow, confidentiality, statistical database, statistical inference, cryptography, capabilities, public-key encryption, Data Encryption Standard, security classes

CR Categories: 4.35, 4.33, 1.3

INTRODUCTION

The white-collar criminal, the old adage goes, is a man who has learned to steal with a pencil. In the last decade or two, a few of the more perceptive of these entrepreneurs have discovered that the rewards are greater and the risks lower if they steal with a computer. There have already been some spectacular thefts but, so far, computer criminals have been handicapped by a lack of technical know-how and by a certain inability to think big. The sums involved in typical cases of computer abuse would have been front-page news had they been stolen by armed desperados but have generally been smaller than the haul that might have been made by someone with more expertise and boldness.

The records of hundreds of cases of computer abuse have been analyzed by Parker [PARK76]. Parker believes many more cases probably remain undetected or unreported. Banks in particular are not eager to acknowledge that they have been embezzled. The median loss in reported cases was almost $500,000, and the total known loss from all computer crime has been about $100 million annually. These figures are destined to rise unless effective countermeasures are taken against the more expert attacks of the second generation of computer criminals, who are now learning their trade.

About 40 percent of reported abuses were data entry problems. Most of the rest were thefts or embezzlements by a trusted employee who misused his access to the computer. A few were malicious pranks or sabotage. Nearly all the known cases involve breaches of external security. So far, very few computer crimes have involved breaches of internal security: design flaws

* This work was supported in part by National Science Foundation Grant MCS77-04835.

Permission to copy without fee all or part of this material is granted provided that the copies are not made or distributed for direct commercial advantage, the ACM copyright notice and the title of the publication and its date appear, and notice is given that copying is by permission of the Association for Computing Machinery. To copy otherwise, or to republish, requires a fee and/or specific permission.
© 1979 ACM 0010-4892/79/0900-0227 $00.75

Computing Surveys, Vol. 11, No. 3, September 1979


CONTENTS

INTRODUCTION
1. ACCESS CONTROLS
   Controls for Transaction-Processing Systems
   Controls for General Purpose Systems
   Example of an Object-Dependent Design
   Limitations of Access Controls
2. FLOW CONTROLS
   Flow Policies
   Mechanisms
   Limitations of Flow Controls
3. INFERENCE CONTROLS
   Controls on Query Set Sizes and Overlaps
   Rounding and Error Inoculation
   Random Samples
   Limitations of Inference Controls
4. CRYPTOGRAPHIC CONTROLS
   The Data Encryption Standard (DES)
   Public-Key Encryption
SUMMARY
ACKNOWLEDGMENTS
REFERENCES

within the computer system itself. But the rapid proliferation of computers and the increasing sophistication of users make businesses and individuals increasingly vulnerable to abuse by computer experts. As the potential rewards increase, so will the sophistication of attacks on computer systems.

An expert criminal, for example, might intercept electronic-funds-transfer messages between two banks; within a few hours he could steal, without a trace, several millions of dollars. An investigative reporter might deduce from questions answered by a medical information system that a senatorial candidate once took drugs for depression; if published, this information might force the candidate to withdraw from the election even though he had been cured. An employee of a government agency might blackmail citizens using information purloined from a confidential data bank accessible over a federal computer network.

These three speculations represent breaches of internal security. Internal safeguards for data security have been actively studied since the early 1960s, and in anticipation of future security threats this work has been intensified in the last few years. Systems designers and engineers are developing hardware and software safeguards, and theoreticians are studying the inherent complexity of security problems. Although we have made considerable progress, there is still a wide gap between the safeguards that can be implemented in the laboratory--safeguards well within the reach of current technology--and those available in most commercial systems. Some of the safeguards that users want are theoretically impossible or would be prohibitively expensive.

This last point is probably the most important. Absolute security is no more possible in computer systems than it is in bank vaults. The goal is cost-effective internal safeguards, sufficiently strong that computer hardware and software are not the weakest links in the security chain.

In this paper we summarize current research in internal security mechanisms, how they work, and their inherent limitations. Internal security controls regulate the operation of the computer system in four areas: access to stored objects such as files, flow of information from one stored object to another, inference of confidential values stored in statistical databases, and encryption of confidential data stored in files or transmitted on communications lines. The problems these four types of mechanism attempt to control are illustrated in Figure 1.

We have not attempted to treat external security controls, which affect operations outside the main computing system; examples are personnel screening, limiting access to the computer room and to certain terminals, fire protection, and protection of removable media against destruction or theft. Some security mechanisms lie at the interface between users and the system; examples are user authentication, password management, security monitoring, and auditing. These are discussed only in relation to internal security mechanisms. Neither have we attempted a treatment of privacy and the law. Our objective is simply an overview of four areas of security research.

Other papers and books that treat internal controls are ANDE72, GRAH72, HOFF77, HSIA78, MADN79, POPE74, SALT75, and SHAN77. The book DEMI78 is a collection of recent papers on security research.




FIGURE 1. Four kinds of security controls.

FIGURE 1a. ACCESS. Jones can be permitted to read file Y and write in file X; he has no access to file Z.

FIGURE 1b. FLOW. Denied access to file Y, Smith gets confederate Jones to make a copy; flow controls could prevent this.

FIGURE 1c. INFERENCE. A questioner uses prior knowledge to deduce confidential information from a statistical summary; inference controls could prevent this.

FIGURE 1d. ENCRYPTION. Jones illicitly obtains a copy of file X; but its encrypted contents are meaningless to him.
Overviews of external controls are given in MADN79, NIEL76, PARK76, SALT75, and SHAN77. The general concepts of all controls are reviewed in GAIN78. The issues of privacy and the law and their technological implications are discussed in PARK76, RPP77, TURN76, and WEST72.

1. ACCESS CONTROLS

Access controls regulate the reading, changing, and deletion of data and programs.



These controls prevent the accidental or malicious disclosure, modification, or destruction of records, data sets, and program segments. They prevent malfunctioning programs from overwriting segments of memory belonging to other programs. They prevent the copying of proprietary software or confidential data.

Many access control systems incorporate a concept of ownership--that is, a user may dispense and revoke privileges for objects he owns. This is common in file systems intended for the long-term storage of one's own data sets and program modules. Not all systems include this concept; for example, the patient does not own his record in a medical information system. Access control systems for owned objects must efficiently enforce privileges that are added, changed, or revoked.

The effectiveness of access controls rests on three assumptions. The first is proper user identification; no one should be able to fool the system into giving him the capabilities of another. Authentication schemes based on passwords are common and simple, but they need safeguards to thwart systematic penetration [GAIN78, MORR78, PARK76, SALT75, SALT78]. Schemes based on identifying personal characteristics such as voiceprints or dynamic signatures are more reliable, but more expensive. The second assumption is that unanticipated observers do not gain access by stealing tapes or disk packs or by wiretapping. The usual safeguard is encryption, which is discussed later--information that could be monitored by strangers is scrambled. The third assumption is that privilege-information is heavily protected; this is all the information that specifies the access each program has to objects in the system. No user's program can write into the segment containing its own privilege specifiers. Privilege-information is accessible only to authorized programs of the supervisor, and the privilege to call these programs is itself controlled.

The following subsections consider two important classes of access control mechanisms for transaction-processing systems and for general purpose programming systems. We intend our treatment as a guide to the literature, not a detailed study of the many trade-offs that must be faced in practice.

Controls for Transaction-Processing Systems

The commands issued by the user of a transaction-processing system are calls on a small library of "transaction programs" that perform specific operations, such as querying and updating, on a database [DEAN71]. The user is not allowed to write, compile, and run arbitrary programs. In such systems the only programs allowed to run are the certified transaction programs. Therefore it is possible to enforce the rules of access at the interface between man and machine.

A database management system is an example. A user can identify a set of records by a "characteristic formula" C, which is a logical expression using the relational operators (=, ≠, <, etc.) and the Boolean operators (AND, OR, NOT); these operators join terms which are indicators of values or compositions of relations. An example is

C = "FEMALE AND PROFESSOR OR (SALARY ≥ $20K)."

The transaction program looks up a formula R specifying restrictions that apply to the given user; it then proceeds as if the user had actually presented a formula C AND R. The concept of adding to the requests of a user constraints that depend on that user is common in data management systems [BONC77, STON74].

This form of access control is potentially very powerful. The restrictions R may include data-dependent restrictions, which are also functions of the current values of the data [CONW72], or history-dependent restrictions, which are functions of the records previously accessed [HART76]. Implementing these kinds of restrictions can be very difficult. We refer the reader to HSIA78 for details.

When the system allows owners of records to revoke privileges that may have been passed around among users, it must be designed to revoke also any privileges that emanated from the revoked privilege. Griffiths, Wade, and Fagin have studied a revocation method that stamps each privilege-specifier with the time of its creation [GRIF76, FAGI78].
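The C AND R technique just described can be sketched in a few lines. The record layout, user names, and restriction formulas below are illustrative assumptions; the paper fixes no concrete query language, so formulas are represented here simply as predicates over a record.

```python
# Sketch of the C AND R restriction technique: a transaction program
# answers a user's query C as if the user had presented C AND R,
# where R is a per-user restriction formula it looks up itself.

def satisfies(record, formula):
    """Evaluate a characteristic formula against one record."""
    return formula(record)

# Hypothetical per-user restriction formulas R.
RESTRICTIONS = {
    "smith": lambda r: r["dept"] == "EE",       # Smith may see only EE records
    "jones": lambda r: r["salary"] < 20000,     # Jones may see low salaries only
}

def query(user, records, c):
    """Answer query C as if the user had presented C AND R."""
    r = RESTRICTIONS.get(user, lambda rec: False)   # unknown user: no access
    return [rec for rec in records if satisfies(rec, c) and satisfies(rec, r)]

records = [
    {"name": "Ada", "dept": "EE", "rank": "PROFESSOR", "salary": 25000},
    {"name": "Bob", "dept": "CS", "rank": "PROFESSOR", "salary": 30000},
]

# C = "PROFESSOR AND (SALARY >= $20K)"
c = lambda r: r["rank"] == "PROFESSOR" and r["salary"] >= 20000
print([r["name"] for r in query("smith", records, c)])   # prints ['Ada']
```

Smith's query is silently narrowed to EE records; Jones, restricted to salaries under $20K, gets an empty answer to the same query.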



Controls for General Purpose Systems

General purpose systems permit users to write, compile, and run arbitrary programs. It is not possible to certify a priori that arbitrary programs will forever meet the (changing) access rules of the system, or that there will never be program failures or equipment malfunctions. Therefore, these systems provide access control mechanisms as part of the run-time environment, often with hardware support for performing access checks in parallel with the main computation. These mechanisms are typically based on object-dependent controls (as opposed to data-dependent controls), which regulate access to an object irrespective of the values stored in that object.

Object-dependent controls are also needed in transaction-processing systems to protect against faulty transaction programs, equipment malfunctions, and intruders--problems that cannot be prevented simply by "certifying" the transaction programs.

Object-dependent controls can be enforced by type-checking in compilers for languages with abstract data types, or by run-time checking in the hardware. A proposal for compiler-based enforcement is given by Jones and Liskov [JONE76]. An illustration of hardware-based enforcement is given in the next section.

Example of an Object-Dependent Design

This section illustrates the architecture of object-dependent access controls. The central concept, capability addressing, has been the subject of considerable research. Detailed treatments of this design and its trade-offs can be found in DENN76b, FABR74, LIND76, ORGA72, ORGA73, and SALT75.

Most systems have a large number of segments (data sets and programs) which are kept in the slow-speed bulk store under the auspices of a file system. When an active program requires access to a segment, the operating system provides that program with a "capability" for the segment. All of a program's capabilities are stored in a "capability list," which is used by the addressing hardware when interpreting the program's virtual addresses. We first describe the operation of the addressing hardware. Then we describe how a program acquires capabilities.

FIGURE 2. A descriptor addressing a segment of memory.

FIGURE 3. Multiple-segment storage.

Figures 2-4 summarize the mechanism for verifying attempted accesses to segments stored in the main memory. Figure 2 shows a copy of a 20-word segment stored in memory at the beginning (base) address 10. A descriptor of the form (B, L) records the base address B and length L of the segment. Programs refer to words in segments by displacements (line numbers).



FIGURE 4. Capability lists for accessing segments.
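The checking that Figures 2-4 illustrate can be sketched briefly. The descriptor and capability entries below are illustrative assumptions, not the exact values drawn in the figures.

```python
# Sketch of capability-based address checking (Figures 2-4).
# Descriptor table: key -> (present, base B, length L); contents are made up.
DESCRIPTORS = {3: (True, 50, 25), 7: (False, None, None)}

# One program's capability list: segment number S -> (access code, key K).
CAP_LIST = {0: ("RW", 3), 1: ("R", 7)}

def read(s, d):
    """Validate 'Read S, D' and return the memory address B + D."""
    if s not in CAP_LIST:
        raise PermissionError("no capability for segment %d" % s)
    access, key = CAP_LIST[s]
    if "R" not in access:
        raise PermissionError("capability lacks R access")
    present, base, length = DESCRIPTORS[key]
    if not present:
        # In the paper's scheme this "missing segment fault" suspends the
        # program while the operating system loads the segment.
        raise RuntimeError("missing segment fault")
    if not 0 <= d < length:
        raise IndexError("displacement out of range")
    return base + d

print(read(0, 4))   # prints 54: base 50 plus displacement 4
```

Note that a program never sees base addresses: relocating a segment changes only its descriptor, so every capability holding that key remains valid.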

The command "Read D" refers to the Dth line of the segment--that is, memory address B + D. A reference is valid only if the displacement is in range--that is, if 0 ≤ D < L.

Figure 3 shows that descriptors of all memory segments may be kept in a descriptor table. Each descriptor is identified by a unique key K. Now a program refers to a segment by specifying the displacement and the key--thus "Read K, D" refers to the Dth word in the segment whose key is K. Each descriptor is stored with a "presence bit" that tells whether the associated segment is in the main store; if it is not, as for key 7 in Figure 3, an attempted reference will trigger a "missing segment fault" that will cause the operating system to suspend the program and load the segment. This scheme confers considerable flexibility because the operating system can freely move segments between main and secondary memory merely by updating the descriptors. Because the keys do not change, no program is affected by relocations of segments in the memory hierarchy.

Figure 4 shows the final step in the scheme: the capability lists are associated with programs. The access code of a capability specifies one or more kinds of access: read (R), write (W), execute (E), or call (C). Read access permits a program to copy a value out of a memory segment; write access permits a program to store a value in a memory segment; execute access permits a processor to fetch instructions from a segment; and call access permits a program to execute a procedure call instruction with that segment as the target. Execute access is valuable for restricting access to privileged operations. A program actually refers to a segment by a segment number S and a displacement D. Thus "Read S, D" refers


to the Sth segment in the capability list of the program. The reference is valid only if the Sth capability specifies R-access and contains key K, with D being within the range specified in the Kth descriptor.

This strategy has special advantages when segments are shared. Each program can be given its own capability, with its own access code, for the common segment; the common segment can be used by different programs whose authors require no prior agreement on the local segment numbers. In Figure 4, programs M and N share the segment with key 3.

With this mechanism each program module can have its own capability list. In Figure 4, program N's call access for program P is stored in the tenth segment of N. If program N executes the command "Call 10," control will pass to program P. This resembles a standard subroutine call, but with the important difference that the called program has a capability list different from that of the caller. Program P, whose first capability designates the memory segment containing the descriptor table, would be a certified program of the operating system; it would screen all requests to update descriptors. When program P executes a "Return" command, program N resumes execution after the call and its capability list is again in control.

The concept of giving each program its own set of capabilities supports the principle of least privilege [DENN76b, LIND76, SALT75]. Each capability list need contain entries only for the segments required for the associated program to carry out its task. Damage is confined in case the program contains an error. Untrusted programs can be encapsulated and cannot endanger unrelated programs. Critical data, such as the descriptor table of Figure 4, can be hidden away, tamperproof, in a domain accessible only to the program certified to manipulate it.

The foregoing discusses how capabilities are used to limit access. We turn now to the question of how programs obtain their capabilities in the first place.

Figure 5 illustrates how a file system attaches privilege-specifiers to permanent files. All users are registered in a master directory. Each user lists all his files in a personal directory, of which each entry specifies the file's name (N), length (L), address in the bulk store (BA), a list of authorized users (AL), and the file's unique identifier (K). Each entry in an authorization list specifies the name and type of access permitted of some individual designated by the owner. The figure shows that Jones has granted read (R) permission for his file Y to Smith and himself, write (W) permission to himself alone, and no access to Cox. If a program owned by Smith attempts access to Jones's file Y, the operating system will intervene and insert a capability (R, K) at some position S in the capability list of Smith's program; thereafter, Smith's program can access the file by issuing read commands with the segment number S. This is the essence of the scheme used in MULTICS [ORGA72, SALT75, SCHR72].

It is also possible to create a program's capability list during compilation. This is natural in an environment where the program modules correspond to managers of extended-type objects, and the capabilities point to the components of a particular object. This view, proposed by Dennis and Van Horn [DENV66], is used in the Hydra system [COHE75], the CAP system [NEED77], and the Plessey System 250 [ENGL74]. (See GEHR79 for a review.)

Some systems permit owners to revoke privileges. If all privilege-specifiers are stored in a central table, it is a relatively simple matter to purge them [GRIF76, FAGI78]. But if they are scattered throughout the system in capability lists, revocation becomes considerably harder: the descriptor table must contain chains of descriptors that can be broken by an owner; this renders revoked capabilities useless but does not purge them [NEED77, REDE74, SALT75].

Limitations of Access Controls

Most security flaws in existing systems are the consequences of design shortcuts taken to increase the efficiency of the operating system. Hardware designed to support access control efficiently would go a long way toward removing these flaws. An example is the high overhead in managing small memory segments. Users are forced to pack data and subprograms into large segments, which means that small program blocks cannot be individually protected.


FIGURE 5. Access controls for permanent files.
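The grant step of the Figure 5 scheme can be sketched with the Jones/Smith/Cox example; the directory layout used in code is an illustrative assumption.

```python
# Sketch of granting a capability from a file's authorization list (Figure 5).
# Jones's directory entry for file Y: length L, bulk address BA,
# authorization list AL, and unique key K. Values here are made up.
JONES_DIR = {
    "Y": {"L": 150, "BA": 7000, "K": 9,
          "AL": {"Jones": "RW", "Smith": "R"}},   # Cox: no entry, no access
}

def open_file(user, filename, cap_list):
    """On first access the operating system consults the authorization
    list; if the user appears there, it inserts a capability
    (access, K) into the user's capability list and returns its
    position S, usable thereafter in 'Read S, D' commands."""
    entry = JONES_DIR[filename]
    access = entry["AL"].get(user)
    if access is None:
        raise PermissionError("%s may not access %s" % (user, filename))
    s = len(cap_list)                  # next free position in the list
    cap_list.append((access, entry["K"]))
    return s

smith_caps = []
s = open_file("Smith", "Y", smith_caps)
print(smith_caps[s])    # prints ('R', 9): a read-only capability for file Y
```

A request by Cox would raise the error, since Jones granted Cox nothing; Smith receives only read access, never the write permission Jones reserved for himself.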

This makes the ideal of a separate capability list for every program very difficult to achieve. Radical changes in computer architecture, designed to bridge the "semantic gap" between concepts in the programming language and concepts in the machine language, may be needed to overcome this difficulty. Myers's SWARD machine [MYER78] and Gehringer's "typed memory" [GEHR79] point in the right direction.

Another serious problem with existing systems is excessive privilege vested in the operating system. A supervisor mode of operation takes over when the user's program calls any operating system program. The supervisor mode overrides all or most of the storage protection mechanisms. The supposedly correct and trustworthy supervisor programs can manipulate capabilities and segments without restriction [WILK68]. This difficulty is ameliorated somewhat in MULTICS, which has a linear hierarchy of supervisor states, called rings, that confer successively greater privilege; the user can operate some untrusted subprograms in rings of low privilege [SCHR72]. Contrary to the principle of least privilege, systems based on supervisor states permit programs in higher rings to run with much more privilege than they require for their tasks. There is no efficient way for two cooperating subprograms to have nonoverlapping sets of privileges.


The supervisor mode problem is an instance of exposure to the general problem of "trojan horses" [LIND76]. It arises when a subprogram written by an outsider is brought into the domain of a given user. With all the user's privileges and possibly more, the subprogram is free to wreak havoc--for example, by erasing files or entering erroneous data. A system capable of running each subprogram with its own set of capabilities offers the best practical defense against trojan horses, because the outsider's program can be confined to a domain having the least privilege required for the agreed task.

Access mechanisms as outlined here are feasible today at reasonable cost. The increasing importance of sharing information among different users of a common database, of encapsulating programs, and of limiting damage in case of error or malfunction all contribute to a growing pressure on manufacturers to build better machines.

There are additional limitations that are much more difficult to overcome: proving that a computer system continually meets its access specifications, proving that authorizations are continually consistent with owners' intentions, and proving that information stored in files and segments remains among authorized users. The possibility of hardware malfunction, which can alter information stored in the memory, makes rigorous proofs impossible because it subverts the necessary assumption that all possible changes in system state are controllable. Arbitrarily low risks of damage can be achieved only at correspondingly high investments in error-checking equipment.

Proving that a computer system continually meets its access specifications is straightforward in principle: the prover must show that all programs and hardware for enforcing the current authorizations and for permitting changes in authorizations work as specified. In practice this is easier to say than do because the correctness of many complex programs must be established [GAIN78] and because automatic "program proving" is a distant goal. Much effort has been devoted to developing formal access control models which can be elaborated into the design of a system. At SRI International, the PSOS (Provably Secure Operating System) is structured as a linear hierarchy of 14 nested abstract machines, each of which can be proved correct if the machines below are correct [NEUM77]. This system is expressed in a special language, called SPECIAL, that incorporates specifications explicitly into the programs. Several other research groups have adopted a less ambitious approach; rather than try to prove that the entire system meets all its specifications, they centralize all the operations affecting security into a system nucleus called the "security kernel." The correct operation of the kernel implies that the entire system is secure. (See MILL76, POPE78b, POPE78c, and SCHR77.)

Proving that extant authorizations are continually consistent with owners' intentions is fraught with difficulties. Many systems permit users to grant others subsets of their own privileges. In such systems an owner might well be interested in the safety problem, which seeks answers to questions of the form "Can the programs of user X ever gain read access to file Y?" Safety problems are easily answered in the special case of systems conforming to the "take-grant model" [SNYD77, LIPT78]. However, Harrison, Ruzzo, and Ullman have shown that the primitive operations of practical access control systems are sufficiently powerful to encode the state of an arbitrary Turing machine into the extant access control privileges; the halting problem is thereby reducible to the safety problem, which means that the safety problem is undecidable [HARR76]. This result is mainly of theoretical interest, for it is usually possible to answer specific safety questions. However, this result explains why it is impossible to devise a single approach for all safety questions; each one must be analyzed separately.

Proving that stored information remains among authorized users is also difficult because a user who may read a file may also make a copy, perhaps in code, which he can then pass along to someone who is denied access to the original. However, this is not a genuine defect of access controls, which are intended to regulate access to stored objects, but not what happens to the information contained in these objects. Many leaks based on copying can be eliminated by augmenting an access control mechanism with controls on information flow. This is studied next.
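Flow controls of the kind described in the next section block such copy leaks by comparing security classes before any transfer. A minimal sketch of that comparison, using the military-style classes (i, x) defined there (authority level i, set of compartments x):

```python
# Sketch of the class comparison behind flow controls (Section 2):
# information may flow from class (i, x) to class (j, y) only if
# i <= j and every compartment of x is also a compartment of y.

def flow_allowed(sender, receiver):
    i, x = sender
    j, y = receiver
    return i <= j and set(x) <= set(y)

# Examples from the flow-policy discussion: levels 1) confidential,
# 2) secret, 3) top secret; compartments written as strings of letters.
assert flow_allowed((2, "RS"), (3, "RS"))       # upward: allowed
assert flow_allowed((2, "RS"), (2, "RSC"))      # wider compartments: allowed
assert not flow_allowed((2, "RS"), (1, "RS"))   # downward: blocked
assert not flow_allowed((2, "RS"), (3, "R"))    # compartment lost: blocked
```

Under this check a copy from file X into file Y succeeds only when Y's class dominates X's, so the copy-and-pass-along leak above is stopped even though the copier holds read access to X.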



2. FLOW CONTROLS

A flow occurs from object X to object Y when a sequence of instructions that reads from X writes a value into Y. Copying file X into file Y is an example of a simple flow. Much more subtle flows are possible, as we note shortly.

Active flow control research began in the early 1970s. Most flow controls employ some concept of security class; the transfer of information from a sender to a receiver is allowed only if the receiver's security class is at least as privileged as the sender's [DENN76a]. A flow policy specifies the channels along which information is allowed to move. Flow controls can prevent a service program from leaking a customer's confidential data. They can block the transmission of secret military data to an unclassified user.

The most general flow controls monitor the detailed data flows in programs. However, such controls are often complex and hard to use efficiently. Controls based on security classes are usually efficient, though often exasperatingly conservative.

Flow Policies

The simplest flow policy specifies just two classes of information: confidential (C) and nonconfidential (N), and allows all flows except those from class C to class N. This policy can solve the confinement problem that arises when a service program handles customer data, some of which are confidential [FENT74, LAMP73, LIPN75]. The service program may retain some or all of the customer's nonconfidential data, but it must be prevented from retaining or releasing to its owner any of the confidential data. An income-tax-computing service, for example, might be allowed to retain a customer's address and the bill for services rendered, but not the customer's income or deductions.

Government and military computer systems have a more complex flow policy for classified data. Each security class is represented by two parts (i, x) where i denotes an authority level and x a category. There are usually three authority levels: 1) confidential, 2) secret, and 3) top secret. There are 2^m categories, comprising all possible combinations of m compartments; typical compartments are U (unrestricted), R (restricted), S (sensitive), and C (crypto). Information is permitted to flow from an object with security class (i, x) to one with class (j, y) only if i ≤ j and only if the compartments of x are also compartments of y. Transmissions from (2, RS) to (3, RS) or to (2, RSC) are allowed, for example, but those from (2, RS) to (1, RS) or to (3, R) are not.

Mechanisms

Simple flow controls can be enforced by an extended access control mechanism, which involves assigning a security class (usually called the clearance) to each running program. The program is allowed to read a particular memory segment only if its security class is as high as that of the segment. It is allowed to write in a segment only if its class is as low as that of the segment. This automatically ensures that no information transmitted by the program can move from a higher to a lower class. A military program with a secret clearance, for example, can read only from objects which are unclassified, confidential, or secret; it can write only into objects which are secret or top secret. It is forbidden to write into unclassified or confidential objects, or to read from top secret objects. Systems designed at SDC [WEIS69], MITRE [MILL76], Case Western Reserve University [WALT75], and SRI International [NEUM77] are of this type.

The extended access control mechanism has a tendency to overclassify data. Information flows upward but never downward. This problem can be mitigated by letting the security class of a program rise while running, so that it is only as high as the highest security information that the program has read. In this case the security class of the program is a "high-water mark" rather than a clearance. This reduces but does not eliminate overclassification because the high-water mark cannot be lowered.

An important limitation of the extended access control mechanism is its lack of generality. For example, if the income tax program mentioned earlier is nonconfidential, it will be forbidden to process confidential customer data; if it is confidential, it will be forbidden to write nonconfidential information into any of its files.

Computing Surveys,Vol. 11, No. 3, September 1979


Data Security 237
operating system could not be secured by this mechanism--all its outputs would have to be classified at least as high as every class of information stored within. This limitation results from the implicit assumption that any input of a program can flow to any output, which forces the designer to assume that the confidentiality of each output is as high or higher than the confidentiality of every input.

To be free of this limitation, flow controls must be able to examine the way information flows through the statements and variables of a program, determining precisely how inputs flow to each output. This is not straightforward. Suppose that x is 0 or 1 when this statement is executed:

    if x = 0 then y := 0 else y := 1.

This statement copies x to y implicitly by encoding the value of x into the control flow. (Note that this program still transmits some information from x to y even if the initial value of x is unknown.) Implicit flows of this type can be easily detected by associating with the program counter a dynamic security class corresponding to the Boolean expressions which influence it.

Here is a program fragment specifying a flow that is more difficult to detect:

    while x ≠ 0 do skip; print('done'); stop.

This program will print 'done' only if x = 0; otherwise it will enter an "infinite" loop, which actually means it will eventually be terminated abnormally. In this case the output produced by the program reveals the position of the program counter when the program stops, thereby revealing the information encoded therein. A partial solution results if one can prove that the loops of one's program all terminate [REIT78]. However, all types of abnormal program termination present problems for flow detectors [DENN76a, DENN77].

Several techniques are available for detecting and verifying internal flows of programs. The most common employ traditional methods of program verification. The allowable input/output dependencies of program modules are stated as formal assertions, which are then proved by tracing data flows within the program. This approach was used for the systems at MITRE [MILL76], UCLA [POPE78c], and SRI [NEUM77]. A simpler approach results when security classes can be declared for the variables in the program; using a technique similar to type-checking, the compiler can verify that each program statement specifies flows that are consistent with the given flow policy [DENN77]. Although highly efficient, this method is also highly conservative: it will reject programs that would be certified under a traditional flow analysis. Cohen's information-theoretic scheme relieves this difficulty by certifying all flows that will occur during some execution of the program [COHE77, COHE78]. Furtek and Millen have proposed a theory of information flow using analogs of prime implicants of switching theory [FURT78, MILL78]; it too can lead to more precise certification. Reitman and Andrews have incorporated the flow semantics of DENN77 into the formalism of program verification for more precise certification [REIT78].

The foregoing methods are based on static analysis; they certify a program prior to execution and require no run-time support. However, we do not know how to certify programs that use variables whose security classes can change during execution or whose inputs can have different security classes on different invocations. These cases require a combination of static analysis and run-time checking. A preliminary study of such a system was made by Fenton, whose "data mark machine" tagged every memory cell (and the program counter) with a security class; the processor would inhibit any instruction whose execution would violate the flow policy [FENT74].

Limitations of Flow Controls

We suggested earlier that all mechanisms based on security classes tend to overclassify information, since only upward flow is allowed. This problem can be mitigated by permitting "downgrading"--the manual lowering of security classes by an authorized person. It is also possible to permit downward flows through certain information-losing programs. Such programs are supposed to filter out enough information about their inputs that their results are of lower confidentiality. Not much is known about such programs except that, as we shall observe in the discussion of inference
control, many programs believed to filter out information actually do not.

One type of flow cannot be controlled easily, if at all. A program can convey information to an observer by encoding it into some physical phenomenon without storing it into the memory of the computer. These are called flows on covert channels [LAMP73, LIPN75]. A simple covert channel is the running time of a program. A program might read a confidential value, then enter a loop that repeatedly subtracts 1 from the value until it reaches zero, whereupon it stops. The owner of the program can determine the confidential value simply by observing the running time. More complex channels exploit other resource usage patterns such as the electric power consumed while running a program.

The only known technical solution to the problem of covert channels requires that the owner of a job state in advance what resources his job will use and how much time it will take. The requested resources are dedicated to the job, and the results, even if incomplete, are returned to the user at precisely the time specified. This strategy allows the user to deduce nothing from running time or resource usage that he did not know beforehand; but even then he can deduce something from whether his program was successfully completed. This scheme can be prohibitively expensive. Cost-effective methods of closing all covert channels completely probably do not exist.

3. INFERENCE CONTROLS

When information derived from confidential data must be declassified for wider distribution, the rules of flow control must be suspended. This is true of statistical databases (data banks) which contain sensitive information about individuals and must provide various kinds of statistical summaries about the population. The Bureau of the Census, for example, is charged by law to collect information on all citizens and to report summaries of this information without revealing any particulars. Similarly, medical information systems are supposed to produce health statistics but not to release health data about any one patient.

The problem is that summaries contain vestiges of the original information; a snooper might be able to reconstruct this information by processing enough summaries. This is called deduction of confidential information by inference. When the information pertains to an individual, the deduction compromises his privacy. The objective of inference controls is to make the cost of obtaining confidential information unacceptably high.

When invoking a query program, a questioner supplies a characteristic formula C, which is a logical formula whose terms are joined by the Boolean operators (AND, OR, NOT). The set of records whose contents satisfy formula C is called the query set for C. The query program's response is computed from the query set; it may be the count of the records, the sum of values in the records, or the selection of a value, such as a maximum or median.

A record is compromised if a questioner can deduce its confidential values by correlating responses to his queries using any prior information he might have. Most compromises are based on isolating a desired record at the intersection of a set of interlocking queries. Defenses include controls that withhold response for improper query set sizes and overlaps, controls that distort the responses by rounding or falsifying data, and controls that apply queries to random samples of the database. Examples are given in the next subsections.

Controls on Query Set Sizes and Overlaps

When the questioner has complete control over the query set and when responses are undistorted, compromise is easy. This is illustrated by a dialogue between a questioner and a medical database:

    Q: How many patients have these characteristics?
       Male
       Age 45-50
       Married
       Two children
       Harvard law degree
       Bank vice-president
    A: 1.

Suppose the questioner knows that Fenwick has these characteristics; he now attempts to discover something confidential about Fenwick from this query:

    Q: How many patients have these characteristics?
       Male
       Age 45-50
       Married
       Two children
       Harvard law degree
       Bank vice-president
       Took drugs for depression

This query will respond with "1" if Fenwick has taken drugs for depression and "0" otherwise.

The principle of this compromise is simple. The questioner finds a formula C whose query set count is 1. He can then discover whether the individual thus isolated has any other characteristic X by asking, "How many individuals satisfy C AND X?" (The response "1" indicates that X is characteristic of the individual and "0" indicates not.) This attack was first reported by Hoffman and Miller [HOFF70].

It might seem that this compromise could be prevented by a minimum query set control:

    Do not respond to queries for which there are fewer than k or more than n - k records in the query set, where n is the total number of records in the database.

The positive integer k in this control is a design parameter specifying the smallest allowable size of a query set. If the query language permits complementation, a maximum size n - k of the query set must also be enforced, for otherwise the questioner could pose his queries relative to the complement (NOT C) of the desired characteristics (C).

Unfortunately, this control is ineffective. Schlörer showed that compromises may be possible even for k near n/2 by the technique of a "tracker" [SCHL75, SCHL79]. The basic idea is to pad small query sets with enough extra records to make them answerable, then subtract out the effect of the extra records. Schlörer called the formula identifying the extra records the tracker because the questioner can use it to "track down" additional characteristics of an individual.

Suppose that a questioner, who knows from external sources that an individual I is characterized by the logical formula C, is able to express C in the form C = (A AND B) such that queries for the formulas A and (A AND NOT B) are both answerable. Schlörer called the formula T = (A AND NOT B) the tracker of I. Table 1 shows a database recording n = 8 secret political contributions.

    TABLE 1. POLITICAL CONTRIBUTION DATABASE

    Name      Sex   Occupation      Contribution ($)
    Abel      M     Journalist          3000
    Baker     M     Journalist           500
    Charles   M     Entrepreneur           1
    Darwin    F     Journalist          5000
    Engel     F     Scientist           1000
    Fenwick   M     Scientist          20000
    Gary      F     Doctor              2000
    Hart      M     Lawyer             10000

Suppose that k = 2; then responses are given only to queries applying to at least two but not more than six individuals. Suppose further that the questioner believes that C = (JOURNALIST AND FEMALE) uniquely identifies Darwin. The minimum query set control would prevent direct questioning about Darwin. The dialogue below shows how Darwin's contribution can be deduced by using as tracker the formula T = (JOURNALIST AND NOT FEMALE) = (JOURNALIST AND MALE).

    Q: How many persons are JOURNALIST?
    A: 3
    Q: How many persons are JOURNALIST AND MALE?
    A: 2

By subtracting the second response from the first, the questioner verifies that (JOURNALIST AND FEMALE) identifies only one individual (Darwin). The questioner continues with

    Q: What was the total of contributions by all persons who are JOURNALIST?
    A: $8500
    Q: What was the total of contributions by all persons who are JOURNALIST AND MALE?
    A: $3500

Since Darwin is the only female journalist, her contribution is the difference between the responses to these two sum queries ($8500 - $3500 = $5000).

It might seem that the effort to compromise the entire database is very high because the questioner would have to know identifying characteristics of each individual in order to construct a tracker for that individual. However, if a questioner can find any formula whose query set contains at least 2k but not more than n - 2k records, he can use that formula as a "general tracker" to determine the answer to any (unanswerable) query of the database [DENN79a]. Schlörer has shown that often more than 99 percent of the possible formulas will be general trackers, and that the effort to retrieve data using trackers is usually quite low [SCHL79]. It is possible to find a tracker with at most log2 S queries, where S is the number of possible distinct configurations of characteristics [DENN79b].

Tracker-based compromises employ groups of records having high overlaps. The compromise illustrated above works precisely because Darwin is the only JOURNALIST excluded from the group (JOURNALIST AND MALE). To protect against trackers, one might consider a minimum overlap control:

    Do not respond to a query that has more than a predetermined number of records in common with every prior query.

Such a control is obviously infeasible: before responding, the query program would have to compare the latest query group against every previous one. But even if feasible, this control can be subverted in databases where each confidential value appears just once [DOBK79]. This dialogue illustrates such compromise using queries that overlap by one record:

    Q: What was the largest of the contributions by persons who are JOURNALIST?
    A: $5000.
    Q: What was the largest of the contributions by persons who are FEMALE?
    A: $5000.

Because each contribution is unique, there can be only one person who is JOURNALIST, FEMALE, and contributed $5000 (Darwin). Indeed, the same compromise works even if the query program occasionally returns the contribution of the wrong person [DEMI77]:

    Q: What was the smallest of the contributions by persons who are JOURNALIST?
    A: $5000 (lying).
    Q: What was the largest of the contributions by persons who are FEMALE?
    A: $5000 (truthfully).

A minimum overlap control may also be subverted by solving a linear system of equations for an unknown data value [DOBK79, SCHW79].

These examples illustrate compromises that use the combinatorial principle of inclusion and exclusion to isolate a record. A design that can prevent this is a partitioned database [YU77]:

    Store the records in groups, each containing at least some predetermined number of records. Queries may apply to any set of groups, but never to subsets of records within any group.

With this control, attacks based on inclusion and exclusion can, at best, isolate one of the groups--but queries for single groups are allowed. "Microaggregation" is a form of partitioning: groups of individuals are aggregated into synthetic "average individuals" and statistics are computed for the synthetic individuals rather than the real ones [FEIG70]. The partitioned database has two practical limitations that can be quite severe. First, the legitimate free flow of statistical summaries can be inhibited by excessively large groups or by ill-considered groupings. Second, forming and reforming groups as records are inserted, updated, and deleted from the database leads to excessive bookkeeping.

Rounding and Error Inoculation

The second class of inference controls is based on distorting the responses. These are usually called rounding controls because the exact answer to a query is perturbed by a small amount before it is reported to the questioner.

Rounding by adding a zero-mean random value is insecure, since the correct answer can be deduced by averaging a sufficient number of responses to the same query.
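The averaging attack on zero-mean rounding is easy to sketch. The fragment below is a toy of our own devising (the true answer, noise range, and sample count are all illustrative assumptions): because the noise has mean zero, the sample mean of repeated responses converges on the exact answer.

```python
# Why zero-mean random perturbation is insecure: average many responses.
import random

random.seed(1)
TRUE_ANSWER = 8500   # hypothetical exact sum for some fixed query

def perturbed_response():
    # each response adds fresh zero-mean noise in [-100, +100]
    return TRUE_ANSWER + random.uniform(-100, 100)

estimate = sum(perturbed_response() for _ in range(10000)) / 10000
print(round(estimate))   # a value within a few units of 8500
```

Pseudorandom perturbation that depends on the data defeats this particular attack because every repetition of the query returns the identical response.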
Rounding by adding a pseudorandom value that depends on the data is preferable, since a given query always returns the same response. Although reasonably effective, this method can sometimes be subverted with trackers [SCHL77], by adding dummy records to the database [KARP70], or simply by comparing the responses to several queries [ACHU78].

A perturbation can also be achieved with error inoculation: the value in a record is randomly perturbed or replaced by another value before it is used to compute a statistic [BECK79, BORU71, CAMP77]. Data could be modified before being stored in a record, in which case the original data are discarded. This can have serious adverse consequences if the correct data are supposed to be available for authorized users of the database. Alternatively, a "perturbation factor" can be stored permanently in the record along with the original data; it is applied when the data are used in a query. Like rounding, error inoculation reduces the quality of the statistics released. To prevent compromise, large errors may have to be introduced into the data.

A variation of this approach, which may yield more accurate statistics, is data swapping: the values of fields of records are exchanged so that all i-order statistics (which involve i attributes) are preserved for i ≤ d and some d [DALE78, SCHL78]. Even if a questioner succeeds in isolating a value, he has no way of knowing with which individual it is actually associated. The problem with the approach is the difficulty of finding sets of records whose values can be swapped. It remains to be seen whether this control can be cost effective.

Random Samples

The third group of controls is based on applying queries not to the entire database but to a "random sample"--a subset of records chosen at random. This group is potentially the most effective because it deprives the questioner of control over the contents of query sets. The 1960 U.S. Census, for example, was distributed on tape as a random sample of one record in one thousand; each sample record contained no name, and it specified location only by size of city in one of nine geographic regions [HANS71]. The cleverest snooper would have at best 1/1000 chance of associating a given sample record with the right individual. These odds are too poor for compromise of the sample to be worthwhile.

Commercial data management systems now permit the construction of small- to medium-scale databases which change constantly through insertion, deletion, and modification of records. A small random sample would be useless because it would not be statistically significant and because it would not represent the current state of the database. For these reasons random samples have been ignored as an inference control in such databases.

However, when combined with a minimum query set control, random sample queries can be an effective defense [DENN79c]. The scheme works as follows. As it locates each record satisfying a given formula C, the query program determines whether or not to keep that record for the "sampled query set." This determination should ideally be pseudorandom so that the same sampled query set is computed any time the given formula C is presented. Each queried record is selected independently for the sampled query set with a given, fixed probability. Statistics then are computed from the sampled query set. A minimum query set control inhibits the response if the true query set's size is too small or too large.

With a simulated database and p = 0.9375, this method estimated counts (and sums) of answerable query sets to within 1 percent of their true values [DENN79c]. However, the estimates of counts (and sums) obtained with trackers contained errors of several hundred percent; this is because the questioner must estimate small counts (or sums) by subtracting large counts (and sums). (An illustration: the questioner tries to find the count for C by subtracting the response 100 from the response 101, both of which have error 1; the difference, 1, has error 2.)

Limitations of Inference Controls

Because queries inevitably carry some information out of the database, one cannot reasonably hope to design a system that is impossible to compromise. The objective of research on inference controls is to discover how much computational work, measured

FIGURE 6. Encryption using one key (traditional view): plaintext M is enciphered at the sender, crosses the insecure channel as ciphertext, and is deciphered at the receiver; the key itself travels over a separate secure channel.

by computer time or by dollars spent, would be required to compromise a particular system. This would enable designers either to raise the cost of compromise beyond a specific threshold, or to declare that the desired level of security could not be implemented with the given cost constraints.

The query functions of many commercial database systems seem to release much more information about confidential records than many people have suspected. Without proper controls, databases subject to simple compromises will be the rule rather than the exception. When combined with minimum query set restrictions, random sample queries appear to offer the best defense.

A final defense against snooping is threat monitoring--inspection of logs or audit trails for unusual patterns of queries, especially many queries for the same records [HOFF70]. Although it does not attempt to control the flow of information through query programs, monitoring threatens exposure of illegal activity.

4. CRYPTOGRAPHIC CONTROLS

Access, flow, and inference controls may not prevent accidental or malicious disclosures of sensitive data. None of these controls helps if an operator leaves a listing of the password file on a desk, if confidential data are moved off-line during backup or maintenance, if transmissions are tapped or played back, or if hardware and software are faulty. Encryption is a common safeguard for data stored in, or in transit through, media whose security cannot be guaranteed by these other controls. With the help of a secret key (K) the sensitive plaintext (M) is scrambled into unintelligible ciphertext (M^K) before being put in the insecure medium.

In a traditional cryptosystem, illustrated in Figure 6, there is a slow-speed secure channel by which the sender can inform the receiver of the key K used to encode the message. The message itself, transmitted at high speed through the insecure medium, is secure as long as the key is secret. Simmons calls this symmetric encryption because the same key is used at both ends of the channel [SIMM79]. The code is broken if an intruder can deduce the key by analyzing the ciphertexts. Keys are changed regularly, usually more often than the time in which the cleverest intruder is believed capable of locating the key systematically.

The code will be unbreakable if the key is a sequence of random bits as long as the message (pseudorandom generation will not do!); each key bit specifies whether the corresponding message bit is complemented or not. With such a key, called a one-time pad, each bit of ciphertext is purely random and uncorrelated with all other bits of ciphertext. Practical cryptosystems are based on keys that are much shorter than the messages; because the intruder may know the enciphering and deciphering algorithms, the security of these cryptosystems depends on the secrecy of the keys and the computational difficulty
KG holds a list of secret keys (SA for A, SB for B) and mediates the exchange:

    A to KG: (A, (I, B)^SA)
    KG to A: (I, K, (K, A)^SB)^SA
    A to B:  (K, A)^SB

FIGURE 7. Protocol for allocating A and B a common key K for exchanging messages.
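The exchanges of Figure 7 can be simulated in a few lines. The sketch below is our own, not code from the paper: a real cipher is stood in for by a tagged tuple that refuses to open without the correct key, while the names A, B, KG, SA, SB, I, K, and T follow the figure.

```python
# Toy walkthrough of the Figure 7 key-allocation protocol.
import secrets

def enc(key, payload):
    return ("enc", key, payload)        # toy "ciphertext" remembers its key

def dec(key, ct):
    tag, k, payload = ct
    assert tag == "enc" and k == key, "wrong key"
    return payload

SA, SB = "secret-A", "secret-B"         # keys shared with KG in advance
KG_KEYS = {"A": SA, "B": SB}

# A to KG: (A, (I, B)^SA) -- A's name is a plaintext prefix
I = secrets.token_hex(4)                # arbitrary message identifier
msg1 = ("A", enc(SA, (I, "B")))

# KG locates A's key, generates K, replies (I, K, (K, A)^SB)^SA
sender, body = msg1
i, receiver = dec(KG_KEYS[sender], body)
K = secrets.token_hex(8)                # the newly generated conversation key
msg2 = enc(SA, (i, K, enc(KG_KEYS[receiver], (K, sender))))

# A decodes the reply, checks the identifier, and forwards T = (K, A)^SB to B
i2, K_at_A, T = dec(SA, msg2)
assert i2 == I                          # reply matches A's recent identifier
K_at_B, peer = dec(SB, T)               # only B can open T

print(K_at_A == K_at_B == K, peer)      # True A
```

After the three messages, A and B share K without ever having exchanged a key directly.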

of inverting the enciphering algorithms. Overviews of cryptosystems are given by Diffie and Hellman [DIFF76], Gardner [GARD77], Hoffman [HOFF77], Konheim [KONH78], Lempel [LEMP79], and Simmons [SIMM79]. Fascinating accounts of codes and their breakings have been written by Kahn [KAHN67].

It is reasonable to suppose that military and diplomatic experts have secure channels for exchanging encryption keys--for example, secret couriers. Therefore the security of the traditional cryptosystem is properly measured as the work required for an intruder to invert the code through cryptanalysis.

With computer networks it is no longer reasonable to suppose that individual senders and receivers have secure means of exchanging secret keys. Needham and Schroeder [NEED78] and Popek [POPE78a] have outlined methods, called "protocols," for simulating a secure key-exchange channel. Figure 7 gives the central idea of the protocol suggested by Needham and Schroeder. A sending computer A and a receiving computer B seek a secret key K from a secure key-generating facility (KG). The computer KG contains a list of special secret keys, one assigned to each computer of the network; thus A and KG can exchange secret messages encrypted with key SA known only to the two. Having decided to send a message to B, A first transmits the message (A, (I, B)^SA) to KG, wherein I is a message identifier chosen arbitrarily by A. Since A's name is a plaintext prefix of this message, KG can locate A's secret key, SA, and decipher the component (I, B)^SA. Then KG generates a key K and responds to A with the message (I, K, (K, A)^SB)^SA. Only A can decode this message and obtain the newly generated key K and the embedded ciphertext T = (K, A)^SB; A can also check that this message is a unique response from KG by verifying that I is one of his own (recent) message identifiers. Then A forwards the ciphertext T to B, who is the only one capable of decoding it. After these exchanges, both A and B have the key K and can begin transmitting directly to each other in code.

The security of this system clearly depends on the security of the key-generating

facility. Both A and B must trust KG to generate a unique key K, to keep all keys secret, and to not monitor any of their private transmissions.

This example illustrates why key management is essential to secure cryptographic control in computer networks. The security of these cryptosystems is often less dependent on the indecipherability of the code than it is on the ability to secure the keys. Saltzer has argued that our demonstrated inability to protect passwords in file systems suggests that many cryptosystems will be easily broken not by cryptanalysis but by elementary attacks on the key manager. (See SALT78 and also DENN79d, EHRS78, GAIN78, MATY78, MORR78, and POPE78a.)

The Data Encryption Standard (DES)

In 1977 the National Bureau of Standards announced a standard encryption algorithm (DES) to be used in unclassified U.S. Government communications [NBS77]. The algorithm was developed by IBM, which offers products that use DES [EHRS78, LENN78, MATY78]. Each 64-bit block of plaintext undergoes a complex transformation comprising 16 levels of substitutions and permutations, all controlled by a 56-bit key. The algorithm can be implemented cheaply as an LSI chip, which would allow it to operate at a high data rate.

The DES can be used as in Figure 6, providing "end-to-end encryption" on the channel between the sender A and receiver B. A user can also use DES to encipher files for storage in removable or insecure media. However, the data must usually be deciphered for processing; other mechanisms, such as access and flow controls, are needed to protect the data while they are plaintext. Rivest has studied cryptosystems which allow a limited number of operations which can be performed directly on the ciphertext; however, these systems must exclude compare operations if they are to be highly secure [RIVE78b].

The DES can also be used as a one-way cipher to secure files containing passwords [EVAN74, MORR78, WILK68]. Each password X is used as the key to encipher a predetermined plaintext C; the resulting ciphertext C^X is placed in the password file along with the name of the password's owner. When a user N presents a password X, the password is accepted only if C^X is already in the password file with name N. Because no password is actually stored as plaintext in the system, passwords are protected even if the file is disclosed.

The DES is not universally regarded as a highly secure cryptosystem. Diffie and Hellman argue that it is possible for about $20 million to build a highly parallel computer that will locate a key by exhaustive search in about 12 hours [DIFF77]. At the 1978 National Computer Conference, Hellman showed how to use about $4 million of conventional equipment with a heuristic search to find a key within 24 hours. Diffie and Hellman maintain that a 128-bit key (not the 56 bits in the standard) would make the DES virtually unbreakable. IBM maintains that two DES chips in series can, with double encryption, simulate a single DES chip with a 128-bit key [HOFF77]. IBM also maintains that even with 56-bit keys DES is not likely to be the weak link in the security chain.

Public-Key Encryption

In 1976 Diffie and Hellman proposed a new kind of cryptosystem, which they called public-key encryption [DIFF76]. Each user has a matched pair of keys, the "public key" P and the "secret key" S. Each user publishes his public key to everyone through a directory but reveals his secret key to no one. As with the DES, the encryption algorithm need not be secret.

Enciphering a message M with the public key P gives a ciphertext M^P which can be sent to the owner of the public key. The recipient deciphers the ciphertext using his secret key S to retrieve the message: (M^P)^S = M. Since P and S are a matched pair, no one but the owner of the public key can decipher M^P. The operations of P and S are commutative; that is, (M^P)^S = (M^S)^P = M. It is infeasible to compute one key given the other. Figure 8 illustrates communication between two computers by this scheme. Simmons calls this asymmetric encryption because different keys are used at the ends of the channel [SIMM79]. Examples of specific algorithms are in DIFF76, HELL78, KONH78, LEMP79, MERK78, and RIVE78a.

Public-key cryptosystems also present an easy solution to "digital signatures," the problem of proving that a particular message was in fact transmitted by the person claiming to have transmitted it. To create a signature, one enciphers a predetermined message M with his secret key S, offering M^S as the signature. Anyone challenging the signature need only apply the public key P of the purported signer, for only then will (M^S)^P = M. (See DIFF76, RIVE78a, KONH78, and LEMP79.) Digital signatures can also be implemented with single-key cryptosystems, but the solution is much less elegant [NEED78, RABI78].

A result by Shamir, Rivest, and Adleman suggests that a modification of public-key encryption could also be an approximate one-time pad [SHAM79]. Users A and B each select a matched pair of keys but keep both secret. Then A sends the enciphered message M^SA to B, who then enciphers it and returns (M^SA)^SB to A. Then A applies his second key, PA, to obtain M^SB = ((M^SA)^SB)^PA, which he returns to B. Finally, B obtains the message M by applying his PB, since M = (M^SB)^PB. This is not a true one-time pad because three messages are actually sent using the four keys.

Its proponents argue that public-key encryption is more secure than DES because no user need rely on the security of the computer network to safeguard his secret key. Indeed, a user's local, personal computer can interface with the network through the encryption device as in Figure 8, and his secret key can be engraved electronically into a memory chip that can be plugged into the encryption device; he could then guard his encryption key to the same extent as any other key in his possession [DENN79d].

It is true that the public keys can be distributed by a public directory without endangering the secret keys. However, a user still needs assurances that the key received from the (purported) public directory is in fact the public key of the requested individual. Confidence in correct distribution of public keys can be increased if the public directory signs its responses [KONF78, NEED78].

FIGURE 8. Public-key encryption. [Diagram: COMPUTER A and COMPUTER B, each containing an Encipher and a Decipher unit, exchange messages over an insecure channel.]

5. SUMMARY

The four kinds of internal security controls--access, flow, inference, and cryptographic--complement each other. No one of them can solve the problems handled by the other three.

Access controls operate at the external interface, verifying that an individual attempting to use the system is authentic, and internally, verifying that each running program generates references only to authorized segments of memory. The ideal access control operates each program in a domain having the minimum privilege required for the immediate task. The principle of minimum privilege contributes greatly to security by protecting against "trojan horses," and to reliability by minimizing the extent of possible damage by a malfunctioning program.

Flow controls regulate the dissemination or copying of information by prohibiting derived data from having lower confidentiality than the original. The higher the data's confidentiality, the stricter the rules on their dissemination. When applied to program input-to-output flow, these controls offer a partial solution to the "confinement problem."

Inference controls prevent "leakage" through programs that produce summaries of groups of confidential records. They reduce the risk that by correlating the responses from many summaries, a user can deduce the confidential values denied him by access and flow controls.

Cryptographic controls protect information stored or transmitted in insecure media. The Data Encryption Standard (DES) is efficient and economical, though it has been criticized as being breakable and overly dependent on secure key management. Public-key encryption does not rely on any central manager for safeguarding secret keys, though it requires secure distribution of public keys.

All these controls are subject to practical (and sometimes theoretical) limitations which keep them from achieving their objectives under all conditions. No mechanism is perfectly secure. A good mechanism reduces the risk of compromise to an acceptable level.

ACKNOWLEDGMENTS

A preliminary version of this paper appeared under the title "The Limits of Data Security," in the pilot issue of ABACUS, the Scientific-American-style magazine for the computing community undertaken by the American Federation of Information Processing Societies (AFIPS). We are grateful to AFIPS for permission to use parts of the pilot article in this one.

We are grateful to William E. Boggs, whose journalistic genius considerably improved most of the draft. We are also grateful to Adele Goldberg and the referees for their comments and suggestions.

REFERENCES

ACHU78  ACHUGBUE, J.O., AND CHIN, F.Y. Output perturbation for protection of statistical data bases, Dept. Computing Science, Univ. Alberta, Edmonton, Canada, Jan. 1978.
ANDE72  ANDERSON, J.P. "Information security in a multi-user computer environment," in Advances in computers, Vol. 12, Morris Rubinoff (Ed.), Academic Press, New York, 1972.
BECK79  BECK, L.L. A security mechanism for statistical databases, Dept. Computer Science and Engineering, Southern Methodist Univ., Dallas, Tex., Jan. 1979.
BONC77  BONCZEK, R.H., CASH, J.I., AND WHINSTON, A.B. "A transformational grammar-based query processor for access control in a planning system," ACM Trans. Database Syst. 2, 4 (Dec. 1977), 326-338.
BORU71  BORUCH, R.F. "Maintaining confidentiality in educational research: A systematic analysis," Am. Psychol. 26 (1971), 413-430.
CAMP77  CAMPBELL, D.T., ET AL. "Confidentiality-preserving modes of access to files and to interfile exchange for useful statistical analysis," Eval. Q. 1, 2 (May 1977), 269-299.
COHE75  COHEN, E., AND JEFFERSON, D. "Protection in the HYDRA operating system," in Proc. 5th Symp. Operating Systems Principles, (special issue) Oper. Syst. Rev. (ACM) 9, 5 (Nov. 1975), 141-160.
COHE77  COHEN, E. "Information transmission in computational systems," in Proc. 6th Symp. Operating Systems Principles, (special issue) Oper. Syst. Rev. (ACM) 11, 5 (Nov. 1977), 133-139.
COHE78  COHEN, E. "Information transmission in sequential programs," in Foundations of secure computation, R.A. DeMillo et al. (Eds.), Academic Press, New York, 1978, pp. 297-335.
CONW72  CONWAY, R.W., MAXWELL, W.L., AND MORGAN, H.L. "On the implementation of security measures in information systems," Commun. ACM 15, 4 (April 1972), 211-220.
DALE78  DALENIUS, T., AND REISS, S.P. Data-swapping: A technique for disclosure control, Computer Science, Brown Univ., Providence, R.I., 1978.
DEMI77  DEMILLO, R.A., DOBKIN, D., AND LIPTON, R.J. "Even data bases that lie can be compromised," IEEE Trans. Software Eng. SE-4, 1 (Jan. 1977), 73-75.
DEMI78  DEMILLO, R.A., DOBKIN, D.P., JONES, A.K., AND LIPTON, R.J. (Eds.) Foundations of secure computation, Academic Press, New York, 1978.
DENN71  DENNING, P.J. "Third generation computer systems," Comput. Surv. 3, 4 (Dec. 1971), 175-216.
DENN76a  DENNING, D.E. "A lattice model of secure information flow," Commun. ACM 19, 5 (May 1976), 236-243.
DENN76b  DENNING, P.J. "Fault tolerant operating systems," Comput. Surv. 8, 4 (Dec. 1976), 359-389.
DENN77  DENNING, D.E., AND DENNING, P.J. "Certification of programs for secure information flow," Commun. ACM 20, 7 (July 1977), 504-513.
DENN79a  DENNING, D.E., DENNING, P.J., AND SCHWARTZ, M.D. "The tracker: A threat to statistical database security," ACM Trans. Database Syst. 4, 1 (March 1979), 76-96.
DENN79b  DENNING, D.E., AND SCHLÖRER, J. A fast procedure for finding a tracker in a statistical database, Computer Science Dept., Purdue Univ., W. Lafayette, Ind., and Inst. Medizinische Statistik und Dokumentation, Univ. Giessen, W. Germany, Feb. 1979.
DENN79c  DENNING, D.E. Securing databases under random sample queries, Computer Science Dept., Purdue Univ., W. Lafayette, Ind., April 1979.

DENN79d  DENNING, D.E. "Secure personal computing in an insecure network," Commun. ACM 22, 8 (Aug. 1979), 476-482.
DENV66  DENNIS, J.B., AND VAN HORN, E.C. "Programming semantics for multiprogrammed computations," Commun. ACM 9, 3 (March 1966), 143-155.
DIFF76  DIFFIE, W., AND HELLMAN, M. "New directions in cryptography," IEEE Trans. Inf. Theory IT-22, 6 (Nov. 1976), 644-654.
DIFF77  DIFFIE, W., AND HELLMAN, M.E. "Exhaustive cryptanalysis of the NBS Data Encryption Standard," Computer 10, 6 (June 1977), 74-84.
DOBK79  DOBKIN, D., JONES, A.K., AND LIPTON, R.J. "Secure databases: Protection against user inference," ACM Trans. Database Syst. 4, 1 (March 1979), 97-106.
EHRS78  EHRSAM, W.F., MATYAS, S.M., MEYER, C.H., AND TUCHMAN, W.L. "A cryptographic key management scheme for implementing the Data Encryption Standard," IBM Syst. J. 17, 2 (1978), 106-125.
ENGL74  ENGLAND, D.M. "Capability concept mechanism and structure in system 250," in Proc. Int. Workshop on Protection in Operating Systems, Inst. Recherche d'Informatique et d'Automatique, Rocquencourt, Le Chesnay, France, Aug. 1974, pp. 63-82.
EVAN74  EVANS, A. JR., KANTROWITZ, W., AND WEISS, E. "A user authentication scheme not requiring secrecy in the computer," Commun. ACM 17, 8 (Aug. 1974), 437-442.
FABR74  FABRY, R.S. "Capability-based addressing," Commun. ACM 17, 7 (July 1974), 403-412.
FAGI78  FAGIN, R. "On an authorization mechanism," ACM Trans. Database Syst. 3, 3 (Sept. 1978), 310-319.
FEIG70  FEIGE, E.L., AND WATTS, H.W. "Protection of privacy through microaggregation," in Data bases, computers, and the social sciences, R.L. Bisco (Ed.), Wiley-Interscience, New York, 1970.
FENT74  FENTON, J.S. "Memoryless subsystems," Comput. J. 17, 2 (May 1974), 143-147.
FURT78  FURTEK, F. "Constraints and compromise," in Foundations of secure computation, R.A. DeMillo et al. (Eds.), Academic Press, New York, 1978, pp. 189-204.
GAIN78  GAINES, R.S., AND SHAPIRO, N.Z. "Some security principles and their application to computer security," in Foundations of secure computation, R.A. DeMillo et al. (Eds.), Academic Press, New York, 1978, pp. 223-236; also in Oper. Syst. Rev. 12, 3 (July 1978), 19-28.
GARD77  GARDNER, M. "Mathematical games," Sci. Am. 237, 2 (Aug. 1977), 120-124.
GEHR79  GEHRINGER, E. "Functionality and performance in capability-based operating systems," Ph.D. Thesis, Computer Science Dept., Purdue Univ., W. Lafayette, Ind., May 1979.
GRAH72  GRAHAM, G.S., AND DENNING, P.J. "Protection--Principles and practice," in Proc. 1972 AFIPS Spring Jt. Computer Conf., Vol. 40, AFIPS Press, Montvale, N.J., pp. 417-429.
GRIF76  GRIFFITHS, P.P., AND WADE, B.W. "An authorization mechanism for a relational database system," ACM Trans. Database Syst. 1, 3 (Sept. 1976), 242-255.
HANS71  HANSEN, M.H. "Insuring confidentiality of individual records in data storage and retrieval for statistical purposes," in Proc. 1971 AFIPS Fall Jt. Computer Conf., Vol. 39, AFIPS Press, Montvale, N.J., pp. 579-585.
HARR76  HARRISON, M.A., RUZZO, W.L., AND ULLMAN, J.D. "Protection in operating systems," Commun. ACM 19, 8 (Aug. 1976), 461-471.
HART76  HARTSON, H.R., AND HSIAO, D.K. "Full protection specifications in the semantic model for database protection languages," in Proc. 1976 ACM Annual Conf., Oct. 1976, pp. 90-95.
HELL78  HELLMAN, M.E. "Security in communications networks," in Proc. AFIPS 1978 Nat. Computer Conf., Vol. 47, AFIPS Press, Montvale, N.J., 1978, pp. 1131-1134.
HOFF70  HOFFMAN, L.J., AND MILLER, W.F. "Getting a personal dossier from a statistical data bank," Datamation 16, 5 (May 1970), 74-75.
HOFF77  HOFFMAN, L.J. Modern methods for computer security and privacy, Prentice-Hall, Englewood Cliffs, N.J., 1977.
HSIA78  HSIAO, D.K., KERR, D.S., AND MADNICK, S.E. "Privacy and security of data communications and data bases," in Proc. Int. Conf. Very Large Data Bases, Sept. 1978.
JONE76  JONES, A.K., AND LISKOV, B.H. "A language extension mechanism for controlling access to shared data," in Proc. 2nd Int. Conf. Software Engineering, 1976, pp. 62-68.
KAHN67  KAHN, D. The codebreakers, Macmillan Co., New York, 1967.
KARP70  KARPINSKI, R.H. "Reply to Hoffman and Shaw," Datamation 16, 10 (Oct. 1970), 11.
KONF78  KONFELDER, L.M. "A method for certification," Lab. Computer Science, MIT, Cambridge, Mass., May 1978.
KONH78  KONHEIM, A.G. "Cryptographic methods for data protection," Res. Rep. RC 7026 (#30100), IBM Thomas J. Watson Research Center, Yorktown Heights, N.Y., March 1978.
LAMP73  LAMPSON, B.W. "A note on the confinement problem," Commun. ACM 16, 10 (Oct. 1973), 613-615.
LEMP79  LEMPEL, A. "Cryptography in transition," to appear in Comput. Surv. 11, 4 (Dec. 1979).
LENN78  LENNON, R.E. "Cryptography architecture for information security," IBM Syst. J. 17, 2 (1978), 138-150.
LIND76  LINDEN, T.A. "Operating system structures to support security and reliable software," Comput. Surv. 8, 4 (Dec. 1976), 409-445.
LIPN75  LIPNER, S.B. "A comment on the confinement problem," in Proc. 5th Symp. Operating Systems Principles, (special issue) Oper. Syst. Rev. (ACM) 9, 5 (Nov. 1975), 192-196.
LIPT78  LIPTON, R.J., AND BUDD, T.A. "On classes of protection systems," in Foundations of secure computation, R.A. DeMillo et al. (Eds.), Academic Press, New York, 1978, pp. 281-296.
MADN79  MADNICK, S.E. Computer security, Academic Press, New York, 1979.
MATY78  MATYAS, S.M., AND MEYER, C.H. "Generation, distribution, and installation of cryptographic keys," IBM Syst. J. 17, 2 (1978), 126-137.
MERK78  MERKLE, R.C., AND HELLMAN, M.E. "Hiding information and signatures in trapdoor knapsacks," IEEE Trans. Inf. Theory IT-24, 5 (Sept. 1978), 525-530.
MILL76  MILLEN, J.K. "Security kernel validation in practice," Commun. ACM 19, 5 (May 1976), 243-250.
MILL78  MILLEN, J.K. "Constraints and multilevel security," in Foundations of secure computation, R.A. DeMillo et al. (Eds.), Academic Press, New York, 1978, pp. 205-222.
MORR78  MORRIS, R., AND THOMPSON, K. Password security: A case history, CS-TR-71, Bell Labs, Murray Hill, N.J., April 1978.
MYER78  MYERS, G. Advances in computer architecture, Wiley, New York, 1978.
NBS77  NATIONAL BUREAU OF STANDARDS, Data Encryption Standard, FIPS PUB 46, Jan. 1977.
NEED77  NEEDHAM, R.M., AND WALKER, R.D.H. "The Cambridge CAP computer and its protection system," in Proc. 6th Symp. Operating Systems Principles, (special issue) Oper. Syst. Rev. (ACM) 11, 5 (Nov. 1977), 1-10.
NEED78  NEEDHAM, R.M., AND SCHROEDER, M.D. "Using encryption for authentication in large networks of computers," Commun. ACM 21, 12 (Dec. 1978), 993-999.
NEUM77  NEUMANN, P.G., ET AL. "A provably secure operating system: The system, its applications, and proofs," Project 4332 Final Rep., SRI International, Menlo Park, Calif., Feb. 1977.
NIEL76  NIELSEN, N.R., RUDER, B., AND BRANDIN, D.H. "Effective safeguards for computer system integrity," in Proc. AFIPS Nat. Computer Conf., Vol. 45, AFIPS Press, Montvale, N.J., 1976, pp. 75-84.
ORGA72  ORGANICK, E.I. The multics system: An examination of its structure, MIT Press, Cambridge, Mass., 1972.
ORGA73  ORGANICK, E.I. Computer system organization: The B5700/B6700 series, Academic Press, New York, 1973.
PARK76  PARKER, D.B. Crime by computer, Scribner's, New York, 1976.
POPE74  POPEK, G.J. "Protection structures," Computer 7, 6 (June 1974), 22-31.
POPE78a  POPEK, G.J., AND KLINE, C.S. "Design issues for secure computer networks," in Operating systems, an advanced course, R. Bayer, R.M. Graham, and G. Seegmuller (Eds.), Springer-Verlag, New York, 1978.
POPE78b  POPEK, G.J., AND KLINE, C.S. "Issues in kernel design," in Proc. AFIPS Nat. Computer Conf., Vol. 47, AFIPS Press, Montvale, N.J., 1978, pp. 1079-1086.
POPE78c  POPEK, G.J., AND FARBER, D.A. "A model for verification of data security in operating systems," Commun. ACM 21, 9 (Sept. 1978), 737-749.
RABI78  RABIN, M. "Digital signatures using conventional encryption algorithms," in Foundations of secure computation, R.A. DeMillo et al. (Eds.), Academic Press, New York, 1978, pp. 155-166.
REDE74  REDELL, D.R., AND FABRY, R.S. "Selective revocation of capabilities," in Proc. Int. Workshop Protection in Operating Systems, Inst. Recherche d'Informatique et d'Automatique, Rocquencourt, Le Chesnay, France, Aug. 1974.
REIT78  REITMAN, R.P., AND ANDREWS, G.R. Certifying information flow properties of programs: An axiomatic approach, Syracuse Univ., Syracuse, N.Y., and Cornell Univ., Ithaca, N.Y., 1978.
RIVE78a  RIVEST, R.L., SHAMIR, A., AND ADLEMAN, L. "A method for obtaining digital signatures and public-key cryptosystems," Commun. ACM 21, 2 (Feb. 1978), 120-126.
RIVE78b  RIVEST, R.L., ADLEMAN, L., AND DERTOUZOS, M.L. "On data banks and privacy homomorphisms," in Foundations of secure computation, R.A. DeMillo et al. (Eds.), Academic Press, New York, 1978, pp. 169-179.
RPP77  "The report of the privacy protection study commission," Appendix 5, in Technology and privacy, U.S. Gov. Printing Office, Washington, D.C., July 1977.
SALT75  SALTZER, J.H., AND SCHROEDER, M.D. "The protection of information in computer systems," Proc. IEEE 63, 9 (Sept. 1975), 1278-1308.
SALT78  SALTZER, J. "On digital signatures," Oper. Syst. Rev. 12, 2 (April 1978), 12-14.
SCHL75  SCHLÖRER, J. "Identification and retrieval of personal records from a statistical data bank," Methods Inf. Med. 14, 1 (Jan. 1975), 7-13.
SCHL77  SCHLÖRER, J. "Confidentiality and security in statistical data banks," in Proc. Workshop on Data Documentation, Verlag Dokumentation, 1977, pp. 101-123.
SCHL78  SCHLÖRER, J. Security of statistical databases: Multidimensional transformation, TB-IMSD 2/78, Inst. Medizinische Statistik und Dokumentation, Univ. Giessen, W. Germany, 1978.
SCHL79  SCHLÖRER, J. Disclosure from statistical databases: Quantitative aspects of trackers, Inst. Medizinische Statistik und Dokumentation, Univ. Giessen, W. Germany, March 1979; to appear in ACM Trans. Database Syst.
SCHR72  SCHROEDER, M.D., AND SALTZER, J.H. "A hardware architecture for implementing protection rings," Commun. ACM 15, 3 (March 1972), 157-170.
SCHR77  SCHROEDER, M.D., CLARK, D.D., AND SALTZER, J.H. "The MULTICS kernel design project," in Proc. 6th Symp. Operating Systems Principles, (special issue) Oper. Syst. Rev. (ACM) 11, 5 (Nov. 1977), 43-56.
SCHW79  SCHWARTZ, M.D., DENNING, D.E., AND DENNING, P.J. "Linear queries in statistical databases," ACM Trans. Database Syst. 4, 2 (June 1979), 156-167.
SHAM79  SHAMIR, A., RIVEST, R.L., AND ADLEMAN, L.M. "Mental poker," Lab. Computer Science, MIT, Cambridge, Mass., Jan. 1979.
SHAN77  SHANKAR, K.S. "The total computer security problem: An overview," Computer 10, 6 (June 1977), 50-73.
SIMM79  SIMMONS, G.J. "Symmetric and asymmetric encryption," to appear in Comput. Surv. 11, 4 (Dec. 1979).
SNYD77  SNYDER, L. "On the synthesis and analysis of protection systems," in Proc. 6th Symp. Operating Systems Principles, (special issue) Oper. Syst. Rev. (ACM) 11, 5 (Nov. 1977), 141-150.
STON74  STONEBRAKER, M., AND WONG, E. "Access control in a relational data base management system by query modification," in Proc. 1974 ACM Annual Conf., pp. 180-186.
TURN76  TURN, R., AND WARE, W.H. "Privacy and security issues in information systems," IEEE Trans. Comput. C-25, 12 (Dec. 1976), 1353-1361.
WALT75  WALTER, K.G., ET AL. "Structured specification of a security kernel," in Proc. Int. Conf. Reliable Software, (special issue) SIGPLAN Notices (ACM) 10, 6 (June 1975), 285-293.
WEIS69  WEISSMAN, C. "Security controls in the ADEPT-50 time-sharing system," in Proc. 1969 AFIPS Fall Jt. Computer Conf., Vol. 35, AFIPS Press, Montvale, N.J., pp. 119-133.
WEST72  WESTIN, A.F., AND BAKER, M.A. Databanks in a free society, Quadrangle Books, New York, 1972.
WILK68  WILKES, M.V. Time sharing computer systems, Elsevier/MacDonald, New York, 1968; 3rd ed., 1975.
YU78  YU, C.T., AND CHIN, F.Y. "A study on the protection of statistical data bases," in Proc. ACM SIGMOD Int. Conf. Management of Data, 1977, pp. 169-181.

RECEIVED OCTOBER 1978; FINAL REVISION ACCEPTED MAY 1979

