
focus: requirements engineering

A Reference Model for Requirements and Specifications

Carl A. Gunter, University of Pennsylvania
Elsa L. Gunter, Bell Labs, Lucent Technologies
Michael Jackson and Pamela Zave, AT&T Laboratories

The authors define a reference model for applying formal methods to the development of user requirements and their reduction to a behavioral system specification. The approach focuses on the shared phenomena that define the interface between the system and environment.

Requirements for software often fall into two categories: for those who commission, pay for, or use it and for those who program it. These documents are sometimes distinct, receiving distinguishing names such as customer-specification document and feature-specification document. The distinction also appears in standards; for example, the European Space Agency software standard1 distinguishes between the user-requirements specification and the software-requirements specification, mandating complete documentation of each according to various rules. Other cases emphasize this distinction less. For instance, some groups at Microsoft argue that the difficulty of keeping a technical specification consistent with the program is more trouble than the benefit merits.2 We can find a wide range of views in industry literature and from the many organizations that write software.

Is it possible to clarify these various artifacts and study their properties, given the wide variations in the use of terms and the many different kinds of software being written? Our aim is to provide a framework for talking about key artifacts, their attributes, and relationships at a general level, but precisely enough that we can rigorously analyze substantive properties.

The Reference Model
Reference models have a time-honored status, and one well-known example is the ISO 7-Layer Reference Model, which divides network protocols into seven layers. The model is informal and does not correspond perfectly to the protocol layers in widespread use but is still discussed in virtually every basic textbook on networks—the model is widely used by network engineers to describe network architectures. The ISO 7-layer model is successful because it draws on what was already understood about networks, and it's general enough to be flexible.



Our model is based on five familiar artifacts broadly classified into groups that pertain mostly to the system versus those that pertain mostly to the environment. These artifacts are

■ domain knowledge that provides presumed environment facts;
■ requirements that indicate what the customer needs from the system, described in terms of its effect on the environment;
■ specifications that provide enough information for a programmer to build a system that satisfies the requirements;
■ a program that implements the specification using the programming platform; and
■ a programming platform that provides the basis for programming a system that satisfies the requirements and specifications. (Sometimes the system is construed to include such things as the procedures people who use the software employ. In this case, the people are also programmable, and their program is this set of procedures. Our model primarily focuses on programming computer platforms.)

We denote these artifacts W, R, S, P, and M, respectively, and give their classification in Figure 1's top Venn diagram. (We chose W and M for "world" and "machine.") Most of our discussion focuses on the special role of the specification, S, which occupies the middle ground between the system and its environment. We give a formal analysis, describing the relations that S must satisfy, and a comparison of this work to similar attempts to formally analyze this interface.

[Figure 1. Five software artifacts with visibility and control for designated terms.]

Designations
We can view the WRSPM artifacts (see Figure 1) primarily as descriptions written in various languages, each based on its own vocabulary of primitive terms. Some of these terms are common to two or more of the WRSPM descriptions. To understand the relationships between the descriptions, we must understand how the division between environment and system is reflected in the terms used in them. This will determine the key concept of control, which will form the basis of the refinement theory, the basic set of relations between the artifacts we describe later.

The distinction between environment and system is a classic engineering issue that is sometimes regarded as a matter of taste and convenience but which has a profound effect on problem analysis. The reference model demands a clarification of the primitive terms that we use in the WRSPM artifacts. This clarification is so important that it is the sixth artifact in the reference model: the designated terminology provides names to describe the application domain (environment), the programming platform with its software (system), and the interface between them.

Designations identify classes of phenomena—typically states, events, and individuals—in the system and the environment and assign formal terms (names) to them. Some of these phenomena belong to the environment and are controlled by it; we will denote this set e. Others belong to the system and are controlled by it; we will denote this set s. At the interface between the environment and the system, some of the e phenomena are visible to the system; we will denote this subset ev. Its complement in e is hidden from the system; we will denote this set by eh. Thus e = eh ∪ ev. The s phenomena are similarly decomposed into sv and sh. We assume that e and s are disjoint—an assumption we will analyze later.

Terms denoting phenomena in eh, ev, and sv are visible to the environment and used in W and R. Terms denoting phenomena in sh, sv, and ev are visible to the system and used in P and M. Therefore, only ev and sv are visible to both the environment and the system. We restrict S to using only these terms. The Venn diagram at the bottom of Figure 1 shows the relationships among the four sets of phenomena.

A small example will help with understanding some of the ideas in the reference model. We describe a simple version of the Patient Monitoring System in our terms. The requirement R is a warning system that notifies a nurse if the patient's heartbeat stops. To do this, there is a programming platform M with a sensor to detect sound in the patient's chest and an actuator that can be programmed P to sound a buzzer based on data received from its sensor. There is also some knowledge of the world W, which says that there is always a nurse close enough to the nurse's station to hear the buzzer, and that if the patient's heart has stopped, the sound from the patient's chest falls below a threshold for a certain time.



The designated terminology falls into four groups (see Figure 1):

■ eh: the nurse and the heartbeat of the patient.
■ ev: sounds from the patient's chest.
■ sv: the buzzer at the nurse's station.
■ sh: internal representation of data from the sensor.

The specification S, which is expressed in the language common to the environment and system, says that if the sound the sensor detects falls below the appropriate threshold, then the system should sound the buzzer.
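To make the vocabulary discipline concrete, here is a minimal sketch in Python (our own illustration, not part of the article) of the patient-monitoring designations, abstracting each phenomenon to a Boolean and writing each artifact as a predicate whose parameters name exactly the terms it is allowed to mention. The identifiers heart_beating, chest_sound, buzzer, and sensor_reading are hypothetical names for the designated terms; the nurse is elided for brevity.

# Designated terminology for the patient monitor, abstracted to Booleans
# (hypothetical names; the nurse is elided for brevity):
#   eh: heart_beating    -- environment-controlled, hidden from the system
#   ev: chest_sound      -- environment-controlled, visible to the system
#   sv: buzzer           -- system-controlled, visible to the environment
#   sh: sensor_reading   -- system-controlled, hidden from the environment

def W(heart_beating, chest_sound, buzzer):
    # Domain knowledge: if the heart has stopped, the chest sound is below
    # the threshold (abstracted here as chest_sound being False).
    return heart_beating or not chest_sound

def R(heart_beating, chest_sound, buzzer):
    # Requirement: if the heart stops, the buzzer must sound.
    return heart_beating or buzzer

def S(chest_sound, buzzer):
    # Specification: may mention only the shared terms in ev and sv.
    return chest_sound or buzzer

def MP(chest_sound, sensor_reading, buzzer):
    # Program and platform (M and P): the sensor records the chest sound,
    # and the buzzer is driven from that internal reading.
    return sensor_reading == chest_sound and buzzer == (not sensor_reading)

Enumerating the Boolean assignments then lets us check the relations the reference model imposes on these artifacts, as sketched after Formula 3 below.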
The WRSPM reference model is independent of the choice of language for expressing the various artifacts. However, for uniformity, we assume all the artifacts are described using formulas of Church's higher-order logic. If eh = {x1, …, xn}, then a formula ∀eh. φ means the same as ∀x1, …, xn. φ. We use HOL notational conventions in the article and hope they are sufficiently obvious that readers don't need background beyond what we give here together with some general knowledge of logic. In the "dot" notation, a dot following a quantification means that the quantification's scope goes as far to the right as the parentheses allow. For instance, (∃x. A ⇒ B) ∧ C is the same as (∃x. (A ⇒ B)) ∧ C.

Relationship between Environment and System
The program and the world each have a capacity for carrying out events. W restricts the actions that the environment can perform by restricting e or the relationship between e and sv. The requirements, R, describe more restrictions, saying which of all possible actions are desired. P, when evaluated on M, describes the class of possible system events. (Of course, we do not usually present programs and programming platforms as formulas. Here, we should think of P and M as the formulations of these artifacts in logic.) If R allows all the events in this class, then the program is said to implement the requirements. In logic, this means

∀e s. W ∧ M ∧ P ⇒ R . (1)

That is, the requirements allow all the events the environment performs (eh, ev) and all the events the system performs (sv, sh) that can happen simultaneously.

We call this property adequacy. Adequacy would be trivially satisfied if the assumptions about the environment meant that there is no set of events that could satisfy its hypothesis. We therefore need some kind of nontriviality assumption. First, we want consistency of the domain knowledge. The desired property is

∃e s. W . (2)

(This is the same as ∃eh ev sv. W, because the variables sh do not appear in W.) Clearly, we want consistency of W, P, and M together. However, we need something more: a property that says that any choice of values for the environment variables visible to the system is consistent with M ∧ P if it is consistent with assumptions about the environment. (This assumption is too strong for some cases where we expect the system to prevent the environment from doing some events that are controlled by the environment but visible to the system. The precise reformulation of this criterion for such cases is a topic of ongoing research.) The desired property is called relative consistency:

∀ev. (∃eh s. W) ⇒ (∃eh s. W ∧ M ∧ P) . (3)

The witness to the existential formula in the conclusion can be the same as the witness in the hypothesis.
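For finite Boolean abstractions such as the sketch above, Formulas 1 through 3 can be checked by brute-force enumeration. The following checkers are our own illustration, not part of the reference model; they assume each artifact is a predicate over a dictionary of designated terms, and the hypothetical patient-monitor predicates are restated inline so the fragment stands alone.

from itertools import product

def assignments(names):
    """All Boolean assignments to the given designated terms."""
    for values in product([False, True], repeat=len(names)):
        yield dict(zip(names, values))

def merge(*dicts):
    out = {}
    for d in dicts:
        out.update(d)
    return out

def adequacy(W, MP, R, e_names, s_names):
    # Formula 1: for all e and s, W and M and P together imply R.
    return all(R(merge(e, s))
               for e in assignments(e_names)
               for s in assignments(s_names)
               if W(merge(e, s)) and MP(merge(e, s)))

def consistency(W, e_names, s_names):
    # Formula 2: some choice of e and s satisfies W.
    return any(W(merge(e, s))
               for e in assignments(e_names)
               for s in assignments(s_names))

def relative_consistency(W, MP, ev_names, eh_names, s_names):
    # Formula 3: every ev that W can accommodate is also accommodated
    # by W together with M and P.
    for ev in assignments(ev_names):
        def possible(pred):
            return any(pred(merge(ev, eh, s))
                       for eh in assignments(eh_names)
                       for s in assignments(s_names))
        if possible(W) and not possible(lambda v: W(v) and MP(v)):
            return False
    return True

# Hypothetical patient-monitor predicates, restated over dictionaries.
W  = lambda v: v["heart_beating"] or not v["chest_sound"]
R  = lambda v: v["heart_beating"] or v["buzzer"]
MP = lambda v: (v["sensor_reading"] == v["chest_sound"]
                and v["buzzer"] == (not v["sensor_reading"]))

eh, ev, sv, sh = ["heart_beating"], ["chest_sound"], ["buzzer"], ["sensor_reading"]
print(adequacy(W, MP, R, eh + ev, sv + sh))          # True
print(consistency(W, eh + ev, sv + sh))              # True
print(relative_consistency(W, MP, ev, eh, sv + sh))  # True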



Relative consistency merits some appreciation, because there are a variety of ways to get the wrong property. It is a significant contribution of the Functional Documentation Model,3 which asserts the relative consistency of the requirements with the domain knowledge. Let M′ = M ∧ P, and consider the property ∃e s. W ∧ M′. This says that there is some choice of the environment events that makes the system consistent with the environment. This is too weak, because the environment might not use only this consistent set of events. However, this formula should hold, and it does immediately follow from the domain-knowledge consistency (Formula 2) and relative consistency (Formula 3).

The property ∀e s. W ⇒ M′ is much too strong, because it means that any choice of potential system behavior (s) that W accepts, the system to be built (M′) must also accept. An apparently modest weakening, ∀e ∃s. W ⇒ M′, is too weak, because given an environment action, it lets the system do anything it chooses if there is a corresponding value for the system actions that invalidates the domain knowledge.

One step closer to Formula 3, ∀e. (∃s. W) ⇒ (∃s. W ∧ M ∧ P), is again too strong. The environment actions hidden from the system include reactions to the machine's behavior. The machine is not allowed to restrict any of the possible environmental reactions. In the patient-monitoring example, it is consistent with the domain knowledge for the patient's heart to stop beating (ev) without the nurse being warned (eh)—this is the undesirable possibility that the program is supposed to prevent. The formula states that, if this can happen, then the program must allow it!

Specifications
Let's suppose now that we wish to decompose the process of implementing a requirement into two parts: first, when requirements are developed, and second, when the programming is carried out. Two different groups of people might do these tasks: users and programmers, respectively. It is often desirable to filter out the knowledge of W and R that truly concerns the people who will work on developing P (for M) and deliver this as a software specification. We rely on a kind of transitive property to ensure the desired conclusion: if S properly takes W into account in saying what is needed to obtain R, and P is an implementation of S for M, then P implements R as desired. There are several reasons for wanting such a factorization—for example, the need to divide responsibilities in a contract between the needs of the user and supplier. They build their deal around S, which serves as their basis of communication. But how can we represent this precisely? Is it all right just to say that S and W imply R, while M and P imply S? This is close and provides a good intuition, but the situation is not that simple. We must properly account for consistency and control.

Before we begin to describe proof obligations, we make one stipulation: S must lie in the environment and system's common vocabulary. In other words, the free variables of S must be among those in ev and sv and, therefore, cannot include any of those in eh or sh. Because the specification is to stand proxy for the program with respect to the requirements, it should satisfy the same basic properties as the program. First, we require adequacy with respect to S:

∀e s. W ∧ S ⇒ R . (4)

Second, we require a strengthened version of relative consistency for S:

∀ev. (∃eh s. W) ⇒ (∃s. S) ∧ (∀s. S ⇒ ∃eh. W) . (5)

These two formulas, together with Formula 2, are the environment-side proof obligations.

On the other side, the specification is to stand proxy for the requirements (and the domain knowledge) with respect to the program. The system-side proof obligation is a similarly strengthened version of relative consistency for M ∧ P with respect to S:

∀e. (∃s. S) ⇒ (∃s. M ∧ P) ∧ (∀s. (M ∧ P) ⇒ S) . (6)

In summary, if the software buyer is responsible for the environment-side obligations and the "seller" is responsible for the system-side obligations, then the buyer must satisfy Formulas 2, 4, and 5, and the seller must satisfy Formula 6. The fundamental obligations described by Formulas 1, 2, and 3 follow from this. We omit a detailed proof, but Formula 1 is a consequence of Formulas 5 and 6, and Formula 3 is a consequence of Formulas 4, 5, and 6. Formula 2 is a direct responsibility of the buyer.
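In the same spirit, the environment-side obligations (Formulas 4 and 5) and the system-side obligation (Formula 6) can be checked by enumeration on a finite abstraction. This sketch is our own illustration, not part of the reference model; the helper functions and the hypothetical patient-monitor predicates repeat those shown earlier so the fragment stands alone.

from itertools import product

def assignments(names):
    for values in product([False, True], repeat=len(names)):
        yield dict(zip(names, values))

def merge(*dicts):
    out = {}
    for d in dicts:
        out.update(d)
    return out

def env_side_ok(W, S, R, ev_names, eh_names, s_names):
    # Formula 4: for all e and s, W and S imply R.
    f4 = all(R(merge(ev, eh, s))
             for ev in assignments(ev_names)
             for eh in assignments(eh_names)
             for s in assignments(s_names)
             if W(merge(ev, eh, s)) and S(merge(ev, s)))
    # Formula 5: for every ev that W can accommodate, S offers some s, and
    # every s that S offers is compatible with W for some eh.
    f5 = True
    for ev in assignments(ev_names):
        if not any(W(merge(ev, eh, s))
                   for eh in assignments(eh_names)
                   for s in assignments(s_names)):
            continue
        offers = any(S(merge(ev, s)) for s in assignments(s_names))
        safe = all(any(W(merge(ev, eh, s)) for eh in assignments(eh_names))
                   for s in assignments(s_names) if S(merge(ev, s)))
        f5 = f5 and offers and safe
    return f4 and f5

def sys_side_ok(S, MP, ev_names, eh_names, s_names):
    # Formula 6: for every e, if S offers some s, then M and P offer some s,
    # and every s that M and P offer satisfies S.
    for e in assignments(ev_names + eh_names):
        if not any(S(merge(e, s)) for s in assignments(s_names)):
            continue
        if not any(MP(merge(e, s)) for s in assignments(s_names)):
            return False
        if not all(S(merge(e, s))
                   for s in assignments(s_names) if MP(merge(e, s))):
            return False
    return True

# Hypothetical patient-monitor predicates, as in the earlier sketches.
W  = lambda v: v["heart_beating"] or not v["chest_sound"]
R  = lambda v: v["heart_beating"] or v["buzzer"]
S  = lambda v: v["chest_sound"] or v["buzzer"]
MP = lambda v: (v["sensor_reading"] == v["chest_sound"]
                and v["buzzer"] == (not v["sensor_reading"]))

eh, ev, sv, sh = ["heart_beating"], ["chest_sound"], ["buzzer"], ["sensor_reading"]
print(env_side_ok(W, S, R, ev, eh, sv + sh))  # True: the buyer's obligations hold
print(sys_side_ok(S, MP, ev, eh, sv + sh))    # True: the seller's obligation holds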
We have referred to Formulas 5 and 6 as strengthened versions of relative consistency. Without the strengthening, what we would expect for Formula 5, by comparison with Formula 3, would be

∀ev. (∃eh s. W) ⇒ (∃eh s. W ∧ S) (7)

which is a direct consequence of Formula 5.



The strengthening comes in the added constraint that ∀s. S ⇒ ∃eh. W, instead of simply having ∃eh s. W ∧ S. The added constraint means that W must hold everywhere that S holds. The weaker constraint only requires that there is somewhere that they both hold.

It might seem that the weaker Formula 7 should suffice in place of Formula 5. To see why it does not, consider a "good" specification S1 that satisfies Formula 5 and guarantees R. But suppose that we are also given a "bad" specification S2 that is everywhere inconsistent with W. If we let S = S1 ∨ S2, then S satisfies Formula 4 and the weaker Formula 7, but not Formula 5. An implementor could satisfy his obligation Formula 7 by implementing S2. Yet an implementation of S2 would behave unpredictably when deployed. The extra strength of Formulas 5 and 6 prevents this.
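A tiny Boolean instance makes the failure concrete. The predicates below are our own hypothetical construction, not from the article: ev = {x}, eh = {y}, sv = {z}, W forces the hidden y to mirror x and allows z only when x holds, S1 guarantees the requirement, and S2 is everywhere inconsistent with W. Brute-force checking confirms that S = S1 ∨ S2 passes Formulas 4 and 7 but fails Formula 5.

from itertools import product

# ev = {x}, eh = {y}, sv = {z}, sh empty (hypothetical names).

def W(x, y, z):    # domain knowledge: the hidden y mirrors x; z only when x holds
    return (y == x) and (x or not z)

def R(x, y, z):    # requirement
    return z == x

def S1(x, z):      # "good" specification: satisfies Formula 5 and guarantees R
    return z == x

def S2(x, z):      # "bad" specification: everywhere inconsistent with W
    return z and not x

def S(x, z):
    return S1(x, z) or S2(x, z)

B = [False, True]

# Formula 4: for all x, y, z, W and S imply R.
f4 = all(R(x, y, z) for x, y, z in product(B, B, B) if W(x, y, z) and S(x, z))

# Formula 7 (weak form): for every x, if W is satisfiable, so is W and S.
f7 = all(any(W(x, y, z) and S(x, z) for y, z in product(B, B))
         for x in B
         if any(W(x, y, z) for y, z in product(B, B)))

# Formula 5 (strengthened form): S must offer some z, and every z that S
# offers must be compatible with W for some y.
f5 = all(any(S(x, z) for z in B) and
         all(any(W(x, y, z) for y in B) for z in B if S(x, z))
         for x in B
         if any(W(x, y, z) for y, z in product(B, B)))

print(f4, f7, f5)  # True True False: the disjunct S2 slips past Formula 7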
Comparing Approaches
Our reference model is a more formal and complete version of some of our earlier work.4–6 Let's look in some detail at some of the most well-known formulations similar to the WRSPM artifacts and their relationships.

In the functional-documentation model,3,7 there are four distinct collections of variables: m for monitored values, c for system-controlled values, i for values input to the program's registers, and o for values written to the program's output registers. There are also five predicates formally representing the necessary documentation: NAT(m, c) describing nature without making any assumptions about the system, REQ(m, c) describing the desired system behavior, IN(m, i) relating the monitored real-world values to their corresponding internal representation, OUT(o, c) relating the software-generated outputs to external system-controlled values, and SOF(i, o) relating program inputs to program outputs.

There are three major proof obligations in the functional-documentation model. The first is called feasibility and requires that ∀m. (∃c. NAT(m, c)) ⇒ (∃c. NAT(m, c) ∧ REQ(m, c)). The second states that IN must handle all cases possible under NAT: ∀m. (∃c. NAT(m, c)) ⇒ (∃i. IN(m, i)). The third is called acceptability, and requires that ∀m i o c. NAT(m, c) ∧ IN(m, i) ∧ SOF(i, o) ∧ OUT(o, c) ⇒ REQ(m, c).

Although widely accepted, these proof obligations are not sufficient, as we can see from a small example. Let all the variables be real-valued functions of time, and let the five predicates be defined as

NAT: (∀t. c(t) > 0) ∧ (∀t. m(t) < 0)
REQ: ∀t. c(t + 3) = −m(t)
IN: ∀t. i(t + 1) = m(t)
SOF: ∀t. o(t + 1) = i(t)
OUT: ∀t. c(t + 1) = o(t) .

Each predicate is internally consistent. All the predicates besides NAT are readily implementable, because all establish relationships between their inputs at one time and their outputs at a later time. The predicates satisfy all the proof obligations of the Functional Documentation Model, yet they are not realizable, because ¬(∃m i o c. NAT(m, c) ∧ IN(m, i) ∧ SOF(i, o) ∧ OUT(o, c)). If the program flipped the sign of its input to get the delayed output, all would be well. The acceptability obligation is satisfiable only because the antecedent of its implication is always false.
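The inconsistency is easy to demonstrate by composing the three implementable predicates over a short discrete horizon: they force c(t + 3) = m(t), which NAT forbids, whereas the sign-flipped program satisfies both NAT and REQ. The following sketch is our own illustration; the horizon and the sample trace are arbitrary.

# Why NAT, IN, SOF, and OUT cannot hold together: the pipeline copies m
# forward unchanged, so c(t + 3) = m(t) < 0, while NAT demands c(t) > 0.

def run(m, software):
    """Propagate a monitored trace m through IN, the program, and OUT."""
    i = {t + 1: m[t] for t in m}              # IN:  i(t + 1) = m(t)
    o = {t + 1: software(i[t]) for t in i}    # SOF: o(t + 1) = f(i(t))
    c = {t + 1: o[t] for t in o}              # OUT: c(t + 1) = o(t)
    return c

m = {t: -1.0 for t in range(5)}               # any trace allowed by NAT: m(t) < 0

c_buggy = run(m, software=lambda x: x)        # SOF as documented: o(t + 1) = i(t)
c_fixed = run(m, software=lambda x: -x)       # sign-flipped program

print(all(v > 0 for v in c_buggy.values()))     # False: NAT's c(t) > 0 is violated
print(all(v > 0 for v in c_fixed.values()))     # True
print(all(c_fixed[t + 3] == -m[t] for t in m))  # True: REQ holds for the fix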
Our reference model supplies what is missing in the functional-documentation model. To show how, we must match corresponding parts of the two models. The correspondence is not completely determined because it is not clear whether IN and OUT should be considered the system or environment parts. We consider them system parts, but our reference model still supplies the missing obligation, even if we interpret them as being in the environment instead.

In this context, both the phenomena i and o belong to our category sh. The monitored phenomena m are the same as our ev, the controlled phenomena c are the same as our sv, and there are no eh phenomena in the functional-documentation model.

The predicate NAT corresponds to our W, and the predicate REQ corresponds to our R. NAT and REQ are more restricted than W and R, however, because they can only make assertions about those phenomena of the environment that are shared with the system. In fact, REQ corresponds to the specification S as well as to R. The predicate SOF corresponds to the program P. IN and OUT together correspond to M, except that once again they are more restricted, being limited to the special purposes of sensing and actuating.

Because these correspondences make S irrelevant, we shall use our original Formulas 1 through 3.



Translated into their terms, our Formula 1 is exactly the same as their acceptability. Our Formula 2 translates to ∃m c. NAT(m, c), which is not made explicit in the functional-documentation model because it is assumed to be true by construction. Our Formula 3 translates to ∀m. (∃c. NAT(m, c)) ⇒ (∃i o c. NAT(m, c) ∧ IN(m, i) ∧ SOF(i, o) ∧ OUT(o, c)), which would have revealed the programming bug in the earlier example. It also subsumes their second (unnamed) obligation. Translated into our terms, their feasibility obligation is ∀ev. (∃eh sv. W) ⇒ (∃eh sv. W ∧ R), which is implied by our Formulas 1 and 3.

Like our reference model, the functional-documentation model insists on a rigid division between system and environment control of designated terminology. Some other systems, such as Unity8 and TLA,9,10 leave to the user such matters as distinguishing environment from system and domain knowledge from requirements, but support shared control of a designated term by system and environment. For example, in the Unity formalism, we can express that both E and M control a Boolean variable b, but E can only set it to true while M can only set it to false. Our restricted notion of control, in which each phenomenon must be strictly environment or system controlled, is easier to document and think about than shared control. When necessary, we can model shared control by using two variables—one controlled by the system and the other by the environment—together with an assertion in W that they must be equal.
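As a small illustration of that encoding (our own sketch, with hypothetical names), the shared Boolean b becomes an environment copy and a system copy whose equality is asserted in W, while each party's one-sided update right becomes a constraint on its own transitions:

# Shared control modeled with two variables (hypothetical names): the
# environment controls b_env, the system controls b_sys, and the domain
# knowledge W asserts that the two copies agree.

def W_shared(b_env, b_sys):
    return b_env == b_sys

# Each party changes only its own copy; the Unity-style policy that E only
# sets the flag and M only clears it becomes a constraint on transitions.
def env_may(before, after):
    return after or not before   # the environment never clears b_env

def sys_may(before, after):
    return before or not after   # the system never raises b_sys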
The composition and decomposition theorems of TLA are particularly valuable when parts of the formal documentation are not complete and unconditional (as the functional-documentation model requires them to be). For example, we can use the TLA composition theorem to prove adequacy of a specification even when all of the domain knowledge, requirements, and specification components are written in an assumption and guarantee style.

As part of a benchmark problem for studying the reference model, we used the model checker Mocha11,12 to prove the desired properties of the relationship between specification and program (see Formula 5). The Mocha concept of a reactive module is extremely similar to our reference model, because it allows for interface variables controlled by the environment or system and system-controlled variables that are hidden from the environment. The model's appropriateness to the reactive module assumptions partially inspired Formula 5.

Our reference model is meaningful whether or not you are using a formalization such as a theorem prover or a model checker. The proof obligations are just as sensible for natural-language documentation as they are for formal specifications. Moreover, there is no absolute requirement that the proof obligations be met in the sense of automated theorem proving. On the contrary, the applications we studied have benefited significantly just from the clarity of knowing what the objective of a model's component should be, even without formalization, let alone machine-assisted proof. However, our description is precise enough to support formal analyses. For examples of such formal analyses and a more in-depth discussion of some of the logic connections, see our technical report from the University of Pennsylvania.13

Our current and future research involves extending and applying the reference model. We are interested in extensions that unify it with game-theoretic modeling of our system and environment. Such extensions might make the model more complex to describe but might offer better guidance when using it for applications. We are applying the model to problems in networking and telephony. In networking, we are using it to describe attributes of routing protocols; for telephony, we are using it to model distributed feature composition.

Acknowledgments
We thank Samson Abramsky, Rajeev Alur, Karthik Bhargavan, Trevor Jim, Insup Lee, and Davor Obradovic for their input to this work. NSF Contract CCR9505469 and ARO Contract DAAG-98-1-0466 partially support this research.

References
1. C. Mazza et al., Software Engineering Standards, Prentice Hall, Upper Saddle River, N.J., 1994.
2. S.A. Maguire, Debugging the Development Process: Practical Strategies for Staying Focused, Hitting Ship Dates, and Building Solid Teams, Microsoft Press, Redmond, Wash., 1994.
3. D.L. Parnas and J. Madey, "Functional Documentation for Computer Systems," Science of Computer Programming, Vol. 25, No. 1, Oct. 1995, pp. 41–61.



4. M. Jackson and P. Zave, "Domain Descriptions," Proc. IEEE Int'l Symp. Requirements Eng., IEEE Computer Soc. Press, Los Alamitos, Calif., 1992, pp. 56–64.
5. M. Jackson and P. Zave, "Deriving Specifications from Requirements: An Example," Proc. 17th Int'l Conf. Software Eng., IEEE Computer Soc. Press, Los Alamitos, Calif., 1995, pp. 15–24.
6. P. Zave and M. Jackson, "Four Dark Corners of Requirements Engineering," ACM Trans. Software Eng. and Methodology, Vol. 6, No. 1, Jan. 1997, pp. 1–30.
7. A.J. van Schouwen, D.L. Parnas, and J. Madey, "Documentation of Requirements for Computer Systems," Proc. IEEE Int'l Symp. Requirements Eng., IEEE Computer Soc. Press, Los Alamitos, Calif., 1992, pp. 198–207.
8. K. Mani Chandy and J. Misra, Parallel Program Design: A Foundation, Addison-Wesley, Reading, Mass., 1988.
9. M. Abadi and L. Lamport, "Conjoining Specifications," ACM Trans. Programming Languages and Systems, Vol. 17, No. 3, May 1995, pp. 507–534.
10. L. Lamport, "The Temporal Logic of Actions," ACM Trans. Programming Languages and Systems, Vol. 16, No. 3, May 1994, pp. 872–923.
11. R. Alur and T.A. Henzinger, "Reactive Modules," Proc. 11th IEEE Symp. Logic in Computer Science, IEEE Computer Soc. Press, Los Alamitos, Calif., 1996, pp. 207–218.
12. R. Alur et al., "Mocha: Modularity in Model Checking," Proc. 10th Int'l Conf. Computer Aided Verification, A.J. Hu and M.Y. Vardi, eds., Lecture Notes in Computer Science, Springer-Verlag, New York, 1998, pp. 521–525.
13. C.A. Gunter et al., A Reference Model for Requirements and Specifications, tech. report, Univ. of Pennsylvania, Dept. of Computer and Information Science, Jan. 2000.

About the Authors

Carl A. Gunter is an associate professor at the University of Pennsylvania. He does research, teaching, and consulting in programming languages, software engineering, networks, and security. He is the author of a textbook on the semantics of programming languages. He received his BA from the University of Chicago and his PhD from the University of Wisconsin, Madison. Contact him at the Dept. of Computer and Information Science, 200 S. 33rd St., Univ. of Pennsylvania, Philadelphia, PA 19104; gunter@cis.upenn.edu.

Elsa L. Gunter is a member of the technical staff at Bell Laboratories. Her research interests include formal methods, design and use of automated and interactive theorem provers, and mathematical semantics of programming languages. She received her BA from the University of Chicago and her PhD from the University of Wisconsin, Madison. She was a postdoctoral research assistant at both Cambridge University and the University of Pennsylvania. Contact her at Bell Labs, Room 2C-481, 600 Mountain Ave., P.O. Box 636, Murray Hill, NJ 07974-0636; elsa@research.bell-labs.com.

Michael Jackson is an independent consultant in London and a part-time researcher at AT&T Research in Florham Park, N.J. His research interests include requirements specifications, problem analysis, problem frames, software design, and feature interaction. Contact him at 101 Hamilton Terrace, London NW8 9QY, UK; jacksonma@acm.org.

Pamela Zave is a member of the Network Services Research Laboratory at AT&T Laboratories–Research. Her research interests include formal methods for telecommunication systems and requirements engineering. She received her AB from Cornell University and her PhD from the University of Wisconsin, Madison. Contact her at AT&T Laboratories–Research, 180 Park Ave., Room D205, Florham Park, NJ 07932; pamela@research.att.com.
Jan. 2000.
