Truth of Data Flow / Control Flow & Coupling Analysis
By: George Meier and Reinaldo De Salas Flamingo
Introduction:
As part of the V&V process (test coverage) under DO-178C, there are objectives that must be
satisfied regarding data and control coupling for software Levels A, B, and C. These analyses are used
mainly to assess the modularity of a software system (modularity being one of the better-known good
practices of software development), which results in better testability and maintainability, reduces the
impact of changes, and allows greater software reuse. The data coupling and control coupling (DC/CC)
analyses were the subject of clarifications (due to misinterpretations) in recent revisions of the DO-178
standard; those updates motivate the treatment of this matter in this white paper.
According to DO-178C ANNEX B, Glossary:
- Control Coupling is defined as “The manner or degree by which one software component
influences the execution of another software component.”
- Data Coupling is defined as “The dependence of a software component on data not exclusively
under the control of that software component.”
The term “software component” can be interpreted as covering procedures, functions, subroutines,
modules, and other similar programming constructs.
Practical example of DC/CC:
Consider the example of FIGURE 1; let’s suppose that in this case our system is composed of three
modules: main.c, calculateairspeed.c, and displayairspeed.c.
FIGURE 1 – Diagrammatic example of DC/CC in a generic embedded software architecture
The control coupling is represented by the external calls from “main.c” to the functions
“calculate_airspeed()” and “display_airspeed()”. The data coupling is represented by the “airspeed”
parameter passed from the “calculateairspeed.c” module to the “displayairspeed.c” module (via a global
variable, shared memory, etc.). According to DO-178C, those DC/CC paths (the functions
“calculate_airspeed()” and “display_airspeed()” and the parameter “airspeed”) should be tested under
both normal and robustness conditions for safety-critical systems.
As a runnable example, the files might look like the following:
/* main.c */
void calculate_airspeed(int *airspeed);   /* forward declarations of the functions */
void display_airspeed(int airspeed);      /* defined in the other two files        */

int main(void)
{
    int airspeed = 0;
    while (1) {
        calculate_airspeed(&airspeed);
        display_airspeed(airspeed);
    }
    return 1;
}

/* calculateairspeed.c */
void calculate_airspeed(int *airspeed)
{
    *airspeed = 10;
}

/* displayairspeed.c */
#include <stdio.h>

void display_airspeed(int airspeed)
{
    printf("airspeed: %d\n\r", airspeed);
}
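To try this on a host machine, one possible build (assuming a GCC toolchain; the output name is
arbitrary) would be:

    gcc main.c calculateairspeed.c displayairspeed.c -o airspeed_demo
    ./airspeed_demo

Because of the while(1) loop, the program prints the airspeed repeatedly until it is interrupted.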
Data Coupling
For each output of each software component [the producer] that is consumed by any software
component [the consumer], a requirements-based test (System Test, HSIT, or SIT) shall be identified that
confirms the producer’s output correctly influences the observable behavior/outputs of the consumer.
What does this mean?
High data coupling (low independence, high interdependence) generally means having more global
variables: any time one module passes data to another module, the two modules are considered to be
‘data coupled’. The problem that arises with global data is an increased possibility of one module’s error
propagating to other modules, increasing the probability of a system failure. This also makes it more
difficult to find the true source of the error, since multiple modules may modify the global data. This is
why data dictionaries are typically required for higher-DAL systems: to help minimize the unforeseen
effects of data coupling.
This is so important that DO-178C requires us to generate and identify a requirements-based test that
proves 1) that the data is correctly produced and 2) that the expected observable behavioral result
occurs in the consuming module.
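The sketch below is purely illustrative (the producer, consumer, and expected value of 10 are
assumptions, not part of the earlier example); it shows the shape of such a requirements-based data
coupling test: first check the producer's output, then check that the consumer's observable output
reflects it.

/* A minimal, self-contained sketch of a requirements-based data coupling test.
 * All names and the expected value (10) are illustrative assumptions only. */
#include <stdio.h>

static int g_airspeed;          /* data coupling via a shared, file-scope variable */
static int g_displayed_value;   /* the consumer's observable output, for the test  */

static void produce_airspeed(void) { g_airspeed = 10; }                /* producer */
static void consume_airspeed(void) { g_displayed_value = g_airspeed; } /* consumer */

int main(void)
{
    produce_airspeed();
    if (g_airspeed != 10) {                 /* 1) the data is correctly produced    */
        printf("FAIL: produced %d, expected 10\n", g_airspeed);
        return 1;
    }
    consume_airspeed();
    if (g_displayed_value != 10) {          /* 2) it drives the consumer's behavior */
        printf("FAIL: consumer observed %d\n", g_displayed_value);
        return 1;
    }
    printf("PASS: data coupling path producer -> consumer verified\n");
    return 0;
}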
As a general rule of thumb, we want to minimize global data, limiting data coupling to the lowest
scope possible (e.g. local variables first, then module- or file-level variables, and only lastly global
variables). The use of global variables will directly drive up the cost of verifying those variables; that is
how seriously the authors regard these effects.
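As a short illustration of that rule of thumb (the module and names are hypothetical), shared data can
be kept at file scope behind a small accessor instead of being exposed as a bare global:

/* airspeed_store.c (hypothetical module) -- the shared value is file-scope, so the only
 * data coupling other modules can have with it goes through these two functions. */
static int s_airspeed;                                /* not visible outside this file */

void set_airspeed(int value) { s_airspeed = value; }  /* the only writer */
int  get_airspeed(void)      { return s_airspeed; }   /* the only reader */

/* By contrast, a bare global such as "int g_airspeed;" could be read or written from
 * anywhere, which widens the data coupling that has to be analyzed and tested. */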
The benefits of reduced data coupling include lower verification cost as well as easier troubleshooting
and greater system reliability.
Control Coupling
Wikipedia defines control coupling as “one module controlling the flow of another, by passing it
information on what to do (e.g., passing a what-to-do flag).” This seems like a simple enough statement
that shouldn’t need to be modified (in fact, I urge people to look things up on Wikipedia as often as
possible, as it typically forms a valid crowd-sourced opinion). Although it may not have any direct
bearing on avionics software development, anything that allows you to grasp the basic concept is a
good thing.
In other words, control coupling is where we allow one module to control the execution flow of another
module. This can greatly affect the determinism of software. Remember determinism? That is where we
want the software to behave exactly the same (in a pre-determined manner) over any expected range of
input conditions.
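A minimal sketch of such a what-to-do flag follows (the names and units are illustrative assumptions,
not part of the earlier example); note that each value of the flag is a control coupling path that would
need its own requirements-based test:

#include <stdio.h>

enum display_mode { MODE_KNOTS, MODE_KMH };   /* the "what-to-do" flag */

/* The caller chooses which branch executes inside this function, so the two
 * modules are control coupled through the mode parameter. */
static void display_airspeed_in(int airspeed, enum display_mode mode)
{
    if (mode == MODE_KNOTS) {
        printf("airspeed: %d kt\n", airspeed);
    } else {
        printf("airspeed: %d km/h\n", airspeed);
    }
}

int main(void)
{
    display_airspeed_in(120, MODE_KNOTS);   /* caller controls the callee's flow */
    display_airspeed_in(222, MODE_KMH);
    return 0;
}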
This is also why, at the requirements stage, we MUST consider both in-range and out-of-range behavior
of the software with respect to our sensor inputs; this, along with appropriate tolerances, is what makes
for a very robust software design. It is much easier at the systems level to specify ALL expected inputs
(including all probable sensor failure modes) than to figure it out after several incidents (which is totally
unacceptable). This is why tolerances over abnormal (but expected) input ranges are necessary and why,
in some cases, secondary or even tertiary sensors are required. This is also what drives our corner-case
software testing and robustness testing. We need to know that, regardless of what happens to the
aircraft in flight, the system will continue to perform correctly in order to maintain a safe flight
environment.
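As a small sketch of what specifying in-range and out-of-range behavior can look like in code (the limits
and names are assumptions for illustration only):

#include <stdio.h>

#define AIRSPEED_MIN_KT   0
#define AIRSPEED_MAX_KT 400   /* assumed sensor limit, for illustration only */

/* Returns a validated airspeed. Out-of-range readings (e.g. a failed sensor) are
 * clamped and flagged so the rest of the software behaves deterministically. */
static int validate_airspeed(int raw_kt, int *valid)
{
    if (raw_kt < AIRSPEED_MIN_KT) { *valid = 0; return AIRSPEED_MIN_KT; }
    if (raw_kt > AIRSPEED_MAX_KT) { *valid = 0; return AIRSPEED_MAX_KT; }
    *valid = 1;
    return raw_kt;
}

int main(void)
{
    int valid;
    printf("%d (valid=%d)\n", validate_airspeed(150, &valid), valid);  /* normal case     */
    printf("%d (valid=%d)\n", validate_airspeed(-5,  &valid), valid);  /* robustness case */
    return 0;
}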
Another issue with control coupling is called the ripple effect. A change in the controlling function or
module may require a change in the controlled function or module. This can create a maintenance
nightmare in keeping the software correctly functioning and documented. It may also require more than
one design iteration to fully realize the effects of a change when there is a high degree of control
coupling. This drives up time to market, increases the amount of testing needed, and makes the
software more difficult to maintain and to gain Certification Authority approval for.
We should bear in mind that the lifespan of some of these systems can be 20 to 40 years or more.
Aircraft from WW2 are still flying today. Most of them have mechanical systems, which can be easier to
troubleshoot than software. Normal wear and tear is usually obvious on a mechanical system, or can be
discerned by non-destructive inspection or x-rays. With software we don’t have that advantage. We only
have the design data and the comments in the code. Duplicating the conditions that cause an issue in
flight can be the most difficult part of troubleshooting a software bug. This is why we want as much
design information as possible in the required documentation and why we need to maintain archival
copies of the entire development process.
So how do we minimize control coupling? Primarily by designing low coupling into the software from the start.
1) Maintain functional cohesion (e.g. grouping similar functions together; see the sketch after this list).
2) Review, review, review.
3) Be open to all your reviewers. The more differing interpretations of how to best do something the
better off your software will be. So, engage in debates about how to best do things, but keep them
friendly debates, so everyone will share their ideas readily. Remember there is no one best way to
accomplish things in software. It all depends on the processor, hardware, and to some extent the rest of
the software. Sometimes ‘sleeping on an issue’ is the best way to solve things (keep a note pad handy),
other times asking a trusted colleague is the best way. Don’t limit your toolset to only what is listed in
the PSAC, sometimes there is no substitute for good old intuition. Usually, the most elegant solution to a
problem comes only after a few iterations of development builds and lab testing. Afterwards it will seem
so simple that you will wonder why it took so long to develop.
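As a small sketch of functional cohesion (the module layout and names are illustrative assumptions), all
airspeed-related processing can be grouped in one module behind a small interface, so other modules
are coupled only to that interface and not to its internals:

/* airspeed.h -- the only names other modules see (hypothetical module) */
#ifndef AIRSPEED_H
#define AIRSPEED_H
void airspeed_update(int raw_sensor_counts);
int  airspeed_get_kt(void);
#endif

/* airspeed.c -- conversion and storage are grouped together in one place */
static int s_airspeed_kt;                                     /* file-scope, not global */

static int counts_to_knots(int counts) { return counts / 4; } /* assumed scaling */

void airspeed_update(int raw_sensor_counts)
{
    s_airspeed_kt = counts_to_knots(raw_sensor_counts);
}

int airspeed_get_kt(void)
{
    return s_airspeed_kt;
}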
Conclusion:
Data Coupling and Control Coupling are ways to measure the interdependence of one module on
another. Modules should have low coupling, as this minimizes the ripple effect (where changes in one
module cause errors in other modules). Low coupling is inherent to the development practice of
modularization; therefore, in safety-critical software, DC/CC assessment and testing are performed in
order to confirm and document that modularity.