You are on page 1of 9

The content of this document is proprietary.

A Comparison of MISRA C Testing Tools

Document Summary

Authors: SP & JJ

Date: 19 October 2001

Revision: 1.0

Presented at the MISRA C Forum 18 October 2001

Contents
1. Introduction................................................................................................................................ 2
2. The Tools.................................................................................................................................... 2
3. The Test Cases .......................................................................................................................... 2
3.1 Types.................................................................................................................................... 3
3.2 Linkage ................................................................................................................................. 3
3.3 Initialisation........................................................................................................................... 4
3.4 Operators ............................................................................................................................. 5
3.5 Conversions ......................................................................................................................... 5
3.6 Summary .............................................................................................................................. 6
4. An Industrial Example............................................................................................................... 6
4.1 Deviation Mechanism ........................................................................................................... 7
4.2 Industrial Scale Analysis ...................................................................................................... 7
4.3 Ease of Use.......................................................................................................................... 8
5. Discussion ................................................................................................................................. 8

1
Pi Technology, Milton Hall, Ely Road, Milton, Cambridge, CB4 6WZ, UK
Tel: +44 (0) 1223 441434 Fax: +44 (0) 1223 203999
The content of this document is proprietary.

1. Introduction
This paper describes the work performed at developed engine control system of around
Pi Technology during the Summer of 2001 in 100,000 lines of commented code.
evaluating MISRA C static checking tools.
The intent of this two-pronged attack is clear:
The purpose of this work was:
there is little value in a tool which can fully
• to examine the consistency of verify that source code complies with
interpretation of MISRA C rules between MISRA C but can only be used on tiny
different checking tools; systems. One of the significant features of
MISRA C is that it is intended for industrial
• to determine whether MISRA C checking use, not simply for academic discussion.
tools are sufficiently mature for use on an Therefore we wished to verify not only that
industrial sized software system. MISRA C is sufficiently well defined to feel
The study followed two routes: confident in claiming compliance, but also
that the tools available for checking
1. the production and application of a test compliance are adequate for use on systems
suite for individual MISRA C rules; of industrial size and complexity.
2. evaluating the testing tools behaviour The process of evaluating MISRA C tools is
when presented with a production important because the MISRA C guidelines
software system. themselves require that tools are validated
The test suite contains examples of three (see MISRA C §5.2.3). Validation suites
types of tests: those that clearly violate a exist for C compilers and it is reasonable to
MISRA C rule, those that clearly do not ask tool vendors to provide proof that their
violate a MISRA C rule, and those for which a tool is conformant to the C standard. For
rule violation is uncertain. The final test set MISRA C checkers there is currently no
consists of around 600 test cases for 90 of validation suite nor a validation process. The
the 127 MISRA C rules. onus of proving that a MISRA C checking tool
is adequate for purpose is on the software
The production software system used for developer.
evaluating the tools was a Pi Technology

2. The Tools
Several MISRA C tool vendors made their Tools which provided MISRA C checking as
tools available for evaluation purposes. The an add on feature (such as compilers that
six tools evaluated included examples from implement limited MISRA C checking) were
most of the major vendors. not included in the study. Given the required
investment in installing a MISRA C
• We evaluated both beta-test software compliance process it seems unwise to base
and long-standing mature software. this on a tool which is designed for use on
• We evaluated tools specifically targeted software for only one hardware platform.
at MISRA C checking and tools where The intention of this paper is not to identify
the MISRA C checking feature was a one particular tool as being "the best" nor to
small part of a larger static (and identify straightforward bugs in the
sometimes dynamic) checking tool. implementation of particular tools. Our main
• We evaluated tools covering a significant interest was in identifying where tools differ in
price range. their interpretation of rules and where
possible to identify the causes of these
differences in interpretation.

3. The Test Cases


The largest part of the tool study was based This test suite was written without reference
around the production and application of a to any previous tests and without reference to
set of test cases with which to exercise the a particular tool. It was intended to provide a
tools. fair set of exercises.

2
Pi Technology, Milton Hall, Ely Road, Milton, Cambridge, CB4 6WZ, UK
Tel: +44 (0) 1223 441434 Fax: +44 (0) 1223 203999
The content of this document is proprietary.

In many cases we were able to decide when This test case clearly violates the intent of the
developing tests what the expected result rule yet does not violate the rule as written.
was. Occasionally we were however unsure Of the six tools tested five claimed to check
whether the test case was compliant with a this rule: three reported this as a violation of
specific MISRA C rule. rule 13, two did not.
As the amount of time that could be spent on If we consider the text of the rule again we
this project was limited the test cases were can ask "what does it mean to use a type?"
targeted at the most interesting rules. The Clearly when we declare an object to have a
majority of the effort was expended on rules particular type we use the type but what
that are: about declaring pointers to that type? This
leads to the following test case:
• poorly defined (i.e., rules with several
possible interpretations), or typedef unsigned char *str;

• difficult to implement. bool f(str s)


{
This means that the results of the tests are return (s[0] != '\0');
weighted towards showing the more negative }
aspects of the tools and of MISRA C—rules
This test declares a pointer to a type that
which are well specified and easy to
"should not be used" and then uses a value
implement feature little.
of that type (s[0] is of type unsigned
The examples in this paper are obviously char) however none of the tools evaluated
intended to be short and concise. They are produced a warning about the use of basic
extracted from the full test suite with names types in such examples.
shortened and irrelevancies (such as
A further way in which we can use types is in
unnecessary comments) removed. The
cast operations. Two of the tools failed to
types S8, S16, S32, U8, U16, U32, F32 and
report a violation of rule 13 for:
F64 are used for signed, unsigned, and
floating types of particular bit lengths; bool is F64 fn(F64 f)
used to represent a boolean type {
return (sqrt((double) f));
The following sections discuss particular }
rules of interest.
It could be simply stated that these problems
are implementation issues with the tools
under test; or that they are irrelevant example
3.1 Types cases unrelated to real world situations. But
An important MISRA C rule that is intended the important issue is "is this code MISRA C
to promote portability of code and make the compliant?" The authors believe that each of
code more explicit is rule 13: "The basic the examples in this section is in breach of
types of char, int, short, long, float and the intent of rule 13. If they were reviewing
double should not be used, but specific- this code for conformance with MISRA C
length equivalents should be typedef’d for the then they would reject it.
specific compiler, and these type names
used in the code." This rule is interesting
because it seems perfectly well specified and 3.2 Linkage
it interacts with several other rules (which
we'll discuss later). An example of a rule that is reasonably well
specified but may cause implementation
An obvious clue to identifying possible problems is rule 24: "Identifiers shall not
interpretation differences here is that the rule simultaneously have both internal and
uses the standard C term "basic types" external linkage in the same translation unit".
(defined in ISO/IEC 9899:1990, §6.1.2.5) but
provides a different set of basic types to This rule is well defined since it borrows
those defined in the standard. The standard directly from the C standard §6.1.2.2: "If,
C definition of a basic type is "the type char, within a translation unit, the same identifier
the signed and unsigned integer types, and appears with both internal and external
the floating types". Notice that the unsigned linkage, the behaviour is undefined." The
integer types are specified here but not in the rule is however difficult to implement since,
rule text. This leads to the trivial test case: as the rule 13 text states, "the rules for
linkage in C are complicated". This rule
unsigned f(void) therefore is a good candidate for testing each
{ of the tool's understanding of standard C—a
return (0u); solid standard C parser seems a prerequisite
}
for a good MISRA C checker.
3
Pi Technology, Milton Hall, Ely Road, Milton, Cambridge, CB4 6WZ, UK
Tel: +44 (0) 1223 441434 Fax: +44 (0) 1223 203999
The content of this document is proprietary.

As acknowledged by the MISRA C Technical (see standard C §6.1.2.4). Only three of the
Clarification document the example of a tools detected this error.
breach of this rule in MISRA C is incorrect:
On a more positive note the control flow-
static S32 s; based analysis performed by some tools did
void f(void) reduce the number of false warnings such as
{ for:
extern S32 s; S32 f(S32 a)
} {
At the point of the second declaration of s the S32 b;
initial declaration is visible and therefore the if (a == 1) {
identifier inherits the static linkage from the b = 1;
initial declaration. Two of the tools however }
report rule 24 violations for this test case. if (a != 1) {
Two of the tools could not be convinced to b = 0;
provide a rule 24 violation independent of the }
input text.
return b;
In general the behaviour of the tools differed }
significantly when processing this rule and
This (deeply unattractive) code has only two
some tools have a clear misunderstanding of
executable paths - both of which lead to the
the C linkage rules. Only one tool conformed
assignment of a value to b. Only two of the
to our expectations for this class of tests.
tools seemed to identify that the branches
are related and avoided producing an
uninitialised variable warning. When the
3.3 Initialisation second if statement was replaced with the
Rule 30 states: "All automatic variables shall (preferable) else clause none of the tools
have been assigned a value before being reported an error for rule 30.
used." This is a translation of a statement of These results (except for the first) can all be
undefined behaviour from standard C §6.5.7: stated to be "quality of implementation"
"The value of an uninitialised object that has issues. However there is a more
automatic storage duration is used before a fundamental issue. It seems clear that the
value is assigned". MISRA C rule is intended to be
An obvious violation of this rule is: straightforward rewrite of the standard C
undefined behaviour into more "engineer
S32 f(void) friendly" language. However it is obvious that
{
the statements do differ: MISRA C says that
S32 a;
a variable may not be used, standard C says
return a; that the value contained in the object may not
} be used.
Rather alarmingly one tool which claimed to The tools seem to differ in their interpretation
check for this rule did not identify this as a of the word "used"; two of the tools seem to
violation (although did identify the use of believe that taking the address of a variable
uninitialised values in complex cases). constitutes use. For example:
More complex examples expose differences S32 f(void)
in the tools: {
S32 a;
S32 f(void) S32 *p;
{
S32 a; p = &a;
*p = 0;
goto l;
{ return a;
S32 b = 0; }
l: a = b; Two of the tools incorrectly reported a
} violation of rule 30 for this example based on
return a; the use of the address-of operator. A further
} tool also reported a violation but we were
In this example the returned value is derived unable to identify the root cause of its report.
from an automatic variable whose This example might seem unlikely to cause
initialisation is elided by the jump statement difficulties in realistic code but note that it

4
Pi Technology, Milton Hall, Ely Road, Milton, Cambridge, CB4 6WZ, UK
Tel: +44 (0) 1223 441434 Fax: +44 (0) 1223 203999
The content of this document is proprietary.

prevents the initialisation of data by passing it 2. one of the tools permits comma
to functions such as memset. operators in the initialisation and re-
initialisation expressions but not in the
Note that tools also need to be cautious
controlling expression;
about marking data as initialised simply
because it has been passed by reference. 3. four of the tools allow comma operators
For example: in any of the for loop expressions.
static void It is difficult to state which of these positions
init(U32 *p) is most valid.
{
} A further complication comes from the use of
the word "in". If there is a comma operator
U32 f(void) within the expression then must this be at the
{
U32 a; top level of the expression or can it be
embedded further within it? For example:
init(&a);
return a; void f(void)
} {
S32 a;
This example is obviously missing some
for (a = 0; a < 2;
initialisation code in init. Of the tools which g((a++, a)))
didn't report &a as a rule 32 violation none {
identified that the variable was never }
initialised. }
The result here is clear: the implementation This contrived example demonstrates a
of this rule is variable. Some of that comma expression within a function call
variability is due to simple quality of within an expression within a for statement.
implementation issues; some is probably due Is this acceptable? Three of the tools believe
to unclear wording in the MISRA C that this is a violation of the rule, three do not.
document.
Clearly the term "control expression" needs
to be defined if this rule is to have value.
Whether "in" is defined seems less important.
3.4 Operators It is always possible to write code that
An example of an operator-related rule which exploits this kind of loophole in the definition
indicates significant tool variance is rule 42: of a rule. Such code is however likely to be
"The comma operator shall not be used, easily identifiable in a manual review, and
except in the control expression of a for should be rejected.
loop." A for statement is defined in C as:
for (e1; e2; e3) stmt
3.5 Conversions
Where e1, e2, and e3 are optional
MISRA C rule 44 states: "Redundant explicit
expressions and stmt is a statement.
casts should not be used." All but one of the
Unfortunately the term "control expression" is tools evaluated claim to check this rule. They
not defined. The closest we can come to a all reported a rule violation for:
definition for this term is in a footnote in S16 f(S16 a)
standard C (footnote 76, §6.6.5.3) where it {
calls the second expression in the for return (S16) a;
statement the "controlling expression". Does }
the MISRA C term mean the same as the
standard C term? If so then that is surely a However if we look a little deeper there are
mistake, most examples of comma operators differences. One of the significant problems
in for loops occur in the first and third involved with producing MISRA C compliant
expressions (the "initialisation" and "re- code is in reconciling this rule with the rules
initialisation" expressions). for "sized types" (rule 13), the rule for "integer
suffixes" (rule 18), the rule for "implicit
None of the tools appear to implement the conversions" (rule 43) and the C promotion
rule if we assume that "control expression" is rules.
the same as "controlling expression". The
tools behaved as follows: As a reminder standard C §6.2.1.1 states
that: "A char, a short int, or an int bit-field, or
1. One of the tools reports every comma their signed or unsigned varieties, or an
operator as a violation of this rule (even enumeration type, may be used in an
comma operators within for statements); expression wherever an int or unsigned int

5
Pi Technology, Milton Hall, Ely Road, Milton, Cambridge, CB4 6WZ, UK
Tel: +44 (0) 1223 441434 Fax: +44 (0) 1223 203999
The content of this document is proprietary.

may be used. If an int can represent all of the second operand. However none of the
values of the original type, the value is tools evaluated identified a rule 44 violation
converted to an int; otherwise, it is converted for this text.
to an unsigned int. These are called the
The recommended way of writing this would
integral promotions.”
be (again from the technical clarification):
If we consider the following example: F64 f(S32 a, S32 b)
S16 f(S16 a, S16 b) {
{ return ((F64) a) / b;
return (S16) (a + b); }
}
This however produces a rule 43 violation
we cannot tell whether the cast is redundant from one tool and a rule 48 violation from
unless we know how S16 is implemented: another.
• on a 16-bit machine it is likely that S16 It is clear that this rule is insufficiently well
will be a typedef for int; defined to be able to claim compliance.

• on a 32-bit machine it is likely that S16


will be a typedef for short.
3.6 Summary
Four of the tools evaluated allow the user to
Executing the test cases demonstrated that
input the size of basic types. This allows the
there were significant differences in the
checking tool to understand the effects of
interpretations of the rules between the
promotion and balancing within expressions.
various tools. The rules which demonstrated
The tools did not provide a consistent set of the largest variance were:
results for this test case. What was clear
• The rules related to types. The required
was that at least one of the tools does not
behaviour of MISRA C checking tools is
understand C promotion rules (it claimed that
difficult to understand when the various
the expression "a + b" had type signed
rules on types, integer suffixes and
short—the promotion rules force it to type
casting are combined.
int), and one of the tools reported the cast as
being redundant independent of the basic • Rules which used obvious but undefined
types of S16. terms, e.g., "redundant explicit casts" (in
rule 44), "control expression" (in rule 42),
The MISRA C Technical Clarification
"unreachable code" (in rule 52), or
discusses this issue further and provides
"effectively boolean" (in rule 49).
some examples of non-compliant code:
F64 f(S32 a, S32 b) • Rules where the vendors had to make a
choice between implementing what the
rule said and what the rule meant.
{ Rule 42 is again a good example.
return ((F64) a) /
((F64) b); In total we identified around 30 rules with
} unexpected behaviours, or where static
checking tools had significantly differing
This is marked as non-compliant since
implementations of the rule.
balancing is sufficient to force the conversion

4. An Industrial Example
The test suite discussed above provides • how well does the tool scale to industrial
some detailed information on how the various sized input;
tools implement MISRA C rules. It is
however important to consider not only how • how well can the tool be integrated into a
well the tools implement the specific rules but software development process;
how easy the tool is to use; and more • how appealing is the tool to software
specifically how easy the tool is to use in an engineers.
industrial setting.
The usability of the tools was tested by
There are several considerations that affect attempting to analyse a reasonably sized
the usability of the tool: program: a 100,000 line production quality
• how does the tool handle deviations from engine controller.
MISRA C;

6
Pi Technology, Milton Hall, Ely Road, Milton, Cambridge, CB4 6WZ, UK
Tel: +44 (0) 1223 441434 Fax: +44 (0) 1223 203999
The content of this document is proprietary.

This software was written to an appropriate pragma in the source code requires further
coding standard and has been previously documentation (in order to satisfy rule 99)
analysed for errors using static analysis tools and introduces further implementation-
and by manual review. The software was not defined behaviour (the point of a pragma is to
written to be MISRA C compliant (the project introduce implementation-defined behaviour).
predates the publication of MISRA C) Given that the deviation will need further
however the coding standard used has some documentation than a pragma directive alone
overlap with MISRA C (e.g., the software is the choice of stylised comments seems
developed with "sized types" compliant with preferable.
rule 13). We expected that MISRA C tools
One small but practical point is the manner in
would identify many MISRA C violations but
which deviations are handled for tools which
few actual errors in the code.
map MISRA C rules to legacy C rules (for
example a static tool may implement several
internal rules which together implement a
4.1 Deviation Mechanism MISRA C rule; this is rarely a one-to-one
The MISRA C guidelines place a strong mapping). Some of these tools enable the
emphasis on operating a deviation user to deviate from a specified internal rule
procedure: a product need not be free from but not a specified MISRA C rule.
violations of MISRA C rules so long as the The effect of this is that it is necessary to
violations are documented and signed off by understand the tool's mapping between
a "C language expert together with manager internal rules and MISRA C rules in order to
level concurrence." For a large project the define deviations. Since the mapping is not
number of deviations and the scope of those necessarily one-to-one a single internal rule
deviations will require significant deviation may implement deviations from
management and maintenance. An inability several MISRA C rules, or several internal
to support this process reduces the value and rule deviations may be necessary to deviate
usability of a tool. from a single MISRA C rule. In general the
There are two obvious mechanisms for management of this process is likely to be
documenting deviations: cumbersome. The ability to deviate from
specific MISRA C rules simplifies this
1. Insert the deviation statement into the process.
source code at an appropriate point. The
review and sign off of this source code is
the deviation procedure. 4.2 Industrial Scale Analysis
2. Record the list of deviations produced by
If a tool is going to be useful in industry then
a tool and sign off this document. The
it needs to be able to handle industrial size
deviation process then becomes the sign
programs. Our input does not seem an
off of a list of errors.
extreme example. The source code consists
The choice between these two processes is a of 115000 lines of commented source code
management and process issue. It is worth in:
noting however that inserting the deviations
• 34 source files (84000 lines in total);
into the source code allows for easier
software maintenance: old violations are not • 34 header files (31000 lines in total).
highlighted by the checker. Only the
additional rule violations are shown. The One of the tools evaluated performed single
alternative is to be presented with a new set translation unit checks only. It was not
of deviations to compare against the old. practical to analyse the entirety of the
system. Of the remaining five tools one took
Of the six tools we evaluated: around 4 hours to analyse the source code
(although more often than not the tool failed
• one had no provision for deviating for
part way through the analysis). The
specific rules (it was however possible to
remaining tools took between 25 minutes and
select only required rules);
15 seconds to perform the analysis.
• one had provision for selecting the
Of course the value of the tool is dependent
required rules on a per-analysis basis;
on both its speed and usability and on the
• four had provision for inserting deviation quality of its results - we do not wish to
documentation within the source code. penalise tools which do more thorough (or
deeper) analysis. It is clear however that a
The mechanisms for inserting deviations into tool which takes four hours to analyse a
the source code were either through program of this size is probably not going to
#pragma directives or through stylised be often used, and a tool which takes 15
comments. Note that the insertion of a
7
Pi Technology, Milton Hall, Ely Road, Milton, Cambridge, CB4 6WZ, UK
Tel: +44 (0) 1223 441434 Fax: +44 (0) 1223 203999
The content of this document is proprietary.

seconds could be usefully incorporated into stage, and much of the benefit of static
the normal software build process. analysis has already been lost.
The most disappointing result from the The benefit of being able to analyse an entire
industrial case study was the large number of project cannot be over-emphasised. Tools
false positive results identified. As previously which enable the analysis of a single
stated the software had been developed to a translation unit are valuable; but a change to
reasonably strict coding standard. We did a single header file can trigger a huge
expect that tools would identify a number of amount of work as all files that include this
MISRA C rule violations but one tool have to be reanalysed. This re-examination
managed to produce approximately 40000 is necessary but tools which can analyse
violation reports. The vast majority of these multiple files make it much simpler. The
were false positives (due to an insufficient single translation unit tools are also limited in
analysis of the source code or to the number of rules they can check: rules
misunderstandings of the C language). It such as rule 25 "An identifier with external
was difficult to identify any real errors in the linkage shall have exactly one external
software when the "noise" level was so great. definition." can only be checked by examining
the entire program. Note however that even
Applying these tools to existing source code -
tools which analyse the entire project will
even source code developed to a similar
have some issues of this nature when
coding standard - with the intention of gaining
presented with a project developed in a
MISRA C compliance, is likely to be a long
mixture of both C and other languages (e.g.,
term activity.
the typical embedded system mix of C and
assembler).

4.3 Ease of Use MISRA C and MISRA C checking tools are


only successful if they are used on real
The incorporation of static checking tools into projects. This means that engineers need to
the build process seems to be a valuable way accept the standard and accept the checking
of getting engineer buy-in to the MISRA C tools. There are some obvious lessons from
process. Most software engineers are our short study:
somewhat sceptical about static checking
and coding standards; automating the • the tools need to be reliable and capable
checking process at the build level enables of analysing the systems that are being
instant feedback of errors saving debugging produced;
time. Several of the tools are intended for • the tools need to handle deviations
graphical use only. This makes the intelligently;
automation of running the tool awkward and
will probably result in the tool being used only • the tools need to run sufficiently quickly
at the code review stage of development. In and be sufficiently easy to automate for
the typical automotive development the users to bother using them;
environment it is likely that the software will
• the tools need to produce very few false
have been executed on the target by this
warnings - we saw far too many.

5. Discussion
The MISRA C guidelines provide some 2. In producing a test suite it is imperative
guidance on the selection of static checking that there is an expected result.
tools. In MISRA C §5.2.3 it is suggested that
3. Passing a large body of code through a
the static checking tool is validated by the
MISRA C checker will reveal much about
production of test cases to exercise the tool
the tool's robustness and usability, but
and by exercising the tool with known good
little about its checking qualities.
code.
Our test suite, which is incomplete and
We have followed this model in evaluating
largely undocumented - it is certainly not of
the tools. The basic observations from this
validation quality - took some 8 man weeks to
are:
produce. It seems unreasonable to expect
1. Producing a test suite capable of software developers to go through this
demonstrating that a tool satisfactorily process in order to evaluate a checking tool
identifies MISRA C violations is a for a published language standard; yet if they
significant piece of work. do not then they cannot reasonably claim to
be producing MISRA C conformant code.

8
Pi Technology, Milton Hall, Ely Road, Milton, Cambridge, CB4 6WZ, UK
Tel: +44 (0) 1223 441434 Fax: +44 (0) 1223 203999
The content of this document is proprietary.

The tool evaluation process will also need to would be a useful tool for the software
be repeated whenever it is necessary to use developer. There were however significant
a different checking tool (e.g., if a new differences in the quality of checking
version is shipped by the tool vendor or if a performed by the tools:
customer demands that a specific tool is
used on a project). • the number of rules tested differed
between the tools;
When developing the test suite there where
many tests where we could not determine by • the interpretation of many of the rules
examining the MISRA C guidelines whether was not consistent - several of the
the code violated the rule. Of course interpretations were baffling, it was
specifications of any sort usually have some difficult to see the relationship between
ambiguity but there were areas where the MISRA C rule text and the
detailed examination of the MISRA C rule implemented rule;
and, where possible, the source of the rule • the tools differed in the extent to which
provided no real confidence in what the rule they implemented the MISRA C technical
meant. clarification - some of the tool vendors
A good example of this is rule 108: "In the were unaware that this document
specification of a structure or union type, all existed;
members of the structure or union shall be • the usability of the tools differed
fully specified." No-one could object to this significantly, in particular there are three
rule, but unfortunately its meaning is not clear orders of magnitude difference in speed
when considered more carefully: what does between the fastest and the slowest
"fully specified" mean? The C standard tools;
requires that a structure definition shall not
contain an incomplete type (§6.5.2.1) and the • some tools had clearly had insufficient
cited source reference ("undefined 35") does testing on industrial sized systems -
not seem relevant. these were not necessarily the tools
made available in beta release.
Until issues such as these are clarified it is
difficult to see how a formal statement of The conclusion must be that claims of
MISRA C compliance can be made. It is MISRA C compliance need to be based on
impossible to claim compliance with a rule significant effort in validating the checking
that cannot be understood. It is possible that tool. The tools we evaluated showed
over time checking tools may tend towards a significant differences that the user of the tool
uniform interpretation of the rules. Even if needs to understand before claiming
this does occur there are still rules that compliance.
require manual inspection (such as rule 4). We would like to thank the tool vendors for
Ambiguity in these rules is even more difficult their co-operation in this work. The support
to manage: instead of several tools having we received from all the vendors during our
their own interpretation of the rule we have evaluation was faultless. However we cannot
each individual engineer having a different enter into correspondence with vendors,
interpretation. users or potential customers for these tools
Each of the MISRA C checking tools we
examined performed valuable checks and

9
Pi Technology, Milton Hall, Ely Road, Milton, Cambridge, CB4 6WZ, UK
Tel: +44 (0) 1223 441434 Fax: +44 (0) 1223 203999

You might also like