
Knowledge-based method for the validation of complex simulation models

Fei-Yan Min *, Ming Yang, Zi-Cai Wang
Control and Simulation Center, School of Astronautics, Harbin Institute of Technology, Harbin, PR China
Article info
Article history:
Received 2 December 2008
Received in revised form 11 November 2009
Accepted 11 December 2009
Available online 28 December 2009
Keywords:
Complex simulation model
Validation method
Knowledge-based
Abstract
In traditional research, the validation of simulation models is mainly based on statistical analysis and simulation-error evaluation. As simulation models become more and more complex, simulation behavior grows more complicated and heavily dependent on simulation conditions, so traditional methods cannot be applied directly. Moreover, the measured data these methods require are not always available. What's more, the validation of complex models is usually costly and time-consuming.
This research attempts to settle the problems above. It offers three advantages for model validation. First, complicated simulation behavior is abstracted and classified into five categories, among which behavior relationships and aggregative behavior are unique to complex simulation models, and an analysis and validation method is proposed for each kind of behavior. Second, besides measured data of the real system, it proposes to utilize expert experience and other kinds of domain knowledge for the validation task. Third, the proposed simulation analysis and validation methods can be implemented in a knowledge system, so that validation tasks are accomplished automatically.
Simulation output analysis is the most important step in model validation. Besides classical continuous-dynamic fitting and statistical consistency analysis, we propose a domain-knowledge-based method for validating the relationships among behavior segments and the aggregative behavior that are unique to complex simulation models. Domain knowledge takes the role of reference in model validation, and it ranges from measured data of the real system to qualitative expert experience. A knowledge-based system is implemented on the basis of the domain knowledge and validation techniques proposed.
The validation of electromagnetic rail gun simulation models is introduced as an example. It has been found that this method provides an automatic validation path for complex simulation models, and that validation tasks can be accomplished efficiently.
© 2009 Elsevier B.V. All rights reserved.
1. Introduction
It is evident from current research that simulation models play an important role in product acquisition, system analysis and personnel training. People are increasingly concerned with the correctness of their models, and these issues are addressed with model verification and validation techniques. Verification refers to the processes and techniques that model developers use to assure that their models are correct and match any agreed-on specifications and assumptions. Validation refers to the processes and techniques that model developers, model customers and decision makers jointly use to assure that models represent real systems (or proposed real systems) to a sufficient level of accuracy [1]. In this paper, we concentrate on the specific issues of validating complex simulation models.
1569-190X/$ - see front matter © 2009 Elsevier B.V. All rights reserved.
doi:10.1016/j.simpat.2009.12.006
* Corresponding author. Tel.: +86 0451 8600164.
E-mail addresses: minfeiyan@yahoo.com.cn (F.-Y. Min), myang@hit.edu.cn (M. Yang).
Simulation Modelling Practice and Theory 18 (2010) 500–515
For decades, a variety of validation methods have been developed [1–4]. In 1980 [2], Balci and Sargent reviewed the literature on model validation, counting 125 publications. By 1984 [3], the number had increased to 308. These model validation methods can generally be classified as objective or subjective. At the same time, the idea of utilizing expert systems for the validation of simulation models was proposed, and several expert systems were applied in particular areas.
In these expert systems, researchers mainly focused on implementing classical objective validation methods, such as statistical analysis, and validation tasks were carried out as a comparison and goodness-of-fit testing procedure. Birta and Ozmizrak's system was a typical rule-based system, designed around a validation knowledge base (VKB), a collection of credible propositions about simulation outputs that took the role of validity criterion [5]. Findler and Mazur's system, on the other hand, was in essence a case-based system. The authors pointed out that there were five typical categories of errors occurring in simulation models (SM) of large, complex systems, and their verification and validation system was designed mainly around these error cases [6]. Hopkinson and Sepulveda's system provided a method for real-time evaluation of a trainee's performance. It was developed using case-based reasoning to evaluate the trainee's decisions during simulation execution. An advantage of this system is that real-time evaluation of a trainee can be performed automatically [7].
One disadvantage of these validation systems was their heavy dependence on the availability of measured data from the real system. However, this prerequisite cannot always be satisfied when validating complex simulation systems, which makes direct comparison between simulation outputs and real-system behavior impossible [8].
Since the 1990s, simulation models have become more and more complex in both structure and behavior, and heavily dependent on simulation conditions. Traditional validation methods, such as statistical analysis and simulation-error analysis, cannot be applied directly.
On the other hand, the validation of complex simulation models can be very exhausting and time-consuming. For this reason, research on utilizing expert systems for the validation of simulation models received renewed attention. In 1999, SIMVAL reviewed most of the simulation model verification and validation methods that had been proposed, analyzed the requirements of V&V methods and tools, and concluded that the development of knowledge-based validation systems is a promising scheme [9]. In 2001, Goalwin et al. discussed the possibility of automated support tools for verification, validation, and accreditation; they pointed out that most VV&A tasks were still accomplished manually, and that it is necessary to develop automatic validation tools [10].
In this paper, a more sophisticated knowledge-based system is developed for the validation of complex simulation models. First, the complicated simulation behavior, which is hard to validate with traditional methods, is abstracted and classified into five types, and simulation analysis and validation methods are studied for each. Second, besides measured data, domain knowledge about the real system is utilized for the validation task; this knowledge ranges from classical theorems to experience about the dynamic behavior of the real system. Third, the knowledge system is designed independently of the content of the domain knowledge, which means it can be applied to different kinds of validation tasks given a proper domain knowledge base.
The structure of this paper is as follows. In Section 2, an overview of the knowledge-based validation method is given. In Section 3, the output behavior of complex simulation models is classified into five categories, and knowledge-based simulation analysis and validation methods are given for particular kinds of simulation behavior. In Section 4, the implementation of a knowledge-based validation system is introduced. Finally, the application of this method is described together with an example.
2. Overview of knowledge-based validation method
In practice, the experience of domain experts is often used for model validation when there is not enough measured data about the real system. The idea of knowledge-based validation is rooted in this fact.
Besides measured data and experiential knowledge, classical theorems and formulas about the dynamics of the real system can also be utilized for the validation of simulation models. All this information, termed domain knowledge, takes the role of reference for the validity judgment. In other words, domain knowledge describes the valid characterization of the simulation output, i.e. what kind of dynamic features it should exhibit.
Domain knowledge is important in this method. Forrester categorized domain knowledge in simulation as numerical, written and mental types [11]. We adopt this taxonomy; the content and source of each kind of domain knowledge are listed in Table 1.
Of the three kinds of reference information, mental information is often the most abundant, especially for the validation of complex simulation models. More details about domain knowledge and its acquisition can be found in Wang and Min [12].
The principle of knowledge-based validation method is displayed in Fig. 1.
First, besides domain knowledge, two other kinds of knowledge are used: inference knowledge and task knowledge. The content and characteristics of each kind of knowledge are listed in Fig. 1. The three kinds of knowledge are abstracted and described in the knowledge model.
Second, the knowledge-based validation system is developed on the basis of the inference and task knowledge. In the meantime, the knowledge base is designed based on the domain knowledge.
Third, the knowledge system is applied to each validation task automatically after the proper domain knowledge is loaded.
This method can alleviate the difficulties arising in the validation of complex simulation models in several respects:
(1) Experiential curves and data about the real system are acquired from domain experts, and this kind of knowledge can be used for model validation instead of measured data.
(2) Traditional validation methods mainly concentrate on the validity of continuous dynamics and discrete events. In complex simulation models, the relationships among different behaviors and the aggregate of multiple behaviors can be very complicated and can affect the credibility of the simulation result. We propose a domain-knowledge-based method for the analysis and validation of these behavioral characterizations in complex simulation models; see Sections 3.3 and 3.4.
(3) The knowledge system is designed independently of the application domain, which means it can be utilized for different validation tasks with a suitable domain knowledge base; see Section 4.
(4) The knowledge system can accomplish most of the validation task automatically. This removes, to a certain extent, the exhausting and resource-consuming character of the validation of complex simulation models.
3. Validation method of complex simulation model based on domain knowledge
R.G. Sargent observed that operational validity methods can be classified as comparison and exploring model behavior [4]. Most of the methods he mentions, such as mathematical or statistical analysis, the Turing test and sensitivity analysis, are in essence based on behavior consistency analysis of simulation outputs.
In traditional research, the behavior of simulation models ranges from continuous dynamics to discrete events. In practice, we find that there are other types of output behaviors unique to complex simulation models, such as the logic and timing dependence between behaviors and the aggregative behavior comprising several behaviors. These kinds of behavior can influence the credibility of the simulation result heavily. In this research, we classify the behavior of complex simulation models into five categories, as listed in Table 2.
Table 1
Category of domain knowledge in simulation model validation.

Numerical — Content: measured data about state variables; measured curves about continuous dynamics; statistical probabilities about random discrete events. Source: precise descriptions in the form of data, trajectories, charts, etc., found in various special and general databases and in observed data and curves of the real system.

Written — Content: descriptions of behavioral relationships; descriptions of hybrid dynamic sequences; descriptions of conditional discrete events. Source: explicit descriptions of the physical properties of the real system and the simulation scenario, drawn from the classical literature, usually in the form of theories, principles, theorems and formulas.

Mental — Content: experiential curves about continuous dynamics; experiential data about key indexes/variables; experiential probabilities for random discrete events. Source: ambiguous descriptions from the experience and mental impressions of experts; often incomplete, informal and biased.
Fig. 1. Principle of knowledge-based validation method of complex simulation models.
3.1. Classical validation methods
3.1.1. Continuous dynamic and its validation
Among the intricate behaviors of a complex system, discrete events and continuous dynamics are basic; all the behavior of complex simulation models can be decomposed into an aggregate of discrete events and continuous dynamics.
When a continuous dynamic takes place, some variables change gradually as time evolves, and no discrete event happens during the time interval.
If there is enough measured data about the real system, objective analysis methods can be utilized. The most common objective methods are based on mathematical error norms, such as the correlation coefficient and the Theil inequality coefficient.
The correlation coefficient method can be used to detect a linear relationship between the simulated and the measured data [13]. The correlation coefficient is defined as:

cc = \frac{n \sum_{i=1}^{n} x_i x'_i - \left( \sum_{i=1}^{n} x_i \right) \left( \sum_{i=1}^{n} x'_i \right)}{\sqrt{\left[ n \sum_{i=1}^{n} x_i^2 - \left( \sum_{i=1}^{n} x_i \right)^2 \right] \left[ n \sum_{i=1}^{n} x_i'^2 - \left( \sum_{i=1}^{n} x'_i \right)^2 \right]}}   (1)
The Theil inequality coefficient (TIC) method can be used to detect the degree of fit between the simulated and the measured data [14]. TIC is defined as:

TIC = \frac{\sqrt{\sum_{i=1}^{n} (x_i - x'_i)^2}}{\sqrt{\sum_{i=1}^{n} x_i^2} + \sqrt{\sum_{i=1}^{n} x_i'^2}}   (2)
Some other error norms can also be utilized, such as the expectation of error I_1 = \frac{1}{N}\sum_{t=1}^{N} e_t^2, I_2 = \sum_{t=1}^{N} e_t^2, I_3 = \sum_{t=1}^{N} \left( \frac{e_t}{x_t} \right)^2 and f(e_t) = \frac{1}{1 + I_1}, the maximum of error I_4 = \max(e_t), the time integration of error, etc.
Besides, there are other objective analysis methods, such as frequency-domain coherence analysis [15] and time-series methods [16].
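As an illustration, Eqs. (1) and (2) can be computed directly from paired simulated and measured series. The following Python sketch is not part of the original paper; the trajectories are made up for the example:

```python
import numpy as np

def correlation_coefficient(x, y):
    """Correlation between simulated series x and measured series y, per Eq. (1)."""
    x, y = np.asarray(x, float), np.asarray(y, float)
    n = len(x)
    num = n * np.sum(x * y) - np.sum(x) * np.sum(y)
    den = np.sqrt((n * np.sum(x**2) - np.sum(x)**2) *
                  (n * np.sum(y**2) - np.sum(y)**2))
    return num / den

def theil_inequality_coefficient(x, y):
    """Theil inequality coefficient, per Eq. (2): 0 for a perfect fit."""
    x, y = np.asarray(x, float), np.asarray(y, float)
    return np.sqrt(np.sum((x - y)**2)) / (np.sqrt(np.sum(x**2)) + np.sqrt(np.sum(y**2)))

t = np.linspace(0.0, 1.0, 50)
sim = np.exp(-3.0 * t)               # simulated trajectory (invented)
meas = sim + 0.01 * np.sin(20 * t)   # "measured" trajectory with a small deviation
print(correlation_coefficient(sim, meas))
print(theil_inequality_coefficient(sim, meas))
```

For identical series the correlation coefficient is 1 and the TIC is 0, their best-fit values; the small deviation above keeps both near those limits.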
3.1.2. Key index/variable and its statistical analysis method
In traditional validation, statistical methods mainly concentrate on the values of special variables at certain time points in terminating simulations [10]. In this research, we term these special variables key variables and indexes.
Statistical methods are the most common validation techniques for key variables and indexes in the literature [1,2,10]. Most of these statistical analyses concentrate on multiple replications of specific variables and should be carried out with both real measured and simulated data. Hypothesis testing and confidence intervals belong to this category.
The application of statistical methods depends on the availability of measured data [10]. If there is enough real data, the classical two-sample t-test can be used. If only experiential probabilities are available, Bayesian statistics can be applied; more work remains to be done in this field [17].
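A minimal sketch of the two-sample t-test mentioned above, using scipy; the data, sample sizes and significance level are invented for illustration:

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
measured = rng.normal(10.0, 1.0, size=30)    # 30 observations of a key index (synthetic)
simulated = rng.normal(10.1, 1.0, size=30)   # the same index from 30 simulation replications

# classical two-sample t-test on the key index
t_stat, p_value = stats.ttest_ind(measured, simulated)
print(f"t = {t_stat:.3f}, p = {p_value:.3f}")
if p_value > 0.05:
    print("no significant difference detected for this index")
else:
    print("significant difference: the model is questionable for this index")
```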
3.1.3. Discrete event and its validation
A discrete event is another kind of basic behavior of a complex simulation system. When a discrete event takes place, some variables may jump to new values instantaneously while other variables remain unchanged. Discrete events can be categorized as external, random and conditional events according to the type of precondition and the change of state variables, as listed in Table 3.
The validation of a discrete event is to ensure that it behaves as described in domain knowledge: its precondition must be satisfied when it takes place, and all the variables in the system should change to the new values as described.
Table 2
Output behavior of complex simulation systems.

- Continuous dynamic: all related variables vary continuously or remain constant
- Discrete event: some variables jump to new values instantaneously
- Behavior relationship: causal, logic and timing relationships between continuous dynamics and discrete events
- Aggregative behavior: dynamic comprising both continuous dynamics and discrete events
- Key variable and index: variables and metrics selected according to the simulation objective
Table 3
Taxonomy of discrete events in complex engineering systems.

- External event — precondition: none; change of related state variable: given value or random value
- Random event — precondition: given random distribution; change of related state variable: given value or random value
- Conditional event — precondition: some logical conditions; change of related state variable: given value
3.2. Experiential knowledge-based validation method
In this section, a validation method based on domain-expert experience is introduced. This method can be integrated into the knowledge system and used in situations where there is not enough measured data about the real system for model validation.
For decades, plenty of expert-based validation methods have been proposed, such as animation and the Turing test [1,4].
Our experience-based validation essentially automates these validation methods within a computer program. First, the experience of domain experts is elicited and abstracted into experiential curves and parameters. Second, simulation output analysis and validation functions are designed with these curves and parameters. Third, these functions are integrated into the knowledge system for the validation task.
3.2.1. Acquisition and abstraction of experiential curves
In Fig. 2, experiential curves of the dynamics of an exponential decay and a second-order inertia system are displayed. The validation of these two kinds of continuous dynamic is introduced below.
In the first step, the ambiguous impression of the dynamic is elicited from domain experts, and this mental experience is then transformed into a mathematical description.
As the domain expert suggested, the behavior of s_{ai} should follow a curve of exponential decay. The mental description of the dynamic behavior of s_{ai} is then abstracted into mathematical form:

F(s_{ai}) = h_1 e^{-h_2 t} + h_3   (3)
where h_1 is the decay coefficient, h_2 is the decay time constant and h_3 is the offset to which the decay settles (so the initial value of the decay is h_1 + h_3, as in Table 4). As illustrated in the expert curve, the characterization of the continuous dynamic can be described by these three parameters.
In the same way, we can acquire the mental description and characteristic parameters of the dynamic behavior of s_{bi}: it is the output of a second-order inertia link under a step input, and its ordinary characteristic parameters are the overshoot, the steady-state time and the number of oscillations, as shown in the figure.
The characteristic parameters of the behaviors in Fig. 2 are listed in Table 4.
In a word, the dynamic characterization of an expert curve can be described by certain parameters, and our experience-based behavioral consistency analysis concentrates mainly on these characteristic parameters.
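To make the characteristic parameters concrete, the following Python sketch (not from the paper; all parameter values are hypothetical) generates the two experiential curve shapes of Fig. 2 and extracts the parameters named in Table 4 numerically:

```python
import numpy as np

t = np.linspace(0.0, 10.0, 4001)

# Exponential decay F(s_ai) = h1*exp(-h2*t) + h3 with illustrative parameters
h1, h2, h3 = 2.0, 1.5, 0.5
s_a = h1 * np.exp(-h2 * t) + h3            # starts at h1 + h3 and settles toward h3

# Unit-step response of an underdamped second-order link (damping z, natural freq wn)
z, wn = 0.3, 2.0
wd = wn * np.sqrt(1.0 - z**2)
s_b = 1.0 - np.exp(-z * wn * t) / np.sqrt(1.0 - z**2) * np.sin(wd * t + np.arccos(z))

# Characteristic parameters of s_bi as named in Table 4
overshoot = s_b.max() - 1.0                          # sigma_p
outside = np.where(np.abs(s_b - 1.0) > 0.02)[0]      # samples outside a 2% band
t_s = t[outside[-1] + 1] if outside.size else 0.0    # steady-state (settling) time
crossings = int(np.sum(np.diff(np.sign(s_b - 1.0)) != 0))  # oscillation-count proxy

print(f"initial value of decay = {s_a[0]:.2f} (h1 + h3)")
print(f"overshoot = {overshoot:.3f}, steady-state time = {t_s:.2f} s, crossings = {crossings}")
```

The same extraction can be run on simulation output, after which each parameter is compared against its expert reference as described in the next subsections.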
3.2.2. Acquisition and description of characteristic parameters
In this step, the values of the characteristic parameters of s_{ai} and s_{bi} are acquired from domain experts.
Fig. 2. Experiential curve of exponential decay and second-order inertia system.
Table 4
The characteristic parameters of the example in Fig. 2.

s_{ai}, continuous dynamic of variable a:
- Decay coefficient: the gain of the exponential decay, i.e. h_1
- Decay time constant: the coefficient describing the length of the decay time, i.e. h_2
- Initial value of decay: the initial state of the decay, h_1 + h_3 in this case

s_{bi}, continuous dynamic of variable b:
- Overshoot of inertia link: the value by which the peak overshoots the unit step signal, \sigma_p
- Steady-state time: the length of time the system takes to reach steady state, i.e. t_s
- Oscillating times: the number of oscillations before the system reaches steady state
Domain experts have impressions, from their experience, of the possible value and range of each characteristic parameter, and these experiential data can be acquired with knowledge acquisition technology [18].
Such experiential data are often obscure and approximate, and are usually denoted with fuzzy membership functions [19]. Fig. 3 shows a normal-distribution-type membership function. For characteristic parameter h_i, i = 1, 2, 3, the reference value is \hat{h}_i. When h_i = \hat{h}_i, the membership degree is \mu(h_i) = 1; as h_i deviates from \hat{h}_i, \mu(h_i) declines.
For this reason, we can analyze the parameter consistency of simulation outputs with the fuzzy membership function:

\mu(h'_i) = e^{-\frac{(h'_i - \hat{h}_i)^2}{\sigma_i^2}}   (4)
where h'_i is the value of h_i in the simulation results, \hat{h}_i is the ideal reference value of characteristic parameter h_i, and \sigma_i^2 denotes the precision requirement of the simulation application. Both \hat{h}_i and \sigma_i^2 are acquired from domain experts.
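A minimal sketch of Eq. (4); the reference value and precision requirement below are hypothetical stand-ins for numbers an expert would supply:

```python
import math

def membership_degree(theta_sim, theta_ref, sigma_sq):
    """Gaussian-type membership of a simulated characteristic parameter, Eq. (4).

    theta_ref (the expert's reference value) and sigma_sq (the precision
    requirement) are hypothetical values acquired from domain experts.
    """
    return math.exp(-(theta_sim - theta_ref) ** 2 / sigma_sq)

print(membership_degree(1.5, 1.5, 0.04))             # exact match gives degree 1.0
print(round(membership_degree(1.7, 1.5, 0.04), 3))   # deviation 0.2 gives exp(-1) = 0.368
```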
3.2.3. Analysis of characteristic parameters in the simulation result
Finally, the characteristic parameters in the simulation result are analyzed. Taking the analysis of F(s_{ai}) as an example, a least-squares estimation algorithm can be utilized.
Run the simulation model and obtain m sampled values (t_j, y_j), j = 1, 2, \ldots, m, of variable a during continuous dynamic s_{ai}. The sampled values are then used for the numerical fitting of the behavioral function F(s_{ai}), and the deviation function of the fitting is:
J(h'_1, h'_2, h'_3) = \sum_{j=1}^{m} \left[ y_j - F(s_{ai})(t_j) \right]^2   (5)
To get the best fitting value of each parameter, we consider the derivative condition:

\frac{\partial J}{\partial h'_k} = 0, \quad k = 1, 2, 3   (6)

As F(s_{ai}) is nonlinear, it is hard to obtain an analytical solution. Least-squares functions provided by Matlab can be utilized instead, such as lsqcurvefit(), nlinfit() and lsqnonlin().
Finally, the characteristic parameters h'_i in the simulation output are compared with the reference values, and the simulation consistency of s_{ai} is decided. For each parameter h'_i, its simulation validity can be obtained with function (4). The analysis of continuous dynamic s_{bi} can be done in a similar way.
3.3. Knowledge-based validation method of behavioral relationship
Classical validation methods pay much attention to the dynamic characteristics of continuous dynamics and the statistical characteristics of key variables. However, little research has concentrated on the simulation validity of the dependence among different behavior segments. We use the term behavioral relationship to denote the logic and timing dependence among continuous dynamics and discrete events.
In complex simulation models, behavior relationships can be numerous and can heavily affect the credibility of simulation results. For example, in DIS/HLA-based simulation, the monitor unit may collect simulation data about an exploding event before the weapon fires. This results from data packet loss or network delay, and it is a typical timing-relationship error in our research.
We propose a knowledge-based method for the validation of behavioral relationships and integrate it into the knowledge system. First, the common behavior relationships involved in complex simulation models are classified and represented with standard expressions. Second, for each kind of behavioral relationship, we develop its validation function in the knowledge system. Third, for each simulation model, domain knowledge about its correct or valid behavior relationships is collected by analyzing the simulation context. These knowledge segments are then described with the formal expressions and coded into the knowledge base. For each validation task, the corresponding knowledge base can then be used.
Fig. 3. Description of experiential data in fuzzy distribution.
3.3.1. Knowledge representation of behavioral relationships
For a complex system, we denote the sets of its output continuous dynamics and discrete events by V and E, respectively. For each continuous dynamic s ∈ V, its four behavioral factors are the initial value init(s), behavior function F(s), restricted space res(s) and time domain dom(s). F(s, x) is the dynamic property of state variable x ∈ X; it can be a dynamic equation, measured data about the real system, a logic proposition or another kind of knowledge. res(s) is the state space within which the state variables must remain during s; it is usually a description of the restrictions imposed on the system by the environment or the simulation scenario.
Take the continuous dynamic s_{a1} in Fig. 2 as an example. We only have experiential descriptions of its behavior, so its behavioral factors are noted as:

F(s_{a1}, a) = curve of exponential decay
res(s_{a1}): a ⩽ A
As for a discrete event e ∈ E, its three behavioral factors are the change of state variables new(e), enabling condition enable(e) and time domain dom(e). new(e, x) is the new value of state variable x after the occurrence of discrete event e; it can be a fixed value, an analytical expression, or a statistical distribution. enable(e) states the precondition that triggers discrete event e, and it is usually described by an appropriate space of the related variables.
With the behavioral factors defined above, we can give formal representations for most of the ordinary behavioral relationships involved in complex simulation models, as shown in Table 5.
Generally, there are two kinds of behavior relationships in complex simulation systems: logic and timing relationships. The former denotes dependence between the values of related variables; the latter is the timing dependence between behavior components. Most behavior relationships can be abstracted and described with these expressions, and the expressions can be generalized to relationships among more than two behavioral components.
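One possible encoding of these behavioral factors is sketched below (not the paper's implementation; the bound A and all parameter values are hypothetical):

```python
import math
from dataclasses import dataclass
from typing import Callable, Tuple

@dataclass
class ContinuousDynamic:
    """Behavioral factors of a continuous dynamic s: init(s), F(s), res(s), dom(s)."""
    init: float
    F: Callable[[float], float]       # behavior function of the state variable
    res: Callable[[float], bool]      # restricted-space membership test
    dom: Tuple[float, float]          # time domain (first time, last time)

@dataclass
class DiscreteEvent:
    """Behavioral factors of a discrete event e: new(e), enable(e), dom(e)."""
    new: float                        # value of the state variable after e
    enable: Callable[[float], bool]   # enabling-condition test
    dom: float                        # occurrence time

# s_a1 of Fig. 2: an exponential decay restricted to a <= A
A = 3.0
s_a1 = ContinuousDynamic(
    init=2.5,
    F=lambda t: 2.0 * math.exp(-1.5 * t) + 0.5,
    res=lambda a: a <= A,
    dom=(0.0, 4.0),
)
print(s_a1.res(s_a1.F(0.0)))   # True: the trajectory starts inside res(s_a1)
```

Callables are used for F, res and enable so that the same record can hold an equation, a table of measured data, or a logic proposition, as the text allows.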
3.3.2. Implementation of analysis and validation function
For each kind of behavioral relationship, we can develop its simulation output analysis function with a standard calling format. Take "continuous dynamic s enables discrete event e" as an example; its formal expression is

F(s, x)(t) ∈ enable(e), t ∈ dom(s)   (7)

which means the trajectory of variable x enters the enabling space of discrete event e at some time t during continuous dynamic s. Its behavioral consistency analysis algorithm is as follows:

Step 1: SET t_step, t_0 = s.ftime, t_f = s.ltime, flag = 0, t_h = 0;
Step 2: IF (t_0 >= t_f) EXIT;
    IF (x(t_0) ∈ enable(e))
        { t_h = t_0;
          IF (e happens)
              { IF (x(t_0 + t_step) ∈ new(e)) flag = 0;
                ELSE flag = 1; }
          ELSE flag = 2; }
    ELSE IF (e happens) t_h = t_0, flag = 3;
    t_0 = t_0 + t_step;
Step 3: RETURN to Step 2;
Table 5
Ordinary behavior relationships in complex simulation models.

Behavioral relationship — Knowledge representation
- Continuous dynamic s enables discrete event e: F(s, x)(t) ∈ enable(e), t ∈ dom(s)
- Discrete event e_1 enables discrete event e_2: new(e_1) ∈ enable(e_2)
- Discrete event e disables continuous dynamic s: new(e) ∉ res(s)
- Discrete event e causes continuous dynamic s: new(e) ∈ init(s), dom(e) = s.ftime
- Discrete event e_1 conflicts with discrete event e_2: enable(e_1) ∩ enable(e_2) = ∅
- Continuous dynamic s_1 conflicts with continuous dynamic s_2: res(s_1) ∩ res(s_2) = ∅
- Discrete event e_1 synchronizes with discrete event e_2: dom(e_1) = dom(e_2)
- Continuous dynamic s_1 synchronizes with continuous dynamic s_2: dom(s_1) = dom(s_2)
- Discrete event e_1 is t time later than discrete event e_2: dom(e_1) = dom(e_2) + t
- Continuous dynamic s_1 is t time later than continuous dynamic s_2: s_1.ftime = s_2.ftime + t
- e is a periodic discrete event with cycle T: dom(e) = {t | t = t_0 + kT, k ∈ N}
- e is a random discrete event with distribution P(k): dom(e) = t_0 + t, t ~ P(k)
There are four kinds of analysis results: flag = 0 means the simulation result is consistent with domain knowledge; flag = 1 means the value of variable x is wrong after discrete event e; flag = 2 means e does not happen although it is enabled; and flag = 3 means e happens although it is not enabled.
In this way, we can develop simulation analysis and validation functions for all the behavior relationships in Table 5.
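The consistency-analysis algorithm above can be rendered as a Python function. This is an illustrative sketch of the pseudocode, not the authors' code; the toy trajectory and event below are invented:

```python
def check_enables(x, t0, tf, t_step, enable, new, happened):
    """Consistency check for 'continuous dynamic s enables discrete event e', Eq. (7).

    x(t) is the simulated trajectory; enable/new are membership tests taken
    from domain knowledge; happened(t) reports whether e occurred at time t in
    the output.  Returns (flag, t_h) with the flag codes of Section 3.3.2.
    """
    flag, t_h = 0, 0.0
    t = t0
    while t < tf:
        if enable(x(t)):
            t_h = t
            if happened(t):
                flag = 0 if new(x(t + t_step)) else 1
            else:
                flag = 2
        elif happened(t):
            t_h, flag = t, 3
        t += t_step
    return flag, t_h

# toy output (invented): x ramps up, e fires at t = 1.0 and resets x to 0
def x(t):
    return t if t <= 1.0 else 0.0

flag, t_h = check_enables(
    x, 0.0, 2.0, 0.25,
    enable=lambda v: v >= 0.9,
    new=lambda v: v == 0.0,
    happened=lambda t: abs(t - 1.0) < 1e-9,
)
print(flag, t_h)   # flag 0: the event fires exactly when it is enabled
```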
3.3.3. Collection and coding of domain knowledge
For each complex system, we can collect the domain knowledge about all the valid behavioral relationships during simulation operation by analyzing the simulation scenario and the physical properties of the real system.
3.4. Knowledge-based validation of aggregative behavior
In this section, we discuss the validation of aggregative behavior with written or numerical knowledge. The term aggregative behavior is used to characterize the macroscopic property of behavior comprising several continuous dynamics and discrete events in complex simulation models.
During a simulation run, some simulation agents should follow a given behavior procedure. The behavior procedure is often defined in the simulation scenario or prescribed by the agent's objective. For example, there are usually plenty of activities during the flight simulation of an aircraft, such as taking off, preparing for combat, attacking and evading. These activities should be executed in a certain order according to the battle situation. The correctness of the behavior order can have a significant influence on the simulation result.
On the other hand, there are often plenty of uncertainties and randomness during the operation of a simulation, and different runs may produce very different behavior procedures in the output. Traditional methods cannot be applied directly to analyze the validity of simulation results with different behavior paths.
We propose to utilize the hybrid dynamic sequence and the discrete event tree for the analysis of these two aspects of aggregative behavior.
3.4.1. Hybrid dynamic sequence and its validation
A hybrid dynamic sequence is a kind of aggregative behavior comprising given discrete events and continuous dynamics. For example, the hybrid dynamic sequence of variable a in Fig. 2 is denoted as:

h_a = s_{a1} e_1 s_{a2} e_2 s_{a3} \ldots   (8)
For the analysis and validation of h_a, we should collect the domain knowledge about the behavioral factors of both the related continuous dynamics and discrete events. This knowledge can be described with the formal expressions of Section 3.3.
Intuitively, we can compare the behavioral characterization of both the continuous dynamics and the discrete events with the corresponding domain knowledge. The domain knowledge about hybrid dynamic h_a can be represented as:

a(t) ∈ res(s_{ai}) ∧ a(t) ∈ F(s_{ai}) when t ∈ dom(s_{ai})
a(t^-) ∈ enable(e_i) ∧ a(t^+) ∈ new(e_i) when t = dom(e_i)   (9)

where res(s_{ai}), F(s_{ai}), enable(e_i) and new(e_i) are the formal expressions of the domain knowledge about the related behavior.
The condition a(t) ∈ F(s_{ai}) should be analyzed separately as an individual continuous dynamic. The classical methods discussed in Section 3.1 can be used if there is enough measured data; otherwise the experience-based method of Section 3.2 should be utilized.
For the analysis of a(t) ∈ res(s_{ai}), a(t^-) ∈ enable(e_i) and a(t^+) ∈ new(e_i), we should check whether the value of the related variable enters the corresponding state space in the domain knowledge. This can be accomplished in the same way as in Section 3.3, with the following analysis algorithm:
Step 1: SET t_step, t_0 = s.ftime, t_f = s.ltime, i = 0, flag[] = 0, t_h[] = 0;
Step 2: IF (t_0 >= t_f) EXIT;
    IF (a(t_0) ∈ enable(e_i))
        { IF (e_i happens)
              { IF (x(t_0 + t_step) ∈ new(e_i)) flag[i] = 0;
                ELSE flag[i] = 1, t_h[i] = t_0; }
          ELSE flag[i] = 2, t_h[i] = t_0;
          IF flag[i] = 0 THEN i += 1; }
    ELSE
        { IF (a(t_0) ∈ res(s_{ai})) flag[i] = 3;
          ELSE flag[i] = 4, t_h[i] = t_0; }
    t_0 = t_0 + t_step;
Step 3: RETURN to Step 2;
F.-Y. Min et al. / Simulation Modelling Practice and Theory 18 (2010) 500–515
There are five kinds of analysis result: flag[i] = 0 means the ith discrete event is consistent with its domain knowledge, flag[i] = 1 means the value of variable a is wrong after the event, and flag[i] = 2 means the discrete event does not happen while it is enabled. flag[i] = 3 means the continuous dynamic after the ith discrete event is consistent with the domain knowledge, while flag[i] = 4 means it is not.
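The Step 1–Step 3 loop above can be sketched as runnable code, assuming the knowledge sets are supplied as predicates and the trajectory as a sampled function. The dictionary keys, the `happened(t)` event-detection test, and the sawtooth example data are all illustrative assumptions, not the paper's implementation.

```python
def check_hybrid_sequence(a, events, t0, tf, t_step):
    """Flag each event and continuous segment of trajectory a(t) against
    domain knowledge. Each entry of `events` supplies value predicates
    enable/new/res and a happened(t) test for event e_i."""
    i, t = 0, t0
    flags, t_h = {}, {}
    while t < tf and i < len(events):
        ev = events[i]
        if ev["enable"](a(t)):                 # event e_i is enabled at t
            if ev["happened"](t):              # the event actually fired
                if ev["new"](a(t + t_step)):
                    flags[i] = 0               # value after event matches new(e_i)
                else:
                    flags[i], t_h[i] = 1, t    # wrong value after the event
            else:
                flags[i], t_h[i] = 2, t        # enabled but did not happen
            if flags[i] == 0:
                i += 1                         # move on to the next event
        else:
            if ev["res"](a(t)):
                flags[i] = 3                   # dynamic stays in res(s_ai)
            else:
                flags[i], t_h[i] = 4, t        # trajectory leaves reachable set
        t += t_step
    return flags, t_h

# Hypothetical sawtooth trajectory sampled at integer times:
samples = [0, 1, 2, 3, 0, 1]
a = lambda t: samples[int(t)]
event = {"enable": lambda v: v >= 3, "happened": lambda t: t == 3,
         "new": lambda v: v == 0, "res": lambda v: v < 3}
flags, t_h = check_hybrid_sequence(a, [event], t0=0, tf=5, t_step=1)
# flags == {0: 0}: the event fired when enabled and reset a as expected
```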
3.4.2. Discrete event tree and its validation method
The term discrete event tree is used to describe all the possible behavior paths that a simulation model may pass through.
Let E_I = {e_i ∈ E, i = 1, 2, ..., n} be the set of discrete events that may take place; the corresponding discrete event tree can be denoted as:
Path_I = A_1 A_2 ... A_n,   A_i ∈ {S, F, O}    (10)
where A_i = S if e_i occurs, A_i = F if e_i fails to occur, and A_i = O means e_i is not involved in the current path.
For example, there are three possible discrete events e_1, e_2 and e_3 in Fig. 4. If e_1 occurs, the possible following discrete event is e_2; otherwise, e_3 may happen. So there are four possible discrete event paths, i.e. SSO, SFO, FOS and FOF.
For the validation of the discrete event tree, we concentrate on the probability of each event path, and take the same idea as the classical statistical validation method: for each possible event path, the statistical characteristics of its occurrence should be consistent with the real system.
For each path P_i ∈ Path_I, its probability can be computed from the conditional probabilities of the related events along the path iteratively:

pro(P_i,j) = pro(e_j | P_i,j−1) · pro(P_i,j−1),   j = 1, 2, ..., n    (11)
where pro(e_j | P_i,j−1) is the conditional probability of e_j when P_i,j−1 occurs, and pro(P_i) = pro(P_i,n). These conditional probabilities may come from measured data or from experiential probabilities about the real system.
Once the probability of each possible event path is computed, the statistical analysis methods of confidence intervals or hypothesis testing can be utilized, as introduced in Section 3.1.
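The Eq. (11) chain rule applied to the Fig. 4 example can be sketched as follows. The conditional probability values are hypothetical stand-ins for measured or expert-supplied figures.

```python
# Path probabilities for the Fig. 4 example (paths SSO, SFO, FOS, FOF)
# via the chain rule of Eq. (11). Probabilities below are hypothetical.

p_e1 = 0.7          # pro(e1 occurs)
p_e2_given_S = 0.9  # pro(e2 | e1 occurred)
p_e3_given_F = 0.4  # pro(e3 | e1 failed)

paths = {
    "SSO": p_e1 * p_e2_given_S,              # e1 occurs, then e2 occurs
    "SFO": p_e1 * (1 - p_e2_given_S),        # e1 occurs, e2 fails
    "FOS": (1 - p_e1) * p_e3_given_F,        # e1 fails, then e3 occurs
    "FOF": (1 - p_e1) * (1 - p_e3_given_F),  # e1 fails, e3 fails
}

# The four paths are exhaustive and mutually exclusive:
assert abs(sum(paths.values()) - 1.0) < 1e-12
```

Each resulting path probability can then be compared against the observed path frequencies of the real system with the interval or hypothesis-testing machinery of Section 3.1.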
4. Design and implementation of the knowledge-based validation system
In this part, the implementation of the knowledge-based validation system is introduced. First, an inference structure suitable for the validation of complex simulation models is discussed. Then, the knowledge system is designed based on this discussion.
4.1. Design of inference machine for model validation
The inference machine describes the inference strategies and knowledge transformation in simulation model validation. By analysing the context of model validation, we design the proper inference machine, as shown in Fig. 5. All the validation methods discussed in Section 3 can be implemented in this machine.
Validation inference steps focus on generating new information with static knowledge, i.e. transforming validation techniques into automatic inference. In general, the validation process of complex simulation models is decomposed into five steps: Determine, Design, Select, Analyze, and Judge. These inference steps are executed under a given strategy, and there are specific input and output knowledge segments for each inference step.
The function of each inference step is listed below:
Determine: Choose the proper simulation behavior for the current validation task, by analyzing the simulation context, the simulation objective and the validity weight of each candidate task.
Design: Decide the validation schema and choose the proper analysis and validation method, by analyzing the characterization of the related behavior of the determined validation task.
Select: Select the corresponding simulation output data of the simulation behavior to be analyzed; the related variables, the simulation logic time interval, and the number of simulation replicas are decided.
Analyze: Analyze the consistency between the determined simulation behavior and the domain knowledge. The related techniques and algorithms are detailed in Section 3.
Judge: Compute the validity index and make the decision, by comparing the behavior consistency with the acceptable range.
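The five steps can be wired together as a simple pipeline in which each step consumes the output of the previous one. The stub logic below is purely illustrative, standing in for the techniques of Section 3 rather than the paper's actual inference machine.

```python
# Minimal sketch of the Determine -> Design -> Select -> Analyze -> Judge
# pipeline. All task data and thresholds are hypothetical.

def determine(tasks):
    # pick the candidate task with the largest validity weight
    return max(tasks, key=lambda t: t["weight"])

def design(task):
    # map the behavior category to an analysis method
    methods = {"continuous": "dynamic fitting", "event": "event match",
               "hybrid": "hybrid sequence validation"}
    return methods[task["behavior"]]

def select(task, outputs):
    # pull the simulation output variable the task refers to
    return outputs[task["variable"]]

def analyze(method, data, knowledge):
    # consistency = fraction of samples satisfying the knowledge predicate
    return sum(knowledge(v) for v in data) / len(data)

def judge(consistency, acceptable=0.9):
    return consistency >= acceptable

tasks = [{"name": "B07", "weight": 0.081, "behavior": "continuous",
          "variable": "air_resistance"}]
outputs = {"air_resistance": [0.1, 0.12, 0.11, 0.13]}
task = determine(tasks)
score = analyze(design(task), select(task, outputs), lambda v: v < 0.2)
print(task["name"], judge(score))   # prints: B07 True
```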
Fig. 4. Example of discrete event tree.
4.2. Implementation of the knowledge-based system
The central issue involved in the implementation of the knowledge system is the structure of the software. The validation system implements three levels of functions: user interface, inference machine, and validation infrastructure, as shown in Fig. 6.
User interface: Provides the interface between the knowledge system and the validation undertaker.
Inference machine: Executes the inference steps under a specific strategy, and responds to user instructions. The inference machine automates the simulation output analysis and validation process; it is the most important part of the knowledge-based validation system. The main functions of the inference machine include validation task decision, validation schema design, simulation output selection, behaviour consistency analysis, and validity judgment.
Validation infrastructure: All the supporting data and knowledge for validation are stored here. This information contains the simulation output data, the domain knowledge base, and specific behaviour consistency analysis algorithms.
This system can be utilized for simulation data analysis and validation, and its functions are listed here:
Fig. 5. Inference machine for the validation of simulation models.
Fig. 6. Software structure of knowledge-based validation system.
(1) Automate the validation inference process and analyze the behavior consistency between the simulation and the real system.
(2) Compute and synthesize validity indices for all the validation nodes among the different levels of the hierarchical index tree.
(3) Manage and maintain domain knowledge; it provides an interface for validation engineers and domain experts.
(4) Manage, view, edit and update the information about each validation node in the task tree.
(5) Trace the whole validation process, collect the information generated in each inference step, and display validation information for every validation node.
(6) Generate the validation conclusion and document the validation information.
5. An example problem
5.1. Introduction of simulation system
In this part, we take the simulation model of an electromagnetic rail gun as an example. This model has long been used in our research; it has a multi-entity structure and hybrid dynamic behavior, offering an excellent case for demonstrating the proposed method, see Fig. 7.
The electromagnetic rail gun is a new concept of launching equipment [20]. It consists of two parallel rails connected to a source of DC current, and a projectile sliding between the rails propelled by the Lorentz force. During the acceleration, there are mainly three kinds of forces against the projectile motion: mechanical friction, ablation drag, and air resistance. A key piece of equipment is the pulse forming network for optimizing the peak current. It contains several sets of capacitors, and the scheduling of the switches on the capacitors has an obvious influence on the launching efficiency.
The introduction will mainly be about the validation tasks of the dynamics of the projectile and the behavioural relationship of the current in the capacitors. As applications of statistical and quantitative validation methods can be found in much of the literature, our discussion will mainly focus on experience and logic knowledge-based validation methods.
Fig. 7. Sketch and pulse forming network of electromagnetic rail gun.
Table 6
Validation behavior of electromagnetic rail gun simulation system.

Simulation behavior         | Related variable                        | Description of domain knowledge                                                                              | Index
Switch on                   | Current of rail                         | Conditional event arranged by launching logic                                                                | A01
Launching efficiency        | Gunpoint velocity; energy in capacitors | Measured data about velocity; measured energy expenditure                                                    | A02
Launching process           | Status of weapon system                 | Target → charge → acceleration → departure → external flight → end                                           | A03
Start acceleration          | Acceleration                            | Conditional event dependent on Lorentz force                                                                 | B01
Power operational process   | Status of capacitors                    | Recharge → launching instruction → discharge → departure → feedback                                          | B02
Acceleration process        | Status of rail; projectile velocity     | Initial acceleration → mechanical friction → heat congregating → ablation drag → air resistance → departure  | B03
Inner acceleration dynamic  | Acceleration                            | The acceleration is influenced by many factors                                                               | B04
Lorentz force dynamic       | Lorentz force                           | F = (1/2) L′ I²                                                                                              | B05
Mechanical friction dynamic | Friction                                | Proportional to velocity                                                                                     | B06
Air resistance dynamic      | Air resistance                          | Influenced by velocity                                                                                       | B07
Ablation dynamic            | Ablation                                | Influenced by inner temperature                                                                              | B08
Arc discharge               | Rail voltage                            | Conditional event dependent on port voltage                                                                  | B09
Target                      | Position of target                      | External event defined in scenario                                                                           | C01
Air disturbance             | Disturbance                             | Random event during external flight                                                                          | C02
Port velocity               | Projectile velocity                     | Measured data about projectile velocity                                                                      | C03
Miss distance               | Projectile position                     | Measured projectile position                                                                                 | C04
5.2. Step 1: validation requirement analysis and validation task tree establishment
For the first step, the validation tasks are determined by analysing the simulation context of this application. This simulation system is used for the analysis and synthesis of the tactical performance of the electromagnetic rail gun, and the validation task analysis begins with identifying the key metrics of tactical performance.
By analysing the simulation context of the electromagnetic rail gun, we find that the variables influencing the projectile dynamics are the most considerable part of the simulation outputs. For this reason, the validation first concentrates on the velocity, acceleration, and miss distance of the projectile, and on the forces acting on the projectile, such as the Lorentz force, mechanical friction, air resistance, and ablation drag. Besides, the launching efficiency of the whole system should also be considered.
The simulation behaviour that influences the credibility of the simulation results is listed in Table 6.
To summarize, there are 16 behaviour segments that should be considered in the validation. For each behaviour segment, a validation task is defined. Fig. 8 shows the validation task tree of the electromagnetic rail gun simulation system.
5.3. Step 2: domain knowledge and its acquisition
In this step, the knowledge required for model validation is acquired. The discussion will mainly concentrate on the acquisition of experiential domain knowledge, i.e. TB01, TB03, TB04 and TB05.
The inner ballistic simulation model should be validated carefully, since the interactions among the electromagnetic, mechanical and thermodynamic loops are considerably complicated. Fig. 9 shows the knowledge segment sheet about the dynamic behaviour of the three kinds of resistance against the projectile, and it reveals the primary behavioural characterization of the inner ballistic dynamics. This knowledge sheet is depicted by a domain expert with the help of knowledge acquisition engineers. More information about knowledge elicitation and acquisition for model validation can be found in [13].
First, the red curve depicts the dynamic pattern of the air resistance during acceleration. The air resistance almost keeps constant at some low value while the velocity of the projectile is under the value V_th1, and the resistance increases linearly when the velocity exceeds the threshold. Second, the ablation drag is heavily dependent on the temperature of the inner rail. When the temperature is under the value T_1, the drag is considerably low; as the temperature exceeds T_1, the projectile begins to ablate and the drag increases linearly; and when the temperature is above another threshold T_2, the ablation drag nearly keeps constant. The blue curve shows the dynamic pattern of the inner ballistic ablation drag. Third, the mechanical friction is the dominant resistance on the projectile. The friction is proportional to the pressure between the projectile and the rail, and it increases as the Lorentz force increases, as the green curve shows.
Fig. 8. Validity indices tree of electromagnetic rail gun.
Fig. 9. Description of domain knowledge for inner ballistic dynamic of electromagnetic rail gun.
The collected domain knowledge, both the measured data about the operation of the real system and the experience of experts, is then coded in the knowledge base.
There are typically two kinds of simulation output analysis and validation functions involved: general analysis functions and experience-based analysis functions. The former are implemented independently of the validation domain and can be used for the analysis of general model behavior; the regular statistical analysis and TIC methods are of this kind. The latter are mainly used for the analysis of experiential knowledge, such as expert curve-based fitting.
Take the experiential curve of the air resistance (red curve in Fig. 9) as an example. It can be considered as two linear dynamics connected by the discrete event that the velocity of the projectile exceeds the value V_th1. The experiential curve can be denoted as:

F(f_p) = k_1 t                          when v ≤ V_th1
F(f_p) = k_2 (t − V_th1/k_1) + V_th1    when v > V_th1    (12)
where k_1 and k_2 are experiential parameters that can be acquired from domain experts. The method for their acquisition and description is discussed in Section 3.2.
All the experience about the inner ballistic dynamics is abstracted in the same way. The behavior consistency analysis functions can then be implemented based on these experiential curves and data.
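A consistency-analysis function for the experiential air-resistance curve (near-constant below the velocity threshold V_th1, linear with slope k_ar above it, per Table 7) might look like the sketch below. The numeric parameter values and the tolerance are hypothetical stand-ins for the expert-supplied ones.

```python
# Sketch of the experiential air-resistance check (red curve in Fig. 9):
# roughly constant below threshold Vth1, then linear with slope k_ar.
# All numeric values are hypothetical.

F_LOW, K_AR, VTH1 = 5.0, 0.8, 300.0   # illustrative expert parameters

def expected_air_resistance(v):
    """Experiential curve F(v) elicited from domain experts."""
    if v <= VTH1:
        return F_LOW                   # almost constant at a low value
    return F_LOW + K_AR * (v - VTH1)   # linear growth above the threshold

def consistent(samples, tol=0.15):
    """samples: (velocity, simulated force) pairs; True when every sample
    lies within `tol` relative deviation of the experiential curve."""
    return all(abs(f - expected_air_resistance(v))
               <= tol * max(expected_air_resistance(v), 1.0)
               for v, f in samples)

ok = consistent([(100.0, 5.2), (400.0, 88.0), (1000.0, 570.0)])
```

The same pattern (experiential curve plus tolerance band) covers the ablation-drag and friction curves of Fig. 9.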
5.4. Step 3: simulation experiment
In this step, simulation experiments are done with the electromagnetic rail gun model, and the simulation outputs of interest are collected. In these experiments, the length of the rail is 3 m, the mass of the projectile is 100 g, and four sets of capacitors are utilized. Some of the simulation output behavior is shown in Figs. 10 and 11.
Fig. 10. Projectile position in inner ballistic (position in m versus time in ms).
Fig. 11. Inner ballistic resistance during acceleration (Fd, Ff and Fp in N versus time in ms).
Simulation outputs for all validation tasks in Table 7 are collected. As the figures show, the inner acceleration ends at 1.1 ms, and the velocity is about 1899.6 m/s. The launching efficiency is η = E_p/E_c = 180,424 J / 1,903,000 J ≈ 9.5%.
5.5. Step 4: validation of electromagnetic rail gun simulation system
After all the required measured data, domain knowledge and simulation outputs listed in Table 6 are collected and loaded into the knowledge system, the simulation analysis and validation tasks are accomplished automatically.
Fig. 12 shows the simulation analysis of the air resistance in the inner ballistic. The simulation behavior of the air resistance is fitted as two linear continuous dynamics connected by the discrete event that the velocity of the projectile exceeds the value V_th1. The analysis of all the simulation behavior in Table 6 can be accomplished in the same way.
The main validation result is the validity indices of the tasks in the validation task tree, as shown in Table 8.
Table 7
Part of the domain knowledge about inner resistance for electromagnetic rail gun simulation models.

Validation task           | Simulation behavior                        | Related variable        | Validation method                  | Knowledge description
B06 (mechanical friction) | Continuous dynamic of mechanical friction  | Friction / Lorentz force | Behavioral relationship analysis   | The mechanical friction is approximately proportional to the Lorentz force
B07 (air resistance)      | Continuous dynamic of initial air resistance | Air resistance        | Continuous dynamic fitting         | It should almost keep constant at some low value if the velocity is under threshold V_th1
B07 (air resistance)      | Continuous dynamic of latter air resistance  | Air resistance        | Continuous dynamic fitting         | It should increase linearly with slope k_ar if the velocity is above V_th1
B08 (ablation drag)       | Discrete event of beginning ablation       | Inner rail temperature  | Discrete event match               | Projectile begins to ablate as inner rail temperature exceeds threshold T_1
B08 (ablation drag)       | Hybrid dynamic of ablation drag            | Ablation drag           | Hybrid dynamic sequence validation | Ablation drag dynamic is dependent on the state of the ablating event tree
B08 (ablation drag)       | Event tree of ablation                     | Inner rail temperature  | Discrete event tree analysis       | The state of the ablating event tree changes as the temperature exceeds two thresholds
Fig. 12. Simulation output analysis of inner air resistance.
By synthesizing the indices in the validation task tree, we can get the simulation validity of the three aspects of function. The validity index of the whole simulation system, synthesized from these three aspects, is 0.730.
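The within-group synthesis step can be reproduced from Table 8 as a normalized weighted average; the group-level weights that combine the three aspects into the system index 0.730 are not given here, so only the within-group step is sketched, using the function B rows of Table 8.

```python
# Weighted synthesis of a task group's validity index (function B tasks
# of Table 8). Weights are normalized, since the listed weights need not
# sum exactly to one.

def synthesize(tasks):
    """tasks: list of (validity_index, weight) pairs."""
    total_w = sum(w for _, w in tasks)
    return sum(v * w for v, w in tasks) / total_w

group_b = [(0.54, 0.081), (0.98, 0.349), (0.58, 0.081), (0.80, 0.081),
           (0.67, 0.081), (1.00, 0.081), (1.00, 0.081), (0.86, 0.168),
           (1.00, 0.081)]
print(round(synthesize(group_b), 2))   # prints: 0.87, close to the 0.86 reported
```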
By analyzing the validation information generated in each inference step and the validity indices among the levels of the validation task tree, the validation conclusion is made. For example, some direct conclusions about this system are listed here:
(1) The simulation system is quite credible, with validity index 0.730. The main aspect influencing the credibility of the simulation results is simulation function B (inner ballistic dynamics), and it is considerably credible (0.86).
(2) The most important validation task is around the velocity of the projectile, which is mainly affected by the dynamics of the inner ballistic, such as the Lorentz force, mechanical friction and air resistance. For the validation of the inner ballistic model, we utilize both objective and subjective methods. For the former, the measured data about the gunpoint velocity is collected and the classical statistical method is applied. For the latter, we concentrate on the acceleration process of the inner ballistic, where measured data is hard to get; the validation is accomplished based on knowledge about the related domains, such as electromagnetics, aerodynamics, and mechanical friction. The validation results of both objective and subjective methods show that the inner ballistic behavior is quite consistent with both the measured data and the domain experience, and the simulation results can be accepted.
(3) The validity index of the external ballistic is 0, because the measured data about the miss distance is lacking and we adopt a conservative validation manner. However, the external ballistic dynamics is insignificant for the current simulation objective.
6. Conclusion
This research is mainly aimed at settling the problems in the validation of complex simulation models: traditional methods cannot be applied to complicated simulation behavior, the measured data needed is often unavailable, and the validation task is usually costly and time-consuming.
The simple example of the validation of the electromagnetic rail gun simulation gives a clear impression of our method. First, complicated simulation behavior is decomposed into five typical categories, and a proper knowledge-based method is used for each behavior segment. Second, besides measured data about the real system, the experience of domain experts and other kinds of domain knowledge are acquired and coded in the knowledge base for the validation task. And third, the simulation analysis and validation can be accomplished automatically.
Domain knowledge is the most important part of this method, and it varies from numerical type to written and experiential type. Numerical knowledge is mainly used in the traditional statistical analysis and error analysis methods. Written knowledge is used for the validation of the relationships among different behaviors and of aggregative behavior. Experiential data can be used in statistical analysis, and experiential curves of the real system can be used for the analysis of the error of continuous dynamics.
However, several key issues remain for future research, such as the elicitation and acquisition of domain knowledge, the analysis algorithms for complicated system behavior, and the maintenance and management of the knowledge base.
Acknowledgements
This research is supported by the Natural Science Foundation of China (60434010) and the Foundation for the Outstanding Youth of Heilongjiang Province (JC200606).
References
[1] John S. Carson, Model verification and validation, in: Proceedings of the 2002 Winter Simulation Conference, 2002, pp. 52–58.
[2] O. Balci, R.E. Nance, Bibliography on validation of simulation models, Newsletter – TIMS College on Simulation and Gaming 4 (2) (1980) 11–15.
[3] O. Balci, A bibliography on the credibility assessment and validation of simulation and mathematical models, Simuletter 15 (3) (1984) 15–27.
[4] Robert G. Sargent, Verification and validation of simulation models, in: Proceedings of the 2004 Winter Simulation Conference, 2004, pp. 17–28.
Table 8
Validity indices for each validation task of electromagnetic rail gun.

Task | Simulation deviation | Validity index | Weight
TA01 | 0        | 1.00 | 0.138
TA02 | 5.0%     | 0.90 | 0.287
TA03 | 0        | 1.00 | 0.287
TA04 | 0        | 1.00 | 0.287
TB01 | 18.4%    | 0.54 | 0.081
TB02 | 0.8%     | 0.98 | 0.349
TB03 | 16.7%    | 0.58 | 0.081
TB04 | 9.2%     | 0.80 | 0.081
TB05 | 13.2%    | 0.67 | 0.081
TB06 | 0        | 1.00 | 0.081
TB07 | 0        | 1.00 | 0.081
TB08 | 99.6 m/s | 0.86 | 0.168
TB09 | 0        | 1.00 | 0.081
TC01 | 0        | 1.00 | 0.083
TC02 | 2.7%     | 0.93 | 0.172
TC03 | –        | 0    | 0.745
[5] L.G. Birta, F.N. Ozmizrak, A knowledge-based approach for the validation of simulation models: the foundation, ACM Transactions on Modeling and Computer Simulation 6 (1996) 76–98.
[6] Nicholas V. Findler, Neal M. Mazur, A system for automatic model verification and validation, Transactions of the Society for Computer Simulation 6 (1990) 153–172.
[7] W.C. Hopkinson, J.A. Sepulveda, Real time validation of man-in-the-loop simulations, in: Proceedings of the Winter Simulation Conference, 1995, pp. 1250–1256.
[8] J.P.C. Kleijnen, Validation of models: statistical techniques and data availability, in: Proceedings of the Winter Simulation Conference, 1999, pp. 647–654.
[9] P. Glasgow, C.M. Parnell, et al., Simulation Validation (SIMVAL), in: Making VV&A Effective and Affordable Mini-Symposium and Workshop, 1999, pp. 34–46.
[10] Patrick W. Goalwin, Jerry M. Feinberg, Pamela L. Mayne, A detailed look at verification, validation, and accreditation (VV&A) automated support tools, in: Proceedings of the 2001 Fall Simulation Interoperability Workshop, Orlando, FL, 2001, 01F-SIW-041.
[11] David N. Ford, John D. Sterman, Expert knowledge elicitation to improve formal and mental models, System Dynamics Review 4 (1998) 309–340.
[12] J.T. Wang, F.Y. Min, Knowledge elicitation and acquisition for simulation validation, in: International Conference on Computational Intelligence and Security, 2007, pp. 85–88.
[13] A.S. White, R. Sinclair, Quantitative validation techniques – a database (I). Simple examples, Simulation Modelling Practice and Theory 12 (2004) 451–473.
[14] D.J. Murray-Smith, Methods for the external validation of continuous system simulation models: a review, Mathematical and Computer Modelling of Dynamical Systems 4 (1998) 5–31.
[15] John C. Morris, Matthew P. Newlin, Model validation in the frequency domain, in: Proceedings of the 34th Conference on Decision and Control, New Orleans, 1995.
[16] Der-Ann Hsu, J. Stuart Hunter, Validation of computer simulation models using parametric time series analysis, in: Proceedings of the 1974 Winter Simulation Conference, 1974, pp. 14–16.
[17] Sigrún Andradóttir, Vicki M. Bier, Applying Bayesian ideas in simulation, Simulation Practice and Theory 8 (2000) 253–280.
[18] G. Schreiber, H. Akkermans, et al., Knowledge Engineering and Management: The CommonKADS Methodology, MIT Press, 2000.
[19] L.A. Zadeh, K.S. Fu, et al., Fuzzy sets and their application to cognition and decision processes, IEEE Transactions on Systems, Man, and Cybernetics 7 (2) (1977) 122–123.
[20] H.D. Fair, Electric launch science and technology in the United States, in: Proceedings of the 11th Symposium on Electromagnetic Launch Technology, Saint-Louis, 2002.