Proc. of the 2014 International Symposium on Electromagnetic Compatibility (EMC Europe 2014), Gothenburg, Sweden, September 1-4, 2014

An Approach on Quality Assurance in Computational Electromagnetics

Sebastian Lange, Martin Schaarschmidt and Frank Sabath
Bundeswehr Research Center for Protective Technologies - NBC-Protection
29633 Munster, Germany

Abstract—During the last two decades scientific computing has developed into an additional form of science and engineering. Scientific computing links theory and laboratory experiment in a kind of theory-based virtual testing environment. The approach is to gain understanding based on computer-implemented mathematical models. Common quality assurance systems like ISO 9001 demand validation and verification during product development and manufacturing. Digital MockUp (DMU) based computer aided engineering (CAE) is the main computational engineering service in industrial applications. A verification and validation of the computed results is usually not possible, because CAE is used as an alternative to measurements, or the device under test is not measurable or does not exist in the early stage of design. For that reason a validation and verification of the CAE-based engineering service is necessary. An approach for quality assurance in computational electromagnetics is given in this paper. Numerical analysis has become a standard procedure in engineering and science. Achieving a traceable and confidence-building proof of quality becomes more important in industry-sector-specific computational engineering.

Index Terms—Quality Assurance; Computational Electromagnetics; Quality Management; Numerical Computation

I. INTRODUCTION

Science and engineering have changed over the last decades. Numerical computations allow a description of physical phenomena based upon mathematical methods and theoretical assumptions. This yields a virtual test environment which uses Digital MockUps and allows a mapping between theory and experiment. This scientific analysis approach is often called 'Scientific Computing' (cp. Fig. 1).

Fig. 1. Simulation Results

The methods of scientific computing are well developed and still under development. Many of these methods became possible due to the development in computer technology and the emergence of high performance computing. Scientific computing consists of mathematical modelling, computing and numerical approximation, and the respective scientific discipline itself.
Scientific computing changed the common qualification needs in engineering towards a focus on design engineering and numerical modelling in the corresponding discipline, here in computational electromagnetics (CEM).
The basis of all scientific and engineering work is confidence in the originated results. Achieving confidence in theoretical results is possible through mathematical proofs. This is not possible in the case of experimental work. An accreditation as a testing lab in accordance with ISO 17025 [10] is confidence-building and the common way for laboratory results. Scientific computing is based on mathematical models. Most CEM methods are deterministic. Calibration of such a deterministic system seems meaningless, because numerical results do not depend on calibration checks of soft- or hardware. There are thus fundamentally different issues between virtual and laboratory testing when the high accuracy that is the basis of high confidence has to be achieved.
Traceability to international norms in order to prove confidence in the methods of scientific computing and the resulting service is the aim of this paper. CEM is used as a virtual testing environment to describe the complex real-world behaviour of electromagnetic systems, but it must not be understood as a virtual lab. Assuming that a "testing laboratory approach" is not possible, a different solution is needed. In order to classify the work of simulation analysts and their results there are some fundamental questions:
• Is the exclusive inspection of the personnel competence sufficient?
• Is the exclusive validation and verification of the software sufficient?
• What kind of quality management system is needed?
• Is a software certification needed?
An understanding of CEM as a service (cp. [6]) is proposed in this paper. The advantage of this approach is the consideration of the personnel qualification and of the software as a tool. Therefore it is possible to verify and validate the simulation software as a black box. This functional testing ignores the internal mechanism of the simulation software and focuses solely on the numerical results generated in response to the excitation and the boundary conditions.


For that reason the sufficiency of software tools may be proven without the need of certification. This is important, because most numerical packages are closed source without sufficient documentation of the exact implementation. Furthermore the simulation analyst's environment and other requirements with respect to the ISO 9000 series [3] have to be considered. Based on this QM-system a service certification scheme is proposed here.
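As an illustration of such black-box functional testing, the following sketch shows how a benchmark case could be run through an arbitrary solver and accepted or rejected purely on its output. The run_solver wrapper, the benchmark description and the tolerance are hypothetical placeholders and are not prescribed by any of the cited norms.

```python
# Minimal sketch of a black-box acceptance test for a CEM solver.
# run_solver() and the reference data are hypothetical placeholders:
# in practice the reference comes from an analytical benchmark or a
# validated measurement, and the tolerance from the QM-system.
import numpy as np

def run_solver(benchmark: dict) -> np.ndarray:
    """Placeholder for the call into the (closed-source) CEM package."""
    raise NotImplementedError("wrap the vendor-specific API or CLI here")

def functional_test(benchmark: dict, reference: np.ndarray, rel_tol: float = 0.05) -> bool:
    """Pass/fail decision based only on the computed output values."""
    result = run_solver(benchmark)
    rel_error = np.max(np.abs(result - reference) / np.maximum(np.abs(reference), 1e-30))
    return rel_error <= rel_tol

# Example usage (hypothetical benchmark description):
# ok = functional_test({"geometry": "pec_sphere", "excitation": "plane_wave",
#                       "frequencies_GHz": [1.0, 2.0, 3.0]},
#                      reference=np.load("mie_reference.npy"))
```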
The paper is organised as follows: Section II gives a review of the ISO 9000 "Quality Management System" series and the NAFEMS QSS 001 supplement as an augmented QM-system. Section III describes the CEM design process and its pitfalls. A verification and validation (V&V) scheme for computational results is given in Section IV. A resulting certification scheme based on ISO 17000 [6] is proposed in Section V. Section VI provides a conclusion and an outlook on future work. The terminology and vocabulary used refer to [2], [6], [21].

II. QUALITY MANAGEMENT IN CEM

The achievement of implemented international QM standards is the demonstration of technical competence and of the ability to consistently generate technically valid results. A QM system ensures that quality principles are built into the simulation design process at a fundamental level. This facilitates quality control in all aspects of the simulation process (modelling, meshing, computing, validation, etc.), enabling consistent results on a professional standard. Figure 2 shows the interaction between the ISO 9000 series norms and NAFEMS QSS 001. The terminology is defined in [2], the requirements are given by [3] and a guideline to a total quality system is given by [4]. ISO 9004 is a kind of design guideline for a QM-system without requirements of its own; it gives recommendations on quality culture for organisations. NAFEMS QSS 001 is a QM-system supplement for scientific computing. It conforms to ISO 9001 and elaborates augmented requirements. ISO 9001 specifies general requirements, whereas NAFEMS QSS 001 specifies requirements particular to scientific computing. The specifications aim at the qualification of personnel, the software, product risk categories etc. In this manner the NAFEMS QSS 001 norm augments the whole ISO 9000 family with additional requirements for the needs of scientific computing. Therefore ISO 9000 [2] and ISO 9004 [4] are still valid and usable for a NAFEMS-based QM-system. The chapter taxonomy is identical in both QM-norms.

Fig. 2. Supplement of ISO 9000 Series

Figure 3 shows the interaction between ISO 9001 and NAFEMS QSS 001, exemplarily for the 'Resource Management'.

Fig. 3. Augmentation of ISO 9000 by NAFEMS QSS 001

General requirements are made by ISO 9001, whereas NAFEMS QSS 001 specifies this general formulation with respect to scientific computing needs. In ISO 9001 [3, 6.2.2] the following requirements are given:

"The organization shall:
a) determine the necessary competence for personnel performing work affecting conformity to product requirements,
b) where applicable, provide training or take other actions to achieve the necessary competence,
c) evaluate the effectiveness of the actions taken,
d) ensure that its personnel are aware of the relevance and importance of their activities and how they contribute to the achievement of the quality objectives, and
e) maintain appropriate records of education, training, skills and experience."

These requirements are not specific enough to describe the competence needs of computational analysts. One of the specifications is given in NAFEMS QSS 001 [24, 6.2.2.2]:

"Analysts must have an understanding of the assumptions inherent in the computational methods used, and the relationship between the simulation and the engineering application. Analysts shall have adequate expertise in the computational methods, and an understanding of software and hardware system employed, especially with regard to its limitations."

A comparison of both requirements provides an idea about the fundamental need for the use of NAFEMS for quality assurance in computational electromagnetics or in scientific computing in general.
Nowadays modern high-performance simulation packages have powerful graphical post-processing and present the computed results persuasively. When used by untrained personnel, this results in a tendency to assume correct results and in a lack of validation. Furthermore there is a tendency to over-specify the DMU approximation as a computational model. This is also an issue caused by the analyst's lack of skills.


Operating a specified quality management system will minimise this issue, allows computing under controlled conditions and demonstrates the reliability of the numerical results.

III. CEM DESIGN PROCESS & PITFALLS

A computational engineering design process requires the following initial information:
• a clear and accurate definition of the problem,
• a prediction of the anticipated physical behaviour of the device or system,
• assessment criteria for the results.
Figure 4 shows the main steps in computational electromagnetics. The definition of the numerical model demands an accurate understanding and definition of the problem. This is needed in order to get the right approximation while designing the digital mockup. The design of digital mockups uses the techniques of computer aided design (CAD). One design criterion is the extraction of all important and relevant parts. On the other hand an over-specification must be minimised. The CAD mockup consists of the geometry and all assumptions about the device under test.

Fig. 4. Conventional CAE Process

The next step in the simulation workflow is the discretization of the CAD model. This is the most error-sensitive step during pre-processing. This step connects the CAD approximation with the mathematical model of the solver and maps the device under test into the numerical method. All needs and requirements of the used method must be met by the created mesh. Especially the mesh size and the connectivity of all parts must be considered. Both CAD and meshing are steps which may be very harmful to the results. A good understanding of the engineering problem (electromagnetic compatibility, radar cross section, antenna performance, etc.) and of the used numerical method is needed in order to realize the right model in CAD and the resulting mesh. The third part of pre-processing is the definition of the boundary and initial conditions. This is the design of the electric model, which is based upon excitation signals, numerical boundary conditions, lumped elements, ports etc.
The processing is dominated by the used algorithm with few user interventions. Some solver settings, like the abort precision or the choice of the right solver itself, must be made with attention. In this phase of the numerical design process the human factor is the residual error.
The post-processing is the phase of evaluation and assessment of the computed results. A plausibility check with respect to the initial assumptions must be done. This demands a high understanding of the engineering problem and of the numerical method used.

TABLE I
ERRORS IN SCIENTIFIC COMPUTING

                                 Error Source
  Work step         Analyst                       Software Usage
  Pre-Processor     fringe elements               no element check
                    inexact material data         corrupt meshing
                    corrupt idealisation
  Processor         wrong boundary conditions     ignored warnings
                    wrong solver                  imprecise solver
  Post-Processor    scaling                       wrong smoothing
                    wrong choice                  wrong presentation

Analysts have to handle all these pitfalls with care and experience. This list is sorted in the order of the work steps but is not meant to be complete. Expert knowledge is needed to prepare, process, and evaluate numerical computations. Both the numerical method and the special discipline in electrical engineering must be well understood by the simulation analyst.
Figure 4 shows the CAE work flow, which involves three steps of work. The pre-processing consists of the CAD design and the discretization. Some CAE software tools allow CAD and meshing in one environment. If different tools are used, an import/export interface must be used. Processing is governed by solvers which implement numerical methods. The post-processing uses visualisation tools. Simulation analysts must master all of these software packages. The operated QM-system must demand special training and documentation with respect to [5].
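The discretization discussed above is the most error-sensitive pre-processing step, so simple automated mesh checks are a natural part of such documented procedures. The sketch below is a minimal, tool-neutral illustration, assuming a triangle surface mesh given as a node array and an element index array (both hypothetical formats): it counts unreferenced nodes, free edges and non-manifold edges.

```python
# Minimal sketch of an automated sanity check for a triangle surface
# mesh, assuming nodes as an (N, 3) array and elements as an (M, 3)
# index array (a hypothetical, tool-neutral format). A closed surface
# should have no free edges; edges shared by more than two triangles
# indicate non-manifold connectivity.
from collections import Counter
import numpy as np

def check_surface_mesh(nodes: np.ndarray, tris: np.ndarray) -> dict:
    used = np.unique(tris)
    unreferenced = nodes.shape[0] - used.size          # nodes no triangle uses
    edges = Counter()
    for a, b, c in tris:
        for e in ((a, b), (b, c), (c, a)):
            edges[tuple(sorted(e))] += 1
    free_edges = sum(1 for n in edges.values() if n == 1)
    nonmanifold_edges = sum(1 for n in edges.values() if n > 2)
    return {"unreferenced_nodes": int(unreferenced),
            "free_edges": free_edges,
            "non_manifold_edges": nonmanifold_edges}

# Example: a single triangle has three free edges and no non-manifold edges.
nodes = np.array([[0.0, 0.0, 0.0], [1.0, 0.0, 0.0], [0.0, 1.0, 0.0]])
tris = np.array([[0, 1, 2]])
print(check_surface_mesh(nodes, tris))
```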
IV. VERIFICATION & VALIDATION

NAFEMS defines the terms verification and validation as follows [26]:

"Verification: The process of determining that a computational model accurately represents the underlying mathematical model and its solution."

"Validation: The process of determining the degree to which a model is an accurate representation of the real world from the perspective of the intended uses of the model."

A guideline for verification and validation in computational solid mechanics is given in [27]. NAFEMS has its origins in solid mechanics, but the approach is invariant to the scientific discipline. In this context normative documents from the IEEE [22], [23] are used to fulfil the demands of computational electromagnetics. Referring to both quotations, verification is the comparison to mathematics and validation the comparison to the real world and measured physical phenomena.


Numerical analysis involves three different types of models. The conceptual model is an abstraction of the real world. The conceptual model describes the physical phenomenon of interest of the device under test (DuT), like antenna performance, radar cross section (RCS), electromagnetic compatibility (EMC) etc. All electromagnetic assumptions and descriptions of the device under test (DuT) are considered at this level of modelling. The next model in the sequence is the mathematical model. This model describes the scientific problem with equations like the wave equation of electromagnetic propagation. After the implementation of this mathematical model, the resulting code, together with the discretization and the physical parameters, forms the computational model. The first verification variant is called "Code Verification". Here the computational code is verified with respect to its mathematical background. This verification is done by the software manufacturers, as in most cases the source code is not accessible. The most popular method is to compare code results with analytical solutions. "Calculation Verification" estimates the computational errors due to discretization. One of the most popular methods is the estimation of the amount of error on different discretizations, i.e. finer or coarser meshes. The responsibility for "Calculation Verification" lies with the simulation analyst, because it is not possible for the code developer to assure correct user-developed meshes (cp. [26], [27]).
Both code and calculation verification may be important while operating CEM software and should be considered and documented in a QM-system, in particular if self-developed code is used.
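One way to make calculation verification concrete is a grid-refinement study in the spirit of Richardson extrapolation (cp. [27]): from a scalar result computed on three systematically refined meshes, an observed order of convergence and an estimate of the mesh-independent value can be derived. The sketch below illustrates this; the numerical values are invented purely for illustration.

```python
# Sketch of calculation verification by grid refinement (Richardson
# extrapolation). f_fine, f_med, f_coarse are one scalar output quantity
# computed on three meshes with constant refinement ratio r; the numbers
# below are invented for illustration only.
import math

def observed_order(f_fine: float, f_med: float, f_coarse: float, r: float) -> float:
    """Observed order of convergence p from three systematically refined meshes."""
    return math.log((f_coarse - f_med) / (f_med - f_fine)) / math.log(r)

def richardson_extrapolate(f_fine: float, f_med: float, r: float, p: float) -> float:
    """Estimate of the mesh-independent value based on the two finest meshes."""
    return f_fine + (f_fine - f_med) / (r**p - 1.0)

# Illustrative numbers (e.g. a resonance frequency in GHz on three meshes):
f_coarse, f_med, f_fine, r = 1.250, 1.215, 1.204, 2.0
p = observed_order(f_fine, f_med, f_coarse, r)
f_star = richardson_extrapolate(f_fine, f_med, r, p)
print(f"observed order p = {p:.2f}, extrapolated value = {f_star:.4f}")
```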

A. Traceable Grading in V&V

Providing non-discriminatory conditions is the main objective in third-party certification. Benchmarks may be used to evaluate methods and software tools. The problem in V&V is the definition of non-discriminatory and traceable assessment criteria. The IEEE Electromagnetic Compatibility Society gives an approach on a standardised process for validating CEM techniques [22], [23]. Furthermore a tool which implements the "Feature Selective Validation (FSV) method" [31] is defined by this norm. The FSV tool may also be used for verification assessments, when the highest grading is demanded.

B. Numerical Benchmarking

Analytical solutions may be used for verification assessments and measured data for validation. There are a lot of examples of analytical solutions for standard shapes like spheres [32], rectangles, dipoles etc. Furthermore more complicated examples are under development [34], [35], which allow for a combined description of enclosures and wires. Such solutions are very sensitive to frequency and amplitude deviations. For this reason such examples may be used to verify a new code. In both publications the analytical solution is compared with two independent numerical methods. The International Compumag Society presents the well-known TEAM problems [36]. The IEEE standard [22] gives an overview of the examples from the Electromagnetic Code Consortium. In order to increase the number of usable examples, more cases are currently under construction. In order to ensure correctness in computing, the analyst must not know the real solution when computing the numerical results.

C. An Example - Mie Series

The Mie series is an analytical solution to Maxwell's equations and describes the scattering of electromagnetic radiation by a sphere. It is the most simple radar cross section example. This section presents the functionality of the FSV tool with an exact numerical solution and an erroneous variant. It aims to present the grading sensitivity of the FSV routine. Figure 5 shows the Mie scattering for the analytical, the exact numerical and the erroneous numerical solution.

Fig. 5. Mie Scattering
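For orientation, the Mie series for a perfectly conducting sphere can be evaluated with a few lines of code. The sketch below uses one common textbook form of the normalized monostatic RCS as a function of ka; sign and time conventions differ between references, so the expression should be cross-checked against a trusted source (e.g. [33] or the sphere benchmark in [22]) before it is used as a verification reference.

```python
# Sketch of the Mie-series backscatter sum for a perfectly conducting
# sphere: normalized monostatic RCS sigma/(pi*a^2) as a function of
# x = k*a. One common textbook form is used; conventions differ between
# references, so verify against a trusted source before benchmarking.
import numpy as np
from scipy.special import spherical_jn, spherical_yn

def pec_sphere_rcs(x, n_terms=None):
    if n_terms is None:
        n_terms = int(x + 4.0 * x ** (1.0 / 3.0) + 10)   # usual truncation heuristic
    s = 0.0 + 0.0j
    for n in range(1, n_terms + 1):
        jn, jnp = spherical_jn(n, x), spherical_jn(n, x, derivative=True)
        yn, ynp = spherical_yn(n, x), spherical_yn(n, x, derivative=True)
        hn, hnp = jn - 1j * yn, jnp - 1j * ynp            # spherical Hankel h_n^(2)
        a_n = jn / hn                                     # "electric" coefficient
        b_n = (jn + x * jnp) / (hn + x * hnp)             # "magnetic" coefficient
        s += (-1) ** n * (2 * n + 1) * (a_n - b_n)
    return abs(s) ** 2 / x ** 2                           # sigma / (pi a^2)

# Small-sphere sanity check: in the Rayleigh limit sigma/(pi a^2) -> 9 (ka)^4.
for x in (0.1, 1.0, 5.0, 10.0):
    print(f"ka = {x:5.1f}   sigma/(pi a^2) = {pec_sphere_rcs(x):.4f}")
```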
Both numerical solutions are sufficient, or good enough, when roughly estimated as equal. This example is constructed in order to demonstrate the effectiveness and sensitivity of the FSV approach. The difference between the numerical solutions is the choice of the sampling points. In both cases 501 points are calculated. The red solution is assumed to be erroneous. It is based upon a bad choice of sampling points. This small deviation yields a degradation of the FSV results, where the grading shifts down from "excellent". In order to use the FSV algorithm in verification assessments with respect to analytical solutions, a high excellence grading must be defined in advance. Verification and validation assessments should use different confidence levels to pass the testing. A grading of 0.8 in "good" may be sufficient in a comparison with measured data under the assumption of errors in the measurement. This grading is surely not sufficient to verify numerical solutions against analytical ones for exact methods like the Method of Moments (MoM) or the Finite Element Method (FEM). Both represent discretizations of differential equations which are derived from Maxwell's equations without simplifications. These methods are called exact methods. On the other hand, asymptotic methods like Physical Optics (PO) or Geometrical Optics (GO) use approximations of electromagnetic phenomena in order to compute an estimated solution.


The difference between both classes of methods should be considered in the definition of the confidence levels. This paper aims at the exact methods.
The FSV routine uses a decomposition of the results and compares two measures. After this a recombination of the results is done in order to compute a "global goodness of fit" measure. The comparison of amplitudes and "trends" yields the Amplitude Difference Measure (ADM). The rapidly changing features result in the Feature Difference Measure (FDM). A combination of ADM and FDM yields the Global Difference Measure (GDM). The ADM, FDM and GDM are usable as point-by-point analysis algorithms or as a single overall measure. Figures 6, 7 and 8 depict the measures respectively.

Fig. 6. FSV Evaluation - ADM
Fig. 7. FSV Evaluation - FDM
Fig. 8. FSV Evaluation - GDM
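To make the decomposition idea tangible, the following sketch implements a strongly simplified, FSV-like comparison of two equally sampled datasets: a crude low/high split of the spectrum, an amplitude-oriented and a feature-oriented difference measure, and their combination. It only illustrates the principle; the normative algorithm, its band limits and the qualitative grading scale are defined in [31] and IEEE 1597.1 [22].

```python
# Strongly simplified, FSV-like comparison of two equally sampled datasets.
# This illustrates the ADM/FDM/GDM idea only; the normative algorithm
# (band limits, weighting, grading scale) is defined in [31] and [22].
import numpy as np

def _split(data: np.ndarray, cutoff: float = 0.1):
    """Crude low/high split: keep the lowest `cutoff` fraction of the spectrum as 'trend'."""
    spec = np.fft.rfft(data)
    k = max(1, int(cutoff * spec.size))
    low, high = spec.copy(), spec.copy()
    low[k:] = 0.0
    high[:k] = 0.0
    return np.fft.irfft(low, n=data.size), np.fft.irfft(high, n=data.size)

def fsv_like(a: np.ndarray, b: np.ndarray) -> dict:
    lo_a, hi_a = _split(a)
    lo_b, hi_b = _split(b)
    norm = np.mean(np.abs(a)) + np.mean(np.abs(b)) + 1e-30
    adm = np.abs(lo_a - lo_b) / norm                              # amplitude (trend) differences
    fdm = np.abs(np.gradient(hi_a) - np.gradient(hi_b)) / norm    # rapidly varying features
    gdm = np.sqrt(adm**2 + fdm**2)                                # combined point-by-point measure
    return {"ADM": float(adm.mean()), "FDM": float(fdm.mean()), "GDM": float(gdm.mean())}

# Example: a reference curve and a slightly perturbed variant, 501 samples each.
x = np.linspace(0.0, 10.0, 501)
ref = np.abs(np.sin(3 * x) / (1 + x))
per = ref * (1.0 + 0.05 * np.sin(40 * x))
print(fsv_like(ref, per))
```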
V. CEM SERVICE CERTIFICATION SCHEME

CEM is a modern discipline of electrical engineering. A comparison to Structural Analysis, Multibody Simulation or Computational Fluid Dynamics yields lots of analogies, especially in the mathematical background of the methods used. For that reason quality assurance strategies like NAFEMS may be used analogously. CEM is one of the latest numerical engineering disciplines to develop formalised quality assurance. Wide parts of quality assurance were done from the earliest beginning, but there is no formalised, traceable, and universal procedure for the whole CEM service. A service certification scheme arises from a combination of ISO, NAFEMS and IEEE.

A. Fundamentals on ISO 17065

The overall aim of certifying numerical analyst services is to give confidence in the numerical results. The value of certification is the high level of confidence that is established by an impartial certification body (CB) and the demonstration of competent fulfilment of third-party requirements. Possible parties that have an interest in CEM-service certification are governmental authorities, non-governmental organisations, consumers, CEM-service customers, clients of CBs, etc.

B. ISO 17065 Certification Scheme

ISO 17065 is the norm on the certification process of products, processes and services and defines the requirements for conformity assessment bodies operating certification schemes. The main aspects of ISO 17065 are:
• focus on the competence of the certification body,
• consistent operation,
• the four-eyes principle in the certification decision and
• impartiality.
These aims suggest that the certification body should involve CEM analysts in the process of certification. Furthermore this personnel must be trained in ISO 9000 and ISO 17000. Both requirements together should assure the necessary competence. Impartiality and consistency may be fulfilled by non-profit organisations like governmental authorities. The four-eyes principle is fulfilled by defined processes and a sufficient number of personnel involved.
In ISO 17065 [11, 6.2.1] evaluation is based on:
• testing - ISO/IEC 17025 [10],
• inspection - ISO/IEC 17020 or
• management system auditing - ISO/IEC 17021 [9].
Both evaluation and the four-eyes principle are presented in Figure 9. The evaluation and the review & attestation part must be done by different persons. The general procedure in [6] results in the customised scheme shown in Figure 9. The application phase consists of a consultation with respect to the scope of certification and a review of the application regarding the competence and capability of the CB. The evaluation phase consists of a check of the ISO 9000 certificate, a NAFEMS-conforming audit, a professional audit of the relevant numerical methods and applications, and the final review of project planning and benchmarking. These evaluation steps result in a documented recommendation in order to fulfil review and attestation independently from evaluation. The surveillance is mandatory and is done in a six-month period.


Fig. 9. Implemented Scheme

The certification expires regularly after three years and results in a re-application.

C. Auditing

For auditing, ISO provides two different norms, ISO 19011 [13] and ISO 17021 [9]. First and second party audits are based upon ISO 19011, which can also fulfil third party audits. In this scheme ISO 19011 is used for internal audits of the CB and for internal audits of its clients. The CB audits its clients using ISO 17021 in a third party audit.

VI. CONCLUSION

A certification scheme for computational electromagnetics services is proposed in this paper. A review of all relevant normative documents is given and their interaction is discussed. The aim of this paper is a traceable certification for CEM service providers, which is proposed based upon ISO, NAFEMS and IEEE norms. The proposed certification scheme should be operated by qualified CEM analysts (e.g. a PhD in electrical engineering, physics or mathematics with a focus on numerical modelling). Furthermore this personnel should actively perform CEM and shall be trained in ISO auditing. Based upon this scheme, confidence is given to numerical results. A certification based upon such a scheme is the virtual equivalent of a certified laboratory. The benefit in numerical computing in academia is low, but there is a big potential in industrial needs. At the moment it is not possible to give confidence in numerical results. The common way is a presentation of the skills of the analysts and of the software. A certified and traceable CEM process limits the "freedom of art in CEM", but more importantly the possible failures are also limited. In projects which may cause public distress or human injuries, a certification should be mandatory for professional numerical engineering.

REFERENCES

[1] ISO/IEC Guide 2:2004 Standardization and related activities - General vocabulary
[2] ISO 9000:2005 Quality management systems - Fundamentals and vocabulary
[3] ISO 9001:2008 Quality management systems - Requirements
[4] EN ISO 9004:2009 Managing for the sustained success of an organization - A quality management approach
[5] ISO/TR 10013:2001 Guidelines for quality management system documentation
[6] EN ISO/IEC 17000:2004 Conformity assessment - Vocabulary and general principles
[7] ISO/IEC 17007:2009 Conformity assessment - Guidance for drafting normative documents suitable for use for conformity assessment
[8] ISO/IEC 17011:2004 Conformity assessment - General requirements for accreditation bodies accrediting conformity assessment bodies
[9] ISO/IEC 17021 Conformity assessment - Requirements for bodies providing audit and certification of management systems
[10] EN ISO/IEC 17025:2005 General requirements for the competence of testing and calibration laboratories
[11] EN ISO/IEC 17065:2012 Conformity assessment - Requirements for bodies certifying products, processes and services
[12] ISO/IEC 17067:2013 Conformity assessment - Fundamentals of product certification and guidelines for product certification schemes
[13] EN ISO 19011:2011 Guidelines for auditing management systems
[14] ISO/IEC Guide 23:1982 Methods of indicating conformity with standards for third-party certification systems
[15] ISO/IEC Guide 28:2004 Conformity assessment - Guidance on a third-party certification system for products
[16] ISO/IEC Guide 53:2005 Conformity assessment - Guidance on the use of an organization's quality management system in product certification
[17] ISO/IEC Guide 60:2004 Conformity assessment - Code of good practice
[18] ISO/TC 176/SC 2/N 525R2 ISO 9000 Introduction and Support Package: Guidance on the Documentation Requirements of ISO 9001:2008
[19] ISO/TC 176/SC 2/N 544R3 ISO 9000 Introduction and Support Package: Guidance on the Concept and Use of the Process Approach for management systems
[20] J. Favaro, "When the Pursuit of Quality Destroys Value", IEEE Software, vol. 13, no. 3, pp. 93-95, May 1996
[21] IEEE Std 610.12-1990 IEEE Standard Glossary of Software Engineering Terminology
[22] IEEE 1597.1-2008 IEEE Standard for Validation of Computational Electromagnetics Computer Modeling and Simulations
[23] IEEE 1597.2-2010 IEEE Recommended Practice for Validation of Computational Electromagnetics Computer Modeling and Simulations
[24] NAFEMS QSS 001:2007 Engineering Simulation - Quality Management Systems - Requirements
[25] J. Smith, Quality Assurance Procedures for Engineering Analysis, NAFEMS Ltd., 1999
[26] NAFEMS/ASME, What is Verification and Validation?, 2009
[27] L. E. Schwer, Guide for Verification and Validation in Computational Solid Mechanics, ASME, 2007
[28] J. Smith, Quality Management in Engineering Simulation - A Primer for NAFEMS QSS, NAFEMS Ltd., 2008
[29] NAFEMS Registered Analyst Scheme, Aug. 2009, Rev. No. 5, UK
[30] DAkkS, Kompetenzanforderungen für Auditoren und Zertifizierungspersonal im Bereich Qualitätsmanagementsysteme ISO 9001 (QMS) und Umweltmanagementsysteme ISO 14001 (UMS), 71 SD 6 025, 20.02.2013
[31] A. Duffy, A. Martin, G. Antonini, A. Orlandi, C. Ritota, "The feature selective validation (FSV) method", 2005 International Symposium on Electromagnetic Compatibility (EMC 2005), vol. 1, pp. 272-277, Aug. 2005
[32] A. Duffy, A. Martin, G. Antonini, A. Orlandi, C. Ritota, "The feature selective validation (FSV) method", 2005 International Symposium on Electromagnetic Compatibility (EMC 2005), vol. 1, pp. 272-277, Aug. 2005
[33] G. Mie, "Beiträge zur Optik trüber Medien, speziell kolloidaler Metallösungen", Annalen der Physik, vol. 330, p. 377, 1908
[34] R. Rambousky, S. Tkachenko, J. Nitsch, "Calculation of currents induced in a long transmission line placed symmetrically inside a rectangular cavity", 2013 IEEE International Symposium on Electromagnetic Compatibility (EMC), pp. 796-801, Aug. 2013
[35] S. Tkachenko, R. Rambousky, J. Nitsch, "Electromagnetic Field Coupling to a Thin Wire Located Symmetrically Inside a Rectangular Enclosure", IEEE Transactions on Electromagnetic Compatibility, vol. 55, no. 2, April 2013
[36] International Compumag Society, Testing Electromagnetic Analysis Methods (T.E.A.M.), http://www.compumag.org/jsite/team.html

