
Robust Process Capability Index Tracking for Process Qualification

Cong Gu and Colin C. McAndrew
Freescale Semiconductor, Tempe, AZ 85284

Abstract— This paper presents a robust process qualification and monitoring procedure based on the recently developed YAT and YWL process capability indices. Combined with appropriate test structures and measurements the procedure enables rapid process maturity evaluation and on-going loop closure of manufacturing to process specifications. The procedure generates interactive web-based reports and data that provide high-level “scoring” and a time-line of process capability, an ability to quickly dive down and identify the root cause of issues, and the capability to compare between fabs.

978-4799-8304-9/15/$31.00 ©2015 IEEE

I. INTRODUCTION

Semiconductor manufacturing operations use statistical process control (SPC) to monitor performance and maximize yield. Commonly, this entails tracking capability indices such as cp, cpk, and cpm to assess process maturity and to help optimize a process during qualification and production ramp. These indices are monitored for a set of devices and measurements that are selected to gauge the overall “health” of a process (parametric alignment to specifications, reliability, etc.). The intent is, for each of the identified “crucial” process parameters, to:
• measure how well the sample mean m aligns to the process target T
• quantify the statistical spread, via the sample standard deviation s, from process variation
• assess compliance with respect to the lower LL and upper UL specification limits

However, there are several issues with conventional procedures for process capability monitoring and control. First, the standard capability indices often do not fairly represent the “health” of a process; see the extensive discussion in [1]. To circumvent that issue the new capability indices yield-around-target (YAT) and yield-within-limits (YWL) were introduced in [1]. Not only do these indices produce fewer false alarms and miss fewer real process excursions than the conventional capability indices, they are substantially more intuitive to understand and so are easily assimilated by engineers.

Second, complex processes, such as power BiCMOS or BCD (bipolar, CMOS, and DMOS) processes [2], comprise many process steps, numerous differently doped silicon layers, and a large palette of devices, and can therefore have hundreds of crucial process parameters to monitor and evaluate. So it can be difficult to quickly evaluate the maturity level of a process or to rapidly identify where corrective actions are needed.

In this paper we present an implementation of the YAT-YWL methodology for process maturity evaluation, process qualification, and on-going monitoring of processes, with application to power BiCMOS technologies. Besides being based on the new capability indices, our procedures are web-based and dynamic, and enable simple and fast identification of problems.

II. CAPABILITY INDICES

The traditional capability indices are

    cp = (UL − LL) / (6σ),                                  (1)

    cpk = min[ (UL − µ) / (3σ), (µ − LL) / (3σ) ],          (2)

and

    cpm = (UL − LL) / ( 6 sqrt(σ² + (µ − T)²) ),            (3)

where µ and σ are the process mean and variability, estimated from the sample mean and sample standard deviation. We will assume that the statistics, target, and limits are specified, although setting limits based on limited data is fraught with its own issues; see the excellent discussion in [3].

An obvious deficiency of both cp and cpk is that they do not depend on the process target T, so they cannot quantitatively indicate when a process drifts from its target value. cpm is affected by the difference between the process mean and its target, but it is symmetric and therefore of limited use where T may be skewed with respect to (LL + UL)/2. Asymmetry is often observed in distributions of leakage currents, where low is acceptable and too high is a problem, and of breakdown voltage, where the reverse is the case, or where the parameter distribution is non-normal (quantities with significant variation often exhibit a log-normal distribution, which is skewed; this can be handled by a logarithmic, or more generally Box-Cox [4], transformation). None of these indices has intuitive bounds (cpk even becomes negative if µ drifts outside of the specification limits).

The new capability indices introduced in [1] are Yield Around Target

    YAT = yield( T − (T − LL)/3, T + (UL − T)/3 )           (4)

and Yield Within Limits

    YWL = yield(LL, UL).                                    (5)

As Fig. 1 shows for an assumed normal distribution with limits set at µ ± 3σ, essentially YAT is the yield within ±1σ of the mean and YWL is the yield within ±3σ of the mean.
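The definitions in (1)–(5) are straightforward to compute. The sketch below (purely illustrative, not the authors' tool) evaluates the traditional indices from given statistics, and YAT and YWL as empirical yields, i.e. as the fraction of measured values falling inside the relevant window.

```python
def traditional_indices(mu, sigma, LL, T, UL):
    """cp, cpk, cpm per (1)-(3), from given mean, sigma, and limits."""
    cp = (UL - LL) / (6 * sigma)
    cpk = min((UL - mu) / (3 * sigma), (mu - LL) / (3 * sigma))
    cpm = (UL - LL) / (6 * (sigma ** 2 + (mu - T) ** 2) ** 0.5)
    return cp, cpk, cpm

def yield_within(data, lo, hi):
    """Empirical yield: fraction of measurements inside [lo, hi]."""
    return sum(lo <= x <= hi for x in data) / len(data)

def yat(data, LL, T, UL):
    """Yield Around Target, per (4)."""
    return yield_within(data, T - (T - LL) / 3, T + (UL - T) / 3)

def ywl(data, LL, UL):
    """Yield Within Limits, per (5)."""
    return yield_within(data, LL, UL)
```

For a centered process with UL − LL = 6σ this gives cp = cpk = cpm = 1, while YAT and YWL are bounded yields in [0, 1], as discussed below.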
Fig. 1. YAT and YWL definitions (using LL, T, UL = µ − 3σ, µ, µ + 3σ); abscissa is z = (value − µ)/σ.

Fig. 2. Capability index sensitivity to mean shift.
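The ±1σ and ±3σ yields in Fig. 1 can be checked for an ideal normal distribution using the standard normal CDF, Φ(z) = (1 + erf(z/√2))/2. A minimal sketch, assuming LL, T, UL = µ − 3σ, µ, µ + 3σ as in the figure:

```python
from math import erf, sqrt

def norm_cdf(z):
    """Standard normal CDF."""
    return 0.5 * (1.0 + erf(z / sqrt(2.0)))

def normal_yield(lo, hi, mu, sigma):
    """Probability that an N(mu, sigma^2) value lies in [lo, hi]."""
    return norm_cdf((hi - mu) / sigma) - norm_cdf((lo - mu) / sigma)

# LL, T, UL = mu - 3*sigma, mu, mu + 3*sigma, as in Fig. 1
mu, sigma = 0.0, 1.0
LL, T, UL = mu - 3 * sigma, mu, mu + 3 * sigma
yat = normal_yield(T - (T - LL) / 3, T + (UL - T) / 3, mu, sigma)  # +-1 sigma window
ywl = normal_yield(LL, UL, mu, sigma)                              # +-3 sigma window
print(round(100 * yat, 2), round(100 * ywl, 2))  # 68.27 99.73
```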
These measures are intuitive and simple for engineers, as opposed to statisticians, to understand.

YAT and YWL, together, embody all the desirable attributes of process monitoring; see Table I (after [1]). The goal is for parameters that are evaluated for YAT to have 30% of the data within one third of the limit range around the target, and for parameters scored for YWL for 90% of the data to be within the range defined by the limits. These may seem somewhat “loose” compared with 68.27% and 99.73% for the case of ideal ±1σ and ±3σ values for a normal distribution, but experience shows that tighter limits for complex SmartPower BiCMOS processes with hundreds of parameters being scored cause flagging of false positives.

TABLE I
CAPABILITIES OF CAPABILITY INDICES.

accounts for             cp    cpk   cpm   YAT   YWL
proximity to target      no    no    yes   yes   no
target to limit range    no    no    no    yes   no
limit range              yes   no    yes   no    yes
data within limits       no    no    no    no    yes
non-normal data          yes   no    no    yes   yes

Fig. 2 compares the traditional capability indices, for UL − LL = 6σ, to those of (4) and (5) when there is a shift in the process mean with respect to the process target. Clearly cp, being insensitive to the proximity of µ to its target, incorrectly implies there is no problem when the process is not centered. cpk drops rapidly (linearly) as µ shifts, even though the actual yield remains almost unchanged for small shift values, and becomes zero (which is universally interpreted as being “very bad”) when µ − T = 3σ, even though the actual process yield there is 50%. cpm behaves better, but is still very pessimistic about actual process yield. YWL portrays yield accurately, as it should, and YAT reduces to 30% at close to a 1.5σ shift in mean, which is the mean shift value assumed in the 6σ methodology [5].

Fig. 3 compares the traditional capability indices, for T = µ, to those of (4) and (5) when there is a shift in the process standard deviation with respect to the process range UL − LL. Because the target and process mean are aligned, all of the traditional capability indices are the same. As the process “improves” (by having σ decrease), the traditional capability indices increase without bound, giving the impression that a process can keep on getting better and better by reducing σ. However, the change in YWL is quite small, as it can only asymptotically approach 100%. YAT does improve, but again it is bounded by 100%. The YAT and YWL behaviors are more rational than numerically ballooning to arbitrarily high values.

Fig. 3. Capability index sensitivity to standard deviation shift.

Note that not all crucial parameters should be scored for both YAT and YWL. Quantities that affect parametric circuit performance, such as threshold voltages, resistance and capacitance values, drive currents, transconductances, etc., must be scored for both. However, quantities like leakage current (or breakdown voltage) are “one-sided” and the primary concern is that they are less (or greater) than a specified limit. YAT scoring, and even a target value T, are not relevant for such quantities, so only YWL need be scored, and the non-critical limit can be set to an arbitrarily small (or large) value.
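One-sided scoring can be sketched as follows; the limit and leakage values below are hypothetical, chosen only to illustrate setting the non-critical lower limit arbitrarily small.

```python
def ywl(data, LL, UL):
    """Yield Within Limits: fraction of measurements in [LL, UL]."""
    return sum(LL <= x <= UL for x in data) / len(data)

# One-sided parameter (e.g. a leakage current): only the upper limit
# matters, so the non-critical lower limit is set arbitrarily small.
UL_leak = 1e-9   # hypothetical upper spec limit, in A
LL_leak = 0.0    # non-critical limit, arbitrarily small
leakage = [2e-12, 5e-12, 8e-10, 3e-9, 1e-12]  # hypothetical measurements
score = ywl(leakage, LL_leak, LL_leak + UL_leak - LL_leak)  # 4 of 5 in spec
score = ywl(leakage, LL_leak, UL_leak)
```

Only YWL is computed; no YAT window or target T is defined for the parameter.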
III. TEST STRUCTURE AND DATA STRATEGY

Evaluation and tracking of YAT and YWL is based on fab wafer-level test data (sometimes called electrical test, ET, or final test data; here we will use the term scribe-grid process control, SGPC, data). Most fab testing is done using standard measurement structures (sheet resistance, junction, capacitance, etc. structures) plus appropriately sized MOS transistors, bipolar transistors, resistors, etc. However, testing of large power devices (up to 40 A) poses problems because of the current compliance limitations of standard fab test equipment, and because large power devices are too big to fit into SGPC arrays. To enable correlation with final unit probe testing of full size devices, we select specific cells from final products and use these for our SGPCs; see Fig. 4.

Fig. 4. Power transistor SGPC testing.

Probably the most time consuming part of the data analysis and report generation procedure is ensuring the integrity of the measured data. Manufacturing data can include measurements from split lots. Early in the life cycle of a process these are used to explore various process recipe options, and for a mature process they can still be run to drive yield improvement and cost reduction. Whatever the reason, split lot data are not representative of the overall process capability, so they must be excluded from analyses used to evaluate process maturity or ongoing compliance with specified process capability. The continuing fab optimizations noted above, for performance or yield improvement, can lead to an ostensibly single “process” being adjusted slightly for particular (high volume) products. Decisions about whether or not a process is centered may have to factor in the product mix that was run during a certain period of time.

The other tedious data integrity assurance task is, for non-split lot data, getting rid of bad measurements. These can come from poor contacting during probing, unreliable measurements (i.e. those near or below the noise floor for the instrument), and sundry other problems. This can sometimes be done based on simple screening limits, but experience shows that sporadically a previously unencountered measurement issue will arise.

IV. REPORT GENERATION PROCEDURE

As noted above, the number of crucial parameters for a complex process can be in the hundreds: manually checking numerical values for µ and σ compared to LL, T, UL specifications to identify issues is a time consuming, tedious, and error prone approach. The communication bandwidth of a graphical presentation of data is much higher than that of tables of numbers, and having an ability to quickly identify and dive down into problem parameters speeds up process “health” evaluation as well. To enable efficient, rapid process maturity evaluation and on-going loop closure of manufacturing to process specifications, we have therefore developed a web-based YATYWL reporting system; Fig. 5 shows a high level view of our overall procedure.

Fig. 5. YATYWL report generation procedure.

The inputs to the procedure are:
• a csv (comma separated values) file that defines the crucial parameters, their LL, T, UL specifications, and how to access their measured values
• data files that contain fab SGPC measurements
• where necessary, screening limits for specific parameters
• (optional) requests to generate specific, rather than general, purpose analyses

The data files are generally for one month periods, and can be separated by maskset or may be a combination from all masksets run during a month. For process maturity audits during technology development the data files are from defined lots and wafers selected to be representative of the baseline process; directives to generate one-off reports of this nature are provided in the last item above (for clarity, not shown in Fig. 5). If there are issues with some measurements (see the previous section), then the (optionally) provided limits are used to screen the raw data. Whether or not screening is specified, filtering of extreme outlier data is done.
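The screening and outlier-filtering step might look like the sketch below. The paper does not specify the filtering algorithm, so the Tukey-style IQR fence, its multiplier, and the crude quartile estimates are assumptions made only for illustration.

```python
def screen(data, lo=None, hi=None):
    """Discard values outside optional screening limits."""
    return [x for x in data
            if (lo is None or x >= lo) and (hi is None or x <= hi)]

def drop_extreme_outliers(data, k=3.0):
    """Keep values within k*IQR of the quartiles (Tukey-style fence).
    The fence method and k=3 are illustrative assumptions; crude
    quartile estimates are used to keep the sketch self-contained."""
    s = sorted(data)
    n = len(s)
    q1, q3 = s[n // 4], s[(3 * n) // 4]
    iqr = q3 - q1
    return [x for x in data if q1 - k * iqr <= x <= q3 + k * iqr]
```

In practice the optionally supplied per-parameter screening limits would be applied first, then the extreme-outlier filter run on the remainder.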
The outputs of the procedure are:
• a top level report that summarizes YATYWL compliance over time
• detailed compliance reports, for each crucial parameter, for each month for each maskset and for each specified maturity audit
• histograms of the measured fab data for each crucial parameter, including the limits used for YAT and YWL computation
• details of the specific masksets, lots, and wafers from which the data were measured and, if applicable, the limits used to screen the data

The last of these is not shown in Fig. 5.

Fig. 6 shows an example of a top level report. The abscissa of the plot is months over a one year period, most recent to oldest from left to right (this is not how a time series would normally be plotted; it is done that way so the most recent, hence most relevant, data are immediately viewable in a browser without having to scroll to the right). In the plot, YAT performance is shown using solid lines and filled square symbols, and YWL performance using dashed lines and filled circle symbols. Different colors distinguish different masksets, and an amalgamation from all masksets is shown in black. Specific reports are identified by unfilled symbols, and the horizontal red, yellow, and green lines represent minimum requirements for different process maturity levels.

Fig. 6. YATYWL tracking report (HTML).

The lower level monthly summaries are accessible via maskset over month, using the hyperlinks at the top left of the page, or via month over maskset, using the hyperlinks below the plot abscissa. Parameter compliance details for specifically requested reports are available through the hyperlinks at the bottom left of the page.

Fig. 7 shows a dive-down into a detailed report for a single maskset and month. If the entry to this page was via a specific month and year link from the page of Fig. 6, then reports for different masksets are selectable as hyperlinks at the top of the page; if the entry were via a maskset link then those hyperlinks would be for different months (for clarity these are not shown; they are just links). In the details at the top of the table are hyperlinks to a web page that lists the masksets, lots, and wafers from which the data used to generate the report were measured, to a web page that lists the screening limits used, if they were specified, and to copies of the same report in both MS-Excel and text formats.

Fig. 7. Dive-down into single month/maskset detailed report.

The table lists all of the crucial parameters, with computed YWL and YAT values, and useful information such as the specified LL, T, UL values, the number of raw and screened/filtered data values, and µ and σ. The YWL and YAT columns of the table are color-coded: green means the minimum yield level is attained; if not, the cell is colored red. More important, the table is dynamically sortable based on column entries; clicking on a column heading will sort based on the contents of that column, and clicking again on the same column heading will reverse the sort order. This allows the parameters that have the lowest YWL or YAT to automatically move to the top of the table, to be analyzed for remedial action. It is a one-click procedure, so it is simpler than sorting in a spreadsheet, for example.

The leftmost column for each parameter in Fig. 7 is a hyperlink to a histogram of the measured fab data. Fig. 8 shows an example of this, with histograms shown for data from two fabs. The red vertical lines are the lower and upper limits, the thick green line is the target value, and the thinner green lines delineate the range over which data are counted for YAT scoring.

Fig. 8. Cross-fab parameter histograms.
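The color coding of the YWL and YAT columns described above can be sketched as below; the markup and thresholds are illustrative only, not the actual report generator's HTML.

```python
def score_cell(value, minimum):
    """Render one YWL/YAT table cell: green if the minimum yield level
    is attained, red if not (illustrative markup, hypothetical names)."""
    color = "green" if value >= minimum else "red"
    return '<td style="background:%s">%.1f%%</td>' % (color, 100 * value)

# Hypothetical thresholds: 90% for YWL, 30% for YAT.
row = score_cell(0.95, 0.90) + score_cell(0.27, 0.30)  # YWL pass, YAT fail
```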
In use, the top-level time summary report is inspected first to gain an overall picture of how compliant a process is or, during development, how process compliance is ramping up over time. You then click and descend into a detailed per month/maskset report, click to sort the parameters with the lowest YWL or YAT to the top, and inspect details of the data distribution by clicking and descending into, or opening in a new browser tab or window, individual parameter histograms. Analysis of a histogram generally indicates either that there is a possible measurement or probe issue or that there is a true offset between the specifications and the manufacturing data. In the former case the test structure and/or probing and measurement procedure may be improved; in the latter case the process may be retargeted or the specification limits may be adjusted.

Excessively high YAT and YWL scores also need careful inspection [1], as they present an opportunity to tighten spec limits, thereby decreasing overly pessimistic guard-banding in design and affording an opportunity to reduce area or current drain or improve performance, all of which can help improve profitability.

Note that the report generation itself is not a dynamic, web-based process. It is triggered monthly (or for scheduled process maturity level audits), but requires sanity checking of the input data, so it is not initiated until the data are verified to be good.

V. CROSS-TECHNOLOGY TRACKING

BCD processes and the integrated circuits designed in them can have long life cycles. It is therefore not sufficient to track YAT and YWL during the development of a manufacturing process; they need to be tracked over the complete lifetime of a process. To get an overall picture of the health of a family of processes we also integrate YAT and YWL summaries for specific processes, masksets, and fabs into a more general summary. Fig. 9 shows an example of such a report.

Fig. 9. Multi-technology high level summary.

The data from our YAT and YWL analyses are also used to define targets for models for integrated circuit design. Although the targets are in terms of device electrical performances, they are mapped into variations in model parameters using the backward propagation of variance (BPV) procedure, as detailed in [6].

VI. CONCLUSIONS

Effective evaluation of the statistical capability of semiconductor manufacturing processes, at all stages of maturity, requires: appropriate data, which come from SGPC test structures and fab measurements; reliable statistical analysis algorithms, to ensure no real process excursions are missed and no in-control parameters are falsely flagged as being out of spec; and procedures and tools to allow efficient inspection of large numbers of results and rapid identification of, and detailed inspection of, problems. We have shown that test structures that accurately correlate with IC devices and are measurable without specialized test equipment are important for the first of these; the YAT-YWL methodology has distinct advantages over conventional process capability indices for the second; and graphical, web-based tools are an effective means to enable the third.

REFERENCES

[1] S. Jallepalli, A. Srivastava, R. Wyllie, and S. Veeraraghavan, “Compliance metrics for a reliable assessment of parametric yield,” IEEE Trans. Semicond. Manuf., vol. 26, no. 3, pp. 385-392, Aug. 2013.
[2] V. Parsarathary, R. Zhu, V. Khemka, T. Toggenbauer, A. Bose, P. Hui, P. Rodriquez, J. Nivision, D. Collins, Z. Wu, I. Puchades, and M. Butner, “A 0.25µm CMOS based 10V smart power technology with deep trench for high-voltage isolation,” IEDM Tech. Dig., pp. 459-462, 2002.
[3] H. Schmid and A. Huber, “Measuring a small number of samples and the 3σ fallacy,” IEEE Solid-State Circuits Magazine, vol. 6, no. 2, pp. 52-58, 2014.
[4] G. E. P. Box and D. R. Cox, “An analysis of transformations,” J. Royal Statistical Society, Series B, vol. 26, pp. 211-252, 1964.
[5] [Online]: http://en.wikipedia.org/wiki/Six_Sigma (accessed February, 2015).
[6] C. C. McAndrew, “Statistical modeling using backward propagation of variance,” in Compact Modeling: Principles, Techniques and Applications, G. Gildenblat (Ed), Springer, pp. 491-520, 2010.
