
Paper P256-26

Fitting Generalized Additive Models with the GAM Procedure

Dong Xiang, SAS Institute Inc., Cary, NC

Abstract

This paper describes the use of the GAM procedure for fitting generalized additive models (Hastie and Tibshirani, 1990). PROC GAM, production in Release 8.2, provides an array of powerful tools for data analysis, incorporating nonparametric regression and smoothing techniques as well as generalized distributional modeling. Compared with other regression procedures such as PROC REG and PROC GLM, the methodology behind PROC GAM relaxes the usual assumption of linearity, enabling you to uncover hidden structure in the relationship between the independent variables and the dependent variable. Moreover, you can use PROC GAM to model not only Gaussian data, but also data from binary, Poisson, and other non-Gaussian distributions.

Introduction

The GAM procedure is the most versatile of several new procedures for nonparametric regression in SAS software. The methodology behind the GAM procedure has greater flexibility than traditional parametric modeling tools such as linear or nonlinear regression. It relaxes the usual parametric assumption, and enables you to uncover structure in the relationship between the independent variables and the dependent variable that you might otherwise miss.

Many nonparametric methods, such as the thin-plate smoothing spline used in PROC TPSPLINE and the local regression method used in PROC LOESS, do not perform well when there is a large number of independent variables in the model. The sparseness of data in this setting inflates the variance of the estimates. The problem of rapidly increasing variance for increasing dimensionality is sometimes referred to as the "curse of dimensionality." Interpretability is another problem with nonparametric regression based on kernel and smoothing spline estimates (Hastie and Tibshirani 1990).

To overcome these difficulties, Stone (1985) proposed additive models. These models estimate an additive approximation to the multivariate regression function. The benefits of an additive approximation are at least twofold. First, since each of the individual additive terms is estimated using a univariate smoother, the curse of dimensionality is avoided, at the cost of not being able to approximate universally. Second, estimates of the individual terms explain how the dependent variable changes with the corresponding independent variables.

To extend the additive model to a wide range of distribution families, Hastie and Tibshirani (1990) proposed generalized additive models. These models assume that the mean of the dependent variable depends on an additive predictor through a nonlinear link function. Generalized additive models permit the response probability distribution to be any member of the exponential family of distributions. Many widely used statistical models belong to this general class, including additive models for Gaussian data, nonparametric logistic models for binary data, and nonparametric log-linear models for Poisson data.

PROC GAM implements the generalized additive model proposed by Hastie and Tibshirani (1990). The GAM procedure

• fits nonparametric or semiparametric additive models
• supports the use of multidimensional data
• supports multiple SCORE statements
• enables you to specify the model degrees of freedom or smoothing parameter

PROC GAM can fit Gaussian, binomial, Poisson, and Gamma distributions. For each distribution, although theoretically more than one link can exist, PROC GAM always uses the canonical link. This is because the difference between link alternatives will be less pronounced for nonparametric models, in light of the flexibility of nonparametric model forms.
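For reference, the canonical links are the identity for the Gaussian, the logit for the binomial, the log for the Poisson, and the negative inverse for the Gamma distribution. A minimal Python sketch of the logit link, the one relevant to the binary kyphosis data analyzed later (this is a standard textbook function pair, not PROC GAM code):

```python
import math

def logit(mu):
    """Canonical link for the binomial family: maps a probability to the real line."""
    return math.log(mu / (1.0 - mu))

def inv_logit(eta):
    """Inverse link: maps an additive predictor back to a probability in (0, 1)."""
    return 1.0 / (1.0 + math.exp(-eta))

# Round trip: the link and its inverse undo each other.
p = 0.25
eta = logit(p)            # -1.0986...
assert abs(inv_logit(eta) - p) < 1e-12
```

Because the inverse link always lands in (0, 1), the fitted mean respects the bounds of a binary response, no matter what value the additive predictor takes.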

In the next section, we will introduce a data set appropriate for GAM analysis. Then, we will briefly discuss the GAM methodology. Finally, we will show a couple of examples using PROC GAM, comparing it with other procedures in the SAS/STAT system.

Kyphosis Study

The data introduced in this section are based on a study by Bell et al. (1989). Bell and his associates studied the results of multiple-level thoracic and lumbar laminectomy, a corrective spinal surgery commonly performed on children. The data in the study consist of retrospective measurements on 83 patients. The specific outcome of interest is the presence (1) or absence (0) of kyphosis (a severe forward flexion of the spine). The available predictor variables are Age in months at the time of the operation, the first vertebra level involved in the operation (StartVert), and the number of levels involved (NumVert). The goal of this analysis is to identify risk factors for kyphosis.

Figure 1 is a plot of the incidence of Kyphosis against StartVert, Age, and NumVert. It is difficult to see any trends in the data. This is often true of binary data, especially with multiple predictors. You can try to use linear logistic regression (PROC GENMOD or PROC LOGISTIC) to summarize the relationships, but without prior knowledge it is difficult to do so. PROC GAM, however, does not require any prior assumptions about the underlying relationship. It can find a valid form to represent the data in a smooth format, making it useful as a tool to visualize the pattern among the variables before using one of the parametric modeling procedures.

Figure 1. Plot of Kyphosis Data

Before using PROC GAM to analyze this data set, we will briefly introduce the generalized additive model in the next section.

Generalized Additive Model

Let Y be a response random variable and X_1, ..., X_p be a set of predictor variables. A regression procedure can be viewed as a method for estimating how the value of Y depends on the values of X_1, ..., X_p. The standard linear regression model assumes that the expected value of Y has a linear form:

    E(Y) = f(X_1, ..., X_p) = β_0 + β_1 X_1 + ... + β_p X_p

Given a sample of values for Y and X, estimates of β_0, β_1, ..., β_p are often obtained by the least squares method.

The additive model generalizes the linear model by modeling the expected value of Y as

    E(Y) = f(X_1, ..., X_p) = s_0 + s_1(X_1) + ... + s_p(X_p)

where s_i(X_i), i = 1, ..., p, are smooth functions. These functions are not given a parametric form but instead are estimated in a nonparametric fashion.

Generalized additive models extend traditional linear models in another way, namely by allowing for a link between f(X_1, ..., X_p) and the expected value of Y. This amounts to allowing for an alternative distribution for the underlying random variation besides just the normal distribution. While Gaussian models can be used in many statistical applications, there are types of problems for which they are not appropriate. For example, the normal distribution may not be adequate for modeling discrete responses such as counts, or bounded responses such as proportions. Thus, generalized additive models can be applied to a much wider range of data analysis problems.

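To make the additive form concrete, here is a minimal Python sketch with made-up smooth components (illustrative only; PROC GAM estimates the s_i from the data):

```python
import math

s0 = 2.0                              # intercept
components = {
    "X1": lambda x: math.sin(x),      # made-up smooth effect of X1
    "X2": lambda x: math.log(1 + x),  # made-up smooth effect of X2
}

def additive_mean(x):
    """E(Y) = s0 + s1(X1) + s2(X2): each predictor contributes its own term."""
    return s0 + sum(s(x[name]) for name, s in components.items())

# Interpretability: each term reports how E(Y) moves with its own predictor,
# independently of the values of the other predictors.
point = {"X1": math.pi / 2, "X2": 0.0}
contributions = {name: s(point[name]) for name, s in components.items()}
print(contributions)        # {'X1': 1.0, 'X2': 0.0}
print(additive_mean(point)) # 3.0
```

This per-term decomposition is exactly what makes additive fits easier to interpret than an unrestricted multivariate surface: each component can be plotted against its own predictor.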
Generalized additive models consist of a random component, an additive component, and a link function relating these two components. The response Y, the random component, is assumed to have a density in the exponential family

    f_Y(y; θ, φ) = exp{ [yθ - b(θ)] / a(φ) + c(y, φ) }

where θ is called the natural parameter and φ is the scale parameter. The normal, binomial, and Poisson distributions are all in this family, along with many others. The quantity

    η = s_0 + Σ_{i=1}^{p} s_i(X_i)

where s_1(·), ..., s_p(·) are smooth functions, defines the additive component. Finally, the relationship between the mean μ of the response variable and η is defined by a link function g(μ) = η. The most commonly used link function is the canonical link, for which η = θ.

Generalized additive models and generalized linear models can be applied in similar situations, but they serve different analytic purposes. Generalized linear models emphasize estimation and inference for the parameters of the model, while generalized additive models focus on exploring data nonparametrically. Generalized additive models are more suitable for exploring the data set and visualizing the relationship between the dependent variable and the independent variables.

Fitting Generalized Additive Models

In this section, the general method of fitting additive models will be outlined. The two important pieces are the backfitting and local scoring algorithms.

Consider the estimation of the smoothing terms s_0, s_1(·), ..., s_p(·) in the additive model. Many ways are available to approach the formulation and estimation of additive models. The backfitting algorithm is a general algorithm that can fit an additive model using any regression-type smoother.

Define the jth set of partial residuals as

    R_j = Y - s_0 - Σ_{k≠j} s_k(X_k)

The partial residuals remove the effects of all the other variables from Y; therefore, they can be used to model the effects against X_j. This is the foundation for the backfitting algorithm, providing a way of estimating each smoothing function s_j(·) given estimates {ŝ_i(·), i ≠ j} for all the others. The backfitting algorithm is iterative, starting with initial functions s_0, ..., s_p and, with each iteration, cycling through the partial residuals, fitting the individual smoothing components to their partial residuals. Iteration proceeds until the individual components do not change.

The algorithm described so far fits only additive models. The algorithm for generalized additive models is a little more complicated. Generalized additive models extend generalized linear models in the same manner as additive models extend linear regression models, that is, by replacing the linear form α + Σ_j X_j β_j with the additive form α + Σ_j s_j(X_j).

In the same way, estimation of the additive terms for generalized additive models is accomplished by replacing the weighted linear regression for the adjusted dependent variable with the weighted backfitting algorithm, essentially fitting a weighted additive model. The algorithm used in this case is called the local scoring algorithm. It is also an iterative algorithm and starts with initial estimates of s_0, s_1, ..., s_p. During each iteration, an adjusted dependent variable and a set of weights are computed, and then the smoothing components are estimated using a weighted backfitting algorithm. The scoring algorithm stops when the deviance of the estimates ceases to decrease.

Overall, then, the estimating procedure for generalized additive models consists of two loops. Inside each step of the local scoring algorithm (outer loop), a weighted backfitting algorithm (inner loop) is used until convergence. Then, based on the estimates from this weighted backfitting algorithm, a new set of weights is calculated, and the next iteration of the scoring algorithm starts.

Any nonparametric smoothing method can be used to obtain ŝ_i(x). The GAM procedure implements the B-spline and local regression methods for univariate smoothing components and the thin-plate smoothing spline for bivariate smoothing components. More detailed descriptions of these smoothing methods can be found in the SAS/STAT User's Guide, Version 8.

The smoothers used in PROC GAM have a single smoothing parameter. The generalized cross validation (GCV) function has been widely used in many nonparametric regression methods as a criterion to choose the smoothing parameters. The GCV function approximates the expected prediction error, so the model selected by the GCV function is judged to have the best prediction ability. In addition to choosing the smoothing parameter automatically by GCV, the GAM procedure also gives you the option of specifying the degrees of freedom for each individual smoothing component.

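The backfitting loop can be sketched in a few lines. The following toy Python version substitutes a simple k-nearest-neighbor running-mean smoother for the B-spline and local regression smoothers that PROC GAM actually uses, but it cycles through the partial residuals R_j exactly as outlined above:

```python
def running_mean_smoother(x, r, k=7):
    """Toy univariate smoother: average the partial residuals r over the
    (approximately) k points whose x-values are nearest each target point."""
    n = len(x)
    order = sorted(range(n), key=lambda i: x[i])
    fitted = [0.0] * n
    for rank, i in enumerate(order):
        lo = max(0, rank - k // 2)
        hi = min(n, lo + k)
        neighbors = order[lo:hi]
        fitted[i] = sum(r[j] for j in neighbors) / len(neighbors)
    return fitted

def backfit(y, xs, n_iter=20):
    """Fit y ~ s0 + sum_j s_j(x_j) by cycling over partial residuals.
    xs is a list of predictor vectors, one per additive component."""
    n, p = len(y), len(xs)
    s0 = sum(y) / n                    # intercept: overall mean of y
    s = [[0.0] * n for _ in range(p)]  # current estimates of each s_j
    for _ in range(n_iter):
        for j in range(p):
            # Partial residuals R_j = y - s0 - sum_{k != j} s_k(x_k)
            r = [y[i] - s0 - sum(s[k][i] for k in range(p) if k != j)
                 for i in range(n)]
            s[j] = running_mean_smoother(xs[j], r)
            mean_j = sum(s[j]) / n     # center each component at zero
            s[j] = [v - mean_j for v in s[j]]
    return s0, s
```

Each component is centered after smoothing so that the intercept s_0 absorbs the overall mean; without this, the decomposition would not be identifiable. A fixed iteration count stands in for the convergence test described above.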
Analysis of Kyphosis Data

In this section, the Kyphosis data are used to illustrate the use of the GAM procedure. The syntax of the GAM procedure is similar to that of other regression procedures in the SAS software. The following statements may be used to fit a GAM model.

PROC GAM data=kyphosis;
   model kyphosis = spline(NumVert,df=3)
                    spline(Age,df=3)
                    spline(StartVert,df=3)
                    / dist=logist;
   output out=estimate p uclm lclm;
run;

The above statements request PROC GAM to fit a logistic additive model with binary dependent variable Kyphosis and ordinary independent variables NumVert, Age, and StartVert. Each term is fitted using a B-spline smoother with 3 degrees of freedom. Although this might seem to be an unduly modest amount of flexibility, it is better to be conservative with a data set of 83 data points. Also, the OUTPUT statement is used to write the estimated function and confidence limits to the data set estimate. The output from PROC GAM is listed in Figure 2 and Figure 3.

                          The GAM Procedure
                    Dependent Variable: Kyphosis
   Smoothing Model Component(s): spline(Age) spline(StartVert) spline(NumVert)

                     Summary of Input Data Set
          Number of Observations                      83
          Number of Missing Observations               0
          Distribution                          Binomial
          Link Function                            Logit

                Iteration Summary and Fit Statistics
          Number of local score iterations             9
          Local score convergence criterion 2.6635758E-9
          Final Number of Backfitting Iterations       1
          Final Backfitting Criterion       5.2326788E-9
          The Deviance of the Final Estimate 46.610922317

Figure 2. Summary Statistics

The first part of the output from PROC GAM (Figure 2) summarizes the input data set and provides a summary for the backfitting and local scoring algorithms. The second part of the output (Figure 3) provides analytical information about the fitted model. The critical part of the output is the "Analysis of Deviance" table, shown in Figure 3. For each smoothing effect in the model, this table gives a χ²-test comparing the deviance between the full model and the model without this variable. In this case, the analysis of deviance results indicate that the effects of Age and StartVert are significant, while the effect of NumVert is insignificant.

                          The GAM Procedure
                    Dependent Variable: Kyphosis
   Smoothing Model Component(s): spline(Age) spline(StartVert) spline(NumVert)

                     Regression Model Analysis
                       Parameter Estimates

                           Parameter    Standard
   Parameter               Estimate       Error    t Value   Pr > |t|
   Intercept               -2.01533     1.45620      -1.38     0.1706
   Linear(Age)              0.01213     0.00794       1.53     0.1308
   Linear(StartVert)       -0.18615     0.07628      -2.44     0.0171
   Linear(NumVert)          0.38347     0.19102       2.01     0.0484

                     Smoothing Model Analysis
               Fit Summary for Smoothing Components

                                                                Num
   Smoothing                                                 Unique
   Component             Parameter        DF          GCV       Obs
   Spline(Age)            0.999996  3.000000   328.512864        66
   Spline(StartVert)      0.999551  3.000000   317.646703        16
   Spline(NumVert)        0.921758  3.000000    20.144058        10

                     Smoothing Model Analysis
                       Analysis of Deviance

                                  Sum of
   Source                  DF    Squares  Chi-Square  Pr > ChiSq
   Spline(Age)        3.00000  10.494369     16.4358      0.0009
   Spline(StartVert)  3.00000   5.494968      8.6060      0.0350
   Spline(NumVert)    3.00000   2.184518      3.4213      0.3311

Figure 3. Model Fit Statistics

The partial predictions for each predictor are plotted in Figure 4. Notice that the 95% confidence limits for NumVert cover the zero axis, confirming the insignificance of this term. The plot also shows that the partial predictions corresponding to both Age and StartVert have a strong quadratic pattern.

Having used the GAM procedure to discover an appropriate form for the dependence of Kyphosis on the independent variables, you can use the GENMOD procedure to fit and assess the corresponding parametric model. The following code fits a GENMOD model with quadratic terms for Age and StartVert, including tests for the joint linear and quadratic effects of each variable. The resulting contrast tests are shown in Figure 5.

PROC GENMOD data=kyphosis(where=(NumVert ^= 14));
   model kyphosis = Age Age*Age
                    StartVert StartVert*StartVert
                    / link=logit dist=binomial;
   contrast 'Age' Age 1, Age*Age 1;
   contrast 'StartVert' StartVert 1,
            StartVert*StartVert 1;
run;

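The p-values in the "Analysis of Deviance" table are obtained by referring each χ² statistic to a chi-square distribution with the corresponding degrees of freedom. As a quick sanity check in Python (using the closed-form χ² survival function for 3 degrees of freedom rather than any SAS routine), the reported p-value for Spline(Age) can be reproduced from its χ² statistic:

```python
import math

def chi2_sf_3df(x):
    """P(X > x) for a chi-square variable with 3 degrees of freedom.
    For odd df the survival function has a closed form in erfc."""
    return (math.erfc(math.sqrt(x / 2.0))
            + math.sqrt(2.0 * x / math.pi) * math.exp(-x / 2.0))

p_age = chi2_sf_3df(16.4358)
print(round(p_age, 4))   # 0.0009, matching the table
```

The same calculation applied to the other two rows reproduces their p-values as well, confirming that each test simply compares a deviance difference against a χ² reference distribution.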
Figure 4. Partial Prediction for Each Predictor

                       The GENMOD Procedure

     PROC GENMOD is modeling the probability that Kyphosis='1'.

                         Contrast Results

                               Chi-
        Contrast     DF      Square    Pr > ChiSq    Type

        Age           2       11.78        0.0028    LR
        StartVert     2       21.49        <.0001    LR

Figure 5. The GENMOD Output

The results for the quadratic GENMOD model are quite consistent with the GAM results.

Comparing PROC GAM with PROC TPSPLINE

In this section, we compare the model fitted by PROC GAM with the model fitted by PROC TPSPLINE.

The data represent the deposition of sulfate (SO4) at 179 sites in the 48 contiguous states of the United States in 1990. Each observation records the latitude and longitude of the site as well as the SO4 deposition at the site, measured in grams per square meter (g/m²). The data are plotted in Figure 6.

Figure 6. Raw Data Plot

GAM assumes an additive model; that is, the difference between SO4 values at two longitude points is identical across the whole range of latitudes, while the thin-plate smoothing spline method used in PROC TPSPLINE does not make that assumption. The benefit of the assumption is that PROC GAM runs much faster than PROC TPSPLINE. The downside is that the additive assumption may not be appropriate for spatial data like this.

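This additivity restriction is easy to state in code. A minimal Python check, with made-up component functions standing in for the fitted smooth curves (they are illustrative, not the PROC GAM estimates):

```python
import math

# Hypothetical smooth components of an additive surface (illustration only).
s_lon = lambda lon: math.cos(lon / 20.0)
s_lat = lambda lat: 0.01 * (lat - 36.0) ** 2

def so4_additive(lon, lat):
    return s_lon(lon) + s_lat(lat)

# Under additivity, the difference between two longitudes is the same at
# every latitude -- exactly the restriction described in the text.
d_south = so4_additive(80.0, 30.0) - so4_additive(110.0, 30.0)
d_north = so4_additive(80.0, 45.0) - so4_additive(110.0, 45.0)
assert abs(d_south - d_north) < 1e-12
```

A thin-plate spline surface with a genuine longitude-latitude interaction would fail this check, which is why TPSPLINE can capture localized features, such as separate coastal maxima, that an additive fit cannot.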
The following code produces the estimates plotted in Figure 7.

data pred;
   do latitude = 25 to 47 by 1;
      do longitude = 68 to 124 by 1;
         output;
      end;
   end;
run;

PROC TPSPLINE data=so4;
   model so4 = (longitude, latitude)
         / lognlambda = 0.277;
   score data=pred out=estimates1;
run;

PROC GAM data=so4;
   model so4 = spline(longitude)
               spline(latitude);
   score data=pred out=estimate2;
run;

Both procedures score on a data set pred, which has grid points within the ranges of longitude and latitude. The plots of predictions for the two methods are shown in Figure 7.

From Figure 7, we can see that the TPSPLINE fit is more complicated than the GAM fit. It is flat in the western region and rises dramatically in the eastern region. Along the eastern coast, it has two maxima, which correspond to the New York and Atlanta regions. The GAM fit shows only that the eastern region has a higher SO4 concentration than the western region. However, the GAM procedure runs in about a quarter of the time required for the TPSPLINE procedure. In general, the algorithm for PROC GAM requires O(n) operations, while the algorithm for PROC TPSPLINE requires O(n³) operations. Thus GAM can be used on much larger data sets than TPSPLINE.

Figure 7. Comparing the GAM Fit with the TPSPLINE Fit

Conclusion

In this paper, we have discussed a new addition to SAS/STAT software. The GAM procedure is a very powerful tool in exploratory analysis when you have little prior information about the data or you want to find new features that parametric tools ignore. When combined with other parametric regression procedures, GAM can guide you in fitting parametric models. However, in some cases, it may produce less accurate results than other nonparametric regression procedures because of the additive assumption.

References

Bell, D., Walker, J., O'Connor, G., Orrel, J., and Tibshirani, R. (1989), "Spinal Deformation Following Multi-Level Thoracic and Lumbar Laminectomy in Children," submitted for publication.

Cleveland, W.S., Devlin, S.J., and Grosse, E. (1988), "Regression by Local Fitting," Journal of Econometrics, 37, 87–114.

Friedman, J.H. and Stuetzle, W. (1981), "Projection Pursuit Regression," Journal of the American Statistical Association, 76, 817–823.

Hastie, T.J. and Tibshirani, R.J. (1990), Generalized Additive Models, New York: Chapman and Hall.

Stone, C.J. (1985), "Additive Regression and Other Nonparametric Models," Annals of Statistics, 13, 689–705.

Appendix

The two data sets used in this paper are listed in this section.

data kyphosis;
   input Age StartVert NumVert Kyphosis @@;
   datalines;
71 5 3 0 158 14 3 0 128 5 4 1
2 1 5 0 1 15 4 0 1 16 2 0
61 17 2 0 37 16 3 0 113 16 2 0
59 12 6 1 82 14 5 1 148 16 3 0
18 2 5 0 1 12 4 0 243 8 8 0
168 18 3 0 1 16 3 0 78 15 6 0
175 13 5 0 80 16 5 0 27 9 4 0
22 16 2 0 105 5 6 1 96 12 3 1
131 3 2 0 15 2 7 1 9 13 5 0
12 2 14 1 8 6 3 0 100 14 3 0
4 16 3 0 151 16 2 0 31 16 3 0
125 11 2 0 130 13 5 0 112 16 3 0
140 11 5 0 93 16 3 0 1 9 3 0
52 6 5 1 20 9 6 0 91 12 5 1
73 1 5 1 35 13 3 0 143 3 9 0
61 1 4 0 97 16 3 0 139 10 3 1
136 15 4 0 131 13 5 0 121 3 3 1
177 14 2 0 68 10 5 0 9 17 2 0
139 6 10 1 2 17 2 0 140 15 4 0
72 15 5 0 2 13 3 0 120 8 5 1
51 9 7 0 102 13 3 0 130 1 4 1
114 8 7 1 81 1 4 0 118 16 3 0
118 16 4 0 17 10 4 0 195 17 2 0
159 13 4 0 18 11 4 0 15 16 5 0
158 15 4 0 127 12 4 0 87 16 4 0
206 10 4 0 11 15 3 0 178 15 4 0
157 13 3 1 26 13 7 0 120 13 2 0
42 6 7 1 36 13 4 0
;

data so4;
   input latitude longitude so4 @@;
   datalines;
32.45833 87.24222 1.403 34.28778 85.96889 2.103
33.07139 109.86472 0.299 36.07167 112.15500 0.304
31.95056 112.80000 0.263 33.60500 92.09722 1.950
34.17944 93.09861 2.168 36.08389 92.58694 1.578
36.10056 94.17333 1.708 39.00472 123.08472 0.096
36.56694 118.77722 0.259 41.76583 122.47833 0.065
34.80611 119.01139 0.053 38.53528 121.77500 0.135
37.44139 105.86528 0.247 38.11778 103.31611 0.326
39.99389 105.48000 0.687 39.40306 107.34111 0.225
39.42722 107.37972 0.339 37.19806 108.49028 0.559
40.50750 107.70194 0.250 40.36417 105.58194 0.307
40.53472 106.78000 0.564 37.75139 107.68528 0.557
39.10111 105.09194 0.371 40.80639 104.75472 0.286
29.97472 82.19806 1.279 28.54278 80.64444 1.564
25.39000 80.68000 0.912 30.54806 84.60083 1.243
27.38000 82.28389 0.991 32.14111 81.97139 1.225
33.17778 84.40611 1.580 31.47306 83.53306 0.851
43.46139 113.55472 0.180 43.20528 116.74917 0.103
44.29778 116.06361 0.161 40.05333 88.37194 2.940
41.84139 88.85111 2.090 41.70111 87.99528 3.171
37.71000 89.26889 2.523 38.71000 88.74917 2.317
37.43556 88.67194 3.077 40.93333 90.72306 2.006
40.84000 85.46389 2.725 38.74083 87.48556 3.158
41.63250 87.08778 3.443 40.47528 86.99222 2.775
42.90972 91.47000 1.154 40.96306 93.39250 1.423
37.65111 94.80361 1.863 39.10222 96.60917 0.898
38.67167 100.91639 0.536 37.70472 85.04889 2.693
37.07778 82.99361 2.195 38.11833 83.54694 2.762
36.79056 88.06722 2.377 29.92972 91.71528 1.276
30.81139 90.18083 1.393 44.37389 68.26056 2.268
46.86889 68.01472 1.551 44.10750 70.72889 1.631
45.48917 69.66528 1.369 39.40889 76.99528 2.535
38.91306 76.15250 2.477 41.97583 70.02472 1.619
42.39250 72.34444 2.156 42.38389 71.21472 2.417
45.56083 84.67833 1.701 46.37417 84.74139 1.539
47.10472 88.55139 1.048 42.41028 85.39278 3.107
44.22389 85.81806 2.258 47.53111 93.46861 0.550
47.94639 91.49611 0.563 46.24944 94.49722 0.591
44.23722 95.30056 0.604 32.30667 90.31833 1.614
32.33472 89.16583 1.135 34.00250 89.80000 1.503
38.75361 92.19889 1.814 36.91083 90.31861 2.435
45.56861 107.43750 0.217 48.51028 113.99583 0.387
48.49917 109.79750 0.100 46.48500 112.06472 0.209
41.15306 96.49278 0.743 41.05917 100.74639 0.391
36.13583 115.42556 0.139 41.28528 115.85222 0.075
38.79917 119.25667 0.053 39.00500 114.21583 0.273
43.94306 71.70333 2.391 40.31500 74.85472 2.593
33.22028 108.23472 0.377 35.78167 106.26750 0.315
32.90944 105.47056 0.355 36.04083 106.97139 0.376
36.77889 103.98139 0.326 42.73389 76.65972 3.249
42.29944 79.39639 3.344 43.97306 74.22306 2.322
44.39333 73.85944 2.111 41.35083 74.04861 3.306
43.52611 75.94722 3.948 42.10639 77.53583 2.231
41.99361 74.50361 3.022 36.13250 77.17139 1.857
35.06056 83.43056 2.393 35.69694 80.62250 2.082
35.02583 78.27833 1.729 34.97083 79.52833 1.959
35.72833 78.68028 1.780 47.60139 103.26417 0.354
48.78250 97.75417 0.306 47.12556 99.23694 0.273
39.53139 84.72417 3.828 40.35528 83.06611 3.401
39.79278 81.53111 3.961 40.78222 81.92000 3.349
36.80528 98.20056 0.603 34.98000 97.52139 0.994
36.59083 101.61750 0.444 44.38694 123.62306 0.629
44.63472 123.19000 0.329 45.44917 122.15333 0.716
43.12167 121.05778 0.050 44.21333 122.25333 0.423
43.89944 117.42694 0.071 45.22444 118.51139 0.109
40.78833 77.94583 3.275 41.59778 78.76750 4.336
40.65750 77.93972 3.352 41.32750 74.82028 3.081
33.53944 80.43500 1.456 44.35500 98.29083 0.372
43.94917 101.85833 0.224 35.96139 84.28722 3.579
35.18250 87.19639 2.148 35.66444 83.59028 2.474
35.46778 89.15861 1.811 33.95778 102.77611 0.376
28.46667 97.70694 0.886 29.66139 96.25944 0.934
30.26139 100.55500 0.938 32.37861 94.71167 2.229
31.56056 94.86083 1.472 33.27333 99.21528 0.890
33.39167 97.63972 1.585 37.61861 112.17278 0.237
41.65833 111.89694 0.271 38.99833 110.16528 0.143
41.35750 111.04861 0.172 42.87611 73.16333 2.412
44.52833 72.86889 2.549 38.04056 78.54306 2.478
37.33139 80.55750 1.650 38.52250 78.43583 2.360
47.86000 123.93194 1.144 48.54056 121.44528 0.837
46.83528 122.28667 0.635 46.76056 117.18472 0.255
37.98000 80.95000 2.396 39.08972 79.66222 3.291
45.79639 88.39944 1.054 45.05333 88.37278 1.457
44.66444 89.65222 1.044 43.70194 90.56861 1.309
46.05278 89.65306 1.132 42.57917 88.50056 1.809
45.82278 91.87444 0.984 41.34028 106.19083 0.335
42.73389 108.85000 0.236 42.49472 108.82917 0.313
42.92889 109.78667 0.182 43.22278 109.99111 0.161
43.87333 104.19222 0.306 44.91722 110.42028 0.210
45.07611 72.67556 2.646
;

Contact Information

Dong Xiang, SAS Institute Inc., SAS Campus Drive, Cary, NC 27513. Phone (919) 531-4854, FAX (919) 677-4444, Email dong.xiang@sas.com.

SAS software and SAS/STAT are registered trademarks or trademarks of SAS Institute Inc. in the USA and other countries. ® indicates USA registration. Other brand and product names are registered trademarks or trademarks of their respective companies.
