
Regression Analysis

An Approximation Problem
Related Engineering Problems

Selecting the most appropriate …
 Approximation function
 Steps:
1. Select a function type (polynomial, exponential, logarithmic, etc.)
2. Optimize its coefficients to obtain “the best” approximating function
3. Try some other function types
4. Optimize their coefficients
5. Compare and decide the best of the best
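The select–optimize–compare loop above can be sketched in a few lines. This is a minimal illustration, assuming the least-sum-of-squared-errors criterion for “the best” (the data, the two trial families, and the coarse grid search are all hypothetical, chosen only to show the loop):

```python
# Compare two trial function families on the same data:
# (1) optimize each family's coefficient, (2) compare the best errors.
# Hypothetical data, roughly following y = 2x.
xs = [0.0, 1.0, 2.0, 3.0, 4.0]
ys = [0.1, 2.1, 3.9, 6.2, 7.9]

def sse(g, coef):
    """Sum of squared errors of candidate g with coefficient coef."""
    return sum((y - g(x, coef)) ** 2 for x, y in zip(xs, ys))

def best_coef(g):
    """Coarse grid search for the coefficient minimizing the SSE."""
    grid = [c / 100.0 for c in range(-500, 501)]   # c in [-5, 5], step 0.01
    return min(grid, key=lambda c: sse(g, c))

linear = lambda x, a: a * x          # trial family 1: g(x) = a*x
quadratic = lambda x, a: a * x * x   # trial family 2: g(x) = a*x^2

for name, g in [("linear", linear), ("quadratic", quadratic)]:
    a = best_coef(g)
    print(name, "best a =", a, "SSE =", sse(g, a))
```

For this data the linear family wins the final comparison; with differently shaped data the quadratic family would.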

What is “the best”?


Selecting the most appropriate …
 Here, we are trying to evaluate a linear approximation function

[Figure: measured data plotted against h, with candidate linear fits A, B, and D]

Which is “the best” one?


Selecting the most appropriate …
 Then, we are trying to evaluate a curved approximation function

[Figure: measured data plotted against h, with candidate curved fits A, B, and C]

Which is “the best” one?


Criteria for “The Best”
 Many alternatives (choices) are available
 How to select? … it depends on the problem!
 Some problems concern average values
 Others concern extreme values
 Some problems need uniform accuracy throughout the range of data; others need more accuracy in a specific range
 Examples:
 To select “the best”, we may seek the minimum sum of all “errors” (distances between measured data and approximating values). This method suits data whose required or available accuracy is uniform across the range.
 Another way is to minimize the maximum error found among the data.
 Sometimes a weighting function is introduced before optimizing
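The two example criteria can disagree. A small sketch (hypothetical data with one extreme point, and two hypothetical candidate lines) showing that the sum-of-squared-errors criterion and the minimize-the-maximum-error criterion need not pick the same candidate:

```python
# Two criteria for "the best": least sum of squared errors vs. minimax.
# Hypothetical data with one extreme point, and two hypothetical candidates.
xs = [0, 1, 2, 3, 4]
ys = [0, 1, 2, 3, 10]

candidates = {
    "A: y = x":     lambda x: x,       # tracks most points, misses the extreme one
    "B: y = x + 3": lambda x: x + 3,   # moderate error everywhere
}

def errors(g):
    return [y - g(x) for x, y in zip(xs, ys)]

for name, g in candidates.items():
    e = errors(g)
    sse = sum(ei ** 2 for ei in e)       # average-oriented criterion
    max_err = max(abs(ei) for ei in e)   # extreme-oriented criterion
    print(name, "SSE =", sse, "max|error| =", max_err)
```

Here the squared-error criterion prefers candidate A (SSE 36 vs. 45), while the minimax criterion prefers candidate B (max error 3 vs. 6), which is exactly why the choice of criterion depends on the problem.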
Selecting the most appropriate …
 Optimize the approximation function

[Figure: two candidate fits of Q vs. h, each annotated with its individual errors and its maximum error]
The Least Squares Method
 Full expression: “the least sum of squares of errors”
 This is the most popular method/criterion
 The squares of the differences between the observed and approximate data are summed. The sum is then minimized by adjusting the coefficients of the evaluated trial function (approximating function)
 This means that all data points are treated equally, regardless of how many data points share the same or a close value.
 What happens when the method is applied to data mostly concentrated within a narrow range? (compared with the case of data evenly distributed over the range)
Computational Procedure
 Data: $y_i(x_i)$, $i = 1, 2, 3, \dots, n$
 Approximating function: $g(x)$
 The differences (errors) are evaluated at the $x_i$: $E_i = y_i - g(x_i)$ for all $i$.
 The squared errors: $E_i^2 = \left[ y_i - g(x_i) \right]^2$
 The sum of squared errors: $D^2 = \sum_{i=1}^{n} E_i^2$
 Find the minimum value of $D^2$ by adjusting the coefficients of $g(x)$. How do we accomplish it?
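The procedure above, written out directly (the data and the trial function are hypothetical, and the trial function's coefficient is deliberately not yet optimized):

```python
# Compute the errors E_i and the sum of squared errors D^2
# for a given trial function g, following the steps above.
xs = [1.0, 2.0, 3.0, 4.0]   # hypothetical x_i
ys = [2.2, 3.9, 6.1, 8.0]   # hypothetical measured y_i

def g(x):
    """Hypothetical trial function g(x) = 2x (coefficient not yet optimized)."""
    return 2.0 * x

E = [y - g(x) for x, y in zip(xs, ys)]   # errors E_i = y_i - g(x_i)
D2 = sum(e ** 2 for e in E)              # D^2 = sum of E_i^2
print("errors:", E, "D^2 =", D2)
```

Minimizing D^2 then means repeating this computation while the coefficients of g change, which is what the following slides do analytically.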
Minimizing
 The “place” where $\dfrac{\partial D^2}{\partial a}$ is zero is at the minimum

[Figure: $D^2$ plotted against one of the coefficients ($a$ or $b$); the minimum of the curve is marked “Min”]

 Example: when the approximating function we try is linear, $g(x) = ax + b$ …

Example of Finding the Minimum
 Try a linear approximating function $g(x) = ax + b$
 Adjust or find the combination of values of $a$ and $b$ giving the minimum value of $D^2$.
 We can accomplish the task by trying many arbitrary pairs of $a$ and $b$ values, and then evaluating whether they give the minimum value of $D^2$.
 However, when analytical solutions of $\dfrac{\partial D^2}{\partial a} = 0$ and $\dfrac{\partial D^2}{\partial b} = 0$ for $a$ and $b$ are available, we can obtain $a$ and $b$ in a faster way.
 Therefore …
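The brute-force approach mentioned above can be sketched as a grid search over (a, b) pairs. The data and the grid resolution here are hypothetical, chosen only to make the idea concrete; the analytical route on the next slides replaces all of this with two equations:

```python
# Brute-force minimization of D^2 over many (a, b) pairs
# for the linear trial function g(x) = a*x + b.
xs = [0.0, 1.0, 2.0, 3.0]
ys = [1.1, 2.9, 5.1, 7.0]   # hypothetical data, roughly y = 2x + 1

def D2(a, b):
    """Sum of squared errors for the pair (a, b)."""
    return sum((y - (a * x + b)) ** 2 for x, y in zip(xs, ys))

grid = [k / 10.0 for k in range(-50, 51)]   # a, b in [-5, 5], step 0.1
best_a, best_b = min(
    ((a, b) for a in grid for b in grid),
    key=lambda ab: D2(*ab),
)
print("best a =", best_a, "best b =", best_b, "D^2 =", D2(best_a, best_b))
```

Note the cost: 101 × 101 = 10,201 evaluations of D^2, and the answer is only as fine as the grid step. This is the inefficiency the analytical solution avoids.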
The first equation

$$\frac{\partial D^2}{\partial a} = 0; \qquad D^2 = \sum_{i=1}^{n} \left[ y_i - g(x_i) \right]^2 = \sum_{i=1}^{n} y_i^2 + \sum_{i=1}^{n} g(x_i)^2 - 2 \sum_{i=1}^{n} y_i\, g(x_i)$$

$$\frac{\partial y_i^2}{\partial a} = 0 \text{ for all } i \text{ (why?)}; \qquad g(x_i) = a x_i + b$$

$$\frac{\partial g(x_i)^2}{\partial a} = 2\, g(x_i)\, \frac{\partial g(x_i)}{\partial a} = 2\, g(x_i)\, x_i = 2 (a x_i + b)\, x_i \text{ for all } i$$

$$\frac{\partial}{\partial a}\, 2\, y_i\, g(x_i) = 2\, y_i\, x_i$$

$$\sum_{i=1}^{n} 2 x_i \left[ (a x_i + b) - y_i \right] = 0; \qquad \sum_{i=1}^{n} x_i \left[ (a x_i + b) - y_i \right] = 0$$

$$a \sum_{i=1}^{n} x_i^2 + b \sum_{i=1}^{n} x_i - \sum_{i=1}^{n} x_i y_i = 0$$
The second equation

$$\frac{\partial D^2}{\partial b} = 0; \qquad D^2 = \sum_{i=1}^{n} \left[ y_i - g(x_i) \right]^2 = \sum_{i=1}^{n} y_i^2 + \sum_{i=1}^{n} g(x_i)^2 - 2 \sum_{i=1}^{n} y_i\, g(x_i)$$

$$\frac{\partial y_i^2}{\partial b} = 0 \text{ for all } i; \qquad g(x_i) = a x_i + b$$

$$\frac{\partial g(x_i)^2}{\partial b} = 2\, g(x_i) \cdot 1 = 2 (a x_i + b) \text{ for all } i$$

$$\frac{\partial}{\partial b}\, 2\, y_i\, g(x_i) = 2\, y_i$$

$$\sum_{i=1}^{n} 2 \left[ (a x_i + b) - y_i \right] = 0; \qquad \sum_{i=1}^{n} \left[ (a x_i + b) - y_i \right] = 0$$

$$a \sum_{i=1}^{n} x_i + b\, n - \sum_{i=1}^{n} y_i = 0$$
The Resulting Matrix Equation

$$a \sum_{i=1}^{n} x_i^2 + b \sum_{i=1}^{n} x_i - \sum_{i=1}^{n} x_i y_i = 0$$

$$a \sum_{i=1}^{n} x_i + b\, n - \sum_{i=1}^{n} y_i = 0$$

$$\begin{bmatrix} \displaystyle\sum_{i=1}^{n} x_i^2 & \displaystyle\sum_{i=1}^{n} x_i \\[2ex] \displaystyle\sum_{i=1}^{n} x_i & n \end{bmatrix} \begin{bmatrix} a \\[1ex] b \end{bmatrix} = \begin{bmatrix} \displaystyle\sum_{i=1}^{n} x_i y_i \\[2ex] \displaystyle\sum_{i=1}^{n} y_i \end{bmatrix}$$
