
AS2010

Introduction To Aerospace Engineering


Report

Name: P. Pranay Chowdary


Roll No: AE18B034
1 Results:
The following graphs show the least squares fit for the different data sets and different numbers of points.

• Data set 1 :

[Figures: least squares fit of data set 1 with 25, 75, and 150 points]

• Data set 2 :

[Figures: least squares fit of data set 2 with 25, 75, and 150 points]

• Data set 3 :

2 Theory of the least squares fit:


The least squares fit is the straight line that minimizes the sum of squared vertical distances from the data points to the line. The term "least squares" is used because the line achieves the smallest sum of squared errors, a quantity proportional to the variance of the residuals.

Given data (x_1, y_1), \ldots, (x_n, y_n), we may define the error of the approximation y = mx + c by

E(m, c) = \sum_{i=1}^{n} \left( y_i - (m x_i + c) \right)^2 \qquad (1)

This is n times the variance of the residuals, and it makes no difference whether we minimize the variance or n times the variance. Note that the error is a function of the two variables m and c.
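The definition in (1) can be sketched directly in Python (a minimal illustration; the data values here are made up and are not one of the report's data sets):

```python
def error(m, c, xs, ys):
    """Sum of squared vertical distances E(m, c) for the line y = m*x + c."""
    return sum((y - (m * x + c)) ** 2 for x, y in zip(xs, ys))

# Illustrative data lying exactly on y = 2x + 1 (not the report's data)
xs = [0.0, 1.0, 2.0, 3.0]
ys = [1.0, 3.0, 5.0, 7.0]

print(error(2.0, 1.0, xs, ys))  # 0.0 — the exact line has zero error
```

Any other choice of (m, c) gives a strictly larger error for this data.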

[Figures: least squares fit of data set 3 with 25, 75, and 150 points]

The goal is to find the values of m and c that minimize this error. From multivariable calculus, this requires finding (m, c) such that

\frac{\partial E}{\partial m} = 0, \qquad \frac{\partial E}{\partial c} = 0 \qquad (2)

Solving these two equations gives the same pair of equations derived below, so to obtain the least squares fit we use the following method.
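Condition (2) can be checked numerically: at the minimizing (m, c), nudging either parameter should not decrease the error. A small self-contained Python sketch with illustrative data (not one of the report's data sets):

```python
def E(m, c, pts):
    """Sum of squared residuals for the line y = m*x + c."""
    return sum((y - (m * x + c)) ** 2 for x, y in pts)

# Illustrative noisy data (made up for this sketch)
pts = [(0, 1.1), (1, 2.9), (2, 5.2), (3, 6.8)]

# Closed-form solution of the normal equations dE/dm = 0, dE/dc = 0
n = len(pts)
sx = sum(x for x, _ in pts)
sy = sum(y for _, y in pts)
sxx = sum(x * x for x, _ in pts)
sxy = sum(x * y for x, y in pts)
m = (n * sxy - sx * sy) / (n * sxx - sx * sx)
c = (sy - m * sx) / n

# Because (m, c) is a minimum, any small perturbation increases the error
h = 1e-3
assert E(m, c, pts) <= E(m + h, c, pts)
assert E(m, c, pts) <= E(m - h, c, pts)
assert E(m, c, pts) <= E(m, c + h, pts)
assert E(m, c, pts) <= E(m, c - h, pts)
```

Since E is a convex quadratic in (m, c), the stationary point of (2) is its unique global minimum.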

3 Method followed to obtain the above results:


• Let the given set of points be:

D = \begin{pmatrix} x_1 & y_1 \\ x_2 & y_2 \\ x_3 & y_3 \\ \vdots & \vdots \\ x_n & y_n \end{pmatrix}

(The symbol D is used here so that P can be reserved for A^T B below.)
• Now, let us form two matrices from the given set of points, of the form

A = \begin{pmatrix} x_1 & 1 \\ x_2 & 1 \\ x_3 & 1 \\ \vdots & \vdots \\ x_n & 1 \end{pmatrix}, \qquad B = \begin{pmatrix} y_1 \\ y_2 \\ y_3 \\ \vdots \\ y_n \end{pmatrix}

• Let X be the column vector with entries m and c, and write the (overdetermined) system

B = AX \qquad (3)

which in full is

\begin{pmatrix} y_1 \\ y_2 \\ y_3 \\ \vdots \\ y_n \end{pmatrix} = \begin{pmatrix} x_1 & 1 \\ x_2 & 1 \\ x_3 & 1 \\ \vdots & \vdots \\ x_n & 1 \end{pmatrix} \begin{pmatrix} m \\ c \end{pmatrix}

On expanding, this gives

y_1 = m x_1 + c
y_2 = m x_2 + c
y_3 = m x_3 + c
\vdots
y_n = m x_n + c

• Multiplying (3) on the left by the transpose of A gives

A^T B = A^T A X \qquad (4)

Let A^T B = P and A^T A = Q; then eq. (4) becomes

P = QX \qquad (5)

In matrix form,

\begin{pmatrix} p_1 \\ p_2 \end{pmatrix} = \begin{pmatrix} q_{11} & q_{12} \\ q_{21} & q_{22} \end{pmatrix} \begin{pmatrix} m \\ c \end{pmatrix}

• On simplifying, this gives two equations in m and c:

p_1 = q_{11} m + q_{12} c \qquad (6)

p_2 = q_{21} m + q_{22} c \qquad (7)

Solving (6) and (7) gives m and c.

• Now we can plot y = mx + c, the least squares fit line, with the values of m and c found above.
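The steps above can be sketched in Python with NumPy (a minimal sketch using illustrative data, not one of the report's data sets; P and Q follow the naming used above):

```python
import numpy as np

def fit_line(x, y):
    """Least squares line y = m*x + c via the normal equations A^T A X = A^T B."""
    A = np.column_stack([x, np.ones_like(x)])  # rows [x_i, 1]
    B = np.asarray(y, dtype=float)
    Q = A.T @ A          # 2x2 matrix, A^T A
    P = A.T @ B          # 2-vector, A^T B
    m, c = np.linalg.solve(Q, P)  # solve the 2x2 system P = QX
    return m, c

# Illustrative data lying exactly on y = 3x - 2
x = np.array([0.0, 1.0, 2.0, 3.0, 4.0])
y = 3 * x - 2
m, c = fit_line(x, y)
print(m, c)  # approximately 3.0 and -2.0
```

For noisy data the same call returns the slope and intercept that minimize E(m, c) from eq. (1).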

4 Inference:
Comparing the results with the Python code, the variations in the graphs match. Data set 1 shows large variation for different numbers of points, data set 2 shows less, and data set 3 shows negligible variation. This indicates a large measurement error in data set 1, a smaller error in data set 2, and a negligible error in data set 3.
