Theory of Least Square Fitting
1 Theory of Least Square Fitting
Linear least square fitting is a technique used to find the best-fit line that minimizes the sum of the squared
residuals between the observed data points and the predicted values. Given a set of data points {(xi, yi)}, the
goal is to find the line of the form y = mx + c that best fits the data. The linear curve is:
y = mx + c
The slope and intercept are given by the least square fitting formulas:
Slope(m) = (n·Σ(x·y) − (Σx)·(Σy)) / (n·Σx² − (Σx)²)
Intercept(c) = ȳ − m·x̄
The best-fit line is then given by y = mx + c, where m and c are the values obtained from solving the normal
equations.
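The closed-form slope and intercept can be sanity-checked against NumPy's built-in least-squares routine: a degree-1 np.polyfit performs the same minimization, so the two results should agree to floating-point precision. A minimal sketch (the sample data here is illustrative, not from the text):

```python
import numpy as np

# illustrative data, roughly on the line y = 2x
x = np.array([1, 2, 3, 4, 5], dtype=float)
y = np.array([2.1, 4.0, 6.2, 7.9, 10.1])

n = len(x)
# closed-form normal-equation solution
m = (n * np.sum(x * y) - np.sum(x) * np.sum(y)) / (n * np.sum(x**2) - np.sum(x)**2)
c = np.mean(y) - m * np.mean(x)

# NumPy's degree-1 polynomial least-squares fit for comparison
m_ref, c_ref = np.polyfit(x, y, 1)
print(m, c)          # closed-form result
print(m_ref, c_ref)  # should match the closed form
```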
Code and output of linear least square fitting
# considering the equation to be y = mx + c
import matplotlib.pyplot as plt
import numpy as np

x = np.array([i for i in range(1, 7)])
y = np.array([1.5, 2.5, 3.5, 4.5, 13.6, 9.9])
xy = x * y
xx = x * x
n = len(x)
sxy = sum(xy)
sxx = sum(xx)
sx = sum(x)
sy = sum(y)

# m = {n*sum(x*y) - sum(x)*sum(y)} / {n*sum(x^2) - sum(x)*sum(x)}
m = (n*sxy - sx*sy) / (n*sxx - sx*sx)
c = (sy - m*sx) / n

z = []
for i in x:
    z.append(m*i + c)

print("Data points : ")
for i, j in zip(x, y):
    print(f'({i},{j})', end=' ')
print()
print("Fitted points : ")
for i, j in zip(x, z):
    print(f'({i},{j})', end=' ')

plt.scatter(x, y, color='blue', label='Data Points')
plt.plot(x, z, color='red', label='Fitted Line')
plt.xlabel("X axis")
plt.ylabel("Y axis")
plt.title("Linear Least Square Fitting")
plt.legend()
plt.grid()
plt.show()

Figure 1: Linear Function
Theory of least square fitting of power function
By taking the logarithm of both sides of the power function, you
can transform it into a linear equation that can be fitted using linear
regression. Here’s how you can do it:
• Start with the power function: y = ax^b
• Take the natural logarithm (loge) of both sides of the equation: loge(y) = loge(ax^b).
• Apply the properties of logarithms: loge(y) = loge(a) + b·loge(x)
• Now fit the curve in the same way as linear regression, Y = MX + C, where M and C are the
slope and intercept:
Y = loge(y), X = loge(x), M = b, C = loge(a)
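The log-log substitution above can be sketched compactly: generating noise-free data from a known a and b, a linear fit of loge(y) against loge(x) should recover the parameters exactly. This sketch uses np.polyfit for the linear step (the document's own code uses the explicit formulas instead); a = 2 and b = 3 are illustrative values chosen for the check:

```python
import numpy as np

x = np.array([1.0, 2.0, 3.0, 4.0, 5.0])
a_true, b_true = 2.0, 3.0
y = a_true * x**b_true          # exact power-law data

# linearize: log y = log a + b * log x
X, Y = np.log(x), np.log(y)
b_fit, loga = np.polyfit(X, Y, 1)  # slope = b, intercept = log a
a_fit = np.exp(loga)
print(a_fit, b_fit)  # ≈ 2.0, 3.0
```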
Code and output of least square fitting of power function
# considering the power function to be y = a*x**b
# log(y) = log(a) + b*log(x)
# comparing with the linear equation Y = mX + c :
# Y = log(y), X = log(x), m = b, c = log(a)
import math
import matplotlib.pyplot as plt
import numpy as np

x = np.array([i for i in range(1, 6)])
y = np.array([2, 16, 54, 128, 251])
X = np.log(x)
Y = np.log(y)
X_mean = np.mean(X)
Y_mean = np.mean(Y)
n = len(x)
sxy = np.sum(X*Y) - n*(X_mean*Y_mean)
sxx = np.sum(X*X) - n*(X_mean*X_mean)
# m = b
m = sxy/sxx
c = Y_mean - m*X_mean
a = math.exp(c)
print("Value of the slope = ", m)
print("Value of the intercept = ", c)
print()
print("The value of a in the function ", a)
b = m
print("The value of b in the function ", b)

z = []
for i in x:
    z.append(a * i**b)
xx = np.linspace(1, 6, 100)
yfit = a * np.power(xx, b)

print("Data points : ")
for i, j in zip(x, y):
    print(f'({i},{j})', end=' ')
print()
print("Fitted points : ")
for i, j in zip(x, z):
    print(f'({i},{j})')

plt.scatter(x, y, color='blue', label='Data Points')
plt.plot(xx, yfit, color='red', label='Fitted Curve')
plt.xlabel("X axis")
plt.ylabel("Y axis")
plt.title("Power Function Least Square Fitting")
plt.legend()
plt.grid()
plt.show()

Figure 2: Power Function
Theory of least square fitting of exponential function
By taking the logarithm of both sides of the exponential equation,
you can transform it into a linear equation that can be fitted using
linear regression. Here’s how you can do it:
• Start with the exponential equation: y = ae^(bx)
• Take the natural logarithm (loge) of both sides of the equation:
loge(y) = loge(ae^(bx)).
• Apply the properties of logarithms: loge(y) = loge(a) + b·x
• Now fit the curve in the same way as linear regression, Y = MX + C, where M and C are the
slope and intercept:
Y = loge(y), X = x, M = b, C = loge(a)
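The semi-log substitution above can be checked the same way: with noise-free data from a known a and b, a linear fit of loge(y) against x (not loge(x)) should recover both parameters. As before, np.polyfit stands in for the explicit slope/intercept formulas, and a = 1.5, b = 0.8 are illustrative values:

```python
import numpy as np

x = np.array([0.0, 1.0, 2.0, 3.0, 4.0])
a_true, b_true = 1.5, 0.8
y = a_true * np.exp(b_true * x)   # exact exponential data

# linearize: log y = log a + b*x   (note X = x here, unlike the power fit)
Y = np.log(y)
b_fit, loga = np.polyfit(x, Y, 1)  # slope = b, intercept = log a
a_fit = np.exp(loga)
print(a_fit, b_fit)  # ≈ 1.5, 0.8
```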
Code and output of least square fitting of exponential function
# considering the exponential equation to be y = a*e**(b*x)
# log(y) = log(a) + b*x
# comparing with the linear equation Y = mX + c :
# Y = log(y), X = x, m = b, c = log(a)
import math
import matplotlib.pyplot as plt
import numpy as np

x = np.array([i for i in range(1, 7)])
y = np.array([2, 5, 11, 29, 78, 94])
X = x
Y = np.log(y)
X_mean = np.mean(X)
Y_mean = np.mean(Y)
n = len(x)
sxy = np.sum(X*Y) - n*(X_mean*Y_mean)
sxx = np.sum(X*X) - n*(X_mean*X_mean)
# m = b
m = sxy/sxx
c = Y_mean - m*X_mean
a = math.exp(c)
print("Value of the slope = ", m)
print("Value of the intercept = ", c)
print()
print("The value of a in the function ", a)
b = m
print("The value of b in the function ", b)

z = []
for i in x:
    z.append(a * math.exp(b*i))
xx = np.linspace(1, 6, 100)
yfit = a * np.exp(b*xx)

print("Data points : ")
for i, j in zip(x, y):
    print(f'({i},{j})', end=' ')
print()
print("Fitted points : ")
for i, j in zip(x, z):
    print(f'({i},{j})')

plt.scatter(x, y, color='blue', label='Data Points')
plt.plot(xx, yfit, color='red', label='Fitted Curve')
plt.xlabel("X axis")
plt.ylabel("Y axis")
plt.title("Exponential Least Square Fitting")
plt.legend()
plt.grid()
plt.show()

Figure 3: Exponential Function