
FAKULTI KEJURUTERAAN MEKANIKAL DAN PEMBUATAN

ACADEMIC SESSION: 2021/2022

BDU10603

TS. DR. MOHD FADHLI BIN ZULKAFLI


COMPUTER PROGRAMMING: PROJECT

NO STUDENT’S NAME MATRIC NUMBER

1 MUHAMMAD HAIKAL HARITH BIN HAMIZAKI AD210052

2 WAN MUHAMMAD LUQMAN BIN WAN MOHD ADIB AD210070

3 THIUYAH RAJANDRAN CD210187

BACHELOR’S DEGREE WITH HONOURS

FACULTY OF MECHANICAL AND MANUFACTURING ENGINEERING

UNIVERSITI TUN HUSSEIN ONN MALAYSIA

MARCH 2022
Minutes of meeting
Date: 2nd June 2022
Time: 2:00 p.m. – 4:30 p.m.
Location: Library UTHM
Purpose of meeting: To discuss the project
Goals: To understand the project
Actions: Listed the requirements of the question

Date: 20th June 2022

Time: 5:00 p.m. – 8:30 p.m.
Location: Cafe Kolej Tun Fatimah
Purpose of meeting: To discuss problems faced while creating the PyQt5 GUI
Goals: To come up with a better GUI platform
Actions: Did research on Google and revised the notes given during classes.

Submitted by:
Role of members:
All members were active in their respective areas.

Wan Luqman

• Create GUI
• Link code to GUI

Thiuyah

• Provide code
• Write the project report
• Prepare the slide presentation

Haikal

• Create GUI
• Fix errors in the code
• Merge the code with the GUI

Introduction:
This project is about gauss elimination and polynomial regression. In mathematics, Gaussian
elimination, also known as row reduction, is an algorithm for solving systems of linear equations.
It consists of a sequence of operations performed on the corresponding matrix of coefficients.
This method can also be used to compute the rank of a matrix, the determinant of a square
matrix, and the inverse of an invertible matrix. The method is named after Carl Friedrich
Gauss (1777–1855), although some special cases of the method, albeit presented without
proof, were known to Chinese mathematicians as early as circa 179 CE.
To perform row reduction on a matrix, one uses a sequence of elementary row operations to
modify the matrix until the lower left-hand corner of the matrix is filled with zeros, as much as
possible. There are three types of elementary row operations:
• Swapping two rows,
• Multiplying a row by a nonzero number,
• Adding a multiple of one row to another row (subtraction can be achieved by multiplying
one row by −1 and adding the result to another row).
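As a quick illustration, here is a minimal NumPy sketch of the three elementary row operations (the matrix and the multipliers are arbitrary examples, not values from the project):

import numpy as np

A = np.array([[1.0, 2.0, 3.0],
              [4.0, 5.0, 6.0]])

A[[0, 1]] = A[[1, 0]]     # swap two rows
A[0] = 2.0 * A[0]         # multiply a row by a nonzero number
A[1] = A[1] - 3.0 * A[0]  # add a multiple of one row to another row
print(A)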
Using these operations, a matrix can always be transformed into an upper triangular matrix, and in
fact one that is in row echelon form. Once all of the leading coefficients (the leftmost nonzero
entry in each row) are 1, and every column containing a leading coefficient has zeros elsewhere,
the matrix is said to be in reduced row echelon form. This final form is unique; in other words, it
is independent of the sequence of row operations used.

Using row operations to convert a matrix into reduced row echelon form is sometimes
called Gauss–Jordan elimination. In this case, the term Gaussian elimination refers to the process
until it has reached its upper triangular, or (unreduced) row echelon form. For computational
reasons, when solving systems of linear equations, it is sometimes preferable to stop row
operations before the matrix is completely reduced.
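As a small check (assuming SymPy is available; it is not part of the project's code), the unique reduced row echelon form of an augmented matrix can be computed with Gauss–Jordan elimination:

from sympy import Matrix

# Augmented matrix [A | b] for an example 3x3 system.
M = Matrix([[ 2,  1, -1,   8],
            [-3, -1,  2, -11],
            [-2,  1,  2,  -3]])

rref_matrix, pivot_columns = M.rref()  # Gauss-Jordan elimination
print(rref_matrix)    # the unique reduced row echelon form
print(pivot_columns)  # columns that contain a leading 1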
In statistics, polynomial regression is a form of regression analysis in which the relationship
between the independent variable x and the dependent variable y is modelled as an nth degree
polynomial in x. Polynomial regression fits a nonlinear relationship between the value of x and the
corresponding conditional mean of y, denoted E(y | x). Although polynomial regression fits a
nonlinear model to the data, as a statistical estimation problem it is linear, in the sense that the
regression function E(y | x) is linear in the unknown parameters that are estimated from the data.
For this reason, polynomial regression is considered to be a special case of multiple linear
regression.

The explanatory (independent) variables resulting from the polynomial expansion of the "baseline"
variables are known as higher-degree terms. Such variables are also used in classification settings.
A quadratic equation is the general term for a second-degree polynomial equation; the degree,
however, can go up to any nth value. Polynomial regression can therefore be categorized as follows:

1. Linear – if the degree is 1

2. Quadratic – if the degree is 2

3. Cubic – if the degree is 3, and so on, on the basis of the degree.

Figure 1.1: Polynomial regression categorization by order of polynomial degree


Assumptions of Polynomial Regression
We cannot feed every dataset to polynomial regression and expect a better judgment; the dataset
should satisfy specific constraints in order to get the best polynomial regression results:
• The behaviour of the dependent variable can be described by a linear, curvilinear, or additive
relationship between the dependent variable and a set of k independent variables.
• The independent variables have no relationship with one another.
• The errors are independent, normally distributed with a mean of zero, and have a constant
variance.

Simple math to understand Polynomial Regression

Here we are dealing with the mathematics; rather than going deep, it is enough to understand the
basic structure. The equation of a linear model is a straight line, and if we have many features we
opt for multiple regression, which only increases the number of feature terms. Polynomial
regression is not about increasing the number of features but about changing the structure of the
equation to a quadratic (or higher-degree) one, which can be understood visually from the diagram.
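In equation form (standard regression notation; the symbols β and ε are the usual conventions, not taken verbatim from the report):

\begin{aligned}
\text{Linear:}\quad & y = \beta_0 + \beta_1 x + \varepsilon \\
\text{Multiple linear:}\quad & y = \beta_0 + \beta_1 x_1 + \beta_2 x_2 + \dots + \beta_k x_k + \varepsilon \\
\text{Polynomial (degree } n\text{):}\quad & y = \beta_0 + \beta_1 x + \beta_2 x^2 + \dots + \beta_n x^n + \varepsilon
\end{aligned}

The polynomial model is still linear in the parameters β, which is why it can be treated as a special case of multiple linear regression.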
Figure 1.2: Polynomial Regression

Problem Statement:
While programming, we faced errors in the code such as indentation errors, unexpected blocks,
syntax errors, and more. Whenever there was an error, we fixed it in different ways found online or
according to the study materials. The CSV file could not be imported into the Python code, so to
overcome this we used a manual way of entering the CSV file's contents as arrays. In the GUI, the
values of the coefficients could not be shown, the graph could not be plotted, and the file name
could not be saved. Each time, we kept trying until the results could be plotted and shown in the GUI.
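For reference, a minimal sketch of reading a CSV file into arrays with Python's built-in csv module (the file name data.csv and the two-column x, y layout are assumptions, not the project's actual file):

import csv

x, y = [], []
with open("data.csv", newline="") as f:  # hypothetical file name
    for row in csv.reader(f):
        x.append(float(row[0]))          # first column: x values
        y.append(float(row[1]))          # second column: y values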
Programming code:

Figures 1.3(a)–1.3(e): Program code
Flowchart: Gauss elimination

Start

For i = 0 to n
    For j = 0 to n + 1
        Print a[i][j] and b[i]
    End for j
End for i

For k = 0 to n
    (forward elimination step for pivot k)
End for k

For i = 0 to n − 1
    Print x[i]
End for i

Stop
Results:

Figure 1.4: Result shown on the GUI


Discussion:

Figure 1.5: Gauss elimination method


Figure 1.5 shows the method of summing the matrix entries in powers of m, multiplied by the
coefficients and set equal to the right-hand-side vector. To calculate the matrix sum, the formula
sum_ax += a[i,j] * x[j] is used. Before that, the inverse of matrix A has to be calculated, so we
implemented the A^(-1) formula. We then solve the system with the method np.linalg.solve(a, b).
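A minimal sketch of how such a system can be assembled for a polynomial fit and passed to np.linalg.solve (the exact matrix layout of Figure 1.5 is an assumption, and the name poly_fit is hypothetical):

import numpy as np

def poly_fit(x, y, degree):
    # Normal equations of a least-squares polynomial fit:
    # A[i][j] = sum(x^(i+j)) and b[i] = sum(y * x^i).
    n = degree + 1
    A = np.zeros((n, n))
    b = np.zeros(n)
    for i in range(n):
        for j in range(n):
            A[i, j] = np.sum(x ** (i + j))
        b[i] = np.sum(y * x ** i)
    # Solve A @ coeffs = b directly instead of forming A^(-1);
    # np.linalg.solve is faster and more numerically stable.
    return np.linalg.solve(A, b)

coef = poly_fit(np.array([0.0, 1.0, 2.0, 3.0]),
                np.array([1.0, 3.0, 7.0, 13.0]), 2)
print(coef)  # approximately [1. 1. 1.] for y = 1 + x + x^2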

Figure 1.6: Forward Substitution and Factorization method

Figure 1.7: Backward Substitution method


The factorization method is used for the forward and back substitution steps.
In the Python code:
factor = a[k,k]/a[i,k]
a[i,j] = a[k,j] - a[i,j]*factor and b[i] = b[k] - b[i]*factor.
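Putting the report's update rule together with back substitution, a minimal self-contained sketch might look as follows (no pivoting is done; it assumes a[i,k] and the diagonal entries are nonzero, so this is an illustration rather than a production implementation):

import numpy as np

def gauss_solve(a, b):
    a = a.astype(float).copy()
    b = b.astype(float).copy()
    n = len(b)
    # Forward elimination with the report's rule:
    # factor = a[k,k]/a[i,k]; row i becomes (row k - factor * row i),
    # which zeroes the entry a[i,k].
    for k in range(n - 1):
        for i in range(k + 1, n):
            if a[i, k] == 0.0:
                continue  # entry is already zero
            factor = a[k, k] / a[i, k]
            a[i, k:] = a[k, k:] - a[i, k:] * factor
            b[i] = b[k] - b[i] * factor
    # Back substitution on the upper-triangular system.
    x = np.zeros(n)
    for i in range(n - 1, -1, -1):
        x[i] = (b[i] - a[i, i + 1:] @ x[i + 1:]) / a[i, i]
    return x

a = np.array([[2.0, 1.0, -1.0],
              [-3.0, -1.0, 2.0],
              [-2.0, 1.0, 2.0]])
b = np.array([8.0, -11.0, -3.0])
print(gauss_solve(a, b))  # approximately [ 2.  3. -1.]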
Figure 1.8: Final form of the matrix after the Gauss elimination method.
Conclusion:
In a nutshell, the Gauss elimination method uses two substitution steps to obtain the coefficients x.
Polynomial regressions of different degrees can be plotted; in the GUI there is a field for deciding
the degree of the polynomial to be plotted. The coefficients obtained then have to be multiplied
with the polynomial terms.
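For example, multiplying the coefficients with the polynomial terms to evaluate the fitted curve could look like this (coef ordered a0, a1, ..., an is an assumption about the project's convention):

import numpy as np

coef = np.array([1.0, 1.0, 1.0])  # example coefficients: y = 1 + x + x^2
xs = np.linspace(0.0, 3.0, 50)
ys = sum(c * xs ** k for k, c in enumerate(coef))  # evaluate the polynomial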

Link for the exe file:

https://drive.google.com/drive/folders/1rNyC58IvQxiQRGTRrEvSKyZHMmw9sHC6?usp=sharing
