BDU10603
MARCH 2022
Minutes of meeting
Date: 2nd June 2022
Time: 2:00 p.m. – 4:30 p.m.
Location: UTHM Library
Purpose of meeting: To discuss the project
Goals: To understand the project
Actions: List the requirements of the question
Submitted by:
Role of members:
All members contributed actively to their respective tasks.
Wan Luqman
• Create the GUI
• Link the code to the GUI
Thiuyah
• Write the program code
• Write the project report
• Prepare the presentation slides
Haikal
• Create the GUI
• Fix errors in the code
• Merge the code with the GUI
Introduction:
This project covers Gaussian elimination and polynomial regression. In mathematics, Gaussian
elimination, also known as row reduction, is an algorithm for solving systems of linear equations.
It consists of a sequence of operations performed on the corresponding matrix of coefficients.
This method can also be used to compute the rank of a matrix, the determinant of a square
matrix, and the inverse of an invertible matrix. The method is named after Carl Friedrich
Gauss (1777–1855), although some special cases of the method, albeit presented without
proof, were known to Chinese mathematicians as early as circa 179 CE.
To perform row reduction on a matrix, one uses a sequence of elementary row operations to
modify the matrix until the lower left-hand corner of the matrix is filled with zeros, as much as
possible. There are three types of elementary row operations:
• Swapping two rows,
• Multiplying a row by a nonzero number,
• Adding a multiple of one row to another row. (subtraction can be achieved by multiplying
one row with -1 and adding the result to another row)
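As a small illustration, the three operations can be written directly in plain Python on a matrix stored as a list of rows (the example values below are arbitrary, not taken from the project):

```python
# The three elementary row operations on a matrix stored as a list of rows.
A = [[2.0, 1.0, -1.0],
     [-3.0, -1.0, 2.0],
     [-2.0, 1.0, 2.0]]

# 1. Swap two rows (here rows 0 and 1).
A[0], A[1] = A[1], A[0]

# 2. Multiply a row by a nonzero number (scale row 0 by -1/3).
A[0] = [(-1.0 / 3.0) * v for v in A[0]]

# 3. Add a multiple of one row to another row
#    (subtracting 2 * row 0 from row 1 is the same as adding -2 * row 0).
A[1] = [a + (-2.0) * b for a, b in zip(A[1], A[0])]

print(A)
```

After these three steps the first column of rows 0 and 1 already looks like the start of an upper triangular matrix: a leading 1 in row 0 and a 0 below it.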
Using these operations, a matrix can always be transformed into an upper triangular matrix, and in
fact one that is in row echelon form. Once all of the leading coefficients (the leftmost nonzero
entry in each row) are 1, and every column containing a leading coefficient has zeros elsewhere,
the matrix is said to be in reduced row echelon form. This final form is unique; in other words, it
is independent of the sequence of row operations used.
Using row operations to convert a matrix into reduced row echelon form is sometimes
called Gauss–Jordan elimination. In this case, the term Gaussian elimination refers to the process
until it has reached its upper triangular, or (unreduced) row echelon form. For computational
reasons, when solving systems of linear equations, it is sometimes preferable to stop row
operations before the matrix is completely reduced.
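As a sketch, the full process (forward elimination to row echelon form, then back substitution) might look like the following in plain Python; the function name and the partial-pivoting step are our additions, not taken from the project's code:

```python
def gauss_eliminate(a, b):
    """Solve a x = b by Gaussian elimination with back substitution.

    a is an n x n list of lists, b a list of length n.
    Returns the solution vector x.
    """
    n = len(b)
    # Build the augmented matrix [a | b], copying so a is not modified.
    m = [row[:] + [b[i]] for i, row in enumerate(a)]

    # Forward elimination: zero out the entries below each pivot.
    for i in range(n):
        # Partial pivoting: swap in the row with the largest pivot entry,
        # which avoids dividing by zero (or by a tiny number).
        p = max(range(i, n), key=lambda r: abs(m[r][i]))
        m[i], m[p] = m[p], m[i]
        for k in range(i + 1, n):
            factor = m[k][i] / m[i][i]
            for j in range(i, n + 1):
                m[k][j] -= factor * m[i][j]

    # Back substitution on the resulting upper triangular system.
    x = [0.0] * n
    for i in range(n - 1, -1, -1):
        s = sum(m[i][j] * x[j] for j in range(i + 1, n))
        x[i] = (m[i][n] - s) / m[i][i]
    return x
```

For example, the system 2x + y − z = 8, −3x − y + 2z = −11, −2x + y + 2z = −3 gives `gauss_eliminate([[2, 1, -1], [-3, -1, 2], [-2, 1, 2]], [8, -11, -3])`, whose result is approximately [2.0, 3.0, -1.0].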
In statistics, polynomial regression is a form of regression analysis in which the relationship
between the independent variable x and the dependent variable y is modelled as an nth-degree
polynomial in x. Polynomial regression fits a nonlinear relationship between the value of x and the
corresponding conditional mean of y, denoted E(y | x). Although polynomial regression fits a
nonlinear model to the data, as a statistical estimation problem it is linear, in the sense that the
regression function E(y | x) is linear in the unknown parameters that are estimated from the data.
For this reason, polynomial regression is considered to be a special case of multiple linear
regression.
The explanatory (independent) variables resulting from the polynomial expansion of the "baseline"
variables are known as higher-degree terms. Such variables are also used in classification settings.
A quadratic equation is a second-degree polynomial equation; in general, the degree can be any
value n. Polynomial regression can therefore be categorized by degree:
1. Linear – degree 1
2. Quadratic – degree 2
3. Cubic – degree 3, and so on for higher degrees.
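Because the regression function is linear in the unknown coefficients, fitting a degree-n polynomial reduces to solving the linear normal equations (XᵀX)c = Xᵀy for the Vandermonde matrix X, and that linear system can itself be solved by Gaussian elimination, which connects the two halves of the project. A minimal sketch in plain Python (the function name is ours):

```python
def polyfit(xs, ys, degree):
    """Least-squares polynomial fit of the given degree.

    Builds the normal equations (X^T X) c = X^T y for the Vandermonde
    matrix X and solves them by Gaussian elimination, returning the
    coefficients [c0, c1, ..., c_degree] of c0 + c1*x + c2*x^2 + ... .
    """
    n = degree + 1
    # Entries of X^T X and X^T y written out as power sums.
    A = [[sum(x ** (i + j) for x in xs) for j in range(n)] for i in range(n)]
    b = [sum(y * x ** i for x, y in zip(xs, ys)) for i in range(n)]

    # Solve A c = b by Gaussian elimination with back substitution.
    m = [row[:] + [b[i]] for i, row in enumerate(A)]
    for i in range(n):
        p = max(range(i, n), key=lambda r: abs(m[r][i]))
        m[i], m[p] = m[p], m[i]
        for k in range(i + 1, n):
            f = m[k][i] / m[i][i]
            for j in range(i, n + 1):
                m[k][j] -= f * m[i][j]
    c = [0.0] * n
    for i in range(n - 1, -1, -1):
        s = sum(m[i][j] * c[j] for j in range(i + 1, n))
        c[i] = (m[i][n] - s) / m[i][i]
    return c
```

For the points (0, 1), (1, 6), (2, 17), (3, 34), (4, 57), which lie exactly on y = 1 + 2x + 3x², a degree-2 fit recovers coefficients close to [1, 2, 3].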
Problem Statement:
While programming, we encountered errors such as indentation errors, unexpected blocks, and
syntax errors. Whenever an error occurred, we fixed it using different approaches found online or
in the study materials. The CSV file could not be imported into the Python code, so to overcome
this we entered the file's contents manually as arrays. In the GUI, the coefficient values could not
be displayed, the graph could not be plotted, and the file name could not be saved. We kept trying
until the graph could be plotted and the results shown in the GUI.
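As one alternative to typing the values into arrays by hand, Python's built-in csv module can read the file without any third-party packages. The file name data.csv and the two-column layout below are hypothetical; the sketch writes a small sample file first so it is self-contained:

```python
import csv

# Write a small sample file so the example is self-contained; in the
# project the file name (here "data.csv") would come from the GUI.
with open("data.csv", "w", newline="") as f:
    f.write("x,y\n1,2.1\n2,3.9\n3,6.2\n")

# Read the file back into two plain lists, skipping the header row,
# instead of entering the values manually as arrays.
xs, ys = [], []
with open("data.csv", newline="") as f:
    reader = csv.reader(f)
    next(reader)  # skip the "x,y" header line
    for row in reader:
        xs.append(float(row[0]))
        ys.append(float(row[1]))

print(xs, ys)
```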
Programming code:
[Code screenshots: Figures 1.3(a)–1.3(e)]
Flowcharts: Gauss elimination
Start
For i = 0 to n-1                     (pivot row)
    For k = i+1 to n-1               (row being eliminated)
        factor = a[k][i] / a[i][i]
        For j = i to n
            a[k][j] = a[k][j] - factor * a[i][j]
        End for j
    End for k
End for i
For i = n-1 down to 0                (back substitution)
    x[i] = (a[i][n] - sum of a[i][j] * x[j] for j > i) / a[i][i]
    Print x[i]
End for i
Stop
Results:
https://drive.google.com/drive/folders/1rNyC58IvQxiQRGTRrEvSKyZHMmw9sHC6?usp=sharing