PE - Lab 2.ipynb (Colab)
The document contains code implementations for various machine learning algorithms, including Linear Regression, Logistic Regression, Decision Trees, K-Means Clustering, Naive Bayes for spam classification, and a Neural Network for Fake News Detection. Each section includes data preparation, model training, evaluation, and visualization. The document also demonstrates the use of SQLite for data extraction and manipulation.
1. Linear Regression (Code)
import numpy as np
import matplotlib.pyplot as plt
from sklearn.model_selection import train_test_split
from sklearn.linear_model import LinearRegression
from sklearn.metrics import mean_squared_error, r2_score

# Generate synthetic data
np.random.seed(42)
X = 2 * np.random.rand(100, 1)
y = 4 + 3 * X + np.random.randn(100, 1)

# Split the data into training and testing sets
X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.2, random_state=42)

model = LinearRegression()
model.fit(X_train, y_train)
y_pred = model.predict(X_test)

# Print model parameters
print(f"Intercept: {model.intercept_[0]:.2f}")
print(f"Coefficient: {model.coef_[0][0]:.2f}")

# Evaluate the model
mse = mean_squared_error(y_test, y_pred)
r2 = r2_score(y_test, y_pred)
print(f"Mean Squared Error: {mse:.2f}")
print(f"R-squared: {r2:.2f}")

# Visualizing the results
plt.scatter(X_test, y_test, color='blue', label='Actual Data')
plt.plot(X_test, y_pred, color='red', linewidth=2, label='Regression Line')
plt.xlabel("X")
plt.ylabel("y")
plt.title("Linear Regression Implementation")
plt.legend()
plt.show()
[Figure: "Linear Regression Implementation" — blue scatter of actual test data with the red fitted regression line]
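As a sanity check on the learned parameters, the same least-squares fit can be computed in closed form via the normal equation theta = (X^T X)^(-1) X^T y, where X is augmented with a column of ones for the intercept. A minimal standalone sketch (not part of the original notebook; it regenerates the same synthetic data):

import numpy as np

np.random.seed(42)
X = 2 * np.random.rand(100, 1)
y = 4 + 3 * X + np.random.randn(100, 1)

# Augment X with a bias column of ones so the intercept is solved for jointly
X_b = np.c_[np.ones((100, 1)), X]

# Normal equation: theta = (X^T X)^(-1) X^T y
theta = np.linalg.inv(X_b.T @ X_b) @ X_b.T @ y
print("Intercept:", theta[0, 0])    # should be close to 4
print("Coefficient:", theta[1, 0])  # should be close to 3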
2. Logistic Regression (Code)
import numpy as np
import pandas as pd
import matplotlib.pyplot as plt
from sklearn.model_selection import train_test_split
from sklearn.preprocessing import StandardScaler
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import accuracy_score, classification_report, confusion_matrix
from sklearn.datasets import load_iris

# Load dataset
iris = load_iris()
df = pd.DataFrame(iris.data, columns=iris.feature_names)
df['target'] = iris.target

# Consider binary classification: classifying Iris-Versicolor (label 1) vs. others
binary_df = df[df['target'] != 2]  # Removing class 2
X = binary_df.iloc[:, :2].values   # Use only the first two features
y = binary_df['target'].values

# Splitting data into train and test sets
X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.2, random_state=42)

# Standardizing features
scaler = StandardScaler()
X_train = scaler.fit_transform(X_train)
X_test = scaler.transform(X_test)

# Logistic Regression model
model = LogisticRegression()
model.fit(X_train, y_train)
y_pred = model.predict(X_test)

accuracy = accuracy_score(y_test, y_pred)
conf_matrix = confusion_matrix(y_test, y_pred)
report = classification_report(y_test, y_pred)

print(f"Accuracy: {accuracy:.2f}")
print("Confusion Matrix:\n", conf_matrix)
print("Classification Report:\n", report)

# Plotting the decision boundary (only for the first two features)
def plot_decision_boundary(X, y, model):
    x_min, x_max = X[:, 0].min() - 1, X[:, 0].max() + 1
    y_min, y_max = X[:, 1].min() - 1, X[:, 1].max() + 1
    xx, yy = np.meshgrid(np.linspace(x_min, x_max, 100), np.linspace(y_min, y_max, 100))
    Z = model.predict(np.c_[xx.ravel(), yy.ravel()])
    Z = Z.reshape(xx.shape)
    plt.contourf(xx, yy, Z, alpha=0.3)
    plt.scatter(X[:, 0], X[:, 1], c=y, edgecolors='k', marker='o')
    plt.xlabel(iris.feature_names[0])
    plt.ylabel(iris.feature_names[1])
    plt.title("Decision Boundary of Logistic Regression")
    plt.show()

# Visualizing the decision boundary using the first two features
plot_decision_boundary(X_train, y_train, model)
Output:
Accuracy: 1.00
Confusion Matrix: diagonal (every test sample correctly classified).
Classification Report: precision, recall, and F1-score of 1.00 for both classes, with macro and weighted averages of 1.00.
[Figure: "Decision Boundary of Logistic Regression" — filled contour regions over the two standardized sepal features, with sepal width (cm) on the y-axis]
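The filled contours above come from thresholding the model's class probabilities at 0.5; the boundary itself is the set of points where the linear score w·x + b is zero, since the sigmoid maps 0 to 0.5. A short sketch for inspecting this directly (it assumes the model and X_test fitted above are still in scope):

# Probability of each class for a few standardized test points
print(model.predict_proba(X_test[:5]))

# Linear scores: the sign determines the predicted class,
# and the boundary in the plot is where this score equals 0
print(model.decision_function(X_test[:5]))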
3. Extract Data from Database Using Python (Code)
import sqlite3

# Create an in-memory SQLite database
conn = sqlite3.connect(":memory:")  # Temporary database in RAM
cursor = conn.cursor()

cursor.execute("""CREATE TABLE students (
    name TEXT,
    age INTEGER,
    department TEXT,
    marks INTEGER)""")

# Insert at least 5 student records
students_data = [
    ("Alice", 22, "Computer Science", 85),
    ("Bob", 24, "Mechanical", 78),
    ("Charlie", 23, "Electrical", 90),
    ("David", 21, "Civil", 82),
    ("Emma", 22, "Electronics", 88),
]
cursor.executemany("INSERT INTO students (name, age, department, marks) VALUES (?, ?, ?, ?)", students_data)

# Commit changes
conn.commit()

# Extract data from the table
cursor.execute("SELECT * FROM students")
rows = cursor.fetchall()

# Print extracted data
print("Extracted Student Data:")
print(f"{'Name':<10} | {'Age':<3} | {'Department':<16} | {'Marks':<5}")
print("-" * 45)
for row in rows:
    print(f"{row[0]:<10} | {row[1]:<3} | {row[2]:<16} | {row[3]:<5}")

cursor.close()
conn.close()
Output:
Extracted Student Data:
Name       | Age | Department       | Marks
---------------------------------------------
Alice      | 22  | Computer Science | 85
Bob        | 24  | Mechanical       | 78
Charlie    | 23  | Electrical       | 90
David      | 21  | Civil            | 82
Emma       | 22  | Electronics      | 88
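Query results can also be loaded straight into pandas, which suits the analysis sections of this lab. A minimal sketch using a parameterized query (the threshold value 80 is illustrative; the table is rebuilt here so the snippet is self-contained):

import sqlite3
import pandas as pd

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE students (name TEXT, age INTEGER, department TEXT, marks INTEGER)")
conn.execute("INSERT INTO students VALUES ('Alice', 22, 'Computer Science', 85)")
conn.commit()

# '?' placeholders let SQLite handle quoting and avoid SQL injection
df = pd.read_sql_query("SELECT * FROM students WHERE marks >= ?", conn, params=(80,))
print(df)
conn.close()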
4. Decision Tree Algorithm Implementation (Code)
import numpy as np
import pandas as pd
import matplotlib.pyplot as plt
from sklearn import datasets, tree
from sklearn.model_selection import train_test_split
from sklearn.tree import DecisionTreeClassifier
from sklearn.metrics import accuracy_score

# Load the dataset
iris = datasets.load_iris()
X = iris.data
y = iris.target

# Splitting the dataset into training and testing sets
X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.2, random_state=42)

# Create and train the Decision Tree model
clf = DecisionTreeClassifier(criterion='gini', max_depth=3, random_state=42)
clf.fit(X_train, y_train)

# Make predictions
y_pred = clf.predict(X_test)
accuracy = accuracy_score(y_test, y_pred)
print(f"Accuracy: {accuracy:.2f}")

# Visualize the Decision Tree
plt.figure(figsize=(10, 6))
tree.plot_tree(clf, feature_names=iris.feature_names, class_names=iris.target_names, filled=True)
plt.show()
[Figure: plot of the fitted decision tree; the root node splits on petal length (cm) <= 2.45 with gini = 0.667, samples = 120, value = [40, 41, 39], and the right branch splits again on petal length (cm) <= 4.75 with gini = 0.5, samples = 80, value = [0, 41, 39], class = versicolor]
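When a rendered figure is inconvenient, the same fitted tree can be dumped as indented text rules. A one-line sketch, assuming the clf and iris objects from the code above:

from sklearn import tree

# Text rendering of the fitted tree: one line per split or leaf
print(tree.export_text(clf, feature_names=list(iris.feature_names)))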
5. K-Means Clustering Algorithm (Code)
import numpy as np
import matplotlib.pyplot as plt
from sklearn.cluster import KMeans
from sklearn.datasets import make_blobs

# Generate synthetic data
X, y = make_blobs(n_samples=300, centers=4, cluster_std=0.60, random_state=0)

# Applying the K-Means algorithm
kmeans = KMeans(n_clusters=4, random_state=0)
kmeans.fit(X)
y_kmeans = kmeans.predict(X)

plt.scatter(X[:, 0], X[:, 1], c=y_kmeans, cmap='viridis', marker='o', edgecolor='k')
centers = kmeans.cluster_centers_
plt.scatter(centers[:, 0], centers[:, 1], c='red', s=200, alpha=0.75, marker='X', label='Centroids')
plt.title("K-Means Clustering")
plt.xlabel("Feature 1")
plt.ylabel("Feature 2")
plt.legend()
plt.show()
[Figure: "K-Means Clustering" — scatter plot of the clustered points colored by assignment, centroids marked with X, axes "Feature 1" vs. "Feature 2"]
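Here n_clusters=4 matches the number of centers passed to make_blobs, but in practice k is unknown; a common heuristic is the elbow method, which plots KMeans inertia (within-cluster sum of squared distances) against k and looks for the bend. A sketch over the same X (not part of the original notebook):

import matplotlib.pyplot as plt
from sklearn.cluster import KMeans

inertias = []
ks = range(1, 10)
for k in ks:
    km = KMeans(n_clusters=k, random_state=0).fit(X)
    inertias.append(km.inertia_)  # within-cluster sum of squared distances

plt.plot(ks, inertias, marker='o')
plt.xlabel("k (number of clusters)")
plt.ylabel("Inertia")
plt.title("Elbow Method")
plt.show()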
6. Neural Network for Fake News Detection (Code)
!pip install scikit-learn pandas numpy

import numpy as np
import pandas as pd
from sklearn.model_selection import train_test_split
from sklearn.datasets import fetch_20newsgroups
from sklearn.preprocessing import LabelEncoder
from tensorflow.keras.models import Sequential
from tensorflow.keras.layers import Dense, LSTM, Embedding, SpatialDropout1D
from tensorflow.keras.preprocessing.text import Tokenizer
from tensorflow.keras.preprocessing.sequence import pad_sequences
from tensorflow.keras.utils import to_categorical

# Step 1: Load the 20 Newsgroups dataset
newsgroups = fetch_20newsgroups(subset='all')

# Step 2: Preprocess the data
X = newsgroups.data    # News article text
y = newsgroups.target  # Labels (20 categories)

# Convert labels to categorical (one-hot encoding)
y = to_categorical(y)

# Tokenization and padding
max_words = 5000  # Maximum number of words to consider
max_len = 200     # Maximum length of each text
tokenizer = Tokenizer(num_words=max_words, lower=True)
tokenizer.fit_on_texts(X)
X = tokenizer.texts_to_sequences(X)
X = pad_sequences(X, maxlen=max_len)

# Step 3: Split the dataset into training and testing sets
X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.2, random_state=42)

# Step 4: Build the neural network model
model = Sequential()
model.add(Embedding(input_dim=max_words, output_dim=128, input_length=max_len))
model.add(SpatialDropout1D(0.2))
model.add(LSTM(100, dropout=0.2, recurrent_dropout=0.2))
model.add(Dense(20, activation='softmax'))  # 20 categories in the dataset

# Step 5: Compile the model
model.compile(loss='categorical_crossentropy', optimizer='adam', metrics=['accuracy'])

# Step 6: Train the model
history = model.fit(X_train, y_train, epochs=5, batch_size=64, validation_data=(X_test, y_test))

# Step 7: Evaluate the model
accuracy = model.evaluate(X_test, y_test)
print(f"Test Accuracy: {accuracy[1]*100:.2f}%")

# Step 8: Predicting new data
sample_text = ["This is a sample news article about technology."]
sample_seq = tokenizer.texts_to_sequences(sample_text)
sample_pad = pad_sequences(sample_seq, maxlen=max_len)
prediction = model.predict(sample_pad)
prediction_class = np.argmax(prediction, axis=1)

# Decode the category index into the category name
categories = newsgroups.target_names
print(f"The news belongs to the category: {categories[prediction_class[0]]}")
Output (abridged):
pip reports that scikit-learn, pandas, numpy, and their dependencies are already satisfied.
A Keras UserWarning notes that the Embedding argument input_length is deprecated.
Training runs for 5 epochs; validation accuracy climbs to about 0.58 (validation loss about 1.2) by epoch 5.
Test Accuracy: 56.16%
The news belongs to the category: talk.religion.misc
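Step 8 generalizes to any input string once the fitted tokenizer and padding length are reused; a small helper sketch (the function name predict_category is illustrative, and it assumes the model, tokenizer, max_len, and newsgroups objects from the code above):

import numpy as np
from tensorflow.keras.preprocessing.sequence import pad_sequences

def predict_category(text):
    # Reuse the fitted tokenizer and the training-time padding length
    seq = tokenizer.texts_to_sequences([text])
    pad = pad_sequences(seq, maxlen=max_len)
    probs = model.predict(pad)
    return newsgroups.target_names[int(np.argmax(probs, axis=1)[0])]

print(predict_category("This is a sample news article about technology."))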
7. Naive Bayes for Spam Classification (Code)
import pandas as pd
import numpy as np
from sklearn.model_selection import train_test_split
from sklearn.feature_extraction.text import CountVectorizer
from sklearn.naive_bayes import MultinomialNB
from sklearn.metrics import accuracy_score, classification_report

# Sample data: text messages (spam or ham)
data = {
    "text": [
        "Let's meet for lunch", "Congratulations, you won!",
        "Get a free vacation now", "Do you want to hang out?",
    ],
    "label": ['ham', 'spam', 'spam', 'ham']
}

# Create DataFrame
df = pd.DataFrame(data)

# Preprocess the data: convert text labels to binary
df['label'] = df['label'].map({'spam': 1, 'ham': 0})

# Split data into train and test sets
X_train, X_test, y_train, y_test = train_test_split(df['text'], df['label'], test_size=0.3, random_state=42)

# Convert text data into numerical form using CountVectorizer
vectorizer = CountVectorizer()
X_train_vec = vectorizer.fit_transform(X_train)
X_test_vec = vectorizer.transform(X_test)

# Initialize the Naive Bayes model and fit it to the training data
nb_model = MultinomialNB()
nb_model.fit(X_train_vec, y_train)
y_pred = nb_model.predict(X_test_vec)

# Evaluate the model's performance
print("Accuracy:", accuracy_score(y_test, y_pred))
print("Classification Report:\n", classification_report(y_test, y_pred))
=" Classification Report:
precision recall. fl-score support
stro oe o.s0 2
weighted ave 238 2
‘fusr/local/146/python3.21/6ist-packages/stlearn/aetrics/_classification.py:1565: UndeFinedetriciarning:
sarn_prfaverege, modifier, © (netric.capitalize()) 1s, Len(result))
‘sar/ local? Lib/oythond14/dist-packages/sklearn/eetrics/ classification py:1565: Undefinedetriclarning: Precision 4s {ll-defined and be
“nacn_pef(averoge, nodifier, f{oetric.capitalize()} is", Len(result))
‘sar/local/Lib/pythor3.14/dist-packagee/sklearn/netrice/ claceitleation.py:1565; UndeFinedletriclarning: Precision 4s ill-defined and be
“sarn.prf average, modifier, #(netric.copitalize()) is", Len(result))
recislon 1s -defined and be
-ntps:Ieolab research google comidivel4zwdnlgyy7XpOYHL4TOCGdcCvHaNPjascroTo=o0ZqKb3-psb& prntMode=te 1
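To classify new messages, they must pass through the same fitted CountVectorizer before prediction. A short sketch, assuming the vectorizer and nb_model from the code above (the example messages are illustrative):

new_msgs = ["Free vacation, claim it now", "Lunch tomorrow?"]
new_vec = vectorizer.transform(new_msgs)  # transform only; never refit on new data
preds = nb_model.predict(new_vec)
print(["spam" if p == 1 else "ham" for p in preds])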