20M100225_KIRUTHICK_DA2 - Jupyter Notebook

In [144]: # DIGITAL ASSIGNMENT 2

In [145]: # QUESTION 3
          # Write a python code to classify the input digit as '4' or not. Use MNIST data set.
          # Also display the sample image and print the performance measures of SGD classifier.

In [146]: # Import the mnist dataset from the sklearn.datasets library
          from sklearn.datasets import fetch_openml
          mnist = fetch_openml("mnist_784", version=1)
          mnist.keys()

Out[146]: dict_keys(['data', 'target', 'frame', 'categories', 'feature_names', 'target_names', 'DESCR', 'details', 'url'])

In [147]: # Separating the data and target features and passing them into the x and y variables
          x, y = mnist["data"], mnist["target"]

In [148]: # Printing the shape of the x
          print(x.shape)

          (70000, 784)

In [149]: # Printing the shape of the y
          print(y.shape)

          (70000,)

In [152]: # Converting the string labels in y to integers
          import numpy as np
          y = y.astype(np.uint8)
          y

Out[152]: 0        5
          ...
          Name: class, Length: 70000, dtype: uint8

In [153]: # Splitting the data into train and test sets
          x_train, x_test, y_train, y_test = x[:60000].values, x[60000:].values, y[:60000], y[60000:]

In [154]: # Taking the values of y which consist of only 4 (True for 4, False otherwise)
          y_train_4 = (y_train == 4)
          y_test_4 = (y_test == 4)

In [155]: from sklearn.linear_model import SGDClassifier
          # Building the SGD classifier
          clf = SGDClassifier(random_state=42)
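The assignment also asks to display a sample image, but that cell did not survive the export. A minimal sketch of how such a cell could look, using scikit-learn's small built-in load_digits dataset (8x8 images) as a lightweight stand-in so the example runs without the full MNIST download:

```python
import matplotlib
matplotlib.use("Agg")  # non-interactive backend so the script runs headless
import matplotlib.pyplot as plt
from sklearn.datasets import load_digits

# load_digits stands in for MNIST here; with mnist_784, reshape one row of
# x to (28, 28) instead of using the pre-shaped .images attribute.
digits = load_digits()
image = digits.images[0]              # shape (8, 8)
plt.imshow(image, cmap="binary")
plt.title(f"label = {digits.target[0]}")
plt.savefig("sample_digit.png")
```

In a notebook, `plt.show()` (without the Agg backend) would render the digit inline instead of saving it to a file.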
In [156]: # Fitting the x_train and y_train_4 values into the SGDClassifier model
          clf.fit(x_train, y_train_4)

Out[156]: SGDClassifier(random_state=42)

In [157]: # Predicting the new values for the x_test values using the classifier model
          out = clf.predict(x_test)

In [158]: # Checking every row from the test data
          count = 0
          for i in range(len(x_test)):
              # Comparing the original values with the predicted values
              if clf.predict(x_test[i].reshape(1, -1)) == y_test_4.values[i]:
                  count += 1
          # Printing how many values have been matched
          print(count)

In [159]: from sklearn.model_selection import StratifiedKFold
          from sklearn.base import clone
          # Performing the cross fold validation
          skfolds = StratifiedKFold(n_splits=3)
          for train_index, test_index in skfolds.split(x_train, y_train_4):
              clone_clf = clone(clf)
              # For each validation, we divide the train and test dataset
              x_train_folds = x_train[train_index]
              y_train_folds = y_train_4[train_index]
              # For each validation, we assign the indices for each row
              x_test_fold = x_train[test_index]
              y_test_fold = y_train_4[test_index]
              # The model is cloned and fit on this fold
              clone_clf.fit(x_train_folds, y_train_folds)
              y_pred = clone_clf.predict(x_test_fold)
              # Finding the accuracy for each iteration
              n_correct = sum(y_pred == y_test_fold)
              print(n_correct / len(y_pred))

In [160]: from sklearn.model_selection import cross_val_score
          # Printing each validation score with the metric accuracy
          cross_val_score(clf, x_train, y_train_4, cv=5, scoring="accuracy")

Out[160]: array([0.97925, 0.97225, 0.96308333, 0.9725, 0.97241667])

In [161]: from sklearn.model_selection import cross_val_predict
          # Giving the cross-validated predictions to the y_train_pred variable
          y_train_pred = cross_val_predict(clf, x_train, y_train_4, cv=3)

In [162]: from sklearn.metrics import confusion_matrix
          # Printing the confusion matrix by comparing original and predicted values
          confusion_matrix(y_train_4, y_train_pred)

Out[162]: array([[52957,  1201],
                 [  554,  5288]], dtype=int64)

In [163]: from sklearn.metrics import precision_score, recall_score, f1_score

In [164]: print("PRECISION :")
          # Printing the precision of the model
          print(precision_score(y_train_4, y_train_pred))
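A word of caution on the ~97% cross-validation accuracy: only about 10% of MNIST digits are 4s, so a classifier that always answers "not 4" already scores around 90%. This is why the precision, recall, and F1 measures below matter more than raw accuracy. A quick sketch using a DummyClassifier on synthetic data with a similar class balance (the random features and labels here are illustrative, not the notebook's data):

```python
import numpy as np
from sklearn.dummy import DummyClassifier
from sklearn.model_selection import cross_val_score

# Synthetic features with ~10% positive labels, mimicking the "is it a 4?"
# class balance; with the real data, pass x_train and y_train_4 instead.
rng = np.random.default_rng(42)
X = rng.normal(size=(1000, 10))
y = rng.random(1000) < 0.1

never_4 = DummyClassifier(strategy="most_frequent")  # always predicts False
scores = cross_val_score(never_4, X, y, cv=3, scoring="accuracy")
print(scores)  # each fold scores close to 0.9 without learning anything
```

Precision and recall expose this immediately: the dummy model never predicts the positive class, so its recall is zero.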
In [165]: print("RECALL :")
          # Printing the recall of the model
          print(recall_score(y_train_4, y_train_pred))

In [166]: # Printing the F1 of the model
          print("F1 SCORE :")
          print(f1_score(y_train_4, y_train_pred))

In [167]: # QUESTION 2
          # Perform Logistic regression for iris dataset. Split the dataset as 80, 20 and use 20% for
          # testing and print the MSE. Show the difference in performance between Logistic and
          # Softmax regression.

In [168]: import pandas as pd
          from sklearn.datasets import load_iris
          from sklearn.model_selection import train_test_split
          from sklearn.linear_model import LogisticRegression
          from sklearn.metrics import mean_squared_error

In [169]: iris = load_iris()
          x = iris.data
          y = iris.target
          x = pd.DataFrame(x)
          x.head()
          y = pd.DataFrame(y)
          y.head()

In [170]: # Split the dataset into 80, 20
          x_train, x_test, y_train, y_test = train_test_split(x, y, test_size=0.2)

In [171]: # Logistic Regression
          logisticreg = LogisticRegression()
          logisticreg.fit(x_train, y_train)
          print("Accuracy: ", logisticreg.score(x_test, y_test)*100)

          Accuracy:  96.66666666666667

In [172]: logisticreg_y_pred = logisticreg.predict(x_test)
          mse_logreg = mean_squared_error(y_test, logisticreg_y_pred)
          print(mse_logreg)

In [173]: # Softmax
          softmax = LogisticRegression(multi_class='multinomial', solver='lbfgs')
          softmax.fit(x_train, y_train)
          softmax_y_pred = softmax.predict(x_test)
          print("Accuracy: ", softmax.score(x_test, y_test)*100)

          Accuracy:  93.33333333333333

In [174]: mse_softmax = mean_squared_error(y_test, softmax_y_pred)
          print(mse_softmax)

          0.03333333333333333

In [175]: # Logistic Regression gives better result than Softmax Regression
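One caveat on the comparison above: in older scikit-learn releases, plain LogisticRegression defaulted to one-vs-rest for multiclass problems, while multi_class='multinomial' switched on softmax; recent releases deprecate the multi_class argument and always use the multinomial (softmax) formulation, so the two models would be identical there. Also, because train_test_split is called without random_state, the split (and hence the accuracy and MSE) changes on every run. A small sketch showing that the softmax model's predict_proba rows form a proper probability distribution over the three iris classes (random_state=0 and max_iter=1000 are arbitrary choices for reproducibility and convergence):

```python
from sklearn.datasets import load_iris
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split

X, y = load_iris(return_X_y=True)
X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.2, random_state=0)

# Multinomial (softmax) logistic regression: one joint distribution over
# all classes, rather than three independent one-vs-rest scores.
clf = LogisticRegression(max_iter=1000).fit(X_train, y_train)
proba = clf.predict_proba(X_test)
print(proba.shape)     # (30, 3): one row of class probabilities per sample
print(proba[0].sum())  # each row sums to 1
```

Fixing the random_state in the assignment's own split would make the logistic-vs-softmax accuracy comparison reproducible rather than run-dependent.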
