The resultant decision tree is trained on tennis match data to predict whether a player will win or lose based on the weather outlook (sunny, overcast, or rainy). The tree predicts yes (the player will win) when the outlook is sunny, and no (the player will lose) when it is overcast or rainy. The tree is then trained on a training split and evaluated on a separate held-out test set, where it predicts the outcomes with an accuracy of 76%.
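The learned tree can be exercised directly as a nested dictionary. A minimal sketch, where the tree literal below is illustrative (it mirrors the rules stated in the text, not the program's actual printed output):

```python
# Hypothetical tree literal mirroring the rules stated above:
# sunny -> yes, overcast/rainy -> no.
tree = {'Outlook': {'sunny': 'yes', 'overcast': 'no', 'rainy': 'no'}}

def classify(instance, tree, default=None):
    # Walk the nested dict: each key is an attribute name, each value
    # is either a subtree (dict) or a leaf class label.
    attribute = next(iter(tree))
    value = instance.get(attribute)
    if value in tree[attribute]:
        result = tree[attribute][value]
        return classify(instance, result) if isinstance(result, dict) else result
    return default

print(classify({'Outlook': 'sunny'}, tree))   # -> yes
print(classify({'Outlook': 'rainy'}, tree))   # -> no
```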
ID3 algorithm: train on the tennis data set to build the decision tree, then apply this knowledge to classify a new sample.

import math
import pandas as pd
from collections import Counter

df_tennis = pd.read_csv('tennis.csv')

def entropy_of_probs(probs):
    # Shannon entropy, in bits, of a probability distribution.
    return sum([-prob * math.log(prob, 2) for prob in probs])

def entropy(a_list):
    # Entropy of a list of class labels.
    cnt = Counter(x for x in a_list)
    num_instances = len(a_list) * 1.0
    probs = [x / num_instances for x in cnt.values()]
    return entropy_of_probs(probs)

def information_gain(df, split_attribute_name, target_attribute_name):
    # Reduction in entropy of the target obtained by splitting
    # the data on the given attribute.
    nobs = len(df.index) * 1.0
    new_entropy = sum(
        entropy(subset[target_attribute_name]) * len(subset) / nobs
        for _, subset in df.groupby(split_attribute_name)
    )
    old_entropy = entropy(df[target_attribute_name])
    return old_entropy - new_entropy
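As a sanity check on the entropy helper: the full 14-row play-tennis data set has 9 "yes" and 5 "no" labels, so its entropy should come out near 0.940 bits. (The 9/5 counts here are the classic textbook figures, assumed rather than read from tennis.csv.)

```python
import math
from collections import Counter

def entropy(a_list):
    # Shannon entropy, in bits, of a list of class labels.
    cnt = Counter(a_list)
    n = float(len(a_list))
    return sum(-(c / n) * math.log(c / n, 2) for c in cnt.values())

labels = ['yes'] * 9 + ['no'] * 5   # assumed classic 9/5 class split
print(round(entropy(labels), 3))    # -> 0.94
```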
def id3(df, target_attribute_name, attribute_names, default_class=None):
    cnt = Counter(x for x in df[target_attribute_name])
    if len(cnt) == 1:
        # Pure subset: only one class label remains.
        return next(iter(cnt))
    elif df.empty or (not attribute_names):
        return default_class
    else:
        default_class = max(cnt.keys())
        # Choose the attribute with the highest information gain.
        gainz = [information_gain(df, attr, target_attribute_name)
                 for attr in attribute_names]
        index_of_max = gainz.index(max(gainz))
        best_attr = attribute_names[index_of_max]
        tree = {best_attr: {}}
        remaining_attribute_names = [i for i in attribute_names if i != best_attr]
        # Recurse on each subset produced by splitting on the best attribute.
        for attr_val, data_subset in df.groupby(best_attr):
            subtree = id3(data_subset, target_attribute_name,
                          remaining_attribute_names, default_class)
            tree[best_attr][attr_val] = subtree
        return tree
def classify(instance, tree, default=None):
    attribute = next(iter(tree))
    if instance[attribute] in tree[attribute].keys():
        result = tree[attribute][instance[attribute]]
        if isinstance(result, dict):
            # Subtree: keep descending.
            return classify(instance, result)
        else:
            # Leaf: class label.
            return result
    else:
        return default

attribute_names = list(df_tennis.columns)
attribute_names.remove('PlayTennis')   # drop the class label column
tree = id3(df_tennis, 'PlayTennis', attribute_names)
print("\n\nThe Resultant Decision Tree is:")
print(tree)

training_data = df_tennis.iloc[1:-4]   # all but the last four rows
test_data = df_tennis.iloc[-4:]        # last four rows held out for testing
train_tree = id3(training_data, 'PlayTennis', attribute_names)
print("\n\nThe Resultant Decision Tree (built from the training data) is:")
print(train_tree)

test_data['predicted'] = test_data.apply(classify, axis=1, args=(train_tree, 'No'))
print('\n\nAccuracy is: ' + str(sum(test_data['PlayTennis'] == test_data['predicted'])
                                / (1.0 * len(test_data.index))))
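The accuracy figure at the end is just the fraction of test rows whose predicted label matches the actual one. A minimal sketch of that computation, using made-up label lists rather than the real test split:

```python
# Hypothetical actual and predicted labels for four test rows.
actual    = ['yes', 'no', 'yes', 'yes']
predicted = ['yes', 'no', 'no',  'yes']

# Fraction of positions where the prediction matches the true label.
accuracy = sum(a == p for a, p in zip(actual, predicted)) / len(actual)
print(accuracy)   # -> 0.75
```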