1. Problem Description
Using the petal and sepal sizes, a neural network can classify which species an Iris flower belongs to.
This is one of the standard benchmarks used to show how neural networks (and other techniques)
can be applied to classification. The neural network is trained with 150 examples of three species of Iris.
Two of the species are not linearly separable, so there is no simple rule for classifying the flowers. After
proper training, the network is capable of classifying the flowers with 100% accuracy.
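The setup described above can be sketched in a few lines. The original text does not name the tool used, so the following assumes Python with scikit-learn's MLPClassifier as a stand-in for the trained network; the hidden-layer size and train/test split are illustrative choices, not taken from the source.

```python
# Sketch: train a small neural network on the Iris benchmark.
# Assumptions: scikit-learn's MLPClassifier stands in for the
# unnamed network in the text; inputs are scaled to [0, 1].
from sklearn.datasets import load_iris
from sklearn.model_selection import train_test_split
from sklearn.neural_network import MLPClassifier
from sklearn.preprocessing import MinMaxScaler

X, y = load_iris(return_X_y=True)          # 150 samples, 4 features
X = MinMaxScaler().fit_transform(X)        # normalize inputs to [0, 1]

X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.2, stratify=y, random_state=0
)

# One small hidden layer is enough for this benchmark.
clf = MLPClassifier(hidden_layer_sizes=(10,), max_iter=2000, random_state=0)
clf.fit(X_train, y_train)
print(f"test accuracy: {clf.score(X_test, y_test):.2f}")
```

Because Versicolor and Virginica overlap, a linear rule cannot separate all three classes, but the hidden layer lets the network learn a nonlinear boundary.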
2. Data Description
The training data contains 147 records with measurements of 147 flowers. Two records are reserved for forecasting.
Input columns:
1) "S_length" - sepal length;
2) "S_width" - sepal width;
3) "P_length" - petal length;
4) "P_width" - petal width;
Desired column:
5) "Flower Type" - flower class (Setosa, Versicol, or Virginic)
Input 1    Input 2    Input 3    Input 4    Desire
S_length   S_width    P_length   P_width    Flower Type

Flower Type encoding: Setosa = 1, Versicol = 0.5, Virginic = 0
0.4920532847 1 0.25801
0.3526936014 1 0.419006
0.3590337897 1 0.410838
0.3558522307 1 0.414926
0.3681075479 1 0.399288
0.3751487356 1 0.390439
0.3651799576 1 0.402996
0.3624896588 1 0.406419
0.3515938412 1 0.420431
0.3538676204 1 0.417487
0.3690013041 1 0.398159
0.3626938739 1 0.406159
0.3519510385 1 0.419967
0.3547875565 1 0.416299
0.3765994011 1 0.388628
0.3872895483 1 0.375414
0.3768002183 1 0.388378
0.3661978721 1 0.401705
0.3707268031 1 0.395985
0.3734682876 1 0.392542
0.3603793941 1 0.409115
0.3718800483 1 0.394535
0.3710344922 1 0.395598
0.3617258516 1 0.407394
0.3614206288 1 0.407784
0.351524766 1 0.42052
0.3640705363 1 0.404406
0.364465334 1 0.403904
0.362284057 1 0.406682
0.3577546679 1 0.412479
0.3547996556 1 0.416283
0.3632218759 1 0.405486
0.3788252128 1 0.385858
0.3817068255 1 0.382286
0.3538676204 1 0.417487
0.3585203276 1 0.411496
0.3643627714 1 0.404035
0.3538676204 1 0.417487
0.3546806339 1 0.416437
0.362174946 1 0.406821
0.3669352076 1 0.400771
0.336768646 1 0.439876
0.359970053 1 0.409638
0.3685882605 1 0.398681
0.3727532357 1 0.393439
0.3540518269 1 0.417249
0.3720911008 1 0.39427
0.3693175643 1 0.39776
0.3602964469 1 0.409221
0.3494435738 0.5 0.022667
0.3529773007 0.5 0.021616
0.3474496646 0.5 0.023272
0.3331465493 0.5 0.02784
0.3424483821 0.5 0.024823
0.3432064775 0.5 0.024584
0.3558271827 0.5 0.020786
0.3372019175 0.5 0.026503
0.3426614353 0.5 0.024755
0.3456265124 0.5 0.023831
0.3257162431 0.5 0.030375
0.3507799421 0.5 0.022267
0.3260128029 0.5 0.030272
0.3447102034 0.5 0.024115
0.3497251005 0.5 0.022583
0.3490848278 0.5 0.022775
0.3504779272 0.5 0.022357
0.3389550808 0.5 0.025935
0.3288223628 0.5 0.029302
0.3362772918 0.5 0.026805
0.3560205579 0.5 0.02073
0.3440528627 0.5 0.02432
0.3344121872 0.5 0.027419
0.3402092323 0.5 0.025533
0.344454901 0.5 0.024194
0.3469173902 0.5 0.023434
0.3398098378 0.5 0.025661
0.3471002049 0.5 0.023378
0.3467855499 0.5 0.023475
0.3391598005 0.5 0.02587
0.3344150183 0.5 0.027418
0.333772861 0.5 0.027631
0.3418211953 0.5 0.025021
0.3404168983 0.5 0.025467
0.3510896749 0.5 0.022174
0.3599887132 0.5 0.019603
0.3488310551 0.5 0.022852
0.3293210888 0.5 0.029131
0.350194545 0.5 0.022442
0.3382311459 0.5 0.026169
0.3381134443 0.5 0.026207
0.3476032104 0.5 0.023225
0.3388696488 0.5 0.025963
0.3343063954 0.5 0.027454
0.3421978931 0.5 0.024902
0.3485042443 0.5 0.022951
0.3469565838 0.5 0.023422
0.3450480066 0.5 0.02401
0.3414888425 0.5 0.025126
0.3584402848 0 0.128479
0.3439307639 0 0.118288
0.3462416262 0 0.119883
0.3444384247 0 0.118638
0.3491892829 0 0.121933
0.3423269995 0 0.117188
0.3420153129 0 0.116974
0.3390240068 0 0.114937
0.3329508351 0 0.110856
0.3620629111 0 0.13109
0.3547834352 0 0.125871
0.3414683777 0 0.116601
0.3485793002 0 0.121508
0.3407162915 0 0.116088
0.3509643819 0 0.123176
0.3568687662 0 0.127355
0.346664901 0 0.120177
0.3605009319 0 0.129961
0.3338623384 0 0.111464
0.3273975062 0 0.107189
0.3539053062 0 0.125249
0.3486732741 0 0.121573
0.336161255 0 0.113004
0.3423262479 0 0.117187
0.3550820012 0 0.126083
0.3475053527 0 0.12076
0.3454323354 0 0.119323
0.3501833311 0 0.122628
0.344616436 0 0.11876
0.3416268049 0 0.116709
0.3381396031 0 0.114338
0.359463678 0 0.129214
0.3455516941 0 0.119406
0.3410367778 0 0.116306
0.333647426 0 0.111321
0.3457227868 0 0.119524
0.3614040962 0 0.130613
0.3493544642 0 0.122049
0.3508764126 0 0.123114
0.3510040319 0 0.123204
0.3534229293 0 0.124908
0.3538710762 0 0.125225
0.3439307639 0 0.118288
0.3534439296 0 0.124923
0.3583827699 0 0.128438
0.351772375 0 0.123744
0.3380774355 0 0.114296
0.3496746107 0 0.122272
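The label encoding and the [0, 1] scaling of the measurements above can be reproduced with a short script. The column names follow the data description; min–max scaling is an assumption about how the published values were normalized, since the text does not state the method.

```python
# Sketch of the preprocessing implied by the data description.
# Assumption: raw measurements were min-max scaled to [0, 1];
# the exact normalization used for the listed values is not stated.

encoding = {"Setosa": 1.0, "Versicol": 0.5, "Virginic": 0.0}

def min_max_scale(values):
    """Scale a list of raw measurements to the [0, 1] range."""
    lo, hi = min(values), max(values)
    return [(v - lo) / (hi - lo) for v in values]

# Example with a few raw sepal lengths (cm) from the standard Iris data.
raw_s_length = [5.1, 4.9, 7.0, 6.3]
scaled = min_max_scale(raw_s_length)
labels = [encoding[n] for n in ["Setosa", "Setosa", "Versicol", "Virginic"]]
```

Encoding the three classes as equally spaced values on a single output (1, 0.5, 0) lets one output neuron represent all three species, at the cost of imposing an artificial ordering on the classes.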