(IJCSIS) International Journal of Computer Science and Information Security,Vol. 8, No. 2, 2010
3. Less computational complexity as compared to kNN.

7. Modified k Nearest Neighbor (MkNN)
   Key idea: Uses weights and the validity of each data point to classify nearest neighbors.
   Advantages: 1. Partially overcomes the low accuracy of WkNN. 2. Stable and robust.
   Disadvantages: 1. Computational complexity.
   Applications: Methods facing outliers.

8. Pseudo/Generalized Nearest Neighbor (GNN)
   Key idea: Utilizes the information of the n-1 neighbors as well, instead of only the nearest neighbor.
   Advantages: 1. Uses n-1 classes, thereby considering the whole training data set.
   Disadvantages: 1. Does not hold good for small data sets. 2. Computational complexity.
   Applications: Large data sets.

9. Clustered k Nearest Neighbor
   Key idea: Clusters are formed to select the nearest neighbors.
   Advantages: 1. Overcomes the defect of uneven distribution of training samples. 2. Robust in nature.
   Disadvantages: 1. The threshold parameter is difficult to select before running the algorithm. 2. Biased by the value of k used for clustering.
   Applications: Text classification.

10. Ball Tree k Nearest Neighbor (KNS1) [21, 22]
    Key idea: Uses a ball tree structure to improve kNN speed.
    Advantages: 1. Tunes well to the structure of the represented data. 2. Deals well with high-dimensional entities. 3. Easy to implement.
    Disadvantages: 1. Costly insertion algorithm. 2. KNS1 degrades as distance increases.
    Applications: Geometric learning tasks such as robotics, vision, speech, and graphics.

11. k-d Tree Nearest Neighbor (kdNN)
    Key idea: Divides the training data exactly into half-planes.
    Advantages: 1. Produces a perfectly balanced tree. 2. Fast and simple.
    Disadvantages: 1. More computation. 2. Requires intensive search. 3. Blindly slices points into halves, which may miss the structure of the data.
    Applications: Organization of multi-dimensional points.

12. Nearest Feature Line Neighbor (NFL)
    Key idea: Takes advantage of multiple templates per class.
    Advantages: 1. Improves classification accuracy. 2. Highly effective for small data sets. 3. Utilizes information ignored in nearest neighbor, i.e., the templates per class.
    Disadvantages: 1. Fails when the prototype in NFL is far away from the query point. 2. Computational complexity. 3. Describing feature points by a straight line is a hard task.
    Applications: Face recognition problems.

13. Local Nearest Neighbor
    Key idea: Focuses on the nearest neighbor prototypes of the query point.
    Advantages: 1. Covers the limitations of NFL.
    Disadvantages: 1. Number of computations.
    Applications: Face recognition.

14. Tunable Nearest Neighbor (TNN)
    Key idea: A tunable metric is used.
    Advantages: 1. Effective for small data sets.
    Disadvantages: 1. Large number of computations.
    Applications: Discrimination problems.

15. Center-Based Nearest Neighbor (CNN)
    Key idea: A center line is calculated.
    Advantages: 1. Highly efficient for small data sets.
    Disadvantages: 1. Large number of computations.
    Applications: Pattern recognition.

16. Principal Axis Tree Nearest Neighbor (PAT)
    Key idea: Uses a principal axis tree (PAT).
    Advantages: 1. Good performance. 2. Fast search.
    Disadvantages: 1. Computation time.
    Applications: Pattern recognition.

17. Orthogonal Search Tree Nearest Neighbor
    Key idea: Uses orthogonal trees.
    Advantages: 1. Less computation time. 2. Effective for large data sets.
    Disadvantages: 1. Query time is higher.
    Applications: Pattern recognition.
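To make the MkNN entry above concrete, the following is a minimal sketch of the idea of combining the validity of each training point with distance-weighted voting. The function names, the 1/(d + alpha) weight, and the sample data are illustrative choices for this sketch, not the exact formulation of the cited paper.

```python
import math
from collections import defaultdict

def euclidean(a, b):
    """Plain Euclidean distance between two feature vectors."""
    return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))

def validity(points, labels, k=3):
    """Validity of each training point: the fraction of its k nearest
    training neighbors that share its label. Points in dense same-class
    regions score near 1; outliers and mislabeled points score near 0."""
    vals = []
    for i, p in enumerate(points):
        neighbors = sorted(
            (j for j in range(len(points)) if j != i),
            key=lambda j: euclidean(p, points[j]))[:k]
        vals.append(sum(labels[j] == labels[i] for j in neighbors) / k)
    return vals

def mknn_classify(points, labels, vals, query, k=3, alpha=0.5):
    """Weighted vote among the k nearest neighbors of the query: each
    neighbor contributes validity / (distance + alpha), so unreliable
    (low-validity) or distant neighbors carry less weight."""
    neighbors = sorted(range(len(points)),
                       key=lambda j: euclidean(query, points[j]))[:k]
    votes = defaultdict(float)
    for j in neighbors:
        votes[labels[j]] += vals[j] / (euclidean(query, points[j]) + alpha)
    return max(votes, key=votes.get)
```

Precomputing the validity scores once over the training set is what gives the method its robustness against outliers, at the cost of the extra neighbor search per training point noted in the table.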
We have compared the nearest neighbor techniques. Some of them are structure-less and some are structure-based. Both kinds of techniques are improvements over the basic kNN technique. The improvements have been proposed by researchers to gain speed efficiency as well as space efficiency. Each technique holds good in a particular field under particular circumstances.

REFERENCES
T. M. Cover and P. E. Hart, "Nearest Neighbor Pattern Classification", IEEE Trans. Inform. Theory, Vol. IT-13, pp. 21-27, Jan. 1967.
T. Bailey and A. K. Jain, "A Note on Distance-Weighted k-Nearest Neighbor Rules", IEEE Trans. Systems, Man, Cybernetics, Vol. 8, pp. 311-313, 1978.
K. Chidananda and G. Krishna, "The Condensed Nearest Neighbor Rule Using the Concept of Mutual Nearest Neighbor", IEEE Trans. Information Theory, Vol. IT-25, pp. 488-490, 1979.
F. Angiulli, "Fast Condensed Nearest Neighbor", ACM International Conference Proceedings, Vol. 119, pp. 25-32.
E. Alpaydin, "Voting Over Multiple Condensed Nearest Neighbors", Artificial Intelligence Review, 11:115-132, 1997.
G. W. Gates, "Reduced Nearest Neighbor Rule", IEEE Trans. Information Theory, Vol. 18, No. 3, pp. 431-433.
G. Guo, H. Wang, D. Bell, "KNN Model-Based Approach in Classification", Springer Berlin, Vol. 2888.
S. C. Bagui, S. Bagui, K. Pal, "Breast Cancer Detection Using Nearest Neighbor Classification Rules", Pattern Recognition, 36, pp. 25-34, 2003.
Y. Zeng, Y. Yang, L. Zhou, "Pseudo Nearest Neighbor Rule for Pattern Recognition", Expert Systems with Applications, (36), pp. 3587-3595, 2009.
H. Parvin, H. Alizadeh and B. Minaei, "Modified k Nearest Neighbor", Proceedings of the World Congress on Engineering and Computer Science, 2008.
Z. Yong, "An Improved kNN Text Classification Algorithm Based on Clustering", Journal of Computers, Vol. 4, No. 3, March 2009.
Djouadi and E. Bouktache, "A Fast Algorithm for Nearest Neighbor Classification", IEEE Transactions on Pattern Analysis and Machine Intelligence, Vol. 19, No. 3, 1997.
V. Vaidehi, S. Vasuhi, "Person Authentication Using Face Recognition", Proceedings of the World Congress on Engineering and Computer Science, 2008.
Shizen, Y. Wu, "An Algorithm for Remote Sensing Image Classification Based on Artificial Immune B-cell Network", Springer Berlin, Vol. 40.
G. Toker, O. Kirmemis, "Text Categorization Using k Nearest Neighbor Classification", Survey Paper, Middle East Technical University.
Y. Liao, V. R. Vemuri, "Using Text Categorization Technique for Intrusion Detection", Survey Paper, University of California.