
A Review of Image Classification Approaches and Techniques

R. Ponnusamy1, S. Sathyamoorthy2, K. Manikandan3


1
Department of Technology, Annamalai University, povi2006@yahoo.co.in
2Department of CSE, Annamalai University
3Department of IT, SRM University

Abstract—This paper presents a literature survey of the various approaches used for classifying an image based on the objects it contains. Classification is a vital and challenging task in the field of computer vision. Classification is based on the description, texture, or similarity of items. Image classification refers to labeling images into one of a number of predefined categories. Pixels are the basic units of an image, and image classification groups these pixels into different classes. The process includes image acquisition, image pre-processing, and image segmentation. Many techniques have been developed for image classification. This paper studies classification techniques such as Artificial Neural Networks (ANN), Naive Bayes (NB), K-Nearest Neighbor (KNN), the Multi-Layered Perceptron (MLP), Kernel Support Vector Machines (SVM), Decision Trees (DT), fuzzy measures, and Radial Basis Functions (RBF).
Keywords—Image Classification; Artificial Neural Network; Naive Bayes; K-Nearest Neighbor; Decision Tree
I. INTRODUCTION
Classifying objects is an easy task for humans, but it is challenging for a machine. Image classification includes image pre-processing, image sensing, object detection, object segmentation, feature extraction, and object classification. An image classification system consists of a database of predefined patterns against which an object is compared in order to assign it to the appropriate category. Image classification is a crucial and challenging task in various application domains, including remote sensing, vehicle navigation, biomedical imaging, video surveillance, biometry, industrial visual inspection, and robot navigation [2]. Fig. 1 shows the steps involved in classification techniques [1]:
Pre-processing → Detection and Extraction → Training → Classification

Figure 1. Steps for image classification

DOI: 10.23883/IJRTER.2017.3033.XTS7Z
International Journal of Recent Trends in Engineering & Research (IJRTER)
Volume 03, Issue 03; March - 2017 [ISSN: 2455-1457]

II. IMAGE CLASSIFICATION METHODS


Image classification is one of the most important and complex processes in image processing. Several image classification methods exist; the two main ones are supervised classification and unsupervised classification.

2.1 Supervised classification


In supervised classification, pixels of known class are grouped and given class labels; this process is known as training. The classifier then uses the trained pixels to classify other images. Supervised classification requires prior information before the testing process, and this information must be collected by an analyst: the analyst identifies representative training sites for each informational class, and the algorithm generates the decision boundaries. Commonly used supervised classification approaches are parallelepiped, minimum distance to mean, and maximum likelihood. The steps in the supervised classification approach are:
 Training areas for each informational class are identified by the analyst
 Signatures are computed (mean, variance, covariance, etc.)
 All pixels are classified
 The informational class map is produced
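As an illustrative sketch of the minimum-distance-to-mean approach listed above (the helper names and the toy two-band training data are assumptions for the example, not from any specific library):

```python
# Minimum-distance-to-mean supervised classification sketch.
# Training: compute the mean feature vector (the "signature") of each
# labeled class. Testing: assign a pixel to the nearest class mean.
import math

def train_means(samples):
    """samples: dict mapping class label -> list of feature vectors."""
    means = {}
    for label, vectors in samples.items():
        dims = len(vectors[0])
        means[label] = [sum(v[d] for v in vectors) / len(vectors)
                        for d in range(dims)]
    return means

def classify(pixel, means):
    """Return the label of the class mean nearest in Euclidean distance."""
    def dist(a, b):
        return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))
    return min(means, key=lambda label: dist(pixel, means[label]))

# Toy two-band "image": water pixels are dark, vegetation pixels bright.
training = {
    "water":      [[10, 12], [12, 10], [11, 11]],
    "vegetation": [[80, 90], [85, 88], [90, 85]],
}
means = train_means(training)
print(classify([14, 13], means))   # near the water signature -> "water"
print(classify([82, 87], means))   # near the vegetation signature -> "vegetation"
```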

2.2 Unsupervised classification


In unsupervised classification, pixels are grouped according to their properties. This process is known as clustering, and the groups are known as clusters; the user decides how many clusters are wanted. Unsupervised classification is used when no trained pixels are available. It needs no prior information and, since it requires no human annotation, it is fully automated: the algorithm identifies clusters in the data, and the analyst then labels those clusters. The steps in unsupervised classification are:
 The data are clustered
 All pixels are classified based on the clusters
 The spectral class map is produced
 Clusters are labeled by the analyst
 The informational class map is produced
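The clustering step can be sketched with a minimal k-means implementation (Lloyd's algorithm; the toy pixel values and the naive initialization are illustrative choices, not a prescription):

```python
# Minimal k-means sketch for unsupervised pixel grouping: pixels are
# clustered purely by feature similarity; the analyst labels clusters later.
def kmeans(points, k, iters=10):
    """Lloyd's algorithm; naive initialization from the first k points."""
    dims = len(points[0])
    centres = [list(p) for p in points[:k]]
    for _ in range(iters):
        # Assignment step: each pixel joins its nearest centre.
        clusters = [[] for _ in range(k)]
        for p in points:
            nearest = min(range(k), key=lambda c: sum(
                (a - b) ** 2 for a, b in zip(p, centres[c])))
            clusters[nearest].append(p)
        # Update step: move each centre to the mean of its cluster.
        centres = [[sum(p[d] for p in cl) / len(cl) for d in range(dims)]
                   if cl else centres[i] for i, cl in enumerate(clusters)]
    return centres, clusters

# Two obvious blobs of pixels in a toy two-band feature space.
pixels = [[1, 1], [2, 1], [1, 2], [9, 9], [8, 9], [9, 8]]
centres, clusters = kmeans(pixels, k=2)
```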

III. IMAGE CLASSIFICATION TECHNIQUES


Image classification includes the following steps:

a. Image Acquisition: acquiring the images for image processing.
b. Image Pre-Processing: applying image transformation, noise removal, and atmospheric correction techniques.
c. Feature Extraction: extracting the important characteristics of the image.
d. Classification: classifying the images into predefined categories based on the extracted features, using suitable methods that compare the image pattern with the images in the database.
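The four steps above can be wired together as a minimal sketch; the function names and the thresholding "classifier" are illustrative placeholders, not part of any real library:

```python
# Sketch of the acquisition -> pre-processing -> features -> classification
# pipeline on a tiny list-of-lists "image". Every stage is a trivial stub.

def acquire_image():
    # Stand-in for reading from a sensor or file: a 2x2 grayscale image.
    return [[10, 200], [12, 198]]

def preprocess(image):
    # e.g. noise removal; here, just clamp values to the valid [0, 255] range.
    return [[min(max(p, 0), 255) for p in row] for row in image]

def extract_features(image):
    # e.g. mean intensity as a single feature describing the image.
    pixels = [p for row in image for p in row]
    return [sum(pixels) / len(pixels)]

def classify(features, threshold=100):
    # Compare the feature against a predefined pattern (here, a threshold).
    return "bright" if features[0] > threshold else "dark"

label = classify(extract_features(preprocess(acquire_image())))
print(label)  # mean of (10, 200, 12, 198) is 105, above the threshold -> bright
```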

3.1 Artificial Neural Networks


Artificial Neural Network (ANN) is a type of artificial intelligence that imitates some functions of the human mind. An ANN has a natural tendency to store experiential knowledge. It consists of a sequence of layers; each layer consists of a set of neurons, and all neurons of every layer are linked by weighted connections to all neurons of the preceding and succeeding layers [3]. An ANN is a computational model inspired by the biological neural network. It can be considered as a weighted


directed graph in which the nodes are neurons and the weighted edges are the connections among the neurons. Each artificial neuron computes a weighted sum of its input signals and generates an output based on a certain activation function, such as piecewise linear, sigmoid, or Gaussian. The network consists of one input layer, one output layer, and, depending on the application, zero or more hidden layers. The number of nodes in the output layer equals the number of information classes, whereas the number of nodes in the input layer equals the dimensionality of each pixel. A feed-forward ANN with the back-propagation learning algorithm is most commonly used in the ANN literature. In the learning phase, the network learns the connection weights iteratively from a set of training samples. The network gives an output corresponding to each input; the generated output is compared to the desired output, and the error between the two is used to modify the weights of the ANN. The training procedure ends when the error becomes less than a predefined threshold. Then, all the testing data are fed into the classifier to perform the classification [4].
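The training loop described above can be sketched on the smallest possible network: a single sigmoid neuron learning the logical AND function by gradient descent. The learning rate, epoch budget, and error threshold are arbitrary choices for the example:

```python
# Iterative weight learning with error feedback, as described above:
# outputs are compared to targets and the error adjusts the weights
# until it falls below a predefined threshold.
import math

def sigmoid(z):
    return 1.0 / (1.0 + math.exp(-z))

data = [([0, 0], 0), ([0, 1], 0), ([1, 0], 0), ([1, 1], 1)]
w, b, rate = [0.0, 0.0], 0.0, 1.0

for epoch in range(5000):
    total_error = 0.0
    for x, target in data:
        out = sigmoid(w[0] * x[0] + w[1] * x[1] + b)
        err = target - out
        total_error += err ** 2
        # Gradient of the squared error through the sigmoid.
        grad = err * out * (1 - out)
        w = [wi + rate * grad * xi for wi, xi in zip(w, x)]
        b += rate * grad
    if total_error < 0.01:  # predefined error threshold
        break

predictions = [round(sigmoid(w[0] * x[0] + w[1] * x[1] + b)) for x, _ in data]
print(predictions)  # the neuron has learned AND -> [0, 0, 0, 1]
```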

3.2 Naive Bayes classifier


The Naive Bayes classifier is based on a probabilistic representation and assigns to the feature vector extracted from the region of interest (ROI) the class with the greatest estimated posterior probability. This process is optimal when the attributes are independent; however, it often performs well even without this assumption. The simplicity of the method allows good performance with small training sets, and because it builds probabilistic models it is robust to outliers. In addition, it creates soft decision boundaries, which has the effect of avoiding overtraining. However, the arbitrary choice of the distribution model used for estimating the probabilities P(x), along with the lack of flexibility of the decision boundaries, results in limited performance for complex multiclass configurations [5].
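A minimal Gaussian Naive Bayes sketch: a per-class mean and variance are estimated for each feature, and the class with the largest posterior (prior times likelihood) wins. The one-feature toy data and equal priors are assumptions for the example:

```python
# Gaussian Naive Bayes sketch: pick the class with the greatest
# estimated posterior probability for the observed feature value.
import math

def gaussian_pdf(x, mean, var):
    return math.exp(-(x - mean) ** 2 / (2 * var)) / math.sqrt(2 * math.pi * var)

def train(samples):
    """samples: dict mapping class label -> list of 1-D feature values."""
    model = {}
    for label, values in samples.items():
        mean = sum(values) / len(values)
        var = sum((v - mean) ** 2 for v in values) / len(values)
        model[label] = (mean, var)
    return model

def classify(x, model, priors):
    # The posterior is proportional to prior * likelihood.
    return max(model, key=lambda c: priors[c] * gaussian_pdf(x, *model[c]))

model = train({"dark": [10, 12, 14], "bright": [200, 210, 190]})
priors = {"dark": 0.5, "bright": 0.5}
print(classify(13, model, priors))    # -> dark
print(classify(195, model, priors))   # -> bright
```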

3.3 K-Nearest Neighbor


The k-Nearest Neighbor (KNN) classifier carves out hyperspheres in the space of instances by assigning the majority class of the k nearest instances according to a defined metric. It is asymptotically optimal, and its implementation allows fast testing [6]. However, several shortcomings are inherent to this method. It is very sensitive to the curse of dimensionality [7]: increasing the dimensionality sparsifies the feature space, and the local homogeneous regions that represent the prototypes of the different classes are spread out. The classification performance strongly depends on the metric used. Moreover, a small value of k results in chaotic boundaries and makes the process very sensitive to outliers.
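The majority-vote rule under the Euclidean metric can be sketched directly (the toy training points and the choice k=3 are illustrative):

```python
# k-NN sketch: classify a query by the majority label among its k nearest
# training instances under Euclidean distance.
import math
from collections import Counter

def knn_classify(query, training, k=3):
    """training: list of (feature_vector, label) pairs."""
    by_distance = sorted(training, key=lambda item: math.dist(query, item[0]))
    votes = Counter(label for _, label in by_distance[:k])
    return votes.most_common(1)[0][0]

training = [
    ([1, 1], "A"), ([1, 2], "A"), ([2, 1], "A"),
    ([8, 8], "B"), ([8, 9], "B"), ([9, 8], "B"),
]
print(knn_classify([2, 2], training, k=3))  # -> A
print(knn_classify([9, 9], training, k=3))  # -> B
```

With k=1 a single mislabeled outlier near the query would flip the decision, which is the sensitivity to outliers noted above.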

3.4 Multi- Layered Perceptron


Multi-Layered Perceptrons (MLP) are inspired by the human nervous system, in which information is processed through interconnected neurons. An MLP is a feed-forward neural network, meaning that information propagates from input to output. The inputs are fed with the values of each feature and the outputs give the class value. With a single layer of neurons, the output is a weighted linear combination of the inputs; this network is known as a linear perceptron. By adding an extra layer of neurons with nonlinear basis functions (the hidden layer), a nonlinear mapping between input and output becomes possible. The training phase consists of iterative optimization of the weights connecting the neurons by minimizing the mean squared classification error. The learning rate, which controls the adjustment of the weights during the training phase, must be chosen as a trade-off between the error on the training set and overtraining. An additional critical parameter is the number of units in the hidden layer. Indeed, the MLP is subject to overfitting and requires an optimal choice of the regularization parameters. The MLP can generate models of arbitrary complexity by drawing arbitrarily complex decision boundaries. It is also robust to noisy features, as these are assigned low weights after training.
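The value of the hidden layer can be seen on the classic XOR problem, which no linear perceptron can solve. The weights below are hand-picked for brevity rather than learned:

```python
# A two-layer network with step activations computing XOR: the hidden
# layer's nonlinearity enables a mapping no single linear layer can make.
def step(z):
    return 1 if z >= 0 else 0

def mlp_xor(x1, x2):
    h1 = step(x1 + x2 - 0.5)    # hidden unit 1 fires for OR
    h2 = step(x1 + x2 - 1.5)    # hidden unit 2 fires for AND
    return step(h1 - h2 - 0.5)  # output: OR and not AND, i.e. XOR

print([mlp_xor(a, b) for a, b in [(0, 0), (0, 1), (1, 0), (1, 1)]])
# -> [0, 1, 1, 0]
```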


3.5 Kernel Support Vector Machines


Kernel SVMs implicitly map input feature vectors to a higher-dimensional space by using a kernel function, for example a Gaussian of a given width. In the transformed space, a maximal-separation hyperplane is built for a two-class problem: two parallel hyperplanes are constructed symmetrically on either side of the hyperplane that separates the data. The aim is to maximize the distance between the two outer hyperplanes, called the margin. The assertion is that the larger the margin, the better the generalization error of the classifier will be. Indeed, SVMs were developed according to the structural risk minimization principle, which seeks to minimize an upper bound on the generalization error, whereas most classifiers aim at minimizing the empirical risk, i.e., the error on the training set. The SVM algorithm aims at finding a decision function that minimizes this functional. SVMs permit training nonlinear classifiers in high-dimensional spaces using a small training set. This is enabled by the selection of a subset of vectors (called the support vectors) that characterizes the boundaries between the classes well [8].
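The implicit mapping rests on the kernel function; a sketch of the Gaussian (RBF) kernel commonly used, where the gamma value and the sample points are arbitrary example choices:

```python
# Gaussian (RBF) kernel sketch: it measures similarity as if the points
# were mapped to a higher-dimensional space, without building that space.
import math

def rbf_kernel(x, y, gamma=0.5):
    sq_dist = sum((a - b) ** 2 for a, b in zip(x, y))
    return math.exp(-gamma * sq_dist)

a, b, c = [1.0, 1.0], [1.1, 0.9], [5.0, 5.0]
print(rbf_kernel(a, a))                      # identical points -> 1.0
print(rbf_kernel(a, b) > rbf_kernel(a, c))   # nearby points are more similar -> True
```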

3.6 Decision Tree


Decision Trees (DT) are based on a hierarchical rule-based method and use a non-parametric approach. A decision tree calculates class membership by repeatedly partitioning a dataset into uniform subsets. The hierarchical classifier permits the acceptance and rejection of class labels at each intermediate stage. The method consists of three parts: partitioning the nodes, finding the terminal nodes, and allocating class labels to the terminal nodes [10].
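The partitioning idea can be sketched with a single-split "decision stump" on one feature, choosing the threshold that yields the most uniform subsets (the toy 1-D data are illustrative):

```python
# Decision stump sketch: one recursive-partitioning step of a decision
# tree, picking the 1-D threshold that minimizes misclassifications when
# each side predicts its majority label.
def best_stump(values, labels):
    best = None
    for t in sorted(set(values)):
        left = [l for v, l in zip(values, labels) if v <= t]
        right = [l for v, l in zip(values, labels) if v > t]
        errors = 0
        for side in (left, right):
            if side:
                majority = max(set(side), key=side.count)
                errors += sum(1 for l in side if l != majority)
        if best is None or errors < best[1]:
            best = (t, errors)
    return best

values = [1, 2, 3, 10, 11, 12]
labels = ["A", "A", "A", "B", "B", "B"]
threshold, errors = best_stump(values, labels)
print(threshold, errors)  # a split at 3 separates the classes perfectly -> 3 0
```

A full tree would recurse on each subset until the terminal nodes are uniform enough to be assigned class labels.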

3.7 Fuzzy Measure


In fuzzy classification, various stochastic relationships are determined to describe the characteristics of an image. These stochastic characteristics are combined into a set of properties whose members are fuzzy in nature. This provides the opportunity to describe different categories of stochastic characteristics in the same form. Performance and accuracy depend on the threshold selection and the fuzzy integral [9].
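Fuzzy membership can be sketched with overlapping triangular membership functions over pixel intensity; the set names and boundaries below are arbitrary example choices:

```python
# Fuzzy membership sketch: unlike a hard threshold, a pixel can belong
# to several categories simultaneously, each with a partial degree.
def triangular(x, left, peak, right):
    """Degree of membership in [0, 1] for a triangular fuzzy set."""
    if x <= left or x >= right:
        return 0.0
    if x <= peak:
        return (x - left) / (peak - left)
    return (right - x) / (right - peak)

def memberships(intensity):
    # Overlapping fuzzy sets for pixel intensity in [0, 255].
    return {
        "dark":   triangular(intensity, -1, 0, 128),
        "medium": triangular(intensity, 64, 128, 192),
        "bright": triangular(intensity, 128, 255, 256),
    }

m = memberships(96)
print(round(m["dark"], 2), round(m["medium"], 2), m["bright"])
# a mid-dark pixel belongs partly to "dark" and partly to "medium" -> 0.25 0.5 0.0
```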

3.8 Radial basis function


Radial basis functions have received significant attention, most commonly with a Gaussian of the
form,
g_i(x_j) = exp( −‖x_j − m_i‖² / (2σ_i²) )

where m_i is the mean vector (the centre) and x_j is the input vector.
Classical techniques utilizing radial basis functions employ some method of determining a subset of centres; typically, a clustering method is first employed to select them. An attractive feature of the SVM is that this selection is implicit, with each support vector contributing one local Gaussian function centred at that data point. By further considerations it is possible to select the global basis function width σ using the SRM principle [11].
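A sketch of evaluating the Gaussian radial basis function above: the response peaks at 1 when the input coincides with the centre m_i and decays with distance, at a rate set by the width σ_i (the centre and width values are arbitrary examples):

```python
# Gaussian radial basis function g_i(x_j) = exp(-||x_j - m_i||^2 / (2*sigma_i^2)).
import math

def rbf(x, centre, sigma):
    sq_dist = sum((a - b) ** 2 for a, b in zip(x, centre))
    return math.exp(-sq_dist / (2 * sigma ** 2))

centre = [2.0, 3.0]
print(rbf([2.0, 3.0], centre, sigma=1.0))   # at the centre -> 1.0
print(rbf([3.0, 3.0], centre, sigma=1.0))   # one unit away -> exp(-0.5)
print(rbf([3.0, 3.0], centre, 2.0) > rbf([3.0, 3.0], centre, 0.5))
# a wider sigma decays more slowly -> True
```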

IV. CONCLUSION
In this paper we discussed different image classification techniques. The most common approaches for image classification can be categorized as supervised versus unsupervised; parametric versus non-parametric; object-oriented, sub-pixel, per-pixel, or per-field; spectral, contextual, or spectral-contextual; and hard versus soft classification. Some of the most commonly used techniques were discussed here. This survey provides theoretical knowledge about classification methods and guidance for selecting an appropriate classification method.


REFERENCES
[1] Jianxin Wu, "Efficient HIK SVM Learning for Image Classification", IEEE Transactions on Image Processing, Vol. 21, No. 10, October 2012.
[2] Bohyung Han and Larry S. Davis, "Density-Based Multifeature Background Subtraction with Support Vector Machine", IEEE Transactions on Pattern Analysis and Machine Intelligence, Vol. 34, No. 5, May 2012.
[3] Vijay Kumar and Priyanka Gupta, "Importance of Statistical Measures in Digital Image Processing", International Journal of Emerging Technology and Advanced Engineering, ISSN 2250-2459, Volume 2, Issue 8, 2012.
[4] S. Arunadevi and S. Daniel Madan Raja, "A Survey on Image Classification Algorithm Based on Per-Pixel", International Journal of Engineering Research and General Science, Vol. 2, Issue 6, October-November 2014.
[5] A. Taherkhani, "Recognizing sorting algorithms with the C4.5 decision tree classifier", in Program Comprehension (ICPC), 2010 IEEE 18th International Conference on, 2010, pp. 72-75.
[6] Y. Xu, M. Sonka, G. McLennan, J. Guo, and E. A. Hoffman, "MDCT based 3-D texture classification of emphysema and early smoking related lung pathologies", IEEE Trans. Med. Imag., vol. 25, no. 4, pp. 464-475, Apr. 2006.
[7] L. Sorensen, S. B. Shaker, and M. de Bruijne, "Quantitative analysis of pulmonary emphysema using local binary patterns", IEEE Trans. Med. Imag., vol. 29, no. 2, pp. 559-569, Feb. 2010.
[8] Jianxin Wu, "Efficient HIK SVM Learning for Image Classification", IEEE Transactions on Image Processing, Vol. 21, No. 10, October 2012.
[9] Serafeim Moustakidis, Giorgos Mallinis, Nikos Koutsias, John B. Theocharis, and Vasilios Petridis, "SVM-Based Fuzzy Decision Trees for Classification of High Spatial Resolution Remote Sensing Images", IEEE Transactions on Geoscience and Remote Sensing, Vol. 50, No. 1, January 2012.
[10] Lizhen Lu, Liping Di, and Yanmei Ye, "A Decision-Tree Classifier for Extracting Transparent Plastic-Mulched Land Cover from Landsat-5 TM Images", IEEE Journal of Selected Topics in Applied Earth Observations and Remote Sensing, Vol. 7, No. 11, November 2014.
[11] G. Vasumathi, "A Survey on SAR Image Classification", International Journal of Advanced Engineering and Global Technology, Vol. 3, Issue 12, December 2015.

@IJRTER-2017, All Rights Reserved
