
Performance Evaluation of Optimization Techniques with Vector Quantization Used for Image Compression

Rausheen Bal, Aditya Bakshi and Sunanda Gupta

Abstract In this paper, a performance evaluation of vector quantization techniques used for image compression is presented. Vector quantization is currently a prime area of research; it can be implemented with different algorithms, and the advantages and disadvantages of each algorithm can be analyzed. A new technique is recommended that is intended to overcome the main disadvantage of the Cuckoo Search-LBG (CS-LBG) algorithm, namely its slower convergence speed compared to other techniques. The main goal is therefore to eliminate this disadvantage of CS-LBG. The performance evaluation covers the LBG, PSO-LBG, FA-LBG, HBMO-LBG, QPSO-LBG, BAT-LBG, CS-LBG, and KFCG algorithms. Further, a hybrid CS-KFCG algorithm has been implemented and compared with the CS-LBG, FA-LBG, HBMO-LBG, BAT-LBG, LBG, and PSO-LBG algorithms.

Keywords Vector quantization · Image compression · LBG · FA-LBG · HBMO-LBG · QPSO-LBG · BAT-LBG · CS-LBG · KFCG algorithm

1 Introduction

Image compression is concerned with reducing the number of bits required to represent an image.
R. Bal · A. Bakshi (B)


School of Computer Science & Engineering, Lovely Professional University,
Phagwara, Punjab, India
e-mail: aditya.17433@lpu.co.in; addybakshi@gmail.com
R. Bal
e-mail: rausheenbal@gmail.com
S. Gupta
Department of Computer Science Engineering, Shri Mata Vaishno Devi University,
Katra, J&K, India
e-mail: sunanda.gupta@smvdu.ac.in

© Springer Nature Singapore Pte Ltd. 2019


N. Yadav et al. (eds.), Harmony Search and Nature Inspired Optimization Algorithms,
Advances in Intelligent Systems and Computing 741,
https://doi.org/10.1007/978-981-13-0761-4_83

With the advances in digital devices for image acquisition, data storage, and display, many novel applications of digital imaging have emerged; on the other hand, a large number of these applications are impractical because of their considerable storage requirements. Consequently, interest in image compression has risen markedly over the last decade. Image compression plays a crucial part in image and video applications. At present, developing image compression methods with superior reconstructed image quality is a critical and challenging task. Image compression aims to transmit the image at lower bitrates. The procedure for image compression involves:
• identification of redundancies in the image,
• a suitable encoding approach, and
• a transformation approach.
Quantization is a powerful and effective tool for image compression and is a lossy compression method. Quantization is of two types:
• scalar quantization, and
• vector quantization (VQ).
The goal of vector quantization is to construct an efficient codebook. A codebook is a collection of code words, and each input image vector is mapped to the code word with the smallest Euclidean distance. Vector quantization has three stages:
• design phase,
• encoding phase, and
• decoding phase.
As noted above, identification of redundancies in the image, a suitable encoding strategy, and an appropriate transformation method are the primary components of image compression in multimedia applications.
Design Phase
1. Define a function that maps the n-dimensional vector space to a particular set, the codebook
2. Generate the codebook using a training algorithm
Encoding Phase
1. The image is taken as input
2. The image is divided into blocks
3. The blocks are converted into vectors of n dimensions
4. For each vector, the closest code vector in the codebook is searched
5. The index of the retrieved code vector is transmitted to the receiver
Decoding Phase
1. The received indices are taken as input

2. Each index is mapped to its corresponding code vector in the codebook
3. The code vectors are arranged as non-overlapping blocks of the original size
4. The blocks are assembled in order to reconstruct the image
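The three phases above can be illustrated with a short sketch. The following is a minimal, illustrative Python example, not the authors' implementation; the block size, the codebook size, the function names, and the use of NumPy are assumptions made purely for illustration.

```python
import numpy as np

def image_to_vectors(image, block=4):
    """Design/encoding input: split a grayscale image into non-overlapping
    block x block patches and flatten each patch into a training vector."""
    h, w = image.shape
    h, w = h - h % block, w - w % block          # drop ragged edges
    return (image[:h, :w]
            .reshape(h // block, block, w // block, block)
            .swapaxes(1, 2)
            .reshape(-1, block * block))

def encode(vectors, codebook):
    """Encoding phase: map every vector to the index of the nearest
    code word (minimum Euclidean distance)."""
    d = np.linalg.norm(vectors[:, None, :] - codebook[None, :, :], axis=2)
    return np.argmin(d, axis=1)                  # indices sent over the channel

def decode(indices, codebook, image_shape, block=4):
    """Decoding phase: replace each index by its code word and
    re-assemble the blocks into an image."""
    h, w = image_shape
    h, w = h - h % block, w - w % block
    blocks = codebook[indices].reshape(h // block, w // block, block, block)
    return blocks.swapaxes(1, 2).reshape(h, w)

# toy usage: random image and a random codebook of 16 code words
img = np.random.randint(0, 256, (64, 64)).astype(float)
cb = np.random.rand(16, 16) * 255
idx = encode(image_to_vectors(img), cb)
rec = decode(idx, cb, img.shape)
```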
Vector quantization thus involves three stages: encoder, channel, and decoder. These stages correspond to the three blocks described in the table below.

Block 1 Block 1 is the encoder segment, which consists of creating the image vectors and generating the codebook. Image vectors are created by subdividing the image into non-overlapping blocks. Constructing an effective codebook is the first task in VQ. A codebook consists of a group of code words whose size equals the non-overlapping block size. An algorithm is considered superior if the codebook it creates is efficient. After the codebook is created, every vector is recorded with an index number from the index table, and these index numbers are sent to the receiver
Block 2 Block 2 is the channel through which the index numbers are transmitted to the receiver
Block 3 Block 3 is the decoder segment, which consists of the index table, the codebook, and the reconstructed image. The incoming index numbers are decoded using the receiver's index table. The codebook at the receiver is the same as the transmitted codebook. The incoming index numbers are mapped to their corresponding code words, and these code words are arranged so that the size of the recovered image is identical to that of the input image

2 Approaches for Vector Quantization

2.1 Linde-Buzo Gray Algorithm (LBG)

The basic and most widely used VQ approach is the Linde-Buzo-Gray (LBG) algorithm (1980) [1]. It is straightforward, practical, and adaptable, and it relies on the minimum Euclidean distance between the image vector and the corresponding code word. It delivers a locally optimal solution, although it does not guarantee the globally optimal solution. The quality of the LBG solution depends on the initial codebook, which is generated randomly. The algorithm provides a simple and intuitive way to design vector quantizers based either on a known probabilistic model or on a long training sequence of data. Distortion becomes smaller with repeated execution of the LBG algorithm; the algorithm ensures that the distortion does not increase from one iteration to the next.

Advantage It is an uncomplicated procedure that generates a locally optimal codebook for image compression.

Disadvantage A weakness of the LBG algorithm is that, in the 2-D case, the clusters it produces are elongated at -45° to the horizontal axis, which results in ineffective clustering.
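As a rough illustration of the iterative refinement described above, the sketch below implements the classical two-step LBG update (nearest-neighbour assignment followed by centroid recomputation). Initialization by random sampling and the stopping rule are assumptions for illustration, not the exact settings of [1].

```python
import numpy as np

def lbg(training_vectors, codebook_size, max_iter=50, eps=1e-4, seed=0):
    """Generalized Lloyd / LBG iteration on a set of training vectors."""
    rng = np.random.default_rng(seed)
    # random initial codebook (the paper notes LBG depends on this choice)
    codebook = training_vectors[rng.choice(len(training_vectors),
                                           codebook_size, replace=False)].copy()
    prev_distortion = np.inf
    for _ in range(max_iter):
        # assignment step: nearest code word by Euclidean distance
        d = np.linalg.norm(training_vectors[:, None] - codebook[None], axis=2)
        labels = d.argmin(axis=1)
        distortion = d[np.arange(len(labels)), labels].mean()
        # update step: move each code word to the centroid of its cell
        for k in range(codebook_size):
            members = training_vectors[labels == k]
            if len(members):
                codebook[k] = members.mean(axis=0)
        # distortion is non-increasing; stop when the improvement is tiny
        if prev_distortion - distortion < eps:
            break
        prev_distortion = distortion
    return codebook
```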

2.2 Particle Swarm Optimization-Linde Buzo Gray Algorithm (PSO-LBG)

Particle Swarm Optimization (PSO) [2] is an optimization technique that has been used to generate a high-quality codebook for image compression. In that work, the result of the LBG method is used as the global best particle, which raises the convergence speed of PSO. Image encoding and image decoding were simulated in the experiments, and the results showed that the technique is reliable and that the reconstructed images have higher quality than images reconstructed by other methods. PSO was first introduced by Kennedy and Eberhart in 1995 [3]. It is inspired by the flocking behaviour of birds and the schooling behaviour of fish. In a multidimensional space, every particle (individual) represents a possible solution to the problem. A fitness function assigns a fitness value to each particle's position. Every particle records two positions: the position with the highest fitness value in the whole population is called the global best (gbest) position, and the best fitness position found by the particle itself so far is called the individual best (pbest) position. In PSO-LBG, the codebooks (particles) change their values based on their past experience and on the best experience of the swarm in order to produce the best codebook; here the assumption is that codebooks are treated as particles. Computationally, PSO optimizes a problem by iteratively improving a population of candidate solutions, the particles, with respect to a given measure of quality, moving the particles around the search space according to simple mathematical formulas for the particle's position and velocity. Each particle's movement is influenced by its own best known position and is also guided toward the best known positions in the search space, which are updated as better positions are found by other particles. This is expected to move the swarm toward the best solutions. PSO is a metaheuristic: it makes few or no assumptions about the problem being optimized and can search very large spaces of candidate solutions. However, like other metaheuristics, PSO does not guarantee that an optimal solution is ever found.

Advantage PSO-LBG achieves more accurate compression with a shorter running time than LBG by updating the gbest and pbest solutions.

Disadvantage PSO-LBG does not account for the instability of particles, which may arise when particle velocities become very high.
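A minimal sketch of the PSO position and velocity update described above follows; the inertia weight and acceleration coefficients are illustrative defaults, not values taken from [2], and the function name and signature are assumptions for illustration.

```python
import numpy as np

def pso_step(positions, velocities, pbest, gbest, fitness,
             w=0.7, c1=1.5, c2=1.5, rng=np.random.default_rng()):
    """One PSO iteration. Each row of `positions` is a candidate codebook
    flattened into a vector; `fitness` returns a value to maximise."""
    r1 = rng.random(positions.shape)
    r2 = rng.random(positions.shape)
    # velocity: inertia + pull towards personal best + pull towards global best
    velocities = (w * velocities
                  + c1 * r1 * (pbest - positions)
                  + c2 * r2 * (gbest - positions))
    positions = positions + velocities
    # refresh personal bests where the new position improved the fitness
    f_new = np.array([fitness(p) for p in positions])
    f_old = np.array([fitness(p) for p in pbest])
    improved = f_new > f_old
    pbest = np.where(improved[:, None], positions, pbest)
    # global best is the best personal best
    gbest = pbest[np.array([fitness(p) for p in pbest]).argmax()]
    return positions, velocities, pbest, gbest
```

In PSO-LBG the gbest particle is seeded with the LBG codebook, which is what accelerates convergence according to the description above.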

2.3 Honey Bee Mating Optimization-Linde Buzo Gray Algorithm (HBMO-LBG)

In the work on HBMO-LBG, the investigators applied another swarm algorithm, honey bee mating optimization, to generate the codebook for vector quantization. The outcomes were compared with three other approaches: the LBG [1], PSO-LBG [2], and QPSO-LBG [4] techniques. Experimental results showed that the HBMO-LBG algorithm is more reliable and that the reconstructed images have better quality than those produced by the other three methods. Bee-inspired algorithms are essentially split into two classes according to the behaviour they model: foraging behaviour and mating behaviour. The foraging behaviour of honey bees has been implemented in the artificial bee colony (ABC) algorithm by Karaboga and Basturk [5]. Honey bees are social insects and live in colonies numbering in the thousands. Three sorts of adult honey bees dwell in one colony: the queen, the male drones, and the infertile female workers. In each colony there is only a single egg-laying queen, yet there is an immense number of workers. The queen mates with the drones (male bees), establishes new colonies, and lays eggs. She lays her eggs in the cells of the hive, and when they hatch they become larvae. Every colony contains just a single queen, who is capable of producing up to 2000 eggs per day. Adult workers tend the larvae inside the cells and feed them with pollen and nectar for around three weeks, after which they become adults and emerge from the sealed cells. Drones, the male bees, are the minority in a colony and serve only a single purpose: to mate with a virgin queen; shortly after mating, the drones die. The infertile female workers, as a rule, neither lay their own eggs nor found new colonies. Young workers tend the larvae by secreting fluid from glands; as workers mature, they become responsible for carrying and storing the food gathered by foragers, and as older adults they forage for food themselves until they die.

Advantage HBMO-LBG produces reconstructed images of superior quality and an improved codebook with smaller distortion compared to the PSO-LBG, QPSO-LBG, and LBG algorithms.

Disadvantage Drones do not survive for long, which may be the cause of its slower speed compared to the Cuckoo Search algorithm; it is also slower than the Firefly algorithm.
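All of the swarm-based methods discussed here, HBMO-LBG included, rank candidate codebooks with a fitness function derived from reconstruction distortion. The sketch below assumes a common choice, mean squared error and the corresponding PSNR; the exact fitness definition varies between papers, so this is an illustrative assumption rather than the authors' formula.

```python
import numpy as np

def mse(original, reconstructed):
    """Mean squared error between the original and the VQ-reconstructed image."""
    return np.mean((original.astype(float) - reconstructed.astype(float)) ** 2)

def psnr(original, reconstructed, peak=255.0):
    """Peak signal-to-noise ratio in dB; higher PSNR means better quality."""
    e = mse(original, reconstructed)
    return np.inf if e == 0 else 10.0 * np.log10(peak ** 2 / e)

def codebook_fitness(original, reconstructed):
    """Fitness used to compare candidate codebooks: here simply the reciprocal
    of the MSE, so that lower distortion gives higher fitness (an assumption
    for illustration only)."""
    return 1.0 / (mse(original, reconstructed) + 1e-12)
```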

2.4 Bat Algorithm—Linde Buzo Gray (BAT-LBG)

The Bat Algorithm (BA) is a nature-inspired metaheuristic developed by Yang in 2010 [6]. There is an intuitive correspondence between the bat algorithm and radio detection and ranging (RADAR): RADAR relies on analysing the signals reflected from objects. Similarly, the essential idea behind the bat algorithm is the echolocation behaviour of microbats, with varying pulse rates of emission and varying loudness. A bat emits a number of sound pulses with varying pulse rate and loudness; these sound signals are reflected back from objects as echo signals. From these echoes, bats can determine the size of an object, the distance to it, and its speed, and can even distinguish its orientation within fractions of a second, thanks to their sophisticated sense of hearing. Frequency tuning, automatic zooming, and parameter control make the bat algorithm efficient, fast, versatile, and simple. A bat-algorithm-based VQ has been proposed for image compression, in which the peak signal-to-noise ratio (PSNR) of VQ is improved by applying BA [7]. The algorithm was assessed by varying every relevant parameter of the bat algorithm for efficient codebook design and effective VQ of the training vectors. Intensification and diversification of the algorithm are controlled through the frequency-tuning and loudness parameters, respectively. It is reported that the bat algorithm's PSNR and the quality of the recovered image are superior to those of LBG, PSO-LBG, QPSO-LBG, HBMO-LBG, and FA-LBG [8].
Advantage BA-LBG has a convergence rate around 1.841 times faster than that of HBMO-LBG and FA-LBG.
Disadvantage The BA-LBG algorithm needs some extra parameters compared with PSO-LBG, QPSO-LBG, and FA-LBG.
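The frequency-tuning, loudness, and pulse-rate mechanics summarized above can be sketched as follows. The parameter values (frequency range, alpha, gamma) and the function signature are illustrative defaults, not the tuned settings of [7].

```python
import numpy as np

def bat_step(positions, velocities, loudness, pulse_rate, pulse_rate0, best,
             fitness, t, f_min=0.0, f_max=2.0, alpha=0.9, gamma=0.9,
             rng=np.random.default_rng()):
    """One iteration of the bat algorithm over a population of candidate
    codebooks (each flattened to a vector). `fitness` is maximised."""
    n, dim = positions.shape
    freq = f_min + (f_max - f_min) * rng.random(n)           # frequency tuning
    velocities = velocities + (positions - best) * freq[:, None]
    candidates = positions + velocities
    # local random walk around the current best for bats with a low pulse rate
    local = best + 0.01 * loudness.mean() * rng.standard_normal((n, dim))
    use_local = rng.random(n) > pulse_rate
    candidates = np.where(use_local[:, None], local, candidates)
    # accept a candidate probabilistically, then shrink loudness / grow pulse rate
    for i in range(n):
        if rng.random() < loudness[i] and fitness(candidates[i]) > fitness(positions[i]):
            positions[i] = candidates[i]
            loudness[i] *= alpha
            pulse_rate[i] = pulse_rate0[i] * (1 - np.exp(-gamma * t))
    best = positions[np.array([fitness(p) for p in positions]).argmax()]
    return positions, velocities, loudness, pulse_rate, best
```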

2.5 Firefly-Linde Buzo Gray Algorithm (FF-LBG)

Horng [8] applied the firefly algorithm (FA) to design a codebook for VQ and gave a consolidated description of how the firefly algorithm is used to perform VQ and improve the performance of the LBG approach. Most of the experimental results showed that the FF-LBG algorithm can raise the quality of the reconstructed images compared with three other procedures, namely the conventional LBG, PSO-LBG, and QPSO-LBG. The firefly algorithm was introduced by Yang in 2008 and is inspired by the flashing behaviour and characteristics of fireflies. The brightness of a firefly corresponds to the value of the objective function: a dimmer firefly (lower fitness value) moves towards a brighter firefly (higher fitness value). Codebooks are treated as the fireflies. The researcher concluded from the experiments that the FF-LBG algorithm is faster than the PSO, QPSO, and HBMO algorithms and that the recovered images have superior quality to those produced by LBG, PSO, and QPSO, although it shows no great superiority over the HBMO algorithm. The essential purpose of a firefly's flash is to act as a signal to attract other fireflies. Xin-She Yang formulated the firefly algorithm with the following assumptions:
1. All fireflies are unisexual, so that any individual firefly can be attracted to every other firefly.

2. Attractiveness is proportional to brightness, and for any two fireflies the less bright one is attracted by (and therefore moves towards) the brighter one; however, the intensity (apparent brightness) decreases as their mutual distance increases.
3. If there are no fireflies brighter than a given firefly, it moves randomly. The brightness is associated with the objective function.

Advantage The experimental results showed that the FF-LBG algorithm is as dependable as the HBMO-LBG algorithm; moreover, FF-LBG needs considerably less computation time and fewer parameters than the HBMO-LBG algorithm.

Disadvantage The FA runs into difficulty when there are no brighter fireflies in the search space.
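A minimal sketch of the attractiveness-based movement rule behind FF-LBG follows; beta0, gamma, and alpha are illustrative defaults rather than the values used in [8], and the function name is an assumption.

```python
import numpy as np

def firefly_step(positions, fitness, beta0=1.0, gamma=1.0, alpha=0.2,
                 rng=np.random.default_rng()):
    """One firefly-algorithm iteration. Brightness of a firefly is its
    fitness value; dimmer fireflies move towards brighter ones."""
    n, dim = positions.shape
    brightness = np.array([fitness(p) for p in positions])
    new_positions = positions.copy()
    for i in range(n):
        for j in range(n):
            if brightness[j] > brightness[i]:            # j is brighter than i
                r2 = np.sum((positions[i] - positions[j]) ** 2)
                beta = beta0 * np.exp(-gamma * r2)       # attractiveness decays with distance
                new_positions[i] += (beta * (positions[j] - positions[i])
                                     + alpha * (rng.random(dim) - 0.5))
    return new_positions
```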

2.6 Kekre’s Fast Codebook Generation Algorithm (KFCG)

The authors [9] proposed an algorithm that works on a sorting approach to generate the codebook, in which the code vectors are obtained by taking the centre (middle) vectors. In this algorithm, the image is divided into blocks, and the blocks are converted into vectors of size k. A matrix T of size M × k is formed, containing the M training vectors of size k obtained from the image; every row of the matrix is an image training vector of length k. The training vectors are sorted with respect to the first component of all the vectors, that is, with respect to the first column of the matrix T, and the sorted matrix is treated as one cluster. The centre vector of the matrix T is picked as a code vector and placed in the codebook, so the size of the codebook is now one. The matrix is then divided into two equal parts, and each part is again sorted with respect to the second component of all the training vectors, that is, with respect to the second column of the matrix T; this gives two clusters, each containing an equal number of training vectors. The centre of each of the two parts is picked and written to the codebook, so the size of the codebook rises to two code vectors, and each part is again divided in half. The four parts thus obtained are sorted with respect to the third column of the matrix T, giving four clusters and hence four code vectors. The above procedure is repeated until a codebook of the required size is obtained. Quick sort is used here, and the results show that this algorithm takes the least time to generate the codebook, since no Euclidean distance computation is required.

Advantage It is a stronger algorithm compared to LBG and has been shown to give the lowest MSE and a high PSNR while requiring only a small amount of computation time.
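The sort-and-split procedure described above can be sketched as follows. This is an illustrative reconstruction of the description in this section (sort each cluster on the next vector component, take the middle vector of each cluster as a code word, then split the cluster in half), not Kekre and Sarode's reference implementation; the function name and the power-of-two codebook size are assumptions.

```python
import numpy as np

def kekre_codebook(training_vectors, codebook_size):
    """Generate a codebook of `codebook_size` code vectors (assumed to be a
    power of two) by repeatedly sorting clusters on successive components,
    splitting each sorted cluster in half, and finally taking the middle
    vector of each cluster as its code word."""
    clusters = [np.asarray(training_vectors)]      # 2-D array: M training vectors of size k
    k = clusters[0].shape[1]
    component = 0
    while len(clusters) < codebook_size:
        next_clusters = []
        for c in clusters:
            c = c[c[:, component].argsort()]       # sort w.r.t. the current component
            half = len(c) // 2
            next_clusters.extend([c[:half], c[half:]])
        clusters = next_clusters
        component = (component + 1) % k            # move to the next component
    # the code word of each cluster is its middle vector after sorting
    return np.array([c[c[:, component].argsort()][len(c) // 2] for c in clusters])
```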

2.7 Cuckoo Search Linde Buzo Gray Algorithm (CS-LBG)

The most common vector quantization (VQ) technique is Linde-Buzo-Gray (LBG), which designs a locally optimal codebook for image compression. FA, PSO, and HBMO were developed to produce near-global codebooks, but their search processes follow a Gaussian distribution. FA runs into difficulty when brighter fireflies are scarce, and PSO suffers from instability in convergence when the particle velocity is very high. Therefore, Cuckoo Search (CS) [10, 11], a metaheuristic optimization algorithm, was proposed; it updates the LBG codebook using a Lévy flight distribution generated with Mantegna's algorithm rather than a Gaussian distribution. CS spends 25% of the convergence time on the local codebook search and 75% of the convergence time on the global codebook search, so it converges towards the global codebook with an appropriate switching probability, and this behaviour is the real strength of CS. In practice, the CS algorithm has been observed to give a high peak signal-to-noise ratio (PSNR) and better fitness values compared to LBG, PSO-LBG, Quantum PSO-LBG, HBMO-LBG, and FA-LBG, at the cost of a high convergence time.
Advantage It is observed that the quality of the reconstructed image obtained with the CS algorithm is superior to that obtained with LBG, PSO-LBG, QPSO-LBG, HBMO-LBG, and FA-LBG.
Disadvantage It was observed that CS-LBG converges around 1.425 times more slowly than HBMO-LBG and FA-LBG. Slow convergence is the critical drawback of the CS-LBG [10] method.
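The Lévy flight step that CS-LBG uses to perturb candidate codebooks, generated with Mantegna's algorithm as mentioned above, can be sketched as follows; the step scale, the value of beta, and the function names are illustrative assumptions rather than the settings of [10].

```python
import numpy as np
from math import gamma

def levy_step(dim, beta=1.5, rng=np.random.default_rng()):
    """Draw one Lévy-distributed step vector using Mantegna's algorithm."""
    sigma_u = ((gamma(1 + beta) * np.sin(np.pi * beta / 2))
               / (gamma((1 + beta) / 2) * beta * 2 ** ((beta - 1) / 2))) ** (1 / beta)
    u = rng.normal(0.0, sigma_u, dim)
    v = rng.normal(0.0, 1.0, dim)
    return u / np.abs(v) ** (1 / beta)

def cuckoo_perturb(codebook, best_codebook, step_scale=0.01,
                   rng=np.random.default_rng()):
    """Generate a new candidate codebook by a Lévy flight around the
    current one, biased towards the best codebook found so far."""
    flat = codebook.ravel()
    step = step_scale * levy_step(flat.size, rng=rng) * (flat - best_codebook.ravel())
    return (flat + step).reshape(codebook.shape)
```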

3 Simulation Results

It is concluded from the above discussion that CS-LBG performs best among all the other algorithms, but it has one drawback: it is very slow. A new method has therefore been proposed that is expected to be faster than CS-LBG.
The LBG algorithm is replaced with KFCG because it is observed, and also reported in the literature, that KFCG is faster than LBG. The simulation has been performed, and it is observed in Fig. 1 that CS-KFCG performs very well in comparison with CS-LBG.
Future work related to these techniques can build on the work done in [12–18].

4 Conclusion

It is observed from this work that CS-KFCG performs very well in comparison with CS-LBG. The main objective of the work is to overcome the disadvantage of CS-LBG, which is achieved by replacing LBG with KFCG and hybridizing CS with KFCG.

Fig. 1 Performance evaluation in terms of PSNR and bitrate for different image compression algorithms

The simulation results shown in the previous section report PSNR with respect to bitrate. CS-KFCG performs better because the higher the PSNR value, the better the quality of the algorithm.

References

1. Linde, Y., Buzo, A., Gray, R.M.: An algorithm for vector quantizer design. IEEE Trans. Commun. COM-28(1), 84–95 (1980)
2. Chen, Q., Yang, J., Gou, J.: Image Compression Method Using Improved PSO Vector Quanti-
zation, pp. 490–495 (2005)
3. Eberhart, R., Kennedy, J.: A new optimizer using particle swarm theory, pp. 39–43 (1995)
4. Wang, Y., Feng, X., Huang, Y., Pu, D., Zhou, W.: A novel quantum swarm evolutionary algo-
rithm and its applications, vol. 70, pp. 633–640 (2007)
5. Karaboga, D., Basturk, B.: A powerful and efficient algorithm for numerical function optimization: artificial bee colony (ABC) algorithm, pp. 459–471 (2007)
6. Yang, X.: A new metaheuristic bat-inspired algorithm, pp. 65–74 (2010)
7. Karri, C., Jena, U.: Fast vector quantization using a Bat algorithm for image compression. Eng. Sci. Technol. Int. J. 19, 769–781 (2016)
8. Horng, M.: Vector quantization using the firefly algorithm for image compression. Expert Syst. Appl. 39(1), 1078–1091 (2012)

9. Kekre, H.B., Sarode, T.K.: An efficient fast algorithm to generate codebook for vector quanti-
zation. In: Proceedings—1st International Conference Emerging Trends in Engineering Tech-
nology. ICETET 2008, pp. 62–67 (2008)
10. Chiranjeevi, K., Jena, U.R.: Image compression based on vector quantization using Cuckoo
search optimization technique. Ain Shams Eng. J., pp. 1–15 (2016)
11. Yang, X.-S., Deb, S.: Engineering optimisation by Cuckoo search. Int. J. Math. Model. Numer.
Optim. 1(4), 330–343 (2010)
12. Gupta, A., Jha, R.K., Gandotra, P., Jain, S.: Bandwidth spoofing and intrusion detection system
for multi stage 5G wireless communication network. IEEE Trans. Veh. Technol., vol. PP(99),
1–1
13. Gupta, A., Goyal, K.: An analytic review on antenna modelling. Int. J. Control Theor. Appl.
9(10), 319–336 (2016)
14. Goyal, K., Gupta, A.: A literature survey on different types of pulse based sensor for acquisition
of pulse. Int. J. Control Theor. Appl. 9(10), 361–365 (2016)
15. Kakalia, S., Singh, M.: Performance analysis of DWDM system having 0.8-Tbps date rate with
80 channels. Indian J. Sci. Technol. 9(47) (2016)
16. Mishra, S., et al.: ISDA based precise orbit determination technique for medium earth orbit
satellites. Pertanika J. Sci. Technol. 25(4) (2017)
17. Fidele, M., et al.: Peak to average power ratio reduction for OFDM system using different peak
windowing and modulation techniques. 3(1) (2016)
18. Jena, B., Singh, M.: Performance Comparison of Fixed and Mobile WiMAX Networks Based
on LTE for UDP traffic. 11.1 (2014)
