
This page contains information related to vector quantization (VQ). Currently this page covers VQ as it applies to compression. In the future, we will make this page a more comprehensive VQ page.

**In what applications is VQ used?**

Vector quantization is used in many applications such as image and voice compression, voice recognition (in general, statistical pattern recognition), and surprisingly enough in volume rendering (I have no idea how VQ is used in volume rendering!).

**What is VQ?**

A vector quantizer maps k-dimensional vectors in the vector space R^k into a finite set of vectors Y = {y_i : i = 1, 2, ..., N}. Each vector y_i is called a code vector or a codeword, and the set of all the codewords is called a codebook. Associated with each codeword, y_i, is a nearest-neighbor region called a Voronoi region, and it is defined by:

V_i = \{ x \in \mathbb{R}^k : \|x - y_i\| \le \|x - y_j\|, \text{ for all } j \ne i \}
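The nearest-neighbor rule that defines the Voronoi regions can be sketched in a few lines of Python (a minimal illustration; the function and variable names are my own, not from any VQ library):

```python
import math

def nearest_codeword(x, codebook):
    """Return the index of the codeword whose Voronoi region contains x,
    i.e. the codeword closest to x in Euclidean distance."""
    best_i, best_d = 0, float("inf")
    for i, y in enumerate(codebook):
        d = math.dist(x, y)  # Euclidean distance in R^k
        if d < best_d:
            best_i, best_d = i, d
    return best_i

# Two codewords in R^2; the boundary between their Voronoi regions is the
# perpendicular bisector of the segment joining them (the line x = 0.5).
codebook = [(0.0, 0.0), (1.0, 0.0)]
print(nearest_codeword((0.2, 0.3), codebook))   # 0 (left region)
print(nearest_codeword((0.9, -0.1), codebook))  # 1 (right region)
```

Points exactly on a boundary are ties; this sketch breaks them in favor of the lower index.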

The set of Voronoi regions partitions the entire space R^k such that:

\bigcup_{i=1}^{N} V_i = \mathbb{R}^k, \qquad V_i \cap V_j = \emptyset \text{ for all } i \ne j

As an example we take vectors in the two-dimensional case without loss of generality. Figure 1 shows some vectors in space. Associated with each cluster of vectors is a representative codeword. Each codeword resides in its own Voronoi region. These regions are separated with imaginary lines in Figure 1 for illustration. Given an input vector, the codeword that is chosen to represent it is the one in the same Voronoi region.

Figure 1: Codewords in 2-dimensional space. Input vectors are marked with an x, codewords are marked with red circles, and the Voronoi regions are separated with boundary lines.

**How does VQ work in compression?**

A vector quantizer is composed of two operations. The first is the encoder, and the second is the decoder. The encoder takes an input vector and outputs the index of the codeword that offers the lowest distortion. In this case the lowest distortion is found by evaluating the Euclidean distance between the input vector and each codeword in the codebook. Once the closest codeword is found, the index of that codeword is sent through a channel (the channel could be computer storage, a communications channel, and so on). When the decoder receives the index of the codeword, it replaces the index with the associated codeword. Figure 2 shows a block diagram of the operation of the encoder and decoder.

The Euclidean distance is defined by:

d(x, y_i) = \sqrt{ \sum_{j=1}^{k} (x_j - y_{ij})^2 }

where x_j is the jth component of the input vector, and y_{ij} is the jth component of the codeword y_i.
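The encoder/decoder pair described above can be sketched as follows (a simplified illustration assuming a fixed, already-designed codebook; the names are mine):

```python
import math

def encode(x, codebook):
    """Encoder: map input vector x to the index of the minimum-distortion codeword."""
    return min(range(len(codebook)), key=lambda i: math.dist(x, codebook[i]))

def decode(index, codebook):
    """Decoder: replace a received index with its associated codeword."""
    return codebook[index]

codebook = [(0.0, 0.0), (4.0, 4.0), (8.0, 0.0)]
x = (3.5, 4.2)
i = encode(x, codebook)    # only this index travels over the channel
x_hat = decode(i, codebook)
print(i, x_hat)            # 1 (4.0, 4.0)
```

Note that only the index is transmitted: with N codewords, each input vector costs log2(N) bits regardless of its dimension, which is where the compression comes from.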

Figure 2: The encoder and decoder in a vector quantizer. Given an input vector, the closest codeword is found and the index of the codeword is sent through the channel. The decoder receives the index of the codeword, and outputs the codeword.

**How is the codebook designed?**

So far we have talked about the way VQ works, but we haven't talked about how to generate the codebook. What codewords best represent a given set of input vectors? How many should be chosen? Unfortunately, designing a codebook that best represents the set of input vectors is NP-hard. That means that it requires an exhaustive search for the best possible codewords in space, and the search increases exponentially as the number of codewords increases (if you can find an optimal solution in polynomial time your name will go down in history forever). We therefore resort to suboptimal codebook design schemes, and the first one that comes to mind is the simplest. It is named LBG for Linde-Buzo-Gray, the authors of this idea. This algorithm is similar to the k-means algorithm.

The algorithm:

1. Determine the number of codewords, N, or the size of the codebook.
2. Select N codewords at random, and let that be the initial codebook. The initial codewords can be randomly chosen from the set of input vectors.
3. Using the Euclidean distance measure, clusterize the vectors around each codeword. This is done by taking each input vector and finding the Euclidean distance between it and each codeword. The input vector belongs to the cluster of the codeword that yields the minimum distance.
4. Compute the new set of codewords. This is done by obtaining the average of each cluster. Add the components of each vector and divide by the number of vectors in the cluster:

y_i = \frac{1}{m} \sum_{j=1}^{m} x_j

where i is the component of each vector (x, y, z, ... directions), and m is the number of vectors in the cluster.

5. Repeat steps 3 and 4 until either the codewords don't change or the change in the codewords is small.

This algorithm is by far the most popular, and that is due to its simplicity. Although it is locally optimal, it is very slow. The reason it is slow is that for each iteration, determining each cluster requires that each input vector be compared with all the codewords in the codebook. (We have programmed this algorithm in C, and for a 512×512 image, a codebook of 256, and vectors in 16 dimensions, the generation of the codebook took about 20 minutes on an HP machine.)

There are many other methods of designing the codebook, such as Pairwise Nearest Neighbor (PNN), Simulated Annealing, Maximum Descent (MD), and Frequency-Sensitive Competitive Learning (FSCL).

**How does the search engine work?**

Although VQ offers more compression for the same distortion rate than scalar quantization and PCM, it is not as widely implemented. This is due to two things: the first is the time it takes to generate the codebook, and the second is the speed of the search. Many algorithms have been proposed to increase the speed of the search. Some of them reduce the math used to determine the codeword that offers the minimum distortion; other algorithms preprocess the codewords and exploit the underlying structure.

The simplest search method, which is also the slowest, is full search. In full search an input vector is compared with every codeword in the codebook. If there were M input vectors, N codewords, and each vector is in k dimensions, then the number of multiplies becomes kMN, the number of additions and subtractions becomes MN((k − 1) + k) = MN(2k − 1), and the number of comparisons becomes MN(k − 1). This makes full search an expensive method.

**What is the measure of performance of VQ?**

How does one rate the performance of a compressed image or sound using VQ? There is no good way to measure the performance of VQ. This is because the distortion that VQ incurs will be evaluated by us humans, and that is a subjective measure. Don't despair! We can always resort to good old Mean Squared Error (MSE) and Peak Signal to Noise Ratio (PSNR). MSE is defined as follows:

MSE = \frac{1}{M} \sum_{j=1}^{M} (x_j - \hat{x}_j)^2

where M is the number of elements in the signal, or image.
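The LBG steps above can be sketched in Python (a simplified illustration, not the authors' C implementation; how empty clusters are handled is not specified in the text, so this sketch simply keeps their old codeword):

```python
import math
import random

def lbg(vectors, n_codewords, tol=1e-6, seed=0):
    """Codebook design per the steps above: random initial codebook (steps 1-2),
    then alternate clustering (step 3) and centroid update (step 4) until the
    codewords stop changing (step 5)."""
    rng = random.Random(seed)
    codebook = rng.sample(vectors, n_codewords)            # steps 1-2
    while True:
        clusters = [[] for _ in codebook]                  # step 3: clusterize
        for x in vectors:
            i = min(range(len(codebook)),
                    key=lambda i: math.dist(x, codebook[i]))
            clusters[i].append(x)
        new_codebook = []
        for cluster, old in zip(clusters, codebook):       # step 4: cluster averages
            if cluster:
                k = len(cluster[0])
                new_codebook.append(tuple(
                    sum(x[j] for x in cluster) / len(cluster) for j in range(k)))
            else:
                new_codebook.append(old)                   # empty cell: keep as-is
        change = max(math.dist(a, b) for a, b in zip(codebook, new_codebook))
        codebook = new_codebook
        if change < tol:                                   # step 5: stop when stable
            return codebook

# Two obvious clusters in R^2; LBG should recover their means.
vectors = [(0.0, 0.0), (0.0, 1.0), (1.0, 0.0),
           (10.0, 10.0), (10.0, 11.0), (11.0, 10.0)]
print(sorted(lbg(vectors, 2)))
```

On this toy data the algorithm converges to the two cluster means regardless of which pair of vectors is drawn as the initial codebook.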

For example, if we wanted to find the MSE between the reconstructed and the original image, we would take the difference between the two images pixel by pixel, square the results, and average the results.

The PSNR is defined as follows:

PSNR = 10 \log_{10} \frac{(2^n - 1)^2}{MSE}

where n is the number of bits per symbol. As an example, if we want to find the PSNR between two 256-gray-level images, then we set n to 8 bits.

**Some sites with VQ**

Although there are many web pages on VQ, most of these sites don't go into the detail of their work. The majority of these sites describe new ways of implementing VQ or some of its applications. I have therefore limited the links to those sites that contain useful information or tools.

• For a really brief description of VQ you can visit this FAQ sheet.
• Robert Gray teaches at Stanford University, and within his class on Quantization and Data Compression he devotes a topic to vector quantization. This is an Acrobat file of slides.
• Vivek Goyal has some interesting papers on VQ. Check out his publications page.
• Jim Fowler was a graduate at The Ohio State University. He also worked with video coding using VQ. Now he is teaching at the Department of Electrical & Computer Engineering at Mississippi State University. He has developed a wonderful package, QccPack, for quantization, data compression and coding that includes VQ tools. He is a definite VQer.
• Predictive Residual Vector Quantization (PRVQ) CODEC.
• Light Field Compression using Wavelet Transform and Vector Quantization.
• Possibilistic Clustering in Kohonen Networks for Vector Quantization.
• Dynamic Learning Vector Quantization (DLVQ), VQ using neural nets.
• Another Neural Network based VQ with code.

**VQers**

This is a list of people, in alphabetical order by last name, who constantly work on VQ. I call them VQers. They are a must know! If you think you are a VQer, then send me an email, and I will include your name in the list.
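Both measures follow directly from their definitions; here is a small Python sketch (illustrative names and toy pixel values of my own):

```python
import math

def mse(original, reconstructed):
    """Mean squared error: average of squared element-wise differences."""
    m = len(original)
    return sum((a - b) ** 2 for a, b in zip(original, reconstructed)) / m

def psnr(original, reconstructed, n_bits=8):
    """Peak signal-to-noise ratio in dB for n-bit symbols (peak = 2^n - 1)."""
    peak = (2 ** n_bits - 1) ** 2
    return 10 * math.log10(peak / mse(original, reconstructed))

# Four pixels of an "original" and a "reconstructed" 8-bit image.
orig  = [52, 55, 61, 59]
recon = [54, 55, 60, 58]
print(round(mse(orig, recon), 2))   # 1.5
print(round(psnr(orig, recon), 2))
```

For 256-gray-level images n = 8, so the peak value in the PSNR numerator is 255.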

• Stanley Ahalt
• Jim Fowler
• Allen Gersho
• Robert M. Gray
• Batuhan Ulug
