Lossless Image Compression for Transmitting Over Low Bandwidth Line
G. Murugan, Research Scholar, Singhania University & Sri Venkateswara College of Engg, Thiruvallur
Dr. E. Kannan, Supervisor, Singhania University and Dean Academic, Veltech University
S. Arun, Asst. Professor, ECE Dept., Veltech High Engg College, Chennai. email: yesarun001@yahoo.com
Abstract
The aim of this paper is to develop an effective lossless compression technique that converts an original image into a compressed one without changing the clarity of the original image. Lossless image compression is a class of image compression algorithms that allows the exact original image to be reconstructed from the compressed data. We present a compression technique that provides progressive transmission as well as lossless and near-lossless compression in a single framework. The proposed technique produces a bit stream that results in a progressive and ultimately lossless reconstruction of an image, similar to what one can obtain with a reversible wavelet codec. In addition, the proposed scheme provides near-lossless reconstruction with respect to a given bound after decoding each layer of the successively refinable bit stream. We formulate the image data compression problem as one of successively refining the probability density function (pdf) estimate of each pixel. Experimental results for both the lossless and near-lossless cases indicate that the proposed compression scheme, which innovatively combines lossless, near-lossless and progressive coding attributes, gives competitive performance in comparison to state-of-the-art compression schemes.
1. INTRODUCTION
Lossless or reversible compression refers to compression techniques in which the reconstructed data exactly matches the original. Near-lossless compression denotes compression methods which give quantitative bounds on the nature of the loss that is introduced. Such compression techniques provide the guarantee that no pixel difference between the original and the compressed image is above a given value [1]. Both lossless and near-lossless compression find potential applications in remote sensing, medical and space imaging, and multispectral image archiving. In these applications the volume of the data would call for lossy compression for practical storage or transmission. However, the necessity to preserve the validity and precision of data for subsequent reconnaissance or diagnosis operations, forensic analysis, and scientific or clinical measurements often imposes strict constraints on the reconstruction error. In such situations near-lossless compression becomes a viable solution: on the one hand, it provides significantly higher compression gains vis-à-vis lossless algorithms, and on the other hand it provides guaranteed bounds on the nature of the loss introduced by compression.

Another way to deal with the lossy-lossless dilemma faced in applications such as medical imaging and remote sensing is to use a successively refinable compression technique that produces a bit stream leading to a progressive reconstruction of the image. Using wavelets, for example, one can obtain an embedded bit stream from which various levels of rate and distortion can be obtained. In fact, with reversible integer wavelets, one gets a progressive reconstruction capability all the way to lossless recovery of the original.
Such techniques have been explored for potential use in tele-radiology, where a physician typically requests portions of an image at increased quality (including lossless reconstruction) while accepting initial renderings and unimportant portions at lower quality, thus reducing the overall bandwidth requirements. In fact, the new still image compression standard, JPEG 2000, provides such features in its extended form [2].

In this paper, we present a compression technique that incorporates the above two desirable characteristics, namely, near-lossless compression and progressive refinement from lossy to lossless reconstruction. In other words, the proposed technique produces a bit stream that results in a progressive reconstruction of the image similar to what one can obtain with a reversible wavelet codec. In addition, our scheme provides near-lossless (and lossless) reconstruction with respect to a given bound after each layer of the successively refinable bit stream is decoded. Note, however, that these bounds need to be set at compression time and cannot be changed during decompression. The compression performance provided by the proposed technique is comparable to the best-known lossless and near-lossless techniques proposed in the literature. It should be noted that, to the best knowledge of the authors, this is the first technique reported in the literature that provides lossless and near-lossless compression as well as progressive reconstruction all in a single framework.
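The near-lossless guarantee described above, that no pixel differs from the original by more than a given bound, can be stated concretely. The following is a minimal Python sketch (illustrative only, not the authors' codec); the pixel values are hypothetical:

```python
def is_near_lossless(original, reconstructed, bound):
    """Check that no pixel differs from the original by more than `bound`.

    A bound of 0 corresponds to fully lossless reconstruction.
    """
    return all(abs(o - r) <= bound for o, r in zip(original, reconstructed))

# Hypothetical 8-bit pixel values: the reconstruction differs by at most 2.
orig = [120, 121, 119, 200, 201]
recon = [121, 121, 117, 200, 203]
print(is_near_lossless(orig, recon, 2))  # bound of 2 holds
print(is_near_lossless(orig, recon, 1))  # bound of 1 is violated
```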
 
2. METHODOLOGY
(IJCSIS) International Journal of Computer Science and Information Security, Vol. 9, No. 9, September 2011. http://sites.google.com/site/ijcsis/ ISSN 1947-5500
 
2.1 COMPRESSION TECHNIQUES
 
LOSSLESS COMPRESSION
Data is compressed and can be reconstituted (uncompressed) without loss of detail or information. Such systems are also referred to as bit-preserving or reversible compression systems [11].
 
LOSSY COMPRESSION
The aim is to obtain the best possible fidelity for a given bit-rate, or to minimize the bit-rate needed to achieve a given fidelity measure. Video and audio compression techniques are most suited to this form of compression [12].
 
- If an image is compressed, it clearly needs to be uncompressed (decoded) before it can be viewed or listened to. Some processing of data may, however, be possible in encoded form.
- Lossless compression frequently involves some form of entropy encoding and is based on information-theoretic techniques.
- Lossy compression uses source encoding techniques that may involve transform encoding, differential encoding or vector quantisation.

Image compression may be lossy or lossless. Lossless compression is preferred for archival purposes and often for medical imaging, technical drawings, clip art, or comics. This is because lossy compression methods, especially when used at low bit rates, introduce compression artifacts. Lossy methods are especially suitable for natural images such as photographs in applications where minor (sometimes imperceptible) loss of fidelity is acceptable to achieve a substantial reduction in bit rate. Lossy compression that produces imperceptible differences may be called visually lossless.
2.2 METHODS FOR LOSSLESS IMAGE COMPRESSION
 
- Run-length encoding – used as the default method in PCX and as one of the possible methods in BMP, TGA and TIFF
- DPCM and predictive coding
- Entropy encoding
- Adaptive dictionary algorithms such as LZW – used in GIF and TIFF
- Deflation – used in PNG, MNG and TIFF
- Chain codes
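As an illustration of the first method above, run-length encoding replaces runs of identical pixels with (value, count) pairs. The following is a minimal Python sketch, not tied to any particular file format:

```python
def rle_encode(pixels):
    """Encode a pixel sequence as (value, run_length) pairs."""
    runs = []
    for p in pixels:
        if runs and runs[-1][0] == p:
            runs[-1][1] += 1          # extend the current run
        else:
            runs.append([p, 1])       # start a new run
    return [(value, length) for value, length in runs]

def rle_decode(runs):
    """Expand (value, run_length) pairs back into the pixel sequence."""
    return [value for value, length in runs for _ in range(length)]

row = [255, 255, 255, 0, 0, 255]
encoded = rle_encode(row)             # [(255, 3), (0, 2), (255, 1)]
assert rle_decode(encoded) == row     # lossless round trip
```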
2.3 METHODS FOR LOSSY COMPRESSION
 
- Reducing the color space to the most common colors in the image. The selected colors are specified in the color palette in the header of the compressed image; each pixel just references the index of a color in the palette. This method can be combined with dithering to avoid posterization.
- Chroma subsampling. This takes advantage of the fact that the human eye perceives spatial changes of brightness more sharply than those of color, by averaging or dropping some of the chrominance information in the image.
- Transform coding. This is the most commonly used method. A Fourier-related transform such as the DCT or the wavelet transform is applied, followed by quantization and entropy coding.
- Fractal compression.
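The chroma subsampling idea above can be sketched by averaging each 2x2 block of a chroma plane, as in 4:2:0 subsampling. A minimal Python example; the plane values are hypothetical:

```python
def subsample_420(chroma):
    """Average each 2x2 block of a chroma plane (4:2:0-style subsampling).

    `chroma` is a list of rows with even width and height; the result has
    one quarter as many samples, exploiting the eye's lower color acuity.
    """
    out = []
    for y in range(0, len(chroma), 2):
        row = []
        for x in range(0, len(chroma[0]), 2):
            total = (chroma[y][x] + chroma[y][x + 1] +
                     chroma[y + 1][x] + chroma[y + 1][x + 1])
            row.append(total // 4)    # average of the 2x2 block
        out.append(row)
    return out

plane = [[100, 102, 50, 52],
         [ 98, 100, 48, 50]]
print(subsample_420(plane))  # [[100, 50]] -- one quarter of the samples
```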
2.4 COMPRESSION

Compression is the process of coding that effectively reduces the total number of bits needed to represent certain information.
Fig. 1. A general data compression scheme: input → encoder (compression) → storage or networks → decoder (decompression) → output.
Fig. 2. Lossy image compression result.
Fig. 3. Lossless image comparison ratio.
 
 
Fig. 4. Lossy and lossless comparison ratio.

3. HUFFMAN CODING
Huffman coding is based on the frequency of occurrence of a data item (a pixel, in images). The principle is to use a smaller number of bits to encode the data that occurs more frequently. Codes are stored in a code book, which may be constructed for each image or for a set of images. In all cases the code book plus the encoded data must be transmitted to enable decoding.
The Huffman algorithm, a bottom-up approach, is briefly summarised:

1. Initialization: put all nodes in an OPEN list and keep it sorted at all times (e.g., ABCDE).
2. Repeat until the OPEN list has only one node left:
(a) From OPEN, pick the two nodes having the lowest frequencies/probabilities and create a parent node for them.
(b) Assign the sum of the children's frequencies/probabilities to the parent node and insert it into OPEN.
(c) Assign codes 0 and 1 to the two branches of the tree, and delete the children from OPEN.

The following points are worth noting about the above algorithm. Decoding is trivial as long as the coding table (the statistics) is sent before the data; there is a small bit overhead for sending this, negligible if the data file is big. The codes have the unique prefix property: no code is a prefix of any other code (all symbols are at the leaf nodes), which makes decoding unambiguous. If prior statistics are available and accurate, then Huffman coding is very good.
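The OPEN-list procedure above can be sketched in Python using a heap as the sorted OPEN list; each node carries its partial code book, and merging two nodes prefixes their codes with 0 and 1. This is an illustrative sketch (it assumes at least two symbols), and the example frequencies are hypothetical:

```python
import heapq
from itertools import count

def huffman_codes(freqs):
    """Build a Huffman code book from a symbol -> frequency map.

    The heap plays the role of the sorted OPEN list: repeatedly pop the
    two lowest-frequency nodes, merge them under a parent whose branches
    contribute the bits 0 and 1, and push the parent back.
    """
    tie = count()  # tie-breaker so the heap never compares code books
    open_list = [(f, next(tie), {s: ""}) for s, f in freqs.items()]
    heapq.heapify(open_list)
    while len(open_list) > 1:
        f1, _, c1 = heapq.heappop(open_list)   # two lowest frequencies
        f2, _, c2 = heapq.heappop(open_list)
        merged = {s: "0" + code for s, code in c1.items()}
        merged.update({s: "1" + code for s, code in c2.items()})
        heapq.heappush(open_list, (f1 + f2, next(tie), merged))
    return open_list[0][2]

# Hypothetical frequencies: the most frequent symbol, A, gets the
# shortest code, and no code is a prefix of any other.
codes = huffman_codes({"A": 45, "B": 13, "C": 12, "D": 16, "E": 9})
assert len(codes["A"]) < len(codes["E"])
```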
3.1 HUFFMAN CODING OF IMAGES
 
In order to encode images:

- Divide the image into 8x8 blocks
- Each block is a symbol to be coded
- Compute Huffman codes for the set of blocks
- Encode the blocks accordingly
3.2 HUFFMAN CODING ALGORITHM

- No Huffman code is the prefix of any other Huffman code, so decoding is unambiguous
- The Huffman coding technique is optimal (but we must know the probabilities of each symbol for this to be true)
- Symbols that occur more frequently have shorter Huffman codes
4. LEMPEL-ZIV-WELCH (LZW) ALGORITHM

THE LZW COMPRESSION ALGORITHM CAN BE SUMMARISED AS FOLLOWS
w = NIL;
while ( read a character k )
{
    if wk exists in the dictionary
        w = wk;
    else
    {
        add wk to the dictionary;
        output the code for w;
        w = k;
    }
}
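The pseudocode above can be rendered as a runnable Python sketch. It assumes an initial dictionary of all 256 single-byte strings, which the pseudocode leaves implicit:

```python
def lzw_compress(text):
    """LZW compression following the pseudocode above: grow the current
    string w while w+k is in the dictionary; otherwise emit the code for
    w, add w+k as a new dictionary entry, and restart from k."""
    dictionary = {chr(i): i for i in range(256)}  # all single bytes
    next_code = 256
    w = ""
    output = []
    for k in text:
        wk = w + k
        if wk in dictionary:
            w = wk
        else:
            output.append(dictionary[w])
            dictionary[wk] = next_code
            next_code += 1
            w = k
    if w:
        output.append(dictionary[w])  # flush the final string
    return output

print(lzw_compress("ABABABA"))  # [65, 66, 256, 258]
```

Note how the repeated "AB" and "ABA" substrings are each emitted as a single code once they enter the dictionary.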
THE LZW DECOMPRESSION ALGORITHM IS ASFOLLOWS
read a character k;
output k;
w = k;
while ( read a character k )
{
    entry = dictionary entry for k;
    output entry;
    add w + first character of entry to the dictionary;
    w = entry;
}
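A matching Python sketch of the decompressor follows. It additionally handles the well-known corner case in which a code is emitted before its dictionary entry is complete, which the simple pseudocode omits:

```python
def lzw_decompress(codes):
    """Inverse of LZW compression: rebuild the dictionary on the fly.

    When a code refers to an entry still under construction (the cScSc
    case), the entry must be the previous string plus its own first
    character.
    """
    dictionary = {i: chr(i) for i in range(256)}  # all single bytes
    next_code = 256
    w = dictionary[codes[0]]
    output = [w]
    for code in codes[1:]:
        if code in dictionary:
            entry = dictionary[code]
        else:
            entry = w + w[0]          # code not yet complete: cScSc case
        output.append(entry)
        dictionary[next_code] = w + entry[0]
        next_code += 1
        w = entry
    return "".join(output)

# Round trip against the compression example for "ABABABA":
print(lzw_decompress([65, 66, 256, 258]))  # ABABABA
```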
