RADHIKA.K.N, PG Scholar, ECE Department, Vivekanandha Institute of Engg. & Tech. for Women, Tiruchengode
RAMESH.M, Asst. Professor, ECE Department, Vivekanandha Institute of Engg. & Tech. for Women, Tiruchengode
All Rights Reserved © 2015 IJARTET
ISSN 2394-3777 (Print)
ISSN 2394-3785 (Online)
Available online at www.ijartet.com
International Journal of Advanced Research Trends in Engineering and Technology (IJARTET)
Vol. II, Special Issue XIX, March 2015
Lossy compression is the class of data-encoding methods that use inexact approximations or partial data discarding to represent the content that has been encoded. Such compression techniques are used to reduce the amount of data that would otherwise be needed to store, handle, and/or transmit the represented content. Lossy compression is most commonly used to compress multimedia data (audio, video, and still images), especially in applications such as streaming media and internet telephony.

Fig. 1. Block Diagram for Adaptive Orientation Model
The DCT-based encoder can be thought of as compression of a stream of 8 × 8 small blocks of the image. This transform has been adopted in JPEG [5]. The JPEG compression scheme has many advantages such as simplicity, universality and availability. However, it performs badly at low bit-rates, mainly because of the underlying block-based DCT scheme. For this reason, as early as 1995, the JPEG committee began to develop a new wavelet-based compression standard for still images, namely JPEG 2000 [6], [7].
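The block-based DCT scheme described above can be illustrated with a toy round-trip sketch (not the JPEG codec itself): each 8 × 8 block is transformed with the 2-D DCT, its coefficients are coarsely quantized, and the inverse transform reconstructs an approximation. The quantization step `q` is an illustrative value, not one from the standard.

```python
import numpy as np

def dct_matrix(n=8):
    # Orthonormal DCT-II basis matrix: C @ block @ C.T gives the 2-D DCT.
    k = np.arange(n)
    C = np.sqrt(2.0 / n) * np.cos(np.pi * (2 * k[None, :] + 1) * k[:, None] / (2 * n))
    C[0, :] = np.sqrt(1.0 / n)
    return C

def block_dct_roundtrip(block, q=32):
    # Forward 2-D DCT, coarse uniform quantization, then inverse DCT.
    C = dct_matrix(block.shape[0])
    coeffs = C @ block @ C.T
    quantized = np.round(coeffs / q) * q      # this step is where information is lost
    return C.T @ quantized @ C

block = np.arange(64, dtype=float).reshape(8, 8)  # toy 8x8 "image" block
recon = block_dct_roundtrip(block, q=32)
err = np.abs(block - recon).max()
```

Because the transform is applied to each block independently, coarse quantization at low bit-rates produces discontinuities at block boundaries, which is exactly the artifact that motivated the wavelet-based JPEG 2000.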
The DWT-based algorithms include three steps: a DWT computation of the normalized image, quantization of the DWT coefficients, and lossless coding of the quantized coefficients. The details can be found in [8].

The above algorithms are for general image compression. Targeted at fingerprint images, there are special compression algorithms. The most common is Wavelet Scalar Quantization (WSQ). It became the FBI standard for the compression of 500 dpi fingerprint images [8].
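The three DWT steps can be sketched roughly as follows, using a hand-rolled one-level Haar transform in place of a full wavelet filter bank; the normalization, the quantization step size, and the use of zlib as the lossless coder are all simplifying assumptions, not the WSQ or JPEG 2000 choices.

```python
import zlib
import numpy as np

def haar_dwt2(img):
    # One level of the 2-D Haar wavelet transform (even dimensions assumed).
    lo = (img[0::2, :] + img[1::2, :]) / 2.0   # vertical average
    hi = (img[0::2, :] - img[1::2, :]) / 2.0   # vertical detail
    ll = (lo[:, 0::2] + lo[:, 1::2]) / 2.0     # approximation band
    lh = (lo[:, 0::2] - lo[:, 1::2]) / 2.0
    hl = (hi[:, 0::2] + hi[:, 1::2]) / 2.0
    hh = (hi[:, 0::2] - hi[:, 1::2]) / 2.0
    return ll, lh, hl, hh

def dwt_compress(img, step=0.02):
    norm = img.astype(float) / 255.0           # step 1: normalize the image
    bands = haar_dwt2(norm)                    # step 2: DWT computation
    quantized = [np.round(b / step).astype(np.int16) for b in bands]  # quantize
    # step 3: lossless coding of the quantized coefficients
    return zlib.compress(np.concatenate([q.ravel() for q in quantized]).tobytes())
```

Only the quantization is lossy; the final coding stage is fully reversible, which is why the overall quality is controlled entirely by the quantization step.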
II. PROPOSED METHOD

In this section, we give the details about how to use the adaptive orientation model to find the PSNR values and also compress fingerprint images. This part includes construction of the dictionary, compression of a given fingerprint, quantization and coding, and analysis of the algorithm complexity.

2.1 Image de-noising
Image de-noising is used to improve the image quality by removing noise. The configuration of parallel ridges and furrows with well-defined frequency and orientation in a fingerprint image provides useful information which helps in removing undesired noise.

2.2 Image separation
Image separation is a more complicated case of image de-noising where more than one image is to be recovered from a single observation. All the related studies have demonstrated the advantages that adaptive dictionary learning can have on the separation of complex texture patterns from natural images. The minimization of image separation starts with extracting and rearranging all the patches of coefficient x. The patches are then processed by K-SVD, which updates and estimates sparse coefficients.
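The sparse-coefficient estimation step inside K-SVD is typically carried out with a pursuit algorithm. Below is a minimal Orthogonal Matching Pursuit sketch, not the paper's implementation; the toy dictionary and the sparsity level `k` are illustrative assumptions.

```python
import numpy as np

def omp(D, y, k=2):
    # Orthogonal Matching Pursuit: greedily pick k atoms (columns) of
    # dictionary D that best explain signal y, refitting at each step.
    residual, support = y.astype(float), []
    for _ in range(k):
        support.append(int(np.argmax(np.abs(D.T @ residual))))
        sub = D[:, support]
        coef, *_ = np.linalg.lstsq(sub, y, rcond=None)  # least-squares refit
        residual = y - sub @ coef
    x = np.zeros(D.shape[1])
    x[support] = coef
    return x
```

K-SVD alternates this sparse-coding step over all patches with an atom-by-atom update of the dictionary itself.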
2.3 Compressed image
In a compressed image produced by lossless compression, the exact original image can be reconstructed from the compressed data; no information is lost. The proposed method also requires knowledge about the noise power. Furthermore, incorporating the noise power in the solving process has an important advantage in the dictionary update stage: it ensures that the dictionary does not learn the existing noise in the patches. Consequently, the estimated image using this dictionary becomes cleaner, which in turn refines the dictionary atoms in the next iteration. This progressive de-noising loop is repeated until a clean image is achieved.

The scale (or weight) term is adaptively chosen according to the background noise level: it should take larger values in regions with much structured noise and be relatively small in fingerprint regions. Segmentation is then done to find out the directional term, the orientation field is estimated, and finally the PSNR is measured.
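The PSNR measurement in the last step can be sketched with a minimal numpy function; the 8-bit peak value of 255 is an assumption for grayscale fingerprint images.

```python
import numpy as np

def psnr(reference, estimate, peak=255.0):
    # Peak signal-to-noise ratio in dB between a reference image and its estimate.
    mse = np.mean((reference.astype(float) - estimate.astype(float)) ** 2)
    if mse == 0:
        return float("inf")                   # identical images
    return 10.0 * np.log10(peak ** 2 / mse)

clean = np.full((16, 16), 128, dtype=np.uint8)
noisy = clean.copy()
noisy[0, 0] = 138                             # a single corrupted pixel
value = psnr(clean, noisy)
```

Higher PSNR means the estimate is closer to the reference, which is how the DCT and global-based results are compared in Section V.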
The minutiae points are then post-processed as follows:
i. All the minutiae points adjacent to each other and within a pre-specified window are removed.
ii. All the minutiae points near the border and within a certain fixed distance from it are removed.
iii. If two ridge endings are encountered close to each other and no ridges pass through them, then they are reconnected.
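Rules i and ii above can be sketched as a small filter; rule iii needs the ridge map and is omitted. The `window` and `border` thresholds are illustrative assumptions, since the text only says "pre-specified" and "fixed".

```python
import numpy as np

def clean_minutiae(points, shape, window=8, border=10):
    # points: (N, 2) array of (row, col) minutiae coordinates.
    points = np.asarray(points, dtype=float)
    keep = np.ones(len(points), dtype=bool)
    # Rule i: drop every pair of minutiae closer than `window` pixels.
    for i in range(len(points)):
        for j in range(i + 1, len(points)):
            if np.linalg.norm(points[i] - points[j]) < window:
                keep[i] = keep[j] = False     # both members of a close pair go
    # Rule ii: drop points within `border` pixels of the image edge.
    rows, cols = points[:, 0], points[:, 1]
    h, w = shape
    keep &= (rows >= border) & (rows < h - border)
    keep &= (cols >= border) & (cols < w - border)
    return points[keep]
```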
V. COMPARISON FOR EXISTING METHODS

DCT: 29.0714 dB
GLOBAL BASED: 29.5687 dB

In the existing methods, the PSNR values were not good: the DCT PSNR value is 29.0714 dB and the global-based PSNR value is 29.5687 dB. Compared with both, our algorithm is better; it finds better PSNR values and also finds a clean fingerprint image.

VI. SIMULATION RESULTS

Fig. 2. Clean Image by Adaptive Dictionary and PSNR Values
Fig. 3. Dictionary Trained on Patches from Compressed Image
VII. CONCLUSION

A new compression algorithm adapted to fingerprint images gives better PSNR values, clean fingerprint verification and efficient output. It is mainly used for secure purposes all over the world; a fingerprint gives a unique identification. Using this method also avoids the overlapping problem.

REFERENCES

[1] N. Ahmed, T. Natarajan, and K. R. Rao, "Discrete cosine transform," IEEE Trans. Comput., vol. C-23, no. 1, pp. 90–93, Jan. 1974.
[2] C. M. Brislawn, J. N. Bradley, R. J. Onyshczak, and T. Hopper, "FBI compression standard for digitized fingerprint images," Proc. SPIE, vol. 2847, pp. 344–355, Aug. 1996.
[3] C. S. Burrus, R. A. Gopinath, and H. Guo, Introduction to Wavelets and Wavelet Transforms: A Primer. Upper Saddle River, NJ, USA: Prentice-Hall, 1998.
[4] G. Shao, Y. Wu, Y. A, X. Liu, and T. Guo, "Fingerprint compression based on sparse representation," IEEE Trans. Image Process., vol. 23, no. 2, Feb. 2014.
[5] D. Maltoni, D. Maio, A. K. Jain, and S. Prabhakar, Handbook of Fingerprint Recognition, 2nd ed. London, U.K.: Springer-Verlag, 2009.
[6] M. W. Marcellin, M. J. Gormish, A. Bilgin, and M. P. Boliek, "An overview of JPEG-2000," in Proc. IEEE Data Compress. Conf., Mar. 2000, pp. 523–541.
[7] W. Pennebaker and J. Mitchell, JPEG—Still Image Compression Standard. New York, NY, USA: Van Nostrand Reinhold, 1993.
[8] S. Prabhakar, "Fingerprint classification and matching using a filterbank," Ph.D. dissertation, Dept. Comput. Sci. Eng., Michigan State Univ., East Lansing, MI, USA, 2001.
[9] A. Said and W. A. Pearlman, "A new, fast, and efficient image codec based on set partitioning in hierarchical trees," IEEE Trans. Circuits Syst. Video Technol., vol. 6, no. 3, pp. 243–250, Jun. 1996.
[10] A. Skodras, C. Christopoulos, and T. Ebrahimi, "The JPEG 2000 still image compression standard," IEEE Signal Process. Mag., vol. 11, no. 5, pp. 36–58, Sep. 2001.
[11] R. Sudhakar, R. Karthiga, and S. Jayaraman, "Fingerprint compression using contourlet transform with modified SPIHT algorithm," Iranian J. Electr. Comput. Eng., vol. 5, no. 1, pp. 3–10, 2005.
[12] Y. Y. Zhou, T. D. Guo, and M. Wu, "Fingerprint image compression algorithm based on matrix optimization," in Proc. 6th Int. Conf. Digital Content, Multimedia Technol. Appl., 2010, pp. 14–19.