Abstract—Nowadays, multi-megapixel cameras are available for a wide range of applications, including digital still cameras, camcorders, camera phones and video surveillance equipment. Often, the difference between two digital cameras can be measured in more than just the number of pixels each captures. The image-processing pipeline plays a key role in overall image quality, since increasing complexity in image processing is required to accommodate a corresponding increase in image resolution. It is possible to optimize each of the image-processing pipeline's multiple stages to control image and video quality, while continuing to minimize complexity in order to accommodate diverse user preferences. An integrated, hardware-based image-processing pipeline provides the performance as well as the flexibility required to ensure that the highest quality images are produced for various applications.
Keywords—SoC image processor, MIPS, fine tuning, OECF (Optical Electrical Conversion Function), RGB, CFA interpolation.
I. INTRODUCTION

The image-processing pipeline performs the baseline and enhanced image processing that takes the raw data produced by a camera sensor and generates the digital image that will then be viewed by the user or undergo further processing before being saved to nonvolatile memory. This pipeline is a series of specialized algorithms that adjusts image data in real time and is often implemented as an integrated component of a system-on-chip (SoC) image processor. With an image pipeline implemented in hardware, front-end image processing can be completed without placing any processing burden on the main application processor. This frees processor cycles for encoding and for advanced processing functionalities such as video analytics, including object recognition and object tracking. On the other hand, quality has no standard metric, and different applications approach quality in different ways, since the same camera can be used in a variety of ways. A digital still camera needs to be able to take quality pictures in various lighting scenarios, such as indoors, in bright sunlight and in relative darkness. To maximize quality in these scenarios, the image-processing pipeline needs to be flexible and configurable enough to provide the highest quality for each individual picture.

In this paper, the general idea of the image pipeline is presented. Tuning the pipeline, noise reduction, contrast
Jakaria Ahmad is with Metropolitan University, Sylhet-3100, Bangladesh (phone: +8801728375804; e-mail: jakaria@metrouni.edu.bd). Md. Mustafijur Rahman Faysal is with Metropolitan University, Sylhet-3100, Bangladesh (e-mail: mustafij@hotmail.com).
enhancement and some other major topics are also covered here.

II. THE IMAGE-PROCESSING PIPELINE
The image-processing pipeline is designed to exploit the parallel nature of image-processing algorithms and enable a camera to process multiple pictures simultaneously while maximizing final image quality. Additionally, each stage in the pipeline begins processing as soon as image data is available, so the entire image does not have to be received from the sensor or the previous stage before processing starts. This results in an extremely efficient pipeline with deterministic performance that increases the speed with which images can be processed, and therefore the rate at which pictures can be taken.

Performance, however, is only one factor that influences overall camera quality. Some technologies are flexible enough to specifically tune individual stages to match the particular sensor and lens combination of a camera. Additionally, sensor/lens combinations can be tuned across the wide range of operating conditions under which the camera might be used.

Taking a hardware-based, configurable approach guarantees both performance and flexibility. Many fixed-function (ASIC) implementations provide excellent performance and low cost, but their fixed nature fails to achieve the best quality possible under a wide range of operating conditions and also limits their ability to adapt to multiple applications. On the other hand, while software-based approaches provide the required flexibility, they require too many MIPS on the main application processor and consume too much power.

As front-end image processing is fairly well defined, the algorithms involved are well suited to a configurable approach. This even includes stages that tend to be proprietary between vendors, such as CFA interpolation. Through a configurable approach, overall image quality can be maximized by adjusting specific parameters at each pipeline stage.

Getting the most out of the image-processing pipeline is a process of fine tuning each pipeline stage for every sensor and lens combination.
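The stage-by-stage streaming behavior described above can be sketched as a chain of generators, each consuming rows as soon as the previous stage emits them, so no full frame is buffered between stages. The stage names and constants here are hypothetical, not taken from any specific SoC:

```python
# Minimal streaming-pipeline sketch: each stage processes scanlines as
# soon as the previous stage yields them (hypothetical stages/values).

def sensor_rows(frame):
    for row in frame:                       # raw rows arrive one at a time
        yield row

def black_level(rows, offset=16):
    for row in rows:
        yield [max(p - offset, 0) for p in row]

def gain(rows, g=1.5):
    for row in rows:
        yield [min(int(p * g), 255) for p in row]

def run_pipeline(frame):
    stages = gain(black_level(sensor_rows(frame)))
    return [row for row in stages]          # rows flow through all stages

frame = [[20, 40], [60, 80]]
print(run_pipeline(frame))  # -> [[6, 36], [66, 96]]
```

Because each generator pulls one row at a time, latency per frame is bounded by the slowest stage rather than by the sum of all stages, which mirrors the deterministic-performance argument above.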
Most digital cameras offer user-selectable modes to adjust the camera for cloudy, sunny or night-time lighting. Tuning the image-processing pipeline for each of these modes enables users to assist the camera in achieving the best image quality. Alternatively, a camera can automatically evaluate current lighting conditions and make an intelligent selection among the available modes to maximize the quality of the particular image being captured.
Image-Processing Pipelinefor Highest Quality Images
Jakaria Ahmad, Md. Mustafijur Rahman Faysal
World Academy of Science, Engineering and Technology 59, 2009
III. TUNING THE PIPELINE

As noted above, to get the best quality image, each stage of the image pipeline must be fine tuned for every sensor and lens combination. There are ten steps to follow.
Fig. 1 Image-processing pipeline
IV. BLACK LEVEL ADJUSTMENT

This stage in the pipeline adjusts for dark current from the sensor and for lens flare, which can lead to the whitening of an image's darker regions. In other words, sensor black is not the same as image black. The most common method for calculating this adjustment is to take a picture of a completely black field (typically accomplished by leaving the lens cap on), resulting in three base offsets to be subtracted from the raw sensor data. Failure to adjust the black level will result in an undesirable loss of contrast.
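The per-channel subtraction described here can be sketched as follows; the offset values are hypothetical stand-ins for what a lens-cap calibration frame would yield:

```python
import numpy as np

def black_level_adjust(raw_rgb, offsets=(16, 16, 16)):
    """Subtract per-channel black offsets and clamp at zero.

    offsets: the three base offsets measured from an all-black frame
    (hypothetical values here).
    """
    out = raw_rgb.astype(np.int32) - np.array(offsets)
    return np.clip(out, 0, None).astype(raw_rgb.dtype)

img = np.array([[[20, 10, 200]]], dtype=np.uint16)
print(black_level_adjust(img))  # channel below the offset clamps to 0
```

Clamping at zero matters: without it, pixels darker than the measured black point would wrap around in unsigned storage.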
V. NOISE REDUCTION

There are numerous sources of noise that can distort image data – optical, electrical, digital and power – which must be removed before they are amplified in later pipeline stages. The actual noise level present in an image plays a critical role in determining how strong the noise filter must be, since using a strong filter on a clean image will distort and blur the image rather than clean it up.

Noise reduction is achieved by averaging similar neighboring pixels. Through the use of an Optical Electrical Conversion Function (OECF) chart (Fig. 2) and a uniform lighting source, the noise level can be characterized for different intensities.
Fig. 2 Standard Optical Electrical Conversion Function (OECF) chart
If the noise level is high for a particular intensity, then more weight is given to the average pixel value of similar neighbors. On the other hand, if the noise level is low, more weight is given to the original pixel value. The OECF chart comprises 12 uniform gray patches and produces 12 corresponding power levels based on the noise standard deviation at the mean value for each intensity/luminance level. These 12 power levels are then used to reduce noise across an image using either a linear or square-root model, depending on the sensor and gain (or ISO) level.
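The intensity-dependent weighting above can be sketched as follows. The 12 patch sigmas and the `sigma_max` normalizer are hypothetical; a real tuning pass would measure them from the OECF chart:

```python
import numpy as np

def noise_sigma(intensity, patch_levels, patch_sigmas):
    """Look up noise std-dev for a pixel intensity (linear model)."""
    return np.interp(intensity, patch_levels, patch_sigmas)

def denoise_pixel(center, neighbors, sigma, sigma_max=20.0):
    """Blend neighbor mean and original pixel by noise level:
    high noise -> trust the neighbor average, low noise -> keep pixel."""
    w = min(sigma / sigma_max, 1.0)
    return (1 - w) * center + w * np.mean(neighbors)

levels = np.linspace(0, 255, 12)     # the 12 OECF gray-patch intensities
sigmas = np.linspace(2.0, 12.0, 12)  # hypothetical measured std-devs
s = noise_sigma(128.0, levels, sigmas)
print(denoise_pixel(120.0, [90.0, 110.0, 95.0, 105.0], s))
```

The result always lies between the original pixel and the neighbor mean, which is exactly the high-noise/low-noise trade-off the text describes.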
VI. WHITE BALANCE

Different types of lighting – such as incandescent, fluorescent, natural light sources, XE strobe and W-LED flash – have a pronounced effect on color. Mixed-light conditions are the most difficult to tune for. White balance automatically compensates for color differences based on the lighting, so white actually appears white.
Fig. 3 Color response under incandescent, fluorescent and daylight illumination versus human perception
Fine tuning white balance begins by measuring the average RGB values across the six gray patches on a Color Checker chart (Fig. 4).
Fig. 4 Color checker chart 
Using mean square error minimization, the appropriate gains for each color can be calculated:

\min f(A_R) = \sum_{n=1}^{6} (G_n - A_R R_n)^2    (1)

\min f(A_B) = \sum_{n=1}^{6} (G_n - A_B B_n)^2    (2)

where R_n, G_n and B_n are the average RGB values of the n-th gray patch. Setting the green gain to a default of one eliminates one set of calculations later in the image-processing pipeline. The resulting gains are applied to each image pixel:

R' = A_R R, \quad G' = G, \quad B' = A_B B    (3)
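Setting the derivatives of Eqs. (1) and (2) to zero gives the gains in closed form: A_R = Σ G_n R_n / Σ R_n² and likewise for blue. A small sketch with hypothetical gray-patch averages (a warm illuminant, so red reads high and blue low):

```python
import numpy as np

def wb_gains(r, g, b):
    """Closed-form least-squares gains mapping R and B onto G
    over the six gray patches (green gain fixed at 1)."""
    a_r = np.dot(g, r) / np.dot(r, r)
    a_b = np.dot(g, b) / np.dot(b, b)
    return a_r, a_b

# Hypothetical average RGB of the six gray patches.
r = np.array([30.0, 60, 90, 120, 150, 180])
g = np.array([25.0, 50, 75, 100, 125, 150])
b = np.array([20.0, 40, 60, 80, 100, 120])

a_r, a_b = wb_gains(r, g, b)
print(a_r, a_b)  # red scaled down toward green, blue scaled up
```

Applying `a_r`, `1.0`, `a_b` per pixel is then Eq. (3).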
VII. CFA INTERPOLATION

Typically, digital cameras employ only a single sensor to capture an image, so the camera can obtain only a single color component for each pixel, even though three components are necessary to represent RGB color. CFA interpolation is the process of interpolating the two missing color components for each pixel based on the available component and neighboring pixels. CFA interpolation is primarily a transform function that does not vary based on sensor or lighting conditions, so no tuning of this image-processing pipeline stage is required. However, it is still one of the most complex algorithms in the image-processing pipeline, and the quality of its output is highly dependent upon the expertise of the silicon vendor. TI digital media processors provide superior CFA interpolation technology to ensure the images are as close to what the user sees as possible.
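Vendor algorithms are proprietary, but the textbook baseline the text alludes to is bilinear interpolation: each missing component is the average of the known same-color neighbors. A minimal sketch for an RGGB Bayer pattern (not any vendor's actual method):

```python
import numpy as np

def convolve2d(a, k):
    """3x3 'same' sum over neighbors with zero padding (no SciPy needed)."""
    p = np.pad(a, 1)
    out = np.zeros_like(a)
    for dy in range(3):
        for dx in range(3):
            out += k[dy, dx] * p[dy:dy + a.shape[0], dx:dx + a.shape[1]]
    return out

def bilinear_demosaic(bayer):
    """Bilinear CFA interpolation for an RGGB mosaic."""
    h, w = bayer.shape
    rgb = np.zeros((h, w, 3))
    masks = np.zeros((h, w, 3), dtype=bool)
    masks[0::2, 0::2, 0] = True          # R sites
    masks[0::2, 1::2, 1] = True          # G sites (even rows)
    masks[1::2, 0::2, 1] = True          # G sites (odd rows)
    masks[1::2, 1::2, 2] = True          # B sites
    k = np.ones((3, 3))
    for c in range(3):
        plane = np.where(masks[:, :, c], bayer, 0.0)
        cnt = masks[:, :, c].astype(float)
        # average of the known same-color samples in each 3x3 window
        rgb[:, :, c] = convolve2d(plane, k) / convolve2d(cnt, k)
    return rgb

bayer = np.array([[10.0, 20], [30, 40]])   # tiny RGGB tile
print(bilinear_demosaic(bayer)[0, 0])      # full RGB at the R site
```

Real pipelines replace the plain average with edge-directed variants to avoid zippering along edges, which is where the vendor expertise mentioned above comes in.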
Fig. 5 CFA interpolation
VIII. COLOR CORRECTION

Different sensors produce different RGB values for the same color. Tuning this pipeline stage involves creating a blending matrix to convert the sensor RGB color space to a standard RGB color space such as the Rec709 RGB color space.
Fig. 6 Sensor RGB color space 
The blending matrix is calculated by starting with a Color Checker chart and obtaining average RGB values for 18 different color patches (the top three rows of the chart) that have already been white balanced. Next, inverse gamma correction is applied to the reference RGB values. The blending matrix is then constructed using constrained minimization:
\min f(M) = \sum_{n=1}^{18} \sum_{i=1}^{3} \Big( RGB_{ref,n,i} - \sum_{j=1}^{3} M_{ij} RGB_{n,j} \Big)^2    (4)

subject to

\sum_{j=1}^{3} M_{ij} = 1, \quad i = 1, 2, 3    (5)

The blending matrix is applied as follows:

\begin{bmatrix} R' \\ G' \\ B' \end{bmatrix} = \begin{bmatrix} M_{11} & M_{12} & M_{13} \\ M_{21} & M_{22} & M_{23} \\ M_{31} & M_{32} & M_{33} \end{bmatrix} \begin{bmatrix} R \\ G \\ B \end{bmatrix}    (6)

The final result is consistent color between cameras using different sensors.

Fig. 7 Final result: before and after RGB tuning
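The constrained fit of Eqs. (4)–(5) can be sketched by substituting M_{i3} = 1 − M_{i1} − M_{i2} into each row's residual, which turns it into an ordinary least-squares problem. The patch data below is random stand-in, not real Color Checker values:

```python
import numpy as np

def fit_blending_matrix(sensor, ref):
    """sensor, ref: (18, 3) white-balanced patch averages.
    Returns a 3x3 M whose rows each sum to 1 (keeps neutrals neutral)."""
    M = np.zeros((3, 3))
    A = sensor[:, :2] - sensor[:, 2:3]     # eliminate M[i,2] via constraint
    for i in range(3):
        y = ref[:, i] - sensor[:, 2]
        m01, *_ = np.linalg.lstsq(A, y, rcond=None)
        M[i] = [m01[0], m01[1], 1.0 - m01.sum()]
    return M

rng = np.random.default_rng(0)
sensor = rng.uniform(0, 1, (18, 3))
# A plausible correction matrix (rows sum to 1) used to fabricate refs.
true_M = np.array([[1.5, -0.3, -0.2],
                   [-0.2, 1.4, -0.2],
                   [-0.1, -0.4, 1.5]])
ref = sensor @ true_M.T

M = fit_blending_matrix(sensor, ref)
print(np.round(M, 3))          # recovers true_M on this synthetic data
corrected = sensor @ M.T       # Eq. (6) applied to each pixel
```

The row-sum constraint is what Eq. (5) enforces: a gray input (R = G = B) stays gray after blending, so the white balance achieved earlier is preserved.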
Fig. 8 Gamma correction curve
IX. GAMMA CORRECTION

Gamma correction compensates for the nonlinearity of relative intensity as the frame buffer value changes in output displays:
R' = R^{1/\gamma}    (7a)

G' = G^{1/\gamma}    (7b)

B' = B^{1/\gamma}    (7c)

Typically, displays are calibrated using a standard gamma correction such as Rec709 or SMPTE 240M. Calibrating the image-processing pipeline to the same standards ensures optimal image quality across the majority of displays. Under most circumstances, this image-processing pipeline stage does not require tuning. Tuning only comes into play when specialized displays utilize a different gamma correction, such as those used in airport computers or military applications.
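In hardware, Eq. (7) is normally realized as a precomputed lookup table rather than a per-pixel power computation. A sketch using the common pure power law with gamma = 2.2 (the Rec709 curve adds a linear toe near black, omitted here for brevity):

```python
import numpy as np

def gamma_lut(gamma=2.2, bits=8):
    """Build a gamma-correction LUT: code -> code**(1/gamma), rescaled."""
    top = 2 ** bits - 1
    levels = np.arange(top + 1) / top
    return np.round((levels ** (1.0 / gamma)) * top).astype(np.uint8)

lut = gamma_lut()
img = np.array([[0, 64, 128, 255]], dtype=np.uint8)
print(lut[img])  # dark values are lifted; 0 and 255 are fixed points
```

Indexing the LUT with the image array corrects every pixel in one vectorized step, which is also how a fixed-function pipeline stage would apply it.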
X. RGB TO YCbCr CONVERSION

Images need to be adjusted for the human eye, which is more sensitive to luminance (Y) than color (chrominance).
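The luminance/chrominance split this stage exploits can be sketched as an RGB-to-YCbCr conversion (BT.601-style full-range coefficients shown; the exact matrix depends on the target standard), after which chroma can be subsampled with little visible loss:

```python
import numpy as np

def rgb_to_ycbcr(rgb):
    """Convert gamma-corrected RGB to YCbCr (BT.601 full-range sketch)."""
    m = np.array([[ 0.299,   0.587,   0.114 ],
                  [-0.1687, -0.3313,  0.5   ],
                  [ 0.5,    -0.4187, -0.0813]])
    ycc = rgb @ m.T
    ycc[..., 1:] += 128.0        # center chroma for 8-bit storage
    return ycc

gray = np.array([120.0, 120.0, 120.0])
print(rgb_to_ycbcr(gray))  # neutral gray maps to Y=120, Cb=Cr=128
```

Note the luma row weights green most heavily, matching the eye's sensitivity; the two chroma rows sum to zero so neutral colors land exactly at the 128 midpoint.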