
 

Topic -

Image Resolution Enhancement using conventional methods


Presentation by -
Shiv Shukla
Electrical And Electronics Engineering (EEE), BITS Pilani

2
1.
Terahertz wave

3
Terahertz wave 
▷ Terahertz (THz) waves are a form of electromagnetic radiation with strong
penetrability and harmless (non-ionizing) photon energy, and they are widely used in
the field of non-destructive testing (NDT). The use of THz waves for non-destructive
evaluation enables inspection of multi-layered structures and can identify
abnormalities arising from foreign material inclusions, delamination,
mechanical impact damage, heat damage, and water or hydraulic fluid
ingression. This method can play a significant role in a number of
industries for materials characterization applications where precision
thickness mapping is required.

▷ Some of the applications:

        1. Pharmaceutical applications
        2. Imaging applications (identification of defects/damage)
        3. Spectroscopy applications
        4. Passive/active imaging techniques

4
 Advantages of Terahertz imaging
➨It can penetrate a wide variety of dielectric materials, such as fabric,
    paper, plastic, leather, and wood.
➨It is non-ionizing and has minimal effect on the human body.
➨It is strongly absorbed by water.
➨Metals highly reflect terahertz radiation.
➨It offers greater bandwidth than microwave frequencies; the achievable data
rates exceed those of wireless protocols such as 802.11b.
➨THz waves can easily pass through the non-conducting materials mentioned above.
➨It can be used in image sensing and higher-bandwidth wireless networking
systems over distances of about 10 to 100 meters.

5
 Limitations
▷ Some major limitations of terahertz waves are:

➨It does not support long-range communication because of scattering and
absorption by clouds, dust, rain, etc.
➨It has a smaller penetration depth than microwave radiation, and only limited
penetration through clouds and fog. THz waves cannot penetrate
liquid water or metal.
➨Terahertz frequencies are difficult to detect because black-body radiation at room
temperature is very strong at these frequencies.
➨Sources, detectors, and modulators are not available at affordable prices, which
hinders its commercial availability as a communication system.
  

6
2.
Interpolation Techniques

7
                Interpolation
Image interpolation occurs in all digital images at some stage:

▷ Resizing (resampling)

▷ Remapping (geometric transformation - rotation, change of perspective, ...);
   see the rotation example after this list

▷ Inpainting (restoration of holes)

▷ Morphing, nonlinear transformations
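For instance, remapping by rotation interpolates every output pixel from the source
image. A small illustration using OpenCV (the file name below is only a placeholder):

    import cv2  # OpenCV

    img = cv2.imread("input.png")                         # placeholder file name
    h, w = img.shape[:2]
    M = cv2.getRotationMatrix2D((w / 2, h / 2), 30, 1.0)  # rotate 30 degrees about the centre
    # Every output pixel is interpolated (bilinearly here) from the source image
    rotated = cv2.warpAffine(img, M, (w, h), flags=cv2.INTER_LINEAR)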


  

8
 Interpolation Methods
▷ Nearest Neighbour 

▷ Bilinear Interpolation

▷ Bicubic Interpolation

▷ Pixel-centered Patch Matching


  

9
Nearest Neighbour 
▷ Details –
The most basic method and requires the least processing time. It considers only one
pixel: the closest one to the interpolated point. This has the effect of simply making
each pixel bigger.
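As a minimal illustration, a NumPy sketch of nearest-neighbour resizing (the function
and variable names are ours, not a library routine):

    import numpy as np

    def nearest_neighbour_resize(img, new_h, new_w):
        """Resize by copying, for each output pixel, the closest input pixel."""
        h, w = img.shape[:2]
        rows = np.minimum((np.arange(new_h) * h / new_h).astype(int), h - 1)
        cols = np.minimum((np.arange(new_w) * w / new_w).astype(int), w - 1)
        return img[rows[:, None], cols]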

10
Bilinear Interpolation 
▷ Considers the closest 2×2 neighborhood of known pixel values surrounding the
unknown pixel

▷ Takes a weighted average of these 4 pixels to arrive at the final interpolated value

▷ Results in smoother-looking images than nearest neighbour, but needs more
processing time
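A compact NumPy sketch of bilinear resizing for a grayscale image (function and
variable names are illustrative):

    import numpy as np

    def bilinear_resize(img, new_h, new_w):
        """Bilinear resize of a 2-D (grayscale) image."""
        h, w = img.shape
        # Output pixel positions expressed in input-grid coordinates
        y = np.linspace(0, h - 1, new_h)
        x = np.linspace(0, w - 1, new_w)
        y0 = np.floor(y).astype(int); y1 = np.minimum(y0 + 1, h - 1)
        x0 = np.floor(x).astype(int); x1 = np.minimum(x0 + 1, w - 1)
        wy = (y - y0)[:, None]          # vertical weights
        wx = (x - x0)[None, :]          # horizontal weights
        # Weighted average of the surrounding 2x2 neighbourhood
        top = img[y0[:, None], x0] * (1 - wx) + img[y0[:, None], x1] * wx
        bot = img[y1[:, None], x0] * (1 - wx) + img[y1[:, None], x1] * wx
        return top * (1 - wy) + bot * wy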

11
Bicubic Interpolation
▷ Details –
Bicubic goes one step beyond bilinear: instead of the 2×2 neighborhood
of known pixel values, it considers the closest 4×4 neighborhood of
known pixels, for a total of 16 pixels. The pixels that are closer to the one
being estimated are given higher weights than those that are further away,
so the farthest pixels carry the least weight. The results of bicubic
interpolation are noticeably better than those of the nearest-neighbour or
bilinear algorithms, because a greater number of known pixel values is
considered when estimating the desired value. This makes it one of the most
widely used interpolation methods.
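In practice, bicubic upscaling is usually done through a library call; for example
with OpenCV (the file name below is only a placeholder):

    import cv2  # OpenCV

    img = cv2.imread("low_res.png", cv2.IMREAD_GRAYSCALE)  # placeholder file name
    # Upscale by a factor of 4 with the three interpolation methods discussed so far
    nn  = cv2.resize(img, None, fx=4, fy=4, interpolation=cv2.INTER_NEAREST)
    bil = cv2.resize(img, None, fx=4, fy=4, interpolation=cv2.INTER_LINEAR)
    bic = cv2.resize(img, None, fx=4, fy=4, interpolation=cv2.INTER_CUBIC)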

12
Bicubic Interpolation
Methods to apply bicubic interpolation– 

▷ Lagrange Polynomial

▷ Cubic Splines

▷ Cubic convolution (see the kernel sketch below)
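One common way to realize bicubic interpolation is Keys' cubic convolution kernel;
a short sketch of that kernel (with the customary parameter a = -0.5) follows.

    import numpy as np

    def cubic_kernel(x, a=-0.5):
        """Keys' cubic convolution kernel: weight as a function of distance x."""
        x = np.abs(np.asarray(x, dtype=float))
        w1 = (a + 2) * x**3 - (a + 3) * x**2 + 1        # |x| <= 1
        w2 = a * x**3 - 5*a * x**2 + 8*a * x - 4*a      # 1 < |x| < 2
        return np.where(x <= 1, w1, np.where(x < 2, w2, 0.0))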
  

13
Bicubic Interpolation
Limitations of bicubic interpolation– 

▷ Falls victim to overshooting

▷ In turn, this can lead to clipping or artificial sharpening (halo) artifacts near edges


  

14
Pixel-Centered Patch Matching
▷ Pixel-centered patch matching is another way to estimate a missing pixel:
neighboring pixels within a patch centered on the current missing pixel are used
as the reference to find texture-relevant LR pixels. Natural images usually have
an exponentially decaying power spectrum, so the aliasing effect is more
prominent in high-frequency areas than in low-frequency areas. Most image
pixels in low-frequency areas can be interpolated well by an initial bicubic
interpolation. By choosing a proper patch size, the unwanted artifacts in
low-frequency areas can be removed.
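A rough sketch of the matching step, returning the LR patch most similar to the one
centred on the pixel of interest (the patch size and search radius are illustrative
values, and the centre is assumed to lie well inside the image):

    import numpy as np

    def best_matching_patch(lr_img, center, patch_size=7, search_radius=10):
        """Find the patch with the smallest sum of squared differences to the
        patch centred on `center`."""
        r = patch_size // 2
        cy, cx = center
        ref = lr_img[cy - r:cy + r + 1, cx - r:cx + r + 1].astype(float)
        best_patch, best_err = None, np.inf
        for dy in range(-search_radius, search_radius + 1):
            for dx in range(-search_radius, search_radius + 1):
                if dy == 0 and dx == 0:
                    continue
                y, x = cy + dy, cx + dx
                cand = lr_img[y - r:y + r + 1, x - r:x + r + 1].astype(float)
                if cand.shape != ref.shape:
                    continue  # candidate falls partly outside the image
                err = np.sum((cand - ref) ** 2)
                if err < best_err:
                    best_patch, best_err = cand, err
        return best_patch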

15
Iterative Multiscale Interpolation
▷ Large patch sizes are generally preferred, because they provide more coverage for
finding texture-relevant LR pixels. On the other hand, smooth regions in an image
require smaller patch sizes to avoid edge sharpening.

▷ However, it is difficult to determine the suitable patch size from the observed LR
image with aliasing. Hence, an iterative multi-scale interpolation procedure is used
to combine the advantages of both large- and small-scale patch matching.

16
3.
Deconvolution Methods

17
                Deconvolution
Deconvolution is a computationally intensive image processing technique that
is increasingly used to improve the contrast and resolution of
digital images captured in the microscope. Its foundations are a
suite of methods designed to remove or reverse the blurring present
in microscope images, which is induced by the limited aperture of the objective.

▷ The model for blur that has evolved in theoretical optics is based on the
concept of a three-dimensional point spread function (PSF). This concept
is of fundamental importance to deconvolution and should be clearly
understood in order to avoid imaging artifacts. The point spread function
is based on an infinitely small point source of light originating in the
specimen (object) space.
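The blur model can be summarized as: observed image = (object convolved with PSF)
+ noise. A small NumPy/SciPy sketch of this forward model, using a purely
illustrative Gaussian PSF:

    import numpy as np
    from scipy.signal import fftconvolve

    rng = np.random.default_rng(0)
    obj = np.zeros((64, 64)); obj[32, 32] = 1.0      # ideal point source (the "object")
    yy, xx = np.mgrid[-7:8, -7:8]
    psf = np.exp(-(xx**2 + yy**2) / (2 * 2.0**2))    # Gaussian PSF, illustrative only
    psf /= psf.sum()
    # Forward model: convolution with the PSF plus additive noise
    observed = fftconvolve(obj, psf, mode="same") + 0.01 * rng.standard_normal(obj.shape)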
18
 Deconvolution  Methods
▷ Classical Algorithms for Constrained Iterative Deconvolution 

▷ Statistical Iterative Algorithms

▷ Blind Deconvolution Algorithms

▷ Richardson – Lucy Algorithm


  

19
• Classical Algorithms for Constrained Iterative
Deconvolution 
▷ The first applications of constrained iterative deconvolution algorithms to images
captured in the microscope were based on the Jansson-Van Cittert (JVC) algorithm,
a procedure first developed for application in spectroscopy. Agard later modified
this algorithm for analysis of digital microscope images in a landmark series of
investigations. Commercial firms such as Vaytek, Intelligent Imaging Innovations,
Applied Precision, Carl Zeiss, and Bitplane currently market various implementations
of Agard's modified algorithm. In addition, several research groups have developed a
regularized least squares minimization method that has been marketed by Vaytek
and Scanalytics. These algorithms utilize an additive or multiplicative error criterion
to update the estimate at each iteration.
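As an illustration of the additive update mentioned above, a minimal Van Cittert-style
iteration with a simple non-negativity constraint (a generic sketch, not the exact
Jansson-Van Cittert variant used in any commercial package):

    import numpy as np
    from scipy.signal import fftconvolve

    def van_cittert_deconvolve(observed, psf, n_iter=50, gamma=1.0):
        """At each step, correct the estimate by the difference between the
        observed image and the re-blurred estimate, then clip to non-negative."""
        estimate = observed.astype(float)
        for _ in range(n_iter):
            reblurred = fftconvolve(estimate, psf, mode="same")
            estimate = estimate + gamma * (observed - reblurred)
            estimate = np.clip(estimate, 0, None)    # non-negativity constraint
        return estimate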

20
• Statistical Iterative Algorithms 

▷ Statistical algorithms are more computationally intensive than the classical methods
and can take significantly longer to reach a solution. However, they may restore
images to a slightly higher degree of resolution than the classical algorithms. These
algorithms also have the advantage of imposing constraints on the expected
noise statistic (in effect, a Poisson or a Gaussian distribution). As a result, statistical
algorithms handle noise in a more principled way than simple regularization, and they
may produce better results on noisy images. However, the choice of an appropriate
noise statistic may depend on the imaging conditions, and some commercial software
packages are more flexible than others in this regard.
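The best-known statistical iterative method, the Richardson-Lucy algorithm listed
earlier, assumes Poisson noise; a compact NumPy sketch of its multiplicative update
(not an optimized implementation):

    import numpy as np
    from scipy.signal import fftconvolve

    def richardson_lucy(observed, psf, n_iter=30, eps=1e-12):
        """Richardson-Lucy deconvolution under a Poisson noise model."""
        estimate = np.full_like(observed, observed.mean(), dtype=float)
        psf_mirror = psf[::-1, ::-1]
        for _ in range(n_iter):
            reblurred = fftconvolve(estimate, psf, mode="same")
            ratio = observed / (reblurred + eps)     # data divided by model
            estimate = estimate * fftconvolve(ratio, psf_mirror, mode="same")
        return estimate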

21
• Blind Deconvolution Algorithms

▷ Blind deconvolution is a relatively new technique that greatly simplifies the
application of deconvolution for the non-specialist, but the method is not yet widely
available in the commercial arena. The algorithm was developed by altering the
maximum likelihood estimation procedure so that not only the object, but also the
point spread function is estimated. Using this approach, an initial estimate of the
object is made and the estimate is then convolved with a theoretical point spread
function calculated from optical parameters of the imaging system. The resulting
blurred estimate is compared with the raw image, a correction is computed, and this
correction is employed to generate a new estimate, as described above. This same
correction is also applied to the point spread function, generating a new point
spread function estimate. In further iterations, the point spread function estimate
and the object estimate are updated together.
22
THANK YOU!

23
