
VISUAL TESTING

MODULE 3
Images
Part 3
Photographic Equipment
Basic Equipment
Typical equipment for producing a photographic record of visual tests
includes a camera, a stable tripod, an automatically regulated flash and
a shutter release. Digital photography requires the above items plus a
computer and associated equipment including a power source,
connections, file storage media and software for image processing and
file management.
Photographic Equipment
A macro lens is useful for closeups (Fig. 10), particularly in analog
photography. A macro lens used with a 28 to 80 mm zoom lens can be
made to focus on objects some 300 mm (12 in.) from the lens or to
magnify objects that are difficult or dangerous to approach. A tripod and shutter
release are used to stabilize the camera for long exposures in dim light.
Usually, getting a 28 to 80 mm (1 to 3 in.) zoom lens means that the
camera body and lens are bought separately. A camera, no matter how
costly, is no better than its lens. An inexpensive lens can drastically
reduce an expensive camera's image reproduction quality.
Figure 10 Macro lens view revealing cracks and corrosion
pitting on casting.
Photographic Equipment
Lens
The optical component of the camera is the lens. At its simplest, a lens is
an optical device made of transparent material: glass or plastic. A lens
transmits light and makes it converge to form an image that looks like the
scene in front of the lens. This image is focused on a sensor at the
camera's focal point or plane. A convex lens or mirror causes light to
come together to a focal point; a concave lens or mirror causes light to
spread out. A lens system is a series of two or more lenses or mirrors
working together to transmit one beam of light.
Photographic Equipment
As light travels from one medium to another, it changes speed. Light
travels more quickly through air than it does through glass, so a lens
slows it down. When light waves enter a piece of glass at an angle, one
part of the wave will reach the glass before another and will start slowing
down first. Light's change of direction can be described by the following
analogy. When a shopping cart is pushed from pavement into grass at
an angle, if the right wheel hits the grass first, it slows down while the left
wheel is still on the pavement. Because the left wheel is briefly moving
more quickly than the right wheel, the shopping cart turns to the right as
it moves onto the grass (Fig. 11). The analogy fails in that the cart
experiences mechanical resistance; light does not.
Figure 11 Refraction illustrated by analogy
of shopping cart in grass.
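Snell's law puts numbers to this change of direction. The sketch below is a minimal illustration, not part of the module: the refractive indices for air and crown glass are typical textbook values.

```python
import math

def refraction_angle(theta_incident_deg, n1=1.00, n2=1.52):
    """Apply Snell's law, n1*sin(theta1) = n2*sin(theta2).

    n1 and n2 are assumed refractive indices (air ~1.00, crown glass ~1.52).
    Returns the refracted angle in degrees, measured from the normal.
    """
    theta1 = math.radians(theta_incident_deg)
    sin_theta2 = n1 * math.sin(theta1) / n2
    return math.degrees(math.asin(sin_theta2))

# A ray entering glass at 30 degrees from the normal bends toward it:
print(refraction_angle(30.0))  # about 19.2 degrees
```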
Focus
Focal Point
The focal point is where light rays from a lens or mirror come together.
The focal point of a lens is the point of convergence of light or a point
from which the light diverges. The focal distance is the distance from the
lens system to the focal point. When using a lens system for indirect
visual testing, it is very important that the focal point is appropriate for
the inspection parameters (Fig. 12).
Figure 12 Depth of field.
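For a simple lens, focal distance, object distance and image distance are related by the thin lens equation, 1/f = 1/d_o + 1/d_i. A minimal sketch assuming an ideal thin lens; the numbers are invented for illustration.

```python
def image_distance(focal_length_mm, object_distance_mm):
    """Solve the thin lens equation, 1/f = 1/d_o + 1/d_i, for d_i."""
    return 1.0 / (1.0 / focal_length_mm - 1.0 / object_distance_mm)

# A 50 mm lens focused on an object 300 mm away forms a sharp image
# 60 mm behind the lens:
print(image_distance(50.0, 300.0))  # 60.0
```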
Focus
Depth of Field
Depth of field is the distance within which the object is sharply resolved
(Fig. 12). Depth of field is the amount of distance that the subject is still
in focus and is expressed as a range from a certain minimum point close
to the camera to a certain maximum point as distant from the camera.
Everything in this range will be in focus and everything beyond, closer or
farther, will be out of focus (Fig. 13).
Figure 13 Objects closer and farther than focal point are
blurred.
Focus
Field of View
The field of view is the entire area that can be seen through an optical
system as light is received from the conical angle subtended by the
system's optics. An astronomical telescope's field of view is the area of
the sky that can be seen with a given eyepiece. Theoretically, a field of
view is three-dimensional, like a room, and not two-dimensional, like a
wall. The area of interest in a field of view, however, is often a flat
surface.
Focus
Different lenses can be attached to an instrument to achieve different
fields of view. Figure 14 represents the field of view of a rigid
borescope. In this example, the field of view of the system is 60
degrees.
For different lenses, grinding specific to each surface produces the desired
features: closeups, close focus and high magnification; or short focus,
wide angle views and low magnification.
Figure 14 Field of view of one rigid borescope.
Capturing Details
To capture maximum detail, the inspector wants the area of interest to be
as large as possible within the camera's field of view. Many cameras
offer optical zoom, letting the user move the lens telescopically to "zoom
in on" a subject and capture more detail in a closeup. The same
advantage can be achieved simply by moving closer to the subject.
Some digital cameras achieve zoom inexpensively by something called
digital zoom, or interpolation, which crops the image and sacrifices
details because not all the pixels are retained.
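A rough sketch of what digital zoom does, written with the third-party Pillow library (an assumption; the module names no software): the central region is cropped, then interpolated back to full size, so the pixel count is restored but the discarded detail is not.

```python
from PIL import Image

def digital_zoom(image, factor):
    """Simulate digital zoom: crop the center, then interpolate back up."""
    w, h = image.size
    cw, ch = int(w / factor), int(h / factor)
    left, top = (w - cw) // 2, (h - ch) // 2
    cropped = image.crop((left, top, left + cw, top + ch))
    # Resizing invents pixels by interpolation; no new detail is captured.
    return cropped.resize((w, h), Image.LANCZOS)
```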
Capturing Details
Camera focal length determines the angle of view and the depth of field.
Short focal lengths give wide angles of view; long focal lengths give
narrow angles of view. Short focal lengths give large depth of field; long
focal lengths give small depth of field. The closer the subject is to the
camera, the smaller the depth of field becomes.
Very closeup photography usually means very limited depth of field. If a
test exposure shows some desired areas are out of focus, depth of field
can be increased by using a smaller lens aperture (higher f stop). A
smaller aperture may require a slower shutter speed and cause blurring
from camera movement.
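The trend from wide to narrow angles of view can be made concrete with the standard formula 2·arctan(w / 2f), where w is the sensor width and f the focal length. The 36 mm sensor width below is an assumed full-frame value, not a figure from this module.

```python
import math

def angle_of_view_deg(focal_length_mm, sensor_width_mm=36.0):
    """Horizontal angle of view: 2 * arctan(sensor_width / (2 * focal_length))."""
    return math.degrees(2 * math.atan(sensor_width_mm / (2 * focal_length_mm)))

# Short focal lengths give wide angles of view; long ones give narrow angles:
for f in (28, 50, 80, 200):
    print(f"{f} mm -> {angle_of_view_deg(f):.1f} degrees")
# 28 mm -> 65.5, 50 mm -> 39.6, 80 mm -> 25.4, 200 mm -> 10.3
```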
Capturing Details
Digital cameras focus automatically, but the object of interest must be
centered in the camera's field of view. Also, the range finder used for
focusing in many cameras is acoustic and will not work through
transparent obstacles such as window glass.
Capturing Details
Zoom Lens
Zooming is a lens adjustment that permits seeing detailed, close shots of
a subject (scene target) or a broad, overall view. Zooming allows a
smooth, continuous change in the angular field of view so the angle of
view can be made narrower or wider depending on the setting. As a
result, a scene can be made to appear close or far, giving the impression
of camera movement even though the camera remains in a fixed
position. Zoom lenses that accomplish this have variable focal length.
Capturing Details
Zoom lenses have variable focal lengths, an advantage over lenses of
fixed focal length. The lens components in these assemblies are moved
to change their relative physical positions, thereby varying the focal
length and angle of view through a specified range of magnifications.
The zoom lens is a cleverly designed assembly of lens elements that can
be moved to change the focal length from a wide angle to a narrow angle
while the image on the sensor remains in focus. The telephoto lens for
long distances is a zoom lens (Fig. 15). To achieve a variable focal
length lens, at least three groups of elements are needed.
Figure 15 Telephoto field of view.
Capturing Details
The front focusing objective group can be adjusted over limited distance
with an external focus ring to finely focus the lens. Between the front and
rear groups is a movable zoom group, which is moved appreciably (front
to back) using a separate external zoom ring. The zoom group also
contains corrective elements to optimize the image over the full zoom
focal length range. Connected to this group are other lenses that are
moved a small amount to automatically adjust and keep the image on
the sensor in sharp focus, thereby minimizing the external adjustment of
the front focusing group.
Capturing Details
At the camera end of the zoom lens is the rear stationary relay group,
which determines the final image size when it comes to a focus on the
camera sensor. Each group normally consists of several elements. When
the zoom group is positioned correctly, it sees the image produced by
the objective group and creates a new image from it. The rear relay
group picks up the image from the zoom group and relays it to the
camera sensor. In a well-designed zoom lens, a scene in focus at the
wide angle (short focal length) setting remains in focus at the narrow
angle (telephoto) setting and everywhere in between.
Capturing Details
Figure 16 shows the continuously variable nature of the zoom lens. The
field of view of the camera can be changed without replacing the lens.
Elements in the lenses are physically moved to vary the
focal length and thereby vary the angular field of view and magnification.
By adjusting its zoom ring setting, the zoom lens permits viewing of
narrow, medium and wide angle scenes. This allows a person to initially
view a scene with a wide angle perspective and then use close telephoto
viewing of one portion of the scene that is of particular interest.
Figure 16 Operation of zoom: (a) lens system; (b) wide
view; (c) maintaining focus while zooming; (d) closeup.
Notice that lens 1 moves to maintain focus.

Video 1 - How Lenses Function


Video 2 – Convex and Concave Lenses
Capturing Details
Pixels
Digital photographs are designed for a computer display. A display
consists of a grid of squares called picture elements, or pixels (Fig. 17).
Displays range from 320 x 240 pixels to over 3840 x 2400. A standard
display of 800 x 600 pixels is called a super video graphics array. It has been
superseded in most computers but remains popular in mobile devices.
Each pixel is part of a larger image, and more pixels provide a more
accurate representation of the original. Pixels are rectangular and on
computer monitors are usually square. The square shape is an artifact of
the display: the original electronic datum is dimensionless. Pixels may be
Figure 17 Array of pixels in digital display.
Capturing Details
translated into dots for printing. Dot and pixel are interchangeable terms:
for example, dots per inch (DPI) and pixels per inch (PPI) are equivalent
measures of resolution.
A bit map defines an image and color for all pixels in the image. TIF and
JPG files are bit mapped. A bit map does not need to contain color coded
information for each pixel in every row. It needs to contain information
indicating a new color as the display scans along the row. For this
reason, an image with much solid color requires a small bit map.
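The observation that a bit map need only record where the color changes along a row is the idea behind run-length encoding. A minimal sketch of the principle (an illustration only, not the actual TIF or JPG scheme):

```python
def run_length_encode(row):
    """Encode a row of pixel values as [value, run_length] pairs."""
    runs = []
    for value in row:
        if runs and runs[-1][0] == value:
            runs[-1][1] += 1  # same color continues: extend the current run
        else:
            runs.append([value, 1])  # new color: start a new run
    return runs

# A row of mostly solid color compresses to very few pairs:
row = [255] * 100 + [0] + [255] * 99  # white, one dark pixel, white again
print(run_length_encode(row))  # [[255, 100], [0, 1], [255, 99]]
```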
Capturing Details
The term pixel can also mean an element in a sensor array. An array is a
group of adjacent receptors each of which records a single point of
radiation and which together provide a composite picture. Such arrays
are in cameras and other radiation detectors (X-ray fluoroscopes and
infrared cameras) at various wavelengths.
Color
A color is described by its hue, saturation and value (HSV). Hue is what
is normally thought of as color. A digital color in a computer display is a
blend of three primary hues: red, green and blue. Saturation is the
intensity or dominance of a hue: a hue can be increased or diminished in
an image just as black versus white can be adjusted in an image through
all shades of gray. Value is the lightness or darkness of a hue and is
usually called brightness when the three hues are combined in white
light. Image processing software permits the intensity and value of these
hues to be adjusted independently within each image.
Video 3 Hue, Saturation and Value
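The relationship between red-green-blue values and hue, saturation and value can be explored with Python's standard colorsys module; the particular colors below are arbitrary examples.

```python
import colorsys  # standard library; works on floats in the range 0.0 to 1.0

r, g, b = 0.0, 1.0, 0.0                  # pure green
h, s, v = colorsys.rgb_to_hsv(r, g, b)
print(h, s, v)                           # hue 0.333..., full saturation, full value

# Halving the value darkens the color without changing its hue:
print(colorsys.hsv_to_rgb(h, s, v / 2))  # (0.0, 0.5, 0.0)
```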
Color
If the image is converted to gray scale, as in a report printout, care must
be taken that features of interest remain visible: a fluorescent
nondestructive test indication in brilliant green may disappear entirely if it
changes to a shade of gray like that in an adjacent area.
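A sketch of why an indication can vanish, using one common luminance weighting (ITU-R BT.601); the indication and background colors are invented for illustration.

```python
def to_gray(r, g, b):
    """Convert a color to gray with the common BT.601 luminance weights."""
    return round(0.299 * r + 0.587 * g + 0.114 * b)

indication = (0, 255, 0)      # brilliant green fluorescent indication
background = (150, 150, 150)  # mid-gray adjacent area
print(to_gray(*indication))   # 150
print(to_gray(*background))   # 150: in gray scale the indication disappears
```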
The camera setting of white balance can improve the realism of color
photographs by preventing unnatural tints. Image processing programs
also enable improvement of color balance. Program menus may refer to
it by other terms, such as color intensity or hue saturation.
Color
If a photograph is taken in a digital camera's RAW file format, image
colors can be corrected without information loss. Most cameras take RAW
photos in twelve-bit color (4096 shades per color) instead of eight-bit
color (256 shades per color), enabling powerful balance adjustments
without visible loss in quality.
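The arithmetic behind those shade counts, with a hypothetical helper showing one simple way the extra precision is discarded when a twelve-bit sample is finally reduced to eight bits:

```python
def raw12_to_8bit(value12):
    """Map a 12 bit sample (0 to 4095) to an 8 bit sample (0 to 255)."""
    return value12 >> 4  # divide by 16: sixteen RAW levels per output level

print(2 ** 12, 2 ** 8)      # 4096 256 shades per color channel
print(raw12_to_8bit(4095))  # 255
# Adjustments made before this step work in the finer 4096 level space,
# so rounding errors stay below what eight-bit output can show.
```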
Color
Size
The more pixels in an image, the more information it carries. An image's
quantity of pixels is its true size in the dimensionless electronic space of
the computer.
To be viewed, the image must be translated to a medium such as a
computer display or printed page, where details become visible to
human eyes in order to be interpreted. In a process known as scaling,
the pixels can be compressed or spread out to fit the desired viewing
surface (Fig. 18). Scaling affects resolution, discussed below.
Figure 18 Example of scaling of digital images.
Color
Image size and resolution are usually expressed in terms of image width,
in the horizontal dimension. The size and resolution of the vertical scale
can be manipulated independently but to prevent distortion usually
undergo the same processes as the horizontal.
The number of pixels in an image can be decreased or increased in a
step called resampling or conversion. Resampling cannot add more data
to a photograph once it has been shot.
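A sketch of resampling using the Pillow library (an assumption; the module names no software, and the file name is hypothetical). The round trip shows why resampling adds no data: downsampling discards pixels, and upsampling afterward only interpolates.

```python
from PIL import Image

original = Image.open("weld_photo.jpg")  # hypothetical test photograph
smaller = original.resize(
    (original.width // 2, original.height // 2), Image.LANCZOS)
restored = smaller.resize(original.size, Image.LANCZOS)
# "restored" has the original pixel count but only the smaller image's detail.
```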
Color
Resolution
The resolution of an image is its ability to distinguish small adjacent
objects. Two thin adjacent lines appear as two thin lines, for instance,
and not as one thicker line. Resolution depends on the number of pixels
in the image and hence on its file size. In Fig. 19, two versions of the
same image show the difference between high and low resolution.
Details in an image cannot be restored simply by converting it to a higher
resolution or from a lossy to a lossless format. Once an image's
resolution is reduced, details are lost. If a display's resolution exceeds an
image's resolution, the image will look blurry.
Figure 19 Visible light photograph showing magnetic particle indication
under ultraviolet lamp: (a) high resolution, at 40 pixels·cm-1 (100 PPI); (b) low
resolution, at 10 pixels·cm-1 (25 PPI). Lower resolution blurs details.
Lowering resolution sometimes enhances contrast or increases hue intensity,
side effects better achieved by image processes that do not sacrifice detail.
Color
Resolution is measured in dots per inch (DPI) for printers or pixels
per inch (PPI) for computer displays, measured along the
horizontal axis. Alternatives using SI metric units are to specify pixels per
centimeter (pixels·cm-1) or the pixel width in micrometers (μm).
Because the number of pixels in a scalable image is fixed, however, an
expression of resolution is meaningless unless the viewed image size is
also specified. If a given digital image is reduced to half its width, for
instance, the number of pixels per unit of width and the resolution are
doubled (Fig. 18). The data in the image file may remain the same, but
smaller size makes details harder to see. Someone processing digital
images must make decisions balancing image size versus resolution.
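The halving example can be checked with simple arithmetic; the pixel count and display widths below are invented for illustration.

```python
pixels_wide = 1600  # fixed number of pixels in a hypothetical image

for display_width_in in (16.0, 8.0):
    ppi = pixels_wide / display_width_in
    print(f"{display_width_in} in. wide -> {ppi} PPI")
# 16.0 in. wide -> 100.0 PPI
# 8.0 in. wide -> 200.0 PPI (half the width, double the resolution)
```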
Color
Brightness
Brightness is the common term for the luminance of light emitted by each
pixel. Usually, one control setting adjusts brightness to affect all pixels in
an image simultaneously. This parameter can be adjusted to
compensate for poor visibility on overcast days. Too little brightness can
make a picture dark as night, and too much brightness can make it
washed out so no features stand out.
Color
Saturation
In a color image, saturation is the intensity of a hue. A hue of zero
saturation appears as a neutral gray whose lightness depends on value. If all
hues in an image have zero saturation, the image is a gray scale image.
Color
Contrast
The setting of contrast controls the degree of difference between light
intensities at different points in the image. Like brightness, contrast is
adjusted to affect all pixels in an image at once. As contrast is increased,
for example, a dark indication becomes easier to see on a bright
background and a bright indication becomes more visible on a dark
background.
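A minimal sketch of one common convention for these two adjustments (not necessarily the formula any particular program uses): contrast scales pixel values about mid-gray, brightness shifts them all equally, and results are clipped to the 0 to 255 range.

```python
def adjust(value, contrast=1.0, brightness=0):
    """Apply contrast about mid-gray (128), then a brightness offset."""
    result = contrast * (value - 128) + 128 + brightness
    return max(0, min(255, round(result)))  # clip to the displayable range

print(adjust(100, contrast=1.5))   # 86: dark pixel pushed darker
print(adjust(180, contrast=1.5))   # 206: bright pixel pushed brighter
print(adjust(100, brightness=40))  # 140: every pixel lifted equally
```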
Color
Contrast and brightness interact closely, and the inspector must
sometimes adjust them to find the correct settings for each photograph.
Figure 20 shows the effect of brightness and contrast on an image.
Figure 20 Photograph of cracked rod: (a) high contrast,
high brightness; (b) high contrast, low brightness; (c) low
contrast, high brightness; (d) low contrast, low brightness.
Image Integrity
Inspectors may be tempted to enhance images to justify accept/reject
decisions. This temptation should be treated with caution. The inspector
may consider several rules of thumb.
1. Local features including indications must not be exaggerated or
obscured using image processing tools having descriptors such as
pencil, eraser, flood, spray or smudge. These alterations of an image
could be considered falsification of test results.
2. Image settings such as zoom, brightness and contrast may be freely
adjusted, as photographers have done for generations.
Image Integrity
3. A photograph of the reference standard should be included in the
archive and used to evaluate modifications made to test images. How do
the same modifications affect discontinuity visibility in the reference
standard?
For some applications, a written statement on image processing may be
a desirable part of an inspection procedure. Accept/reject decisions are
usually made at the moment of inspection and in the presence of the test
object. If some decisions are made later, that protocol can be
documented in the procedure.
Image Integrity
In some archives, a file's metadata, including its creation date, are part of
its documentation. These metadata can be lost if an image is saved in
another format.
Videos
The considerations discussed above for still images pertain also to
motion pictures, or movies. The term video applies specifically to movie
signals transmitted electronically. There have been three different
platforms for moving pictures.
1. Film projection was used for commercial cinema in the twentieth
century. Each frame was in effect an individual color slide, and they were
sequenced on spools, or reels. A long movie consisted of thousands of
frames and required several reels. Film of smaller width, and hence less
resolution, was widely used at home and by industry. Some film was in
black and white.
Videos
2. Analog video was used for television. The moving images were
converted electronically into a series of horizontal lines that scanned
across the screen successively in a raster pattern. Analog television can
be in color or in black and white.
3. In digital displays, a movie consists of a series of still images, or
frames, viewed in succession to recreate the appearance of motion (Fig.
21). Digital video can be viewed on a computer display or on a digital
television screen.
Figure 21 Falling object in video: (a) 30 frames per
second; (b) 25 frames per second.
Videos
Frame Rate
In digital video, moving pictures are produced by successive display of
what are essentially still images, each image corresponding to a frame.
Figure 21 uses the example of a falling object to show the difference
between two frame rates. Capture rates of 60 frames per second are
common in digital video cameras. Human perception performs well in
viewing frame rates over 20 frames per second. Movies from digital files
are each encoded to play at a particular rate, in synchrony with the
computer's internal clock.
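The frame spacing in Fig. 21 can be reproduced with simple kinematics. Assuming free fall at 9.8 m/s², the object drops farther between consecutive frames at the lower rate:

```python
G = 9.8  # gravitational acceleration, m/s^2

for fps in (30, 25):
    dt = 1.0 / fps  # time between frames, s
    positions = [0.5 * G * (n * dt) ** 2 for n in range(4)]  # fall distance, m
    print(fps, [round(p, 4) for p in positions])
# 30 [0.0, 0.0054, 0.0218, 0.049]
# 25 [0.0, 0.0078, 0.0314, 0.0706]
```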
Videos
Personal computers can display video recordings at any frame rate the
visual inspector is likely to encounter.
Before the move to digital video in 2009, three systems were used
worldwide for analog video: the National Television System Committee
(NTSC) system, the phase alternating line (PAL) system and the SECAM
(séquentiel couleur à mémoire, "sequential color with memory") system.
Figure 21 shows the difference between NTSC and PAL formats in terms
of frame rate, NTSC being 30 and PAL being 25 frames per second. The
two formats were incompatible in the years of analog video tape.
Videos
Personal computers can display digitized video recordings at either
frame rate.
Analog video is not a compilation of successive still images as in
digital video but rather a series of cycles in a raster signal. What
appears to be a single image when a video tape is paused is in fact an
image that the video player has constructed from successive lines of the
continuous raster signal. Discussions of video electronics often refer to
frame rate in hertz (Hz) - that is, cycles per second. The two measures
are equivalent; 1 frame per second equals 1 Hz.
Closing
In practice, the viewing quality of a digital photograph depends on
several factors.
1. The visual acuity of the viewer can be impaired by medical
conditions such as myopia and color blindness.
2. The visibility of a shot is affected by things such as light and camera
position.
3. The useful detail in an image is affected by its size and by its quality in
terms of features such as contrast and resolution.
Closing
Technology has given the inspector options beyond the scope of the
present discussion: (1) light measurement and illumination, (2) optical
gaging and range finding, (3) photography sensitive to other
wavelengths, including ultraviolet and infrared, and (4) video
documentation of inspection, more for procedures than for indications.
These and other options depend on hardware. Further difficulties are
caused by marine environments or extremes of altitude or temperature.
Closing
Digital technology has revolutionized the practice of visual testing.
Personal computers are used for processing, archiving and transmitting
images. Remote and portable devices integrate microprocessors for
recording and transmission of test images on site. And advances in
battery life, hard disk memory and processor speed have made possible
the acquisition of test documentation, including video records,
impractical to obtain in the twentieth century.
