
Materials and Manufacturing Processes

ISSN: 1042-6914 (Print) 1532-2475 (Online) Journal homepage: https://www.tandfonline.com/loi/lmmp20

An Approach to Improved CNC Machining Using Vision-Based System

Ghassan Al-Kindi & Hussien Zughaer

To cite this article: Ghassan Al-Kindi & Hussien Zughaer (2012) An Approach to Improved CNC
Machining Using Vision-Based System, Materials and Manufacturing Processes, 27:7, 765-774,
DOI: 10.1080/10426914.2011.648249

To link to this article: https://doi.org/10.1080/10426914.2011.648249

Published online: 24 May 2012.



An Approach to Improved CNC Machining Using Vision-Based System

Ghassan Al-Kindi¹ and Hussien Zughaer²
¹Faculty of Engineering, Sohar University, Sohar, Sultanate of Oman
²College of North Atlantic Qatar, Doha, Qatar

CNC machines still suffer from machine blindness: they cannot automatically assess the performance of the machining tasks they execute. In this article, an approach is presented to improve the performance of CNC machining by utilizing an on-line vision-based monitoring and control system. To facilitate the integration of computer vision with CNC machines, a system is proposed and developed to tackle a number of pinpointed issues that obstruct such integration. A practical, executable methodology is developed to enable its beneficial implementation on lab-scale CNC milling machines. Two different models of bench-type CNC machine are employed to generalize the findings. Two cameras are mounted on the machine spindle of each of the two employed CNC machines to provide valid image data according to the cutting direction. Proper selection and activation of the relevant camera is achieved automatically by the developed system, which analyzes the most recently conducted tool path movement to decide which camera is to be activated. In order to assess the machined surface quality and cutting tool status, image data are processed to evaluate the resulting tool imprints on the machined surface. An indicating parameter to assess the resulting tool imprints is proposed and used. The overall results show the validity of the approach and encourage further development to realize industrial-scale intelligent vision-controlled CNC machines.

Keywords Improved CNC machining; Intelligent machining; Surface roughness; Vision system.

INTRODUCTION

Although CNC machines are currently regarded as the heart of machining workshops, parts produced on these machines may not be as precise as expected. CNC machines cannot automatically monitor and judge the effectiveness of the applied machining parameters (i.e., cutting speed, feed rate, and depth of cut), tool wear and life status while machining, or the resulting accuracy and quality of machined surfaces. Providing CNC machines with abilities to monitor and assess such tasks will open the way to introducing additional feedback control to these machines, hence improving their machining performance [1]. Surface roughness is one crucial parameter that affects the preciseness of manufactured parts [2]; therefore, surface roughness could be used to judge the quality of machining and the tool condition.

Several studies on optimizing the cutting parameters to produce the best surface roughness have been conducted [3–6]. Although positive results are demonstrated in these studies for obtaining the best possible finish for a given tool shape and feed, satisfactory implementation of these studies can only be attempted if built-up edge, chatter, inaccuracies in machine-tool movement, etc., are eliminated [7], which is extremely difficult to achieve in real applications.

Roughness measurement is frequently achieved by stylus-based equipment; however, using such a technique to acquire on-line roughness measurement while machining is not possible due to the induced machine vibration and the practical equipment setup. Recently, vision-based roughness measurement has started to gain attention for acquiring real-life data [8, 9]. If such techniques were introduced in CNC machines, they would enhance their performance and functionality by deciding whether to keep or alter the employed machining parameters, adjust the machining path, or apply a tool change.

In order to introduce a vision-based roughness measurement technique to CNC machines, a number of scientific as well as technical issues need to be investigated and handled, including scene visibility issues (e.g., camera mounting, camera field of view, camera-optics calibration, mapping of world coordinates to image coordinates, image acquisition of the relevant view to meet the machining path, effects of coolant on the image quality, effects of cut chips blocking the scene, lighting, and workpiece color variation and reflection characteristics), effects of machine vibration on the quality of obtained roughness measurements, selection of a suitable roughness parameter to be employed in the analysis of vision data, and the effect of machining parameters on acquired vision-based roughness measurements.

The pioneering research to integrate CNC machines with vision systems is presented in [1], though only limited success was gained at that time due to hardware limitations. Several attempts were made later to gain on-line assessment of machining tools [10–14].

(Received November 13, 2011; Accepted November 15, 2011. Address correspondence to Ghassan Al-Kindi, Faculty of Engineering, Sohar University, P.O. Box 44, PC-311, Sohar, Sultanate of Oman; E-mail: gkindi@soharuni.edu.om)

However, the conducted literature review confirmed the following:

1. No comprehensive study is yet available that shows the complete relationship between the machining parameters and the roughness measurements obtained using a vision-based technique.
2. Currently, vision-controlled CNC machines are not offered as commercial products.
3. To date, CNC machines still suffer from machine blindness.
4. Vision-based techniques have started to gain research popularity in roughness measurement applications.

SYSTEM HARDWARE

In this research, two separate experimental CNC-vision rigs are established using two different models of CNC machine. These two different hardware setups aim at facilitating the generalization of the developed model so that it can be applied to other models and types of CNC machine. In the first rig, a Denford "Novamill" CNC milling machine is employed, whereas in the second rig, an Intelitek "Super-Prolight 1000" CNC milling machine is used. Both machines can be characterized as bench-type, PC-driven, Fanuc-controller-emulated machines. In order to introduce a vision system to these two rigs, high-resolution Logitech C-910 cameras, which provide 24-bit images with a real resolution of 2,592 × 1,944 pixels, are used to acquire valid machined-surface data and avoid the scene of the machined surface being blocked by the tool or tool movement. Two cameras are used per machine; these four cameras are mounted on the two machine spindles to provide a plan view of the machined part. Ambient lighting is used to minimize the effects of specular light reflection from the metallic work parts that are machined [15]. Figures 1 and 2 show photographs of the developed rigs on CNC machines at work.

FIGURE 1.—Hardware setup utilizing the "Novamill" CNC milling machine.

FIGURE 2.—Hardware setup utilizing the "Super-Prolight 1000" CNC milling machine.

THE MACHINING TASK

High-speed steel (HSS) end milling cutters of Ø 10 mm are used for the experiments. Square workpieces made of commercial aluminium are used, and their surfaces are machined using varying machining parameters. Depths of cut in the range of 1–3 mm, feed rates from 50 mm/min up to 300 mm/min, and a cutting speed of 1,000 rpm are used in the investigation. More than 60 specimens were machined using varying values of the machining parameters from the above ranges to enable the assessment of their surface finish. In order to provide a comparison base for roughness measurement using the vision system, the surface roughness of these machined specimens is measured using an SE1200 Kosaka-Lab stylus-based profile-meter; see Fig. 3.

FIGURE 3.—SE1200 stylus-based roughness profile-meter, machined specimens, and sample of acquired roughness parameter results.

Seventeen different standard surface roughness parameters are computed for each specimen. These include Ra, Rz, Rp, Rv, Rq, RSm, Rmr, Rt, Rsk, Rku, Rk, Rpk, Rvk, Mr1, Mr2, A1, and A2 [8]. Figure 4 shows examples of computed values of some of these parameters.

FIGURE 4.—Example of computed roughness parameters for a set of machined specimens.

Analyses of the acquired results for these different roughness parameters show that it is not always possible to predict the resulting surface roughness parameters, even if only one machining variable is varied while all other machining variables are kept constant. Hence, based on the acquired results, none of the employed surface roughness parameters could be used with confidence in this research to develop a reliable model that predicts and evaluates the performance of the machining parameters. For instance, it is evident from Table 1, which includes examples of the acquired measurements, that although all machining parameters are kept constant apart from the feed rate, the resulting amplitude parameters of stylus-based surface roughness do not always provide consistent values that can be directly related to the feed rate used. Therefore, there is a real need to introduce alternative surface roughness indicators that could provide a direct and consistent relationship between the resulting surface quality and the employed machining parameters.

TABLE 1.—Example of resulting surface roughness for a set of machined specimens. (Speed of cut is set to 1,000 rpm.)

Sample no.   Feed rate (mm/min)   Depth of cut (mm)   Ra (µm)   Rt (µm)
 1            50                  2                   3.145     18.932
 2           100                  2                   1.719      8.810
 3           150                  2                   3.301     16.896
 4           200                  2                   3.581     22.680
 5           250                  2                   5.132     22.340
 6           300                  2                   6.045     30.184
 7            50                  3                   3.132     17.344
 8           100                  3                   2.106     12.976
 9           150                  3                   1.700     13.978
10           200                  3                   3.125     17.534
11           250                  3                   3.111     19.434
12           300                  3                   4.012     19.563

Nevertheless, if such novel surface roughness indicators are introduced, they are not meant to replace widely used and well-known roughness parameters such as Ra; rather, they will provide the beneficial advantage of allowing the performance of the cutting conditions to be evaluated in conjunction with the currently used roughness parameters.

It should be noted that surface roughness resulting from machining operations has features spanning both micro- and nanoscale regions [16], as shown in Fig. 5. While both micro- and nano-features are of high importance in engineering applications, the microscale region usually results from the tool cutting edge imprint on the workpiece surface, and hence on-line monitoring of this roughness scale region will provide an advantageous judgement on the cutting tool condition and life.

FIGURE 5.—Micro- vs. nanoscale roughness.

In face milling operations, tool imprints on the machined surface are mainly influenced by the feed per tooth.
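For orientation, the stylus-derived amplitude parameters reported in Table 1 reduce to simple formulas; the following is a minimal illustrative sketch (ours, not the authors' code): Ra is the arithmetic mean deviation from the mean line and Rt is the total peak-to-valley height of a sampled profile.

```python
# Sketch: computing the amplitude parameters Ra and Rt of Table 1 from a
# sampled roughness profile (illustrative only; function name is ours).

def amplitude_parameters(profile):
    """Return (Ra, Rt) for a list of profile heights in micrometres."""
    n = len(profile)
    mean = sum(profile) / n                       # mean line
    ra = sum(abs(z - mean) for z in profile) / n  # arithmetic mean deviation
    rt = max(profile) - min(profile)              # total peak-to-valley height
    return ra, rt

# Example: a toy triangular profile centred on zero
profile = [0.0, 2.0, 4.0, 2.0, 0.0, -2.0, -4.0, -2.0]
ra, rt = amplitude_parameters(profile)
print(ra, rt)  # 2.0 8.0
```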
In other words, the tool imprints on the machined surface are affected by the cutting speed, the tool diameter, the number of cutting teeth on the tool, the feed rate used, and the cutting path overlap (if any). Theoretically, tool movement along the main axis of the cutting direction is expected to generate a tool imprint rate It (imprints/mm) of

    It = (60 · V · tn) / (π · d · f),    (1)

where V is the tangential cutting speed (mm/sec), tn is the number of cutting teeth on the tool, d is the tool diameter (mm), and f is the cutting feed rate (mm/min). For example, a two-tooth Ø 10 mm cutter running at 1,000 rpm (V ≈ 524 mm/sec) with f = 100 mm/min would theoretically leave It = 60 × 524 × 2/(π × 10 × 100) ≈ 20 imprints/mm.

It is worth stating that in face milling each tool tooth intersects the axis of the tool movement twice in each single rotation. Hence, this may alter the resulting pattern of the tool imprints on the machined surface if the second interception takes place at a position deviated from the already imprinted cuts. However, from a theoretical point of view, the added imprints will generate a similar frequency of imprints, though with a certain phase shift. Such a phase shift Ps (mm), if it occurs, can be estimated by

    Ps = (f · It)/d − int((f · It)/d),    (2)

where int(x) is the integer value of x.

Monitoring the resulting frequency of tool imprints on the machined surface provides a beneficial indication of the tool condition, tool life, and, correspondingly, microscale surface roughness. For instance, if one or more of the tool teeth have broken or worn out, the resulting pattern of tool imprints on the machined surface is altered and could therefore be recognized, provided a valid procedure for on-line monitoring of such tool imprints is developed and applied. However, in practical applications, there are many other factors that affect the resulting pattern of tool imprints on the machined surface. The important factors are built-up edge, chips trapped at the tool/workpiece interface, chatter, inaccuracies in machine-tool movement, micro-irregularities of the tool cutting edges, and other factors related to metal shear behavior. All of these factors present a challenge to any developed model that aims at automatically monitoring and controlling the resulting tool imprints.

CNC-VISION INTEGRATION

In general, CNC machine controllers do not provide an open platform for users to develop supervisory control programs [17]; hence, in order to facilitate the execution of this research, a detailed technical design is proposed and employed to enable the integration of the vision system with the employed CNC machines. The proposed system design utilizes three computers as follows (Fig. 6):

a. A dedicated computer to control the CNC machine
b. A dedicated computer to control the vision system
c. A dedicated computer that acts as the main server-controller, which is networked with the other two computers to facilitate on-line data exchange, feedback, and decision making

FIGURE 6.—The proposed system concept.

Figure 7 provides a flowchart of the CNC machining vision system. The input to the system's main computer is a normal G-code program that defines the required tool path, the selected machining parameters, and other miscellaneous functions.

FIGURE 7.—Flowchart of the CNC machining vision system.

A dedicated program is developed by the authors to enable data processing and system implementation. The main computer is used to execute the developed program. The program analyzes and processes the input G-codes in order to automatically segment the tool path and output a set of regenerated G-codes, thus allowing the implementation of on-line monitoring and control. Each block of the regenerated G-codes facilitates one single segmented tool path movement of a predefined linear (or circular) length, where such movement is extracted from the originally programmed tool path. The execution of the newly generated G-code program ensures the exact same tool path, machining parameters, and miscellaneous functions, though it includes subroutine calls and other control functions, as will be discussed later. The regenerated G-codes are then stored in the common storage to enable their loading and execution by the CNC controller. Once the CNC controller starts to execute the newly generated G-code program, it will sequentially load and execute certain attached G-code subroutines that are also automatically created and attached to the main G-code program. These G-code subroutines include the spindle speed and feed rate, as well as other added control codes; hence, the values of cutting speed and feed rate can be replaced by other decided values without interrupting the execution of the main G-code program. The instantaneous updating of the speed of cut and/or feed rate values is achieved by replacing the relevant G-code subroutine with a new version that includes the new values of these machining parameters instead of the current values. The new values of the machining parameters are decided by an automatic decision-making action based on intelligent surface roughness feedback gained via the vision system.

In order to overcome the problem of coolant and produced chips obstructing the visual scene of the vision system, compressed air is used instead of liquid coolant. Activation and deactivation of the compressed air is controlled via a solenoid valve that acquires its activation signal via the main G-code program. The compressed air, in this case, also serves to remove the cut chips away from the camera field of view; to achieve this, the following sequential control steps are used in the main G-code program:

a. Activate the air coolant by generating the relative output signal to the solenoid; for the Intelitek CNC machine, we used the code (M25 H11) to set the signal to high level at the output channel H11, which turns the compressed air flow on, and the code (M26 H11) to set the output signal to low level, hence turning the compressed air flow off.
b. Execute one segmented movement of the machining path; a segment length of 10 mm is used in the conducted experiments of this research, and the generated G-codes therefore include execution blocks that enable movements of 10 mm at a time. However, other values can also be selected to segment the tool path, though the resulting tool path segments should be longer than the usual tracing length used in the evaluation of tool imprints or surface roughness.
c. Wait for a defined short duration while the compressed air is still on, using G04 (dwell cycle); in the practical tests, we set the time for this task to two seconds and gained successful results. Once this step is executed and the air-blast solenoid is turned off, the vision system starts to acquire the image; the activation of the vision system is also done via an external trigger signal gained from another output channel of the CNC machine controller, whereas camera selection is achieved by extracting the cutting direction from the most recently executed block of the segmented G-code program.
d. Apply the image-processing procedure to the relevant surface image data. This stage involves intelligent identification of the relevant region of interest to extract image data for processing. Once the image data are processed, new values of the machining parameters are decided according to an automatic decision-making procedure. These new machining parameters are written to the relevant G-code subroutine. It should be noted that the image region of interest aims to monitor the cutting tool performance just as the tool completes the cutting of the most recent tool path segment; hence, if no cutting is conducted within this tool path segment (e.g., the cutting tool is approaching the workpiece), then there is no need to acquire and process the relevant image data or update the machining parameter values.
e. The CNC machine waits for another input signal, sent by the vision system, to acknowledge that the image processing for this step is completed and the new G-code subroutine is ready to be loaded and executed.
f. Load and execute the newly generated G-code subroutine on the CNC controller; thus, the new values of the machining parameters are applied.
g. Repeat from step (a), continuing with the next segmented movement of the tool path.

The above steps were tested successfully on the two developed rigs; however, some alterations were made to suit each of the two CNC machine controllers. For instance, the Novamill CNC controller allows subroutine calls from outside its environment, so the developed method works directly, whereas the Super-Prolight CNC machine controller does not allow this. Hence, in order to overcome this limitation, we used the "chain to program" code (G20) instead of a subroutine call. It should be noted here that the chain-to-program code G20 does not provide a way to come back to the main G-code program; hence, the regeneration of the G-code program, which is achieved at the first stage, was altered to generate and chain the newly generated G-code programs instead of using the subroutine call function. Figure 8 presents an example of a simple input G-code program for the Super-Prolight CNC machine and the automatically generated output G-code subroutines from the system to run on the CNC machine controller.

FIGURE 8.—Example of an input and output G-code programs (color figure available online).
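To make the segmentation scheme concrete, the following sketch (ours, not the authors' regeneration program) splits one linear G01 move into fixed-length segments and wraps each segment with the air-blast and dwell codes of steps (a)–(c). The M25/M26 H11 words follow the Intelitek example above; the function name is an assumption, and the dwell word format (here G04 P2 for two seconds) varies between controllers.

```python
import math

# Sketch of the tool-path segmentation idea: one G01 move is replaced by
# fixed-length segments (10 mm, as in the experiments), each wrapped with
# air-blast on/off and a dwell so the vision system can grab an image.
# Illustrative only -- not the authors' actual G-code regeneration program.

SEG_LEN = 10.0  # mm; must exceed the tracing length used for imprint evaluation

def segment_linear_move(x0, y0, x1, y1, feed, seg_len=SEG_LEN):
    """Return G-code blocks reproducing G01 X{x1} Y{y1} as segments."""
    dist = math.hypot(x1 - x0, y1 - y0)
    n_seg = max(1, math.ceil(dist / seg_len))
    blocks = []
    for k in range(1, n_seg + 1):
        t = k / n_seg
        xk, yk = x0 + t * (x1 - x0), y0 + t * (y1 - y0)
        blocks.append("M25 H11")                           # air blast on (step a)
        blocks.append(f"G01 X{xk:.3f} Y{yk:.3f} F{feed}")  # one segment (step b)
        blocks.append("G04 P2")   # dwell while air clears chips (step c);
        blocks.append("M26 H11")  # air blast off, then image acquisition
    return blocks

# A 25 mm move in X becomes 3 segments of ~8.33 mm each:
for line in segment_linear_move(0, 0, 25, 0, feed=100):
    print(line)
```

In the paper's system the regenerated blocks additionally call the swappable speed/feed subroutine, which is what lets the decision-making stage update the machining parameters between segments.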

VISION DATA PROCESSING AND TOOL IMPRINT ASSESSMENT

Roughness measurement using vision data is not a straightforward task. It requires considerable attention and the development of a methodology that works reliably to provide on-line, semi-real-time results. Although many published articles confirm that vision data provide valid information regarding surface roughness (e.g., references [8, 9]), this task requires further investigation to suit the environment of each of the two developed CNC-vision systems and, ultimately, the general case.

An experimental investigation is carried out by the authors of this article to analyze the images of the sixty machined specimens. Eight images of each specimen are acquired and processed to extract relative roughness data. Figure 9 provides examples of images acquired from the machined specimens' surfaces for several samples, whereas Fig. 10 gives examples of pixel intensities across single lines of the Fig. 9 images. It can be noted from Fig. 10 that although roughness information is inherent in the vision data, the image data must undergo several processing stages to extract reliable roughness measurements from them. One processing stage should be dedicated to removing localized lighting (reflection) variation from these images, a process which could be seen as equivalent (but not identical) to the separation of waviness from roughness when employing stylus-based data.

FIGURE 9.—Examples of acquired images of several machined specimens.

FIGURE 10.—Examples of unprocessed image data of lines across several machined specimens.

In this research, we have developed a technique that aims to overcome the problem areas associated with acquiring a measure of surface roughness using vision data. The technique is designed to face the problems associated with irregular light reflection and color variation of metal work parts. It consists of the following steps:

Step 1: Compute a "reference surface" (h) to act as a base in the assessment of surface roughness; we have proposed a modified version of the moving-average principle to suit the developed system. The proposed technique is as follows:

a. If g is an array of image data of size I points (pixels) that represents a line across the surface, the reference surface h is defined as:

    h(t) = g(t) for t = 1 to j,
    h(i) = g(i) for i = I − j + 1 to I,
    h(i) = [Σ from m = i−j to i+j of g(m)] / (2j + 1) for i = j + 1 to I − j,

where (2j + 1) is the length of the local-average subarray, which is set to a suitable odd number; this parameter is set to the value of 5 in the conducted experiments and provides excellent results.
b. Set

    g(i) = h(i) for i = 1 to I.

d. Repeat process (a) above n times, where n is the constant of the applied calculation limit, which depends on the image quality and could be set to a selected value in the range of 0.15 to 0.5 of the number of points I; we have used the value of n = 0.16 in the conducted experiments; however, we have also successfully tested other values within the specified range.

Step 2: Use the computed h to filter the original line array of image data g as follows:

    ḡ(i) = g(i) − h(i) + min(h) for i = 1 to I,

where min(h) is the minimum value of the elements in the h array.

Step 3: Calculate the mean line M of the resulting ḡ as follows:

    M = [Σ from i = 1 to I of ḡ(i)] / I.    (3)

Step 4: Calculate the required surface roughness indicator or measure; in this research, we have proposed and implemented an indicator of surface roughness based on counting tool imprints on the machined surface according to the following procedure:

a. Set zero as the initial value of the tool imprint counter C.
b. Calculate the standard deviation σ of the elements in the ḡ array.
c. Starting from the first element of the resulting array ḡ, identify regions of neighboring elements with values less than M, and record their values as subsets.
d. Test the elements of each subset, and increment C by 1 if at least one element in the subset has a value less than (M − σ); this condition is meant to filter out intensity variations due to any unfiltered noise.

In order to verify the validity of the proposed reference surface calculation technique, the developed procedure is applied to a total of 480 images acquired for the 60 machined specimens. Satisfactory results are gained in all cases. Figures 11 and 12 show a representative example of a set of acquired results which demonstrate the effectiveness of the developed procedure. It is clear from these figures that the resulting reference surface was successfully gained from the vision-based roughness profile, despite the several problem areas affecting the image data.

The proposed tool imprint indicating parameter C is also computed to verify its capability to handle the different cases provided by the acquired 480 images. The results are generally positive; however, further tuning of the proposed model is required for more robustness.
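The reference-surface filter (Step 1) and the imprint counter C (Steps 2–4) can be sketched in code. This is our illustrative reading of the procedure: the window 2j + 1 = 5 follows the text, but we use a small fixed repetition count n for brevity, whereas the paper ties n to the line length I; the function names are ours.

```python
import statistics

# Sketch of the reference-surface filter (Step 1) and imprint counter C
# (Steps 2-4) described above. Illustrative only; not the authors' code.

def reference_surface(g, j=2, n=3):
    """Iterated moving average over g; returns the reference surface h."""
    h = list(g)
    for _ in range(n):                   # step (d): repeat the averaging n times
        prev = list(h)                   # step (b): feed the result back in
        for i in range(j, len(prev) - j):
            # interior points: local average over a (2j+1)-wide window
            h[i] = sum(prev[i - j:i + j + 1]) / (2 * j + 1)
        # the first/last j samples are copied unchanged, as in step (a)
    return h

def count_imprints(g, j=2, n=3):
    """Steps 2-4: filter g with h, then count tool-imprint valleys C."""
    h = reference_surface(g, j, n)
    g_bar = [gi - hi + min(h) for gi, hi in zip(g, h)]  # step 2: filtered line
    m = sum(g_bar) / len(g_bar)                         # step 3: mean line M
    sigma = statistics.pstdev(g_bar)                    # step 4b: std deviation
    c, in_region, deep = 0, False, False
    for v in g_bar:                                     # steps 4c-4d
        if v < m:                                       # inside a below-M subset
            in_region = True
            deep = deep or v < (m - sigma)              # noise-rejection test
        else:
            if in_region and deep:
                c += 1                                  # one genuine imprint
            in_region, deep = False, False
    if in_region and deep:
        c += 1
    return c

# A flat line with one sharp valley yields a single counted imprint:
print(count_imprints([10, 10, 10, 10, 0, 10, 10, 10, 10]))  # 1
```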
FIGURE 11.—Examples of resulting reference surfaces for image lines across a set of specimens.

FIGURE 12.—Examples of processed and filtered image data of lines across a set of specimens. (Depth of cut 2 mm and speed of cut 1,000 rpm.)

Table 2 shows a representative example of the acquired results, where it can be seen that the developed model has successfully provided consistent indicating values of the tool imprints on the machined surface, except for a few cases. This is mostly related to the lag in image resolution and quality.

TABLE 2.—Some examples of the computed tool imprint indicating parameter C. (Speed of cut is 1,000 rpm and depth of cut is 2 mm.)

Feed rate (mm/min)   Trial 1   Trial 2   Trial 3   Trial 4   Average   Max deviation from mean (%)
 50                  31        33        27        28        29.75     ≈10.9
100                  32        25        25        26        27        ≈18.5
150                  23        20        24        26        23.25     ≈14
200                  18        16        17        22        18.25     ≈20.5
250                  16        15        16        19        16.5      ≈15.2
300                  12        12        15        13        13        ≈15.4

THE NEXT STEP: MOVING FORWARD

Although positive and promising results are obtained from the developed vision-roughness model, more investigation and refinement are required to enable reliable implementation of the developed methodology in real applications.
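The "Average" and "Max deviation from mean" columns of Table 2 follow directly from the four trial counts; a short illustrative check (function name is ours):

```python
# Sketch: reproducing Table 2's summary columns from the four trial values
# of the imprint counter C (the 50 mm/min feed-rate row is shown).

def summarize_trials(trials):
    """Return (average, max % deviation from the mean) for a list of C values."""
    avg = sum(trials) / len(trials)
    max_dev_pct = max(abs(t - avg) for t in trials) / avg * 100
    return avg, max_dev_pct

avg, dev = summarize_trials([31, 33, 27, 28])
print(avg, round(dev, 1))  # 29.75 10.9
```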
Such a task has been set as the next target for this research team, where a large number of additional experiments are planned to be executed to generalize the findings. Once a reliable model is achieved, on-line processing of vision data while machining will be conducted using the developed CNC-vision system.

The developed CNC-vision system at the current stage handles both linear and circular interpolated movements in the x and y directions. However, the system capability could be expanded to enable the handling of three-axis movements; monitoring of movement in the z direction would require adding four additional cameras to each rig, and monitoring and processing images acquired from six cameras to achieve real-time control would need higher computer processing speeds; hence, such a task is planned for future work.

Nevertheless, several other alternative future directions are also planned to expand the findings of this research. One of these directions is towards system expansion to accommodate other machining tasks within the system, except those which do not allow clear image acquisition, e.g., drilling of small holes and tapping of small threads.

In addition, a goal is also set to investigate the system's feasibility if applied to industrial-type CNC machines. A preliminary plan for how to achieve this goal has been set by the authors of this article; however, it needs to be tested and proved.

CONCLUSIONS

In this work, considerable progress is achieved towards the challenging realization of intelligent vision-controlled CNC machines. The proposed framework and the developed system show satisfactory results. The technical problems associated with visibility of the workpiece scene are handled successfully through the designed and developed procedures. These procedures showed excellent results in overcoming the effects of coolant blocking the image scene, discarding cut chips from the image scene, and also reducing vibration effects on the acquired images.

The developed methodology of tool path segmentation, allowing on-line monitoring and control of machining performance through adjusting or altering the machining parameters, has proved to be very effective and provides the means to apply vision-based feedback control to CNC machines, overcoming the limitations of CNC machine controllers, which in general do not provide an open platform for developing supervisory control programs.

The use of two cameras per machine in the system hardware, together with the developed procedure to activate the relevant camera, has enabled valid image data acquisition of the most recent cut of the tool path segment.

Results showed that employing any of the standard surface roughness parameters for on-line assessment using a vision system does not always provide reliable and consistent indicating values. Hence, there is a real need to propose and develop alternative procedures that enable on-line, in-situ assessment of machined surfaces, so as to enable the integration of such techniques with CNC machines.

The developed technique for calculating a reference surface for machined surfaces using image data showed excellent results, overcoming the problem areas associated with irregular light reflection and color variation of metal work parts. Hence, the authors strongly recommend employing this technique in surface roughness evaluation using vision data.

The proposed tool imprint indicator C has been shown to provide valid, consistent indicating values that can be directly related to the machining parameters employed. However, more investigation and development of this parameter is needed for greater robustness. In addition, further investigation of the use of this proposed indicator to assess tool condition is also required.

The results gained from the practical execution of this research project encourage further investigation and development to beneficially realize the challenging aim of developing intelligent computer-vision-controlled CNC machines. Further steps are already planned and under development by this research team to achieve a more robust and reliable system that can be readily adopted and applied to industrial CNC machines.

ACKNOWLEDGMENTS

The results presented in this article are part of an on-going research project funded by the Qatar National Research Fund QNRF (project number NPRP 08-287-2-096, with a total value of US$ 396,900.00).

REFERENCES

1. Al-Kindi, G.; Gill, K.; Baul, R. Vision-controlled CNC machines. IEE Computing and Control Engineering Journal 1993 (April), 92–96.
2. Quintana, G.; Ciurana, J.; Ribatallada, J. Surface roughness generation and material removal rate in ball end milling operations. Materials and Manufacturing Processes 2010, 25 (6), 386–398.
3. Cardoso, P.; Davim, P. Optimization of surface roughness in micromilling. Materials and Manufacturing Processes 2010, 25 (10), 1115–1119.
4. Zhang, J.; Chen, J. Surface roughness optimization in a drilling operation using the Taguchi design method. Materials and Manufacturing Processes 2009, 24 (4), 459–467.
5. Saikumar, S.; Shunmugam, M. Parameter selection based on surface finish in high-speed end-milling using differential evolution. Materials and Manufacturing Processes 2006, 21, 341–347.
6. Yang, Y.; Shie, J.; Huang, C. Optimization of dry machining parameters for high-purity graphite in end-milling process. Materials and Manufacturing Processes 2006, 21, 832–837.
7. El-Hossainy, T. A new technique for enhancing surface roughness of metals during turning. Materials and Manufacturing Processes 2010, 25 (12), 1505–1512.
8. Al-Kindi, G.; Shirinzadeh, B. An evaluation of surface roughness parameters measurement using vision-based data. International Journal of Machine Tools and Manufacture 2007, 47, 697–708.
9. Al-Kindi, G.; Shirinzadeh, B. Feasibility assessment of vision-based surface roughness parameters acquisition for different types of machined specimens. Image and Vision Computing 2009, 27, 444–458.
10. Castejon, M.; Alegre, E.; Barreiro, J.; Hernandez, L. On-line tool wear monitoring using geometric descriptors from digital images. International Journal of Machine Tools and Manufacture 2007, 47, 1847–1853.
11. Eladawi, A.; Gadelmawla, E.; Elewa, I.; Abdel-Shafy, A. An application of computer vision for programming computer numerical control machines. Proc. Instn. Mech. Engineers, Part B: Journal of Engineering Manufacture 2003, 217, 1315–1324.
12. Kerr, D.; Pengilley, J.; Garwood, R. Assessment and visualisation of machine tool wear using computer vision. International Journal of Advanced Manufacturing Technology 2006, 28, 781–791.
13. Alegre, E.; Barreiro, J.; Castejon, M.; Suarez, S. Computer vision and classification techniques on the surface finish control in machining processes. ICIAR 2008, LNCS (5112), 1101–1110.
14. Kwona, Y.; Tseng, T.; Ertekin, Y. Characterization of closed-loop measurement accuracy in precision CNC milling. Robotics and Computer-Integrated Manufacturing 2006, 22, 288–296.
15. Al-Kindi, G.; Gill, K.; Baul, R. Experimental evaluation of 'shape from shading' for engineering component profile measurement. Proc. Instn. Mech. Engineers, Part B: Journal of Engineering Manufacture 1989, 203, 211–216.
16. Al-Kindi, G.; Shirinzadeh, B.; Zhong, Y. A vision-based approach for surface roughness assessment at micro- and nano-scales. In Proceedings of the 10th International Conference on Control, Automation, Robotics and Vision (ICARCV), Hanoi, Vietnam, 2008; 1903–1908.
17. Al-Kindi, G.; Abdul Kareem, L. An application of reverse engineering using vision-based data. International Journal of Applied Engineering Research 2009, 4 (7), 1333–1346.
