To cite this article: Ghassan Al-Kindi & Hussien Zughaer (2012) An Approach to Improved CNC
Machining Using Vision-Based System, Materials and Manufacturing Processes, 27:7, 765-774,
DOI: 10.1080/10426914.2011.648249
CNC machines still suffer from machine blindness: they cannot automatically assess the performance of the machining tasks they execute. In this article, an approach is presented to improve the performance of CNC machining by utilizing an on-line vision-based monitoring and control system. To facilitate the integration of computer vision with CNC machines, a system is proposed and developed to tackle a number of pinpointed issues that obstruct such integration. A practical, executable methodology is developed to enable beneficial implementation on lab-scale CNC milling machines. Two different models of bench-type CNC machines are employed to generalize the findings. Two cameras are mounted on the machine spindle of each of the two employed CNC machines to provide valid image data according to the cutting direction. Proper selection and activation of the relevant camera is achieved automatically by the developed system, which analyzes the most recently conducted tool path movement to decide which camera is to be activated. In order to assess the machined surface quality and the cutting tool status, image data are processed to evaluate the resulting tool imprints on the machined surface. An indicating parameter to assess the resulting tool imprints is proposed and used. The overall results show the validity of the approach and encourage further development to realize industrial-scale intelligent vision-controlled CNC machines.
Keywords: Improved CNC machining; Intelligent machining; Surface roughness; Vision system.
SYSTEM HARDWARE

In this research, two separate experimental CNC-vision rigs are established using two different models of CNC machines. These two different hardware setups aim at facilitating the generalization of the developed model so that it can be applied to other models and types of CNC machines. In the first rig, a Denford ``Novamill'' CNC milling machine is employed, whereas in the second rig, an Intelitek ``Super-Prolight 1000'' CNC milling machine is used. Both machines can be characterized as bench-type, PC-driven, Fanuc-controller-emulated machines. In order to introduce the vision system to these two rigs, high-resolution Logitech C-910 cameras, which provide 24-bit images with a real resolution of 2,592 x 1,944 pixels, are used to acquire valid machined-surface data while avoiding blocking of the machined-surface scene by the tool or tool movement. These four cameras (two per rig) are mounted on the two machine spindles to provide a plan view of the machined part. Ambient lighting is used to minimize the effects of specular light reflection from the metallic work parts that are machined [15]. Figures 1 and 2 show photographs of the developed rigs on CNC machines at work.

FIGURE 1.—Hardware setup utilizing the ``Novamill'' CNC milling machine.

FIGURE 2.—Hardware setup utilizing the ``Super-Prolight 1000'' CNC milling machine.

THE MACHINING TASK

High-speed steel (HSS) end milling cutters of Ø 10 mm are used for the experiments. Square workpieces made of commercial aluminium are used, and their surfaces are machined using varying machining parameters. Depths of cut in the range of 1–3 mm, feed rates of 50 mm/min up to 300 mm/min, and a speed of cut of 1,000 rpm are used in the investigation. More than 60 specimens were machined using varying values of machining parameters from the above ranges to enable the assessment of their surface finish. In order to provide a comparison base for roughness measurement using the vision system, the surface roughness of these machined specimens is measured using an SE1200 Kosaka-Lab stylus-based profilometer; see Fig. 3.

FIGURE 3.—SE1200 stylus-based roughness profilometer, machined specimens, and sample of acquired roughness parameter results.
Seventeen different standard surface roughness parameters are computed for each specimen. These include Ra, Rz, Rp, Rv, Rq, RSm, Rmr, Rt, Rsk, Rku, Rk, Rpk, Rvk, Mr1, Mr2, A1, and A2 [8]. Figure 4 shows examples of computed values of some of these parameters.

Analyses of the acquired results of these different roughness parameters show that it is not always possible to predict the resulting surface roughness parameters, even if only one machining variable is varied while all other machining variables are kept constant. Hence, based on the acquired results, none of the employed surface roughness parameters could be used with confidence in this research to develop a reliable model that predicts and evaluates the performance of machining parameters. For instance, it is evident from Table 1, which includes examples of acquired measurements, that although all machining parameters are kept constant apart from the feed rate, the resulting amplitude parameters of stylus-based surface roughness do not always provide consistent values that can be directly related to the feed rate used. Therefore, there is a real need to introduce alternative surface roughness indicators that could provide a direct and consistent relationship between the resulting surface qualities and the employed machining parameters.

TABLE 1.—Example of resulting surface roughness for a set of machined specimens (speed of cut set to 1,000 rpm).

Sample no.   Feed rate (mm/min)   Depth of cut (mm)   Ra (µm)   Rt (µm)
 1                 50                    2             3.145     18.932
 2                100                    2             1.719      8.810
 3                150                    2             3.301     16.896
 4                200                    2             3.581     22.680
 5                250                    2             5.132     22.340
 6                300                    2             6.045     30.184
 7                 50                    3             3.132     17.344
 8                100                    3             2.106     12.976
 9                150                    3             1.700     13.978
10                200                    3             3.125     17.534
11                250                    3             3.111     19.434
12                300                    3             4.012     19.563

Nevertheless, if such novel surface roughness indicators are introduced, they are not meant to replace the widely used and well-known roughness parameters such as Ra; rather, they will provide the beneficial advantage of allowing the evaluation of the performance of the cutting conditions in conjunction with the currently used roughness parameters.
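As a concrete illustration of the amplitude parameters discussed above, the two reported in Table 1 (Ra and Rt) can be computed from a sampled profile in a few lines. This is a sketch using made-up profile heights, not the authors' measurement code:

```python
import numpy as np

# Illustrative profile heights in micrometres; real values would come
# from the stylus profilometer or the vision system, not from this list.
z = np.array([1.2, -0.8, 2.1, -1.5, 0.6, -2.0, 1.8, -0.4])
z = z - z.mean()                 # reference heights to the mean line

Ra = np.mean(np.abs(z))          # arithmetic mean deviation
Rq = np.sqrt(np.mean(z ** 2))    # root-mean-square roughness
Rt = z.max() - z.min()           # total peak-to-valley height
```

For any profile, Rq is at least as large as Ra, and Rt (a single peak-to-valley extreme) is far more sensitive to outliers than either mean-based parameter, which is consistent with the scatter visible in the Rt column of Table 1.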
It should be noted that surface roughness resulting from machining operations has features that influence both micro- and nanoscale regions [16], as shown in Fig. 5. While both micro- and nano-features are of high importance in engineering applications, the microscale region usually results from the tool cutting-edge imprint on the workpiece surface; hence, on-line monitoring of this roughness scale region provides advantageous judgement on the cutting tool condition and life.
In face milling operations, tool imprints on the machined surface are mainly influenced by the feed per tooth. In other words, the tool imprints It on the machined surface can be expressed as

    It = (60 V l n) / (π d f),                                    (1)
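Equation (1) can be sanity-checked numerically. The variable interpretation used here (V cutting speed in mm/s, l segment length in mm, n number of cutting teeth, d tool diameter in mm, f feed rate in mm/min) is an assumption inferred from the units and the factor of 60; it is not stated in this excerpt:

```python
import math

def tool_imprints(V, l, n, d, f):
    """Tool imprints along a cut segment, Eq. (1), under assumed units.

    V : cutting speed (mm/s)      l : segment length (mm)
    n : number of cutting teeth   d : tool diameter (mm)
    f : feed rate (mm/min)

    Spindle speed is N = 60*V/(pi*d) rev/min, so the cutter leaves
    N*n imprints per minute while the table feeds f mm/min, i.e.
    N*n/f imprints per millimetre of feed, over a length l.
    """
    return 60.0 * V * l * n / (math.pi * d * f)

# Example: a 1,000 rpm spindle with a 10 mm cutter gives
# V = pi*10*1000/60 mm/s; a 2-tooth cutter fed at 100 mm/min
# leaves 1000*2/100 = 20 imprints per mm, i.e. 200 over 10 mm.
```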
CNC-VISION INTEGRATION

In general, CNC machine controllers do not provide an open platform for users to develop supervisory control programs [17]; hence, in order to facilitate the execution of this research, a detailed technical design is proposed and employed to enable the integration of the vision system with the employed CNC machines. The proposed system design utilizes three computers as follows (Fig. 6):
... a normal G-code program that defines the required tool path, selected machining parameters, as well as other miscellaneous functions.

A dedicated program is developed by the authors to enable data processing and system implementation. The main computer is used to execute the developed program. The program analyzes and processes the input G-codes in order to automatically segment the tool path and output a set of regenerated G-codes, thus allowing the implementation of on-line monitoring and control. Each block of the regenerated G-codes facilitates one single segmented tool path movement of a predefined linear (or circular) length, where such movement is extracted from the originally programmed tool path. The execution of the newly generated G-code program ensures the exact same tool path, machining parameters, and miscellaneous functions, though it includes subroutine call functions and other control functions, as will be discussed later. The regenerated G-codes are then stored in the common storage to enable their loading and execution by the CNC controller. Once the CNC controller starts to execute the newly generated G-code program, it will sequentially load and execute certain attached G-code subroutines that are also automatically created and attached to the main G-code program. These G-code subroutines include the spindle speed and feed rate, as well as other added control codes; hence, the values of cutting speed and feed rate can be replaced by other decided values without interrupting the execution of the main G-code program. The instantaneous updating of the speed of cut and/or the feed rate is achieved by replacing the relevant G-code subroutine with a new version that includes the new values of these machining parameters instead of the current values. The new values of the machining parameters are decided by an automatic decision-making action based on intelligent surface roughness feedback gained via the vision system.

In order to overcome the problem of coolant and produced chips obstructing the visual scene of the vision system, compressed air is used instead of liquid coolant. Activation and deactivation of the compressed air is controlled via a solenoid valve that acquires its activation signal via the main G-code program. The compressed air, in this case, also serves to remove the cut chips away from the camera field of view; to achieve this task, however, sequential control steps are used in the main G-code program, as described below:

a. Activate the air coolant by generating the relevant output signal to the solenoid; for the Intelitek CNC machine, we used the code (M25 H11) to set the signal to a high level at output channel H11, which turns the compressed air flow on, and the code (M26 H11) to set the output signal to a low level, hence turning the compressed air flow off.
b. Execute one segmented movement of the machining path; a value of 10 mm is used in the conducted experiments of this research, and the generated G-codes therefore include execution blocks that enable movements of 10 mm at a time. However, other values can also be selected to segment the tool path, though the resulting tool path segments should be longer than the usual tracing length used in the evaluation of tool imprints or surface roughness.
c. Wait for a defined short duration while the compressed air is still on, using G04 (dwell cycle); in the practical tests, we set the time for this task to two seconds and gained successful results. Once this step is executed and the air-blast solenoid is turned off, the vision system starts to acquire the image; the activation of the vision system is also done via an external trigger signal gained from another output channel of the CNC machine controller, whereas camera selection is achieved by extracting the cutting direction from the most recently executed block of the segmented G-code program.
d. Apply the image-processing procedure to the relevant surface image data. This stage involves intelligent identification of the relevant region of interest to extract image data for processing. Once the image data is processed, new values of the machining parameters are decided according to an automatic decision-making procedure. These new machining parameters are written to the relevant G-code subroutine. It should be noted that the image region of interest aims to monitor the cutting tool performance just as the tool completes the cutting of the most recent tool path segment; hence, if no cutting is conducted within this tool path segment, e.g., the cutting tool is approaching the workpiece, then there is no need to acquire and process the relevant image data or update the machining parameter values.
e. The CNC machine waits for another signal input, sent by the vision system, to acknowledge that the image processing for this step is completed and that the new G-code subroutine is ready to be loaded and executed.
f. Load and execute the newly generated G-code subroutine by the CNC controller; thus, the new values of the machining parameters are applied.
g. Repeat step (a) above, and continue executing the next segmented movement of the tool path.

The above steps were tested successfully on the two developed rigs; however, we made some alterations to suit each of the two CNC machine controllers. For instance, the Novamill CNC controller allows the use of subroutine calls from outside its environment; therefore, the developed method works directly, whereas the Super-Prolight CNC machine controller does not allow this. Hence, in order to overcome this limitation, we used the ``chain to program'' code (G20) instead of a subroutine call. It should be noted here that the chain-to-program code G20 does not provide a way to come back to the main G-code program, and hence the regeneration of the G-code program, which is achieved in the first stage, was altered to generate and chain the newly generated G-code programs instead of using the subroutine call function. Figure 8 presents an example of a simple G-code
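The tool-path segmentation and control steps (a)–(c) above can be sketched as a small G-code generator. The M25 H11 / M26 H11 air-solenoid codes follow the text, and the 10 mm default segment length matches the experiments; the vision-trigger output channel (H12) and the exact dwell word format are placeholder assumptions for illustration:

```python
import math

def segment_linear_move(x0, y0, x1, y1, feed, seg_len=10.0):
    """Regenerate one G01 move as sub-moves of at most seg_len mm,
    wrapping each with the air-blast/dwell codes of steps (a)-(c).

    M25 H11 / M26 H11 (air solenoid on/off) are the codes quoted in
    the text; the vision-trigger channel H12 and the G04 dwell word
    format are assumptions (dwell syntax is controller-dependent)."""
    dist = math.hypot(x1 - x0, y1 - y0)
    steps = max(1, math.ceil(dist / seg_len))
    blocks = []
    for i in range(1, steps + 1):
        t = i / steps  # fraction of the full move completed
        blocks += [
            "M25 H11",                           # step (a): air blast on
            f"G01 X{x0 + (x1 - x0) * t:.3f} "    # step (b): one segment
            f"Y{y0 + (y1 - y0) * t:.3f} F{feed}",
            "G04 P2",                            # step (c): 2 s dwell
            "M26 H11",                           # air blast off
            "M25 H12",                           # trigger vision (assumed channel)
        ]
    return blocks

# Example: a 25 mm move is split into three ~8.33 mm sub-moves, each
# wrapped in the air-blast / dwell / trigger control codes.
gcode = segment_linear_move(0, 0, 25, 0, feed=100)
```

Note that segmenting by distance rather than time keeps every regenerated block a legal standalone G01 move, so the feed rate and spindle speed words in the attached subroutines can be swapped between segments without disturbing the programmed path.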
Resulting C values (four readings per feed rate)

Feed rate (mm/min)   C readings       Average C   Max. deviation (%)
 50                  31, 33, 27, 28     29.75          10.9
100                  32, 25, 25, 26     27.00          18.5
150                  23, 20, 24, 26     23.25          14.0
200                  18, 16, 17, 22     18.25          20.5
250                  16, 15, 16, 19     16.50          15.2
300                  12, 12, 15, 13     13.00          15.4
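Reading the table above as four repeated C readings per feed rate, the last two columns can be reproduced as the mean and the maximum percentage deviation of any reading from that mean; this column interpretation is inferred from the numbers, not stated in the excerpt:

```python
def summarize_c(readings):
    """Return (mean, max % deviation from mean) for a set of C readings."""
    avg = sum(readings) / len(readings)
    max_dev_pct = max(abs(v - avg) for v in readings) / avg * 100
    return avg, max_dev_pct

c_values = {50: [31, 33, 27, 28], 100: [32, 25, 25, 26],
            150: [23, 20, 24, 26], 200: [18, 16, 17, 22],
            250: [16, 15, 16, 19], 300: [12, 12, 15, 13]}

for feed, vals in c_values.items():
    avg, dev = summarize_c(vals)
    print(f"feed {feed:3d} mm/min: mean C = {avg:5.2f}, max dev = {dev:4.1f}%")
```

The monotonic fall of the mean C with increasing feed rate, despite the 10–20% spread between repeat readings, is what the text relies on when it calls the indicator consistent.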
for this research team, where a large number of additional experiments are planned to be executed to generalize the findings. Once a reliable model is achieved, on-line processing of vision data while machining will be conducted using the developed CNC-vision system.

The developed CNC-vision system at the current stage handles both linear and circular interpolated movements in the x and y directions. However, the system capability could be expanded to enable the handling of three-axis movements; monitoring of movement in the z direction would require adding four additional cameras to each rig, and monitoring and processing of images acquired from six cameras to achieve real-time control would need higher computer processing speeds; hence, such a task is planned for future work.

Nevertheless, several other alternative future directions are also planned to expand the findings of this research. One of these directions is towards system expansion to accommodate other machining tasks within the system, except those which do not allow clear image acquisition, e.g., drilling of small holes and tapping of small threads.

In addition, a goal is also set to investigate the system feasibility if applied to industrial-type CNC machines. A preliminary plan on how to achieve this goal has been set by the authors of this article; however, it needs to be tested and proved.

CONCLUSIONS

In this work, considerable progress is achieved towards the challenging realization of intelligent vision-controlled CNC machines. The proposed framework and the developed system show satisfactory results.

The technical problems associated with visibility of the workpiece scene are handled successfully through the designed and developed procedures. These procedures showed excellent results in overcoming the effects of coolant blocking the image scene, discarding cut chips from the image scene, and also reducing vibration effects on the acquired images.

The developed methodology of tool path segmentation to allow on-line monitoring and control of machining performance, through adjusting or altering the machining parameters, has proved to be very effective. It provides the means to apply vision-based feedback control to CNC machines and to overcome the limitations of CNC machine controllers, which in general do not provide an open platform for developing supervisory control programs.

The use of two cameras in the system hardware, together with the developed procedure to activate the relevant camera, has enabled valid image data acquisition of the most recently cut tool path segment.

Results showed that employing any of the standard surface roughness parameters in on-line assessment using a vision system does not always provide reliable and consistent indicating values. Hence, there is a real need to propose and develop alternative procedures that enable on-line in-situ assessment of machined surfaces and the integration of such developed techniques with CNC machines.

The developed technique to calculate a reference surface for machined surfaces using image data showed excellent results and overcame the problems associated with irregular light reflection and color variation of metal work parts. Hence, the authors strongly recommend employing this technique in surface roughness evaluation using vision data.

The proposed tool imprint indicator C has been shown to provide valid, consistent indicating values that can be directly related to the machining parameters employed. However, more investigation and development of this parameter is needed for greater robustness. In addition, further investigation of the use of this proposed indicator to assess tool condition is also required.

The results gained from the practical execution of this research project encourage further investigation and development to beneficially realize the challenging aim of developing intelligent computer-vision-controlled CNC machines. Further steps are already planned and under development by this research team to achieve a more robust and reliable system that can be readily adopted and applied to industrial CNC machines.

ACKNOWLEDGMENTS

The results presented in this article are part of an on-going research project funded by the Qatar National Research Fund QNRF (project number NPRP 08-287-2-096, with a total value of US$ 396,900.00).

REFERENCES

1. Al-Kindi, G.; Gill, K.; Baul, R. Vision-controlled CNC machines. IEE Computing and Control Engineering Journal 1993 (April), 92–96.
2. Quintana, G.; Ciurana, J.; Ribatallada, J. Surface roughness generation and material removal rate in ball end milling operations. Materials and Manufacturing Processes 2010, 25 (6), 386–398.
3. Cardoso, P.; Davim, P. Optimization of surface roughness in micromilling. Materials and Manufacturing Processes 2010, 25 (10), 1115–1119.
4. Zhang, J.; Chen, J. Surface roughness optimization in a drilling operation using the Taguchi design method. Materials and Manufacturing Processes 2009, 24 (4), 459–467.
5. Saikumar, S.; Shunmugam, M. Parameter selection based on surface finish in high-speed end-milling using differential evolution. Materials and Manufacturing Processes 2006, 21, 341–347.
6. Yang, Y.; Shie, J.; Huang, C. Optimization of dry machining parameters for high-purity graphite in end-milling process. Materials and Manufacturing Processes 2006, 21, 832–837.
7. El-Hossainy, T. A new technique for enhancing surface roughness of metals during turning. Materials and Manufacturing Processes 2010, 25 (12), 1505–1512.
8. Al-Kindi, G.; Shirinzadeh, B. An evaluation of surface roughness parameters measurement using vision-based data. International Journal of Machine Tools and Manufacture 2007, 47, 697–708.
9. Al-Kindi, G.; Shirinzadeh, B. Feasibility assessment of vision-based surface roughness parameters acquisition for different types of machined specimens. Image and Vision Computing 2009, 27, 444–458.
10. Castejon, M.; Alegre, E.; Barreiro, J.; Hernandez, L. On-line tool wear monitoring using geometric descriptors from digital images. International Journal of Machine Tools and Manufacture 2007, 47, 1847–1853.
11. Eladawi, A.; Gadelmawla, E.; Elewa, I.; Abdel-Shafy, A. An application of computer vision for programming computer numerical control machines. Proc. Instn. Mech. Engineers, Part B: Journal of Engineering Manufacture 2003, 217, 1315–1324.
12. Kerr, D.; Pengilley, J.; Garwood, R. Assessment and visualisation of machine tool wear using computer vision. International Journal of Advanced Manufacturing Technology 2006, 28, 781–791.
13. Alegre, E.; Barreiro, J.; Castejon, M.; Suarez, S. Computer vision and classification techniques on the surface finish control in machining processes. ICIAR 2008, LNCS 5112, 1101–1110.
14. Kwon, Y.; Tseng, T.; Ertekin, Y. Characterization of closed-loop measurement accuracy in precision CNC milling. Robotics and Computer-Integrated Manufacturing 2006, 22, 288–296.
15. Al-Kindi, G.; Gill, K.; Baul, R. Experimental evaluation of 'shape from shading' for engineering component profile measurement. Proc. Instn. Mech. Engineers, Part B: Journal of Engineering Manufacture 1989, 203, 211–216.
16. Al-Kindi, G.; Shirinzadeh, B.; Zhong, Y. A vision-based approach for surface roughness assessment at micro- and nano-scales. In Proceedings of the 10th International Conference on Control, Automation, Robotics and Vision (ICARCV), Hanoi, Vietnam, 2008; 1903–1908.
17. Al-Kindi, G.; Abdul Kareem, L. An application of reverse engineering using vision-based data. International Journal of Applied Engineering Research 2009, 4 (7), 1333–1346.