System and Method For Stereoscopic Image Acquisition
Abstract. This paper discusses stereoscopic image acquisition using the OpenCV library on the Python platform. A simple stereo camera system that can provide stereoscopic images as either a picture (PNG) or a video (AVI) was developed. The system comprises a stereo camera, a USB connector, a tripod, and a notebook with the Python 3.7 programming language and the free OpenCV 4.0 library installed. The system has successfully provided both images and videos; however, the spatial resolution and the frame rate depend on the specifications of the stereo camera being used. A camera resolution of 1280 x 360 with a frame rate of 5 fps was obtained for the stereo camera used in this study. The disparity of the stereo camera as a function of the distance from the camera to the object of interest was measured. From the experimental results, the p constant is 1059.4 and the q constant is -0.527 over a distance of 10–100 cm, showing that the farther the object is from the camera, the smaller the disparity. This system will be used for performing physical measurements in the future.
Keywords: Stereo camera, image acquisition, disparity, 3-D physical measurement, image analysis
INTRODUCTION
A stereo camera is a two-camera system in which the cameras view a scene from slightly different angles and capture two images simultaneously. In Cartesian coordinates, the two cameras are at the same vertical position but different horizontal positions. By exploiting the different perspectives of the two cameras, a 3-D image can be reconstructed [1, 2]. The stereo camera is widely chosen because it is low-cost, non-contact, flexible, and easy to integrate with other systems [3].
Stereo cameras have been used during manufacturing and quality control processes in many industries. They are also used in various fields such as measuring the density of fish at sea [4, 5], biomass ecosystems at sea [6], speed hump detection [7], robotics [8], and measuring deformation [9]. Moreover, to improve image quality, stereo camera acquisition can be augmented with an active device that projects a binary image using an LCD projector [10].
The way the stereo camera works is of interest for physical measurement, especially 3-D physical measurement. Meanwhile, the accuracy of stereo camera calibration can be improved by the use of laser illumination [11]. Continuous calibration is a necessity when measuring moving objects [12]. In some cases, the tracking accuracy of stereo cameras is also important [13, 14]. This paper presents a preliminary study on how a stereo camera can be used to provide images that can eventually be analyzed and used as a physical measurement tool. In this paper, the meaning of disparity is explored.
Proceedings of the 6th International Symposium on Current Progress in Mathematics and Sciences 2020 (ISCPMS 2020)
AIP Conf. Proc. 2374, 020014-1–020014-7; https://doi.org/10.1063/5.0058929
Published by AIP Publishing. 978-0-7354-4113-2/$30.00
METHODOLOGY
This study aims to create a stereo camera system that can produce stereoscopic images using the library provided by OpenCV, running on the Python platform. The stereoscopic image can be saved as an image or a video. OpenCV is a cross-platform open-source library that provides various modules for experimentation and computer vision applications [15]. Meanwhile, Python is a high-level, open-source, cross-platform programming language that is easy to learn [16]. The system and its method are described below.
Hardware Design
The system comprises a built-in stereo camera, a USB cable, a tripod, and a notebook. The stereo camera setup is shown in Fig. 1. The two cameras are mounted in parallel in the horizontal direction, as shown in Fig. 1a. The distance between the two cameras is 6 cm, similar to the average distance between human eyes. The specifications of both cameras are the same. The stereo camera is attached to the tripod, as shown in Fig. 1b, in order to adjust the height and angle of the camera. Figure 1c shows the complete system configuration.
Software Design
The software used is the free OpenCV 4.0, an optimized computer vision library specifically designed for real-time applications. It runs under the Python 3.7 programming language. The programming algorithm is as follows. The software imports all required modules, then reads the address of the stereo camera. After that, it sets the camera resolution and specifies the video recording frame rate in frames per second (fps), the type of image produced (color), and the video format (AVI). Next, it displays the stereo image and records the images. While capturing the images, it also calculates the disparity of the camera. Finally, it saves all images, the video, and the disparity as a data set.
The numpy module was used for analyzing video frames, while the cv2 module was used for image processing and manipulation. For example, the command cv2.VideoCapture(1) provided in the module accesses the first external camera connected to the notebook. The camera resolution was selected following the specifications of the stereo camera. The image is saved as a color image and also as a video at 5 fps.
FIGURE 1. Stereo camera set-up (a) Front view, (b) Side view, and (c) whole system configuration.
Stereo Camera Performance Test
The stereo camera produces two images of an object with different perspectives in the horizontal direction. This difference in perspective is defined as the disparity. The perception of 3-D images is generally based on this disparity, just like the way human vision works. Evaluation of the stereo camera performance relies on its disparity profile, so the disparity profile as a function of the distance between the object and the stereo camera must be collected.
From the pinhole model [17], the disparity (x) can be calculated using the equation below:

x = bf / D (1)

where b is the baseline, f is the focal length, and D is the object distance to the camera. This can be written as a power function:

x = p D^q (2)

where p = bf and q are constants that depend on the specifications of the stereo camera used. The value of q is obtained through experiment; theoretically, q = -1. In this study, D varies from 10 cm to 100 cm.
The disparity can be measured from the stereoscopic images using the equation below:

x = x1 - x2 (3)

where x1 and x2 are the horizontal object positions in the right and left images, respectively [17].
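Equations (2) and (3) can be illustrated with a short sketch; the pixel positions below are made-up values, while the constants p and q are those reported in this paper:

```python
def disparity(x1, x2):
    """Equation (3): horizontal shift between the object positions
    in the right (x1) and left (x2) images, in pixels."""
    return x1 - x2

def disparity_model(D, p=1059.4, q=-0.527):
    """Equation (2): fitted power-law disparity at object distance D (cm)."""
    return p * D ** q

x = disparity(400, 330)   # 70 px for these illustrative positions
```

The model reproduces the trend reported below: the disparity shrinks as the object moves away from the camera.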
The specifications of the stereo camera used in this study are shown in Table 1.
Resolution Testing
In the first step, the resolution of the stereo camera used in this system was tested. The stereo camera was accessed using cv2.VideoCapture(1). Without a definition of the image resolution, the system fails to acquire the stereoscopic image. The camera was able to provide a stereo image when the aspect ratio of the resolution was 32:9. Therefore, the image resolutions of 2560x720, 1920x540, 1280x360, and 640x180 all worked well, giving four image resolution options. The stereo image is essentially two images placed side by side; a stereo image of 1280x360 comprises two images at a resolution of 640x360.
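Because the stereo frame is two images placed side by side, splitting it is a single array slice. A minimal sketch, with a synthetic 1280x360 frame standing in for a real capture:

```python
import numpy as np

def split_stereo(frame):
    """Split a side-by-side stereo frame into (left, right) halves."""
    half = frame.shape[1] // 2
    return frame[:, :half], frame[:, half:]

# Synthetic stand-in for a captured 1280x360 color frame.
frame = np.zeros((360, 1280, 3), dtype=np.uint8)
left, right = split_stereo(frame)   # each half is 640x360
```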
TABLE 1. Stereo camera specifications.

Specification          Value
Baseline (b)           6 cm
Maximum resolution     2560 x 720 pixels
Focal length (f)       600 pixels
Frame Rate Testing
The next test determined the best frame rate for image acquisition. In the first trial, a frame rate of 30 fps was used. The resulting video was good, but it played too fast. The frame rate was then decreased in steps of 5 fps: 25, 20, 15, 10, and 5 fps. The results showed that a frame rate of 5 fps best corresponded to the real tempo.
Based on the results of the previous tests, a stereoscopic image resolution of 1280x360 and a frame rate of 5 fps were selected. An example of the stereoscopic image produced in this study is shown in Fig. 2. The image shows that the left image and the right image are shifted in the horizontal direction; this horizontal shift is defined as the disparity. Based on this disparity, the algorithm can obtain stereoscopic images. The algorithm is implemented as software written in Python using OpenCV modules.
When the object distance to the camera changes, the disparity also changes. By acquiring the disparity as a function of the object distance to the camera, the disparity profile is obtained. The disparity profile can be used to obtain 3-D images by applying the triangulation concept [18].
Examples of stereoscopic images acquired at distances of 20 cm to 50 cm are shown in Fig. 3. The object was a board with a chess pattern. The stereo camera stood perpendicular to the floor at a height of 50 cm. The camera was kept in a fixed position, while the distance of the object from the stereo camera was changed.
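Once the disparity profile is known, the fitted power law can be inverted to recover the object distance from a measured disparity, which is the basis of the triangulation step. A sketch using the constants obtained in this study:

```python
def distance_from_disparity(x, p=1059.4, q=-0.527):
    """Invert the fitted profile x = p * D**q to estimate distance D (cm)
    from a measured disparity x (pixels)."""
    return (x / p) ** (1.0 / q)
```

As a round-trip check, feeding back the disparity predicted by the model at 50 cm returns a distance of 50 cm.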
Disparity
The disparity profile as a function of the object distance to the camera in the range of 10–100 cm is shown in Fig. 4, which presents the disparity from the experimental results and from the theoretical approach. Both curves agree that the farther the object is from the camera, the smaller the disparity. A curve fit based on a power function was then applied. The experimental results give a disparity profile of y = 1096.4x^-0.527 with a regression value R2 of 0.9787; from this result, the p constant is 1059.4 and the q constant is -0.527, based on Equation 2. This power function of the experimental disparity is more gently sloped than the theoretical disparity profile of y = 3590.4x^-1, which has a regression value R2 of 1.0. The difference is caused by an alignment error [17]; however, it does not affect the quality of subsequent data processing. The theoretical approach only helps construct the form of the disparity equation, which is then sought through experiments. The data used for the next process is the disparity equation obtained through experiments.
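The power-law fit itself can be reproduced by least squares in log-log space. The sketch below recovers the constants from noiseless synthetic data generated with the reported p and q (a real fit on measured data would return slightly different values):

```python
import numpy as np

def fit_power_law(D, x):
    """Fit x = p * D**q by linear least squares in log-log space.

    Returns (p, q); np.polyfit gives [slope, intercept] for degree 1."""
    q, logp = np.polyfit(np.log(D), np.log(x), 1)
    return float(np.exp(logp)), float(q)

D = np.linspace(10, 100, 10)        # distances (cm), as in this study
x = 1059.4 * D ** -0.527            # noiseless data from the reported constants
p_fit, q_fit = fit_power_law(D, x)
```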
Anaglyph Image
An anaglyph image is another presentation of a stereoscopic image. A module for converting the stereoscopic image into an anaglyph image was developed. Using the module, the stereoscopic image was divided into two parts, the left image (L) and the right image (R). For a stereoscopic image with a resolution of 1280x360, there were two images, each with a resolution of 640x360. Then, the left image is converted into a blue image, while the right image is converted into a red image. After that, both images were superimposed to yield a single image. Figure 5 shows the anaglyph images for the stereoscopic images in Fig. 3. The anaglyph image can be examined using 3-D glasses.
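The red-blue compositing described above amounts to channel assignment. A minimal sketch with synthetic grayscale left/right views; OpenCV's BGR channel order is assumed:

```python
import numpy as np

def anaglyph(left, right):
    """Superimpose grayscale views: left -> blue channel, right -> red channel
    (BGR channel order, as used by OpenCV), following the text above."""
    out = np.zeros((*left.shape, 3), dtype=np.uint8)
    out[:, :, 0] = left    # blue from the left image
    out[:, :, 2] = right   # red from the right image
    return out

# Synthetic 640x360 stand-ins for the two halves of a stereoscopic image.
left = np.full((360, 640), 200, dtype=np.uint8)
right = np.full((360, 640), 100, dtype=np.uint8)
img = anaglyph(left, right)
```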
FIGURE 2. Stereoscopic image display using a stereo camera at image resolution of 1280 x 360.
FIGURE 3. The stereoscopic image acquired at a distance of (a) 20 cm, (b) 30 cm, (c) 40 cm and (d) 50 cm from the camera. The stereoscopic images (a)–(d) each combine the associated left and right images, with a shift in the horizontal direction between them. The shift depends on the distance of the object from the camera.
FIGURE 4. The disparity profile as a function of the object distance to the camera, resulted from theory and experiment.
FIGURE 5. The anaglyph images of the stereoscopic image at a distance of (a) 20 cm, (b) 30 cm,
(c) 40 cm and (d) 50 cm from the camera in Fig. 3.
CONCLUSION
This work showed that a stereo camera system and its acquisition method using OpenCV can be built successfully. The system obtains a stereoscopic image with a resolution of 1280x360 and a frame rate of 5 fps, as determined by the stereo camera specifications, and can provide images as both video and pictures. The stereoscopic image can be transformed into an anaglyph image. The disparity profile as a function of the camera-to-object distance obtained from the experiment is similar in shape to the theoretical profile, although with a different slope. From the experimental results, the p constant is 1059.4 and the q constant is -0.527 over a distance of 10–100 cm. We conclude that as the object gets farther from the camera, the disparity decreases.
ACKNOWLEDGMENTS
We would like to express our gratitude to Deputi Bidang Penguatan Riset dan Pengembangan Kemenristek/BRIN
for providing the research grant of the 2019 Doctoral Dissertation Program and to PT Madeena Karya Indonesia for
facilitating the devices for this research.
REFERENCES
Vol.11 (2021) No. 6
ISSN: 2088-5334
Abstract— This research aims to create a 3D velocity measurement system using a stereo camera. The 3D velocity in this paper is the
velocity which consists of three velocity components in 3D Cartesian coordinates. Particular attention is focused on translational motion.
The system set consists of a stereo camera and a mini-PC with Python 3.7, and OpenCV 4.0 installed. The measurement method begins
with the selection of the measured object, object detection using template matching, disparity calculation using the triangulation
principle, velocity calculation based on object displacement information and time between frames, and the storage of measurement
results. The measurement system's performance was tested by experimenting with measuring conveyor velocity from forward-looking
and angle-looking directions. The experimental results show that the 3D trajectory of the object can be displayed, the velocity of each
component and the speed as the magnitude of the velocity can be obtained, and so the 3D velocity measurement can be performed. The
camera can be positioned forward-looking or at a certain angle without affecting the measurement results. The measurement of the
speed of the conveyor is 11.6 cm/s with an accuracy of 0.4 cm/s. The results of this study can be applied in the performance inspection
process of conveyors and other industrial equipment that requires speed measurement. In addition, it can also be developed for accident
analysis in transportation systems and practical tools for physics learning.
Keywords— Stereo vision; speed measurement; object tracking; visual odometry; autonomous driving.
Manuscript received 18 Jun. 2021; revised 31 Jul. 2021; accepted 12 Aug. 2021. Date of publication 31 Dec. 2021.
IJASEIT is licensed under a Creative Commons Attribution-Share Alike 4.0 International License.
speed components, making it simpler. However, the
measurement results will be biased if the object's motion does
not match the placement of the camera.
The solution to this problem is to use 3D velocity
measurement, where all three velocity components are
measured simultaneously. Thus, the placement of the detector
is more flexible without changing the programming algorithm.
The detector can be placed not only forward-looking or side-looking but also angle-looking, where the object and the camera form a certain angle.
Research on 3D velocity measurement from angle-looking
is still limited because it involves more variables and is more
complex. However, the advantage of this approach is that the
camera can be placed at various angles so that it is more
flexible in use than forward-looking and side looking.
This paper proposes 3D velocity measurement using a stereo camera. Object detection uses a template matching algorithm. The measurement is carried out forward-looking, as is usually done, and then angle-looking as an alternative. Angle-looking is needed to show that the 3D velocity has been measured successfully, so that a difference in detector placement affects only the values of the velocity vector components while the magnitude of the velocity remains constant. This hypothesis is tested with an experiment. The motion measured in this study is limited to translational motion. Translational motion is the linear motion of an object accompanied by the displacement of that object; examples include the trajectory of a car on the highway, the rate of a conveyor belt, and the trajectory of an airplane or a train. References to the use of stereo cameras for 3D speed measurement in translational motion are still limited.

II. MATERIALS AND METHOD

A. System Description
The system used consists of a calibrated stereo camera and a mini-PC with the specifications shown in Table I. The stereo camera is mounted on a tripod and connected to the mini-PC using a USB cable.

TABLE I
CAMERA AND MINI PC SPECIFICATIONS

Stereo Camera        Specifications
Focus                3.0 mm (fixed)
Sensor               ¼ inch
Pixel size           3 µm x 3 µm
Object distance      30 cm – infinity
FoV                  100°
Image resolution     2560x720, 1280x480
FPS                  30 FPS
Baseline             6 cm

Mini PC              Specifications
Processor            Intel(R) Core(TM) i5-8250U CPU @ 1.6 GHz (1.8 GHz)
RAM                  16 GB
System type          64-bit OS, x64-based processor
Operating system     Windows 10
Software             Python 3.7 and OpenCV 4.0

The flowchart of the 3D velocity measurement system is shown in Fig. 1.

Fig. 1 System performance flowchart

The measurement process begins by taking a stereo image with a resolution of 1280 x 480 pixels. The stereo image combines the left image and the right image, each with a resolution of 640 x 480 pixels.

From the resulting stereo image, the object to be measured can be selected manually by hovering the cursor over the object and then pressing the left mouse button. The selected object is automatically used as a template measuring 24x24 pixels. The template represents the object in the real space being measured. Templates can be changed before or while taking measurements; users can freely choose objects in the stereo image to be used as templates.

Next, template matching is applied to the stereo image. The template matching process is carried out using the library contained in OpenCV 4.0. In this paper, the stereo image is divided into two equal parts, the left image and the right image, and template matching is applied to both separately. At this stage, there are two conditions: if the template is detected in both the left and right images, the process proceeds to the next step; if not, the message "No object detected" is displayed, and the template selection process can be repeated.

After the object is detected, the next step is to calculate the disparity and 3D position based on the parameters obtained during calibration. The disparity is the difference in the position of the object in the left and right images. The disparity is calculated using the pinhole model approach, as shown in equation (1):

d = xl - xr (1)
2483
The 3D position of each component is calculated using equations (2)–(4):

X = bx / d (2)
Y = by / d (3)
Z = fb / d (4)

where x and y are the object position in the image (pixels), f is the focal length (pixels), and b is the baseline.
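Equations (2)–(4) can be sketched directly. The 6 cm baseline comes from Table I, while the focal length value below (in pixels) is an assumed illustrative number, not a calibrated parameter from this paper:

```python
def position_3d(x, y, d, f=600.0, b=6.0):
    """Equations (2)-(4): back-project image coordinates (x, y) in pixels
    and disparity d in pixels to a 3D position (X, Y, Z) in cm.

    f is an assumed focal length in pixels; b is the 6 cm baseline."""
    return b * x / d, b * y / d, f * b / d
```

For instance, a point at x = 60 px with disparity d = 120 px maps to X = 3 cm and Z = 30 cm under these assumed parameters.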
straight line, changing significantly on the Z-axis with a range
of 35-50 cm, while on the X-axis and Y-axis, it remains the
same.
The results of the angle-looking measurement with an
angle of 30° and 45° to the direction of the object's motion are
shown in Fig. 4a and Fig. 4b. Based on the figure, it appears
that the trajectory of the object is in the form of a straight line, as in Fig. 4a. The position components on the X-axis and Z-axis change significantly, while the Y position component remains constant. However, the Y position component at an angle of 45° is greater than at an angle of 30°.
Based on Fig. 4(a–c), it can be seen that the object's trajectory is a straight line, in accordance with the object's motion on the conveyor. Thus, this speed measurement system has succeeded in showing the 3D trajectory of the object.
The trajectory of an object is important for visually
demonstrating 3D motion.
Fig. 4b The trajectory of the object when the camera makes an angle of 30° to the direction of motion
Fig. 4c The trajectory of the object when the camera makes an angle of 45° to the direction of motion
Fig. 5a Position component measurement results on Forward-looking
In the same way, based on Fig. 5b, the velocity components obtained at a 30° looking angle are VX = 5.6 cm/s, VY = 0.15 cm/s, and VZ = 10.5 cm/s. The velocity is V = 5.6i + 0.15j + 10.5k cm/s, and the speed is V = 11.9 cm/s. The angle can be found using the equation θ = tan-1(VX/VZ) = 28°. This value is 2° different from what it should be.
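The speed and looking angle quoted above follow directly from the velocity components; a short sketch reproducing the 30° case:

```python
import math

def speed_and_angle(vx, vy, vz):
    """Return (speed |V| in cm/s, looking angle theta = atan(Vx/Vz) in degrees)."""
    v = math.sqrt(vx**2 + vy**2 + vz**2)
    theta = math.degrees(math.atan(vx / vz))
    return v, theta

v, theta = speed_and_angle(5.6, 0.15, 10.5)   # ~11.9 cm/s, ~28 degrees
```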
IV. CONCLUSION
The system developed in this study has successfully
measured 3D velocity. The camera position can be placed
forward-looking or angle-looking. Based on the specifications
of the stereo camera used, this system can measure the speed
of 11.6 cm/s with an absolute error of 0.4 cm/s.
Fig. 5c The result of measuring the position component at an angle-looking of 45°

A. Speed Measurement and its Accuracy
Speed accuracy is calculated using equation 10. The measurement is carried out five times. The measurement of the speed for the three variations is shown in Fig. 6. The results for each variation are forward-looking (11.6 ± 0.4) cm/s, angle-looking 30° (11.8 ± 0.2) cm/s, and angle-looking 45° (11.7 ± 0.3) cm/s. Based on Fig. 6, it appears that the measurement results with the angle-looking errors of 30° and 45° are within the range of the forward-looking error. Thus, it can be stated that the measurement results of the three variations are consistent with each other.

NOMENCLATURE
d    disparity                                  pixel
xr   horizontal position in right image         pixel
xl   horizontal position in left image          pixel
f    focal length                               pixel
b    baseline                                   cm
X    position component on X-axis               cm
Y    position component on Y-axis               cm
Z    position component on Z-axis               cm
V    velocity                                   cm s-1
V    speed                                      cm s-1
θ    angle between camera and object direction  radian
VX   velocity component on X-axis               cm s-1
VY   velocity component on Y-axis               cm s-1
VZ   velocity component on Z-axis               cm s-1
FPS  frames per second                          s-1
i    unit vector on X-axis
j    unit vector on Y-axis
k    unit vector on Z-axis
ACKNOWLEDGMENT

We are grateful to the Ministry of Research, Technology, and Higher Education of Indonesia through BPPDN grants and the 2020–2021 Doctoral Research Grant for funding this research. Also, we are obliged to PT Madeena Karya Indonesia for preparing the stereo camera used in this prototype.
Physics Education
PAPER
Phys. Educ. 57 (2022) 025006 (12pp) iopscience.org/ped
Development of laboratory devices for real-time measurement of object 3D position using stereo cameras
Sigit Ristanto1,2, Waskito Nugroho1, Eko Sulistya1 and Gede B Suparta1,*
1 Department of Physics, Universitas Gadjah Mada, DI Yogyakarta 55281, Indonesia
2 Department of Physics Education, Universitas PGRI Semarang, Semarang 50232, Indonesia
E-mail: gbsuparta@ugm.ac.id
Abstract
Measuring the 3D position of a given object at any time, in real-time, automatically, and with proper documentation is essential for understanding a physical phenomenon. Exploring a stereo camera for developing 3D images is intriguing, since the 3D image perception generated by a stereo image may be reprojected back to give the 3D object position at a specific time. This research aimed to develop a device and measure the 3D object position in real-time using a stereo camera. The device was constructed from a stereo camera, a tripod, and a mini-PC. Calibration was carried out for position measurement in the X, Y, and Z directions based on the disparity between the two images. Then, a simple 3D position measurement was carried out based on the calibration results, and whether the measurement was in real-time was verified. By applying template matching and triangulation algorithms to the two images, the object position in 3D coordinates was calculated and recorded automatically. The disparity characteristic of the stereo camera varied from 132 pixels to 58 pixels for an object distance to the camera of 30 cm to 70 cm. This setup could measure the 3D object position in real-time with an average delay time of less than 50 ms, using a notebook and a mini-PC. The 3D position measurement can be performed in real-time along with automatic documentation. For the stereo camera specifications used in this experiment, the maximum accuracy of the measurement in the X, Y, and Z directions is ∆X = 0.6 cm, ∆Y = 0.2 cm, and ∆Z = 0.8 cm over the measurement range of 30 cm–60 cm. This research is
* Author to whom any correspondence should be addressed.
Variable x is the horizontal axis, and y is the vertical axis, defined from the center of the image. The values f and b depend on the specifications of the stereo camera; thus, at a fixed Zobject, the disparity can differ from one stereo camera to another. For a stereo camera in which f and b are fixed, the change in disparity depends only on changes in Zobject.

Figure 2. A model of the 3D visual measurement method using a stereo camera.

Furthermore, assuming the distortion effect is negligible, the relationship between the object position in real space (X, Y, Z) and its position in the stereo image (x, y) can be expressed by:

Z = fb/d (5)
X = bx/d (6)
Y = by/d (7)

To predict the accuracy of the measurement of the distance of the object to the camera (Z), we propose the magnitude of the resolution of disparity (RoD). RoD is defined as the ratio of distance to disparity:

RoD = Zmeasured/d, (8)

where Zmeasured is the measured distance of the calibration board to the camera, and d is the disparity.

However, the recorded object position on the image also depends on many factors. So, a calibration process is still necessary to find the proper relationship between the Z and d values, the Y and y values, and the X and x values. Many reviews on the camera calibration process are available [22–26].

Zang et al [22] have proposed a self-calibration method using each beam captured by the camera as a basic adjustment unit. This method works well when applied to Chang'e cruisers on the Moon. Gai et al [23] have proposed a two-camera calibration method for 3D optical-based measurements; this method uses a planar calibration plate that changes its orientation in the measurement area, and calibration parameters are obtained through imagery, after which translation and rotation matrices are calculated using a centroid distance increment matrix. Jia et al [24] have proposed calibration of perpendicularity compensation-based stereo cameras; this method uses five images and achieves good precision. Cui et al [25] proposed calibrating stereo cameras by minimizing metric distance errors based on triangulation principles. Ma [26] has proposed calibrating stereo cameras using 1D objects that are free to move and then applying heteroscedastic errors-in-variables models to improve accuracy.

2. Materials and methods

The device comprises a built-in stereo camera, a tripod, and a mini-PC. The specifications of the stereo camera and the mini-PC are presented in table 1.

The stereo camera was connected through the USB 3.0 port of the mini-PC. The software was developed in Python 3.7 and equipped with the OpenCV 4.0 libraries [27]. The Python code can be found as an additional file (available online at stacks.iop.org/PED/57/025006/mmedia) of this paper.

The calibration and testing tools include a calibrator board, a water pass (spirit level), and a slider. The system setup is shown in figure 3. The calibrator object was an acrylic board with a grid pattern of 2 cm and concentric circles whose radii are multiples of 2.5 cm. We also put a piece of isolation tape on it as a test object. The test object could be moved easily around the
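Equation (8) can be checked against the disparity characteristics reported in this paper's abstract (132 pixels at 30 cm, 58 pixels at 70 cm):

```python
def rod(z_measured, d):
    """Equation (8): resolution of disparity, Z_measured / d (cm per pixel)."""
    return z_measured / d

near = rod(30.0, 132.0)   # ~0.23 cm per pixel at 30 cm
far = rod(70.0, 58.0)     # ~1.21 cm per pixel at 70 cm
```

RoD grows with distance, so the depth resolution degrades for far objects.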
Table 1. Camera and mini-PC specifications.

Stereo camera        Specifications
Model                Stereo 3D VR USB 2.0
Manufacturer         www.kayetoncctv.com/
Focus                3.0 mm (fixed)
Sensor               ¼ inch
Pixel size           3 µm × 3 µm
Object distance      30 cm—infinity
FoV                  100°
Image format         MJPEG 2560 × 720 30 FPS
Baseline             6 cm

Mini PC              Specifications

2.1.1. Stereo image capture. At this stage, the first step is to initialize the stereo camera. Then, the resolution and frame rate of the stereo image are set according to its specifications. In this paper, the image resolution used is 1280 × 460 pixels, a combination of the left and right camera images, each with a resolution of 640 × 460 pixels. The left camera image occupies the horizontal range (0–640) pixels, while the right camera image occupies the range (641–1280) pixels. If this stage is successful, a stereo image display appears on the monitor screen.
March 2022 5 P hy s . E d u c . 5 7 ( 2 0 2 2 ) 0 2 5 0 0 6
S Ristanto et al
marked with pink squares according to the tem-
plate size.
The template matching success indicator is
characterized by two things. Firstly, the appear-
ance of pink squares on the left and right image.
Secondly, the vertical location (y) of the box on
the left and right images are the same, while the
horizontal location (x) is different. Based on ste-
reo cameras, the height of the left camera and the
right camera are the same. The two cameras are
only separated horizontally. If both indicators are
met, then it can be continued to the following pro-
cess. If not, the meter can reselect the template
using the mouse as in stage 2. This process is done
manually by the gauge.
The next step determines the object location
in the template image on the left and right images.
The assumptions used by moving objects are fol- Figure 4. Position measurement algorithm using a
lowed by the homogeneous displacement of pixels template matching technique on stereoscopic images.
in the template. Assuming that, the object can be
represented the location of a pixel point on the 2.1.6. Result presentation. Once all measure-
template. The pixel point selected is the top left ment parameters have been successfully obtained,
end. the next step is to display the measurement res-
ults on the monitor screen. The magnitude shown
includes the position of the object in the image
2.1.4. Disparity calculation. The disparity is the (x, y) in pixels, the disparity (d) in pixels, and the
difference in the horizontal position of objects in 3D position (X, Y, Z) in cm.
the left image and the right image. The process of This stage also displayed the coordinate axis
calculating disparity starts from equation (3). The in the image. The center of the left image is selec-
xl and xr values are taken from the horizontal pos- ted as the reference point.
ition of the matching template on the left image Stages 1 to 6 are procedures in one meas-
and the right image. urement loop. Once the measurement process is
The validity of disparity in the test uses cri- complete, the results can be saved in videos in
teria d > 0 and the value yr = yl with d as dis- ‘AVI’ format and spreadsheets in ‘xlxs’ format.
parity, yr as the vertical position of objects on the The measurement algorithm is shown in figure 4.
right image, yl as the vertical position of objects
on the left camera. If it does not meet, then the
information ‘object not detected. Please, select 2.2. Method of calibration
the template again’. At this stage, the meter can The device for the calibration process was a rect-
change the template directly on the stereo image angular object being observed and a measure-
displayed on the monitor screen until the inform- ment system that used a stereo camera. The test
ation no longer appears. If valid, then continue in object was an isolation tape affixed to the calib-
the following process. rator board.
In this study, each coordinate component
is calibrated separately. The calibration process
2.1.5. Object position calculation. Once the dis- starts with the Z-axis component. This calibration
parity value is valid, the next step is to calcu- is done because it is based on equation (5).
late the 3D position of the object using equations The calibration process of the object dis-
obtained through calibration. The calibration pro- tance to the camera (Z) was done by changing
cess is described in section 3.2. the camera position gradually at 1 cm intervals on
March 2022 6 P hy s . E d u c . 5 7 ( 2 0 2 2 ) 0 2 5 0 0 6
Development of laboratory devices for real-time measurement
Figure 5. Stereo image display. The test object is in a pink box, as seen by the left camera and right camera.
the Z-axis, while X = Y = 0. The object distance measurement aims to determine the error propaga-
was changed in the range of 30 cm to 70 cm. The tion in each direction.
closest distance of the object to the stereo cam- The measurement range was determined after
era was 30 cm because it was the minimum dis- calibration data sets were obtained. Based on the
tance that the object still in focus. Meanwhile, the calibration data, the measurement range at certain
farthest distance was 70 cm because there is no precision was obtained. In this study, a maximum
significant change in disparity at that distance and error of 1 cm was set, and so the measurement
after. range that meets the criteria was concluded.
The next step is to calibrate the X-axis com-
ponents. The calibration process is done by gradu-
ally changing the object position every 2 cm
according to the size of the calibrator board box, 2.4. Method for real-time performance
while Z = 40 cm, Y = 0. Objects are measured
A computer timer facility is put at the beginning
from X = −9 cm to X = 9 cm according to the
and the end of the measurement process loop, as
range of stereo cameras. This process generates
described from stage-1 to stage-6. The measure-
the horizontal position relationship of the object
ment time delay is calculated based on the dif-
in real space (X) as a function of the horizontal
ference between the end time and the initial time.
position of the object in the image (x), as stated in
One measurement loop works to acquire a single
equation (6).
frame of the stereo image. Technically, if the cam-
Finally, calibrate the Y-axis components.
era frame rate is set at 30 FPS, then the one-time
Objects are placed at Z = 40 cm, X = 0, while
processing time of the measurement should be less
Y is varied gradually every 2 cm from Y = −6 cm
than or equal to 33.3 ms. If the processing time
to Y = 8 cm according to the field of view of ste-
is greater than 33.3, even if the frame rate is set
reo cameras. Thus, the vertical object position in
at 30 FPS, the number of frames produced per
real space (Y) as a function of the vertical object
second is less than 30.
position in the image (y) is obtained according to
Tests were carried out to see the influence
equation (7).
of the computers that were used in real-time
measurements. The real-time measurement by a
mini PC with an Intel processor of Core i5 and
2.3. Method position measuremnet test RAM 16 Gb and a notebook with an Intel Pro-
Measurements are performed by varying the posi- cessor of Core i3 and RAM 4 Gb were compared.
tion at (X, Y, Z) directions of the tested object. The The number of total measurements was 100 times.
March 2022 7 P hy s . E d u c . 5 7 ( 2 0 2 2 ) 0 2 5 0 0 6
S Ristanto et al
Figure 6. The disparity profile represents the effect of Figure 7. Change in RoD along the Z-axis.
the object to camera distance to the disparity.
Equation (9) is following equation (4) with a cor- and the baseline b in cm. Then taking the inverse
relation value of 0.9997. The value of 4039.8 is of the equation (9) will provide the distance Z is a
the multiplication constant of lens focus f in pixels function of disparities.
March 2022 8 P hy s . E d u c . 5 7 ( 2 0 2 2 ) 0 2 5 0 0 6
Development of laboratory devices for real-time measurement
Table 2. The results of the measurement of the object’s position.
Z varied, X = Y = constant
1 3 0 30 2.9 ± 0.1 −0.2 ± 0.2 30.3 ± 0.3
2 3 0 35 2.9 ± 0.1 0.0 ± 0.0 35.3 ± 0.3
3 3 0 40 3.2 ± 0.2 1.0 ± 0.0 40.2 ± 0.2
4 3 0 45 3.3 ± 0.3 2.0 ± 0.0 45.7 ± 0.7
5 3 0 50 3.2 ± 0.2 3.0 ± 0.0 50.8 ± 0.8
6 3 0 55 3.6 ± 0.6 4.0 ± 0.0 55.7 ± 0.7
7 3 0 60 3.4 ± 0.4 5.0 ± 0.0 60.7 ± 0.7
X = varied, Z = Y = constant
1 −5 0 30 −5.0 ± 0.0 6.0 ± 0.0 29.7 ± 0.3
2 −3 0 30 −3.0 ± 0.0 7.0 ± 0.0 30.1 ± 0.1
3 −1 0 30 −1.1 ± 0.1 8.0 ± 0.0 30.3 ± 0.3
4 1 0 30 0.9 ± 0.1 9.0 ± 0.0 30.5 ± 0.5
5 3 0 30 2.8 ± 0.2 10.0 ± 0.0 30.1 ± 0.1
6 5 0 30 4.7 ± 0.3 11.0 ± 0.0 29.7 ± 0.3
Y = varied, Z = X = constant
1 0 6 30 0.1 ± 0.1 6.0 ± 0.0 29.9 ± 0.1
2 0 4 30 0.0 ± 0.0 4.0 ± 0.0 30.1 ± 0.1
3 0 2 30 0.1 ± 0.1 1.9 ± 0.1 29.9 ± 0.1
4 0 0 30 0.0 ± 0.0 0.0 ± 0.0 29.9 ± 0.1
5 0 −2 30 0.0 ± 0.0 −2.0 ± 0.0 30.1 ± 0.1
6 0 −4 30 0.1 ± 0.1 −3.9 ± 0.1 30.1 ± 0.1
7 0 −6 30 0.0 ± 0.0 −5.9 ± 0.1 29.9 ± 0.1
The graph pattern produced in this study is of the X values in real space to x values in the
similar to the previous study results [1], with an images as expressed in equation (6) is approxim-
improvement in the graph correlation coefficient ated linearly with a correlation of 0.9997. Like-
of R2 = 0.9997. Figure 6 is also identical to the ste- wise, figure 8(b) shows the relationship of the
reo camera error modeling done previously [28]. Y values to the y values in the image following
The non-linear change in disparity causes the equation (7) with a correlation of 0.9999. There-
RoD also change along the Z-axis. fore, the X and Y can be calculated by equations
The change in RoD on the Z-axis is shown in (6) and (7).
figure 7.
Figure 7 shows that the RoD increases by
quadratic magnification as the object position gets 3.3. Position measurement test result
further from the camera. The graph can determine After calibration, the next step is to test the meas-
the accuracy of the measurement at a specific posi- urement result. The object being measured is
tion from the camera. For example, if the expected placed in nine different places and then its position
accuracy of the measurement results in less than is measured. Data on the measurement trial pro-
1 cm, then the recommended distance of the object cess are presented in table 2. The 3D position of
should be in the range of 30–60 cm. This range is the object in real space is expressed in R0 (X 0 , Y 0 ,
chosen by the error assumption of 1 pixel so that Z 0 ) while the measurement results are expressed
a measurement error varies in the range of 0.2– in R (X, Y, Z). Errors for each axis are expressed
0.9 cm. in terms of ∆X, ∆Y, ∆Z.
The X-axis and Y-axis calibrations are shown Based on table 2, the measurements of sev-
in figure 8. Figure 8(a) shows that the relationship eral 3D positions of the object can be shown.
March 2022 9 P hy s . E d u c . 5 7 ( 2 0 2 2 ) 0 2 5 0 0 6
S Ristanto et al
4. Conclusions
This research has developed the device and
method for measuring the 3D position using
a stereo camera that automatically measures a
moving object’s position in real-time and is well
documented. The method uses template matching
and triangulation techniques. The calibration res-
ults show that the object position in the z-direction
depends on the disparity, while the position of the
object in the direction of x and y remains constant.
The maximum accuracy of measuring the 3D pos-
ition in the X-axis is 0.6 cm, the Y-axis is 0.2 cm,
Figure 9. Distribution of time delay of 100
measurements.
and the Z-axis is 0.8 cm in range X(−5–5) cm,
Y(−6–6) cm, and Z(30–60) cm, respectively. The
average processing time depends on the computer
The recorded values of the positions of the X, Y, specification being used. However, both Processor
and Z components match the 3D position of the Intel Core i3 and i5 provide timely processing of
object being measured. The maximum error for less than 50 ms, which is regarded as a real-time
each axis is 0.6 cm for ∆X, 0.2 cm for ∆Y, and process.
0.8 cm for ∆Z.
This accuracy is still different from the
accuracy of the previous distance measurement Data availability statement
between two positions of 0.5 mm [5]. This accur- All data that support the findings of this study are
acy is due to differences in the measurement included within the article (and any supplement-
range and the methods used. Meanwhile, 3D posi- ary files).
tion measurement using Opto-Acoustics yields an
accuracy of 2.2 cm [29] in which has no signific-
ant difference from the results in this study. Acknowledgments
We express our gratitude to the Ministry of
Research, Technology, and Higher Educa-
3.4. Real-time performance
tion/Kemdikbud for providing BPPDN Grants and
The result of the time delay measurement is shown the 2020–2021 Doctoral Dissertation Program so
on the time delay graph as a function of the meas- that this research can be carried out. Thank you
urement sequence, as shown in figure 9. to PT Madeena Karya Indonesia for preparing
The average processing time depended on the stereo camera-based measuring instrument
the computer specification used. Using Processor prototype used in this research.
Intel Core i3 and i5, both provided different time
processing of (52 ± 3) ms and (32 ± 15) ms,
respectively. However, these results were faster ORCID iDs
than the optical flow method at 220 ms [15] and Sigit Ristanto https://orcid.org/0000-0003-
162 ms [16]. An interesting error contribution 1395-8739
when using the Processor Intel Core i5 seemed to Gede B Suparta https://orcid.org/0000-0002-
be due to inevitable internal processing. Based on 9048-5560
these test results, the Intel Core i3 processor pro-
duces a maximum frame rate of 19 FPS, while the
Received 16 September 2021, in final form 8 November 2021
Intel Core i5 processor can produce a maximum Accepted for publication 2 December 2021
frame rate of 30 FPS. https://doi.org/10.1088/1361-6552/ac3f72
March 2022 10 P hy s . E d u c . 5 7 ( 2 0 2 2 ) 0 2 5 0 0 6
Development of laboratory devices for real-time measurement
March 2022 11 P hy s . E d u c . 5 7 ( 2 0 2 2 ) 0 2 5 0 0 6
S Ristanto et al
March 2022 12 P hy s . E d u c . 5 7 ( 2 0 2 2 ) 0 2 5 0 0 6
PAPER
Phys. Educ. 57 (2022) 025024 (10pp) iopscience.org/ped
Comprehensive
observations on pendulum
oscillation using stereo
vision
Sigit Ristanto1,2, Waskito Nugroho1, Eko Sulistya1
and Gede B Suparta1,∗
1
Department of Physics, Universitas Gadjah Mada, D.I Yogyakarta 55281, Indonesia
2
Departement of Physics Education, Universitas PGRI Semarang, Semarang 50232,
Indonesia
E-mail: gbsuparta@ugm.ac.id
Abstract
We have developed a stereo vision system for observing real objects for
experimental physics demonstration. The stereo vision observes objects from
two different points of view. An observer can identify the distance and
velocity of an object based on at least two different points of view. This
paper demonstrates our comprehensive observation in a pendulum oscillation
experiment to test the stereo vision system where a stereo camera is in a
forward view position. The stereo camera produces stereo images. We use
automatic object tracking techniques to obtain the 3D position of the
pendulum in real space based on the sequential stereo images of the
pendulum movement. The experimental results in a 3D graph that shows the
trajectory of the pendulum oscillation. We confirm the common
understanding of the oscillation movement by obtaining the graph of the
pendulum oscillation as a function of time. We can obtain the oscillation
motion parameters such as amplitude, angular frequency, frequency, period,
and initial phase. Also, we can show the damping oscillation phenomena and
the damping factor.
∗
Author to whom any correspondence should be addressed.
March 2022 2 P hy s . E d u c . 5 7 ( 2 0 2 2 ) 0 2 5 0 2 4
Comprehensive observations on pendulum oscillation using stereo vision
March 2022 3 P hy s . E d u c . 5 7 ( 2 0 2 2 ) 0 2 5 0 2 4
S Ristanto et al
bx
X= (9)
d
by
Y= . (10)
d
Furthermore, equations (8)–(10) are used as ref-
erences during the calibration process to obtain a
definitive relationship between the pendulum pos-
ition in the image coordinates and the pendulum
position in the Cartesian coordinates in real space.
Figure 1. The concept of stereo vision triangulation. Furthermore, the oscillating motion of the
The horizontal distance of the pendulum on the left pendulum on the Z-axis can be expressed by
image is xl while the right image is xr . The focus of
the lenses of both cameras is the same, namely f. The
∆Z = ∆Zmax sin (ωt + θ) (11)
distance between the two cameras b. The distance of
the pendulum to the camera Z. The horizontal distance
of the pendulum at the centre of the left camera is X l , where ∆Z max as the amplitude, ω as the angular
while at the centre of the right camera is X r . frequency, and θ as the initial phase.
Other oscillation motion parameters are fre-
with the disparity d defined as quency and period. Frequency is the number of
vibrations in 1 s, while the period is the time it
takes the pendulum to make one vibration. The
d ≡ (xr − xl ) . (5)
relationship of frequency (v) and period (T) to the
angular acceleration can be expressed by
Disparity d is the difference between the hori-
zontal position of the pendulum on the right image 2π
ω= = 2πv (12)
and the left image adjacently. T
The centre of the left image coordinates is
regarded as the centre of the image coordinates as where π is a constant of 3.14159. The angular fre-
well as the centre of the Cartesian coordinates of quency (ω) concerning the parameters of the pen-
real space. Therefore, equation (2) can be rewrit- dulum motion can be expressed by
ten by equation (6) √
g
ω= (13)
x l
X=Z (6)
f where g is the acceleration due to gravity and l is
the length of the pendulum string.
while equation (7) is valid in Y-axis The damping of the oscillation amplitude A(t)
can be expressed by
y
Y=Z . (7)
f A (t) = Ao e−∝t (14)
Based on equations (4), (6), and (7), the rela- where A0 is the initial amplitude, α is a damping
tionship between the pendulum position in image factor, and t is time. Based on the value of angu-
coordinates (x, y, z) and the pendulum position in lar frequency and the damping factor, the damping
Cartesian coordinates (X, Y, Z) in real space can amplitude has three categories, namely
be written as
(1) α > ω Overdamping
fb (2) α > ω Critical damping
Z= (8)
d (3) α < ω Underdamping [30].
March 2022 4 P hy s . E d u c . 5 7 ( 2 0 2 2 ) 0 2 5 0 2 4
Comprehensive observations on pendulum oscillation using stereo vision
March 2022 5 P hy s . E d u c . 5 7 ( 2 0 2 2 ) 0 2 5 0 2 4
S Ristanto et al
Figure 4. A Snapshot image of the stereo vision system from the forward-looking side.
equation (5). A square shape located on the pendu- Y-axis is in the range 0 cm < Y < 1 cm. In addition,
lum characterizes successful object detection. The it appears that the pendulum trajectory is parabolic
caption in the above stereo image indicates the in the Z–Y plane. Figure 5(c) shows the trajectory
disparity value and position of the pendulum, and of the linear pendulum in the Z–X plane with a
the caption at the bottom of the stereo image indic- correlation of 0.98. It also shows that the traject-
ates the 3D position of the pendulum. The exper- ory of the pendulum forms an angle as large as
iment found that the disparity value is 128 pixels, tan−1 (1.1 – ((−1.1)/(42.9 – 31.8))) = 11◦ against
while the pendulum position in the image is at the Z-axis. Figure 5(d) shows that the pendulum
x = 23 pixels and y = 14 pixels. Therefore, we oscillations are on the X-axis with X = −1.2 cm
can calculated the position of 3D pendulum in to X = 1.2 cm, while on the Y-axis is in the range
real space as X = 1.1 cm; Y = 0.7 cm; and 0 cm < Y < 1 cm. In addition, it shows that the
Z = 31.8 cm. The developed software has incor- pendulum trajectory is parabolic in the X–Y plane
porated all formulas. with a correlation of 0.94.
The experimental data set obtained in this
experiment is a stereo video of the pendulum 3.3. Pendulum oscillations
oscillation accompanied by the 3D position meas-
urement. Students in the laboratory can take A sample graph of pendulum oscillations as a
advantage of the stereo video as they can see the function of time for 3 s is shown in figure 6.
3D effect using e.g. a VR BOX. In addition, the Based on figure 6, it appears that for 3 s, the
measurement data is stored in a spreadsheet so that pendulum has oscillated 2.5 s. The curve fitting
it can be recalled for further analysis. A sample of provides the oscillation parameters. The curve fit-
stereo video is provided along with this paper. ting is arranged using equations (11) and found
that the correlation coefficient between the meas-
urement result and the curve fitting is 0.997. The
3.2. Trajectory of pendulum oscillation
result of curve fitting is shown in equation (15),
The results of measuring the 3D position of ( ( π ))
the pendulum show the trajectory of the pendu- ∆Z = 5.5 sin 5.094t − cm. (15)
lum oscillation. The trajectory of the pendulum 2
based on the measurement results is shown in Based on equations (15) and (12), the pendulum
figures 5(a)–(c). oscillation motion parameters obtained are shown
Figure 5(a) shows the trajectory of the pen- in table 1.
dulum seen from a 3D perspective. Figure 5(b) The gravitational acceleration constant is
shows that the pendulum oscillations are on the Z- obtained based on the information obtained in
axis with Z = 31.8 cm to Z = 42.9 cm. While the table 1 and equation (13). By entering the value
March 2022 6 P hy s . E d u c . 5 7 ( 2 0 2 2 ) 0 2 5 0 2 4
Comprehensive observations on pendulum oscillation using stereo vision
Figure 5. Pendulum oscillation trajectory (a) 3D perspective view (b) display in the Z–Y plane (c) display in the
Z–X plane (d) display in the X–Y plane.
March 2022 7 P hy s . E d u c . 5 7 ( 2 0 2 2 ) 0 2 5 0 2 4
S Ristanto et al
where the initial amplitude is 5.5 cm, and the amplitude, angular frequency, frequency, period,
damping factor is 2.8 × 10−3 s−1 . Accord- initial phase, and damping factor. This obser-
ing to Fowles and Cassiday [30], this value vation method has contributed to expanding the
is much smaller than the angular frequency of method of observing pendulum oscillations which
5.094 rad s−1 , so the damping is very weak. These were initially only limited to the forward view dir-
results are commensurate to Ladino [28] but with ection. This method can also be applied to observe
a different approach. other oscillation motions.
We attempt to show how a stereo camera can
be used to record oscillation and then do proper
measurement based on the parameters related to Data availability statement
the stereo camera such as disparity and triangu- All data that support the findings of this study are
lation. We have obtained some key parameters included within the article (and any supplement-
related to the oscillation motion of the pendu- ary files (available online at stacks.iop.org/PED/
lum, such as frequency, amplitude, period, and 57/025024/mmedia)).
initial phase. For further work, the influence of
the length of the pendulum, gravity, and the factor
of isochronism may be explored. This work may Acknowledgments
be completed by doing graphical simulation or All authors express the financial support for
numerical simulation. Yet, it is good to show this research from the Ministry of Research,
in a 3D picture when the energy of the pendu- Technology, and Higher Education of Indone-
lum changes by its position. Therefore it needs sia through BPPDN grants and the 2020–2021
to measure the instantaneous height (Y-axis) Doctoral Research Grant, and from PT Madeena
and the instantaneous velocity of the pendulum Karya Indonesia for the prototype of the stereo
simultaneously. camera.
4. Conclusion
ORCID iD
The results showed successful observation of the
Gede B Suparta https://orcid.org/0000-0002-
phenomenon of pendulum oscillation. Compre-
9048-5560
hensive understandings provide the 3D graph
of the pendulum oscillation trajectory and the
Received 4 September 2021, in final form 10 December 2021
pendulum oscillation graph. Also, it provides Accepted for publication 20 December 2021
the parameters of the oscillation motion such as https://doi.org/10.1088/1361-6552/ac44fc
March 2022 8 P hy s . E d u c . 5 7 ( 2 0 2 2 ) 0 2 5 0 2 4
Comprehensive observations on pendulum oscillation using stereo vision
March 2022 9 P hy s . E d u c . 5 7 ( 2 0 2 2 ) 0 2 5 0 2 4
S Ristanto et al
[25] Gintautas V and Dan Hübler A 2009 A simple, Waskito Nugroho, PhD (RIP).
low-cost, data-logging pendulum built from Previously worked as a lecturer at the
a computer mouse Phys. Educ. 44 488 Department of Physics, Universitas
[26] Yulkifli Y, Afandi Z and Yohandri Y 2017 Gadjah Mada, Indonesia. Mainly
Development of gravity acceleration concerned with programming and
measurement using simple harmonic motion measurement based on the Physics of
pendulum method based on digital Imaging. His works related to Non-
technology and photogate sensor IOP Conf. Destructive Testing, Imaging Physics,
Ser.: Mater. Sci. Eng. 335 012064 Applied Nuclear Physics, and Laser-
Induced Breakdown Spectroscopy
[27] Alho J, Silva H, Teodoro V and Bonfait G 2019
(LIBS).
A simple pendulum studied with a low-cost
wireless acquisition board Phys. Educ.
54 015015
[28] Yang H, Xiao J, Yang T and Qiu C 2011 A Eko Sulistya, PhD, a lecturer at the
simple method to measure the trajectory of a Department of Physics, Universitas
spherical pendulum Eur. J. Phys. Gadjah Mada, Indonesia and also
32 867 the Head of the Basic Physics
[29] Ladino L A and Rondón H S 2017 Determining Laboratory, in charge of carrying
the damping coefficient of a simple out basic physics experimental
pendulum oscillating in air Phys. Educ. services and developing experimental
52 033007 equipment and methods. He works
[30] Fowles G and Cassiday G 2005 Analytical mostly on Computational Physics
Mechanics 7th edn (Belmont, CA: Thomson and developing educational kits using
Brooks/Cole) sensors and Arduino.
March 2022 10 P hy s . E d u c . 5 7 ( 2 0 2 2 ) 0 2 5 0 2 4