Optics and Photonics Journal, 2013, 3, 331-336

doi:10.4236/opj.2013.32B076 Published Online June 2013 (http://www.scirp.org/journal/opj)

Holographic Raman Tweezers Controlled by Hand
Gestures and Voice Commands
Zoltan Tomori1, Marian Antalik1,2, Peter Kesa2, Jan Kanka3, Petr Jakl3, Mojmir Sery3,
Silvie Bernatova3, Pavel Zemanek3
1Department of Biophysics, Institute of Experimental Physics SAS, Kosice, Slovakia
2Department of Biochemistry, P. J. Safarik University, Kosice, Slovakia
3Institute of Scientific Instruments of the ASCR v.v.i., Brno, Czech Republic
Email: tomori@saske.sk

Received 2013

ABSTRACT
Several attempts have appeared recently to control optical trapping systems via touch tablets and cameras instead of a mouse and joystick. Our approach is based on modern low-cost hardware combined with fingertip and speech recognition software. The positions of the operator's hands or fingertips control the positions of the trapping beams in holographic optical tweezers, which provide optical manipulation of microobjects. We tested and adapted two systems for hand position detection and gesture recognition – the Creative Interactive Gesture Camera and the Leap Motion. We further enhanced the holographic Raman tweezers (HRT) system with voice commands controlling the micropositioning stage and the acquisition of Raman spectra. The interface communicates with the HRT either directly, which requires adaptation of the HRT firmware, or indirectly, by simulating mouse and keyboard messages. Its utilization in real experiments speeded up the operator's communication with the system approximately two times in comparison with the traditional control by mouse and keyboard.

Keywords: Holographic Optical Tweezers; Raman Tweezers; Natural User Interface; Leap Motion; Gesture Camera

1. Introduction

Optical tweezers represent a tool that uses a tightly focused laser beam for contactless three-dimensional manipulation of electrically neutral objects of sizes from tens of nanometers to tens of micrometers [1]. An object is trapped near the focus of the laser beam, and repositioning of the beam focus is followed by the object's movement to the new beam focus position. The human operator usually controls the position of the trapping beam by a traditional pointing device such as a mouse or joystick. Some research groups have used 3D joysticks and haptic devices [2]. Manipulation of more objects needs more independent trapping beams, which can be obtained by several different methods. The most flexible one uses a spatial light modulator that splits one beam into more beams with independently positioned foci in all three dimensions. Such holographic optical tweezers [3] can be easily controlled by a PC and, in combination with a system detecting the positions of manipulated objects, provide the basis for efficient feedback control. For example, a CCD camera with appropriate software detects the position of each finger and transforms it into controlled manipulation of several particles simultaneously [4]. In [5] this idea was modified into a "multi-touch console" – a big horizontal board where several operators can work simultaneously. A natural consequence of the recent worldwide expansion of touch screen tablets was their exploitation to move particles in the XY plane, while the Z-axis coordinate is determined by zooming ("stretching" particles between two fingers) [6].

The Microsoft Kinect sensor is able to capture 3D images (X, Y, Z coordinates) and thus opens new possibilities to control optical trapping [7]. However, Kinect was primarily designed to capture the whole human body, and therefore its recognition of near objects (the hands of a sitting person) is limited. The new generation of sensors (Creative Interactive Gesture Camera, Leap Motion) allows this problem to be overcome. They integrate several methods belonging to the quickly growing area of computer vision called "Natural User Interface" (NUI).

Optical micromanipulation techniques can be easily combined with other techniques, such as force measurement (photonic force microscope) or Raman laser spectroscopy (Raman tweezers), with a rather complex control interface. NUI technology in these areas can significantly shorten the dwell-time between the action of the operator and the reaction of the system. Our proof-of-concept experiments combine hand gesture recognition for navigation of the trapping beams with speech recognition for other advanced commands to the HRT.


2. Material and Methods

2.1. Hardware

We used a homemade system utilizing a Spatial Light Modulator (Hamamatsu X10468-03) and a trapping fiber laser (IPG YLM-10-LP-SC) with maximal output power 10 W at wavelength 1070 nm for the holographic optical tweezers, a Shamrock SR-303i spectrometer and a low-noise camera (Andor Newton DU970P) for Raman spectra acquisition, and a Mad City Labs micropositioning stage (Nano-view) for three-dimensional positioning of the sample in the optical microscope of our own design. The optical tweezers and the NUI devices are controlled by a PC (Intel processor, 8 GB RAM) running the Windows 7 operating system.

The hand positions are acquired by two advanced devices released almost at the same time (end of 2012):

• "Creative Interactive Gesture Camera" sold by Intel (shortly "Gesture Camera" in the following text) captures both an RGB image and a depth map [8]. Their combination gives 3D information about the position of the human operator's hands.

• "Leap Motion" is a sensor based on a similar principle to Kinect, but it achieves 200x better precision due to a patented algorithm based on a built-in model of hands and elongated tools (e.g. a pencil) [9]. As members of the official developer community, we had the chance to test the prototype version (not sold yet).

2.2. Software

In our work we exploited several independent software packages and libraries supplied with the hardware:

• HRT software written in LabVIEW, which controls all hardware parts of the optical tweezers in the traditional "keyboard & mouse" mode.

• Intel Perceptual Computing SDK 2013 Beta3 (PCSDK), the software development kit and libraries supplied with the Creative Interactive Gesture Camera. PCSDK contains demo samples explaining how to acquire a depth image of hands and calculate the coordinates of fingertips.

• Leap Motion SDK, supplied with libraries and examples for the development of own applications. The latest version of the Leap Motion SDK supports hand gesture recognition.

• "Nuance Dragon Assistant Core" software for contextual voice recognition, which transforms the voice signal from a microphone into text.

• Microsoft Visual Studio 2010 with the C++ compiler, used for development of the NUI software and for the modifications of the PCSDK samples.

We developed "Natural User Interface" (NUI) software for the control of the HRT system employing the above mentioned libraries, which significantly simplify programming.

2.3. Control of HRT via NUI Module

Figure 1 shows the whole system running asynchronously as follows.

Figure 1. Data flow diagram. Left: The operator sitting in a convenient position with his elbows on the table. Right: Data from the detectors are sent directly to the tweezers firmware (CONTROL SW module) or to the NUI module simulating keyboard (KB) and mouse messages. A) Tweezers control dialog box. B) "Clicker" window, which can be placed above any control button to invoke its clicking by software; horizontal arrows change the level of transparency of the clicker window between fully visible and invisible. C) Live video from the tweezers camera displaying trapped objects; red circles near the centers of particles indicate active traps. D) Two-finger mode (thumb + index) suggesting real tweezers. E) Live video from the NUI camera displaying hands, their centers and index fingertips.

2.3.1. Voice Commands Recognition

The voice detector watches the acoustic signal of the microphones. Although the camera also contains a microphone array, we employed an external microphone to acquire the voice commands. If a continuous signal appears that can be interpreted as a voice command, it is acquired and sent to the speech recognition program (Nuance Dragon Assistant Core). The program compares the voice sample with templates in a dictionary and returns a text string of the most similar word. We created a specific dictionary of commands (see Table 1), which was exploited instead of the default one. The selected set of commands should have a well-recognizable acoustic sound; similar words like "one" and "done" should not both be included in the same dictionary. The limited number of commands significantly reduces the risk of wrong recognition.

Table 1. Dictionary of voice commands.

  #  Command    Param.  Function
  1  click      0       Activate clicker 0 (Raman)
  2  one        1       Activate clicker 1
  3  previous   33      Press key Page Up (focus up)
  4  next       34      Press key Page Down (focus down)
  5  left       37      Press key Left (move stage ←)
  6  up         38      Press key Up (move stage ↑)
  7  right      39      Press key Right (move stage →)
  8  down       40      Press key Down (move stage ↓)

All text strings are based on Unicode coding to simplify the transfer to foreign languages. Modification of the dictionary is quite simple, by editing a text file. Currently, voice commands in English are implemented; the extension to other languages is expected soon.
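To make the key-press path of Table 1 concrete, the following is a minimal sketch of turning a recognized command string into a simulated key press on Windows via the Win32 SendInput API. The virtual key codes are the decimal values from Table 1; the callback name onVoiceCommand and the hand-off from the recognizer are illustrative assumptions, not the actual interface of our NUI module.

    // Sketch: recognized voice command (Table 1) -> simulated key press.
    // Assumes Windows; the recognizer callback is hypothetical.
    #include <windows.h>
    #include <map>
    #include <string>

    // Command -> virtual key code (decimal values from Table 1).
    static std::map<std::string, WORD> makeVoiceKeys() {
        std::map<std::string, WORD> m;
        m["previous"] = 33; // VK_PRIOR (Page Up,  focus up)
        m["next"]     = 34; // VK_NEXT  (Page Down, focus down)
        m["left"]     = 37; // VK_LEFT  (move stage left)
        m["up"]       = 38; // VK_UP    (move stage up)
        m["right"]    = 39; // VK_RIGHT (move stage right)
        m["down"]     = 40; // VK_DOWN  (move stage down)
        return m;
    }
    static const std::map<std::string, WORD> kVoiceKeys = makeVoiceKeys();

    // Simulate pressing and releasing one key in the foreground window.
    static void sendKey(WORD vk) {
        INPUT in[2] = {};
        in[0].type = INPUT_KEYBOARD;  in[0].ki.wVk = vk;
        in[1].type = INPUT_KEYBOARD;  in[1].ki.wVk = vk;
        in[1].ki.dwFlags = KEYEVENTF_KEYUP;
        SendInput(2, in, sizeof(INPUT));
    }

    // Hypothetical callback invoked with the text string returned
    // by the speech recognizer.
    void onVoiceCommand(const std::string& word) {
        std::map<std::string, WORD>::const_iterator it = kVoiceKeys.find(word);
        if (it != kVoiceKeys.end())
            sendKey(it->second);
        // "click" and "one" would instead trigger clicker windows 0 and 1
        // (see Section 2.3.7).
    }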

2.3.2. Hands and Fingertips Detection

We tested both of the mentioned devices. The Gesture Camera periodically (25 times/sec) acquires depth images and identifies connected components (blobs) of the same depth. If the shape of a blob looks like a hand, it detects its skeleton – a set of lines between the hand center and the individual fingers (see Figure 2(d)). The hand detector sends the coordinates of fingertips (X, Y, Z) in real units (e.g. millimeters), where the origin is the center of the sensor. Leap Motion is able to capture approximately 150 frames/sec (more with USB 3.0). Both devices send the positions of fingertips (x, y, z coordinates) to the NUI module along with a flag containing information related to hand visibility, openness, etc.

2.3.3. Hand Gestures Recognition

Both detectors support the recognition of simple one-hand gestures. The Gesture Camera identifies gestures like THUMB UP/DOWN, VICTORY, WAVE, etc. Leap Motion, due to its more precise detection of fingertips, is able to recognize fine gestures: movement of a finger towards the keyboard and back (KEYTAP gesture simulating key pressing), finger movement towards the screen and back (SCREENTAP gesture simulating a click), a swiping motion of a finger (SWIPE gesture, intuitively meaning rejection), CIRCLE, etc. All these gestures can be transformed into commands and exploited to control optical tweezers similarly to voice commands.

2.3.4. Screen Calibration

In the simplest case, the camera is placed at the center of the screen top edge and we roughly consider the z-coordinate to be equal to the distance of the fingertip from the screen; the screen coordinates (x, y) are given by the ratio of screen/camera resolution. In some situations, however, the sensor can be placed apart from the screen (e.g. Leap Motion usually lies on the desktop in front of the screen). Leap Motion supplies a calibration program as well as a set of functions (e.g. for the calculation of the distance between the fingertip and the screen plane). The relation between the sensor and the screen coordinate systems is called a "pose" and is defined by the transformation matrix which contains rotation, translation and scale factors [10]. This matrix, obtained by the calibration process, transforms a 3D fingertip position (X, Y, Z) to the screen coordinate (x, y). The user can exploit his finger as a laser pointer – the cursor appears on the calibrated screen in the position where the finger is pointing.
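As an illustration of the fingertip hand-off described in Section 2.3.2, the sketch below polls fingertip positions through the Leap Motion C++ API of that period [9] (Controller::frame, Frame::fingers, Finger::tipPosition). The FingertipSample structure and the sendToNui() callback are hypothetical stand-ins for the NUI module interface, not part of the SDK.

    // Sketch: polling fingertip positions from the Leap Motion sensor [9]
    // and forwarding them, with a visibility flag, to the NUI module.
    #include "Leap.h"

    struct FingertipSample {
        float x, y, z;     // millimeters, origin at the sensor center
        bool handVisible;  // at least one hand in the field of view
    };

    void pollFingertips(const Leap::Controller& controller,
                        void (*sendToNui)(const FingertipSample&)) {
        const Leap::Frame frame = controller.frame();   // most recent frame
        const bool visible = !frame.hands().isEmpty();
        const Leap::FingerList fingers = frame.fingers();
        for (Leap::FingerList::const_iterator it = fingers.begin();
             it != fingers.end(); ++it) {
            const Leap::Vector tip = (*it).tipPosition();
            FingertipSample s = { tip.x, tip.y, tip.z, visible };
            sendToNui(s);   // hypothetical hand-off to the NUI module
        }
    }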

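The calibration of Section 2.3.4 then amounts to applying the estimated pose to each fingertip sample. A minimal sketch follows, assuming an isotropic scale, a 3x3 rotation matrix and a translation vector as in [10]; the concrete structure layout is illustrative, not the calibration program's actual output format.

    // Sketch: applying the calibrated "pose" to map a 3D fingertip position
    // in sensor coordinates to calibrated screen coordinates [10].
    struct Pose {
        double R[3][3]; // rotation matrix
        double t[3];    // translation vector (sensor -> screen)
        double s;       // isotropic scale factor
    };

    struct ScreenPoint { double x, y, depth; };

    ScreenPoint toScreen(const Pose& p, double X, double Y, double Z) {
        double out[3];
        for (int i = 0; i < 3; ++i)
            out[i] = p.s * (p.R[i][0] * X + p.R[i][1] * Y + p.R[i][2] * Z) + p.t[i];
        // out[0], out[1] index the calibrated screen; the residual component
        // can serve as the fingertip-to-screen distance (e.g. for tap detection).
        ScreenPoint sp = { out[0], out[1], out[2] };
        return sp;
    }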
Figure 2. Example of gestures recognized by the Leap Motion detector. Left: KEYTAP gesture; right: SWIPE gesture. (Pictures copied from the Leap Motion documentation [9].)

2.3.5. Communication between Programs

The idea was to catch the control commands from the camera and voice recognition modules by an intermediate NUI module, which controls the system firmware by simulating keyboard and mouse messages. The communication is based on the client/server strategy using the UDP network protocol (OSC format [11], supported by the "liblo" library). The head of an OSC message contains the identification string "/voice" or "/hands", which the NUI module detects and processes by a proper function. Currently, both client and server programs run on the same computer; however, the network protocols lead to a straightforward extension to a real network, as described in the section "Future Work".

2.3.6. Direct Control of a System

The fingertip recognition module as well as the speech recognition module are able to control systems directly, assuming a proper communication interface (such as in our HRT system). Table 1, column "Param.", shows the parameters tied with the given command (either the virtual code of a keyboard key in decimal format or the identifier of a clicker). If this is not the case for a system, cooperation with the developer of such a system is needed.
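For the OSC exchange of Section 2.3.5, the client side can be as simple as the following sketch built on the liblo C library [11]. The port number and the argument layouts of the "/voice" and "/hands" messages shown here are illustrative assumptions.

    /* Sketch: client side of the OSC/UDP hand-off (Section 2.3.5) using the
       liblo library [11]. Port and message layouts are assumptions. */
    #include <lo/lo.h>

    int main(void) {
        /* Server (NUI module) address; client and server currently run on
           the same computer. */
        lo_address nui = lo_address_new("localhost", "7770");

        /* A recognized voice command: one string argument. */
        lo_send(nui, "/voice", "s", "click");

        /* A fingertip update: x, y, z in millimeters as three floats. */
        lo_send(nui, "/hands", "fff", 12.5f, -3.0f, 140.0f);

        lo_address_free(nui);
        return 0;
    }

On the server side, liblo dispatches each incoming message to a handler registered for its path ("/voice" or "/hands"), which matches the per-path processing described above.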
2.3.7. Indirect Control of a System

The majority of systems are typically controlled by firmware via the keyboard (pressing and releasing a key) and the mouse (clicking the corresponding button on a dialog box). To avoid the complications connected with direct control, we developed an indirect control via the simulation of mouse and keyboard events instead. While the simulation of a keyboard event is straightforward, mouse event simulation has to take into account the variable position of the control dialog on the screen. We therefore developed a small target-like window with adjustable transparency (window B in Figure 1). This "clicker" window should be manually placed above the controlled button of the dialog box before the experiment. As a result, the NUI program can "click" the external program's button whenever required.

2.3.8. Modes of Operation

Traditional control of the HRT system by mouse or joystick can be combined with the new NUI control tools in many ways. Conservative users can prefer a traditional approach, keeping the mouse in the primary (usually right) hand for the precise and reliable positioning of the laser trap; the other (secondary) hand can show the target position of a moving particle or it can define the gestures controlling the mode of operation. Of course, we can also replace the mouse by the fingertip detector completely. Leap Motion is able to capture the movement of up to 10 particles per user (and the number of users is not limited to one). Generally, the risk of wrong fingertip detection increases with the number of tracked fingers; a reasonable compromise is the use of the two index fingers of both hands, with the possibility to operatively replace the primary hand by the mouse when necessary. In our experiment both gestures were exploited: the KEYTAP gesture activates the pointed laser trap and the SWIPE gesture deactivates it (see Figure 3). The optimal mode of operation is the user's choice and it strongly depends on the application and the type of samples.
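The gesture-to-trap mapping just described (KEYTAP activates the pointed trap, SWIPE deactivates it) can be sketched against the Leap Motion gesture API [9] as follows. The activateTrap/deactivateTrap functions are hypothetical hooks into the tweezers control software, stubbed here with console output.

    // Sketch: translating Leap Motion gestures into trap commands, following
    // the mapping of Section 2.3.8. Trap functions are hypothetical hooks.
    #include <cstdio>
    #include "Leap.h"

    static void activateTrap(const Leap::Vector& at) {
        std::printf("trap on  (%.1f, %.1f, %.1f)\n", at.x, at.y, at.z);
    }
    static void deactivateTrap(const Leap::Vector& at) {
        std::printf("trap off (%.1f, %.1f, %.1f)\n", at.x, at.y, at.z);
    }

    void setupGestures(Leap::Controller& controller) {
        controller.enableGesture(Leap::Gesture::TYPE_KEY_TAP);
        controller.enableGesture(Leap::Gesture::TYPE_SWIPE);
    }

    void dispatchGestures(const Leap::Controller& controller) {
        const Leap::GestureList gestures = controller.frame().gestures();
        for (Leap::GestureList::const_iterator it = gestures.begin();
             it != gestures.end(); ++it) {
            const Leap::Gesture g = *it;
            if (g.type() == Leap::Gesture::TYPE_KEY_TAP) {
                activateTrap(Leap::KeyTapGesture(g).position());
            } else if (g.type() == Leap::Gesture::TYPE_SWIPE &&
                       g.state() == Leap::Gesture::STATE_STOP) {
                deactivateTrap(Leap::SwipeGesture(g).position());
            }
        }
    }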

3. Experiment

We used the described HRT system, and as the sample we took droplets of liquid crystal (6CB or 8CB) dyed with Nile red and dispersed in water. The droplets were manipulated by two laser beams of total power 3 W. We found that the fingertip navigation is more sensitive than the traditional mouse control in experiments where we excited the whispering gallery modes [12] in the LC droplets by navigating the laser beam precisely onto the droplet edge.

The angle between the thumb and index finger can intuitively suggest the function of real tweezers. If this angle is above some threshold, the system uses only the XY coordinates to move objects; if the angle between the index finger and thumb is minimal, closing of the hand activates the trap at the given position. Moving of the closed hand intuitively corresponds to the moving of the trapped objects. A closed hand with the index finger up increases the sensitivity of detection. One possibility is to exploit the open-hand gesture for focusing: the NUI module was programmed in such a way that for the open hand it evaluates the Z-coordinates and changes the focus according to the distance of the hand from the screen; however, this function was rarely used in the experiment.

Voice commands are helpful especially if both hands are occupied. If both hands were busy manipulating the objects, the voice commands were very helpful to control the other functions of the system described in Table 1. We placed the clicker window above the button RAMAN of the control dialog, and thus the voice command "click" switches the device to the Raman spectra measurement mode.
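A minimal sketch of the thumb-index angle test described above follows. The 30-degree threshold is an assumed value; the text only requires the angle to be "above some threshold".

    // Sketch: choosing the manipulation mode from the thumb-index angle.
    #include <cmath>

    struct Vec3 { double x, y, z; };

    static double angleDeg(const Vec3& a, const Vec3& b) {
        const double kPi = 3.14159265358979323846;
        double dot = a.x * b.x + a.y * b.y + a.z * b.z;
        double na = std::sqrt(a.x * a.x + a.y * a.y + a.z * a.z);
        double nb = std::sqrt(b.x * b.x + b.y * b.y + b.z * b.z);
        return std::acos(dot / (na * nb)) * 180.0 / kPi;
    }

    // true  -> wide thumb-index angle: move objects in the XY plane only
    // false -> minimal angle ("closed tweezers"): trap at the given position
    bool xyOnlyMode(const Vec3& thumbDir, const Vec3& indexDir) {
        const double kThresholdDeg = 30.0; // assumed threshold value
        return angleDeg(thumbDir, indexDir) > kThresholdDeg;
    }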
4. Future Work

A comparison of both sensors mentioned above is beyond the scope of this paper and would require more extensive testing. Nevertheless, our experiments showed that Leap Motion is more precise, faster, more reliable and has a simpler SDK. On the other hand, the Gesture Camera and SDK from Intel have a broader range of NUI functions, e.g. color images and depth maps (not only fingertip coordinates), and exploit the OpenCV library, suitable for image processing applications. A comparison of prices ($70 and $150) does not make sense in this application.

We plan additional testing of other NUI software tools in order to achieve better control. We will try to define a set of specific gestures for optical tweezers; further experiments should determine the optimal set of gestures and voice commands. Our software was designed to remain open for future improvements. We plan to extend the software to a full network version allowing remote control of tweezers ("NUI teletweezing"). This extension assumes streaming of live images from the microscope camera and sending them to the client. Then semi-automated methods of optical trapping based on image analysis would be possible.

5. Conclusions

Extremely fast progress in NUI technology brings an intensive search for possible applications in various areas. In our opinion, one of such areas is optical micromanipulation with microobjects, where the three-dimensional positions of trapped objects are intuitively controlled by fingertip positions combined with gestures and voice commands. Our proof-of-concept experiment showed that NUI increases the efficiency of the tweezers control approximately two times compared to mouse-based trapping. However, this number can be higher with increasing experimental experience; the efficiency is also image dependent and task dependent. Unlike the solution based on the Microsoft Kinect sensor [7], our solution allows convenient work in a sitting position with elbows supported by the table. In any case, the application of NUI methods is a way to improve interactive micromanipulation techniques with respect to the expected standardization in this area.

6. Acknowledgements

This work was supported by the Slovak research grant agencies APVV (grant 0526-11) and VEGA (grant 2-191-11), the Slovak Academy of Science in frame of CEX NANOFLUID, and the Agency for Structural Funds of EU (projects 26220120033 and 26220220061). We thank Leap Motion for providing the prototype of the sensor.

REFERENCES

[1] K. C. Neuman and S. M. Block, "Optical Trapping," Review of Scientific Instruments, Vol. 75, No. 9, 2004, pp. 2787-2809. doi:10.1063/1.1785844

[2] R. Bowman, D. Preece, G. Gibson and M. Padgett, "Stereoscopic Particle Tracking for 3D Touch, Vision and Closed-loop Control in Optical Tweezers," Journal of Optics, Vol. 13, No. 4, 2011, 044003. doi:10.1088/2040-8978/13/4/044003

[3] J. E. Curtis, B. A. Koss and D. G. Grier, "Dynamic Holographic Optical Tweezers," Optics Communications, Vol. 207, No. 1-6, 2002, pp. 169-175. doi:10.1016/S0030-4018(02)01524-9

[4] G. Whyte, G. Gibson, J. Leach, M. Padgett, D. Robert and M. Miles, "An Optical Trapped Microhand for Manipulating Micron-sized Objects," Optics Express, Vol. 14, No. 25, 2006, pp. 12497-12502. doi:10.1364/OE.14.012497

[5] R. W. Bowman, G. Gibson, D. Carberry, L. Picco, M. Miles and M. J. Padgett, "iTweezers: Optical Micromanipulation Controlled by an Apple iPad," Journal of Optics, Vol. 13, No. 4, 2011, 044002. doi:10.1088/2040-8978/13/4/044002

[6] J. A. Grieve, A. Ulcinas, S. Subramanian, G. M. Gibson, M. J. Padgett, D. A. Carberry and M. J. Miles, "Hands-on with Optical Tweezers: A Multitouch Interface for Holographic Optical Trapping," Optics Express, Vol. 17, No. 5, 2009, pp. 3595-3602. doi:10.1364/OE.17.003595

[7] C. McDonald, M. McPherson, C. McDougall and D. McGloin, "HoloHands: Kinect Control of Optical Tweezers," arXiv:1211.0220v1 [physics.pop-ph], 2013.

[8] Intel Perceptual Computing SDK 2013 Beta. http://software.intel.com/en-us/vcsource/tools/perceptual-computing-sdk

[9] "Leap Motion". https://www.leapmotion.com

[10] R. Hartley and A. Zisserman, "Multiple View Geometry in Computer Vision," 2nd Edition, Cambridge University Press, 2004.

[11] Open Sound Control format specification. http://opensoundcontrol.org

[12] F. Vollmer and S. Arnold, "Whispering-Gallery-Mode Biosensing: Label-Free Detection Down to Single Molecules," Nature Methods, Vol. 5, No. 7, 2008, pp. 591-596. doi:10.1038/Nmeth.1221