Android: Single Stroke Gesture Recognition using $1 Detector

Pi19404
April 5, 2013

Contents

0.1 Introduction
0.2 $1 Unistroke Recognizer
0.3 Gesture Normalization
    0.3.1 Registering candidate
    0.3.2 Rejecting invalid Gesture
    0.3.3 Re-sampling points
    0.3.4 Scaling
    0.3.5 Translation
0.4 Rotational Invariance
0.5 Rejecting gestures
0.6 Robustness to rotation
0.7 Computing the Similarity Score
    0.7.1 Loading the Template Gestures
    0.7.2 Android Gesture Capture Application
0.8 Launch Application
0.9 Code
References

0.1 Introduction

In this article we will look at single stroke gesture recognition on the Android platform using the $1 Unistroke Recognizer by Anthony and Wobbrock [1]. The method is simple and computationally efficient, can recognize a wide variety of gestures, and is well suited to mobile platforms.

0.2 $1 Unistroke Recognizer

The $1 Unistroke Recognizer is a method to recognize single-stroke (unistroke) gestures made by the user through the mobile touch interface. A user's gesture results in a set of candidate points, and we must determine which set of previously recorded template points it most closely matches.

The candidate and template points are both obtained interactively. In the present application the candidate points are obtained using the mobile touch-screen interface, and are spatially sampled at the rate determined by the touch screen of the mobile device. The templates are constructed using a desktop application, using the mouse to draw the gesture, and hence are spatially sampled at the rate determined by the desktop mouse interface. In the present implementation we assume that the raw template points have already been captured using the desktop application and that the sampled points are stored in a file. In the present application we will ignore the speed at which the gestures are drawn.
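
Before going into the individual steps, the overall recognition pipeline can be summarized in a short sketch. The method names below mirror the PUtils methods described in the following sections, but the exact signatures in the accompanying code may differ, so treat this as illustrative only.

    // Illustrative $1 pipeline: normalize the captured gesture,
    // then compare it against the stored templates.
    Vector pts = Resample(capturedPoints, 32);  // fixed number of points
    pts = RotateToZero(pts);                    // indicative angle -> 0 degrees
    pts = ScaleToSquare1(pts, 250);             // bound within a 250x250 square
    pts = TranslateToOrigin(pts);               // centroid at the origin
    Recognize(pts, templates);                  // returns best template and score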

0.3 Gesture Normalization

The template and the candidate points may contain different numbers of sampled points; as mentioned earlier, the number of points captured depends on the spatial resolution of the device. The template and candidate gestures may also differ in size, and their spatial locations are not the same, i.e. the template and candidate points would not line up. Hence the first step is to transform the candidate and template points so that they can be compared; this pre-processing step is called gesture normalization. The aim is to transform the gestures so that they are invariant to translation, scale and rotation.

0.3.1 Registering candidate

The first step is to capture the candidate gesture; this step is called registering the candidate. The gesture capture process is defined to be in one of three states: start, dragged and released. The start state indicates that a gesture has begun and that any previously stored information should be cleared. The dragged state indicates that the unistroke is being performed without lifting the finger and that the 2D coordinates of the gesture are being captured. The released state indicates that the finger has been lifted, the gesture capture process is complete, and the gesture recognition process can begin.

The class AndroidDollar defines the Android routines that capture the touch gesture performed by the user. The class DataCapture provides the high-level interface to capture the data and to initiate gesture recognition; AndroidDollar contains an instance of DataCapture whose methods are called based on the touch events generated by the user.
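
A minimal sketch of how the three states map onto Android touch events is shown below. The On method with a boolean flag and coordinates is described later in this article; the AddPoint helper and the exact view wiring are assumptions made for illustration.

    // Sketch: mapping Android touch events to start/dragged/released.
    @Override
    public boolean onTouchEvent(MotionEvent event) {
        float x = event.getX(), y = event.getY();
        switch (event.getAction()) {
            case MotionEvent.ACTION_DOWN:  // start: clear previous gesture data
                capture.On(true, x, y);
                break;
            case MotionEvent.ACTION_MOVE:  // dragged: accumulate 2D coordinates
                capture.AddPoint(x, y);    // assumed helper name
                break;
            case MotionEvent.ACTION_UP:    // released: trigger recognition
                capture.On(false, x, y);
                break;
        }
        return true;
    }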

The Java class DataVector captures the 2D coordinate information of the drawn gesture. The DataCapture class contains an instance of the PUtils class and an instance of DataVector; the class PUtils contains all the methods for gesture pre-processing and recognition.

0.3.2 Rejecting invalid Gesture

A simple check is incorporated to determine whether the gesture was intentional or not by specifying a path length criterion. If the path length of the gesture is less than a specified threshold, no further processing is performed and a "no gesture recognized" status is displayed. The PathLength method defined in the PUtils class simply computes the sum of linear distances between all the adjacent points of the captured/template gesture.

    public double PathLength(Vector points) {
        double length = 0;
        for (int i = 1; i < points.size(); i++) {
            length += Distance((Point) points.elementAt(i - 1),
                               (Point) points.elementAt(i));
        }
        return length;
    }

In the present implementation the path length threshold used is 100.
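
As a usage illustration, the check can be applied immediately after the gesture is released. This is a minimal sketch under the stated threshold; the showStatus helper and the variable names are hypothetical.

    // Reject unintentional touches: a gesture shorter than the
    // threshold is ignored and no recognition is attempted.
    double len = PathLength(capturedPoints);
    if (len < 100) {
        showStatus("No gesture recognized"); // hypothetical helper
        return;
    }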

0.3.3 Re-sampling points

Once the gesture has been captured, and before the candidate gesture is compared with the template gestures, some pre-processing operations are performed; re-sampling is one such operation. The re-sampling operation selects a fixed subset of points from the provided candidate/template gesture. This ensures that candidate and template have the same number of points, enabling us to perform a point-by-point comparison.

The method used for sampling the data points is uniform sampling. The total path length is divided into equal intervals (one fewer than the number of re-sampled points); this gives the interval length I between consecutive re-sampled points. We start with the initial point, and the next point is selected such that the distance traveled along the path is greater than or equal to the interval length. A linear path is assumed to exist between adjacent sample points; let two adjacent points be labeled pt1 and pt2. By linear interpolation along the segment we can estimate the location, lying between pt1 and pt2, at a distance of exactly one uniform path interval. This new coordinate replaces pt2 in the candidate/template coordinate array and the same process is repeated till the last point of the coordinate array is reached. This is implemented by the Resample method in the PUtils class; its inner loop is shown below.

    double d = Distance(pt1, pt2);
    if ((D + d) >= I) {
        // interpolate a new point at distance I along the path
        double qx = pt1.x + ((I - D) / d) * (pt2.x - pt1.x);
        double qy = pt1.y + ((I - D) / d) * (pt2.y - pt1.y);
        Point q = new Point(qx, qy);
        // add the point to the resampled array
        dstPts.addElement(q);
        // insert q into the source array so it acts as pt1 next iteration
        srcPts.insertElementAt(q, i);
        // reset the cumulative distance
        D = 0;
    } else {
        // accumulate the distance along the path
        D = D + d;
    }

In the present implementation the number of re-sampled points used is 32.
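
For reference, a sketch of the complete Resample method that wraps this loop is given below. It follows the published $1 pseudocode, so it may differ in detail from the Resample method in the accompanying PUtils class.

    // Resample a gesture to n equidistant points (n = 32 here).
    // Note: the source vector is modified by the inserted points.
    public Vector Resample(Vector points, int n) {
        double I = PathLength(points) / (n - 1); // uniform interval length
        double D = 0.0;                          // accumulated distance
        Vector dstPts = new Vector(n);
        dstPts.addElement(points.elementAt(0));
        for (int i = 1; i < points.size(); i++) {
            Point pt1 = (Point) points.elementAt(i - 1);
            Point pt2 = (Point) points.elementAt(i);
            double d = Distance(pt1, pt2);
            if ((D + d) >= I) {
                double qx = pt1.x + ((I - D) / d) * (pt2.x - pt1.x);
                double qy = pt1.y + ((I - D) / d) * (pt2.y - pt1.y);
                Point q = new Point(qx, qy);
                dstPts.addElement(q);
                points.insertElementAt(q, i); // q becomes pt1 next iteration
                D = 0.0;
            } else {
                D += d;
            }
        }
        // rounding can leave the result one point short; append the last point
        if (dstPts.size() == n - 1) {
            dstPts.addElement(points.elementAt(points.size() - 1));
        }
        return dstPts;
    }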

0.3.4 Scaling

The next pre-processing step is to scale the coordinates such that the width and height of the gesture remain within a fixed bound. First the bounding width and height of the current set of points are computed, which are simply W = max(x) - min(x) and H = max(y) - min(y). All the points are then scaled by the factor size/W in x and size/H in y, where size is the desired dimension of the bounding box. This is implemented by the method ScaleToSquare1 in the PUtils class.

    Rectangle B = BoundingBox(points);
    Vector newpoints = new Vector(points.size());
    Enumeration e = points.elements();
    while (e.hasMoreElements()) {
        Point p = (Point) e.nextElement();
        double qx = p.x * (size / B.Width);   // size = 250 here
        double qy = p.y * (size / B.Height);
        newpoints.addElement(new Point(qx, qy));
    }

This step provides invariance with respect to scaling, since all gestures are bounded to lie within a rectangle of the same size. In the present implementation the scaling is done so that the bounding box is a square of dimension 250.

The above method scales each axis independently and hence does not preserve the aspect ratio; an alternative is uniform scaling, which maintains the aspect ratio and is better suited to essentially one-dimensional gestures such as lines. Compute the ratio between the shorter and longer sides of the bounding rectangle: if the ratio is closer to 0 than to 1, the gesture is essentially one-dimensional and uniform scaling is performed; otherwise the non-uniform scaling above is performed (a sketch follows below). The ScaleDimTo method in the PUtils class implements this.
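
A sketch of this decision is shown below. The article only describes the test qualitatively, so the threshold value and the ScaleTo helper are assumptions made for illustration.

    // Choose the scaling mode from the bounding-box aspect ratio.
    Rectangle B = BoundingBox(points);
    double ratio = Math.min(B.Width, B.Height) / Math.max(B.Width, B.Height);
    Vector scaled;
    if (ratio <= 0.25) { // assumed 1D threshold
        // essentially a 1D gesture (e.g. a line): scale uniformly by the
        // longer side so the aspect ratio is preserved
        scaled = ScaleTo(points, size / Math.max(B.Width, B.Height));
    } else {
        // a 2D gesture: scale each axis independently to the square
        scaled = ScaleToSquare1(points, size);
    }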

0.3.5 Translation

The next step is to translate all the points such that the centroid lies at the origin of the coordinate system. The first step required is the computation of the mean/centroid of the set of coordinate locations; this is implemented by the method Centroid in the class PUtils.

    double xsum = 0, ysum = 0;
    for (int i = 0; i < points.size(); i++) {
        Point p = (Point) points.elementAt(i);
        xsum += p.x;
        ysum += p.y;
    }
    return new Point(xsum / points.size(), ysum / points.size());

All the points are then translated by (-centroidx, -centroidy); this is implemented in the method TranslateToOrigin in the PUtils class.

    Point c = Centroid(points);
    Vector newpoints = new Vector(points.size());
    for (int i = 0; i < points.size(); i++) {
        Point p = (Point) points.elementAt(i);
        double qx = p.x - c.x;
        double qy = p.y - c.y;
        newpoints.addElement(new Point(qx, qy));
    }

0.4 Rotational Invariance

If rotational invariance is desired, the line joining the centroid to the first point of the gesture can be rotated so that it lies along the 0-degree axis. This ensures that all templates are aligned to a common orientation. The RotateToZero method in the PUtils class performs this action, as sketched below.
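
The sketch below follows the published $1 pseudocode for this rotation; treat it as illustrative rather than the exact RotateToZero body in PUtils.

    // Rotate the gesture so the angle between the centroid and the
    // first point (the "indicative angle") becomes zero.
    public Vector RotateToZero(Vector points) {
        Point c = Centroid(points);
        Point first = (Point) points.elementAt(0);
        double theta = Math.atan2(c.y - first.y, c.x - first.x);
        return RotateBy(points, -theta);
    }

    // Rotate all points about the centroid by the given angle.
    public Vector RotateBy(Vector points, double theta) {
        Point c = Centroid(points);
        double cos = Math.cos(theta), sin = Math.sin(theta);
        Vector newpoints = new Vector(points.size());
        for (int i = 0; i < points.size(); i++) {
            Point p = (Point) points.elementAt(i);
            double qx = (p.x - c.x) * cos - (p.y - c.y) * sin + c.x;
            double qy = (p.x - c.x) * sin + (p.y - c.y) * cos + c.y;
            newpoints.addElement(new Point(qx, qy));
        }
        return newpoints;
    }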

0.5 Rejecting gestures

After the pre-processing steps the candidate and template gestures are ready to be compared. However, we can cheaply reject non-matching templates by incorporating a crude comparison first. For this, compute the normalized vector between the starting point of the gesture and the point 1/8 of the way along the gesture length. This is done by computing the difference in the x and y coordinates of the two points, normalized by the distance between them; this vector is called the StartUnitVector. The CalcStartUnitVector method of the PUtils class computes it.

    Point i1 = (Point) points.elementAt(0);
    Point i2 = (Point) points.elementAt(index); // point 1/8 along the gesture
    Point v = new Point(i2.x - i1.x, i2.y - i1.y);
    double len = Math.sqrt(v.x * v.x + v.y * v.y);
    return new Point(v.x / len, v.y / len);

To compare the StartUnitVectors of the template and the candidate, the angle between the unit vectors is used as the criterion. The AngleBetweenUnitVectors method computes this difference in angle:

    double n = (v1.x * v2.x + v1.y * v2.y); // dot product of unit vectors
    return Math.acos(n);                    // arc cosine gives the angle

In the present implementation the angle similarity threshold is set to 30 degrees; templates whose start angle differs by more than the threshold are rejected.

0.6 Robustness to rotation

Although the above pre-processing steps normalize the gesture with respect to scale, translation and orientation, every time a gesture is drawn there may be subtle variations, such as a small change in orientation; hence some robustness to rotation is required to obtain a good score. This is achieved by rotating the candidate points about the centroid from -theta to +theta in small increments and computing the score at each incremental angle. The best matching score amongst all the angles is selected.

0.7 Computing the Similarity Score

The final step is to compute the similarity score.

As mentioned earlier, a scan over angles within a specified range is performed; it is assumed that the best score exists at some angle within this range. Instead of performing a uniform search, the golden section search algorithm is used to find this optimum, successively narrowing the range of values inside which the best score is known to exist. The DistanceAtBestAngle method in the PUtils class computes the best score by applying the golden section search over the range of specified angles.

The cost function is provided by the DistanceAtAngle function, which computes the distance between the candidate, rotated by the specified angle, and the template. The DistanceAtAngle method in the PUtils class transforms the points such that all points are rotated by the specified angle before comparison. The score is computed as the average Euclidean distance between the two sets of 2D points; the PathDistance method in the PUtils class computes it.

    double distance = 0;
    for (int i = 0; i < path1.size(); i++) {
        distance += Distance((Point) path1.elementAt(i),
                             (Point) path2.elementAt(i));
    }
    return distance / path1.size();

The Recognize method in the PUtils class provides the interface to the gesture recognition routine.
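
The article does not list the DistanceAtBestAngle body, so the sketch below follows the published $1 pseudocode; the angle range (plus/minus 45 degrees) and the 2-degree stopping tolerance are the values from the original $1 recognizer and are assumptions here.

    // Golden section search for the angle giving the minimum distance.
    double Phi = 0.5 * (-1.0 + Math.sqrt(5.0)); // golden ratio
    double a = -Math.PI / 4, b = Math.PI / 4;
    double tolerance = Math.toRadians(2.0);
    double x1 = Phi * a + (1.0 - Phi) * b;
    double f1 = DistanceAtAngle(points, template, x1);
    double x2 = (1.0 - Phi) * a + Phi * b;
    double f2 = DistanceAtAngle(points, template, x2);
    while (Math.abs(b - a) > tolerance) {
        if (f1 < f2) {
            b = x2; x2 = x1; f2 = f1;
            x1 = Phi * a + (1.0 - Phi) * b;
            f1 = DistanceAtAngle(points, template, x1);
        } else {
            a = x1; x1 = x2; f1 = f2;
            x2 = (1.0 - Phi) * a + Phi * b;
            f2 = DistanceAtAngle(points, template, x2);
        }
    }
    double best = Math.min(f1, f2); // minimum distance over the range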

0.7.1 Loading the Template Gestures

The template gestures were captured using a desktop application and their coordinates are stored in CSV files. A directory is created for each class of template, and multiple files can be placed within each directory; for each file a template object is created. The templates are loaded once, during initialization. Comparison is made with all the loaded templates and the best score amongst the templates is selected.

The templates consist of shapes made in both clockwise and anticlockwise manner; they are labeled (1,11), (2,12), (3,13), (9,15) and (10,14) for the clockwise and anticlockwise versions respectively. Examples of these templates are demonstrated in the present application.

In the CSV files the raw x and y coordinates of the gesture trajectory of the mouse pointer captured by the desktop application are written side by side, and this entire CSV string is loaded into an integer array. Hence while loading a template the nth location of the integer array is assigned to the x coordinate while the (n+1)th location is assigned to the y coordinate of the 2D coordinate object represented by the Point class. An excerpt of the data contained in a CSV file is shown below; it contains x and y coordinates stored in adjacent locations.

    123,132,128,57,154,56, ... ,147,142,143,150,152,157,155,140,147,155,153,150,158
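
A sketch of loading such a file into Point objects is shown below; reading the file contents into the csv string is omitted, and the variable names are illustrative.

    // Parse a template CSV string of interleaved x,y values into Points:
    // the nth value becomes x and the (n+1)th value becomes y.
    String[] tokens = csv.trim().split(",");
    Vector templatePoints = new Vector(tokens.length / 2);
    for (int n = 0; n + 1 < tokens.length; n += 2) {
        int x = Integer.parseInt(tokens[n].trim());
        int y = Integer.parseInt(tokens[n + 1].trim());
        templatePoints.addElement(new Point(x, y));
    }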

147. The AndroidGesture capture utility will contain a instance of DataCapture which provides high level interface to gesture registering and recognizing methods.143. 12 | 13 .Android :Single Stroke Gesture Recognition using $1 Detector 142.Few modifications we made to gesture capture interface. When the gesture is completed the On method of DataCapture Class is called with first boolean parameter set to false and second and third parameters are the final x and y co-ordinates respectively. When the gesture is desired to be captured On method of DataCapture class is called with the first boolean parameter set to true and second.147.150.com/p/m19404/source/browse/Android/AndroidGesture.9 Code The code can be found in code repository https://github.third parameters are x and y co-ordinates respectively.152.8 Launch Application Transfer the apk file generated to device and test the application.com/ pi19404/m19404/tree/master/Android/AndroidGesture or https://code.7.140.157.155. google. 0.150.2 Android Gesture Capture Application The android gesture capture exampled code is based on the TouchPaint example provided in the android samples directory under graphics sub-directory.153.158 This contains x and y co-ordinates stored in adjacent locations.The library files are not placed in the repository download them from appropriate packages on send a mail separately for download link.155. 0. 0. The header files is located in jni directory.

References

[1] Lisa Anthony and Jacob O. Wobbrock. "A lightweight multistroke recognizer for user interface prototypes". In: Proceedings of Graphics Interface 2010 (GI '10). Ottawa, Ontario, Canada: Canadian Information Processing Society, 2010, pp. 245-252. isbn: 978-1-56881-712-5. url: http://dl.acm.org/citation.cfm?id=1839214.1839258.

[2] Gary R. Bradski. "Computer Vision Face Tracking For Use in a Perceptual User Interface". In: Intel Technology Journal Q2 (1998). url: http://citeseerx.ist.psu.edu/viewdoc/summary?doi=10.1.1.14.7673.

[3] Paul A. Viola and Michael J. Jones. "Rapid Object Detection using a Boosted Cascade of Simple Features". In: CVPR (1). 2001, pp. 511-518.
