Gesture Recognition Based on Scratch Inputs

Gary Halajian (gh96) John Wang (jbw48) ECE 4760 - Final Project April 26, 2009

Contents

Introduction
High Level Design
Program/Hardware Design
Results of the Design
Conclusions
Appendix A: Commented Code
    o Gesture Recognition Code
    o PC Interface Code
Appendix B: Schematics
Appendix C: Cost Details
Appendix D: Tasks
Appendix E: Gestures
References

I.

Introduction
Our project utilizes a microphone placed in a stethoscope to recognize various gestures when a fingernail is dragged over a surface. We used the unique acoustic signatures of different gestures on an existing passive surface such as a computer desk or a wall. Our microphone listens to the sound of scratching that is transmitted through the surface material. Our gesture recognition program works by analyzing the number of peaks and the width of those peaks for the various gestures, each of which requires your finger to move, accelerate, and decelerate in a unique way.

We also created a PC interface program to execute different commands on a computer based on what gesture is observed. The general theory behind our project can be used in many different applications. A list of recognized gestures is included in the appendix.

II. High Level Design

Rationale/Sources
The rationale behind our idea comes from the need for a simple and inexpensive acoustic-based touch interface. We chose this project due to our interest in creating a touch based interface that can easily be used with any computer running Windows. Although touch based interfaces, and in turn gesture based commands, have had a meteoric rise in popularity, the cost of touch based interfaces continues to be prohibitively expensive for large surfaces. Even if the cost is reasonable, installing such interfaces on existing hardware may simply be impractical. Normally, for many simple gesture based commands, the fine resolution of a traditional touch based interface is simply unnecessary. Our project is based on a paper we read by Harrison and Hudson ("Scratch Input").

Background Theory
By scratching across a textured surface, a high frequency sound in the 3 kHz range is generated, which is different from most other possible sources of ambient noise. Gesture recognition can be accomplished by placing a microphone against the textured surface and comparing the received acoustic signal with saved acoustic signatures. Many simple gestures have unique acoustic signatures because of the need to accelerate and decelerate in a particular fashion. For example, a straight line starts with a single acceleration motion followed by a single deceleration motion. Furthermore, the faster a gesture is executed, the higher the amplitude and frequency; the intensity of the gesture can thus be deduced through the amplitude and frequency of the signal.

Logical Structure
Figure 1: High-level block diagram.

Figure 1 shows a high-level block diagram of all our project's components. The microphone is used to get input from vibration on a surface. The analog circuit filters and amplifies this input and sends it to the MCU, where gesture recognition takes place. Finally, the PC interface software performs specific actions based on the output of the MCU.

Hardware/Software Tradeoffs
Amplitude independent peak detection was initially supposed to be implemented in analog hardware. However, all the peak detection circuits we found were for steady state AC signals, not transient AC signals. Therefore, we switched peak detection to software. A hardware peak detector has the advantage of being faster and consuming no memory. Hardware would have been fast enough to incorporate our original idea of using time-of-arrival measurements between two microphones to locate the signal's originating position. By moving peak detection to software, we did not have enough memory to sample at a rate fast enough for the speed of sound in wood and still retain the past 96 ms worth of data. Thus, we were forced to abandon our dual microphone idea.

Standards
Our design complies with IEEE's RS-232 standard for serial communication. Character format and transmission bit rate are controlled by an integrated circuit called a UART that converts data from parallel to asynchronous start-stop serial form. Voltage levels, slew rate, and short-circuit behavior are typically controlled by a line driver that converts from the UART's logic levels to RS-232 compatible signal levels, and a receiver that converts from RS-232 compatible signal levels to the UART's logic levels.

Patents
After conducting a brief patent search, we were not able to find any existing patents with similar techniques and applications as used in our project. However, we did find some patents related to speech recognition, which is similar to the acoustic gesture recognition that our project relies on. One such patent is the "Low cost speech recognition system and method" (Patent number 4910784), which uses differences between the received speech and "reference templates" to recognize whether a certain word has been said. This patent describes a design that used a "feature extractor," a comparator, and a decision controller, which is quite similar to the overall design of our project. However, this patent has recently expired and is now in the public domain.

III. Program/Hardware Design

Hardware Design
Our hardware consisted of a fairly small circuit on a solder board, a microcontroller circuit, and a microphone inside a stethoscope. We packaged all of this into a small aluminum box which we made from sheet metal. Holes for connectors and switches were then cut out. The microphone is attached to standoffs which "push" the stethoscope onto the surface that the box is placed on. Figure 2 below shows all our hardware and an image of our complete packaged project.

Figure 2: Hardware and final packaged project.

Figure 3 shows the circuit we built onto the solder board. This circuit is divided into four stages and is used to produce a reliable and sensitive acoustic waveform. First, an electret microphone was put in series with a 2 kΩ resistor. The microphone can be modeled as a variable resistor dependent on sound and pressure waves. Thus, the voltage drop across this resistor will depend on the sound the microphone is hearing. After the input sound at the microphone has been transformed to a fluctuating analog voltage, the signal is passed through a high-pass filter which filters out sound below 3 kHz. This cutoff frequency was chosen to filter out common ambient sounds and because the sound of a fingernail scratching a surface is above 3 kHz. The capacitor of this filter was chosen to be 6.8 nF and the resistor 7.4 kΩ to achieve a cutoff frequency of approximately 3 kHz. The third stage amplifies the signal using an LM358 op-amp in a non-inverting configuration. We experimented with resistor values to find an optimal gain for our application. We needed a high enough gain so that scratches are easily picked up and recognized, but too high of a gain would cause the signal to saturate. We decided on a gain of about 500. The final stage of our analog circuit is an envelope detector, which takes a high-frequency signal as input and outputs the "envelope" of the original signal. To determine the resistor and capacitor values, we experimented to find the optimal bandwidth. The values we chose gave us a bandwidth of about 1.7 Hz.

Figure 3: Hardware Circuit

Software Design
Our software has three main pieces of code: signal capture, feature extraction, and gesture interpretation. After each gesture is filtered in analog, we extract certain features from the signal from the analog-to-digital converter (ADC). Once the gesture is finished, we take the features extracted from the signal and attempt to map the features to a gesture. The gesture is considered finished if the voltage to the ADC is below a certain threshold for 360 ms. While signal capture and feature extraction are executed roughly every 0.5 ms, gesture interpretation only occurs when the software believes the gesture is finished. Although capture, extraction, and gesture interpretation are linked together by the timeout variable, they essentially run in parallel. The final piece of our software runs on a PC and maps the gesture that the microcontroller has interpreted to an action on the PC. The microcontroller talks to the PC via RS-232 communications.

Figure 4: High level software architecture on the microcontroller.

Signal Capture
Signal capture grabs the ADC values from port A0 and calculates four different moving averages: moving, avg, upper, and lower. moving is a short 60 point moving average that attempts to smooth out small variations in the signal, such as the envelope detector's RC ripple. avg gives the moving average of the past 200 points. From avg, upper and lower are calculated: upper is simply 1.2 times avg, while lower is simply 0.8 times avg. upper and lower are used as hysteresis limits for peak detection.

Feature Extraction
Because our greatest concern was the memory limits and processing power of the microcontroller, we wrote feature extraction to hold as few past data points as possible. Rather than storing the entire signal in memory, then extracting features and interpreting gestures after the gesture is finished, we extract features in real time with very few past data points. The past data points, 96 ms worth of data, are only used in the moving average calculations.

Our next issue of importance was an amplitude independent peak detection algorithm. Our entire idea for gesture interpretation depended on the ability to detect intended local minimums and maximums regardless of amplitude. Our efforts were further hampered by small noise peaks. The final implementation employed the use of moving averages and hysteresis. Our biggest software breakthrough was the amplitude independent peak detection algorithm, which required very little memory and processing power: it only needs the past 96 ms worth of data for the moving averages rather than the entire gesture signal, drastically reducing the memory requirements.

A peak is detected when moving becomes greater than upper and THRESHOLD, but only if the state variable isUpper is 0. isUpper is reset when moving becomes smaller than lower. This hysteresis removes noise spikes by preventing multiple small crossings of upper. All other extracted features are based on the hysteresis of moving between upper and lower. signalWidth is the number of ticks between moving becoming greater than THRESHOLD and the final gesture timeout. Each peakWidth is the number of ticks between moving becoming greater than upper and moving becoming less than lower. Each valleyWidth is the number of ticks between moving becoming smaller than lower and moving becoming greater than upper. Each peakHeight is the greatest value between when moving becomes greater than upper and when moving becomes less than lower. Each peakSlope is the number of ticks between moving becoming greater than upper and when peakHeight is reached.

Figure 5: Captured double tap data from the microcontroller. The black line is moving, green is avg, blue is upper, and yellow is lower.

Gesture Interpretation
Gesture interpretation for all gestures except circles is done after the gesture is finished. Gestures are assumed to be finished if the voltage level falls below THRESHOLD for more than 360 ms. As long as moving is greater than THRESHOLD, time2 is constantly reset. This timeout process is accomplished in extract(). Interpretation is based on a simple decision tree that checks whether the extracted features fall within hard coded maximums and minimums. If a gesture's features do fall within the limits of a particular gesture, the software assumes that gesture was drawn. Circles are slightly different because feedback is given to the user as the gesture is being done: once four or more peaks are detected, every peak afterwards is assumed to be one revolution. As soon as a gesture is successfully interpreted, a single character is sent to the PC via RS-232. This is done as a simple blocking fprintf() statement because gestures are not going to be done in rapid succession, so blocking is acceptable.

PC Software

Once the microcontroller tells the PC what gesture has been scratched, the PC takes one of four different actions: toggle mute, quick launch one of three different programs, change volume up, or change volume down. The program is written in C# and reads the serial port that the microcontroller writes to. Depending on the character written by the microcontroller, a different action is taken.

IV. Results of the Design

We tested our system by recruiting friends to help test. A small pamphlet was given to them about how to use the system. We then let them play with the system for about 10 minutes. Afterwards, we asked them to perform a series of gestures and recorded their accuracies. We managed to achieve almost 90% accuracy for all our gestures except for 3's, which had significantly less accuracy at 68%. While we did not reproduce Chris Harrison's reported accuracy of 95%, most of our gestures were approaching that level of accuracy. We believe that with further testing and tweaking, we can achieve 95% accuracy. Furthermore, as a person gets more used to the gesture style of our system, accuracy levels will increase. Although our gestures were not 95% accurate, overall we are still very happy with our results.

Figure 6: Accuracy of our gestures.

V. Conclusions

Project Conclusions
Our initial idea of using two microphones for time-of-arrival differences to measure position had to be scrapped because of the inability to detect local maximums and minimums independently of amplitude in analog hardware. This hardware problem forced us to employ software techniques for peak detection. However, because of memory limitations, time-of-arrival measurements in software were deemed infeasible. Gesture interpretation accuracy is very good for simple gestures (double tap, triple tap, single lines), while accuracy for more complex gestures (two's and three's) leaves something to be desired. Currently, we force the user to adapt to the gesture interpreter and simply calibrate the system for one particular surface. Foreseeable future improvements to our project include training the gesture interpreter to learn the gestures of a user, and adaptive gain control. Because of the extremely low memory and processor utilization, we can port our project to even cheaper and lower powered microcontrollers. Very low production costs, coupled with the ability to interface with most surfaces, make thousands of applications possible, ranging from furniture with built-in remote controls to home automation and control.

Applicable Standards
The only applicable standard in our design was the RS-232 communication with the PC. Since the universal asynchronous receiver/transmitter (UART) code was already written for us, was used in a previous lab, and works well, we assume that the code adheres to RS-232 standards very strictly.

Intellectual Property Considerations
While the idea of a scratch based user input system is based on Chris Harrison's work, the actual design and implementation of our project was independently developed by us. From Harrison's work, we borrowed the use of a microphone attached to a stethoscope for sound capture; the idea that certain simple gestures have unique acoustic signatures; and the gestures double tap, triple tap, line, and circles. The analog hardware, real time feature extraction algorithms, and moving average hysteresis based amplitude independent peak detection algorithms were developed from scratch.

We did use code from the public domain for the PC software. We combined example code for reading serial ports and performing certain actions on the PC to obtain our final version of the software. Every non-original idea was credited to its original author. Because there are many acoustic processing and recognition patents and gesture recognition patents, we highly doubt there are patent opportunities for our entire system. However, our amplitude independent hysteresis based peak detector may be novel enough for a patent. Like patents, we doubt there are publishing opportunities because our work is based on Harrison's paper "Inexpensive, Unpowered and Mobile Finger Input Surfaces". However, some of our algorithms may be original enough for publishing opportunities.

Legal and Ethical Considerations
Throughout every phase of the development of our project, we followed IEEE's Code of Ethics rigorously. Our intentions were always in the best interests of individuals, the public, and the environment. We believed that we had the competency to develop this project, and based on our final results, we still believe so. The one possible health concern that might have been an issue, the wearing down of the fingernails after frequent and continuous use of our system, was disclosed. While no safety problems were brought to our attention, we would have immediately taken action upon discovering any safety concerns. We also always conducted ourselves cordially and professionally, even when there were disagreements with overall project direction or smaller technical decisions. Since we were competent and our intentions were purely good, we followed the spirit as well as the letter of IEEE's Code of Ethics.

VI. Appendix A: Commented Code

1. Gesture Recognition Code

#include <inttypes.h>
#include <avr/io.h>
#include <avr/interrupt.h>
#include <stdio.h>
#include "uart.h"

FILE uart_str = FDEV_SETUP_STREAM(uart_putchar, uart_getchar, _FDEV_SETUP_RW);

#define t1 2
#define t2 1500
#define t2Circle 3000
#define t3 750
#define t4 12000
#define AVG_LEN 200
#define MOVING_LEN 60
#define THRESHOLD 400
#define PAST_LEN 6
#define LOG_LEN 900

volatile char time1, time3, time4;
volatile int time2;

int mic[AVG_LEN + 1];
int peakWidth[PAST_LEN], valleyWidth[PAST_LEN], peakSlope[PAST_LEN], peakHeight[PAST_LEN];
int moving = 0, avg = 0, upper = 0, lower = 0;
int index = 0, k = 0, begin = 0;
int isUpper = 0, peakCount = 0, antiTap = 0, tapCircle = 0;
int tSignalWidth = 0, tPeakWidth = 0, tValleyWidth = 0, tPeakSlope = 0, signalWidth = 0;
char circleMode = 0;

ISR (TIMER0_COMPA_vect) {
    if (time1 > 0) time1--;
    if (time2 > 0) time2--;
    if (time3 > 0) time3--;
    if (time4 > 0) time4--;
}

void capture() {
    // Captures the ADC value and calculates two different moving averages.
    // Avg is a moving average of the past 200 data points. Avg is used to
    // calculate the hysteresis values upper and lower.
    // Moving is a moving average of the past 60 data points. Moving simply
    // smooths out the sometimes erratic signal from the analog filter.
    mic[index] = ADCH;
    ADCSRA |= (1 << ADSC);
    if (index == AVG_LEN)
        avg = avg + (mic[AVG_LEN] * 10 / AVG_LEN) - (mic[0] * 10 / AVG_LEN);
    else
        avg = avg + (mic[index] * 10 / AVG_LEN) - (mic[index + 1] * 10 / AVG_LEN);
    if (index >= MOVING_LEN)
        moving = moving + (mic[index] * 10 / MOVING_LEN) - (mic[index - MOVING_LEN] * 10 / MOVING_LEN);
    else
        moving = moving + (mic[index] * 10 / MOVING_LEN) - (mic[AVG_LEN + 1 - MOVING_LEN + index] * 10 / MOVING_LEN);
    upper = 12 * avg / 10;
    lower = 8 * avg / 10;
}

void extract() {
    // Extracts features from the incoming signal, such as peak heights, peak
    // slope, peak width, valley width, signal width, and peak count. This is
    // done without holding any past data points. Past data points are only
    // kept for the moving averages.
    //
    // Peak count is amplitude independent. Whenever moving is greater than
    // upper, a peak is counted only if moving has been smaller than lower
    // previously; isUpper is used to keep the hysteresis state.
    if (moving > THRESHOLD) {
        if (circleMode) time2 = t2Circle;
        else time2 = t2;
    }
    if (!isUpper && moving > upper) {
        if (peakCount == 0) tSignalWidth = 0;
        else valleyWidth[(peakCount - 1) % PAST_LEN] = tValleyWidth;
        tValleyWidth = 0;
        isUpper = 1;
        peakCount++;
    }
    if (isUpper) {
        if (peakHeight[(peakCount - 1) % PAST_LEN] < moving) {
            peakHeight[(peakCount - 1) % PAST_LEN] = moving;
            peakSlope[(peakCount - 1) % PAST_LEN] = tPeakSlope;
        }
    }
    if (isUpper && moving < lower) {
        isUpper = 0;
        peakWidth[(peakCount - 1) % PAST_LEN] = tPeakWidth;
        tPeakWidth = 0;
        tPeakSlope = 0;
    }
}

void selector() {
    // Attempts to map the attributes extracted from the signal to a gesture.
    int tempWDiff, tempHDiff;
    signalWidth = tSignalWidth;
    // Often times when starting a gesture, the initial finger placement
    // creates an unwanted tap with RC decay. The following anti-tap check
    // filters out most unwanted initial taps.
    if (peakCount > 1 && peakWidth[0] < 200 && peakWidth[1] > 185) {
        antiTap = 1;
        peakCount--;
    }
    if (peakCount == 1) {
        if (peakWidth[0 + antiTap] < 200)
            tapCircle = 1;
        else if (peakWidth[0 + antiTap] > 230 && peakWidth[0 + antiTap] < 1000)
            fprintf(stdout, "1\n");
    }
    else if (peakCount == 2) {
        tempHDiff = peakHeight[0 + antiTap] - peakHeight[1 + antiTap];
        if (tempHDiff < 0) tempHDiff = -tempHDiff;
        if (tempHDiff < peakHeight[0] / 10 && peakWidth[0] < 200 && peakWidth[1] < 185)
            fprintf(stdout, "D\n");
        else if (peakWidth[0 + antiTap] > 220 && peakWidth[1 + antiTap] < 300 &&
                 valleyWidth[0 + antiTap] > 250 && peakSlope[1 + antiTap] < 300)
            fprintf(stdout, "2\n");
        else if (peakWidth[0 + antiTap] > 220 && peakWidth[1 + antiTap] > 240 &&
                 valleyWidth[0 + antiTap] > 250)
            fprintf(stdout, "3\n");
    }
    else if (peakCount == 3) {
        tempHDiff = peakHeight[0] - peakHeight[1];
        if (tempHDiff < 0) tempHDiff = -tempHDiff;
        if (tempHDiff < peakHeight[0] / 10 && peakWidth[0] < 200 &&
            peakWidth[1] < 185 && peakWidth[2] < 185)
            fprintf(stdout, "X\n");
        else if (tempHDiff > (2 * peakHeight[0 + antiTap] / 5) &&
                 peakWidth[0 + antiTap] > 220 && peakWidth[2 + antiTap] > 240) {
            tempHDiff = peakHeight[2 + antiTap] - peakHeight[1 + antiTap];
            if (tempHDiff < 0) tempHDiff = -tempHDiff;
            if (tempHDiff > (2 * peakHeight[2 + antiTap] / 5))
                fprintf(stdout, "3\n");
        }
    }
    antiTap = 0;
}

void reset() {
    // Resets all of the signal characteristic variables for the next
    // gesture input.
    for (int i = 0; i < PAST_LEN; i++) {
        peakHeight[i] = 0;
        peakWidth[i] = 0;
        peakSlope[i] = 0;
        valleyWidth[i] = 0;
    }
    peakCount = 0;
    isUpper = 0;
    antiTap = 0;
    circleMode = 0;
    signalWidth = 0;
    tSignalWidth = 0;
    tPeakWidth = 0;
    tValleyWidth = 0;
    tPeakSlope = 0;
}

int main() {
    // Resets variables for the initial gesture.
    for (int i = 0; i < AVG_LEN; i++) mic[i] = 0;
    reset();
    time1 = 0;
    int rPeakCount = 0;
    // Readies the ADC.
    ADMUX = (1 << REFS1) | (1 << REFS0) | (1 << ADLAR);
    ADCSRA = ((1 << ADEN) | (1 << ADSC)) + 7;
    // Readies the UART.
    uart_init();
    stdout = stdin = stderr = &uart_str;
    fprintf(stdout, "UART Initialized\n");
    // Starts Timer0 and Timer0 interrupts for task scheduling.
    TCCR0A = (1 << WGM01);
    TCCR0B = 3;
    OCR0A = 60;
    TIMSK0 = (1 << OCIE0A);
    sei();
    while (1) {
        // Time1 executes approximately every 0.5 ms. Time1 captures the ADC
        // signal and extracts characteristics from the signal.
        if (time1 == 0) {
            time1 = t1;
            capture();
            extract();
            if (tSignalWidth < 10000) tSignalWidth++;
            if (tPeakWidth < 10000) tPeakWidth++;
            if (tValleyWidth < 10000) tValleyWidth++;
            if (tPeakSlope < 10000) tPeakSlope++;
            index++;
            index = index % (AVG_LEN + 1);
        }
        // Time2 executes approximately 360 ms after a gesture is done. After
        // a gesture falls below the threshold value, the software waits 360 ms
        // to make sure the gesture is finished and not just temporarily below
        // the threshold.
        if (time2 == 0 && peakCount > 0 && peakWidth[0] > 0) {
            selector();
            time4 = t4;
            reset();
        }
        // Time3 executes approximately every 180 ms. Time3 looks for circle
        // gestures. Circle gestures are detected in real time rather than
        // after the gesture is finished.
        if (time3 == 0 && peakCount > 3) {
            time3 = t3;
            if (peakWidth[(peakCount - 1) % PAST_LEN] > 240) {
                circleMode = 1;
                time4 = t4;
                if (rPeakCount < peakCount) {
                    if (tapCircle) fprintf(stdout, "C");
                    else fprintf(stdout, "W");
                }
            }
            rPeakCount = peakCount;
        }
        // Time4 executes approximately 2.88 s after a circle or tap is
        // detected. Time4 is a timeout counter for tapCircle. Time4 is only
        // reset when a circle gesture is detected or when there is a
        // singular tap.
        if (time4 == 0 && tapCircle == 1) tapCircle = 0;
    }
}

2. PC Interface Code

using System;
using System.Collections.Generic;
using System.ComponentModel;
using System.Data;
using System.Drawing;
using System.Linq;
using System.Text;
using System.Windows.Forms;
using System.IO.Ports;
using System.Runtime.InteropServices;
using System.Diagnostics;
using System.Media;

namespace WindowsFormsApplication1
{
    public delegate void SimpleD();

    public partial class Form1 : Form
    {
        SerialPort port = new SerialPort();
        string str;

        private const int APPCOMMAND_VOLUME_MUTE = 0x80000;

        public Form1()
        {
            InitializeComponent();
        }

        private void Form1_Load(object sender, EventArgs e)
        {
            initPort();
        }

        private void initPort()
        {
            port.BaudRate = 9600;
            port.DataBits = 8;
            port.Parity = Parity.None;
            port.StopBits = StopBits.One;
            port.PortName = "COM1";
            port.DataReceived += new SerialDataReceivedEventHandler(comInterrupt);
        }

        private const int APPCOMMAND_VOLUME_UP = 0xA0000;
        private const int APPCOMMAND_VOLUME_DOWN = 0x90000;
        private const int WM_APPCOMMAND = 0x319;
        //private const int APPCOMMAND_MEDIA_NEXTTRACK = 0xB0000;
        //private const int APPCOMMAND_MEDIA_PREVIOUSTRACK = 0xC0000;
        //private const int APPCOMMAND_MEDIA_STOP = 0xD0000;
        //private const int APPCOMMAND_MEDIA_PLAY = 0x2E000;

        [DllImport("user32.dll")]
        public static extern IntPtr SendMessageW(IntPtr hWnd, int Msg, IntPtr wParam, IntPtr lParam);

        public void comInterrupt(object sender, SerialDataReceivedEventArgs e)
        {
            // Read the data waiting in the buffer.
            str = port.ReadExisting();
            // Display the data to the user.
            this.Invoke(new EventHandler(delegate
            {
                rtbDisplay.SelectedText = string.Empty;
                rtbDisplay.AppendText(str);
                rtbDisplay.ScrollToCaret();
            }));
            SimpleD d;
            switch (str)
            {
                case "D": d = new SimpleD(toggleMute); this.Invoke(d); break;
                case "X": d = new SimpleD(close); this.Invoke(d); break;
                case "C": d = new SimpleD(volUp); this.Invoke(d); break;
                case "W": d = new SimpleD(volDown); this.Invoke(d); break;
                case "1": d = new SimpleD(launch1); this.Invoke(d); break;
                case "2": d = new SimpleD(launch2); this.Invoke(d); break;
                case "3": d = new SimpleD(launch3); this.Invoke(d); break;
            }
        }

        private void toggleMute()
        {
            SendMessageW(this.Handle, WM_APPCOMMAND, this.Handle, (IntPtr)APPCOMMAND_VOLUME_MUTE);
        }

        private void close()
        {
            SendKeys.Send("%({F4})");
        }

        private void launch1()
        {
            Process p = new Process();
            p.StartInfo.FileName = @"C:\Program Files\Microsoft Office\Office12\winword.exe";
            p.StartInfo.WindowStyle = ProcessWindowStyle.Normal;
            p.Start();
        }

        private void launch2()
        {
            Process p = new Process();
            p.StartInfo.FileName = @"C:\Windows\winsxs\x86_microsoft-windows-s.oxgamesminesweeper_31bf3856ad364e35_6.0.7000.0_none_12cb03887526dd81\MineSweeper.exe";
            p.StartInfo.WindowStyle = ProcessWindowStyle.Normal;
            p.Start();
        }

        private void launch3()
        {
            Process p = new Process();
            p.StartInfo.FileName = @"C:\Windows\System32\calc.exe";
            p.StartInfo.WindowStyle = ProcessWindowStyle.Normal;
            p.Start();
        }

        // Volume Up
        private void volUp()
        {
            SendMessageW(this.Handle, WM_APPCOMMAND, this.Handle, (IntPtr)APPCOMMAND_VOLUME_UP);
        }

        // Volume Down
        private void volDown()
        {
            SendMessageW(this.Handle, WM_APPCOMMAND, this.Handle, (IntPtr)APPCOMMAND_VOLUME_DOWN);
        }

        private void button1_Click(object sender, EventArgs e)
        {
            port.PortName = comboBox1.Text;
            port.Open();
        }
    }
}

VII. Appendix B: Schematics

1. Hardware Schematic

VIII. Appendix C: Cost Details

Part Description                                Quantity   Cost
Electret Microphone                             1          $1.50
Stethoscope                                     1          $3.00
ATmega644 Microcontroller                       1          Sampled
Max233 CPP                                      1          Sampled
Solder Board                                    1          $1.00
Analog Circuit Components                       N/A        $8.00
RS232 Connector for Custom PCB                  1          $1.00
LM358                                           1          Free
Header sockets                                  2          $0.00
Power Supply                                    1          Salvaged
Aluminum for Box                                N/A        Salvaged
Assorted Hardware (screws, standoffs, etc.)     N/A        Salvaged
Total                                                      $14.50

IX. Appendix D: Tasks

John Wang: Software design, website, testing
Gary Halajian: Hardware design, box machining, testing

X. Appendix E: Gestures

To create the "double tap" gesture, simply double tap the surface with your fingernail perpendicular to the surface. The two taps should be fairly quick (about 0.5 seconds apart) and of approximately the same intensity or loudness. Using one finger for both taps helps to make the taps equal. It should also be noted that this gesture can be imitated by two quick claps of your hands. This gesture will be used to toggle mute on or off. The waveform shows the acoustic signature that you should replicate. Notice the sharp increasing edges, which correspond to the start of each tap, and the RC decay after each tap.

The "triple tap" gesture is just like "double tap" except that there is one more tap, which means one more peak in the waveform. This gesture will be used for closing the active window or application, just like the Alt-F4 keyboard shortcut.

The "1" gesture is also very simple. Swipe your finger in a straight line using the top of your fingernail. The direction of the swipe should be perpendicular to the edge of your fingernail, and you must swipe towards your body. Using your index finger from top to bottom works well. This gesture is "longer" than a tap. The waveform shows a very small peak from when you initially touch the surface, followed by a fairly sharp rise when you begin the swipe. This main peak then decays throughout the swipe, since you naturally reduce the pressure on the surface while swiping. This gesture will quick launch the first application.

The "2" gesture is a little different than you might expect. First, begin the motion for the number 2 with your fingernail starting from the top, but do not complete the base of the number 2. Instead, simply tap somewhere on the surface after the first motion. This tap should be done within about 0.5 seconds after completing the first motion. The waveform for the first motion looks similar to the "1" gesture, with a rising then decreasing peak. The next peak, with the sharp rise and RC decay, corresponds to the tap. This gesture will quick launch another application.

The "3" gesture is fairly straightforward. Create a number 3 starting from the top, with two fairly equal bumps. The bumps should be well rounded. You can clearly see the large equal bumps in the waveform. The pattern resembles a double tap but is much more spread out (larger base width) and has slower rising edges. This gesture will quick launch a third application.

The "CCW circle" gesture is also straightforward. Begin to make a counterclockwise circle with your fingernail, starting at any point. The exact size of the circle is not critical, but it should be fairly large; a diameter of about 3 to 4 inches works well. The first few revolutions will be used to recognize the pattern and enter circle mode. This is to prevent any interference with other gestures. After this startup period, each revolution will decrease the volume by one notch. The waveform shows a sequence of large peaks followed by small peaks. This is formed from the natural acceleration and deceleration of your finger throughout different parts of the circle. We will use this gesture for decreasing the volume of a PC.

The waveform for a clockwise circle is essentially the same as that of a counterclockwise circle.The "CW circle" gesture is the similar to the "CCW circle" except that you need to tap on the surface before you begin to create the clockwise circles. followed by a sequence of large and small peaks which corresponds to circular motion. This gesture will be used to increase the volume of a PC. which is one reason for the initial tap. Thus. then a 1 second pause. this waveform will start with a quick sharp peak as seen previously. the first few revolutions will be used to recognize the pattern and enter circle mode. You need to wait approximately 1 second after the initial tap before you before the clockwise circle motion." and to avoid a possible situation where the volume may become painfully loud due to random noise. each revolution will increase the volume by one notch. . The initial tap is needed to distinguish from a "CCW circle. Once again. After this startup period.

UIST '08. References       Harrison.XI. New York. ACM. Scott E. Unpowered and Mobile finger Input Surfaces. Inexpensive. Scratch Input: Creating Large. 205-208. In Proceedings of the 21st Annual ACM Symposium on User interface Software and Technology. Chris and Hudson. NY. LM358 Datasheet Jameco Part #136574 Datasheet ECE476 Dueling Banjo's ATMega644 Datasheet C# Serial Port Reading .
