
Basic Gesture Recognition Using mmWave Sensor - TI AWR1642

Collecting data from the TI AWR1642 via serial port and passing it through a
convolutional, LSTM, or transformer neural network to recognize one of the
following nine gestures:
• Swipe Up
• Swipe Down
• Swipe Right
• Swipe Left
• Spin CW
• Spin CCW
• Letter Z
• Letter X
• Letter S

Figure 1: Demo
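The actual networks are defined in the mmwave package; purely as an illustration of the LSTM variant, a minimal Keras sketch of a sequence classifier for nine gesture classes could look like the following (the sequence length, feature size, and layer sizes are assumptions, not the project's real configuration):

import tensorflow as tf

NUM_CLASSES = 9             # the nine gestures listed above
FRAMES, FEATURES = 50, 64   # assumed sequence length and per-frame feature size

# Minimal LSTM classifier sketch -- not the repository's actual architecture
model = tf.keras.Sequential([
    tf.keras.layers.Masking(mask_value=0., input_shape=(FRAMES, FEATURES)),
    tf.keras.layers.LSTM(128),
    tf.keras.layers.Dense(64, activation='relu'),
    tf.keras.layers.Dense(NUM_CLASSES, activation='softmax'),
])
model.compile(optimizer='adam',
              loss='sparse_categorical_crossentropy',
              metrics=['accuracy'])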

Getting Started
Installation
Install mmwave package locally:
git clone https://gitlab.com/vilari-mickopf/mmwave-gesture-recognition.git
cd mmwave-gesture-recognition
git lfs pull
pip3 install -e ./

Note: trans_model is stored on LFS because it is over 200 MB, but the GitHub LFS
bandwidth quota is far too small, so the file is currently blocked there. Additionally,
the bandwidth quota does not reset after a month as it is supposed to, so I would advise
pulling it from GitLab instead (without the trans model, you can still use the lstm or
conv models).

Serial permissions
The group name can differ from distribution to distribution.

Arch:
gpasswd -a <username> uucp

Ubuntu:
gpasswd -a <username> dialout
The change will take effect on the next login.
The group name can be obtained by running:
stat /dev/ttyACM0 | grep Gid

One time only (permissions will be reset after unplugging):


chmod 666 /dev/ttyACM0
chmod 666 /dev/ttyACM1
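To verify the permissions from Python before starting the console, a quick check with pyserial could look like this (assuming pyserial is installed; the port names below are the usual ones for the AWR1642 EVM but may differ on your machine):

import serial

# Try opening both AWR1642 ports to confirm the current user has access.
for port in ('/dev/ttyACM0', '/dev/ttyACM1'):
    try:
        with serial.Serial(port) as s:
            print(f'{port}: OK ({s.name})')
    except (PermissionError, serial.SerialException) as e:
        print(f'{port}: cannot open -> {e}')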

Flashing
The code used for the AWR1642 is just a variation of the mmWave SDK demo provided
with version 02.00.00.04. The .bin file is located in the firmware directory.
1. Close SOP0 and SOP2, and reset the power.
2. Start the console and run flash command:
python console.py
>> flash xwr16xx_mmw_demo.bin
3. Remove SOP0 and reset the power again.
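The flash step can also be scripted instead of typed interactively. The sketch below is purely hypothetical: it assumes console.py accepts commands on standard input and supports an exit command, neither of which is guaranteed, so check the interactive help first:

import subprocess

# Hypothetical non-interactive flashing: pipe the commands into the console.
subprocess.run(
    ['python', 'console.py'],
    input='flash xwr16xx_mmw_demo.bin\nexit\n',
    text=True,
    check=True,
)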

Running
If the board was connected before starting the console, the script should automat-
ically find the ports and connect to them. This is only applicable to boards with
XDS. If the board is connected after starting the console, the autoconnect command
should be run. If for some reason this does not work, manual connection is
available via the connect command. Manual connection can also be used for boards
without XDS. Type help connect or help autoconnect for more info.

If the board is connected, the prompt will be green; otherwise, it will be red.
After connecting, a simple start command will start the listener, parser, plotter and
prediction.
python console.py
>> start
Use Ctrl-C to stop this command.
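To sanity-check that the board is actually streaming data before involving the console, you can read a few raw bytes from the data port with pyserial. The port name and the 921600 baud rate below are the usual defaults for TI mmWave EVMs, so adjust them if your setup differs:

import serial

# Read a short burst from the data port and report how many bytes arrived.
with serial.Serial('/dev/ttyACM1', baudrate=921600, timeout=1) as data_port:
    chunk = data_port.read(4096)
    print(f'received {len(chunk)} bytes')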

Collecting data
The console can be used for easy data collection. Use the log command to save
gesture samples as .csv files in the mmwave/data/ directory. If nothing is captured for
more than half a second, the command will automatically stop. The redraw/remove
commands will redraw/remove the last captured sample.
python console.py
>> listen
>> plot
>> log up
>> log up
>> redraw up
>> remove up
>> log down
>> log ccw
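Each logged sample ends up as a .csv file under mmwave/data/. A small sketch for peeking at one of them is shown below; the exact directory layout and column meaning are assumptions here, so check the files the log command actually produces:

import csv
from pathlib import Path

# List logged gesture files and print the first few rows of one sample.
data_dir = Path('mmwave/data')
samples = sorted(data_dir.glob('**/*.csv'))
print(f'{len(samples)} logged samples found')

if samples:
    with open(samples[0], newline='') as f:
        for i, row in enumerate(csv.reader(f)):
            print(row)
            if i >= 4:
                break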

Training
The console can be used for the training process. X and y data are cached in pickle files
located in the mmwave/data/ directory. If new data is captured, the refresh argument
should be passed (this option will take a few minutes to execute).
python console.py
>> train
or
python console.py
>> train refresh
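If you want to inspect the cached training data outside the console, a rough sketch follows; the pickle file names below are guesses, so look in mmwave/data/ for the real ones:

import pickle
from pathlib import Path

# Hypothetical cache file names -- check mmwave/data/ for the actual ones.
x_path = Path('mmwave/data/X.pkl')
y_path = Path('mmwave/data/y.pkl')

with open(x_path, 'rb') as f:
    X = pickle.load(f)
with open(y_path, 'rb') as f:
    y = pickle.load(f)

print(f'{len(X)} samples, {len(set(y))} classes')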

Selecting model
By default, lstm model is used. Other models can be selected using set_model
option.
python console.py
>> set_model conv
>> set_model lstm
>> set_model trans

Known issue: TensorFlow 2 introduced a memory leak when repeatedly load-
ing/unloading models, which can cause crashes due to not having enough
memory to initialize a new model.
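A common workaround when switching models repeatedly in the same process is to clear the Keras session and force garbage collection before loading the next model. This is a general TensorFlow pattern, not necessarily what the console does internally:

import gc
import tensorflow as tf

def switch_model(path):
    """Free the previous model's memory before loading a new one."""
    tf.keras.backend.clear_session()  # drop the old graph and cached layers
    gc.collect()                      # release Python-side references
    return tf.keras.models.load_model(path)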

Help
Use the help command to list all available commands and get documentation on
them.
python console.py
>> help
>> help flash
>> help listen

Authors
• Filip Markovic

License
This project is licensed under the MIT License - see the LICENSE file for details.

Acknowledgments
• Thanks to NOVELIC for providing me with sensors
