Android Controlled Robot Spy Camera
by danionescu
This interesting but complicated project will cover things from designing and building a robot, to advanced configurations in linux (raspberry pi), to building an Android application and controlling the robot. This being said, it's more ambitious than the average project, but I think you'll have a lot to learn by examining some ideas here or even replicating the entire project.

First we'll build the robot using plexiglass, plastic sheets, DC motors with gearboxes and various electronic components. The device will be able to move the two front wheels independently and it will be able to use its headlight. Then we're going to set up the raspberry pi powering the robot, configure the project and install various dependencies. Then we're going to build and install an android app and use it to control the robot remotely using the camera and a wifi connection.

The technologies and concepts we'll be exploring here:

Development platforms: Arduino, Raspberry pi, Android

Electronics: H-bridge, using a transistor to drive a big load, infrared sensors

Linux: using docker, docker compose, configuring services using systemctl, video streaming

Programming: Android applications, python, arduino language, serial communication, MQTT
https://youtu.be/6FrEs4C9D-Y
2. Plastic sheet (you can also use a plexiglass sheet here)

3. Glue

4. Tyre + DC motor with gearbox + bracket (eBay) 13$

5. Small nuts and bolts, hexagonal metal spacers

18. 3 v regulator (for communication between arduino and raspberry pi)

22. 2S 1300 mAh LiPo battery with XT-60 connector

23. 5v battery pack

24. Raspberry Pi 3

25. Raspberry Pi card

26. Raspberry Pi case
The robot we're going to build is going to have the following specifications:

- it will have traction on the front wheels, driven by two separate DC motors

- the back wheels should be able to move in any direction, 360 degrees

- the direction will be controlled by varying the speed of the front wheels, so no separate direction mechanism is needed; the robot will also be able to rotate on the spot

- it will have lights on the top

- it should have enough room for the electronics, batteries, and a raspberry pi case with a camera

- a few cm of ground clearance is needed to overcome small obstacles

Don't forget to check the images for important details and building tips.

We're going to build the robot from plexiglass or hard plastic; I've used them both, but you can choose whatever you wish. The base plate will be 18 x 13 cm. On the base plate the DC engines will be attached with metal brackets, nuts and bolts. The H-bridge will be mounted in the middle of the plate, facing the floor. The back wheels will be attached using 2 cm hexagonal metal spacers (one side male, one side female).

A big hole near the H-bridge is needed to connect the electronics on the top side.

The top part of the robot will consist of two plates in an "L" shape; one will be 12 x 13 cm and the other 6,5 x 13 cm. The plastic plates will be glued together. These plates will provide cover for the electronics, a place to mount the headlight and a support for the raspberry pi case. The top part will be attached to the bottom part using 6 cm hexagonal metal spacers.
Right motor: PWM (D6), EN1, EN2 (A3, A2)

Infrared sensors: Front (A0), Back (A1)

Raspberry pi communication pins: Tx: D11, Rx: D10

Building the PCB, assembly

1. In the last step we've already accommodated the H-bridge on the floor side of the robot. We'll also need to install the two infrared sensors, one in front and one in the back. We're going to mount them on the chassis using a small metal plate. The metal plate will be "L" shaped and will have two holes; using nuts and bolts we're going to install it on the chassis. The sensors will be in the middle of the chassis, one in the front and one in the back.

2. Next the headlight part. I've used a 5 volt LED flashlight for this. I've cut the flashlight, exposing only the "head" part, and soldered two wires to power it. Then I've glued the headlight on the middle of the robot's top part, drilled a hole near the headlight, put the cables through the hole and soldered a small female two-wire connector to them.

3. Assembling the raspberry pi case. You will need a raspberry pi, a pi camera, a memory card of at least 4 GB and a pi camera connector. Insert the card with the latest Raspbian installed, then attach the pi camera. For more information on how to install the camera and enable it check this official article.

4. Building the PCB with the main electronic components. I've attached the fritzing schematics in fzz format and as a picture. You can use it as a reference on how to build the electronics.

Soldering steps:

a. Cut the female PCB connectors: there are two 12 pin connectors for the microcontroller, two 5 pin connectors, two 3 pin connectors for the IR sensors, a six pin connector for the H-bridge and a pin connector for the raspberry pi communication (ground, TX, RX)

b. After all the connectors are cut, they must be soldered on the back of the PCB

c. Solder the KF301-2P connector

d. Solder the NPN transistor and the corresponding resistor to its base

e. Solder the L7805CV 5V regulator

f. Solder the 3.3 volt regulator on the arduino to raspberry pi TX line

g. Solder the male pins to the arduino pro mini

h. Solder all the red (+), black (-), and white (signal) thin wires

5. Connectors
First I need to answer an important question: why does an intermediary arduino layer have to exist, instead of connecting the Pi directly to the electronics?

1. It's more modular; you can reuse the arduino robot in another project without the Pi

2. For safety, it's cheaper to replace a 3$ arduino pro mini than to replace a Pi (35$)

3. An arduino is not interrupted by the operating system like the Pi is, so it's more efficient at implementing PWM control for the motors and polling the front and back sensors a few times per second

4. If an error occurred in the python script, the robot might run forever, draining the batteries and probably damaging them, or catching fire if not supervised; in an arduino sketch a safeguard is more reliable because it does not depend on an operating system
Ok, so I've got the "why" part covered; I'll explain the arduino sketch a bit. It basically does two things:

1. It receives motor and light commands over the serial line and drives the motors or toggles the light

For example:

* "M:-25:16;" means (-25 left) and (16 power); it will translate to left motor 17%, right motor 32%, and direction forward

2. It polls the infrared sensors on the back and the front of the robot and sends data about distances over the serial line

The main code is located in the github repository here, or you can copy paste it from below.
Upload the code to the arduino using a FTDI adapter. Now you can give the robot commands to see it work; for this, just connect the second serial line and send motor or light commands through it. One way to do this is using a bluetooth module like HC-05, connecting it to a phone using a bluetooth application and then giving it serial commands like "L:1"
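The serial protocol above is plain text, so it's easy to exercise from any client. As an illustration only (this is not code from the project), here is a small Python parser for the two command types described, motor commands like "M:-25:16;" and light commands like "L:1":

```python
def parse_command(raw):
    """Parse a robot serial command like 'M:-25:16;' or 'L:1'.

    Returns:
      ('M', direction, power) for motor commands
      ('L', on_off) for light commands
    Only the text format from the article is modeled here; the percentage
    mapping done on the arduino side is not reproduced.
    """
    raw = raw.strip().rstrip(";")
    parts = raw.split(":")
    if parts[0] == "M":
        # direction: negative steers left, positive steers right
        return ("M", int(parts[1]), int(parts[2]))
    if parts[0] == "L":
        return ("L", int(parts[1]))
    raise ValueError("unknown command: " + raw)

print(parse_command("M:-25:16;"))  # → ('M', -25, 16)
print(parse_command("L:1"))        # → ('L', 1)
```

A bluetooth terminal app typing these strings by hand does exactly what this parser expects on the receiving end.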
#include <SoftwareSerial.h>
#include <TextMotorCommandsInterpretter.h>

// Note: the pin constants (FLASH_PIN, LEFT/RIGHT_MOTOR_* pins), the masterComm
// SoftwareSerial instance, the command/timing globals and the
// TextMotorCommandsInterpretter instance are declared in the full sketch on
// github; only the main logic is reproduced in this listing.
void setup()
{
Serial.begin(9600);
masterComm.begin(9600);
masterComm.setTimeout(10);
pinMode(FLASH_PIN, OUTPUT);
pinMode(LEFT_MOTOR_PWM_PIN, OUTPUT);
pinMode(LEFT_MOTOR_EN1_PIN, OUTPUT);
pinMode(LEFT_MOTOR_EN2_PIN, OUTPUT);
pinMode(RIGHT_MOTOR_PWM_PIN, OUTPUT);
pinMode(RIGHT_MOTOR_EN1_PIN, OUTPUT);
pinMode(RIGHT_MOTOR_EN2_PIN, OUTPUT);
lastCheckedTime = millis();
lastTransmitTime = millis();
}
void loop()
{
if (masterComm.available() > 0) {
currentCommand = masterComm.readString();
processCommand();
}
if (inMotion && millis() - lastCheckedTime > maxDurationForMottorCommand) {
stopMotors();
}
if (millis() - lastTransmitTime > transmitingInterval) {
lastTransmitTime = millis();
masterComm.print(getObstacleData());
Serial.print(analogRead(BACK_DISTANCE_SENSOR));Serial.print("---");
Serial.println(getObstacleData());
}
/* FOR DEBUG
motorCommandsInterpretter.analizeText("M:-14:40;");
Serial.write("Left==");Serial.println(motorCommandsInterpretter.getPercentLeft());
Serial.write("Right==");Serial.println(motorCommandsInterpretter.getPercentRight());
delay(10000);*/
}
String getObstacleData()
{
int frontDistance = analogRead(FRONT_DISTANCE_SENSOR);
int backDistance = analogRead(BACK_DISTANCE_SENSOR);
frontDistance = map(frontDistance, maxObstacleDetection, minObstacleDetection, 0, 10);
backDistance = map(backDistance, maxObstacleDetection, minObstacleDetection, 0, 10);
// The original listing is truncated here; the function must build and return
// the distance report sent over serial. "D:front:back;" is an assumed format,
// check the full sketch on github for the real one.
return String("D:") + frontDistance + ":" + backDistance + ";";
}
void processCommand()
{
switch (currentCommand.charAt(0)) {
case (MOTOR_COMMAND):
steerCar();
break;
case (LIGHT_COMMAND):
toggleLight(currentCommand.charAt(2));
break;
}
}
void steerCar()
{
motorCommandsInterpretter.analizeText(currentCommand);
float percentLeftMotor = motorCommandsInterpretter.getPercentLeft();
float percentRightMotor = motorCommandsInterpretter.getPercentRight();
Serial.write("Left=");Serial.println(percentLeftMotor);
Serial.write("Right=");Serial.println(percentRightMotor);
setMotorsDirection(motorCommandsInterpretter.getDirection());
analogWrite(LEFT_MOTOR_PWM_PIN, percentLeftMotor * maxPwmValue);
analogWrite(RIGHT_MOTOR_PWM_PIN, percentRightMotor * maxPwmValue);
inMotion = true;
lastCheckedTime = millis();
}
void stopMotors()
{
Serial.println("Stopping motors");
analogWrite(LEFT_MOTOR_PWM_PIN, 0);
analogWrite(RIGHT_MOTOR_PWM_PIN, 0);
inMotion = false;
}
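The map() calls in getObstacleData() rescale raw analog readings into a small 0 to 10 obstacle scale. As an illustration, here is the same integer rescaling in Python; the calibration values 600 and 100 below are placeholders, not the sketch's actual maxObstacleDetection/minObstacleDetection:

```python
def arduino_map(value, from_low, from_high, to_low, to_high):
    """Python version of Arduino's integer map(): linearly rescale value
    from the range [from_low, from_high] onto [to_low, to_high]."""
    return (value - from_low) * (to_high - to_low) // (from_high - from_low) + to_low

# With placeholder calibration maxObstacleDetection=600, minObstacleDetection=100
# (note the inverted input range, as in the sketch):
print(arduino_map(350, 600, 100, 0, 10))  # → 5 (mid-scale)
print(arduino_map(600, 600, 100, 0, 10))  # → 0 (closest obstacle)
print(arduino_map(100, 600, 100, 0, 10))  # → 10 (farthest)
```

Passing the range inverted (high first) is what flips near readings onto the low end of the scale.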
You will see a diagram of what I'll explain below in the attached images.

a. The android app shows the uv4l stream inside a webview. The uv4l process runs on the raspberry pi, captures video input from the camera and streams it. It's an awesome tool with many features

b. Using controls inside the android app, light and engine commands are issued to the MQTT server

c. The python server inside the docker container on the raspberry pi listens for MQTT commands and passes them over the serial interface to the arduino. The arduino board controls the motors and the lights.

d. The arduino senses distances in front of and behind the robot and sends the data through the serial interface to the python server; the python server forwards them to MQTT, where they get picked up by the android interface and shown to the user
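To make step (d) concrete, the python server's job is essentially to reshape lines arriving on the serial port into MQTT messages. A minimal sketch, assuming a "D:front:back;" sensor line format and a made-up topic name (the real server in the repository defines its own wire format and topics):

```python
def sensor_line_to_mqtt(line):
    """Turn an arduino distance report like 'D:3:9;' into a
    (topic, payload) pair ready to publish over MQTT."""
    line = line.strip().rstrip(";")
    tag, front, back = line.split(":")
    if tag != "D":
        raise ValueError("not a distance report: " + line)
    # Hypothetical topic name; the actual project defines its own.
    return ("robot/sensors/distance", front + ":" + back)

print(sensor_line_to_mqtt("D:3:9;"))  # → ('robot/sensors/distance', '3:9')
```

The android app would subscribe to the same topic and render the two mapped distances as the obstacle indicator.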
First of all, you'll need a fully installed and configured Raspbian on the raspberry pi, and the camera needs to be physically connected and configured. Also, all the configuration will be done over ssh, so it's a good thing to get that set up.
We can't cover all the basic things here, but do try these links if necessary:
If you don't know how to install the Raspbian operating system check this link.
For more information on how to install the camera and enable it check this official article.
If you wish to control your robot using the android app from outside the wifi, you should consider port forwarding on your wifi router; otherwise you'll be restricted to using your local ip addresses inside your wifi.
The ports to be forwarded (defaults) are: 9090 for uv4l and 1883 for mosquitto. You can forward these ports to the same output ports, if they aren't blocked by the internet provider's firewall, or to some other ports.

Port forwarding is done differently on every router; here are some tutorials: this, and you can also search on google for "port forwarding your_router_model" to see more relevant results.
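Once forwarding is set up, you can verify from a machine outside your network that the ports actually answer. A small stdlib-only Python check (the host and ports below are examples, not values from the project):

```python
import socket

def port_open(host, port, timeout=2.0):
    """Return True if a TCP connection to host:port succeeds within timeout."""
    try:
        with socket.create_connection((host, port), timeout=timeout):
            return True
    except OSError:
        return False

# Example: probe the default uv4l and mosquitto ports on your public IP.
# print(port_open("your.public.ip", 9090), port_open("your.public.ip", 1883))
```

A True result only proves something accepts TCP on that port; the streaming page and mosquitto authentication still need their own checks.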
Prerequisites:
a. install git using command line
The folder location is important because in docker-compose.yml the location is hard coded as: /home/pi/robot-camera-platform:/root/debug. If you need to change the location, please change the value in docker-compose too.

c. disable the pi's serial console; if you don't know how to do that, check this link
If this fails or you need to find out more details about uv4l, check this tutorial.
Configuration:
a. by editing uv4l/start.sh you can configure the following aspects of the video streaming: password, port, frame rate, width, height, rotation and some other minor aspects
b. edit config.py and replace password with your own password that you've set on the mosquitto server
d. edit config.py and replace the serial port and baud rate with your own; I recommend you keep the baud rate though. If you want to change it, don't forget to change it in the arduino sketch too
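For illustration only, a config.py of the kind described might look like the sketch below. Every name here is an assumption; use the field names actually present in the repository's config.py:

```python
# Hypothetical config.py sketch -- illustrates the kinds of settings the
# text mentions; the real file in the repository defines its own names.
MQTT_HOST = "localhost"
MQTT_PORT = 1883
MQTT_PASSWORD = "change-me"   # must match the password set on mosquitto
SERIAL_PORT = "/dev/ttyS0"    # the pi's serial device
SERIAL_BAUD_RATE = 9600       # keep in sync with the arduino sketch
```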
sh ./uv4l/start.sh
c. Stop it
b. Enable services
c. Reboot
We're almost done; in this step we're going to install the android application. These are all the prerequisites:

The next steps involve setting up your environment; I'll just enumerate them and give a link to a specialized tutorial, in case you don't know how to do it.
2. Enable developer options on your android phone. You can find out more here:
https://developer.android.com/studio/debug/dev-opt...
Now we're going to configure the streaming and MQTT credentials:
there are the controls. The main control is a steering wheel; touch the steering wheel in the direction you wish the robot to move. Below the steering wheel there is a headlight button; touch it to toggle the light.
In the top right corner there is a text like : "- Batt Connected".
* The first dash means no obstacles; if there is an obstacle in front of or behind the robot, it will be signaled with a small arrow pointing forwards or backwards.
* "Connected" means that MQTT server is connected so the robot can be used, the other possible value is
"Disconnected"
- Connect the FTDI adapter on the second serial line to the laptop (RX to pin 11 and TX to pin 10) and issue motor and light commands to see if the robot responds

- Double check the connections; if both motors are moving backwards, reverse both motors' wires; if only one motor is moving backwards, reverse that motor's wires
- Check if the arduino is connected properly to the H-bridge, check this link for more information
- Check docker is running the two containers (mosquitto and python server)
pi@raspberrypi:~ $ docker ps
CONTAINER ID IMAGE COMMAND CREATED STATUS PORTS NAMES
473a56da2230 dockercontainer_python-server "python /root/debu..." 9 months ago Up 4 hours dockercontainer_python-server_1
3e0b1933d310 robot-camera-mosquitto "/usr/bin/entry.sh..." 9 months ago Up 4 hours 0.0.0.0:1883->1883/tcp dockercontainer_mosquitto_1
- Check the processes are running on the specified ports; you should look for 9090 (streaming) and 1883 (mosquitto)
- Check the serial port exists (it's the correct one) and it's specified in the project config.py
pi@raspberrypi:~ $ ls -l /dev/ttyS0
crw-rw---- 1 root dialout 4, 64 Jan 14 19:59 /dev/ttyS0
- Stop the docker process and manually connect to serial using picocom
Then issue motor and light commands directly to see if the robot responds
- Check the streaming and MQTT are accessible from outside the raspberry pi, using a mosquitto client (for MQTT) and checking the streaming in a web browser
- Check all the necessary steps to enable the phone into debugging mode (check this out)
- Make sure you've set up the passwords and endpoints correctly in the passwords.xml
- Check the streaming and MQTT are accessible from outside the raspberry pi, using a mosquitto client (for MQTT) and checking the streaming in a web browser
- See the top right corner of the app and check for "Connected"
Another use case is an object following robot. The robot will follow an object of a specific color and size threshold. Because this is out of the scope of this tutorial, I'll just give you some hints:

- for this to work you won't need the video streaming, mqtt or docker installed
HSV means hue, saturation, value; for our color object detection to work it has a lower and an upper bound, and our object's color will have to be in this range to be detected. Here you can find a visual HSV object threshold detector.

The object size threshold means the smallest and largest object radius (as a percent of the frame width) which will be considered a detection.
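To make the two thresholds concrete, here is a small stdlib-only Python sketch of the checks involved. The function names and example bounds are made up for illustration; the actual tracking script does this with OpenCV:

```python
def hsv_in_range(hsv, lower, upper):
    """True if an (h, s, v) triple falls inside the detection bounds."""
    return all(lo <= c <= hi for c, lo, hi in zip(hsv, lower, upper))

def radius_in_threshold(radius_px, frame_width_px, min_pct, max_pct):
    """True if the detected circle's radius, as a percent of the frame
    width, lies inside the object size threshold."""
    pct = 100.0 * radius_px / frame_width_px
    return min_pct <= pct <= max_pct

# A greenish pixel against an example green HSV range:
print(hsv_in_range((60, 180, 120), (40, 100, 60), (80, 255, 255)))  # True
# A 48 px radius in a 640 px wide frame is 7.5% of the width:
print(radius_in_threshold(48, 640, 5, 40))  # True
```

Both tests must pass for a contour to count as a detection: the color mask selects candidate pixels, and the size threshold rejects specks and near-field blobs.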
- install and configure VNC (more information on how to install VNC here)

- Run the object tracking script in a terminal inside the VNC graphical interface. This will enable you to view the video, with a circle drawn over it; the circle means the object has been detected.
I think the main advantage of this platform is its versatility; it can be adapted easily to other interesting uses. Some of my ideas:
- Replace the wifi with a 3G modem and the robot can be controlled outside, maybe exploring dangerous zones
This being a very complex project, I assume there will be errors; I would appreciate it if you ask me anything in the comments area.
If you like my project please subscribe to my instructables channel and to my youtube channel for more interesting
stuff.
https://youtu.be/z9qLmHRMCZY