Virtual Reality
1. INTRODUCTION
Technological breakthroughs have allowed automation to take over many jobs that are
repetitive in nature or hazardous to human health, replacing human operators in many
work environments. They have also allowed a new phenomenon to occur: the remote
control of physical systems by humans through the mediation of computers. Indeed, there are
still many jobs that are non-repetitive, unpredictable and hazardous to humans; cleaning up a
nuclear power plant leak or exploring the extreme depths of the ocean are just two examples.
Reliance on specially designed automatic machines for such jobs may prove to be
inefficient, less flexible and less cost-effective than a human operator. Therefore, remote
control by human operators using video inspection and master-slave mechanical
manipulation (called tele-operation) is used. With the growing popularity of, and research in,
robotics, the day is near when robots will take over many human activities.
This project is an effort to simulate a robot: it converts real-time actions into
digital signals, which in turn are used to produce virtual actions. The software for this project
acts as the interface between the hardware devices and the controlling computer. Inputs to the
system are taken from sensors fitted on the intended part or body. These sensors supply the
real-time condition of that body to the computer. The software processes these
signals and, according to the received signal levels, moves the graphical picture on the monitor.
That is, the virtual movement is created by the software part, whereas the real actions are
captured by an array of sensors. This project gives an idea of the working of a simple robot and
demonstrates the use, implementation and applications of virtual reality technology.
Here the project is designed using C and C graphics. The project is connected to
external hardware; the simulator accepts commands from the hardware and accordingly
performs the desired task. This project finds various applications in the fields of tele-medicine
and tele-robotics.
The project aims at the C graphics and peripheral interface part. The objective is
to connect, or interface, the image on the PC to the external hardware and also to control
image properties such as shape, color and size by operating the external hardware.
Virtual Reality For Robotics
The software reads the data from the interface, generated by the external hardware,
and interprets it into suitable image formats. For different data, the project image behaves
in a different pre-programmed manner. For example, if the joystick is bent by 45 degrees, the
toy image on the computer screen also bends by 45 degrees, and so on.
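The linear mapping this paragraph describes can be sketched in a few lines of C. The helper name and the 0-90 degree output range are assumptions for illustration, not taken from the project's listing.

```c
/* Hypothetical sketch: map an 8-bit sensor reading (0-255) onto a
 * bend angle for the on-screen image.  Scaling the full input range
 * to 0-90 degrees is an assumed convention; a joystick reading near
 * mid-range then bends the toy image by about 45 degrees. */
int reading_to_angle(unsigned char reading)
{
    return (reading * 90) / 255;   /* linear 0..255 -> 0..90 degrees */
}
```
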
2. Highly flexible in nature. The system can be used as a stand-alone unit or a remote
interfaced one.
3. Due to the computer interface facility, the system is highly accurate & automatic in
nature.
4. Since this system is associated with a PC, it is very easy to store the data, keep track
of the parameters and analyze them for further study.
5. Once the system is activated and set to receive the parameters, the whole project can
be used as a computerized data-logger system for storing the parameters.
2. VIRTUAL REALITY
The term 'Virtual Reality' (VR) was initially coined by Jaron Lanier, founder of VPL
Research (1989). Other related terms include 'Artificial Reality' (Myron Krueger, 1970s),
'Cyberspace' (William Gibson, 1984), and, more recently, 'Virtual Worlds' and 'Virtual
Environments' (1990s). Today, 'Virtual Reality' is used in a variety of ways and often in a
confusing and misleading manner. Virtual reality is a hot theme, as is evident in
Hollywood films such as the 'Spy Kids' and 'Spider-Man' series. The virtual reality concept
is an outcome of the most exciting and challenging field of automation,
'Artificial Intelligence'. As the name implies, the virtual reality concept allows the user to do
work that is a hundredfold heavier than the actual effort. For example, by moving a
small model of the Himalayan mountains while sitting inside a laboratory, one could move the
real Himalayan mountains with the help of machinery situated near them. This is an
exaggerated application of the virtual reality concept, but it succeeds in showing its power.
This concept is clearly understood by taking the 'Warm Hand' project as an example,
conducted jointly by universities in the US and Britain. These two universities constructed two
human hands using highly sensitive materials, covered with innumerable sensors.
Each hand is connected to a computer, which in turn is connected to the internet. The arrangement
of the two hands is shown in fig 2.1.
When any person shakes hand A, situated in the US university, every change
in the parameters during the hand-shaking action is recorded. Here 'parameters' means the warmth of the
hand, the pressure applied by the hand-shaker, the grip level, the angle of inclination, etc. These
parameters are sensed by sensitive sensors spread throughout the hand-gloves and stored in
the computer's memory. The stored data is transmitted quickly to the British university's computer
via a broadband internet connection. There it is converted back to the original signals and
fed to hand B. Anybody who shakes hand B will feel as if the actual person, who is
shaking hand A, is shaking hands with him. This project aptly demonstrates the application of the
virtual reality concept.
Such virtual reality applications play a very important role in saving lives, doing
remote work, working with machines of a hundredfold larger capacity, etc. Let us
see some examples where this killer application can be implemented.
2.2 Tele-Medicine:
Suppose there is only one specialist available in the world, who cannot travel
to help or operate on patients lying on their death-beds. He can sit in his own
hospital, which is equipped with a virtual reality facility, and perform operations on various
patients around the world at the same time. The doctor will have a model virtual body, which
resembles the human patient, fitted with sophisticated sensors all over. On the
other side, the real patient is kept in another hospital equipped with a virtual reality facility. Here
all the parameters of the patient, on whom the operation is to be performed, are sensed, stored
and sent quickly to the doctor via a broadband internet connection. The parameters sensed are heart
beat, pulse rate, body temperature, body position, etc. With the help of these parameters
received from the real body, the virtual body tries to act like it; that is, the virtual body
imitates the real body, which is far away from it. This helps the specialist doctor
understand the patient's condition, and accordingly he can operate on him. The virtual reality
concept thus brings the patient, who may be at the other corner of the world, to the doctor's
operating table. The doctor can check the pulse rate and heart beat and measure the body
temperature of the virtual body as if it were the real body.
Fig 2.2 shows how this tele-medicine system works. Here, a doll and a monitor are used
as output devices. The output, the graphics on the monitor and the doll, reflects the actions of the
input devices; e.g., movement of a glove or of the different parts of the suit results
in the movement of the corresponding parts of the doll and of the graphics on the monitor.
Here the person standing for demonstration is fitted with a number of sensors.
These sensors are connected to the computer's CPU via a sensor cable bunch. Sensors S1 &
S2 are connected to observe the shoulder positions, S3 & S4 are used to sense the bending
of the hands, and S5 & S6 are used to watch the movements of the legs. In
this picture only six sensors are used; the number can be increased or decreased
as per one's requirements. The person's picture is created on the computer screen
using C++/C-graphics.
Whenever the person makes any movement, the sensors sense
the actions and send them to the computer. The C program inside the computer
processes the data it receives via the sensor cable bunch and accordingly instructs the C++/C-
graphics routines to move the respective parts of the graphical picture. Different
actions and their respective movements on screen can be observed.
Fig 2.4 shows what happens when the person lifts both hands up. Sensors
S5 & S6 sense no action, as the person has not changed his standing position. Sensors
S1 & S2, which are movement detectors, detect that both hands
are lifted up. The sensors fitted on the elbows of both hands send a signal to the computer that
they are intact. Thus, processing the entire sensor data, the computer's software
concludes that the person has lifted both hands up, and the C program instructs the
C++/C-graphics routines to make the necessary changes to the graphical representation
on screen.
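The decision step described above can be sketched as a small classifier. The sensor roles follow fig 2.4 (S1/S2 shoulders, S5/S6 legs); the posture codes and the exact combination logic are assumptions for illustration.

```c
/* Hypothetical sketch of the posture decision described above: the
 * program inspects the movement-detector flags and concludes which
 * posture the person has taken. */
enum posture { STANDING, HANDS_UP, WALKING };

enum posture classify(int s1, int s2, int s5, int s6)
{
    if (s1 && s2 && !s5 && !s6)   /* both shoulder sensors fire, legs still */
        return HANDS_UP;
    if (s5 || s6)                 /* any leg movement detected */
        return WALKING;
    return STANDING;              /* no sensor activity */
}
```
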
Even though this project depicts only one half of the virtual reality concept, it
demonstrates how data can be gathered and stored in the local computer. Instead of
transmitting it to a remote computer for further processing, the sensed data is used to move
the graphical element on the same local computer. As fig 2.5 makes evident,
sending the sensed data to a remote computer for further processing is then not a big step.
3. BLOCK DIAGRAM
The "Virtual Reality for Robotics" system is built using several blocks. The building
blocks and their inter-relations are shown in fig 3.1.
Sensor Block: This block contains a number of sensors or transducers spread across the
intended body. Several types of sensors are used in this block as per one's requirements:
tactile switches are used to sense the bending of arms and ankles, and micro-switches or
sliding potentiometers are used to sense the movement of hands, legs or any other body part.
These weak sensed signals are given to the instrumentation amplifier block for further
amplification.
Instrumentation Amplifier Block: The output of a sensor is too feeble to drive any
further stages, so some amplification is needed before the sensed signals can be used.
This amplification is done using a reliable amplifier circuit called an instrumentation amplifier.
This amplifier, as the name implies, is especially used in the instrumentation field to amplify weak
sensed signals from various sensors spread over an area [for example, in industries
several sensors will be fitted to check many parameters, viz., speed of a motor, motion of a
conveyor belt, level of material in storage silos, etc.].
ADC: The conversion of the analog signals received from the instrumentation amplifier block is
done by this ADC block. The analog-to-digital converter block converts input signals
from analog form into digital form. This is necessary, as the computer can understand
and process only digital data. The output of the ADC is given to the I/O interface block,
through a buffer stage, to pass the signals from the ADC to the computer.
I/O Interface: This stage receives the boosted sensed signals from the buffer and feeds them
to the computer port for further processing. This stage also ensures that the signal levels are
compatible with the TTL signals of the computer port.
Personal Computer: The brain of this project is the computer. With the help of interfacing
software module the computer processes the sensed signals and gives out the output in
graphical form.
Software Module: The software module is responsible for producing the graphical output
on the computer's screen, which depends on the received sensed-signal levels. The core software
[i.e., reading the sensed signal levels] is written in the popular language C, and its graphics part
is in C graphics and C++.
Power Supply Unit: This specially designed power supply provides +12 V for op-amp,
relay & driver stage and +5V for rest of the digital & analog circuits.
4. CIRCUIT DESCRIPTION
The circuit of this 'Virtual Reality for Robotics' project is divided into
these parts: sensor block, instrumentation block, ADC, I/O interface stage and, finally, the power
supply unit.
The sensor block holds all the sensors used in the project in one place. Depending upon the
requirement, different types of sensors or transducers are used. Before going to the block
explanation, let us first see what a sensor or transducer is and where it is used.
Transducer: A transducer, also known as a sensor, is a device which converts the physical
parameter to be measured, viz., pressure, temperature, movement, etc., into a corresponding
measurable quantity. If the transducer converts the physical quantity into useful electronic signals,
such as an analogous voltage or current, then it is an electrical transducer.
Optical Sensors: An optical sensor converts light that falls on it into electronic signals.
Optoelectronic devices such as Light Emitting Diodes [LEDs] can generate electromagnetic energy,
while other devices such as Light Dependent Resistors [LDRs] can sense visible light energy.
Visible light occupies the range from about 400 nm to 700 nm. Although various types of
optical sensor are available, for general-purpose applications LDRs are employed. A typical
1M-rated LDR exhibits a resistance which varies from about 400 Ω under bright room lighting
[1000 lux] to as much as 1 MΩ in total darkness. The spectral response peaks at about 550 nm
and falls rapidly below 500 nm and above 650 nm. This optical sensor is therefore useful in
light-sensing equipment and illumination-level sensing applications generally.
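The LDR figures above can be turned into a usable voltage with a simple divider, the usual way such a sensor feeds the amplifier stage. The fixed 10 k resistor and the divider orientation below are assumptions, not values from the project's schematic.

```c
/* Sketch (assumed divider values): an LDR in series with a fixed
 * resistor converts light level into a voltage.  With the fixed
 * resistor at the bottom of the divider, bright light (low LDR
 * resistance) pulls the output toward Vcc. */
double ldr_divider_volts(double vcc, double r_fixed, double r_ldr)
{
    return vcc * r_fixed / (r_fixed + r_ldr);
}
```

With the 400 Ω (bright) and 1 MΩ (dark) figures quoted above and an assumed 10 k fixed resistor on a 5 V rail, the output swings from roughly 4.8 V in bright light down to about 0.05 V in darkness.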
Sound Sensors: Devices such as loudspeakers can generate acoustic energy, while other
devices such as condenser microphones can sense audible energy. The audible frequency range
extends from about 20 Hz to 20 kHz, of which speech occupies roughly 300 Hz to 3000 Hz.
Although various types of sound sensors are available, for general-purpose applications
microphones are employed. A sound sensor is useful in listening-bug or listening-amplifier
circuits.
Movement Detectors: A sensor which detects the movement or change of position of
the body/part it is fitted to, and sends a message in electronic-signal form, is called a movement
detector. Usually tactile switches, reed relays or slider potentiometers are used for detecting
movements.
Circuit Description: As the circuit in fig 4.1 shows, the project allows any number of
sensors to be used, as per one's requirements. Movement detectors can be used to detect the
bending of arms and ankles, and sliding potentiometers can be used to detect the movements of
legs, hands or any other body part.
This block amplifies the sensed signals to a sufficient level and feeds them to the ADC card for
further processing. The present circuit shows only one instrumentation amplifier stage, which
takes its input signal from any type of sensor. This circuit can be used for any sensor
which needs amplification; note that each sensor needs one instrumentation amplifier like
this one. Before explaining the actual circuit, let us see some terms and their definitions in
detail.
Amplifier: A common theme in many electronic circuits is the need to increase the level of
voltage, current, or power present. This need is satisfied by some form of amplifier. The
important parameters regarding the amplifier circuit designing are gain, input impedance,
output impedance, frequency response, bandwidth, phase shift, and efficiency. Not all of the
above characteristics or parameters will be important in any given application. An essential
first stage in designing an amplifier will be to select those characteristics which are important
and then specify the parameters required.
An instrumentation amplifier provides amplification of low-level
voltages, signal conditioning and (moderate) isolation for data acquisition signals wherever
the common "ground" is noisy or of questionable integrity.
The instrumentation amplifier is used where it is desirable to eliminate the d.c. path
from the signal input to the amplifier ground, i.e., to isolate the signal from ground. Its
signal input circuit is isolated from the signal output and power input circuits. Such amplifiers
are used when low-level signals are processed on top of high common-mode voltages, when
possibilities exist of troublesome ground disturbances and ground loops, where patient
isolation is required, where systems operate in noisy environments, and where processing
circuitry must be protected in case of fault conditions. The primary features of an
instrumentation amplifier are: inherent protection from high-voltage differences between the
input and output ports and between the high and low input terminals; high rejection of noise
as well as of a.c., d.c. and transient signals common to both input terminals; and very high
leakage impedance from input to output ports, even with a 50 Hz, 120 V input applied.
The selection of an instrumentation amplifier may fall into one of three categories:
industrial, instrumentation and medical. An instrumentation or data-acquisition
application may not involve the extreme hazards of medical or industrial
applications, but its precision requirements demand better common-mode
rejection than conventional data amplifiers offer. High-accuracy performance with low-level
transducer signals can be enhanced by choosing an amplifier with isolated power available
for a low-drift signal-conditioning pre-amplifier. In medical monitoring applications,
amplifiers must first and foremost protect the patient from leakage currents and amplifier fault
currents. This applies to both input protection and input/output isolation.
Circuit Description:
Fig 4.2 shows a simple module in which the main circuit needs a transducer which
gives varying voltage levels with respect to varying input signals. The heart of this
module is an op-amp, the CA3140, whose gain is very high; thus a small signal
is easily picked up and amplified to the required level. The op-amp is in unity-gain non-
inverting (voltage follower) mode. This voltage follower configuration employs 100 percent
negative voltage feedback and combines a very high input resistance with a very low output
resistance. As the operation of this circuit is analog, the sensor's sensed signal is carried
through to the next stage easily.
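The unity-gain claim can be checked numerically: with 100 percent feedback the closed-loop gain is Aol/(1 + Aol), which approaches 1 for any op-amp with large open-loop gain. The 100 000 figure used below is a typical order of magnitude for an op-amp like the CA3140, assumed here for illustration.

```c
/* Worked check of the voltage-follower gain: with 100 percent
 * negative feedback the closed-loop gain is Aol / (1 + Aol),
 * essentially unity for a large open-loop gain Aol. */
double follower_gain(double open_loop_gain)
{
    return open_loop_gain / (1.0 + open_loop_gain);
}
```
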
The sensor is connected at the non-inverting input, pin 3, and the output is observed at pin 6.
To stabilize the signals sensed by the sensor, a two-resistor biasing arrangement [R1 & R2]
is provided. As the parameter [i.e., movement, bending, etc.] in the vicinity of the sensor
increases, the output goes on increasing. This output signal is further amplified and fed to the
ADC circuit for conversion. Table 4.1 lists the components used in the amplifier circuit.
Table 4.1 Components used in the amplifier circuit

SEMICONDUCTORS
IC1      CA 3140 Op-Amp IC                1
RESISTORS
R1       5.6 kOhm, 1/4 Watt Carbon Type   3
R2       3.9 kOhm, 1/4 Watt Carbon Type   1
MISCELLANEOUS
C1       47 µF / 25 V Electrolytic        1
Sensor   LM 335 Temperature Sensor        1
BASIC CONCEPTS
Fig 4.3 shows the block diagram of a 3-bit A/D converter. It has one input line for an
analog signal and three output lines for digital signals.
Fig 4.4 shows the graph of the analog input voltage (0 to 1 V) and the corresponding
digital output signal. It shows eight (2^3) discrete output states, from 000 to 111 in binary,
each step being 1/8 V apart. This step size is defined as the resolution of the converter.
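The staircase of fig 4.4 can be expressed directly in code. This is a minimal sketch of ideal 3-bit quantization over the 0-1 V range described above; the clamping at the ends is an assumption, since the figure only shows the in-range staircase.

```c
/* Sketch of the 3-bit quantization of fig 4.4: a 0-1 V input maps
 * onto eight codes 000..111, each step 1/8 V wide. */
int quantize3(double vin)
{
    int code = (int)(vin / (1.0 / 8.0));   /* which 1/8 V step the input falls in */
    if (code > 7) code = 7;                /* clamp at full scale */
    if (code < 0) code = 0;                /* clamp below zero    */
    return code;
}
```
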
Circuit Description:
This 8-bit analog-to-digital converter is built around the industry-standard ADC0809 from
National Semiconductor. This 28-pin DIL-packaged IC contains an 8-channel analog
multiplexer and can directly interface up to 8 analog inputs in the range 0 to 5 V. It has a short
conversion time, around 100 microseconds, with 15 mW power consumption. The ADC interface
includes a NOR-gate crystal oscillator and a CMOS clock divider which feeds 768 kHz as the
input clock to the ADC [at pin 10]. This stage is implemented with the help of U1, U2 and a
quartz crystal.
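The quoted conversion time and clock frequency are consistent with each other. The figure of roughly 64 clock periods per conversion used below is an assumption typical of the ADC0808/0809 family, not a number stated in this document.

```c
/* Rough check of the conversion-time claim: assuming on the order of
 * 64 clock periods per conversion (typical for the ADC0808/0809
 * family), a 768 kHz clock completes a conversion in under 100 us. */
double conversion_time_s(double clock_hz, double clocks_per_conv)
{
    return clocks_per_conv / clock_hz;
}
```
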
The usually available +12 V supply is stepped down to +5 V, the working voltage
of the ADC0809, and fed to U4's Vcc pin 11. This regulation is achieved by the 10-pin [TO-packaged]
regulator IC 723 [U5] with current-limiting zener diode D1. A stable 5 V reference from an LM
336 and an op-amp U6, in reference-and-buffer mode, gives the Ref+ at pin 12. A multi-
turn cermet preset [R18] allows adjustment of the reference voltage.
The channel-select, ALE, start-conversion and output-enable lines are interfaced
through port lines. The ADC IC's three channel-select lines are driven from PB0, PB1 and PB2
for the user's operation. The PB5 line provides the Address Latch Enable [ALE] signal to hold
the present address. The PB6 line asks the ADC IC to start conversion, in either polled
or interrupt mode, and PB7 is the output-enable signal line.
Fig 4.5 shows the RS232 board interface section, which uses a MAX232 5 V-to-RS232
converter chip. This converts the 0-5 V TTL levels at the ATMEL 89C51 pins
to the +12 V/-12 V levels used on RS232 links. As is common with these devices, it
inverts the data during the conversion; the ATMEL 89C51 USART hardware is
designed to take account of this, but for software serial communications you need to
make sure that you invert both the incoming and outgoing data bits.
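The inversion a software-serial routine must apply can be done with a bitwise complement of each byte. This is an illustrative sketch of the idea, not code from the project.

```c
/* The MAX232 inverts logic levels, so a software-serial routine has
 * to invert the data in both directions, as noted above.  A bitwise
 * complement restricted to 8 bits does this; applying it twice
 * recovers the original byte. */
unsigned char invert_byte(unsigned char b)
{
    return (unsigned char)(~b);
}
```
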
The two closed links on the RC7 and RC6 lines are for connection to the
Atmel 89C51 board, and are the two top wire links shown in the top view of the
board below. The two open links on the RC1 and RC2 lines are for the 16F628 board
(the 16F628 uses RB1 and RB2 for its USART connection), and are the two top track
breaks shown in the bottom view of the board below.
So, for use with the Atmel 89C51 board, fit the top two wire links and cut the
top two tracks shown; for the 16F628, leave the top two links out and don't cut the
two top track breaks. This only applies if you are using the hardware USART; for
software serial communications you can use any pins you like. Although it is labeled
as connecting to port 1 of the Atmel 89C51 processor board, it can also be
connected to other ports if required when not using the hardware USART.
Circuit description:
As shown in fig 4.6, a d.c. power supply which keeps the output voltage constant
irrespective of a.c. mains fluctuations or load variations is known as a regulated d.c. power
supply. It is also referred to as a full-wave regulated power supply, as it uses two diodes in
full-wave fashion with a centre-tapped transformer. This laboratory power supply offers
excellent line and load regulation and output voltages of +5 V & +12 V at output currents
up to one amp. The components used are listed in table 4.2.
2. Rectifier Stage: The two diodes D1 & D2 are connected across the secondary winding
of the transformer as a full-wave rectifier. During the positive half-cycle of the secondary
voltage, end A of the secondary winding becomes positive and end B negative. This
makes diode D1 forward biased and diode D2 reverse biased; therefore diode D1
conducts while diode D2 does not. During the negative half-cycle, end A of the secondary
winding becomes negative and end B positive; therefore diode D2 conducts while diode D1
does not. Note that the current through the centre-tap terminal is in the same direction for both
half-cycles of the input a.c. voltage. Therefore, pulsating d.c. is obtained at point 'C' with respect
to ground.
3. Filter Stage: Here capacitor C1 is used for filtering and is connected across the
rectifier output. It filters the a.c. components present in the rectified d.c. and gives a steady d.c.
voltage. As the rectifier voltage increases, it charges the capacitor and also supplies current to
the load. Once the capacitor is charged to the peak value of the rectifier voltage, the rectifier
voltage starts to decrease; as the next voltage peak immediately recharges the capacitor, the
discharge period is of very short duration. Due to this continuous charge-discharge-recharge
cycle, very little ripple is observed in the filtered output. Moreover, the output voltage is higher,
as it remains substantially near the peak value of the rectifier output voltage. This phenomenon
can also be explained in another way: the shunt capacitor offers a low-reactance path to the a.c.
components of the current and an open circuit to the d.c. component. During the positive half-cycle
the capacitor stores energy in the form of an electrostatic field; during the negative half-cycle,
the filter capacitor releases the stored energy to the load.
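The "very little ripple" claim can be sanity-checked with the standard full-wave ripple approximation. The 100 mA load current used in the usage note is illustrative, not a figure from the project.

```c
/* Back-of-envelope check of the filter's ripple: for a full-wave
 * rectifier the capacitor is recharged every half cycle, so the
 * peak-to-peak ripple is approximately I / (2 * f_mains * C). */
double ripple_pp_volts(double i_load, double f_mains, double c_farads)
{
    return i_load / (2.0 * f_mains * c_farads);
}
```

With the 1000 µF capacitor from table 4.2 on 50 Hz mains, an assumed 100 mA load gives about 1 V of peak-to-peak ripple, which the regulator stage then smooths out.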
Fig 4.6 Circuit diagram of +5V & +12V full wave regulated power supply
Table 4.2 Components used in the power supply

SEMICONDUCTORS
IC1       7812 Regulator IC                            1
IC2       7805 Regulator IC                            1
D1 & D2   1N4007 Rectifier Diodes                      2
CAPACITORS
C1        1000 µF / 25 V Electrolytic                  1
MISCELLANEOUS
X1        230 V AC Pri, 14-0-14 1 Amp Sec Transformer  1
4. Voltage Regulation Stage: Across point 'D' and ground there is rectified and
filtered d.c. In the present circuit a KIA 7812 three-terminal voltage regulator IC is used to get
+12 V, and a KIA 7805 voltage regulator IC is used to get the +5 V regulated d.c. output. Of the
three terminals, pin 1 is the input, i.e., the rectified & filtered d.c. is connected to this pin; pin 2
is the common pin and is grounded; pin 3 gives the stabilized d.c. output to the load. The circuit
shows two more decoupling capacitors, C2 & C3, which provide a ground path for high-frequency
noise signals. Across points 'E' and 'F', with respect to ground, the +5 V & +12 V stabilized or
regulated d.c. outputs are measured, which can be connected to the required circuit.
Fig 4.7 shows the 555 timer connected as an astable multivibrator. In this circuit there
are two timing resistors, Ra and Rb. Both the trigger and threshold inputs (pins 2 and 6) of the
two comparators are shorted and connected to the external timing capacitor C. As a
consequence, the circuit triggers itself continuously.
The astable operation can be understood from the waveforms of fig 4.8. Assume initially
that the 555 timer output is in the high state and the capacitor C is uncharged. The control flip-
flop output is now Low, with Q=0.
The discharge transistor Q1 is off and capacitor C starts charging towards Vcc through
Ra and Rb. When the capacitor voltage rises above 2/3 Vcc, it causes the comparator 1 output
to go High and comparator output 2 to go Low. Hence, the control flip-flop output is set to
high, with Q=1 and the timer output is forced to low state. Consequently, the discharge
transistor Q1 is turned ON connecting upper end of Rb to ground. The capacitor C now
discharges through Rb towards 0V. When the capacitor voltage drops below 1/3 Vcc the
output of comparator 2 goes high and that of comparator 1 goes low. As a result, the
output of the control flip-flop resets to the Low state with Q=0 and the timer output goes to the high
state. In effect, the discharge transistor Q1 is turned OFF and the capacitor C charges again
towards 2/3 Vcc. This cycle of the capacitor charging from 1/3 Vcc to 2/3 Vcc and then
discharging from 2/3 Vcc to 1/3 Vcc repeats continually, producing a square-wave output.
Ton = 0.693 (Ra + Rb) C
Toff = 0.693 Rb C
f = 1/T = 1.44 / ((Ra + 2Rb) C)
Duty Cycle, D = Ton / T
D = (Ra + Rb) / (Ra + 2Rb)
where,
Ton is the on-time during which the capacitor charges.
Toff is the off-time during which the capacitor discharges.
f is the output frequency.
T is the total time period.
D is the Duty cycle of the output pulses generated.
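The timing formulas above translate directly into code. The component values in the usage note are illustrative, not taken from the project's schematic.

```c
/* The 555 astable timing formulas, as given above. */
double t_on(double ra, double rb, double c)  { return 0.693 * (ra + rb) * c; }
double t_off(double rb, double c)            { return 0.693 * rb * c; }
double freq(double ra, double rb, double c)  { return 1.44 / ((ra + 2.0 * rb) * c); }
double duty(double ra, double rb)            { return (ra + rb) / (ra + 2.0 * rb); }
```

For example, with Ra = Rb = 1 kOhm and C = 1 µF, the output frequency is about 480 Hz with a duty cycle of roughly two-thirds. Note that D is always above 50 percent in this configuration, since the capacitor charges through Ra + Rb but discharges through Rb alone.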
5. Software Description
5.1 Flowchart
The flowchart of the program is shown in fig 5.1. In brief the program works as
explained below.
When the program is executed, the initgraph function is called to initialize the
graphics system, and then the function doll is executed. Next, signals are read from the hardware
device, which acts as the input device. It has 8 input channels (numbered 0 to 7), and each
channel's signal can vary from 0 to 255. Depending upon the channel selected, a part of the
image on the screen moves, and the amount of movement depends upon the signal value in the
range 0 to 255: channel 0 moves the left hand of the image, channel 1 the right hand, channel 2
the left leg, channel 3 the right leg, and channel 4 the right forearm.
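The channel-to-limb mapping just described can be sketched as a dispatch table. The limb names returned here are purely illustrative; the real program calls hand_movement() or leg_movement() with the scaled signal value instead of returning a string.

```c
/* Hypothetical sketch of the channel dispatch described above. */
const char *limb_for_channel(int channel)
{
    switch (channel) {
    case 0: return "left hand";
    case 1: return "right hand";
    case 2: return "left leg";
    case 3: return "right leg";
    case 4: return "right forearm";
    default: return "none";       /* unused channels move nothing */
    }
}
```
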
Fig 5.1 Flowchart of the program: START → INITGRAPH → DOLL → while (!kbhit()) read
signals from hardware; signal 0 → hand movement (left arm); 1 → hand movement (right arm);
2 → leg movement (left leg); 3 → leg movement (right leg); 4 → hand movement (right
forearm); connectors A, B and C continue the chart on the next page.
5.2 Microcontroller Program:
/* 8051-side program that drives the ADC handshake lines and maps the
   converted value onto port-1 output bits.  The sfr/sbit definitions
   below are assumptions added so the listing compiles under a
   Keil-style C51 compiler; the original listing omitted them. */
#include <reg51.h>

#define port1 P1
#define port2 P2
#define port3 P3

sbit wr   = P3^6;   /* ADC write/start-conversion line (assumed pin) */
sbit rd   = P3^7;   /* ADC read/output-enable line (assumed pin)     */
sbit intr = P3^2;   /* ADC end-of-conversion line (assumed pin)      */

void main(void)
{
    unsigned char val;      /* was 'char'; the value can reach 0xFF */
    port2 = 0xff;           /* configure P2 for input */
    port3 = 0xff;
    while (1)
    {
        wr = 0;             /* pulse WR low to start a conversion */
        rd = 1;
        wr = 1;
        intr = 1;
        while (intr == 1)   /* wait for INTR to go low: conversion done */
            ;
        rd = 0;             /* enable the ADC's output latches */
        val = port2;        /* read the converted byte */
        /* port3 = val; */
        if (val > 0x05 && val <= 0x16)
            port1 = 0x01;
        else if (val > 0x16 && val <= 0x32)
            port1 = 0x02;
        else if (val > 0x32 && val <= 0x80)   /* '0x128' in the original: 0x80 = 128 decimal */
            port1 = 0x04;
        else if (val > 0x80)
            port1 = 0x08;
        rd = 1;
    }
}
5.3 C Program:
#include <stdio.h>
#include <alloc.h>
#include <conio.h>
#include <graphics.h>
#include <math.h>
#include <dos.h>
#include <stdlib.h>
#include <bios.h>
#define COM 0
#define DATA_READY 0x100
#define TRUE 1
#define FALSE 0
void doll(void);
void face(void);
void shirt();
void hand_movement(int,int,int,int,int);
void leg_movement(int,int,int,int);
void pant(void);
void lhand(int,int,int);
void rhand(int,int,int);
void lleg(int,int,int,int,int);
void rleg(int,int,int,int,int);
void movelhand(int,int,int,int [500][2]);
void moverhand(int,int,int,int [500][2]);
int gd=DETECT,gm,a[20];
int x=300,y=100;
static int lh11[500][2],lh12[500][2],lh21[500][2],lh22[500][2],
rh11[500][2],rh12[500][2],rh21[500][2],rh22[500][2],
rl11[500][2],rl12[500][2],rl21[500][2],rl22[500][2],
ll11[500][2],ll12[500][2],ll21[500][2],ll22[500][2],
le11[500][2],le12[500][2],le21[500][2],le22[500][2],
re11[500][2],re12[500][2],re21[500][2],re22[500][2],
r1=115,r2=105,r3=150,r4=117,q;
float r[3][3],b[3][1],c[3][1],p[10];
void main(void)
{
char c,*area;
int i=10,input=0,ip0=0,ip1=0,ip2=0,ip3=0,ip4=0,ip5=0,ip6=0,ip7=0,
temp1=0,temp2=0,temp3=0,temp4=0,size,flag1=0,flag2=0;
initgraph(&gd,&gm," ");
cleardevice();
//outportb(0x303,0x91);
movelhand(x-71,y+78,r1,lh11); /* hands */
movelhand(x-50,y+100,r2,lh12);
moverhand(x+70,y+78,r1,rh11);
moverhand(x+50,y+100,r2,rh12);
movelhand(x-71,y+78,r1-8,lh21); /* hands */
movelhand(x-50,y+100,r2+8,lh22);
moverhand(x+70,y+78,r1-8,rh21);
moverhand(x+50,y+100,r2+8,rh22);
moverhand(x+41,y+198,r3,rl11); /* legs */
movelhand(x-41,y+198,r3,ll11);
moverhand(x,y+231,r4,rl12);
movelhand(x,y+231,r4,ll12);
moverhand(x+41,y+198,r3-8,rl21); /* legs */
movelhand(x-41,y+198,r3-8,ll21);
moverhand(x,y+231,r4+8,rl22);
movelhand(x,y+231,r4+8,ll22);
movelhand(x-71,y+78,r1/2,le11); /* elbow */
movelhand(x-50,y+100,r2/2,le12); /* left */
movelhand(x-71,y+78,r1/3-5,le21);
movelhand(x-50,y+100,r2/2+5,le22);
moverhand(x+71,y+78,r1/2,re11); /* elbow */
moverhand(x+50,y+100,r2/2,re12); /* right */
moverhand(x+71,y+78,r1/3-5,re21);
moverhand(x+50,y+100,r2/2+5,re22);
doll();
size = imagesize(180,0,400,380);
area = malloc(size);
getimage(180,0,400,380,area);
while(!kbhit())
{
skip1:
if ( j == 'D' ) goto skip3;
skip2:
printf("\nSystem not ready \n");
skip3:
ix= 1;
case 1 :
if(ip6==0)
ip1=value[CH_No]*36;
break;
case 2 :
if(ip6==0)
{
if(ip2 != value[CH_No]*50)
{
setfillstyle(1,BLACK);
bar(x-3,y+281,x,y+325);
bar(x-45,y+281,x-41,y+324);
// putimage(180,value[CH_No]*51*.16,area,0);
leg_movement(ip2*0.2+5,BLACK,1,0);
leg_movement(value[CH_No]*50*0.2+5,WHITE,1,0);
ip2=value[CH_No]*50;
}
}
break;
case 3 :
if(ip6==0)
{
if(ip3 != value[CH_No]*50)
{
setfillstyle(1,BLACK);
bar(x,y+281,x+3,y+325);
bar(x+41,y+281,x+44,y+324);
// putimage(180,value[CH_No]*51*.16,area,0);
leg_movement(ip3*0.2+5,BLACK,0,1);
leg_movement(value[CH_No]*50*0.2+5,WHITE,0,1);
ip3=value[CH_No]*50;
}
}
break;
case 4 :
if(ip6==0)
ip4=value[CH_No]*51;
break;
case 5 :
if(ip6 == 0)
if ( value[CH_No] > 3.33 )
ip5=value[CH_No]*51;
else
ip5 = 0;
break;
}
}
if((ip0 != temp3) || (ip5 != temp4))
{
if(flag2)
hand_movement(0*.79+10,0*.18,BLACK,1,0);
else
hand_movement(temp3*.79+10,temp4*.18,BLACK,1,0);
hand_movement(ip0*.79+10,ip5*.18 ,WHITE,1,0);
temp3=ip0;temp4=ip5;flag2=0;
}
ip4=0;
}
ix++;
}
free(area);
closegraph();
}
/* function DOLL */
void doll()
{
face();
shirt();
pant();
hand_movement(10,0,WHITE,1,1);
leg_movement(5,WHITE,1,1);
}
/* face */
void face()
{
setcolor(16);
setfillstyle(1,13);
fillellipse(x,y,30,50); /* Face */
setcolor(13);
sector(x-30,y,0,360,5,10); /* Ear-left */
sector(x+30,y,0,360,5,10); /* Ear-right */
setcolor(16);
ellipse(x-30,y,120,240,3,8); /* ear-left */
ellipse(x+30,y,300,60,3,8); /* ear-right */
setfillstyle(1,15);
fillellipse(x-15,y-5,7,3); /* Eye-left */
fillellipse(x+15,y-5,7,3); /* Eye-right */
setfillstyle(1,16);
fillellipse(x-15,y-5,3,3); /* Eye Ball - left */
fillellipse(x+15,y-5,3,3); /* Eye Ball - right */
setfillstyle(1,RED);
ellipse(x,y+30,180,0,10,5); /* mouth */
ellipse(x,y+25,210,330,15,7); /* mouth */
floodfill(x,y+34,BLACK); /* mouth */
ellipse(x+30,y+25,185,205,18,15); /* mouth - right arc */
ellipse(x-30,y+25,338,358,18,15); /* mouth - left arc */
line(x,y+3,x-6,y+19); /* nose */
line(x,y+3,x+6,y+19); /* nose */
fillellipse(x+3,y+19,3,1); /* nose */
fillellipse(x-3,y+19,3,1); /* nose */
setfillstyle(1,BLUE);
fillellipse(x,y-25,40,10); /* Hat - lower ellipse */
setcolor(BLUE);
a[0]=a[6]=x-30;a[1]=a[3]=y-25; /* Hat - poly values */
a[2]=a[4]=x+30;a[5]=a[7]=y-60; /* Hat - poly values */
fillpoly(4,a); /* Hat - poly */
setcolor(16);
fillellipse(x,y-60,30,10); /* Hat - upper ellipse */
setlinestyle(0,1,3);
ellipse(x-15,y-5,50,130,12,8); /* Eyebrow - left */
ellipse(x+15,y-5,50,130,12,8); /* Eyebrow - right */
a[0]=a[6]=x-12;a[1]=a[3]=y+45; /* Neck - poly values */
a[2]=a[4]=x+13;a[5]=a[7]=y+60; /* Neck - poly values */
setfillstyle(1,13);
setlinestyle(0,1,1);
setcolor(13);
fillpoly(4,a); /* Neck */
setcolor(15);
}
void shirt()
{
setcolor(WHITE);
ellipse(x,y+80,6,172,70,20); /* shoulder */
ellipse(x-60,y+170,340,85,20,75); /* hips-left */
ellipse(x+60,y+170,95,202,20,75); /* hips-right */
line(x-41,y+196,x+42,y+196); /* vest line */
setfillstyle(9,BLUE);
setcolor(WHITE);
line(x-70,y+78,x-55,y+100);
line(x+70,y+78,x+55,y+100);
a[0]=a[6]=a[12]=x;a[5]=a[11]=y+100;
a[1]=a[7]=a[13]=y+90;a[4]=x-10;
a[2]=x-20; a[3]=a[9]=a[15]=y+60;
a[8]=a[14]=x+20;a[10]=x+10;
fillpoly(8,a);
setfillstyle(1,13);
floodfill(x,y+80,WHITE);
setcolor(WHITE);
line(x-41,y+185,x+42,y+185);
setfillstyle(1,7);
floodfill(x-38,y+190,WHITE); /* belt */
setcolor(RED);
setfillstyle(1,RED);
fillellipse(x,y+190,10,4); /* belt */
setcolor(13);
ellipse(x,y+80,75,105,70,20);
setcolor(WHITE);
setfillstyle(9,BLUE);
floodfill(x-60,y+80,WHITE);
setfillstyle(1,7);
setcolor(7);
fillellipse(x,y+95,3,3); /* button */
fillellipse(x,y+125,3,3); /* button */
fillellipse(x,y+155,3,3); /* button */
fillellipse(x,y+180,3,3); /* button */
}
/* Function hand_movement */
ellipse(x-60,y+170,340,85,20,75); /* hips-left */
ellipse(x+60,y+170,95,202,20,75); /* hips-right */
if(color==BLACK)
{
setfillstyle(9,BLACK);
if(l)
{
floodfill(x-70,y+85,WHITE);
if((w<160)&&(w>120))
floodfill((p[0]+p[2])/2+5,(p[1]+p[3])/2,WHITE);
else if(!q)
floodfill((p[0]+p[2])/2,(p[1]+p[3])/2-5,WHITE);
else
floodfill((p[0]+p[2])/2,(p[1]+p[3])/2+5,WHITE);
}
if(r)
{
floodfill(x+70,y+85,WHITE);
floodfill((re11[w][0]+rh11[w][0])/2-5,(re11[w][1]+rh11[w][1])/2+5,WHITE);
}
}
if(w<=140)
{
if(w==140)
{
setcolor(BLACK);
}
q=0;
setcolor(color);
if(l)
lhand(color,w,j);
if(r)
rhand(color,w,j);
}
else
{
if(w==142)
{
setcolor(BLACK);
}
q=1;
setcolor(color);
if(l)
lhand(color,w,j);
if(r)
rhand(color,w,j);
if(color==WHITE)
{
setfillstyle(9,BLUE);
if(l)
{
floodfill(x-70,y+85,WHITE);
floodfill((p[0]+p[2])/2,(p[1]+p[3])/2-5,WHITE);
floodfill(le11[w][0],le11[w][1]+5,WHITE);
}
if(r)
{
floodfill(x+70,y+85,WHITE);
floodfill((re11[w][0]+rh11[w][0])/2-5,(re11[w][1]+rh11[w][1])/2+5,WHITE);
}
}
/* Function pant */
void pant()
{
setcolor(15);
line(x-41,y+198,x,y+231);
line(x+41,y+198,x,y+231);
setcolor(WHITE);
setfillstyle(9,YELLOW);
floodfill(x+2,y+205,WHITE);
setlinestyle(1,0,1);
line(x-3,y+195,x-3,y+230); /* zip */
line(x+5,y+195,x+5,y+222); /* zip */
line(x-3,y+230,x+5,y+222); /* zip */
setlinestyle(0,0,1);
}
/* Function leg_movement */
}
else
{
if(l)
lleg(ll21[w-4][0],ll21[w-4][1],ll22[w+4][0],ll22[w+4][1],color);
if(r)
rleg(rl21[w-4][0],rl21[w-4][1],rl22[w+4][0],rl22[w+4][1],color);
}
if(color==WHITE)
{
setfillstyle(9,YELLOW);
if(l)
floodfill(x-7,y+231,WHITE);
if(r)
floodfill(x+7,y+231,WHITE);
}
}
if ((j>10)&&(i>140)&&(j<21))
movelhand(x-50,y+100,r2+15,lh22);
else if((j>20)&&(i>140)&&(j<31))
movelhand(x-50,y+100,r2+20,lh22);
else if((j>30)&&(i>140))
movelhand(x-50,y+100,r2+25,lh22);
else
movelhand(x-50,y+100,r2+10,lh22);
rotate((le21[i][0]),(le21[i][1]),j,lh21[i][0],lh21[i][1]);//s,t);
s=c[0][0];t=c[1][0];
line((le21[i][0]),(le21[i][1]),(le22[i][0]),(le22[i][1]));
line(x-50,y+100,le22[i][0],le22[i][1]);
rotate(le22[i][0],le22[i][1],j,lh22[i][0],lh22[i][1]);//u,v);
u=c[0][0];v=c[1][0];
{
p[0]=s;p[1]=t;p[2]=u;p[3]=v;
}
}
else
{
line(x-71,y+78,(le11[i][0]),(le11[i][1])); /* left - arm */
rotate((le11[i][0]),(le11[i][1]),j,lh11[i][0],lh11[i][1]);
s=c[0][0];t=c[1][0];
line((le11[i][0]),(le11[i][1]),(le12[i][0]),(le12[i][1]));
line(x-50,y+100,le12[i][0],le12[i][1]);
if((j>10)&&(i>70)&&(j<21))
movelhand(x-50,y+100,r2+5,lh12);
else if((j>20)&&(i>70)&&(j<31))
movelhand(x-50,y+100,r2+10,lh12);
else if((j>30)&&(i>70))
movelhand(x-50,y+100,r2+15,lh12);
else
movelhand(x-50,y+100,r2,lh12);
rotate(le12[i][0],le12[i][1],j,lh12[i][0],lh12[i][1]);//u,v);
u=c[0][0];v=c[1][0];
{
p[0]=s;p[1]=t;p[2]=u;p[3]=v;
}
}
setcolor(color);
line(s,t,u,v);
}
if(q)
{
line(x+65,y+73,(re21[i][0]),(re21[i][1])); /* 1st half */
if ((j<(360-10))&&(i>140)&&(j>(360-21)))
moverhand(x+50,y+100,r2+15,rh22);
else if((j<(360-20))&&(i>140)&&(j>(360-31)))
moverhand(x+50,y+100,r2+20,rh22);
else if((j<(360-30))&&(i>140))
moverhand(x+50,y+100,r2+25,rh22);
else
moverhand(x+50,y+100,r2+10,rh22);
rotate((re21[i][0]),(re21[i][1]),360-j,rh21[i][0],rh21[i][1]);//s,t);
s=c[0][0];t=c[1][0];
line((re21[i][0]),(re21[i][1]),(re22[i][0]),(re22[i][1]));
line(x+50,y+100,re22[i][0],re22[i][1]);
rotate(re22[i][0],re22[i][1],360-j,rh22[i][0],rh22[i][1]);//u,v);
u=c[0][0];v=c[1][0];
if(i==140)
{
p[0]=s;p[1]=t;p[2]=u;p[3]=v;
}
}
else
{
line(x+71,y+78,(re11[i][0]),(re11[i][1]));
// line((le11[i/2.5][0]),(le11[i/2.5][1]),s,t); /* left - arm */
rotate((re11[i][0]),(re11[i][1]),360-j,rh11[i][0],rh11[i][1]);
s=c[0][0];t=c[1][0];
line((re11[i][0]),(re11[i][1]),(re12[i][0]),(re12[i][1]));
line(x+50,y+100,re12[i][0],re12[i][1]);
if((j<(360-10))&&(i>70)&&(j>(360-21)))
moverhand(x+50,y+100,r2+5,rh12);
else if((j<(360-20))&&(i>70)&&(j>(360-31)))
moverhand(x+50,y+100,r2+10,rh12);
else if((j<(360-30))&&(i>70))
moverhand(x+50,y+100,r2+15,rh12);
else
moverhand(x+50,y+100,r2,rh12);
rotate(re12[i][0],re12[i][1],360-j,rh12[i][0],rh12[i][1]);//u,v);
u=c[0][0];v=c[1][0];
if(i==142)
{
p[0]=s;p[1]=t;p[2]=u;p[3]=v;
}
}
setcolor(color);
line(s,t,u,v);
}
int Rx_Data(void)
{
int Str_Char, CH , End_Char;
loop1:
status = bioscom(_COM_STATUS, 0, COM );
loop2:
status = bioscom(_COM_STATUS, 0, COM );
loop3:
status = bioscom(_COM_STATUS, 0, COM );
loop4:
status = bioscom(_COM_STATUS, 0, COM );
if (status & DATA_READY)
{ DataL = bioscom(_COM_RECEIVE, 0, COM );
goto loop5; }
else goto loop4;
loop5:
status = bioscom(_COM_STATUS, 0, COM );
if (status & DATA_READY)
{ End_Char = bioscom(_COM_RECEIVE, 0, COM );
goto loop6; }
else goto loop5;
loop6:
if( DataH<=0x39) DataH-= 0x30; else DataH-= 0x37;
if( DataL<=0x39) DataL-= 0x30; else DataL-= 0x37;
Data = DataH<<4; // shift the high nibble into the upper 4 bits
Data = Data+DataL; // combine the two nibbles into one byte
value[CH_No]=( Data * 5.00)/0xFF; // scale the 8-bit value (0..255) to 0..5 V
return (0); // Store the Data in buffer
}
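Each sample arrives as two ASCII hex characters, which Rx_Data converts to a byte and scales to a voltage. The same conversion, restated as a small portable function (the name is hypothetical, the BIOS serial calls are left out, and uppercase hex digits are assumed, as in the listing):

```c
/* Convert two ASCII hex digits (high nibble first) to a voltage,
   assuming an 8-bit ADC spanning 0..5 V, as in Rx_Data. */
float hex_pair_to_volts(char hi, char lo)
{
    int h = (hi <= '9') ? hi - '0' : hi - 'A' + 10;  /* '0'-'9' -> 0..9, 'A'-'F' -> 10..15 */
    int l = (lo <= '9') ? lo - '0' : lo - 'A' + 10;
    int data = (h << 4) | l;        /* combine nibbles into one byte */
    return (data * 5.00f) / 0xFF;   /* scale 0..255 to 0..5 V */
}
```

For example, the character pair 'F','F' decodes to 0xFF and therefore to the full-scale 5 V.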
The head-mounted display (HMD), shown in fig 6.1, was the first device to provide its
wearer with an immersive experience. Evans and Sutherland demonstrated a head-mounted
stereo display as early as 1965. It took more than 20 years before VPL Research introduced a
commercially available HMD, the famous "EyePhone" system (1989).
A typical HMD houses two miniature display screens and an optical system that
channels the images from the screens to the eyes, thereby presenting a stereo view of a
virtual world. A motion tracker continuously measures the position and orientation of the
user's head and allows the image-generating computer to adjust the scene representation to the
current view. As a result, the viewer can look around and walk through the surrounding virtual
environment. To overcome the often uncomfortable intrusiveness of a head-mounted display,
alternative concepts (e.g., BOOM and CAVE) for immersive viewing of virtual environments
were developed.
6.2 BOOM
The BOOM (Binocular Omni-Orientation Monitor), shown in fig 6.2, from Fakespace
is a head-coupled stereoscopic display device. The screens and optical system are housed
in a box attached to a multi-link arm. The user looks into the box through two holes,
sees the virtual world, and can guide the box to any position within the operational volume of
the device. Head tracking is accomplished via sensors in the links of the arm that holds the
box.
6.3 CAVE
The CAVE (Cave Automatic Virtual Environment), shown in fig 6.3, was developed
at the University of Illinois at Chicago and provides the illusion of immersion by projecting
stereo images on the walls and floor of a room-sized cube. Several persons wearing
lightweight stereo glasses can enter and walk freely inside the CAVE. A head-tracking system
continuously adjusts the stereo projection to the current position of the leading viewer.
A variety of input devices like data gloves, joysticks, and hand-held wands allow the
user to navigate through a virtual environment and to interact with virtual objects. Directional
sound, tactile and force feedback devices, voice recognition and other technologies are being
employed to enrich the immersive experience and to create more "sensualized" interfaces.
Fig 6.4 illustrates this technology.
Fig 6.4 A data glove allows for interactions with the virtual world
In the example illustrated below in fig 6.5, three networked users at different locations
(anywhere in the world) meet in the same virtual world by using a BOOM device, a CAVE
system, and a HMD, respectively. All users see the same virtual environment from their
respective points of view. Each user is presented as a virtual human (avatar) to the other
participants. The users can see each other, communicate with each other, and interact with
the virtual world as a team.
Non-immersive VR
Today, the term 'Virtual Reality' is also used for applications that are not fully
immersive. The boundaries are becoming blurred, but all variations of VR will be important
in the future. This includes mouse-controlled navigation through a three-dimensional
environment on a graphics monitor, stereo viewing from the monitor via stereo glasses, stereo
projection systems, and others. Apple's QuickTime VR, for example, uses photographs for the
modeling of three-dimensional worlds and provides pseudo look-around and walk-through
capabilities on a graphics monitor.
VRML
VR-related Technologies
6.6 Applications
Virtual Reality is well known for its use in flight simulators and games. However,
these are only two of the many ways virtual reality is being used today. This section
summarizes how virtual reality is used in medicine, architecture, weather simulation,
chemistry and the visualization of voxel data. Fig 6.7 shows real and abstract virtual worlds
(Michigan Stadium, a flow structure).
Note that a virtual environment can represent any three-dimensional world that is
either real or abstract. This includes real systems like buildings, landscapes, underwater
shipwrecks, spacecraft, archaeological excavation sites, human anatomy, sculptures, crime
scene reconstructions, solar systems, and so on. Of special interest is the visual and sensual
representation of abstract systems like magnetic fields, turbulent flow structures, molecular
models, mathematical systems, auditorium acoustics, stock market behaviour, population
densities, information flows, and any other conceivable system including artistic and creative
work of abstract nature. These virtual worlds can be animated, interactive, shared, and can
expose behaviour and functionality.
Other application areas include maintenance tasks, assistance for the handicapped, the study
and treatment of phobias (e.g., fear of heights), entertainment, and much more.
7. Conclusion
The following modifications can be made to the present circuit, leading to a still
smarter project. This is a very cost-effective system, as it can simulate a robot; in some cases
it can be used instead of a robot. The system can be used in nuclear power stations, which are
very hazardous for human beings to work in. The output of this system can be taken as a
multiple of the input to do more serious tasks: for example, if a 50 gram weight is lifted on
the input side, the output can be made to lift any multiple of 50 grams. This project is open
for development from all sides. It is the user's imagination that limits the working of this
project, and one can go on adding extra, rich features to it.
Appendix A
General Characteristics:
A.2 CA3140
Common-mode rejection is a measure of the change in output voltage when both
inputs are changed by equal amounts. CMR is usually specified for a full-range common-
mode voltage change (CMV), at a given frequency, and a specified impedance (e.g., 1 kΩ
source unbalance at 60 Hz). The common-mode rejection ratio is the ratio of the common-mode
signal appearing at the output to the input CMV.
In most instrumentation amplifiers, the CMR increases with gain because the front-
end configuration does not amplify common-mode signals, so the amount of common-mode
signal appearing at the output stays relatively constant as the signal gain increases. At higher
gains, however, amplifier bandwidth decreases, and differences in phase shift through the
signal paths introduce additional common-mode errors; CMR therefore becomes more
frequency dependent at high gains.
Important Note:
Instrumentation amplifiers have differential inputs, and there must be a return path
for the bias currents; if it is not provided, these currents will charge stray capacitances,
causing the output to drift uncontrollably or to saturate. Therefore, when amplifying the
output of a "floating" source, such as a transformer or thermocouple, as well as AC-coupled
sources, there must be a
DC path from each input to common, or to the guard terminal. If a return path is
impracticable, an isolator must be used.
Rate = 1 V/µs
Features of 89C51
• 4K bytes of Flash program memory
• Fully Static Operation: 0 Hz to 24 MHz
• Three-level program memory lock
• 128 x 8-bit internal RAM
• 32 programmable I/O lines
Pin Description
Port 0
Port 0 is an 8-bit open drain bi-directional I/O port. As an output port, each pin can
sink eight TTL inputs. When 1s are written to port 0 pins, the pins can be used as high
impedance inputs. Port 0 can also be configured to be the multiplexed low order address/data
bus during accesses to external program and data memory. In this mode, P0 has internal pull
ups. Port 0 also receives the code bytes during Flash programming and outputs the code
bytes during program verification. External pull-ups are required during program verification.
Port 1
Port 1 is an 8-bit bi-directional I/O port with internal pull-ups. The port 1 output
buffers can sink/source four TTL inputs. When 1s are written to port 1 pins, they are pulled
high by the internal pull-ups and can be used as inputs.
In addition, P1.0 and P1.1 can be configured to be the timer/counter 2 external count
input (P1.0/T2) and the timer/counter 2 trigger input (P1.1/T2EX), respectively. Port 1 also
receives the low-order address bytes during Flash programming and verification. Port-pin
alternate functions (AT89C52): P1.0 — T2 (external count input to Timer/Counter 2),
clock-out; P1.1 — T2EX (Timer/Counter 2 capture/reload trigger and direction control).
Port 2
Port 2 is an 8-bit bi-directional I/O port with internal pull ups. The port 2 output
buffers can sink/source four TTL inputs. When 1s are written to port 2 pins, they are pulled
high by the internal pull ups and can be used as inputs. Port 2 emits the high-order address
byte during fetches from external program memory and during accesses to external data
memory that uses 16-bit addresses (MOVX @DPTR). In this application, port 2 uses strong
internal pull-ups when emitting 1s. During accesses to external data memory that uses 8-bit
addresses (MOVX @Ri), port 2 emits the contents of the P2 special function register.
Port 3
Port 3 is an 8-bit bi-directional I/O port with internal pull ups. The port 3 output
buffers can sink/source four TTL inputs. When 1s are written to port 3 pins, they are pulled
high by the internal pull ups and can be used as inputs.
Port 3 also receives some control signals for Flash programming and verification.
RST
Reset input. A high on this pin for two machine cycles while the oscillator is running resets
the device.
ALE/PROG
Address latch enable is an output pulse for latching the low byte of the address during
accesses to external memory. This pin is also the program pulse input (PROG) during flash
programming. In normal operation, ALE is emitted at a constant rate of 1/6 the oscillator
frequency and may be used for external timing or clocking purposes. Note, however, that one
ALE pulse is skipped during each access to external data memory.
If desired, ALE operation can be disabled by setting bit 0 of SFR location 8EH. With
the bit set, ALE is active only during a MOVX or MOVC instruction. Otherwise, the pin is
weakly pulled high. Setting the ALE-disable bit has no effect if the microcontroller is in
external execution mode.
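Since ALE runs at 1/6 of the oscillator frequency in normal operation, the ALE clock for a given crystal is a one-step calculation — a 12 MHz crystal, for example, gives a 2 MHz ALE signal:

```c
/* ALE output rate of the 8051 in normal operation: fosc / 6. */
unsigned long ale_hz(unsigned long fosc_hz)
{
    return fosc_hz / 6UL;
}
```

This steady clock is what makes ALE usable for external timing purposes, with the caveat noted above that one pulse is skipped per external data-memory access.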
PSEN
Program Store Enable is the read strobe to external program memory. When the
AT89C51 is executing code from external program memory, PSEN is activated twice each
machine cycle, except that two PSEN activations are skipped during each access to external
data memory.
EA/VPP
External access enable EA must be strapped to GND in order to enable the device to
fetch code from external program memory locations starting at 0000H up to FFFFH.
XTAL1
Input to the inverting oscillator amplifier and input to the internal clock operating circuit.
The on-chip memory area called the Special Function Register (SFR) space holds the
control registers. Note that not all of the addresses are occupied, and unoccupied addresses
may not be implemented on the chip. Read accesses to these addresses will in general return
random data, and write accesses will have an indeterminate effect. User software should not
write 1s to these unlisted locations, since they may be used in future products to invoke new
AT89C51 features; in that case, the reset or inactive values of the new bits will always be 0.
Data Memory
There are 128 bytes of RAM in the 8051, assigned addresses ranging from 00h
to 7Fh. These 128 bytes are divided into three different groups as follows:
1. A total of 32 bytes from location 00 to 1Fh are set aside for register banks
and the stack.
2. A total of 16 bytes from locations 20h to 2Fh are set aside for bit-
addressable memory.
3. A total of 80 bytes from locations 30h to 7Fh are used for read and write
storage, normally called scratch-pad memory. These 80 bytes of RAM are
used by 8051 programmers for the purpose of storing parameters.
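The three regions above can be captured in a small classification routine (a sketch; the boundaries are exactly the ones listed):

```c
/* Classify an 8051 internal-RAM address into the regions above:
   1 = register banks and stack, 2 = bit-addressable, 3 = scratch pad,
   0 = outside direct general-purpose RAM (the SFR space). */
int ram_region(unsigned char addr)
{
    if (addr <= 0x1F) return 1;   /* 00h..1Fh: four register banks + stack */
    if (addr <= 0x2F) return 2;   /* 20h..2Fh: 16 bytes, bit-addressable   */
    if (addr <= 0x7F) return 3;   /* 30h..7Fh: 80 bytes of scratch pad     */
    return 0;                     /* 80h..FFh: SFR space                   */
}
```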
Timer 0 and 1
Both Timer 0 and Timer 1 in the AT89C51 are 16 bits wide. Since the 8051 has an 8-
bit architecture, each 16-bit timer is accessed as two separate registers: a low byte and a high
byte.
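Reading such a timer from C means combining the two byte registers. On live hardware the timer keeps counting, so the usual precaution is to read the high byte, then the low byte, then the high byte again, retrying if the high byte changed between reads; the combination step itself is just a shift and an OR. The register values are passed in as plain parameters here, standing in for the SFRs TH0 and TL0:

```c
/* Combine the high and low bytes of a 16-bit 8051 timer into one value.
   th and tl model the SFRs TH0 and TL0 of a real part. */
unsigned int timer16(unsigned char th, unsigned char tl)
{
    return ((unsigned int)th << 8) | tl;
}
```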
Interrupts
The AT89C51 has a total of six interrupt vectors: two external interrupts (INT0 and
INT1), two timer interrupts (Timers 0, 1), the serial port interrupt and the reset pin interrupt.
Each of these interrupt sources can be individually enabled or disabled by setting or clearing
a bit in special function register IE. IE also contains a global disable bit, EA, which disables
all interrupts at once.
Oscillator Characteristics
XTAL1 and XTAL2 are the input and output, respectively, of an inverting amplifier
that can be configured for use as an on-chip oscillator, as shown in Figure 7. Either a quartz
crystal or ceramic resonator may be used. To drive the device from an external clock source,
XTAL2 should be left unconnected while XTAL1 is driven. There are no requirements on the
duty cycle of the external clock signal, since the input to the internal clocking circuitry is
through a divide-by-two flip-flop, but minimum and maximum voltage high and low time
specifications must be observed.
DC Characteristics
Appendix B
void doll(void)
This function takes no parameters. It is used to draw the initial static
image. It accomplishes this job by calling other functions such as hand_movement( ),
leg_movement( ), shirt( ) and pant( ).
void face(void)
This function takes no parameters. It is used to draw the face of the image.
It accomplishes this job by calling C graphics library functions such as ellipse( ),
fillellipse( ), sector( ), floodfill( ), line( ), arc( ), circle( ), etc.
void shirt(void)
This function takes no parameters. It is used to draw the shirt of the
image. It accomplishes this job by executing other functions, some from the C graphics library
and some user defined. It uses the hand_movement function to draw the hands of
the image.
The second parameter specifies the signal value for right arm movement. It indicates the
exact position of the right hand for that execution of the function. The third parameter
specifies the signal value for right elbow movement. It indicates the exact position of the
right elbow for that execution of the function.
void pant(void)
This function takes no parameters. It is used to draw the pant of the
image. It accomplishes this job by executing other functions, some from the C graphics library
and some user defined. It uses the leg_movement function to draw the legs of the
image.
Appendix C
The large variety of plug-ins, add-ons and ancillary equipment forces the user to look
a little carefully into the hardware and software aspects of interfacing various peripherals to a
PC.
The open architecture of the IBM PC has helped its 62-pin I/O channel bus become
a universally accepted interface standard for attaching extra hardware to a PC. A
peripheral, in the strict sense, is a piece of equipment which is in itself a separate entity and
gets attached to the PC through a specific connection protocol called an Interface Standard.
The block diagram in Fig. C.1 illustrates a typical scheme chosen for peripheral
interfacing. Here, different sockets have been provided for each one of the peripherals, and
they are not interchangeable among themselves. If one looks inside a PC, all these sockets
originate from printed circuit cards which share a common bus, called the I/O channel bus
(and sometimes the PC bus), a name given to the set of signals available on the 62-pin edge
connectors on the PC motherboard.