
Virtual Reality For Robotics

1. INTRODUCTION

Technological breakthroughs have allowed automation to replace many jobs that are
repetitive in nature and hazardous to human health, thus replacing human operators in many
work environments. However, it has also allowed a new phenomenon to occur: the remote
control of physical systems by humans through the mediation of computers. Indeed, there are
still many jobs that are non-repetitive, unpredictable and hazardous to humans. Cleaning up a
nuclear power plant leak or exploring the extreme depths of the ocean are just two examples.
Reliance on specially designed automatic machines for such jobs may prove to be inefficient,
less flexible and less cost-effective than a human operator. Therefore, a remote control system
operated by humans using video inspection and master-slave mechanical manipulation (called
tele-operation) is used. With the growing popularity of, and research into, robotics, the day is
not far off when robots will take over many human activities.

This project is an effort to simulate a robot: it converts real-time actions into
digital signals which in turn are used to produce virtual actions. The software for this project
acts as an interface between the hardware devices and the controlling computer. Inputs to the
system are taken from sensors fitted on the intended part or body. These sensors supply the
real-time conditions of the intended body to the computer. The software processes these
signals and, according to the received signal levels, moves the graphical picture on the
monitor. That is, the virtual movement is created by the software, whereas the real actions
are captured by an array of sensors. This project gives an idea of the working of a simple
robot and demonstrates the use, implementation and applications of virtual reality technology.

Here the project is designed using C and C graphics. The project is connected to
external hardware; the simulator accepts commands from the hardware and accordingly
performs the desired task. The project finds various applications in the fields of tele-medicine
and tele-robotics.

The project focuses on C graphics and the peripheral interface part. The objective is
to connect or interface the image on the PC to the external hardware and to control the
image properties, such as shape, color and size, by operating the external hardware.


The project consists of the following blocks:


1. Software with C & C-graphics
2. Interface
3. Hardware

The software reads the data generated by the external hardware through the interface
and interprets it into suitable image movements. For different data, the on-screen image
behaves in a different pre-programmed manner. For example, if the joystick is bent by 45
degrees, the toy image on the computer screen also bends by 45 degrees, and so on; a
minimal sketch of this mapping is given below.
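In the sketch below (read_channel and draw_arm are hypothetical stand-ins for the project's actual input and graphics routines, not part of the original code), an 8-bit sensor reading is scaled linearly to a joint angle:

#include <stdio.h>

/* Hypothetical stand-ins for the project's interface and graphics calls. */
static unsigned char read_channel(int ch) { (void)ch; return 128; }
static void draw_arm(int angle) { printf("arm at %d degrees\n", angle); }

int main(void)
{
    unsigned char raw = read_channel(0);   /* 0..255 from the hardware     */
    int angle = (raw * 90) / 255;          /* linear map to 0..90 degrees  */
    draw_arm(angle);                       /* redraw the limb at the angle */
    return 0;
}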

1.1 Main features of the project:

1. Economical in nature, simple in design & operation.

2. Highly flexible in nature. The system can be used as a stand-alone unit or a remote
interfaced one.

3. Due to the computer interface facility, the system is highly accurate & automatic in
nature.

4. Since this system is associated with a PC, it is very easy to store the data, keep track
of the parameters & analyze them for further study.

5. Once the system is activated and set to receive the parameters, the whole project can
be used as a computerized data logger for storing them.


2. PREVIEW OF VIRTUAL REALITY TECHNOLOGY

2.1 Virtual Reality: An Introduction

The term 'Virtual Reality' (VR) was initially coined by Jaron Lanier, founder of VPL
Research (1989). Other related terms include 'Artificial Reality' (Myron Krueger, 1970s),
'Cyberspace' (William Gibson, 1984), and, more recently, 'Virtual Worlds' and 'Virtual
Environments' (1990s). Today, 'Virtual Reality' is used in a variety of ways and often in a
confusing and misleading manner. Virtual reality is a hot theme, evident in recently
released Hollywood films such as the 'Spy Kids' and 'Spider-Man' series. The virtual
reality concept is an outcome of the most exciting and challenging field of automation,
artificial intelligence. As the name implies, virtual reality allows the user to do work on a
scale far beyond his own physical capacity. For example, by moving a small model of the
Himalayas inside a laboratory, one could move the real mountains with the help of
machinery situated near them. This is an exaggerated application of the concept, but it
succeeds in showing its power.

This concept is clearly understood by taking the 'Warm Hand' project, conducted
jointly by universities in the US and Britain, as an example. The two universities constructed
two human hands from highly sensitive materials, covered with innumerable sensors. The
two hands are each connected to a computer, which in turn is connected to the internet. The
arrangement of both hands is shown in fig 2.1.

Fig 2.1 Depicts warm hand project


When a person shakes hand A, situated at the US university, the system records every
change in parameters during the hand-shaking action. Here 'parameters' means the warmth
of the hand, the pressure applied by the hand-shaker, the grip level, the angle of inclination,
etc. These parameters are sensed by sensitive sensors spread throughout the glove and stored
in the computer's memory. The stored data is transmitted quickly to the British university's
computer via a broadband internet connection. There it is converted back to the original
signals and fed to hand B. Anybody shaking hand B will feel as if the actual person shaking
hand A were shaking hands with him. This project clearly demonstrates the application of
the virtual reality concept.

Such virtual reality applications play a very important role in saving lives, doing
remote work, operating machines far larger than human capacity, etc. Let us see some
examples where this killer application can be implemented.

2.2 Tele-Medicine:

Suppose there is only one specialist available in the world, who cannot move around
to help or perform operations on patients lying on their death-beds. He can sit in his own
hospital, which has a virtual reality facility, and perform operations on various patients
around the world at a time. The doctor has one model, a virtual body resembling the human
patient, fitted with sophisticated sensors all over. On the other side, the real patient is kept
in another hospital equipped with a virtual reality facility. Here all the parameters of the
patient on whom the operation has to be performed are sensed, stored and sent quickly to
the doctor via a broadband internet connection. The parameters sensed are heart beat, pulse
rate, body temperature, body position, etc. With the help of these parameters received from
the real body, the virtual body tries to imitate the real body, which is far away from it. This
helps the specialist to understand the patient's condition, and accordingly he can operate on
him. The virtual reality concept thus brings the patient, who may be at the other corner of
the world, to the doctor's operating table. The doctor can check the pulse rate and heart
beat and measure the body temperature of the virtual body as if it were the real one.


Fig 2.2 Picture represents a tele-medicine system

Fig 2.2 shows how this tele-medicine system works. Here, a doll and a monitor are used
as output devices. The outputs, the graphics on the monitor and the doll, reflect the actions of
the input devices; e.g., movement of a glove or of different parts of the suit results in the
movement of the corresponding parts of the doll and of the graphics on the monitor.

2.3 Brief about this project:


This virtual reality for robotics project implements one half of the full virtual
reality concept. In this project, a number of sensors are connected to a person, as shown
in fig 2.3.

Fig 2.3 Picture shows arrangement of the project


Here the person standing for demonstration is fitted with a number of sensors.
These sensors are connected to the computer via a sensor cable bunch. Sensors S1 &
S2 observe the shoulder positions, S3 & S4 capture the bending action of the hands,
and S5 & S6 watch the movements of the legs. In this picture only six sensors are
used; the number can be increased or decreased as per one's requirements. The
person's picture is created on the computer screen using C++/C-graphics.

Whenever the person makes any movement, the sensors sense the action and
send it to the computer. The C program running on the computer processes the data
received via the sensor cable bunch and instructs the C++/C-graphics routines to
move the respective parts of the graphical picture. Different actions and their
respective movements on screen can thus be observed.

Fig 2.4 Picture shows a person lifting up his hands

Fig 2.4 shows what happens when the person lifts both hands up. Sensors
S5 & S6 sense no action, as the person has not changed his standing position.
Sensors S1 & S2, which are movement detectors, detect that both hands are lifted
up. The sensors fitted on the elbows of both hands signal to the computer that they
are unbent. Processing the entire sensor data, the computer's software concludes
that the person has lifted both hands up, and the C program instructs the
C++/C-graphics routines to make the necessary changes to the graphical
representation on screen.


Fig 2.5 Actual virtual reality system of this project

Even though this project implements only half of the virtual reality concept, it
demonstrates how data can be gathered and stored in the local computer. Instead of
transmitting it to a remote computer for further processing, the sensed data is used
to move the graphical figure on the same local computer. As fig 2.5 makes evident,
extending the system to send the sensed data to a remote computer for further
processing would be straightforward.


3. BLOCK DIAGRAM


The “Virtual Reality for Robotics” project is built using several blocks. The building
blocks and their inter-relation are shown in fig 3.1.


Fig 3.1 Block diagram of hardware unit

Explanation of the block diagram:


This project has a number of sensors/transducers connected across the intended
body [either the whole body, as in EEG scanning, or particular parts such as the
hands or legs, to observe their movement] to sense its activity. As soon as any sensor
senses a change in its current situation, it sends a message to its respective
instrumentation amplifier. The instrumentation amplifier amplifies the received weak
signal to a sufficient level and feeds it into an ADC. The ADC, or analog-to-digital
converter, converts the input analog signals to digital ones so that further sections
can process them easily. The output of the ADC is fed to the buffer section, which
provides unity-gain amplification and proper impedance matching between the I/O
interface and the ADC. The computer, the brain of this project, receives these signals
and with the help of the software module produces the output, here in graphical form.
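To make the end of this chain concrete, the short sketch below shows the scaling the software applies to each received byte: with a 5 V reference, every 8-bit ADC count corresponds to 5/255 V (the same conversion appears later in the Rx_Data routine of section 5.3). The code 0x80 is only a sample value.

#include <stdio.h>

int main(void)
{
    unsigned char raw = 0x80;               /* sample 8-bit code from the ADC */
    float volts = (raw * 5.0f) / 255.0f;    /* 5 V reference: ~19.6 mV/count  */
    printf("ADC code %u -> %.2f V\n", (unsigned)raw, volts);
    return 0;
}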

The block diagram is explained by taking each block separately.

Sensor Block: This block contains a number of sensors or transducers spread across the
intended body. Several types of sensors are used in this block as per one's requirements:
tactile switches to sense the bending of arms and ankles, and micro-switches or sliding
potentiometers to sense the movement of hands, legs or any other body part. These weak
sensed signals are given to the instrumentation amplifier block for further amplification.

Instrumentation Amplifier Block: The output of a sensor is too feeble to drive any
further sections, so some amplification is needed before the sensed signals can be used.
This amplification is done using a reliable amplifier circuit called an instrumentation
amplifier. This amplifier, as the name implies, is used especially in the instrumentation field
to amplify weak sensed signals from various sensors spread over an area [for example, in
industries several sensors are fitted to check parameters such as motor speed, conveyor belt
motion, the level of material in storage silos, etc.].

ADC: The conversion of the analog signals received from the instrumentation amplifier
block is done by this ADC block, which converts input signals from analog form into digital
form. This is necessary as the computer can understand and process only digital data. The
output of the ADC is given to the I/O interface block, through the buffer stage, to pass the
signals on to the computer.

I/O Interface: This stage receives the boosted sensed signals from the buffer and feeds
them to the computer port for further processing. It also ensures that the signal levels are
compatible with the TTL levels of the computer port.

Personal Computer: The brain of this project is the computer. With the help of the
interfacing software module, the computer processes the sensed signals and produces the
output in graphical form.

Software Module: The software module is responsible for producing the graphical output
on the computer's screen, which depends on the received signal levels. The core software
[i.e., the part that receives the sensed signal levels] is written in the popular language C,
and its graphics part in C graphics and C++.

Power Supply Unit: This specially designed power supply provides +12 V for op-amp,
relay & driver stage and +5V for rest of the digital & analog circuits.

4. CIRCUIT DESCRIPTION


The circuit construction of this ‘Virtual Reality for Robotics’ project is divided into
these parts: the sensor block, instrumentation block, ADC, I/O interface stage and, finally,
the power supply unit.

4.1 SENSORS BLOCK

This block holds all the sensors used in the project in one place. Depending upon the
requirement, different types of sensors or transducers are used. Before going to the block
explanation, let us first see what a sensor or transducer is and where it is used.

Transducer: The transducer, also known as a sensor, is a device which converts physical
parameters to be measured, viz., pressure, temperature, movements etc., into a corresponding
measurable quantity. If the transducer converts the physical quantity into useful electrical
signals, such as an analogous voltage or current, it is an electrical transducer.

Temperature Transducers: Electronic measurement of temperature is of vital concern in
modern industrial practice. Hence a wide range of devices has been developed to serve as
temperature transducers. Some of the important temperature transducers are thermistors,
thermocouples, piezoelectric crystals, etc.

Optical Sensors: An optical sensor converts light that falls on it into electrical signals.
Optoelectronic devices such as Light Emitting Diodes [LEDs] can generate electromagnetic
energy, while other devices such as Light Dependent Resistors [LDRs] can sense visible
light energy. Visible light occupies the range from about 400 nm to 700 nm. Although
various types of optical sensor are available, for general purpose applications LDRs are
employed. A typical 1 MΩ-rated LDR exhibits a resistance which varies from about 400 Ω
under bright room lighting [1000 lux] to as great as 1 MΩ in total darkness. The spectral
response peaks at about 550 nm and falls rapidly below 500 nm and above 650 nm. Thus
this optical sensor is useful in light sensing equipment and illumination level sensing
applications generally.
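In practice the LDR is placed in a voltage divider so that its resistance change becomes a voltage the next stage can read. Assuming a 5 V supply and a 10 kΩ fixed resistor (illustrative values, not taken from the project schematic):

Vout = Vcc x R_LDR / (R + R_LDR)
Bright [R_LDR ≈ 400 Ω]: Vout = 5 x 400 / 10400 ≈ 0.19 V
Dark [R_LDR ≈ 1 MΩ]: Vout = 5 x 1,000,000 / 1,010,000 ≈ 4.95 V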

Sound Sensors: Electronic devices such as loudspeakers can generate acoustic energy,
while other devices such as condenser microphones can sense audible energy. The audible
frequency range is from about 20 Hz to 20 kHz. Although various types of sound sensors
are available, for general purpose applications microphones are employed. This sound
sensor is useful in listening-bug or listening-amplifier circuits.

Movement Detectors: A sensor which detects the movement or change of position of
the body/part it is fitted to, and sends a message in electrical signal form, is called a
movement detector. Usually tactile switches, reed relays or slider potentiometers are used
for detecting movements.

Circuit Description: As the circuit in fig 4.1 shows, the project allows any number of
sensors to be used, as per one's requirements. Movement detectors can be used to detect the
bending of arms and ankles, and sliding potentiometers to detect the movements of legs,
hands or any other body part.

The use of sensors or transducers is purely based on one's individual requirements.
This sensor block brings all sensor terminals to one common point. From here, each sensor
is connected as the input signal to an individual instrumentation amplifier circuit.

Fig 4.1 Circuit diagram of sensor block

4.2 INSTRUMENTATION BLOCK


This block amplifies the sensed signals to a sufficient level and feeds them to the ADC
card for further processing. The present circuit shows only one instrumentation amplifier
stage, which takes its input signal from any type of sensor. This circuit can be used for any
sensor whose output needs amplification; note that each sensor needs one such
instrumentation amplifier. Before explaining the actual circuit, let us see some terms and
their definitions in detail.

Amplifier: A common theme in many electronic circuits is the need to increase the level of
the voltage, current, or power present. This need is satisfied by some form of amplifier. The
important parameters in amplifier circuit design are gain, input impedance, output
impedance, frequency response, bandwidth, phase shift and efficiency. Not all of these
characteristics will be important in any given application. An essential first stage in
designing an amplifier is to select those characteristics which are important and then specify
the parameters required.

Instrumentation Amplifiers: The low-level signal outputs of electrical transducers often
need to be amplified before further processing; this is carried out with an instrumentation
amplifier. These are also used to provide impedance matching and isolation. Instrumentation
amplifiers, or differential amplifiers, are specifically designed to extract and amplify small
differential signals from much larger common-mode voltages. To serve as building blocks in
instrumentation amplifiers, op-amps must have very low offset voltage drift, high gain and
wide bandwidth. Popular ICs used to build instrumentation amplifiers are the LM324 and
the µA741.

An instrumentation amplifier is a committed “gain block” that measures the difference
between the voltages existing at its two input terminals, amplifies it by a precisely set gain,
usually from 1 to 100 or more, and causes the result to appear between a pair of terminals
in the output circuit. An ideal instrumentation amplifier responds only to the difference
between the input voltages; if the input voltages are equal, its output will be zero. An
amplifier circuit which is optimized for performance as an instrumentation-amplifier gain
block has high input impedance, low offset and drift, low non-linearity, stable gain and low
effective output impedance. It is commonly used for applications which capitalize on these
advantages, for example: transducer amplification for thermocouples, strain-gauge bridges,
current shunts and biological probes; pre-amplification of small differential signals
superimposed on high common-mode voltages; and signal conditioning and (moderate)
isolation for data acquisition signals wherever the common “ground” is noisy or of
questionable integrity.
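The above can be summarized in two standard formulas (the second is the gain of the classic three-op-amp instrumentation amplifier topology, quoted here for reference; Rg is the gain-setting resistor):

Vout = G x (V+ - V-)
G = (1 + 2R1/Rg) x (R3/R2)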

The instrumentation amplifier is used where it is desirable to eliminate the d.c. path
from the signal input to the amplifier ground, i.e., to isolate the signal from ground. Its
signal input circuit is isolated from the signal output and power input circuits. Such
amplifiers are used when low-level signals are processed on top of high common-mode
voltages, when possibilities exist of troublesome ground disturbances and ground loops,
where patient isolation is required, where systems operate in noisy environments, and where
processing circuitry must be protected in case of fault conditions. The primary features of an
instrumentation amplifier are: inherent protection from high-voltage differences between the
input and output ports and between the high and low input terminals; high rejection of noise
as well as of a.c., d.c. and transient signals common to both input terminals; and very high
leakage impedance between the input and output ports, even with a 120 V, 50 Hz signal
applied.

Instrumentation amplifiers are usually chosen in preference to user-assembled op-amp
circuitry. If the application calls for high common-mode voltages (typically, voltages in
excess of the amplifier supply voltage), or if the isolation impedance must be very high, the
designer should consider an isolation amplifier. Instrumentation amplifiers have two high-
impedance input terminals, a set of terminals for gain programming, an output terminal and
a pair of feedback terminals, labelled sense and reference, as well as terminals for power
supply and offset trim.

The selection of an instrumentation amplifier may fall into one of three categories:
industrial, instrumentation and medical. The instrumentation application or data acquisition
process may not necessarily involve the extreme hazards of medical or industrial
applications, but the precision required demands lower drift and better common-mode
rejection than conventional data amplifiers offer. High-accuracy performance with low-level
transducer signals can be enhanced by choosing an amplifier with isolated power available
for a low-drift signal conditioning pre-amplifier. In medical monitoring applications,
amplifiers must first and foremost protect the patient from leakage currents and amplifier
fault currents. This applies to both input protection and input/output isolation.
Circuit Description:


Fig 4.2 shows a simple module in which the main circuit needs a transducer that gives
varying voltage levels with respect to varying input signals. The heart of this module is the
op-amp CA3140, whose open-loop gain is very high; thus small signals are easily picked up
and amplified to the required level. The op-amp is used in unity-gain non-inverting voltage
amplifier mode. This voltage follower configuration employs one hundred percent voltage
negative feedback and combines a very high input resistance with a very low output
resistance. As the operation of this circuit is analog, the sensor's signal is carried through to
the next stage easily.
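For reference, the closed-loop gain of a non-inverting amplifier is Av = 1 + Rf/R1; in the voltage follower the entire output is fed back (Rf = 0), so:

Av = 1 + Rf/R1 = 1 + 0 = 1, i.e. Vout ≈ Vin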

Fig 4.2 Instrumentation Amplifier

The sensor is connected at the non-inverting input, pin 3, and the output is observed at
pin 6. To stabilize the sensed signals, two biasing resistors [R1 & R2] are arranged as shown.
As the parameter [i.e., movement, bending, etc.] increases in the vicinity of the sensor, the
output goes on increasing. This output signal is further amplified and fed to the ADC circuit
for conversion. Table 4.1 lists the components used in the amplifier circuit.

Table 4.1 Components List:


SEMICONDUCTORS
IC1       CA3140 Op-Amp IC                 1
RESISTORS
R1        5.6 kOhm, ¼ Watt Carbon Type     3
R2        3.9 kOhm, ¼ Watt Carbon Type     1
MISCELLANEOUS
C1        47 µF / 25 V Electrolytic        1
Sensor    LM335 Temperature Sensor         1

4.3 Analog to Digital Converter:

The A/D conversion is a quantizing process whereby an analog signal is represented
by equivalent binary states; this is the opposite of the D/A conversion process. Analog-to-
digital converters can be classified into two general groups based on the conversion
technique. One technique involves comparing a given analog signal with an internally
generated equivalent signal. This group includes successive-approximation, counter and
flash-type converters. The second technique involves changing an analog signal into time or
frequency and comparing these new parameters to known values. This group includes
integrator converters and voltage-to-frequency converters. The trade-off between the two
techniques is based on accuracy vs. speed. The successive-approximation and the flash type
are faster but generally less accurate than the integrator and the voltage-to-frequency type
converters. In A/D conversion, another critical parameter is conversion time. This is defined
as the total time required to convert an analog signal into its digital output, and is determined
by the conversion technique used and by the propagation delay in various circuits.
Successive-approximation A/D converters are used in applications such as data loggers and
instrumentation, where conversion speed is important. On the other hand, integrating-type
converters are used in applications such as digital meters, panel meters and monitoring
systems, where conversion accuracy is critical.

BASIC CONCEPTS


Fig 4.3 shows a block diagram of a 3-bit A/D converter. It has one input line for an
analog signal and three output lines for digital signals.

Fig 4.3 3-bit A/D converter

Fig 4.4 shows the graph of the analog input voltage (0 to 1 V) and the corresponding
digital output signal. It shows eight (2³) discrete output states from 000₂ to 111₂, each step
being 1/8 V apart. This step size is defined as the resolution of the converter.
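The step size follows directly from the full-scale voltage and the number of bits. The 3-bit figure below matches fig 4.4, and the 8-bit figure corresponds to this project's converter with a 5 V reference:

Step size = Vfs / 2^n
3-bit, 1 V full scale: 1/8 = 0.125 V per step
8-bit, 5 V full scale: 5/256 ≈ 19.5 mV per step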


Fig 4.4 Graph of analog to digital conversion

Circuit Description:


This 8-bit analog-to-digital converter is built around the industry standard ADC0809
from National Semiconductor. This 28-pin DIL packaged IC contains an 8-channel analog
multiplexer and can directly interface up to 8 analog inputs in the range 0 to 5 V. It has a
short conversion time, about 100 microseconds, with 15 mW power consumption. The ADC
interface consists of a NOR-gate crystal oscillator and a CMOS clock divider which feeds
768 kHz as the input clock to the ADC [at pin 10]. This stage is built with the help of U1,
U2 and a quartz crystal.

The usually available +12 V supply voltage is stepped down to +5 V, the working voltage
of the ADC0809, and fed to U4's Vcc pin 11. This regulation is achieved by the 10-pin [TO
packaged] regulator IC 723 [U5] with current-limiting zener diode D1. A stable 5 V
reference from the LM336 and an op-amp U6, in reference-and-buffer mode, gives the Ref+
at pin 12. A multi-turn cermet preset [R18] allows adjustment of the reference voltage.

The channel select, ALE, start conversion and output enable lines are interfaced
through port lines. Port lines PB0, PB1 and PB2 let the user select the channel for his
operation. The PB5 line provides the Address Latch Enable [ALE] signal to hold the present
address. The PB6 line asks the ADC to start conversion, in either polled or interrupt mode,
and PB7 is the output enable line. A hedged sketch of this handshake follows.
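The sketch below shows an 8051-side view of the handshake just described. The pin assignments follow the text (PB0-PB2 channel select, PB5 ALE, PB6 start conversion, PB7 output enable); the EOC connection on P3.3 and the use of P2 as the data bus are assumptions of this sketch, not taken from the actual schematic.

#include <reg51.h>

sbit ALE   = P1^5;   /* address latch enable (PB5)          */
sbit START = P1^6;   /* start conversion (PB6)              */
sbit OE    = P1^7;   /* output enable (PB7)                 */
sbit EOC   = P3^3;   /* end of conversion -- assumed wiring */

unsigned char adc_read(unsigned char channel)
{
    unsigned char result;
    P1 = (P1 & 0xF8) | (channel & 0x07); /* channel address on PB0-PB2 */
    ALE = 1; ALE = 0;                    /* latch the address          */
    START = 1; START = 0;                /* begin the conversion       */
    while (EOC == 0)                     /* wait for end of conversion */
        ;
    OE = 1;                              /* enable the data outputs    */
    result = P2;                         /* read the 8-bit result      */
    OE = 0;
    return result;
}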

4.4 RS232 Board Interface Section

Fig 4.5 shows the RS232 board interface section, which uses a MAX232 5 V to
RS232 converter chip. This converts the 0-5 V TTL levels at the ATMEL 89C51 pins
to the +12 V/-12 V levels used on RS232 links. As is common with these devices, it
inverts the data during the conversion; the ATMEL 89C51 UART hardware is
designed to take account of this, but for software serial communications you need to
make sure that you invert both the incoming and outgoing data bits.

The two closed links on the RC7 and RC6 lines are for connection to the
Atmel 89C51 board, and are the two top wire links shown on the top view of the
board below. The two open links on the RC1 and RC2 lines are for the 16F628 board
(the 16F628 uses RB1 and RB2 for its USART connection), and are the two top track
breaks shown on the bottom view of the board below.

Fig 4.5 RS232 Interfacing Circuit Diagram

So, for use with the Atmel 89C51 board, fit the top two wire links and cut the
top two tracks shown; for the 16F628, leave the top two links out and don't cut the
two top track breaks. This only applies if you are using the hardware UART; for
software serial communications you can use any pins you like. Although it's labelled
as connecting to port 1 of the Atmel 89C51 processor board, it can also be
connected to other ports if required, when not using the hardware UART.

4.5 Power supply



Circuit description:

As shown in fig 4.6, a d.c. power supply which maintains a constant output voltage
irrespective of a.c. mains fluctuations or load variations is known as a regulated d.c. power
supply. It is also referred to as a full-wave regulated power supply, as it uses two diodes
with a centre-tapped transformer in full-wave fashion. This laboratory power supply offers
excellent line and load regulation and output voltages of +5 V & +12 V at output currents
up to one amp. The components used are listed in table 4.2.

1. Step-down Transformer: The transformer is rated 230 V AC at the primary and
12-0-12 V, 1 A across the secondary winding. The transformer can deliver a current of 1 A,
which is more than enough to drive an electronic circuit or a varying load. The 12 V AC
appearing across the secondary is the RMS value of the waveform, and the peak value would
be 12 x 1.414 = 16.97 volts. This value limits our choice of rectifier diode to the 1N4007,
which has a PIV rating of more than 17 volts.

2. Rectifier Stage: The two diodes D1 & D2 are connected across the secondary winding
of the transformer as a full-wave rectifier. During the positive half-cycle of the secondary
voltage, end A of the secondary winding becomes positive and end B negative. This makes
diode D1 forward biased and diode D2 reverse biased, so diode D1 conducts while diode D2
does not. During the negative half-cycle, end A of the secondary winding becomes negative
and end B positive, so diode D2 conducts while diode D1 does not. Note that the current
through the centre-tap terminal is in the same direction for both half-cycles of the input a.c.
voltage. Therefore, pulsating d.c. is obtained at point ‘C’ with respect to ground.

3. Filter Stage: Here capacitor C1 is used for filtering and is connected across the
rectifier output. It filters out the a.c. components present in the rectified d.c. and gives a
steady d.c. voltage. As the rectifier voltage increases, it charges the capacitor and also
supplies current to the load. When the capacitor is charged to the peak value of the rectifier
voltage, the rectifier voltage starts to decrease. As the next voltage peak immediately
recharges the capacitor, the discharge period is of very short duration, and due to this
continuous charge-discharge-recharge cycle very little ripple is observed in the filtered
output. Moreover, the output voltage is higher as it remains substantially near the peak value
of the rectifier output voltage. This phenomenon can also be explained in another form: the
shunt capacitor offers a low-reactance path to the a.c. components of the current and an
open circuit to the d.c. component. During the positive half cycle the capacitor stores energy
in the form of an electrostatic field; during the negative half cycle, the filter capacitor
releases the stored energy to the load.
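The residual ripple can be estimated with the usual full-wave approximation; the 100 mA load current below is an assumed example value (50 Hz mains, C1 = 1000 µF):

Vripple ≈ I_load / (2 f C)
Example: 0.1 / (2 x 50 x 0.001) = 1 V peak-to-peak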

Fig 4.6 Circuit diagram of +5V & +12V full wave regulated power supply

Table 4.2 Components List

SEMICONDUCTORS
IC1       7812 Regulator IC                              1
IC2       7805 Regulator IC                              1
D1 & D2   1N4007 Rectifier Diodes                        2
CAPACITORS
C1        1000 µF / 25 V Electrolytic                    1
C2        0.1 µF Ceramic Disc Type                       1
MISCELLANEOUS
X1        230 V AC Pri, 14-0-14 V 1 A Sec Transformer    1

4. Voltage Regulation Stage: Across point ‘D’ and ground there is rectified and filtered
d.c. In the present circuit a KIA7812 three-terminal voltage regulator IC is used to get
+12 V, and a KIA7805 voltage regulator IC is used to get +5 V of regulated d.c. output. Of
the three terminals, pin 1 is the input, i.e., the rectified & filtered d.c. is connected to this
pin; pin 2 is the common pin and is grounded; pin 3 gives the stabilized d.c. output to the
load. The circuit shows two more decoupling capacitors, C2 & C3, which provide a ground
path to high-frequency noise signals. Across points ‘E’ and ‘F’ with respect to ground, the
+5 V & +12 V stabilized or regulated d.c. output is measured, and this can be connected to
the required circuit.

4.6 Astable multivibrator using 555-Timer

Fig 4.7 shows an astable multivibrator, which is a square-wave generator. It is also a
free-running multivibrator, because it does not require an external trigger pulse to change its
output; the output continuously alternates between high and low states. The time for which
the output remains high or low is determined by two timing resistors and a timing capacitor
that are externally connected to the 555 timer. Since the circuit does not stay permanently in
either of the two states, it is known as astable, meaning having no stable state.

Fig 4.7 Astable Multivibrator


Fig 4.8 shows the 555 timer connected as an astable multivibrator. In this circuit there
are two timing resistors, Ra and Rb. The trigger and threshold inputs (pins 2 and 6) of the
two comparators are shorted and connected to the external timing capacitor C. As a
consequence, the circuit triggers itself continuously.

Fig 4.8 Circuit diagram with internal blocks shown

The astable operation can be understood from the waveforms of fig 4.9. Assume that
initially the 555 timer output is in the high state and the capacitor C is uncharged. The
control flip-flop output is now low, with Q=0.

The discharge transistor Q1 is off and capacitor C starts charging towards Vcc through
Ra and Rb. When the capacitor voltage rises above 2/3 Vcc, it causes the comparator 1 output
to go high and the comparator 2 output to go low. Hence the control flip-flop output is set
high, with Q=1, and the timer output is forced to the low state. Consequently, the discharge
transistor Q1 is turned ON, connecting the upper end of Rb to ground. The capacitor C now
discharges through Rb towards 0 V. When the capacitor voltage drops below 1/3 Vcc, the
output of comparator 2 goes high and that of comparator 1 goes low. As a result, the output
of the control flip-flop resets to the low state with Q=0 and the timer output goes to the high
state. In effect, the discharge transistor Q1 is turned OFF and the capacitor C charges again
towards 2/3 Vcc. This cycle of the capacitor charging from 1/3 Vcc to 2/3 Vcc and then
discharging from 2/3 Vcc to 1/3 Vcc repeats continually, producing a square-wave output.

Fig 4.9 Timing Diagram

Ton = 0.693 (Ra + Rb) C
Toff = 0.693 Rb C
f = 1/T = 1.44 / ((Ra + 2Rb) C)
Duty Cycle, D = Ton / T = (Ra + Rb) / (Ra + 2Rb)

where,
Ton is the on-time during which the capacitor charges.
Toff is the off-time during which the capacitor discharges.
f is the output frequency.
T is the total time period.
D is the Duty cycle of the output pulses generated.
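As a quick check on these formulas, the small program below evaluates them for one assumed set of component values (Ra = Rb = 10 kΩ, C = 0.1 µF; these values are illustrative only, not taken from the project circuit):

#include <stdio.h>

int main(void)
{
    double Ra = 10e3, Rb = 10e3, C = 0.1e-6;   /* assumed component values */
    double Ton  = 0.693 * (Ra + Rb) * C;       /* charge time: ~1.39 ms    */
    double Toff = 0.693 * Rb * C;              /* discharge time: ~0.69 ms */
    double f    = 1.44 / ((Ra + 2 * Rb) * C);  /* ~480 Hz                  */
    double D    = (Ra + Rb) / (Ra + 2 * Rb);   /* duty cycle: ~0.667       */

    printf("Ton = %.2f ms, Toff = %.2f ms, f = %.0f Hz, D = %.1f %%\n",
           Ton * 1e3, Toff * 1e3, f, D * 100);
    return 0;
}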


5. Software Description

5.1 Flowchart
The flowchart of the program is shown in fig 5.1. In brief the program works as
explained below.

When the program is executed, the initgraph function is called to initialize the
graphics system, and then the function doll is executed to draw the figure. Signals are then
read from the hardware device, which acts as the input device. It has 8 input channels
(numbered 0 to 7), and each channel's signal can vary from 0 to 255. Depending upon the
channel selected, a part of the image on the screen moves, the amount of movement
depending on the signal value in the range 0 to 255: channel 0 moves the left hand, channel
1 the right hand, channel 2 the left leg and channel 3 the right leg, while varying channel 4
moves the right forearm.


[Flowchart: START → INITGRAPH → DOLL → WHILE (!KBHIT) → read SIGNALS FROM
HARDWARE → branch on SIGNAL: 0 = hand movement (left arm), 1 = hand movement
(right arm), 2 = leg movement (left leg), 3 = leg movement (right leg), 4 = hand movement
(right arm) → connectors A, B, C.]


Fig 5.1 Flowchart for the program


5.2 Program ADC 0809


#include <reg51.h>               /* 8051 SFR definitions (Keil C51) */

sfr port2 = 0xa0;                /* P2: ADC data bus                */
sfr port1 = 0x90;                /* P1: level indicator outputs    */
sfr port3 = 0xb0;                /* P3: ADC control lines          */
sbit rd   = port3 ^ 0;           /* ADC read / output enable       */
sbit wr   = port3 ^ 1;           /* ADC write / start conversion   */
sbit intr = port3 ^ 2;           /* ADC end-of-conversion flag     */

void main(void)
{
    unsigned char val;           /* unsigned: the ADC code is 0..255 */

    port2 = 0xff;                /* configure the ports as inputs    */
    port3 = 0xff;

    while (1)
    {
        wr = 0;                  /* pulse WR low-to-high to start a conversion */
        rd = 1;
        wr = 1;
        intr = 1;                /* release the flag pin so it can be read     */
        while ( intr == 0 )      /* wait for the end-of-conversion signal      */
            ;
        rd = 0;                  /* enable the ADC output latch                */
        val = port2;             /* read the 8-bit result                      */

        /* band the reading and drive the matching indicator bit */
        if ( val > 0x05 && val <= 0x16 )
            port1 = 0x01;
        else if ( val > 0x16 && val <= 0x32 )
            port1 = 0x02;
        else if ( val > 0x32 && val <= 0x80 )
            port1 = 0x04;
        else if ( val > 0x80 )
            port1 = 0x08;
        rd = 1;                  /* release the bus for the next cycle */
    }
}


5.3 C Program:
#include <stdio.h>
#include <alloc.h>
#include <conio.h>
#include <graphics.h>
#include <math.h>
#include <dos.h>
#include <stdlib.h>
#include <bios.h>

#define COM 0
#define DATA_READY 0x100
#define TRUE 1
#define FALSE 0

#define SETTINGS ( _COM_9600 | _COM_NOPARITY | _COM_STOP1 | _COM_CHR8)

void doll(void);
void face(void);
void shirt();
void hand_movement(int,int,int,int,int);
void leg_movement(int,int,int,int);

void pant(void);
void lhand(int,int,int);
void rhand(int,int,int);
void lleg(int,int,int,int,int);
void rleg(int,int,int,int,int);
void movelhand(int,int,int,int [500][2]);
void moverhand(int,int,int,int [500][2]);
int gd=DETECT,gm,a[20];
int x=300,y=100;
static int lh11[500][2],lh12[500][2],lh21[500][2],lh22[500][2],
rh11[500][2],rh12[500][2],rh21[500][2],rh22[500][2],
rl11[500][2],rl12[500][2],rl21[500][2],rl22[500][2],
ll11[500][2],ll12[500][2],ll21[500][2],ll22[500][2],
le11[500][2],le12[500][2],le21[500][2],le22[500][2],
re11[500][2],re12[500][2],re21[500][2],re22[500][2],
r1=115,r2=105,r3=150,r4=117,q;

float r[3][3],b[3][1],c[3][1],p[10];

int Rx_Data(void); /* function prototype; global variables follow */


float value[8];
int ix,j,status;
int CH_No,DataL,DataH,Data;


void main(void)
{
char c,*area;

int i=10,input=0,ip0=0,ip1=0,ip2=0,ip3=0,ip4=0,ip5=0,ip6=0,ip7=0,
temp1=0,temp2=0,temp3=0,temp4=0,size,flag1=0,flag2=0;

initgraph(&gd,&gm," ");
cleardevice();
//outportb(0x303,0x91);

movelhand(x-71,y+78,r1,lh11); /* hands */
movelhand(x-50,y+100,r2,lh12);
moverhand(x+70,y+78,r1,rh11);
moverhand(x+50,y+100,r2,rh12);

movelhand(x-71,y+78,r1-8,lh21); /* hands */
movelhand(x-50,y+100,r2+8,lh22);
moverhand(x+70,y+78,r1-8,rh21);
moverhand(x+50,y+100,r2+8,rh22);

moverhand(x+41,y+198,r3,rl11); /* legs */
movelhand(x-41,y+198,r3,ll11);
moverhand(x,y+231,r4,rl12);
movelhand(x,y+231,r4,ll12);

moverhand(x+41,y+198,r3-8,rl21); /* legs */
movelhand(x-41,y+198,r3-8,ll21);
moverhand(x,y+231,r4+8,rl22);
movelhand(x,y+231,r4+8,ll22);

movelhand(x-71,y+78,r1/2,le11); /* elbow */
movelhand(x-50,y+100,r2/2,le12); /* left */
movelhand(x-71,y+78,r1/3-5,le21);
movelhand(x-50,y+100,r2/2+5,le22);

moverhand(x+71,y+78,r1/2,re11); /* elbow */
moverhand(x+50,y+100,r2/2,re12); /* right */
moverhand(x+71,y+78,r1/3-5,re21);
moverhand(x+50,y+100,r2/2+5,re22);

doll();
size = imagesize(180,0,400,380);
area = malloc(size);
getimage(180,0,400,380,area);


while(!kbhit())
{

bioscom(_COM_INIT, SETTINGS, COM);


outportb( (0x2F8+0x02),0x07 ); // Enable FIFO Buffer
// 0x2F8 for COM2
// 0x3F8 for COM1
status = bioscom(_COM_STATUS, 0, COM ); // If any previous unwanted
if (status & DATA_READY) // data has been received
bioscom(_COM_RECEIVE, 0, COM ); // then read it

bioscom( _COM_SEND, 'D' , COM ); // Send the Device Check command


for ( ix=0;ix<10;ix++)
{
status = bioscom(_COM_STATUS, 0, COM );
if (status & DATA_READY)
{ j = bioscom(_COM_RECEIVE, 0, COM );
goto skip1; }
delay (500);
}
goto skip2;

skip1:
if ( j == 'D' ) goto skip3;
skip2:
printf("\nSystem not ready \n");
skip3:
ix= 1;

for ( CH_No = 0; CH_No <= 7 ; CH_No++)


{ bioscom( _COM_SEND, CH_No+0x30 , COM ); // Send CH_No in Ascii
delay(10);
Rx_Data();
switch(CH_No)
{
case 0 :
if(ip6==0)
ip0=value[CH_No]*36;
break;

case 1 :
if(ip6==0)
ip1=value[CH_No]*36;
break;
case 2 :
if(ip6==0)
{
if(ip2 != value[CH_No]*50)
{
setfillstyle(1,BLACK);
bar(x-3,y+281,x,y+325);
bar(x-45,y+281,x-41,y+324);


// putimage(180,value[CH_No]*51*.16,area,0);
leg_movement(ip2*0.2+5,BLACK,1,0);
leg_movement(value[CH_No]*50*0.2+5,WHITE,1,0);
ip2=value[CH_No]*50;
}
}
break;

case 3 :
if(ip6==0)
{
if(ip3 != value[CH_No]*50)
{
setfillstyle(1,BLACK);
bar(x,y+281,x+3,y+325);
bar(x+41,y+281,x+44,y+324);
// putimage(180,value[CH_No]*51*.16,area,0);
leg_movement(ip3*0.2+5,BLACK,0,1);
leg_movement(value[CH_No]*50*.2+5,WHITE,0,1);
ip3=value[CH_No]*50;
}
}
break;

case 4 :
if(ip6==0)
ip4=value[CH_No]*51;
break;

case 5 :
if(ip6 == 0)
if ( value[CH_No] > 3.33 )
ip5=value[CH_No]*51;
else
ip5 = 0;
break;
}

if((ip1 != temp1) || (ip4 != temp2))


{
if(flag1)
hand_movement(0*.79+10,0*.18,BLACK,0,1);
else
hand_movement(temp1*.79+10,temp2*.18,BLACK,0,1);
hand_movement(ip1*.79+10,ip4*.18 ,WHITE,0,1);
temp1=ip1;temp2=ip4;flag1=0;
}
if(ip5>=150)
{
ip0=150;ip4=180;


}
if((ip0 != temp3) || (ip5 != temp4))
{
if(flag2)
hand_movement(0*.79+10,0*.18,BLACK,1,0);
else
hand_movement(temp3*.79+10,temp4*.18,BLACK,1,0);
hand_movement(ip0*.79+10,ip5*.18 ,WHITE,1,0);
temp3=ip0;temp4=ip5;flag2=0;
}
ip4=0;
}
ix++;
}
closegraph();
}

/* function DOLL */

void doll()
{
face();
shirt();
pant();
hand_movement(10,0,WHITE,1,1);
leg_movement(5,WHITE,1,1);
}

/* face */
void face()
{
setcolor(16);
setfillstyle(1,13);
fillellipse(x,y,30,50); /* Face */
setcolor(13);
sector(x-30,y,0,360,5,10); /* Ear-left */
sector(x+30,y,0,360,5,10); /* Ear-right */
setcolor(16);
ellipse(x-30,y,120,240,3,8); /* ear-left */
ellipse(x+30,y,300,60,3,8); /* ear-right */
setfillstyle(1,15);
fillellipse(x-15,y-5,7,3); /* Eye-left */
fillellipse(x+15,y-5,7,3); /* Eye-right */
setfillstyle(1,16);
fillellipse(x-15,y-5,3,3); /* Eye Ball - left */
fillellipse(x+15,y-5,3,3); /* Eye Ball - right */
setfillstyle(1,RED);
ellipse(x,y+30,180,0,10,5); /* mouth */
ellipse(x,y+25,210,330,15,7); /* mouth */


floodfill(x,y+34,BLACK); /* mouth */
ellipse(x+30,y+25,185,205,18,15); /* mouth - right arc */
ellipse(x-30,y+25,338,358,18,15); /* mouth - left arc */
line(x,y+3,x-6,y+19); /* nose */
line(x,y+3,x+6,y+19); /* nose */
fillellipse(x+3,y+19,3,1); /* nose */
fillellipse(x-3,y+19,3,1); /* nose */
setfillstyle(1,BLUE);
fillellipse(x,y-25,40,10); /* Hat - lower ellipse */
setcolor(BLUE);
a[0]=a[6]=x-30;a[1]=a[3]=y-25; /* Hat - poly values */
a[2]=a[4]=x+30;a[5]=a[7]=y-60; /* Hat - poly values */
fillpoly(4,a); /* Hat - poly */
setcolor(16);
fillellipse(x,y-60,30,10); /* Hat - upper ellipse */
setlinestyle(0,1,3);
ellipse(x-15,y-5,50,130,12,8); /* Eyebrow - left */
ellipse(x+15,y-5,50,130,12,8); /* Eyebrow - right */
a[0]=a[6]=x-12;a[1]=a[3]=y+45; /* Neck - poly values */
a[2]=a[4]=x+13;a[5]=a[7]=y+60; /* Neck - poly values */
setfillstyle(1,13);
setlinestyle(0,1,1);
setcolor(13);
fillpoly(4,a); /* Neck */
setcolor(15);
}

void shirt()
{
setcolor(WHITE);
ellipse(x,y+80,6,172,70,20); /* shoulder */
ellipse(x-60,y+170,340,85,20,75); /* hips-left */
ellipse(x+60,y+170,95,202,20,75); /* hips-right */
line(x-41,y+196,x+42,y+196); /* vest line */
setfillstyle(9,BLUE);
setcolor(WHITE);
line(x-70,y+78,x-55,y+100);
line(x+70,y+78,x+55,y+100);

a[0]=a[6]=a[12]=x;a[5]=a[11]=y+100;
a[1]=a[7]=a[13]=y+90;a[4]=x-10;
a[2]=x-20; a[3]=a[9]=a[15]=y+60;
a[8]=a[14]=x+20;a[10]=x+10;

fillpoly(8,a);

setfillstyle(1,13);
floodfill(x,y+80,WHITE);


setcolor(WHITE);
line(x-41,y+185,x+42,y+185);

setfillstyle(1,7);
floodfill(x-38,y+190,WHITE); /* belt */
setcolor(RED);

setfillstyle(1,RED);
fillellipse(x,y+190,10,4); /* belt */

setcolor(13);
ellipse(x,y+80,75,105,70,20);
setcolor(WHITE);
setfillstyle(9,BLUE);
floodfill(x-60,y+80,WHITE);

setfillstyle(1,7);
setcolor(7);
fillellipse(x,y+95,3,3); /* button */
fillellipse(x,y+125,3,3); /* button */
fillellipse(x,y+155,3,3); /* button */
fillellipse(x,y+180,3,3); /* button */
}

/* Function hand_movement */

void hand_movement(int w,int j,int color,int l,int r)


{
setcolor(WHITE);
line(x-70,y+78,x-55,y+100);
line(x+70,y+78,x+55,y+100);
ellipse(x,y+80,120,172,70,20);
ellipse(x,y+80,6,70,70,20);

ellipse(x-60,y+170,340,85,20,75); /* hips-left */
ellipse(x+60,y+170,95,202,20,75); /* hips-right */

if(color==BLACK)
{
setfillstyle(9,BLACK);
if(l)
{
floodfill(x-70,y+85,WHITE);
if((w<160)&&(w>120))
floodfill((p[0]+p[2])/2+5,(p[1]+p[3])/2,WHITE);
else if(!q)
floodfill((p[0]+p[2])/2,(p[1]+p[3])/2-5,WHITE);
else


floodfill((p[0]+p[2])/2,(p[1]+p[3])/2+5,WHITE);

}
if(r)
{
floodfill(x+70,y+85,WHITE);
floodfill((re11[w][0]+rh11[w][0])/2-5,(re11[w][1]+rh11[w][1])/2+5,WHITE);

}
}
if(w<=140)
{
if(w==140)
{
setcolor(BLACK);
}
q=0;
setcolor(color);
if(l)
lhand(color,w,j);
if(r)
rhand(color,w,j);

}
else
{
if(w==142)
{
setcolor(BLACK);
}
q=1;
setcolor(color);
if(l)
lhand(color,w,j);
if(r)
rhand(color,w,j);

if(color==WHITE)
{
setfillstyle(9,BLUE);
if(l)
{
floodfill(x-70,y+85,WHITE);

floodfill((p[0]+p[2])/2,(p[1]+p[3])/2-5,WHITE);
floodfill(le11[w][0],le11[w][1]+5,WHITE);
}
if(r)


{
floodfill(x+70,y+85,WHITE);
floodfill((re11[w][0]+rh11[w][0])/2-5,(re11[w][1]+rh11[w][1])/2+5,WHITE);
}
}
}
}

/* moving hand co-ordinates - left */

void movelhand(int xc, int yc, int r, int b[500][2])


{
float th,x,y;
int i=0;
for(th=0;th<=180;th+=.5,++i)
{
x = r*cos((th*3.142)/180);
y = r*sin((th*3.142)/180);
b[i][0] = xc-y;
b[i][1] = yc+x;
}
}

/* moving hand co-ordinates - right */


void moverhand(int xc, int yc, int r, int b[500][2])
{
float x,y,th;
int i=0;
for(th=0;th<=180;++i,th+=.5)
{
x = r*cos((th*3.142)/180);
y = r*sin((th*3.142)/180);
b[i][0] = xc+y;
b[i][1] = yc+x;
}
}

/* Function pant */

void pant()
{
setcolor(15);
line(x-41,y+198,x,y+231);
line(x+41,y+198,x,y+231);
line(x-41,y+198,x,y+231);
line(x+41,y+198,x,y+231);
setcolor(WHITE);
setfillstyle(9,YELLOW);
floodfill(x+2,y+205,WHITE);
setlinestyle(1,0,1);
line(x-3,y+195,x-3,y+230); /* zip */
line(x+5,y+195,x+5,y+222); /* zip */
line(x-3,y+230,x+5,y+222); /* zip */
setlinestyle(0,0,1);
}


/* function for right leg */

void rleg(int s,int t,int u,int v,int color)


{
setcolor(color);
line(x+41,y+198,s,t); /* right leg */
line(x,y+231,u,v); /* " " */
line(s,t,u,v); /* " " */
}

/* function for left leg */

void lleg(int s,int t,int u,int v,int color)


{
setcolor(color);
line(x-41,y+198,s,t); /* left leg */
line(x,y+231,u,v); /* " " */
line(s,t,u,v); /* " " */
}

/* Function leg_movement */

void leg_movement(int w,int color,int l,int r)


{
setcolor(color);
if(color==BLACK)
{
setfillstyle(9,BLACK);
if(l)
floodfill(x-5,y+231,WHITE);
if(r)
floodfill(x+7,y+231,WHITE);
}
if(w<=40)
{
if(l)
lleg(ll11[w][0],ll11[w][1],ll12[w][0],ll12[w][1],color);
if(r)
rleg(rl11[w][0],rl11[w][1],rl12[w][0],rl12[w][1],color);

}
else
{
if(l)
lleg(ll21[w-4][0],ll21[w-4][1],ll22[w+4][0],ll22[w+4][1],color);
if(r)
rleg(rl21[w-4][0],rl21[w-4][1],rl22[w+4][0],rl22[w+4][1],color);

}
if(color==WHITE)


{
setfillstyle(9,YELLOW);
if(l)
floodfill(x-7,y+231,WHITE);
if(r)
floodfill(x+7,y+231,WHITE);
}
}

void rotate(int xr,int yr,float th,int x1,int y1)


{
int i,j,k,d[10];
th = th*3.142/180;
r[0][0]=r[1][1]=cos(th);
r[2][0]=r[2][1]=0;
r[0][1]= -sin(th);
r[1][0]= sin(th);
r[2][2]=1;
r[0][2]= xr*(1-cos(th))+ yr*sin(th);
r[1][2]= yr*(1-cos(th))- xr*sin(th);
b[0][0]= x1; b[1][0]=y1; b[2][0]=1;
for(i=0;i<3;++i)
{
for(j=0;j<1;++j)
{
c[i][j]=0;
for(k=0;k<3;++k)
c[i][j] = c[i][j]+(r[i][k]*b[k][j]);
}
}
line(xr,yr,c[0][0],c[1][0]);
}

/* function for left hand */


void lhand(int color,int i,int j)
{
int s,t,u,v;
setcolor(color);
if(q)
{
line(x-65,y+73,(le21[i][0]),(le21[i][1])); /* 1st half */

if ((j>10)&&(i>140)&&(j<21))
movelhand(x-50,y+100,r2+15,lh22);
else if((j>20)&&(i>140)&&(j<31))
movelhand(x-50,y+100,r2+20,lh22);
else if((j>30)&&(i>140))
movelhand(x-50,y+100,r2+25,lh22);
else
movelhand(x-50,y+100,r2+10,lh22);


rotate((le21[i][0]),(le21[i][1]),j,lh21[i][0],lh21[i][1]);//s,t);
s=c[0][0];t=c[1][0];

line((le21[i][0]),(le21[i][1]),(le22[i][0]),(le22[i][1]));
line(x-50,y+100,le22[i][0],le22[i][1]);

rotate(le22[i][0],le22[i][1],j,lh22[i][0],lh22[i][1]);//u,v);
u=c[0][0];v=c[1][0];

{
p[0]=s;p[1]=t;p[2]=u;p[3]=v;
}
}
else
{
line(x-71,y+78,(le11[i][0]),(le11[i][1])); /* left - arm */

rotate((le11[i][0]),(le11[i][1]),j,lh11[i][0],lh11[i][1]);
s=c[0][0];t=c[1][0];

line((le11[i][0]),(le11[i][1]),(le12[i][0]),(le12[i][1]));
line(x-50,y+100,le12[i][0],le12[i][1]);

if((j>10)&&(i>70)&&(j<21))
movelhand(x-50,y+100,r2+5,lh12);
else if((j>20)&&(i>70)&&(j<31))
movelhand(x-50,y+100,r2+10,lh12);
else if((j>30)&&(i>70))
movelhand(x-50,y+100,r2+15,lh12);
else
movelhand(x-50,y+100,r2,lh12);

rotate(le12[i][0],le12[i][1],j,lh12[i][0],lh12[i][1]);//u,v);
u=c[0][0];v=c[1][0];
{
p[0]=s;p[1]=t;p[2]=u;p[3]=v;
}
}
setcolor(color);
line(s,t,u,v);
}

/* function for right hand */

void rhand(int color,int i,int j)


{
int s,t,u,v;
setcolor(color);


if(q)
{
line(x+65,y+73,(re21[i][0]),(re21[i][1])); /* 1st half */

if ((j<(360-10))&&(i>140)&&(j>(360-21)))
moverhand(x+50,y+100,r2+15,rh22);
else if((j<(360-20))&&(i>140)&&(j>(360-31)))
moverhand(x+50,y+100,r2+20,rh22);
else if((j<(360-30))&&(i>140))
moverhand(x+50,y+100,r2+25,rh22);
else
moverhand(x+50,y+100,r2+10,rh22);

rotate((re21[i][0]),(re21[i][1]),360-j,rh21[i][0],rh21[i][1]);//s,t);
s=c[0][0];t=c[1][0];

line((re21[i][0]),(re21[i][1]),(re22[i][0]),(re22[i][1]));
line(x+50,y+100,re22[i][0],re22[i][1]);

rotate(re22[i][0],re22[i][1],360-j,rh22[i][0],rh22[i][1]);//u,v);
u=c[0][0];v=c[1][0];
if(i==140)
{
p[0]=s;p[1]=t;p[2]=u;p[3]=v;
}

}
else
{
line(x+71,y+78,(re11[i][0]),(re11[i][1]));
// line((le11[i/2.5][0]),(le11[i/2.5][1]),s,t); /* left - arm */

rotate((re11[i][0]),(re11[i][1]),360-j,rh11[i][0],rh11[i][1]);
s=c[0][0];t=c[1][0];

line((re11[i][0]),(re11[i][1]),(re12[i][0]),(re12[i][1]));
line(x+50,y+100,re12[i][0],re12[i][1]);

if((j<(360-10))&&(i>70)&&(j>(360-21)))
moverhand(x+50,y+100,r2+5,rh12);
else if((j<(360-20))&&(i>70)&&(j>(360-31)))
moverhand(x+50,y+100,r2+10,rh12);
else if((j<(360-30))&&(i>70))
moverhand(x+50,y+100,r2+15,rh12);
else
moverhand(x+50,y+100,r2,rh12);

rotate(re12[i][0],re12[i][1],360-j,rh12[i][0],rh12[i][1]);//u,v);
u=c[0][0];v=c[1][0];


if(i==142)
{
p[0]=s;p[1]=t;p[2]=u;p[3]=v;
}
}
setcolor(color);
line(s,t,u,v); /* " " */
}

int Rx_Data(void)
{
int Str_Char, CH , End_Char;

loop1:
status = bioscom(_COM_STATUS, 0, COM );

if (status & DATA_READY)


{ Str_Char = bioscom(_COM_RECEIVE, 0, COM );
goto loop2; }
else goto loop1;

loop2:
status = bioscom(_COM_STATUS, 0, COM );

if (status & DATA_READY)


{ CH = bioscom(_COM_RECEIVE, 0, COM );
goto loop3; }
else goto loop2;

loop3:
status = bioscom(_COM_STATUS, 0, COM );

if (status & DATA_READY)


{ DataH = bioscom(_COM_RECEIVE, 0, COM );
goto loop4; }
else goto loop3;

loop4:
status = bioscom(_COM_STATUS, 0, COM );
if (status & DATA_READY)
{ DataL = bioscom(_COM_RECEIVE, 0, COM );
goto loop5; }
else goto loop4;

loop5:
status = bioscom(_COM_STATUS, 0, COM );
if (status & DATA_READY)
{ End_Char = bioscom(_COM_RECEIVE, 0, COM );


goto loop6; }
else goto loop5;

loop6:
if( DataH<=0x39) DataH-= 0x30; else DataH-= 0x37; // ASCII hex digit to nibble
if( DataL<=0x39) DataL-= 0x30; else DataL-= 0x37;
Data = DataH<<4;                     // shift the MS nibble left by 4 bits
Data = Data+DataL;                   // combine with the LS nibble
value[CH_No]=( Data * 5.00)/0xFF;    // scale the 8-bit code to a 0-5 V reading
return (0);
}


6. Present Day Virtual Reality Technology

6.1 Head-Mounted Display (HMD)

The head-mounted display (HMD) shown in fig 6.1 was the first device providing its
wearer with an immersive experience. Evans and Sutherland demonstrated a head-mounted
stereo display as early as 1965. It took more than 20 years before VPL Research introduced
a commercially available HMD, the famous "EyePhone" system (1989).

A typical HMD houses two miniature display screens and an optical system that
channels the images from the screens to the eyes, thereby presenting a stereo view of a
virtual world. A motion tracker continuously measures the position and orientation of the
user's head and allows the image-generating computer to adjust the scene representation to
the current view. As a result, the viewer can look around and walk through the surrounding
virtual environment. To overcome the often uncomfortable intrusiveness of a head-mounted
display, alternative concepts (e.g., BOOM and CAVE) for immersive viewing of virtual
environments were developed.

Fig 6.1 A head-mounted display (HMD)

6.2 BOOM

The BOOM (Binocular Omni-Orientation Monitor) from Fakespace, shown in fig 6.2,
is a head-coupled stereoscopic display device. Screens and an optical system are housed
in a box that is attached to a multi-link arm. The user looks into the box through two holes,
sees the virtual world, and can guide the box to any position within the operational volume
of the device. Head tracking is accomplished via sensors in the links of the arm that holds
the box.

Fig 6.2 The BOOM, a head-coupled display device

6.3 CAVE

The CAVE (Cave Automatic Virtual Environment) as shown in fig 6.3, was developed
at the University of Illinois at Chicago and provides the illusion of immersion by projecting
stereo images on the walls and floor of a room-sized cube. Several persons wearing
lightweight stereo glasses can enter and walk freely inside the CAVE. A head tracking system
continuously adjusts the stereo projection to the current position of the leading viewer.

Fig 6.3 CAVE system (schematic principle)

6.4 Input Devices and other Sensual Technologies

A variety of input devices like data gloves, joysticks, and hand-held wands allow the
user to navigate through a virtual environment and to interact with virtual objects. Directional
sound, tactile and force feedback devices, voice recognition and other technologies are being
employed to enrich the immersive experience and to create more "sensualized" interfaces.
The fig 6.4 illustrates this technology.

Fig 6.4 A data glove allows for interactions with the virtual world

6.5 Characteristics of Immersive VR

The unique characteristics of immersive virtual reality can be summarized as follows:

• Head-referenced viewing provides a natural interface for navigation in three-dimensional
space and allows for look-around, walk-around, and fly-through capabilities in virtual
environments.
• Stereoscopic viewing enhances the perception of depth and the sense of space.
• The virtual world is presented in full scale and relates properly to the human size.
• Realistic interactions with virtual objects via data glove and similar devices allow for
manipulation, operation, and control of virtual worlds.
• The convincing illusion of being fully immersed in an artificial world can be enhanced
by auditory, haptic, and other non-visual technologies.
• Networked applications allow for shared virtual environments (see below).


Shared Virtual Environments

In the example illustrated below in fig 6.5, three networked users at different locations
(anywhere in the world) meet in the same virtual world by using a BOOM device, a CAVE
system, and an HMD, respectively. All users see the same virtual environment from their
respective points of view. Each user is presented as a virtual human (avatar) to the other
participants. The users can see each other, communicate with each other, and interact with
the virtual world as a team.

Fig 6.5 Shared virtual environment

Non-immersive VR

Today, the term 'Virtual Reality' is also used for applications that are not fully
immersive. The boundaries are becoming blurred, but all variations of VR will be important
in the future. This includes mouse-controlled navigation through a three-dimensional
environment on a graphics monitor, stereo viewing from the monitor via stereo glasses, stereo
projection systems, and others. Apple's QuickTime VR, for example, uses photographs for the
modeling of three-dimensional worlds and provides pseudo look-around and walk-through
capabilities on a graphics monitor.


VRML

Most exciting is the ongoing development of VRML (Virtual Reality Modeling
Language) on the World Wide Web. In addition to HTML (Hyper Text Markup Language),
which has become a standard authoring tool for the creation of home pages, VRML provides
three-dimensional worlds with integrated hyperlinks on the web. Home pages become home
spaces. The viewing of VRML models via a VRML plug-in for web browsers is usually done
on a graphics monitor under mouse control and is, therefore, not fully immersive. However, the
syntax and data structure of VRML provide an excellent tool for the modeling of three-
dimensional worlds that are functional and interactive and that can, ultimately, be transferred
into fully immersive viewing systems. The current version, VRML 2.0, has become an
international ISO/IEC standard under the name VRML97. An example rendering is shown in fig 6.6.

Fig 6.6 Rendering of Escher's Penrose Staircase

VR-related Technologies

Other VR-related technologies combine virtual and real environments. Motion
trackers are employed to monitor the movements of dancers or athletes for subsequent studies
in immersive VR. The technologies of 'Augmented Reality' allow for the viewing of real
environments with superimposed virtual objects. Tele-presence systems (e.g., tele-medicine,
tele-robotics) immerse a viewer in a real world that is captured by video cameras at a distant
location and allow for the remote manipulation of real objects via robot arms and
manipulators.


6.6 Applications

Virtual Reality is well known for its use in flight simulators and games. However,
these are only two of the many ways virtual reality is being used today. This section
summarizes how virtual reality is used in medicine, architecture, weather simulation,
chemistry and the visualization of voxel data. Fig 6.7 shows real and abstract virtual worlds
(Michigan stadium, flow structure).

Fig 6.7 Real and abstract virtual worlds (Michigan stadium, flow structure)

As the technologies of virtual reality evolve, the applications of VR become literally
unlimited. It is assumed that VR will reshape the interface between people and information
technology by offering new ways for the communication of information, the visualization of
processes, and the creative expression of ideas.

Note that a virtual environment can represent any three-dimensional world that is
either real or abstract. This includes real systems like buildings, landscapes, underwater
shipwrecks, spacecraft, archaeological excavation sites, human anatomy, sculptures, crime
scene reconstructions, solar systems, and so on. Of special interest is the visual and sensual
representation of abstract systems like magnetic fields, turbulent flow structures, molecular
models, mathematical systems, auditorium acoustics, stock market behaviour, population
densities, information flows, and any other conceivable system including artistic and creative
work of abstract nature. These virtual worlds can be animated, interactive, shared, and can
expose behaviour and functionality.

Useful applications of VR include training in a variety of areas (military, medical,
equipment operation, etc.), education, design evaluation (virtual prototyping), architectural
walk-through, human factors and ergonomic studies, simulation of assembly sequences and
maintenance tasks, assistance for the handicapped, study and treatment of phobias (e.g., fear
of heights), entertainment, and much more.

7. Conclusion

This is a very cost-effective system, as it can simulate a robot; in some cases it can
even be used in place of a robot, for example in nuclear power stations, which are very
hazardous for human beings to work in. The output of this system can be taken as a multiple
of the input in order to perform heavier tasks: if a 50-gram weight is lifted on the input side,
the output can be made to lift any multiple of 50 grams. The present circuit is also open to
modifications that would lead to a still smarter project; development is possible from all
sides. It is the user's imagination that limits the working of this project, and one can go on
adding extra, rich features to it.

Appendix A

Pin details of various ICs used



A.1 Voltage Regulator KIA 7805

Fig A.1 KIA 7805

General Characteristics:

1. Output Voltage : 5V

2. Operating Temperature : 0°C to 70°C

3. Output Current : 100mA

4. Dropout Voltage : 1.7V

A.2 CA3140


Fig A.2 Amplifier CA3140

Common Mode Rejection (CMR)

Common Mode Rejection is a measure of the change in output voltage when both
inputs are changed by equal amounts. CMR is usually specified for a full-range common-mode
voltage change (CMV), at a given frequency, and a specified source impedance (e.g., 1 kΩ
source unbalance at 60 Hz). The common-mode rejection ratio is the ratio of the input CMV
to the common-mode signal appearing at the output.
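
Written as a formula (the standard textbook definition, not something specific to the CA3140), with A_d the differential gain and A_cm the common-mode gain:

    \mathrm{CMRR} = \frac{A_d}{A_{cm}}, \qquad \mathrm{CMR\ (dB)} = 20 \log_{10}\left(\frac{A_d}{A_{cm}}\right)

For instance, with the typical values listed further below (Ad = 50,000 and CMRR = 100 dB), the implied common-mode gain is Acm = 50,000 / 10^(100/20) = 0.5.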

In most instrumentation amplifiers, the CMR increases with gain because the front-
end configuration does not amplify common-mode signals, and the amount of common-mode
signal appearing at the output stays relatively constant as the signal gain increases. However,
at higher gains the amplifier bandwidth decreases, and differences in phase shift through the
signal paths introduce common-mode errors; CMR therefore becomes more frequency
dependent at high gains.

Important Note:

The instrumentation amplifiers have differential inputs, and there must be a return path
for the bias currents. If it is not provided, these currents will charge stray capacitances, causing
the output to drift uncontrollably or to saturate. Therefore, when amplifying the output of a
"floating" source, such as transformers and thermocouples, as well as AC-coupled sources,
there must be a DC path from each input to common, or to the guard terminal. If a return path
is impracticable, an isolator must be used.

Various parameters of a typical operational amplifier are as shown below:

Open loop gain Ad = 50,000

Input offset voltage Vio = 1 mV

Input offset current Iio = 10 nA

Common mode rejection ratio = 100 dB

Power supply rejection ratio = 20 µV/V

Input offset voltage drift = 1.0 µV/°C

Input offset current drift = 0.1 nA/°C

Slew rate = 1 V/µs


A.3 ADC 0809

Fig A.3 ADC 0809


A.4 Pin details of AT89C51

Fig A.4 AT89C51


Fig A.5 Block diagram of AT89C51

Features of 89C51

• 4K Bytes of ROM
• Fully Static Operation: 0 Hz to 24 MHz
• Three-level program memory lock
• 128 x 8-bit internal RAM
• 32 programmable I/O lines


• Two 16-bit Timer/Counters
• Six interrupt sources
• Programmable serial channel

The AT89C51 is a low-power, high-performance CMOS 8-bit microcontroller with
4K bytes of read-only memory (ROM). The device is manufactured using Atmel's high-
density nonvolatile memory technology. The AT89C51 provides the following standard
features: 4K bytes of ROM, 128 bytes of RAM, 32 I/O lines, two 16-bit timer/counters, six
interrupt sources, and a full-duplex serial port.

Pin Description

Port 0
Port 0 is an 8-bit open-drain bi-directional I/O port. As an output port, each pin can
sink eight TTL inputs. When 1s are written to port 0 pins, the pins can be used as high-
impedance inputs. Port 0 can also be configured to be the multiplexed low-order address/data
bus during accesses to external program and data memory. In this mode, P0 has internal pull-
ups. Port 0 also receives the code bytes during flash programming and outputs the code
bytes during program verification. External pull-ups are required during program verification.

Port 1
Port 1 is an 8-bit bi-directional I/O port with internal pull-ups. The port 1 output
buffers can sink/source four TTL inputs. When 1s are written to port 1 pins, they are pulled
high by the internal pull-ups and can be used as inputs.
In addition, on the AT89C52, P1.0 and P1.1 can be configured to be the timer/counter 2
external count input (P1.0/T2) and the timer/counter 2 capture/reload trigger input
(P1.1/T2EX), respectively. Port 1 also receives the low-order address bytes during flash
programming and verification.

Port 2
Port 2 is an 8-bit bi-directional I/O port with internal pull-ups. The port 2 output
buffers can sink/source four TTL inputs. When 1s are written to port 2 pins, they are pulled
high by the internal pull-ups and can be used as inputs. Port 2 emits the high-order address
byte during fetches from external program memory and during accesses to external data
memory that use 16-bit addresses (MOVX @DPTR). In this application, port 2 uses strong
internal pull-ups when emitting 1s. During accesses to external data memory that use 8-bit
addresses (MOVX @Ri), port 2 emits the contents of the P2 special function register.
Port 3
Port 3 is an 8-bit bi-directional I/O port with internal pull-ups. The port 3 output
buffers can sink/source four TTL inputs. When 1s are written to port 3 pins, they are pulled
high by the internal pull-ups and can be used as inputs.

Port 3 also receives some control signals for Flash programming and verification.

Port Pin Alternate Functions


P3.0 RXD (serial input port)
P3.1 TXD (serial output port)
P3.2 INT0 (external interrupt 0)
P3.3 INT1 (external interrupt 1)
P3.4 T0 (timer 0 external input)
P3.5 T1 (timer 1 external input)
P3.6 WR (external data memory write strobe)
P3.7 RD (external data memory read strobe)
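
As a small illustration of the quasi-bidirectional behaviour described above, the following Keil C51 sketch (a common 8051 C dialect; the register names P1 and P2 come from its standard <reg51.h> header) writes 1s to Port 1 so it can be read as an input, and echoes the value on Port 2:

    #include <reg51.h>              /* Keil C51 SFR declarations (P1, P2, ...) */

    void main(void)
    {
        unsigned char sample;

        P1 = 0xFF;                  /* write 1s: Port 1 pins pulled high, usable as inputs */
        while (1) {
            sample = P1;            /* read the external logic levels on Port 1 */
            P2 = sample;            /* drive the same pattern out on Port 2     */
        }
    }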

RST

Reset input. A high on this pin for two machine cycles while the oscillator is running resets
the device.

ALE/PROG

Address latch enable is an output pulse for latching the low byte of the address during
accesses to external memory. This pin is also the program pulse input (PROG) during flash
programming. In normal operation, ALE is emitted at a constant rate of 1/6 the oscillator
frequency and may be used for external timing or clocking purposes. Note, however, that one
ALE pulse is skipped during each access to external data memory.


If desired, ALE operation can be disabled by setting bit 0 of SFR location 8EH. With
the bit set, ALE is active only during a MOVX or MOVC instruction. Otherwise, the pin is
weakly pulled high. Setting the ALE-disable bit has no effect if the microcontroller is in
external execution mode.
PSEN

Program Store Enable is the read strobe to external program memory. When the
AT89C51 is executing code from external program memory, PSEN is activated twice each
machine cycle, except that two PSEN activations are skipped during each access to external
data memory.

EA/VPP

External access enable EA must be strapped to GND in order to enable the device to
fetch code from external program memory locations starting at 0000H up to FFFFH.

XTAL1
Input to the inverting oscillator amplifier and input to the internal clock operating circuit.

Special Function Registers

The on-chip memory area above the direct RAM contains the Special Function
Register (SFR) space. Note that not all of the addresses are occupied, and unoccupied
addresses may not be implemented on the chip. Read accesses to these addresses will in
general return random data, and write accesses will have an indeterminate effect. User
software should not write 1s to these unlisted locations, since they may be used in future
products to invoke new AT89C51 features. In that case, the reset or inactive values of the new
bits will always be 0.

Data Memory

There are 128 bytes of RAM in the 8051, assigned addresses ranging from 00h
to 7Fh. These 128 bytes are divided into three different groups as follows:

1. A total of 32 bytes from locations 00h to 1Fh are set aside for the register
banks and the stack.
2. A total of 16 bytes from locations 20h to 2Fh are set aside for bit-
addressable memory.
3. A total of 80 bytes from locations 30h to 7Fh are used for read/write
storage, normally called scratch-pad memory. These 80 bytes of RAM are
used by 8051 programmers for storing parameters.
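
For illustration only: in the Keil C51 dialect (its memory-space keywords data and bdata and the bit/sbit types are compiler extensions, not ANSI C), variables can be placed in these regions explicitly:

    unsigned char data  scratch[16];   /* allocated in the lower 128 bytes of on-chip RAM */
    unsigned char bdata status;        /* placed in bit-addressable RAM (20h-2Fh)         */
    sbit status_ready = status ^ 0;    /* bit 0 of 'status', reachable by bit addressing  */
    bit  done_flag;                    /* single bit, also allocated in 20h-2Fh           */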

Timer 0 and 1

Both Timer 0 and Timer 1 in the AT89C51 are 16 bits wide. Since the 8051 has an 8-
bit architecture, each 16-bit timer is accessed as two separate registers, a low byte and a high
byte.
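
A brief Keil C51 sketch of this split access (TMOD, TH0, TL0 and TR0 are the standard names from <reg51.h>; the reload value is illustrative):

    #include <reg51.h>

    void timer0_init(void)
    {
        unsigned int count = 65536u - 50000u;     /* overflow after 50,000 machine cycles */

        TMOD = 0x01;                              /* Timer 0 in mode 1 (16-bit timer) */
        TH0  = (unsigned char)(count >> 8);       /* high byte written separately     */
        TL0  = (unsigned char)(count & 0xFF);     /* low byte written separately      */
        TR0  = 1;                                 /* start Timer 0                    */
    }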

Interrupts

The AT89C51 has a total of six interrupt vectors: two external interrupts (INT0 and
INT1), two timer interrupts (Timers 0, 1), the serial port interrupt and the reset pin interrupt.
Each of these interrupt sources can be individually enabled or disabled by setting or clearing
a bit in special function register IE. IE also contains a global disable bit, EA, which disables
all interrupts at once.
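
A minimal sketch of this enable scheme in Keil C51 (the 'interrupt n' qualifier is a Keil extension; EX0, ET0 and EA are the IE bits declared in <reg51.h>):

    #include <reg51.h>

    void ext0_isr(void) interrupt 0    /* service routine for external interrupt INT0 */
    {
        /* handle the external event here */
    }

    void enable_interrupts(void)
    {
        EX0 = 1;    /* enable external interrupt 0             */
        ET0 = 1;    /* enable Timer 0 overflow interrupt       */
        EA  = 1;    /* global enable bit in IE (master switch) */
    }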

Oscillator Characteristics

XTAL1 and XTAL2 are the input and output, respectively, of an inverting amplifier
that can be configured for use as an on-chip oscillator. Either a quartz crystal or a ceramic
resonator may be used. To drive the device from an external clock source, XTAL2 should be
left unconnected while XTAL1 is driven. There are no requirements on the duty cycle of the
external clock signal, since the input to the internal clocking circuitry is through a divide-by-
two flip-flop, but minimum and maximum voltage high and low time specifications must be
observed.


DC Characteristics

Absolute Maximum Ratings

Operating Temperature: -55°C to +125°C
Storage Temperature: -65°C to +150°C
Voltage on Any Pin with Respect to Ground: -1.0V to +7.0V
Maximum Operating Voltage: 6.6V
DC Output Current: 15.0 mA


Appendix B

B.1 User defined functions

User-defined functions used:


doll( )
hand_movement( )
leg_movement( )
rotate( )
shirt( )
pant( )
face( )
movelhand( )
moverhand( )
rhand( )
lhand( )
rleg( )
lleg( )

void doll(void)
This function takes no parameters. This function is used to draw the initial static
image. It accomplishes this job by calling other functions such as hand_movement( ),
leg_movement( ), shirt( ) and pant( ).

void face(void)
This function takes no parameters. This function is used to draw the face of the image.
It accomplishes this job by executing library functions from C graphics such as ellipse( ),
fillellipse( ), sector( ), floodfill( ), line( ), arc( ), circle( ), etc.
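
As a rough idea of how such a face might be composed from these BGI primitives, a minimal Turbo C sketch (the coordinates, radii and BGI driver path are illustrative assumptions, not the project's actual values):

    #include <graphics.h>
    #include <conio.h>

    void main(void)
    {
        int gd = DETECT, gm;

        initgraph(&gd, &gm, "C:\\TC\\BGI");   /* load BGI driver; path is installation-specific */

        setcolor(WHITE);
        ellipse(320, 120, 0, 360, 40, 50);    /* outline of the face */
        circle(305, 105, 5);                  /* left eye            */
        circle(335, 105, 5);                  /* right eye           */
        arc(320, 130, 200, 340, 15);          /* mouth               */

        getch();                              /* wait for a key press */
        closegraph();
    }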

void shirt(void)
This function takes no parameters. This function is used to draw the shirt of the
image. It accomplishes this job by executing other functions, some from the C graphics library
and some user-defined. It uses the hand_movement function to draw the hands of
the image.

void hand_movement(int, int, int, int, int)

This function takes 5 integer parameters. This function is used to control the
movement of the hands of the image. It controls the movement of both arm and elbow. The
first parameter specifies the signal value for arm movement. The second parameter specifies
the signal value for elbow movement. The third parameter specifies the color to be used for
the current execution of the function, because it is alternately executed with colors black and
white for smooth transition. The fourth and fifth parameters take the value 0 or 1. The
fourth parameter is set to 1 if the left hand is to be moved; the fifth parameter is set to 1 if the
right hand is to be moved.
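
The black/white alternation described above is the usual erase-and-redraw animation idiom. A hedged sketch of the calling pattern per update step (the signal variables are illustrative; BLACK and WHITE are the BGI colour constants from graphics.h):

    /* Erase the left hand at its previous position by redrawing it in BLACK,
       then redraw it in WHITE at the new position supplied by the sensors.  */
    hand_movement(old_arm, old_elbow, BLACK, 1, 0);   /* erase: left hand only */
    hand_movement(new_arm, new_elbow, WHITE, 1, 0);   /* redraw at new pose    */
    old_arm   = new_arm;
    old_elbow = new_elbow;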

void leg_movement(int, int, int, int)

This function takes 4 integer parameters. This function is used to control the
movement of the legs of the image. The first parameter specifies the signal value for leg
movement. The second parameter specifies the color to be used for the current execution of the
function, because it is alternately executed with colors black and white for smooth transition.
The third and fourth parameters take the value 0 or 1. The third parameter is set to 1 if the left
leg is to be moved; the fourth parameter is set to 1 if the right leg is to be moved.

void lhand(int, int, int)

This function takes 3 integer parameters. This function is used to draw the left hand of
the image. The first parameter specifies the color to be used for the current execution of the
function, because it is alternately executed with colors black and white for smooth transition.
The second parameter specifies the signal value for left arm movement; it indicates the exact
position of the left hand for that execution of the function. The third parameter specifies the
signal value for left elbow movement; it indicates the exact position of the left elbow for that
execution of the function.

void rhand(int, int, int)

This function takes 3 integer parameters. This function is used to draw the right hand
of the image. The first parameter specifies the color to be used for the current execution of the
function, because it is alternately executed with colors black and white for smooth transition.
The second parameter specifies the signal value for right arm movement; it indicates the
exact position of the right hand for that execution of the function. The third parameter
specifies the signal value for right elbow movement; it indicates the exact position of the
right elbow for that execution of the function.

void lleg(int, int, int, int, int)

This function takes 5 integer parameters. This function is used to draw the left leg of
the image. The first four parameters are the (x, y) positions of the two points used to draw the
lines that form the border of the left leg. The last parameter specifies the color to be used
for the current execution of the function, because it is alternately executed with colors black and
white for smooth transition.

void rleg(int, int, int, int, int)

This function takes 5 integer parameters. This function is used to draw the right leg of
the image. The first four parameters are the (x, y) positions of the two points used to draw the
lines that form the border of the right leg. The last parameter specifies the color to be
used for the current execution of the function, because it is alternately executed with colors black
and white for smooth transition.

void pant(void)
This function takes no parameters. This function is used to draw the pant of the
image. It accomplishes this job by executing other functions, some from the C graphics library
and some user-defined. It uses the leg_movement function to draw the legs of the
image.
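
Taken together, the functions above suggest a redraw loop of the following shape. This is only a sketch: read_signal( ) and the channel constants are hypothetical stand-ins for the routine that fetches sensor values from the interface hardware.

    #define ARM_CHANNEL   0
    #define ELBOW_CHANNEL 1

    extern int read_signal(int channel);    /* hypothetical: fetch a sensor value */

    void animate(void)
    {
        int arm, elbow;
        int old_arm = 0, old_elbow = 0;

        doll();                                 /* draw the initial static image */
        for (;;) {
            arm   = read_signal(ARM_CHANNEL);   /* current sensor readings       */
            elbow = read_signal(ELBOW_CHANNEL);
            hand_movement(old_arm, old_elbow, BLACK, 1, 1);   /* erase old pose  */
            hand_movement(arm, elbow, WHITE, 1, 1);           /* draw new pose   */
            old_arm   = arm;
            old_elbow = elbow;
        }
    }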


Appendix C

C.1 Peripheral Interface Standards

The large variety of plug-ins, add-ons and ancillary equipment forces the user to look
a little carefully into the hardware and software aspects of interfacing various peripherals to a
PC.

The open architecture of the IBM PC has helped its 62-pin I/O channel bus become a
universally accepted interface standard for attaching extra hardware to a PC. A
peripheral, in the strict sense, is a piece of equipment which is in itself a separate entity and gets
attached to the PC through a specific connection protocol called an Interface Standard.

The block diagram in Fig. C.1 illustrates a typical scheme chosen for peripheral
interfacing. Here, different sockets have been provided for each one of the peripherals, which
are not interchangeable among themselves. If one looks inside a PC, all these sockets
originate from printed circuit cards that share a common bus, called the I/O channel bus (and
sometimes the PC bus), a name given to the set of signals available on the 62-pin edge
connectors on the PC motherboard.

Fig C.1 Block diagram of peripherals connected to the processor
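
Software reaches such a card through port-mapped I/O. A minimal Turbo C sketch (inportb( ) and outportb( ) are the Turbo C routines declared in <dos.h>; the base address 0x300 is a hypothetical setting, a commonly free slot on the PC I/O map, not the project's actual address):

    #include <dos.h>                 /* inportb() / outportb() in Turbo C */

    #define CARD_BASE 0x300          /* hypothetical base address of the interface card */

    unsigned char read_card(void)
    {
        return inportb(CARD_BASE);          /* read the card's data register  */
    }

    void write_card(unsigned char value)
    {
        outportb(CARD_BASE, value);         /* write the card's data register */
    }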


BIBLIOGRAPHY:
1. Muhammad Ali Mazidi, The 8051 Microcontroller and Embedded Systems,
Pearson Education Asia.
2. Kenneth J. Ayala, The 8051 Microcontroller, Prentice Hall, 3rd Edition.
3. Haptic Technologies, http://www.haptech.com/prod/index.htm, 1999.
4. Proceedings of SPIE's 8th Annual International Symposium on Smart
Structures and Materials, 5-8 March 2001, Newport, CA, Paper No. 4329-47,
SPIE © 2001.
5. www.wikipedia.org
6. www.atmel.com
