Virtual Reality For Robotics

1. INTRODUCTION
Technological breakthroughs have allowed automation to replace many jobs that are repetitive or hazardous to human health, displacing human operators from many work environments. Automation has also enabled a new phenomenon: the remote control of physical systems by humans through the mediation of computers. There are still many jobs that are non-repetitive, unpredictable and hazardous to humans; cleaning up a nuclear power plant leak or exploring the extreme depths of the ocean are just two examples. Relying on specially designed automatic machines for such jobs may prove less efficient, less flexible and less cost-effective than a human operator. Therefore, remote control by a human operator using video inspection and master-slave mechanical manipulation, called tele-operation, is used. With the growing popularity of robotics and of research in the field, robots may soon take over many human activities.

This project is an effort to simulate a robot: it converts real-time actions into digital signals, which in turn are used to produce virtual actions. The software for this project acts as an interface between the hardware devices and the controlling computer. Inputs to the system are taken from sensors fitted on the intended part or body; these sensors supply the real-time condition of that body to the computer. The software processes these signals and, according to the received signal levels, moves the graphical picture on the monitor. In other words, the virtual movement is created by the software, while the real actions are captured by an array of sensors.

The project illustrates the working of a simple robot and demonstrates the use, implementation and applications of virtual reality technology. It is designed using C and C graphics and is connected to external hardware; the simulator accepts commands from the hardware and performs the desired task accordingly. The project finds applications in the fields of tele-medicine and tele-robotics. It centres on the C graphics and peripheral interface parts: the objective is to interface an image on the PC with external hardware and to control image properties such as shape, color and size by operating that hardware.



The project consists of the following blocks:
1. Software (C and C graphics)
2. Interface
3. Hardware

The software reads the data from the interface, generated by the external hardware, and interprets it into suitable image movements. For different data, the image behaves in a different pre-programmed manner. For example, if the joystick is bent by 45 degrees, the figure on the computer screen also bends by 45 degrees, and so on; a minimal sketch of this mapping is given below.
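As an illustration only, the sketch below shows how a raw 8-bit sensor reading might be scaled to an on-screen joint angle. The routine name and the linear 0-255 to 0-180 mapping are assumptions for illustration, not the project's actual code.

#include <stdio.h>

/* Hypothetical sketch: map an 8-bit sensor reading (0-255)
   to a joint angle (0-180 degrees) for the on-screen figure. */
static int reading_to_angle(unsigned char reading)
{
    return (reading * 180) / 255;   /* linear scaling */
}

int main(void)
{
    unsigned char sample = 64;      /* e.g. joystick bent a quarter of its range */
    printf("angle = %d degrees\n", reading_to_angle(sample));   /* prints 45 */
    return 0;
}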

1.1 Main features of the project:
1. Economical in nature, simple in design and operation.
2. Highly flexible: the system can be used as a stand-alone unit or as a remote interfaced one.
3. Due to the computer interface facility, the system is highly accurate and automatic in nature.
4. Since the system is associated with a PC, it is easy to store the data, keep track of the parameters and analyze them for further study.
5. Once the system is activated and set to receive the parameters, the whole project can be used as a computerized data logger for storing the parameters, as sketched below.
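A minimal sketch of such a data-logger loop is shown below. The sampling routine read_sensor and the log format are assumptions for illustration; in the real system the values would come from the ADC via the I/O interface.

#include <stdio.h>
#include <time.h>

/* Hypothetical sensor read standing in for the ADC interface. */
static int read_sensor(int channel)
{
    return (channel * 10) % 256;    /* placeholder value */
}

int main(void)
{
    int ch;
    FILE *log = fopen("sensors.log", "a");
    if (log == NULL)
        return 1;

    for (ch = 0; ch < 6; ch++)      /* six sensors, as in this project */
        fprintf(log, "%ld,ch%d,%d\n", (long)time(NULL), ch, read_sensor(ch));

    fclose(log);
    return 0;
}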



2. PREVIEW OF VIRTUAL REALITY TECHNOLOGY
2.1 Virtual Reality: An Introduction
The term 'Virtual Reality' (VR) was initially coined by Jaron Lanier, founder of VPL Research (1989). Other related terms include 'Artificial Reality' (Myron Krueger, 1970s), 'Cyberspace' (William Gibson, 1984) and, more recently, 'Virtual Worlds' and 'Virtual Environments' (1990s). Today, 'Virtual Reality' is used in a variety of ways and often in a confusing and misleading manner. Virtual reality is a hot theme, as is evident in recently released Hollywood films such as the 'Spy Kids' and 'Spider-Man' series. The concept is an outcome of one of the most exciting and challenging fields of automation, Artificial Intelligence. As the name implies, the virtual reality concept allows the user to do work a hundredfold heavier than what he actually handles. For example, by moving a small model of the Himalayan mountains while sitting inside a laboratory, one could move the real mountain with the help of machinery situated near it. This is an exaggerated application of the virtual reality concept, but it succeeds in showing its power.

The concept is clearly understood by taking the 'Warm Hand' project as an example, conducted jointly by universities in the US and Britain. The two universities constructed two human hands using highly sensitive materials covered with innumerable sensors. Each hand is connected to a computer that is connected to the Internet. The arrangement of both hands is as shown in fig 2.1.

Fig 2.1 The warm hand project

When any person shakes hand A, situated in the US university, all the parameters of the hand-shake — the grip level, the pressure applied by the hand-shaker, pulse rate, body temperature, inclined angle etc. — are sensed. These parameters are picked up by sensitive sensors spread throughout the hand-gloves and stored in the memory of the computer, which records every change in the parameters during the hand-shaking action. This stored data is transmitted quickly to the Britain university's computer via a broadband Internet connection. There it is converted back to the original signals and fed to hand B. If anybody shakes hand B, he will feel that the actual person who is shaking hand A, at the other corner of the world, is shaking hands with him; here the 'parameter' even includes the warmth of the hand. This project really demonstrates the application of the virtual reality concept: doing remote work, working with machines of a hundredfold the real capacity, and so on. Let us see some examples where this killer application can be implemented.

2.2 Tele-Medicine
Suppose there is only one specialist available in the world, who cannot move around to help or perform operations on patients lying on their death-beds. The virtual reality concept brings the patient to the doctor's operation table. The doctor can sit in his own hospital, which has a virtual reality facility, and perform operations on various patients around the world at a time. The doctor has one model — a virtual body — which resembles the human patient and is fitted with sophisticated sensors all around. On the other side, the real patient, on whom the operation has to be performed, is kept in another virtual-reality-facilitated hospital far away. All the parameters of the patient — heart beat, pulse rate, body temperature, body position etc. — are sensed, stored and sent quickly to the doctor via a broadband Internet connection. With the help of these received parameters the virtual body tries to act like the real one; it imitates the real body. The doctor can check the pulse rate, heart beat and body temperature of the virtual body as if it were the real body. This helps the specialist understand the patient's condition, and accordingly he can operate. Such virtual reality applications play a very important role in saving lives.

Fig 2.2 shows how this tele-medicine system works.

Fig 2.2 A tele-medicine system

2.3 Brief about this project
This 'Virtual Reality for Robotics' project implements the better half of the full virtual reality concept. In this project, a number of sensors are connected to a person, and a doll and a monitor are used as output devices. The output reflects the actions of the input devices: for example, movement of a glove or of the different parts of the suit results in the movement of the corresponding parts of the doll and of the graphics on the monitor, as shown in fig 2.3.

Fig 2.3 Arrangement of the project

Here the person standing for demonstration is fitted with a number of sensors, which are normally movement detectors. In this picture only six sensors are used; the number can be increased or decreased as per one's requirements. Sensors S1 and S2 are connected to observe the shoulder positions, S3 and S4 are used to get the bending action of the hands, and S5 and S6 are used to watch the movements of the legs. These sensors are connected to the computer's CPU via a sensor cable bunch. The person's picture is created on the screen of the computer using the C++/C graphics language. Whenever the person makes any movement, the sensors sense the action and send it to the computer. The C program inside the computer processes the data received via the sensor cable bunch and instructs the C++/C graphics routines to move the respective parts of the graphical picture. Different actions and their respective movements on screen can be observed.

Fig 2.4 shows what happens when the person lifts both hands up. The sensors S1 and S2 detect that both hands are lifted. The sensors fitted on the elbows of both hands send a signal to the computer that they are intact, and the sensors S5 and S6 sense no action, as the person has not changed his standing position. Thus, processing the entire sensor data, the computer's software concludes that the person has lifted both hands up, and the C program instructs the C++/C graphics routines to make the necessary changes to the graphical representation on screen. A minimal sketch of this kind of inference is given below.

Fig 2.4 A person lifting up both hands
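The sketch below illustrates, under assumed names, how six binary movement-detector states might be combined into a posture decision. It is a simplification: the project's actual program (section 5) works from analog channel values rather than on/off states.

#include <stdio.h>

/* Hypothetical sensor states: 1 = movement detected, 0 = at rest.
   s[0],s[1] = shoulders, s[2],s[3] = elbows, s[4],s[5] = legs. */
static const char *classify(const int s[6])
{
    if (s[0] && s[1] && !s[2] && !s[3] && !s[4] && !s[5])
        return "both hands lifted up";
    if (s[4] || s[5])
        return "leg movement";
    return "standing still";
}

int main(void)
{
    int states[6] = {1, 1, 0, 0, 0, 0};   /* only the shoulder sensors active */
    printf("posture: %s\n", classify(states));
    return 0;
}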

Fig 2.5 Actual virtual reality system of this project

Even though this project depicts only one half of the virtual reality concept, it demonstrates how data can be gathered and stored in the local computer. Instead of transmitting the sensed data to a remote computer for further processing, the data is used to move the graphical element on the same local computer. As fig 2.5 makes evident, sending the sensed data to a remote computer for further processing would not be a big deal.

3. BLOCK DIAGRAM

The 'Virtual Reality for Robotics' system is built using several blocks. The building blocks and their inter-relations are shown in fig 3.1.

Fig 3.1 Block diagram of the hardware unit

Explanation of the block diagram:

This project has a number of sensors/transducers connected across the intended body [the whole body, as in EEG scanning, or parts only, such as the hands or legs, to observe that part's movement] to sense its activity. As soon as any sensor senses a change in its current situation, it sends a message to its respective instrumentation amplifier. These weak sensed signals are given to the instrumentation amplifier block for further amplification. The block diagram is explained by taking each block separately.

Sensor Block: This block contains a number of sensors or transducers spread across the intended body. Several types of sensors are used as per one's requirements: tactile switches are used to sense the bending of arms and ankles, and micro-switches or sliding potentiometers are used to sense the movement of hands, legs or any other body part.

Instrumentation Amplifier Block: The output of a sensor is too feeble to drive any further sections, so some amplification is needed before the sensed signals can be used. This amplification is done using a reliable amplifier circuit called an instrumentation amplifier, which is specially used in the instrumentation field to amplify weak sensed signals from sensors spread all over an area [for example, in industries several sensors will be fitted to check many parameters, viz. the speed of a motor, the motion of a conveyor belt, the level of material in storage silos etc.]. The instrumentation amplifier amplifies the received weak signal to a sufficient level and feeds it into an ADC.

ADC: The conversion of the analog signals received from the instrumentation amplifier block is done by this ADC block. The ADC, or analog-to-digital converter, as the name implies, converts input signals that are in analog form into digital form, so that further sections can process them easily. This is necessary as the computer can understand and process only digital data. The output of the ADC is fed to a buffer section, where unity-gain amplification is done and which also provides proper impedance matching between the I/O interface and the ADC.

I/O Interface: This stage receives the boosted sensed signals from the buffer and feeds them to the computer port for further processing. It also ensures that the signal levels are compatible with the TTL signals of the computer port.

Personal Computer: The brain of this project is the computer. With the help of the interfacing software module, the computer processes the sensed signals and gives out the output in graphical form.

Software Module: The software module is responsible for producing the graphical output on the computer's screen, which depends on the received sensed signal levels. The core software [i.e., the part receiving the sensed signal levels] is written in the popular language C, and its graphics part is in C graphics and C++.

Power Supply Unit: This specially designed power supply provides +12 V for the op-amp, relay and driver stage, and +5 V for the rest of the digital and analog circuits.

4. CIRCUIT DESCRIPTION
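Viewed end-to-end, the block diagram is a simple pipeline. The sketch below models it in C purely to make the data flow concrete; the gain, reference voltage and sample value are assumed figures, not the project's measured ones.

#include <stdio.h>

/* Toy model of the signal chain: sensor -> in-amp -> ADC -> PC. */
static double amplify(double v) { return v * 100.0; }               /* assumed gain of 100 */
static int adc(double v)        { return (int)(v * 255.0 / 5.0); }  /* 8-bit, 5 V reference */

int main(void)
{
    double sensed = 0.02;                        /* 20 mV from a sensor */
    int code = adc(amplify(sensed));             /* 2 V after amplification */
    printf("ADC code sent to PC: %d\n", code);   /* prints 102 */
    return 0;
}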

The circuit construction of this 'Virtual Reality for Robotics' project is divided into these parts: the sensors block, the instrumentation block, the ADC, the I/O interface stage and finally the power supply unit.

4.1 SENSORS BLOCK
This block holds all the sensors used in the project in one place. Before going to the block explanation, let us first see what a sensor or transducer is and where it is used.

Transducer: A transducer, also known as a sensor, is a device which converts a physical parameter to be measured — pressure, temperature, movement etc. — into a corresponding analogous quantity, such as a voltage or current. If the transducer converts the physical quantity into useful electronic signals, it is an electrical transducer. Depending upon the requirement, different types of sensors or transducers are used.

Optical Sensors: An optical sensor converts light that falls on it into electronic signals. Optoelectronic devices such as Light Emitting Diodes [LEDs] can generate electromagnetic energy, while devices such as Light Dependent Resistors [LDRs] can sense visible light energy. Visible light occupies the range from about 400 nm to 600 nm; the spectral response of an LDR peaks at about 550 nm and falls rapidly below 500 nm and above 650 nm. A typical 1 MΩ-rated LDR exhibits a resistance which varies from 400 Ω under bright room lighting [1000 lux] to as much as 1 MΩ in total darkness. This sensor is generally useful in light-sensing equipment and illumination-level sensing applications. Although various types of optical sensors are available, LDRs are employed for general-purpose applications; a hedged sketch of reading such a sensor through a divider follows this list of sensor types.

Sound Sensors: Electronic devices such as loudspeakers can generate acoustic energy, while devices such as condenser microphones can sense audible energy. The speech band commonly handled is from about 300 Hz to 3000 Hz. Although various types of sound sensors are available, microphones are employed for general-purpose applications. Such a sensor is useful in listening-bug or listening-amplifier circuits.

Temperature Transducers: Electronic measurement of temperature is of vital concern in modern industrial practice, hence a wide range of devices has been developed to serve as temperature transducers. Some of the important temperature transducers are thermistors, thermocouples and piezoelectric crystals.
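Since the LDR is read as a voltage, a minimal sketch of the conversion is shown below. The 10 kΩ divider resistor and the 5 V supply are assumptions for illustration; only the 400 Ω / 1 MΩ resistance swing comes from the text above.

#include <stdio.h>

/* Hypothetical LDR divider: Vout = Vcc * R_fixed / (R_fixed + R_ldr).
   R_ldr swings from ~400 ohms (bright, 1000 lux) to ~1 Mohm (dark). */
#define VCC     5.0
#define R_FIXED 10000.0     /* assumed 10 kohm lower-leg resistor */

static double divider_volts(double r_ldr)
{
    return VCC * R_FIXED / (R_FIXED + r_ldr);
}

int main(void)
{
    printf("bright: %.2f V\n", divider_volts(400.0));   /* ~4.81 V */
    printf("dark:   %.2f V\n", divider_volts(1e6));     /* ~0.05 V */
    return 0;
}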

Movement Detectors: A sensor which detects the movement or change of position of the body/part it is fitted to, and sends a message in electronic signal form, is called a movement detector. Usually tactile switches, reed relays or slider potentiometers are used for detecting the movements. To detect the bending of arms and ankles, movement detectors can be used; to detect the movements of legs, hands or any other body part, sliding potentiometers can be used. The choice of sensors or transducers is purely based on one's individual requirements, and the project allows any number of sensors to be used.

Circuit Description: As the circuit in fig 4.1 shows, this sensor block brings all sensor terminals to one common point. From here, each sensor is connected to an individual instrumentation amplifier circuit as an input signal.

Fig 4.1 Circuit diagram of the sensor block

4.2 INSTRUMENTATION BLOCK

This block amplifies the sensed signals to a sufficient level and feeds them to the ADC card for further processing. Before explaining the actual circuit, let us see some terms and their definitions in detail.

Amplifier: A common theme in many electronic circuits is the need to increase the level of voltage, current or power present. This need is satisfied by some form of amplifier. The important parameters in amplifier circuit design are gain, input impedance, output impedance, bandwidth, frequency response, phase shift and efficiency. Not all of these characteristics will be important in any given application; an essential first stage in designing an amplifier is to select those characteristics which are important and then specify the parameters required.

Instrumentation Amplifiers: The low-level signal outputs of electrical transducers often need to be amplified before further processing; this is carried out with an instrumentation amplifier. Instrumentation amplifiers, or differential amplifiers, are specifically designed to extract and amplify small differential signals from much larger common-mode voltages. An ideal instrumentation amplifier responds only to the difference between the input voltages: if the input voltages are equal, its output is zero. An instrumentation amplifier is a committed "gain block" that measures the difference between the voltages existing at its two input terminals and amplifies it by a precisely set gain, usually from 1 to 100 or more, causing the result to appear between a pair of terminals in the output circuit. An amplifier circuit optimized for performance as an instrumentation-amplifier gain block has high input impedance, low offset and drift, low non-linearity, stable gain and low effective output impedance. These amplifiers are also used to provide impedance matching and isolation, and are commonly used in applications which capitalize on these advantages: amplification for thermocouples, strain-gauge bridges, current shunts and biological probes, and pre-amplification of small differential signals superimposed on high common-mode voltages. To serve as building blocks in instrumentation amplifiers, op-amps must have very low offset voltage drift, high gain and wide bandwidth. The popular ICs used to build instrumentation amplifiers are the LM324 and the µA741. The present circuit shows only one instrumentation amplifier stage; note that each sensor needs its own instrumentation amplifier. The circuit, which takes its input from any type of sensor, can be used for any sensor that needs amplification.

Instrumentation amplifiers have two high-impedance input terminals, a set of terminals for gain programming, an "output" terminal and a pair of feedback terminals labeled sense and reference, as well as terminals for power supply and offset trim. Instrumentation amplifiers are usually chosen in preference to user-assembled op-amp circuitry where precision demands it, as they offer better common-mode rejection than conventional data amplifiers. Such amplifiers are used when low-level signals must be processed on top of high common-mode voltages, and where possibilities exist of troublesome ground disturbances and ground loops. The primary features of an instrumentation amplifier are high rejection of noise as well as of d.c. and transient signals common to both input terminals, and inherent protection from high-voltage differences between the input and output ports and between the high and low input terminals.

The selection of an instrumentation amplifier may fall into one of three categories: industrial, instrumentation and medical. In industrial use, systems operate in noisy environments and the processing circuitry must be protected in case of fault conditions; this applies to both input protection and input/output isolation. If the application calls for high common-mode voltages (typically, voltages in excess of the amplifier supply voltage), or if the isolation impedance must be very high, the designer should consider an isolation amplifier. An isolation amplifier has its signal input circuit isolated from the signal output and power input circuits, i.e. there is no d.c. path from the signal input to amplifier ground, only a very high leakage impedance withstanding, say, 50 Hz at 120 V between the input and output ports. The instrumentation amplifier is used where it is desirable to eliminate the d.c. path from the signal input to amplifier ground. In medical monitoring applications, where patient isolation is required, amplifiers must first and foremost protect the patient from leakage currents and amplifier fault currents. The instrumentation or data-acquisition application may not necessarily involve the extreme hazards of medical or industrial applications, but signal conditioning and (moderate) isolation are still needed wherever the common "ground" is noisy or of questionable integrity. High-accuracy performance with low-level transducer signals can be enhanced by choosing an amplifier with isolated power available for a low-drift signal-conditioning pre-amplifier.
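As a rough numerical illustration (not the project's circuit), the sketch below models an instrumentation amplifier as a gain applied to the differential input plus a small common-mode leakage set by the CMRR. The gain of 100 and CMRR of 100 dB are assumed figures.

#include <stdio.h>
#include <math.h>

/* Idealized in-amp model: Vout = G*(vp - vn) + (G/CMRR)*Vcm.
   G = 100 and CMRR = 100 dB are assumptions for illustration. */
static double inamp_out(double vp, double vn)
{
    const double gain = 100.0;
    const double cmrr = pow(10.0, 100.0 / 20.0);   /* 100 dB -> 1e5 */
    double vdiff = vp - vn;
    double vcm = (vp + vn) / 2.0;
    return gain * vdiff + (gain / cmrr) * vcm;
}

int main(void)
{
    /* 10 mV sensor signal riding on a 2 V common-mode level */
    printf("Vout = %.4f V\n", inamp_out(2.010, 2.000));   /* ~1.0020 V */
    return 0;
}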

Circuit Description:
Fig 4.2 shows a simple module where the main circuit needs a transducer that gives varying voltage levels with respect to varying input signals. The heart of this module is the op-amp CA3140, whose gain is very high; small signal levels are thus easily picked up and amplified to the required level. The sensor is connected at the non-inverting input, pin 3, and the output is observed at pin 6. The op-amp is in unity-gain non-inverting voltage amplifier mode. This voltage-follower configuration employs one hundred percent negative voltage feedback and combines a very high input resistance with a very low output resistance, so the sensor's sensed signal is carried to the next stage easily. To stabilize the signals sensed by the sensor, a two-biasing-resistor [R1 and R2] arrangement is used. As the parameter [i.e. movement, bending etc.] increases in the vicinity of the sensor, the output goes on increasing; since the operation of this circuit is analog, this output signal is further amplified and fed to the ADC circuit for conversion.

Fig 4.2 Instrumentation amplifier

Table 4.1 lists the various components used in the amplifier circuit.

Table 4.1 Components List

SEMICONDUCTORS
IC1      CA3140 op-amp IC             1
RESISTORS
R1       5.6 kΩ, ¼ W carbon type      1
R2       3.9 kΩ, ¼ W carbon type      1
MISCELLANEOUS
C1       47 µF / 25 V electrolytic    1
Sensor   LM335 temperature sensor     1

4.3 Analog-to-Digital Converter
A/D conversion is a quantizing process whereby an analog signal is represented by equivalent binary states; it is the opposite of the D/A conversion process. Analog-to-digital converters can be classified into two general groups based on the conversion technique. The first technique involves comparing the given analog signal with an internally generated equivalent signal; this group includes successive-approximation, counter and flash-type converters. The second technique involves changing the analog signal into time or frequency and comparing these new parameters against known values; this group includes integrating converters and voltage-to-frequency converters. The trade-off between the two techniques is accuracy vs. speed: the successive-approximation and flash types are faster but generally less accurate than the integrating and voltage-to-frequency types. Integrating converters are therefore used in applications such as digital meters, panel meters and monitoring systems, where conversion accuracy is critical, while successive-approximation converters are used in applications such as data loggers and instrumentation, where conversion speed is important. Another critical parameter in A/D conversion is the conversion time, defined as the total time required to convert an analog signal into its digital output; it is determined by the conversion technique used and by the propagation delay in the various circuits. A software sketch of the successive-approximation search is given below.
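The sketch below is a minimal software model of the successive-approximation technique: it homes in on the input one bit at a time by comparing it against an internally generated level (here simulated in code; a real converter uses a DAC and a comparator).

#include <stdio.h>

/* Successive approximation, 8 bits: try each bit from MSB to LSB,
   keep it if the trial level does not exceed the input. */
static unsigned char sar_convert(double vin, double vref)
{
    unsigned char code = 0;
    int bit;
    for (bit = 7; bit >= 0; bit--) {
        unsigned char trial = code | (1u << bit);
        double level = vref * trial / 256.0;   /* internally generated level */
        if (vin >= level)
            code = trial;                      /* keep the bit */
    }
    return code;
}

int main(void)
{
    printf("code = %u\n", sar_convert(2.5, 5.0));   /* 2.5 V of 5 V -> 128 */
    return 0;
}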

BASIC CONCEPTS
Fig 4.3 shows the block diagram of a 3-bit A/D converter. It has one input line for the analog signal and three output lines for the digital signals, giving eight (2³) discrete output states from 000 to 111 (binary), each step being 1/8 V apart; this step size is defined as the resolution of the converter. Fig 4.4 shows the graph of the analog input voltage (0 to 1 V) and the corresponding digital output signal; the quantization rule is illustrated in the short sketch below.

Fig 4.3 3-bit A/D converter
Fig 4.4 Graph of analog-to-digital conversion
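The quantization rule can be checked with a few lines of C. The 3-bit width and 1 V full scale follow the figure; the example input is arbitrary.

#include <stdio.h>

#define BITS       3
#define FULL_SCALE 1.0                      /* volts, as in fig 4.4 */

int main(void)
{
    int levels = 1 << BITS;                 /* 8 states: 000..111 */
    double lsb = FULL_SCALE / levels;       /* 1/8 V per step */
    double vin = 0.40;                      /* example input */
    int code = (int)(vin / lsb);            /* truncating quantizer */
    if (code > levels - 1)
        code = levels - 1;
    printf("LSB = %.3f V, code for %.2f V = %d\n", lsb, vin, code);   /* code 3 */
    return 0;
}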

Circuit Description:
This 8-bit analog-to-digital converter is built around the industry-standard ADC0809 from National Semiconductor. The 28-pin DIL-packaged IC contains an 8-channel analog multiplexer and can directly interface up to 8 analog inputs in the range 0 to 5 V. It has a small conversion time, about 100 microseconds, with 15 mW power consumption. The ADC interface consists of a NOR-gate crystal oscillator built around U1, U2 and a quartz crystal, followed by a CMOS clock divider which feeds 768 kHz as the input clock to the ADC [at pin 10]. The stable 5 V reference from an LM336, buffered by the op-amp U6 in reference-and-buffer mode, gives Ref+ at pin 12; a multiturn cermet preset [R18] allows adjustment of the reference voltage. The usual +12 V supply is stepped down to +5 V, the working voltage of the ADC0809, and fed to U4's Vcc pin 11; this regulation is achieved by the 10-pin [TO-packaged] regulator IC 723 [U5] with a current-limiting zener diode D1. The channel-select, start-conversion and output-enable lines are interfaced through port lines: the ADC provides the three channel-select lines PB0, PB1 and PB2 for the user to select a channel; the PB5 line provides the Address Latch Enable [ALE] signal to hold the present address; the PB6 line asks the ADC to start conversion in either polled or interrupt mode; and PB7 is the input-enable signal line.

4.4 RS232 Board Interface Section
Fig 4.5 shows the RS232 board interface section, which uses a MAX232 5 V-to-RS232 converter chip. It converts the 0-5 V TTL levels at the ATMEL 89C51 pins to the +12 V/-12 V levels used in RS232 links. As is common with these devices, it inverts the data during the conversion; the ATMEL 89C51 USART hardware is designed to take account of this, but for software serial communications you need to make sure that you invert both the incoming and the outgoing data bits. The two closed links on the RC7 and RC6 lines are for connection to the Atmel 89C51 board, and are the two top wire links shown on the top view of the board; the two open links on the RC1 and RC2 lines are for the 16F628 board

(the 16F628 uses RB1 and RB2 for its USART connection), and are the two top track breaks shown on the bottom view of the board. So, for use with the Atmel 89C51 board, fit the top two wire links and don't cut the two top track breaks; for the 16F628, leave the top two links out and cut the top two tracks shown. This only applies if you are using the hardware USART; for software serial communications you can use any pins you like. Although it is labeled as connecting to port 1 of the Atmel 89C51 processor board, it can also be connected to other ports if required. A hedged sketch of a hardware-USART setup follows.

Fig 4.5 RS232 interfacing circuit diagram
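As an illustration of the hardware-USART route, a minimal 89C51 setup for 9600 baud is sketched below. The 11.0592 MHz crystal and the register values are standard 8051 conventions, assumed here rather than taken from this project's listing.

#include <at89x51.h>

/* Minimal sketch: 8051 UART in mode 1 (8-bit), 9600 baud,
   assuming an 11.0592 MHz crystal. */
void uart_init(void)
{
    SCON = 0x50;    /* mode 1, receive enabled */
    TMOD = 0x20;    /* timer 1, mode 2 (8-bit auto-reload) */
    TH1  = 0xFD;    /* reload value for 9600 baud @ 11.0592 MHz */
    TR1  = 1;       /* start timer 1 */
}

void uart_send(char c)
{
    SBUF = c;
    while (TI == 0)     /* wait for transmit-complete flag */
        ;
    TI = 0;
}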

4.5 Power Supply

Circuit Description: A d.c. power supply which maintains the output voltage constant irrespective of a.c. mains fluctuations or load variations is known as a regulated d.c. power supply. As shown in fig 4.6, this laboratory power supply offers excellent line and load regulation and output voltages of +5 V and +12 V at output currents up to one ampere. It is a full-wave regulated power supply: both half-cycles are rectified using a centre-tapped transformer and two diodes. The components used are listed in table 4.2.

1. Step-down Transformer: The transformer rating is 230 V AC at the primary and 12-0-12 V, 1 A across the secondary winding; it can therefore deliver a current of 1 A, which is more than enough to drive an electronic circuit or a varying load. The 12 V AC appearing across the secondary is the RMS value of the waveform, so the peak value is 12 × 1.414 ≈ 17 V. This value limits our choice of rectifier diode to the 1N4007, which has a PIV rating of more than 17 V. (These numbers are checked in the sketch after table 4.2.)

2. Rectifier Stage: The two diodes D1 and D2 are connected across the secondary winding of the transformer as a full-wave rectifier. During the positive half-cycle of the secondary voltage, end A of the secondary winding becomes positive and end B negative; this makes diode D1 forward biased and diode D2 reverse biased, so D1 conducts while D2 does not. During the negative half-cycle, end A becomes negative and end B positive, so D2 conducts while D1 does not. Note that the current through the centre-tap terminal flows in the same direction for both half-cycles of the input a.c. voltage; therefore a pulsating d.c. is obtained at point 'C' with respect to ground.

3. Filter Stage: Capacitor C1 is used for filtering and is connected across the rectifier output. It filters out the a.c. components present in the rectified d.c. and gives a steady d.c. voltage. As the rectifier voltage increases it charges the capacitor and also supplies current to the load; when the capacitor has charged to the peak value of the rectifier voltage and the rectifier voltage starts to decrease, the capacitor discharges into the load. The next voltage peak immediately recharges the capacitor, so the discharge period is of very small duration and very little ripple is observed in the filtered output; during the positive half-cycle the capacitor stores energy in the form of an electrostatic field, and during the negative half-cycle it releases the stored energy to the load. Moreover, the output voltage is higher, as it remains substantially near the peak value of the rectifier output voltage. The same behaviour can be stated another way: the shunt capacitor offers a low-reactance path to the a.c. components of the current and an open circuit to the d.c. component.

4. Voltage Regulation Stage: Across point 'D' and ground there is rectified and filtered d.c. In the present circuit a KIA7812 three-terminal voltage regulator IC is used to get +12 V, and a KIA7805 regulator IC is used to get the +5 V regulated d.c. output. Of the three terminals, pin 1 is the input, to which the rectified and filtered d.c. is connected; pin 2 is the common pin and is grounded; and pin 3 gives the stabilized d.c. output to the load, which can be connected to the required circuit. The circuit shows two more decoupling capacitors, C2 and C3, which provide a ground path to high-frequency noise signals. Across points 'E' and 'F' with respect to ground, the +5 V and +12 V stabilized (regulated) d.c. outputs are measured.

Fig 4.6 Circuit diagram of the +5 V & +12 V full-wave regulated power supply

Table 4.2 Components List

SEMICONDUCTORS
IC1        7812 regulator IC                               1
IC2        7805 regulator IC                               1
D1 & D2    1N4007 rectifier diodes                         2
CAPACITORS
C1         1000 µF / 25 V electrolytic                     1
C2         0.1 µF ceramic disc type                        1
MISCELLANEOUS
X1         230 V AC pri., 14-0-14, 1 A sec. transformer    1
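The sketch below reproduces the peak-voltage arithmetic and adds a standard full-wave ripple estimate, Vr ≈ I / (2·f·C). The 100 mA load current is an assumed figure for illustration.

#include <stdio.h>
#include <math.h>

int main(void)
{
    double vrms = 12.0;                  /* secondary RMS voltage */
    double vpeak = vrms * sqrt(2.0);     /* ~16.97 V at the rectifier */

    /* Full-wave ripple estimate: Vr = I / (2*f*C).
       50 Hz mains, C1 = 1000 uF; the 100 mA load is assumed. */
    double iload = 0.1, f = 50.0, c = 1000e-6;
    double vripple = iload / (2.0 * f * c);

    printf("peak = %.2f V, ripple ~ %.2f V p-p\n", vpeak, vripple);
    return 0;
}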

4.6 Astable Multivibrator using the 555 Timer
Fig 4.7 shows an astable multivibrator, which is a square-wave generator. It is also called a free-running multivibrator because it does not require an external trigger pulse to change its output: the output continuously alternates between the high and low states. Since the circuit does not stay permanently in either of the two states, it is known as astable, meaning "no stable state". The time for which the output remains high or low is determined by two timing resistors and a timing capacitor that are externally connected to the 555 timer.

Fig 4.7 shows the 555 timer connected as an astable multivibrator, with its internal blocks drawn in. In this circuit there are two timing resistors, Ra and Rb. Both the trigger and threshold inputs (pins 2 and 6) of the two comparators are shorted together and connected to the external timing capacitor C.

Fig 4.7 Circuit diagram with internal blocks shown

The astable operation can be understood from the waveforms of fig 4.8. Assume that initially the 555 timer output is in the high state and the capacitor C is uncharged. The discharge transistor Q1 is off, and capacitor C starts charging towards Vcc through Ra and Rb. When the capacitor voltage rises above 2/3 Vcc, the comparator 1 output goes high and the comparator 2 output goes low; as a result, the control flip-flop output is set high, with Q = 1, and the timer output is forced to the low state. Consequently, the discharge transistor Q1 is turned on, connecting the upper end of Rb to ground, and the capacitor C discharges through Rb towards 0 V. When the capacitor voltage drops below 1/3 Vcc, the output of comparator 2 goes high and that of comparator 1 goes low.

The control flip-flop output now resets to the low state with Q = 0 and the timer output goes to the high state; the discharge transistor Q1 is turned off and the capacitor C charges again towards 2/3 Vcc. This cycle of the capacitor charging from 1/3 Vcc to 2/3 Vcc and then discharging from 2/3 Vcc to 1/3 Vcc repeats continually, producing a square-wave output. In effect, the circuit triggers itself continuously.

Fig 4.8 Timing diagram

Ton = 0.693 (Ra + Rb) C
Toff = 0.693 Rb C
f = 1/T = 1.44 / ((Ra + 2Rb) C)
Duty cycle, D = Ton / T = (Ra + Rb) / (Ra + 2Rb)

where Ton is the on-time during which the capacitor charges, Toff is the off-time during which the capacitor discharges, T is the total time period, f is the output frequency and D is the duty cycle of the output pulses. These relations are evaluated in the short sketch below.
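A small numerical check of these formulas, with assumed component values (Ra = 10 kΩ, Rb = 47 kΩ, C = 10 nF):

#include <stdio.h>

int main(void)
{
    double ra = 10e3, rb = 47e3, c = 10e-9;   /* assumed values */

    double ton  = 0.693 * (ra + rb) * c;        /* capacitor charging time */
    double toff = 0.693 * rb * c;               /* capacitor discharging time */
    double f    = 1.44 / ((ra + 2.0 * rb) * c); /* output frequency */
    double duty = (ra + rb) / (ra + 2.0 * rb);  /* duty cycle */

    printf("Ton = %.1f us, Toff = %.1f us\n", ton * 1e6, toff * 1e6);
    printf("f = %.0f Hz, D = %.2f\n", f, duty);
    return 0;
}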

5. SOFTWARE DESCRIPTION

5.1 Flowchart
The flowchart of the program is shown in fig 5.1. In brief, the program works as explained below. When the program is executed, the initgraph function is called; it initializes the graphics system. Then the function doll is executed, which draws the figure. Then signals are read from the hardware device, which acts as the input device. It has 8 channels of input (numbered 0 to 7) and each channel can vary its input signal from 0 to 255. Depending upon the channel selected, the image on the screen shows movement:

If channel 0 is selected, the left hand of the image moves.
If channel 1 is selected, the right hand of the image moves.
If channel 2 is selected, the left leg of the image moves.
If channel 3 is selected, the right leg of the image moves.
By varying the signals of channel 4, the right forearm of the image can be moved.

In each case the amount of movement depends upon the signal, whose range is from 0 to 255. A condensed sketch of this dispatch logic is given below.
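The following hypothetical sketch condenses the dispatch loop; read_channel and move_limb stand in for the project's BIOS-serial reads and graphics routines listed in section 5.3.

#include <stdio.h>

/* Stubs standing in for the project's serial read and graphics calls. */
static int read_channel(int ch)              { return ch * 51; }      /* fake 0..255 data */
static void move_limb(const char *l, int a)  { printf("%s -> %d\n", l, a); }

int main(void)
{
    static const char *limb[] = {
        "left hand", "right hand", "left leg", "right leg", "right forearm"
    };
    int ch;
    for (ch = 0; ch < 5; ch++)      /* channels 0-4, as in section 5.1 */
        move_limb(limb[ch], read_channel(ch));
    return 0;
}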

The flowchart (fig 5.1) reads: START → INITGRAPH → DOLL → WHILE (!KBHIT) → read signals from hardware → SIGNAL = 0: hand movement (left arm); SIGNAL = 1: hand movement (right arm); SIGNAL = 2: leg movement (left leg); SIGNAL = 3: leg movement (right leg); SIGNAL = 4: hand movement (right forearm).

Fig 5.1 Flowchart for the program

5.2 Program ADC 0809

#include <stdio.h>
#include <at89x51.h>

sfr port1 = 0x90;
sfr port2 = 0xa0;
sfr port3 = 0xb0;

sbit rd   = port3 ^ 0;
sbit wr   = port3 ^ 1;
sbit intr = port3 ^ 2;

main()
{
    unsigned char val;

    port2 = 0xff;           /* configure P2 as input */
    port3 = 0xff;

    while (1) {
        wr = 0;             /* pulse WR low-high to start a conversion */
        wr = 1;
        while (intr == 0)   /* wait for end of conversion */
            ;
        rd = 0;             /* assert RD and read the result from P2 */
        val = port2;
        //port3 = val;
        rd = 1;
        intr = 1;           /* release the INTR pin for the next cycle */

        /* Classify the reading into four bands on P1.
           (0x128 in the original listing exceeds 8 bits and is
           presumably meant to be 0x80.) */
        if (val > 0x05 && val < 0x16)
            port1 = 0x01;
        else if (val > 0x16 && val < 0x32)
            port1 = 0x02;
        else if (val > 0x32 && val < 0x80)
            port1 = 0x04;
        else if (val > 0x80 && val < 0xff)
            port1 = 0x08;
    }
}

5.3 C Program

#include <stdio.h>
#include <graphics.h>
#include <conio.h>
#include <bios.h>
#include <dos.h>
#include <math.h>
#include <alloc.h>
#include <stdlib.h>

#define COM        0
#define DATA_READY 0x100
#define TRUE       1
#define FALSE      0
#define SETTINGS   (_COM_9600 | _COM_NOPARITY | _COM_STOP1 | _COM_CHR8)

/* Drawing and movement routines */
void doll(void);
void face(void);
void shirt(void);
void pant(void);
void hand_movement(int, int, int, int);
void leg_movement(int, int, int, int);
void moverhand(int, int, int, int [500][2]);
void movelhand(int, int, int, int [500][2]);
void rhand(int, int, int);
void lhand(int, int, int);
void rleg(int, int, int, int, int);
void lleg(int, int, int, int, int);
void rotate(int, int, float, int, int);
int  Rx_Data(void);

/* Declaring global variables */
int gd = DETECT, gm, q, j, ix, status, a[20], b[3][1], c[3][1], p[10];
int Data, DataH, DataL, CH_No;
float r[3][3], value[8];
int x = 300, y = 100, r1 = 115, r2 = 105, r3 = 150, r4 = 117;

/* Pre-computed joint coordinates for each limb position */
static int lh11[500][2], lh12[500][2], lh21[500][2], lh22[500][2],
           le11[500][2], le12[500][2], le21[500][2], le22[500][2],
           rh11[500][2], rh12[500][2], rh21[500][2], rh22[500][2],
           re11[500][2], re12[500][2], re21[500][2], re22[500][2],
           ll11[500][2], ll12[500][2], ll21[500][2], ll22[500][2],
           rl11[500][2], rl12[500][2], rl21[500][2], rl22[500][2];

ll11). while(!kbhit()) 32 . movelhand(x-50. int i=10. /* right */ moverhand(x+71.le22).y+198.rl22).r2+8.ip7=0. moverhand(x+50.0.400.&gm.r3. moverhand(x. movelhand(x-71.y+198.r1. size = imagesize(180.ip2=0.ll21).r3-8.y+231. moverhand(x+70.ip4=0. /* hands */ movelhand(x-50.temp4=0. //outportb(0x303.rl12). moverhand(x.r2/2+5.r2/2.0x91).r4+8.le12). moverhand(x+50.rl11).y+100.ip5=0. moverhand(x+41.lh11).0. /* left */ movelhand(x-71.rh22). getimage(180.lh22).r4+8.ip0=0.r2+8. movelhand(x-41.r1-8. moverhand(x+70.*area.r3-8.lh21). moverhand(x+71.re22). movelhand(x-41.rh11).rh21).r1.r1/2.ip3=0. /* legs */ /* legs */ movelhand(x-71.y+231.y+231.y+78.ll12).y+100.y+78.y+100.le11). /* elbow */ moverhand(x+50." ").y+100.re12).temp2=0.y+100.re11).380).ip6=0. cleardevice().y+100. area = malloc(size).r1-8.y+100.y+231.y+78.y+78.r1/2. movelhand(x.r2.input=0.380. /* hands */ movelhand(x-50. doll().flag1=0.y+198. temp1=0.r4.r2/2.y+100.r2.temp3=0. /* elbow */ movelhand(x-50.y+78.flag2=0. moverhand(x+41. movelhand(x.lh12).le21).rh12).400.r1/3-5.r2/2+5.ll22).size.y+78.Virtual Reality For Robotics void main(void) { char c. moverhand(x+50.ip1=0.y+78.re21). movelhand(x-71.r3.r1/3-5.rl21).r4.y+78. initgraph(&gd.y+198.area).

ix<10. break. COM ). if (status & DATA_READY) { j = bioscom(_COM_RECEIVE. bar(x-3. COM ).x. case 2 : if(ip6==0) { if(ip2 != value[CH_No]*50) { setfillstyle(1.ix++) { status = bioscom(_COM_STATUS. case 1 : if(ip6==0) ip1=value[CH_No]*36.BLACK). 0. 0. COM ).0). // Send CH_No in Ascii delay(10). bar(x-45. switch(CH_No) { case 0 : if(ip6==0) ip0=value[CH_No]*36. Rx_Data(). skip3: ix= 1. break. skip2: printf("\nSystem not ready \n"). goto skip1. // If any previous unwanted if (status & DATA_READY) // data has been received bioscom(_COM_RECEIVE. // putimage(180. SETTINGS.x-41. for ( CH_No = 0.area. COM).value[CH_No]*51*.0x07 ). skip1: if ( j == 'D' ) goto skip3. // Enable FIFO Buffer // 0x2F8 for COM2 // 0x3F8 for COM1 status = bioscom(_COM_STATUS.16.y+281.Virtual Reality For Robotics { bioscom(_COM_INIT. } delay (500).y+281. CH_No++) { bioscom( _COM_SEND. COM ). 0.y+324). COM ). COM ). outportb( (0x2F8+0x02). 33 // Send the Device Check command .y+325). 'D' . // the read it bioscom( _COM_SEND. for ( ix=0. 0. CH_No+0x30 . CH_No <= 7 . } goto skip2.

temp1=ip1.BLACK).flag1=0. } if((ip1 != temp1) || (ip4 != temp2)) { if(flag1) hand_movement(0*.0.2+5.33 ) ip5=value[CH_No]*51.WHITE.1).0*.BLACK. hand_movement(ip1*. ip2=value[CH_No]*50. bar(x+41. case 3 : if(ip6==0) { if(ip3 != value[CH_No]*50) { setfillstyle(1. } } break. 34 .WHITE.16.y+324). ip3=value[CH_No]*50.2+5.1).0.18 . else hand_movement(temp1*.BLACK. leg_movement(value[CH_No]*50*0. leg_movement(ip3*0.0. bar(x.0). } if(ip5>=150) { ip0=150.1. // putimage(180.y+325). case 4 : if(ip6==0) ip4=value[CH_No]*51.BLACK.value[CH_No]*51*.temp2=ip4.1). break.79+10.0.0.WHITE.18.79+10.1).y+281.x+44.1). case 5 : if(ip6 == 0) if ( value[CH_No] > 3. } } break.18.Virtual Reality For Robotics leg_movement(ip2*0.2+5.79+10. leg_movement(value[CH_No]*50*.0).x+3.temp2*.area.ip4*.0).ip4=180.1.BLACK. break. else ip5 = 0.y+281.2+5.

fillellipse(x.15).y.3.120.13).BLACK).3.0.5).10. fillellipse(x-15.15.50).BLACK.7. setfillstyle(1.0).330. /* Eye-left */ fillellipse(x+15.360.flag2=0. /* Eye-right */ setfillstyle(1. /* Ear-right */ setcolor(16).y+34.7. } ip4=0.300. /* Eye Ball .1.18.y+30.79+10. ellipse(x.0. leg_movement(5.180.210.8).18.temp4*.360.5.y.1). } closegraph(). /* mouth */ ellipse(x.30. temp3=ip0.0).1).1.y.7). sector(x-30.0.WHITE.0. /* ear-left */ ellipse(x+30.16). hand_movement(ip0*.60. pant().WHITE.y-5.BLACK.10).1. /* ear-right */ setfillstyle(1.3.y-5.right */ setfillstyle(1.79+10. /* Face */ setcolor(13). /* mouth */ floodfill(x.8).0*.WHITE.3).y-5.ip5*. ellipse(x-30.left */ fillellipse(x+15.3. } ix++.3).3). hand_movement(10.Virtual Reality For Robotics } if((ip0 != temp3) || (ip5 != temp4)) { if(flag2) hand_movement(0*.y+25. else hand_movement(temp3*. fillellipse(x-15. /* mouth */ 35 .18 .temp4=ip5. /* Ear-left */ sector(x+30.y. /* Eye Ball .1.10).79+10. } /* function DOLL */ void doll() { face().1.240.RED).3).0).y-5. } /* face */ void face() { setcolor(16). shirt().5.y.

y+19). /* nose */ setfillstyle(1.y-5.right arc */ ellipse(x-30.50. ellipse(x.a[1]=a[3]=y-25.y-25.y+170.20.1). setlinestyle(0.poly values */ fillpoly(4. a[1]=a[7]=a[13]=y+90.3.y+19.15).a).130. a[2]=x-20.13).1).12.x+55. line(x-70.18. /* hips-left */ ellipse(x+60.left */ ellipse(x+15.y+3. setcolor(13).poly */ setcolor(16).upper ellipse */ setlinestyle(0. setcolor(WHITE).y+170. /* mouth .x+42.20).a). } void shirt() { setcolor(WHITE).lower ellipse */ setcolor(BLUE).10).WHITE).70. a[3]=a[9]=a[15]=y+60.left arc */ line(x.75).205.20.y+19).75). ellipse(x-15.BLUE).13).y+196.y+3.y+80.172.95.y+78.202. a[8]=a[14]=x+20.y+19.a[1]=a[3]=y+45. /* Hat . /* shoulder */ ellipse(x-60.poly values */ a[2]=a[4]=x+13. /* Eyebrow . fillellipse(x.a[10]=x+10. fillpoly(8.8). fillpoly(4.y+25.8).185.a).y-60. /* Hat . fillellipse(x. floodfill(x.y+25.y+100).y+100).18. /* Hat .a[5]=a[7]=y-60.1.1. /* nose */ line(x.x+6.50.1).130.y+78.10).x-6.6. /* nose */ fillellipse(x-3. line(x+70.85. /* vest line */ setfillstyle(9. /* nose */ fillellipse(x+3.a[5]=a[7]=y+60.3.poly values */ setfillstyle(1.15).30.a[4]=x-10. /* Neck .40. /* mouth .340. /* Hat .y+80. /* Neck .Virtual Reality For Robotics ellipse(x+30. /* hips-right */ line(x-41. setfillstyle(1.y+196).358.poly values */ a[2]=a[4]=x+30.right */ a[0]=a[6]=x-12.x-55. /* Eyebrow .3). a[0]=a[6]=x-30.BLUE).a[5]=a[11]=y+100. a[0]=a[6]=a[12]=x.y-5. 36 . /* Hat .12. /* Neck */ setcolor(15).338.

floodfill(x-38.95.(p[1]+p[3])/2-5.y+170.y+155.y+190.3).70. line(x-41.y+80.3).3.y+80.7).105.RED). ellipse(x.int j. ellipse(x-60.WHITE). else floodfill((p[0]+p[2])/2.4).WHITE).20).10.y+80.y+78.7). line(x-70. 37 .70.WHITE). setfillstyle(1.int color.Virtual Reality For Robotics setcolor(WHITE).120. /* ellipse(x+60.70.y+180. setcolor(13).y+95.75).y+170.y+100).x+42. line(x+70. setcolor(WHITE).3. setcolor(7).3.int r) { setcolor(WHITE).x+55.y+85.WHITE).BLUE).WHITE).y+125. ellipse(x.y+80. /* hips-left */ hips-right */ if(color==BLACK) { setfillstyle(9. setcolor(RED).y+100).y+78. fillellipse(x.3).70. } /* belt */ /* belt */ /* button */ /* button */ /* button */ /* button */ /* Function hand_movement */ void hand_movement(int w.(p[1]+p[3])/2+5.y+185.(p[1]+p[3])/2.75).20.BLACK). setfillstyle(9.y+190.3).20).20). if(l) { floodfill(x-70.85.340. floodfill(x-60.x-55. if((w<160)&&(w>120)) floodfill((p[0]+p[2])/2+5. setfillstyle(1.75.172.202.3.WHITE). fillellipse(x. fillellipse(x.y+185). else if(!q) floodfill((p[0]+p[2])/2. fillellipse(x.20.6.int l. ellipse(x. fillellipse(x. setfillstyle(1.

floodfill(le11[w][0]. setcolor(color). if(r) rhand(color.y+85.w. } if(color==WHITE) { setfillstyle(9.(p[1]+p[3])/2-5. } } if(w<=140) { if(w==140) { setcolor(BLACK).y+85.BLUE). if(r) rhand(color.Virtual Reality For Robotics } if(r) { floodfill(x+70. if(l) { floodfill(x-70.WHITE).j). } if(r) { 38 .le11[w][1]+5. } else { if(w==142) { setcolor(BLACK).(re11[w][1]+rh11[w][1])/2+5.w.WHITE).WHITE).j).j).WHITE). if(l) lhand(color.WHITE).j).w. } q=0.w. if(l) lhand(color. } q=1. setcolor(color). floodfill((re11[w][0]+rh11[w][0])/2-5. floodfill((p[0]+p[2])/2.

x+5.th+=.WHITE).++i) { x = r*cos((th*3. y = r*sin((th*3. for(th=0.142)/180). floodfill((re11[w][0]+rh11[w][0])/2-5. line(x+41.y+85. floodfill(x+2.th<=180.right */ void moverhand(int xc .left */ void movelhand(int xc .y+222).int b[400][2]) { float th.int yc . for(th=0.y.y+231).5) { x = r*cos((th*3.th<=180.y+195.YELLOW).Virtual Reality For Robotics floodfill(x+70.x. line(x+41.y+198. b[i][1] = yc+x. line(x-41. /* zip */ line(x+5. /* zip */ setlinestyle(0. /* zip */ line(x-3. line(x-3. setlinestyle(1.x.y.0.++i.i=0.1).x.x.x+5. b[i][0] = xc-y.x.WHITE).0.int r.th+=. b[i][0] = xc+y.WHITE).(re11[w][1]+rh11[w][1])/2+5. } } /* Function pant */ void pant() { setcolor(15).y+231).x-3. setfillstyle(9. } } } /* moving hand co-ordinates .y+195.y+198.int b[400][2]) { float x.int r.int yc . setcolor(WHITE).142)/180).1). } } /* moving hand co-ordinates .th.y+205. line(x-41.y+198.y+231).i=0.142)/180).142)/180).y+230).y+222).y+230.5.y+231). b[i][1] = yc+x. y = r*sin((th*3.y+198. } 39 .

rl22[w+4][0].y+198. /* left leg */ line(x.ll11[w][1].t.ll12[w][0].v).s.t.t).ll21[w-4][1]. line(x+41.int t. if(r) rleg(rl21[w-4][0].int color) { setcolor(color).int v.y+231.WHITE).int t.ll22[w+4][0]. if(color==BLACK) { setfillstyle(9.color).rl11[w][1].int u.color).y+231. if(l) floodfill(x-5. } if(w<=40) { if(l) lleg(ll11[w][0].Virtual Reality For Robotics /* function for right leg */ void rleg(int s.rl21[w-4][1].color).int v. /* " " */ } /* Function leg_movement */ void leg_movement(int w.ll22[w+4][1].int l.rl12[w][0].WHITE).color). if(r) rleg(rl11[w][0]. } else { if(l) lleg(ll21[w-4][0].int color) { setcolor(color).int r) { setcolor(color).v).ll12[w][1]. } if(color==WHITE) { 40 .y+231.s.t).rl22[w+4][1]. /* " " */ } /* function for left leg */ void lleg(int s. if(r) floodfill(x+7. line(x-41. /* right leg */ line(x.y+231.v).u.u.int u.u.BLACK).v).y+198.rl12[w][1].int color. /* " " */ line(s.u. /* " " */ line(s.

r[1][0]= sin(th).k.142/180.t.yr.r2+25. b[2][0]=1.y+231. } /* function for left hand */ void lhand(int color.float th. for(k=0.y+100.y+73.++k) c[i][j] = c[i][j]+(r[i][k]*b[k][j]).u.y+100. else movelhand(x-50.v.r2+15. } } line(xr.int yr. /* 1st half */ if ((j>10)&&(i>140)&&(j<21)) movelhand(x-50.y+100.y+231.xr*sin(th). if(q) { line(x-65. r[0][0]=r[1][1]=cos(th).r2+10. r[2][0]=r[2][1]=0. 41 . r[0][2]= xr*(1-cos(th))+ yr*sin(th).lh22).lh22). r[2][2]=1. else if((j>20)&&(i>140)&&(j<31)) movelhand(x-50.c[1][0]).int i. } } void rotate(int xr. for(i=0. b[1][0]=y1.r2+20. th = th*3.i<3.(le21[i][0]).y+100.lh22).lh22).++j) { c[i][j]=0. b[0][0]= x1. if(r) floodfill(x+7.j<1.c[0][0].k<3.j.int y1) { int i.WHITE).++i) { for(j=0. setcolor(color).WHITE). r[0][1]= -sin(th).int x1.d[10].(le21[i][1])). if(l) floodfill(x-7.Virtual Reality For Robotics setfillstyle(9. r[1][2]= yr*(1-cos(th)).int j) { int s.YELLOW). else if((j>30)&&(i>140)) movelhand(x-50.

(le11[i][1])).le12[i][0]. if(q) 42 .j.y+100.p[1]=t.lh11[i][1]).lh21[i][1]).le12[i][1].int i.lh12).v=c[1][0].lh22[i][1]).le22[i][0].t).v). line((le11[i][0]).le22[i][1]. { p[0]=s.y+78.Virtual Reality For Robotics rotate((le21[i][0]).u.v).arm */ rotate((le11[i][0]).y+100. line(x-50.p[1]=t.(le22[i][0]).(le21[i][1]).r2+5.//u.v. else if((j>30)&&(i>70)) movelhand(x-50.(le12[i][0]).y+100.(le22[i][1])).y+100. else movelhand(x-50.v).p[2]=u. if((j>10)&&(i>70)&&(j<21)) movelhand(x-50.p[2]=u.lh21[i][0]. s=c[0][0].le22[i][1]).p[3]=v. } /* function for right hand */ void rhand(int color.(le11[i][0]).lh11[i][0].lh22[i][0].y+100. else if((j>20)&&(i>70)&&(j<31)) movelhand(x-50. s=c[0][0]. /* left .(le11[i][1]).int j) { int s.v=c[1][0].r2+10.lh12).lh12).(le12[i][1])).j.(le21[i][1]).r2+15.t.//s. line((le21[i][0]).j. u=c[0][0]. setcolor(color). line(s.t.t=c[1][0].y+100. } } setcolor(color). line(x-50.lh12[i][0]. u=c[0][0].lh12[i][1]). rotate(le12[i][0].le12[i][1]).//u.j. } } else { line(x-71.p[3]=v.lh12).u. rotate(le22[i][0].t=c[1][0].r2.(le11[i][1]). { p[0]=s.

p[1]=t.(re12[i][0]). line(x+50.Virtual Reality For Robotics { line(x+65.rh12[i][1]).(re12[i][1])).(re21[i][1]).rh11[i][0].360-j. if((j<(360-10))&&(i>70)&&(j>(360-21))) moverhand(x+50.rh22).rh12[i][0]. line((re21[i][0]). line((re11[i][0]).p[2]=u.arm */ rotate((re11[i][0]).(re22[i][0]).y+100. else moverhand(x+50.//u.re12[i][0].r2.t).(le11[i/2. // line((le11[i/2.rh11[i][1]).rh12).t=c[1][0].r2+5. rotate((re21[i][0]).re22[i][1]). rotate(re12[i][0].//u.y+78. if(i==140) { p[0]=s.360-j.rh22[i][1]).y+100.(re22[i][1])). rotate(re22[i][0].rh22).p[3]=v. if(i==142) 43 . /* 1st half */ if ((j>(360-10))&&(i>140)&&(j<(360-21))) moverhand(x+50.y+100.360-j. line(x+50.r2+10.rh22).y+100.(re11[i][1])).rh12). else if((j>(360-20))&&(i>140)&&(j<(360-31))) moverhand(x+50.y+100.v=c[1][0].(re21[i][1])).(re21[i][0]). u=c[0][0].//s.rh21[i][1]).360-j.v).(re21[i][1]).y+100.(re11[i][0]). s=c[0][0].s. /* left .y+100.re12[i][1].re12[i][1]).t). else if((j<(360-30))&&(i>70)) moverhand(x+50.rh22). u=c[0][0].r2+15.rh22[i][0].y+73.v=c[1][0].5][0]).r2+25.rh12).r2+10.t=c[1][0].(re11[i][1]).(re11[i][1]).re22[i][1].v). s=c[0][0].r2+20. else if((j>(360-30))&&(i>140)) moverhand(x+50.rh12).y+100.y+100. else moverhand(x+50.5][1]).r2+15. else if((j<(360-20))&&(i>70)&&(j>(360-31))) moverhand(x+50.y+100. } } else { line(x+71.rh21[i][0].re22[i][0].

COM ). COM ). COM ). } else goto loop3.v). 0. 0. 0. COM ). } else goto loop1. } else goto loop4. COM ). goto loop5. COM ). 0. 0. loop3: status = bioscom(_COM_STATUS. 0. if (status & DATA_READY) { DataL = bioscom(_COM_RECEIVE.p[1]=t. if (status & DATA_READY) { End_Char = bioscom(_COM_RECEIVE. goto loop4. goto loop2.p[2]=u. loop1: status = bioscom(_COM_STATUS. COM ). /* " } int Rx_Data(void) { int Str_Char. loop4: status = bioscom(_COM_STATUS. End_Char. COM ). COM ). 0. if (status & DATA_READY) { CH = bioscom(_COM_RECEIVE. CH . " */ if (status & DATA_READY) { Str_Char = bioscom(_COM_RECEIVE. 0. goto loop6.t. line(s. goto loop3. } 44 . 0. } } setcolor(color). COM ). } else goto loop2. if (status & DATA_READY) { DataH = bioscom(_COM_RECEIVE. 0.Virtual Reality For Robotics { p[0]=s. loop2: status = bioscom(_COM_STATUS.u.p[3]=v. loop5: status = bioscom(_COM_STATUS.

// Add shifted MS value with LS byte value[CH_No]=( Data * 5. // Store the Data in buffer } 45 . else DataH-= 0x37. // ( overall is to convert 2 8 bits to 16 bit return (0). else DataL-= 0x37. Data = DataH<<4.Virtual Reality For Robotics else goto loop5. // Left shift MS Byte by 8 bits Data = Data+DataL.00)/0xFF. loop6: if( DataH<=0x39) DataH-= 0x30. if( DataL<=0x39) DataL-= 0x30.

6. PRESENT DAY VIRTUAL REALITY TECHNOLOGY

6.1 Head-Mounted Display (HMD)
The head-mounted display (HMD), shown in fig 6.1, was the first device to provide its wearer with an immersive experience. Evans and Sutherland demonstrated a head-mounted stereo display as early as 1965; it took more than 20 years before VPL Research introduced a commercially available HMD, the famous "EyePhone" system (1989). A typical HMD houses two miniature display screens and an optical system that channels the images from the screens to the eyes, thereby presenting a stereo view of a virtual world. A motion tracker continuously measures the position and orientation of the user's head and allows the image-generating computer to adjust the scene representation to the current view. As a result, the viewer can look around and walk through the surrounding virtual environment. To overcome the often uncomfortable intrusiveness of a head-mounted display, alternative concepts (e.g. BOOM and CAVE) for immersive viewing of virtual environments were developed.

Fig 6.1 A head-mounted display (HMD)

6.2 BOOM
The BOOM (Binocular Omni-Orientation Monitor) from Fakespace, shown in fig 6.2, is a head-coupled stereoscopic display device. Screens and optical system are housed in a box that is attached to a multi-link arm. The user looks into the box through two holes, sees the virtual world, and can guide the box to any position within the operational volume of the device. Head tracking is accomplished via sensors in the links of the arm that holds the box.

Fig 6.2 The BOOM, a head-coupled display

6.3 CAVE
The CAVE (Cave Automatic Virtual Environment), shown in fig 6.3, was developed at the University of Illinois at Chicago and provides the illusion of immersion by projecting stereo images on the walls and floor of a room-sized cube. Several persons wearing lightweight stereo glasses can enter and walk freely inside the CAVE. A head-tracking system continuously adjusts the stereo projection to the current position of the leading viewer.

Fig 6.3 CAVE system (schematic principle)

6.4 Input Devices and Other Sensual Technologies

A variety of input devices like data gloves, joysticks and hand-held wands allow the user to navigate through a virtual environment and to interact with virtual objects. Directional sound, tactile and force-feedback devices, voice recognition and other technologies are being employed to enrich the immersive experience and to create more "sensualized" interfaces. Fig 6.4 illustrates this technology.

Fig 6.4 A data glove allows for interactions with the virtual world

6.5 Characteristics of Immersive VR
The unique characteristics of immersive virtual reality can be summarized as follows:
• Head-referenced viewing provides a natural interface for navigation in three-dimensional space and allows for look-around, walk-around and fly-through capabilities in virtual environments.
• Stereoscopic viewing enhances the perception of depth and the sense of space.
• The virtual world is presented in full scale and relates properly to the human size.
• Realistic interactions with virtual objects via data glove and similar devices allow for manipulation, operation and control of virtual worlds.
• The convincing illusion of being fully immersed in an artificial world can be enhanced by auditory, haptic and other non-visual technologies.
• Networked applications allow for shared virtual environments (see below).

Shared Virtual Environments

In the example illustrated in fig 6.5, three networked users at different locations (anywhere in the world) meet in the same virtual world by using a BOOM device, a CAVE system and a HMD, respectively. All users see the same virtual environment from their respective points of view. Each user is presented as a virtual human (avatar) to the other participants. The users can see each other, communicate with each other, and interact with the virtual world as a team.

Fig 6.5 Shared virtual environment

Non-immersive VR
Today, the term 'Virtual Reality' is also used for applications that are not fully immersive; the boundaries are becoming blurred, but all variations of VR will be important in the future. This includes mouse-controlled navigation through a three-dimensional environment on a graphics monitor, stereo viewing from the monitor via stereo glasses, stereo projection systems, and others. Apple's QuickTime VR, for example, uses photographs for the modeling of three-dimensional worlds and provides pseudo look-around and walk-through capabilities on a graphics monitor.

Most exciting is the ongoing development of VRML (Virtual Reality Modeling Language) on the World Wide Web. In addition to HTML (Hyper Text Markup Language), which has become a standard authoring tool for the creation of home pages, VRML provides three-dimensional worlds with integrated hyperlinks on the web. Home pages become home spaces. The viewing of VRML models via a VRML plug-in for web browsers is usually done on a graphics monitor under mouse-control and is, therefore, not fully immersive. However, the syntax and data structure of VRML provide an excellent tool for the modeling of three-dimensional worlds that are functional and interactive and that can, ultimately, be transferred into fully immersive viewing systems. The current version VRML 2.0 has become an international ISO/IEC standard under the name VRML97. A VRML rendering is shown in fig 6.6.

Fig 6.6 Rendering of Escher's Penrose Staircase

VR-related Technologies
Other VR-related technologies combine virtual and real environments. Motion trackers are employed to monitor the movements of dancers or athletes for subsequent studies in immersive VR. The technologies of 'Augmented Reality' allow for the viewing of real environments with superimposed virtual objects. Tele-presence systems (e.g. tele-medicine, tele-robotics) immerse a viewer in a real world that is captured by video cameras at a distant location and allow for the remote manipulation of real objects via robot arms and manipulators.

6.6 Applications

Virtual Reality is well known for its use with flight simulators and games. However, these are only two of the many ways virtual reality is being used today. This article will summarize how virtual reality is used in medicine, entertainment, education, architecture, chemistry and the visualization of voxel data. As the technologies of virtual reality evolve, the applications of VR become literally unlimited. It is assumed that VR will reshape the interface between people and information technology by offering new ways for the communication of information, the visualization of processes, and the creative expression of ideas.

Note that a virtual environment can represent any three-dimensional world that is either real or abstract. This includes real systems like buildings, landscapes, underwater shipwrecks, spacecrafts, archaeological excavation sites, human anatomy, sculptures, crime scene reconstructions, solar systems, and so on. Of special interest is the visual and sensual representation of abstract systems like magnetic fields, turbulent flow structures, molecular models, mathematical systems, auditorium acoustics, stock market behaviour, population densities, information flows, weather simulation, and any other conceivable system including artistic and creative work of abstract nature. These virtual worlds can be animated, interactive, shared, and can expose behaviour and functionality.

Useful applications of VR include training in a variety of areas (military, medical, equipment operation, etc.), education, design evaluation (virtual prototyping), architectural walk-through, human factors and ergonomic studies, simulation of assembly sequences and maintenance tasks, assistance for the handicapped, study and treatment of phobias (e.g., fear of height), entertainment, and much more. Fig 6.7 shows real and abstract virtual worlds.

Fig 6.7 Real and abstract virtual worlds (Michigan stadium, Flow structure)

7. CONCLUSION

This is a very cost-effective system, as it can simulate a robot; in some cases it can be used instead of a robot. This system can be used in nuclear power stations, which are very hazardous for human beings to work in. The output of this system can be taken as a multiple of the input to do some serious tasks. For example, if a 50 gram weight is lifted on the input side, then the output can be made to lift any multiple of 50 grams. It is the user's imagination which limits the working of this project. This project is open for developments from all sides; further modifications can be made to the present circuit, and one can go on adding extra, rich features, which leads to a still smarter project.
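The scaling idea can be sketched in C, the language of the project software (the sensor value, scale factor and names below are illustrative assumptions, not the actual project code):

    /* Sketch: scale a sensed input value to a larger output command. */
    #include <stdio.h>

    #define SCALE_FACTOR 10        /* assumed: output lifts 10x the input */

    int main(void)
    {
        int input_grams = 50;      /* weight sensed on the input side */
        long output_grams = (long)input_grams * SCALE_FACTOR;
        printf("Input %d g -> output lifts %ld g\n", input_grams, output_grams);
        return 0;
    }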

Appendix A
Pin details of various IC's used

A.1 Voltage Regulator KIA 7805

Fig A.1 KIA 7805

General Characteristics:
1. Output Voltage : 5V
2. Operating Temperature : 0°C - 70°C
3. Output Current : 100mA
4. Dropout Voltage : 1.7V
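These figures give a quick headroom check: the input supply must stay above the output voltage plus the dropout voltage. A small sketch (the 9V supply is only an assumed example):

    /* Sketch: verify regulator headroom from the characteristics above. */
    #include <stdio.h>

    int main(void)
    {
        double v_out = 5.0, v_drop = 1.7;   /* from the table above */
        double v_in = 9.0;                  /* assumed unregulated supply */
        if (v_in >= v_out + v_drop)
            printf("OK: %.1fV of headroom\n", v_in - (v_out + v_drop));
        else
            printf("Input too low for regulation\n");
        return 0;
    }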

A.2 CA3140

Fig A.2 Amplifier CA3140

Common Mode Rejection (CMR)
Common Mode Rejection is a measure of the change in output voltage when both inputs are changed by equal amounts. CMR is usually specified for a full-range common mode voltage change (CMV), at a given frequency, and a specified impedance unbalance (e.g., 1k source unbalance at 60Hz). The common mode rejection ratio is the ratio of the common mode signal appearing at the output to the input CMV. In most instrumentation amplifiers, the CMR increases with gain, because the front-end configuration does not amplify common-mode signals and the amount of common mode signal appearing at the output stays relatively constant as the signal gain increases. However, CMR becomes more frequency dependent at high gains: at higher gains the amplifier bandwidth decreases, and differences in phase shift through the amplifier show up as common-mode errors.

Important Note: Instrumentation amplifiers have differential inputs, and there must be a return path for the bias currents. When amplifying the output of a "floating" source, such as transformers and thermocouples, as well as AC-coupled sources, there must be a DC path from each input to common, or to the guard terminal. If it is not provided, these currents will charge stray capacitance, causing the output to drift uncontrollably or to saturate. If a return path is impracticable, an isolator must be used.
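As a worked illustration of the definition above (a sketch only, not project code): if a 1V common-mode swing at the inputs produces a 10uV error referred to the input, the CMR is 100dB.

    /* Sketch: CMR in dB from an applied CMV and the resulting error. */
    #include <stdio.h>
    #include <math.h>

    int main(void)
    {
        double cmv_in = 1.0;       /* applied common-mode voltage, V */
        double cm_error = 10e-6;   /* error referred to the input, V */
        double cmr_db = 20.0 * log10(cmv_in / cm_error);
        printf("CMR = %.1f dB\n", cmr_db);   /* prints 100.0 dB */
        return 0;
    }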

Various parameters of a typical operational amplifier are as shown below:

Open loop gain Ad               = 50,000
Input offset voltage Vio        = 1mV
Input offset current Iio        = 10nA
Common mode rejection ratio     = 100dB
Power supply rejection ratio    = 20uV/V
Input offset voltage drift      = 1.0uV/°C
Input offset current drift      = 0.1nA/°C
Slew rate                       = 1V/us
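The slew rate entry limits the largest undistorted sine wave the amplifier can deliver; the standard full-power bandwidth estimate is f_max = SR / (2 x pi x Vpeak). A small sketch (the 10V peak swing is an assumed example):

    /* Sketch: full-power bandwidth from the slew rate figure above. */
    #include <stdio.h>

    int main(void)
    {
        double slew_rate = 1e6;    /* 1 V/us expressed in V/s */
        double v_peak = 10.0;      /* assumed peak output swing, V */
        double f_max = slew_rate / (2.0 * 3.14159265 * v_peak);
        printf("Full-power bandwidth = %.1f kHz\n", f_max / 1e3); /* ~15.9 kHz */
        return 0;
    }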

A.3 ADC 0809

Fig A.3 ADC 0809

A.4 Pin details of AT89C51


Fig A.4 AT89C51



Fig A.5 Block diagram of AT89C51

Features of 89C51
• 4K Bytes of ROM
• Fully Static Operation: 0 Hz to 24 MHz
• Three-level Program Memory Lock
• 128 x 8-bit Internal RAM
• 32 Programmable I/O Lines


• Two 16-bit Timer/Counters
• Six Interrupt Sources
• Programmable Serial Channel

The AT89C51 is a low-power, high-performance CMOS 8-bit microcontroller with 4K bytes of read only memory (ROM). The device is manufactured using Atmel's high-density nonvolatile memory technology. The AT89C51 provides the following standard features: 4K bytes of ROM, 128 bytes of RAM, 32 I/O lines, two 16-bit timer/counters, a six-source interrupt architecture, and a full-duplex serial port.

Pin Description
Port 0
Port 0 is an 8-bit open drain bi-directional I/O port. As an output port, each pin can sink eight TTL inputs. When 1s are written to port 0 pins, the pins can be used as high impedance inputs. Port 0 can also be configured to be the multiplexed low-order address/data bus during accesses to external program and data memory. In this mode, P0 has internal pull ups. Port 0 also receives the code bytes during flash programming and outputs the code bytes during program verification. External pull ups are required during program verification.
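A minimal Keil-style C51 sketch of this read sequence (assuming the standard reg51.h register names):

    /* Sketch: use open-drain Port 0 as a high-impedance input port. */
    #include <reg51.h>

    unsigned char read_port0(void)
    {
        P0 = 0xFF;      /* write 1s so the pins float as inputs */
        return P0;      /* read the externally driven levels    */
    }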

Port 1
Port 1 is an 8-bit bi-directional I/O port with internal pull ups. The port 1 output buffers can sink/source four TTL inputs. When 1s are written to port 1 pins, they are pulled high by the internal pull ups and can be used as inputs. Port 1 also receives the low-order address bytes during flash programming and verification. In addition, on the AT89C52, P1.0 and P1.1 can be configured to be the timer/counter 2 external count input (P1.0/T2) and the timer/counter 2 trigger input (P1.1/T2EX), respectively:

Port Pin    Alternate Function
P1.0        T2 (external count input to Timer/Counter 2), clock-out
P1.1        T2EX (Timer/Counter 2 capture/reload trigger and direction control)

Port 2
Port 2 is an 8-bit bi-directional I/O port with internal pull ups. The port 2 output buffers can sink/source four TTL inputs. When 1s are written to port 2 pins, they are pulled high by the internal pull ups and can be used as inputs. Port 2 emits the high-order address byte during fetches from external program memory and during accesses to external data memory that use 16-bit addresses (MOVX @ DPTR). In this application, port 2 uses strong internal pull ups when emitting 1s. During accesses to external data memory that use 8-bit addresses (MOVX @ RI), port 2 emits the contents of the P2 special function register.

Port 3
Port 3 is an 8-bit bi-directional I/O port with internal pull ups. The port 3 output buffers can sink/source four TTL inputs. When 1s are written to port 3 pins, they are pulled high by the internal pull ups and can be used as inputs. Port 3 also receives some control signals for Flash programming and verification, and serves the alternate functions listed below:

Port Pin    Alternate Function
P3.0        RXD (serial input port)
P3.1        TXD (serial output port)
P3.2        INT0 (external interrupt 0)
P3.3        INT1 (external interrupt 1)
P3.4        T0 (timer 0 external input)
P3.5        T1 (timer 1 external input)
P3.6        WR (external data memory write strobe)
P3.7        RD (external data memory read strobe)

RST
Reset input. A high on this pin for two machine cycles while the oscillator is running resets the device.

ALE/PROG
Address Latch Enable is an output pulse for latching the low byte of the address during accesses to external memory. This pin is also the program pulse input (PROG) during flash programming. In normal operation, ALE is emitted at a constant rate of 1/6 the oscillator frequency and may be used for external timing or clocking purposes. Note, however, that one ALE pulse is skipped during each access to external data memory. If desired, ALE operation can be disabled by setting bit 0 of SFR location 8EH. With the bit set, ALE is active only during a MOVX or MOVC instruction. Otherwise, the pin is weakly pulled high. Setting the ALE-disable bit has no effect if the microcontroller is in external execution mode.
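The ALE-disable bit can be reached from C by declaring the SFR at location 8EH; a hedged Keil-style sketch (the name AUXR is only a convenient label for that location, borrowed from later derivatives):

    /* Sketch: disable ALE output except during MOVX/MOVC. */
    #include <reg51.h>

    sfr AUXR = 0x8E;        /* SFR location 8EH, as described above */

    void disable_ale(void)
    {
        AUXR |= 0x01;       /* set bit 0: ALE only during MOVX/MOVC */
    }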

PSEN
Program Store Enable is the read strobe to external program memory. When the AT89C51 is executing code from external program memory, PSEN is activated twice each machine cycle, except that two PSEN activations are skipped during each access to external data memory.

EA/VPP
External Access Enable. EA must be strapped to GND in order to enable the device to fetch code from external program memory locations starting at 0000H up to FFFFH.

XTAL1
Input to the inverting oscillator amplifier and input to the internal clock operating circuit.

XTAL2
Output from the inverting oscillator amplifier.

Special Function Registers
It is a map of the on-chip memory area called the Special Function Register (SFR) space. Note that not all of the addresses are occupied, and unoccupied addresses may not be implemented on the chip. Read accesses to these addresses will in general return random data, and write accesses will have an indeterminate effect. User software should not write 1s to these unlisted locations, since they may be used in future products to invoke new features. In that case, the reset or inactive values of the new bits will always be 0.

Data Memory
There are 128 bytes of RAM in the 8051, which are assigned addresses ranging from 00h to 7Fh. These 128 bytes are divided into three different groups as follows:

1. A total of 32 bytes from locations 00h to 1Fh are set aside for the register banks and the stack.
2. A total of 16 bytes from locations 20h to 2Fh are set aside for bit-addressable memory.
3. A total of 80 bytes from locations 30h to 7Fh are used for read and write storage, normally called scratch pad memory. These 80 bytes of RAM are used by 8051 programmers for storing parameters.

Interrupts
The AT89C51 has a total of six interrupt vectors: two external interrupts (INT0 and INT1), two timer interrupts (Timers 0 and 1), the serial port interrupt and the reset pin interrupt. Each of these interrupt sources can be individually enabled or disabled by setting or clearing a bit in the special function register IE. IE also contains a global disable bit, EA, which disables all interrupts at once.

Timer 0 and 1
Both Timer 0 and Timer 1 in the AT89C51 are 16 bits wide. Since the 8051 has an 8-bit architecture, each 16-bit timer is accessed as two separate registers of low byte and high byte (see the sketch after this section).

Oscillator Characteristics
XTAL1 and XTAL2 are the input and output, respectively, of an inverting amplifier that can be configured for use as an on-chip oscillator, as shown in Figure 7. Either a quartz crystal or ceramic resonator may be used. To drive the device from an external clock source, XTAL2 should be left unconnected while XTAL1 is driven. There are no requirements on the duty cycle of the external clock signal, since the input to the internal clocking circuitry is through a divide-by-two flip-flop, but minimum and maximum voltage high and low time specifications must be observed.
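A hedged C51 sketch of reading a 16-bit timer through its two byte registers (standard reg51.h names; the re-read loop guards against the low byte rolling over between the two reads):

    /* Sketch: assemble Timer 0's 16-bit count from TH0 and TL0. */
    #include <reg51.h>

    unsigned int read_timer0(void)
    {
        unsigned char high, low;
        do {
            high = TH0;            /* read high byte            */
            low  = TL0;            /* read low byte             */
        } while (high != TH0);     /* re-read if TH0 changed    */
        return ((unsigned int)high << 8) | low;
    }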

DC Characteristics
Absolute Maximum Ratings*
Operating Temperature ..................... -55°C to +125°C
Storage Temperature ....................... -65°C to +150°C
Voltage on Any Pin with Respect to Ground . -1.0V to +7.0V
Maximum Operating Voltage ................. 6.6V
DC Output Current ......................... 15.0 mA

Appendix B

B.1 User defined functions

User defined functions used:
Doll( )
Hand_movement( )
Leg_movement( )
Rotate( )
Shirt( )
Pant( )
Face( )
Movelhand( )
Moverhand( )
Rhand( )
Lhand( )
Rleg( )
Lleg( )

void doll(void)
This function takes no parameters. This function is used to draw the initial static image. It accomplishes this job by calling other functions such as hand_movement( ), leg_movement( ), shirt( ) and pant( ).

void face(void)
This function takes no parameters. This function is used to draw the face of the image. It accomplishes this job by executing library functions in C graphics such as ellipse( ), floodfill( ), fillellipse( ), arc( ), sector( ), line( ), circle( ) etc.

void shirt(void)
This function takes no parameters. This function is used to draw the shirt of the image. It accomplishes this job by executing other functions, some from the C graphics library and some user defined. It uses the hand_movement function to draw the hands of the image.
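A minimal Turbo C sketch of how face( ) might combine the listed graphics calls (the coordinates are illustrative placeholders, not the project's actual values, and initgraph( ) is assumed to have been called already):

    /* Sketch: a simple face drawn with graphics.h primitives. */
    #include <graphics.h>

    void face(void)
    {
        circle(320, 100, 40);            /* head outline  */
        circle(305, 90, 5);              /* left eye      */
        circle(335, 90, 5);              /* right eye     */
        arc(320, 105, 200, 340, 20);     /* smiling mouth */
        line(320, 95, 320, 105);         /* nose          */
    }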

void hand_movement(int, int, int, int, int)
This function takes 5 integer parameters. This function is used to control the movement of the hands of the image; it controls the movement of both arm and elbow. The first parameter specifies the signal value for arm movement. The second parameter specifies the signal value for elbow movement. The third parameter specifies the color to be used for the current execution of the function, because the function is alternately executed with colors black and white for smooth transition. The fourth and fifth parameters take values either 0 or 1. The fourth parameter is set to 1 if the left hand is to be moved. The fifth parameter is set to 1 if the right hand is to be moved.

void leg_movement(int, int, int, int)
This function takes 4 integer parameters. This function is used to control the movement of the legs of the image. The first parameter specifies the signal value for leg movement. The second parameter specifies the color to be used for the current execution of the function, because the function is alternately executed with colors black and white for smooth transition. The third and fourth parameters take values either 0 or 1. The third parameter is set to 1 if the left leg is to be moved. The fourth parameter is set to 1 if the right leg is to be moved.

void lhand(int, int, int)
This function takes 3 integer parameters. This function is used to draw the left hand of the image. The first parameter specifies the color to be used for the current execution of the function, because the function is alternately executed with colors black and white for smooth transition. The second parameter specifies the signal value for left arm movement; it indicates the exact position of the left hand for that execution of the function. The third parameter specifies the signal value for left elbow movement; it indicates the exact position of the left elbow for that execution of the function.

void rhand(int, int, int)
This function takes 3 integer parameters. This function is used to draw the right hand of the image. The first parameter specifies the color to be used for the current execution of the function. The second parameter specifies the signal value for right arm movement; it indicates the exact position of the right hand for that execution of the function. The third parameter specifies the signal value for right elbow movement; it indicates the exact position of the right elbow for that execution of the function.

void pant(void)
This function takes no parameters. This function is used to draw the pant of the image. It accomplishes this job by executing other functions, some from the C graphics library and some user defined. It uses the leg_movement function to draw the legs of the image.

void rleg(int, int, int, int, int)
This function takes 5 integer parameters. This function is used to draw the right leg of the image. The first four parameters are the (x, y) positions of the two points used to draw lines that represent the border of the right leg. The last parameter specifies the color to be used for the current execution of the function, because the function is alternately executed with colors black and white for smooth transition.

void lleg(int, int, int, int, int)
This function takes 5 integer parameters. This function is used to draw the left leg of the image. The first four parameters are the (x, y) positions of the two points used to draw lines that represent the border of the left leg. The last parameter specifies the color to be used for the current execution of the function, because the function is alternately executed with colors black and white for smooth transition.
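The black/white color parameter implements a simple erase-and-redraw animation. A hedged sketch of the calling pattern, assuming the three-parameter lhand( ) described above (the signal values and delay are illustrative):

    /* Sketch: animate the left hand by erasing the old pose in BLACK
       and drawing the new pose in WHITE. Assumes initgraph() was called. */
    #include <graphics.h>
    #include <dos.h>

    void lhand(int color, int arm_signal, int elbow_signal);  /* project function */

    void animate_left_hand(int old_arm, int old_elbow,
                           int new_arm, int new_elbow)
    {
        lhand(BLACK, old_arm, old_elbow);   /* erase the previous pose */
        lhand(WHITE, new_arm, new_elbow);   /* draw the new pose       */
        delay(50);                          /* short pause, in ms      */
    }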

Appendix C

C.1 Peripheral Interface Standards

The large variety of plug-ins, add-ons and ancillary equipment forces the user to look a little carefully into the hardware and software aspects of interfacing various peripherals to a PC. A peripheral, in the strict sense, is a piece of equipment which in itself is a separate entity and gets attached to the PC through a specific connection protocol called an Interface Standard. If one looks inside a PC, different sockets have been provided for each one of the peripherals, and these are not interchangeable among themselves. All these sockets originate from printed circuit cards, which share a common bus, called the I/O channel bus, and sometimes the PC bus, a name given to the set of signals available on the 62-pin edge connectors on the PC motherboard. The open architecture of the IBM PC has helped its 62-pin I/O channel bus in becoming a universally accepted interface standard for joining extra hardware to a PC. The block diagram in Fig C.1 illustrates a typical scheme chosen for peripheral interface.

Fig C.1 Block diagram of a peripheral connected to the processor
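On the software side, the project's polling of such an interface card can be sketched with Turbo C's port I/O routines (the port address 0x300 is a common prototyping-card address and is only an assumption here):

    /* Sketch: poll a sensor byte from an interface card on the PC bus. */
    #include <stdio.h>
    #include <dos.h>                /* inportb() in Turbo C */

    #define CARD_PORT 0x300         /* assumed prototype-card address */

    int main(void)
    {
        unsigned char level = inportb(CARD_PORT);   /* read sensor levels */
        printf("Interface reports signal level %u\n", level);
        return 0;
    }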

BIBLIOGRAPHY:

1. Muhammad Ali Mazidi, The 8051 Microcontroller and Embedded Systems, Pearson Education Asia, 1999.
2. Kenneth J. Ayala, The 8051 Microcontroller, Prentice Hall, 3rd Edition.
3. Haptic Technologies, Proceedings of SPIE's 8th Annual International Symposium on Smart Structures and Materials, Newport, CA, 5-8 March 2001, Paper No. 4329-47. Copyright © 2001 SPIE.
4. http://www.atmel.com/prod/index.htm
5. www.haptech.com
6. www.wikipedia.org
