Almost everyone has used, seen, or read about computers. This is because the computer is an
integral part of our everyday existence, found in schools, banks, shops, railway stations, hospitals,
hotels, and more. Computers are present everywhere, making our work easier and faster. Since they
are such an integral part of our lives, everyone should know what they are and how they function.
The literal meaning of computer is to compute (calculate): it is a device that can calculate.
However, modern computers can do a lot more than calculate. Computers can accomplish our
tasks quickly and repeatedly, without getting bored and without committing errors.
The term computer is derived from the Latin word ‘computare’, which means to calculate. A
computer is a programmable machine and cannot do anything without a program. It represents
decimal numbers internally as strings of binary digits. The word ‘computer’ usually refers to the
central processing unit (CPU) plus internal memory.
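As a simple illustration of that binary representation, the following Python sketch (not part of the original text) converts a decimal number into its string of binary digits and back:

```python
def to_binary(n):
    """Return the binary digit string for a non-negative integer."""
    if n == 0:
        return "0"
    bits = ""
    while n > 0:
        bits = str(n % 2) + bits  # the remainder is the lowest bit
        n //= 2
    return bits

print(to_binary(13))    # 1101
print(int("1101", 2))   # 13
```

So the decimal number 13 is stored internally as the binary string 1101.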
A computer is an advanced electronic device that takes raw data as input from the user, processes
the data under the control of a set of instructions (called a program), gives the result (output),
and saves the output for future use. It can perform both numerical and non-numerical
(arithmetic and logical) operations.
A computer is an electronic device that receives input, stores or processes the input as per user
instructions, and provides output in the desired format.
A computer is a machine that can be programmed to accept data (input), process it into useful
information (output), and store it away (in a secondary storage device) for safekeeping or later
reuse. The processing of input to output is directed by the software but performed by the
hardware.
Charles Babbage is called the father of the computer. The first mechanical computer
designed by Charles Babbage was called the Analytical Engine. It used read-only memory in the
form of punched cards.
The arithmetic logic unit (ALU) performs operations such as the following.
(ii) Bit-Shifting Operations: This pertains to shifting the positions of the bits by a certain
number of places to the left or right; a left shift by n places multiplies the value by 2^n, while a
right shift divides it by 2^n.
(iii) Arithmetic Operations: This refers to bit addition and subtraction. Although multiplication
and division are sometimes implemented, these operations are more expensive to build in
hardware. Addition can be used to substitute for multiplication (repeated addition) and
subtraction for division (repeated subtraction).
An arithmetic logic unit is also known as an integer unit (IU).
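The shift-equals-multiply relationship, and the substitution of addition for multiplication, can be seen in a short Python sketch (illustrative values only):

```python
x = 5              # binary 101
print(x << 2)      # left shift by 2 places: 5 * 2**2 = 20
print(20 >> 2)     # right shift by 2 places: 20 // 2**2 = 5

def multiply_by_addition(a, b):
    """Multiply two non-negative integers using only addition."""
    result = 0
    for _ in range(b):
        result += a
    return result

print(multiply_by_addition(6, 7))   # 42
```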
(i) Hardwired control: The design is based on a fixed architecture. The CU is made up of
flip-flops, logic gates, digital circuits, and encoder and decoder circuits that are wired in a
specific and fixed way. When instruction set changes are required, wiring and circuit changes
must be made. This approach is preferred in reduced instruction set computing (RISC)
architectures, which have only a small number of instructions.
(ii) Microprogrammed control: Microprograms are stored in a special control memory and are
based on flowcharts. They can be replaced or modified without rewiring, which makes this
design simple and flexible.
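To make the contrast with hardwired control concrete, here is a minimal Python sketch of a microprogrammed control unit. The instruction names and micro-operations are hypothetical, chosen only for illustration: each instruction is a replaceable list of micro-operations looked up in control memory, so changing the instruction set means editing the table rather than rewiring circuits.

```python
# Hypothetical control memory: each instruction maps to its microprogram,
# an ordered list of micro-operation names (illustrative, not a real CPU).
CONTROL_MEMORY = {
    "LOAD":  ["fetch_address", "read_memory", "write_register"],
    "ADD":   ["read_registers", "alu_add", "write_register"],
    "STORE": ["fetch_address", "read_register", "write_memory"],
}

def execute(instruction):
    """Carry out an instruction by stepping through its microprogram."""
    steps = CONTROL_MEMORY[instruction]
    for micro_op in steps:
        print(f"  micro-op: {micro_op}")
    return steps

execute("ADD")
```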
INPUT / OUTPUT DEVICES
INPUT DEVICE
An input device is any hardware device that sends data to a computer, allowing you to interact
with and control it. A Logitech trackball mouse, for example, is an input device.
The most commonly used or primary input devices on a computer are the keyboard and mouse.
However, there are dozens of other devices that can also be used to input data into the computer.
Input Devices:
• Graphics Tablets
• Video Capture Hardware
• Trackballs
• Barcode reader
• Digital camera
• MIDI keyboard
• Gamepad
• Joystick
• Keyboard
• Cameras
• Microphone
• Mouse (pointing device)
• Scanner
• Webcam
• Touchpads
• Electronic Whiteboard
• OMR
• OCR
• Pen Input
• Punch card reader
• MICR (Magnetic Ink Character Reader)
• Magnetic Tape Drive
Some of the popular input devices are:
1. Keyboard
The keyboard is a basic input device that is used to enter data into a computer or any other
electronic device by pressing keys. It has different sets of keys for letters, numbers, characters,
and functions. Keyboards are connected to a computer through USB or a Bluetooth device for
wireless communication.
2. Mouse
The mouse is a hand-held input device used to move the cursor or pointer across the screen.
It is designed to be used on a flat surface and generally has left and right buttons with a scroll
wheel between them. Laptop computers come with a touchpad that works as a mouse: it lets you
control the movement of the cursor or pointer by moving your finger over the touchpad. Some
mice come with integrated features such as extra buttons that can be assigned different functions.
The mouse was invented by Douglas C. Engelbart in 1963. Early mice had a roller ball
integrated as a movement sensor underneath the device. Modern mice use optical technology
that tracks cursor movement with a visible or invisible light beam. A mouse is connected to a
computer through different ports, depending on the type of computer and the type of mouse.
3. Scanner
A scanner takes pictures and pages of text as input. It scans the picture or document, which is
then converted into a digital file and displayed on the screen as output. A scanner can use
optical character recognition (OCR) techniques to convert scanned images of text into editable
digital text.
4. Joystick
A joystick is also a pointing input device like a mouse. It is made up of a stick with a spherical
base. The base is fitted in a socket that allows free movement of the stick. The movement of stick
controls the cursor or pointer on the screen.
The first joystick was invented by C. B. Mirick at the U.S. Naval Research Laboratory.
Joysticks come in different types, such as displacement joysticks, finger-operated joysticks,
hand-operated joysticks, isometric joysticks, and more. With a joystick, the cursor keeps moving
in the direction the stick is tilted until the stick is returned upright, whereas with a mouse the
cursor moves only when the mouse moves.
5. Light Pen
A light pen is a computer input device that looks like a pen. The tip of the light pen contains a
light-sensitive detector that enables the user to point to or select objects on the display screen.
Its light-sensitive tip detects the object's location and sends the corresponding signals to the CPU.
Because it is not compatible with LCD screens, it is rarely used today. It can also be used to draw
on the screen. The first light pen was invented around 1955 as part of the Whirlwind project at
the Massachusetts Institute of Technology (MIT).
6. Digitizer
Digitizer is a computer input device that has a flat surface and usually comes with a stylus. It
enables the user to draw images and graphics using the stylus as we draw on paper with a pencil.
The images or graphics drawn on the digitizer appear on the computer monitor or display screen.
The software converts the touch inputs into lines and can also convert handwritten text to
typewritten words.
It can be used to capture handwritten signatures and to trace data or images from papers taped to its surface.
Furthermore, it is also used to receive information in the form of drawings and send output to a
CAD (Computer-aided design) application and software like AutoCAD. Thus, it allows you to
convert hand-drawn images into a format suitable for computer processing.
7. Microphone
The microphone is a computer input device used to input sound. It receives sound vibrations
and converts them into audio signals or sends them to a recording medium. The audio signals
are converted into digital data and stored in the computer. The microphone also enables the
user to telecommunicate with others, and it is used to add sound to presentations and with
webcams for video conferencing.
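The conversion from sound vibrations to digital data happens by sampling and quantization. The following Python sketch (a simplified illustration, not an actual audio driver) samples a 440 Hz tone and quantizes each sample to an 8-bit value:

```python
import math

SAMPLE_RATE = 8000   # samples taken per second
FREQ = 440.0         # frequency of the tone in Hz

def sample_tone(n_samples):
    """Sample a sine wave and quantize each sample to an 8-bit value (0-255)."""
    samples = []
    for i in range(n_samples):
        t = i / SAMPLE_RATE                            # time of this sample
        amplitude = math.sin(2 * math.pi * FREQ * t)   # analog value in [-1, 1]
        digital = round((amplitude + 1) / 2 * 255)     # quantize to 0..255
        samples.append(digital)
    return samples

print(sample_tone(4))
```

Real sound cards do the same thing in hardware, typically at 44,100 samples per second with 16-bit values.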
8. Magnetic Ink Character Recognition (MICR)
The MICR input device is designed to read text printed with magnetic ink. MICR is a
character recognition technology that makes use of special magnetized ink that is sensitive to
magnetic fields. It is widely used in banks to process cheques, and in other organizations where
security is a major concern. It can process around three hundred cheques per minute with a very
high degree of accuracy. The details at the bottom of a cheque (the MICR number) are written
in magnetic ink, and a laser printer with MICR toner can be used to print it.
The device reads the details and sends them to a computer for processing. A document printed
in magnetic ink is passed through a machine that magnetizes the ink, and the magnetic
information is then translated into characters.
9. Optical Character Reader (OCR)
The OCR input device is designed to convert scanned images of handwritten, typed, or
printed text into digital text. It is widely used in offices and libraries to convert documents and
books into electronic files.
It processes and copies the physical form of a document using a scanner. After copying the
document, the OCR software converts it into a two-color (black-and-white) version called a
bitmap. The bitmap is then analyzed for light and dark areas: dark areas are identified as
characters, and light areas as background. OCR is widely used to convert hard-copy legal or
historical documents into PDFs. The converted documents can then be edited, if required, just
like documents created in Microsoft Word.
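The two-color bitmap step described above amounts to thresholding: every pixel darker than a cutoff is treated as part of a character, and everything lighter as background. A minimal Python sketch, using a made-up grayscale grid in place of a real scanned image:

```python
THRESHOLD = 128  # cutoff between "dark" (ink) and "light" (background)

def to_bitmap(grayscale_rows):
    """Map each 0-255 grayscale pixel to 1 (dark/character) or 0 (light/background)."""
    return [[1 if pixel < THRESHOLD else 0 for pixel in row]
            for row in grayscale_rows]

# A tiny fake scan: the dark pixels form a vertical stroke down the middle
scan = [
    [250,  30, 245],
    [240,  25, 250],
    [235,  20, 240],
]
print(to_bitmap(scan))   # [[0, 1, 0], [0, 1, 0], [0, 1, 0]]
```

Real OCR software then matches such patterns of dark pixels against known character shapes.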
OUTPUT DEVICE
An output device is any device used to send data from a computer to another device or user.
Most computer data output that is meant for humans is in the form of audio or video. Thus, most
output devices used by humans are in these categories. Examples include monitors, projectors,
speakers, headphones and printers.
OUTPUT DEVICES:
• LCD Projection Panels
• Monitor (LED, LCD, CRT etc)
• Printers (all types)
• Plotters
• Microfiche
• Projector
• Head Phone
• Computer Output Microfilm (COM)
• Speaker(s)
• Visual Display Unit
• Film Recorder
Following are some of the important output devices used in a computer.
1. Monitors
Monitors, commonly called visual display units (VDUs), are the main output device of a
computer. A monitor forms images from tiny dots, called pixels, that are arranged in a
rectangular grid. The sharpness of the image depends on the number of pixels.
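That dependence on pixel count can be quantified as pixel density (pixels per inch, PPI): the number of pixels along the screen diagonal divided by the diagonal size. A quick Python calculation with illustrative figures:

```python
import math

def pixels_per_inch(width_px, height_px, diagonal_inches):
    """Pixel density = diagonal resolution in pixels / diagonal size in inches."""
    diagonal_px = math.sqrt(width_px ** 2 + height_px ** 2)
    return diagonal_px / diagonal_inches

# A 1920x1080 monitor with a 24-inch diagonal
print(round(pixels_per_inch(1920, 1080, 24), 1))   # about 91.8 PPI
```

The same resolution on a smaller screen gives a higher PPI, and therefore a sharper image.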
There are two kinds of viewing screen used for monitors.
• Cathode-Ray Tube (CRT)
• Flat-Panel Display
2. Printer
A printer produces hard copies of the processed data. It enables the user to print images, text, or
any other information onto paper.
Based on the printing mechanism, the printers are of two types: Impact Printers and Non-impact
Printers.
(a) Impact Printers: They are of two types:
(i) Character Printers
• Dot Matrix printers
• Daisy Wheel printers
(ii) Line printers
• Drum printers
• Chain printers
(b) Non-impact printers: They are of two types:
• Laser printers
• Inkjet printers
3. Projector
A projector is an output device that enables the user to project the output onto a large surface
such as a big screen or wall. It can be connected to a computer and similar devices to project
their output onto a screen. It uses light and lenses to produce magnified texts, images, and
videos. So, it is an ideal output device to give presentations or to teach a large number of people.
Modern projectors (digital projectors) come with multiple input sources, such as HDMI ports for
newer equipment and VGA ports that support older devices. Some projectors are designed to
support Wi-Fi and Bluetooth as well. They can be fixed onto the ceiling, placed on a stand, and
so on, and are frequently used for classroom teaching, presentations, home cinemas, etc.
A digital projector can be of two types:
(i) Liquid Crystal Display (LCD) digital projector: This type of digital projector is very
popular because it is lightweight and provides crisp output. An LCD projector uses transmissive
technology to produce output: it allows the light source, a standard lamp, to pass through three
colored liquid crystal panels. Some colors pass through the panels and some are blocked, and
thus images are formed on the screen.
(ii) Digital Light Processing (DLP) digital projector: It has a set of tiny mirrors, with a separate
mirror for each pixel of the image, and thus provides high-quality images. These projectors are
mostly used in theatres, as they fulfill the requirement for high-quality video output.
4. Speakers – speakers are attached to computers to facilitate the output of sound; a sound card
is required in the computer for speakers to function. Speakers range from simple two-speaker
output devices all the way up to surround-sound multi-channel units.
5. Headset – this is a combination of speakers and a microphone. It is mostly used by gamers,
and it is also a great tool for communicating with family and friends over the Internet using a
VoIP program.
6. Plotter – this generates a hard copy of a digitally depicted design. The design is sent to the
plotter through a graphics card, and the design is formed by using a pen. It is generally used
with engineering applications, and essentially draws a given image using a series of straight
lines.
When a computer or other device is not turned on or connected to other devices, it is said to be
"offline." This is the opposite of being "online," when a device can readily communicate with
other devices. Offline can also mean not being connected to the Internet.
Generations of Computer Technology
A computer is an electronic device that manipulates information or data. It has the ability to
store, retrieve, and process data.
Nowadays, a computer can be used to type documents, send email, play games, and browse the
Web. It can also be used to edit or create spreadsheets, presentations, and even videos. But the
evolution of this complex system started around 1946 with the first generation of computers,
and it has been evolving ever since.
This is especially true for the generations who have grown up from infancy within the global
desktop and laptop revolution since the 1980s.
The history of the computer goes back several decades, however, and there are five definable
generations of computers.
Each generation is defined by a significant technological development that changes
fundamentally how computers operate – leading to more compact, less expensive, but more
powerful, efficient and robust machines.
There are five generations of computers:
1940 – 1956: First Generation – Vacuum Tubes
These early computers used vacuum tubes as circuitry and magnetic drums for memory. As a
result they were enormous, literally taking up entire rooms and costing a fortune to run. Vacuum
tubes were inefficient components that consumed huge amounts of electricity and generated a
lot of heat, which caused ongoing breakdowns.
These first generation computers relied on ‘machine language’ (which is the most basic
programming language that can be understood by computers). These computers were limited to
solving one problem at a time. Input was based on punched cards and paper tape. Output came
out on print-outs. The two notable machines of this era were the UNIVAC and ENIAC machines;
the UNIVAC was the first ever commercial computer, purchased in 1951 by a business, the US
Census Bureau.
1956 – 1963: Second Generation – Transistors
The replacement of vacuum tubes by transistors saw the advent of the second generation of
computing. Although first invented in 1947, transistors weren’t used significantly in computers
until the end of the 1950s. Although transistors still subjected computers to damaging levels of
heat, they were hugely superior to vacuum tubes, making computers smaller, faster, cheaper,
and less heavy on electricity use. They
still relied on punched cards for input and printouts for output.
The language evolved from cryptic binary language to symbolic (‘assembly’) languages. This
meant programmers could create instructions in words. About the same time high level
programming languages were being developed (early versions of COBOL and FORTRAN).
Transistor-driven machines were the first computers to store instructions in their memories,
moving from magnetic drum to magnetic core technology. The early versions of these
machines were developed for the atomic energy industry.
1964 – 1971: Third Generation – Integrated Circuits
By this phase, transistors were now being miniaturised and put on silicon chips (called
semiconductors). This led to a massive increase in speed and efficiency of these
machines. These were the first computers where users interacted using keyboards and monitors
which interfaced with an operating system, a significant leap up from the punch cards and
printouts. This enabled these machines to run several applications at once using a central
program which functioned to monitor memory.
As a result of these advances which again made machines cheaper and smaller, a new mass
market of users emerged during the ‘60s.
1972 – 2010: Fourth Generation – Microprocessors
This revolution can be summed up in one word: Intel. The chip-maker developed the Intel 4004
chip in 1971, which placed the computer's core components (CPU, memory, input/output
controls) onto a single chip. What filled a room in the 1940s now fit in the palm of the hand.
The Intel chip housed thousands of integrated circuits. The year 1981 saw the first computer
(from IBM) specifically designed for home use, and 1984 saw the Macintosh introduced by
Apple.
Microprocessors even moved beyond the realm of computers and into an increasing number of
everyday products.
The increased power of these small computers meant they could be linked together, creating
networks, which ultimately led to the development, birth, and rapid evolution of the Internet.
Other major advances during this period have been the graphical user interface (GUI), the
mouse, and more recently the astounding advances in laptop capability and hand-held devices.
2010 onwards: Fifth Generation – Artificial Intelligence
Computer devices with artificial intelligence are still in development, but some of these
technologies are beginning to emerge and be used such as voice recognition.
AI is a reality made possible by using parallel processing and superconductors. Looking to the
future, computers will be radically transformed again by quantum computing, molecular
computing, and nanotechnology.
The essence of the fifth generation will be using these technologies to ultimately create machines
that can process and respond to natural language, and that have the capability to learn and
organise themselves.
Programming Languages
Types of Computer languages
Just as humans use language to communicate, and different regions have different languages,
computers also have their own languages that are specific to them.
Different kinds of languages have been developed to perform different types of work on the
computer. Basically, languages can be divided into two categories according to how the
computer understands them.
Two Basic Types of Computer Language
1. Low-Level Languages
Low-level computer languages are either machine codes or very close to them. A computer
cannot understand instructions given to it in high-level languages or in English. It can only
understand and execute instructions given in the form of machine language, i.e. binary. There
are two types of low-level languages:
• Machine Language: a language that is directly executed by the hardware
• Assembly Language: a slightly more user-friendly language that directly corresponds to machine
language
2. High-Level Languages
High-level computer languages use formats that are similar to English. The purpose of
developing high-level languages was to enable people to write programs easily, in their own
native language environment (English).
High-level languages are basically symbolic languages that use English words and/or
mathematical symbols rather than mnemonic codes. Each instruction in the high-level language
is translated into many machine language instructions that the computer can understand.
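This one-to-many translation can be observed directly in Python, whose standard `dis` module shows the lower-level bytecode instructions that a single high-level statement compiles into (bytecode rather than true machine language, but the principle is the same):

```python
import dis

def add(a, b):
    return a + b   # one high-level statement...

dis.dis(add)       # ...expands into several bytecode instructions
```

Running this prints several instructions (loads, an add operation, a return) for the single line `return a + b`.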