
Computer

 History of computer
Computers and electronics play an enormous role in today's
society, impacting everything from communication and
medicine to science.
Although computers are typically viewed as a modern
invention involving electronics, computing predates the use of
electrical devices. The ancient abacus was perhaps the first
digital computing device. Analog computing dates back
several millennia: primitive computing devices were used as
early as the ancient Greeks and Romans, the best-known and
most complex of which is the Antikythera mechanism. Later
devices such as the castle clock (1206), slide rule (c. 1624) and
Babbage's Difference Engine (1822) are other examples of
early mechanical analog computers.
The introduction of electric power in the 19th century led to
the rise of electrical and hybrid electro-mechanical devices to
carry out both digital (Hollerith punch-card machine) and
analog (Bush’s differential analyzer) calculation. Telephone
switching came to be based on this technology, which led to
the development of machines that we would recognize as early
computers.
The presentation of the Edison Effect in 1885 provided the
theoretical background for electronic devices. Originally in
the form of vacuum tubes, electronic components were
rapidly integrated into electric devices, revolutionizing radio
and later television. It was in computers, however, that the
full impact of electronics was felt. Analog computers used to
calculate ballistics were crucial to the outcome of World War
II, and the Colossus and the ENIAC, the two earliest
electronic digital computers, were developed during the war.

 Generations of computers
Introduction:
A computer is an electronic device that manipulates
information or data. It has the ability to store, retrieve, and
process data.
Nowadays, a computer can be used to type documents,
send email, play games, and browse the Web. It can also
be used to edit or create spreadsheets, presentations, and
even videos. But the evolution of this complex system
started around 1940 with the first generation of computers
and has been evolving ever since.
There are five generations of computers.
1. FIRST GENERATION
1. 1946-1959 is the period of first-generation computers.
2. SECOND GENERATION
1. 1959-1965 is the period of second-generation computers.
2. These computers were based on transistors.
3. THIRD GENERATION
1. 1965-1971 is the period of third-generation computers.
2. These computers were based on integrated circuits.
4. FOURTH GENERATION
1. 1971-1980 is the period of fourth-generation computers.
2. This technology is based on the microprocessor.
5. FIFTH GENERATION
1. 1980 onwards is the period of fifth-generation computers.

 Definition of computer
 A computer is a machine that can be programmed to
manipulate symbols. Its principal characteristics are:
 It responds to a specific set of instructions in a well-
defined manner.
 It can execute a prerecorded list of instructions (a
program).
 It can quickly store and retrieve large amounts of
data.
Therefore computers can perform complex and repetitive
procedures quickly, precisely and reliably. Modern
computers are electronic and digital. The actual
machinery (wires, transistors, and circuits) is called
hardware; the instructions and data are called software.
All general-purpose computers require the following
hardware components:
 Central processing unit (CPU): The heart of the
computer, this is the component that actually executes
instructions organized in programs ("software") which
tell the computer what to do.
 Memory (fast, expensive, short-term memory): Enables
a computer to store, at least temporarily, data,
programs, and intermediate results.
 Mass storage device (slower, cheaper, long-term
memory): Allows a computer to permanently retain
large amounts of data and programs between jobs.
Common mass storage devices include disk drives and
tape drives.
 Supercomputer and mainframe
 Supercomputer is a broad term for one of the fastest
computers currently available.
Supercomputers are very expensive and are employed
for specialized applications that require immense
amounts of mathematical calculations (number
crunching). For example, weather forecasting requires a
supercomputer. Other uses of supercomputers include
scientific simulations, (animated) graphics, fluid dynamic
calculations, nuclear energy research, electronic design,
and analysis of geological data (e.g. in petrochemical
prospecting). Perhaps the best known supercomputer
manufacturer is Cray Research.
 Mainframe was a term originally referring to the
cabinet containing the central processor unit or "main
frame" of a room-filling Stone Age batch machine.
After the emergence of smaller "minicomputer" designs
in the early 1970s, the traditional big iron machines
were described as "mainframe computers" and
eventually just as mainframes. Nowadays a mainframe
is a very large and expensive computer capable of
supporting hundreds, or even thousands, of users
simultaneously. The chief difference between a
supercomputer and a mainframe is that a
supercomputer channels all its power into executing a
few programs as fast as possible, whereas a mainframe
uses its power to execute many programs concurrently.

 Personal computer
Personal computers first appeared in the late 1970s. One
of the first and most popular personal computers was the
Apple II, introduced in 1977 by Apple Computer. During
the late 1970s and early 1980s, new models and competing
operating systems seemed to appear daily. Then, in 1981,
IBM entered the fray with its first personal computer,
known as the IBM PC. The IBM PC quickly became the
personal computer of choice, and most other personal
computer manufacturers fell by the wayside. PC is short
for personal computer or IBM PC. One of the few
companies to survive IBM's onslaught was Apple
Computer, which remains a major player in the personal
computer marketplace. Other companies adjusted to
IBM's dominance by building IBM clones, computers that
were internally almost the same as the IBM PC, but that
cost less. Because IBM clones used the same
microprocessors as IBM PCs, they were capable of
running the same software. Over the years, IBM has lost
much of its influence in directing the evolution of PCs.

 Mouse: the input device
A mouse is a pointing device that allows a user to input
spatial data to a computer. In the case of mice and
touchpads, this is usually achieved by detecting movement
across a physical surface. Analog devices, such as 3D mice, joysticks,
or pointing sticks, function by reporting their angle of
deflection. Movements of the pointing device are echoed
on the screen by movements of the pointer, creating a
simple, intuitive way to navigate a computer's graphical
user interface (GUI).
Pointing devices, which are input devices used to specify a
position in space, can further be classified according to:

 Whether the input is direct or indirect. With direct input,
the input space coincides with the display space, i.e.
pointing is done in the space where visual feedback or
the pointer appears. Touchscreens and light
pens involve direct input. Examples involving indirect
input include the mouse and trackball.
 Whether the positional information is absolute (e.g. on a
touch screen) or relative (e.g. with a mouse that can be
lifted and repositioned).
For pointing devices, direct input is almost necessarily
absolute, but indirect input may be either absolute or
relative. For example, digitizing graphics tablets that do
not have an embedded screen involve indirect input and
sense absolute positions and are often run in an absolute
input mode, but they may also be set up to simulate a
relative input mode like that of a touchpad, where the
stylus or puck can be lifted and repositioned. Embedded
LCD tablets, also referred to as graphics tablet monitors,
are an extension of digitizing graphics tablets.
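
To make the absolute/relative distinction concrete, here is a minimal
Python sketch; the screen size and event values are invented for
illustration:

    class Pointer:
        """Tracks an on-screen pointer position."""
        def __init__(self, width, height):
            self.width, self.height = width, height
            self.x, self.y = 0, 0

        def absolute_input(self, x, y):
            # Absolute input (e.g. a touchscreen): the reported
            # position simply becomes the pointer position.
            self.x, self.y = x, y

        def relative_input(self, dx, dy):
            # Relative input (e.g. a mouse): deltas accumulate,
            # clamped to the edges of the screen.
            self.x = min(max(self.x + dx, 0), self.width - 1)
            self.y = min(max(self.y + dy, 0), self.height - 1)

    pointer = Pointer(1920, 1080)
    pointer.absolute_input(100, 200)   # a tap lands at (100, 200)
    pointer.relative_input(-30, 15)    # a mouse move shifts it to (70, 215)
    print(pointer.x, pointer.y)

Lifting and repositioning a mouse generates no deltas, which is why a
relative device can be repositioned without moving the pointer.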

 Introduction to the Boolean algebra

 In mathematics and mathematical logic, Boolean
algebra is the branch of algebra in
which the values of the variables are the truth
values true and false, usually denoted 1 and 0
respectively. Unlike elementary algebra, where the
values of the variables are numbers and the prime
operations are addition and multiplication, the
main operations of Boolean algebra are
the conjunction (and) denoted as ∧,
the disjunction (or) denoted as ∨, and
the negation (not) denoted as ¬. It is thus a
formalism for describing logical operations in the
same way that elementary algebra describes
numerical operations (a short example follows this
section).
 Boolean algebra was introduced by George
Boole in his first book The Mathematical
Analysis of Logic (1847), and set forth more
fully in his An Investigation of the Laws of
Thought (1854).[1] According to Huntington, the
term "Boolean algebra" was first suggested
by Sheffer in 1913,[2] although Charles Sanders
Peirce in 1880 gave the title "A Boolian Algebra
with One Constant" to the first chapter of his
"The Simplest Mathematics".[3] Boolean algebra
has been fundamental in the development
of digital electronics, and is provided for in all modern
programming languages.
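
These operations map directly onto Python's and, or, and not
operators, as the following small sketch shows; it also checks one of
De Morgan's laws over all truth-value combinations:

    for p in (False, True):
        for q in (False, True):
            conjunction = p and q   # p ∧ q
            disjunction = p or q    # p ∨ q
            negation = not p        # ¬p
            # De Morgan's law: ¬(p ∧ q) = (¬p) ∨ (¬q)
            assert (not (p and q)) == ((not p) or (not q))
            print(p, q, conjunction, disjunction, negation)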

 History of computer architecture

The first documented computer architecture was in the
correspondence between Charles Babbage and Ada Lovelace,
describing the analytical engine. When building the computer Z1 in
1936, Konrad Zuse described in two patent applications for his
future projects that machine instructions could be stored in the
same storage used for data, i.e., the stored-program
concept.[3][4] Two other early and important examples are:

 John von Neumann's 1945 paper, First Draft of a Report on the
EDVAC, which described an organization of logical
elements;[5] and
 Alan Turing's more detailed Proposed Electronic Calculator for
the Automatic Computing Engine, also from 1945, which
cited John von Neumann's paper.[6]
The term “architecture” in computer literature can be traced to the
work of Lyle R. Johnson and Frederick P. Brooks, Jr., members of
the Machine Organization department in IBM's main research
center in 1959. Johnson had the opportunity to write a proprietary
research communication about the Stretch, an IBM-
developed supercomputer for Los Alamos National Laboratory (at
the time known as Los Alamos Scientific Laboratory). To describe
the level of detail for discussing the luxuriously embellished
computer, he noted that his description of formats, instruction
types, hardware parameters, and speed enhancements was at the
level of “system architecture”, a term that seemed more useful than
“machine organization”.[7]
Subsequently, Brooks, a Stretch designer, opened Chapter 2 of a
book called Planning a Computer System: Project Stretch by
stating, "Computer architecture, like other architecture, is the art of
determining the needs of the user of a structure and then designing
to meet those needs as effectively as possible within economic and
technological constraints."[8]

 Input unit
An input device gives signals to an information
processing system such as a computer or information
appliance. Examples of input devices
include keyboards, mice, scanners, digital
cameras, joysticks, and microphones.
Input devices can be categorized based on:

 modality of input (e.g. mechanical motion, audio, visual, etc.)
 whether the input is discrete (e.g. pressing of a key) or
continuous (e.g. a mouse's position, though digitized
into a discrete quantity, is fast enough to be considered
continuous)
 the number of degrees of freedom involved (e.g. two-
dimensional traditional mice, or three-dimensional
navigators designed for CAD applications)

Some examples of input devices are given below:

 Keyboard
 Mouse
 High-degree-of-freedom input devices
 Composite devices
 Video input devices
 Audio input devices
 Punched paper
 Joystick
 Touchscreen
 Touch kiosk
 Monitor
A display device is the most common form of output device. It
presents output visually on the computer screen. The output
appears temporarily on the screen and can easily be altered or
erased; for this reason it is sometimes called soft copy. The
display device for a desktop PC is called a monitor.
With all-in-one PCs, notebook computers, handheld PCs and
other devices, the term display screen is used for the display
device. The display devices are also used in home
entertainment systems, mobile systems, cameras and video
games.
Types of Display (Monitor)
Monochrome Display
A monochrome monitor is a type of CRT computer display
which was very common in the early days of computing, from
the 1960s through the 1980s, before color monitors became
popular. The most important component in the monitor is the
picture tube; CRT means cathode ray tube.[4] CRT monitors
use cathode-ray-tube technology to display images, so they are
large, bulky, and heavy, like old televisions, which also used
CRT technology to display images. To form the image on the
screen, an electron gun sealed inside the tube fires beams of
electrons at the screen. Other types of display include:
 TFT (Thin-film transistor),
 flat panel[6]
 LCD (Liquid Crystal Display)
 OLED
 LED
 Printer: the output device
A printer is an external hardware output device that takes
the electronic data stored on a computer or other device and
generates a hard copy of it. For example, if you created a
report on your computer, you could print several copies to
hand out at a staff meeting. Printers are one of the most
popular computer peripherals and are commonly used to
print text and photos. The picture is an example of an inkjet
computer printer, the Lexmark Z605.

Types of printers
Below is a list of the different types of computer printers.
Today, the most common printers used with a computer are
inkjet and laser printers.

 3D printer
 AIO (all-in-one) printer
 Dot matrix printer
 Inkjet printer
 Laser printer
 LED printer
 MFP (multifunction printer)
 Plotter
 Thermal printer

 Computer software
 Computer software, or simply software, is a collection
of data or computer instructions that tell the computer how to
work. This is in contrast to physical hardware, from which the
system is built and actually performs the work. In computer
science and software engineering, computer software is
all information processed by computer
systems, programs and data. Computer software
includes computer programs, libraries and related non-
executable data, such as online documentation or digital
media. Computer hardware and software require each other
and neither can be realistically used on its own.
 Application software
which is software that uses the computer system to perform
special functions or provide entertainment functions beyond
the basic operation of the computer itself. There are many
different types of application software, because the range of
tasks that can be performed with a modern computer is so
large—see list of software.
 System software
which is software for managing computer
hardware behaviour, so as to provide basic functionalities
that are required by users, or for other software to run
properly, if at all. System software is also designed to provide
a platform for running application software,[11] and it includes
the following:
 Operating system
 Device driver

 Computer virus
A computer virus is a type of computer program that,
when executed, replicates itself by
modifying other computer programs and inserting its own
code.[1] When this replication succeeds, the affected areas
are then said to be "infected" with a computer virus.[2][3]
Virus writers use social engineering deceptions and
exploit detailed knowledge of security vulnerabilities to
initially infect systems and to spread the virus. The vast
majority of viruses target systems running Microsoft
Windows,[4][5][6] employing a variety of mechanisms to infect
new hosts,[7] and often using complex anti-detection/stealth
strategies to evade antivirus software.[8][9][10][11] Motives for
creating viruses can include seeking profit (e.g.,
with ransomware), desire to send a political message,
personal amusement, to demonstrate that a vulnerability
exists in software, for sabotage and denial of service, or
simply because they wish to
explore cybersecurity issues, artificial life and evolutionary
algorithms.[12]
Computer viruses currently cause billions of dollars' worth
of economic damage each year,[13] due to causing system
failure, wasting computer resources, corrupting data,
increasing maintenance costs, stealing personal
information etc. In response, free, open-source antivirus
tools have been developed, and an industry of antivirus
software has cropped up, selling or freely distributing
virus protection to users of various operating
systems.[14] As of 2005, even though no currently existing
antivirus software was able to uncover all computer
viruses (especially new ones), computer security
researchers are actively searching for new ways to enable
antivirus solutions to more effectively detect emerging
viruses, before they become widely distributed.
 Memory unit
 Computer memory is a temporary storage area. It holds
the data and instructions that the Central Processing
Unit (CPU) needs. Before a program can run, the
program is loaded from storage into the memory. This
allows the CPU direct access to the computer program.
Memory is needed in all computers.
 A computer is usually a binary digital electronics device.
Binary means it has only two states. On or Off. Zero or
One. In a binary digital computer transistors are used to
switch the electricity on and off. The computer's memory
is made from lots of transistors.
 Each on/off setting in the computer's memory is called
a binary digit or bit. A group of eight bits is called a byte.
A byte is made from two nibbles of four bits each.
Computer scientists made up the words bit and byte.
The word bit is short for binary digit. It takes bi from
binary and adds the t from digit. A collection of bits was
called a bite. The computer scientists changed the
spelling to byte to avoid confusion. When the computer
scientists needed a word for half a byte, they
thought nibble, as in half a bite, would be a fun word to
choose.[1]

Characters in memory
A byte of memory is used to store a code to represent a
character such as a number, a letter or a symbol. Eight bits
can store 256 different codes. This was thought enough
and a byte became fixed at eight bits. This allows the
ten decimal digits, the 26 lower-case and 26 upper-case
letters, and many other symbols each to have their own code.
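
A few lines of Python make the counts above concrete; the example
byte value here is arbitrary:

    bits_per_byte = 8
    codes = 2 ** bits_per_byte         # a byte can hold 2^8 = 256 codes
    byte = 0b10110100                  # an arbitrary example byte
    high_nibble = (byte >> 4) & 0xF    # the upper four bits
    low_nibble = byte & 0xF            # the lower four bits
    print(codes)                       # 256
    print(bin(high_nibble), bin(low_nibble))  # 0b1011 0b100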

 Operating system
An operating system acts as an intermediary
between the user of a computer and computer
hardware. The purpose of an operating system
is to provide an environment in which a user can
execute programs in a convenient and efficient
manner.
An operating system is a program that controls
the execution of application programs and acts
as an interface between the user of a computer
and the computer hardware.
A more common definition is that the operating
system is the one program running at all times
on the computer (usually called the kernel), with
all else being application programs.
 An operating system is concerned with the
allocation of resources and services, such as
memory, processors, devices, and information.
The operating system correspondingly includes
programs to manage these resources, such as a
traffic controller, a scheduler, memory
management module, I/O programs, and a file
system.

 Features of operating system

Almost all operating systems have many of the same capabilities
or features because users have the same basic needs (such as
running programs, managing files, and connecting to the
Internet etc.) no matter which OS they use. Every OS also gives
users the capability to navigate with shortcut keys, create screen
captures, and configure accessibility options. Every OS has many
features, but in this blog I will cover the following main features
of any operating system.

 Programs Execution
 File Management
 Connecting to Internet
 Navigation with hot keys
 Screen capturing
 Accessibility options configuration

Programs Execution
All operating systems can execute programs, which means they
can run programs or applications that do something useful. We
can run a program through both a GUI and a CLI. In a GUI
environment, we run a program by double-clicking its icon or
button, while in a CLI we run a program by typing a command
in the command terminal.
[Figure: execution of the program “AeroWeather” in Windows 7]
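
As a small illustration of CLI-style execution, this Python sketch
launches another program as a child process; it runs a second copy of
the Python interpreter so the example stays portable:

    import subprocess
    import sys

    # Run a program the way a shell does when you type its command;
    # sys.executable is the path of the running Python interpreter.
    completed = subprocess.run(
        [sys.executable, "-c", "print('hello from a program')"]
    )
    print("exit status:", completed.returncode)  # 0 on success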

 Types and functions of operating systems

Multi-user: Allows two or more users to run their programs at
the same time. Some operating systems permit hundreds or even
thousands of users simultaneously.

Single-user: Allows just one user to use programs at a time.

Multiprocessor: Supports running the same program on more than
one CPU.

Multitasking: Allows multiple programs to run at the same time.

Single-tasking: Allows only one program to run at any one time.

Real time: Responds to input instantly. General-purpose operating
systems, such as DOS and UNIX, do not work in real time.

 Processor management:
The operating system manages the distribution of processor time
among programs using a scheduling algorithm (a sketch of one such
algorithm follows this list).

 Random access memory management:
The operating system manages the memory space allocated for each
application and each user, if appropriate. When physical memory is
insufficient, the O.S creates an area of memory on the hard drive, called
“virtual memory.” Virtual memory permits you to run applications that
require a capacity of memory beyond available RAM in the system.
However, this memory is much slower.

 Input/output management:
The operating system unifies and controls program access to hardware
resources through drivers (also known as device managers).
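
As mentioned under processor management, here is a toy Python sketch
of one classic scheduling algorithm, round robin; the process names,
burst times, and time quantum are invented for illustration:

    from collections import deque

    def round_robin(processes, quantum):
        """processes: list of (name, burst_time) pairs.
        Returns the order in which processes finish."""
        queue = deque(processes)
        finished = []
        while queue:
            name, remaining = queue.popleft()
            if remaining > quantum:
                # Time slice used up: go to the back of the queue.
                queue.append((name, remaining - quantum))
            else:
                finished.append(name)  # done within this slice
        return finished

    print(round_robin([("P1", 5), ("P2", 3), ("P3", 8)], quantum=2))
    # -> ['P2', 'P1', 'P3']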
 Introduction to the Windows operating system

In this topic, we are going to learn about the introduction to
Windows. Microsoft Windows is a multitasking operating system
developed by Microsoft Corporation which uses a Graphical User
Interface to interact with its users. Microsoft was originally named
“Traf-O-Data” in 1972, was renamed “Micro-soft” in November 1975,
and then “Microsoft” on November 26, 1976. Microsoft entered the
marketplace in August 1981 by releasing version 1.0 of the operating
system Microsoft DOS (MS-DOS), a 16-bit command-line operating
system. Bill Gates and Paul Allen founded Microsoft, and the
Windows operating system has been its primary product.

The latest OS release of Windows covered in this introduction is
Windows 10, which was launched in 2015.

 In a nutshell, Microsoft Windows has evolved through many
versions over time.
 Thus Windows has many uses and great importance.


 Programming
A programming language is a set of rules that provides a
platform for instructing a computer to perform specific
tasks. It is classified into five subcategories: machine
language, assembly language, third-generation, fourth-
generation and fifth-generation languages.

a. Machine language: Machine language is the first
language of a computer system, and it remains the
language of the CPU to this day. In the early days of
computing, there was no complex hardware or software,
so machine language was used for data input/output and
processing.
Merits:
i. Machine language does not require a translation process,
because it is the language of the CPU, so no translator
program is needed.
ii. The execution time of a machine language program is
extremely fast.

The programming languages that are close to human
language are called high-level programming languages. The
characteristics of high-level languages are (a short example
follows this list):
a. Easy to learn
b. Easy to find errors
c. Machine-Independent
d. Availability of Library Functions
e. Shorter Programs
f. Well-Defined Syntax and Standard
g. Source code understandable by any other program
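
As a tiny example of these characteristics, the following Python
program is short, readable, machine-independent, and uses a library
function:

    import math

    # One readable line of logic; the library function does the work.
    print(math.sqrt(144))  # prints 12.0

The equivalent machine language program would be a sequence of
processor-specific numeric instructions, which illustrates why
high-level languages are easier to learn and debug.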

 Difference between compiler and interpreter
Programming is the process of designing, writing, testing,
debugging, and maintaining the source code of computer
programs. This code can be written in a variety of computer
programming languages. Some of these languages include
Java, C, and Python. Computer code is a collection of typed
words that the computer can clearly understand.

Compiler | Interpreter
It translates a high-level language program into machine level in a single attempt. | It translates a high-level language program into machine level one instruction at a time.
It finds syntax errors after compiling the whole program. | It finds syntax errors after translating one line of the program at a time.
It is difficult to trace errors and their causes. | It is easy to trace errors and their causes.
The compiling process is faster than interpreting. | The interpreting process is slower than compiling.
It is more efficient than an interpreter. | It is less efficient than a compiler.
It creates object code. | It does not create object code.
Examples: C, C++, Visual Basic, etc. | Examples: BASIC, LISP, etc.
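
The two translation styles can be mimicked inside Python itself with
the built-in compile() and exec() functions; the three-line sample
program here is invented for illustration:

    source = "x = 2\ny = x * 21\nprint(y)"

    # Compiler-like: translate the whole program to bytecode first
    # (a syntax error anywhere is reported now), then run it.
    code_object = compile(source, "<demo>", "exec")
    exec(code_object)  # prints 42

    # Interpreter-like: translate and run one statement at a time;
    # a syntax error in a later line would only surface when reached.
    for line in source.splitlines():
        exec(line)  # prints 42 again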

 Data representation codes

 a. ASCII
 ASCII (American Standard Code for Information Interchange)
is the most common format for text files in computers and on the
Internet. In an ASCII file, each alphabetic, numeric, or special
character is represented with a 7-bit binary number (a string of
seven 0s or 1s). 128 possible characters are defined.

 b. EBCDIC
 EBCDIC is a binary code for alphabetic and numeric characters
that IBM developed for its larger operating systems. It is the
code for text files that is used in IBM's OS/390 operating system
for its S/390 servers and that thousands of corporations use for
their legacy applications and databases. In an EBCDIC file, each
alphabetic or numeric character is represented with an 8-bit
binary number (a string of eight 0's or 1's). 256 possible
characters (letters of the alphabet, numerals, and special
characters) are defined.

 c. Unicode
 Unicode is a universal character encoding standard. It defines
the way individual characters are represented in text files, web
pages, and other types of documents.
 There are several different types of Unicode encodings, though
UTF-8 and UTF-16 are the most common. UTF-8 has become
the standard character encoding used on the Web and is also the
default encoding used by many software programs.

 d. FORTRAN
 FORTRAN (FORmula TRANslation) is a third-generation
(3GL) programming language that was designed for use by
engineers, mathematicians, and other scientific users.
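
A few lines of Python make these encodings concrete; ord() returns a
character's code, and str.encode() produces the byte forms (cp037 is
one common EBCDIC code page):

    print(ord("A"))                  # 65, the ASCII/Unicode code for 'A'
    print(chr(97))                   # 'a', the character with code 97
    print("A".encode("cp037"))       # b'\xc1': 'A' in EBCDIC code page 037
    print("héllo".encode("utf-8"))   # variable-length UTF-8 bytes
    print("héllo".encode("utf-16"))  # UTF-16 bytes, led by a byte-order mark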
 Microsoft word
Mail merge is a feature supported by many word processors
that enables us to generate form letters. To use a mail-merge
system, we first store a set of information, like a list of
names and addresses, in one file. In another file, we write
a letter, substituting special symbols in place of names
and addresses (or whatever other information will come
from the first file). For example, we might write:
Dear NAME:
Our records show that your address is: STREET
CITY, STATE ZIP
If this is incorrect,...
When we execute the merge command, the word processor
automatically generates letters by replacing symbols
(NAME, STREET, CITY, STATE, and ZIP) in the second
file with the appropriate data from the first file.
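
The same substitution can be sketched in a few lines of Python; the
template placeholders mirror the symbols above, and the record values
are invented for illustration:

    template = ("Dear {NAME}:\n"
                "Our records show that your address is: {STREET}\n"
                "{CITY}, {STATE} {ZIP}\n"
                "If this is incorrect, ...")

    records = [
        {"NAME": "A. Reader", "STREET": "12 Elm St",
         "CITY": "Springfield", "STATE": "IL", "ZIP": "62701"},
    ]

    # One letter per record: each symbol is replaced by that
    # record's value, exactly as the merge command does.
    for record in records:
        print(template.format(**record))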

The steps are as follows:
Step 1: Choose a document type and main document
Step 2: Connect to the data file
Step 3: Choose the records in the data file that you want to
use
Step 4: Add fields
Step 5: Match fields
Step 6: Preview the merge
Step 7: Complete the merge

Thus, Microsoft Word is used for various purposes as described above.
 Introduction to spreadsheets
A spreadsheet is an interactive computer application program for
organization and analysis of data in tabular form. Spreadsheets
developed as computerized simulations of paper accounting
worksheets. The program operates on data represented as cells of
an array, organized in rows and columns.

1) Selecting or filling ranges without the screen moving: Just
highlight the initial cells in the range, right-click on the range, and
pick the select or fill feature that you want to run. The screen does
not move when you select or fill a range. Eliminate the wild screen
scrolling you used to do (unless you enjoy it).

2) Perform any math action on a range of cells: Highlight a range
of cells, select the Any Math Action feature, and enter the action
(/ 100 * 1000 ...) you want to take (see the sketch after this list).

3) Insert sticky notes anywhere in your worksheets for quick
reminders: Simply select Insert Sticky Note from the menus. You
have a choice of colors and features. Even better, the sticky notes
do not print unless you want them to. If sticky notes are useful for
your paper work you can imagine how useful they are in your
complex spreadsheets.

4) Adding any feature you want to the Quick Access Toolbar: Just
right-click on any button on the Assistants tab and select Add to
Quick Access Toolbar.

5) The Favorite Directories feature and the Bookmark feature let
you quickly access your favorite directories. We know that is
two features but they are companion features. These features are
especially useful if you have files in different directories such as on
your PC and on a network.
6) The Hide and Un-Hide Sheets feature is handy when you are
working on two widely separated sheets.
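
As referenced in item 2, here is a minimal Python sketch of applying
a math action such as / 100 * 1000 to every cell of a rectangular
range; the cell values are invented:

    range_of_cells = [[12.5, 40.0],
                      [7.25, 99.0]]

    # Apply "/ 100 * 1000" to every cell in the range.
    result = [[cell / 100 * 1000 for cell in row]
              for row in range_of_cells]
    print(result)  # [[125.0, 400.0], [72.5, 990.0]]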