Definition - What does Central Processing Unit (CPU) mean?

The central processing unit (CPU) is the unit which performs most of the processing
inside a computer. To control instructions and data flow to and from other parts of
the computer, the CPU relies heavily on a chipset, which is a group of microchips
located on the motherboard.

The CPU has two components:

Control Unit: extracts instructions from memory, then decodes and executes them
Arithmetic Logic Unit (ALU): handles arithmetic and logical operations

To function properly, the CPU relies on the system clock, memory, secondary
storage, and data and address buses.
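The control unit's cycle of extracting, decoding, and executing instructions can be sketched in a few lines of Python. The three-instruction machine below is invented purely for illustration; it is not any real instruction set, but it shows how a program counter, an accumulator register, and memory interact.

```python
# A minimal sketch of the control unit's fetch-decode-execute cycle.
# The opcodes and register names here are invented for illustration.

def run(program, memory):
    """Execute a tiny program: each instruction is an (opcode, operand) pair."""
    acc = 0        # accumulator register inside the CPU
    pc = 0         # program counter: index of the next instruction
    while pc < len(program):
        opcode, operand = program[pc]      # fetch and decode
        pc += 1
        if opcode == "LOAD":               # execute: move data into the CPU
            acc = memory[operand]
        elif opcode == "ADD":              # execute: ALU arithmetic
            acc += memory[operand]
        elif opcode == "STORE":            # execute: write the result back
            memory[operand] = acc
    return memory

mem = {0: 2, 1: 3, 2: 0}
run([("LOAD", 0), ("ADD", 1), ("STORE", 2)], mem)
print(mem[2])  # 5
```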

This term is also known as a central processor, microprocessor or chip.

Techopedia explains Central Processing Unit (CPU)

The CPU is the heart and brain of a computer. It receives data input, executes
instructions, and processes information. It communicates with input/output (I/O)
devices, which send and receive data to and from the CPU. Additionally, the CPU has
an internal bus for communication with the internal cache memory, called the
backside bus. The main bus for data transfer to and from the CPU, memory, chipset,
and AGP socket is called the front-side bus.

The CPU contains internal memory units, which are called registers. These registers
contain data, instructions, counters and addresses used in the ALU's information
processing.

Some computers utilize two or more processors. These consist of separate physical
CPUs located side by side on the same board or on separate boards. Each CPU has an
independent interface, separate cache, and individual paths to the system front-
side bus. Multiple processors are ideal for intensive parallel tasks requiring
multitasking. Multicore CPUs are also common, in which a single chip contains
multiple CPUs.

///////////////////////////////////////////////////////////////////////////////////

Central processing unit (CPU), principal part of any digital computer system,
generally composed of the main memory, control unit, and arithmetic-logic unit. It
constitutes the physical heart of the entire computer system; to it is linked
various peripheral equipment, including input/output devices and auxiliary storage
units. In modern computers, the CPU is contained on an integrated circuit chip
called a microprocessor.
The control unit of the central processing unit regulates and integrates the
operations of the computer. It selects and retrieves instructions from the main
memory in proper sequence and interprets them so as to activate the other
functional elements of the system at the appropriate moment to perform their
respective operations. All input data are transferred via the main memory to the
arithmetic-logic unit for processing, which involves the four basic arithmetic
functions (i.e., addition, subtraction, multiplication, and division) and certain
logic operations such as the comparing of data and the selection of the desired
problem-solving procedure or a viable alternative based on predetermined decision
criteria.
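The arithmetic-logic unit described above can be pictured as a pure function: given an operation selector and two operands, it returns either an arithmetic result or a logical comparison. The opcode names in this sketch are invented for illustration.

```python
# A sketch of an arithmetic-logic unit: one of the four basic arithmetic
# functions, or a comparison, is selected by an opcode (names invented here).

def alu(op, a, b):
    if op == "ADD":
        return a + b
    if op == "SUB":
        return a - b
    if op == "MUL":
        return a * b
    if op == "DIV":
        return a / b
    if op == "CMP":                      # logic: compare two data items
        return (a > b) - (a < b)         # -1 if a < b, 0 if equal, 1 if a > b
    raise ValueError(f"unknown opcode {op!r}")

print(alu("MUL", 6, 7))   # 42
print(alu("CMP", 3, 9))   # -1
```

The comparison result is exactly the kind of "predetermined decision criterion" a control unit can branch on.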

Analog computer, any of a class of devices in which continuously variable physical
quantities such as electrical potential, fluid pressure, or mechanical motion are
represented in a way analogous to the corresponding quantities in the problem to be
solved. The analog system is set up according to initial conditions and then
allowed to change freely. Answers to the problem are obtained by measuring the
variables in the analog model. See also digital computer.
The earliest analog computers were special-purpose machines, as for example the
tide predictor developed in 1873 by William Thomson (later known as Lord Kelvin).
Along the same lines, A.A. Michelson and S.W. Stratton built in 1898 a harmonic
analyzer (q.v.) having 80 components. Each of these was capable of generating a
sinusoidal motion, which could be multiplied by constant factors by adjustment of a
fulcrum on levers. The components were added by means of springs to produce a
resultant. Another milestone in the development of the modern analog computer was
the invention of the so-called differential analyzer in the early 1930s by Vannevar
Bush, an American electrical engineer, and his colleagues. This machine, which used
mechanical integrators (gears of variable speed) to solve differential equations,
was the first practical and reliable device of its kind.
Most present-day electronic analog computers operate by manipulating potential
differences (voltages). Their basic component is an operational amplifier, a device
whose output current is proportional to its input potential difference. By causing
this output current to flow through appropriate components, further potential
differences are obtained, and a wide variety of mathematical operations, including
inversion, summation, differentiation, and integration, can be carried out on them.
A typical electronic analog computer consists of numerous types of amplifiers,
which can be connected so as to build up a mathematical expression, sometimes of
great complexity and with a multitude of variables.
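The behaviour of one such building block, the integrating amplifier, can be imitated digitally: the output is the running time integral of the input voltage. The sketch below approximates that continuous accumulation with simple Euler steps; feeding in a cosine yields (approximately) a sine, since the integral of cos(t) is sin(t).

```python
# A digital sketch of an analog integrating amplifier: the output signal is
# the accumulated time integral of the input voltage, here via Euler steps.
import math

def integrate(signal, dt):
    """Accumulate the input over time, like an integrating amplifier."""
    out, total = [], 0.0
    for v in signal:
        total += v * dt
        out.append(total)
    return out

dt = 0.001
t = [i * dt for i in range(1000)]            # one simulated "second"
cos_in = [math.cos(x) for x in t]            # input voltage: cos(t)
result = integrate(cos_in, dt)               # output approximates sin(t)
print(abs(result[-1] - math.sin(1.0)) < 0.01)  # True
```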
Analog computers are especially well suited to simulating dynamic systems; such
simulations may be conducted in real time or at greatly accelerated rates, thereby
allowing experimentation by repeated runs with altered variables. They have been
widely used in simulations of aircraft, nuclear-power plants, and industrial
chemical processes. Other major uses include analysis of hydraulic networks (e.g.,
flow of liquids through a sewer system) and electronics networks (e.g., performance
of long-distance circuits).

Multiprocessing, in computing, a mode of operation in which two or more processors
in a computer simultaneously process two or more different portions of the same
program (set of instructions). Multiprocessing is typically carried out by two or
more microprocessors, each of which is in effect a central processing unit (CPU) on
a single tiny chip. Supercomputers typically combine thousands of such
microprocessors to interpret and execute instructions.

The primary advantage of a multiprocessor computer is speed, and thus the ability
to manage larger amounts of information. Because each processor in such a system is
assigned to perform a specific function, it can perform its task, pass the
instruction set on to the next processor, and begin working on a new set of
instructions. For example, different processors may be used to manage memory
storage, data communications, or arithmetic functions. Or a larger processor might
utilize “slave” processors to conduct miscellaneous housekeeping duties, such as
memory management. Multiprocessor systems first appeared in large computers known
as mainframes, before their costs declined enough to warrant inclusion in personal
computers (PCs).

Personal computers had long relied on increasing clock speeds, measured in
megahertz (MHz) or gigahertz (GHz), which correlate to the number of computations
the CPU performs per second, in order to handle ever more complex tasks. But as
gains in clock speed became difficult to sustain, in part because of overheating in
the microprocessor circuitry, another approach developed in which specialized
processors were used for tasks such as video display. These video processors
typically come on modular units known as video cards, or graphic accelerator cards.
The best cards, which are needed to play the most graphics-intensive electronic
games on personal computers, often cost more than a bargain PC. The commercial
demands for ever better cards to run ever more realistic games, on PCs and video
game systems, led IBM to develop a multiprocessor microchip, known as the Cell
Broadband Engine, for use in the Sony Computer Entertainment PlayStation 3 and a
new supercomputer that included thousands of the microchips.
It must be noted, however, that simply adding more processors does not guarantee
significant gains in computing power; the problem of programming them remains. While
programmers and computer programming languages have developed some proficiency in
allocating executions among a small number of processors, parsing instructions
beyond two to eight processors is impracticable for all but the most repetitive
tasks. (Fortunately, many of the typical supercomputer scientific applications
involve applying exactly the same formula or computation to a vast array of data,
which is a difficult but tractable problem.)

Embedded processor, a class of computer, or computer chip, embedded in various
machines. These are small computers that use simple microprocessors to control
electrical and mechanical functions. They generally do not have to do elaborate
computations or be extremely fast, nor do they have to have great input/output
capability, and so they can be inexpensive. Embedded processors help to control
aircraft and industrial automation, and they are common in automobiles and in both
large and small household appliances. One particular type, the digital signal
processor (DSP), has become as prevalent as the microprocessor. DSPs are used in
wireless telephones, digital telephone and cable modems, and some stereo equipment.

Zuse computer, any of a series of computers designed and built in Germany during
the 1930s and ’40s by the German engineer Konrad Zuse. He had been thinking about
designing a better calculating machine, but he was advised by a calculator
manufacturer in 1937 that the field was a dead end and that every computing problem
had already been solved. Zuse had something else in mind, though.

For one thing, Zuse worked in binary from the beginning. All of his prototype
machines, built in 1936, used binary representation in order to simplify
construction. This had the added advantage of making the connection with logic
clearer, and Zuse worked out the details of how the operations of logic (e.g., AND,
OR, and NOT) could be mapped onto the design of the computer’s circuits. (English
mathematician George Boole had shown the connection between logic and mathematics
in the mid-19th century, developing an algebra of logic now known as Boolean
algebra.) Zuse also spent more time than his predecessors and contemporaries
developing software for his computer, the language in which it was to be
programmed. Although all his early prewar machines were really calculators—not
computers—his Z3, completed in December 1941 (and destroyed on April 6, 1945,
during an Allied air raid on Berlin), was the first program-controlled processor.
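The mapping Zuse exploited, from the operations of logic onto binary digits, is easy to demonstrate. The sketch below expresses AND, OR, and NOT over the bits 0 and 1 and verifies one identity from Boole's algebra of logic, De Morgan's law, for every pair of bits.

```python
# Boolean algebra operations mapped onto the binary digits 0 and 1, the
# connection Zuse used when designing his computer's circuits.

def AND(a, b): return a & b
def OR(a, b):  return a | b
def NOT(a):    return 1 - a

# De Morgan's law, one identity of Boolean algebra:
# NOT(a AND b) == NOT(a) OR NOT(b) for every pair of bits.
for a in (0, 1):
    for b in (0, 1):
        assert NOT(AND(a, b)) == OR(NOT(a), NOT(b))
print("De Morgan's law holds for all four bit pairs")
```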

Because all Zuse’s work was done in relative isolation, he knew little about work
on computers in the United States and England, and, when the war began, his
isolation became complete.
Zuse began construction of the Z4 in 1943 with funding from the German Air
Ministry. Like his Z3, the Z4 used electromechanical relays, in part because of the
difficulty in acquiring the roughly 2,000 necessary vacuum tubes in wartime
Germany. The Z4 was evacuated from Berlin in early 1945, and it eventually wound up
in Hinterstein, a small village in the Bavarian Alps, where it remained until Zuse
brought it to the Federal Technical Institute in Zürich, Switz., for refurbishing
in 1950. Although unable to continue with hardware development, Zuse made a number
of advances in software design.
Zuse’s use of floating-point representation for numbers—the significant digits,
known as the mantissa, are stored separately from a pointer to the decimal point,
known as the exponent, allowing a very large range of numbers to be handled—was far
ahead of its time. In addition, Zuse developed a rich set of instructions, handled
infinite values correctly, and included a “no-op”—that is, an instruction that did
nothing. Only significant experience in programming would show the need for
something so apparently useless.
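The floating-point idea described above, significant digits stored separately from a pointer to the radix point, survives in every modern machine. Python's standard math.frexp performs exactly this decomposition for binary floats, splitting a number into a mantissa and a power-of-two exponent.

```python
# A sketch of floating-point representation: the significant digits
# (mantissa) are stored separately from the scale (exponent).
import math

mantissa, exponent = math.frexp(6.25)   # 6.25 = 0.78125 * 2**3
print(mantissa, exponent)               # 0.78125 3
assert mantissa * 2 ** exponent == 6.25
```

Because only the exponent changes with the number's magnitude, a very large range of values can be handled, which is the advantage Zuse anticipated.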
The Z4’s program was punched on used movie film and was separate from the
mechanical memory for data (in other words, there was no stored program). The
machine was relatively reliable (it normally ran all night unattended), but it had
no decision-making ability. Addition took 0.5 to 1.25 seconds, multiplication 3.5
seconds.

Zuse also developed the first real computer programming language, Plankalkül (“Plan
Calculus”), in 1944–45. Zuse’s language allowed for the creation of procedures
(also called routines or subroutines; stored chunks of code that could be invoked
repeatedly to perform routine operations such as taking a square root) and
structured data (such as a record in a database, with a mixture of alphabetic and
numeric data representing, for instance, name, address, and birth date). In
addition, it provided conditional statements that could modify program execution,
as well as repeat, or loop, statements that would cause a marked block of
statements or a subroutine to be repeated a specified number of times or for as
long as some condition held.
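None of Plankalkül's actual syntax is in mainstream use, so the sketch below illustrates the same three ideas in Python rather than in Zuse's notation: a reusable procedure for a routine operation (taking a square root, his own example), a conditional statement, and a loop repeated for as long as some condition holds.

```python
# The three Plankalkül ideas named above, expressed in modern syntax:
# a procedure, a conditional, and a condition-controlled loop.

def square_root(x, tolerance=1e-9):
    """A subroutine that can be invoked repeatedly for a routine operation."""
    if x < 0:                                    # conditional statement
        raise ValueError("negative input")
    guess = x or 1.0
    while abs(guess * guess - x) > tolerance:    # loop while condition holds
        guess = (guess + x / guess) / 2          # Newton's iteration
    return guess

print(round(square_root(2.0), 6))  # 1.414214
```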

Zuse knew that computers could do more than arithmetic, but he was aware of the
propensity of anyone introduced to them to view them as nothing more than
calculators. So he took pains to demonstrate nonnumeric solutions with Plankalkül.
He wrote programs to check the syntactical correctness of Boolean expressions (an
application in logic and text handling) and even to check chess moves.

Unlike flowcharts, Zuse's language was not an intermediate notation intended for
pencil-and-paper translation by mathematicians. It was deliberately intended for machine
translation (that is, into machine language), and Zuse did some work toward
implementing a translator for Plankalkül. He did not get very far, however; he had
to disassemble his machine near the end of the war and was not able to put it back
together for several years. Unfortunately, his computer language and his work,
which were roughly a dozen years ahead of their time, were not generally known
outside Germany.

///////////////////////////////////////

Digital computer, any of a class of devices capable of solving problems by
processing information in discrete form. It operates on data, including magnitudes,
letters, and symbols, that are expressed in binary code—i.e., using only the two
digits 0 and 1. By counting, comparing, and manipulating these digits or their
combinations according to a set of instructions held in its memory, a digital
computer can perform such tasks as to control industrial processes and regulate the
operations of machines; analyze and organize vast amounts of business data; and
simulate the behaviour of dynamic systems (e.g., global weather patterns and
chemical reactions) in scientific research.

Functional elements

A typical digital computer system has four basic functional elements: (1) input-
output equipment, (2) main memory, (3) control unit, and (4) arithmetic-logic unit.
Any of a number of devices is used to enter data and program instructions into a
computer and to gain access to the results of the processing operation. Common
input devices include keyboards and optical scanners; output devices include
printers and monitors. The information received by a computer from its input unit
is stored in the main memory or, if not for immediate use, in an auxiliary storage
device. The control unit selects and calls up instructions from the memory in
appropriate sequence and relays the proper commands to the appropriate unit. It
also synchronizes the varied operating speeds of the input and output devices to
that of the arithmetic-logic unit (ALU) so as to ensure the proper movement of data
through the entire computer system. The ALU performs the arithmetic and logic
algorithms selected to process the incoming data at extremely high speeds—in many
cases in nanoseconds (billionths of a second). The main memory, control unit, and
ALU together make up the central processing unit (CPU) of most digital computer
systems, while the input-output devices and auxiliary storage units constitute
peripheral equipment.

Development of the digital computer

Blaise Pascal of France and Gottfried Wilhelm Leibniz of Germany invented
mechanical digital calculating machines during the 17th century. The English
inventor Charles Babbage, however, is generally credited with having conceived the
first automatic digital computer. During the 1830s Babbage devised his so-called
Analytical Engine, a mechanical device designed to combine basic arithmetic
operations with decisions based on its own computations. Babbage’s plans embodied
most of the fundamental elements of the modern digital computer. For example, they
called for sequential control—i.e., program control that included branching and
looping—and both arithmetic and storage units with automatic printout. Babbage’s
device, however, was never completed and was forgotten until his writings were
rediscovered over a century later.
