Early computers were meant only for calculation. Simple manual instruments like the abacus have aided people in doing calculations since ancient times. Early in the Industrial Revolution, some mechanical devices were built to automate long, tedious tasks, such as guiding patterns for looms. More sophisticated electrical machines performed specialized analog computations in the early 20th century. The first digital electronic calculating machines were developed during World War II. The first semiconductor transistors in the late 1940s were followed by the silicon-based MOSFET (MOS transistor) and monolithic integrated circuit chip technologies in the late 1950s, leading to the microprocessor and microcomputer revolution in the 1970s. The speed, power, and versatility of computers have increased dramatically ever since then, with transistor counts rising at a rapid pace (as predicted by Moore's law), leading to the Digital Revolution during the late 20th and early 21st centuries.

Conventionally, a modern computer consists of at least one processing element, typically a
central processing unit (CPU) in the form of a microprocessor, and some form of computer
memory, typically semiconductor memory chips. The processing element carries out arithmetic and logical operations, and a sequencing and control unit can change the order of operations in response to stored information. Peripheral devices include input devices (keyboards, mice, joysticks, etc.), output devices (monitors, printers, etc.), and input/output devices that perform both functions (such as the 2000s-era touchscreen). Peripheral devices allow information to be retrieved from an external source, and they enable the results of operations to be saved and retrieved.
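The interplay described above, a processing element performing arithmetic and logical operations while a control unit reorders execution based on stored information, can be sketched in miniature. The instruction set below (ADD, AND, JNZ) is hypothetical and chosen purely for illustration; it is not any real machine's architecture.

```python
# Minimal sketch of a stored-program machine. The processing element
# handles arithmetic (ADD) and logic (AND); the control unit can alter
# the order of operations (JNZ jumps) based on stored information.
# Opcodes and program format are hypothetical, for illustration only.

def run(program, acc=0):
    """Execute a list of (opcode, operand) pairs; return the accumulator."""
    pc = 0  # program counter: the control unit's current position
    while pc < len(program):
        op, arg = program[pc]
        if op == "ADD":        # arithmetic operation
            acc += arg
        elif op == "AND":      # logical (bitwise) operation
            acc &= arg
        elif op == "JNZ":      # control: jump to `arg` if accumulator non-zero
            if acc != 0:
                pc = arg
                continue
        pc += 1
    return acc

# Count down from 3: the JNZ loop repeats the subtraction until acc is 0.
program = [("ADD", 3), ("ADD", -1), ("JNZ", 1)]
print(run(program))  # -> 0
```

The point of the sketch is the last instruction: the jump makes execution order depend on data held in memory, which is what separates a stored-program computer from a fixed calculator.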