
NAME : HARIHARAN . T
ROLL NO : UCS 11209
SUBJECT : COMPUTER SCIENCE
SEMESTER : I

TOPIC : Evolution of computer & Components of computer


Abacus

The abacus, also called a counting frame, is a calculating tool used primarily in parts of Asia for performing arithmetic processes. Today, abaci are often constructed as a bamboo frame with beads sliding on wires, but originally they were beans or stones moved in grooves in sand or on tablets of wood, stone, or metal. The abacus was in use centuries before the adoption of the written modern numeral system and is still widely used by merchants, traders and clerks in Asia, Africa, and elsewhere. The user of an abacus is called an abacist.

Indian abacus
First-century sources, such as the Abhidharmakosa, describe the knowledge and use of the abacus in India. Around the 5th century, Indian clerks were already finding new ways of recording the contents of the abacus. Hindu texts used the term shunya (zero) to indicate the empty column on the abacus.

Mesopotamian abacus
The period 2700–2300 BC saw the first appearance of the Sumerian abacus, a table of successive columns which delimited the successive orders of magnitude of their sexagesimal number system. Some scholars point to a character from the Babylonian cuneiform which may have been derived from a representation of the abacus. It is the belief of Carruccio (and other Old Babylonian scholars) that Old Babylonians "may have used the abacus for the operations of addition and subtraction; however, this primitive device proved difficult to use for more complex calculations".


Napier's bones

Napier's bones is an abacus created by John Napier for calculation of products and quotients of numbers, based on Arab mathematics and the lattice multiplication used by Matrakci Nasuh in the Umdet-ul Hisab and by Fibonacci in the Liber Abaci. The method is also called Rabdology (from Greek ῥάβδoς [r(h)abdos], "rod" and -λογία [logia], "study"). Napier published his version of the rods in a work printed in Edinburgh, Scotland, at the end of 1617 entitled Rabdologiæ. Using the multiplication tables embedded in the rods, multiplication can be reduced to addition operations and division to subtraction. More advanced use of the rods can even extract square roots. Note that Napier's bones are not the same as logarithms, with which Napier's name is also associated.

The abacus consists of a board with a rim; the user places Napier's rods in the rim to conduct multiplication or division. The board's left edge is divided into 9 squares, holding the numbers 1 to 9. Napier's rods consist of strips of wood, metal or heavy cardboard. Napier's bones are three-dimensional, square in cross section, with four different rods engraved on each one. A set of such bones might be enclosed in a convenient carrying case. A rod's surface comprises 9 squares, and each square, except for the top one, comprises two halves divided by a diagonal line. The first square of each rod holds a single digit, and the other squares hold this number's double, triple, quadruple, quintuple, and so on until the last square contains nine times the number in the top square. The digits of each product are written one to each side of the diagonal; numbers less than 10 occupy the lower triangle, with a zero in the top half. A set consists of 10 rods corresponding to digits 0 to 9. The rod for 0, although it may look unnecessary, is still needed for multipliers or multiplicands having 0 in them.
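The rods encode what is essentially lattice multiplication: each rod column lists the multiples of one digit, and the user adds along the diagonals with carries. The Python sketch below shows only that digit-by-digit bookkeeping, not the physical rods; the names rod() and bones_multiply() are illustrative, not historical.

```python
# A rough sketch of the arithmetic that Napier's rods encode: each rod lists
# the multiples 1x..9x of a single digit, and multiplication by a one-digit
# multiplier reduces to reading one row of rods and adding with carries.

def rod(digit):
    """Return the 'rod' for a digit: its multiples 1x..9x as (tens, units) pairs."""
    return [divmod(digit * k, 10) for k in range(1, 10)]

def bones_multiply(number, multiplier):
    """Multiply a multi-digit number by a single digit using the rod tables."""
    rods = [rod(int(d)) for d in str(number)]      # one rod per digit
    row = [r[multiplier - 1] for r in rods]        # the row for this multiplier
    result, carry = [], 0
    for tens, units in reversed(row):              # add along the diagonals, right to left
        total = units + carry
        result.append(total % 10)
        carry = tens + total // 10
    if carry:
        result.append(carry)
    return int("".join(str(d) for d in reversed(result)))

print(bones_multiply(425, 6))   # 2550, as read off the rods for 4, 2 and 5
```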

Slide rule

The slide rule, also known colloquially as a slipstick, is a mechanical analog computer. The slide rule is used primarily for multiplication and division, and also for functions such as roots, logarithms and trigonometry, but is not normally used for addition or subtraction. Slide rules come in a diverse range of styles and generally appear in a linear or circular form with a standardized set of markings (scales) essential to performing mathematical computations. Slide rules manufactured for specialized fields such as aviation or finance typically feature additional scales that aid in calculations common to that field. William Oughtred and others developed the slide rule in the 17th century based on the emerging work on logarithms by John Napier. Before the advent of the pocket calculator, it was the most commonly used calculation tool in science and engineering. The use of slide rules continued to grow through the 1950s and 1960s even as digital computing devices were being gradually introduced, but around 1974 the electronic scientific calculator made it largely obsolete and most suppliers left the business.
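The scales work because they are marked logarithmically: sliding one scale against the other adds lengths, which adds the logarithms of the numbers and therefore multiplies the numbers. A minimal Python sketch of that principle follows; it is an idealized slide rule with unlimited precision, not a model of any real scale layout (real rules give roughly three significant figures).

```python
import math

# An idealized "slide rule": aligning the C and D scales adds lengths
# proportional to log10 of the operands, so the combined reading is the product.

def slide_rule_multiply(a, b):
    return 10 ** (math.log10(a) + math.log10(b))   # add lengths -> multiply

def slide_rule_divide(a, b):
    return 10 ** (math.log10(a) - math.log10(b))   # subtract lengths -> divide

print(round(slide_rule_multiply(2.5, 3.2), 3))  # 8.0
print(round(slide_rule_divide(7.5, 2.5), 3))    # 3.0
```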

Pascal's calculator

Blaise Pascal invented the mechanical calculator in 1642. First called Arithmetic Machine, Pascal's Calculator and later Pascaline, it could add and subtract directly and multiply and divide by repetition. He conceived it while trying to help his father, who had been assigned the task of reorganizing the tax revenues of the French province of Haute-Normandie. Pascal went through 50 prototypes before presenting his first machine to the public in 1645. He dedicated it to Pierre Séguier, the chancellor of France at the time. He built around twenty more machines during the next decade, often improving on his original design. In 1649 a royal privilege, signed by Louis XIV of France, gave him the exclusivity of the design and manufacturing of calculating machines in France. Nine machines have survived the centuries, most of them being on display in European museums.

Its introduction launched the development of mechanical calculators in Europe first and then all over the world, a development which culminated, three centuries later, in the invention of the microprocessor developed for a Busicom calculator in 1971. The mechanical calculator industry owes a lot of its key machines and inventions to the pascaline. First, Gottfried Leibniz invented his Leibniz wheels after 1671 while trying to add an automatic multiplication and division feature to the pascaline; then Thomas de Colmar drew his inspiration from Pascal and Leibniz when he designed his arithmometer in 1820; and finally Dorr E. Felt substituted the input wheels of the pascaline with columns of keys to invent his comptometer around 1887. The pascaline was also constantly improved upon, especially with the machines of Dr. Roth around 1840, and then with some portable machines until the creation of the first electronic calculators.
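Since the Pascaline could only add and subtract directly, multiplication and division were carried out "by repetition", as the text notes. The short Python sketch below mirrors the operator's procedure, not the machine's gearing, and assumes non-negative integers.

```python
# Multiplication and division "by repetition", the way a Pascaline operator
# would have worked: repeated addition for products, repeated subtraction
# for quotients. Illustrative only.

def multiply_by_repetition(a, b):
    total = 0
    for _ in range(b):        # add a to the accumulator b times
        total += a
    return total

def divide_by_repetition(a, b):
    quotient = 0
    while a >= b:             # subtract b until the remainder is smaller than b
        a -= b
        quotient += 1
    return quotient, a        # (quotient, remainder)

print(multiply_by_repetition(7, 6))   # 42
print(divide_by_repetition(43, 6))    # (7, 1)
```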

Gottfried Leibniz

Specifications
Can multiply, divide, add and subtract. Mechanical device made of copper and steel. Carriage is performed with a stepped wheel, a mechanism still in use today. Wheels are placed at right angles and can be displaced by a special stepping mechanism.

Chronology
Contrary to Pascal, Leibniz (1646-1716) successfully introduced a calculator onto the market. It was designed in 1673 but took until 1694 to complete. The calculator can add, subtract, multiply and divide. The speed of calculation for multiplication or division was acceptable. But like the Pascaline, this calculator required that the operator using the device understand how to turn the wheels and know the way of performing calculations with the calculator.

Punched card

A punched card, punch card, IBM card, or Hollerith card is a piece of stiff paper that contains digital information represented by the presence or absence of holes in predefined positions. Now an obsolete recording medium, punched cards were widely used throughout the 19th century for controlling textile looms and in the late 19th and early 20th century for operating fairground organs and related instruments. They were used through the 20th century in unit record machines for input, processing, and data storage. Early digital computers used punched cards, often prepared using keypunch machines, as the primary medium for input of both computer programs and data. Some voting machines use punched cards.

The early applications of punched cards all used specifically designed card layouts. It wasn't until around 1928 that punched cards and machines were made "general purpose". The rectangular, round, or oval bits of paper punched out are called chad (recently, chads) or chips (in IBM usage). Multi-character data, such as words or large numbers, were stored in adjacent card columns known as fields. A group of cards is called a deck. One upper corner of each card was usually cut so that cards not oriented correctly, or cards with different corner cuts, could be easily identified. Cards were commonly printed so that the row and column position of a punch could be identified. For some applications printing might have included fields, named and marked by vertical lines, logos, and more.
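As a rough illustration of "digital information represented by the presence or absence of holes", the sketch below encodes a numeric field as punches in a 12-row card, punching row d for digit d. This is a deliberately simplified toy; real card codes (such as the Hollerith zone-and-digit codes) were more involved, and the function names are invented for this example.

```python
# Simplified punched-card encoding: one column per digit of a numeric field,
# one punch per column in a 12-row card, with row d punched for digit d.
# Illustrative toy only, not the actual Hollerith character code.

ROWS = 12

def punch_field(value, width):
    """Encode a numeric field of `width` columns as a set of (row, column) punches."""
    digits = str(value).rjust(width, "0")
    return {(int(d), col) for col, d in enumerate(digits)}

def read_field(punches, width):
    """Recover the number from the punch positions."""
    digits = ["0"] * width
    for row, col in punches:
        digits[col] = str(row)
    return int("".join(digits))

card = punch_field(1890, 4)        # a 4-column field holding the year 1890
print(sorted(card))                # [(0, 3), (1, 0), (8, 1), (9, 2)]
print(read_field(card, 4))         # 1890
```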

Analytical Engine

Babbage's first attempt at a mechanical computing device, the difference engine, was a special-purpose calculator designed to tabulate logarithms and trigonometric functions by evaluating finite differences to create approximating polynomials. Construction of this machine was never completed; Babbage had conflicts with his chief engineer, Joseph Clement, and ultimately the British government withdrew its funding for the project. During this project he realized that a much more general design, the Analytical Engine, was possible.

The input (programs and data) was to be provided to the machine via punched cards, a method being used at the time to direct mechanical looms such as the Jacquard loom. For output, the machine would have a printer, a curve plotter and a bell. The machine would also be able to punch numbers onto cards to be read in later. It employed ordinary base-10 fixed-point arithmetic. There was to be a store (that is, a memory) capable of holding 1,000 numbers of 50 decimal digits each (ca. 20.7 kB). An arithmetical unit (the "mill") would be able to perform all four arithmetic operations, plus comparisons and optionally square roots. Initially it was conceived as a difference engine curved back upon itself, in a generally circular layout, with the long store exiting off to one side. Like the central processing unit (CPU) in a modern computer, the mill would rely upon its own internal procedures, to be stored in the form of pegs inserted into rotating drums called "barrels", to carry out some of the more complex instructions the user's program might specify.

The programming language to be employed by users was akin to modern day assembly languages. Loops and conditional branching were possible, and so the language as conceived would have been Turing-complete long before Alan Turing's concept. Three different types of punch cards were used: one for arithmetical operations, one for numerical constants, and one for load and store operations, transferring numbers from the store to the arithmetical unit or back. There were three separate readers for the three types of cards.

In 1842, the Italian mathematician Luigi Menabrea, whom Babbage had met while travelling in Italy, wrote a description of the engine in French. In 1843, the description was translated into English and extensively annotated by Ada Byron, Countess of Lovelace, who had become interested in the engine ten years earlier. In recognition of her additions to Menabrea's paper, which included a way to calculate Bernoulli numbers using the machine, she has been described as the first computer programmer. The modern computer programming language Ada is named in her honour.
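The difference engine's method of "evaluating finite differences to create approximating polynomials" can be shown in a few lines: once the initial differences of a polynomial are set up, every further table entry is produced by additions alone. The Python sketch below illustrates the method, not Babbage's mechanism.

```python
# Tabulating a polynomial by the method of finite differences, the principle
# behind Babbage's difference engine: after the first few values, each new
# entry is produced purely by additions, with no multiplication required.

def difference_table(f, start, order):
    """Initial value plus the first `order` forward differences of f at `start`."""
    values = [f(start + i) for i in range(order + 1)]
    diffs = [values[0]]
    while len(values) > 1:
        values = [b - a for a, b in zip(values, values[1:])]
        diffs.append(values[0])
    return diffs

def tabulate(f, start, order, count):
    """Generate `count` values of f using only additions on the difference column."""
    col = difference_table(f, start, order)
    out = []
    for _ in range(count):
        out.append(col[0])
        for i in range(order):                 # cascade of additions
            col[i] += col[i + 1]
    return out

poly = lambda x: 2 * x**2 + 3 * x + 1          # a sample second-order polynomial
print(tabulate(poly, 0, 2, 6))                 # [1, 6, 15, 28, 45, 66]
```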

gender.5 hours. race. by age category.5 hours. The first two contestants captured the data in 144. etc. The third contestant. the operator positioned the punching stylus over the desired hole in a punch card template. Pressing the stylus into the template created a punched hole in the paper card that was read by the Hollerith tabulator's card reader.e. the agency held a competition in 1888 to find a more efficient method to process and tabulate data. the contestants had to prove that their designs could prepare data for tabulation (i. completed the data capture process in 72. The manuallyoperated card reader consisted of two hinged plates operated by a lever (similar to a 9|Page . Modified versions of his technology would continue to be used at the Census Bureau until replaced by computers in the 1950s.375 inches and contained 12 rows of 20 columns. Hollerith astounded Census Bureau officials by completing the task in just 5.).Three contestants accepted the Census Bureau's challenge. The punch cards measured 3. Each hole in the template corresponded to a specific demographic category.5 hours and 100. MO. As a result. a former Census Bureau employee named Herman Hollerith.Next.5 hours and 55. Whoever captured and processed the data fastest would win a contract for the 1890 census. Pantograph To begin tabulating data. Card Reader Each Hollerith tabulator was equiped with a card reading station.) Each position in a row and column corresponded to a specific data entry on the census schedule. Contestants were asked to process 1880 census data from four areas in St Louis. Two contestants required 44.Census Bureau clerks using pantographs could prepare approximately 500 cards per day. To operate the mechanism. (Cards used in later censuses had additional columns to collect more data. census information had to be transferred from the census schedules to paper punch cards using a pantograph.5 hours!Herman Hollerith's impressive results earned him the contract to process and tabulate 1890 census data. the Census Bureau was collecting more data than it could tabulate..Herman Hollerith 1888 Competition Following the 1880 census.25 by 7. Components of the Hollerith Tabulator Herman Hollerith's tabulator consisted of electrically-operated components that captured and processed census data by "reading" holes on paper punch cards.5 hours.

placed the punch card in the designated sorter drawer. the sorter specified which drawer the operator should place the card. through the bottom plate. The electrical impulses received as the reader's pins passed through the card into the mercury advanced the hands on the dials corresponding to the data contained on the punch card (i. Hollerith Tabulator Dials The 1890 Hollerith tabulators consisted of 40 data-recording dials. Upon completion of the electrical cicuit (signaled by the ringing of a bell). and reset the dials. etc). An experienced tabulator clerk could process 80 punch cards per minute. citizenship. Each dial represented a different data item collected during the census. the operator recorded the data on the dials. the clerk transcibed the data indicated by the dial hands. Upon closing the plates. and positioned a new card to repeat the process. After registering the punch card data on the dials. When the bell signalled the card had been read. The clerk opened the reader. spring-loaded metal pins in the upper plate passed through the punched data holes in the cards. opened the card reader. The completed circuit energized the magnetic dials on the Hollerith tabulator and advanced the counting hands. removed the punch cards.. age. Sorting Table A sorting table was positioned next to each tabulator.waffle iron). responses to inquiries about race. Pins that passed through the punch card completed an electrical circuit when contacting the mercury below. and into wells of mercury beneath. gender. reset the dials. Clerks opened the reader and positioned a punched card between the plates.e. 10 | P a g e .
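In modern terms the tabulator was a bank of counters driven by the punch positions on each card. The sketch below imitates that flow in Python: each card is a set of punched categories, and "reading" a card advances the matching dials. Names like dials and read_card are illustrative, not Hollerith's terminology.

```python
# A toy imitation of the Hollerith tabulation workflow: each punched card is a
# set of category punches, and reading a card advances one counting dial per
# punch, just as pins completing a circuit advanced the machine's dials.
from collections import Counter

dials = Counter()          # one "dial" per census category

def read_card(card):
    """Advance the dial for every category punched on this card."""
    for category in card:
        dials[category] += 1

deck = [
    {"male", "age 20-29", "citizen"},
    {"female", "age 30-39", "citizen"},
    {"male", "age 20-29"},
]
for card in deck:
    read_card(card)

print(dials["age 20-29"])   # 2
print(dials["citizen"])     # 2
```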

Harvard Mark I

The IBM Automatic Sequence Controlled Calculator (ASCC), called the Mark I by Harvard University, was an electro-mechanical computer. The electromechanical ASCC was devised by Howard H. Aiken, built at IBM and shipped to Harvard in February 1944. It began computations for the U.S. Navy Bureau of Ships in May and was officially presented to the university on August 7, 1944. It has been described as "the beginning of the era of the modern computer" and "the real dawn of the computer age".

The ASCC was built from switches, relays, rotating shafts, and clutches. It used 765,000 components and hundreds of miles of wire, comprising a volume of 51 feet (16 m) in length, eight feet (2.4 m) in height, and two feet (~61 cm) deep. It had a weight of about 10,000 pounds (4,500 kg). The basic calculating units had to be synchronized mechanically, so they were run by a 50-foot (~15.5 m) shaft driven by a five-horsepower (4 kW) electric motor. It was very reliable, much more so than early electronic computers.

From the IBM Archives: The Automatic Sequence Controlled Calculator (Harvard Mark I) was the first operating machine that could execute long computations automatically. A project conceived by Harvard University's Dr. Howard Aiken, the Mark I was built by IBM engineers in Endicott, N.Y. A steel frame 51 feet (16 m) long and eight feet high held the calculator, which consisted of an interlocking panel of small gears, counters, switches and control circuits, all only a few inches in depth. It was the industry's largest electromechanical calculator. It used 3,500 multipole relays with 35,000 contacts, 2,225 counters, 1,464 tenpole switches and tiers of 72 adding machines, each with 23 significant numbers. The ASCC used 500 miles (800 km) of wire with three million connections.

The enclosure for the Mark I was designed by futuristic American industrial designer Norman Bel Geddes. Aiken considered the elaborate case to be a waste of resources, since computing power was in high demand during the war and the funds ($50,000 or more according to Grace Hopper) could have been used to build additional computer equipment.

Atanasoff–Berry Computer (ABC)

The Atanasoff–Berry Computer (ABC) was the first electronic digital computing device. Conceived in 1937, the machine was not programmable, being designed only to solve systems of linear equations. It was successfully tested in 1942. However, its intermediate result storage mechanism, a paper card writer/reader, was unreliable, and when inventor John Vincent Atanasoff left Iowa State College for World War II assignments, work on the machine was discontinued. The ABC pioneered important elements of modern computing, including binary arithmetic and electronic switching elements, but its special-purpose nature and lack of a changeable, stored program distinguish it from modern computers. The computer was designated an IEEE Milestone in 1990.

Atanasoff and Clifford Berry's computer work was not widely known until it was rediscovered in the 1960s, amidst conflicting claims about the first instance of an electronic computer. At that time, the ENIAC was considered to be the first computer in the modern sense, but in 1973 a U.S. District Court invalidated the ENIAC patent and concluded that the ENIAC inventors had derived the subject matter of the electronic digital computer from Atanasoff (see Patent dispute).

The machine was, however, the first to implement three critical ideas that are still part of every modern computer:
1. Using binary digits to represent all numbers and data
2. Performing all calculations using electronics rather than wheels, ratchets, or mechanical switches
3. Organizing a system in which computation and memory are separated
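The ABC's one job, solving systems of linear equations, can be sketched in a few lines of modern code. The example below uses ordinary Gaussian elimination with floating-point arithmetic; the actual machine worked in binary fixed point and eliminated variables pairwise between equation cards, so this is only an illustration of the class of problem it solved, not of its method.

```python
# Solving a small system of linear equations by Gaussian elimination, as an
# illustration of the kind of problem the ABC was built for.

def solve(a, b):
    n = len(b)
    m = [row[:] + [rhs] for row, rhs in zip(a, b)]     # augmented matrix
    for col in range(n):
        pivot = max(range(col, n), key=lambda r: abs(m[r][col]))
        m[col], m[pivot] = m[pivot], m[col]            # partial pivoting
        for r in range(col + 1, n):
            factor = m[r][col] / m[col][col]
            m[r] = [x - factor * y for x, y in zip(m[r], m[col])]
    x = [0.0] * n
    for r in reversed(range(n)):                       # back substitution
        x[r] = (m[r][n] - sum(m[r][c] * x[c] for c in range(r + 1, n))) / m[r][r]
    return x

# 2x + y = 5 and x + 3y = 10  ->  x = 1, y = 3
print(solve([[2, 1], [1, 3]], [5, 10]))
```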

ENIAC (Electronic Numerical Integrator and Computer)

ENIAC (Electronic Numerical Integrator And Computer) was the first general-purpose electronic computer. It was a Turing-complete digital computer capable of being reprogrammed to solve a full range of computing problems. ENIAC was designed to calculate artillery firing tables for the United States Army's Ballistic Research Laboratory. When ENIAC was announced in 1946 it was heralded in the press as a "Giant Brain". It boasted speeds one thousand times faster than electro-mechanical machines, a leap in computing power that no single machine has since matched. This mathematical power, coupled with general-purpose programmability, excited scientists and industrialists. The inventors promoted the spread of these new ideas by teaching a series of lectures on computer architecture.

ENIAC was conceived and designed by John Mauchly and J. Presper Eckert of the University of Pennsylvania. The team of design engineers assisting the development included Robert F. Shaw (function tables), Chuan Chu (divider/square-rooter), Thomas Kite Sharpless (master programmer), Arthur Burks (multiplier), Harry Huskey (reader/printer) and Jack Davis (accumulators). The ENIAC's design and construction was financed by the United States Army during World War II. The construction contract was signed on June 5, 1943, and work on the computer began in secret by the University of Pennsylvania's Moore School of Electrical Engineering starting the following month under the code name "Project PX".

The completed machine was announced to the public the evening of February 14, 1946 and formally dedicated the next day at the University of Pennsylvania, having cost almost $500,000 (nearly $6 million in 2010, adjusted for inflation). It was formally accepted by the U.S. Army Ordnance Corps in July 1946. ENIAC was shut down on November 9, 1946 for a refurbishment and a memory upgrade, and was transferred to Aberdeen Proving Ground, Maryland in 1947. There, on July 29, 1947, it was turned on and was in continuous operation until 11:45 p.m. on October 2, 1955. ENIAC was named an IEEE Milestone in 1987.

EDVAC (Electronic Discrete Variable Automatic Computer)

EDVAC (Electronic Discrete Variable Automatic Computer) was one of the earliest electronic computers. Unlike its predecessor the ENIAC, it was binary rather than decimal, and was a stored-program machine. Like the ENIAC, the EDVAC was built for the U.S. Army's Ballistics Research Laboratory at the Aberdeen Proving Ground by the University of Pennsylvania's Moore School of Electrical Engineering.

Project origin and plan
ENIAC inventors John Mauchly and J. Presper Eckert proposed the EDVAC's construction in August 1944, and design work for the EDVAC commenced before the ENIAC was fully operational. The design would implement a number of important architectural and logical improvements conceived during the ENIAC's construction and would incorporate a high-speed serial access memory. Eckert and Mauchly and the other ENIAC designers were joined by John von Neumann in a consulting role; von Neumann summarized and elaborated upon logical design developments in his 1945 First Draft of a Report on the EDVAC. A contract to build the new computer was signed in April 1946 with an initial budget of US$100,000. The contract named the device the Electronic Discrete Variable Automatic Calculator. The final cost of EDVAC, however, was similar to the ENIAC's, at just under $500,000.

EDSAC (Electronic Delay Storage Automatic Calculator)

Electronic Delay Storage Automatic Calculator (EDSAC) was an early British computer. The machine, having been inspired by John von Neumann's seminal First Draft of a Report on the EDVAC, was constructed by Maurice Wilkes and his team at the University of Cambridge Mathematical Laboratory in England. EDSAC was the first practical stored-program electronic computer. EDSAC ran its first programs on 6 May 1949, when it calculated a table of squares and a list of prime numbers. Later the project was supported by J. Lyons & Co. Ltd., a British firm, who were rewarded with the first commercially applied computer, LEO I, based on the EDSAC design.

Physical components
As soon as EDSAC was completed, it began serving the University's research needs. It used mercury delay lines for memory, and derated vacuum tubes for logic. Input was via 5-hole punched tape and output was via a teleprinter. None of its components were experimental. Initially registers were limited to an accumulator and a multiplier register. In 1953, David Wheeler, returning from a stay at the University of Illinois, designed an index register as an extension to the original EDSAC hardware.
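EDSAC's first two programs, a table of squares and a list of prime numbers, are easy to restate in a modern language. The Python below reproduces them only in spirit; it is not the original EDSAC code.

```python
# The two computations EDSAC ran on 6 May 1949, restated in Python purely for
# illustration: a table of squares and a list of prime numbers.

def table_of_squares(limit):
    return [(n, n * n) for n in range(limit + 1)]

def primes_up_to(limit):
    """Simple trial-division primality test, enough for a short list."""
    primes = []
    for n in range(2, limit + 1):
        if all(n % p for p in primes if p * p <= n):
            primes.append(n)
    return primes

print(table_of_squares(5))   # [(0, 0), (1, 1), (2, 4), (3, 9), (4, 16), (5, 25)]
print(primes_up_to(30))      # [2, 3, 5, 7, 11, 13, 17, 19, 23, 29]
```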

Manchester Mark 1

The Manchester Mark 1 was one of the earliest stored-program computers, developed at the Victoria University of Manchester from the Small-Scale Experimental Machine (SSEM) or "Baby" (operational in June 1948). It was also called the Manchester Automatic Digital Machine, or MADM. Work began in August 1948, and the first version was operational by April 1949; a program written to search for Mersenne primes ran error-free for nine hours on the night of 16/17 June 1949. The machine's successful operation was widely reported in the British press, which used the phrase "electronic brain" in describing it to their readers. That description provoked a reaction from the head of the University of Manchester's Department of Neurosurgery, the start of a long-running debate as to whether an electronic computer could ever be truly creative.

The Mark 1 was initially developed to provide a computing resource within the university, to allow researchers to gain experience in the practical use of computers, but it very quickly also became a prototype on which the design of Ferranti's commercial version could be based. Development ceased at the end of 1949, and the machine was scrapped towards the end of 1950, replaced in February 1951 by a Ferranti Mark 1, the world's first commercially available general-purpose electronic computer.

The computer is especially historically significant because of its pioneering inclusion of index registers, an innovation which made it easier for a program to read sequentially through an array of words in memory. Thirty-four patents resulted from the machine's development, and many of the ideas behind its design were incorporated in subsequent commercial products such as the IBM 701 and 702 as well as the Ferranti Mark 1. The chief designers, Frederic C. Williams and Tom Kilburn, concluded from their experiences with the Mark 1 that computers would be used more in scientific roles than in pure mathematics. In 1951 they started development work on Meg, the Mark 1's successor, which would include a floating point unit.
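The value of the index registers described above is easiest to see from the programmer's side: a loop reads memory at "base + index" and only the index changes, so the program never has to rewrite its own instruction addresses. The Python sketch below is a toy illustration of that addressing style; names such as memory, base and index are illustrative, not Mark 1 terminology.

```python
# A toy illustration of indexed addressing, the feature the Mark 1 pioneered:
# the loop reads memory[base + index] and only the index register changes.

memory = [0] * 16
base = 4                      # where the array starts
for i, value in enumerate([3, 1, 4, 1, 5]):
    memory[base + i] = value  # store a 5-word array at addresses 4..8

total = 0
index = 0                     # the "index register"
length = 5
while index < length:
    total += memory[base + index]   # effective address = base + index
    index += 1                      # advance the index register, not the code

print(total)   # 14
```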

UNIVAC I (Universal Automatic Computer)

The UNIVAC I (UNIVersal Automatic Computer I) was the first commercial computer produced in the United States. It was designed principally by J. Presper Eckert and John Mauchly, the inventors of the ENIAC. Design work was begun by their company, Eckert-Mauchly Computer Corporation, and was completed after the company had been acquired by Remington Rand. The UNIVAC I computers were built by Remington Rand's UNIVAC division (successor of the Eckert-Mauchly Computer Corporation, bought by Rand in 1950, which later became part of Sperry, now Unisys). (In the years before successor models of the UNIVAC I appeared, the machine was simply known as "the UNIVAC".) The first UNIVAC was delivered to the United States Census Bureau on March 31, 1951, and was dedicated on June 14 that year. The fifth machine (built for the U.S. Atomic Energy Commission) was used by CBS to predict the result of the 1952 presidential election. With a sample of just 1% of the voting population it correctly predicted that Dwight Eisenhower would win.

As well as being the first American commercial computer, the UNIVAC I was the first American computer designed at the outset for business and administrative use (i.e., for the fast execution of large numbers of relatively simple arithmetic and data transport operations, as opposed to the complex numerical calculations required by scientific computers). As such the UNIVAC competed directly against punch-card machines (mainly made by IBM), but oddly enough the UNIVAC originally had no means of either reading or punching cards, which initially hindered sales to some companies with large quantities of data on cards, due to potential manual conversion costs. This was corrected by adding offline card processing equipment, the UNIVAC Card to Tape converter and the UNIVAC Tape to Card converter, to transfer data between cards and UNIVAC magnetic tapes.[1] However, the early market share of the UNIVAC I was lower than the Remington Rand Company wished. In an effort to increase market share, the company joined with CBS to have UNIVAC I predict the result of the 1952 Presidential election. UNIVAC I predicted Ike Eisenhower would have a landslide victory over Adlai Stevenson, whom the pollsters favored. The result for UNIVAC I was greater public awareness of computing technology.

Microprocessor

A microprocessor incorporates the functions of a computer's central processing unit (CPU) on a single integrated circuit, or at most a few integrated circuits. It is a multipurpose, programmable device that accepts digital data as input, processes it according to instructions stored in its memory, and provides results as output. It is an example of sequential digital logic, as it has internal memory. Microprocessors operate on numbers and symbols represented in the binary numeral system.

The advent of low-cost computers on integrated circuits has transformed modern society. General-purpose microprocessors in personal computers are used for computation, text editing, multimedia display, and communication over the Internet. Many more microprocessors are part of embedded systems, providing digital control of a myriad of objects from appliances to automobiles to cellular phones and industrial process control.

During the 1960s, computer processors were constructed out of small and medium-scale ICs, each containing from tens to a few hundred transistors. For each computer built, all of these had to be placed and soldered onto printed circuit boards, and often multiple boards would have to be interconnected in a chassis. The large number of discrete logic gates used more electrical power, and therefore produced more heat, than a more integrated design with fewer ICs. The distance that signals had to travel between ICs on the boards limited the speed at which a computer could operate. The integration of a whole CPU onto a single chip or on a few chips greatly reduced the cost of processing power. Microprocessors integrated into one or a few large-scale ICs the architectures that had previously been implemented using many medium- and small-scale integrated circuits. The integrated circuit processor was produced in large numbers by highly automated processes, so unit cost was low. Single-chip processors increase reliability as there were many fewer electrical connections to fail. As microprocessor designs get faster, the cost of manufacturing a chip (with smaller components built on a semiconductor chip the same size) generally stays the same. Continued increases in microprocessor capacity have rendered other forms of computers almost completely obsolete (see history of computing hardware), with one or more microprocessors used in everything from the smallest embedded systems and handheld devices to the largest mainframes and supercomputers.

The first microprocessors emerged in the early 1970s and were used for electronic calculators, using binary-coded decimal (BCD) arithmetic on 4-bit words. Other embedded uses of 4-bit and 8-bit microprocessors, such as terminals, printers, various kinds of automation etc., followed soon after. Affordable 8-bit microprocessors with 16-bit addressing also led to the first general-purpose microcomputers from the mid-1970s on. Three projects delivered a microprocessor at about the same time: Intel's 4004, Texas Instruments (TI) TMS 1000, and Garrett AiResearch's Central Air Data Computer (CADC).

Since the early 1970s, the increase in capacity of microprocessors has followed Moore's law, which suggests that the number of transistors that can be fitted onto a chip doubles every two years. Although originally calculated as a doubling every year, Moore later refined the period to two years.[4] It is often incorrectly quoted as a doubling of transistors every 18 months.

Personal computer

A personal computer (PC) is any general-purpose computer whose size, capabilities, and original sales price make it useful for individuals, and which is intended to be operated directly by an end-user with no intervening computer operator. A personal computer may be a desktop computer or a laptop, tablet PC, or a handheld PC. In contrast, the batch processing or time-sharing models allowed large expensive mainframe systems to be used by many people, usually at the same time. Large data processing systems require a full-time staff to operate efficiently.

Software applications for personal computers include, but are not limited to, word processing, spreadsheets, databases, Web browsers and email clients, digital media playback, games, and myriad personal productivity and special-purpose software applications. Modern personal computers often have connections to the Internet, allowing access to the World Wide Web and a wide range of other resources. Personal computers may be connected to a local area network (LAN), either by a cable or a wireless connection. While early PC owners usually had to write their own programs to do anything useful with the machines, today's users have access to a wide range of commercial software and free software, which is provided in ready-to-run or ready-to-compile form. Applications and games for PCs are typically developed and distributed independently from the hardware or OS manufacturers, whereas software for many mobile phones and other portable systems is approved and distributed through a centralized online store.

Since the 1980s, Microsoft and Intel have dominated much of the personal computer market, first with MS-DOS and then with the Wintel platform. Alternatives to Windows include Apple's Mac OS X and the open-source Linux OSes. AMD is the major alternative to Intel. In July and August 2011, marketing businesses and journalists started to talk about the 'Post-PC Era', an era where the desktop form factor was being replaced with more portable computing such as netbooks and tablet PCs.

PowerPC 600

The PowerPC 600 family was the first family of PowerPC processors built. They were designed at the Somerset facility in Austin, Texas, jointly funded and staffed by engineers from IBM and Motorola as a part of the AIM alliance. Somerset was opened in 1992 and its goal was to make the first PowerPC processor and then keep designing general-purpose PowerPC processors for personal computers. The first incarnation became the PowerPC 601 in 1993, and the second generation soon followed with the PowerPC 603, PowerPC 604 and the 64-bit PowerPC 620.

The PowerPC 601 was used in the first Power Macintosh computers from Apple, and in a variety of RS/6000 workstations and SMP servers from IBM and Groupe Bull. First launched in IBM systems in the fall of 1993, it was marketed by IBM as the PPC601 and by Motorola as the MPC601. IBM was the sole manufacturer of the 601 and 601+ microprocessors in its Burlington, Vermont and East Fishkill, New York production facilities. An extremely small number of these 601 and 601+ processors were relabeled with Motorola logos and part numbers and distributed through Motorola. These facts are somewhat obscured given there are various pictures of the "Motorola MPC601", particularly one specific case of masterful Motorola marketing where the 601 was named one of Time Magazine's 1994 "Products of the Year" with a Motorola marking.

The 601 has a 32 kB unified L1 cache, a capacity that was considered large at the time for an on-chip cache. Thanks partly to the large cache it was considered a high-performance processor in its segment, outperforming the competing Intel Pentium. It had four functional units: a floating point unit, an integer unit, a branch unit and a sequencer unit. The processor also included a memory management unit. The integer pipeline was four stages long, the branch pipeline two stages long, the memory pipeline five stages long, and the floating-point pipeline six stages long. The chip was designed to suit a wide variety of applications and had support for external L2 cache and symmetric multiprocessing. It operated at speeds ranging from 50 to 80 MHz. The die was 121 mm² large and contained 2.8 million transistors. It was fabricated using a 0.6 µm CMOS process with four levels of aluminum interconnect; the 601 used the IBM CMOS-4s process and the 601+ used the IBM CMOS-5x process.

Glossary

The first counting device was the abacus, originally from Asia. It worked on a place-value notion, meaning that the place of a bead or rock on the apparatus determined how much it was worth.

• 1600s : John Napier discovers logarithms. Robert Bissaker invents the slide rule, which will remain in popular use until the 1970s.
• 1642 : Blaise Pascal, a French mathematician and philosopher, invents the first mechanical digital calculator using gears, called the Pascaline. Although this machine could perform addition and subtraction on whole numbers, it was too expensive and only Pascal himself could repair it.
• 1804 : Joseph Marie Jacquard used punch cards to automate a weaving loom.
• 1812 : Charles P. Babbage, the "father of the computer", discovered that many long calculations involved many similar, repeated operations. Therefore, he designed a machine, the difference engine, which would be steam-powered, fully automatic and commanded by a fixed instruction program. In 1833, Babbage quit working on this machine to concentrate on the analytical engine.
• 1840s : Augusta Ada, "the first programmer", suggested that a binary system should be used for storage rather than a decimal system.
• 1850s : George Boole developed Boolean logic, which would later be used in the design of computer circuitry.
• 1890 : Dr. Herman Hollerith introduced the first electromechanical, punched-card data-processing machine, which was used to compile information for the 1890 U.S. census. Hollerith's tabulator became so successful that he started his own business to market it. His company would eventually become International Business Machines (IBM).
• 1906 : The vacuum tube is invented by American physicist Lee De Forest.
• 1939 : Dr. John V. Atanasoff and his assistant Clifford Berry build the first electronic digital computer. Their machine, the Atanasoff-Berry Computer (ABC), provided the foundation for the advances in electronic digital computers.
• 1941 : Konrad Zuse (recently deceased in January of 1996), from Germany, introduced the first programmable computer designed to solve complex engineering equations. This machine, called the Z3, was also the first to work on the binary system instead of the decimal system.
• 1943 : British mathematician Alan Turing developed a hypothetical device, the Turing machine, which would be designed to perform logical operations and could read and write. It would presage programmable computers. He also used vacuum technology to build the British Colossus, a machine used to counteract the German code scrambling device, Enigma.
• 1944 : Howard Aiken, in collaboration with engineers from IBM, constructed a large automatic digital sequence-controlled computer called the Harvard Mark I. This computer could handle all four arithmetic operations, had punch-card input, and had special built-in programs for logarithms and trigonometric functions.
• 1945 : Dr. John von Neumann presented a paper outlining the stored-program concept.
• 1947 : The giant ENIAC (Electrical Numerical Integrator and Calculator) machine was developed by John W. Mauchly and J. Presper Eckert, Jr. at the University of Pennsylvania. It used 18,000 vacuum tubes, weighed thirty tons and occupied a thirty-by-fifty-foot space. It wasn't programmable but was productive from 1946 to 1955 and was used to compute artillery firing tables. That same year, the transistor was invented by William Shockley, John Bardeen and Walter Brattain of Bell Labs. It would rid computers of vacuum tubes and radios.
• 1949 : Maurice V. Wilkes built the EDSAC (Electronic Delay Storage Automatic Computer), the first stored-program computer. EDVAC (Electronic Discrete Variable Automatic Computer), the second stored-program computer, was built by Mauchly, Eckert, and von Neumann.
• 1950 : Turing built the ACE, considered by some to be the first programmable digital computer. An Wang developed magnetic-core memory, which Jay Forrester would reorganize to be more efficient.

The computer MESM (МЭСМ. developed by Frederic C. the circuitry detected whether the pulse represented a 1 or 0 and caused the oscillator to re-send the pulse. This basic design.GENERATION OF COMPUTER FIRST-GENERATION: Machines Even before the ENIAC was finished. Manchester University's machine became the prototype for the Ferranti Mark 1. Another early machine was CSIRAC. Eckert and Mauchly recognized its limitations and started the design of a stored-program computer. It had about 6.000 vacuum tubes and consumed 25 kW of power. a complete system. it was followed in 1949 by the Manchester Mark 1 computer. would serve as the foundation for the worldwide development of ENIAC's successors. A series of acoustic pulses is sent along a tube. magnetic core memory was rapidly displacing most other forms of temporary storage. it was also capable of tackling real problems. after a time. This design was simpler and was the first to be implemented in each succeeding wave of miniaturization. EDVAC used a single processing unit. John von Neumann was credited with a widely circulated reportdescribing the EDVAC design in which both the programs and working data were stored in a single. denoted the von Neumann architecture. however it was not the first to run. these plans were already in place by the time ENIAC was successfully operational. In this generation of equipment. 1951 and at least nine others were sold between 1951 and 1957. which used parallel processing.EDVAC was the first stored-program computer designed. and dominated the field through the mid-1970s. an Australian design that ran its first test program in 1949. and introducing index registers. temporary or working storage was provided by acoustic delay lines. Others used Williams tubes.The first universal programmable computer in the Soviet Union was created by a team of scientists under direction of Sergei Alekseyevich Lebedev from Kiev Institute of Electrotechnology. Unlike ENIAC.000 operations per second. as the pulse reached the end of the tube. The other contender for the title "first digital stored-program computer" had been EDSAC. Operational less than one year after the Manchester "Baby". It could perform approximately 3. unified store. Eckert and Mauchly left the project and its construction floundered. EDVAC. Williams and Tom Kilburn at the University of Manchester in 1948 as a test bed for the Williams tube. By 1954. CSIRAC is the oldest computer still in existence and the first to have been used to play digital music. Small Electronic Calculating Machine) became operational in 1950. designed and constructed at the University of Cambridge. using Williams tube andmagnetic drum memory. EDSAC was actually inspired by plans for EDVAC (Electronic Discrete Variable Automatic Computer). and increased reliability. Some view Manchester Mark 1 / EDSAC / EDVAC as the "Eves" from which nearly all current computers derive their architecture. 24 | P a g e . which use the ability of a small cathode-ray tube (CRT) to store and retrieve data as charged areas on the phosphor screen. the successor to ENIAC. Soviet Union (nowUkraine). The first Ferranti Mark 1 machine was delivered to the University in February. The first working von Neumann machine was the Manchester "Baby" or Small-Scale Experimental Machine. which used the propagation time of sound through a medium such as liquid mercury (or through a wire) to briefly store data.

SECOND GENERATION: Transistors

The bipolar transistor was invented in 1947. From 1955 onwards transistors replaced vacuum tubes in computer designs, giving rise to the "second generation" of computers. Compared to vacuum tubes, transistors have many advantages: they are smaller, and require less power than vacuum tubes, so give off less heat. Silicon junction transistors were much more reliable than vacuum tubes and had longer, indefinite, service life. Transistorized computers could contain tens of thousands of binary logic circuits in a relatively compact space. Transistors greatly reduced computers' size, initial cost, and operating cost.

Typically, second-generation computers were composed of large numbers of printed circuit boards such as the IBM Standard Modular System, each carrying one to four logic gates or flip-flops. The first transistorised computer was built at the University of Manchester and was operational by 1953; a second version was completed there in April 1955. The later machine used 200 transistors and 1,300 solid-state diodes and had a power consumption of 150 watts. However, it still required valves to generate the clock waveforms at 125 kHz and to read and write on the magnetic drum memory, whereas the Harwell CADET operated without any valves by using a lower clock frequency, of 58 kHz, when it became operational in February 1955. Initially the only devices available were germanium point-contact transistors, which although less reliable than the vacuum tubes they replaced had the advantage of consuming far less power. Problems with the reliability of early batches of point contact and alloyed junction transistors meant that the machine's mean time between failures was about 90 minutes, but this improved once the more reliable bipolar junction transistors became available.

A second generation computer, the IBM 1401, captured about one third of the world market. IBM installed more than ten thousand 1401s between 1960 and 1964.

Transistorized electronics improved not only the CPU (Central Processing Unit), but also the peripheral devices. The IBM 350 RAMAC was introduced in 1956 and was the world's first disk drive. The second generation disk data storage units were able to store tens of millions of letters and digits. Next to the fixed disk storage units, connected to the CPU via high-speed data transmission, were removable disk data storage units. A removable disk stack can be easily exchanged with another stack in a few seconds. Even if the removable disks' capacity is smaller than fixed disks, their interchangeability guarantees a nearly unlimited quantity of data close at hand. Magnetic tape provided archival capability for this data, at a lower cost than disk.

Many second-generation CPUs delegated peripheral device communications to a secondary processor. For example, while the communication processor controlled card reading and punching, the main CPU executed calculations and binary branch instructions. One databus would bear data between the main CPU and core memory at the CPU's fetch-execute cycle rate, and other databusses would typically serve the peripheral devices. On the PDP-1, the core memory's cycle time was 5 microseconds; consequently most arithmetic instructions took 10 microseconds (100,000 operations per second) because most operations took at least two memory cycles, one for the instruction, one for the operand data fetch.

During the second generation remote terminal units (often in the form of teletype machines like a Friden Flexowriter) saw greatly increased use. Telephone connections provided sufficient speed for early remote terminals and allowed hundreds of kilometers separation between remote terminals and the computing center. Eventually these stand-alone computer networks would be generalized into an interconnected network of networks: the Internet.

THIRD GENERATION

The mass increase in the use of computers accelerated with 'Third Generation' computers, starting around 1965. These generally relied on Jack Kilby's invention of the integrated circuit (or microchip). The first integrated circuit was produced in September 1958 but computers using them didn't begin to appear until 1963. Some of their early uses were in embedded systems, notably used by NASA for the Apollo Guidance Computer, by the military in the LGM-30 Minuteman intercontinental ballistic missile, and in the Central Air Data Computer used for flight control in the US Navy's F-14A Tomcat fighter jet. However, the IBM System/360 used hybrid circuits, which were solid-state devices interconnected on a substrate with discrete wires.

While large mainframe computers such as the System/360 increased storage and processing abilities, the integrated circuit also allowed the development of much smaller computers. The minicomputer was a significant innovation in the 1960s and 1970s. It brought computing power to more people, not only through more convenient physical size but also through broadening the computer vendor field. Digital Equipment Corporation became the number two computer company behind IBM with their popular PDP and VAX computer systems. Smaller, affordable hardware also brought about the development of important new operating systems like Unix.

In 1966, Hewlett-Packard entered the general purpose computer business with its HP-2116, offering computing power formerly found only in much larger computers. It supported a wide variety of languages, among them BASIC, ALGOL, and FORTRAN.

In 1969, Data General shipped a total of 50,000 Novas at $8000 each. The Nova was one of the first 16-bit minicomputers and led the way toward word lengths that were multiples of the 8-bit byte. It was first to employ medium-scale integration (MSI) circuits from Fairchild Semiconductor, with subsequent models using large-scale integrated (LSI) circuits. Also notable was that the entire central processor was contained on one 15-inch printed circuit board.

In 1973, the TV Typewriter, designed by Don Lancaster, provided electronics hobbyists with a display of alphanumeric information on an ordinary television set. It used $120 worth of electronics components, as outlined in the September 1973 issue of Radio Electronics magazine. The original design included two memory boards and could generate and store 512 characters as 16 lines of 32 characters. A 90-minute cassette tape provided supplementary storage for about 100 pages of text. His design used minimalistic hardware to generate the timing of the various signals needed to create the TV signal. Clive Sinclair later used the same approach in his legendary Sinclair ZX80.

By 1971, the Illiac IV supercomputer, which was the fastest computer in the world for several years, used about a quarter-million small-scale ECL logic gate integrated circuits to make up sixty-four parallel data processors.

FOURTH GENERATION

The basis of the fourth generation was the invention of the microprocessor by a team at Intel. Unlike third generation minicomputers, which were essentially scaled-down versions of mainframe computers, the fourth generation's origins are fundamentally different. Microprocessor-based computers were originally very limited in their computational ability and speed, and were in no way an attempt to downsize the minicomputer. They were addressing an entirely different market. Although processing power and storage capacities have grown beyond all recognition since the 1970s, the underlying technology of large-scale integration (LSI) or very-large-scale integration (VLSI) microchips has remained basically the same, so it is widely regarded that most of today's computers still belong to the fourth generation.

Microprocessors
On November 15, 1971, Intel released the world's first commercial microprocessor, the 4004. It was developed for a Japanese calculator company, Busicom, as an alternative to hardwired circuitry, but computers were developed around it, with much of their processing abilities provided by one small microprocessor chip. Coupled with one of Intel's other products, the RAM chip (kilobits of memory on one chip), based on an invention by Robert Dennard of IBM, the microprocessor allowed fourth generation computers to be smaller and faster than prior computers. The 4004 was only capable of 60,000 instructions per second, but its successors, the Intel 8008, the 8080 (used in many computers using the CP/M operating system), and the 8086/8088 family (the IBM personal computer (PC) and compatibles use processors still backwards-compatible with the 8086), brought ever-growing speed and power to the computers. Other producers also made microprocessors which were widely used in microcomputers.

Supercomputers
At the other end of the computing spectrum from the microcomputers, the powerful supercomputers of the era also used integrated circuit technology. In 1976 the Cray-1 was developed by Seymour Cray, who had left Control Data in 1972 to form his own company. This machine, the first supercomputer to make vector processing practical, had a characteristic horseshoe shape, to speed processing by shortening circuit paths. Vector processing, which uses one instruction to perform the same operation on many arguments, has been a fundamental supercomputer processing method ever since. The Cray-1 could calculate 150 million floating point operations per second (150 megaflops). 85 were shipped at a price of $5 million each. The Cray-1 had a CPU that was mostly constructed of SSI and MSI ECL ICs.
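Vector processing, "one instruction performing the same operation on many arguments", has a close analogue in modern array programming. The NumPy sketch below applies a single elementwise operation to whole arrays at once; it illustrates the programming model only, not Cray hardware, and assumes the NumPy library is available.

```python
# A modern analogue of vector processing: one array operation applies the same
# arithmetic to every element, instead of looping one element at a time.
import numpy as np

a = np.array([1.0, 2.0, 3.0, 4.0])
b = np.array([10.0, 20.0, 30.0, 40.0])

scalar_style = [x * y for x, y in zip(a, b)]   # one multiply per loop iteration
vector_style = a * b                           # one vectorized multiply over all elements

print(scalar_style)          # [10.0, 40.0, 90.0, 160.0]
print(vector_style)          # [ 10.  40.  90. 160.]
```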

Mainframes and Minicomputers

Before the introduction of the microprocessor in the early 1970s, computers were generally large, costly systems owned by large institutions: corporations, universities, government agencies, and the like, and thus were rarely purchased by individuals. Users, who were experienced specialists, did not usually interact with the machine itself, but instead prepared tasks for the computer on off-line equipment, such as card punches. A number of assignments for the computer would be gathered up and processed in batch mode. After the jobs had completed, users could collect the output printouts and punched cards. In some organizations it could take hours or days between submitting a job to the computing center and receiving the output. This was common in business applications and in science and engineering. Large data processing systems required a full-time staff to operate efficiently.

A more interactive form of computer use developed commercially by the middle 1960s. In a time-sharing system, multiple teletype terminals let many people share the use of one mainframe computer processor. A different model of computer use was foreshadowed by the way in which early, pre-commercial, experimental computers were used, where one user had exclusive use of a processor.

Some of the first computers that might be called "personal" were early minicomputers such as the LINC and PDP-8, and later on VAX and larger minicomputers from Digital Equipment Corporation (DEC), Data General, Prime Computer, and others. They originated as peripheral processors for mainframe computers, taking on some routine tasks and freeing the processor for computation. By today's standards they were physically large (about the size of a refrigerator) and costly (typically tens of thousands of US dollars). However, they were much smaller, less expensive, and generally simpler to operate than the mainframe computers of the time, and thus affordable by individual laboratories and research projects. Minicomputers largely freed these organizations from the batch processing and bureaucracy of a commercial or university computing center. In addition, minicomputers were more interactive than mainframes, and soon had their own operating systems. The minicomputer Xerox Alto (1973) was a landmark step in the development of personal computers, because of its graphical user interface, bit-mapped high-resolution screen, large internal and external memory storage, mouse, and special software.

Fifth generation

The Fifth Generation Computer Systems project (FGCS) was an initiative by Japan's Ministry of International Trade and Industry, begun in 1982, to create a "fifth generation computer" (see History of computing hardware) which was supposed to perform much calculation using massive parallel processing. It was to be the end result of a massive government/industry research project in Japan during the 1980s. It aimed to create an "epoch-making computer" with supercomputer-like performance and to provide a platform for future developments in artificial intelligence.

The term fifth generation was intended to convey the system as being a leap beyond existing machines. Computers using vacuum tubes were called the first generation; transistors and diodes, the second; integrated circuits, the third; and those using microprocessors, the fourth. Whereas previous computer generations had focused on increasing the number of logic elements in a single CPU, the fifth generation, it was widely believed at the time, would instead turn to massive numbers of CPUs for added performance.

The project was to create the computer over a ten-year period, after which it was considered ended and investment in a new, Sixth Generation project, began. Opinions about its outcome are divided: either it was a failure, or it was ahead of its time.

Failure

The FGCS Project did not meet with commercial success, for reasons similar to the Lisp machine companies and Thinking Machines. The highly parallel computer architecture was eventually surpassed in speed by less specialized hardware (for example, Sun workstations and Intel x86 machines). The project did produce a new generation of promising Japanese researchers. But after the FGCS Project, MITI stopped funding large-scale computer research projects, and the research momentum developed by the FGCS Project dissipated. However MITI/ICOT embarked on a Sixth Generation Project in the 1990s.

A primary problem was the choice of concurrent logic programming as the bridge between the parallel computer architecture and the use of logic as a knowledge representation and problem solving language for AI applications. This never happened cleanly; a number of languages were developed, all with their own limitations. In particular, the committed choice feature of concurrent constraint logic programming interfered with the logical semantics of the languages. Another problem was that existing CPU performance quickly pushed through the "obvious" barriers that experts perceived in the 1980s, and the value of parallel computing quickly dropped to the point where it was for some time used only in niche situations. Although a number of workstations of increasing capacity were designed and built over the project's lifespan, they generally found themselves soon outperformed by "off the shelf" units available commercially. The workstations had no appeal in a market where general purpose systems could now take over their job and even outrun them. This is parallel to the Lisp machine market, where rule-based systems such as CLIPS could run on general-purpose computers, making expensive Lisp machines unnecessary.

The project also suffered from being on the wrong side of the technology curve. During its lifespan, GUIs became mainstream in computers, the internet enabled locally stored databases to become distributed, and even simple research projects provided better real-world results in data mining. Moreover the project found that the promises of logic programming were largely negated by the use of committed choice. At the end of the ten-year period the project had spent over ¥50 billion (about US$400 million at 1992 exchange rates) and was terminated without having met its goals.

In spite of the possibility of considering the project a failure, many of the approaches envisioned in the Fifth-Generation project, such as logic programming distributed over massive knowledge-bases, are now being re-interpreted in current technologies. The Web Ontology Language (OWL) employs several layers of logic-based knowledge representation systems, while many flavors of parallel computing proliferate, including multi-core architectures at the low end and massively parallel processing at the high end.

Timeline

• 1982: the FGCS project begins and receives $450,000,000 worth of industry funding and an equal amount of government funding.
• 1985: the first FGCS hardware, known as the Personal Sequential Inference Machine (PSI), and the first version of the Sequential Inference Machine Programming Operating System (SIMPOS) operating system are released. SIMPOS is programmed in Kernel Language 0 (KL0), a concurrent Prolog-variant with object oriented extensions.

During the second generation, remote terminal units (often in the form of teletype machines like a Friden Flexowriter) saw greatly increased use. Telephone connections provided sufficient speed for early remote terminals and allowed hundreds of kilometers of separation between remote terminals and the computing center.

The second generation also brought removable disk data storage units. A removable disk stack can be easily exchanged with another stack in a few seconds; even if the removable disks' capacity is smaller than that of fixed disks, their interchangeability guarantees a nearly unlimited quantity of data close at hand. Magnetic tape provided archival capability for this data, at a lower cost than disk.

Many second-generation CPUs delegated peripheral device communications to a secondary processor. For example, while the communication processor controlled card reading and punching, the main CPU executed calculations and binary branch instructions. One databus would bear data between the main CPU and core memory at the CPU's fetch-execute cycle rate, and other databusses would typically serve the peripheral devices. On the PDP-1, the core memory's cycle time was 5 microseconds; consequently most arithmetic instructions took 10 microseconds (100,000 operations per second), because most operations took at least two memory cycles: one for the instruction, one for the operand data fetch.
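The timing quoted above is simple arithmetic: two 5-microsecond core-memory cycles per arithmetic instruction give 10 microseconds, or about 100,000 operations per second. A minimal sketch of that calculation, in Python purely for illustration (the constants are the ones quoted in the text; the script itself is not from any historical source):

# Worked example of the PDP-1 timing arithmetic described above.
# Assumption: a typical arithmetic instruction needs two core-memory
# cycles, one to fetch the instruction and one to fetch the operand.
MEMORY_CYCLE_US = 5            # core-memory cycle time, in microseconds
CYCLES_PER_INSTRUCTION = 2     # instruction fetch + operand fetch

instruction_time_us = MEMORY_CYCLE_US * CYCLES_PER_INSTRUCTION   # 10 microseconds
ops_per_second = 1_000_000 // instruction_time_us                # 100,000

print(instruction_time_us, "microseconds per instruction")
print(ops_per_second, "operations per second")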

THIRD GENERATION

The mass increase in the use of computers accelerated with 'Third Generation' computers, starting around 1965. These generally relied on Jack Kilby's invention of the integrated circuit (or microchip). The first integrated circuit was produced in September 1958, but computers using them didn't begin to appear until 1963. Some of their early uses were in embedded systems, notably by NASA for the Apollo Guidance Computer, by the military in the LGM-30 Minuteman intercontinental ballistic missile, and in the Central Air Data Computer used for flight control in the US Navy's F-14A Tomcat fighter jet.

While large mainframe computers such as the System/360 increased storage and processing abilities, the integrated circuit also allowed the development of much smaller computers. The IBM System/360 used hybrid circuits, which were solid-state devices interconnected on a substrate with discrete wires. By 1971, the Illiac IV supercomputer, which was the fastest computer in the world for several years, used about a quarter-million small-scale ECL logic gate integrated circuits to make up sixty-four parallel data processors.

The minicomputer was a significant innovation in the 1960s and 1970s. It brought computing power to more people, not only through more convenient physical size but also through broadening the computer vendor field. Digital Equipment Corporation became the number two computer company behind IBM with their popular PDP and VAX computer systems. In 1966, Hewlett-Packard entered the general purpose computer business with its HP-2116, offering computing power formerly found only in much larger computers. It supported a wide variety of languages, among them BASIC, ALGOL, and FORTRAN. In 1969, Data General shipped a total of 50,000 Novas at $8,000 each. The Nova was one of the first 16-bit minicomputers and led the way toward word lengths that were multiples of the 8-bit byte. It was the first to employ medium-scale integration (MSI) circuits from Fairchild Semiconductor, with subsequent models using large-scale integrated (LSI) circuits. Also notable was that the entire central processor was contained on one 15-inch printed circuit board. Smaller, affordable hardware also brought about the development of important new operating systems like Unix.

FOURTH GENERATION

The basis of the fourth generation was the invention of the microprocessor by a team at Intel. Unlike third generation minicomputers, which were essentially scaled-down versions of mainframe computers, the fourth generation's origins are fundamentally different. Microprocessor-based computers were originally very limited in their computational ability and speed, and were in no way an attempt to downsize the minicomputer. They were addressing an entirely different market. Although processing power and storage capacities have grown beyond all recognition since the 1970s, the underlying technology of large-scale integration (LSI) or very-large-scale integration (VLSI) microchips has remained basically the same, so it is widely regarded that most of today's computers still belong to the fourth generation.

Microprocessors

On November 15, 1971, Intel released the world's first commercial microprocessor, the 4004. It was developed for a Japanese calculator company, Busicom, as an alternative to hardwired circuitry, but computers were developed around it.

In 1973, the TV Typewriter, designed by Don Lancaster, provided electronics hobbyists with a display of alphanumeric information on an ordinary television set. It used $120 worth of electronics components, as outlined in the September 1973 issue of Radio Electronics magazine. The original design included two memory boards and could generate and store 512 characters as 16 lines of 32 characters. A 90-minute cassette tape provided supplementary storage for about 100 pages of text. His design used minimalistic hardware to generate the timing of the various signals needed to create the TV signal. Clive Sinclair later used the same approach in his legendary Sinclair ZX80.
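The 512-character figure is just the product of the screen grid: 16 lines of 32 characters each. A minimal sketch of such a character buffer, written in Python only to illustrate the arithmetic and layout (the put_char helper is hypothetical and has nothing to do with Lancaster's actual hardware design):

# Illustrative 16-line by 32-column character display buffer, matching the
# layout described for the TV Typewriter. This is not the original design,
# only the arithmetic behind the 512-character figure.
ROWS, COLS = 16, 32
buffer = [[" "] * COLS for _ in range(ROWS)]   # 16 x 32 = 512 character cells

def put_char(row, col, ch):
    # Store one character at a given screen position (hypothetical helper).
    buffer[row][col] = ch

put_char(0, 0, "H")
print(ROWS * COLS)           # 512 characters of storage
print("".join(buffer[0]))    # contents of the first display line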

The 4004 was only capable of 60,000 instructions per second, but its successors, the Intel 8008, 8080 (used in many computers using the CP/M operating system), and the 8086/8088 family (the IBM personal computer (PC) and compatibles use processors still backwards-compatible with the 8086), brought ever-growing speed and power to the computers. Coupled with one of Intel's other products, the RAM chip (kilobits of memory on one chip), based on an invention by Robert Dennard of IBM, the microprocessor allowed fourth generation computers to be smaller and faster than prior computers. Other producers also made microprocessors, which were widely used in microcomputers, with much of their processing abilities provided by one small microprocessor chip.

Supercomputers

At the other end of the computing spectrum from the microcomputers, the powerful supercomputers of the era also used integrated circuit technology. In 1976 the Cray-1 was developed by Seymour Cray, who had left Control Data in 1972 to form his own company. This machine, the first supercomputer to make vector processing practical, had a characteristic horseshoe shape, to speed processing by shortening circuit paths. Vector processing, which uses one instruction to perform the same operation on many arguments, has been a fundamental supercomputer processing method ever since. The Cray-1 could calculate 150 million floating point operations per second (150 megaflops); 85 were shipped at a price of $5 million each. The Cray-1 had a CPU that was mostly constructed of SSI and MSI ECL ICs.
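Vector processing, as described above, applies a single operation to a whole set of operands at once instead of looping over them one element at a time. A minimal sketch of the idea, using Python with NumPy as a modern analogy (this only illustrates the concept; it is not how the Cray-1 was actually programmed):

# Toy illustration of vector processing: one operation applied to many
# arguments at once, contrasted with an explicit element-by-element loop.
import numpy as np

a = np.arange(8, dtype=np.float64)   # operands 0.0 .. 7.0
b = np.full(8, 2.0)                  # a constant vector of 2.0

# Scalar style: one multiply per loop iteration.
scalar_result = [a[i] * b[i] for i in range(len(a))]

# Vector style: a single vectorized multiply over all elements.
vector_result = a * b

assert list(vector_result) == scalar_result
print(vector_result)                 # [ 0.  2.  4.  6.  8. 10. 12. 14.]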

Mainframes and Mini computers

Before the introduction of the microprocessor in the early 1970s, computers were generally large, costly systems owned by large institutions: corporations, government agencies, universities, and the like. Users—who were experienced specialists—did not usually interact with the machine itself, but instead prepared tasks for the computer on off-line equipment, such as card punches. A number of assignments for the computer would be gathered up and processed in batch mode. After the jobs had completed, users could collect the output printouts and punched cards. In some organizations it could take hours or days between submitting a job to the computing center and receiving the output.

A more interactive form of computer use developed commercially by the middle 1960s. In a time-sharing system, multiple teletype terminals let many people share the use of one mainframe computer processor. This was common in business applications and in science and engineering.

A different model of computer use was foreshadowed by the way in which early, precommercial, experimental computers were used, where one user had exclusive use of a processor. Some of the first computers that might be called "personal" were early minicomputers such as the LINC and PDP-8, and later on VAX and larger minicomputers from Digital Equipment Corporation (DEC), Data General, Prime Computer, and others. They originated as peripheral processors for mainframe computers, taking on some routine tasks and freeing the processor for computation. By today's standards they were physically large (about the size of a refrigerator) and costly (typically tens of thousands of US dollars), and thus were rarely purchased by individuals. However, they were much smaller, less expensive, and generally simpler to operate than the mainframe computers of the time, and thus affordable by individual laboratories and research projects. Minicomputers largely freed these organizations from the batch processing and bureaucracy of a commercial or university computing center.

In addition, minicomputers were more interactive than mainframes, and soon had their own operating systems. The minicomputer Xerox Alto (1973) was a landmark step in the development of personal computers, because of its graphical user interface, bit-mapped high resolution screen, large internal and external memory storage, mouse, and special software.

Fifth generation

The Fifth Generation Computer Systems project (FGCS) was an initiative by Japan's Ministry of International Trade and Industry, begun in 1982, to create a "fifth generation computer" (see History of computing hardware) which was supposed to perform much calculation using massive parallel processing. It was to be the end result of a massive government/industry research project in Japan during the 1980s. It aimed to create an "epoch-making computer" with supercomputer-like performance and to provide a platform for future developments in artificial intelligence.

The term fifth generation was intended to convey the system as being a leap beyond existing machines. Computers using vacuum tubes were called the first generation; transistors and diodes, the second; integrated circuits, the third; and those using microprocessors, the fourth. Whereas previous computer generations had focused on increasing the number of logic elements in a single CPU, the fifth generation, it was widely believed at the time, would instead turn to massive numbers of CPUs for added performance.

The project was to create the computer over a ten-year period, after which it was considered ended and investment in a new Sixth Generation project began. Opinions about its outcome are divided: either it was a failure, or it was ahead of its time.

Failure

The FGCS Project did not meet with commercial success, for reasons similar to the Lisp machine companies and Thinking Machines. The highly parallel computer architecture was eventually surpassed in speed by less specialized hardware (for example, Sun workstations and Intel x86 machines). The project did produce a new generation of promising Japanese researchers, but after the FGCS Project, MITI stopped funding large-scale computer research projects, and the research momentum developed by the FGCS Project dissipated. However, MITI/ICOT embarked on a Sixth Generation Project in the 1990s.

A primary problem was the choice of concurrent logic programming as the bridge between the parallel computer architecture and the use of logic as a knowledge representation and problem solving language for AI applications. This never happened cleanly; a number of languages were developed, all with their own limitations. In particular, the committed choice feature of concurrent constraint logic programming interfered with the logical semantics of the languages. Moreover, the project found that the promises of logic programming were largely negated by the use of committed choice.

Another problem was that existing CPU performance quickly pushed through the "obvious" barriers that experts perceived in the 1980s, and the value of parallel computing quickly dropped to the point where it was for some time used only in niche situations.

Although a number of workstations of increasing capacity were designed and built over the project's lifespan, they generally found themselves soon outperformed by "off the shelf" units available commercially. The workstations had no appeal in a market where general purpose systems could now take over their job and even outrun them. This is parallel to the Lisp machine market, where rule-based systems such as CLIPS could run on general-purpose computers, making expensive Lisp machines unnecessary.

The project also suffered from being on the wrong side of the technology curve. During its lifespan, GUIs became mainstream in computers, the internet enabled locally stored databases to become distributed, and even simple research projects provided better real-world results in data mining. At the end of the ten-year period the project had spent over ¥50 billion (about US$400 million at 1992 exchange rates) and was terminated without having met its goals.

In spite of the possibility of considering the project a failure, many of the approaches envisioned in the Fifth-Generation project, such as logic programming distributed over massive knowledge-bases, are now being re-interpreted in current technologies. The Web Ontology Language (OWL) employs several layers of logic-based knowledge representation systems, while many flavors of parallel computing proliferate, including multi-core architectures at the low end and massively parallel processing at the high end.

Timeline

1982: the FGCS project begins and receives $450,000,000 worth of industry funding and an equal amount of government funding.

1985: the first FGCS hardware, known as the Personal Sequential Inference Machine (PSI), and the first version of the Sequential Inference Machine Programming Operating System (SIMPOS) are released. SIMPOS is programmed in Kernel Language 0 (KL0), a concurrent Prolog variant with object-oriented extensions.