
By: Anurag Jain and Deepti Saraswat


INTRODUCTION
MAINFRAME AND SUPERCOMPUTER
DISTRIBUTED SUPERCOMPUTER
GRID COMPUTING
APPLICATION OF SUPERCOMPUTING
SUPERCOMPUTER OPERATING SYSTEMS
QUASI-OPPORTUNISTIC
COMPUTER CLUSTER
PERFORMANCE
MEMORY REQUIREMENTS
QUANTUM SUPERCOMPUTERS
PARALLEL PROCESSING
DNA SUPERCOMPUTER


A supercomputer is a computer used for applications that require large amounts of mathematical calculation.

While in a traditional multi-user computer system job scheduling is, in effect, a tasking problem for processing and peripheral resources, in a massively parallel system the job management system needs to manage the allocation of both computational and communication resources, as well as gracefully deal with the inevitable hardware failures when tens of thousands of processors are present.

Aspects of a Supercomputer

 Vector processing machines: A vector processor, or array processor, is “a CPU that is able to run mathematical operations on a large number of data elements very quickly.”
 Tightly connected cluster computers: A computer cluster is “a group of connected computers that work together as a unit.” There are basically four types of clusters: director-based clusters, two-node clusters, multi-node clusters, and massively parallel clusters.
 Commodity computers: A large number of commodity PCs interconnected by high-bandwidth, low-latency local area networks.

SUPERCOMPUTER MOTHERBOARD

 MSI K9N6PGM-F Micro ATX motherboard
 AMD Athlon 64 X2 3800+ AM2 CPU
 Kingston DDR2-667 1 GByte RAM
 Echo Star 325W Micro ATX power supply
 Intel PRO/1000 PT PCI-Express NIC
 Intel PRO/100 S PCI NIC
 Seagate 7200 RPM 250 GB SATA HD

The chief difference between a supercomputer and a mainframe is that a supercomputer channels all its power into executing a few programs as fast as possible, whereas a mainframe uses its power to execute many programs concurrently.

In the 1970s only a few processors were used, but in the 1990s machines with thousands of processors began to appear, and by the end of the 20th century massively parallel supercomputers with tens of thousands of "off-the-shelf" processors were in use.

While early operating systems were custom-tailored to each supercomputer to gain speed, since the end of the 20th century the trend has been to move away from in-house operating systems toward the adaptation of generic software such as Linux. Supercomputer operating systems have undergone major transformations as sea changes have taken place in supercomputer architecture.

Quasi-opportunistic supercomputing is a form of distributed computing whereby the “super virtual computer” of a large number of networked, geographically dispersed computers performs computing tasks that demand huge processing power. Quasi-opportunistic supercomputing aims to provide a higher quality of service than opportunistic grid computing by achieving more control over the assignment of tasks to distributed resources and by using intelligence about the availability and reliability of individual systems within the supercomputing network.

In another approach, a large number of processors are used in close proximity to each other, e.g. in a computer cluster. The use of multi-core processors combined with centralization is an emerging direction. Currently, Japan's K computer (a cluster) is the fastest in the world.


Capability vs. capacity: Supercomputers generally aim for the maximum in capability computing rather than capacity computing. Capability computing is typically thought of as using the maximum computing power to solve a single large problem in the shortest amount of time, e.g. a very complex weather simulation application. Often a capability system is able to solve a problem of a size or complexity that no other computer can.

Supercomputer is the term normally used to denote the fastest, most powerful class of computers at any given point in time. These are built for optimum computational performance. A supercomputer can be defined as any computer which utilizes more than one processor at the same time, using the concept of parallel processing to attain ultra-high processing speeds with matching accuracy. Supercomputers are used to solve large and complex problems that cannot be solved by smaller, less powerful computers.

Supercomputers are used for highly calculation-intensive tasks such as problems in quantum physics, weather forecasting, climate research, oil and gas exploration, molecular modeling, and physical simulations.

Supercomputers are known to perform complex operations at rates measured in teraflops. So it is essential that the memory which supports any supercomputer can store the results of trillions of calculations within a fraction of a second. It is also mandatory for the memory to access the same data at the same rate. With the dawn of the 21st century, many new technologies have come into existence which support such storage structures. The following two types of memory-related techniques have come into the picture due to their revolutionary methods:

Quantum computers use the concepts of quantum mechanics. This field involves the application of atoms, molecules or photons to provide processing. Traditionally, an 8-bit digital computer can exist in only one of its 256 possible states at a time. The revolutionary concept of a quantum computer states that an 8-bit quantum processor can be in all 256 states at once. This property of a computer to exist in multiple states at the same time is called QUANTUM PARALLELISM.
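The 256-state figure above is just the count of configurations of an 8-bit register, which generalizes to 2^n for n bits. A minimal Python sketch of that arithmetic (illustrative only; the function name is made up):

```python
def simultaneous_states(n_bits):
    # A classical n-bit register holds exactly one of 2**n states at a time;
    # an n-qubit quantum register can, in superposition, span all of them.
    return 2 ** n_bits

print(simultaneous_states(8))   # 256, matching the 8-bit example above
```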

Systems with a massive number of processors generally take one of two paths. In one approach, e.g. grid computing, the processing power of a large number of computers in distributed, diverse administrative domains is opportunistically used whenever a computer is available.

Parallel processing is an efficient form of information processing which emphasizes the exploitation of concurrent events in the computing process. It stands in contrast to sequential processing. Concurrency implies parallelism, simultaneity, and pipelining. Parallel processing demands the concurrent execution of many programs in the computer. The highest level of parallel processing is conducted among multiple jobs or programs through multiprogramming, time sharing, and multiprocessing. This level requires the development of parallel-processable algorithms.

24 .

25 .

 Single Instruction, Multiple Data (SIMD) stream: A single machine instruction controls the simultaneous execution of a number of processing elements on a lockstep basis. Each processing element has an associated data memory, so that each instruction is executed on a different set of data by the different processors. Vector and array processors fall into this category.
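The SIMD idea, one instruction broadcast over many data elements, can be mimicked in plain Python (a conceptual sketch of the lockstep behavior, not real vector hardware; `simd_add` is a made-up name):

```python
from array import array

def simd_add(vec, scalar):
    # One "instruction" (add scalar) applied in lockstep to every element,
    # the way a vector/array processor applies it across all its lanes.
    return array("d", (x + scalar for x in vec))

data = array("d", [1.0, 2.0, 3.0, 4.0])
result = simd_add(data, 10.0)
print(list(result))  # [11.0, 12.0, 13.0, 14.0]
```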

 Multiple Instruction, Single Data (MISD) stream: A sequence of data is transmitted to a set of processors, each of which executes a different instruction sequence. This structure has not been commercially implemented.
 Multiple Instruction, Multiple Data (MIMD) stream: A set of processors simultaneously execute different instruction sequences on different data sets. SMPs, clusters, and NUMA systems fit into this category.
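MIMD's separate instruction streams operating on separate data sets can be sketched with Python threads (illustrative only; `stream_a`/`stream_b` are hypothetical names, and real MIMD machines run the streams on distinct processors):

```python
import threading

results = {}

def stream_a(data):
    # First instruction stream: sum its own data set.
    results["a"] = sum(data)

def stream_b(data):
    # Second, different instruction stream: find the maximum of a different data set.
    results["b"] = max(data)

t1 = threading.Thread(target=stream_a, args=([1, 2, 3],))
t2 = threading.Thread(target=stream_b, args=([7, 5, 9],))
t1.start(); t2.start()
t1.join(); t2.join()
print(results)  # {'a': 6, 'b': 9}
```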

Milestones in Parallel Processing
1840s: The desirability of computing multiple values at the same time was noted in connection with Babbage’s Analytical Engine.
1970s-80s: Vector machines, with deeply pipelined function units, emerged and quickly dominated the supercomputer market. Early experiments with shared-memory multiprocessors, message-passing multicomputers, and gracefully degrading systems paved the way for further progress.
1990s: Massively parallel computers with mesh or torus interconnection, using wormhole routing, became dominant.
2000s-2012: Commodity and network-based parallel computing took hold, offering speed, flexibility, and reliability for server farms and other areas.

DNA (deoxyribonucleic acid) is the hereditary material in human beings. DNA stores permanent information about genes. Genes in the form of DNA store the information which is to be transferred from parent to child. The concept of DNA as a memory is derived from the analogy between the binary language and DNA: the binary language consists of 0's and 1's, while DNA is coded in the form of four nucleotides, which are adenine, guanine, cytosine and thymine.
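Because there are four nucleotides, each base can stand for two binary digits. The mapping below is hypothetical (the slides do not specify one), but it shows how the binary-to-DNA analogy could work in practice:

```python
# Hypothetical 2-bits-per-nucleotide mapping (illustrative, not from the slides).
BASE_FOR_BITS = {"00": "A", "01": "G", "10": "C", "11": "T"}
BITS_FOR_BASE = {base: bits for bits, base in BASE_FOR_BITS.items()}

def encode(bits):
    # Pack a binary string (length divisible by 2) into a DNA strand.
    return "".join(BASE_FOR_BITS[bits[i:i + 2]] for i in range(0, len(bits), 2))

def decode(strand):
    # Recover the original binary string from the strand.
    return "".join(BITS_FOR_BASE[base] for base in strand)

print(encode("00011011"))  # AGCT
print(decode("AGCT"))      # 00011011
```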

Hardware: At least two computers with Windows XP, Windows NT SP6, or Windows 2000, networked with some sort of LAN equipment (hub, switch, etc.). Ensure during the Windows setup phase that TCP/IP and NetBEUI are installed, and that the network is started with all the network cards detected and the correct drivers installed. We will call these two computers a Windows cluster.
Software: You now need some sort of software that will help you develop, deploy, and execute applications over this cluster. The Message Passing Interface (MPI) is an evolving de facto standard for supporting clustered computing based on message passing. There are several implementations of this standard. This software is the core of what makes a Windows cluster possible.

Building models of the complete human brain will require supercomputers more than a thousand times more powerful than today’s largest machines. Even supercomputers are not yet close to the raw power of the human brain: IBM's Blue Gene supercomputer can handle about 4.5% of your brain's processes, and might have all of them covered by 2019.
