Arrays: An array is a list or collection of data items which are stored in the form of a table and is given a common name, called the array name or subscripted variable name. (Read the advantages of arrays from the book.)

Pointers: Pointers are a special type of variable which point to another variable, structure, function or even another pointer. (Read the advantages of pointers from the book.)

Data Structure (Unit 2): A data structure is the organization of data in a computer's memory or in a file. An ADT (Abstract Data Type) defines a data structure as a set of entities and a set of operations on those entities, whose use is independent of any particular programming language. (Read Unit 2 carefully as it is the introductory part of the whole book.)

Stack: A stack is a data structure where we insert and delete data from the same end, known as the Top. The insert operation in a stack is known as Push and the delete operation is known as Pop. A stack works on a LIFO (last in, first out) basis. Normally a stack has a limited size, but a stack of unlimited size can be built using a linked list.

Queue: A queue is a data structure where data is inserted at one end and deleted from the other end. The end where we insert data is known as the Rear and the end where we delete data is known as the Front. The types of queues are: ordinary queue, double-ended queue, circular queue and priority queue.
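The Push and Pop operations above can be sketched in C with a fixed-size array; MAX, the variable names and the overflow/underflow messages are assumptions for illustration (a queue would be handled the same way, but with separate Front and Rear indices):

#include <stdio.h>

#define MAX 100          /* assumed fixed capacity of the stack */

int stack[MAX];
int top = -1;            /* -1 means the stack is empty */

/* Push: insert an item at the Top end */
void push(int item) {
    if (top == MAX - 1) {
        printf("Stack overflow\n");
        return;
    }
    stack[++top] = item;
}

/* Pop: delete and return the item at the Top end (LIFO) */
int pop(void) {
    if (top == -1) {
        printf("Stack underflow\n");
        return -1;       /* sentinel value for this sketch */
    }
    return stack[top--];
}

int main(void) {
    push(10); push(20); push(30);
    printf("%d\n", pop());   /* prints 30: last in, first out */
    return 0;
}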

Tree:
A tree is a data structure which consists of a finite set of elements, called nodes, and a finite set of directed lines, called branches, that connect the nodes. The number of branches associated with a node is the degree of the node. A branch coming towards a node counts towards the indegree of the node; similarly, the branches going out of a node make up its outdegree. The first node of the tree is known as the root. When the tree is empty, the root is null; otherwise the indegree of the root is always 0.

Elements of a tree:
Leaf: a node with no successors, i.e. a node whose outdegree is 0.
Root: the unique node with no predecessor, i.e. the first node of the tree, which always has indegree 0.
Non-leaf: a node which has both a parent and at least one child.
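One common way to hold such a general tree in memory is the first-child/next-sibling representation; in the sketch below (struct and function names are assumptions for illustration) the degree and leaf tests follow directly from the definitions above:

#include <stddef.h>

/* General tree node: one pointer to the leftmost child, one to the next sibling */
struct tnode {
    int data;
    struct tnode *first_child;   /* first outgoing branch of this node */
    struct tnode *next_sibling;  /* next child of the same parent */
};

/* Outdegree of a node = number of children */
int degree(const struct tnode *n) {
    int d = 0;
    for (const struct tnode *c = n->first_child; c != NULL; c = c->next_sibling)
        d++;
    return d;
}

/* A leaf is a node whose outdegree is 0 */
int is_leaf(const struct tnode *n) {
    return n->first_child == NULL;
}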

Children: the successors of a node.
Parent: the unique predecessor of a node.
Siblings: nodes that have the same parent.
Internal nodes: nodes that are neither the root nor a leaf.

Binary Tree: It is a directed tree in which the outdegree of each node is less than or equal to two, i.e. each node cannot have more than two children. In a binary tree the two children are called left and right. In a complete binary tree of level 2 or more, the degree of the root node is 2 and the degree of an internal node is 3, i.e. the indegree of an internal node is 1 and its outdegree is 2.

Binary Tree Traversal techniques:
Pre-order Traversal (Algorithm: (i) Process the node. (ii) Traverse the left subtree in pre-order. (iii) Traverse the right subtree in pre-order.)
Inorder Traversal (Algorithm: (i) Traverse the left subtree in inorder. (ii) Process the node. (iii) Traverse the right subtree in inorder.)
Post-order Traversal (Algorithm: (i) Traverse the left subtree in post-order. (ii) Traverse the right subtree in post-order. (iii) Process the node.)

Example of the tree traversal (Fig. 1: root P with children Q and R; Q has children S and T, S has child W, and R has children U and V):
Preorder Traversal: P, Q, S, W, T, R, U, V
Inorder Traversal: S, W, Q, T, P, U, R, V
Postorder Traversal: W, S, T, Q, U, V, R, P
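The three traversals can be sketched directly in C. The tree built in main() mirrors Fig. 1 (taking W as the right child of S so that the orders above come out); the node layout, names and use of characters for data are assumptions for illustration:

#include <stdio.h>
#include <stdlib.h>

struct node {
    char data;
    struct node *left, *right;
};

static struct node *make(char data, struct node *left, struct node *right) {
    struct node *n = malloc(sizeof *n);
    n->data = data;
    n->left = left;
    n->right = right;
    return n;
}

/* Pre-order: node, left subtree, right subtree */
void preorder(const struct node *n) {
    if (n == NULL) return;
    printf("%c ", n->data);
    preorder(n->left);
    preorder(n->right);
}

/* Inorder: left subtree, node, right subtree */
void inorder(const struct node *n) {
    if (n == NULL) return;
    inorder(n->left);
    printf("%c ", n->data);
    inorder(n->right);
}

/* Post-order: left subtree, right subtree, node */
void postorder(const struct node *n) {
    if (n == NULL) return;
    postorder(n->left);
    postorder(n->right);
    printf("%c ", n->data);
}

int main(void) {
    /* Tree of Fig. 1: P(Q(S(-,W), T), R(U, V)) */
    struct node *root =
        make('P',
             make('Q', make('S', NULL, make('W', NULL, NULL)),
                       make('T', NULL, NULL)),
             make('R', make('U', NULL, NULL),
                       make('V', NULL, NULL)));
    preorder(root);  printf("\n");   /* P Q S W T R U V */
    inorder(root);   printf("\n");   /* S W Q T P U R V */
    postorder(root); printf("\n");   /* W S T Q U V R P */
    return 0;
}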

File Structure
• Records are collected into logical units called files. File structure, which is also called data structure, is the logical organization of records in a file.
• The logical structure of the data is the relationship that will exist between data items, independently of their physical storage.
• Some examples of file structures are sequential files, inverted files, index-sequential files and multi-list files.
• A sequential file structure is easy to implement and provides fast access to the next record using lexicographic order. It is difficult to update (inserting a new record may require moving a large proportion of the file) and random access is extremely slow.
• An inverted file is a file structure in which every list contains only one record. The AND and OR operations can be performed with one pass through both lists in an inverted file.
• An index-sequential file is an inverted file with a hierarchy of indices.
• A multi-list is a slightly modified inverted file. The multi-list is designed to overcome the difficulties of updating an inverted file.
• A cellular multi-list is again a modified form of the multi-list. The K-lists are limited in a cellular multi-list so that they will not cross the page (cell) boundaries of the storage media.
• A ring structure is a linear list that closes upon itself, i.e. the beginning and the end of the list are the same record. A ring structure is particularly useful to show classification of data.
• Tree structures can also be used to organize the files.
• Hash addressing is the technique by which the file structures are implemented. The advantages of hashing are: (i) it is simple, (ii) insertion and search strategies are identical, (iii) the search time is independent of the number of keys to be inserted.

Splay Trees
• Splay trees are a form of binary search tree on which the standard search-tree operations run in O(lg n) amortized time.
• Every time a node is accessed, it is moved to the root of the splay tree.
• We continue to apply the three rules until x is at the root of the tree; the rules are: Zig, Zig-zag and Zig-zig.
• The zig-zig rule is the one that distinguishes splaying from just rotating x to the root of the tree.
• Doing a rotation between a pair of nodes x and y only affects the ranks of the nodes x and y, and no other nodes in the tree.
• The balance theorem: a sequence of m splays in a tree of n nodes takes time O(m log n + n log n).
• The insertion operation in a splay tree takes O(log(n+1)) amortized cost.
• The join operation on two splay trees A and B takes O(log n) amortized cost.
• Deletion of a node x in a splay tree is also done in O(log n) amortized cost.
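The Zig case is just a single rotation between the accessed node x and its parent; below is a minimal C sketch of a right rotation (struct and function names are assumptions, and a full splay would also implement the zig-zig and zig-zag cases and the parent bookkeeping):

struct snode {
    int key;
    struct snode *left, *right;
};

/* Zig (right rotation): x is the left child of its parent y.
 * After the rotation x sits above y, one level closer to the root. */
struct snode *rotate_right(struct snode *y) {
    struct snode *x = y->left;
    y->left = x->right;      /* x's right subtree becomes y's left subtree */
    x->right = y;            /* y becomes the right child of x */
    return x;                /* x is the new root of this subtree */
}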

Graph: A graph is a collection of vertices V and a collection of edges E consisting of pairs of vertices. Each vertex is a member of the set V; a vertex is sometimes called a node. Each edge is a member of the set E. Note that some vertices might not be the end point of any edge; such vertices are termed "isolated". Sometimes, numerical values are associated with edges, specifying lengths or costs; such graphs are called edge-weighted graphs (or weighted graphs), and the value associated with an edge is called the weight of the edge. A similar definition holds for node-weighted graphs. The graph is normally represented using that analogy: vertices are points or circles, edges are lines between them. In this example graph:
V = {1, 2, 3, 4, 5, 6}
E = {(1,4), (1,5), (2,6), (3,4), (3,6)}

Minimum Spanning Tree: In the mathematical field of graph theory, a spanning tree T of a connected, undirected graph G is a tree composed of all the vertices and a minimum number of the edges of G. Informally, a spanning tree of G is a selection of edges of G that form a tree spanning every vertex. That is, every vertex lies in the tree, but no cycles (or loops) are formed. On the other hand, every bridge of G must belong to T.
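As a sketch of how such a graph can be stored, the fragment below keeps the example's six vertices in an adjacency matrix; the choice of a matrix over lists and the edge weights are illustrative assumptions, not taken from the notes:

#include <stdio.h>

#define N 6   /* vertices 1..6, as in the example above */

int main(void) {
    /* weight[u][v] = weight of edge (u,v); 0 means "no edge" in this sketch */
    int weight[N + 1][N + 1] = {0};

    /* undirected edges are stored in both directions;
       the weights here are made up for illustration */
    weight[1][4] = weight[4][1] = 3;
    weight[1][5] = weight[5][1] = 1;
    weight[2][6] = weight[6][2] = 4;
    weight[3][4] = weight[4][3] = 2;
    weight[3][6] = weight[6][3] = 5;

    /* print every edge once (u < v) with its weight */
    for (int u = 1; u <= N; u++)
        for (int v = u + 1; v <= N; v++)
            if (weight[u][v] != 0)
                printf("(%d,%d) weight %d\n", u, v, weight[u][v]);
    return 0;
}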

A spanning tree of a connected graph G can also be defined as a maximal set of edges of G that contains no cycle, or as a minimal set of edges that connect all vertices. We can find the Minimum Spanning Tree using Kruskal's and Prim's algorithms. In Kruskal's algorithm we go on selecting the minimum cost edge one after another, in ascending order of the weights; Kruskal's algorithm is also known as a greedy algorithm. In Prim's algorithm we go on selecting the minimum cost edge adjacent to the already selected edges.

Relaxation
The relaxation process updates the costs of all the vertices v connected to a vertex u if we could improve the best estimate of the shortest path to v by including (u,v) in the path to v. Dijkstra's algorithm solves the problem of finding the shortest path from a point in a graph (the source) to a destination. It turns out that one can find the shortest paths from a given source to all points in a graph in the same time; hence this problem is sometimes called the single-source shortest paths problem.

Bellman-Ford algorithm: An efficient algorithm to solve the single-source shortest-path problem, in which edge weights may be negative. The algorithm initializes the distance to the source vertex to 0 and all other vertices to ∞. It then does V-1 passes (V is the number of vertices) over all edges, relaxing, or updating, the distance to the destination of each edge. Finally it checks each edge again to detect negative weight cycles, in which case it returns false. The time complexity is O(VE), where E is the number of edges.
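A minimal C sketch of the relaxation step and the Bellman-Ford passes described above (the edge list, the INF sentinel and the array names are assumptions for illustration):

#include <stdio.h>
#include <limits.h>

#define INF INT_MAX

struct edge { int u, v, w; };    /* directed edge u -> v with weight w */

/* Relax edge (u,v): improve dist[v] if going through u is cheaper */
int relax(int dist[], struct edge e) {
    if (dist[e.u] != INF && dist[e.u] + e.w < dist[e.v]) {
        dist[e.v] = dist[e.u] + e.w;
        return 1;
    }
    return 0;
}

/* Bellman-Ford: returns 0 if a negative-weight cycle is reachable, else 1 */
int bellman_ford(int nv, int ne, struct edge edges[], int src, int dist[]) {
    for (int v = 0; v < nv; v++) dist[v] = INF;
    dist[src] = 0;

    for (int pass = 0; pass < nv - 1; pass++)      /* V-1 passes over all edges */
        for (int i = 0; i < ne; i++)
            relax(dist, edges[i]);

    for (int i = 0; i < ne; i++)                   /* extra pass: negative cycle check */
        if (relax(dist, edges[i]))
            return 0;
    return 1;
}

int main(void) {
    struct edge edges[] = { {0,1,5}, {1,2,-2}, {0,2,9}, {2,3,3} };
    int dist[4];
    if (bellman_ford(4, 4, edges, 0, dist))
        for (int v = 0; v < 4; v++)
            printf("dist[%d] = %d\n", v, dist[v]);
    return 0;
}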

Graph Transpose
Graph Transpose problem
Input: directed graph G = (V,E)
Output: graph GT = (V,ET), where ET = {(v,u) in VxV : (u,v) in E}, i.e. GT is G with all its edges reversed.
Describe efficient algorithms for computing GT from G, for both the adjacency-list and adjacency-matrix representations of G. Analyze the running times of your algorithms.

==> Using Adjacency List representation (A is the adjacency list of G, B is the new adjacency list of GT, p is the number of vertices):

for (i = 1; i <= p; i++)
    B[i] = nil;
for (i = 1; i <= p; i++)
    for each vertex j in the list A[i]
        append i to the end of linked list B[j];

Total Time = O(V + E)

==> Using Adjacency Matrix representation (C is the adjacency matrix of G, D is the new adjacency matrix of GT):

for (i = 1; i <= p; i++)
    for (j = 1; j <= p; j++)
        D[j][i] = C[i][j];

Total Time = O(V²)

Eulerian Cycle & Eulerian Path
Euler Cycle
Input: Connected, directed graph G = (V,E)
Output: A cycle that traverses every edge of G exactly once, although it may visit a vertex more than once.
Theorem: A directed graph possesses an Eulerian cycle iff
1) it is connected, and
2) for all v in V, indegree(v) = outdegree(v).

Euler Path
Input: Connected, directed graph G = (V,E)
Output: A path from v1 to v2 that traverses every edge of G exactly once, although it may visit a vertex more than once.
Theorem: A directed graph possesses an Eulerian path from v1 to v2 iff
1) it is connected, and
2) for all v in V, indegree(v) = outdegree(v), with the possible exception of the two vertices v1 and v2, in which case
a) indegree(v1) = outdegree(v1) - 1, and
b) indegree(v2) = outdegree(v2) + 1.
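Condition (2) of the Eulerian cycle theorem is easy to check from an edge list; a minimal C sketch (the edge list and names are assumptions, and connectivity would be tested separately, e.g. with a DFS):

#include <stdio.h>

#define NV 4
#define NE 4

/* Check condition (2) of the Eulerian cycle theorem:
 * indegree(v) == outdegree(v) for every vertex v. */
int degrees_balanced(int ne, int edges[][2], int nv) {
    int indeg[NV] = {0}, outdeg[NV] = {0};
    for (int i = 0; i < ne; i++) {
        outdeg[edges[i][0]]++;   /* edge leaves edges[i][0] */
        indeg[edges[i][1]]++;    /* edge enters edges[i][1] */
    }
    for (int v = 0; v < nv; v++)
        if (indeg[v] != outdeg[v])
            return 0;
    return 1;
}

int main(void) {
    /* directed cycle 0 -> 1 -> 2 -> 3 -> 0 */
    int edges[NE][2] = { {0,1}, {1,2}, {2,3}, {3,0} };
    printf("%s\n", degrees_balanced(NE, edges, NV) ? "balanced" : "not balanced");
    return 0;
}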

Topological Sort
Topological Sort problem
Input: A directed acyclic graph (DAG) G = (V,E)
Output: A linear ordering of all vertices in V such that if G contains an edge (u,v), then u appears before v in the ordering.
Topological sort can be seen as laying the vertices out along a horizontal line so that all directed edges go from left to right.
Note: A directed graph G is acyclic if and only if a DFS of G yields no back edge.

Big O Notation
It is the formal method of expressing the upper bound of an algorithm's running time. It's a measure of the longest amount of time it could possibly take for the algorithm to complete. More formally, for non-negative functions f(n) and g(n), if there exists an integer n0 and a constant c > 0 such that for all integers n > n0, f(n) ≤ cg(n), then f(n) is Big O of g(n). This is denoted as "f(n) = O(g(n))". If graphed, g(n) serves as an upper bound to the curve you are analyzing.
Some examples:
O(n): printing a list of n items to the screen, looking at each item once.
O(log n) or O(ln n): taking a list of items and cutting it in half repeatedly until there's only one item left.
O(n²): taking a list of n items and comparing every item to every other item.

Big Omega Notation
For non-negative functions f(n) and g(n), if there exists an integer n0 and a constant c > 0 such that for all integers n > n0, f(n) ≥ cg(n), then f(n) is omega of g(n). This is denoted as "f(n) = Ω(g(n))". This is almost the same definition as Big O, except that "f(n) ≥ cg(n)"; this makes g(n) a lower bound function, instead of an upper bound function. It describes the best that can happen for a given data size.
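The three Big O examples listed above can be read off directly from loop structure; a minimal C sketch (the function names are made up for illustration):

#include <stdio.h>

/* O(n): look at each item once */
void print_all(const int a[], int n) {
    for (int i = 0; i < n; i++)
        printf("%d\n", a[i]);
}

/* O(log n): halve the range until only one item is left */
int count_halvings(int n) {
    int steps = 0;
    while (n > 1) {
        n /= 2;
        steps++;
    }
    return steps;
}

/* O(n²): compare every item to every other item */
int count_equal_pairs(const int a[], int n) {
    int pairs = 0;
    for (int i = 0; i < n; i++)
        for (int j = i + 1; j < n; j++)
            if (a[i] == a[j])
                pairs++;
    return pairs;
}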

Theta Notation
For non-negative functions f(n) and g(n), f(n) is theta of g(n) if and only if f(n) = O(g(n)) and f(n) = Ω(g(n)). This is denoted as "f(n) = Θ(g(n))". This is basically saying that the function f(n) is bounded both from the top and from the bottom by the same function, g(n).

Little o Notation
For non-negative functions f(n) and g(n), f(n) is little o of g(n) if and only if f(n) = O(g(n)) but f(n) ≠ Θ(g(n)). This is denoted as "f(n) = o(g(n))". This represents a loose bounding version of Big O: g(n) bounds from the top, but it does not bound the bottom.

Little Omega Notation
For non-negative functions f(n) and g(n), f(n) is little omega of g(n) if and only if f(n) = Ω(g(n)) but f(n) ≠ Θ(g(n)). This is denoted as "f(n) = ω(g(n))". This represents a loose bounding version of Big Omega: g(n) bounds from the bottom, but it does not bound the top.

Note: Don't forget to read the Summary given after each chapter.
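A small worked example of the Big O, Omega and Theta definitions above (the particular function is chosen here only for illustration): let f(n) = 3n² + 5n and g(n) = n². With c = 4 and n0 = 5 we have 3n² + 5n ≤ 4n² for all n > n0, so f(n) = O(n²); with c = 3 we have 3n² + 5n ≥ 3n² for all n > 0, so f(n) = Ω(n²). Since both bounds hold, f(n) = Θ(n²), and therefore f(n) ≠ o(n²) and f(n) ≠ ω(n²).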
