
University of Engineering and Technology, Lahore
Department of Computer Science and Engineering

Problem Set # 2

Truth is subject to too much analysis.


Frank Herbert

Asymptotic Notations
1. Does it make sense to say that the worst-case running time of any procedure is Ω(g(n))? Yes, because asymptotic notations are independent of whether we are using the best-, average-, or worst-case running time, so we can place a lower bound on the worst-case running time of any algorithm.

2. In each of the following situations indicate whether f = O(g(n)), or f = Ω(g(n)), or both (in which case f = Θ(g(n))).
   (a) f(n) = n − 100 and g(n) = n − 200. Both, so Θ.
   (b) f(n) = n^{1/2} and g(n) = n^{2/3}. O.
   (c) f(n) = log 2n and g(n) = log 3n. Both, so Θ.
   (d) f(n) = (log n)^{log n} and g(n) = n/log n. Ω.
   (e) f(n) = n! and g(n) = 2^n. Ω.

3. Show that if c is a positive real number, then g(n) = 1 + c + c^2 + ... + c^n is:
   (a) Θ(1) if c < 1.
       g(n) = Σ_{i=0}^{n} c^i = (1 − c^{n+1}) / (1 − c) < 1 / (1 − c)

   since c < 1, so g(n) = O(1). On the other hand 1 ≤ g(n) = 1 + c + c^2 + ... + c^n, so g(n) = Ω(1), which implies g(n) = Θ(1).

   (b) Θ(n) if c = 1. Since c = 1 we get g(n) = n + 1. This means that for all n ≥ 1, g(n) = n + 1 ≤ 2n. Hence g(n) = O(n). On the other hand, for all n, n < g(n). Hence g(n) = Ω(n). Combining the two results we get g(n) = Θ(n).

   (c) Θ(c^n) if c > 1.
       g(n) = Σ_{i=0}^{n} c^i = (c^{n+1} − 1) / (c − 1) < c^{n+1} / (c − 1) = (c / (c − 1)) · c^n

   Since g(n) < k · c^n with k = c/(c − 1), we get g(n) = O(c^n). On the other hand c^n ≤ 1 + c + c^2 + ... + c^n = g(n), so g(n) = Ω(c^n), which implies g(n) = Θ(c^n).

4. Give the Big-Oh for each of the following.
   a.
       sum = 0;
       for (i = 1; i <= n; i++) {
           for (j = 1; j <= i * i; j++) {
               if (j % i == 0) {
                   for (k = 1; k <= j; k++) {
                       sum++;
                   }
               }
           }
       }
      The innermost loop runs only when j is a multiple of i, i.e. j = i, 2i, ..., i · i; for a fixed i this contributes i + 2i + ... + i · i = Θ(i^3) increments, and summing over i gives O(n^4).
   b.
       sum = 0;
       for (i = 1; i <= n; i++) {
           for (j = 1; j <= n; j++) {
               sum++;
               sum++;
           }
       }
      O(n^2).
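A quick way to sanity-check these bounds is to count the increments of sum directly. The C sketch below (the helper names count_a and count_b are just for illustration) prints the counts for doubling values of n; the count for (a) should grow roughly 16x per doubling and the count for (b) roughly 4x.

    #include <stdio.h>

    /* Counts how many times sum++ executes in part (a). */
    static long long count_a(int n) {
        long long sum = 0;
        for (int i = 1; i <= n; i++)
            for (int j = 1; j <= i * i; j++)
                if (j % i == 0)
                    for (int k = 1; k <= j; k++)
                        sum++;
        return sum;
    }

    /* Counts how many times sum++ executes in part (b). */
    static long long count_b(int n) {
        long long sum = 0;
        for (int i = 1; i <= n; i++)
            for (int j = 1; j <= n; j++) {
                sum++;
                sum++;
            }
        return sum;
    }

    int main(void) {
        /* Doubling n should multiply (a) by roughly 16 (n^4) and (b) by 4 (n^2). */
        for (int n = 25; n <= 200; n *= 2)
            printf("n = %3d   a = %12lld   b = %8lld\n", n, count_a(n), count_b(n));
        return 0;
    }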

Divide & Conquer


Solve the recurrences:

(a) T(n) = 5T(n/2) + n. Case 1 of Master Theorem: take ε = 1, therefore T(n) = Θ(n^{log_2 5}).

(b) T(n) = T(n/2) + 1/n. Case 1 of Master Theorem: take any ε with 0 < ε ≤ 1, therefore T(n) = Θ(n^{log_b a}) = Θ(1).
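The case 1 conditions for (a) can be checked explicitly (the exponent 2.32 is only a numerical approximation of lg 5):

    \[
    a = 5,\; b = 2 \;\Rightarrow\; n^{\log_b a} = n^{\lg 5} \approx n^{2.32},
    \qquad
    f(n) = n = O\!\left(n^{\lg 5 - \epsilon}\right) \text{ for } \epsilon = 1,
    \]

so case 1 gives T(n) = Θ(n^{lg 5}).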

(c) T(n) = T(n/8) + T(7n/8) + Θ(1). Master theorem not applicable; could be done with the recursion tree method (check the 2nd example on the recursion tree method).

(d) T(n) = T(n − 1) + 1/n. Master theorem not applicable; it can be solved by iterating the recurrence:

        T(n) = T(n − 1) + 1/n
             = T(n − 2) + 1/(n − 1) + 1/n
             = T(n − 3) + 1/(n − 2) + 1/(n − 1) + 1/n
             ...
             = 1 + 1/2 + 1/3 + ... + 1/(n − 1) + 1/n
             = Θ(ln n)
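The last step uses the standard bounds on the harmonic sum, obtained by comparing it with the integral of 1/x:

    \[
    \ln(n+1) = \int_1^{n+1} \frac{dx}{x} \;\le\; \sum_{k=1}^{n} \frac{1}{k} \;\le\; 1 + \int_1^{n} \frac{dx}{x} = 1 + \ln n,
    \]

hence the sum is Θ(ln n).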

(e) T(n) = √n · T(√n) + n. Master theorem not applicable directly; it could be done by replacing the variables (see CLRS for changing variables), but applying a recursion tree makes sense. At level 1 the original problem divides itself into √n = n^{1/2} problems, each of size n^{1/2}, with cost n^{1/2} · n^{1/2} = n. At level 2 each problem is of size n^{1/4} and the total number of problems is n^{1/2} · n^{1/4}, so the cost is n^{1/2} · n^{1/4} · n^{1/4} = n. In general, at level i there are n^{1 − 1/2^i} problems, each of size n^{1/2^i}, so the cost at every level is n. Since the problem size at level i is n^{1/2^i}, the tree bottoms out at level k when n^{1/2^k} = 2, i.e. k = lg lg n.

        Total cost = total levels × cost at each level = (lg lg n + 1) · n = Θ(n lg lg n)
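The change of variables mentioned above gives the same answer; a sketch, using the standard substitution m = lg n:

    \[
    m = \lg n,\ n = 2^m:\quad T(2^m) = 2^{m/2}\,T(2^{m/2}) + 2^m.
    \]

Setting S(m) = T(2^m)/2^m and dividing through by 2^m gives

    \[
    S(m) = S(m/2) + 1 \;\Rightarrow\; S(m) = \Theta(\lg m),
    \]

so T(n) = 2^m · S(m) = Θ(n lg lg n), matching (lg lg n + 1) · n from the recursion tree.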

(f) T(n) = 4T(n/2) + n^2 √n. Case 3 of Master Theorem: take 0 < ε < 1/2, therefore T(n) = Θ(n^{5/2}).

(g) T(n) = 6T(n/3) + n^2 log n. Case 3 of Master Theorem: take 0 < ε ≤ 1/3 (any ε ≤ 2 − log_3 6 works), therefore T(n) = Θ(n^2 log n).

(h) T(n) = 64T(n/8) − n^2 log n. Master theorem not applicable, because f(n) = −n^2 log n is not asymptotically positive.
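Case 3 also requires the regularity condition a · f(n/b) ≤ c · f(n) for some constant c < 1; a quick check for (f):

    \[
    4\,f(n/2) = 4\left(\tfrac{n}{2}\right)^{5/2} = \tfrac{4}{2^{5/2}}\,n^{5/2} = \tfrac{1}{\sqrt{2}}\,n^{5/2} \le c\,n^{5/2}
    \quad \text{with } c = \tfrac{1}{\sqrt{2}} < 1.
    \]

The check for (g) is similar: 6 f(n/3) = (2/3) n^2 log(n/3) ≤ (2/3) n^2 log n.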

Binary Search Tree


1. 12.2-4: A counterexample: selecting one key from each of the three sets, e.g. 7, 5 and 19, the claimed ordering 7 ≤ 5 ≤ 19 is not true.

12.2-5: The successor of a node with two children is the minimum element of its right subtree, and a minimum element never has a left child; likewise the predecessor is the maximum element of its left subtree, and a maximum element never has a right child. Since the node has both a left and a right child, its predecessor and successor lie in its left and right subtrees respectively, not among its ancestors.
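The argument mirrors the CLRS TREE-MINIMUM / TREE-SUCCESSOR procedures; a minimal C sketch (the node layout with a parent pointer is assumed here, it is not given in the problem set):

    #include <stddef.h>

    /* Assumed node layout for illustration. */
    struct node {
        int key;
        struct node *left, *right, *parent;
    };

    /* CLRS TREE-MINIMUM: keep following left pointers, so the node
       returned has no left child. */
    struct node *tree_minimum(struct node *x) {
        while (x->left != NULL)
            x = x->left;
        return x;
    }

    /* CLRS TREE-SUCCESSOR: if x has a right child, its successor is the
       minimum of that right subtree (and therefore has no left child);
       otherwise climb until we leave a left subtree. */
    struct node *tree_successor(struct node *x) {
        if (x->right != NULL)
            return tree_minimum(x->right);
        struct node *y = x->parent;
        while (y != NULL && x == y->right) {
            x = y;
            y = y->parent;
        }
        return y;
    }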

12.2-9: If x is a leaf node then its parent y is either its successor or its predecessor. Since x is a leaf, its successor or predecessor has to be one of its ancestors. If x is the left child of y, then y's value is immediately greater than the value of x: since x is a leaf, there can be no value greater than x but less than y (if there were such a value, it would have to be on the right side of x; for 7 and 10 in figure 1, if there happened to be some value like 8 it would be on the right of x, making x a non-leaf). Hence y has to be x's successor. Similar logic applies if x is y's right child; then y is its predecessor.

2. 12.3-2: Obviously, when inserting a node that ends up at level x, the algorithm compares it with all the nodes on the path down to its supposed parent at level x − 1, so x − 1 comparisons are made; but when later searching for that node, the search has to be carried down to level x to find it, which makes one comparison more.

12.3-3: INORDER-TRAVERSAL is Θ(n) and each INSERTION is O(h). According to the algorithm, T(n) = n · O(h) + n.
Worst case: this happens when the data to be sorted is already sorted, which results in a completely left- or right-leaning tree (a chain of nodes, each the only child of the previous one); then h = n − 1 and T(n) = n(n − 1) + n = Θ(n^2).
Best case: this happens when the data is carefully arranged to produce a full or almost full tree; then h = lg n and T(n) = n lg n + n = Θ(n lg n).

12.3-4: Deleting nodes 7 and 10.
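To illustrate the worst case in 12.3-3, a small C sketch (the helper names bst_insert and inorder are made up for illustration) that inserts already-sorted keys, so the tree degenerates into a right-leaning chain and the n insertions cost Θ(n^2) in total:

    #include <stdio.h>
    #include <stdlib.h>

    struct node {
        int key;
        struct node *left, *right;
    };

    /* Insert key into the subtree rooted at root; O(h) per call. */
    static struct node *bst_insert(struct node *root, int key) {
        if (root == NULL) {
            struct node *z = malloc(sizeof *z);
            z->key = key;
            z->left = z->right = NULL;
            return z;
        }
        if (key < root->key)
            root->left = bst_insert(root->left, key);
        else
            root->right = bst_insert(root->right, key);
        return root;
    }

    /* Theta(n) inorder walk prints the keys in sorted order. */
    static void inorder(const struct node *x) {
        if (x == NULL)
            return;
        inorder(x->left);
        printf("%d ", x->key);
        inorder(x->right);
    }

    int main(void) {
        /* Already-sorted input: each insertion walks the whole right
           spine, giving 0 + 1 + ... + (n - 1) = Theta(n^2) comparisons. */
        int keys[] = {1, 2, 3, 4, 5, 6, 7};
        struct node *root = NULL;
        for (size_t i = 0; i < sizeof keys / sizeof keys[0]; i++)
            root = bst_insert(root, keys[i]);
        inorder(root);
        printf("\n");
        return 0;
    }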
