
Inspire…Educate…Transform.

Decision Trees &


Association Rules

Dr. Anand Jayaraman

The best place for students to learn Applied Engineering http://www.insofe.edu.in


Supervised Learning



Unsupervised Learning



Unsupervised Learning: Examples



Agenda

• Decision Trees

• Association Rules



Classification Problem

Can logistic regression do a good job of classifying this data?


Remember this example from logistic regression?

• Using the mtcars dataset, estimate the probability of a vehicle being fitted with a manual transmission if it has a 120 hp engine and weighs 2800 lbs.


Logistic Regression: Class Boundary

Equation of the fit:

Ln(S) = 18.8663 − 8.080348 · wt + 0.03636 · hp

Setting Ln(S) = 0 in the above equation gives the equation of the dividing line between the two classes. This line marks the set of points for which prob = 0.5:

0 = 18.8663 − 8.080348 · wt + 0.03636 · hp


Classification Problem: Logistic



DECISION TREES



Can we predict who would have survived the sinking of titanic?
Variable | Description
Pclass | Passenger Class (1 = 1st; 2 = 2nd; 3 = 3rd)
survival | Survival (0 = No; 1 = Yes)
name | Name
sex | Sex
age | Age
sibsp | Number of Siblings/Spouses Aboard
parch | Number of Parents/Children Aboard
ticket | Ticket Number
fare | Passenger fare (British pounds)
cabin | Cabin number
embarked | Port of embarkation (C = Cherbourg; Q = Queenstown; S = Southampton)


Structure of the Data
PassengerId Survived Pclass Name Sex Age SibSp Parch Ticket Fare Cabin Embarked
1 0 3 Braund, Mr. Owen Harris male 22 1 0 A/5 21171 7.25 S
2 1 1 Cumings, Mrs. John Bradley (Florence Briggs Thayer) female 38 1 0 PC 17599 71.2833 C85 C
3 1 3 Heikkinen, Miss. Laina female 26 0 0 STON/O2. 3101282 7.925 S
4 1 1 Futrelle, Mrs. Jacques Heath (Lily May Peel) female 35 1 0 113803 53.1 C123 S
5 0 3 Allen, Mr. William Henry male 35 0 0 373450 8.05 S
6 0 3 Moran, Mr. James male 0 0 330877 8.4583 Q
7 0 1 McCarthy, Mr. Timothy J male 54 0 0 17463 51.8625 E46 S
8 0 3 Palsson, Master. Gosta Leonard male 2 3 1 349909 21.075 S
9 1 3 Johnson, Mrs. Oscar W (Elisabeth Vilhelmina Berg) female 27 0 2 347742 11.1333 S
10 1 2 Nasser, Mrs. Nicholas (Adele Achem) female 14 1 0 237736 30.0708 C
11 1 3 Sandstrom, Miss. Marguerite Rut female 4 1 1 PP 9549 16.7 G6 S
12 1 1 Bonnell, Miss. Elizabeth female 58 0 0 113783 26.55 C103 S
13 0 3 Saundercock, Mr. William Henry male 20 0 0 A/5. 2151 8.05 S
14 0 3 Andersson, Mr. Anders Johan male 39 1 5 347082 31.275 S
15 0 3 Vestrom, Miss. Hulda Amanda Adolfina female 14 0 0 350406 7.8542 S
16 1 2 Hewlett, Mrs. (Mary D Kingcome) female 55 0 0 248706 16 S
17 0 3 Rice, Master. Eugene male 2 4 1 382652 29.125 Q
18 1 2 Williams, Mr. Charles Eugene male 0 0 244373 13 S
19 0 3 Vander Planke, Mrs. Julius (Emelia Maria Vandemoortele) female 31 1 0 345763 18 S
20 1 3 Masselmani, Mrs. Fatima female 0 0 2649 7.225 C
21 0 2 Fynney, Mr. Joseph J male 35 0 0 239865 26 S

891 passengers in the "train" dataset. 418 passengers in "test".
Understanding the data
• Train data statistics
  – What percentage of people survived?

    Did not survive: 61.6% | Survived: 38.4%

Majority did not make it.

Can we make a forecast using this info?
Simple forecast
• Since the majority did not make it:
  – A random person in the test set would more likely have died than survived.
  – Let's make that our prediction!

What's the accuracy of our prediction?

62%

Can we do better?
Women and children first?
Train data analysis

Did not survive Survived


Female 25.8% 74.2%
Male 81.1% 18.9%

New prediction rule:

Is Male?
  y ➔ Did Not Survive
  n ➔ Survived

Accuracy on test set: 76.5%!!
Decision Tree



Decision Tree
• Add more features: cabin type, fare paid, etc.

Accuracy on test set: 77.5%
Decision Tree : More features



Building & Using Trees

Build:
• Think: "If, Then" rules specified in the feature space.
• Greedily divide (binary split) the feature space into distinct, non-overlapping regions.
• "Natural" clustering given the target variable.

Use:
• Classification: every observation mapped to a leaf node is assigned the label most commonly occurring in that leaf.
• Regression: every observation mapped to a leaf node is assigned the mean of the samples in that leaf.


Decision Trees: Continuous splitting of the feature space

In Feature Space:
• The feature space contains all data
• Divided regions contain "homogeneous" data subsets
• Region boundaries define regions (homogeneous data)

As a Tree:
• Root contains all data
• Leaves contain "homogeneous" data subsets
• Paths along branches define leaves (homogeneous data)


Theme & Key Variations

• Decision Trees:
  – Continuous splitting of the feature space
  – Recursive partitioning of the feature space

• Split:
  – Divide the data such that all data in subset-1 is "different" from the data in subset-2 in a certain dimension.

• How to split?
  – Which variable to use to split the data?
  – Which value of the variable to split on?

• What criteria should be used to evaluate a split?
  – What is the split trying to achieve?
  – How do you measure the homogeneity of a subset?
  – In Classification / Regression
  – Supervised Clustering

• Algorithm names:
  – CART
  – C4.5
  – C5.0
  – CHAID
  – ID3
  – …

• Other Variations:
  – Handling missing values (different category, surrogate splits, etc.)
  – More than two child nodes
  – One variable appears only once in the tree


Choosing the Split - Classification

What is a good split?
• Among all possible splits (all features, all split points), which split maximizes gain / minimizes error? (Greedy)
• Information Gain / impurity reduction

Choosing feature and split-point:
• Cluster "homogeneous" data (subsets of data)
• What is a good split measure?
  – Classification Error: 1 − max_j p_j
  – Gini Index: p1(1 − p1) + p2(1 − p2)
  – Entropy: −[p1 log(p1) + p2 log(p2)]
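The three impurity measures above can be computed directly from the class proportions in a node. A minimal sketch (generic Python, not tied to any library; the function name is mine):

```python
import math

def impurities(p):
    """Return (classification error, Gini index, entropy) for class proportions p."""
    error = 1 - max(p)                                    # 1 - max_j p_j
    gini = sum(pj * (1 - pj) for pj in p)                 # p1(1-p1) + p2(1-p2) + ...
    entropy = -sum(pj * math.log2(pj) for pj in p if pj)  # -(p1 log p1 + p2 log p2 + ...)
    return error, gini, entropy

# A pure node has zero impurity under every measure;
# a 50/50 binary node maximizes all three.
print(impurities([1.0, 0.0]))
print(impurities([0.5, 0.5]))  # (0.5, 0.5, 1.0)
```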


Impurity = Classification Error Rate



Impurity = Classification Error Rate (cont’d)



Impurity = Gini Index



Impurity = Cross Entropy



Classification error vs. Gini vs. Entropy

• Measures of Impurity
  – Determine Information Gain
  – Determine split choice

• For binary classification:
  – All measures reach a maximum at (0.5, 0.5)
  – All measures are symmetrical around the maximum
  – They differ in how they evaluate impurity

http://people.revoledu.com/kardi/tutorial/DecisionTree/how-to-measure-impurity.htm


Splitting based on Categorical variable

• How many potential split points are possible? (cs.nyu.edu/~dsontag)
  – Determines the computational complexity

• Categorical variable with k factors
  – In how many ways can you split the data into two (binary) sets?
  – 2^(k−1) − 1 (or just k, if we restrict split points to be of the form "x = t vs. x ≠ t")


Splitting based on Numeric variable
• How many potential split points possible?
– Infinite??

• What is the range of values of the numeric variable?
  – Consider split points of the form t = xi + (xi+1 − xi)/2
  – One branch: < t; the other branch: ≥ t

• Can do better:
  – Sort the attribute values that you can split on.
  – Find all the "breakpoints" where the class labels associated with them change.
  – Consider only the split points where the labels change.
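The breakpoint idea above takes only a few lines. A hedged sketch (the function name and toy data are mine), where the label changes at exactly one point along the sorted attribute:

```python
def candidate_splits(x, y):
    """Sort a numeric attribute and return midpoints where the class label changes."""
    pairs = sorted(zip(x, y))
    splits = []
    for (x1, y1), (x2, y2) in zip(pairs, pairs[1:]):
        if y1 != y2 and x1 != x2:              # label changes between distinct values
            splits.append(x1 + (x2 - x1) / 2)  # t = x_i + (x_{i+1} - x_i) / 2
    return splits

ages = [2, 4, 14, 22, 35, 54]
survived = [1, 1, 1, 0, 0, 0]
print(candidate_splits(ages, survived))  # [18.0]
```

Only one candidate survives instead of five midpoints, which is the computational saving the slide describes.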


Classification Problem



Classification Problem: Decision Tree



How deep should the tree be?



DTs can Overfit!!



When to stop splitting?

When to stop splitting (avoiding overfitting)?

Decision Tree:
• Continuous splitting of the feature space
• Recursive partitioning of the feature space

When will we be "forced" to stop?
• When all nodes are pure (homogeneous leaves)
• These trees can be very deep: overfitting
• Good trees don't overfit!
• All models must guard against overfitting

Early Stopping:
• Information Gain < Threshold
• Minimum instances per node
• Maximum tree depth

Grow & Prune:
• Tree building is greedy!
• Current split gain < future split gain (gotcha!)


Cost complexity pruning
• J(Tree, S) = ErrorRate(Tree, S) + α · (#Nodes)

• Try several values of α, starting from 0
• Do a K-fold cross-validation on all of them and find the best pruning α
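In scikit-learn this procedure maps onto `cost_complexity_pruning_path` plus K-fold cross-validation. A sketch under the assumption that scikit-learn is available; the iris data is only a stand-in:

```python
import numpy as np
from sklearn.datasets import load_iris
from sklearn.model_selection import cross_val_score
from sklearn.tree import DecisionTreeClassifier

X, y = load_iris(return_X_y=True)

# Enumerate the effective alphas for this training set...
path = DecisionTreeClassifier(random_state=0).cost_complexity_pruning_path(X, y)

# ...and pick the alpha with the best K-fold cross-validation score.
scores = [cross_val_score(DecisionTreeClassifier(random_state=0, ccp_alpha=a),
                          X, y, cv=5).mean()
          for a in path.ccp_alphas]
best_alpha = path.ccp_alphas[int(np.argmax(scores))]
print("best alpha:", best_alpha)
```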


How do we choose α?

[Plot: training and test error as a function of α; the mean cross-validation error identifies the best α]
Reduced Error Pruning
• Start from leaf nodes
• If removal of a node does not change validation accuracy:
  – Combine the leaf elements of this node into the previous node
• Continue till the root node

[Example tree: Wind ➔ (Strong: Temp ➔ (Mild: Yes, Hot: No); Weak: Yes)]
Controlling Overfit
Other parameters
• Minimum information gain
• Specify the minimum count in leaf
node
• Max number of leaf nodes



Python code: DT with Cross Validation

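The slide's own code is not reproduced here; the following is a plausible equivalent, assuming scikit-learn (the dataset and parameter grid are stand-ins), tuning the overfitting controls from the previous slides with K-fold cross-validation:

```python
from sklearn.datasets import load_iris
from sklearn.model_selection import GridSearchCV
from sklearn.tree import DecisionTreeClassifier

X, y = load_iris(return_X_y=True)

# Grid-search tree depth and minimum leaf size with 5-fold CV.
grid = GridSearchCV(
    DecisionTreeClassifier(random_state=0),
    param_grid={"max_depth": [2, 3, 5, None],
                "min_samples_leaf": [1, 5, 10]},
    cv=5)
grid.fit(X, y)
print(grid.best_params_, round(grid.best_score_, 3))
```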


Advantages of Trees
• They are fast
• Robust
• Requires very little experimentation
• You may also build some intuitions about your customer base, e.g. "Are customers with different family sizes truly different?"
Regression Trees
• Can we use a decision tree only for
classification or can we use them
for forecasting or predicting a
numeric attribute?



Regression trees
• It turns out that we are collecting very similar records at each leaf. So we can use the median or mean of the records at a leaf as the predicted value for all new records that obey similar conditions. Such trees are called regression trees.
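A regression tree can be sketched in a few lines with scikit-learn (an assumption; the data here is synthetic). Each leaf predicts the mean of the training targets that land in it:

```python
import numpy as np
from sklearn.tree import DecisionTreeRegressor

# Two well-separated clusters of x values with different target levels.
X = np.array([[1.0], [2.0], [3.0], [10.0], [11.0], [12.0]])
y = np.array([1.1, 0.9, 1.0, 5.2, 4.8, 5.0])

# One split -> two leaves; each leaf predicts its training mean.
reg = DecisionTreeRegressor(max_depth=1).fit(X, y)
print(reg.predict([[2.5], [11.0]]))  # ≈ [1.0, 5.0] (the two leaf means)
```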


Two most popular decision tree algorithms
• CART
– Binary split
– Gini index
– Cost complexity pruning
• C5.0
– Multi split
– Info gain
– pessimistic pruning



CART



C5.0



Missing values
• Two common strategies for a test instance x:
  – If node n tests attribute A, assign the most common value of A among the other examples routed to node n
  – Alternatively: assign the most common value of A among the other examples routed to node n that have the same class label as x
Classification Problem: Which method?



Goodness of RULES
• Decision trees provide us prediction
rules
• Each Leaf node represents a single
rule
• The effectiveness of individual rules is measured using:
  – Support, Confidence, Lift
An if-then propositional rule
• If (x) and (y) and (z) then A

• x, y, z: Antecedents
• A: Consequent

• Length of a rule: the number of antecedents
“If CCAvg is medium then loan = accept”

This rule covers 3 in 13 samples, or 23% of the data. This is called support.


“If CCAvg is medium then loan = accept”
On all three occasions where the left side (antecedent) is present, the right side (consequent) is also present.

This rule has 100% confidence and 23% support.
Lift
• Are confidence and support always good?
  – Example: a leak in the oil gasket causes an engine rebuild
    • The probability of an engine rebuild is very small, hence the support for this rule is small
    • The confidence will be high
  – Hence you use lift:
    • Confidence of the rule / overall probability of the event
    • i.e., confidence of the oil-gasket rule / probability of an engine rebuild


Decision Trees : Summary

Versatility:
• Can be used for classification, regression & clustering
• Effectively handles missing values
• Can be adapted to streaming data

Predictive Accuracy:
• Not so great
• But: advanced methods exist to fix it

Interpretability:
• Easy to understand / present / visualize
• Human-interpretable rules
• Allow post-processing: rule systems

Model Stability:
• High variance: strong dependence on the training set
• But: advanced methods exist to fix it


Unsupervised rule induction
ASSOCIATION RULES – AFFINITY
ANALYSIS / MARKET BASKET
ANALYSIS



Who purchased the basket?



It is not over yet
• Most likely he/she is a vegetarian!
• He/she has been exposed to some foreign culture (how many Indians eat pickles!)


Market basket analysis
• Provides insight into which products
tend to be purchased together and
which are most amenable to
promotion.



Market Basket analysis can give
• Actionable rules
• Trivial rules
  – People who buy shoes also buy socks
• Inexplicable rules
  – People who buy shirts also buy milk


Association Rules

• What does the value of one feature tell us about the value of another feature?
• People who buy diapers are likely to buy baby powder
• If (people buy diaper), then (they buy baby powder)
• Caution : Watch the directionality! (A➔B does not mean B ➔A)
• Association rules
  • Are statements about relations among features (attributes), across elements (tuples)
  • Use a transaction-itemset data model


Association Rules = Market Basket Analysis?

• Most common use
  • Each basket (purchase) is a row and each item is a column

• Not the only use
  • Can work in any dataset where features take, or can be represented as taking, only two values: 0/1
  • Preprocessing: discretization, feature selection


It is not just retail and baskets

• Unusual combinations of insurance claims can be a sign of


fraud and can spark further investigation.

• Medical patient histories can give indications of likely


complications based on certain combinations of treatments.

• Association Rules beyond Market Basket Analysis
  – People who visit webpage X are likely to visit webpage Y.
  – Nodes which run a web server are likely to run Linux.
  – People in age group [30, 40] with income > $100k are likely to own a home.
Measures of effectiveness

• What do association rules look like?
  • {diapers} ➔ {baby powder}
  • {bread, butter} ➔ {milk}
  • {bat, ball, pads} ➔ {helmet}
  • X ➔ Y :: If {X}, Then {Y}
  • If Precondition, Then Conclusion
  • If Antecedent, Then Consequent

• How good / significant is a rule?
  • An association rule is a probabilistic statement
  • How much historical data supports your rule?
  • How confident are we that the rule holds?

• Support (a.k.a. Coverage) of X ➔ Y
  • Fraction of rows containing both X & Y
  • P(X and Y): joint probability
  • Support(X ➔ Y) = Support(Y ➔ X)

• Confidence of X ➔ Y
  • Among rows containing X, the fraction of rows also containing Y
  • P(Y|X): conditional probability
  • Confidence(X ➔ Y) ≠ Confidence(Y ➔ X)

• What do association rules really look like?
  • X ➔(support, confidence) Y


Measures of effectiveness (cont’d)
• {Diaper, Beer} ➔ Milk
• Support = 2/5, Confidence = 2/3
• {Milk} ➔ {Diaper, Beer}
• Support = 2/5, Confidence = 2/4
• {Milk, Diaper} ➔ Bread
• Support = 2/5, Confidence = 2/3
• {Milk, Beer} ➔ Diaper?

• Confidence = 1?
• Caution : Diaper is very popular!
• Does the inclusion of {Milk, Beer} increase the probability of Diaper?
• Lift
  • Confidence(X ➔ Y) / Support(Y), or equivalently P(Y|X) / P(Y)
  • > 1: X & Y positively correlated (presence of X lifts the probability of Y's presence)
  • < 1: X & Y negatively correlated (presence of X reduces the probability of Y's presence)
  • = 1: X & Y not correlated
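Support, confidence and lift can be checked by hand. The sketch below uses a stand-in transaction table (the slide's own table is not reproduced here; this is the classic textbook basket data, chosen because it yields the same support and confidence values quoted above):

```python
# Toy transactions consistent with the numbers on this slide.
T = [
    {"Bread", "Milk"},
    {"Bread", "Diaper", "Beer", "Eggs"},
    {"Milk", "Diaper", "Beer", "Coke"},
    {"Bread", "Milk", "Diaper", "Beer"},
    {"Bread", "Milk", "Diaper", "Coke"},
]

def support(items):
    """Fraction of transactions containing all of `items`."""
    return sum(items <= t for t in T) / len(T)

def confidence(X, Y):
    """P(Y|X): among transactions containing X, fraction also containing Y."""
    return support(X | Y) / support(X)

def lift(X, Y):
    """Confidence(X -> Y) / Support(Y)."""
    return confidence(X, Y) / support(Y)

X, Y = {"Diaper", "Beer"}, {"Milk"}
print(support(X | Y), confidence(X, Y), lift(X, Y))
# support = 2/5, confidence = 2/3, but lift = (2/3)/(4/5) < 1:
# Milk is so popular that {Diaper, Beer} actually lowers its probability.
```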
Question
• How many rules can we generate when we have p items in our dataset?
  – Approximately 2^p

• How do we generate these rules automatically on large data?
APRIORI ALGORITHM



Downward Closure Property
• Suppose {A,B} has a certain frequency (f). Since each
occurrence of A,B includes both A and B, then both
A and B must also have frequency >= f.

• Similar argument for larger itemsets

• So, if a k-itemset meets a cut-off frequency, all its


subsets (k-1, k-2 itemsets) also meet this cut-off
frequency



Downward closure



The Apriori Algorithm
• Key Idea
– If {a,c,f} is frequent, {a,c} must be frequent
– Downward closure a.k.a. anti-monotonicity

• Algorithm
  – Find all frequent 1-itemsets (frequent ➔ support > minsupport)
  – Find all frequent 2-itemsets from the filtered 1-itemsets
  – Find all frequent 3-itemsets from the filtered 2-itemsets
  – …

• The reliance on the downward closure property allows us to reduce our search space for the rules.
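The algorithm above can be sketched in plain Python (the function name is mine; no library is assumed). On the four-transaction database of the worked example it recovers exactly the L1, L2 and L3 shown there:

```python
from itertools import combinations

def apriori(transactions, minsup):
    """Plain-Python Apriori: frequent itemsets via downward closure."""
    n = len(transactions)
    items = sorted({i for t in transactions for i in t})
    # Frequent 1-itemsets.
    current = [frozenset([i]) for i in items
               if sum(i in t for t in transactions) / n >= minsup]
    frequent = list(current)
    k = 2
    while current:
        # Candidate k-itemsets: unions of frequent (k-1)-itemsets...
        candidates = {a | b for a in current for b in current if len(a | b) == k}
        # ...kept only if every (k-1)-subset is frequent (downward closure).
        candidates = [c for c in candidates
                      if all(frozenset(s) in frequent for s in combinations(c, k - 1))]
        # Scan the database to keep candidates meeting minimum support.
        current = [c for c in candidates
                   if sum(c <= t for t in transactions) / n >= minsup]
        frequent += current
        k += 1
    return frequent

D = [{1, 3, 4}, {2, 3, 5}, {1, 2, 3, 5}, {2, 5}]
print(sorted(map(sorted, apriori(D, minsup=0.5))))
# [[1], [1, 3], [2], [2, 3], [2, 3, 5], [2, 5], [3], [3, 5], [5]]
```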


The Apriori Algorithm — Example
Database D (minsup = 0.5):

TID | Items
100 | 1 3 4
200 | 2 3 5
300 | 1 2 3 5
400 | 2 5

Scan D ➔ C1: {1}: 2/4, {2}: 3/4, {3}: 3/4, {4}: 1/4, {5}: 3/4
L1 (frequent 1-itemsets): {1}: 2/4, {2}: 3/4, {3}: 3/4, {5}: 3/4

C2 (candidates): {1 2}, {1 3}, {1 5}, {2 3}, {2 5}, {3 5}
Scan D ➔ supports: {1 2}: 1/4, {1 3}: 2/4, {1 5}: 1/4, {2 3}: 2/4, {2 5}: 3/4, {3 5}: 2/4
L2 (frequent 2-itemsets): {1 3}: 2/4, {2 3}: 2/4, {2 5}: 3/4, {3 5}: 2/4

C3 (candidate): {2 3 5} ➔ Scan D ➔ L3: {2 3 5}: 2/4
The Apriori Algorithm — Example (cont'd)

TID | Items
100 | 1 3 4
200 | 2 3 5
300 | 1 2 3 5
400 | 2 5

• Apriori gives possible answers
• We have to prioritize associations:
  – Analyze confidence and lift, e.g. for the frequent itemset {2, 3, 5}:

Antecedent | Consequent | Antecedent support | Confidence | Lift
{2} | {3,5} | 3/4 | 2/3 | conf({2}➔{3,5}) / prob({3,5}) = 0.67 / 0.5 ≈ 1.33
{3} | {2,5} | 3/4 | 2/3 | 0.67 / 0.75 ≈ 0.89
{5} | {2,3} | 3/4 | 2/3 | 0.67 / 0.5 ≈ 1.33
{2,3} | {5} | 2/4 | 2/2 | conf({2,3}➔{5}) / prob({5}) = 1 / 0.75 ≈ 1.33
{3,5} | {2} | 2/4 | 2/2 | 1 / 0.75 ≈ 1.33
{2,5} | {3} | 2/4 | 2/3 | 0.67 / 0.75 ≈ 0.89

Confidence: number of times the antecedent and consequent occur together / number of times the antecedent is present
Lift: Confidence / probability of the consequent


Groceries transaction dataset



Using Apriori in R

see https://www.r-bloggers.com/association-rule-learning-and-the-apriori-algorithm/ for details


Applying the apriori algorithm

[Plot of the discovered rules: color = lift, size = support]


Visualizing the Rules

plot(basket_rules, measure=c("support","lift"),
shading="confidence");



Apriori Limitations
● Computational Complexity

▪ How long does it take to run?

▪ How much memory does it need?


● Approaches

○ Throw more compute / RAM at it

○ Parallelize

○ Increase support

○ Leverage item hierarchy

○ Another algorithm? (FP-Growth)

○ Rare pattern (NB-rules)

■ Rules with low support might be valuable

■ People who buy ______ likely to buy luxury cars

Association Rules : Summary

• Association Rules
  • Are probabilistic statements
  • About relations among features, across elements
  • Use a transaction-itemset data model
  • The strength (statistical significance) of an association rule is measured using support, confidence, lift, etc.

• Applications
  • Market Basket Analysis
  • Any dataset where features take, or can be represented as taking, only two values: 0/1
  • Preprocessing: discretization, feature selection

• Apriori
  • Input: dataset, minsupport
  • Output: association rules
  • Exploits downward closure to optimize the search
  • Lower support ➔ higher computational complexity
  • Confidence and lift as post-processing filters


HYDERABAD BENGALURU
Office and Classrooms Office
Plot 63/A, Floors 1&2, Road # 13, Film Nagar, Incubex, #728, Grace Platina, 4th Floor, CMH Road,
Jubilee Hills, Hyderabad - 500 033 Indira Nagar, 1st Stage, Bengaluru – 560038
+91-9701685511 (Individuals) +91-9502334561 (Individuals)
+91-9618483483 (Corporates) +91-9502799088 (Corporates)

Social Media Classroom


Web: http://www.insofe.edu.in KnowledgeHut Solutions Pvt. Ltd., Reliable Plaza,
Facebook: https://www.facebook.com/insofe
Jakkasandra Main Road, Teacher's Colony, 14th Main
Road, Sector – 5, HSR Layout, Bengaluru - 560102
Twitter: https://twitter.com/Insofeedu
YouTube: http://www.youtube.com/InsofeVideos
SlideShare: http://www.slideshare.net/INSOFE
LinkedIn: http://www.linkedin.com/company/international-school-of-engineering
This presentation may contain references to findings of various reports available in the public domain. INSOFE makes no representation as to their accuracy or that the organization
subscribes to those findings.

