Apriori Algorithm - Examples
Example - 1
Let’s see an example of the Apriori Algorithm.
Find the frequent itemsets and generate association rules for this dataset. Assume a minimum support threshold of s = 33.33% and a minimum confidence threshold of c = 60%.
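For reference, these are the two measures used in the calculations below: sup(X) is the number of transactions that contain every item of X, an itemset is frequent when its support count is at least s × |D| (here 33.33% of the transactions), and a candidate rule X ⇒ Y is kept only when its confidence reaches the minimum confidence threshold:

```
\mathrm{sup}(X) = \bigl|\{\, T \in D : X \subseteq T \,\}\bigr|, \qquad
\mathrm{confidence}(X \Rightarrow Y) = \frac{\mathrm{sup}(X \cup Y)}{\mathrm{sup}(X)} \times 100\%
```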
Let’s start,
At the last iteration, only one candidate itemset reaches the minimum support count of 2, so the largest frequent itemset is:
Frequent Itemset (I) = {Hot Dogs, Coke, Chips}
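As a rough illustration, here is a minimal Python sketch of the level-wise Apriori search. The six-transaction list is hypothetical, chosen only so that its support counts agree with the numbers quoted in the rule calculations below (the article's original table is not reproduced here), and the prune step that checks every (k-1)-subset of a candidate is omitted for brevity.

```python
from itertools import combinations

# Hypothetical 6-transaction table: NOT the article's original data, only chosen
# so that its support counts match the numbers used in the rule calculations below.
transactions = [
    {"Hot Dogs", "Buns", "Ketchup"},
    {"Hot Dogs", "Buns"},
    {"Hot Dogs", "Coke", "Chips"},
    {"Chips", "Coke"},
    {"Chips", "Ketchup"},
    {"Hot Dogs", "Coke", "Chips"},
]
min_support_count = 2  # 33.33% of 6 transactions

def support(itemset):
    """Support count: number of transactions containing every item of `itemset`."""
    return sum(1 for t in transactions if itemset <= t)

def apriori(min_support_count):
    """Level-wise search: candidate (k+1)-itemsets are built only from frequent k-itemsets."""
    items = sorted({item for t in transactions for item in t})
    candidates = [frozenset([i]) for i in items]
    levels = []  # levels[k-1]: dict mapping each frequent k-itemset to its support count
    k = 1
    while candidates:
        frequent = {c: support(c) for c in candidates if support(c) >= min_support_count}
        if not frequent:
            break
        levels.append(frequent)
        # Join step: combine frequent k-itemsets into candidate (k+1)-itemsets.
        candidates = list({a | b for a, b in combinations(frequent, 2) if len(a | b) == k + 1})
        k += 1
    return levels

for k, level in enumerate(apriori(min_support_count), start=1):
    print(f"Frequent {k}-itemsets:", {tuple(sorted(s)): c for s, c in level.items()})
# The last level contains a single 3-itemset: ('Chips', 'Coke', 'Hot Dogs') with support 2.
```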
Association rules:
- [Hot Dogs ^ Coke] => [Chips] // confidence = sup(Hot Dogs ^ Coke ^ Chips) / sup(Hot Dogs ^ Coke) = (2/2) * 100 = 100% // Selected
- [Hot Dogs ^ Chips] => [Coke] // confidence = sup(Hot Dogs ^ Coke ^ Chips) / sup(Hot Dogs ^ Chips) = (2/2) * 100 = 100% // Selected
- [Coke ^ Chips] => [Hot Dogs] // confidence = sup(Hot Dogs ^ Coke ^ Chips) / sup(Coke ^ Chips) = (2/3) * 100 = 66.67% // Selected
- [Hot Dogs] => [Coke ^ Chips] // confidence = sup(Hot Dogs ^ Coke ^ Chips) / sup(Hot Dogs) = (2/4) * 100 = 50% // Rejected
- [Coke] => [Hot Dogs ^ Chips] // confidence = sup(Hot Dogs ^ Coke ^ Chips) / sup(Coke) = (2/3) * 100 = 66.67% // Selected
- [Chips] => [Hot Dogs ^ Coke] // confidence = sup(Hot Dogs ^ Coke ^ Chips) / sup(Chips) = (2/4) * 100 = 50% // Rejected
There are four strong association rules (confidence above the 60% minimum confidence threshold).
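Continuing the sketch above (it reuses the hypothetical `transactions` and the `support` helper), the candidate rules and their confidences can be enumerated like this:

```python
# Reuses `support` and `transactions` from the previous sketch.
from itertools import combinations

frequent_itemset = frozenset({"Hot Dogs", "Coke", "Chips"})
min_confidence = 60.0  # percent

# Every non-empty proper subset X of the frequent itemset I gives a candidate
# rule X => (I - X); its confidence is sup(I) / sup(X).
for r in range(1, len(frequent_itemset)):
    for antecedent in combinations(sorted(frequent_itemset), r):
        antecedent = frozenset(antecedent)
        consequent = frequent_itemset - antecedent
        confidence = support(frequent_itemset) / support(antecedent) * 100
        verdict = "Selected" if confidence >= min_confidence else "Rejected"
        print(f"{sorted(antecedent)} => {sorted(consequent)}: {confidence:.2f}% // {verdict}")
```

Running it prints the same six rules as above, with the four strong ones marked Selected.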
Example - 2
Let’s see another example of the Apriori Algorithm.
Find the frequent itemsets for this dataset. Assume a minimum support count of s = 3.
At the last iteration, only one candidate itemset reaches the minimum support count of 3, so the largest frequent itemset is:
Frequent Itemset (I) = {Coke, Chips}
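If Example 2's transaction table is at hand, the same `apriori` sketch from Example 1 can be reused as-is by swapping in those transactions and setting `min_support_count = 3`; its last level should then contain only {Coke, Chips}.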
Note: Refer to https://kaumadiechamalka100.medium.com/apriori-algorithm-f7fb30793274 to get a basic idea of the Apriori algorithm.