MSApriori Algorithm Steps
We construct a case in which a (k-1)-item subset is infrequent, so that the MSApriori
algorithm prunes the corresponding candidate itemset.
Dataset
TID Items
1 A, B, C
2 A, C, D
3 B, C, D
4 A, B
5 B, D
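As a minimal sketch, the transactions above could be represented as Python sets, with a small helper to count supports (the names transactions and support_count are illustrative, not part of MSApriori itself):

```python
# The five transactions from the table above.
transactions = [
    {"A", "B", "C"},  # TID 1
    {"A", "C", "D"},  # TID 2
    {"B", "C", "D"},  # TID 3
    {"A", "B"},       # TID 4
    {"B", "D"},       # TID 5
]

def support_count(itemset, transactions):
    """Number of transactions that contain every item in `itemset`."""
    return sum(1 for t in transactions if set(itemset) <= t)

# Per-item support counts: A=3, B=4, C=3, D=3.
item_supports = {item: support_count({item}, transactions)
                 for item in sorted(set().union(*transactions))}
```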
MIS Values
Item  MIS (%)  MIS (count)
A     40%      2 transactions
B     50%      3 transactions
C     30%      2 transactions
D     20%      1 transaction

Support Counts
Item  Support (count)
A     3
B     4
C     3
D     3
✅ All items meet their MIS values, so they are included in L1 (frequent 1-itemsets).
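A sketch of this step, assuming the percentage MIS values are converted to counts by rounding up (which reproduces the 2/3/2/1 counts above); the support counts are taken from the table rather than recomputed:

```python
n = 5  # number of transactions

# MIS as percentages, converted to counts by ceiling division.
mis_pct = {"A": 40, "B": 50, "C": 30, "D": 20}
mis_count = {item: (pct * n + 99) // 100 for item, pct in mis_pct.items()}
# -> {"A": 2, "B": 3, "C": 2, "D": 1}

# Support counts from the table above.
item_supports = {"A": 3, "B": 4, "C": 3, "D": 3}

# L1: items whose support meets their own MIS.
L1 = [item for item in mis_pct if item_supports[item] >= mis_count[item]]
# Every item qualifies, so L1 = ["A", "B", "C", "D"].
```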
✅ Frequent 2-itemsets:
1. {D, C} → Support = 2 ✅
2. {D, B} → Support = 2 ✅
3. {C, B} → Support = 3 ✅
4. {C, A} → Support = 2 ✅
5. {A, B} → Support = 2 ✅
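A sketch of the frequency test used at level 2, assuming the usual MSApriori convention that an itemset's minimum-support threshold is the smallest MIS among its items (the support-difference constraint of the full algorithm is left out, and the pair supports are the ones listed above rather than recomputed):

```python
def min_mis(itemset, mis_count):
    """An itemset's threshold in MSApriori: the lowest MIS among its members."""
    return min(mis_count[item] for item in itemset)

# MIS counts and 2-itemset supports as stated above.
mis_count = {"A": 2, "B": 3, "C": 2, "D": 1}
pair_support = {
    ("D", "C"): 2, ("D", "B"): 2, ("C", "B"): 3,
    ("C", "A"): 2, ("A", "B"): 2,
}

# Keep the pairs whose support reaches their (minimum-MIS) threshold.
F2 = [pair for pair, sup in pair_support.items()
      if sup >= min_mis(pair, mis_count)]
# All five pairs pass, matching the list above.
```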
Candidate 3-itemsets
Itemset    Support  Frequent?
{D, C, B}  2        ✅ Yes
{D, C, A}  1        ❌ No
{D, A, B}  1        ❌ No
{C, A, B}  2        ✅ Yes
Frequent 3-itemsets:
✅ {D, C, B}
✅ {C, A, B}
Candidate 4-itemset:
❌ {D, C, A, B} (pruned because its subsets {D, C, A} and {D, A, B} are infrequent)
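A sketch of the pruning step just illustrated: a candidate k-itemset is discarded when one of its (k-1)-item subsets is not frequent. This is the plain Apriori-style subset check; MSApriori proper relaxes it slightly for the subset that drops the lowest-MIS item, which is ignored here for simplicity.

```python
from itertools import combinations

def should_prune(candidate, frequent_k_minus_1):
    """True if some (k-1)-subset of `candidate` is not frequent."""
    frequent = {frozenset(itemset) for itemset in frequent_k_minus_1}
    return any(frozenset(subset) not in frequent
               for subset in combinations(candidate, len(candidate) - 1))

# Frequent 3-itemsets from the table above.
F3 = [("D", "C", "B"), ("C", "A", "B")]

# {D, C, A, B} is pruned: its subsets {D, C, A} and {D, A, B} are missing from F3.
print(should_prune(("D", "C", "A", "B"), F3))  # True
```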
Key Takeaways
Association rules are extracted from frequent itemsets by splitting each itemset into an
antecedent (LHS) and a consequent (RHS) and evaluating the rule's confidence:
Confidence(X → Y) = Support(X ∪ Y) / Support(X)
where:
X is the antecedent (LHS), Y is the consequent (RHS), and Support(X ∪ Y) is the support count
of the full itemset.
After calculating confidence, we keep only the rules whose confidence is at least MinConfidence
(here, 60%):
1. {C, A} → B (100%)
2. {C, B} → A (66.67%)
3. {A, B} → C (100%)
4. {C} → {A, B} (66.67%)
5. {D, C} → B (100%)
6. {D, B} → C (100%)
7. {C, B} → D (66.67%)
8. {D} → {C, B} (66.67%)
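As a check on the list above, a short sketch that recomputes each rule's confidence from the support counts stated in this worked example (the counts are hard-coded rather than re-derived from the transactions):

```python
# Support counts taken from the worked example above.
support = {
    frozenset("CA"): 2, frozenset("CB"): 3, frozenset("AB"): 2,
    frozenset("DC"): 2, frozenset("DB"): 2,
    frozenset("C"): 3, frozenset("D"): 3,
    frozenset("CAB"): 2, frozenset("DCB"): 2,
}

MIN_CONFIDENCE = 0.60

# The eight rules listed above, as (antecedent, consequent) pairs.
rules = [
    ({"C", "A"}, {"B"}), ({"C", "B"}, {"A"}), ({"A", "B"}, {"C"}),
    ({"C"}, {"A", "B"}), ({"D", "C"}, {"B"}), ({"D", "B"}, {"C"}),
    ({"C", "B"}, {"D"}), ({"D"}, {"C", "B"}),
]

for lhs, rhs in rules:
    conf = support[frozenset(lhs | rhs)] / support[frozenset(lhs)]
    verdict = "keep" if conf >= MIN_CONFIDENCE else "drop"
    print(f"{sorted(lhs)} -> {sorted(rhs)}: confidence = {conf:.2%} ({verdict})")
```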