Basics of Machine Learning

Machine Learning – Types with Simple Numerical Examples

📘

1) Supervised Learning

Meaning: Learn from labeled examples (inputs X with correct outputs y).

Example: House price: Size (sq.ft) -> Price (lakhs). Train on (1000->50), (1500->75), (2000->100). Learned rule ~ Price = 0.05 * Size. Predict 1800 -> 90 lakhs.
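The house-price example can be run as a tiny Python sketch. A least-squares slope through the origin is one simple way to recover the 0.05 rule; the formula choice is an assumption of this sketch, not the only option.

```python
# Supervised learning sketch: learn Price = k * Size from labeled pairs.
sizes = [1000, 1500, 2000]   # sq.ft (inputs X)
prices = [50, 75, 100]       # lakhs (labels y)

# Least-squares slope through the origin: k = sum(x*y) / sum(x*x)
k = sum(x * y for x, y in zip(sizes, prices)) / sum(x * x for x in sizes)

print(round(k, 2))   # learned rule ~ 0.05
print(k * 1800)      # prediction for 1800 sq.ft ~ 90 lakhs
```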

Has labels · Prediction


🍎

1a) Classification (Supervised)

Meaning: Predict a class label.

Example: Fruit by weight (g). Bananas: 100,110,120. Apples: 150,160,170. New fruit 140. Means: Banana=110, Apple=160. Distances: |140-110|=30, |140-160|=20. Predict Apple (closer to 160).
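The fruit example is a nearest-mean classifier; here is the same arithmetic in Python:

```python
# Classification sketch: assign the new fruit to the class with the closer mean weight.
bananas = [100, 110, 120]
apples = [150, 160, 170]

mean_banana = sum(bananas) / len(bananas)   # 110
mean_apple = sum(apples) / len(apples)      # 160

new_weight = 140
label = "Apple" if abs(new_weight - mean_apple) < abs(new_weight - mean_banana) else "Banana"
print(label)   # Apple (distance 20 vs 30)
```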

Yes/No or Category

📈

1b) Regression (Supervised)

Meaning: Predict a number.

Example: Study hours vs score. Train: (1->30), (3->50), (5->70). Linear fit Score = 10 * Hours + 20 fits these points. For 4 hours -> 60.
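The same linear fit can be computed with ordinary least squares; for these three points it recovers Score = 10 × Hours + 20 exactly:

```python
# Regression sketch: fit a line to (hours, score) pairs and predict a number.
hours = [1, 3, 5]
scores = [30, 50, 70]

n = len(hours)
mx, my = sum(hours) / n, sum(scores) / n
slope = sum((x - mx) * (y - my) for x, y in zip(hours, scores)) / sum((x - mx) ** 2 for x in hours)
intercept = my - slope * mx

print(slope, intercept)          # 10.0 20.0
print(slope * 4 + intercept)     # 60.0 for 4 hours of study
```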

Numeric output

📗

2) Unsupervised Learning

Meaning: Find structure/patterns without labels.

Example: Movie ratings (Action, Comedy): (9,2), (8,3), (2,9), (3,8). K-means forms 2 groups. New viewer (7,2) joins the action group (near (9,2),(8,3)).
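A minimal version of this grouping, assuming the two clusters are already formed and assigning the new viewer to the nearer centroid (a single step of what k-means does):

```python
import math

# Unsupervised sketch: two rating clusters, assign a new point to the closer centroid.
action_fans = [(9, 2), (8, 3)]
comedy_fans = [(2, 9), (3, 8)]

def centroid(points):
    return (sum(p[0] for p in points) / len(points),
            sum(p[1] for p in points) / len(points))

c_action = centroid(action_fans)   # (8.5, 2.5)
c_comedy = centroid(comedy_fans)   # (2.5, 8.5)

new_viewer = (7, 2)
group = ("action" if math.dist(new_viewer, c_action) < math.dist(new_viewer, c_comedy)
         else "comedy")
print(group)   # action
```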

No labels · Pattern discovery

🔵

2a) Clustering (Unsupervised)

Meaning: Group similar points.

Example: Data: [1,2,8,9]. Two clusters -> {1,2} and {8,9}. New point 7 joins {8,9} (closer to 8 and 9).
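A tiny 1-D 2-means run reproduces this split; the seed choice (smallest and largest value) is an assumption of this sketch:

```python
# Clustering sketch: 2-means on [1, 2, 8, 9], then assign a new point.
data = [1, 2, 8, 9]
c1, c2 = 1.0, 9.0   # initial centers (assumed seeds)

for _ in range(5):  # a few assignment/update passes; converges immediately here
    g1 = [x for x in data if abs(x - c1) <= abs(x - c2)]
    g2 = [x for x in data if abs(x - c1) > abs(x - c2)]
    c1, c2 = sum(g1) / len(g1), sum(g2) / len(g2)

new_point = 7
joins = g2 if abs(new_point - c2) < abs(new_point - c1) else g1
print(g1, g2, joins)   # [1, 2] [8, 9] [8, 9]
```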

Natural groups

🧩

3) Semi-supervised Learning

Meaning: Few labeled points + many unlabeled points; use both.

Example: Hours vs Pass/Fail. Labeled: 2->Fail, 6->Pass. Unlabeled: 4, 5, 5.5. The midpoint of the labeled points puts the boundary near ~4; infer 4 ~ Fail (at the boundary), 5 and 5.5 ~ Pass.
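A minimal pseudo-labeling sketch of this example: the boundary comes from the two labeled points, and the unlabeled points receive inferred labels. (A real semi-supervised method would also let the unlabeled density shift the boundary; here the midpoint rule and the "on the boundary counts as Fail" tie-break are assumptions of this sketch.)

```python
# Semi-supervised sketch: few labels + pseudo-labels for the unlabeled points.
labeled = {2: "Fail", 6: "Pass"}
unlabeled = [4, 5, 5.5]

boundary = (2 + 6) / 2   # 4.0, midpoint of the labeled points
pseudo = {x: ("Pass" if x > boundary else "Fail") for x in unlabeled}
print(pseudo)   # {4: 'Fail', 5: 'Pass', 5.5: 'Pass'}
```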

Mix of labeled + unlabeled

🔧

4) Self-supervised Learning

Meaning: Create labels from the data itself (pretext task), then learn useful patterns.

Example: Next-number prediction. Train on sequences like [2,4,6,8]->10 and [5,10,15]->20. Model learns "add a constant". Given [10,12,?] it predicts 14.
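The pretext task can be sketched in two parts: auto-labeling (each prefix is labeled with its next number, so no human labels are needed) and the learned rule. Here the "add a constant" rule is hard-coded for brevity; a real model would distil it from many training sequences.

```python
# Self-supervised sketch: the data labels itself (next-number prediction).
def make_pairs(seq):
    """Auto-label: each prefix's 'label' is the number that follows it."""
    return [(seq[:i], seq[i]) for i in range(2, len(seq))]

def predict_next(seq):
    """The pattern the model would learn here: the step is constant."""
    step = seq[-1] - seq[-2]
    return seq[-1] + step

print(make_pairs([2, 4, 6, 8]))   # [([2, 4], 6), ([2, 4, 6], 8)]
print(predict_next([2, 4, 6, 8]))  # 10
print(predict_next([10, 12]))      # 14
```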

Auto-labels · Representation learning

🎮

5) Reinforcement Learning

Meaning: Learn by trial-and-error to maximize reward.

Example: Two choices A or B. Rewards over 4 trials: A: 0, 2 (avg 1.0); B: 3, 2 (avg 2.5). An epsilon-greedy learner favors B (higher average) while still exploring A.
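An epsilon-greedy chooser over these averages looks like this; epsilon = 0.1 is an assumed exploration rate:

```python
import random

# Reinforcement learning sketch: epsilon-greedy over observed average rewards.
rewards = {"A": [0, 2], "B": [3, 2]}
avg = {a: sum(r) / len(r) for a, r in rewards.items()}   # A: 1.0, B: 2.5

def epsilon_greedy(eps=0.1):
    if random.random() < eps:
        return random.choice(list(rewards))   # explore: try anything
    return max(avg, key=avg.get)              # exploit: best average so far

print(avg)                     # {'A': 1.0, 'B': 2.5}
print(max(avg, key=avg.get))   # B is favored
```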

Rewards · Explore vs exploit

🌐

6) Online Learning

Meaning: Update the model continuously as new data arrives.

Example: Running average of price. Start avg=100. New price 110 -> new avg = 100 + (110-100)/2 = 105. Next price 108 -> new avg = 105 + (108-105)/3 = 106.0.
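The running-average update is a one-line incremental formula; it needs no stored history, which is the point of online learning:

```python
# Online learning sketch: update the mean in place as each new price streams in.
def update_mean(mean, n, new_value):
    """Incremental mean: mean_new = mean + (x - mean) / (n + 1)."""
    n += 1
    mean += (new_value - mean) / n
    return mean, n

mean, n = 100.0, 1
mean, n = update_mean(mean, n, 110)   # 105.0
mean, n = update_mean(mean, n, 108)   # 106.0
print(mean, n)
```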

Streaming updates

🚚

7) Transfer Learning

Meaning: Reuse knowledge from one task/domain in another with little new data.

Example: House price model from City A: Price = 0.05 * Size. Small sample shows City B is ~20% higher. Adjust slope to 0.06. Predict 1800 sq.ft -> 108 lakhs with only a few City B examples.
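A sketch of this slope adjustment; the two City B sample points are hypothetical numbers chosen to show the ~20% gap:

```python
# Transfer learning sketch: reuse City A's slope, rescale it from a few City B samples.
slope_a = 0.05   # source model: Price = 0.05 * Size

city_b_samples = [(1000, 60), (1500, 90)]   # hypothetical (size, price) pairs, ~20% higher
ratios = [price / (slope_a * size) for size, price in city_b_samples]   # each ~1.2
slope_b = slope_a * sum(ratios) / len(ratios)   # ~0.06

print(round(slope_b, 3))      # 0.06
print(round(slope_b * 1800))  # ~108 lakhs for 1800 sq.ft
```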

Reuse + fine-tune

Advanced ML (Bonus)

🤖

8) Deep Learning (Model Family)

Meaning: Neural networks with many layers; can be used for supervised, unsupervised, or RL.

Example: Digit image (3×3 grayscale) classified as 7 or 1. Pixels: [0,1,1; 0,1,0; 0,1,0]. A tiny neural net learns weights; output neuron gives score_7=0.82, score_1=0.18 → predict 7.
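A forward pass of a one-layer net on this image can be sketched as below. The weights are hand-set for illustration, not trained, so the probabilities differ from the 0.82/0.18 in the example; the prediction is still 7.

```python
import math

# Deep learning sketch: one-layer forward pass + softmax on a 3x3 "7".
pixels = [0, 1, 1,
          0, 1, 0,
          0, 1, 0]   # top stroke + vertical stroke

# Hand-set weights (illustrative): "7" rewards the top-right pixel,
# "1" penalises it and rewards only the vertical stroke.
w7 = [0, 0.5, 1.5,  0, 0, 0,  0, 0, 0]
w1 = [0, 0, -1.0,   0, 1, 0,  0, 1, 0]

s7 = sum(p * w for p, w in zip(pixels, w7))   # 2.0
s1 = sum(p * w for p, w in zip(pixels, w1))   # 1.0

p7 = math.exp(s7) / (math.exp(s7) + math.exp(s1))   # softmax over the two scores
print("7" if p7 > 0.5 else "1", round(p7, 2))
```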

Neural nets · Layers

🧠

9) Ensemble Learning

Meaning: Combine multiple models to improve accuracy (bagging/boosting/stacking).

Example: Three classifiers vote on a fruit: M1→Apple, M2→Banana, M3→Banana. Majority vote = Banana (2/3). Numeric aggregation: probs Apple=[0.6,0.3,0.4], Banana=[0.4,0.7,0.6] → avg Banana=0.567 > avg Apple=0.433.
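Both aggregation styles from the example, hard majority vote and soft probability averaging, in a few lines:

```python
from collections import Counter

# Ensemble sketch: combine three classifiers by vote and by averaged probabilities.
votes = ["Apple", "Banana", "Banana"]
majority = Counter(votes).most_common(1)[0][0]   # Banana, 2 of 3 votes

probs = {"Apple": [0.6, 0.3, 0.4], "Banana": [0.4, 0.7, 0.6]}
avg = {fruit: sum(p) / len(p) for fruit, p in probs.items()}
soft = max(avg, key=avg.get)   # Banana: 0.567 > 0.433

print(majority, soft)
```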

Random Forest · Boosting

🛰️

10) Federated Learning

Meaning: Many devices train locally and share only model updates; server aggregates without seeing raw data.

Example: Linear model weight updates from phones: Phone A (n=100) w=0.50, Phone B (n=300) w=0.70. Server aggregate (weighted): w_global = (100×0.50 + 300×0.70) / 400 = 0.65.
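The server-side step (weighted federated averaging) is a one-liner; only the per-device sample counts and weights travel, never the raw data:

```python
# Federated learning sketch: server aggregates local weights, weighted by sample count.
updates = [(100, 0.50),   # Phone A: n=100 samples, local weight 0.50
           (300, 0.70)]   # Phone B: n=300 samples, local weight 0.70

total_samples = sum(n for n, _ in updates)
w_global = sum(n * w for n, w in updates) / total_samples

print(w_global)   # (100*0.50 + 300*0.70) / 400 = 0.65
```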

Privacy · Edge devices

Note
Classification and Regression are supervised subtypes.

Clustering is an unsupervised subtype.

Deep learning is a model family usable across types.