Mike's ML Forge
252 subscribers
130 photos
10 videos
16 files
58 links
Welcome to this channel! Here we're diving deep into the world of Data Science and ML, along with a bit of my personal journey toward becoming a person who can say: "I designed the board, collected the data, trained the model, and deployed it."
I can't keep calm, stg. Machine Learning got me hooked 😭
One minute, I’m tuning hyperparameters… next thing I know, my brain is overfitting! 😂 But for real, ML is WILD—turning data into intelligence feels like magic🙂
How to Improve a Predictive Model?

1. Evaluate Performance

Compare with a baseline model (simple predictor).
Check key metrics (accuracy, RMSE, etc.).

2. Improve Data
Collect more data if possible.
Clean, preprocess, and engineer better features.

3. Choose a Better Model
Try different algorithms (e.g., Decision Trees, SVM, Neural Networks); we'll cover these in future posts.

4. Tune Hyperparameters
Adjust settings like learning rate, tree depth, or batch size.
Use GridSearchCV or RandomizedSearchCV for automated optimization.

5. Prevent Overfitting & Underfitting

Overfitting? Regularization, dropout, or pruning.
Underfitting? More features, deeper models.
Optimize these steps, and your model will improve!
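To make step 1 concrete, here is a minimal sketch of comparing a real model against a baseline predictor. The dataset and models are illustrative (scikit-learn's built-in breast-cancer data, a majority-class dummy baseline, and logistic regression), not anything from this post.

```python
from sklearn.datasets import load_breast_cancer
from sklearn.dummy import DummyClassifier
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split

# Illustrative dataset: scikit-learn's built-in breast-cancer data
X, y = load_breast_cancer(return_X_y=True)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=42)

# Baseline: always predict the most frequent class
baseline = DummyClassifier(strategy="most_frequent").fit(X_train, y_train)

# A real model to compare against the baseline
model = LogisticRegression(max_iter=5000).fit(X_train, y_train)

print("baseline accuracy:", baseline.score(X_test, y_test))
print("model accuracy:   ", model.score(X_test, y_test))
```

If your model can't clearly beat the dummy baseline, the problem is usually in the data or features, not the algorithm.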
So today we're going to see 3 ways to adjust hyperparameters
Hyperparameters are settings on a model that you can adjust to improve its ability to find patterns.
Parameters are the patterns the model itself learns from the data.
Improving hyperparameters by hand means manually adjusting hyperparameter values through trial and error instead of using automated tuning methods like GridSearchCV or RandomizedSearchCV.
How It Works:
*Start with default hyperparameters.
*Train the model and evaluate performance.
*Adjust one hyperparameter at a time (e.g. max_features, n_estimators, max_depth).
*Retrain and compare results.
*Repeat until you find the best settings.
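The steps above can be sketched as a simple loop: change one hyperparameter, retrain, and compare validation scores. The dataset and candidate values here are illustrative assumptions, not from the post.

```python
from sklearn.datasets import load_breast_cancer
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split

# Illustrative dataset; any train/validation split works the same way
X, y = load_breast_cancer(return_X_y=True)
X_train, X_val, y_train, y_val = train_test_split(
    X, y, test_size=0.2, random_state=42)

results = {}
for n in [10, 50, 100]:  # adjust only n_estimators, keep everything else fixed
    model = RandomForestClassifier(n_estimators=n, random_state=42)
    model.fit(X_train, y_train)
    results[n] = model.score(X_val, y_val)  # validation accuracy

best_n = max(results, key=results.get)
print(f"best n_estimators: {best_n} (accuracy {results[best_n]:.3f})")
```

Changing one hyperparameter at a time is what makes the comparison fair: if you change two at once, you can't tell which change helped.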
RandomizedSearchCV (Faster Alternative)

🎯 How it works:

    Randomly selects a subset of hyperparameter combinations instead of trying all.
    Still uses cross-validation to evaluate performance.
    Saves time by focusing on random but diverse samples.

Pros:
✔️ Much faster than GridSearchCV.
✔️ Works well when there are many hyperparameters.

Cons:
❌ Might not find the absolute best combination (since it's random).
❌ Less exhaustive compared to GridSearchCV.
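A minimal RandomizedSearchCV sketch, assuming a random-forest model and an illustrative search space (none of these values come from the post):

```python
from sklearn.datasets import load_breast_cancer
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import RandomizedSearchCV

# Illustrative dataset
X, y = load_breast_cancer(return_X_y=True)

param_dist = {  # 4 * 4 * 2 = 32 possible combinations
    "n_estimators": [10, 50, 100, 200],
    "max_depth": [None, 5, 10, 20],
    "max_features": ["sqrt", "log2"],
}

search = RandomizedSearchCV(
    RandomForestClassifier(random_state=42),
    param_distributions=param_dist,
    n_iter=5,         # randomly sample only 5 of the 32 combinations
    cv=3,             # 3-fold cross-validation for each sampled combination
    random_state=42,
)
search.fit(X, y)
print(search.best_params_, round(search.best_score_, 3))
```

With `n_iter=5` only 5 of the 32 combinations are ever trained, which is where the speedup comes from.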
GridSearchCV (Exhaustive Search)

🔍 How it works:

    Tries every possible combination of hyperparameters from a predefined set.
    Uses cross-validation to evaluate each combination.
    Selects the best performing set.

Pros:
✔️ Finds the best hyperparameters since it checks all options.
✔️ Ensures optimal tuning when the search space is small.

Cons:
❌ Very slow if there are many parameters and values.
❌ Computationally expensive.
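The same idea with GridSearchCV, which tries every combination. A decision tree and a deliberately small grid are used here as illustrative assumptions, since exhaustive search is only practical when the search space is small:

```python
from sklearn.datasets import load_breast_cancer
from sklearn.model_selection import GridSearchCV
from sklearn.tree import DecisionTreeClassifier

# Illustrative dataset
X, y = load_breast_cancer(return_X_y=True)

param_grid = {  # 3 * 2 = 6 combinations, all of them tried
    "max_depth": [3, 5, 10],
    "criterion": ["gini", "entropy"],
}

grid = GridSearchCV(
    DecisionTreeClassifier(random_state=42),
    param_grid,
    cv=5,  # every combination is cross-validated on 5 folds
)
grid.fit(X, y)
print(grid.best_params_, round(grid.best_score_, 3))
```

Six combinations times five folds is 30 model fits; add one more hyperparameter with a few values and the count multiplies, which is exactly the cost the Cons above warn about.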
And you can clearly see the difference in accuracy 🙂
Ofc I'm in the classroom 😁
I already filled and cleaned the missing values, and I'm doing some visualisation too. Mannn, this is so fun 😅
Even the new passport has an LED light 😁
Feature Encoding 101: Prepare Data For Machine Learning

This video covers various feature encoding methods. These are important for turning all sorts of features into meaningful numerical representations.
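As a taste of what the video covers, here is a small sketch of two common encoders from scikit-learn; the toy color/size data is made up for illustration:

```python
import pandas as pd
from sklearn.preprocessing import OneHotEncoder, OrdinalEncoder

# Toy categorical data (illustrative, not from the video)
df = pd.DataFrame({"color": ["red", "green", "blue", "green"],
                   "size": ["S", "M", "L", "M"]})

# One-hot: one binary column per category, no order implied
onehot = OneHotEncoder()
colors = onehot.fit_transform(df[["color"]]).toarray()
print(colors)  # 4 rows x 3 columns (blue, green, red)

# Ordinal: map categories with a natural order to integers
ordinal = OrdinalEncoder(categories=[["S", "M", "L"]])
sizes = ordinal.fit_transform(df[["size"]])
print(sizes.ravel())  # S->0, M->1, L->2
```

The rule of thumb: one-hot for unordered categories like color, ordinal for genuinely ordered ones like size.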
Sun’s up, ideas loading… Let’s go!🙌🏽
If you ever see me staring at a flower for too long… don’t interrupt. It’s a moment of deep appreciation stg😭
What if there is an imbalanced dataset?