zkaria gamal
I Built My Own Hands-on AI Tutorial – Chapter 1: Regression (From Scratch + XGBoost)

A few weeks ago, I revisited my old AI/ML projects.

As I looked through the code, I felt something was missing. I was using models like RandomForestRegressor and XGBRegressor, getting decent results… but I didn’t feel I truly understood what was happening under the hood.

So I made a decision:

Instead of consuming more tutorials, I would build my own comprehensive Hands-on AI Tutorial — first for myself, and then for the community.

Today, I’m happy to announce that Chapter 1: Regression is complete! 🎉

What’s Inside Chapter 1

I implemented and compared 5 different regression techniques on real-world datasets:

  • Linear Regression — Implemented from scratch using the Normal Equation (NumPy only)
  • Decision Tree Regression
  • Random Forest Regression
  • XGBoost Regression — This one consistently delivered impressive performance
  • Support Vector Regression (SVR) with linear, RBF, and polynomial kernels
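To make the from-scratch approach concrete, here is a minimal sketch of Linear Regression via the Normal Equation, θ = (XᵀX)⁻¹Xᵀy, using NumPy only. The function names and the tiny dataset are illustrative, not taken from the repository:

```python
import numpy as np

def fit_normal_equation(X, y):
    # Prepend a bias column of ones so theta[0] is the intercept,
    # then solve (X^T X) theta = X^T y (more stable than inverting).
    Xb = np.c_[np.ones(len(X)), X]
    return np.linalg.solve(Xb.T @ Xb, Xb.T @ y)

def predict(theta, X):
    Xb = np.c_[np.ones(len(X)), X]
    return Xb @ theta

# Sanity check on noise-free data generated from y = 2x + 1.
X = np.array([[0.0], [1.0], [2.0], [3.0]])
y = np.array([1.0, 3.0, 5.0, 7.0])
theta = fit_normal_equation(X, y)
print(theta)  # ≈ [1.0, 2.0]  (intercept, slope)
```

Using `np.linalg.solve` instead of explicitly computing the inverse is the standard trick: it is faster and numerically safer when XᵀX is ill-conditioned.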

For every algorithm, I did the following:

  • Built a from-scratch version (where applicable)
  • Compared it with the industry library version (scikit-learn / XGBoost)
  • Explained the math intuitively
  • Ran experiments on multiple datasets (House Prices, Life Expectancy, Advertising, Student Performance, etc.)
  • Evaluated using MSE, RMSE, R², and residual plots
  • Generated visualizations and saved models
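The evaluation metrics above are simple enough to write by hand. A hedged sketch (the toy arrays are made up for illustration):

```python
import numpy as np

def mse(y_true, y_pred):
    # Mean squared error: average squared residual.
    return np.mean((y_true - y_pred) ** 2)

def rmse(y_true, y_pred):
    # Root mean squared error: back in the target's units.
    return np.sqrt(mse(y_true, y_pred))

def r2(y_true, y_pred):
    # R²: 1 minus residual variance over total variance.
    ss_res = np.sum((y_true - y_pred) ** 2)
    ss_tot = np.sum((y_true - np.mean(y_true)) ** 2)
    return 1 - ss_res / ss_tot

y_true = np.array([3.0, 5.0, 7.0, 9.0])
y_pred = np.array([2.5, 5.0, 7.5, 9.0])
print(mse(y_true, y_pred))   # 0.125
print(rmse(y_true, y_pred))  # ≈ 0.354
print(r2(y_true, y_pred))    # 0.975
```

These match `sklearn.metrics.mean_squared_error` and `r2_score`, which is a good way to validate a from-scratch implementation.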

Key Learnings

  • Why simple Linear Regression is still a powerful baseline
  • How Decision Trees can overfit and why ensembles (Random Forest & XGBoost) fix many of those issues
  • The real power of boosting vs bagging
  • The importance of hyperparameter tuning and model evaluation
  • How kernels work in SVR
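On the last point, the kernel choice is easy to see in code. This sketch uses scikit-learn's `SVR` on a synthetic quadratic target (the data is made up, not one of the chapter's datasets): a linear kernel cannot fit it, while RBF and polynomial kernels can.

```python
import numpy as np
from sklearn.svm import SVR

# Synthetic nonlinear target: y = x^2 + small noise.
rng = np.random.default_rng(0)
X = rng.uniform(-3, 3, size=(200, 1))
y = X.ravel() ** 2 + rng.normal(scale=0.1, size=200)

for kernel in ["linear", "rbf", "poly"]:
    model = SVR(kernel=kernel, degree=2)  # degree is used by 'poly' only
    score = model.fit(X, y).score(X, y)   # R² on the training data
    print(f"{kernel:>6}: R2 = {score:.3f}")
```

The linear kernel's R² stays near zero here, while RBF and poly climb close to 1, which is the whole intuition behind the kernel trick: fit a linear model in an implicitly transformed feature space.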

The most satisfying moment was watching XGBoost and Random Forest outperform everything else — and finally understanding why that happens.
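You can reproduce that "why" in a few lines. The sketch below (synthetic data, standard scikit-learn estimators, not the chapter's exact experiments) shows a single unconstrained tree memorizing training noise while a bagged forest generalizes better:

```python
import numpy as np
from sklearn.tree import DecisionTreeRegressor
from sklearn.ensemble import RandomForestRegressor
from sklearn.model_selection import train_test_split

# Noisy sine wave: plenty of noise for a deep tree to memorize.
rng = np.random.default_rng(42)
X = rng.uniform(-3, 3, size=(300, 1))
y = np.sin(X.ravel()) + rng.normal(scale=0.3, size=300)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)

tree = DecisionTreeRegressor(random_state=0).fit(X_tr, y_tr)
forest = RandomForestRegressor(n_estimators=200, random_state=0).fit(X_tr, y_tr)

# The tree scores ~1.0 on training data but drops on test data;
# averaging many trees narrows that gap.
print("tree   train/test R2:", tree.score(X_tr, y_tr), tree.score(X_te, y_te))
print("forest train/test R2:", forest.score(X_tr, y_tr), forest.score(X_te, y_te))
```

Boosting (XGBoost) attacks the same variance problem differently: instead of averaging independent trees, it fits each new tree to the residual errors of the current ensemble.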

Project Structure (Clean & Practical)

```
ml_fundamentals/chapter1/
├── notebooks/          # Interactive Jupyter Notebook
├── src/                # From-scratch implementations
├── docs/               # Deep math explanations
├── configs/            # Easy-to-modify YAML configs
├── data/               # Real datasets
├── results/            # Plots + reports
└── models/             # Saved models
```

Who Is This For?

  • Beginners who know Python and want to start ML properly
  • Juniors who want to move from “copy-paste” to deep understanding
  • Anyone who wants both theory and practical code in one place

Try It Yourself

Repository:

https://github.com/zkzkGamal/hands-on-ai-tutorial

Just clone, install the dependencies, and start with the Chapter 1 notebook.

```shell
git clone https://github.com/zkzkGamal/hands-on-ai-tutorial.git
cd hands-on-ai-tutorial
pip install -r requirements.txt
```

I’m already working on Chapter 2: Classification.

Model Comparison
