DEV Community

Akarshit Patial

I Stopped Watching AI Tutorials. Here is the exact code I wrote to actually learn AI

For three months, I was a tutorial hoarder. I watched 20 hours of "AI for Beginners" on 2x speed. I felt smart... until I closed the laptop.

I couldn't build anything.

If I asked the model why it made a mistake, I had no idea.

So, I deleted my watch history. I opened VS Code. And I started typing.

Here is the exact 4-step code journey that turned me from a "watcher" into a budding AI engineer.

Phase 1: Stop being fancy. Just guess the price of a house.

Everyone wants to build ChatGPT. I had to learn how to draw a straight line first.

I wrote this Linear Regression model from scratch. No scikit-learn magic. Just math.

# real_estate_guess.py
# I wrote this on a Sunday night. It felt like magic.

import numpy as np

# Data: House size (sq ft) vs Price ($1000s)
# [Size]     [Price]
X = np.array([600, 800, 1000, 1200, 1500])
y = np.array([120, 150, 200, 240, 300])

# The "Brain" (Just y = mx + b)
m = 0.0  # slope
b = 0.0  # intercept
learning_rate = 0.0000001

# Training loop: Do this 1000 times
for i in range(1000):
    # Make a guess
    y_predicted = m * X + b

    # How wrong were we? (Mean Squared Error)
    error = (y - y_predicted) ** 2
    cost = np.mean(error)

    # Learn from the mistake (Gradient Descent)
    # (Don't panic at the calculus. It just means "adjust slightly")
    m -= learning_rate * np.mean(-2 * X * (y - y_predicted))
    b -= learning_rate * np.mean(-2 * (y - y_predicted))

    if i % 250 == 0:
        print(f"Step {i}: Cost = {cost:.2f}")

# Final test
my_house_size = 1350
price_guess = m * my_house_size + b
print(f"\n🏠 My 1350 sq ft house should cost: ${price_guess:.0f}k")

The Human Truth: When I ran this and saw the cost number go down? I literally texted my mom. It felt like teaching a baby to walk.
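If you want to convince yourself that the loop actually converged, you can compare it against NumPy's closed-form least-squares fit. This is my own sanity check, not part of the original script; `np.polyfit` with degree 1 solves the same "best straight line" problem without any iteration:

```python
# check_fit.py
# Sanity check: compare my gradient-descent line against NumPy's
# closed-form least-squares fit (np.polyfit with degree 1).

import numpy as np

X = np.array([600, 800, 1000, 1200, 1500])
y = np.array([120, 150, 200, 240, 300])

# Closed-form best-fit slope and intercept (no training loop needed)
m_best, b_best = np.polyfit(X, y, 1)

print(f"Closed-form slope: {m_best:.4f}, intercept: {b_best:.2f}")
print(f"Predicted price for 1350 sq ft: ${m_best * 1350 + b_best:.0f}k")
```

If your gradient-descent `m` and `b` land near these values, the loop is working; if not, the learning rate or the gradient signs are the usual suspects.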

Phase 2: Teaching a computer to tell a cat from a dog (The real test)

After linear regression, I hit The Wall. Classification. This is where AI actually decides things.

I used the Iris flower dataset (the "Hello World" of AI). I wrote a K-Nearest Neighbors (KNN) model. No model libraries (scikit-learn only loads and splits the data). Just Euclidean distance.

# knn_classifier.py
# If it walks like a duck and quacks like a duck...
# This code finds the 'neighbors' to decide.

import numpy as np
from collections import Counter
from sklearn.datasets import load_iris
from sklearn.model_selection import train_test_split

# Load the famous flower dataset
iris = load_iris()
X, y = iris.data, iris.target

# Split into training (80%) and testing (20%)
X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.2, random_state=42)

def euclidean_distance(point1, point2):
    """The Pythagorean theorem, but for AI."""
    return np.sqrt(np.sum((point1 - point2) ** 2))

def predict(k, X_train, y_train, new_point):
    # 1. Calculate distance to every known flower
    distances = [euclidean_distance(x, new_point) for x in X_train]

    # 2. Get the indices of the 'k' closest flowers
    k_indices = np.argsort(distances)[:k]

    # 3. Vote on the label
    k_labels = [y_train[i] for i in k_indices]
    most_common = Counter(k_labels).most_common(1)[0][0]

    return most_common

# Test my KNN model
k = 5
predictions = [predict(k, X_train, y_train, point) for point in X_test]

accuracy = np.mean(np.array(predictions) == y_test)
print(f"🌸 My KNN Model Accuracy: {accuracy * 100:.2f}%")

The Human Truth: This failed the first 10 times. I got IndexErrors and 20% accuracy. I realized I forgot to shuffle my data. AI is 90% debugging your own lazy mistakes.
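A quick way to trust a hand-rolled model: run the exact same split through scikit-learn's built-in `KNeighborsClassifier` and compare accuracies. This is my own sanity check, assuming the same k and the default Euclidean metric:

```python
# knn_sanity_check.py
# Cross-check: does scikit-learn's built-in KNN agree with my
# hand-rolled version? Same k, same split, default Euclidean metric.

from sklearn.datasets import load_iris
from sklearn.model_selection import train_test_split
from sklearn.neighbors import KNeighborsClassifier

iris = load_iris()
X_train, X_test, y_train, y_test = train_test_split(
    iris.data, iris.target, test_size=0.2, random_state=42)

clf = KNeighborsClassifier(n_neighbors=5)  # default metric is Euclidean
clf.fit(X_train, y_train)

accuracy = clf.score(X_test, y_test)
print(f"scikit-learn KNN accuracy: {accuracy * 100:.2f}%")
```

If the two numbers diverge by more than a tie-break or two, the bug is in your distance function or your voting logic, not in the data.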

Phase 3: The "Wow" moment (Real Neural Network)

This is the code that made me feel like a wizard. I built a tiny neural network using TensorFlow to recognize handwritten numbers (the MNIST dataset).

# my_first_neural_net.py
# I copied this from a tutorial, but then I broke it and fixed it myself.

import tensorflow as tf
from tensorflow import keras

# Load the data (28x28 images of numbers 0-9)
mnist = keras.datasets.mnist
(X_train, y_train), (X_test, y_test) = mnist.load_data()

# Normalize the data (turn 0-255 grayscale into 0-1)
X_train = X_train / 255.0
X_test = X_test / 255.0

# Build the architecture
model = keras.Sequential([
    keras.layers.Flatten(input_shape=(28, 28)),  # Flatten the image into 1 line
    keras.layers.Dense(128, activation='relu'),  # Layer 1: 128 neurons
    keras.layers.Dropout(0.2),                   # Prevents overfitting (it forgets a little)
    keras.layers.Dense(10, activation='softmax') # Output: 10 possible digits
])

# Compile the brain
model.compile(optimizer='adam',
              loss='sparse_categorical_crossentropy',
              metrics=['accuracy'])

# Train the model (this takes coffee break time)
print("🧠 Training...")
model.fit(X_train, y_train, epochs=5)

# Evaluate
test_loss, test_acc = model.evaluate(X_test, y_test)
print(f"\nβœ… My neural network got {test_acc * 100:.2f}% accuracy on handwritten digits.")

The Human Truth: The first time I ran fit() and saw the accuracy go from 0.10 to 0.98 in 30 seconds... I yelled. Out loud. At my screen.
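One thing tutorials glossed over for me: what the softmax layer actually hands back. It's just 10 numbers that sum to 1, one per digit, and "the prediction" is whichever index is biggest. Here is a minimal NumPy sketch (the logits are made up for illustration, not real model output):

```python
# reading_the_output.py
# What a softmax output layer hands back: 10 probabilities, one per
# digit. The predicted digit is just the index of the largest one.

import numpy as np

def softmax(logits):
    """Turn raw scores into probabilities that sum to 1."""
    exps = np.exp(logits - np.max(logits))  # subtract max for numerical stability
    return exps / exps.sum()

# Hypothetical raw scores from the final Dense layer for one image
logits = np.array([0.1, 0.3, 2.0, 0.2, 0.1, 0.4, 0.1, 5.0, 0.3, 0.2])
probs = softmax(logits)

print(f"Probabilities sum to: {probs.sum():.4f}")
print(f"Predicted digit: {np.argmax(probs)}")  # index of the biggest score
```

With a trained Keras model, the same idea is `np.argmax(model.predict(X_test), axis=1)`.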

Phase 4: The "Real" Coder Skill (Asking AI to help you build AI)

Ironically, the best skill isn't memorizing syntax. It's using AI to help you write AI.

I use GitHub Copilot (or Codeium, which is free). I write the comment, and it suggests the code.

This is how I actually work now:

# function to load a CSV, clean missing values, and return X and y
# (Copilot writes the next 5 lines for me)

# create a random forest classifier with 100 trees and train it
# (Copilot finishes the class instantiation)

# plot the confusion matrix in a heatmap
# (Copilot imports seaborn and writes the loop)

The Human Truth: I don't memorize sklearn parameters anymore. I know what I want to do (classification, regression, clustering). The AI helps me with the how.
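For concreteness, here is roughly what those comment prompts tend to expand into. This is my own sketch of typical generated code, not verbatim Copilot output, and I print the confusion matrix instead of plotting a heatmap so it runs anywhere:

```python
# prompt_expansion.py
# A sketch of what "create a random forest classifier with 100 trees
# and train it" typically expands into. Uses iris as a stand-in dataset.

from sklearn.datasets import load_iris
from sklearn.ensemble import RandomForestClassifier
from sklearn.metrics import confusion_matrix
from sklearn.model_selection import train_test_split

iris = load_iris()
X_train, X_test, y_train, y_test = train_test_split(
    iris.data, iris.target, test_size=0.2, random_state=42)

# "random forest classifier with 100 trees"
clf = RandomForestClassifier(n_estimators=100, random_state=42)
clf.fit(X_train, y_train)

# "plot the confusion matrix" (printed here instead of a seaborn heatmap)
cm = confusion_matrix(y_test, clf.predict(X_test))
print(cm)
print(f"Accuracy: {clf.score(X_test, y_test) * 100:.2f}%")
```

The point stands either way: I describe the *what* in a comment, and reviewing the generated *how* is where the actual learning happens.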

The Master's Checklist (What I actually did)

If you want to stop watching and start mastering, here is the exact path:

  1. Week 1-2: Wrote Linear Regression from scratch (like my first code block).
  2. Week 3-4: Built a KNN classifier without libraries (like my second code block).
  3. Week 5-6: Trained a Neural Network on MNIST (like my third code block).
  4. Week 7+: Started using AI tools (Copilot/ChatGPT) to generate the boilerplate while I focus on the logic.

The honest truth they don't tell you

You will never feel "done." I still don't feel like a master.

But yesterday, a friend asked me, "Can AI predict house prices?" I didn't say, "I don't know." I opened VS Code. I typed import numpy as np. And I started writing.

That is the difference between a viewer and a coder.

Go write your first broken script. Fix it. Then come back and show me.

What code are you struggling with right now? Drop it in the comments. Let's debug together. πŸ‘‡
