
Scott Antwi

I'm 17, I Have No Laptop, and I Just Built a Neural Network from Scratch

I'm a high school student in Ghana. I don't have a laptop. Everything I code, I do on my phone using Google Colab.

Yesterday, I built a neural network from scratch — just Python and NumPy. No TensorFlow. No PyTorch. Every piece written by hand: forward pass, backpropagation, gradient descent. It recognizes handwritten digits with 96% accuracy.

Here's how.

The Setup

I opened Google Colab on my phone browser, loaded the MNIST dataset (60,000 handwritten digit images), and started coding.
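Before the pixels can go into the network, each 28x28 image gets flattened into a 784-vector, scaled to [0, 1], and the labels one-hot encoded. Here's a sketch of that preprocessing (the function name and the fake demo data are mine, not from the original code):

```python
import numpy as np

# Hypothetical preprocessing step: assumes raw MNIST images arrive as
# uint8 arrays of shape (n, 28, 28) with integer labels 0-9.
def preprocess(images, labels):
    # Flatten each 28x28 image into a 784-vector and scale pixels to [0, 1]
    X = images.reshape(len(images), 784).astype(np.float64) / 255.0
    # One-hot encode labels: digit 3 -> [0,0,0,1,0,0,0,0,0,0]
    Y = np.zeros((len(labels), 10))
    Y[np.arange(len(labels)), labels] = 1.0
    return X, Y

# Demo on fake data standing in for the real dataset
fake_images = np.random.randint(0, 256, size=(5, 28, 28), dtype=np.uint8)
fake_labels = np.array([3, 1, 4, 1, 5])
X, Y = preprocess(fake_images, fake_labels)
print(X.shape, Y.shape)  # (5, 784) (5, 10)
```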

The network has three layers:

  • Input: 784 neurons (each pixel of a 28x28 image)
  • Hidden: 64 neurons with sigmoid activation
  • Output: 10 neurons (one for each digit 0-9)
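In NumPy, that 784 → 64 → 10 stack is just two weight matrices, two bias vectors, and sigmoid in between. A minimal sketch (the init scaling is a common heuristic, not necessarily what I used verbatim):

```python
import numpy as np

rng = np.random.default_rng(0)

# Weight shapes follow the 784 -> 64 -> 10 architecture above.
# Small random init; the 1/sqrt(fan_in) scaling is a common heuristic.
W1 = rng.normal(0, 1, (784, 64)) * np.sqrt(1 / 784)
b1 = np.zeros(64)
W2 = rng.normal(0, 1, (64, 10)) * np.sqrt(1 / 64)
b2 = np.zeros(10)

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def forward(X):
    # X: (batch, 784) of pixel values in [0, 1]
    a1 = sigmoid(X @ W1 + b1)   # hidden activations, (batch, 64)
    a2 = sigmoid(a1 @ W2 + b2)  # output scores, (batch, 10)
    return a1, a2

X = rng.random((4, 784))
a1, a2 = forward(X)
print(a1.shape, a2.shape)  # (4, 64) (4, 10)
```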

The Hard Part

Writing backpropagation by hand. When the network makes a wrong prediction, you need to figure out which weights caused the error and adjust them. The math isn't complicated — it's chain rule from calculus — but implementing it without a framework means you understand exactly what's happening.

No model.fit(). No magic. Just matrix multiplication and derivatives.
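To make "chain rule from calculus" concrete, here's one full training step written out by hand, forward pass through weight update. This is a sketch assuming mean squared error and the 784-64-10 sigmoid network above; my exact loss and learning rate may differ:

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

# One hand-written training step: forward, backward, update.
def train_step(X, Y, W1, b1, W2, b2, lr=0.5):
    n = len(X)
    # Forward pass
    a1 = sigmoid(X @ W1 + b1)
    a2 = sigmoid(a1 @ W2 + b2)
    # Backward pass: chain rule, layer by layer.
    # For MSE, dL/da2 = (a2 - Y); times sigmoid'(z2) = a2 * (1 - a2)
    d2 = (a2 - Y) * a2 * (1 - a2)        # error at output, (n, 10)
    d1 = (d2 @ W2.T) * a1 * (1 - a1)     # error pushed back to hidden, (n, 64)
    # Gradient descent update (in place)
    W2 -= lr * a1.T @ d2 / n
    b2 -= lr * d2.mean(axis=0)
    W1 -= lr * X.T @ d1 / n
    b1 -= lr * d1.mean(axis=0)
    return ((a2 - Y) ** 2).mean()        # loss, to watch it fall

# Toy run on random data: the loss should drop as the net memorizes it
rng = np.random.default_rng(1)
W1 = rng.normal(0, 0.1, (784, 64)); b1 = np.zeros(64)
W2 = rng.normal(0, 0.1, (64, 10)); b2 = np.zeros(10)
X = rng.random((16, 784))
Y = np.eye(10)[rng.integers(0, 10, 16)]
losses = [train_step(X, Y, W1, b1, W2, b2) for _ in range(200)]
print(losses[0], losses[-1])
```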

The Results

After 20 passes through the training data:

  • 95.75% accuracy on test data
  • Correctly identifies most digits on the first try
  • Struggles most with 5s and 8s (they look similar even to humans)
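"Accuracy" here means taking the largest of the 10 output neurons as the prediction and comparing it against the true digit. A sketch of that check, with invented demo values:

```python
import numpy as np

# Hypothetical accuracy check: the predicted digit is the argmax of the
# 10 output neurons, compared against integer labels.
def accuracy(outputs, labels):
    # outputs: (n, 10) network activations; labels: (n,) digits 0-9
    preds = outputs.argmax(axis=1)
    return (preds == labels).mean()

# Three perfectly confident fake predictions: 3, 1, 4
fake_out = np.eye(10)[[3, 1, 4]]
acc = accuracy(fake_out, np.array([3, 1, 2]))
print(acc)  # 2 of 3 correct
```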

I submitted my predictions to the Kaggle Digit Recognizer competition and got a score of 0.947.
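For anyone trying the same competition: Kaggle's Digit Recognizer expects a CSV with a 1-based `ImageId` column and a `Label` column. A sketch of writing that file (function name and demo labels are mine):

```python
import csv
import numpy as np

# Sketch of a Kaggle Digit Recognizer submission file:
# header "ImageId,Label", with ImageId starting at 1.
def write_submission(predictions, path="submission.csv"):
    with open(path, "w", newline="") as f:
        writer = csv.writer(f)
        writer.writerow(["ImageId", "Label"])
        for i, label in enumerate(predictions, start=1):
            writer.writerow([i, int(label)])

# Demo with three made-up predicted digits
write_submission(np.array([7, 2, 1]))
```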

What I Actually Learned

Before this project, I could tell you "neural networks learn from data." After building one from scratch, I can tell you exactly HOW:

  1. Data goes in, gets multiplied by weights, passes through an activation function
  2. The output gets compared to the correct answer
  3. The error flows backward through the network (backpropagation)
  4. Each weight gets adjusted slightly to reduce the error
  5. Repeat 60,000 times
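The five steps shrink down to a toy you can run in your head. One weight, one input, invented numbers, no NumPy needed:

```python
# Toy version of the five steps with a single weight.
# All numbers are invented for illustration.
w = 0.2            # the one weight
x, target = 2.0, 1.0
lr = 0.1           # learning rate

for step in range(3):
    out = w * x                 # 1. data in, multiplied by the weight
    error = out - target        # 2. compare output to the correct answer
    grad = error * x            # 3. error flows backward (chain rule)
    w -= lr * grad              # 4. adjust the weight to reduce the error
    print(round(w, 4))          # 5. repeat — w climbs toward 0.5,
                                #    where w * x equals the target
```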

That's it. That's deep learning at its core. Frameworks hide this behind one line of code. Building it yourself means you actually get it.

Why I Did This

I'm applying to study AI at university. I wanted my GitHub to show that I understand the fundamentals — not just that I can call an API.

I also built a Telegram bot called ScholarFinder that helps students find fully funded scholarships: 50+ of them, searchable by level, field, and region. If you're a student looking for funding, try it.

The Code

Everything is on GitHub: mnist-neural-network

It's one Python file. No dependencies beyond NumPy. If you want to understand how neural networks work under the hood, read it.

What's Next

  • Adding ReLU activation to improve accuracy
  • Building a second hidden layer
  • Trying convolutional neural networks (still from scratch)
  • Improving my Kaggle ranking

If I can build this on a phone in Ghana with no laptop, you can build it too. Stop watching tutorials. Open a notebook and start.


Find me on GitHub or Kaggle.
