Day 2: When Reality Punches You in the Face

The brutal truth about Day 1 and why I'm doubling down


Let's Be Honest About Yesterday 😤

I barely made it through Day 1.

There, I said it. While my initial blog post was full of confidence and ambitious plans, the reality of diving into graduate-level linear algebra after years of web development was like trying to drink from a fire hose while someone's screaming at you in a foreign language.

The plan: Master vector operations, implement everything from scratch, solve 20+ problems, write clean documentation.

The reality: I spent 3 hours just trying to remember what the hell a dot product actually means geometrically, not just computationally.

The Humbling Moments 🤕

Gilbert Strang Almost Broke Me

Watching MIT 18.06 Lecture 1, I thought I was following along fine until Strang casually mentioned linear independence and my brain just... stopped. I realized I was nodding along without actually understanding what he was saying. The mathematical intuition that should have been built over years was just missing.

My "From Scratch" Implementation Was Embarrassing

My vector operations library? It was basically just NumPy wrapped in a class with some print statements. I wasn't implementing anything from first principles—I was just moving existing functionality around and calling it "understanding."

The Problem Sets Were Brutal

Those "20+ vector problems from Khan Academy"? I got through 8 before hitting a wall on basic concepts like span and linear combinations. Problems that should have taken 5 minutes were taking 30+ minutes, and I was second-guessing every answer.

Documentation? What Documentation?

By hour 10, I was so mentally drained that my "clean GitHub commits" turned into desperate pushes with commit messages like "vectors maybe working idk" and "fixed thing that was broken probably."

The Identity Crisis Moment 🤔

Around hour 8 yesterday, I had a genuine moment of panic. I was staring at a simple 3x3 matrix multiplication problem, something that should be elementary, and I realized I was doing it mechanically without any geometric intuition.

The question that hit me: Am I actually learning this, or am I just going through the motions?

This is the difference between:

  • Surface learning: Memorizing formulas and procedures
  • Deep understanding: Grasping the fundamental concepts and their relationships

I was definitely doing the former, and for an ML Research Engineer role, that's not going to cut it.

What I Actually Accomplished (The Real Numbers) 📊

Let me be brutally honest about yesterday's deliverables:

✅ Partial Wins:

  • Watched 2 Gilbert Strang lectures (though understanding was patchy)
  • Implemented basic vector operations (poorly, but they work)
  • Solved 8 Khan Academy problems (target was 20)
  • Started understanding the geometric meaning of vectors
  • Realized how much I don't know (actually valuable)

āŒ Clear Failures:

  • No clean documentation written
  • Mathematical derivations incomplete
  • Visualization tools not created
  • Advanced topics barely touched
  • Evening review session skipped due to exhaustion

🤷‍♂️ Mixed Results:

  • Vector class implemented but not from true first principles
  • Problems solved but with too much struggle for basic concepts
  • Blog post written but overly optimistic about Day 1 results

The Deep Dive: Where I Actually Struggled 🔍

Linear Independence - The Mind Bender

I thought I understood this concept, but when I tried to explain it to myself out loud, I realized I was just reciting definitions. The geometric intuition of what it means for vectors to be linearly independent—that no vector in the set can be written as a linear combination of the others—didn't click until my 4th attempt at visualization.
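
To make it concrete for myself, here's the minimal sanity check I've been leaning on. Just a sketch assuming NumPy: a set of vectors is linearly independent exactly when the matrix built from them has rank equal to the number of vectors.

```python
import numpy as np

def are_linearly_independent(vectors):
    """Vectors are linearly independent iff the matrix with them
    as columns has rank equal to the number of vectors."""
    matrix = np.column_stack(vectors)
    return np.linalg.matrix_rank(matrix) == len(vectors)

# Two distinct directions in the plane: independent.
print(are_linearly_independent([np.array([1, 0]), np.array([0, 1])]))  # True

# The third vector is the sum of the first two: dependent.
print(are_linearly_independent([
    np.array([1, 0, 0]),
    np.array([0, 1, 0]),
    np.array([1, 1, 0]),
]))  # False
```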

Dot Product: Computation vs. Meaning

Sure, I can compute a·b = Σaᵢbᵢ, but understanding that it represents the projection of one vector onto another? That it measures how much two vectors "agree" in direction? That took hours of drawing diagrams and multiple YouTube videos to truly grasp.
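
Here's the small sketch that finally connected the two views for me (plain NumPy, with toy vectors I picked so the numbers are easy to check by hand):

```python
import numpy as np

a = np.array([3.0, 1.0])
b = np.array([2.0, 0.0])

# Componentwise definition: a . b = sum of a_i * b_i
dot = np.dot(a, b)

# Geometric reading: a . b = |a| |b| cos(theta),
# so cos(theta) measures how much the two directions "agree".
cos_theta = dot / (np.linalg.norm(a) * np.linalg.norm(b))

# Projection of a onto b: the "shadow" a casts along b's direction.
proj_a_on_b = (dot / np.dot(b, b)) * b

print(dot)           # 6.0
print(cos_theta)     # ~0.9487: the vectors mostly agree in direction
print(proj_a_on_b)   # [3. 0.]: a's shadow on the x-axis
```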

The Span Concept

This one nearly broke me. The idea that the span of a set of vectors is all possible linear combinations sounds simple, but visualizing what this means in 3D space, understanding how it relates to basis vectors, and grasping why it matters for machine learning—that was a genuine struggle.
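
One thing that helped was turning "is this vector in the span?" into code. A rough sketch, assuming NumPy: use least squares to hunt for a linear combination of the given vectors, then check whether it actually reproduces the target.

```python
import numpy as np

def in_span(vectors, target, tol=1e-10):
    """target is in span(vectors) iff some linear combination of
    them reproduces it, i.e. the least-squares fit is exact."""
    A = np.column_stack(vectors)
    coeffs, *_ = np.linalg.lstsq(A, target, rcond=None)
    return np.allclose(A @ coeffs, target, atol=tol)

v1 = np.array([1.0, 0.0, 0.0])
v2 = np.array([0.0, 1.0, 0.0])

print(in_span([v1, v2], np.array([2.0, 3.0, 0.0])))  # True: lies in the xy-plane
print(in_span([v1, v2], np.array([0.0, 0.0, 1.0])))  # False: points out of the plane
```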

Today's Shift in Strategy 🎯

After yesterday's reality check, I'm making some crucial adjustments:

Depth Over Coverage

Instead of trying to implement 5 different concepts poorly, I'm going to focus on truly mastering matrix operations today. Better to understand one thing deeply than to have surface knowledge of many things.

Emphasis on Geometric Intuition

For every mathematical operation I implement, I'm going to force myself to:

  1. Draw it by hand
  2. Visualize it geometrically
  3. Explain it in plain English
  4. Connect it to ML applications

Implementation as Learning Tool

My implementations need to be true learning exercises, not just code that works. I'm going to implement matrix multiplication in multiple ways (a rough sketch of two of them follows the list):

  • Naive approach (to understand the basic operation)
  • Optimized approach (to understand computational efficiency)
  • Block matrix approach (to understand how it scales)
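
Here's a sketch of the naive and block versions, assuming NumPy and, for simplicity, square matrices whose size is a multiple of the block size:

```python
import numpy as np

def matmul_naive(A, B):
    """Triple loop straight from the definition:
    C[i][j] = sum over k of A[i][k] * B[k][j]."""
    n, m = len(A), len(B[0])
    assert len(A[0]) == len(B), "inner dimensions must match"
    C = [[0.0] * m for _ in range(n)]
    for i in range(n):
        for j in range(m):
            for k in range(len(B)):
                C[i][j] += A[i][k] * B[k][j]
    return C

def matmul_block(A, B, block=2):
    """Same arithmetic, done one tile at a time: the layout real
    libraries use so tiles of A and B stay in cache."""
    n = A.shape[0]
    C = np.zeros((n, n))
    for i in range(0, n, block):
        for j in range(0, n, block):
            for k in range(0, n, block):
                C[i:i+block, j:j+block] += (
                    A[i:i+block, k:k+block] @ B[k:k+block, j:j+block]
                )
    return C

A = np.arange(16.0).reshape(4, 4)
B = np.eye(4) * 2

# Both should agree with NumPy's own result.
assert np.allclose(np.array(matmul_naive(A.tolist(), B.tolist())), A @ B)
assert np.allclose(matmul_block(A, B), A @ B)
```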

What Matrix Operations Actually Mean (My Current Understanding) 🧮

Let me test my understanding by explaining matrix multiplication without looking anything up:

Matrix multiplication isn't just a computational trick—it's composition of linear transformations. When you multiply matrix A by matrix B, you're saying "first apply transformation B, then apply transformation A."

This is why matrix multiplication isn't commutative (in general, AB ≠ BA). The order matters because transformations are being composed, not just numbers being multiplied.
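
A tiny numeric check makes this visceral. With a rotation and a stretch (toy matrices of my own choosing), composing them in opposite orders sends the same input to different places:

```python
import numpy as np

rotate = np.array([[0.0, -1.0],    # rotate 90 degrees counterclockwise
                   [1.0,  0.0]])
stretch_x = np.array([[2.0, 0.0],  # stretch x by 2, leave y alone
                      [0.0, 1.0]])

v = np.array([1.0, 0.0])

print(stretch_x @ rotate @ v)  # rotate first, then stretch: [0. 1.]
print(rotate @ stretch_x @ v)  # stretch first, then rotate: [0. 2.]
print(np.allclose(stretch_x @ rotate, rotate @ stretch_x))  # False
```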

In an ML context: When we do forward propagation through a neural network, each layer is essentially a matrix multiplication (a linear transformation) followed by a non-linear activation. Understanding matrix multiplication deeply means understanding how information flows through neural networks.
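
As a toy illustration (the layer sizes and random weights here are made up, not from any real network), a two-layer forward pass really is just two matrix multiplications with a nonlinearity in between:

```python
import numpy as np

rng = np.random.default_rng(0)

x = rng.standard_normal(4)         # input features
W1 = rng.standard_normal((8, 4))   # layer 1: linear map from R^4 to R^8
W2 = rng.standard_normal((3, 8))   # layer 2: linear map from R^8 to R^3

# Forward propagation: matrix multiply, then nonlinearity, then multiply again.
h = np.maximum(0.0, W1 @ x)  # ReLU activation
y = W2 @ h

print(y.shape)  # (3,)
```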

Did I get that right? I think so, but the fact that I'm uncertain shows how much work I still have to do.

The Psychological Battle 🧠

The hardest part of Day 1 wasn't the mathematics—it was the psychological challenge of realizing how much I don't know.

Imposter syndrome was real. Looking at job descriptions asking for people who can "implement transformers from scratch" while I'm struggling with basic linear algebra felt overwhelming.

But here's the reframe: Every ML researcher started somewhere. The difference between me and someone with a PhD isn't that they're smarter—it's that they've spent more time deeply understanding these fundamentals.

I have one advantage: I know how to learn complex technical concepts quickly. Blockchain development taught me that. The challenge is applying that same intensity and systematic approach to mathematics.

Today's Concrete Goals (Learning from Yesterday) 📝

Core Implementation Focus:

  • Matrix multiplication from scratch (3 different approaches)
  • Determinant calculation (both computational and geometric understanding)
  • Matrix inverse (when it exists and why it matters)

Deep Understanding Goals:

  • Geometric interpretation of matrix operations
  • Connection to linear transformations
  • Relevance to neural network operations

Documentation Goals:

  • Clean, well-commented code that teaches
  • Mathematical derivations written out by hand
  • Visualizations that demonstrate concepts

Problem-Solving Goals:

  • 15 matrix problems (down from yesterday's overly ambitious 20+)
  • Focus on understanding each one deeply
  • Connect each problem to ML applications

The Adjusted Timeline Reality ⏰

Yesterday made me realize that my initial 60-day timeline, while still the goal, needs to account for the actual learning curve.

Original assumption: I could absorb graduate-level mathematics at the same pace I learned JavaScript frameworks.

Reality: Mathematical intuition takes time to develop. You can't just "npm install" understanding of eigenvalues.

Adjusted approach: Same 60-day goal, but with more realistic daily expectations and deeper focus on true understanding rather than coverage.

Why I'm Sharing the Struggles 💪

Most learning content online shows only the successes. The clean implementations, the "aha!" moments, the polished final results. But the real learning happens in the struggle, in the moments when you're completely lost and forcing yourself to push through.

For anyone following this journey: If you're also trying to learn ML/AI, know that feeling overwhelmed is normal. The difference between success and failure isn't avoiding the overwhelm—it's pushing through it systematically.

For experienced ML engineers: Was your learning journey similar? How did you develop mathematical intuition? I'd genuinely appreciate any advice in the comments.

Day 2 Commitment 🔥

Today, I'm going to prioritize depth over breadth. I'm going to implement matrix operations not just to make them work, but to truly understand what they represent geometrically and how they connect to machine learning.

I'm going to struggle with determinants until I can explain why they matter for understanding neural network behavior.

I'm going to visualize linear transformations until I can see them in my mind when I look at a matrix.

The goal isn't to check boxes—it's to build genuine understanding that will support everything else I learn in the next 58 days.

Tomorrow's Preview 🔮

Day 3 will focus on eigenvalues and eigenvectors—concepts that are absolutely crucial for understanding how neural networks learn but are notoriously difficult to grasp intuitively.

If today goes better than yesterday (which it has to), I'll dive into why eigenvalues matter for understanding the behavior of gradient descent and how eigenvectors relate to the principal directions of data.

If today is another struggle (which is possible), I'll adjust again and focus even more deeply on the fundamentals.


The journey continues. Reality has been brutal, but I'm not backing down.

To everyone following along: Thank you for the encouragement on Day 1. This is harder than I expected, but that just makes it more worth doing. (Got 2 likes 🥹)

See you tomorrow for Day 3: Eigenvalues, Eigenvectors, and (Hopefully) Some Actual Understanding.


How do you handle the psychological challenge of learning really difficult technical concepts? Have you ever felt completely overwhelmed when starting something new? Let me know in the comments—misery loves company, but so does determination.
