Ramya .C
📅 Day 58 of My Data Analytics Journey

Today I dived into Linear Algebra concepts in NumPy 🤓
These operations are essential in Machine Learning, Data Science, and Deep Learning because they help us work with the vectors, matrices, and transformations used in models like Neural Networks.

Here's what I learned 👇


✅ 1️⃣ Dot Product (np.dot())

📌 What it does

Multiplies two vectors/matrices. Used in ML for calculating weights × features.

🧠 Example

```python
import numpy as np

a = np.array([1, 2])
b = np.array([3, 4])

result = np.dot(a, b)
print(result)  # Output: 11
```

Calculation → (1*3 + 2*4) = 11


✅ 2️⃣ Inner Product (np.inner())

📌 What it does

Same as the dot product for 1-D vectors, but behaves differently for higher dimensions.

🧠 Example

```python
a = np.array([1, 2])
b = np.array([3, 4])

print(np.inner(a, b))  # Output: 11
```

For 1-D vectors, inner product = dot product
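
To see where they differ, here's a quick sketch with 2-D arrays (values made up for illustration): np.dot does the standard matrix product, while np.inner sums over the *last* axis of both arrays, which for 2-D arrays works out to `A @ B.T`.

```python
import numpy as np

A = np.array([[1, 2],
              [3, 4]])
B = np.array([[5, 6],
              [7, 8]])

# dot: standard matrix product (rows of A x columns of B)
print(np.dot(A, B))    # [[19 22]
                       #  [43 50]]

# inner: sums over the last axis of both arrays (rows of A x rows of B)
print(np.inner(A, B))  # [[17 23]
                       #  [39 53]]
```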


✅ 3️⃣ Outer Product (np.outer())

📌 What it does

Creates a matrix by multiplying every element of one vector with every element of another.

🧠 Example

```python
print(np.outer(a, b))
```

Output

```
[[3 4]
 [6 8]]
```

✅ 4️⃣ Determinant (np.linalg.det())

📌 What it does

Computes the determinant; a matrix is invertible only if its determinant is non-zero (used in solving linear equations).

🧠 Example

```python
m = np.array([[2, 3],
              [1, 4]])

print(np.linalg.det(m))  # Output: 5.0  (2*4 - 3*1 = 5)
```
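
And here's a quick sketch of the singular case, using a matrix whose second row is a multiple of the first (values made up for illustration): its determinant is 0, so it has no inverse.

```python
import numpy as np

# second row is 2x the first row -> rows are linearly dependent
s = np.array([[1, 2],
              [2, 4]])

# determinant is 0 (up to floating-point rounding),
# so np.linalg.inv(s) would raise a LinAlgError
print(np.linalg.det(s))
```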

✅ 5️⃣ Solve Linear Equations (np.linalg.solve())

📌 Example: Solve Ax = B

```python
A = np.array([[2, 1],
              [1, 3]])

B = np.array([8, 13])
x = np.linalg.solve(A, B)

print(x)  # Output: [2.2 3.6]
```

Solution → x = 2.2, y = 3.6
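
A good habit is to verify the solution by plugging it back in; this little sketch uses np.allclose to compare A @ x against B:

```python
import numpy as np

A = np.array([[2, 1],
              [1, 3]])
B = np.array([8, 13])

x = np.linalg.solve(A, B)

# substitute x back into the system to confirm it satisfies Ax = B
print(np.allclose(A @ x, B))  # True
```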


✅ 6️⃣ Inverse of Matrix (np.linalg.inv())

📌 What it does

Computes the matrix inverse (important in ML algorithms like the normal equation)

🧠 Example

```python
print(np.linalg.inv(m))
```
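
A matrix multiplied by its inverse should give the identity matrix, so a quick np.allclose check makes a nice sanity test (reusing the same m from above):

```python
import numpy as np

m = np.array([[2, 3],
              [1, 4]])
m_inv = np.linalg.inv(m)  # roughly [[0.8, -0.6], [-0.2, 0.4]]

# a matrix times its inverse gives the identity matrix
print(np.allclose(m @ m_inv, np.eye(2)))  # True
```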

✅ 7️⃣ Trace (np.trace())

📌 What it does

Sum of the diagonal elements

🧠 Example

```python
print(np.trace(m))  # Output: 6  (2 + 4)
```

🎯 Summary Table

| Operation | Meaning | NumPy |
| --- | --- | --- |
| Dot Product | Feature-weight multiplication | `np.dot()` |
| Inner Product | Similar to dot (1-D) | `np.inner()` |
| Outer Product | Builds a matrix from two vectors | `np.outer()` |
| Determinant | Tests whether a matrix is invertible | `np.linalg.det()` |
| Solve | Solves Ax = B | `np.linalg.solve()` |
| Inverse | Reverses a matrix's effect | `np.linalg.inv()` |
| Trace | Sum of diagonal values | `np.trace()` |

🚀 Why this matters

These concepts are used in:

  • Machine Learning Model training
  • Deep Learning (Neural Networks)
  • PCA / Dimensionality Reduction
  • Matrix transformations
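
As a tiny taste of the ML use case, a linear model's prediction is just the dot product of a weight vector with a feature vector plus a bias; all the numbers here are made-up illustrative values:

```python
import numpy as np

# hypothetical feature values and learned weights (illustrative only)
features = np.array([1.0, 2.0, 3.0])
weights = np.array([0.5, -0.2, 0.1])
bias = 0.3

# prediction = w . x + b
prediction = np.dot(weights, features) + bias
print(prediction)  # 0.5 - 0.4 + 0.3 + 0.3 = 0.7 (up to floating-point rounding)
```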

Today was a strong foundation day 💪


📂 GitHub Code

🔗 https://github.com/ramyacse21/numpy_workspace/blob/main/linear%20algebra.py

#RamyaAnalyticsJourney
