Understanding ReLU Through Visual Python Examples

Rijul Rajesh

In the previous articles, we used backpropagation and plotted graphs to check that the network predicted values correctly.

In all those examples, we used the Softplus activation function.

Now, let’s use the ReLU (Rectified Linear Unit) activation function, which is one of the most popular activation functions used in deep learning and convolutional neural networks.

ReLU is defined as:

ReLU(x) = max(0, x)

The output range is from 0 to infinity.
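
To see this in action, here is a minimal NumPy sketch of ReLU applied to a few sample values:

import numpy as np

def relu(x):
    # negative inputs are clamped to 0; positive inputs pass through unchanged
    return np.maximum(0, x)

print(relu(np.array([-2.0, -0.5, 0.0, 0.5, 2.0])))
# -> [0.  0.  0.  0.5 2. ]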


Assumed values

w1 = 1.70   b1 = -0.85
w2 = 12.6   b2 = 0.00
w3 = -40.8  b3 = -16
w4 = 2.70

We will use dosage values from 0 to 1.


Step 1: First linear transformation (w1, b1) + ReLU

If we plug in 0 for dosage:

0 × 1.70 + (-0.85) = -0.85
ReLU(-0.85) = 0

So the y-axis value is 0.

If we plug in 0.2:

0.2 × 1.70 + (-0.85) = -0.51
ReLU(-0.51) = 0

Still 0.

If we plug in 0.6:

0.6 × 1.70 + (-0.85) = 0.17
ReLU(0.17) = 0.17

Now the output becomes positive.

As we continue increasing the dosage up to 1, we get a bent blue line: it stays flat at 0 until dosage 0.5, where 1.70 × x − 0.85 crosses zero, and then rises linearly.

Demo code

import numpy as np
import matplotlib.pyplot as plt

x = np.linspace(0, 1, 100)   # dosage values from 0 to 1
w1, b1 = 1.70, -0.85

z1 = w1 * x + b1             # linear transformation of the top node
relu1 = np.maximum(0, z1)    # apply ReLU

plt.plot(x, relu1)
plt.xlabel("Dosage")
plt.ylabel("Activation")
plt.title("ReLU(w1 * x + b1)")
plt.show()


Step 2: Multiply the output by w3 = -40.8

Now we multiply the bent blue line by -40.8.

This flips the curve downward and scales it.
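
Continuing the dosage 0.6 example from Step 1, where the top node's ReLU output was 0.17:

0.17 × (-40.8) = -6.936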

Demo code

w3 = -40.8
scaled_blue = relu1 * w3

plt.plot(x, scaled_blue)
plt.xlabel("Dosage")
plt.ylabel("Value")
plt.title("ReLU Output × w3")
plt.show()


Step 3: Bottom node (w2, b2)

Now we repeat the process for the bottom node using w2 and b2.

Since b2 = 0 and w2 is positive, z2 = 12.6 × x is never negative on our dosage range, so ReLU leaves it unchanged and we get a straight orange line.

Demo code

w2, b2 = 12.6, 0.0

z2 = w2 * x + b2
# ReLU(z2) = z2 here, since z2 is never negative for dosages in [0, 1]

plt.plot(x, z2, color="orange")
plt.xlabel("Dosage")
plt.ylabel("Value")
plt.title("w2 * x + b2")
plt.show()


Step 4: Multiply bottom node by w4 = 2.70

Now multiply this straight line by 2.70.
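
At dosage 0.6:

12.6 × 0.6 = 7.56
7.56 × 2.70 = 20.412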

Demo code

w4 = 2.70
scaled_orange = z2 * w4

plt.plot(x, scaled_orange, color="orange")
plt.xlabel("Dosage")
plt.ylabel("Value")
plt.title("(w2 * x + b2) × w4")
plt.show()



Step 5: Add the two paths together

Now we add the bent blue line and the straight orange line.

This gives us a green wedge-shaped curve.
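
At dosage 0.6, the two paths give:

-6.936 + 20.412 = 13.476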

Demo code

combined = scaled_blue + scaled_orange

plt.plot(x, combined, color="green")
plt.xlabel("Dosage")
plt.ylabel("Value")
plt.title("Combined Signal")
plt.show()



Step 6: Add bias b3 = -16

Now we add the bias b3 = -16 to the combined signal.
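
At dosage 0.6:

13.476 + (-16) = -2.524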

Demo code

b3 = -16
combined_bias = combined + b3

plt.plot(x, combined_bias, color="green")
plt.xlabel("Dosage")
plt.ylabel("Value")
plt.title("Combined Signal + b3")
plt.show()


Step 7: Apply ReLU again

Now we apply ReLU over the green wedge.

This converts all negative values to 0 and keeps positive values unchanged.
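
At dosage 0.6:

ReLU(-2.524) = 0

so this point lands on the flat part of the final curve.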

Demo code

final_output = np.maximum(0, combined_bias)

plt.plot(x, final_output, color="green")
plt.xlabel("Dosage")
plt.ylabel("Activation")
plt.title("Final ReLU Output")
plt.show()

So this is our final result: a bent curve built from ReLU units, which makes the model better suited to non-linear, real-world relationships.
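
For reference, here is a minimal self-contained sketch that condenses all of the steps above into one script (the relu and forward helpers are just shorthand introduced here):

import numpy as np
import matplotlib.pyplot as plt

# parameters from the article
w1, b1 = 1.70, -0.85
w2, b2 = 12.6, 0.0
w3, b3 = -40.8, -16
w4 = 2.70

def relu(x):
    return np.maximum(0, x)

def forward(x):
    top = relu(w1 * x + b1) * w3      # top node: linear -> ReLU -> scale
    bottom = relu(w2 * x + b2) * w4   # bottom node (ReLU is a no-op here)
    return relu(top + bottom + b3)    # sum the paths, add bias, final ReLU

x = np.linspace(0, 1, 100)
plt.plot(x, forward(x), color="green")
plt.xlabel("Dosage")
plt.ylabel("Activation")
plt.title("Final ReLU Output")
plt.show()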

We will explore more about neural networks in the coming articles.

You can try the examples out in the Colab notebook.

Looking for an easier way to install tools, libraries, or entire repositories?
Try Installerpedia: a community-driven, structured installation platform that lets you install almost anything with minimal hassle and clear, reliable guidance.

Just run:

ipm install repo-name

… and you’re done! 🚀


🔗 Explore Installerpedia here
