In the previous article, we went through cross entropy for backpropagation with Softmax outputs.
In this article, we will compute the total cross entropy.
When we calculate the cross entropy for each prediction, we get the following values:
Setosa = 0.56
Virginica = 0.54
Versicolor = 0.65
To get the total error for the neural network, all we do is add up the cross entropy values.
So the total becomes:
0.56 + 0.54 + 0.65 = 1.75
This value, 1.75, is the total cross entropy.
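As a quick sanity check, here is a minimal Python sketch that just sums the three values quoted above (the variable names are ours, for illustration):
import numpy as np
# Per-prediction cross entropy values from above: Setosa, Virginica, Versicolor
cross_entropies = np.array([0.56, 0.54, 0.65])
# The total error is simply the sum of the individual cross entropy values
total_cross_entropy = cross_entropies.sum()
print(f"Total cross entropy: {total_cross_entropy:.2f}")  # 1.75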
We then use backpropagation to adjust the weights and biases in order to minimize this error.
Now let us plot the cross entropy loss against the predicted probability.
import numpy as np
import matplotlib.pyplot as plt
# Probability values from 0 to 1 (avoid exact 0 and 1 to prevent log issues)
p = np.linspace(0.001, 0.999, 500)
# Cross entropy loss for true label = 1
loss = -np.log(p)
plt.figure()
plt.plot(p, loss)
plt.xlabel("Predicted probability (p)")
plt.ylabel("Cross entropy loss")
plt.title("Cross Entropy Loss vs Predicted Probability")
plt.show()
We can plug values of p between 0 and 1 into the cross entropy function and plot the resulting loss on the Y axis. On the X axis, we locate the predicted probability for Setosa, which is 0.57; this value lies roughly in the middle of the range.
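As a quick check, assuming the loss for the true class is -log(p) as in the plot above, plugging in the Setosa probability recovers the cross entropy value we used earlier:
import numpy as np
p_setosa = 0.57                  # predicted probability for Setosa
loss_setosa = -np.log(p_setosa)  # cross entropy when Setosa is the true class
print(f"{loss_setosa:.2f}")      # ~0.56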
From the graph, we can see that the loss shoots up as p gets close to 0. The Y axis represents the loss, which measures how bad the prediction is. Because of this steep rise, when the neural network makes a very bad prediction (it assigns a low probability to the true class), cross entropy produces a relatively large gradient, and the network can take a correspondingly large step toward a better prediction, since the slope of the tangent line is large in that region.
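To make that concrete, the derivative of -log(p) with respect to p is -1/p, so the gradient's magnitude grows without bound as p approaches 0. Here is a small sketch (the probability values are just examples):
# Gradient of the cross entropy -log(p) with respect to p is -1/p
for p in [0.9, 0.5, 0.1, 0.01]:
    grad = -1.0 / p
    print(f"p = {p:.2f}, gradient = {grad:.1f}")
# A confident correct prediction (p = 0.90) gives a gradient of about -1.1,
# while a very wrong one (p = 0.01) gives -100, pushing a much larger update.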
In the next article, we will see how backpropagation is actually performed using cross entropy.
Looking for an easier way to install tools, libraries, or entire repositories?
Try Installerpedia: a community-driven, structured installation platform that lets you install almost anything with minimal hassle and clear, reliable guidance.
Just run:
ipm install repo-name
… and you're done!

