DEV Community


Discussion on: Build a flexible Neural Network with Backpropagation in Python

eternal_learner

Samay, this has been great to read.

Suppose I wanted to add another hidden layer to the NN.

Would I update the backprop to something like:

def backward(self, X, y, o):
    # backward propagate through the network
    self.o_error = y - o
    self.o_delta = self.o_error*self.sigmoidPrime(o)

    self.z3_error = self.o_delta.dot(self.W3.T)
    self.z3_delta = self.z3_error*self.sigmoidPrime(self.z3)

    self.z2_error = self.z3_delta.dot(self.W2.T)
    self.z2_delta = self.z2_error*self.sigmoidPrime(self.z2)

    self.W1 += X.T.dot(self.z2_delta)
    self.W2 += self.z2.T.dot(self.z3_delta)
    self.W3 += self.z3.T.dot(self.o_delta)
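For what it's worth, here is how I imagine the whole class would look with the extra layer, so the shapes line up end to end. This is only a sketch: I'm assuming the tutorial's (2, 3, 1) sizes with a second hidden layer also of size 3 (my choice, not from the article), treating z2 and z3 as the activations of the two hidden layers, and keeping the tutorial's convention that sigmoidPrime receives an already-sigmoided value.

```python
import numpy as np

class NeuralNetwork:
    def __init__(self):
        # sizes assumed for illustration; second hidden layer is my addition
        self.inputSize = 2
        self.hiddenSize = 3
        self.outputSize = 1
        self.W1 = np.random.randn(self.inputSize, self.hiddenSize)
        self.W2 = np.random.randn(self.hiddenSize, self.hiddenSize)
        self.W3 = np.random.randn(self.hiddenSize, self.outputSize)

    def sigmoid(self, s):
        return 1 / (1 + np.exp(-s))

    def sigmoidPrime(self, s):
        # derivative of sigmoid, given s is already a sigmoid output
        return s * (1 - s)

    def forward(self, X):
        self.z2 = self.sigmoid(np.dot(X, self.W1))        # first hidden activation
        self.z3 = self.sigmoid(np.dot(self.z2, self.W2))  # second hidden activation
        o = self.sigmoid(np.dot(self.z3, self.W3))        # output
        return o

    def backward(self, X, y, o):
        # propagate the error backwards through all three weight matrices
        self.o_error = y - o
        self.o_delta = self.o_error * self.sigmoidPrime(o)

        self.z3_error = self.o_delta.dot(self.W3.T)
        self.z3_delta = self.z3_error * self.sigmoidPrime(self.z3)

        self.z2_error = self.z3_delta.dot(self.W2.T)
        self.z2_delta = self.z2_error * self.sigmoidPrime(self.z2)

        self.W1 += X.T.dot(self.z2_delta)
        self.W2 += self.z2.T.dot(self.z3_delta)
        self.W3 += self.z3.T.dot(self.o_delta)
```

Each error term is the next layer's delta projected back through that layer's weights, and each weight update is the incoming activation transposed, dotted with the outgoing delta, so the pattern from the two-layer version just repeats once more.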