DEV Community

Discussion on: Build a flexible Neural Network with Backpropagation in Python

eternal_learner

Samay, this has been great to read.

Suppose I wanted to add another hidden layer to the NN.

Would I update the backprop to something like:

def backward(self, X, y, o):
    # backward propagate through the network
    self.o_error = y - o
    self.o_delta = self.o_error * self.sigmoidPrime(o)

    self.z3_error = self.o_delta.dot(self.W3.T)
    self.z3_delta = self.z3_error * self.sigmoidPrime(self.z3)

    # note: the second hidden layer's error chains from z3_delta, not o_delta
    self.z2_error = self.z3_delta.dot(self.W2.T)
    self.z2_delta = self.z2_error * self.sigmoidPrime(self.z2)

    self.W1 += X.T.dot(self.z2_delta)
    self.W2 += self.z2.T.dot(self.z3_delta)
    self.W3 += self.z3.T.dot(self.o_delta)
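
For context, here is a minimal self-contained sketch of how the extra layer would fit into the whole network. The layer sizes, the `train` helper, and the weight initialization are my assumptions, not from the tutorial; the sigmoid helpers follow the tutorial's convention that `sigmoidPrime` receives an already-activated value:

```python
import numpy as np

class NeuralNetwork:
    def __init__(self, input_size=2, hidden1=3, hidden2=3, output_size=1):
        # hypothetical layer sizes; three weight matrices for two hidden layers
        self.W1 = np.random.randn(input_size, hidden1)
        self.W2 = np.random.randn(hidden1, hidden2)
        self.W3 = np.random.randn(hidden2, output_size)

    def sigmoid(self, s):
        return 1 / (1 + np.exp(-s))

    def sigmoidPrime(self, s):
        # derivative, assuming s is already a sigmoid output
        return s * (1 - s)

    def forward(self, X):
        self.z2 = self.sigmoid(X.dot(self.W1))        # first hidden activation
        self.z3 = self.sigmoid(self.z2.dot(self.W2))  # second hidden activation
        return self.sigmoid(self.z3.dot(self.W3))     # output

    def backward(self, X, y, o):
        self.o_error = y - o
        self.o_delta = self.o_error * self.sigmoidPrime(o)

        self.z3_error = self.o_delta.dot(self.W3.T)
        self.z3_delta = self.z3_error * self.sigmoidPrime(self.z3)

        # the second hidden layer's error chains from z3_delta
        self.z2_error = self.z3_delta.dot(self.W2.T)
        self.z2_delta = self.z2_error * self.sigmoidPrime(self.z2)

        self.W1 += X.T.dot(self.z2_delta)
        self.W2 += self.z2.T.dot(self.z3_delta)
        self.W3 += self.z3.T.dot(self.o_delta)

    def train(self, X, y):
        o = self.forward(X)
        self.backward(X, y, o)
```

Each delta is computed from the error propagated back through the next layer's weights, so every extra hidden layer adds one more error/delta pair between the output delta and the `W1` update.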