Discussion on: CartPole with a Deep Q-Network

Hoiy

However, since the input features can be negative, couldn't ReLU cause dead neurons? To overcome that problem, I decided to use tanh as the activation function.
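For concreteness, here is a minimal sketch of what such a tanh-based Q-network could look like in Keras (the layer sizes, optimizer, and learning rate are illustrative guesses, not the article's exact settings):

```python
# Sketch of a CartPole Q-network using tanh activations.
# Hidden sizes, optimizer, and learning rate are assumptions for illustration.
from tensorflow.keras.models import Sequential
from tensorflow.keras.layers import Dense
from tensorflow.keras.optimizers import Adam

model = Sequential([
    Dense(24, activation='tanh', input_shape=(4,)),  # 4 CartPole state features, which can be negative
    Dense(24, activation='tanh'),
    Dense(2, activation='linear'),                   # Q-values for the 2 actions (left, right)
])
model.compile(optimizer=Adam(learning_rate=1e-3), loss='mse')
```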

I am not sure about the original setting. If it is a Dense layer followed by ReLU, then it still makes sense, because the Dense layer can learn a negative weight that maps a negative input to a positive pre-activation.
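A tiny numeric example of that point (the weight value here is hypothetical): a negative weight turns a negative input into a positive pre-activation, so the ReLU unit still fires:

```python
import numpy as np

def relu(z):
    return np.maximum(0.0, z)

x = np.array([-2.0])          # a negative input feature
w, b = np.array([-1.5]), 0.0  # a learned negative weight (hypothetical value)

pre_activation = w * x + b    # (-1.5) * (-2.0) = 3.0 > 0
print(relu(pre_activation))   # [3.] -- the unit is active, not dead
```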