Hello,
I have started implementing the ReLU activation function for the genann library on a fork under my name (https://github.com/kasey-/genann) before sending you a PR:
```c
inline double genann_act_relu(const struct genann *ann unused, double a) {
    return (a > 0.0) ? a : 0.0;
}
```
But I am a bit lost in the way you compute the backpropagation of the neural network. The formula for the derivative of ReLU is trivial: `(a > 0.0) ? 1.0 : 0.0`.
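For reference, the matching derivative function I have in mind would look like this (the `genann_act_relu_derivative` name is just my placeholder, not an existing genann symbol):

```c
/* Derivative of ReLU with respect to its input: 1 for positive
 * inputs, 0 otherwise. Placeholder name mirroring genann_act_relu. */
inline double genann_act_relu_derivative(const struct genann *ann unused, double a) {
    return (a > 0.0) ? 1.0 : 0.0;
}
```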
But I cannot see where I should plug it into your formula, as I do not understand how you compute your backpropagation. Did you implement only the derivative of the sigmoid?
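To make my question concrete: in a generic backpropagation step I would expect the activation derivative to enter the output-layer delta roughly like this (a sketch with my own variable names, not genann's internals):

```c
/* Generic output-layer delta: the error scaled by the activation
 * derivative. For the sigmoid, the derivative can be written from the
 * output itself as o * (1.0 - o); for ReLU it would be
 * (a > 0.0) ? 1.0 : 0.0. Names here are illustrative, not genann's. */
for (j = 0; j < outputs; ++j) {
    delta[j] = (target[j] - output[j]) * act_derivative(output[j]);
}
```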