Hello everyone!
I am just asking how I can use the ReLU activation function in the neural net library of SciSnap?
I want to get outputs greater than 1!
Is it possible at all?
If someone knows how to do it, please help me.
Thank you!
ReLU(x) = { x if x >= 0; βx if x < 0 }  (β = 0 gives plain ReLU; a small β > 0 gives the leaky variant)
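In plain Python (just for illustration, these are not SciSnap blocks), that piecewise definition looks like this, with beta as the slope used for negative inputs:

def relu(x):
    # Plain ReLU: positive values pass through unchanged, negatives become 0.
    return x if x >= 0 else 0.0

def leaky_relu(x, beta=0.01):
    # Leaky/parametric ReLU: keeps a small slope beta for negative inputs.
    return x if x >= 0 else beta * x

print(relu(4.0))         # 4.0 -- values above 1 pass straight through
print(leaky_relu(-2.0))  # -0.02

Note that for positive inputs ReLU passes the value straight through, so its output is not limited to 1.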
This is not what an activation function should do. It should have a positive slope everywhere, stay between -1 and 1, and keep that slope over a wide enough range of inputs. Sigmoid is probably a better choice.
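For comparison, sigmoid squashes every input into the open interval (0, 1), so a network whose output neuron uses sigmoid can never predict a value above 1. A minimal sketch of the math (plain Python, illustration only):

import math

def sigmoid(x):
    # Logistic sigmoid: the result is always strictly between 0 and 1.
    return 1.0 / (1.0 + math.exp(-x))

print(sigmoid(10.0))   # about 0.99995 -- approaches 1 but never exceeds it
print(sigmoid(-10.0))  # about 0.00005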
Thank you for your response!
But what I meant is that even when I set the target output to 4, the prediction comes back as 1.
It doesn't work
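That capping behaviour is exactly what a bounded output activation such as sigmoid produces: whatever target you train on, the prediction stays below 1. I don't know SciSnap's internals, so take this as an assumption, but the usual workaround is either to use an unbounded activation (identity or ReLU) on the output layer, or to rescale the targets into [0, 1] for training and scale the predictions back afterwards. A rough sketch of the rescaling idea in plain Python (SCALE, to_unit and from_unit are hypothetical helpers, not SciSnap functions):

SCALE = 4.0  # largest target value in the training data

def to_unit(y):
    # Map targets into [0, 1] so a sigmoid output can represent them.
    return y / SCALE

def from_unit(y_hat):
    # Map the network's (0, 1) prediction back to the original range.
    return y_hat * SCALE

targets = [0.0, 1.0, 2.5, 4.0]
scaled_targets = [to_unit(y) for y in targets]  # train on these instead
prediction = 0.95                               # example raw network output
print(from_unit(prediction))                    # 3.8 in the original units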
Could you share your project so we can take a look?