
p005-ReLU-Activation.c Program does not use ReLU activation function correctly #153

Open
johnamostan opened this issue Feb 10, 2021 · 2 comments

@johnamostan
Hi.

I was trying the C dot product function with my own dataset and noticed an issue with the placement of the ReLU activation function.

[Screenshot of the relevant code omitted]

I think the bias addition and the activation function should be interchanged: the bias should be added to the weighted sum first, and ReLU applied afterwards, not the other way around.

Thanks!

Regards,

@johnamostan (Author)

This happens in the p005-ReLU-Activation.c program.

@johnamostan johnamostan changed the title C Program does not use RelU activation function correctly p005-ReLU-Activation.c Program does not use RelU activation function correctly Feb 10, 2021
Devilbinder added a commit to Devilbinder/NNfSiX that referenced this issue Feb 19, 2021
@Devilbinder (Contributor)

The error has been corrected in commit a048024.
