Unexpected output behaviour #16

Open
pszyu opened this issue Aug 21, 2017 · 0 comments

pszyu commented Aug 21, 2017

Hello everyone, I have run into a problem with unexpected output behaviour of a batch normalisation layer or a CAddTable.

I would like to inspect the output of these layers through the layer.output attribute, but I found this unexpected behaviour: if the layer is followed by a ReLU unit, then the output of that layer is identical to the ReLU's output (non-negative).

For example, in residual nets there are two arrangements of the batch normalisation layer: with and without a ReLU unit following it. The layers without a following ReLU behave as expected (their outputs contain negative values). However, for the layers followed by a ReLU, the outputs that should be negative become zero.

Does anyone know what is happening here? A small sketch of the behaviour I am seeing is below. Thank you very much.
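
For reference, here is a minimal sketch of an analogous situation, written in PyTorch as an assumption (the original question refers to Torch's layer.output attribute): if the ReLU after the batch normalisation layer is applied in place, it overwrites the very tensor that the preceding layer reports as its output, which would make the saved output look rectified.

```python
import torch
import torch.nn as nn

# Hypothetical minimal block: BatchNorm followed by an in-place ReLU,
# mirroring the BN -> ReLU arrangement found in residual nets.
bn = nn.BatchNorm2d(4)
relu = nn.ReLU(inplace=True)

with torch.no_grad():
    x = torch.randn(2, 4, 8, 8)
    bn_out = bn(x)          # hold a reference to the BN output tensor
    print(bn_out.min())     # negative values are present, as expected
    relu(bn_out)            # in-place ReLU overwrites the same storage
    print(bn_out.min())     # now >= 0: the saved "BN output" looks rectified
```

Whether this is actually the cause depends on how the network in question constructs its ReLU units; the sketch only illustrates the symptom described above.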
