Hello everyone, I've run into unexpected output behaviour from a batch normalisation layer or a CAddTable.
I would like to inspect the output of these layers via the layer.output attribute, but I found this unexpected behaviour: if the layer is followed by a ReLU unit, its recorded output is identical to the ReLU's output (non-negative).
For example, in the residual nets there are two arrangements: a batch normalisation layer with a ReLU unit following it, and one without. Those without a following ReLU behave as expected (their outputs contain negative values). However, for those with a ReLU following, the entries that should be negative become zero.
Does anyone know what is happening here? Thank you a lot.
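A possible cause (an assumption, since the model code isn't shown): many residual-net implementations construct the ReLU in-place, e.g. nn.ReLU(true) in Torch, so the ReLU clamps the very tensor the previous layer stored as its output, instead of allocating a new one. A minimal numpy sketch of that aliasing effect:

```python
import numpy as np

# Hypothetical stand-in for the batch-norm layer's stored output tensor.
bn_output = np.array([-1.5, 0.3, -0.2, 2.0])

# An in-place ReLU (like nn.ReLU(true)) operates on the same buffer:
# no copy is made, so relu_output and bn_output alias each other.
relu_output = bn_output
np.maximum(relu_output, 0.0, out=relu_output)

# Inspecting the batch-norm "output" afterwards shows only
# non-negative values, even though it originally held negatives.
print(bn_output)
```

If this is what's happening, replacing the in-place ReLU with a non-in-place one (nn.ReLU() in Torch) before inspection should make layer.output keep its negative values, at the cost of extra memory.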