Referring to the comment in the attached PR, the requested changes were not approved. You may have a look at the PR and the comments there.
The additional ReLU activation will not affect the output in any case; it only adds one extra layer. Hence this is not a bug. Shall we close the issue now?
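For context, here is a minimal sketch (hypothetical, not the repository's actual code, and assuming PyTorch) of the point being made: ReLU is idempotent, so applying it again to an input that has already passed through a ReLU leaves the values unchanged; the extra activation adds a layer without altering the output.

```python
import torch
import torch.nn as nn

class ResidualBlock(nn.Module):
    """Hypothetical residual block illustrating the double activation."""
    def __init__(self, channels):
        super().__init__()
        self.conv1 = nn.Conv2d(channels, channels, kernel_size=3, padding=1)
        self.conv2 = nn.Conv2d(channels, channels, kernel_size=3, padding=1)
        self.relu = nn.ReLU()

    def forward(self, x):
        # x may already be the output of a ReLU from the previous block.
        out = self.relu(self.conv1(x))
        out = self.conv2(out)
        return self.relu(out + x)

# ReLU is idempotent: relu(relu(x)) == relu(x), so activating x a second
# time before the block does not change its values, only adds a layer.
x = torch.randn(1, 8, 16, 16)
relu = nn.ReLU()
assert torch.equal(relu(relu(x)), relu(x))
```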
According to the code, x is activated twice when it is input into the residual block, right?