In RefineNet of the ConvNets.ipynb notebook, the function rcu (residual convolution unit) contains the following code:
x = ReLU()(tensor)
x = Conv2D(f, 3, padding='same')(x)
x = ReLU()(tensor)
x = Conv2D(f, 3, padding='same')(x)
This should probably be replaced by:
x = ReLU()(tensor)
x = Conv2D(f, 3, padding='same')(x)
x = ReLU()(x)
x = Conv2D(f, 3, padding='same')(x)
i.e. tensor in the second ReLU should be replaced by x. Otherwise, the output of the first ReLU+Conv2D pair is discarded and overwritten by the second ReLU+Conv2D pair. This reading is confirmed by the paper, where the RCU (Residual Conv Unit) is a sequential ReLU->Conv2D->ReLU->Conv2D block.
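For reference, here is a minimal sketch of how the full rcu function could look with the fix applied, assuming the notebook's Keras functional-API style. The import path, the function signature, and the final residual Add (which assumes the input already has f channels) are my assumptions based on the paper's description of the RCU, not code taken from the notebook:

from tensorflow.keras.layers import ReLU, Conv2D, Add

def rcu(tensor, f):
    # two ReLU -> Conv2D blocks, chained through x rather than tensor
    x = ReLU()(tensor)
    x = Conv2D(f, 3, padding='same')(x)
    x = ReLU()(x)
    x = Conv2D(f, 3, padding='same')(x)
    # residual connection around the block, as described in the RefineNet paper
    return Add()([tensor, x])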