Hello,
I'm successfully using your tripletlosslayer.py to train a triplet net, but I have some doubts:
1. The self.margin parameter is only used in the forward() function to compute the loss, not in backward(). As I understand it, in Caffe the gradient has to be written into bottom[i].diff inside the backward() function for backpropagation to happen. So, as the code stands, the margin does not affect the training, only the displayed loss value, right?
2. I see that in setup() you set self.a to 1, and later use it in backward() for the gradient computation. Why do you use that parameter, and why set it to 1?
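For context, here is a minimal sketch of what such a Caffe Python layer typically looks like, assuming the common max(0, d_ap - d_an + margin) formulation with squared Euclidean distances. This is not the author's exact code; the margin default, the `self.losses` buffer, and the placement of `self.a` are illustrative:

```python
import numpy as np
import caffe

class TripletLossLayer(caffe.Layer):
    """Triplet loss sketch: bottom[0]=anchor, bottom[1]=positive, bottom[2]=negative."""

    def setup(self, bottom, top):
        if len(bottom) != 3:
            raise Exception("Need anchor, positive and negative inputs.")
        self.margin = 1.0  # illustrative default; the real layer may parse it from param_str
        self.a = 1.0       # gradient scale factor asked about above (placement assumed)

    def reshape(self, bottom, top):
        top[0].reshape(1)  # scalar loss output

    def forward(self, bottom, top):
        anchor, pos, neg = bottom[0].data, bottom[1].data, bottom[2].data
        d_ap = np.sum((anchor - pos) ** 2, axis=1)  # squared anchor-positive distances
        d_an = np.sum((anchor - neg) ** 2, axis=1)  # squared anchor-negative distances
        self.losses = np.maximum(0.0, d_ap - d_an + self.margin)
        top[0].data[0] = np.mean(self.losses)

    def backward(self, top, propagate_down, bottom):
        anchor, pos, neg = bottom[0].data, bottom[1].data, bottom[2].data
        batch = anchor.shape[0]
        # Only triplets that violate the margin (loss > 0) contribute gradients.
        active = (self.losses > 0).astype(anchor.dtype)[:, np.newaxis]
        coeff = self.a * top[0].diff[0] * 2.0 / batch
        if propagate_down[0]:
            bottom[0].diff[...] = coeff * active * (neg - pos)     # dL/d(anchor)
        if propagate_down[1]:
            bottom[1].diff[...] = coeff * active * (pos - anchor)  # dL/d(positive)
        if propagate_down[2]:
            bottom[2].diff[...] = coeff * active * (anchor - neg)  # dL/d(negative)
```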
Hi, I think the margin is the hyperparameter that decides which triplets count as violations (i.e. have a positive loss); you may find that some triplet samples DON'T need any backpropagation at all.
OK. So, as I read the code, the margin is only used to exclude from backpropagation those triplets that already satisfy the margin, but it has no influence on the gradients of the triplets that are backpropagated, right?
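To make that concrete, here is a tiny numpy-only check (values and names are just for illustration) showing that the margin decides whether a triplet gets a gradient at all, but never appears in the gradient expression itself:

```python
import numpy as np

anchor = np.array([[0.0, 0.0]])
pos    = np.array([[0.1, 0.0]])
neg    = np.array([[1.0, 0.0]])

d_ap = np.sum((anchor - pos) ** 2, axis=1)  # 0.01
d_an = np.sum((anchor - neg) ** 2, axis=1)  # 1.0

for margin in (0.5, 2.0):
    loss = np.maximum(0.0, d_ap - d_an + margin)
    # dL/d(anchor) = 2*(neg - pos) whenever loss > 0, otherwise 0;
    # the margin only flips the gate, not the gradient's magnitude.
    grad_anchor = (loss > 0)[:, np.newaxis] * 2.0 * (neg - pos)
    print(margin, loss, grad_anchor)
# margin=0.5 -> loss [0.],   grad [[0.  0.]]  (triplet satisfied, skipped)
# margin=2.0 -> loss [1.01], grad [[1.8 0.]]  (triplet violated, backpropagated)
```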