SSIM loss difference #2626
-
Hello. The results of the other two match each other, while the kornia `ssim_loss` is a bit different (even after multiplying by 2). Here is an example:

```python
import torch
from kornia.losses import SSIMLoss

draem_loss = SSIM(window_size=11, val_range=1.0)     # from DRAEM
riad_loss = SSIMLoss_riad(kernel_size=11)            # from RIAD
kornia_loss = SSIMLoss(window_size=11, max_val=1.0)  # from kornia

x = torch.rand(4, 3, 50, 50)
y = torch.rand(4, 3, 50, 50)

print("draem: ", draem_loss(x, y))
print("riad:  ", riad_loss(x, y))
print("kornia:", kornia_loss(y, x) * 2)
```

which produces:

```
draem:  tensor(0.9139)
riad:   tensor(0.9139)
kornia: tensor(0.9896)
```

All three produce very similar results in training. Does anyone know what the cause could be?
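To see how border handling alone can shift the mean SSIM, here is a minimal NumPy sketch (not kornia's, DRAEM's, or RIAD's actual code; the uniform window and `reflect` padding are simplifying assumptions for illustration). It computes the mean SSIM once over the "valid" region only and once over a reflect-padded image, and the two averages differ:

```python
import numpy as np

def box_filter_valid(img, k):
    # 'valid'-mode local mean with a uniform k x k window (illustrative only;
    # real SSIM implementations typically use a Gaussian window)
    H, W = img.shape
    out = np.empty((H - k + 1, W - k + 1))
    for i in range(out.shape[0]):
        for j in range(out.shape[1]):
            out[i, j] = img[i:i + k, j:j + k].mean()
    return out

def ssim_mean(x, y, k=7, data_range=1.0, pad=None):
    # standard SSIM constants
    C1 = (0.01 * data_range) ** 2
    C2 = (0.03 * data_range) ** 2
    if pad == "reflect":
        # 'same'-style border handling: pad first, then filter
        p = k // 2
        x = np.pad(x, p, mode="reflect")
        y = np.pad(y, p, mode="reflect")
    mu_x = box_filter_valid(x, k)
    mu_y = box_filter_valid(y, k)
    sigma_x2 = box_filter_valid(x * x, k) - mu_x ** 2
    sigma_y2 = box_filter_valid(y * y, k) - mu_y ** 2
    sigma_xy = box_filter_valid(x * y, k) - mu_x * mu_y
    ssim_map = ((2 * mu_x * mu_y + C1) * (2 * sigma_xy + C2)) / (
        (mu_x ** 2 + mu_y ** 2 + C1) * (sigma_x2 + sigma_y2 + C2)
    )
    return ssim_map.mean()

rng = np.random.default_rng(0)
x = rng.random((20, 20))
y = rng.random((20, 20))

s_valid = ssim_mean(x, y)                  # averaged over interior windows only
s_padded = ssim_mean(x, y, pad="reflect")  # averaged over all padded windows
print(s_valid, s_padded)
```

The two means disagree slightly because the padded version also averages border windows containing duplicated pixels, which is exactly the kind of mismatch a different "border" setting between implementations would produce.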
-
I see that we average in https://github.com/kornia/kornia/blob/master/kornia/losses/ssim.py#L53
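If the reduction conventions differ as assumed here (e.g. `1 - mean(SSIM)` in one implementation vs. `mean((1 - SSIM) / 2)` in another; both conventions are assumptions for illustration, not confirmed from the kornia source), the factor of 2 reconciles them exactly, so the remaining gap must come from the SSIM map itself:

```python
import numpy as np

# Hypothetical per-pixel SSIM map for a batch (values in [0, 1] here)
rng = np.random.default_rng(0)
ssim_map = rng.uniform(0.0, 1.0, size=(4, 3, 40, 40))

# Convention A (assumed DRAEM/RIAD style): loss = 1 - mean SSIM
loss_a = 1.0 - ssim_map.mean()

# Convention B (assumed kornia style): loss = mean of (1 - SSIM) / 2
loss_b = ((1.0 - ssim_map) / 2.0).mean()

# The two reductions are related by an exact factor of 2
print(loss_a, 2 * loss_b)
```

Since `mean((1 - s) / 2) * 2 == 1 - mean(s)` holds identically, multiplying the kornia loss by 2 fully accounts for the reduction, and the residual difference must come from the window or border handling.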
Then the only remaining difference is the separable convolution? Maybe some of the "border" (padding) parameters do not match well.
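For what it's worth, separability by itself should not change the numbers: a 2D Gaussian window is the outer product of a 1D Gaussian with itself, so two 1D passes match the full 2D convolution exactly. A small NumPy sketch (independent of kornia; kernel size 11 and sigma 1.5 are just illustrative values):

```python
import numpy as np

def gaussian_kernel1d(size, sigma):
    # discrete 1D Gaussian, normalized to sum to 1
    x = np.arange(size) - (size - 1) / 2
    k = np.exp(-(x ** 2) / (2 * sigma ** 2))
    return k / k.sum()

def conv2d_valid(img, ker):
    # direct 'valid'-mode 2D filtering (kernel is symmetric,
    # so convolution and correlation coincide)
    kh, kw = ker.shape
    H, W = img.shape
    out = np.empty((H - kh + 1, W - kw + 1))
    for i in range(out.shape[0]):
        for j in range(out.shape[1]):
            out[i, j] = (img[i:i + kh, j:j + kw] * ker).sum()
    return out

k1 = gaussian_kernel1d(11, 1.5)
k2 = np.outer(k1, k1)  # full 2D Gaussian window

rng = np.random.default_rng(0)
img = rng.random((30, 30))

full = conv2d_valid(img, k2)
# separable version: filter rows with k1, then columns with k1
sep = conv2d_valid(conv2d_valid(img, k1[None, :]), k1[:, None])
print(np.abs(full - sep).max())
```

Since the two agree to floating-point precision, the border/padding parameters are the likelier suspect for the discrepancy.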