Latest commit: coincheung `99e04f64fb` "fix pfc gradient accumulation method" (1 year ago)

| Name | Last commit message | Last commit date |
| --- | --- | --- |
| csrc | fix pfc gradient accumulation method | 1 year ago |
| pytorch_loss | fix pfc gradient accumulation method | 1 year ago |
| .gitignore | add soft-dice loss and its cuda implementation | 4 years ago |
| LICENSE | Initial commit | 5 years ago |
| README.md | readme | 1 year ago |
| affinity_loss.py | add a modified affinity loss | 3 years ago |
| amsoftmax.py | tiny modify | 3 years ago |
| conv_ops.py | tiny modify | 3 years ago |
| dice_loss.py | tiny modify | 3 years ago |
| dual_focal_loss.py | tiny modify | 3 years ago |
| ema.py | tiny modify | 3 years ago |
| focal_loss.py | make focal loss suitable for regression task like bce | 2 years ago |
| generalized_iou_loss.py | tiny modify | 3 years ago |
| hswish.py | fix import problem | 3 years ago |
| info_nce_dist.py | fix dependencies, and modify division_floor | 2 years ago |
| iou_loss.py | add ciou/diou with 1/(h^2+w^2) replaced by 1, and refine giou | 1 year ago |
| label_smooth.py | add a bit comment | 3 years ago |
| large_margin_softmax.py | add a bit comment | 3 years ago |
| lovasz_softmax.py | a bit adjust | 3 years ago |
| mish.py | tiny modify | 3 years ago |
| one_hot.py | tiny modify | 3 years ago |
| partial_fc_amsoftmax.py | fix pfc gradient accumulation method | 1 year ago |
| pc_softmax.py | tiny modify | 3 years ago |
| setup.py | first version layernorm | 2 years ago |
| soft_dice_loss.py | fix lsr problem when sample size is very small | 2 years ago |
| swish.py | tiny modify | 3 years ago |
| taylor_softmax.py | add taylor softmax cuda | 3 years ago |
| triplet_loss.py | tiny modify | 3 years ago |