Experiments (common)

ADDA (Ivan)

See Ivan's experiments -> ADDA

DANN, ADDA - MNIST (Dima, Ivan)

| Source | Target | Train on target | Source only | Vanilla DANN | DANN w/ AlexNet | DANN paper | ADDA | ADDA paper |
|---|---|---|---|---|---|---|---|---|
| MNIST | MNIST-M | 0.9715 | 0.4583 | 0.8652 | 0.7844 | 0.7666 | 0.6396 | |
| MNIST-M | MNIST | 0.9902 | 0.9673 | 0.9748 | | | | |
| MNIST | SVHN | 0.8823 | 0.2006 | 0.2601 | | | | |
| SVHN | MNIST | 0.9902 | 0.6489 | 0.6161 | 0.8353 | 0.7385 | | 0.7600 |
| MNIST | USPS | 0.9681 | 0.4036 | 0.7634 | | | 0.9413 | 0.8940 |
| USPS | MNIST | 0.9902 | 0.6225 | 0.9037 | 0.9410 | | 0.8205 | 0.9010 |

See Dima's experiments
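
All DANN numbers on this page rely on a gradient reversal layer (GRL) between the shared feature extractor and the domain head. For reference, a minimal sketch of the standard GRL (our own illustration, not necessarily this repo's exact implementation):

```python
from torch.autograd import Function

class GradReverse(Function):
    """Identity on the forward pass; multiplies the gradient by -lambda on backward."""
    @staticmethod
    def forward(ctx, x, lambd):
        ctx.lambd = lambd
        return x.view_as(x)

    @staticmethod
    def backward(ctx, grad_output):
        # Reversed (scaled) gradient for x, no gradient for lambd.
        return -ctx.lambd * grad_output, None

def grad_reverse(x, lambd=1.0):
    return GradReverse.apply(x, lambd)

# Usage: the class head sees the features directly, while the domain head sees
# them through the GRL, so the features are trained to confuse the domain head:
#   class_logits  = class_head(features)
#   domain_logits = domain_head(grad_reverse(features, lambd))
```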

DANN - AlexNet (Boris, Katya, Masha)

A->W experiments with different dropout

| EXP name | domain head | adaptation block usage | domain dropout | class dropout | Source best acc | Target best acc | Source final acc | Target final acc |
|---|---|---|---|---|---|---|---|---|
| Alexnet_vanilla (Katya) | vanilla-dann | true | false | false | 1.0 | 0.59375 | 0.9989 | 0.48698 |
| Alexnet_domain_dropout (Katya) | dropout_dann | true | true | false | 1.0 | 0.61719 | 1.0 | 0.58203 |
| Alexnet_domain_dropout (Boris) | dropout_dann | true | true | false | ? | 0.61719 | ? | ? |
| Alexnet_domain_dropout (Masha) | dropout_dann | true | true | false | 1.0 | 0.6302 | 1.0 | 0.605 (mean of 3 runs) |
| Paper DANN | dropout_dann | true | true | ? | - | 0.73 | - | - |
| Alexnet_domain_and_class_dropout (Katya) | dropout_dann | true | true | true | 1.0 | 0.62370 | 1.0 | 0.59375 |

AlexNet on different domains

| | A→W | D→W | W→D | A→D | D→A | W→A |
|---|---|---|---|---|---|---|
| DANN paper | 0.73 | 0.964 | 0.992 | - | - | - |
| Ours (Boris) | 0.61719 | 0.88281 | 0.95625 | 0.55625 | 0.41619 | 0.45668 |
| Ours (Masha) | 0.605 | 0.9231 | 0.9625 | - | - | - |

A->W, D->W, W->D: DANN vs. Source-only and Target-only (Masha)

Final accuracy, averaged over 3 runs:

A->W

| Results | DANN | Source-Only | Target-Only |
|---|---|---|---|
| Ours | 0.605 | 0.4748 | 0.9875 |
| Paper | 0.73 | 0.642 | - |

D->W

| Results | DANN | Source-Only | Target-Only |
|---|---|---|---|
| Ours | 0.9231 | 0.9214 | 0.9875 |
| Paper | 0.964 | 0.961 | - |

W->D

| Results | DANN | Source-Only | Target-Only |
|---|---|---|---|
| Ours | 0.9625 | 0.9652 | 0.9934 |
| Paper | 0.992 | 0.978 | - |

AlexNet with different bottleneck size (Masha)

See Masha's experiments

Comparison of the architectures with bottleneck size 256 (from the paper) and bottleneck size 2048
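
For context, one plausible way such a comparison is wired up (a sketch under our assumptions; `build_dann_alexnet` and the exact split of the backbone are illustrative, not the repo's code):

```python
import torch.nn as nn
from torchvision import models

def build_dann_alexnet(bottleneck_size=256, n_classes=31):
    backbone = models.alexnet(pretrained=True)
    # Shared feature extractor: everything up to AlexNet's 4096-d fc7 output.
    features = nn.Sequential(
        backbone.features,
        backbone.avgpool,
        nn.Flatten(),
        *list(backbone.classifier.children())[:-1],
    )
    # The bottleneck width (256 in the paper vs. 2048) is the variable compared;
    # both the class head and the domain head consume its output.
    bottleneck = nn.Sequential(nn.Linear(4096, bottleneck_size), nn.ReLU())
    class_head = nn.Linear(bottleneck_size, n_classes)  # 31 classes in Office-31
    return features, bottleneck, class_head
```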

DANN - ResNet50 (Anya, Boris, Ivan, Katya, Masha)

Common table

| N | Model | Frozen | Bottleneck | Domain head | Class head | A→W | D→W | W→D | A→D | D→A | W→A |
|---|---|---|---|---|---|---|---|---|---|---|---|
| 1 | DANN AlexNet (paper) | ? | 256 | ? | ? | 0.73 | 0.964 | 0.992 | | | |
| 2 | DANN ResNet (review) | ? | ? | ? | ? | 0.826 | 0.978 | 1 | 0.833 | 0.668 | 0.661 |
| 3 | OneDomain ResNet Frozen (Anya) | all | - | - | - | 0.67839 | 0.92318 | 0.98884 | 0.71652 | 0.62003 | 0.63388 |
| 4 | OneDomain ResNet Frozen 129 (Anya) | 129 | - | - | - | 0.74740 | 0.97005 | 0.99777 | 0.76786 | 0.63317 | 0.64453 |
| 5 | Source-Only ResNet50 Bottleneck 2048 (Masha) | 129 | 2048 | - | - | 0.712 | | | | | |
| 6 | Target-Only ResNet50 Bottleneck 2048 (Masha) | 129 | 2048 | - | - | 0.987 | | | | | |
| 7 | DANN ResNet Frozen (Anya) | all | - | vanilla | `get_resnet50` | 0.73438 | 0.95573 | 0.99554 | 0.77455 | 0.62749 | 0.63175 |
| 8 | DANN ResNet Frozen 141 (Anya) | 141 | - | vanilla | `get_resnet50` | 0.76172 | 0.97526 | 0.99777 | 0.78125 | 0.65270 | 0.63849 |
| 9 | DANN ResNet Frozen 129 (Anya) | 129 | - | vanilla | `get_resnet50` | 0.76693 | 0.98307 | 1.0 | 0.77679 | 0.65554 | 0.63849 |
| 10 | DANN ResNet 0 freeze without domain loss (Ivan-exp2) | 0 | - | vanilla | `get_resnet50` | 0.5443 | | | | | |
| 11 | DANN ResNet 72 freeze without domain loss (Ivan-exp2) | 72 | - | vanilla | `get_resnet50` | 0.7348 | | | | | |
| 12 | DANN ResNet 129 freeze without domain loss (Ivan-exp2) | 129 | - | vanilla | `get_resnet50` | 0.7849 | | | | | |
| 13 | DANN ResNet 141 freeze with domain loss (Ivan-exp2) | 141 | - | vanilla | `get_resnet50` | 0.71484 | 0.62708 | 0.63246 | 0.67543 | | |
| 14 | DANN ResNet 141 freeze without domain loss (Ivan-exp2) | 141 | - | vanilla | `get_resnet50` | 0.77995 | 0.96875 | 0.99583 | 0.79583 | 0.63565 | 0.63175 |
| 15 | DANN Rich ResNet Frozen (Anya) | all | - | vanilla | `get_resnet50_rich_classifier` | 0.74740 | | | | | |
| 16 | DANN Rich ResNet Frozen 141 (Anya) | 141 | - | vanilla | `get_resnet50_rich_classifier` | 0.85677 | 0.70916 | | | | |
| 17 | DANN Rich ResNet Frozen 129 (Anya) | 129 | - | vanilla | `get_resnet50_rich_classifier` | 0.85807 | | | | | |
| 18 | DANN Rich ResNet Frozen 72 (Anya) | 72 | - | vanilla | `get_resnet50_rich_classifier` | 0.83464 | | | | | |
| 19 | DANN Rich ResNet Frozen 141 Bottleneck 128 (Anya) | 141 | 128 | vanilla | `get_resnet50_rich_classifier` | 0.81771 | | | | | |
| 20 | DANN Rich ResNet Frozen 141 Bottleneck 256 (Anya) | 141 | 256 | vanilla | `get_resnet50_rich_classifier` | 0.86328 | 0.70881 | | | | |
| 21 | DANN Rich ResNet Frozen 141 Bottleneck 512 (Anya) | 141 | 512 | vanilla | `get_resnet50_rich_classifier` | 0.84896 | 0.70810 | | | | |
| 22 | DANN Rich ResNet Frozen 141 Bottleneck 1024 (Anya) | 141 | 1024 | vanilla | `get_resnet50_rich_classifier` | 0.87891 | 0.71626 | | | | |
| 23 | DANN Rich ResNet Frozen 141 Bottleneck 2048 (Anya) | 141 | 2048 | vanilla | `get_resnet50_rich_classifier` | 0.86068 | 0.6985 | | | | |
| 24 | Resnet_vanilla (Katya) | 141 | 2048 | vanilla | `get_resnet50` | 0.78255 | | | | | |
| 25 | Resnet_domain_dropout (Katya) | 141 | 2048 | dropout | `get_resnet50` | 0.85286 | | | | | |
| 26 | Resnet_domain_and_class_dropout (Katya) | 141 | 2048 | dropout | dropout | 0.85156 | | | | | |
| 27 | DANN ResNet50 domain dropout Bottleneck 2048 (Masha) | 129 | 2048 | dropout | - | 0.8307 | | | | | |
| 28 | dropout domain head (Boris) | ? | ? | dropout | ? | 0.82292 | 0.97656 | 0.99375 | 0.78125 | 0.64453 | 0.66264 |
| 29 | ResNet with domain loss (Ivan-exp4) | 141 | 2048 | vanilla | `get_resnet50` | 0.71484 | 0.62708 | 0.63246 | 0.67543 | | |
| 30 | ResNet without domain loss (Ivan-exp4) | 141 | 2048 | vanilla | `get_resnet50` | 0.77995 | 0.96875 | 0.99583 | 0.79583 | 0.63565 | 0.63175 |
| 31 | vanilla domain head, 141 layer freeze (Ivan-exp1) | 141 | 2048 | vanilla | `get_resnet50_righ_classifier` | 0.8451 | 0.9349 | 0.9754 | 0.7969 | 0.6719 | 0.6893 |
| 32 | without domain loss, 141 layer freeze (Ivan-exp1) | 141 | 2048 | vanilla | `get_resnet50_righ_classifier` | 0.7604 | 0.9674 | 0.9911 | 0.7924 | 0.6403 | 0.6317 |
| 33 | test5 classifier, 141 layer freeze (Ivan-exp5) | 141 | 2048 | vanilla | `test5` | 0.8503 | 0.9154 | 0.9263 | 0.7902 | 0.6708 | 0.7109 |

Class heads in the table:

`get_resnet50` (== vanilla):

```python
nn.Sequential(nn.Linear(2048, CLASSES_CNT))
```

`get_resnet50_righ_classifier`:

```python
nn.Sequential(
    nn.Linear(2048, 2048),
    nn.BatchNorm1d(2048),
    nn.Dropout2d(),
    nn.ReLU(),
    nn.Linear(2048, domain_input_len),
    nn.BatchNorm1d(domain_input_len),
    nn.ReLU(),
    nn.Linear(domain_input_len, CLASSES_CNT),
)
```

`dropout`:

```python
nn.Sequential(
    nn.Linear(2048, 1024),
    nn.ReLU(),
    nn.Dropout(0.5),
    nn.Linear(1024, CLASSES_CNT),
)
```

`test5`:

```python
nn.Sequential(
    nn.Linear(2048, 2048), nn.BatchNorm1d(2048), nn.Dropout(), nn.ReLU(),
    nn.Linear(2048, 2048), nn.BatchNorm1d(2048), nn.Dropout(), nn.ReLU(),
    nn.Linear(2048, 2048), nn.BatchNorm1d(2048), nn.ReLU(),
    nn.Linear(2048, dann_config.CLASSES_CNT),
)
```

Domain heads in the table:

`vanilla`:

```python
nn.Sequential(
    nn.Linear(domain_input_len, 1024), nn.BatchNorm1d(1024), nn.ReLU(),
    nn.Linear(1024, 1024), nn.BatchNorm1d(1024), nn.ReLU(),
    nn.Linear(1024, 1),
)
```

`dropout`:

```python
nn.Sequential(
    nn.Linear(domain_input_len, 1024), nn.ReLU(), nn.Dropout(0.5),
    nn.Linear(1024, 1024), nn.ReLU(), nn.Dropout(0.5),
    nn.Linear(1024, 1),
)
```
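
The Frozen column in the common table ("all", 0, 72, 129, 141) controls how much of the ResNet-50 backbone stays fixed. A minimal sketch of one plausible implementation, under the assumption (ours, not confirmed from the repo) that "Frozen N" means the first N parameter tensors are excluded from training:

```python
from torchvision import models

def freeze_first_n(model, n):
    # Assumption: "Frozen N" freezes the first N parameter tensors;
    # everything after them (e.g. the last blocks and the head) keeps training.
    for i, p in enumerate(model.parameters()):
        p.requires_grad = i >= n

resnet = models.resnet50(pretrained=True)  # 161 parameter tensors in total
freeze_first_n(resnet, 141)                # e.g. the "Frozen 141" rows
```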

A->W experiments with different dropouts (Katya)

| EXP name | domain head | adaptation block usage | domain dropout | class dropout | Source best acc | Target best acc | Source final acc | Target final acc |
|---|---|---|---|---|---|---|---|---|
| Resnet_vanilla | vanilla-dann | false | false | false | 0.99751 | 0.78255 | 0.99503 | 0.72786 |
| Resnet_domain_dropout | dropout_dann | false | true | false | 1.0 | 0.85286 | 1.0 | 0.80339 |
| Resnet_domain_and_class_dropout | dropout_dann | false | true | true | 1.0 | 0.85156 | 1.0 | 0.84766 |

Different preprocessing for ResNet backbone (Katya)

| EXP name | Source best acc | Target best acc | Source final acc | Target final acc |
|---|---|---|---|---|
| resnet50_no_crop | 1.0 | 0.85156 | 1.0 | 0.84766 |
| resnet50_with_crop | 1.0 | 0.83984 | 1.0 | 0.82161 |
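
The two rows presumably differ in whether a center crop precedes the resize to the network input; a sketch of the kind of pipelines being compared (the sizes and ImageNet statistics are our assumptions, not taken from the repo):

```python
from torchvision import transforms

normalize = transforms.Normalize(mean=[0.485, 0.456, 0.406],   # ImageNet stats
                                 std=[0.229, 0.224, 0.225])

# resnet50_no_crop: resize straight to the network input size
no_crop = transforms.Compose([
    transforms.Resize((224, 224)),
    transforms.ToTensor(),
    normalize,
])

# resnet50_with_crop: standard ImageNet-style resize + center crop
with_crop = transforms.Compose([
    transforms.Resize(256),
    transforms.CenterCrop(224),
    transforms.ToTensor(),
    normalize,
])
```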

Different input normalization for Alexnet backbone (Masha)

See Masha's experiments

Comparison of the different input normalizations
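
As an illustration of the kind of choice compared there (the values below are illustrative, not Masha's exact settings):

```python
from torchvision import transforms

# Normalization with ImageNet channel statistics (the torchvision default)...
imagenet_norm = transforms.Normalize(mean=[0.485, 0.456, 0.406],
                                     std=[0.229, 0.224, 0.225])

# ...versus a symmetric normalization that simply maps pixels to [-1, 1].
symmetric_norm = transforms.Normalize(mean=[0.5, 0.5, 0.5],
                                      std=[0.5, 0.5, 0.5])
```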