# AAAI 2022

| Year | Title | Author | Publication | Code | Tasks | Notions | Datasets | Notes |
| --- | --- | --- | --- | --- | --- | --- | --- | --- |
| 2022 | Active Learning for Domain Adaptation: An Energy-Based Approach | Xie et al. | AAAI | code | Image classification, semantic segmentation | Energy, DNNs, Domain Adaptation, Tra, Hard | VisDA-2017 (Peng et al. 2017), Office-Home (Venkateswara et al. 2017), Office-31 (Saenko et al. 2010), GTAV (Richter et al. 2016) to Cityscapes (Cordts et al. 2016) | |
| 2022 | Towards Discriminant Analysis Classifiers Using Online Active Learning via Myoelectric Interfaces | Jaramillo-Yanez et al. | AAAI | code | Streaming | | | |
| 2022 | Active Learning on Pre-Trained Language Model with Task-Independent Triplet Loss | Seo et al. | AAAI | - | Relation extraction, sentence classification | Informative + diversity, pre-trained LM, None, PT+FT, Hard | NYT-10, Wiki-KBP, AG News, PubMed | Previous active learning methods usually rely on specific network architectures or task-dependent sample acquisition algorithms; moreover, when selecting a batch, they suffer from insufficient diversity of batch samples. |
| 2022 | TrustAL: Trustworthy Active Learning Using Knowledge Distillation | Kwak et al. | AAAI | - | Text classification | Uncertainty/diversity, PLM, None, PT+FT, Hard | Movie Review (Pang and Lee 2005), SST-2 (Socher et al. 2013) | |
| 2022 | CPRAL: Collaborative Panoptic-Regional Active Learning for Semantic Segmentation | Qiao et al. | AAAI | - | Semantic segmentation | Vote entropy, encoder-decoder, None, PT+FT, Soft, Explain | Cityscapes, BDD10K | |
| 2022 | Boosting Active Learning via Improving Test Performance | Wang et al. | AAAI | - | Image classification, semantic segmentation | Expected-gradnorm + entropy-gradnorm, ResNet-18, None, PT+FT, Hard | CIFAR-10, CIFAR-100, SVHN, Caltech101, Cityscapes | |
| 2022 | Similarity Search for Efficient Active Learning and Search of Rare Concepts | Coleman et al. | AAAI | - | Image classification | Similarity search, PLM, nearest neighbors of each labeled example, PT+FT, Hard | ImageNet, OpenImages | Improves the computational efficiency of active learning and search by restricting the candidate pool for labeling to the nearest neighbors of the currently labeled set, instead of scanning all of the unlabeled data. |
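The candidate-restriction idea in the last row (Coleman et al.) can be illustrated with a short sketch: rather than scoring every unlabeled point each round, only the nearest neighbors of the currently labeled set are scored, here by predictive entropy. This is a minimal, hypothetical illustration with brute-force Euclidean search and made-up function names, not the authors' implementation.

```python
import numpy as np

def uncertainty(probs: np.ndarray) -> np.ndarray:
    """Predictive entropy of each row of class probabilities."""
    eps = 1e-12  # guard against log(0)
    return -(probs * np.log(probs + eps)).sum(axis=1)

def restricted_candidates(unlabeled: np.ndarray,
                          labeled: np.ndarray,
                          k: int) -> np.ndarray:
    """Indices of unlabeled points that are among the k nearest
    neighbors (Euclidean) of at least one labeled point."""
    # Pairwise distance matrix of shape (n_unlabeled, n_labeled).
    d = np.linalg.norm(unlabeled[:, None, :] - labeled[None, :, :], axis=2)
    # For each labeled point (column), take its k closest unlabeled rows.
    idx = np.argsort(d, axis=0)[:k, :]
    return np.unique(idx)

rng = np.random.default_rng(0)
unlabeled = rng.normal(size=(200, 8))   # toy unlabeled pool of feature vectors
labeled = rng.normal(size=(10, 8))      # toy labeled set

pool = restricted_candidates(unlabeled, labeled, k=5)
# Score only the restricted pool (probabilities are simulated here;
# in practice they come from the current model).
probs = rng.dirichlet(np.ones(3), size=pool.size)
query = pool[np.argmax(uncertainty(probs))]
print(f"candidate pool: {pool.size} of {len(unlabeled)}; query index: {query}")
```

With 10 labeled points and k = 5, at most 50 of the 200 unlabeled points are ever scored, which is the source of the claimed efficiency gain; a production version would replace the brute-force distance matrix with an approximate similarity-search index.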