attention distillation (1)

[Paper Summary] Dataset Distillation with Attention Labels for Fine-tuning BERT

Paper: Dataset Distillation with Attention Labels for Fine-tuning BERT
Link: https://aclanthology.org/2023.acl-short.12/
Aru Maekawa, Naoki Kobayashi, Kotaro Funakoshi, Manabu Okumura. Proceedings of the 61st Annual Meeting of the Association for Computational Linguistics (Volume 2: Short Papers). 2023. aclanthology.org

This post summarizes only the main idea of the paper.