Journal
Proceedings of the 60th Annual Meeting of the Association for Computational Linguistics (ACL 2022), Volume 1: Long Papers
Pages 583-595
Publisher
Association for Computational Linguistics (ACL)
Funding
- Hong Kong Research Grants Council [16204920]
- National Natural Science Foundation of China [62106275]
Abstract
Building models of natural language processing (NLP) is challenging in low-resource scenarios where only limited data are available. Optimization-based meta-learning algorithms achieve promising results in low-resource scenarios by adapting a well-generalized model initialization to handle new tasks. Nonetheless, these approaches suffer from the memorization overfitting issue, where the model tends to memorize the meta-training tasks while ignoring support sets when adapting to new tasks. To address this issue, we propose a memory imitation meta-learning (MemIML) method that enhances the model's reliance on support sets for task adaptation. Specifically, we introduce a task-specific memory module to store support set information and construct an imitation module to force query sets to imitate the behaviors of some representative support-set samples stored in the memory. A theoretical analysis is provided to prove the effectiveness of our method, and empirical results also demonstrate that our method outperforms competitive baselines on both text classification and generation tasks.
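The abstract's starting point, an optimization-based meta-learner that adapts a shared initialization on each task's support set and is evaluated on the query set, can be sketched as a first-order MAML-style update. This is a generic illustration of that setup only, not the paper's MemIML algorithm (the memory and imitation modules are not reproduced); the function name, the one-parameter linear model, and the learning rates are all assumptions made for the sketch.

```python
import numpy as np

def maml_step(theta, tasks, inner_lr=0.01, outer_lr=0.001):
    """One outer-loop step of first-order MAML-style meta-learning.

    Toy model: y = theta * x with mean-squared-error loss. Each task is a
    tuple (support_x, support_y, query_x, query_y). Illustrative sketch,
    not the paper's MemIML method.
    """
    meta_grad = 0.0
    for sx, sy, qx, qy in tasks:
        # Inner loop: adapt the shared initialization on the support set.
        # Memorization overfitting is when this step barely matters because
        # the model already memorized the meta-training tasks.
        grad_s = np.mean(2.0 * (theta * sx - sy) * sx)
        theta_task = theta - inner_lr * grad_s
        # Outer loss: evaluate the adapted parameter on the query set
        # (first-order approximation: second-order terms are ignored).
        meta_grad += np.mean(2.0 * (theta_task * qx - qy) * qx)
    # Update the shared initialization with the averaged meta-gradient.
    return theta - outer_lr * meta_grad / len(tasks)
```

For example, with a single task whose true relation is y = 2x, repeated calls move the initialization `theta` from 0 toward 2, since the query-set gradient after adaptation still points in that direction.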