Article

Zero-Shot Learners for Natural Language Understanding via a Unified Multiple-Choice Perspective

Journal

IEEE ACCESS
Volume 11, Pages 142829-142845

Publisher

Institute of Electrical and Electronics Engineers (IEEE), Inc.
DOI: 10.1109/ACCESS.2023.3343123

Keywords

multitasking; multi-task learning; natural language understanding; zero-shot learning


Summary

This paper introduces the Unified Multiple-Choice (UniMC) framework for zero-shot learning, which achieves outstanding performance on tasks such as text classification and sentiment analysis. Its two-stage tuning method enables direct predictions on unseen tasks with a small parameter count, avoiding issues that affect large-scale models. Experimental results demonstrate UniMC's State-of-the-Art (SOTA) performance and its strong generalization across languages.

Abstract

Zero-shot learning is an approach in which models generalize to unseen tasks without direct training on them. We introduce the Unified Multiple-Choice (UniMC) framework, which is format-independent and compatible with various task formats, making it applicable to tasks such as text classification and sentiment analysis. Furthermore, we design a two-stage tuning method: we first train on multiple-choice formats to develop format-agnostic capabilities, and then enable direct predictions on unseen tasks for zero-shot learning. Our methodology avoids issues found in large-scale models such as FLAN, enhancing generalization while reducing parameters. In experiments, UniMC achieves State-of-the-Art (SOTA) performance across out-of-domain and in-domain benchmarks with only 235M parameters, far fewer than previous methods. Moreover, the UniMC-Chinese model exceeds human performance on benchmarks such as EPRSTMT and CHID-FC, underscoring its generalization capacity across languages. Additionally, ablation experiments demonstrate the effectiveness of our design. The code and model weights are available at https://github.com/IDEA-CCNL/Fengshenbang-LM/tree/main/fengshen/examples/unimc.
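The abstract describes unifying diverse NLU tasks into a single multiple-choice format so that one model can answer unseen tasks zero-shot. The sketch below illustrates that idea for a sentiment-classification example; the `to_multiple_choice` helper, question template, and option letters are illustrative assumptions, not the paper's exact format.

```python
# Hedged sketch: casting an NLU example (here, sentiment classification)
# into a multiple-choice prompt, in the spirit of the UniMC framework.
# The template and helper name are illustrative, not taken from the paper.

def to_multiple_choice(text, options, question="What is the sentiment?"):
    """Render one NLU example as a single multiple-choice prompt string."""
    # Label each candidate answer with a letter: (A), (B), ...
    option_lines = [f"({chr(ord('A') + i)}) {opt}" for i, opt in enumerate(options)]
    return "\n".join([question, *option_lines, f"Passage: {text}"])

prompt = to_multiple_choice(
    "The film was a delight from start to finish.",
    ["positive", "negative"],
)
print(prompt)
```

Because every task is rendered the same way, a model tuned on multiple-choice data in stage one can, in stage two, be pointed at a brand-new task simply by listing that task's labels as options.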
