Journal
ACM TRANSACTIONS ON INTELLIGENT SYSTEMS AND TECHNOLOGY
Volume 14, Issue 5
Publisher
ASSOC COMPUTING MACHINERY
DOI: 10.1145/3610299
Keywords
Neural architecture search; text classification; discretization; differentiable neural architecture search; mutual information maximization; representation learning
Neural Architecture Search (NAS) has shown promising capability in learning text representation. However, existing text-based NAS neither performs a learnable fusion of neural operations to optimize the architecture nor encodes the latent hierarchical categorization behind text input. This article presents a novel NAS method, Discretized Differentiable Neural Architecture Search (DDNAS), for text representation learning and classification. Through continuous relaxation of the architecture representation, DDNAS can use gradient descent to optimize the search. We also propose a novel discretization layer via mutual information maximization, imposed on every search node, to model the latent hierarchical categorization in text representation. Extensive experiments conducted on eight diverse real datasets show that DDNAS consistently outperforms state-of-the-art NAS methods. Although DDNAS relies on only three basic operations, i.e., convolution, pooling, and none, as candidate NAS building blocks, it achieves promising performance and can be extended with additional operations for further improvement.
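The continuous relaxation mentioned in the abstract follows the DARTS-style scheme: each edge of the search cell computes a softmax-weighted mixture of all candidate operations (here convolution, pooling, and none), so the architecture parameters become differentiable and can be trained by gradient descent. Below is a minimal illustrative sketch, not the authors' implementation; the toy 1-D operations, the `alpha` parameters, and all function names are assumptions for demonstration only.

```python
import numpy as np

def softmax(a):
    # Numerically stable softmax over architecture parameters.
    e = np.exp(a - a.max())
    return e / e.sum()

# Toy 1-D stand-ins for the three candidate operations on a feature sequence x.
def conv_op(x):
    # Width-3 moving average as a stand-in for a learned convolution.
    return np.convolve(x, np.ones(3) / 3, mode="same")

def pool_op(x):
    # Width-3 max pooling with stride 1 and "same" output length.
    padded = np.pad(x, 1, mode="edge")
    return np.array([padded[i:i + 3].max() for i in range(len(x))])

def none_op(x):
    # The "none" operation zeroes out the edge's contribution.
    return np.zeros_like(x)

OPS = [conv_op, pool_op, none_op]

def mixed_op(x, alpha):
    """Continuously relaxed edge: softmax-weighted sum of candidate ops."""
    w = softmax(alpha)
    return sum(wi * op(x) for wi, op in zip(w, OPS))

x = np.array([1.0, 2.0, 3.0, 2.0, 1.0])
alpha = np.array([0.5, 0.2, -1.0])  # hypothetical learnable architecture weights
y = mixed_op(x, alpha)              # differentiable w.r.t. alpha in a real framework
```

After search converges, each edge is discretized by keeping the operation with the largest weight; in DDNAS this discretization is additionally guided by mutual information maximization at every search node.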