Article

A General Approach to Dropout in Quantum Neural Networks

Journal

ADVANCED QUANTUM TECHNOLOGIES
Volume -, Issue -, Pages -

Publisher

WILEY
DOI: 10.1002/qute.202300220

Keywords

dropout; overfitting; overparametrization; quantum neural networks

This article presents a generalized approach to applying the dropout technique in quantum neural network models, with different quantum dropout strategies analyzed to avoid overfitting and achieve a high level of generalization. The study highlights that quantum dropout does not impact the expressibility and entanglement of QNN models.
In classical machine learning (ML), overfitting is the phenomenon that occurs when a given model learns the training data excessively well and thus performs poorly on unseen data. A commonly employed technique in ML is so-called dropout, which prevents computational units from becoming too specialized and hence reduces the risk of overfitting. With the advent of quantum neural networks (QNNs) as learning models, overfitting may soon become an issue, owing to the increasing depth of quantum circuits as well as the multiple embeddings of classical features employed to provide computational nonlinearity. Here, a generalized approach is presented to apply the dropout technique in QNN models, defining and analyzing different quantum dropout strategies to avoid overfitting and achieve a high level of generalization. This study allows one to envision the power of quantum dropout in enabling generalization, providing useful guidelines for determining the maximal dropout probability for a given model based on overparametrization theory. It also highlights that quantum dropout does not impact features of QNN models such as expressibility and entanglement. All these conclusions are supported by extensive numerical simulations and may pave the way to efficiently employing deep quantum machine learning (QML) models based on state-of-the-art QNNs.
Randomly dropping artificial neurons and all their connections during the training phase reduces overfitting in classical neural networks, thus improving performance on previously unseen data. The authors introduce different dropout strategies applied to quantum neural networks, learning models based on parametrized quantum circuits. Quantum dropout strategies may help reduce overfitting without impacting the expressibility and entanglement of these models.
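As a rough illustration of the idea described above, the sketch below simulates a toy single-qubit parametrized circuit in plain NumPy and randomly skips (drops) individual rotation gates during a stochastic forward pass. This is only a minimal sketch of the general concept of gate-level quantum dropout, not the authors' specific strategies; the circuit structure, the per-gate dropout probability, and all function names are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

def ry(theta):
    # Single-qubit Y-rotation gate as a 2x2 real matrix.
    c, s = np.cos(theta / 2), np.sin(theta / 2)
    return np.array([[c, -s], [s, c]])

def circuit_output(thetas, drop_mask):
    """Apply a chain of RY gates to |0>, skipping gates marked as dropped.

    Dropping a gate replaces it with the identity, loosely analogous to
    removing a unit (and its connections) in classical dropout.
    Returns the expectation value of Pauli-Z on the final state.
    """
    state = np.array([1.0, 0.0])  # |0>
    for theta, dropped in zip(thetas, drop_mask):
        if not dropped:
            state = ry(theta) @ state
    return state[0] ** 2 - state[1] ** 2  # <Z> for a real-amplitude state

thetas = np.array([0.3, 0.7, 1.1, 0.5])  # trainable parameters (illustrative)
p_drop = 0.25                            # per-gate dropout probability

# One stochastic forward pass with gate dropout (as used during training).
train_mask = rng.random(len(thetas)) < p_drop
z_train = circuit_output(thetas, train_mask)

# At evaluation time no gates are dropped: the full circuit is applied.
z_eval = circuit_output(thetas, np.zeros(len(thetas), dtype=bool))
```

Since consecutive RY rotations compose additively, the undropped circuit yields exactly <Z> = cos(0.3 + 0.7 + 1.1 + 0.5), which gives a quick sanity check on the simulation; the dropped-gate output varies from pass to pass with the sampled mask.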
